From: "Naganarasimha G R (Naga)"
To: "user@hadoop.apache.org"
Subject: RE: Utility to push data into HDFS
Date: Mon, 2 Nov 2015 02:03:59 +0000

Hi Shashi,

Not sure I got your question right, but if it is related to building Hadoop on Windows, then I think the steps mentioned by James and Chris would definitely help.

But is your scenario to remotely (not on one of the nodes of the cluster) access HDFS through Java from either Windows or Linux machines? In that case a certain set of jars needs to be on the client machine (refer to hadoop-client/pom.xml), and a subset of the server configurations (the full set is not a problem either) is required to access HDFS and YARN.

@Chris Nauroth, are the native components (winutils.exe and hadoop.dll) required on the remote machine? AFAIK they are not required; correct me if I am wrong!
+ Naga

________________________________
From: Chris Nauroth [cnauroth@hortonworks.com]
Sent: Monday, November 02, 2015 02:10
To: user@hadoop.apache.org
Subject: Re: Utility to push data into HDFS

In addition to the standard Hadoop jars available in an Apache Hadoop distro, Windows also requires the native components for Windows: winutils.exe and hadoop.dll. This wiki page has more details on how that works:

https://wiki.apache.org/hadoop/WindowsProblems

--Chris Nauroth

From: James Bond <bond.bhai@gmail.com>
Reply-To: "user@hadoop.apache.org"
Date: Sunday, November 1, 2015 at 9:35 AM
To: "user@hadoop.apache.org"
Subject: Re: Utility to push data into HDFS

I am guessing this should work - https://stackoverflow.com/questions/9722257/building-jar-that-includes-all-its-dependencies

On Sun, Nov 1, 2015 at 8:15 PM, Shashi Vishwakarma <shashi.vish123@gmail.com> wrote:

Hi Chris,

Thanks for your reply. I agree WebHDFS is one of the options to access Hadoop from Windows or *nix. I wanted to know if I can write Java code that can be executed from Windows?

Ex: java HDFSPut.java  <<- this Java code should have an FsShell command (hadoop fs -ls) written in Java.

In order to execute this, what items should I have on Windows? For example, Hadoop jars etc.

If you can throw some light on this, it would be a great help.

Thanks
Shashi

On Sun, Nov 1, 2015 at 1:39 AM, Chris Nauroth <cnauroth@hortonworks.com> wrote:

Hello Shashi,

Maybe I'm missing some context, but are the Hadoop FsShell commands sufficient?

http://hadoop.apache.org/docs/r2.7.1/hadoop-project-dist/hadoop-common/FileSystemShell.html

These commands work on both *nix and Windows.

Another option would be WebHDFS, which just requires an HTTP client on your platform of choice.

http://hadoop.apache.org/docs/r2.7.1/hadoop-project-dist/hadoop-hdfs/WebHDFS.html

--Chris Nauroth

From: Shashi Vishwakarma <shashi.vish123@gmail.com>
Reply-To: "user@hadoop.apache.org"
Date: Saturday, October 31, 2015 at 5:46 AM
To: "user@hadoop.apache.org"
Subject: Utility to push data into HDFS

Hi

I need to build a common utility for Unix/Windows based systems to push data into the Hadoop system. Users should be able to run that utility from any platform and push data into HDFS.

Any suggestions?

Thanks
Shashi
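As a rough illustration of the WebHDFS option discussed in the thread, the sketch below builds the URL for step one of a WebHDFS file upload (op=CREATE). The hostname, port, path, and user are hypothetical placeholders; a real client would issue a PUT to this URL, follow the 307 redirect from the NameNode, and then PUT the file bytes to the redirected DataNode URL. This is only a sketch of the protocol, not a complete client.

```java
// Minimal sketch of a WebHDFS upload client using only the JDK.
// The host, port, path, and user below are hypothetical placeholders.
public class WebHdfsPut {

    // Step 1 of a WebHDFS CREATE: a PUT to this NameNode URL (no body)
    // returns a 307 redirect to the DataNode that will store the data.
    static String createUrl(String host, int port, String path, String user) {
        return "http://" + host + ":" + port + "/webhdfs/v1" + path
                + "?op=CREATE&user.name=" + user + "&overwrite=true";
    }

    public static void main(String[] args) {
        String url = createUrl("namenode.example.com", 50070,
                "/user/shashi/data.txt", "shashi");
        System.out.println(url);
        // Step 2 (not shown): PUT the file bytes to the Location header
        // returned by step 1. Both requests can be made with
        // java.net.HttpURLConnection; no Hadoop jars, configuration files,
        // or winutils.exe are needed on the client machine.
    }
}
```

Because only an HTTP client is required, this approach sidesteps the Windows native-component question entirely, at the cost of handling authentication and redirects yourself.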