hadoop-mapreduce-user mailing list archives

From Vinayakumar B <vinayakum...@apache.org>
Subject Re: Utility to push data into HDFS
Date Wed, 04 Nov 2015 13:43:36 GMT
That's cool.

-Vinay

On Tue, Nov 3, 2015 at 9:34 PM, Shashi Vishwakarma <shashi.vish123@gmail.com
> wrote:

> Thanks all... It was a cluster issue... It's working for me now. :)
> On 3 Nov 2015 7:01 am, "Vinayakumar B" <vinayakumar.ba@huawei.com> wrote:
>
>> Hi Shashi,
>>
>>
>>
>>   Did you copy the conf directory (ex: *<hadoop>/etc/hadoop* by default)
>> from one of the cluster machines' Hadoop installations, as mentioned in #1
>> of Andreina's reply below?
>> If the cluster is running successfully with Kerberos enabled, its
>> configuration should contain "dfs.namenode.kerberos.principal".
>>
>>
>>
>>    Also, you need to keep this directory (yes, the directory itself, not
>> the files inside it) on the classpath of your client program, as shown in
>> the sketch below.
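>>
>> If keeping the conf directory on the classpath is not feasible, a minimal
>> sketch of setting the relevant properties programmatically is below. Both
>> values are placeholders, not defaults; use the actual
>> "dfs.namenode.kerberos.principal" value from your cluster's hdfs-site.xml.
>>
>>    // Hedged sketch: configure the client in code instead of relying on a
>>    // conf directory on the classpath. Placeholder values only.
>>    Configuration conf = new HdfsConfiguration();
>>    conf.set("hadoop.security.authentication", "kerberos");
>>    conf.set("dfs.namenode.kerberos.principal", "nn/_HOST@HADOOP.COM");
>>    UserGroupInformation.setConfiguration(conf);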
>>
>>
>>
>> -Vinay
>>
>>
>>
>> *From:* Shashi Vishwakarma [mailto:shashi.vish123@gmail.com]
>> *Sent:* Monday, November 02, 2015 10:47 PM
>> *To:* user@hadoop.apache.org
>> *Subject:* Re: Utility to push data into HDFS
>>
>>
>>
>> Hi Andreina,
>>
>>
>>
>> I used your Java code and ran it with the java command. On the console I
>> can see the message "Login Successful", but while accessing HDFS I get the
>> error message below:
>>
>>
>>
>> "Failed to specify server's kerberos principal name"
>>
>>
>>
>> Any suggestions for this?
>>
>>
>>
>> Thanks and Regards,
>>
>> Shashi
>>
>>
>>
>> On Mon, Nov 2, 2015 at 4:36 PM, andreina j <andreina.j@huawei.com> wrote:
>>
>>
>>
>> Hi Shashi Vishwakarma,
>>
>>
>>
>> You can follow the steps below to perform HDFS operations using Java code
>> on a secure cluster:
>>
>>
>>
>> 1.      Copy krb5.conf, hdfs.keytab, and the conf directory from the
>> installed cluster.
>>
>> 2.      Create a Maven project with a dependency on hadoop-client:
>>
>>     <dependency>
>>       <groupId>org.apache.hadoop</groupId>
>>       <artifactId>hadoop-client</artifactId>
>>       <version><!-- your cluster's Hadoop version, e.g. 2.7.1 --></version>
>>     </dependency>
>>
>>
>>
>> 3.      Build the Maven project to resolve all the dependencies.
>>
>> 4.      Add the conf directory to the classpath.
>>
>> 5.      Use the sample code below to perform an HDFS operation.
>>
>>
>>
>>             import java.io.FileInputStream;
>>             import java.io.IOException;
>>
>>             import org.apache.hadoop.conf.Configuration;
>>             import org.apache.hadoop.fs.FSDataOutputStream;
>>             import org.apache.hadoop.fs.FileSystem;
>>             import org.apache.hadoop.fs.Path;
>>             import org.apache.hadoop.hdfs.HdfsConfiguration;
>>             import org.apache.hadoop.io.IOUtils;
>>             import org.apache.hadoop.security.UserGroupInformation;
>>
>>             public class KerberosTest {
>>
>>                public static void main(String[] args) throws IOException {
>>                  // Ideally the default krb5.conf location is used; it is
>>                  // overridden here just for this example.
>>                  System.setProperty("java.security.krb5.conf",
>>                      "D:\\data\\Desktop\\cluster-test\\krb5.conf");
>>
>>                  // Log in from the keytab, if one is available (see the
>>                  // ticket-cache alternative after the note below).
>>                  UserGroupInformation.loginUserFromKeytab("hdfs@HADOOP.COM",
>>                      "D:\\data\\Desktop\\cluster-test\\conf\\hdfs.keytab");
>>
>>                  String dest = "/test/userupload/file";
>>                  String localFile = "pom.xml";
>>
>>                  // Copy the local file into HDFS.
>>                  Configuration conf = new HdfsConfiguration();
>>                  FileSystem fs = FileSystem.get(conf);
>>                  FSDataOutputStream out = fs.create(new Path(dest));
>>                  FileInputStream fIn = new FileInputStream(localFile);
>>                  IOUtils.copyBytes(fIn, out, 1024, true); // 'true' closes both streams
>>                }
>>             }
>>
>>          Note: Change the paths and principal mentioned above to match your cluster.
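>>
>>          A hedged variation on step 5: if the client user has already
>>          obtained a Kerberos ticket via kinit instead of using a keytab,
>>          the explicit login call can be dropped. With the cluster's conf
>>          directory on the classpath, the ticket cache is picked up
>>          automatically. A minimal sketch, assuming a valid ticket exists:
>>
>>             // Assumes hadoop.security.authentication=kerberos comes from
>>             // the conf directory, and a ticket cache created by kinit.
>>             Configuration conf = new HdfsConfiguration();
>>             UserGroupInformation.setConfiguration(conf);
>>             System.out.println("Current user: "
>>                 + UserGroupInformation.getCurrentUser());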
>>
>>
>>
>> Regards,
>>
>> Andreina J.
>>
>>
>>
>> *From:* Shashi Vishwakarma [mailto:shashi.vish123@gmail.com]
>> *Sent:* 02 November 2015 01:18 PM
>>
>>
>> *To:* user@hadoop.apache.org
>> *Subject:* Re: Utility to push data into HDFS
>>
>>
>>
>> Hi Naga and Chris,
>>
>>
>>
>> Yes, you are right. I don't have Hadoop installed on my Windows machine,
>> and I wish to move my files from Windows to a remote Hadoop cluster (on a
>> Linux server).
>>
>>
>>
>> Also, my cluster is Kerberos-enabled. Can you please help here? Let me
>> know the steps I should follow to implement it.
>>
>>
>>
>> Thanks and Regards
>>
>> Shashi
>>
>>
>> On Mon, Nov 2, 2015 at 7:33 AM, Naganarasimha G R (Naga) <
>> garlanaganarasimha@huawei.com> wrote:
>>
>> Hi Shashi,
>>
>>
>>
>> Not sure I got your question right, but if it is related to building
>> Hadoop on Windows, then I think the steps mentioned by James and Chris
>> will definitely help.
>>
>> But is your scenario to access HDFS remotely (not on one of the nodes of
>> the cluster) through Java, from either Windows or Linux machines?
>>
>> In that case, a certain set of jars needs to be on the client machine
>> (refer to hadoop-client/pom.xml), and a subset of the server
>> configurations (even the full set is not a problem) is required to access
>> HDFS and YARN.
>>
>>
>>
>> @Chris Nauroth, are the native components (winutils.exe and hadoop.dll)
>> required on the remote machine? AFAIK they are not required; correct me if
>> I am wrong!
>>
>>
>>
>> + Naga
>>
>>
>>
>>
>> ------------------------------
>>
>>
>>
>> *From:* Chris Nauroth [cnauroth@hortonworks.com]
>> *Sent:* Monday, November 02, 2015 02:10
>> *To:* user@hadoop.apache.org
>>
>>
>> *Subject:* Re: Utility to push data into HDFS
>>
>>
>>
>> In addition to the standard Hadoop jars available in an Apache Hadoop
>> distro, Windows also requires the native components for Windows:
>> winutils.exe and hadoop.dll.  This wiki page has more details on how that
>> works:
>>
>>
>>
>> https://wiki.apache.org/hadoop/WindowsProblems
>>
>>
>>
>> --Chris Nauroth
>>
>>
>>
>> *From: *James Bond <bond.bhai@gmail.com>
>> *Reply-To: *"user@hadoop.apache.org" <user@hadoop.apache.org>
>> *Date: *Sunday, November 1, 2015 at 9:35 AM
>> *To: *"user@hadoop.apache.org" <user@hadoop.apache.org>
>> *Subject: *Re: Utility to push data into HDFS
>>
>>
>>
>> I am guessing this should work -
>>
>>
>>
>>
>> https://stackoverflow.com/questions/9722257/building-jar-that-includes-all-its-dependencies
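>>
>> For reference, a hedged sketch of the approach from that link: the
>> maven-shade-plugin can bundle all dependencies into one runnable jar. The
>> plugin version below is an assumption; pick whatever is current for you.
>>
>>     <plugin>
>>       <groupId>org.apache.maven.plugins</groupId>
>>       <artifactId>maven-shade-plugin</artifactId>
>>       <version>2.4.1</version>
>>       <executions>
>>         <execution>
>>           <phase>package</phase>
>>           <goals>
>>             <goal>shade</goal>
>>           </goals>
>>         </execution>
>>       </executions>
>>     </plugin>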
>>
>>
>>
>> On Sun, Nov 1, 2015 at 8:15 PM, Shashi Vishwakarma <
>> shashi.vish123@gmail.com> wrote:
>>
>> Hi Chris,
>>
>>
>>
>> Thanks for your reply. I agree WebHDFS is one of the options for accessing
>> Hadoop from Windows or *nix. I wanted to know whether I can write Java
>> code that can be executed from Windows.
>>
>>
>>
>> Ex: java HDFSPut.java  <<- this Java code should have an FsShell command
>> (hadoop fs -ls) written in Java, something like the sketch below.
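>>
>> A minimal sketch of what I mean (the class name HDFSList is made up; it
>> assumes the Hadoop client jars and the cluster conf directory are on the
>> classpath):
>>
>>     import org.apache.hadoop.conf.Configuration;
>>     import org.apache.hadoop.fs.FileStatus;
>>     import org.apache.hadoop.fs.FileSystem;
>>     import org.apache.hadoop.fs.Path;
>>
>>     // Rough Java equivalent of "hadoop fs -ls /".
>>     public class HDFSList {
>>         public static void main(String[] args) throws Exception {
>>             FileSystem fs = FileSystem.get(new Configuration());
>>             for (FileStatus status : fs.listStatus(new Path("/"))) {
>>                 System.out.println(status.getPath());
>>             }
>>         }
>>     }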
>>
>>
>>
>> In order to execute this, what is the list of items I should have on
>> Windows?
>>
>> For example, Hadoop jars, etc.
>>
>>
>>
>> If you can throw some light on this, it would be a great help.
>>
>>
>>
>> Thanks
>>
>> Shashi
>>
>>
>> On Sun, Nov 1, 2015 at 1:39 AM, Chris Nauroth <cnauroth@hortonworks.com>
>> wrote:
>>
>> Hello Shashi,
>>
>>
>>
>> Maybe I'm missing some context, but are the Hadoop FsShell commands
>> sufficient?
>>
>>
>>
>>
>> http://hadoop.apache.org/docs/r2.7.1/hadoop-project-dist/hadoop-common/FileSystemShell.html
>>
>>
>>
>> These commands work on both *nix and Windows.
>>
>>
>>
>> Another option would be WebHDFS, which just requires an HTTP client on
>> your platform of choice.
>>
>>
>>
>>
>> http://hadoop.apache.org/docs/r2.7.1/hadoop-project-dist/hadoop-hdfs/WebHDFS.html
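>>
>> As a hedged illustration of the WebHDFS route (no Hadoop jars needed on
>> the client), an upload is two plain HTTP requests: the NameNode answers
>> op=CREATE with a redirect to a DataNode, and the file bytes are PUT there.
>> The host, port, and paths below are placeholders for an unsecured cluster:
>>
>>     import java.io.FileInputStream;
>>     import java.io.InputStream;
>>     import java.io.OutputStream;
>>     import java.net.HttpURLConnection;
>>     import java.net.URL;
>>
>>     public class WebHdfsPut {
>>         public static void main(String[] args) throws Exception {
>>             // Placeholder NameNode address and target path.
>>             String create = "http://namenode.example.com:50070/webhdfs/v1"
>>                 + "/test/file?op=CREATE&user.name=hdfs";
>>
>>             // Step 1: ask the NameNode; it redirects to a DataNode.
>>             HttpURLConnection nn =
>>                 (HttpURLConnection) new URL(create).openConnection();
>>             nn.setRequestMethod("PUT");
>>             nn.setInstanceFollowRedirects(false);
>>             String dataNodeUrl = nn.getHeaderField("Location");
>>
>>             // Step 2: PUT the file content to the DataNode URL.
>>             HttpURLConnection dn =
>>                 (HttpURLConnection) new URL(dataNodeUrl).openConnection();
>>             dn.setRequestMethod("PUT");
>>             dn.setDoOutput(true);
>>             try (InputStream in = new FileInputStream("pom.xml");
>>                  OutputStream out = dn.getOutputStream()) {
>>                 byte[] buf = new byte[4096];
>>                 int n;
>>                 while ((n = in.read(buf)) != -1) {
>>                     out.write(buf, 0, n);
>>                 }
>>             }
>>             System.out.println(dn.getResponseCode()); // 201 on success
>>         }
>>     }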
>>
>>
>>
>> --Chris Nauroth
>>
>>
>>
>> *From: *Shashi Vishwakarma <shashi.vish123@gmail.com>
>> *Reply-To: *"user@hadoop.apache.org" <user@hadoop.apache.org>
>> *Date: *Saturday, October 31, 2015 at 5:46 AM
>> *To: *"user@hadoop.apache.org" <user@hadoop.apache.org>
>> *Subject: *Utility to push data into HDFS
>>
>>
>>
>> Hi
>>
>> I need to build a common utility for Unix/Windows-based systems to push
>> data into Hadoop. A user should be able to run that utility from any
>> platform and push data into HDFS.
>>
>> Any suggestions ?
>>
>> Thanks
>>
>> Shashi
>>
>
