hadoop-mapreduce-user mailing list archives

From Peng Yu <pengyu...@gmail.com>
Subject Re: Can not follow Single Node Setup example.
Date Thu, 27 Jun 2013 17:07:23 GMT
Hi,

~/Downloads/hadoop-install/hadoop$ rm -rf ~/input/conf/
~/Downloads/hadoop-install/hadoop$ bin/hadoop fs -put conf input
put: Target input/conf is a directory

I get the above output. Is it the correct output? Thanks.
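For what it's worth, the "Target input/conf is a directory" message suggests that the earlier `bin/hadoop fs -put conf input` ran while `input` already existed on HDFS, so `conf` was nested inside it as `input/conf`, which the grep example then rejects as "Not a file". A minimal local sketch of the same copy-into-existing-directory semantics (the `demo/` paths are illustrative, not the poster's; plain `mkdir`/`cp` stand in for the HDFS commands):

```shell
# Illustrative local sketch: copying a directory into an existing directory
# nests the source, just as `hadoop fs -put conf input` nests conf under input/.
set -e
mkdir -p demo/conf demo/input
touch demo/conf/core-site.xml demo/conf/hdfs-site.xml

cp -r demo/conf demo/input      # demo/input exists, so this creates demo/input/conf/
ls demo/input                   # prints: conf

# Fix: drop the nested copy and copy the files themselves.
rm -rf demo/input/conf
cp demo/conf/*.xml demo/input
ls demo/input                   # prints: core-site.xml  hdfs-site.xml
```

On HDFS the equivalent would be removing the stale directory (`bin/hadoop fs -rmr input`) and re-running the put so the `*.xml` files land directly under `input/` before launching the grep job.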

On Wed, Jun 26, 2013 at 10:51 AM, Shahab Yunus <shahab.yunus@gmail.com> wrote:
> It is looking for a file within your login folder
> /user/py/input/conf
>
> You are running your job from
> hadoop/bin
> and I think the hadoop job is looking for files in the current folder.
>
> Regards,
> Shahab
>
>
> On Wed, Jun 26, 2013 at 11:02 AM, Peng Yu <pengyu.ut@gmail.com> wrote:
>>
>> Hi,
>>
>> Here are what I have.
>>
>> ~/Downloads/hadoop-install/hadoop$ ls
>> CHANGES.txt  LICENSE.txt  NOTICE.txt  README.txt  bin  build.xml  c++
>> conf  contrib  hadoop-ant-1.1.2.jar  hadoop-client-1.1.2.jar
>> hadoop-core-1.1.2.jar  hadoop-examples-1.1.2.jar
>> hadoop-minicluster-1.1.2.jar  hadoop-test-1.1.2.jar
>> hadoop-tools-1.1.2.jar  input  ivy  ivy.xml  lib  libexec  logs  sbin
>> share  src  webapps
>> ~/Downloads/hadoop-install/hadoop$ ls input/
>> capacity-scheduler.xml  core-site.xml  fair-scheduler.xml
>> hadoop-policy.xml  hdfs-site.xml  mapred-queue-acls.xml
>> mapred-site.xml
>>
>> On Wed, Jun 26, 2013 at 10:00 AM, Shahab Yunus <shahab.yunus@gmail.com>
>> wrote:
>> > Basically whether this step worked or not:
>> >
>> > $ cp conf/*.xml input
>> >
>> > Regards,
>> > Shahab
>> >
>> >
>> > On Wed, Jun 26, 2013 at 10:58 AM, Shahab Yunus <shahab.yunus@gmail.com>
>> > wrote:
>> >>
>> >> Have you verified that the 'input' folder that your job needs exists
>> >> on the hdfs (single node setup)?
>> >>
>> >> Regards,
>> >> Shahab
>> >>
>> >>
>> >> On Wed, Jun 26, 2013 at 10:53 AM, Peng Yu <pengyu.ut@gmail.com> wrote:
>> >>>
>> >>> Hi,
>> >>>
>> >>> http://hadoop.apache.org/docs/r1.1.2/single_node_setup.html
>> >>>
>> >>> I followed the above instructions. But I get the following errors.
>> >>> Does anybody know what is wrong? Thanks.
>> >>>
>> >>> ~/Downloads/hadoop-install/hadoop$ bin/hadoop jar
>> >>> hadoop-examples-*.jar grep input output 'dfs[a-z.]+'
>> >>> Warning: $HADOOP_HOME is deprecated.
>> >>>
>> >>> 13/06/26 09:49:14 WARN util.NativeCodeLoader: Unable to load
>> >>> native-hadoop library for your platform... using builtin-java classes
>> >>> where applicable
>> >>> 13/06/26 09:49:14 WARN snappy.LoadSnappy: Snappy native library not loaded
>> >>> 13/06/26 09:49:14 INFO mapred.FileInputFormat: Total input paths to process : 2
>> >>> 13/06/26 09:49:14 INFO mapred.JobClient: Cleaning up the staging area
>> >>> hdfs://localhost:9000/opt/local/var/hadoop/cache/mapred/staging/py/.staging/job_201306260838_0001
>> >>> 13/06/26 09:49:14 ERROR security.UserGroupInformation:
>> >>> PriviledgedActionException as:py cause:java.io.IOException: Not a
>> >>> file: hdfs://localhost:9000/user/py/input/conf
>> >>> java.io.IOException: Not a file: hdfs://localhost:9000/user/py/input/conf
>> >>>         at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:215)
>> >>>         at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:1051)
>> >>>         at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1043)
>> >>>         at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
>> >>>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:959)
>> >>>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:912)
>> >>>         at java.security.AccessController.doPrivileged(Native Method)
>> >>>         at javax.security.auth.Subject.doAs(Subject.java:396)
>> >>>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
>> >>>         at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:912)
>> >>>         at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:886)
>> >>>         at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1323)
>> >>>         at org.apache.hadoop.examples.Grep.run(Grep.java:69)
>> >>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>> >>>         at org.apache.hadoop.examples.Grep.main(Grep.java:93)
>> >>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >>>         at java.lang.reflect.Method.invoke(Method.java:597)
>> >>>         at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>> >>>         at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>> >>>         at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
>> >>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >>>         at java.lang.reflect.Method.invoke(Method.java:597)
>> >>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>> >>>
>> >>> --
>> >>> Regards,
>> >>> Peng
>> >>
>> >>
>> >
>>
>>
>>
>> --
>> Regards,
>> Peng
>
>



-- 
Regards,
Peng
