hbase-user mailing list archives

From Venkatesh <vramanatha...@aol.com>
Subject Re: row_counter map reduce job & 0.90.1
Date Mon, 04 Apr 2011 17:57:52 GMT
Sorry about this. It was indeed an environment issue; my core-site.xml was pointing at the wrong hadoop.
Thanks for the tips.
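
In case it helps anyone else who hits this: the thing to check is which HDFS the core-site.xml on the submitting machine points at. Roughly like this (the host/port below are placeholders, not my actual setup):

  <!-- conf/core-site.xml on the machine submitting the job -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode.example.com:9000</value>
  </property>

If that value points at a different hadoop than the one the jobtracker uses, the job client can end up resolving the libjars path against the wrong filesystem, which is where the "does not exist" error in the mails below came from.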

-----Original Message-----
From: Venkatesh <vramanathan00@aol.com>
To: user@hbase.apache.org
Sent: Fri, Apr 1, 2011 4:51 pm
Subject: Re: row_counter map reduce job & 0.90.1

Yeah, I tried that as well as what Ted suggested. It can't find the hadoop jar.
Hadoop map reduce jobs work fine; it's just HBase map reduce jobs that fail with this error.

tx

-----Original Message-----
From: Stack <stack@duboce.net>
To: user@hbase.apache.org
Sent: Fri, Apr 1, 2011 12:39 pm
Subject: Re: row_counter map reduce job & 0.90.1

Does where you are running from have a build/classes dir and a
hadoop-0.20.2-core.jar at top level?  If so, try cleaning out the
build/classes.  Also, could try something like this:

HADOOP_CLASSPATH=/home/stack/hbase-0.90.2-SNAPSHOT/hbase-0.90.2-SNAPSHOT-tests.jar:/home/stack/hbase-0.90.2-SNAPSHOT/hbase-0.90.2-SNAPSHOT.jar:`/home/stack/hbase-0.90.2-SNAPSHOT/bin/hbase classpath` ./bin/hadoop jar /home/stack/hbase-0.90.2-SNAPSHOT/hbase-0.90.2-SNAPSHOT.jar rowcounter usertable

... only make sure the hadoop jar is in HADOOP_CLASSPATH.

But you shouldn't have to do the latter at least.  Compare where it
works to where it doesn't.  Something is different.

St.Ack

On Fri, Apr 1, 2011 at 9:26 AM, Venkatesh <vramanathan00@aol.com> wrote:
> Definitely yes..It's all referenced in the -classpath option of the jvm of the
> tasktracker/jobtracker/datanode/namenode..
> & the file does exist in the cluster..
>
> But the error I get is on the client:
> File /home..../hdfs/tmp/mapred/system/job_201103311630_0027/libjars/hadoop-0.20.2-core.jar
> does not exist.
>    at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:361)
>    at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:245)
>    at org.apache.hadoop.filecache.DistributedCache.getTimestamp(DistributedCache.java:509)
>    at org.apache.hadoop.mapred.JobClient.configureCommandLineOptions(JobClient.java:629)
>    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:761)
>    at org.apache.hadoop.mapreduce.Job.submit(Job.java:432)
>
> So, in theory it shouldn't expect it on the client..correct?
>
> This is the only thing that is stopping me from moving to 0.90.1
>
> -----Original Message-----
> From: Stack <stack@duboce.net>
> To: user@hbase.apache.org
> Sent: Fri, Apr 1, 2011 12:19 pm
> Subject: Re: row_counter map reduce job & 0.90.1
>
> On Fri, Apr 1, 2011 at 9:06 AM, Venkatesh <vramanathan00@aol.com> wrote:
>> I'm able to run this job from the hadoop machine (where job & task tracker
>> also runs)
>> /hadoop jar /home/maryama/hbase-0.90.1/hbase-0.90.1.jar rowcounter <usertable>
>>
>> But, I'm not able to run the same job from
>> a) hbase client machine (full hbase & hadoop installed)
>> b) hbase server machines (ditto)
>>
>> Get
>> File /home/.../hdfs/tmp/mapred/system/job_201103311630_0024/libjars/hadoop-0.20.2-core.jar
>> does not exist.
>
> Is that jar present on the cluster?
>
> St.Ack