hadoop-common-user mailing list archives

From: Zhengguo 'Mike' SUN <zhengguo...@yahoo.com>
Subject: Re: How do I include customized InputFormat, InputSplit and RecordReader in a C++ pipes job?
Date: Wed, 29 Oct 2008 16:22:01 GMT
Hi, Amareshwari,

Is -libjars a new option in Hadoop 0.19? I am using 0.17.2, and the only option I see is -jar,
which didn't work for me. Besides passing them as a jar file, is there any other way to
do that?

Thanks
Mike


________________________________
From: Amareshwari Sriramadasu <amarsri@yahoo-inc.com>
To: core-user@hadoop.apache.org
Sent: Tuesday, October 28, 2008 11:58:33 PM
Subject: Re: How do I include customized InputFormat, InputSplit and RecordReader in a C++ pipes job?

Hi,

How are you passing your classes to the pipes job? If you are passing
them as a jar file, you can use the -libjars option. From branch 0.19
onward, the libjars files are also added to the client classpath.
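For example, an invocation along these lines should work (the jar name, class name, and
paths here are only placeholders, and generic options such as -libjars go before the
pipes-specific options):

    hadoop pipes \
        -libjars custom-input.jar \
        -inputformat org.example.MyInputFormat \
        -input /user/mike/input \
        -output /user/mike/output \
        -program bin/my-pipes-binary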

Thanks
Amareshwari
Zhengguo 'Mike' SUN wrote:
> Hi,
>
> I implemented customized classes for InputFormat, InputSplit and RecordReader in Java
> and was trying to use them in a C++ pipes job. The customized InputFormat class could be
> included using the -inputformat option, but it threw a ClassNotFoundException for my
> customized InputSplit class. It seemed the classpath had not been set correctly. Is there
> any way to include my customized classes in a pipes job?