hadoop-user mailing list archives

From: shubhangi <shubhangi.g...@oracle.com>
Subject: For Hadoop 2.0.3, setting CLASSPATH=$(hadoop classpath) does not work, as opposed to 1.x versions
Date: Wed, 06 Mar 2013 09:50:13 GMT
I am writing an application in C++ that uses the API provided by libhdfs
to manipulate HDFS. I could run the application with Hadoop 1.0.4 and
1.1.1 by setting CLASSPATH=$(hadoop classpath).
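For reference, this is roughly how I launch it under the 1.x versions
(the application name below is a placeholder):

    # Works under Hadoop 1.0.4 / 1.1.1
    export CLASSPATH=$(hadoop classpath)
    ./my_hdfs_app    # placeholder name for my libhdfs-based application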

For Hadoop 2.0.3, setting CLASSPATH=$(hadoop classpath) does not load
the classes required by libhdfs, unlike the 1.x versions, and the
application fails with the following error:

loadFileSystems error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: 
ExceptionUtils::getStackTrace error.)
hdfsBuilderConnect(forceNewInstance=0, nn=default, port=0, 
kerbTicketCachePath=(NULL), userName=(NULL)) error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: 
ExceptionUtils::getStackTrace error.)

I then tried listing the jar files with their full paths in the
CLASSPATH (instead of the wildcard entries produced by hadoop
classpath); with that, the application runs, but it gives the following
warning:

SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for 
further details.
13/03/04 11:17:23 WARN util.NativeCodeLoader: Unable to load 
native-hadoop library for your platform... using builtin-java classes 
where applicable
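
A sketch of one way to expand the wildcard entries from hadoop classpath
into explicit jar paths, along the lines of what I did above (the
application name is a placeholder, and paths are assumed to contain no
spaces):

    #!/bin/bash
    # Expand each wildcard entry of `hadoop classpath` into the jars it matches.
    CLASSPATH=""
    for entry in $(hadoop classpath | tr ':' ' '); do
        if [ "${entry%\*}" != "$entry" ]; then
            # Entry ends in '*': append every jar in that directory.
            for jar in "${entry%\*}"*.jar; do
                [ -e "$jar" ] && CLASSPATH="$CLASSPATH:$jar"
            done
        else
            CLASSPATH="$CLASSPATH:$entry"
        fi
    done
    export CLASSPATH="${CLASSPATH#:}"
    ./my_hdfs_app    # placeholder name for my libhdfs-based application

With the CLASSPATH expanded this way, the NoClassDefFoundError goes
away, but the SLF4J and native-library warnings shown above remain.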

