hadoop-general mailing list archives

From Alessandro Binhara <binh...@gmail.com>
Subject Re: Problem write on HDFS - SOLUTION
Date Tue, 25 Jan 2011 18:07:23 GMT
Hello all,

I found the problem: the Hadoop jar file from the Hadoop website has a problem.
I copied the jar file from a Cloudera distribution and the error disappeared.

If there is a committer from the Hadoop website reading this, please check the problem.

thanks
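
For reference, below is a minimal sketch (in Java) of the kind of HDFS "hello world"
program this thread is about; the class name, NameNode address, and output path are
placeholders, not the actual code from the thread. A program like this fails with
NoClassDefFoundError on org.apache.hadoop.conf.Configuration when hadoop-core is
missing from the runtime classpath:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsHelloSketch {
        public static void main(String[] args) throws Exception {
            // Reads core-site.xml / hdfs-site.xml from the classpath, if present.
            Configuration conf = new Configuration();
            // Placeholder NameNode address; use the cluster's real fs.default.name.
            conf.set("fs.default.name", "hdfs://localhost:9000");

            // Open the distributed filesystem and write a small test file.
            FileSystem fs = FileSystem.get(conf);
            FSDataOutputStream out = fs.create(new Path("/tmp/hello.txt"));
            out.writeUTF("hello hdfs");
            out.close();
            fs.close();
        }
    }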

On Tue, Jan 25, 2011 at 9:27 AM, Alessandro Binhara <binhara@gmail.com> wrote:

> I built a servlet with Hadoop.
> I thought that the Tomcat environment would find hadoop-core-0.20.2.jar,
> but I get the same error:
>
> *type* Exception report
>
> *message*
>
> *description* *The server encountered an internal error () that prevented
> it from fulfilling this request.*
>
> *exception*
>
> javax.servlet.ServletException: Servlet execution threw an exception
>
>  *root cause*
>
> java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.conf.Configuration
> 	HadoopWriterLib.HadoopWriter.OpenFileSystem(HadoopWriter.java:22)
> 	HadoopWriterLib.HadoopWriter.<init>(HadoopWriter.java:16)
> 	HadoopServletTest.doGet(HadoopServletTest.java:35)
> 	javax.servlet.http.HttpServlet.service(HttpServlet.java:621)
> 	javax.servlet.http.HttpServlet.service(HttpServlet.java:722)
>
>  *note* *The full stack trace of the root cause is available in the Apache
> Tomcat/7.0.6 logs.*
>
> Could the error be a problem with my Ubuntu server?
>
> thanks
>
> On Mon, Jan 24, 2011 at 1:01 PM, Alessandro Binhara <binhara@gmail.com> wrote:
>
>> ..I tried
>>  java -classpath hadoop-core-0.20.1.jar -jar HahoopHdfsHello.jar
>>
>> and got the same error.
>> I will try building a servlet and running it on Tomcat.
>> I have tried many ways to configure the classpath... all of them failed.
>>
>> thanks
>>
>>
>> On Mon, Jan 24, 2011 at 12:54 PM, Harsh J <qwertymaniac@gmail.com> wrote:
>>
>>> The issue would definitely lie with your CLASSPATH.
>>>
>>> Ideally, while beginning development using Hadoop 0.20, it is better
>>> to use the `hadoop jar` command to launch jars of any kind that
>>> require Hadoop libraries; be it MapReduce or not. The command will
>>> ensure that all the classpath requirements for Hadoop-side libraries
>>> are satisfied, so you don't have to worry.
>>>
>>> Anyhow, try launching it this way:
>>> $ java -classpath hadoop-0.20.2-core.jar -jar HadoopHdfsHello.jar; #
>>> This should run just fine.
>>>
>>> On Mon, Jan 24, 2011 at 5:06 PM, Alessandro Binhara <binhara@gmail.com>
>>> wrote:
>>> > Hello,
>>> >
>>> > I solved the problem with the jar:
>>> > I put hadoop-core-0.20.2.jar in the same directory as my jar.
>>> >
>>> > I configured the classpath:
>>> > export CLASSPATH=.:$JAVA_HOME
>>> >
>>> > I got this error in the shell:
>>> >
>>> > root:~# java -jar HahoopHdfsHello.jar
>>> > Exception in thread "main" java.lang.NoClassDefFoundError:
>>> > org/apache/hadoop/conf/Configuration
>>> >        at HadooHdfsHello.main(HadooHdfsHello.java:18)
>>> > Caused by: java.lang.ClassNotFoundException:
>>> > org.apache.hadoop.conf.Configuration
>>> >        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>> >        at java.security.AccessController.doPrivileged(Native Method)
>>> >        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>> >        at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>>> >        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>> >        at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
>>> >        ... 1 more
>>> >
>>> >
>>> > What is the problem?
>>> >
>>> > thanks
>>> >
>>>
>>>
>>>
>>> --
>>> Harsh J
>>> www.harshj.com
>>>
>>
>>
>
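
Below is a rough sketch of the two launch styles suggested in the quoted reply above
(dependency jar names such as commons-logging.jar are placeholders; only
HahoopHdfsHello.jar, HadooHdfsHello and hadoop-core-0.20.2.jar appear in the thread
itself). One detail worth noting: java -jar takes its classpath from the jar's own
manifest and ignores -classpath, so the hadoop-core jar is only picked up when the
main class is named explicitly on the command line.

    # Let the hadoop wrapper script put the Hadoop jars on the classpath
    $ hadoop jar HahoopHdfsHello.jar HadooHdfsHello

    # Or name the main class yourself and list hadoop-core plus its dependencies
    $ java -classpath HahoopHdfsHello.jar:hadoop-core-0.20.2.jar:commons-logging.jar HadooHdfsHello

For the servlet case, Tomcat's per-webapp classloader only sees jars under the
webapp's WEB-INF/lib directory, so hadoop-core-0.20.2.jar (and its dependencies)
would normally be copied there rather than placed next to the deployed application.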
