flume-user mailing list archives

From Dongliang Sun <davidsu...@gmail.com>
Subject Re: Could not instantiate class org.apache.flume.clients.log4jappender
Date Mon, 14 Jan 2013 08:37:32 GMT
The result from ps waux | grep flume:
hadoop   10160  2.8  3.2 186672 32960 pts/2    Sl+  16:35   0:00
/usr/lib/jvm/jdk1.6.0_37/bin/java -Xmx20m -Dflume.root.logger=DEBUG,console
-cp
/home/hadoop/softwares/apache-flume-1.3.1-bin/conf:/home/hadoop/softwares/apache-flume-1.3.1-bin/lib/*:/home/hadoop/softwares/hadoop-1.1.1/libexec/../conf:/usr/lib/jvm/jdk1.6.0_37/lib/tools.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/..:/home/hadoop/softwares/hadoop-1.1.1/libexec/../hadoop-core-1.1.1.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/asm-3.2.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/aspectjrt-1.6.11.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/aspectjtools-1.6.11.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/commons-beanutils-1.7.0.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/commons-beanutils-core-1.8.0.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/commons-cli-1.2.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/commons-codec-1.4.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/commons-collections-3.2.1.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/commons-configuration-1.6.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/commons-daemon-1.0.1.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/commons-digester-1.8.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/commons-el-1.0.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/commons-httpclient-3.0.1.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/commons-io-2.1.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/commons-lang-2.4.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/commons-logging-1.1.1.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/commons-logging-api-1.0.4.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/commons-math-2.1.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/commons-net-3.1.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/core-3.1.1.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/hadoop-capacity-scheduler-1.1.1.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/hadoop-fairscheduler-1.1.1.jar:/home/hadoop/softwares/hadoop-1
.1.1/libexec/../lib/hadoop-thriftfs-1.1.1.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/hsqldb-1.8.0.10.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/jackson-core-asl-1.8.8.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/jasper-compiler-5.5.12.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/jasper-runtime-5.5.12.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/jdeb-0.8.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/jersey-core-1.8.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/jersey-json-1.8.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/jersey-server-1.8.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/jets3t-0.6.1.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/jetty-6.1.26.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/jetty-util-6.1.26.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/jsch-0.1.42.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/junit-4.5.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/kfs-0.2.2.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/log4j-1.2.15.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/mockito-all-1.8.5.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/oro-2.0.8.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/servlet-api-2.5-20081211.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/xmlenc-0.52.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/jsp-2.1/jsp-2.1.jar:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/jsp-2.1/jsp-api-2.1.jar
-Djava.library.path=:/home/hadoop/softwares/hadoop-1.1.1/libexec/../lib/native/Linux-i386-32
org.apache.flume.node.Application --conf-file ./conf/flume.conf --name
agent1
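
(For reference: the agent process above does have the Flume lib directory on its classpath, but the ClassNotFoundException further down is thrown inside the Pig client JVM, which has its own, separate classpath. A minimal sketch of exposing the appender jars to Pig follows — the jar file names are assumptions for a 1.3.1 build, and myscript.pig is a placeholder; check the lib directory for the actual names:)

```shell
# Assumed jar names for Flume 1.3.1; adjust to whatever is actually in lib/.
FLUME_LIB=/home/hadoop/softwares/apache-flume-1.3.1-bin/lib
ls "$FLUME_LIB" | grep -E 'log4jappender|flume-ng-sdk'   # verify the jars exist

# Prepend them to Pig's classpath before running the script.
export PIG_CLASSPATH="$FLUME_LIB/flume-ng-log4jappender-1.3.1.jar:$FLUME_LIB/flume-ng-sdk-1.3.1.jar${PIG_CLASSPATH:+:$PIG_CLASSPATH}"
pig myscript.pig
```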


2013/1/14 Alexander Alten-Lorenz <wget.null@gmail.com>

> Can you please post the running flume process (ps waux|grep flume)?
>
> - Alex
>
> On Jan 14, 2013, at 8:11 AM, Dongliang Sun <davidsundl@gmail.com> wrote:
>
> > Alex,
> >
> > The flume startup has no problem:
> > flume-ng agent --conf ./conf/ --conf-file ./conf/flume.conf --name agent1
> > -Dflume.root.logger=DEBUG,console
> >
> > I have tested the following client:
> > bin/flume-ng avro-client -H localhost -p 41414 -F ~/test.txt
> >
> > It's OK, but I cannot make it work if I use the log4jappender as the
> > client.
> >
> > Thanks,
> > Dongliang
> >
> >
> > 2013/1/14 Alexander Alten-Lorenz <wget.null@gmail.com>
> >
> >> Please post the flume startup line. I guess you have missed the --conf
> >> switch.
> >>
> >> - Alex
> >>
> >> On Jan 14, 2013, at 4:25 AM, 孙东亮 <davidsundl@gmail.com> wrote:
> >>
> >>> I'm a newbie to Flume, and I just set it up for a test.
> >>> I want to use the log4jappender to get log info from Pig scripts; the
> >>> log4j.properties is:
> >>>
> >>> log4j.appender.flume = org.apache.flume.clients.log4jappender.Log4jAppender
> >>> log4j.appender.flume.Hostname = localhost
> >>> log4j.appender.flume.Port = 41414
> >>> log4j.logger.org.apache.pig=DEBUG,flume
> >>>
> >>> And the flume.conf is:
> >>>
> >>> a1.sources = r1
> >>> a1.sinks = k1
> >>> a1.channels = c1
> >>>
> >>> # Describe/configure the source
> >>> a1.sources.r1.type = avro
> >>> a1.sources.r1.bind = localhost
> >>> a1.sources.r1.port = 41414
> >>>
> >>> # Describe the sink
> >>> a1.sinks.k1.type = logger
> >>> # Use a channel which buffers events in memory
> >>> a1.channels.c1.type = memory
> >>> a1.channels.c1.capacity = 1000
> >>> a1.channels.c1.transactionCapacity = 100
> >>>
> >>> # Bind the source and sink to the channel
> >>> a1.sources.r1.channels = c1
> >>> a1.sinks.k1.channel = c1
> >>>
> >>> But I got the following error when running a Pig script:
> >>> java.lang.ClassNotFoundException: org.apache.flume.clients.log4jappender.Log4jAppender
> >>> at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> >>> at java.security.AccessController.doPrivileged(Native Method)
> >>> at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> >>> at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> >>> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> >>> at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> >>> at java.lang.Class.forName0(Native Method)
> >>> at java.lang.Class.forName(Class.java:169)
> >>> at org.apache.log4j.helpers.Loader.loadClass(Loader.java:179)
> >>> at org.apache.log4j.helpers.OptionConverter.instantiateByClassName(OptionConverter.java:320)
> >>> at org.apache.log4j.helpers.OptionConverter.instantiateByKey(OptionConverter.java:121)
> >>> at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:664)
> >>> at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:647)
> >>> at org.apache.log4j.PropertyConfigurator.parseCatsAndRenderers(PropertyConfigurator.java:568)
> >>> at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:442)
> >>> at org.apache.log4j.PropertyConfigurator.configure(PropertyConfigurator.java:367)
> >>> at org.apache.pig.Main.configureLog4J(Main.java:678)
> >>> at org.apache.pig.Main.run(Main.java:337)
> >>> at org.apache.pig.Main.main(Main.java:111)
> >>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >>> at java.lang.reflect.Method.invoke(Method.java:597)
> >>> at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> >>> log4j:ERROR Could not instantiate appender named "flume".
> >>>
> >>> Could you please help me find what I missed or what is incorrect in
> >>> the configuration?
> >>>
> >>> Thanks a lot!
> >>> Dongliang
> >>
> >> --
> >> Alexander Alten-Lorenz
> >> http://mapredit.blogspot.com
> >> German Hadoop LinkedIn Group: http://goo.gl/N8pCF
> >>
> >>
>
> --
> Alexander Alten-Lorenz
> http://mapredit.blogspot.com
> German Hadoop LinkedIn Group: http://goo.gl/N8pCF
>
>
