flume-user mailing list archives

From vijay k <k.vija...@gmail.com>
Subject Re: Flume agent failure
Date Mon, 02 Jul 2012 07:11:46 GMT
Hi Mike,

Please find the flume-ng script execution output below.

root@md-trngpoc1:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT# bin/flume-ng agent -n agent1 -c conf -f conf/agent1.conf
Info: Sourcing environment configuration script /usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf/flume-env.sh
Info: Including Hadoop libraries found via (/usr/local/hadoop/bin/hadoop) for HDFS access
+ exec /usr/lib/jvm/java-6-sun-1.6.0.26/jre/bin/java -Xms100m -Xmx200m -cp '/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib/*:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf' -Djava.library.path=:/usr/local/hadoop/bin/../lib/native/Linux-i386-32 org.apache.flume.node.Application -n agent1 -f conf/agent1.conf
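
Note that although the wrapper prints "Including Hadoop libraries found via (/usr/local/hadoop/bin/hadoop)", the -cp value in the exec line above lists only the Flume conf and lib directories and no Hadoop jars, which is consistent with the NoClassDefFoundError in the log below. A quick sanity check (a sketch only, assuming a stock Hadoop layout under /usr/local/hadoop as in the paths above; the hadoop-core-1.0.3.jar name is a placeholder to be replaced with whatever find reports):

# Locate the Hadoop core jar under the installation used above.
find /usr/local/hadoop -name 'hadoop-core*.jar'
# Confirm the jar actually contains the class the HDFS sink needs.
jar tf /usr/local/hadoop/hadoop-core-1.0.3.jar | grep 'SequenceFile\$CompressionType'

If grep prints the class, the jar is present and the remaining problem is only that it never makes it onto the agent's classpath.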


Flume.log
==========

root@md-trngpoc1:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT# more flume.log
2012-07-02 12:37:40,326 INFO lifecycle.LifecycleSupervisor: Starting lifecycle supervisor 1
2012-07-02 12:37:40,327 INFO node.FlumeNode: Flume node starting - agent1
2012-07-02 12:37:40,329 INFO nodemanager.DefaultLogicalNodeManager: Node manager starting
2012-07-02 12:37:40,329 INFO lifecycle.LifecycleSupervisor: Starting lifecycle supervisor 10
2012-07-02 12:37:40,329 INFO properties.PropertiesFileConfigurationProvider: Configuration provider starting
2012-07-02 12:37:40,330 INFO properties.PropertiesFileConfigurationProvider: Reloading configuration file:conf/agent1.conf
2012-07-02 12:37:40,337 INFO conf.FlumeConfiguration: Processing:HDFS
2012-07-02 12:37:40,338 INFO conf.FlumeConfiguration: Processing:HDFS
2012-07-02 12:37:40,338 INFO conf.FlumeConfiguration: Processing:HDFS
2012-07-02 12:37:40,338 INFO conf.FlumeConfiguration: Processing:HDFS
2012-07-02 12:37:40,338 INFO conf.FlumeConfiguration: Added sinks: HDFS Agent: agent1
2012-07-02 12:37:40,354 INFO conf.FlumeConfiguration: Post-validation flume configuration contains configuration for agents: [agent1]
2012-07-02 12:37:40,354 INFO properties.PropertiesFileConfigurationProvider: Creating channels
2012-07-02 12:37:40,357 INFO properties.PropertiesFileConfigurationProvider: created channel MemoryChannel-2
2012-07-02 12:37:40,365 INFO sink.DefaultSinkFactory: Creating instance of sink HDFS typehdfs
2012-07-02 12:37:40,369 ERROR properties.PropertiesFileConfigurationProvider: Failed to start agent because dependencies were not found in classpath. Error follows.
java.lang.NoClassDefFoundError: org/apache/hadoop/io/SequenceFile$CompressionType
        at org.apache.flume.sink.hdfs.HDFSEventSink.configure(HDFSEventSink.java:204)
        at org.apache.flume.conf.Configurables.configure(Configurables.java:41)
        at org.apache.flume.conf.properties.PropertiesFileConfigurationProvider.loadSinks(PropertiesFileConfigurationProvider.java:373)
        at org.apache.flume.conf.properties.PropertiesFileConfigurationProvider.load(PropertiesFileConfigurationProvider.java:223)
        at org.apache.flume.conf.file.AbstractFileConfigurationProvider.doLoad(AbstractFileConfigurationProvider.java:123)
        at org.apache.flume.conf.file.AbstractFileConfigurationProvider.access$300(AbstractFileConfigurationProvider.java:38)
        at org.apache.flume.conf.file.AbstractFileConfigurationProvider$FileWatcherRunnable.run(AbstractFileConfigurationProvider.java:202)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
        at java.util.concurrent.FutureTask$Sync.innerRunAndReset(FutureTask.java:317)
        at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:150)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$101(ScheduledThreadPoolExecutor.java:98)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.runPeriodic(ScheduledThreadPoolExecutor.java:180)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:204)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
        at java.lang.Thread.run(Thread.java:662)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.SequenceFile$CompressionType
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)

Please help me with this issue.

Thanks,
vijay
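
For reference, one way to make the missing Hadoop classes visible to the agent is via flume-env.sh, using the same HADOOP_HOME and FLUME_CLASSPATH variables that appear in the configuration quoted below. A minimal sketch (the hadoop-core-1.0.3.jar name is a placeholder; use the jar actually installed under /usr/local/hadoop):

flume-env.sh (sketch)
=====================
JAVA_HOME=/usr/lib/jvm/java-6-sun-1.6.0.26/jre

# Let the flume-ng wrapper pick up the Hadoop installation.
HADOOP_HOME=/usr/local/hadoop

# Fallback: add the Hadoop core jar to the classpath explicitly
# (placeholder jar name, match it to the installed version).
FLUME_CLASSPATH=/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf:/usr/local/hadoop/hadoop-core-1.0.3.jar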
On Sat, Jun 30, 2012 at 2:46 AM, Mike Percy <mpercy@cloudera.com> wrote:

> Vijay,
> Can you please post the output from flume-ng script when you start it now?
>
> This will be useful info for debugging:
>
> + exec /usr/lib/jvm/java-6-sun/bin/java -Xmx20m -cp … etc …
>
> Regards,
> Mike
>
>
> On Friday, June 29, 2012 at 1:00 AM, vijay k wrote:
>
> > Hi,
> >
> > Thanks for the reply.
> >
> > I have installed Hadoop in /usr/local/hadoop and added the variables below
> > in the flume-env.sh file, and re-ran bin/flume-ng agent -n agent1 -c conf
> > -f conf/agent1.conf, but I am still facing the same error.
> >
> > flume-env.sh
> > ============
> > # Enviroment variables can be set here.
> >
> > #JAVA_HOME=/usr/lib/jvm/java-6-sun
> > JAVA_HOME=/usr/lib/jvm/java-6-sun-1.6.0.26/jre
> > # Give Flume more memory and pre-allocate
> > #JAVA_OPTS="-Xms100m -Xmx200m"
> >
> > # Note that the Flume conf directory is always included in the classpath.
> >
> FLUME_CLASSPATH=/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf
> > HADOOP_HOME=/usr/local/hadoop
> >
> >
> > Error in flume.log file
> > =======================
> > 2012-06-29 13:19:23,679 ERROR
> > properties.PropertiesFileConfigurationProvider: Failed to start agent
> > because dependencies were not found in classpath. Error follows.
> >
> >
> >
> > Please let me know if I am doing anything wrong.
> >
> > Thanks,
> > Vijay
> >
> >
> > > On 6/29/12, Mike Percy <mpercy@cloudera.com> wrote:
> > > Vijay - Flume does not include the HDFS libraries. This is because every
> > > major version of HDFS is wire-incompatible with all the others. So you will
> > > need to install Hadoop, set your HADOOP_HOME variable to point to the Hadoop
> > > installation (define this in flume-env.sh), then restart Flume and you
> > > should be good to go.
> > >
> > > Regards,
> > > Mike
> > >
> > >
> > > On Thursday, June 28, 2012 at 2:02 AM, vijay k wrote:
> > >
> > > > Hi,
> > > >
> > > > I have removed the
> > > >
> 'log4j.appender.LOGFILE=org.apache.flume.lifecycle.LifecycleSupervisor'
> > > > in the log4j.properties file,
> > > >
> > > > and i run the bin/flume-ng agent -n agent1 -c conf -f
> conf/agent1.conf
> > > > command, i got stuck on the execution like below.
> > > >
> > > > root@md-trngpoc1:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT# bin/flume-ng agent -n agent1 -c conf -f conf/agent1.conf
> > > > Info: Sourcing environment configuration script
> > > >
> /usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf/flume-env.sh
> > > > + exec /usr/lib/jvm/java-6-sun/bin/java -Xmx20m -cp
> > > >
> '/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib/*'
> > > > -Djava.library.path= org.apache.flume.node.Application -n agent1 -f
> > > > conf/agent1.conf
> > > >
> > > >
> > > >
> > > > Flume.log file
> > > > ====================
> > > >
> > > > root@md-trngpoc1:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT# more flume.log
> > > > 2012-06-28 14:17:31,556 INFO lifecycle.LifecycleSupervisor: Starting
> > > > lifecycle supervisor 1
> > > > 2012-06-28 14:17:31,558 INFO node.FlumeNode: Flume node starting -
> agent1
> > > > 2012-06-28 14:17:31,559 INFO nodemanager.DefaultLogicalNodeManager:
> > > > Node manager starting
> > > > 2012-06-28 14:17:31,559 INFO lifecycle.LifecycleSupervisor: Starting
> > > > lifecycle supervisor 10
> > > > 2012-06-28 14:17:31,560 INFO
> > > > properties.PropertiesFileConfigurationProvider: Configuration
> provider
> > > > starting
> > > > 2012-06-28 14:17:31,561 INFO
> > > > properties.PropertiesFileConfigurationProvider: Reloading
> > > > configuration file:conf/agent1.conf
> > > > 2012-06-28 14:17:31,566 INFO conf.FlumeConfiguration: Processing:HDFS
> > > > 2012-06-28 14:17:31,567 INFO conf.FlumeConfiguration: Processing:HDFS
> > > > 2012-06-28 14:17:31,567 INFO conf.FlumeConfiguration: Processing:HDFS
> > > > 2012-06-28 14:17:31,567 INFO conf.FlumeConfiguration: Processing:HDFS
> > > > 2012-06-28 14:17:31,567 INFO conf.FlumeConfiguration: Added sinks:
> > > > HDFS Agent: agent1
> > > > 2012-06-28 14:17:31,582 INFO conf.FlumeConfiguration: Post-validation
> > > > flume configuration contains configuration for agents: [agent1]
> > > > 2012-06-28 14:17:31,582 INFO
> > > > properties.PropertiesFileConfigurationProvider: Creating channels
> > > > 2012-06-28 14:17:31,587 INFO
> > > > properties.PropertiesFileConfigurationProvider: created channel
> > > > MemoryChannel-2
> > > > 2012-06-28 14:17:31,595 INFO sink.DefaultSinkFactory: Creating
> > > > instance of sink HDFS typehdfs
> > > > 2012-06-28 14:17:31,599 ERROR
> > > > properties.PropertiesFileConfigurationProvider: Failed to start agent
> > > > because dependencies were not found in classpath. Error follows.
> > > > java.lang.NoClassDefFoundError:
> > > > org/apache/hadoop/io/SequenceFile$CompressionType
> > > > at
> > > >
> org.apache.flume.sink.hdfs.HDFSEventSink.configure(HDFSEventSink.java:204)
> > > > at
> org.apache.flume.conf.Configurables.configure(Configurables.java:41)
> > > > at
> > > >
> org.apache.flume.conf.properties.PropertiesFileConfigurationProvider.loadSinks(PropertiesFileConfigurationProvider.java:373)
> > > > at
> > > >
> org.apache.flume.conf.properties.PropertiesFileConfigurationProvider.load(PropertiesFileConfigurationProvider.java:223)
> > > > at
> > > >
> org.apache.flume.conf.file.AbstractFileConfigurationProvider.doLoad(AbstractFileConfigurationProvider.java:123)
> > > > at
> > > >
> org.apache.flume.conf.file.AbstractFileConfigurationProvider.access$300(AbstractFileConfigurationProvider.java:38)
> > > > at
> > > >
> org.apache.flume.conf.file.AbstractFileConfigurationProvider$FileWatcherRunnable.run(AbstractFileConfigurationProvider.java:202)
> > > > at
> > > >
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
> > > > at
> > > >
> java.util.concurrent.FutureTask$Sync.innerRunAndReset(FutureTask.java:317)
> > > > at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:150)
> > > > at
> > > >
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$101(ScheduledThreadPoolExecutor.java:98)
> > > > at
> > > >
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.runPeriodic(ScheduledThreadPoolExecutor.java:180)
> > > > at
> > > >
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:204)
> > > > at
> > > >
> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
> > > > at
> > > >
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
> > > > at java.lang.Thread.run(Thread.java:662)
> > > > Caused by: java.lang.ClassNotFoundException:
> > > > org.apache.hadoop.io.SequenceFile$CompressionType
> > > > at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> > > > at java.security.AccessController.doPrivileged(Native Method)
> > > > at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> > > > at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > > > at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> > > > at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > > >
> > > >
> > > >
> > > >
> > > >
> > > > In which agent file can I add the -Dflume.root.logger=INFO,console
> > > > option?
> > > >
> > > > Please help me on this issue.
> > > >
> > > >
> > > > Thanks,
> > > > Vijay
> > > >
> > > >
> > > > > On 6/28/12, Hari Shreedharan <hshreedharan@cloudera.com> wrote:
> > > > > Yes, Mike is right. I missed the "console."
> > > > >
> > > > > Thanks
> > > > > Hari
> > > > > --
> > > > > Hari Shreedharan
> > > > >
> > > > >
> > > > > On Wednesday, June 27, 2012 at 11:55 PM, Mike Percy wrote:
> > > > >
> > > > > > I think there was a minor typo there in the email, it should be
> > > > > > -Dflume.root.logger=INFO,console
> > > > > >
> > > > > > Regards,
> > > > > > Mike
> > > > > >
> > > > > > On Wednesday, June 27, 2012, vijay k wrote:
> > > > > > > Thanks for the reply Hari,
> > > > > > >
> > > > > > > I will try the below command, and let you know the result.
> > > > > > >
> > > > > > > On Thu, Jun 28, 2012 at 7:52 AM, Hari Shreedharan
> > > > > > > <hshreedharan@cloudera.com> wrote:
> > > > > > > > Vijay,
> > > > > > > >
> > > > > > > > You are asking Flume to look at /conf (rather than ./conf) for
> > > > > > > > the log4j properties file. Please change the command to:
> > > > > > > >
> > > > > > > > bin/flume-ng agent -n agent1 -c conf -f conf/agent1.conf
> > > > > > > >
> > > > > > > > Also please remove the line you added to the log4j properties
> > > > > > > > file. It is not valid because LifecycleSupervisor is not a
> > > > > > > > Log4jAppender. Just leave the config as specified and you will
> > > > > > > > see the log in the same folder you are running the agent from,
> > > > > > > > or specify -Dflume.root.logger=INFO in the flume agent
> > > > > > > > command,console to have flume dump the logs to console.
> > > > > > > >
> > > > > > > >
> > > > > > > >
> > > > > > > > Thanks
> > > > > > > > Hari
> > > > > > > >
> > > > > > > >
> > > > > > > > --
> > > > > > > > Hari Shreedharan
> > > > > > > >
> > > > > > > >
> > > > > > > > On Wednesday, June 27, 2012 at 7:10 PM, vijay k wrote:
> > > > > > > >
> > > > > > > > > Can anyone respond to the issue below?
> > > > > > > > >
> > > > > > > > > On Tue, Jun 26, 2012 at 8:20 PM, vijay k <k.vijay52@gmail.com> wrote:
> > > > > > > > > >
> > > > > > > > > > Hi,
> > > > > > > > > > I have run flume-ng, but it is not moving forward; it gets
> > > > > > > > > > hung up. Below is my agent1.conf config file:
> > > > > > > > > >
> > > > > > > > > > agent1.conf configuration
> > > > > > > > > > ---------------------------------
> > > > > > > > > >
> > > > > > > > > > agent1.sources = tail
> > > > > > > > > > agent1.channels = MemoryChannel-2
> > > > > > > > > > agent1.sinks = HDFS
> > > > > > > > > > agent1.sources.tail.type = exec
> > > > > > > > > > agent1.sources.tail.command = tail -F /var/log/syslog.1
> > > > > > > > > > agent1.sources.tail.channels = MemoryChannel-2
> > > > > > > > > > agent1.sinks.HDFS.channel = MemoryChannel-2
> > > > > > > > > > agent1.sinks.HDFS.type = hdfs
> > > > > > > > > > agent1.sinks.HDFS.hdfs.path = hdfs://10.5.114.110:9000/flume
> > > > > > > > > > agent1.sinks.HDFS.hdfs.file.Type = DataStream
> > > > > > > > > > agent1.channels.MemoryChannel-2.type = memory
> > > > > > > > > >
> > > > > > > > > > I have run agent1.conf using the following command:
> > > > > > > > > >
> > > > > > > > > > #bin/flume-ng agent -n agent1 -c /conf -f conf/agent1.conf
> > > > > > > > > >
> > > > > > > > > >
> > > > > > > > > > root@md-trngpoc1:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf# ls -lrt
> > > > > > > > > > total 24
> > > > > > > > > > -rw-r--r-- 1 root root 2070 2012-06-26 12:57 log4j.properties
> > > > > > > > > > -rw-r--r-- 1 root root 1132 2012-06-26 12:57 flume-env.sh.template
> > > > > > > > > > -rw-r--r-- 1 root root 1661 2012-06-26 12:57 flume-conf.properties.template
> > > > > > > > > > -rw-r--r-- 1 root root 1661 2012-06-26 19:35 flume.conf
> > > > > > > > > > -rw-r--r-- 1 root root 1132 2012-06-26 19:36 flume-env.sh
> > > > > > > > > > -rw-r--r-- 1 root root 438 2012-06-26 19:38 agent1.conf
> > > > > > > > > > root@md-trngpoc1:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf# chmod 775 agent1.conf
> > > > > > > > > > root@md-trngpoc1:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf# cd ..
> > > > > > > > > >
> > > > > > > > > > Here, i am getting the following error.
> > > > > > > > > >
> > > > > > > > > > root@md-trngpoc1:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT# bin/flume-ng agent -n agent1 -c /conf -f conf/agent1.conf
> > > > > > > > > > + exec /usr/lib/jvm/java-6-sun/bin/java -Xmx20m -cp '/conf:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib/*' -Djava.library.path= org.apache.flume.node.Application -n agent1 -f conf/agent1.conf
> > > > > > > > > > log4j:WARN No appenders could be found for logger (org.apache.flume.lifecycle.LifecycleSupervisor).
> > > > > > > > > > log4j:WARN Please initialize the log4j system properly.
> > > > > > > > > > log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig
> > > > > > > > >
> > > > > > > >
> > > > > > >
> > > > > >
> > > > >
> > > >
> > >
> >
>
>
>
>
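
Putting the pieces from the thread above together, the agent command with the corrected console-logging flag Mike spells out (and with ./conf rather than /conf, as Hari points out) would look roughly like this; the flag only redirects log output to the console for debugging and does not by itself fix the classpath problem:

bin/flume-ng agent -n agent1 -c conf -f conf/agent1.conf -Dflume.root.logger=INFO,console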
