flume-user mailing list archives

From: Brock Noland <br...@cloudera.com>
Subject: Re: FlumeNG-Error while writing data from source file to hdfs sink.
Date: Mon, 15 Oct 2012 14:30:47 GMT
I would make sure the hadoop command is in your path.
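
For example, a quick check (the /usr/local/hadoop location below is only an
illustration; adjust it to your install):

    which hadoop || echo "hadoop not on PATH"
    export PATH=/usr/local/hadoop/bin:$PATH

The flume-ng launcher can use the hadoop command to put the Hadoop jars on
its classpath, which is why org.apache.hadoop.io.SequenceFile is not found
without it.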

Brock

On Mon, Oct 15, 2012 at 9:23 AM, Swati Ramteke
<Swati_Ramteke@persistent.co.in> wrote:
> Hi,
>
> After updating flume.conf and flume-env.sh as per the mail below, I am no
> longer seeing the error observed earlier. However, I am now getting a new
> error:
>
> 2012-10-15 18:53:13,080 (conf-file-poller-0) [ERROR - org.apache.flume.conf.file.AbstractFileConfigurationProvider$FileWatcherRunnable.run(AbstractFileConfigurationProvider.java:207)] Failed to start agent because dependencies were not found in classpath. Error follows.
> java.lang.NoClassDefFoundError: org/apache/hadoop/io/SequenceFile$CompressionType
>         at org.apache.flume.sink.hdfs.HDFSEventSink.configure(HDFSEventSink.java:205)
>         at org.apache.flume.conf.Configurables.configure(Configurables.java:41)
>         at org.apache.flume.conf.properties.PropertiesFileConfigurationProvider.loadSinks(PropertiesFileConfigurationProvider.java:373)
>         at org.apache.flume.conf.properties.PropertiesFileConfigurationProvider.load(PropertiesFileConfigurationProvider.java:223)
>         at org.apache.flume.conf.file.AbstractFileConfigurationProvider.doLoad(AbstractFileConfigurationProvider.java:123)
>         at org.apache.flume.conf.file.AbstractFileConfigurationProvider.access$300(AbstractFileConfigurationProvider.java:38)
>         at org.apache.flume.conf.file.AbstractFileConfigurationProvider$FileWatcherRunnable.run(AbstractFileConfigurationProvider.java:202)
>         at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
>         at java.util.concurrent.FutureTask$Sync.innerRunAndReset(FutureTask.java:317)
>         at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:150)
>         at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$101(ScheduledThreadPoolExecutor.java:98)
>         at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.runPeriodic(ScheduledThreadPoolExecutor.java:180)
>         at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:204)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>         at java.lang.Thread.run(Thread.java:662)
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.SequenceFile$CompressionType
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>         ... 16 more
>
> I also copied hadoop-core-1.0.3.jar to the <flume-installation>/lib
> folder, but I am still getting the same error.
>
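> (Alternatively, a sketch of putting the copied jar on Flume's classpath via
> the FLUME_CLASSPATH hook that the flume-env.sh template leaves commented
> out; the jar path assumes the copy into the Flume lib folder above:)
>
> # in conf/flume-env.sh
> FLUME_CLASSPATH="/home/hduser/F1/apache-flume-1.2.0/lib/hadoop-core-1.0.3.jar"
>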
> Also, I wanted to know whether Flume requires a 64-bit Linux machine for
> installation. My Ubuntu machine is 32-bit; does that make any difference?
>
> Any suggestions would be very helpful.
>
> flume.conf:
>
>
>
> agent1.sources = src
>
> agent1.sinks = HDFS
>
> agent1.channels = ch1
>
> agent1.sources.src.type = exec
>
> agent1.sources.src.command = tail -F /home/hduser/F1/test1.txt
>
> agent1.sources.src.channels = ch1
>
> agent1.sinks.HDFS.type = hdfs
>
> agent1.sinks.HDFS.hdfs.path = hdfs://localhost:8020/user/hduser
>
> agent1.sinks.HDFS.hdfs.fileType = DataStream
>
> agent1.sinks.HDFS.channel = ch1
>
> agent1.sinks.HDFS.hdfs.writeFormat = Text
>
> agent1.sinks.HDFS.hdfs.filePrefix = FlumeTest
>
>
>
> agent1.channels.ch1.type = memory
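>
> (Since the sink points at hdfs://localhost:8020, the target path can be
> verified independently of Flume, assuming the hadoop command is available:)
>
> hadoop fs -ls hdfs://localhost:8020/user/hduser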
>
>
>
> flume-env.sh (the following variables are set in flume-env.sh):
>
> export FLUME_HOME=/home/hduser/F1/apache-flume-1.2.0
>
>
>
> export JAVA_HOME=/usr/local/java/jdk1.6.0_26/
>
> export FLUME_CONF_DIR=/home/hduser/F1/apache-flume-1.2.0/conf/
>
> export PATH=$JAVA_HOME/bin:$FLUME_HOME/bin:$PATH
>
>
>
> Thanks,
>
> Swati
>
>
>
> From: JP [mailto:jpnaidumca@gmail.com]
> Sent: Monday, October 15, 2012 5:21 PM
> To: Swati Ramteke
> Subject: Re: FlumeNG-Error while writing data from source file to hdfs sink.
>
>
>
> Hi,
>
> I expect the environment variables to look like the following (I don't
> know your environment, but it will mostly be like this):
>
> export FLUME_HOME=/home/hduser/F1/apache-flume-1.2.0
>
> export FLUME_CONF_DIR=/home/hduser/F1/apache-flume-1.2.0/conf
>
> export JAVA_HOME=/usr/local/java/jdk1.6.0_26
>
>
> export PATH=$JAVA_HOME/bin:$FLUME_HOME/bin:$PATH
>
>
> Thanks
> JP
>
> On Mon, Oct 15, 2012 at 5:11 PM, JP <jpnaidumca@gmail.com> wrote:
>
> Hi Swati,
>
> Please check:
>
> 1. That the environment variables are correct.
>
> 2. That flume-ng-node-1.2.0.jar is available; if it is, check its
>    permissions (see the sketch after the sample configuration below).
>
> 3. That the agent declarations come first:
>
> agent1.channels = ch1
>
> agent1.sources = src
>
> agent1.sinks = HDFS
>
> 4. A sample configuration I have added:
>
>
>
> agent1.sources = src
> agent1.sinks = HDFS
>
> agent1.channels = ch1
>
>
>
> agent1.sources.src.type = exec
> agent1.sources.src.command = tail -F /home/hduser/F1/h.txt
>
> agent1.sources.src.channels = ch1
>
>
>
> agent1.sinks.HDFS.type = hdfs
> agent1.sinks.HDFS.hdfs.path = hdfs://localhost:8020/user/hduser
> agent1.sinks.HDFS.hdfs.fileType = DataStream
>
> agent1.sinks.HDFS.channel = ch1
>
> agent1.channels.ch1.type = memory
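>
> (A quick way to check items 1 and 2 above and then start the agent with
> this sample; the jar name and paths are assumptions based on the setup in
> this thread:)
>
> echo $FLUME_HOME $FLUME_CONF_DIR $JAVA_HOME
> ls -l $FLUME_HOME/lib/flume-ng-node-1.2.0.jar
> bin/flume-ng agent -c conf -f conf/flumeHdfs.conf -n agent1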
>
> Hope this will help!
> Thanks
> JP
>
>
>
> On Mon, Oct 15, 2012 at 3:45 PM, Swati Ramteke
> <Swati_Ramteke@persistent.co.in> wrote:
>
> Hi,
>
>
>
> I am getting the following error while running the Flume agent:
>
> (I wanted to write data from a source file to an HDFS sink.)
>
> hduser@vm-ps7274:~/F1/apache-flume-1.2.0$ bin/flume-ng agent -c conf  -f
> conf/flumeHdfs.conf -Dflume.root.logger=DEBUGG,console  -n agent1
>
> Info: Sourcing environment configuration script
> /home/hduser/F1/apache-flume-1.2.0/conf/flume-env.sh
>
> + exec /usr/local/java/jdk1.6.0_26/jre/bin/java -Xmx20m
> -Dflume.root.logger=DEBUGG,console -cp
> '/home/hduser/F1/apache-flume-1.2.0/conf:/home/hduser/F1/apache-flume-1.2.0/bin/lib/*'
> -Djava.library.path= org.apache.flume.node.Application -f
> conf/flumeHdfs.conf -n agent1
>
>
>
> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/flume/node/Application
> Caused by: java.lang.ClassNotFoundException: org.apache.flume.node.Application
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> Could not find the main class: org.apache.flume.node.Application.  Program will exit.
>
>
>
> 1. Flume version: apache-flume-1.2.0
>
> 2. Java version: 1.6.0_26
>
> 3. Hadoop version: Apache Hadoop 1.0.3
>
>
>
> Are there any missing files that need to be downloaded explicitly?
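>
> (If the tarball unpacked correctly, a listing such as the following should
> show the Flume jars, including flume-ng-node-1.2.0.jar:)
>
> ls /home/hduser/F1/apache-flume-1.2.0/lib/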
>
>
>
> Please find the flume-env.sh and flumeHdfs.conf files below:
>
> 1. flume-env.sh:
>
>
>
> # Environment variables can be set here.
>
> # Note that the Flume conf directory is always included in the classpath.
>
> #FLUME_CLASSPATH=""
>
> export FLUME_HOME=/home/hduser/F1/apache-flume-1.2.0/bin
>
>
>
> export JAVA_HOME=/usr/local/java/jdk1.6.0_26/jre/bin/java
>
> export FLUME_CONF_DIR=/home/hduser/F1/apache-flume-1.2.0/conf/
>
> export PATH=$JAVA_HOME:$FLUME_HOME/bin:$PATH
>
>
>
> 2. flumeHdfs.conf:
>
> # Define a memory channel called ch1 on agent1
>
> agent1.channels.ch1.type = memory
>
> # Define an EXEC source called src on agent1 and connect it to channel ch1.
>
> agent1.sources.src.channels = ch1
>
> agent1.sources.src.type = exec
>
> agent1.sources.src.command = tail -F /home/hduser/F1/h.txt
>
> # Define a HDFS sink and connect it to the other end of the same channel.
>
> agent1.sinks.HDFS.channel = ch1
>
> agent1.sinks.HDFS.type = hdfs
>
> agent1.sinks.HDFS.hdfs.path = hdfs://localhost:8020/user/hduser
>
> agent1.sinks.HDFS.hdfs.fileType = DataStream
>
> agent1.sinks.HDFS.hdfs.writeFormat = Text
>
> agent1.sinks.HDFS.hdfs.filePrefix = FlumeTest
>
> # Finally, now that we've defined all of our components, tell
>
> # agent1 which ones we want to activate.
>
> agent1.channels = ch1
>
> agent1.sources = src
>
> agent1.sinks = HDFS
>
>
>
> Thanks and Regards,
>
> Swati
>
>
>
>
>
>
> --
> JP
>
>
>
>
> --
> JP
>



-- 
Apache MRUnit - Unit testing MapReduce - http://incubator.apache.org/mrunit/
