flume-user mailing list archives

From Juhani Connolly <juhani_conno...@cyberagent.co.jp>
Subject Re: using tail to HDFS sink
Date Thu, 12 Jul 2012 10:31:19 GMT
Most people in the US are asleep at this time so I wouldn't expect a 
fast response. People will respond when they have time and are able to.

As to your problem, as I said earlier, unless you've changed your 
config since then, you don't have a host header defined. The logs don't 
seem to show anything wrong, so I suspect it may be sending data to 
HDFS without replacing %{host} with your actual host. Remove the 
%{host} and replace it with a hardcoded string; if that works, you 
know that is your problem. Then try using the HostInterceptor to add a 
host header: 
http://people.apache.org/~juhanic/flume-docs/FlumeUserGuide.html#host-interceptor
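
Something along these lines might work (just a sketch based on the 
flume.conf you posted below; the interceptor option names are taken from 
that guide, so double-check them against your build):

# first, a hardcoded path to confirm the missing header is the problem
agent2.sinks.HDFS.hdfs.path = hdfs://10.5.114.110:54310/user/flume/hardcoded-test

# or keep %{host} and have the source add the header itself
agent2.sources.tail.interceptors = hostint
agent2.sources.tail.interceptors.hostint.type = host
agent2.sources.tail.interceptors.hostint.hostHeader = host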

There is also a pair of quotation marks in your path string... is that 
intended?
agent2.sinks.HDFS.hdfs.path = hdfs://10.5.114.110:54310/user/flume/'%{host}'
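
If the quotes are not intended, the path line would presumably just be:

agent2.sinks.HDFS.hdfs.path = hdfs://10.5.114.110:54310/user/flume/%{host}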

I can't remember what happens when you try to use a header that doesn't 
exist... it probably gets replaced by an empty string, since it isn't 
throwing exceptions.
Try checking HDFS under /user/flume and see if there is a poorly named 
directory there with your data.
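
For example (assuming the hadoop client is on your path; the namenode 
address is the one from your config):

hadoop fs -ls hdfs://10.5.114.110:54310/user/flume/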

On 07/12/2012 06:41 PM, prabhu k wrote:
> Can anyone please respond to the issue below?
>
> On Thu, Jul 12, 2012 at 12:57 PM, prabhu k <prabhu.flume@gmail.com> wrote:
>
>     As per Mohammad's suggestion, I executed it as below.
>
>     root@md-trngpoc1:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT# bin/flume-ng agent -n agent2 -c /conf -f conf/agent2.conf
>
>
>     Info: Including Hadoop libraries found via
>     (/usr/local/hadoop_dir/hadoop/bin/hadoop) for HDFS access
>     Info: Excluding
>     /usr/local/hadoop_dir/hadoop/libexec/../lib/slf4j-api-1.4.3.jar
>     from classpath
>     Info: Excluding
>     /usr/local/hadoop_dir/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar from
>     classpath
>     + exec /usr/lib/jvm/java-6-sun/bin/java -Xmx20m -cp
>     '/conf:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib/*:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf:/usr/local/hadoop_dir/hadoop/libexec/../conf:/usr/lib/jvm/java-6-sun/lib/tools.jar:/usr/local/hadoop_dir/hadoop/libexec/..:/usr/local/hadoop_dir/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar'
>     -Djava.library.path=/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib:/usr/local/hadoop_dir/hadoop/libexec/../lib/native/Linux-i386-32
>     org.apache.flume.node.Application -n agent2 -f conf/agent2.conf
>     12/07/12 12:48:45 ERROR node.Application: The specified
>     configuration file does not exist:
>     /usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf/agent2.conf
>     root@md-trngpoc1:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT#
>
>
>
>     I am getting the error: ERROR node.Application: The specified
>     configuration file does not exist:
>     /usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf/agent2.conf
>
>     My configuration file is /conf/flume.conf, which is why I am
>     getting this error.
>
>     I have also tried the following commands:
>
>     bin/flume-ng agent -n agent2 -c /conf -f conf/flume.conf
>     ----- the flume.log file was not generated, and the sample file
>     could not be written to the HDFS sink.
>
>     root@md-trngpoc1:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT# bin/flume-ng agent -n agent2 -c /conf -f conf/flume.conf
>
>
>     Info: Including Hadoop libraries found via
>     (/usr/local/hadoop_dir/hadoop/bin/hadoop) for HDFS access
>     Info: Excluding
>     /usr/local/hadoop_dir/hadoop/libexec/../lib/slf4j-api-1.4.3.jar
>     from classpath
>     Info: Excluding
>     /usr/local/hadoop_dir/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar from
>     classpath
>     + exec /usr/lib/jvm/java-6-sun/bin/java -Xmx20m -cp
>     '/conf:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib/*:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf:/usr/local/hadoop_dir/hadoop/libexec/../conf:/usr/lib/jvm/java-6-sun/lib/tools.jar:/usr/local/hadoop_dir/hadoop/libexec/..:/usr/local/hadoop_dir/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar'
>     -Djava.library.path=/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib:/usr/local/hadoop_dir/hadoop/libexec/../lib/native/Linux-i386-32
>     org.apache.flume.node.Application -n agent2 -f conf/flume.conf
>     12/07/12 12:43:25 INFO lifecycle.LifecycleSupervisor: Starting
>     lifecycle supervisor 1
>     12/07/12 12:43:25 INFO node.FlumeNode: Flume node starting - agent2
>     12/07/12 12:43:25 INFO nodemanager.DefaultLogicalNodeManager: Node
>     manager starting
>     12/07/12 12:43:25 INFO
>     properties.PropertiesFileConfigurationProvider: Configuration
>     provider starting
>     12/07/12 12:43:25 INFO lifecycle.LifecycleSupervisor: Starting
>     lifecycle supervisor 10
>     12/07/12 12:43:25 INFO
>     properties.PropertiesFileConfigurationProvider: Reloading
>     configuration file:conf/flume.conf
>     12/07/12 12:43:25 INFO conf.FlumeConfiguration: Processing:HDFS
>     12/07/12 12:43:25 INFO conf.FlumeConfiguration: Processing:HDFS
>     12/07/12 12:43:25 INFO conf.FlumeConfiguration: Processing:HDFS
>     12/07/12 12:43:25 INFO conf.FlumeConfiguration: Added sinks: HDFS
>     Agent: agent2
>     12/07/12 12:43:25 INFO conf.FlumeConfiguration: Processing:HDFS
>     12/07/12 12:43:25 INFO conf.FlumeConfiguration: Post-validation
>     flume configuration contains configuration for agents: [agent2]
>     12/07/12 12:43:25 INFO
>     properties.PropertiesFileConfigurationProvider: Creating channels
>     12/07/12 12:43:25 INFO
>     properties.PropertiesFileConfigurationProvider: created channel
>     memoryChannel
>     12/07/12 12:43:25 INFO sink.DefaultSinkFactory: Creating instance
>     of sink HDFS typehdfs
>
>     bin/flume-ng agent --conf conf/ -f conf/flume.conf -n agent2 --
>     the flume.log file was created, but the sample file still could not
>     be written to the HDFS sink.
>
>     root@md-trngpoc1:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT# bin/flume-ng agent --conf conf/ -f conf/flume.conf -n agent2
>
>
>     Info: Sourcing environment configuration script
>     /usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf/flume-env.sh
>     Info: Including Hadoop libraries found via
>     (/usr/local/hadoop_dir/hadoop/bin/hadoop) for HDFS access
>     Info: Excluding
>     /usr/local/hadoop_dir/hadoop/libexec/../lib/slf4j-api-1.4.3.jar
>     from classpath
>     Info: Excluding
>     /usr/local/hadoop_dir/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar from
>     classpath
>     + exec /usr/lib/jvm/java-6-sun-1.6.0.26/jre/bin/java -Xms100m
>     -Xmx200m -cp
>     '/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib/*:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf:/usr/local/hadoop_dir/hadoop/libexec/../conf:/usr/lib/jvm/java-6-sun/lib/tools.jar:/usr/local/hadoop_dir/hadoop/libexec/..:/usr/local/hadoop_dir/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar'
>     -Djava.library.path=/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib:/usr/local/hadoop_dir/hadoop/libexec/../lib/native/Linux-i386-32
>     org.apache.flume.node.Application -f conf/flume.conf -n agent2
>     flume.log
>     =============
>     root@md-trngpoc1:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT# more flume.log
>     2012-07-12 12:46:24,415 INFO lifecycle.LifecycleSupervisor:
>     Starting lifecycle supervisor 1
>     2012-07-12 12:46:24,416 INFO node.FlumeNode: Flume node starting -
>     agent2
>     2012-07-12 12:46:24,418 INFO
>     nodemanager.DefaultLogicalNodeManager: Node manager starting
>     2012-07-12 12:46:24,418 INFO lifecycle.LifecycleSupervisor:
>     Starting lifecycle supervisor 10
>     2012-07-12 12:46:24,418 INFO
>     properties.PropertiesFileConfigurationProvider: Configuration
>     provider starting
>     2012-07-12 12:46:24,420 INFO
>     properties.PropertiesFileConfigurationProvider: Reloading
>     configuration file:conf/flume.conf
>     2012-07-12 12:46:24,426 INFO conf.FlumeConfiguration: Processing:HDFS
>     2012-07-12 12:46:24,427 INFO conf.FlumeConfiguration: Processing:HDFS
>     2012-07-12 12:46:24,428 INFO conf.FlumeConfiguration: Processing:HDFS
>     2012-07-12 12:46:24,428 INFO conf.FlumeConfiguration: Added sinks:
>     HDFS Agent: agent2
>     2012-07-12 12:46:24,428 INFO conf.FlumeConfiguration: Processing:HDFS
>     2012-07-12 12:46:24,443 INFO conf.FlumeConfiguration:
>     Post-validation flume configuration contains configuration for
>     agents: [agent2]
>     2012-07-12 12:46:24,443 INFO
>     properties.PropertiesFileConfigurationProvider: Creating channels
>     2012-07-12 12:46:24,447 INFO
>     properties.PropertiesFileConfigurationProvider: created channel
>     memoryChannel
>     2012-07-12 12:46:24,455 INFO sink.DefaultSinkFactory: Creating
>     instance of sink HDFS typehdfs
>
>
>     Please advise me on this issue.
>     Thanks,
>     Prabhu.
>     On Thu, Jul 12, 2012 at 10:22 AM, Juhani Connolly
>     <juhani_connolly@cyberagent.co.jp> wrote:
>
>         Hello Prabhu,
>
>         You should have a look at your logs. Unless you've changed the
>         settings, they'll be in flume.log in whatever directory you
>         executed from.
>
>         It looks to me like you are trying to use a header (%{host}) in
>         the path, but you are not providing that header. If you want
>         to use it, you should try using the host interceptor (which is
>         better documented in the most recent revisions). There may be
>         other issues too, but it's hard to tell without seeing the
>         logs. Check them out and see if there are any exceptions popping up.
>
>
>         On 07/12/2012 03:19 AM, prabhu k wrote:
>>         Hi,
>>
>>         While I am trying to load a sample text file to the HDFS sink,
>>         after executing the command bin/flume-ng node --conf conf/ -f
>>         conf/flume.conf -n agent2, I am unable to view the file in HDFS.
>>
>>         I executed it as below.
>>
>>         root@md-trngpoc1:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT# bin/flume-ng node --conf conf/ -f conf/flume.conf -n agent2
>>         Warning: The "node" command is deprecated. Please use "agent"
>>         instead.
>>
>>         Info: Sourcing environment configuration script
>>         /usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf/flume-env.sh
>>         Info: Including Hadoop libraries found via
>>         (/usr/local/hadoop_dir/hadoop/bin/hadoop) for HDFS access
>>         Info: Excluding
>>         /usr/local/hadoop_dir/hadoop/libexec/../lib/slf4j-api-1.4.3.jar
>>         from classpath
>>         Info: Excluding
>>         /usr/local/hadoop_dir/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar
>>         from classpath
>>         + exec /usr/lib/jvm/java-6-sun-1.6.0.26/jre/bin/java -Xms100m
>>         -Xmx200m -cp
>>         '/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib/*:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf:/usr/local/hadoop_dir/hadoop/libexec/../conf:/usr/lib/jvm/java-6-sun/lib/tools.jar:/usr/local/hadoop_dir/hadoop/libexec/..:/usr/local/hadoop_dir/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar'
>>         -Djava.library.path=/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib:/usr/local/hadoop_dir/hadoop/libexec/../lib/native/Linux-i386-32
>>         org.apache.flume.node.Application -f conf/flume.conf -n agent2
>>
>>
>>         flume.conf
>>         ================
>>         agent2.sources = tail
>>         agent2.channels = memoryChannel
>>         agent2.sinks = HDFS
>>         agent2.sources.tail.type = exec
>>         agent2.sources.tail.command = tail -F
>>         /usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/flume_test.txt
>>         agent2.sources.tail.channels = memoryChannel
>>         agent2.sinks.HDFS.channel = memoryChannel
>>         agent2.sinks.HDFS.type = hdfs
>>         agent2.sinks.HDFS.hdfs.path =
>>         hdfs://10.5.114.110:54310/user/flume/'%{host}'
>>         agent2.sinks.HDFS.hdfs.file.Type = DataStream
>>         agent2.channels.memoryChannel.type = memory
>>
>>
>>         Please let me know if I have missed anything.
>>         Thanks,
>>         Prabhu
>
>
>
>


