flume-user mailing list archives

From Mike Percy <mpe...@cloudera.com>
Subject Re: flume ng error while going for hdfs sink
Date Thu, 05 Jul 2012 18:51:50 GMT
On Thu, Jul 5, 2012 at 12:28 AM, Amit Handa <amithanda01@gmail.com> wrote:

> HI All,
>
> While trying to run Flume ng using HDFS SInk, and using avro Client.. i am
> getting IOException. Kindly help in resolving this issue
>
> Exception log is as follows:
> 2012-07-05 12:01:32,789 (conf-file-poller-0) [INFO -
> org.apache.flume.sink.DefaultSinkFactory.create(DefaultSinkFactory.java:70)]
> Creating instance of sink HDFS typehdfs
> 2012-07-05 12:01:32,816 (conf-file-poller-0) [DEBUG -
> org.apache.hadoop.conf.Configuration.<init>(Configuration.java:227)]
> java.io.IOException: config()
>     at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:227)
>     at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:214)
>     at
> org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:187)
>     at
> org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:239)
> ...
>

Nothing is wrong here; you are running at DEBUG level, so Hadoop is giving
you debug-level output. That "java.io.IOException: config()" is just a stack
trace Hadoop logs at DEBUG when a Configuration object is created, not a real
failure. If you want to keep Flume at DEBUG but suppress DEBUG-level messages
from Hadoop, add something like:

log4j.logger.org.apache.hadoop = INFO

to your log4j.properties file.
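In context, that override sits alongside the rest of your log4j configuration.
A minimal log4j.properties sketch (the "console" appender name and pattern here
are illustrative, not necessarily what ships with your Flume build):

```properties
# Root logger: keep Flume itself at DEBUG, writing to the console.
log4j.rootLogger = DEBUG, console

log4j.appender.console = org.apache.log4j.ConsoleAppender
log4j.appender.console.layout = org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern = %d (%t) [%p - %c] %m%n

# Raise the Hadoop threshold to INFO so Configuration.<init> debug
# traces like the one above no longer appear.
log4j.logger.org.apache.hadoop = INFO
```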

Are you experiencing any problems with your setup?

Regards,
Mike
