flume-user mailing list archives

Site index · List index
Message view « Date » · « Thread »
Top « Date » · « Thread »
From JP <jpnaidu...@gmail.com>
Subject Re: regarding flume
Date Mon, 30 Jul 2012 16:34:28 GMT
Thanks Hari,

I've made a little progress, but I'm getting garbage values.

This is my configuration:

*flume-conf.properties*
---------------------------------------
agent2.sources = seqGenSrc
agent2.channels = memoryChannel
agent2.sinks = loggerSink

agent2.sources.seqGenSrc.type = avro
agent2.sources.seqGenSrc.bind = localhost
agent2.sources.seqGenSrc.port = 41414

agent2.channels.memoryChannel.type = memory
agent2.channels.memoryChannel.capacity = 1000000
agent2.channels.memoryChannel.transactionCapacity = 1000000
agent2.channels.memoryChannel.keep-alive = 30

agent2.sources.seqGenSrc.channels = memoryChannel

agent2.sinks.loggerSink.type = hdfs
agent2.sinks.loggerSink.hdfs.path = hdfs://ip:portno/data/CspcLogs
agent2.sinks.loggerSink.hdfs.fileType = DataStream
agent2.sinks.loggerSink.channel = memoryChannel
agent2.sinks.loggerSink.serializer = avro_event
agent2.sinks.loggerSink.serializer.compressionCodec = snappy
agent2.sinks.loggerSink.serializer.syncIntervalBytes = 2048000
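
(Side note: with serializer = avro_event the HDFS sink writes Avro container files, so the raw bytes are expected to look unreadable. For comparison, a rough sketch of the same sink with Flume's default plain-text serializer, which drops the headers and writes each event body on its own line; the property names should be double-checked against the 1.2.0 HDFS sink docs:)

agent2.sinks.loggerSink.type = hdfs
agent2.sinks.loggerSink.hdfs.path = hdfs://ip:portno/data/CspcLogs
agent2.sinks.loggerSink.hdfs.fileType = DataStream
agent2.sinks.loggerSink.channel = memoryChannel
# TEXT is the default serializer; it ignores the headers and writes one body per line
agent2.sinks.loggerSink.serializer = TEXT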


log4j.properties
------------------------------------------------------------------------------
log4j.rootLogger=INFO, CA, flume

log4j.appender.CA=org.apache.log4j.ConsoleAppender

log4j.appender.CA.layout=org.apache.log4j.PatternLayout
log4j.appender.CA.layout.ConversionPattern=%-4r [%t] %-5p %c %x - %m%n

log4j.appender.flume = org.apache.flume.clients.log4jappender.Log4jAppender
log4j.appender.flume.Hostname = localhost
log4j.appender.flume.Port = 41414
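
(For context, the events in the output below appear to come from a small test class; its source wasn't posted, so this is only a rough sketch of what it presumably looks like, inferred from the logger name and level headers in the output:)

package com.cisco.flume;

import org.apache.log4j.Logger;

public class FlumeTest {
    private static final Logger log = Logger.getLogger(FlumeTest.class);

    public static void main(String[] args) {
        // Each call becomes one Flume event: the message text goes into the event
        // body, while the level, timestamp and logger name travel as event headers.
        log.info("Sample info message");    // flume.client.log4j.log.level = 20000
        log.error("Sample error message");  // flume.client.log4j.log.level = 40000
        log.fatal("Sample fatal message");  // flume.client.log4j.log.level = 50000
    }
}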


and my output:
------------------------
Objavro.codecnullavro.schema�{"type":"record","name":"Event","fields":[{"name":"headers","type":{"type":"map","values":"string"}},{"name":"body","type":"bytes"}]}�|��(r5��q
��nl�8flume.client.log4j.log.level
40000Fflume.client.log4j.message.encodingUTF88flume.client.log4j.timestamp1343665387977<flume.client.log4j.logger.name2com.cisco.flume.FlumeTest�(Sample
error message|��(r5��q ��nl�8flume.client.log4j.log.level
50000Fflume.client.log4j.message.encodingUTF88flume.client.log4j.timestamp1343665387993<flume.client.log4j.logger.name2com.cisco.flume.FlumeTest�(Sample
fatal message|��(r5��q ��nl�8flume.client.log4j.log.level
20000Fflume.client.log4j.message.encodingUTF88flume.client.log4j.timestamp
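
(The Objavro.codec/avro.schema prefix and the embedded Event schema suggest this is a regular Avro container file rather than corruption. A rough, untested sketch for reading it back with the Avro API, assuming the file has first been copied out of HDFS to a local path; ReadFlumeAvro is just a made-up helper name:)

import java.io.File;
import java.nio.ByteBuffer;

import org.apache.avro.file.DataFileReader;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;

public class ReadFlumeAvro {
    public static void main(String[] args) throws Exception {
        // The schema is embedded in the container file, so a generic reader is enough.
        DataFileReader<GenericRecord> reader = new DataFileReader<GenericRecord>(
                new File(args[0]), new GenericDatumReader<GenericRecord>());
        for (GenericRecord event : reader) {
            // "headers" carries the log4j level, timestamp and logger name;
            // "body" carries the raw log message bytes.
            ByteBuffer body = (ByteBuffer) event.get("body");
            byte[] bytes = new byte[body.remaining()];
            body.get(bytes);
            System.out.println(event.get("headers") + " -> " + new String(bytes, "UTF-8"));
        }
        reader.close();
    }
}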


Please let me know if I'm on the wrong path.

Please suggest how I can get a custom logging pattern (for example, like the layout patterns in log4j).
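
(For reference, the serializer route Hari describes in the quoted mail below would mean plugging a custom EventSerializer into the HDFS sink. A rough, untested sketch against the Flume 1.2.0 EventSerializer interface; the class and package names are made up:)

package com.cisco.flume;

import java.io.IOException;
import java.io.OutputStream;
import java.util.Map;

import org.apache.flume.Context;
import org.apache.flume.Event;
import org.apache.flume.serialization.EventSerializer;

// Writes the log4j level, timestamp and logger name headers in front of the
// body, one event per line, roughly like a log4j layout would.
public class PatternEventSerializer implements EventSerializer {

    private final OutputStream out;

    private PatternEventSerializer(OutputStream out) {
        this.out = out;
    }

    @Override public void afterCreate() throws IOException { }
    @Override public void afterReopen() throws IOException { }

    @Override
    public void write(Event event) throws IOException {
        Map<String, String> headers = event.getHeaders();
        StringBuilder line = new StringBuilder();
        line.append(headers.get("flume.client.log4j.timestamp")).append(' ')
            .append(headers.get("flume.client.log4j.log.level")).append(' ')
            .append(headers.get("flume.client.log4j.logger.name")).append(" - ");
        out.write(line.toString().getBytes("UTF-8"));
        out.write(event.getBody());
        out.write('\n');
    }

    @Override public void flush() throws IOException { out.flush(); }
    @Override public void beforeClose() throws IOException { }
    @Override public boolean supportsReopen() { return true; }

    // Referenced from the sink config as:
    // agent2.sinks.loggerSink.serializer = com.cisco.flume.PatternEventSerializer$Builder
    public static class Builder implements EventSerializer.Builder {
        @Override
        public EventSerializer build(Context context, OutputStream out) {
            return new PatternEventSerializer(out);
        }
    }
}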


Thanks
JP

On Sun, Jul 29, 2012 at 10:04 AM, Hari Shreedharan <hshreedharan@cloudera.com> wrote:

>  + user@
>
> Thamatam,
>
> The Log4J appender adds the date, log level and logger name to the flume
> event headers and the text of the log event to the flume event body. The
> reason the log level and time are missing is that these are in the headers
> and the text serializer does not serialize the headers.
>
> To write to a file or HDFS, please use a Serializer together with the
> RollingFileSink or HDFSEventSink. Please take a look at the plain text
> serializer or Avro serializer to understand this better.
>
> Thanks,
> Hari
>
> --
> Hari Shreedharan
>
> On Saturday, July 28, 2012 at 5:47 PM, thamatam Jayaprakash wrote:
>
> Hi Hari,
>
>
> Actually I'm unable to send this mail to the user and dev groups, so I'm
> mailing you directly.
>
> Could you please point out where I'm going wrong?
> *Please suggest which log appender to use to get a custom logging pattern.*
>
> I'm working on Flume 1.1.0 and 1.2.0. We are not able to set a log pattern.
> We are using the log4j appender
> log4j.appender.flume=org.apache.flume.clients.log4jappender.Log4jAppender
>
> but we are getting plain text.
>
> *For example, if I log the following messages:*
>
> 17:42:55,928  INFO SimpleJdbcServlet:69 - doGet of SimpleJdbcServlet
> ended...
> 17:43:03,489  INFO HelloServlet:29 - HelloServlet of doGet started...
> 17:43:03,489  INFO HelloServlet:33 -
>  Hello from Simple Servlet
> 17:43:03,489  INFO HelloServlet:35 - HelloServlet of doGet end...
> 17:47:46,000  INFO HelloServlet:29 - HelloServlet of doGet started...
> 17:47:46,001  INFO HelloServlet:33 -
>  Hello from Simple Servlet
> 17:47:46,001  INFO HelloServlet:35 - HelloServlet of doGet end...
>
> *Using Flume with Hadoop, I'm getting only the following logs:*
>
> doGet of SimpleJdbcServlet ended...
> HelloServlet of doGet started...
>
> HelloServlet of doGet end...
> HelloServlet of doGet started...
>
> Thanks in advance.
> --
> JP
>
>
>
> --
> Jayaprakash
>
>
>


-- 
JP
