flume-user mailing list archives

From "Hari Shreedharan" <hshreedha...@cloudera.com>
Subject RE: Flume log4j appender not converting data to avro
Date Mon, 13 Apr 2015 22:37:09 GMT
What serializer did you use? You’d need a serializer that writes Avro back out, too.
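For instance, the file_roll sink in the flume.conf quoted below uses the default TEXT serializer, which writes only the event body as plain text. A minimal sketch of the change, assuming Flume's built-in avro_event serializer is acceptable (it wraps each event in an Avro container file using Flume's generic event schema, not your user.avsc):

# file_roll's sink.serializer defaults to TEXT; avro_event writes
# an Avro container file of Flume events instead.
agent1.sinks.file-roll-sink.sink.serializer = avro_event

Once that is in place, the rolled files should be readable with avro-tools (for example, java -jar avro-tools-1.7.7.jar tojson <rolled file>) rather than with cat.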
Thanks, Hari

On Mon, Apr 13, 2015 at 3:34 PM, Kaushik shreekanth <kkaushr@outlook.com>
wrote:

> The example in this mail is log4j 2. But I have tried it with log4j 1.x with AvroReflectionEnabled=true
> and still didn't get Avro output on the remote end.
> Date: Mon, 13 Apr 2015 15:28:44 -0700
> From: hshreedharan@cloudera.com
> To: user@flume.apache.org
> CC: user@flume.apache.org
> Subject: Re: Flume log4j appender not converting data to avro
> Is this log4j 2? The appender we bundle with Flume is for log4j 1.x. To enable Avro with
> the log4j 1.x appender, you need to set AvroReflectionEnabled=true. I don’t know if the
> log4j 2.x appender supports Avro - that is a part of the log4j 2 project.
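> As a sketch, the log4j 1.x side would then look roughly like this (the hostname and port are placeholders for your agent; AvroSchemaUrl is optional and attaches the schema as a URL header instead of inlining it in every event):
> log4j.appender.flume = org.apache.flume.clients.log4jappender.Log4jAppender
> log4j.appender.flume.Hostname = localhost
> log4j.appender.flume.Port = 41414
> log4j.appender.flume.AvroReflectionEnabled = true
> log4j.appender.flume.AvroSchemaUrl = hdfs://namenode/path/to/user.avsc
> log4j.rootLogger = INFO, flume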
> Thanks, Hari
> On Wed, Apr 8, 2015 at 3:17 PM, Kaushik Shreekanth <kkaushr@outlook.com> wrote:
> Hi,
> I'm using the Flume log4j appender to send generic records from my application to a Flume
> avro source. However, at the sink the data is seen as plain text. I'm not sure why it was
> not converted to Avro format.
> This is my log4j2 flume appender config:
> <?xml version="1.0" encoding="UTF-8"?>
> <Configuration status="WARN">
>   <Appenders>
>     <Console name="ConsoleAppender" target="SYSTEM_OUT">
>       <PatternLayout pattern="%m%n" />
>     </Console>
>     <Flume name="FlumeAppender" compress="false" type="Avro">
>       <Agent host="localhost" port="41414"/>
>     </Flume>
>   </Appenders>
>   <Loggers>
>     <Root level="debug">
>       <AppenderRef ref="ConsoleAppender" />
>       <AppenderRef ref="FlumeAppender" />
>     </Root>
>   </Loggers>
> </Configuration>
> Dependencies in my application:
> <dependency>
>   <groupId>org.apache.logging.log4j</groupId>
>   <artifactId>log4j-api</artifactId>
>   <version>2.2</version>
> </dependency>
> <dependency>
>   <groupId>org.apache.logging.log4j</groupId>
>   <artifactId>log4j-core</artifactId>
>   <version>2.2</version>
> </dependency>
> <dependency>
>   <groupId>org.apache.logging.log4j</groupId>
>   <artifactId>log4j-flume-ng</artifactId>
>   <version>2.2</version>
> </dependency>
> <dependency>
>   <groupId>org.apache.avro</groupId>
>   <artifactId>avro</artifactId>
>   <version>1.7.7</version>
> </dependency>
> <dependency>
>   <groupId>com.fasterxml.jackson.core</groupId>
>   <artifactId>jackson-databind</artifactId>
>   <version>2.5.1</version>
> </dependency>
> <dependency>
>   <groupId>com.fasterxml.jackson.core</groupId>
>   <artifactId>jackson-core</artifactId>
>   <version>2.5.1</version>
> </dependency>
> <dependency>
>   <groupId>org.slf4j</groupId>
>   <artifactId>slf4j-simple</artifactId>
>   <version>1.7.12</version>
> </dependency>
> This is my flume.conf:
> # Define a memory channel called ch1 on agent1
> agent1.channels.ch1.type = memory
>  
> # Define an Avro source called avro-source1 on agent1 and tell it
> # to bind to 0.0.0.0:41414. Connect it to channel ch1.
> agent1.sources.avro-source1.type = avro
> agent1.sources.avro-source1.bind = 0.0.0.0
> agent1.sources.avro-source1.port = 41414
> # Define a file_roll sink that writes all events it receives to disk
> # and connect it to the other end of the same channel.
> agent1.sinks.file-roll-sink.type = file_roll
> agent1.sinks.file-roll-sink.sink.directory = /Users/ds/flume_events
> agent1.sinks.file-roll-sink.sink.rollInterval = 0
> # Finally, now that we've defined all of our components, tell
> # agent1 which ones we want to activate.
> agent1.channels = ch1
> agent1.sources = avro-source1
> agent1.sinks = file-roll-sink
> #chain the different components together
> agent1.sinks.file-roll-sink.channel = ch1
> agent1.sources.avro-source1.channels = ch1
> My Java code that logs events:
> import java.io.File;
> import java.io.IOException;
> import java.net.URL;
> import org.apache.avro.Schema;
> import org.apache.avro.generic.GenericData;
> import org.apache.avro.generic.GenericRecord;
> import org.apache.logging.log4j.LogManager;
> import org.apache.logging.log4j.Logger;
> public class FlumeTest {
>     public static void main(String[] args) throws IOException {
>         Logger logger = LogManager.getLogger();
>         URL url = FlumeTest.class.getClassLoader().getResource("user.avsc");
>         Schema schema = new Schema.Parser().parse(new File(url.getFile()));
>         // log eleven GenericRecords; the appender ships each one as a Flume event
>         for (int i = 0; i <= 10; i++) {
>             GenericRecord user1 = new GenericData.Record(schema);
>             user1.put("name", "abc" + String.valueOf(i));
>             user1.put("id", i);
>             logger.info(user1);
>         }
>     }
> }
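> For reference, a user.avsc consistent with these records and the sink output below might look like this sketch (the actual schema file wasn't posted; the namespace and record name are guesses):
> {
>   "namespace": "example.avro",
>   "type": "record",
>   "name": "User",
>   "fields": [
>     {"name": "name", "type": "string"},
>     {"name": "id", "type": "int"},
>     {"name": "favorite_color", "type": ["string", "null"]}
>   ]
> }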
> My sink output is seen as:
> {"name": "abc0", "id": 0, "favorite_color": null}
> {"name": "abc1", "id": 1, "favorite_color": null}
> {"name": "abc2", "id": 2, "favorite_color": null}
> {"name": "abc3", "id": 3, "favorite_color": null}
> {"name": "abc4", "id": 4, "favorite_color": null}
> {"name": "abc5", "id": 5, "favorite_color": null}
> "name": "abc6", "id": 6, "favorite_color": null}
> {"name": "abc7", "id": 7, "favorite_color": null}
> {"name": "abc8", "id": 8, "favorite_color": null}
> {"name": "abc9", "id": 9, "favorite_color": null}
> {"name": "abc10", "id": 10, "favorite_color": null}
> As mentioned, the final output is plain text and not Avro. Can someone please let
> me know what I'm doing wrong?
> Thanks.