flume-user mailing list archives

From Mike Percy <mpe...@apache.org>
Subject Re: regarding flume
Date Mon, 30 Jul 2012 22:33:47 GMT
Check out the source code for the appender to see which headers you need to
write.

https://github.com/apache/flume/blob/trunk/flume-ng-clients/flume-ng-log4jappender/src/main/java/org/apache/flume/clients/log4jappender/Log4jAppender.java#L96

If you want to verify that the headers are being passed, try using a logger
sink in your Flume agent for debugging purposes.
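For instance, you could temporarily swap the HDFS sink for a logger sink
while debugging (this sketch reuses the agent and channel names from the
config quoted below):

agent2.sinks = loggerSink
agent2.sinks.loggerSink.type = logger
agent2.sinks.loggerSink.channel = memoryChannel

The logger sink prints each event, headers included, to the agent's own log
at INFO level, so you can confirm the Log4j headers arrive intact.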

If you want an example of writing an EventSerializer, I wrote up a little
bit of info here:
https://cwiki.apache.org/confluence/display/FLUME/Flume+1.x+Event+Serializers
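
To sketch the idea (an illustration only, not code from the Flume tree; the
package and class names are made up), a serializer that prepends the
Log4jAppender headers to each line could look like this:

package com.example.flume;  // hypothetical package name

import java.io.IOException;
import java.io.OutputStream;
import java.util.Map;

import org.apache.flume.Context;
import org.apache.flume.Event;
import org.apache.flume.serialization.EventSerializer;

// Writes "timestamp level logger - body" per event, pulling the fields from
// the headers that the Log4jAppender sets on each event.
public class HeaderAndBodyTextSerializer implements EventSerializer {

  private final OutputStream out;

  private HeaderAndBodyTextSerializer(OutputStream out) {
    this.out = out;
  }

  @Override
  public void write(Event event) throws IOException {
    Map<String, String> h = event.getHeaders();
    // The level arrives as log4j's numeric value (20000 = INFO, 40000 = ERROR).
    String prefix = h.get("flume.client.log4j.timestamp") + " "
        + h.get("flume.client.log4j.log.level") + " "
        + h.get("flume.client.log4j.logger.name") + " - ";
    out.write(prefix.getBytes("UTF-8"));
    out.write(event.getBody());
    out.write('\n');
  }

  @Override
  public void afterCreate() throws IOException {}

  @Override
  public void afterReopen() throws IOException {}

  @Override
  public void flush() throws IOException {
    out.flush();
  }

  @Override
  public void beforeClose() throws IOException {}

  @Override
  public boolean supportsReopen() {
    return true;
  }

  // The sink instantiates the serializer through this builder.
  public static class Builder implements EventSerializer.Builder {
    @Override
    public EventSerializer build(Context context, OutputStream out) {
      return new HeaderAndBodyTextSerializer(out);
    }
  }
}

You would then point the sink at the builder:

agent2.sinks.loggerSink.serializer = com.example.flume.HeaderAndBodyTextSerializer$Builder

Since the level is numeric, you may want to map it back to a name (your
output below shows 40000 for the error message and 50000 for the fatal one).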

Regards,
Mike


On Mon, Jul 30, 2012 at 11:22 AM, Hari Shreedharan <
hshreedharan@cloudera.com> wrote:

>  You are using the AvroEventSerializer. This formats the event into the Avro
> format specified by org.apache.flume.serialization.FlumeEventAvroEventSerializer,
> which is why it looks like garbage even though it is not. Your app should be
> written to read and understand the Avro format. If you need it to be human
> readable, you will need to write your own serializer, perhaps by extending
> the BodyTextEventSerializer.
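>
> For example, here is a rough sketch of reading such a file back with the
> plain Avro API (nothing Flume-specific; argument handling is left out):
>
> import java.io.File;
> import java.nio.ByteBuffer;
> import org.apache.avro.file.DataFileReader;
> import org.apache.avro.generic.GenericDatumReader;
> import org.apache.avro.generic.GenericRecord;
>
> public class ReadFlumeAvroFile {
>   public static void main(String[] args) throws Exception {
>     // The schema embedded in the file has two fields:
>     // "headers" (a map of strings) and "body" (bytes).
>     DataFileReader<GenericRecord> reader = new DataFileReader<GenericRecord>(
>         new File(args[0]), new GenericDatumReader<GenericRecord>());
>     for (GenericRecord event : reader) {
>       ByteBuffer body = (ByteBuffer) event.get("body");
>       byte[] bytes = new byte[body.remaining()];
>       body.get(bytes);
>       System.out.println(event.get("headers") + " => " + new String(bytes, "UTF-8"));
>     }
>     reader.close();
>   }
> }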
>
> Thanks
> Hari
>
> --
> Hari Shreedharan
>
> On Monday, July 30, 2012 at 9:34 AM, JP wrote:
>
> Thanks Hari,
>
> I made a little progress, but I'm still getting garbage values.
>
> This is my configuration:
>
> *flume-conf.properties*
> ---------------------------------------
> agent2.sources = seqGenSrc
> agent2.channels = memoryChannel
> agent2.sinks = loggerSink
>
> agent2.sources.seqGenSrc.type = avro
> agent2.sources.seqGenSrc.bind=localhost
> agent2.sources.seqGenSrc.port=41414
>
> agent2.channels.memoryChannel.type = memory
> agent2.channels.memoryChannel.capacity = 1000000
> agent2.channels.memoryChannel.transactionCapacity = 1000000
> agent2.channels.memoryChannel.keep-alive = 30
>
> agent2.sources.seqGenSrc.channels = memoryChannel
>
> agent2.sinks.loggerSink.type = hdfs
> agent2.sinks.loggerSink.hdfs.path = hdfs://ip:portno/data/CspcLogs
> agent2.sinks.loggerSink.hdfs.fileType = DataStream
> agent2.sinks.loggerSink.channel = memoryChannel
> agent2.sinks.loggerSink.serializer = avro_event
> agent2.sinks.loggerSink.serializer.compressionCodec = snappy
> agent2.sinks.loggerSink.serializer.syncIntervalBytes = 2048000
>
>
> log4j.properties
>
> ------------------------------------------------------------------------------
> log4j.rootLogger=INFO, CA, flume
>
> log4j.appender.CA=org.apache.log4j.ConsoleAppender
>
> log4j.appender.CA.layout=org.apache.log4j.PatternLayout
> log4j.appender.CA.layout.ConversionPattern=%-4r [%t] %-5p %c %x - %m%n
>
> log4j.appender.flume = org.apache.flume.clients.log4jappender.Log4jAppender
> log4j.appender.flume.Hostname = localhost
> log4j.appender.flume.Port = 41414
>
>
> and my output:
> ------------------------
> Obj avro.codec null avro.schema�
> {"type":"record","name":"Event","fields":[{"name":"headers","type":{"type":"map","values":"string"}},{"name":"body","type":"bytes"}]}�|
> ��(r5��q ��nl � 8flume.client.log4j.log.level
> 40000Fflume.client.log4j.message.encoding
> UTF88flume.client.log4j.timestamp
> 1343665387977<flume.client.log4j.logger.name2com.cisco.flume.FlumeTest�(Sample
> error message| ��(r5��q ��nl � 8flume.client.log4j.log.level
> 50000Fflume.client.log4j.message.encoding
> UTF88flume.client.log4j.timestamp
> 1343665387993<flume.client.log4j.logger.name2com.cisco.flume.FlumeTest�(Sample
> fatal message| ��(r5��q ��nl � 8flume.client.log4j.log.level
> 20000Fflume.client.log4j.message.encoding
> UTF88flume.client.log4j.timestamp
>
>
> Please let me know if I'm on the wrong path.
>
> Please also suggest how to get a custom logging pattern (for example, like
> in log4j).
>
>
> Thanks
> JP
>
> On Sun, Jul 29, 2012 at 10:04 AM, Hari Shreedharan <
> hshreedharan@cloudera.com> wrote:
>
>  + user@
>
> Thamatam,
>
> The Log4J appender adds the date, log level and logger name to the flume
> event headers and the text of the log event to the flume event body. The
> reason the log level and time are missing is that these are in the headers
> and the text serializer does not serialize the headers.
>
> To write to a file or HDFS, please use a Serializer together with the
> RollingFileSink or HDFSEventSink. Please take a look at the plain text
> serializer or Avro serializer to understand this better.
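>
> As a rough sketch, wiring a custom serializer into the HDFS sink looks
> like this (the class name here is hypothetical; it must point at your
> EventSerializer.Builder implementation):
>
> agent.sinks.hdfsSink.type = hdfs
> agent.sinks.hdfsSink.serializer = com.example.MySerializer$Builder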
>
> Thanks,
> Hari
>
> --
> Hari Shreedharan
>
> On Saturday, July 28, 2012 at 5:47 PM, thamatam Jayaprakash wrote:
>
> Hi Hari,
>
>
> Actually I'm unable to send this mail to the user and dev groups, so I'm
> mailing you directly.
>
> Could you please point out where I'm going wrong?
> *Please suggest which log appender we need to use for a custom logging
> pattern.*
>
> I'm working with Flume 1.1.0 and 1.2.0. We are not able to set the log
> pattern. We are using the log4j appender
> log4j.appender.flume=org.apache.flume.clients.log4jappender.Log4jAppender
>
> but we are getting plain text.
>
> *For example, if I log the following messages:*
>
> 17:42:55,928  INFO SimpleJdbcServlet:69 - doGet of SimpleJdbcServlet
> ended...
> 17:43:03,489  INFO HelloServlet:29 - HelloServlet of doGet started...
> 17:43:03,489  INFO HelloServlet:33 -
>  Hello from Simple Servlet
> 17:43:03,489  INFO HelloServlet:35 - HelloServlet of doGet end...
> 17:47:46,000  INFO HelloServlet:29 - HelloServlet of doGet started...
> 17:47:46,001  INFO HelloServlet:33 -
>  Hello from Simple Servlet
> 17:47:46,001  INFO HelloServlet:35 - HelloServlet of doGet end...
>
> *Using Flume with Hadoop, I'm getting only the following logs:*
>
> doGet of SimpleJdbcServlet ended...
> HelloServlet of doGet started...
>
> HelloServlet of doGet end...
> HelloServlet of doGet started...
>
>
> *Thanks in advance.*
> --
> JP
>
> --
> Jayaprakash
>
> --
> JP
