flume-user mailing list archives

From ed <edor...@gmail.com>
Subject Re: Flume - Custom HDFS Sink does not write to hdfs unless killed
Date Mon, 03 Mar 2014 15:08:41 GMT
Not sure if this is the issue, but I've found it's easy for custom Flume
components like interceptors, sinks, sources, and serializers to swallow
exceptions.  You won't see any errors in standard out and the Flume agent
will look like it's working, but it isn't doing anything.  When you kill
the agent it will then close your files in HDFS, and any data that was in
the channel before the error will get written out.  Do the avro files have
all the data you expect them to have after you kill the Flume agent?
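A minimal sketch of what I mean, in plain Java (the stub `EventDeliveryException` stands in for `org.apache.flume.EventDeliveryException`; the method names and failure are invented for illustration, not taken from any real sink):

```java
// Sketch of the exception-swallowing anti-pattern in a custom sink's
// process() method, and the usual fix: surface the failure so the agent
// logs it and can retry, instead of silently dropping the batch.
public class SinkSketch {
    // Stand-in for org.apache.flume.EventDeliveryException.
    static class EventDeliveryException extends Exception {
        EventDeliveryException(String msg, Throwable cause) { super(msg, cause); }
    }

    // BAD: the parse/write error vanishes; the agent looks healthy.
    static String processSwallowing() {
        try {
            throw new RuntimeException("avro parse failed");
        } catch (Exception e) {
            // swallowed -- nothing reaches the logs, data never lands in HDFS
            return "READY";
        }
    }

    // BETTER: wrap and rethrow so the failure shows up in the agent's logs.
    static String processPropagating() throws EventDeliveryException {
        try {
            throw new RuntimeException("avro parse failed");
        } catch (Exception e) {
            throw new EventDeliveryException("sink failed, will retry", e);
        }
    }

    public static void main(String[] args) {
        System.out.println(processSwallowing()); // prints READY despite the failure
        try {
            processPropagating();
        } catch (EventDeliveryException e) {
            System.out.println("surfaced: " + e.getMessage());
        }
    }
}
```

If your sink's process() looks like the first method, you'd never know it was failing until the agent shuts down and closes its files.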

Best Regards,


On Mon, Mar 3, 2014 at 4:11 AM, Himanshu Patidar <
himanshu.patidar@hotmail.com> wrote:

> I have a custom HDFS sink which takes events, parses them (converts binary
> data to .avro files) and then writes these files to different directories
> in HDFS. Trying to do so I get a strange error - only the last avro file
> gets written to HDFS and the rest of the files show a size of 0 KB until
> I kill my Flume agent (Ctrl + C). As soon as I kill the script, I can see
> the data in the rest of the .avro files. I am using Flume 1.4 CDH 4.0.0
> with HDFS 2.0.0.
> Can anyone please suggest some solution?
> Thanks,
> Himanshu
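For comparison, the stock Flume HDFS sink only closes a file (at which point its real size becomes visible in HDFS) when one of its roll triggers fires; a custom sink needs equivalent flush/close logic. A sketch of those settings from the Flume 1.4 HDFS sink (the agent and sink names here are placeholders):

```
# Close the current file every 60 s, or at ~128 MB, or after 10000 events.
# Setting a trigger to 0 disables it.
agent.sinks.hdfsSink.hdfs.rollInterval = 60
agent.sinks.hdfsSink.hdfs.rollSize = 134217728
agent.sinks.hdfsSink.hdfs.rollCount = 10000
agent.sinks.hdfsSink.hdfs.batchSize = 100
```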
