flume-user mailing list archives

From Ashish <paliwalash...@gmail.com>
Subject Re: beginner's question -- file source configuration
Date Sun, 08 Mar 2015 08:23:28 GMT
Please look at the following:
Spooling Directory Source
HDFS Sink (http://flume.apache.org/FlumeUserGuide.html#hdfs-sink)

The Spooling Directory Source needs immutable files, meaning a file must not
be written to once it is being consumed. In short, your application
cannot write to a file while Flume is reading it.
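A minimal agent configuration wiring a Spooling Directory Source to an HDFS Sink might look like the sketch below; the agent, channel, and sink names and the local/HDFS paths are illustrative assumptions, not values from this thread:

```properties
# Agent "agent1": spooldir source -> memory channel -> HDFS sink
agent1.sources = src1
agent1.channels = ch1
agent1.sinks = sink1

# Spooling Directory Source: files dropped here must be complete/immutable
agent1.sources.src1.type = spooldir
agent1.sources.src1.spoolDir = /var/log/myapp/spool
agent1.sources.src1.channels = ch1

# In-memory channel (simple, but events are lost if the agent dies)
agent1.channels.ch1.type = memory
agent1.channels.ch1.capacity = 10000

# HDFS Sink: write events as plain text into date-bucketed directories
agent1.sinks.sink1.type = hdfs
agent1.sinks.sink1.hdfs.path = hdfs://namenode:8020/flume/events/%Y-%m-%d
agent1.sinks.sink1.hdfs.fileType = DataStream
agent1.sinks.sink1.hdfs.useLocalTimeStamp = true
agent1.sinks.sink1.channel = ch1
```

Since your log format is private text, `hdfs.fileType = DataStream` keeps the events as-is rather than wrapping them in a sequence file.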

The log format is not an issue, as long as you don't need it to be
interpreted by Flume components. Since it's a log, I'm assuming one log
entry per line, with a line separator at the end of each line.

You can also look at the Exec source
(http://flume.apache.org/FlumeUserGuide.html#exec-source) for tailing
a file that is still being written by the application. The documentation
at the links above covers the details.
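For the tailing case, an Exec source sketch would replace the source definition; the command and path are assumptions for illustration:

```properties
# Exec source: tail a live log file into the same channel
# Note: per the Flume docs, the Exec source offers no delivery
# guarantee -- events can be lost if the agent or command fails.
agent1.sources.src1.type = exec
agent1.sources.src1.command = tail -F /var/log/myapp/app.log
agent1.sources.src1.channels = ch1
```

`tail -F` (rather than `-f`) keeps following the file across log rotation.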


On Sun, Mar 8, 2015 at 12:32 PM, Lin Ma <linlma@gmail.com> wrote:
> Hi Flume masters,
> I want to install Flume on a box, consume a local log file as the source,
> and send it to a remote HDFS sink. The log format is private and plain text
> (not Avro or JSON).
> I am reading the Flume guide and many advanced source configurations, and I
> am wondering: for a plain local log file source, are there any reference
> samples? Also, I am not sure whether Flume can consume a local file while
> the application is still writing to it? Thanks.
> regards,
> Lin


Blog: http://www.ashishpaliwal.com/blog
My Photo Galleries: http://www.pbase.com/ashishpaliwal
