flume-user mailing list archives

From Sadananda Hegde <saduhe...@gmail.com>
Subject Transferring compressed (gzip) files
Date Mon, 22 Oct 2012 15:18:37 GMT
My application servers produce data files in compressed (gzip) format. I am
planning to use Flume NG (1.2.0) to collect those files and transfer them to
a Hadoop cluster (write them to HDFS). Is it possible to read and transfer
them without uncompressing them first? My sink would be HDFS, and there are
options to compress before writing to HDFS. That would work fine if my
source were an uncompressed text file that needed to be stored in HDFS in
compressed format; but in my case, the source itself is already compressed.
What would be the best option for handling such cases?
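For reference, the HDFS-sink compression options I mean are roughly the following (a minimal sketch; the agent, channel, and path names are placeholders, and this assumes an uncompressed text source):

```properties
# Hypothetical agent layout -- names are examples only
agent1.sinks = sink1
agent1.sinks.sink1.type = hdfs
agent1.sinks.sink1.channel = ch1
agent1.sinks.sink1.hdfs.path = hdfs://namenode/flume/events

# Compress the output stream before writing to HDFS
agent1.sinks.sink1.hdfs.fileType = CompressedStream
agent1.sinks.sink1.hdfs.codeC = gzip
```

As I understand it, with `hdfs.fileType = CompressedStream` the sink compresses events as it writes them, which does not seem to fit the case where the incoming files are gzip files already.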

Thanks for your help.

