flume-user mailing list archives

From Sanjay Ramanathan <sanjay.ramanat...@lucidworks.com>
Subject "Hit max consecutive under-replication rotations" Error
Date Tue, 15 Jul 2014 22:17:56 GMT
Hi all,

While trying to write data from Flume to the HDFS sink, I'm getting this error:


[ERROR - org.apache.flume.sink.hdfs.BucketWriter.append(BucketWriter.java:566)] Hit max consecutive
under-replication rotations (30); will not continue rolling files under this path due to under-replication


I looked up the error online, and the suggested fix was the modification below (dfs.replication).
I made that change, but the problem still persists.

My Hadoop configuration (hdfs-site.xml) has the property:
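For reference, a dfs.replication override in hdfs-site.xml takes the following form; the value shown here is purely illustrative, not necessarily what my cluster uses:

```xml
<!-- Illustrative dfs.replication override; the value 1 is only an
     example and may differ from the actual cluster setting. -->
<property>
  <name>dfs.replication</name>
  <value>1</value>
</property>
```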


I also get this message 30 times before the above error message:
"Block Under-replication detected. Rotating file."

My Flume conf file has the configuration:
a1.sinks.k1.channel = c1
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://xx.xx.xx.xx:8020/input1/event/%y-%m-%d/%H%M
a1.sinks.k1.hdfs.useLocalTimeStamp = true
a1.sinks.k1.hdfs.round = true
a1.sinks.k1.hdfs.roundValue = 10
a1.sinks.k1.hdfs.roundUnit = minute
a1.sinks.k1.hdfs.writeFormat = Text
a1.sinks.k1.hdfs.fileType = DataStream
#a1.sinks.k1.hdfs.filePrefix = events-
a1.sinks.k1.hdfs.rollCount = 1000
a1.sinks.k1.hdfs.batchSize = 10000
a1.sinks.k1.hdfs.rollSize = 0
a1.sinks.k1.hdfs.rollInterval = 30
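From what I've read, one commonly suggested workaround, which I have not yet tried, is to pin the sink's minimum block replicas so that BucketWriter stops treating under-replicated blocks as a reason to roll (hdfs.minBlockReplicas is a documented HDFS sink property; the value 1 is an assumption about what would silence the check):

```
# Untried workaround suggested online: accept singly-replicated blocks
# so the sink stops rotating files on under-replication.
a1.sinks.k1.hdfs.minBlockReplicas = 1
```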

Kindly let me know what I'm doing wrong.

Sanjay Ramanathan
