I think I found the cause: one of the lines is larger than the configured limit. I tried to set flume.event.max.size.bytes on both the agent node and the collector
node, but the system doesn't seem to pick up the values.
<description>The length of line content in byte.</description>
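For reference, here is how I set it — a sketch assuming the Hadoop-style flume-site.xml format used by Flume 0.9.x; the file path and the 1 MB value are just my setup, not anything prescribed:

```xml
<!-- conf/flume-site.xml on both the agent and collector nodes (assumed layout) -->
<configuration>
  <property>
    <name>flume.event.max.size.bytes</name>
    <!-- raised from the default to 1 MB to accommodate the oversized line -->
    <value>1048576</value>
  </property>
</configuration>
```

I restarted the nodes after changing the file, so I would expect the new value to be picked up.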
Am I doing anything wrong?
I have encountered this problem with Flume twice. The Flume agent just keeps sending the same log file to the collector again and again, eventually filling up all the disk space on the collector host. Do you know what exactly causes Flume to lose
count of the lines and keep re-streaming? I saw it happen when I tried to stream some binary logs, and I saw it again today with normal logs (which may contain some binary data). I can replicate the problem easily. I am using "tail" to stream the content over.
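In case it helps, my dataflow spec looks roughly like this — a sketch using what I understand to be the Flume 0.9.x source/sink syntax; the host names, file path, and port are placeholders from my setup:

```
// assumed Flume OG dataflow spec: tail source on the agent, collectorSink on the collector
agent1 : tail("/var/log/myapp/app.log") | agentSink("collector1", 35853);
collector1 : collectorSource(35853) | collectorSink("hdfs://namenode/flume/logs", "app-");
```

The re-streaming starts as soon as tail hits the log file containing the binary data.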
Please let me know what the potential causes might be.