I have encountered this problem with Flume twice: the Flume agent just keeps sending the same log file to the collector over and over, eventually filling up all the disk space on the collector host. Does anyone know what exactly causes Flume to lose track of its position in the file and keep re-streaming it? I saw it happen when I tried to stream some binary logs, and I saw it again today with normal logs (which may contain some binary data). I can replicate the problem easily. I am using "tail" to stream the content over.
Please let me know what the potential causes might be.