| Property | Default | Description |
|---|---|---|
| capacity | 100 | The max number of events stored in the channel |
| transactionCapacity | 100 | The max number of events stored in the channel per transaction |
| keep-alive | 3 | Timeout in seconds for adding or removing an event |
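These properties are set per channel in the agent configuration file. A minimal sketch for a memory channel (the agent name `agent1`, channel name `ch1`, and the numbers are made-up placeholders, not recommendations):

```properties
# Hypothetical agent "agent1" with one memory channel "ch1".
# The three properties from the table above:
agent1.channels = ch1
agent1.channels.ch1.type = memory
agent1.channels.ch1.capacity = 10000
agent1.channels.ch1.transactionCapacity = 1000
agent1.channels.ch1.keep-alive = 3
```

Raising `capacity` only buys headroom for bursts; it does not change the sustained rate the sink can absorb.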
Thanks for your answer.
Additionally, I'm using a memory channel and writing the logs to MongoDB. When the input rate is faster than the consumption rate (writing into Mongo), the queue keeps growing, and once it reaches the maximum capacity, new incoming log events are lost.
So what I want to know is the exact point that balances the input and the output.
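One way to reason about that balance point: if events arrive at rate R_in and the sink drains at rate R_out, the channel only buffers the difference, so a memory channel can absorb bursts but cannot fix a sustained R_in > R_out. A rough back-of-envelope sketch (all rates and the burst duration are illustrative assumptions, not measured values):

```python
# Rough estimate of the channel capacity needed to ride out a traffic
# burst without dropping events. All numbers are illustrative.

def required_capacity(rate_in, rate_out, burst_seconds):
    """Events that accumulate while input outpaces the sink.

    If rate_in <= rate_out the backlog never grows, so no extra
    headroom is needed beyond the transaction size.
    """
    backlog_growth = max(rate_in - rate_out, 0)  # events/sec piling up
    return backlog_growth * burst_seconds

# Example: 5000 ev/s in, MongoDB sink drains 3000 ev/s,
# bursts last up to 60 seconds -> backlog of 120000 events.
print(required_capacity(5000, 3000, 60))
```

If the input rate exceeds the sink rate permanently, no capacity setting balances them; the fix is a faster sink (e.g. larger write batches) or more sinks draining the same channel.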
Here is one example of the capacity settings for a flow:
On Wed, May 15, 2013 at 2:16 PM, Nitin Pawar <firstname.lastname@example.org> wrote:
Sorry, pressed enter too soon.
As for your question: how many events can a Flume agent hold?
Sorry, but I don't think there is any direct answer to that. I may well be wrong there, as I am myself pretty new to Flume.
There was a JIRA about the capacity of file channels: FLUME-1571.
On Wed, May 15, 2013 at 1:50 PM, Nitin Pawar <email@example.com> wrote:
For maximum performance of your data flow, the two things that matter most are the channel and the transaction batch size.
When you say you are losing data, are you using a memory channel or a file channel?
Flume can batch events. The batch size is the maximum number of events that a sink or client will attempt to take from a channel in a single transaction.
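Many sinks expose this as a `batchSize` property in the agent configuration. A hedged sketch (the sink name `mongoSink` and the presence of a `batchSize` property on a third-party MongoDB sink are assumptions; check your sink's documentation):

```properties
# batchSize caps how many events one transaction takes from the channel.
agent1.sinks.mongoSink.batchSize = 100
# Keep batchSize <= the channel's transactionCapacity,
# otherwise the sink's take will fail.
agent1.channels.ch1.transactionCapacity = 100
```

A larger batch usually means fewer, bigger writes to MongoDB and a faster drain, at the cost of more events in flight per transaction.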
What is the channel type?
Do you have a slow sink, so that the number of events written out is less than the number of events coming into the channel, and the backlog piles up over time?
Others may point out more things.
Also, sharing your Flume configuration, and any errors you are seeing from Flume, will help people find the problem.
On Wed, May 15, 2013 at 11:07 AM, liuyongbo <firstname.lastname@example.org> wrote:
I'm using Flume to pass log data to MongoDB, but I find that some data is lost when the load is high. So I want to know the maximum request rate Flume can hold, and I need to print the channel's capacity usage, but I cannot find a proper way to do this short of changing the source code. Any ideas?