In fact, I can't really achieve a similar effect with threads and futures when the underlying client is blocking. I'm using a Flume (RPC) client in a Tomcat web service. If there are a lot of user requests and our Flume server is too slow to respond, the number of threads grows too much, which can cause performance issues on the web service.

Of course this kind of problem can always be solved on the server side, but it would be better to be able to handle latency with non-blocking IO on the client, in particular when working with asynchronous web frameworks like Play Framework (or Spray.io) that work best with a small number of threads.
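
For illustration, this is roughly what the threads-and-futures wrapper looks like on my side (just a sketch: the host, port and pool size are placeholders, and I'm assuming a single RpcClient instance can be shared by the pool's threads). It makes the call asynchronous for the caller, but each in-flight append still blocks one pool thread until the Flume server answers, so under load the pool either fills up or has to grow:

import org.apache.flume.Event;
import org.apache.flume.api.RpcClient;
import org.apache.flume.api.RpcClientFactory;
import org.apache.flume.event.EventBuilder;

import java.nio.charset.StandardCharsets;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class AsyncFlumeSender {

    // Bounded pool: caps the number of threads consumed by slow appends,
    // but each in-flight append still blocks one of these threads.
    private final ExecutorService pool = Executors.newFixedThreadPool(8);

    // "flume-host" and 4141 are placeholders for the actual agent address.
    private final RpcClient client =
            RpcClientFactory.getDefaultInstance("flume-host", 4141);

    public Future<Void> sendAsync(String message) {
        Callable<Void> task = () -> {
            Event event = EventBuilder.withBody(message, StandardCharsets.UTF_8);
            client.append(event); // blocking RPC call until the server acknowledges
            return null;
        };
        return pool.submit(task);
    }

    public void close() {
        pool.shutdown();
        client.close();
    }
}

With a truly non-blocking client, this pool (and the one-thread-per-pending-request cost) would not be needed at all.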

Thanks,

Loïc

On Jan 29, 2015 02:00, "Hari Shreedharan" <hshreedharan@cloudera.com> wrote:

>
> Currently, there is no non-blocking Flume client. You can use the standard threads-futures combination to achieve a similar effect.
>
> Thanks, 
> Hari
>
>
> On Wed, Jan 28, 2015 at 8:40 AM, Loic Descotte <loic.descotte@gmail.com> wrote:
>>
>> Hi all,
>>
>> I'm using Flume and I'd like to send messages in an asynchronous way,
>> without consuming (or blocking) too many threads.
>> Is there a way to call Flume in a non-blocking IO way, with the Thrift
>> or Avro client?
>>
>> Looking at this class (in the old Cloudera repository), it seems
>> possible, but the "nonBlocking" flag is set to false by default:
>> https://github.com/cloudera/flume/blob/master/flume-core/src/main/java/com/cloudera/flume/handlers/thrift/ThriftEventSink.java
>>
>> Do you know how to achieve this?
>>
>> Thanks, best regards
>>
>> Loïc
>>
>