flume-user mailing list archives

From Tobias Heintz <tobias.hei...@plista.com>
Subject Kafka sink cannot handle unreachable host
Date Tue, 20 Oct 2015 17:58:58 GMT
We are using the Kafka sink in Flume 1.6 to forward messages to our Kafka cluster of three
hosts. All three hosts are listed in the Flume configuration. If one of the hosts dies
(i.e. we kill the Kafka daemon), the sink behaves reasonably well: it starts publishing
data to one of the other hosts. However, if we merely interrupt the connection to a particular
host, for example by closing the firewall, the Kafka sink cannot cope.
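For reference, here is a sketch of the relevant part of our sink configuration, following the Flume 1.6 Kafka sink property layout (agent/sink/channel names, topic, and host names are placeholders, not our exact settings):

```properties
# Flume 1.6 Kafka sink (names and hosts are illustrative placeholders)
a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.k1.channel = c1
a1.sinks.k1.topic = events
a1.sinks.k1.brokerList = kafka1:9092,kafka2:9092,kafka3:9092
a1.sinks.k1.batchSize = 100
a1.sinks.k1.requiredAcks = 1

# Properties prefixed with "kafka." are passed through to the
# underlying producer. Tightening timeouts/retries like this might
# make the producer give up on an unreachable broker sooner, but we
# have not verified that it helps in the firewalled-host case.
a1.sinks.k1.kafka.request.timeout.ms = 5000
a1.sinks.k1.kafka.message.send.max.retries = 3
```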
The problem seems to be that the host is still part of the Kafka cluster, since communication
within the cluster works fine; it simply cannot be reached from the host running Flume. The
Kafka sink should recognize this and fail over to one of the other hosts. Instead it keeps
trying to connect to the unreachable host until the channel overflows and we start losing messages.

Am I misunderstanding some concepts here? Is there something I can configure to enable a more
desirable behavior?

Thanks
tobias

--
Tobias Heintz
Teamlead Core

Telefon: +49 30 47375370 | Fax: +49 30 484984411
E-Mail: tobias.heintz@plista.com
Links: www.plista.com

plista GmbH | Torstraße 33-35 | 10119 Berlin | Deutschland
Amtsgericht Berlin-Charlottenburg | HRB 114726 B
Geschäftsführer: Andreas Richter | Christian Laase | Dr. Dominik Matyka | Jana Kusick
