I am exploring Flume-NG.
1) While using Avro as the source and HDFS as the sink, I am getting the error "[ERROR - org.apache.flume.api.NettyAvroRpcClient.connect(NettyAvroRpcClient.java:125)] RPC connection error :" (error log file attached). Kindly help in resolving it.
1 a) The flume.conf file is as follows:
# Define a memory channel called ch1 on agent1
agent1.channels.ch1.type = memory
# Define an Avro source called avro-source1 on agent1 and tell it
# to bind to 0.0.0.0:41414. Connect it to channel ch1.
agent1.sources.avro-source1.channels = ch1
agent1.sources.avro-source1.type = avro
agent1.sources.avro-source1.bind = 22.214.171.124
agent1.sources.avro-source1.port = 41414
# Define a hdfs sink that simply logs all events it receives
# and connect it to the other end of the same channel.
agent1.sinks.HDFS.channel = ch1
agent1.sinks.HDFS.type = hdfs
agent1.sinks.HDFS.hdfs.path = hdfs://126.96.36.199:54310/user/hadoop-node1/flume-test/
agent1.sinks.HDFS.hdfs.fileType = DataStream
# Finally, now that we've defined all of our components, tell
# agent1 which ones we want to activate.
agent1.channels = ch1
agent1.sources = avro-source1
agent1.sinks = HDFS
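(For anyone reading along: one common cause of this NettyAvroRpcClient error is the Avro source being bound to one interface while the client connects to another. As a diagnostic sketch only, not a recommended production setting, binding the source to all interfaces rules out an address mismatch:)

```
# Diagnostic variant: bind the Avro source to all interfaces so the
# avro-client can reach it via localhost or any host IP.
agent1.sources.avro-source1.bind = 0.0.0.0
agent1.sources.avro-source1.port = 41414
```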
1 b) Mine is a single-node Hadoop setup. The commands executed for Flume are:
bin/flume-ng agent --conf ./conf/ -f conf/flumeAVRO_HDFS.conf -n agent1 (in one console)
bin/flume-ng avro-client --conf conf -H 188.8.131.52 -p 41414 -F /home/hadoop-node1/Desktop/my.txt (in another console, for the Avro client). I also tried the command with the -H localhost option, but got the same error.
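(A quick sanity check that may help narrow this down: the RPC connection error often just means nothing is accepting connections on the Avro source port. A small probe, assuming bash and the host/port values from the config above, can confirm whether the agent is actually listening before the avro-client is run:)

```shell
#!/bin/bash
# Probe the Avro source port using bash's built-in /dev/tcp redirection.
# Adjust HOST/PORT to match the bind address and port in flume.conf.
HOST=localhost
PORT=41414
if (echo > "/dev/tcp/${HOST}/${PORT}") 2>/dev/null; then
  echo "Avro source is listening on ${HOST}:${PORT}"
else
  echo "nothing listening on ${HOST}:${PORT} - start the agent first"
fi
```

If the probe reports nothing listening, the agent itself failed to start (check the agent console for bind errors) and the avro-client error is just a symptom.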
1 c) Hadoop version 0.20 is used.
2) Please provide links on how to explore the various combinations of sources, channels, and sinks in Flume-NG.