flume-user mailing list archives

From Nitin Pawar <nitinpawar...@gmail.com>
Subject Re: Unable to setup HDFS sink
Date Mon, 14 Jan 2013 07:17:52 GMT
The correct value maps to fs.default.name in your core-site.xml.

So whatever value you have there, you will need to use the same value for
the Flume HDFS sink.
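
For example (a minimal sketch, assuming a pseudo-distributed Hadoop 1.x
setup; the host and port must match whatever your core-site.xml actually
contains), core-site.xml would have an entry like:

  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9100</value>
  </property>

and the Flume sink path would then use the same authority:

  agent1.sinks.hdfssink.hdfs.path = hdfs://localhost:9100/flume/events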


On Mon, Jan 14, 2013 at 12:37 PM, Nitin Pawar <nitinpawar432@gmail.com> wrote:

> It's a JobTracker URI (port 50030 is the JobTracker web UI, not HDFS).
>
> There should be a config entry in your hdfs-site.xml or core-site.xml which
> looks like
> hdfs://localhost:9100/
>
> You need to use that value.
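>
> A quick way to look it up (a sketch; assumes HADOOP_HOME points at your
> Hadoop install and that <name> and <value> sit on adjacent lines in the
> XML):
>
>   grep -A1 fs.default.name $HADOOP_HOME/conf/core-site.xml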
> On Jan 14, 2013 12:34 PM, "Vikram Kulkarni" <vikulkarni@expedia.com>
> wrote:
>
>> I was able to write using the same HDFS conf from a different sink.
>> Also, I can open the MapReduce administration page successfully at
>> http://localhost:50030/jobtracker.jsp, so that should indicate that the
>> HDFS path below is valid, right? Any other way to check?
>>
>> Thanks.
>>
>> On 1/13/13 10:57 PM, "Alexander Alten-Lorenz" <wget.null@gmail.com>
>> wrote:
>>
>> >Hi,
>> >
>> >Check your HDFS cluster; it's not responding on localhost/127.0.0.1:50030.
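>> >
>> >For example, from the same host (a sketch; assumes the Hadoop 1.x CLI is
>> >installed and on the PATH):
>> >
>> >  hadoop fs -ls hdfs://localhost:50030/
>> >
>> >If that fails with the same EOFException, the process on that port is not
>> >a NameNode; 50030 is the JobTracker web UI port.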
>> >
>> >- Alex
>> >
>> >On Jan 14, 2013, at 7:43 AM, Vikram Kulkarni <vikulkarni@expedia.com>
>> >wrote:
>> >
>> >> I am trying to set up an HDFS sink for an HTTPSource, but I get the
>> >>following exception when I try to send a simple JSON event. I am also
>> >>using a logger sink and I can clearly see the event output in the
>> >>console window, but it fails to write to HDFS. I have also successfully
>> >>written to an HDFS sink in a separate conf file.
>> >>
>> >> Thanks,
>> >> Vikram
>> >>
>> >> Exception:
>> >> [WARN - org.apache.flume.sink.hdfs.HDFSEventSink.process(HDFSEventSink.java:456)]
>> >> HDFS IO error
>> >> java.io.IOException: Call to localhost/127.0.0.1:50030 failed on local
>> >>exception: java.io.EOFException
>> >> at org.apache.hadoop.ipc.Client.wrapException(Client.java:1144)
>> >>
>> >> My conf file is as follows:
>> >> # flume-httphdfs.conf: a single-node Flume agent with an HTTP source
>> >>and HDFS sink
>> >>
>> >> # Name the components on this agent
>> >> agent1.sources = r1
>> >> agent1.channels = c1
>> >>
>> >> # Describe/configure the source
>> >> agent1.sources.r1.type = org.apache.flume.source.http.HTTPSource
>> >> agent1.sources.r1.port = 5140
>> >> agent1.sources.r1.handler = org.apache.flume.source.http.JSONHandler
>> >> agent1.sources.r1.handler.nickname = random props
>> >>
>> >> # Describe the sink
>> >> agent1.sinks = logsink hdfssink
>> >> agent1.sinks.logsink.type = logger
>> >>
>> >> agent1.sinks.hdfssink.type = hdfs
>> >> agent1.sinks.hdfssink.hdfs.path = hdfs://localhost:50030/flume/events
>> >> agent1.sinks.hdfssink.hdfs.file.Type = DataStream
>> >>
>> >> # Use a channel which buffers events in memory
>> >> agent1.channels.c1.type = memory
>> >> agent1.channels.c1.capacity = 1000
>> >> agent1.channels.c1.transactionCapacity = 100
>> >>
>> >> # Bind the source and sink to the channel
>> >> agent1.sources.r1.channels = c1
>> >> agent1.sinks.logsink.channel = c1
>> >> agent1.sinks.hdfssink.channel = c1
>> >>
>> >>
>> >
>> >--
>> >Alexander Alten-Lorenz
>> >http://mapredit.blogspot.com
>> >German Hadoop LinkedIn Group: http://goo.gl/N8pCF
>> >
>>
>>


-- 
Nitin Pawar
