flume-user mailing list archives

From Chris Horrocks <chrisjhorro...@gmail.com>
Subject Re: Hadoop 2.x Compatibility
Date Fri, 29 Aug 2014 15:09:40 GMT
Yes please, that would be very helpful. Are you using the tar release from
the Flume website?


On Fri, Aug 29, 2014 at 3:54 PM, Sandeep khurana <skhurana333@gmail.com>
wrote:

> I have set up Flume 1.5 and use the HDFS sink on Hadoop 2, on my local
> laptop. The Flume agent has started, has saved data into HDFS via the
> sink, and is running without any problem.
>
>
> Let me know if you want to see the jars in the lib folder of my Flume
> installation.
> ------------------------------
> From: Chris Horrocks <chrisjhorrocks@gmail.com>
> Sent: ‎29-‎08-‎2014 20:16
> To: user@flume.apache.org
> Subject: Hadoop 2.x Compatibility
>
> Hi All,
>
> I'm pretty new to Flume, so forgive the newbie question, but I've been
> working with Hadoop 2.x for a little while.
>
> I'm trying to configure Flume (1.5.0) with an HDFS sink; however, the
> agent won't start, citing the following error:
>
> 29 Aug 2014 13:40:13,435 ERROR [conf-file-poller-0]
> (org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run:145)
>  - Failed to start agent because dependencies were not found in classpath.
> Error follows.
> java.lang.NoClassDefFoundError:
> org/apache/hadoop/io/SequenceFile$CompressionType
>         at
> org.apache.flume.sink.hdfs.HDFSEventSink.configure(HDFSEventSink.java:251)
>         at
> org.apache.flume.conf.Configurables.configure(Configurables.java:41)
>         at
> org.apache.flume.node.AbstractConfigurationProvider.loadSinks(AbstractConfigurationProvider.java:418)
>         at
> org.apache.flume.node.AbstractConfigurationProvider.getConfiguration(AbstractConfigurationProvider.java:103)
>         at
> org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run(PollingPropertiesFileConfigurationProvider.java:140)
>         at
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>         at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:304)
>         at
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:178)
>         at
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
>         at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.ClassNotFoundException:
> org.apache.hadoop.io.SequenceFile$CompressionType
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>         ... 12 more
>
> From some searching around, it appears that Flume is trying to reference
> the 'hadoop-core.jar' file, which has since been deprecated.
>
> Am I missing something obvious here? Or does Flume 1.5.0 not support
> HDFS sinks on Hadoop 2.x?
>
>
> Regards
>
> Chris
>
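The NoClassDefFoundError in the quoted message means the Hadoop client
classes are not on Flume's classpath: the Apache Flume binary tarball does
not bundle the Hadoop jars, so the HDFS sink can't find
org.apache.hadoop.io.SequenceFile$CompressionType. One common fix is to
point Flume at a local Hadoop installation in conf/flume-env.sh. This is a
sketch only; the HADOOP_HOME path below is a hypothetical example and must
be adjusted to the actual installation:

```shell
# conf/flume-env.sh -- sketch; the path is an example, adjust to your install.

# Hypothetical location of a local Hadoop 2.x installation.
export HADOOP_HOME=/opt/hadoop-2.4.0

# Put the Hadoop client jars on Flume's classpath. `hadoop classpath`
# prints the jars the Hadoop distribution ships, including hadoop-common,
# which provides SequenceFile$CompressionType.
FLUME_CLASSPATH="$("$HADOOP_HOME/bin/hadoop" classpath)"
```

Alternatively, if the `hadoop` command is already on the PATH, the
bin/flume-ng launcher script will generally discover the Hadoop classpath
itself by running `hadoop classpath`, so simply ensuring Hadoop is on the
PATH of the user starting the agent may be enough.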
