flume-user mailing list archives

From Chris Horrocks <chrisjhorro...@gmail.com>
Subject Re: Hadoop 2x Compatibility
Date Sat, 30 Aug 2014 08:53:06 GMT
Thanks Sandeep and Hari; your help is much appreciated.

I kept hitting build errors on the TestAsyncHBaseSink test, but rather than
ignoring them I tried the 1.5.0.1 binaries, which worked perfectly the first
time.


On Fri, Aug 29, 2014 at 7:23 PM, Hari Shreedharan <hshreedharan@cloudera.com> wrote:

> How are you using Flume? If you downloaded the Flume binary tarball, that
> is built against Hadoop 1.1 and will not work against Hadoop 2. You can,
> however, download and build Flume against Hadoop 2.x, using either the
> hadoop-2 profile (mvn clean package -Dhadoop.profile=2) or the hbase-98
> profile (mvn clean package -Dhadoop.profile=hbase-98).
>
> You could also use a vendor-packaged Flume built against Hadoop 2.
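For reference, the two source builds described above can be run roughly like this (the download URL is illustrative; any Flume 1.5.0 source tarball works the same way, and tests are skipped here since they can fail in some local environments):

```shell
# Sketch: build Flume 1.5.0 from source against Hadoop 2.x.
# URL is illustrative; adjust to the release you need.
wget https://archive.apache.org/dist/flume/1.5.0/apache-flume-1.5.0-src.tar.gz
tar -xzf apache-flume-1.5.0-src.tar.gz
cd apache-flume-1.5.0-src

# Build with the hadoop-2 profile:
mvn clean package -Dhadoop.profile=2 -DskipTests

# Or with the hbase-98 profile:
# mvn clean package -Dhadoop.profile=hbase-98 -DskipTests

# The rebuilt binary tarball typically ends up under flume-ng-dist/target/
```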
>
> Sandeep Khurana wrote:
>
>
> And yes, I took the tar from the Apache Flume site only.
>
>
> On Fri, Aug 29, 2014 at 9:15 PM, Sandeep Khurana <skhurana333@gmail.com> wrote:
>
>     Here it is. But I think the problem might be somewhere else. Do you
>     have the HADOOP_HOME environment variable set properly?
>
>     Also, I am not using the sequence file format; do you need it? It is
>     the default setting, so you can try out the text format in the Flume
>     HDFS sink settings instead. Try changing from
>
>     hdfs.fileType SequenceFile
>
>     to
>
>     hdfs.fileType DataStream
>
>     in your flume conf file.
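A minimal sketch of what that looks like in a Flume conf file (the agent and sink names here are made up; only the hdfs.fileType line is the point):

```
# Hypothetical names: agent1 / hdfs-sink. Only hdfs.fileType matters here.
agent1.sinks.hdfs-sink.type = hdfs
agent1.sinks.hdfs-sink.hdfs.path = hdfs://localhost:8020/flume/events
agent1.sinks.hdfs-sink.hdfs.fileType = DataStream
agent1.sinks.hdfs-sink.hdfs.writeFormat = Text
```

With DataStream, events are written as plain text rather than sequence-file records.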
>
>
>
>     On Fri, Aug 29, 2014 at 8:39 PM, Chris Horrocks <chrisjhorrocks@gmail.com> wrote:
>
>         Yes please, that would be very helpful. Are you using the tar
>         release from the Flume website?
>
>
>         On Fri, Aug 29, 2014 at 3:54 PM, Sandeep Khurana <skhurana333@gmail.com> wrote:
>
>             I have set up Flume 1.5 and use the HDFS sink on Hadoop 2.
>             It's on my local laptop. The Flume agent started, saved data
>             into HDFS using the sink fine, and is running without any
>             problem.
>
>
>             Let me know if you want to see the jars in the lib folder of
>             my Flume installation.
>
> ------------------------------------------------------------------------
>             From: Chris Horrocks <chrisjhorrocks@gmail.com>
>
>             Sent: ‎29-‎08-‎2014 20:16
>             To: user@flume.apache.org
>
>             Subject: Hadoop 2x Compatibility
>
>             Hi All,
>
>             I'm pretty new to Flume, so forgive the newbie question,
>             but I've been working with Hadoop 2.x for a little while.
>
>             I'm trying to configure Flume (1.5.0) with an HDFS sink;
>             however, the agent won't start, citing the following error:
>
>             29 Aug 2014 13:40:13,435 ERROR [conf-file-poller-0] (org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run:145)
>             - Failed to start agent because dependencies were not found in classpath. Error follows.
>             java.lang.NoClassDefFoundError: org/apache/hadoop/io/SequenceFile$CompressionType
>                 at org.apache.flume.sink.hdfs.HDFSEventSink.configure(HDFSEventSink.java:251)
>                 at org.apache.flume.conf.Configurables.configure(Configurables.java:41)
>                 at org.apache.flume.node.AbstractConfigurationProvider.loadSinks(AbstractConfigurationProvider.java:418)
>                 at org.apache.flume.node.AbstractConfigurationProvider.getConfiguration(AbstractConfigurationProvider.java:103)
>                 at org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run(PollingPropertiesFileConfigurationProvider.java:140)
>                 at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>                 at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:304)
>                 at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:178)
>                 at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
>                 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>                 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>                 at java.lang.Thread.run(Thread.java:745)
>             Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.SequenceFile$CompressionType
>                 at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>                 at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>                 at java.security.AccessController.doPrivileged(Native Method)
>                 at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>                 at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>                 at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>                 at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>                 ... 12 more
>
>             From some searching around, it appears that Flume is trying
>             to reference the 'hadoop-core.jar' file, which has since
>             been deprecated.
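One quick way to check this on the agent host: on Hadoop 2 the missing SequenceFile class ships in hadoop-common rather than the old hadoop-core jar, and the flume-ng startup script can only pick it up if HADOOP_HOME (or the hadoop command) is visible. A sanity-check sketch, assuming a typical Hadoop 2 install layout (paths are illustrative):

```shell
# Is a Hadoop 2 install visible to the Flume agent?
echo "$HADOOP_HOME"
ls "$HADOOP_HOME"/share/hadoop/common/hadoop-common-*.jar

# Confirm the class Flume cannot find is inside hadoop-common:
unzip -l "$HADOOP_HOME"/share/hadoop/common/hadoop-common-*.jar \
  | grep 'org/apache/hadoop/io/SequenceFile'
```

If the grep finds nothing, the agent is being started without a Hadoop 2 classpath.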
>
>             Am I missing something obvious here? Or does Flume 1.5.0
>             not support Hadoop 2.x HDFS sinks?
>
>
>             Regards
>
>             Chris
>
>
>
>
>
>     --
>     Thanks and regards
>     Sandeep Khurana
>
>
>
>
> --
> Thanks and regards
> Sandeep Khurana
>
>
