flume-user mailing list archives

From Surindhar <surind...@gmail.com>
Subject Re: FLumeNG -HDFS errors
Date Fri, 12 Oct 2012 11:09:23 GMT
Hi,

Your Hadoop libraries are not on the classpath of the JVM from which you run Flume.

The following should work; please change the paths to match your own
installations:

1)

C:\Users>set CLASSPATH=.;D:\Source\my-flume\my-flume;D:\Source\apache-flume-1.3.0-SNAPSHOT-dist\apache-flume-1.3.0-SNAPSHOT\lib\*;D:\Source\hadoop-1.0.3\hadoop-1.0.3\*;D:\Source\hadoop-1.0.3\hadoop-1.0.3\lib\*

2)

C:\Users>java org.apache.flume.node.Application -f D:\Source\my-flume\my-flume\flume-conf.properties -n host1
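Before running step 2, it can help to confirm that the Hadoop jars on that
CLASSPATH really contain the class the stack trace below reports as missing.
This is not from the original reply — just a sketch in Python 3 (stdlib only;
`jars_containing` is a hypothetical helper, not a Flume or Hadoop tool) that
inspects each jar named by a CLASSPATH-style string:

```python
# Diagnostic sketch: report which jars on a CLASSPATH-style string
# actually contain a given class, e.g. the DistributedFileSystem class
# that the HDFS sink needs. Handles Java-style "dir\*" wildcard entries.
import glob
import os
import zipfile

def jars_containing(classpath, class_name):
    """Return the jars on `classpath` whose entries include `class_name`."""
    entry_path = class_name.replace(".", "/") + ".class"
    hits = []
    for entry in classpath.split(os.pathsep):
        if entry.endswith("*"):
            # Java-style wildcard entry: every jar in that directory
            jars = glob.glob(os.path.join(entry[:-1], "*.jar"))
        elif entry.endswith(".jar"):
            jars = [entry]
        else:
            continue  # plain directories hold loose .class files; skipped here
        for jar in jars:
            try:
                with zipfile.ZipFile(jar) as zf:
                    if entry_path in zf.namelist():
                        hits.append(jar)
            except (zipfile.BadZipFile, OSError):
                pass  # unreadable or non-zip entry; ignore it
    return hits

if __name__ == "__main__":
    cp = os.environ.get("CLASSPATH", "")
    print(jars_containing(cp, "org.apache.hadoop.hdfs.DistributedFileSystem"))
```

Run it with the CLASSPATH from step 1 exported (the path separator follows the
OS, `;` on Windows and `:` elsewhere). An empty list means the Hadoop 1.x core
jar, which should bundle the HDFS classes, is not being picked up.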


Best Regards,




On Fri, Oct 12, 2012 at 3:41 PM, Rewti Ingle
<Rewti_Ingle@persistent.co.in>wrote:

>   Hi,
>
> I have installed FlumeNG [1.2] and Hadoop [1.0.3].
>
> I am getting the following error while running the Flume agent with the
> HDFS sink:
>
> 2012-10-11 16:55:57,640 WARN hdfs.HDFSEventSink: HDFS IO error
>
> java.io.IOException: java.lang.RuntimeException:
> java.lang.ClassNotFoundException: org.apache.hadoop.hdfs.DistributedFileSystem
>         at org.apache.flume.sink.hdfs.BucketWriter.doOpen(BucketWriter.java:202)
>         at org.apache.flume.sink.hdfs.BucketWriter.access$000(BucketWriter.java:48)
>         at org.apache.flume.sink.hdfs.BucketWriter$1.run(BucketWriter.java:155)
>         at org.apache.flume.sink.hdfs.BucketWriter$1.run(BucketWriter.java:152)
>         at org.apache.flume.sink.hdfs.BucketWriter.runPrivileged(BucketWriter.java:125)
>         at org.apache.flume.sink.hdfs.BucketWriter.open(BucketWriter.java:152)
>         at org.apache.flume.sink.hdfs.BucketWriter.append(BucketWriter.java:307)
>         at org.apache.flume.sink.hdfs.HDFSEventSink$1.call(HDFSEventSink.java:717)
>         at org.apache.flume.sink.hdfs.HDFSEventSink$1.call(HDFSEventSink.java:714)
>         at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>         at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>         at java.lang.Thread.run(Thread.java:662)
> Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException:
> org.apache.hadoop.hdfs.DistributedFileSystem
>         at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1128)
>         at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1729)
>         at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:74)
>         at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:1768)
>         at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1750)
>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:234)
>         at org.apache.hadoop.fs.Path.getFileSystem(Path.java:189)
>         at org.apache.flume.sink.hdfs.BucketWriter.doOpen(BucketWriter.java:186)
>         ... 13 more
> Caused by: java.lang.ClassNotFoundException:
> org.apache.hadoop.hdfs.DistributedFileSystem
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:247)
>         at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1074)
>         at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1126)
>         ... 20 more
>
>
> My Flume configuration file:
>
> ## Agent to copy the log from source to HDFS sink
>
> # Define a memory channel called ch1 on agent1
> agent1.channels.ch1.type = memory
>
> # Define an EXEC source called src on agent1 and connect it to channel ch1.
> agent1.sources.src.channels = ch1
> agent1.sources.src.type = exec
> agent1.sources.src.command = tail -F /opt/Flume/test/SampleLog
>
> # Define a HDFS sink and connect it to the other end of the same channel.
> agent1.sinks.HDFS.channel = ch1
> agent1.sinks.HDFS.type = hdfs
> agent1.sinks.HDFS.hdfs.path = hdfs://localhost:8020/user/hdfs
> agent1.sinks.HDFS.hdfs.fileType = DataStream
> agent1.sinks.HDFS.hdfs.writeFormat = Text
> agent1.sinks.HDFS.hdfs.filePrefix = FlumeTest
>
> # Finally, now that we've defined all of our components, tell
> # agent1 which ones we want to activate.
> agent1.channels = ch1
> agent1.sources = src
> agent1.sinks = HDFS
>
> Both Flume and Hadoop are installed on the same box.
>
> Please let me know where I am going wrong. Are Flume [1.2] and Hadoop
> [1.0.3] compatible with each other?
>
> Regards,
> Rewti
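For completeness (not part of the original exchange): the wiring in the quoted
configuration — every source and sink pointing at a declared channel — can be
sanity-checked mechanically before chasing classpath problems. A minimal
sketch, assuming Python 3 (`check_wiring` is a hypothetical helper, not a
Flume tool):

```python
def check_wiring(props_text, agent):
    """Cross-check that every source/sink of `agent` in a Flume-style
    properties file references a channel declared in `<agent>.channels`."""
    props = {}
    for line in props_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    channels = set(props.get(agent + ".channels", "").split())
    problems = []
    # Sources may list several channels, space-separated.
    for name in props.get(agent + ".sources", "").split():
        used = set(props.get(f"{agent}.sources.{name}.channels", "").split())
        if not used <= channels:
            problems.append(f"source {name}: undeclared channel(s) {sorted(used - channels)}")
    # Sinks take exactly one channel.
    for name in props.get(agent + ".sinks", "").split():
        ch = props.get(f"{agent}.sinks.{name}.channel", "")
        if ch not in channels:
            problems.append(f"sink {name}: undeclared channel {ch!r}")
    return problems
```

An empty result means the channel wiring is consistent — which it is for the
config quoted above, pointing the finger back at the missing Hadoop jars
rather than at the configuration.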
