flume-user mailing list archives

From Connor Woodson <cwoodson....@gmail.com>
Subject Re: HDFS Test Failure
Date Mon, 28 Jan 2013 02:24:12 GMT
I ran it with sudo to try to get around that, which worked; then I changed
the default umask to 0022 and it works without sudo. However, I am still
getting the timeout on the HBase sink tests. Is there something else
that's not set correctly?
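
(For anyone else who hits this, roughly what the change amounts to from a
shell - ~/.profile is just where I would expect it to live on a stock Ubuntu
account, so treat that part as an assumption:

    umask             # a stock Ubuntu account prints 0002 here
    umask 0022        # current shell only; new dirs are created 755
    mvn clean test    # the MiniDFSCluster permission check now passes

Putting "umask 0022" in ~/.profile makes it survive new shells.)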

Thanks,

- Connor


On Sun, Jan 27, 2013 at 4:01 PM, Brock Noland <brock@cloudera.com> wrote:

> Have you tried my umask suggestion? More detail here:
> http://s.apache.org/9or
>
> On Sun, Jan 27, 2013 at 5:44 PM, Connor Woodson <cwoodson.dev@gmail.com>
> wrote:
> > Nope, doesn't work. Doing 'mvn clean install' still gives me errors on the
> > HDFS mini cluster test; doing an install without tests and then running
> > 'mvn test' doesn't work either.
> >
> > Running the tests with sudo generates the same results as before: HDFS tests
> > work, but TestAsyncHBaseSink / TestHBaseSink end up timing out.
> >
> > - Connor
> >
> >
> > On Fri, Jan 25, 2013 at 6:20 PM, Mike Percy <mpercy@apache.org> wrote:
> >>
> >> Seems strange. Connor, have you tried running "mvn clean install", and do
> >> you get the same results?
> >>
> >> Flume is a bit unusual because we push SNAPSHOT builds per commit, so you
> >> sometimes have to install locally to avoid strange dependency issues. It's
> >> especially insidious to do just 'mvn clean package'.
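> >>
> >> (Roughly the sequence I'd expect to be safe - -DskipTests is just the
> >> standard Surefire flag, nothing Flume-specific:
> >>
> >>   mvn clean install -DskipTests   # refresh the SNAPSHOT jars in ~/.m2
> >>   mvn test                        # run the tests against those jars
> >>
> >> whereas a bare 'mvn clean package' can end up resolving sibling modules'
> >> SNAPSHOTs from the remote repo, i.e. someone else's commit, instead of
> >> the code you just built.)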
> >>
> >> I don't know if it's related to this problem but I'd be +1 for disabling
> >> pushing SNAPSHOT builds to Maven, unless anyone sees the benefit of
> >> keeping it this way.
> >>
> >> Regards,
> >> Mike
> >>
> >>
> >> On Fri, Jan 25, 2013 at 5:38 PM, Connor Woodson <cwoodson.dev@gmail.com>
> >> wrote:
> >>>
> >>> Running "mvn clean test" as root, the HDFS test doesn't crash.
> >>> TestAsyncHBaseSink takes a long time but succeeds. TestHBaseSink, however,
> >>> fails after a while when it times out.
> >>>
> >>> How can I get this to work without running in 'sudo' mode, and why might
> >>> TestHBaseSink be hanging for just me?
> >>>
> >>> - Connor
> >>>
> >>>
> >>>
> >>> On Sat, Jan 19, 2013 at 3:06 PM, Brock Noland <brock@cloudera.com>
> >>> wrote:
> >>>>
> >>>> I think there is/was a bug in HDFS which caused an NPE due to umask.
> >>>>
> >>>> My guess is it's 0002, whereas it needs to be 0022.
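> >>>>
> >>>> (A quick sanity check, assuming a bash-like shell where "umask" on its
> >>>> own prints the current value, plus the permissions each value implies:
> >>>>
> >>>>   dir mode = 0777 & ~umask
> >>>>   umask 0002  ->  0775 = rwxrwxr-x   (the "actual" in the warnings below)
> >>>>   umask 0022  ->  0755 = rwxr-xr-x   (the "expected" the DataNode wants)
> >>>>
> >>>> With 0002 the DataNode rejects both data dirs and MiniDFSCluster then
> >>>> NPEs in startDataNodes, which is the stack trace you posted.)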
> >>>>
> >>>> On Sat, Jan 19, 2013 at 2:56 PM, Connor Woodson <cwoodson.dev@gmail.com>
> >>>> wrote:
> >>>> > Running "mvn test" on the latest Flume code, I get a test failure in
> >>>> > TestHDFSEventSinkOnMiniCluster.
> >>>> >
> >>>> > I'm using a fresh install of Ubuntu - is there a package I'm supposed
> >>>> > to install for it to work?
> >>>> >
> >>>> >   <testcase time="2.092" classname="org.apache.flume.sink.hdfs.TestHDFSEventSinkOnMiniCluster" name="org.apache.flume.sink.hdfs.TestHDFSEventSinkOnMiniCluster">
> >>>> >     <error type="java.lang.NullPointerException">java.lang.NullPointerException
> >>>> > at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:422)
> >>>> > at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:280)
> >>>> > at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:124)
> >>>> > at org.apache.flume.sink.hdfs.TestHDFSEventSinkOnMiniCluster.setup(TestHDFSEventSinkOnMiniCluster.java:73)
> >>>> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>>> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> >>>> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >>>> > at java.lang.reflect.Method.invoke(Method.java:616)
> >>>> > at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:45)
> >>>> > at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
> >>>> > at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:42)
> >>>> > at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:27)
> >>>> > at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:30)
> >>>> > at org.junit.runners.ParentRunner.run(ParentRunner.java:300)
> >>>> > at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:236)
> >>>> > at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:134)
> >>>> > at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:113)
> >>>> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>>> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> >>>> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >>>> > at java.lang.reflect.Method.invoke(Method.java:616)
> >>>> > at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:189)
> >>>> > at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:165)
> >>>> > at org.apache.maven.surefire.booter.ProviderFactory.invokeProvider(ProviderFactory.java:85)
> >>>> > at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:103)
> >>>> > at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:74)
> >>>> > </error>
> >>>> >     <system-out>2013-01-19 22:46:11,966 (main) [WARN - org.apache.hadoop.metrics2.impl.MetricsSystemImpl.init(MetricsSystemImpl.java:137)] Metrics system not started: Cannot locate configuration: tried hadoop-metrics2-namenode.properties, hadoop-metrics2.properties
> >>>> > Starting DataNode 0 with dfs.data.dir: target/test/dfs/dfs/data/data1,target/test/dfs/dfs/data/data2
> >>>> > 2013-01-19 22:46:12,950 (main) [WARN - org.apache.hadoop.metrics2.impl.MetricsSystemImpl.init(MetricsSystemImpl.java:137)] Metrics system not started: Cannot locate configuration: tried hadoop-metrics2-datanode.properties, hadoop-metrics2.properties
> >>>> > 2013-01-19 22:46:12,970 (main) [WARN - org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1577)] Invalid directory in dfs.data.dir: Incorrect permission for target/test/dfs/dfs/data/data1, expected: rwxr-xr-x, while actual: rwxrwxr-x
> >>>> > 2013-01-19 22:46:12,978 (main) [WARN - org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1577)] Invalid directory in dfs.data.dir: Incorrect permission for target/test/dfs/dfs/data/data2, expected: rwxr-xr-x, while actual: rwxrwxr-x
> >>>> > 2013-01-19 22:46:12,978 (main) [ERROR - org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1583)] All directories in dfs.data.dir are invalid.
> >>>> > </system-out>
> >>>> >   </testcase>
> >>>>
> >>>>
> >>>>
> >>>> --
> >>>> Apache MRUnit - Unit testing MapReduce -
> >>>> http://incubator.apache.org/mrunit/
> >>>
> >>>
> >>
> >
>
>
>
> --
> Apache MRUnit - Unit testing MapReduce -
> http://incubator.apache.org/mrunit/
>
