phoenix-user mailing list archives

From: Gabriel Reid <gabriel.r...@gmail.com>
Subject: Re: Help With CSVBulkLoadTool
Date: Fri, 23 Oct 2015 05:18:32 GMT
Hi Zack,

I can't give you any information about compatibility of a given
Phoenix version with a given version of HDP (because I don't know).

However, could you give a bit more info on what you're seeing? Are all
import jobs failing with this error for a given set of tables? Or is
this a random failure that can happen on any table?

This error looks to me like it would be some kind of configuration
issue with your cluster(s), but if that's the case then I would expect
that you'd be getting the same error every time.
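
For what it's worth, that "No regions passed" message comes from HBase's
HFileOutputFormat: while setting up the bulk load job it asks the client
for the table's region start keys, and it throws exactly that
IllegalArgumentException when the list comes back empty. So one quick way
to narrow this down is to check how many regions the client side of the
job can actually see for the table. Below is a minimal sketch against the
HBase 0.98 client API — "MY_TABLE" is just a placeholder for whatever you
pass to --table, and this is a diagnostic idea rather than anything the
bulk load tool ships with:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.util.Bytes;
    import org.apache.hadoop.hbase.util.Pair;

    public class RegionCheck {
        public static void main(String[] args) throws Exception {
            // Reads hbase-site.xml from the classpath, just like the
            // bulk load tool does when it runs.
            Configuration conf = HBaseConfiguration.create();
            // "MY_TABLE" is a placeholder -- use the name you pass to --table.
            HTable table = new HTable(conf, "MY_TABLE");
            try {
                // Ask the client for the region boundaries it can see;
                // this is essentially the lookup the bulk load setup
                // performs before it fails with "No regions passed".
                Pair<byte[][], byte[][]> keys = table.getStartEndKeys();
                byte[][] startKeys = keys.getFirst();
                System.out.println("Regions visible to client: " + startKeys.length);
                for (byte[] start : startKeys) {
                    System.out.println("  start key: " + Bytes.toStringBinary(start));
                }
            } finally {
                table.close();
            }
        }
    }

If that prints zero regions (or fails to find the table at all), then the
job's client configuration is most likely pointing at the wrong cluster or
ZooKeeper quorum, which would fit the configuration theory above.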

- Gabriel

On Wed, Oct 21, 2015 at 2:42 PM, Riesland, Zack
<Zack.Riesland@sensus.com> wrote:
> Hello,
>
> We recently upgraded our Hadoop stack from HDP 2.2.0 to 2.2.8.
>
> The Phoenix version (phoenix-4.2.0.2.2.8.0) and HBase version
> (0.98.4.2.2.8.0) did not change (from what I can tell).
>
> However, some of our CSVBulkLoadTool jobs have started to fail.
>
> I’m not sure whether this is related to the upgrade, but the timing
> seems suspicious.
>
> The particular error I’m seeing looks like this:
>
> ERROR mapreduce.CsvBulkLoadTool: Import job on table=<table name> failed due
> to exception:java.lang.IllegalArgumentException: No regions passed
>
> The Phoenix table in question already has 7 regions and millions of rows.
>
> The syntax I’m using is:
>
> HADOOP_CLASSPATH=/<classpath stuff> hadoop jar <path to phoenix-client.jar>
> org.apache.phoenix.mapreduce.CsvBulkLoadTool -Dfs.permissions.umask-mode=000
> -z <zookeeper quorum> --table <my table> --input <my hdfs file>
>
> Can anyone help me understand the solution here?
>
> Also, does anyone know the most recent version of Phoenix that is compatible
> with HDP 2.2.8 / HBase 0.98.4.2.2?
>
> Thanks!
