phoenix-user mailing list archives

From Gabriel Reid <gabriel.r...@gmail.com>
Subject Re: Help With CSVBulkLoadTool
Date Fri, 23 Oct 2015 13:39:26 GMT
Do you have a stack trace from the log output from when you got this error?

And could you tell me whether the table name in the error message is
an index table name?

Tracing through the code, it looks like you could get this exception
if an index table doesn't exist (or somehow isn't available). That
would also explain how the data is still getting into your main
table: the import job for the main table succeeds, and only the job
for the index table fails.
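If it does turn out to be an index table, a quick way to see which
index tables Phoenix knows about, and what state they are in, is to
query SYSTEM.CATALOG from sqlline. A minimal sketch, assuming a
Phoenix 4.x catalog layout (column names from memory, so verify them
against your install):

  $ sqlline.py <zookeeper quorum>
  -- one row per index: its name, its data table, and its state
  SELECT TABLE_SCHEM, TABLE_NAME, DATA_TABLE_NAME, INDEX_STATE
  FROM SYSTEM.CATALOG
  WHERE TABLE_TYPE = 'i';

An index that is missing here, or whose INDEX_STATE is not active,
would line up with the exception above.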

- Gabriel


On Fri, Oct 23, 2015 at 11:18 AM, Riesland, Zack
<Zack.Riesland@sensus.com> wrote:
> Thanks Gabriel,
>
> From what I can see in the logs, it happens consistently on most if not all tables that we import to.
>
> However, it does not appear to actually prevent the data from getting to the table.
>
> When I first raised this question, I noticed missing data and saw the error, but I have since found another error (bad data of the wrong type) that I think was the root cause of my job actually failing.
>
> I certainly never noticed this before, though.
>
> The main things that we have changed since these scripts worked cleanly were upgrading our stack and adding new region servers.
>
> Does that help at all?
>
> -----Original Message-----
> From: Gabriel Reid [mailto:gabriel.reid@gmail.com]
> Sent: Friday, October 23, 2015 1:19 AM
> To: user@phoenix.apache.org
> Subject: Re: Help With CSVBulkLoadTool
>
> Hi Zack,
>
> I can't give you any information about compatibility of a given Phoenix version with a given version of HDP (because I don't know).
>
> However, could you give a bit more info on what you're seeing? Are all import jobs failing with this error for a given set of tables? Or is this a random failure that can happen on any table?
>
> This error looks to me like it would be some kind of configuration issue with your cluster(s), but if that's the case then I would expect that you'd be getting the same error every time.
>
> - Gabriel
>
> On Wed, Oct 21, 2015 at 2:42 PM, Riesland, Zack <Zack.Riesland@sensus.com> wrote:
>> Hello,
>>
>>
>>
>> We recently upgraded our Hadoop stack from HDP 2.2.0 to 2.2.8.
>>
>>
>>
>> The Phoenix version (phoenix-4.2.0.2.2.8.0) and HBase version
>> (0.98.4.2.2.8.0) did not change (from what I can tell).
>>
>>
>>
>> However, some of our CSVBulkLoadTool jobs have started to fail.
>>
>>
>>
>> I’m not sure whether this is related to the upgrade or not, but the
>> timing seems suspicious.
>>
>>
>>
>> The particular error I’m seeing is like this:
>>
>>
>>
>> ERROR mapreduce.CsvBulkLoadTool: Import job on table=<table name>
>> failed due to exception:java.lang.IllegalArgumentException: No regions
>> passed
>>
>>
>>
>> The Phoenix table in question already has 7 regions and millions of rows.
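As far as I can tell, the "No regions passed" message is thrown by
HBase's HFileOutputFormat when the table a job is being configured
for reports an empty list of region start keys, so it is worth
confirming which table (main or index) the failing job was actually
writing to. A rough way to list the regions HBase holds for a table
on 0.98, assuming shell access to the cluster (the table name is a
placeholder):

  echo "scan 'hbase:meta', {COLUMNS => 'info:regioninfo'}" | hbase shell | grep '<table name>'

If the main table shows its 7 regions but a matching index table
shows none, that points at the index table.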
>>
>>
>>
>> The syntax I’m using is:
>>
>>
>>
>> HADOOP_CLASSPATH=/<classpath stuff> hadoop jar <path to phoenix-client.jar> org.apache.phoenix.mapreduce.CsvBulkLoadTool -Dfs.permissions.umask-mode=000 -z <zookeeper quorum> --table <my table> --input <my hdfs file>
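Since the tool runs one import job per target table, index tables
included, it can help to confirm which options this particular build
accepts (some versions, for instance, take an --index-table option to
load a single index table by itself). A quick check, assuming the
standard help option is present in this build:

  hadoop jar <path to phoenix-client.jar> org.apache.phoenix.mapreduce.CsvBulkLoadTool --help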
>>
>>
>>
>> Can anyone help me understand the solution here?
>>
>>
>>
>> Also, does anyone know the most recent version of Phoenix that is compatible with HDP 2.2.8 / HBase 0.98.4.2.2?
>>
>>
>>
>> Thanks!
