phoenix-user mailing list archives

From rajeshbabu chintaguntla <rajeshbabu.chintagun...@huawei.com>
Subject RE: Re: Unable to find cached index metadata
Date Tue, 02 Sep 2014 08:47:57 GMT
bq. I am trying to load data into the Phoenix table; as Phoenix may not support index-related
   data bulkload, I am trying to upsert data into Phoenix through JDBC statements.

In the 4.1 release, the CsvBulkLoadTool can be used to build indexes while loading data; see [1]
and the invocation sketch after the links below. More work on this is also in progress [2].

1. https://issues.apache.org/jira/browse/PHOENIX-1069
2. https://issues.apache.org/jira/browse/PHOENIX-1056
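
For reference, a rough sketch of driving the bulk loader programmatically with ToolRunner is shown
below. The table name is taken from the error later in this thread, while the input path and
ZooKeeper quorum are placeholders; please check the exact class and option names against your 4.1
client jar. The same options can also be passed on the command line via "hadoop jar" with the
Phoenix client jar.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.util.ToolRunner;
import org.apache.phoenix.mapreduce.CsvBulkLoadTool;

public class BulkLoadWithIndexes {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        // Per PHOENIX-1069 the CSV bulk loader also populates the table's index tables.
        // "RANAPSIGNAL" is the table from the error below; the input path and the
        // ZooKeeper quorum are placeholders to substitute for your cluster.
        int exitCode = ToolRunner.run(conf, new CsvBulkLoadTool(), new String[] {
                "--table", "RANAPSIGNAL",
                "--input", "/tmp/ranapsignal.csv",
                "--zookeeper", "zk-host:2181"
        });
        System.exit(exitCode);
    }
}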

Are you getting the exception on the first upsert attempt, or in the middle of loading the data?

Can you provide the code snippet (or statements) you are using to upsert the data?
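
For comparison, a bare-bones Phoenix JDBC upsert loop usually looks something like the sketch
below; the JDBC URL, table and column names are placeholders rather than anything from this
thread, and Phoenix connections are not auto-commit by default, so commit() has to be called
explicitly.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class PhoenixUpsertSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder ZooKeeper quorum; substitute your own.
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:zk-host:2181")) {
            // Placeholder table and columns -- not the actual RANAPSIGNAL schema.
            String sql = "UPSERT INTO MY_TABLE (PK, COL1) VALUES (?, ?)";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                for (int i = 0; i < 10000; i++) {
                    ps.setString(1, "row-" + i);
                    ps.setString(2, "value-" + i);
                    ps.executeUpdate();
                    // Phoenix buffers mutations client-side until commit(),
                    // so commit periodically while loading.
                    if (i % 1000 == 999) {
                        conn.commit();
                    }
                }
                conn.commit();
            }
        }
    }
}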

Thanks,
Rajeshbabu.
________________________________
From: sunfl@certusnet.com.cn [sunfl@certusnet.com.cn]
Sent: Tuesday, September 02, 2014 8:27 AM
To: user
Subject: Re: Re: Unable to find cached index metadata

Hi,
   Thanks for your reply, and sorry for not describing my job completely.
   I had configured the properties in hbase-site.xml on the HMaster node and ran sqlline to
   create the table in Phoenix, creating a local index on my table.
   I am trying to load data into the Phoenix table; as Phoenix may not support index-related
   data bulkload, I am trying to upsert data into Phoenix through JDBC statements. Then I got
   the following error and am not quite sure about the reason. BTW, upserting data without the
   local index works fine.
   Hoping for your reply, thanks.

________________________________

CertusNet


From: rajesh babu Chintaguntla <chrajeshbabu32@gmail.com>
Date: 2014-09-02 10:55
To: user <user@phoenix.apache.org>
Subject: Re: Unable to find cached index metadata
Hi Sun,
Thanks for testing,

Have you configured the following properties on the master side and restarted it before creating
local indexes?

<property>
  <name>hbase.master.loadbalancer.class</name>
  <value>org.apache.phoenix.hbase.index.balancer.IndexLoadBalancer</value>
</property>
<property>
  <name>hbase.coprocessor.master.classes</name>
  <value>org.apache.phoenix.hbase.index.master.IndexMasterObserver</value>
</property>
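
Once those properties are in place and the master has been restarted, the local index is created
with the usual DDL. Below is a minimal sketch via JDBC; the quorum, table and column names are
placeholders, not the poster's actual schema, and the same statements can equally be run from
sqlline.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class CreateLocalIndexSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder quorum, table and column names.
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:zk-host:2181");
             Statement stmt = conn.createStatement()) {
            stmt.execute("CREATE TABLE IF NOT EXISTS MY_TABLE ("
                    + "PK VARCHAR NOT NULL PRIMARY KEY, COL1 VARCHAR, COL2 VARCHAR)");
            // The LOCAL keyword is what requires the IndexLoadBalancer/IndexMasterObserver
            // configuration above; a global index would not need it.
            stmt.execute("CREATE LOCAL INDEX IF NOT EXISTS MY_TABLE_IDX ON MY_TABLE (COL1)");
        }
    }
}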




On Tue, Sep 2, 2014 at 7:35 AM, sunfl@certusnet.com.cn <sunfl@certusnet.com.cn> wrote:
Hi, everyone,
   I used the latest 4.1 release to run some tests on local indexing. When I am trying to load
   data into a Phoenix table with a local index, I got the following error. Not sure whether it
   has some relation with the HBase local index table, because the HBase local index table is
   uniformly prefixed with '_LOCAL_IDX_' + TableRef.
   Any available hints? Also correct me if I have some misunderstanding.
   Best Regards, Sun.
     org.apache.phoenix.execute.CommitException: java.sql.SQLException: ERROR 2008 (INT10):
Unable to find cached index metadata. ERROR 2008 (INT10): ERROR 2008 (INT10): Unable to find
cached index metadata. key=-8614688887238479432 region=RANAPSIGNAL,\x0D\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00,1409566437551.9e47a9f579f7cf3865d1148480a3b1b9.
Index update failed

        org.apache.phoenix.execute.MutationState.commit(MutationState.java:433)
        org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:384)
        org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:381)
        org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
        org.apache.phoenix.jdbc.PhoenixConnection.commit(PhoenixConnection.java:381)
        com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13$$anonfun$apply$1.apply(RanapSignalJdbcPhoenix.scala:113)
        com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13$$anonfun$apply$1.apply(RanapSignalJdbcPhoenix.scala:104)
        scala.collection.Iterator$class.foreach(Iterator.scala:727)
        scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
        com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13.apply(RanapSignalJdbcPhoenix.scala:104)
        com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13.apply(RanapSignalJdbcPhoenix.scala:89)
        scala.collection.Iterator$class.foreach(Iterator.scala:727)
        scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
        org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:759)
        org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:759)
        org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
        org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
        org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
        org.apache.spark.scheduler.Task.run(Task.scala:54)
        org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
        java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        java.lang.Thread.run(Thread.java:744)

________________________________

CertusNet


