Hi Liu,

Please read http://phoenix.incubator.apache.org/upgrade_from_2_2.html for how to upgrade tables created with the pre-Apache Phoenix code line. If your existing tables were only for test purposes, you can instead drop all Phoenix tables from the HBase shell and then reconnect.
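For reference, a drop from the HBase shell might look like the sketch below. This is only illustrative and requires a running HBase cluster; the table names are assumptions based on version (pre-Apache Phoenix 2.x kept its metadata in SYSTEM.TABLE, while Phoenix 4 creates SYSTEM.CATALOG and SYSTEM.SEQUENCE), so list your tables first and adjust accordingly.

```shell
# Run inside `hbase shell`; a table must be disabled before it can be dropped.
# Table names below are illustrative -- check `list` output on your cluster first.
list                          # show all tables, including Phoenix system tables

disable 'SYSTEM.TABLE'        # pre-Apache (com.salesforce) Phoenix metadata table
drop 'SYSTEM.TABLE'

# Repeat disable/drop for any data tables created with the old Phoenix client,
# since their table descriptors still reference the old coprocessor classes.
```

After the old tables are gone, reconnecting with the Phoenix 4 client should recreate the system tables under the new org.apache.phoenix coprocessor classes.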

//mujtaba


On Mon, May 12, 2014 at 3:36 AM, liu rickroot <rickroot.liu@gmail.com> wrote:
Hi,

I ran into an error with Phoenix 4. I installed HBase 0.98.1 and copied the phoenix*.jar files into HBase's lib directory, but the HBase cluster won't start. It shows the error below:
2014-05-12 03:59:57,551 ERROR [RS_OPEN_REGION-ctu-gi-hadp02:60020-0] coprocessor.CoprocessorHost: The coprocessor com.salesforce.phoenix.coprocessor.ServerCachingEndpointImpl threw an unexpected exception
java.io.IOException: No jar path specified for com.salesforce.phoenix.coprocessor.ServerCachingEndpointImpl
        at org.apache.hadoop.hbase.coprocessor.CoprocessorHost.load(CoprocessorHost.java:200)
        at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.loadTableCoprocessors(RegionCoprocessorHost.java:207)
        at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.<init>(RegionCoprocessorHost.java:163)
        at org.apache.hadoop.hbase.regionserver.HRegion.<init>(HRegion.java:547)
        at org.apache.hadoop.hbase.regionserver.HRegion.<init>(HRegion.java:454)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at org.apache.hadoop.hbase.regionserver.HRegion.newHRegion(HRegion.java:4089)
        at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:4400)
        at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:4373)
        at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:4329)
        at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:4280)
        at org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler.openRegion(OpenRegionHandler.java:465)
        at org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler.process(OpenRegionHandler.java:139)
        at org.apache.hadoop.hbase.executor.EventHandler.run(EventHandler.java:128)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
        at java.lang.Thread.run(Thread.java:662)
2014-05-12 03:59:57,551 FATAL [RS_OPEN_REGION-ctu-gi-hadp02:60020-0] regionserver.HRegionServer: ABORTING region server ctu-gi-hadp02.ubisoft.org,60020,1399867187448: The coprocessor com.salesforce.phoenix.coprocessor.ServerCachingEndpointImpl threw an unexpected exception
java.io.IOException: No jar path specified for com.salesforce.phoenix.coprocessor.ServerCachingEndpointImpl
        at org.apache.hadoop.hbase.coprocessor.CoprocessorHost.load(CoprocessorHost.java:200)
        at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.loadTableCoprocessors(RegionCoprocessorHost.java:207)
        at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.<init>(RegionCoprocessorHost.java:163)
        at org.apache.hadoop.hbase.regionserver.HRegion.<init>(HRegion.java:547)
        at org.apache.hadoop.hbase.regionserver.HRegion.<init>(HRegion.java:454)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at org.apache.hadoop.hbase.regionserver.HRegion.newHRegion(HRegion.java:4089)
        at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:4400)
        at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:4373)
        at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:4329)
        at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:4280)
        at org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler.openRegion(OpenRegionHandler.java:465)
        at org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler.process(OpenRegionHandler.java:139)
        at org.apache.hadoop.hbase.executor.EventHandler.run(EventHandler.java:128)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
        at java.lang.Thread.run(Thread.java:662)

I opened phoenix-core-4.0.0-incubating.jar and found that there are no classes under com/salesforce/phoenix. I also found this issue reported by other users; someone replied that it would be fixed in 2.2.3: http://mail-archives.apache.org/mod_mbox/phoenix-user/201402.mbox/%3CCAEF26Gf+fbFqDRK5iOCZb-5XwSAnnLnWe7kq2NqM6tZs4QyZow@mail.gmail.com%3E

When will this be fixed in Phoenix 4?