Though you're not using secondary indexing, can you make the config change on your region servers as described here: http://phoenix.incubator.apache.org/secondary_indexing.html#Setup
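For reference, the setup page linked above amounts to adding the Phoenix WAL edit codec to hbase-site.xml on each region server (and restarting them). A sketch of the property, as I recall it from the docs — please verify the exact name/value against the page above for your Phoenix/HBase versions:

```xml
<property>
  <name>hbase.regionserver.wal.codec</name>
  <value>org.apache.hadoop.hbase.regionserver.wal.IndexedWALEditCodec</value>
</property>
```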

I suspect if this is done, the batch marker will be removed automatically before the Put is done.

If that doesn't work, another option would be to manually remove the Indexer coprocessor from your table (but, of course, mutable secondary indexing would not be possible then).
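If you go that route, removing a table coprocessor is done from the hbase shell with table_att_unset. A rough sketch (table name 'MY_TABLE' and attribute key 'coprocessor$1' are placeholders — run describe first to see which coprocessor$N slot holds the Phoenix Indexer on your table):

```
hbase> disable 'MY_TABLE'
hbase> describe 'MY_TABLE'
hbase> alter 'MY_TABLE', METHOD => 'table_att_unset', NAME => 'coprocessor$1'
hbase> enable 'MY_TABLE'
```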

Please let us know if the above solves the issue.

Thanks!
James


On Wed, Jan 15, 2014 at 1:59 AM, Bruno Dumon <bruno.dumon@gmail.com> wrote:
Hi,

I have an HBase table created by Phoenix, to which I added another column family (CF) that Phoenix is unaware of.

I also have a custom endpoint coprocessor that does Puts on this table, into the CF that Phoenix doesn't know about.

However, this doesn't work: it throws an NPE because the WALEdit contains a KeyValue with null bytes, which I found to be the BATCH_MARKER added by Indexer.prePut. (I'm not using indexes.)

I don't have this problem when I do the Put via an HTable; it only happens when doing the Put directly from the coprocessor via HRegion.put().

I'm using Phoenix 2.2.1 with HBase 0.94.6-cdh4.4.0. I'm running this in an integration test with HBaseTestingUtility.

Is it possible to get this to work, or do you have any insight into why this is happening? Thanks!

Bruno.

Stacktrace:

[ERROR][10:34:14,724][7 on 58800] org.apache.hadoop.hbase.regionserver.wal.HLog - syncer encountered error, will retry. txid=60
java.io.IOException: java.lang.NullPointerException
at org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogWriter.append(SequenceFileLogWriter.java:282)
at org.apache.hadoop.hbase.regionserver.wal.HLog$LogSyncer.hlogFlush(HLog.java:1293)
at org.apache.hadoop.hbase.regionserver.wal.HLog.syncer(HLog.java:1338)
at org.apache.hadoop.hbase.regionserver.wal.HLog.sync(HLog.java:1472)
at org.apache.hadoop.hbase.regionserver.wal.HLog.append(HLog.java:1174)
at org.apache.hadoop.hbase.regionserver.wal.HLog.append(HLog.java:1214)
at org.apache.hadoop.hbase.regionserver.HRegion.internalPut(HRegion.java:2825)
at org.apache.hadoop.hbase.regionserver.HRegion.put(HRegion.java:2018)
at org.apache.hadoop.hbase.regionserver.HRegion.put(HRegion.java:1963)
at com.ngdata.xxx(xxx.java:100)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.hadoop.hbase.regionserver.HRegion.exec(HRegion.java:5490)
at org.apache.hadoop.hbase.regionserver.HRegionServer.execCoprocessor(HRegionServer.java:3720)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Server.call(WritableRpcEngine.java:320)
at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1428)
Caused by: java.lang.NullPointerException
at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:136)
at java.io.DataOutputStream.write(DataOutputStream.java:107)
at org.apache.hadoop.hbase.KeyValue.write(KeyValue.java:2287)
at org.apache.hadoop.hbase.regionserver.wal.WALEdit.write(WALEdit.java:160)
at org.apache.hadoop.io.serializer.WritableSerialization$WritableSerializer.serialize(WritableSerialization.java:100)
at org.apache.hadoop.io.serializer.WritableSerialization$WritableSerializer.serialize(WritableSerialization.java:84)
at org.apache.hadoop.io.SequenceFile$Writer.append(SequenceFile.java:1287)
at org.apache.hadoop.io.SequenceFile$Writer.append(SequenceFile.java:1258)
at org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogWriter.append(SequenceFileLogWriter.java:279)
... 21 more