phoenix-user mailing list archives

From William Shen <wills...@marinsoftware.com>
Subject Encountering BufferUnderflowException when querying from Phoenix
Date Fri, 12 Oct 2018 15:53:34 GMT
Hi all,

We are running Phoenix 4.13, and we periodically encounter the following
exception when querying from Phoenix in our staging environment. Initially, we
suspected that an incompatible client version was connecting and corrupting
data, but even after ensuring that only 4.13 clients connect, the issue still
comes up from time to time. Fortunately, since this is staging, so far we have
been able to identify and delete the affected rows to restore service.
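
To illustrate the kind of probe that can pin down a bad row (the JDBC URL,
table, and column names below are made up, not our real schema): since the
trace shows the failure surfacing lazily in PhoenixResultSet.getObject,
something like this sketch walks the table over plain JDBC and logs the
primary key of any row whose array column fails to deserialize:

    import java.nio.BufferUnderflowException
    import java.sql.DriverManager

    // Sketch only: MY_TABLE / ID / TAGS and the ZK quorum are placeholders.
    object FindBadArrayRows {
      def main(args: Array[String]): Unit = {
        val conn = DriverManager.getConnection("jdbc:phoenix:zk-host:2181")
        val rs = conn.createStatement()
          .executeQuery("SELECT ID, TAGS FROM MY_TABLE")
        while (rs.next()) {
          // Phoenix materializes the column lazily, so the underflow
          // surfaces here (getObject -> PVarcharArray.toObject).
          try rs.getObject("TAGS")
          catch {
            case _: BufferUnderflowException =>
              println(s"corrupt array in row ID=${rs.getString("ID")}")
          }
        }
        conn.close()
      }
    }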

However, we would like to ask for guidance on what else we could look at to
identify the cause of this exception. Could it perhaps be caused by something
other than data corruption?

Thanks in advance!

The exception looks like:

18/10/12 15:45:58 WARN scheduler.TaskSetManager: Lost task 32.2 in stage 14.0 (TID 1275, ...datanode..., executor 82): java.nio.BufferUnderflowException
    at java.nio.HeapByteBuffer.get(HeapByteBuffer.java:151)
    at java.nio.ByteBuffer.get(ByteBuffer.java:715)
    at org.apache.phoenix.schema.types.PArrayDataType.createPhoenixArray(PArrayDataType.java:1028)
    at org.apache.phoenix.schema.types.PArrayDataType.toObject(PArrayDataType.java:375)
    at org.apache.phoenix.schema.types.PVarcharArray.toObject(PVarcharArray.java:65)
    at org.apache.phoenix.schema.types.PDataType.toObject(PDataType.java:1011)
    at org.apache.phoenix.compile.ExpressionProjector.getValue(ExpressionProjector.java:75)
    at org.apache.phoenix.jdbc.PhoenixResultSet.getObject(PhoenixResultSet.java:525)
    at org.apache.phoenix.spark.PhoenixRecordWritable$$anonfun$readFields$1.apply$mcVI$sp(PhoenixRecordWritable.scala:96)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
    at org.apache.phoenix.spark.PhoenixRecordWritable.readFields(PhoenixRecordWritable.scala:93)
    at org.apache.phoenix.mapreduce.PhoenixRecordReader.nextKeyValue(PhoenixRecordReader.java:168)
    at org.apache.spark.rdd.NewHadoopRDD$$anon$1.hasNext(NewHadoopRDD.scala:174)
    at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:39)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
    at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1596)
    at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1157)
    at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1157)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1870)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1870)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:229)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)
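
For context, the read path in the trace (PhoenixRecordWritable /
PhoenixRecordReader under an RDD count) is what phoenix-spark produces for a
plain table scan followed by count(), roughly like the sketch below; the
table, column, and ZooKeeper quorum names here are placeholders:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.phoenix.spark._  // adds phoenixTableAsRDD to SparkContext

    // Sketch only: table/column/quorum names are placeholders.
    object PhoenixScan {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("phoenix-scan"))
        val rdd = sc.phoenixTableAsRDD(
          "MY_TABLE",
          Seq("ID", "TAGS"),           // TAGS being a VARCHAR ARRAY column
          zkUrl = Some("zk-host:2181"))
        // count() forces the full scan; each row is deserialized in
        // PhoenixRecordWritable.readFields, where the exception above is thrown.
        println(rdd.count())
      }
    }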
