phoenix-user mailing list archives

From Jaanai Zhang <cloud.pos...@gmail.com>
Subject Re: Encountering BufferUnderflowException when querying from Phoenix
Date Mon, 15 Oct 2018 03:38:56 GMT
It looks like a bug: the length being retrieved is greater than the number of
bytes remaining in the ByteBuffer. There may be a problem with the ByteBuffer's
position or with the length of the target byte array.
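For illustration, here is a minimal sketch (hypothetical values, not Phoenix's
actual deserialization path) of how java.nio.ByteBuffer.get(byte[]) raises this
exception whenever the destination array is longer than the bytes remaining
between the buffer's position and its limit:

    import java.nio.ByteBuffer;

    public class UnderflowSketch {
        public static void main(String[] args) {
            // Buffer with 4 bytes, standing in for a truncated array cell.
            ByteBuffer buf = ByteBuffer.wrap(new byte[] {1, 2, 3, 4});
            buf.get(new byte[3]);      // fine: 1 byte still remains afterwards
            byte[] dest = new byte[8]; // reader expects 8 more bytes
            buf.get(dest);             // throws java.nio.BufferUnderflowException
        }
    }

So if the serialized array value is shorter than the length the reader computes
(a wrong position or a wrong target-array length), you would see exactly this
exception at read time.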

----------------------------------------
   Jaanai Zhang
   Best regards!



William Shen <willshen@marinsoftware.com> wrote on Friday, October 12, 2018 at 11:53 PM:

> Hi all,
>
> We are running Phoenix 4.13, and periodically we would encounter the
> following exception when querying from Phoenix in our staging environment.
> Initially, we thought we had some incompatible client version connecting
> and creating data corruption, but after ensuring that we are only
> connecting with 4.13 clients, we still see this issue come up from time to
> time. So far, fortunately, since this is in staging, we have been able to
> identify and delete the affected data to restore service.
>
> However, we would like to ask for guidance on what else we could look for
> to identify the cause of this exception. Could this perhaps be caused by
> something other than data corruption?
>
> Thanks in advance!
>
> The exception looks like:
>
> 18/10/12 15:45:58 WARN scheduler.TaskSetManager: Lost task 32.2 in stage 14.0 (TID 1275, ...datanode..., executor 82): java.nio.BufferUnderflowException
> at java.nio.HeapByteBuffer.get(HeapByteBuffer.java:151)
> at java.nio.ByteBuffer.get(ByteBuffer.java:715)
> at org.apache.phoenix.schema.types.PArrayDataType.createPhoenixArray(PArrayDataType.java:1028)
> at org.apache.phoenix.schema.types.PArrayDataType.toObject(PArrayDataType.java:375)
> at org.apache.phoenix.schema.types.PVarcharArray.toObject(PVarcharArray.java:65)
> at org.apache.phoenix.schema.types.PDataType.toObject(PDataType.java:1011)
> at org.apache.phoenix.compile.ExpressionProjector.getValue(ExpressionProjector.java:75)
> at org.apache.phoenix.jdbc.PhoenixResultSet.getObject(PhoenixResultSet.java:525)
> at org.apache.phoenix.spark.PhoenixRecordWritable$$anonfun$readFields$1.apply$mcVI$sp(PhoenixRecordWritable.scala:96)
> at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
> at org.apache.phoenix.spark.PhoenixRecordWritable.readFields(PhoenixRecordWritable.scala:93)
> at org.apache.phoenix.mapreduce.PhoenixRecordReader.nextKeyValue(PhoenixRecordReader.java:168)
> at org.apache.spark.rdd.NewHadoopRDD$$anon$1.hasNext(NewHadoopRDD.scala:174)
> at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:39)
> at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
> at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1596)
> at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1157)
> at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1157)
> at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1870)
> at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1870)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
> at org.apache.spark.scheduler.Task.run(Task.scala:89)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:229)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:748)
