phoenix-user mailing list archives

From Josh Mahonin <jmaho...@gmail.com>
Subject Re: Spark hang on load phoenix table
Date Thu, 17 Nov 2016 06:46:04 GMT
Hi,

Are there any logs in the Spark driver and executors which would help
provide some context? In diagnosing, increasing the log level to DEBUG
might be useful as well.

Also, the snippet you posted is a 'lazy' operation. In theory it should
return quickly, and only evaluate when some sort of Spark action is
performed on it (e.g., count, distinct, save, etc.). If the operation ends
up hanging, perhaps there's some sort of connectivity issue, or maybe the
ZooKeeper znode parent is missing from the full URL?
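For reference, a minimal sketch of what that would look like: forcing evaluation with an action and appending the znode parent to the zkUrl. The `/hbase` parent shown here is an assumption; check the `zookeeper.znode.parent` setting in your cluster's hbase-site.xml.

```scala
// Sketch: load a Phoenix table via phoenix-spark and force evaluation.
// The ":/hbase" znode parent is an assumption -- use whatever
// zookeeper.znode.parent is configured to in your hbase-site.xml.
val df = sqlContext.load(
  "org.apache.phoenix.spark",
  Map("table" -> "\"test_table\"",
      "zkUrl" -> "10.17.10.2:2181:/hbase")
)

// load() itself is lazy; an action such as count() is what actually
// connects to Phoenix and runs the scan, so this is where a
// connectivity problem would surface.
println(df.count())
```

If the count also hangs, that points at connectivity between the executors and HBase/ZooKeeper rather than at the load call itself.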

Best,

Josh



On Tue, Nov 15, 2016 at 9:18 PM, 马骉 <mabiaocsu@qq.com> wrote:

> Hi all,
>     Have you run into the problem where a job submitted to Spark just
> hangs? No stage information appears on the 8080 web UI.
>     It seems something is wrong in:
>
>     val df = sqlContext.load(
>       "org.apache.phoenix.spark",
>       Map("table" -> "\"test_table\"", "zkUrl" -> "10.17.10.2:2181")
>     )
>
>    Any ideas, please?
> ------------------
> Warmest Regards~
> From BiaoMa