Hi, I want to integrate Spark with Phoenix and HBase.

I tried an example from the Apache Phoenix website, like this:

import org.apache.spark.graphx._
import org.apache.phoenix.spark._
val rdd = sc.phoenixTableAsRDD("EMAIL_ENRON", Seq("MAIL_FROM", "MAIL_TO"), zkUrl=Some("localhost"))           // load from phoenix
val rawEdges = rdd.map{ e => (e("MAIL_FROM").asInstanceOf[VertexId], e("MAIL_TO").asInstanceOf[VertexId]) }   // map to vertexids
val graph = Graph.fromEdgeTuples(rawEdges, 1.0)                                                               // create a graph
val pr = graph.pageRank(0.001)                                                                                // run pagerank
pr.vertices.saveToPhoenix("EMAIL_ENRON_PAGERANK", Seq("ID", "RANK"), zkUrl = Some("localhost"))
But an error is thrown when the process reaches

val graph = Graph.fromEdgeTuples(rawEdges, 1.0)

This is the message I got:

java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Put.setWriteToWAL(Z)Lorg/apache/hadoop/hbase/client/Put;
    at org.apache.phoenix.schema.PTableImpl$PRowImpl.newMutations(PTableImpl.java:639)
    at org.apache.phoenix.schema.PTableImpl$PRowImpl.<init>(PTableImpl.java:632)
    at org.apache.phoenix.schema.PTableImpl.newRow(PTableImpl.java:557)
    at org.apache.phoenix.schema.PTableImpl.newRow(PTableImpl.java:573)
    at org.apache.phoenix.execute.MutationState.addRowMutations(MutationState.java:185)
    at org.apache.phoenix.execute.MutationState.access$200(MutationState.java:79)
    at org.apache.phoenix.execute.MutationState$2.init(MutationState.java:258)

Can anybody help?
-- 
Best Regards,
Yuza