I'm trying to get Phoenix's CsvBulkLoadTool to work on CDH5.
I followed the instructions at http://phoenix.apache.org/bulk_dataload.html and ran:
$HADOOP_CLASSPATH=/opt/cloudera/parcels/CDH/lib/hbase/lib/hbase-protocol-0.96.1.1-cdh5.0.0.jar:/etc/hbase/conf.cloudera.hbase hadoop jar phoenix/hadoop-2/phoenix-4.0.0-incubating-client.jar org.apache.phoenix.mapreduce.CsvBulkLoadTool --table TAB_PS_XDR --input /user/cloudil/cdr1/1344131513-1344131661-bssap-1-sigserver113.cdr.csv

error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
        at org.apache.phoenix.mapreduce.CsvBulkLoadTool.run(CsvBulkLoadTool.java:159)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
        at org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:85)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)

Since HBaseConfiguration could not be found, I then put the whole hbase/lib directory on the classpath and switched to the minimal client jar:

$HADOOP_CLASSPATH=/opt/cloudera/parcels/CDH/lib/hbase/lib/*:/etc/hbase/conf.cloudera.hbase hadoop jar phoenix/common/phoenix-4.0.0-incubating-client-minimal.jar org.apache.phoenix.mapreduce.CsvBulkLoadTool --table TAB_PS_XDR --input /user/cloudil/cdr1/1344131513-1344131661-bssap-1-sigserver113.cdr.csv

error:
Exception in thread "main" 14/07/16 17:59:46 INFO zookeeper.ClientCnxn: EventThread shut down
java.lang.NoClassDefFoundError: org/antlr/runtime/CharStream
        at org.apache.phoenix.jdbc.PhoenixStatement.parseStatement(PhoenixStatement.java:852)
        at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:900)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1452)
        at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
        at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
        at java.sql.DriverManager.getConnection(DriverManager.java:571)
        at java.sql.DriverManager.getConnection(DriverManager.java:233)
        at org.apache.phoenix.mapreduce.CsvBulkLoadTool.run(CsvBulkLoadTool.java:168)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
        at org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:85)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)


Next I added antlr-3.5.2-complete.jar to HADOOP_CLASSPATH as well:

$HADOOP_CLASSPATH=/opt/cloudera/parcels/CDH/lib/hbase/lib/*:/home/cloudil/antlr-3.5.2-complete.jar:/etc/hbase/conf.cloudera.hbase hadoop jar phoenix/common/phoenix-4.0.0-incubating-client-minimal.jar org.apache.phoenix.mapreduce.CsvBulkLoadTool --table TAB_PS_XDR --input /user/cloudil/cdr1/1344131513-1344131661-bssap-1-sigserver113.cdr.csv

Now the MapReduce job is created and the mappers start, but every map task fails with the following exception.
error:
Error: java.lang.ClassNotFoundException: org.antlr.runtime.RecognitionException
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at org.apache.phoenix.jdbc.PhoenixStatement.parseStatement(PhoenixStatement.java:852)
        at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:900)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1452)
        at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
        at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
        at java.sql.DriverManager.getConnection(DriverManager.java:571)
        at java.sql.DriverManager.getConnection(DriverManager.java:233)
        at org.apache.phoenix.mapreduce.CsvToKeyValueMapper.setup(CsvToKeyValueMapper.java:107)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:142)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)

What am I doing wrong, and how can I fix this?
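
My current guess is that antlr-3.5.2-complete.jar is only on the client-side classpath via HADOOP_CLASSPATH and is never shipped to the map tasks, which would explain why the same ANTLR classes are now missing inside YarnChild. Below is a sketch of what I plan to try next, assuming CsvBulkLoadTool honors the generic -libjars option (it is launched through ToolRunner, so generic options should be parsed before the tool-specific ones):

# untested sketch: ship the ANTLR jar to the map tasks via the distributed cache
HADOOP_CLASSPATH=/opt/cloudera/parcels/CDH/lib/hbase/lib/*:/home/cloudil/antlr-3.5.2-complete.jar:/etc/hbase/conf.cloudera.hbase \
  hadoop jar phoenix/common/phoenix-4.0.0-incubating-client-minimal.jar org.apache.phoenix.mapreduce.CsvBulkLoadTool \
  -libjars /home/cloudil/antlr-3.5.2-complete.jar \
  --table TAB_PS_XDR \
  --input /user/cloudil/cdr1/1344131513-1344131661-bssap-1-sigserver113.cdr.csv

If -libjars does not help, the only other option I can think of is copying antlr-3.5.2-complete.jar into /opt/cloudera/parcels/CDH/lib/hbase/lib/ on every node, which I would rather avoid. Is there a cleaner way to run the bulk load tool on CDH5?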