phoenix-user mailing list archives

From Mohanraj Ragupathiraj <>
Subject Read Full Phoenix Table
Date Tue, 12 Jul 2016 04:05:02 GMT

I have a scenario in which I have to load a Phoenix table as a whole and
join it with multiple files in Spark. But it takes around 30 minutes just
to read 600 million records from the Phoenix table. I feel it is
inappropriate to load the full table data this way, as HBase works best for
random reads.

May I know if there is a way to read the entire Phoenix table as a
file/files rather than loading it via JDBC or DataFrames?
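(For context, one commonly suggested alternative to a single JDBC connection is the phoenix-spark integration, which parallelizes the scan across HBase region splits. A minimal sketch, assuming the phoenix-spark module is on the classpath; the table name `TABLE1`, the join key `ID`, the file path, and the ZooKeeper quorum are all placeholders:)

```scala
import org.apache.spark.sql.SparkSession

// Sketch: load a Phoenix table as a DataFrame through the
// phoenix-spark data source, which splits the scan across HBase
// regions instead of streaming every row through one JDBC link.
// "TABLE1" and "zk-host:2181" are hypothetical placeholders.
val spark = SparkSession.builder()
  .appName("phoenix-full-table-read")
  .getOrCreate()

val phoenixDf = spark.read
  .format("org.apache.phoenix.spark")
  .option("table", "TABLE1")
  .option("zkUrl", "zk-host:2181")
  .load()

// The loaded table can then be joined with file-based data as usual;
// the join column "ID" is assumed for illustration.
val filesDf = spark.read.parquet("/path/to/files")
val joined = phoenixDf.join(filesDf, Seq("ID"))
```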

Thanks in advance !
