phoenix-user mailing list archives

From Gabriel Reid <>
Subject Re: Phoenix bulk loading
Date Thu, 12 Feb 2015 07:34:04 GMT
Hi Siva,

If I understand correctly, you want to explicitly supply null values
in a CSV file for some fields. In general, this should work by just
leaving the field empty in your CSV file. For example, if you have
three fields (id, first_name, last_name) in your CSV file, then a
record like "1,,Reid" should create a record with first_name left as
null.
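
As a quick illustration (my own sketch, not from the original thread), you
can see how a record like "1,,Reid" is parsed: the empty field comes through
as an empty string, which the Phoenix CSV loader maps to NULL for that column:

```python
import csv
from io import StringIO

# Sample line with an explicitly empty second field (first_name).
data = "1,,Reid\n"
row = next(csv.reader(StringIO(data)))

# The empty field parses as an empty string; Phoenix's CSV bulk
# loader treats an empty field as NULL for the matching column.
print(row)  # ['1', '', 'Reid']
```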

Note that there is still an open bug, PHOENIX-1277 [1], that prevents
inserting null values via the bulk loader or psql for some datatypes,
so for those datatypes there currently isn't a way to explicitly supply
null values.

[1] https://issues.apache.org/jira/browse/PHOENIX-1277

- Gabriel


On Thu, Feb 12, 2015 at 1:28 AM, Siva <> wrote:
> Hello all,
> is there a way to specify keeping NULL values for columns that are not
> present in the CSV file as part of bulk loading?
> The requirement I have is: a few rows in the CSV file contain all the
> columns, but other rows contain only a few columns.
> In HBase, if a given record doesn't have the desired columns, ImportTsv
> just ignores those columns and moves on to the next record while loading
> the data.
> hadoop jar
> /usr/hdp/
> org.apache.phoenix.mapreduce.CsvBulkLoadTool --table P_TEST_2_COLS --input
> /user/sbhavanari/p_h_test_2_cols_less.csv --import-columns NAME,LEADID,D
> --zookeeper
> Thanks,
> Siva.
