phoenix-user mailing list archives

From "Job Thomas" <>
Subject RE: Loading data with Sqoop
Date Mon, 02 Jun 2014 08:43:43 GMT
Hi Roberto,
How does the tool work?
Does it load data directly from the RDBMS into Phoenix using Sqoop, or does it load after a bulk file
has been dumped into HDFS?
Thanks & Regards
Job M Thomas 


From: Ravi Kiran []
Sent: Fri 5/30/2014 3:39 AM
Subject: Re: Loading data with Sqoop

Hi Roberto,

   How are you constructing the composite row key and returning the Put from the transformer?
Also, can you please shed some light on how you look up the data types of the columns within
the transformer?
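For context on the question: a Phoenix composite row key is the concatenation of the encoded primary-key columns, with variable-length values (e.g. VARCHAR) terminated by a zero byte and fixed-width numerics encoded big-endian with the sign bit flipped so the bytes sort in value order. A rough, hypothetical sketch of that idea (function names are illustrative, not Phoenix's API):

```python
import struct

def encode_varchar(s: str) -> bytes:
    # Variable-length key part: UTF-8 bytes plus a 0x00 terminator
    # so the next key part starts at a known boundary.
    return s.encode("utf-8") + b"\x00"

def encode_integer(n: int) -> bytes:
    # Fixed-width 4-byte big-endian with the sign bit flipped,
    # mimicking a sortable INTEGER encoding: -1 sorts before 0.
    return struct.pack(">I", (n & 0xFFFFFFFF) ^ 0x80000000)

def composite_row_key(city: str, record_id: int) -> bytes:
    # Concatenate the encoded parts in primary-key column order.
    return encode_varchar(city) + encode_integer(record_id)
```

The resulting bytes compare lexicographically in the same order as the (city, record_id) tuples, which is the property a composite row key needs.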


On Thu, May 29, 2014 at 3:02 PM, Roberto Gastaldelli <> wrote:

	Hi James,

	I have extended the PutTransformer I implemented, and it now loads data into tables
with composite primary keys.

	Another scenario I'm still working on is identifying whether the table is salted, and loading
the data accordingly.
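For readers unfamiliar with salting: Phoenix prepends a salt byte, computed as a hash of the row key modulo the table's SALT_BUCKETS count, so a loader that writes Puts directly must reproduce this or rows land in the wrong buckets. A minimal sketch of the idea, with a stand-in hash (Phoenix uses its own hash function, not this one):

```python
SALT_BUCKETS = 8  # assumed from a CREATE TABLE ... SALT_BUCKETS = 8

def salt_byte(row_key: bytes, buckets: int = SALT_BUCKETS) -> int:
    # Stand-in hash: hash(row_key) % buckets. Phoenix's actual hash
    # differs; the point is that the bucket is derived from the key.
    h = 0
    for b in row_key:
        h = (h * 31 + b) & 0x7FFFFFFF
    return h % buckets

def salted_key(row_key: bytes) -> bytes:
    # The salt byte becomes the first byte of the physical row key.
    return bytes([salt_byte(row_key)]) + row_key
```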

	Can you think of any other scenarios?


	On 28/05/2014 6:01 PM, "Roberto Gastaldelli" <> wrote:

		I haven't tested the load into tables with composite keys, but I'll run some scenarios and
check what can be done.

		On 28/05/2014 5:51 PM, "James Taylor" <> wrote:

			Hi Roberto, 

			Yes, thank you very much for asking - there's definitely interest. Does it handle the case
with a table that has a composite primary key definition? 


			On Wed, May 28, 2014 at 12:45 AM, Roberto Gastaldelli <> wrote:

				Hi there,

				I came across the challenge of loading data from an RDBMS into a Phoenix table using Sqoop,
but that did not work well, as Sqoop by default converts all data types to strings.

				I came up with a solution: a PutTransformer that maps the JDBC data types to the
Phoenix native data types.
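The core of such a mapping can be sketched as a lookup from java.sql.Types codes to Phoenix SQL type names (the numeric codes below are the standard JDBC constants; the mapping itself is an assumed simplification of what the transformer would do, ignoring precision, scale, and nulls):

```python
# Numeric constants from java.sql.Types (per the JDBC specification).
JDBC_INTEGER, JDBC_BIGINT, JDBC_DOUBLE, JDBC_VARCHAR, JDBC_TIMESTAMP = 4, -5, 8, 12, 93

# Assumed JDBC-to-Phoenix type mapping for illustration only.
JDBC_TO_PHOENIX = {
    JDBC_INTEGER: "INTEGER",
    JDBC_BIGINT: "BIGINT",
    JDBC_DOUBLE: "DOUBLE",
    JDBC_VARCHAR: "VARCHAR",
    JDBC_TIMESTAMP: "TIMESTAMP",
}

def phoenix_type(jdbc_code: int) -> str:
    # Fall back to VARCHAR for unmapped codes, which matches
    # Sqoop's default everything-is-a-string behaviour.
    return JDBC_TO_PHOENIX.get(jdbc_code, "VARCHAR")
```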

				Is there any interest in including this feature in the project? If so, I can contribute.

