phoenix-user mailing list archives

From Kristoffer Sjögren <sto...@gmail.com>
Subject Migrate data between 2.2 and 4.0
Date Tue, 24 Jun 2014 12:32:56 GMT
Hi

We're currently running Phoenix 2.2 on HBase 0.94 (CDH 4.4) and are slowly
preparing to move to Phoenix 4.0 on HBase 0.96 (CDH 5).

For my first tests I wanted to simply copy data from 0.94 to 0.96,
which works fine for a regular HBase table using the following commands:

$ hbase org.apache.hadoop.hbase.mapreduce.Export table /tmp/table
$ hadoop distcp hftp://hbase94:50070/tmp/table hdfs://hbase96/tmp/table
$ hbase -Dhbase.import.version=0.94 org.apache.hadoop.hbase.mapreduce.Import table /tmp/table

This approach fails on the import for Phoenix tables, though (see
below). I create an identical table in 0.96 using Phoenix 4.0 sqlline
and then run the commands mentioned above. As I understand it, the
_0 column family was used to allow empty rows in HBase.
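
For reference, the table on the 0.96 side is created with plain
sqlline DDL along these lines (a minimal sketch; the table and column
names here are placeholders, not my real schema):

$ sqlline.py zookeeper-host
> CREATE TABLE "TABLE" (id BIGINT NOT NULL PRIMARY KEY, val VARCHAR);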

Are there any tricks that would allow copying data between these
two installations?
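
One thing I considered (an untested guess, based on the descriptor in
the log below showing a '0' family while the exported cells reference
'_0'): pre-adding a _0 column family to the destination table in the
HBase shell before running the import, e.g.

$ hbase shell
hbase> disable 'TABLE'
hbase> alter 'TABLE', {NAME => '_0'}
hbase> enable 'TABLE'

Would that be safe, or would Phoenix 4.0 not know what to do with the
extra family?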

Cheers,
-Kristoffer


2014-06-18 13:31:09,633 INFO  [main] mapreduce.Job: Task Id :
attempt_1403015236309_0015_m_000004_1, Status : FAILED
Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException:
Failed 6900 actions:
org.apache.hadoop.hbase.regionserver.NoSuchColumnFamilyException:
Column family _0 does not exist in region
TABLE,\x1F\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00,1403090967713.51a3eefa9b92568a87223aff7878cdcf.
in table 'TABLE', {TABLE_ATTRIBUTES => {coprocessor$1 =>
'|org.apache.phoenix.coprocessor.ScanRegionObserver|1|', coprocessor$2
=> '|org.apache.phoenix.coprocessor.UngroupedAggregateRegionObserver|1|',
coprocessor$3 =>
'|org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver|1|',
coprocessor$4 =>
'|org.apache.phoenix.coprocessor.ServerCachingEndpointImpl|1|',
coprocessor$5 =>
'|org.apache.phoenix.hbase.index.Indexer|1073741823|index.builder=org.apache.phoenix.index.PhoenixIndexBuilder,org.apache.hadoop.hbase.index.codec.class=org.apache.phoenix.index.PhoenixIndexCodec'},
{NAME => '0', BLOOMFILTER => 'ROW', VERSIONS => '1', IN_MEMORY =>
'false', KEEP_DELETED_CELLS => 'true', DATA_BLOCK_ENCODING =>
'FAST_DIFF', COMPRESSION => 'NONE', TTL => 'FOREVER', MIN_VERSIONS =>
'0', BLOCKCACHE => 'true', BLOCKSIZE => '65536', REPLICATION_SCOPE =>
'0'}
at org.apache.hadoop.hbase.regionserver.HRegionServer.doBatchOp(HRegionServer.java:4056)
at org.apache.hadoop.hbase.regionserver.HRegionServer.doNonAtomicRegionMutation(HRegionServer.java:3361)
at org.apache.hadoop.hbase.regionserver.HRegionServer.multi(HRegionServer.java:3265)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:26935)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2175)
at org.apache.hadoop.hbase.ipc.RpcServer$Handler.run(RpcServer.java:1879)
: 6900 times,
at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:187)
at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:171)
at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:882)
at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:940)
at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:903)
at org.apache.hadoop.hbase.client.HTable.put(HTable.java:864)
at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126)
at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:635)
at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:167)
at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:136)
at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:118)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
