flume-user mailing list archives

From prabhu k <prabhu.fl...@gmail.com>
Subject Re: hbase sink
Date Wed, 12 Sep 2012 06:06:08 GMT
Continuing from my previous email:

The flume-ng script got stuck, and when I scan the demo table, the single
inserted row has an empty value:

hbase(main):001:0> scan 'demo'
ROW                                         COLUMN+CELL
 11347427525582                             column=cf:col1, timestamp=1347427517764, value=
1 row(s) in 0.3530 seconds
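For context on that row key: with the configuration quoted further down (rowPrefix = 1, keyType = timestamp, suffix = timestamp), SimpleHbaseEventSerializer appears to build the row key by concatenating the prefix with an epoch-millis timestamp, which matches the row above. A minimal sketch of that inference (the timestamp value is read off the scan output, not taken from Flume source):

```shell
# Sketch: how rowPrefix=1 plus a timestamp suffix plausibly yields the
# row key seen in 'scan demo'. This is an inference from the output.
prefix=1
ts=1347427525582      # epoch millis; the scanned row is the prefix + this
rowkey="${prefix}${ts}"
echo "$rowkey"        # prints 11347427525582
```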


On Wed, Sep 12, 2012 at 11:24 AM, prabhu k <prabhu.flume@gmail.com> wrote:

> Hi Mohammad,
>
> I'm able to access the HBase web UI, and I have run the script again
> today. Please find the attached flume.log file.
>
> Please advise on how to resolve this issue.
>
> Thanks & Regards,
> Prabhu.
>
>
> On Fri, Jul 20, 2012 at 1:26 AM, Mohammad Tariq <dontariq@gmail.com>wrote:
>
>> Hello Prabhu,
>>
>>            Sorry for the late response. Is your zookeeper running
>> properly? Is it where your shell expects it to be? Can you access
>> HBase's web ui on port 60010?
>>
>> Regards,
>>     Mohammad Tariq
>>
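Mohammad's two checks can be made concrete with a small probe script. The hostname `localhost` is an assumption for a single-node setup; 2181 is ZooKeeper's default client port, and 60010 is the HBase master UI port mentioned above. `/dev/tcp` is a bash feature:

```shell
# Probe the services Mohammad asks about. Host and ports are assumptions
# for a single-node setup -- adjust to the real cluster.
check() {  # usage: check <host> <port> <label>
  if (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null; then
    echo "$3: reachable on $1:$2"
  else
    echo "$3: NOT reachable on $1:$2"
  fi
}
check localhost 2181  "zookeeper"
check localhost 60010 "hbase-master-ui"
```

If the ZooKeeper probe fails here, the ConnectionLoss errors quoted below are expected.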
>>
>> On Thu, Jul 19, 2012 at 8:17 PM, prabhu k <prabhu.flume@gmail.com> wrote:
>> > Hi Mohammad,
>> >
>> > After exporting HBASE_HOME and CLASSPATH, I ran flume-ng again. It
>> > seemed to keep running, but I accidentally disconnected the session.
>> > I then checked the HBase demo table by running scan 'demo' and got
>> > the following:
>> >
>> > hbase(main):001:0> scan 'demo'
>> > ROW                                         COLUMN+CELL
>> > The scan also kept running without returning to the prompt, so I
>> > pressed Ctrl+C.
>> >
>> >
>> > WARN client.ZooKeeperSaslClient: SecurityException: java.lang.SecurityException: Unable to locate a login configuration occurred when trying to find JAAS configuration.
>> >
>> > WARN zookeeper.RecoverableZooKeeper: Possibly transient ZooKeeper exception: org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase/master
>> >
>> > WARN client.ZooKeeperSaslClient: SecurityException: java.lang.SecurityException: Unable to locate a login configuration occurred when trying to find JAAS configuration.
>> >
>> > WARN zookeeper.RecoverableZooKeeper: Possibly transient ZooKeeper exception: org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase/master
>> >
>> > WARN client.ZooKeeperSaslClient: SecurityException: java.lang.SecurityException: Unable to locate a login configuration occurred when trying to find JAAS configuration.
>> >
>> > ERROR zookeeper.ZooKeeperWatcher: hconnection Received unexpected KeeperException, re-throwing exception
>> > org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase/master
>> >
>> > ERROR client.HConnectionManager$HConnectionImplementation: Unexpected exception during initialization, aborting
>> > org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase/master
>> >
>> > I have attached the flume.log file. Please guide me on how to fix
>> > this error.
>> >
>> > Thanks,
>> > Prabhu.
>> >
>> > On Thu, Jul 19, 2012 at 7:05 PM, Mohammad Tariq <dontariq@gmail.com>
>> wrote:
>> >>
>> >> Configuration looks fine. HBASE_HOME and CLASSPATH are also important.
>> >> Please check these once.
>> >>
>> >> Regards,
>> >>     Mohammad Tariq
>> >>
>> >>
>> >> On Thu, Jul 19, 2012 at 6:50 PM, prabhu k <prabhu.flume@gmail.com>
>> wrote:
>> >> > Thanks for the quick response,
>> >> >
>> >> > Yes, I have exported the HADOOP_HOME variable. Is there anything
>> >> > else I need to do?
>> >> >
>> >> > On Thu, Jul 19, 2012 at 6:41 PM, Mohammad Tariq <dontariq@gmail.com>
>> >> > wrote:
>> >> >>
>> >> >> Hi Prabhu,
>> >> >>
>> >> >>    Have you exported the HADOOP_HOME variable?
>> >> >>
>> >> >> Regards,
>> >> >>     Mohammad Tariq
>> >> >>
>> >> >>
>> >> >> On Thu, Jul 19, 2012 at 6:34 PM, prabhu k <prabhu.flume@gmail.com>
>> >> >> wrote:
>> >> >> > Hi Users,
>> >> >> >
>> >> >> > I have followed the link below to move the sample data into an
>> >> >> > HBase sink. I then executed the command below and got the
>> >> >> > following error; the flume.log output is pasted below.
>> >> >> >
>> >> >> > Please suggest how to resolve this issue.
>> >> >> >
>> >> >> >
>> >> >> >
>> >> >> >
>> >> >> > http://cloudfront.blogspot.in/2012/06/how-to-move-data-into-hbase-table-using.html
>> >> >> >
>> >> >> > command
>> >> >> > ===========
>> >> >> > bin/flume-ng agent -n hbase-agent -c conf/ -f conf/hbase-agent.conf
>> >> >> >
>> >> >> >
>> >> >> > hbase-agent.conf
>> >> >> > ==================
>> >> >> > hbase-agent.sources = tail
>> >> >> > hbase-agent.sinks = sink1
>> >> >> > hbase-agent.channels = ch1
>> >> >> > hbase-agent.sources.tail.type = exec
>> >> >> > hbase-agent.sources.tail.command = tail -F /usr/local/demo.txt
>> >> >> > hbase-agent.sources.tail.channels = ch1
>> >> >> > hbase-agent.sinks.sink1.type = org.apache.flume.sink.hbase.HBaseSink
>> >> >> > hbase-agent.sinks.sink1.channel = ch1
>> >> >> > hbase-agent.sinks.sink1.table = demo
>> >> >> > hbase-agent.sinks.sink1.columnFamily = cf
>> >> >> > hbase-agent.sinks.sink1.serializer = org.apache.flume.sink.hbase.SimpleHbaseEventSerializer
>> >> >> > hbase-agent.sinks.sink1.serializer.payloadColumn = col1
>> >> >> > hbase-agent.sinks.sink1.serializer.keyType = timestamp
>> >> >> > hbase-agent.sinks.sink1.serializer.rowPrefix = 1
>> >> >> > hbase-agent.sinks.sink1.serializer.suffix = timestamp
>> >> >> > hbase-agent.channels.ch1.type=memory
>> >> >> >
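The NoClassDefFoundError for org/apache/hadoop/hbase/HBaseConfiguration in the log below means the HBase classes are not on Flume's classpath. One common fix, sketched here with hypothetical paths and jar name (adjust HBASE_HOME and the jar version to the actual install), is to export FLUME_CLASSPATH before launching bin/flume-ng:

```shell
# Hypothetical install location and jar name -- adjust to your setup.
export HBASE_HOME=/usr/local/hbase
# Put the HBase jar, its lib/ dependencies, and conf/ (for hbase-site.xml)
# on Flume's classpath so HBaseConfiguration can be loaded.
export FLUME_CLASSPATH="$HBASE_HOME/hbase-0.92.1.jar:$HBASE_HOME/lib/*:$HBASE_HOME/conf"
echo "$FLUME_CLASSPATH"
```

The same line can also go into conf/flume-env.sh so it is picked up on every run.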
>> >> >> >
>> >> >> > flume.log
>> >> >> > =================
>> >> >> > /flume/flume-1.2.0-incubating-SNAPSHOT# more flume.log
>> >> >> > 2012-07-19 18:24:58,007 INFO lifecycle.LifecycleSupervisor: Starting lifecycle supervisor 1
>> >> >> > 2012-07-19 18:24:58,008 INFO node.FlumeNode: Flume node starting - hbase-agent
>> >> >> > 2012-07-19 18:24:58,011 INFO nodemanager.DefaultLogicalNodeManager: Node manager starting
>> >> >> > 2012-07-19 18:24:58,011 INFO lifecycle.LifecycleSupervisor: Starting lifecycle supervisor 10
>> >> >> > 2012-07-19 18:24:58,011 INFO properties.PropertiesFileConfigurationProvider: Configuration provider starting
>> >> >> > 2012-07-19 18:24:58,013 INFO properties.PropertiesFileConfigurationProvider: Reloading configuration file:conf/hbase-agent.conf
>> >> >> > 2012-07-19 18:24:58,019 INFO conf.FlumeConfiguration: Processing:sink1
>> >> >> > 2012-07-19 18:24:58,020 INFO conf.FlumeConfiguration: Processing:sink1
>> >> >> > 2012-07-19 18:24:58,020 INFO conf.FlumeConfiguration: Processing:sink1
>> >> >> > 2012-07-19 18:24:58,020 INFO conf.FlumeConfiguration: Added sinks: sink1 Agent: hbase-agent
>> >> >> > 2012-07-19 18:24:58,020 INFO conf.FlumeConfiguration: Processing:sink1
>> >> >> > 2012-07-19 18:24:58,020 INFO conf.FlumeConfiguration: Processing:sink1
>> >> >> > 2012-07-19 18:24:58,020 INFO conf.FlumeConfiguration: Processing:sink1
>> >> >> > 2012-07-19 18:24:58,020 INFO conf.FlumeConfiguration: Processing:sink1
>> >> >> > 2012-07-19 18:24:58,021 INFO conf.FlumeConfiguration: Processing:sink1
>> >> >> > 2012-07-19 18:24:58,021 INFO conf.FlumeConfiguration: Processing:sink1
>> >> >> > 2012-07-19 18:24:58,034 INFO conf.FlumeConfiguration: Post-validation flume configuration contains configuration for agents: [hbase-agent]
>> >> >> > 2012-07-19 18:24:58,034 INFO properties.PropertiesFileConfigurationProvider: Creating channels
>> >> >> > 2012-07-19 18:24:58,038 INFO properties.PropertiesFileConfigurationProvider: created channel ch1
>> >> >> > 2012-07-19 18:24:58,046 INFO sink.DefaultSinkFactory: Creating instance of sink sink1 type org.apache.flume.sink.hbase.HBaseSink
>> >> >> > 2012-07-19 18:24:58,051 ERROR properties.PropertiesFileConfigurationProvider: Failed to start agent because dependencies were not found in classpath. Error follows.
>> >> >> > java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
>> >> >> >         at org.apache.flume.sink.hbase.HBaseSink.<init>(HBaseSink.java:94)
>> >> >> >         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> >> >> >         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>> >> >> >         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>> >> >> >         at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>> >> >> >         at java.lang.Class.newInstance0(Class.java:355)
>> >> >> >         at java.lang.Class.newInstance(Class.java:308)
>> >> >> >         at org.apache.flume.sink.DefaultSinkFactory.create(DefaultSinkFactory.java:103)
>> >> >> >         at org.apache.flume.conf.properties.PropertiesFileConfigurationProvider.loadSinks(PropertiesFileConfigurationProvider.java:371)
>> >> >> >         at org.apache.flume.conf.properties.PropertiesFileConfigurationProvider.load(PropertiesFileConfigurationProvider.java:223)
>> >> >> >         at org.apache.flume.conf.file.AbstractFileConfigurationProvider.doLoad(AbstractFileConfigurationProvider.java:123)
>> >> >> >         at org.apache.flume.conf.file.AbstractFileConfigurationProvider.access$300(AbstractFileConfigurationProvider.java:38)
>> >> >> >         at org.apache.flume.conf.file.AbstractFileConfigurationProvider$FileWatcherRunnable.run(AbstractFileConfigurationProvider.java:202)
>> >> >> >         at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
>> >> >> >         at java.util.concurrent.FutureTask$Sync.innerRunAndReset(FutureTask.java:317)
>> >> >> >         at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:150)
>> >> >> >         at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$101(ScheduledThreadPoolExecutor.java:98)
>> >> >> >         at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.runPeriodic(ScheduledThreadPoolExecutor.java:180)
>> >> >> >         at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:204)
>> >> >> >         at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>> >> >> >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>> >> >> >
>> >> >> >
>> >> >> >
>> >> >> > Thanks,
>> >> >> > Prabhu.
>> >> >> >
>> >> >
>> >> >
>> >
>> >
>>
>
>
