phoenix-user mailing list archives

From Ashutosh Sharma <ashu.sharma.in...@gmail.com>
Subject Re: Problems getting started with Apache Phoenix
Date Sun, 20 Sep 2015 00:03:55 GMT
Another question, from the HBase shell:
hbase(main):002:0> scan 'WEB_STAT'
ROW                                                    COLUMN+CELL
 EUApple.com\x00Mac\x00\x80\x00\x01;\xF3\xA04\xC8      column=STATS:ACTIVE_VISITOR, timestamp=1442636717013, value=\x80\x00\x00"
 EUApple.com\x00Mac\x00\x80\x00\x01;\xF3\xA04\xC8      column=USAGE:CORE, timestamp=1442636717013, value=\x80\x00\x00\x00\x00\x00\x00#
 EUApple.com\x00Mac\x00\x80\x00\x01;\xF3\xA04\xC8      column=USAGE:DB, timestamp=1442636717013, value=\x80\x00\x00\x00\x00\x00\x00\x16
 EUApple.com\x00Mac\x00\x80\x00\x01;\xF3\xA04\xC8      column=USAGE:_0, timestamp=1442636717013, value=
 EUApple.com\x00Store\x00\x80\x00\x01;\xFD\xEC\xEC\xC8 column=STATS:ACTIVE_VISITOR, timestamp=1442636717013, value=\x80\x00\x00\xAA
 EUApple.com\x00Store\x00\x80\x00\x01;\xFD\xEC\xEC\xC8 column=USAGE:CORE, timestamp=1442636717013, value=\x80\x00\x00\x00\x00\x00\x01Y
 EUApple.com\x00Store\x00\x80\x00\x01;\xFD\xEC\xEC\xC8 column=USAGE:DB, timestamp=1442636717013, value=\x80\x00\x00\x00\x00\x00\x02\xD2

What are these \x00 sequences? Are they unwanted characters inserted by the
Python load scripts?
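(For reference: the \x00 bytes appear to be Phoenix's separators between the variable-length VARCHAR columns of the composite row key, and the leading \x80 bytes come from Phoenix's sign-flipped big-endian number encoding. A rough sketch of the row-key layout, assuming WEB_STAT's primary key is HOST CHAR(2), DOMAIN, FEATURE, DATE as in the Phoenix sample schema; this imitates the format, it is not Phoenix's actual code:)

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class WebStatRowKey {
    // Sketch of Phoenix's composite row-key layout: fixed-width CHAR columns
    // are written as-is, variable-length VARCHAR columns are terminated by a
    // \x00 separator, and DATE is 8 big-endian bytes with the sign bit
    // flipped so unsigned byte order matches chronological order.
    static byte[] rowKey(String host, String domain, String feature, long dateMillis) {
        byte[] h = host.getBytes(StandardCharsets.UTF_8);    // HOST CHAR(2): no separator
        byte[] d = domain.getBytes(StandardCharsets.UTF_8);  // DOMAIN VARCHAR
        byte[] f = feature.getBytes(StandardCharsets.UTF_8); // FEATURE VARCHAR
        ByteBuffer buf = ByteBuffer.allocate(h.length + d.length + 1 + f.length + 1 + 8);
        buf.put(h).put(d).put((byte) 0).put(f).put((byte) 0);
        buf.putLong(dateMillis ^ 0x8000000000000000L);       // flip the sign bit
        return buf.array();
    }

    public static void main(String[] args) {
        byte[] key = rowKey("EU", "Apple.com", "Mac", 1356998400000L); // 2013-01-01 UTC
        StringBuilder sb = new StringBuilder();
        for (byte b : key) {
            sb.append(b >= 0x20 && b < 0x7f ? String.valueOf((char) b)
                                            : String.format("\\x%02X", b & 0xff));
        }
        System.out.println(sb); // EUApple.com\x00Mac\x00\x80... like the scan output
    }
}
```

In the scan above, the 8 bytes after the second \x00 (e.g. \x80\x00\x01;\xF3\xA04\xC8, where ';' is 0x3B and '4' is 0x34) decode the same way: clear the top bit and read a big-endian epoch-millisecond timestamp.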

On Sat, Sep 19, 2015 at 4:49 PM, Ashutosh Sharma <
ashu.sharma.india@gmail.com> wrote:

> Thanks a lot for your response and help.
> I have been struggling for almost 3 days with HBase connectivity from a
> Java client. I tried the HBase Definitive Guide examples and the sample
> provided in the Apache HBase docs to connect to HBase.
> Here is the simplest thing I tried:
> import java.io.IOException;
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.hbase.HBaseConfiguration;
> import org.apache.hadoop.hbase.TableName;
> import org.apache.hadoop.hbase.client.Connection;
> import org.apache.hadoop.hbase.client.ConnectionFactory;
> import org.apache.hadoop.hbase.client.Get;
> import org.apache.hadoop.hbase.client.Table;
> import org.apache.hadoop.hbase.client.Put;
> import org.apache.hadoop.hbase.client.Result;
> import org.apache.hadoop.hbase.client.ResultScanner;
> import org.apache.hadoop.hbase.client.Scan;
> import org.apache.hadoop.hbase.util.Bytes;
> import org.apache.hadoop.hbase.protobuf.generated.*;
>
>
> public class FirstHBaseClient {
>   public static void main(String[] args) throws IOException {
>
>     Configuration config = HBaseConfiguration.create();
>
>     Connection connection = ConnectionFactory.createConnection(config);
>     try {
>
>
>       Table table = connection.getTable(TableName.valueOf("test"));
>       try {
>
>         Scan s = new Scan();
>         ResultScanner scanner = table.getScanner(s);
>         try {
>
>            for (Result rr = scanner.next(); rr != null; rr = scanner.next()) {
>              // print out the row we found and the columns we were looking for
>              System.out.println("Found row: " + rr);
>            }
>
>          } finally {
>
>            scanner.close();
>          }
>
>        } finally {
>          if (table != null) table.close();
>        }
>      } finally {
>        connection.close();
>      }
>   }
> }
>
> But I never got past the connectivity stage. I posted on StackExchange too,
> without much luck.
> 15/09/17 19:37:18 INFO zookeeper.ZooKeeper: Client
> environment:user.dir=/root/workspace_hbase/HBaseIntro
> 15/09/17 19:37:18 INFO zookeeper.ZooKeeper: Initiating client connection,
> connectString=localhost:2181 sessionTimeout=90000
> watcher=hconnection-0xea4a92b0x0, quorum=localhost:2181, baseZNode=/hbase
> 15/09/17 19:37:18 INFO zookeeper.ClientCnxn: Opening socket connection to
> server localhost/127.0.0.1:2181. Will not attempt to authenticate using
> SASL (unknown error)
> 15/09/17 19:37:18 INFO zookeeper.ClientCnxn: Socket connection established
> to localhost/127.0.0.1:2181, initiating session
> 15/09/17 19:37:18 INFO zookeeper.ClientCnxn: Session establishment
> complete on server localhost/127.0.0.1:2181, sessionid =
> 0x14fde0f7576000e, negotiated timeout = 40000
>
> Any idea what I am doing wrong? I tried this with Apache HBase running on
> my Ubuntu machine, with Apache HBase inside the Cloudera QuickStart VM, and at
> work with M7 (which is the MapR layer on top of HBase), with no success. I did
> a lot of googling and tried everything. This SASL message keeps appearing and
> I can't get beyond the handshake.
>
> *What I liked about Apache Phoenix is that it's very simple to start
> with, which is a very important factor for me when picking up any new open
> source project. The examples and documentation are high quality, and above
> all the user forum is good. So for me, Apache Phoenix is the way to go.*
>
> *Thanks once again.*
>
> On Sat, Sep 19, 2015 at 1:08 PM, James Taylor <jamestaylor@apache.org>
> wrote:
>
>> Hi Ashutosh,
>> Yes, you can use HBase APIs to write to the HBase-backed Phoenix tables,
>> but you have to do it in the way Phoenix expects, using the Phoenix
>> serialization format. Also, you won't be able to leverage some Phoenix
>> features such as secondary indexing which rely on you going through the
>> Phoenix APIs so that Phoenix can maintain the index to be in sync with the
>> data table. The easiest way to write, of course, is to just use the Phoenix
>> APIs.
>> Thanks,
>> James
>>
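>> (As a rough illustration of that serialization format: Phoenix stores an
>> INTEGER as 4 big-endian bytes with the sign bit flipped, so that unsigned
>> byte order matches numeric order. The sketch below mimics the encoding; it
>> is not Phoenix's actual code.)

```java
import java.nio.ByteBuffer;

public class PhoenixIntCodec {
    // Encode an int the way Phoenix serializes INTEGER: 4 big-endian bytes
    // with the sign bit flipped, so unsigned byte comparison sorts numerically.
    static byte[] encode(int v) {
        return ByteBuffer.allocate(4).putInt(v ^ 0x80000000).array();
    }

    // Decode by flipping the sign bit back.
    static int decode(byte[] b) {
        return ByteBuffer.wrap(b).getInt() ^ 0x80000000;
    }

    public static void main(String[] args) {
        // value=\x80\x00\x00" from the WEB_STAT scan earlier in the thread ('"' is 0x22)
        System.out.println(decode(new byte[]{(byte) 0x80, 0, 0, 0x22})); // prints 34
    }
}
```

>> This is why writing raw bytes through the HBase API without matching this
>> format produces values Phoenix cannot read back.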
>> On Sat, Sep 19, 2015 at 12:21 PM, anil gupta <anilgupta84@gmail.com>
>> wrote:
>>
>>> Yes, Phoenix provides a SQL and JDBC interface to talk to HBase. Phoenix
>>> runs on top of HBase.
>>> HBase is the datastore for Phoenix tables.
>>>
>>> **Every Phoenix table is backed by HBase table(s).**
>>>
>>> On Sat, Sep 19, 2015 at 12:09 PM, Ashutosh Sharma <
>>> ashu.sharma.india@gmail.com> wrote:
>>>
>>>> But when I write into Phoenix tables from a Java application, it is
>>>> also reflected in the corresponding HBase table. So Phoenix and HBase
>>>> tables are one and the same, right?
>>>> On Sep 19, 2015 11:35 AM, "anil gupta" <anilgupta84@gmail.com> wrote:
>>>>
>>>>> The Phoenix API has to be used to read/write data in Phoenix tables.
>>>>> However, in an HBase cluster, you have the freedom to have both Phoenix
>>>>> and non-Phoenix (plain HBase) tables.
>>>>>
>>>>> On Sat, Sep 19, 2015 at 8:50 AM, Ashutosh Sharma <
>>>>> ashu.sharma.india@gmail.com> wrote:
>>>>>
>>>>>> *Problem is resolved now.*
>>>>>> It was a class-file version mismatch caused by some conflicting JAR
>>>>>> versions.
>>>>>>
>>>>>> I followed all these links thoroughly:
>>>>>> https://phoenix.apache.org/installation.html
>>>>>>
>>>>>> https://phoenix.apache.org/Phoenix-in-15-minutes-or-less.html
>>>>>>
>>>>>>
>>>>>> https://phoenix.apache.org/faq.html#I_want_to_get_started_Is_there_a_Phoenix_Hello_World
>>>>>>
>>>>>> Created a brand new Eclipse workspace and then successfully executed
>>>>>> this one:
>>>>>> import java.sql.Connection;
>>>>>> import java.sql.DriverManager;
>>>>>> import java.sql.ResultSet;
>>>>>> import java.sql.SQLException;
>>>>>> import java.sql.PreparedStatement;
>>>>>> import java.sql.Statement;
>>>>>>
>>>>>>
>>>>>> // Follow this one:
>>>>>> // https://phoenix.apache.org/faq.html#I_want_to_get_started_Is_there_a_Phoenix_Hello_World
>>>>>> public class TestPhoenix {
>>>>>>
>>>>>>     public static void main(String[] args) throws SQLException {
>>>>>>         Statement stmt = null;
>>>>>>         ResultSet rset = null;
>>>>>>         Connection con = DriverManager.getConnection("jdbc:phoenix:localhost");
>>>>>>         stmt = con.createStatement();
>>>>>>         // The lines below are commented out because the table already exists in the DB
>>>>>>         /*
>>>>>>         stmt.executeUpdate("create table test (mykey integer not null primary key, mycolumn varchar)");
>>>>>>         stmt.executeUpdate("upsert into test values (1,'Hello')");
>>>>>>         stmt.executeUpdate("upsert into test values (2,'World!')");
>>>>>>         con.commit();
>>>>>>         */
>>>>>>         PreparedStatement statement = con.prepareStatement("select * from test");
>>>>>>         rset = statement.executeQuery();
>>>>>>         while (rset.next()) {
>>>>>>             System.out.println(rset.getString("mycolumn"));
>>>>>>         }
>>>>>>         // Add some more rows for testing
>>>>>>         stmt.executeUpdate("upsert into test values (3,'Ashu')");
>>>>>>         stmt.executeUpdate("upsert into test values (4,'Sharma')");
>>>>>>         stmt.executeUpdate("upsert into test values (5,'Ayush')");
>>>>>>         stmt.executeUpdate("upsert into test values (6,'Shivam')");
>>>>>>         con.commit();
>>>>>>         // Now read it again
>>>>>>         rset = statement.executeQuery();
>>>>>>         while (rset.next()) {
>>>>>>             System.out.println(rset.getString("mycolumn"));
>>>>>>         }
>>>>>>         statement.close();
>>>>>>         stmt.close();
>>>>>>         con.close();
>>>>>>     }
>>>>>> }
>>>>>>
>>>>>>
>>>>>> It works fine. Only the Phoenix client JAR is needed, nothing more
>>>>>> than that.
>>>>>> A few questions: I can see that the table I created using Phoenix is
>>>>>> also created in HBase. But how do they work internally? If an update
>>>>>> happens on the HBase side, is it reflected on the Phoenix side or not,
>>>>>> and vice versa?
>>>>>>
>>>>>> On Sat, Sep 19, 2015 at 7:59 AM, Ashutosh Sharma <
>>>>>> ashu.sharma.india@gmail.com> wrote:
>>>>>>
>>>>>>> Hi,
>>>>>>> I am very new to HBase as well as Apache Phoenix.
>>>>>>> I tried this basic program to check connectivity etc.:
>>>>>>> import java.sql.*;
>>>>>>> import java.util.*;
>>>>>>>
>>>>>>> public class phoenixTest {
>>>>>>>
>>>>>>>     public static void main(String args[]) throws Exception {
>>>>>>>         Connection conn;
>>>>>>>         Properties prop = new Properties();
>>>>>>>         Class.forName("org.apache.phoenix.jdbc.PhoenixDriver");
>>>>>>>         //conn = DriverManager.getConnection("jdbc:phoenix:localhost:/hbase-unsecure");
>>>>>>>         //conn = DriverManager.getConnection("jdbc:phoenix:localhost:2181/hbase-unsecure");
>>>>>>>         conn = DriverManager.getConnection("jdbc:phoenix:localhost");
>>>>>>>         System.out.println("got connection");
>>>>>>>         ResultSet rst = conn.createStatement().executeQuery("select * from stock_symbol");
>>>>>>>         while (rst.next()) {
>>>>>>>             System.out.println(rst.getString(1) + " " + rst.getString(2));
>>>>>>>         }
>>>>>>>         System.out.println(conn.createStatement().executeUpdate("delete from stock_symbol"));
>>>>>>>         conn.commit();
>>>>>>>         rst = conn.createStatement().executeQuery("select * from stock_symbol");
>>>>>>>         while (rst.next()) {
>>>>>>>             System.out.println(rst.getString(1) + " " + rst.getString(2));
>>>>>>>         }
>>>>>>>         System.out.println(conn.createStatement().executeUpdate(
>>>>>>>                 "upsert into stock_symbol values('IBM','International Business Machines')"));
>>>>>>>         conn.commit();
>>>>>>>     }
>>>>>>> }
>>>>>>>
>>>>>>> But I am getting this issue:
>>>>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>>>>> SLF4J: Found binding in
>>>>>>> [jar:file:/hbase-1.1.2/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>> SLF4J: Found binding in
>>>>>>> [jar:file:/home/ashu/Downloads/phoenix-4.5.2-HBase-1.1-bin/phoenix-4.5.2-HBase-1.1-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>>>>>> explanation.
>>>>>>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>>>>>>> Driver class loaded successfully
>>>>>>> 15/09/19 07:54:01 INFO zookeeper.RecoverableZooKeeper: Process
>>>>>>> identifier=hconnection-0x43738a82 connecting to ZooKeeper
>>>>>>> ensemble=localhost:2181
>>>>>>> 15/09/19 07:54:01 INFO zookeeper.ZooKeeper: Client
>>>>>>> environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT
>>>>>>> 15/09/19 07:54:01 INFO zookeeper.ZooKeeper: Client environment:
>>>>>>> host.name=ashu-700-430qe
>>>>>>> 15/09/19 07:54:01 INFO zookeeper.ZooKeeper: Client
>>>>>>> environment:java.version=1.8.0_25
>>>>>>> 15/09/19 07:54:01 INFO zookeeper.ZooKeeper: Client
>>>>>>> environment:java.vendor=Oracle Corporation
>>>>>>> 15/09/19 07:54:01 INFO zookeeper.ZooKeeper: Client
>>>>>>> environment:java.home=/jdk1.8.0_25/jre
>>>>>>> 15/09/19 07:54:01 INFO zookeeper.ZooKeeper: Client
>>>>>>> environment:java.class.path=/root/workspace_hbase/HBaseIntro/bin:/hbase-1.1.2/lib/hbase-client-1.1.2.jar:/hbase-1.1.2/lib/hbase-common-1.1.2.jar:/hbase-1.1.2/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/hadoop-core-1.2.1.jar:/usr/local/hadoop/lib/commons-configuration-1.6.jar:/hbase-1.1.2/lib/commons-logging-1.2.jar:/hbase-1.1.2/lib/commons-lang-2.6.jar:/hbase-1.1.2/lib/protobuf-java-2.5.0.jar:/hbase-1.1.2/lib/hbase-protocol-1.1.2.jar:/hbase-1.1.2/lib/slf4j-log4j12-1.7.5.jar:/hbase-1.1.2/lib/slf4j-api-1.7.7.jar:/hbase-1.1.2/lib/log4j-1.2.17.jar:/hbase-1.1.2/lib/htrace-core-3.1.0-incubating.jar:/hbase-1.1.2/lib/guava-12.0.1.jar:/hbase-1.1.2/lib/guice-3.0.jar:/hbase-1.1.2/lib/netty-all-4.0.23.Final.jar:/hbase-1.1.2/lib/netty-3.2.4.Final.jar:/home/ashu/Downloads/phoenix-4.5.2-HBase-1.1-bin/phoenix-core-4.5.2-HBase-1.1.jar:/home/ashu/Downloads/phoenix-4.5.2-HBase-1.1-bin/phoenix-flume-4.5.2-HBase-1.1.jar:/home/ashu/Downloads/phoenix-4.5.2-HBase-1.1-bin/phoenix-pig-4.5.2-HBase-1.1.jar:/home/ashu/Downloads/phoenix-4.5.2-HBase-1.1-bin/phoenix-spark-4.5.2-HBase-1.1.jar:/home/ashu/Downloads/phoenix-4.5.2-HBase-1.1-bin/phoenix-server-4.5.2-HBase-1.1.jar:/home/ashu/Downloads/phoenix-4.5.2-HBase-1.1-bin/phoenix-server-client-4.5.2-HBase-1.1.jar:/home/ashu/Downloads/phoenix-4.5.2-HBase-1.1-bin/phoenix-4.5.2-HBase-1.1-client.jar
>>>>>>> 15/09/19 07:54:01 INFO zookeeper.ZooKeeper: Client
>>>>>>> environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
>>>>>>> 15/09/19 07:54:01 INFO zookeeper.ZooKeeper: Client
>>>>>>> environment:java.io.tmpdir=/tmp
>>>>>>> 15/09/19 07:54:01 INFO zookeeper.ZooKeeper: Client
>>>>>>> environment:java.compiler=<NA>
>>>>>>> 15/09/19 07:54:01 INFO zookeeper.ZooKeeper: Client environment:
>>>>>>> os.name=Linux
>>>>>>> 15/09/19 07:54:01 INFO zookeeper.ZooKeeper: Client
>>>>>>> environment:os.arch=amd64
>>>>>>> 15/09/19 07:54:01 INFO zookeeper.ZooKeeper: Client
>>>>>>> environment:os.version=3.13.0-63-generic
>>>>>>> 15/09/19 07:54:01 INFO zookeeper.ZooKeeper: Client environment:
>>>>>>> user.name=root
>>>>>>> 15/09/19 07:54:01 INFO zookeeper.ZooKeeper: Client
>>>>>>> environment:user.home=/root
>>>>>>> 15/09/19 07:54:01 INFO zookeeper.ZooKeeper: Client
>>>>>>> environment:user.dir=/root/workspace_hbase/HBaseIntro
>>>>>>> 15/09/19 07:54:01 INFO zookeeper.ZooKeeper: Initiating client
>>>>>>> connection, connectString=localhost:2181 sessionTimeout=90000
>>>>>>> watcher=hconnection-0x43738a820x0, quorum=localhost:2181, baseZNode=/hbase
>>>>>>> 15/09/19 07:54:01 INFO zookeeper.ClientCnxn: Opening socket
>>>>>>> connection to server localhost/127.0.0.1:2181. Will not attempt to
>>>>>>> authenticate using SASL (unknown error)
>>>>>>> 15/09/19 07:54:01 INFO zookeeper.ClientCnxn: Socket connection
>>>>>>> established to localhost/127.0.0.1:2181, initiating session
>>>>>>> 15/09/19 07:54:01 INFO zookeeper.ClientCnxn: Session establishment
>>>>>>> complete on server localhost/127.0.0.1:2181, sessionid =
>>>>>>> 0x14fe3b6b1e40039, negotiated timeout = 40000
>>>>>>> 15/09/19 07:54:01 INFO
>>>>>>> client.ConnectionManager$HConnectionImplementation: Closing zookeeper
>>>>>>> sessionid=0x14fe3b6b1e40039
>>>>>>> 15/09/19 07:54:01 INFO zookeeper.ZooKeeper: Session:
>>>>>>> 0x14fe3b6b1e40039 closed
>>>>>>> 15/09/19 07:54:01 INFO zookeeper.ClientCnxn: EventThread shut down
>>>>>>> Exception in thread "main" java.lang.NoSuchMethodError:
>>>>>>> org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.instance()Lorg/apache/hadoop/metrics2/MetricsSystem;
>>>>>>> at org.apache.phoenix.metrics.Metrics.<clinit>(Metrics.java:29)
>>>>>>> at
>>>>>>> org.apache.phoenix.trace.TraceMetricSource.<init>(TraceMetricSource.java:86)
>>>>>>> at
>>>>>>> org.apache.phoenix.trace.util.Tracing.addTraceMetricsSource(Tracing.java:269)
>>>>>>> at
>>>>>>> org.apache.phoenix.jdbc.PhoenixConnection.<clinit>(PhoenixConnection.java:149)
>>>>>>> at
>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1924)
>>>>>>> at
>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1896)
>>>>>>> at
>>>>>>> org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:77)
>>>>>>> at
>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1896)
>>>>>>> at
>>>>>>> org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:180)
>>>>>>> at
>>>>>>> org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:132)
>>>>>>> at
>>>>>>> org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:151)
>>>>>>> at java.sql.DriverManager.getConnection(DriverManager.java:664)
>>>>>>> at java.sql.DriverManager.getConnection(DriverManager.java:270)
>>>>>>> at phoenixTest.main(phoenixTest.java:16)
>>>>>>>
>>>>>>>
>>>>>>> This seems like a JAR file version mismatch issue.
>>>>>>> Here are the JAR files that I am using:
>>>>>>> Please refer to the screen shot
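>>>>>>> (A generic diagnostic sketch, not from the original setup: when a
>>>>>>> NoSuchMethodError points at conflicting JAR versions, printing where
>>>>>>> the offending class was actually loaded from can pinpoint the stale
>>>>>>> JAR. The Hadoop class name below is just the one from the stack trace.)

```java
import java.security.CodeSource;

public class WhichJar {
    // Print the classpath location a class was loaded from; useful for
    // tracking down a NoSuchMethodError caused by conflicting JAR versions.
    static String locationOf(String className) {
        try {
            Class<?> c = Class.forName(className);
            CodeSource src = c.getProtectionDomain().getCodeSource();
            return src == null ? "(bootstrap)" : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "(not on classpath)";
        }
    }

    public static void main(String[] args) {
        // The class implicated in the stack trace above:
        System.out.println(locationOf("org.apache.hadoop.metrics2.lib.DefaultMetricsSystem"));
    }
}
```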
>>>>>>>
>>>>>>> I have followed all the guidelines for setting up Phoenix at:
>>>>>>> https://phoenix.apache.org/installation.html
>>>>>>>
>>>>>>> My connection from SQuirreL works fine, but the Java program gets
>>>>>>> these errors.
>>>>>>> --
>>>>>>> With best Regards:
>>>>>>> Ashutosh Sharma
>>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> With best Regards:
>>>>>> Ashutosh Sharma
>>>>>>
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> Thanks & Regards,
>>>>> Anil Gupta
>>>>>
>>>>
>>>
>>>
>>> --
>>> Thanks & Regards,
>>> Anil Gupta
>>>
>>
>>
>
>
> --
> With best Regards:
> Ashutosh Sharma
>



-- 
With best Regards:
Ashutosh Sharma
