phoenix-dev mailing list archives

From "Xindian Long (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (PHOENIX-3460) Phoenix Spark plugin cannot find table with a Namespace prefix
Date Mon, 07 Nov 2016 17:19:58 GMT

    [ https://issues.apache.org/jira/browse/PHOENIX-3460?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15644768#comment-15644768
] 

Xindian Long commented on PHOENIX-3460:
---------------------------------------

Test code:

import java.util.HashMap;
import java.util.Map;

import org.apache.log4j.Logger;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.DataFrameReader;
import org.apache.spark.sql.SQLContext;

public class Application {
    static private Logger log = Logger.getLogger(Application.class);

    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("NMS Tuning Engine");
        JavaSparkContext sc = new JavaSparkContext(conf);

        //testJdbc(sc);
        testSpark(sc);
    }
 
 
    static public void testSpark(JavaSparkContext sc) {
        //SparkContextBuilder.buildSparkContext("Simple Application", "local");

        // One JVM can only have one Spark Context now
        Map<String, String> options = new HashMap<String, String>();
        SQLContext sqlContext = new SQLContext(sc);

        String tableStr = "\"ACME:ENDPOINT_STATUS\"";
        String dataSrcUrl = "jdbc:phoenix:luna-sdp-nms-01.davis.sensus.lab:2181:/hbase-unsecure";
        options.put("zkUrl", dataSrcUrl);
        options.put("table", tableStr);
        log.info("Phoenix DB URL: " + dataSrcUrl + " tableStr: " + tableStr);

        DataFrame df = null;
        try {
            df = sqlContext.read().format("org.apache.phoenix.spark").options(options).load();
            df.explain(true);
        } catch (Exception ex) {
            log.error("sql error: ", ex);
        }

        try {
            log.info("Count By phoenix spark plugin: " + df.count());
        } catch (Exception ex) {
            log.error("dataframe error: ", ex);
        }
    }
 
 
    static public void testJdbc(JavaSparkContext sc) {
        Map<String, String> options = new HashMap<String, String>();
        SQLContext sqlContext = new SQLContext(sc);

        if (sc == null || sqlContext == null || options == null) {
            log.info("NULL sc, sqlContext, or options");
        }

        String qry2 = "(Select ENDPOINT_ID, CITY from \"ACME:ENDPOINT_STATUS\" Where city = 'ACME City')";
        String dataSrcUrl = "jdbc:phoenix:luna-sdp-nms-01.davis.sensus.lab:2181:/hbase-unsecure";
        options.put("url", dataSrcUrl);
        options.put("dbtable", qry2);
        log.info("Phoenix DB URL: " + dataSrcUrl + "\nquery: " + qry2);

        DataFrame df = null;
        try {
            DataFrameReader dfRd = sqlContext.read().format("jdbc").options(options);
            if (dfRd == null) {
                log.error("NULL DataFrameReader Object dfRd in getEndPointDataByJdbc");
            }
            df = dfRd.load();
            df.explain(true);
        } catch (Exception ex) {
            log.error("sql error: ", ex);
        }

        try {
            log.info("Count By Jdbc: " + df.count());
        } catch (Exception ex) {
            log.error("dataframe error: ", ex);
        }
    }
}

> Phoenix Spark plugin cannot find table with a Namespace prefix
> --------------------------------------------------------------
>
>                 Key: PHOENIX-3460
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-3460
>             Project: Phoenix
>          Issue Type: Bug
>    Affects Versions: 4.8.0
>         Environment: HDP 2.5
>            Reporter: Xindian Long
>              Labels: phoenix, spark
>             Fix For: 4.7.0
>
>
> I am testing some code that uses the Phoenix Spark plugin to read a Phoenix table with a namespace
prefix in the table name (the table is created as a Phoenix table, not an HBase table), but
it returns a TableNotFoundException.
> The table is obviously there, because I can query it using plain Phoenix SQL through SQuirreL.
In addition, querying it with Spark SQL works without any problem.
> I am running on the HDP 2.5 platform, with Phoenix 4.7.0.2.5.0.0-1245.
> The problem does not exist when I run the same code on an HDP 2.4 cluster,
with Phoenix 4.4.
> Neither does the problem occur when I query a table without a namespace prefix in the
DB table name on HDP 2.5.
> The log is in the attached file: tableNoFound.txt
> My testing code is also attached.
> The weird thing is that with the attached code, if I run testSpark alone it gives the above
exception, but if I run testJdbc first, followed by testSpark, both of them work.
> After changing to create the table with
> create table ACME.ENDPOINT_STATUS
> the phoenix-spark plugin seems to work. I also found some weird behavior.
> If I do both of the following:
> create table ACME.ENDPOINT_STATUS ...
> create table "ACME:ENDPOINT_STATUS" ...
> Both tables show up in Phoenix; the first shows as schema ACME with table name ENDPOINT_STATUS,
and the latter shows as schema none with table name ACME:ENDPOINT_STATUS.
> However, in HBase I only see one table, ACME:ENDPOINT_STATUS. In addition, upserts into
the table ACME.ENDPOINT_STATUS show up in the other table, and vice versa.
>  
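The name collision described in the comment can be illustrated with a small stand-alone sketch. This is a simplified, hypothetical model of the rules as reported here, not Phoenix's actual parser: an unquoted ACME.ENDPOINT_STATUS is split on the dot into schema and table (which, with namespace mapping enabled, maps to the HBase table ACME:ENDPOINT_STATUS), while a double-quoted "ACME:ENDPOINT_STATUS" is taken literally as a schema-less table name containing a colon. Both spellings then point at the same physical HBase table, which would explain why upserts to one appear in the other.

```java
// Simplified, hypothetical model of the identifier behavior described in
// this issue; NOT Phoenix's real name resolution code.
public class NameSketch {
    // Returns the physical HBase table name a Phoenix identifier maps to,
    // assuming schema-to-HBase-namespace mapping is enabled.
    static String physicalName(String identifier) {
        if (identifier.startsWith("\"") && identifier.endsWith("\"")) {
            // Quoted: taken literally, no schema is parsed out.
            return identifier.substring(1, identifier.length() - 1);
        }
        // Unquoted: split into schema and table, upper-cased, and joined
        // with ':' to form the namespace-qualified HBase name.
        int dot = identifier.indexOf('.');
        if (dot < 0) {
            return identifier.toUpperCase();
        }
        String schema = identifier.substring(0, dot).toUpperCase();
        String table = identifier.substring(dot + 1).toUpperCase();
        return schema + ":" + table;
    }

    public static void main(String[] args) {
        // Both spellings resolve to the same physical HBase table.
        System.out.println(physicalName("ACME.ENDPOINT_STATUS"));
        System.out.println(physicalName("\"ACME:ENDPOINT_STATUS\""));
    }
}
```

Under this model, the two Phoenix-level tables are distinct catalog entries backed by one HBase table, matching the observation that only ACME:ENDPOINT_STATUS exists in HBase.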



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
