carbondata-issues mailing list archives

From "Ravindra Pesala (JIRA)" <j...@apache.org>
Subject [jira] [Resolved] (CARBONDATA-789) Join operation does not work properly in Carbon data.
Date Wed, 10 May 2017 12:30:04 GMT

     [ https://issues.apache.org/jira/browse/CARBONDATA-789?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ravindra Pesala resolved CARBONDATA-789.
----------------------------------------
    Resolution: Won't Fix

> Join operation does not work properly in Carbon data.
> ------------------------------------------------------
>
>                 Key: CARBONDATA-789
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-789
>             Project: CarbonData
>          Issue Type: Bug
>          Components: data-query
>    Affects Versions: 1.1.0
>         Environment: Spark 2.1
>            Reporter: Vinod Rohilla
>            Priority: Trivial
>         Attachments: 2000_UniqData.csv
>
>
> Join operation does not work properly in CarbonData for the int data type.
> Steps to Reproduce:
> A) Create Table in Hive:
> First table:
> CREATE TABLE uniqdata_nobucket11_Hive (CUST_ID int, CUST_NAME String, ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint, BIGINT_COLUMN2 bigint, DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10), Double_COLUMN1 double, Double_COLUMN2 double, INTEGER_COLUMN1 int) ROW FORMAT DELIMITED FIELDS TERMINATED BY ",";
> First table Load:
> LOAD DATA LOCAL INPATH '/home/vinod/Desktop/AllCSV/2000_UniqData.csv' OVERWRITE INTO TABLE uniqdata_nobucket11_Hive;
> Second table:
> CREATE TABLE uniqdata_nobucket22_Hive (CUST_ID int, CUST_NAME String, ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint, BIGINT_COLUMN2 bigint, DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10), Double_COLUMN1 double, Double_COLUMN2 double, INTEGER_COLUMN1 int) ROW FORMAT DELIMITED FIELDS TERMINATED BY ",";
> Second table Load:
> LOAD DATA LOCAL INPATH '/home/vinod/Desktop/AllCSV/2000_UniqData.csv' OVERWRITE INTO TABLE uniqdata_nobucket22_Hive;
> Results in Hive:
> | CUST_ID  |    CUST_NAME     |    ACTIVE_EMUI_VERSION     |          DOB           |          DOJ           | BIGINT_COLUMN1  | BIGINT_COLUMN2  |     DECIMAL_COLUMN1     |     DECIMAL_COLUMN2     |    Double_COLUMN1    |    Double_COLUMN2     | INTEGER_COLUMN1  | CUST_ID  |    CUST_NAME     |    ACTIVE_EMUI_VERSION     |          DOB           |          DOJ           | BIGINT_COLUMN1  | BIGINT_COLUMN2  |     DECIMAL_COLUMN1     |     DECIMAL_COLUMN2     |    Double_COLUMN1    |    Double_COLUMN2     | INTEGER_COLUMN1  |
> +----------+------------------+----------------------------+------------------------+------------------------+-----------------+-----------------+-------------------------+-------------------------+----------------------+-----------------------+------------------+----------+------------------+----------------------------+------------------------+------------------------+-----------------+-----------------+-------------------------+-------------------------+----------------------+-----------------------+------------------+--+
> | 10999    | CUST_NAME_01999  | ACTIVE_EMUI_VERSION_01999  | 1975-06-23 01:00:03.0  | 1975-06-23 02:00:03.0  | 123372038853    | -223372034855   | 12345680900.1234000000  | 22345680900.1234000000  | 1.12345674897976E10  | -1.12345674897976E10  | 2000             | 10999    | CUST_NAME_01999  | ACTIVE_EMUI_VERSION_01999  | 1975-06-23 01:00:03.0  | 1975-06-23 02:00:03.0  | 123372038853    | -223372034855   | 12345680900.1234000000  | 22345680900.1234000000  | 1.12345674897976E10  | -1.12345674897976E10  | 2000             |
> +----------+------------------+----------------------------+------------------------+------------------------+-----------------+-----------------+-------------------------+-------------------------+----------------------+-----------------------+------------------+----------+------------------+----------------------------+------------------------+------------------------+-----------------+-----------------+-------------------------+-------------------------+----------------------+-----------------------+------------------+--+
> 2,001 rows selected (3.369 seconds)
> B) Create table in CarbonData:
> First Table:
> CREATE TABLE uniqdata_nobucket11 (CUST_ID int, CUST_NAME String, ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint, BIGINT_COLUMN2 bigint, DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10), Double_COLUMN1 double, Double_COLUMN2 double, INTEGER_COLUMN1 int) STORED BY 'org.apache.carbondata.format';
> Load Data in table:
> LOAD DATA INPATH 'hdfs://localhost:54310/2000_UniqData.csv' INTO TABLE uniqdata_nobucket11 OPTIONS('DELIMITER'=',', 'QUOTECHAR'='"', 'FILEHEADER'='CUST_ID,CUST_NAME,ACTIVE_EMUI_VERSION,DOB,DOJ,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1');
> Create Second table:
> CREATE TABLE uniqdata_nobucket22 (CUST_ID int, CUST_NAME String, ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint, BIGINT_COLUMN2 bigint, DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10), Double_COLUMN1 double, Double_COLUMN2 double, INTEGER_COLUMN1 int) STORED BY 'org.apache.carbondata.format';
> Load data in Table:
> LOAD DATA INPATH 'hdfs://localhost:54310/2000_UniqData.csv' INTO TABLE uniqdata_nobucket22 OPTIONS('DELIMITER'=',', 'QUOTECHAR'='"', 'FILEHEADER'='CUST_ID,CUST_NAME,ACTIVE_EMUI_VERSION,DOB,DOJ,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1');
> Result in CarbonData:
> 0: jdbc:hive2://localhost:10000> select * from uniqdata_nobucket11 u1, uniqdata_nobucket22 u2 where u1.cust_id = u2.cust_id;
> Error: java.util.EmptyStackException (state=,code=0)
> Expected Result: The join results should be displayed in CarbonData, as they are in Hive.
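
[Editor's note] The expected behavior can be illustrated outside Spark: an equi-join on an integer key between two identical tables should simply pair up matching rows rather than raise an exception. A minimal sketch using Python's built-in sqlite3, with hypothetical sample rows standing in for the attached 2000_UniqData.csv:

```python
import sqlite3

# In-memory database standing in for the two identical tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
for name in ("uniqdata_nobucket11", "uniqdata_nobucket22"):
    cur.execute(f"CREATE TABLE {name} (cust_id INTEGER, cust_name TEXT)")
    # Identical sample rows in both tables (hypothetical values).
    cur.executemany(
        f"INSERT INTO {name} VALUES (?, ?)",
        [(10999, "CUST_NAME_01999"), (11000, "CUST_NAME_02000")],
    )

# Same join shape as the failing CarbonData query above.
rows = cur.execute(
    "SELECT u1.cust_id, u1.cust_name, u2.cust_name "
    "FROM uniqdata_nobucket11 u1, uniqdata_nobucket22 u2 "
    "WHERE u1.cust_id = u2.cust_id"
).fetchall()
print(rows)  # each cust_id pairs with its match in the other table
```

Since the tables are identical copies with unique int keys, the join returns one output row per input row, which is what the Hive run above shows and what CarbonData was expected to do.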



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
