carbondata-issues mailing list archives

From xuchuanyin <>
Subject [GitHub] carbondata pull request #1484: [CARBONDATA-1700][DataLoad] Add TableProperti...
Date Fri, 10 Nov 2017 14:30:23 GMT
GitHub user xuchuanyin opened a pull request:

    [CARBONDATA-1700][DataLoad] Add TableProperties during (de)serialization of TableSchema

    Be sure to do all of the following checklist to help us incorporate 
    your contribution quickly and easily:
     - [X] Any interfaces changed?
     - [X] Any backward compatibility impacted?
     - [X] Document update required?
     - [X] Testing done
            Please provide details on 
            - Whether new unit test cases have been added or why no new tests are required?
            - How it is tested? Please attach test report.
            `TESTED MANUALLY`
            - Is it a performance related change? Please attach the performance test report.
            - Any additional information to help reviewers in testing this change.
     - [X] For large changes, please consider breaking it into sub-tasks under an umbrella
            `NOT RELATED`
    # Scenario
    I encountered a failure when loading data into an existing carbondata table, after querying
the table following a spark session restart. I hit this failure in spark local mode (found it
during a local test) and haven't tested other scenarios.
    The problem can be reproduced by following steps:
    0. START: start a session;
    1. CREATE: create table `t1`;
    2. LOAD: create a dataframe and write append to `t1`;
    3. STOP: stop current session;
    4. START: start a session;
    5. QUERY: query table `t1`;  ----  This step is essential to reproduce the problem.
    6. LOAD: create a dataframe and write append to `t1`;  --- This step will fail.
    The error is thrown in Step 6. The console output looks like:
    java.lang.NullPointerException was thrown.
    at org.apache.spark.sql.CarbonDataFrameWriter.loadDataFrame(CarbonDataFrameWriter.scala:141)
    at org.apache.spark.sql.CarbonDataFrameWriter.writeToCarbonFile(CarbonDataFrameWriter.scala:50)
    at org.apache.spark.sql.CarbonDataFrameWriter.appendToCarbonFile(CarbonDataFrameWriter.scala:42)
    at org.apache.spark.sql.CarbonSource.createRelation(CarbonSource.scala:110)
    at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:426)
    The following code can be pasted into `TestLoadDataFrame.scala` to reproduce this problem,
but keep in mind you should run the first test manually and then the second in a separate run
(to make sure that the sparksession is restarted).
      test("prepare") {
        sql("drop table if exists carbon_stand_alone")
        sql("create table if not exists carbon_stand_alone (c1 string, c2 string, c3 int)" +
            " stored by 'carbondata'").collect()
        sql("select * from carbon_stand_alone").show()
        df.write  // df: the dataframe built by this test suite
          .format("carbondata")
          .option("tableName", "carbon_stand_alone")
          .option("tempCSV", "false")
          .mode(SaveMode.Append)
          .save()
      }

      test("test load dataframe after query") {
        sql("select * from carbon_stand_alone").show()
        // the following write will cause the failure
        df.write
          .format("carbondata")
          .option("tableName", "carbon_stand_alone")
          .option("tempCSV", "false")
          .mode(SaveMode.Append)
          .save()
        // if it works fine, the following check should pass
        checkAnswer(
          sql("select count(*) from carbon_stand_alone where c3 > 500"), Row(31500 * 2))
      }
    I went through the code and found the cause:
`tableMeta.carbonTable.getTableInfo.getFactTable.getTableProperties` (we will name it
`propertyInTableInfo` for short) is null at Line89 in `LoadTableCommand.scala`.
    After debugging, I found that the `propertyInTableInfo` set in `CarbonTableInputFormat.setTableInfo(...)`
had the correct value, but the one returned by `CarbonTableInputFormat.getTableInfo(...)` did not.
The setter serializes TableInfo, while the getter deserializes it
—— that means something is wrong in the serialization-deserialization.
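    To make the setter/getter round trip concrete, here is an illustrative sketch (not the
actual CarbonData code; the real `CarbonTableInputFormat` works against a Hadoop
`Configuration`, and the class and key names below are invented for the example) of how a
table-info object is serialized into a configuration string by the setter and read back by
the getter — any field skipped during serialization comes back missing on the reader side:

    ```java
    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.ObjectInputStream;
    import java.io.ObjectOutputStream;
    import java.io.Serializable;
    import java.util.Base64;
    import java.util.Map;

    // Simplified stand-in for the setter/getter pair: setTableInfo serializes
    // the object into the "configuration" (a plain Map here), getTableInfo
    // deserializes it back on the other side.
    class TableInfoConf {
        static final String KEY = "carbon.table.info";

        static void setTableInfo(Map<String, String> conf, Serializable info)
                throws IOException {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
                // a field dropped at this step never reaches the reader
                oos.writeObject(info);
            }
            conf.put(KEY, Base64.getEncoder().encodeToString(bos.toByteArray()));
        }

        static Object getTableInfo(Map<String, String> conf)
                throws IOException, ClassNotFoundException {
            byte[] bytes = Base64.getDecoder().decode(conf.get(KEY));
            try (ObjectInputStream ois =
                    new ObjectInputStream(new ByteArrayInputStream(bytes))) {
                return ois.readObject();
            }
        }
    }
    ```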
    Digging further into the code, I found that the serialization and deserialization of
`TableSchema`, a member of `TableInfo`, ignore the `tableProperties` member, leaving this value
empty after deserialization. Since the value is not initialized in the constructor either, it
remains `NULL` and causes the NPE.
    # Fix
    1. Initialize `tableProperties` in `TableSchema`
    2. Include `tableProperties` in the serialization-deserialization of `TableSchema`
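    A minimal sketch of this two-part fix, using plain `java.io` serialization. The real
`TableSchema` lives in carbondata-core and has many more fields; the class below and its
members are illustrative only, not the actual patch:

    ```java
    import java.io.IOException;
    import java.io.ObjectInputStream;
    import java.io.ObjectOutputStream;
    import java.io.Serializable;
    import java.util.HashMap;
    import java.util.Map;

    class TableSchemaSketch implements Serializable {
        private static final long serialVersionUID = 1L;

        private String tableName = "";
        // Fix 1: initialize eagerly so the field is never null after construction.
        private Map<String, String> tableProperties = new HashMap<>();

        String getTableName() { return tableName; }
        void setTableName(String name) { tableName = name; }
        Map<String, String> getTableProperties() { return tableProperties; }

        // Fix 2: include tableProperties in the custom (de)serialization
        // instead of silently dropping it.
        private void writeObject(ObjectOutputStream out) throws IOException {
            out.writeUTF(tableName);
            out.writeObject(tableProperties);
        }

        @SuppressWarnings("unchecked")
        private void readObject(ObjectInputStream in)
                throws IOException, ClassNotFoundException {
            tableName = in.readUTF();
            tableProperties = (Map<String, String>) in.readObject();
            // defensive fallback so old payloads cannot reintroduce the NPE
            if (tableProperties == null) {
                tableProperties = new HashMap<>();
            }
        }
    }
    ```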
    # Notes
    Although the bug has been fixed, I still can't understand why the problem is triggered
in the above way.
    Tests need the sparksession to be restarted, which is currently impossible in the test
framework, so no tests will be added.

You can merge this pull request into a Git repository by running:

    $ git pull bug_table_property_NPE

Alternatively you can review and apply these changes as the patch at:

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #1484
commit 47f70c7c2b7c0363e982335817055ddfd9c8b84d
Author: xuchuanyin <>
Date:   2017-11-10T13:10:55Z

    Add TableProperties during (de)serialization of TableSchema


