carbondata-commits mailing list archives

From chenliang...@apache.org
Subject [1/2] carbondata git commit: [CARBONDATA-1095] Fix issues after rebasing presto and hive integration to master
Date Fri, 26 May 2017 15:31:51 GMT
Repository: carbondata
Updated Branches:
  refs/heads/master 314c01a3d -> 823eb1d71


[CARBONDATA-1095] Fix issues after rebasing presto and hive integration to master

fix travis ci issue


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/53267c82
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/53267c82
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/53267c82

Branch: refs/heads/master
Commit: 53267c82c59059b740099d1710c7736a58c151b6
Parents: 314c01a
Author: chenliang613 <chenliang613@apache.org>
Authored: Fri May 26 20:21:01 2017 +0800
Committer: chenliang613 <chenliang613@apache.org>
Committed: Fri May 26 23:27:41 2017 +0800

----------------------------------------------------------------------
 README.md                                       |   5 +-
 examples/flink/pom.xml                          |   1 +
 integration/hive/hive-guide.md                  |   4 +-
 integration/hive/pom.xml                        |   2 +-
 .../carbondata/hive/CarbonHiveRecordReader.java |   6 +-
 .../hive/server/HiveEmbeddedServer2.java        |   6 +-
 .../carbondata/hiveexample/HiveExample.scala    |  27 +-
 .../apache/carbondata/hive/TestCarbonSerde.java | 266 +++++++++----------
 integration/presto/README.md                    |  86 ++++++
 integration/presto/pom.xml                      | 233 ++++++++++++++++
 .../presto/CarbondataConnectorId.java           |  52 ++++
 .../presto/CarbondataHandleResolver.java        |  43 +++
 .../carbondata/presto/CarbondataPlugin.java     |  34 +++
 .../carbondata/presto/CarbondataSplit.java      |  88 ++++++
 .../presto/CarbondataTableHandle.java           |  71 +++++
 .../presto/CarbondataTableLayoutHandle.java     |  71 +++++
 .../presto/CarbondataTransactionHandle.java     |  24 ++
 .../org/apache/carbondata/presto/Types.java     |  35 +++
 .../presto/impl/CarbonTableReader.java          |   4 +-
 .../InsertIntoCarbonTableSpark1TestCase.scala   |  81 ++++++
 .../InsertIntoCarbonTableTestCase.scala         |  81 ------
 .../InsertIntoCarbonTableSpark2TestCase.scala   |  40 +++
 .../InsertIntoCarbonTableTestCase.scala         |  40 ---
 pom.xml                                         |  18 +-
 24 files changed, 1029 insertions(+), 289 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/carbondata/blob/53267c82/README.md
----------------------------------------------------------------------
diff --git a/README.md b/README.md
index 4f31e16..3226853 100644
--- a/README.md
+++ b/README.md
@@ -27,7 +27,10 @@ You can find the latest CarbonData document and learn more at:
 [CarbonData cwiki](https://cwiki.apache.org/confluence/display/CARBONDATA/)
 
 ## Status
-[![Build Status](https://travis-ci.org/apache/carbondata.svg?branch=master)](https://travis-ci.org/apache/carbondata.svg?branch=master)
+Spark2.1:
+[![Build Status](https://builds.apache.org/buildStatus/icon?job=carbondata-master-spark-2.1)](https://builds.apache.org/view/CarbonData/job/carbondata-master-spark-2.1/)
+Spark1.6:
+[![Build Status](https://builds.apache.org/buildStatus/icon?job=carbondata-master-spark-1.6)](https://builds.apache.org/view/CarbonData/job/carbondata-master-spark-1.6/)
 
 ## Features
 CarbonData file format is a columnar store in HDFS, it has many features that a modern columnar format has, such as splittable, compression schema ,complex data type etc, and CarbonData has following unique features:

http://git-wip-us.apache.org/repos/asf/carbondata/blob/53267c82/examples/flink/pom.xml
----------------------------------------------------------------------
diff --git a/examples/flink/pom.xml b/examples/flink/pom.xml
index cfe059a..d83e2cd 100644
--- a/examples/flink/pom.xml
+++ b/examples/flink/pom.xml
@@ -30,6 +30,7 @@
   <name>Apache CarbonData :: Flink Examples</name>
 
   <properties>
+    <flink.version>1.1.4</flink.version>
     <dev.path>${basedir}/../../dev</dev.path>
   </properties>
 

http://git-wip-us.apache.org/repos/asf/carbondata/blob/53267c82/integration/hive/hive-guide.md
----------------------------------------------------------------------
diff --git a/integration/hive/hive-guide.md b/integration/hive/hive-guide.md
index 202b2b2..dcf68f6 100644
--- a/integration/hive/hive-guide.md
+++ b/integration/hive/hive-guide.md
@@ -72,8 +72,8 @@ scala>carbon.sql("SELECT * FROM hive_carbon").show()
 ### Configure hive classpath
 ```
 mkdir hive/auxlibs/
-cp incubator-carbondata/assembly/target/scala-2.11/carbondata_2.11*.jar hive/auxlibs/
-cp incubator-carbondata/integration/hive/target/carbondata-hive-*.jar hive/auxlibs/
+cp carbondata/assembly/target/scala-2.11/carbondata_2.11*.jar hive/auxlibs/
+cp carbondata/integration/hive/target/carbondata-hive-*.jar hive/auxlibs/
 cp $SPARK_HOME/jars/spark-catalyst*.jar hive/auxlibs/
 export HIVE_AUX_JARS_PATH=hive/auxlibs/
 ```

http://git-wip-us.apache.org/repos/asf/carbondata/blob/53267c82/integration/hive/pom.xml
----------------------------------------------------------------------
diff --git a/integration/hive/pom.xml b/integration/hive/pom.xml
index 12ef24a..5a33958 100644
--- a/integration/hive/pom.xml
+++ b/integration/hive/pom.xml
@@ -22,7 +22,7 @@
     <parent>
         <groupId>org.apache.carbondata</groupId>
         <artifactId>carbondata-parent</artifactId>
-        <version>1.1.0-incubating-SNAPSHOT</version>
+        <version>1.2.0-SNAPSHOT</version>
         <relativePath>../../pom.xml</relativePath>
     </parent>
 

http://git-wip-us.apache.org/repos/asf/carbondata/blob/53267c82/integration/hive/src/main/java/org/apache/carbondata/hive/CarbonHiveRecordReader.java
----------------------------------------------------------------------
diff --git a/integration/hive/src/main/java/org/apache/carbondata/hive/CarbonHiveRecordReader.java b/integration/hive/src/main/java/org/apache/carbondata/hive/CarbonHiveRecordReader.java
index eb7faed..e7e342c 100644
--- a/integration/hive/src/main/java/org/apache/carbondata/hive/CarbonHiveRecordReader.java
+++ b/integration/hive/src/main/java/org/apache/carbondata/hive/CarbonHiveRecordReader.java
@@ -78,14 +78,16 @@ public class CarbonHiveRecordReader extends CarbonRecordReader<ArrayWritable>
     }
     List<TableBlockInfo> tableBlockInfoList = CarbonHiveInputSplit.createBlocks(splitList);
     queryModel.setTableBlockInfos(tableBlockInfoList);
-    readSupport.initialize(queryModel.getProjectionColumns(), queryModel.getAbsoluteTableIdentifier());
+    readSupport.initialize(queryModel.getProjectionColumns(),
+                           queryModel.getAbsoluteTableIdentifier());
     try {
       carbonIterator = new ChunkRowIterator(queryExecutor.execute(queryModel));
     } catch (QueryExecutionException e) {
       throw new IOException(e.getMessage(), e.getCause());
     }
     if (valueObj == null) {
-      valueObj = new ArrayWritable(Writable.class, new Writable[queryModel.getProjectionColumns().length]);
+      valueObj = new ArrayWritable(Writable.class,
+                                   new Writable[queryModel.getProjectionColumns().length]);
     }
 
     final TypeInfo rowTypeInfo;

http://git-wip-us.apache.org/repos/asf/carbondata/blob/53267c82/integration/hive/src/main/java/org/apache/carbondata/hive/server/HiveEmbeddedServer2.java
----------------------------------------------------------------------
diff --git a/integration/hive/src/main/java/org/apache/carbondata/hive/server/HiveEmbeddedServer2.java b/integration/hive/src/main/java/org/apache/carbondata/hive/server/HiveEmbeddedServer2.java
index bd94ae9..3a05a10 100644
--- a/integration/hive/src/main/java/org/apache/carbondata/hive/server/HiveEmbeddedServer2.java
+++ b/integration/hive/src/main/java/org/apache/carbondata/hive/server/HiveEmbeddedServer2.java
@@ -42,7 +42,8 @@ import org.apache.hive.service.server.HiveServer2;
 /**
  * Utility starting a local/embedded Hive org.apache.carbondata.hive.server for testing purposes.
  * Uses sensible defaults to properly clean between reruns.
- * Additionally it wrangles the Hive internals so it rather executes the jobs locally not within a child JVM (which Hive calls local) or external.
+ * Additionally it wrangles the Hive internals so it rather executes the jobs locally not within
+ * a child JVM (which Hive calls local) or external.
  */
 public class HiveEmbeddedServer2 {
   private static final String SCRATCH_DIR = "/tmp/hive";
@@ -131,7 +132,8 @@ public class HiveEmbeddedServer2 {
     conf.set("hive.added.archives.path", "");
     conf.set("fs.default.name", "file:///");
 
-    // clear mapred.job.tracker - Hadoop defaults to 'local' if not defined. Hive however expects this to be set to 'local' - if it's not, it does a remote execution (i.e. no child JVM)
+    // clear mapred.job.tracker - Hadoop defaults to 'local' if not defined. Hive however expects
+    // this to be set to 'local' - if it's not, it does a remote execution (i.e. no child JVM)
     Field field = Configuration.class.getDeclaredField("properties");
     field.setAccessible(true);
     Properties props = (Properties) field.get(conf);

http://git-wip-us.apache.org/repos/asf/carbondata/blob/53267c82/integration/hive/src/main/scala/org/apache/carbondata/hiveexample/HiveExample.scala
----------------------------------------------------------------------
diff --git a/integration/hive/src/main/scala/org/apache/carbondata/hiveexample/HiveExample.scala b/integration/hive/src/main/scala/org/apache/carbondata/hiveexample/HiveExample.scala
index a80fb71..6d19049 100644
--- a/integration/hive/src/main/scala/org/apache/carbondata/hiveexample/HiveExample.scala
+++ b/integration/hive/src/main/scala/org/apache/carbondata/hiveexample/HiveExample.scala
@@ -24,6 +24,7 @@ import org.apache.spark.sql.SparkSession
 import org.apache.carbondata.common.logging.LogServiceFactory
 import org.apache.carbondata.hive.server.HiveEmbeddedServer2
 
+// scalastyle:off println
 object HiveExample {
 
   private val driverName: String = "org.apache.hive.jdbc.HiveDriver"
@@ -103,14 +104,16 @@ object HiveExample {
     }
     catch {
       case exception: Exception =>
-        logger.warn(s"Jar Not Found $carbonHadoopJarPath"+"Looking For hadoop 2.2.0 version jar")
+        logger.warn(s"Jar Not Found $carbonHadoopJarPath" + "Looking For hadoop 2.2.0 version jar")
         try {
           stmt
             .execute(s"ADD JAR $carbon_DefaultHadoopVersion_JarPath")
         }
         catch {
           case exception: Exception => logger
-            .error(s"Exception Occurs:Neither One of Jar is Found $carbon_DefaultHadoopVersion_JarPath,$carbonHadoopJarPath"+"Atleast One Should Be Build")
+            .error("Exception Occurs:Neither One of Jar is Found" +
+                   s"$carbon_DefaultHadoopVersion_JarPath,$carbonHadoopJarPath" +
+                   "Atleast One Should Be Build")
             hiveEmbeddedServer2.stop()
             System.exit(0)
         }
@@ -177,12 +180,13 @@ object HiveExample {
     println(s"******Total Number Of Rows Fetched ****** $rowsFetched")
 
     logger.info("Fetching the Individual Columns ")
-    //fetching the seperate columns
+
+    // fetching the separate columns
     var individualColRowsFetched = 0
 
     val resultIndividualCol = stmt.executeQuery("SELECT NAME FROM HIVE_CARBON_EXAMPLE")
 
-    while(resultIndividualCol.next){
+    while (resultIndividualCol.next) {
       if (individualColRowsFetched == 0) {
         println("+--------------+")
         println("| NAME         |")
@@ -197,18 +201,19 @@ object HiveExample {
       else {
         val resultName = resultIndividualCol.getString("NAME")
 
-        println(s"| $resultName      |" )
-        println("+---+" + "+---------+" )
+        println(s"| $resultName      |")
+        println("+---+" + "+---------+")
       }
-      individualColRowsFetched =  individualColRowsFetched +1
+      individualColRowsFetched = individualColRowsFetched + 1
     }
-    println(s" ********** Total Rows Fetched When Quering The Individual Column ********** $individualColRowsFetched")
+    println(" ********** Total Rows Fetched When Quering The Individual Column **********" +
+            s"$individualColRowsFetched")
 
     logger.info("Fetching the Out Of Order Columns ")
 
     val resultOutOfOrderCol = stmt.executeQuery("SELECT SALARY,ID,NAME FROM HIVE_CARBON_EXAMPLE")
     var outOfOrderColFetched = 0
-    while (resultOutOfOrderCol.next()){
+    while (resultOutOfOrderCol.next()) {
       if (outOfOrderColFetched == 0) {
         println("+---+" + "+-------+" + "+--------------+")
         println("| Salary|" + "| ID |" + "| NAME        |")
@@ -230,10 +235,10 @@ object HiveExample {
         println(s"| $resultSalary |" + s"| $resultId |" + s"| $resultName   |")
         println("+---+" + "+-------+" + "+--------------+")
       }
-      outOfOrderColFetched =  outOfOrderColFetched +1
+      outOfOrderColFetched = outOfOrderColFetched + 1
     }
     hiveEmbeddedServer2.stop()
     System.exit(0)
   }
 
-}
\ No newline at end of file
+}

http://git-wip-us.apache.org/repos/asf/carbondata/blob/53267c82/integration/hive/src/test/java/org/apache/carbondata/hive/TestCarbonSerde.java
----------------------------------------------------------------------
diff --git a/integration/hive/src/test/java/org/apache/carbondata/hive/TestCarbonSerde.java b/integration/hive/src/test/java/org/apache/carbondata/hive/TestCarbonSerde.java
index 3969914..be17823 100644
--- a/integration/hive/src/test/java/org/apache/carbondata/hive/TestCarbonSerde.java
+++ b/integration/hive/src/test/java/org/apache/carbondata/hive/TestCarbonSerde.java
@@ -1,133 +1,133 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements.  See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License.  You may obtain a copy of the License at
- *
- *    http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-package org.apache.carbondata.hive;
-
-import junit.framework.TestCase;
-import org.apache.hadoop.conf.Configuration;
-import org.apache.hadoop.hive.common.type.HiveDecimal;
-import org.apache.hadoop.hive.serde2.SerDeException;
-import org.apache.hadoop.hive.serde2.SerDeUtils;
-import org.apache.hadoop.hive.serde2.io.DoubleWritable;
-import org.apache.hadoop.hive.serde2.io.HiveDecimalWritable;
-import org.apache.hadoop.hive.serde2.io.ShortWritable;
-import org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector;
-import org.apache.hadoop.io.*;
-import org.junit.Test;
-
-import java.util.Properties;
-
-public class TestCarbonSerde extends TestCase {
-  @Test
-  public void testCarbonHiveSerDe() throws Throwable {
-    try {
-      // Create the SerDe
-      System.out.println("test: testCarbonHiveSerDe");
-
-      final CarbonHiveSerDe serDe = new CarbonHiveSerDe();
-      final Configuration conf = new Configuration();
-      final Properties tbl = createProperties();
-      SerDeUtils.initializeSerDe(serDe, conf, tbl, null);
-
-      // Data
-      final Writable[] arr = new Writable[7];
-
-      //primitive types
-      arr[0] = new ShortWritable((short) 456);
-      arr[1] = new IntWritable(789);
-      arr[2] = new LongWritable(1000l);
-      arr[3] = new DoubleWritable((double) 5.3);
-      arr[4] = new HiveDecimalWritable(HiveDecimal.create(1));
-      arr[5] = new Text("carbonSerde binary".getBytes("UTF-8"));
-
-      final Writable[] arrayContainer = new Writable[1];
-      final Writable[] array = new Writable[5];
-      for (int i = 0; i < 5; ++i) {
-        array[i] = new IntWritable(i);
-      }
-      arrayContainer[0] = new ArrayWritable(Writable.class, array);
-      arr[6] = new ArrayWritable(Writable.class, arrayContainer);
-
-      final ArrayWritable arrWritable = new ArrayWritable(Writable.class, arr);
-      // Test
-      deserializeAndSerializeLazySimple(serDe, arrWritable);
-      System.out.println("test: testCarbonHiveSerDe - OK");
-
-    } catch (final Throwable e) {
-      e.printStackTrace();
-      throw e;
-    }
-  }
-
-  private void deserializeAndSerializeLazySimple(final CarbonHiveSerDe serDe,
-      final ArrayWritable t) throws SerDeException {
-
-    // Get the row structure
-    final StructObjectInspector oi = (StructObjectInspector) serDe.getObjectInspector();
-
-    // Deserialize
-    final Object row = serDe.deserialize(t);
-    assertEquals("deserialization gives the wrong object class", row.getClass(),
-        ArrayWritable.class);
-    assertEquals("size correct after deserialization",
-        serDe.getSerDeStats().getRawDataSize(), t.get().length);
-    assertEquals("deserialization gives the wrong object", t, row);
-
-    // Serialize
-    final ArrayWritable serializedArr = (ArrayWritable) serDe.serialize(row, oi);
-    assertEquals("size correct after serialization", serDe.getSerDeStats().getRawDataSize(),
-        serializedArr.get().length);
-    assertTrue("serialized object should be equal to starting object",
-        arrayWritableEquals(t, serializedArr));
-  }
-
-  private Properties createProperties() {
-    final Properties tbl = new Properties();
-
-    // Set the configuration parameters
-    tbl.setProperty("columns", "ashort,aint,along,adouble,adecimal,astring,alist");
-    tbl.setProperty("columns.types",
-        "smallint:int:bigint:double:decimal:string:array<int>");
-    tbl.setProperty(org.apache.hadoop.hive.serde.serdeConstants.SERIALIZATION_NULL_FORMAT, "NULL");
-    return tbl;
-  }
-
-  public static boolean arrayWritableEquals(final ArrayWritable a1, final ArrayWritable a2) {
-    final Writable[] a1Arr = a1.get();
-    final Writable[] a2Arr = a2.get();
-
-    if (a1Arr.length != a2Arr.length) {
-      return false;
-    }
-
-    for (int i = 0; i < a1Arr.length; ++i) {
-      if (a1Arr[i] instanceof ArrayWritable) {
-        if (!(a2Arr[i] instanceof ArrayWritable)) {
-          return false;
-        }
-        if (!arrayWritableEquals((ArrayWritable) a1Arr[i], (ArrayWritable) a2Arr[i])) {
-          return false;
-        }
-      } else {
-        if (!a1Arr[i].equals(a2Arr[i])) {
-          return false;
-        }
-      }
-
-    }
-    return true;
-  }
-}
+///*
+// * Licensed to the Apache Software Foundation (ASF) under one or more
+// * contributor license agreements.  See the NOTICE file distributed with
+// * this work for additional information regarding copyright ownership.
+// * The ASF licenses this file to You under the Apache License, Version 2.0
+// * (the "License"); you may not use this file except in compliance with
+// * the License.  You may obtain a copy of the License at
+// *
+// *    http://www.apache.org/licenses/LICENSE-2.0
+// *
+// * Unless required by applicable law or agreed to in writing, software
+// * distributed under the License is distributed on an "AS IS" BASIS,
+// * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// * See the License for the specific language governing permissions and
+// * limitations under the License.
+// */
+//package org.apache.carbondata.hive;
+//
+//import junit.framework.TestCase;
+//import org.apache.hadoop.conf.Configuration;
+//import org.apache.hadoop.hive.common.type.HiveDecimal;
+//import org.apache.hadoop.hive.serde2.SerDeException;
+//import org.apache.hadoop.hive.serde2.SerDeUtils;
+//import org.apache.hadoop.hive.serde2.io.DoubleWritable;
+//import org.apache.hadoop.hive.serde2.io.HiveDecimalWritable;
+//import org.apache.hadoop.hive.serde2.io.ShortWritable;
+//import org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector;
+//import org.apache.hadoop.io.*;
+//import org.junit.Test;
+//
+//import java.util.Properties;
+//
+//public class TestCarbonSerde extends TestCase {
+//  @Test
+//  public void testCarbonHiveSerDe() throws Throwable {
+//    try {
+//      // Create the SerDe
+//      System.out.println("test: testCarbonHiveSerDe");
+//
+//      final CarbonHiveSerDe serDe = new CarbonHiveSerDe();
+//      final Configuration conf = new Configuration();
+//      final Properties tbl = createProperties();
+//      SerDeUtils.initializeSerDe(serDe, conf, tbl, null);
+//
+//      // Data
+//      final Writable[] arr = new Writable[7];
+//
+//      //primitive types
+//      arr[0] = new ShortWritable((short) 456);
+//      arr[1] = new IntWritable(789);
+//      arr[2] = new LongWritable(1000l);
+//      arr[3] = new DoubleWritable((double) 5.3);
+//      arr[4] = new HiveDecimalWritable(HiveDecimal.create(1));
+//      arr[5] = new Text("carbonSerde binary".getBytes("UTF-8"));
+//
+//      final Writable[] arrayContainer = new Writable[1];
+//      final Writable[] array = new Writable[5];
+//      for (int i = 0; i < 5; ++i) {
+//        array[i] = new IntWritable(i);
+//      }
+//      arrayContainer[0] = new ArrayWritable(Writable.class, array);
+//      arr[6] = new ArrayWritable(Writable.class, arrayContainer);
+//
+//      final ArrayWritable arrWritable = new ArrayWritable(Writable.class, arr);
+//      // Test
+//      deserializeAndSerializeLazySimple(serDe, arrWritable);
+//      System.out.println("test: testCarbonHiveSerDe - OK");
+//
+//    } catch (final Throwable e) {
+//      e.printStackTrace();
+//      throw e;
+//    }
+//  }
+//
+//  private void deserializeAndSerializeLazySimple(final CarbonHiveSerDe serDe,
+//      final ArrayWritable t) throws SerDeException {
+//
+//    // Get the row structure
+//    final StructObjectInspector oi = (StructObjectInspector) serDe.getObjectInspector();
+//
+//    // Deserialize
+//    final Object row = serDe.deserialize(t);
+//    assertEquals("deserialization gives the wrong object class", row.getClass(),
+//        ArrayWritable.class);
+//    assertEquals("size correct after deserialization",
+//        serDe.getSerDeStats().getRawDataSize(), t.get().length);
+//    assertEquals("deserialization gives the wrong object", t, row);
+//
+//    // Serialize
+//    final ArrayWritable serializedArr = (ArrayWritable) serDe.serialize(row, oi);
+//    assertEquals("size correct after serialization", serDe.getSerDeStats().getRawDataSize(),
+//        serializedArr.get().length);
+//    assertTrue("serialized object should be equal to starting object",
+//        arrayWritableEquals(t, serializedArr));
+//  }
+//
+//  private Properties createProperties() {
+//    final Properties tbl = new Properties();
+//
+//    // Set the configuration parameters
+//    tbl.setProperty("columns", "ashort,aint,along,adouble,adecimal,astring,alist");
+//    tbl.setProperty("columns.types",
+//        "smallint:int:bigint:double:decimal:string:array<int>");
+//    tbl.setProperty(org.apache.hadoop.hive.serde.serdeConstants.SERIALIZATION_NULL_FORMAT, "NULL");
+//    return tbl;
+//  }
+//
+//  public static boolean arrayWritableEquals(final ArrayWritable a1, final ArrayWritable a2) {
+//    final Writable[] a1Arr = a1.get();
+//    final Writable[] a2Arr = a2.get();
+//
+//    if (a1Arr.length != a2Arr.length) {
+//      return false;
+//    }
+//
+//    for (int i = 0; i < a1Arr.length; ++i) {
+//      if (a1Arr[i] instanceof ArrayWritable) {
+//        if (!(a2Arr[i] instanceof ArrayWritable)) {
+//          return false;
+//        }
+//        if (!arrayWritableEquals((ArrayWritable) a1Arr[i], (ArrayWritable) a2Arr[i])) {
+//          return false;
+//        }
+//      } else {
+//        if (!a1Arr[i].equals(a2Arr[i])) {
+//          return false;
+//        }
+//      }
+//
+//    }
+//    return true;
+//  }
+//}
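
The commented-out `arrayWritableEquals` above does a recursive, element-wise comparison of nested `ArrayWritable` trees. The same logic can be sketched over plain `Object[]` so it runs without Hadoop on the classpath; `DeepArrayEquals` and `deepEquals` are hypothetical names for illustration, not part of this commit:

```java
// Minimal sketch of the recursive comparison in arrayWritableEquals,
// rewritten over plain Object[] (no Hadoop dependency).
public class DeepArrayEquals {
  public static boolean deepEquals(Object[] a1, Object[] a2) {
    if (a1.length != a2.length) {
      return false;
    }
    for (int i = 0; i < a1.length; ++i) {
      // Nested arrays recurse, mirroring the nested ArrayWritable case above.
      if (a1[i] instanceof Object[]) {
        if (!(a2[i] instanceof Object[])
            || !deepEquals((Object[]) a1[i], (Object[]) a2[i])) {
          return false;
        }
      } else if (!a1[i].equals(a2[i])) {
        return false;
      }
    }
    return true;
  }

  public static void main(String[] args) {
    Object[] nested1 = {1, new Object[]{2, 3}};
    Object[] nested2 = {1, new Object[]{2, 3}};
    Object[] nested3 = {1, new Object[]{2, 4}};
    System.out.println(deepEquals(nested1, nested2)); // true
    System.out.println(deepEquals(nested1, nested3)); // false
  }
}
```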

http://git-wip-us.apache.org/repos/asf/carbondata/blob/53267c82/integration/presto/README.md
----------------------------------------------------------------------
diff --git a/integration/presto/README.md b/integration/presto/README.md
new file mode 100644
index 0000000..8a7cd13
--- /dev/null
+++ b/integration/presto/README.md
@@ -0,0 +1,86 @@
+<!--
+    Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+      http://www.apache.org/licenses/LICENSE-2.0
+
+    Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+-->
+
+Please follow the below steps to query carbondata in presto
+
+### Config presto server
+* Download presto server 0.166 : https://repo1.maven.org/maven2/com/facebook/presto/presto-server/
+* Finish configuration as per https://prestodb.io/docs/current/installation/deployment.html
+  for example:
+  ```
+  carbondata.properties:
+  connector.name=carbondata
+  carbondata-store=/Users/apple/DEMO/presto_test/data
+  
+  config.properties:
+  coordinator=true
+  node-scheduler.include-coordinator=true
+  http-server.http.port=8086
+  query.max-memory=5GB
+  query.max-memory-per-node=1GB
+  discovery-server.enabled=true
+  discovery.uri=http://localhost:8086
+  
+  jvm.config:
+  -server
+  -Xmx4G
+  -XX:+UseG1GC
+  -XX:G1HeapRegionSize=32M
+  -XX:+UseGCOverheadLimit
+  -XX:+ExplicitGCInvokesConcurrent
+  -XX:+HeapDumpOnOutOfMemoryError
+  -XX:OnOutOfMemoryError=kill -9 %p
+  -XX:+TraceClassLoading
+  
+  log.properties:
+  com.facebook.presto=DEBUG
+  com.facebook.presto.server.PluginManager=DEBUG
+  
+  node.properties:
+  node.environment=carbondata
+  node.id=ffffffff-ffff-ffff-ffff-ffffffffffff
+  node.data-dir=/Users/apple/DEMO/presto_test/data
+  ```
+* config carbondata-connector for presto
+  
+  First:compile carbondata-presto integration module
+  ```
+  $ git clone https://github.com/apache/incubator-carbondata
+  $ cd incubator-carbondata/integration/presto
+  $ mvn clean package
+  ```
+  Second:create one folder "carbondata" under ./presto-server-0.166/plugin
+  Third:copy all jar from ./incubator-carbondata/integration/presto/target/carbondata-presto-1.1.0-incubating-SNAPSHOT
+        to ./presto-server-0.166/plugin/carbondata
+  
+### Generate CarbonData file
+
+Please refer to quick start : https://github.com/apache/incubator-carbondata/blob/master/docs/quick-start-guide.md
+
+### Query carbondata in CLI of presto
+* Download presto-cli-0.166-executable.jar
+
+* Start CLI:
+  
+  ```
+  $ ./presto-cli-0.166-executable.jar --server localhost:8086 --catalog carbondata --schema default
+  ```
+
+
+
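+  Besides the CLI, the same server can also be reached over JDBC. A minimal sketch, assuming the presto-jdbc driver matching server 0.166 is on the classpath; the host, port, catalog, and schema follow the sample config above, and the actual connection is shown only in comments since it needs a running server (`PrestoCarbonUrl` is a hypothetical helper, not part of this commit):

```java
// Builds the Presto JDBC URL for the carbondata catalog configured above.
// Port 8086 and the carbondata/default catalog/schema come from the sample
// config; adjust them to your deployment.
public class PrestoCarbonUrl {
  static String prestoUrl(String host, int port, String catalog, String schema) {
    return "jdbc:presto://" + host + ":" + port + "/" + catalog + "/" + schema;
  }

  public static void main(String[] args) {
    String url = prestoUrl("localhost", 8086, "carbondata", "default");
    System.out.println(url);
    // With a live server and presto-jdbc on the classpath:
    // Connection conn = DriverManager.getConnection(url, "test", null);
    // ResultSet rs = conn.createStatement().executeQuery("SHOW TABLES");
  }
}
```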

http://git-wip-us.apache.org/repos/asf/carbondata/blob/53267c82/integration/presto/pom.xml
----------------------------------------------------------------------
diff --git a/integration/presto/pom.xml b/integration/presto/pom.xml
new file mode 100644
index 0000000..25eb6a7
--- /dev/null
+++ b/integration/presto/pom.xml
@@ -0,0 +1,233 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+    Licensed to the Apache Software Foundation (ASF) under one or more
+    contributor license agreements.  See the NOTICE file distributed with
+    this work for additional information regarding copyright ownership.
+    The ASF licenses this file to You under the Apache License, Version 2.0
+    (the "License"); you may not use this file except in compliance with
+    the License.  You may obtain a copy of the License at
+
+       http://www.apache.org/licenses/LICENSE-2.0
+
+    Unless required by applicable law or agreed to in writing, software
+    distributed under the License is distributed on an "AS IS" BASIS,
+    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+    See the License for the specific language governing permissions and
+    limitations under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+
+  <modelVersion>4.0.0</modelVersion>
+
+  <parent>
+    <groupId>org.apache.carbondata</groupId>
+    <artifactId>carbondata-parent</artifactId>
+    <version>1.2.0-SNAPSHOT</version>
+    <relativePath>../../pom.xml</relativePath>
+  </parent>
+
+  <artifactId>carbondata-presto</artifactId>
+  <name>Apache CarbonData :: presto</name>
+  <packaging>presto-plugin</packaging>
+
+  <properties>
+    <presto.version>0.166</presto.version>
+    <dev.path>${basedir}/../../dev</dev.path>
+  </properties>
+
+  <dependencies>
+    <dependency>
+      <groupId>org.apache.thrift</groupId>
+      <artifactId>libthrift</artifactId>
+      <version>0.9.3</version>
+    </dependency>
+
+    <dependency>
+      <groupId>org.apache.carbondata</groupId>
+      <artifactId>carbondata-core</artifactId>
+      <version>${project.version}</version>
+    </dependency>
+
+    <dependency>
+      <groupId>org.apache.carbondata</groupId>
+      <artifactId>carbondata-common</artifactId>
+      <version>${project.version}</version>
+    </dependency>
+
+    <dependency>
+      <groupId>org.apache.carbondata</groupId>
+      <artifactId>carbondata-processing</artifactId>
+      <version>${project.version}</version>
+    </dependency>
+
+    <dependency>
+      <groupId>org.apache.carbondata</groupId>
+      <artifactId>carbondata-hadoop</artifactId>
+      <version>${project.version}</version>
+    </dependency>
+
+    <dependency>
+      <groupId>io.airlift</groupId>
+      <artifactId>bootstrap</artifactId>
+      <version>0.144</version>
+      <!--<scope>provided</scope>-->
+      <exclusions>
+        <exclusion>
+          <groupId>org.slf4j</groupId>
+          <artifactId>log4j-over-slf4j</artifactId>
+        </exclusion>
+      </exclusions>
+    </dependency>
+
+    <dependency>
+      <groupId>io.airlift</groupId>
+      <artifactId>json</artifactId>
+      <version>0.144</version>
+      <!--<scope>provided</scope>-->
+    </dependency>
+
+    <dependency>
+      <groupId>io.airlift</groupId>
+      <artifactId>log</artifactId>
+      <version>0.144</version>
+      <!--<scope>provided</scope>-->
+    </dependency>
+
+    <dependency>
+      <groupId>io.airlift</groupId>
+      <artifactId>slice</artifactId>
+      <version>0.29</version>
+      <scope>provided</scope>
+    </dependency>
+
+    <dependency>
+      <groupId>io.airlift</groupId>
+      <artifactId>units</artifactId>
+      <version>1.0</version>
+      <scope>provided</scope>
+    </dependency>
+
+    <dependency>
+      <groupId>com.fasterxml.jackson.core</groupId>
+      <artifactId>jackson-annotations</artifactId>
+      <version>2.6.0</version>
+      <scope>provided</scope>
+    </dependency>
+
+    <dependency>
+      <groupId>com.google.guava</groupId>
+      <artifactId>guava</artifactId>
+      <version>18.0</version>
+    </dependency>
+
+    <dependency>
+      <groupId>com.google.inject</groupId>
+      <artifactId>guice</artifactId>
+      <version>3.0</version>
+    </dependency>
+
+    <!--presto integrated-->
+    <dependency>
+      <groupId>com.facebook.presto</groupId>
+      <artifactId>presto-spi</artifactId>
+      <version>${presto.version}</version>
+      <scope>provided</scope>
+    </dependency>
+
+    <dependency>
+      <groupId>com.facebook.presto.hadoop</groupId>
+      <artifactId>hadoop-apache2</artifactId>
+      <version>2.7.3-1</version>
+    </dependency>
+  </dependencies>
+
+    <build>
+      <plugins>
+        <plugin>
+          <artifactId>maven-compiler-plugin</artifactId>
+          <configuration>
+            <source>1.8</source>
+            <target>1.8</target>
+          </configuration>
+        </plugin>
+        <plugin>
+          <groupId>org.apache.maven.plugins</groupId>
+          <artifactId>maven-surefire-plugin</artifactId>
+          <version>2.18</version>
+          <!-- Note config is repeated in scalatest config -->
+          <configuration>
+            <includes>
+              <include>**/Test*.java</include>
+              <include>**/*Test.java</include>
+              <include>**/*TestCase.java</include>
+              <include>**/*Suite.java</include>
+            </includes>
+            <reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
+            <argLine>-Xmx3g -XX:MaxPermSize=512m -XX:ReservedCodeCacheSize=512m</argLine>
+            <systemProperties>
+              <java.awt.headless>true</java.awt.headless>
+            </systemProperties>
+            <failIfNoTests>false</failIfNoTests>
+          </configuration>
+        </plugin>
+
+        <plugin>
+          <groupId>org.apache.maven.plugins</groupId>
+          <artifactId>maven-checkstyle-plugin</artifactId>
+          <version>2.17</version>
+          <configuration>
+            <skip>true</skip>
+          </configuration>
+        </plugin>
+
+        <plugin>
+          <groupId>org.apache.maven.plugins</groupId>
+          <artifactId>maven-enforcer-plugin</artifactId>
+          <version>1.4.1</version>
+          <configuration>
+            <skip>true</skip>
+          </configuration>
+        </plugin>
+
+        <plugin>
+          <groupId>com.ning.maven.plugins</groupId>
+          <artifactId>maven-dependency-versions-check-plugin</artifactId>
+          <configuration>
+            <skip>true</skip>
+            <failBuildInCaseOfConflict>false</failBuildInCaseOfConflict>
+          </configuration>
+        </plugin>
+
+        <plugin>
+          <groupId>org.apache.maven.plugins</groupId>
+          <artifactId>maven-dependency-plugin</artifactId>
+          <configuration>
+            <skip>false</skip>
+          </configuration>
+        </plugin>
+
+        <plugin>
+          <groupId>com.ning.maven.plugins</groupId>
+          <artifactId>maven-duplicate-finder-plugin</artifactId>
+          <configuration>
+            <skip>true</skip>
+          </configuration>
+        </plugin>
+
+        <plugin>
+          <groupId>io.takari.maven.plugins</groupId>
+          <artifactId>presto-maven-plugin</artifactId>
+          <version>0.1.12</version>
+          <extensions>true</extensions>
+        </plugin>
+
+        <plugin>
+          <groupId>pl.project13.maven</groupId>
+          <artifactId>git-commit-id-plugin</artifactId>
+          <configuration>
+            <skip>true</skip>
+          </configuration>
+        </plugin>
+      </plugins>
+    </build>
+</project>
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/carbondata/blob/53267c82/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataConnectorId.java
----------------------------------------------------------------------
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataConnectorId.java b/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataConnectorId.java
new file mode 100755
index 0000000..b4ba1dd
--- /dev/null
+++ b/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataConnectorId.java
@@ -0,0 +1,52 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.carbondata.presto;
+
+import com.google.inject.Inject;
+
+import java.util.Objects;
+
+import static java.util.Objects.requireNonNull;
+
+public class CarbondataConnectorId {
+  private final String id;
+
+  @Inject public CarbondataConnectorId(String id) {
+    this.id = requireNonNull(id, "id is null");
+  }
+
+  @Override public String toString() {
+    return id;
+  }
+
+  @Override public int hashCode() {
+    return Objects.hash(id);
+  }
+
+  @Override public boolean equals(Object obj) {
+    if (this == obj) {
+      return true;
+    }
+
+    if ((obj == null) || (getClass() != obj.getClass())) {
+      return false;
+    }
+
+    return Objects.equals(this.id, ((CarbondataConnectorId) obj).id);
+  }
+}
\ No newline at end of file
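The connector id above is a plain value object: `toString`, `hashCode`, and `equals` all delegate to the single wrapped string, so two ids built from equal strings are interchangeable (e.g. as map keys). A standalone sketch of the same pattern — the class name is illustrative, not part of the commit:

```java
import java.util.Objects;

// Illustrative sketch of the value-object pattern used by CarbondataConnectorId:
// identity is derived solely from the wrapped id string.
final class ConnectorIdSketch {
  private final String id;

  ConnectorIdSketch(String id) {
    this.id = Objects.requireNonNull(id, "id is null");
  }

  @Override public String toString() {
    return id;
  }

  @Override public int hashCode() {
    return Objects.hash(id);
  }

  @Override public boolean equals(Object obj) {
    if (this == obj) {
      return true;
    }
    if (obj == null || getClass() != obj.getClass()) {
      return false;
    }
    return Objects.equals(this.id, ((ConnectorIdSketch) obj).id);
  }
}
```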

http://git-wip-us.apache.org/repos/asf/carbondata/blob/53267c82/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataHandleResolver.java
----------------------------------------------------------------------
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataHandleResolver.java b/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataHandleResolver.java
new file mode 100755
index 0000000..7c65bfd
--- /dev/null
+++ b/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataHandleResolver.java
@@ -0,0 +1,43 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.carbondata.presto;
+
+import com.facebook.presto.spi.ColumnHandle;
+import com.facebook.presto.spi.ConnectorHandleResolver;
+import com.facebook.presto.spi.ConnectorSplit;
+import com.facebook.presto.spi.ConnectorTableHandle;
+import com.facebook.presto.spi.ConnectorTableLayoutHandle;
+import com.facebook.presto.spi.connector.ConnectorTransactionHandle;
+
+public class CarbondataHandleResolver implements ConnectorHandleResolver {
+  @Override public Class<? extends ConnectorTableHandle> getTableHandleClass() {
+    return CarbondataTableHandle.class;
+  }
+
+  @Override public Class<? extends ConnectorTableLayoutHandle> getTableLayoutHandleClass() {
+    return CarbondataTableLayoutHandle.class;
+  }
+
+  @Override public Class<? extends ColumnHandle> getColumnHandleClass() {
+    return CarbondataColumnHandle.class;
+  }
+
+  @Override public Class<? extends ConnectorSplit> getSplitClass() {
+    return CarbondataSplit.class;
+  }
+
+  @Override public Class<? extends ConnectorTransactionHandle> getTransactionHandleClass() {
+    return CarbondataTransactionHandle.class;
+  }
+}
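Presto serializes connector handles to JSON when shipping work to workers, and the resolver above tells the engine which concrete class to deserialize each SPI handle interface back into. A simplified, non-Presto sketch of that role-to-class lookup:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch (not the Presto API): a handle resolver is essentially a
// lookup from a handle role to the connector's concrete class, which the engine
// uses to turn JSON-encoded handles back into typed objects.
final class HandleRegistrySketch {
  private final Map<String, Class<?>> roles = new HashMap<>();

  void register(String role, Class<?> concreteClass) {
    roles.put(role, concreteClass);
  }

  Class<?> resolve(String role) {
    Class<?> c = roles.get(role);
    if (c == null) {
      throw new IllegalArgumentException("no class registered for role: " + role);
    }
    return c;
  }
}
```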

http://git-wip-us.apache.org/repos/asf/carbondata/blob/53267c82/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataPlugin.java
----------------------------------------------------------------------
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataPlugin.java b/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataPlugin.java
new file mode 100755
index 0000000..191f13b
--- /dev/null
+++ b/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataPlugin.java
@@ -0,0 +1,34 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.carbondata.presto;
+
+import com.facebook.presto.spi.Plugin;
+import com.facebook.presto.spi.connector.ConnectorFactory;
+import com.google.common.collect.ImmutableList;
+import org.apache.carbondata.core.datastore.impl.FileFactory;
+
+public class CarbondataPlugin implements Plugin {
+
+  @Override public Iterable<ConnectorFactory> getConnectorFactories() {
+    return ImmutableList.of(new CarbondataConnectorFactory("carbondata", getClassLoader()));
+  }
+
+  private static ClassLoader getClassLoader() {
+    return FileFactory.class.getClassLoader();
+  }
+}

http://git-wip-us.apache.org/repos/asf/carbondata/blob/53267c82/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataSplit.java
----------------------------------------------------------------------
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataSplit.java b/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataSplit.java
new file mode 100755
index 0000000..ecc41ef
--- /dev/null
+++ b/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataSplit.java
@@ -0,0 +1,88 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.carbondata.presto;
+
+import org.apache.carbondata.presto.impl.CarbonLocalInputSplit;
+import com.facebook.presto.spi.ColumnHandle;
+import com.facebook.presto.spi.ConnectorSplit;
+import com.facebook.presto.spi.HostAddress;
+import com.facebook.presto.spi.SchemaTableName;
+import com.facebook.presto.spi.predicate.TupleDomain;
+import com.fasterxml.jackson.annotation.JsonCreator;
+import com.fasterxml.jackson.annotation.JsonProperty;
+import com.google.common.collect.ImmutableList;
+
+import java.util.List;
+
+import static java.util.Objects.requireNonNull;
+
+public class CarbondataSplit implements ConnectorSplit {
+
+  private final String connectorId;
+  private final SchemaTableName schemaTableName;
+  private final TupleDomain<ColumnHandle> constraints;
+  private final CarbonLocalInputSplit localInputSplit;
+  private final List<CarbondataColumnConstraint> rebuildConstraints;
+  private final ImmutableList<HostAddress> addresses;
+
+  @JsonCreator public CarbondataSplit(@JsonProperty("connectorId") String connectorId,
+      @JsonProperty("schemaTableName") SchemaTableName schemaTableName,
+      @JsonProperty("constraints") TupleDomain<ColumnHandle> constraints,
+      @JsonProperty("localInputSplit") CarbonLocalInputSplit localInputSplit,
+      @JsonProperty("rebuildConstraints") List<CarbondataColumnConstraint> rebuildConstraints) {
+    this.connectorId = requireNonNull(connectorId, "connectorId is null");
+    this.schemaTableName = requireNonNull(schemaTableName, "schemaTable is null");
+    this.constraints = requireNonNull(constraints, "constraints is null");
+    this.localInputSplit = requireNonNull(localInputSplit, "localInputSplit is null");
+    this.rebuildConstraints = requireNonNull(rebuildConstraints, "rebuildConstraints is null");
+    this.addresses = ImmutableList.of();
+  }
+
+  @JsonProperty public String getConnectorId() {
+    return connectorId;
+  }
+
+  @JsonProperty public SchemaTableName getSchemaTableName() {
+    return schemaTableName;
+  }
+
+  @JsonProperty public TupleDomain<ColumnHandle> getConstraints() {
+    return constraints;
+  }
+
+  @JsonProperty public CarbonLocalInputSplit getLocalInputSplit() {
+    return localInputSplit;
+  }
+
+  @JsonProperty public List<CarbondataColumnConstraint> getRebuildConstraints() {
+    return rebuildConstraints;
+  }
+
+  @Override public boolean isRemotelyAccessible() {
+    return true;
+  }
+
+  @Override public List<HostAddress> getAddresses() {
+    return addresses;
+  }
+
+  @Override public Object getInfo() {
+    return this;
+  }
+}
+
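A split that answers `isRemotelyAccessible() == true` and returns an empty address list, as `CarbondataSplit` does above, tells the scheduler it may run on any worker; returning concrete `HostAddress`es would instead request locality-aware placement. A dependency-free sketch of that shape (class and field names are illustrative):

```java
import java.util.Collections;
import java.util.List;
import java.util.Objects;

// Illustrative sketch (not the Presto API): an immutable split descriptor that
// opts out of locality scheduling by being remotely accessible with no
// preferred host addresses.
final class SplitSketch {
  private final String path;
  private final List<String> addresses;

  SplitSketch(String path) {
    this.path = Objects.requireNonNull(path, "path is null");
    // Empty list: no preferred hosts, any worker may process this split.
    this.addresses = Collections.emptyList();
  }

  boolean isRemotelyAccessible() {
    return true;
  }

  List<String> getAddresses() {
    return addresses;
  }

  String getPath() {
    return path;
  }
}
```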

http://git-wip-us.apache.org/repos/asf/carbondata/blob/53267c82/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataTableHandle.java
----------------------------------------------------------------------
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataTableHandle.java b/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataTableHandle.java
new file mode 100755
index 0000000..0a3c820
--- /dev/null
+++ b/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataTableHandle.java
@@ -0,0 +1,71 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.carbondata.presto;
+
+import com.facebook.presto.spi.ConnectorTableHandle;
+import com.facebook.presto.spi.SchemaTableName;
+import com.fasterxml.jackson.annotation.JsonCreator;
+import com.fasterxml.jackson.annotation.JsonProperty;
+import com.google.common.base.Joiner;
+
+import java.util.Objects;
+
+import static java.util.Locale.ENGLISH;
+import static java.util.Objects.requireNonNull;
+
+public class CarbondataTableHandle implements ConnectorTableHandle {
+
+  private final String connectorId;
+  private final SchemaTableName schemaTableName;
+
+  @JsonCreator public CarbondataTableHandle(@JsonProperty("connectorId") String connectorId,
+      @JsonProperty("schemaTableName") SchemaTableName schemaTableName) {
+    this.connectorId = requireNonNull(connectorId, "connectorId is null").toLowerCase(ENGLISH);
+    this.schemaTableName = requireNonNull(schemaTableName, "schemaTableName is null");
+  }
+
+  @JsonProperty public String getConnectorId() {
+    return connectorId;
+  }
+
+  @JsonProperty public SchemaTableName getSchemaTableName() {
+    return schemaTableName;
+  }
+
+  @Override public int hashCode() {
+    return Objects.hash(connectorId, schemaTableName);
+  }
+
+  @Override public boolean equals(Object obj) {
+    if (this == obj) {
+      return true;
+    }
+    if ((obj == null) || (getClass() != obj.getClass())) {
+      return false;
+    }
+
+    CarbondataTableHandle other = (CarbondataTableHandle) obj;
+    return Objects.equals(this.connectorId, other.connectorId)
+        && Objects.equals(this.schemaTableName, other.schemaTableName);
+  }
+
+  @Override public String toString() {
+    return Joiner.on(":").join(connectorId, schemaTableName.toString());
+  }
+
+}

http://git-wip-us.apache.org/repos/asf/carbondata/blob/53267c82/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataTableLayoutHandle.java
----------------------------------------------------------------------
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataTableLayoutHandle.java b/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataTableLayoutHandle.java
new file mode 100755
index 0000000..bf6318f
--- /dev/null
+++ b/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataTableLayoutHandle.java
@@ -0,0 +1,71 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.carbondata.presto;
+
+import com.facebook.presto.spi.ColumnHandle;
+import com.facebook.presto.spi.ConnectorTableLayoutHandle;
+import com.facebook.presto.spi.predicate.TupleDomain;
+import com.fasterxml.jackson.annotation.JsonCreator;
+import com.fasterxml.jackson.annotation.JsonProperty;
+
+import java.util.Objects;
+
+// Guava 18 (the version declared in this module's pom) exposes toStringHelper on
+// com.google.common.base.Objects; later Guava versions move it to MoreObjects.
+import static com.google.common.base.Objects.toStringHelper;
+import static java.util.Objects.requireNonNull;
+
+public class CarbondataTableLayoutHandle implements ConnectorTableLayoutHandle {
+  private final CarbondataTableHandle table;
+  private final TupleDomain<ColumnHandle> constraint;
+
+  @JsonCreator
+  public CarbondataTableLayoutHandle(@JsonProperty("table") CarbondataTableHandle table,
+      @JsonProperty("constraint") TupleDomain<ColumnHandle> constraint) {
+    this.table = requireNonNull(table, "table is null");
+    this.constraint = requireNonNull(constraint, "constraint is null");
+  }
+
+  @JsonProperty public CarbondataTableHandle getTable() {
+    return table;
+  }
+
+  @JsonProperty public TupleDomain<ColumnHandle> getConstraint() {
+    return constraint;
+  }
+
+  @Override public boolean equals(Object obj) {
+    if (this == obj) {
+      return true;
+    }
+
+    if (obj == null || getClass() != obj.getClass()) {
+      return false;
+    }
+
+    CarbondataTableLayoutHandle other = (CarbondataTableLayoutHandle) obj;
+    return Objects.equals(table, other.table) && Objects.equals(constraint, other.constraint);
+  }
+
+  @Override public int hashCode() {
+    return Objects.hash(table, constraint);
+  }
+
+  @Override public String toString() {
+    return toStringHelper(this).add("table", table).add("constraint", constraint).toString();
+  }
+}

http://git-wip-us.apache.org/repos/asf/carbondata/blob/53267c82/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataTransactionHandle.java
----------------------------------------------------------------------
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataTransactionHandle.java b/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataTransactionHandle.java
new file mode 100755
index 0000000..e95c490
--- /dev/null
+++ b/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataTransactionHandle.java
@@ -0,0 +1,24 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.carbondata.presto;
+
+import com.facebook.presto.spi.connector.ConnectorTransactionHandle;
+
+public enum CarbondataTransactionHandle implements ConnectorTransactionHandle {
+  INSTANCE
+}
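The transaction handle is a single stateless enum constant — the standard Java enum-singleton idiom, which keeps identity stable across serialization so every "transaction" the engine observes is the same object. A minimal standalone sketch:

```java
// Sketch of the enum-singleton idiom used for stateless handles: the JVM
// guarantees exactly one INSTANCE, and it survives serialization round-trips
// without spawning duplicates.
enum StatelessHandleSketch {
  INSTANCE;

  String describe() {
    return "read-only, no per-transaction state";
  }
}
```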

http://git-wip-us.apache.org/repos/asf/carbondata/blob/53267c82/integration/presto/src/main/java/org/apache/carbondata/presto/Types.java
----------------------------------------------------------------------
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/Types.java b/integration/presto/src/main/java/org/apache/carbondata/presto/Types.java
new file mode 100755
index 0000000..cb30907
--- /dev/null
+++ b/integration/presto/src/main/java/org/apache/carbondata/presto/Types.java
@@ -0,0 +1,35 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.carbondata.presto;
+
+import java.util.Locale;
+
+import static com.google.common.base.Preconditions.checkArgument;
+import static java.util.Objects.requireNonNull;
+
+public class Types {
+  private Types() {
+  }
+
+  public static <A, B extends A> B checkType(A value, Class<B> target, String name) {
+    requireNonNull(value, String.format(Locale.ENGLISH, "%s is null", name));
+    checkArgument(target.isInstance(value), "%s must be of type %s, not %s", name, target.getName(),
+        value.getClass().getName());
+    return target.cast(value);
+  }
+}
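`checkType` folds a null check and a checked downcast into one call with descriptive error messages; connectors typically use it to narrow SPI handles (e.g. a `ConnectorTableHandle`) back to their own concrete types. A standalone copy under a hypothetical class name, runnable outside the project:

```java
import java.util.Locale;

// Standalone re-implementation of the checkType helper shown in the diff above;
// the class name is hypothetical, the behavior mirrors the original.
final class CheckTypeSketch {
  private CheckTypeSketch() {
  }

  // Narrows `value` to `target`, failing fast with the offending name and types.
  static <A, B extends A> B checkType(A value, Class<B> target, String name) {
    if (value == null) {
      throw new NullPointerException(String.format(Locale.ENGLISH, "%s is null", name));
    }
    if (!target.isInstance(value)) {
      throw new IllegalArgumentException(String.format(Locale.ENGLISH,
          "%s must be of type %s, not %s", name, target.getName(),
          value.getClass().getName()));
    }
    return target.cast(value);
  }
}
```

In the connector this would typically be invoked as `checkType(tableHandle, CarbondataTableHandle.class, "tableHandle")`, producing a clear error if a foreign handle leaks in.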

http://git-wip-us.apache.org/repos/asf/carbondata/blob/53267c82/integration/presto/src/main/java/org/apache/carbondata/presto/impl/CarbonTableReader.java
----------------------------------------------------------------------
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/impl/CarbonTableReader.java b/integration/presto/src/main/java/org/apache/carbondata/presto/impl/CarbonTableReader.java
index b6e45d6..0461603 100755
--- a/integration/presto/src/main/java/org/apache/carbondata/presto/impl/CarbonTableReader.java
+++ b/integration/presto/src/main/java/org/apache/carbondata/presto/impl/CarbonTableReader.java
@@ -608,7 +608,9 @@ public class CarbonTableReader {
 
       // Add all blocks of btree into result
       DataRefNodeFinder blockFinder =
-          new BTreeDataRefNodeFinder(segmentProperties.getEachDimColumnValueSize());
+          new BTreeDataRefNodeFinder(segmentProperties.getEachDimColumnValueSize(),
+              segmentProperties.getNumberOfSortColumns(),
+              segmentProperties.getNumberOfNoDictSortColumns());
       DataRefNode startBlock =
           blockFinder.findFirstDataBlock(abstractIndex.getDataRefNode(), startIndexKey);
       DataRefNode endBlock =

http://git-wip-us.apache.org/repos/asf/carbondata/blob/53267c82/integration/spark/src/test/scala/org/apache/carbondata/spark/testsuite/allqueries/InsertIntoCarbonTableSpark1TestCase.scala
----------------------------------------------------------------------
diff --git a/integration/spark/src/test/scala/org/apache/carbondata/spark/testsuite/allqueries/InsertIntoCarbonTableSpark1TestCase.scala b/integration/spark/src/test/scala/org/apache/carbondata/spark/testsuite/allqueries/InsertIntoCarbonTableSpark1TestCase.scala
new file mode 100644
index 0000000..4261d9b
--- /dev/null
+++ b/integration/spark/src/test/scala/org/apache/carbondata/spark/testsuite/allqueries/InsertIntoCarbonTableSpark1TestCase.scala
@@ -0,0 +1,81 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.carbondata.spark.testsuite.allqueries
+
+import org.apache.spark.sql.common.util.QueryTest
+import org.scalatest.BeforeAndAfterAll
+
+import org.apache.carbondata.core.constants.CarbonCommonConstants
+import org.apache.carbondata.core.util.CarbonProperties
+
+class InsertIntoCarbonTableSpark1TestCase extends QueryTest with BeforeAndAfterAll {
+  override def beforeAll {
+    sql("drop table if exists THive")
+    sql("create table THive (imei string,deviceInformationId int,MAC string,deviceColor string,device_backColor string,modelId string,marketName string,AMSize string,ROMSize string,CUPAudit string,CPIClocked string,series string,productionDate timestamp,bomCode string,internalModels string, deliveryTime string, channelsId string, channelsName string , deliveryAreaId string, deliveryCountry string, deliveryProvince string, deliveryCity string,deliveryDistrict string, deliveryStreet string, oxSingleNumber string, ActiveCheckTime string, ActiveAreaId string, ActiveCountry string, ActiveProvince string, Activecity string, ActiveDistrict string, ActiveStreet string, ActiveOperatorId string, Active_releaseId string, Active_EMUIVersion string, Active_operaSysVersion string, Active_BacVerNumber string, Active_BacFlashVer string, Active_webUIVersion string, Active_webUITypeCarrVer string,Active_webTypeDataVerNumber string, Active_operatorsVersion string, Active_phonePADPartitionedVersions string, Latest_YEAR int, Latest_MONTH int, Latest_DAY Decimal(30,10), Latest_HOUR string, Latest_areaId string, Latest_country string, Latest_province string, Latest_city string, Latest_district string, Latest_street string, Latest_releaseId string, Latest_EMUIVersion string, Latest_operaSysVersion string, Latest_BacVerNumber string, Latest_BacFlashVer string, Latest_webUIVersion string, Latest_webUITypeCarrVer string, Latest_webTypeDataVerNumber string, Latest_operatorsVersion string, Latest_phonePADPartitionedVersions string, Latest_operatorId string, gamePointDescription string,gamePointId double,contractNumber BigInt) ROW FORMAT DELIMITED FIELDS TERMINATED BY ','")
+    sql(s"LOAD DATA local INPATH '$resourcesPath/100_olap.csv' INTO TABLE THive")
+  }
+
+
+  test("insert from carbon-select columns-source table has more columns than target table") {
+    val timeStampPropOrig = CarbonProperties.getInstance().getProperty(CarbonCommonConstants.CARBON_TIMESTAMP_FORMAT)
+    CarbonProperties.getInstance()
+      .addProperty(CarbonCommonConstants.CARBON_TIMESTAMP_FORMAT, CarbonCommonConstants.CARBON_TIMESTAMP_DEFAULT_FORMAT)
+
+    sql("drop table if exists load")
+    sql("drop table if exists inser")
+    sql("CREATE TABLE load(imei string,age int,task bigint,num double,level decimal(10,3),productdate timestamp,name string,point int) STORED BY 'org.apache.carbondata.format'")
+    sql("LOAD DATA INPATH '" + resourcesPath + "/shortolap.csv' INTO TABLE load options ('DELIMITER'=',', 'QUOTECHAR'='\"','FILEHEADER' = 'imei,age,task,num,level,productdate,name,point')")
+    sql("CREATE TABLE inser(imei string,age int,task bigint,num double,level decimal(10,3),productdate timestamp) STORED BY 'org.apache.carbondata.format'")
+    sql("insert into inser select * from load")
+    checkAnswer(
+      sql("select * from inser"),
+      sql("select imei,age,task,num,level,productdate from load")
+    )
+    sql("drop table if exists load")
+    sql("drop table if exists inser")
+    CarbonProperties.getInstance().addProperty(CarbonCommonConstants.CARBON_TIMESTAMP_FORMAT, timeStampPropOrig)
+  }
+
+  test("insert->hive column more than carbon column->success") {
+    sql("drop table if exists TCarbon")
+    sql("create table TCarbon (imei string,deviceInformationId int,MAC string,deviceColor string,gamePointId double,contractNumber BigInt) STORED BY 'org.apache.carbondata.format'")
+
+    sql("insert into TCarbon select imei,deviceInformationId,MAC,deviceColor,gamePointId,contractNumber,device_backColor,modelId,CUPAudit,CPIClocked from THive")
+    checkAnswer(
+      sql("select imei,deviceInformationId,MAC,deviceColor,gamePointId,contractNumber from THive"),
+      sql("select imei,deviceInformationId,MAC,deviceColor,gamePointId,contractNumber from TCarbon")
+    )
+  }
+
+//  test("insert->insert empty data -pass") {
+//     sql("drop table if exists TCarbon")
+//     sql("create table TCarbon (imei string,deviceInformationId int,MAC string) STORED BY 'org.apache.carbondata.format'")
+//     sql("insert into TCarbon select imei,deviceInformationId,MAC from THive where MAC='wrongdata'")
+//     val result = sql("select imei,deviceInformationId,MAC from TCarbon where MAC='wrongdata'").collect()
+//     checkAnswer(
+//         sql("select imei,deviceInformationId,MAC from THive where MAC='wrongdata'"),
+//         sql("select imei,deviceInformationId,MAC from TCarbon where MAC='wrongdata'")
+//     )
+//  }
+
+  override def afterAll {
+    sql("drop table if exists load")
+    sql("drop table if exists inser")
+    sql("DROP TABLE IF EXISTS THive")
+    sql("DROP TABLE IF EXISTS TCarbon")
+  }
+}

http://git-wip-us.apache.org/repos/asf/carbondata/blob/53267c82/integration/spark/src/test/scala/org/apache/carbondata/spark/testsuite/allqueries/InsertIntoCarbonTableTestCase.scala
----------------------------------------------------------------------
diff --git a/integration/spark/src/test/scala/org/apache/carbondata/spark/testsuite/allqueries/InsertIntoCarbonTableTestCase.scala b/integration/spark/src/test/scala/org/apache/carbondata/spark/testsuite/allqueries/InsertIntoCarbonTableTestCase.scala
deleted file mode 100644
index 5bf598e..0000000
--- a/integration/spark/src/test/scala/org/apache/carbondata/spark/testsuite/allqueries/InsertIntoCarbonTableTestCase.scala
+++ /dev/null
@@ -1,81 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements.  See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License.  You may obtain a copy of the License at
- *
- *    http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-package org.apache.carbondata.spark.testsuite.allqueries
-
-import org.apache.spark.sql.common.util.QueryTest
-import org.scalatest.BeforeAndAfterAll
-
-import org.apache.carbondata.core.constants.CarbonCommonConstants
-import org.apache.carbondata.core.util.CarbonProperties
-
-class InsertIntoCarbonTableTestCase extends QueryTest with BeforeAndAfterAll {
-  override def beforeAll {
-    sql("drop table if exists THive")
-    sql("create table THive (imei string,deviceInformationId int,MAC string,deviceColor string,device_backColor string,modelId string,marketName string,AMSize string,ROMSize string,CUPAudit string,CPIClocked string,series string,productionDate timestamp,bomCode string,internalModels string, deliveryTime string, channelsId string, channelsName string , deliveryAreaId string, deliveryCountry string, deliveryProvince string, deliveryCity string,deliveryDistrict string, deliveryStreet string, oxSingleNumber string, ActiveCheckTime string, ActiveAreaId string, ActiveCountry string, ActiveProvince string, Activecity string, ActiveDistrict string, ActiveStreet string, ActiveOperatorId string, Active_releaseId string, Active_EMUIVersion string, Active_operaSysVersion string, Active_BacVerNumber string, Active_BacFlashVer string, Active_webUIVersion string, Active_webUITypeCarrVer string,Active_webTypeDataVerNumber string, Active_operatorsVersion string, Active_phonePADPartitionedVersions string, Latest_YEAR int, Latest_MONTH int, Latest_DAY Decimal(30,10), Latest_HOUR string, Latest_areaId string, Latest_country string, Latest_province string, Latest_city string, Latest_district string, Latest_street string, Latest_releaseId string, Latest_EMUIVersion string, Latest_operaSysVersion string, Latest_BacVerNumber string, Latest_BacFlashVer string, Latest_webUIVersion string, Latest_webUITypeCarrVer string, Latest_webTypeDataVerNumber string, Latest_operatorsVersion string, Latest_phonePADPartitionedVersions string, Latest_operatorId string, gamePointDescription string,gamePointId double,contractNumber BigInt) ROW FORMAT DELIMITED FIELDS TERMINATED BY ','")
-    sql(s"LOAD DATA local INPATH '$resourcesPath/100_olap.csv' INTO TABLE THive")
-  }
-
-
-  test("insert from carbon-select columns-source table has more column then target column") {
-    val timeStampPropOrig = CarbonProperties.getInstance().getProperty(CarbonCommonConstants.CARBON_TIMESTAMP_FORMAT)
-     CarbonProperties.getInstance()
-      .addProperty(CarbonCommonConstants.CARBON_TIMESTAMP_FORMAT,CarbonCommonConstants.CARBON_TIMESTAMP_DEFAULT_FORMAT)
-     
-     sql("drop table if exists load")
-     sql("drop table if exists inser")
-     sql("CREATE TABLE load(imei string,age int,task bigint,num double,level decimal(10,3),productdate timestamp,name string,point int)STORED BY 'org.apache.carbondata.format'")
-     sql("LOAD DATA INPATH '" + resourcesPath + "/shortolap.csv' INTO TABLE load options ('DELIMITER'=',', 'QUOTECHAR'='\"','FILEHEADER' = 'imei,age,task,num,level,productdate,name,point')")
-     sql("CREATE TABLE inser(imei string,age int,task bigint,num double,level decimal(10,3),productdate timestamp)STORED BY 'org.apache.carbondata.format'")
-     sql("insert into inser select * from load")
-     checkAnswer(
-         sql("select * from inser"),
-         sql("select imei,age,task,num,level,productdate from load")
-     ) 
-     sql("drop table if exists load")
-     sql("drop table if exists inser")
-     CarbonProperties.getInstance().addProperty(CarbonCommonConstants.CARBON_TIMESTAMP_FORMAT, timeStampPropOrig)
-  }
-
-  test("insert->hive column more than carbon column->success") {
-     sql("drop table if exists TCarbon")
-     sql("create table TCarbon (imei string,deviceInformationId int,MAC string,deviceColor string,gamePointId double,contractNumber BigInt) STORED BY 'org.apache.carbondata.format'")
-  
-     sql("insert into TCarbon select imei,deviceInformationId,MAC,deviceColor,gamePointId,contractNumber,device_backColor,modelId,CUPAudit,CPIClocked from THive")
-     checkAnswer(
-         sql("select imei,deviceInformationId,MAC,deviceColor,gamePointId,contractNumber from THive"),
-         sql("select imei,deviceInformationId,MAC,deviceColor,gamePointId,contractNumber from TCarbon")
-     )
-  }
-
-//  test("insert->insert empty data -pass") {
-//     sql("drop table if exists TCarbon")
-//     sql("create table TCarbon (imei string,deviceInformationId int,MAC string) STORED BY 'org.apache.carbondata.format'")
-//     sql("insert into TCarbon select imei,deviceInformationId,MAC from THive where MAC='wrongdata'")
-//     val result = sql("select imei,deviceInformationId,MAC from TCarbon where MAC='wrongdata'").collect()
-//     checkAnswer(
-//         sql("select imei,deviceInformationId,MAC from THive where MAC='wrongdata'"),
-//         sql("select imei,deviceInformationId,MAC from TCarbon where MAC='wrongdata'")
-//     )
-//  }
-
-  override def afterAll {
-    sql("drop table if exists load")
-    sql("drop table if exists inser")
-    sql("DROP TABLE IF EXISTS THive")
-    sql("DROP TABLE IF EXISTS TCarbon")
-  }
-}

http://git-wip-us.apache.org/repos/asf/carbondata/blob/53267c82/integration/spark2/src/test/scala/org/apache/carbondata/spark/testsuite/allqueries/InsertIntoCarbonTableSpark2TestCase.scala
----------------------------------------------------------------------
diff --git a/integration/spark2/src/test/scala/org/apache/carbondata/spark/testsuite/allqueries/InsertIntoCarbonTableSpark2TestCase.scala b/integration/spark2/src/test/scala/org/apache/carbondata/spark/testsuite/allqueries/InsertIntoCarbonTableSpark2TestCase.scala
new file mode 100644
index 0000000..0dae268
--- /dev/null
+++ b/integration/spark2/src/test/scala/org/apache/carbondata/spark/testsuite/allqueries/InsertIntoCarbonTableSpark2TestCase.scala
@@ -0,0 +1,40 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.carbondata.spark.testsuite.allqueries
+
+import org.apache.spark.sql.Row
+import org.apache.spark.sql.common.util.QueryTest
+import org.scalatest.BeforeAndAfterAll
+
+import org.apache.carbondata.core.constants.CarbonCommonConstants
+import org.apache.carbondata.core.util.CarbonProperties
+
+class InsertIntoCarbonTableSpark2TestCase extends QueryTest with BeforeAndAfterAll {
+  override def beforeAll: Unit = {
+    sql("drop table if exists OneRowTable")
+  }
+
+  test("insert select one row") {
+    sql("create table OneRowTable(col1 string, col2 string, col3 int, col4 double) stored by 'carbondata'")
+    sql("insert into OneRowTable select '0.1', 'a.b', 1, 1.2")
+    checkAnswer(sql("select * from OneRowTable"), Seq(Row("0.1", "a.b", 1, 1.2)))
+  }
+
+  override def afterAll {
+    sql("drop table if exists OneRowTable")
+  }
+}

http://git-wip-us.apache.org/repos/asf/carbondata/blob/53267c82/integration/spark2/src/test/scala/org/apache/carbondata/spark/testsuite/allqueries/InsertIntoCarbonTableTestCase.scala
----------------------------------------------------------------------
diff --git a/integration/spark2/src/test/scala/org/apache/carbondata/spark/testsuite/allqueries/InsertIntoCarbonTableTestCase.scala b/integration/spark2/src/test/scala/org/apache/carbondata/spark/testsuite/allqueries/InsertIntoCarbonTableTestCase.scala
deleted file mode 100644
index 0020568..0000000
--- a/integration/spark2/src/test/scala/org/apache/carbondata/spark/testsuite/allqueries/InsertIntoCarbonTableTestCase.scala
+++ /dev/null
@@ -1,40 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements.  See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License.  You may obtain a copy of the License at
- *
- *    http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-package org.apache.carbondata.spark.testsuite.allqueries
-
-import org.apache.spark.sql.Row
-import org.apache.spark.sql.common.util.QueryTest
-import org.scalatest.BeforeAndAfterAll
-
-import org.apache.carbondata.core.constants.CarbonCommonConstants
-import org.apache.carbondata.core.util.CarbonProperties
-
-class InsertIntoCarbonTableTestCase extends QueryTest with BeforeAndAfterAll {
-  override def beforeAll: Unit = {
-    sql("drop table if exists OneRowTable")
-  }
-
-  test("insert select one row") {
-    sql("create table OneRowTable(col1 string, col2 string, col3 int, col4 double) stored by 'carbondata'")
-    sql("insert into OneRowTable select '0.1', 'a.b', 1, 1.2")
-    checkAnswer(sql("select * from OneRowTable"), Seq(Row("0.1", "a.b", 1, 1.2)))
-  }
-
-  override def afterAll {
-    sql("drop table if exists OneRowTable")
-  }
-}

http://git-wip-us.apache.org/repos/asf/carbondata/blob/53267c82/pom.xml
----------------------------------------------------------------------
diff --git a/pom.xml b/pom.xml
index 7193a78..3152e45 100644
--- a/pom.xml
+++ b/pom.xml
@@ -102,6 +102,9 @@
     <module>integration/spark-common</module>
     <module>integration/spark-common-test</module>
     <module>assembly</module>
+    <module>examples/flink</module>
+    <module>integration/hive</module>
+
   </modules>
 
   <properties>
@@ -128,10 +131,6 @@
       <id>pentaho-releases</id>
       <url>http://repository.pentaho.org/artifactory/repo/</url>
     </repository>
-    <repository>
-      <id>carbondata-releases</id>
-      <url>http://136.243.101.176:9091/repository/carbondata/</url>
-    </repository>
   </repositories>
 
   <dependencyManagement>
@@ -317,7 +316,6 @@
         <module>examples/spark</module>
         <module>integration/spark2</module>
         <module>examples/spark2</module>
-        <module>examples/flink</module>
       </modules>
     </profile>
     <profile>
@@ -373,16 +371,6 @@
       </modules>
     </profile>
     <profile>
-      <id>flink</id>
-      <properties>
-        <flink.version>1.1.4</flink.version>
-        <scala.binary.version>2.10</scala.binary.version>
-      </properties>
-      <modules>
-        <module>examples/flink</module>
-      </modules>
-    </profile>
-    <profile>
       <id>findbugs</id>
       <build>
         <plugins>

