hive-issues mailing list archives

From "Hive QA (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HIVE-11233) Include Apache Phoenix support in HBaseStorageHandler
Date Sun, 22 May 2016 02:26:12 GMT

    [ https://issues.apache.org/jira/browse/HIVE-11233?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15295362#comment-15295362 ]

Hive QA commented on HIVE-11233:
--------------------------------



Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12805451/HIVE-11233.2.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: http://ec2-54-177-240-2.us-west-1.compute.amazonaws.com/job/PreCommit-HIVE-MASTER-Build/354/testReport
Console output: http://ec2-54-177-240-2.us-west-1.compute.amazonaws.com/job/PreCommit-HIVE-MASTER-Build/354/console
Test logs: http://ec2-50-18-27-0.us-west-1.compute.amazonaws.com/logs/PreCommit-HIVE-MASTER-Build-354/

Messages:
{noformat}
**** This message was trimmed, see log for full details ****
[INFO] Installing /data/hive-ptest/working/apache-github-source-source/common/target/hive-common-2.1.0-SNAPSHOT-tests.jar to /home/hiveptest/.m2/repository/org/apache/hive/hive-common/2.1.0-SNAPSHOT/hive-common-2.1.0-SNAPSHOT-tests.jar
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Service RPC 2.1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-service-rpc ---
[INFO] Deleting /data/hive-ptest/working/apache-github-source-source/service-rpc/target
[INFO] Deleting /data/hive-ptest/working/apache-github-source-source/service-rpc (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-no-snapshots) @ hive-service-rpc ---
[INFO] 
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-service-rpc ---
[INFO] Source directory: /data/hive-ptest/working/apache-github-source-source/service-rpc/src/gen/thrift/gen-javabean added.
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hive-service-rpc ---
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hive-service-rpc ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-github-source-source/service-rpc/src/main/resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-service-rpc ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-service-rpc ---
[INFO] Compiling 87 source files to /data/hive-ptest/working/apache-github-source-source/service-rpc/target/classes
[INFO] 
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ hive-service-rpc ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-github-source-source/service-rpc/src/test/resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-service-rpc ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-github-source-source/service-rpc/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-github-source-source/service-rpc/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-github-source-source/service-rpc/target/tmp/conf
     [copy] Copying 15 files to /data/hive-ptest/working/apache-github-source-source/service-rpc/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-service-rpc ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-service-rpc ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-service-rpc ---
[INFO] Building jar: /data/hive-ptest/working/apache-github-source-source/service-rpc/target/hive-service-rpc-2.1.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ hive-service-rpc ---
[INFO] 
[INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-service-rpc ---
[INFO] Building jar: /data/hive-ptest/working/apache-github-source-source/service-rpc/target/hive-service-rpc-2.1.0-SNAPSHOT-tests.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-service-rpc ---
[INFO] Installing /data/hive-ptest/working/apache-github-source-source/service-rpc/target/hive-service-rpc-2.1.0-SNAPSHOT.jar to /home/hiveptest/.m2/repository/org/apache/hive/hive-service-rpc/2.1.0-SNAPSHOT/hive-service-rpc-2.1.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-github-source-source/service-rpc/pom.xml to /home/hiveptest/.m2/repository/org/apache/hive/hive-service-rpc/2.1.0-SNAPSHOT/hive-service-rpc-2.1.0-SNAPSHOT.pom
[INFO] Installing /data/hive-ptest/working/apache-github-source-source/service-rpc/target/hive-service-rpc-2.1.0-SNAPSHOT-tests.jar to /home/hiveptest/.m2/repository/org/apache/hive/hive-service-rpc/2.1.0-SNAPSHOT/hive-service-rpc-2.1.0-SNAPSHOT-tests.jar
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Serde 2.1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-serde ---
[INFO] Deleting /data/hive-ptest/working/apache-github-source-source/serde/target
[INFO] Deleting /data/hive-ptest/working/apache-github-source-source/serde (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-no-snapshots) @ hive-serde ---
[INFO] 
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-serde ---
[INFO] Source directory: /data/hive-ptest/working/apache-github-source-source/serde/src/gen/protobuf/gen-java added.
[INFO] Source directory: /data/hive-ptest/working/apache-github-source-source/serde/src/gen/thrift/gen-javabean added.
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hive-serde ---
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hive-serde ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-github-source-source/serde/src/main/resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-serde ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-serde ---
[INFO] Compiling 417 source files to /data/hive-ptest/working/apache-github-source-source/serde/target/classes
[INFO] -------------------------------------------------------------
[WARNING] COMPILATION WARNING : 
[INFO] -------------------------------------------------------------
[WARNING] /data/hive-ptest/working/apache-github-source-source/serde/src/java/org/apache/hadoop/hive/serde2/SerDe.java: Some input files use or override a deprecated API.
[WARNING] /data/hive-ptest/working/apache-github-source-source/serde/src/java/org/apache/hadoop/hive/serde2/SerDe.java: Recompile with -Xlint:deprecation for details.
[WARNING] /data/hive-ptest/working/apache-github-source-source/serde/src/java/org/apache/hadoop/hive/serde2/lazy/objectinspector/primitive/AbstractPrimitiveLazyObjectInspector.java: Some input files use unchecked or unsafe operations.
[WARNING] /data/hive-ptest/working/apache-github-source-source/serde/src/java/org/apache/hadoop/hive/serde2/lazy/objectinspector/primitive/AbstractPrimitiveLazyObjectInspector.java: Recompile with -Xlint:unchecked for details.
[INFO] 4 warnings 
[INFO] -------------------------------------------------------------
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR : 
[INFO] -------------------------------------------------------------
[ERROR] /data/hive-ptest/working/apache-github-source-source/serde/src/java/org/apache/hadoop/hive/serde2/lazydio/LazyDioTimestamp.java:[24,43] cannot find symbol
  symbol:   variable BYTES
  location: class java.lang.Long
[ERROR] /data/hive-ptest/working/apache-github-source-source/serde/src/java/org/apache/hadoop/hive/serde2/lazydio/LazyDioTimestamp.java:[25,66] cannot find symbol
  symbol:   variable BYTES
  location: class java.lang.Long
[ERROR] /data/hive-ptest/working/apache-github-source-source/serde/src/java/org/apache/hadoop/hive/serde2/lazydio/LazyDioTimestamp.java:[29,49] cannot find symbol
  symbol:   variable BYTES
  location: class java.lang.Integer
[ERROR] /data/hive-ptest/working/apache-github-source-source/serde/src/java/org/apache/hadoop/hive/serde2/lazydio/LazyDioTimestamp.java:[30,45] cannot find symbol
  symbol:   variable BYTES
  location: class java.lang.Long
[ERROR] /data/hive-ptest/working/apache-github-source-source/serde/src/java/org/apache/hadoop/hive/serde2/lazydio/LazyDioTimestamp.java:[30,81] cannot find symbol
  symbol:   variable BYTES
  location: class java.lang.Integer
[INFO] 5 errors 
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Hive .............................................. SUCCESS [4.718s]
[INFO] Hive Shims Common ................................. SUCCESS [4.320s]
[INFO] Hive Shims 0.23 ................................... SUCCESS [3.684s]
[INFO] Hive Shims Scheduler .............................. SUCCESS [0.867s]
[INFO] Hive Shims ........................................ SUCCESS [0.526s]
[INFO] Hive Storage API .................................. SUCCESS [1.556s]
[INFO] Hive ORC .......................................... SUCCESS [4.297s]
[INFO] Hive Common ....................................... SUCCESS [6.775s]
[INFO] Hive Service RPC .................................. SUCCESS [2.851s]
[INFO] Hive Serde ........................................ FAILURE [2.645s]
[INFO] Hive Metastore .................................... SKIPPED
[INFO] Hive Ant Utilities ................................ SKIPPED
[INFO] Hive Llap Common .................................. SKIPPED
[INFO] Hive Llap Client .................................. SKIPPED
[INFO] Hive Llap Tez ..................................... SKIPPED
[INFO] Spark Remote Client ............................... SKIPPED
[INFO] Hive Query Language ............................... SKIPPED
[INFO] Hive Llap Server .................................. SKIPPED
[INFO] Hive Service ...................................... SKIPPED
[INFO] Hive Accumulo Handler ............................. SKIPPED
[INFO] Hive JDBC ......................................... SKIPPED
[INFO] Hive Beeline ...................................... SKIPPED
[INFO] Hive CLI .......................................... SKIPPED
[INFO] Hive Contrib ...................................... SKIPPED
[INFO] Hive HBase Handler ................................ SKIPPED
[INFO] Hive HCatalog ..................................... SKIPPED
[INFO] Hive HCatalog Core ................................ SKIPPED
[INFO] Hive HCatalog Pig Adapter ......................... SKIPPED
[INFO] Hive HCatalog Server Extensions ................... SKIPPED
[INFO] Hive HCatalog Webhcat Java Client ................. SKIPPED
[INFO] Hive HCatalog Webhcat ............................. SKIPPED
[INFO] Hive HCatalog Streaming ........................... SKIPPED
[INFO] Hive HPL/SQL ...................................... SKIPPED
[INFO] Hive HWI .......................................... SKIPPED
[INFO] Hive Llap External Client ......................... SKIPPED
[INFO] Hive Shims Aggregator ............................. SKIPPED
[INFO] Hive TestUtils .................................... SKIPPED
[INFO] Hive Packaging .................................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 33.337s
[INFO] Finished at: Sun May 22 02:26:53 GMT 2016
[INFO] Final Memory: 71M/795M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "hadoop-1" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hive-serde: Compilation failure: Compilation failure:
[ERROR] /data/hive-ptest/working/apache-github-source-source/serde/src/java/org/apache/hadoop/hive/serde2/lazydio/LazyDioTimestamp.java:[24,43] cannot find symbol
[ERROR] symbol:   variable BYTES
[ERROR] location: class java.lang.Long
[ERROR] /data/hive-ptest/working/apache-github-source-source/serde/src/java/org/apache/hadoop/hive/serde2/lazydio/LazyDioTimestamp.java:[25,66] cannot find symbol
[ERROR] symbol:   variable BYTES
[ERROR] location: class java.lang.Long
[ERROR] /data/hive-ptest/working/apache-github-source-source/serde/src/java/org/apache/hadoop/hive/serde2/lazydio/LazyDioTimestamp.java:[29,49] cannot find symbol
[ERROR] symbol:   variable BYTES
[ERROR] location: class java.lang.Integer
[ERROR] /data/hive-ptest/working/apache-github-source-source/serde/src/java/org/apache/hadoop/hive/serde2/lazydio/LazyDioTimestamp.java:[30,45] cannot find symbol
[ERROR] symbol:   variable BYTES
[ERROR] location: class java.lang.Long
[ERROR] /data/hive-ptest/working/apache-github-source-source/serde/src/java/org/apache/hadoop/hive/serde2/lazydio/LazyDioTimestamp.java:[30,81] cannot find symbol
[ERROR] symbol:   variable BYTES
[ERROR] location: class java.lang.Integer
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hive-serde
+ exit 1
'
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12805451 - PreCommit-HIVE-MASTER-Build
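
A note on the failure above: all five {{cannot find symbol}} errors point at {{Long.BYTES}} and {{Integer.BYTES}} in {{LazyDioTimestamp.java}}. Those constants were only introduced in Java 8, so the build error is consistent with the patch being compiled on an older JDK rather than with a logic problem in the change itself. A minimal sketch of a pre-Java-8 equivalent (the class and constant names here are illustrative, not taken from the patch):

{code}
// Hypothetical Java-7-compatible stand-ins for Long.BYTES / Integer.BYTES,
// which only exist since Java 8. Names are illustrative only.
public final class ByteWidths {
    // Width of a long in bytes (8), computed from constants available since Java 5.
    public static final int LONG_BYTES = Long.SIZE / Byte.SIZE;
    // Width of an int in bytes (4).
    public static final int INT_BYTES = Integer.SIZE / Byte.SIZE;

    private ByteWidths() {
        // constants-only holder, not meant to be instantiated
    }
}
{code}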

> Include Apache Phoenix support in HBaseStorageHandler
> -----------------------------------------------------
>
>                 Key: HIVE-11233
>                 URL: https://issues.apache.org/jira/browse/HIVE-11233
>             Project: Hive
>          Issue Type: New Feature
>          Components: HBase Handler
>    Affects Versions: 1.2.1, 2.0.0
>            Reporter: Svetozar Ivanov
>            Assignee: Svetozar Ivanov
>              Labels: Binary, Hbase, Numeric, Phoenix, Sortable
>         Attachments: HIVE-11233-branch-1.2.patch, HIVE-11233-branch-2.0.patch, HIVE-11233.1.patch, HIVE-11233.2.patch, HIVE-11233.patch
>
>
> Currently HBaseStorageHandler doesn't provide a mechanism for storing binary sortable keys and values. Such a mechanism is necessary when a given HBase table is used for persistence by both Apache Hive and Apache Phoenix; that way, all byte arrays read or written by Hive will be compatible with the binary sortable format used by Phoenix.
> It turns out the major difference is in the numeric data types, according to the officially provided documentation: https://phoenix.apache.org/language/datatypes.html.
> That's how I'm using it in my code:
> {code}
>     private static String buildWithSerDeProperties(TableDescriptor tableDescriptor) {
>         Map<String, String> serdePropertiesMap = new HashMap<>();
>         serdePropertiesMap.put(HBaseSerDe.HBASE_TABLE_NAME, tableDescriptor.getTableName());
>         serdePropertiesMap.put(HBaseSerDe.HBASE_TABLE_DEFAULT_STORAGE_TYPE, BINARY_STORAGE_TYPE);
>         serdePropertiesMap.put(HBaseSerDe.HBASE_COLUMNS_MAPPING, buildHBaseColumnsDefinition(tableDescriptor));
>         serdePropertiesMap.put(HBaseSerDe.HBASE_VALUE_FACTORY_CLASS, PhoenixValueFactory.class.getName());
>         /* Use different key factory for simple and composite primary key */
>         if (tableDescriptor.getPkDescriptors().size() == 1) {
>             serdePropertiesMap.put(HBaseSerDe.HBASE_KEY_FACTORY_CLASS, PhoenixKeyFactory.class.getName());
>         } else {
>             serdePropertiesMap.put(HBaseSerDe.HBASE_COMPOSITE_KEY_FACTORY, PhoenixCompositeKeyFactory.class.getName());
>         }
>         String serDeProperties = serdePropertiesMap.entrySet().stream()
>                 .map(e -> quoteInSingleQuotes(e.getKey()) + " = " + quoteInSingleQuotes(e.getValue()))
>                 .collect(Collectors.joining(COLUMNS_SEPARATOR));
>         logger.debug("SERDEPROPERTIES are [{}]", serDeProperties);
>         return serDeProperties;
>     }
> {code}
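
On the binary sortable point in the description above: encodings like Phoenix's make signed numbers sort correctly under HBase's unsigned, byte-wise key comparison by writing them big-endian with the sign bit flipped. The sketch below is a simplified illustration of that idea (the class and method names are made up for the example), not Phoenix's or Hive's actual implementation:

{code}
// Simplified illustration of binary-sortable encoding for signed longs:
// write the value big-endian with the sign bit flipped, so that an unsigned
// lexicographic comparison of the bytes matches numeric order.
import java.nio.ByteBuffer;
import java.util.Arrays;

public class SortableLongDemo {

    // Encode: flip the sign bit (XOR with Long.MIN_VALUE), then write big-endian.
    static byte[] encode(long value) {
        return ByteBuffer.allocate(Long.SIZE / Byte.SIZE)
                .putLong(value ^ Long.MIN_VALUE)
                .array();
    }

    // Decode: read big-endian, flip the sign bit back.
    static long decode(byte[] bytes) {
        return ByteBuffer.wrap(bytes).getLong() ^ Long.MIN_VALUE;
    }

    // Unsigned, byte-wise comparison, which is effectively how HBase orders row keys.
    static int compareUnsigned(byte[] a, byte[] b) {
        for (int i = 0; i < Math.min(a.length, b.length); i++) {
            int cmp = (a[i] & 0xFF) - (b[i] & 0xFF);
            if (cmp != 0) {
                return cmp;
            }
        }
        return a.length - b.length;
    }

    public static void main(String[] args) {
        long[] values = {Long.MIN_VALUE, -42L, -1L, 0L, 1L, 42L, Long.MAX_VALUE};
        byte[] prev = null;
        for (long v : values) {
            byte[] cur = encode(v);
            // Byte order of the encoded values must match numeric order.
            if (prev != null && compareUnsigned(prev, cur) >= 0) {
                throw new AssertionError("byte order does not match numeric order");
            }
            // Round trip must recover the original value.
            if (decode(cur) != v) {
                throw new AssertionError("round trip failed for " + v);
            }
            prev = cur;
        }
        System.out.println("encoded values sort in numeric order: " + Arrays.toString(values));
    }
}
{code}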



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
