hive-dev mailing list archives

From "Hive QA (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HIVE-6060) Define API for RecordUpdater and UpdateReader
Date Sat, 15 Mar 2014 07:18:42 GMT

    [ https://issues.apache.org/jira/browse/HIVE-6060?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13936052#comment-13936052 ]

Hive QA commented on HIVE-6060:
-------------------------------



{color:red}Overall{color}: -1 no tests executed

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12634783/HIVE-6060.patch

Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1793/testReport
Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1793/console

Messages:
{noformat}
**** This message was trimmed, see log for full details ****
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hwi/src/test/resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hwi ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/tmp/conf
     [copy] Copying 5 files to /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hwi ---
[INFO] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/test-classes
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hwi ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-hwi ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/hive-hwi-0.14.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ hive-hwi ---
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-hwi ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/hive-hwi-0.14.0-SNAPSHOT.jar
to /data/hive-ptest/working/maven/org/apache/hive/hive-hwi/0.14.0-SNAPSHOT/hive-hwi-0.14.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hwi/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-hwi/0.14.0-SNAPSHOT/hive-hwi-0.14.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive ODBC 0.14.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-odbc ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/odbc (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hive-odbc ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-odbc ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-odbc ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/tmp/conf
     [copy] Copying 5 files to /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ hive-odbc ---
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-odbc ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/odbc/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-odbc/0.14.0-SNAPSHOT/hive-odbc-0.14.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims Aggregator 0.14.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-aggregator ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hive-shims-aggregator ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-aggregator ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-aggregator ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/target/tmp/conf
     [copy] Copying 5 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ hive-shims-aggregator ---
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-aggregator ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-shims-aggregator/0.14.0-SNAPSHOT/hive-shims-aggregator-0.14.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive TestUtils 0.14.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-testutils ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/testutils (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hive-testutils ---
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hive-testutils ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/testutils/src/main/resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-testutils ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-testutils ---
[INFO] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/classes
[INFO] 
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ hive-testutils ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/testutils/src/test/resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-testutils ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/tmp/conf
     [copy] Copying 5 files to /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-testutils ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-testutils ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-testutils ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/hive-testutils-0.14.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ hive-testutils ---
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-testutils ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/hive-testutils-0.14.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-testutils/0.14.0-SNAPSHOT/hive-testutils-0.14.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/testutils/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-testutils/0.14.0-SNAPSHOT/hive-testutils-0.14.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Packaging 0.14.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
Downloading: http://repository.apache.org/snapshots/org/apache/hive/hcatalog/hive-hcatalog-hbase-storage-handler/0.14.0-SNAPSHOT/maven-metadata.xml
Downloading: http://repository.apache.org/snapshots/org/apache/hive/hcatalog/hive-hcatalog-hbase-storage-handler/0.14.0-SNAPSHOT/hive-hcatalog-hbase-storage-handler-0.14.0-SNAPSHOT.pom
[WARNING] The POM for org.apache.hive.hcatalog:hive-hcatalog-hbase-storage-handler:jar:0.14.0-SNAPSHOT is missing, no dependency information available
Downloading: http://repository.apache.org/snapshots/org/apache/hive/hcatalog/hive-hcatalog-hbase-storage-handler/0.14.0-SNAPSHOT/hive-hcatalog-hbase-storage-handler-0.14.0-SNAPSHOT.jar
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Hive .............................................. SUCCESS [8.738s]
[INFO] Hive Ant Utilities ................................ SUCCESS [5.485s]
[INFO] Hive Shims Common ................................. SUCCESS [3.616s]
[INFO] Hive Shims 0.20 ................................... SUCCESS [2.557s]
[INFO] Hive Shims Secure Common .......................... SUCCESS [4.265s]
[INFO] Hive Shims 0.20S .................................. SUCCESS [2.994s]
[INFO] Hive Shims 0.23 ................................... SUCCESS [7.812s]
[INFO] Hive Shims ........................................ SUCCESS [0.890s]
[INFO] Hive Common ....................................... SUCCESS [6.844s]
[INFO] Hive Serde ........................................ SUCCESS [12.393s]
[INFO] Hive Metastore .................................... SUCCESS [32.374s]
[INFO] Hive Query Language ............................... SUCCESS [1:29.604s]
[INFO] Hive Service ...................................... SUCCESS [9.606s]
[INFO] Hive JDBC ......................................... SUCCESS [3.045s]
[INFO] Hive Beeline ...................................... SUCCESS [3.072s]
[INFO] Hive CLI .......................................... SUCCESS [1.919s]
[INFO] Hive Contrib ...................................... SUCCESS [2.425s]
[INFO] Hive HBase Handler ................................ SUCCESS [2.796s]
[INFO] Hive HCatalog ..................................... SUCCESS [0.656s]
[INFO] Hive HCatalog Core ................................ SUCCESS [2.371s]
[INFO] Hive HCatalog Pig Adapter ......................... SUCCESS [2.297s]
[INFO] Hive HCatalog Server Extensions ................... SUCCESS [2.005s]
[INFO] Hive HCatalog Webhcat Java Client ................. SUCCESS [1.515s]
[INFO] Hive HCatalog Webhcat ............................. SUCCESS [10.225s]
[INFO] Hive HWI .......................................... SUCCESS [1.298s]
[INFO] Hive ODBC ......................................... SUCCESS [0.746s]
[INFO] Hive Shims Aggregator ............................. SUCCESS [0.257s]
[INFO] Hive TestUtils .................................... SUCCESS [0.658s]
[INFO] Hive Packaging .................................... FAILURE [0.809s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 3:47.992s
[INFO] Finished at: Sat Mar 15 03:16:53 EDT 2014
[INFO] Final Memory: 74M/520M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project hive-packaging: Could not resolve dependencies for project org.apache.hive:hive-packaging:pom:0.14.0-SNAPSHOT: Could not find artifact org.apache.hive.hcatalog:hive-hcatalog-hbase-storage-handler:jar:0.14.0-SNAPSHOT in apache.snapshots (http://repository.apache.org/snapshots) -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hive-packaging
+ exit 1
'
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12634783

> Define API for RecordUpdater and UpdateReader
> ---------------------------------------------
>
>                 Key: HIVE-6060
>                 URL: https://issues.apache.org/jira/browse/HIVE-6060
>             Project: Hive
>          Issue Type: Sub-task
>            Reporter: Owen O'Malley
>            Assignee: Owen O'Malley
>         Attachments: HIVE-6060.patch, HIVE-6060.patch, HIVE-6060.patch, HIVE-6060.patch,
> HIVE-6060.patch, acid-io.patch, h-5317.patch, h-5317.patch, h-5317.patch, h-6060.patch, h-6060.patch
>
>
> We need to define some new APIs for how Hive interacts with the file formats since it
> needs to be much richer than the current RecordReader and RecordWriter.
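
For context on what "richer than the current RecordReader and RecordWriter" could look like, below is a minimal illustrative Java sketch of updater/reader interfaces that carry transaction and row identifiers so that rows can be inserted, updated, or deleted in place. The package, interface names, and signatures are assumptions made for illustration only; they are not taken from the attached patch.

{code:java}
// Illustrative sketch only: hypothetical shapes for the richer writer/reader
// APIs discussed in HIVE-6060. Names and signatures are assumptions, not the
// interfaces defined by the attached patch.
package org.example.acid;   // hypothetical package, not part of Hive

import java.io.IOException;

/** Hypothetical updater: a writer that also supports row-level mutations. */
interface RecordUpdater {
  /** Append a new row in the given transaction. */
  void insert(long currentTransaction, Object row) throws IOException;

  /** Replace the row originally written by (originalTransaction, rowId). */
  void update(long currentTransaction, long originalTransaction,
              long rowId, Object row) throws IOException;

  /** Mark the row originally written by (originalTransaction, rowId) as deleted. */
  void delete(long currentTransaction, long originalTransaction,
              long rowId) throws IOException;

  /** Make buffered changes durable without closing the writer. */
  void flush() throws IOException;

  /** Close the writer, discarding buffered changes if abort is true. */
  void close(boolean abort) throws IOException;
}

/** Hypothetical reader: iterates rows merged across base and delta files. */
interface UpdateReader {
  /** Advance to the next visible row; fills in its identifier and value. */
  boolean next(Object rowIdHolder, Object rowHolder) throws IOException;

  void close() throws IOException;
}
{code}

The key difference from plain RecordWriter/RecordReader in this sketch is that every mutation is addressed by a transaction id and row id, which is what lets a reader reconcile base data with later updates and deletes.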



--
This message was sent by Atlassian JIRA
(v6.2#6252)
