hive-issues mailing list archives

From "Hive QA (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HIVE-10456) Grace Hash Join should not load spilled partitions on abort
Date Sun, 26 Apr 2015 09:00:46 GMT

    [ https://issues.apache.org/jira/browse/HIVE-10456?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14512952#comment-14512952 ]

Hive QA commented on HIVE-10456:
--------------------------------



{color:red}Overall{color}: -1 no tests executed

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12728218/HIVE-10456.3.patch

Test results: http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/3589/testReport
Console output: http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/3589/console
Test logs: http://ec2-174-129-184-35.compute-1.amazonaws.com/logs/PreCommit-HIVE-TRUNK-Build-3589/

Messages:
{noformat}
**** This message was trimmed, see log for full details ****
    [mkdir] Created dir: /data/hive-ptest/working/apache-git-master-source/spark-client/target/tmp/conf
     [copy] Copying 11 files to /data/hive-ptest/working/apache-git-master-source/spark-client/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ spark-client ---
[INFO] Compiling 5 source files to /data/hive-ptest/working/apache-git-master-source/spark-client/target/test-classes
[INFO] 
[INFO] --- maven-dependency-plugin:2.8:copy (copy-guava-14) @ spark-client ---
[INFO] Configured Artifact: com.google.guava:guava:14.0.1:jar
[INFO] Copying guava-14.0.1.jar to /data/hive-ptest/working/apache-git-master-source/spark-client/target/dependency/guava-14.0.1.jar
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ spark-client ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ spark-client ---
[INFO] Building jar: /data/hive-ptest/working/apache-git-master-source/spark-client/target/spark-client-1.3.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ spark-client ---
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ spark-client ---
[INFO] Installing /data/hive-ptest/working/apache-git-master-source/spark-client/target/spark-client-1.3.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/spark-client/1.3.0-SNAPSHOT/spark-client-1.3.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-git-master-source/spark-client/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/spark-client/1.3.0-SNAPSHOT/spark-client-1.3.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Query Language 1.3.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-exec ---
[INFO] Deleting /data/hive-ptest/working/apache-git-master-source/ql/target
[INFO] Deleting /data/hive-ptest/working/apache-git-master-source/ql (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-no-snapshots) @ hive-exec ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (generate-sources) @ hive-exec ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-git-master-source/ql/target/generated-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/gen
    [mkdir] Created dir: /data/hive-ptest/working/apache-git-master-source/ql/target/generated-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/aggregates/gen
    [mkdir] Created dir: /data/hive-ptest/working/apache-git-master-source/ql/target/generated-test-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/gen
Generating vector expression code
Generating vector expression test code
[INFO] Executed tasks
[INFO] 
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-exec ---
[INFO] Source directory: /data/hive-ptest/working/apache-git-master-source/ql/src/gen/protobuf/gen-java added.
[INFO] Source directory: /data/hive-ptest/working/apache-git-master-source/ql/src/gen/thrift/gen-javabean added.
[INFO] Source directory: /data/hive-ptest/working/apache-git-master-source/ql/target/generated-sources/java added.
[INFO] 
[INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-exec ---
[INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-git-master-source/ql/src/java
ANTLR Parser Generator  Version 3.4
org/apache/hadoop/hive/ql/parse/HiveLexer.g
org/apache/hadoop/hive/ql/parse/HiveParser.g
warning(200): IdentifiersParser.g:455:5: 
Decision can match input such as "{KW_REGEXP, KW_RLIKE} KW_UNION KW_MAP" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:455:5: 
Decision can match input such as "{KW_REGEXP, KW_RLIKE} KW_UNION KW_SELECT" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:455:5: 
Decision can match input such as "{KW_REGEXP, KW_RLIKE} KW_INSERT KW_OVERWRITE" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:455:5: 
Decision can match input such as "{KW_REGEXP, KW_RLIKE} KW_MAP LPAREN" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:455:5: 
Decision can match input such as "{KW_REGEXP, KW_RLIKE} KW_CLUSTER KW_BY" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:455:5: 
Decision can match input such as "{KW_REGEXP, KW_RLIKE} KW_UNION KW_REDUCE" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:455:5: 
Decision can match input such as "{KW_REGEXP, KW_RLIKE} KW_UNION KW_ALL" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:455:5: 
Decision can match input such as "{KW_REGEXP, KW_RLIKE} KW_DISTRIBUTE KW_BY" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:455:5: 
Decision can match input such as "{KW_REGEXP, KW_RLIKE} KW_LATERAL KW_VIEW" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:455:5: 
Decision can match input such as "{KW_REGEXP, KW_RLIKE} KW_GROUP KW_BY" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:455:5: 
Decision can match input such as "{KW_REGEXP, KW_RLIKE} KW_UNION KW_FROM" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:455:5: 
Decision can match input such as "{KW_REGEXP, KW_RLIKE} KW_SORT KW_BY" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:455:5: 
Decision can match input such as "{KW_REGEXP, KW_RLIKE} KW_INSERT KW_INTO" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:455:5: 
Decision can match input such as "{KW_REGEXP, KW_RLIKE} KW_ORDER KW_BY" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hive-exec ---
Downloading: http://www.datanucleus.org/downloads/maven2/org/pentaho/pentaho-aggdesigner/5.1.5-jhyde/pentaho-aggdesigner-5.1.5-jhyde.pom
Downloading: https://s3-us-west-1.amazonaws.com/hive-spark/maven2/spark_2.10-1.3-rc1/org/pentaho/pentaho-aggdesigner/5.1.5-jhyde/pentaho-aggdesigner-5.1.5-jhyde.pom
Downloading: http://repo.maven.apache.org/maven2/org/pentaho/pentaho-aggdesigner/5.1.5-jhyde/pentaho-aggdesigner-5.1.5-jhyde.pom
[WARNING] Invalid project model for artifact [pentaho-aggdesigner-algorithm:org.pentaho:5.1.5-jhyde]. It will be ignored by the remote resources Mojo.
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hive-exec ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 2 resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-exec ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-exec ---
[INFO] Compiling 2370 source files to /data/hive-ptest/working/apache-git-master-source/ql/target/classes
[INFO] -------------------------------------------------------------
[WARNING] COMPILATION WARNING : 
[INFO] -------------------------------------------------------------
[WARNING] /data/hive-ptest/working/apache-git-master-source/ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDAFLead.java: Some input files use or override a deprecated API.
[WARNING] /data/hive-ptest/working/apache-git-master-source/ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDAFLead.java: Recompile with -Xlint:deprecation for details.
[WARNING] /data/hive-ptest/working/apache-git-master-source/ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDAFMkCollectionEvaluator.java: Some input files use unchecked or unsafe operations.
[WARNING] /data/hive-ptest/working/apache-git-master-source/ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDAFMkCollectionEvaluator.java: Recompile with -Xlint:unchecked for details.
[INFO] 4 warnings 
[INFO] -------------------------------------------------------------
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR : 
[INFO] -------------------------------------------------------------
[ERROR] /data/hive-ptest/working/apache-git-master-source/ql/src/java/org/apache/hadoop/hive/ql/exec/persistence/HybridHashTableContainer.java:[191,21] unreported exception java.io.IOException; must be caught or declared to be thrown
[INFO] 1 error
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Hive .............................................. SUCCESS [13.999s]
[INFO] Hive Shims Common ................................. SUCCESS [11.669s]
[INFO] Hive Shims 0.20S .................................. SUCCESS [2.823s]
[INFO] Hive Shims 0.23 ................................... SUCCESS [10.003s]
[INFO] Hive Shims Scheduler .............................. SUCCESS [2.187s]
[INFO] Hive Shims ........................................ SUCCESS [2.760s]
[INFO] Hive Common ....................................... SUCCESS [13.868s]
[INFO] Hive Serde ........................................ SUCCESS [16.394s]
[INFO] Hive Metastore .................................... SUCCESS [35.652s]
[INFO] Hive Ant Utilities ................................ SUCCESS [2.155s]
[INFO] Spark Remote Client ............................... SUCCESS [23.722s]
[INFO] Hive Query Language ............................... FAILURE [1:06.746s]
[INFO] Hive Service ...................................... SKIPPED
[INFO] Hive Accumulo Handler ............................. SKIPPED
[INFO] Hive JDBC ......................................... SKIPPED
[INFO] Hive Beeline ...................................... SKIPPED
[INFO] Hive CLI .......................................... SKIPPED
[INFO] Hive Contrib ...................................... SKIPPED
[INFO] Hive HBase Handler ................................ SKIPPED
[INFO] Hive HCatalog ..................................... SKIPPED
[INFO] Hive HCatalog Core ................................ SKIPPED
[INFO] Hive HCatalog Pig Adapter ......................... SKIPPED
[INFO] Hive HCatalog Server Extensions ................... SKIPPED
[INFO] Hive HCatalog Webhcat Java Client ................. SKIPPED
[INFO] Hive HCatalog Webhcat ............................. SKIPPED
[INFO] Hive HCatalog Streaming ........................... SKIPPED
[INFO] Hive HWI .......................................... SKIPPED
[INFO] Hive ODBC ......................................... SKIPPED
[INFO] Hive Shims Aggregator ............................. SKIPPED
[INFO] Hive TestUtils .................................... SKIPPED
[INFO] Hive Packaging .................................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 3:25.113s
[INFO] Finished at: Sun Apr 26 04:59:43 EDT 2015
[INFO] Final Memory: 95M/857M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hive-exec: Compilation failure
[ERROR] /data/hive-ptest/working/apache-git-master-source/ql/src/java/org/apache/hadoop/hive/ql/exec/persistence/HybridHashTableContainer.java:[191,21] unreported exception java.io.IOException; must be caught or declared to be thrown
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hive-exec
+ exit 1
'
{noformat}
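
The -1 here is a build break rather than a test failure: javac aborts compilation of hive-exec because a call in HybridHashTableContainer.java can throw the checked exception java.io.IOException from a method that neither catches it nor declares it. As a minimal sketch of the two standard remedies for that class of error (using hypothetical method names, not the actual Hive code):

{noformat}
import java.io.IOException;

public class CheckedExceptionSketch {
    // Hypothetical helper that declares a checked exception, standing in
    // for the spill-handling call that javac flagged.
    static void spillToDisk() throws IOException {
        throw new IOException("simulated spill failure");
    }

    // Remedy 1: declare the exception so callers must deal with it.
    static void propagate() throws IOException {
        spillToDisk();
    }

    // Remedy 2: catch it at the call site and translate or recover.
    static void handleLocally() {
        try {
            spillToDisk();
        } catch (IOException e) {
            throw new RuntimeException("spill failed", e);
        }
    }

    public static void main(String[] args) {
        try {
            propagate();
        } catch (IOException e) {
            System.out.println("caught: " + e.getMessage());
        }
    }
}
{noformat}

Which remedy is appropriate depends on whether the close/abort path is expected to surface I/O failures to its caller or swallow them during cleanup.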

This message is automatically generated.

ATTACHMENT ID: 12728218 - PreCommit-HIVE-TRUNK-Build

> Grace Hash Join should not load spilled partitions on abort
> -----------------------------------------------------------
>
>                 Key: HIVE-10456
>                 URL: https://issues.apache.org/jira/browse/HIVE-10456
>             Project: Hive
>          Issue Type: Bug
>    Affects Versions: 1.2.0
>            Reporter: Prasanth Jayachandran
>            Assignee: Prasanth Jayachandran
>         Attachments: HIVE-10456.1.patch, HIVE-10456.1.patch, HIVE-10456.2.patch, HIVE-10456.3.patch
>
>
> Grace Hash Join loads the spilled partitions to complete the join in closeOp(). This should not happen when closeOp with abort is invoked. Instead it should clean up all the spilled data.
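
For illustration, a minimal sketch of the close/abort control flow the description asks for, with hypothetical interface and method names (not the actual Hive operator or HybridHashTableContainer API): on a normal close the spilled partitions are reloaded and joined, while on abort the spill data is simply discarded.

{noformat}
import java.io.IOException;
import java.util.List;

// Hypothetical sketch only; names do not match the real Hive classes.
class GraceHashJoinCloseSketch {
    interface SpillStore {
        List<Integer> spilledPartitionIds();            // partitions written to disk
        void loadPartition(int id) throws IOException;  // reload one spilled partition
        void deleteSpillFiles() throws IOException;     // discard all spilled data
    }

    private final SpillStore store;

    GraceHashJoinCloseSketch(SpillStore store) {
        this.store = store;
    }

    // Normal close: finish the join by reloading each spilled partition.
    // Abort: skip the reload entirely and only clean up the spill files.
    void close(boolean abort) throws IOException {
        if (abort) {
            store.deleteSpillFiles();
            return;
        }
        for (int id : store.spilledPartitionIds()) {
            store.loadPartition(id);
            // ... probe/join the reloaded partition here ...
        }
    }
}
{noformat}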



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
