hive-issues mailing list archives

From "Hive QA (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HIVE-14990) run all tests for MM tables and fix the issues that are found
Date Tue, 30 May 2017 16:13:04 GMT

    [ https://issues.apache.org/jira/browse/HIVE-14990?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16029627#comment-16029627 ]

Hive QA commented on HIVE-14990:
--------------------------------



Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12870147/HIVE-14990.19.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/5475/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/5475/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-5475/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and
output '+ date '+%Y-%m-%d %T.%3N'
2017-05-30 16:12:02.642
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-5475/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2017-05-30 16:12:02.645
+ cd apache-github-source-source
+ git fetch origin
From https://github.com/apache/hive
   e2ecc92..fea9142  storage-branch-2.3 -> origin/storage-branch-2.3
 * [new tag]         rel/storage-release-2.3.1 -> rel/storage-release-2.3.1
+ git reset --hard HEAD
HEAD is now at 8dcc78a HIVE-16727 : REPL DUMP for insert event should't fail if the table is already dropped. (Sankar Hariappan via Thejas Nair
+ git clean -f -d
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at 8dcc78a HIVE-16727 : REPL DUMP for insert event should't fail if the table is already dropped. (Sankar Hariappan via Thejas Nair
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2017-05-30 16:12:09.902
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
error: a/common/src/java/org/apache/hadoop/hive/common/FileUtils.java: No such file or directory
error: a/common/src/java/org/apache/hadoop/hive/common/HiveStatsUtils.java: No such file or directory
error: a/common/src/java/org/apache/hadoop/hive/common/JavaUtils.java: No such file or directory
error: a/common/src/java/org/apache/hadoop/hive/conf/HiveConf.java: No such file or directory
error: a/data/scripts/q_test_init.sql: No such file or directory
error: a/data/scripts/q_test_init_compare.sql: No such file or directory
error: a/data/scripts/q_test_init_contrib.sql: No such file or directory
error: a/data/scripts/q_test_init_for_minimr.sql: No such file or directory
error: a/data/scripts/q_test_init_src.sql: No such file or directory
error: a/data/scripts/q_test_init_src_with_stats.sql: No such file or directory
error: a/data/scripts/q_test_init_tez.sql: No such file or directory
error: a/hcatalog/core/src/main/java/org/apache/hive/hcatalog/mapreduce/FileOutputCommitterContainer.java: No such file or directory
error: a/hcatalog/core/src/main/java/org/apache/hive/hcatalog/mapreduce/HCatOutputFormat.java: No such file or directory
error: a/itests/hcatalog-unit/src/test/java/org/apache/hive/hcatalog/listener/DummyRawStoreFailEvent.java: No such file or directory
error: a/itests/hive-unit/src/test/java/org/apache/hadoop/hive/ql/history/TestHiveHistory.java: No such file or directory
error: a/itests/src/test/resources/testconfiguration.properties: No such file or directory
error: a/llap-server/src/java/org/apache/hadoop/hive/llap/io/encoded/OrcEncodedDataReader.java: No such file or directory
error: a/llap-server/src/java/org/apache/hadoop/hive/llap/io/encoded/VectorDeserializeOrcWriter.java: No such file or directory
error: a/metastore/if/hive_metastore.thrift: No such file or directory
error: a/metastore/scripts/upgrade/derby/hive-schema-2.2.0.derby.sql: No such file or directory
error: a/metastore/scripts/upgrade/mssql/hive-schema-2.2.0.mssql.sql: No such file or directory
error: a/metastore/scripts/upgrade/oracle/hive-schema-2.2.0.oracle.sql: No such file or directory
error: a/metastore/scripts/upgrade/postgres/hive-schema-2.2.0.postgres.sql: No such file or directory
error: a/metastore/src/gen/thrift/gen-cpp/hive_metastore_types.cpp: No such file or directory
error: a/metastore/src/gen/thrift/gen-cpp/hive_metastore_types.h: No such file or directory
error: a/metastore/src/gen/thrift/gen-javabean/org/apache/hadoop/hive/metastore/api/ClientCapability.java: No such file or directory
error: a/metastore/src/gen/thrift/gen-php/metastore/Types.php: No such file or directory
error: a/metastore/src/gen/thrift/gen-py/hive_metastore/ttypes.py: No such file or directory
error: a/metastore/src/gen/thrift/gen-rb/hive_metastore_types.rb: No such file or directory
error: a/metastore/src/java/org/apache/hadoop/hive/metastore/HiveMetaStore.java: No such file or directory
error: a/metastore/src/java/org/apache/hadoop/hive/metastore/HiveMetaStoreClient.java: No such file or directory
error: a/metastore/src/java/org/apache/hadoop/hive/metastore/MetaStoreDirectSql.java: No such file or directory
error: a/metastore/src/java/org/apache/hadoop/hive/metastore/MetaStoreThread.java: No such file or directory
error: a/metastore/src/java/org/apache/hadoop/hive/metastore/MetaStoreUtils.java: No such file or directory
error: a/metastore/src/java/org/apache/hadoop/hive/metastore/ObjectStore.java: No such file or directory
error: a/metastore/src/java/org/apache/hadoop/hive/metastore/RawStore.java: No such file or directory
error: a/metastore/src/java/org/apache/hadoop/hive/metastore/TransactionalValidationListener.java: No such file or directory
error: a/metastore/src/java/org/apache/hadoop/hive/metastore/Warehouse.java: No such file or directory
error: a/metastore/src/java/org/apache/hadoop/hive/metastore/cache/CachedStore.java: No such file or directory
error: a/metastore/src/model/org/apache/hadoop/hive/metastore/model/MTable.java: No such file or directory
error: a/metastore/src/model/package.jdo: No such file or directory
error: a/metastore/src/test/org/apache/hadoop/hive/metastore/DummyRawStoreControlledCommit.java: No such file or directory
error: a/metastore/src/test/org/apache/hadoop/hive/metastore/DummyRawStoreForJdoConnection.java: No such file or directory
error: a/metastore/src/test/org/apache/hadoop/hive/metastore/TestObjectStore.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/Context.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/Driver.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/AbstractFileMergeOperator.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/CopyTask.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/DDLTask.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/DependencyCollectionTask.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/FetchOperator.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/FetchTask.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/FileSinkOperator.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/HashTableSinkOperator.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/JoinOperator.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/MoveTask.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/OrcFileMergeOperator.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/RCFileMergeOperator.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/ReplCopyTask.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/StatsNoJobTask.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/StatsTask.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/TableScanOperator.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/TaskFactory.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/Utilities.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/tez/SplitGrouper.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorizedRowBatchCtx.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/hooks/WriteEntity.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/index/HiveIndexedInputFormat.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/io/AcidUtils.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/io/CombineHiveInputFormat.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/io/CombineHiveRecordReader.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/io/HiveContextAwareRecordReader.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/io/HiveFileFormatUtils.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/io/HiveInputFormat.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/io/merge/MergeFileWork.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/io/orc/OrcInputFormat.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/io/parquet/MapredParquetInputFormat.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/io/parquet/ProjectionPusher.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/io/rcfile/stats/PartialScanTask.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/io/rcfile/stats/PartialScanWork.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/lockmgr/DbTxnManager.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/metadata/Hive.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/metadata/Partition.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/metadata/Table.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/metadata/formatting/JsonMetaDataFormatter.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/metadata/formatting/TextMetaDataFormatter.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/optimizer/AbstractBucketJoinProc.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/optimizer/BucketingSortingReduceSinkOptimizer.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/optimizer/GenMRTableScan1.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/optimizer/GenMapRedUtils.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/optimizer/SamplePruner.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/optimizer/StatsOptimizer.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/BucketingSortingOpProcFactory.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/GenMRSkewJoinProcessor.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/GenSparkSkewJoinProcessor.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/LocalMapJoinProcFactory.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/SamplingOptimizer.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/SkewJoinResolver.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/index/IndexWhereProcessor.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/optimizer/unionproc/UnionProcFactory.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/AlterTablePartMergeFilesDesc.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/BaseSemanticAnalyzer.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/DDLSemanticAnalyzer.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/EximUtil.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/ExportSemanticAnalyzer.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/GenTezUtils.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/ImportSemanticAnalyzer.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/IndexUpdater.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/LoadSemanticAnalyzer.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/ParseContext.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/ProcessAnalyzeTable.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/TaskCompiler.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/TypeCheckCtx.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/spark/SparkProcessAnalyzeTable.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/BucketMapJoinContext.java: No such file or directory
'
{noformat}
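The repeated "a/...: No such file or directory" errors are characteristic of applying a git-format diff (whose paths carry the `a/` and `b/` prefixes) without stripping the first path component. A minimal sketch of the failure mode, not the actual ptest apply script, using hypothetical paths under /tmp:

```shell
# Demo: git-style diffs prefix paths with a/ and b/. Applying with -p0
# looks for a literal directory named "a" and fails (as in the log);
# -p1 strips the first component and patches src/file.txt.
mkdir -p /tmp/prefix-demo/src && cd /tmp/prefix-demo
echo "old" > src/file.txt
cat > change.patch <<'EOF'
--- a/src/file.txt
+++ b/src/file.txt
@@ -1 +1 @@
-old
+new
EOF
patch -p0 --batch < change.patch || echo "p0 failed: no path a/src/file.txt"
patch -p1 --batch < change.patch   # succeeds: a/ and b/ are stripped
cat src/file.txt
```

`git apply` handles these prefixes automatically, which is why such patches normally apply cleanly from a git checkout; the smart-apply-patch.sh script presumably guesses the strip level and guessed wrong here or hit an incompatible patch format.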

This message is automatically generated.

ATTACHMENT ID: 12870147 - PreCommit-HIVE-Build

> run all tests for MM tables and fix the issues that are found
> -------------------------------------------------------------
>
>                 Key: HIVE-14990
>                 URL: https://issues.apache.org/jira/browse/HIVE-14990
>             Project: Hive
>          Issue Type: Sub-task
>            Reporter: Sergey Shelukhin
>            Assignee: Wei Zheng
>             Fix For: hive-14535
>
>         Attachments: HIVE-14990.01.patch, HIVE-14990.02.patch, HIVE-14990.03.patch, HIVE-14990.04.patch, HIVE-14990.04.patch, HIVE-14990.05.patch, HIVE-14990.05.patch, HIVE-14990.06.patch, HIVE-14990.06.patch, HIVE-14990.07.patch, HIVE-14990.08.patch, HIVE-14990.09.patch, HIVE-14990.10.patch, HIVE-14990.10.patch, HIVE-14990.10.patch, HIVE-14990.12.patch, HIVE-14990.13.patch, HIVE-14990.14.patch, HIVE-14990.15.patch, HIVE-14990.16.patch, HIVE-14990.17.patch, HIVE-14990.18.patch, HIVE-14990.19.patch, HIVE-14990.patch
>
>
> I am running the tests with isMmTable returning true for most tables (except ACID, temporary tables, views, etc.).
> Many tests will fail because of various expected issues with such an approach; however, we can find issues in MM tables from other failures.
> Expected failures:
> 1) All HCat tests (cannot write MM tables via the HCat writer).
> 2) Almost all merge tests (alter .. concat is not supported).
> 3) Tests that run dfs commands with specific paths (path changes).
> 4) Truncate column (not supported).
> 5) Describe formatted will have the new table fields in the output (before merging MM with ACID).
> 6) Many tests w/explain extended - diff in partition "base file name" (path changes).
> 7) TestTxnCommands - all the conversion tests, as they check for bucket count using file lists (path changes).
> 8) HBase metastore tests, because the methods are not implemented.
> 9) Some load and ExIm tests that export a table and then rely on a specific path for load (path changes).
> 10) Bucket map join/etc. - diffs; disabled the optimization for MM tables due to how it accounts for buckets.
> 11) rand - different results due to a different sequence of processing.
> 12) Many tests that have stats output, such as file count, for obvious reasons (not all, i.e. not the ones with just one insert).
> 13) Materialized views, not handled by design - the test check erroneously makes them "mm"; there is no easy way to tell them apart, and I don't want to plumb more stuff through just for this test.
> I'm filing JIRAs for some test failures that are not obvious and need investigation later.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
