Date: Tue, 3 Jan 2017 20:31:58 +0000 (UTC)
From: "Hive QA (JIRA)"
To: issues@hive.apache.org
Reply-To: dev@hive.apache.org
Subject: [jira] [Commented] (HIVE-11116) Can not select data from table which points to remote hdfs location

    [ https://issues.apache.org/jira/browse/HIVE-11116?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15796104#comment-15796104 ]

Hive QA commented on HIVE-11116:
--------------------------------

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12820392/HIVE-11116.1.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/2767/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/2767/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-2767/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2017-01-03 20:30:27.980
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-2767/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2017-01-03 20:30:27.982
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at c928ad3 HIVE-15528: Expose Spark job error in SparkTask (Zhihai via Xuefu)
+ git clean -f -d
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at c928ad3 HIVE-15528: Expose Spark job error in SparkTask (Zhihai via Xuefu)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2017-01-03 20:30:28.949
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
Going to apply patch with: patch -p1
patching file ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java
Hunk #1 succeeded at 2268 with fuzz 1 (offset 70 lines).
patching file ql/src/java/org/apache/hadoop/hive/ql/session/SessionState.java
Hunk #1 succeeded at 483 with fuzz 1 (offset 29 lines).
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q -Dmaven.repo.local=/data/hiveptest/working/maven
ANTLR Parser Generator  Version 3.5.2
Output file /data/hiveptest/working/apache-github-source-source/metastore/target/generated-sources/antlr3/org/apache/hadoop/hive/metastore/parser/FilterParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/metastore/src/java/org/apache/hadoop/hive/metastore/parser/Filter.g
org/apache/hadoop/hive/metastore/parser/Filter.g
DataNucleus Enhancer (version 4.1.6) for API "JDO"
DataNucleus Enhancer : Classpath
>>  /usr/share/maven/boot/plexus-classworlds-2.x.jar
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MDatabase
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MFieldSchema
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MType
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MTable
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MConstraint
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MSerDeInfo
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MOrder
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MColumnDescriptor
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MStringList
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MStorageDescriptor
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MPartition
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MIndex
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MRole
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MRoleMap
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MGlobalPrivilege
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MDBPrivilege
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MTablePrivilege
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MPartitionPrivilege
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MTableColumnPrivilege
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnPrivilege
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MPartitionEvent
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MMasterKey
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MDelegationToken
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MTableColumnStatistics
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnStatistics
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MVersionTable
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MResourceUri
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MFunction
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MNotificationLog
ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MNotificationNextId
DataNucleus Enhancer completed with success for 30 classes. Timings : input=144 ms, enhance=177 ms, total=321 ms. Consult the log for full details
ANTLR Parser Generator  Version 3.5.2
Output file /data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveLexer.java does not exist: must build /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveLexer.g
org/apache/hadoop/hive/ql/parse/HiveLexer.g
Output file /data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g
org/apache/hadoop/hive/ql/parse/HiveParser.g
Generating vector expression code
Generating vector expression test code
[ERROR] COMPILATION ERROR :
[ERROR] /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java:[2271,46] cannot find symbol
  symbol:   variable path
  location: class org.apache.hadoop.hive.ql.parse.SemanticAnalyzer
[ERROR] /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java:[2272,74] cannot find symbol
  symbol:   variable path
  location: class org.apache.hadoop.hive.ql.parse.SemanticAnalyzer
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hive-exec: Compilation failure: Compilation failure:
[ERROR] /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java:[2271,46] cannot find symbol
[ERROR] symbol:   variable path
[ERROR] location: class org.apache.hadoop.hive.ql.parse.SemanticAnalyzer
[ERROR] /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java:[2272,74] cannot find symbol
[ERROR] symbol:   variable path
[ERROR] location: class org.apache.hadoop.hive.ql.parse.SemanticAnalyzer
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn -rf :hive-exec
+ exit 1
'
{noformat}

This message is automatically generated.
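
Editor's note: the compile errors above point at the encryption check in SemanticAnalyzer, and the quoted issue below shows that check failing with "Wrong FS" because a remote table location is validated against the session's default filesystem. The sketch below only illustrates that idea under the assumption that the fix should resolve the FileSystem from the path itself before asking HDFS about encryption zones; the class and method names are hypothetical and this is not the contents of HIVE-11116.1.patch.

{code}
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.apache.hadoop.hdfs.client.HdfsAdmin;

/** Hypothetical helper for illustration; not Hive's actual SemanticAnalyzer code. */
public final class EncryptionZoneCheckSketch {

  private EncryptionZoneCheckSketch() {}

  public static boolean isPathEncrypted(Path path, Configuration conf) throws IOException {
    // Resolve the filesystem that owns this path (e.g. hdfs://remote-nn) instead of using
    // FileSystem.get(conf), which returns the default FS (hdfs://localhost:8020 in the
    // report) and makes FileSystem.checkPath() throw "Wrong FS".
    FileSystem fs = path.getFileSystem(conf);
    if (!(fs instanceof DistributedFileSystem)) {
      // Encryption zones only exist on HDFS; other filesystems are never encrypted here.
      return false;
    }
    // Bind HdfsAdmin to the path's own namenode so the lookup goes to the right cluster.
    HdfsAdmin admin = new HdfsAdmin(fs.getUri(), conf);
    return admin.getEncryptionZoneForPath(path) != null;
  }
}
{code}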
ATTACHMENT ID: 12820392 - PreCommit-HIVE-Build

> Can not select data from table which points to remote hdfs location
> -------------------------------------------------------------------
>
>                 Key: HIVE-11116
>                 URL: https://issues.apache.org/jira/browse/HIVE-11116
>             Project: Hive
>          Issue Type: Bug
>          Components: Encryption
>    Affects Versions: 1.2.0, 1.1.0, 1.3.0, 2.0.0
>            Reporter: Alexander Pivovarov
>            Assignee: David Karoly
>         Attachments: HIVE-11116.1.patch
>
>
> I tried to create a new table which points to a remote hdfs location and select data from it.
> It works for hive-0.14 and hive-1.0, but it does not work starting from hive-1.1.
> To reproduce the issue:
> 1. create a folder on the remote hdfs
> {code}
> hadoop fs -mkdir -p hdfs://remote-nn/tmp/et1
> {code}
> 2. create a table
> {code}
> CREATE TABLE et1 (
>   a string
> ) stored as textfile
> LOCATION 'hdfs://remote-nn/tmp/et1';
> {code}
> 3. run a select
> {code}
> select * from et1 limit 10;
> {code}
> 4. You should get the following error
> {code}
> select * from et1;
> 15/06/25 13:43:44 [main]: ERROR parse.CalcitePlanner: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to determine if hdfs://remote-nn/tmp/et1is encrypted: java.lang.IllegalArgumentException: Wrong FS: hdfs://remote-nn/tmp/et1, expected: hdfs://localhost:8020
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.isPathEncrypted(SemanticAnalyzer.java:1763)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getStagingDirectoryPathname(SemanticAnalyzer.java:1875)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1689)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1427)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genResolvedParseTree(SemanticAnalyzer.java:10132)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10147)
> at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:190)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:222)
> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:421)
> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:307)
> at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1112)
> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1160)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1039)
> at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:207)
> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:159)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:370)
> at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:754)
> at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
> at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
> Caused by: java.lang.IllegalArgumentException: Wrong FS: hdfs://remote-nn/tmp/et1, expected: hdfs://localhost:8020
> at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:645)
> at org.apache.hadoop.hdfs.DistributedFileSystem.getPathName(DistributedFileSystem.java:193)
> at org.apache.hadoop.hdfs.DistributedFileSystem.getEZForPath(DistributedFileSystem.java:1906)
> at org.apache.hadoop.hdfs.client.HdfsAdmin.getEncryptionZoneForPath(HdfsAdmin.java:262)
> at org.apache.hadoop.hive.shims.Hadoop23Shims$HdfsEncryptionShim.isPathEncrypted(Hadoop23Shims.java:1097)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.isPathEncrypted(SemanticAnalyzer.java:1759)
> ... 25 more
> FAILED: SemanticException Unable to determine if hdfs://remote-nn/tmp/et1is encrypted: java.lang.IllegalArgumentException: Wrong FS: hdfs://remote-nn/tmp/et1, expected: hdfs://localhost:8020
> 15/06/25 13:43:44 [main]: ERROR ql.Driver: FAILED: SemanticException Unable to determine if hdfs://remote-nn/tmp/et1is encrypted: java.lang.IllegalArgumentException: Wrong FS: hdfs://remote-nn/tmp/et1, expected: hdfs://localhost:8020
> org.apache.hadoop.hive.ql.parse.SemanticException: Unable to determine if hdfs://remote-nn/tmp/et1is encrypted: java.lang.IllegalArgumentException: Wrong FS: hdfs://remote-nn/tmp/et1, expected: hdfs://localhost:8020
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1743)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1427)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genResolvedParseTree(SemanticAnalyzer.java:10132)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10147)
> at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:190)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:222)
> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:421)
> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:307)
> at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1112)
> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1160)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1039)
> at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:207)
> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:159)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:370)
> at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:754)
> at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
> at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to determine if hdfs://remote-nn/tmp/et1is encrypted: java.lang.IllegalArgumentException: Wrong FS: hdfs://remote-nn/tmp/et1, expected: hdfs://localhost:8020
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.isPathEncrypted(SemanticAnalyzer.java:1763)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getStagingDirectoryPathname(SemanticAnalyzer.java:1875)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1689)
> ... 23 more
> Caused by: java.lang.IllegalArgumentException: Wrong FS: hdfs://remote-nn/tmp/et1, expected: hdfs://localhost:8020
> at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:645)
> at org.apache.hadoop.hdfs.DistributedFileSystem.getPathName(DistributedFileSystem.java:193)
> at org.apache.hadoop.hdfs.DistributedFileSystem.getEZForPath(DistributedFileSystem.java:1906)
> at org.apache.hadoop.hdfs.client.HdfsAdmin.getEncryptionZoneForPath(HdfsAdmin.java:262)
> at org.apache.hadoop.hive.shims.Hadoop23Shims$HdfsEncryptionShim.isPathEncrypted(Hadoop23Shims.java:1097)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.isPathEncrypted(SemanticAnalyzer.java:1759)
> ... 25 more
> {code}
> 5. Can you also fix the bug with the log message below? There should be a space before "is encrypted":
> {code}
> Unable to determine if hdfs://remote-nn/tmp/et1is encrypted
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
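
Editor's note on point 5 of the quoted description: the missing space comes from concatenating the path directly onto the message text. A minimal sketch of the requested fix, assuming the message is built by string concatenation; the class and variable names are hypothetical, not the actual Hive source line.

{code}
/** Hypothetical illustration of the message-formatting fix requested in point 5. */
public final class MessageFormatSketch {
  public static void main(String[] args) {
    String path = "hdfs://remote-nn/tmp/et1";

    // Concatenating the path with no trailing separator yields "...et1is encrypted".
    String broken = "Unable to determine if " + path + "is encrypted";

    // Adding the space (or using String.format) yields "...et1 is encrypted".
    String fixed = String.format("Unable to determine if %s is encrypted", path);

    System.out.println(broken);
    System.out.println(fixed);
  }
}
{code}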