hive-issues mailing list archives

From "sivasaravanakumar (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HIVE-11116) Can not select data from table which points to remote hdfs location
Date Thu, 10 Dec 2015 07:57:10 GMT

    [ https://issues.apache.org/jira/browse/HIVE-11116?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15050272#comment-15050272 ]

sivasaravanakumar commented on HIVE-11116:
------------------------------------------

I found that the Hive metastore tracks the location of each table. You can see that location by running the following in the Hive console:

hive> DESCRIBE EXTENDED test_table;
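
If that output is hard to scan, DESCRIBE FORMATTED prints the same metadata with an explicit Location field. It looks roughly like this (the path below is illustrative, not taken from this issue):

hive> DESCRIBE FORMATTED test_table;
...
Location:               hdfs://localhost:8020/user/hive/warehouse/test_table
...
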
Thus, this issue occurs when the NameNode (fs.defaultFS) in core-site.xml is changed while the metastore service is still running. To resolve it, restart the service on the metastore machine:

$ sudo service hive-metastore restart
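
If the metastore is not managed as an OS service on your machine, stopping the running HiveMetaStore process and starting it again has the same effect; on a plain Apache install it is typically started with:

$ hive --service metastore
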
Then the metastore will use the new fs.defaultFS for newly created tables.
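
For reference, fs.defaultFS is the property in core-site.xml that the metastore reads at startup; a typical entry looks like this (hostname and port are illustrative):

<property>
  <name>fs.defaultFS</name>
  <value>hdfs://hadoop:8020</value>
</property>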

Already Existing Tables
The location of tables that already exist can be corrected with the following commands. They come from the Cloudera documentation on configuring the Hive metastore for NameNode High Availability.

$ /usr/lib/hive/bin/metatool -listFSRoot
...
Listing FS Roots..
hdfs://localhost:8020/user/hive/warehouse
hdfs://localhost:8020/user/hive/warehouse/test.db
Correcting the NameNode location:

$ /usr/lib/hive/bin/metatool -updateLocation hdfs://hadoop:8020 hdfs://localhost:8020
Now the listed NameNode is correct.

$ /usr/lib/hive/bin/metatool -listFSRoot
...
Listing FS Roots..
hdfs://hadoop:8020/user/hive/warehouse
hdfs://hadoop:8020/user/hive/warehouse/test.db
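
Tip: if you want to preview such a change before applying it, metatool also accepts a -dryRun flag with -updateLocation (check metatool -help on your version); it reports what would be updated without writing to the metastore:

$ /usr/lib/hive/bin/metatool -updateLocation hdfs://hadoop:8020 hdfs://localhost:8020 -dryRun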

> Can not select data from table which points to remote hdfs location
> -------------------------------------------------------------------
>
>                 Key: HIVE-11116
>                 URL: https://issues.apache.org/jira/browse/HIVE-11116
>             Project: Hive
>          Issue Type: Bug
>          Components: Encryption
>    Affects Versions: 1.2.0, 1.1.0, 1.3.0, 2.0.0
>            Reporter: Alexander Pivovarov
>
> I tried to create a new table which points to a remote hdfs location and select data from it.
> It works for hive-0.14 and hive-1.0, but it does not work starting from hive-1.1.
> to reproduce the issue
> 1. create folder on remote hdfs
> {code}
> hadoop fs -mkdir -p hdfs://remote-nn/tmp/et1
> {code}
> 2. create table 
> {code}
> CREATE TABLE et1 (
>   a string
> ) stored as textfile
> LOCATION 'hdfs://remote-nn/tmp/et1';
> {code}
> 3. run select
> {code}
> select * from et1 limit 10;
> {code}
> 4. Should get the following error
> {code}
> select * from et1;
> 15/06/25 13:43:44 [main]: ERROR parse.CalcitePlanner: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to determine if hdfs://remote-nn/tmp/et1is encrypted: java.lang.IllegalArgumentException: Wrong FS: hdfs://remote-nn/tmp/et1, expected: hdfs://localhost:8020
> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.isPathEncrypted(SemanticAnalyzer.java:1763)
> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getStagingDirectoryPathname(SemanticAnalyzer.java:1875)
> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1689)
> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1427)
> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genResolvedParseTree(SemanticAnalyzer.java:10132)
> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10147)
> 	at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:190)
> 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:222)
> 	at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:421)
> 	at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:307)
> 	at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1112)
> 	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1160)
> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1039)
> 	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:207)
> 	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:159)
> 	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:370)
> 	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:754)
> 	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
> 	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
> 	at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
> Caused by: java.lang.IllegalArgumentException: Wrong FS: hdfs://remote-nn/tmp/et1, expected: hdfs://localhost:8020
> 	at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:645)
> 	at org.apache.hadoop.hdfs.DistributedFileSystem.getPathName(DistributedFileSystem.java:193)
> 	at org.apache.hadoop.hdfs.DistributedFileSystem.getEZForPath(DistributedFileSystem.java:1906)
> 	at org.apache.hadoop.hdfs.client.HdfsAdmin.getEncryptionZoneForPath(HdfsAdmin.java:262)
> 	at org.apache.hadoop.hive.shims.Hadoop23Shims$HdfsEncryptionShim.isPathEncrypted(Hadoop23Shims.java:1097)
> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.isPathEncrypted(SemanticAnalyzer.java:1759)
> 	... 25 more
> FAILED: SemanticException Unable to determine if hdfs://remote-nn/tmp/et1is encrypted: java.lang.IllegalArgumentException: Wrong FS: hdfs://remote-nn/tmp/et1, expected: hdfs://localhost:8020
> 15/06/25 13:43:44 [main]: ERROR ql.Driver: FAILED: SemanticException Unable to determine if hdfs://remote-nn/tmp/et1is encrypted: java.lang.IllegalArgumentException: Wrong FS: hdfs://remote-nn/tmp/et1, expected: hdfs://localhost:8020
> org.apache.hadoop.hive.ql.parse.SemanticException: Unable to determine if hdfs://remote-nn/tmp/et1is encrypted: java.lang.IllegalArgumentException: Wrong FS: hdfs://remote-nn/tmp/et1, expected: hdfs://localhost:8020
> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1743)
> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1427)
> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genResolvedParseTree(SemanticAnalyzer.java:10132)
> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10147)
> 	at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:190)
> 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:222)
> 	at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:421)
> 	at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:307)
> 	at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1112)
> 	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1160)
> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1039)
> 	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:207)
> 	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:159)
> 	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:370)
> 	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:754)
> 	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
> 	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
> 	at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to determine if hdfs://remote-nn/tmp/et1is encrypted: java.lang.IllegalArgumentException: Wrong FS: hdfs://remote-nn/tmp/et1, expected: hdfs://localhost:8020
> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.isPathEncrypted(SemanticAnalyzer.java:1763)
> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getStagingDirectoryPathname(SemanticAnalyzer.java:1875)
> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1689)
> 	... 23 more
> Caused by: java.lang.IllegalArgumentException: Wrong FS: hdfs://remote-nn/tmp/et1, expected: hdfs://localhost:8020
> 	at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:645)
> 	at org.apache.hadoop.hdfs.DistributedFileSystem.getPathName(DistributedFileSystem.java:193)
> 	at org.apache.hadoop.hdfs.DistributedFileSystem.getEZForPath(DistributedFileSystem.java:1906)
> 	at org.apache.hadoop.hdfs.client.HdfsAdmin.getEncryptionZoneForPath(HdfsAdmin.java:262)
> 	at org.apache.hadoop.hive.shims.Hadoop23Shims$HdfsEncryptionShim.isPathEncrypted(Hadoop23Shims.java:1097)
> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.isPathEncrypted(SemanticAnalyzer.java:1759)
> 	... 25 more
> {code}
> 5. Can you also fix the bug in the log message below? There should be a space before "is encrypted".
> {code}
> Unable to determine if hdfs://remote-nn/tmp/et1is encrypted
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
