hive-issues mailing list archives

From "meiyoula (JIRA)" <j...@apache.org>
Subject [jira] [Resolved] (HIVE-15256) One session close and delete resourceDir, cause others open failed
Date Tue, 22 Nov 2016 06:49:58 GMT

     [ https://issues.apache.org/jira/browse/HIVE-15256?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

meiyoula resolved HIVE-15256.
-----------------------------
    Resolution: Not A Problem

> One session close and delete resourceDir, cause others open failed
> ------------------------------------------------------------------
>
>                 Key: HIVE-15256
>                 URL: https://issues.apache.org/jira/browse/HIVE-15256
>             Project: Hive
>          Issue Type: Bug
>          Components: Clients
>            Reporter: meiyoula
>
> *resourceDir* is shared by all clients. When one connected client closes its session, it deletes *resourceDir*. If other clients are opening sessions at the same time, their open requests fail (a sketch of a per-session-directory alternative follows the quoted trace below).
> The exception is below:
>  {quote}
> Error opening session:  | org.apache.hive.service.cli.thrift.ThriftCLIService.OpenSession(ThriftCLIService.java:536)
> java.lang.RuntimeException: ExitCodeException exitCode=1: chmod: cannot access `/opt/huawei/Bigdata/tmp/spark/dlresources': No such file or directory
> 	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:528)
> 	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:477)
> 	at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:229)
> 	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:191)
> 	at org.apache.spark.sql.hive.client.ClientWrapper.newSession(ClientWrapper.scala:1053)
> 	at org.apache.spark.sql.hive.HiveContext.newSession(HiveContext.scala:93)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLSessionManager.openSession(SparkSQLSessionManager.scala:89)
> 	at org.apache.hive.service.cli.CLIService.openSessionWithImpersonation(CLIService.java:189)
> 	at org.apache.hive.service.cli.thrift.ThriftCLIService.getSessionHandle(ThriftCLIService.java:654)
> 	at org.apache.hive.service.cli.thrift.ThriftCLIService.OpenSession(ThriftCLIService.java:522)
> 	at org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1257)
> 	at org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1242)
> 	at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
> 	at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
> 	at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor.process(HadoopThriftAuthBridge.java:690)
> 	at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> Caused by: ExitCodeException exitCode=1: chmod: cannot access `/opt/huawei/Bigdata/tmp/spark/dlresources': No such file or directory
> 	at org.apache.hadoop.util.Shell.runCommand(Shell.java:561)
> 	at org.apache.hadoop.util.Shell.run(Shell.java:472)
> 	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:738)
> 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:831)
> 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:814)
> 	at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:744)
> 	at org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode(RawLocalFileSystem.java:502)
> 	at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:542)
> 	at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:520)
> 	at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:340)
> 	at org.apache.hadoop.hive.ql.session.SessionState.createPath(SessionState.java:656)
> 	at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:584)
> 	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:514)
> 	... 18 more
> {quote}
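
A minimal sketch of the per-session-directory idea mentioned above (this is not Hive's actual implementation; the class, the parent path, and the directory naming are hypothetical): if every session creates and deletes only its own subdirectory under the shared parent, closing one session cannot remove a directory that a concurrent openSession is still initializing.

{code:java}
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.*;
import java.util.Comparator;
import java.util.UUID;
import java.util.stream.Stream;

public class PerSessionResourceDir {
    private final Path sessionDir;

    public PerSessionResourceDir(Path parent) throws IOException {
        // Each session gets its own unique subdirectory; the shared parent is never deleted.
        this.sessionDir = Files.createDirectories(parent.resolve("session-" + UUID.randomUUID()));
    }

    public Path getDir() {
        return sessionDir;
    }

    public void close() throws IOException {
        // Remove only this session's directory, deepest entries first.
        try (Stream<Path> paths = Files.walk(sessionDir)) {
            paths.sorted(Comparator.reverseOrder()).forEach(p -> {
                try {
                    Files.deleteIfExists(p);
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            });
        }
    }

    public static void main(String[] args) throws IOException {
        Path parent = Paths.get(System.getProperty("java.io.tmpdir"), "dlresources");
        PerSessionResourceDir a = new PerSessionResourceDir(parent);
        PerSessionResourceDir b = new PerSessionResourceDir(parent);
        a.close(); // closing session A removes only A's subdirectory...
        System.out.println(Files.exists(b.getDir())); // ...so B's directory survives: prints true
    }
}
{code}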



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
