spark-issues mailing list archives

From "Amir Gur (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-10066) Can't create HiveContext with spark-shell or spark-sql on snapshot
Date Wed, 13 Jan 2016 00:19:39 GMT

    [ https://issues.apache.org/jira/browse/SPARK-10066?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15095282#comment-15095282 ]

Amir Gur commented on SPARK-10066:
----------------------------------

This issue still needs a resolution.  SPARK-10528 has more details; I have linked them.  Alternatively, we can close this one and keep the other open until it is fixed.


> Can't create HiveContext with spark-shell or spark-sql on snapshot
> ------------------------------------------------------------------
>
>                 Key: SPARK-10066
>                 URL: https://issues.apache.org/jira/browse/SPARK-10066
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell, SQL
>    Affects Versions: 1.5.0
>         Environment: Centos 6.6
>            Reporter: Robert Beauchemin
>            Priority: Minor
>
> Built the 1.5.0-preview-20150812 with the following:
> ./make-distribution.sh -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -Psparkr -DskipTests
> Starting spark-shell or spark-sql returns the following error: 
> java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------
>         at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:612)
>         ....      [elided]
>         at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
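> The top frame above is Hive's SessionState.createRootHDFSDir, which validates the scratch directory's permissions before the session starts; with rwx------ the writable bits for group and other are missing, so the check throws. A minimal Scala sketch of that style of check (an illustrative reconstruction, not Hive's exact code; the helper name is hypothetical):
>
>     import org.apache.hadoop.fs.{FileSystem, Path}
>     import org.apache.hadoop.fs.permission.FsPermission
>
>     // Illustrative only: the scratch dir must already carry the expected
>     // writable permissions, otherwise session startup aborts with the
>     // RuntimeException quoted above.
>     def checkScratchDirWritable(fs: FileSystem, dir: Path, expected: FsPermission): Unit = {
>       val current = fs.getFileStatus(dir).getPermission
>       if (!current.equals(expected)) {
>         throw new RuntimeException(
>           s"The root scratch dir: $dir on HDFS should be writable. Current permissions are: $current")
>       }
>     }
>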
> It's trying to create a new HiveContext. Running pySpark or sparkR works and creates a HiveContext successfully. A SQLContext can be created successfully with any shell.
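> For concreteness, a minimal spark-shell repro of that contrast (standard Spark 1.5 classes; sc is the shell's SparkContext):
>
>     // A plain SQLContext comes up fine in any of the shells...
>     val sqlCtx = new org.apache.spark.sql.SQLContext(sc)
>
>     // ...but constructing a HiveContext runs the Hive scratch-dir check
>     // and dies with the RuntimeException quoted above.
>     val hiveCtx = new org.apache.spark.sql.hive.HiveContext(sc)
>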
> I've tried changing permissions on that HDFS directory (even going as far as making it world-writable) without success. I also tried changing SPARK_USER and running spark-shell as different users, without success.
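> For reference, the world-writable attempt can be scripted from the shell itself via the standard Hadoop FileSystem API (equivalent to hdfs dfs -chmod 777 /tmp/hive; as noted above, it did not resolve the error here):
>
>     import org.apache.hadoop.fs.{FileSystem, Path}
>     import org.apache.hadoop.fs.permission.FsPermission
>
>     // Make the Hive scratch dir world-writable on HDFS.
>     val fs = FileSystem.get(sc.hadoopConfiguration)
>     fs.setPermission(new Path("/tmp/hive"), new FsPermission("777"))
>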
> This works successfully on the same machine on 1.4.1 and on earlier pre-release versions of Spark 1.5.0 (same make-distribution parameters). Just trying the snapshot...



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


