spark-issues mailing list archives

From "Robert Beauchemin (JIRA)" <>
Subject [jira] [Reopened] (SPARK-10066) Can't create HiveContext with spark-shell or spark-sql on snapshot
Date Mon, 05 Oct 2015 21:10:27 GMT


Robert Beauchemin reopened SPARK-10066:

This problem reappears in Spark 1.5.1 with the same HDP/Hive setup and configuration.

> Can't create HiveContext with spark-shell or spark-sql on snapshot
> ------------------------------------------------------------------
>                 Key: SPARK-10066
>                 URL:
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell, SQL
>    Affects Versions: 1.5.0
>         Environment: Centos 6.6
>            Reporter: Robert Beauchemin
>            Priority: Minor
> Built the 1.5.0-preview-20150812 with the following:
> ./ -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -Psparkr -DskipTests
> Starting spark-shell or spark-sql returns the following error:
> java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive
> on HDFS should be writable. Current permissions are: rwx------
>         at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(
>         ....      [elided]
>         at org.apache.hadoop.hive.ql.session.SessionState.start(
> It's trying to create a new HiveContext. Running pySpark or sparkR works and creates
> a HiveContext successfully. A SQLContext can be created successfully from any shell.
> I've tried changing permissions on that HDFS directory (even as far as making it world-writable)
> without success. Also tried changing SPARK_USER and running spark-shell as different users,
> again without success.
> This works successfully on the same machine on 1.4.1 and on earlier pre-release versions
> of Spark 1.5.0 (same make-distribution params). Just trying the snapshot...
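For reference, the permissions change the reporter attempted typically looks like the following. This is a sketch of the usual remediation for this Hive error, not a confirmed fix (the reporter notes it did not help on this snapshot); the 777 mode and the assumption that both the HDFS and local scratch directories matter are based on common Hive deployment practice, not on anything stated in this thread.

```shell
# Hive's SessionState wants the root scratch dir /tmp/hive to be writable.
# Typical workaround: open up permissions on both HDFS and the local FS.
hdfs dfs -chmod -R 777 /tmp/hive   # scratch dir on HDFS (the one named in the error)
chmod -R 777 /tmp/hive             # local scratch dir, if present

# Verify: the HDFS listing should now show rwxrwxrwx instead of rwx------.
hdfs dfs -ls -d /tmp/hive
```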

This message was sent by Atlassian JIRA
