spark-issues mailing list archives

From "Apache Spark (JIRA)" <j...@apache.org>
Subject [jira] [Assigned] (SPARK-10046) Hive warehouse dir not set in current directory when not providing hive-site.xml
Date Thu, 29 Oct 2015 22:22:27 GMT

     [ https://issues.apache.org/jira/browse/SPARK-10046?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-10046:
------------------------------------

    Assignee:     (was: Apache Spark)

> Hive warehouse dir not set in current directory when not providing hive-site.xml
> --------------------------------------------------------------------------------
>
>                 Key: SPARK-10046
>                 URL: https://issues.apache.org/jira/browse/SPARK-10046
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core, SQL
>    Affects Versions: 1.3.1
>         Environment: OS X 10.10.4
> Java 1.7.0_79-b15
> Scala 2.10.5
> Spark 1.3.1
>            Reporter: Antonio Murgia
>              Labels: hive, spark, sparksql
>
> When running Spark in a local environment (for unit-testing purposes) and without providing any `hive-site.xml`, databases other than the default one are created in Hive's default `hive.metastore.warehouse.dir` and not in the current directory (as stated in the [Spark docs](http://spark.apache.org/docs/latest/sql-programming-guide.html#hive-tables)).
> This code snippet, tested with Spark 1.3.1, demonstrates the issue: https://github.com/tmnd1991/spark-hive-bug/blob/master/src/main/scala/Main.scala
> You can see that an exception is thrown when executing the CREATE DATABASE statement, stating `Unable to create database path file:/user/hive/warehouse/abc.db, failed to create database abc`, where `/user/hive/warehouse/abc.db` is not the current directory as stated in the docs.
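
For readers without access to the linked repository, here is a minimal sketch of the kind of reproduction described above. It is not the reporter's exact Main.scala; the object and app names are illustrative, and it assumes Spark 1.3.1 with the spark-hive module on the classpath and no hive-site.xml present.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

// Illustrative reproduction: run in local mode without a hive-site.xml on the classpath.
object WarehouseDirRepro {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("SPARK-10046-repro")
    val sc = new SparkContext(conf)
    val hiveContext = new HiveContext(sc)

    // Per the SQL programming guide, without hive-site.xml the warehouse should be
    // created in the current directory. Instead, the reported failure is
    // "Unable to create database path file:/user/hive/warehouse/abc.db".
    hiveContext.sql("CREATE DATABASE abc")

    sc.stop()
  }
}
```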



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


