spark-issues mailing list archives

From "todd.chen (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-18050) spark 2.0.1 enable hive throw AlreadyExistsException(message:Database default already exists)
Date Mon, 14 Nov 2016 12:45:59 GMT

    [ https://issues.apache.org/jira/browse/SPARK-18050?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15663828#comment-15663828 ]

todd.chen commented on SPARK-18050:
-----------------------------------

When I init the SparkSession, it throws this error.

> spark 2.0.1 enable hive throw AlreadyExistsException(message:Database default already exists)
> ---------------------------------------------------------------------------------------------
>
>                 Key: SPARK-18050
>                 URL: https://issues.apache.org/jira/browse/SPARK-18050
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>         Environment: jdk1.8, macOs,spark 2.0.1
>            Reporter: todd.chen
>
> In Spark 2.0.1, I enable Hive support, and when initializing the SQLContext it throws an AlreadyExistsException(message:Database default already exists), same as
> https://www.mail-archive.com/dev@spark.apache.org/msg15306.html , my code is
> {code}
> import org.apache.hadoop.conf.Configuration
> import org.apache.hadoop.fs.FileSystem
> import org.apache.spark.SparkConf
> import org.apache.spark.sql.SparkSession
>
>   private val master = "local[*]"
>   private val appName = "xqlServerSpark"
>   // FileSystem.get requires a Hadoop Configuration; the default one picks up core-site.xml
>   val fileSystem = FileSystem.get(new Configuration())
>   val sparkConf = new SparkConf().setMaster(master).
>     setAppName(appName).set("spark.sql.warehouse.dir", s"${fileSystem.getUri.toASCIIString}/user/hive/warehouse")
>   val hiveContext = SparkSession.builder().config(sparkConf).enableHiveSupport().getOrCreate().sqlContext
>   print(sparkConf.get("spark.sql.warehouse.dir"))
>   hiveContext.sql("show tables").show()
> {code}
> The result is correct, but an exception is also thrown by the code.
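
As the linked thread suggests, this exception is typically benign: on a fresh metastore the Hive client attempts to create the "default" database, and the resulting AlreadyExistsException is caught internally but still logged. A commonly suggested mitigation (a sketch only; the exact logger name is an assumption and may differ by Hive version) is to raise the log level for the Hive metastore logger in log4j.properties:

{code}
# Assumption: the benign AlreadyExistsException is logged under the
# org.apache.hadoop.hive.metastore logger; raising its level hides the
# noise without affecting Spark's own logging.
log4j.logger.org.apache.hadoop.hive.metastore=FATAL
{code}

This only suppresses the log message; the underlying create-then-catch behavior in the metastore is unchanged.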



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

