spark-issues mailing list archives

From "Ran Mingxuan (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-22560) Must create spark session directly to connect to hive
Date Wed, 22 Nov 2017 08:00:01 GMT

    [ https://issues.apache.org/jira/browse/SPARK-22560?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16262118#comment-16262118 ]

Ran Mingxuan commented on SPARK-22560:
--------------------------------------

In my opinion, when a SparkSession is built from an existing SparkContext with additional
options, those options should be copied back into the SparkContext's configuration. That
way, conflicting options would be reconciled.
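Along those lines, a possible workaround for the ordering problem described in this issue is to set the Hive catalog up front on the SparkConf, so the pre-existing context already carries the setting that `enableHiveSupport()` would otherwise only apply when the builder creates a fresh context. This is a sketch based on the snippets in this issue, assuming the internal config key `spark.sql.catalogImplementation` (which is what `enableHiveSupport()` sets); it has not been verified against every affected version:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;

public class HiveFirstContext {
    public static void main(String[] args) {
        // Set the catalog implementation before the JavaSparkContext is
        // created, so the shared SparkContext configuration already points
        // at the Hive metastore.
        SparkConf sparkConf = new SparkConf()
                .setAppName("testApp")
                .set("spark.sql.catalogImplementation", "hive");
        JavaSparkContext sc = new JavaSparkContext(sparkConf);
        SparkSession spark = SparkSession.builder()
                .sparkContext(sc.sc())
                .enableHiveSupport()
                .getOrCreate();
        spark.sql("show databases").show();
    }
}
```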


> Must create spark session directly to connect to hive
> -----------------------------------------------------
>
>                 Key: SPARK-22560
>                 URL: https://issues.apache.org/jira/browse/SPARK-22560
>             Project: Spark
>          Issue Type: Bug
>          Components: Java API, SQL
>    Affects Versions: 2.1.0, 2.2.0
>            Reporter: Ran Mingxuan
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> In a Java project I have to use both JavaSparkContext and SparkSession. I find that the order in which they are created affects the Hive connection.
> I have built a spark job like below:
> {code:java}
> // wrong code
> public static void main(String[] args)
> {
>     SparkConf sparkConf = new SparkConf().setAppName("testApp");
>     JavaSparkContext sc = new JavaSparkContext(sparkConf);
>     SparkSession spark = SparkSession.builder().sparkContext(sc.sc()).enableHiveSupport().getOrCreate();
>     spark.sql("show databases").show();
> }
> {code}
> With this code, the Spark job cannot find the Hive metastore, even though it discovers the correct warehouse directory.
> I have to use the code below to make things work:
> {code:java}
> // correct code 
> public static void main(String[] args)
> {
>     SparkConf sparkConf = new SparkConf().setAppName("testApp");
>     SparkSession spark = SparkSession.builder().config(sparkConf).enableHiveSupport().getOrCreate();
>     SparkContext sparkContext = spark.sparkContext();
>     JavaSparkContext sc = JavaSparkContext.fromSparkContext(sparkContext);
>     spark.sql("show databases").show();
> }
> {code}



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

