spark-issues mailing list archives

From "Sean Owen (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (SPARK-15270) Creating HiveContext does not work
Date Thu, 12 May 2016 12:19:12 GMT

     [ https://issues.apache.org/jira/browse/SPARK-15270?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-15270:
------------------------------
    Assignee: Sandeep Singh

> Creating HiveContext does not work
> ----------------------------------
>
>                 Key: SPARK-15270
>                 URL: https://issues.apache.org/jira/browse/SPARK-15270
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.0.0
>            Reporter: Piotr Milanowski
>            Assignee: Sandeep Singh
>            Priority: Blocker
>             Fix For: 2.0.0
>
>
> Built Spark (commit c6d23b6604e85bcddbd1fb6a2c1c3edbfd2be2c1, branch-2.0) with the command:
> ./dev/make-distribution.sh -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver -Dhadoop.version=2.6.0 -DskipTests
> Launched the master and a slave, then launched ./bin/pyspark.
> Creating a HiveContext fails:
> {code}
> from pyspark.sql import HiveContext
> hc = HiveContext(sc)
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
>   File "spark-2.0/python/pyspark/sql/context.py", line 458, in __init__
>     sparkSession = SparkSession.withHiveSupport(sparkContext)
>   File "spark-2.0/python/pyspark/sql/session.py", line 192, in withHiveSupport
>     jsparkSession = sparkContext._jvm.SparkSession.withHiveSupport(sparkContext._jsc.sc())
>   File "spark-2.0/python/lib/py4j-0.9.2-src.zip/py4j/java_gateway.py", line 1048, in __getattr__
> py4j.protocol.Py4JError: org.apache.spark.sql.SparkSession.withHiveSupport does not exist in the JVM
> {code}
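
For reference, the {{SparkSession.withHiveSupport}} factory was removed from the Scala API on the 2.0 branch in favor of the builder pattern, which is why the Py4J attribute lookup above fails. A minimal PySpark sketch of the builder-based replacement (the app name is illustrative, not taken from this report):

{code}
# Sketch of the 2.0 builder API that replaces the removed
# SparkSession.withHiveSupport factory. enableHiveSupport() turns on
# the Hive metastore catalog, HiveQL parsing, and Hive UDF support.
from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .appName("hive-example") \
    .enableHiveSupport() \
    .getOrCreate()

# Hive-backed queries then go through the session directly:
spark.sql("SHOW TABLES").show()
{code}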



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


