phoenix-dev mailing list archives

From 马士成 <>
Subject Spark sql query the hive external table mapped from phoenix always throw out Class org.apache.phoenix.hive.PhoenixSerDe not found exception
Date Thu, 11 Jul 2019 06:18:09 GMT

Hello All,

The Apache Phoenix homepage lists two integration features: Apache Spark Integration and the Phoenix Storage Handler for Apache Hive.
Following that guidance, I can query a Phoenix table from the Beeline CLI, and I can load a Phoenix table
as a DataFrame using Spark SQL.
So my question is:

Does Phoenix support querying, via Spark SQL, a Hive external table mapped from Phoenix?

I am working on HDP 3.0 (Phoenix 5.0, HBase 2.0, Hive 3.1.0, Spark 2.3.1) and am facing the
issue mentioned in the subject.
I tried to solve this problem but failed; I found some similar questions on the internet, but the
answers didn't work for me.

My submit command:

  spark-submit --jars \
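(The jar list above was cut off; the sketch below shows the general shape of such a command. The jar names and paths are placeholders for a typical HDP layout, not the actual ones from my cluster.)

```shell
# Placeholder paths: substitute the Phoenix jar names/versions
# actually installed on your cluster.
PHOENIX_HOME=/usr/hdp/current/phoenix-server

spark-submit \
  --jars "$PHOENIX_HOME/phoenix-hive.jar,$PHOENIX_HOME/phoenix-client.jar" \
  --conf spark.driver.extraClassPath="$PHOENIX_HOME/phoenix-hive.jar" \
  --conf spark.executor.extraClassPath="$PHOENIX_HOME/phoenix-hive.jar" \
  demo.py
```

The point of the extraClassPath settings is that --jars alone distributes the jars but does not always place them on the classpath used when the Hive SerDe is resolved, which is one common cause of a ClassNotFoundException for org.apache.phoenix.hive.PhoenixSerDe.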

The log is attached and the demo code is below:

  from pyspark.sql import SparkSession

  if __name__ == '__main__':
      # enableHiveSupport() is required so Spark SQL can see the Hive
      # external table; getOrCreate() was missing and builds the session.
      spark = SparkSession.builder \
          .appName("test") \
          .enableHiveSupport() \
          .getOrCreate()
      df = spark.sql("select count(*) from ajmide_dw.part_device")

Similar Issues:

Any comment or suggestion is appreciated!

Shi-Cheng, Ma