spark-dev mailing list archives

From Cinyoung Hur <cinyoung....@gmail.com>
Subject No rows in Apache Tajo table reading(Spark SQL)
Date Mon, 04 Sep 2017 02:53:44 GMT
Hi,

I want to read an Apache Tajo table using Spark SQL.

The Tajo JDBC driver is added to spark-shell, but the query against the Tajo
table returns no rows.
The following are the Spark code and the result.

$ spark-shell --jars tajo-jdbc-0.11.3.jar

scala> val componentDF = spark.sqlContext.load("jdbc", Map(
    "url"-> "jdbc:tajo://tajo-master-ip:26002/analysis",
    "driver"->"org.apache.tajo.jdbc.TajoDriver",
    "dbtable"->"component_usage_2015"
    ))
scala> componentDF.registerTempTable("components")
scala> val allComponents = sqlContext.sql("select * from components")
scala> allComponents.show(5)


warning: there was one deprecation warning; re-run with -deprecation for
details
componentDF: org.apache.spark.sql.DataFrame =
[analysis.component_usage_2015.gnl_nm_cd: string,
analysis.component_usage_2015.qty: double ... 1 more field]
warning: there was one deprecation warning; re-run with -deprecation for
details
allComponents: org.apache.spark.sql.DataFrame =
[analysis.component_usage_2015.gnl_nm_cd: string,
analysis.component_usage_2015.qty: double ... 1 more field]
+--------------------------------------------+--------------------------------------+--------------------------------------+
|analysis.component_usage_2015.gnl_nm_cd|analysis.component_usage_2015.qty|analysis.component_usage_2015.amt|
+--------------------------------------------+--------------------------------------+--------------------------------------+
+--------------------------------------------+--------------------------------------+--------------------------------------+
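For reference, the two deprecation warnings come from sqlContext.load and registerTempTable. The current DataFrameReader form of the same load (a sketch only, assuming the same connection parameters as above; it does not change the empty result) would be:

```scala
// Non-deprecated equivalent of sqlContext.load("jdbc", Map(...)) in Spark 2.x.
// URL, driver class, and table name are taken verbatim from the code above.
val componentDF = spark.read
  .format("jdbc")
  .option("url", "jdbc:tajo://tajo-master-ip:26002/analysis")
  .option("driver", "org.apache.tajo.jdbc.TajoDriver")
  .option("dbtable", "component_usage_2015")
  .load()

// createOrReplaceTempView replaces the deprecated registerTempTable
componentDF.createOrReplaceTempView("components")
spark.sql("select * from components").show(5)
```

This only silences the deprecation warnings; if the result is still empty, the problem is likely in how the Tajo JDBC driver answers Spark's generated queries rather than in the Spark-side API.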

Regards,
Cinyoung Hur
