hive-user mailing list archives

From Garry Chen <g...@cornell.edu>
Subject hive on spark query error
Date Fri, 25 Sep 2015 16:18:32 GMT
Hi All,
                I am following https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started
to set up Hive on Spark.  After setup/configuration everything starts up and I can show tables,
but when I execute a SQL statement within Beeline I get an error.  Please help, and thank you very
much.

Cluster environment (3 nodes) is as follows:
hadoop-2.7.1
spark-1.4.1-bin-hadoop2.6
zookeeper-3.4.6
apache-hive-1.2.1-bin
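
For reference, the Hive-on-Spark properties from the getting-started guide can be applied per
session in Beeline; the sketch below is illustrative only (the master, memory, and event-log
values are placeholders, not my exact configuration):

-- Hive-on-Spark settings from the getting-started guide, set per Beeline session.
-- Values below are illustrative placeholders, not my exact configuration.
set hive.execution.engine=spark;
set spark.master=yarn-client;                      -- must start with yarn, spark, mesos, or local
set spark.eventLog.enabled=true;
set spark.eventLog.dir=hdfs:///tmp/spark-events;   -- placeholder path
set spark.executor.memory=512m;
set spark.serializer=org.apache.spark.serializer.KryoSerializer;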

Error from hive log:
2015-09-25 11:51:03,123 INFO  [HiveServer2-Handler-Pool: Thread-50]: client.SparkClientImpl
(SparkClientImpl.java:startDriver(375)) - Attempting impersonation of oracle
2015-09-25 11:51:03,133 INFO  [HiveServer2-Handler-Pool: Thread-50]: client.SparkClientImpl
(SparkClientImpl.java:startDriver(409)) - Running client driver with argv: /u01/app/spark-1.4.1-bin-hadoop2.6/bin/spark-submit
--proxy-user oracle --properties-file /tmp/spark-submit.840692098393819749.properties --class
org.apache.hive.spark.client.RemoteDriver /u01/app/apache-hive-1.2.1-bin/lib/hive-exec-1.2.1.jar
--remote-host ip-10-92-82-229.ec2.internal --remote-port 40476 --conf hive.spark.client.connect.timeout=1000
--conf hive.spark.client.server.connect.timeout=90000 --conf hive.spark.client.channel.log.level=null
--conf hive.spark.client.rpc.max.size=52428800 --conf hive.spark.client.rpc.threads=8 --conf
hive.spark.client.secret.bits=256
2015-09-25 11:51:03,867 INFO  [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(569))
- Warning: Ignoring non-spark config property: hive.spark.client.server.connect.timeout=90000
2015-09-25 11:51:03,868 INFO  [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(569))
- Warning: Ignoring non-spark config property: hive.spark.client.rpc.threads=8
2015-09-25 11:51:03,868 INFO  [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(569))
- Warning: Ignoring non-spark config property: hive.spark.client.connect.timeout=1000
2015-09-25 11:51:03,868 INFO  [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(569))
- Warning: Ignoring non-spark config property: hive.spark.client.secret.bits=256
2015-09-25 11:51:03,868 INFO  [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(569))
- Warning: Ignoring non-spark config property: hive.spark.client.rpc.max.size=52428800
2015-09-25 11:51:03,876 INFO  [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(569))
- Error: Master must start with yarn, spark, mesos, or local
2015-09-25 11:51:03,876 INFO  [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(569))
- Run with --help for usage help or --verbose for debug output
2015-09-25 11:51:03,885 INFO  [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(569))
- 15/09/25 11:51:03 INFO util.Utils: Shutdown hook called
2015-09-25 11:51:03,889 WARN  [Driver]: client.SparkClientImpl (SparkClientImpl.java:run(427))
- Child process exited with code 1.
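
The line that stands out to me is "Error: Master must start with yarn, spark, mesos, or local",
which I read as the Spark master not being passed to spark-submit (the generated argv above has
no --master).  A quick check I can run from the same Beeline session, assuming the property is
picked up from my Hive configuration, is:

-- Prints the value Hive hands to spark-submit as the Spark master;
-- an empty or undefined value would match the error above.
set spark.master;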

