From: Garry Chen <gc92@cornell.edu>
To: user@hive.apache.org
Subject: hive on spark query error
Date: Fri, 25 Sep 2015 16:18:32 +0000

Hi All,

I am following https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started to set up Hive on Spark. After setup/configuration everything starts up and I am able to show tables, but when I execute a SQL statement from beeline I get an error. Please help, and thank you very much.
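For reference, this is roughly what I understand the Spark-related configuration from that wiki page to look like; the values below are illustrative placeholders from the guide, not necessarily my exact settings:

    set hive.execution.engine=spark;
    set spark.master=<Spark Master URL>;
    set spark.eventLog.enabled=true;
    set spark.eventLog.dir=<Spark event log folder (must exist)>;
    set spark.executor.memory=512m;
    set spark.serializer=org.apache.spark.serializer.KryoSerializer;

As far as I can tell from the guide, these can also be placed in hive-site.xml or spark-defaults.conf instead of being set per session.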
Cluster Environment (3 nodes) as follows:

hadoop-2.7.1
spark-1.4.1-bin-hadoop2.6
zookeeper-3.4.6
apache-hive-1.2.1-bin

Error from hive log:

2015-09-25 11:51:03,123 INFO  [HiveServer2-Handler-Pool: Thread-50]: client.SparkClientImpl (SparkClientImpl.java:startDriver(375)) - Attempting impersonation of oracle
2015-09-25 11:51:03,133 INFO  [HiveServer2-Handler-Pool: Thread-50]: client.SparkClientImpl (SparkClientImpl.java:startDriver(409)) - Running client driver with argv: /u01/app/spark-1.4.1-bin-hadoop2.6/bin/spark-submit --proxy-user oracle --properties-file /tmp/spark-submit.840692098393819749.properties --class org.apache.hive.spark.client.RemoteDriver /u01/app/apache-hive-1.2.1-bin/lib/hive-exec-1.2.1.jar --remote-host ip-10-92-82-229.ec2.internal --remote-port 40476 --conf hive.spark.client.connect.timeout=1000 --conf hive.spark.client.server.connect.timeout=90000 --conf hive.spark.client.channel.log.level=null --conf hive.spark.client.rpc.max.size=52428800 --conf hive.spark.client.rpc.threads=8 --conf hive.spark.client.secret.bits=256
2015-09-25 11:51:03,867 INFO  [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(569)) - Warning: Ignoring non-spark config property: hive.spark.client.server.connect.timeout=90000
2015-09-25 11:51:03,868 INFO  [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(569)) - Warning: Ignoring non-spark config property: hive.spark.client.rpc.threads=8
2015-09-25 11:51:03,868 INFO  [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(569)) - Warning: Ignoring non-spark config property: hive.spark.client.connect.timeout=1000
2015-09-25 11:51:03,868 INFO  [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(569)) - Warning: Ignoring non-spark config property: hive.spark.client.secret.bits=256
2015-09-25 11:51:03,868 INFO  [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(569)) - Warning: Ignoring non-spark config property: hive.spark.client.rpc.max.size=52428800
2015-09-25 11:51:03,876 INFO  [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(569)) - Error: Master must start with yarn, spark, mesos, or local
2015-09-25 11:51:03,876 INFO  [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(569)) - Run with --help for usage help or --verbose for debug output
2015-09-25 11:51:03,885 INFO  [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(569)) - 15/09/25 11:51:03 INFO util.Utils: Shutdown hook called
2015-09-25 11:51:03,889 WARN  [Driver]: client.SparkClientImpl (SparkClientImpl.java:run(427)) - Child process exited with code 1.
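Looking at the spark-submit argv in the log above, there is no explicit --master flag, so I assume the master value is supposed to come from the generated properties file. The "Error: Master must start with yarn, spark, mesos, or local" line makes me suspect that spark.master is either missing or in a form spark-submit does not recognize. My understanding from the guide (please correct me if this is wrong) is that it can be set per session in beeline before running the query, along these lines, with a placeholder URL and table name for my cluster:

    set hive.execution.engine=spark;
    set spark.master=spark://<spark-master-host>:7077;
    select count(*) from <some_table>;

Is that the right place to set it (or yarn-client in a YARN setup), or does it need to go into hive-site.xml or spark-defaults.conf instead?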