Subject: Re: starting spark-shell throws /tmp/hive on HDFS should be writable error
From: Xuefu Zhang
To: user@hive.apache.org
Date: Fri, 20 Nov 2015 13:17:23 -0800

This seems to belong on the Spark user list. I don't see any relevance to
Hive other than the directory name containing the word "hive".

--Xuefu

On Fri, Nov 20, 2015 at 1:13 PM, Mich Talebzadeh <mich@peridale.co.uk> wrote:

> Hi,
>
> Has this been resolved? I don't think this has anything to do with the
> /tmp/hive directory permissions.
>
> spark-shell
>
> log4j:WARN No appenders could be found for logger
> (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
> log4j:WARN Please initialize the log4j system properly.
> log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
> more info.
> Using Spark's repl log4j profile: org/apache/spark/log4j-defaults-repl.properties
> To adjust logging level use sc.setLogLevel("INFO")
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/  '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 1.5.2
>       /_/
>
> Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_25)
> Type in expressions to have them evaluated.
> Type :help for more information.
> java.lang.RuntimeException: java.lang.RuntimeException: The root scratch
> dir: /tmp/hive on HDFS should be writable. Current permissions are:
> rwx------
>
> <console>:10: error: not found: value sqlContext
>        import sqlContext.implicits._
>               ^
> <console>:10: error: not found: value sqlContext
>        import sqlContext.sql
>               ^
>
> scala>
>
> Thanks,
>
> Mich Talebzadeh
>
> Sybase ASE 15 Gold Medal Award 2008
> A Winning Strategy: Running the most Critical Financial Data on ASE 15
> http://login.sybase.com/files/Product_Overviews/ASE-Winning-Strategy-091908.pdf
> Author of the book "A Practitioner's Guide to Upgrading to Sybase ASE 15",
> ISBN 978-0-9563693-0-7.
> Co-author of "Sybase Transact SQL Guidelines Best Practices",
> ISBN 978-0-9759693-0-4
> Publications due shortly:
> Complex Event Processing in Heterogeneous Environments, ISBN: 978-0-9563693-3-8
> Oracle and Sybase, Concepts and Contrasts, ISBN: 978-0-9563693-1-4, volume
> one out shortly
>
> http://talebzadehmich.wordpress.com
>
> NOTE: The information in this email is proprietary and confidential. This
> message is for the designated recipient only; if you are not the intended
> recipient, you should destroy it immediately. Any information in this
> message shall not be understood as given or endorsed by Peridale Technology
> Ltd, its subsidiaries or their employees, unless expressly so stated. It is
> the responsibility of the recipient to ensure that this email is virus
> free; therefore neither Peridale Ltd, its subsidiaries nor their employees
> accept any responsibility.
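
For reference, the RuntimeException quoted above ("The root scratch dir:
/tmp/hive on HDFS should be writable. Current permissions are: rwx------")
is normally cleared by widening the permissions on the HDFS scratch
directory rather than by changing anything in Hive itself. A minimal
sketch, assuming the hdfs client is on the PATH and the user running it is
allowed to modify /tmp/hive:

    # Inspect the current permissions on the scratch directory itself.
    hdfs dfs -ls -d /tmp/hive

    # Open it up so any user can write to it; 777 is the value most often
    # suggested for this error, 733 is a tighter alternative.
    hdfs dfs -chmod -R 777 /tmp/hive

    # Retry; sqlContext should then be created without the exception.
    spark-shell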