From: John Zeng <john.zeng@dataguise.com>
To: user@hive.apache.org
Subject: RE: Hive dataload issue.
Date: Mon, 12 May 2014 21:02:33 +0000

Is the extra 's' needed in your path:

LOAD DATA INPATH '/users/dept.txt' overwrite into table DEPT;

-----Original Message-----
From: Muthukumar Somasundaram (ETS) [mailto:muthukumars@hcl.com]
Sent: Monday, May 12, 2014 12:59 AM
To: user@hive.apache.org
Subject: Hive dataload issue.
Hi,

I need help resolving a data-load issue with Hive.

Background:

I am testing Informatica connectivity with big data (Hadoop and Hive). I have installed HDFS 1.0.3 and Hive 0.7.1 on RHEL 5.5. I am able to perform all HDFS operations.

When I try to load a Hive table from the Hive command line, I get the error below. If you have faced this before or know the solution, please let me know.

I tried loading both a local file and an HDFS file; both give the same error. I suspect I am missing some configuration. Please find the attached screenshot.

I tested the same script on Cloudera, and it works fine.

Thanks in advance.

Error info:

hive> describe dept;
OK
deptid  int
dname   string
Time taken: 3.792 seconds
hive> ! cat /user/dept.txt;
Command failed with exit code = 1
cat: /user/dept.txt: No such file or directory
hive> ! hadoop fs -cat /user/dept.txt;
1,IT
2,Finance
3,Sales
hive> LOAD DATA INPATH '/users/dept.txt' overwrite into table DEPT;
FAILED: Hive Internal Error: java.lang.IllegalArgumentException(java.net.URISyntaxException: Relative path in absolute URI: hdfs://informatica:8020$%7Bbuild.dir%7D/scratchdir/hive_2014-05-12_12-11-29_340_565872632113593986)
java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: hdfs://informatica:8020$%7Bbuild.dir%7D/scratchdir/hive_2014-05-12_12-11-29_340_565872632113593986
        at org.apache.hadoop.fs.Path.initialize(Path.java:148)
        at org.apache.hadoop.fs.Path.<init>(Path.java:132)
        at org.apache.hadoop.hive.ql.Context.getScratchDir(Context.java:142)
        at org.apache.hadoop.hive.ql.Context.getExternalScratchDir(Context.java:202)
        at org.apache.hadoop.hive.ql.Context.getExternalTmpFileURI(Context.java:294)
        at org.apache.hadoop.hive.ql.parse.LoadSemanticAnalyzer.analyzeInternal(LoadSemanticAnalyzer.java:238)
        at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:238)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:340)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:736)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:164)

Regards,
Muthukumar.S

::DISCLAIMER::
------------------------------------------------------------------------
The contents of this e-mail and any attachment(s) are confidential and intended for the named recipient(s) only.
E-mail transmission is not guaranteed to be secure or error-free as information could be intercepted, corrupted, lost, destroyed, arrive late or incomplete, or may contain viruses in transmission. The e-mail and its contents (with or without referred errors) shall therefore not attach any liability on the originator or HCL or its affiliates.
Views or opinions, if any, presented in this e-mail are solely those of the author and may not necessarily reflect the views or opinions of HCL or its affiliates. Any form of reproduction, dissemination, copying, disclosure, modification, distribution and/or publication of this message without the prior written consent of an authorized representative of HCL is strictly prohibited. If you have received this e-mail in error, please delete it and notify the sender immediately.
Before opening any e-mail and/or attachments, please check them for viruses and other defects.
------------------------------------------------------------------------
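One observation on the error itself: the `%7Bbuild.dir%7D` in the failing URI decodes to a literal, unresolved `${build.dir}` placeholder, which suggests the Hive scratch-directory setting was inherited from a source-build configuration and never substituted. A hedged sketch of a possible workaround, assuming that diagnosis: override the scratch directory with a concrete path in hive-site.xml (the property name `hive.exec.scratchdir` is real; `/tmp/hive-scratch` is only an illustrative path, not a required value):

```xml
<!-- Sketch only: gives Hive a concrete scratch directory so it no longer
     builds a URI containing the literal, unresolved ${build.dir} token.
     /tmp/hive-scratch is an arbitrary example path; any writable HDFS
     path should do. -->
<property>
  <name>hive.exec.scratchdir</name>
  <value>/tmp/hive-scratch</value>
</property>
```

After changing the value, restart the Hive CLI session so the new configuration is picked up, then retry the LOAD DATA statement.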
