From: Ayon Sinha
Reply-To: Ayon Sinha
To: user@hive.apache.org
Date: Wed, 21 Sep 2011 15:01:20 -0700 (PDT)
Subject: Re: Exception when joining HIVE tables

I'm a bit concerned about port 9000 for the HDFS location. Is your namenode at port 9000? Can you run

    hadoop dfs -ls hdfs://localhost:9000/user/hive/warehouse/supplier

-Ayon
See My Photos on Flickr
Also check out my Blog for answers to commonly asked questions.
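A quick way to cross-check that, assuming the stock single-node layout (the config path, property name, and table name below are the usual defaults rather than anything confirmed in this thread), is to compare the client's default filesystem with the location the metastore has recorded:

    # does the location the metastore recorded actually resolve?
    hadoop dfs -ls hdfs://localhost:9000/user/hive/warehouse/supplier

    # which default filesystem (and port) is the client configured with?
    grep -A1 fs.default.name $HADOOP_HOME/conf/core-site.xml

    # which location does the metastore report for the table?
    hive -e 'describe formatted supplier;' | grep Location

If the port in fs.default.name disagrees with the port in the table's Location, that mismatch is a reasonable place to start.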
________________________________
From: Krish Khambadkone <kkhambadkone@apple.com>
To: user@hive.apache.org
Sent: Wednesday, September 21, 2011 2:45 PM
Subject: Re: Exception when joining HIVE tables

Here is the table info, and the query is "select acctbal, availqty, partkey from partsupp JOIN supplier ON (partsupp.suppkey == supplier.suppkey);"

desc formatted supplier;
OK
# col_name              data_type           comment

key                     string              None
acctbal                 string              None
address                 string              None
name                    string              None
nationkey               bigint              None
phone                   string              None
suppkey                 bigint              None

# Detailed Table Information
Database:               default
Owner:                  hadoop
CreateTime:             Wed Sep 21 13:05:50 PDT 2011
LastAccessTime:         UNKNOWN
Protect Mode:           None
Retention:              0
Location:               hdfs://localhost:9000/user/hive/warehouse/supplier
Table Type:             MANAGED_TABLE
Table Parameters:
    transient_lastDdlTime   1316635649

# Storage Information
SerDe Library:          org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
InputFormat:            org.apache.hadoop.mapred.TextInputFormat
OutputFormat:           org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
Compressed:             No
Num Buckets:            -1
Bucket Columns:         []
Sort Columns:           []
Storage Desc Params:
    field.delim             ,
    serialization.format    ,
Time taken: 0.213 seconds
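For reference, a CREATE TABLE that would produce metadata matching the describe output above is roughly the following; this is reconstructed from the listed columns, the ',' field delimiter, and the text input/output formats, not the DDL actually used in this thread:

    hive -e "
      CREATE TABLE supplier (
        key       STRING,
        acctbal   STRING,
        address   STRING,
        name      STRING,
        nationkey BIGINT,
        phone     STRING,
        suppkey   BIGINT)
      ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
      STORED AS TEXTFILE;"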
desc formatted partsupp;
OK
# col_name              data_type           comment

key                     string              None
availqty                int                 None
partkey                 bigint              None
suppkey                 bigint              None
supplycost              double              None

# Detailed Table Information
Database:               default
Owner:                  hadoop
CreateTime:             Wed Sep 21 13:05:37 PDT 2011
LastAccessTime:         UNKNOWN
Protect Mode:           None
Retention:              0
Location:               hdfs://localhost:9000/user/hive/warehouse/partsupp
Table Type:             MANAGED_TABLE
Table Parameters:
    transient_lastDdlTime   1316635698

# Storage Information
SerDe Library:          org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
InputFormat:            org.apache.hadoop.mapred.TextInputFormat
OutputFormat:           org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
Compressed:             No
Num Buckets:            -1
Bucket Columns:         []
Sort Columns:           []
Storage Desc Params:
    field.delim             ,
    serialization.format    ,
Time taken: 2.192 seconds


On Sep 21, 2011, at 2:10 PM, Ayon Sinha wrote:

> If you can share details of your tables and query we might be able to help. Do a desc formatted <tablename>
>
> -Ayon
> See My Photos on Flickr
> Also check out my Blog for answers to commonly asked questions.
>
> ________________________________
> From: Krish Khambadkone <kkhambadkone@apple.com>
> To: user@hive.apache.org
> Sent: Wednesday, September 21, 2011 1:11 PM
> Subject: Exception when joining HIVE tables
>
> Hi, I get this exception when I try to join two hive tables, or even when I use a specific WHERE clause. "SELECT *" from any individual table seems to work fine. Any idea what is missing here? I am on hive version hive-0.7.0-cdh3u0.
>
> java.lang.IllegalArgumentException: Can not create a Path from an empty string
>     at org.apache.hadoop.fs.Path.checkPathArg(Path.java:82)
>     at org.apache.hadoop.fs.Path.<init>(Path.java:90)
>     at org.apache.hadoop.fs.Path.<init>(Path.java:50)
>     at org.apache.hadoop.mapred.JobClient.copyRemoteFiles(JobClient.java:608)
>     at org.apache.hadoop.mapred.JobClient.copyAndConfigureFiles(JobClient.java:713)
>     at org.apache.hadoop.mapred.JobClient.copyAndConfigureFiles(JobClient.java:637)
>     at org.apache.hadoop.mapred.JobClient.access$300(JobClient.java:170)
>     at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:848)
>     at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:833)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:396)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
>     at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:833)
>     at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:807)
>     at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:657)
>     at org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:123)
>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:130)
>     at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
>     at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1063)
>     at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:900)
>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:748)
>     at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:164)
>     at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:241)
>     at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:456)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>     at java.lang.reflect.Method.invoke(Method.java:597)
>     at org.apache.hadoop.util.RunJar.main(RunJar.java:186)
> Job Submission failed with exception 'java.lang.IllegalArgumentException(Can not create a Path from an empty string)'
> FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MapRedTask
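The stack trace shows the failure inside JobClient.copyRemoteFiles / copyAndConfigureFiles, i.e. while the client is staging files for the MapReduce job rather than in the join itself, so an empty value in one of the path-valued settings that feed that step is a plausible suspect. One hedged way to look for a blank setting from the shell; the property names grepped for below are just the usual suspects on a CDH3-era install, not a confirmed diagnosis:

    # dump every effective Hive and Hadoop setting, then look for suspicious blanks
    hive -e 'set -v;' > all-settings.txt
    grep -E '^(hive\.aux\.jars\.path|hive\.exec\.scratchdir|fs\.default\.name|mapred\.job\.tracker)=' all-settings.txt
    # any line printed as "name=" with nothing after the '=' is worth fixing before rerunning the join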