hive-user mailing list archives

From "Bejoy KS" <bejoy...@yahoo.com>
Subject Re: unable to create external table, please correct the syntax
Date Thu, 12 Jul 2012 07:09:17 GMT
Hi Shaik

Step 1
Create an external table with the desired location in HDFS. The data files for the Hive table
will be stored in this location/directory in HDFS.
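A minimal sketch of such a create statement (using the column layout from the earlier mail in this thread and a hypothetical HDFS directory /user/hive/external/vender):

    CREATE EXTERNAL TABLE vender (vender STRING, supplier STRING, order_date STRING, quantity INT)
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ' '
    STORED AS TEXTFILE
    LOCATION '/user/hive/external/vender';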

Step 2
Now use the LOAD DATA command to load data from any other location into this table. On successful
execution of this command, the data files will be moved into the table's location (the one specified
in the previous step).
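For example, assuming the file is already in HDFS at the path mentioned later in this thread:

    LOAD DATA INPATH '/usr/local/hadoop_dir/hadoop/big_data/vender_details.txt' INTO TABLE vender;

Note that LOAD DATA INPATH (without LOCAL) moves the file within HDFS rather than copying it.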

Alternatively, you can move or copy files within HDFS using the hadoop fs -cp or -mv commands.
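For instance, again assuming the table's directory is the hypothetical /user/hive/external/vender:

    hadoop fs -cp /usr/local/hadoop_dir/hadoop/big_data/vender_details.txt /user/hive/external/vender/

Use hadoop fs -mv instead if you want to move the file rather than copy it.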

Regards
Bejoy KS

Sent from handheld, please excuse typos.

-----Original Message-----
From: shaik ahamed <shaik5943@gmail.com>
Date: Thu, 12 Jul 2012 12:30:23 
To: <user@hive.apache.org>; Bejoy Ks<bejoy_ks@yahoo.com>
Reply-To: user@hive.apache.org
Subject: Re: unable to create external table, please correct the syntax

Thanks for the reply guys

I have tried doing this with the load command.

I need the HDFS file to be placed in the below Hive path:

/usr/local/hive-0.9.0#

/usr/local/hadoop_dir/hadoop/big_data/vender_details.txt -- this is the
HDFS path; the file at this path, i.e. vender_details.txt, is to be placed
in the path /usr/local/hive-0.9.0# -- the Hive path.

Please reply with the syntax; I have tried all the ways with the external table
as well.


Thanks in advance

Shaik
On Wed, Jul 11, 2012 at 9:03 PM, Bejoy Ks <bejoy_ks@yahoo.com> wrote:

>  Hi Shaik
>
> For the correct syntax of the CREATE TABLE statement, please refer to
>
> https://cwiki.apache.org/Hive/languagemanual-ddl.html#LanguageManualDDL-CreateTable
>
>
> Please try out this command to avoid the syntax error:
>
> CREATE EXTERNAL TABLE vender (vender STRING, supplier STRING, order_date STRING, quantity INT)
> ROW FORMAT DELIMITED FIELDS TERMINATED BY ' '
> STORED AS TEXTFILE
> LOCATION '<hdfs dir>';
>
>
> Replace '<hdfs dir>' with the required directory path in HDFS.
>
>
> Then try out the LOAD DATA LOCAL command, since you are loading data from the local file system to HDFS;
> if the data volume is large (100 GB) it'll take some time.
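>
> A rough sketch of that command, with a hypothetical local file system path /home/hadoop/vender_details.txt:
>
>   LOAD DATA LOCAL INPATH '/home/hadoop/vender_details.txt' INTO TABLE vender;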
>
>
> Regards
>
> Bejoy KS
>
>
>
>
>   ------------------------------
> From: shaik ahamed <shaik5943@gmail.com>
> To: user@hive.apache.org
> Sent: Wednesday, July 11, 2012 8:38 PM
> Subject: unable to create external table, please correct the syntax
>
>  Thanks for the reply guys,
>
> I have tried using the below cmd
>
>  /usr/local/hive-0.9.0# load data local inpath
> '/usr/local/hadoop_dir/hadoop/big_data/vender_details.txt' into table
> vender;
>
> Can't we load the data into the above Hive path using the above command?
>
> In the below there is a syntax error;
> please correct it:
>
> hive> create external table vender(vender string,supplier
> string,order_date string,quantity
> int)['./usr/local/hadoop_dir/hadoop/big_data/vender_details.txt'] [ row
> format delimited fields terminated by ' ' stored as textfile] ;
>
> FAILED: Parse Error: line 1:90 mismatched input '[' expecting EOF near ')'
> Thanks in advance
>
> Regards
> shaik.
>
>
>
