hive-user mailing list archives

From Hamza Asad <hamza.asa...@gmail.com>
Subject Re: Export hive table format issue
Date Tue, 18 Jun 2013 12:22:29 GMT
Attached are the schema files of both the Hive and MySQL tables.


On Tue, Jun 18, 2013 at 5:11 PM, Nitin Pawar <nitinpawar432@gmail.com> wrote:

>  For the NumberFormatException, can you share your MySQL schema (put it as
> an attachment, not inline in the mail)? If you created the table with INT,
> try switching the column to BIGINT.
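>
> A minimal sketch of that change from the shell, assuming the failing
> column is named event_type (a hypothetical name; substitute the real one
> from your schema):
>
>   # widen one INT column of the target MySQL table to BIGINT
>   mysql -u xxxxxxxx -p -e \
>     "ALTER TABLE dump_hive_events_details MODIFY COLUMN event_type BIGINT;" \
>     xxxx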
>
>
>
> On Tue, Jun 18, 2013 at 5:37 PM, Hamza Asad <hamza.asad13@gmail.com> wrote:
>
>> I copy-pasted the row into Office Writer, where I saw it is # separated...
>> yeah, the \N values represent NULL.
>> The version of Sqoop is
>> *Sqoop 1.4.2
>> git commit id
>> Compiled by ag on Tue Aug 14 17:37:19 IST 2012*
>>
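>>
>> To double-check the separator, I could also dump the row's raw bytes
>> instead of pasting it into an editor; a minimal sketch (the file name
>> under the export dir is a guess):
>>
>>   # non-printable delimiters show up in octal; Hive's default ^A prints as 001
>>   hadoop fs -cat hive/warehouse/xxxx.db/events_details/000000_0 \
>>     | head -n 1 | od -c | head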
>>
>> On Tue, Jun 18, 2013 at 5:01 PM, Nitin Pawar <nitinpawar432@gmail.com> wrote:
>>
>>> is "#" your field separator?
>>> also the separator is normally an octal representation so you can give
>>> it a try.
>>>
>>> why does your columns have \N as values? is it for NULL ?
>>>
>>> what version of sqoop are you using?
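>>>
>>> Note that the exception shows the whole row as a single "input string",
>>> which usually means the separator never matched and Sqoop parsed the
>>> entire line as one numeric field. If the table was written by Hive with
>>> its defaults, the real separator is Ctrl-A, which sqoop export accepts in
>>> octal form. A minimal sketch, assuming the Hive defaults were not
>>> overridden:
>>>
>>>   sqoop export --connect jdbc:mysql://localhost/xxxx \
>>>     --table dump_hive_events_details \
>>>     --export-dir hive/warehouse/xxxx.db/events_details \
>>>     --input-fields-terminated-by '\001' \
>>>     --input-null-string '\\N' --input-null-non-string '\\N' \
>>>     --username xxxxxxxx --password xxxxxxxxx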
>>>
>>>
>>> On Tue, Jun 18, 2013 at 5:00 PM, Hamza Asad <hamza.asad13@gmail.com> wrote:
>>>
>>>> I'm executing the following command:*
>>>> sqoop export --connect jdbc:mysql://localhost/xxxx --table
>>>> dump_hive_events_details --export-dir hive/warehouse/xxxx.db/events_details
>>>> --input-null-non-string \N --input-fields-terminated-by '#' --username
>>>> xxxxxxxx --password xxxxxxxxx*
>>>> *
>>>> 13/06/18 16:26:44 INFO mapred.JobClient: Task Id :
>>>> attempt_201306170658_0106_m_000001_0, Status : FAILED
>>>> java.lang.NumberFormatException: For input string: "8119844 1 4472499
>>>> 2013-01-29 00:00:00.0 1 4 1 \N \N \N \N \N \N \N \N \N \N 8 \N \N \N \N \N
>>>> 1 \N \N 3 2 \N 1"
>>>>     at
>>>> java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
>>>>     at java.lang.Integer.parseInt(Integer.java:492)
>>>>     at java.lang.Integer.valueOf(Integer.java:582)
>>>>     at
>>>> dump_hive_events_details.__loadFromFields(dump_hive_events_details.java:949)
>>>>     at dump_hive_events_details.parse(dump_hive_events_details.java:901)
>>>>     at
>>>> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:77)
>>>>     at
>>>> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
>>>>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>>>>     at
>>>> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>>>>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
>>>>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>>>>     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>     at
>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>>>>     at org.apache.hadoop.mapred.Child.main(Child.java:249)
>>>> *
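>>>>
>>>> (A side note on the command above, worth double-checking: \N is unquoted
>>>> there, so the shell strips the backslash and Sqoop receives a plain N as
>>>> the null marker. A quick check of what the shell passes through:
>>>>
>>>>   echo \N '\N'   # prints: N \N
>>>>
>>>> Quoting the value, e.g. '\\N' as in the Sqoop user guide's null-handling
>>>> examples, avoids this.)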
>>>>
>>>>
>>>> On Tue, Jun 18, 2013 at 4:07 PM, Nitin Pawar <nitinpawar432@gmail.com> wrote:
>>>>
>>>>> Check the --input-fields-terminated-by option of sqoop export.
>>>>>
>>>>>
>>>>> On Tue, Jun 18, 2013 at 4:31 PM, Hamza Asad <hamza.asad13@gmail.com> wrote:
>>>>>
>>>>>> I want to export my table to MySQL, and for that I'm using the sqoop
>>>>>> export command, but the data in HDFS apparently has no field separator.
>>>>>> It must contain one, though. The data is saved in the format shown below:
>>>>>> *8119844 1 4472499 2013-01-29 00:00:00.0 1 4 1 \N \N \N \N \N \N \N
>>>>>> \N \N \N 8 \N \N \N \N \N 1 \N \N 3 2 \N 1*
>>>>>> How can I export this kind of data to MySQL, and what field separator
>>>>>> should I specify there? Please help.
>>>>>>
>>>>>> --
>>>>>> *Muhammad Hamza Asad*
>>>>>>
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> Nitin Pawar
>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> *Muhammad Hamza Asad*
>>>>
>>>
>>>
>>>
>>> --
>>> Nitin Pawar
>>>
>>
>>
>>
>> --
>> *Muhammad Hamza Asad*
>>
>
>
>
> --
> Nitin Pawar
>



-- 
*Muhammad Hamza Asad*
