sqoop-user mailing list archives

From Jarek Jarcec Cecho <jar...@apache.org>
Subject Re: sqoop-export with sequence files doesn't work.
Date Sat, 24 Aug 2013 23:07:25 GMT
Hi Deepak,
I would advise you to ask CDH-specific questions on the cdh-user [1] mailing list, as you are
much more likely to get an answer there.

For your convenience, the Sqoop HCatalog integration will be part of the upcoming CDH 4.4.0.
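
For reference, once that lands, a sequence-file backed Hive table can be exported through
the HCatalog options instead of --export-dir. A minimal sketch, assuming Sqoop 1.4.4 with
HCatalog configured (connection string, database and table names are placeholders):

    sqoop export --connect "jdbc:mysql://<host>:3306/database" \
      --username sqoop -P \
      --table tablename \
      --hcatalog-database default \
      --hcatalog-table seqtbl

With --hcatalog-table, Sqoop reads the records through HCatalog, so the underlying storage
format of the Hive table (sequence file, text, etc.) is no longer an issue for the export.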

Jarcec

Links:
1: https://groups.google.com/a/cloudera.org/forum/#!forum/cdh-user

On Fri, Aug 16, 2013 at 11:04:34AM -0700, Venkat Ranganathan wrote:
> I don't know about the CDH distro - maybe it does. You need Sqoop 1.4.4
> along with an HCatalog installation.
> 
> Thanks
> 
> Venkat
> 
> 
> > On Fri, Aug 16, 2013 at 10:28 AM, Deepak Konidena <deepakkoni@gmail.com> wrote:
> 
> > @Krishna Rao - The sequence file gives us the compression we need.
> > Converting the data into a non-sequence file is not an option since it
> > won't scale.
> >
> > @Venkat - The version of Sqoop I am using is 1.4.3. Does HCatalog have to
> > be installed separately, or does it come as part of CDH 4.3?
> >
> >
> > -Deepak
> >
> >
> >
> > On Fri, Aug 16, 2013 at 10:14 AM, Venkat Ranganathan <vranganathan@hortonworks.com> wrote:
> >
> >> The HCatalog integration handles tables with sequence files. It is part
> >> of 1.4.4 and handles both exports and imports.
> >>
> >> Venkat
> >>
> >>
> >> On Fri, Aug 16, 2013 at 9:32 AM, Krishna Rao <krishnanjrao@gmail.com> wrote:
> >>
> >>> I've run into this problem as well. I ended up copying the table into a
> >>> non-sequence-file table just so I could sqoop it out (something along the
> >>> lines of CREATE TABLE nonSeqTbl LIKE seqTbl; INSERT OVERWRITE TABLE
> >>> nonSeqTbl SELECT * FROM seqTbl;).
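
A fuller sketch of that workaround (table names are Krishna's hypothetical ones; the
warehouse path and Ctrl-A delimiter assume Hive defaults, and CREATE TABLE ... AS SELECT
is used here in place of the LIKE/INSERT pair):

    # stage the data into a plain-text copy of the table
    hive -e "CREATE TABLE nonSeqTbl STORED AS TEXTFILE AS SELECT * FROM seqTbl;"

    # export the text copy; Hive writes fields separated by Ctrl-A (\001) by default
    sqoop export --connect "jdbc:mysql://<host>:3306/database" --username sqoop -P \
      --table tablename \
      --export-dir /user/hive/warehouse/nonseqtbl \
      --input-fields-terminated-by '\001'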
> >>>
> >>> Is there a plan to allow sqoop-exporting of sequence file tables?
> >>>
> >>> Krishna
> >>>
> >>>
> >>> On 16 August 2013 17:26, Abraham Elmahrek <abe@cloudera.com> wrote:
> >>>
> >>>> Ah, I believe you're correct. Was this data imported with Sqoop? If so,
> >>>> does the table you're exporting to differ from the table you imported from?
> >>>>
> >>>>
> >>>> On Thu, Aug 15, 2013 at 11:38 PM, Deepak Konidena <deepakkoni@gmail.com> wrote:
> >>>>
> >>>>> Does sqoop-export support the --as-sequencefile option? I know sqoop-import
> >>>>> does.
> >>>>>
> >>>>>
> >>>>> -Deepak
> >>>>>
> >>>>>
> >>>>>
> >>>>> On Thu, Aug 15, 2013 at 11:34 PM, Abraham Elmahrek <abe@cloudera.com> wrote:
> >>>>>
> >>>>>> Hey There,
> >>>>>>
> >>>>>> I believe you're missing the --as-sequencefile directive!
> >>>>>>
> >>>>>> -Abe
> >>>>>>
> >>>>>>
> >>>>>> On Thu, Aug 15, 2013 at 7:16 PM, Deepak Konidena <deepakkoni@gmail.com> wrote:
> >>>>>>
> >>>>>>> Hi,
> >>>>>>>
> >>>>>>> I have a sequence file with both (key, value) as
> >>>>>>> org.apache.hadoop.io.Text.
> >>>>>>>
> >>>>>>> I am trying to export the data into a MySQL table with (key, value)
> >>>>>>> mapped to (varchar, blob) since the value is pretty big, and I get the
> >>>>>>> following error:
> >>>>>>>
> >>>>>>> (command) - sqoop export -m "1" --connect
> >>>>>>> "jdbc:mysql://<host>:3306/database" --username "sqoop" --password
> >>>>>>> "sqooppwd" --table "tablename" --export-dir "/path/to/sequencefile"
> >>>>>>> --verbose
> >>>>>>>
> >>>>>>> java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be
> >>>>>>> cast to org.apache.hadoop.io.LongWritable
> >>>>>>>     at
> >>>>>>> org.apache.sqoop.mapreduce.CombineShimRecordReader.getCurrentKey(CombineShimRecordReader.java:95)
> >>>>>>>     at
> >>>>>>> org.apache.sqoop.mapreduce.CombineShimRecordReader.getCurrentKey(CombineShimRecordReader.java:38)
> >>>>>>>     at
> >>>>>>> org.apache.sqoop.mapreduce.CombineFileRecordReader.getCurrentKey(CombineFileRecordReader.java:79)
> >>>>>>>     at
> >>>>>>> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.getCurrentKey(MapTask.java:461)
> >>>>>>>     at
> >>>>>>> org.apache.hadoop.mapreduce.task.MapContextImpl.getCurrentKey(MapContextImpl.java:66)
> >>>>>>>     at
> >>>>>>> org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.getCurrentKey(WrappedMapper.java:75)
> >>>>>>>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:140)
> >>>>>>>     at
> >>>>>>> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
> >>>>>>>     at
> >>>>>>> org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
> >>>>>>>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
> >>>>>>>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
> >>>>>>>
> >>>>>>> The export works fine when I create a text file like so,
> >>>>>>>
> >>>>>>> <key,value1,value2,value3>
> >>>>>>>
> >>>>>>> and upload it to HDFS using -copyFromLocal.
> >>>>>>>
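For contrast, a rough sketch of that working plain-text path (file name, layout, and
paths here are illustrative only, not Deepak's actual schema):

    # a comma-delimited text file, one record per line, copied into HDFS
    hadoop fs -copyFromLocal data.csv /path/to/textdir/

    # export the text directory; comma is Sqoop's default input field delimiter
    sqoop export --connect "jdbc:mysql://<host>:3306/database" --username sqoop -P \
      --table tablename --export-dir /path/to/textdir
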
> >>>>>>> But it's only with sequence files that it doesn't seem to work. Any
> >>>>>>> thoughts?
> >>>>>>>
> >>>>>>> Thanks,
> >>>>>>> Deepak
> >>>>>>>
> >>>>>>>
> >>>>>>
> >>>>>
> >>>>
> >>>
> >>
> >
> >
> >
> 
