hbase-user mailing list archives

From Sambit Tripathy <sambi...@gmail.com>
Subject Re: Exceptions with importtsv
Date Wed, 02 May 2012 06:42:42 GMT
Thanks Yifeng. Well-thought-out input :) and it works.

On Sun, Apr 29, 2012 at 1:43 PM, Yifeng Jiang <uprushworld@gmail.com> wrote:

> Hi Sambit,
>
> Are you specifying a local file system path on the command line?
> Before invoking importtsv, you will need to copy your TSV files to HDFS
> first.
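For reference, that staging step can be sketched as below, using the local and HDFS paths mentioned elsewhere in this thread; the hadoop calls are guarded so the snippet degrades to a message where no client is installed:

```shell
# Sketch: importtsv reads its input from HDFS, so local files must be
# staged there first. SRC/DST are the example paths from this thread.
SRC=/opt/hadoop/raw       # local directory holding the delimited files
DST=/user/hadoop/raw      # HDFS directory that importtsv will read
if command -v hadoop >/dev/null 2>&1; then
  hadoop fs -mkdir -p "$DST"
  hadoop fs -put "$SRC"/* "$DST"/
else
  echo "hadoop not on PATH; would run: hadoop fs -put $SRC/* $DST/"
fi
```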
>
> -Yifeng
>
> On Apr 27, 2012, at 6:08 PM, Sambit Tripathy wrote:
>
> > I am able to run this command, but it goes on forever. I don't see any
> > data uploaded.
> >
> > This is what I see on the console.
> >
> > http://pastebin.com/J2WApji1
> >
> >
> > Any idea on how to debug this?
> >
> >
> >
> >
> > > On Fri, Apr 27, 2012 at 11:30 AM, Sambit Tripathy <sambit19@gmail.com> wrote:
> >
> >> Thanks all for the reply.
> >>
> >> I am able to run this.
> >>
> >> HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` hadoop jar
> >> ${HBASE_HOME}/hbase-0.92.1.jar importtsv
> >> -Dimporttsv.bulk.output=/user/hadoop/input/bulk
> >> -Dimporttsv.columns=HBASE_ROW_KEY,ns: -Dimporttsv.separator=, testTable
> >> /opt/hadoop/raw
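When a run like the one above succeeds, the HFiles should appear under the -Dimporttsv.bulk.output path, one subdirectory per column family (here ns). A quick check, sketched with the paths from this thread and guarded so it is harmless without a cluster:

```shell
# Sketch: list the bulk-output directory produced by importtsv. Expect
# one subdirectory per column family (e.g. ns/) holding the HFiles.
BULK_OUT=/user/hadoop/input/bulk
if command -v hadoop >/dev/null 2>&1; then
  hadoop fs -ls "$BULK_OUT"
  hadoop fs -ls "$BULK_OUT/ns"
else
  echo "hadoop not on PATH; would run: hadoop fs -ls $BULK_OUT"
fi
```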
> >>
> >>
> >>
> >>
> >> -Sambit.
> >>
> >>
> >> On Thu, Apr 26, 2012 at 4:21 PM, Harsh J <harsh@cloudera.com> wrote:
> >>
> >>> Sambit,
> >>>
> >>> Just a tip:
> >>>
> >>> When using the "hadoop" executable to run HBase programs of any kind,
> >>> the right way is to do this:
> >>>
> >>> HADOOP_CLASSPATH=`hbase classpath` hadoop jar <args>
> >>>
> >>> This will ensure you run with all HBase dependencies loaded on the
> >>> classpath, for code to find its HBase-specific resources.
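One way to sanity-check what `hbase classpath` contributes is to split it on ':' and grep for the jars involved in the errors seen in this thread. The classpath string below is a stand-in example with illustrative jar versions; on a real node you would use CP=$(hbase classpath):

```shell
# Sketch: split a classpath on ':' and look for the Guava and ZooKeeper
# jars. CP here is a stand-in; on a real node use: CP=$(hbase classpath)
CP="/usr/local/hbase/hbase-0.92.1.jar:/usr/local/hbase/lib/guava-r09.jar:/usr/local/hbase/lib/zookeeper-3.4.3.jar"
echo "$CP" | tr ':' '\n' | grep -E 'guava|zookeeper'
# prints the guava and zookeeper jar paths, one per line
```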
> >>>
> >>> On Thu, Apr 26, 2012 at 3:10 PM, Sambit Tripathy <sambit19@gmail.com>
> >>> wrote:
> >>>> Slim,
> >>>>
> >>>>
> >>>> That exception is gone now after adding the guava jar. (I wonder why we
> >>>> need Google's Guava library here!)
> >>>>
> >>>> Well, there is something more: I am now getting the following exception.
> >>>>
> >>>> Exception in thread "main" java.lang.reflect.InvocationTargetException
> >>>>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>>>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >>>>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >>>>       at java.lang.reflect.Method.invoke(Method.java:597)
> >>>>       at org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:51)
> >>>>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>>>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >>>>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >>>>       at java.lang.reflect.Method.invoke(Method.java:597)
> >>>>       at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> >>>> Caused by: java.lang.NoClassDefFoundError: org/apache/zookeeper/KeeperException
> >>>>       at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:186)
> >>>>       at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:194)
> >>>>       at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:171)
> >>>>       at org.apache.hadoop.hbase.mapreduce.ImportTsv.createSubmittableJob(ImportTsv.java:220)
> >>>>       at org.apache.hadoop.hbase.mapreduce.ImportTsv.main(ImportTsv.java:312)
> >>>>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>>>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >>>>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >>>>       at java.lang.reflect.Method.invoke(Method.java:597)
> >>>>       at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
> >>>>       at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
> >>>>       ... 10 more
> >>>> Caused by: java.lang.ClassNotFoundException: org.apache.zookeeper.KeeperException
> >>>>       at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> >>>>       at java.security.AccessController.doPrivileged(Native Method)
> >>>>       at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> >>>>       at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> >>>>       at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> >>>>       ... 21 more
> >>>>
> >>>>
> >>>> Any idea? It looks like a ZooKeeper issue, but I checked the logs and
> >>>> ZooKeeper itself is fine. The exception is printed on the console.
> >>>>
> >>>>
> >>>> Thanks
> >>>> Sambit.
> >>>>
> >>>>
> >>>> On Thu, Apr 26, 2012 at 2:25 PM, slim tebourbi <slimtbourbi@gmail.com> wrote:
> >>>>
> >>>>> Hi Sambit,
> >>>>> I think that you should add google guava jar to your job classpath.
> >>>>>
> >>>>> Slim.
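One concrete way to do that is to prepend the Guava jar that ships under $HBASE_HOME/lib to HADOOP_CLASSPATH. The jar file name below (guava-r09.jar) is illustrative only; check the actual version in your lib directory:

```shell
# Sketch: prepend HBase's bundled Guava jar to the Hadoop job classpath.
# The version in the file name is an example; ls $HBASE_HOME/lib to confirm.
HBASE_HOME=${HBASE_HOME:-/usr/local/hbase}
GUAVA_JAR="$HBASE_HOME/lib/guava-r09.jar"
export HADOOP_CLASSPATH="$GUAVA_JAR${HADOOP_CLASSPATH:+:$HADOOP_CLASSPATH}"
echo "$HADOOP_CLASSPATH"
```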
> >>>>>
> >>>>> On 26 April 2012 at 10:50, Sambit Tripathy <sambit19@gmail.com> wrote:
> >>>>>
> >>>>>> Hi All,
> >>>>>>
> >>>>>> Can anyone help me with this exception?
> >>>>>>
> >>>>>> I have been trying to import data from csv files into HBase.
> >>>>>>
> >>>>>> As per my understanding, the process is:
> >>>>>>
> >>>>>> 1. Generate HFiles from the input using the *importtsv* tool provided
> >>>>>> by HBase.
> >>>>>> 2. Bulk-load the data from those HFiles into HBase using the
> >>>>>> *completebulkload* tool.
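The two steps above can be sketched as a pair of commands; jar path, table name, and HDFS paths are this thread's examples, and the snippet only builds and prints the command lines rather than running them:

```shell
# Sketch of the two-step bulk-load flow. Paths/names are the thread's
# examples; the backticks are escaped, so nothing is executed here.
HBASE_JAR=${HBASE_HOME:-/usr/local/hbase}/hbase-0.92.1.jar
# Step 1: parse the delimited input and write HFiles instead of live Puts.
STEP1="HADOOP_CLASSPATH=\`hbase classpath\` hadoop jar $HBASE_JAR importtsv -Dimporttsv.bulk.output=/user/hadoop/input/bulk -Dimporttsv.columns=HBASE_ROW_KEY,ns: -Dimporttsv.separator=, testTable /user/hadoop/raw"
# Step 2: hand the generated HFiles over to the regions of testTable.
STEP2="HADOOP_CLASSPATH=\`hbase classpath\` hadoop jar $HBASE_JAR completebulkload /user/hadoop/input/bulk testTable"
printf '%s\n%s\n' "$STEP1" "$STEP2"
```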
> >>>>>>
> >>>>>> However, when I issue the following command, I encounter an exception:
> >>>>>>
> >>>>>> hadoop@srtidev001:/usr/local/hbase> hadoop jar hbase-0.92.1.jar importtsv
> >>>>>> -Dimporttsv.bulk.output=/user/hadoop/input.bulk
> >>>>>> -Dimporttsv.columns=HBASE_ROW_KEY,ns: -Dimporttsv.separator=, testTable
> >>>>>> /opt/hadoop/raw
> >>>>>> Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/collect/Multimap
> >>>>>>       at org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:43)
> >>>>>>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>>>>>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >>>>>>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >>>>>>       at java.lang.reflect.Method.invoke(Method.java:597)
> >>>>>>       at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> >>>>>> Caused by: java.lang.ClassNotFoundException: com.google.common.collect.Multimap
> >>>>>>       at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> >>>>>>       at java.security.AccessController.doPrivileged(Native Method)
> >>>>>>       at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> >>>>>>       at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> >>>>>>       at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> >>>>>>       ... 6 more
> >>>>>>
> >>>>>> *Note: I removed the native libraries during the Hadoop installation.
> >>>>>> I doubt that is what causes this exception, though, since the missing
> >>>>>> class (com.google.common.collect.Multimap) comes from a Google Java
> >>>>>> library (Guava), not from native code.*
> >>>>>>
> >>>>>> Thanks
> >>>>>> Sambit.
> >>>>>>
> >>>>>
> >>>
> >>>
> >>>
> >>> --
> >>> Harsh J
> >>>
> >>
> >>
>
>
