hbase-user mailing list archives

From: Ted Yu <yuzhih...@gmail.com>
Subject: Re: org.apache.hadoop.hbase.TableNotFoundException
Date: Tue, 16 Apr 2013 05:38:08 GMT
Was there a carriage return between ADDRESS and the comma in your command line?
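
If so, that would explain the TableNotFoundException: the line break makes ,CUSTOMER_INFO:MOBILE a separate argument, which ImportTsv then treats as the table name. As a sketch, the same command with your paths and options, keeping the column list on one logical line (backslashes only for readability):

HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` ${HADOOP_HOME}/bin/hadoop jar ${HBASE_HOME}/hbase-0.94.6.1.jar \
  importtsv '-Dimporttsv.separator=;' \
  -Dimporttsv.columns=HBASE_ROW_KEY,CUSTOMER_INFO:NAME,CUSTOMER_INFO:EMAIL,CUSTOMER_INFO:ADDRESS,CUSTOMER_INFO:MOBILE \
  -Dimporttsv.bulk.output=hdfs://hbase/storefileoutput CUSTOMERS hdfs://hbase/copiedFromLocal/customer.txt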

Cheers

On Apr 15, 2013, at 10:22 PM, Omkar Joshi <Omkar.Joshi@lntinfotech.com> wrote:

> Hi,
> 
> I had created a table called CUSTOMERS (using create 'CUSTOMERS', 'CUSTOMER_INFO') 2-3 days ago and inserted a couple of rows via the shell.
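> 
> (For reference, the shell steps were roughly the following; the put values are only illustrative, taken from the first line of the file:)
> 
> hbase(main):001:0> create 'CUSTOMERS', 'CUSTOMER_INFO'
> hbase(main):002:0> put 'CUSTOMERS', 'C1', 'CUSTOMER_INFO:NAME', 'Carol X. Nash'
> hbase(main):003:0> put 'CUSTOMERS', 'C1', 'CUSTOMER_INFO:EMAIL', 'hendrerit.Donec@diamnuncullamcorper.edu'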
> 
> Now I wish to upload data into the table from a text file that looks like this:
> 
> C1;Carol X. Nash;hendrerit.Donec@diamnuncullamcorper.edu;459-1190 Tempor Rd.;(656) 169-7763;
> C2;Francesca B. Kirby;eget.odio.Aliquam@duiCumsociis.edu;4862 Integer Street;(884) 979-2109;
> C3;Quentin Z. Rodriquez;sit.amet@ligulaAeneaneuismod.com;1225 Egestas Rd.;(400) 901-2951;
> C4;Steven D. Ashley;accumsan.interdum@elitelit.edu;3747 Fringilla Rd.;(160) 300-7921;
> 
> and so on.
> 
> The DFS directory structure is as follows:
> 
> hadoop fs -ls /hbase
> Warning: $HADOOP_HOME is deprecated.
> 
> Found 11 items
> drwxr-xr-x   - hduser supergroup          0 2013-04-09 19:47 /hbase/-ROOT-
> drwxr-xr-x   - hduser supergroup          0 2013-04-09 19:47 /hbase/.META.
> drwxr-xr-x   - hduser supergroup          0 2013-04-16 16:02 /hbase/.archive
> drwxr-xr-x   - hduser supergroup          0 2013-04-09 19:47 /hbase/.logs
> drwxr-xr-x   - hduser supergroup          0 2013-04-09 19:47 /hbase/.oldlogs
> drwxr-xr-x   - hduser supergroup          0 2013-04-16 16:05 /hbase/.tmp
> drwxr-xr-x   - hduser supergroup          0 2013-04-16 16:05 /hbase/CUSTOMERS
> drwxr-xr-x   - hduser supergroup          0 2013-04-16 15:44 /hbase/copiedFromLocal
> -rw-r--r--   4 hduser supergroup         38 2013-04-09 19:47 /hbase/hbase.id
> -rw-r--r--   4 hduser supergroup          3 2013-04-09 19:47 /hbase/hbase.version
> drwxr-xr-x   - hduser supergroup          0 2013-04-09 22:03 /hbase/users
> 
> I have loaded the text file onto HDFS:
> 
> hadoop fs -ls /hbase/copiedFromLocal
> Warning: $HADOOP_HOME is deprecated.
> 
> Found 1 items
> -rw-r--r--   4 hduser supergroup    4751429 2013-04-16 15:44 /hbase/copiedFromLocal/customer.txt
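> 
> (The upload was done with something like the following; the local source path here is just a placeholder:)
> 
> hadoop fs -copyFromLocal /local/path/customer.txt /hbase/copiedFromLocal/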
> 
> I'm using the command below to import this file into HBase:
> 
> HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` ${HADOOP_HOME}/bin/hadoop jar ${HBASE_HOME}/hbase-0.94.6.1.jar
> importtsv '-Dimporttsv.separator=;' -Dimporttsv.columns=HBASE_ROW_KEY,CUSTOMER_INFO:NAME,CUSTOMER_INFO:EMAIL,CUSTOMER_INFO:ADDRESS
> ,CUSTOMER_INFO:MOBILE  -Dimporttsv.bulk.output=hdfs://hbase/storefileoutput CUSTOMERS hdfs://hbase/copiedFromLocal/customer.txt
> 
> But I get an exception:
> 
> 13/04/16 16:16:05 WARN client.HConnectionManager$HConnectionImplementation: Encountered problems when prefetch META table:
> org.apache.hadoop.hbase.TableNotFoundException: Cannot find row in .META. for table: ,CUSTOMER_INFO:MOBILE, row=,CUSTOMER_INFO:MOBILE,,99999999999999
>        at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:164)
>        at org.apache.hadoop.hbase.client.MetaScanner.access$000(MetaScanner.java:54)
>        at org.apache.hadoop.hbase.client.MetaScanner$1.connect(MetaScanner.java:133)
>        at org.apache.hadoop.hbase.client.MetaScanner$1.connect(MetaScanner.java:130)
>        at org.apache.hadoop.hbase.client.HConnectionManager.execute(HConnectionManager.java:383)
>        at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:130)
>        at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:105)
>        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.prefetchRegionCache(HConnectionManager.java:947)
>        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:1002)
>        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:889)
>        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:846)
>        at org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:234)
>        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:174)
>        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:133)
>        at org.apache.hadoop.hbase.mapreduce.TableOutputFormat.setConf(TableOutputFormat.java:201)
>        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
>        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:884)
>        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
>       at java.security.AccessController.doPrivileged(Native Method)
>        at javax.security.auth.Subject.doAs(Subject.java:415)
>        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
>        at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
>        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
>        at org.apache.hadoop.hbase.mapreduce.ImportTsv.main(ImportTsv.java:425)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>        at java.lang.reflect.Method.invoke(Method.java:601)
>        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>        at java.lang.reflect.Method.invoke(Method.java:601)
>        at org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:51)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>        at java.lang.reflect.Method.invoke(Method.java:601)
>        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> 13/04/16 16:16:05 ERROR mapreduce.TableOutputFormat: org.apache.hadoop.hbase.TableNotFoundException: ,CUSTOMER_INFO:MOBILE
> 13/04/16 16:16:05 INFO mapred.JobClient: Cleaning up the staging area hdfs://cldx-1139-1033:9000/tmp/hadoop-hduser/mapred/staging/hduser/.staging/job_201304091909_0006
> Exception in thread "main" java.lang.reflect.InvocationTargetException
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>        at java.lang.reflect.Method.invoke(Method.java:601)
>        at org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:51)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>        at java.lang.reflect.Method.invoke(Method.java:601)
>        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> Caused by: java.lang.RuntimeException: org.apache.hadoop.hbase.TableNotFoundException: ,CUSTOMER_INFO:MOBILE
>        at org.apache.hadoop.hbase.mapreduce.TableOutputFormat.setConf(TableOutputFormat.java:206)
>        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
>        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:884)
>        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
>        at java.security.AccessController.doPrivileged(Native Method)
>        at javax.security.auth.Subject.doAs(Subject.java:415)
>        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
>        at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
>        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
>        at org.apache.hadoop.hbase.mapreduce.ImportTsv.main(ImportTsv.java:425)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>        at java.lang.reflect.Method.invoke(Method.java:601)
>        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>        ... 10 more
> Caused by: org.apache.hadoop.hbase.TableNotFoundException: ,CUSTOMER_INFO:MOBILE
>        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:1024)
>        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:889)
>        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:846)
>        at org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:234)
>        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:174)
>        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:133)
>        at org.apache.hadoop.hbase.mapreduce.TableOutputFormat.setConf(TableOutputFormat.java:201)
>        ... 27 more
> 
> I disabled and dropped CUSTOMERS and recreated it, but the issue recurs.
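> 
> (Roughly, in the HBase shell:)
> 
> disable 'CUSTOMERS'
> drop 'CUSTOMERS'
> create 'CUSTOMERS', 'CUSTOMER_INFO'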
> 
> Please guide me.
> 
> Regards,
> Omkar Joshi
> 
> 
