hbase-user mailing list archives

From "Yair Even-Zohar" <ya...@revenuescience.com>
Subject RE: backup tables using ImportMR / ExportMR ( HBASE-974 )
Date Fri, 06 Feb 2009 21:10:25 GMT
Thanks for the quick reply. Several comments:

1) I had to replace "new Configuration()" with "new HBaseConfiguration()"
in the Java source, or the export didn't work properly (a sketch of the
change follows these comments).

2) I had to add the Hadoop and HBase jars to the classpath in
make.....jar.sh, or the sources wouldn't compile.

3) When running ImportMR.sh, I always get the following error once the map
reaches 100% and the reduce reaches 40% or 66% (full output pasted after my
signature, with a short note after it). Please let me know if you are
familiar with the problem.
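
For reference, the change in (1) looks roughly like the following. This is
only a minimal sketch of the job setup, not the actual ExportMR source; the
class name, method name, and job name are illustrative:

import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.mapred.JobConf;

public class ExportJobSetup {
  public static JobConf createJobConf() {
    // Use HBaseConfiguration instead of a plain Configuration so that
    // hbase-default.xml and hbase-site.xml are loaded; with a plain
    // Configuration the HBase client only sees its built-in defaults.
    HBaseConfiguration conf = new HBaseConfiguration();
    JobConf jobConf = new JobConf(conf, ExportJobSetup.class);
    jobConf.setJobName("ExportMR");
    return jobConf;
  }
}
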
Thanks
-Yair

09/02/06 15:57:52 INFO mapred.JobClient:  map 100% reduce 66%
09/02/06 16:00:47 INFO mapred.JobClient:  map 100% reduce 53%
09/02/06 16:00:47 INFO mapred.JobClient: Task Id : attempt_200902061529_0007_r_000000_0, Status : FAILED
org.apache.hadoop.hbase.MasterNotRunningException: localhost:60000
        at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.getMaster(HConnectionManager.java:236)
        at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegion(HConnectionManager.java:422)
        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:114)
        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:74)
        at ImportMR$MyReducer.reduce(ImportMR.java:138)
        at ImportMR$MyReducer.reduce(ImportMR.java:128)
        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:430)
        at org.apache.hadoop.mapred.Child.main(Child.java:155)

attempt_200902061529_0007_r_000000_0: Exception in thread "Timer thread for monitoring mapred" java.lang.NullPointerException
attempt_200902061529_0007_r_000000_0:   at org.apache.hadoop.metrics.ganglia.GangliaContext.xdr_string(GangliaContext.java:195)
attempt_200902061529_0007_r_000000_0:   at org.apache.hadoop.metrics.ganglia.GangliaContext.emitMetric(GangliaContext.java:138)
attempt_200902061529_0007_r_000000_0:   at org.apache.hadoop.metrics.ganglia.GangliaContext.emitRecord(GangliaContext.java:123)
attempt_200902061529_0007_r_000000_0:   at org.apache.hadoop.metrics.spi.AbstractMetricsContext.emitRecords(AbstractMetricsContext.java:304)
attempt_200902061529_0007_r_000000_0:   at org.apache.hadoop.metrics.spi.AbstractMetricsContext.timerEvent(AbstractMetricsContext.java:290)
attempt_200902061529_0007_r_000000_0:   at org.apache.hadoop.metrics.spi.AbstractMetricsContext.access$000(AbstractMetricsContext.java:50)
attempt_200902061529_0007_r_000000_0:   at org.apache.hadoop.metrics.spi.AbstractMetricsContext$1.run(AbstractMetricsContext.java:249)
attempt_200902061529_0007_r_000000_0:   at java.util.TimerThread.mainLoop(Timer.java:512)
attempt_200902061529_0007_r_000000_0:   at java.util.TimerThread.run(Timer.java:462)
09/02/06 16:00:48 INFO mapred.JobClient:  map 100% reduce 13%
09/02/06 16:00:48 INFO mapred.JobClient: Task Id : attempt_200902061529_0007_r_000002_0, Status : FAILED
org.apache.hadoop.hbase.MasterNotRunningException: localhost:60000
        at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.getMaster(HConnectionManager.java:236)
        at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegion(HConnectionManager.java:422)
        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:114)
        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:74)
        at ImportMR$MyReducer.reduce(ImportMR.java:138)
        at ImportMR$MyReducer.reduce(ImportMR.java:128)
        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:430)
        at org.apache.hadoop.mapred.Child.main(Child.java:155)

attempt_200902061529_0007_r_000002_0: Exception in thread "Timer thread for monitoring mapred" java.lang.NullPointerException
attempt_200902061529_0007_r_000002_0:   at org.apache.hadoop.metrics.ganglia.GangliaContext.xdr_string(GangliaContext.java:195)
attempt_200902061529_0007_r_000002_0:   at org.apache.hadoop.metrics.ganglia.GangliaContext.emitMetric(GangliaContext.java:138)
attempt_200902061529_0007_r_000002_0:   at org.apache.hadoop.metrics.ganglia.GangliaContext.emitRecord(GangliaContext.java:123)
attempt_200902061529_0007_r_000002_0:   at org.apache.hadoop.metrics.spi.AbstractMetricsContext.emitRecords(AbstractMetricsContext.java:304)
attempt_200902061529_0007_r_000002_0:   at org.apache.hadoop.metrics.spi.AbstractMetricsContext.timerEvent(AbstractMetricsContext.java:290)
attempt_200902061529_0007_r_000002_0:   at org.apache.hadoop.metrics.spi.AbstractMetricsContext.access$000(AbstractMetricsContext.java:50)
attempt_200902061529_0007_r_000002_0:   at org.apache.hadoop.metrics.spi.AbstractMetricsContext$1.run(AbstractMetricsContext.java:249)
attempt_200902061529_0007_r_000002_0:   at java.util.TimerThread.mainLoop(Timer.java:512)
attempt_200902061529_0007_r_000002_0:   at java.util.TimerThread.run(Timer.java:462)
09/02/06 16:00:48 INFO mapred.JobClient: Task Id : attempt_200902061529_0007_r_000001_0, Status : FAILED
org.apache.hadoop.hbase.MasterNotRunningException: localhost:60000
        at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.getMaster(HConnectionManager.java:236)
        at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegion(HConnectionManager.java:422)
        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:114)
        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:74)
        at ImportMR$MyReducer.reduce(ImportMR.java:138)
        at ImportMR$MyReducer.reduce(ImportMR.java:128)
        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:430)
        at org.apache.hadoop.mapred.Child.main(Child.java:155)

attempt_200902061529_0007_r_000001_0: Exception in thread "Timer thread for monitoring mapred" java.lang.NullPointerException
attempt_200902061529_0007_r_000001_0:   at org.apache.hadoop.metrics.ganglia.GangliaContext.xdr_string(GangliaContext.java:195)
attempt_200902061529_0007_r_000001_0:   at org.apache.hadoop.metrics.ganglia.GangliaContext.emitMetric(GangliaContext.java:138)
attempt_200902061529_0007_r_000001_0:   at org.apache.hadoop.metrics.ganglia.GangliaContext.emitRecord(GangliaContext.java:123)
attempt_200902061529_0007_r_000001_0:   at org.apache.hadoop.metrics.spi.AbstractMetricsContext.emitRecords(AbstractMetricsContext.java:304)
attempt_200902061529_0007_r_000001_0:   at org.apache.hadoop.metrics.spi.AbstractMetricsContext.timerEvent(AbstractMetricsContext.java:290)
attempt_200902061529_0007_r_000001_0:   at org.apache.hadoop.metrics.spi.AbstractMetricsContext.access$000(AbstractMetricsContext.java:50)
attempt_200902061529_0007_r_000001_0:   at org.apache.hadoop.metrics.spi.AbstractMetricsContext$1.run(AbstractMetricsContext.java:249)
attempt_200902061529_0007_r_000001_0:   at java.util.TimerThread.mainLoop(Timer.java:512)
attempt_200902061529_0007_r_000001_0:   at java.util.TimerThread.run(Timer.java:462)
09/02/06 16:00:48 INFO mapred.JobClient: Task Id : attempt_200902061529_0007_r_000003_0, Status : FAILED
org.apache.hadoop.hbase.MasterNotRunningException: localhost:60000
        at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.getMaster(HConnectionManager.java:236)
        at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegion(HConnectionManager.java:422)
        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:114)
        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:74)
        at ImportMR$MyReducer.reduce(ImportMR.java:138)
        at ImportMR$MyReducer.reduce(ImportMR.java:128)
        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:430)
        at org.apache.hadoop.mapred.Child.main(Child.java:155)
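
The localhost:60000 in the MasterNotRunningException appears to be the
built-in default master address, so my guess is that the reduce tasks are
building their HBase configuration without hbase-site.xml on the task
classpath. I am not sure that is the cause, but the pattern I would expect
in the reducer is roughly the sketch below (not the actual ImportMR code;
the helper name and the commented hbase.master value are only placeholders):

import java.io.IOException;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;

public class ReducerTableSetup {
  // Open the output table from an HBaseConfiguration so that hbase-site.xml
  // (with the real hbase.master address) is picked up when it is on the
  // classpath.
  public static HTable openTable(String tableName) throws IOException {
    HBaseConfiguration conf = new HBaseConfiguration();
    // If hbase-site.xml is not visible to the task, the client falls back
    // to localhost:60000; setting hbase.master explicitly is one possible
    // workaround (the host below is only a placeholder).
    // conf.set("hbase.master", "master-host:60000");
    return new HTable(conf, tableName);
  }
}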

-----Original Message-----
From: Erik Holstad [mailto:erikholstad@gmail.com] 
Sent: Friday, February 06, 2009 7:51 PM
To: hbase-user@hadoop.apache.org
Subject: Re: backup tables using ImportMR / ExportMR ( HBASE-974 )

Hey Yair!
Sorry about that, HBaseRef is not needed for the import. I deleted the
makeJar file, removed the code, and uploaded a new version, so you can
either remove it from your code or download the new version.

If you have any more questions, please let me know.

Regards Erik
