From: Harsh J <harsh@cloudera.com>
Date: Thu, 26 Jan 2012 23:25:29 +0530
Subject: Re: Error While using hbase with hadoop
To: user@hbase.apache.org
Cc: hbase-user@hadoop.apache.org

Try the options listed here:
http://wiki.apache.org/hadoop/FAQ#What_does_.22file_could_only_be_replicated_to_0_nodes.2C_instead_of_1.22_mean.3F
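
For what it's worth, dfs.replication=1 is normal for a pseudo-distributed setup; that message means not even one live DataNode was available to take the block. The first quick checks I'd do are whether a DataNode process shows up in jps and whether "hadoop dfsadmin -report" lists at least one live datanode.

If you want to take HBase out of the picture, here is a minimal sketch of a standalone write test (my own example, assuming the NameNode URI from your log, hdfs://89neuron:9000). If this also fails with "could only be replicated to 0 nodes", the problem is in your HDFS/DataNode setup rather than in HBase:

  import java.net.URI;
  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.fs.FSDataOutputStream;
  import org.apache.hadoop.fs.FileSystem;
  import org.apache.hadoop.fs.Path;

  public class HdfsWriteTest {
    public static void main(String[] args) throws Exception {
      Configuration conf = new Configuration();
      // Same NameNode URI that appears in the HBase log.
      FileSystem fs = FileSystem.get(URI.create("hdfs://89neuron:9000/"), conf);
      Path p = new Path("/tmp/replication-test");
      // Writing one small file exercises the same DFSClient/DataStreamer
      // path that fails for /hbase/hbase.version in the log above.
      FSDataOutputStream out = fs.create(p);
      out.writeBytes("hello hdfs\n");
      out.close();
      System.out.println("wrote " + fs.getFileStatus(p).getLen() + " bytes to " + p);
      fs.close();
    }
  }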
On Thu, Jan 26, 2012 at 10:47 PM, neuron005 wrote:
>
> Hi there,
> I earlier used HBase locally, using ext3 as the filesystem for HBase. That worked OK :). Now I have moved on to the next step of setting it up on HDFS.
> I am using hadoop-0.20.2 and hbase-0.90.4 in pseudo-distributed mode.
> I am getting this error in my log:
>
> 2012-01-26 22:37:50,629 DEBUG org.apache.hadoop.hbase.util.FSUtils: Created version file at hdfs://89neuron:9000/hbase set its version at:7
> 2012-01-26 22:37:50,637 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception: org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /hbase/hbase.version could only be replicated to 0 nodes, instead of 1
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1271)
>         at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:422)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:616)
>         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:416)
>         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)
>
>         at org.apache.hadoop.ipc.Client.call(Client.java:740)
>         at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
>         at $Proxy6.addBlock(Unknown Source)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:616)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
>         at $Proxy6.addBlock(Unknown Source)
>         at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:2937)
>         at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2819)
>         at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2102)
>         at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2288)
>
> 2012-01-26 22:37:50,638 WARN org.apache.hadoop.hdfs.DFSClient: Error Recovery for block null bad datanode[0] nodes == null
> 2012-01-26 22:37:50,638 WARN org.apache.hadoop.hdfs.DFSClient: Could not get block locations. Source file "/hbase/hbase.version" - Aborting...
> 2012-01-26 22:37:50,638 WARN org.apache.hadoop.hbase.util.FSUtils: Unable to create version file at hdfs://89neuron:9000/hbase, retrying:
> java.io.IOException: File /hbase/hbase.version could only be replicated to 0 nodes, instead of 1
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1271)
>         at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:422)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:616)
>         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:416)
>         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)
>
> It looks like dfs.replication, which is set to 1, is the problem, but I cannot confirm that it actually is. Please help me out.
> Thanks in advance
> --
> View this message in context: http://old.nabble.com/Error-While-using-hbase-with-hadoop-tp33208913p33208913.html
> Sent from the HBase User mailing list archive at Nabble.com.
>

--
Harsh J
Customer Ops. Engineer, Cloudera