Subject: Re: Job fails while re attempting the task in multiple outputs case
From: AnilKumar B <akumarb2010@gmail.com>
To: user@hadoop.apache.org
Date: Mon, 30 Dec 2013 22:52:57 +0530

Thanks Harsh.

@Are you using the MultipleOutputs class shipped with Apache Hadoop or one of your own?

I am using Apache Hadoop's MultipleOutputs. But as you can see in the stack trace, it is not appending the attempt id to the file name; the name contains only the task id.

Thanks & Regards,
B Anil Kumar.

On Mon, Dec 30, 2013 at 7:42 PM, Harsh J <harsh@cloudera.com> wrote:
> Are you using the MultipleOutputs class shipped with Apache Hadoop or
> one of your own?
>
> If it's the latter, please take a look at the gotchas described at
> http://wiki.apache.org/hadoop/FAQ#Can_I_write_create.2Fwrite-to_hdfs_files_directly_from_map.2Freduce_tasks.3F
>
> On Mon, Dec 30, 2013 at 4:22 PM, AnilKumar B <akumarb2010@gmail.com> wrote:
> > Hi,
> >
> > I am using multiple outputs in our job, and whenever a reduce task fails,
> > all of its subsequent task attempts fail with a file-exists exception.
> >
> > The output file name should also include the task attempt, right? But it
> > only includes the task id. Is this a bug, or something wrong on my side?
> >
> > Where should I look in the source code? I went through the code at
> > FileOutputFormat#getTaskOutputPath(), but it only considers the task
> > id.
> >
> >
> > Exception Trace:
> > 13/12/29 09:13:00 INFO mapred.JobClient: Task Id : attempt_201312162255_60465_r_000008_0, Status : FAILED
> > 13/12/29 09:14:42 WARN mapred.JobClient: Error reading task output http://localhost:50050/tasklog?plaintext=true&attemptid=attempt_201312162255_60465_r_000008_0&filter=stdout
> > 13/12/29 09:14:42 WARN mapred.JobClient: Error reading task output http://localhost:50050/tasklog?plaintext=true&attemptid=attempt_201312162255_60465_r_000008_0&filter=stderr
> > 13/12/29 09:15:04 INFO mapred.JobClient:  map 100% reduce 93%
> > 13/12/29 09:15:23 INFO mapred.JobClient:  map 100% reduce 96%
> > 13/12/29 09:17:31 INFO mapred.JobClient:  map 100% reduce 97%
> > 13/12/29 09:19:34 INFO mapred.JobClient: Task Id : attempt_201312162255_60465_r_000008_1, Status : FAILED
> > org.apache.hadoop.ipc.RemoteException: java.io.IOException: failed to create /x/y/z/2013/12/29/04/o_2013_12_29_03-r-00008.gz on client 10.103.10.31 either because the filename is invalid or the file exists
> >         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1672)
> >         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1599)
> >         at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:732)
> >         at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:711)
> >         at sun.reflect.GeneratedMethodAccessor14.invoke(Unknown Source)
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >         at java.lang.reflect.Method.invoke(Method.java:597)
> >         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:587)
> >         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1448)
> >         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1444)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at javax.security.auth.Subject.doAs(Subject.java:396)
> >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
> >         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1442)
> >
> >         at org.apache.hadoop.ipc.Client.call(Client.java:1118)
> >         at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
> >         at $Proxy7.create(Unknown Source)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >         at java.lang.reflect.Method.invoke(Method.java:597)
> >         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:85)
> >         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:62)
> >         at $Proxy7.create(Unknown Source)
> >         at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.<init>(DFSClient.java:3753)
> >         at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:937)
> >         at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:207)
> >         at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:555)
> >         at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:536)
> >         at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:443)
> >         at org.apache.hadoop.mapreduce.lib.output.TextOutputFormat.getRecordWriter(TextOutputFormat.java:131)
> >         at org.apache.hadoop.mapreduce.lib.output.MultipleOutputs.getRecordWriter(MultipleOutputs.java:411)
> >         at org.apache.hadoop.mapreduce.lib.output.MultipleOutputs.write(MultipleOutputs.java:370)
> >         at com.team.hadoop.mapreduce1$Reducer1.reduce(MapReduce1.java:254)
> >         at com.team.hadoop.mapreduce1$Reducer1.reduce(MapReduce1.java:144)
> >         at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:177)
> >         at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:649)
> >         at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:418)
> >         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at javax.security.auth.Subject.doAs(Subject.java:396)
> >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
> >         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> >
> > 13/12/29 09:19:35 WARN mapred.JobClient: Error reading task output http://localhost:50050/tasklog?plaintext=true&attemptid=attempt_201312162255_60465_r_000008_1&filter=stdout
> > 13/12/29 09:19:35 WARN mapred.JobClient: Error reading task output http://localhost:50050/tasklog?plaintext=true&attemptid=attempt_201312162255_60465_r_000008_1&filter=stderr
> > 13/12/29 09:19:36 INFO mapred.JobClient:  map 100% reduce 89%
> > 13/12/29 09:19:54 INFO mapred.JobClient:  map 100% reduce 92%
> > 13/12/29 09:20:11 INFO mapred.JobClient:  map 100% reduce 96%
> > 13/12/29 09:22:52 INFO mapred.JobClient:  map 100% reduce 97%
> > 13/12/29 09:23:48 INFO mapred.JobClient: Task Id : attempt_201312162255_60465_r_000008_2, Status : FAILED
> > org.apache.hadoop.ipc.RemoteException: java.io.IOException: failed to create /x/y/z/2013/12/29/04/o_2013_12_29_03-r-00008.gz on client 10.103.7.33 either because the filename is invalid or the file exists
> >         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1672)
> >         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1599)
> >         at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:732)
> >         at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:711)
> >         at sun.reflect.GeneratedMethodAccessor14.invoke(Unknown Source)
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >         at java.lang.reflect.Method.invoke(Method.java:597)
> >         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:587)
> >         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1448)
> >
> >
> > Thanks & Regards,
> > B Anil Kumar.
>
> --
> Harsh J
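The collision in the trace above can be sketched without Hadoop on the classpath. This is a hypothetical, stdlib-only illustration (the class and method names are invented for the example, not Hadoop's actual source): the part-file name is derived from the task id alone, so every attempt of reducer 8 targets the same final path, whereas paths under a per-attempt temporary directory stay unique across retries.

```java
// Sketch of why reduce-task retries collide on a fixed output path.
// The naming mirrors the stack trace above; the helpers are illustrative only.
public class OutputPathSketch {

    // "attempt_201312162255_60465_r_000008_1" -> partition 8
    static int partition(String attemptId) {
        // fields: [attempt, jobTimestamp, jobId, r, 000008, attemptNumber]
        String[] f = attemptId.split("_");
        return Integer.parseInt(f[4]);
    }

    // Final file name: depends only on the partition (i.e. the task id),
    // never on the attempt number -- hence "o_2013_12_29_03-r-00008.gz"
    // for attempt _0, _1 and _2 alike.
    static String finalName(String base, String attemptId) {
        return String.format("%s-r-%05d.gz", base, partition(attemptId));
    }

    // Where an attempt can write safely: a directory keyed by the full
    // attempt id (layout here is illustrative), so retries cannot clash.
    static String workPath(String outputDir, String attemptId, String base) {
        return outputDir + "/_temporary/_" + attemptId + "/" + finalName(base, attemptId);
    }

    public static void main(String[] args) {
        String a0 = "attempt_201312162255_60465_r_000008_0";
        String a1 = "attempt_201312162255_60465_r_000008_1";
        // Same final name for both attempts -> second create() fails on HDFS.
        System.out.println(finalName("o_2013_12_29_03", a0));
        System.out.println(finalName("o_2013_12_29_03", a1));
        // Distinct per-attempt paths -> no collision on retry.
        System.out.println(workPath("/x/y/z/2013/12/29/04", a0, "o_2013_12_29_03"));
        System.out.println(workPath("/x/y/z/2013/12/29/04", a1, "o_2013_12_29_03"));
    }
}
```

This is why the FAQ entry Harsh links warns against creating HDFS files directly from tasks: writes that go through the task attempt's work directory (see FileOutputFormat.getWorkOutputPath) are promoted to the final path only when the attempt commits, so a failed attempt leaves nothing behind. One thing worth checking here (a guess from the trace, not a certainty) is whether the job passes an absolute baseOutputPath to MultipleOutputs.write, which would place the file outside the committer's temporary directory and reproduce exactly this retry collision.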