Return-Path:
Delivered-To: apmail-lucene-hadoop-dev-archive@locus.apache.org
Received: (qmail 73571 invoked from network); 6 Sep 2006 20:57:57 -0000
Received: from hermes.apache.org (HELO mail.apache.org) (209.237.227.199) by minotaur.apache.org with SMTP; 6 Sep 2006 20:57:57 -0000
Received: (qmail 94502 invoked by uid 500); 6 Sep 2006 20:57:57 -0000
Delivered-To: apmail-lucene-hadoop-dev-archive@lucene.apache.org
Received: (qmail 94473 invoked by uid 500); 6 Sep 2006 20:57:56 -0000
Mailing-List: contact hadoop-dev-help@lucene.apache.org; run by ezmlm
Precedence: bulk
List-Help:
List-Unsubscribe:
List-Post:
List-Id:
Reply-To: hadoop-dev@lucene.apache.org
Delivered-To: mailing list hadoop-dev@lucene.apache.org
Received: (qmail 94463 invoked by uid 99); 6 Sep 2006 20:57:56 -0000
Received: from asf.osuosl.org (HELO asf.osuosl.org) (140.211.166.49) by apache.org (qpsmtpd/0.29) with ESMTP; Wed, 06 Sep 2006 13:57:56 -0700
X-ASF-Spam-Status: No, hits=0.0 required=10.0 tests=
X-Spam-Check-By: apache.org
Received: from [209.237.227.198] (HELO brutus.apache.org) (209.237.227.198) by apache.org (qpsmtpd/0.29) with ESMTP; Wed, 06 Sep 2006 13:57:56 -0700
Received: from brutus (localhost [127.0.0.1]) by brutus.apache.org (Postfix) with ESMTP id CB6497142FE for ; Wed, 6 Sep 2006 20:54:24 +0000 (GMT)
Message-ID: <1730170.1157576064828.JavaMail.jira@brutus>
Date: Wed, 6 Sep 2006 13:54:24 -0700 (PDT)
From: "Doug Cutting (JIRA)"
To: hadoop-dev@lucene.apache.org
Subject: [jira] Updated: (HADOOP-507) Runtime exception in org.apache.hadoop.io.WritableFactories.newInstance when trying to startup namenode/datanode
In-Reply-To: <17811210.1157476164472.JavaMail.jira@brutus>
MIME-Version: 1.0
Content-Type: text/plain; charset=utf-8
Content-Transfer-Encoding: 7bit
X-Virus-Checked: Checked by ClamAV on apache.org
X-Spam-Rating: minotaur.apache.org 1.6.2 0/1000/N

     [ http://issues.apache.org/jira/browse/HADOOP-507?page=all ]

Doug Cutting updated HADOOP-507:
--------------------------------

        Status: Resolved  (was: Patch Available)
    Resolution: Fixed

I just committed this.  Thanks, Owen!
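
The fix itself is only attached to the JIRA issue as write-factory.patch and is not quoted in this mail. As background on the class named in the subject line, Hadoop's WritableFactories lets a Writable register an explicit factory so that WritableFactories.newInstance() does not have to fall back on reflective construction. A minimal sketch of that registration idiom, using a hypothetical DemoWritable rather than the real org.apache.hadoop.dfs.Block, and making no claim about what the committed patch actually does, looks roughly like this:

    // Sketch only, not the contents of write-factory.patch: a hypothetical
    // Writable that registers an explicit factory with WritableFactories so
    // that newInstance() never has to call Class.newInstance() reflectively.
    package demo;

    import java.io.DataInput;
    import java.io.DataOutput;
    import java.io.IOException;

    import org.apache.hadoop.io.Writable;
    import org.apache.hadoop.io.WritableFactories;
    import org.apache.hadoop.io.WritableFactory;

    public class DemoWritable implements Writable {

      static {
        // With a factory registered, WritableFactories.newInstance() uses it
        // for DemoWritable.class instead of reflection.
        WritableFactories.setFactory(DemoWritable.class, new WritableFactory() {
          public Writable newInstance() { return new DemoWritable(); }
        });
      }

      private long value;

      public void write(DataOutput out) throws IOException {
        out.writeLong(value);
      }

      public void readFields(DataInput in) throws IOException {
        value = in.readLong();
      }
    }
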
> Runtime exception in org.apache.hadoop.io.WritableFactories.newInstance when trying to startup namenode/datanode
> ----------------------------------------------------------------------------------------------------------------
>
>                 Key: HADOOP-507
>                 URL: http://issues.apache.org/jira/browse/HADOOP-507
>             Project: Hadoop
>          Issue Type: Bug
>          Components: util
>    Affects Versions: 0.5.0
>            Reporter: Arun C Murthy
>         Assigned To: Owen O'Malley
>             Fix For: 0.6.0
>
>         Attachments: write-factory.patch
>
>
> Here's the logs:
> arun@neo ~/dev/java/latest-hadoop/trunk $ cat /home/arun/dev/java/hadoop-0.4.0/build/libhdfs/tests/logs/hadoop-arun-namenode-neo.out
> 2006-09-05 22:18:39,756 INFO conf.Configuration (Configuration.java:loadResource(496)) - parsing file:/home/arun/dev/java/hadoop-0.4.0/src/c++/libhdfs/tests/conf/hadoop-default.xml
> 2006-09-05 22:18:39,804 INFO conf.Configuration (Configuration.java:loadResource(496)) - parsing file:/home/arun/dev/java/hadoop-0.4.0/src/c++/libhdfs/tests/conf/hadoop-site.xml
> 2006-09-05 22:18:39,918 INFO util.Credential (FileResource.java:(60)) - Checking Resource aliases
> 2006-09-05 22:18:39,935 INFO http.HttpServer (HttpServer.java:doStart(729)) - Version Jetty/5.1.4
> 2006-09-05 22:18:40,366 INFO util.Container (Container.java:start(74)) - Started org.mortbay.jetty.servlet.WebApplicationHandler@1171b26
> 2006-09-05 22:18:40,478 INFO util.Container (Container.java:start(74)) - Started WebApplicationContext[/,/]
> 2006-09-05 22:18:40,478 INFO util.Container (Container.java:start(74)) - Started HttpContext[/logs,/logs]
> 2006-09-05 22:18:40,479 INFO util.Container (Container.java:start(74)) - Started HttpContext[/static,/static]
> 2006-09-05 22:18:40,485 INFO http.SocketListener (SocketListener.java:start(204)) - Started SocketListener on 0.0.0.0:50070
> 2006-09-05 22:18:40,487 INFO util.Container (Container.java:start(74)) - Started org.mortbay.jetty.Server@1b09468
> Exception in thread "main" java.lang.RuntimeException: java.lang.IllegalAccessException: Class org.apache.hadoop.io.WritableFactories can not access a member of class org.apache.hadoop.dfs.Block with modifiers "public"
>     at org.apache.hadoop.io.WritableFactories.newInstance(WritableFactories.java:49)
>     at org.apache.hadoop.io.ArrayWritable.readFields(ArrayWritable.java:81)
>     at org.apache.hadoop.dfs.FSEditLog.loadFSEdits(FSEditLog.java:134)
>     at org.apache.hadoop.dfs.FSImage.loadFSImage(FSImage.java:157)
>     at org.apache.hadoop.dfs.FSDirectory.loadFSImage(FSDirectory.java:317)
>     at org.apache.hadoop.dfs.FSNamesystem.<init>(FSNamesystem.java:199)
>     at org.apache.hadoop.dfs.NameNode.<init>(NameNode.java:132)
>     at org.apache.hadoop.dfs.NameNode.<init>(NameNode.java:123)
>     at org.apache.hadoop.dfs.NameNode.main(NameNode.java:543)
> Caused by: java.lang.IllegalAccessException: Class org.apache.hadoop.io.WritableFactories can not access a member of class org.apache.hadoop.dfs.Block with modifiers "public"
>     at sun.reflect.Reflection.ensureMemberAccess(Reflection.java:65)
>     at java.lang.Class.newInstance0(Class.java:344)
>     at java.lang.Class.newInstance(Class.java:303)
>     at org.apache.hadoop.io.WritableFactories.newInstance(WritableFactories.java:45)
>     ... 8 more
> Steps to reproduce:
> 1. Start namenode/datanode
> 2. Run hdfs_test program (part of libhdfs)
> 3. Stop namenode/datanode
> 4. goto step 1
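
The IllegalAccessException in the quoted trace is standard Java reflection behaviour rather than anything HDFS-specific: Class.newInstance() cannot use even a public no-argument constructor if the declaring class itself is not public and the caller sits in a different package. A stand-alone sketch (hypothetical classes demo.pkg1.Hidden and demo.pkg2.Main, not Hadoop code) fails with the same kind of message:

    // --- demo/pkg1/Hidden.java ---
    // Package-private class whose no-arg constructor is public.
    package demo.pkg1;

    class Hidden {                // the class itself is NOT public
      public Hidden() {}          // ...even though the constructor is
    }

    // --- demo/pkg2/Main.java ---
    // Reflective construction from another package fails the member-access
    // check, mirroring WritableFactories.newInstance() in the trace above.
    package demo.pkg2;

    public class Main {
      public static void main(String[] args) throws Exception {
        Class<?> c = Class.forName("demo.pkg1.Hidden");
        // Throws java.lang.IllegalAccessException; on JDKs of this era the
        // message reads: Class demo.pkg2.Main can not access a member of
        // class demo.pkg1.Hidden with modifiers "public"
        Object o = c.newInstance();
        System.out.println(o);
      }
    }

Either making the target class public or registering a WritableFactory for it (as in the sketch earlier in this mail) avoids the failing reflective path; which route write-factory.patch takes is not shown here.
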
--
This message is automatically generated by JIRA.
-
If you think it was sent incorrectly contact one of the administrators: http://issues.apache.org/jira/secure/Administrators.jspa
-
For more information on JIRA, see: http://www.atlassian.com/software/jira