From: "prem kumar" <prem.kumar.l@gmail.com>
To: hadoop-user@lucene.apache.org, hadoop-dev@lucene.apache.org
Date: Wed, 8 Aug 2007 18:45:54 +0530
Subject: exception during Dedup on multiple nodes

Hello,

I am running Nutch 0.9 on three nodes, on an NFS-mounted drive. For more
information on my setup, please see:
http://joey.mazzarelli.com/2007/07/25/nutch-and-hadoop-as-user-with-nfs/

A simple Nutch crawl fails during the dedup phase. The stack trace of the
problem I am facing is as follows:

task_0037_m_000001_3: log4j:ERROR setFile(null,true) call failed.
task_0037_m_000001_3: java.io.FileNotFoundException: /home/pl162331/opt/nutch/crawler/logs/mishti (Is a directory)
task_0037_m_000001_3:     at java.io.FileOutputStream.openAppend(Native Method)
task_0037_m_000001_3:     at java.io.FileOutputStream.<init>(FileOutputStream.java:177)
task_0037_m_000001_3:     at java.io.FileOutputStream.<init>(FileOutputStream.java:102)
task_0037_m_000001_3:     at org.apache.log4j.FileAppender.setFile(FileAppender.java:289)
task_0037_m_000001_3:     at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:163)
task_0037_m_000001_3:     at org.apache.log4j.DailyRollingFileAppender.activateOptions(DailyRollingFileAppender.java:215)
task_0037_m_000001_3:     at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:256)
task_0037_m_000001_3:     at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:132)
task_0037_m_000001_3:     at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:96)
task_0037_m_000001_3:     at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:654)
task_0037_m_000001_3:     at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:612)
task_0037_m_000001_3:     at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:509)
task_0037_m_000001_3:     at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:415)
task_0037_m_000001_3:     at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:441)
task_0037_m_000001_3:     at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:468)
task_0037_m_000001_3:     at org.apache.log4j.LogManager.<clinit>(LogManager.java:122)
task_0037_m_000001_3:     at org.apache.log4j.Logger.getLogger(Logger.java:104)
task_0037_m_000001_3:     at org.apache.commons.logging.impl.Log4JLogger.getLogger(Log4JLogger.java:229)
task_0037_m_000001_3:     at org.apache.commons.logging.impl.Log4JLogger.<init>(Log4JLogger.java:65)
task_0037_m_000001_3:     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
task_0037_m_000001_3:     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
task_0037_m_000001_3:     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
task_0037_m_000001_3:     at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
task_0037_m_000001_3:     at org.apache.commons.logging.impl.LogFactoryImpl.newInstance(LogFactoryImpl.java:529)
task_0037_m_000001_3:     at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:235)
task_0037_m_000001_3:     at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:370)
task_0037_m_000001_3:     at org.apache.hadoop.mapred.TaskTracker.<clinit>(TaskTracker.java:82)
task_0037_m_000001_3:     at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:1423)
task_0037_m_000001_3: log4j:ERROR Either File or DatePattern options are not set for appender [DRFA].

Exception in thread "main" java.io.IOException: Job failed!
    at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:604)
    at org.apache.nutch.indexer.DeleteDuplicates.dedup(DeleteDuplicates.java:439)
    at org.apache.nutch.crawl.Crawl.main(Crawl.java:135)

The log folders have sufficient permissions too. Unable to proceed
further. Any help would be appreciated.

Cheers!
Prem
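P.S. For reference, the DRFA appender named in the last log4j error is the
Daily Rolling File Appender that Hadoop sets up in conf/log4j.properties.
Paraphrasing the stock config from memory (the layout lines in your copy may
differ slightly), its File option is assembled from two Java system
properties:

    # Daily Rolling File Appender: File must resolve to a file, not a directory
    log4j.appender.DRFA=org.apache.log4j.DailyRollingFileAppender
    log4j.appender.DRFA.File=${hadoop.log.dir}/${hadoop.log.file}
    # Roll the log over at midnight
    log4j.appender.DRFA.DatePattern=.yyyy-MM-dd
    log4j.appender.DRFA.layout=org.apache.log4j.PatternLayout
    log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n

My guess (unconfirmed) is that in the task's child JVM those two properties
resolve so that File points at the per-host log directory itself
(logs/mishti above) rather than at a file inside it, which would explain
both the "Is a directory" failure and the follow-up complaint that the File
option is not set for [DRFA].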