hadoop-common-user mailing list archives

From chsanthosh <chsanth...@hotmail.com>
Subject Re: exception during Dedup on multiple nodes
Date Mon, 03 Sep 2007 05:38:01 GMT

Hi,

I'm also in the same situation.

What steps did you follow after this to get things working?

Please help me out.
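
For reference, below is a minimal sketch of the DailyRollingFileAppender (DRFA) block as I understand it from the stock conf/log4j.properties shipped with Hadoop; the property names hadoop.log.dir and hadoop.log.file are assumed from the default distribution, not taken from either of our setups. The "(Is a directory)" error suggests that ${hadoop.log.dir}/${hadoop.log.file} is resolving to a directory rather than a writable file on the failing node:

  # Defaults, normally overridden by the Hadoop scripts with
  # -Dhadoop.log.dir=... -Dhadoop.log.file=... on the task JVM command line
  hadoop.log.dir=.
  hadoop.log.file=hadoop.log

  # Daily rolling file appender; File must point at a file, not a directory
  log4j.appender.DRFA=org.apache.log4j.DailyRollingFileAppender
  log4j.appender.DRFA.File=${hadoop.log.dir}/${hadoop.log.file}
  log4j.appender.DRFA.DatePattern=.yyyy-MM-dd
  log4j.appender.DRFA.layout=org.apache.log4j.PatternLayout
  log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n

If File ends up pointing at a directory, or File/DatePattern are left unset, log4j produces the two log4j:ERROR lines shown in the trace below.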

Thanks & Regards,
Santhosh.Ch


prem kumar-4 wrote:
> 
> Hello,
> I am running nutch 0.9 on three nodes on an nfs-mounted drive. For more
> info on my setup please refer to:
> http://joey.mazzarelli.com/2007/07/25/nutch-and-hadoop-as-user-with-nfs/
> 
> A simple nutch crawl fails during the dedup phase. The stack trace of
> the problem I am facing is as follows:
> 
> task_0037_m_000001_3: log4j:ERROR setFile(null,true) call failed.
> task_0037_m_000001_3: java.io.FileNotFoundException: /home/pl162331/opt/nutch/crawler/logs/mishti (Is a directory)
> task_0037_m_000001_3:   at java.io.FileOutputStream.openAppend(Native Method)
> task_0037_m_000001_3:   at java.io.FileOutputStream.<init>(FileOutputStream.java:177)
> task_0037_m_000001_3:   at java.io.FileOutputStream.<init>(FileOutputStream.java:102)
> task_0037_m_000001_3:   at org.apache.log4j.FileAppender.setFile(FileAppender.java:289)
> task_0037_m_000001_3:   at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:163)
> task_0037_m_000001_3:   at org.apache.log4j.DailyRollingFileAppender.activateOptions(DailyRollingFileAppender.java:215)
> task_0037_m_000001_3:   at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:256)
> task_0037_m_000001_3:   at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:132)
> task_0037_m_000001_3:   at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:96)
> task_0037_m_000001_3:   at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:654)
> task_0037_m_000001_3:   at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:612)
> task_0037_m_000001_3:   at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:509)
> task_0037_m_000001_3:   at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:415)
> task_0037_m_000001_3:   at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:441)
> task_0037_m_000001_3:   at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:468)
> task_0037_m_000001_3:   at org.apache.log4j.LogManager.<clinit>(LogManager.java:122)
> task_0037_m_000001_3:   at org.apache.log4j.Logger.getLogger(Logger.java:104)
> task_0037_m_000001_3:   at org.apache.commons.logging.impl.Log4JLogger.getLogger(Log4JLogger.java:229)
> task_0037_m_000001_3:   at org.apache.commons.logging.impl.Log4JLogger.<init>(Log4JLogger.java:65)
> task_0037_m_000001_3:   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> task_0037_m_000001_3:   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
> task_0037_m_000001_3:   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> task_0037_m_000001_3:   at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
> task_0037_m_000001_3:   at org.apache.commons.logging.impl.LogFactoryImpl.newInstance(LogFactoryImpl.java:529)
> task_0037_m_000001_3:   at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:235)
> task_0037_m_000001_3:   at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:370)
> task_0037_m_000001_3:   at org.apache.hadoop.mapred.TaskTracker.<clinit>(TaskTracker.java:82)
> task_0037_m_000001_3:   at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:1423)
> task_0037_m_000001_3: log4j:ERROR Either File or DatePattern options are not set for appender [DRFA].
> Exception in thread "main" java.io.IOException: Job failed!
>         at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:604)
>         at org.apache.nutch.indexer.DeleteDuplicates.dedup(DeleteDuplicates.java:439)
>         at org.apache.nutch.crawl.Crawl.main(Crawl.java:135)
> 
> 
> The log folders have sufficient permissions too, but I am unable to proceed
> further.
> Any help would be appreciated.
> 
> Cheers!
> Prem
> 
> 

-- 
View this message in context: http://www.nabble.com/exception-during-Dedup-on-multiple-nodes-tf4236301.html#a12456626
Sent from the Hadoop Users mailing list archive at Nabble.com.

