hadoop-common-dev mailing list archives

From "Doug Cutting (JIRA)" <j...@apache.org>
Subject [jira] Commented: (HADOOP-58) Hadoop requires configuration of hadoop-site.xml or won't run
Date Thu, 23 Feb 2006 19:14:37 GMT
    [ http://issues.apache.org/jira/browse/HADOOP-58?page=comments#action_12367568 ] 

Doug Cutting commented on HADOOP-58:
------------------------------------

I don't think we want things to use DFS and TaskTracker/JobTracker by default, since this
slows things down and uses more resources than needed when running on a single node.

The javadoc provides a recommended configuration for "pseudo-distributed" use:

http://lucene.apache.org/hadoop/docs/api/overview-summary.html

With that in place, bin/start-all.sh works fine, no?
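
For context, a minimal hadoop-site.xml along the lines of that pseudo-distributed
recommendation might look like the sketch below. The localhost:9000/9001 ports and
the dfs.replication value are illustrative assumptions drawn from the overview docs
of that era, not something stated in this message:

    <configuration>

      <property>
        <name>fs.default.name</name>
        <value>localhost:9000</value>
        <!-- assumed port; use a NameNode on the local host instead of the
             in-process "local" filesystem -->
      </property>

      <property>
        <name>mapred.job.tracker</name>
        <value>localhost:9001</value>
        <!-- assumed port; point MapReduce at a JobTracker rather than the
             "local" in-process runner -->
      </property>

      <property>
        <name>dfs.replication</name>
        <value>1</value>
        <!-- a single node can only hold one replica of each block -->
      </property>

    </configuration>

With a file like this in conf/, bin/start-all.sh brings up the NameNode, DataNode,
JobTracker, and TaskTracker daemons on localhost.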

> Hadoop requires configuration of hadoop-site.xml or won't run
> -------------------------------------------------------------
>
>          Key: HADOOP-58
>          URL: http://issues.apache.org/jira/browse/HADOOP-58
>      Project: Hadoop
>         Type: Bug
>     Reporter: stack@archive.org
>     Priority: Minor
>  Attachments: local2localhostPort.patch
>
> On a new install, I would expect '${HADOOP_HOME}/bin/start-all.sh' to bring up a basic
> instance, one that uses the local filesystem (or, if not, a DFS homed in localhost:/tmp)
> and that has all four daemons running on localhost.  Currently this is not the case.  Hadoop
> complains 'java.lang.RuntimeException: Not a host:port pair: local'.  It doesn't like the
> 'local' default value for the mapred.job.tracker and fs.default.name properties.
> Revision: 379930

-- 
This message is automatically generated by JIRA.
-
If you think it was sent incorrectly contact one of the administrators:
   http://issues.apache.org/jira/secure/Administrators.jspa
-
For more information on JIRA, see:
   http://www.atlassian.com/software/jira

