hadoop-common-user mailing list archives

From James Seigel <ja...@tynt.com>
Subject Re: UI doesn't work
Date Tue, 28 Dec 2010 21:37:30 GMT
Nope, I was just on my iPhone and thought you'd tried a different port (bad memory :) ).

Try accessing it with an IP address you get from running ifconfig (ipconfig on
Windows) on the machine.

Then look at the logs and see if there are any errors or indications
that it is being hit properly.

Does your browser follow redirects properly?  Also try clearing your
browser's cache.
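The checks above can be scripted. A minimal sketch, assuming the host and ports from this thread; `classify_http` is a hypothetical helper, not part of Hadoop:

```shell
# Hypothetical helper: turn a curl-reported HTTP status code into a hint
# about where the UI problem lives.
classify_http() {
  case "$1" in
    200) echo "UI is up; suspect browser cache, proxy, or redirects" ;;
    000) echo "no connection; suspect a firewall or a daemon that is not listening" ;;
    *)   echo "got HTTP $1; check the daemon logs" ;;
  esac
}

# Typical use, from the machine where the browser fails (host/port from
# this thread; uncomment to run against a live cluster):
# code=$(curl -s -o /dev/null -m 5 -w '%{http_code}' http://speed.cs.ucsb.edu:50030/)
# classify_http "$code"
```

A 200 with a still-blank browser points at the browser side; a timeout (curl reports status 000) points at a firewall or a daemon that never bound the port.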

Sorry for checking the obvious stuff, but sometimes it is the obvious stuff :).

Cheers
James

Sent from my mobile. Please excuse the typos.

On 2010-12-28, at 2:30 PM, maha <maha@umail.ucsb.edu> wrote:

> Hi James,
>
> I'm accessing ---> http://speed.cs.ucsb.edu:50030/ for the job tracker and port 50070 for the name node, just like the Hadoop quick start.
>
> Did you mean to change the port in my mapred-site.xml file ?
>
>  <property>
>    <name>mapred.job.tracker</name>
>    <value>speed.cs.ucsb.edu:9001</value>
>  </property>
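For what it's worth, mapred.job.tracker (9001 here) is the JobTracker's RPC endpoint for clients and TaskTrackers; the web UI binds separately to mapred.job.tracker.http.address, which defaults to 0.0.0.0:50030 in 0.20.x, so changing 9001 would not move the UI. A sketch for reading the UI port out of a config file; `ui_port_from_conf` is a hypothetical helper, and the grep/sed is not a real XML parser:

```shell
# Hypothetical helper: print the JobTracker UI port configured in a
# mapred-site.xml, falling back to the 0.20.x default of 50030.
# Crude grep/sed for illustration only, not a real XML parser.
ui_port_from_conf() {
  port=$(grep -A1 'mapred.job.tracker.http.address' "$1" 2>/dev/null \
    | sed -n 's/.*:\([0-9][0-9]*\)<\/value>.*/\1/p')
  echo "${port:-50030}"
}
```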
>
>
> Maha
>
>
> On Dec 28, 2010, at 1:01 PM, James Seigel wrote:
>
>> For the job tracker, go to port 50030 and see if that helps.
>>
>> James
>>
>> Sent from my mobile. Please excuse the typos.
>>
>> On 2010-12-28, at 1:36 PM, maha <maha@umail.ucsb.edu> wrote:
>>
>>> James said:
>>>
>>> Is the job tracker running on that machine?    YES
>>> Is there a firewall in the way?  I don't think so, because it used to work for me. How can I check that?
>>>
>>> ========================================================================================================================================
>>> Harsh said:
>>>
>>> Did you do any ant operation on your release copy of Hadoop prior to
>>> starting it, by the way?
>>>
>>> NO, I get the following error:
>>>
>>> BUILD FAILED
>>> /cs/sandbox/student/maha/hadoop-0.20.2/build.xml:316: Unable to find a javac compiler;
>>> com.sun.tools.javac.Main is not on the classpath.
>>> Perhaps JAVA_HOME does not point to the JDK.
>>> It is currently set to "/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0/jre"
>>>
>>> I had to change JAVA_HOME to point to --> /usr/lib/jvm/jre-1.6.0-openjdk because I used to get an error when trying to run a jar file. The error was:
>>>
>>>> bin/hadoop: line 258: /etc/alternatives/java/bin/java: Not a directory
>>>> bin/hadoop: line 289: /etc/alternatives/java/bin/java: Not a directory
>>>> bin/hadoop: line 289: exec: /etc/alternatives/java/bin/java: cannot
>>>> execute: Not a directory
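Those "Not a directory" lines usually mean JAVA_HOME was set to the java binary's symlink (/etc/alternatives/java) rather than to an install directory, so bin/hadoop built the impossible path /etc/alternatives/java/bin/java. One way to recover a sensible directory from a binary path; `java_home_from_binary` is a hypothetical helper and assumes GNU readlink:

```shell
# Hypothetical helper: given a path to a java binary (possibly a symlink
# such as /etc/alternatives/java), print the install directory two levels
# up from the resolved binary -- a sensible JAVA_HOME.
java_home_from_binary() {
  bin=$(readlink -f "$1")       # resolve symlink chains (GNU readlink)
  dirname "$(dirname "$bin")"   # strip the trailing /bin/java
}
```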
>>>
>>>
>>> ========================================================================================================================================
>>> Adarsh said:
>>>
>>> logs of namenode + jobtracker
>>>
>>> <<<<< namenode log >>>>
>>>
>>> [maha@speed logs]$ cat hadoop-maha-namenode-speed.cs.ucsb.edu.log
>>> 2010-12-28 12:23:25,006 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG:
>>> /************************************************************
>>> STARTUP_MSG: Starting NameNode
>>> STARTUP_MSG:   host = speed.cs.ucsb.edu/128.111.43.50
>>> STARTUP_MSG:   args = []
>>> STARTUP_MSG:   version = 0.20.2
>>> STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20 -r 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
>>> ************************************************************/
>>> 2010-12-28 12:23:25,126 INFO org.apache.hadoop.ipc.metrics.RpcMetrics: Initializing RPC Metrics with hostName=NameNode, port=9000
>>> 2010-12-28 12:23:25,130 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Namenode up at: speed.cs.ucsb.edu/128.111.43.50:9000
>>> 2010-12-28 12:23:25,133 INFO org.apache.hadoop.metrics.jvm.JvmMetrics: Initializing JVM Metrics with processName=NameNode, sessionId=null
>>> 2010-12-28 12:23:25,134 INFO org.apache.hadoop.hdfs.server.namenode.metrics.NameNodeMetrics: Initializing NameNodeMeterics using context object:org.apache.hadoop.metrics.spi.NullContext
>>> 2010-12-28 12:23:25,258 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner=maha,grad
>>> 2010-12-28 12:23:25,258 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup=supergroup
>>> 2010-12-28 12:23:25,258 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled=true
>>> 2010-12-28 12:23:25,269 INFO org.apache.hadoop.hdfs.server.namenode.metrics.FSNamesystemMetrics: Initializing FSNamesystemMetrics using context object:org.apache.hadoop.metrics.spi.NullContext
>>> 2010-12-28 12:23:25,270 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered FSNamesystemStatusMBean
>>> 2010-12-28 12:23:25,316 INFO org.apache.hadoop.hdfs.server.common.Storage: Number of files = 6
>>> 2010-12-28 12:23:25,323 INFO org.apache.hadoop.hdfs.server.common.Storage: Number of files under construction = 0
>>> 2010-12-28 12:23:25,323 INFO org.apache.hadoop.hdfs.server.common.Storage: Image file of size 551 loaded in 0 seconds.
>>> 2010-12-28 12:23:25,323 INFO org.apache.hadoop.hdfs.server.common.Storage: Edits file /tmp/hadoop-maha/dfs/name/current/edits of size 4 edits # 0 loaded in 0 seconds.
>>> 2010-12-28 12:23:25,358 INFO org.apache.hadoop.hdfs.server.common.Storage: Image file of size 551 saved in 0 seconds.
>>> 2010-12-28 12:23:25,711 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading FSImage in 542 msecs
>>> 2010-12-28 12:23:25,715 INFO org.apache.hadoop.hdfs.StateChange: STATE* Safe mode ON.
>>> The ratio of reported blocks 0.0000 has not reached the threshold 0.9990. Safe mode will be turned off automatically.
>>> 2010-12-28 12:23:25,834 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
>>> 2010-12-28 12:23:25,901 INFO org.apache.hadoop.http.HttpServer: Port returned by webServer.getConnectors()[0].getLocalPort() before open() is -1. Opening the listener on 50070
>>> 2010-12-28 12:23:25,902 INFO org.apache.hadoop.http.HttpServer: listener.getLocalPort() returned 50070 webServer.getConnectors()[0].getLocalPort() returned 50070
>>> 2010-12-28 12:23:25,902 INFO org.apache.hadoop.http.HttpServer: Jetty bound to port 50070
>>> 2010-12-28 12:23:25,902 INFO org.mortbay.log: jetty-6.1.14
>>> 2010-12-28 12:23:26,360 INFO org.mortbay.log: Started SelectChannelConnector@0.0.0.0:50070
>>> 2010-12-28 12:23:26,360 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Web-server up at: 0.0.0.0:50070
>>> 2010-12-28 12:23:26,360 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
>>> 2010-12-28 12:23:26,362 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9000: starting
>>> 2010-12-28 12:23:26,362 INFO org.apache.hadoop.ipc.Server: IPC Server handler 0 on 9000: starting
>>> 2010-12-28 12:23:26,366 INFO org.apache.hadoop.ipc.Server: IPC Server handler 1 on 9000: starting
>>> 2010-12-28 12:23:26,369 INFO org.apache.hadoop.ipc.Server: IPC Server handler 2 on 9000: starting
>>> 2010-12-28 12:23:26,370 INFO org.apache.hadoop.ipc.Server: IPC Server handler 3 on 9000: starting
>>> 2010-12-28 12:23:26,370 INFO org.apache.hadoop.ipc.Server: IPC Server handler 5 on 9000: starting
>>> 2010-12-28 12:23:26,370 INFO org.apache.hadoop.ipc.Server: IPC Server handler 6 on 9000: starting
>>> 2010-12-28 12:23:26,370 INFO org.apache.hadoop.ipc.Server: IPC Server handler 7 on 9000: starting
>>> 2010-12-28 12:23:26,370 INFO org.apache.hadoop.ipc.Server: IPC Server handler 8 on 9000: starting
>>> 2010-12-28 12:23:26,371 INFO org.apache.hadoop.ipc.Server: IPC Server handler 4 on 9000: starting
>>> 2010-12-28 12:23:26,372 INFO org.apache.hadoop.ipc.Server: IPC Server handler 9 on 9000: starting
>>>
>>> <<<<< JobTracker log >>>>
>>>
>>> [maha@speed logs]$ cat hadoop-maha-jobtracker-speed.cs.ucsb.edu.log
>>> 2010-12-28 12:23:29,321 INFO org.apache.hadoop.mapred.JobTracker: STARTUP_MSG:
>>> /************************************************************
>>> STARTUP_MSG: Starting JobTracker
>>> STARTUP_MSG:   host = speed.cs.ucsb.edu/128.111.43.50
>>> STARTUP_MSG:   args = []
>>> STARTUP_MSG:   version = 0.20.2
>>> STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20 -r 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
>>> ************************************************************/
>>> 2010-12-28 12:23:29,443 INFO org.apache.hadoop.mapred.JobTracker: Scheduler configured with (memSizeForMapSlotOnJT, memSizeForReduceSlotOnJT, limitMaxMemForMapTasks, limitMaxMemForReduceTasks) (-1, -1, -1, -1)
>>> 2010-12-28 12:23:29,487 INFO org.apache.hadoop.ipc.metrics.RpcMetrics: Initializing RPC Metrics with hostName=JobTracker, port=9001
>>> 2010-12-28 12:23:29,559 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
>>> 2010-12-28 12:23:29,745 INFO org.apache.hadoop.http.HttpServer: Port returned by webServer.getConnectors()[0].getLocalPort() before open() is -1. Opening the listener on 50030
>>> 2010-12-28 12:23:29,746 INFO org.apache.hadoop.http.HttpServer: listener.getLocalPort() returned 50030 webServer.getConnectors()[0].getLocalPort() returned 50030
>>> 2010-12-28 12:23:29,746 INFO org.apache.hadoop.http.HttpServer: Jetty bound to port 50030
>>> 2010-12-28 12:23:29,746 INFO org.mortbay.log: jetty-6.1.14
>>>
>>>
>>>
>>>     Thanks guys for your help,
>>>          Maha
>>>
>>>
>
