hadoop-general mailing list archives

From Sonal Goyal <sonalgoy...@gmail.com>
Subject Re: java.lang.NullPointerException at java.util.concurrent.ConcurrentHashMap
Date Wed, 28 Jul 2010 17:20:40 GMT
Hi Sameer,

Do you see all the processes up? Can you check the logs and see what is going
wrong? I suspect you will need to make the entries in /etc/hosts anyway.
Somewhere in the Hadoop code the IP-to-hostname mapping is done; I forget the
exact location.
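
For a single-node setup, an /etc/hosts along these lines is usually enough
(the hostname below is only a placeholder; use whatever the hostname command
returns on your box):

    127.0.0.1      localhost
    173.1.86.194   your-hostname.reverse.gogrid.com   your-hostname

The important part is that the name the daemons bind to resolves consistently
to the same address everywhere.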

Thanks and Regards,
Sonal
www.meghsoft.com
http://in.linkedin.com/in/sonalgoyal


On Wed, Jul 28, 2010 at 10:38 PM, Sameer Joshi <
sameer.joshi@serenesoftware.com> wrote:

> Hi Sonal,
> Thank you for replying.
>
> I'm trying to run it as a single node, so I had localhost in all the conf
> files and in /etc/hosts. I tried replacing it with 127.0.0.1 in all the
> Hadoop conf files (I read to do that on some forum), but with no change.
>
> I figured I didn't need a hostname entry since I was trying to run it
> locally, but I still tried replacing localhost with 173.1.86.194. However,
> as expected, it could not connect to the server.
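>
> For reference, the entries I was changing are of this form (property names
> as in Hadoop 0.20.x; the ports are just the usual single-node ones, so
> treat this as a sketch of my setup rather than the exact files):
>
>   <!-- conf/core-site.xml -->
>   <property>
>     <name>fs.default.name</name>
>     <value>hdfs://localhost:9000</value>
>   </property>
>
>   <!-- conf/mapred-site.xml -->
>   <property>
>     <name>mapred.job.tracker</name>
>     <value>localhost:9001</value>
>   </property>
>
> i.e. I swapped localhost for 127.0.0.1 (and later 173.1.86.194) in those
> values.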
>
> Any thoughts?
>
> Thanks for your help, I'm new at this and feel much obliged.
>
> Regards,
> Sameer
>
> On Wed, Jul 28, 2010 at 12:12 AM, Sonal Goyal <sonalgoyal4@gmail.com>
> wrote:
>
> > Hi Sameer,
> >
> > For GoGrid, you will have to configure /etc/hosts with the IP and hostname
> > for the master and slaves. If you run hostname and nslookup -sil, you will
> > get the details for your machines. Make sure /etc/hosts has an entry like:
> >
> > 173.1.45.202 173.1.45.202.reverse.gogrid.com 28684_1_85017_358406
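> >
> > For example, the two extra fields in that line come from something like
> > this (output trimmed, shown only to indicate which command gives which
> > field):
> >
> > # hostname
> > 28684_1_85017_358406
> > # nslookup -sil 173.1.45.202
> > 202.45.1.173.in-addr.arpa    name = 173.1.45.202.reverse.gogrid.com.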
> >
> > Thanks and Regards,
> > Sonal
> > www.meghsoft.com
> > http://in.linkedin.com/in/sonalgoyal
> >
> >
> > On Wed, Jul 28, 2010 at 6:42 AM, Sameer Joshi <
> > sameer.joshi@serenesoftware.com> wrote:
> >
> > > I installed Java and Hadoop on a GoGrid cloud server running Red Hat
> > > Enterprise Linux Server release 5.1 (Tikanga). Hadoop installed fine and
> > > starts fine; however, I get an error (java.lang.NullPointerException at
> > > java.util.concurrent.ConcurrentHashMap) while running the Hadoop
> > > wordcount example. My guess was that this was a localhost or IPv6 issue.
> > >
> > > * I have tried replacing 'localhost' with both the local IP and the
> > > server IP address (when out of options) in the Hadoop conf files
> > > * I have disabled IPv6 in both sysctl.conf and hadoop-env.sh (the former
> > > followed by a server restart); the settings I used are sketched below
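> > >
> > > Roughly (from memory, so treat this as a sketch rather than the exact
> > > lines):
> > >
> > > # /etc/sysctl.conf
> > > net.ipv6.conf.all.disable_ipv6 = 1
> > > net.ipv6.conf.default.disable_ipv6 = 1
> > >
> > > # conf/hadoop-env.sh
> > > export HADOOP_OPTS=-Djava.net.preferIPv4Stack=true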
> > >
> > > Any thoughts?
> > > Thank you.
> > >
> > >
> > > The output is given below:
> > >
> > > # bin/hadoop jar hadoop-0.20.2-examples.jar wordcount datasets tests/out7
> > > 10/07/27 05:44:59 INFO input.FileInputFormat: Total input paths to process : 1
> > > 10/07/27 05:44:59 INFO mapred.JobClient: Running job: job_201007270544_0001
> > > 10/07/27 05:45:00 INFO mapred.JobClient:  map 0% reduce 0%
> > > 10/07/27 05:45:12 INFO mapred.JobClient:  map 100% reduce 0%
> > > 10/07/27 05:45:17 INFO mapred.JobClient: Task Id : attempt_201007270544_0001_r_000000_0, Status : FAILED
> > > Error: java.lang.NullPointerException
> > >         at java.util.concurrent.ConcurrentHashMap.get(Unknown Source)
> > >         at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$GetMapEventsThread.getMapCompletionEvents(ReduceTask.java:2683)
> > >         at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$GetMapEventsThread.run(ReduceTask.java:2605)
> > >
> > > 10/07/27 05:45:24 INFO mapred.JobClient: Task Id : attempt_201007270544_0001_r_000000_1, Status : FAILED
> > > Error: java.lang.NullPointerException
> > >         at java.util.concurrent.ConcurrentHashMap.get(Unknown Source)
> > >         at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$GetMapEventsThread.getMapCompletionEvents(ReduceTask.java:2683)
> > >         at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$GetMapEventsThread.run(ReduceTask.java:2605)
> > >
> > > 10/07/27 05:45:30 INFO mapred.JobClient: Task Id : attempt_201007270544_0001_r_000000_2, Status : FAILED
> > > Error: java.lang.NullPointerException
> > >         at java.util.concurrent.ConcurrentHashMap.get(Unknown Source)
> > >         at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$GetMapEventsThread.getMapCompletionEvents(ReduceTask.java:2683)
> > >         at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$GetMapEventsThread.run(ReduceTask.java:2605)
> > >
> > > 10/07/27 05:45:39 INFO mapred.JobClient: Job complete: job_201007270544_0001
> > > 10/07/27 05:45:39 INFO mapred.JobClient: Counters: 12
> > > 10/07/27 05:45:39 INFO mapred.JobClient:   Job Counters
> > > 10/07/27 05:45:39 INFO mapred.JobClient:     Launched reduce tasks=4
> > > 10/07/27 05:45:39 INFO mapred.JobClient:     Launched map tasks=1
> > > 10/07/27 05:45:39 INFO mapred.JobClient:     Data-local map tasks=1
> > > 10/07/27 05:45:39 INFO mapred.JobClient:     Failed reduce tasks=1
> > > 10/07/27 05:45:39 INFO mapred.JobClient:   FileSystemCounters
> > > 10/07/27 05:45:39 INFO mapred.JobClient:     HDFS_BYTES_READ=15319
> > > 10/07/27 05:45:39 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=12847
> > > 10/07/27 05:45:39 INFO mapred.JobClient:   Map-Reduce Framework
> > > 10/07/27 05:45:39 INFO mapred.JobClient:     Combine output records=934
> > > 10/07/27 05:45:39 INFO mapred.JobClient:     Map input records=149
> > > 10/07/27 05:45:39 INFO mapred.JobClient:     Spilled Records=934
> > > 10/07/27 05:45:39 INFO mapred.JobClient:     Map output bytes=25346
> > > 10/07/27 05:45:39 INFO mapred.JobClient:     Combine input records=2541
> > > 10/07/27 05:45:39 INFO mapred.JobClient:     Map output records=2541
> > >
> > > --
> > > Dr. Sameer Joshi, Ph.D.
> > > Senior computer scientist,
> > > Serene Software.
> > >
> >
>
>
>
> --
> Dr. Sameer Joshi, Ph.D.
> Senior computer scientist,
> Serene Software.
>
