hadoop-general mailing list archives

From Sonal Goyal <sonalgoy...@gmail.com>
Subject Re: java.lang.NullPointerException at java.util.concurrent.ConcurrentHashMap
Date Sun, 01 Aug 2010 14:59:36 GMT
173.1.45.202 173.1.45.202.reverse.gogrid.com 28684_1_85017_358406

Change that to whatever your IPs/hostnames are.
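
For the single-node pseudo-cluster case you asked about, where the master and
slaves are all localhost, something along these lines is usually enough. The
second line is only a sketch: 173.1.86.194 is the server IP from your earlier
mail, and myserver / myserver.example.com are placeholders for whatever
hostname and nslookup report on your box:

127.0.0.1       localhost localhost.localdomain
# replace the placeholders below with your real IP and hostname
173.1.86.194    myserver.example.com myserver

With that in place you should be able to leave localhost in the Hadoop conf
files for the pseudo-distributed setup.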

Thanks and Regards,
Sonal
www.meghsoft.com
http://in.linkedin.com/in/sonalgoyal


On Sun, Aug 1, 2010 at 7:35 AM, Sameer Joshi <sameer.joshi@serenesoftware.com> wrote:

> Hi again Sonal,
> Just checking: if I'm running a pseudo-cluster on one node, so the master
> and slaves are all localhost, what should /etc/hosts look like in this
> case?
>
> Thanks,
> Sameer
>
> On Sat, Jul 31, 2010 at 3:37 AM, Sameer Joshi <sameer.joshi@serenesoftware.com> wrote:
>
> > Hi Sonal,
> > Sorry for the late reply, I was traveling. The logs, unfortunately, did not
> > yield too much information. I'll try some more configurations with /etc/hosts
> > edits, and see if I can't get it to behave.
> > Thanks so much again for your help.
> > Sameer
> >
> >
> > On Wed, Jul 28, 2010 at 1:20 PM, Sonal Goyal <sonalgoyal4@gmail.com> wrote:
> >
> >> Hi Sameer,
> >>
> >> Do you see all processes up? Can you check the logs and see what is going
> >> wrong? I suspect you will need to make the entries in /etc/hosts anyway.
> >> Somewhere in the Hadoop code, the IP-to-hostname mapping is done; I forget
> >> the exact location.
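
(To put "all processes up" concretely: on a pseudo-distributed 0.20.x install,
jps from the JDK is the quickest check, and the daemon logs are where the real
error usually shows up. A rough sketch, assuming the default log location:)

# list the running Hadoop daemons; for a single-node setup you should see
# NameNode, SecondaryNameNode, DataNode, JobTracker and TaskTracker
$ jps
# each daemon writes its own .log file under $HADOOP_HOME/logs by default
$ ls $HADOOP_HOME/logs/*.log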
> >>
> >> Thanks and Regards,
> >> Sonal
> >> www.meghsoft.com
> >> http://in.linkedin.com/in/sonalgoyal
> >>
> >>
> >> On Wed, Jul 28, 2010 at 10:38 PM, Sameer Joshi <sameer.joshi@serenesoftware.com> wrote:
> >>
> >> > Hi Sonal,
> >> > Thank you for replying.
> >> >
> >> > I'm trying to run it as a single node, so I had localhost in all the conf
> >> > files and in /etc/hosts. I tried replacing it with 127.0.0.1 in all the
> >> > Hadoop conf files (I read to do that on some forum), but with no change.
> >> >
> >> > I figured I didn't need a hostname since I was trying to run it locally,
> >> > but I still tried replacing localhost with 173.1.86.194. However, as
> >> > expected, it could not connect to the server.
> >> >
> >> > Any thoughts?
> >> >
> >> > Thanks for your help, I'm new at this and feel much obliged.
> >> >
> >> > Regards,
> >> > Sameer
> >> >
> >> > On Wed, Jul 28, 2010 at 12:12 AM, Sonal Goyal <sonalgoyal4@gmail.com> wrote:
> >> >
> >> > > Hi Sameer,
> >> > >
> >> > > For GoGrid, you will have to configure /etc/hosts with the IP and
> >> > > hostname for the master and slaves. If you run hostname and nslookup -sil,
> >> > > you will get the details for your machines. Make sure /etc/hosts has an
> >> > > entry like:
> >> > >
> >> > > 173.1.45.202 173.1.45.202.reverse.gogrid.com 28684_1_85017_358406
> >> > >
> >> > > Thanks and Regards,
> >> > > Sonal
> >> > > www.meghsoft.com
> >> > > http://in.linkedin.com/in/sonalgoyal
> >> > >
> >> > >
> >> > > On Wed, Jul 28, 2010 at 6:42 AM, Sameer Joshi <sameer.joshi@serenesoftware.com> wrote:
> >> > >
> >> > > > I installed Java and Hadoop on a GoGrid cloud server using Red Hat
> >> > > > Enterprise Linux Server release 5.1 (Tikanga). Hadoop installed fine and
> >> > > > starts fine; however, I get an error (java.lang.NullPointerException at
> >> > > > java.util.concurrent.ConcurrentHashMap) while running the Hadoop
> >> > > > wordcount example. My guess was that this was a localhost or IPv6 issue.
> >> > > >
> >> > > > * I have tested replacing 'localhost' with both the local IP and the
> >> > > > server IP addresses (when out of options) in the Hadoop conf files
> >> > > > * I have disabled IPv6 both in sysctl.conf and hadoop-env.sh (the former
> >> > > > followed by a server restart)
> >> > > >
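
(For reference, the usual hadoop-env.sh way of forcing IPv4 is the JVM flag
below; the sysctl line is only a sketch and is kernel-dependent, so it may not
behave the same on RHEL 5.1:)

# conf/hadoop-env.sh
export HADOOP_OPTS=-Djava.net.preferIPv4Stack=true

# /etc/sysctl.conf (kernel-dependent; apply with sysctl -p and/or a reboot)
net.ipv6.conf.all.disable_ipv6 = 1

(That said, I still suspect the hostname/IP mapping rather than IPv6 here,
which is why I keep pointing at /etc/hosts.)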
> >> > > > Any thoughts?
> >> > > > Thank you.
> >> > > >
> >> > > >
> >> > > > The output is given below
> >> > > >
> >> > > > # bin/hadoop jar hadoop-0.20.2-examples.jar wordcount datasets tests/out7
> >> > > > 10/07/27 05:44:59 INFO input.FileInputFormat: Total input paths to process : 1
> >> > > > 10/07/27 05:44:59 INFO mapred.JobClient: Running job: job_201007270544_0001
> >> > > > 10/07/27 05:45:00 INFO mapred.JobClient: map 0% reduce 0%
> >> > > > 10/07/27 05:45:12 INFO mapred.JobClient: map 100% reduce 0%
> >> > > > 10/07/27 05:45:17 INFO mapred.JobClient: Task Id : attempt_201007270544_0001_r_000000_0, Status : FAILED
> >> > > > Error: java.lang.NullPointerException
> >> > > >   at java.util.concurrent.ConcurrentHashMap.get(Unknown Source)
> >> > > >   at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$GetMapEventsThread.getMapCompletionEvents(ReduceTask.java:2683)
> >> > > >   at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$GetMapEventsThread.run(ReduceTask.java:2605)
> >> > > >
> >> > > >
> >> > > > 10/07/27 05:45:24 INFO mapred.JobClient: Task Id : attempt_201007270544_0001_r_000000_1, Status : FAILED
> >> > > > Error: java.lang.NullPointerException
> >> > > >   at java.util.concurrent.ConcurrentHashMap.get(Unknown Source)
> >> > > >   at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$GetMapEventsThread.getMapCompletionEvents(ReduceTask.java:2683)
> >> > > >   at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$GetMapEventsThread.run(ReduceTask.java:2605)
> >> > > >
> >> > > >
> >> > > > 10/07/27 05:45:30 INFO mapred.JobClient: Task Id : attempt_201007270544_0001_r_000000_2, Status : FAILED
> >> > > > Error: java.lang.NullPointerException
> >> > > >   at java.util.concurrent.ConcurrentHashMap.get(Unknown Source)
> >> > > >   at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$GetMapEventsThread.getMapCompletionEvents(ReduceTask.java:2683)
> >> > > >   at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$GetMapEventsThread.run(ReduceTask.java:2605)
> >> > > >
> >> > > >
> >> > > > 10/07/27 05:45:39 INFO mapred.JobClient: Job complete: job_201007270544_0001
> >> > > >
> >> > > > 10/07/27 05:45:39 INFO mapred.JobClient: Counters: 12
> >> > > > 10/07/27 05:45:39 INFO mapred.JobClient: Job Counters
> >> > > > 10/07/27 05:45:39 INFO mapred.JobClient: Launched reduce tasks=4
> >> > > > 10/07/27 05:45:39 INFO mapred.JobClient: Launched map tasks=1
> >> > > > 10/07/27 05:45:39 INFO mapred.JobClient: Data-local map tasks=1
> >> > > > 10/07/27 05:45:39 INFO mapred.JobClient: Failed reduce tasks=1
> >> > > > 10/07/27 05:45:39 INFO mapred.JobClient: FileSystemCounters
> >> > > > 10/07/27 05:45:39 INFO mapred.JobClient: HDFS_BYTES_READ=15319
> >> > > > 10/07/27 05:45:39 INFO mapred.JobClient: FILE_BYTES_WRITTEN=12847
> >> > > > 10/07/27 05:45:39 INFO mapred.JobClient: Map-Reduce Framework
> >> > > > 10/07/27 05:45:39 INFO mapred.JobClient: Combine output records=934
> >> > > > 10/07/27 05:45:39 INFO mapred.JobClient: Map input records=149
> >> > > > 10/07/27 05:45:39 INFO mapred.JobClient: Spilled Records=934
> >> > > > 10/07/27 05:45:39 INFO mapred.JobClient: Map output bytes=25346
> >> > > > 10/07/27 05:45:39 INFO mapred.JobClient: Combine input records=2541
> >> > > > 10/07/27 05:45:39 INFO mapred.JobClient: Map output records=2541
> >> > > >
> >> > > > --
> >> > > > Dr. Sameer Joshi, Ph.D.
> >> > > > Senior computer scientist,
> >> > > > Serene Software.
> >> > > >
> >> > >
> >> >
> >> >
> >> >
> >> > --
> >> > Dr. Sameer Joshi, Ph.D.
> >> > Senior computer scientist,
> >> > Serene Software.
> >> >
> >>
> >
> >
> >
> > --
> > Dr. Sameer Joshi, Ph.D.
> > Senior computer scientist,
> > Serene Software.
> >
> >
>
>
> --
> Dr. Sameer Joshi, Ph.D.
> Senior computer scientist,
> Serene Software.
>
