Hi,
You may need to increase the memory for the map/reduce tasks that run in the
cluster. By default they get only 200 MB, and you can adjust the value by
putting the following in conf/mapred-site.xml:
  <property>
    <name>mapred.child.java.opts</name>
    <value>-Xmx1024m</value>
  </property>
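
If editing the cluster-wide config is not convenient, the same property can
usually be passed per job on the command line, because TrainNaiveBayesJob runs
through ToolRunner (you can see it in your stack trace), which accepts Hadoop
generic options like -D. A rough sketch, assuming you launch training with the
standard mahout script and the trainnb driver; keep the rest of your existing
arguments unchanged (shown here only as a placeholder):

  mahout trainnb -Dmapred.child.java.opts=-Xmx1024m <your existing trainnb options>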
On Fri, Jul 11, 2014 at 8:05 AM, prabhat wrote:
> Hi,
>
> I am getting the following error while training a model. All the MapReduce
> jobs run fine without errors on HDP (Ambari 1.5), as per the logs, but I
> assume the error below happens while creating the model. Please help.
>
> Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
>         at org.apache.mahout.math.map.OpenIntDoubleHashMap.rehash(OpenIntDoubleHashMap.java:491)
>         at org.apache.mahout.math.map.OpenIntDoubleHashMap.put(OpenIntDoubleHashMap.java:444)
>         at org.apache.mahout.math.RandomAccessSparseVector.setQuick(RandomAccessSparseVector.java:127)
>         at org.apache.mahout.math.VectorWritable.readFields(VectorWritable.java:120)
>         at org.apache.hadoop.io.SequenceFile$Reader.getCurrentValue(SequenceFile.java:2178)
>         at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:2306)
>         at org.apache.mahout.common.iterator.sequencefile.SequenceFileIterator.computeNext(SequenceFileIterator.java:101)
>         at org.apache.mahout.common.iterator.sequencefile.SequenceFileIterator.computeNext(SequenceFileIterator.java:40)
>         at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:143)
>         at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:138)
>         at com.google.common.collect.Iterators$5.hasNext(Iterators.java:543)
>         at com.google.common.collect.ForwardingIterator.hasNext(ForwardingIterator.java:43)
>         at org.apache.mahout.classifier.naivebayes.BayesUtils.readModelFromDir(BayesUtils.java:81)
>         at org.apache.mahout.classifier.naivebayes.training.TrainNaiveBayesJob.run(TrainNaiveBayesJob.java:164)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>         at org.apache.mahout.classifier.naivebayes.training.TrainNaiveBayesJob.main(TrainNaiveBayesJob.java:62)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
>         at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:145)
>         at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:153)
>         at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:195)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
>
>
> Prabhat Kumar Singh
>
--
With best wishes, Alex Ott
http://alexott.net/
Twitter: alexott_en (English), alexott (Russian)
Skype: alex.ott