mahout-user mailing list archives

From: Sergey Bartunov <sbos....@gmail.com>
Subject: Re: StackOverflowError on running bin/mahout with HADOOP_CONF_DIR specified
Date: Sun, 03 Jul 2011 19:19:23 GMT
Oh, some code works and some doesn't (these stack overflows). I'm confused.
It seems I need to run everything in a fresh, isolated environment.
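
(By a fresh environment I mean something like a clean checkout and a
full rebuild, e.g.:

    git clone https://github.com/apache/mahout.git && cd mahout
    mvn clean install -DskipTests

just guessing at my own concrete next steps here.)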

On 3 July 2011 21:58, Ted Dunning <ted.dunning@gmail.com> wrote:
> Sean,
>
> Any ideas how that tiny little commit caused this?
>
> On Sun, Jul 3, 2011 at 5:07 AM, Sergey Bartunov <sbos.net@gmail.com> wrote:
>
>> Well, I think the problem could be in one of the latest commits:
>>
>> https://github.com/apache/mahout/commit/2a8aa48dbba06e6a12ea3c87196d729e8fe1f5cc
>>
>> When I switched back to my code (it's about
>> https://github.com/sbos/mahout/commit/172b6fef278ef2ff134c68fe3c4b176df9b673d4
>> ), everything started working again.
>>
>> I changed only my own code, so I expect the problem is in that commit.
>>
>> I'll investigate a little more there.
>>
>> On 3 July 2011 02:04, Sean Owen <srowen@gmail.com> wrote:
>> > Looks like a Hadoop bug as far as I can tell... or even something
>> > weird about your local config or file system. It's failing somehow
>> > while reading an XML file from the file system?
>> >
>> > It's not from Mahout in any event.
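>> >
>> > If I read the bottom of that trace right, it's a URL-handler cycle
>> > rather than anything Mahout-specific. A minimal sketch of the kind of
>> > setup that could produce it (my guess only; it assumes Hadoop's
>> > FsUrlStreamHandlerFactory ends up registered somewhere, and the class
>> > name and config key below are just illustrative):
>> >
>> >     import java.net.URL;
>> >     import org.apache.hadoop.conf.Configuration;
>> >     import org.apache.hadoop.fs.FsUrlStreamHandlerFactory;
>> >
>> >     public class OverflowSketch {
>> >       public static void main(String[] args) {
>> >         // Once this factory is registered, plain URLs (including the
>> >         // file: URLs the XML parser uses to read core-site.xml etc.)
>> >         // can be routed through Hadoop's FsUrlConnection.
>> >         URL.setURLStreamHandlerFactory(new FsUrlStreamHandlerFactory());
>> >
>> >         // Reading any value forces Configuration.loadResource(), whose
>> >         // XML parse opens a URL -> FsUrlConnection.connect() ->
>> >         // FileSystem.get() -> UserGroupInformation -> Configuration.get()
>> >         // -> loadResource() again, recursing until the stack overflows.
>> >         new Configuration().get("fs.default.name");
>> >       }
>> >     }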
>> >
>> > On Sat, Jul 2, 2011 at 8:01 PM, Sergey Bartunov <sbos.net@gmail.com> wrote:
>> >> All day I've been trying to work around the issue described at
>> >> http://stackoverflow.com/questions/6558606/hadoop-0-20-203-dont-load-configuration-files
>> >>
>> >> Now I pass HADOOP_CONF_DIR to mahout to ensure that all the necessary
>> >> configuration files are included. But if I do this, mahout fails with a
>> >> StackOverflowError in some configuration-related code.
>> >>
>> >> Here's part of the call stack:
>> >>
>> >> Running on hadoop, using HADOOP_HOME=/opt/hadoop
>> >> HADOOP_CONF_DIR=/opt/hadoop/conf
>> >> MAHOUT-JOB: /home/sbos/gsoc/mahout/examples/target/mahout-examples-0.6-SNAPSHOT-job.jar
>> >> 11/07/02 22:37:49 WARN driver.MahoutDriver: No hmmchunks.props found on classpath, will use command-line arguments only
>> >> Exception in thread "main" java.lang.StackOverflowError
>> >>       at java.io.UnixFileSystem.getBooleanAttributes0(Native Method)
>> >>       at java.io.UnixFileSystem.getBooleanAttributes(UnixFileSystem.java:228)
>> >>       at java.io.File.exists(File.java:733)
>> >>       at sun.misc.URLClassPath$FileLoader.getResource(URLClassPath.java:999)
>> >>       at sun.misc.URLClassPath$FileLoader.findResource(URLClassPath.java:966)
>> >>       at sun.misc.URLClassPath.findResource(URLClassPath.java:146)
>> >>       at java.net.URLClassLoader$2.run(URLClassLoader.java:385)
>> >>       at java.security.AccessController.doPrivileged(Native Method)
>> >>       at java.net.URLClassLoader.findResource(URLClassLoader.java:382)
>> >>       at java.lang.ClassLoader.getResource(ClassLoader.java:1003)
>> >>       at java.lang.ClassLoader.getResource(ClassLoader.java:998)
>> >>       at java.lang.ClassLoader.getResourceAsStream(ClassLoader.java:1193)
>> >>       at javax.xml.parsers.SecuritySupport$4.run(SecuritySupport.java:96)
>> >>       at java.security.AccessController.doPrivileged(Native Method)
>> >>       at javax.xml.parsers.SecuritySupport.getResourceAsStream(SecuritySupport.java:89)
>> >>       at javax.xml.parsers.FactoryFinder.findJarServiceProvider(FactoryFinder.java:250)
>> >>       at javax.xml.parsers.FactoryFinder.find(FactoryFinder.java:223)
>> >>       at javax.xml.parsers.DocumentBuilderFactory.newInstance(DocumentBuilderFactory.java:123)
>> >>       at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1109)
>> >>       at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:1093)
>> >>       at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:1037)
>> >>       at org.apache.hadoop.conf.Configuration.get(Configuration.java:415)
>> >>       at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:168)
>> >>       at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:159)
>> >>       at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:216)
>> >>       at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:409)
>> >>       at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:395)
>> >>       at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:1418)
>> >>       at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1319)
>> >>       at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:226)
>> >>       at org.apache.hadoop.fs.FsUrlConnection.connect(FsUrlConnection.java:45)
>> >>       at org.apache.hadoop.fs.FsUrlConnection.getInputStream(FsUrlConnection.java:56)
>> >>       at org.apache.xerces.impl.XMLEntityManager.setupCurrentEntity(Unknown Source)
>> >>       at org.apache.xerces.impl.XMLVersionDetector.determineDocVersion(Unknown Source)
>> >>
>> >> and so on, and so on. Does this look like a bug in Mahout/Hadoop, or is it my mistake?
>> >>
>> >
>>
>
