accumulo-user mailing list archives

From: Bob.Thor...@l-3com.com
Subject: RE: AcculumoFileOutputFormat class cannot be found by child jvm
Date: Wed, 23 May 2012 16:07:18 GMT
Attached are the contents of the job.jar, the job.xml, and the taskTracker directory (the job
number changed because the GC removed the contents before I could copy/paste).

-----Original Message-----
From: Billie J Rinaldi [mailto:billie.j.rinaldi@ugov.gov] 
Sent: Wednesday, May 23, 2012 08:18
To: Thorman, Bob @ ISG - ComCept
Subject: Re: AcculumoFileOutputFormat class cannot be found by child jvm

On Tuesday, May 22, 2012 6:21:28 PM, "Bob Thorman" <Bob.Thorman@l-3com.com> wrote:
> Here is the exec string. As you can see, the elusive jars are on the 
> list, but the child jvm didn't get the memo on where they are. The 
> jars did not appear in the taskTracker or JobTracker directories. The 
> paths to the jar files on the local file system are correct.
> 
> exec /cloudbase/hadoop-0.20.2/bin/hadoop jar 
> /mnt/hgfs/CSI.Cloudbase/Java/CloudbaseServices/out/artifacts/CloudbaseIngesters/CloudbaseIngesters.jar 
> com.comcept.cloudbase.ingesters.placemarks.PlacemarkIngester -libjars 
> "/cloudbase/accumulo-1.4.0/lib/libthrift-0.6.1.jar,/cloudbase/accumulo-1.4.0/lib/accumulo-core-1.4.0.jar,/cloudbase/zookeeper-3.4.3/zookeeper-3.4.3.jar,/cloudbase/accumulo-1.4.0/lib/cloudtrace-1.4.0.jar,/mnt/hgfs/CSI.Cloudbase/Java/CloudbaseServices/out/artifacts/CloudbaseIngesters/CloudbaseIngesters.jar,/usr/lib/ncct/kxml2-2.3.0.jar,/usr/lib/ncct/xmlpull-1.1.3.1.jar,/usr/lib/ncct/xstream-1.4.1.jar,/cloudbase/accumulo-1.4.0/lib/accumulo-core-1.4.0.jar,/cloudbase/accumulo-1.4.0/lib/commons-collections-3.2.jar,/cloudbase/accumulo-1.4.0/lib/commons-configuration-1.5.jar,/cloudbase/accumulo-1.4.0/lib/commons-io-1.4.jar,/cloudbase/accumulo-1.4.0/lib/commons-jci-core-1.0.jar,/cloudbase/accumulo-1.4.0/lib/commons-jci-fam-1.0.jar,/cloudbase/accumulo-1.4.0/lib/commons-lang-2.4.jar,/cloudbase/accumulo-1.4.0/lib/commons-logging-1.0.4.jar,/cloudbase/accumulo-1.4.0/lib/commons-logging-api-1.0.4.jar"

The libjars string looks ok, assuming all those jars exist in those locations.  Can you identify
what subset of these jars made it into the MapReduce config?  That might tell us where the
error is occurring.
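
One quick way to check that, assuming the job is still being launched with the same arguments: the -libjars entries are stored by GenericOptionsParser under the "tmpjars" property before they are copied into job.xml, so printing that property shows exactly which jars made it in. A minimal sketch (the class name is just an example):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.util.GenericOptionsParser;

    public class LibJarsCheck {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Parses -libjars, -files, -D, etc. and updates conf in place.
        new GenericOptionsParser(conf, args);
        // Whatever prints here is the same comma-separated list that ends
        // up in job.xml as "tmpjars"; missing entries never made it in.
        System.out.println("tmpjars = " + conf.get("tmpjars"));
      }
    }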

Billie


> When I set up a DistributedCache on hdfs:// and added these jars to 
> the job configuration, the child jvm found them just fine.
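
For reference, the DistributedCache route described above looks roughly like the sketch below. The helper name is made up, and it assumes the jars have already been copied to HDFS:

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.filecache.DistributedCache;
    import org.apache.hadoop.fs.Path;

    public class CacheJars {
      // Adds each jar (already present on HDFS) to the task classpath so
      // the child JVM can load classes from it.
      public static void addJars(Configuration conf, String... hdfsJars)
          throws IOException {
        for (String jar : hdfsJars) {
          DistributedCache.addFileToClassPath(new Path(jar), conf);
        }
      }
    }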
> 
> -----Original Message-----
> From: Billie J Rinaldi [mailto:billie.j.rinaldi@ugov.gov]
> Sent: Tuesday, May 22, 2012 16:50
> To: user@accumulo.apache.org
> Subject: Re: AcculumoFileOutputFormat class cannot be found by child 
> jvm
> 
> On Tuesday, May 22, 2012 2:19:34 PM, "John Vines"
> <john.w.vines@ugov.gov> wrote:
> > I'm wondering if our tool.sh plays nice with an explicit -libjars.
> 
> It looks like tool.sh is attempting to parse explicit libjars, but it 
> might not be working correctly. The second-to-last line of tool.sh is
> 
> #echo exec "$HADOOP_HOME/bin/hadoop" jar "$TOOLJAR" $CLASSNAME -libjars \"$LIB_JARS\" $ARGS
> 
> If you could uncomment this line and send us what it displays, it 
> might give us some insight into where tool.sh is going wrong. The 
> -libjars are supposed to include the user-specified libjars as well as 
> accumulo-core, zookeeper, etc.
> 
> Billie
> 
> 
> > John
> >
> >
> > On Tue, May 22, 2012 at 2:07 PM, William Slacum <wslacum@gmail.com>
> > wrote:
> >
> >
> > Make sure that the paths you pass to `-libjars` are correct. I have 
> > noticed behavior with Hadoop where it will silently drop the end of 
> > a libjars string if the paths do not exist.
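
A quick way to catch that before submitting is to sanity-check each -libjars entry on the local filesystem; a minimal sketch (the class name is made up):

    import java.io.File;

    public class CheckLibJars {
      public static void main(String[] args) {
        // args[0] is the comma-separated value passed to -libjars
        for (String path : args[0].split(",")) {
          if (!new File(path).isFile()) {
            System.err.println("MISSING: " + path);
          }
        }
      }
    }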
> >
> >
> >
> > On Tue, May 22, 2012 at 11:03 AM, Keith Turner <keith@deenlo.com>
> > wrote:
> > > I think it should be in the job.xml file. The way the classpath 
> > > for the java program that starts the map reduce job is configured 
> > > is different from how the classpath for the remote tasks is 
> > > configured. So your program that starts the map reduce job has 
> > > AccumuloFileOutputFormat on its classpath, because you use a 
> > > static method on it and it runs.
> > > So I am thinking it's not in -libjars for some reason, or maybe 
> > > you are not using Hadoop Tool?
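
The Tool/ToolRunner pattern Keith is referring to looks roughly like this; without it, -libjars and the other generic options are never parsed into the job configuration. The class and job names below are placeholders, not the actual ingester:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.conf.Configured;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.util.Tool;
    import org.apache.hadoop.util.ToolRunner;

    public class IngesterTool extends Configured implements Tool {
      public int run(String[] args) throws Exception {
        // getConf() already holds tmpjars etc. parsed from -libjars.
        Job job = new Job(getConf(), "ingest");
        // ... set mapper, output format, and input/output paths here ...
        return job.waitForCompletion(true) ? 0 : 1;
      }

      public static void main(String[] args) throws Exception {
        // ToolRunner strips the generic options before calling run().
        System.exit(ToolRunner.run(new Configuration(), new IngesterTool(), args));
      }
    }

Launching that class through tool.sh (or bin/hadoop jar ... -libjars ...) is what lets the parsed jars flow into job.xml for the child JVMs.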
> > >
> > > On Tue, May 22, 2012 at 1:54 PM, <Bob.Thorman@l-3com.com> wrote:
> > >> No, it's not in the job.xml file. A reference to 
> > >> AccumuloFileOutputFormat is there, but not to accumulo-core-1.4.0.jar.
> > >> The job*.jar file is referenced there. I was under the impression 
> > >> that the child jvm inherits the parent jvm's attributes, which is 
> > >> what makes up the job*.jar file. Is that not correct?
> > >>
> > >> -----Original Message-----
> > >> From: Keith Turner [mailto:keith@deenlo.com]
> > >> Sent: Tuesday, May 22, 2012 12:07
> > >> To: user@accumulo.apache.org
> > >> Subject: Re: AcculumoFileOutputFormat class cannot be found by 
> > >> child jvm
> > >>
> > >> If you look at Job.xml, do you see accumulo-core there? There 
> > >> should be a link to this file on the jobs page on the 
> > >> tasktracker.
> > >>
> > >> On Tue, May 22, 2012 at 10:40 AM, <Bob.Thorman@l-3com.com>
> > >> wrote:
> > >>> I upgraded to accumulo-1.4.0 and updated my map/reduce jobs, and 
> > >>> now they don't run. The parent classpath has the 
> > >>> accumulo-core-1.4.0.jar file included. Do the accumulo jar files 
> > >>> have to be manually put on a distributed cache? Any help is 
> > >>> appreciated.
> > >>>
> > >>> [hadoop@redhat-cloudbase1 placemarks]$ ./runPlacemarkIngester.sh
> > >>> Found 5 items
> > >>> drwxrwxr-x - hadoop hadoop 0 2012-05-21 14:13 /accumulo
> > >>> drwxrwxr-x - hadoop hadoop 0 2012-05-21 15:06 /data
> > >>> drwxr-xr-x - hadoop hadoop 0 2012-05-22 08:58 /input
> > >>> drwxr-xr-x - hadoop hadoop 0 2012-05-22 08:58 /output
> > >>> drwxrwxr-x - hadoop hadoop 0 2012-05-21 14:34 /usr
> > >>> Deleted hdfs://redhat-cloudbase1:9000/output
> > >>> Deleted hdfs://redhat-cloudbase1:9000/input
> > >>> 12/05/22 09:05:29 INFO placemarks.PlacemarkIngester: Invoking ToolRunner.run
> > >>> 12/05/22 09:05:29 INFO placemarks.PlacemarkIngester: zooKeeper is redhat-cloudbase1:2181
> > >>> 12/05/22 09:05:29 INFO placemarks.PlacemarkIngester: instanceName is NCCT-Cloudbase
> > >>> 12/05/22 09:05:29 INFO placemarks.PlacemarkIngester: timeTableName is NCCTServicesTimes
> > >>> 12/05/22 09:05:29 INFO placemarks.PlacemarkIngester: geoTableName is NCCTServicesGeos
> > >>> 12/05/22 09:05:29 INFO placemarks.PlacemarkIngester: metadataTableName is NCCTServicesMetadata
> > >>> 12/05/22 09:05:29 INFO placemarks.PlacemarkIngester: edgeTableName is NCCTEdgeTable
> > >>> 12/05/22 09:05:29 INFO placemarks.PlacemarkIngester: userName is NCCT.Services.Client
> > >>> 12/05/22 09:05:29 INFO placemarks.PlacemarkIngester: password is *********
> > >>> 12/05/22 09:05:29 INFO placemarks.PlacemarkIngester: visibility is public,BD2,UNCLASSIFIED-NO_COMPARTMENT-UNRESTRICTED
> > >>> 12/05/22 09:05:29 INFO placemarks.PlacemarkIngester: inputDir is /input
> > >>> 12/05/22 09:05:29 INFO placemarks.PlacemarkIngester: outputDir is /output
> > >>> 12/05/22 09:05:29 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.3-1240972, built on 02/06/2012 10:48 GMT
> > >>> 12/05/22 09:05:29 INFO zookeeper.ZooKeeper: Client environment:host.name=redhat-cloudbase1
> > >>> 12/05/22 09:05:29 INFO zookeeper.ZooKeeper: Client environment:java.version=1.6.0_32
> > >>> 12/05/22 09:05:29 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Sun Microsystems Inc.
> > >>> 12/05/22 09:05:29 INFO zookeeper.ZooKeeper: Client environment:java.home=/usr/java/jdk1.6.0_32/jre
> > >>> 12/05/22 09:05:29 INFO zookeeper.ZooKeeper: Client environment:java.class.path=/cloudbase/hadoop-0.20.2/bin/../conf:/usr/java/jdk1.6.0_32/lib/tools.jar:/cloudbase/hadoop-0.20.2/bin/..:/cloudbase/hadoop-0.20.2/bin/../hadoop-0.20.2-core.jar:/cloudbase/hadoop-0.20.2/bin/../lib/commons-cli-1.2.jar:/cloudbase/hadoop-0.20.2/bin/../lib/commons-codec-1.3.jar:/cloudbase/hadoop-0.20.2/bin/../lib/commons-el-1.0.jar:/cloudbase/hadoop-0.20.2/bin/../lib/commons-httpclient-3.0.1.jar:/cloudbase/hadoop-0.20.2/bin/../lib/commons-logging-1.0.4.jar:/cloudbase/hadoop-0.20.2/bin/../lib/commons-logging-api-1.0.4.jar:/cloudbase/hadoop-0.20.2/bin/../lib/commons-net-1.4.1.jar:/cloudbase/hadoop-0.20.2/bin/../lib/core-3.1.1.jar:/cloudbase/hadoop-0.20.2/bin/../lib/hadoop-0.20.2-ant.jar:/cloudbase/hadoop-0.20.2/bin/../lib/hadoop-0.20.2-core.jar:/cloudbase/hadoop-0.20.2/bin/../lib/hadoop-0.20.2-examples.jar:/cloudbase/hadoop-0.20.2/bin/../lib/hadoop-0.20.2-test.jar:/cloudbase/hadoop-0.20.2/bin/../lib/hadoop-0.20.2-tools.jar:/cloudbase/hadoop-0.20.2/bin/../lib/hsqldb-1.8.0.10.jar:/cloudbase/hadoop-0.20.2/bin/../lib/jasper-compiler-5.5.12.jar:/cloudbase/hadoop-0.20.2/bin/../lib/jasper-runtime-5.5.12.jar:/cloudbase/hadoop-0.20.2/bin/../lib/jets3t-0.6.1.jar:/cloudbase/hadoop-0.20.2/bin/../lib/jetty-6.1.14.jar:/cloudbase/hadoop-0.20.2/bin/../lib/jetty-util-6.1.14.jar:/cloudbase/hadoop-0.20.2/bin/../lib/junit-3.8.1.jar:/cloudbase/hadoop-0.20.2/bin/../lib/kfs-0.2.2.jar:/cloudbase/hadoop-0.20.2/bin/../lib/log4j-1.2.15.jar:/cloudbase/hadoop-0.20.2/bin/../lib/mockito-all-1.8.0.jar:/cloudbase/hadoop-0.20.2/bin/../lib/oro-2.0.8.jar:/cloudbase/hadoop-0.20.2/bin/../lib/servlet-api-2.5-6.1.14.jar:/cloudbase/hadoop-0.20.2/bin/../lib/slf4j-api-1.4.3.jar:/cloudbase/hadoop-0.20.2/bin/../lib/slf4j-log4j12-1.4.3.jar:/cloudbase/hadoop-0.20.2/bin/../lib/xmlenc-0.52.jar:/cloudbase/hadoop-0.20.2/bin/../lib/jsp-2.1/jsp-2.1.jar:/cloudbase/hadoop-0.20.2/bin/../lib/jsp-2.1/jsp-api-2.1.jar:/cloudbase/accumulo-1.4.0/lib/libthrift-0.6.1.jar:/cloudbase/accumulo-1.4.0/lib/accumulo-core-1.4.0.jar:/cloudbase/zookeeper-3.4.3/zookeeper-3.4.3.jar:/cloudbase/accumulo-1.4.0/lib/cloudtrace-1.4.0.jar:/usr/lib/ncct/kxml2-2.3.0.jar:/usr/lib/ncct/xmlpull-1.1.3.1.jar:/usr/lib/ncct/xstream-1.4.1.jar:/cloudbase/accumulo-1.4.0/lib/commons-collections-3.2.jar:/cloudbase/accumulo-1.4.0/lib/commons-configuration-1.5.jar:/cloudbase/accumulo-1.4.0/lib/commons-io-1.4.jar:/cloudbase/accumulo-1.4.0/lib/commons-jci-core-1.0.jar:/cloudbase/accumulo-1.4.0/lib/commons-jci-fam-1.0.jar:/cloudbase/accumulo-1.4.0/lib/commons-lang-2.4.jar:/cloudbase/accumulo-1.4.0/lib/commons-logging-1.0.4.jar:/cloudbase/accumulo-1.4.0/lib/commons-logging-api-1.0.4.jar:
> > >>> 12/05/22 09:05:29 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/cloudbase/hadoop-0.20.2/bin/../lib/native/Linux-amd64-64
> > >>> 12/05/22 09:05:29 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp
> > >>> 12/05/22 09:05:29 INFO zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
> > >>> 12/05/22 09:05:29 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux
> > >>> 12/05/22 09:05:29 INFO zookeeper.ZooKeeper: Client environment:os.arch=amd64
> > >>> 12/05/22 09:05:29 INFO zookeeper.ZooKeeper: Client environment:os.version=2.6.32-131.0.15.el6.x86_64
> > >>> 12/05/22 09:05:29 INFO zookeeper.ZooKeeper: Client environment:user.name=hadoop
> > >>> 12/05/22 09:05:29 INFO zookeeper.ZooKeeper: Client environment:user.home=/home/hadoop
> > >>> 12/05/22 09:05:29 INFO zookeeper.ZooKeeper: Client environment:user.dir=/mnt/hgfs/CSI.Cloudbase/Java/CloudbaseServices/CloudbaseIngesters/src/com/comcept/cloudbase/ingesters/placemarks
> > >>> 12/05/22 09:05:29 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=redhat-cloudbase1:2181 sessionTimeout=30000 watcher=org.apache.accumulo.core.zookeeper.ZooSession$AccumuloWatcher@4f4db0e3
> > >>> 12/05/22 09:05:29 INFO zookeeper.ClientCnxn: Opening socket connection to server /192.168.136.2:2181
> > >>> 12/05/22 09:05:29 WARN client.ZooKeeperSaslClient: SecurityException: java.lang.SecurityException: Unable to locate a login configuration occurred when trying to find JAAS configuration.
> > >>> 12/05/22 09:05:29 INFO client.ZooKeeperSaslClient: Client will not SASL-authenticate because the default JAAS configuration section 'Client' could not be found. If you are not using SASL, you may ignore this. On the other hand, if you expected SASL to work, please fix your JAAS configuration.
> > >>> 12/05/22 09:05:29 INFO zookeeper.ClientCnxn: Socket connection established to redhat-cloudbase1/192.168.136.2:2181, initiating session
> > >>> 12/05/22 09:05:29 INFO zookeeper.ClientCnxn: Session establishment complete on server redhat-cloudbase1/192.168.136.2:2181, sessionid = 0x1377101615a323f, negotiated timeout = 30000
> > >>> 12/05/22 09:05:30 INFO input.FileInputFormat: Total input paths to process : 3
> > >>> 12/05/22 09:05:30 INFO mapred.JobClient: Running job: job_201205211505_0012
> > >>> 12/05/22 09:05:31 INFO mapred.JobClient: map 0% reduce 0%
> > >>> 12/05/22 09:05:38 INFO mapred.JobClient: Task Id : attempt_201205211505_0012_m_000004_0, Status : FAILED
> > >>> java.lang.RuntimeException: java.lang.ClassNotFoundException: org.apache.accumulo.core.client.mapreduce.AccumuloFileOutputFormat
> > >>>     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:809)
> > >>>     at org.apache.hadoop.mapreduce.JobContext.getOutputFormatClass(JobContext.java:193)
> > >>>     at org.apache.hadoop.mapred.Task.initialize(Task.java:413)
> > >>>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:288)
> > >>>     at org.apache.hadoop.mapred.Child.main(Child.java:170)
> > >>> Caused by: java.lang.ClassNotFoundException: org.apache.accumulo.core.client.mapreduce.AccumuloFileOutputFormat
> > >>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> > >>>     at java.security.AccessController.doPrivileged(Native Method)
> > >>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> > >>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > >>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> > >>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > >>>     at java.lang.Class.forName0(Native Method)
> > >>>     at java.lang.Class.forName(Class.java:247)
> > >>>     at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:762)
> > >>>     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:807)
> > >>>     ... 4 more
> > >>>
> > >>> Bob Thorman
> > >>> Engineering Fellow
> > >>> L-3 Communications, ComCept
> > >>> 1700 Science Place
> > >>> Rockwall, TX 75032
> > >>> (972) 772-7501 work
> > >>> Bob.Thorman@ncct.af.smil.mil
> > >>> rdthorm@nsa.ic.gov
> > >>>
> > >>>