giraph-user mailing list archives

From Ryan Compton <>
Subject Spark 1.0: slf4j version conflicts with pig
Date Tue, 27 May 2014 21:38:35 GMT
I use both Pig and Spark. All my code is built with Maven into a single
*-jar-with-dependencies.jar. I recently upgraded to Spark 1.0, and now
all my Pig scripts fail with:

Caused by: java.lang.RuntimeException: Could not resolve error that
occured when launching map reduce job: java.lang.NoSuchMethodError:
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$JobControlThreadExceptionHandler.uncaughtException(
at java.lang.Thread.dispatchUncaughtException(

Did Spark 1.0 change the version of slf4j it depends on? I can't seem
to track down the conflict via mvn dependency:tree.
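For anyone hitting the same thing: a sketch of how one might narrow this
down with the Maven dependency plugin. The -Dincludes filter limits the
tree to slf4j artifacts, and -Dverbose also shows versions that Maven
omitted due to conflict resolution (the spark-core coordinates below are
an assumption based on a Spark 1.0 / Scala 2.10 build):

```
# Show only slf4j artifacts and the dependency paths pulling them in
mvn dependency:tree -Dincludes=org.slf4j

# Also show versions omitted by Maven's conflict resolution
mvn dependency:tree -Dverbose -Dincludes=org.slf4j
```

If spark-core turns out to bring in an incompatible slf4j, one option is
to exclude it there so the version Pig expects wins (hypothetical pom
fragment, adjust artifact IDs to your build):

```
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.0.0</version>
  <exclusions>
    <!-- keep the slf4j version that Pig was built against -->
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-api</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```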
