spark-dev mailing list archives

From Ted Yu <yuzhih...@gmail.com>
Subject Re: Building Spark with a Custom Version of Hadoop: HDFS ClassNotFoundException
Date Fri, 12 Feb 2016 01:29:00 GMT
The Hdfs class (org.apache.hadoop.fs.Hdfs) is in hadoop-hdfs-XX.jar.

Can you check the classpath to see if the above jar is there?

Please describe the command lines you used for building Hadoop / Spark.
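For comparison, a typical sequence for building Spark against a locally built Hadoop looks roughly like the following. This is a sketch: the profile names and the `2.7.2-custom` version string are examples, and must match the version your Hadoop build actually installs.

```shell
# 1. From the Hadoop source tree: install the custom build into the
#    local Maven repository so Spark can resolve it.
mvn install -DskipTests

# 2. From the Spark source tree: build against that Hadoop version.
#    -Dhadoop.version must match the version installed in step 1.
./build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.7.2-custom -DskipTests clean package
```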

Cheers

On Thu, Feb 11, 2016 at 5:15 PM, Charlie Wright <charliewright@live.ca>
wrote:

> I am having issues trying to run a test job on a built version of Spark
> with a custom Hadoop JAR.
> My custom Hadoop version runs without issues, and I can run jobs from a
> precompiled version of Spark (with Hadoop) with no problem.
>
> However, whenever I try to run the same Spark example on the Spark version
> with my custom Hadoop JAR, I get this error:
> "Exception in thread "main" java.lang.RuntimeException:
> java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.Hdfs not found"
>
> Does anybody know why this is happening?
>
> Thanks,
> Charles.
>
>
