spark-user mailing list archives

From Christophe Préaud <christophe.pre...@kelkoo.com>
Subject Re: Spark can't find jars
Date Wed, 15 Oct 2014 07:49:00 GMT
Hi Jimmy,
Did you try my patch?
The problem on my side was that the hadoop.tmp.dir property (in hadoop core-site.xml) was not handled
properly by Spark when it is set on multiple partitions/disks, i.e.:

<property>
  <name>hadoop.tmp.dir</name>
  <value>file:/d1/yarn/local,file:/d2/yarn/local,file:/d3/yarn/local,file:/d4/yarn/local,file:/d5/yarn/local,file:/d6/yarn/local,file:/d7/yarn/local</value>
</property>

Hence, you won't be hit by this bug if your hadoop.tmp.dir is set on one partition only.
If your hadoop.tmp.dir is also set on several partitions, I agree that it looks like a bug
in Spark.
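Until the patch is applied, a possible workaround is to keep hadoop.tmp.dir on a single partition. A minimal core-site.xml sketch (the /d1 path is just taken from the multi-disk example above; any single local directory should work):

```xml
<property>
  <name>hadoop.tmp.dir</name>
  <value>file:/d1/yarn/local</value>
</property>
```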

Christophe.

On 14/10/2014 18:50, Jimmy McErlain wrote:
So the only way that I could make this work was to build a fat jar file, as suggested earlier.
To me (and I am no expert) it seems like this is a bug.  Everything was working for me prior
to our upgrade to Spark 1.1 on Hadoop 2.2, but now it seems not to, i.e. packaging my jars
locally, then pushing them out to the cluster and pointing them at the corresponding dependent
jars.
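For reference, the fat-jar approach mentioned above is usually done with the sbt-assembly plugin. This is only a sketch, not the exact setup used in this thread; the plugin version and the "provided" scoping are assumptions:

```scala
// project/plugins.sbt -- pull in the sbt-assembly plugin (version is an assumption)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")

// build.sbt -- mark Spark itself as "provided" so the fat jar does not bundle it;
// the cluster already supplies Spark at runtime
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0" % "provided"
```

Running `sbt assembly` then produces a single jar that bundles the dependent jars, sidestepping the --jars distribution issue entirely.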

Sorry I cannot be more help!
J
JIMMY MCERLAIN
DATA SCIENTIST (NERD)
E: jimmy@sellpoints.com
M: 510.303.7751

On Tue, Oct 14, 2014 at 4:59 AM, Christophe Préaud <christophe.preaud@kelkoo.com>
wrote:
Hello,

I have already posted a message about the exact same problem, and proposed a patch (the subject
is "Application failure in yarn-cluster mode").
Can you test it and see if it works for you?
I would also be glad if someone could confirm that it is a bug in Spark 1.1.0.

Regards,
Christophe.


On 14/10/2014 03:15, Jimmy McErlain wrote:
BTW this has always worked for me before until we upgraded the cluster to Spark 1.1.1...
J

On Mon, Oct 13, 2014 at 5:39 PM, HARIPRIYA AYYALASOMAYAJULA <aharipriya92@gmail.com>
wrote:
Hello,

Can you check if the jar file is available in the target/scala-2.10 folder?

When you use sbt package to make the jar file, that is where the jar file would be located.

The following command works well for me:


spark-submit --class "Classname" --master yarn-cluster jarfile (with complete path)

Can you try checking with this initially and add other options later?
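Putting those steps together as a sketch (the class name and jar name below are placeholders, not taken from this thread):

```shell
# Build the jar; sbt writes it under target/scala-2.10/
sbt package

# Confirm the artifact is actually there before submitting
ls target/scala-2.10/*.jar

# Submit with the complete path to the jar ("MyClass" is a placeholder)
spark-submit --class "MyClass" --master yarn-cluster target/scala-2.10/myapp_2.10-1.0.jar
```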

On Mon, Oct 13, 2014 at 7:36 PM, Jimmy <jimmy@sellpoints.com>
wrote:
Having the exact same error with the exact same jar.... Do you work for Altiscale? :)
J

Sent from my iPhone

On Oct 13, 2014, at 5:33 PM, Andy Srine <andy.srine@gmail.com>
wrote:


Hi Guys,


Spark rookie here. I am getting a file not found exception on the --jars. This is in yarn-cluster
mode, and I am running the following command on our recently upgraded Spark 1.1.1 environment.


./bin/spark-submit --verbose --master yarn --deploy-mode cluster \
  --class myEngine \
  --driver-memory 1g \
  --driver-library-path /hadoop/share/hadoop/mapreduce/lib/hadoop-lzo-0.4.18-201406111750.jar \
  --executor-memory 5g --executor-cores 5 \
  --jars /home/andy/spark/lib/joda-convert-1.2.jar \
  --queue default --num-executors 4 \
  /home/andy/spark/lib/my-spark-lib_1.0.jar


This is the error I am hitting. Any tips would be much appreciated. The file permissions look
fine on my local disk.


14/10/13 22:49:39 INFO yarn.ApplicationMaster: Unregistering ApplicationMaster with FAILED
14/10/13 22:49:39 INFO impl.AMRMClientImpl: Waiting for application to be successfully unregistered.
Exception in thread "Driver" java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:162)
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 3 in stage
1.0 failed 4 times, most recent failure: Lost task 3.3 in stage 1.0 (TID 12, 122-67.vb2.company.com):
java.io.FileNotFoundException: ./joda-convert-1.2.jar (Permission denied)
        java.io.FileOutputStream.open(Native Method)
        java.io.FileOutputStream.<init>(FileOutputStream.java:221)
        com.google.common.io.Files$FileByteSink.openStream(Files.java:223)
        com.google.common.io.Files$FileByteSink.openStream(Files.java:211)



Thanks,
Andy




--
Regards,
Haripriya Ayyalasomayajula




________________________________
Kelkoo SAS
Société par Actions Simplifiée
Share capital: €4,168,964.30
Registered office: 8, rue du Sentier, 75002 Paris
425 093 069 RCS Paris

This message and its attachments are confidential and intended exclusively for their addressees.
If you are not the intended recipient of this message, please delete it and notify the sender.



