From Cyril Scetbon <cyril.scet...@free.fr>
Subject spark-shell failing but pyspark works
Date Fri, 01 Apr 2016 03:22:08 GMT
Hi,

I'm having issues creating a StreamingContext in Scala using spark-shell. It tries to reach the driver on the localhost interface, but the ApplicationMaster is not running on that interface:

ERROR ApplicationMaster: Failed to connect to driver at localhost:47257, retrying ...

I don't have the issue with Python: pyspark works fine (you can see it uses the IP
address):

ApplicationMaster: Driver now available: 192.168.10.100:43290

I use similar code in both cases, though:

test.scala :
--------------

import org.apache.spark._
import org.apache.spark.streaming._
val app = "test-scala"
val conf = new SparkConf().setAppName(app).setMaster("yarn-client")
val ssc = new StreamingContext(conf, Seconds(3))

command used: spark-shell -i test.scala
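
FWIW, the workaround I was planning to try (just a guess on my side, not something I have
verified, and the file name below is only an example) is to advertise the driver host
explicitly, reusing the IP that the pyspark run reports above:

test-driver-host.scala :
------------------------

import org.apache.spark._
import org.apache.spark.streaming._
// Same as test.scala, but forcing the driver host to the node's real IP
// (192.168.10.100 is just the address shown in the pyspark log above)
val conf = new SparkConf()
  .setAppName("test-scala")
  .setMaster("yarn-client")
  .set("spark.driver.host", "192.168.10.100")
val ssc = new StreamingContext(conf, Seconds(3))

The same thing from the command line should be:
spark-shell --conf spark.driver.host=192.168.10.100 -i test.scala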

test.py :
-----------

from pyspark import SparkConf, SparkContext
from pyspark.streaming import StreamingContext
app = "test-python"
conf = SparkConf().setAppName(app).setMaster("yarn-client")
sc = SparkContext(conf=conf)
ssc = StreamingContext(sc, 3)

command used: pyspark test.py

Any idea why Scala can't instantiate it? I thought Python was simply using Scala under the
hood, but it seems there are differences. Are there any parameters set by Scala but not by
Python?
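
In case it helps to narrow it down, what I was planning to compare on both sides (just a
diagnostic idea, nothing authoritative) is the driver host/port each context actually ends
up with, e.g. from spark-shell:

// sc is the SparkContext created by the shell
println(sc.getConf.getOption("spark.driver.host"))
println(sc.getConf.getOption("spark.driver.port"))
// or dump everything explicitly set, to diff against the pyspark side
println(sc.getConf.toDebugString)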

Thanks
-- 
Cyril SCETBON



