spark-user mailing list archives

From Cyril Scetbon <>
Subject spark-shell failing but pyspark works
Date Fri, 01 Apr 2016 03:22:08 GMT

I'm having issues creating a StreamingContext with Scala using spark-shell. It tries to connect to
the driver on the localhost interface, but the Application Master is not running on that interface :

ERROR ApplicationMaster: Failed to connect to driver at localhost:47257, retrying ...

I don't have the issue with Python: pyspark works fine (you can see it uses the IP
address) :

ApplicationMaster: Driver now available:

I use similar code in both cases though :

test.scala :

import org.apache.spark._
import org.apache.spark.streaming._
val app = "test-scala"
val conf = new SparkConf().setAppName(app).setMaster("yarn-client")
val ssc = new StreamingContext(conf, Seconds(3))

command used : spark-shell -i test.scala

test.py :

from pyspark import SparkConf, SparkContext
from pyspark.streaming import StreamingContext
app = "test-python"
conf = SparkConf().setAppName(app).setMaster("yarn-client")
sc = SparkContext(conf=conf)
ssc = StreamingContext(sc, 3)

command used : pyspark

Any idea why Scala can't instantiate it ? I thought pyspark was mostly just calling the Scala
code under the hood, but it seems there are differences. Are there any parameters set by the
Scala path but not the Python one ?
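One thing that might be worth trying (a sketch, not a confirmed fix): Spark lets you pin the driver's advertised address via the `spark.driver.host` configuration property, which could stop the Application Master from trying to reach the driver on localhost. The address below is a placeholder; substitute the routable IP of the machine running spark-shell.

```scala
import org.apache.spark.SparkConf

// Hypothetical workaround: explicitly advertise the driver's routable
// address so YARN's ApplicationMaster does not try localhost.
// "192.168.1.10" is a placeholder for the driver machine's actual IP.
val conf = new SparkConf()
  .setAppName("test-scala")
  .setMaster("yarn-client")
  .set("spark.driver.host", "192.168.1.10")
```

If the Scala shell and pyspark resolve the driver address differently, forcing it explicitly should at least tell you whether the binding is the problem.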

