spark-user mailing list archives

From Cyril Scetbon <cyril.scet...@free.fr>
Subject Re: spark-shell failing but pyspark works
Date Mon, 04 Apr 2016 20:11:29 GMT
I suppose it doesn't work using spark-shell either? Can you confirm?

Thanks
> On Apr 3, 2016, at 03:39, Mich Talebzadeh <mich.talebzadeh@gmail.com> wrote:
> 
> This works fine for me
> 
> val sparkConf = new SparkConf().
>              setAppName("StreamTest").
>              setMaster("yarn-client").
>              set("spark.cores.max", "12").
>              set("spark.driver.allowMultipleContexts", "true").
>              set("spark.hadoop.validateOutputSpecs", "false")
> 
> Time: 1459669805000 ms
> -------------------------------------------
> -------------------------------------------
> Time: 1459669860000 ms
> -------------------------------------------
> (Sun Apr 3 08:35:01 BST 2016  ======= Sending messages from rhes5)
> 
> 
> 
> 
> Dr Mich Talebzadeh
>  
> LinkedIn  https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>  
> http://talebzadehmich.wordpress.com
>  
> 
> On 3 April 2016 at 03:34, Cyril Scetbon <cyril.scetbon@free.fr> wrote:
> Nobody has any idea?
> 
> > On Mar 31, 2016, at 23:22, Cyril Scetbon <cyril.scetbon@free.fr> wrote:
> >
> > Hi,
> >
> > I'm having issues creating a StreamingContext in Scala using spark-shell. It tries
> > to use the localhost interface, but the ApplicationMaster is not running on that
> > interface:
> >
> > ERROR ApplicationMaster: Failed to connect to driver at localhost:47257, retrying ...
> >
> > I don't have this issue with Python: pyspark works fine (you can see it uses the
> > IP address):
> >
> > ApplicationMaster: Driver now available: 192.168.10.100:43290
> >
> > I use similar code in both cases, though:
> >
> > test.scala :
> > --------------
> >
> > import org.apache.spark._
> > import org.apache.spark.streaming._
> > val app = "test-scala"
> > val conf = new SparkConf().setAppName(app).setMaster("yarn-client")
> > val ssc = new StreamingContext(conf, Seconds(3))
> >
> > command used: spark-shell -i test.scala
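[A possible workaround, not from the thread: the error above suggests the driver advertised `localhost` to the ApplicationMaster, so one option is to resolve a routable, non-loopback address yourself and set the standard `spark.driver.host` property before creating the context. The sketch below is plain JVM code (no Spark dependency needed for the address lookup); the commented `SparkConf` usage is hypothetical.]

```scala
import java.net.{Inet4Address, NetworkInterface}
import scala.jdk.CollectionConverters._

object DriverHost {
  // First non-loopback IPv4 address on a live interface -- the address the
  // ApplicationMaster would need to reach the driver on.
  def resolve(): String =
    NetworkInterface.getNetworkInterfaces.asScala
      .filter(i => i.isUp && !i.isLoopback)
      .flatMap(_.getInetAddresses.asScala)
      .collectFirst { case a: Inet4Address => a.getHostAddress }
      .getOrElse("127.0.0.1") // fall back to loopback if nothing else is configured

  def main(args: Array[String]): Unit = {
    val host = resolve()
    // Hypothetical usage with the conf from test.scala:
    //   val conf = new SparkConf().setAppName(app).setMaster("yarn-client")
    //     .set("spark.driver.host", host)
    println(host)
  }
}
```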
> >
> > test.py :
> > -----------
> >
> > from pyspark import SparkConf, SparkContext
> > from pyspark.streaming import StreamingContext
> > app = "test-python"
> > conf = SparkConf().setAppName(app).setMaster("yarn-client")
> > sc = SparkContext(conf=conf)
> > ssc = StreamingContext(sc, 3)
> >
> > command used: pyspark test.py
> >
> > Any idea why Scala can't instantiate it? I thought Python was merely using Scala
> > under the hood, but it seems there are differences. Are there any parameters set
> > by Scala but not by Python?
> >
> > Thanks
> > --
> > Cyril SCETBON
> >
> >
> > ---------------------------------------------------------------------
> > To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> > For additional commands, e-mail: user-help@spark.apache.org
> >
> 
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org
> 
> 

