From: Arvind Surve
Reply-To: dev@systemml.apache.org
To: dev@systemml.apache.org
Date: Sun, 28 May 2017 18:59:18 +0000 (UTC)
Subject: Re: Execution Error

I would not recommend setting environment variables in the notebook itself; rather, set those environment variables outside the notebook and then start it.
There is a mix of slashes in your path settings, e.g. c:\Spark/bin, and in many other places. For some reason Anaconda 3.5 may have handled it, but Anaconda 3.6 does not. First, try making the slashes consistent, e.g. change /bin to \bin. Whether or not that works, set these environment variables outside the notebook.

Arvind Surve | Spark Technology Center | http://www.spark.tc/

On Sunday, May 28, 2017, 11:51:22 AM PDT, arijit chakraborty wrote:
Thanks a lot Arvind for your quick reply! We were using the following way to execute SystemML through Spark in an IPython notebook. It was running perfectly with Spark 2.1.0 and Anaconda 3.5, but with Spark 2.1.1 and Anaconda 3.6 I'm getting this error.

    import os
    import sys
    import numpy

    spark_path = "C:\spark"
    os.environ['SPARK_HOME'] = spark_path
    os.environ['HADOOP_HOME'] = spark_path

    sys.path.append(spark_path + "/bin")
    sys.path.append(spark_path + "/python")
    sys.path.append(spark_path + "/python/pyspark/")
    sys.path.append(spark_path + "/python/lib")
    sys.path.append(spark_path + "/python/lib/pyspark.zip")
    sys.path.append(spark_path + "/python/lib/py4j-0.10.4-src.zip")

    from pyspark import SparkContext
    from pyspark import SparkConf

    SparkContext.setSystemProperty('spark.executor.memory', '15g')
    sc = SparkContext("local[*]", "test")

    from pyspark.sql import SQLContext
    import systemml as sml
    sqlCtx = SQLContext(sc)
    ml = sml.MLContext(sc).setStatistics(True)

Regards,
Arijit

________________________________
From: Arvind Surve
Sent: Monday, May 29, 2017 12:10:10 AM
To: dev@systemml.apache.org; Dev
Subject: Re: Execution Error

Seems like you have a Spark path issue, either due to a ' ' (space) in the directory path or because the Spark path has not been set correctly.
Cannot run program "C:\spark": CreateProcess error=5, Access is denied
        at java.lang.ProcessBuilder.start(Unknown Source)

Arvind Surve | Spark Technology Center | http://www.spark.tc/

On Sunday, May 28, 2017, 11:36:16 AM PDT, arijit chakraborty wrote:
Hi,

I was running our SystemML code with Spark 2.1.1 and Anaconda 3.6, and we started getting this error. I could not understand why we are getting it. Can anyone please help?

Thank you!
Arijit

---------------------------------------------------------------------------
Py4JJavaError                             Traceback (most recent call last)
<ipython-input> in <module>()
     14                 ).output("dml_run","n_trt","max_level")
     15
---> 16 beta, n_trt, max_level = ml.execute(script).get("dml_run","n_trt","max_level")

C:\ProgramData\Anaconda3\lib\site-packages\systemml\mlcontext.py in execute(self, script)
    335                 py4j.java_gateway.get_method(script_java, "in")(key, val)
    336             else:
--> 337                 py4j.java_gateway.get_method(script_java, "in")(key, _py2java(self._sc, val))
    338         for val in script._output:
    339             script_java.out(val)

C:\spark\python\lib\py4j-0.10.4-src.zip\py4j\java_gateway.py in __call__(self, *args)
   1131         answer = self.gateway_client.send_command(command)
   1132         return_value = get_return_value(
-> 1133             answer, self.gateway_client, self.target_id, self.name)
   1134
   1135         for temp_arg in temp_args:

C:\spark/python\pyspark\sql\utils.py in deco(*a, **kw)
     61     def deco(*a, **kw):
     62         try:
---> 63             return f(*a, **kw)
     64         except py4j.protocol.Py4JJavaError as e:
     65             s = e.java_exception.toString()

C:\spark\python\lib\py4j-0.10.4-src.zip\py4j\protocol.py in get_return_value(answer, gateway_client, target_id, name)
    317                 raise Py4JJavaError(
    318                     "An error occurred while calling {0}{1}{2}.\n".
--> 319                     format(target_id, ".", name), value)
    320             else:
    321                 raise Py4JError(

Py4JJavaError: An error occurred while calling o52.in.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 9 in stage 0.0 failed 1 times, most recent failure: Lost task 9.0 in stage 0.0 (TID 9, localhost, executor driver): java.io.IOException: Cannot run program "C:\spark": CreateProcess error=5, Access is denied
        at java.lang.ProcessBuilder.start(Unknown Source)
        at org.apache.spark.api.python.PythonWorkerFactory.createSimpleWorker(PythonWorkerFactory.scala:120)
        at org.apache.spark.api.python.PythonWorkerFactory.create(PythonWorkerFactory.scala:67)
        at org.apache.spark.SparkEnv.createPythonWorker(SparkEnv.scala:116)
        at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:128)
        at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:63)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        ... (the MapPartitionsRDD.compute / RDD.computeOrReadCheckpoint / RDD.iterator frames above repeat several times)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
        at org.apache.spark.scheduler.Task.run(Task.scala:99)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:322)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
        at java.lang.Thread.run(Unknown Source)
Caused by: java.io.IOException: CreateProcess error=5, Access is denied
        at java.lang.ProcessImpl.create(Native Method)
        at java.lang.ProcessImpl.<init>(Unknown Source)
        at java.lang.ProcessImpl.start(Unknown Source)
        ... 35 more

Driver stacktrace:
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1435)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1423)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1422)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1422)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802)
        at scala.Option.foreach(Option.scala:257)
        at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:802)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1650)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1605)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1594)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
        at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:628)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1925)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1938)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1951)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1965)
        at org.apache.spark.rdd.RDD.count(RDD.scala:1158)
        at org.apache.spark.api.java.JavaRDDLike$class.count(JavaRDDLike.scala:455)
        at org.apache.spark.api.java.AbstractJavaRDDLike.count(JavaRDDLike.scala:45)
        at org.apache.sysml.runtime.instructions.spark.utils.RDDConverterUtils.dataFrameToBinaryBlock(RDDConverterUtils.java:236)
        at org.apache.sysml.api.mlcontext.MLContextConversionUtil.dataFrameToMatrixBinaryBlocks(MLContextConversionUtil.java:430)
        at org.apache.sysml.api.mlcontext.MLContextConversionUtil.dataFrameToMatrixObject(MLContextConversionUtil.java:330)
        at org.apache.sysml.api.mlcontext.MLContextConversionUtil.dataFrameToMatrixObject(MLContextConversionUtil.java:311)
        at org.apache.sysml.api.mlcontext.MLContextUtil.convertInputType(MLContextUtil.java:516)
        at org.apache.sysml.api.mlcontext.Script.in(Script.java:347)
        at org.apache.sysml.api.mlcontext.Script.in(Script.java:306)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.lang.reflect.Method.invoke(Unknown Source)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
        at py4j.Gateway.invoke(Gateway.java:280)
        at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
        at py4j.commands.CallCommand.execute(CallCommand.java:79)
        at py4j.GatewayConnection.run(GatewayConnection.java:214)
        at java.lang.Thread.run(Unknown Source)
Caused by: java.io.IOException: Cannot run program "C:\spark": CreateProcess error=5, Access is denied
        at java.lang.ProcessBuilder.start(Unknown Source)
        at org.apache.spark.api.python.PythonWorkerFactory.createSimpleWorker(PythonWorkerFactory.scala:120)
        at org.apache.spark.api.python.PythonWorkerFactory.create(PythonWorkerFactory.scala:67)
        at org.apache.spark.SparkEnv.createPythonWorker(SparkEnv.scala:116)
        at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:128)
        at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:63)
        ... (RDD compute/iterator frames repeat as in the task failure above)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
        at org.apache.spark.scheduler.Task.run(Task.scala:99)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:322)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
        ... 1 more
Caused by: java.io.IOException: CreateProcess error=5, Access is denied
        at java.lang.ProcessImpl.create(Native Method)
        at java.lang.ProcessImpl.<init>(Unknown Source)
        at java.lang.ProcessImpl.start(Unknown Source)
        ... 35 more
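A minimal sketch of the advice given in this thread: set SPARK_HOME / HADOOP_HOME in the shell before launching the notebook, and build the sys.path entries with a single, consistent slash style via os.path.join instead of string concatenation. The C:\spark location is the poster's layout; adjust it for your machine.

    import os
    import sys

    # The poster's Spark root; adjust as needed.
    spark_path = r"C:\spark"

    # Prefer setting these in the shell before starting the notebook
    # (e.g. `set SPARK_HOME=C:\spark` in cmd); the fallback here only
    # fills them in if they are not already set.
    os.environ.setdefault("SPARK_HOME", spark_path)
    os.environ.setdefault("HADOOP_HOME", spark_path)

    # os.path.join uses the platform separator consistently, which
    # avoids mixed paths such as "C:\spark/python/lib/pyspark.zip".
    for parts in (("bin",),
                  ("python",),
                  ("python", "lib"),
                  ("python", "lib", "pyspark.zip"),
                  ("python", "lib", "py4j-0.10.4-src.zip")):
        sys.path.append(os.path.join(spark_path, *parts))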