spark-user mailing list archives

From YaoPau <jonrgr...@gmail.com>
Subject Pyspark ImportError: No module named definitions
Date Tue, 25 Aug 2015 15:37:57 GMT
I have three modules:

*join_driver.py* - the driver; imports 'joined_paths_all' and calls some of
its functions to wrangle RDDs

*joined_paths_all.py* - all of this project's wrangling functions are defined
here; imports 'definitions'

*definitions.py* - contains all the regex definitions and global variables I
use across many data-wrangling applications

All three files are in the same folder, and that folder is on my PYTHONPATH.
In join_driver, "import joined_paths_all" seems to work (no error). But it
looks like joined_paths_all is unable to import definitions, as I'm getting
"ImportError: No module named definitions".

I tried putting joined_paths_all.py and definitions.py into a zip file and
adding it with --py-files, but I still get the same error. Any ideas?
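A quick check that may help (the module contents below are dummy stand-ins, not the real files): when a zip is shipped with --py-files, PySpark puts the zip itself on each worker's sys.path, so both modules must sit at the top level of the archive, not inside an enclosing folder. Plain Python can simulate the worker-side import:

```python
import os
import sys
import tempfile
import zipfile

workdir = tempfile.mkdtemp()

# Stand-ins for the real modules (contents invented for this sketch):
with open(os.path.join(workdir, "definitions.py"), "w") as f:
    f.write('PATTERN = "demo"\n')
with open(os.path.join(workdir, "joined_paths_all.py"), "w") as f:
    f.write("import definitions\n"
            "def tag(s):\n"
            "    return definitions.PATTERN + ':' + s\n")

dep_zip = os.path.join(workdir, "deps.zip")
with zipfile.ZipFile(dep_zip, "w") as z:
    # arcname keeps each file at the archive root -- no enclosing folder
    z.write(os.path.join(workdir, "definitions.py"), arcname="definitions.py")
    z.write(os.path.join(workdir, "joined_paths_all.py"),
            arcname="joined_paths_all.py")

sys.path.insert(0, dep_zip)       # this is what PySpark does on each worker
import joined_paths_all
print(joined_paths_all.tag("ok"))  # -> demo:ok

# To ship the same archive with the job (note the flag is --py-files):
#   spark-submit --py-files deps.zip join_driver.py
# or, from an already-running driver:
#   sc.addPyFile("deps.zip")
```

If the zip was built with the folder included (e.g. myproject/definitions.py inside the archive), the worker-side import fails exactly as in the traceback below, even though the driver-side import works via PYTHONPATH.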



... schemaBirfModel = sqlContext.createDataFrame(birfModelData, schema)
[Stage 0:> (0 + 2) / 6][Stage 1:> (0 + 0) / 2]
15/08/25 11:26:13 ERROR TaskSetManager: Task 1 in stage 0.0 failed 4 times; aborting job
Traceback (most recent call last):
  File "<stdin>", line 2, in <module>
  File "/opt/cloudera/parcels/CDH-5.4.4-1.cdh5.4.4.p894.568/lib/spark/python/pyspark/sql/context.py", line 341, in createDataFrame
    return self.applySchema(data, schema)
  File "/opt/cloudera/parcels/CDH-5.4.4-1.cdh5.4.4.p894.568/lib/spark/python/pyspark/sql/context.py", line 241, in applySchema
    rows = rdd.take(10)
  File "/opt/cloudera/parcels/CDH-5.4.4-1.cdh5.4.4.p894.568/lib/spark/python/pyspark/rdd.py", line 1225, in take
    res = self.context.runJob(self, takeUpToNumLeft, p, True)
  File "/opt/cloudera/parcels/CDH-5.4.4-1.cdh5.4.4.p894.568/lib/spark/python/pyspark/context.py", line 843, in runJob
    it = self._jvm.PythonRDD.runJob(self._jsc.sc(), mappedRDD._jrdd, javaPartitions, allowLocal)
  File "/opt/cloudera/parcels/CDH-5.4.4-1.cdh5.4.4.p894.568/lib/spark/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 538, in __call__
  File "/opt/cloudera/parcels/CDH-5.4.4-1.cdh5.4.4.p894.568/lib/spark/python/lib/py4j-0.8.2.1-src.zip/py4j/protocol.py", line 300, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.runJob.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 0.0 failed 4 times, most recent failure: Lost task 1.3 in stage 0.0 (TID 8, phd40010020.autotrader.com): org.apache.spark.api.python.PythonException: Traceback (most recent call last):
  File "/opt/cloudera/parcels/CDH-5.4.4-1.cdh5.4.4.p894.568/jars/spark-assembly-1.3.0-cdh5.4.4-hadoop2.6.0-cdh5.4.4.jar/pyspark/worker.py", line 88, in main
    command = pickleSer._read_with_length(infile)
  File "/opt/cloudera/parcels/CDH-5.4.4-1.cdh5.4.4.p894.568/jars/spark-assembly-1.3.0-cdh5.4.4-hadoop2.6.0-cdh5.4.4.jar/pyspark/serializers.py", line 156, in _read_with_length
    return self.loads(obj)
  File "/opt/cloudera/parcels/CDH-5.4.4-1.cdh5.4.4.p894.568/jars/spark-assembly-1.3.0-cdh5.4.4-hadoop2.6.0-cdh5.4.4.jar/pyspark/serializers.py", line 405, in loads
    return cPickle.loads(obj)
ImportError: No module named definitions

	at org.apache.spark.api.python.PythonRDD$$anon$1.read(PythonRDD.scala:135)
	at org.apache.spark.api.python.PythonRDD$$anon$1.<init>(PythonRDD.scala:176)
	at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:94)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:244)
	at org.apache.spark.api.python.PairwiseRDD.compute(PythonRDD.scala:307)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:244)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:68)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
	at org.apache.spark.scheduler.Task.run(Task.scala:64)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)

Driver stacktrace:
	at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1203)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1192)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1191)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1191)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:693)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:693)
	at scala.Option.foreach(Option.scala:236)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:693)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1393)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1354)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Pyspark-ImportError-No-module-named-definitions-tp24447.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.


