spark-issues mailing list archives

From "Stefano Parmesan (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-6677) pyspark.sql nondeterministic issue with row fields
Date Thu, 09 Apr 2015 09:02:15 GMT

    [ https://issues.apache.org/jira/browse/SPARK-6677?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14487010#comment-14487010 ]

Stefano Parmesan commented on SPARK-6677:
-----------------------------------------

Uhm, I don't know what to say. Let's try this: I've created a Docker image that reproduces
the issue; it's available here:
https://github.com/armisael/SPARK-6677

I tested it on three different machines, and the issue appeared on all of them. Can you give
it a try?
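
For reference, one hypothetical way to build and run the linked repository, assuming it ships
a Dockerfile at its root (the repository's README is authoritative):

    git clone https://github.com/armisael/SPARK-6677
    cd SPARK-6677
    docker build -t spark-6677-repro .
    docker run --rm -it spark-6677-repro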

> pyspark.sql nondeterministic issue with row fields
> --------------------------------------------------
>
>                 Key: SPARK-6677
>                 URL: https://issues.apache.org/jira/browse/SPARK-6677
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 1.3.0
>         Environment: spark version: spark-1.3.0-bin-hadoop2.4
> python version: Python 2.7.6
> operating system: MacOS, x86_64 x86_64 x86_64 GNU/Linux
>            Reporter: Stefano Parmesan
>            Assignee: Davies Liu
>              Labels: pyspark, row, sql
>
> The following issue happens only when running pyspark in the Python interpreter; it works
correctly with spark-submit.
> Reading two JSON files containing objects with different structures sometimes leads to
wrong Rows being built, where the fields of one file are used for the other.
> I was able to write sample code that reproduces this issue one out of three times; the
code snippet is available at the following link, together with some (very simple) data samples:
> https://gist.github.com/armisael/e08bb4567d0a11efe2db
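
The snippet below is a minimal sketch of the kind of code the report describes, not the actual
reproduction from the gist; the file names and fields are illustrative assumptions, and it uses
the Spark 1.3 SQLContext.jsonFile API matching the environment listed above:

    # Hypothetical sketch only; the actual reproduction code and data samples
    # live in the gist linked above. File names and fields are illustrative.
    from pyspark import SparkContext
    from pyspark.sql import SQLContext

    sc = SparkContext(appName="spark-6677-sketch")
    sqlContext = SQLContext(sc)

    # Read two JSON files whose objects have different structures
    # (Spark 1.3 API; later releases expose this as sqlContext.read.json).
    people = sqlContext.jsonFile("people.json")   # e.g. fields: name, age
    places = sqlContext.jsonFile("places.json")   # e.g. fields: city, country

    # According to the report, when this is run line by line in the pyspark
    # shell, the Rows sometimes come back with the wrong schema, i.e. with
    # the fields of one file attached to the rows of the other.
    print(people.first())
    print(places.first())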



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

