spark-issues mailing list archives

From "Wenchen Fan (Jira)" <j...@apache.org>
Subject [jira] [Updated] (SPARK-31408) Build Spark’s own Datetime patterns
Date Fri, 10 Apr 2020 08:00:23 GMT

     [ https://issues.apache.org/jira/browse/SPARK-31408?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]

Wenchen Fan updated SPARK-31408:
--------------------------------
    Description: 
This is an umbrella ticket for building Spark's own Datetime patterns and related work.

In Spark version 2.4 and earlier, datetime parsing and formatting are performed by the old
Java 7 `SimpleDateFormat` API. Since Spark 3.0, we have switched to the new Java 8 `DateTimeFormatter`
to use the Proleptic Gregorian calendar, which is required by the ISO and SQL standards.

However, some patterns are not compatible between the Java 8 and Java 7 APIs, and it is fragile
to rely on the JDK API to define Spark's behavior. We should build our own Datetime patterns,
which are compatible with Spark 2.4 (the old Java 7 `SimpleDateFormat` API).
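
For illustration only, here is a minimal, standalone Scala sketch (the object name
PatternMismatchDemo is ours, not part of Spark) of two such incompatibilities: the pattern
letter 'u' means day-of-week in `SimpleDateFormat` but year in `DateTimeFormatter`, and the
two APIs differ in parsing leniency.

import java.text.SimpleDateFormat
import java.time.LocalDate
import java.time.format.DateTimeFormatter
import java.util.Locale

object PatternMismatchDemo {
  def main(args: Array[String]): Unit = {
    // In the legacy Java 7 SimpleDateFormat API, the pattern letter 'u' means
    // "day number of week" (1 = Monday ... 7 = Sunday).
    val legacyDate = new SimpleDateFormat("yyyy-MM-dd", Locale.US).parse("1970-01-01")
    println(new SimpleDateFormat("u", Locale.US).format(legacyDate)) // "4" (a Thursday)

    // In the Java 8 DateTimeFormatter API, the same letter 'u' means "year".
    println(DateTimeFormatter.ofPattern("u", Locale.US).format(LocalDate.of(1970, 1, 1))) // "1970"

    // Parsing leniency differs as well: SimpleDateFormat accepts a single-digit
    // month/day for the pattern "yyyy-MM-dd", DateTimeFormatter does not.
    println(new SimpleDateFormat("yyyy-MM-dd", Locale.US).parse("2020-1-1")) // parses successfully
    // LocalDate.parse("2020-1-1", DateTimeFormatter.ofPattern("yyyy-MM-dd"))
    //   would throw java.time.format.DateTimeParseException
  }
}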

  was:
This is an umbrella ticket for building Spark's own Datetime patterns and related work.

In Spark version 2.4 and earlier, datetime parsing and formatting are performed by the old
Java 7 `SimpleDateFormat` API. Since Spark 3.0, we have switched to the new Java 8 `DateTimeFormatter`
to use the Proleptic Gregorian calendar, which is required by the ISO and SQL standards.

However, some patterns are not compatible between the Java 8 and Java 7 APIs, and it is fragile
to rely on the JDK API to define Spark's behavior. We should build our own Datetime patterns.

Spark 2.4 and earlier perform datetime parsing and formatting using the hybrid calendar (Julian + Gregorian).
Since the Proleptic Gregorian calendar is the de facto calendar worldwide, as well as the calendar
chosen by the ANSI SQL standard, Spark 3.0 switches to it by using the Java 8 API classes (the java.time
packages, which are based on ISO chronology). The switching work was completed in SPARK-26651.
But after the switch, some patterns are not compatible between Java 8 and Java 7, so
Spark needs its own definition of the patterns rather than depending on the Java API.
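
As a rough illustration of why the calendar switch is observable (a minimal sketch using only
JDK classes, not Spark code; the object name CalendarSwitchDemo is ours), the hybrid calendar
applies Julian leap-year rules before the 1582 cutover, so a date such as February 29, 1500 is
valid there but rejected by the Proleptic Gregorian calendar used by java.time:

import java.time.LocalDate
import java.util.{Calendar, GregorianCalendar, TimeZone}

object CalendarSwitchDemo {
  def main(args: Array[String]): Unit = {
    // Hybrid calendar (java.util.GregorianCalendar): Julian rules before 1582-10-15,
    // so the year 1500 is a leap year and Feb 29, 1500 is a valid date.
    val hybrid = new GregorianCalendar(TimeZone.getTimeZone("UTC"))
    hybrid.clear()
    hybrid.set(1500, Calendar.FEBRUARY, 29)
    println(hybrid.get(Calendar.DAY_OF_MONTH)) // 29 -- accepted

    // Proleptic Gregorian calendar (java.time): 1500 is not a leap year,
    // so the same date does not exist.
    println(LocalDate.of(1500, 2, 28).isLeapYear) // false
    // LocalDate.of(1500, 2, 29) would throw java.time.DateTimeException
  }
}

The same divergence exists for any date before the Gregorian cutover, which is why the switch
done in SPARK-26651 changes results for old dates.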


> Build Spark’s own Datetime patterns
> -----------------------------------
>
>                 Key: SPARK-31408
>                 URL: https://issues.apache.org/jira/browse/SPARK-31408
>             Project: Spark
>          Issue Type: Umbrella
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Yuanjian Li
>            Priority: Major
>
> This is an umbrella ticket for building Spark's own Datetime patterns and related work.
> In Spark version 2.4 and earlier, datetime parsing and formatting are performed by the
> old Java 7 `SimpleDateFormat` API. Since Spark 3.0, we have switched to the new Java 8 `DateTimeFormatter`
> to use the Proleptic Gregorian calendar, which is required by the ISO and SQL standards.
> However, some patterns are not compatible between the Java 8 and Java 7 APIs, and it is
> fragile to rely on the JDK API to define Spark's behavior. We should build our own Datetime
> patterns, which are compatible with Spark 2.4 (the old Java 7 `SimpleDateFormat` API).



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

