spark-reviews mailing list archives

From GitBox <...@apache.org>
Subject [GitHub] [spark] yaooqinn commented on a change in pull request #27889: [SPARK-31131][SQL] Remove the unnecessary config spark.sql.legacy.timeParser.enabled
Date Thu, 12 Mar 2020 10:14:34 GMT
URL: https://github.com/apache/spark/pull/27889#discussion_r391519667
 
 

 ##########
 File path: docs/sql-migration-guide.md
 ##########
 @@ -70,7 +70,7 @@ license: |
 
 - Since Spark 3.0, the Proleptic Gregorian calendar is used in parsing, formatting, and converting
dates and timestamps, as well as in extracting sub-components such as years and days. Spark
3.0 uses Java 8 API classes from the java.time packages that are based on ISO chronology (https://docs.oracle.com/javase/8/docs/api/java/time/chrono/IsoChronology.html).
In Spark version 2.4 and earlier, those operations are performed using the hybrid calendar
(Julian + Gregorian, see https://docs.oracle.com/javase/7/docs/api/java/util/GregorianCalendar.html).
The changes impact the results for dates before October 15, 1582 (Gregorian) and affect
the following Spark 3.0 APIs:
 
-    - Parsing/formatting of timestamp/date strings. This affects CSV/JSON datasources
and the `unix_timestamp`, `date_format`, `to_unix_timestamp`, `from_unixtime`, `to_date`,
`to_timestamp` functions when user-specified patterns are used for parsing and formatting.
Since Spark 3.0, the conversions are based on `java.time.format.DateTimeFormatter`, see https://docs.oracle.com/javase/8/docs/api/java/time/format/DateTimeFormatter.html.
The new implementation performs strict checking of its input. For example, the `2015-07-22 10:00:00`
timestamp cannot be parsed with the `yyyy-MM-dd` pattern because the parser does not consume the whole
input. Another example: the `31/01/2015 00:00` input cannot be parsed by the `dd/MM/yyyy
hh:mm` pattern because `hh` expects hours in the range `1-12`. In Spark version 2.4 and earlier,
`java.text.SimpleDateFormat` is used for timestamp/date string conversions, and the supported
patterns are described in https://docs.oracle.com/javase/7/docs/api/java/text/SimpleDateFormat.html.
The old behavior can be restored by setting `spark.sql.legacy.timeParser.enabled` to `true`.
+    - Parsing/formatting of timestamp/date strings. This affects CSV/JSON datasources
and the `unix_timestamp`, `date_format`, `to_unix_timestamp`, `from_unixtime`, `to_date`,
`to_timestamp` functions when user-specified patterns are used for parsing and formatting.
Since Spark 3.0, the conversions are based on `java.time.format.DateTimeFormatter`, see https://docs.oracle.com/javase/8/docs/api/java/time/format/DateTimeFormatter.html.
The new implementation performs strict checking of its input. For example, the `2015-07-22 10:00:00`
timestamp cannot be parsed with the `yyyy-MM-dd` pattern because the parser does not consume the whole
input. Another example: the `31/01/2015 00:00` input cannot be parsed by the `dd/MM/yyyy
hh:mm` pattern because `hh` expects hours in the range `1-12`. In Spark version 2.4 and earlier,
`java.text.SimpleDateFormat` is used for timestamp/date string conversions, and the supported
patterns are described in https://docs.oracle.com/javase/7/docs/api/java/text/SimpleDateFormat.html.
The old behavior can be restored by setting `spark.sql.legacy.timeParserPolicy` to `LEGACY`.
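
The strict checking described in the hunk above can be sketched with plain `java.time` (a minimal standalone example, not Spark code; the class and method names here are illustrative):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeParseException;
import java.util.Optional;

public class StrictParsingDemo {
    private static final DateTimeFormatter FMT = DateTimeFormatter.ofPattern("yyyy-MM-dd");

    // Returns the parsed date, or empty when the pattern does not consume the whole input.
    static Optional<LocalDate> parseDate(String s) {
        try {
            return Optional.of(LocalDate.parse(s, FMT));
        } catch (DateTimeParseException e) {
            return Optional.empty();
        }
    }

    public static void main(String[] args) {
        // The input exactly matches the pattern, so it parses.
        System.out.println(parseDate("2015-07-22"));
        // The trailing " 10:00:00" is left unconsumed, so DateTimeFormatter rejects the input,
        // unlike the lenient SimpleDateFormat behavior of Spark 2.4 and earlier.
        System.out.println(parseDate("2015-07-22 10:00:00"));
    }
}
```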
 
 Review comment:
   I notice that the pattern we defined in `sql-ref-datetime-pattern.md` is slightly different
from the javadoc, e.g. we have `M` but the javadoc has `M/L` for month-of-year. Is this on purpose?

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org

