spark-reviews mailing list archives

From wzhfy <>
Subject [GitHub] spark pull request #17630: [SPARK-20318][SQL] Use Catalyst type for min/max ...
Date Fri, 14 Apr 2017 01:34:31 GMT
Github user wzhfy commented on a diff in the pull request:
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/Statistics.scala
    @@ -104,20 +106,38 @@ case class ColumnStat(
        * Returns a map from string to string that can be used to serialize the column stats.
        * The key is the name of the field (e.g. "distinctCount" or "min"), and the value is the string
    -   * representation for the value. The deserialization side is defined in [[ColumnStat.fromMap]].
    +   * representation for the value. min/max values are converted to the external data type. For
    +   * example, for DateType we store java.sql.Date, and for TimestampType we store
    +   * java.sql.Timestamp. The deserialization side is defined in [[ColumnStat.fromMap]].
        * As part of the protocol, the returned map always contains a key called "version".
        * In the case min/max values are null (None), they won't appear in the map.
    -  def toMap: Map[String, String] = {
    +  def toMap(name: String, dataType: DataType): Map[String, String] = {
    +    def toExternalString(v: Any, dataType: DataType): String = {
    +      val externalValue = dataType match {
    +        case BooleanType => v.toString.toBoolean
    +        case _: IntegralType => v.toString.toLong
    --- End diff ---
    Here we want to convert to external format.
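The conversion the diff is getting at can be illustrated with a small, self-contained Scala sketch. This is a hedged illustration, not Spark's actual implementation: Catalyst stores DateType internally as days since the epoch (an Int) and TimestampType as microseconds since the epoch (a Long), while the external types are java.sql.Date and java.sql.Timestamp. The `SimpleType` ADT and `toExternalString` helper below are illustrative stand-ins for Spark's `DataType` hierarchy and the helper in the diff, not Spark APIs.

```scala
import java.sql.{Date, Timestamp}

// Illustrative stand-in for Spark's DataType hierarchy (assumption, not Spark code).
sealed trait SimpleType
case object DateTy extends SimpleType
case object TimestampTy extends SimpleType
case object LongTy extends SimpleType

// Convert an internal (Catalyst-style) value to its external representation,
// then render it as a string for serialization into the stats map.
def toExternalString(v: Any, dataType: SimpleType): String = {
  val externalValue: Any = dataType match {
    // internal: days since epoch (Int) -> external: java.sql.Date (millis-based)
    case DateTy      => new Date(v.asInstanceOf[Int].toLong * 86400000L)
    // internal: microseconds since epoch (Long) -> external: java.sql.Timestamp
    case TimestampTy => new Timestamp(v.asInstanceOf[Long] / 1000L)
    // numeric types need no conversion, only string rendering
    case LongTy      => v.asInstanceOf[Long]
  }
  externalValue.toString
}
```

The point of converting before calling `toString` is that the external type controls the serialized format (e.g. `yyyy-MM-dd` for dates) rather than the raw internal integer leaking into the stats map.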
