spark-issues mailing list archives

From "Michael Armbrust (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (SPARK-11889) Type inference in REPL broken for GroupedDataset.agg
Date Fri, 20 Nov 2015 21:15:11 GMT

     [ https://issues.apache.org/jira/browse/SPARK-11889?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Michael Armbrust updated SPARK-11889:
-------------------------------------
    Component/s: SQL

> Type inference in REPL broken for GroupedDataset.agg
> ----------------------------------------------------
>
>                 Key: SPARK-11889
>                 URL: https://issues.apache.org/jira/browse/SPARK-11889
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: Michael Armbrust
>            Assignee: Michael Armbrust
>            Priority: Critical
>
> This works in compiled code, but fails in the REPL.
> {code}
> /** An `Aggregator` that adds up any numeric type returned by the given function. */
> class SumOf[I, N : Numeric](f: I => N) extends Aggregator[I, N, N] with Serializable {
>   val numeric = implicitly[Numeric[N]]
>   override def zero: N = numeric.zero
>   override def reduce(b: N, a: I): N = numeric.plus(b, f(a))
>   override def merge(b1: N, b2: N): N = numeric.plus(b1, b2)
>   override def finish(reduction: N): N = reduction
> }
> def sum[I, N : Numeric : Encoder](f: I => N): TypedColumn[I, N] = new SumOf(f).toColumn
> val ds = Seq((1, 1, 2L), (1, 2, 3L), (1, 3, 4L), (2, 1, 5L)).toDS()
> ds.groupBy(_._1).agg(count("*"), sum(_._2), sum(_._3)).collect()
> {code}
> {code}
> <console>:38: error: missing parameter type for expanded function ((x$2) => x$2._2)
>               ds.groupBy(_._1).agg(sum(_._2), sum(_._3)).collect()
> {code}
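
A possible workaround sketch in the REPL (not part of the original report, and assuming the (Int, Int, Long) tuple type of the example dataset): annotating the lambda parameter explicitly removes the need to infer the parameter type of the expanded function, using the same `sum` helper defined above.

{code}
// Hypothetical workaround: spell out the parameter type instead of relying on
// placeholder syntax, so the REPL does not have to infer it for the expanded function.
val ds = Seq((1, 1, 2L), (1, 2, 3L), (1, 3, 4L), (2, 1, 5L)).toDS()
ds.groupBy(_._1)
  .agg(count("*"),
       sum((x: (Int, Int, Long)) => x._2),
       sum((x: (Int, Int, Long)) => x._3))
  .collect()
{code}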



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

