Github user Whoosh commented on a diff in the pull request:
https://github.com/apache/spark/pull/19553#discussion_r148451557
--- Diff: core/src/main/scala/org/apache/spark/api/java/JavaUtils.scala ---
@@ -43,10 +43,17 @@ private[spark] object JavaUtils {
override def size: Int = underlying.size
- override def get(key: AnyRef): B = try {
- underlying.getOrElse(key.asInstanceOf[A], null.asInstanceOf[B])
- } catch {
- case ex: ClassCastException => null.asInstanceOf[B]
+ // Delegate to implementation because AbstractMap implementation iterates over whole key set
+ override def containsKey(key: AnyRef): Boolean = {
+ underlying.contains(key.asInstanceOf[A])
--- End diff ---
@cloud-fan @srowen
So, let's decide what we want for containsKey() here first, before I make the next
commit :-)
The suggested (key.isInstanceOf[A] && underlying.contains(key.asInstanceOf[A]))
will not work: it causes a compile-time error with the "abstract type A is unchecked
since it is eliminated by erasure" message.
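To make the erasure point concrete, here is a minimal sketch (the class name and shape are illustrative, not the actual JavaUtils code) showing why the isInstanceOf[A] form is flagged, and what a ClassCastException-based delegation in the spirit of the removed get() could look like:

```scala
// Hedged sketch, not the actual Spark implementation: a simplified wrapper around a
// Scala Map that exposes a Java-style containsKey.
class MapWrapperSketch[A, B](underlying: scala.collection.Map[A, B]) {

  // The suggested form below is flagged by scalac with "abstract type A is unchecked
  // since it is eliminated by erasure": A has no runtime class, so isInstanceOf[A]
  // cannot actually be checked (and fails the build when warnings are treated as errors).
  //
  //   def containsKey(key: AnyRef): Boolean =
  //     key.isInstanceOf[A] && underlying.contains(key.asInstanceOf[A])

  // Delegating and catching ClassCastException, the same pattern the removed get()
  // used, avoids the unchecked test while still going through the underlying Scala
  // map instead of iterating over the whole key set as AbstractMap.containsKey does.
  def containsKey(key: AnyRef): Boolean = try {
    underlying.contains(key.asInstanceOf[A])
  } catch {
    case _: ClassCastException => false
  }
}
```

For a plain hash-based map the cast is erased and a wrong-typed key simply returns false; the catch mainly guards maps whose lookup ends up casting the key, e.g. a sorted map with a typed Ordering.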
---
---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org