spark-dev mailing list archives

From Niranda Perera <niranda.per...@gmail.com>
Subject passing an AbstractFunction1 to sparkContext().runJob instead of a Closure
Date Fri, 09 Oct 2015 07:25:23 GMT
hi all,

I want to run a job through the Spark context. Since I am calling it from
the Java environment, I cannot pass a Scala closure to
sparkContext().runJob; instead, I am passing an AbstractFunction1
extension.
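For context, here is a minimal Spark-free sketch of the pattern (the interface and class names below are hypothetical stand-ins, not the actual WSO2 code). Since ClosureCleaner only cleans Scala closures, a named function class passed to runJob is skipped by the cleaner and serialized as-is, so the class itself must be java.io.Serializable with no non-serializable fields:

```java
import java.io.*;
import java.util.Iterator;

// Hypothetical stand-in for an AbstractFunction1<Iterator<T>, R> extension,
// analogous in shape to AnalyticsWritingFunction: a named, serializable
// function class with an apply() method over a partition iterator.
class CountingFunction implements Serializable {
    private static final long serialVersionUID = 1L;

    // Mirrors Function1.apply(Iterator<T>) as runJob would invoke it
    // once per partition.
    public int apply(Iterator<String> partition) {
        int count = 0;
        while (partition.hasNext()) {
            partition.next();
            count++;
        }
        return count;
    }
}

public class SerializableCheck {
    // Round-trip the function through Java serialization, which is what
    // Spark does when shipping it to executors. Because ClosureCleaner
    // skips non-closure classes, this serializability is the author's
    // own responsibility.
    static CountingFunction roundTrip(CountingFunction f) throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        ObjectOutputStream oos = new ObjectOutputStream(bos);
        oos.writeObject(f);
        oos.flush();
        ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()));
        return (CountingFunction) in.readObject();
    }

    public static void main(String[] args) throws Exception {
        CountingFunction f = roundTrip(new CountingFunction());
        Iterator<String> data =
                java.util.Arrays.asList("a", "b", "c").iterator();
        System.out.println(f.apply(data)); // prints 3
    }
}
```

If the round trip above fails with a NotSerializableException, the same failure would surface at job-submission time in Spark.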

While the jobs run without issue, I constantly get the following
WARN message:

TID: [-1234] [] [2015-10-06 04:39:43,387]  WARN
{org.apache.spark.util.ClosureCleaner} -  Expected a closure; got
org.wso2.carbon.analytics.spark.core.sources.AnalyticsWritingFunction
{org.apache.spark.util.ClosureCleaner}


I want to know: what are the implications of this approach?
Could this WARN cause issues in the functionality later on?

rgds
-- 
Niranda
@n1r44 <https://twitter.com/N1R44>
+94-71-554-8430
https://pythagoreanscript.wordpress.com/
