Scala - Task Not Serializable exception when using IgniteRDD


What is wrong with this code? I cannot get rid of the Task Not Serializable exception.

    @throws(classOf[Exception])
    override def setup(cfg: BenchmarkConfiguration) {
      super.setup(cfg)
      sc = new SparkContext("local[4]", "BenchmarkTest")
      sqlContext = new HiveContext(sc)
      ic = new IgniteContext[RDDKey, RDDVal](sc,
        () ⇒ configuration("client", client = true))
      icCache = ic.fromCache(PARTITIONED_CACHE_NAME)
      icCache.savePairs(
        sc.parallelize({
          (0 until 1000).map { n => (n.toLong, s"value key $n") }
        }, 10)) // error happens here: "line 89"
      println(icCache.collect)
    }

Here is the stack trace:

    <20:47:45><yardstick> Failed to start benchmark server (will stop and exit).
    org.apache.spark.SparkException: Task not serializable
        at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:166)
        at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:158)
        at org.apache.spark.SparkContext.clean(SparkContext.scala:1623)
        at org.apache.spark.rdd.RDD.foreachPartition(RDD.scala:805)
        at org.apache.ignite.spark.IgniteRDD.savePairs(IgniteRDD.scala:170)
        at org.yardstickframework.spark.SparkAbstractBenchmark.setup(SparkAbstractBenchmark.scala:89)
        at org.yardstickframework.spark.SparkCoreRDDBenchmark.setup(SparkCoreRDDBenchmark.scala:18)
        at org.yardstickframework.spark.SparkCoreRDDBenchmark$.main(SparkCoreRDDBenchmark.scala:72)
        at org.yardstickframework.spark.SparkNode.start(SparkNode.scala:28)
        at org.yardstickframework.BenchmarkServerStartup.main(BenchmarkServerStartup.java:61)
    Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.serializer.SerializationDebugger$ObjectStreamClassMethods$.getObjFieldValues$extension(SerializationDebugger.scala:240)
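To see the underlying failure mode without Spark at all, here is a minimal, self-contained sketch: Spark's ClosureCleaner ultimately requires every task closure to survive plain Java serialization, so a closure that captures any non-serializable value fails exactly like the trace above. All names here (`NonSerializableResource`, `ClosureCaptureDemo`, `canSerialize`) are hypothetical, invented for the demo.

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

// Stand-in for a non-serializable value, such as a SparkContext or
// HiveContext held by an enclosing benchmark class.
class NonSerializableResource

object ClosureCaptureDemo {
  // Returns true if `obj` survives Java serialization, which is what
  // Spark demands of every task closure before shipping it to executors.
  def canSerialize(obj: AnyRef): Boolean =
    try {
      new ObjectOutputStream(new ByteArrayOutputStream()).writeObject(obj)
      true
    } catch {
      case _: NotSerializableException => false
    }

  // Captures a local non-serializable value, so serializing the closure
  // fails -- the same failure Spark reports as "Task not serializable".
  def makeBadClosure(): Int => String = {
    val resource = new NonSerializableResource
    n => s"value ${resource.hashCode} $n"
  }

  // Captures nothing, so it serializes cleanly.
  val goodClosure: Int => String = n => s"value key $n"

  def main(args: Array[String]): Unit = {
    println(canSerialize(makeBadClosure())) // false
    println(canSerialize(goodClosure))      // true
  }
}
```

The `(n.toLong, s"value key $n")` closure in the question captures nothing, like `goodClosure` here, which is why a version mismatch (see the answer below) is a more likely culprit than the closure body itself.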

It looks like your code was compiled against a different Scala version than the one the Ignite or Spark modules were compiled with. I got similar exceptions while testing when code compiled against Scala 2.10 was run on a Spark built for Scala 2.11, or vice versa. The module com.databricks:spark-csv_2.10:1.1.0 might be the reason for this.
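One way to avoid mixing Scala binary versions is to pin a single `scalaVersion` in sbt and use `%%`, which appends the matching `_2.1x` artifact suffix automatically. A sketch, with assumed version numbers (only spark-csv 1.1.0 comes from the answer above):

```scala
// Hypothetical build.sbt fragment: one Scala version for everything.
scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  // `%%` resolves the _2.10 suffix from scalaVersion, so every module
  // is built for the same Scala binary version.
  "org.apache.spark" %% "spark-core" % "1.3.1",  // assumed Spark version
  "com.databricks"   %% "spark-csv"  % "1.1.0"
)
```

If any dependency is declared with a single `%` and a hard-coded `_2.11` suffix while the rest use `_2.10`, closures can fail to deserialize in exactly this opaque way.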

