Problem with Scala version mismatch in Spark application

I was developing a Spark program on my machine and it worked fine. But when I tried to deploy it to the Spark instance running in my Hadoop sandbox, I started getting this error:

java.lang.NoSuchMethodError: scala.runtime.IntRef.create(I)Lscala/runtime/IntRef;
 at com.spnotes.enrich.CSVFieldEnricher.enrich(CSVFieldEnricher.scala:31)
 at com.spnotes.PMDriver$$anonfun$1$$anonfun$apply$2.apply(PMDriver.scala:59)
 at com.spnotes.PMDriver$$anonfun$1$$anonfun$apply$2.apply(PMDriver.scala:58)
 at scala.collection.immutable.List.foreach(List.scala:318)
 at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
 at scala.collection.mutable.ListBuffer.foreach(ListBuffer.scala:45)
 at com.spnotes.PMDriver$$anonfun$1.apply(PMDriver.scala:58)
 at com.spnotes.PMDriver$$anonfun$1.apply(PMDriver.scala:56)
 at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
 at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1469)
 at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1006)
 at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1006)
 at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1498)
 at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1498)
 at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
 at org.apache.spark.scheduler.Task.run(Task.scala:64)
 at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
 at java.lang.Thread.run(Thread.java:745)
16/01/05 13:03:53 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[Executor task launch worker-0,5,main]
java.lang.NoSuchMethodError: scala.runtime.IntRef.create(I)Lscala/runtime/IntRef;
 ... (same stack trace as above)
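
The NoSuchMethodError above is the classic symptom of a Scala binary mismatch. A quick way to confirm which Scala version the Spark runtime on the cluster was built with is to start spark-shell there and check (a sketch; the exact output depends on your installation, but on this sandbox it should look like this):

 scala> scala.util.Properties.versionString   // Scala runtime that Spark ships with
 res0: String = version 2.10.4

 scala> sc.version                            // Spark version
 res1: String = 1.3.1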
So the problem is that you need to compile your code with the same Scala version as the one your Spark build uses. A NoSuchMethodError like this means the class loaded at runtime is not the one the code was compiled against: scala.runtime.IntRef.create() exists in the Scala 2.11 runtime library but not in 2.10, so code compiled with Scala 2.11 fails on Spark's 2.10 runtime. In my case I was using Scala 2.11 to compile my code, while Spark 1.3.1 is built against Scala 2.10.4. So I changed the build file, rebuilt the code, and deployed it. That fixed the issue.
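
For reference, the fix looks roughly like this in an sbt build (a sketch; the project name is a placeholder, and the key point is that scalaVersion matches Spark's Scala version so that %% resolves the _2.10 artifacts):

 // build.sbt -- compile against the same Scala version Spark 1.3.1 was built with
 name := "spark-enrich-example"   // placeholder project name

 scalaVersion := "2.10.4"         // must match the Scala version of the Spark runtime

 // %% appends the Scala binary version, so this resolves to spark-core_2.10
 libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1" % "provided"

If you build with Maven instead, the same idea applies: the artifactId has to be spark-core_2.10 rather than spark-core_2.11.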
