Problem with Scala version mismatch in a Spark application

I was developing a Spark program on my machine and it worked fine. But when I tried to deploy it to the Spark instance running in my Hadoop sandbox, I started getting this error:

java.lang.NoSuchMethodError: scala.runtime.IntRef.create(I)Lscala/runtime/IntRef;
 at com.spnotes.enrich.CSVFieldEnricher.enrich(CSVFieldEnricher.scala:31)
 at com.spnotes.PMDriver$$anonfun$1$$anonfun$apply$2.apply(PMDriver.scala:59)
 at com.spnotes.PMDriver$$anonfun$1$$anonfun$apply$2.apply(PMDriver.scala:58)
 at scala.collection.immutable.List.foreach(List.scala:318)
 at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
 at scala.collection.mutable.ListBuffer.foreach(ListBuffer.scala:45)
 at com.spnotes.PMDriver$$anonfun$1.apply(PMDriver.scala:58)
 at com.spnotes.PMDriver$$anonfun$1.apply(PMDriver.scala:56)
 at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
 at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1469)
 at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1006)
 at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1006)
 at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1498)
 at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1498)
 at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
 at org.apache.spark.scheduler.Task.run(Task.scala:64)
 at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
 at java.lang.Thread.run(Thread.java:745)
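
The method in the top frame, scala.runtime.IntRef.create, is the clue: whenever a local var is captured by a closure, the compiler lifts it into a scala.runtime.IntRef, and scalac 2.11 initializes that IntRef through a create factory method that the 2.10 runtime library does not have (scalac 2.10 emits new IntRef(...) instead). Here is a minimal sketch of the kind of code that triggers this; it is hypothetical, not the actual CSVFieldEnricher code:

 // Hypothetical example, not the actual CSVFieldEnricher.enrich code.
 // The local var is lifted into a scala.runtime.IntRef because the closure
 // passed to foreach captures it; scalac 2.11 emits IntRef.create(0) for the
 // initialization, a factory that does not exist in the Scala 2.10 runtime.
 def countNonEmpty(fields: List[String]): Int = {
   var count = 0                  // lifted into an IntRef
   fields.foreach { f =>
     if (f.nonEmpty) count += 1   // mutates the lifted IntRef
   }
   count
 }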
So the problem is that the Scala version used to compile your code must match the Scala version Spark itself was built with. In my case I was compiling with Scala 2.11, while Spark 1.3.1 is built against Scala 2.10.4, so my classes called a 2.11-only method at runtime and failed with NoSuchMethodError. I changed the Scala version in my build file to 2.10.4, rebuilt the code, and redeployed it. That fixed the issue.
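
For reference, here is a minimal build.sbt sketch of the fix, assuming an sbt build (the coordinates are the standard Spark ones; your actual build file will contain more than this):

 // Pin the Scala version to the one Spark 1.3.1 is built against.
 scalaVersion := "2.10.4"

 // %% appends the Scala binary version (_2.10) to the artifact name, so the
 // dependency stays in sync with scalaVersion; "provided" keeps the Spark
 // jars out of the assembly because the cluster supplies them at runtime.
 libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1" % "provided"

Because %% derives the artifact suffix from scalaVersion, changing that single setting is enough to rebuild everything against the matching Scala runtime.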
