Hi,
I have a scenario where I have configured a Spark notebook with a dependent jar:

```
%%configure -f
{
  "conf": {
    "spark.jars.packages": "<<custom package>>",
    "spark.jars.repositories": "<<repo URL>>"
  }
}
```
The included jar contains a function that resolves a class's `Type` from its name using Scala runtime reflection. For example, given the `Person` case class:

```scala
case class Person(name: String)
```

```scala
object Helper {
  import scala.reflect.runtime.{universe => ru}

  def getTypeFromStringClassName(name: String): ru.Type = {
    val classInstance: Class[_] = Class.forName(name)
    val mirror: ru.Mirror = ru.runtimeMirror(getClass.getClassLoader)
    val classSymbol: ru.ClassSymbol = mirror.classSymbol(classInstance)
    classSymbol.selfType
  }
}
```
When I try to invoke this function from a Jupyter cell:

```scala
Helper.getTypeFromStringClassName("Person")
```

it throws:

```
java.lang.NoClassDefFoundError: Person (wrong name: Person)
```
If instead I include the `Person` case class in the jar itself and reference it via `spark.jars.packages`, it works fine. It appears the Spark executor cannot resolve the notebook-defined class from the jar's class loader.
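For context, a sketch of the same lookup with the class loader made an explicit parameter (the `HelperSketch` name and the default of `Thread.currentThread().getContextClassLoader` are assumptions, not part of the original jar). This makes it possible to pass in whichever loader actually compiled the class, rather than the loader that loaded the jar:

```scala
import scala.reflect.runtime.{universe => ru}

object HelperSketch {
  // Sketch: same lookup as Helper.getTypeFromStringClassName, but the
  // class loader is a parameter instead of being fixed to getClass.getClassLoader.
  def getTypeFromStringClassName(
      name: String,
      loader: ClassLoader = Thread.currentThread().getContextClassLoader
  ): ru.Type = {
    // Class.forName(name, initialize, loader) resolves against the given loader.
    val classInstance: Class[_] = Class.forName(name, false, loader)
    // Build the reflection mirror from the same loader so symbols resolve consistently.
    val mirror: ru.Mirror = ru.runtimeMirror(loader)
    mirror.classSymbol(classInstance).selfType
  }
}
```

Classes defined in a notebook cell are typically compiled by the REPL's class loader, which is a child of the loader that loaded the jar, so a `Class.forName` call inside the jar cannot see them unless that loader is passed in.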