Setting different class loader
Steven Nunez
steve_nunez at yahoo.com
Wed Jul 22 09:52:45 UTC 2020
24 hours later and little progress. I have determined that moving the properties file into the ABCL project directory lets me get an InputStream on it from ABCL, but the application library still fails to load.
It (still) looks like a class loader issue. What I'd really like is a macro along the lines of:
(with-class-loader 'foo ...
which would quickly confirm or eliminate that hypothesis. Anyone know if one exists, or something similar?
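For concreteness, here is a minimal sketch of the kind of macro I have in mind. It is hypothetical (the name and argument list are my own), assumes ABCL's java:jstatic and java:jcall FFI, and is untested:

```lisp
;; Hypothetical sketch -- not an existing ABCL macro.
;; Run BODY with LOADER installed as the current thread's context
;; class loader, restoring the previous loader on the way out.
(defmacro with-class-loader ((loader) &body body)
  (let ((thread (gensym "THREAD"))
        (old    (gensym "OLD")))
    `(let* ((,thread (java:jstatic "currentThread" "java.lang.Thread"))
            (,old    (java:jcall "getContextClassLoader" ,thread)))
       (unwind-protect
            (progn
              (java:jcall "setContextClassLoader" ,thread ,loader)
              ,@body)
         ;; Always restore, even if BODY signals.
         (java:jcall "setContextClassLoader" ,thread ,old)))))
```

Wrapping the failing form in something like this would quickly show whether the context class loader is the culprit.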
Cheers, Steve
On Tuesday, July 21, 2020, 3:22:13 PM GMT+8, Steven Nunez <steve_nunez at yahoo.com> wrote:
Greetings all,
I have what I think is a problem with the ABCL class loader. I am working with a 'big data' library, Spark, but run into an issue on line 2 of the programming guide example. I am able to load the JARs from Maven with the ASDF system definition:
(asdf:defsystem #:spark
  :description "Wrapper for Spark 3.0"
  :serial t
  :defsystem-depends-on (abcl-asdf)
  :depends-on (#:jss #:javaparser)
  :components ((:mvn "org.apache.spark/spark-core_2.12" :version "3.0.0")
               (:file "package")
               (:file "spark")))
and can create a SparkConf object:
(defvar *spark-conf*
  #1"new SparkConf()
       .setAppName("abcl-app")
       .setMaster("local")" )
But when I try to create a 'context'
(defvar *sc* (new 'JavaSparkContext *spark-conf*))
I get an error in the initialisation:
Java exception 'java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.package$'.
There isn't much on this except from the Jenkins guys, who have attempted to put Spark and Spark applications into a CI system. They seem to think it's related to a call in the package class that fetches a properties file, and in a StackOverflow discussion they suggested that "you should make sure that you set the classloader that Spark was loaded through using the Thread.currentThread().setContextClassLoader(myCustomLoader) call".
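Translated into ABCL terms, that suggestion would look something like the following sketch. It assumes the loader that actually loaded Spark can be recovered from one of its classes via getClassLoader; untested:

```lisp
;; Sketch: install the class loader that loaded Spark as the current
;; thread's context class loader, per the StackOverflow suggestion.
(let* ((spark-class  (java:jclass "org.apache.spark.SparkConf"))
       (spark-loader (java:jcall "getClassLoader" spark-class))
       (thread       (java:jstatic "currentThread" "java.lang.Thread")))
  (java:jcall "setContextClassLoader" thread spark-loader))
```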
I've verified with (java:dump-classpath) that the JAR is on the ABCL classpath, and the JAR file does contain the spark-version-info.properties file. I've also tried getting the file myself with:
(defvar rs
  #1"Thread.currentThread()
       .getContextClassLoader()
       .getResourceAsStream("spark-version-info.properties")" )
which returns nil, so their theory may be correct.
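As a cross-check, the same lookup can be attempted through the loader that loaded Spark itself rather than the thread's context class loader (again an untested sketch using ABCL's Java FFI):

```lisp
;; Diagnostic sketch: ask the loader that loaded SparkConf for the same
;; resource.  If this returns a stream while the context class loader
;; returns nil, the two loaders differ, which would confirm the theory.
(defvar *rs-via-spark-loader*
  (java:jcall "getResourceAsStream"
              (java:jcall "getClassLoader"
                          (java:jclass "org.apache.spark.SparkConf"))
              "spark-version-info.properties"))
```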
Messing around with class loaders is a bit beyond my 20-year-old Java knowledge, so I thought I'd ask here if anyone has ideas on how to load Spark in a way that uses the default Java class loader. Alternatively, it occurs to me to ask why the ABCL class loader can't find the properties file when the JAR is on the classpath, and then to correct whatever that problem is.
Cheers, Steve