From evenson at panix.com  Wed Jul  1 11:41:07 2020
From: evenson at panix.com (Mark Evenson)
Date: Wed, 1 Jul 2020 13:41:07 +0200
Subject: run-program issues with OpenJDK 11
In-Reply-To: <878sg6hjh6.fsf@rocinante.timmons.dev>
References: <878sg6hjh6.fsf@rocinante.timmons.dev>
Message-ID: <40DB9263-896F-4593-A60F-F20A2E9EA498@panix.com>

> On Jun 29, 2020, at 01:13, Eric Timmons wrote:
>
> It appears that java.lang.UNIXProcess was removed in OpenJDK 9 or
> thereabouts, assuming I'm interpreting
> https://github.com/openjdk/jdk/commit/aa6b19f38ee4dc69cac664d1211bff0e69b31f4c
> correctly (/definitely/ not a Java developer here...).
>
> This seems to make run-program unusable when using prebuilt jars on
> recent JREs. Using the prebuilt 1.7.0 release and OpenJDK 11, I get the
> errors shown in prebuilt-openjdk11.txt (attached).

Hmmm. I can’t seem to confirm under abcl-bin-1.7.0/openjdk11/linux.

CL-USER> (sys:run-program "ls" nil)
#S(SYSTEM:PROCESS :JPROCESS #

> to get the PID of a process gives the error in from-source-openjdk11.txt
> (attached).
> […]

The (SWANK-BACKEND:GETPID) thunks down to the implementation where
available, as is the case in abcl-1.4.0 (??) onwards.

[…]
> Armed Bear Common Lisp 1.7.1-dev
> Java 11.0.7 Oracle Corporation
> OpenJDK 64-Bit Server VM
> Low-level initialization completed in 0.205 seconds.
> Startup completed in 1.224 seconds.
> Type ":help" for a list of available commands.
> CL-USER(1): (sys:run-program "ls" nil :wait nil)
> #S(SYSTEM:PROCESS :JPROCESS # :%INPUT #S(SYSTEM::SYSTEM-STREAM) :%OUTPUT #S(SYSTEM::SYSTEM-STREAM) :%ERROR #S(SYSTEM::SYSTEM-STREAM))
> CL-USER(2): (sys:process-pid *)
> #: Debugger invoked on condition of type ERROR
> Class not found: java.lang.UNIXProcess
[…]

What are the values returned for LISP-IMPLEMENTATION-VERSION where you’re
encountering the error?

You aren’t trying to get a java.lang.UNIXProcess under Windows by any chance?

--
"A screaming comes across the sky.  It has happened before but there is
nothing to compare to it now."
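For context on the API change at issue in this thread: JDK 9 removed the
internal java.lang.UNIXProcess class that ABCL 1.7.0 reflects on, but the
public java.lang.Process class gained a pid() method in the same release. A
minimal sketch of fetching a child process's PID directly through ABCL's
JAVA package, bypassing SYS:PROCESS-PID entirely (this is not how ABCL
implements it; Runtime.exec and Process.pid are standard JDK calls, the
latter available on JDK 9 and later only):

(let ((process (java:jcall "exec"
                           (java:jstatic "getRuntime" "java.lang.Runtime")
                           "ls")))
  ;; Process.pid() returns the native PID as a long on JDK 9+;
  ;; on JDK 8 this jcall signals an error, since the method doesn't exist.
  (java:jcall "pid" process))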
From etimmons at mit.edu  Wed Jul  1 17:27:13 2020
From: etimmons at mit.edu (Eric Timmons)
Date: Wed, 01 Jul 2020 13:27:13 -0400
Subject: run-program issues with OpenJDK 11
In-Reply-To: <40DB9263-896F-4593-A60F-F20A2E9EA498@panix.com>
References: <878sg6hjh6.fsf@rocinante.timmons.dev> <40DB9263-896F-4593-A60F-F20A2E9EA498@panix.com>
Message-ID: <87h7urw3gu.fsf@rocinante.timmons.dev>

Mark Evenson writes:

> Hmmm. I can’t seem to confirm under abcl-bin-1.7.0/openjdk11/linux.

I'll be honest, I thought I was going crazy at first! It seemed weird that a
change like this would break things between versions. I'd love it if there's
some other explanation!

[...]

> What are the values returned for LISP-IMPLEMENTATION-VERSION where you’re
> encountering the error?
>
> You aren’t trying to get a java.lang.UNIXProcess under Windows by any chance?

Nope, I'm running on Gentoo with kernel 5.4.45 and in Docker containers
running on that computer.

I've attached a tarball containing the Docker build context and some scripts
I used to test this. The build-images script will build Docker images using
the ABCL 1.7.0 jar running with Buster's default JRE (11), Stretch's default
JRE (8), the JRE in the openjdk:11-buster image, and the JRE in the
openjdk:8-buster image. The test-run-program script will try to start a
child process using each of the images.

The first value of LISP-IMPLEMENTATION-VERSION is always "1.7.0" and the
third is always "amd64-Linux-5.4.45-gentoo".

I get the errors when the second value is one of:

+ "OpenJDK_64-Bit_Server_VM-Debian-11.0.7+10-post-Debian-3deb10u1" (docker label debian-buster)
+ "OpenJDK_64-Bit_Server_VM-Oracle_Corporation-11.0.7+10" (docker label opendjk-11-buster)
+ "OpenJDK_64-Bit_Server_VM-AdoptOpenJDK-11.0.7+10" (native)

I don't get errors when the second value is one of:

+ "OpenJDK_64-Bit_Server_VM-Oracle_Corporation-1.8.0_252-8u252-b09-1~deb9u1-b09" (docker label debian-stretch)
+ "OpenJDK_64-Bit_Server_VM-Oracle_Corporation-1.8.0_252-b09" (docker label openjdk-8-buster)
+ "OpenJDK_64-Bit_Server_VM-AdoptOpenJDK-1.8.0_252-b09" (native)

I've also attached the full output of running the test-run-program script
from the tarball (after build-images has already been run).

-Eric

-------------- next part --------------
A non-text attachment was scrubbed...
Name: abcl-run-program-test.tar.gz
Type: application/octet-stream
Size: 1095 bytes
Desc: not available
URL: 
-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: output-of-test-run-program.txt
URL: 

From alanruttenberg at gmail.com  Thu Jul 16 02:30:25 2020
From: alanruttenberg at gmail.com (Alan Ruttenberg)
Date: Wed, 15 Jul 2020 22:30:25 -0400
Subject: Create instance of generic class at runtime, supplying specializer?
Message-ID: 

For instance in

http://owlcs.github.io/owlapi/apidocs_4/org/semanticweb/owlapi/util/InferredObjectPropertyAxiomGenerator.html

It says:

Type Parameters: A - the axiom type

Question is: how do I supply the axiom type when creating an instance?

Thanks
Alan
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From alessiostalla at gmail.com  Thu Jul 16 04:23:23 2020
From: alessiostalla at gmail.com (Alessio Stalla)
Date: Thu, 16 Jul 2020 06:23:23 +0200
Subject: Create instance of generic class at runtime, supplying specializer?
In-Reply-To: 
References: 
Message-ID: 

You can't. The specializer exists only in the Java compiler; it's erased at
runtime.

On Thu, Jul 16, 2020, 04:31 Alan Ruttenberg wrote:

> For instance in
>
> http://owlcs.github.io/owlapi/apidocs_4/org/semanticweb/owlapi/util/InferredObjectPropertyAxiomGenerator.html
>
> It says:
> Type Parameters: A - the axiom type
> Question is: how do I supply the axiom type when creating an instance?
>
> Thanks
> Alan
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
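The erasure Alessio describes is easy to observe from the ABCL REPL:
whatever type parameter a generic class was declared or instantiated with in
Java source, the runtime object carries no trace of it. A quick check, using
only standard reflection calls through the JAVA package:

(java:jcall "getName"
            (java:jcall "getClass" (java:jnew "java.util.ArrayList")))
;; => "java.util.ArrayList"  -- the element type never exists at runtime

The type parameter only becomes reflectively visible when a subclass fixes
it, via Class.getGenericSuperclass(); that is the case the next few messages
turn on.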
From evenson at panix.com  Sat Jul 18 15:56:20 2020
From: evenson at panix.com (Mark Evenson)
Date: Sat, 18 Jul 2020 17:56:20 +0200
Subject: abcl-1.7.1 released
Message-ID: <3416DC05-382C-48BD-8C55-E7966963D710@panix.com>

With gentle prodding, we have released ABCL 1.7.1, a decidedly minor release
correcting a few bugs resulting from the overhaul of arrays specialized on
unsigned byte types.

The brief list of CHANGES is available for your perusal.

"A screaming comes across the sky.  It has happened before but there is
nothing to compare to it now."

From evenson at panix.com  Mon Jul 20 16:48:20 2020
From: evenson at panix.com (Mark Evenson)
Date: Mon, 20 Jul 2020 18:48:20 +0200
Subject: Create instance of generic class at runtime, supplying specializer?
In-Reply-To: 
References: 
Message-ID: <6159F417-7FF4-45B1-BBAD-0E13EF087927@panix.com>

> On Jul 16, 2020, at 04:30, Alan Ruttenberg wrote:
>
> For instance in
>
> http://owlcs.github.io/owlapi/apidocs_4/org/semanticweb/owlapi/util/InferredObjectPropertyAxiomGenerator.html
>
> It says:
> Type Parameters:
> A - the axiom type
> Question is: how do I supply the axiom type when creating an instance?

Can you point me to some code that does the setup?  Somewhere in lsw2?

Alessio is correct that the compiler erases types, but I have successfully
created instances that switch such methods over the years working on
reasoners, so I should be able to get this to work for you.  Unfortunately
the margin is too small to contain a proof of this statement without further
annotation…

--
"A screaming comes across the sky.  It has happened before but there is
nothing to compare to it now."

From alessiostalla at gmail.com  Tue Jul 21 07:04:55 2020
From: alessiostalla at gmail.com (Alessio Stalla)
Date: Tue, 21 Jul 2020 09:04:55 +0200
Subject: Create instance of generic class at runtime, supplying specializer?
In-Reply-To: <6159F417-7FF4-45B1-BBAD-0E13EF087927@panix.com>
References: <6159F417-7FF4-45B1-BBAD-0E13EF087927@panix.com>
Message-ID: 

Oh, the class is abstract, so you're meant to provide A by subclassing it,
not by instantiating it. In that case, the value of A *is* retained at
runtime and is available for reflection, even if method signatures are still
erased to Object.

In ABCL there used to be some runtime-class machinery that was just a
sketch... I don't know if it's still there and has enough plumbing to allow
you to subclass a generic class providing type specifiers.

On Mon, 20 Jul 2020 at 18:49, Mark Evenson wrote:
> […]
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
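A note on what is well supported from Lisp: while subclassing an arbitrary
(possibly generic) Java class at runtime runs into the sketchy runtime-class
machinery mentioned above, implementing a Java interface from Lisp is a
documented ABCL facility, JAVA:JINTERFACE-IMPLEMENTATION. A minimal sketch;
java.lang.Runnable stands in here for any single-method interface:

(let ((runnable (java:jinterface-implementation
                 "java.lang.Runnable"
                 "run" (lambda () (print "running on the JVM")))))
  ;; The returned proxy can be passed wherever a Runnable is expected.
  (java:jcall "run" runnable))

This doesn't answer the abstract-class case Alan is after, but it is often
the escape hatch when a library accepts an interface instead.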
From steve_nunez at yahoo.com  Tue Jul 21 07:21:19 2020
From: steve_nunez at yahoo.com (Steven Nunez)
Date: Tue, 21 Jul 2020 07:21:19 +0000 (UTC)
Subject: Setting different class loader
References: <383327080.4743772.1595316079768.ref@mail.yahoo.com>
Message-ID: <383327080.4743772.1595316079768@mail.yahoo.com>

Greetings all,

I have what I think is a problem with the ABCL class loader. I am working
with a 'big data' library, Spark, but run into an issue on line 2 of the
programming guide example.

I am able to load the JARs from Maven with the ASDF system definition:

(asdf:defsystem #:spark
  :description "Wrapper for Spark 3.0"
  :serial t
  :defsystem-depends-on (abcl-asdf)
  :depends-on (#:jss #:javaparser)
  :components ((:mvn "org.apache.spark/spark-core_2.12" :version "3.0.0")
               (:file "package")
               (:file "spark")))

and can create a SparkConf object:

(defvar *spark-conf*
  #1"new SparkConf()
    .setAppName("abcl-app")
    .setMaster("local")" )

But when I try to create a 'context'

(defvar *sc* (new 'JavaSparkContext *spark-conf*))

I get an error in the initialisation: Java exception
'java.lang.NoClassDefFoundError: Could not initialize class
org.apache.spark.package$'.

There isn't much on this except from the Jenkins guys, who have attempted to
put Spark and Spark applications into a CI system. They seem to think that
it's related to a call to get a properties file in the package class and on
a StackOverflow discussion suggested that "you should make sure that you set
the classloader that Spark was loaded through using the
Thread.currentThread().setContextClassLoader(myCustomLoader) call".

I've verified with (java:dump-classpath) that the JAR is on the ABCL
classpath, and the JAR file does contain the spark-version-info.properties
file. I've also tried getting the file myself with:

(defvar rs
  #1"Thread.currentThread()
    .getContextClassLoader()
    .getResourceAsStream("spark-version-info.properties")" )

which returns nil, so their theory may be correct.

Messing around with class loaders is a bit beyond my 20 year old Java
knowledge, so I thought I'd ask here if anyone has any ideas on how I can
load Spark in a way that uses the default Java class loader. Alternatively,
it occurs to me to ask why the ABCL class loader isn't able to find the
properties file if the JAR is on the classpath, and then to correct whatever
that problem is.

Cheers,
    Steve
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From alanruttenberg at gmail.com  Tue Jul 21 15:04:54 2020
From: alanruttenberg at gmail.com (Alan Ruttenberg)
Date: Tue, 21 Jul 2020 15:04:54 +0000
Subject: Create instance of generic class at runtime, supplying specializer?
In-Reply-To: 
References: <6159F417-7FF4-45B1-BBAD-0E13EF087927@panix.com>
Message-ID: 

So I need to subclass first, and then instantiate that new class? Can I do
that with the provided support? I've used jnew-runtime-class:

jnew-runtime-class (class-name super-name interfaces constructors methods fields &optional filename)

Not sure how I would provide the specializing class(es) here. The type
information may be "erased" but methods can still differ by specialization,
right?

For JSS syntax I would probably just accept a list class argument to new.

(new '(InferredObjectPropertyAxiomGenerator OWLObjectSubPropertyAxiom) ...)

Presumably keep a cache of the created subclasses indexed by that list.

It would be nice to be able to find by reflection that
InferredObjectPropertyAxiomGenerator is a generic and how many class
arguments there are.

Alan

On Tue, Jul 21, 2020 at 7:06 AM Alessio Stalla wrote:
> […]
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From alessiostalla at gmail.com  Tue Jul 21 17:01:12 2020
From: alessiostalla at gmail.com (Alessio Stalla)
Date: Tue, 21 Jul 2020 19:01:12 +0200
Subject: Create instance of generic class at runtime, supplying specializer?
In-Reply-To: 
References: <6159F417-7FF4-45B1-BBAD-0E13EF087927@panix.com>
Message-ID: 

On Tue, 21 Jul 2020 at 17:05, Alan Ruttenberg wrote:

> So I need to subclass first, and then instantiate that new class?

The class you referred to is abstract, so yes, you need a subclass to
instantiate it, generics or not.

> Can I do that with the provided support? I've used jnew-runtime-class
>
> jnew-runtime-class (class-name super-name interfaces constructors methods
> fields &optional filename)
>
> Not sure how I would provide the specializing class(es) here. The type
> information may be "erased" but methods can still differ by
> specialization, right?

If there's no specializers argument, then I'm afraid you cannot provide
them. However, the type specializer is only useful if the class or framework
does some advanced reflection magic at runtime to read it [1]. You don't
need it in general in Lisp or dynamic Java (with reflection): the type being
erased means that methods all take/return Object (or the most specific
supertype available, in general) and do no runtime checks. If you had
written your class in Java, then the Java compiler would have inserted
synthetic methods with the more specific signature, and it would have
inserted type casts at call sites. But this is just an aid towards type
safety.

Java generics are designed so as to be compatible with the pre-generics
world: you can happily omit them and the compiler will only issue some type
safety warnings. Unless, as I said, the library or framework does some fancy
reflection thing to inspect the generic type parameters of subclasses, and
doesn't account for users not specifying such parameters.

[1] or if the class is to be used by a tool doing static analysis, e.g. an
IDE or the Java compiler, and you want to restrict its usage to only the
provided type, but I guess it's not your case.

> For JSS syntax I would probably just accept a list class argument to new.
>
> (new '(InferredObjectPropertyAxiomGenerator OWLObjectSubPropertyAxiom) ...)

You don't supply those parameters with the constructor at all.
When you write:

List<Foo> foos = new ArrayList<Foo>();

the information about Foo only exists in the compiler's head. There's no
trace of it whatsoever at runtime. That's why it's called erasure :)

> Presumably keep a cache of the created subclasses indexed by that list.
>
> It would be nice to be able to find by reflection that
> InferredObjectPropertyAxiomGenerator is a generic and how many class
> arguments there are.
>
> Alan
>
> On Tue, Jul 21, 2020 at 7:06 AM Alessio Stalla wrote:
>> […]
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From steve_nunez at yahoo.com  Wed Jul 22 09:52:45 2020
From: steve_nunez at yahoo.com (Steven Nunez)
Date: Wed, 22 Jul 2020 09:52:45 +0000 (UTC)
Subject: Setting different class loader
In-Reply-To: <383327080.4743772.1595316079768@mail.yahoo.com>
References: <383327080.4743772.1595316079768.ref@mail.yahoo.com> <383327080.4743772.1595316079768@mail.yahoo.com>
Message-ID: <930141014.5376866.1595411565945@mail.yahoo.com>

24 hours later and little progress. I have determined that moving the
properties file into the ABCL project directory enables me to get an
InputStream on it from ABCL, but the application library still fails to
load. It (still) looks like a class loader issue.

What I'd really like is a macro along the lines of:

(with-class-loader 'foo
  ...

which would quickly confirm or eliminate that hypothesis. Anyone know if one
exists, or something similar?

Cheers,
    Steve

On Tuesday, July 21, 2020, 3:22:13 PM GMT+8, Steven Nunez wrote:
[…]
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
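No such macro ships with ABCL, but one is straightforward to sketch on top
of the JAVA package. The name and argument convention below are made up; the
loader handling just mirrors what the StackOverflow advice quoted above
suggests (install the context loader, then restore it):

(defmacro with-class-loader ((loader) &body body)
  "Evaluate BODY with LOADER installed as the current thread's context
class loader, restoring the previous loader on exit."
  (let ((thread (gensym "THREAD"))
        (old (gensym "OLD")))
    `(let* ((,thread (java:jstatic "currentThread" "java.lang.Thread"))
            (,old (java:jcall "getContextClassLoader" ,thread)))
       (java:jcall "setContextClassLoader" ,thread ,loader)
       (unwind-protect (progn ,@body)
         ;; Always restore, even if BODY signals.
         (java:jcall "setContextClassLoader" ,thread ,old)))))

;; e.g.: (with-class-loader ((java:get-current-classloader))
;;         (jss:new 'JavaSparkContext *spark-conf*))

Olof's reply below shows the same moves done by hand at the REPL.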
From olof at macrolet.net  Wed Jul 22 10:58:41 2020
From: olof at macrolet.net (Olof-Joachim Frahm)
Date: Wed, 22 Jul 2020 12:58:41 +0200
Subject: Setting different class loader
In-Reply-To: <930141014.5376866.1595411565945@mail.yahoo.com>
References: <383327080.4743772.1595316079768.ref@mail.yahoo.com> <383327080.4743772.1595316079768@mail.yahoo.com> <930141014.5376866.1595411565945@mail.yahoo.com>
Message-ID: <20200722105841.GB11069@v22012114971199.netcup.net>

On Wed, Jul 22, 2020 at 09:52:45AM +0000, Steven Nunez wrote:
> I've verified with (java:dump-classpath) that the JAR is on the ABCL
> classpath, and the JAR file does contain the
> spark-version-info.properties file. I've also tried getting the file
> myself with:
> (defvar rs
>   #1"Thread.currentThread()
>     .getContextClassLoader()
>     .getResourceAsStream("spark-version-info.properties")" )
> which returns nil, so their theory may be correct.
> Messing around with class loaders is a bit beyond my 20 year old Java knowledge [...]
Just to get you a bit unblocked, it seems you can indeed set the current context class loader and then the call to create the `JavaSparkContext` succeeds: ``` # verify that it doesn't work by default CL-USER> #1"Thread.currentThread().getContextClassLoader().getResourceAsStream("spark-version-info.properties")" NIL # have to find the right one, for me the first one in the list worked CL-USER> (car (car (dump-classpath))) # CL-USER> #1"Thread.currentThread()" # # well, thread first, then class loader CL-USER> (#"setContextClassLoader" * **) NIL # looks like it works CL-USER> #1"Thread.currentThread().getContextClassLoader().getResourceAsStream("spark-version-info.properties")" # CL-USER> (defvar *spark-conf* #1"new SparkConf().setAppName("abcl-app").setMaster("local")" ) *SPARK-CONF* # important to only attempt this call last, otherwise it might throw errors (c.f. *inferior-lisp*) about already having one in the process of being constructed CL-USER> (defvar *sc* (jss:new 'JavaSparkContext *spark-conf*)) *SC* ``` Hopefully there's a better way of course, since this is hardly convenient. From evenson at panix.com Wed Jul 22 17:35:53 2020 From: evenson at panix.com (Mark Evenson) Date: Wed, 22 Jul 2020 19:35:53 +0200 Subject: Setting different class loader In-Reply-To: <20200722105841.GB11069@v22012114971199.netcup.net> References: <383327080.4743772.1595316079768.ref@mail.yahoo.com> <383327080.4743772.1595316079768@mail.yahoo.com> <930141014.5376866.1595411565945@mail.yahoo.com> <20200722105841.GB11069@v22012114971199.netcup.net> Message-ID: <56755BE2-EBA0-4043-A867-45C949FEAC07@panix.com> > On Jul 22, 2020, at 12:58, Olof-Joachim Frahm wrote: > > On Wed, Jul 22, 2020 at 09:52:45AM +0000, Steven Nunez wrote: >> I've verified with (java:dump-classpath) that the JAR is on the ABCL >> classpath, and the JAR file does contain the >> spark-version-info.properties file. I've also tried getting the file >> myself with: >> (defvar rs >> #1"Thread.currentThread() >> .getContextClassLoader() >> .getResourceAsStream("spark-version-info.properties")" ) >> which returns nil, so their theory may be correct. >> Messing around with class loaders is a bit beyond my 20 year old Java knowledge [...] > > Just to get you a bit unblocked, it seems you can indeed set the current > context class loader and then the call to create the `JavaSparkContext` > succeeds: I’ve started to tool around with getting Spark working, but it doesn’t quite work for me yet. My current progress is in [Ember][] [Ember]: -- "A screaming comes across the sky. It has happened before but there is nothing to compare to it now." From etimmons at mit.edu Wed Jul 22 19:03:21 2020 From: etimmons at mit.edu (Eric Timmons) Date: Wed, 22 Jul 2020 15:03:21 -0400 Subject: run-program issues with OpenJDK 11 In-Reply-To: <87h7urw3gu.fsf@rocinante.timmons.dev> References: <878sg6hjh6.fsf@rocinante.timmons.dev> <40DB9263-896F-4593-A60F-F20A2E9EA498@panix.com> <87h7urw3gu.fsf@rocinante.timmons.dev> Message-ID: <878sfbxthi.fsf@rocinante.timmons.dev> I just tested this on a fresh Debian Buster VM (to rule out anything weird from my environment) with openjdk-11-jdk-headless and ABCL 1.7.1 and ended up with the same results: + The prebuilt jar is unable to use sys:run-program at all. + When built from source, sys:run-program works, but sys:process-pid does not. 
lisp-implementation-version:

"1.7.1"
"OpenJDK_64-Bit_Server_VM-Debian-11.0.7+10-post-Debian-3deb10u1"
"amd64-Linux-4.19.0.9-amd64"

Mark: I just realized that your use of swank-backend:getpid doesn't match
what I was trying to do. Swank's getpid gets the PID of the current process,
but I was trying to get the PID of a process started with
(sys:run-program ... :wait nil) using sys:process-pid.

-Eric

From steve_nunez at yahoo.com  Thu Jul 23 06:37:31 2020
From: steve_nunez at yahoo.com (Steven Nunez)
Date: Thu, 23 Jul 2020 06:37:31 +0000 (UTC)
Subject: Setting different class loader
In-Reply-To: <20200722105841.GB11069@v22012114971199.netcup.net>
References: <383327080.4743772.1595316079768.ref@mail.yahoo.com> <383327080.4743772.1595316079768@mail.yahoo.com> <930141014.5376866.1595411565945@mail.yahoo.com> <20200722105841.GB11069@v22012114971199.netcup.net>
Message-ID: <1556492299.5926822.1595486251645@mail.yahoo.com>

Thank you Olof; that was just what I needed to get things working. Well,
that and another half day struggling with what turns out to be a 5 year old
bug (#388). I really wish someone had mentioned in the JSS documentation,
"Oh, and this doesn't yet work in top-level forms". All told, it was a lot
more difficult to get started with ABCL than I expected it to be, but I'm
glad it's done and grateful to those that helped. For reference, here's the
code that finally works:

(defun change-class-loader ()
  (#"setContextClassLoader" #1"Thread.currentThread()" (java:get-current-classloader)))
(change-class-loader)

(defun make-spark-config (&key (app-name "abcl-app") (conf-master "local"))
  "Return a spark configuration. Required to work around ABCL bug 388,
otherwise we'd just do this in a top-level form. See
https://abcl.org/trac/ticket/338"
  (let ((conf (jss:new (jss:find-java-class "org.apache.spark.sparkConf"))))
    (java:chain conf
                ("setAppName" app-name)
                ("setMaster" conf-master))))

(defun make-spark-context (spark-config)
  (jss:new 'JavaSparkContext spark-config))

;;; Now we can create our context and configuration object
(defvar *spark-conf* (make-spark-config))
(defvar *sc* (make-spark-context *spark-conf*))

At least it gets me as far as line two of the spark 'hello world'; hopefully
there aren't any other surprises lurking. If anyone can recommend any best
practices or improvements, especially around the class loader bits, I'd be
very happy to hear them.

Regards,
    Steve

On Wednesday, July 22, 2020, 6:59:32 PM GMT+8, Olof-Joachim Frahm wrote:
[…]
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From steve_nunez at yahoo.com  Fri Jul 24 03:01:18 2020
From: steve_nunez at yahoo.com (Steven Nunez)
Date: Fri, 24 Jul 2020 03:01:18 +0000 (UTC)
Subject: Third Brick Wall
References: <2002682159.6434503.1595559678182.ref@mail.yahoo.com>
Message-ID: <2002682159.6434503.1595559678182@mail.yahoo.com>

OK, I'm on lines 4 and 5 of 'hello world' and ran into yet another brick
wall. Trying to convert the following two lines into ABCL:

List<Integer> data = Arrays.asList(1, 2, 3, 4, 5);
JavaRDD<Integer> distData = sc.parallelize(data);

it looks like it should be easy. Heck, I can do that in one line:

(#"parallelize" *sc* '(1 2 3 4 5)) ; *sc* defined yesterday and known to be correct

But no, it claims "no applicable method named parallelize found on
JavaSparkContext" (but there is!). Reading through section 3.1.1 of the
documentation, it appears that this is probably because '(1 2 3...) is a
LispObject and not a Java object (why no automatic conversion?). Let's try
to convert it:

(#"parallelize" *sc* (#"copytoArray" '(1 2 3 4 5)))

No instance method named copytoArray found for type org.armedbear.lisp.Cons

And the same with using an array, e.g. (#"parallelize" *sc* (#"copytoArray" #(1 2 3 4 5)))

Sigh

It's been a week and my intention was to have a working prototype by now and
present ABCL as a viable alternative to use in a project. I haven't got past
line 5 in 'hello world'. This doesn't bode well.

I've been reading about ABCL for years, and it's impressive. Full MOP,
extensible sequences, nearly 100% ANSI compliance, and the ability to deploy
on the JVM are major achievements. However, as a not-inexperienced Lisp
programmer, I find the barrier to entry remarkably high and the
documentation and examples sparse and insufficient to surmount the hurdles I
encountered.

Please take these comments in the way they are intended: constructive
feedback from someone who is a fan of the project and would love to be able
to use it. It's nearly impossible to get Lisp introduced into enterprise
environments, and ABCL provides a wedge into those types of projects,
ticking the boxes on deployment and the ability to work with legacy Java
code. Perhaps it makes more sense to someone approaching Lisp from the Java
side, but coming from the Lisp side to Java, there's a high barrier to
entry.
I know that no volunteer wants to write documentation, but more and clearer
docs are sorely needed here. This is probably not news, but sometimes it
helps to be reminded of the obvious.

I hate giving up, so this will be a personal background project in the hopes
that at the next opportunity things will have improved to the point where we
can consider introducing ABCL. If anyone has any pointers, either general
ones (though I think I would have found any docs or examples (lsw2) by now)
or an explanation of this problem in particular, they would be greatly
appreciated.

@easye, you mentioned your ember project. If you're going to continue with
that, please message me. A Spark wrapper would be useful, serve as a good
exemplar for using ABCL to wrap a large library and, with a companion
tutorial, help others overcome the kind of obstacles I've encountered. I'd
be happy to contribute.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From evenson at panix.com  Fri Jul 24 07:52:19 2020
From: evenson at panix.com (Mark Evenson)
Date: Fri, 24 Jul 2020 09:52:19 +0200
Subject: run-program issues with OpenJDK 11
In-Reply-To: <878sfbxthi.fsf@rocinante.timmons.dev>
References: <878sg6hjh6.fsf@rocinante.timmons.dev> <40DB9263-896F-4593-A60F-F20A2E9EA498@panix.com> <87h7urw3gu.fsf@rocinante.timmons.dev> <878sfbxthi.fsf@rocinante.timmons.dev>
Message-ID: 

> On Jul 22, 2020, at 21:03, Eric Timmons wrote:
> […]

Finally able to confirm the failure of SYS:RUN-PROGRAM when SLIME is *not*
used.  Debugging further…

--
"A screaming comes across the sky.  It has happened before but there is
nothing to compare to it now."

From evenson at panix.com  Fri Jul 24 07:54:43 2020
From: evenson at panix.com (Mark Evenson)
Date: Fri, 24 Jul 2020 09:54:43 +0200
Subject: Third Brick Wall
In-Reply-To: <2002682159.6434503.1595559678182@mail.yahoo.com>
References: <2002682159.6434503.1595559678182.ref@mail.yahoo.com> <2002682159.6434503.1595559678182@mail.yahoo.com>
Message-ID: <4E257772-E1B6-41A3-BBC8-94146859A51F@panix.com>

> On Jul 24, 2020, at 05:01, Steven Nunez wrote:
> […]
>
> @easye, you mentioned your ember project. If you're going to continue with
> that, please message me. A Spark wrapper would be useful, serve as a good
> exemplar for using ABCL to wrap a large library and, with a companion
> tutorial, help others overcome the kind of obstacles I've encountered. I'd
> be happy to contribute.

Sorry to hear that you weren’t able to bootstrap your project in time.

I will definitely continue with Ember to wrap Apache Spark as time permits
to start crafting the sort of documentation/tutorial that you would have
needed a week ago…

--
"A screaming comes across the sky.  It has happened before but there is
nothing to compare to it now."
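The error message in Steven's transcript already names the culprit: a quoted
Lisp list crosses JSS's reflective call as the Lisp object itself, an
org.armedbear.lisp.Cons, not as any java.util.List. This is easy to confirm,
since a Lisp object can be handed straight to Java reflection (a REPL
sketch, assuming only that jcall accepts Lisp objects as receivers, which
the "for type org.armedbear.lisp.Cons" error above suggests):

(java:jcall "getName" (java:jcall "getClass" '(1 2 3 4 5)))
;; => "org.armedbear.lisp.Cons"

Reflection-based method lookup on JavaSparkContext therefore finds no
parallelize overload applicable to that class, which is exactly the "no
applicable method" failure quoted above.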
From alessiostalla at gmail.com  Fri Jul 24 08:36:51 2020
From: alessiostalla at gmail.com (Alessio Stalla)
Date: Fri, 24 Jul 2020 10:36:51 +0200
Subject: Third Brick Wall
In-Reply-To: <4E257772-E1B6-41A3-BBC8-94146859A51F@panix.com>
References: <2002682159.6434503.1595559678182.ref@mail.yahoo.com> <2002682159.6434503.1595559678182@mail.yahoo.com> <4E257772-E1B6-41A3-BBC8-94146859A51F@panix.com>
Message-ID: 

Lisp lists are not Java List instances. There are some automatic conversions
(e.g., from Lisp numbers to Java numbers) but not for lists or hash tables.
You could invoke Arrays.asList(...) in Lisp as well; I don't remember the
exact syntax to do that as I haven't been using ABCL for years.

Granted, it wouldn't be much work to either have Cons implement List or
provide a cons-backed List implementation. However, the devil is in the
details – how do you convert those LispObject's to the appropriate Java
type? Is it Integer, Long, Double, ...? Generics are erased at runtime, so
ABCL couldn't possibly know; we'd need another primitive, e.g.,
(jcoerce-collection collection &key element-type (collection-type
"java.util.ArrayList")). But then one would have to know about it in order
to use it.

ABCL could also do type inference and deduce the type of the elements of the
list from how they're used in the code, e.g., from the call to parallelize.
But 1) it's a lot of work to implement and 2) it doesn't solve all problems:
I guess parallelize is itself generic, like List<T> parallelize(List<T> data)
or something like that, and in that case you as a user would still have to
spell a type name.

BTW – it gives me warm fuzzy feelings to read that you see extensible
sequences as a feature worth mentioning. I did the porting from SBCL back
then but it didn't look like it was used much.

FWIW, back then I also integrated CLOS dispatch with the Java class
hierarchy – it may have bit rotted, but you could write, e.g., (defmethod
foo ((bar (jclass "java.util.List"))) ...) and it would do the "right thing"
wrt Java inheritance. But hey, maybe it's undocumented as well.

On Fri, 24 Jul 2020 at 09:55, Mark Evenson wrote:
> […]
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
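Alessio's proposed JCOERCE-COLLECTION does not exist in ABCL; a simplified
version of it, dropping the element-type keyword and leaving element
conversion to ABCL's usual number coercion, can be sketched in a few lines:

(defun jcoerce-collection (collection &key (collection-type "java.util.ArrayList"))
  "Copy the Lisp sequence COLLECTION into a freshly made Java collection,
an instance of COLLECTION-TYPE, which must have a no-argument constructor
and an add method."
  (let ((jcollection (java:jnew collection-type)))
    (map nil (lambda (el) (java:jcall "add" jcollection el)) collection)
    jcollection))

;; e.g.: (#"parallelize" *sc* (jcoerce-collection '(1 2 3 4 5)))

Being MAP-based, it accepts vectors as well as lists, which sidesteps one of
the array/list confusions raised later in the thread.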
From alanruttenberg at gmail.com  Fri Jul 24 21:20:03 2020
From: alanruttenberg at gmail.com (Alan Ruttenberg)
Date: Fri, 24 Jul 2020 17:20:03 -0400
Subject: Third Brick Wall
In-Reply-To: <2002682159.6434503.1595559678182@mail.yahoo.com>
References: <2002682159.6434503.1595559678182.ref@mail.yahoo.com> <2002682159.6434503.1595559678182@mail.yahoo.com>
Message-ID: 

If you send a smallish example of the code that doesn't work, and a list of
the dependencies, I can have a look. I've been dealing with some stuff
recently that might make it easier to debug.

You definitely can't use copytoArray that way. You need to work with a java
array:

(jarray-from-list '(1 2 3 4 5))

Maybe:

(#"parallelize" *sc* (jarray-from-list '(1 2 3 4 5)))

Alan

On Thu, Jul 23, 2020 at 11:03 PM Steven Nunez wrote:
> […]
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
From steve_nunez at yahoo.com  Sat Jul 25 02:20:59 2020
From: steve_nunez at yahoo.com (Steven Nunez)
Date: Sat, 25 Jul 2020 02:20:59 +0000 (UTC)
Subject: Third Brick Wall
In-Reply-To: 
References: <2002682159.6434503.1595559678182.ref@mail.yahoo.com> <2002682159.6434503.1595559678182@mail.yahoo.com>
Message-ID: <1602185978.6932961.1595643659878@mail.yahoo.com>

I think what parallelize really needs is a java.util.List. Alessio mentioned
some reasons why an automatic conversion is challenging; perhaps a restart
is easier? I.e. search for a method of the given name that takes a
java.util.List, and if you're giving it an ABCL cons, the restart asks if
you want an automatic conversion.

Trying such a conversion manually, it seems I need a jlist-from-list, but
this doesn't exist in the JAVA package. How do I get a java.util.List from
an ABCL cons?

On Saturday, July 25, 2020, 5:20:30 AM GMT+8, Alan Ruttenberg wrote:
[…]
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From alanruttenberg at gmail.com  Sat Jul 25 04:34:41 2020
From: alanruttenberg at gmail.com (Alan Ruttenberg)
Date: Sat, 25 Jul 2020 00:34:41 -0400
Subject: Third Brick Wall
In-Reply-To: <1602185978.6932961.1595643659878@mail.yahoo.com>
References: <2002682159.6434503.1595559678182.ref@mail.yahoo.com> <2002682159.6434503.1595559678182@mail.yahoo.com> <1602185978.6932961.1595643659878@mail.yahoo.com>
Message-ID: 

Try just using a java list. As I understand it, that should work the same
way. Since java.util.List is an interface, you need to choose a concrete
class, such as ArrayList.

(let ((jlist (jss::new 'arraylist)))
  (loop for el in '(1 2 3 4 5)
        do (#"add" jlist el))
  (print (jss::jlist-to-list jlist))
  jlist)

The print statement is to verify that we got what was expected, and to
demonstrate jlist-to-list.

Alan

On Fri, Jul 24, 2020 at 10:21 PM Steven Nunez wrote:
> […]
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From steve_nunez at yahoo.com  Sat Jul 25 09:11:31 2020
From: steve_nunez at yahoo.com (Steven Nunez)
Date: Sat, 25 Jul 2020 09:11:31 +0000 (UTC)
Subject: Third Brick Wall
In-Reply-To: 
References: <2002682159.6434503.1595559678182.ref@mail.yahoo.com> <2002682159.6434503.1595559678182@mail.yahoo.com> <1602185978.6932961.1595643659878@mail.yahoo.com>
Message-ID: <1106618490.6990038.1595668291742@mail.yahoo.com>

Thanks Alan. What finally worked is:

(#"parallelize" *sc* (#"asList" 'Arrays (java:jarray-from-list '(1 2 3 4 5))))

But I can't help but think that going from a list to an array to a list is
the long way 'round. Is there a better or more idiomatic way to do this?

More generally, I think much of the confusion is that a Lisper will enter
into the FFI with a certain set of assumptions that don't hold. An array is
a type of list? No Array class? Only specific types of array classes (e.g.
array of Int)? I was hoping that this was or could be papered over in a
manner similar to CFFI, where I barely notice I'm using an external library
most of the time.
On Saturday, July 25, 2020, 12:35:08 PM GMT+8, Alan Ruttenberg wrote:
[…]
I know that no volunteer wants to write documentation, but more and clearer docs are sorely needed here. This is probably not news, but sometimes it helps to be reminded of the obvious.

I hate giving up, so this will be a personal background project in the hopes that at the next opportunity things will have improved to the point where we can consider introducing ABCL. So if anyone has any pointers, generally (though I think I would have found any docs or examples (lsw2) by now), or explaining this problem in particular, it would be greatly appreciated.

@easye, you mentioned your ember project. If you're going to continue with that, please message me. A Spark wrapper would be useful, serve as a good exemplar for using ABCL to wrap a large library and, with a companion tutorial, help others overcome the kind of obstacles I've encountered. I'd be happy to contribute.

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From alanruttenberg at gmail.com  Sat Jul 25 22:36:59 2020
From: alanruttenberg at gmail.com (Alan Ruttenberg)
Date: Sat, 25 Jul 2020 18:36:59 -0400
Subject: Third Brick Wall
In-Reply-To: <1106618490.6990038.1595668291742@mail.yahoo.com>
References: <2002682159.6434503.1595559678182.ref@mail.yahoo.com>
 <2002682159.6434503.1595559678182@mail.yahoo.com>
 <1602185978.6932961.1595643659878@mail.yahoo.com>
 <1106618490.6990038.1595668291742@mail.yahoo.com>
Message-ID:

On Sat, Jul 25, 2020 at 5:15 AM Steven Nunez wrote:

> Thanks Alan. What finally worked is:
>
> (#"parallelize" *sc* (#"asList" 'Arrays (java:jarray-from-list '(1 2 3 4 5))))

Did you try the code I sent to create the list? Here it is rephrased to be analogous to jarray-from-list:

(defun jlist-from-list (list)
  (let ((jlist (jss::new 'arraylist)))
    (loop for el in list
          do (#"add" jlist el))
    jlist))

I would expect to be able to write

(#"parallelize" *sc* (jlist-from-list '(1 2 3 4 5)))

It would be helpful to know if this doesn't work, as it means there's something I need to learn.

> But I can't help but think that going from a list to an array to a list is
> the long way 'round. Is there a better or more idiomatic way to do this?

In this case it's not really going the long way around. #"asList" takes a variable number of arguments - its Java method signature is (T... a). JSS doesn't yet know about varargs. The way Java implements varargs is to actually create a method that takes an *array* of the arguments, and then, when calling the method, add code to pack the arguments into an array. That's what you did - pack the arguments into an array, doing what the Java compiler would do. See https://stackoverflow.com/questions/21746663/how-does-jvm-implement-the-varargs

I'm going to think about how to make varargs work as expected so you could use a more natural syntax: (#"parallelize" *sc* (#"asList" 'Arrays 1 2 3 4 5)). But, as I said, the implementation of jlist-from-list should be adequate. If you can verify that then we can add (an optimized version of) it to abcl.

> More generally, I think much of the confusion is that a Lisper will enter
> into the FFI with a certain set of assumptions that don't hold. An array is
> a type of list?

What's the basis for thinking this?
> I was hoping that this was or could be papered over in a manner similar to
> CFFI where I barely notice I'm using an external library most of the time.

Well, that's the intent of JSS (I was the original author of JSS, BTW). In this case I think you are effectively arguing that we should coerce Lisp lists and arrays passed as Java arguments to a Java equivalent. That's also something I can think about. The downside of this is that sometimes I *want* to pass a cons. I suppose we could provide both, making coercing be the default and adding syntax to escape it, so

(#"parallelize" *sc* '(1 2 3 4 5))

and

(#"add" jlist (the cons '(1 2 3 4 5)))

if I wanted to have an element of jlist be a cons.

There's still an issue that there are multiple implementations of Java's List and we aren't indicating which one is desired. So we'd have to pick a default, like java.util.ArrayList. We'd have to document that if a different type of list was wanted it needs to be created explicitly, ala jlist-from-list. Similarly for arrays. If one writes:

(#"myMethod" ob #(1 2 3 4 5))

should it pass a byte array? an integer array? an array of Objects? Again we'd have to choose a default, presumably what jarray-from-list does - the array type is the Java type of the first argument. In this case that would be java.lang.Integer. Again I suppose we could add syntax to help:

(#"myMethod" ob (the (array byte) #(1 2 3 4 5)))

Any JSS users have an opinion?

Alan
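P.S. To make the varargs point concrete: what javac emits for Arrays.asList(1, 2, 3, 4, 5) is, in effect, Arrays.asList(new Integer[]{1, 2, 3, 4, 5}), so the explicit-array call from Lisp is the same packing spelled out by hand (a sketch, assuming JSS is loaded):

;; javac packs the varargs into an array; here we do the packing ourselves
(#"asList" 'Arrays (java:jarray-from-list '(1 2 3 4 5)))
;; => a java.util.List view of the five elements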
[…]

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From steve_nunez at yahoo.com  Mon Jul 27 04:13:49 2020
From: steve_nunez at yahoo.com (Steven Nunez)
Date: Mon, 27 Jul 2020 04:13:49 +0000 (UTC)
Subject: Third Brick Wall
In-Reply-To:
References: <2002682159.6434503.1595559678182.ref@mail.yahoo.com>
 <2002682159.6434503.1595559678182@mail.yahoo.com>
 <1602185978.6932961.1595643659878@mail.yahoo.com>
 <1106618490.6990038.1595668291742@mail.yahoo.com>
Message-ID: <1012854219.7529129.1595823229360@mail.yahoo.com>

The jlist-from-list works fine, thank you.

> More generally, I think much of the confusion is that a Lisper will enter
> into the FFI with a certain set of assumptions that don't hold. An array is
> a type of list?
>
> What's the basis for thinking this?
In an earlier message you wrote "Since java.util.List is abstract, you need to choose a concrete class, such as ArrayList." I took this to mean that there was an is-a relationship between Array and List.

> No Array class? Only specific types of array classes (e.g. array of Int)?
>
> Not following. It would be useful for us if you could unpack how someone
> would come to these conclusions. We could make the documentation better so
> as to try to avoid the confusion.

This came about when I searched for Array classes, and a stack overflow discussion seemed to suggest "There *are* array classes, one per element type used in the program", along with some other information that probably confused me further.

What helped was when I realised that java.util.List is probably like a Lisp *sequence*, and although perhaps not a direct superclass, may act like one. I still could be wrong, but it helped me conceptualise things enough to 'solve' the problem.

I've put what I've learned so far into an annotated hello-world.lisp on github so that it can be turned into a tutorial when it's finished. It's a work in progress, but by the end of the Spark hello world, I hope that most of the basics will have been demonstrated. The code is almost certainly not yet a good exemplar, and would benefit greatly from a review by someone(s) more knowledgeable about ABCL than I.
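The round trip makes the sequence analogy concrete (a sketch, using Alan's jlist-from-list from earlier in the thread and JSS's jlist-to-list):

;; Lisp list -> java.util.ArrayList -> Lisp list
(jss::jlist-to-list (jlist-from-list '(1 2 3 4 5)))
;; => (1 2 3 4 5)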
[…]
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From alessiostalla at gmail.com  Mon Jul 27 07:27:39 2020
From: alessiostalla at gmail.com (Alessio Stalla)
Date: Mon, 27 Jul 2020 09:27:39 +0200
Subject: Third Brick Wall
In-Reply-To: <1012854219.7529129.1595823229360@mail.yahoo.com>
References: <2002682159.6434503.1595559678182.ref@mail.yahoo.com>
 <2002682159.6434503.1595559678182@mail.yahoo.com>
 <1602185978.6932961.1595643659878@mail.yahoo.com>
 <1106618490.6990038.1595668291742@mail.yahoo.com>
 <1012854219.7529129.1595823229360@mail.yahoo.com>
Message-ID:

Arrays are primitive sequences of fixed size and element type. They are implemented in the JVM itself (presumably in C++). Each array has a (synthetic) array class, represented in Java as int[].class, String[].class, Object[][].class, and so on.

Lists are higher-level data structures, implemented in Java. They're potentially heterogeneous in element type, have variable size, and have several implementations with different characteristics, particularly wrt. performance and concurrency. ArrayList is one such implementation: a List backed by a primitive array. Then there's LinkedList, etc.
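From the Lisp side the difference is directly observable with a couple of reflection calls (a quick sketch, assuming the JSS contrib is loaded; the array class name depends on the element type jarray-from-list infers):

(let ((arr (java:jarray-from-list '(1 2 3)))   ; a Java array
      (lst (jss::new 'arraylist)))             ; a java.util.ArrayList
  (#"add" lst 1)
  (list (#"getName" (#"getClass" arr))         ; e.g. "[Ljava.lang.Integer;"
        (#"getName" (#"getClass" lst))))       ; "java.util.ArrayList"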
[…]

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From jgodbou at gmail.com  Tue Jul 28 03:14:01 2020
From: jgodbou at gmail.com (Jonathan Godbout)
Date: Mon, 27 Jul 2020 23:14:01 -0400
Subject: run-program issues with OpenJDK 11
In-Reply-To:
References: <878sg6hjh6.fsf@rocinante.timmons.dev>
 <40DB9263-896F-4593-A60F-F20A2E9EA498@panix.com>
 <87h7urw3gu.fsf@rocinante.timmons.dev>
 <878sfbxthi.fsf@rocinante.timmons.dev>
Message-ID:

Heh, I was just about to send the same error... on a Pixelbook VM, OpenJDK 11.0.8 2020-07-14.

On Fri, Jul 24, 2020 at 3:53 AM Mark Evenson wrote:

> > On Jul 22, 2020, at 21:03, Eric Timmons wrote:
> >
> > I just tested this on a fresh Debian Buster VM (to rule out anything
> > weird from my environment) with openjdk-11-jdk-headless and ABCL 1.7.1
> > and ended up with the same results:
> >
> > + The prebuilt jar is unable to use sys:run-program at all.
> >
> > + When built from source, sys:run-program works, but sys:process-pid
> >   does not.
> >
> > lisp-implementation-version:
> >
> >   "1.7.1"
> >   "OpenJDK_64-Bit_Server_VM-Debian-11.0.7+10-post-Debian-3deb10u1"
> >   "amd64-Linux-4.19.0.9-amd64"
> >
> > Mark: I just realized that your use of swank-backend:getpid doesn't
> > match what I was trying to do. Swank's getpid gets the PID of the
> > current process, but I was trying to get the PID of a process started
> > with (sys:run-program ... :wait nil) using sys:process-pid.
> >
> > -Eric
>
> Finally able to confirm the failure of SYS:RUN-PROGRAM when SLIME is *not* used.
>
> Debugging further…

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From steve_nunez at yahoo.com  Wed Jul 29 07:29:01 2020
From: steve_nunez at yahoo.com (Steven Nunez)
Date: Wed, 29 Jul 2020 07:29:01 +0000 (UTC)
Subject: JDK8 Style Lambda Functions and ABCL
References: <195020859.8787390.1596007741659.ref@mail.yahoo.com>
Message-ID: <195020859.8787390.1596007741659@mail.yahoo.com>

Greetings all,

I'm trying to convert the following Java code in the "Basics" section of the Spark Programming Guide to ABCL:

JavaRDD<Integer> lineLengths = lines.map(s -> s.length());

I know that "s -> s.length()" is a JDK8 style lambda function with one parameter, returning the result of calling length() on 's'. What I'd like to be able to do is write:

(let ((line-lengths (#"map" *lines* (lambda (s) (#"length" s)))))

but this isn't getting me anywhere, with Java saying there is no applicable method 'map' on *lines* (an instance of JavaRDD). There is such a method (if it matters, it is inherited by JavaRDD from the interface JavaRDDLike). Investigating that map method a bit further, it seems to want an org.apache.spark.api.java.function.Function. Here's a clip from the Spark description:

  Spark's API relies heavily on passing functions in the driver program to run on the cluster. In Java, functions are represented by classes implementing the interfaces in the org.apache.spark.api.java.function package. There are two ways to create such functions:

  - Implement the Function interfaces in your own class, either as an anonymous inner class or a named one, and pass an instance of it to Spark.
  - Use lambda expressions to concisely define an implementation.
I think what I'm getting from the ABCL lambda expression is a java.util.function:

SPARK> (describe (lambda (s) #"length" s))
#<FUNCTION {70B82AD0}> is an object of type FUNCTION.
The function's lambda list is:
  (S)

I do wonder though how the Java lambda "s -> s.length()" manages to produce the correct result, so this theory may not be correct.

The Spark guide goes on to say:

  While much of this guide uses lambda syntax for conciseness, it is easy to use all the same APIs in long-form. For example, we could have written our code above as follows:

  JavaRDD<String> lines = sc.textFile("data.txt");
  JavaRDD<Integer> lineLengths = lines.map(new Function<String, Integer>() {
    public Integer call(String s) { return s.length(); }
  });
  int totalLength = lineLengths.reduce(new Function2<Integer, Integer, Integer>() {
    public Integer call(Integer a, Integer b) { return a + b; }
  });

  Or, if writing the functions inline is unwieldy:

  class GetLength implements Function<String, Integer> {
    public Integer call(String s) { return s.length(); }
  }
  class Sum implements Function2<Integer, Integer, Integer> {
    public Integer call(Integer a, Integer b) { return a + b; }
  }

  JavaRDD<String> lines = sc.textFile("data.txt");
  JavaRDD<Integer> lineLengths = lines.map(new GetLength());
  int totalLength = lineLengths.reduce(new Sum());

To me, both of those examples look unwieldy. There was a stack overflow discussion on creating Java classes with ABCL from 9 years ago, and I've read the java interface examples on abcl.org, but both of those techniques look like they will produce code just as unwieldy as the Java syntax above.

Is there any way to use a lisp-style lambda syntax to produce a function that will satisfy Spark's requirement to implement the org.apache.spark.api.java.function interface? If not, the obvious route would be some macrology to create a defsparkfun, defsparkfun2, etc. that wrap the ABCL function I want to use with something that implements 'call' in the org.apache.spark.api.java.function interface. I'm hoping there's a better way.

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From alessiostalla at gmail.com  Wed Jul 29 08:10:13 2020
From: alessiostalla at gmail.com (Alessio Stalla)
Date: Wed, 29 Jul 2020 10:10:13 +0200
Subject: JDK8 Style Lambda Functions and ABCL
In-Reply-To: <195020859.8787390.1596007741659@mail.yahoo.com>
References: <195020859.8787390.1596007741659.ref@mail.yahoo.com>
 <195020859.8787390.1596007741659@mail.yahoo.com>
Message-ID:

Java lambdas are syntax sugar to implement interfaces with a single non-default method. ABCL doesn't translate them automatically, and doing so dynamically would be slow (creating a proxy each time is costly) without some advanced optimizations, but it could be done. You could use some functions and macros to hide the complexity of creating the proxy.
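For instance, something along these lines (an untested sketch; JLAMBDA is a hypothetical name, and it assumes the interface's single abstract method is called "call", as Spark's Function interfaces' is):

(defmacro jlambda (interface (&rest args) &body body)
  ;; Hypothetical sugar over JINTERFACE-IMPLEMENTATION: return a proxy
  ;; implementing INTERFACE whose "call" method runs BODY.
  `(java:jinterface-implementation
    ,interface
    "call" (lambda ,args ,@body)))

which would allow, in principle:

(#"map" *lines*
        (jlambda "org.apache.spark.api.java.function.Function" (s)
          (#"length" s)))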
[…]

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From steve_nunez at yahoo.com  Thu Jul 30 00:17:06 2020
From: steve_nunez at yahoo.com (Steven Nunez)
Date: Thu, 30 Jul 2020 00:17:06 +0000 (UTC)
Subject: JDK8 Style Lambda Functions and ABCL
In-Reply-To:
References: <195020859.8787390.1596007741659.ref@mail.yahoo.com>
 <195020859.8787390.1596007741659@mail.yahoo.com>
Message-ID: <587917587.9275180.1596068226837@mail.yahoo.com>

I see. So when you mention proxies, are you referring to wrapping java:jnew-runtime-class with some macros?
Something like this:

(java:jnew-runtime-class
 "get-length"
 :interfaces (list "org.apache.spark.api.java.function.Function")
 :methods `(("call" "java.lang.Integer" ("java.lang.String")
             (lambda (s) (length s))
             :modifiers (:public)))
 :access-flags '(:public :static :final))

?

I'll give that a try today and see how it goes, but I notice that I've got to specify the RETURN-TYPE, where the JDK8 lambdas do not.

Looking at how well the existing lambda works, barring returning java.util.function, I can't help but wonder if a variant of the ABCL lambda that returns an implementation of org.apache.spark.api.java.function and the 'call' method might not be the most elegant route, the one with the most natural syntax.

[…]
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From steve_nunez at yahoo.com  Thu Jul 30 03:18:16 2020
From: steve_nunez at yahoo.com (Steven Nunez)
Date: Thu, 30 Jul 2020 03:18:16 +0000 (UTC)
Subject: java:jinterface-implementation: multiple interfaces?
References: <468899037.9322843.1596079096667.ref@mail.yahoo.com>
Message-ID: <468899037.9322843.1596079096667@mail.yahoo.com>

Is there a way to implement multiple interfaces on a single Java proxy? This code almost works:

(java:jinterface-implementation
 "org.apache.spark.api.java.function.Function"
 "call" (lambda (s) (length s)))

except that the proxy also needs to implement Serializable. The jproxy code in java.lisp seems to suggest that multiple implementations are allowed:

(defgeneric jmake-proxy (interface implementation &optional lisp-this)
  (:documentation "Returns a proxy Java object implementing the provided interface(s)...

but I can't see adding multiple implementations in the code. I see there are a few jmake-proxy methods in there, though: are there any documentation or examples for their usage? Lsw2 doesn't use this at all and I can't find any other good examples of using ABCL.

Multiple interfaces from the jinterface-implementation function would be ideal, as the above code could then be wrapped with a macro to produce a 'spark-lambda' and be used nearly like the regular ABCL lambda.

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From steve_nunez at yahoo.com  Thu Jul 30 03:21:17 2020
From: steve_nunez at yahoo.com (Steven Nunez)
Date: Thu, 30 Jul 2020 03:21:17 +0000 (UTC)
Subject: java:jinterface-implementation: multiple interfaces?
In-Reply-To: <468899037.9322843.1596079096667@mail.yahoo.com>
References: <468899037.9322843.1596079096667.ref@mail.yahoo.com>
 <468899037.9322843.1596079096667@mail.yahoo.com>
Message-ID: <963008952.9327429.1596079277962@mail.yahoo.com>

Apologies, when I said "but I can't see adding multiple implementations", I meant multiple interfaces.
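A quick way to see what the proxy does and doesn't satisfy (a sketch; it assumes java:jinstance-of-p accepts a class name as a string):

(let ((f (java:jinterface-implementation
          "org.apache.spark.api.java.function.Function"
          "call" (lambda (s) (length s)))))
  (list (java:jinstance-of-p f "org.apache.spark.api.java.function.Function")
        (java:jinstance-of-p f "java.io.Serializable")))
;; expected => (T NIL) - the proxy satisfies Function but not Serializable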
[…]

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From jgodbou at gmail.com  Thu Jul 30 03:55:37 2020
From: jgodbou at gmail.com (Jonathan Godbout)
Date: Wed, 29 Jul 2020 23:55:37 -0400
Subject: run-program issues with OpenJDK 11
In-Reply-To:
References: <878sg6hjh6.fsf@rocinante.timmons.dev>
 <40DB9263-896F-4593-A60F-F20A2E9EA498@panix.com>
 <87h7urw3gu.fsf@rocinante.timmons.dev>
 <878sfbxthi.fsf@rocinante.timmons.dev>
Message-ID:

Note: I'm using the ABCL binary abcl-bin-1.7.1.tar.gz from https://common-lisp.net/project/armedbear/

[…]

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From alessiostalla at gmail.com  Thu Jul 30 07:50:36 2020
From: alessiostalla at gmail.com (Alessio Stalla)
Date: Thu, 30 Jul 2020 09:50:36 +0200
Subject: JDK8 Style Lambda Functions and ABCL
In-Reply-To: <587917587.9275180.1596068226837@mail.yahoo.com>
References: <195020859.8787390.1596007741659.ref@mail.yahoo.com>
 <195020859.8787390.1596007741659@mail.yahoo.com>
 <587917587.9275180.1596068226837@mail.yahoo.com>
Message-ID:

No, you don't need to spin new classes.
You can use a piece of machinery that is already in the JVM, called a proxy factory, which ABCL already knows how to use via jinterface-implementation.

[…]
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From alessiostalla at gmail.com  Thu Jul 30 07:57:26 2020
From: alessiostalla at gmail.com (Alessio Stalla)
Date: Thu, 30 Jul 2020 09:57:26 +0200
Subject: java:jinterface-implementation: multiple interfaces?
In-Reply-To: <963008952.9327429.1596079277962@mail.yahoo.com>
References: <468899037.9322843.1596079096667.ref@mail.yahoo.com>
 <468899037.9322843.1596079096667@mail.yahoo.com>
 <963008952.9327429.1596079277962@mail.yahoo.com>
Message-ID:

You may have luck by providing a list. However, I see a deeper problem. Serializable is a marker interface: it has no methods, it only declares the type serializable. However, you cannot just declare that an object is serializable to make it so; all its components must be serializable as well. This includes the invocation handler that ABCL creates under the cover, as well as all the Lisp objects that you use for the implementation, particularly functions and closures. And, bad news: those aren't serializable. So, if Serializable is a requirement because those instances will effectively be serialized (e.g., to persist them to a file or to send them over the network), you're out of luck.

Ages ago I had started a branch to make most Lisp objects serializable, but I don't remember how far I got. I don't think it was ever mature enough to be merged, but many years have passed.
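Concretely, the thing to try would be something like this (an untested sketch; whether the interface argument really accepts a list is precisely the open question):

(java:jinterface-implementation
 (list "org.apache.spark.api.java.function.Function"
       "java.io.Serializable")   ; marker interface, no methods to supply
 "call" (lambda (s) (length s)))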
> The jproxy code in java.lisp seems to suggest that multiple
> implementations are allowed:
>
> (defgeneric jmake-proxy (interface implementation &optional lisp-this)
>   (:documentation "Returns a proxy Java object implementing the provided
> interface(s)...
>
> but I can't see adding multiple implementations in the code. I see
> there's a few jmake-proxy methods in there though: are there any
> documentation or examples for their usage? Lsw2 doesn't use this at all
> and I can't find any other good examples of using ABCL.
>
> Multiple interfaces from the jinterface-implementation function would be
> ideal, as the above code could then be wrapped with a macro to produce a
> 'spark-lambda' and be used nearly like the regular ABCL lambda.
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From alessiostalla at gmail.com Thu Jul 30 08:00:24 2020
From: alessiostalla at gmail.com (Alessio Stalla)
Date: Thu, 30 Jul 2020 10:00:24 +0200
Subject: java:jinterface-implementation: multiple interfaces?
In-Reply-To: 
References: <468899037.9322843.1596079096667.ref@mail.yahoo.com> <468899037.9322843.1596079096667@mail.yahoo.com> <963008952.9327429.1596079277962@mail.yahoo.com>
Message-ID: 

Correction: indeed it was merged, but I didn't go as far as to make
closures serializable.

On Thu, 30 Jul 2020 at 09:57, Alessio Stalla wrote:

> You may have luck by providing a list. However, I see a deeper problem.
> Serializable is a marker interface: it has no methods; it only declares
> the type serializable. However, you cannot just declare that an object is
> serializable to make it so; all its components must be serializable as
> well. This includes the invocation handler that ABCL creates under the
> covers, as well as all the Lisp objects that you use for the
> implementation, particularly functions and closures. And, bad news –
> those aren't serializable. So, if Serializable is a requirement because
> those instances will effectively be serialized – e.g., to persist them to
> a file or to send them over the network – you're out of luck.
>
> Ages ago I had started a branch to make most Lisp objects serializable,
> but I don't remember how far I got. I don't think it was ever mature
> enough to be merged, but many years have passed.
>
> On Thu, 30 Jul 2020 at 05:22, Steven Nunez wrote:
>
>> Apologies, when I said "but I can't see adding multiple
>> implementations", I meant multiple interfaces.
>>
>> On Thursday, July 30, 2020, 11:19:07 AM GMT+8, Steven Nunez <
>> steve_nunez at yahoo.com> wrote:
>>
>>
>> Is there a way to implement multiple interfaces on a single Java proxy?
>> This code almost works:
>>
>> (java:jinterface-implementation
>>  "org.apache.spark.api.java.function.Function"
>>  "call" (lambda (s) (length s)))
>>
>> except that the proxy also needs to implement Serializable. The jproxy
>> code in java.lisp seems to suggest that multiple implementations are
>> allowed:
>>
>> (defgeneric jmake-proxy (interface implementation &optional lisp-this)
>>   (:documentation "Returns a proxy Java object implementing the provided
>> interface(s)...
>>
>> but I can't see adding multiple implementations in the code. I see
>> there's a few jmake-proxy methods in there though: are there any
>> documentation or examples for their usage? Lsw2 doesn't use this at all
>> and I can't find any other good examples of using ABCL.
>>
>> Multiple interfaces from the jinterface-implementation function would
>> be ideal, as the above code could then be wrapped with a macro to
>> produce a 'spark-lambda' and be used nearly like the regular ABCL
>> lambda.
>>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From evenson at panix.com Thu Jul 30 08:14:13 2020
From: evenson at panix.com (Mark Evenson)
Date: Thu, 30 Jul 2020 10:14:13 +0200
Subject: Workaround for SYS:RUN-PROGRAM under openjdk11+ (was Re: run-program issues with OpenJDK 11)
In-Reply-To: 
References: <878sg6hjh6.fsf@rocinante.timmons.dev> <40DB9263-896F-4593-A60F-F20A2E9EA498@panix.com> <87h7urw3gu.fsf@rocinante.timmons.dev> <878sfbxthi.fsf@rocinante.timmons.dev>
Message-ID: <9C4CE05A-D689-4D7A-81BD-E8D529A7B3F0@panix.com>

I am finally able to report some progress on understanding the problem,
and to provide a workaround.

This bug occurs when a system compiled with openjdk8 is run under the
openjdk11 runtime. Inconsistently, the abcl-1.7.0 binaries were compiled
with openjdk11, while the abcl-1.7.1 binaries were compiled with openjdk8.

In openjdk11, the java.lang.UNIXProcess class has been replaced with
java.lang.ProcessImpl, so our code needs to conditionally find the pid of
a given process based on the platform that ABCL finds itself running upon.

As a workaround until we fix and release the next version of ABCL, one may
get sys:run-program working (as long as one doesn't try to get the pid) by
first evaluating the following form, which is sufficient to load
sys:run-program:

(ignore-errors (sys:run-program "true" nil))

An easy fix for users of the abcl-1.7.1 binaries under openjdk11 would be
to place such code in .
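As for what the real fix will look like: the following is merely a sketch
of the sort of conditionalization described above, not the code that will
ship. It relies on the fact that java.lang.Process gained a public pid()
method in Java 9, so poking at the hidden implementation class is only
needed on java8 and earlier; the function name and the exact reflection
calls here are assumptions, untested:

(defun process-pid (process)
  "Sketch: return the pid of the java.lang.Process PROCESS."
  (handler-case
      ;; Java 9 and later: java.lang.Process.pid() is public API
      (java:jcall "pid" process)
    (error ()
      ;; Java 8 and earlier: reach into the 'pid' field of the
      ;; java.lang.UNIXProcess implementation class
      (java:jfield "java.lang.UNIXProcess" "pid" process))))

-- 
"A screaming comes across the sky. It has happened before but there is
nothing to compare to it now."

From steve_nunez at yahoo.com Thu Jul 30 08:25:44 2020
From: steve_nunez at yahoo.com (Steven Nunez)
Date: Thu, 30 Jul 2020 08:25:44 +0000 (UTC)
Subject: java:jinterface-implementation: multiple interfaces?
In-Reply-To: 
References: <468899037.9322843.1596079096667.ref@mail.yahoo.com> <468899037.9322843.1596079096667@mail.yahoo.com> <963008952.9327429.1596079277962@mail.yahoo.com>
Message-ID: <945996571.9390096.1596097544486@mail.yahoo.com>

Bugger. I'd hate for this to come to a dead end, as it was looking like an
elegant solution. The Spark tuning guide mentions using Kryo to provide
serialization in Spark. It's not entirely free, however; you need to
'register' your classes with Kryo for it to work. Would that be sufficient
to provide serialisation to the needed Lisp objects?

On Thursday, July 30, 2020, 3:57:53 PM GMT+8, Alessio Stalla wrote:

You may have luck by providing a list. However, I see a deeper problem.
Serializable is a marker interface: it has no methods; it only declares
the type serializable. However, you cannot just declare that an object is
serializable to make it so; all its components must be serializable as
well. This includes the invocation handler that ABCL creates under the
covers, as well as all the Lisp objects that you use for the
implementation, particularly functions and closures. And, bad news – those
aren't serializable. So, if Serializable is a requirement because those
instances will effectively be serialized – e.g., to persist them to a file
or to send them over the network – you're out of luck.

Ages ago I had started a branch to make most Lisp objects serializable,
but I don't remember how far I got.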
I don't think it was ever mature enough to be merged, but many years have
passed.

On Thu, 30 Jul 2020 at 05:22, Steven Nunez wrote:

Apologies, when I said "but I can't see adding multiple implementations",
I meant multiple interfaces.

On Thursday, July 30, 2020, 11:19:07 AM GMT+8, Steven Nunez wrote:

Is there a way to implement multiple interfaces on a single Java proxy?
This code almost works:

(java:jinterface-implementation
 "org.apache.spark.api.java.function.Function"
 "call" (lambda (s) (length s)))

except that the proxy also needs to implement Serializable. The jproxy
code in java.lisp seems to suggest that multiple implementations are
allowed:

(defgeneric jmake-proxy (interface implementation &optional lisp-this)
  (:documentation "Returns a proxy Java object implementing the provided
interface(s)...

but I can't see adding multiple implementations in the code. I see
there's a few jmake-proxy methods in there though: are there any
documentation or examples for their usage? Lsw2 doesn't use this at all
and I can't find any other good examples of using ABCL.

Multiple interfaces from the jinterface-implementation function would be
ideal, as the above code could then be wrapped with a macro to produce a
'spark-lambda' and be used nearly like the regular ABCL lambda.

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From evenson at panix.com Thu Jul 30 11:04:38 2020
From: evenson at panix.com (Mark Evenson)
Date: Thu, 30 Jul 2020 13:04:38 +0200
Subject: java:jinterface-implementation: multiple interfaces?
In-Reply-To: 
References: <468899037.9322843.1596079096667.ref@mail.yahoo.com> <468899037.9322843.1596079096667@mail.yahoo.com> <963008952.9327429.1596079277962@mail.yahoo.com>
Message-ID: <39036C5C-4F9E-4865-84AD-E5528DFA9DE4@panix.com>

> On Jul 30, 2020, at 10:00, Alessio Stalla wrote:
>
> Correction: indeed it was merged, but I didn't go as far as to make
> closures serializable.

Any idea how much work for someone (i.e. me) to be able to serialize
closures? Just a bit of elbow-grease, or a major implementation?

-- 
"A screaming comes across the sky. It has happened before but there is
nothing to compare to it now."

From alessiostalla at gmail.com Thu Jul 30 11:07:36 2020
From: alessiostalla at gmail.com (Alessio Stalla)
Date: Thu, 30 Jul 2020 13:07:36 +0200
Subject: java:jinterface-implementation: multiple interfaces?
In-Reply-To: <39036C5C-4F9E-4865-84AD-E5528DFA9DE4@panix.com>
References: <468899037.9322843.1596079096667.ref@mail.yahoo.com> <468899037.9322843.1596079096667@mail.yahoo.com> <963008952.9327429.1596079277962@mail.yahoo.com> <39036C5C-4F9E-4865-84AD-E5528DFA9DE4@panix.com>
Message-ID: 

Somewhere in between. I could give a shot at it. It would be useful if
Steven detailed his use case a bit more.

On Thu, Jul 30, 2020, 13:04 Mark Evenson wrote:

> > On Jul 30, 2020, at 10:00, Alessio Stalla wrote:
> >
> > Correction: indeed it was merged, but I didn't go as far as to make
> > closures serializable.
>
> Any idea how much work for someone (i.e. me) to be able to serialize
> closures? Just a bit of elbow-grease, or a major implementation?
>
> -- 
> "A screaming comes across the sky. It has happened before but there is
> nothing to compare to it now."
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From steve_nunez at yahoo.com Thu Jul 30 11:26:32 2020
From: steve_nunez at yahoo.com (Steven Nunez)
Date: Thu, 30 Jul 2020 11:26:32 +0000 (UTC)
Subject: java:jinterface-implementation: multiple interfaces?
In-Reply-To: 
References: <468899037.9322843.1596079096667.ref@mail.yahoo.com> <468899037.9322843.1596079096667@mail.yahoo.com> <963008952.9327429.1596079277962@mail.yahoo.com> <39036C5C-4F9E-4865-84AD-E5528DFA9DE4@panix.com>
Message-ID: <632871186.9440194.1596108392654@mail.yahoo.com>

Sorry, meant to send to the ABCL-dev list the first time.

The use case is Spark lambda functions. I couldn't do a better job than
the Spark RDD Programming Guide does at explaining the use case; it starts
at the Basics heading. The ideal case would be the ability to take Java
code like this:

JavaRDD<Integer> lineLengths = lines.map(s -> s.length());

and write it in ABCL like this:

(let ((line-lengths (#"map" *lines* (lambda (s) (length s)))))

This uses the ABCL length function, which would be a huge win if we can
use Lisp functions to map across data structures. I've already got
abcl.jar accessible to Spark on all the nodes of a cluster. I'd probably
shadow the cl:lambda with a spark:lambda to make the syntax natural.

On Thursday, July 30, 2020, 7:08:35 PM GMT+8, Alessio Stalla <
alessiostalla at gmail.com> wrote:

Somewhere in between. I could give a shot at it. It would be useful if
Steven detailed his use case a bit more.

On Thu, Jul 30, 2020, 13:04 Mark Evenson wrote:

> On Jul 30, 2020, at 10:00, Alessio Stalla wrote:
>
> Correction: indeed it was merged, but I didn't go as far as to make
> closures serializable.

Any idea how much work for someone (i.e. me) to be able to serialize
closures? Just a bit of elbow-grease, or a major implementation?

-- 
"A screaming comes across the sky. It has happened before but there is
nothing to compare to it now."

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From alessiostalla at gmail.com Thu Jul 30 11:40:48 2020
From: alessiostalla at gmail.com (Alessio Stalla)
Date: Thu, 30 Jul 2020 13:40:48 +0200
Subject: java:jinterface-implementation: multiple interfaces?
In-Reply-To: <632871186.9440194.1596108392654@mail.yahoo.com>
References: <468899037.9322843.1596079096667.ref@mail.yahoo.com> <468899037.9322843.1596079096667@mail.yahoo.com> <963008952.9327429.1596079277962@mail.yahoo.com> <39036C5C-4F9E-4865-84AD-E5528DFA9DE4@panix.com> <632871186.9440194.1596108392654@mail.yahoo.com>
Message-ID: 

Hmm, so Spark being a distributed computing library, I guess you need full
serialization of functions/closures. Well, I can give it a shot, but don't
expect anything too soon.

On Thu, 30 Jul 2020 at 13:26, Steven Nunez wrote:

> Sorry, meant to send to the ABCL-dev list the first time.
>
> The use case is Spark lambda functions. I couldn't do a better job than
> the Spark RDD Programming Guide does at explaining the use case; it
> starts at the Basics heading. The ideal case would be the ability to
> take Java code like this:
>
> JavaRDD<Integer> lineLengths = lines.map(s -> s.length());
>
> and write it in ABCL like this:
>
> (let ((line-lengths (#"map" *lines* (lambda (s) (length s)))))
>
> This uses the ABCL length function, which would be a huge win if we can
> use Lisp functions to map across data structures. I've already got
> abcl.jar accessible to Spark on all the nodes of a cluster. I'd probably
> shadow the cl:lambda with a spark:lambda to make the syntax natural.
>
> On Thursday, July 30, 2020, 7:08:35 PM GMT+8, Alessio Stalla <
> alessiostalla at gmail.com> wrote:
>
> Somewhere in between. I could give a shot at it. It would be useful if
> Steven detailed his use case a bit more.
>
> On Thu, Jul 30, 2020, 13:04 Mark Evenson wrote:
>
> > On Jul 30, 2020, at 10:00, Alessio Stalla wrote:
> >
> > Correction: indeed it was merged, but I didn't go as far as to make
> > closures serializable.
>
> Any idea how much work for someone (i.e. me) to be able to serialize
> closures? Just a bit of elbow-grease, or a major implementation?
>
> -- 
> "A screaming comes across the sky. It has happened before but there is
> nothing to compare to it now."
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From alessiostalla at gmail.com Thu Jul 30 12:38:28 2020
From: alessiostalla at gmail.com (Alessio Stalla)
Date: Thu, 30 Jul 2020 14:38:28 +0200
Subject: java:jinterface-implementation: multiple interfaces?
In-Reply-To: 
References: <468899037.9322843.1596079096667.ref@mail.yahoo.com> <468899037.9322843.1596079096667@mail.yahoo.com> <963008952.9327429.1596079277962@mail.yahoo.com> <39036C5C-4F9E-4865-84AD-E5528DFA9DE4@panix.com> <632871186.9440194.1596108392654@mail.yahoo.com>
Message-ID: 

@Mark Evenson would a GitHub fork + pull request work for you in case I
work on this?

On Thu, 30 Jul 2020 at 13:40, Alessio Stalla wrote:

> Hmm, so Spark being a distributed computing library, I guess you need
> full serialization of functions/closures. Well, I can give it a shot, but
> don't expect anything too soon.
>
> On Thu, 30 Jul 2020 at 13:26, Steven Nunez wrote:
>
>> Sorry, meant to send to the ABCL-dev list the first time.
>>
>> The use case is Spark lambda functions. I couldn't do a better job than
>> the Spark RDD Programming Guide does at explaining the use case; it
>> starts at the Basics heading. The ideal case would be the ability to
>> take Java code like this:
>>
>> JavaRDD<Integer> lineLengths = lines.map(s -> s.length());
>>
>> and write it in ABCL like this:
>>
>> (let ((line-lengths (#"map" *lines* (lambda (s) (length s)))))
>>
>> This uses the ABCL length function, which would be a huge win if we can
>> use Lisp functions to map across data structures. I've already got
>> abcl.jar accessible to Spark on all the nodes of a cluster. I'd probably
>> shadow the cl:lambda with a spark:lambda to make the syntax natural.
>>
>> On Thursday, July 30, 2020, 7:08:35 PM GMT+8, Alessio Stalla <
>> alessiostalla at gmail.com> wrote:
>>
>> Somewhere in between. I could give a shot at it. It would be useful if
>> Steven detailed his use case a bit more.
>>
>> On Thu, Jul 30, 2020, 13:04 Mark Evenson wrote:
>>
>> > On Jul 30, 2020, at 10:00, Alessio Stalla wrote:
>> >
>> > Correction: indeed it was merged, but I didn't go as far as to make
>> > closures serializable.
>>
>> Any idea how much work for someone (i.e. me) to be able to serialize
>> closures? Just a bit of elbow-grease, or a major implementation?
>>
>> -- 
>> "A screaming comes across the sky. It has happened before but there is
>> nothing to compare to it now."
>>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From steve_nunez at yahoo.com Thu Jul 30 13:47:40 2020
From: steve_nunez at yahoo.com (Steven Nunez)
Date: Thu, 30 Jul 2020 13:47:40 +0000 (UTC)
Subject: java:jinterface-implementation: multiple interfaces?
In-Reply-To: 
References: <468899037.9322843.1596079096667.ref@mail.yahoo.com> <468899037.9322843.1596079096667@mail.yahoo.com> <963008952.9327429.1596079277962@mail.yahoo.com> <39036C5C-4F9E-4865-84AD-E5528DFA9DE4@panix.com> <632871186.9440194.1596108392654@mail.yahoo.com>
Message-ID: <913036844.8887254.1596116860472@mail.yahoo.com>

Let me know where/if you get a GitHub repo. To the extent that I can, I'll
contribute.

On Thursday, July 30, 2020, 7:41:15 PM GMT+8, Alessio Stalla wrote:

Hmm, so Spark being a distributed computing library, I guess you need full
serialization of functions/closures. Well, I can give it a shot, but don't
expect anything too soon.

On Thu, 30 Jul 2020 at 13:26, Steven Nunez wrote:

Sorry, meant to send to the ABCL-dev list the first time.

The use case is Spark lambda functions. I couldn't do a better job than
the Spark RDD Programming Guide does at explaining the use case; it starts
at the Basics heading. The ideal case would be the ability to take Java
code like this:

JavaRDD<Integer> lineLengths = lines.map(s -> s.length());

and write it in ABCL like this:

(let ((line-lengths (#"map" *lines* (lambda (s) (length s)))))

This uses the ABCL length function, which would be a huge win if we can
use Lisp functions to map across data structures. I've already got
abcl.jar accessible to Spark on all the nodes of a cluster. I'd probably
shadow the cl:lambda with a spark:lambda to make the syntax natural.

On Thursday, July 30, 2020, 7:08:35 PM GMT+8, Alessio Stalla wrote:

Somewhere in between. I could give a shot at it. It would be useful if
Steven detailed his use case a bit more.

On Thu, Jul 30, 2020, 13:04 Mark Evenson wrote:

> On Jul 30, 2020, at 10:00, Alessio Stalla wrote:
>
> Correction: indeed it was merged, but I didn't go as far as to make
> closures serializable.

Any idea how much work for someone (i.e. me) to be able to serialize
closures? Just a bit of elbow-grease, or a major implementation?

-- 
"A screaming comes across the sky. It has happened before but there is
nothing to compare to it now."

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From evenson at panix.com Thu Jul 30 17:11:48 2020
From: evenson at panix.com (Mark Evenson)
Date: Thu, 30 Jul 2020 19:11:48 +0200
Subject: java:jinterface-implementation: multiple interfaces?
In-Reply-To: 
References: <468899037.9322843.1596079096667.ref@mail.yahoo.com> <468899037.9322843.1596079096667@mail.yahoo.com> <963008952.9327429.1596079277962@mail.yahoo.com> <39036C5C-4F9E-4865-84AD-E5528DFA9DE4@panix.com> <632871186.9440194.1596108392654@mail.yahoo.com>
Message-ID: 

> On Jul 30, 2020, at 14:38, Alessio Stalla wrote:
>
> @Mark Evenson would a GitHub fork + pull request work for you in case I
> work on this?

Certainly. I will add your admin rights as a maintainer forthwith!

-- 
"A screaming comes across the sky. It has happened before but there is
nothing to compare to it now."

From evenson at panix.com Thu Jul 30 17:24:09 2020
From: evenson at panix.com (Mark Evenson)
Date: Thu, 30 Jul 2020 19:24:09 +0200
Subject: java:jinterface-implementation: multiple interfaces?
In-Reply-To: 
References: <468899037.9322843.1596079096667.ref@mail.yahoo.com> <468899037.9322843.1596079096667@mail.yahoo.com> <963008952.9327429.1596079277962@mail.yahoo.com> <39036C5C-4F9E-4865-84AD-E5528DFA9DE4@panix.com> <632871186.9440194.1596108392654@mail.yahoo.com>
Message-ID: 

> On Jul 30, 2020, at 19:11, Mark Evenson wrote:
>
>> On Jul 30, 2020, at 14:38, Alessio Stalla wrote:
>>
>> @Mark Evenson would a GitHub fork + pull request work for you in case I
>> work on this?
>
> Certainly. I will add your admin rights as a maintainer forthwith!

Done. Travis builds are the current test suite run on pull requests.

Welcome,
Mark

-- 
"A screaming comes across the sky. It has happened before but there is
nothing to compare to it now."

From evenson at panix.com Thu Jul 30 19:52:58 2020
From: evenson at panix.com (Mark Evenson)
Date: Thu, 30 Jul 2020 21:52:58 +0200
Subject: Fixed in trunk (was Re: run-program issues with OpenJDK 11)
In-Reply-To: 
References: <878sg6hjh6.fsf@rocinante.timmons.dev> <40DB9263-896F-4593-A60F-F20A2E9EA498@panix.com> <87h7urw3gu.fsf@rocinante.timmons.dev> <878sfbxthi.fsf@rocinante.timmons.dev>
Message-ID: <3EA0F697-46D1-4617-B837-671AC6E2159F@panix.com>

I’ve made some recent commits that should be part of abcl-1.7.2 (real soon
now).

> On Jul 24, 2020, at 09:52, Mark Evenson wrote:
> […]
> Debugging further…
> […]

Issues with openjdk11 not being able to invoke SYS:RUN-PROGRAM are fixed
with the commit at [1].

Additionally, I did some light testing of running the implementation
compiled with openjdk11 on java8 implementations. This shook out [some
bugs with the compiler][2], as the openjdk11 compiler optimizes the call
to java.nio.Buffer.flip() to be directly on the java.nio.ByteBuffer type.
We fixed other instances of java.nio.Buffer interfaces to the point of
being able to run the ANSI-TEST suite.

I was able to get around this with some fairly ugly looking casts. But it
seems to work…

// capacity = buffer.limit();
// ==>
capacity = ((java.nio.Buffer)buffer).limit();
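To see why the cast matters, here is a self-contained illustration; this
is my reconstruction for the list (class name and all, not code from our
tree). javac under openjdk9 and later records the covariant override
java.nio.ByteBuffer.flip(), added in Java 9, in the call site's
descriptor, and that descriptor does not resolve on a java8 runtime:

import java.nio.Buffer;
import java.nio.ByteBuffer;

public class FlipExample {
  public static void main(String[] args) {
    ByteBuffer buffer = ByteBuffer.allocate(16);
    buffer.put((byte) 42);

    // Compiled with JDK 9 or later, an unqualified buffer.flip() emits a
    // call to the covariant override
    //     ByteBuffer flip()   (new in Java 9)
    // which throws NoSuchMethodError on a Java 8 JVM.
    //
    // Casting to the base class pins the call site to
    //     Buffer flip()       (present since Java 1.4)
    // so the same class file runs on Java 8 and Java 11 alike:
    ((Buffer) buffer).flip();

    System.out.println(buffer.get()); // prints 42
  }
}

[1]: <https://github.com/armedbear/abcl/commit/7021a905222c2b743422b607409233771e4ed623>
[2]: 

-- 
"A screaming comes across the sky. It has happened before but there is
nothing to compare to it now."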