[elephant-devel] Using CVS on SBCL-0.9.3/sparc-sunos

pvk at pvk.ca
Thu Aug 18 05:55:12 UTC 2005


I know my configuration is quite rare and not explicitly supported.
However, I have managed to compile everything without a warning, so I
suppose this is a good sign. Unfortunately, elephant fails ~71-75 of the
tests (depending on the run). I've tried to debug and find the cause of
these failures, and so far I've found that write-int randomly dies.
Here's an example:

CL-USER> my-bs
#S(SLEEPYCAT:BUFFER-STREAM
   :BUFFER #<SB-ALIEN-INTERNALS:ALIEN-VALUE :SAP #X00036898
            :TYPE (* (SIGNED 8))>
   :SIZE 2
   :POSITION 0
   :LENGTH 20)

CL-USER> (sleepycat::write-int (elephant::buffer-stream-buffer my-bs) 2 0)
2

CL-USER> (sleepycat::write-int (elephant::buffer-stream-buffer my-bs) 2 1)
; Evaluation aborted

Here's the error message:

bus error at #X405A13B4
   [Condition of type SIMPLE-ERROR]

Restarts:
  0: [ABORT] Abort handling SLIME request.
  1: [ABORT] Exit debugger, returning to top level.

Backtrace:
  0: (SB-UNIX::SIGBUS-HANDLER #<unavailable argument> #<unavailable argument> #.(SB-SYS:INT-SAP #XFFBFF6E8))
  1: ("foreign function: call_into_lisp")
  2: (SLEEPYCAT::WRITE-INT #<unavailable argument> #<unavailable argument> #<unavailable argument>)
  3: (SB-INT:EVAL-IN-LEXENV (SLEEPYCAT::WRITE-INT (SLEEPYCAT:BUFFER-STREAM-BUFFER MY-BS) 2 1) #<NULL-LEXENV>)

It really seems to be linked to the fact that the offset isn't 0. When I
generate another buffer-stream with GRAB-BUFFER-STREAM, I get the exact
same behaviour. Note that this does not happen with buffer-write-byte. Is
it my fault for running CVS? If so, can someone direct me to a usable
version? Or does it seem to be a problem linked to my platform? Again, if
so, how would I fix this?

EDIT: After more research, I've realized this is an alignment problem.
Basically, elephant is doing something that's illegal even in C: casting
to a type with stricter alignment requirements (from char* to int*, or
float*, etc). That alone usually isn't a problem in practice, since
malloc aligns its pointers. However, casting &(((char *) foo)[1]) to
int * is definitely a recipe for disaster on a RISC machine: the
resulting pointer is one byte off any 4-byte boundary, and SPARC traps
on misaligned word accesses, which is exactly the bus error shown above.
I'm not sure what the solution should be, or even whether it belongs at
the application or FFI level. However, it seems to me that providing a
portable implementation of the concerned functions would still be a good
idea, if it's doable.
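
For concreteness, here's a minimal sketch of what such a portable
replacement could look like. The function name is mine, the buffer is
assumed to be the (* (signed 8)) alien shown above, and I've assumed a
big-endian 4-byte layout (SPARC's native order):

(defun write-int-bytewise (buf num offset)
  "Store the 32-bit integer NUM at byte OFFSET using only byte
stores, so that no aligned word access is ever issued."
  (dotimes (i 4)
    (let ((byte (ldb (byte 8 (* 8 (- 3 i))) num)))
      ;; (signed 8) elements must lie in -128..127
      (setf (sb-alien:deref buf (+ offset i))
            (if (> byte 127) (- byte 256) byte))))
  num)

Byte stores are aligned by definition, so this works at any offset; the
cost is four stores instead of one, which should only matter on the hot
path.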

Thank you,

 Paul Khuong

PS. It might be a good idea to have a flag to conditionalize all the
(optimize) declares. That would help immensely when debugging.
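
Something as simple as a feature check could do. A minimal sketch,
assuming a user-pushed :elephant-debug feature (the feature name is
hypothetical, not something elephant currently reads):

;; (push :elephant-debug *features*) before compiling elephant
(declaim #+elephant-debug (optimize (speed 1) (safety 3) (debug 3))
         #-elephant-debug (optimize (speed 3) (safety 1) (debug 1)))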


