[pro] write-char vs. 8-bit bytes
Antoniotti Marco
antoniotti.marco at disco.unimib.it
Thu Apr 10 15:05:42 UTC 2014
On Apr 10, 2014, at 16:31, Paul Tarvydas <paultarvydas at gmail.com> wrote:
> I'm using sbcl to write-char a 16-bit unsigned integer to a socket as two separate unsigned 8-bit bytes, for example 141 should appear as
>
> #x00 #x8d.
>
> SBCL appears to convert the #x8d into a two-byte utf-8 char, resulting in 3 bytes written to the stream
>
> #x00 #xcd #x8d.
>
> What is the proper incantation to achieve this? (SBCL on Windows, if that matters).
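For what it's worth, the usual workaround is to bypass the character layer entirely: ask the socket library for an octet stream (:ELEMENT-TYPE '(UNSIGNED-BYTE 8)) and use WRITE-BYTE instead of WRITE-CHAR, so no external format is applied. A minimal sketch, assuming a usocket client connection (the host and port are placeholders); SB-BSD-SOCKETS:SOCKET-MAKE-STREAM accepts the same :ELEMENT-TYPE argument if you are using the SBCL socket layer directly:

  (ql:quickload "usocket")   ; assumes Quicklisp is available

  (defun write-u16-be (value stream)
    "Write VALUE as two unsigned 8-bit bytes, most significant byte first."
    (write-byte (ldb (byte 8 8) value) stream)
    (write-byte (ldb (byte 8 0) value) stream))

  ;; On an octet stream there is no encoding step, so 141 goes out as
  ;; exactly #x00 #x8D.
  (usocket:with-client-socket (socket stream "localhost" 4000
                                      :element-type '(unsigned-byte 8))
    (write-u16-be 141 stream)
    (force-output stream))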
It may not be very helpful, but the “right incantation” would be to write a CDR (a Common Lisp Document Repository entry) that specifies the behavior of implementations that deal with UTF-* and Unicode.
Any takers?
Cheers
—
MA