[pro] write-char vs. 8-bit bytes

Pascal J. Bourguignon pjb at informatimago.com
Fri Apr 11 11:02:23 UTC 2014


Antoniotti Marco <antoniotti.marco at disco.unimib.it> writes:

> On Apr 10, 2014, at 16:31 , Paul Tarvydas <paultarvydas at gmail.com> wrote:
>
>> I'm using SBCL to WRITE-CHAR a 16-bit unsigned integer to a socket as two separate unsigned 8-bit bytes; for example, 141 should appear as
>> 
>> #x00 #x8d.
>> 
>> SBCL appears to convert the #x8d into a two-byte UTF-8 character, resulting in 3 bytes being written to the stream:
>> 
>> #x00 #xc2 #x8d.
>> 
>> What is the proper incantation to achieve this?  (SBCL on Windows, if that matters).
>
> It may not be very helpful, but the “right incantation” would be to
> write a CDR that specifies the behavior of implementations that deal
> with UTF* and Unicode.

No, not in this case.
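
For reference, the usual way to get raw octets onto a stream in
Common Lisp is to use a binary stream (element type (UNSIGNED-BYTE 8))
and WRITE-BYTE rather than WRITE-CHAR, so that no external-format
character encoding is applied. A minimal sketch, assuming such a
binary socket stream is available; the helper name WRITE-U16-BE is
illustrative, not from the original post:

    ;; Write VALUE (0..65535) to STREAM as two big-endian octets.
    ;; STREAM is assumed to be a binary stream of (unsigned-byte 8).
    (defun write-u16-be (value stream)
      (write-byte (ldb (byte 8 8) value) stream)   ; high octet, e.g. #x00
      (write-byte (ldb (byte 8 0) value) stream))  ; low octet,  e.g. #x8d

    ;; Example: (write-u16-be 141 socket-stream)
    ;; puts exactly the bytes #x00 #x8d on the wire, no encoding involved.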


-- 
__Pascal Bourguignon__
http://www.informatimago.com/
"Le mercure monte ?  C'est le moment d'acheter !"



