[Ecls-list] unicode troubles
Арсений Заостровных
necto.ne at gmail.com
Sun Jul 22 21:29:19 UTC 2012
Hi all.
I'm using the latest release of ECL, 12.7.1. The question may not be about
ECL at all, but there is something I don't understand: what encoding does ECL
use by default when converting strings between Lisp and C? The nice project
http://createuniverses.blogspot.com/2009/09/qtimagine-skeleton-project-for-live.html
doesn't work on ECL with Unicode support enabled, because this code
cl_object princed = cl_princ_to_string(output); // output was "No error", for example
std::string outStr = (char*)princed->base_string.self; // outStr is just "N" now
no longer works correctly: base_string.self now contains a pointer to a
32-bit wchar_t Unicode string. (Is there any way to generate a plain 8-bit
ASCII string from a cl_object?)
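From skimming the ECL headers, I would guess something like the sketch below,
assuming si_copy_to_simple_base_string() is the right C entry point for
coercing to an 8-bit string and that fillp holds the string length (the
helper name to_std_string is my own):

#include <ecl/ecl.h>
#include <string>

// Coerce a Lisp string down to a simple 8-bit base string, then copy
// it out using the fill pointer for the length, since ECL strings are
// not guaranteed to be null-terminated. This may signal a Lisp error
// if the string contains characters outside the base-char range.
static std::string to_std_string(cl_object s)
{
    cl_object b = si_copy_to_simple_base_string(s);
    return std::string((char *)b->base_string.self,
                       (size_t)b->base_string.fillp);
}

// usage: std::string outStr = to_std_string(cl_princ_to_string(output));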
I tried to use QString to convert the wchar_t* into a std::string:
QString::fromWCharArray((wchar_t*)princed->base_string.self).toStdString()
but it doesn't work, because the string turned out not to be
null-terminated, and it pulls in a lot of garbage from memory. How should I
properly handle these strings? What is the most convenient way to convert
cl_objects into strings?
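If I read the object layout right, a Unicode build stores 32-bit code points
in string.self with the character count in string.fillp, so a manual
downcast could look like this (a sketch under that assumption; non-ASCII
characters are simply replaced):

static std::string from_extended(cl_object s)
{
    std::string out;
    // string.fillp is the character count; the buffer carries no
    // terminating null, so wcslen()-style scanning is unsafe here.
    for (cl_index i = 0; i < s->string.fillp; i++) {
        ecl_character c = s->string.self[i];
        out += (c < 128) ? (char)c : '?'; // crude ASCII fallback
    }
    return out;
}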
Furthermore, as an input string it requires an ASCII char* pointer, and
given a Unicode one it fails (it reads only the first character, because
that character is followed by 3 nulls: L"(print 1)" -> "(\000\000\000p\000\000\000...").
It looks quite weird: input comes in as char* but output goes out as
wchar_t*, and only in this combination.
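For the input side, I assume the API simply wants plain 8-bit source text,
e.g. via the c_string_to_object() and cl_eval() entry points, roughly like
this:

cl_object form = c_string_to_object("(print 1)"); // parse 8-bit source text
cl_object result = cl_eval(form);                 // evaluate the form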
The build with Unicode support disabled fails immediately on Windows 7 x64
with: Unsupported external format: latin-1. Does anyone know which code
page it accepts and works with?
--
Necto.