Bug in font code


#1

This:

CustomTypeface::CustomTypeface (InputStream& serialisedTypefaceStream)
    : Typeface (String::empty)
{
    clear();
[...]
    for (i = 0; i < numChars; ++i)
    {
        const juce_wchar c = (juce_wchar) in.readShort();

Should read:

CustomTypeface::CustomTypeface (InputStream& serialisedTypefaceStream)
    : Typeface (String::empty)
{
[...]
    for (i = 0; i < numChars; ++i)
    {
        const juce_wchar c = (juce_wchar)(uint16) in.readShort();
// ditto for the next 4 lines 

I’ve stored a glyph for the character U+F6D6, but when it’s read back the sign is extended and it comes out as 0xFFFFF6D6 unless you prevent the sign extension.
BTW, shouldn’t the whole class be changed so that it stores unicode characters as 32-bit code points instead?


#2

Ah, thanks for that one, good catch!

Yes, it probably should all use 32-bit values, but that’d break compatibility with older stored typefaces. Might be possible to store the glyphs as UTF-16, though, without breaking anything…