Can somebody shed some light on the connection between a Unicode code point and the int glyphNumber?
I am in the process of writing a SMuFL-compliant score renderer, since it is hard to write music theory software if you can’t visualise it properly.
SMuFL defines a code point for every glyph in a JSON file, so the Unicode code point is known.
I am using this method:

Path p;
typeface->getOutlineForGlyph (0x1e050, p);
However, I realise an additional mapping is necessary. By outputting tables of glyphs I could figure out an offset, but it is not consistent, neither within one font nor across fonts (I am comparing Bravura and Petaluma, both publicly available).
I have now found the font’s character map, but no information on how to access it, nor could I see anywhere in the JUCE code that it is used.
So how can I map from the SMuFL code point to the glyph number the Typeface expects?
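What I would naively try — assuming Typeface::getGlyphPositions consults the character map internally, which I have not been able to verify — is something like this (a sketch, not working code):

```cpp
// Hypothetical: build a one-character String from the SMuFL code point
// and ask the typeface which glyph index it maps to.
juce::String s = juce::String::charToString ((juce::juce_wchar) 0xE050);
juce::Array<int> glyphs;
juce::Array<float> xOffsets;
typeface->getGlyphPositions (s, glyphs, xOffsets);
// glyphs[0] would then (hopefully) be the number getOutlineForGlyph expects.
```

Is that the intended route, or is there a more direct API for this?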
Links I looked at:
I don’t really get how those would work… any hints, please?