AudioUnit Identifier string, osTypeToString


Hi, I’m trying to load the Ambience Reverb from Smart Electronix in my app, and there seems to be a problem creating the identifier.

When the code reaches osTypeToString in AudioUnitFormatHelpers, it hits this assertion:

[code]/* If you get an assertion here, then you’re trying to create a string from 8-bit data
   that contains values greater than 127. These can NOT be correctly converted to unicode
   because there’s no way for the String class to know what encoding was used to
   create them. The source data could be UTF-8, ASCII or one of many local code-pages.

   To get around this problem, you must be more explicit when you pass an ambiguous 8-bit
   string to the String class - so for example if your source data is actually UTF-8,
   you'd call String (CharPointer_UTF8 ("my utf8 string..")), and it would be able to
   correctly convert the multi-byte characters to unicode. It's *highly* recommended that
   you use UTF-8 with escape characters in your source code to represent extended characters,
   because there's no other way to represent these strings in a way that isn't dependent on
   the compiler, source code editor and platform.
*/
jassert (t == nullptr || CharPointer_ASCII::isValidString (t, (int) maxChars));[/code]
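For anyone else hitting this: the assertion fires because an OSType is just four raw bytes, and any byte above 127 is outside 7-bit ASCII, so JUCE's String can't know what encoding it should assume. A standalone sketch of the check the assertion is enforcing (plain C++, not JUCE code; isAsciiOSType is an illustrative helper, and 0x07C0BCD2 is the Ambience subtype discussed below):

```cpp
#include <cstdint>

using OSType = std::uint32_t;

// Unpack a four-byte OSType and check whether every byte is plain
// 7-bit ASCII -- the condition CharPointer_ASCII::isValidString enforces.
static bool isAsciiOSType (OSType type)
{
    for (int shift = 24; shift >= 0; shift -= 8)
        if (((type >> shift) & 0xffu) > 127u)
            return false;

    return true;
}

// 'aufx' is all ASCII; the Ambience subtype 0x07C0BCD2 is not.
static_assert (true, "");  // compiles standalone; see usage below
```

Usage: isAsciiOSType (0x61756678) (i.e. 'aufx') returns true, while isAsciiOSType (0x07C0BCD2) returns false because three of its bytes are above 127.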

The plugin does load into Logic and AU Lab. If you want to try it out, you can get the plugin here:



OK thanks, that’s just a false alarm, nothing to worry about. Easy to avoid - I’ll just change it to this:

[code]const String osTypeToString (OSType type)
{
    juce_wchar s[4];
    s[0] = (juce_wchar) (((uint32) type) >> 24);
    s[1] = (juce_wchar) (((uint32) type) >> 16);
    s[2] = (juce_wchar) (((uint32) type) >> 8);
    s[3] = (juce_wchar) ((uint32) type);
    return String (s, 4);
}[/code]


Thanks Jules, but now the first assert in createAUPluginIdentifier fails?


Yes… I’m surprised nobody spotted my deliberate mistake there (doh!). Sorry, it should be:

[code]const juce_wchar s[4] = { (juce_wchar) ((type >> 24) & 0xff),
                          (juce_wchar) ((type >> 16) & 0xff),
                          (juce_wchar) ((type >> 8)  & 0xff),
                          (juce_wchar) (type & 0xff) };[/code]

Will check in a correction right away!
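To see why the masks matter, here's a standalone sketch of the corrected conversion using std::u32string in place of JUCE's String/juce_wchar so it compiles outside JUCE (stringToOSType is a hypothetical inverse added just to show the round-trip; the real JUCE helpers may differ):

```cpp
#include <cstdint>
#include <string>

using OSType = std::uint32_t;

// Corrected conversion: masking each shifted value with 0xff keeps every
// character to a single byte, instead of leaking the higher bytes in.
static std::u32string osTypeToString (OSType type)
{
    const char32_t s[4] = { (char32_t) ((type >> 24) & 0xff),
                            (char32_t) ((type >> 16) & 0xff),
                            (char32_t) ((type >> 8)  & 0xff),
                            (char32_t) (type & 0xff) };
    return std::u32string (s, 4);
}

// Hypothetical inverse, to demonstrate that the conversion round-trips.
static OSType stringToOSType (const std::u32string& s)
{
    return ((OSType) (s[0] & 0xff) << 24)
         | ((OSType) (s[1] & 0xff) << 16)
         | ((OSType) (s[2] & 0xff) << 8)
         |  (OSType) (s[3] & 0xff);
}
```

With the unmasked version, s[3] for 'aufx' would have been the whole 32-bit value rather than just the last byte, which is what tripped the next assert.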


Ha thanks, yeah, I wish I could’ve spotted it… still much to learn.

The identifier that gets created has some odd resulting characters.


Maybe this plugin is just messed up?


[quote]The identifier that gets created has some odd resulting characters.[/quote]


Yes, but not if you use the fix I just suggested, right?


This is with the latest fix.


It can’t be… What’s the OSType value that you’re feeding it?


I believe the value is the componentSubType for the plugin. It comes in as this: \007\300\274\322

Edit: That’s the value as displayed by the Xcode debugger.
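Those debugger escapes are octal bytes: \007 \300 \274 \322 are 0x07, 0xC0, 0xBC, 0xD2. A quick sketch (illustrative only, not JUCE code) of how they pack into the 32-bit subtype, and why the identifier looks odd:

```cpp
#include <cstdint>

// Pack four bytes big-endian into a 32-bit OSType value.
constexpr std::uint32_t packOSType (std::uint32_t b0, std::uint32_t b1,
                                    std::uint32_t b2, std::uint32_t b3)
{
    return (b0 << 24) | (b1 << 16) | (b2 << 8) | b3;
}

// The octal bytes from the debugger form 0x07C0BCD2.
constexpr std::uint32_t ambienceSubType = packOSType (0007, 0300, 0274, 0322);
static_assert (ambienceSubType == 0x07C0BCD2, "octal bytes pack to this hex value");

// Three of the four bytes (0300 = 192, 0274 = 188, 0322 = 210) are above
// 127, i.e. outside 7-bit ASCII -- hence the "odd" characters in the
// identifier string.
static_assert (0300 > 127 && 0274 > 127 && 0322 > 127, "non-ASCII bytes");
```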


Well, yes… if you feed it non-ascii characters, what were you expecting it to look like?


Fair enough, I guess. I didn’t expect them to use non-ASCII characters for the componentSubType, but if that’s the case then so be it.

It loads now anyway, which is the main thing. Thanks!