Parsing JSON fails with escaped UTF

We have some large JSON structures and are parsing with: JSON::parse (const String& text, var& result)

Our JSON string sometimes contains Unicode characters, escaped in the \u0000 style.

But the JSON::parse method stops parsing once it hits this text. It returns var objects for everything up until that point but nothing after.

Is there some trick to allowing such text through the parser, or better yet, interpreting it as the original UTF-16 string?

"Parameters": {
                         "AvidASPIPlugInName": "Time Compression Expansion\u0000\u0000\ubead\ucf9c"

JUCE strings are all zero-terminated, so they can’t actually represent a string that contains a zero character. There’s no problem with the JSON parser, but we’d need to change the fundamental way that String works in order to make it possible to parse that particular piece of text.

Using strings that deliberately contain zeros is a bit of a weird design choice IMHO, and likely to run into problems in other libraries too. If you’re in control of the way the original JSON is generated, you should probably consider doing it differently.

There’s no problem with the JSON parser,

It is a bit of a problem, since JSON explicitly allows escaped null characters in strings, even if it's only a documentation problem (is there a "conformance" page somewhere in the docs?)

But it looks like the OP is doing some tricks to put metadata inside a valid UTF string that isn't seen when the data is parsed as a C string.

Since a C string is defined as \0-terminated, it is reasonable that juce::String will not represent that property correctly. What happens if you read it using

parameters.getProperty ("AvidASPIPlugInName").getBinaryData();

I haven’t checked the sources, but I would expect this variant to include the \u0000 characters.

Yes, I see the problem now. We’re not totally in control of the data, so will have to try and clean it up before it reaches juce::String.