I’m having difficulty building an M1 Mac version of my plug-in which works correctly. Windows VST3 works fine. Intel Mac AU & VST3 work fine. M1 Mac AU & VST3 both have these bugs:
- There is a rapid downward pitch sweep at the start of every note.
- LFOs which are synchronised to song tempo play different parts of their cycle at different rates.
Do I need to change my code or do something differently to build a working M1 plug-in, other than building it on an M1 machine and selecting “Any Mac” as the destination?
Do you use a safeguard against denormals?
juce::ScopedNoDenormals noDenormals;
I once ran into differences with how Intel and Arm CPUs handle denormals.
If I remember correctly, I also found this in code that is not executed via the audio callback, such as calculating filter responses.
If it works on Intel but these bugs manifest on ARM you probably have undefined behavior in your code. That basically means that even on Intel, it only works “by accident”. And it could also stop working anytime even on Intel when you make some unrelated changes.
Most likely you’ll have issues with thread safety, some kind of data race that only bites you on ARM. Maybe ThreadSanitizer could help find these issues.
In addition to the ARM architectural differences already mentioned, both problems you describe sound potentially related to sample rate differences. For example, a snippet of code which assumes a particular sample rate when calculating millisecond timing could cause exactly that sort of issue on a different sound card configuration.
Thank you for these answers.
@PeterEmanuelRoos,
I’m already using ScopedNoDenormals noDenormals, so it’s probably not that.
@hugoderwolf,
You could be onto something, although I don’t use JUCE’s Synthesizer class where the voices are separate objects which could presumably run in different threads. All my voices are calculated inside processBlock using an arrangement a bit like a hardware polysynth with a fixed number of voices. I assume that means it would only use the one thread for the audio? But I could have that completely wrong!
I had a go using ThreadSanitizer yesterday but couldn’t get anything useful out of it. (I do my development on Windows VS and don’t really understand Xcode) Following online advice, I set the ‘Run’ Scheme to include ThreadSanitizer and used ‘Build’, but after that I’m stuck. If I select ‘Run’ it tells me I can’t do that? If I just open Logic Pro, the (ThreadSanitized) AU crashes validation with a lot of messages related to ThreadSanitizer but nothing relating to my plug-in? Or am I supposed to do something different to access the ThreadSanitizer results?
@caustik
I don’t have any audio or MIDI interfaces plugged into my Macs, and they’re both set to the default 48 kHz rate, so I doubt it’s anything to do with sample rate. Also, I would have thought that would affect Intel Macs just the same?
I don’t use JUCE’s Synthesizer class where the voices are separate objects which could presumably run in different threads
The JUCE Synthesiser class (and basically anything regarding audio rendering in JUCE) also runs on a single thread (unless you go the extra mile and do the multithreading yourself, but then it’s better to run your own system anyway).
It just uses dynamic voice allocation so you can change the number of voices at runtime.
chrisboy2000, thank you for that clarification. From your answer it seems like my M1 problem isn’t likely to be a threading issue, although I’d still like to understand how to use ThreadSanitizer.
I hope you don’t mind me bumping this, but my efforts to solve this issue have got nowhere and only made me more confused:
- The ‘pitch drop’ I mentioned at the start of notes also seems to happen to other plug-ins. In Logic Pro, if I try using any of the sample-based plug-ins before their sample data has downloaded, they produce what sounds like a sine wave but with the same pitch drop as I’m getting from my plug-in - this makes no sense whatsoever! Is it possible there’s something wrong with my MacMini’s audio system?
- I was using AudioPlayHead::CurrentPositionInfo and MidiBuffer::Iterator, both of which produced deprecation warnings, so I thought I might as well go ahead and replace both of them, which I’ve now done. I get no deprecation warnings now, but it’s made no difference to the problems I mentioned.
Incidentally, the problem is exactly the same on AudioUnit and VST3.
I presume you aren’t able to share relevant portions of your code, or else you’d probably have done so already, but it’s still worth asking as it would make it a lot easier to help.
If you heard these weird phenomena with other plug-ins, that’s another story, obviously. Did you also try with other DAWs? If you bounce a project, is the rendered audio file similar to the real time playback (i.e. exhibiting the issue)?
Thank you mathieudemange, your reply inspired me to look into this a bit more methodically.
Okaaay, this is a bit embarrassing! So I recorded the output of my synth on my M1 MacMini in Reaper, exported it as a wav (because I couldn’t work out how to examine the waveform in Reaper or Logic Pro), networked it over to my PC, opened it in Audacity, normalised it, and examined the waveform. You probably won’t be surprised to hear that there is actually no pitch drop at all - every cycle is the same length in samples and looks perfect. So it seems the ‘blip’ I hear on my MacMini is just an artifact of its poor internal speakers? That explains why other plug-ins seemed to be affected, and it also explains why my old Intel iMac with decent internal speakers sounded fine. Doh!
So that’s one problem down, one still to fix.
Speakers on Apple devices sometimes do interesting things to transients, at least it sounds like they do. I never got around to trying to measure what they do, but it’s a fascinating effect that squeezes a lot of clarity out of those tiny speakers. Maybe that’s what you’ve been hearing?