What happens in-between plugins in different DAWs?

Hey everyone,

I have been experimenting with LSB steganography (https://www.slideshare.net/mohabshishtawy/audio-steganography-lsb). The process writes a code into the least significant bits of an integer word of length n. This is then scaled down so the whole integer signal fits into the [-1, 1] range and is written to the audio buffer provided by JUCE (float precision). A second plugin placed right after the first in the audio effect chain picks up this signal and decodes the original code from the low-amplitude signal.
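For concreteness, here is a minimal sketch of how that scheme works; the helper names and the exact scaling are my own assumptions, not the original code, and it assumes n ≤ 24 so the integer word survives a single-precision float:

```cpp
#include <cmath>
#include <cstdint>

// Hedged sketch (names are mine): the code occupies the k least significant
// bits of an n-bit integer sample word, which is then scaled down to a small
// float amplitude for the audio buffer.

float embed (uint32_t sampleWord, uint32_t code, int n, int k)
{
    const uint32_t mask = (1u << k) - 1u;
    uint32_t word = (sampleWord & ~mask) | (code & mask); // stuff code into LSBs
    return (float) (word / std::ldexp (1.0, n));          // scale by 2^-n into [0, 1)
}

uint32_t extract (float sample, int n, int k)
{
    // rescale back to the integer word and strip off the LSBs
    uint32_t word = (uint32_t) std::llround ((double) sample * std::ldexp (1.0, n));
    return word & ((1u << k) - 1u);
}
```

With n ≤ 24 (a float's significand width) the scaled word is represented exactly, so `extract (embed (...))` returns the code unchanged as long as nothing between the two plugins touches the samples.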

I tested this in different DAWs, varying the encoding length n dynamically. It was surprising how early, and how differently, each DAW scrambles the data into unreadability. For example:
– Pro Tools: solid code transmission up to ~60 bits of precision. Even at 64 bits the occasional code still decoded correctly, which made me suspect I was running into float precision here. It is surprising that even though JUCE defines the audio buffer as float, such a tiny encoding amplitude still comes across correctly.
– Reaper: starts breaking down above ~40 bits of precision
– Ableton Live: maximum precision is 17 bits; anything finer is already unreadable at the receiving end
– Cubase: anything above 12 bits is already altered, so the receiving plugin cannot detect any code anymore
– Max/MSP: I copied my code into a Max external and, interestingly, was able to go up to a precision of 1023 bits (!!) before the code transmission between two objects broke down.

Has anyone else had similar experiences? Does this mean Pro Tools has the cleanest audio handling? Why is the audio buffer no longer exactly the same from one plugin to the next? What are DAWs doing in between plugins? At a precision of 17 bits we are nowhere near the range of float precision errors, so something else must be going on here.
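To put a number on that claim: a single-precision float carries 24 significand bits, so a round-trip of an n-bit code through a float should survive up to about n = 24 and fail beyond it. A small sketch (helper names are mine, not the original code):

```cpp
#include <cmath>
#include <cstdint>

// Sketch: scale an n-bit integer code down by 2^-n into a float, then
// rescale and round to recover it. Fails once n exceeds the float's
// 24-bit significand.

float encode (uint64_t code, int n)
{
    return (float) (code / std::ldexp (1.0, n));  // code * 2^-n
}

uint64_t decode (float sample, int n)
{
    return (uint64_t) std::llround ((double) sample * std::ldexp (1.0, n));
}

bool roundTrips (uint64_t code, int n)
{
    return decode (encode (code, n), n) == code;
}
```

Under this model a 17-bit or 12-bit ceiling cannot be float rounding, which supports the suspicion that the host itself is touching the samples between plugins.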

Just for clarification: I ran another test, first writing and then reading the code within the same plugin and the same callback (i.e. the same audio buffer). Sure enough, running this new plugin in Ableton I was able to read the code at 32 bits of precision. So something must be happening internally between one plugin writing the code and the next plugin reading it.
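For what it's worth, that same-buffer sanity test can be simulated outside any host along these lines (a sketch with my own names; n = 20 here just to stay comfortably inside single-float precision):

```cpp
#include <cmath>
#include <cstdint>
#include <vector>

// Sketch of the same-buffer test: write the scaled code into every sample
// of a float buffer, then read it straight back from the same buffer, as
// if writer and reader ran in the same callback.

std::vector<uint32_t> writeThenRead (uint32_t code, int n, int numSamples)
{
    std::vector<float> buffer (numSamples);
    const double scale = std::ldexp (1.0, n);  // 2^n

    for (auto& s : buffer)                     // the "writer" pass
        s = (float) (code / scale);

    std::vector<uint32_t> decoded;             // the "reader" pass, same buffer
    for (float s : buffer)
        decoded.push_back ((uint32_t) std::llround ((double) s * scale));

    return decoded;
}
```

In this isolated setting every sample decodes back to the original code, which is consistent with the in-plugin result: the loss only appears once the buffer passes through the host.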


Perhaps some denormal prevention? Or your scaling is losing precision in some cases? I suppose you could try just comparing actual (random) audio leaving one plug-in with what is read on entry into the next plug-in in the chain. That would at least tell you whether information is lost in the chain independently of your scaling operations. (Not sure if it matters, since I don't know the details, but you might also check if your scaled data fits within the buffer size you're being given in each host, in case that's an issue.)
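A sketch of the comparison I mean, assuming you can capture both buffers (helper names are made up): compare the samples bit for bit rather than with `==`, so sign flips on zero, denormal flushing, and dithering all show up:

```cpp
#include <cstdint>
#include <cstring>

// Bit-exact comparison of two float buffers. memcmp / memcpy look at the
// raw bit patterns, so differences invisible to == (e.g. -0.0f vs 0.0f)
// are still detected.

bool bitExactEqual (const float* a, const float* b, int numSamples)
{
    return std::memcmp (a, b, numSamples * sizeof (float)) == 0;
}

int firstMismatch (const float* a, const float* b, int numSamples)
{
    for (int i = 0; i < numSamples; ++i)
    {
        uint32_t bitsA, bitsB;
        std::memcpy (&bitsA, &a[i], sizeof bitsA);
        std::memcpy (&bitsB, &b[i], sizeof bitsB);
        if (bitsA != bitsB)
            return i;   // index of the first altered sample
    }
    return -1;          // buffers are bit-identical
}
```

Note that -0.0f compares equal to 0.0f with `==` but differs bitwise, which is exactly the kind of host-side touch-up you would want to catch here.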