Using AudioDeviceManager for realtime filtering

Hi all, I am working on a standalone application that acts as a realtime filter between an input device and an output device. I would like to let the user choose a number of inputs on an input device and the same number of outputs on an output device, and the application should apply some effects to that audio in realtime.
I have subclassed AudioIODeviceCallback and implemented the relevant virtual functions, and at one point the filters were actually working: I could hear audio passing through, my spectrum analyser was updating, and my input level meter was showing activity.
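
For context, my callback class is roughly shaped like this. This is a trimmed-down sketch rather than my actual code: the class name and the simple pass-through body are placeholders, and it uses the newer audioDeviceIOCallbackWithContext signature (older JUCE versions use audioDeviceIOCallback without the context argument).

struct PassThroughFilter : public juce::AudioIODeviceCallback
{
    void audioDeviceAboutToStart (juce::AudioIODevice* device) override
    {
        sampleRate = device->getCurrentSampleRate();   // prepare filters etc. here
    }

    void audioDeviceStopped() override {}

    void audioDeviceIOCallbackWithContext (const float* const* inputChannelData,
                                           int numInputChannels,
                                           float* const* outputChannelData,
                                           int numOutputChannels,
                                           int numSamples,
                                           const juce::AudioIODeviceCallbackContext&) override
    {
        // Copy each input to the matching output; the real code applies effects here.
        for (int ch = 0; ch < numOutputChannels; ++ch)
        {
            if (ch < numInputChannels && inputChannelData[ch] != nullptr)
                juce::FloatVectorOperations::copy (outputChannelData[ch], inputChannelData[ch], numSamples);
            else
                juce::FloatVectorOperations::clear (outputChannelData[ch], numSamples);
        }
    }

    double sampleRate = 44100.0;
};
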
Then I started to play with getAudioDeviceSetup() and setAudioDeviceSetup() to let the user change the input and output devices as well as the selected channels. I must be doing something wrong here, because I am not getting any audio anymore, not even if I stick to the default audio devices.

It is important to mention that my callbacks are still working: I have checked them all with breakpoints, they get called, and they receive the right number of samples and the right number of channels, so it seems the main issue is that no audio is coming in from my inputs. (One of my default inputs is an instrument attached to an audio interface, which otherwise works fine, so I would immediately see and hear it if everything were working correctly.)

I have created a wrapper around AudioDeviceManager to keep all my calls to it in one convenient place and to make sure it is used consistently throughout the application. Here is the function I use to initialise devices; am I doing this the right way?

void initialise (const juce::String& deviceName, const juce::BigInteger& channels, bool isInput = true)
{
    // Start from the current setup and only change the side we were asked to.
    auto setup = deviceManager.getAudioDeviceSetup();

    if (isInput)
    {
        setup.inputDeviceName = deviceName;
        setup.inputChannels = channels;
        setup.useDefaultInputChannels = false;    // otherwise the channel mask is ignored
    }
    else
    {
        setup.outputDeviceName = deviceName;
        setup.outputChannels = channels;
        setup.useDefaultOutputChannels = false;
    }

    // setAudioDeviceSetup() returns an empty string on success.
    juce::String audioError = deviceManager.setAudioDeviceSetup (setup, defaultChosenDeviceState);

    jassert (audioError.isEmpty());

    if (audioError.isNotEmpty())
    {
        printf ("Audio device initialise error: %s\n", audioError.toRawUTF8());
        logMessage ("Audio device initialise error: " + audioError);
    }

    // Register the owning class as the audio callback.
    deviceManager.addAudioCallback (&owner);
}
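
For what it's worth, this is roughly how the wrapper gets called when the user picks a device (the device name and channel mask here are made up for the example):

// Hypothetical call site: enable the first two channels of a named input device.
juce::BigInteger inputChannels;
inputChannels.setRange (0, 2, true);                  // channels 0 and 1
audioWrapper.initialise ("My Audio Interface", inputChannels, true);
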

It may also be important to mention that I stopped calling deviceManager.initialise(), because I saw in one of the JUCE classes that it is not needed when the device manager is configured through the getAudioDeviceSetup()/setAudioDeviceSetup() pair shown above.
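
For comparison, the conventional call I dropped looks roughly like this (a sketch with a hard-coded two-in/two-out request and no saved XML state, not my exact code):

// AudioDeviceManager::initialise(): 2 ins, 2 outs, no saved state,
// fall back to the default device if the requested one is unavailable.
juce::String err = deviceManager.initialise (2,        // numInputChannelsNeeded
                                             2,        // numOutputChannelsNeeded
                                             nullptr,  // savedState (XmlElement*)
                                             true);    // selectDefaultDeviceOnFailure
jassert (err.isEmpty());
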

Update: I have rolled my code back to a stage where the above setup was still working, to see exactly what I was doing then. I think the main difference between that stage and the current one is that I did not have various async event handlers touching my AudioDeviceManager object; all the setup happened in one single function on one single thread, from beginning to end. This seems to matter mostly for the functions that end up calling AudioIODeviceCallback::audioDeviceAboutToStart() and AudioProcessor::prepareToPlay().
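
To illustrate what I mean, this is roughly where that call chain ends up (a sketch; "processor" is just a placeholder for whatever AudioProcessor the app owns):

// Called by the AudioDeviceManager when the device (re)starts; this is the point
// where the processing chain gets prepared with the real sample rate and block size.
void audioDeviceAboutToStart (juce::AudioIODevice* device) override
{
    const double sampleRate = device->getCurrentSampleRate();
    const int    blockSize  = device->getCurrentBufferSizeSamples();

    processor.prepareToPlay (sampleRate, blockSize);   // "processor" is a placeholder
}
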
At least for now I have two codebases: one does what I need, the other doesn't. When I find out more I will post it here.

Update 2: I have finally found the root of the problem: a single line in my CMakeLists.txt file caused my audio to stop working.

This is the line that muted my audio:
set (CMAKE_OSX_ARCHITECTURES arm64 x86_64)

And this fixes it (as I am on an Intel Mac):
set (CMAKE_OSX_ARCHITECTURES x86_64)

I’m not sure how it would work on an Apple silicon Mac though.

I have no idea why this was an issue, if anyone can explain it to me I would appreciate it.