JUCE for ARKit?

Hi folks, I've used JUCE for years now in my mixed reality live looper, which in its most recent incarnation was a networked experience spanning a HoloLens 2 and a Windows PC with an Azure Kinect. Worked great! But recent developments have left HoloLens kind of adrift, so I'm looking at jumping ship to the Apple ecosystem, specifically iOS and ARKit. Gotta be ready for their rumored mixed reality goggles in 2023!

My looper has had VST2 support for a long time, so it can run plugins like Polyverse's Manipulator. Ideally I'd like the ARKit version of my app to be able to run such plugins too. But has anyone actually shipped a JUCE app built on ARKit? Does JUCE integrate with ARKit's spatial audio mixing? Does ARKit itself expose enough of an audio API to allow external sound-effects processing? Are there any ARKit-based live-looping-style realtime recording + effects apps today, let alone ones using JUCE?

What if it were (say) an Unreal-based ARKit app instead? (My current app is Unity-based, but given the current Unity situation I'm looking at jumping off of that possibly-sinking ship too.)

I know none of this is strictly a JUCE question, but any information would be great to help me understand the space and what has and hasn't been done. Thanks very much!