Real-time video processing at high frame rates?

Does anyone know if it’s possible to enable the highest frame rates (like 120 fps) on iOS for real-time video processing? I’m not talking about recording here: just working with the frame images in real time as they come in (not storing them). And yes, I realize 8.3 ms is not a lot of time for processing…

If I look at juce::CameraDevice, I can see that openDeviceAsync lets me constrain the frame size (min/max width and height) and takes a bool useHighQuality parameter, but I don’t see any way to specify a frame rate. Or is that not possible even in the low-level iOS APIs? Since Apple offers 120 fps and 240 fps recording (even on my 1st-gen iPhone SE), I would think it should be possible to do real-time streaming processing on those images somehow?
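For reference, at the AVFoundation level (which is what the camera support sits on top of on iOS) the frame rate does appear to be configurable: you pick an AVCaptureDevice.Format whose videoSupportedFrameRateRanges reaches 120 fps, then pin activeVideoMin/MaxFrameDuration. A minimal, untested Swift sketch of that idea (the class name and queue label are my own placeholders, and error/permission handling is omitted):

```swift
import AVFoundation

final class HighFrameRateCapture: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "camera.frames")

    func start() throws {
        guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video, position: .back) else { return }
        let input = try AVCaptureDeviceInput(device: device)
        guard session.canAddInput(input) else { return }
        session.addInput(input)

        // Find a device format that supports at least 120 fps.
        guard let format = device.formats.first(where: { f in
            f.videoSupportedFrameRateRanges.contains { $0.maxFrameRate >= 120 }
        }) else { return }

        // Setting activeFormat directly switches the session to input-priority mode,
        // so the session preset won't override the chosen format.
        try device.lockForConfiguration()
        device.activeFormat = format
        let frameDuration = CMTime(value: 1, timescale: 120)   // 120 fps ≈ 8.33 ms/frame
        device.activeVideoMinFrameDuration = frameDuration
        device.activeVideoMaxFrameDuration = frameDuration
        device.unlockForConfiguration()

        let output = AVCaptureVideoDataOutput()
        output.alwaysDiscardsLateVideoFrames = true   // drop late frames instead of queueing them
        output.setSampleBufferDelegate(self, queue: queue)
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)

        session.startRunning()
    }

    // Called once per frame on `queue`; this is where the real-time processing would go.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // ... process pixelBuffer within the ~8.3 ms budget ...
        _ = pixelBuffer
    }
}
```

So the capability seems to be there in the OS; what I can’t tell is whether there’s any way to reach it through juce::CameraDevice without patching the JUCE iOS camera code.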

Related question:
How would you go about capturing both streaming video (frame images) and audio (audio buffers) simultaneously, in a synchronized way? The CameraDevice class apparently only provides a listener callback for the captured frame images, so would you need to open an AudioIODevice as well, time-stamp both streams against some wall clock, and sync them up yourself on a separate thread, so that you can process images together with their corresponding audio samples?
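If dropping to AVFoundation is acceptable here too, I believe this gets simpler: a single AVCaptureSession can carry both a camera and a microphone input, and both AVCaptureVideoDataOutput and AVCaptureAudioDataOutput deliver CMSampleBuffers stamped on the same capture clock, so pairing frames with audio becomes a presentation-timestamp comparison rather than hand-rolled wall-clock bookkeeping. A rough, untested sketch (names are my own):

```swift
import AVFoundation

final class SyncedAVCapture: NSObject,
                             AVCaptureVideoDataOutputSampleBufferDelegate,
                             AVCaptureAudioDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    // One serial queue for both outputs keeps delivery order simple to reason about.
    private let queue = DispatchQueue(label: "capture.av")

    func start() {
        session.beginConfiguration()

        if let cam = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back),
           let input = try? AVCaptureDeviceInput(device: cam), session.canAddInput(input) {
            session.addInput(input)
        }
        if let mic = AVCaptureDevice.default(for: .audio),
           let input = try? AVCaptureDeviceInput(device: mic), session.canAddInput(input) {
            session.addInput(input)
        }

        let video = AVCaptureVideoDataOutput()
        video.setSampleBufferDelegate(self, queue: queue)
        if session.canAddOutput(video) { session.addOutput(video) }

        let audio = AVCaptureAudioDataOutput()
        audio.setSampleBufferDelegate(self, queue: queue)
        if session.canAddOutput(audio) { session.addOutput(audio) }

        session.commitConfiguration()
        session.startRunning()
    }

    // Both delegate protocols declare the same method, so one implementation
    // receives video and audio buffers alike, each with a timestamp from the
    // shared capture clock.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        if output is AVCaptureVideoDataOutput {
            // enqueue frame keyed by pts
        } else {
            // enqueue audio buffer keyed by pts
        }
        _ = pts
    }
}
```

With JUCE’s own classes, as far as I can tell, you would indeed be stuck doing the CameraDevice-listener-plus-AudioIODevice timestamping dance yourself, which is why I’m wondering if anyone has a better approach.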