CameraDevice issues on iOS

Hello everyone,

I’m working on a camera app for mobile devices and am trying to decide whether to replace my custom camera capture implementation for iOS with JUCE’s CameraDevice class.

I should point out that the goal is to do custom image processing on the stream of preview frames, which seems to be possible by implementing a CameraDevice::Listener, according to this thread: New feature: Camera support for iOS and Android
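To make this more concrete, here is a stripped-down sketch of what I'm currently doing (the class name PreviewListener and the processing step are just placeholders for my actual code):

```cpp
#include <JuceHeader.h>

// Listener that receives each preview frame, processes it, and pushes
// the result to an ImageComponent on the message thread.
struct PreviewListener : public juce::CameraDevice::Listener
{
    explicit PreviewListener (juce::ImageComponent& target) : output (target) {}

    void imageReceived (const juce::Image& image) override
    {
        auto processed = image.createCopy();   // <- custom per-frame processing would happen here

        juce::MessageManager::callAsync ([this, processed]
        {
            output.setImage (processed);       // triggers a repaint with the new frame
        });
    }

    juce::ImageComponent& output;
};

// Elsewhere, in the component that owns the camera (simplified):
//   std::unique_ptr<juce::CameraDevice> camera { juce::CameraDevice::openDevice (0) };
//   camera->addListener (&previewListener);
```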

So far I’ve encountered the following issues:

  1. Using a CameraDevice::Listener to obtain the images and an ImageComponent to draw them on screen (roughly as sketched above) is terribly slow on a three-year-old iPhone.
  2. The app requires microphone access, even though I only want to capture images.

Does anyone know a faster approach for 1), or a way to solve 2)?

Or would you recommend simply continuing with my native iOS implementation, where I can manipulate the raw sample buffer directly (and which runs very smoothly on the same iPhone)?

Best regards,
Fritz