I would like to bring up an issue we encountered with camera management on Mac OS X.
We are using a camera device in our app; our need is to process the images we receive and draw the result.
What we do, as the JUCE demo's snapshot example shows, is copy the received image into a member variable and process it in the handleAsyncUpdate() function.
The issue is that no matter how we do this, the resulting video is severely delayed.
We did a little experiment in the JUCE demo: we simply commented out the line that removes the listener in the imageReceived() function, expecting that pressing the snapshot button would then display every new frame. But in fact the same thing happens, and we see the video with a very large delay.
By investigating the JUCE code, we found that it uses QTCaptureDecompressedVideoOutput, which, according to Apple's documentation, is intended for high-quality processing: it does not drop frames, and instead builds up a queue of pending images. That is clearly a problem for us, since we need the most up-to-date image at every imageReceived() callback, not queued images.
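To illustrate the behaviour we are after (independent of which QTKit output class is used), here is a minimal sketch of a single-slot "latest frame" buffer: the capture callback always overwrites the slot, so unconsumed frames are implicitly dropped rather than queued. The `Frame` type and class name are hypothetical stand-ins, not part of JUCE or QTKit:

```cpp
#include <mutex>
#include <optional>
#include <utility>

// Hypothetical frame type standing in for a captured image (e.g. juce::Image).
struct Frame { int sequenceNumber = 0; };

// Single-slot buffer: the producer (camera callback) always overwrites,
// so the consumer only ever sees the most recent frame, and any stale,
// unconsumed frame is dropped instead of queued.
class LatestFrameSlot
{
public:
    void push (Frame f)               // called from the capture callback
    {
        std::lock_guard<std::mutex> lock (mutex);
        slot = std::move (f);         // overwrite any unconsumed frame
    }

    std::optional<Frame> take()       // called from e.g. handleAsyncUpdate()
    {
        std::lock_guard<std::mutex> lock (mutex);
        auto f = std::move (slot);
        slot.reset();                 // empty until the next push
        return f;
    }

private:
    std::mutex mutex;
    std::optional<Frame> slot;
};
```

With QTCaptureDecompressedVideoOutput the frames pile up upstream of any such buffer, which is why the delay appears no matter what we do on our side.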
QuickTime offers QTCaptureVideoPreviewOutput as an alternative to QTCaptureDecompressedVideoOutput; when we modified the JUCE code to use it instead, everything ran fine.
I know you are working on moving to AV Foundation, so this is a suggestion for future releases: it would be really useful if JUCE let the user choose between these two outputs, since that would allow applications that need frame dropping to enable it.