We use the following steps to take still images from a camera at about 10 FPS:
- Open the camera device
- Create a camera viewer and attach it to the visual tree
- In a timer callback every 100 ms, call `device->takeStillPicture` (unless we're still waiting for a previous call to call back); see the sketch after this list
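For reference, the loop is roughly the following. This is a minimal sketch: `StillGrabber` and its members are our own names, error handling is omitted, and on mobile you may need `CameraDevice::openDeviceAsync` rather than the synchronous `openDevice`.

```cpp
#include <JuceHeader.h>
#include <atomic>

// Minimal sketch of the capture loop. StillGrabber and its members are
// our own names; CameraDevice, Timer, etc. are JUCE APIs.
class StillGrabber : private juce::Timer
{
public:
    StillGrabber()
    {
        // 1. Open the camera device (index 0 = first camera).
        device.reset (juce::CameraDevice::openDevice (0));

        if (device != nullptr)
        {
            // 2. Create a viewer component; whoever calls getViewer()
            //    attaches it to the visual tree.
            viewer.reset (device->createViewerComponent());

            // 3. Poll every 100 ms, i.e. ~10 FPS.
            startTimer (100);
        }
    }

    ~StillGrabber() override { stopTimer(); }

    juce::Component* getViewer() const noexcept { return viewer.get(); }

private:
    void timerCallback() override
    {
        // Skip this tick if the previous takeStillPicture hasn't
        // called back yet.
        if (waitingForCallback)
            return;

        waitingForCallback = true;

        device->takeStillPicture ([this] (const juce::Image& image)
        {
            // Depending on the platform this may arrive on a background
            // thread, so keep the handler thread-safe.
            handleFrame (image);
            waitingForCallback = false;
        });
    }

    void handleFrame (const juce::Image&) { /* consume the frame */ }

    std::unique_ptr<juce::CameraDevice> device;
    std::unique_ptr<juce::Component> viewer;
    std::atomic<bool> waitingForCallback { false };
};
```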
On desktop this seems to work very well.
On iOS devices there are a few issues.
These are not easily reproducible, but I'm mentioning them in case anyone else has seen this:
- Sometimes the camera view fails and is just blank, but still images keep coming through.
- When switching cameras we sometimes get a `memoryWarning`, even though we are not allocating anything during that period. We do what the header comments recommend, i.e. open the new device before closing (deleting) the old one when switching (sketch below).
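For concreteness, the switch looks roughly like this, imagined as a member of the `StillGrabber` sketch above (`switchToDevice` is our own helper, not a JUCE API):

```cpp
// Open the replacement device before deleting the old one, as the
// CameraDevice header comments recommend.
void switchToDevice (int newDeviceIndex)
{
    std::unique_ptr<juce::CameraDevice> newDevice (juce::CameraDevice::openDevice (newDeviceIndex));

    if (newDevice != nullptr)
    {
        viewer.reset (newDevice->createViewerComponent());
        device = std::move (newDevice); // the old device is deleted only here
    }
}
```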
Another problem for us is that taking a still image plays the camera shutter sound on every capture. That is not going to work for us at all. I assume this is because the API is designed for taking single high-quality images at low frequency, as in a camera app, and camera apps are expected to play the sound for various (sometimes legal) reasons.
What we really need is a stream of images, not necessarily at full video quality; in fact, lower quality is better for us. Frames can be unreliably timed, dropped, etc., no problem. For example, some way to sample the camera view would work, as long as it works while the view is not visible (e.g. moved off screen somehow, but still existing). Something like the `startRecordingToFile` functionality, but where instead we poll a function like `getLastFrame` (sketched below). We tried rendering the component to an image, but that doesn't work: the internal paint function doesn't do anything on iOS.
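To make the request concrete, here is the rough shape we are after. `getLastFrame` is purely hypothetical (it does not exist on `CameraDevice`); the timer shape matches the capture loop above.

```cpp
// Wishlist sketch, as a member of the StillGrabber class above.
// getLastFrame() is HYPOTHETICAL: it does not exist in JUCE's CameraDevice.
void timerCallback() override
{
    // Hand back whatever frame the device last produced: low quality,
    // possibly stale, possibly a repeat; all fine for our use case.
    juce::Image frame = device->getLastFrame(); // hypothetical call

    if (frame.isValid())
        handleFrame (frame); // no shutter sound per frame
}
```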
Has anyone got any other experience with the camera on iOS?