New feature: Camera support for iOS and Android

With commit 772185f we have added support for camera devices on iOS (iOS 7+) and Android (API 21+). To enable camera support, switch JUCE_USE_CAMERA to Enabled in the juce_video module.
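Since JUCE_USE_CAMERA is an ordinary module config flag, you can also guard camera code with it at compile time. A minimal sketch (the DBG output is just illustrative):

#if JUCE_USE_CAMERA
    auto cameras = juce::CameraDevice::getAvailableDevices();
    DBG ("Found " + juce::String (cameras.size()) + " camera(s)");
#endif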

The DemoRunner app now enables the camera by default, so you can play with it on Mac, Windows, iOS and Android.

As with other mobile features, you need to request permission from the user to access the camera. On Android, you need to specify the camera permission in the Projucer settings.

Similarly, on iOS you need to enable the Camera Access Projucer option. There is an optional description you can set, explaining why you need camera permission; a default text is provided.

On top of the Projucer settings above, JUCE will automatically request the permissions from the user at runtime (you still need to enable the permissions in the Projucer).

To make existing code work on iOS and Android, only minimal changes are required on your end (a minimal sketch follows the list):

  • use openDeviceAsync() instead of openDevice() on all platforms. Async opening is required on iOS/Android, and on Mac/Windows it will simply call openDevice() for you. You can still use openDevice() on desktop, but it will not work on iOS/Android.
  • use takeStillPicture() to capture photos, passing a lambda that will be called to notify you when the photo has been taken. The old CameraDevice::Listener class has been removed.
  • provide a CameraDevice::onErrorOccurred lambda to be notified if any error occurs in a device. Usually, when an error occurs, the device will have to be closed and reopened.
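Putting those three points together, here is a minimal sketch of the new flow. It assumes JUCE_USE_CAMERA is enabled and the permissions above are set; the global cameraDevice and the function name openFirstCameraAndTakePhoto are just for illustration:

#include <memory>
#include <JuceHeader.h>  // or however the JUCE headers are included in your project

static std::unique_ptr<juce::CameraDevice> cameraDevice;

static void openFirstCameraAndTakePhoto()
{
    // Bail out if there is no camera on this device.
    if (juce::CameraDevice::getAvailableDevices().isEmpty())
        return;

    // Async opening works on all platforms; on Mac/Windows it simply wraps openDevice().
    juce::CameraDevice::openDeviceAsync (0, [] (juce::CameraDevice* device, const juce::String& error)
    {
        if (device == nullptr)
        {
            DBG ("Failed to open camera: " + error);
            return;
        }

        cameraDevice.reset (device);

        // Get notified about device errors; usually the device then needs to be
        // closed (deleted) and reopened.
        cameraDevice->onErrorOccurred = [] (const juce::String& err) { DBG ("Camera error: " + err); };

        // Take a photo; the lambda is called when the image is ready.
        cameraDevice->takeStillPicture ([] (const juce::Image& image)
        {
            DBG ("Captured a " + juce::String (image.getWidth()) + "x"
                               + juce::String (image.getHeight()) + " image");
        });
    });
}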

In case you encounter any issues, you can set the JUCE_CAMERA_LOG_ENABLED flag to get a lot of debug information, as well as info on camera capabilities.
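If you want to try that, one minimal way to do it (an assumption about your project setup: add the flag as a global preprocessor definition, e.g. in the Projucer's Preprocessor Definitions field, so that the juce_video module itself is rebuilt with it):

JUCE_CAMERA_LOG_ENABLED=1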

Finally, we realise that modern mobile devices support tons of features like capturing raw images, and that they allow fine control over various settings, e.g. focus, exposure, white balance and so on. Time permitting, we will add support for these, but we can’t promise any timeline at this point. With this increment the iOS and Android implementations have been brought up to the same capabilities as the Mac and Windows ones, which was already a big chunk of work and is a step towards further improvements and features.


Oh no!
We are using CameraListener to capture all frames and cannot sit and wait for them to get over to the message thread.
Please keep a version of the callback that is called for all images as an option for those of us doing video editing.


OK, I have resurrected CameraDevice::Listener so that you can still use it to process individual frames. You will get the callbacks on whichever thread the processing is done, the same as it used to be.
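For reference, a minimal sketch of such a frame-processing listener (the FrameProcessor struct and startProcessingFrames function are just illustrative names; the device is assumed to have been opened already):

struct FrameProcessor  : public juce::CameraDevice::Listener
{
    void imageReceived (const juce::Image& image) override
    {
        // Called for every captured frame, on whichever thread the platform
        // does its processing (not necessarily the message thread).
        DBG ("Frame: " + juce::String (image.getWidth()) + "x" + juce::String (image.getHeight()));
    }
};

static FrameProcessor frameProcessor;

void startProcessingFrames (juce::CameraDevice& device)
{
    device.addListener (&frameProcessor);   // start receiving frames
    // ... later, call device.removeListener (&frameProcessor) when you are done.
}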

For people wanting to just take a picture, takeStillPicture() is the preferred method, and it will always call your callback on the message thread.

The relevant change has now been pushed to develop in commit 2dd1a80.


Thank you, Lukasz!

Lukasz, the continuous capture doesn't work. The CameraDevice::Listener isn't called, except for the very first picture.
Tested on Mac.

Ah, schoolboy error, apologies! I will push a fix shortly.

Ok, the fix was pushed and it should appear on develop and master branches shortly.

Hi. I am trying to capture a sequence of frames from the video (iOS) and apply some image processing. For that, I am using the CameraDevice::Listener method. However, it is only being called once by the CameraDevice. It looks like the listener is automatically removed after the call. Am I doing something wrong? Is there an alternative way to capture a sequence of video frames?

I have a question relating to “closing” the camera device. There is no specific “close” function, so should I just delete the object? This is the approach I have used when capturing video with JUCE classes.

No problem with my former video capture code, but I am having a problem with a new app where I only need to capture a still image. What happens is that the code works perfectly the first time, but when called a second time, the “picturetaken_callback” is not called. I suspect it is because the device needs to be closed, but if I delete the device after use, I just get an exception. Is this a timing issue, because, asynchronously speaking, the device has not quite finished?

Some notes on the code below:
(1) TakeStillPhoto () is called within the main GUI thread. After setting results, it sets a waiting event, to tell the initiator of the photo, which is running in another thread, that it is OK to proceed.
(2) The code is for Windows, OSX and iOS, not Linux or Android, which is why I omit the extra steps needed for Android.
(3) I have commented each section of code to make it clear. Some proprietary functions and objects are used, but I think the intentions are clear enough.
(4) This is my first experience at using lambdas. I have nested one lambda inside another. Perhaps this is not the correct way to chain callbacks and that is the cause of a timing issue. Any lambda experts out there?

void TakeStillPhoto (ALV::String filepath, ErrorInfo*& err, Event* ready_event)
{
    err = nullptr;

    // check a camera is available
    juce::StringArray camera_array = juce::CameraDevice::getAvailableDevices();
    int numb_cameras = camera_array.size();
    if (numb_cameras == 0)
    {
        err = MakeError(_T("Sorry. No cameras available."));
        return;
    }

    // definition of callback function after camera is opened
    auto OpenCameraResultCallback = [filepath, &err, ready_event](juce::CameraDevice* camera_dev, const juce::String& errmsg) {

        // check no error opening the camera
        ALV::String msg = ALVSTR(errmsg);
        if (!msg.IsEmpty())
            err = MakeError(msg);
        else
        {
            // definition of callback function after picture is taken
            auto picturetaken_callback = [filepath, &err, ready_event](const juce::Image& image) {

                // stream captured image to file
                juce::File outfile(JUCESTR(filepath));
                juce::FileOutputStream out_stream(outfile);
                juce::JPEGImageFormat out_format;
                if (!out_format.writeImageToStream(image, out_stream))
                {
                    err = MakeError(_T("Could not write captured jpeg"));
                }

                // tell waiting thread that picture is ready for use or that an error has been set
                ready_event->Set();

            }; // picturetaken_callback

            // call function to take the still picture
            camera_dev->takeStillPicture(picturetaken_callback);
        }
    }; // OpenCameraResultCallback

    // Hardcoded to use camera at index 0 for this test app
    juce::CameraDevice::openDeviceAsync(0, OpenCameraResultCallback);
}

I have this working now and attach my code for anyone who needs to know how to do this. The problem was not so much my use of JUCE but my wrong beginner’s understanding of lambdas. From a very basic viewpoint, lambdas are glorified callback function pointers. However, under the hood they are actually implemented as objects. That means they have lifetime. The very simple examples in books instantiate a lambda locally in a function. That’s fine if all you are doing is, say, adding two numbers synchronously - at the end of the function your lambda object is needed no longer, goes out of scope and is destroyed. But in an asynchronous situation, such as the JUCE camera code, we need to ensure our lambda stays in scope until we are sure all asynchronous actions have completed.

In my solution, I am playing safe. There may be simpler ways of doing it, but at the moment I am more interested in working code that I can rely on. What I do is define a class “PictureTaker” which stores the two lambdas as std::function member objects, to preserve their lifetime. An object of the class is instantiated to create the lambdas, and a method is called to take the still photo and inform the photograph initiating code, via an event, that the photo is ready or there has been an error. Only then can a further call be made to delete the PictureTaker object, destroy the lambdas and delete the JUCE camera object.

class PictureTaker {
public:
    PictureTaker(String filepath, ErrorInfo*& err, Event* ready_event);
    ~PictureTaker();
    void TakePicture();

private:
    // callbacks
    std::function<void(juce::CameraDevice* camera_dev, const juce::String& errmsg)> openCameraCallback;
    std::function<void(const juce::Image& image)> pictureTakenCallback;

    // persistent data
    juce::CameraDevice* pCameraDev;  // store for deleting at appropriate time
    String filePath;                 // path to store image file
    ErrorInfo*& picErr;              // for setting an external error
    Event* readyEvent;               // for setting an external event when pic ready or there has been an error
};

static PictureTaker * s_PictureTaker = nullptr;

PictureTaker::PictureTaker(String filepath, ErrorInfo*& err, Event* ready_event) :
    pCameraDev(nullptr),
    filePath(filepath),
    picErr(err),
    readyEvent(ready_event)
{
    // Instantiate the callbacks. Note: the code is not actually called till later.

    openCameraCallback = [this](juce::CameraDevice* camera_dev, const juce::String& errmsg) {
        ALV::String msg = ALVSTR(errmsg);
        if (!msg.IsEmpty())
        {
            this->picErr = MakeError(msg);
            this->readyEvent->Set();
        }
        else
        {
            this->pCameraDev = camera_dev; // preserving for later delete
            camera_dev->takeStillPicture(this->pictureTakenCallback);
        }
    }; // openCameraCallback

    pictureTakenCallback = [this](const juce::Image& image) {
        juce::File outfile(JUCESTR(this->filePath));
        juce::FileOutputStream out_stream(outfile);
        juce::JPEGImageFormat out_format;
        if (!out_format.writeImageToStream(image, out_stream))
        {
            this->picErr = MakeError(_T("Could not write captured jpeg"));
        }
        this->readyEvent->Set();
    }; // pictureTakenCallback
}

PictureTaker::~PictureTaker() { delete pCameraDev ; }

void PictureTaker :: TakePicture() { juce::CameraDevice::openDeviceAsync(0, openCameraCallback); }

void BeginTakeStillPhoto (String filepath, ErrorInfo*& err, Event* ready_event)
// MUST be called from main gui thread so that JUCE functions are called from main gui thread
{
    err = nullptr;
    juce::StringArray camera_array = juce::CameraDevice::getAvailableDevices();
    int numb_cameras = camera_array.size();
    if (numb_cameras == 0)
    {
        err = MakeError(_T("Sorry. No cameras available."));
        ready_event->Set();
        return;
    }

    s_PictureTaker = new PictureTaker(filepath, err, ready_event);
    s_PictureTaker->TakePicture();
}

void EndTakeStillPhoto()
// Call this after you have received the photo or an error message
// MUST be called from main gui thread so that JUCE functions are called from main gui thread
{
    delete s_PictureTaker;
    s_PictureTaker = nullptr;
}

Hello. This new feature is really useful! On Android, the Camera API gives access to the flashlight. Is there a way to control the flashlight using the CameraDevice class? I am developing a kind of flashing metronome, so I need to turn the flash on and off without camera input. Thank you!

Hey there @OBO Hope you are well!

By any chance, would you mind explaining how you are starting the camera capture, setting frame rates, resolution, etc., please? I understand you are using the CameraListener to react to captured frames, but I’d love to know how you got to that point. An openAsync and listen?

Many thanks and best wishes,
Jeff