Screenshot of OpenGL renderer


Hey guys,
I want to save a screenshot of what is being drawn by the OpenGL renderer, but I don’t know where to begin. I tried createComponentSnapshot, but it only renders the components, not the OpenGL content, into the image. While reading up on this I came across glReadPixels, but I don’t know how to convert its output into a juce::Image.

Any help would be greatly appreciated


Just wanted to document this. I had to do two steps:

  1. In buttonClicked, let the user select the file to save into, then set a shouldSaveScreenshot flag to true so the screenshot can be taken in the OpenGL render method
  2. Call the saveScreenshot function from the rendering method, so that a current OpenGL context exists. Create a framebuffer-backed image and render the scene into it

Don’t know if there’s a more elegant way, but I solved it. Hope this helps someone in the future :oops:

void OpenGLCanvas::saveScreenshot()
{
    // Step 1. Render the scene into an OpenGL-backed image.
    Image snapshotImage (OpenGLImageType().create (Image::ARGB, getWidth(), getHeight(), true));
    OpenGLFrameBuffer* buffer = OpenGLImageType::getFrameBufferFrom (snapshotImage);

    buffer->makeCurrentRenderingTarget();
    {
        LeapUtilGL::GLMatrixScope sceneMatrixScope;
        // ... set up the matrices and draw the scene here, as in renderOpenGL() ...
    }
    buffer->releaseAsRenderingTarget();

    // Step 2. Save to output
    File imageFile (m_fileToSaveScreenshot);

    // Delete the file if it exists & write the new data
    if (imageFile.existsAsFile())
        imageFile.deleteFile();

    FileOutputStream outputFileStream (imageFile);
    PNGImageFormat imageFormatter;
    imageFormatter.writeImageToStream (snapshotImage, outputFileStream);
}
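
The buttonClicked side of step 1 isn’t shown above; a minimal sketch of how the file and flag can be wired up might look like this (the member names and the synchronous FileChooser usage are my assumptions, not the original poster’s code):

```cpp
// Hypothetical sketch of step 1: pick a target file on the message thread,
// then raise a flag that the GL render callback checks.
void OpenGLCanvas::buttonClicked (juce::Button*)
{
    juce::FileChooser chooser ("Save screenshot", {}, "*.png");

    if (chooser.browseForFileToSave (true))   // true = warn before overwriting
    {
        m_fileToSaveScreenshot = chooser.getResult().getFullPathName();
        shouldSaveScreenshot = true;          // picked up by renderOpenGL()
    }
}

void OpenGLCanvas::renderOpenGL()
{
    // ... normal scene drawing ...

    if (shouldSaveScreenshot)
    {
        shouldSaveScreenshot = false;
        saveScreenshot();   // safe here: the GL context is current on this thread
    }
}
```

The point of the flag is simply to defer the capture to the GL thread, where a current context is guaranteed.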



Hello Jucers!

I know it’s an old topic, but I ran into the same situation and can’t figure out how to create a juce::Image from the result of a renderOpenGL() callback.

@andi’s solution does not work for me, it fails on buffer->releaseAsRenderingTarget() with GL_INVALID_OPERATION.

Any ideas appreciated! Thanks!


It was my fault: the 3D engine I use (Chai3D) has a flag that tells it when I’d like to render into an already-bound framebuffer. Now it works almost fine.

Maybe I did something wrong, but there is an issue with transparency and aliasing. I’ve attached three PNGs: one is a screenshot, another is the captured frame, and the third is the same captured frame with a grid layer added under it in Photoshop.

I think it’s something with the Image’s underlying Context / PixelFormat / FrameBuffer, but I can’t access them. The context for the renderer looks like this:

OpenGLPixelFormat format;
format.multisamplingLevel = 8;
format.depthBufferBits = 16;
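
In case it helps future readers, a pixel format like this would typically be applied before the context is attached; a sketch (assuming `openGLContext` is a juce::OpenGLContext member of the renderer component) might look like:

```cpp
// Sketch: configure the context before attaching it to the component.
// setPixelFormat() has no effect on a context that is already attached.
OpenGLPixelFormat format;
format.multisamplingLevel = 8;
format.depthBufferBits = 16;

openGLContext.setPixelFormat (format);
openGLContext.setMultisamplingEnabled (true); // needed for multisampling to take effect
openGLContext.setRenderer (this);             // this component implements OpenGLRenderer
openGLContext.attachTo (*this);
openGLContext.setContinuousRepainting (true);
```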


And this is the code I used. I removed all the code not related to frame capture; rendering directly to the context above works just fine (screenshot.png).

void Renderer::newOpenGLContextCreated()
{
    frameImage = Image (Image::PixelFormat::ARGB, getWidth(), getHeight(), true, OpenGLImageType());
    frameBuffer = OpenGLImageType::getFrameBufferFrom (frameImage);
}

void Renderer::renderOpenGL()
{
    if (wasResized) // Is this necessary?
    {
        frameImage = Image (Image::PixelFormat::ARGB, getWidth(), getHeight(), true, OpenGLImageType());
        frameBuffer = OpenGLImageType::getFrameBufferFrom (frameImage);
        wasResized = false;
    }

    camera->renderView (getWidth(), getHeight(), 0, C_STEREO_LEFT_EYE, false);

    File temp ("/Users/MyUser/Desktop/image.png");
    if (temp.existsAsFile())
        temp.deleteFile();

    FileOutputStream stream (temp);
    PNGImageFormat format;
    format.writeImageToStream (frameImage, stream);
}

What should I do to achieve the same result as screenshot.png? And another important question: how can I do it offscreen? In my experience, renderOpenGL() is only called when the renderer is added to the display list.