High CPU usage on OpenGL repaints/renderFrame no matter what, even on an unmodified default project

On a project I’m working on which uses JUCE’s OpenGLRenderer, we’ve been struggling with high CPU usage from frequent repaints. However, after attempting to isolate the most problematic/intensive repaints, it turned out that any call to OpenGLContext::renderFrame() in our project causes severe spikes in CPU usage if called frequently, even when it’s triggered by repainting a tiny, single-colour rectangle on a small opaque component.
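For reference, the kind of component I mean is as trivial as this (a hypothetical minimal repro with made-up names, not our actual code):

```cpp
// Hypothetical minimal repro: a tiny opaque component that just fills
// itself with one colour, with an OpenGLContext attached. Repainting
// this frequently is enough to spike CPU usage on our machines.
#include <JuceHeader.h>

class TinyGLComponent : public juce::Component
{
public:
    TinyGLComponent()
    {
        setOpaque (true);
        setSize (20, 20);           // tiny on purpose
        context.attachTo (*this);   // all painting now goes through GL
    }

    ~TinyGLComponent() override { context.detach(); }

    void paint (juce::Graphics& g) override
    {
        g.fillAll (juce::Colours::red);  // a single-colour rectangle, nothing more
    }

private:
    juce::OpenGLContext context;
};
```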

So I did some further investigation and found that even the default “OpenGL Application” project has this exact same issue. Simply create a new ‘OpenGL Application’ project with Projucer, don’t add or change anything in the project, just compile it in release mode and run it, and you should notice right off the bat that it’s using a high amount of CPU.

For me this default black-window project was using up to 40% of my CPU, and keep in mind I’m running an i5 6600K overclocked to 4.4 GHz.

By default, OpenGL projects are configured with setContinuousRepainting set to true. If we turn this off and control the repaint frequency manually, something even more interesting happens. Adding a timer to the project and calling repaint() every 10 ms causes a similar performance hit, as does calling repaint() every 15 ms. Strangely though, at least on my computer, calling repaint() every 16 ms instead dramatically reduces CPU usage, from around 35% down to 2-3%. I have no idea why this would be, but it does not seem normal for a one-millisecond difference in timing to have such a dramatic effect on CPU usage. A sketch of the timer setup I used is below.
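This is roughly how I set up the timer test (again a sketch with made-up names, not our exact code):

```cpp
// Sketch of the timer experiment: continuous repainting off, repaints
// driven manually by a juce::Timer. With startTimer (10) or startTimer (15)
// CPU usage spikes; with startTimer (16) it drops to 2-3% here.
#include <JuceHeader.h>

class PacedGLComponent : public juce::Component,
                         private juce::Timer
{
public:
    PacedGLComponent()
    {
        context.setContinuousRepainting (false); // pace repaints ourselves
        context.attachTo (*this);
        startTimer (16);                         // 16 ms is the "magic" interval for me
    }

    ~PacedGLComponent() override { context.detach(); }

    void paint (juce::Graphics& g) override
    {
        g.fillAll (juce::Colours::black);
    }

private:
    void timerCallback() override { repaint(); } // each repaint ends up in a GL frame render

    juce::OpenGLContext context;
};
```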

This issue happens on both the develop branch and the master branch. I’m on a Windows 10 PC compiling with VS2017, but I have the same issue on a Windows 8 PC compiling with VS2015, and I’ve heard there are similar issues on Mac when compiling via Xcode.

Let me know if you need any further information, thanks!

What you’re seeing is JUCE doing a crazy amount of work on the CPU before passing any data to the GPU via OpenGL.

As I indicate in this post and this post, JUCE’s OpenGL wrappers approach GPU drawing by rendering everything into one texture after breaking it all down into scanlines using a juce::EdgeTable.
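To make that concrete, here’s a rough illustration of the CPU-side decomposition. This is only a sketch of the idea, not the renderer’s actual code, and it assumes the iterate() callback interface documented in the JUCE 5 headers:

```cpp
// Illustration only: constructing a juce::EdgeTable from a Path performs the
// CPU-side scanline decomposition described above. Every shape gets flattened
// into horizontal spans like these before the GL renderer uploads anything.
#include <JuceHeader.h>

struct SpanCounter
{
    int spans = 0;

    void setEdgeTableYPos (int)              {}          // moved to a new scanline
    void handleEdgeTablePixel (int, int)     { ++spans; } // single anti-aliased pixel
    void handleEdgeTablePixelFull (int)      { ++spans; } // single fully opaque pixel
    void handleEdgeTableLine (int, int, int) { ++spans; } // anti-aliased horizontal run
    void handleEdgeTableLineFull (int, int)  { ++spans; } // fully opaque horizontal run
};

static int countSpans()
{
    juce::Path path;
    path.addEllipse (0.0f, 0.0f, 100.0f, 100.0f);

    // Flatten the path into scanline spans on the CPU.
    juce::EdgeTable table (juce::Rectangle<int> (0, 0, 100, 100),
                           path, juce::AffineTransform());

    SpanCounter counter;
    table.iterate (counter);
    return counter.spans;
}
```

All of that work happens per repaint, before any data reaches the GPU, which is why trivial-looking frames can still be expensive.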


Try profiling with Very Sleepy; it should quickly show you what’s going on (and it may be the driver spinning if you repaint too fast).