Custom Graphics Backend

How can I integrate custom graphics in JUCE? I’m looking at using bgfx to render my graphics in an audio plugin, but I can’t figure out a place to start to do this.

I’ve looked at bgfx before, seems very promising but a pain in the ass to get started with (on mac at least).

If you want to implement your own backend for juce::Graphics then have a look at JUCE: LowLevelGraphicsContext Class Reference. You may want to look at some of the native implementations (Direct2D on Windows and CoreGraphics on macOS) to get an idea of how to use it, and how to adapt it to bgfx.

Or you could have a look at how OpenGL works with JUCE, which IIRC doesn’t use a low level context and instead creates an OpenGL window on top of the JUCE window.
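For comparison, this is roughly what the OpenGL route looks like: JUCE's juce::OpenGLContext attaches a native GL surface over a component rather than going through a LowLevelGraphicsContext. A minimal sketch using the real JUCE API:

```cpp
#include <juce_opengl/juce_opengl.h>

class GLComponent : public juce::Component
{
public:
    GLComponent()
    {
        // Creates a native GL surface layered over this component
        context.attachTo (*this);
        context.setContinuousRepainting (true); // repaint on the GL thread
    }

    ~GLComponent() override
    {
        context.detach(); // must detach before the component is destroyed
    }

private:
    juce::OpenGLContext context;
};
```

A custom backend could follow the same "native surface on top" pattern instead of implementing the full graphics context interface.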

However, if you really want to get the most out of bgfx, I’d probably not bother with any of the JUCE graphics stuff and just use bgfx directly.


Using pure bgfx was my plan, but I’m unaware of how JUCE handles the graphics contexts. Would I need to override some sort of Open or Close method directly and create the graphics context there? I’m coming over from iPlug2 where this was as trivial as overriding a few functions.

I guess it should work like this: if you want to draw over some window or component, just call Component::getWindowHandle and assign that handle to bgfx::PlatformData.nwh. I have not used it with bgfx yet, but I was doing that with other custom rendering engines.
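A rough sketch of that handoff, assuming the component already owns a native window so getWindowHandle() returns something valid (the helper name here is made up):

```cpp
#include <bgfx/bgfx.h>
#include <bgfx/platform.h>
#include <juce_gui_basics/juce_gui_basics.h>

// Hypothetical helper: point bgfx at a JUCE component's native window.
void initBgfxFor (juce::Component& comp)
{
    bgfx::Init init;
    init.platformData.nwh = comp.getWindowHandle(); // HWND / NSView* / X11 window, per platform
    init.resolution.width  = (uint32_t) comp.getWidth();
    init.resolution.height = (uint32_t) comp.getHeight();
    init.resolution.reset  = BGFX_RESET_VSYNC;
    bgfx::init (init);
}
```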

Does changing the window handle just hand over all power to the custom rendering, meaning the custom rendering engine is fully responsible for rendering the UI? Or does it just draw on top of the screen like the OpenGL option?

Having this handle means that your rendering engine can draw everything that's needed and JUCE knows nothing about it. So yes, it draws on top of the component. If you want to integrate your engine with JUCE, then you need to do what @ImJimmi described above: implement your own LowLevelGraphicsContext.

Thanks, that makes sense now. Bgfx's refresh function is frame() - it is usually called in a while loop in regular applications like those using GLFW. Would it be possible to just crudely implement a Timer and call frame() in the callback? I don't see a pressing need to integrate with JUCE's LowLevelGraphicsContext right now, but if it makes things easier down the line it's worth looking at.

Yes, it can work like that or it can be an independent loop/callback. I used both methods for rendering JUCE components by OpenGLES on embedded Linux.

Makes sense, I was probably overthinking a lot of it lol. Last thing and I think I'll have something: when should I give bgfx the window handle? I'm pretty sure the editor constructor won't work because it would just be a nullptr. Is there a function I can override that is called when the window handle is initialized? Thanks for your help!

Ah, sorry, I forgot to add that you must add your component to the Desktop (addToDesktop) to create a window for it; otherwise it will not have its own native handle. After it's added you can get a handle. However, I am not sure how this will behave inside a plugin, as plugin windows are managed by the host… I never used this solution in any plugin, so I will have to test it too.

Couldn’t I just use the main window handle and grab it once it is initialized? (Is there a callback or a function to override that is called once the handle is created in the main window?)

Edit: I realized paint(juce::Graphics& g) gets called only once, so I can just get the handle from there and do the bgfx rendering using timerCallback().

Yes, it seems that you can get a handle for the main window in paint(). I tested resized(), parentSizeChanged() and broughtToFront() but the handle is still null when these are called for the first time after the plugin window is created and displayed. You can also run a timer in an editor constructor and ask for a handle in timerCallback until it’s finally available.
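The "poll from a timer until the handle exists" approach could look like this; bgfxInitWithHandle() is a hypothetical helper standing in for your actual bgfx setup:

```cpp
#include <juce_audio_processors/juce_audio_processors.h>

void bgfxInitWithHandle (void* nativeHandle); // hypothetical bgfx setup, defined elsewhere

class Editor : public juce::AudioProcessorEditor,
               private juce::Timer
{
public:
    explicit Editor (juce::AudioProcessor& p) : juce::AudioProcessorEditor (p)
    {
        setSize (600, 400);
        startTimerHz (30); // poll until the host has created the native window
    }

private:
    void timerCallback() override
    {
        if (auto* handle = getWindowHandle())
        {
            stopTimer();
            bgfxInitWithHandle (handle); // safe to hand off to bgfx now
        }
    }
};
```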

If you manage to run it with bgfx, please drop a line, I’m curious how it would work.

This is how I do it (I’m using bgfx with JUCE):

void ApplicationEditor::parentHierarchyChanged() {
  if (window_handle_)
    return; // already initialised

  void* window_handle = getWindowHandle();
  if (window_handle) {
    window_handle_ = window_handle;
    // ... initialise bgfx with the native handle here ...
  }
}

As a side note, with bgfx there’s a static renderer so having multiple windows/plugins is kind of a pain and is a lot of work to get working correctly in my experience.


If you don’t want to dive into the OS-specific window handle stuff, you could also take the easy road (with some performance disadvantages, I guess) and just set up your render engine however you want. Then add an extra copy/blit step that writes your resulting framebuffer pixels into juce::Image::BitmapData, and use the regular juce::Graphics to draw it.

This way you could embed some heavy lifting rendering into the regular component hierarchy without implementing your own juce::LowLevelGraphicsContext.

In short: Some kind of offscreen rendering, instead of drawing to the default swapchain framebuffer, copy it into a juce::Image.
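A sketch of that blit step, assuming your engine gives you a tightly packed 32-bit-per-pixel readback buffer (readbackPixels is hypothetical; note JUCE images are premultiplied ARGB, so an RGBA source may need channel conversion first):

```cpp
#include <cstring>
#include <juce_graphics/juce_graphics.h>

// Copy an offscreen framebuffer readback into a juce::Image, row by row.
void blitToImage (juce::Image& target, const uint8_t* readbackPixels)
{
    juce::Image::BitmapData dest (target, juce::Image::BitmapData::writeOnly);

    for (int y = 0; y < dest.height; ++y)
        std::memcpy (dest.getLinePointer (y),
                     readbackPixels + (size_t) y * (size_t) dest.width * 4,
                     (size_t) dest.width * 4); // assumes 4 bytes per pixel
}
```

After this, a plain g.drawImageAt (target, 0, 0) in paint() displays the rendered frame.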

JUCE wonderfully abstracts the image pixel data with juce::ImagePixelData, so as long as you can sync it somehow, you can provide any kind of pixel source.

The OpenGLImageType for example implements it using an OpenGL framebuffer with glReadPixels.

Depends on what you want to achieve and how much time you will spend in your render callback. If it’s heavy 3D stuff you probably want to decouple it from the regular component/message thread to keep the UI responsive.

I’m still learning some of the low-level rendering stuff that comes with this. What does this mean?

I did consider doing something like this. I figured that if I’m going to get away from the software renderer, I might as well learn how to properly implement a custom renderer. I’m also porting some code over from iPlug2 just to get some sort of basic code base for future projects, and I want to make sure future me won’t have to go back and redo a ton of code. JUCE’s API really does fit the bitmap method though 🙂

I worked on a custom renderer based on iPlug2’s nanovg renderer, maybe it’s interesting to you:

It works with Metal and OpenGL currently, but it’s a work-in-progress and it doesn’t work very well yet. It’s also partially based on juce_bgfx, but without using bgfx!

A lot of the bgfx API consists of static functions. So e.g. you call bgfx::init() once for all the plugins, not once per plugin.

Does this include only the same plugin or all plugins using bgfx? I tested the new version of Vital with a test plugin using bgfx and both worked fine. If it’s only specific to the same plugin, fixing it is as easy as creating some counter or something and using bgfx::cacheWrite and bgfx::cacheRead to figure out if anything needs to start or be shut down…
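The counter idea could be sketched as a simple reference count around bgfx's static init/shutdown, assuming all plugin instances touch it from the message thread (the function names here are made up):

```cpp
#include <bgfx/bgfx.h>

// Hypothetical sketch: share one bgfx renderer across plugin instances
// in the same process by reference-counting init/shutdown.
static int& bgfxUseCount()
{
    static int count = 0; // message-thread only; add locking if that ever changes
    return count;
}

void acquireBgfx (const bgfx::Init& init)
{
    if (bgfxUseCount()++ == 0)
        bgfx::init (init); // first instance starts the renderer
}

void releaseBgfx()
{
    if (--bgfxUseCount() == 0)
        bgfx::shutdown(); // last instance tears it down
}
```

Each editor would call acquireBgfx() when it gets its window handle and releaseBgfx() in its destructor.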

Windows user, same experience here lol

It’s just static calls per plugin (plugins won’t interfere afaik)