AU: Desktop::setGlobalScaleFactor with value < 1 results in wrongly sized window

When running an AU that uses Desktop::getInstance().setGlobalScaleFactor(0.75) in Logic Pro X, the contents of the Plugin Editor GUI don’t fill the entire window. The same code works fine on Windows (VST3) with REAPER.

To provide some more context: I’m at the tip of develop (commit 83b1436c6a21f82ffdc4125592836f21dbd7b1e7).

This is what it looks like:

@ed95 can you have a look at this?
In this post, you recommend using Desktop::setGlobalScaleFactor over AudioProcessorEditor::setScaleFactor.

Indeed, on Windows (VST3) the former works and the latter doesn't, but on macOS (AU) it's the other way around: AudioProcessorEditor::setScaleFactor works fine, while Desktop::setGlobalScaleFactor produces the issue above.

I want to release my plugin within the next few weeks, so it’d be good if this could be looked at!

This should fix it. It's worth noting that if you want a separate scale factor for each editor instance, you should wrap your editor in a top-level component and apply a transform to that, instead of using Desktop::setGlobalScaleFactor(). If you're fine with a global scale factor, though, this should now work with AUs.
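As a sketch of that wrapper approach (untested and JUCE-dependent; `ScaledEditorWrapper` and `editorScale` are illustrative names, the API calls are JUCE's `Component::setTransform` and `AffineTransform::scale`):

```cpp
#include <juce_gui_basics/juce_gui_basics.h>

// Sketch: per-editor scaling via a top-level wrapper component,
// as an alternative to Desktop::setGlobalScaleFactor().
class ScaledEditorWrapper : public juce::Component
{
public:
    ScaledEditorWrapper (juce::Component& editorToWrap, float scale)
        : editor (editorToWrap), editorScale (scale)
    {
        addAndMakeVisible (editor);

        // Scale the child editor about the origin; the wrapper itself
        // keeps the unscaled coordinate system seen by the host window.
        editor.setTransform (juce::AffineTransform::scale (editorScale));
    }

    void resized() override
    {
        // Give the editor its logical (unscaled) size so that, after the
        // transform is applied, it exactly fills the wrapper.
        editor.setBounds (0, 0,
                          juce::roundToInt ((float) getWidth()  / editorScale),
                          juce::roundToInt ((float) getHeight() / editorScale));
    }

private:
    juce::Component& editor;
    float editorScale;
};
```

The host only ever sees the wrapper, so different editor instances can use different scales without touching the global desktop scale.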


Thanks ed, this works great!

@ed95 I just realized there is another follow-up issue caused by changing the Desktop’s scale factor:

I have a component in which I use custom OpenGL code to render it and its children.
To be able to use the JUCE point-based coordinate system, I use something along the following lines:

auto desktopScale = (float) (getOpenGLContext().getRenderingScale() * Desktop::getInstance().getGlobalScaleFactor());

for (auto &child : children) {
	auto bounds = child->getBounds();

	// translate the child's position relative to this component
	// into the OpenGL coordinate system
	auto x = (int) std::round(bounds.getX() * desktopScale);
	auto y = (int) std::round((getHeight() - bounds.getBottom()) * desktopScale);
	auto width = (int) std::round(bounds.getWidth() * desktopScale);
	auto height = (int) std::round(bounds.getHeight() * desktopScale);

	glViewport(x, y, width, height);
	// restrict the area the child can draw in
	glScissor(x, y, width, height);

	// ... the child's rendering code runs here ...
}

On Windows, this code works just fine; in fact, I need to include the global scale factor in desktopScale for the drawn OpenGL contents to be scaled correctly.

However, on macOS, including the global scale factor in the equation causes the rendered OpenGL contents to be too small (since I use a global scale factor < 1):

Therefore, on macOS the first line simply needs to be

auto desktopScale = getOpenGLContext().getRenderingScale();

Is there a way to make this work cross-platform? Why is there a difference in behaviour?

I was able to track this issue down to the following lines of code which differ between Windows and macOS:

The newScale value, which getOpenGLContext().getRenderingScale() later returns, is set to the display scale on macOS.

This apparently already incorporates the Desktop's global scale factor, while the Windows implementation, which calls getScaleFactorForWindow(), doesn't.

Therefore, to make my rendering code correct, I have to do this:

float OpenGLComponent::getRenderingScale() {
#if JUCE_WINDOWS
	return (float) (getOpenGLContext().getRenderingScale() * Desktop::getInstance().getGlobalScaleFactor());
#else
	return (float) getOpenGLContext().getRenderingScale();
#endif
}

Please check whether this is intended behaviour.

No, that looks like an unintentional omission. We’ve added a fix in 1bb38fe so getRenderingScale() will now include the global scale on both macOS and Windows.
