Are you sure about this? The thing is:
The software rasterizer uses platform-dependent functions for drawing; on Windows, the GDI functions, with all the usual paint-event, invalidate and windowing machinery. Dirty regions are used here, and only invalidated areas are redrawn. It makes sense that overdraw could be a problem there.
But hold on.
The OpenGL context avoids GDI regions. It uses a “non-repainting” window and simply draws the framebuffer as a quad. Here the “dirty regions” are applied to the framebuffer, and the components are then rendered into the invalidated framebuffer areas. This is the relevant part in OpenGLContext.cpp:
void paintComponent()
{
    // you mustn't set your own cached image object when attaching a GL context!
    jassert (get (component) == this);

    if (! ensureFrameBufferSize())
        return;

    RectangleList<int> invalid (viewportArea);
    invalid.subtract (validArea);
    validArea = viewportArea;

    if (! invalid.isEmpty())
    {
        clearRegionInFrameBuffer (invalid);

        {
            std::unique_ptr<LowLevelGraphicsContext> g (createOpenGLGraphicsContext (context, cachedImageFrameBuffer));
            g->clipToRectangleList (invalid);
            g->addTransform (transform);

            paintOwner (*g);
            JUCE_CHECK_OPENGL_ERROR
        }

        if (! context.isActive())
            context.makeActive();
    }

    JUCE_CHECK_OPENGL_ERROR
}
Now, while implementing a Vulkan context, I stripped out all of this code. What's left is:
- A JUCE timer running at 60Hz.
- A Vulkan context that uses “Mailbox” as the present mode (it’s VSynced to 60Hz).
- One frame in flight, synced with Vulkan fences.
So there are no dirty regions involved at all! Just JUCE timers and pure rendering onto a Windows GDI surface (window?). It still has the same micro-stutter problems!
Interestingly, increasing the timer rate to 120Hz also mitigates the stutter (as in OpenGL).
But ONLY if I use “one frame in flight” instead of multiple frames being pre-rendered (so it’s the same behavior as OpenGL with wgl SwapBuffers). This makes sense, since the delta frame time would naturally be incorrect if there is no synchronization.
To me it seems like a combination of inaccurate timers (callbacks arriving after 15ms, 16ms, 17ms) and the problem mentioned in the previous article: a mismatch between when the CPU submits the “present” and when the GPU actually displays it.
So to me it’s not that obvious what is causing the stutter.
The inaccurate timers? The GPU<->CPU mismatch? The windowing system? The frame delta calculation? All of it together?
Has anyone achieved a stutter free display of moving objects with a 60Hz juce timer?
What are the alternatives?