I'm crashing with Intel onboard graphics.
Same crashes here:
Windows 7 64 bit
NVIDIA GeForce 9600M GT
Maybe it’s a 64-bit Win7 thing, I need to sort out a copy of that to test with…
Whatever change was made recently has left the call stack on crash, as far as I can tell, entirely unreadable.
Ah… I had to add a couple of missing GL definitions on Windows, and may have not made them 64-bit safe. Try the latest update…
Sorry, still the same.
I did a fresh build of the tip but the crash isn’t fixed.
Ok, then I’m out of ideas. It’s the NVIDIA driver crashing, but I don’t know how I could possibly guess what’s causing it…
Maybe these images will help in tracking down the root cause…
And these, too:
Root of all evil?
Sorry, but I can’t see anything at all wrong with the way those functions are getting called. It works just fine on my PC, Mac, Linux, etc. For some reason your NVIDIA driver is crashing, but I really have no idea what might be confusing it - I’m pretty sure the data it’s being sent is ok.
Hm… oh well - I tried! Good luck!
Ok… Just doing a bit of research on NVIDIA bugs - try this tweak to the GLState constructor…
[code]GLState (const OpenGLTarget& target_) noexcept
    : target (target_),
      previousFrameBufferTarget (OpenGLFrameBuffer::getCurrentFrameBufferTarget())
{
    initialiseGLExtensions();

    // This object can only be created and used when the current thread has an active OpenGL context.
    jassert (OpenGLHelpers::isContextActive());

    target.makeActiveFor2D();
    blendMode.resync();

   #if JUCE_USE_OPENGL_FIXED_FUNCTION
    currentColour.resync();

    glDisableClientState (GL_COLOR_ARRAY);
    glDisableClientState (GL_NORMAL_ARRAY);

    if (currentShader.canUseShaders)
        glDisableClientState (GL_VERTEX_ARRAY);
    else
        glEnableClientState (GL_VERTEX_ARRAY);

    for (int i = 3; --i >= 0;)
    {
        activeTextures.setActiveTexture (i);

        if (currentShader.canUseShaders)
            glDisableClientState (GL_TEXTURE_COORD_ARRAY);
        else
            glEnableClientState (GL_TEXTURE_COORD_ARRAY);
    }
   #endif
}
[/code]
It now breaks at:
- Line 923: juce_OpenGLGraphicsContext.cpp @ quadQueue.flush();
[quote=“jrlanglois”]It now breaks at:
- Line 923: juce_OpenGLGraphicsContext.cpp @ quadQueue.flush();[/quote]
That’s the same place you were talking about before (?)
Ok, I have one more shot-in-the-dark idea to try - in the ShaderQuadQueue class:
[code]void draw() noexcept
{
    glBindBuffer (GL_ARRAY_BUFFER, buffers[0]);
    glBindBuffer (GL_ELEMENT_ARRAY_BUFFER, buffers[1]);
    glBufferData (GL_ARRAY_BUFFER, numVertices * 8, vertexData, GL_DYNAMIC_DRAW);
    glDrawElements (GL_TRIANGLES, numVertices + numVertices / 2, GL_UNSIGNED_SHORT, 0);
}
[/code]
…this shouldn’t be necessary, but is just in case the driver disables all the attribute bindings when the shader program changes.
Following my previous comment on the issue on OSX:
There is some artifact on borders when drawing images as well
[attachment=0]Screen Shot 2011-12-22 at 18.53.56.png[/attachment]
HTH
No, it was crashing, and still is, at the glDrawElements() call in juce_OpenGLGraphicsContext.cpp @ line 1195 (if you discount the modifications to the method ShaderQuadQueue::draw()).
Note that this is with and without the tweak to the GLState constructor you have provided.
Yes, and it’ll still be crashing in that function, which is what gets called by quadQueue.flush(). Unfortunately, none of this really matters, because there’s nothing wrong with the call to glDrawElements, it must be the state of the GL context at that point which is causing the problems… which could be absolutely anything!
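(One generic way to narrow down "the state of the GL context" is to poll glGetError() after each suspect call earlier in the frame - since this crash happens inside the driver it may well report nothing, but it can at least rule out an invalid enum/value/operation set up before the flush. A small diagnostic sketch, not part of JUCE; the error-code values are fixed by the GL specification, and the trailing underscores are only there so the sketch compiles without clashing with real GL headers:)

```cpp
#include <cassert>
#include <string>

// Standard OpenGL error codes (values fixed by the GL specification),
// redeclared here so this sketch is self-contained without GL headers.
enum : unsigned
{
    GL_NO_ERROR_          = 0,
    GL_INVALID_ENUM_      = 0x0500,
    GL_INVALID_VALUE_     = 0x0501,
    GL_INVALID_OPERATION_ = 0x0502,
    GL_OUT_OF_MEMORY_     = 0x0505
};

// Turns a glGetError() result into something readable. In real code you'd
// drain the error queue after each suspect call, e.g.
//   for (auto e = glGetError(); e != GL_NO_ERROR; e = glGetError())
//       DBG (describeGLError (e));
static std::string describeGLError (unsigned code)
{
    switch (code)
    {
        case GL_NO_ERROR_:          return "GL_NO_ERROR";
        case GL_INVALID_ENUM_:      return "GL_INVALID_ENUM";
        case GL_INVALID_VALUE_:     return "GL_INVALID_VALUE";
        case GL_INVALID_OPERATION_: return "GL_INVALID_OPERATION";
        case GL_OUT_OF_MEMORY_:     return "GL_OUT_OF_MEMORY";
        default:                    return "unknown GL error";
    }
}
```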
Interestingly though, the few tweaks I’ve made today have pushed the performance up a lot - the GL renderer on my system is now several times faster than the CoreGraphics or software renderers for most primitives.
Damn, I wish I could see this performance change! 8)
If I come across anything by chance, I’ll let you know.
