**SOLVED** Lowering the resolution of an OpenGL fractal generator


So I’ll be honest: I’m flummoxed. I’ve got an OpenGLGraphicsContextCustomShader working with an implementation of the Julia set, but there’s no way it’s going to run smoothly on a phone without lowering the resolution a lot. Basically there are two ways I’ve tried, both using vertex shaders, and both to no avail:

  1. ditch the CustomShader and modify the OpenGlApp example
  2. attempt to modify juce_OpenGLGraphicsContext.cpp directly

Perhaps there’s some way of stretching a smaller image onto a larger texture. I’m in a pickle here and could really use some advice.


Have you seen OpenGLFrameBuffer? You can render the fractal into the framebuffer (which can be any size) and then either use the framebuffer as an OpenGL texture, or read the pixel data back directly to create a JUCE Image, which you can then draw at any size.
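The "render small, draw big" idea can be sketched independently of JUCE and OpenGL. What the GPU does when you draw a low-res framebuffer texture at a larger size (with nearest filtering) is essentially this upscale loop; the function name and buffer layout here are just for illustration:

    #include <cstdint>
    #include <vector>

    // Nearest-neighbour upscale of a tightly-packed RGB buffer.
    // Each destination pixel samples the nearest source pixel.
    std::vector<uint8_t> upscale (const std::vector<uint8_t>& src,
                                  int srcW, int srcH, int dstW, int dstH)
    {
        std::vector<uint8_t> dst (3 * dstW * dstH);
        for (int y = 0; y < dstH; ++y)
            for (int x = 0; x < dstW; ++x)
            {
                int sx = x * srcW / dstW;   // nearest source column
                int sy = y * srcH / dstH;   // nearest source row
                for (int c = 0; c < 3; ++c)
                    dst[3 * (y * dstW + x) + c] = src[3 * (sy * srcW + sx) + c];
            }
        return dst;
    }

The payoff is that the expensive per-pixel fractal shader only runs over srcW × srcH pixels, while the cheap stretch covers the full screen.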


Thanks a million fabian. As you can see from the snippet below, I used the latter method you suggested; it worked like a charm.

    const int resx = 256, resy = 256;

    // Render the fractal into a small viewport
    glViewport (0, 0, resx, resy);
    Rectangle<int> bounds (getWidth(), getHeight());
    customShader->fillRect (g.getInternalContext(), bounds);

    // Read the pixels back as 8-bit RGB (Colour expects uint8 components)
    HeapBlock<uint8> data (3 * resx * resy);
    glReadPixels (0, 0, resx, resy, GL_RGB, GL_UNSIGNED_BYTE, data);

    // Copy into a JUCE image, flipping vertically (GL's origin is bottom-left)
    Image image (Image::RGB, resx, resy, true);
    for (int y = 0; y < resy; ++y)
        for (int x = 0; x < resx; ++x)
        {
            const uint8* p = data + 3 * (y * resx + x);
            image.setPixelAt (x, resy - 1 - y, Colour (p[0], p[1], p[2]));
        }

    // Restore the full-size viewport and draw the image stretched up
    glViewport (0, 0, getWidth(), getHeight());
    AffineTransform scale = AffineTransform::scale (getWidth()  / (float) resx,
                                                    getHeight() / (float) resy);
    g.drawImageTransformed (image, scale, false);
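One detail worth calling out: glReadPixels returns rows bottom-to-top (OpenGL's origin is the bottom-left corner), while a JUCE Image is addressed top-to-bottom, so the rows have to be flipped when copying. A minimal, JUCE-free sketch of that flip (the function name is just for illustration):

    #include <algorithm>
    #include <cstdint>
    #include <vector>

    // Reverse the row order of a tightly-packed RGB buffer so that
    // the bottom GL row becomes the top image row.
    std::vector<uint8_t> flipRows (const std::vector<uint8_t>& src, int w, int h)
    {
        std::vector<uint8_t> dst (src.size());
        for (int y = 0; y < h; ++y)
            std::copy (src.begin() + 3 * w * y,
                       src.begin() + 3 * w * (y + 1),
                       dst.begin() + 3 * w * (h - 1 - y));
        return dst;
    }

Skipping this step (or getting the index off by one, as in the original `resy - y`) either mirrors the fractal or writes one row out of bounds.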