OpenGL: simple way to draw an Image using a source Image and a shader

opengl

#1

Hello,
I’m trying to achieve something fairly simple, and I thought I had found the solution with the OpenGLGraphicsContextCustomShader class, but it seems to have some limitations that make it unsuitable here.

I’ve got a 1920x1080 source Image (of type OpenGLImageType) and I want to generate a 256x256 image from it using a custom shader.

It’s not just resizing / resampling: my goal is to grab a set of pixels from a large texture in realtime without having to copy the whole texture back to the CPU on every frame.
So I’m feeding the custom shader a list of coordinates relative to the source texture; for each output pixel, the shader sets its colour to the colour found in the source texture at the coordinate stored at that pixel’s index in the list. The result is a small image containing all the colours from the source texture that I’m interested in.

The small image can then be transferred to the CPU, which is much faster, and since the list of coordinates is already ordered, I can use the data of the output image directly as a list of colours.
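
To make this concrete, here is roughly the fragment shader I have in mind, written as the string I’d feed to the shader program. It’s just a sketch: the uniform names are mine and the array size is arbitrary.

    // sketch of the lookup fragment shader, kept as a C++ string constant
    const char* lookupFragmentShader =
        "uniform sampler2D sourceTexture;\n"   // the 1920x1080 source image
        "uniform vec2 pixelPositions[16];\n"   // normalised lookup coordinates, one per output pixel
        "uniform float outWidth;\n"            // width of the small output image
        "\n"
        "void main()\n"
        "{\n"
        // the coordinate list is ordered the same way as the output pixels
        "    int index = int (gl_FragCoord.y) * int (outWidth) + int (gl_FragCoord.x);\n"
        "    gl_FragColor = texture2D (sourceTexture, pixelPositions[index]);\n"
        "}\n";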

I’ve searched this forum and the source code of the JUCE OpenGL demo, but I’m having a lot of trouble getting this working. I’m not really good at OpenGL, but I understand how it should basically work.

If someone has some clues on how to get started, it would be really cool.

So far I’ve got my vertex and fragment shaders loaded, but it seems that I can’t set or find any of the declared uniforms (getUniformIDFromName() returns -1 for every uniform declared in both the vertex and fragment shaders).
After that I’m a bit lost: I don’t know how to bind the texture to the shader, or how to draw a simple quad so the FBO is filled with my final texture.
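
To give a clearer picture, this is roughly what I’m doing at the moment, inside renderOpenGL() so the context is active. It’s simplified: openGLContext is the context attached to my component, the shader source strings are omitted, and the uniform names match the sketch above.

    // simplified sketch of my current attempt - shader sources not shown
    juce::OpenGLShaderProgram shader (openGLContext);

    if (shader.addVertexShader (vertexShaderCode)
         && shader.addFragmentShader (lookupFragmentShader)
         && shader.link())
    {
        shader.use();

        // this is where every uniform I declared comes back as -1
        GLint sourceTextureID  = shader.getUniformIDFromName ("sourceTexture");
        GLint pixelPositionsID = shader.getUniformIDFromName ("pixelPositions");

        DBG ("sourceTexture: " << sourceTextureID << ", pixelPositions: " << pixelPositionsID);
    }
    else
    {
        // the shaders do compile and link, so I never end up here
        DBG (shader.getLastError());
    }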

The project is about pixel mapping; I already have working software written in openFrameworks (https://github.com/benkuper/OpenStripSpatializer), but I want to port it to JUCE so I can integrate it into a larger project.

Thanks
Ben


#2

OK, I got something working: I’m able to use a shader and render its output into an FBO (a stripped-down version of that part is below). The only problem now is that I can’t find a way to pass the coordinates to the shader.
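
In case it helps someone later, here is a stripped-down version of the part that works. I’m assuming a vertex shader with a vec2 attribute called "position", and that sourceTexture is a juce::OpenGLTexture filled from my source Image; the buffer creation really belongs in an init step rather than being repeated every frame, and depending on the JUCE version some of these raw GL calls may need to be reached differently.

    // render the small 256x256 result into an FBO using the lookup shader
    juce::OpenGLFrameBuffer fbo;
    fbo.initialise (openGLContext, 256, 256);
    fbo.makeCurrentRenderingTarget();
    glViewport (0, 0, 256, 256);

    shader.use();

    // bind the big source texture and point the sampler uniform at unit 0
    openGLContext.extensions.glActiveTexture (GL_TEXTURE0);
    sourceTexture.bind();
    openGLContext.extensions.glUniform1i (shader.getUniformIDFromName ("sourceTexture"), 0);

    // a full-screen quad in clip space, drawn as a triangle fan
    const GLfloat quad[] = { -1.0f, -1.0f,   1.0f, -1.0f,   1.0f, 1.0f,   -1.0f, 1.0f };

    GLuint vbo = 0;
    openGLContext.extensions.glGenBuffers (1, &vbo);
    openGLContext.extensions.glBindBuffer (GL_ARRAY_BUFFER, vbo);
    openGLContext.extensions.glBufferData (GL_ARRAY_BUFFER, sizeof (quad), quad, GL_STATIC_DRAW);

    const GLuint positionAttrib = (GLuint) openGLContext.extensions.glGetAttribLocation (shader.getProgramID(), "position");
    openGLContext.extensions.glVertexAttribPointer (positionAttrib, 2, GL_FLOAT, GL_FALSE, 0, nullptr);
    openGLContext.extensions.glEnableVertexAttribArray (positionAttrib);

    glDrawArrays (GL_TRIANGLE_FAN, 0, 4);

    // read the 256x256 result back to the CPU - this is the small, cheap transfer
    juce::HeapBlock<juce::PixelARGB> result (256 * 256);
    fbo.readPixels (result, juce::Rectangle<int> (0, 0, 256, 256));

    fbo.releaseAsRenderingTarget();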

If I use, for instance:
    uniform vec2 pixelPositions[16];

the shader fails to compile and says:
    0(2) : error C0000: syntax error, unexpected '[', expecting '{' at token "["

The way I used to do it with openFrameworks was to use a floating point texture instead of a vec2 array in the shader, with the R & G components storing the X and Y locations. But JUCE doesn’t seem to handle floating point textures/images, does it?
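
For reference, this is roughly how that approach would translate to raw GL calls, since juce::Image only offers 8-bit pixel formats. It’s only a sketch: it assumes the context supports floating point textures (GL_RG32F needs GL3+ or the texture_rg/float extensions), and coords is just my flat x,y list.

    // sketch: upload the coordinate list as a 1-row floating point texture,
    // X in the red channel and Y in the green channel, one texel per coordinate
    std::vector<float> coords;   // x0, y0, x1, y1, ... normalised to 0..1

    GLuint coordTexture = 0;
    glGenTextures (1, &coordTexture);
    glBindTexture (GL_TEXTURE_2D, coordTexture);

    // no filtering - we want the exact coordinate values back in the shader
    glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    glTexImage2D (GL_TEXTURE_2D, 0, GL_RG32F,
                  (GLsizei) (coords.size() / 2), 1, 0,
                  GL_RG, GL_FLOAT, coords.data());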