Pass screen resolution to shader OpenGL. Making UV's for quad frag shader

Hi,

Been smashing my head against the wall for the last little while trying to work out how to get the screen resolution sent into a shader. I’ve gone through the tutorials; there’s pixelPos mentioned in the GUI OpenGL 2D example, which I’ve searched for and tried to trace down, and I can’t figure out whether it’s a built-in thing that forgoes the need to specify the screen resolution.

I’ve tried accessing vec2 uv = (gl_Position.xy / gl_Position.w) * 0.5 + vec2(0.5); directly in the frag shader, but it doesn’t like that, even though I’m aware gl_Position is a thing.

Also are there mouse variables that are already linked across?

Would I have to go: OpenGLUtil::UniformWrapper Mouse { "Mouse" };

and set a Point< int > mouse?

Do I need to pass across the variables into the shader like this: resolution->set(screen_resolution.x, screen_resolution.y);

I really don’t know what I should be doing here.

I feel like there are core fundamentals in JUCE that seem impossibly out of reach. It’s taken hours to draw just two triangles on the screen!

Are there any built-in parts of JUCE that allow access to simple things like the screen resolution? I mean something already baked in, with a name/function that’s already doing the same job?

Out of the bunch of projects I tried from the forum, only one worked, so I’ve been stuck in the dark here, trying to look at other people’s code; they’ve either used a completely different method of getting things up and running, or used 10.14 features when I’m on 10.13 (can’t upgrade).


OK, so that’s what pixelPos does. I’m not entirely sure what’s going on, but I’m 75% there.

I think I’m very close now, 78.359% of the way there. But I still can’t figure out adapting to the resolution of the window.

I’m now guessing screen_resolution could be the whole screen, and not (as I wrongly assumed) the open/currently rendering window.

If you draw a quad to the viewport, your coordinates should go from (-1, -1) to (1, 1).
For any kind of pixel/texture referencing, it would probably be best to send up UV coordinates in the range 0 to 1 across the poly for the fragment shader.

Otherwise, any variable passed to a shader is called a 'uniform', and you should add it to the list of uniforms in the demo, for example. You should probably make it a vec2 so the resolution can be sent in one go as two floats.
Search for 'uniform' in the demos; you should find an example of how it’s sent to the shader.
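As a plain-C++ sketch of what the fragment shader typically does with such a resolution uniform (the function name here is illustrative, not from JUCE or GLSL), this is the pixel-to-UV mapping, the same maths as gl_FragCoord.xy / resolution:

```cpp
#include <cassert>
#include <utility>

// Hypothetical helper: map a pixel coordinate to 0..1 UV space using the
// viewport resolution, mirroring what a fragment shader does with
// gl_FragCoord.xy divided by a vec2 resolution uniform.
std::pair<float, float> pixelToUV (float px, float py, float resX, float resY)
{
    return { px / resX, py / resY };
}
```

With an 800x600 viewport, the pixel (400, 300) maps to UV (0.5, 0.5), the centre of the 0-to-1 space.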


awesome thank you!

I’ve been trying to pass uniforms all day; I’ve got one mat4 uniform going through like this:

Matrix3D<float> OpenGLComponent::calculateResolution() const
{
    float x = getWidth();
    float y = getHeight();
    Matrix3D<float> res = Vector3D<float> (x, y, 0.0f);
    return res;
}

But it would rather be a vec2, and I don’t know the syntax for initialising a vec2. All of the examples I’ve seen are passing in textures; I just want a simple example of passing a vec2 of floats, but every tutorial seems to skip the most basic use case of passing just a couple of floats!

Thanks for helping me! I think if I get a vec2 working now it will setup the way for inputting mouse co-ords.

I’ve got it normalised via the resolution of the screen so it can adapt to size changes on Shadertoy; I just need to figure out the right way to send uniforms like the resolution, and how to update the resolution when it changes. I haven’t the monkiest; I keep seeing getBounds(), getWidth() and getHeight(), and tried to set the resolution via those, but didn’t have much luck.

Really wanted to understand and get past this setup so I never ever have to do this again!!!

OK don’t mind me, I think I figured out how to at least send values into the shader.

You just have to write them explicitly, e.g. shaderProgram->setUniform ("colour1", 1.f, 0.f, 0.f);

Next onto actually pushing the resolution properly and getting mouse/time across. God help me!!

You should ignore the screen resolution and normalise everything to a 0 to 1 range. The screen is in the -1 to 1 range in the shader because (0, 0) is the vanishing point for 3D, and it’s at the centre. That’s why I reference the UV coordinates in the shader instead.

You can pass in the normalised mouse position, so you don’t have to worry about resolution.

For example…
I’m using Juce’s ‘Uniform’ class.
If I have a vec2 shader uniform, I set it with:
uniforms->mouse->set (mouseX / res.x, mouseY / res.y);
Remembering to have the shader set to 'use' first, of course :)
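The normalisation step above can be sketched in plain C++ like this (names are illustrative; the JUCE call you'd feed the result into is shown as a comment). Clamping keeps the value sane when the mouse leaves the component:

```cpp
#include <algorithm>

// Hypothetical sketch: normalise a pixel-space mouse position to 0..1 so the
// shader never needs to know the window resolution. In JUCE you would then
// pass the result with something like:
//   uniforms->mouse->set (m.x, m.y);   // while the shader is 'in use'
struct NormalisedMouse { float x, y; };

NormalisedMouse normaliseMouse (float mouseX, float mouseY, float width, float height)
{
    return { std::clamp (mouseX / width,  0.0f, 1.0f),
             std::clamp (mouseY / height, 0.0f, 1.0f) };
}
```

So a mouse at pixel (200, 150) in an 800x600 window comes out as (0.25, 0.25), and positions outside the window clamp to the 0-to-1 edges.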

In the end, it will be an advantage to buy a book on OpenGL, or follow some tutorials for a while. 3D rendering is a big subject even outside of game engine tech, and it’s well worth learning the basics.

Yeah, I realised in the 2nd post that calling it screen_resolution is a cause of confusion! It’s the render window’s size I wanted, and it’s easy to get through getWidth() and getHeight().

I understand what you’re saying; that’s the co-ord system (-1 to 1 space) I’ve used to create the two screen quads. I always normalise (I wouldn’t have been able to draw a centred isometric grid otherwise). I half* know my way around Shadertoy and frag shaders, and can render fairly complex 3D raymarching scenes with physically based lighting, and can model using signed distance functions. However, I had totally taken for granted how hard it is to even get to the stage of rendering a quad! Full respect to the badasses that have made it easy for people to jam with shaders without having to be Einstein!

*I’m not just pilfering someone else’s shaders

The screen_resolution variable I’ve made is built from getWidth() and getHeight(), which are pixel counts, not a -1 to 1 space, which may be the cause of confusion here! I believe it’s what you’ve described as res, so I think we’re on the same page!

I’ll try that with the mouse now! TBH I tried learning JUCE and C++ through the audio tutorials, but I don’t understand C++, so I’m going a different route, trying to learn JUCE and C++ as an extension of my understanding of bits of Arduino/GLSL/C#. I should probably get a book on C++, as I’m just deciphering via pattern recognition at this stage!

(UV debugging demonstration, unique colours for pixel co-ords)

This is where I’m up to:

Thanks Dave for the help! I almost have everything working! For the mouse part I’ve written:

float mx = getMouseXYRelative().getX();
float my = getMouseXYRelative().getY();
shaderProgram->setUniform ("mouser", mx, my); // ??

But I haven’t tested it out; is that what you mean when you say mouse.x and mouse.y?

The other thing is setting the time; I think I’m wrapping around instead of counting:

float tm = Time::getCurrentTime().getMilliseconds(); // in milliseconds, 0-999
tm *= 0.001;

should I store time in another variable and += tm to that?

It’s up to you; I don’t know what you’re using the mouse for. I thought you were going to divide it by the viewport width and height to normalise it?
For time, you’ll have to remember that floating point numbers lose their accuracy when they get large, so you may have to wrap and loop it.
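A sketch of that wrapping idea in plain C++ (the function name and period are illustrative): keep the value fed to the shader inside a fixed range so a single-precision float never grows large enough to lose sub-millisecond accuracy.

```cpp
#include <cmath>

// Hypothetical helper: convert an elapsed-milliseconds count into seconds
// wrapped to a fixed period, so the float handed to the shader stays small.
// 'periodSeconds' is whatever loop length the animation uses.
float wrappedSeconds (double elapsedMs, double periodSeconds)
{
    return static_cast<float> (std::fmod (elapsedMs * 0.001, periodSeconds));
}
```

For example, 2500 ms with a 2-second period wraps to 0.5 s; this only works cleanly if the shader animation is itself periodic in that range.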

I understand; I was just going to normalise in my fragment shader, as I have the canvas res in there now, so I just need the mouse pixel co-ords going into the shader. But you’re right, I could just send already-normalised co-ords into the shader. I guess doing it on the OpenGL call would be on the CPU, and norming in the shader would be on the GPU?

Yeah, having a major mission getting time working. This is what I’ve been trying so far:

// float tmr = Time::getCurrentTime().getMilliseconds(); // in milliseconds, 0-999, not a counter
float tmr = Time::getApproximateMillisecondCounter(); // this breaks into quads, very slow?
tmr *= 0.001;
shaderProgram->setUniform ("shader_time", tmr);

Reading getMilliseconds() is smooth on the shader, but wraps back around at 0. I’ve tried counting with it, i.e. if x is 999, add 999 to y and send shader_time as y += x, but that didn’t work. I also found getMillisecondCounter() and getApproximateMillisecondCounter(), but they both break the shader into quads, so I imagine the calls are too slow to keep up with the shader; I don’t really understand.

I’ll remember that bit about floating point numbers losing accuracy; I’m sure it’s going to be the cause of another few headaches to come!

I just use a frame counter, as my animations aren’t frame rate critical.
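A frame counter like that could be sketched as below, assuming the render callback runs at a roughly fixed rate (60 fps here; the struct and names are illustrative, not JUCE API):

```cpp
// Hypothetical frame-count clock driving a shader's time uniform. In a JUCE
// renderOpenGL() callback you might then do something like:
//   clock.tick();
//   shaderProgram->setUniform ("shader_time", clock.seconds());
struct FrameClock
{
    int frames = 0;
    float fps  = 60.0f;        // assumed, roughly fixed, frame rate

    void tick()           { ++frames; }            // call once per rendered frame
    float seconds() const { return frames / fps; } // monotonically increasing time
};
```

After 120 ticks at an assumed 60 fps this reports 2.0 seconds. The trade-off, as noted above, is that the animation speed tracks the actual frame rate rather than wall-clock time.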

That’s a good idea. No idea how to build one, though. I’m just trying my best to mimic the inputs of Shadertoy, like iTime.

Nearly figured out the mouse as well:

Except I’m not getting screen/canvas co-ords with getMouseXYRelative(); I need getScreenPosition() from the mouse, but I don’t know how to return it.

I’ve been an active member of shadertoy since it started in 2013. Are you aware of the default licence? https://www.shadertoy.com/terms

I can’t help wondering why you didn’t use Unity3D if you have C# experience?

CC Attribution-NonCommercial-ShareAlike? Precisely why I’m not pilfering anyone else’s shaders. 90% of my shaders are unlisted or private on my Shadertoy account.

I have used Unity to import shaders from shadertoy (around 2017), had to be done by converting GLSL to HLSL, which wasn’t the most fun process at times!

Why not Unity? Well, I’ve tried making a MIDI app before with Unity, but it is absolutely horrible for critical timing stuff. There’s also no built-in MIDI library; you’d have to roll your own or buy one from the Asset Store (Foriero Core I think is the one I used, and it was stupidly slow), and there’s no real-time audio.

But I really do love C#. I need to find a book like the C# programmer’s guide to C++, or just one person who explains how to use the JUCE API. I’m literally just breaking things till they work!

But totally got everything working now: https://www.youtube.com/watch?v=QNl6RoyTmnM

Mouse input, resolution, smooth time animation (thanks to your suggestion of using a frame counter)! I dunno if I should attempt to get the FFT signal packaged into a texture; I’m guessing someone down the line on here will want to make some audio-reactive shaders.
I think I’ve read that multipass can’t be done inside JUCE, but I wonder if there’s a creative way around this limitation, as fluid sims are my favourites.


Why not share your shaders with the shadertoy community?

You can make a 1x512 texture and put an FFT result into it with the CPU,
and use that as a shader reference. See the dynamic texture demo.
You may be able to do multi-pass if you do a glFlush on each texture render, but I can’t fully remember, sorry, it’s been a while.
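The CPU-side packing step might look like the sketch below (the function name is illustrative, and I'm assuming the FFT magnitudes are already normalised to 0..1): fill a 512-wide row of 8-bit pixels that could then be uploaded as a 512x1 texture, e.g. via a juce::Image as in the dynamic texture demo.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Hypothetical helper: turn normalised FFT magnitudes into one row of 8-bit
// pixel values for upload as a Nx1 texture. Only a single channel is packed
// here; the shader would read it back as texture(fft, vec2(u, 0.0)).r.
std::vector<std::uint8_t> packFFTRow (const std::vector<float>& magnitudes)
{
    std::vector<std::uint8_t> row (magnitudes.size());
    for (size_t i = 0; i < magnitudes.size(); ++i)
        row[i] = static_cast<std::uint8_t> (std::clamp (magnitudes[i], 0.0f, 1.0f) * 255.0f);
    return row;
}
```

A 512x2 variant (as Shadertoy uses) would just add a second row packed the same way from the waveform samples.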

Good question, maybe it’s just fear of being ridiculed!

You, sir, are a scholar and a gentleman! I’ve set up FFT input using Max4Live before, and you’re completely right; it’s just a 512x2 texture where the first row is the FFT and the second row is the waveform. I will look at the dynamic texture demo right away, thank you so much for the heads-up!

Think I’m gonna steer clear of multipass and FFT for now. My current goal is making a button and sending out a MIDI note. I’m not sure of the best way to go about marrying up MIDI note triggers to the hexagon grid; it’s looking like I may have to write the same function as the hexagon picker in C++. The code’s already written in GLSL; I just need to translate it across, not an easy task for me!

Hey, thanks. From experience, all I can say is: when I have a lot to learn, it’s easy to get lost in all the tasks and get a little overwhelmed by all the things I have to catch up on. So it’s ALWAYS best to concentrate on one thing at a time, no matter how impatient I am.

Have you listened to any of this guy’s tutorials? “The Audio Programmer”, at https://www.youtube.com/channel/UCpKb02FsH4WH4X_2xhIoJ1A/featured
His tutorials are slow enough to pick up on why things are done certain ways in C++. It might help. You definitely need some more C++ coding experience.

I really enjoyed C# a lot when I did some Unity stuff (the lack of header files is quite a relief), although getting it fast enough was quite tough a few years ago, as C# was about 10 times slower than C++ when doing DSP code without library calls. I don’t know how fast it is now, but I bet it doesn’t quite compete with the native assembly C++ produces.