Hi, is there a definitive way of doing this? Having trouble covering all bases.
For instance, on my monitor, for the main display I get the following:
Scale: 1, Width: 3840
So this is fine.
For a client, we get:
Scale: 2, Width: 1280, on a retina display
So I assume here I can multiply 1280 * 2 = 2560, so that’s retina.
However, if I scale my monitor to be 1920x1080 then I get:
Scale: 2, Width: 1920 - so if I did the multiplication again here, that would come to 3840, which would register as retina even though it’s not.
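To make the problem concrete, here's a minimal sketch of the multiplication I'm doing (plain JavaScript; the `physicalWidth` helper name is just mine for illustration):

```javascript
// Hypothetical helper: logical (point) width times the scale factor
// gives the backing pixel width the display is rendering at.
function physicalWidth(scale, logicalWidth) {
  return scale * logicalWidth;
}

physicalWidth(1, 3840); // 3840 -> scale 1, so not retina, fine
physicalWidth(2, 1280); // 2560 -> correctly reads as retina
physicalWidth(2, 1920); // 3840 -> ALSO reads as retina, but this is
                        //         the scaled non-retina 4K monitor
```

As the last case shows, scale * width alone can't tell a retina panel apart from a 4K monitor running in a scaled mode.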
So, is there a better way to determine if I’m on a retina screen or not?