why oh why do they think it’s a good idea to pixel double? this is the road to madness. i’ve seen it before.
anyhow,
i have been battling the iOS “retina” problem again when using texture patterns.
i’m using a texture to create a UI surface. the texture is a subtle bumpy pattern, and what i really want is for iOS to always use it without any scaling, because when scaled the bumps don’t look right. think of something like a gravel texture here.
for normal, sane user interfaces that don’t lie to me about resolution, i just draw this texture as needed. for “retina”, what i do is create a piece twice as big as i need, then render it with drawImage using a doubled width/height for the source rectangle into the normal-sized destination. iOS sees this halving and, because it secretly needs to double everything anyway, the two cancel out and, presto, i get my texture unscaled.
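to make that concrete, here’s a minimal sketch of the retina path (makeBumpTexture() is a made-up helper standing in for however the bump pattern gets built at a given pixel size):

    // sketch only: makeBumpTexture() is hypothetical, standing in for
    // whatever builds the bump pattern at the requested pixel size
    void drawBumpySurfaceRetina (Graphics& g, int w, int h)
    {
        Image tex = makeBumpTexture (w * 2, h * 2);  // twice the size we need

        // draw the 2x source into a 1x destination: this "halving" is
        // cancelled by iOS's hidden doubling, so texels map 1:1 to pixels
        g.drawImage (tex, 0, 0, w, h,            // destination rectangle
                          0, 0, w * 2, h * 2);   // source rectangle (whole 2x image)
    }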
however, this doesn’t work for non-retina, because there the texture really does get scaled down. normally, scaling down a 2x image would be fine (aside from the performance cost), except this is a texture that looks different when you scale it: the bumps come out too small.
to fix this, i have had to KNOW whether this doubling malarkey is going on. when we’re not doubling, i use my texture at its natural size; when we are (retina), i create one twice as big and ask for it to be halved into the coordinate space (a halving iOS internally knows it can skip in pixel space), after which it comes out right. there’s a sketch of this after the code below.
to KNOW, i’ve hacked the following getContentScale() method into the Desktop class:
// juce_ios_Windowing.mm
float Desktop::getContentScale()
{
    // default to 1.0 on devices whose UIScreen predates the 'scale'
    // property (iOS < 4), where no pixel doubling can be happening
    float scale = 1.0f;

    if ([[UIScreen mainScreen] respondsToSelector: @selector (scale)])
        scale = [UIScreen mainScreen].scale;

    return scale;
}
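at draw time it then goes something like this (again just a sketch; makeBumpTexture() is the same hypothetical helper as above):

    void drawBumpySurface (Graphics& g, int w, int h)
    {
        // build the texture at 2x only when the hidden doubling is active
        const bool doubling = Desktop::getContentScale() > 1.0f;
        const int tw = doubling ? w * 2 : w;
        const int th = doubling ? h * 2 : h;

        Image tex = makeBumpTexture (tw, th);
        g.drawImage (tex, 0, 0, w, h, 0, 0, tw, th);  // texels stay 1:1 either way
    }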
i’m wondering if you think this is a good idea in general, i.e. to have a way to query the scale, or whether there is already a Juce way to know this that i’ve missed.
– hugh.
