Simple slideshow on iPad


#1

I have a simple slideshow running, and the animation loop (a timer at 1000 / 30 calling repaint) plus the image draw with an opacity are using up most of the CPU time.
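For context, the loop is basically something like this (a simplified sketch in modern JUCE terms, with made-up member names, not the exact project code):

```cpp
// Minimal sketch of the slideshow loop: a ~30 fps timer that repaints,
// with the cross-fade done by drawing the next slide at partial opacity.
class SlideshowComponent  : public juce::Component,
                            private juce::Timer
{
public:
    SlideshowComponent()            { startTimer (1000 / 30); }   // ~30 fps

    void paint (juce::Graphics& g) override
    {
        g.drawImageAt (currentSlide, 0, 0);

        g.setOpacity (fadeAmount);          // blend the incoming slide over the current one
        g.drawImageAt (nextSlide, 0, 0);
    }

private:
    void timerCallback() override
    {
        fadeAmount = juce::jmin (1.0f, fadeAmount + 1.0f / 30.0f);
        repaint();                          // full software repaint every tick
    }

    juce::Image currentSlide, nextSlide;
    float fadeAmount = 0.0f;
};
```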

Do any image draws use native calls on iOS? Or will I have to get into OGL?

Bruce


#2

All the image drawing on iOS is done natively, so I’d have expected it to be pretty quick already.

However, there may be ways that it could be sped up a bit - e.g. I think the CALayer stuff is supposed to be faster than the CGImage stuff that it currently uses, so it might be better to store the images like that.


#3

Yeah, I just didn’t leave enough time to learn Core Animation, and, like you I suspect, I hate spending time on non-cross-platform solutions.

I looked at the code and it does look like it’s all CG, but maybe some CG operations run on the CPU on the iPad - the images are in RAM anyway.

Maybe I’ll see if I can get some OpenGL in. The main snag is that I have a juce-drawn background that I alpha-blend onto, so I’ll also have to chop out a piece of that and use it as the background in the OpenGL view, and try to make it seamless with the juce background. Interesting challenge.

Bruce


#4

Just to circle back: I put an OpenGLComponent in. When the parent resizes, it makes the component invisible, draws itself into an image, and sends that to the component to use as a backdrop.
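In case it’s useful to anyone else, that snapshot step is roughly this (the wrapper function is mine; createComponentSnapshot is the JUCE call doing the work):

```cpp
// Grab the pixels the parent paints underneath the GL component, so they can
// be uploaded as the GL backdrop texture. Hiding the component first stops it
// from appearing in its own snapshot.
juce::Image grabBackdropFor (juce::Component& glComp)
{
    auto* parent = glComp.getParentComponent();
    jassert (parent != nullptr);

    glComp.setVisible (false);
    auto backdrop = parent->createComponentSnapshot (glComp.getBounds());
    glComp.setVisible (true);

    return backdrop;   // hand this to the GL side to be the backdrop texture
}
```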

The OpenGL comp has three textures - backdrop, current slide and next slide. It draws the slides (which were uploaded from a composited image to make a nice soft drop shadow) blended over the backdrop, pixel for pixel (nice juce resizing). I think that’s a good blend of GL and juce’s abilities.
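Roughly, the GL side is structured like this (a sketch using the newer OpenGLRenderer / OpenGLTexture classes rather than the OpenGLComponent I actually used, and the blending/sizing details are approximations, not the real code):

```cpp
// Three textures (backdrop, current slide, next slide) drawn 1:1 over each
// other with alpha blending. The textures are assumed to have been filled on
// the GL thread with OpenGLTexture::loadImage() from pre-composited juce Images.
class SlideRenderer  : public juce::OpenGLRenderer
{
public:
    explicit SlideRenderer (juce::Component& target)  : comp (target)
    {
        context.setRenderer (this);
        context.attachTo (comp);
        context.setContinuousRepainting (true);
    }

    ~SlideRenderer() override       { context.detach(); }

    void newOpenGLContextCreated() override {}

    void renderOpenGL() override
    {
        const auto scale = (float) context.getRenderingScale();
        const int w = juce::roundToInt (scale * (float) comp.getWidth());
        const int h = juce::roundToInt (scale * (float) comp.getHeight());

        juce::OpenGLHelpers::clear (juce::Colours::black);

        // Alpha blending so the slides (with their soft drop shadows) sit over the backdrop.
        // (If the images are premultiplied, GL_ONE / GL_ONE_MINUS_SRC_ALPHA would be the right funcs.)
        juce::gl::glEnable (juce::gl::GL_BLEND);
        juce::gl::glBlendFunc (juce::gl::GL_SRC_ALPHA, juce::gl::GL_ONE_MINUS_SRC_ALPHA);

        drawTexture (backdrop, w, h);
        drawTexture (currentSlide, w, h);
        drawTexture (nextSlide, w, h);      // a cross-fade would modulate this one's alpha
    }

    void openGLContextClosing() override
    {
        backdrop.release();
        currentSlide.release();
        nextSlide.release();
    }

private:
    void drawTexture (juce::OpenGLTexture& tex, int contextW, int contextH)
    {
        if (tex.getTextureID() == 0)
            return;                         // nothing uploaded yet

        tex.bind();
        const juce::Rectangle<int> area (0, 0, tex.getWidth(), tex.getHeight());
        context.copyTexture (area, area, contextW, contextH, false);   // draw the bound texture 1:1
        tex.unbind();
    }

    juce::Component& comp;
    juce::OpenGLContext context;
    juce::OpenGLTexture backdrop, currentSlide, nextSlide;
};
```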

Still fine-tuning when GL renders. It wasn’t too hard - a day or so (I do have a fair amount of OpenGL experience now). Very pleased, though.

Bruce