If anyone fancies earning a bit of pocket-money in return for some coding, there are a few self-contained little features which I’m struggling to find time to do myself… If you’ve got “m4d juce skillz” and some spare time, I can offer a (modest!) fee for the implementation of some of these things:
An OpenGL ES rendering engine for Android/iOS. This would basically be an OpenGL implementation of the LowLevelGraphicsContext class, plus a type of Image::SharedImage that can hold a p-buffer. I know very little about OpenGL, and less about OpenGL ES, so it’d be a bit of a learning curve for me to do this myself… To be useful, it’d need to produce an identical result to the software or CoreGraphics rendering engines, correctly performing anti-aliasing, complex clipping, etc.
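To give a feel for the shape of the task: JUCE’s real LowLevelGraphicsContext has many more pure virtuals (paths, clip regions, image blits, glyphs), but the pattern is the same — a backend receives abstract drawing calls and translates them into its own primitives. Here’s a stripped-down, purely illustrative sketch (none of these names are the actual JUCE API) of a GL-style backend that batches calls into commands:

```cpp
#include <string>
#include <vector>

// Stripped-down stand-in for JUCE's LowLevelGraphicsContext: the real class
// has many more pure virtuals. All names here are illustrative, not JUCE API.
struct Rect { int x, y, w, h; };

class LowLevelContextSketch
{
public:
    virtual ~LowLevelContextSketch() = default;
    virtual void clipToRectangle (const Rect&) = 0;
    virtual void fillRect (const Rect&) = 0;
};

// A GL backend would turn these calls into vertex batches plus scissor or
// stencil state; here we just record them to show the structure of it.
class RecordingGLContext : public LowLevelContextSketch
{
public:
    void clipToRectangle (const Rect& r) override
    {
        commands.push_back ("clip " + describe (r));   // -> glScissor / stencil clip
    }

    void fillRect (const Rect& r) override
    {
        commands.push_back ("fill " + describe (r));   // -> two triangles in a batch
    }

    std::vector<std::string> commands;

private:
    static std::string describe (const Rect& r)
    {
        return std::to_string (r.x) + "," + std::to_string (r.y) + " "
             + std::to_string (r.w) + "x" + std::to_string (r.h);
    }
};
```

The hard part, as noted above, isn’t this plumbing — it’s making the GL output pixel-identical to the software renderer for anti-aliased edges and complex clip shapes.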
And some simpler items:
A CoreVideo video player, to replace the QuickTimeMovieComponent class.
A DirectShow video player to provide a Windows replacement for QuickTimeMovieComponent.
A CoreCodecAudioFormat class that can open and read (but not necessarily write) audio files using CoreCodec.
A DirectShowAudioFormat class… same as above, but for Windows.
If any of these sound interesting to you, please email me to discuss, and give me a quote!
Out of curiosity, do you think reimplementing Juce’s graphics to use OpenGL rather than CoreGraphics, for instance, would boost the graphics rendering performance of apps on iOS and Android? I wrote an app for iOS that does a fair amount of high-frame-rate rendering, and I haven’t been 100% happy with its rendering performance. I had thought about reimplementing parts of the app using an OpenGL component, or creating an OpenGL-based LowLevelGraphicsContext for the app as a whole, but I wasn’t sure whether CoreGraphics was using hardware acceleration already and whether this would actually give any measurable performance improvement…
On iOS, CoreGraphics is probably pretty good already, so it might not make much difference (depending on what kind of thing you’re drawing). But on Android, the rendering would certainly be vastly better using OpenGL, not least because it’d cut out a huge number of very expensive JNI calls.
On the iPad app I did, which has a fading slide show view, I had to go to OpenGL. Big, big difference.
I used a fairly nice trick: snapshotting the background behind the component and using that as the background in OpenGL, which makes the seams of the OpenGLComponent invisible (hide the component, draw, and grab a snapshot).
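The trick above can be illustrated in miniature. In real JUCE code you’d hide the OpenGLComponent, call Component::createComponentSnapshot() on the area it covers, re-show it, and upload the snapshot as the GL background texture — which needs a running app, so here’s a toy pixel-buffer version (all names illustrative) showing why the edges blend in:

```cpp
#include <vector>

// Toy illustration of the snapshot trick. "Pixels" are just ints and the
// components are plain buffers; the real work is done with JUCE snapshots
// and a GL texture.
using Pixels = std::vector<int>;

Pixels renderParentBackground (int size)
{
    Pixels p (size);
    for (int i = 0; i < size; ++i)
        p[i] = i * 3 + 1;           // some non-trivial background pattern
    return p;
}

// The GL view draws its own content over whatever background it was given.
// Without the snapshot, that background would be a flat clear colour, and
// the edges ("seams") wouldn't match the parent underneath.
Pixels renderGLView (const Pixels& background, int contentStart, int contentEnd)
{
    Pixels out = background;        // start from the snapshotted background
    for (int i = contentStart; i < contentEnd; ++i)
        out[i] = -1;                // the GL content itself
    return out;
}
```

Every pixel outside the GL content range comes out identical to the parent’s own rendering, so the component’s border is invisible.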
[quote]- A CoreVideo video player, to replace the QuickTimeMovieComponent class.
A CoreCodecAudioFormat class that can open and read (but not necessarily write) audio files using CoreCodec.[/quote]
What is CoreCodec? I’ve never heard of it mentioned in the Apple docs.
Additionally, CoreVideo only deals with uncompressed video frames, you will still need to use QuickTime to decode compressed video.
The AudioToolbox framework, part of CoreAudio, lets you decode/encode compressed audio. It is available on iOS 2.0+ and Mac OS X 10.5+.
AVFoundation is a newer Apple API for decoding and displaying compressed video and audio. It is available on iOS 4+ and will be available in Mac OS X 10.7+.
It may also be worth noting that there are AAC and MP3 encoders and decoders available via Microsoft’s Media Foundation API. The downside is that these codecs are only available on Windows 7 and later.
I found nothing substantial for a DirectShowAudioFormat class, but I did find this code, which works nicely for MP3 decoding on Windows and is based on the old ACM decoding infrastructure. It seems to work fine on 64-bit Windows Vista.
Why not use this? If it just works, I’d be happy to have it as a Juce AudioFormat.