JUCE and FFmpeg

I want to make an app with JUCE and FFmpeg, and I wonder whether JUCE can display video frames decoded by FFmpeg's libraries (the way ffplay does).
As far as I know, ffplay creates a window of its own to display images; I want the video to be displayed within JUCE instead, via juce::ImageComponent, juce::VideoComponent or something similar…
Or can JUCE embed the playback window of third-party video apps like VLC, mplayer, etc.?
Any ideas?

About a year ago I created a wrapper to use FFmpeg with JUCE; it's open source on GitHub: filmstro_ffmpeg.

It produces juce::Image objects, so you can access the frames and do your own processing/drawing on top. The audio is presented as an AudioSource subclass, so you can pipe it into your processing chain.

I hit a wall with the correct usage of FFmpeg that I couldn't get past. The wrapper is usable with several codecs, but some just don't play nicely; DivX AVIs in particular are nasty.
Writing videos was also problematic: the produced files would play in VLC, but not in QuickTime, Preview and the like.
Meanwhile, the FFmpeg API I used has been deprecated; I don't know whether simply rewriting the decode and encode calls would solve these issues.

I haven't worked on it since, because the project was paused and I have moved to a different company.
If you (or anybody reading this) want to dive into FFmpeg, I am happy to team up and make it more complete.


Daniel,
FFmpeg does have some issues with muxing/containers that would need to be addressed to make the files play in something like QuickTime. I have used the FFmpeg libraries a lot with JUCE, and I like your approach of serving up the video's audio as a positionable audio source. Also note that for some image formats the scaling actually takes longer than the decoding, so it also needs to be done on a separate thread. IPP has scaling functions that are faster than FFmpeg's.
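The decode-then-scale split can be sketched in plain C++ (no FFmpeg or IPP here; `Frame` stands in for `AVFrame`, and `scaleFrame` for `sws_scale` or an IPP resize): the decoder thread only enqueues frames, while a worker thread does the expensive scaling.

```cpp
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

struct Frame { std::vector<int> pixels; };

// Stand-in for sws_scale()/IPP resize: halves the "resolution".
static Frame scaleFrame (const Frame& in)
{
    Frame out;
    for (size_t i = 0; i < in.pixels.size(); i += 2)
        out.pixels.push_back (in.pixels[i]);
    return out;
}

class ScalingWorker
{
public:
    ScalingWorker() : worker ([this] { run(); }) {}

    ~ScalingWorker()
    {
        { std::lock_guard<std::mutex> l (m); done = true; }
        cv.notify_all();
        worker.join(); // drains remaining frames before returning
    }

    // Called from the decoder thread: cheap, never blocks on scaling.
    void push (Frame f)
    {
        { std::lock_guard<std::mutex> l (m); pending.push (std::move (f)); }
        cv.notify_one();
    }

    // Scaled frames accumulate here (a real player would hand them to the UI).
    std::vector<Frame> scaled;
    std::mutex scaledMutex;

private:
    void run()
    {
        for (;;)
        {
            std::unique_lock<std::mutex> l (m);
            cv.wait (l, [this] { return done || ! pending.empty(); });
            if (pending.empty() && done)
                return;
            Frame f = std::move (pending.front());
            pending.pop();
            l.unlock();

            Frame s = scaleFrame (f); // the expensive part, off the decoder thread
            std::lock_guard<std::mutex> g (scaledMutex);
            scaled.push_back (std::move (s));
        }
    }

    std::queue<Frame> pending;
    std::mutex m;
    std::condition_variable cv;
    bool done = false;
    std::thread worker; // declared last so it starts after the other members
};
```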

Hey Darren,
Great, thanks for the feedback!
Can you tell me a bit more about the muxing issues? Is there anything more I have to do apart from calling av_write_trailer() when closing the container?

The issues are in the FFmpeg codebase itself (avformat, avcodec, etc.). However, recent builds from HEAD seem to fix many of them. I would update your FFmpeg and try again.


Hey there @daniel

I just bumped into this thread and noticed your trouble with ffmpeg producing QuickTime movies that weren't fully compatible with some players. I've hit this problem from a different angle (visual-effects Python pipeline work, not C++ development). I'm guessing you've figured it out by now, but if I can help at all, let me know; what you're describing matches what I had to resolve.

Cheers

Jeff


Thank you, the project has morphed into the more elaborate foleys_video_engine.
I am just returning to it after a break. I managed to write videos that display properly by NOT setting AV_CODEC_FLAG_GLOBAL_HEADER (despite many tips claiming that it is really needed).

But thanks for the offer, I might be back with questions.


Hi @daniel!
I also need to build a video engine for a music app. I just bumped into your Git repo and was wondering if I could use your library for that. One thing I find a bit odd:
You seem to do the decoding entirely on the CPU (since you get back juce::Images) and then, in the case of OpenGL rendering, send the frames back to the GPU for display. Wouldn't it be better if all of that happened entirely on the GPU? Then again, maybe you are already doing this and I just misunderstood. :)

Hi @konrad,
Apologies, I completely forgot about your question.

Yes, you are right, there is room for improvement. JUCE's main-thread drawing model gets in the way here, as does the fact that I didn't invest much time in figuring out hardware decoding.

I did some experiments using the OpenGL-backed juce::Image, so that at least the drawing wouldn't have to happen on the message thread. You can have a look at it in this thread:

Currently I am undecided about how to proceed with it, especially since contract work has taken over my bandwidth…

Let me know if that is interesting for you. If you want to join forces here, I am open to contributions or joint ventures.


Thanks Daniel!
I did end up creating my own hardware-accelerated video engine using FFmpeg.
It's a closed-source project, so I can't post any real code. For anyone here who has to do something similar and, like me, doesn't have video and Direct3D in their comfort zone:

  • I started out with this post: Streaming Video With FFmpeg and DirectX 11 | by Ori Gold | The Startup | Medium
  • There are quite a few examples in the FFmpeg sources as well.
  • I created a demo app around my library, thankfully starting from the Visual Studio 2019 example for C++ using Direct3D 11 in UWP apps. There you can see how XAML components, swap chains, device contexts and so on work together.
  • Most annoying pitfall: make sure you have the correct build of the FFmpeg DLLs (libavcodec and so on); some builds do not support hardware acceleration. I ended up using the DLLs from conan.io.
  • Make sure your device is created with the correct flags.
  • Pay attention to locking access to your device context. The hardware driver will quite reliably crash when you try to decode and present a frame concurrently.
  • As always: if something doesn't work, fall back to the simplest version that does work and build your way up…

I hope that helps!

Konrad
