Poll: video engine for JUCE?

Some may know that almost 3 years ago I wrote a wrapper to stream videos in JUCE called filmstro_ffmpeg*. It allows you to process the audio stream with the JUCE tools and display the images synchronously, and it can also write the result back to a file.
Even with little to no maintenance since then, it still seems to be useful to some.

I am contemplating writing a more comprehensive video and image processing engine that would allow:

  • Video clip objects to stream
  • mixing/overlaying frames
  • interface for visual effects
  • colour curve processing
  • resize/scale/crop frames to render virtual camera moves
  • display film strips
  • display live histograms
  • using stills as video objects

I would offer that on terms similar to the Tracktion engine. Is that something people would be interested in? Maybe adopt? What would other must-have features be?

*) I was working at Filmstro back then and the project was partly done in my work time, hence it bears their name. Today I run my own company, Foleys Finest Audio, and might pull the project over to my space once I make major additions there.


I would love to be able to capture my app’s GUI at 60FPS via createComponentSnapshot and dump it into an mp4 file without having to leave the JUCE ecosystem to do it.


Yes, that is a good use case, although it wouldn’t need the whole engine; the video wrapper should do that. But I remember our conversation from back then: figuring out why QuickTime wouldn’t show the videos while VLC could is definitely still necessary.

Just nitpicking a bit on 60FPS: I think the gaming and graphics card industry has somewhat spoilt the perception of FPS. While it is a good thing to tune performance so that painting is as fast as possible, for a video it is much more important to have a constant time interval between frames. A guaranteed minimum frame rate in the video stream is certainly necessary, but 30 is a good value to aim for. You will need to send those frames on later, so consider your bandwidth.
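To make the “constant interval” point concrete, here is a minimal sketch in plain C++ (function name hypothetical, not part of any engine API): derive each frame’s deadline from the frame index and the stream start, rather than adding a per-frame delta. Accumulated deltas let rounding errors drift over time; a single multiplication does not.

```cpp
// Drift-free frame scheduling: compute each frame's deadline directly
// from its index, so rounding error never accumulates across frames.
long long frameDeadlineMicros (long long streamStartMicros,
                               int frameIndex, int framesPerSecond)
{
    return streamStartMicros
         + (static_cast<long long> (frameIndex) * 1000000LL) / framesPerSecond;
}
```

At 30fps each individual interval rounds to 33333µs, yet frame 30 still lands exactly 1000000µs after frame 0; a loop that repeatedly added 33333µs would already be 10µs early after one second, and the error would keep growing.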

The reason the graphics card manufacturers and gamers love their FPS displays is, IMHO, that the rendering runs on its own and the graphics card has no other jobs anyway, so there is no harm in rendering as fast as possible. And it makes you feel good that you bought the right graphics card (or you start saving for the next better one; they will be happy).

Rendering at 60FPS is like recording at 192kHz, IMHO. You can do it, and theoretically your information is more detailed, but in the end it doesn’t matter much.


Some news from the project, a first version is public now:

Find the API documentation here:

And there will be a talk at the next meetup, on 5th June in London:


Please forgive my ignorance, but before I dive into installing this, that and the other, and spend probably weeks trying to get something going that I’m not sure the “thing” can do in the first place, I’d like to ask you about a very specific use case.
The thing that’s missing most in Logic Pro X’s video capabilities is placing more than one video clip in the timeline. In commercial and film/TV promo work you usually have several cutdowns of a spot, or you want to show the client three different versions of your work. This currently means you have to unload and load/move that one video clip every time you want to play a different version, which makes it look (and feel) very unprofessional.
Can your solution place several video clips at specific, non-overlapping points in the timeline, with frame-accurate sync for play and locate?


That is a good question.
The work I have ready so far is the engine, which is available to anybody who wants to add video functionality to their DAW, or who wants to create a new DAW or NLE.

The engine offers a pipeline similar to the AudioSource in JUCE (in fact the clips are PositionableAudioSources), but they additionally serve video, which you can compose in different ways, similar to Final Cut Pro or Premiere. That bundle is aimed at developers.

The screenshot you see is a minimalistic NLE that shows the pipeline model works, and that you can easily create a video editor with the engine, to which you can add all the different approaches and workflows.

I am working on a more complete audio and video editor, but I think it will be a couple more months until that is ready for release. It will add the more musical views, like track-based mixing (which Final Cut doesn’t allow), presets for different edits (e.g. based on the location), and clip syncing.

I already had in mind allowing multiple edits in one project that you can arrange later. But your use case of combining different video edits with your audio or music stems is worth pursuing.
I’ll keep that in mind, thank you!

Just to clarify - I want to show different music/audio edits (which I made), each with many audio and MIDI tracks in Logic Pro, against the final video clip (which I got from the client). Logic Pro allows only one video clip on its single video track; putting another/the same video clip at another location (like 10:02:00:00, if my first version is at 10:01:00:00) kicks out the first video clip, which makes version hopping a nightmare. Clients love it, though: “Can you show me V1. Now V2. Now V3. Now V2 again. I like V3 better, play that. No, actually it’s V1. Or V2? (leans over to the receptionist, who for some reason is there at the approval) Cathy, what do you think?” And so on, ad nauseam.

Thus, I’d like to have, I dunno, a plugin which can fire off a video clip in its window at TC 10:01:00:00 (and stay in sync from there on, so if I jump to 10:01:02:00, that same video clip shows its frame 00:00:02:00 and plays from there in sync with the host). If I open another instance of that plugin, it can play another/the same video clip at another timecode address. And so on.

Currently we have to resort to clunky workarounds like VideoSlave, an external application in which you rebuild the timeline with several video clips and sync to Logic via MIDI Time Code and MIDI Machine Control. It actually works, but project and file management is a nightmare.

Oh interesting…
That is doable; the video engine can live in a plugin and synchronise to the AudioPlayHead.
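The mapping from host position to clip-local frame that such a plugin would need is just arithmetic. Here is a minimal sketch in plain C++ (function names hypothetical, not the engine’s API), assuming non-drop-frame SMPTE timecode:

```cpp
// Convert a SMPTE timecode hh:mm:ss:ff (non-drop-frame) into an
// absolute frame count at the given frame rate.
long long timecodeToFrames (int h, int m, int s, int f, int fps)
{
    return ((static_cast<long long> (h) * 60 + m) * 60 + s) * fps + f;
}

// Map the host playhead (as an absolute frame at the session rate) to
// the frame the clip should display; -1 means the playhead is still
// before the clip's anchor point, so the plugin shows nothing.
long long clipLocalFrame (long long hostFrame, long long clipStartFrame)
{
    return hostFrame >= clipStartFrame ? hostFrame - clipStartFrame : -1;
}
```

With 25fps timecode, a clip anchored at 10:01:00:00 and the host at 10:01:02:00 gives clip-local frame 50, i.e. clip timecode 00:00:02:00, matching the example above; in a JUCE plugin the host position would come from the AudioPlayHead in processBlock.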

Let me know if you need help with that.