Process audio and video in real time from a video stream or a video file

I’m a C++ creative coder with some experience (Xcode, Visual Studio, Qt, openFrameworks), but I’m totally new to JUCE.
I’m trying to understand whether JUCE offers a way to handle and process, in real time, audio and video streamed from a webcam, from a web server, or simply from a local video file.
In particular, it seems that the VideoComponent class is completely “sealed”: there is no way to access the movie frame data, e.g. as an OpenGL texture or anything similar.
Could someone help shed some light on this topic? Thank you very much in advance for any advice.


I’d be interested to hear any responses to this question myself. I’m beginning to look into whether it would be possible to grab frames from an NDI stream to show in a JUCE-based GUI.

The VideoComponent wraps platform widgets for display. And you got it right: you can intercept neither the video nor the audio.

I wrote a video engine that can read video files using FFmpeg. I also have support for webcams (on Windows and Mac), which is not yet published. It also lets you composite video from different sources with parameters like position, alpha, etc.

It also features audio processor hosting and video processors, but they are software-only (and might need some TLC).

This is the JUCE module; the link to the demo projects is in the readme:

If you want to use it, please get in touch; I am looking for collaborators. NDI support could be a fabulous joint effort.