So I’ve started looking at JUCE8 WebView.
The most important change, as we all know, is that it now ships with tools for both sides (C++ and HTML/JS) to pass data.
I’ve looked at some of the code and the following page:
I see that I can easily “serve” data using a ResourceProvider.
E.g., I can serialize my state as JSON and then fetch it from the WebView with the appropriate resource URL.
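On the web side, that could look like the minimal sketch below. The resource name "state.json" and the state shape are my assumptions; whatever your ResourceProvider answers for that path is what actually gets served:

```javascript
// Hypothetical resource name "state.json" -- served by the C++ ResourceProvider.
async function loadState() {
  const response = await fetch("./state.json");
  return parseState(await response.text());
}

function parseState(jsonText) {
  const state = JSON.parse(jsonText);
  // Basic shape check; adjust for your own state layout.
  if (typeof state !== "object" || state === null) {
    throw new Error("unexpected state payload");
  }
  return state;
}
```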
There are useful Relays/States abstracted under Web Attachments.
But what I’m confused about is Listeners.
The link above says:
You can wire up an arbitrary piece of UI to the JUCE backend using this slider state. In raw JavaScript, assuming you have an HTML range input with id slider, you could use event listeners:
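For reference, here is a sketch of that wiring as a reusable function. The slider-state method names (setNormalisedValue, getNormalisedValue, valueChangedEvent) follow the JUCE 8 frontend library as I understand it from that page; double-check them against the docs:

```javascript
// Wire an HTML range input to a JUCE slider state object.
// sliderState is what the JUCE frontend library hands you (e.g. via
// getSliderState("gain")); the method names are assumptions from the docs.
function wireSlider(slider, sliderState) {
  // UI -> backend: push the input's value to JUCE on every change.
  slider.addEventListener("input", () => {
    sliderState.setNormalisedValue(parseFloat(slider.value));
  });
  // backend -> UI: update the input when the JUCE parameter changes.
  sliderState.valueChangedEvent.addListener(() => {
    slider.value = sliderState.getNormalisedValue();
  });
}

// In the browser, roughly:
//   wireSlider(document.getElementById("slider"), getSliderState("gain"));
```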
So currently, this is the way to “listen” (from the web side) to notifications?
Did JUCE decide to favor higher-level components over abstract listenable objects?
Another question I was wondering about: JUCE’s WebView has a ResourceProvider, but it seems to run on the message thread. What if I’d like to pass a stream? Is there a way to do that (without hogging the MessageManager)?
Basically I was wondering the same thing, since I have some complex JUCE components that I’d rather just use as part of my web UI instead of rewriting them. So, what if I could render them in JUCE and just stream the pixel data to a WebGL canvas…
So, long story short: with some hacking, it’s possible to set up a WebSocket server through juce::StreamingSocket and use any WebSocket client library in your web UI to connect to it for high-speed two-way streaming.
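The client end of that is plain browser API. A minimal sketch, assuming a WebSocket server running inside the plugin (the port and the framing are my assumptions, not part of JUCE):

```javascript
// Build the URL for a local WebSocket server (port is an assumption --
// use whatever your juce::StreamingSocket-based server listens on).
function wsUrlFor(port) {
  return `ws://127.0.0.1:${port}`;
}

// Connect and route incoming binary frames to a handler.
function connectToBackend(port, onBinaryFrame) {
  const socket = new WebSocket(wsUrlFor(port));
  socket.binaryType = "arraybuffer"; // deliver binary frames as ArrayBuffer

  socket.addEventListener("message", (event) => {
    if (event.data instanceof ArrayBuffer) {
      onBinaryFrame(event.data); // interpret per your own protocol
    }
  });
  return socket;
}

// In the browser: connectToBackend(9001, (frame) => { /* render it */ });
```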
While mine is not perfect and super stable (yet), it seems like this, or something similarly suitable, should be something that JUCE could provide down the road, especially with the increasing revenue from the new licensing model… can we get a feature request for that going anywhere?
Thank you for pointing out the thread. I think it’s more relevant than this one for broader discussion.
Are you using WebSocket with binary data?
I saw you are fiddling with passing a rendered waveform image. But isn’t that exactly the case where you could benefit from passing values to be rendered on the HTML side?
I’ve actually found that CHOC already has most of the building blocks you’d need, including a limited web server with WebSocket support.
The major benefit of JUCE is that its cross-platform support also covers mobile platforms.
Well, obviously you’re right about being able to transmit values and render them on the HTML side.
But this waveform component I’m rendering in my example is a fleshed-out component that offers a LOT of functionality that wouldn’t be easy to recreate, and it’s battle-tested in a bunch of released software. We have a couple of these components, and I figured that instead of recreating all of them, I could just as well render them on the backend, display them on a WebGL canvas, and be done with it.
I’m using WebSocket with a binary format, yes: just transmitting raw data based on needs and interpreting it accordingly on the other end. I’ve spent some more time on it since I made the post, slashed the rendering rate and data-transmission rate drastically (at least 10x less data now than in my last post), and it’s running super smooth. I can now even render a 4K/32:9 component without a problem, which wasn’t possible without choking the message thread when just displaying the thing natively in JUCE. Rendering that data on a background thread, transmitting it to the UI on another thread, and having the WebGL canvas display it makes it work perfectly at any resolution.
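For what it’s worth, the “interpreting it accordingly” step can be as simple as a small fixed header in front of the pixel data. The layout below is purely an assumption for illustration, not the actual protocol from the post:

```javascript
// Hypothetical frame layout (an assumption, not the poster's protocol):
// [uint32 width][uint32 height][width * height * 4 bytes of RGBA], little-endian.
function parseFrame(buffer) {
  const view = new DataView(buffer);
  const width = view.getUint32(0, true);
  const height = view.getUint32(4, true);
  const pixels = new Uint8Array(buffer, 8, width * height * 4);
  return { width, height, pixels };
}

// Upload a parsed frame into an existing WebGL texture
// (gl and texture are assumed to be set up elsewhere).
function uploadFrame(gl, texture, frame) {
  gl.bindTexture(gl.TEXTURE_2D, texture);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, frame.width, frame.height, 0,
                gl.RGBA, gl.UNSIGNED_BYTE, frame.pixels);
}
```

Parsing with a DataView keeps endianness explicit, and the Uint8Array view avoids copying the pixel payload before the texture upload.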