There are some posts on here asking about using JUCE functionality (mainly audio) inside games built with Unreal or Unity… but I’m wondering about the opposite. I have a lot of experience with JUCE, and I use it as the primary framework for building my apps from scratch. For a new app I’m working on, I want a separate 3D window with some nice real-time rendering. I don’t have any experience with Unreal or Unity, but is it even possible to use one of them “just” as the rendering engine, while using JUCE for the bulk of the app (mainly a very complex 2D UI spread across several windows)?
To put it another way: I want to link to Unreal or Unity from my JUCE app, rather than linking to JUCE from my Unreal or Unity app.
Does anyone have experience with this, or could you point me in a useful direction?