Porting a macOS app to JUCE: what is the best audio/MIDI rendering method?


I currently have a macOS app, written in Swift/C, that is about 50% complete. I am an experienced C++ developer and am considering moving to JUCE. The app currently uses an AUGraph, and I render the audio and MIDI data in a callback registered via AUGraphAddRenderNotify(). Within that callback I send the MIDI events that fall inside the current render cycle using MusicDeviceMIDIEvent() or MIDISend(), and for audio I fill the buffers accordingly. I use this approach because the app loops clips independently and does not have a linear transport.

My question is: which JUCE classes do I need in order to accomplish the same capability? It seems like a good deal of what I coded myself already exists in JUCE, but I am not clear on which methods will guarantee sample-accurate MIDI events when sending to an instrument plugin along with my audio.
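For context, the per-block scheduling I do today boils down to something like the sketch below: for each render cycle, find the clip events whose loop-wrapped position lands inside the current block, and compute the frame offset at which each should fire. That offset is what I pass as inOffsetSampleFrames to MusicDeviceMIDIEvent(), and (if I understand correctly) it corresponds to the samplePosition argument of juce::MidiBuffer::addEvent(). ClipEvent and eventsInBlock are just illustrative names for this post, not my real code.

```cpp
#include <algorithm>
#include <cstdint>
#include <utility>
#include <vector>

// A MIDI event at a fixed position (in samples) inside a looping clip.
struct ClipEvent {
    int64_t clipSample;              // 0 <= clipSample < clipLength
    uint8_t status, data1, data2;    // raw MIDI bytes
};

// Return the events whose loop-wrapped position falls inside
// [playhead, playhead + numFrames), paired with the frame offset
// within the block at which each should be dispatched.
std::vector<std::pair<ClipEvent, int>> eventsInBlock(
        const std::vector<ClipEvent>& clip,
        int64_t clipLength, int64_t playhead, int numFrames)
{
    std::vector<std::pair<ClipEvent, int>> out;
    for (const auto& ev : clip) {
        // Walk every loop iteration that could overlap this block,
        // so events near the loop boundary are not dropped.
        for (int64_t base = (playhead / clipLength) * clipLength;
             base < playhead + numFrames; base += clipLength) {
            const int64_t abs = base + ev.clipSample;
            if (abs >= playhead && abs < playhead + numFrames)
                out.push_back({ev, static_cast<int>(abs - playhead)});
        }
    }
    // Sort by offset, as events must be dispatched in time order.
    std::sort(out.begin(), out.end(),
              [](const auto& a, const auto& b) { return a.second < b.second; });
    return out;
}
```

For example, with a 1000-sample loop and a 256-frame block starting at playhead 950, an event at clip position 100 fires at offset 150 (its second loop pass), while an event at 960 fires at offset 10.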