My use case is a record button that lets you record audio while tweaking the knobs of a synth or effect.
This is not possible with Renderer::renderToFile (even when using Renderer::Parameters::realTimeRender), as rendering is not allowed whilst attached to an audio device.
Does anyone have any advice for my use case, or example code showing how to do this?
You can’t really do this whilst rendering, it’s an inherently offline process.
The way you’d normally do this is by recording the output of one track to another. To do this, call auto inputDevice = AudioTrack::getWaveInputDevice() on the source track and then assign that device to a target track.
It’s a bit trickier with TrackInputDevices, as you have to add them to the state first before you can get an instance (this should probably be improved in future).
// Make sure the input device has state in the edit before requesting an instance
edit.getEditInputDevices().getInstanceStateForInputDevice (*inputDevice);

if (auto epc = edit.getCurrentPlaybackContext())
    if (auto instance = epc->getInputFor (inputDevice))
        instance->setTargetTrack (destTrack, 0, true);
Another approach is to simply record the parameter changes as automation and do the rendering afterwards.
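The automation approach can be sketched roughly like this. Note this is a sketch from memory, not a tested implementation: the AutomationRecordManager method names may differ between Tracktion Engine versions, and `edit` is assumed to be an existing tracktion_engine::Edit.

```cpp
// Hedged sketch: capture live parameter moves as automation curves,
// then render offline afterwards. API names are assumptions and may
// vary between engine versions.
auto& automation = edit.getAutomationRecordManager();
automation.setWritingAutomation (true);   // record parameter changes while playing

edit.getTransport().play (false);         // tweak knobs while this runs...

// ...later: stop the transport, detach from the audio device, and use
// Renderer::renderToFile() to render the edit with the recorded automation.
```

The advantage of this route is that the final render is a clean offline pass, so you avoid any dropouts or glitches that could be baked into a real-time capture.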
Sorry, you can’t use the Renderer for this track-to-track process; you need to arm the destination track and then perform a TransportControl::record() operation, just like you would for live audio/MIDI input.
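Putting the arming and record steps together might look something like the following. This is a sketch only: `setRecordingEnabled` and the `record()` signature are assumptions that vary between Tracktion Engine versions, and `edit`, `inputDevice` and `destTrack` are assumed to be set up as in the earlier snippet.

```cpp
// Hedged sketch: route the source track's output into destTrack, arm it,
// and start a transport recording. Method names/signatures are assumptions.
if (auto epc = edit.getCurrentPlaybackContext())
{
    if (auto instance = epc->getInputFor (inputDevice))
    {
        instance->setTargetTrack (destTrack, 0, true);
        instance->setRecordingEnabled (true);   // arm the input for destTrack
    }
}

edit.getTransport().record (false);   // start recording, no count-in
```

When you stop the transport, the engine finalises the recording and inserts a clip containing the captured file on the destination track.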
Yes those are all true, but step 3, the render, needs to happen after the recording has stopped. I’m not quite sure why you would need this step though as the “destination track” will contain a clip with the recorded file.
What additional processing would the “render” operation add?
1- In tracktion_Renderer.cpp, line 430, I think you need to add this line: r.tracksToDo = tracksToDo;
2- When I render to a file, I must have this line, otherwise the file is not updated:
if (fileChooser.browseForFileToSave (true))
{
    renderFile = fileChooser.getResult();

    if (renderFile.existsAsFile())
        renderFile.deleteFile(); // added line
}
I think that’s correct behaviour. I don’t think it’s the Renderer’s responsibility to delete the file. You should make sure the file doesn’t exist and the location is writable before you start rendering.
This creates a “render.wav” file in the “Render” folder with the edit. If the file already exists, it appends a number, i.e. - “Render(1).wav”.
I tried recording the output of one track to another, and getting the file with AudioClipBase::getSourceFileReference().getFile(), but the resulting .wav is heavily saturated (clipped) whenever notes overlap.
No problem Dave, thank you for your help!
The real-time audio playback is fine.
It’s only the rendered wav that is saturated.
With the code I uploaded, it should be quick to reproduce.