FYI I've added this now, with a JUCE_VST3_CAN_REPLACE_VST2 flag so it can be turned off.
I've decided to leave it on by default. I think that's probably OK, as most people will want to use it, and anyone who runs into problems can turn it off.
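For reference, opting out would look something like this (a sketch; the flag name is from the post above, and where exactly you define it depends on your project setup):

```cpp
// In AppConfig.h, or in your project's preprocessor definitions:
// disable the VST2-replacement behaviour mentioned above.
#define JUCE_VST3_CAN_REPLACE_VST2 0
```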
setStateInformation gets called from both readFromMemoryStream and readFromUnknownStream.
The fix (https://github.com/julianstorer/JUCE/commit/bf72ff0851b51b07830a831baf333465017a05ae) only changed the call in
readFromMemoryStream and not in readFromUnknownStream.
IIUC, both places should be changed.
Also, I notice a workaround for Adobe Audition in one of them; should that also be applied to both?
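To illustrate the point about keeping both paths in sync, here's a rough sketch (the function names mirror the wrapper functions mentioned above, but the signatures and the stub processor are my assumptions, not the actual JUCE code): route both stream readers through one helper, so setStateInformation and any host-specific workaround are applied consistently.

```cpp
#include <string>

// Stub standing in for the real AudioProcessor, just for illustration.
struct StubProcessor
{
    std::string lastState;
    void setStateInformation (const char* data, int size)
    {
        lastState.assign (data, (size_t) size);
    }
};

// Shared helper: any host-specific workaround (e.g. the Adobe Audition one)
// would live here, so it runs for both stream-reading paths.
void applyStateData (StubProcessor& p, const char* data, int size)
{
    p.setStateInformation (data, size);
}

void readFromMemoryStream (StubProcessor& p, const char* data, int size)
{
    applyStateData (p, data, size);
}

void readFromUnknownStream (StubProcessor& p, const char* data, int size)
{
    applyStateData (p, data, size);
}
```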
Sorry to ask again, but my problem still isn't solved.
I know that sample-accurate automation conflicts with the block-based processing of the JUCE framework.
I've been testing and trying things in my plugin now, and I simply want to write the automation sample-accurately, so that a parameter jump from 0->1 or from 1->0 shows up as a clean step in the host's automation curve.
I still have some research on a possibly related automation problem* ahead of myself - what I can tell you so far:
If you specify a "NumSteps" value for your parameter in AudioProcessor::getParameterNumSteps(int index), Cubase's automation will show much more precise parameter jumps and will do less interpolation.
However, specifying discrete NumSteps has a few downsides...
*If anyone cares to read about that: my problem is that if I don't specify a NumSteps value, Cubase will ignore small automation parameter changes when I write them. I have a parameter ranging from -100 to 100 and need a parameter precision of 0.01. When I write automation with changes in a small range (e.g. from 10.20 to 11.20), Cubase will show these changes while automation write is active, but once writing stops, Cubase will interpolate them away as if they never happened. The only way I've found so far to change that is to specify a NumSteps value (but I don't really want to do that).
(I was planning to do some more research about that before asking questions here, but since it seems like it might be related...)
Edit: found out that the full version of Cubase offers an automation setting called Reduction level - if that is set to 0, no interpolation of automation data is done.
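For the parameter described above (-100 to 100 at 0.01 precision), one can compute what NumSteps would have to be to preserve that resolution; a minimal sketch (the helper name is hypothetical, not a JUCE API):

```cpp
#include <cmath>

// Hypothetical helper: the number of discrete steps needed so that a host
// quantising a parameter to NumSteps values can still represent every
// 'precision' increment across [min, max].
int stepsForRange (double min, double max, double precision)
{
    return (int) std::round ((max - min) / precision) + 1;
}
```

For -100..100 at 0.01 that works out to 20001 steps, which hints at why a discrete NumSteps value feels awkward for a parameter that is really continuous.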
Testing on Cubase 7.0.7 32-bit on Windows 7 and Cubase 7.0.5 on OS X 10.8.4, I'm getting crashes with the JUCE demo plugin when putting it on mono tracks (on stereo tracks it works fine).
I'm suspecting it has something to do with these lines:
I'm seeing the same problem. It appears that the VST2 version always passes in a valid buffer for the second channel even if running in mono, whereas VST3 does not.
I can think of several options for how to solve this. Which option is better?
When "process" is called, see how many channels there are, and if that differs from what the processor knows, call its "setPlayConfigDetails" with the updated info? (Would one then need to call its "prepareToPlay" again?)
When "process" is called, if we receive fewer channels than "JucePlugin_MaxNumInputChannels", add fake silent channels and call "processBlock" with that
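The second option could be sketched roughly like this (all names are hypothetical, and a real implementation would pre-allocate these buffers rather than allocate on the audio thread):

```cpp
#include <vector>

// Hypothetical sketch: build a channel-pointer array of the size the
// processor expects, pointing any channels the host didn't provide at a
// shared silent buffer.
std::vector<float*> padChannels (float** hostChannels, int numHostChannels,
                                 int numExpectedChannels, float* silentBuffer)
{
    std::vector<float*> padded;
    for (int ch = 0; ch < numExpectedChannels; ++ch)
        padded.push_back (ch < numHostChannels ? hostChannels[ch]
                                               : silentBuffer);
    return padded; // padded.data() would then be handed to processBlock
}
```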
I think that fix would definitely cause some nasty glitching when prepareToPlay is called in the process function.
The best approach would be to ask the plugin for the correct number of ins + outs and pass those to setPlayConfigDetails instead of JucePlugin_MaxNumInputChannels, but looking through the VST3 docs, I can't actually see any clear way of getting that info. Would it make sense to ask for the bus arrangements and get the number of channels from that?
Ah yes, that looks much better! I didn't notice that we already had the bus info there in the class. If people on this thread can confirm that it works for them, I'd be happy to go with that solution!
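For anyone following along: in the VST3 SDK a SpeakerArrangement is a 64-bit mask with one bit set per speaker, so the channel count of a bus falls out of a bit count (the SDK's Steinberg::Vst::SpeakerArr::getChannelCount does this for you). A minimal sketch of the idea:

```cpp
#include <cstdint>

// A VST3 SpeakerArrangement is a bitmask with one bit per channel,
// so counting the set bits yields the channel count for that bus.
int channelCountFromArrangement (std::uint64_t arrangement)
{
    int count = 0;
    for (; arrangement != 0; arrangement >>= 1)
        count += (int) (arrangement & 1u);
    return count;
}
```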
A maintenance commit to fix warnings accidentally caused changes that cause crashes in Cubase 7.
The variable iid was renamed to targetIID, but macros used in that scope still referenced "iid", which is also the name of a member variable (previously masked by the local variable). So the meaning of the code changed, which doesn't seem to have been the intent of the commit (and it causes crashes in Cubase 7 on Windows).
The macros hid the fact that the variable was referenced there.
Proposed fix at
https://github.com/yairchu/JUCE/commit/5230cbc7796a57b41f73067486fb3789072810c1
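To illustrate the pitfall with hypothetical names (not the actual JUCE code): a macro that textually mentions iid binds to whatever iid is visible where it expands, so renaming the local silently retargets the comparison at the member.

```cpp
// The macro expands the token "iid", so it compares against whatever
// "iid" happens to be in scope at the expansion site.
#define IS_SAME_IID(other) (iid == (other))

struct Before
{
    int iid = 100;                    // member variable
    bool query (int wanted)
    {
        int iid = wanted;             // local masks the member
        return IS_SAME_IID (wanted);  // compares the local: always true
    }
};

struct After
{
    int iid = 100;                    // member variable
    bool query (int wanted)
    {
        int targetIID = wanted;       // local renamed: macro now sees the member
        (void) targetIID;
        return IS_SAME_IID (wanted);  // compares the member: behaviour changed
    }
};
```

No warning is emitted, because the code is still perfectly legal; only the binding changed.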
Missing setPlayhead?
This one's pretty simple - https://github.com/yairchu/JUCE/commit/d87888aa3ca1e9c16c0bd8041ff1714e06236882
Wrong fix committed for the mono tracks bug?
As I mentioned before, it seems that my first suggested fix, which you said wasn't to your liking, was used? It seems to have accidentally slipped in with another unrelated commit about iOS stuff.