Website to validate your VST3 and Audio Units

I made a drag-n-drop website that runs pluginval on your plugin:

Here’s a results example.

If you aren’t using pluginval already, you should! It ensures compatibility with hosts, checks all sorts of edge cases and in general will make your plugins more reliable for your customers.

Please post your questions, issues, and feature requests in this thread!

(creating this post so we don’t clutter up the pluginval megathread I first posted to)


@JeffMcClintock Re: your run, it looks like the plugin hung on editor opening:

Starting tests in: pluginval / Open editor whilst processing...
*** FAILED: Timeout after 30 secs

The jobs were actually still running when I got to this (5.5hrs and counting), so it looks like I also have to figure out why that 30s timeout didn’t actually kill the process.

Would you mind either DMing me the plugin you used for testing purposes or downloading this release on my fork to reproduce locally?

Appreciate it and sorry you ran into the issue — haven’t seen this one before!

FYI: The site is not working for me. It says “couldn’t detect platform”.

The plugins submitted are vanilla Windows 64-bit vst3 files, such as “CompPlus.vst3”. They have been in the wild without issues for several years, and passed all tests in previous versions of pluginval, as attested by their listing under “Direct Approach” on the website. You may download them from the website to try yourself, if desired.


Hey @bwall, thanks for trying it out and reporting.

Looks like I wasn’t grabbing the arch correctly when the .vst3 was a Windows file (vs. a folder). That’s now fixed (with tests) and deployed, and CompPlus now passes validation.

Btw, I noticed that your macOS .dmg files have the “can’t drag to symlink” issue on Monterey that I recently ran into. You might be interested in the rabbit hole I went down and Apple’s response.


Yes, I can confirm the site is now accepting my plugins.

Thank you!


The reason my plugin was ‘failing’ the validator was that the validator passes a tempo value of 0 BPM, which is invalid.

To pass the validator, I have added a check to my plugin for any invalid BPM, which it now silently swallows.

So now we have a situation where the next person who writes a buggy host (one that passes zero as the BPM) will not experience crashing plugins, and will release the buggy host out into the world. Eventually, if enough DAWs contain the same bug, plugin writers will all have to accommodate the buggy hosts, which will result in a less stable ecosystem.
This behavior from a validator risks legitimizing the behavior of buggy DAWs IMHO.

That’s my opinion anyhow.


Hey Jeff,

Thanks for testing the service out. It helped me identify that Windows plugins built in Debug (such as yours) could pop up an assertion dialog which waits for user input (not ideal when running on a CLI). I tried to set _CrtSetReportMode on the pluginval process to redirect assertions to stderr, but that didn’t seem to do the trick.

Not sure if anyone has ideas on why _CrtSetReportMode on the host wouldn’t propagate to the plugin…

Anyway, for now, macOS and Linux Debug assertions will output to stderr. Windows will time out after 30 s (you’ll see the assertion locally in the GUI). In the worst case, maybe we can improve pluginval’s CLI output by terminating the process with an error that clearly indicates that an assertion might have occurred.

Re: validation philosophy, it’s an interesting debate… If my plugin crashed more when hosts did Wrong Things, would hosts improve their behavior? Would users blame it on me and just stop using my plugin? What’s best for the ecosystem?

I’m not an expert on the host<->plugin relationship. I just know it’s a messy business with a lot of historical quirks that are still being compensated for. As a plugin maker, that’s part of what makes JUCE so valuable. Many of those quirks are handled. I don’t want to have to deal with every DAW and framework’s quirks bubbling through to my users.

I don’t want to speak for @dave96, but that seems like a large motivation for pluginval and part of the point of fuzz testing. The VST3 host will pass values to the plugin, and it’s up to the plugin how to handle that input. 100% agree with you that 0 BPM makes no sense coming from the host… but the idea is that we’d want it to bubble up so we can identify the potential edge cases (via pluginval) ahead of time (instead of finding out via a user having trouble in host XYZ). I’m not sure what the bigger picture solution is here though…

Thanks again for your time helping to test the service!

I’m all for strong contracts but a big part of the fuzz testing approach is because contracts in plugin APIs are historically very weak.

A bpm of 0 may make no musical sense but is it an invalid input?
I can’t find anything on AU here: Apple Developer Documentation
or VST3 here: [3.7.0] Process Context Requirements - VST 3 Developer Portal or here VST 3 Interfaces: ProcessContext Struct Reference

If 0 isn’t a valid input, what is? Any value > 0?

I could be wrong but as far as I can see, pluginval doesn’t actually set the tempo anywhere. This could indicate a bug in the JUCE hosting code or something I need to tighten up in pluginval if I can find the valid input range for tempo for all plugin formats.

pluginval has two aims. The first is to test within plugin API contracts; this should happen up to level 5. If you find any behaviour there that doesn’t conform to plugin API contracts, I’d be more than happy to change it.

The second aim is to stress and fuzz test plugins in a way that is outside plugin API contracts. Users might find this useful for different reasons such as replicating bugs found with some hosts more quickly and repeatably or for finding performance bottlenecks etc. It’s really up to the individual if they want to use these tests.

But I fundamentally disagree that we should not stress or fuzz test our plugins. I just don’t agree that it’s going to make DAWs sloppy, and I don’t see how pluginval could change that. If you get some buggy behaviour from a DAW, you report it and they fix it; how does using pluginval change that? All I’m trying to do with pluginval is give users who aren’t established in the industry a tool to test their plugins, and hopefully provide an environment where DAW/JUCE plugin hosting can be improved, as it’s much easier to reproduce issues with a tool like this.


In both VST2 and VST3 BPM is optional and there is a flag to indicate if it’s valid. In JUCE6 and earlier an unset BPM appeared as 0. In JUCE7 and later BPM is now an optional to differentiate between 0 and not set. I’m not sure if a BPM of 0 is ok, but handling a not set BPM is required. There are audio editors and live hosts where the concept of a BPM doesn’t really have any meaning.


Yeah, that’s a great approach: being able to test against a valid host, but also having the option to test against a malicious host.

What I’ll do in future when a DAW is misbehaving is spew a bunch of diagnostics to std::cerr; I think that will help everyone.
Below is just a test; pluginval is actually conforming.

Can I ask what kind of call stack these errors are being triggered from?

I don’t think pluginval sets any kind of AudioPlayHead, which, as far as I’m aware, is perfectly valid both from a JUCE perspective and per the plugin APIs; they should just be queried and return “invalid” (as @RolandMR explained). Is that not what’s happening? Maybe JUCE isn’t setting these flags correctly when you query them?

Sorry, JUCE IS setting the flags correctly.

I merely simulated a ‘bad host’ in order to verify that I can log diagnostic messages that will show up in pluginval’s log.

I modified my plugin to check the flag that says the tempo is valid (if the DAW tempo is not valid, I use a default value of 120), and my plugin now passes pluginval without errors. Thanks for your help, folks!


Tracktion and The Audio Programmer are now sponsors of the site :tada:

We recently did a hangout to announce pluginval 1.0, live demo the drag and drop validation (which had a small glitch, but hey) and chat about validation generally:


The link to the example run doesn’t appear to work any more.
I ran the test on the Linux build of my plugin. No errors reported, and took 13 s to run tests. Am I in the clear? Or is there something else I should be looking for?
Also, 180 ms for cold open of plugin – does this seem a little slow to anyone else, or is it normal?

The link to the example run

Thanks for reporting, it was accidentally deleted :sweat_smile:. Should be fixed going forward.

Am I in the clear?

It should say “PASSED!” up top if the plugin passed. Maybe I should add that to the bottom as well so things are extra clear? The logs would be peppered with “FAILED!!” or a segfault if you didn’t pass.

Pluginval catches a lot of common problems that plugins tend to have once they are interfacing with hosts. Passing strictness level 10 is a great way of ensuring maximum compatibility with DAWs, but doesn’t guarantee your plugin will be bug free :slight_smile:

Also, 180 ms for cold open of plugin – does this seem a little slow to anyone else, or is it normal?

That lines up with other Linux runs. I haven’t looked into what’s being timed or what exactly happens on editor bootup, but it smells a bit long to me too!

I took this opportunity to bump the version of pluginval being used to 1.0.3 and update the server infrastructure. This didn’t end up being the most popular project, but I still think it’s a good gateway drug to using pluginval!


Cute emojis :slight_smile: Btw, it would be cool if a successful test actually spelled out clearly “Success” or something like that. The absence of errors can be seen as success, but it’s not 100% straightforward IMO.


Hmm, pluginval should output “ALL TESTS PASSED”, but I’m not seeing that on the Linux run I just made, so I’ll look into it.

There’s also this up top:

I’ll add this to the bottom as well so it’s clearer.