Who called SetParameter

I have a graphic slider (knob) that can send values between 0 and 2000.

When I twist it, as expected, the setParameter member function in the plugin processor is called and all is happy.

However, if I associate an external knob with the parameter, then the value that comes into the audio plugin is always a floating point number between 0 and 1. I note that if I use the external knob, then the SetParameter function is called in the juce_AU_Wrapper which then forwards the value to the setParameter function in the plugin processor.

Question is, when I’m in the plugin processor’s setParameter method, I don’t see how I’m supposed to tell whether the value came in from the GUI (in which case the value will be between 0 and 2000) or through the external knob (in which case the value will be between 0 and 1 and needs to be scaled). And of course the value “1” can come through either method.

It’s not clear how I’m supposed to handle this. Am I supposed to modify the SetParameter in the juce_AU_Wrapper to do scaling there (which doesn’t seem right) or is there some other way to detect from where the original value came?

Thanks,
D

Why don’t you just set your slider’s range to also go from 0 to 1…?

Maybe he wants to use Hertz. A filter that goes from 0 to 1 Hz is good for sub-audio modulation, but can have a fairly, uh, subtle effect on audio signals.

I address this point in my thread about recommended changes to the Juce Demo Plugin. Having the ability to use “real-world” values for sliders is critical for plugin design. It is also critical to translate these values to and from the 0.0-1.0 range used by VST and RTAS. I have my own solution for this issue in my plugins, but I don’t think that my code would be useful example code, as I’m a pretty shoddy C++ programmer when it comes to object oriented GUI stuff (efficient C++ for audio is a different topic). If the Juce demo plugin addressed this issue, it would be badass.

Sean Costello

I thought about having all the sliders actually produce values between 0 and 1, but then I would have to produce custom display output for all of them, and I’d rather not have to do that.

I have a simple scaling function that I can use to give me one value in terms of the other, but that’s not the issue. I need to know whether the change came from the GUI or externally so that I know whether to scale or not.
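
For what it’s worth, the scaling itself is trivial; roughly this (just a sketch, 2000 being my knob’s maximum):

    inline float normalizedToReal (float norm)  { return norm * 2000.0f; }  // host 0..1 -> my 0..2000 GUI range
    inline float realToNormalized (float real)  { return real / 2000.0f; }  // my 0..2000 GUI range -> host 0..1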

Obviously, I could hack up a solution myself by modifying the AU_Wrapper but I was trying to address my problem in the context of what’s already available…I hate having to reinvent the wheel.

Sean, it’s probably not necessary to see your actual code — but what was your approach to distinguishing between input from the GUI and input from an external knob?

P.S. I’ve noticed I’m no longer getting any email notifications even though I have that option set — anyone know whether the notification system is working properly?

You know, I’m reading my own code this morning, and I really don’t understand what I am doing! It works, but it is one of those things I wrote a while back.

A brief description:

  • I have my own Juce slider class, that can translate the “real world” values to and from VST/RTAS values.
  • My audio code is created using my own wrapper class. I have a special Parameter class, that stores the current value, the min and max values in “real world” values, and the translation functions needed to convert back and forth between the numeric ranges. With this class, I can write a GUI-less AU, and convert it to VST in about 1/2 hour. When I create a Juce plugin, I have the SetParameter() function in Juce call my SetParameterVST() function.

This code is suited to my own needs, but wouldn’t work as a good generic Juce solution. I think that Jules would probably create something that works better from a proper C++ perspective.
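
Roughly, though, the Parameter class boils down to something like this (a sketch with made-up names, linear mapping only, not my actual code):

    struct Parameter
    {
        double value;     // current value, in real-world units
        double minimum;   // real-world minimum
        double maximum;   // real-world maximum

        // What the VST/RTAS side sees (0..1)
        double getNormalized() const          { return (value - minimum) / (maximum - minimum); }

        // Called (via SetParameterVST) with a 0..1 value from the host
        void setFromNormalized (double norm)  { value = minimum + norm * (maximum - minimum); }
    };

The exponential ranges just swap in a different pair of translation functions.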

Sean Costello

I’m just revisiting the parameter range issue for plugins.

Previously, I’d patched the AU wrapper just to get real-world values displayed on the track automation in Logic (for example). But that wouldn’t have worked in VST (or RTAS).

I’ve now gone for something similar to Sean’s approach: a subclass of Slider that displays the real value (and optionally a units suffix) in its text box by overriding getValueFromText() and getTextFromValue(). Then I’ve added the following functions to my AudioProcessor subclass:

    float getMappedParameter(int index);
    void setMappedParameterNotifyingHost(int index, float newValue);         
    float getParameterMin(int index) const;
    float getParameterMax(int index) const;
    bool getParameterWarp(int index) const;
    const String getParameterUnits(int index) const;
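
For the Slider side, the overrides are only a few lines. A rough sketch (made-up names, assumes the usual JUCE headers, and the exact virtual signatures vary a little between JUCE versions):

    class UnitSlider  : public Slider
    {
    public:
        UnitSlider (const String& name, const String& unitsSuffix)
            : Slider (name), units (unitsSuffix) {}

        // Show the real value plus an optional units suffix in the text box
        const String getTextFromValue (double value)
        {
            return String (value, 2) + " " + units;   // e.g. "440.00 Hz"
        }

        // Strip the suffix off whatever the user typed, then parse the number
        double getValueFromText (const String& text)
        {
            if (units.isEmpty())
                return text.trim().getDoubleValue();

            return text.upToFirstOccurrenceOf (units, false, true)
                       .trim().getDoubleValue();
        }

    private:
        String units;
    };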

I’m storing all my parameter info in a nested namespace, e.g.:

    namespace Parameters
    {
        static const char UNUSED_NOWARN *Names[] = {
            "Gain",
            "Pan",
            "Cut-off"
        };
 
        static const ParamRange Ranges[] = {
            {0, 1, LINEAR, 0},
            {-1, 1, LINEAR, 0},
            {50, 18000, EXPONENTIAL, "Hz"}
        };
 
        enum {
            Gain,
            Pan,
            Cutoff,   
 
            Count
        };
    }

With ParamRange just being a lightweight struct:

    struct ParamRange
    {
        double minimum;
        double maximum;
        bool warp;        // true for exponential, false for linear
        char units[64];   // use 0 for no units, e.g., "Hz" for frequency
    };

(If I were designing this again from scratch, the parameter name would be in this struct too, but I have users of my code and that change would break their code at the moment.)
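
The mapping itself is nothing clever. Something along these lines (a sketch rather than my exact code; the exponential case assumes a positive minimum):

    #include <cmath>

    // Map a normalised 0..1 value onto a ParamRange and back.
    // warp == true gives an exponential (equal-ratio) curve, e.g. for Hz.
    inline double rangeToReal (const ParamRange& r, double norm)
    {
        if (r.warp)
            return r.minimum * std::pow (r.maximum / r.minimum, norm);

        return r.minimum + (r.maximum - r.minimum) * norm;
    }

    inline double realToRange (const ParamRange& r, double real)
    {
        if (r.warp)
            return std::log (real / r.minimum) / std::log (r.maximum / r.minimum);

        return (real - r.minimum) / (r.maximum - r.minimum);
    }

getMappedParameter() is then essentially getParameter() pushed through the forward mapping with the right entry from Parameters::Ranges, and setMappedParameterNotifyingHost() applies the inverse before calling setParameterNotifyingHost().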

Now I find that the String returned from AudioProcessor::getParameterText() is the value displayed in Cubase when doing automation. I don’t have an RTAS dev license; does RTAS do the same? But of course Logic still displays 0-1 rather than the real range.

So there seems to be a general solution that would work:
  • AudioProcessor deals with real-world parameters rather than 0-1 and has knowledge of the parameter range (and ideally a concept of linear/exponential mapping for things like frequencies and delay times, or perhaps just using a similar idea to the Slider skew concept).
  • The AU wrapper sets up the parameter ranges (and possibly the unit type?) as is possible in AU.
  • The VST and RTAS wrappers map down to 0-1 to communicate with the host.

This should be possible without breaking existing code by keeping setParameter/getParameter (normalised versions) and adding ‘mapped’ versions of these functions.

This looks like a useful hack to distinguish between the sources — it does bother me a little that a GUI object has to be used to do this rather than having the original call include a parameter to indicate the source.

Okay this might not be directly relevant to the issue of the VST interface, but my open source IIR filters library + demo directly addresses the general problem of mapping domain-specific values to user interface elements.

Here is the interface and implementation I am using:

http://code.google.com/p/dspfilterscpp/source/browse/trunk/include/DspFilters/Params.h

http://code.google.com/p/dspfilterscpp/source/browse/trunk/include/DspFilters/Param.cpp

Every “Param” has the native value, which is a filter parameter expressed in native units. For example, cutoff frequency in hertz. Or the filter order (an integer in the range 1 to 50).

For each native value there is also a “control value”, a number in the range 0 to 1. Each ParamInfo can be customized to use a different mapping to and from the control value. For example, frequencies are mapped using a logarithmic function. The “Q” resonance is mapped using an exponential function. Every ParamInfo also has a routine to pretty-print the value to a string for display.
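
To give a flavour of it without anyone having to read the source, the idea boils down to something like this sketch (not the actual DspFilters code), here for the integer filter-order parameter:

    #include <cmath>
    #include <cstdio>
    #include <string>

    // Sketch of the native-value / control-value idea for an integer
    // filter-order parameter: the control value is always 0..1, the native
    // value is the order itself, snapped to a whole number.
    struct OrderParamInfo
    {
        int minOrder;   // e.g. 1
        int maxOrder;   // e.g. 50

        double toControlValue (int order) const        // native -> 0..1
        {
            return double (order - minOrder) / double (maxOrder - minOrder);
        }

        int toNativeValue (double control) const       // 0..1 -> native (snapped)
        {
            return minOrder + (int) std::floor (control * (maxOrder - minOrder) + 0.5);
        }

        std::string toString (int order) const         // pretty-print for display
        {
            char buf[16];
            std::snprintf (buf, sizeof (buf), "%d", order);
            return std::string (buf);
        }
    };

A frequency ParamInfo does the same thing with a log/exp pair of mapping functions instead.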

I hope this is inspiring to someone.

Yes it is, I think I’ll do something similar.

PS your .cpp file seems to be Param.cpp not Params.cpp, so the link didn’t quite work.

I’ve also had a bunch of suggestions about this from other people - seems to be a lot of you all coming up with similar-but-not-quite-the-same approaches.

I think what I’ll do is sketch out one of the other suggestions I’ve had, and check it into the demo plugin as a starting point, with the aim of discussing and evolving it into a solution that everyone likes, and will then move it into the library.

[quote=“jules”]I’ve also had a bunch of suggestions about this from other people - seems to be a lot of you all coming up with similar-but-not-quite-the-same approaches.

I think what I’ll do is sketch out one of the other suggestions I’ve had, and check it into the demo plugin as a starting point, with the aim of discussing and evolving it into a solution that everyone likes, and will then move it into the library.[/quote]

Excellent!

One other issue that is semi-related is having a warping of the slider parameter that also maps to the warping of the automation parameters in the DAW. Here’s the problem I am having:

  • I have a parameter in my GUI that has a non-linear warping of the Juce slider. It has a nice exponential skew.
  • The same parameter, in the DAW automation parameters, is totally linear. All of the useful low values are bunched up in a few pixels of control.

I was going to deal with this by using my own internal parameter warping, having the slider spit out 0 to 1 values, and using a separate Label to display the internal parameter values, instead of the Slider’s own label. But if this can be dealt with within the Slider code itself, I would be a super happy camper.
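
In other words, I want the processor to push the host’s 0-1 automation value through the same skew curve as the slider. Assuming the usual Slider skew convention (the 0-1 proportion is raised to 1/skew going out and to skew coming back), that’s just a pair of functions like this sketch:

    #include <cmath>

    // Sketch: apply the same skew curve to the host's 0..1 automation value
    // that the slider applies to its track position, so GUI and automation
    // warping match.
    inline double skewedToReal (double norm, double min, double max, double skew)
    {
        return min + (max - min) * std::pow (norm, 1.0 / skew);
    }

    inline double realToSkewed (double value, double min, double max, double skew)
    {
        return std::pow ((value - min) / (max - min), skew);
    }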

Jules, I really appreciate your looking at these issues. I think Juce is an amazing toolkit for constructing plugins - I’ve had a lot of fun coming up with my own “procedural” controls with the LookAndFeel. Addressing these common cross-platform issues will let plugin developers get up and running far quicker.

Sean Costello

Hmm. Possibly the best plan is to have a generic “LinearMapping” class that can be used by sliders, parameters, etc, to map values between two different scales, using whatever shape you want.

Can I assume that the solution you implement will mean we won’t need to care where the value originated (i.e., the GUI or an external knob) and we’ll just get a normalized value? Will we be able to tell where the value came from, so that if (for example) it came through the GUI we can also update the external view if there is one, or will that happen automatically?

The reason I’m asking is because I’m wondering whether I always need to know from where the value came, even if it is normalized?

Don’t know, haven’t thought about it in detail yet.

Ha ha ha

From my perspective, as long as I know from where the info came (GUI or externally), then I can do the scaling myself trivially.

A few additions, or restatements for emphasis:

  • As martinrobinson mentioned earlier, even if you translate your values into real-world numbers for your VST plugin, they will show up as 0-1 numbers in AU automation. I don’t know of any way around this using classes that aren’t part of the AU or VST wrapper, as the whole point of the audio code in Juce is that it doesn’t know whether it is running as an AU, a VST, or as RTAS. So a solution to this would be great.

  • The Juce-derived Audio Unit automation parameters also don’t display any text strings at the end for units. In my AudioProcessor class, I use the following code to display units for my VST automation parameters:

    const String ValhallaRoom::getParameterText (int index)
    {
        switch (index)
        {
            case kParamMix:      return (String (reverb.GetParameter (kParamMix), 2) + T(" % "));  break;
            case kParamPreDelay: return (String (reverb.GetParameter (kParamPreDelay), 2) + T(" ms "));  break;
            case kParamDecay:    return (String (reverb.GetParameter (kParamDecay), 2) + T(" s "));  break;
            {...}
            default:             return String (reverb.GetParameter (index), 2);  break;
        }
    }

  • The CoreAudio SDK provides a list of units for the parameters, such as kAudioUnitParameterUnit_Generic, kAudioUnitParameterUnit_Hertz, kAudioUnitParameterUnit_Decibels, and so on. Adding support for a similar list of “standard” units would be useful, as it could map to Audio Units as well as to text strings for VST plugins as specified above (a rough sketch follows after this list).

  • The Juce sliders currently only support unipolar warping. Bipolar warping is useful for specifying parameters such as shifting or tuning by semitones, or any type of algorithm where negative frequencies are useful. I’ve implemented my own, but my solution is not as robust as what could be put into the AU/VST/RTAS wrappers, or however the code ends up being developed.

  • Currently, the warping in my Slider class doesn’t map to the warping that is displayed in the automation parameters. I’ve had several users complain about this. Having identical warping for both GUI and automation parameters is a high priority.
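
To sketch the units idea from the list above (names invented; nothing like this exists in the wrappers yet):

    // A plugin-side unit tag that the AU wrapper could translate to the
    // CoreAudio constants (unitHertz -> kAudioUnitParameterUnit_Hertz,
    // unitDecibels -> kAudioUnitParameterUnit_Decibels, etc.) and the
    // VST/RTAS side could translate to a text suffix for getParameterText().
    enum ParameterUnit
    {
        unitGeneric,
        unitHertz,
        unitDecibels,
        unitMilliseconds,
        unitPercent
    };

    inline const char* unitSuffix (ParameterUnit u)
    {
        switch (u)
        {
            case unitHertz:         return " Hz";
            case unitDecibels:      return " dB";
            case unitMilliseconds:  return " ms";
            case unitPercent:       return " %";
            default:                return "";
        }
    }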

Sean Costello

Hi,

I’m working on an existing plugin which uses parameters outside the 0…1 range, and I’m wondering if I should do the parameter scaling myself or if there’s going to be a new version of JUCE which will handle it automatically. Reading this and other related threads, it seems to me this is an issue that is currently being addressed (partly to improve the platform independence between VST and AU, as far as I understood).

Thanks in advance for giving me some advice in this matter.

Best regards,
Fritz

I started doing some work on that, but never finished… Will do when I get a chance!

Thank you for your quick reply. I’m just afraid that I’ll end up doing pretty much the same thing as you because the project I’m working on has to handle reconfigurable parameter sets, and the “real” parameter ranges are not even defined at compile time… But it’s not at the top of my priority list because for now I’m using a host that seems to handle parameters outside the 0…1 range well (LiveProfessor: http://ifoundasound.com/?page_id=8).