Save a value of enum type in ValueTree

I am able to save/load primitive values to/from a ValueTree, but I wonder if it’s possible to store/load an enum value without having to cast it to/from an integer?

Why would you need to cast it? A plain enum value is already int-compatible. Personally, I have never actually defined a member or parameter as an enum type, always as an integer. I only use the actual enum to initialize the int (or to check a range of the int).

If it’s an enum class you’d need to explicitly cast it (IIRC).

The only way to put an enum into a ValueTree is to cast it, and the simplest cast would be to an int (you could write a converter to any other type that juce::var supports, like a juce::String, but that’d be significantly more expensive). You could write a template specialisation of juce::VariantConverter to wrap the casting if you like; then you only need to use the toVar and fromVar methods, which I find can be a lot cleaner.
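That converter might look roughly like the sketch below. The MiniVar struct is a stand-in for juce::var so the snippet is self-contained here; in a real project you would specialise juce::VariantConverter&lt;YourEnum&gt; directly, which has the same static toVar/fromVar shape. PlaybackMode is a hypothetical enum for illustration:

```cpp
#include <type_traits>

// Stand-in for juce::var, just enough to hold an int for this sketch.
// In real JUCE code, use juce::var and specialise juce::VariantConverter.
struct MiniVar
{
    int value = 0;
};

// Hypothetical enum used for illustration.
enum class PlaybackMode { stopped, playing, recording };

// Mirrors the shape of juce::VariantConverter<Type>:
//   static Type fromVar (const var&);
//   static var  toVar  (const Type&);
template <typename Type>
struct EnumConverter
{
    static Type fromVar (const MiniVar& v)
    {
        return static_cast<Type> (static_cast<std::underlying_type_t<Type>> (v.value));
    }

    static MiniVar toVar (const Type& t)
    {
        return { static_cast<int> (static_cast<std::underlying_type_t<Type>> (t)) };
    }
};
```

With the real juce::VariantConverter you would then write something like tree.setProperty (propertyId, juce::VariantConverter&lt;PlaybackMode&gt;::toVar (mode), nullptr), and the casts stay hidden in one place.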


What I do is cast first to the underlying type, then to int. Most enums will use int as their underlying type, but I think it’s safest to be very explicit, and if int is the underlying type, the second cast here is a no-op:

to save:

template < typename Type >
juce::var fromEnum (Type value)
{
    return static_cast< int > (static_cast< std::underlying_type_t< Type > > (value));
}

to load:

template < typename Type >
Type toEnum (const juce::var& var)
{
    return static_cast< Type > (static_cast< std::underlying_type_t< Type > > (static_cast< int > (var)));
}
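To see why going through the underlying type matters, here is a self-contained round trip using the same two-step cast, with plain int standing in for juce::var (which converts implicitly to and from int) so it compiles without JUCE; Channel is a hypothetical enum with a deliberately non-int underlying type:

```cpp
#include <type_traits>

// Hypothetical enum with a non-int underlying type on purpose,
// so the first cast (to the underlying type) is not a no-op.
enum class Channel : unsigned char { left, right };

// Same two-step cast as the helpers above; int stands in for juce::var.
template <typename Type>
int fromEnum (Type value)
{
    return static_cast<int> (static_cast<std::underlying_type_t<Type>> (value));
}

template <typename Type>
Type toEnum (int stored)
{
    return static_cast<Type> (static_cast<std::underlying_type_t<Type>> (stored));
}
```

The round trip toEnum&lt;Channel&gt; (fromEnum (Channel::right)) hands the value back unchanged, whatever the underlying type is.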

TIL of std::underlying_type_t!


Enums (of which I now only use enum class) provide a form of safety, in that you can’t use values that are not valid without casting. I use them more and more instead of int or bool for function parameters, which makes it impossible to get the parameters in the wrong order.
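A quick sketch of that last point (all names hypothetical): with two bool parameters, a call site can swap the arguments silently, whereas distinct enum class parameters turn the same mistake into a compile error.

```cpp
// Hypothetical example: one enum class per parameter, instead of two bools.
enum class Looping { no, yes };
enum class Muted   { no, yes };

struct VoiceSettings
{
    bool looping;
    bool muted;
};

// With a (bool, bool) signature, makeVoice (muted, looping) would compile
// and silently do the wrong thing; with enum classes,
// makeVoice (Muted::no, Looping::yes) is rejected by the compiler.
inline VoiceSettings makeVoice (Looping looping, Muted muted)
{
    return { looping == Looping::yes, muted == Muted::yes };
}
```

The call site also becomes self-documenting: makeVoice (Looping::yes, Muted::no) reads clearly without needing to check the declaration.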
