Hello, I’m experimenting with a sequencer for fun.
I would like to know the best practice for encoding an event’s timestamp:
- Using floating point arithmetic and praying that the IEEE standard is robust enough for your comparison operators.
- Using integers with a tick resolution whose factor decomposition covers the note divisions you need, so you can’t get fooled by rounding.
I know MIDI uses the second approach, but I would like to know whether major DAWs do this as well, or if it’s considered old-fashioned nowadays.
I would suggest using integers; the size (32/64 bits) depends on the use case and the selected PPQ (ticks per quarter note).
The AAX plugin format uses 960000 ticks per beat internally, which allows sub-sample accuracy at typical tempos and sample rates. With a 32-bit unsigned tick and a PPQ of 960000, the maximum position is about 2^32 / 960000 ≈ 4473 beats, or roughly 37 minutes at 120 bpm. That can easily overflow, so int64 is safer and more future-proof.
With floating point values you will run into accuracy and non-linearity issues due to the nature of the format: the spacing between representable values grows with the magnitude, so precision degrades as the timeline gets longer. Not recommended.
Yes, I refactored everything from float to int64. It was quite tedious, but now it’s working perfectly.