Clock: Measure in samples, or beats per minute?


When implementing a master clock, should I store the tempo as samples per beat (always integral) or as beats per minute (floating point)?

Going with the integer makes the calculation of sample indices “perfect”: there are no rounding worries, and you always know the exact phase of the current audio output block.

Going with the float gives sub-sample precision, which might improve accuracy over long inputs?
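To make the tradeoff concrete, here is a small sketch (the sample rate and tempo values are just illustrative assumptions). With a fractional samples-per-beat value, the danger isn't the fraction itself but *accumulating* a rounded value: rounding once and multiplying drifts, while deriving each beat boundary from the beat index keeps the error below one sample.

```python
SR = 48_000  # assumed sample rate (Hz)

# Integer samples per beat: boundaries are exact by construction.
spb = 24_000                              # 120 BPM at 48 kHz
exact = [n * spb for n in range(1000)]

# Float BPM: samples per beat is generally fractional.
bpm = 121.0
spb_f = 60.0 * SR / bpm                   # ~23801.65 samples per beat

# Naive: round once, then accumulate -- error grows with n.
naive = [n * round(spb_f) for n in range(1000)]

# Better: round per boundary, derived from the beat index, so the
# error never exceeds half a sample and never accumulates.
derived = [round(n * spb_f) for n in range(1000)]

drift = naive[999] - derived[999]         # hundreds of samples after ~1000 beats
```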

What is the convention here?


Personally, I’d always have a master clock just provide its most accurate value, which is probably the raw sample count. It’s easy enough for its clients to convert that into their own beat number and offset.
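A minimal sketch of that split, assuming a hypothetical `MasterClock` that only counts samples and a client-side helper that maps the count to a beat number and offset for whatever tempo the client cares about:

```python
class MasterClock:
    """Sketch: the clock knows nothing about tempo; it only counts samples."""

    def __init__(self, sample_rate: int):
        self.sample_rate = sample_rate
        self.sample_count = 0

    def advance(self, block_size: int) -> None:
        # Called once per audio block rendered.
        self.sample_count += block_size


def beat_position(sample_count: int, sample_rate: int, bpm: float):
    """Client-side conversion: raw sample count -> (beat number, offset in samples)."""
    samples_per_beat = 60.0 * sample_rate / bpm
    beat = int(sample_count // samples_per_beat)
    offset = sample_count - beat * samples_per_beat
    return beat, offset


clock = MasterClock(48_000)
for _ in range(10):                # render ten 512-sample blocks
    clock.advance(512)

beat, offset = beat_position(clock.sample_count, clock.sample_rate, 120.0)
# 5120 samples at 120 BPM / 48 kHz is still within beat 0
```

This keeps the clock itself trivially correct (an integer counter can't drift), and lets each client choose its own precision when converting to beats.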