How to convert MIDI event times to beats in a plugin / AudioProcessor

I was getting quite confused trying to convert MIDI events to beats in my plugin, and I’ve figured it out, so I thought I’d post a minimal code example. If anyone else is confused, this may help, and possibly this info could be added to docs or tutorials. Also if there’s anything misguided in my approach, someone can suggest improvements!

double samplesToBeats(juce::int64 timestamp, double tempoBpm) {
    double secondsPerBeat = 60.0 / tempoBpm;
    double samplesPerBeat = secondsPerBeat * getSampleRate();
    return timestamp / samplesPerBeat;
}

void processBlock(juce::AudioBuffer<float>& buffer, juce::MidiBuffer& midiMessages) {
    double playheadTimeSamples = 0;
    double tempoBpm = 120;

    // Get the current playhead position & timing info from the host.
    juce::AudioPlayHead::CurrentPositionInfo playheadPosition;
    juce::AudioPlayHead* playhead = AudioProcessor::getPlayHead();
    if (playhead != nullptr && playhead->getCurrentPosition(playheadPosition)) {
        playheadTimeSamples = playheadPosition.timeInSamples;
        tempoBpm = playheadPosition.bpm;
    }

    // Loop over each event, dumping its info and time in beats to the console.
    for (const auto metadata : midiMessages) {
        auto message = metadata.getMessage();

        // Convert the event time to a global position
        // (samples since the transport started).
        auto eventTime = playheadTimeSamples + message.getTimeStamp();
        double eventBeats = samplesToBeats(eventTime, tempoBpm);

        std::cout << message.getDescription()
                  << ", " << message.getTimeStamp()
                  << ", " << eventTime
                  << ", " << eventBeats
                  << std::endl;
    }
}

Newbie note: if you want to see the console output when testing your plugin in a DAW (e.g. Bitwig), you can use (on Mac OS X).

Why am I doing this? Thanks for asking! I’m keen to make my own custom MIDI plugins for control during live performance. For example, a MIDI filter that switches channels aligned with beat/phrase boundaries, or an automation plugin that generates CC ramps aligned with phrases/beats.