Hi folks,

I’m comparing HighResolutionTimer to Time::getHighResolutionTicks() to see how accurate it is, or at least whether the two systems agree. I set the timer interval to 1 millisecond and increment a counter in each callback, so the counter measures the running time in timer milliseconds. When I start the timer, I also record the start time in high-resolution ticks. Then, in each callback, I read the current time in high-resolution ticks, subtract the start time to get the elapsed time in ticks, and convert that to milliseconds. I print both values to std::cout, run it for a while, and see something like this:

since start: 40951.1 : 40951 : 0.0820312

since start: 40952.1 : 40952 : 0.0820312

since start: 40953.1 : 40953 : 0.0820312

since start: 40954.1 : 40954 : 0.0820312

since start: 40955.1 : 40955 : 0.0820312

since start: 40956.1 : 40956 : 0.0820312

The first number is the elapsed time computed from Time::getHighResolutionTicks(), the second is the counter value, and the third is the difference between them. They are normally in very close agreement, usually within 0.08 ms. Periodically, however, there is a block of output that looks like this:

since start: 40964.3 : 40957 : 7.29297

since start: 40964.4 : 40958 : 6.36719

since start: 40964.4 : 40959 : 5.4375

since start: 40964.5 : 40960 : 4.50781

since start: 40964.6 : 40961 : 3.57812

since start: 40964.6 : 40962 : 2.64844

since start: 40964.7 : 40963 : 1.71875

since start: 40964.8 : 40964 : 0.847656

since start: 40965.1 : 40965 : 0.078125

What this appears to show is a series of HighResolutionTimer callbacks firing within a tenth of a millisecond of one another. Is it possible for HighResolutionTimer callbacks to “stack up” and then be processed in quick succession? What could explain this behavior?

Thanks for any ideas!