Do you really mean “Time::getMilliseconds”? That method returns the milliseconds part of a Time object, and its resolution depends on how you initialised the Time object.
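To make the distinction concrete, here is a minimal sketch (assuming a standard juce_core setup; the output labels are mine) of what the two calls return:

#include <cstdio>
#include <juce_core/juce_core.h>

int main()
{
    // getMilliseconds(): the milliseconds part (0..999) of an absolute Time value
    juce::Time now = juce::Time::getCurrentTime();
    printf ("milliseconds part of the current time: %d\n", now.getMilliseconds());

    // getMillisecondCounter(): a steadily increasing tick counter in milliseconds,
    // counted from an arbitrary starting point
    printf ("millisecond counter: %u\n", juce::Time::getMillisecondCounter());
}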
I am talking about “Time::getMillisecondCounter”, which is used internally by the Timer class.
I created this very simple test program:
#include <cstdio>
#include <juce_core/juce_core.h>  // provides juce::Time

int main (int argc, char** argv)
{
    for (int i = 0; i < 100000; ++i)
        printf ("Time::getMillisecondCounter(): %u\n", juce::Time::getMillisecondCounter());
}
The output on my Windows machine is the following:
and so on: the same value repeats for many iterations before it jumps. On Linux the counter really does advance every millisecond.
I understand that the Timer class is not designed for time-critical events, but I do not see why my test should be completely meaningless when it shows that the behaviour on Windows could be much better if “Time::getMillisecondCounter” were replaced with “Time::getMillisecondCounterHiRes” in the implementation of the Timer class.
The behaviour of the Timer class is sufficient for my applications on Linux, and it could be sufficient on Windows if it used the HiRes counter. But the default behaviour on Windows (at least on my test machines) is not good enough.
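For reference, here is a variant of my test (again only a sketch, assuming juce_core is available; the label strings are mine) that polls both counters side by side, which makes the difference in granularity directly visible:

#include <cstdio>
#include <juce_core/juce_core.h>

int main()
{
    // On my Windows machines the low-res counter advances in coarse steps,
    // while the HiRes counter (a double) advances smoothly.
    for (int i = 0; i < 1000; ++i)
        printf ("lowRes: %u  hiRes: %.3f\n",
                juce::Time::getMillisecondCounter(),
                juce::Time::getMillisecondCounterHiRes());
}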
Do you have any other explanation for why “Time::getMillisecondCounter()” behaves this way on my Windows machines? Or is there a mistake I could be making when building and linking JUCE applications?