Looking at the class documentation, I can see that setBufferSize() sets the number of blocks of sample history that will be displayed, and setSamplesPerBlock() sets the number of samples used in each block calculation. So the buffer is a sequence of blocks, and the total time history in seconds is (buffer size * samples per block / sample rate). Naming aside, that part seems straightforward enough.
What I don't understand is the documentation for setBufferSize(). It reads:
“Note that this value refers to the number of averaged sample blocks, and each block is calculated as the peak of a number of incoming audio samples. To set the number of incoming samples per block, use setSamplesPerBlock()”
The first part of that sentence says the blocks are averaged, but the second part says each block is calculated as the peak. It can't be both. Which is it: is each block the average of its samples, or the peak value?
Logically, I would have assumed peak. I took a quick look at the source code, but it's still not completely clear to me (I'm leaning toward averaged, not peak). I see that a Path is drawn in the getChannelAsPath() method, and I don't see any peak or averaging operation between the point where data is pushed to the component and there. There are a few operations inside that I still can't get my head around, as I'm just learning parts of the framework.
If anyone smarter than I am can weigh in on what the documentation should say (or what I'm missing), and point to where this is actually calculated in the source, I'd appreciate it!