Waveform display, better than min/max

sorry if this is a bit off-topic, but i need a hand. 


i'm looking at creating a kind of hybrid sampler/sound editor, and for that i need a good waveform display.  So a few weeks ago, i started looking up lots of stuff, and found that a min/max approach to getting the wave height for each pixel is not actually the best way to go, and there's a better algorithm.
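for context, the min/max approach mentioned above can be sketched roughly like this (a minimal sketch; function and variable names are my own, not from any particular editor):

```c
#include <stddef.h>

/* Min/max reduction: for one pixel column, scan the block of samples
   that maps to that column and keep the extremes.  Drawing a vertical
   line from min to max per column gives the classic waveform view. */
static void min_max_column(const float *samples, size_t count,
                           float *out_min, float *out_max)
{
    float lo = samples[0], hi = samples[0];
    for (size_t i = 1; i < count; ++i) {
        if (samples[i] < lo) lo = samples[i];
        if (samples[i] > hi) hi = samples[i];
    }
    *out_min = lo;
    *out_max = hi;
}
```

each pixel column then just covers `sample_count / pixel_width` samples, and you draw a line from min to max.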


from memory (and my memory is REALLY bad), it was something like 2 * log10(something)....

does anyone know what i am talking about, and what that algorithm is?  i swear i am not hallucinating, and i actually saw source code for it...but now i can't find it anymore. 

Presumably this is dB scaling for the sample amplitude.  If you open up Audacity, there's a drop-down on the left of the screen where you can switch between a log scale and a linear one.

If that's what you want, it's probably easy to search for dB scaling routines.  There's an open source meter plugin somewhere that'll have an example.

I'm not sure it's actually better for general viewing or editing most of the time, though.

from memory, it wasn't as simple as that.  it was more of an 'average power' algorithm: it took each value in the window, did the log transform, and then divided the total by the number of samples in the window. 


sorry i can't be more accurate...i'm scratching my head trying to remember where i saw it. 

Well - if you remember  :)  We were just debating the best ways of drawing waveforms a couple of threads back. 

ah, you were right :D  ...decibels


decibelLevel = 20 * log10(x);  // x previously made absolute


the implementation i saw was taking the average decibel level for each block of samples.  and i found it again, it was this one:



actually, i think i like the min/max display better