I’m new to DSP and wondering: how do I calculate the frequency in Hz of incoming samples?
Can I get the Hz of an individual sample, or do I have to look at a range of samples?
edit: I want my plugin to act on only certain frequencies, e.g. only add distortion to samples in the mid frequency range.
One sample on its own means nothing. You need to look at a range of samples. The basic gist of pitch detection is that you find approximately how many samples are between cycles – ie, how long it takes for the signal to start repeating itself. This is called the period, and from this you can get the frequency.
This could be solved with a band-pass filter: split the signal into frequency bands, apply the distortion only to the desired band, and mix everything back together at the end.
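Here’s a minimal self-contained sketch of that idea. It uses a standard RBJ-cookbook biquad band-pass plus tanh for the distortion; the class name, centre frequency, and Q below are arbitrary choices for illustration, not anything from a specific library:

```cpp
#include <cmath>
#include <vector>

// Standard biquad band-pass (RBJ cookbook, constant 0 dB peak gain).
struct BandPass
{
    double b0, b1, b2, a1, a2;
    double z1 = 0.0, z2 = 0.0;

    BandPass (double centreHz, double q, double sampleRate)
    {
        const double pi    = 3.141592653589793;
        const double w0    = 2.0 * pi * centreHz / sampleRate;
        const double alpha = std::sin (w0) / (2.0 * q);
        const double a0    = 1.0 + alpha;

        b0 =  alpha / a0;
        b1 =  0.0;
        b2 = -alpha / a0;
        a1 = -2.0 * std::cos (w0) / a0;
        a2 =  (1.0 - alpha) / a0;
    }

    // transposed direct form II
    double process (double x)
    {
        const double y = b0 * x + z1;
        z1 = b1 * x - a1 * y + z2;
        z2 = b2 * x - a2 * y;
        return y;
    }
};

// Distort only the mid band, leave the rest of the signal untouched.
// (Subtracting the band and adding back a distorted copy is only an
// approximation of a clean band split, but it shows the structure.)
void distortMids (std::vector<double>& buffer, double sampleRate)
{
    BandPass mids (1000.0, 0.7, sampleRate); // arbitrary centre / Q

    for (auto& sample : buffer)
    {
        const double band = mids.process (sample);
        sample = (sample - band) + std::tanh (4.0 * band);
    }
}
```

A real plugin would typically use matched crossover filters (e.g. Linkwitz-Riley) so the bands sum back flat, but the structure — split, process one band, recombine — is the same.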
The period is basically the wavelength, measured in samples.
1 Hz = 1 period per second
sampleRate: samples per second
therefore:
sampleRate / frequency
= (samples / second) / (periods / second)
= samples / periods
the sample rate can be seen as the speed of sound in our system
Example:
441 Hz at 44.1 kHz = 44100 / 441 = 100 samples per period (easier to compute than 440 Hz )
The standard trick is a difference function computed over a frame of audio:

$$D(\tau) = \sum_{t=0}^{N-\tau-1} d_t(\tau), \qquad d_t(\tau) = \left( x_t - x_{t+\tau} \right)^2$$

where:
- $d_t(\tau)$ is the squared difference at lag $\tau$ of input signal $x$ at sample $t$;
- $D(\tau)$ is the summation of all values $d_t(\tau)$ for each sample index $t$ in the current frame of audio of length $N$.

This difference function is calculated for every lag $\tau$ within the set of possible period values, and the $\tau$ value whose difference function shows the lowest amount of difference between the original and delayed signal (i.e., the lowest value of $D(\tau)$) is determined to be the period.
some example pseudocode:
template<typename SampleType>
void calculateDifferenceFunction (const SampleType* samples, int numSamples, SampleType* out)
{
    // tau is the amount of delay in the signal that we're going to test.
    // tau is *also* the period value we're testing.
    for (auto tau = 0; tau < numSamples; ++tau)
    {
        out[tau] = 0;

        for (auto i = 0; i + tau < numSamples; ++i)
        {
            // comparing the signal to itself with a delay of tau samples
            const auto delta = samples[i] - samples[i + tau];
            out[tau] += (delta * delta);
        }
    }
}
// requires <algorithm> and <iterator>
template<typename SampleType>
int findPeriod (const SampleType* samples, int numSamples, SampleType* workBuffer)
{
    calculateDifferenceFunction (samples, numSamples, workBuffer);

    const auto* workBufEnd = workBuffer + numSamples;

    // find the minimum difference function value, skipping tau = 0
    // (the signal is always identical to itself at zero delay).
    // In practice you'd also cap the maximum lag, since large tau
    // values sum over very few samples.
    const auto* minElement = std::min_element (workBuffer + 1, workBufEnd);

    // what we want is the *index* in the work buffer where the minimum value was
    // index = tau = period
    return static_cast<int> (std::distance (workBuffer, minElement));
}
TL;DR:
You find the period by finding the number of samples it takes for the signal to become most similar to itself again. Think of cycles of a sine wave - you take a copy of the sine wave and shift it around in time until you find where it lines up with itself again, and the amount you shifted it by to make it line up is the period.