I am new to JUCE and I picked it up because I like the GUI classes you have. But I can’t for the life of me figure out how to use your AudioSampleBuffer class. I’ve been trying to implement a simple delay just to get started and have failed miserably. How do I take this code from my VST SDK simple delay and implement it with the AudioSampleBuffer class in the processBlock method?
index is my pointer in my circular buffer. bufferLeft is my circular buffer.
blend and feedback are the float blend and feedback values from 0 to 1.
delaySamples is an int which is the number of samples I’m delaying by.
I got that far I think. I don’t know how to write to the output stream (or if it’s even set up that way). Is the AudioSampleBuffer that’s passed into the processBlock method both the input and the output?
I’m trying to port an old VST plug to AU using JUCE and I’m stuck on the same issue: I don’t know how to write to the output stream, and the link above is dead. Could someone please help? My issue is almost identical to the OP’s above. Thanks
That’s what I thought and I’ve been using the demo plugin as a reference, I just can’t get my plugin to work. I’ll keep trying and maybe I’ll post some code later and you might be able to spot the problem.
First of all, it’s not a good idea to mix indexing (data[i]) with pointer arithmetic (*data++). The first expression takes the ith offset from the pointer data, while the second gives you the data at offset 0 and advances the position afterwards. If you increase your pointer in every loop iteration while also indexing by i, you’re reading from data[0], data[2], data[4]… which might get you an access violation once you start reading from outside your buffer. I don’t know what host you’re using, but the host might decide that your plugin is no good due to the access violation, resulting in the silence.
Also your code only works for mono data. With stereo data you will mix the channels since you fill your delay buffer with the alternating channel data.
Generally, this is something you can probably spot quite easily with a debugger. Simply set a breakpoint inside the loop and have a look at the data.
I’m an idiot, thanks Chris, that’s solved the major problem. I’ve been away from this kind of stuff for a while and it’s taking time to relearn my own algorithms! I hadn’t commented it very well when I first developed this simple process.
Chris, is it this line of code in the jucePluginDemo that handles filling the delay buffer with alternating channel data for stereo? I’ve tried to set mine up like this and it still works fine in mono, but I’m getting strange phasing and distortion when I use it on a stereo track in Logic 9.
I promise this is my last noob question; I realise it’s probably outside the scope of this forum, but I’d really appreciate your insight.
Thanks…
My updated code looks like this:
[code]void ReverseDelayAudioProcessor::processBlock (AudioSampleBuffer& buffer, MidiBuffer& midiMessages)
{
    const int sampleFrames = buffer.getNumSamples();
    float delOut;

    for (int channel = 0; channel < getNumInputChannels(); ++channel)
    {
        float* data = buffer.getSampleData (channel);
        float* delayData = delayBuffer.getSampleData (jmin (channel, delayBuffer.getNumChannels() - 1));

        for (int i = 0; i < sampleFrames; ++i)
        {
            delOut = delayData[writePointer];
            delayData[writePointer] = data[i];
            ++writePointer;

            // Check bounds for wp
            if (writePointer >= size)
                writePointer = 0;

            // Apply amplitude envelope to the buffer using a cosine wave
            delOut = (delayData[readPointer1] * cosine[readPointer1])
                   + (delayData[readPointer2] * (1.0f - cosine[readPointer2]));
            data[i] = delOut;

            // Play stored samples in reverse
            --readPointer1;

            // Check bounds for rp1
            if (readPointer1 < 0)
                readPointer1 = size - 1;

            --readPointer2;

            // Check bounds for rp2
            if (readPointer2 < 0)
                readPointer2 = size - 1;
        }
    }

    // In case we have more outputs than inputs, we'll clear any output
    // channels that didn't contain input data, (because these aren't
    // guaranteed to be empty - they may contain garbage).
    for (int i = getNumInputChannels(); i < getNumOutputChannels(); ++i)
        buffer.clear (i, 0, buffer.getNumSamples());
}[/code]
Does delayBuffer have two channels? If not, you’re using the same buffer for both channels due to the jmin (channel, delayBuffer.getNumChannels() - 1).
Also, with stereo your read pointers jump from block to block, because you’re updating the same variables for both channels.