Applying a large amount of gain makes Logic Pro freeze

Hi!
This is my first post. I’m working on a plugin based on the circuit of a guitar distortion pedal. The pedal has two separate sections: the first boosts the signal (up to +46 dB) and the second clips the boosted signal to create the distortion. I’ve modeled the clipping stage, and simulating it in MATLAB I got the correct results, so I implemented this part in JUCE and would like to test it.

Here’s the problem: how can I boost the signal that much before the simulation of the clipping stage? I’ve tried buffer.applyGain(200 * distortion), where “distortion” is the parameter that controls the amount of gain (a value between 0 and 1) and 200 is simply 46 dB expressed as a linear factor, but when I add my plugin to a guitar track in Logic, the program freezes and I’m forced to close it. I’ve also tried multiplying each sample of the buffer by 200 before the saturation, but the same thing happens. Why does this happen? In MATLAB, with the same audio file and the same algorithm, everything works without problems. Also, if I run the plugin with only the simulation of the saturating part, it works perfectly, but obviously it sounds “clean”, because the signal needs to be boosted in order to clip.
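For context, this is roughly what my processBlock looks like (simplified, and “MyPluginProcessor” is just a placeholder for my actual class):

void MyPluginProcessor::processBlock (juce::AudioBuffer<float>& buffer, juce::MidiBuffer&)
{
    // boost the signal before the clipping stage ("distortion" is the 0..1 parameter)
    buffer.applyGain (200.0f * distortion);

    // ... simulation of the diode clipping stage goes here ...
}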
Thanks!

Have you tried debugging it? That way Xcode will jump to the line that causes the problem. It could be a bad memory access, but we will only know if you debug it. Applying high gain shouldn’t cause any freezes or crashes, it only destroys loudspeakers and the ability to hear :wink:
When porting code from MATLAB to JUCE, the zero-based indexing of C++ comes to mind, which could easily lead to a bad memory access.
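For illustration (doSomething is a made-up placeholder), a MATLAB loop ported literally reads one sample past the end of the buffer:

auto* data = buffer.getWritePointer (0);

// MATLAB-style 1-based loop: data[getNumSamples()] is out of bounds
for (int n = 1; n <= buffer.getNumSamples(); ++n)
    data[n] = doSomething (data[n]);

// correct 0-based C++ version
for (int n = 0; n < buffer.getNumSamples(); ++n)
    data[n] = doSomething (data[n]);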

It is hard to tell without seeing your code. But a side note: in the digital domain the numbers themselves don’t saturate, so there is little point in multiplying by 200.0. A better design would be to model your saturation so that it comes into effect at legitimate signal levels.
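For example (just a sketch, and the 1 + 9 * drive mapping is an arbitrary choice of mine): a tanh waveshaper can be scaled so the knee of the curve sits inside the normal ±1.0 signal range, with a drive parameter playing the role of the big fixed gain:

#include <cmath>

// saturates at ordinary signal levels; "drive" (0..1) replaces the x200 boost
float saturate (float x, float drive)
{
    return std::tanh ((1.0f + 9.0f * drive) * x);
}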

Denormal numbers in the processing chain can have a negative effect on performance as well.
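If denormals turn out to be the culprit, JUCE has a helper class for exactly that; a typical use in the process callback looks like this (the surrounding method is just the usual plugin boilerplate):

void MyPluginProcessor::processBlock (juce::AudioBuffer<float>& buffer, juce::MidiBuffer&)
{
    // sets the FTZ/DAZ CPU flags for the duration of this scope,
    // so denormals are flushed to zero instead of slowing everything down
    juce::ScopedNoDenormals noDenormals;

    // ... processing ...
}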

There are great examples about distortion by @IvanC, have a look at this thread:

Hope that helps


Thanks for your answer! Unfortunately I don’t know how to debug it, because I only encounter the problem when I use the plugin in Logic (is there maybe a way to connect Logic and Xcode?).
I’ve checked the code multiple times and there are no bad memory accesses, also because I simply use applyGain() without explicitly accessing the data in the buffer…

Thanks for your answer!
The code is simply buffer.applyGain(200 * distortion), and I’ve done exactly what you described: I’ve modeled the non-linear part of the circuit, but in order to get saturation the values entering the saturating section must be boosted, like in the analog pedal; without this it sounds “clean” (like when you roll off the volume on a guitar to get a clean sound out of a distorted one). I’ll read IvanC’s post, thank you!

I would be curious about your non-linear model.
Because the numerics themselves are linear, if you increase the gain you will have to reduce it again afterwards, and the whole operation becomes a NOP:

buffer.applyGain (200.0f * distortion);
// your modelling
buffer.applyGain (1.0f / (200.0f * distortion)); // inverse gain to undo the boost (assumes distortion > 0)

In analog circuits, especially tube amplifiers, you do this to bias the signal into a certain operating point of the tube. Class A biases into the middle of the linear region, class B biases right at cutoff (each device conducts for only half of the cycle), and class C biases below cutoff (conduction for less than half of the cycle).

Now if you model your circuit, it should be easy to adapt the model so that the operating mode you want falls within the range of your signal.
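Purely as an illustration of that biasing idea (my sketch, not your pedal’s circuit): shifting the input before a symmetric waveshaper moves the operating point, which gives asymmetric clipping, and subtracting the shaped bias removes the DC offset again:

#include <cmath>

float biasedClip (float x, float drive, float bias)
{
    // (x + bias) moves the operating point up the curve;
    // subtracting tanh(drive * bias) removes the resulting DC component
    return std::tanh (drive * (x + bias)) - std::tanh (drive * bias);
}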

Sorry for this theoretical excursion; it doesn’t actually have anything to do with your problem of Logic freezing when the gain is applied.
In these cases the best thing to try is removing code until it works normally. But if you multiply by 200, don’t forget to undo it afterwards, otherwise you’ll kill your gear and your ears…

Good luck

Thanks for the theoretical part, it’s always good to learn something new! Anyway, I modeled a very simple distortion pedal in which the clipping stage is made with two diodes: you can boost the signal very high, but it is then clipped by the diodes, so there’s no need to undo the multiplication. I started my work from a video by IvanC about modeling analog pedals; you can find it here (https://youtu.be/l_HHJdCKcjA). To test my model in MATLAB I did this:

  • create a sinusoid with amplitude 1
  • multiply it by 200
  • apply the distortion
  • plot the result (blue is the original signal before boost & distortion, red is the output of the distortion model)

As you can see, I didn’t need to undo the multiplication, because the diodes clip the signal at their threshold voltage (around 0.3 V for these diodes).
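In C++ the same test would look roughly like this (my sketch; the hard clamp at ±0.3 stands in for the actual diode model):

#include <algorithm>
#include <cmath>
#include <vector>

int main()
{
    constexpr double pi = 3.141592653589793;
    const double sampleRate = 44100.0, freq = 100.0;
    std::vector<double> out (1024);

    for (size_t n = 0; n < out.size(); ++n)
    {
        double x = std::sin (2.0 * pi * freq * n / sampleRate); // sinusoid, amplitude 1
        x *= 200.0;                                             // boost by 200 (+46 dB)
        out[n] = std::clamp (x, -0.3, 0.3);                     // diodes clip at ~0.3 V
    }
    // out now holds the distorted waveform -- no need to undo the boost
}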


Thanks for sharing your model. Very interesting, I hadn’t thought of limiting to create distortion, but yes, it makes sense now.

Debugging with Xcode is fairly simple: Cmd+< -> Run -> Build Configuration: Debug -> Executable: Ask on launch.
Then build the plug-in and select Logic when asked; once the build finishes, Logic will start, you can load your plug-in, and it will be debugged by Xcode.

Thank you so much! This will really help!