Rendering a visual depiction of a staff w/ notes, based on MIDI notes being played

Hello everyone,

I’m fairly new to Juce, and I’m working on a polyphonic synthesizer. As part of the GUI, I’d like to display a musical grand staff, with noteheads on it displaying the MIDI notes currently being played.

On the audioProcessor side, I have an array that collects the currently held pitches, so I can pretty easily feed my audioEditor a simple list of MIDI pitches to display; it’s really just a question of being able to draw them accurately.

The only way I can think of to implement this would be to display an image of a blank grand staff, and then draw notehead-shaped ovals at the proper Y coordinates, determined by some formula relating the MIDI pitch to the height above the bottom of the grand staff image. Is that basic approach correct? And if so, any thoughts on how to calculate the needed distance from the top/bottom of the grand staff image based on midiPitch?

[also, adding/removing ledger lines could be another issue entirely… maybe if a note is “off the staff”, then instead of a regular notehead, draw a notehead w/ a ledger line through it at those coordinates…?]

Thanks in advance for your help!


While it is being played? The bigger problem I see with that would be knowing which note value to draw (quarter, eighth, etc.), since you don’t know how long a note was until it’s released, and manually playing a note at exactly the right length can be very difficult, especially at higher tempos or for shorter-duration notes.

But yes, your approach sounds like what I would do for the drawing. I’d probably have pre-made bitmaps for each possible note type (including up and down versions), with transparency for the non-black parts, to let the background staff images show through.

I actually don’t need to draw/render any tempo/rhythmic information – I just want to display a visual reference of what pitches are active/on AT THIS EXACT MOMENT, so every pitch that’s on can be represented by just a black notehead with no stem, similar to the NSlider object in Max/MSP.

would you store the notehead shape as a bitmap, and draw it in all the correct locations for each active pitch? Or try to do this with some complicated vector graphics?

You should use a font like Bravura and use the information in SMuFL on how to draw and connect those.

Beware, it is a very long topic, the layout is a difficult task, especially the horizontal layout.

Bravura font: https://github.com/steinbergmedia/bravura

SMuFL information: https://www.smufl.org/

btw. Bravura is licensed under the permissive SIL Open Font License.


Apologies for the delayed response, I’ve been addressing some more of the under-the-hood code for my project before circling back around to this…

You’re not kidding, this appears to be a very deep rabbit hole! I may end up nixing this part of my project if I can’t find a simpler way to implement this. Though it does surprise me that there isn’t an easier way to display some notes on a staff, it seems like that would be a fairly common task done with JUCE…

Deep? It isn’t that difficult, it’s not very different from showing some dots in a diagram… albeit with some more fancy stuff and lots of corner cases.

First you need some SVG assets for the clefs, accidental symbols and the note(s). There are plenty of them free on the internet, just google around a bit. Extract the necessary parts from the collected files with an SVG editor, or just by viewing the source in your browser. See my earlier post (Changing Color of SVG).

Now, as you’re very well aware, the difference between a C and a D is two MIDI notes (60 -> 62), whereas E to F is only one (64 -> 65), but both intervals should be drawn with the same vertical distance: a half line. I.e. you need a lookup table to convert from MIDI note numbers to positions on the staff. The example was for C, but the other keys work the same, it’s only a question of offset…
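A minimal sketch of such a lookup table in plain C++ (the names here are my own, not from any library):

```cpp
#include <array>
#include <cstddef>

// Each pitch class (0 = C .. 11 = B) mapped to its diatonic step in the octave.
// A sharp shares the step of the natural below it; it gets an accidental glyph
// instead of its own vertical position.
constexpr std::array<int, 12> diatonicStep { 0, 0, 1, 1, 2, 3, 3, 4, 4, 5, 5, 6 };

// Position of a MIDI note in half-line units, counted upward from C-1.
// Subtracting the step of a reference note (e.g. the staff's bottom line)
// gives the offset to draw at.
constexpr int staffStep (int midiNote)
{
    return (midiNote / 12) * 7 + diatonicStep[(size_t) (midiNote % 12)];
}

// True for the black keys, when spelling everything with sharps.
constexpr bool needsSharp (int midiNote)
{
    const int pc = midiNote % 12;
    return pc == 1 || pc == 3 || pc == 6 || pc == 8 || pc == 10;
}
```

With this, C→D (60→62) and E→F (64→65) both come out one half line apart, as described above, and a C-sharp lands on the same step as the C below it.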

Talking of keys, you extract the key signatures from the dedicated MIDI messages and display the relevant set of sharp or flat symbols. If playing live, you could select the intended key manually from a menu, I guess.

In the constructor or some once-executed function you create Paths from the SVG files (collected as above), e.g. Drawable::parseSVGPath (noteData);

In your resized() function you determine the scale of the drawing area and do the scaling of the SVGs. You’d use something like svg.applyTransform (AffineTransform::scale()) to do that.

In paint() you draw the notes with fillPath (svgNote, AffineTransform::translation (x, y)), where y is a function of the MIDI note number, the key signature and the current drawing scale.

I guess the best way to calculate the y-position for the note path in the chart would be right when a note-on or note-off occurs. Guess you have a collection of the current notes being played in an array. And then you do a call to repaint() for every change in that array.
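As a sketch of that y-function in plain C++ (bottomLineStep, bottomLineY and lineSpacing are hypothetical parameters your resized() would supply):

```cpp
// Converts a note's half-line step (from the lookup table described above)
// to a pixel y coordinate. 'bottomLineStep' is the step of the staff's
// bottom line, 'bottomLineY' its pixel y, and 'lineSpacing' the distance
// between two adjacent staff lines.
float stepToY (int step, int bottomLineStep, float bottomLineY, float lineSpacing)
{
    // One step is half a line spacing; higher steps sit higher on screen,
    // i.e. at smaller y values.
    return bottomLineY - float (step - bottomLineStep) * (lineSpacing * 0.5f);
}
```

Caching these values whenever the held-notes array changes (and then calling repaint()) keeps paint() itself cheap.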

That’s the gist of it, I think… Lots of arithmetic, but I guess a bit easier than Fourier transforms and other exotic stuff

PS I listened to some of your music on Soundcloud. I liked it.

That might be true if you simply want to draw some note like shape. Even with notes and accidentals there are already layout problems, if you don’t want to count pixels in your SVG and hardcode positions.

There are several permutations where to put accidentals per clef, how close you can put them so they don’t overlap etc.

Next come stems and beams: how to make a beam a fluid line following the melody, without jumps. There are also the horizontal positions, which should reflect the notes’ proportional timing. And when there are double notes that need to be drawn on the left and right of the stem, there is another margin to consider.

But I don’t mean to lecture you, make your own experiences. Maybe one has to try before one can appreciate the details.

Score music is indeed a niche in audio programming; most engineers and electronic producers care more about waveforms and MIDI events. In a modern workflow, scores become less and less important – I would even claim that only a minority in the industry knows how to read them, but I don’t mean to be patronising.

However, they are important in ensemble music and orchestral music.

My recommendation is to use a library, however I am still evaluating myself.

There is LilyPond, which is a LaTeX-style engraver. It runs as a separate batch process, so it’s not easy to use interactively in an app or plugin.

Then there is Guido, which at least allows feedback, like picking notes and getting a pointer to which note in the source was hit.

To me, Lomse looks most promising; I am currently trying to integrate it into JUCE (it can draw on a pixel canvas), but I unfortunately have many other priorities.

I am undecided if I open source my juce native attempt, it does already nice stuff using SMUFL and juce::Graphics, but it’s a long way to go still and chances are I’ll abandon it in favour of one of the mentioned libraries.


@oxxyyd wow, thank you for the in depth answer!

Your suggestion of a lookup table for Y coordinates makes a lot of sense to me! I can keep the lookup process super simple by making the table have 128 index positions (one per MIDI note), of which some will hold duplicate Y coords (like a G and the G-sharp next to it). Then the X coords can be calculated based on whether any notes are within a close enough vertical distance of one another to require a horizontal offset.

Then once the notehead’s Y coord & X coord/offset have been calculated, it should be easy to create relevant accidentals for each note – the accidental will be centered on the same Y coord as the notehead, and would have a different X-center but the same X offset as the notehead, right? :thinking:
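That relationship can be sketched as a tiny helper (plain C++; the struct and names are my own, and the width/gap values would come from your glyph metrics):

```cpp
struct NotePos { float x, y; };

// Places an accidental to the left of its notehead: same vertical centre,
// shifted left by the glyph width plus a small gap. Because the shift is
// relative to the notehead's x, it follows any cluster x-offset automatically.
NotePos accidentalCentre (NotePos noteheadCentre, float accidentalWidth, float gap)
{
    return { noteheadCentre.x - accidentalWidth - gap, noteheadCentre.y };
}
```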

Then it’s just a matter of drawing each notehead & accidental at the appropriate positions…

I’m not sure if a key signature is necessary or if I’ll just display accidentals for everything… I suppose I could add that later. I do know I wanted a sharps/flats toggle, though.

Thank you again, and thanks for checking out my music, I really appreciate that! :blush:

@daniel for my specific application, I won’t need to worry about beams, slurs, or even stems really, because I only want to display “at this instant in time, what pitches are active?” So it really is just a series of notehead-like shapes, with accidentals. The staff & clefs I think can be a static image, because it will always be a piano-style grand staff…

with @oxxyyd’s suggestion of a lookup table for Y coordinates, this is sounding more achievable. IF I can figure out how to properly place the X coordinate offsets, etc, hehe… I am still a little bit confused about how exactly the notehead drawing is actually done, once I have my desired X & Y coordinates, it seems like a very complicated process for a task that should be fairly conceptually simple…

@daniel you mention a library… for this specific implementation I wish to achieve, if I can get it working, I intend to make the finalized component class available for public use on GitHub, because even though Western notated sheet music may be more niche in the audio programming community, I don’t believe it is a niche in the target demographic of VST users. I think many musicians would appreciate having a synth with a convenient little display that tells you what chord you’re playing at the moment. I can’t be the only Juce programmer who’s ever encountered this issue ¯_(ツ)_/¯

For the first iteration I’d go for something simpler for the horizontal displacement of nearby notes. Something like

int xOffset = 0;
float someScaleFactor; // e.g. the notehead width, set elsewhere

for (auto yPos : yPosOfNotesCurrentlyDown)
{
	drawNote(xOffset * someScaleFactor, yPos);

	xOffset ^= 1;
}

Then when I’ve actually got some notes showing up I’d do something more elaborate like

int xOffset = 0;

for (size_t n = 0; n < yCoordsOfNotesDown.size(); ++n)
{
	auto yPos = yCoordsOfNotesDown[n];

	if (n > 0 && yPos - yCoordsOfNotesDown[n - 1] > 1)
		xOffset = 0;	// more than a half line from the previous note: new cluster

	drawNote(xOffset * someScaleFactor, yPos);

	xOffset ^= 1;
}

This would (hopefully) also address the fact that there could be several consecutive nearby notes in a cluster.

yCoordsOfNotesDown is a sorted array containing the pre-calculated y positions of the notes currently held down, in units of half the staff line spacing.
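The same idea as a self-contained function (my own naming), returning which column each note goes in:

```cpp
#include <vector>
#include <cstddef>

// For sorted note positions in half-line units, assign each note a column
// (0 = normal, 1 = shifted right). Adjacent notes (a half line apart or less)
// alternate columns; a larger gap starts a new cluster back at column 0.
std::vector<int> clusterColumns (const std::vector<int>& sortedSteps)
{
    std::vector<int> columns;
    int column = 0;

    for (size_t i = 0; i < sortedSteps.size(); ++i)
    {
        if (i == 0 || sortedSteps[i] - sortedSteps[i - 1] > 1)
            column = 0;               // far enough from the previous note

        columns.push_back (column);
        column ^= 1;                  // an adjacent neighbour goes to the other column
    }

    return columns;
}
```

So a cluster like C-D-E comes out as columns 0, 1, 0, while an isolated note elsewhere goes back to column 0.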


I believe the right person to ask regarding implementation is @matkatmusic who did something similar (I think?) with his Chordie app:

As a user I agree that it would be a great addition to many plugins!


Thanks, @eyalamir!

Yup, I have a notation renderer in my app. Lots of folks use it to teach music theory on youtube and in private skype/zoom lessons.


@oxxyyd I’ll try this approach. Thanks for the pseudocode, this is really helpful!!

@eyalamir exactly, I’m a musician myself and I’m building an instrument plugin designed for use by musicians, not just producers/people fluent with DAWs.

also, total sidenote, but a while ago I definitely watched your video on TheAudioProgrammer’s YouTube about setting up Midi processing in a separate class, and using MidiBufferIterator – super helpful, thank you!

@matkatmusic very cool!

Hey, I’m glad you’ve found that video helpful!

It’s now getting a bit out of date because JUCE 6 deprecated MidiBufferIterator for a much cleaner, better syntax.


unfortunately yes, but it was a very good intro for me to overall project layout, moving MIDI processing into a separate file, etc… ¯_(ツ)_/¯

This is as far as I got:

I could be persuaded to open source it, if some people would commit to contribute.
The goal is to read MusicXML, render the score, allow to edit or at least select notes and phrases, process/analyse the notes… all the music theory stuff.

It renders any SMUFL compatible font (tested with Bravura and Petaluma).
Next feature are double notes and beam groups.


About that, where can I find information about the new syntax? I opened a JUCE 5 project in JUCE 6 and noticed the deprecation warning, but didn’t find the right way to do it now :confused:

Edit: Actually found it. I didn’t understand that the new iterator returns a MidiMessageMetadata that has the samplePosition, and that the message has to be retrieved via getMessage() from that metadata object. All good, sorry!

@stfufane here’s what I’ve got in my current project:

where midiMessages is a reference to the input midi buffer

for (const MidiMessageMetadata meta : midiMessages)
{
    const MidiMessage currentMessage = meta.getMessage();
    const int samplePos = meta.samplePosition;

    // do processing on currentMessage
}