AirPlay streaming

As of Mac OS 10.8 (Mountain Lion), apps can send audio to AirPlay devices. If an AirPlay device is available, Mac OS provides a new audio device named “AirPlay” with a transport type of kAudioDeviceTransportTypeAirPlay. If an application selects this AirPlay audio device for playback, CoreAudio sends the audio over the network to the physical AirPlay device. This happens completely transparently to the app: it only needs to open the AirPlay audio device, and CoreAudio does the magic.
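In code, spotting that device looks roughly like the following (a minimal sketch, not from the attached files; error handling trimmed):

```cpp
#include <CoreAudio/CoreAudio.h>
#include <vector>

// Walk the CoreAudio device list and return the device whose transport
// type is AirPlay, or kAudioObjectUnknown if none is present.
AudioDeviceID findAirPlayDevice()
{
    AudioObjectPropertyAddress addr = { kAudioHardwarePropertyDevices,
                                        kAudioObjectPropertyScopeGlobal,
                                        kAudioObjectPropertyElementMaster };
    UInt32 size = 0;
    if (AudioObjectGetPropertyDataSize (kAudioObjectSystemObject, &addr, 0, nullptr, &size) != noErr)
        return kAudioObjectUnknown;

    std::vector<AudioDeviceID> devices (size / sizeof (AudioDeviceID));
    if (AudioObjectGetPropertyData (kAudioObjectSystemObject, &addr, 0, nullptr, &size, devices.data()) != noErr)
        return kAudioObjectUnknown;

    for (AudioDeviceID device : devices)
    {
        AudioObjectPropertyAddress transportAddr = { kAudioDevicePropertyTransportType,
                                                     kAudioObjectPropertyScopeGlobal,
                                                     kAudioObjectPropertyElementMaster };
        UInt32 transport = 0;
        UInt32 transportSize = sizeof (transport);

        if (AudioObjectGetPropertyData (device, &transportAddr, 0, nullptr, &transportSize, &transport) == noErr
             && transport == kAudioDeviceTransportTypeAirPlay)
            return device;   // the virtual "AirPlay" output device
    }

    return kAudioObjectUnknown;
}
```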

If there is more than one physical AirPlay device available, CoreAudio will send audio only to the first, default device. But an application can tell CoreAudio which physical AirPlay devices should be selected for playback, and even multiple devices can be selected at once. Each physical AirPlay device corresponds to one data source of the Mac OS AirPlay audio device. It is possible to find out the names of these data sources, and it is also possible to select multiple data sources for playback.
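Listing those data sources and their user-visible names uses the standard data-source properties; roughly like this (a minimal sketch, assuming the AudioDeviceID found as shown above; error handling trimmed):

```cpp
#include <CoreAudio/CoreAudio.h>
#include <CoreFoundation/CoreFoundation.h>
#include <vector>

// Print the name of each data source (one per physical AirPlay device)
// of the virtual AirPlay output device.
void printDataSources (AudioDeviceID airplayDevice)
{
    AudioObjectPropertyAddress addr = { kAudioDevicePropertyDataSources,
                                        kAudioObjectPropertyScopeOutput,
                                        kAudioObjectPropertyElementMaster };
    UInt32 size = 0;
    if (AudioObjectGetPropertyDataSize (airplayDevice, &addr, 0, nullptr, &size) != noErr)
        return;

    std::vector<UInt32> sources (size / sizeof (UInt32));
    if (AudioObjectGetPropertyData (airplayDevice, &addr, 0, nullptr, &size, sources.data()) != noErr)
        return;

    for (UInt32 sourceID : sources)
    {
        // Translate each data source ID into its user-visible name.
        CFStringRef name = nullptr;
        AudioValueTranslation translation = { &sourceID, sizeof (sourceID), &name, sizeof (name) };
        AudioObjectPropertyAddress nameAddr = { kAudioDevicePropertyDataSourceNameForIDCFString,
                                                kAudioObjectPropertyScopeOutput,
                                                kAudioObjectPropertyElementMaster };
        UInt32 translationSize = sizeof (translation);

        if (AudioObjectGetPropertyData (airplayDevice, &nameAddr, 0, nullptr, &translationSize, &translation) == noErr
             && name != nullptr)
        {
            CFShow (name);   // e.g. the name the user gave the AirPlay device
            CFRelease (name);
        }
    }
}
```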

I have hacked the JUCE CoreAudio audio device to make it possible to control AirPlay devices in this way. Please find attached the changed files and a small example project that demonstrates how to use them.

A few things are still missing and would need to be implemented:

  • JUCE should inform clients that the available devices have changed, because AirPlay devices can be turned on and off at any time. When no physical AirPlay device is visible anymore, the AirPlay audio device will no longer be available. This may already be implemented through the device manager and its change callbacks.
  • JUCE should inform clients through some callback mechanism that the available data sources of a device have changed. If there are multiple AirPlay devices available and one is turned off, the list of available data sources will change. (A sketch of the CoreAudio side of both notifications follows below.)
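On the CoreAudio side, both notifications are just property listeners; a rough sketch (the devicesChanged handler name is mine, and the re-scan logic is left out):

```cpp
#include <CoreAudio/CoreAudio.h>

// Called by CoreAudio on the listener thread whenever a watched property changes.
static OSStatus devicesChanged (AudioObjectID, UInt32, const AudioObjectPropertyAddress*, void*)
{
    // Re-scan the device list / data sources here and notify clients.
    return noErr;
}

void installListeners (AudioDeviceID airplayDevice)
{
    // Fires when the AirPlay device itself appears or disappears.
    AudioObjectPropertyAddress deviceListAddr = { kAudioHardwarePropertyDevices,
                                                  kAudioObjectPropertyScopeGlobal,
                                                  kAudioObjectPropertyElementMaster };
    AudioObjectAddPropertyListener (kAudioObjectSystemObject, &deviceListAddr, devicesChanged, nullptr);

    // Fires when individual AirPlay devices (data sources) come and go.
    AudioObjectPropertyAddress sourcesAddr = { kAudioDevicePropertyDataSources,
                                               kAudioObjectPropertyScopeOutput,
                                               kAudioObjectPropertyElementMaster };
    AudioObjectAddPropertyListener (airplayDevice, &sourcesAddr, devicesChanged, nullptr);
}
```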

What’s not possible with the available CoreAudio APIs, as far as I can see:

  • Controlling a data source’s volume, to make it possible to adjust the relative volumes between multiple AirPlay devices.
  • Simultaneous playback on the local machine and through AirPlay devices.

iTunes can do both, but it seems it doesn’t use the AirPlay support that is part of the operating system; instead it uses an independent implementation of the complete AirPlay protocol.

I would love to continue working on this. Guidance on how to correctly integrate it into JUCE is needed, though. This is all very platform dependent, and the API changes to the JUCE AudioIODevice class should be done in a way that makes sense for other platforms.

Thanks - I’m busy this week, but will try to take a look asap.

Sorry - I did it but forgot to push. Should be up there now.

Hmm. Not seeing anything in the commit log of the tip yet.

oops - no, sorry, I think my post above was a reply to the wrong thread! I’ve not had time to look at this yet, but will try to this week…

Interesting, but the approach feels wrong to me. Perhaps it should be done by creating CoreAudio devices for each of these data sources, rather than trying to extend the base class to cope with them?

I am not sure. This is the way CoreAudio provides the functionality: a device named AirPlay with multiple data sources. In abstract terms, I would see a data source as an output of an audio device. Maybe it would make sense for JUCE to integrate them as (named!) outputs of the audio device.

A user sees them like the attached image, though: a master volume, the built-in output device, and multiple AirPlay devices.

Another example of a data source: the Built-in Output device with a data source named Headphones.

It seems that for CoreAudio, data sources are the same as input/output jacks. The virtual AirPlay device, in that sense, has multiple output jacks.

Well… it just doesn’t fit with the way any device type other than CoreAudio works, so I’m not keen on extending the generic audio device base class in such a CoreAudio-specific way. Not sure what to suggest; I’d need to play around with some AirPlay devices to get more of a picture of how they work.

Of course! If support is added, it should be done in a way that fits into the existing architecture. There are not that many choices, though: each physical AirPlay device as its own audio device, or each AirPlay device as (two?) outputs of an AirPlay audio device. The latter seems the better choice in my opinion, because it allows sending audio to all outputs at the same time, which is possible with AirPlay devices.

If you have any questions I may be able to answer them. I have been playing around with them for a while now.

Here is a short summary:
If the first AirPlay device is turned on, it will be automatically detected by the Mac, and a new CoreAudio audio device named “AirPlay” will become available. The AirPlay audio device will have one data source that corresponds to the physical AirPlay device. The data source will also have the name the user gave to their physical AirPlay device. If additional AirPlay devices are turned on, they will become available as additional data sources.

When AirPlay devices are turned off, the corresponding data sources will be removed from the AirPlay audio device on the Mac. When the last AirPlay device is turned off, the AirPlay audio device will also be removed by CoreAudio.

All data sources share the same audio properties: two channels, 16 bit, if I remember correctly.
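That claim is easy to double-check by asking the device’s first output stream for its physical format; a rough sketch:

```cpp
#include <CoreAudio/CoreAudio.h>
#include <cstdio>

// Query the first output stream's physical format (sample rate,
// channel count, bit depth) of the AirPlay device.
void printFormat (AudioDeviceID airplayDevice)
{
    AudioObjectPropertyAddress streamsAddr = { kAudioDevicePropertyStreams,
                                               kAudioObjectPropertyScopeOutput,
                                               kAudioObjectPropertyElementMaster };
    AudioStreamID stream = 0;
    UInt32 size = sizeof (stream);   // only the first stream is needed here
    if (AudioObjectGetPropertyData (airplayDevice, &streamsAddr, 0, nullptr, &size, &stream) != noErr)
        return;

    AudioObjectPropertyAddress formatAddr = { kAudioStreamPropertyPhysicalFormat,
                                              kAudioObjectPropertyScopeGlobal,
                                              kAudioObjectPropertyElementMaster };
    AudioStreamBasicDescription format = {};
    size = sizeof (format);
    if (AudioObjectGetPropertyData (stream, &formatAddr, 0, nullptr, &size, &format) == noErr)
        std::printf ("%.0f Hz, %u channels, %u bits\n",
                     format.mSampleRate,
                     (unsigned) format.mChannelsPerFrame,
                     (unsigned) format.mBitsPerChannel);
}
```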

Data sources can be turned on and off independently. This also means you can route audio to multiple data sources at the same time, as sketched below.
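A minimal sketch of the multi-selection, assuming kAudioDevicePropertyDataSource accepts an array of source IDs on devices that support selecting several sources at once (which is how it appears to work for the AirPlay device; the IDs are the ones obtained from kAudioDevicePropertyDataSources above):

```cpp
#include <CoreAudio/CoreAudio.h>
#include <vector>

// Route audio to several AirPlay devices at once by setting the current
// data source property to an array of source IDs.
bool selectDataSources (AudioDeviceID airplayDevice, const std::vector<UInt32>& sourceIDs)
{
    AudioObjectPropertyAddress addr = { kAudioDevicePropertyDataSource,
                                        kAudioObjectPropertyScopeOutput,
                                        kAudioObjectPropertyElementMaster };
    UInt32 size = (UInt32) (sourceIDs.size() * sizeof (UInt32));
    return AudioObjectSetPropertyData (airplayDevice, &addr, 0, nullptr, size, sourceIDs.data()) == noErr;
}
```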

Apart from reacting to AirPlay devices being turned on/off, this is already implemented in the attached files. I will continue to implement it in a platform-specific way for now; maybe it can give you some hints/ideas on how to add it to JUCE in a reasonable way.

The functionality itself seems to be pretty much mandatory for a media player on the Mac these days. Half of my beta testers asked for AirPlay support.

Thanks for the info - not sure when I’ll get a chance to look at this, but we’ll obviously need to make it work for Tracktion, so it’ll get done soon!

Has this been implemented in one way or another (maybe I’ve overlooked it, but it seems it hasn’t)? AirPlay support would be a welcome addition!

AirPlay support in the way described stopped working with the release of El Capitan. There is a Radar filed (Apple’s bug tracker) for this issue that is still open. I would wait to implement it until Apple either fixes the issue or drops support for AirPlay in CoreAudio completely.

Hi,

My AirPlay device doesn’t appear in my “CoreAudio” devices.

I use AudioDeviceManager -> createAudioDeviceTypes to get the available audio device types, and I don’t find my AirPlay device in the “CoreAudio” type -> scanForDevices().

I’m on macOS Sierra, and I experienced this on El Capitan too.

Note that if I select my AirPlay device as the primary audio output in the macOS System Preferences, “CoreAudio”->scanForDevices() successfully returns my AirPlay device.

Any ideas?
Thank you!