SwiftUI Music Player

Introduction

As a second proof of concept for my bigger application, I needed to implement a music player. Even in this case, it was not easy to find a complete example.
In this blog post I will describe the process I followed to develop this new sample.

Input mp3 files

First of all, I needed some sample input mp3 files for my application, with the requirement that they embed artwork.
As I couldn't find such files under an MIT or Creative Commons license, I had to create them myself:

Creating the SwiftUI application

The next step was to create the basic SwiftUI application.
For this, the best tutorial I have found is on YouTube, by Kavsoft: Custom Audio Player Using SwiftUI - AVAudioPlayer Using SwiftUI - SwiftUI Tutorial.
The generated app is functional and minimal, but it has a couple of small problems and is missing several features that I would consider necessary in a player application.

Some of the shortcomings of the sample, which I will describe later, are:

  • the app doesn't follow the MVVM pattern; I deal with this first, as it allows some code simplification;
  • the app doesn't play music in the background, so switching to another app, or locking the device, stops the music;
  • the app doesn't show the controls to play/pause the music from the lock screen;
  • the app doesn't allow selecting a different output device (for example: speaker, AirPods, HomePod, and so on); this must be done from Control Center, which is not very user friendly;
  • the app doesn't allow setting the volume of the audio device, or updating the UI in response to volume changes.

Refactoring to MVVM

I have refactored the code to MVVM because, as usual, it's easier to read and maintain.
When doing this, creating the MusicPlayerViewModel class, I made a few changes compared to the initial implementation:

  • the view-model can directly implement AVAudioPlayerDelegate because it's a class (it also needs to inherit from NSObject); communication between the delegate and the view-model is now direct and doesn't require local notifications anymore (see the sketch after this list);
  • I have corrected a bug: using the +30 and -15 buttons, the slider was not updated if the audio was not playing;
  • another imprecision is the call to AVAudioPlayer.prepareToPlay during appear; it's documented that this causes other applications to stop playing, so I have commented it out to be a better neighbor.
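
As a reference, this is a minimal sketch of the resulting view-model declaration; the property names are illustrative and the real class contains more state:

import AVFoundation
import Combine

// NSObject is required to adopt AVAudioPlayerDelegate (see the extension later
// in this post); ObservableObject lets the SwiftUI view observe the published
// state directly.
final class MusicPlayerViewModel: NSObject, ObservableObject {
	@Published var isPlaying = false
	@Published var isFinished = false

	// Configured in onAppear(), when the mp3 file is loaded.
	var player: AVAudioPlayer!
}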

Graphical improvements

Dark theme and buttons

The app developed in the video doesn't respond well when used with a dark theme: the buttons are black on a black background, so they are not visible.
So, for all the buttons in the UI, I have set the color to primary, like here:

Button(action: { ... }) {
	Image(systemName: "backward.fill").font(.title).foregroundColor(.primary)
}

Cursor

I've added a little circle at the current position of the Capsule. Nothing crazy here.
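
For reference, a minimal sketch of the idea; the progress property and the sizes are illustrative, not the exact code of the sample:

import SwiftUI

struct ProgressCursorBar: View {
	// Between 0 and 1; the real app derives it from currentTime / duration.
	var progress: Double

	var body: some View {
		GeometryReader { geometry in
			let x = geometry.size.width * CGFloat(progress)

			ZStack(alignment: .leading) {
				// Track, filled portion and the circular cursor.
				Capsule().fill(Color.secondary.opacity(0.3))
				Capsule().fill(Color.primary).frame(width: x)
				Circle()
					.fill(Color.primary)
					.frame(width: 16, height: 16)
					.offset(x: max(0, x - 8))
			}
		}
		.frame(height: 8)
	}
}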

Playing music in the background

In the current implementation, audio stops when the app goes to the background or we lock the device.
The Apple documentation for this is Configuring the Audio Playback of iOS and tvOS Apps.

Essentially, you need to make the following calls:

		do {
			// Declare that the app plays audio and activate the shared session.
			let audioSession = AVAudioSession.sharedInstance()
			try audioSession.setCategory(.playback)
			try audioSession.setActive(true)
		} catch {
			print("Failed to configure the audio session: \(error)")
		}

But be careful: if you make these calls in your onAppear(), and there is already other music playing when you open the app, it will stop immediately.
This is not good-neighbor behavior, and the Apple documentation states: "You can activate the audio session at any time after setting its category, but it's generally preferable to defer this call until your app begins audio playback. Deferring the call ensures that you don't prematurely interrupt any other background audio that may be in progress."
So I have followed the Apple documentation and added these calls just after the play command.
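
A sketch of what this looks like in the view-model, assuming the play action lives in a play() method (the real code may be organized differently):

	func play() {
		// Activate the session only when playback actually starts, so other
		// apps' audio is not interrupted prematurely.
		do {
			let audioSession = AVAudioSession.sharedInstance()
			try audioSession.setCategory(.playback)
			try audioSession.setActive(true)
		} catch {
			print("Failed to activate the audio session: \(error)")
		}

		player.play()
		isPlaying = true
	}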

Also, to enable background playback, you need to add the Background Audio capability:
Background Audio Capability

Finally: when doing tests, I noticed that enabling background audio allows switching to other apps, which can start audio themselves. When switching back to our app, audio is stopped, but the UI still thinks it is in a playing state.

To handle this event, in the existing AVAudioPlayerDelegate extension, it's possible to implement audioPlayerBeginInterruption:

extension MusicPlayerViewModel: AVAudioPlayerDelegate {
	func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
		self.isPlaying = false
		self.isFinished = true
	}

	func audioPlayerBeginInterruption(_ player: AVAudioPlayer) {
		self.isPlaying = false
	}
}

One last consideration: a side effect of this change is that the audio will play even when the device is in silent mode.

Controlling playback from the lock screen and updating Now Playing

Following Controlling Background Audio, I have used the MPRemoteCommandCenter class to control playback from the lock screen.
You add targets for the commands that you want to support: in my case they are playCommand, pauseCommand, skipBackwardCommand, skipForwardCommand and changePlaybackPositionCommand.

	private func setupRemoteTransportControls() {
		// Get the shared MPRemoteCommandCenter
		let commandCenter = MPRemoteCommandCenter.shared()

		commandCenter.playCommand.addTarget { [unowned self] _ in
			if !self.isPlaying {
				self.player.play()
				self.isPlaying = true
				return .success
			}
			return .commandFailed
		}

		commandCenter.pauseCommand.addTarget { [unowned self] _ in
			if self.isPlaying {
				self.player.pause()
				self.isPlaying = false
				return .success
			}
			return .commandFailed
		}

		commandCenter.skipBackwardCommand.preferredIntervals = [NSNumber(value: 15)]
		commandCenter.skipBackwardCommand.addTarget { [unowned self] _ in
			self.skipBackward()
			return .success
		}

		commandCenter.skipForwardCommand.preferredIntervals = [NSNumber(value: 30)]
		commandCenter.skipForwardCommand.addTarget { [unowned self] _ in
			self.skipForward()
			return .success
		}

		commandCenter.changePlaybackPositionCommand.addTarget { [unowned self] event in
			if let event = event as? MPChangePlaybackPositionCommandEvent {
				self.player.currentTime = event.positionTime
				return .success
			}
			return .commandFailed
		}
	}
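
I call this setup once, before playback can begin; a minimal sketch, assuming it is done from the view-model's onAppear():

	func onAppear() {
		// ... load the mp3 file and its metadata ...
		setupRemoteTransportControls()
	}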

Related to this, to publish the current state of the playing music you need to use the MPNowPlayingInfoCenter class and its nowPlayingInfo property.
You need to set various properties; one of them is MPNowPlayingInfoPropertyElapsedPlaybackTime, which you need to update constantly to keep the progress bar on the lock screen in sync.

	private func updateNowPlaying() {
		// Define Now Playing Info
		var nowPlayingInfo = [String: Any]()
		nowPlayingInfo[MPMediaItemPropertyTitle] = self.title
		nowPlayingInfo[MPMediaItemPropertyMediaType] = MPMediaType.anyAudio.rawValue

		let image = UIImage(data: self.data)!
		nowPlayingInfo[MPMediaItemPropertyArtwork] = MPMediaItemArtwork(boundsSize: image.size) { _ in
			return image
		}

		nowPlayingInfo[MPNowPlayingInfoPropertyElapsedPlaybackTime] = NSNumber(value: self.player.currentTime)
		nowPlayingInfo[MPMediaItemPropertyPlaybackDuration] = NSNumber(value: self.player.duration)
		nowPlayingInfo[MPNowPlayingInfoPropertyPlaybackRate] = self.player.rate

		// Set the metadata
		MPNowPlayingInfoCenter.default().nowPlayingInfo = nowPlayingInfo
	}
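
As for the constant refresh, this is a minimal sketch of one way to do it, assuming a Combine Timer publisher inside the view-model (the actual sample may drive the update differently):

	// Requires import Combine.
	private var nowPlayingCancellable: AnyCancellable?

	private func startNowPlayingUpdates() {
		// Refresh the Now Playing metadata once per second while playing.
		nowPlayingCancellable = Timer.publish(every: 1, on: .main, in: .common)
			.autoconnect()
			.sink { [weak self] _ in
				self?.updateNowPlaying()
			}
	}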

Selecting a different output device

The user can naturally change the output device from Control Center. The app, using AVAudioPlayer, handles this automatically, without any custom code.
But what if you want to offer the possibility of changing the output device from within the app? In this case you need AVRoutePickerView, which is a UIView,
so it must be wrapped in a UIViewRepresentable.

In the end, the code to implement this control is not complex at all:

struct AirPlayButton: UIViewRepresentable {
	func makeUIView(context: Context) -> AVRoutePickerView {
		let result = AVRoutePickerView(frame: .zero)

		// Configure the button's color.
		// result.delegate = context.coordinator
		// result.backgroundColor = UIColor.white
		result.tintColor = UIColor.label

		// Indicate whether your app prefers video content.
		result.prioritizesVideoDevices = false

		return result
	}

	func updateUIView(_ uiView: UIViewType, context: Context) {

	}
}
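
Using it from SwiftUI is then just a matter of placing it like any other view; a hypothetical usage:

	AirPlayButton()
		.frame(width: 44, height: 44)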

Handling volume

The original app created in the YouTube video doesn't handle volume. And it doesn't need to, because, as in the previous section, integration with the operating system is automatic.
But what if I want to explicitly control the volume from my view-model? Well... in this case there are multiple steps to implement it properly:

  1. How to handle volume in the View Model
    It's not possible to directly manage the volume in the view-model. The trick used to do this is to create a UI element (an MPVolumeView in particular) and control its slider from code:

    	let masterVolumeView = MPVolumeView()
    	masterVolumeSlider = masterVolumeView.subviews.compactMap({ $0 as? UISlider }).first

    Then it's possible to control the volume by manipulating this slider:

    	public var audioVolume: Double {
    		get {
    			return Double(masterVolumeSlider.value)
    		}
    		set {
    			percentVolume = newValue
    			masterVolumeSlider.value = Float(newValue)
    		}
    	}
    
  2. How to respond to volume changes
    Volume can be changed by external factors; the simplest case is increasing/decreasing it with the physical iPhone buttons. If we want to keep the UI aligned with these changes, we need to subscribe to the SystemVolumeDidChange notification:

    private var notificationCenter = NotificationCenter.default

    func onAppear() {
    	...
    	notificationCenter.addObserver(self,
    								   selector: #selector(systemVolumeDidChange),
    								   name: Notification.Name("SystemVolumeDidChange"),
    								   object: nil)
    }

    func onDisappear() {
    	notificationCenter.removeObserver(self, name: Notification.Name("SystemVolumeDidChange"), object: nil)
    }

    @objc func systemVolumeDidChange(notification: NSNotification) {
    	guard let userInfo = notification.userInfo,
    		  let percVolume = userInfo["Volume"] as? Float else {
    		return
    	}

    	Task.detached(priority: .background) {
    		await MainActor.run {
    			self.percentVolume = Double(percVolume)
    		}
    	}
    }
    
  3. How to hide the volume indicator in the UI
    Finally, there is one last improvement to make: if the app doesn't contain an MPVolumeView, the system shows an indicator in the top left corner when the volume changes. To hide it, we need to add another MPVolumeView to the UI.
    We can't simply create the view and hide it: SwiftUI is clever and skips creating the view entirely.
    So we need to create the view and set a very low opacity (a sketch of where to place it follows the code below):

    struct HideVolumeIndicator: UIViewRepresentable {
    	func makeUIView(context: Context) -> MPVolumeView {
    		let result = MPVolumeView(frame: .zero)
    		result.alpha = 0.001
    		return result
    	}

    	func updateUIView(_ uiView: UIViewType, context: Context) {
    	}
    }
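
    To complete the picture, this is a hypothetical way to place it in the player view (PlayerView and its body are illustrative names, not the exact code of the sample):

    struct PlayerView: View {
    	var body: some View {
    		ZStack {
    			// ... the actual player UI ...

    			// The invisible MPVolumeView wrapper that prevents the system
    			// volume indicator from appearing when the volume changes.
    			HideVolumeIndicator()
    				.frame(width: 1, height: 1)
    		}
    	}
    }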
    

Conclusion

You can find this new sample in my GitHub repository SwiftUIMusicPlayer.