Creating an accessible audio player in SwiftUI (part 2)

In this article we take the player created in part 1 and refactor it into a player view and a separate audio manager.

To refresh your memory, this is the audio player we are going to refactor. The UI will stay the same, but the logic will be extracted out of the view.

[Screenshot: the audio player UI from part 1]

For the refactor I was inspired by the Subsonic audio player by Paul Hudson, which is freely available as a Swift package. It is really great and you should check it out. The code is available for all to see on GitHub, so I modified his concept and added some extra things like the duration and a few other properties. The overall structure is like the Subsonic player, though, because I quite like it, and I could not do better myself at this stage :)

We start by creating a new class for our player logic. I will call it LaurentsAudioPlayer; this class is responsible for loading and playing a single sound file. The class conforms to ObservableObject so that the view updates automatically when its @Published properties change.
It also subclasses NSObject because I adopt the AVAudioPlayerDelegate protocol to know when a sound has finished playing.

import AVFoundation
import SwiftUI

public class LaurentsAudioPlayer: NSObject, ObservableObject, AVAudioPlayerDelegate {
    /// A Boolean representing whether this sound is currently playing.
    @Published public var isPlaying = false

    /// These are used in our view.
    @Published public var progress: CGFloat = 0.0
    @Published public var duration: Double = 0.0
    @Published public var formattedDuration: String = ""
    @Published public var formattedProgress: String = "00:00"

    /// The internal audio player being managed by this object.
    private var audioPlayer: AVAudioPlayer?

    /// How loud to play this sound relative to other sounds in your app,
    /// specified in the range 0 (no volume) to 1 (maximum volume).
    public var volume: Double {
        didSet {
            audioPlayer?.volume = Float(volume)
        }
    }

    /// If the sound is played on a loop. Specifying false here
    /// (the default) will play the sound only once.
    public var repeatSound: Bool

    // more to come
}

Because we are using a class, we need an initialiser. It checks that the audio file is available, creates the audio player, creates a formatter to display the minutes and seconds in the view, assigns the class as the AVAudioPlayer's delegate so we are notified when the audio finishes, and also starts the timer. It looks like this:

    /// Creates a new instance by looking for a particular sound filename in a bundle of your choosing.
    /// - Parameters:
    ///   - sound: The name of the sound file you want to load.
    ///   - bundle: The bundle containing the sound file. Defaults to the main bundle.
    ///   - volume: How loud to play this sound relative to other sounds in your app,
    ///     specified in the range 0 (no volume) to 1 (maximum volume).
    ///   - repeatSound: If false (the default), the sound plays only once.
    public init(sound: String, bundle: Bundle = .main, volume: Double = 1.0, repeatSound: Bool = false) {
        self.volume = volume
        self.repeatSound = repeatSound

        super.init()

        guard let url = bundle.url(forResource: sound, withExtension: nil) else {
            print("Failed to find \(sound) in bundle.")
            return
        }

        guard let player = try? AVAudioPlayer(contentsOf: url) else {
            print("Failed to load \(sound) from bundle.")
            return
        }
        self.audioPlayer = player
        self.audioPlayer?.prepareToPlay()

        /// a formatter to get the duration and progress to the view
        let formatter = DateComponentsFormatter()
        formatter.allowedUnits = [.minute, .second]
        formatter.unitsStyle = .positional
        formatter.zeroFormattingBehavior = [ .pad ]

        // I need both! The formattedDuration is the string to display and duration is used when forwarding
        formattedDuration = formatter.string(from: TimeInterval(self.audioPlayer?.duration ?? 0.0))!
        duration = self.audioPlayer?.duration ?? 0.0

        // Update the published properties ten times per second.
        // The closure captures self weakly so the timer does not keep this object alive.
        Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { [weak self] _ in
            guard let self = self, let player = self.audioPlayer else { return }
            if !player.isPlaying {
                self.isPlaying = false
            }
            self.progress = CGFloat(player.currentTime / player.duration)
            self.formattedProgress = formatter.string(from: TimeInterval(player.currentTime))!
        }
        audioPlayer?.delegate = self
    }
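
As a quick illustration of what this formatter produces (this snippet is my own example, not part of the player code), running it on a fixed number of seconds gives a zero-padded minutes and seconds string:

import Foundation

let formatter = DateComponentsFormatter()
formatter.allowedUnits = [.minute, .second]
formatter.unitsStyle = .positional
formatter.zeroFormattingBehavior = [.pad]

// 207 seconds should print as a padded "03:27"
print(formatter.string(from: TimeInterval(207)) ?? "n/a")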

I thought of adding a deinit just in case, but this might not be necessary if you stop all audio when closing the view!

   deinit {
        audioPlayer?.stop()
    }
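
If you go for the second option instead, here is a minimal sketch (assuming the AudioPlayerView shown later, which keeps the player in a property called sound) of stopping playback when the view goes away:

    var body: some View {
        VStack {
            // ... the player UI shown below ...
        }
        .onDisappear {
            // stop the audio when the view is dismissed
            sound.stop()
        }
    }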

Now we need some methods to start, stop, fast forward and rewind our player:

    /// This will play from where the sound last left off.
    public func play() {
        isPlaying = true
        audioPlayer?.play()
    }

    /// Stops the audio from playing.
    public func stop() {
        isPlaying = false
        audioPlayer?.stop()
    }


    /// Forward the current sound by 15 seconds.
    public func forward() {
        if let player = self.audioPlayer {
            let increase = player.currentTime + 15
            if increase < self.duration {
                player.currentTime = increase
            } else {
                // give the user the chance to hear the end if they wish
                player.currentTime = duration
            }
        }
    }

    /// Rewind the current sound by 15 seconds.
    public func rewind() {
        if let player = self.audioPlayer {
            let decrease = player.currentTime - 15.0
            if decrease < 0.0 {
                player.currentTime = 0
            } else {
                player.currentTime = decrease
            }
        }
    }

When the sound finishes, if we are looping we continue playing from the beginning; otherwise we stop!

    /// This is the delegate method of `AVAudioPlayerDelegate`: we get notified when the audio ends and we reset the button.
    public func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
        if repeatSound {
            play()
        } else {
            isPlaying = false
        }
    }

Finally, going back to my AudioPlayerView, I will instantiate my player as a @StateObject, but I could put it in the environment as well since it is an ObservableObject:

@StateObject private var sound = LaurentsAudioPlayer(sound: "audioTest.m4a")
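
As a rough sketch of the environment alternative (PlayerContainerView and PlaybackControls are hypothetical names, just for illustration), a parent view could own the player and inject it, and any child view could read it with @EnvironmentObject:

import SwiftUI

// A parent view owns the player and injects it into the environment.
struct PlayerContainerView: View {
    @StateObject private var sound = LaurentsAudioPlayer(sound: "audioTest.m4a")

    var body: some View {
        AudioPlayerView()
            .environmentObject(sound)
    }
}

// A child view picks the player up without creating its own instance.
struct PlaybackControls: View {
    @EnvironmentObject var sound: LaurentsAudioPlayer

    var body: some View {
        Button(sound.isPlaying ? "Pause" : "Play") {
            if sound.isPlaying { sound.stop() } else { sound.play() }
        }
    }
}

In that setup, AudioPlayerView would read the player with @EnvironmentObject instead of creating its own @StateObject.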

For clarity I will put the body of the view below again. The only difference now is that I am using the LaurentsAudioPlayer instance in my code:


var body: some View {
        VStack {
            Text("Audio Player")
                .bold()
                .multilineTextAlignment(.center)
                .font(.title)
                .minimumScaleFactor(0.75)
                .padding()

            HStack {
                Text(sound.formattedProgress)
                    .font(.caption.monospacedDigit())

                /// this is a dynamic length progress bar
                GeometryReader { gr in
                    Capsule()
                        .stroke(Color.blue, lineWidth: 2)
                        .background(
                            Capsule()
                                .foregroundColor(Color.blue)
                                .frame(width: gr.size.width * sound.progress, height: 8), alignment: .leading)
                }
                .frame( height: 8)

                Text(sound.formattedDuration)
                    .font(.caption.monospacedDigit())
            }
            .padding()
            .frame(height: 50, alignment: .center)
            .accessibilityElement(children: .ignore)
            .accessibility(identifier: "audio player")
            .accessibilityLabel(sound.isPlaying ? Text("Playing at ") : Text("Duration"))
            .accessibilityValue(Text("\(sound.formattedProgress)"))

            /// the control buttons
            HStack(alignment: .center, spacing: 20) {
                Spacer()
                Button(action: {
                    /// back 15 sec
                    sound.rewind()
                }) {
                    Image(systemName: "gobackward.15")
                        .font(.title)
                        .imageScale(.medium)
                }

                /// main playing button
                Button(action: {
                    if sound.isPlaying {
                        sound.stop()
                    } else {
                        sound.play()
                    }
                }) {
                    Image(systemName: sound.isPlaying ?
                          "pause.circle.fill" : "play.circle.fill")
                        .font(.title)
                        .imageScale(.large)
                }

                Button(action: {
                    sound.forward()
                }) {
                    Image(systemName: "goforward.15")
                        .font(.title)
                        .imageScale(.medium)
                }
                Spacer()
            }
        }
        .foregroundColor(.blue)
    }

The end

Or not the end? We could still customise and refactor the player further, but I just wanted to show how to extract the player logic from our view. The code is so much clearer. In Subsonic, Paul Hudson has an AudioPlayerController for holding more than one player instance, and he also offers the possibility of attaching the player as a ViewModifier, which I did not use here. Please go and check his repo; it is really nice.
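
Just to illustrate the multi-player idea (this is my own rough sketch, not Subsonic's actual implementation; the AudioManager name and its methods are made up), such a controller could keep several players in a dictionary keyed by the sound name:

import Foundation

/// A rough sketch of an object that manages several LaurentsAudioPlayer instances.
class AudioManager: ObservableObject {
    private var players: [String: LaurentsAudioPlayer] = [:]

    /// Returns the existing player for a sound, creating one on first use.
    func player(for sound: String) -> LaurentsAudioPlayer {
        if let existing = players[sound] {
            return existing
        }
        let newPlayer = LaurentsAudioPlayer(sound: sound)
        players[sound] = newPlayer
        return newPlayer
    }

    /// Stops every sound managed by this object.
    func stopAll() {
        players.values.forEach { $0.stop() }
    }
}
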
I hope you enjoyed this short tutorial. I will put the code on GitHub in the near future :) Keep safe!

Sources

For inspiration, check Subsonic.