While adding music and sounds to my latest project I ran into a problem: if background music is playing and you then trigger a sound effect (for example, by picking up an item), the music stops, because the new sound object overrides the one playing the music.
To fix this on iOS, we can use AVAudioSession:
```swift
import AVFoundation

let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(AVAudioSession.Category.playback)
    try audioSession.setActive(true)
} catch {
    print("Unable to set the audio session category and activate it: \(error)")
}
```
In this example, we first import the AVFoundation framework, which provides the `AVAudioSession` class. Then, we obtain the app's shared `AVAudioSession` instance by calling the `sharedInstance()` method (it's a singleton, so we don't create a new instance ourselves).
Next, we use a do-catch block to try setting the audio session's category to `.playback` and activating it. If an error is thrown while setting the category or activating the session, it will be caught and printed in the catch block.
Keep in mind that this is just a simple example to illustrate how to use `AVAudioSession`; in a real app you may want to handle errors more gracefully and customize the audio session's settings further.
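One such customization (my own addition, not something the original setup requires) is the `.mixWithOthers` category option, which is directly relevant to the "sound effect kills the music" problem: it tells the system to mix your audio with other audio instead of interrupting it. A minimal sketch:

```swift
import AVFoundation

let session = AVAudioSession.sharedInstance()
do {
    // .playback keeps audio running even with the Ring/Silent switch on silent;
    // .mixWithOthers mixes our audio with audio from other apps
    // instead of interrupting it.
    try session.setCategory(.playback, options: [.mixWithOthers])
    try session.setActive(true)
} catch {
    print("Failed to configure the audio session: \(error)")
}
```

Whether you want this depends on the game: with `.mixWithOthers` your audio coexists with, say, the user's podcast app, while the plain `.playback` category silences other apps' audio when yours starts.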
macOS
Now, all of that is nice and dandy, but the game I'm working on is on macOS, and while trying to use AVAudioSession I wasn't able to access the class or its methods, despite it being built in. It turns out the `AVAudioSession` class is only available on iOS (and other UIKit-based platforms), not on macOS. Is there an alternative for macOS?
On macOS, you can use the `AVAudioEngine` class to manage audio input and output, as well as to perform audio processing. Here's an example that plays background music:
```swift
import AVFoundation

let audioEngine = AVAudioEngine()
let audioPlayerNode = AVAudioPlayerNode()

func playBGMusic() {
    guard let url = Bundle.main.url(forResource: "music", withExtension: "wav"),
          let audioFile = try? AVAudioFile(forReading: url),
          let audioFileBuffer = AVAudioPCMBuffer(pcmFormat: audioFile.processingFormat,
                                                 frameCapacity: AVAudioFrameCount(audioFile.length)) else {
        return
    }
    do {
        try audioFile.read(into: audioFileBuffer)
        audioEngine.attach(audioPlayerNode)
        audioEngine.connect(audioPlayerNode, to: audioEngine.mainMixerNode, format: audioFileBuffer.format)
        audioPlayerNode.scheduleBuffer(audioFileBuffer, completionHandler: nil)
        audioEngine.prepare()
        try audioEngine.start()
        audioPlayerNode.play()
    } catch {
        print("error! \(error)")
    }
}
```
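Coming back to the original problem (a sound effect stopping the music): with `AVAudioEngine`, the fix is simply to attach a second `AVAudioPlayerNode` for effects to the same engine, so both nodes feed the main mixer and play simultaneously. This is my own sketch, not part of the code above; the `soundEffectNode` name and the `.wav` extension are assumptions:

```swift
import AVFoundation

let soundEffectNode = AVAudioPlayerNode()

func playSoundEffect(named name: String) {
    guard let url = Bundle.main.url(forResource: name, withExtension: "wav"),
          let file = try? AVAudioFile(forReading: url) else {
        return
    }
    // Attach and connect the effect node only once; both this node and the
    // music node feed the main mixer, so the effect plays on top of the
    // music instead of replacing it.
    if soundEffectNode.engine == nil {
        audioEngine.attach(soundEffectNode)
        audioEngine.connect(soundEffectNode, to: audioEngine.mainMixerNode, format: file.processingFormat)
    }
    soundEffectNode.scheduleFile(file, at: nil, completionHandler: nil)
    soundEffectNode.play()
}
```

Called as, say, `playSoundEffect(named: "pickup")` while `playBGMusic()` is running, the effect and the music mix in the main mixer rather than interrupting each other.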
Here is a different example of using `AVAudioEngine` to play a computed sine wave, rather than a music file:
```swift
import AVFoundation
import Accelerate

// Create an instance of AVAudioEngine
let audioEngine = AVAudioEngine()

// Create an instance of AVAudioPlayerNode and attach it to the audio engine
let player = AVAudioPlayerNode()
audioEngine.attach(player)

// Create a 44.1kHz single-channel format and a buffer holding one second of audio
let format = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 44100, channels: 1, interleaved: false)!
let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: AVAudioFrameCount(format.sampleRate))!
buffer.frameLength = buffer.frameCapacity

// Fill the buffer with a 440Hz sine wave: vDSP_vgen generates a linear ramp
// of phase values, and vvsinf takes the sine of each one
let frameCount = Int(buffer.frameLength)
var phases = [Float](repeating: 0, count: frameCount)
vDSP_vgen([0], [2 * Float.pi * 440], &phases, 1, vDSP_Length(frameCount))
var count = Int32(frameCount)
vvsinf(buffer.floatChannelData!.pointee, phases, &count)

// Connect the player to the audio engine's output node
audioEngine.connect(player, to: audioEngine.outputNode, format: buffer.format)

// Start the audio engine
try! audioEngine.start()

// Play the sine wave on a loop
player.scheduleBuffer(buffer, at: nil, options: .loops)
player.play()
```
We first create an instance of `AVAudioEngine` to manage the audio input and output. Then, we create an instance of `AVAudioPlayerNode` to play the sine wave and attach it to the engine.

Next, we create an `AVAudioFormat` instance with a sample rate of 44.1kHz and a single channel, and use it to create an `AVAudioPCMBuffer` that holds one second of audio. We then fill the buffer with the sine wave data with help from the Accelerate framework: `vDSP_vgen` on its own only produces a linear ramp, so it is used to generate the phase values, and taking the sine of each value yields the actual samples.

Then, we connect the player node to the audio engine's output node, start the audio engine, and use the player node's `scheduleBuffer(_:at:options:)` method with the `.loops` option to schedule the sine wave buffer for repeated playback. Finally, we call the player node's `play()` method to start playing the sine wave.
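When the scene changes or the game quits, you'll likely want to tear playback down again. A minimal sketch of stopping the sine wave from the example above (my addition, not from the original examples):

```swift
// Stop the player node first, then the engine itself
player.stop()
audioEngine.stop()

// reset() clears any scheduled buffers, which is useful if you
// intend to reuse the same engine later
audioEngine.reset()
```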