I'm running into an issue implementing spatial audio in my iOS 18 app. I've tried a number of approaches to achieve a 3D audio effect, but the result never felt good enough, or it didn't work at all.
The problem that troubles me most: I've noticed that my AirPods don't recognize my app as one that has spatial audio. The audio settings show "Spatial Audio Not Playing", which makes me think my app doesn't use its spatial audio capability.
It may be relevant to mention that I'm using a personal team account.
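Could this be a session-configuration issue? One thing I'm not calling anywhere is AVAudioSession's multichannel opt-in. This is just a guess (the helper name is mine, and I haven't verified that it affects the AirPods status), but it's the kind of setup I mean:

import AVFoundation

func configureSessionForSpatialAudio() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, mode: .default, options: [])
    // Opt in to multichannel/spatial content (iOS 15+). Unverified guess:
    // maybe this is needed for AirPods to report spatial audio as active.
    try session.setSupportsMultichannelContent(true)
    try session.setActive(true)
}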
My first approach uses AVAudioEnvironmentNode with AVAudioEngine. Despite setting playerNode.position and the listener parameters, the spatial effects don't seem to work: changing listenerPosition or playerNode.position has no audible impact on playback.
Here's a simplified example of how I initialize AVAudioEngine:
import Foundation
import AVFoundation

class AudioManager: ObservableObject {
    // important class variables
    var audioEngine: AVAudioEngine!
    var environmentNode: AVAudioEnvironmentNode!
    var playerNode: AVAudioPlayerNode!
    var audioFile: AVAudioFile?
    ...

    // Sound setup
    func setupAudio() {
        do {
            let session = AVAudioSession.sharedInstance()
            try session.setCategory(.playback, mode: .default, options: [])
            try session.setActive(true)
        } catch {
            print("Failed to configure AVAudioSession: \(error.localizedDescription)")
        }

        audioEngine = AVAudioEngine()
        environmentNode = AVAudioEnvironmentNode()
        playerNode = AVAudioPlayerNode()

        audioEngine.attach(environmentNode)
        audioEngine.attach(playerNode)
        audioEngine.connect(playerNode, to: environmentNode, format: nil)
        audioEngine.connect(environmentNode, to: audioEngine.mainMixerNode, format: nil)

        environmentNode.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0)
        environmentNode.listenerAngularOrientation = AVAudio3DAngularOrientation(yaw: 0, pitch: 0, roll: 0)
        environmentNode.distanceAttenuationParameters.referenceDistance = 1.0
        environmentNode.distanceAttenuationParameters.maximumDistance = 100.0
        environmentNode.distanceAttenuationParameters.rolloffFactor = 2.0

        // example.mp3 is a mono sound file
        guard let audioURL = Bundle.main.url(forResource: "example", withExtension: "mp3") else {
            print("Audio file not found")
            return
        }
        do {
            audioFile = try AVAudioFile(forReading: audioURL)
        } catch {
            print("Failed to load audio file: \(error)")
        }
    }
...
    // Playing sound
    func playSpatialAudio(pan: Float) {
        guard let audioFile = audioFile else { return }
        // negative pan = left side
        playerNode.position = AVAudio3DPoint(x: pan, y: 0, z: 0)
        playerNode.scheduleFile(audioFile, at: nil, completionHandler: nil)
        do {
            try audioEngine.start()
            playerNode.play()
        } catch {
            print("Failed to start audio engine: \(error)")
        }
    }
    ...
}
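One thing I'm unsure about in the code above: I connect the player with format: nil, and as far as I understand the AVAudioEnvironmentNode documentation, only mono inputs get spatialized, so a stereo connection format would quietly bypass the 3D rendering. Here's a variant I'd expect to need (a sketch, untested; the fixed 44.1 kHz sample rate and the position values are assumptions):

// Connect the player with an explicit mono format, since
// AVAudioEnvironmentNode only spatializes mono inputs.
if let monoFormat = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 1) {
    audioEngine.connect(playerNode, to: environmentNode, format: monoFormat)
}
audioEngine.connect(environmentNode, to: audioEngine.mainMixerNode, format: nil)

// Force HRTF-based binaural rendering instead of the default
// equal-power panning (renderingAlgorithm comes from AVAudio3DMixing).
playerNode.renderingAlgorithm = .HRTFHQ
playerNode.position = AVAudio3DPoint(x: 0, y: 0, z: -2) // 2 m in front of the listener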
My second approach, using PHASE, is more complex, and it did better. I made a sample app that lets the user move the audio source in 3D space. I added reverb, and sliders that change the source position up to 10 meters in each direction from the listener, but the audio only really seems to change from left to right (the x axis). Again, I think it may be a problem with the app not being recognized as spatial (the AirPods settings still show "Spatial Audio Not Playing").
Here's the example setup:
import AVFoundation
import ModelIO
import PHASE

class PHASEAudioController: ObservableObject {
    // Main class variables:
    private var soundSourcePosition: simd_float4x4 = matrix_identity_float4x4
    private var audioAsset: PHASESoundAsset!
    private let phaseEngine: PHASEEngine
    private let params = PHASEMixerParameters()
    private var soundSource: PHASESource
    private var phaseListener: PHASEListener!
    private var soundEventAsset: PHASESoundEventNodeAsset?

    // Initialization of PHASE
    init() {
        do {
            let session = AVAudioSession.sharedInstance()
            try session.setCategory(.playback, mode: .default, options: [])
            try session.setActive(true)
        } catch {
            print("Failed to configure AVAudioSession: \(error.localizedDescription)")
        }

        // Init PHASE engine
        phaseEngine = PHASEEngine(updateMode: .automatic)
        phaseEngine.defaultReverbPreset = .mediumHall
        phaseEngine.outputSpatializationMode = .automatic // nothing helps

        // Set listener position to (0,0,0) in world space
        let origin: simd_float4x4 = matrix_identity_float4x4
        phaseListener = PHASEListener(engine: phaseEngine)
        phaseListener.transform = origin
        phaseListener.automaticHeadTrackingFlags = .orientation
        try! self.phaseEngine.rootObject.addChild(self.phaseListener)
        do {
            try self.phaseEngine.start()
        } catch {
            print("Could not start PHASE engine")
        }

        audioAsset = loadAudioAsset() // helper that registers the sound asset; omitted here

        // Create sound source
        // Sphere
        soundSourcePosition.translate(z: 3.0) // translate(z:) is a small custom extension on simd_float4x4, omitted here
        let sphere = MDLMesh.newEllipsoid(withRadii: vector_float3(0.1, 0.1, 0.1), radialSegments: 14, verticalSegments: 14, geometryType: MDLGeometryType.triangles, inwardNormals: false, hemisphere: false, allocator: nil)
        let shape = PHASEShape(engine: phaseEngine, mesh: sphere)
        soundSource = PHASESource(engine: phaseEngine, shapes: [shape])
        soundSource.transform = soundSourcePosition
        print(soundSourcePosition)
        do {
            try phaseEngine.rootObject.addChild(soundSource)
        } catch {
            print("Failed to add a child object to the scene.")
        }

        // soundPipeline (a PHASESpatialMixerDefinition) and rolloffFactor are defined elsewhere in my project, omitted here
        let simpleModel = PHASEGeometricSpreadingDistanceModelParameters()
        simpleModel.rolloffFactor = rolloffFactor
        soundPipeline.distanceModelParameters = simpleModel

        let samplerNode = PHASESamplerNodeDefinition(
            soundAssetIdentifier: audioAsset.identifier,
            mixerDefinition: soundPipeline,
            identifier: audioAsset.identifier + "_SamplerNode")
        samplerNode.playbackMode = .looping
        do {
            soundEventAsset = try phaseEngine.assetRegistry.registerSoundEventAsset(
                rootNode: samplerNode,
                identifier: audioAsset.identifier + "_SoundEventAsset")
        } catch {
            print("Failed to register a sound event asset.")
            soundEventAsset = nil
        }
    }

    // Playing sound
    func playSound() {
        // Fire a new sound event with the currently set properties
        guard let soundEventAsset else { return }
        params.addSpatialMixerParameters(
            identifier: soundPipeline.identifier,
            source: soundSource,
            listener: phaseListener)
        let soundEvent = try! PHASESoundEvent(engine: phaseEngine,
                                              assetIdentifier: soundEventAsset.identifier,
                                              mixerParameters: params)
        soundEvent.start(completion: nil)
    }
    ...
}
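For completeness, the sliders drive the source position roughly like this (simplified from my actual code; PHASE should pick up the new transform on the next engine update):

// Rebuild a translation matrix from the slider values and hand it to
// PHASE via the source's transform (column-major: translation in columns.3).
func updateSourcePosition(x: Float, y: Float, z: Float) {
    soundSourcePosition = matrix_identity_float4x4
    soundSourcePosition.columns.3 = simd_float4(x, y, z, 1.0)
    soundSource.transform = soundSourcePosition
}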
I've also experimented with RealityKit, but I'm hoping to find a solution that doesn't require an AR view; PHASE seems like the best option, if only it worked as intended.
What I expect
I expect my app to position audio correctly in 3D space along all axes, like other spatial audio apps available on iOS, and to be recognized as a spatial-audio-enabled app by, e.g., AirPods.