Sunday, March 30, 2025

Real-time audio processing on iOS


I am trying to get sound from the microphone, process it with some function, and then output the processed sound to the speakers.
I want to be able to process buffers of 1024 samples, but for now I only get choppy sound. Is there a better way to process sound in real time than using installTap?
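For context, 1024 frames at a 44.1 kHz sample rate (an assumption on my part; the actual hardware rate may be 48 kHz) works out to roughly 23 ms per buffer, which is the callback interval I'm aiming for:

```swift
import Foundation

// 1024 frames at an assumed 44.1 kHz hardware sample rate:
// this is the per-callback latency I'm hoping to get.
let sampleRate = 44_100.0
let framesPerBuffer = 1024.0
let bufferDuration = framesPerBuffer / sampleRate
print(String(format: "%.4f s", bufferDuration))  // → 0.0232 s
```

I believe `AVAudioSession.setPreferredIOBufferDuration(_:)` accepts a value in this range, though as I understand it the system treats it only as a hint.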

In this code example I don't have any processing, but I still get choppy sound.

    private func setupAudioEngine() {
        do {
            let audioSession = AVAudioSession.sharedInstance()
            try audioSession.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker, .allowBluetooth])
            try audioSession.setActive(true)
        } catch {
            errorMessage = "Failed to set up audio session: \(error.localizedDescription)"
            print(errorMessage ?? "")
            return
        }
        
        // Get the input format
        let inputNode = audioEngine.inputNode
        let inputFormat = inputNode.outputFormat(forBus: 0)
        
        // Attach nodes
        audioEngine.attach(mixerNode)
        audioEngine.attach(playerNode)
        
        // Connect input to mixer
        audioEngine.connect(inputNode, to: mixerNode, format: nil)
        
        // Connect mixer to the main mixer
        audioEngine.connect(mixerNode, to: audioEngine.mainMixerNode, format: nil)
        
        // Connect player directly to the output node
        audioEngine.connect(playerNode, to: audioEngine.outputNode, format: nil)
        
        let format = AVAudioFormat(
            standardFormatWithSampleRate: inputFormat.sampleRate,
            channels: 2
        )
        
        // Install tap on the input node to process audio
        inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { [weak self] (buffer, audioTime) in
            self?.scheduleProcessedBuffer(buffer)
        }
        
        // Prepare the engine before starting
        audioEngine.prepare()
    }
    
    
    private func scheduleProcessedBuffer(_ buffer: AVAudioPCMBuffer) {
        if playerNode.isPlaying {
            playerNode.scheduleBuffer(buffer, at: nil, options: .interrupts) {
                // Optional: callback when the buffer finishes playing
            }
        }
    }
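From what I've read, AVAudioSinkNode and AVAudioSourceNode render blocks are supposed to be the lower-latency alternative to a tap plus scheduleBuffer, since they run on the real-time audio thread. This is only a sketch of what I think that would look like (untested — I'm not sure it fixes the choppiness, and the hand-off between the two blocks, e.g. a lock-free ring buffer, is omitted):

```swift
import AVFoundation

// Sketch (untested): replace the installTap + scheduleBuffer path with
// AVAudioSinkNode / AVAudioSourceNode render blocks.
func setupRenderBlockEngine() throws -> AVAudioEngine {
    let engine = AVAudioEngine()
    let inputFormat = engine.inputNode.outputFormat(forBus: 0)

    // Receives microphone samples on the render thread.
    let sinkNode = AVAudioSinkNode { _, frameCount, audioBufferList -> OSStatus in
        // Push `frameCount` frames from `audioBufferList` into a ring buffer here
        // (no allocation or locking on this thread).
        return noErr
    }

    // Supplies processed samples to the output on the render thread.
    let sourceNode = AVAudioSourceNode { _, _, frameCount, audioBufferList -> OSStatus in
        // Pop processed frames from the ring buffer into `audioBufferList` here.
        return noErr
    }

    engine.attach(sinkNode)
    engine.attach(sourceNode)
    engine.connect(engine.inputNode, to: sinkNode, format: inputFormat)
    engine.connect(sourceNode, to: engine.mainMixerNode, format: inputFormat)

    engine.prepare()
    try engine.start()
    return engine
}
```

Is this the approach people actually use for this, or is there something simpler I'm missing?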
