Wednesday, October 16, 2024

ios – CIContext.render(_ image: CIImage, to buffer: CVPixelBuffer) high CPU time usage


So I am trying to record videos using AVAssetWriter and process each camera frame (e.g. adding a watermark or text overlays) by creating a CIImage from the camera buffer (CVImageBuffer), applying some filters to the CIImage (which is very fast), and then rendering the CIImage into a new CVPixelBuffer. This becomes a problem at high resolutions like 4K on a base iPhone 11, because cIContext.render(compositedImage, to: pixelBuffer) takes about 30 ms of CPU time, so the app cannot record 4K at 60 FPS.

Are there any ways to improve this?

Or is the only way to improve performance to use OpenGL/Metal? I am not sure how exactly that would be better if we still need to somehow pass a pixel buffer to AVAssetWriter. Is there any simple example of using Metal with AVAssetWriter, similar to the following code?

private let cIContext: CIContext = {
    if let mtlDevice = MTLCreateSystemDefaultDevice() {
        return CIContext(mtlDevice: mtlDevice) // makes no difference in performance for CIContext.render()
    } else {
        return CIContext()
    }
}()

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    guard let assetWriter = assetWriter, let videoWriterInput = videoWriterInput else { return }
    
    if isRecording == false || assetWriter.status != .writing { return }
    
    if CMSampleBufferDataIsReady(sampleBuffer) == false {
        return
    }
    
    if output == videoOutput, videoWriterInput.isReadyForMoreMediaData {
        let presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        
        if hasWritingSessionStarted == false {
            assetWriter.startSession(atSourceTime: presentationTime)
            
            hasWritingSessionStarted = true
            
            guard let pixelBufferPool = pixelBufferAdaptor?.pixelBufferPool else { return }
            
            let status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pixelBufferPool, &pixelBuffer)
            guard status == kCVReturnSuccess else {
                print("Failed to create pixel buffer")
                return
            }
        }
        
        guard let pixelBuffer = pixelBuffer else {
            print("Pixel buffer is nil")
            return
        }
        guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
            print("Failed to get image buffer.")
            return
        }
        
        // fast, up to ~1 ms
        let ciImage = CIImage(cvPixelBuffer: imageBuffer)
        
        // fast, up to ~1 ms
        let compositedImage = watemarkImage.composited(over: ciImage)
        
        var tookTime = CFAbsoluteTimeGetCurrent()
        
        // very slow, ~30 ms for 4K resolution on iPhone 11 (base)
        cIContext.render(compositedImage, to: pixelBuffer)
        
        tookTime = CFAbsoluteTimeGetCurrent() - tookTime
        
        // fast, up to ~1 ms
        pixelBufferAdaptor?.append(pixelBuffer, withPresentationTime: presentationTime)
        
        print("cIContext.render took \(tookTime * 1000) ms")
    }
}
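For reference, one direction that stays within Core Image (no hand-written Metal pipeline needed) is CIRenderDestination: ciContext.startTask(toRender:to:) issues the render to the GPU and returns immediately, so the blocking wait can be moved off the capture callback. The sketch below is an untested outline under those assumptions, not a drop-in replacement for the code above; the names makePixelBufferPool, renderAndAppend, and renderQueue are hypothetical, while CIRenderDestination, startTask(toRender:to:), and waitUntilCompleted() are real Core Image API.

```swift
import AVFoundation
import CoreImage
import CoreVideo
import Metal

// Pool whose buffers are IOSurface-backed and Metal compatible, so Core Image
// can render into them on the GPU without an extra CPU copy. (Hypothetical
// helper; width/height/format must match the writer input settings.)
func makePixelBufferPool(width: Int, height: Int) -> CVPixelBufferPool? {
    let attrs: [CFString: Any] = [
        kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_32BGRA,
        kCVPixelBufferWidthKey: width,
        kCVPixelBufferHeightKey: height,
        kCVPixelBufferMetalCompatibilityKey: true,
        kCVPixelBufferIOSurfacePropertiesKey: [:] as CFDictionary
    ]
    var pool: CVPixelBufferPool?
    CVPixelBufferPoolCreate(kCFAllocatorDefault, nil, attrs as CFDictionary, &pool)
    return pool
}

// Serial queue (assumed name) where we wait for the GPU instead of blocking
// the camera callback. A serial queue keeps frames appended in order.
let renderQueue = DispatchQueue(label: "render.queue")

func renderAndAppend(_ image: CIImage,
                     into pixelBuffer: CVPixelBuffer,
                     at time: CMTime,
                     context: CIContext,
                     adaptor: AVAssetWriterInputPixelBufferAdaptor) {
    let destination = CIRenderDestination(pixelBuffer: pixelBuffer)
    do {
        // Returns as soon as the work is enqueued on the GPU.
        let task = try context.startTask(toRender: image, to: destination)
        renderQueue.async {
            // Wait for the GPU off the capture callback, then hand the
            // finished buffer to the asset writer.
            _ = try? task.waitUntilCompleted()
            if adaptor.assetWriterInput.isReadyForMoreMediaData {
                _ = adaptor.append(pixelBuffer, withPresentationTime: time)
            }
        }
    } catch {
        print("Failed to start render task: \(error)")
    }
}
```

Whether this actually reaches 4K60 on an iPhone 11 would need measuring; the point of the sketch is only that the ~30 ms measured in the question may largely be the CPU waiting on the GPU, which startTask decouples.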
