I need to display an interactive audio waveform like this.
I've extracted the sample data using AVAssetReader. Using this data, I'm drawing a UIBezierPath in a scrollView's contentView. Currently, when I pinch to zoom in or out on the scrollView, I downsample the sample data to determine how many samples should be shown.
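For reference, the extraction step looks roughly like the sketch below: it reads the audio track as 16-bit PCM with AVAssetReader and reduces the raw samples to normalized peak amplitudes. Function and parameter names such as loadAmplitudes and bucketCount are just placeholders, and error handling is trimmed.

import AVFoundation
import UIKit

// Rough sketch of the sample-extraction step (assumes a local file URL;
// loadAmplitudes/bucketCount are placeholder names, not from my real code).
func loadAmplitudes(from url: URL, bucketCount: Int = 4000) throws -> [CGFloat] {
    let asset = AVURLAsset(url: url)
    guard let track = asset.tracks(withMediaType: .audio).first else { return [] }

    // Read the track as 16-bit interleaved linear PCM.
    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: [
        AVFormatIDKey: kAudioFormatLinearPCM,
        AVLinearPCMBitDepthKey: 16,
        AVLinearPCMIsFloatKey: false,
        AVLinearPCMIsBigEndianKey: false,
        AVLinearPCMIsNonInterleaved: false
    ])
    reader.add(output)
    guard reader.startReading() else { return [] }

    var samples: [Int16] = []
    while let sampleBuffer = output.copyNextSampleBuffer(),
          let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) {
        let length = CMBlockBufferGetDataLength(blockBuffer)
        var data = Data(count: length)
        _ = data.withUnsafeMutableBytes { ptr in
            CMBlockBufferCopyDataBytes(blockBuffer, atOffset: 0, dataLength: length,
                                       destination: ptr.baseAddress!)
        }
        samples.append(contentsOf: data.withUnsafeBytes { Array($0.bindMemory(to: Int16.self)) })
    }

    // Reduce the raw samples to normalized peak amplitudes, one value per bucket.
    let bucketSize = max(1, samples.count / bucketCount)
    return stride(from: 0, to: samples.count, by: bucketSize).map { start in
        let end = min(start + bucketSize, samples.count)
        let peak = samples[start..<end].map { abs(Int($0)) }.max() ?? 0
        return CGFloat(peak) / CGFloat(Int16.max)
    }
}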
class WaveformView: UIView {

    var amplitudes: [CGFloat] = [] {
        didSet {
            setNeedsDisplay()
        }
    }

    override func draw(_ rect: CGRect) {
        guard let context = UIGraphicsGetCurrentContext(), !amplitudes.isEmpty else { return }

        // Set up drawing parameters
        context.setStrokeColor(UIColor.black.cgColor)
        context.setLineWidth(1.0)
        context.setLineCap(.round)

        let midY = rect.height / 2
        let widthPerSample = rect.width / CGFloat(amplitudes.count)

        // Draw waveform
        let path = UIBezierPath()
        for (index, amplitude) in amplitudes.enumerated() {
            let x = CGFloat(index) * widthPerSample
            let height = amplitude * rect.height * 0.8

            // Draw a vertical line for each sample
            path.move(to: CGPoint(x: x, y: midY - height))
            path.addLine(to: CGPoint(x: x, y: midY + height))
        }
        path.stroke()
    }
}
I also added a pinch gesture handler:
@objc private func handlePinch(_ gesture: UIPinchGestureRecognizer) {
    switch gesture.state {
    case .began:
        initialPinchDistance = gesture.scale
    case .changed:
        let scaleFactor = gesture.scale / initialPinchDistance
        var newScale = currentScale * scaleFactor
        newScale = min(max(newScale, minScale), maxScale)

        // Update displayed samples for the new scale
        updateDisplayedSamples(scale: newScale)
        print(newScale)

        // Maintain the zoom center point
        let pinchCenter = gesture.location(in: scrollView)
        let offsetX = (pinchCenter.x - scrollView.bounds.origin.x) / scrollView.bounds.width
        let newOffsetX = (totalWidth * offsetX) - (pinchCenter.x - scrollView.bounds.origin.x)
        scrollView.contentOffset.x = max(0, min(newOffsetX, totalWidth - scrollView.bounds.width))

        view.layoutIfNeeded()
    case .ended, .cancelled:
        currentScale = scrollView.contentSize.width / (baseWidth * widthPerSample)
    default:
        break
    }
}
private func updateDisplayedSamples(scale: CGFloat) {
    let targetSampleCount = Int(baseWidth * scale)
    displayedSamples = downsampleWaveform(samples: rawSamples, targetCount: targetSampleCount)
    waveformView.amplitudes = displayedSamples

    totalWidth = CGFloat(displayedSamples.count) * widthPerSample
    contentWidthConstraint?.constant = totalWidth
    scrollView.contentSize = CGSize(width: totalWidth, height: 300)
}
private func downsampleWaveform(samples: [CGFloat], targetCount: Int) -> [CGFloat] {
    guard samples.count > 0, targetCount > 0 else { return [] }
    if samples.count <= targetCount {
        return samples
    }

    var downsampled: [CGFloat] = []
    let sampleSize = samples.count / targetCount
    for i in 0..<targetCount {
        let startIndex = i * sampleSize
        let endIndex = min(startIndex + sampleSize, samples.count)
        // Represent each chunk by its peak value (loop body reconstructed; the original snippet was cut off here)
        downsampled.append(samples[startIndex..<endIndex].max() ?? 0)
    }
    return downsampled
}
This approach is very inefficient: every time gesture.state changes I recompute the downsampled data and perform UI updates based on it. How can I implement this functionality more efficiently so the interaction stays smooth?