Hello, I can capture audio and video using NDI Advanced, but the audio arrives as raw PCM, specifically .pcmFormatFloat32 at 48 kHz. I managed to convert it to an AVAudioPCMBuffer, and with the code below the audio plays correctly through a regular AVAudioPCMBuffer. However, when I wrap that buffer in a CMSampleBuffer and append it to an AVAssetWriterInput, the packet timing comes out wrong and the audio plays back either too slow or too fast. I couldn't figure out how to set the timing correctly during the conversion. How can I make this conversion more efficient and get the timing right?
Swift:
private func bytesToAudioBuffer(_ buf: [UInt8], fmt: AVAudioFormat?) -> AVAudioPCMBuffer {
    guard let fmt = fmt, fmt.commonFormat == .pcmFormatFloat32 else {
        fatalError("expected a non-nil Float32 PCM format")
    }
    let frameLength = UInt32(buf.count) / fmt.streamDescription.pointee.mBytesPerFrame
    guard let audioBuffer = AVAudioPCMBuffer(pcmFormat: fmt, frameCapacity: frameLength) else {
        fatalError("could not allocate AVAudioPCMBuffer")
    }
    audioBuffer.frameLength = frameLength

    // Reinterpret the raw bytes as Float32 samples and copy the same plane
    // into both channels. Note: if the NDI feed is planar stereo, the right
    // channel's samples actually start at an offset of frameLength floats.
    let dstLeft = audioBuffer.floatChannelData![0]
    let dstRight = audioBuffer.floatChannelData![1]
    buf.withUnsafeBufferPointer { rawPtr in
        let src = UnsafeRawPointer(rawPtr.baseAddress!)
            .bindMemory(to: Float.self, capacity: Int(frameLength))
        // The channel planes are already allocated, so update (not initialize) them.
        dstLeft.update(from: src, count: Int(frameLength))
        dstRight.update(from: src, count: Int(frameLength))
    }
    return audioBuffer
}
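
For context on the timing part of the question, here is a minimal sketch of how an AVAudioPCMBuffer might be wrapped in a CMSampleBuffer with a presentation timestamp derived from a running frame count, so consecutive buffers abut at the source sample rate. This is an assumption-laden sketch, not the asker's code: `presentationSampleOffset` is a hypothetical parameter meaning "total frames appended to the writer so far".

```swift
import AVFoundation
import CoreMedia

// Sketch: wrap a PCM buffer in a CMSampleBuffer whose PTS/duration are
// expressed in sample ticks at the buffer's own sample rate, so the
// AVAssetWriterInput sees correctly paced audio.
private func makeSampleBuffer(from pcmBuffer: AVAudioPCMBuffer,
                              presentationSampleOffset: Int64) -> CMSampleBuffer? {
    var formatDescription: CMAudioFormatDescription?
    guard CMAudioFormatDescriptionCreate(allocator: kCFAllocatorDefault,
                                         asbd: pcmBuffer.format.streamDescription,
                                         layoutSize: 0, layout: nil,
                                         magicCookieSize: 0, magicCookie: nil,
                                         extensions: nil,
                                         formatDescriptionOut: &formatDescription) == noErr,
          let format = formatDescription else { return nil }

    let sampleRate = Int32(pcmBuffer.format.sampleRate)
    // One frame lasts 1/sampleRate seconds; the PTS is the running frame
    // offset on the same timescale, so buffer N+1 starts where N ended.
    var timing = CMSampleTimingInfo(
        duration: CMTime(value: 1, timescale: sampleRate),
        presentationTimeStamp: CMTime(value: presentationSampleOffset,
                                      timescale: sampleRate),
        decodeTimeStamp: .invalid)

    var sampleBuffer: CMSampleBuffer?
    guard CMSampleBufferCreate(allocator: kCFAllocatorDefault,
                               dataBuffer: nil,
                               dataReady: false,
                               makeDataReadyCallback: nil,
                               refcon: nil,
                               formatDescription: format,
                               sampleCount: CMItemCount(pcmBuffer.frameLength),
                               sampleTimingEntryCount: 1,
                               sampleTimingArray: &timing,
                               sampleSizeEntryCount: 0,
                               sampleSizeArray: nil,
                               sampleBufferOut: &sampleBuffer) == noErr,
          let sbuf = sampleBuffer else { return nil }

    // Attach the PCM planes to the sample buffer.
    CMSampleBufferSetDataBufferFromAudioBufferList(
        sbuf,
        blockBufferAllocator: kCFAllocatorDefault,
        blockBufferMemoryAllocator: kCFAllocatorDefault,
        flags: 0,
        bufferList: pcmBuffer.audioBufferList)
    return sbuf
}
```

With this approach, the caller would keep a running `Int64` counter and add `pcmBuffer.frameLength` to it after each append, so timing never drifts regardless of how many frames each NDI packet carries.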