
View Full Version : Using NDI SDK on iOS



FeralBob
08-19-2018, 10:15 AM
Hi All,

* Disclaimer - I'm a total newbie when it comes to video processing and NDI - be kind if I have a fundamental misunderstanding :)

I'm trying to send a video signal using the iOS SDK. I get a CMSampleBuffer, from which I can get a CVPixelBuffer, which is in a 420f/NV12/Bi-Planar format. However, in the internal representation the Y and UV planes are non-contiguous, resulting in corruption of the NDI stream... see below

[attachment 142527: screenshot of the corrupted stream]

If I memcpy the Y and UV planes to a contiguous chunk of memory, then I'm all good and the receiving Monitor displays the image correctly when in 'low quality' mode. However, if the Monitor is not in 'low quality' mode, it crashes.
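
For reference, the memcpy workaround looks roughly like this (a minimal CPU-side sketch: `packNV12` is a hypothetical helper name, and the plane pointers/strides stand in for what `CVPixelBufferGetBaseAddressOfPlane` / `CVPixelBufferGetBytesPerRowOfPlane` would return, since CoreVideo may row-pad each plane):

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Pack two non-contiguous NV12 planes (each possibly row-padded) into one
// contiguous buffer: full-res Y plane followed by half-height interleaved UV.
std::vector<uint8_t> packNV12(const uint8_t* yPlane, size_t yStride,
                              const uint8_t* uvPlane, size_t uvStride,
                              int width, int height) {
    std::vector<uint8_t> out(width * height * 3 / 2);
    uint8_t* dst = out.data();
    // Y plane: height rows, width bytes each (stride may exceed width).
    for (int row = 0; row < height; ++row) {
        std::memcpy(dst, yPlane + row * yStride, width);
        dst += width;
    }
    // Interleaved UV plane: height/2 rows, width bytes each.
    for (int row = 0; row < height / 2; ++row) {
        std::memcpy(dst, uvPlane + row * uvStride, width);
        dst += width;
    }
    return out;
}
```

The row-by-row copy matters even within one plane: if the stride equals the width you could do a single memcpy per plane, but CoreVideo makes no such guarantee.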

So, my questions:
1) Any idea why 'low quality' would work but 'high quality' would cause the receiver to crash?
2) Is there any way to specify addresses for the Y and UV planes, rather than assuming pData is contiguous, thus saving a memcpy in my app code?
3) I suspect (but have not verified yet) that doing a Metal image conversion on the GPU from NV12 to UYVY will yield better performance. If I'm going to write a compute shader to do the conversion, is there any other processing I could do on the GPU (e.g. any of the compression) so that the send has even less work to do?
3a) If Metal is out of the question, I could potentially convert using vImageConvert_AnyToAny to make use of the CPU's vector processing.
4) Talking of compression, is there any way to make use of the hardware compression on iOS devices, again to minimize the work the NDI lib has to do?
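
On question 3, the repack a compute shader would perform can be prototyped on the CPU first. A minimal sketch (`nv12ToUYVY` is a hypothetical helper), assuming a contiguous NV12 buffer and nearest-neighbor duplication of chroma rows to go from 4:2:0 up to UYVY's 4:2:2:

```cpp
#include <cstdint>
#include <vector>

// NV12:  Y plane (w*h bytes) followed by interleaved UV at half resolution.
// UYVY:  macropixels of [U Y0 V Y1] - two horizontal pixels share one U/V pair.
std::vector<uint8_t> nv12ToUYVY(const std::vector<uint8_t>& nv12,
                                int width, int height) {
    const uint8_t* y  = nv12.data();
    const uint8_t* uv = nv12.data() + width * height; // UV plane follows Y
    std::vector<uint8_t> out(width * height * 2);     // 2 bytes per pixel
    for (int row = 0; row < height; ++row) {
        // 4:2:0 -> 4:2:2: vertically adjacent rows reuse the same UV row.
        const uint8_t* uvRow = uv + (row / 2) * width;
        uint8_t* dst = out.data() + row * width * 2;
        for (int col = 0; col < width; col += 2) {
            dst[col * 2 + 0] = uvRow[col + 0];           // U
            dst[col * 2 + 1] = y[row * width + col];     // Y0
            dst[col * 2 + 2] = uvRow[col + 1];           // V
            dst[col * 2 + 3] = y[row * width + col + 1]; // Y1
        }
    }
    return out;
}
```

The inner loop is trivially parallel per macropixel, which is what makes it a natural fit for a Metal compute shader (or for vImage's vectorized converters).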

Thanks for any help or guidance. Looking forward to chatting more.

Cheers

Bob

livepad
08-20-2018, 04:46 AM
I would suggest you stick to passing UYVY to NDI.
Support for planar 420 is more recent and probably not proven on iOS. UYVY does appear to be reliable.




FeralBob
09-23-2018, 06:19 PM
1) Any idea why 'low-quality' would work - but 'high quality' would cause the receiver to crash?


Just tried the same code on my iPad, and it works in high-quality mode. Too much CPU usage for my liking, but NV12 did work. More investigation required.

FeralBob
01-02-2019, 07:14 PM
And with 3.8 it's working on iPhone in NV12 mode, with no other code changes on my part... I might pick up my project again and get it released.