
Thread: NDI frames to OpenGL texture?

  1. #1
    roddyp (Registered User, UK, joined Sep 2017, 58 posts)

    NDI frames to OpenGL texture?

    Anyone using NDI with OpenGL?

    I'm hoping to be able to use the (faster) UYVY video frames as textures, but the double Y values in each 32-bit "pixel pair" mean that OpenGL won't interpolate the texture nicely if it's transformed at all (rough sketch of the upload below).

    RGBA works fine, but obviously will be slower.
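
    To make the layout concrete, the upload ends up being something along these lines (rough, untested sketch; the frame fields are the ones on the NDI video frame struct, everything else is just a placeholder):

    Code:
    // Packed UYVY: each 32-bit group U0 Y0 V0 Y1 covers TWO video pixels,
    // so the texture is created at HALF the video width with four channels.
    // GL_LINEAR then blends U/Y0/V/Y1 as if they were one colour, which is
    // the interpolation problem above.
    GLuint uyvyTex = 0;                           // placeholder handle
    glGenTextures(1, &uyvyTex);
    glBindTexture(GL_TEXTURE_2D, uyvyTex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    // frame is an NDIlib_video_frame_v2_t carrying UYVY data
    glPixelStorei(GL_UNPACK_ROW_LENGTH, frame.line_stride_in_bytes / 4);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8,
                 frame.xres / 2, frame.yres,      // half width: one texel per pixel pair
                 0, GL_RGBA, GL_UNSIGNED_BYTE, frame.p_data);
    glPixelStorei(GL_UNPACK_ROW_LENGTH, 0);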

  2. #2
    Jarno (LightWave Engineer, New Zealand, joined Aug 2003, 625 posts)

    If your platform supports pixel shaders, I suggest writing a shader to convert UYVY to RGBA (or RGB or BGR or whatever is optimal) and using that to convert the video frame data into OpenGL texture data. That way you also have better control over the colour space used.
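
    Off the top of my head (so untested), something along these lines. It assumes the packed UYVY frame is uploaded as a half-width GL_RGBA8 texture with GL_NEAREST filtering, so each texel holds (U, Y0, V, Y1) for a pair of adjacent pixels; the names are made up, and the matrix is the usual Rec.709 video-range one, so swap in whatever matches your source's colour space:

    Code:
    // Untested sketch: unpack UYVY and convert to RGB in a fragment shader.
    static const char* kUyvyToRgbFrag = R"(
    #version 330 core

    uniform sampler2D uUyvyTex;    // packed UYVY, width = video width / 2
    uniform int       uVideoWidth; // full video width in pixels

    in  vec2 vTexCoord;            // 0..1 across the full-width frame
    out vec4 fragColor;

    void main()
    {
        ivec2 packedSize = textureSize(uUyvyTex, 0);

        // Which full-resolution column is being shaded, and which texel holds it.
        int x = min(int(vTexCoord.x * float(uVideoWidth)), uVideoWidth - 1);
        int y = min(int(vTexCoord.y * float(packedSize.y)), packedSize.y - 1);
        vec4 uyvy = texelFetch(uUyvyTex, ivec2(x / 2, y), 0);

        // Even columns use Y0 (green channel), odd columns use Y1 (alpha).
        float Y = (x % 2 == 0) ? uyvy.g : uyvy.a;
        float U = uyvy.r - 128.0 / 255.0;
        float V = uyvy.b - 128.0 / 255.0;

        // Rec.709 video-range YCbCr -> RGB.
        float Yf  = 1.164 * (Y - 16.0 / 255.0);
        vec3  rgb = vec3(Yf + 1.793 * V,
                         Yf - 0.213 * U - 0.533 * V,
                         Yf + 2.112 * U);
        fragColor = vec4(clamp(rgb, 0.0, 1.0), 1.0);
    }
    )";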

  3. #3
    roddyp (Registered User, UK, joined Sep 2017, 58 posts)

    Thanks! I'm already using a fragment (AKA pixel?) shader to convert the YUV texture colour space when rendering - but that doesn't solve the 4:2:2/4:4:4 interpolation issue.

    You're suggesting I upload the frame as 4:2:2 UYVY, render it via a shader + framebuffer into a second texture as RGB with interpolation off on the source (GL_NEAREST), then use that second texture with interpolation on (GL_LINEAR)? I'll give that a go...
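
    In case it helps anyone else, the plan is roughly the sketch below (untested, error checking left out, and all the names are placeholders; uyvyToRgbProgram is the kind of conversion shader described in the previous post):

    Code:
    // Pass 1: packed UYVY texture (GL_NEAREST) -> full-width RGB texture via an FBO.
    // Pass 2: use that RGB texture like any normal texture, with GL_LINEAR.
    GLuint rgbTex = 0, fbo = 0;

    glGenTextures(1, &rgbTex);
    glBindTexture(GL_TEXTURE_2D, rgbTex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, videoWidth, videoHeight,
                 0, GL_RGB, GL_UNSIGNED_BYTE, nullptr);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);  // interpolation ON here
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, rgbTex, 0);

    // Pass 1: convert. The UYVY source stays GL_NEAREST so its packed
    // components are never blended before the shader unpacks them.
    glViewport(0, 0, videoWidth, videoHeight);
    glUseProgram(uyvyToRgbProgram);
    glUniform1i(glGetUniformLocation(uyvyToRgbProgram, "uUyvyTex"), 0);
    glUniform1i(glGetUniformLocation(uyvyToRgbProgram, "uVideoWidth"), videoWidth);
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, uyvyTex);
    drawFullScreenQuad();                         // placeholder for however the quad gets drawn

    // Pass 2: back to the default framebuffer, draw the scene with rgbTex
    // bound, and let GL_LINEAR do the filtering on proper RGB values.
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    glBindTexture(GL_TEXTURE_2D, rgbTex);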

    The other option is to try to do the correct interpolation in the shader itself (sketched below), but that's probably going to eat GPU resources. It would be ideal if the SDK could provide frames directly in YUV 4:4:4, or even output the 4:2:2 Y, U and V as separate planes.
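
    For that in-shader route, the conversion shader would have to do the bilinear filtering itself at full resolution, roughly as below (untested; for brevity the chroma just comes from each pixel's pair rather than being properly upsampled, there's no mipmapping, and the four fetches plus maths per fragment are exactly the GPU cost I'm worried about):

    Code:
    // Untested sketch: sample the packed UYVY texture with GL_NEAREST and do
    // the bilinear filtering manually, on unpacked full-resolution pixels.
    static const char* kUyvyManualBilinearFrag = R"(
    #version 330 core

    uniform sampler2D uUyvyTex;    // half-width packed UYVY, GL_NEAREST
    uniform int       uVideoWidth;

    in  vec2 vTexCoord;
    out vec4 fragColor;

    // Full-resolution YUV at integer pixel p; chroma comes from its pixel pair.
    vec3 fetchYuv(ivec2 p)
    {
        ivec2 sz = textureSize(uUyvyTex, 0);
        p.x = clamp(p.x, 0, uVideoWidth - 1);
        p.y = clamp(p.y, 0, sz.y - 1);
        vec4 pair = texelFetch(uUyvyTex, ivec2(p.x / 2, p.y), 0);
        float Y = (p.x % 2 == 0) ? pair.g : pair.a;
        return vec3(Y, pair.r, pair.b);
    }

    vec3 yuvToRgb(vec3 yuv)
    {
        float Yf = 1.164 * (yuv.x - 16.0 / 255.0);
        float U  = yuv.y - 128.0 / 255.0;
        float V  = yuv.z - 128.0 / 255.0;
        return vec3(Yf + 1.793 * V, Yf - 0.213 * U - 0.533 * V, Yf + 2.112 * U);
    }

    void main()
    {
        ivec2 sz = textureSize(uUyvyTex, 0);

        // Continuous full-resolution sample position, minus the half-texel offset.
        vec2  pos = vTexCoord * vec2(float(uVideoWidth), float(sz.y)) - 0.5;
        ivec2 p0  = ivec2(floor(pos));
        vec2  f   = fract(pos);

        // Manual bilinear blend of the four neighbouring full-resolution pixels.
        vec3 c00 = yuvToRgb(fetchYuv(p0));
        vec3 c10 = yuvToRgb(fetchYuv(p0 + ivec2(1, 0)));
        vec3 c01 = yuvToRgb(fetchYuv(p0 + ivec2(0, 1)));
        vec3 c11 = yuvToRgb(fetchYuv(p0 + ivec2(1, 1)));
        vec3 rgb = mix(mix(c00, c10, f.x), mix(c01, c11, f.x), f.y);

        fragColor = vec4(clamp(rgb, 0.0, 1.0), 1.0);
    }
    )";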
