
Thread: Diagnosing inconsistent frames from NDI HX cameras with the NDI Analyzer

  1. #1
    Registered User
    Join Date
    Aug 2020

    Diagnosing inconsistent frames from NDI HX cameras with the NDI Analyzer

    Hello guys, I hope to gain some knowledge here to help me along.

    First, just a brief summary of my technical setup:
    I often have setups with multiple PTZ cams, all wired up to a workstation running vMix.
    All cams are NDI HX; about 50% of them run 4K 30fps and the other half 1080p 60fps.

    From the start the 4K cams have given me somewhat inconsistent performance that tends to get worse as I scale out, and I can't seem to pinpoint where the weak point is.
    At 30fps it is just not as smooth as you would think; there is always a bit of stutter here and there.

    It's a closed system with only 1 network hop/switch, so no mDNS or any other factors.
    The workstation is a 24-core Threadripper with 64 GB DDR4-3600 and an RTX 2080 Super, so more than enough power.
    It is linked to the main switch over a 10-gigabit Ethernet port and the PTZs are on the switch's 1 Gb PoE+ ports, so bandwidth is plenty.
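    A quick back-of-envelope sum supports that: even with hypothetical per-camera figures (the 4K rate matches my configured 52100 kbps; the 1080p60 figure is just an assumed placeholder), the uplink has huge headroom.

    ```python
    # Rough network-load estimate. Camera counts are from my setup;
    # the 1080p60 bitrate is an assumption, not a measured value.
    cameras = [
        ("4K30 HX", 3, 52.1),     # name, count, Mbps per camera
        ("1080p60 HX", 3, 20.0),  # assumed figure
    ]
    total_mbps = sum(count * mbps for _, count, mbps in cameras)
    uplink_mbps = 10_000  # 10 GbE link from switch to workstation
    headroom = uplink_mbps / total_mbps
    print(f"{total_mbps:.1f} Mbps total, about {headroom:.0f}x headroom")
    ```

    So raw bandwidth really shouldn't be the bottleneck here.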

    I tried lots of tweaks on the switch; some seem to influence things a bit, but nothing got the desired result.
    The same inconsistent behavior is also present when linking a camera directly to the workstation.

    So the NDI Analyzer looks really promising for diagnosing the cameras and changes on the network switch, but I have some trouble interpreting the results, even after reading the included documentation.

    I have noticed the video recv value is very dependent on resolution, bitrate and encoding parameters.
    But setting everything as low as possible results in vMix dropping most frames, so now I wonder if it's better to focus on the deviation between reported values instead of absolutes?
    Also, some of my values are so far off the values quoted in the documentation that I wonder how this can still give a reasonably smooth image, or am I understanding this poorly?
    I'm not ruling out that the cameras are to blame, because they're Chinese OEM cameras sold by a local company and fairly priced accordingly.
    Although I found out this is the same OEM that builds for PTZOptics, HuddleCam and a couple of other big brands, so they should have some knowledge.
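    To illustrate what I mean by focusing on deviation: a small sketch that looks at the spread of inter-frame intervals instead of the absolute recv values (the timestamps below are made up, not taken from my attached Analyzer logs).

    ```python
    # Hypothetical frame-arrival timestamps in milliseconds
    timestamps_ms = [0.0, 33.4, 66.6, 100.1, 140.2, 166.8, 200.1]

    intervals = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    expected = 1000 / 30  # nominal frame period at 30 fps, ~33.33 ms
    mean = sum(intervals) / len(intervals)
    stddev = (sum((x - mean) ** 2 for x in intervals) / len(intervals)) ** 0.5
    # Flag intervals more than 5 ms away from the nominal frame time
    outliers = [x for x in intervals if abs(x - expected) > 5.0]
    print(f"mean {mean:.2f} ms, stddev {stddev:.2f} ms, {len(outliers)} outliers")
    ```

    The average can look perfectly fine at ~33 ms while the outliers are exactly the stutter you see on screen.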

    I would like to understand the Analyzer better so I can optimize as much as possible with the given hardware.

    So I hope to get some tips and pointers.

    I have attached a txt from 3 NDI Analyzer runs:
    1. 4K 30fps, bitrate 52100 - the OEM added that high bitrate setting more or less at my request, to prevent compression artifacts in very complex scenes.
    2. 4K 30fps, bitrate 10240 - half the bitrate of what is normally the max.
    3. 720p 30fps, bitrate 10240 - as a very low-end control run.

    Any help is appreciated.
    Attached Files

  2. #2
    How does the video look in NDI Studio Monitor with one source? Is that okay?

    I ask because when working with many cameras, HX-type sources can decode on the GPU. It sounds like you have multiple cameras (how many?), with half of them running in 4K.

    If any single camera looks okay in NDI Studio Monitor, but the issue appears with many cameras at once, then it can be a question of how much the destination system can support at once.

    I looked at your NDI Analysis results. The main thing is the 'video recv' and 'video send' times. They look to be correct for 30fps frame rates, which says that the data is coming out of the cameras and through the network with correct timing. That points to the destination system.
    Kane Peterson
    Solutions Architect
    NewTek, Inc.

  3. #3
    Registered User
    Join Date
    Aug 2020
    Hey Kane,

    Thank you for your response; that was not the direction I was expecting, but very interesting to pursue.
    I was a bit confused by the high max of the receive parameter.

    So I'm actually glad it's probably not the camera.
    Looking at the stream in Studio Monitor it's the same slightly unsmooth picture; in vMix by default it's also not great and a lot of frames get dropped until you select the NDI source with the large buffer enabled.
    I had the same experience on several high-end systems I have built; all these systems had completely different components but showed the same behavior with the cameras.
    It really gives the feeling of inconsistent frame times, like gaming at 40fps without an adaptive-sync monitor.
    Again, the 60fps cameras have no visible issues; it's just the 4K cameras that have this problem, and unfortunately I have no other brands to A/B test against.
    That is why I began looking at the network and the cameras.

    I do a lot of tweaking to get the best low-latency performance; I've been doing that for Dante digital audio for a long time.
    Dedicated, up-to-date Windows stripped of all non-essential apps, all proper drivers.
    Fixed clock speed with SMT off, since there are plenty of cores; high-speed, tuned memory.
    The new GPU hardware scheduler was also a big improvement in performance.
    I also tried gigabit out versus 10 Gb and it makes no difference.

    The only thing I can't rule out is an inherent disadvantage of the Ryzen 3000 architecture.
    All systems were Ryzen 3900X or 3950X, or a 3960X Threadripper.
    Or maybe it's something with the way the network port is connected to the CPU.

    In the biggest setup it's three 4K cameras and three 1080p cameras; unfortunately I don't have a Quadro yet, so the 2080 Super is limited to 3 encode/decode sessions, if NDI HX uses NVENC for that.
    Or would this GPU decode only work on an Intel CPU with an integrated GPU?

    What would your recommendations be for where to look first and what to test for possible solutions?
    NIC settings, just raw performance and tuning, or the software side?
    On paper all the workstations are very beefy machines, well above the reference system spec, so somewhere this should be able to run smoothly.
    The same behavior is seen in vMix, OBS, Studio Monitor and Wirecast, although when 2 programs are open at the same time they differ in the moments of stutter.
    It's just not as smooth as you would expect of 30fps, and I can't seem to completely get rid of some dropped frames, like 1 or 2 in 60, even with just 1 camera.
    And often it's 2 frames dropped at once, so it's noticeable.
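    A sketch of how I count those drops from frame-arrival timestamps (hypothetical values; the assumption is that a drop shows up as a roughly doubled or tripled interval):

    ```python
    def count_dropped(timestamps_ms, fps=30):
        """Estimate dropped frames: an interval of ~2x the frame
        period means one frame was lost, ~3x means two, and so on."""
        period = 1000 / fps
        dropped = 0
        for a, b in zip(timestamps_ms, timestamps_ms[1:]):
            periods = round((b - a) / period)  # whole frame periods in the gap
            if periods > 1:
                dropped += periods - 1
        return dropped

    # A gap of ~100 ms at 30fps means two frames went missing at once
    print(count_dropped([0.0, 33.3, 133.3, 166.7]))  # prints 2
    ```

    That matches what I see: when it stutters, it's usually a multi-frame gap rather than single isolated drops.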

    I will see if I can cut and link a demo video somewhere this week to show what I mean.
    I've been rambling on for too long now; I will dive back into looking at the workstation side of things.

    Thanks in advance for the assistance.

    Kind regards

    Peter van Kalleveen

  4. #4
    Registered User
    Join Date
    Aug 2020
    With some further thought about the matter, I realized the potential of what you said.
    Does NDI HX decode on the GPU, i.e. on the NVENC block of an Nvidia GPU? Because normal NDI is done on the CPU, if I'm not mistaken.
    In that case the 3 encode/decode sessions allowed on a consumer GPU will be gone in an instant, because streaming and recording are also best done with hardware acceleration.
    Could you confirm whether this is really true - that NDI HX with hardware acceleration occupies an encode/decode session on the GPU?
    If it is... it's unfortunately probably Quadro-buying time.

  5. #5
    NDI-HX does support GPU decoding (it can be on CPU as well, but typically GPU is used). In some applications (like NDI Studio Monitor) you can choose the decoding method with the 'Allow HW Acceleration' option. High bandwidth NDI is CPU based.

    I think you are confusing PureVideo and NVENC. I wasn't aware of specific limits for H.264 PureVideo decoding on Nvidia GPUs, but you are correct that there are limits on GeForce GPUs for NVENC encoding of H.264.
    Kane Peterson
    Solutions Architect
    NewTek, Inc.

  6. #6
    Registered User
    Join Date
    Aug 2020
    After some fact-checking of my own previous assumptions, I indeed found that Nvidia only segments the cards by the maximum number of encoding sessions, not decoding sessions, on the fixed-function hardware.
    I was also not aware that the video decoding block is called NVDEC, so extra learning for me.

    Your reference to PureVideo and NVENC was new to me; after some reading into it, I guess PureVideo is a low-level driver layer between the hardware and software - please correct me if I'm mistaken.

    I also ran a quick test of the claim that decoding is not limited to 3 sessions like encoding (it used to be two). Indeed, even at 6 HX decodes there is no hard cutoff where the CPU takes over the burden of decoding.
    I did see that with two 4K decoding sessions the GPU decoder is already at around 60-65%, so I wonder if I normally overburden the GPU without realizing it, because I often run 3 local displays, a stream with HW encoding, 1 or more recordings (preferably with HW encoding) and some extra full-bandwidth NDI streams to and from other computers.
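    For anyone who wants to watch that fixed-function load while testing, a little sketch that polls `nvidia-smi` (it assumes an Nvidia GPU with a driver recent enough to expose the encoder/decoder utilization query fields):

    ```python
    import subprocess

    def parse_codec_load(csv_line):
        # "12, 63" -> (12, 63): NVENC %, NVDEC % utilization
        enc, dec = (int(v.strip()) for v in csv_line.split(","))
        return enc, dec

    def gpu_codec_load():
        # Query the dedicated encode/decode engine utilization;
        # requires nvidia-smi on PATH and an Nvidia driver
        out = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=utilization.encoder,utilization.decoder",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
        return parse_codec_load(out.strip())
    ```

    Polling this in a loop while adding decode sessions one at a time would show exactly where NVDEC saturates.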

    Also, I use a little magic to unlock the NVENC encoder so the encoding session limit is at its theoretical max of 24, but combined with the decodes the GPU might become the real bottleneck.

    That is not a conclusion I would have thought of a couple of days ago, before this conversation.
    So thank you for that.
