NDIMediaSender of the NDI plugin for Unreal Engine does not support audio

chu1979

New member
Hi.

In the code of the NDIMediaSender in the NDI plugin for Unreal Engine, the function for sending audio is empty, as shown below:
Code:
/**
	This will attempt to generate an audio frame, add the frame to the stack and return immediately,
	having scheduled the frame asynchronously.
*/
void UNDIMediaSender::TrySendAudioFrame(int64 time_code)
{
	// Currently unsupported
}

For project reasons, I need to implement this function.
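
From my reading of the NDI SDK headers (my own assumption, not something documented by the plugin), sending audio would go through NDIlib_send_send_audio_v2 with an NDIlib_audio_frame_v2_t filled in, roughly like this:
Code:
// Sketch only, based on my reading of the NDI SDK headers; the buffer here is just silence.
// NDI wants planar float audio: all of channel 0's samples, then all of channel 1's, etc.
float samples[2 * 1024] = { 0.0f };

NDIlib_audio_frame_v2_t audio_frame;
audio_frame.sample_rate = 48000;                         // samples per second
audio_frame.no_channels = 2;
audio_frame.no_samples = 1024;                           // samples per channel
audio_frame.timecode = NDIlib_send_timecode_synthesize;  // let the SDK generate timecodes
audio_frame.p_data = samples;
audio_frame.channel_stride_in_bytes = 1024 * sizeof(float);

NDIlib_send_send_audio_v2(p_send_instance, &audio_frame);

But I am not sure how to hook something like this into the sender's update loop, or where the audio samples should come from inside the engine.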

Are there any suggestions or example code for sending audio?

Thanks a lot!
 

Rumzie

New member
I'm seeing the same thing. It looks like the audio sending gets set up during the object's initialization...

You can see it in NDIMediaSender.cpp:

C++:
void UNDIMediaSender::Initialize()
{
    if (this->p_send_instance == nullptr)
    {
        // Create valid settings to be seen on the network
        NDIlib_send_create_t settings;
        settings.clock_audio = false;
        settings.clock_video = false;
        settings.p_ndi_name = TCHAR_TO_UTF8(*this->SourceName);

        // create the instance and store it
        p_send_instance = NDIlib_send_create(&settings);

        // If it's valid then lets do some engine related setup
        if (p_send_instance != nullptr)
        {
            // Update the Render Target Configuration
            ChangeRenderTargetConfiguration(FrameSize, FrameRate);

            // Send audio frames at the end of the 'update' loop
            FNDIConnectionService::EventOnSendAudioFrame.AddUObject(this, &UNDIMediaSender::TrySendAudioFrame);

            // We don't want to limit the engine rendering speed to the sync rate of the connection hook
            // into the core delegates render thread 'EndFrame'
            FNDIConnectionService::EventOnSendVideoFrame.AddUObject(this, &UNDIMediaSender::TrySendVideoFrame);

            // Initialize the 'LastRender' timecode
            LastRenderTime = FTimecode::FromTimespan(0, FrameRate, FTimecode::IsDropFormatTimecodeSupported(FrameRate),
                                                     true // use roll-over timecode
            );

But yes, further down in the same file...

C++:
void UNDIMediaSender::TrySendAudioFrame(int64 time_code)
{
    // Currently unsupported
}

/**
    This will attempt to generate a video frame, add the frame to the stack and return immediately,
    having scheduled the frame asynchronously.
*/

I guess the confusing part about all this is the pipeline. According to the docs, the audio gets passed along like so:

NDIMediaReceiver -> NDIMediaSoundWave -> NDIMediaSender -> broadcast out to the network (NDI Monitor, etc.)

From what I gathered from the docs and from what is represented in the C++ code, the UNDIMediaSoundWave (essentially an Unreal sound object) gets passed around and eventually ends up at the NDIMediaSender, which is incapable of rebroadcasting the audio frame (see the code above).

This completely contradicts the docs, which state that there are audio-related properties that can be set:

- Under the NDI Media Receiver Section:

"AudioWave (NDIMediaSoundWave*) An NDI® IO Plugin specific Sound Wave object used to render audio frames from a connected NDI® sender."

- You cannot set this property in the UE4 editor, and I can't find where it gets set in the code by default.
- What's interesting is that it DOES get set by default in the NDIReceiverActor. Unfortunately, this means that the only way to get audio "in" to UE4 is to specifically use this actor and to also enable its playback in the editor.
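
If the AudioWave property from that doc quote really is just a publicly assignable property on the UNDIMediaReceiver (that's an assumption on my part, I haven't traced it in the code), then in theory you could create the sound wave yourself and hand it to the receiver from your own actor instead of relying on the NDIReceiverActor. An untested sketch, say from an actor's BeginPlay, where MyReceiver is whatever receiver object you're using:

C++:
// Untested sketch - assumes 'AudioWave' is publicly assignable on the receiver, which I
// have not verified; 'MyReceiver' is a hypothetical pointer to your UNDIMediaReceiver.
UNDIMediaSoundWave* SoundWave = NewObject<UNDIMediaSoundWave>(this, TEXT("NDIAudioWave"));
MyReceiver->AudioWave = SoundWave;

// Something still has to play the wave; a dynamically created audio component would do.
UAudioComponent* AudioComponent = NewObject<UAudioComponent>(this);
AudioComponent->RegisterComponent();
AudioComponent->SetSound(SoundWave);
AudioComponent->Play();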

What this all boils down to is that there is no way to sync audio with the video, since there is a delay between when UE4 schedules its audio frame and the network latency on the video frame going out over the NDI output stream from UE4.
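
For anyone who wants to experiment anyway, here is roughly what I would expect the missing function to look like, based on the NDI SDK's NDIlib_send_send_audio_v2 call. Treat it as an untested sketch: the buffer below is just silence, and a real implementation would have to pull samples from the engine's audio mixer or from the registered sound wave, which the plugin doesn't expose today.

C++:
void UNDIMediaSender::TrySendAudioFrame(int64 time_code)
{
    // Untested sketch, not plugin code. Only try to send if the NDI instance exists.
    if (p_send_instance != nullptr)
    {
        const int32 NumChannels = 2;
        const int32 NumSamples = 1024;

        // Hypothetical sample buffer; a real implementation would fill this with engine
        // audio instead of silence (planar layout: channel 0 block, then channel 1 block).
        TArray<float> Samples;
        Samples.SetNumZeroed(NumChannels * NumSamples);

        NDIlib_audio_frame_v2_t audio_frame;
        audio_frame.sample_rate = 48000;
        audio_frame.no_channels = NumChannels;
        audio_frame.no_samples = NumSamples;
        audio_frame.timecode = NDIlib_send_timecode_synthesize; // or pass 'time_code' through if it's already an NDI-style 100 ns timecode
        audio_frame.p_data = Samples.GetData();
        audio_frame.channel_stride_in_bytes = static_cast<int>(NumSamples * sizeof(float));
        audio_frame.p_metadata = nullptr;

        // Hand the frame to the SDK; the call returns quickly and the SDK handles transmission.
        NDIlib_send_send_audio_v2(p_send_instance, &audio_frame);
    }
}

Note that NDI expects planar float audio rather than the interleaved buffers UE4's audio mixer usually produces, so a real implementation would also need a deinterleave step, and it would still run into the audio/video sync problem described above.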

Hopefully someone with a little more information and expertise can chime in and either correct me if I'm wrong or confirm.

Thanks
 