
Thread: WPF RenderTargetBitmap is too slow, any alternatives?

  1. #1
    Registered User
    Join Date
    Aug 2019
    Location
    X
    Posts
    2

    WPF RenderTargetBitmap is too slow, any alternatives?

    Hello,

    I am using the WPF NdiSendContainer to output my WPF application to NDI. While it works great and is very easy to use, it has the major drawback of using RenderTargetBitmap to render the scene into a bitmap. This is simply too slow for my purposes: I'd like to reach something close to 60 FPS, but with even a slightly complex visual tree the frame rate drops below 5. Moreover, since the conversion happens on the UI thread, the entire application bogs down as well.
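For reference, the slow path I'm trying to replace looks roughly like this (a minimal sketch of the RenderTargetBitmap approach; the method name and parameters are my own, and NdiSendContainer's actual implementation may differ):

```csharp
using System.Windows.Media;
using System.Windows.Media.Imaging;

static class SlowCapture
{
    // Rasterize a WPF Visual into a raw pixel buffer on the UI thread.
    // RenderTargetBitmap renders in software, which is what makes this
    // approach slow for complex visual trees.
    public static byte[] RenderVisualToPixels(Visual visual, int width, int height, double dpi = 96)
    {
        var rtb = new RenderTargetBitmap(width, height, dpi, dpi, PixelFormats.Pbgra32);
        rtb.Render(visual);

        int stride = width * 4; // 4 bytes per pixel for Pbgra32
        var pixels = new byte[height * stride];
        rtb.CopyPixels(pixels, stride, 0);
        return pixels;
    }
}
```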

    I'm looking for an alternative way to render a WPF visual into a Bitmap, so it can be sent via NDI.

    One important requirement is that it should support transparency. The Visual I want to render is on a transparent window (AllowsTransparency = true), and I only want the visual elements to be in the NDI stream, so I can put it on top of any other content when mixing later.

    I realize this is a bit outside the scope of NDI, but I wanted to post it here anyway since I thought perhaps someone else has run into this issue before. Or maybe there's a better way to do what I'm trying to do.


    So far I have stumbled across two methods that seemed promising at first, but with both I got stuck. One uses a combination of Graphics.FromHwnd and BitBlt to copy pixels directly from the graphics object. It works and is much faster, but it does not seem to support transparency, so it won't work for my goals.
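For completeness, the BitBlt route I tried looks roughly like this (a sketch with illustrative names; note that GDI's SRCCOPY raster operation does not carry a meaningful alpha channel, which is why transparency is lost with this method):

```csharp
using System;
using System.Drawing;
using System.Runtime.InteropServices;

static class GdiCapture
{
    [DllImport("gdi32.dll")]
    static extern bool BitBlt(IntPtr hdcDest, int x, int y, int w, int h,
                              IntPtr hdcSrc, int srcX, int srcY, int rop);

    const int SRCCOPY = 0x00CC0020;

    // Capture a window's client area into a 32bpp bitmap via GDI.
    // Fast, but the resulting alpha channel is not meaningful.
    public static Bitmap Capture(IntPtr hwnd, int width, int height)
    {
        var bmp = new Bitmap(width, height, System.Drawing.Imaging.PixelFormat.Format32bppArgb);
        using (var gDest = Graphics.FromImage(bmp))
        using (var gSrc = Graphics.FromHwnd(hwnd))
        {
            IntPtr hdcDest = gDest.GetHdc();
            IntPtr hdcSrc = gSrc.GetHdc();
            try
            {
                // SRCCOPY copies color data only; transparent areas come out opaque.
                BitBlt(hdcDest, 0, 0, width, height, hdcSrc, 0, 0, SRCCOPY);
            }
            finally
            {
                gDest.ReleaseHdc(hdcDest);
                gSrc.ReleaseHdc(hdcSrc);
            }
        }
        return bmp;
    }
}
```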

    The second method is rather more complicated to explain. I stumbled across a UWP Composition sample that does Screen Capturing:
    https://github.com/microsoft/Windows.../ScreenCapture

    This sample seems to use a combination of UWP code and SharpDX to grab a "Texture2D" object from any window. I tried to hook into the code at the point where the Texture2D is being grabbed, and convert it to a Bitmap. This is not a typical WPF Bitmap but a SharpDX Bitmap. With some pieces of code I found here and there I managed to render that object into a byte array (or Stream) that I can then send along via NDI. The code is at the bottom for reference.

    Here the same issue appears: it works, I can see my window and send it over NDI, but again there is no transparency. Instead of the transparent areas it just captures black.

    I was about to give up, but then I noticed that this Screen Capture sample has a built-in "Graphics Capture Picker": it shows a list of windows available for capture with a preview, much like Skype's screen sharing (I think Skype actually uses the same technique). The important bit: this preview DOES show the transparency. If I select my transparent window, the selection color appears through the transparent parts. I realize it's a stretch, but if this preview window is able to make a live preview of my transparent window, then surely I should be able to do the same somehow?

    There are a number of pixel formats in play in this code; I am hopeful that something just needs to be configured the right way for it to capture the transparency, but so far I have not figured it out. If anyone has any input, I'd be happy to try it!
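Since a couple of the formats in the code below are explicitly premultiplied (the swap chain's AlphaMode.Premultiplied, WIC's Format32bppPBGRA), one mismatch worth ruling out is alpha interpretation: if a downstream consumer expects straight (non-premultiplied) alpha, premultiplied pixels render too dark in semi-transparent areas. The per-pixel conversion is simple; a minimal sketch for BGRA buffers (the helper name is my own):

```csharp
using System;

// Hypothetical helper: convert premultiplied BGRA pixels to straight
// (non-premultiplied) alpha in place. For each color channel c:
//   straight = premultiplied * 255 / alpha   (alpha == 0 leaves the pixel as-is)
static class AlphaUtil
{
    public static void UnpremultiplyBgra(byte[] pixels)
    {
        for (int i = 0; i + 3 < pixels.Length; i += 4)
        {
            byte a = pixels[i + 3];
            if (a == 0 || a == 255) continue; // fully transparent or opaque: no change needed
            pixels[i]     = (byte)Math.Min(255, pixels[i]     * 255 / a); // B
            pixels[i + 1] = (byte)Math.Min(255, pixels[i + 1] * 255 / a); // G
            pixels[i + 2] = (byte)Math.Min(255, pixels[i + 2] * 255 / a); // R
        }
    }
}
```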


    A screenshot of the "Graphics Picker" preview window which shows the transparency (bottom left window):
    [Attachment: capturepicker.jpg]



    Here is where I hook in and try to grab the captured frame to convert it to a usable bitmap or byte array:
    Code:
                using (var backBuffer = swapChain.GetBackBuffer<SharpDX.Direct3D11.Texture2D>(0))
                using (var bitmap = Direct3D11Helper.CreateSharpDXTexture2D(frame.Surface))
                {
                    d3dDevice.ImmediateContext.CopyResource(bitmap, backBuffer);
    
                    // My addition:
                    GetBitmap(bitmap);
                }
    Code:
    private void GetBitmap(Texture2D texture)
        {
            // Create texture copy
            var copy = new Texture2D(d3dDevice, new Texture2DDescription
            {
                Width = texture.Description.Width,
                Height = texture.Description.Height,
                MipLevels = 1,
                ArraySize = 1,
                Format = texture.Description.Format,
                Usage = ResourceUsage.Staging,
                SampleDescription = new SampleDescription(1, 0),
                BindFlags = BindFlags.None,
                CpuAccessFlags = CpuAccessFlags.Read,
                OptionFlags = ResourceOptionFlags.None
            });
    
            // Copy data
            d3dDevice.ImmediateContext.CopyResource(texture, copy);
    
            var dataBox = d3dDevice.ImmediateContext.MapSubresource(copy, 0, 0, MapMode.Read, MapFlags.None,
                out DataStream stream);
            var rect = new DataRectangle
            {
                DataPointer = stream.DataPointer,
                Pitch = dataBox.RowPitch
            };
            
            var format = PixelFormat.Format32bppPBGRA;
            Bitmap bmp = new Bitmap(factory, copy.Description.Width, copy.Description.Height, format, rect);
            
            using (var ras = new InMemoryRandomAccessStream())
            {
                var ms = ras.AsStream(); // Do not dispose here
                using (var wic = new WICStream(factory, ms))
                using (var encoder = new PngBitmapEncoder(factory, wic))
                using (var frame = new BitmapFrameEncode(encoder))
                {
                    frame.Initialize();
                    frame.SetSize(bmp.Size.Width, bmp.Size.Height);
                    frame.SetPixelFormat(ref format);
                    frame.WriteSource(bmp);
                    frame.Commit();
                    encoder.Commit();
                }
    
                BitmapCaptured?.Invoke(this, new CaptureEventArgs(ms, bmp.Size.Width, bmp.Size.Height));
            }
    
            d3dDevice.ImmediateContext.UnmapSubresource(copy, 0);
            copy.Dispose();
            bmp.Dispose();
        }
    Here is where the screen capture is initialized, along with the various settings I am trying to play with:
    Code:
        public BasicCapture(IDirect3DDevice d, GraphicsCaptureItem i, BitmapImage im)
        {
            item = i;
            device = d;
            d3dDevice = Direct3D11Helper.CreateSharpDXDevice(device);
    
            factory = new ImagingFactory();
    
            var dxgiFactory = new SharpDX.DXGI.Factory2();
            var description = new SharpDX.DXGI.SwapChainDescription1()
            {
                Width = item.Size.Width,
                Height = item.Size.Height,
                Format = SharpDX.DXGI.Format.B8G8R8A8_UNorm,
                Stereo = false,
                SampleDescription = new SharpDX.DXGI.SampleDescription()
                {
                    Count = 1,
                    Quality = 0
                },
                Usage = SharpDX.DXGI.Usage.RenderTargetOutput,
                BufferCount = 2,
                Scaling = SharpDX.DXGI.Scaling.Stretch,
                SwapEffect = SharpDX.DXGI.SwapEffect.FlipSequential,
                AlphaMode = SharpDX.DXGI.AlphaMode.Premultiplied,
                Flags = SharpDX.DXGI.SwapChainFlags.None
            };
            swapChain = new SharpDX.DXGI.SwapChain1(dxgiFactory, d3dDevice, ref description);
    
            framePool = Direct3D11CaptureFramePool.Create(
                device,
                DirectXPixelFormat.B8G8R8A8UIntNormalized,
                2,
                i.Size);
            session = framePool.CreateCaptureSession(i);
            lastSize = i.Size;
    
            framePool.FrameArrived += OnFrameArrived;
        }
    Finally, the BitmapCaptured event is handled in my application, and the Stream (which contains the PNG-encoded frame data) is sent over NDI via:
    Code:
            public void SendData(Stream ms, int width, int height)
            {
                var image = new BitmapImage();
                image.BeginInit();
                image.CacheOption = BitmapCacheOption.OnLoad;
                image.StreamSource = ms;
                image.EndInit();
                ms.Dispose();
    
                int xres = width; // NdiWidth;
                int yres = height; //NdiHeight;
    
                int frNum = NdiFrameRateNumerator;
                int frDen = NdiFrameRateDenominator;
    
                // sanity
                if (sendInstancePtr == IntPtr.Zero || xres < 8 || yres < 8)
                    return;
    
                stride = (xres * 32/*BGRA bpp*/ + 7) / 8;
                bufferSize = yres * stride;
                aspectRatio = (float)xres / (float)yres;
    
                // allocate some memory for a video buffer
                IntPtr bufferPtr = Marshal.AllocHGlobal(bufferSize);
    
                // We are going to create a progressive frame at 60Hz.
                NDIlib.video_frame_v2_t videoFrame = new NDIlib.video_frame_v2_t()
                {
                    // Resolution
                    xres = xres,
                    yres = yres,
                    // Use BGRA video
                    FourCC = NDIlib.FourCC_type_e.FourCC_type_BGRA,
                    // The frame rate
                    frame_rate_N = frNum,
                    frame_rate_D = frDen,
                    // The aspect ratio
                    picture_aspect_ratio = aspectRatio,
                    // This is a progressive frame
                    frame_format_type = NDIlib.frame_format_type_e.frame_format_type_progressive,
                    // Timecode.
                    timecode = NDIlib.send_timecode_synthesize,
                    // The video memory used for this frame
                    p_data = bufferPtr,
                    // The line to line stride of this image
                    line_stride_in_bytes = stride,
                    // no metadata
                    p_metadata = IntPtr.Zero,
                    // only valid on received frames
                    timestamp = 0
                };
    
                fmtConvertedBmp = new FormatConvertedBitmap();
                fmtConvertedBmp.BeginInit();
                fmtConvertedBmp.Source = image;
                fmtConvertedBmp.DestinationFormat = PixelFormats.Bgra32;
                fmtConvertedBmp.EndInit();
    
                fmtConvertedBmp.CopyPixels(new Int32Rect(0, 0, xres, yres), bufferPtr, bufferSize, stride);
                
                // add it to the output queue
                AddFrame(videoFrame);
            }

  2. #2
    Registered User
    Join Date
    Aug 2019
    Location
    X
    Posts
    2
    Several weeks later, the answer is finally clear: the failure to capture transparency is a bug in the current Windows 10 builds. The bug has been fixed and can be tested in the latest Insider Build (fast ring). There, the code above works with no further changes: it now properly captures transparency and lets me send it over NDI. The fix is not yet available in the stable builds of Windows 10; no idea when it will arrive, probably with the next "big" update.
