Setup
Hey,
I’m trying to capture my screen and send the stream via MR-WebRTC. Communication between two PCs, or between a PC and a HoloLens, already worked for me with webcams, so I thought the next step could be streaming my screen. So I took the UWP application I already had, which worked with my webcam, and tried to make things work:
- The UWP app is based on the example UWP app from MR-WebRTC.
- For capturing I’m using the instructions from Microsoft about screen capturing via GraphicsCapturePicker (see the picker sketch right after this list).
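For reference, obtaining the capture item with GraphicsCapturePicker looks roughly like this. This is a minimal sketch following the Windows.Graphics.Capture documentation; the _item field matches the snippet further below, the rest is my assumption:

// Let the user pick a window or monitor to capture.
// GraphicsCapturePicker has to be used from the UI thread in a UWP app.
var picker = new GraphicsCapturePicker();
GraphicsCaptureItem item = await picker.PickSingleItemAsync();
if (item != null)
{
    _item = item; // later passed to Direct3D11CaptureFramePool.Create and CreateCaptureSession
}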
So now I’m stuck in the following situation:
- I get a frame from the screen capture, but its type is Direct3D11CaptureFrame. You can see it below in the code snippet.
- MR-WebRTC takes a frame of type I420AVideoFrame (also in a code snippet below).
How can I “connect” them?
- I420AVideoFrame wants a frame in the I420A format (YUV 4:2:0); see the layout sketch right after this list.
- Configuring the framePool I can set the DirectXPixelFormat, but it has no YUV 4:2:0 option.
- I found this post on SO saying that it is possible.
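For context, my own summary of why the formats do not line up: an I420A frame is planar with up to four separate planes, while the capture surface is a single interleaved BGRA buffer, roughly like this:

// Standard I420A (YUV 4:2:0 plus alpha) plane sizes for a width x height frame:
uint ySize = width * height;              // full-resolution luma plane
uint uSize = (width / 2) * (height / 2);  // quarter-resolution chroma U plane
uint vSize = (width / 2) * (height / 2);  // quarter-resolution chroma V plane
uint aSize = width * height;              // optional full-resolution alpha plane
// A B8G8R8A8UIntNormalized surface is instead one interleaved plane of 4 * width * height bytes.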
Code snippet, frame from Direct3D:
_framePool = Direct3D11CaptureFramePool.Create(
    _canvasDevice,                              // D3D device
    DirectXPixelFormat.B8G8R8A8UIntNormalized,  // Pixel format
    3,                                          // Number of frames
    _item.Size);                                // Size of the buffers

_session = _framePool.CreateCaptureSession(_item);
_session.StartCapture();

_framePool.FrameArrived += (s, a) =>
{
    using (var frame = _framePool.TryGetNextFrame())
    {
        // Here I would take the frame and call the MR-WebRTC method LocalI420AFrameReady
    }
};
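The snippet above assumes a _canvasDevice that implements IDirect3DDevice; the name suggests a Win2D CanvasDevice. A minimal way to get one (my assumption, not part of the original snippet) would be:

// Win2D: Microsoft.Graphics.Canvas.CanvasDevice implements IDirect3DDevice,
// so the shared device can be handed to Direct3D11CaptureFramePool.Create.
_canvasDevice = CanvasDevice.GetSharedDevice();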
Code snippet, frame from WebRTC:
// This is the way it works with the webcam: LocalI420AFrameReady is subscribed to
// the event I420AVideoFrameReady and gets the frames from there.
_webcamSource = await DeviceVideoTrackSource.CreateAsync();
_webcamSource.I420AVideoFrameReady += LocalI420AFrameReady;

// Enqueues the newly captured video frames into the bridge,
// which will later deliver them when the Media Foundation
// playback pipeline requests them.
private void LocalI420AFrameReady(I420AVideoFrame frame)
{
    lock (_localVideoLock)
    {
        if (!_localVideoPlaying)
        {
            _localVideoPlaying = true;

            // Capture the resolution into local variables usable from the lambda below
            uint width = frame.width;
            uint height = frame.height;

            // Defer UI-related work to the main UI thread
            RunOnMainThread(() =>
            {
                // Bridge the local video track with the local media player UI
                int framerate = 30; // assumed, for lack of an actual value
                _localVideoSource = CreateI420VideoStreamSource(width, height, framerate);
                var localVideoPlayer = new MediaPlayer();
                localVideoPlayer.Source = MediaSource.CreateFromMediaStreamSource(_localVideoSource);
                localVideoPlayerElement.SetMediaPlayer(localVideoPlayer);
                localVideoPlayer.Play();
            });
        }
    }

    // Enqueue the incoming frame into the video bridge; the media player will
    // later dequeue it as soon as it's ready.
    _localVideoBridge.HandleIncomingVideoFrame(frame);
}
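For completeness, this is roughly how such a track source ends up on the peer connection. It is a sketch assuming the MixedReality-WebRTC 2.0 transceiver API; _peerConnection and the track name are placeholders, not part of the snippet above:

// Wrap the track source in a local video track and attach it to a video transceiver.
var localVideoTrack = LocalVideoTrack.CreateFromSource(
    _webcamSource,
    new LocalVideoTrackInitConfig { trackName = "webcam_track" });
var transceiver = _peerConnection.AddTransceiver(MediaKind.Video);
transceiver.LocalVideoTrack = localVideoTrack;
transceiver.DesiredDirection = Transceiver.Direction.SendReceive;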
Answer
I found a solution for my problem by creating an issue on the GitHub repo. The answer was provided by KarthikRichie:
- You have to use the ExternalVideoTrackSource.
- You can convert from the Direct3D11CaptureFrame to an Argb32VideoFrame.
// Setting up the external video track source
_screenshareSource = ExternalVideoTrackSource.CreateFromArgb32Callback(FrameCallback);

struct WebRTCFrameData
{
    public IntPtr Data;
    public uint Height;
    public uint Width;
    public int Stride;
}

public void FrameCallback(in FrameRequest frameRequest)
{
    try
    {
        if (FramePool != null)
        {
            using (Direct3D11CaptureFrame currentFrame = FramePool.TryGetNextFrame())
            {
                if (currentFrame != null)
                {
                    // Convert the captured Direct3D surface into an ARGB32 buffer
                    WebRTCFrameData webRTCFrameData = ProcessBitmap(currentFrame.Surface).Result;
                    frameRequest.CompleteRequest(new Argb32VideoFrame()
                    {
                        data = webRTCFrameData.Data,
                        height = webRTCFrameData.Height,
                        width = webRTCFrameData.Width,
                        stride = webRTCFrameData.Stride
                    });
                }
            }
        }
    }
    catch (Exception ex)
    {
        // Swallow capture errors so a single bad frame does not kill the callback
    }
}

private async Task<WebRTCFrameData> ProcessBitmap(IDirect3DSurface surface)
{
    // Copy the GPU surface into a CPU-accessible SoftwareBitmap
    SoftwareBitmap softwareBitmap = await SoftwareBitmap.CreateCopyFromSurfaceAsync(
        surface, Windows.Graphics.Imaging.BitmapAlphaMode.Straight);

    // Copy the pixel data into a managed byte array (4 bytes per pixel)
    byte[] imageBytes = new byte[4 * softwareBitmap.PixelWidth * softwareBitmap.PixelHeight];
    softwareBitmap.CopyToBuffer(imageBytes.AsBuffer());

    WebRTCFrameData argb32VideoFrame = new WebRTCFrameData();
    argb32VideoFrame.Data = GetByteIntPtr(imageBytes);
    argb32VideoFrame.Height = (uint)softwareBitmap.PixelHeight;
    argb32VideoFrame.Width = (uint)softwareBitmap.PixelWidth;

    // Read the stride from the bitmap's plane description
    var bitmapBuffer = softwareBitmap.LockBuffer(BitmapBufferAccessMode.Read);
    int planeCount = bitmapBuffer.GetPlaneCount();
    var planeDescription = bitmapBuffer.GetPlaneDescription(planeCount - 1);
    argb32VideoFrame.Stride = planeDescription.Stride;

    return argb32VideoFrame;
}

private IntPtr GetByteIntPtr(byte[] byteArr)
{
    IntPtr intPtr = System.Runtime.InteropServices.Marshal.UnsafeAddrOfPinnedArrayElement(byteArr, 0);
    return intPtr;
}
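One caveat I would add (my own note, not part of KarthikRichie's answer): GetByteIntPtr calls Marshal.UnsafeAddrOfPinnedArrayElement on an array that is never actually pinned, so in principle the GC could move the buffer while WebRTC still reads from the returned pointer. A pinned variant could look like the sketch below; the helper name and the GCHandle approach are standard .NET interop I'm suggesting, not part of the MR-WebRTC API:

// Requires System.Runtime.InteropServices.
// Sketch: pin the managed buffer explicitly so its address stays valid until
// frameRequest.CompleteRequest(...) has returned, then release the handle.
private IntPtr GetPinnedByteIntPtr(byte[] byteArr, out GCHandle handle)
{
    handle = GCHandle.Alloc(byteArr, GCHandleType.Pinned);
    return handle.AddrOfPinnedObject();
}
// Caller side: handle.Free() after the frame request has been completed.

Apart from that, the _screenshareSource can be attached to the peer connection the same way as the webcam source above, e.g. via LocalVideoTrack.CreateFromSource and a video transceiver.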