Remote Control of a PC from Android (WebRTC Implementation)

Preface

Screen sharing in C# can be implemented in several ways:

  1. Socket communication - capture the screen, push the frame data over a socket to a remote receiver, and reconstruct the images on the receiving end, giving a real-time screen-sharing effect.

  2. The VNC protocol.

  3. The Remote Desktop Protocol (RDP) - a protocol designed specifically for remote desktops that provides efficient, low-latency desktop sharing.

  4. WebRTC - a web real-time communication technology that enables real-time communication between browsers (and native apps), including screen sharing.

Overall, the best approach depends on the specific scenario and requirements.

Using Socket Communication

https://www.psvmc.cn/article/2023-05-29-ws-csharp-web.html

The picture transmission with this approach is not great, but the idea is simple; a minimal sender sketch follows.
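
The linked article covers the details; conceptually the sender just loops, grabs a JPEG screenshot, and writes it length-prefixed to a TCP stream. A minimal sketch (hypothetical names, reusing the GetScreenshot helper from the utility class later in this post):

using System;
using System.Net.Sockets;
using System.Threading.Tasks;
using z_remote_control.Utils;

public static class SocketScreenSender
{
    // Continuously capture the screen as JPEG and write length-prefixed
    // frames to a TCP stream. The receiver reads a 4-byte little-endian
    // length, then that many bytes, and decodes the JPEG.
    public static async Task StreamScreenAsync(string host, int port)
    {
        using (var client = new TcpClient(host, port))
        using (var stream = client.GetStream())
        {
            while (true)
            {
                byte[] jpeg = ZScreenUtils.GetScreenshot();            // JPEG bytes, see utility class below
                byte[] lenPrefix = BitConverter.GetBytes(jpeg.Length); // 4-byte length header
                await stream.WriteAsync(lenPrefix, 0, lenPrefix.Length);
                await stream.WriteAsync(jpeg, 0, jpeg.Length);
                await Task.Delay(66);                                  // ~15 fps
            }
        }
    }
}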

Using the VNC Protocol

I have not yet found a usable VNC server library for C#.

Using RDP

This approach requires the PC to allow remote connections, which Home editions of Windows do not support, and connecting requires an account and password, so it is not recommended.

Using WebRTC

I found two libraries:

  • SIPSorcery - written entirely in C#; its advantage is ease of use, its drawback is relatively slow encoding/decoding.

  • WebrtcSharp - a C# wrapper around the C++ library; its advantage is efficiency, its drawback is that it is slightly more cumbersome to use.

SIPSorcery

https://github.com/sipsorcery-org/sipsorcery

We use SIPSorcery here.

This library requires the project to target 64-bit (x64).
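
The simplest way is to set the platform target in the project file (or via Project Properties → Build → Platform target):

<PropertyGroup>
  <PlatformTarget>x64</PlatformTarget>
</PropertyGroup>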

Install dependencies

Install-Package SIPSorcery
Install-Package SIPSorceryMedia.Encoders -Pre

Runtime error

System.IO.FileLoadException: "Could not load file or assembly 'Microsoft.Extensions.Logging.Abstractions, Version=6.0.0.2, Culture=neutral, PublicKeyToken=adb9793829ddae60' or one of its dependencies. The located assembly's manifest definition does not match the assembly reference."

Upgrading the package fixes it:

Install-Package Microsoft.Extensions.Logging.Abstractions -Version 7.0.0
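
Alternatively, if upgrading is not an option, an assembly binding redirect in App.config should resolve the same mismatch (the publicKeyToken below comes from the error message):

<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="Microsoft.Extensions.Logging.Abstractions"
                          publicKeyToken="adb9793829ddae60" culture="neutral" />
        <bindingRedirect oldVersion="0.0.0.0-7.0.0.0" newVersion="7.0.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>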

C# Sender

private const int WEBSOCKET_PORT = 8081;

public MainWindow()
{
    InitializeComponent();
    Console.WriteLine(@"Starting web socket server...");

    // The WebSocket server is only used for WebRTC signaling (SDP/ICE exchange).
    var webSocketServer = new WebSocketServer(
        IPAddress.Any,
        WEBSOCKET_PORT
    );
    webSocketServer.AddWebSocketService<WebRTCWebSocketPeer>(
        "/",
        (peer) => peer.CreatePeerConnection = CreatePeerConnection
    );
    webSocketServer.Start();
    Console.WriteLine($@"Waiting for web socket connections on {webSocketServer.Address}:{webSocketServer.Port}...");
}

private static Task<RTCPeerConnection> CreatePeerConnection()
{
    var pc = new RTCPeerConnection(null);

    // ZVideoSource is our custom screen-capture video source (defined below).
    var screenSource = new ZVideoSource(new VpxVideoEncoder());
    screenSource.SetFrameRate(30);
    MediaStreamTrack videoTrack = new MediaStreamTrack(
        screenSource.GetVideoSourceFormats(),
        MediaStreamStatusEnum.SendOnly
    );
    pc.addTrack(videoTrack);

    // Feed every encoded frame straight into the peer connection.
    screenSource.OnVideoSourceEncodedSample += pc.SendVideo;
    pc.OnVideoFormatsNegotiated += (formats) => screenSource.SetVideoSourceFormat(formats.First());
    pc.onconnectionstatechange += async (state) =>
    {
        Console.WriteLine($@"Peer connection state change to {state}.");
        switch (state)
        {
            case RTCPeerConnectionState.connected:
                await screenSource.StartVideo();
                break;
            case RTCPeerConnectionState.failed:
                pc.Close("ice disconnection");
                break;
            case RTCPeerConnectionState.closed:
                await screenSource.CloseVideo();
                screenSource.Dispose();
                break;
        }
    };
    return Task.FromResult(pc);
}

Here ZVideoSource is a custom class that captures the desktop.

ZVideoSource.cs

namespace z_remote_control.Utils
{
    using SIPSorceryMedia.Abstractions;
    using System;
    using System.Collections.Generic;
    using System.Threading;
    using System.Threading.Tasks;

    /// <summary>
    /// A custom IVideoSource that captures the primary screen and raises
    /// encoded frames on a timer.
    /// </summary>
    public class ZVideoSource : IVideoSource, IDisposable
    {
        public static readonly List<VideoFormat> SupportedFormats = new List<VideoFormat>()
        {
            new VideoFormat(
                VideoCodecsEnum.VP8,
                96
            ),
            new VideoFormat(
                VideoCodecsEnum.H264,
                100,
                parameters: "packetization-mode=1"
            )
        };

        private int _frameSpacing;
        private readonly byte[] _myI420Buffer;
        private readonly Timer _sendTestPatternTimer;
        private bool _isStarted;
        private bool _isPaused;
        private bool _isClosed;
        private bool _isMaxFrameRate;
        private int _frameCount;
        private readonly IVideoEncoder _videoEncoder;
        private readonly MediaFormatManager<VideoFormat> _formatManager;

        public event RawVideoSampleDelegate OnVideoSourceRawSample;

        // Required by IVideoSource; not raised by this implementation.
        public event RawVideoSampleFasterDelegate OnVideoSourceRawSampleFaster;

        public event EncodedSampleDelegate OnVideoSourceEncodedSample;

        public event SourceErrorDelegate OnVideoSourceError;

        private const int SCREEN_WIDTH = 1280;
        private const int SCREEN_HEIGHT = 720;

        public ZVideoSource(IVideoEncoder encoder = null)
        {
            if (encoder != null)
            {
                _videoEncoder = encoder;
                _formatManager = new MediaFormatManager<VideoFormat>(SupportedFormats);
            }

            // Oversized scratch buffer; a 1280x720 I420 frame only needs
            // width * height * 3 / 2 = ~1.4 MB of it.
            _myI420Buffer = new byte[10 * 1024 * 1024];
            UpdateBuffer();
            _sendTestPatternTimer = new Timer(
                GenerateTestPattern,
                null,
                -1,
                -1
            );
            _frameSpacing = 33;
        }

        /// <summary>
        /// Captures the screen as BGR and converts it into the I420 buffer.
        /// </summary>
        private void UpdateBuffer()
        {
            var source = ZScreenUtils.CaptureScreen(
                SCREEN_WIDTH,
                SCREEN_HEIGHT
            );
            var i420Byte = PixelConverter.BGRtoI420(
                source,
                SCREEN_WIDTH,
                SCREEN_HEIGHT,
                SCREEN_WIDTH * 3
            );
            Buffer.BlockCopy(
                i420Byte,
                0,
                _myI420Buffer,
                0,
                i420Byte.Length
            );
        }

        public void RestrictFormats(Func<VideoFormat, bool> filter) => _formatManager.RestrictFormats(filter);

        public List<VideoFormat> GetVideoSourceFormats() => _formatManager.GetSourceFormats();

        public void SetVideoSourceFormat(VideoFormat videoFormat) => _formatManager.SetSelectedFormat(videoFormat);

        public List<VideoFormat> GetVideoSinkFormats() => _formatManager.GetSourceFormats();

        public void SetVideoSinkFormat(VideoFormat videoFormat) => _formatManager.SetSelectedFormat(videoFormat);

        public void ForceKeyFrame() => _videoEncoder?.ForceKeyFrame();

        public bool HasEncodedVideoSubscribers() => OnVideoSourceEncodedSample != null;

        // External raw-sample feeds are not used by this source.
        public void ExternalVideoSourceRawSample
        (
            uint durationMilliseconds,
            int width,
            int height,
            byte[] sample,
            VideoPixelFormatsEnum pixelFormat
        )
        {
        }

        public void ExternalVideoSourceRawSampleFaster
        (
            uint durationMilliseconds,
            RawImage rawImage
        )
        {
        }

        public bool IsVideoSourcePaused() => _isPaused;

        public void SetFrameRate(int framesPerSecond)
        {
            if (framesPerSecond < 1 || framesPerSecond > 60)
            {
                Console.WriteLine(@"Frames per second not in the allowed range of 1 to 60, ignoring.");
            }
            else
            {
                _frameSpacing = 1000 / framesPerSecond;
                if (!_isStarted)
                    return;
                _sendTestPatternTimer.Change(
                    0,
                    _frameSpacing
                );
            }
        }

        public void SetMaxFrameRate(bool isMaxFrameRate)
        {
            if (_isMaxFrameRate == isMaxFrameRate)
                return;
            _isMaxFrameRate = isMaxFrameRate;
            if (!_isStarted)
                return;
            if (_isMaxFrameRate)
            {
                _sendTestPatternTimer.Change(
                    -1,
                    -1
                );
                GenerateMaxFrames();
            }
            else
            {
                _sendTestPatternTimer.Change(
                    0,
                    _frameSpacing
                );
            }
        }

        public Task PauseVideo()
        {
            _isPaused = true;
            _sendTestPatternTimer.Change(
                -1,
                -1
            );
            return Task.CompletedTask;
        }

        public Task ResumeVideo()
        {
            _isPaused = false;
            _sendTestPatternTimer.Change(
                0,
                _frameSpacing
            );
            return Task.CompletedTask;
        }

        public Task StartVideo()
        {
            if (!_isStarted)
            {
                _isStarted = true;
                if (_isMaxFrameRate)
                    GenerateMaxFrames();
                else
                    _sendTestPatternTimer.Change(
                        0,
                        _frameSpacing
                    );
            }
            return Task.CompletedTask;
        }

        public Task CloseVideo()
        {
            if (_isClosed)
                return Task.CompletedTask;
            _isClosed = true;
            ManualResetEventSlim mre = new ManualResetEventSlim();
            _sendTestPatternTimer?.Dispose(mre.WaitHandle);
            return Task.Run(() => mre.Wait(1000));
        }

        private void GenerateMaxFrames()
        {
            DateTime now = DateTime.Now;
            while (!_isClosed && _isMaxFrameRate)
            {
                _frameSpacing = Convert.ToInt32(DateTime.Now.Subtract(now).TotalMilliseconds);
                GenerateTestPattern(null);
                now = DateTime.Now;
            }
        }

        /// <summary>
        /// Timer callback: capture, encode and raise one frame.
        /// </summary>
        private void GenerateTestPattern(object state)
        {
            lock (_sendTestPatternTimer)
            {
                if (_isClosed || (OnVideoSourceRawSample == null && OnVideoSourceEncodedSample == null))
                    return;
                ++_frameCount;
                StampI420Buffer(
                    _myI420Buffer,
                    SCREEN_WIDTH,
                    SCREEN_HEIGHT,
                    _frameCount
                );
                if (OnVideoSourceRawSample != null)
                    GenerateRawSample(
                        SCREEN_WIDTH,
                        SCREEN_HEIGHT,
                        _myI420Buffer
                    );
                if (_videoEncoder != null && OnVideoSourceEncodedSample != null)
                {
                    VideoFormat selectedFormat = _formatManager.SelectedFormat;
                    if (!selectedFormat.IsEmpty())
                    {
                        // Grab a fresh screen frame, then encode it with the
                        // negotiated codec.
                        UpdateBuffer();
                        byte[] sample = _videoEncoder.EncodeVideo(
                            SCREEN_WIDTH,
                            SCREEN_HEIGHT,
                            _myI420Buffer,
                            VideoPixelFormatsEnum.I420,
                            selectedFormat.Codec
                        );
                        if (sample != null)
                            OnVideoSourceEncodedSample(
                                90000U / (_frameSpacing > 0 ? 1000U / (uint)_frameSpacing : 30U),
                                sample
                            );
                    }
                }
                if (_frameCount == int.MaxValue)
                    _frameCount = 0;
            }
        }

        private void GenerateRawSample
        (
            int width,
            int height,
            byte[] i420Buffer
        )
        {
            byte[] sample = PixelConverter.I420toBGR(
                i420Buffer,
                width,
                height,
                out int _
            );
            RawVideoSampleDelegate videoSourceRawSample = OnVideoSourceRawSample;
            if (videoSourceRawSample == null)
                return;
            videoSourceRawSample(
                (uint)_frameSpacing,
                width,
                height,
                sample,
                VideoPixelFormatsEnum.Bgr
            );
        }

        /// <summary>
        /// Stamps a small square (brightness = frame number) near the
        /// bottom-right corner of the Y plane, as a visual frame counter.
        /// </summary>
        public static void StampI420Buffer
        (
            byte[] i420Buffer,
            int width,
            int height,
            int frameNumber
        )
        {
            int startX = width - 20 - 10;
            int startY = height - 20 - 10;
            for (int y = startY; y < startY + 20; ++y)
            {
                for (int x = startX; x < startX + 20; ++x)
                    i420Buffer[y * width + x] = (byte)(frameNumber % byte.MaxValue);
            }
        }

        public void Dispose()
        {
            _isClosed = true;
            _sendTestPatternTimer?.Dispose();
            _videoEncoder?.Dispose();
        }
    }
}
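
To summarize the flow: a System.Threading.Timer fires every _frameSpacing milliseconds; each tick grabs a fresh screenshot, converts it from BGR to I420, encodes it with the negotiated codec, and raises OnVideoSourceEncodedSample, which MainWindow wired directly to pc.SendVideo.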

Utility class that captures the screen as BGR bytes:

namespace z_remote_control.Utils
{
    using System.Drawing;
    using System.Drawing.Drawing2D;
    using System.Drawing.Imaging;
    using System.IO;
    using System.Runtime.InteropServices;
    using System.Windows.Forms;

    public class ZScreenUtils
    {
        private const PixelFormat FORMAT = PixelFormat.Format24bppRgb;

        /// <summary>
        /// Capture the screen as a JPEG byte[].
        /// </summary>
        /// <param name="maxHeight">Maximum output height; taller screens are scaled down proportionally.</param>
        /// <returns>JPEG-encoded screenshot.</returns>
        public static byte[] GetScreenshot(int maxHeight = 720)
        {
            var screen = GetScreen();
            Bitmap targetPic;
            int width = screen.Width;
            int height = screen.Height;
            if (screen.Height > maxHeight)
            {
                double rate = 1.0d * screen.Height / maxHeight;
                height = (int)(height / rate);
                width = (int)(width / rate);
                targetPic = ScalePic(
                    screen,
                    width,
                    height
                );
                screen.Dispose();
            }
            else
            {
                targetPic = screen;
            }
            var picByte = GetPicByte(
                targetPic,
                40
            );
            targetPic.Dispose();
            return picByte;
        }

        public static Bitmap GetScreen()
        {
            Bitmap screenshot = new Bitmap(
                Screen.PrimaryScreen.Bounds.Width,
                Screen.PrimaryScreen.Bounds.Height,
                FORMAT
            );
            using (Graphics gfx = Graphics.FromImage(screenshot))
            {
                gfx.CopyFromScreen(
                    Screen.PrimaryScreen.Bounds.X,
                    Screen.PrimaryScreen.Bounds.Y,
                    0,
                    0,
                    Screen.PrimaryScreen.Bounds.Size,
                    CopyPixelOperation.SourceCopy
                );

                // CopyFromScreen does not capture the cursor, so draw a
                // lime-green dot at the cursor position instead.
                using (Brush brush = new SolidBrush(Color.LimeGreen))
                {
                    gfx.FillEllipse(
                        brush,
                        Cursor.Position.X - 10,
                        Cursor.Position.Y - 10,
                        20,
                        20
                    );
                }
                return screenshot;
            }
        }

        public static Bitmap ScalePic
        (
            Bitmap source,
            int width,
            int height
        )
        {
            Bitmap result = new Bitmap(
                width,
                height,
                FORMAT
            );
            using (Graphics g = Graphics.FromImage(result))
            {
                g.InterpolationMode = InterpolationMode.HighQualityBicubic;
                g.DrawImage(
                    source,
                    0,
                    0,
                    width,
                    height
                );
                return result;
            }
        }

        public static byte[] GetPicByte(Bitmap source)
        {
            using (MemoryStream ms = new MemoryStream())
            {
                source.Save(
                    ms,
                    ImageFormat.Jpeg
                );
                return ms.ToArray();
            }
        }

        public static byte[] GetPicByte
        (
            Bitmap source,
            int quality
        )
        {
            using (MemoryStream ms = new MemoryStream())
            {
                ImageCodecInfo jpegCodec = GetEncoderInfo("image/jpeg");
                var encoderParameters = GetEncoderParameters(quality);
                source.Save(
                    ms,
                    jpegCodec,
                    encoderParameters
                );
                return ms.ToArray();
            }
        }

        public static EncoderParameters GetEncoderParameters(int quality = 90)
        {
            Encoder qualityEncoder = Encoder.Quality;
            EncoderParameters encoderParams = new EncoderParameters(1);
            encoderParams.Param[0] = new EncoderParameter(
                qualityEncoder,
                quality
            );
            return encoderParams;
        }

        /// <summary>
        /// Find the image encoder for the given MIME type (e.g. "image/jpeg").
        /// </summary>
        private static ImageCodecInfo GetEncoderInfo(string mimeType)
        {
            var encoders = ImageCodecInfo.GetImageEncoders();
            for (int j = 0; j < encoders.Length; ++j)
            {
                if (encoders[j].MimeType == mimeType)
                {
                    return encoders[j];
                }
            }
            return null;
        }

        /// <summary>
        /// Capture the screen and convert it to raw BGR bytes.
        /// </summary>
        public static byte[] CaptureScreen
        (
            int targetWidth,
            int targetHeight
        )
        {
            var screen = GetScreen();
            var targetPic = ScalePic(
                screen,
                targetWidth,
                targetHeight
            );
            screen.Dispose();
            byte[] bgrByte = Bitmap2Bgr(targetPic);
            targetPic.Dispose();
            return bgrByte;
        }

        public static byte[] Bitmap2Bgr(Bitmap bitmap)
        {
            BitmapData data = bitmap.LockBits(
                new Rectangle(
                    0,
                    0,
                    bitmap.Width,
                    bitmap.Height
                ),
                ImageLockMode.ReadOnly,
                PixelFormat.Format24bppRgb
            );
            int dstBytes = data.Stride * data.Height;
            byte[] dstValues = new byte[dstBytes];
            // Copy the pixel data while the bits are still locked; the
            // original unlocked first, which risks reading via a stale pointer.
            Marshal.Copy(
                data.Scan0,
                dstValues,
                0,
                dstBytes
            );
            bitmap.UnlockBits(data);
            return dstValues;
        }
    }
}
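
For reference, the I420 frame that PixelConverter.BGRtoI420 produces is a full-resolution Y (luma) plane followed by quarter-size U and V (chroma) planes, so a frame needs width * height * 3 / 2 bytes; the 10 MB scratch buffer in ZVideoSource is therefore comfortably oversized for 1280x720:

// I420 plane sizes for a 1280x720 frame.
int width = 1280, height = 720;
int ySize = width * height;             // 921,600 bytes (one byte per pixel)
int uSize = (width / 2) * (height / 2); // 230,400 bytes (2x2 subsampled)
int vSize = uSize;                      // 230,400 bytes
int total = ySize + uSize + vSize;      // 1,382,400 = width * height * 3 / 2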

Web Receiver

<!DOCTYPE html>
<head>
    <script type="text/javascript">
        const WEBSOCKET_URL = "ws://127.0.0.1:8081/";

        let pc, ws;

        async function start() {
            pc = new RTCPeerConnection();
            pc.ontrack = evt => document.querySelector('#videoCtl').srcObject = evt.streams[0];
            pc.onicecandidate = evt => evt.candidate && ws.send(JSON.stringify(evt.candidate));
            ws = new WebSocket(document.querySelector('#websockurl').value, []);
            ws.onmessage = async function (evt) {
                const obj = JSON.parse(evt.data);
                if (obj?.candidate) {
                    await pc.addIceCandidate(obj);
                } else if (obj?.sdp) {
                    await pc.setRemoteDescription(new RTCSessionDescription(obj));
                    pc.createAnswer()
                        .then((answer) => pc.setLocalDescription(answer))
                        .then(() => ws.send(JSON.stringify(pc.localDescription)));
                }
            };
        }

        async function closePeer() {
            await pc?.close();
            await ws?.close();
        }
    </script>
    <title>WebRTC</title>
</head>
<body>

<video controls autoplay="autoplay" id="videoCtl" width="640" height="480"></video>

<div>
    <label for="websockurl">WS:</label><input type="text" id="websockurl" size="40"/>
    <button type="button" class="btn btn-success" onclick="start();">Start</button>
    <button type="button" class="btn btn-success" onclick="closePeer();">Close</button>
</div>

</body>

<script>
    document.querySelector('#websockurl').value = WEBSOCKET_URL;
</script>
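
Open this page in a browser while the C# program is running and click Start: the SIPSorcery side creates the peer connection and sends its SDP offer over the WebSocket, the page answers, and the desktop stream appears in the video element.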

Sending and Receiving

C# Send and Receive

private const int RTC_WS_PORT = 8081;

public MainWindow()
{
    InitializeComponent();
    Console.WriteLine(@"Starting web socket server...");
    var webSocketServer = new WebSocketServer(
        IPAddress.Any,
        RTC_WS_PORT
    );
    webSocketServer.AddWebSocketService<WebRTCWebSocketPeer>(
        "/",
        (peer) =>
        {
            peer.CreatePeerConnection = CreatePeerConnection;
        }
    );
    webSocketServer.Start();
    Console.WriteLine($@"Waiting for web socket connections on {webSocketServer.Address}:{webSocketServer.Port}...");
}

private Task<RTCPeerConnection> CreatePeerConnection()
{
    var pc = new RTCPeerConnection(null);
    var videoSource = new ZVideoSource(new VpxVideoEncoder());

    // VideoEncoderEndPoint decodes incoming frames and raises them as raw samples.
    var videoSink = new VideoEncoderEndPoint();
    videoSink.OnVideoSinkDecodedSample +=
    (
        byte[] bmp,
        uint width,
        uint height,
        int stride,
        VideoPixelFormatsEnum pixelFormat
    ) =>
    {
        // RgbToBitmap converts the raw decoded pixel buffer into a Bitmap
        // (see the sketch after this listing).
        Bitmap bitmap = ZScreenUtils.RgbToBitmap(
            bmp,
            (int)width,
            (int)height
        );
        Console.WriteLine($@"Received frame width:{width} height:{height}");
        this.Dispatcher.Invoke(
            () =>
            {
                BitmapImage bitmapImage = new BitmapImage();
                using (MemoryStream memory = new MemoryStream())
                {
                    bitmap.Save(
                        memory,
                        ImageFormat.Png
                    );
                    memory.Position = 0;
                    bitmapImage.BeginInit();
                    bitmapImage.StreamSource = memory;
                    bitmapImage.CacheOption = BitmapCacheOption.OnLoad;
                    bitmapImage.EndInit();
                }
                this.ShowImg.Source = bitmapImage;
            }
        );
    };
    videoSource.SetFrameRate(30);
    MediaStreamTrack videoTrack = new MediaStreamTrack(
        videoSource.GetVideoSourceFormats(),
        MediaStreamStatusEnum.SendRecv
    );
    pc.addTrack(videoTrack);
    videoSource.OnVideoSourceEncodedSample += pc.SendVideo;
    pc.OnVideoFormatsNegotiated += (formats) =>
    {
        videoSource.SetVideoSourceFormat(formats.Last());
        videoSink.SetVideoSinkFormat(formats.Last());
    };
    pc.OnVideoFrameReceived += videoSink.GotVideoFrame;
    pc.onconnectionstatechange += async (state) =>
    {
        Console.WriteLine($@"Peer connection state change to {state}.");
        switch (state)
        {
            case RTCPeerConnectionState.connected:
                await videoSource.StartVideo();
                break;
            case RTCPeerConnectionState.failed:
                pc.Close("ice disconnection");
                break;
            case RTCPeerConnectionState.closed:
                await videoSource.CloseVideo();
                videoSource.Dispose();
                break;
        }
    };
    return Task.FromResult(pc);
}
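
The receive path calls ZScreenUtils.RgbToBitmap, which is not part of the utility class listed earlier. A minimal sketch, assuming the decoded sample is tightly packed 24-bit pixels (it would live in ZScreenUtils alongside the other helpers):

public static Bitmap RgbToBitmap(byte[] pixels, int width, int height)
{
    var bitmap = new Bitmap(width, height, PixelFormat.Format24bppRgb);
    BitmapData data = bitmap.LockBits(
        new Rectangle(0, 0, width, height),
        ImageLockMode.WriteOnly,
        PixelFormat.Format24bppRgb
    );
    int rowBytes = width * 3;
    for (int y = 0; y < height; y++)
    {
        // Copy row by row because the Bitmap stride may be padded to a
        // 4-byte boundary.
        Marshal.Copy(pixels, y * rowBytes, data.Scan0 + y * data.Stride, rowBytes);
    }
    bitmap.UnlockBits(data);
    return bitmap;
}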

Web Send and Receive

<!DOCTYPE html>
<head>

    <title>WebRTC</title>
</head>
<body>

<video controls autoplay="autoplay" id="videoCtl" width="640" height="480"></video>

<div>
    <label for="websockurl">WS:</label><input type="text" id="websockurl" size="40"/>
    <button type="button" class="btn btn-success" onclick="start();">Start</button>
    <button type="button" class="btn btn-success" onclick="closePeer();">Close</button>
</div>

</body>

<script>
    const WEBSOCKET_URL = "ws://127.0.0.1:8081/";
    document.querySelector('#websockurl').value = WEBSOCKET_URL;
</script>

<script type="text/javascript">
    let pc, ws;

    async function start() {
        const ws_url = document.querySelector('#websockurl').value;
        // Capture this browser's screen and send it to the C# peer.
        const stream = await navigator.mediaDevices.getDisplayMedia({video: true});
        pc = new RTCPeerConnection();
        stream.getTracks().forEach(track => pc.addTrack(track, stream));
        pc.ontrack = evt => document.querySelector('#videoCtl').srcObject = evt.streams[0];
        pc.onicecandidate = evt => evt.candidate && ws.send(JSON.stringify(evt.candidate));
        ws = new WebSocket(ws_url, []);
        ws.onmessage = async function (evt) {
            const obj = JSON.parse(evt.data);
            if (obj?.sdp) {
                await pc.setRemoteDescription(new RTCSessionDescription(obj));
                let answer = await pc.createAnswer();
                await pc.setLocalDescription(answer);
                ws.send(JSON.stringify(pc.localDescription));
            } else if (obj?.candidate) {
                await pc.addIceCandidate(obj);
            }
        };
    }

    async function closePeer() {
        await pc?.close();
        await ws?.close();
    }
</script>

WebrtcSharp

https://github.com/jatecl/WebrtcSharp

https://gitee.com/psvmc/webrtc-sharp

This library can only push video; it cannot receive it.

Loading I420 Data

Install dependencies

Install-Package OpenCvSharp4 -Version 4.1.1.20191110
Install-Package OpenCvSharp4.runtime.win -Version 4.1.1.20191110

Example

private void Source_Frame(VideoFrame obj)
{
    // Byte counts of each I420(A) plane. GetMemeryCount is the author's
    // helper (not shown here; see the sketch after this listing).
    var lenY = GetMemeryCount(obj.StrideY, obj.Width, obj.Height);
    var lenU = GetMemeryCount(obj.StrideU, obj.Width, obj.Height);
    var lenV = GetMemeryCount(obj.StrideV, obj.Width, obj.Height);
    var lenA = 0;
    if (obj.StrideA > 0)
        lenA = GetMemeryCount(obj.StrideA, obj.Width, obj.Height);

    // One contiguous single-channel Mat holding the Y, U and V planes
    // stacked vertically, which is the layout OpenCV expects for
    // YUV2BGRA_I420.
    OpenCvSharp.Mat yuvImg = new OpenCvSharp.Mat();
    yuvImg.Create(
        obj.Height * 3 / 2,
        obj.Width,
        OpenCvSharp.MatType.CV_8UC1
    );

    unsafe
    {
        Buffer.MemoryCopy(
            obj.DataY.ToPointer(),
            yuvImg.Data.ToPointer(),
            lenY,
            lenY
        );
        Buffer.MemoryCopy(
            obj.DataU.ToPointer(),
            (byte*)yuvImg.Data.ToPointer() + lenY,
            lenU,
            lenU
        );
        Buffer.MemoryCopy(
            obj.DataV.ToPointer(),
            (byte*)yuvImg.Data.ToPointer() + lenY + lenU,
            lenV,
            lenV
        );
        if (lenA > 0)
            Buffer.MemoryCopy(
                obj.DataA.ToPointer(),
                (byte*)yuvImg.Data.ToPointer() + lenY + lenU + lenV,
                lenA,
                lenA
            );
    }

    // Convert the packed I420 frame to BGRA and hand it to the UI thread.
    var target = new OpenCvSharp.Mat();
    OpenCvSharp.Cv2.CvtColor(
        yuvImg,
        target,
        OpenCvSharp.ColorConversionCodes.YUV2BGRA_I420
    );
    Dispatcher.BeginInvoke(
        (Action)(() =>
        {
            MyImg.Source = OpenCvSharp.Extensions.BitmapSourceConverter.ToBitmapSource(target);
            target.Dispose();
        })
    );
}
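
GetMemeryCount is not shown in the original post; one plausible reconstruction, assuming it derives a plane's byte count from its stride (in I420 the U and V planes have half as many rows as Y):

// Hypothetical reconstruction of the helper used above. The Y (and A)
// planes have `height` rows while U and V have (height + 1) / 2 rows;
// each row occupies `stride` bytes. A chroma plane is recognizable by
// its stride being smaller than the frame width.
private static int GetMemeryCount(int stride, int width, int height)
{
    int rows = stride < width ? (height + 1) / 2 : height;
    return stride * rows;
}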