WPF Desktop Development: Audio/Video Recording and Thumbnail Extraction (with AForge)

Preface

Not recommended for new projects: these packages are quite old and are incompatible with .NET Framework 4.5.2.

AForge consists of the following packages:

  • AForge
  • AForge.Video
  • AForge.Video.FFMPEG
  • AForge.Video.DirectShow

Official documentation

NAudio

https://github.com/naudio/NAudio

Installation

Install-Package AForge.Video.DirectShow -Version 2.2.5
Install-Package AForge.Video.FFMPEG -Version 2.2.5.1-rc

Video recording

AForge.Video.FFMPEG can record both audio and video; the latter two packages can only record video.

Installation

Install-Package AForge.Video.DirectShow -Version 2.2.5
Install-Package AForge.Video.FFMPEG -Version 2.2.5.1-rc

Of these:

  • AForge.Video.FFMPEG handles writing the video file.

  • AForge.Video captures desktop frames; it is a dependency of AForge.Video.FFMPEG and is installed automatically.

  • AForge.Video.DirectShow captures camera frames; if you do not record a camera, you can skip it.

Audio recording

Audio recording uses the NAudio library.

Installation

Install-Package NAudio -Version 1.9.0

Merging audio and video

Merging uses the NReco.VideoConverter library, which also relies on FFMpeg under the hood.

Install-Package NReco -Version 2.0.3.0
Install-Package NReco.VideoConverter -Version 1.1.4

Audio processing

Uses NAudio.

Installation

Install-Package NAudio -Version 1.9.0

Microphone list

using NAudio.Wave;

public static void GetAudioMicrophone1()
{
    for (int n = -1; n < WaveIn.DeviceCount; n++)
    {
        var caps = WaveIn.GetCapabilities(n);
        Console.WriteLine($@"{n}: {caps.ProductName}");
    }
}

Output:

-1: Microsoft Sound Mapper
0: 麦克风 (Realtek(R) Audio)

Note that the loop above starts at -1 (the sound mapper); when enumerating real microphone devices you can start from 0.

Method 2

using NAudio.CoreAudioApi;

public static void GetAudioMicrophone2()
{
    MMDeviceEnumerator enumerator = new MMDeviceEnumerator();

    // Enumerate active audio capture (input) devices
    IEnumerable<MMDevice> captureDevices =
        enumerator.EnumerateAudioEndPoints(DataFlow.Capture, DeviceState.Active).ToArray();
    foreach (MMDevice device in captureDevices)
    {
        int volume = Convert.ToInt16(device.AudioEndpointVolume.MasterVolumeLevelScalar * 100);
        Console.WriteLine($@"{device.FriendlyName} volume: {volume}");
    }
}

Speaker list

Get the active speakers and their volume:

using NAudio.CoreAudioApi;

public static void GetAudioLoudspeaker2()
{
    MMDeviceEnumerator enumerator = new MMDeviceEnumerator();

    // Enumerate active audio output (render) devices
    IEnumerable<MMDevice> speakDevices =
        enumerator.EnumerateAudioEndPoints(DataFlow.Render, DeviceState.Active).ToArray();
    foreach (MMDevice device in speakDevices)
    {
        int volume = Convert.ToInt16(device.AudioEndpointVolume.MasterVolumeLevelScalar * 100);
        Console.WriteLine($@"{device.FriendlyName} volume: {volume}");
    }
}

Result

PHL 271V8 (NVIDIA High Definition Audio) volume: 100
扬声器/听筒 (Realtek(R) Audio) volume: 29

Default microphone and speaker

var defaultCaptureDevice = WasapiCapture.GetDefaultCaptureDevice();
Console.WriteLine($@"Default microphone: {defaultCaptureDevice.FriendlyName}");
var defaultLoopbackCaptureDevice = WasapiLoopbackCapture.GetDefaultLoopbackCaptureDevice();
Console.WriteLine($@"Default speaker: {defaultLoopbackCaptureDevice.FriendlyName}");

Getting the real-time microphone level

XAML

<ProgressBar
BorderThickness="0"
Maximum="100"
Name="VolumeProgressBar" />

Code

// Field declaration
private WaveInEvent waveIn = null;

private void AudioMonitor()
{
if (WaveIn.DeviceCount == 0)
{
return;
}
// Start capture
waveIn = new WaveInEvent();
// Data callback: fires as captured buffers arrive
waveIn.DataAvailable += (o, e1) =>
{
byte[] buf = e1.Buffer;
float maxNumber = 0;
for (int index = 0; index < buf.Length; index += 2)
{
short sample = (short)((buf[index + 1] << 8) | buf[index + 0]);
float sample32 = sample / 32768f;
sample32 = Math.Abs(sample32);

if (sample32 > maxNumber)
{
maxNumber = sample32;
}
}

Dispatcher.Invoke(
() =>
{
VolumeProgressBar.Value = maxNumber * 100;
});
};

// Recording stopped: release the device
waveIn.RecordingStopped += (s, a) =>
{
waveIn.Dispose();
};

waveIn.StartRecording();
}

Stop

// Call this when stopping
if (waveIn != null)
{
    waveIn.StopRecording();
}

Getting the microphone list

for (int n = -1; n < WaveIn.DeviceCount; n++)
{
    var caps = WaveIn.GetCapabilities(n);
    Console.WriteLine($"{n}: {caps.ProductName}");
}

Output:

-1: Microsoft 声音映射器
0: 麦克风 (Realtek(R) Audio)

Note that the loop above starts at -1 (the sound mapper); when enumerating real microphone devices you can start from 0.

Setting the microphone

Set the corresponding device index:

waveIn.DeviceNumber = 0;

Official documentation

https://github.com/naudio/NAudio/blob/master/Docs/RecordingLevelMeter.md

Getting the real-time speaker level

// Field declaration
private WasapiLoopbackCapture capture = null;

// Start capture
capture = new WasapiLoopbackCapture();

capture.DataAvailable += (s, e1) =>
{
byte[] buf = e1.Buffer;
float maxNumber = 0;
for (int index = 0; index < buf.Length; index += 2)
{
short sample = (short)((buf[index + 1] << 8) | buf[index + 0]);
float sample32 = sample / 32768f;
sample32 = Math.Abs(sample32);

if (sample32 > maxNumber)
{
maxNumber = sample32;
}
}

Console.WriteLine("maxNumber" + maxNumber);
};
// Recording stopped
capture.RecordingStopped += (s, a) =>
{
capture.Dispose();
};
capture.StartRecording();

// Call this when stopping
if (capture != null)
{
    capture.StopRecording();
}

Note

The captured loopback level is not affected by the system volume setting, so to estimate the level the user actually hears, multiply the captured level by the speaker's configured volume.
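As a minimal sketch of that multiplication (the helper name and the 0..1 normalization are illustrative assumptions, not part of NAudio):

```csharp
using System;

// Hypothetical helper: estimates the level the user actually hears by
// combining the captured loopback peak (0..1, as computed in the
// DataAvailable handler above) with the speaker's
// MasterVolumeLevelScalar (0..1).
public static class VolumeMath
{
    public static int PerceivedVolumePercent(float capturedPeak, float masterVolumeScalar)
    {
        // Both factors are in 0..1; scale the product to a percentage.
        return (int)Math.Round(capturedPeak * masterVolumeScalar * 100f);
    }
}
```

For example, a captured peak of 0.8 on a speaker set to 50% yields a perceived level of about 40%.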

Setting the speaker volume

private void SetCurrentSpeakerVolume(int volume)
{
var enumerator = new MMDeviceEnumerator();
IEnumerable<MMDevice> speakDevices = enumerator.EnumerateAudioEndPoints(DataFlow.Render, DeviceState.Active).ToArray();
if (speakDevices.Count() > 0)
{
MMDevice mMDevice = speakDevices.ToList()[0];
mMDevice.AudioEndpointVolume.MasterVolumeLevelScalar = volume / 100.0f;
Console.WriteLine("Speaker being set: " + mMDevice.FriendlyName);
Console.WriteLine("Speaker volume set to: " + volume);
}
}

Recording the microphone and speaker

Recording the microphone

namespace ZUtils
{
using System;
using System.IO;
using System.Threading;

using NAudio.Wave;
public class ZRecordMicrophoneHelper
{
public enum RecordState
{
Stop = 0,
Start = 1,
Pause = 2
}


private RecordState _state;

// Records the microphone audio
private readonly WaveInEvent _waveIn;

private Action<string> _stopAction;


// Writer that produces the audio file

public ZRecordMicrophoneHelper(string filePath)
{
WaveFileWriter writer;

var audioFile = filePath;
_state = RecordState.Pause;
try
{
_waveIn = new WaveInEvent();
writer = new WaveFileWriter(audioFile, _waveIn.WaveFormat);
// Write incoming data while recording
_waveIn.DataAvailable += (s, a) =>
{
if (_state == RecordState.Start)
{
writer.Write(a.Buffer, 0, a.BytesRecorded);
}
};

// Recording stopped
_waveIn.RecordingStopped += (s, a) =>
{
writer.Dispose();
writer = null;
_waveIn.Dispose();
_stopAction?.Invoke(audioFile);
};

_waveIn.StartRecording();
}
catch (Exception)
{
// ignored
}
}

/// <summary>
/// Start recording
/// </summary>
public void StartRecordAudio()
{
_state = RecordState.Start;
}

/// <summary>
/// Stop recording
/// </summary>
public void StopRecordAudio(Action<string> stopAction)
{
_stopAction = stopAction;
_state = RecordState.Stop;
_waveIn.StopRecording();
}

/// <summary>
/// Pause recording
/// </summary>
public void PauseRecordAudio()
{
_state = RecordState.Pause;
}

/// <summary>
/// Resume recording
/// </summary>
public void ResumeRecordAudio()
{
_state = RecordState.Start;
}

/// <summary>
/// Whether a capture device is available
/// </summary>
/// <returns></returns>
public static bool IsDeviceGood()
{
string tempPath = Path.GetTempPath();

WaveInEvent mWaveIn;
WaveFileWriter mWriter = null;
try
{
string mAudioFile = Path.Combine(tempPath, "_microphone.mp3");

mWaveIn = new WaveInEvent();
mWriter = new WaveFileWriter(mAudioFile, mWaveIn.WaveFormat);
// Write incoming data
var writer = mWriter;
mWaveIn.DataAvailable += (s, a) =>
{
writer.Write(a.Buffer, 0, a.BytesRecorded);
};

// Recording stopped
mWaveIn.RecordingStopped += (s, a) =>
{
writer.Dispose();

mWriter = null;
mWaveIn.Dispose();

if (File.Exists(mAudioFile))
{
File.Delete(mAudioFile);
}
};

mWaveIn.StartRecording();

ThreadPool.QueueUserWorkItem(o =>
{
Thread.Sleep(200);
mWaveIn.StopRecording();
});
}
catch (Exception)
{
if (mWriter != null)
{
mWriter.Dispose();
mWriter = null;
}

return false;
}

return true;
}
}
}

Note

Recording is started directly in the constructor. The reason: when audio and video are recorded at the same time, hardware differences mean each capture starts at a slightly different moment, which desynchronizes sound and picture. Capture therefore starts immediately, and the state flag decides whether data is actually written.

The video recording later in this article starts in the constructor for the same reason.

Recording the speaker

using System;
using System.IO;
using System.Threading;

using NAudio.Wave;

namespace ZUtils
{
public class ZRecordLoudspeakerHelper
{
public enum RecordState
{
Stop = 0,
Start = 1,
Pause = 2
}

private RecordState _state;

// Records the speaker (loopback) audio
private readonly WasapiLoopbackCapture _capture;

private Action<string> _stopAction;

public ZRecordLoudspeakerHelper(string filePath)
{
WaveFileWriter writer;

var audioFile = filePath;
_state = RecordState.Pause;
try
{
_capture = new WasapiLoopbackCapture();
writer = new WaveFileWriter(audioFile, _capture.WaveFormat);

_capture.DataAvailable += (s, a) =>
{
if (_state == RecordState.Start)
{
writer.Write(a.Buffer, 0, a.BytesRecorded);
}
};
// Recording stopped
_capture.RecordingStopped += (s, a) =>
{
writer.Dispose();
writer = null;
_capture.Dispose();
_stopAction?.Invoke(audioFile);
};
_capture.StartRecording();
}
catch (Exception)
{
// ignored
}
}

/// <summary>
/// Start recording
/// </summary>
public void StartRecordAudio()
{
_state = RecordState.Start;
}

/// <summary>
/// Stop recording
/// </summary>
public void StopRecordAudio(Action<string> stopAction)
{
_stopAction = stopAction;
_state = RecordState.Stop;
_capture.StopRecording();
}

/// <summary>
/// Pause recording
/// </summary>
public void PauseRecordAudio()
{
_state = RecordState.Pause;
}

/// <summary>
/// Resume recording
/// </summary>
public void ResumeRecordAudio()
{
_state = RecordState.Start;
}

/// <summary>
/// Whether the device is available
/// </summary>
/// <returns></returns>
public static bool IsDeviceGood()
{
string tempPath = Path.GetTempPath();

WaveFileWriter mWriter = null;
WasapiLoopbackCapture mCapture;
try
{
string mAudioFile = Path.Combine(tempPath, "_loudspeaker.mp3");
mCapture = new WasapiLoopbackCapture();
mWriter = new WaveFileWriter(mAudioFile, mCapture.WaveFormat);

var writer = mWriter;
mCapture.DataAvailable += (s, a) =>
{
writer.Write(a.Buffer, 0, a.BytesRecorded);
};
// Recording stopped
mCapture.RecordingStopped += (s, a) =>
{
writer.Dispose();

mWriter = null;
mCapture.Dispose();

if (File.Exists(mAudioFile))
{
File.Delete(mAudioFile);
}
};
mCapture.StartRecording();
ThreadPool.QueueUserWorkItem(o =>
{
Thread.Sleep(200);
mCapture.StopRecording();
});
}
catch (Exception)
{
if (mWriter != null)
{
mWriter.Dispose();
mWriter = null;
}
return false;
}

return true;
}
}
}

Audio state and control

Changing the system volume

[DllImport("user32.dll")]
static extern void keybd_event(byte bVk, byte bScan, UInt32 dwFlags, UInt32 dwExtraInfo);

[DllImport("user32.dll")]
static extern Byte MapVirtualKey(UInt32 uCode, UInt32 uMapType);

private const byte VK_VOLUME_MUTE = 0xAD;
private const byte VK_VOLUME_DOWN = 0xAE;
private const byte VK_VOLUME_UP = 0xAF;
private const UInt32 KEYEVENTF_EXTENDEDKEY = 0x0001;
private const UInt32 KEYEVENTF_KEYUP = 0x0002;

/// <summary>
/// Increase the system volume
/// </summary>
public void VolumeUp()
{
keybd_event(VK_VOLUME_UP, MapVirtualKey(VK_VOLUME_UP, 0), KEYEVENTF_EXTENDEDKEY, 0);
keybd_event(VK_VOLUME_UP, MapVirtualKey(VK_VOLUME_UP, 0), KEYEVENTF_EXTENDEDKEY | KEYEVENTF_KEYUP, 0);
}

/// <summary>
/// Decrease the system volume
/// </summary>
public void VolumeDown()
{
keybd_event(VK_VOLUME_DOWN, MapVirtualKey(VK_VOLUME_DOWN, 0), KEYEVENTF_EXTENDEDKEY, 0);
keybd_event(VK_VOLUME_DOWN, MapVirtualKey(VK_VOLUME_DOWN, 0), KEYEVENTF_EXTENDEDKEY | KEYEVENTF_KEYUP, 0);
}

/// <summary>
/// Mute the system volume
/// </summary>
public void Mute()
{
keybd_event(VK_VOLUME_MUTE, MapVirtualKey(VK_VOLUME_MUTE, 0), KEYEVENTF_EXTENDEDKEY, 0);
keybd_event(VK_VOLUME_MUTE, MapVirtualKey(VK_VOLUME_MUTE, 0), KEYEVENTF_EXTENDEDKEY | KEYEVENTF_KEYUP, 0);
}

Changing the application volume

Changes the application's own output volume without changing the system volume.

[DllImport("Winmm.dll")]
private static extern int waveOutSetVolume(int hwo, System.UInt32 pdwVolume);

[DllImport("Winmm.dll")]
private static extern uint waveOutGetVolume(int hwo, out System.UInt32 pdwVolume);

private int volumeMinScope = 0;
private int volumeMaxScope = 100;
private int volumeSize = 100;

/// <summary>
/// Volume control that does not change the system volume setting
/// </summary>
public int VolumeSize
{
get { return volumeSize; }
set { volumeSize = value; }
}

public void SetCurrentVolume()
{
if (volumeSize < 0)
{
volumeSize = 0;
}

if (volumeSize > 100)
{
volumeSize = 100;
}

// Map the 0-100 value into the 0x0000-0xFFFF range
System.UInt32 Value = (System.UInt32)((double)0xffff * (double)volumeSize / (double)(volumeMaxScope - volumeMinScope));


// Clamp the value
if (Value < 0)
{
Value = 0;
}

if (Value > 0xffff)
{
Value = 0xffff;
}

System.UInt32 left = Value;  // left channel volume
System.UInt32 right = Value; // right channel volume
waveOutSetVolume(0, left << 16 | right); // pack both channels into one DWORD
}

Setting the default audio device

There is currently no supported way to set the default audio device from code.

Open the system Sound settings and let the user choose:

Process.Start("mmsys.cpl");

Camera

Camera list

Get the list of cameras:

/// <summary>
/// Get all cameras
/// </summary>
/// <returns></returns>
/// <exception cref="ArgumentNullException"></exception>
public static List<FilterInfo> GetAllVideoDevice()
{
var devs = new FilterInfoCollection(FilterCategory.VideoInputDevice); // enumerate video input devices
if (devs == null) throw new ArgumentNullException(nameof(devs));
List<FilterInfo> devices = new List<FilterInfo>();
foreach (FilterInfo dev in devs)
{
devices.Add(dev);
}
return devices;
}

/// <summary>
/// Get all cameras, excluding virtual devices
/// </summary>
/// <returns></returns>
/// <exception cref="ArgumentNullException"></exception>
public static List<FilterInfo> GetAllVideoDeviceSys()
{
var devs = new FilterInfoCollection(FilterCategory.VideoInputDevice); // enumerate video input devices
if (devs == null) throw new ArgumentNullException(nameof(devs));
List<FilterInfo> devices = new List<FilterInfo>();
foreach (FilterInfo dev in devs)
{
if (dev.Name != "screen-capture-recorder" && dev.Name != "OBS Virtual Camera")
{
devices.Add(dev);
}
}
return devices;
}

Note

new VideoCaptureDevice(CameraName) expects the device's MonikerString as CameraName.

Use Name when displaying the device to the user.

Camera preview

if (CmbCameraList.SelectedIndex >= 0)
{
var info = CmbCameraList.SelectedItem as FilterInfo;
_camera = new VideoCaptureDevice(info.MonikerString);
//Configure capture parameters (width, height, frame rate, bit rate, etc.); VideoCapabilities lists the configurations the camera supports
_camera.VideoResolution = _camera.VideoCapabilities[0];
_camera.NewFrame +=
Camera_NewFrame; // Set the callback; AForge continuously pushes frames through it (SnapshotFrame is an alternative worth comparing)
_camera.Start(); // start the camera
}

private void Camera_NewFrame(object sender, NewFrameEventArgs eventArgs)
{
Dispatcher.Invoke
(
() =>
{
MemoryStream ms = new MemoryStream();
eventArgs.Frame.Save(ms, ImageFormat.Bmp);
BitmapImage image = new BitmapImage();
image.BeginInit();
image.StreamSource = new MemoryStream(ms.GetBuffer());
ms.Close();
image.EndInit();
imgPlayer.Source = image;
}
); // display synchronously
}

My approach

Install the dependencies:

Install-Package AForge.Video.DirectShow -Version 2.2.5
Install-Package AForge.Video.FFMPEG -Version 2.2.5.1-rc

Install-Package NAudio -Version 1.9.0

Install-Package NReco -Version 2.0.3.0
Install-Package NReco.VideoConverter -Version 1.1.4

Reinstall all packages:

Update-Package -reinstall

Add a framework reference:

<Reference Include="System.Drawing" />

Where:

  • AForge: video processing
  • NAudio: audio processing
  • NReco: audio/video merging

Record audio and video separately; if both the microphone and the speaker are recorded, record them separately as well, and merge all the streams at the end.

Video recording

Desktop recording

Helper class

Full listing

using System;
using System.Runtime.InteropServices;
using AForge.Video;
using AForge.Video.FFMPEG;


namespace z_recorder_aforge.Uitls
{
public class ZRecordVideoHelper
{
public enum RecordState
{
Stop = 0,
Start = 1,
Pause = 2
}

private RecordState _state;

private readonly bool _captureMouse;

private readonly VideoFileWriter _videoWriter = new VideoFileWriter(); // video writer
private readonly ScreenCaptureStream _videoStreamer; // video capture

public ZRecordVideoHelper
(
string filePath,
int width,
int height,
int rate = 5,
int quality = 8,
bool captureMouse = false
)
{
var width1 = width / 2 * 2;
var height1 = height / 2 * 2;
var rate1 = rate;
_captureMouse = captureMouse;

_state = RecordState.Pause;
try
{
// Open the writer
lock (this)
{
_videoWriter.Open
(
filePath,
width1,
height1,
rate1,
VideoCodec.MPEG4,
width1 * height1 * quality
);
}

System.Drawing.Rectangle rec = new System.Drawing.Rectangle
(
0,
0,
width1,
height1
);
_videoStreamer = new ScreenCaptureStream(rec, 1000 / rate1); // The frame interval must match the frame rate, otherwise a 10-second recording does not play back as 10 seconds
_videoStreamer.NewFrame += VideoNewFrame;
_videoStreamer.Start();
}
catch (Exception)
{
// ignored
}
}

/// <summary>
/// Start recording
/// </summary>
public bool StartRecordVideo()
{
_state = RecordState.Start;
return true;
}

[DllImport("user32.dll")]
private static extern bool GetCursorInfo(out Cursorinfo pci);

[StructLayout(LayoutKind.Sequential)]
private struct Point
{
public Int32 x;
public Int32 y;
}

[StructLayout(LayoutKind.Sequential)]
private struct Cursorinfo
{
public Int32 cbSize;
public Int32 flags;
public IntPtr hCursor;
public Point ptScreenPos;
}

// Draw the mouse cursor onto each incoming frame
private void VideoNewFrame(object sender, NewFrameEventArgs e)
{
if (_state == RecordState.Start)
{
if (_captureMouse)
{
var g = System.Drawing.Graphics.FromImage(e.Frame);
Cursorinfo pci;
pci.cbSize = Marshal.SizeOf(typeof(Cursorinfo));
GetCursorInfo(out pci);
try
{
System.Windows.Forms.Cursor cur = new System.Windows.Forms.Cursor(pci.hCursor);
cur.Draw
(
g,
new System.Drawing.Rectangle
(
System.Windows.Forms.Cursor.Position.X - 10,
System.Windows.Forms.Cursor.Position.Y - 10,
cur.Size.Width,
cur.Size.Height
)
);
}
catch
{
// ignored
} // Opening Task Manager causes an exception here
}

lock (this)
{
_videoWriter.WriteVideoFrame(e.Frame);
}
}
}

/// <summary>
/// Stop recording
/// </summary>
public void StopRecordVideo()
{
_state = RecordState.Stop;
// Stop the capture stream
_videoStreamer.Stop();

// Close the writer
lock (this)
{
_videoWriter.Close();
_videoWriter.Dispose();
}
}

/// <summary>
/// Pause recording
/// </summary>
public void PauseRecordVideo()
{
_state = RecordState.Pause;
}

/// <summary>
/// Resume recording
/// </summary>
public void ResumeRecordVideo()
{
_state = RecordState.Start;
}
}
}

Drawing the cursor

[DllImport("user32.dll")]
private static extern bool GetCursorInfo
(
out Cursorinfo pci
);

[StructLayout(LayoutKind.Sequential)]
private struct Point
{
public int x;
public int y;
}

[StructLayout(LayoutKind.Sequential)]
private struct Cursorinfo
{
public int cbSize;
public int flags;
public IntPtr hCursor;
public Point ptScreenPos;
}

// Draw the mouse cursor onto each incoming frame
private void VideoNewFrame
(
object sender,
NewFrameEventArgs e
)
{
if (State == RecordState.Start)
{
if (_captureMouse)
{
var g = Graphics.FromImage(e.Frame);
Cursorinfo pci;
pci.cbSize = Marshal.SizeOf(typeof(Cursorinfo));
GetCursorInfo(out pci);
try
{
System.Windows.Forms.Cursor cur = new System.Windows.Forms.Cursor(pci.hCursor);
cur.Draw(
g,
new Rectangle(
System.Windows.Forms.Cursor.Position.X - 10,
System.Windows.Forms.Cursor.Position.Y - 10,
cur.Size.Width,
cur.Size.Height
)
);
}
catch
{
// ignored
} // Opening Task Manager causes an exception here
}
lock (WriterLock)
{
_videoWriter.WriteVideoFrame(e.Frame);
}
}
}

Drawing a dot

// Draw a red dot at the cursor position on each incoming frame
private void VideoNewFrame
(
object sender,
NewFrameEventArgs e
)
{
if (State == RecordState.Start)
{
if (_captureMouse)
{
var g = Graphics.FromImage(e.Frame);

// Create a red brush
Brush brush = new SolidBrush(Color.Red);
g.FillEllipse(
brush,
System.Windows.Forms.Cursor.Position.X - 5,
System.Windows.Forms.Cursor.Position.Y - 5,
10,
10
);
}
lock (WriterLock)
{
_videoWriter.WriteVideoFrame(e.Frame);
}
}
}

Usage

// Handle of the current window
IntPtr winHandle = new WindowInteropHelper(this).Handle;
var curScreen = Screen.FromHandle(winHandle);
int RecordWidth = curScreen.Bounds.Width;
int RecordHeight = curScreen.Bounds.Height;

// Desktop recording
ZRecordVideoHelper helper3 = null;
helper3 = new ZRecordVideoHelper(TempVideoPathName, RecordWidth, RecordHeight);
helper3.StartRecordVideo();

// Pause recording
helper3.PauseRecordVideo();

// Resume recording
helper3.ResumeRecordVideo();

// Stop recording
helper3.StopRecordVideo();

Writing the video

using AForge.Video.FFMPEG;

private VideoFileWriter videoWriter = new VideoFileWriter(); // video writer

// Open the writer
lock (this)
{
    videoWriter.Open(
        curVideoPath,
        RecordWidth,
        RecordHeight,
        frameRate,
        VideoCodec.MPEG4,
        RecordWidth * RecordHeight * quality // quality: at most 10
    );
}

// Write one frame
videoWriter.WriteVideoFrame(bitmap);

// Finish writing
videoWriter.Close();

Note the following:

  • The width and height passed to Open must match the width and height of the frames that are written.
  • Width and height must be multiples of 2.
  • Guard Open with a global lock, otherwise concurrent access from multiple threads throws exceptions.
  • Close with the same global lock used for Open, otherwise the next Open can throw an AccessViolationException.
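The evenness requirement can be handled with plain integer division, the same width / 2 * 2 trick used in ZRecordVideoHelper; a minimal sketch (the helper name is illustrative):

```csharp
// Illustrative helper: forces a dimension down to the nearest even
// number, since the FFMPEG writer requires width and height to be
// multiples of 2.
public static class VideoSize
{
    public static int MakeEven(int n)
    {
        return n / 2 * 2; // integer division drops the odd remainder
    }
}
```

For a 1081-pixel-wide region this yields 1080; even dimensions pass through unchanged.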

Reading the desktop stream

using AForge.Video;

private ScreenCaptureStream videoStreamer; // video capture

System.Drawing.Rectangle rec = new System.Drawing.Rectangle(0, 0, RecordWidth, RecordHeight);
videoStreamer = new ScreenCaptureStream(rec, 1000 / 10); // The frame interval must match the frame rate, otherwise a 10-second recording does not play back as 10 seconds
videoStreamer.NewFrame += VideoNewFrame;
videoStreamer.Start();

// Frame callback
private void VideoNewFrame(object sender, NewFrameEventArgs e)
{
    videoWriter.WriteVideoFrame(e.Frame);
}

// Stop
videoStreamer.Stop();

Recording a UI control

/// <summary>
/// Save a screenshot of a UI element
/// </summary>
/// <param name="ui">UI element to capture</param>
/// <param name="filePathName">Image save path, e.g. 1.png</param>
/// <param name="ImgWidth">Width after scaling</param>
/// <param name="ImgHeight">Height after scaling</param>
public static void SaveUI(FrameworkElement ui, string filePathName, int ImgWidth, int ImgHeight)
{
Console.WriteLine("Screenshot path: " + filePathName);
try
{
RenderTargetBitmap bmp = new RenderTargetBitmap(
(int)ui.ActualWidth,
(int)ui.ActualHeight,
96, 96, // DPI; note that 1 / 96 would truncate to 0 in integer arithmetic
PixelFormats.Default
);
bmp.Render(ui);
BitmapEncoder encoder = new PngBitmapEncoder();
encoder.Frames.Add(BitmapFrame.Create(bmp));
if (ImgWidth > 0)
{
MemoryStream memoryStream = new MemoryStream();
encoder.Save(memoryStream);
Bitmap bit = new Bitmap(memoryStream, true);
Bitmap Img = new Bitmap(bit, ImgWidth, ImgHeight);
Img.Save(filePathName);
Img.Dispose();
bit.Dispose();
memoryStream.Dispose();
}
else
{
using (var stream = new FileStream(filePathName, FileMode.Create))
{
encoder.Save(stream);
}
}
}
catch (Exception e)
{
Console.WriteLine(e.Message);
}
}

public static void SaveUI2(FrameworkElement frameworkElement, string filePathName)
{
System.IO.FileStream fs = new System.IO.FileStream(filePathName, System.IO.FileMode.Create);
RenderTargetBitmap bmp = new RenderTargetBitmap((int)frameworkElement.ActualWidth, (int)frameworkElement.ActualHeight, 96, 96, PixelFormats.Default);
bmp.Render(frameworkElement);
BitmapEncoder encoder = new PngBitmapEncoder();
encoder.Frames.Add(BitmapFrame.Create(bmp));
encoder.Save(fs);
fs.Close();
}

public static Bitmap SaveUI2Bitmap(FrameworkElement frameworkElement, int width, int height)
{
using (MemoryStream outStream = new MemoryStream())
{
RenderTargetBitmap bmp = new RenderTargetBitmap(
(int)frameworkElement.ActualWidth,
(int)frameworkElement.ActualHeight,
96,
96,
PixelFormats.Default
);
bmp.Render(frameworkElement);
BitmapEncoder enc = new PngBitmapEncoder();
enc.Frames.Add(BitmapFrame.Create(bmp));
enc.Save(outStream);

System.Drawing.Bitmap bitmap = new System.Drawing.Bitmap(outStream);
return new Bitmap(bitmap, width, height);
}
}

Camera recording

Recording

using AForge.Video;
using AForge.Video.DirectShow;
using AForge.Video.FFMPEG;

private VideoCaptureDevice Camera; // controls the camera
private VideoFileWriter VideoOutPut; // encodes each frame into the video file
private string CameraName;
private bool isParsing;
private int frameCount; // counter between 0 and frame
private int frame; // frame decimation: drop the camera to an effective 10 fps; at 30 fps keep one frame out of every 3


lock (this) // Open the video file (created if missing, truncated if it exists)
{
VideoOutPut = new VideoFileWriter();
VideoOutPut.Open(
CameraSavePath,
Camera.VideoResolution.FrameSize.Width,
Camera.VideoResolution.FrameSize.Height,
10,
VideoCodec.MSMPEG4v3, // forced to 10 fps to keep high-frame-rate video from running fast
Camera.VideoResolution.FrameSize.Width * Camera.VideoResolution.FrameSize.Height * SettingHelp.Settings.视频质量
);
}

frameCount = 0;
Camera = new VideoCaptureDevice(CameraName);
//Configure capture parameters (width, height, frame rate, bit rate, etc.); VideoCapabilities lists the configurations the camera supports
Camera.VideoResolution = Camera.VideoCapabilities[0];
frame = Camera.VideoResolution.AverageFrameRate / 10;

Camera.NewFrame += Camera_NewFrame; // Set the callback; AForge continuously pushes frames through it (SnapshotFrame is an alternative worth comparing)
Camera.Start(); // start the camera


/// <summary>
/// Camera frame callback
/// </summary>
private void Camera_NewFrame(object sender, NewFrameEventArgs eventArgs)
{
if (!isParsing)
{
Dispatcher.Invoke(new Action(
() =>
{
MemoryStream ms = new MemoryStream();
eventArgs.Frame.Save(ms, ImageFormat.Bmp);
BitmapImage image = new BitmapImage();
image.BeginInit();
image.StreamSource = new MemoryStream(ms.GetBuffer());
ms.Close();
image.EndInit();
imgCamera.Source = image;
})); // display synchronously
frameCount += 1; // frame decimation: writing cannot keep up at high frame rates (30 frames may take 2s to process, but the file treats 30 frames as 1s, so playback appears sped up)
if (frameCount == frame)
{
frameCount = 0;
VideoOutPut.WriteVideoFrame(eventArgs.Frame);
}
}
}


if (Camera != null) {
Camera.SignalToStop();
}

Merging audio and video

using NReco.VideoConverter;

FFMpegConverter ffMpeg = new FFMpegConverter();
ffMpeg.ConvertProgress += FfMpeg_ConvertProgress;
FFMpegInput[] input = new FFMpegInput[] {
new FFMpegInput(tempVideo),
new FFMpegInput(tempAudio),
new FFMpegInput(tempAudio2),
};
string OutFilePath = @"D:\test.mp4";
// Mix multiple audio inputs
ffMpeg.ConvertMedia(
input,
OutFilePath,
"mp4",
new OutputSettings()
{
CustomOutputArgs = "-filter_complex amix=inputs=2:duration=longest:dropout_transition=2"
}
);

private void FfMpeg_ConvertProgress(object sender, ConvertProgressEventArgs e){
Dispatcher.Invoke(
() =>
{
waitBar.Value = e.Processed.TotalSeconds * 10 / e.TotalDuration.TotalSeconds; // assumes a progress bar with Maximum = 10
}
);
}

Parameter notes

  • amix=inputs=2: the number of inputs to mix; here two audio streams are mixed.

  • duration=longest: how the mix duration is decided; longest uses the duration of the longest input, shortest the shortest input, first the first input.

  • dropout_transition=2: when an input stream ends, its volume fades from full to silence over 2 seconds.
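For reference, the ConvertMedia call above corresponds roughly to a raw FFmpeg command line like the following (file names are placeholders):

```shell
# Mix the two audio tracks into the video; mirrors the CustomOutputArgs above.
ffmpeg -i tempVideo.mp4 -i tempAudio1.mp3 -i tempAudio2.mp3 \
  -filter_complex "amix=inputs=2:duration=longest:dropout_transition=2" \
  -f mp4 test.mp4
```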

Generating a thumbnail

/// <summary>
/// Generate a thumbnail from a video
/// </summary>
/// <param name="mp4Path"></param>
/// <param name="picPath"></param>
public static void GetVideoPic(string mp4Path, string picPath)
{
using (var video = new VideoFileReader())
{
video.Open(mp4Path);
int frameNum = video.FrameCount > 3 ? 3 : 0;
using (var bmp = video.ReadVideoFrame(frameNum))
{
SaveAsJpeg(bmp, picPath, 90);
}
video.Close();
}
}

// Save as JPEG; quality is the image quality (0-100)
public static void SaveAsJpeg(Bitmap bitmap, string fileName, int quality)
{
var jpegEncoder = ImageCodecInfo.GetImageEncoders().First(codec => codec.FormatID == ImageFormat.Jpeg.Guid);
var encoderParameters = new EncoderParameters(1);
encoderParameters.Param[0] = new EncoderParameter(Encoder.Quality, quality);
bitmap.Save(fileName, jpegEncoder, encoderParameters);
}

Getting the video duration

/// <summary>
/// Get the video duration
/// </summary>
/// <param name="mp4Path"></param>
/// <returns></returns>
public static string GetDuration(string mp4Path)
{
string durationStr;
using (var reader = new VideoFileReader())
{
reader.Open(mp4Path);

double duration = reader.FrameCount / reader.FrameRate.Value;
TimeSpan timeSpan = TimeSpan.FromSeconds(duration);
durationStr = timeSpan.ToString(@"hh\:mm\:ss");
reader.Close();
}

return durationStr;
}

Audio/video recording manager class

/*
*┌────────────────────────────────────────────────┐
*│ Description: ZRecordManager
*│ Author: 剑行者
*│ Version: 1.0
*│ Created: 2023/2/15 15:51:49
*└────────────────────────────────────────────────┘
*/

using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Threading;
using System.Windows;
using System.Windows.Forms;
using System.Windows.Interop;

using NAudio.CoreAudioApi;
using NAudio.Wave;

using NReco.VideoConverter;

namespace ZUtils
{
public class ZRecordManager
{
/// <summary>
/// Whether a microphone is present
/// </summary>
private readonly bool _hasMicrophone;

/// <summary>
/// Whether a speaker is present
/// </summary>
private readonly bool _hasLoudspeaker;

// Records the microphone audio
private readonly ZRecordMicrophoneHelper _helper1;

// Records the speaker audio
private readonly ZRecordLoudspeakerHelper _helper2;

private readonly ZRecordVideoHelper _helper3;

private readonly string _tempVideoPathName;
private readonly string _tempAudioPathName1;
private readonly string _tempAudioPathName2;

private readonly string _imagePath;

private readonly string _savePath;

private bool _audio01Finish;
private bool _audio02Finish;

/// <summary>
/// Callback arguments: video path, thumbnail path, duration, progress
/// </summary>
private readonly Action<string, string, string, double> _callback;

public ZRecordManager(string savePath, Window window, Action<string, string, string, double> callback)
{
_callback = callback;
IntPtr winHandle = new WindowInteropHelper(window).Handle;
var curScreen = Screen.FromHandle(winHandle);
int recordWidth = curScreen.Bounds.Width;
int recordHeight = curScreen.Bounds.Height;
_savePath = savePath;

FileInfo fi = new FileInfo(savePath);
var di = fi.Directory;
if (di is { Exists: false })
{
di.Create();
}

_tempVideoPathName = savePath + "_v.mp4";
_tempAudioPathName1 = savePath + "_a1.mp3";
_tempAudioPathName2 = savePath + "_a2.mp3";
_imagePath = savePath + ".jpg";

if (WaveIn.DeviceCount > 0)
{
_hasMicrophone = true;
_helper1 = new ZRecordMicrophoneHelper(_tempAudioPathName1);
}
else
{
_hasMicrophone = false;
}

MMDeviceEnumerator enumerator = new MMDeviceEnumerator();

// Enumerate the active audio output (render) devices
IEnumerable<MMDevice> speakDevices = enumerator.EnumerateAudioEndPoints(DataFlow.Render, DeviceState.Active).ToArray();
if (speakDevices.Any())
{
_hasLoudspeaker = true;
_helper2 = new ZRecordLoudspeakerHelper(_tempAudioPathName2);
}
else
{
_hasLoudspeaker = false;
}

_helper3 = new ZRecordVideoHelper(_tempVideoPathName, recordWidth, recordHeight);
}

public void Start()
{
if (_hasMicrophone)
{
_helper1.StartRecordAudio();
}

if (_hasLoudspeaker)
{
_helper2.StartRecordAudio();
}

_helper3.StartRecordVideo();
}

public void Stop()
{
if (_hasMicrophone)
{
_helper1.StopRecordAudio((path) =>
{
_audio01Finish = true;
});
}

if (_hasLoudspeaker)
{
_helper2.StopRecordAudio((path) =>
{
_audio02Finish = true;
});
}

_helper3.StopRecordVideo();

new Thread(o =>
{
Merge();
})
.Start();
}

public void Pause()
{
if (_hasMicrophone)
{
_helper1.PauseRecordAudio();
}

if (_hasLoudspeaker)
{
_helper2.PauseRecordAudio();
}

_helper3.PauseRecordVideo();
}

public void Resume()
{
if (_hasMicrophone)
{
_helper1.ResumeRecordAudio();
}

if (_hasLoudspeaker)
{
_helper2.ResumeRecordAudio();
}

_helper3.ResumeRecordVideo();
}

private void Merge()
{
// Wait until every audio track that is actually being recorded has
// finished writing. The device flags must be checked here: if a device
// is absent, its finish flag can never be set, and waiting on it
// unconditionally would block forever.
while ((_hasMicrophone && !_audio01Finish) || (_hasLoudspeaker && !_audio02Finish))
{
Thread.Sleep(100);
}

Console.WriteLine(@"Merging audio and video");
if (_hasMicrophone && _hasLoudspeaker)
{
FFMpegConverter ffmpeg = new FFMpegConverter();
ffmpeg.ConvertProgress += FfMpeg_ConvertProgress;
FFMpegInput[] input =
{
new FFMpegInput(_tempVideoPathName), new FFMpegInput(_tempAudioPathName1),
new FFMpegInput(_tempAudioPathName2),
};

// Mix the microphone and loudspeaker tracks into a single audio stream
ffmpeg.ConvertMedia(
input,
_savePath,
Format.mp4,
new OutputSettings
{
CustomOutputArgs = "-filter_complex amix=inputs=2:duration=longest:dropout_transition=2"
}
);
}
else if (_hasMicrophone && !_hasLoudspeaker)
{
FFMpegConverter ffmpeg = new FFMpegConverter();
ffmpeg.ConvertProgress += FfMpeg_ConvertProgress;
FFMpegInput[] input = { new FFMpegInput(_tempVideoPathName), new FFMpegInput(_tempAudioPathName1), };

// Only the microphone track; amix with a single input is effectively a pass-through
ffmpeg.ConvertMedia(
input,
_savePath,
Format.mp4,
new OutputSettings
{
CustomOutputArgs = "-filter_complex amix=inputs=1:duration=longest:dropout_transition=2"
}
);
}
else if (!_hasMicrophone && _hasLoudspeaker)
{
FFMpegConverter ffmpeg = new FFMpegConverter();
ffmpeg.ConvertProgress += FfMpeg_ConvertProgress;
FFMpegInput[] input = { new FFMpegInput(_tempVideoPathName), new FFMpegInput(_tempAudioPathName2), };

// Only the loudspeaker track; amix with a single input is effectively a pass-through
ffmpeg.ConvertMedia(
input,
_savePath,
Format.mp4,
new OutputSettings
{
CustomOutputArgs = "-filter_complex amix=inputs=1:duration=longest:dropout_transition=2"
}
);
}
else
{
// No audio was recorded: the temp video becomes the final file
if (File.Exists(_tempVideoPathName))
{
File.Move(_tempVideoPathName, _savePath);
}

Console.WriteLine(@"Video only; generating the thumbnail");
FinishAction();
}
}

/// <summary>
/// Generate the thumbnail, read the duration, and report completion
/// </summary>
private void FinishAction()
{
GenerateThumbnails(_savePath, _imagePath);
string duration = GetVideoDuration(_savePath);
_callback(_savePath, _imagePath, duration, 100);
}

private void FfMpeg_ConvertProgress(object sender, ConvertProgressEventArgs e)
{
double progress = e.Processed.TotalSeconds * 100 / e.TotalDuration.TotalSeconds;
// Only clean up and finish once the conversion has fully completed
if (progress < 100.0)
{
return;
}

if (File.Exists(_tempAudioPathName1))
{
try
{
File.Delete(_tempAudioPathName1);
}
catch (Exception)
{
Console.WriteLine(@"Failed to delete the microphone audio file");
}
}

if (File.Exists(_tempAudioPathName2))
{
try
{
File.Delete(_tempAudioPathName2);
}
catch (Exception)
{
Console.WriteLine(@"Failed to delete the loudspeaker audio file");
}
}

if (File.Exists(_tempVideoPathName))
{
try
{
File.Delete(_tempVideoPathName);
}
catch (Exception)
{
Console.WriteLine(@"Failed to delete the temp video file");
}
}

Console.WriteLine(@"Merge complete; generating the thumbnail");
FinishAction();
}
}
}

Usage

IsRecording = !IsRecording;
string savePath = ZConfig.configLocal.savePath;

string pathAll = Path.Combine(savePath, DateTime.Now.ToString("yyyyMMddHHmmss") + ".mp4");

_recordManager = new ZRecordManager(
pathAll,
this,
(mp4, image, duration, progress) =>
{
if (progress >= 100.0)
{
string name = Path.GetFileNameWithoutExtension(mp4);
ZRecordModel recordModel = new ZRecordModel
{
Name = name,
MP4Path = mp4,
Pic = image,
Duration = duration,
RecordTime = DateTime.Now.ToString("MM月dd日 HH:mm:ss")
};
Console.WriteLine(@"name:" + name);
Console.WriteLine(@"mp4:" + mp4);
Console.WriteLine(@"image:" + image);
Console.WriteLine(@"duration:" + duration);
ZCommonData.recordList.Add(recordModel);
}
}
);

Open the system sound settings

Process.Start("mmsys.cpl");

Play the recording with the local default player

Process pro = new Process
{
StartInfo = new ProcessStartInfo(videoPath)
};
pro.Start();

Audio/video parameter calculations

Audio

The relationship between sample rate and bit rate is:

bit rate = sample rate × bit depth × channel count

AudioBitRate = SampleRate × bits × channels

Here the sample rate is the number of samples captured per second, in Hz; the bit depth is the number of bits per sample, usually 16 or 24; and the channel count is the number of audio channels, usually mono (1) or stereo (2).

For example, for stereo music with a 44100 Hz sample rate, 16-bit depth, and 2 channels, the bit rate is:

bit rate = 44100 × 16 × 2 = 1411200 bit/s ≈ 1.41 Mbps

With a 44100 Hz sample rate, 16-bit depth, and 1 channel, the bit rate is:

bit rate = 44100 × 16 × 1 = 705600 bit/s

Since

1 byte = 8 bits

the stereo example above produces the following number of bytes per second:

1411200 / 8 = 176400
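As a quick sanity check, the formula can be evaluated in a few lines (Python is used here purely for illustration; the helper name is made up):

```python
def audio_bit_rate(sample_rate: int, bits: int, channels: int) -> int:
    """Bit rate in bit/s: sample rate x bit depth x channel count."""
    return sample_rate * bits * channels

stereo = audio_bit_rate(44100, 16, 2)  # the stereo example
mono = audio_bit_rate(44100, 16, 1)    # the mono example
print(stereo)       # 1411200 bit/s (about 1.41 Mbps)
print(mono)         # 705600 bit/s
print(stereo // 8)  # 176400 bytes per second, since 1 byte = 8 bits
```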

Parameters used in the audio calculations

// Video frame rate
private readonly int _frameRate;

// Audio sample rate
private readonly int _audioSampleRate = 44100;

// Audio bit depth
private readonly int _audioBits = 16;

// Audio channel count
private readonly int _audioChannels = 1;

Audio bit rate

bit rate = sample rate × bit depth × channel count
audioBitRate = _audioSampleRate * _audioBits * _audioChannels;

Audio frame size

audio frame size = bit rate × 1 s / 8 / channel count / frame rate
audioFrameSize = audioBitRate * 1 / 8 / _audioChannels / _frameRate
= _audioSampleRate * _audioBits * _audioChannels * 1 / 8 / _audioChannels / _frameRate
= _audioSampleRate * _audioBits / 8 / _frameRate

where bit rate × 1 s / 8 is the number of bytes produced in one second.
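The per-frame byte count can be checked the same way (a sketch; the 30 fps frame rate is just an example value, and the function name is made up):

```python
def audio_frame_size(sample_rate: int, bits: int, channels: int, frame_rate: int) -> int:
    """Bytes of audio per video frame: bit rate / 8 bits-per-byte / channels / frame rate."""
    bit_rate = sample_rate * bits * channels        # bit/s
    return bit_rate // 8 // channels // frame_rate  # bytes per video frame

# Mono 44100 Hz, 16-bit audio alongside 30 fps video:
print(audio_frame_size(44100, 16, 1, 30))  # 88200 bytes/s / 30 fps = 2940 bytes per frame
```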