## Preface

Reinstall all NuGet dependencies:

```powershell
Update-Package -reinstall
```
Record audio and video as separate streams. If both the microphone and the speakers are captured, record those separately as well, then merge all the streams at the end.
## Official documentation

NAudio: https://github.com/naudio/NAudio
## Installation

### Video library

OpenCvSharp4:

```powershell
Install-Package OpenCvSharp4 -Version 4.7.0.20230115
Install-Package OpenCvSharp4.runtime.win -Version 4.7.0.20230115
Install-Package OpenCvSharp4.Extensions -Version 4.7.0.20230115
```
OpenCvSharp3:

```powershell
Install-Package OpenCvSharp3-AnyCPU -Version 4.0.0.20181129
```
With OpenCvSharp4, saving video kept throwing errors or produced no video file at all; switching to OpenCvSharp3 made everything work.
### Audio library

Audio recording uses the NAudio library, which can capture both the microphone and the speakers (loopback).

Install:

```powershell
Install-Package NAudio -Version 1.9.0
```
### Audio/video merging library

No good merging solution has been found yet. Most merging libraries are wrappers around FFmpeg, and FFmpeg itself is fairly large, so it is not recommended here; no better alternative has turned up so far.
## Audio processing

Uses NAudio.

Install:

```powershell
Install-Package NAudio -Version 1.9.0
```
### Microphone list

```csharp
using NAudio.Wave;

public static void GetAudioMicrophone()
{
    for (int n = -1; n < WaveIn.DeviceCount; n++)
    {
        var caps = WaveIn.GetCapabilities(n);
        Console.WriteLine($@"{n}: {caps.ProductName}");
    }
}
```

Output:

```
-1: Microsoft Sound Mapper
0: 麦克风 (Realtek(R) Audio)
```

Note that the loop above starts at -1 (the sound mapper); when enumerating real microphone devices you can start at 0.
Alternative approach:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using NAudio.CoreAudioApi;

public static void GetAudioMicrophone2()
{
    MMDeviceEnumerator enumerator = new MMDeviceEnumerator();
    IEnumerable<MMDevice> captureDevices = enumerator.EnumerateAudioEndPoints(DataFlow.Capture, DeviceState.Active).ToArray();
    foreach (MMDevice device in captureDevices)
    {
        int volume = Convert.ToInt16(device.AudioEndpointVolume.MasterVolumeLevelScalar * 100);
        Console.WriteLine($@"{device.FriendlyName} volume: {volume}");
    }
}
```
### Speaker list

Get the active speakers and their volume:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using NAudio.CoreAudioApi;

public static void GetAudioLoudspeaker2()
{
    MMDeviceEnumerator enumerator = new MMDeviceEnumerator();
    IEnumerable<MMDevice> speakDevices = enumerator.EnumerateAudioEndPoints(DataFlow.Render, DeviceState.Active).ToArray();
    foreach (MMDevice device in speakDevices)
    {
        int volume = Convert.ToInt16(device.AudioEndpointVolume.MasterVolumeLevelScalar * 100);
        Console.WriteLine($@"{device.FriendlyName} volume: {volume}");
    }
}
```

Output:

```
PHL 271V8 (NVIDIA High Definition Audio) volume: 100
扬声器/听筒 (Realtek(R) Audio) volume: 29
```
### Default microphone and speaker

```csharp
var defaultCaptureDevice = WasapiCapture.GetDefaultCaptureDevice();
Console.WriteLine($@"Default microphone: {defaultCaptureDevice.FriendlyName}");
var defaultLoopbackCaptureDevice = WasapiLoopbackCapture.GetDefaultLoopbackCaptureDevice();
Console.WriteLine($@"Default speaker: {defaultLoopbackCaptureDevice.FriendlyName}");
```
### Real-time microphone level

XAML:

```xml
<ProgressBar BorderThickness="0"
             Maximum="100"
             Name="VolumeProgressBar" />
```
Code:

```csharp
private WaveInEvent waveIn = null;

private void AudioMonitor()
{
    if (WaveIn.DeviceCount == 0)
    {
        return;
    }
    waveIn = new WaveInEvent();
    waveIn.DataAvailable += (o, e1) =>
    {
        byte[] buf = e1.Buffer;
        float maxNumber = 0;
        // 16-bit PCM: combine each pair of bytes into one sample and track the peak.
        for (int index = 0; index < e1.BytesRecorded; index += 2)
        {
            short sample = (short)((buf[index + 1] << 8) | buf[index + 0]);
            float sample32 = Math.Abs(sample / 32768f);
            if (sample32 > maxNumber)
            {
                maxNumber = sample32;
            }
        }
        Dispatcher.Invoke(() => { VolumeProgressBar.Value = maxNumber * 100; });
    };
    waveIn.RecordingStopped += (s, a) => { waveIn.Dispose(); };
    waveIn.StartRecording();
}
```
Stop:

```csharp
if (waveIn != null)
{
    waveIn.StopRecording();
}
```
### Selecting the microphone

Set the corresponding device index:

```csharp
waveIn.DeviceNumber = 0;
```
Official documentation: https://github.com/naudio/NAudio/blob/master/Docs/RecordingLevelMeter.md
### Real-time speaker level

```csharp
private WasapiLoopbackCapture capture = null;

// Start monitoring.
capture = new WasapiLoopbackCapture();
capture.DataAvailable += (s, e1) =>
{
    byte[] buf = e1.Buffer;
    float maxNumber = 0;
    for (int index = 0; index < e1.BytesRecorded; index += 2)
    {
        short sample = (short)((buf[index + 1] << 8) | buf[index + 0]);
        float sample32 = Math.Abs(sample / 32768f);
        if (sample32 > maxNumber)
        {
            maxNumber = sample32;
        }
    }
    Console.WriteLine("maxNumber " + maxNumber);
};
capture.RecordingStopped += (s, a) => { capture.Dispose(); };
capture.StartRecording();

// Stop monitoring.
if (capture != null)
{
    capture.StopRecording();
}
```
Note: the captured speaker level is not affected by the system volume setting. To estimate the loudness the user actually hears, multiply the captured level by the speaker's configured volume.
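A minimal sketch of that correction, assuming `capturedPeak` is the 0.0–1.0 peak computed in the loopback `DataAvailable` handler above; `GetDefaultAudioEndpoint` is NAudio's call for the default render device:

```csharp
using NAudio.CoreAudioApi;

static class LoudnessHelper
{
    // Scale the captured peak (0.0–1.0) by the speaker's master volume setting
    // to approximate what the user actually hears.
    public static float EffectiveLevel(float capturedPeak)
    {
        var enumerator = new MMDeviceEnumerator();
        MMDevice speaker = enumerator.GetDefaultAudioEndpoint(DataFlow.Render, Role.Multimedia);
        return capturedPeak * speaker.AudioEndpointVolume.MasterVolumeLevelScalar;
    }
}
```

This is hardware-dependent, so treat it as a sketch rather than a drop-in utility.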
### Setting the speaker volume

```csharp
private void SetCurrentSpeakerVolume(int volume)
{
    var enumerator = new MMDeviceEnumerator();
    IEnumerable<MMDevice> speakDevices = enumerator.EnumerateAudioEndPoints(DataFlow.Render, DeviceState.Active).ToArray();
    if (speakDevices.Count() > 0)
    {
        MMDevice mMDevice = speakDevices.ToList()[0];
        mMDevice.AudioEndpointVolume.MasterVolumeLevelScalar = volume / 100.0f;
        Console.WriteLine("Speaker being set: " + mMDevice.FriendlyName);
        Console.WriteLine("Speaker volume set to: " + volume);
    }
}
```
## Recording the microphone and speakers

### Recording the microphone

```csharp
using System;
using System.IO;
using System.Threading;
using NAudio.Wave;

namespace ZUtils
{
    public class ZRecordMicrophoneHelper
    {
        public enum RecordState
        {
            Stop = 0,
            Start = 1,
            Pause = 2
        }

        private RecordState _state;
        private readonly WaveInEvent _waveIn;
        private Action<string> _stopAction;

        public ZRecordMicrophoneHelper(string filePath)
        {
            WaveFileWriter writer;
            var audioFile = filePath;
            _state = RecordState.Pause;
            try
            {
                _waveIn = new WaveInEvent();
                writer = new WaveFileWriter(audioFile, _waveIn.WaveFormat);
                _waveIn.DataAvailable += (s, a) =>
                {
                    if (_state == RecordState.Start)
                    {
                        writer.Write(a.Buffer, 0, a.BytesRecorded);
                    }
                };
                _waveIn.RecordingStopped += (s, a) =>
                {
                    writer.Dispose();
                    writer = null;
                    _waveIn.Dispose();
                    _stopAction?.Invoke(audioFile);
                };
                // Capture starts immediately; writing is gated by _state.
                _waveIn.StartRecording();
            }
            catch (Exception)
            {
            }
        }

        public void StartRecordAudio()
        {
            _state = RecordState.Start;
        }

        public void StopRecordAudio(Action<string> stopAction)
        {
            _stopAction = stopAction;
            _state = RecordState.Stop;
            _waveIn.StopRecording();
        }

        public void PauseRecordAudio()
        {
            _state = RecordState.Pause;
        }

        public void ResumeRecordAudio()
        {
            _state = RecordState.Start;
        }

        // Quick probe: record for 200 ms to check that the device works at all.
        public static bool IsDeviceGood()
        {
            string tempPath = Path.GetTempPath();
            WaveInEvent mWaveIn;
            WaveFileWriter mWriter = null;
            try
            {
                string mAudioFile = Path.Combine(tempPath, "_microphone.mp3");
                mWaveIn = new WaveInEvent();
                mWriter = new WaveFileWriter(mAudioFile, mWaveIn.WaveFormat);
                var writer = mWriter;
                mWaveIn.DataAvailable += (s, a) => { writer.Write(a.Buffer, 0, a.BytesRecorded); };
                mWaveIn.RecordingStopped += (s, a) =>
                {
                    writer.Dispose();
                    mWriter = null;
                    mWaveIn.Dispose();
                    if (File.Exists(mAudioFile))
                    {
                        File.Delete(mAudioFile);
                    }
                };
                mWaveIn.StartRecording();
                ThreadPool.QueueUserWorkItem(o =>
                {
                    Thread.Sleep(200);
                    mWaveIn.StopRecording();
                });
            }
            catch (Exception)
            {
                if (mWriter != null)
                {
                    mWriter.Dispose();
                    mWriter = null;
                }
                return false;
            }
            return true;
        }
    }
}
```
Note: recording is started directly in the constructor. The reason is that when audio and video are recorded simultaneously, hardware startup times differ, so starting both at the same moment still produces out-of-sync audio and video. Instead, capture begins immediately and writing is toggled via the state flag. The video recording later in this article starts the same way, for the same reason.
### Recording the speakers

```csharp
using System;
using System.IO;
using System.Threading;
using NAudio.Wave;

namespace ZUtils
{
    public class ZRecordLoudspeakerHelper
    {
        public enum RecordState
        {
            Stop = 0,
            Start = 1,
            Pause = 2
        }

        private RecordState _state;
        private readonly WasapiLoopbackCapture _capture;
        private Action<string> _stopAction;

        public ZRecordLoudspeakerHelper(string filePath)
        {
            WaveFileWriter writer;
            var audioFile = filePath;
            _state = RecordState.Pause;
            try
            {
                _capture = new WasapiLoopbackCapture();
                writer = new WaveFileWriter(audioFile, _capture.WaveFormat);
                _capture.DataAvailable += (s, a) =>
                {
                    if (_state == RecordState.Start)
                    {
                        writer.Write(a.Buffer, 0, a.BytesRecorded);
                    }
                };
                _capture.RecordingStopped += (s, a) =>
                {
                    writer.Dispose();
                    writer = null;
                    _capture.Dispose();
                    _stopAction?.Invoke(audioFile);
                };
                // Capture starts immediately; writing is gated by _state.
                _capture.StartRecording();
            }
            catch (Exception)
            {
            }
        }

        public void StartRecordAudio()
        {
            _state = RecordState.Start;
        }

        public void StopRecordAudio(Action<string> stopAction)
        {
            _stopAction = stopAction;
            _state = RecordState.Stop;
            _capture.StopRecording();
        }

        public void PauseRecordAudio()
        {
            _state = RecordState.Pause;
        }

        public void ResumeRecordAudio()
        {
            _state = RecordState.Start;
        }

        // Quick probe: record for 200 ms to check that loopback capture works.
        public static bool IsDeviceGood()
        {
            string tempPath = Path.GetTempPath();
            WaveFileWriter mWriter = null;
            WasapiLoopbackCapture mCapture;
            try
            {
                string mAudioFile = Path.Combine(tempPath, "_loudspeaker.mp3");
                mCapture = new WasapiLoopbackCapture();
                mWriter = new WaveFileWriter(mAudioFile, mCapture.WaveFormat);
                var writer = mWriter;
                mCapture.DataAvailable += (s, a) => { writer.Write(a.Buffer, 0, a.BytesRecorded); };
                mCapture.RecordingStopped += (s, a) =>
                {
                    writer.Dispose();
                    mWriter = null;
                    mCapture.Dispose();
                    if (File.Exists(mAudioFile))
                    {
                        File.Delete(mAudioFile);
                    }
                };
                mCapture.StartRecording();
                ThreadPool.QueueUserWorkItem(o =>
                {
                    Thread.Sleep(200);
                    mCapture.StopRecording();
                });
            }
            catch (Exception)
            {
                if (mWriter != null)
                {
                    mWriter.Dispose();
                    mWriter = null;
                }
                return false;
            }
            return true;
        }
    }
}
```
## Mixing the audio

Note that `MixingSampleProvider` requires all inputs to share the same sample rate and channel count:

```csharp
using NAudio.Wave;
using NAudio.Wave.SampleProviders;

public static void MixAudio(string microphonePath, string loudspeakerPath, string outPath)
{
    using (var reader1 = new AudioFileReader(microphonePath))
    using (var reader2 = new AudioFileReader(loudspeakerPath))
    {
        reader1.Volume = 1.0f;
        reader2.Volume = 0.6f;
        var mixer = new MixingSampleProvider(new[] { reader1, reader2 });
        WaveFileWriter.CreateWaveFile16(outPath, mixer);
    }
}
```
## Audio state and control

### Changing the system volume

```csharp
using System;
using System.Runtime.InteropServices;

[DllImport("user32.dll")]
static extern void keybd_event(byte bVk, byte bScan, UInt32 dwFlags, UInt32 dwExtraInfo);

[DllImport("user32.dll")]
static extern Byte MapVirtualKey(UInt32 uCode, UInt32 uMapType);

private const byte VK_VOLUME_MUTE = 0xAD;
private const byte VK_VOLUME_DOWN = 0xAE;
private const byte VK_VOLUME_UP = 0xAF;
private const UInt32 KEYEVENTF_EXTENDEDKEY = 0x0001;
private const UInt32 KEYEVENTF_KEYUP = 0x0002;

public void VolumeUp()
{
    keybd_event(VK_VOLUME_UP, MapVirtualKey(VK_VOLUME_UP, 0), KEYEVENTF_EXTENDEDKEY, 0);
    keybd_event(VK_VOLUME_UP, MapVirtualKey(VK_VOLUME_UP, 0), KEYEVENTF_EXTENDEDKEY | KEYEVENTF_KEYUP, 0);
}

public void VolumeDown()
{
    keybd_event(VK_VOLUME_DOWN, MapVirtualKey(VK_VOLUME_DOWN, 0), KEYEVENTF_EXTENDEDKEY, 0);
    keybd_event(VK_VOLUME_DOWN, MapVirtualKey(VK_VOLUME_DOWN, 0), KEYEVENTF_EXTENDEDKEY | KEYEVENTF_KEYUP, 0);
}

public void Mute()
{
    keybd_event(VK_VOLUME_MUTE, MapVirtualKey(VK_VOLUME_MUTE, 0), KEYEVENTF_EXTENDEDKEY, 0);
    keybd_event(VK_VOLUME_MUTE, MapVirtualKey(VK_VOLUME_MUTE, 0), KEYEVENTF_EXTENDEDKEY | KEYEVENTF_KEYUP, 0);
}
```
### Changing the application volume

Changes the application's own output volume without changing the system volume:

```csharp
using System.Runtime.InteropServices;

[DllImport("Winmm.dll")]
private static extern int waveOutSetVolume(int hwo, System.UInt32 pdwVolume);

[DllImport("Winmm.dll")]
private static extern uint waveOutGetVolume(int hwo, out System.UInt32 pdwVolume);

private int volumeMinScope = 0;
private int volumeMaxScope = 100;
private int volumeSize = 100;

public int VolumeSize
{
    get { return volumeSize; }
    set { volumeSize = value; }
}

public void SetCurrentVolume()
{
    if (volumeSize < 0)
    {
        volumeSize = 0;
    }
    if (volumeSize > 100)
    {
        volumeSize = 100;
    }
    // Map 0–100 onto the 16-bit range expected by waveOutSetVolume.
    System.UInt32 value = (System.UInt32)(0xffff * (double)volumeSize / (volumeMaxScope - volumeMinScope));
    if (value > 0xffff)
    {
        value = 0xffff;
    }
    System.UInt32 left = value;
    System.UInt32 right = value;
    // One channel per 16-bit word; both are set to the same value here.
    waveOutSetVolume(0, left << 16 | right);
}
```
### Setting the default audio device

There is currently no supported way to set the default audio device from code. Open the system sound settings and let the user do it:

```csharp
Process.Start("mmsys.cpl");
```
## Camera

### Camera list

### Camera preview

```csharp
private VideoWriter _videoWriter;

string docPath = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments);
var folderPath = Path.Combine(docPath, "Video");
if (!Directory.Exists(folderPath))
{
    Directory.CreateDirectory(folderPath);
}
string mp4Path = Path.Combine(folderPath, "out.avi");
double fps = 20;
Size videoSize = new Size(640, 480);

VideoCapture videoCapture = new VideoCapture(CaptureDevice.DShow, 0);
videoCapture.FrameWidth = videoSize.Width;
videoCapture.FrameHeight = videoSize.Height;
videoCapture.Fps = fps;
videoCapture.Open(0);

_videoWriter = new VideoWriter(mp4Path, FourCC.XVID, fps, videoSize);

new Thread(() =>
{
    using (var frame = new Mat())
    {
        while (true)
        {
            if (!videoCapture.Read(frame)) return;
            if (frame.Width != 640) return;
            lock (this)
            {
                _videoWriter?.Write(frame);
            }
            // Convert the frame to a BitmapImage and show it in the WPF Image control.
            Dispatcher.Invoke(() =>
            {
                var bitmap = BitmapConverter.ToBitmap(frame);
                BitmapImage bitmapImage = new BitmapImage();
                using (MemoryStream memory = new MemoryStream())
                {
                    bitmap.Save(memory, ImageFormat.Png);
                    memory.Position = 0;
                    bitmapImage.BeginInit();
                    bitmapImage.StreamSource = memory;
                    bitmapImage.CacheOption = BitmapCacheOption.OnLoad;
                    bitmapImage.EndInit();
                }
                this.MyImg.Source = bitmapImage;
                bitmap.Dispose();
            });
        }
    }
}).Start();
```
## Video recording

### Desktop recording

Helper class:

```csharp
using System;
using System.Drawing;
using System.Drawing.Imaging;
using System.Threading;
using System.Windows.Forms;
using OpenCvSharp;
using OpenCvSharp.Extensions;
using Size = OpenCvSharp.Size;

namespace z_recorder_opencv.Uitls
{
    public class ZRecordVideoHelper
    {
        public enum RecordState
        {
            Stop = 0,
            Start = 1,
            Pause = 2
        }

        private RecordState _state;
        private readonly int _fps;
        private readonly VideoWriter _videoWriter;

        public ZRecordVideoHelper(string filePath, int fps = 5)
        {
            _fps = fps;
            _state = RecordState.Pause;
            Size videoSize = new Size(Screen.PrimaryScreen.Bounds.Width, Screen.PrimaryScreen.Bounds.Height);
            try
            {
                lock (this)
                {
                    _videoWriter = new VideoWriter(filePath, FourCC.XVID, fps, videoSize);
                }
            }
            catch (Exception)
            {
            }
        }

        public bool StartRecordVideo()
        {
            _state = RecordState.Start;
            int frameSpace = 1000 / _fps;
            new Thread(() =>
            {
                var dateTime = DateTime.Now;
                while (_state != RecordState.Stop)
                {
                    // Throttle to the target frame rate.
                    var milliSec = DateTime.Now.Subtract(dateTime).TotalMilliseconds;
                    if (milliSec < frameSpace)
                    {
                        continue;
                    }
                    if (_state == RecordState.Start)
                    {
                        lock (this)
                        {
                            dateTime = DateTime.Now;
                            using (var screen = GetScreen())
                            using (var frame = screen.ToMat())
                            {
                                _videoWriter?.Write(frame);
                            }
                        }
                    }
                }
            }).Start();
            return true;
        }

        public void StopRecordVideo()
        {
            _state = RecordState.Stop;
            lock (this)
            {
                _videoWriter.Release();
                _videoWriter.Dispose();
            }
        }

        public void PauseRecordVideo()
        {
            _state = RecordState.Pause;
        }

        public void ResumeRecordVideo()
        {
            _state = RecordState.Start;
        }

        private const PixelFormat FORMAT = PixelFormat.Format24bppRgb;

        public static Bitmap GetScreen()
        {
            Bitmap screenshot = new Bitmap(
                Screen.PrimaryScreen.Bounds.Width,
                Screen.PrimaryScreen.Bounds.Height,
                FORMAT);
            using (Graphics gfx = Graphics.FromImage(screenshot))
            {
                gfx.CopyFromScreen(
                    Screen.PrimaryScreen.Bounds.X,
                    Screen.PrimaryScreen.Bounds.Y,
                    0,
                    0,
                    Screen.PrimaryScreen.Bounds.Size,
                    CopyPixelOperation.SourceCopy);
                // Mark the mouse position with a green dot.
                Brush brush = new SolidBrush(Color.LimeGreen);
                gfx.FillEllipse(brush, Cursor.Position.X - 10, Cursor.Position.Y - 10, 20, 20);
                return screenshot;
            }
        }
    }
}
```
Usage:

```csharp
IntPtr winHandle = new WindowInteropHelper(this).Handle;
var curScreen = Screen.FromHandle(winHandle);
int recordWidth = curScreen.Bounds.Width;
int recordHeight = curScreen.Bounds.Height;

// The helper always records the primary screen; only the output path (and
// optionally the frame rate) is needed.
ZRecordVideoHelper helper3 = new ZRecordVideoHelper(TempVideoPathName);

helper3.StartRecordVideo();
helper3.PauseRecordVideo();
helper3.ResumeRecordVideo();
helper3.StopRecordVideo();
```
### Writing video frames

```csharp
private VideoWriter _videoWriter;

_videoWriter = new VideoWriter(aviPath, FourCC.XVID, fps, videoSize);

_videoWriter?.Write(frame);
```
### Reading the desktop stream
### Capturing a UI component

```csharp
public static void SaveUI(FrameworkElement ui, string filePathName, int imgWidth, int imgHeight)
{
    Console.WriteLine("Screenshot path: " + filePathName);
    try
    {
        // 96 is the WPF default device-independent DPI.
        RenderTargetBitmap bmp = new RenderTargetBitmap(
            (int)ui.ActualWidth,
            (int)ui.ActualHeight,
            96,
            96,
            PixelFormats.Default);
        bmp.Render(ui);
        BitmapEncoder encoder = new PngBitmapEncoder();
        encoder.Frames.Add(BitmapFrame.Create(bmp));
        if (imgWidth > 0)
        {
            MemoryStream memoryStream = new MemoryStream();
            encoder.Save(memoryStream);
            Bitmap bit = new Bitmap(memoryStream, true);
            Bitmap img = new Bitmap(bit, imgWidth, imgHeight);
            img.Save(filePathName);
            img.Dispose();
            bit.Dispose();
            memoryStream.Dispose();
        }
        else
        {
            using (var stream = new FileStream(filePathName, FileMode.Create))
            {
                encoder.Save(stream);
            }
        }
    }
    catch (Exception e)
    {
        Console.WriteLine(e.Message);
    }
}

public static void SaveUI2(FrameworkElement frameworkElement, string filePathName)
{
    System.IO.FileStream fs = new System.IO.FileStream(filePathName, System.IO.FileMode.Create);
    RenderTargetBitmap bmp = new RenderTargetBitmap(
        (int)frameworkElement.ActualWidth,
        (int)frameworkElement.ActualHeight,
        96,
        96,
        PixelFormats.Default);
    bmp.Render(frameworkElement);
    BitmapEncoder encoder = new PngBitmapEncoder();
    encoder.Frames.Add(BitmapFrame.Create(bmp));
    encoder.Save(fs);
    fs.Close();
}

public static Bitmap SaveUI2Bitmap(FrameworkElement frameworkElement, int width, int height)
{
    using (MemoryStream outStream = new MemoryStream())
    {
        RenderTargetBitmap bmp = new RenderTargetBitmap(
            (int)frameworkElement.ActualWidth,
            (int)frameworkElement.ActualHeight,
            96,
            96,
            PixelFormats.Default);
        bmp.Render(frameworkElement);
        BitmapEncoder enc = new PngBitmapEncoder();
        enc.Frames.Add(BitmapFrame.Create(bmp));
        enc.Save(outStream);
        System.Drawing.Bitmap bitmap = new System.Drawing.Bitmap(outStream);
        return new Bitmap(bitmap, width, height);
    }
}
```
### Camera recording

Throttling recording by frame interval:

```csharp
private WaveFileWriter _audioWriter;
private VideoWriter _videoWriter;

private async void CameraRecord()
{
    string docPath = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments);
    var folderPath = Path.Combine(docPath, "Video");
    if (!Directory.Exists(folderPath))
    {
        Directory.CreateDirectory(folderPath);
    }
    string aviPath = Path.Combine(folderPath, "in_video.avi");
    double fps = 24;
    Size videoSize = new Size(640, 480);

    VideoCapture videoCapture = new VideoCapture(CaptureDevice.DShow, 0);
    videoCapture.FrameWidth = videoSize.Width;
    videoCapture.FrameHeight = videoSize.Height;
    videoCapture.Fps = fps;
    videoCapture.Open(0);

    _videoWriter = new VideoWriter(aviPath, FourCC.XVID, fps, videoSize);
    int frameSpace = (int)(1000 / fps);

    new Thread(() =>
    {
        var dateTime = DateTime.Now;
        using (var frame = new Mat())
        {
            while (true)
            {
                // Throttle to the target frame rate.
                var miniSec = DateTime.Now.Subtract(dateTime).Milliseconds;
                if (miniSec < frameSpace)
                {
                    continue;
                }
                if (!videoCapture.Read(frame))
                {
                    return;
                }
                if (frame.Width != 640)
                {
                    return;
                }
                lock (this)
                {
                    dateTime = DateTime.Now;
                    _videoWriter?.Write(frame);
                }
                Dispatcher.Invoke(() =>
                {
                    var bitmap = BitmapConverter.ToBitmap(frame);
                    BitmapImage bitmapImage = new BitmapImage();
                    using (MemoryStream memory = new MemoryStream())
                    {
                        bitmap.Save(memory, ImageFormat.Png);
                        memory.Position = 0;
                        bitmapImage.BeginInit();
                        bitmapImage.StreamSource = memory;
                        bitmapImage.CacheOption = BitmapCacheOption.OnLoad;
                        bitmapImage.EndInit();
                    }
                    this.MyImg.Source = bitmapImage;
                    bitmap.Dispose();
                });
            }
        }
    }).Start();

    string wavPath = Path.Combine(folderPath, "in_audio.wav");
    WaveFormat audioFormat = new WaveFormat(44100, 16, 2);
    _audioWriter = new WaveFileWriter(wavPath, audioFormat);
    WaveInEvent audioCapture = new WaveInEvent();
    audioCapture.WaveFormat = audioFormat;
    audioCapture.DataAvailable += (sender, e) => _audioWriter.Write(e.Buffer, 0, e.BytesRecorded);
    audioCapture.StartRecording();

    // Record for 20 seconds, then release everything.
    await Task.Delay(20 * 1000);
    videoCapture.Release();
    _videoWriter.Release();
    await Task.Delay(1 * 1000);

    string picPath = Path.Combine(folderPath, "test.jpg");
    ZOpencvUtils.GetVideoPic(aviPath, picPath);
    var duration = ZOpencvUtils.GetDuration(aviPath);
    Console.WriteLine($@"duration: {duration}");

    audioCapture.StopRecording();
    _audioWriter.Close();
    _audioWriter.Dispose();
}
```
### Decoupling frame capture from frame writing

```csharp
private WaveFileWriter _audioWriter;
private VideoWriter _videoWriter;
private readonly Queue<Mat> _videoQueue = new Queue<Mat>();
private RecordState _recordState;

enum RecordState
{
    Stop,
    Recording
}

private async void CameraRecord()
{
    string docPath = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments);
    var folderPath = Path.Combine(docPath, "Video");
    if (!Directory.Exists(folderPath))
    {
        Directory.CreateDirectory(folderPath);
    }
    string aviPath = Path.Combine(folderPath, "in_video.avi");
    double fps = 30;
    Size videoSize = new Size(640, 480);

    VideoCapture videoCapture = new VideoCapture(CaptureDevice.DShow, 0);
    videoCapture.FrameWidth = videoSize.Width;
    videoCapture.FrameHeight = videoSize.Height;
    videoCapture.Fps = fps;
    videoCapture.Open(0);

    _videoWriter = new VideoWriter(aviPath, FourCC.XVID, fps, videoSize);
    int frameSpace = (int)(1000 / fps);
    _recordState = RecordState.Recording;

    // Capture thread: read frames from the camera and queue them.
    new Thread(() =>
    {
        var dateTime = DateTime.Now;
        using (var frame = new Mat())
        {
            while (_recordState == RecordState.Recording)
            {
                var miniSec = DateTime.Now.Subtract(dateTime).Milliseconds;
                if (miniSec < frameSpace)
                {
                    continue;
                }
                if (!videoCapture.Read(frame))
                {
                    return;
                }
                if (frame.Width == 0)
                {
                    return;
                }
                lock (this)
                {
                    dateTime = DateTime.Now;
                    // Clone: Read reuses the same Mat, so the queue needs its own copy.
                    _videoQueue.Enqueue(frame.Clone());
                }
                Dispatcher.Invoke(() =>
                {
                    var bitmap = BitmapConverter.ToBitmap(frame);
                    BitmapImage bitmapImage = new BitmapImage();
                    using (MemoryStream memory = new MemoryStream())
                    {
                        bitmap.Save(memory, ImageFormat.Png);
                        memory.Position = 0;
                        bitmapImage.BeginInit();
                        bitmapImage.StreamSource = memory;
                        bitmapImage.CacheOption = BitmapCacheOption.OnLoad;
                        bitmapImage.EndInit();
                    }
                    this.MyImg.Source = bitmapImage;
                    bitmap.Dispose();
                });
            }
        }
    }).Start();

    // Writer thread: drain the queue and write frames to disk.
    new Thread(() =>
    {
        while (_recordState == RecordState.Recording)
        {
            Mat frame = null;
            lock (this)
            {
                if (_videoQueue.Count > 0)
                {
                    frame = _videoQueue.Dequeue();
                }
            }
            if (frame != null)
            {
                _videoWriter?.Write(frame);
                frame.Dispose();
            }
        }
    }).Start();

    string wavPath = Path.Combine(folderPath, "in_audio.wav");
    WaveFormat audioFormat = new WaveFormat(44100, 16, 2);
    _audioWriter = new WaveFileWriter(wavPath, audioFormat);
    WaveInEvent audioCapture = new WaveInEvent();
    audioCapture.WaveFormat = audioFormat;
    audioCapture.DataAvailable += (sender, e) =>
    {
        if (_recordState == RecordState.Recording)
        {
            _audioWriter.Write(e.Buffer, 0, e.BytesRecorded);
        }
    };
    audioCapture.StartRecording();

    // Record for 20 seconds, then release everything.
    await Task.Delay(20 * 1000);
    _recordState = RecordState.Stop;
    videoCapture.Release();
    _videoWriter.Release();
    await Task.Delay(1 * 1000);

    string picPath = Path.Combine(folderPath, "out_thumbnail.jpg");
    ZOpencvUtils.GetVideoPic(aviPath, picPath);
    var duration = ZOpencvUtils.GetDuration(aviPath);
    Console.WriteLine($@"duration: {duration}");

    audioCapture.StopRecording();
    _audioWriter.Close();
    _audioWriter.Dispose();
}
```
### Merging audio and video
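As discussed in the library section, the merge step is usually delegated to FFmpeg. A minimal sketch, assuming `ffmpeg` is available on the PATH and using the file names from the camera-recording example (`in_video.avi`, `in_audio.wav`); `-c:v copy` keeps the video stream as-is and `-c:a aac` re-encodes the audio:

```csharp
using System.Diagnostics;

// Hypothetical merge step: mux the recorded video and audio into one file via the FFmpeg CLI.
var psi = new ProcessStartInfo
{
    FileName = "ffmpeg",
    Arguments = "-y -i in_video.avi -i in_audio.wav -c:v copy -c:a aac out.mp4",
    UseShellExecute = false,
    CreateNoWindow = true
};
using (var process = Process.Start(psi))
{
    process.WaitForExit();
}
```

This requires shipping or installing FFmpeg separately, which is exactly the size trade-off the library section warns about.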
### Generating a thumbnail

```csharp
public static void GetVideoPic(string mp4Path, string picPath)
{
    using (var video = new VideoCapture())
    {
        Mat image = new Mat();
        video.Open(mp4Path);
        // Grab the first frame only.
        while (video.Read(image))
        {
            using (var bitmap = BitmapConverter.ToBitmap(image))
            {
                SaveAsJpeg(bitmap, picPath, 90);
            }
            break;
        }
    }
}

public static void SaveAsJpeg(Bitmap bitmap, string fileName, int quality)
{
    var jpegEncoder = ImageCodecInfo.GetImageEncoders().First(codec => codec.FormatID == ImageFormat.Jpeg.Guid);
    var encoderParameters = new EncoderParameters(1);
    encoderParameters.Param[0] = new EncoderParameter(Encoder.Quality, quality);
    bitmap.Save(fileName, jpegEncoder, encoderParameters);
}
```
### Getting the video duration

```csharp
public static int GetDuration(string mp4Path)
{
    try
    {
        using (var video = new VideoCapture())
        {
            video.Open(mp4Path);
            // Duration in seconds = frame count / frames per second.
            return (int)(video.FrameCount / video.Fps);
        }
    }
    catch (Exception)
    {
        return 0;
    }
}
```
### Opening the system sound settings

```csharp
Process.Start("mmsys.cpl");
```
### Playing with the local player

```csharp
Process pro = new Process
{
    StartInfo = new ProcessStartInfo(videoPath)
};
pro.Start();
```
## Audio parameter calculations

The relationship between sample rate and bit rate is:

bit rate = sample rate × bit depth × channel count

AudioBitRate = SampleRate × bits × channels

Here the sample rate is the number of samples captured per second, in Hz; the bit depth is the number of bits per sample, usually 16 or 24; the channel count is the number of channels in the signal, usually mono (1) or stereo (2).

For example, for stereo music with a 44100 Hz sample rate and 16-bit depth:

bit rate = 44100 × 16 × 2 = 1411200 bit/s ≈ 1.41 Mbps

With the same sample rate and bit depth but a single channel:

bit rate = 44100 × 16 × 1 = 705600 bit/s

Since 1 byte = 8 bits, the stereo example produces 1411200 / 8 = 176400 bytes per second.
Parameters used in the audio calculations:

```csharp
private readonly int _frameRate;
private readonly int _audioSampleRate = 44100;
private readonly int _audioBits = 16;
private readonly int _audioChannels = 1;
```
The audio bit rate:

```csharp
// bit rate = sample rate * bit depth * channel count
audioBitRate = _audioSampleRate * _audioBits * _audioChannels;
```

The audio frame size (bytes of audio per video frame, per channel):

```
audio frame size = bit rate × 1 s / 8 / channel count / frame rate
                 = sample rate × bit depth × channel count / 8 / channel count / frame rate
                 = sample rate × bit depth / 8 / frame rate
```

Here `bit rate × 1 s / 8` is the number of bytes produced in one second; the channel count cancels out.
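Plugging in the numbers from the examples above as a quick sanity check (this is standalone arithmetic, not part of the recorder code):

```csharp
using System;

class BitrateCheck
{
    static void Main()
    {
        int sampleRate = 44100, bits = 16;

        // Stereo and mono bit rates from the worked examples.
        int stereoBitRate = sampleRate * bits * 2;
        int monoBitRate = sampleRate * bits * 1;

        // Bytes produced per second of stereo audio.
        int bytesPerSecond = stereoBitRate / 8;

        Console.WriteLine($"{stereoBitRate} {monoBitRate} {bytesPerSecond}");
        // → 1411200 705600 176400
    }
}
```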
## Measuring elapsed time

```csharp
Stopwatch sw = Stopwatch.StartNew();
// ... code being measured ...
sw.Stop();
// ElapsedMilliseconds is the total; Elapsed.Milliseconds would wrap at 1000.
Console.WriteLine($@"sw.ElapsedMilliseconds: {sw.ElapsedMilliseconds}");
```