Video Recording in Silverlight

Abstract: The previous two Silverlight articles covered the basics of Silverlight, working with the camera and microphone, taking snapshots, and recording audio. At the end of that discussion I also briefly explained why video recording was left out. Today we tackle that remaining question: how to record video with Silverlight.

Contents:

1. An overview of the NESL project

2. Implementing video recording with NESL

3. Notes

I. An Overview of the NESL Project

How do you record video in Silverlight? Plenty of people have searched for an answer to this question, but so far no good solution seems to have surfaced, and the root cause is video encoding. Some suggest simply taking screenshots: capture enough frames per second and play them back in sequence to form a video. But I have seen a developer abroad try exactly this, and a recording of just a few tens of seconds produced a file in the hundreds of megabytes, even after optimization. So for now this approach is not a practical way to record video. Is there a better way? Yes, with one restriction: use NESL.

Native Extensions for Silverlight (NESL) is developed by Microsoft's Silverlight team, mainly to extend what Silverlight out-of-browser (OOB) applications can do. As you know, although Silverlight 4 OOB applications support elevated-trust mode and can access COM components, the vast majority of Windows APIs remain out of reach; NESL exists to close that gap. The latest release, NESL 2.0, bundles a number of useful features, including the video encoding support we need today. One of its libraries, Microsoft.Silverlight.Windows.LocalEncode.dll, handles local video and audio encoding, and it is this library that solves the video recording problem described above.

II. Implementing Video Recording with NESL

The core class in Microsoft.Silverlight.Windows.LocalEncode.dll is EncodeSession, which is responsible for encoding and writing the audio and video output. Recording video with EncodeSession boils down to two steps:

1. Prepare the input and output information

In this step you populate VideoInputFormatInfo, AudioInputFormatInfo, VideoOutputFormatInfo, AudioOutputFormatInfo, and OutputContainerInfo, then call EncodeSession.Prepare().

2. Capture and encode the output

Once the input and output information is ready, call EncodeSession.Start() to begin encoding. To receive the audio and video data you must also provide two sink classes, derived from AudioSink and VideoSink respectively. Each sink is attached to the CaptureSource, and in its OnSample/OnSamples override it calls EncodeSession's WriteVideoSample() or WriteAudioSample() to hand the captured data to the encoder. (AudioSink was covered in an earlier article; VideoSink works the same way.)
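Stripped of any wrapper class, the two steps amount to the following bare EncodeSession flow. This is only a sketch: it assumes the NESL 2.0 LocalEncode assembly is referenced, the app runs out-of-browser with elevated trust, and the five format-info objects have already been populated.

```csharp
// Sketch only: format-info variables are assumed to be filled in already.
var session = new EncodeSession();

// Step 1: describe the input (raw ARGB32 video + PCM audio from the sinks)
// and the output (H.264 video + AAC audio in an MP4 container).
session.Prepare(videoInputFormatInfo, audioInputFormatInfo,
                videoOutputFormatInfo, audioOutputFormatInfo,
                outputContainerInfo);

// Step 2: start encoding; from here on the sinks push samples in via
// WriteVideoSample()/WriteAudioSample() from their OnSample overrides.
session.Start(false, 200);

// ... later:
session.Pause();     // optional
session.Shutdown();  // finish encoding and flush the output file
```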

With EncodeSession's usage understood, we can wrap these operations in a simple class. LocalCamera.cs is the core class of this example:

using System;
using System.Collections.ObjectModel;
using System.IO;
using System.Windows;
using System.Windows.Threading;
using System.Windows.Media;
using System.Windows.Controls;
using System.Windows.Shapes;
using Microsoft.Silverlight.Windows.LocalEncode;

namespace Cmj.MyWeb.MySilverlight.SilverlightMeida
{
    /// <summary>
    /// Encoding session state
    /// </summary>
    public enum EncodeSessionState
    {
        Start,
        Pause,
        Stop
    }

    /// <summary>
    /// Local camera wrapper
    /// </summary>
    public class LocalCamera
    {
        private string _saveFullPath = "";
        private uint _videoWidth = 640;
        private uint _videoHeight = 480;
        private VideoSinkExtensions _videoSink = null;
        private AudioSinkExtensions _audioSink = null;
        private EncodeSession _encodeSession = null;
        private UserControl _page = null;
        private CaptureSource _cSource = null;

        public LocalCamera(UserControl page, VideoFormat videoFormat, AudioFormat audioFormat)
            : this(page, DefaultVideoDevice, DefaultAudioDevice, videoFormat, audioFormat)
        {
        }

        public LocalCamera(UserControl page, VideoCaptureDevice videoCaptureDevice, AudioCaptureDevice audioCaptureDevice, VideoFormat videoFormat, AudioFormat audioFormat)
        {
            this._videoWidth = (uint)videoFormat.PixelWidth;
            this._videoHeight = (uint)videoFormat.PixelHeight;
            this._page = page;
            this.SessionState = EncodeSessionState.Stop;
            _cSource = new CaptureSource();
            this.VideoDevice = videoCaptureDevice;
            this.VideoDevice.DesiredFormat = videoFormat;
            this.AudioDevice = audioCaptureDevice;
            this.AudioDevice.DesiredFormat = audioFormat;
            _cSource.VideoCaptureDevice = this.VideoDevice;
            _cSource.AudioCaptureDevice = this.AudioDevice;
            // Input: raw ARGB32 frames and PCM audio from the sinks;
            // output: H.264 + AAC in an MP4 container.
            audioInputFormatInfo = new AudioInputFormatInfo() { SourceCompressionType = FormatConstants.AudioFormat_PCM };
            videoInputFormatInfo = new VideoInputFormatInfo() { SourceCompressionType = FormatConstants.VideoFormat_ARGB32 };
            audioOutputFormatInfo = new AudioOutputFormatInfo() { TargetCompressionType = FormatConstants.AudioFormat_AAC };
            videoOutputFormatInfo = new VideoOutputFormatInfo() { TargetCompressionType = FormatConstants.VideoFormat_H264 };
            outputContainerInfo = new OutputContainerInfo() { ContainerType = FormatConstants.TranscodeContainerType_MPEG4 };
        }

        public EncodeSessionState SessionState { get; set; }

        public EncodeSession Session
        {
            get { return _encodeSession; }
            set { _encodeSession = value; }
        }

        /// <summary>
        /// The user control that hosts this recorder
        /// </summary>
        public UserControl OwnPage
        {
            get { return _page; }
            set { _page = value; }
        }

        /// <summary>
        /// Capture source
        /// </summary>
        public CaptureSource Source
        {
            get { return _cSource; }
        }

        /// <summary>
        /// The audio sink in use
        /// </summary>
        public AudioSinkExtensions AudioSink
        {
            get { return _audioSink; }
        }

        public static VideoCaptureDevice DefaultVideoDevice
        {
            get { return CaptureDeviceConfiguration.GetDefaultVideoCaptureDevice(); }
        }

        public static ReadOnlyCollection<VideoCaptureDevice> AvailableVideoDevice
        {
            get { return CaptureDeviceConfiguration.GetAvailableVideoCaptureDevices(); }
        }

        public VideoCaptureDevice VideoDevice { get; set; }

        public static AudioCaptureDevice DefaultAudioDevice
        {
            get { return CaptureDeviceConfiguration.GetDefaultAudioCaptureDevice(); }
        }

        public static ReadOnlyCollection<AudioCaptureDevice> AvailableAudioDevice
        {
            get { return CaptureDeviceConfiguration.GetAvailableAudioCaptureDevices(); }
        }

        public AudioCaptureDevice AudioDevice { get; set; }

        private Object lockObj = new object();
        internal VideoInputFormatInfo videoInputFormatInfo;
        internal AudioInputFormatInfo audioInputFormatInfo;
        internal VideoOutputFormatInfo videoOutputFormatInfo;
        internal AudioOutputFormatInfo audioOutputFormatInfo;
        internal OutputContainerInfo outputContainerInfo;

        /// <summary>
        /// Start recording
        /// </summary>
        public void StartRecord()
        {
            lock (lockObj)
            {
                if (this.SessionState == EncodeSessionState.Stop)
                {
                    _videoSink = new VideoSinkExtensions(this);
                    _audioSink = new AudioSinkExtensions(this);
                    if (_encodeSession == null)
                    {
                        _encodeSession = new EncodeSession();
                    }
                    PrepareFormatInfo(_cSource.VideoCaptureDevice.DesiredFormat, _cSource.AudioCaptureDevice.DesiredFormat);
                    _encodeSession.Prepare(videoInputFormatInfo, audioInputFormatInfo, videoOutputFormatInfo, audioOutputFormatInfo, outputContainerInfo);
                    _encodeSession.Start(false, 200);
                    this.SessionState = EncodeSessionState.Start;
                }
            }
        }

        /// <summary>
        /// Pause recording
        /// </summary>
        public void PauseRecord()
        {
            lock (lockObj)
            {
                this.SessionState = EncodeSessionState.Pause;
                _encodeSession.Pause();
            }
        }

        /// <summary>
        /// Stop recording
        /// </summary>
        public void StopRecord()
        {
            lock (lockObj)
            {
                this.SessionState = EncodeSessionState.Stop;
                _encodeSession.Shutdown();
                _videoSink = null;
                _audioSink = null;
            }
        }

        /// <summary>
        /// Prepare the encoding format information
        /// </summary>
        /// <param name="videoFormat"></param>
        /// <param name="audioFormat"></param>
        private void PrepareFormatInfo(VideoFormat videoFormat, AudioFormat audioFormat)
        {
            uint FrameRateRatioNumerator = 0;
            uint FrameRateRationDenominator = 0;
            FormatConstants.FrameRateToRatio((float)Math.Round(videoFormat.FramesPerSecond, 2), ref FrameRateRatioNumerator, ref FrameRateRationDenominator);

            videoInputFormatInfo.FrameRateRatioNumerator = FrameRateRatioNumerator;
            videoInputFormatInfo.FrameRateRatioDenominator = FrameRateRationDenominator;
            videoInputFormatInfo.FrameWidthInPixels = _videoWidth;
            videoInputFormatInfo.FrameHeightInPixels = _videoHeight;
            // Negative stride: the ARGB32 frames are stored bottom-up
            videoInputFormatInfo.Stride = (int)_videoWidth * -4;

            videoOutputFormatInfo.FrameRateRatioNumerator = FrameRateRatioNumerator;
            videoOutputFormatInfo.FrameRateRatioDenominator = FrameRateRationDenominator;
            videoOutputFormatInfo.FrameWidthInPixels = videoOutputFormatInfo.FrameWidthInPixels == 0 ? (uint)videoFormat.PixelWidth : videoOutputFormatInfo.FrameWidthInPixels;
            videoOutputFormatInfo.FrameHeightInPixels = videoOutputFormatInfo.FrameHeightInPixels == 0 ? (uint)videoFormat.PixelHeight : videoOutputFormatInfo.FrameHeightInPixels;

            audioInputFormatInfo.BitsPerSample = (uint)audioFormat.BitsPerSample;
            audioInputFormatInfo.SamplesPerSecond = (uint)audioFormat.SamplesPerSecond;
            audioInputFormatInfo.ChannelCount = (uint)audioFormat.Channels;

            if (string.IsNullOrEmpty(outputContainerInfo.FilePath))
            {
                _saveFullPath = System.IO.Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.MyVideos), "cCameraRecordVideo.tmp");
            }
            outputContainerInfo.FilePath = _saveFullPath;

            if (audioOutputFormatInfo.AverageBitrate == 0)
                audioOutputFormatInfo.AverageBitrate = 24000;
            if (videoOutputFormatInfo.AverageBitrate == 0)
                videoOutputFormatInfo.AverageBitrate = 2000000;
        }

        /// <summary>
        /// Start capturing (note the original spelling "Captrue" is kept)
        /// </summary>
        public void StartCaptrue()
        {
            if (CaptureDeviceConfiguration.AllowedDeviceAccess || CaptureDeviceConfiguration.RequestDeviceAccess())
            {
                _cSource.Start();
            }
        }

        /// <summary>
        /// Stop capturing
        /// </summary>
        public void StopCapture()
        {
            _videoSink = null;
            _audioSink = null;
            _cSource.Stop();
        }

        /// <summary>
        /// Get a brush showing the live video
        /// </summary>
        /// <returns></returns>
        public VideoBrush GetVideoBrush()
        {
            VideoBrush vBrush = new VideoBrush();
            vBrush.SetSource(_cSource);
            return vBrush;
        }

        /// <summary>
        /// Get a rectangle filled with the live video
        /// </summary>
        /// <returns></returns>
        public Rectangle GetVideoRectangle()
        {
            Rectangle rctg = new Rectangle();
            rctg.Width = this._videoWidth;
            rctg.Height = this._videoHeight;
            rctg.Fill = GetVideoBrush();
            return rctg;
        }

        /// <summary>
        /// Save the recorded video
        /// </summary>
        public void SaveRecord()
        {
            if (_saveFullPath == string.Empty)
            {
                MessageBox.Show("No video has been recorded yet, nothing to save!", "Notice", MessageBoxButton.OK);
                return;
            }
            SaveFileDialog sfd = new SaveFileDialog
            {
                Filter = "MP4 Files (*.mp4)|*.mp4",
                DefaultExt = ".mp4",
                FilterIndex = 1
            };

            if ((bool)sfd.ShowDialog())
            {
                using (Stream stm = sfd.OpenFile())
                {
                    try
                    {
                        // Copy the temporary recording to the chosen file,
                        // then delete the temporary file.
                        using (FileStream fs = new FileStream(_saveFullPath, FileMode.Open, FileAccess.Read))
                        {
                            byte[] buffer = new byte[fs.Length];
                            fs.Read(buffer, 0, (int)fs.Length);
                            stm.Write(buffer, 0, buffer.Length);
                        }
                        File.Delete(_saveFullPath);
                    }
                    catch (IOException ioe)
                    {
                        MessageBox.Show("Failed to save the file. Error:" + Environment.NewLine + ioe.Message, "Notice", MessageBoxButton.OK);
                    }
                }
            }
        }
    }
}

And, as noted above, the two sinks are required:

 

using System;
using System.Windows.Media;
using System.Windows.Controls;
using Microsoft.Silverlight.Windows.LocalEncode;

namespace Cmj.MyWeb.MySilverlight.SilverlightMeida
{
    public class VideoSinkExtensions : VideoSink
    {
        private LocalCamera _localCamera;

        public VideoSinkExtensions(LocalCamera localCamera)
        {
            this._localCamera = localCamera;
            this.CaptureSource = _localCamera.Source;
        }

        protected override void OnCaptureStarted()
        {
        }

        protected override void OnCaptureStopped()
        {
        }

        protected override void OnFormatChange(VideoFormat videoFormat)
        {
        }

        protected override void OnSample(long sampleTimeInHundredNanoseconds, long frameDurationInHundredNanoseconds, byte[] sampleData)
        {
            // Only feed the encoder while a session is actually running
            if (_localCamera.SessionState == EncodeSessionState.Start)
            {
                _localCamera.OwnPage.Dispatcher.BeginInvoke(new Action<long, long, byte[]>((ts, dur, data) =>
                {
                    _localCamera.Session.WriteVideoSample(data, data.Length, ts, dur);
                }), sampleTimeInHundredNanoseconds, frameDurationInHundredNanoseconds, sampleData);
            }
        }
    }
}

  

using System;
using System.Windows.Media;
using System.Windows.Controls;
using Microsoft.Silverlight.Windows.LocalEncode;

namespace Cmj.MyWeb.MySilverlight.SilverlightMeida
{
    public class AudioSinkExtensions : AudioSink
    {
        private LocalCamera _localCamera;

        public AudioSinkExtensions(LocalCamera localCamera)
        {
            this._localCamera = localCamera;
            this.CaptureSource = _localCamera.Source;
        }

        protected override void OnCaptureStarted()
        {
        }

        protected override void OnCaptureStopped()
        {
        }

        protected override void OnFormatChange(AudioFormat audioFormat)
        {
        }

        protected override void OnSamples(long sampleTimeInHundredNanoseconds, long sampleDurationInHundredNanoseconds, byte[] sampleData)
        {
            // Only feed the encoder while a session is actually running
            if (_localCamera.SessionState == EncodeSessionState.Start)
            {
                _localCamera.OwnPage.Dispatcher.BeginInvoke(new Action<long, long, byte[]>((ts, dur, data) =>
                {
                    _localCamera.Session.WriteAudioSample(data, data.Length, ts, dur);
                }), sampleTimeInHundredNanoseconds, sampleDurationInHundredNanoseconds, sampleData);
            }
        }
    }
}

With these three classes in place, we can put together a simple UI and use LocalCamera to record video.

Pay particular attention to the save operation. In EncodeSession the output file path must be specified before recording starts (which is easy to understand: a long recording produces a very large file, and buffering it in memory until save time is not realistic). So the "save" method wrapped in LocalCamera is really just a file read followed by a delete. This example also uses the custom OOB control from an earlier article; see that article if anything is unclear. The full calling code is included in the source download at the end of this post.

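A page might drive LocalCamera roughly as follows. This is a hypothetical sketch: the control names (videoArea, the three buttons) and the choice of the first supported format are my own assumptions; only public members of LocalCamera shown above are used.

```csharp
// Hypothetical page code-behind; assumes a Canvas named "videoArea" and
// three buttons wired up in XAML, and an OOB app with elevated trust.
public partial class RecordPage : UserControl
{
    private LocalCamera _camera;

    public RecordPage()
    {
        InitializeComponent();
        _camera = new LocalCamera(this,
            LocalCamera.DefaultVideoDevice.SupportedFormats[0],
            LocalCamera.DefaultAudioDevice.SupportedFormats[0]);
        videoArea.Children.Add(_camera.GetVideoRectangle()); // live preview
        _camera.StartCaptrue(); // note: spelled "Captrue" in LocalCamera
    }

    private void btnStart_Click(object sender, RoutedEventArgs e) { _camera.StartRecord(); }
    private void btnStop_Click(object sender, RoutedEventArgs e)  { _camera.StopRecord(); }
    private void btnSave_Click(object sender, RoutedEventArgs e)  { _camera.SaveRecord(); }
}
```

SaveRecord() shows a SaveFileDialog, so it must be called from a user-initiated event handler such as a button click, as Silverlight requires.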

OK, here are screenshots of the recorder in action:

[Screenshot: recording in progress]

[Screenshot: saving after stopping the recording]

[Screenshot: playing back the recorded video]

III. Notes

1. The video sink and audio sink each run on their own thread, separate from the UI thread; use the UI Dispatcher or a SynchronizationContext to marshal calls between threads.

2. The OnSample/OnSamples overrides must check the session state, because the sinks start receiving samples as soon as they are created, while the EncodeSession has not yet been started; without the state check a COM exception is thrown.

3. The video width and height cannot be chosen arbitrarily; the NESL help documentation calls this out explicitly, and an invalid size likewise throws an exception.

4. Finally, a reminder: the video recording above is built on NESL, so the application must run out of the browser (OOB).
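Regarding note 1, an alternative to the Dispatcher.BeginInvoke calls used in the sinks is to capture the UI thread's SynchronizationContext once and post to it from the capture thread. A sketch, using the same field names as VideoSinkExtensions above; the _uiContext field is my own addition and must be initialized while on the UI thread (e.g. in the constructor):

```csharp
// Capture the UI context on the UI thread (e.g. in the sink's constructor):
//     _uiContext = SynchronizationContext.Current;
private SynchronizationContext _uiContext;

protected override void OnSample(long sampleTimeInHundredNanoseconds,
    long frameDurationInHundredNanoseconds, byte[] sampleData)
{
    // OnSample runs on the capture thread; post the work to the UI thread.
    if (_localCamera.SessionState == EncodeSessionState.Start)
    {
        _uiContext.Post(_ =>
            _localCamera.Session.WriteVideoSample(sampleData, sampleData.Length,
                sampleTimeInHundredNanoseconds, frameDurationInHundredNanoseconds),
            null);
    }
}
```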

Download the source code

Posted: 2024-11-04 05:54:09
