
NetCore: RTSP Video over WebSocket for Real-Time Camera Viewing in the Browser

Posted: 2024-03-07 10:46:31


I recently needed to implement this for work. Most of the solutions I found online convert the stream to video files and save them; options for real-time viewing were scarce. After a lot of trial and error I got it working on Windows. The core idea: use FFmpeg.AutoGen to capture frames from the RTSP stream, convert each frame to a Bitmap, and push it to the front end over WebSocket.

Part of the code is adapted from a WPF project on GitHub: github.com/cuiguangzhen/FFmpegPlayRtsp

Core code:

Step 1:

First, register the FFmpeg environment. FFmpeg 3.4.0.5 is used here. Create an FFmpeg folder under the project and place the FFmpeg files and DLLs in it, following the layout of the GitHub project referenced above.

public class FFmpegBinariesHelper
    {
        private const string LD_LIBRARY_PATH = "LD_LIBRARY_PATH";

        internal static void RegisterFFmpegBinaries()
        {
            switch (Environment.OSVersion.Platform)
            {
                case PlatformID.Win32NT:
                case PlatformID.Win32S:
                case PlatformID.Win32Windows:
                    var current = Environment.CurrentDirectory;
                    var probe = $"FFmpeg/bin/{(Environment.Is64BitProcess ? @"x64" : @"x86")}";
                    while (current != null)
                    {
                        var ffmpegDirectory = Path.Combine(current, probe);
                        if (Directory.Exists(ffmpegDirectory))
                        {
                            //Console.WriteLine($"FFmpeg binaries found in: {ffmpegDirectory}");
                            RegisterLibrariesSearchPath(ffmpegDirectory);
                            return;
                        }
                        current = Directory.GetParent(current)?.FullName;
                    }
                    break;
                case PlatformID.Unix:
                case PlatformID.MacOSX:
                    var libraryPath = Environment.GetEnvironmentVariable(LD_LIBRARY_PATH);
                    RegisterLibrariesSearchPath(libraryPath);
                    break;
            }
        }
        private static void RegisterLibrariesSearchPath(string path)
        {
            switch (Environment.OSVersion.Platform)
            {
                case PlatformID.Win32NT:
                case PlatformID.Win32S:
                case PlatformID.Win32Windows:
                    SetDllDirectory(path);
                    break;
                case PlatformID.Unix:
                case PlatformID.MacOSX:
                    string currentValue = Environment.GetEnvironmentVariable(LD_LIBRARY_PATH);
                    if (string.IsNullOrWhiteSpace(currentValue))
                    {
                        // LD_LIBRARY_PATH not set yet: start it with our path
                        Environment.SetEnvironmentVariable(LD_LIBRARY_PATH, path);
                    }
                    else if (currentValue.Contains(path) == false)
                    {
                        string newValue = currentValue + Path.PathSeparator + path;
                        Environment.SetEnvironmentVariable(LD_LIBRARY_PATH, newValue);
                    }
                    break;
            }
        }

        [DllImport("kernel32", SetLastError = true)]
        private static extern bool SetDllDirectory(string lpPathName);
    }

Since the reference project is WPF, Linux environment initialization is not implemented here yet.
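If Linux support is needed later, newer FFmpeg.AutoGen releases (4.x and up) expose a `ffmpeg.RootPath` property that replaces the `SetDllDirectory` dance entirely. A minimal sketch, assuming such a version; the paths are illustrative, not from the original project:

```csharp
using System;
using System.Runtime.InteropServices;
using FFmpeg.AutoGen;

public static class CrossPlatformFFmpeg
{
    public static void Register()
    {
        if (RuntimeInformation.IsOSPlatform(OSPlatform.Windows))
        {
            // Point the bindings at the bundled native DLLs
            ffmpeg.RootPath = AppContext.BaseDirectory + "FFmpeg/bin/" +
                              (Environment.Is64BitProcess ? "x64" : "x86");
        }
        else
        {
            // On Linux the shared objects are typically installed system-wide
            // (e.g. via the distro's ffmpeg packages), so an empty RootPath
            // lets the loader resolve them from the default search path.
            ffmpeg.RootPath = "";
        }
    }
}
```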

Step 2:

Implement an FFmpeg helper class. It handles decoding and converting the RTSP stream via FFmpeg.AutoGen.

public unsafe class FFmpegHelp
    {
        public FFmpegHelp()
        {

        }
        public AVFormatContext* pFc;
        public unsafe void Register()
        {
            // Locate and register the FFmpeg DLL directory
            FFmpegBinariesHelper.RegisterFFmpegBinaries();
            #region ffmpeg initialization
            // Register the FFmpeg codecs and initialize networking
            ffmpeg.av_register_all();
            ffmpeg.avcodec_register_all();
            ffmpeg.avformat_network_init();
            #endregion
            // Allocate the audio/video format context
            pFc = ffmpeg.avformat_alloc_context();
        }
        /// <summary>
        /// Delegate used to hand each decoded frame to the caller
        /// </summary>
        /// <param name="bitmap"></param>
        public delegate void ShowBitmap(Bitmap bitmap);

        /// <summary>
        /// Controls the decode loop; set to false to stop playback
        /// </summary>
        public bool CanRun;

        /// <summary>
        /// Decodes and converts packets read from the H.264 stream
        /// </summary>
        /// <param name="show">Callback invoked for each decoded frame</param>
        /// <param name="url">Playback URL; a local file path also works</param>
        /// <param name="strResult">Set to an error message when the stream fails</param>
        public unsafe void Start(ShowBitmap show, string url, out string strResult)
        {
            strResult = "";
            #region ffmpeg conversion setup
            int error;
            AVDictionary* options = null;
            ffmpeg.av_dict_set(&options, "stimeout", "3000000", 0);   // socket timeout: 3 s (in microseconds)
            ffmpeg.av_dict_set(&options, "rtsp_transport", "tcp", 0); // use TCP for RTSP to avoid packet loss
            AVFormatContext* pFormatContext = pFc;
            // Open the input stream
            error = ffmpeg.avformat_open_input(&pFormatContext, url, null, &options);
            if (error != 0)
            {
                strResult = "Network connection failed";
                return;
            }
            CanRun = true;
            // Read the stream information
            error = ffmpeg.avformat_find_stream_info(pFormatContext, null);
            if (error != 0)
            {
                strResult = "Network connection failed";
                return;
            }

            AVStream* pStream = null;
            for (var i = 0; i < pFormatContext->nb_streams; i++)
            {
                // Pick the first video stream; audio is ignored in this demo
                if (pFormatContext->streams[i]->codec->codec_type == AVMediaType.AVMEDIA_TYPE_VIDEO)
                {
                    pStream = pFormatContext->streams[i];
                    break;
                }
            }
            if (pStream == null) throw new ApplicationException(@"Could not find video stream.");

            // Codec context for the selected stream
            var codecContext = *pStream->codec;
            // Frame width, height, and pixel format
            var width = codecContext.width;
            var height = codecContext.height;
            var sourcePixFmt = codecContext.pix_fmt;

            // Codec ID of the stream
            var codecId = codecContext.codec_id;
            // Target pixel format for the converted frames
            var destinationPixFmt = AVPixelFormat.AV_PIX_FMT_BGR24;

            // Some H.264 streams report AV_PIX_FMT_NONE; assume YUV420P in that case
            if (sourcePixFmt == AVPixelFormat.AV_PIX_FMT_NONE && codecId == AVCodecID.AV_CODEC_ID_H264)
            {
                sourcePixFmt = AVPixelFormat.AV_PIX_FMT_YUV420P;
            }

            // SwsContext handles image scaling and pixel-format conversion
            var pConvertContext = ffmpeg.sws_getContext(width, height, sourcePixFmt,
                width, height, destinationPixFmt,
                ffmpeg.SWS_FAST_BILINEAR, null, null, null);
            if (pConvertContext == null) throw new ApplicationException(@"Could not initialize the conversion context.");

            // Allocate a frame for the converted output
            var pConvertedFrame = ffmpeg.av_frame_alloc();
            // Buffer size required by the target format
            var convertedFrameBufferSize = ffmpeg.av_image_get_buffer_size(destinationPixFmt, width, height, 1);
            // Allocate unmanaged memory for the converted frame
            var convertedFrameBufferPtr = Marshal.AllocHGlobal(convertedFrameBufferSize);
            var dstData = new byte_ptrArray4();
            var dstLinesize = new int_array4();
            // Wire the destination buffer into the data/linesize arrays
            ffmpeg.av_image_fill_arrays(ref dstData, ref dstLinesize, (byte*)convertedFrameBufferPtr, destinationPixFmt, width, height, 1);

            #endregion

            #region ffmpeg decoding
            // Find the decoder for the stream's codec ID
            var pCodec = ffmpeg.avcodec_find_decoder(codecId);
            if (pCodec == null) throw new ApplicationException(@"Unsupported codec.");

            var pCodecContext = &codecContext;

            if ((pCodec->capabilities & ffmpeg.AV_CODEC_CAP_TRUNCATED) == ffmpeg.AV_CODEC_CAP_TRUNCATED)
                pCodecContext->flags |= ffmpeg.AV_CODEC_FLAG_TRUNCATED;

            // Open the decoder context: AVCodecContext pCodecContext
            error = ffmpeg.avcodec_open2(pCodecContext, pCodec, null);
            if (error < 0)
            {
                CanRun = false;
                throw new ApplicationException(GetErrorMessage(error));
            }
            // Frame that receives decoded data: AVFrame pDecodedFrame
            var pDecodedFrame = ffmpeg.av_frame_alloc();

            // Initialize the packet used for reading
            var packet = new AVPacket();
            var pPacket = &packet;
            ffmpeg.av_init_packet(pPacket);

            while (CanRun)
            {
                try
                {
                    do
                    {
                        // Read one undecoded packet
                        error = ffmpeg.av_read_frame(pFormatContext, pPacket);
                        if (error == ffmpeg.AVERROR_EOF) break;
                        if (error < 0)
                        {
                            throw new ApplicationException(GetErrorMessage(error));
                        }
                        if (pPacket->stream_index != pStream->index)
                            continue;

                        // Send the packet to the decoder
                        error = ffmpeg.avcodec_send_packet(pCodecContext, pPacket);
                        if (error < 0) throw new ApplicationException(GetErrorMessage(error));
                        // Receive a decoded frame
                        error = ffmpeg.avcodec_receive_frame(pCodecContext, pDecodedFrame);
                    }
                    while (error == ffmpeg.AVERROR(ffmpeg.EAGAIN) && CanRun);
                    // End of stream or decode error: stop the loop
                    if (error == ffmpeg.AVERROR_EOF || error < 0)
                    {
                        strResult = "Network connection failed";
                        break;
                    }

                    if (pPacket->stream_index != pStream->index)
                        continue;

                    // Convert YUV -> BGR24
                    ffmpeg.sws_scale(pConvertContext, pDecodedFrame->data, pDecodedFrame->linesize, 0, height, dstData, dstLinesize);
                }
                finally
                {
                    ffmpeg.av_packet_unref(pPacket);      // release the packet reference
                    ffmpeg.av_frame_unref(pDecodedFrame); // release the decoded frame reference
                }
                // Wrap the converted buffer in a Bitmap
                var bitmap = new Bitmap(width, height, dstLinesize[0], PixelFormat.Format24bppRgb, convertedFrameBufferPtr);
                // Hand the frame to the caller
                show(bitmap);
            }
            // Clear the displayed image when playback ends
            show(null);
            #endregion

            #region release resources
            Marshal.FreeHGlobal(convertedFrameBufferPtr);
            ffmpeg.av_free(pConvertedFrame);
            ffmpeg.sws_freeContext(pConvertContext);

            ffmpeg.av_free(pDecodedFrame);
            ffmpeg.avcodec_close(pCodecContext);
            ffmpeg.avformat_close_input(&pFormatContext);
            #endregion
        }

        /// <summary>
        /// Converts an FFmpeg error code into a readable message
        /// </summary>
        /// <param name="error"></param>
        /// <returns></returns>
        private static unsafe string GetErrorMessage(int error)
        {
            var bufferSize = 1024;
            var buffer = stackalloc byte[bufferSize];
            ffmpeg.av_strerror(error, buffer, (ulong)bufferSize);
            var message = Marshal.PtrToStringAnsi((IntPtr)buffer);
            return message;
        }

        public void Stop()
        {
            CanRun = false;
        }
    }
public class ClassHelp
    {
        public ClassHelp() { }
        private static ClassHelp _instance = null;
        public static ClassHelp Instance => _instance ?? (_instance = new ClassHelp());

        public FFmpegHelp FFmpegHelp { get; set; }

        public bool IsNetwork { get; set; } = true;

        public bool IsAlert { get; set; } = true;
    }

Step 3:

Next, write a custom WebSocket middleware that handles front-end requests and pushes frames.

public class WebSocketHandlerMiddleware
{
    private readonly RequestDelegate _next;
        
    public WebSocketHandlerMiddleware(RequestDelegate next)
    {
        _next = next;
    }

    public async Task InvokeAsync(HttpContext context)
    {
        try
        {
           if (context.Request.Path == "/ws")
           {
               if (context.WebSockets.IsWebSocketRequest)
               {
                   WebSocket socket = await context.WebSockets.AcceptWebSocketAsync();
                   string clientId = Guid.NewGuid().ToString();
                   var wsClient = new WebSocketClient{ Id = clientId, WebSocket = socket };
                   Console.WriteLine($"User:{clientId} Connection Websocket Success,Wait For Handle");
                   await HandleAsync(wsClient);
               }
               else
               {
                   context.Response.StatusCode = 404;
               }
           }
           else
           {
                await _next(context);
           }
        }
        catch (Exception ex)
        {
            Console.WriteLine($"WebSocketHandlerMiddleware:Error: " + ex.ToString());
            await _next(context);
        }
    }

    private async Task HandleAsync(WebSocketClient webSocketClient)
    {
        Console.WriteLine("Start Handle WebSocketClient");
        WebSocketClientCollection.Add(webSocketClient);
        WebSocketReceiveResult clientData = null;
            
        do
        {
            var buffer = new byte[1024 * 1];
            clientData = await webSocketClient.WebSocket.ReceiveAsync(new ArraySegment<byte>(buffer), CancellationToken.None);
            if (clientData.MessageType == WebSocketMessageType.Text && !clientData.CloseStatus.HasValue)
            {
                // Decode only the bytes actually received, not the whole buffer
                var msgString = Encoding.UTF8.GetString(buffer, 0, clientData.Count);
                var message = JsonConvert.DeserializeObject<WebsocketMessage>(msgString);
                message.SendClientId = webSocketClient.Id;
                HandleMessage(message);
            }
        }
        while (!clientData.CloseStatus.HasValue);
        WebSocketClientCollection.Remove(webSocketClient);
    }

    #region FFMpeg.AutoGen
    // Fire-and-forget (async void) so the receive loop keeps running while decoding
    private async void HandleMessage(WebsocketMessage message)
    {
        var client = WebSocketClientCollection.Get(message.SendClientId);
        if (client == null)
            return;
        client.RtspUrl = message.RtspUrl;
        if (string.IsNullOrEmpty(client.RtspUrl))
            return;
        await DeCoding(client);
    }

    private async Task DeCoding(WebSocketClient client)
    {
        try
        {
            ClassHelp.Instance.FFmpegHelp = new FFmpegHelp();
            ClassHelp.Instance.FFmpegHelp.Register();
            string strResult = "";
            
            #region push each decoded frame to the client
            FFmpegHelp.ShowBitmap show = async (bmp) =>
            {
                if (bmp != null)
                {
                    ClassHelp.Instance.IsAlert = false;
                    using (MemoryStream ms = new MemoryStream())
                    {
                        // Encode the frame as JPEG and send it over the socket
                        bmp.Save(ms, ImageFormat.Jpeg);
                        byte[] imageData = ms.ToArray();
                        await client.WebSocket.SendAsync(imageData, WebSocketMessageType.Binary, true, CancellationToken.None);
                    }
                }
                else
                {
                    ClassHelp.Instance.IsAlert = true;
                    ClassHelp.Instance.IsNetwork = true;
                }
            };
            #endregion
                        
             ClassHelp.Instance.FFmpegHelp.Start(show, client.RtspUrl, out strResult);
             if (!string.IsNullOrEmpty(strResult) && ClassHelp.Instance.IsAlert == true)
             {
                 ClassHelp.Instance.IsAlert = false;
                 ClassHelp.Instance.IsNetwork = false;
             }
         }
         finally
         {
            // Pause briefly before reconnecting so a dead stream does not spin in a tight retry loop
            await Task.Delay(TimeSpan.FromSeconds(3));
            await DeCoding(client);
         }
     }
     #endregion
}

The custom types involved are listed here as well:

public class WebsocketMessage
{
    public string RtspUrl { get; set; } = string.Empty;

    public string SendClientId { get; set; } = string.Empty;
}
public class WebSocketClientCollection
{
    private static List<WebSocketClient> _clients = new List<WebSocketClient>();

    public static void Add(WebSocketClient client)
    {
        _clients.Add(client);
    }

    public static void Remove(WebSocketClient client)
    {
        _clients.Remove(client);
    }

    public static List<WebSocketClient> GetAll()
    {
        return _clients;
    }

    public static WebSocketClient Get(string clientId)
    {
        return _clients.FirstOrDefault(t => t.Id == clientId);
    }
}
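One caveat: `WebSocketClientCollection` above wraps a plain `List<T>`, which is not thread-safe, while clients connect and disconnect on different request threads. A sketch of a safer drop-in with the same public surface, using `ConcurrentDictionary` (illustrative, not part of the original demo):

```csharp
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;

public static class SafeWebSocketClientCollection
{
    // Keyed by client id so lookups and removals are O(1) and lock-free
    private static readonly ConcurrentDictionary<string, WebSocketClient> _clients
        = new ConcurrentDictionary<string, WebSocketClient>();

    public static void Add(WebSocketClient client) => _clients[client.Id] = client;

    public static void Remove(WebSocketClient client) => _clients.TryRemove(client.Id, out _);

    public static List<WebSocketClient> GetAll() => _clients.Values.ToList();

    public static WebSocketClient Get(string clientId)
        => _clients.TryGetValue(clientId, out var c) ? c : null;
}
```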
public class WebSocketClient
{
    public string Id { get; set; }

    public string RtspUrl { get; set; }

    public WebSocket WebSocket { get; set; }

    public async Task SendMessageAsync(byte[] bytes)
    {
        await WebSocket.SendAsync(new ArraySegment<byte>(bytes), WebSocketMessageType.Binary, true, CancellationToken.None);
    }
}
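Relatedly, the demo stores a single `FFmpegHelp` on the `ClassHelp` singleton, so a second client's decoder replaces the first's. One hedged way toward multi-channel support is a per-client decoder registry that also stops the decode loop when a socket closes (names here are illustrative, not from the original project):

```csharp
using System.Collections.Concurrent;

public static class DecoderRegistry
{
    private static readonly ConcurrentDictionary<string, FFmpegHelp> _decoders
        = new ConcurrentDictionary<string, FFmpegHelp>();

    // One decoder per connected client instead of one global instance
    public static FFmpegHelp GetOrCreate(string clientId)
        => _decoders.GetOrAdd(clientId, _ => new FFmpegHelp());

    // Call from HandleAsync after the receive loop exits
    public static void Release(string clientId)
    {
        if (_decoders.TryRemove(clientId, out var decoder))
            decoder.Stop(); // sets CanRun = false so the decode loop exits
    }
}
```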

Step 4:

Enable WebSockets in Program and register the middleware we just wrote:

app.UseWebSockets(new WebSocketOptions
{
    KeepAliveInterval = TimeSpan.FromSeconds(60)
});
app.UseMiddleware<WebSocketHandlerMiddleware>();
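For context, here is a minimal `Program.cs` these two registrations could live in, assuming the .NET 6+ minimal hosting model; everything besides the two lines above is boilerplate:

```csharp
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// UseWebSockets must run before the middleware that accepts the sockets
app.UseWebSockets(new WebSocketOptions
{
    KeepAliveInterval = TimeSpan.FromSeconds(60)
});
app.UseMiddleware<WebSocketHandlerMiddleware>();

app.Run();
```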

Step 5:

That basically completes the back end. Next, a simple front-end example:

<!DOCTYPE html>
<html lang="en">

<head>
    <meta charset="UTF-8">
    <meta http-equiv="X-UA-Compatible" content="IE=edge">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>canvas Player</title>
</head>

<body>
    <div id="app">
        <button id="btn1">channel1</button>
        <canvas id="player"></canvas>
        <canvas id="player1"></canvas>
        <div id="image-box">
            <!-- <img id="fixedImage" src="initial-image.jpg" alt="Fixed Image"> -->
        </div>
    </div>
    <script>
        var server = "ws://192.168.80.62:5093/ws";  // change this to your server address
        
        var btn1 = document.getElementById('btn1');
        var webSocket1 = new WebSocket(server);
        webSocket1.binaryType = 'arraybuffer';
        webSocket1.onmessage = function(event){
              const blob = new Blob([event.data],{type: 'image/jpeg'});
              const imageUrl = URL.createObjectURL(blob);

              const newImg = document.createElement('img');
              // Revoke the object URL once the image has loaded,
              // otherwise every frame leaks a blob URL
              newImg.onload = () => URL.revokeObjectURL(imageUrl);
              newImg.src = imageUrl;
              newImg.style.width = '800px';
              newImg.style.height = '600px';
              const box = document.getElementById("image-box");

              // Simple approach: one container element; remove the old frame
              // and append the new one received from the WebSocket
              while(box.firstChild){
                box.removeChild(box.firstChild);
              }
              box.appendChild(newImg)
            }
        
        btn1.onclick = function(){           
            var msg = {Action:'test',RtspUrl:'rtsp://user:password@ipaddress/Streaming/Channels/1'};
            webSocket1.send(JSON.stringify(msg));          
        }     
    </script>
</body>

</html>

Step 6:

A quick look at the result is shown below. With that, it's done. This is still a rough demo, though, and plenty remains to consider: for example, whether the DLLs referenced by the helper classes are compatible under Linux, and multi-channel playback (already achievable by opening multiple WebSocket connections, though disconnect/reconnect and connection-release handling still need work). I'll update this post as it improves.

(Screenshot: live camera frames rendered in the browser)

From: https://www.cnblogs.com/waterMirrorWei/p/18057078
