
Cross-Platform Xamarin.Android Development: FFmpeg Encoding Across Architectures (x86_64, x86, arm64-v8a, armeabi-v7a)


Writing this code cost some brain cells: it was optimized from opening the encoder on every encode call down to opening the encoder only once.

Prerequisite: FFmpeg dynamic libraries built for each target platform.

Across Android x86_64, x86, arm64-v8a, and armeabi-v7a, the FFmpeg encoding flow is essentially identical. The differences lie in memory allocation and address-taking; if the allocation is wrong, the app crashes outright.

First, the generic encoder, covering encoder creation, encoding, and encoder release:

using System;
using System.Diagnostics;
using System.Runtime.InteropServices;
using System.Drawing;

namespace FFmpegAnalyzer
{
    /// <summary>
    /// Encoder
    /// </summary>
    internal unsafe class FFmpegEncoder
    {
        /// <param name="srcFrameSize">size of a raw source frame before encoding</param>
        /// <param name="isRgb">true if the input is RGB24 (three-channel) data; otherwise BGRA</param>
        /// <param name="detFrameSize">size of a target frame after encoding</param>
        public FFmpegEncoder(Size srcFrameSize, bool isRgb, Size detFrameSize)
        {
            _srcFrameSize = srcFrameSize;
            _isRgb = isRgb;
            _detFrameSize = detFrameSize == default ? _srcFrameSize : detFrameSize;
            _detFrameSize.Width = (_detFrameSize.Width % 2 == 0) ? _detFrameSize.Width : _detFrameSize.Width - 1;
            _detFrameSize.Height = (_detFrameSize.Height % 2 == 0) ? _detFrameSize.Height : _detFrameSize.Height - 1;
        }

        /// <summary>
        /// Creates the encoder
        /// </summary>
        public void CreateEncoder(AVCodecID codecFormat)
        {
            var originPixelFormat = _isRgb ? AVPixelFormat.AV_PIX_FMT_RGB24 : AVPixelFormat.AV_PIX_FMT_BGRA;
            var destinationPixelFormat = AVPixelFormat.AV_PIX_FMT_YUV420P;
            _pCodec = FFmpeg.avcodec_find_encoder(codecFormat);

            if (_pCodec == null)
                throw new InvalidOperationException("Codec not found.");
            _pCodecContext = FFmpeg.avcodec_alloc_context3(_pCodec);
            _pCodecContext->width = _detFrameSize.Width;
            _pCodecContext->height = _detFrameSize.Height;

            _pCodecContext->framerate = new AVRational { num = 30, den = 1 };
            _pCodecContext->time_base = new AVRational {num = 1, den = 30};
            _pCodecContext->gop_size = 30;
            _pCodecContext->pix_fmt = destinationPixelFormat;
            // Encoding flags: report PSNR and allow non-spec-compliant speedups
            _pCodecContext->flags |= FFmpeg.AV_CODEC_FLAG_PSNR;
            _pCodecContext->flags2 |= FFmpeg.AV_CODEC_FLAG2_FAST;
            _pCodecContext->max_b_frames = 0;

            FFmpeg.av_opt_set(_pCodecContext->priv_data, "preset", "veryfast", 0);
            FFmpeg.av_opt_set(_pCodecContext->priv_data, "tune", "zerolatency", 0);

            // Open the encoder
            FFmpeg.avcodec_open2(_pCodecContext, _pCodec, null).ThrowExceptionIfError();
            _pConvertContext = FFmpeg.sws_getContext(_srcFrameSize.Width, _srcFrameSize.Height, originPixelFormat, _detFrameSize.Width, _detFrameSize.Height, destinationPixelFormat,
            FFmpeg.SWS_BICUBIC, null, null, null);
            if (_pConvertContext == null)
                throw new ApplicationException("Could not initialize the conversion context.");

            var convertedFrameBufferSize = FFmpeg.av_image_get_buffer_size(destinationPixelFormat, _detFrameSize.Width, _detFrameSize.Height, 1);
            _convertedFrameBufferPtr = Marshal.AllocHGlobal(convertedFrameBufferSize);
            _dstData = new BytePtr4();
            _dstLineSize = new Int4();

            FFmpeg.av_image_fill_arrays(ref _dstData, ref _dstLineSize, (byte*)_convertedFrameBufferPtr, destinationPixelFormat, _detFrameSize.Width, _detFrameSize.Height, 1);
            _isCodecRunning = true;
        }

        /// <summary>
        /// Release resources
        /// </summary>
        public void Dispose()
        {
            if (!_isCodecRunning) return;
            _isCodecRunning = false;
            // Release the encoder
            FFmpeg.avcodec_close(_pCodecContext);
            FFmpeg.av_free(_pCodecContext);
            // Release the converter
            Marshal.FreeHGlobal(_convertedFrameBufferPtr);
            FFmpeg.sws_freeContext(_pConvertContext);
        }

        private AVFrame waitToYuvFrame;
        /// <summary>
        /// Encode one frame
        /// </summary>
        /// <param name="frameBytes">raw frame bytes</param>
        /// <returns>encoded packet bytes</returns>
        public byte[] EncodeFrames(byte[] frameBytes)
        {
            if (!_isCodecRunning)
            {
                throw new InvalidOperationException("The encoder is not running.");
            }

            // Row pitch: how many bytes one image row occupies in memory
            var rowPitch = _isRgb ? _srcFrameSize.Width * 3 : _srcFrameSize.Width * 4;
            fixed (byte* pBitmapData = frameBytes)
            {
                waitToYuvFrame = new AVFrame
                {
                    data = new BytePtr8 { [0] = pBitmapData },
                    linesize = new Int8 { [0] = rowPitch },
                    height = _srcFrameSize.Height
                };
                var rgbToYuv = ConvertToYuv(waitToYuvFrame, _detFrameSize);
                byte[] buffer;
                var pPacket = FFmpeg.av_packet_alloc();
                try
                {
                    int error;
                    do
                    {
                        FFmpeg.avcodec_send_frame(_pCodecContext, &rgbToYuv).ThrowExceptionIfError();
                        error = FFmpeg.avcodec_receive_packet(_pCodecContext, pPacket);
                    } while (error == FFmpeg.AVERROR(FFmpeg.EAGAIN));
                    error.ThrowExceptionIfError();
                    buffer = new byte[pPacket->size];
                    Marshal.Copy(new IntPtr(pPacket->data), buffer, 0, pPacket->size);
                }
                finally
                {
                    FFmpeg.av_frame_unref(&rgbToYuv);
                    FFmpeg.av_packet_unref(pPacket);
                }

                return buffer;
            }
        }

        /// <summary>
        /// Convert to YUV format
        /// </summary>
        /// <param name="waitConvertYuvFrame">frame waiting to be converted</param>
        /// <param name="detSize">target size after conversion</param>
        /// <returns></returns>
        private AVFrame ConvertToYuv(AVFrame waitConvertYuvFrame, Size detSize)
        {
            FFmpeg.sws_scale(_pConvertContext, waitConvertYuvFrame.data, waitConvertYuvFrame.linesize, 0, waitConvertYuvFrame.height, _dstData, _dstLineSize);
            var data = new BytePtr8();
            data.UpdateFrom(_dstData);
            var lineSize = new Int8();
            lineSize.UpdateFrom(_dstLineSize);
            IntPtr address = (IntPtr)(&waitConvertYuvFrame);
            Debug.WriteLine("Address: 0x" + address.ToString("X"));
            Debug.WriteLine("Size: " + sizeof(AVFrame));
            FFmpeg.av_frame_unref(&waitConvertYuvFrame);
            return new AVFrame
            {
                data = data,
                linesize = lineSize,
                width = detSize.Width,
                height = detSize.Height
            };
        }

        // Encoder
        private AVCodec* _pCodec;
        private AVCodecContext* _pCodecContext;
        // Conversion buffer
        private IntPtr _convertedFrameBufferPtr;
        private BytePtr4 _dstData;
        private Int4 _dstLineSize;
        // Format conversion (RGB/BGRA -> YUV)
        private SwsContext* _pConvertContext;

        // Source frame size
        private Size _srcFrameSize;
        // Target frame size
        private Size _detFrameSize;

        // True when the input has three channels (RGB24) rather than four (BGRA)
        private readonly bool _isRgb;

        // Whether the encoder is running
        private bool _isCodecRunning;
    }
}
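Before the per-architecture differences, a quick look at how the class is meant to be driven: open the encoder once, encode many frames, dispose once. This is a minimal sketch, not code from the original project — the AV_CODEC_ID_H264 member name follows the FFmpeg C enum and may differ in this wrapper, and TryCaptureFrame/Send are hypothetical placeholders for your capture source and packet consumer:

var encoder = new FFmpegEncoder(
    srcFrameSize: new Size(1920, 1080),   // raw BGRA capture size
    isRgb: false,                         // false => 4 bytes per pixel (BGRA)
    detFrameSize: new Size(1280, 720));   // encoded target size (forced to even)

encoder.CreateEncoder(AVCodecID.AV_CODEC_ID_H264);   // open the encoder once
try
{
    while (TryCaptureFrame(out byte[] bgraFrame))    // hypothetical frame source
    {
        byte[] packet = encoder.EncodeFrames(bgraFrame);
        Send(packet);                                // hypothetical consumer
    }
}
finally
{
    encoder.Dispose();                               // close the encoder once
}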

The encoder above works fine on Windows and on the Android x86_64 and armeabi-v7a architectures.

On arm64-v8a, however, any use of the & address-of operator, for example:

FFmpeg.av_frame_unref(&waitConvertYuvFrame);

FFmpeg.avcodec_send_frame(_pCodecContext, &rgbToYuv);

crashes the app immediately. Turning on FFmpeg's logging (see the sketch after the fixed fragment below) reported a memory segmentation fault. With the exception understood, it could be handled: on arm64-v8a the frame memory must be allocated explicitly, which makes the fix straightforward. The modified fragment is:

AVFrame* pFrame = FFmpeg.av_frame_alloc(); // compatible with 64-bit ARM (arm64-v8a)
pFrame->data = new BytePtr8 { [0] = pBitmapData };
pFrame->linesize = new Int8 { [0] = rowPitch };
pFrame->height = _srcFrameSize.Height;
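As an aside, this is how the segmentation fault was surfaced. A minimal sketch of raising FFmpeg's log level — av_log_set_level and the AV_LOG_DEBUG constant exist in the FFmpeg C API, but their presence on this particular wrapper is an assumption; routing the output into logcat would additionally need av_log_set_callback and av_log_format_line, if the binding exposes them:

// Minimal sketch, assuming the wrapper binds av_log_set_level.
// AV_LOG_DEBUG is 48 in the FFmpeg C headers; substitute your binding's constant.
internal static class FFmpegLogging
{
    public static void EnableVerboseLogging()
    {
        FFmpeg.av_log_set_level(FFmpeg.AV_LOG_DEBUG);
    }
}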

Back to the fix: unmanaged memory allocated explicitly through FFmpeg replaces the managed memory. The full revised code:

using System;
using System.Drawing;
using System.Runtime.InteropServices;

namespace FFmpegAnalyzer
{
    /// <summary>
    /// Encoder
    /// </summary>
    internal unsafe class FFmpegEncoder
    {
        /// <param name="srcFrameSize">size of a raw source frame before encoding</param>
        /// <param name="isRgb">true if the input is RGB24 (three-channel) data; otherwise BGRA</param>
        /// <param name="detFrameSize">size of a target frame after encoding</param>
        public FFmpegEncoder(Size srcFrameSize, bool isRgb, Size detFrameSize)
        {
            _srcFrameSize = srcFrameSize;
            _isRgb = isRgb;
            _detFrameSize = detFrameSize == default ? _srcFrameSize : detFrameSize;
            _detFrameSize.Width = (_detFrameSize.Width % 2 == 0) ? _detFrameSize.Width : _detFrameSize.Width - 1;
            _detFrameSize.Height = (_detFrameSize.Height % 2 == 0) ? _detFrameSize.Height : _detFrameSize.Height - 1;
        }

        /// <summary>
        /// Creates the encoder
        /// </summary>
        public void CreateEncoder(AVCodecID codecFormat)
        {
            var originPixelFormat = _isRgb ? AVPixelFormat.AV_PIX_FMT_RGB24 : AVPixelFormat.AV_PIX_FMT_BGRA;
            var destinationPixelFormat = AVPixelFormat.AV_PIX_FMT_YUV420P;
            _pCodec = FFmpeg.avcodec_find_encoder(codecFormat);

            if (_pCodec == null)
                throw new InvalidOperationException("Codec not found.");
            _pCodecContext = FFmpeg.avcodec_alloc_context3(_pCodec);
            _pCodecContext->width = _detFrameSize.Width;
            _pCodecContext->height = _detFrameSize.Height;

            _pCodecContext->framerate = new AVRational { num = 30, den = 1 };
            _pCodecContext->time_base = new AVRational {num = 1, den = 30};
            _pCodecContext->gop_size = 30;
            _pCodecContext->pix_fmt = destinationPixelFormat;
            // Encoding flags: report PSNR and allow non-spec-compliant speedups
            _pCodecContext->flags |= FFmpeg.AV_CODEC_FLAG_PSNR;
            _pCodecContext->flags2 |= FFmpeg.AV_CODEC_FLAG2_FAST;
            _pCodecContext->max_b_frames = 0;

            FFmpeg.av_opt_set(_pCodecContext->priv_data, "preset", "veryfast", 0);
            FFmpeg.av_opt_set(_pCodecContext->priv_data, "tune", "zerolatency", 0);

            // Open the encoder
            FFmpeg.avcodec_open2(_pCodecContext, _pCodec, null).ThrowExceptionIfError();
            _pConvertContext = FFmpeg.sws_getContext(_srcFrameSize.Width, _srcFrameSize.Height, originPixelFormat, _detFrameSize.Width, _detFrameSize.Height, destinationPixelFormat,
            FFmpeg.SWS_BICUBIC, null, null, null);
            if (_pConvertContext == null)
                throw new ApplicationException("Could not initialize the conversion context.");

            var convertedFrameBufferSize = FFmpeg.av_image_get_buffer_size(destinationPixelFormat, _detFrameSize.Width, _detFrameSize.Height, 1);
            _convertedFrameBufferPtr = Marshal.AllocHGlobal(convertedFrameBufferSize);
            _dstData = new BytePtr4();
            _dstLineSize = new Int4();

            FFmpeg.av_image_fill_arrays(ref _dstData, ref _dstLineSize, (byte*)_convertedFrameBufferPtr, destinationPixelFormat, _detFrameSize.Width, _detFrameSize.Height, 1);
            _isCodecRunning = true;
        }

        /// <summary>
        /// Release resources
        /// </summary>
        public void Dispose()
        {
            if (!_isCodecRunning) return;
            _isCodecRunning = false;
            // Release the encoder
            FFmpeg.avcodec_close(_pCodecContext);
            FFmpeg.av_free(_pCodecContext);
            // Release the converter
            Marshal.FreeHGlobal(_convertedFrameBufferPtr);
            FFmpeg.sws_freeContext(_pConvertContext);
        }

        /// <summary>
        /// Encode one frame
        /// </summary>
        /// <param name="frameBytes">raw frame bytes</param>
        /// <returns>encoded packet bytes</returns>
        public byte[] EncodeFrames(byte[] frameBytes)
        {
            if (!_isCodecRunning)
            {
                throw new InvalidOperationException("The encoder is not running.");
            }

            // Row pitch: how many bytes one image row occupies in memory
            var rowPitch = _isRgb ? _srcFrameSize.Width * 3 : _srcFrameSize.Width * 4;
            fixed (byte* pBitmapData = frameBytes)
            {
                AVFrame* pFrame = FFmpeg.av_frame_alloc(); // compatible with 64-bit ARM (arm64-v8a)
                pFrame->data = new BytePtr8 { [0] = pBitmapData };
                pFrame->linesize = new Int8 { [0] = rowPitch };
                pFrame->height = _srcFrameSize.Height;
                var rgbToYuv = ConvertToYuv(pFrame, _detFrameSize);
                byte[] buffer;
                var pPacket = FFmpeg.av_packet_alloc();
                try
                {
                    int error;
                    do
                    {
                        FFmpeg.avcodec_send_frame(_pCodecContext, rgbToYuv).ThrowExceptionIfError();
                        error = FFmpeg.avcodec_receive_packet(_pCodecContext, pPacket);
                    } while (error == FFmpeg.AVERROR(FFmpeg.EAGAIN));
                    error.ThrowExceptionIfError();
                    buffer = new byte[pPacket->size];
                    Marshal.Copy(new IntPtr(pPacket->data), buffer, 0, pPacket->size);
                }
                finally
                {
                    FFmpeg.av_frame_unref(rgbToYuv);
                    FFmpeg.av_packet_unref(pPacket);
                }

                return buffer;
            }
        }

        /// <summary>
        /// Convert to YUV format
        /// </summary>
        /// <param name="waitConvertYuvFrame">frame waiting to be converted</param>
        /// <param name="detSize">target size after conversion</param>
        /// <returns></returns>
        private AVFrame* ConvertToYuv(AVFrame* waitConvertYuvFrame, Size detSize)
        {
            FFmpeg.sws_scale(_pConvertContext, waitConvertYuvFrame->data, waitConvertYuvFrame->linesize, 0, waitConvertYuvFrame->height, _dstData, _dstLineSize);
            var data = new BytePtr8();
            data.UpdateFrom(_dstData);
            var lineSize = new Int8();
            lineSize.UpdateFrom(_dstLineSize);
            FFmpeg.av_frame_unref(waitConvertYuvFrame);
            AVFrame* pFrame = FFmpeg.av_frame_alloc();
            pFrame->data = data;
            pFrame->linesize = lineSize;
            pFrame->width = detSize.Width;
            pFrame->height = detSize.Height;
            pFrame->format = (int) AVPixelFormat.AV_PIX_FMT_YUV420P;
            return pFrame;
        }

        // Encoder
        private AVCodec* _pCodec;
        private AVCodecContext* _pCodecContext;
        // Conversion buffer
        private IntPtr _convertedFrameBufferPtr;
        private BytePtr4 _dstData;
        private Int4 _dstLineSize;
        // Format conversion (RGB/BGRA -> YUV)
        private SwsContext* _pConvertContext;

        // Source frame size
        private Size _srcFrameSize;
        // Target frame size
        private Size _detFrameSize;

        // True when the input has three channels (RGB24) rather than four (BGRA)
        private readonly bool _isRgb;

        // Whether the encoder is running
        private bool _isCodecRunning;
    }
}
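Both listings rely on a ThrowExceptionIfError extension that the post never shows. A minimal sketch of what such a helper typically looks like, assuming the wrapper binds av_strerror (a real FFmpeg C function; whether this wrapper exposes it, and with what exact signature, is an assumption):

using System;
using System.Runtime.InteropServices;

namespace FFmpegAnalyzer
{
    internal static unsafe class FFmpegHelper
    {
        /// <summary>Throws when an FFmpeg call returns a negative error code.</summary>
        public static int ThrowExceptionIfError(this int error)
        {
            if (error >= 0) return error;

            // av_strerror fills the buffer with a readable message for the code.
            const int bufferSize = 1024;
            byte* buffer = stackalloc byte[bufferSize];
            FFmpeg.av_strerror(error, buffer, (ulong)bufferSize);
            var message = Marshal.PtrToStringAnsi((IntPtr)buffer);
            throw new ApplicationException(message ?? $"FFmpeg error {error}");
        }
    }
}

One caveat in the listing itself: av_frame_unref only drops a frame's data references; it does not free the AVFrame struct returned by av_frame_alloc, so each encoded frame leaks two small AVFrame allocations. If the binding exposes av_frame_free, pairing it with every av_frame_alloc would close that leak.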

 

From: https://www.cnblogs.com/terryK/p/17626826.html
