1. Algorithm Simulation Results
The simulation results under MATLAB 2022a are as follows:
2. Overview of the Underlying Theory
The ResNet family is one of the best-known lines of work in image classification. It has stood the test of time, remains widely studied and applied today, has been improved in countless variants by the community, and is routinely used for image recognition tasks. In ResNet-18, the number denotes the network's depth. Does that mean ResNet-18 simply has 18 layers? Not quite: the 18 counts only the layers that carry weights, i.e. the convolutional and fully connected layers; pooling layers and batch-normalization (BN) layers are not included. Image classification is a fundamental task in computer vision: it assigns images to categories according to their semantic content. Many other tasks can be recast as image classification. For example, face detection, which decides whether a region contains a face, can be viewed as a binary image-classification task.
ResNet18 therefore means: the basic architecture is ResNet, and the depth is 18 layers, where depth counts only the weight layers (convolutional and fully connected layers) and excludes batch-normalization and pooling layers. The figure below shows a basic ResNet18 architecture; batch normalization and pooling are omitted from the diagram.
(1) 7×7 convolutional layer
As described in the paper, the input first passes through a convolutional layer with a 7×7 kernel, stride 2, padding 3, and 64 output channels. For a 224×224 input this produces a 64×112×112 output.
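The output sizes quoted in this walkthrough all follow from the standard convolution output-size formula, floor((n + 2p − k) / s) + 1. As a quick sanity check (sketched here in Python; the helper name `conv_out` is purely illustrative, not part of the MATLAB program):

```python
def conv_out(size, kernel, stride, padding):
    # Standard convolution output-size formula: floor((n + 2p - k) / s) + 1
    return (size + 2 * padding - kernel) // stride + 1

# 7x7 conv, stride 2, padding 3 on a 224x224 input halves the spatial size.
print(conv_out(224, kernel=7, stride=2, padding=3))  # 112
```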
(2) Pooling layer
Next comes a max pooling layer with a 3×3 kernel, stride 2, and padding 1. Its output is 64×56×56: pooling leaves the number of channels unchanged and halves the spatial size.
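Max pooling obeys the same output-size arithmetic as convolution, which is why a 3×3/stride-2/padding-1 pool takes 112×112 down to 56×56. A minimal check (the helper name `pool_out` is illustrative):

```python
def pool_out(size, kernel, stride, padding):
    # Max pooling uses the same output-size arithmetic as convolution.
    return (size + 2 * padding - kernel) // stride + 1

print(pool_out(112, kernel=3, stride=2, padding=1))  # 56
```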
(3) First group of 3×3 convolutional layers
The first group of 3×3 convolutional layers uses a 3×3 kernel, stride 1, and padding 1. After passing through this stage the output is 64×56×56; that is, this stage changes neither the spatial size nor the number of channels.
(4) Second group of 3×3 convolutional layers
This stage first applies a 1×1 convolution on the shortcut branch for downsampling. The final output is 128×28×28: the number of channels doubles and the spatial size halves.
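Both branches of this downsampling block land on the same 28×28 size: the shortcut's 1×1 convolution with stride 2 and the main branch's 3×3 convolution with stride 2 and padding 1 agree, which is what allows the element-wise addition. A sketch, reusing the standard output-size formula (the helper name `conv_out` is illustrative):

```python
def conv_out(size, kernel, stride, padding):
    return (size + 2 * padding - kernel) // stride + 1

# Shortcut branch: 1x1 conv, stride 2, no padding -- 56 -> 28
print(conv_out(56, kernel=1, stride=2, padding=0))  # 28
# Main branch: 3x3 conv, stride 2, padding 1 -- also 56 -> 28
print(conv_out(56, kernel=3, stride=2, padding=1))  # 28
```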
(5) Third group of 3×3 convolutional layers
Likewise, a 1×1 convolution and downsampling are applied, giving a 256×14×14 output: channels double and the spatial size halves again.
(6) Fourth group of 3×3 convolutional layers
Once more, the number of channels doubles and the spatial size halves, giving a 512×7×7 output.
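The four stages above follow one repeating rule after the stem: each stride-2 stage doubles the channels and halves the spatial size. The full shape progression can be reproduced in a few lines (sketched in Python for checking only):

```python
channels, size = 64, 56  # after the stem (7x7 conv + max pool)
shapes = [(channels, size, size)]
for _ in range(3):  # the three downsampling stages
    channels *= 2
    size = (size + 2 - 3) // 2 + 1  # 3x3 stride-2 conv with padding 1 halves the size
    shapes.append((channels, size, size))
print(shapes)  # [(64, 56, 56), (128, 28, 28), (256, 14, 14), (512, 7, 7)]
```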
(7) Average pooling layer
A global average pooling layer reduces each 7×7 feature map to a single value, so the final output is 512×1×1.
(8) Linear layer
Finally, a fully connected layer maps the 512 pooled features to the class scores.
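Global average pooling collapses the 512×7×7 tensor to a 512-vector, so the classifier head is tiny. With the 10 output classes used in the MATLAB program of section 3 (`fullyConnectedLayer(10,...)`), its parameter count is just weights plus biases:

```python
# Fully connected head: 512 pooled features -> 10 class scores.
in_features, num_classes = 512, 10
n_params = in_features * num_classes + num_classes  # weights + biases
print(n_params)  # 5130
```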
3. MATLAB Core Program
tempLayers = [
    additionLayer(2,"Name","res5b")
    reluLayer("Name","res5b_relu")
    globalAveragePooling2dLayer("Name","pool5")
    fullyConnectedLayer(10,"Name","fc10")
    softmaxLayer("Name","prob")
    classificationLayer("Name","ClassificationLayer_predictions")];
LG = addLayers(LG,tempLayers);

% clean up helper variable
clear tempLayers;

LG = connectLayers(LG,"pool1","res2a_branch2a");
LG = connectLayers(LG,"pool1","res2a/in2");
LG = connectLayers(LG,"bn2a_branch2b","res2a/in1");
LG = connectLayers(LG,"res2a_relu","res2b_branch2a");
LG = connectLayers(LG,"res2a_relu","res2b/in2");
LG = connectLayers(LG,"bn2b_branch2b","res2b/in1");
LG = connectLayers(LG,"res2b_relu","res3a_branch2a");
LG = connectLayers(LG,"res2b_relu","res3a_branch1");
LG = connectLayers(LG,"bn3a_branch1","res3a/in2");
LG = connectLayers(LG,"bn3a_branch2b","res3a/in1");
LG = connectLayers(LG,"res3a_relu","res3b_branch2a");
LG = connectLayers(LG,"res3a_relu","res3b/in2");
LG = connectLayers(LG,"bn3b_branch2b","res3b/in1");
LG = connectLayers(LG,"res3b_relu","res4a_branch2a");
LG = connectLayers(LG,"res3b_relu","res4a_branch1");
LG = connectLayers(LG,"bn4a_branch1","res4a/in2");
LG = connectLayers(LG,"bn4a_branch2b","res4a/in1");
LG = connectLayers(LG,"res4a_relu","res4b_branch2a");
LG = connectLayers(LG,"res4a_relu","res4b/in2");
LG = connectLayers(LG,"bn4b_branch2b","res4b/in1");
LG = connectLayers(LG,"res4b_relu","res5a_branch2a");
LG = connectLayers(LG,"res4b_relu","res5a_branch1");
LG = connectLayers(LG,"bn5a_branch1","res5a/in2");
LG = connectLayers(LG,"bn5a_branch2b","res5a/in1");
LG = connectLayers(LG,"res5a_relu","res5b_branch2a");
LG = connectLayers(LG,"res5a_relu","res5b/in2");
LG = connectLayers(LG,"bn5b_branch2b","res5b/in1");

net = trainNetwork(XTrain, YTrainCat, LG, options);
save Res18.mat net
From: https://www.cnblogs.com/51matlab/p/17444654.html