
Loss_contrast
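This snippet implements a contrastive perceptual loss on VGG19 features. For each of the five relu*_1 activations, the L1 distance between the prediction and the ground truth (the positive pair) is divided by the L1 distance between the prediction and the degraded input (the negative pair), and the per-layer ratios are summed with weights that grow with depth. Minimizing the result pulls the restored image toward the target while pushing it away from the input it started from.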


import torch
from torchvision import models


class Vgg19(torch.nn.Module):
    """Frozen VGG19 feature extractor returning the relu1_1..relu5_1 activations."""

    def __init__(self, requires_grad=False):
        super(Vgg19, self).__init__()
        # torchvision >= 0.13 prefers models.vgg19(weights=models.VGG19_Weights.IMAGENET1K_V1)
        vgg_pretrained_features = models.vgg19(pretrained=True).features
        self.slice1 = torch.nn.Sequential()
        self.slice2 = torch.nn.Sequential()
        self.slice3 = torch.nn.Sequential()
        self.slice4 = torch.nn.Sequential()
        self.slice5 = torch.nn.Sequential()
        # Split the pretrained stack so that each slice ends at a relu*_1 activation.
        for x in range(2):
            self.slice1.add_module(str(x), vgg_pretrained_features[x])
        for x in range(2, 7):
            self.slice2.add_module(str(x), vgg_pretrained_features[x])
        for x in range(7, 12):
            self.slice3.add_module(str(x), vgg_pretrained_features[x])
        for x in range(12, 21):
            self.slice4.add_module(str(x), vgg_pretrained_features[x])
        for x in range(21, 30):
            self.slice5.add_module(str(x), vgg_pretrained_features[x])
        if not requires_grad:
            # Freeze the extractor: it serves only as a fixed perceptual metric.
            for param in self.parameters():
                param.requires_grad = False

    def forward(self, X):
        h_relu1 = self.slice1(X)  # relu1_1
        h_relu2 = self.slice2(h_relu1)  # relu2_1
        h_relu3 = self.slice3(h_relu2)  # relu3_1
        h_relu4 = self.slice4(h_relu3)  # relu4_1
        h_relu5 = self.slice5(h_relu4)  # relu5_1
        out = [h_relu1, h_relu2, h_relu3, h_relu4, h_relu5]
        return out
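
# The five returned activations, for a 224x224 RGB input, have shapes
# (an illustrative note added here, derived from the VGG19 architecture):
#   relu1_1: (N, 64, 224, 224)
#   relu2_1: (N, 128, 112, 112)
#   relu3_1: (N, 256, 56, 56)
#   relu4_1: (N, 512, 28, 28)
#   relu5_1: (N, 512, 14, 14)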


class LossNetwork(torch.nn.Module):
    def __init__(self, device):
        super(LossNetwork, self).__init__()
        self.vgg = Vgg19().to(device)
        self.L1 = torch.nn.L1Loss()
        # Per-layer weights: deeper, more semantic features contribute more.
        self.weight = [1.0 / 32, 1.0 / 16, 1.0 / 8, 1.0 / 4, 1.0]

    def forward(self, pred, gt, input):
        loss = []
        pred_features = self.vgg(pred)
        gt_features = self.vgg(gt)
        input_features = self.vgg(input)

        for i in range(len(pred_features)):
            # Positive pair: distance between prediction and ground truth.
            pred_gt = self.L1(pred_features[i], gt_features[i])
            # Negative pair: distance between prediction and the degraded input.
            pred_input = self.L1(pred_features[i], input_features[i])
            # Contrastive ratio: minimized by moving toward gt and away from the
            # input; 1e-7 guards against division by zero.
            per_loss = pred_gt / (pred_input + 1e-7)
            loss.append(self.weight[i] * per_loss)

            # loss.append(self.weight[i] * pred_gt)  # plain (non-contrastive) perceptual loss

        return sum(loss)
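
A minimal usage sketch (the device choice, tensor shapes, and random data below are illustrative assumptions; in training, pred would be the restoration network's output, gt the clean target, and input the degraded image):

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
criterion = LossNetwork(device)

pred = torch.rand(1, 3, 224, 224, device=device, requires_grad=True)  # stands in for the network output
gt = torch.rand(1, 3, 224, 224, device=device)                        # clean target
inp = torch.rand(1, 3, 224, 224, device=device)                       # degraded input

loss = criterion(pred, gt, inp)
loss.backward()  # gradients reach pred; the VGG weights stay frozen
print(loss.item())

Note that, as in the original code, images are fed to VGG19 without ImageNet normalization; if your pipeline normalizes inputs elsewhere, keep that consistent.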

 

From: https://www.cnblogs.com/yyhappy/p/17589822.html
