
An accuracy of 0.875

Posted: 2022-12-02 17:56:07  Views: 31
Tags: 0.875 nn torch feature label accuracy test out

I tweaked the model to nn.Linear(36,2), nn.Sigmoid(), nn.Linear(2,2), nn.Sigmoid(), nn.Linear(2,1), nn.Sigmoid(). The original network was a (36,3)(3,1) fully connected net, but it overfit easily. After some casual tuning I changed it to (36,2)(2,2)(2,1); accuracy reached 0.85, and the model converged to a good solution more consistently. My takeaway: each deeper layer fine-tunes the output of the layer before it, while the width likely corresponds to the features being extracted, and too much width also invites overfitting.

First, load the training data, take the amplitude spectrum of each row, and keep 36 selected frequency bins as features (the last 20 rows are held out for validation):

```python
import numpy as np
import pandas as pd
import torch

df = pd.read_csv('train.csv')
df = df.drop(['ID'], axis=1)
nmp = df.to_numpy()  # each row: 240 signal samples followed by one label

# Hold out the last 20 rows for validation
feature = nmp[:-20, :-1]
label = nmp[:-20, -1]

# Amplitude spectrum: |FFT| / N * 2, then keep the 36 selected bins
feature = torch.fft.fft(torch.Tensor(feature))
feature = torch.abs(feature) / 240 * 2
feature = feature[:, [0, 1, 60, 180, 239, 58, 59, 61, 179, 181, 182, 120,
                      62, 178, 119, 121, 117, 123, 2, 238, 55, 65, 175, 185,
                      63, 116, 124, 177, 118, 122, 56, 64, 176, 184, 57, 183]]

test_feature = nmp[-20:, :-1]
test_label = nmp[-20:, -1]
```
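A quick way to sanity-check the "narrower is harder to overfit" intuition is to count trainable parameters: the (36,2)(2,2)(2,1) stack is deeper but actually smaller than the old (36,3)(3,1) one. A minimal sketch:

```python
import torch
from torch import nn

old = nn.Sequential(nn.Linear(36, 3), nn.Sigmoid(),
                    nn.Linear(3, 1), nn.Sigmoid())
new = nn.Sequential(nn.Linear(36, 2), nn.Sigmoid(),
                    nn.Linear(2, 2), nn.Sigmoid(),
                    nn.Linear(2, 1), nn.Sigmoid())

def n_params(m):
    # total number of trainable weights and biases
    return sum(p.numel() for p in m.parameters())

print(n_params(old), n_params(new))  # 115 83
```

So the new network has fewer free parameters overall even though it is deeper, which is consistent with it being less prone to overfitting.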
The held-out rows get exactly the same transform and bin selection:

```python
from torch import nn

test_feature = torch.fft.fft(torch.Tensor(test_feature))
test_feature = torch.abs(test_feature) / 240 * 2
test_feature = test_feature[:, [0, 1, 60, 180, 239, 58, 59, 61, 179, 181, 182, 120,
                                62, 178, 119, 121, 117, 123, 2, 238, 55, 65, 175, 185,
                                63, 116, 124, 177, 118, 122, 56, 64, 176, 184, 57, 183]]

loss = nn.MSELoss()
# Labels as an (N, 1) column tensor to match the network output
label = torch.Tensor(label).reshape(-1, 1)
```
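The |FFT| / 240 * 2 normalization turns each bin into the amplitude of the corresponding sinusoid, which is what makes the selected bins comparable across rows. A small self-checking example on a synthetic signal (not the post's data):

```python
import numpy as np
import torch

N = 240
t = np.arange(N)
# A pure sine of amplitude 3.0 making 5 full cycles over the 240 samples
x = 3.0 * np.sin(2 * np.pi * 5 * t / N)

# |FFT| / N * 2 recovers the sinusoid's amplitude at its frequency bin
spec = torch.abs(torch.fft.fft(torch.tensor(x))) / N * 2
print(spec[5].item())  # ~3.0
```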
```python
# Validation labels as an (N, 1) column tensor as well
test_label = torch.Tensor(test_label).reshape(-1, 1)
```
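The reshape(-1, 1) calls are not cosmetic: nn.MSELoss broadcasts an (N,) target against an (N, 1) output into an (N, N) comparison and silently returns the wrong value (PyTorch only emits a warning). A minimal illustration with made-up numbers:

```python
import torch
from torch import nn

loss = nn.MSELoss()
out = torch.arange(4.).reshape(-1, 1)    # model output, shape (4, 1)
target = torch.arange(4.)                # labels, shape (4,)

good = loss(out, target.reshape(-1, 1))  # elementwise, as intended: 0.0
bad = loss(out, target)                  # broadcasts to (4, 4): 2.5, plus a warning

print(good.item(), bad.item())
```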
Now define the network and train, thresholding the sigmoid output at 0.5 for hard predictions, and early-stopping once both accuracies are good enough:

```python
import torch.optim as optim

network = nn.Sequential(nn.Linear(36, 2), nn.Sigmoid(),
                        nn.Linear(2, 2), nn.Sigmoid(),
                        nn.Linear(2, 1), nn.Sigmoid())
optimizer = optim.Adam(network.parameters(), lr=0.004)

for epoch in range(100000):
    optimizer.zero_grad()
    out = network(feature)
    l = loss(out, label)
    l.backward()
    optimizer.step()

    # Threshold at 0.5 to get hard class predictions
    Y = torch.ge(out, 0.5).float()
    acc = Y.eq(label).float().sum() / len(label)

    out = network(test_feature)
    Y = torch.ge(out, 0.5).float()
    test_acc = Y.eq(test_label).float().sum() / len(test_label)

    print(epoch, l.item(), acc.item(), test_acc.item())
    # Stop once train and validation accuracy are both good enough
    if acc >= 0.8 and test_acc >= 0.90:
        break
```
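MSE on a sigmoid output does train, but binary cross-entropy is the standard loss for this kind of 0/1 classification and gives stronger gradients when the sigmoid saturates. A sketch of the swap on synthetic stand-in data (random features with an easy rule, not the competition data):

```python
import torch
from torch import nn, optim

torch.manual_seed(0)
# Stand-in data: 100 samples, 36 features, with a trivially learnable rule
x = torch.rand(100, 36)
y = (x[:, 0] > 0.5).float().reshape(-1, 1)

net = nn.Sequential(nn.Linear(36, 2), nn.Sigmoid(),
                    nn.Linear(2, 2), nn.Sigmoid(),
                    nn.Linear(2, 1), nn.Sigmoid())
opt = optim.Adam(net.parameters(), lr=0.004)
bce = nn.BCELoss()  # binary cross-entropy on sigmoid outputs

first = None
for epoch in range(1000):
    opt.zero_grad()
    l = bce(net(x), y)
    l.backward()
    opt.step()
    if first is None:
        first = l.item()

print(first, l.item())  # loss should have decreased
```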
Finally, run the trained network on test.csv and write the submission file (note: `np.int` is deprecated, plain `int` works):

```python
df = pd.read_csv('test.csv')
df = df.drop(['ID'], axis=1)
nmp = df.to_numpy()

feature = torch.fft.fft(torch.Tensor(nmp))
feature = torch.abs(feature) / 240 * 2
feature = feature[:, [0, 1, 60, 180, 239, 58, 59, 61, 179, 181, 182, 120,
                      62, 178, 119, 121, 117, 123, 2, 238, 55, 65, 175, 185,
                      63, 116, 124, 177, 118, 122, 56, 64, 176, 184, 57, 183]]

out = network(feature)
out = (out.detach().numpy() > 0.5).astype(int)
out = pd.DataFrame(out, columns=['CLASS'])
# Test IDs continue after the 210 training rows
out['ID'] = np.arange(210, 210 + out.shape[0])
out[['ID', 'CLASS']].to_csv('out.csv', index=False)
```

From: https://www.cnblogs.com/hahaah/p/16945216.html
