
《深度学习的数学》 (Mathematics of Deep Learning, by Yoshiyuki Wakui and Sadami Wakui): Neural-Network Computation in PyTorch, Example 2

Posted: 2023-11-03 17:06:16  Views: 46


《深度学习的数学》 by Yoshiyuki Wakui and Sadami Wakui is clear and easy to follow. The book even walks through the neural-network computations in Excel (see the figure below), which is excellent. But an Excel example alone falls a bit short; a PyTorch demonstration would make it even better.

[Figure: the book's Excel walkthrough of the network computation]

After a long search I could not find anyone else's implementation online, so I wrote one myself (Python + PyTorch) for fellow beginners who are also studying neural networks.

(Note: this covers Section 5-6 of the book, "Experiencing error backpropagation in a convolutional neural network." The data are 96 images of the digits 1, 2, and 3 on a 6x6 grid; the cost function is the sum of squared errors and the activation function is the Sigmoid.)

(For Example 1, covering Section 4-4, see: https://blog.51cto.com/oldycat/8133220)

(Before this book, I suggest first reading Kengo Tateishi's 《白话机器学习的数学》 (Machine Learning Mathematics in Plain Language); this book then becomes much easier.)
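As a quick sanity check on the Section 5-6 architecture described above, the following standalone sketch (my own, not part of the article's demo54.py) traces the tensor shapes: a 6x6 image through three 3x3 convolution filters, Sigmoid, 2x2 max pooling, then a fully connected layer with 3 output neurons.

```python
import torch
import torch.nn as nn

# Trace shapes through the Section 5-6 architecture:
# 6x6 input -> Conv2d(1, 3, 3) -> 3 feature maps of 4x4
# -> Sigmoid -> MaxPool2d(2) -> 3 maps of 2x2 -> flatten to 12 -> Linear(12, 3)
x = torch.zeros(1, 1, 6, 6)                             # one 6x6 single-channel image
z_f = nn.Conv2d(1, 3, kernel_size=3)(x)                 # convolution layer inputs
a_f = nn.MaxPool2d(kernel_size=2)(torch.sigmoid(z_f))   # pooled activations
z_o = nn.Linear(12, 3)(a_f.view(1, -1))                 # 3*2*2 = 12 inputs -> 3 outputs
print(z_f.shape, a_f.shape, z_o.shape)
```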

demo54.py: (note: this version still differs in part from the Excel numbers; I have not yet found what needs to be changed)

import torch
import torch.nn as nn
import torch.optim as optimal
from torch import cosine_similarity

import demo54data as demo


class CNN(nn.Module):
    def __init__(self):
        super(CNN, self).__init__()
        self.activation = nn.Sigmoid()
        self.conv1 = nn.Conv2d(1, 3, kernel_size=3)
        self.pool = nn.MaxPool2d(kernel_size=2)
        self.fc = nn.Linear(3 * (2 * 2), 3)  # output layer has 3 neurons, for the digits 1, 2 and 3

        self.conv1.weight.data = demo.get_param_co().reshape(3, 1, 3, 3)  # comment out to use normal-distribution init
        self.conv1.bias.data = demo.get_param_co_bias()  # comment out to use normal-distribution init
        self.fc.weight.data = demo.get_param_op().reshape(3, 12)  # comment out to use normal-distribution init
        self.fc.bias.data = demo.get_param_o_bias()  # fix: was get_param_co_bias(); the output layer needs its own bias

    def forward(self, x):
        x = self.conv1(x)
        demo.print_x("zF=", x)
        x = self.activation(x)
        x = self.pool(x)
        demo.print_x("aF=", x)
        x = x.view(x.size(0), -1)
        x = self.fc(x)  # this step may be wrong, making the numbers differ from the Excel sheet
        demo.print_x("zO=", x)
        x = self.activation(x)
        demo.print_x("aO=", x)
        return x


def mse_loss(x, y):
    # m = nn.MSELoss(reduction='sum')
    # return m(x, y) / 2
    z = x - y
    print("  c= ", end='')
    print(((z[0] ** 2).sum() / 2).data.numpy(), end='')
    for i in range(1, z.size(0)):
        print("\t", ((z[i] ** 2).sum() / 2).data.numpy(), end='')
    print()
    # squared error summed over all three output neurons
    # (the original summed only columns 0 and 1, dropping the third output)
    return (z ** 2).sum() / 2


# create a model instance
model = CNN()
for param in model.parameters():
    print(param)

# define the loss function and optimizer
criterion = mse_loss
optimizer = optimal.SGD(model.parameters(), lr=0.2)

# convert the input data to tensors
train_data = demo.get_data()
train_labels = demo.get_result()

# start training
num_epochs = 1000
for epoch in range(num_epochs):
    print("\nepoch=", epoch + 1)
    optimizer.zero_grad()
    outputs = model(train_data)

    loss = criterion(outputs, train_labels)
    print("Loss: {:.4f}".format(loss.item()))
    loss.backward()
    optimizer.step()

    if (epoch + 1) == num_epochs or loss.item() < 0.05:
        print("Epoch [{}/{}], Loss: {:.4f}".format(epoch + 1, num_epochs, loss.item()))
        break

# run predictions with the trained model
model.eval()
print()
output = model(demo.get_test()).data
print(output.argmax(dim=1) + 1)

print("\n=======  Compare all results  ======")
test_data = demo.get_data()
predictions = model(test_data)
result = (predictions.argmax(dim=1) + 1)
print(result.data)
print("Difference:")
print((demo.get_result2() - result).long())
print()
# cosine_similarity requires floating-point inputs, hence .float() on the Long result
print("Accuracy:", (torch.round(
    cosine_similarity(result.float().unsqueeze(0), demo.get_result2().unsqueeze(0)).mean() * 10000) / 100).data.numpy(),
      "%")
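The book's cost function is the sum of squared errors over all output neurons, C = ½ Σ(t − a)². As a minimal sketch (my own check, not part of the article's code), it can be reproduced with PyTorch's built-in `nn.MSELoss` using `reduction='sum'`:

```python
import torch
import torch.nn as nn

# The book's cost: C = 1/2 * sum over samples and output neurons of (t - a)^2.
# nn.MSELoss(reduction='sum') returns the plain sum of squared errors,
# so halving it reproduces the book's definition.
outputs = torch.tensor([[0.9, 0.1, 0.2],
                        [0.2, 0.8, 0.1]])
targets = torch.tensor([[1.0, 0.0, 0.0],
                        [0.0, 1.0, 0.0]])

c_manual = ((outputs - targets) ** 2).sum() / 2
c_builtin = nn.MSELoss(reduction='sum')(outputs, targets) / 2
print(float(c_manual), float(c_builtin))
```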

demo54data.py:

import torch


def get_param_co():
    return torch.tensor([[
        -1.277, -0.454, 0.358,
        1.138, -2.398, -1.664,
        -0.794, 0.899, 0.675
    ], [
        -1.274, 2.338, 2.301,
        0.649, -0.339, -2.054,
        -1.022, -1.204, -1.900
    ], [
        -1.869, 2.044, -1.290,
        -1.710, -2.091, -2.946,
        0.201, -1.323, 0.207
    ]])


def get_param_co_bias():
    return torch.tensor([-3.363, -3.176, -1.739])


def get_param_op():
    return torch.tensor([
        [[
            -0.276, 0.124,
            -0.961, 0.718
        ], [
            -3.680, -0.594,
            0.280, -0.782
        ], [
            -1.475, -2.010,
            -1.085, -0.188
        ]],

        [[
            0.010, 0.661,
            -1.591, 2.189
        ], [
            1.728, 0.003,
            -0.250, 1.898
        ], [
            0.238, 1.589,
            2.246, -0.093
        ]],

        [[
            -1.322, -0.218,
            3.527, 0.061
        ], [
            0.613, 0.218,
            -2.130, -1.678
        ], [
            1.236, -0.486,
            -0.144, -1.235
        ]]
    ])


def get_param_o_bias():
    return torch.tensor([2.060, -2.746, -1.818])


def get_test():
    return (torch.tensor([
        [
            1.0, 1, 1, 1, 0, 0,
            1, 1, 0, 0, 1, 0,
            0, 0, 0, 0, 1, 0,
            0, 0, 0, 1, 1, 0,
            1, 1, 0, 0, 1, 0,
            1, 1, 1, 1, 0, 0], [

            0, 0, 1, 1, 1, 0,
            0, 1, 0, 0, 1, 1,
            0, 0, 0, 1, 1, 0,
            0, 0, 0, 0, 1, 0,
            0, 1, 0, 0, 1, 1,
            0, 0, 1, 1, 1, 0]])
            .reshape(2, 1, 6, 6))


def print_x(name, x):
    if x.dim() > 3:
        print(name, end='')
        # for i in range(x.size()[0]):
        for j in range(x.size()[1]):
            print("\t[", end='')
            for k in range(x.size()[2]):
                print("", x[0, j, k, :].data.numpy(), end='')
            print("]\n  ", end='')
        print()
    elif x.dim() > 1:
        print(name, end='')
        print("", x[0, :].data.numpy(), end='')
        for i in range(x.size()[0]):
            if i > 0:
                print("\t\t", x[i, :].data.numpy(), end='')
        print()


def print_params(params):
    for param in params:
        if param.dim() > 1:
            for i in range(param.size()[0]):
                print('\t[', end='')
                print(param[i, 0].data.numpy(), end='')
                for j in range(param.size()[1]):
                    if j > 0:
                        print('\t', param[i, j].data.numpy(), end='')
                print('] ', end='')
            print()
        else:
            print('\t[', end='')
            print(param[0].data.numpy(), end='')
            for i in range(param.size()[0]):
                if i > 0:
                    print('\t', param[i].data.numpy(), end='')
            print('] ')
    print()


def get_data():
    return torch.tensor([[
        0.0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0], [

        0, 0, 0, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0], [

        0, 0, 1, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0], [

        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 1, 0, 0], [

        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 1, 1, 0], [

        0, 0, 1, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 1, 1, 0], [

        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0], [

        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 1, 1, 1, 0, 0], [

        0, 1, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,  # 10
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0], [

        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 1, 1, 1, 0, 0], [

        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 1, 0], [

        0, 0, 0, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 0, 1, 0, 0], [

        0, 0, 1, 1, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 0, 1, 1, 0, 0], [

        0, 0, 0, 0, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 0, 1, 0, 0, 0], [

        0, 0, 1, 0, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 0, 0, 0, 0, 0], [

        0, 0, 1, 0, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 0, 1, 0, 0], [

        0, 0, 0, 0, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0], [

        0, 0, 0, 0, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0], [

        0, 0, 0, 0, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 0, 0, 0,  # 20
        0, 0, 1, 0, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 0, 1, 1, 0, 0], [

        0, 0, 0, 0, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0], [

        0, 0, 1, 0, 0, 0,
        0, 1, 1, 0, 0, 0,
        0, 1, 1, 0, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 0, 1, 0, 0, 0], [

        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0], [

        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 0, 1, 0, 0, 0], [

        0, 0, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 0, 1, 0, 0, 0], [

        0, 1, 0, 0, 0, 0,
        0, 1, 0, 0, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0], [

        0, 1, 0, 0, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 0, 0, 0], [

        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 1, 0, 0, 0, 0,
        0, 0, 0, 0, 0, 0], [

        0, 0, 0, 0, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 0, 1, 0, 0, 0], [

        0, 0, 0, 0, 0, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,  # 30
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 0, 0, 0], [

        0, 0, 0, 0, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 1, 0, 0, 0, 0,
        0, 1, 0, 0, 0, 0,
        0, 0, 0, 0, 0, 0], [

        0, 0, 0, 0, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 0, 0, 0, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 1, 1, 1, 1, 0], [

        0, 1, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 1, 1, 1, 1, 0], [

        0, 0, 1, 1, 1, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 1, 1, 1, 1, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 1, 1, 1, 1, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 0, 1, 0, 0, 0,
        0, 1, 1, 1, 1, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 0, 0,
        0, 1, 1, 0, 0, 0,
        0, 1, 1, 1, 1, 0], [

        0, 1, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 0, 0,
        0, 1, 1, 0, 0, 0,
        0, 1, 1, 1, 1, 0], [

        0, 1, 1, 1, 1, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 1, 1, 1, 1, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 1, 1, 1, 1, 0], [

        0, 1, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 1, 1, 0, 0, 0,
        0, 1, 1, 1, 1, 0], [

        0, 0, 1, 1, 1, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 0, 1, 0, 0, 0,
        0, 1, 1, 1, 1, 0], [

        0, 1, 1, 1, 1, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 1, 1, 0, 0,
        0, 1, 1, 0, 0, 0,
        0, 1, 1, 1, 1, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 0, 0, 0,
        1, 1, 1, 1, 1, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 1, 1, 1, 1, 1], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 1, 1, 1, 1, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        1, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 1, 1, 1, 1, 0], [

        0, 1, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        1, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 1, 1, 1, 1, 0], [

        0, 1, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        1, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 1, 1, 1, 1, 0], [

        0, 1, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        1, 1, 1, 1, 1, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 1, 1, 1, 1, 1], [

        0, 1, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 1, 1, 1, 1, 1], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 1,
        0, 0, 0, 0, 1, 1,
        0, 0, 0, 1, 1, 0,
        0, 0, 1, 0, 0, 0,
        0, 1, 1, 1, 1, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 1, 1, 1, 1, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 1, 1, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 1, 1, 1, 1, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 1, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 1, 1, 1, 1, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 1, 0, 1, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 1, 1, 1, 1, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 1, 1, 0, 0,
        0, 1, 1, 0, 0, 0,
        1, 1, 1, 1, 1, 0], [

        0, 0, 1, 1, 1, 0,
        0, 1, 0, 0, 1, 1,
        0, 0, 0, 0, 1, 1,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 0, 0, 0,
        0, 1, 1, 1, 1, 1], [

        0, 1, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 1, 1, 1, 1, 0], [

        0, 1, 1, 1, 1, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 0, 1, 1, 0, 0,
        1, 1, 1, 1, 1, 0], [

        0, 1, 1, 1, 0, 0,
        0, 1, 0, 1, 0, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 1, 1, 0, 0, 0,
        0, 1, 1, 1, 1, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 1, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 0, 0, 1, 0, 0,
        0, 0, 1, 1, 0, 0,
        0, 1, 1, 1, 1, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 1, 1, 0, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 1, 1, 0, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 1, 1, 0, 0], [

        0, 1, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 1, 1, 0, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 1, 0, 0, 1, 0,
        0, 1, 1, 1, 0, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 1, 1, 1, 0], [

        0, 0, 1, 1, 1, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 1, 1, 0, 0], [

        0, 1, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 1, 1, 0, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 1, 0, 0, 1, 0,
        0, 1, 1, 1, 0, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 0, 1,
        0, 0, 0, 1, 1, 1,
        0, 1, 0, 0, 1, 0,
        0, 0, 1, 1, 0, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 1,
        0, 0, 0, 0, 1, 1,
        0, 1, 0, 0, 1, 0,
        0, 0, 1, 1, 0, 0], [

        0, 0, 1, 1, 1, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 0, 1,
        0, 0, 0, 1, 1, 1,
        0, 1, 0, 0, 1, 0,
        0, 0, 1, 1, 0, 0], [

        0, 0, 1, 1, 1, 0,
        0, 1, 0, 0, 0, 1,
        0, 0, 0, 1, 1, 1,
        0, 0, 0, 0, 1, 1,
        0, 1, 0, 0, 0, 1,
        0, 0, 1, 1, 1, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 1, 1, 0, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        1, 0, 0, 0, 1, 0,
        0, 1, 1, 1, 0, 0], [

        0, 1, 1, 1, 0, 0,
        1, 0, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 1, 1, 0, 0], [

        0, 1, 1, 1, 0, 0,
        1, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        1, 1, 0, 0, 1, 0,
        0, 1, 1, 1, 0, 0], [

        0, 1, 1, 1, 0, 0,
        1, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 1, 1, 0, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        1, 1, 0, 0, 1, 0,
        0, 1, 1, 1, 0, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 1,
        0, 0, 0, 1, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 1, 1, 0, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 1, 0, 0, 0, 1,
        0, 0, 1, 1, 1, 0], [

        0, 1, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 0, 0, 1, 1, 0,
        1, 1, 0, 0, 1, 0,
        0, 0, 1, 1, 0, 0], [

        0, 0, 1, 1, 0, 0,
        1, 1, 0, 0, 1, 0,
        0, 0, 1, 1, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 1, 1, 1, 0, 0], [

        0, 1, 1, 1, 0, 0,
        1, 0, 0, 0, 1, 0,
        0, 0, 1, 1, 1, 0,
        0, 0, 1, 1, 1, 0,
        0, 0, 0, 0, 1, 0,
        1, 1, 1, 1, 0, 0], [

        1, 1, 1, 1, 0, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 1, 1, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 1, 0, 0, 1, 0,
        0, 1, 1, 1, 0, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 1,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        1, 1, 0, 0, 1, 1,
        0, 1, 1, 1, 1, 0], [

        0, 1, 1, 1, 0, 0,
        0, 1, 1, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 1, 0, 0, 1, 0,
        0, 1, 1, 1, 0, 0], [

        0, 1, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 1, 1, 0, 1, 0,
        0, 0, 1, 1, 0, 0], [

        0, 0, 1, 1, 0, 0,
        0, 1, 0, 0, 1, 1,
        0, 0, 0, 0, 1, 1,
        0, 0, 0, 1, 1, 0,
        1, 1, 0, 0, 1, 0,
        0, 1, 1, 1, 1, 0], [

        1, 1, 1, 1, 1, 0,
        0, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        1, 1, 0, 0, 1, 0,
        0, 1, 1, 1, 0, 0], [

        1, 1, 1, 1, 0, 0,
        1, 1, 0, 0, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 0, 0, 1, 1, 0,
        1, 1, 0, 0, 1, 0,
        1, 1, 1, 1, 0, 0], [

        0, 0, 1, 1, 1, 0,
        0, 1, 0, 0, 1, 1,
        0, 0, 0, 1, 1, 0,
        0, 0, 0, 0, 1, 0,
        0, 1, 0, 0, 1, 1,
        0, 0, 1, 1, 1, 0]]

    ).reshape(96, 1, 6, 6)


def get_result():
    return torch.tensor([[
        1.0, 0.0, 0.0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        1, 0, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 1, 0], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1], [
        0, 0, 1]])


def get_result2():
    return torch.tensor([
        1.0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
        2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2,
        3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3
    ])
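get_result() spells out the one-hot targets by hand; the same tensor can be derived from class labels like those in get_result2(). A short sketch of my own (not part of the article's code), using `torch.nn.functional.one_hot`:

```python
import torch

# Class labels 1..3 (as in get_result2) converted to the one-hot rows
# that get_result lists by hand: label k -> 1.0 in column k-1.
labels = torch.tensor([1, 2, 3, 1])  # small stand-in for the 96 labels
one_hot = torch.nn.functional.one_hot(labels - 1, num_classes=3).float()
print(one_hot)
```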

Run results:

[Screenshots of the training log and prediction output]



From: https://blog.51cto.com/oldycat/8172806
