
Learning PyTorch from Scratch [4]: Dimension Transforms, Concatenation, and Splitting





  • Learning content
  • Dimension transforms
  • Tensor splitting and concatenation
  • Summary


Learning content:

Dimension transforms; tensor splitting and concatenation.



Dimension transforms:

1. view

import torch

a = torch.rand(4, 1, 28, 28)
print(a.shape)
print(a.view(4, 28*28))      # flatten each sample; view returns a new view, a itself is unchanged
print(a.shape)
b = a.view(4, 28, -1)        # -1 lets PyTorch infer the remaining dimension
b.view(4, 28, 28, -1).shape

'''
torch.Size([4, 1, 28, 28])
tensor([[0.2516, 0.4973, 0.2032,  ..., 0.1892, 0.5932, 0.9167],
        [0.9366, 0.3864, 0.8891,  ..., 0.3008, 0.7179, 0.5442],
        [0.5551, 0.8872, 0.8677,  ..., 0.3314, 0.7326, 0.2640],
        [0.2746, 0.3667, 0.0543,  ..., 0.9942, 0.4714, 0.9417]])
torch.Size([4, 1, 28, 28])
torch.Size([4, 28, 28, 1])
'''
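
view() only re-interprets shape metadata, so the target shape must contain exactly the same number of elements as the original, and at most one dimension may be given as -1 (it is then inferred). A minimal sketch of the failure case (error text in the comment abbreviated):

import torch

a = torch.rand(4, 1, 28, 28)     # 4*1*28*28 = 3136 elements
print(a.view(4, -1).shape)       # torch.Size([4, 784]); the -1 is inferred
try:
    a.view(4, 28, 28, 4)         # 12544 elements != 3136: not allowed
except RuntimeError as e:
    print(e)                     # shape '[4, 28, 28, 4]' is invalid for input of size 3136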

2. unsqueeze

Inserts a new dimension of size 1 (one more level of brackets; the number of elements does not change).

print(a.shape)
a.unsqueeze(0).shape, a.unsqueeze(-1).shape, a.unsqueeze(4).shape, a.unsqueeze(-4).shape, a.unsqueeze(-5).shape

'''
torch.Size([4, 1, 28, 28])
(torch.Size([1, 4, 1, 28, 28]),
 torch.Size([4, 1, 28, 28, 1]),
 torch.Size([4, 1, 28, 28, 1]),
 torch.Size([4, 1, 1, 28, 28]),
 torch.Size([1, 4, 1, 28, 28]))
'''

Intuitively: dim=0 inserts at the outermost level, wrapping all elements; dim=-1 inserts at the innermost level, wrapping each element.

a = torch.tensor([1.2, 2.3])
a.unsqueeze(-1), a.unsqueeze(0)

'''
(tensor([[1.2000],
         [2.3000]]),
 tensor([[1.2000, 2.3000]]))
'''

Adding dimensions so that a bias can be broadcast:

bias = torch.rand(1, 32, 1, 1)
f = torch.rand(4, 32, 14, 14)
print((f+bias).shape)    # bias broadcasts against f

bias = torch.rand(32)
bias = bias.unsqueeze(1).unsqueeze(2).unsqueeze(0)    # [32] -> [1, 32, 1, 1]
(f+bias).shape

'''
torch.Size([4, 32, 14, 14])
torch.Size([4, 32, 14, 14])
'''
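
The same [1, 32, 1, 1] shape can also be produced in a single call with view(); a minimal equivalent sketch:

bias2 = torch.rand(32).view(1, 32, 1, 1)             # same result as the chained unsqueeze calls
print(bias2.shape)                                   # torch.Size([1, 32, 1, 1])
print((torch.rand(4, 32, 14, 14) + bias2).shape)     # torch.Size([4, 32, 14, 14])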

squeeze: removes dimensions of size 1 (one level of brackets disappears; the number of elements is unchanged).

b = torch.rand(1, 32, 1, 1)
print(b.squeeze().shape)     # removes all size-1 dimensions
print(b.squeeze(0).shape)
print(b.squeeze(-1).shape)
print(b.squeeze(1).shape)    # no change: dim 1 has size 32, not 1
b.squeeze(-4).shape

'''
torch.Size([32])
torch.Size([32, 1, 1])
torch.Size([1, 32, 1])
torch.Size([1, 32, 1, 1])
torch.Size([32, 1, 1])
'''

expand: a logical copy; no new memory is allocated. Only size-1 dimensions can be expanded, and -1 keeps a dimension's original size.

a = torch.rand(4, 32, 14, 14)
b = torch.randn(1, 32, 1, 1)
print(a.shape, b.shape)

b.expand(4, 32, 14, 14).shape, b.expand(-1, 32, -1, -1).shape, b.shape

'''
torch.Size([4, 32, 14, 14]) torch.Size([1, 32, 1, 1])
(torch.Size([4, 32, 14, 14]),
 torch.Size([1, 32, 1, 1]),
 torch.Size([1, 32, 1, 1]))
'''
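
Because expand() returns a view, the expanded tensor shares storage with the original; writing through the original is visible in the expansion. A small sketch to verify this:

b = torch.zeros(1, 32, 1, 1)
e = b.expand(4, 32, 14, 14)           # no data is copied
b[0, 0, 0, 0] = 7.0
print(e[3, 0, 5, 5])                  # tensor(7.): every expanded entry aliases b[0, 0, 0, 0]
print(b.data_ptr() == e.data_ptr())   # True: same underlying storage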

repeat: physically copies the data. The arguments are per-dimension repeat counts, not a target shape.

print(b.shape, 32*32)    # repeat counts multiply the existing sizes: 32 * 32 = 1024

b.repeat(4, 32, 1, 1).shape, b.repeat(4, 1, 1, 1).shape, b.repeat(4, 1, 32, 32).shape

'''
torch.Size([1, 32, 1, 1]) 1024
(torch.Size([4, 1024, 1, 1]),
 torch.Size([4, 32, 1, 1]),
 torch.Size([4, 32, 32, 32]))
'''
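
In contrast to expand(), repeat() materialises real copies, so the result owns its own memory. A small sketch:

b = torch.zeros(1, 32, 1, 1)
r = b.repeat(4, 1, 14, 14)   # shape [4, 32, 14, 14], data physically copied
b[0, 0, 0, 0] = 7.0
print(r[0, 0, 0, 0])         # tensor(0.): the copy is independent of b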

t(): matrix transpose, for 2-D tensors only.

a = torch.randn(3, 4)
a.shape, a.t().shape

'''
(torch.Size([3, 4]), torch.Size([4, 3]))
'''
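
t() is defined only for tensors with at most two dimensions: 0-D and 1-D tensors are returned unchanged, and higher ranks raise an error (use transpose or permute instead). A quick sketch (error text abbreviated):

v = torch.randn(5)
print(v.t().shape)           # torch.Size([5]): a no-op on 1-D tensors
try:
    torch.randn(2, 3, 4).t()
except RuntimeError as e:
    print(e)                 # t() expects a tensor with <= 2 dimensions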

transpose: swaps two given dimensions.

a = torch.rand(4, 3, 30, 32)
a.shape, a.transpose(1, 3).shape

'''
(torch.Size([4, 3, 30, 32]), torch.Size([4, 32, 30, 3]))
'''

contiguous() copies a transposed tensor into a contiguous block of memory. It is a necessary step before view(), which only accepts contiguous tensors; transpose() itself returns a view over the original data without moving it.

a = torch.rand(4, 3, 30, 32)
# flattening the transposed tensor and viewing it straight back scrambles the dimension order
a1 = a.transpose(1, 3).contiguous().view(4, 3*30*32).view(4, 3, 30, 32)
# restoring the transposed shape first and then transposing back recovers the original
a2 = a.transpose(1, 3).contiguous().view(4, 3*30*32).view(4, 32, 30, 3).transpose(1, 3)
print(a1.shape, a2.shape)

print(torch.all(torch.eq(a, a1)), torch.all(torch.eq(a, a1)).int())
print(torch.all(torch.eq(a, a2)), torch.all(torch.eq(a, a2)).int())

'''
torch.Size([4, 3, 30, 32]) torch.Size([4, 3, 30, 32])
tensor(False) tensor(0, dtype=torch.int32)
tensor(True) tensor(1, dtype=torch.int32)
'''
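
The reason contiguous() is needed: view() refuses non-contiguous input, while reshape() copies automatically when required. A minimal sketch (error text abbreviated):

a = torch.rand(4, 3, 30, 32)
t = a.transpose(1, 3)            # a non-contiguous view
print(t.is_contiguous())         # False
try:
    t.view(4, -1)
except RuntimeError as e:
    print(e)                     # view size is not compatible ... use .reshape(...) instead
print(t.reshape(4, -1).shape)    # torch.Size([4, 2880]): reshape copies if needed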

permute: reorders all dimensions at once, given the new order of dimension indices.

a = torch.randn(4, 3, 28, 30)
a.shape, a.permute(3, 0, 2, 1).shape

'''
(torch.Size([4, 3, 28, 30]), torch.Size([30, 4, 28, 3]))
'''
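
A common use of permute is converting an image batch from channels-first [N, C, H, W] to channels-last [N, H, W, C]; like transpose, it returns a (generally non-contiguous) view. A minimal sketch:

imgs = torch.rand(4, 3, 28, 28)   # N, C, H, W
nhwc = imgs.permute(0, 2, 3, 1)
print(nhwc.shape)                 # torch.Size([4, 28, 28, 3])
print(nhwc.is_contiguous())       # False: call .contiguous() before view()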

broadcasting: the same mechanism as TensorFlow's broadcast_to. Shapes are aligned and expanded starting from the innermost dimension, i.e. from the last dimension backwards.

x = torch.tensor([1, 2, 3])
print(x, x.shape, torch.broadcast_to(x, (6, 3)))

'''
tensor([1, 2, 3]) torch.Size([3]) tensor([[1, 2, 3],
        [1, 2, 3],
        [1, 2, 3],
        [1, 2, 3],
        [1, 2, 3],
        [1, 2, 3]])
'''
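
The alignment rule: starting from the trailing dimension, each pair of sizes must either match or contain a 1; otherwise broadcasting fails. A small sketch (error text abbreviated):

x = torch.rand(4, 32, 14, 14)
print((x + torch.rand(14)).shape)        # torch.Size([4, 32, 14, 14])
print((x + torch.rand(32, 1, 1)).shape)  # torch.Size([4, 32, 14, 14])
try:
    x + torch.rand(3)                    # trailing sizes 14 vs 3: incompatible
except RuntimeError as e:
    print(e)                             # The size of tensor a (14) must match ... (3) ...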

torch.broadcast_tensors returns the broadcast versions of all of its inputs; here, a tuple of two tensors.

x = torch.rand(4, 2)
bias = torch.randn(1, 2)

# returns a tuple: (broadcast bias, broadcast x)
bias1 = torch.broadcast_tensors(bias, x)
x, bias, bias1

'''
(tensor([[0.5489, 0.0084],
         [0.2405, 0.0769],
         [0.2159, 0.5728],
         [0.2529, 0.7946]]),
 tensor([[0.3781, 0.9359]]),
 (tensor([[0.3781, 0.9359],
          [0.3781, 0.9359],
          [0.3781, 0.9359],
          [0.3781, 0.9359]]),
  tensor([[0.5489, 0.0084],
          [0.2405, 0.0769],
          [0.2159, 0.5728],
          [0.2529, 0.7946]])))
'''

Tensor splitting and concatenation:

cat: concatenation along an existing dimension.

a = torch.rand(4, 32, 8)
b = torch.rand(5, 32, 8)
print(torch.cat([a, b], dim=0).shape)

a1 = torch.rand(4, 3, 32, 32)
a2 = torch.rand(5, 3, 32, 32)
print(torch.cat([a1, a2], dim=0).shape)

a1 = torch.rand(4, 1, 32, 32)
a2 = torch.rand(4, 6, 32, 32)
print(torch.cat([a1, a2], dim=1).shape)

a1 = torch.rand(4, 3, 32, 32)
a2 = torch.rand(4, 3, 32, 3)
print(torch.cat([a1, a2], dim=-1).shape)

'''
torch.Size([9, 32, 8])
torch.Size([9, 3, 32, 32])
torch.Size([4, 7, 32, 32])
torch.Size([4, 3, 32, 35])
'''
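
cat() requires every dimension except the concatenation dimension to match exactly. A small sketch of the failure case (error text abbreviated):

a1 = torch.rand(4, 3, 32, 32)
a2 = torch.rand(4, 1, 32, 32)
print(torch.cat([a1, a2], dim=1).shape)  # torch.Size([4, 4, 32, 32])
try:
    torch.cat([a1, a2], dim=0)           # dim 1 differs (3 vs 1): error
except RuntimeError as e:
    print(e)                             # Sizes of tensors must match except in dimension 0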

stack: inserts a new dimension that acts as a selector ("switch") between the stacked tensors. The inputs must have exactly the same shape.

a1 = torch.rand(4, 3, 16, 32)
a2 = torch.randn(4, 3, 16, 32)
print(torch.cat([a1, a2], dim=2).shape)

print(torch.stack([a1, a2], dim=2).shape)

a = torch.rand(32, 8)
b = torch.rand(32, 8)

torch.stack([a, b], dim=0).shape

'''
torch.Size([4, 3, 32, 32])
torch.Size([4, 3, 2, 16, 32])
torch.Size([2, 32, 8])
'''
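
stack(dim=d) is equivalent to unsqueezing every input at d and then concatenating along d; a minimal sketch verifying the equivalence:

a = torch.rand(32, 8)
b = torch.rand(32, 8)
s1 = torch.stack([a, b], dim=0)
s2 = torch.cat([a.unsqueeze(0), b.unsqueeze(0)], dim=0)
print(s1.shape, torch.equal(s1, s2))   # torch.Size([2, 32, 8]) True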

split: cuts a tensor along a given dimension, either by a list of piece lengths or by a single fixed length.

a = torch.rand(32, 8)
b = torch.rand(32, 8)

c = torch.stack([a, b], dim=0)
print(c.shape)

aa, bb = c.split([1, 1], dim=0)
print(aa.shape, bb.shape)

aa, bb = c.split([7, 1], dim=-1)
print(aa.shape, bb.shape)

aa, bb = c.split(1, dim=0)
print(aa.shape, bb.shape)

c = torch.cat([a, b], dim=0)
print(c.shape)

aa, bb = c.split([1,63], dim=0)
print(aa.shape, bb.shape)

'''
torch.Size([2, 32, 8])
torch.Size([1, 32, 8]) torch.Size([1, 32, 8])
torch.Size([2, 32, 7]) torch.Size([2, 32, 1])
torch.Size([1, 32, 8]) torch.Size([1, 32, 8])
torch.Size([64, 8])
torch.Size([1, 8]) torch.Size([63, 8])
'''
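
When split() is given a single integer, it is the length of each piece; if the dimension is not divisible, the last piece is simply shorter. A small sketch:

c = torch.rand(5, 32, 8)
parts = c.split(2, dim=0)
print([p.shape for p in parts])
# [torch.Size([2, 32, 8]), torch.Size([2, 32, 8]), torch.Size([1, 32, 8])]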

chunk: splits into a given number of pieces. (This block also shows torch.full and the logarithm functions.)

c = torch.randn(2, 32, 8)
print(c.shape)

aa, bb = c.chunk(2, dim=0)
print(aa.shape, bb.shape)

a = torch.full([4], 10)    # a length-4 tensor filled with 10
print(torch.log10(a), torch.log(a), torch.log2(a))

'''
torch.Size([2, 32, 8])
torch.Size([1, 32, 8]) torch.Size([1, 32, 8])
tensor([1., 1., 1., 1.]) tensor([2.3026, 2.3026, 2.3026, 2.3026]) tensor([3.3219, 3.3219, 3.3219, 3.3219])
'''
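
chunk() and split() are easy to confuse: chunk(n) asks for n pieces, while split(n) asks for pieces of length n. A minimal sketch of the difference:

c = torch.rand(6, 8)
print([p.shape for p in c.chunk(3, dim=0)])   # three pieces of size [2, 8]
print([p.shape for p in c.split(3, dim=0)])   # two pieces of size [3, 8]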

Summary

[Figure: summary mind map of dimension transforms, concatenation and splitting]




From: https://blog.51cto.com/guog/6439017
