How to count the number of parameters (params) in a deep learning model
1 Rough estimate
Given the model file size, for example 312 MB,
that is about 312,000,000 bytes.
A parameter stored as a 32-bit float occupies 4 bytes,
so the parameter count is roughly 312,000,000 / 4 = 78,000,000 parameters.
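As a sanity check, a minimal sketch of this estimate (the checkpoint path is a placeholder; it assumes the file stores raw float32 weights, which is roughly true for a pytorch_model.bin saved by transformers):

import os

# hypothetical path: pytorch_model.bin is the usual weight file in a
# Hugging Face transformers checkpoint directory
ckpt = "/path/to/model/pytorch_model.bin"

size_bytes = os.path.getsize(ckpt)
# assuming float32 storage, 4 bytes per parameter
print(size_bytes, "bytes -> about", size_bytes // 4, "parameters")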
2 Precise calculation
Given a trained model (the 312 MB one above, say), its parameters can be counted exactly:
the tensors returned by model.parameters() are the ones updated during training;
the tensors returned by model.buffers() are part of the model state but are not updated by the optimizer (e.g. BatchNorm running statistics);
adding the two counts gives the total number of parameters.

In code (the GPT2-chitchat checkpoint paths are the author's local ones; only modelyuan is actually loaded):

from transformers import GPT2LMHeadModel

# local checkpoint paths; only modelyuan is loaded below
model_path10 = r"/home/arm/disk_arm_8T/xiaoliu/AI610-SDK-r1p0-00eac0/GPT2_chinese_chat/GPT2-chitchat/model_bs1/min_ppl_model_bs1_lay10"
model_path20 = r"/home/arm/disk_arm_8T/xiaoliu/AI610-SDK-r1p0-00eac0/GPT2_chinese_chat/GPT2-chitchat/model_chat1to6_bs8_lay20/min_ppl_model"
modelyuan = "/home/arm/disk_arm_8T/xiaoliu/AI610-SDK-r1p0-00eac0/GPT2_chinese_chat/GPT2-chitchat/model_20wfrom100w/min_ppl_model"

model = GPT2LMHeadModel.from_pretrained(modelyuan)

# parameters updated during training
tol = sum(p.numel() for p in model.parameters())
# buffers: state tensors not updated by the optimizer
buftol = sum(b.numel() for b in model.buffers())

print("update params ===>", tol)
print("not update params ===>", buftol)
print("total ===>", tol + buftol)
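The parameters()/buffers() split is easiest to see on a layer that has both kinds of tensors. A minimal sketch in plain PyTorch (BatchNorm1d is used here only because it carries buffers; the same two sums work for any nn.Module, including the GPT2 model above):

import torch.nn as nn

m = nn.BatchNorm1d(8)

# trainable parameters: weight + bias -> 8 + 8 = 16
n_params = sum(p.numel() for p in m.parameters())
# buffers: running_mean + running_var + num_batches_tracked -> 8 + 8 + 1 = 17
n_buffers = sum(b.numel() for b in m.buffers())

print(n_params, n_buffers, n_params + n_buffers)  # 16 17 33

Multiplying the total count by 4 bytes should land close to the file size used in section 1, though not exactly, since the checkpoint also stores tensor names and serialization overhead.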