Transformer
Introduction
paper: Attention Is All You Need
The Transformer dispenses with recurrence and convolution entirely, modeling a sequence with stacked self-attention and position-wise feed-forward layers.
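The core operation the paper introduces is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. Below is a minimal NumPy sketch of that formula; the toy shapes and random inputs are only for illustration.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K: (seq_len, d_k); V: (seq_len, d_v). Returns (seq_len, d_v)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # pairwise similarities, scaled by sqrt(d_k)
    # Row-wise softmax turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output is a weighted sum of the values

# Toy example: 4 tokens, d_k = d_v = 8
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # (4, 8)
```

In the full model this operation is wrapped in multi-head attention and stacked into encoder and decoder layers.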
BERT
Introduction
paper: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
BERT pre-trains a bidirectional Transformer encoder with a masked language modeling objective (plus next-sentence prediction), then fine-tunes it on downstream understanding tasks.
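A minimal sketch of the masked-token prediction at inference time, assuming the Hugging Face `transformers` library is installed and using the public `bert-base-uncased` checkpoint; the example sentence is made up for illustration.

```python
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

# Mask one token; BERT attends bidirectionally to predict it.
inputs = tokenizer("The capital of France is [MASK].", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the masked position and take the highest-scoring vocabulary id.
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
predicted_id = logits[0, mask_pos].argmax(-1)
print(tokenizer.decode(predicted_id))  # typically prints "paris"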
T5
Introduction
paper: Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
T5 casts every NLP task (classification, translation, summarization, and so on) into a single text-to-text format: the model reads a task-prefixed input string and generates the output string.
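A minimal sketch of that text-to-text interface, assuming `transformers` (and its `sentencepiece` dependency) is installed and using the public `t5-small` checkpoint; the task prefix tells the model which task to perform.

```python
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# The same model handles different tasks, selected purely by the text prefix.
text = "translate English to German: The house is wonderful."
input_ids = tokenizer(text, return_tensors="pt").input_ids
output_ids = model.generate(input_ids, max_new_tokens=20)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```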
GPT-1
Introduction
paper: Improving Language Understanding by Generative Pre-Training
GPT-1 pre-trains a left-to-right Transformer decoder as a language model on unlabeled text, then fine-tunes it for each downstream task with small task-specific input transformations.
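A minimal sketch of left-to-right generation with the pre-trained model, assuming `transformers` is installed and using the public `openai-gpt` checkpoint; the prompt is made up for illustration.

```python
from transformers import OpenAIGPTTokenizer, OpenAIGPTLMHeadModel

tokenizer = OpenAIGPTTokenizer.from_pretrained("openai-gpt")
model = OpenAIGPTLMHeadModel.from_pretrained("openai-gpt")

# Autoregressive continuation: each new token conditions only on the left context.
input_ids = tokenizer("the meaning of life is", return_tensors="pt").input_ids
output_ids = model.generate(input_ids, max_new_tokens=15, do_sample=False)
print(tokenizer.decode(output_ids[0]))
```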
GPT-2
Introduction
paper: Language Models are Unsupervised Multitask Learners
GPT-2 scales the GPT architecture up and shows that a language model trained on web text can perform downstream tasks zero-shot, with the task specified purely through the prompt.
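One probe from the paper appends "TL;DR:" to an article and lets the language model continue, yielding a summary with no fine-tuning. A minimal sketch of that zero-shot probe, assuming `transformers` is installed and using the public `gpt2` checkpoint; the article text is made up for illustration.

```python
from transformers import GPT2Tokenizer, GPT2LMHeadModel

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

article = ("Researchers released a new language model that can translate, "
           "summarize, and answer questions without task-specific training.")
prompt = article + "\nTL;DR:"  # the task is specified entirely in the prompt

input_ids = tokenizer(prompt, return_tensors="pt").input_ids
output_ids = model.generate(input_ids, max_new_tokens=40, do_sample=False,
                            pad_token_id=tokenizer.eos_token_id)
# Print only the generated continuation, not the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[1]:]))
```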
GPT-3
Introduction
paper: Language Models are Few-Shot Learners
GPT-3 scales the architecture to 175B parameters and performs tasks via in-context learning: a few demonstrations are placed in the prompt and the model continues the pattern, with no gradient updates.
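GPT-3's weights are not openly downloadable, so the sketch below only constructs the few-shot prompt format described in the paper (a task description, k demonstrations, then the new query). The helper name `few_shot_prompt` is hypothetical, and the English-French pairs echo the paper's illustrative figure.

```python
def few_shot_prompt(task_description, demonstrations, query):
    """Concatenate a task description, k labeled examples, and the new query.
    The model is expected to continue the text with the answer: conditioning
    on the context only, with no gradient updates."""
    lines = [task_description]
    for x, y in demonstrations:
        lines.append(f"{x} => {y}")
    lines.append(f"{query} =>")
    return "\n".join(lines)

prompt = few_shot_prompt(
    "Translate English to French.",
    [("sea otter", "loutre de mer"), ("cheese", "fromage")],
    "peppermint",
)
print(prompt)
```

The resulting string would be fed to any autoregressive language model, which is expected to continue it with the French translation.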
From: https://www.cnblogs.com/NaughtyBaby/p/17019386.html