Disclaimer: This is a repost; see the title for the original text. In case of infringement, please contact the author and the post will be taken down.
Proceedings of the 38th International Conference on Machine Learning, PMLR 139, 2021.
Abstract
1. Introduction and Motivating Work
2. Approach
2.1. Creating a Sufficiently Large Dataset
2.2. Selecting an Efficient Pre-Training Method
2.3. Choosing and Scaling a Model
2.4. Pre-training
2.5. Using CLIP
3. Analysis
3.1. Initial Comparison to Visual N-Grams
3.2. Zero-Shot Performance
3.3. Representation Learning
3.4. Robustness to Natural Distribution Shift
4. Data Overlap Analysis
5. Broader Impacts
6. Limitations
7. Related Work
8. Conclusion
Tags: Natural, Language, Models, Transferable, Visual, Learning
From: https://www.cnblogs.com/lucifer1997/p/18219682