
CS231n: Convolutional Neural Networks for Visual Recognition







Schedule (each entry lists the event type, date, description, and course materials):

Lecture 1 (Tuesday, April 4)
Course Introduction: computer vision overview; historical context; course logistics
Materials: [slides] [video]

Lecture 2 (Thursday, April 6)
Image Classification: the data-driven approach; k-nearest neighbor; linear classification I
Materials: [slides] [video] [python/numpy tutorial] [image classification notes] [linear classification notes]
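
The k-nearest neighbor classifier from Lecture 2 fits in a few lines of NumPy. Below is a minimal sketch, not the assignment's API; the shapes, the choice of k, and the L2 distance are illustrative assumptions.

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=5):
    """Classify each test row by majority vote among its k nearest
    training rows under L2 distance. Shapes: X_train (N, D),
    y_train (N,) non-negative integer labels, X_test (M, D)."""
    # Pairwise squared L2 distances via (a - b)^2 = a^2 - 2ab + b^2.
    d2 = (
        (X_test ** 2).sum(axis=1, keepdims=True)
        - 2.0 * X_test @ X_train.T
        + (X_train ** 2).sum(axis=1)
    )
    # Indices of the k smallest distances per test point.
    nearest = np.argsort(d2, axis=1)[:, :k]
    # Majority vote over the neighbors' labels.
    votes = y_train[nearest]                        # (M, k)
    return np.array([np.bincount(v).argmax() for v in votes])
```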

Lecture 3 (Tuesday, April 11)
Loss Functions and Optimization: linear classification II; higher-level representations, image features; optimization, stochastic gradient descent
Materials: [slides] [video] [linear classification notes] [optimization notes]
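
Lecture 3's loss functions can be made concrete with the multiclass SVM (hinge) loss and a plain stochastic gradient descent step. A minimal NumPy sketch, assuming a linear classifier with weights W of shape (D, C); the hyperparameters are placeholders.

```python
import numpy as np

def svm_loss(W, X, y, reg=1e-3):
    """Multiclass SVM (hinge) loss with L2 regularization.
    W: (D, C) weights, X: (N, D) data, y: (N,) integer labels."""
    N = X.shape[0]
    scores = X @ W                                   # (N, C)
    correct = scores[np.arange(N), y][:, None]       # (N, 1)
    margins = np.maximum(0, scores - correct + 1.0)  # margin of 1
    margins[np.arange(N), y] = 0.0                   # skip true class
    loss = margins.sum() / N + reg * np.sum(W * W)

    # Gradient: each positive margin pushes +x into the wrong class
    # column and -x into the true class column.
    mask = (margins > 0).astype(X.dtype)
    mask[np.arange(N), y] = -mask.sum(axis=1)
    dW = X.T @ mask / N + 2 * reg * W
    return loss, dW

def sgd_step(W, X, y, lr=1e-3, batch=128):
    """One vanilla SGD step on a random minibatch."""
    idx = np.random.choice(X.shape[0], batch)
    loss, dW = svm_loss(W, X[idx], y[idx])
    return W - lr * dW, loss
```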

Lecture 4 (Thursday, April 13)
Introduction to Neural Networks: backpropagation; multi-layer perceptrons; the neural viewpoint
Materials: [slides] [video] [backprop notes] [linear backprop example]; optional: [derivatives notes], [Efficient BackProp], related: [1] [2] [3]
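
Backpropagation, the core of Lecture 4, is the chain rule applied gate by gate. A tiny worked example in the style of the backprop notes, pushing a gradient through a sigmoid of a dot product; the concrete numbers are illustrative.

```python
import math

# Forward pass of f(w, x) = 1 / (1 + exp(-(w0*x0 + w1*x1 + w2))),
# then backprop through each intermediate value by the chain rule.
w = [2.0, -3.0, -3.0]            # example weights (w2 acts as a bias)
x = [-1.0, -2.0]                 # example inputs

# Forward: break the expression into simple gates.
dot = w[0] * x[0] + w[1] * x[1] + w[2]
f = 1.0 / (1.0 + math.exp(-dot))              # sigmoid, f ≈ 0.731

# Backward: d(sigmoid)/d(dot) = f * (1 - f), then chain rule.
ddot = f * (1 - f)
dw = [x[0] * ddot, x[1] * ddot, 1.0 * ddot]   # gradients on w0, w1, w2
dx = [w[0] * ddot, w[1] * ddot]               # gradients on x0, x1

print(f, dw, dx)
```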

Lecture 5 (Tuesday, April 18)
Convolutional Neural Networks: history; convolution and pooling; ConvNets outside vision
Materials: [slides] [video] [ConvNet notes]
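
Lecture 5's two core operations, convolution and pooling, are short to write naively. A minimal single-channel NumPy sketch; real layers batch this over images and channels and vectorize the loops.

```python
import numpy as np

def conv2d_naive(x, w, stride=1, pad=1):
    """Single-channel 2D convolution (really cross-correlation, as in
    most deep learning libraries). x: (H, W) input, w: (k, k) filter."""
    x = np.pad(x, pad)
    k = w.shape[0]
    H_out = (x.shape[0] - k) // stride + 1
    W_out = (x.shape[1] - k) // stride + 1
    out = np.zeros((H_out, W_out))
    for i in range(H_out):
        for j in range(W_out):
            patch = x[i*stride:i*stride+k, j*stride:j*stride+k]
            out[i, j] = np.sum(patch * w)   # dot filter with the patch
    return out

def maxpool2x2(x):
    """2x2 max pooling with stride 2; assumes even H and W."""
    H, W = x.shape
    return x.reshape(H // 2, 2, W // 2, 2).max(axis=(1, 3))
```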

Lecture 6 (Thursday, April 20)
Training Neural Networks, part I: activation functions, initialization, dropout, batch normalization
Materials: [slides] [video] [Neural Nets notes 1] [Neural Nets notes 2] [Neural Nets notes 3]; optional: tips/tricks [1] [2] [3], Deep Learning [Nature]
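
Two of the Lecture 6 topics, batch normalization and (inverted) dropout, have compact forward passes. A minimal training-time sketch in NumPy; the running averages batch norm needs at test time are omitted for brevity.

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    """Batch normalization at training time: normalize each feature
    over the minibatch, then scale and shift. x: (N, D)."""
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma * x_hat + beta

def dropout_forward(x, p=0.5, train=True):
    """Inverted dropout: zero units with probability p at training time
    and rescale by 1/(1-p) so test time needs no change."""
    if not train:
        return x
    mask = (np.random.rand(*x.shape) >= p) / (1.0 - p)
    return x * mask
```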

A1 Due (Thursday, April 20)
Assignment #1 due: kNN, SVM, Softmax, two-layer network
Materials: [Assignment #1]

Lecture 7 (Tuesday, April 25)
Training Neural Networks, part II: update rules, ensembles, data augmentation, transfer learning
Materials: [slides] [video] [Neural Nets notes 3]
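
Among the update rules covered in Lecture 7, Adam is representative. A minimal sketch of one update; the function and the state-dict layout here are illustrative, not the course's code.

```python
import numpy as np

def adam_step(w, dw, state, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: exponential averages of the gradient (m) and of
    its square (v), both bias-corrected by the step count t."""
    state["t"] += 1
    state["m"] = beta1 * state["m"] + (1 - beta1) * dw
    state["v"] = beta2 * state["v"] + (1 - beta2) * dw * dw
    m_hat = state["m"] / (1 - beta1 ** state["t"])
    v_hat = state["v"] / (1 - beta2 ** state["t"])
    return w - lr * m_hat / (np.sqrt(v_hat) + eps)

# Usage: state = {"m": np.zeros_like(w), "v": np.zeros_like(w), "t": 0}
```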

Proposal Due (Tuesday, April 25)
Course Project Proposal due
Materials: [proposal description]

Lecture 8 (Thursday, April 27)
Deep Learning Software: Caffe, Torch, Theano, TensorFlow, Keras, PyTorch, etc.
Materials: [slides] [video]
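
As a taste of one of the frameworks surveyed in Lecture 8, here is a complete PyTorch training step for a small fully connected classifier. The layer sizes, batch size, and learning rate are placeholders, not values from the course.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# A two-layer classifier and one SGD training step in PyTorch.
model = nn.Sequential(nn.Linear(3072, 100), nn.ReLU(), nn.Linear(100, 10))
opt = torch.optim.SGD(model.parameters(), lr=1e-2)

x = torch.randn(64, 3072)        # a fake minibatch of flattened images
y = torch.randint(0, 10, (64,))  # fake integer labels

loss = F.cross_entropy(model(x), y)
opt.zero_grad()   # clear gradients from the previous step
loss.backward()   # autograd computes all parameter gradients
opt.step()        # apply the SGD update
```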

Lecture 9 (Tuesday, May 2)
CNN Architectures: AlexNet, VGG, GoogLeNet, ResNet, etc.
Materials: [slides] [video] [AlexNet] [VGGNet] [GoogLeNet] [ResNet]
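
The central ResNet idea from Lecture 9 is the identity shortcut. A minimal fully connected sketch of one residual block; actual ResNet blocks use convolutions and batch norm, so this is only the skeleton of the idea.

```python
import numpy as np

def relu(a):
    return np.maximum(0.0, a)

def residual_block(x, W1, W2):
    """Learn a residual F(x) and add the input back:
    out = relu(F(x) + x). Shapes: x (N, D), W1 (D, H), W2 (H, D),
    so the identity shortcut matches dimensions."""
    f = relu(x @ W1) @ W2        # F(x): two linear maps with a ReLU
    return relu(f + x)           # add the shortcut, then nonlinearity
```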

Lecture 10 (Thursday, May 4)
Recurrent Neural Networks: RNN, LSTM, GRU; language modeling; image captioning, visual question answering; soft attention
Materials: [slides] [video]; DL book RNN chapter (optional); [min-char-rnn] [char-rnn] [neuraltalk2]
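
A vanilla RNN step, the starting point of Lecture 10 and of min-char-rnn, fits in a few lines of NumPy. The weight names and shapes follow the usual h_t = tanh(Wxh x + Whh h_{t-1} + bh) convention and are assumptions of this sketch.

```python
import numpy as np

def rnn_step(x, h_prev, Wxh, Whh, Why, bh, by):
    """One step of a vanilla RNN for character-level language modeling.
    x: (V,) one-hot input, h_prev: (H,) previous hidden state,
    Wxh: (H, V), Whh: (H, H), Why: (V, H)."""
    h = np.tanh(Wxh @ x + Whh @ h_prev + bh)   # new hidden state
    y = Why @ h + by                           # next-character scores
    p = np.exp(y - y.max())                    # stable softmax
    return h, p / p.sum()                      # state + distribution
```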

A2 Due (Thursday, May 4)
Assignment #2 due: neural networks, ConvNets
Materials: [Assignment #2]

Midterm (Tuesday, May 9)
In-class midterm. Location: Various (not …)

Lecture 11 (Thursday, May 11)
Detection and Segmentation: semantic segmentation; object detection; instance segmentation
Materials: [slides] [video]
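
Object detection evaluation in Lecture 11 leans on intersection over union. A minimal sketch for axis-aligned boxes in (x1, y1, x2, y2) form; the box format is an assumption of this example.

```python
def iou(box_a, box_b):
    """Intersection over union of two boxes given as (x1, y1, x2, y2);
    the standard overlap measure in object detection benchmarks."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

print(iou((0, 0, 2, 2), (1, 1, 3, 3)))   # 1 / 7 ≈ 0.143
```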

Lecture 12 (Tuesday, May 16)
Visualizing and Understanding: feature visualization and inversion; adversarial examples; DeepDream and style transfer
Materials: [slides] [video] [DeepDream] [neural-style] [fast-neural-style]
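
For the adversarial examples topic in Lecture 12, the fast gradient sign method is the classic construction. A minimal sketch, assuming you have already backpropagated the classification loss to the input image to get `grad_x`; the epsilon value and [0, 1] pixel range are assumptions.

```python
import numpy as np

def fgsm(x, grad_x, eps=0.007):
    """Fast gradient sign method: nudge every pixel by eps in the
    direction that increases the loss. The result looks unchanged to
    a human but can flip the classifier's prediction.
    grad_x: gradient of the loss w.r.t. the input image x."""
    x_adv = x + eps * np.sign(grad_x)
    return np.clip(x_adv, 0.0, 1.0)   # stay in the valid pixel range
```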

Milestone (Tuesday, May 16)
Course Project Milestone due

Lecture 13 (Thursday, May 18)
Generative Models: PixelRNN/CNN; variational autoencoders; generative adversarial networks
Materials: [slides] [video]
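
From Lecture 13, the variational autoencoder's reparameterization trick and its KL regularizer are both one-liners. A minimal NumPy sketch; the encoder and decoder networks that would produce `mu` and `log_var` are assumed to exist elsewhere.

```python
import numpy as np

def reparameterize(mu, log_var):
    """VAE reparameterization trick: sample z ~ N(mu, sigma^2) as
    z = mu + sigma * eps with eps ~ N(0, I), keeping the sampling
    step differentiable w.r.t. mu and log_var."""
    eps = np.random.randn(*mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_to_standard_normal(mu, log_var):
    """KL(N(mu, sigma^2) || N(0, I)), the regularizer in the VAE loss."""
    return -0.5 * np.sum(1 + log_var - mu**2 - np.exp(log_var))
```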

Lecture 14 (Tuesday, May 23)
Deep Reinforcement Learning: policy gradients, hard attention; Q-learning, actor-critic
Materials: [slides] [video]
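
Q-learning from Lecture 14 reduces, in the tabular case, to a single update rule; deep Q-learning replaces the table with a network. A minimal sketch with an epsilon-greedy behavior policy; the table shape (num_states, num_actions) and the hyperparameters are assumptions.

```python
import numpy as np

def q_update(Q, s, a, r, s_next, alpha=0.1, gamma=0.99):
    """One tabular Q-learning update:
    Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    target = r + gamma * np.max(Q[s_next])
    Q[s, a] += alpha * (target - Q[s, a])
    return Q

def epsilon_greedy(Q, s, eps=0.1):
    """Behavior policy: explore with probability eps, else act greedily."""
    if np.random.rand() < eps:
        return np.random.randint(Q.shape[1])
    return int(np.argmax(Q[s]))
```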

Guest Lecture (Thursday, May 25)
Invited Talk: Song Han, "Efficient Methods and Hardware for Deep Learning"
Materials: [slides] [video]

A3 Due (Friday, May 26)
Assignment #3 due
Materials: [Assignment #3]

Guest Lecture (Tuesday, May 30)
Invited Talk: Ian Goodfellow, "Adversarial Examples and Adversarial Training"
Materials: [slides] [video]

Lecture 16 (Thursday, June 1)
Student spotlight talks, conclusions
Materials: [slides]

Poster Due (Monday, June 5)
Poster PDF due
Materials: [poster description]

Poster Presentation (Tuesday, June 6)

Final Project Due (Monday, June 12)
Final course project due date
Materials: [reports]


From: https://blog.51cto.com/u_12667998/7074887
