If you’re reading this blog, it’s likely that you’re familiar with some of the classic applications of convolutional neural networks to tasks like image recognition and text classification. Convolutions are a very natural and powerful tool for capturing spatially invariant patterns. It matters little where in the image whiskers occur when we’re identifying a cat. Similarly, in classifying a document as a court case transcript, the presence of legal jargon phrases matters much more to us than their position in the document. But what about temporal patterns? By a similar token, might there be recurring patterns, like weekly cyclicality and certain autocorrelation structures, that convolutions are well-suited to model?
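To make the question concrete, here is a minimal sketch of a dilated, causal 1-D convolution of the kind popularized by WaveNet, applied to a toy daily series with a weekly cycle. It assumes PyTorch purely for illustration; the specific channel counts and kernel size are arbitrary choices, not the post's model.

```python
import math
import torch
import torch.nn as nn

# Toy daily series with a weekly cycle plus noise: shape (batch, channels, time).
t = torch.arange(365, dtype=torch.float32)
series = torch.sin(2 * math.pi * t / 7) + 0.1 * torch.randn(365)
x = series.view(1, 1, -1)

# A single dilated, causal 1-D convolution. Left-padding by
# (kernel_size - 1) * dilation keeps each output from seeing the future;
# dilation=7 lets the filter compare values exactly one week apart.
kernel_size, dilation = 2, 7
causal_pad = nn.ConstantPad1d(((kernel_size - 1) * dilation, 0), 0.0)
conv = nn.Conv1d(in_channels=1, out_channels=4,
                 kernel_size=kernel_size, dilation=dilation)

out = conv(causal_pad(x))
print(out.shape)  # torch.Size([1, 4, 365]) -- one activation per time step
```

Because the same small filter slides along the whole axis, a pattern such as "today looks like seven days ago" is detected wherever it occurs in the series, which is exactly the translation invariance that makes convolutions attractive here.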