Papers About Attention

I have started to seriously read some NLP/CV/multimodal publications, trying to focus on frequently mentioned terms such as attention and fusion.

‘Attention Model incorporates this notion of relevance by allowing the model to dynamically pay attention to only certain parts of the input that help in performing the task at hand effectively’

  • Attention Is All You Need, from Google, 2017

The Transformer architecture is proposed in this paper.
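
To make the core idea concrete, here is a minimal NumPy sketch of the scaled dot-product attention defined in that paper, softmax(QK^T / sqrt(d_k)) V. The shapes, variable names, and toy data below are illustrative assumptions, not part of the paper itself.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention from 'Attention Is All You Need'.

    Q, K: (seq_len, d_k), V: (seq_len, d_v).
    Returns the attended output of shape (seq_len, d_v).
    """
    d_k = Q.shape[-1]
    # Similarity scores between queries and keys, scaled by sqrt(d_k)
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over keys gives the attention weights (rows sum to 1)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output is a weighted sum of the values, i.e. the model
    # "pays attention" to the positions with the largest weights
    return weights @ V

# Toy usage with made-up shapes: 4 tokens, 8-dimensional vectors
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (4, 8)
```

The scaling by sqrt(d_k) keeps the dot products from growing too large in magnitude, which would otherwise push the softmax into regions with tiny gradients.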
