Attention Mechanisms in Deep Learning (Part 1)


Text Summarization
EMNLP 2015: A Neural Attention Model for Sentence Summarization [7]
Given an English article as the input sequence, the model outputs a corresponding summary sequence. The attention mechanism is used to associate each word in the output summary with a few specific words in the input, as in the sketch below.
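To make this concrete, here is a minimal NumPy sketch of soft attention: at each decoding step the current decoder state is scored against every encoder state, the scores are normalized into a distribution over input positions, and the context vector is the weighted sum of encoder states. The dot-product scoring function and all names here (attention_context, encoder_states) are illustrative assumptions, not the exact parameterization learned by Rush et al. [7].

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attention_context(decoder_state, encoder_states):
    """Dot-product soft attention (a simplified, hypothetical variant).

    Scores each input position against the current decoder state,
    normalizes the scores into attention weights, and returns the
    weighted sum of encoder states as the context vector.
    """
    scores = encoder_states @ decoder_state   # (T,) one score per input word
    weights = softmax(scores)                 # (T,) distribution over input words
    context = weights @ encoder_states        # (d,) weighted sum of encoder states
    return context, weights

# Toy example: 5 input words, hidden size 4.
rng = np.random.default_rng(0)
enc = rng.standard_normal((5, 4))  # encoder states, one per input word
dec = rng.standard_normal(4)       # decoder state at one summary-word step
ctx, w = attention_context(dec, enc)
print("attention weights:", np.round(w, 3))
```

The weight vector printed at the end is what "associates" an output word with the input: the input positions with the largest weights are the words the summary word attends to.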
References:
[1] Sequence to Sequence Learning with Neural Networks:
https://papers.nips.cc/paper/5346-sequence-to-sequence-learning-with-neural-networks.pdf
[2] Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation:
https://www.aclweb.org/anthology/D14-1179
[3] ICLR 2015: Neural Machine Translation by Jointly Learning to Align and Translate:
https://arxiv.org/pdf/1409.0473.pdf
[4] ICML 2015: Show, Attend and Tell: Neural Image Caption Generation with Visual Attention:
https://arxiv.org/pdf/1502.03044.pdf
[5] NIPS 2015: Attention-Based Models for Speech Recognition:
https://arxiv.org/pdf/1506.07503.pdf
[6] ICLR 2016: Reasoning about Entailment with Neural Attention:
https://arxiv.org/pdf/1509.06664.pdf
[7] EMNLP 2015: A Neural Attention Model for Sentence Summarization:
https://www.aclweb.org/anthology/D/D15/D15-1044.pdf