a. Multiple episodes are needed ===> transitive reasoning. e.g. asked "where is the football?", the first pass attends to the sentence "John put down the football"; the second pass then needs to attend to John's location.
b. Attention mechanism: a two-layer feed-forward neural network. The input is the candidate fact c, the previous memory m, and the question q. Features are obtained by taking the element-wise product, the matrix product, and the absolute difference of q with c, and of q with m.
c. Memory update mechanism: h = f(c, h_previous, g). The last hidden state is the input to the answer module (see the sketch after the reference list).
4. Answer module

To do:
1. Character-level convolutional networks for text classification
2. Convolutional neural networks for text classification: shallow word-level vs. deep character-level
3. Deep convolutional networks for text classification
4. Adversarial training methods for semi-supervised text classification

References:
1. Bag of Tricks for Efficient Text Classification
2. Convolutional Neural Networks for Sentence Classification
3. A Sensitivity Analysis of (and Practitioners' Guide to) Convolutional Neural Networks for Sentence Classification
4. Deep Learning for Chatbots, Part 2 – Implementing a Retrieval-Based Model in Tensorflow
5. Recurrent Convolutional Neural Network for Text Classification
6. Hierarchical Attention Networks for Document Classification
7. Neural Machine Translation by Jointly Learning to Align and Translate
8. Attention Is All You Need
9. Ask Me Anything: Dynamic Memory Networks for Natural Language Processing
10. Tracking the State of the World with Recurrent Entity Networks
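To make (b) and (c) concrete, below is a minimal numpy sketch of the episodic attention gate and the gated memory update. Everything in it is illustrative: the weight names (W1, b1, W2, b2), the dimensions, and the simplified gated blend used in place of the paper's modified GRU are assumptions; only the element-wise-product and absolute-difference features mentioned above are built, and the matrix-product (bilinear) terms are omitted for brevity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def attention_gate(c, m, q, W1, b1, W2, b2):
    """Two-layer feed-forward scoring network (hypothetical weights W1, b1, W2, b2).

    The feature vector z combines the candidate fact c with the question q and the
    previous memory m via element-wise products and absolute differences, as in (b).
    """
    z = np.concatenate([c * q, c * m, np.abs(c - q), np.abs(c - m)])
    hidden = np.tanh(W1 @ z + b1)      # first layer
    return sigmoid(W2 @ hidden + b2)   # scalar gate g in (0, 1)

def episode(facts, m, q, W1, b1, W2, b2):
    """One pass over the facts: a gated update h = f(c, h_previous, g), as in (c).

    f here is a simple gated blend standing in for the paper's modified GRU;
    the final hidden state summarises the episode and feeds the answer module.
    """
    h = np.zeros_like(m)
    for c in facts:
        g = attention_gate(c, m, q, W1, b1, W2, b2)
        h = g * np.tanh(c + h) + (1.0 - g) * h   # simplified update, not the full GRU
    return h

# Toy usage with random weights (dimensions are illustrative only)
d, hidden_dim = 8, 16
rng = np.random.default_rng(0)
facts = [rng.standard_normal(d) for _ in range(3)]
q = rng.standard_normal(d)
m = q.copy()                                     # memory is initialised with the question
W1 = rng.standard_normal((hidden_dim, 4 * d)); b1 = np.zeros(hidden_dim)
W2 = rng.standard_normal((1, hidden_dim));     b2 = np.zeros(1)
h = episode(facts, m, q, W1, b1, W2, b2)
print(h.shape)   # (8,) -> last hidden state handed to the answer module
```

In the full model the gate modulates a GRU over the facts and the episode output is then used to update the memory m before the next pass; the sketch only shows the gating structure, with the last hidden state of the final pass going to the answer module, consistent with (c).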