Neural Machine Translation with Latent Tree Attention

Document No.: 1745753    Publication date: 2019-11-26

About this document: Neural Machine Translation with Latent Tree Attention was created by J. Bradbury on 2018-04-11. Its main content is as follows. We introduce an attentional neural machine translation model that achieves a long-standing goal of natural language processing: exploiting the hierarchical structure of language without prior annotation. The model pairs a Recurrent Neural Network Grammar (RNNG) encoder with a novel attentional RNNG decoder and applies policy-gradient reinforcement learning to induce unsupervised tree structures over both the source and target sequences. When trained on character-level datasets with no explicit segmentation or parse annotations, the model learns plausible segmentations and shallow parses and achieves performance close to an attentional baseline.
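The abstract describes three ingredients: an RNNG-style encoder that induces a latent tree over the input by choosing shift/reduce actions, an attentional decoder that attends over the induced constituents, and policy-gradient (REINFORCE) training of the parsing decisions using the translation objective as reward. The following is a minimal, hypothetical PyTorch sketch of that training loop; the module names, the single-layer shift/reduce policy, the composition function, and all dimensions are illustrative assumptions, not the patent's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LatentTreeEncoder(nn.Module):
    """Builds a binary tree over a character sequence by sampling SHIFT/REDUCE actions."""

    def __init__(self, vocab, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        self.compose = nn.Linear(2 * dim, dim)  # merges two children into one constituent
        self.policy = nn.Linear(dim, 2)         # scores SHIFT (0) vs. REDUCE (1)

    def forward(self, chars):
        buffer = list(self.embed(chars))        # leaf embeddings waiting to be shifted
        stack, memory, log_probs = [], [], []
        while buffer or len(stack) > 1:
            can_shift, can_reduce = bool(buffer), len(stack) >= 2
            if not can_shift:
                action = 1                      # forced REDUCE: nothing left to shift
            elif not can_reduce:
                action = 0                      # forced SHIFT: fewer than two constituents
            else:
                dist = torch.distributions.Categorical(logits=self.policy(stack[-1]))
                sampled = dist.sample()
                log_probs.append(dist.log_prob(sampled))
                action = int(sampled)
            if action == 0:                     # SHIFT: move the next character onto the stack
                node = buffer.pop(0)
            else:                               # REDUCE: compose the top two constituents
                right, left = stack.pop(), stack.pop()
                node = torch.tanh(self.compose(torch.cat([left, right])))
            stack.append(node)
            memory.append(node)                 # the decoder attends over every constituent
        log_prob = torch.stack(log_probs).sum() if log_probs else torch.zeros(())
        return torch.stack(memory), log_prob


class AttentionalDecoder(nn.Module):
    """Generates target characters while attending over the encoder's constituents."""

    def __init__(self, vocab, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        self.cell = nn.GRUCell(2 * dim, dim)
        self.out = nn.Linear(2 * dim, vocab)

    def forward(self, memory, targets):
        h = memory.mean(dim=0)                  # crude initial state from the encoder memory
        prev = torch.zeros_like(h)              # stand-in for a start-of-sequence embedding
        nll = 0.0
        for t in targets:
            scores = memory @ h                 # dot-product attention over constituents
            context = (F.softmax(scores, dim=0).unsqueeze(1) * memory).sum(0)
            h = self.cell(torch.cat([prev, context]).unsqueeze(0), h.unsqueeze(0)).squeeze(0)
            logits = self.out(torch.cat([h, context]))
            nll = nll + F.cross_entropy(logits.unsqueeze(0), t.unsqueeze(0))
            prev = self.embed(t)                # teacher forcing with the gold character
        return nll / len(targets)


# One toy training step: the translation loss trains both networks directly, and a
# REINFORCE term rewards parsing-action sequences that lower that loss.
encoder, decoder = LatentTreeEncoder(vocab=100), AttentionalDecoder(vocab=100)
optimizer = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)

src = torch.randint(0, 100, (12,))              # toy character-level source sequence
tgt = torch.randint(0, 100, (10,))              # toy character-level target sequence

memory, action_log_prob = encoder(src)
nll = decoder(memory, tgt)
reward = -nll.detach()                          # higher reward for lower translation loss
loss = nll - reward * action_log_prob           # supervised loss + policy-gradient term
optimizer.zero_grad()
loss.backward()
optimizer.step()
```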

