Corpus generation method and device, electronic equipment and storage medium

Document No.: 191275; Publication date: 2021-11-02

Note: This technique, 语料生成方法、装置、电子设备以及存储介质 (Corpus generation method and device, electronic equipment and storage medium), was created by 李绩成, 高鹏至, 何中军, and 李芝 on 2021-06-30. The main content is as follows: The disclosure provides a corpus generating method, a corpus generating device, electronic equipment and a storage medium, and relates to the field of artificial intelligence, in particular to the fields of deep learning, natural language processing and the like. The specific implementation scheme is as follows: generating at least one second reverse translation model based on a basic first reverse translation model, wherein the model parameter set of the second reverse translation model is different from the model parameter set of the first reverse translation model; and generating pseudo parallel corpora of monolingual corpora in the target language based on the at least one second reverse translation model. High-quality and diverse corpora can thereby be generated in the reverse translation process.

1. A corpus generation method includes:

generating at least one second reverse translation model based on the underlying first reverse translation model, wherein a set of model parameters of the second reverse translation model is different from a set of model parameters of the first reverse translation model;

and generating pseudo parallel corpora of the monolingual corpora in the target language based on the at least one second reverse translation model.

2. The method of claim 1, wherein the set of model parameters of the second reverse translation model is smaller than the set of model parameters of the first reverse translation model.

3. The method of claim 1, wherein the generating at least one second reverse translation model based on the underlying first reverse translation model comprises:

performing a random parameter-discarding (dropout) operation on the first reverse translation model to generate the at least one second reverse translation model.

4. The method of claim 3, wherein the performing of the random parameter-discarding (dropout) operation on the first reverse translation model to generate the at least one second reverse translation model comprises:

for each second reverse translation model, carrying out Bernoulli sampling on the first reverse translation model with a discarding probability, and acquiring first model parameters sampled with the discarding probability and second model parameters sampled with a retaining probability, wherein the sum of the discarding probability and the retaining probability is 1;

and configuring the first model parameters to 0 and the second model parameters to retain their original values, so as to generate the second reverse translation model.

5. The method of claim 4, wherein the drop probability of the dropout operation corresponding to each second reverse translation model is the same.

6. The method of claim 4 or 5, wherein the method further comprises: obtaining bilingual evaluation substitution parameters between a source language material and a translation language material corresponding to the currently generated second reverse translation model;

and adjusting the discarding probability of the dropout operation according to the bilingual evaluation substitution parameter, and continuing to generate the next second reverse translation model according to the adjusted discarding probability.

7. The method of claim 6, wherein the bilingual evaluation substitution parameter is inversely related to a drop probability of the dropout operation.

8. The method of claim 1, wherein said generating a pseudo-parallel corpus of monolingual corpora in the target language based on the at least one second reverse translation model comprises: respectively inputting the monolingual corpus into each second reverse translation model to obtain a pseudo parallel corpus of the monolingual corpus output by each second reverse translation model.

9. The method of claim 1, wherein the method further comprises:

augmenting the parallel corpus of the source language based on the obtained pseudo parallel corpus to obtain a training corpus of the translation model.

10. A corpus generation apparatus, comprising:

a first generation module, configured to generate at least one second reverse translation model based on a basic first reverse translation model, where a set of model parameters of the second reverse translation model is different from a set of model parameters of the first reverse translation model;

and a second generation module, configured to generate pseudo parallel corpora of monolingual corpora in the target language based on the at least one second reverse translation model.

11. The apparatus of claim 10, wherein the set of model parameters of the second reverse translation model is smaller than the set of model parameters of the first reverse translation model.

12. The apparatus of claim 10, wherein the first generating module is further configured to:

performing a random parameter-discarding (dropout) operation on the first reverse translation model to generate the at least one second reverse translation model.

13. The apparatus of claim 12, wherein the first generating module is further configured to: for each second reverse translation model, carrying out Bernoulli sampling on the first reverse translation model with a discarding probability, and acquiring first model parameters sampled with the discarding probability and second model parameters sampled with a retaining probability, wherein the sum of the discarding probability and the retaining probability is 1;

and configuring the first model parameters to 0 and the second model parameters to retain their original values, so as to generate the second reverse translation model.

14. The apparatus of claim 13, wherein the drop probability of the dropout operation for each of the second reverse translation models is the same.

15. The apparatus of claim 12 or 13, wherein the first generating module is further configured to:

obtaining bilingual evaluation substitution parameters between a source language material and a translation language material corresponding to the currently generated second reverse translation model;

and adjusting the discarding probability of the dropout operation according to the bilingual evaluation substitution parameter, and continuing to generate the next second reverse translation model according to the adjusted discarding probability.

16. The apparatus of claim 15, wherein the bilingual evaluation substitution parameter is inversely related to a drop probability of the dropout operation.

17. The apparatus of claim 10, wherein the second generating module is further configured to:

respectively inputting the monolingual corpus into each second reverse translation model to obtain the pseudo parallel corpus of the monolingual corpus output by each second reverse translation model.

18. The apparatus of claim 10, wherein the second generating module is further configured to:

augmenting the parallel corpus of the source language based on the obtained pseudo parallel corpus to obtain a training corpus of the translation model.

19. An electronic device, comprising:

at least one processor; and

a memory communicatively coupled to the at least one processor; wherein

the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-9.

20. A non-transitory computer readable storage medium having stored thereon computer instructions for causing a computer to perform the method of any one of claims 1-9.

21. A computer program product comprising a computer program which, when executed by a processor, implements the method according to any one of claims 1-9.

Technical Field

The present disclosure relates to the field of artificial intelligence, and more particularly to the fields of deep learning, natural language processing, and the like.

Background

Machine translation is the process of using a machine to translate text in one natural language (the source language) into text in another natural language (the target language). It is an important research area of natural language processing and one of the common Internet services today. How to improve the translation accuracy of a machine translation model is a focus of research.

Disclosure of Invention

The disclosure provides a corpus generating method, a corpus generating apparatus, an electronic device, and a storage medium.

According to an aspect of the present disclosure, there is provided a corpus generating method, including:

generating at least one second reverse translation model based on the underlying first reverse translation model, wherein a set of model parameters of the second reverse translation model is different from a set of model parameters of the first reverse translation model;

and generating pseudo parallel corpora of the monolingual corpora in the target language based on the at least one second reverse translation model.

According to another aspect of the present disclosure, there is provided an apparatus for corpus generation, including:

a first generation module, configured to generate at least one second reverse translation model based on a basic first reverse translation model, where a set of model parameters of the second reverse translation model is different from a set of model parameters of the first reverse translation model;

and a second generation module, configured to generate pseudo parallel corpora of monolingual corpora in the target language based on the at least one second reverse translation model.

According to another aspect of the present disclosure, there is provided an electronic device including:

at least one processor; and

a memory communicatively coupled to the at least one processor; wherein

the memory stores instructions executable by the at least one processor to enable the at least one processor to perform a corpus generation method.

According to another aspect of the present disclosure, there is provided a non-transitory computer-readable storage medium having stored thereon computer instructions for causing a computer to execute a corpus generation method.

According to another aspect of the disclosure, there is provided a computer program product comprising a computer program which, when executed by a processor, implements a corpus generation method.

It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present disclosure, nor do they limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.

Drawings

The drawings are included to provide a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:

FIG. 1 is a schematic flow diagram of a corpus generation method according to an embodiment of the present disclosure;

FIG. 2 is a schematic flow chart diagram of a corpus generation method according to another embodiment of the present disclosure;

FIG. 3 is a flowchart illustrating an example corpus generation method according to another embodiment of the present disclosure;

FIG. 4 is a schematic diagram of a corpus generation apparatus, according to an embodiment of the present disclosure;

FIG. 5 is a block diagram of an electronic device for implementing the corpus generation method of an embodiment of the present disclosure.

Detailed Description

Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings, in which various details of the embodiments of the disclosure are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.

The following briefly describes the technical field to which the disclosed solution relates:

Artificial Intelligence (AI) is the discipline that studies how to make computers simulate certain human thinking processes and intelligent behaviors (such as learning, reasoning, thinking, and planning), and it involves both hardware-level and software-level technologies. Artificial intelligence software technologies generally include computer vision, speech recognition, natural language processing, machine learning/deep learning, big data processing, knowledge-graph technologies, and the like.

Deep learning is a new research direction in the field of machine learning. Deep learning learns the intrinsic laws and representation levels of sample data, and the information obtained in the learning process is very helpful for interpreting data such as text, images, and sound. Its ultimate goal is to give machines the ability to analyze and learn like humans and to recognize data such as text, images, and sound. Deep learning is a complex machine learning algorithm that has achieved results in speech and image recognition far exceeding earlier related techniques.

Natural Language Processing (NLP) is an important direction in the fields of computer science and artificial intelligence. It studies the theories and methods that enable effective communication between humans and computers in natural language. Natural language processing is a science that integrates linguistics, computer science, and mathematics. Research in this field therefore involves natural language, i.e., the language people use every day, so it is closely related to linguistic research, but with important differences. Natural language processing is not a general study of natural language; rather, it aims to develop computer systems, and particularly the software systems therein, that can effectively realize natural-language communication. It is thus part of computer science.

Natural language processing is mainly applied to machine translation, public opinion monitoring, automatic summarization, opinion extraction, text classification, question answering, text semantic comparison, speech recognition, Chinese OCR, and the like.

Neural network models have made great progress on machine translation tasks and have surpassed statistical machine translation. Neural machine translation models based on the Transformer (multi-head attention) architecture achieve good translation quality when trained on large amounts of data. However, although neural networks have greatly improved translation quality, translation models are limited by the amount of existing parallel corpora and cannot be improved further. The corpus generation method of the present disclosure is proposed to address this problem of insufficient corpus.

The corpus generating method provided by the embodiments of the present disclosure may be executed by an electronic device, and the electronic device may be a personal computer (PC), a server, a cloud platform, or the like, which is not limited herein.

In the disclosed embodiment, the electronic device may be provided with a processing component, a storage component and a driving component. Optionally, the driving component and the processing component may be integrated, the storage component may store an operating system, an application program, or other program modules, and the processing component implements the corpus generating method provided by the embodiment of the present disclosure by executing the application program stored in the storage component.

The corpus generating method, apparatus, electronic device, and storage medium provided by the present disclosure are described in detail below with reference to the accompanying drawings.

FIG. 1 is a flow diagram of a corpus generation method according to one embodiment of the present disclosure.

The corpus generating method according to the embodiments of the present disclosure may be implemented by the corpus generating apparatus according to the embodiments of the present disclosure, which may be configured in an electronic device to generate at least one second reverse translation model based on a basic first reverse translation model and to generate a pseudo parallel corpus of a monolingual corpus in a target language based on the at least one second reverse translation model, so that high-quality and diverse corpora can be generated in the reverse translation process.

As a possible case, the corpus generating method according to the embodiments of the present disclosure may also be executed on a server side; the server may be a cloud server, so that the corpus generating method is executed in the cloud.

As shown in fig. 1, the corpus generating method may include:

step 101, generating at least one second reverse translation model based on the basic first reverse translation model, wherein a model parameter set of the second reverse translation model is different from a model parameter set of the first reverse translation model.

The model parameter set may include parameters of an encoder side, parameters of a decoder side, parameters of a Word Embedding layer, parameters of an intermediate layer, parameters of an output layer, and the like in the reverse translation model.

In the embodiment of the present disclosure, the parameters of the first reverse translation model may be processed according to a set policy to generate one or more second reverse translation models; for example, the second reverse translation model may be generated by strategies such as model pruning, building a student model with the basic model as a teacher model, or performing a dropout operation. Because the model parameters are changed in the process of processing the parameters of the first reverse translation model according to the set policy, the model parameter set of the generated second reverse translation model is different from the model parameter set of the first reverse translation model.

In addition, the first reverse translation model and the second reverse translation model may be standard machine translation models whose translation direction is opposite to that of the forward model; for example, if the forward translation model translates from Chinese to English, the first and second reverse translation models translate from English to Chinese. The reverse translation may be implemented with an encoder and a decoder, which will not be described in detail herein.

Step 102, generating a pseudo parallel corpus of the monolingual corpus in the target language based on the at least one second reverse translation model.

In the embodiment of the present disclosure, after the one or more second reverse translation models are generated, the monolingual corpus in the target language may be input into the one or more second reverse translation models to generate corresponding pseudo-parallel corpus, and the pseudo-parallel corpus may augment the corpus of the translation models.

Therefore, according to the corpus generating method provided by the embodiment of the disclosure, at least one second reverse translation model is generated based on the basic first reverse translation model, and the pseudo parallel corpus of the monolingual corpus in the target language is then generated based on the at least one second reverse translation model, so that high-quality and diverse corpora can be generated in the reverse translation process.

To clarify the above embodiment, in one embodiment of the present disclosure, generating at least one second reverse translation model based on the basic first reverse translation model may include: performing a random parameter-discarding (dropout) operation on the first reverse translation model to generate the at least one second reverse translation model.

Optionally, for each second reverse translation model, Bernoulli sampling may be performed on the first reverse translation model with a discarding probability to obtain first model parameters sampled with the discarding probability and second model parameters sampled with a retaining probability, where the sum of the discarding probability and the retaining probability is 1; the first model parameters are configured to 0 and the second model parameters retain their original values, so as to generate the second reverse translation model.

In some implementations, during the random parameter discarding dropout operation on the first reverse translation model, a discarding probability value may be set to be p, and a retention probability value may be set to be 1-p, where p and 1-p are both between 0 and 1. In the present disclosure, Bernoulli sampling the first reverse translation model with a drop probability p would sample to 0 with a probability of p and to 1 with a probability of 1-p. For each parameter of the first reverse translation model, when 0 is sampled, the parameter is set to zero, i.e., the parameter is discarded; when 1 is sampled, the parameter remains unchanged, i.e. the parameter is retained.

For example, if the p value is set to 0.04, then in the process of sampling from the Bernoulli distribution, 0 is sampled with probability 0.04 and 1 is sampled with probability 0.96. For each parameter of the first reverse translation model, when 0 is sampled the parameter is set to zero (discarded), and when 1 is sampled the parameter remains unchanged (retained).
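As a concrete illustration of this sampling step, the sketch below derives one second reverse translation model from the first reverse translation model by Bernoulli-sampling a keep/drop decision for every parameter entry with drop probability p and zeroing the dropped entries. It is a minimal, hypothetical example written with PyTorch; the function name and the use of `copy.deepcopy` are assumptions for illustration, not the disclosure's actual implementation.

```python
import copy
import torch

def derive_second_model(first_model: torch.nn.Module, p: float = 0.04) -> torch.nn.Module:
    """Derive one second reverse translation model from the first reverse
    translation model: every parameter entry is dropped (set to 0) with
    probability p and retained with probability 1 - p."""
    second_model = copy.deepcopy(first_model)  # the first model itself stays unchanged
    with torch.no_grad():
        for param in second_model.parameters():
            # Bernoulli sample: 1 (keep) with probability 1 - p, 0 (drop) with probability p
            keep_mask = torch.bernoulli(torch.full_like(param, 1.0 - p))
            param.mul_(keep_mask)  # dropped entries become 0, kept entries keep their values
    return second_model

# Hypothetical usage: derive several diverse second models with the same drop probability.
# first_reverse_model = ...  # a trained target-to-source translation model
# second_models = [derive_second_model(first_reverse_model, p=0.04) for _ in range(4)]
```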

It should be noted that in this embodiment, the value p of the discarding probability is usually set in the range 0.01 to 0.1. Its value depends on the desired diversity of the corpus: the larger p is, the higher the diversity of the corpus; the smaller p is, the lower the diversity. The value of p can therefore be set according to the desired diversity of the corpus and is not limited herein. In general, it is preferable to set p within the range 0.03 to 0.05.

Further, in the process of generating the second reverse translation model by performing the dropout operation on the first reverse translation model, model parameters are discarded with a certain discarding probability, so that the parameter set of the second reverse translation model is smaller than the parameter set of the first reverse translation model.

The drop probability of the dropout operation corresponding to each second reverse translation model is the same, but the whole dropout operation is a random sampling process: even with the same drop probability, the sampled parameter sets differ, i.e., the parameter set of each second reverse translation model is different. This ensures the diversity of the second reverse translation models, so that the translated pseudo parallel corpora are more diverse.

Therefore, the embodiment of the disclosure can generate a plurality of reverse translation models from a single reverse translation model and use them to generate diverse pseudo parallel corpora as training corpora for the translation model. This enriches the diversity of the training corpus, and a translation model with better translation performance can be trained on such enriched corpora.

In the process of generating the second reverse translation models, the quality of the models needs to be evaluated so that they can be further optimized into better models. In an embodiment of the present disclosure, as shown in fig. 2, the corpus generating method may further include:

step 201, obtaining bilingual evaluation substitution parameters between the source language material and the translated language material corresponding to the currently generated second reverse translation model.

The bilingual evaluation substitution parameter may be Pairwise-BLEU, an index proposed for diverse translation: the smaller the Pairwise-BLEU value, the higher the diversity of the corpus, and the larger the Pairwise-BLEU value, the lower the diversity of the corpus.

In the embodiment of the present disclosure, a BLEU value may be obtained by comparing the source corpus with the translated corpus corresponding to the currently generated second reverse translation model, and the Pairwise-BLEU value (i.e., the bilingual evaluation substitution parameter) is then obtained by averaging the BLEU values.

Optionally, N translated corpora may be obtained through the dropout operation (one per second reverse translation model); BLEU values are computed between these corpora pairwise, and the Pairwise-BLEU value is their average. Specifically, BLEU values are computed between corpus 1 and corpora 2 to N (N-1 values), between corpus 2 and corpora 3 to N (N-2 values), and so on; finally N*(N-1)/2 BLEU values are obtained, and the average of these N*(N-1)/2 values is taken as the Pairwise-BLEU value.
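A minimal sketch of this pairwise averaging is given below. It assumes the N translated corpora are available as lists of sentence strings and uses sacrebleu's corpus-level BLEU purely for illustration; only one direction per pair is scored for brevity, and the helper names (`pairwise_bleu`, `translate_all`) are hypothetical.

```python
from itertools import combinations
import sacrebleu  # used here only as one possible BLEU implementation

def pairwise_bleu(corpora):
    """Compute a Pairwise-BLEU value over N translated corpora (each a list of
    sentence strings): BLEU is computed for every unordered pair of corpora,
    giving N*(N-1)/2 values, and the average of these values is returned."""
    scores = []
    for hyp_corpus, ref_corpus in combinations(corpora, 2):
        # hyp_corpus is treated as the hypothesis and ref_corpus as the single reference stream
        scores.append(sacrebleu.corpus_bleu(hyp_corpus, [ref_corpus]).score)
    return sum(scores) / len(scores)

# Hypothetical usage with N corpora produced by N second reverse translation models:
# corpora = [translate_all(model, monolingual_corpus) for model in second_models]
# print(pairwise_bleu(corpora))
```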

Step 202, adjusting the discarding probability of the dropout operation according to the bilingual evaluation substitution parameter, and continuing to generate the next second reverse translation model according to the adjusted discarding probability.

The bilingual evaluation substitution parameter is inversely related to the dropping probability of the dropout operation, and the smaller the dropping probability of the dropout operation is, the larger the bilingual evaluation substitution parameter is; the greater the drop probability of dropout operations, the smaller the bilingual evaluation substitution parameter.

Optionally, when the drop probability of the dropout operation is large, many parameters are dropped, few parameters are shared by different second reverse translation models, the difference between any two second reverse translation models is large, the corpora generated by the second reverse translation models are highly diverse, and the bilingual evaluation substitution parameter value is small.

When the drop probability of the dropout operation is small, few parameters are dropped, many parameters are shared by different second reverse translation models, the difference between any two second reverse translation models is small, the corpora generated by the second reverse translation models have low diversity, and the bilingual evaluation substitution parameter value is large; in this case the diversity of the corpus is low and the purpose of augmenting the corpus cannot be achieved. Therefore, when the bilingual evaluation substitution parameter is large, the drop probability of the dropout operation can be adjusted upward; when it is small, the drop probability can be adjusted downward; and the next second reverse translation model is then generated with the adjusted drop probability. In this way a high-quality translation model is obtained, and the generated corpus is both diverse and of high quality.
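One possible shape of this feedback loop is sketched below. The proportional update rule, the target value, the step size, and the bounds are all assumptions for illustration; the disclosure does not prescribe a particular adjustment formula beyond the inverse relation described above.

```python
def adjust_drop_probability(p, pairwise_bleu_value, target=60.0, step=0.005,
                            p_min=0.01, p_max=0.1):
    """Adjust the drop probability p following the inverse relation: a large
    Pairwise-BLEU value (corpora too similar) increases p, a small value
    (corpora diverse enough) decreases p, keeping p within [p_min, p_max]."""
    if pairwise_bleu_value > target:
        return min(p + step, p_max)  # corpora too similar -> drop more parameters
    return max(p - step, p_min)      # corpora diverse enough -> drop fewer parameters

# Hypothetical loop: generate second models one by one, re-adjusting p each time.
# p = 0.04
# for _ in range(num_second_models):
#     model = derive_second_model(first_reverse_model, p)
#     p = adjust_drop_probability(p, pairwise_bleu(generated_corpora))
```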

In one embodiment of the present disclosure, generating a pseudo-parallel corpus of monolingual corpora in the target language based on at least one second reverse translation model may include: and respectively inputting the monolingual corpus into each second reverse translation model to obtain the pseudo parallel corpus of the monolingual corpus output by each second reverse translation model.

Optionally, after the one or more second reverse translation models are generated, the monolingual corpus may be input into each second reverse translation model to generate the corresponding pseudo parallel corpus. Because the parameter set of each second reverse translation model is different, the generated pseudo parallel corpora of the monolingual corpus are diverse, which improves the diversity of the corpora generated by reverse translation.
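One possible form of this step is sketched below, assuming each second reverse translation model exposes a `translate()` method that maps a target-language sentence back to a pseudo source-language sentence; this interface is hypothetical and only illustrates how each pseudo translation is paired with the original monolingual sentence.

```python
def generate_pseudo_parallel(second_models, target_monolingual):
    """Feed the same target-language monolingual corpus through every second
    reverse translation model; each model yields one pseudo source-side sentence
    per target sentence, and pairing them gives pseudo parallel data."""
    pseudo_parallel = []
    for model in second_models:
        for tgt_sentence in target_monolingual:
            pseudo_src = model.translate(tgt_sentence)  # hypothetical target -> source call
            pseudo_parallel.append((pseudo_src, tgt_sentence))
    return pseudo_parallel
```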

In an embodiment of the present disclosure, the corpus generating method may further include: and augmenting the parallel corpus of the source language based on the obtained pseudo parallel corpus to obtain a training corpus of the translation model.

Optionally, since the obtained pseudo parallel corpus is diverse, after the pseudo parallel corpus is obtained, the parallel corpus of the source language may be augmented with it to obtain a training corpus of the translation model, which is used to train and further optimize the translation model. The training corpus of the translation model can thus be expanded.
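The augmentation itself can be as simple as concatenating the pseudo parallel pairs with the genuine parallel pairs before training the forward translation model; a small sketch under that assumption follows (the pair format and the shuffling are illustrative choices, not required by the disclosure).

```python
import random

def build_training_corpus(parallel_pairs, pseudo_parallel_pairs, seed=0):
    """Augment the genuine (source, target) parallel pairs with the pseudo
    parallel pairs obtained from reverse translation and shuffle the result,
    yielding the training corpus for the forward translation model."""
    corpus = list(parallel_pairs) + list(pseudo_parallel_pairs)
    random.Random(seed).shuffle(corpus)
    return corpus
```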

To more clearly illustrate the corpus generating method of the present disclosure, fig. 3 is a flowchart illustrating a concrete example of the corpus generating method.

In the embodiment of the present disclosure, referring to fig. 3, the dashed frame denotes the first reverse translation model. Different second reverse translation models can be obtained through the dropout operation, and the monolingual corpus in the target language can be input into the different second reverse translation models to obtain more diverse pseudo parallel corpora of translations, thereby achieving the purpose of augmenting the reverse-translated corpus.

In summary, the corpus generating method according to the embodiments of the disclosure performs diversity-based dropout operations in the reverse translation (back-translation) process, which improves the diversity of the reversely translated corpora and thereby expands the reverse corpus. On the English standard data set, the diversity-based reverse translation method shows a significant improvement over ordinary reverse translation; the results are shown in Table 1:

Back-translation method        Newstest2014    Δ
No back-translation            27.43           0
Beam search (bsize=5, top1)    28.81           1.38
MixDiversity                   29.19           1.76
Dropout (p=0.03)               29.96           2.53
Dropout (p=0.05)               30.01           2.58

TABLE 1

In Table 1, "Back-translation method" denotes the back-translation method used, "No back-translation" denotes the baseline without back-translation, "Beam search" denotes back-translation with beam search, "MixDiversity" denotes the mixed-diversity method, "Newstest2014" is a commonly used news test set, and Δ is the score difference relative to the no-back-translation baseline.

In the technical solution of the present disclosure, the acquisition, storage, and application of the personal information of the users involved all comply with relevant laws and regulations and do not violate public order and good customs.

According to embodiments of the present disclosure, the present disclosure also provides a corpus generating apparatus.

Fig. 4 is a schematic structural diagram of a corpus generating device according to an embodiment of the present disclosure.

The corpus generating apparatus of the embodiments of the disclosure can be configured in an electronic device to generate at least one second reverse translation model based on a basic first reverse translation model, and to generate a pseudo parallel corpus of a monolingual corpus in a target language based on the at least one second reverse translation model, so that high-quality and diverse corpora can be generated in the reverse translation process.

As shown in fig. 4, the corpus generating apparatus 400 may include: a first generation module 410 and a second generation module 420.

The first generating module 410 is configured to generate at least one second reverse translation model based on the basic first reverse translation model, where a set of model parameters of the second reverse translation model is different from a set of model parameters of the first reverse translation model.

The model parameter set may include parameters of an encoder side, parameters of a decoder side, parameters of a Word Embedding layer, parameters of an intermediate layer, parameters of an output layer, and the like in the reverse translation model.

In the embodiment of the present disclosure, the first generating module 410 may process the parameters of the first reverse translation model according to a set policy to generate one or more second reverse translation models; for example, the first generating module 410 may generate the second reverse translation models through strategies such as model pruning, building a student model with the basic model as a teacher model, or performing dropout operations. Because the model parameters are changed in the process of processing the parameters of the first reverse translation model according to the set policy, the model parameter set of the generated second reverse translation model is different from the model parameter set of the first reverse translation model.

In addition, the first reverse translation model and the second reverse translation model may be standard machine translation models whose translation direction is opposite to that of the forward model; for example, if the forward translation model translates from Chinese to English, the first and second reverse translation models translate from English to Chinese. The reverse translation may be implemented with an encoder and a decoder, which will not be described in detail herein.

The second generating module 420 is configured to generate a pseudo parallel corpus of the monolingual corpus in the target language based on the at least one second reverse translation model.

In the embodiment of the disclosure, after the first generating module 410 generates the one or more second reverse translation models, the second generating module 420 may input the monolingual corpus in the target language into the one or more second reverse translation models to generate corresponding pseudo-parallel corpus, which may augment the corpus of the translation models.

Therefore, with the corpus generating apparatus provided by the embodiment of the disclosure, at least one second reverse translation model is generated by the first generating module based on the basic first reverse translation model, and the pseudo parallel corpus of the monolingual corpus in the target language is then generated by the second generating module based on the at least one second reverse translation model, so that high-quality and diverse corpora can be generated in the reverse translation process.

In one embodiment of the present disclosure, the set of model parameters of the second reverse translation model is smaller than the set of model parameters of the first reverse translation model.

In an embodiment of the present disclosure, the first generating module 410 is further configured to: perform a random parameter-discarding (dropout) operation on the first reverse translation model to generate at least one second reverse translation model.

In an embodiment of the present disclosure, the first generating module 410 is further configured to: for each second reverse translation model, perform Bernoulli sampling on the first reverse translation model with a discarding probability to obtain first model parameters sampled with the discarding probability and second model parameters sampled with a retaining probability, where the sum of the discarding probability and the retaining probability is 1; the first model parameters are configured to 0 and the second model parameters retain their original values, so as to generate the second reverse translation model.

In one embodiment of the present disclosure, the drop probability of the dropout operation corresponding to each second reverse translation model is the same.

In an embodiment of the present disclosure, the first generating module 410 is further configured to: acquire bilingual evaluation substitution parameters between the source language material and the translated language material corresponding to the currently generated second reverse translation model, adjust the discarding probability of the dropout operation according to the bilingual evaluation substitution parameters, and continue to generate the next second reverse translation model according to the adjusted discarding probability.

In one embodiment of the present disclosure, the bilingual evaluation substitution parameter is inversely related to the drop probability of the dropout operation.

In an embodiment of the disclosure, the second generating module 420 is further configured to: respectively input the monolingual corpus into each second reverse translation model to obtain the pseudo parallel corpus of the monolingual corpus output by each second reverse translation model.

In an embodiment of the disclosure, the second generating module 420 is further configured to: augment the parallel corpus of the source language based on the obtained pseudo parallel corpus to obtain a training corpus of the translation model.

The present disclosure also provides an electronic device, a readable storage medium, and a computer program product according to embodiments of the present disclosure.

FIG. 5 illustrates a schematic block diagram of an example electronic device 500 that can be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the disclosure described and/or claimed herein.

As shown in fig. 5, the device 500 includes a computing unit 501, which may perform various appropriate actions and processes according to a computer program stored in a Read-Only Memory (ROM) 502 or a computer program loaded from a storage unit 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data required for the operation of the device 500 can also be stored. The computing unit 501, the ROM 502, and the RAM 503 are connected to each other by a bus 504. An input/output (I/O) interface 505 is also connected to the bus 504.

A number of components in the device 500 are connected to the I/O interface 505, including: an input unit 506 such as a keyboard, a mouse, or the like; an output unit 507 such as various types of displays, speakers, and the like; a storage unit 508, such as a magnetic disk, optical disk, or the like; and a communication unit 509 such as a network card, modem, wireless communication transceiver, etc. The communication unit 509 allows the device 500 to exchange information/data with other devices through a computer network such as the internet and/or various telecommunication networks.

The computing unit 501 may be a variety of general-purpose and/or special-purpose processing components having processing and computing capabilities. Some examples of the computing unit 501 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various dedicated Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, and so forth. The computing unit 501 performs the respective methods and processes described above, such as the corpus generation method. For example, in some embodiments, the corpus generation method may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 508. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 500 via the ROM 502 and/or the communication unit 509. When the computer program is loaded into the RAM 503 and executed by the computing unit 501, one or more steps of the corpus generation method described above may be performed. Alternatively, in other embodiments, the computing unit 501 may be configured to perform the corpus generation method by any other suitable means (e.g., by means of firmware).

Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.

Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine or entirely on the remote machine or server.

In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.

To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.

The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.

The computer system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server combined with a blockchain.

It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present disclosure may be executed in parallel, sequentially, or in different orders, as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved, and the present disclosure is not limited herein.

The above detailed description should not be construed as limiting the scope of the disclosure. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present disclosure should be included in the scope of protection of the present disclosure.
