Information processing method, device, electronic equipment and computer readable medium

Document No.: 379093  Publication date: 2021-12-10

Note: This technique, Information processing method, device, electronic equipment and computer readable medium, was designed and created by Yin Wei, Xiao Zhipeng, Shi Wenxiang, Xu Hao, Li Sen, Zhang Wanshu, Jin Linrong, Pei Yingrui and Wang Hao on 2021-01-21. Abstract: The embodiments of the disclosure disclose an information processing method, an information processing apparatus, an electronic device and a computer readable medium. One embodiment of the method comprises: performing subgraph extraction on a bipartite graph to generate a subgraph set; performing first encoding on the user information set input by each user node and the item information set input by each item node of a subgraph in the subgraph set respectively to generate a first user vector and a first item vector; generating an encoded subgraph; performing second encoding on the first user vector corresponding to each user node and the first item vector corresponding to each item node in the encoded subgraph respectively to generate a second user vector and a second item vector; and replacing the vectors of the nodes in the encoded subgraph with the second user vector set and the second item vector set according to the corresponding positions to generate a replaced subgraph as a word-embedded subgraph. This embodiment can accurately and efficiently encode the user nodes and item nodes in the bipartite graph.

1. An information processing method comprising:

performing subgraph extraction on the bipartite graph to generate a subgraph set, wherein the bipartite graph represents the association relationship between the user set and the item set;

respectively carrying out first coding on a user information set input by each user node and an item information set input by each item node of a subgraph in the subgraph set to generate a first user vector and a first item vector, and obtaining a first user vector set and a first item vector set;

generating a coded sub-graph according to the first user vector set and the first item vector set to obtain a coded sub-graph set;

according to the association relationship between the user nodes and the item nodes in the sub-graph, respectively carrying out second encoding on the first user vector corresponding to each user node and the first item vector corresponding to each item node in the encoded sub-graph so as to generate a second user vector and a second item vector, and obtaining a second user vector set and a second item vector set;

and replacing the vector of each node in the encoded subgraph with the second user vector set and the second item vector set according to the corresponding position to generate a replaced subgraph as a word-embedded subgraph, and obtaining a word-embedded subgraph set.

2. The method of claim 1, wherein the method further comprises:

according to each obtained word-embedded sub-graph, carrying out vector labeling on each node on the bipartite graph to obtain the vector-labeled bipartite graph;

taking each item node in the vector-labeled bipartite graph as a leaf node, and constructing a tree model;

and determining the item set to be recalled associated with the target user node according to the tree model.

3. The method of claim 1, wherein the first encoding of the user information set entered by each user node and the item information set entered by each item node of the subgraph in the set of subgraphs to generate a first user vector and a first item vector, respectively, comprises:

performing word embedding processing on each user information in the user information set to generate a first vector to obtain a first vector set;

splicing each first vector in the first vector set to obtain a spliced vector;

carrying out batch normalization processing on the spliced vectors to obtain normalized vectors;

and inputting the normalized vector to a plurality of layers of first activation function layers to obtain the first user vector.

4. The method of claim 3, wherein the first encoding of the user information set entered by each user node and the item information set entered by each item node of the subgraph in the set of subgraphs to generate a first user vector and a first item vector, respectively, comprises:

performing word embedding processing on each item information in the item information set to generate a second vector, so as to obtain a second vector set;

splicing all the second vectors in the second vector set to obtain spliced vectors;

carrying out batch normalization processing on the spliced vectors to obtain normalized vectors;

and inputting the normalized vector to a plurality of layers of second activation function layers to obtain the first item vector, wherein the vector dimension of the first user vector is greater than or equal to the vector dimension of the first item vector.

5. The method of claim 1, wherein the second encoding of the first user vector corresponding to each user node and the first item vector corresponding to each item node in the encoded sub-graph to generate a second user vector and a second item vector according to the association relationship between the user node and the item node in the sub-graph comprises:

determining a first item vector set adjacent to the first user vector according to the encoded subgraph;

inputting each first item vector in the first item vector set to a third activation function layer to output a third vector, so as to obtain a third vector set;

inputting each third vector in the third vector set into a first pooling layer to output a fourth vector, so as to obtain a fourth vector set;

splicing each fourth vector in the fourth vector set with the first user vector to obtain a fifth vector;

and inputting the fifth vector to a fourth activation function layer to obtain the second user vector.

6. The method of claim 1, wherein the second encoding of the first user vector corresponding to each user node and the first item vector corresponding to each item node in the encoded sub-graph to generate a second user vector and a second item vector according to the association relationship between the user node and the item node in the sub-graph comprises:

determining a first user vector set adjacent to the first item vector according to the encoded subgraph;

inputting each first user vector in the adjacent first user vector set to a fifth activation function layer to output a sixth vector, so as to obtain a sixth vector set;

inputting each sixth vector in the sixth vector set to a second pooling layer to output a seventh vector, so as to obtain a seventh vector set;

splicing each seventh vector in the seventh vector set with the first item vector to obtain an eighth vector;

and inputting the eighth vector to a sixth activation function layer to obtain the second item vector.

7. The method of claim 2, wherein said determining a set of items to recall associated with a target user node according to the tree model comprises:

determining associated item nodes from the tree model by adopting a neighbor searching method according to the target user node;

and determining the item set to be recalled according to the item nodes.

8. An information processing apparatus comprising:

an extraction unit configured to perform sub-graph extraction on the bipartite graph to generate a sub-graph set, wherein the bipartite graph represents an association relationship between the user set and the item set;

the first coding unit is configured to perform first coding on a user information set input by each user node and an item information set input by each item node of a sub-graph in the sub-graph set respectively to generate a first user vector and a first item vector, so as to obtain a first user vector set and a first item vector set;

a generating unit configured to generate a coded sub-graph according to the first user vector set and the first item vector set, resulting in a coded sub-graph set;

the second encoding unit is configured to perform second encoding on the first user vector corresponding to each user node and the first item vector corresponding to each item node in the encoded sub-graph respectively according to the association relationship between the user nodes and the item nodes in the sub-graph to generate a second user vector and a second item vector, and a second user vector set and a second item vector set are obtained;

and the replacing unit is configured to replace the vector of each node in the encoded subgraph with the second user vector set and the second item vector set according to the corresponding position to generate a replaced subgraph as a word-embedded subgraph, so as to obtain a word-embedded subgraph set.

9. An electronic device, comprising:

one or more processors;

storage means for storing one or more programs;

the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method recited in any of claims 1-7.

10. A computer-readable medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the method of any one of claims 1-7.

Technical Field

Embodiments of the present disclosure relate to the field of computer technologies, and in particular, to an information processing method and apparatus, an electronic device, and a computer-readable medium.

Background

At present, businesses often represent the users and items involved in the business visually in the form of a graph model. Graph models are generally used as follows: first, Word Embedding is performed on the user information of each user and the item information of each item in the graph model. Then, each word-embedded user vector and item vector is directly input into a pre-trained graph neural network related to the business to obtain the output result required by the business. This approach has the following problems:

When word embedding is performed directly on the user information of each user and the item information of each item in the graph model, the resulting word-embedded user vectors and item vectors do not deeply extract the feature information of each user and each item. In addition, the word-embedded user vectors and item vectors do not adequately represent the association information between users and items.

Disclosure of Invention

This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.

Some embodiments of the present disclosure propose an information processing method, apparatus, device, and computer readable medium to solve the technical problems mentioned in the above background section.

In a first aspect, some embodiments of the present disclosure provide an information processing method, including: performing subgraph extraction on a bipartite graph to generate a subgraph set, wherein the bipartite graph represents the association relationship between a user set and an item set; performing first encoding on the user information set input by each user node and the item information set input by each item node of a subgraph in the subgraph set respectively to generate a first user vector and a first item vector, and obtaining a first user vector set and a first item vector set; generating an encoded subgraph according to the first user vector set and the first item vector set to obtain an encoded subgraph set; according to the association relationship between the user nodes and the item nodes in the subgraph, performing second encoding on the first user vector corresponding to each user node and the first item vector corresponding to each item node in the encoded subgraph respectively so as to generate a second user vector and a second item vector, and obtaining a second user vector set and a second item vector set; and replacing the vector of each node in the encoded subgraph with the second user vector set and the second item vector set according to the corresponding position to generate a replaced subgraph as a word-embedded subgraph, and obtaining a word-embedded subgraph set.

Optionally, the method further includes: according to each obtained word-embedded subgraph, performing vector labeling on each node of the bipartite graph to obtain a vector-labeled bipartite graph; taking each item node in the vector-labeled bipartite graph as a leaf node, and constructing a tree model; and determining the item set to be recalled associated with the target user node according to the tree model.

Optionally, the above-mentioned encoding the user information set input by each user node and the item information set input by each item node of the subgraph in the subgraph set for the first time respectively to generate a first user vector and a first item vector, includes: performing word embedding processing on each user information in the user information set to generate a first vector to obtain a first vector set; splicing each first vector in the first vector set to obtain a spliced vector; carrying out batch normalization processing on the spliced vectors to obtain normalized vectors; and inputting the normalized vector into a plurality of layers of first activation function layers to obtain the first user vector.

Optionally, the above-mentioned encoding the user information set input by each user node and the item information set input by each item node of the subgraph in the subgraph set for the first time respectively to generate a first user vector and a first item vector, includes: performing word embedding processing on each item information in the item information set to generate a second vector to obtain a second vector set; splicing all the second vectors in the second vector set to obtain spliced vectors; carrying out batch normalization processing on the spliced vectors to obtain normalized vectors; and inputting the normalized vector into a plurality of layers of second activation function layers to obtain the first item vector, wherein the vector dimension of the first user vector is greater than or equal to the vector dimension of the first item vector.

Optionally, the second encoding, according to the association relationship between the user nodes and the item nodes in the sub-graph, the first user vector corresponding to each user node and the first item vector corresponding to each item node in the encoded sub-graph to generate a second user vector and a second item vector, respectively includes: determining a first item vector set adjacent to the first user vector according to the encoded subgraph; inputting each first item vector in the first item vector set to a third activation function layer to output a third vector, so as to obtain a third vector set; inputting each third vector in the third vector set into the first pooling layer to output a fourth vector, so as to obtain a fourth vector set; splicing each fourth vector in the fourth vector set with the first user vector to obtain a fifth vector; and inputting the fifth vector to a fourth activation function layer to obtain the second user vector.

Optionally, the second encoding, according to the association relationship between the user nodes and the item nodes in the sub-graph, of the first user vector corresponding to each user node and the first item vector corresponding to each item node in the encoded sub-graph to generate a second user vector and a second item vector respectively includes: determining a first user vector set adjacent to the first item vector according to the encoded subgraph; inputting each first user vector in the adjacent first user vector set to a fifth activation function layer to output a sixth vector, so as to obtain a sixth vector set; inputting each sixth vector in the sixth vector set into a second pooling layer to output a seventh vector, so as to obtain a seventh vector set; splicing each seventh vector in the seventh vector set with the first item vector to obtain an eighth vector; and inputting the eighth vector to a sixth activation function layer to obtain the second item vector.

Optionally, the determining, according to the tree model, the item set to be recalled associated with the target user node includes: determining associated item nodes from the tree model by adopting a neighbor searching method according to the target user node; and determining the item set to be recalled according to the item nodes.

In a second aspect, some embodiments of the present disclosure provide an information processing apparatus, the apparatus comprising: an extraction unit configured to perform sub-graph extraction on a bipartite graph to generate a sub-graph set, wherein the bipartite graph represents the association relationship between a user set and an item set; a first encoding unit configured to perform first encoding on the user information set input by each user node and the item information set input by each item node of a sub-graph in the sub-graph set respectively to generate a first user vector and a first item vector, so as to obtain a first user vector set and a first item vector set; a generating unit configured to generate an encoded sub-graph according to the first user vector set and the first item vector set, so as to obtain an encoded sub-graph set; a second encoding unit configured to perform second encoding on the first user vector corresponding to each user node and the first item vector corresponding to each item node in the encoded sub-graph respectively according to the association relationship between the user nodes and the item nodes in the sub-graph to generate a second user vector and a second item vector, so as to obtain a second user vector set and a second item vector set; and a replacing unit configured to replace the vector of each node in the encoded sub-graph with the second user vector set and the second item vector set according to the corresponding position to generate a replaced sub-graph as a word-embedded sub-graph, so as to obtain a word-embedded sub-graph set.

Optionally, the apparatus further comprises units configured to: perform vector labeling on each node of the bipartite graph according to each obtained word-embedded subgraph to obtain a vector-labeled bipartite graph; take each item node in the vector-labeled bipartite graph as a leaf node to construct a tree model; and determine the item set to be recalled associated with the target user node according to the tree model.

Optionally, the first encoding unit is further configured to: performing word embedding processing on each user information in the user information set to generate a first vector to obtain a first vector set; splicing each first vector in the first vector set to obtain a spliced vector; carrying out batch normalization processing on the spliced vectors to obtain normalized vectors; and inputting the normalized vector into a plurality of layers of first activation function layers to obtain the first user vector.

Optionally, the first encoding unit is further configured to: performing word embedding processing on each item information in the item information set to generate a second vector to obtain a second vector set; splicing all the second vectors in the second vector set to obtain spliced vectors; carrying out batch normalization processing on the spliced vectors to obtain normalized vectors; and inputting the normalized vector into a plurality of layers of second activation function layers to obtain the first item vector, wherein the vector dimension of the first user vector is greater than or equal to the vector dimension of the first item vector.

Optionally, the second encoding unit is further configured to: determining a first item vector set adjacent to the first user vector according to the encoded subgraph; inputting each first item vector in the first item vector set to a third activation function layer to output a third vector, so as to obtain a third vector set; inputting each third vector in the third vector set into the first pooling layer to output a fourth vector, so as to obtain a fourth vector set; splicing each fourth vector in the fourth vector set with the first user vector to obtain a fifth vector; and inputting the fifth vector to a fourth activation function layer to obtain the second user vector.

Optionally, the second encoding unit is further configured to: determining a first user vector set adjacent to the first item vector according to the encoded subgraph; inputting each first user vector in the adjacent first user vector set to a fifth activation function layer to output a sixth vector, so as to obtain a sixth vector set; inputting each sixth vector in the sixth vector set into a second pooling layer to output a seventh vector, so as to obtain a seventh vector set; splicing each seventh vector in the seventh vector set with the first item vector to obtain an eighth vector; and inputting the eighth vector to a sixth activation function layer to obtain the second item vector.

Optionally, the apparatus further comprises units configured to: determining associated item nodes from the tree model by adopting a neighbor searching method according to the target user node; and determining the item set to be recalled according to the item nodes.

In a third aspect, some embodiments of the present disclosure provide an electronic device, comprising: one or more processors; a storage device having one or more programs stored thereon which, when executed by one or more processors, cause the one or more processors to implement a method as in any one of the first aspects.

In a fourth aspect, some embodiments of the disclosure provide a computer readable medium having a computer program stored thereon, wherein the program when executed by a processor implements a method as in any one of the first aspect.

The above embodiments of the present disclosure have the following beneficial effects: the information processing method of some embodiments of the present disclosure can accurately and efficiently encode the user nodes and item nodes in the bipartite graph. Specifically, when word embedding is performed directly on the user information of each user and the item information of each item in the graph model, the resulting word-embedded user vectors and item vectors do not deeply extract the feature information of each user and each item. In addition, the word-embedded user vectors and item vectors do not adequately represent the association information between users and items. Based on this, the information processing method of some embodiments of the present disclosure first performs subgraph extraction on the bipartite graph to generate a subgraph set. The bipartite graph represents the association relationship between the user set and the item set. Here, the number of nodes of the bipartite graph is generally much larger than the number of nodes in each subgraph. Therefore, in subsequent training, the node scale in model aggregation can be greatly reduced, which reduces the complexity of model training. Then, the user information set input by each user node and the item information set input by each item node of the subgraphs in the subgraph set are respectively encoded for the first time to extract the user feature information and the item feature information, so as to generate a first user vector and a first item vector, and a first user vector set and a first item vector set are obtained. Next, an encoded subgraph is generated according to the first user vector set and the first item vector set to obtain an encoded subgraph set. Then, according to the association relationship between the user nodes and the item nodes in the subgraph, second encoding is respectively performed on the first user vector corresponding to each user node and the first item vector corresponding to each item node in the encoded subgraph so as to generate a second user vector and a second item vector, and a second user vector set and a second item vector set are obtained. It should be noted that the second encoding can further extract the association relationship between the user nodes and the item nodes in the subgraph, so that each node in the bipartite graph can also learn the topological structure information of the graph. Finally, the vector of each node in the encoded subgraph is replaced with the second user vector set and the second item vector set according to the corresponding position to generate a replaced subgraph as a word-embedded subgraph, and a word-embedded subgraph set is obtained. Therefore, the information processing method can accurately and efficiently encode the user nodes and item nodes in the bipartite graph.

Drawings

The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and features are not necessarily drawn to scale.

FIGS. 1-3 are schematic diagrams of an application scenario of an information processing method according to some embodiments of the present disclosure;

FIG. 4 is a flow diagram of some embodiments of an information processing method according to the present disclosure;

FIG. 5 is a flow diagram of further embodiments of an information processing method according to the present disclosure;

FIG. 6 is a schematic block diagram of some embodiments of an information processing apparatus according to the present disclosure;

FIG. 7 is a schematic structural diagram of an electronic device suitable for use in implementing some embodiments of the present disclosure.

Detailed Description

Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.

It should be noted that, for convenience of description, only the portions related to the invention are shown in the drawings. The embodiments and features of the embodiments in the present disclosure may be combined with each other without conflict.

It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.

It is noted that references to "a", "an", and "the" modifications in this disclosure are intended to be illustrative rather than limiting, and that those skilled in the art will recognize that "one or more" may be used unless the context clearly dictates otherwise.

The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.

The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.

Fig. 1 to 3 are schematic diagrams of an application scenario of an information processing method according to some embodiments of the present disclosure.

As shown in fig. 1-3, electronic device 101 may first perform subgraph extraction on bipartite graph 102 to generate subgraph set 105. Wherein, the bipartite graph 102 represents the association relationship between the user set 103 and the item set 104. In the application scenario, the user set 103 may include: a first user 1031, a second user 1032, and a third user 1033. The set of items 104 may include: first item 1041, second item 1042, third item 1043. The subgraph set 105 includes: subgraph 1051, subgraph 1052 and subgraph 1053. The first-level node of the sub-graph 1051 is the first user 1031. The second level nodes of the sub-graph 1051 are a first item 1041 and a second item 1042 associated with a first user 1031 in the first level nodes. The third level node of the above sub-graph 1051 includes: first and third users 1031, 1033 associated with the first item 1041 in the second level node, and first and third users 1031, 1033 associated with the second item 1042 in the second level node. The first level node of the above sub-graph 1052 is the second user 1032. The second level nodes of sub-diagram 1052 are first item 1041 and third item 1043 associated with second user 1032 in the first level node. The third level nodes of sub-graph 1052 include: first and third users 1031, 1033 associated with the first item 1041 in the second level node, and second and third users 1032, 1033 associated with the third item 1043 in the second level node. The first level node of the sub-graph 1053 is the third user 1033. The second level nodes of sub-graph 1053 above are first item 1041 and second item 1042 associated with third user 1033 in the first level nodes. The third level node of sub-graph 1053 above includes: first and third users 1031, 1033 associated with the first item 1041 in the second level node, and first and third users 1031, 1033 associated with the second item 1042 in the second level node. Then, the user information set input by each user node and the item information set input by each item node of the sub-graph in the sub-graph set 105 are respectively encoded for the first time to generate a first user vector and a first item vector, so as to obtain a first user vector set and a first item vector set. And then, generating a coded subgraph according to the first user vector set and the first item vector set to obtain a coded subgraph set. In the present application scenario, the encoded sub-graph 106 is derived from sub-graph 1052. The first user vector corresponding to the second user in the first layer in the encoded subgraph 106 may be: (1,2,4,5,2,43,2,12,21,33). The first item vector corresponding to the first item in the second layer in the encoded subgraph 106 may be: (6,2,5,2,9,4,1,3). The first item vector corresponding to the third item in the second layer in the encoded subgraph 106 may be: (12,4,9,3,7,4,0,3). The first user vector corresponding to the first user at the third layer in the encoded subgraph 106 may be: (4,3,4,8,4,8,21,12,23,21). The first user vector corresponding to the third user in the third layer in the encoded subgraph 106 may be: (12,2,12,5,2,4,2,2,0,3). The first user vector corresponding to the second user in the third layer in the encoded subgraph 106 may be: (1,2,4,5,2,43,2,12,21,33). 
Then, according to the association relationship between the user nodes and the item nodes in the sub-graph, the first user vector corresponding to each user node and the first item vector corresponding to each item node in the encoded sub-graph are respectively encoded for the second time to generate a second user vector and a second item vector, so as to obtain a second user vector set and a second item vector set. Finally, the vector of each node in the encoded subgraph is replaced with the second user vector set and the second item vector set according to the corresponding position to generate a replaced subgraph as a word-embedded subgraph, and a word-embedded subgraph set is obtained. In the present application scenario, the word-embedded subgraph 107 is obtained from the encoded subgraph 106. The second user vector corresponding to the second user in the first layer of the word-embedded subgraph 107 may be: (8,8,7,5,2,3,5,12,5,4). The second item vector corresponding to the first item in the second layer of the word-embedded subgraph 107 may be: (6,6,8,5,2,3,5,5,5,3). The second item vector corresponding to the third item in the second layer of the word-embedded subgraph 107 may be: (5,5,7,5,2,3,3,12,0,2). The second user vector corresponding to the first user in the third layer of the word-embedded subgraph 107 may be: (1,2,3,5,2,3,5,9,5,4). The second user vector corresponding to the third user in the third layer of the word-embedded subgraph 107 may be: (2,2,7,5,2,3,1,1,5,9). The second user vector corresponding to the second user in the third layer of the word-embedded subgraph 107 may be: (8,8,7,5,2,3,5,12,5,4).

The electronic device 101 may be hardware or software. When the electronic device is hardware, the electronic device may be implemented as a distributed cluster formed by a plurality of servers or terminal devices, or may be implemented as a single server or a single terminal device. When the electronic device is embodied as software, it may be installed in the above-listed hardware devices. It may be implemented, for example, as multiple software or software modules to provide distributed services, or as a single software or software module. And is not particularly limited herein.

It should be understood that the number of electronic devices in fig. 1,2 or 3 is merely illustrative. There may be any number of electronic devices, as desired for implementation.

With continued reference to fig. 4, a flow 400 of some embodiments of an information processing method according to the present disclosure is shown. The information processing method comprises the following steps:

step 401, performing subgraph extraction on the bipartite graph to generate a subgraph set.

In some embodiments, an executing subject of the information processing method (e.g., the electronic device 101 shown in fig. 1) may perform subgraph extraction on the bipartite graph to generate a subgraph set. The bipartite graph represents the association relationship between the user set and the item set. For example, the association relationship between a user and an item in the bipartite graph may be a click behavior relationship. In addition, a connecting edge in the bipartite graph may point from a user to an item; no connecting edges are established between users or between items. A connecting edge can represent an association relationship between a user and an item.

As an example, a two-stage graph sampling strategy may be employed for subgraph extraction to generate a set of subgraphs.

Optionally, the two-stage graph sampling strategy may include the following steps (a minimal sketch follows the list):

first, a target user node is determined from the user node set in the bipartite graph.

And secondly, determining the number of nodes of the item node set connected with the target user node in the bipartite graph.

And thirdly, randomly selecting a first number of item nodes from the item node set connected with the target user node. Wherein the first number is less than or equal to the number of nodes of the connected item node set.

And fourthly, determining the number of nodes of the user node set connected with each selected item node.

And fifthly, selecting a second number of user nodes from the user node set connected with each item node. Wherein the second number is less than or equal to the number of nodes of the connected user node set.
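The following is a minimal Python sketch of the two-stage sampling above. The adjacency dictionaries, the function name sample_subgraph, and the sample sizes are illustrative assumptions, not part of the original disclosure:

```python
import random

def sample_subgraph(user_to_items, item_to_users, target_user,
                    first_num, second_num):
    """Extract one subgraph rooted at target_user (hypothetical helper).

    Stage 1 samples item nodes connected to the target user; stage 2
    samples user nodes connected to each selected item node.
    """
    # Stage 1: first_num is capped by the number of connected item nodes.
    items = user_to_items[target_user]
    selected_items = random.sample(items, min(first_num, len(items)))

    # Stage 2: second_num is capped by the number of connected user nodes.
    users_per_item = {
        item: random.sample(item_to_users[item],
                            min(second_num, len(item_to_users[item])))
        for item in selected_items
    }
    return {"root": target_user,
            "items": selected_items,
            "users_per_item": users_per_item}

# Toy bipartite graph mirroring the application scenario (3 users, 3 items).
user_to_items = {"u1": ["i1", "i2"], "u2": ["i1", "i3"], "u3": ["i1", "i2"]}
item_to_users = {"i1": ["u1", "u2", "u3"], "i2": ["u1", "u3"], "i3": ["u2"]}
subgraph = sample_subgraph(user_to_items, item_to_users, "u2", 2, 2)
```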

Step 402, respectively performing first encoding on the user information set input by each user node and the item information set input by each item node of the subgraph in the subgraph set to generate a first user vector and a first item vector, so as to obtain a first user vector set and a first item vector set.

In some embodiments, the executing body performs first encoding on the user information set input by each user node and the item information set input by each item node of the sub-graph in the sub-graph set respectively to generate a first user vector and a first item vector, so as to obtain a first user vector set and a first item vector set. The user information set input by a user node may be user feature information related to the corresponding user, for example, age information of the user, gender information of the user, and the like. The item information set input by an item node may be item feature information related to the corresponding item, for example, category information of the item, brand information of the item, value information of the item, and the like.

As an example, the executing entity may perform word embedding processing on the user information set input by each user node and the item information set input by each item node of the subgraphs in the subgraph set respectively to generate a first user vector and a first item vector, so as to obtain a first user vector set and a first item vector set. Word embedding (Word Embedding) may be a method of converting words in text into numeric vectors. Word embedding maps a high-dimensional space, whose dimension equals the vocabulary size, into a continuous vector space of much lower dimension; each word or phrase is mapped to a vector over the real numbers, and the word vector is generated as the result of the word embedding.
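As a minimal illustration of the word embedding step, assuming PyTorch as the framework (the disclosure does not name one); the vocabulary size and embedding dimension are arbitrary:

```python
import torch
import torch.nn as nn

vocab_size, embed_dim = 10000, 16            # illustrative sizes
embedding = nn.Embedding(vocab_size, embed_dim)

# Map two discrete feature ids (e.g. an age bucket id and a gender id)
# to dense vectors on the real number domain.
feature_ids = torch.tensor([42, 7])
first_vectors = embedding(feature_ids)       # shape: (2, 16)
```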

In some optional implementation manners of some embodiments, the first encoding of the user information set input by each user node and the item information set input by each item node of the sub-graph in the sub-graph set to generate a first user vector and a first item vector respectively includes the following steps (a minimal sketch follows the list):

the method comprises the following steps that firstly, word embedding processing is carried out on each user information in the user information set to generate a first vector, and a first vector set is obtained.

And secondly, splicing each first vector in the first vector set to obtain a spliced vector.

And thirdly, carrying out Batch Normalization (BN) on the spliced vectors to obtain normalized vectors. Here, batch normalization of the spliced vectors can increase the training speed and make a separate regularization step in model training unnecessary. Furthermore, it improves the training precision of the model.

And fourthly, inputting the normalized vector into a plurality of first activation function layers to obtain the first user vector. The dimensions of the output vectors of the respective first activation function layers may differ. The activation function may include, but is not limited to, at least one of: normalized exponential function (Softmax), linear rectification function (Rectified Linear Unit, ReLU). As an example, the plurality of first activation function layers may consist of three layers: the output vector dimension of the first layer may be 512 × 1, that of the second layer may be 128 × 1, and that of the third layer may be 64 × 1.
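A minimal PyTorch sketch of this first-encoding pipeline (splice, batch-normalize, multi-layer activation). Treating each "activation function layer" as a Linear layer followed by ReLU is an assumption; the layer widths follow the 512/128/64 example above, and the input field sizes are illustrative:

```python
import torch
import torch.nn as nn

class FirstEncoder(nn.Module):
    """Splices embedded feature vectors, batch-normalizes the result,
    then applies several Linear + ReLU layers."""
    def __init__(self, in_dim, hidden_dims):
        super().__init__()
        self.bn = nn.BatchNorm1d(in_dim)
        layers, prev = [], in_dim
        for dim in hidden_dims:
            layers += [nn.Linear(prev, dim), nn.ReLU()]
            prev = dim
        self.mlp = nn.Sequential(*layers)

    def forward(self, embedded_features):
        # embedded_features: list of (batch, d_i) tensors, one per field.
        spliced = torch.cat(embedded_features, dim=1)
        return self.mlp(self.bn(spliced))

# User side: two 16-d feature fields in, 512 -> 128 -> 64 activation layers.
user_encoder = FirstEncoder(in_dim=2 * 16, hidden_dims=[512, 128, 64])
first_user_vec = user_encoder([torch.randn(8, 16), torch.randn(8, 16)])
```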

Optionally, the first encoding of the user information set input by each user node and the item information set input by each item node of the subgraph in the subgraph set to generate a first user vector and a first item vector may include the following steps (a usage note follows the list):

the method comprises the following steps of firstly, carrying out word embedding processing on each item information in the item information set to generate a second vector, and obtaining a second vector set.

And secondly, splicing each second vector in the second vector set to obtain a spliced vector.

And thirdly, carrying out batch normalization processing on the spliced vectors to obtain normalized vectors.

And fourthly, inputting the normalized vector into a plurality of second activation function layers to obtain the first item vector. The vector dimension of the first user vector is greater than or equal to the vector dimension of the first item vector. As an example, the plurality of second activation function layers may consist of three layers: the output vector dimension of the first layer may be 128 × 1, that of the second layer may be 64 × 1, and that of the third layer may be 32 × 1.
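The item-side first encoding follows the same pipeline; reusing the hypothetical FirstEncoder sketch above, only the layer widths change (128/64/32 in the example), so the resulting first item vector (32-d) is no wider than the first user vector (64-d):

```python
# Item side: three 16-d feature fields in, 128 -> 64 -> 32 activation layers.
item_encoder = FirstEncoder(in_dim=3 * 16, hidden_dims=[128, 64, 32])
first_item_vec = item_encoder([torch.randn(8, 16)] * 3)
```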

Step 403, generating a coded sub-graph according to the first user vector set and the first item vector set, so as to obtain a coded sub-graph set.

In some embodiments, the executing agent may generate an encoded subgraph according to the first set of user vectors and the first set of item vectors, to obtain an encoded subgraph set. Each node on the encoded subgraph has a corresponding encoding vector. As an example, the executing entity may label the first set of user vectors and the first set of item vectors at the corresponding positions in the sub-graph to obtain an encoded sub-graph.

And step 404, respectively performing second encoding on the first user vector corresponding to each user node and the first item vector corresponding to each item node in the encoded sub-graph according to the association relationship between the user nodes and the item nodes in the sub-graph to generate a second user vector and a second item vector, so as to obtain a second user vector set and a second item vector set.

In some embodiments, the executing entity may perform second encoding on the first user vector corresponding to each user node and the first item vector corresponding to each item node in the encoded sub-graph respectively according to the association relationship between the user nodes and the item nodes in the sub-graph to generate a second user vector and a second item vector, so as to obtain a second user vector set and a second item vector set. As an example, the executing entity may input the encoded sub-graph to a Graph Convolutional Network (GCN) trained in advance according to the association relationship between the user nodes and the item nodes in the sub-graph, so as to obtain the second set of user vectors and the second set of item vectors.

In some optional implementation manners of some embodiments, the second encoding, according to the association relationship between the user node and the item node in the sub-graph, of the first user vector corresponding to each user node and the first item vector corresponding to each item node in the encoded sub-graph to generate a second user vector and a second item vector respectively may include the following steps (a minimal sketch follows the list):

and step one, determining a first item vector set adjacent to the first user vector according to the encoded subgraph.

And secondly, inputting each first item vector in the first item vector set to a third activation function layer to output a third vector, so as to obtain a third vector set. As an example, the output vector dimension of the third activation function layer described above may be 64 × 1.

And thirdly, inputting each third vector in the third vector set into the first pooling layer to output a fourth vector, so as to obtain a fourth vector set. The first pooling layer may be one of: max pooling (Max Pooling), average pooling (Average Pooling). The feature mapping results may be further processed using a pooling layer. Pooling statistically summarizes the feature values at a position and its adjacent positions within the plane, and takes the summarized result as the value of that position. Max pooling computes the maximum value within the position and its neighboring matrix region and takes this maximum as the value of the position. Average pooling computes the average value within the position and its neighboring matrix region and takes this average as the value of the position. Pooling does not change the depth of the data matrix; it only reduces the height and width, thereby achieving dimension reduction. As an example, the dimension of the output vector of the first pooling layer may be 64 × 1.

And fourthly, splicing each fourth vector in the fourth vector set with the first user vector to obtain a fifth vector.

And fifthly, inputting the fifth vector to a fourth activation function layer to obtain the second user vector. As an example, the dimension of the vector output by the fourth activation function layer may be 64 × 1.
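A minimal PyTorch sketch of this user-side second encoding, written as a GraphSAGE-style pooling aggregator (the pooling-aggregator form, max pooling, and the layer widths are assumptions consistent with the 64 × 1 examples above):

```python
import torch
import torch.nn as nn

class NeighborAggregator(nn.Module):
    """Encodes one node from its own first vector plus the first vectors
    of its adjacent nodes in the encoded subgraph."""
    def __init__(self, nbr_dim, self_dim, out_dim=64):
        super().__init__()
        # Activation layer applied to each neighbor vector (third vectors).
        self.nbr_fc = nn.Sequential(nn.Linear(nbr_dim, out_dim), nn.ReLU())
        # Activation layer applied after splicing (fifth -> second vector).
        self.out_fc = nn.Sequential(nn.Linear(out_dim + self_dim, out_dim),
                                    nn.ReLU())

    def forward(self, self_vec, neighbor_vecs):
        h = self.nbr_fc(neighbor_vecs)        # (num_neighbors, out_dim)
        pooled = h.max(dim=0).values          # first pooling layer (max pooling)
        spliced = torch.cat([pooled, self_vec])
        return self.out_fc(spliced)           # second user vector, 64-d

# One user node: 64-d first user vector, three adjacent 32-d first item vectors.
user_agg = NeighborAggregator(nbr_dim=32, self_dim=64)
second_user_vec = user_agg(torch.randn(64), torch.randn(3, 32))
```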

In some optional implementation manners of some embodiments, the second encoding, according to the association relationship between the user node and the item node in the sub-graph, of the first user vector corresponding to each user node and the first item vector corresponding to each item node in the encoded sub-graph to generate a second user vector and a second item vector respectively may include the following steps (a usage note follows the list):

and step one, determining a first user vector set adjacent to the first item vector according to the encoded subgraph.

And secondly, inputting each first user vector in the adjacent first user vector set to a fifth activation function layer to output a sixth vector, so as to obtain a sixth vector set. As an example, the output vector dimension of the fifth activation function layer may be 64 × 1.

And thirdly, inputting each sixth vector in the sixth vector set into a second pooling layer to output a seventh vector, so as to obtain a seventh vector set. As an example, the dimension of the output vector of the second pooling layer described above may be 64 × 1.

And fourthly, splicing each seventh vector in the seventh vector set with the first item vector to obtain an eighth vector.

And fifthly, inputting the eighth vector to a sixth activation function layer to obtain the second item vector. As an example, the output vector dimension of the sixth activation function layer may be 64 × 1.
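The item-side second encoding mirrors the user side with the roles swapped; reusing the hypothetical NeighborAggregator sketch above, the adjacent first user vectors are aggregated and spliced with the item's own first vector:

```python
# One item node: 32-d first item vector, two adjacent 64-d first user vectors.
item_agg = NeighborAggregator(nbr_dim=64, self_dim=32)
second_item_vec = item_agg(torch.randn(32), torch.randn(2, 64))
```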

The above embodiments of the present disclosure have the following beneficial effects: the information processing method of some embodiments of the present disclosure can accurately and efficiently encode the user nodes and item nodes in the bipartite graph. Specifically, when word embedding is performed directly on the user information of each user and the item information of each item in the graph model, the resulting word-embedded user vectors and item vectors do not deeply extract the feature information of each user and each item. In addition, the word-embedded user vectors and item vectors do not adequately represent the association information between users and items. Based on this, the information processing method of some embodiments of the present disclosure first performs subgraph extraction on the bipartite graph to generate a subgraph set. The bipartite graph represents the association relationship between the user set and the item set. Here, the number of nodes of the bipartite graph is generally much larger than the number of nodes in each subgraph. Therefore, in subsequent training, the node scale in model aggregation can be greatly reduced, which reduces the complexity of model training. Then, the user information set input by each user node and the item information set input by each item node of the subgraphs in the subgraph set are respectively encoded for the first time to extract the user feature information and the item feature information, so as to generate a first user vector and a first item vector, and a first user vector set and a first item vector set are obtained. Next, an encoded subgraph is generated according to the first user vector set and the first item vector set to obtain an encoded subgraph set. Then, according to the association relationship between the user nodes and the item nodes in the subgraph, second encoding is respectively performed on the first user vector corresponding to each user node and the first item vector corresponding to each item node in the encoded subgraph so as to generate a second user vector and a second item vector, and a second user vector set and a second item vector set are obtained. It should be noted that the second encoding can further extract the association relationship between the user nodes and the item nodes in the subgraph, so that each node in the bipartite graph can also learn the topological structure information of the graph. Finally, the vector of each node in the encoded subgraph is replaced with the second user vector set and the second item vector set according to the corresponding position to generate a replaced subgraph as a word-embedded subgraph, and a word-embedded subgraph set is obtained. Therefore, the information processing method can accurately and efficiently encode the user nodes and item nodes in the bipartite graph.

With continued reference to FIG. 5, a flow 500 of further embodiments of information processing methods according to the present disclosure is shown. The information processing method comprises the following steps:

step 501, performing subgraph extraction on the bipartite graph to generate a subgraph set.

Step 502, respectively performing first encoding on the user information set input by each user node and the item information set input by each item node of the subgraph in the subgraph set to generate a first user vector and a first item vector, so as to obtain a first user vector set and a first item vector set.

Step 503, generating a coded sub-graph according to the first user vector set and the first item vector set, and obtaining a coded sub-graph set.

And step 504, respectively performing second encoding on the first user vector corresponding to each user node and the first item vector corresponding to each item node in the encoded sub-graph according to the association relationship between the user nodes and the item nodes in the sub-graph so as to generate a second user vector and a second item vector, and obtaining a second user vector set and a second item vector set.

Step 505, replacing the vector of each node in the encoded subgraph with the second user vector set and the second item vector set according to the corresponding position to generate a replaced subgraph as a word-embedded subgraph, and obtaining a word-embedded subgraph set.

In some embodiments, the specific implementation and technical effects of steps 501-505 may refer to steps 401-405 in those embodiments corresponding to fig. 4, which are not described herein again.

And step 506, according to each obtained word-embedded subgraph, performing vector labeling on each node on the bipartite graph to obtain the vector-labeled bipartite graph.

In some embodiments, the execution subject (e.g., the electronic device 101 shown in fig. 1) may perform vector labeling on each node of the bipartite graph according to each obtained word-embedded subgraph, so as to obtain a vector-labeled bipartite graph.

And step 507, taking each item node in the vector-labeled bipartite graph as a leaf node to construct a tree model.

In some embodiments, the executing entity constructs a tree model by using each item node in the vector-labeled bipartite graph as a leaf node.

As an example, the execution subject may first take each item node in the vector-labeled bipartite graph as a leaf node. Then, the vectors corresponding to the leaf nodes are clustered by using a clustering algorithm, so that a preset number of internal nodes of the next level up can be obtained. Each internal node corresponds to the leaf nodes grouped into one class. Further, the cluster-center vector of each internal node is taken. Finally, the internal nodes of each higher level and their cluster-center vectors are determined by repeated clustering, until the root node of the tree model is obtained.
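A minimal sketch of this bottom-up tree construction. Using scikit-learn's KMeans as the clustering algorithm and a branching factor of 4 are assumptions; the disclosure only requires some clustering algorithm:

```python
import numpy as np
from sklearn.cluster import KMeans

# Leaves: one vector per item node from the vector-labeled bipartite graph
# (random stand-in data here).
leaf_vectors = np.random.rand(256, 64)

levels, labels = [leaf_vectors], []
while len(levels[-1]) > 1:
    # Cluster the current level; cluster centers become the parent level.
    n_parents = max(1, len(levels[-1]) // 4)   # assumed branching factor 4
    km = KMeans(n_clusters=n_parents, n_init=10).fit(levels[-1])
    levels.append(km.cluster_centers_)          # parent-level vectors
    labels.append(km.labels_)                   # each node's parent index
# levels[0] holds the leaves; levels[-1] holds the single root node.
```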

And step 508, determining the item set to be recalled associated with the target user node according to the tree model.

In some embodiments, the executing entity may determine the set of items to recall associated with the target user node in various ways according to the tree model.

In some optional implementations of some embodiments, the determining the set of items to be recalled associated with the target user node according to the tree model may include the following steps (a minimal sketch follows the list):

Firstly, according to the target user node, determining associated item nodes from the tree model by adopting a neighbor searching method.

And secondly, determining the item set to be recalled according to the item nodes.
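Continuing the tree sketch above, a beam-search descent is one plausible concretization of the "neighbor searching method" (the inner-product similarity and the beam width are assumptions):

```python
import numpy as np

def recall_items(user_vec, levels, labels, beam_width=10):
    """Descend the tree level by level, keeping the beam_width child
    nodes most similar (inner product) to the target user vector."""
    candidates = np.arange(len(levels[-1]))        # start at the root level
    for depth in range(len(levels) - 1, 0, -1):
        # Children at level depth-1 whose parent is in the current beam.
        children = np.nonzero(np.isin(labels[depth - 1], candidates))[0]
        scores = levels[depth - 1][children] @ user_vec
        candidates = children[np.argsort(-scores)[:beam_width]]
    return candidates    # indices of item leaf nodes to recall

to_recall = recall_items(np.random.rand(64), levels, labels)
```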

As can be seen from fig. 5, compared with the description of some embodiments corresponding to fig. 4, the flow 500 of the information processing method in some embodiments corresponding to fig. 5 highlights the specific steps of implementing item recall related to the target user through the obtained word-embedded subgraph set. Therefore, the scheme described in these embodiments constructs a tree model based on the obtained word-embedded subgraph set, so that the item set to be recalled associated with the target user node can be determined accurately and quickly through the tree model.

With continued reference to fig. 6, as an implementation of the methods shown in the above figures, the present disclosure provides some embodiments of an information processing apparatus, which correspond to the method embodiments shown in fig. 4, and which may be applied in various electronic devices.

As shown in fig. 6, an information processing apparatus 600 of some embodiments includes: an extraction unit 601, a first encoding unit 602, a generation unit 603, a second encoding unit 604, and a replacement unit 605. The extraction unit 601 is configured to perform sub-graph extraction on a bipartite graph to generate a sub-graph set, wherein the bipartite graph represents an association relationship between a user set and an item set. The first encoding unit 602 is configured to perform first encoding on the user information set input by each user node and the item information set input by each item node of the sub-graph in the sub-graph set to generate a first user vector and a first item vector, so as to obtain a first user vector set and a first item vector set. The generation unit 603 is configured to generate an encoded sub-graph according to the first set of user vectors and the first set of item vectors, resulting in an encoded sub-graph set. The second encoding unit 604 is configured to perform second encoding on the first user vector corresponding to each user node and the first item vector corresponding to each item node in the encoded sub-graph respectively according to the association relationship between the user nodes and the item nodes in the sub-graph to generate a second user vector and a second item vector, so as to obtain a second user vector set and a second item vector set. The replacement unit 605 is configured to replace the vectors of each node in the encoded subgraph with the second user vector set and the second item vector set according to the corresponding positions to generate a replaced subgraph as a word-embedded subgraph, resulting in a word-embedded subgraph set.

In some optional implementations of some embodiments, the apparatus further includes: a preprocessing unit and an input unit (not shown in the figure). The preprocessing unit may be further configured to: preprocess a question queried by a target user to obtain a processed text. The input unit may be further configured to: input the processed text into a pre-trained language generation model to obtain reply information.

In some optional implementations of some embodiments, the apparatus further includes: a vector labeling unit, a construction unit and a determination unit (not shown in the figure). The vector labeling unit may be configured to: perform vector labeling on each node of the bipartite graph according to each obtained word-embedded subgraph, obtaining a vector-labeled bipartite graph. The construction unit may be configured to: construct a tree model with each item node in the vector-labeled bipartite graph as a leaf node. The determination unit may be configured to: determine the set of items to be recalled associated with the target user node according to the tree model.

In some optional implementations of some embodiments, the first encoding unit 602 of the information processing apparatus 600 may be further configured to: perform word embedding processing on each piece of user information in the user information set to generate a first vector, obtaining a first vector set; splice the first vectors in the first vector set to obtain a spliced vector; perform batch normalization processing on the spliced vector to obtain a normalized vector; and input the normalized vector into multiple first activation function layers to obtain the first user vector.

In some optional implementations of some embodiments, the first encoding unit 602 of the information processing apparatus 600 may be further configured to: perform word embedding processing on each piece of item information in the item information set to generate a second vector, obtaining a second vector set; splice the second vectors in the second vector set to obtain a spliced vector; perform batch normalization processing on the spliced vector to obtain a normalized vector; and input the normalized vector into multiple second activation function layers to obtain the first item vector, where the vector dimension of the first user vector is greater than or equal to the vector dimension of the first item vector.
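Read together, the two preceding paragraphs describe one shared structure for the first encoding: embed, splice, batch-normalize, then apply several activation layers. The following PyTorch sketch is one possible rendering of that structure; the vocabulary sizes, embedding dimension and layer widths are invented for illustration, and the only constraint stated in the text is that the user encoder's output dimension be at least the item encoder's.

```python
import torch
import torch.nn as nn


class FirstEncoder(nn.Module):
    """Word-embed each discrete feature, splice (concatenate) the embeddings,
    batch-normalize the spliced vector, then pass it through several
    fully connected layers with activations."""

    def __init__(self, vocab_sizes, embedding_dim=16, out_dim=64):
        super().__init__()
        self.embeddings = nn.ModuleList(
            nn.Embedding(v, embedding_dim) for v in vocab_sizes)
        spliced_dim = embedding_dim * len(vocab_sizes)
        self.bn = nn.BatchNorm1d(spliced_dim)          # batch normalization
        self.mlp = nn.Sequential(                      # multiple activation layers
            nn.Linear(spliced_dim, 128), nn.ReLU(),
            nn.Linear(128, out_dim), nn.ReLU())

    def forward(self, feature_ids):                    # (batch, n_features) int64
        parts = [emb(feature_ids[:, j]) for j, emb in enumerate(self.embeddings)]
        spliced = torch.cat(parts, dim=1)              # splice the per-feature vectors
        return self.mlp(self.bn(spliced))


# Per the text, the user vector dimension must be >= the item vector dimension.
user_encoder = FirstEncoder(vocab_sizes=[1000, 50, 20], out_dim=64)
item_encoder = FirstEncoder(vocab_sizes=[5000, 30], out_dim=32)
```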

In some optional implementations of some embodiments, the second encoding unit 604 of the information processing apparatus 600 may be further configured to: determine, according to the encoded subgraph, the set of first item vectors adjacent to the first user vector; input each first item vector in the set into a third activation function layer to output a third vector, obtaining a third vector set; input each third vector in the third vector set into a first pooling layer to output a fourth vector, obtaining a fourth vector set; splice each fourth vector in the fourth vector set with the first user vector to obtain a fifth vector; and input the fifth vector into a fourth activation function layer to obtain the second user vector.

In some optional implementations of some embodiments, the second encoding unit 604 of the information processing apparatus 600 may be further configured to: determine, according to the encoded subgraph, the set of first user vectors adjacent to the first item vector; input each first user vector in the adjacent set into a fifth activation function layer to output a sixth vector, obtaining a sixth vector set; input each sixth vector in the sixth vector set into a second pooling layer to output a seventh vector, obtaining a seventh vector set; splice each seventh vector in the seventh vector set with the first item vector to obtain an eighth vector; and input the eighth vector into a sixth activation function layer to obtain the second item vector.
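The two aggregation steps above are symmetric, so a single module can serve both directions. A minimal PyTorch sketch follows, assuming max pooling over the neighbours' transformed vectors (the text only says "pooling layer") and illustrative dimensions:

```python
import torch
import torch.nn as nn


class SecondEncoder(nn.Module):
    """For one node: pass each neighbour's first vector through an activation
    layer, pool over the neighbours, splice the pooled vector with the node's
    own first vector, and pass the result through a final activation layer."""

    def __init__(self, self_dim, neigh_dim, hidden_dim=64, out_dim=64):
        super().__init__()
        self.neigh_fc = nn.Sequential(nn.Linear(neigh_dim, hidden_dim), nn.ReLU())
        self.out_fc = nn.Sequential(nn.Linear(self_dim + hidden_dim, out_dim), nn.ReLU())

    def forward(self, self_vec, neigh_vecs):
        # self_vec: (self_dim,); neigh_vecs: (n_neighbours, neigh_dim)
        transformed = self.neigh_fc(neigh_vecs)          # third/sixth vectors
        pooled, _ = transformed.max(dim=0)               # pooling layer (max, assumed)
        spliced = torch.cat([self_vec, pooled], dim=0)   # fifth/eighth vector input
        return self.out_fc(spliced)


# One instance per direction: users aggregate neighbouring items and vice versa.
user_second = SecondEncoder(self_dim=64, neigh_dim=32)   # second user vectors
item_second = SecondEncoder(self_dim=32, neigh_dim=64)   # second item vectors
```

This is the usual neighbour-aggregation pattern for bipartite graphs: each node's second vector mixes its own features with a summary of the nodes it is associated with, so users and items that interact end up with related representations.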

In some optional implementations of some embodiments, the apparatus further includes: an item node determination unit and an item determination unit (not shown in the figure). The item node determination unit may be configured to: determine the associated item node from the tree model by a neighbor-search method according to the target user node. The item determination unit may be configured to: determine the set of items to be recalled according to the item node.

It should be understood that the units recorded in the apparatus 600 correspond to the respective steps of the method described with reference to fig. 4. Therefore, the operations, features and resulting advantages described above for the method are also applicable to the apparatus 600 and the units included therein, and are not repeated here.

Referring now to fig. 7, a schematic structural diagram of an electronic device 700 suitable for implementing some embodiments of the present disclosure is shown. The electronic device shown in fig. 7 is only an example, and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.

As shown in fig. 7, the electronic device 700 may include a processing device (e.g., a central processing unit, a graphics processor, etc.) 701 that may perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 702 or a program loaded from a storage device 708 into a random access memory (RAM) 703. Various programs and data necessary for the operation of the electronic device 700 are also stored in the RAM 703. The processing device 701, the ROM 702 and the RAM 703 are connected to each other through a bus 704. An input/output (I/O) interface 705 is also connected to the bus 704.

Generally, the following devices may be connected to the I/O interface 705: an input device 706 including, for example, a touch screen, a touch pad, a keyboard, a mouse, a camera, a microphone, an accelerometer, a gyroscope, etc.; an output device 707 including, for example, a liquid crystal display (LCD), a speaker, a vibrator, etc.; a storage device 708 including, for example, a magnetic tape, a hard disk, etc.; and a communication device 709. The communication device 709 may allow the electronic device 700 to communicate wirelessly or by wire with other devices to exchange data. While fig. 7 shows an electronic device 700 having various devices, it should be understood that not all of the illustrated devices are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided. Each block shown in fig. 7 may represent one device or multiple devices as desired.

In particular, according to some embodiments of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, some embodiments of the present disclosure include a computer program product comprising a computer program carried on a computer readable medium, the computer program containing program code for performing the method illustrated in the flowchart. In such embodiments, the computer program may be downloaded and installed from a network via the communication device 709, or installed from the storage device 708, or installed from the ROM 702. When executed by the processing device 701, the computer program performs the above-described functions defined in the methods of some embodiments of the present disclosure.

It should be noted that the computer readable medium described above in some embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In some embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In some embodiments of the present disclosure, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.

In some embodiments, the client and the server may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with digital data communication (e.g., a communication network) in any form or medium. Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet) and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.

The computer readable medium may be included in the electronic device, or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: perform subgraph extraction on a bipartite graph to generate a subgraph set, where the bipartite graph represents the association relationship between a user set and an item set; perform first encoding on the user information set input by each user node and the item information set input by each item node of a subgraph in the subgraph set, respectively, to generate a first user vector and a first item vector, obtaining a first user vector set and a first item vector set; generate an encoded subgraph according to the first user vector set and the first item vector set, obtaining an encoded subgraph set; perform second encoding on the first user vector corresponding to each user node and the first item vector corresponding to each item node in the encoded subgraph, respectively, according to the association relationship between user nodes and item nodes in the subgraph, to generate a second user vector and a second item vector, obtaining a second user vector set and a second item vector set; and replace the vectors of the nodes in the encoded subgraph with the second user vector set and the second item vector set according to the corresponding positions to generate a replaced subgraph as a word-embedded subgraph, obtaining a set of word-embedded subgraphs.

Computer program code for carrying out operations of embodiments of the present disclosure may be written in one or more programming languages or a combination thereof, including object oriented programming languages such as Java, Smalltalk and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case involving a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).

The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.

The units described in some embodiments of the present disclosure may be implemented by software or by hardware. The described units may also be provided in a processor, which may, for example, be described as: a processor including an extracting unit, a first encoding unit, a generating unit, a second encoding unit and a replacing unit. For example, the first encoding unit may also be described as "a unit that performs first encoding on the user information set input by each user node and the item information set input by each item node of a subgraph in the subgraph set, respectively, to generate a first user vector and a first item vector, obtaining a first user vector set and a first item vector set".

The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.

The foregoing description is only of preferred embodiments of the present disclosure and an explanation of the applied technical principles. It should be appreciated by those skilled in the art that the scope of the invention covered by the embodiments of the present disclosure is not limited to technical solutions formed by the specific combination of the above technical features, and should also cover other technical solutions formed by any combination of the above technical features or their equivalents without departing from the above inventive concept, for example, technical solutions formed by replacing the above features with technical features having similar functions disclosed in (but not limited to) the embodiments of the present disclosure.
