Contextual feedback with expiration indicators for natural understanding systems in chat robots

Document No.: 425868    Publication date: 2021-12-21

Description: This technique, "Contextual feedback with expiration indicators for natural understanding systems in chat robots," was created by J. A. Taylor and V. S. Kannan on 2020-04-23. Abstract: A chat robot computing system includes a robot controller and a natural language processor. The natural language processor receives a first text input and identifies a concept represented by the first text input. An indication of the concept is output to the robot controller, which generates a response to the first text input. When a second text input is received, the concept output by the natural language processor is also fed back into the input of the natural language processor, as context information, along with an expiration indicator. The natural language processor then identifies concepts represented in the second text input based on the second natural language text input and the unexpired context information.

1. A computing system, comprising:

a Natural Language Processor (NLP) that receives a text input indicating a given received chat message and context information identified based on a previously received chat message received prior to the given received chat message, the natural language processor generating an NLP output based on the text input and the context information, the NLP output identifying concepts in the given received chat message;

an expiration indicator generator that generates an expiration indicator corresponding to the identified concept;

context filtering logic that filters the NLP output based on the expiration indicator to obtain a filtered NLP output and provides the filtered NLP output back to the natural language processor as context information for a subsequent received chat message received after the given received chat message; and

a robot controller that receives the NLP output from the natural language processor and generates a response output based on the NLP output.

2. The computing system of claim 1, wherein the context filtering logic comprises:

an expiration criteria processor configured to identify a relevance of the contextual information to the subsequently received chat message based on the expiration indicator.

3. The computing system of claim 2, wherein the expiration indicator generator is configured to generate a concept-level expiration indicator corresponding to each concept of the plurality of concepts identified in the given received chat message.

4. The computing system of claim 3, wherein the plurality of concepts together comprise overall context information, and wherein the expiration indicator generator is configured to generate an overall expiration indicator corresponding to the overall context information.

5. The computing system of claim 4, wherein the context filtering logic comprises:

a context filtering system that filters the contextual information based on the relevance to the subsequently received chat message.

6. The computing system of claim 5 wherein the context filtering system is configured to filter the context information by deleting the identified concepts from the context information based on the relevance.

7. The computing system of claim 5, wherein the context filtering system is configured to filter the contextual information by adjusting weights in the contextual information associated with the identified concepts based on the relevance.

8. The computing system of claim 1, wherein the expiration indicator generator comprises:

a timestamp generator that generates a timestamp indicating when the concept was identified by the natural language processor.

9. The computing system of claim 1, wherein the expiration indicator generator comprises:

a location stamp generator that generates a location stamp indicating a geographic location at which the given received chat message was generated.

10. The computing system of claim 1, wherein the expiration indicator generator comprises:

a conversation round counter that generates a round count stamp indicating a conversation round corresponding to the given received chat message.

11. A computer-implemented method, comprising:

receiving, at a natural language processor, a text input indicating a given received chat message and contextual information identified based on previously received chat messages received prior to the given received chat message;

generating, with the natural language processor, an NLP output based on the text input and the context information, the NLP output identifying concepts in the given received chat message;

generating an expiration indicator corresponding to the identified concept;

filtering the NLP output based on the expiration indicator to obtain a filtered NLP output;

providing the filtered NLP output back to the natural language processor as context information for subsequent received chat messages received after the given received chat message; and

generating, with a robot controller, a response output based on the NLP output.

12. The computer-implemented method of claim 11, wherein filtering comprises:

identifying a relevance of the contextual information to the subsequently received chat message based on the expiration indicator.

13. The computer-implemented method of claim 12, wherein generating an expiration indicator comprises:

generating a concept-level expiration indicator corresponding to each concept of the plurality of concepts identified in the given received chat message.

14. The computer-implemented method of claim 13, wherein the plurality of concepts together comprise overall context information, and wherein generating the expiration indicator comprises:

generating an overall expiration indicator corresponding to the overall context information.

15. A chat robot computing system, comprising:

a Natural Language Processor (NLP) that receives a text input indicating a first chat message and context information identified based on a previously received chat message received prior to the first chat message, the natural language processor generating an NLP output based on the text input and the context information, the NLP output identifying concepts in the first chat message;

an expiration indicator generator that generates an expiration indicator corresponding to the identified concept;

an expiration criteria processor configured to identify a relevance of the contextual information to the subsequently received chat message based on the expiration indicator;

context filtering logic that filters the NLP output based on the relevance to obtain a filtered NLP output and provides the filtered NLP output back to the natural language processor as context information for a subsequent received chat message received after the first chat message; and

a robot controller that receives the NLP output from the natural language processor and generates a response output based on the NLP output.

Background

Computing systems are now in widespread use. Some computing systems include an online chat function that allows a user to message another user in real time (or near real time). Similarly, some computing systems include bots (sometimes referred to as network robots), which are applications that run on a network (such as a wide area network) to perform tasks. When a bot uses a chat function, it is sometimes referred to as a chat bot.

Chat robots are sometimes used in computing systems to implement dialog interfaces. A user may interact with the dialog interface using natural language to perform a variety of different tasks. Some tasks include obtaining information (in which case the robot implements a search function and returns the information to the user) and performing control tasks (in which case the robot implements a control function to control some physical control system or item). The chat robot may also be used by the user to perform various other tasks.

As just a few examples, the chat bot may be used as a dialog interface to a data storage system, to search it using natural language queries. In another example, a chat robot may be used to implement an interface for a home automation system, where different controllable subsystems in a home may be controlled by user dialog input through the chat robot. The chat robot can also be used to make reservations, get driving directions, get weather information, and many other things.

The discussion above is merely provided for general background information and is not intended to be used as an aid in determining the scope of the claimed subject matter.

Disclosure of Invention

A chat robot computing system includes a robot controller and a natural language processor. The natural language processor receives a first text input and identifies a concept represented by the first text input. An indication of the concept is output to the robot controller, which generates a response to the first text input. When a second text input is received, the concepts output by the natural language processor are also fed back into the input of the natural language processor as context information, along with an expiration indicator. The natural language processor then identifies concepts represented in the second text input based on the second natural language text input and the unexpired context information.

This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter. The claimed subject matter is not limited to implementations that solve any or all disadvantages noted in the background.

Drawings

FIG. 1 is a block diagram of one example of a computing system architecture in which a chat robot computing system is used.

Fig. 2 is a flow chart illustrating one example of the overall operation of the architecture illustrated in fig. 1.

FIG. 3 is a block diagram showing the architecture illustrated in FIG. 1 using a knowledge model.

Fig. 3A, 3B, 3C show examples of different parts of the knowledge model.

FIG. 4 is a block diagram illustrating one example of a knowledge model.

FIG. 5 is a block diagram showing the architecture illustrated in FIG. 3 using context filtering/enhancement logic.

FIG. 6 is a block diagram illustrating one example of context filtering/enhancement logic in greater detail.

FIG. 7 is a flow chart illustrating one example of the architecture illustrated in the preceding figures using a knowledge model.

FIG. 8 is a flow diagram illustrating one example of the architecture shown in the preceding figures operating using context filtering/enhancement logic.

FIG. 9 illustrates one example of the architecture illustrated in the preceding figures deployed in a cloud computing architecture.

Fig. 10-12 show examples of mobile devices that may be used in the architecture shown in the preceding figures.

FIG. 13 is a block diagram illustrating one example of a computing environment that may be used in the architecture illustrated in the preceding figures.

Detailed Description

As discussed above, chat robots are often used to implement natural language interfaces for a variety of different types of systems. Natural language input often contains ambiguities. This is because natural language conversations often assume that there is a degree of shared context between the participants in the conversation.

For example, assume that the following dialog occurs:

Conversation participant 1: "How is the weather in Seattle today?"

Participant 2: "Today it is cloudy and there may be rain."

Participant 1: "How about tomorrow?"

Participant 2: "There may be a rain shower tomorrow."

Participant 1: "And Ellensburg?"

Participant 2: "Tomorrow Ellensburg will be sunny."

At the beginning of the conversation, the recipient of the first utterance has no context, only the first utterance itself ("How is the weather in Seattle today?"). However, the second utterance ("How about tomorrow?") can only be understood in light of the context established by the first utterance and its response.

When these types of natural language inputs are provided through a dialog interface implemented by the chat robot, the only way for the chat robot to understand the meaning of the user's second utterance is to know the context of the utterance. The context indicates what the participants in the conversation were talking about before the second utterance was received. In this example, that context includes "weather," "Seattle," and "today." The user's third utterance ("And Ellensburg?") poses the same problem. The only way to answer it accurately is to know that the conversation is about "weather" and "tomorrow" (the concept "tomorrow" having overwritten the context of "today").

Thus, the present discussion proceeds with respect to identifying concepts in the natural language inputs to the chat robot and carrying those concepts forward as contextual information in the conversation to augment subsequent utterances in the conversation. Fig. 1 illustrates one example of a computing system architecture 100 in which a chat robot computing system 102 receives chat messages, through a chat message channel function 104, from a user 106 who provides the chat messages through a user device 108.

User device 108 may be any of a variety of different types of devices. In the example shown in fig. 1, it may be a mobile device that generates one or more interfaces 110 for user 106 to interact with. The user 106 illustratively interacts with the interface 110 to control and manipulate the user device 108 and certain portions of the chat robot computing system 102. As one example, the interface 110 can include a microphone such that the user 106 can provide natural language input as voice input to the chat bot computing system 102 through the user device 108 and the chat message channel function 104.

By way of example, FIG. 1 shows that the user 106 has provided a chat message 112 as input to the chat message channel function 104. The chat message 112 is provided to the chat robot computing system 102. The chat robot computing system 102 processes the chat message 112 and generates a chat response 114, which is provided back to the user device 108 through the chat message channel function 104, where it is presented to the user 106 on one of the interfaces 110. The interfaces 110 may be generated on a display device, an audio device, a haptic device, or the like.

In the example shown in fig. 1, chat bot computing system 102 illustratively includes one or more processors or servers 116, a data repository 118, a chat bot 120, and it can include various other items 121. The processors and/or servers 116 can implement the chat bot 120 in a variety of different ways. In the example shown in fig. 1, the chat bot 120 illustratively includes a robot controller 122 and a natural language processor 124. The robot controller 122 illustratively may be code generated by a developer to implement the particular type of interface that the developer wishes to implement. The natural language processor 124 illustratively performs natural language processing on natural language text inputs to identify concepts that these inputs represent.

Thus, in one example, the chat message 112 is provided as text input 126 from the robot controller 122 to the natural language processor 124. The natural language processor 124 (as will be described in more detail below) identifies concepts in the input text 126 and generates an output 128 representing the concepts. As described in more detail below, these concepts may be represented by unique identifiers.

The concepts (e.g., their unique identifiers) identified in the input text 126 are provided back to the robot controller 122, and the robot controller 122 generates the responsive chat message 114 (or takes other action in response to the concepts). According to one example, the output 128 is also fed back to the natural language processor 124 along with the next input text received from the user 106 (e.g., text based on a second utterance).
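
To make this loop concrete before walking through the example dialog, the following is a minimal, hedged Python sketch of the arrangement just described. The class and method names (NaturalLanguageProcessor, RobotController, evaluate, formulate_response) are illustrative assumptions and are not taken from the patent.

```python
# Hypothetical sketch of the feedback loop: the robot controller forwards each
# chat message to the natural language processor together with the context
# produced for the previous message, and the NLP output is both returned to
# the controller and retained as context for the next utterance.

class NaturalLanguageProcessor:
    def evaluate(self, text, context):
        """Return the set of concept identifiers found in `text`,
        disambiguated by the unexpired `context` identifiers."""
        raise NotImplementedError  # e.g., a knowledge-model lookup (FIG. 7)


class RobotController:
    def __init__(self, nlp):
        self.nlp = nlp
        self.context = set()          # concept identifiers carried forward

    def handle_chat_message(self, text):
        concepts = self.nlp.evaluate(text, self.context)   # output 128
        self.context = concepts       # fed back with the next utterance
        return self.formulate_response(concepts)

    def formulate_response(self, concepts):
        raise NotImplementedError     # developer-supplied bot behavior
```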

Thus, continuing with the example discussed above, the first input text 126 may be "How is the weather in Seattle today?" Based on that input, the natural language processor 124 may identify concepts (or unique identifiers corresponding to the concepts) such as "weather," "Seattle," and "today." These concepts may be provided as output 128 to robot controller 122, and robot controller 122 generates a responsive chat message 114, which may be "Today it is cloudy and there may be rain." Then, when the second utterance is received as a second chat message 112 ("How about tomorrow?"), the concepts identified from the first utterance are also provided to the natural language processor 124 as context information, along with the text of the second utterance. Thus, the natural language processor 124 generates an output indicative of the concepts represented in the second utterance based not only on the text of the second utterance, but also on the concepts identified in the first utterance (which serve as the context for the second utterance). The output generated by the natural language processor 124 based on the second utterance will again be fed back to the robot controller 122 so that the robot controller 122 can generate a response, and if a third utterance is received, the output of the natural language processor 124 will also be fed back to the natural language processor 124 as the context for the third utterance.

Referring again to the example dialog set forth above, the second response generated by the robot controller 122 (in response to the second utterance "How about tomorrow?") is "There may be a rain shower tomorrow." The user 106 then generates a third utterance, "And Ellensburg?" Clearly, if the context of the conversation is known, the user 106 is asking about the weather conditions in Ellensburg tomorrow. Thus, in response to the second utterance ("How about tomorrow?"), the output 128 of the natural language processor 124 includes the concepts "weather," "Seattle," and "tomorrow." It is provided to the robot controller 122 so that the robot controller 122 can generate the response "There may be a rain shower tomorrow." These concepts are also fed back to the natural language processor 124 along with the third utterance, "And Ellensburg?" Since the concept "Ellensburg" is more recent than the concept "Seattle," "Ellensburg" will replace "Seattle." Thus, based on the context fed back to it, the natural language processor 124 will know that the current conversation is about "weather," "tomorrow," and "Ellensburg."

The natural language processor 124 will thus generate another output 128 based on the third utterance and the context fed back to it from the second utterance. The output 128 based on the third utterance will be provided to the robot controller 122 so that the robot controller 122 can generate a response to the third utterance. As shown in the example dialog, that response may be "Tomorrow Ellensburg will be sunny."

As briefly mentioned above, in one example, the natural language processor 124 includes code such that the most recent words (in the most recent utterance) are given more weight and will override any conflicting context that accompanies them and was fed back from previous utterances. Thus, in the example discussed above, when the second utterance "How about tomorrow?" provides the concept "tomorrow," the context information for the concept "today" is overridden. The two concepts "weather" and "Seattle" are used to disambiguate the second utterance, but the context "today" is discarded because the more recent concept "tomorrow" supersedes it. The new context information after the second utterance is "weather," "Seattle," and "tomorrow." Then, when the third utterance is received, the concept "Ellensburg" takes precedence over the concept "Seattle."
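
The override behavior just described can be illustrated with a short, hedged sketch. The category prefixes in the concept identifiers below (e.g., "time/", "place/") are an assumption introduced only to show how a newer concept might displace an older, conflicting one; they are not part of the described system.

```python
# Illustrative "most recent wins" merge: concepts from the newest utterance
# override conflicting concepts carried over in the context. Conflicts are
# approximated here by a hypothetical category prefix ("time/", "place/").

def merge_context(previous, newest):
    def category(concept_id):
        return concept_id.split("/", 1)[0]

    new_categories = {category(c) for c in newest}
    # Keep a prior concept only if the newest utterance supplied nothing in
    # the same category; concepts from the newest utterance always survive.
    kept = {c for c in previous if category(c) not in new_categories}
    return kept | newest


ctx = merge_context({"topic/weather", "place/seattle", "time/today"},
                    {"time/tomorrow"})
# ctx == {"topic/weather", "place/seattle", "time/tomorrow"}
```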

In the example shown in fig. 1, the natural language processor 124 may identify concepts in the input utterance in a variety of different ways. In one example, it identifies the underlying concepts using an identifier that is unique to each concept. Thus, for example, while the concept "weather" may have many different tags, such as "weather," "conditions," "weather conditions," etc., the underlying concept "weather" will be represented by a single unique identifier. Similarly, while the underlying concept "Seattle" may be represented by different labels (for example, different spellings or forms of the city name), the underlying concept "Seattle" will be represented by a single unique identifier.
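
A hedged sketch of that label-to-identifier mapping follows. The identifier strings and label spellings are invented for illustration only.

```python
# Many surface labels resolve to a single unique identifier per underlying
# concept, so "weather", "conditions", and "weather conditions" all name the
# same concept even though the labels themselves are not unique.

LABEL_TO_CONCEPT = {
    "weather": "concept/weather",
    "conditions": "concept/weather",
    "weather conditions": "concept/weather",
    "seattle": "concept/place/seattle",
}

def concepts_for(labels):
    return {LABEL_TO_CONCEPT[label] for label in labels if label in LABEL_TO_CONCEPT}

print(concepts_for(["weather", "seattle"]))
# {'concept/weather', 'concept/place/seattle'}  (set order may vary)
```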

Fig. 2 is a flow chart illustrating one example of the operation of the architecture 100 illustrated in fig. 1 in more detail. Assume first that chat robot computing system 102 receives a representation of an utterance from chat message channel 104. This is represented by block 134 in the flow chart of fig. 2. It should be noted that the representation of the utterance may be an audio representation, a textual representation, or a representation of a different type. In one example, when it is an audio representation, speech recognition is performed on the representation to generate a text representation. This is just one example.

The robot controller 122 provides a representation of the utterance to the natural language processor 124 for evaluation. This is indicated by block 136. The natural language processor receives any context from any previous utterances and the utterance representation provided to it by the robot controller 122. Block 138 in the flow chart in fig. 2 indicates that context is received from any previous utterance. In the example shown in fig. 1, any output 128 generated by the natural language processor 124 is fed back directly to the natural language processor 124 as contextual information for the utterance. Providing the natural language processor output directly from the evaluation of the previous utterance as the contextual information input for the subsequent utterance is indicated by block 140 in the flowchart of fig. 2.

However, as described in more detail below, the context information may include not only the output 128 from the natural language processor 124, but also an enhanced or modified version of the output, or may include context from other sources in addition to the output of the natural language processor 124. Block 142 indicates providing filtered, enhanced output from previous evaluations and block 144 indicates providing context information from other sources.

The natural language processor 124 then evaluates the representation of the utterance it has received, given the context information received along with it. This is indicated by block 146. Evaluating the representation of the utterance given the context means that the natural language processor 124 identifies a new set of concepts based on the newly received utterance and its context information. It can do this in a number of different ways. In one example, it uses a knowledge model (as discussed in more detail below with respect to fig. 3-7). Using the knowledge model is indicated by block 148 in the flow diagram of fig. 2. However, the natural language processor 124 may also evaluate the utterance representation, given its context, in a variety of other ways. This is indicated by block 150.

The natural language processor 124 generates a set of concepts based on the evaluation and outputs them to the robot controller 122. This is indicated by block 152. The robot controller 122 formulates and outputs a chat response to the chat message channel function 104 based on the evaluation results provided by the natural language processor 124. The formulation and output of chat responses is indicated by block 154 in the flow diagram of figure 2.

If another representation of another utterance is received, as indicated by block 156, the process returns to block 136, where the robot controller 122 provides the representation to the natural language processor 124. Then, at block 138, the natural language processor 124 receives context information from the previous utterance, and the evaluation continues. Thus, as shown in fig. 1 and 2, the natural language processor 124 interprets (or evaluates) a particular utterance based not only on the information in the particular utterance itself, but also on contextual information from previous utterances. This can greatly facilitate the operation of the chat robot computing system 102. It can also be used to disambiguate utterances, improve the accuracy of implementing natural language interfaces, and the like. A context may be captured as an unordered set of unique identifiers, which may be represented as URIs or otherwise.

Fig. 3 illustrates another example of the architecture 100 that is similar to the example illustrated in fig. 1, and like items are numbered similarly. However, FIG. 3 also shows that the natural language processor 124 now has access to a knowledge model 158, which may be used to identify concepts based on the utterance and the context information provided to the natural language processor 124. The knowledge model 158 illustratively associates language (words and text in the utterance) with topics or concepts that become the output 128 of the natural language processor 124. These concepts also become context information for the next subsequent utterance (and perhaps further subsequent utterances) received.

Each concept may have a number of different tags. For example, the weather may be described as "gloomy." However, when a person's mood is low, their emotional state may also be described as "gloomy." Thus, the concept "cloudy day" may be labeled "gloomy." Likewise, the concept "sad emotional state" may also be labeled "gloomy." Further, knowledge model 158 illustratively captures synonyms. For example, the concept "sad emotional state" may be labeled with language tags (e.g., with the words) "sad," "unhappy," "gloomy," and the like.

Fig. 3A shows an example of these types of modeled concepts. Fig. 3A shows that the concept "cloudy day" may have the label "gloomy." However, the "emotion-sadness" concept may also have the label "gloomy," as well as the label "sad" and the label "unhappy."

In addition, concepts may also be related to other concepts, not just tags. Knowledge model 158 also illustratively captures these types of relationships because the contextual information output by natural language processor 124 may include not only concepts identified in text input 126, but also closely related concepts. For example, it may be useful to link the separate concepts "skiing conditions" and "driving conditions" with the concept "weather".

Fig. 3B illustrates one example of this. Fig. 3B shows that both concepts of "weather-driving conditions" and "weather-skiing conditions" are related to the concept of "weather".

It is also useful to name these types of relationships and to make them directional. For example, it may be helpful for knowledge model 158 to indicate that the concept "emotional state" is broader than the concept "emotional state-sadness." Similarly, the concept "emotional state-negative" is broader than the concept "emotional state-pessimism." Further, a concept may have the same relationship to a plurality of different concepts. For example, the concept "emotional state-pessimism" may be related to two different broader concepts. For instance, it may be related to the broader concept "emotional state-negative" and possibly also to the broader concept "emotional state."

Fig. 3C illustrates one example of these types of relationships. Fig. 3C shows that both the "emotion-sadness" and "emotion-negativity" concepts are related to the broader "emotion-emotional state" concept. Similarly, the "emotion-pessimism" concept is related to two broader concepts, the first being the "emotion-emotional state" concept and the second being the "emotion-negativity" concept.

Thus, in one example, the knowledge model 158 represents different concepts, each of which is assigned a unique identifier. Concepts may be associated with other concepts using named, directional relationships. They may be labeled with natural language words (e.g., language tags). There may be many different tags for a particular concept, and the tags themselves may not be unique, because the same tag may be used for different concepts. However, each underlying concept is unique relative to the other concepts in the model.
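
The following is a minimal sketch, under assumed names, of a data structure with those properties: unique identifiers, non-unique language tags, and named, directed relationships between concepts.

```python
from dataclasses import dataclass, field

@dataclass
class Concept:
    uid: str                          # unique, language-independent identifier
    labels: set = field(default_factory=set)       # localized language tags
    relations: dict = field(default_factory=dict)  # relation name -> target uids

model = {
    "concept/weather": Concept(
        uid="concept/weather",
        labels={"weather", "conditions", "weather conditions"},
        relations={"narrower": {"concept/weather/skiing-conditions",
                                "concept/weather/driving-conditions"}},
    ),
    "concept/emotion/sadness": Concept(
        uid="concept/emotion/sadness",
        labels={"sad", "unhappy", "gloomy"},
        relations={"broader": {"concept/emotion/emotional-state"}},
    ),
}
```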

Thus, FIG. 4 shows a block diagram of one example of a knowledge model 158. Knowledge model 158 has a set of concepts, each concept represented by a unique identifier and one or more language tags (words). The concepts are indicated by block 160 in fig. 4. Concepts may be connected to language tags by links or other connections. It should be noted that if the model 158 is localized to different languages, the unique identifiers representing the underlying concepts need not be localized because they are language independent. However, the language tags will be localized.

Concepts may also be interconnected by labeled, directed links 162. The model 158 may also include a variety of other items 164. In operation, the architecture 100 shown in fig. 3 operates in a manner similar to the architecture shown in fig. 1. However, in FIG. 3, the natural language processor 124 uses the knowledge model 158 to identify concepts represented by the input text 126 (and by any context fed back to the natural language processor 124 from previous evaluations). An example of this is described below with respect to fig. 7.

Fig. 5 shows another example of the architecture 100, which is similar to the example shown in fig. 3. Like items are numbered similarly. However, in the example illustrated in fig. 5, the chat robot computing system 102 also includes other context sources 168 that can provide context to the natural language processor 124 in addition to, or in lieu of, the knowledge model 158, and in addition to the context fed back from the output 128 generated for a previous utterance. Fig. 5 also shows that, in one example, the output 128 is provided to the robot controller 122 so that the robot controller 122 can formulate the chat response 114 to the utterance just received. The output 128 is also provided to the context filtering/enhancement logic 170, and it is then fed back to the natural language processor 124 as context information for the next subsequent utterance. The logic 170 may enhance and/or filter the output 128 to provide filtered and/or enhanced contextual information to the natural language processor 124, along with the next subsequent utterance received. Thus, in the example shown in fig. 5, the natural language processor 124 not only can receive enhanced and filtered contextual outputs from the logic 170, based on previous evaluation results or outputs 128, but can also receive context from other sources 168 provided by the developer to further customize the natural language interface experience implemented by the chat robot computing system 102.

Before describing the operation of the architecture 100 in the example shown in FIG. 5, a brief description of the context filtering/enhancement logic 170 will first be provided. Fig. 6 shows one example of a block diagram illustrating logic 170 in more detail. In the example shown in fig. 6, the context filtering/enhancement logic 170 illustratively includes a validity period indicator generator 180, an expiration criteria processor 182, a data store 184, a context enhancement system 186, a context filtering system 188, a context weighting system 190, a context output generator 192, and it may include a variety of other items 194. The validity period indicator generator 180 itself illustratively includes a timestamp generator 196, a turn counter 198, a location stamp generator 200, and it may include other items 202. The expiration criteria processor 182 itself illustratively includes concept-level logic 204, overall context logic 206, and it may include other items 208.

Before describing the logic 170 in more detail, it should be appreciated that concepts generally have a useful duration or scope (also referred to as a validity period). Thus, the contextual information provided with an utterance may have a limited useful duration (or validity period). That validity period can be determined by a number of different criteria. In one example, a time criterion may be used to determine the validity period of a concept in the context information. For example, if the chat robot computing system 102 receives an utterance on Monday asking about the weather for that day, and the next utterance is received two days later, the contextual information generated from the previous utterance likely no longer applies to, or helps make sense of, the second utterance. Thus, concept information from a first utterance may have a finite useful duration as contextual information for a second utterance, and that duration may be identified using a time criterion (e.g., a timestamp).

The limited usefulness of context information, and the expiration criteria used to capture it, can also be generalized to dimensions other than time. For example, the chat robot computing system 102 may implement a natural language interface in an automobile. In that case, the user may provide an utterance looking for the nearest gas station. However, the next subsequent utterance may be provided by the user 106 after the automobile has traveled 100 miles since the previous utterance. In that case, the previous utterance looking for the nearest gas station is unlikely to be useful as context information for the next utterance. Similarly, in the same example, if a first utterance asks for the next closest highway exit, and a second utterance is provided after the car has left the highway, the first utterance may have limited usefulness as context information for the next subsequent utterance. Thus, in this example, the validity or expiration criterion may be location (or geographic location) information, such as the current geographic location, rather than time information.

Further, it can be appreciated that contextual information is often only useful for a certain maximum number of dialog turns. For example, the contextual information may be found to be useful only for three dialog turns (in which the chat robot computing system 102 has received and responded to three utterances). Thereafter, the usefulness of the context information from the first utterance may be relatively low. Thus, in this case, the expiration criterion may be the number of dialog turns that have been processed in a given time. It should be noted that, of course, the number of dialog turns used to identify usefulness may be any number, and three is provided by way of example only.

In the illustrated example of fig. 6, the validity period indicator generator 180 illustratively generates a validity period indicator associated with the output 128 generated by the natural language processor 124 before that output is fed back as context information along with the next utterance. The validity period indicator is then compared to an expiration criterion, or validity criterion, to determine its relevance to the next subsequent utterance.

In one example, the expiration criterion (or validity criterion) may be a time criterion. In this case, the timestamp generator 196 generates a timestamp for each concept that is fed back as context information. In examples where the validity period or expiration criteria include a dialog turn count, the turn counter 198 generates a turn indication identifying the particular dialog turn (within the last predetermined time period) in which the output 128 was generated. Similarly, where the validity period or expiration criteria include location information, location stamp generator 200 generates a location stamp (e.g., based on information received from a GPS receiver or other location identifier) that indicates the location of user device 108 when chat message 112 was received.
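
A hedged sketch of how such stamps might be attached to each concept in the output 128 is shown below. The ValidityIndicator type and its field names are assumptions made for illustration.

```python
import time
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ValidityIndicator:
    timestamp: float                          # when the concept was identified
    turn: int                                 # dialog turn that produced it
    location: Optional[Tuple[float, float]] = None  # (lat, lon), if available

def stamp_concepts(concepts, turn, location=None):
    """Attach a concept-level validity period indicator to each concept."""
    now = time.time()
    return {c: ValidityIndicator(timestamp=now, turn=turn, location=location)
            for c in concepts}
```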

It should be noted that a validity period indicator may be generated for each individual concept. An overall validity period indicator may also be generated for the output 128.

The expiration criteria processor 182 then processes the validity period indicators associated with, or appended to, the different items of context information fed back to the natural language processor 124 (such as by comparing them to the expiration criteria). This is done to determine the relevance of the context information to the next utterance. For example, the concept-level logic 204 processes expiration information corresponding to each concept identifier that is fed back as context information. Each item of context information (e.g., each concept) may have been generated at a different time, based on a different utterance, and a timestamp will have been generated for it at that time. When the concept is fed back as an item of context, it is accompanied by that timestamp. When a sufficiently long time has elapsed since the timestamp on a given concept, that item may be deemed less relevant to subsequent utterances. It may be removed from the context information fed back into the natural language processor 124, it may be assigned a lower relevance weight, and so on.

The overall context logic 206 evaluates the expiration information associated with the overall context (which may include, for example, concepts from a single utterance or concepts aggregated from multiple utterances over time) fed back to the natural language processor 124. For example, the entire context to be fed back to the natural language processor 124 may have been generated from utterance inputs provided when the vehicle was located 100 miles from the current location. In this case, logic 170 may discard the entire context as irrelevant.

The context enhancement system 186 may illustratively interact with other context sources 168 to obtain other context information. The other context sources 168 may be specified or generated by the developer to provide specific behavior for the chat bot computing system 102. The context filtering system 188 illustratively filters context items based on the expiration (or validity period) criteria or for other reasons. The context weighting system 190 may weight different context items based on their validity period or expiration criteria. For example, as a context item ages (based on its timestamp), it may be weighted lower by the context weighting system 190 because it may be less relevant than when it was first generated. Similarly, when a context item is generated in a first dialog turn, its weight may decrease with each subsequent dialog turn, as it may become less relevant. Likewise, when a context item is generated at a first location, its weight may be decreased as the user moves away from that location. These are examples only.
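
A hedged sketch of this kind of filtering and weighting is shown below. It reuses the ValidityIndicator objects from the previous sketch, and the thresholds and decay formula are arbitrary illustrative choices rather than values taken from the description above.

```python
import math
import time

MAX_AGE_S = 15 * 60        # assumed: concepts expire after 15 minutes
MAX_TURN_GAP = 3           # assumed: or after three dialog turns
MAX_DISTANCE_KM = 10.0     # assumed: or ten kilometers of travel

def distance_km(a, b):
    # Rough equirectangular approximation; adequate for a threshold check.
    dlat = math.radians(b[0] - a[0])
    dlon = math.radians(b[1] - a[1]) * math.cos(math.radians((a[0] + b[0]) / 2))
    return 6371.0 * math.hypot(dlat, dlon)

def filter_and_weight(stamped, current_turn, current_location=None):
    """Drop expired concepts and down-weight the rest as they age."""
    kept = {}
    now = time.time()
    for concept, ind in stamped.items():
        age = now - ind.timestamp
        turn_gap = current_turn - ind.turn
        too_far = (ind.location is not None and current_location is not None and
                   distance_km(ind.location, current_location) > MAX_DISTANCE_KM)
        if age > MAX_AGE_S or turn_gap > MAX_TURN_GAP or too_far:
            continue                      # expired: remove from the context
        weight = (1.0 - age / MAX_AGE_S) * (1.0 - turn_gap / (MAX_TURN_GAP + 1))
        kept[concept] = weight            # weighted context item
    return kept
```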

The context output generator 192 then generates an output to the natural language processor 124 that indicates the filtered and/or enhanced context so that it can be considered with the next subsequent utterance.

FIG. 7 is a flow diagram illustrating one example of the operation of the architecture 100 in which the natural language processor 124 uses the knowledge model 158 to identify concepts in the text input 126 given contextual information.

In one example, the model 158 may be distributed such that different portions of the model may be created by different systems at different times from different data sources and then combined. The combined model may then be used to build a runtime execution model during runtime. Similarly, the knowledge model 158 may be constructed in a non-hierarchical manner. Although concepts are modeled by unique identifiers, namespaces can be used for relationship names.

It is first assumed that the natural language processor 124 receives the text 126 to be evaluated, as well as any unique identifiers representing context from previous utterances. This is indicated by block 212 in the flow chart of fig. 7. The natural language processor 124 then matches the current text 126 being evaluated against the tags in the knowledge model 158. This is indicated by block 214. For example, the input text 126 will have linguistic elements (such as words) that can be matched against the tags of the underlying concepts modeled by the knowledge model 158. The matching may be guided or informed by the context information from previous utterances.

Given the context information, the natural language processor 124 identifies the current unique identifier of the concept having the label that best matches the current text being evaluated. Again, by way of example, assume that the knowledge model 158 includes entries such as those shown in FIG. 3C. Assume also that the concept "sad" has the labels shown in fig. 3A. Assume further that the input text 126 includes the word "unhappy." In this case, knowledge model 158 would indicate that the input text matches the concept "sad," and therefore a unique identifier for the concept "sad" would be returned by knowledge model 158. Identifying the current unique identifier of the concept having a label that matches the current text being evaluated is indicated by block 216 in FIG. 7.

The natural language processor 124 then generates an output 128 based on the current unique identifier identified for the text input 126 and based on the unique identifier representing the received context information. This is represented by block 218 in the flow chart of fig. 7. It can thus be seen that the knowledge model 158 can be used to present unique identifiers of different concepts based on the input text 126 given its context. It may also be used to enhance contextual information by extending to neighboring concepts in the knowledge model 158.
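
A hedged sketch of such a lookup, using the assumed Concept structure sketched earlier, is shown below; the tokenization and matching are deliberately simplistic.

```python
def identify_concepts(text, model, context):
    """Match words in `text` against knowledge-model labels and return the
    matched unique identifiers together with the fed-back context."""
    words = set(text.lower().replace("?", "").replace(".", "").split())
    matched = {concept.uid
               for concept in model.values()
               if any(label in words for label in concept.labels)}
    # The fed-back context identifiers accompany the newly matched ones; a
    # conflict rule such as the earlier merge sketch would decide overrides.
    return matched | set(context)

# e.g. identify_concepts("I am unhappy", model, set())
# -> {"concept/emotion/sadness"}, given the labels in the earlier sketch
```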

FIG. 8 is a flow diagram illustrating one example of the operation of the context filtering/enhancement logic 170 using a validity period or expiration criterion when generating the context to be fed back to the natural language processor 124 with a subsequent utterance. The natural language processor 124 first receives the input text 126 from the robot controller 122. This is indicated by block 230 in the flow chart of fig. 8. The input may be a textual representation of the utterance, as indicated by block 232, or another representation, as indicated by block 234.

The natural language processor 124 then performs natural language processing to identify concepts in the utterance, given the context. This is indicated by block 236. It may use the knowledge model 158 to accomplish this, as shown at block 238 in the flow chart of fig. 8. It may also accomplish this in other ways, as indicated by block 240. Then, before the unique identifiers (of the concepts) are provided back to the natural language processor 124 with a subsequent utterance, the context filtering/enhancement logic 170 first associates a concept-level validity period indicator with each identified concept. This is indicated by block 242 in the flow chart of fig. 8. In one example, timestamp generator 196 generates a time-based indicator 244 (such as a timestamp indicating when the concept identifier was identified). In another example, upon receipt of the chat message 112 being evaluated, location stamp generator 200 generates a location-based validity period indicator 246 indicating the location at which user device 108 is located. In another example, the turn counter 198 generates an indicator based on the dialog turns, as indicated by block 248, indicating the dialog turn in which the concept was identified. The validity period indicator may also be generated in a variety of other ways, as indicated by block 250 in the flow chart of fig. 8.

Once the validity period indicators have been associated with each concept identified based on the current utterance, any other already existing context items may be added to the output 128 before it is fed back to the natural language processor 124 as the context for the next utterance. The result is referred to as the overall context information, which will be fed back with the next utterance. Adding existing context items to obtain the overall context is indicated by block 252 in the flow diagram of fig. 8.

The validity period indicator generator 180 then associates an overall validity period indicator with the overall context. This is indicated by block 254 in the flow chart of fig. 8. Again, this may be a timestamp 256, a location stamp 258, a turn count stamp (as indicated by block 260 in the flow chart of fig. 8), or another overall validity period indicator 262.

The expiration criteria processor 182 then compares the concept-level validity period indicators and the overall validity period indicator to the expiration criteria to determine whether any contextual information should be filtered out of (or weighted within) the overall context provided to the natural language processor 124 as the context of the next utterance. Comparing the validity period indicators to the expiration criteria is indicated by block 264 in the flow chart of fig. 8. The concept-level logic 204 compares the validity period indicator for each individual concept item to an expiration criterion to determine whether that individual concept should be removed from (or weighted within) the overall context. The overall context logic 206 compares the expiration indicator of the overall context to determine whether the overall context is irrelevant, should have reduced weight, or should be treated differently.

The form of the expiration criterion depends on the form of the validity period indicator. For example, where the validity period indicator is a timestamp, the expiration criterion can be an elapsed time criterion indicating that the concept associated with the timestamp is no longer relevant, has reduced relevance, or the like. Filtering the context based on the current or elapsed time is indicated by block 266.

If the validity period indicator is a location stamp, the expiration criteria processor 182 may compare the location stamp to the current location of the user device 108. Evaluating the expiration criterion based on the current location is indicated by block 268.

If the validity period indicator is a dialog turn indicator, the expiration criterion may be the current turn count. Evaluating the expiration criteria based on the current turn count is indicated by block 270.

It should be understood that the evaluation of the expiration criteria may also be performed in a variety of other ways. This is indicated by block 272.

Context filtering/enhancement logic 170 then filters or enhances the individual concepts and the overall context based on the comparison with the expiration criteria to obtain a filtered/enhanced context. This is indicated by block 274 in the flow chart of FIG. 8. For example, the context filtering system 188 may remove expired concepts from the context before it is returned to the natural language processor 124 with the next subsequent utterance. Removing expired concepts is indicated by block 276. The context weighting system 190 may adjust the weights of the various concepts before they are provided as the context of the next utterance. Adjusting the weights of the context items to be provided to the natural language processor 124 in the next evaluation is indicated by block 278 in the flow chart of fig. 8. The context filtering/enhancement logic 170 may also filter and/or enhance the context in other ways, as indicated by block 280 in the flow diagram of fig. 8.
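
The steps above might be combined as in the following hedged sketch, which builds on the filter_and_weight helper and thresholds sketched earlier; the function name and the specific checks are assumptions.

```python
def prepare_context_for_next_utterance(stamped_concepts, overall_indicator,
                                       current_turn, current_location=None):
    # Overall check: if the context as a whole has expired (for example, it
    # was generated too many dialog turns ago), discard it entirely.
    if current_turn - overall_indicator.turn > MAX_TURN_GAP:
        return {}
    # Concept-level check: remove or down-weight individual concepts before
    # they are returned as context for the next utterance.
    return filter_and_weight(stamped_concepts, current_turn, current_location)
```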

The context enhancement system 186 may enhance the context by obtaining additional context from other context sources 168. It may also perform enhancement processing on the context information in other ways.

The contextual output generator 192 then generates an output indicative of the filtered/enhanced contextual information and provides it to the natural language processor 124 so that it can be used as the contextual information for the next subsequent utterance. Returning the filtered/enhanced context along with the next utterance to the natural language processor 124 for evaluation is indicated by block 282 in the flow chart of fig. 8.

It should be noted that the above discussion has described various different systems, components, and/or logic. It should be understood that such systems, components, and/or logic may be comprised of hardware items (such as a processor and associated memory or other processing components, some of which are described below) that perform the functions associated with the systems, components, and/or logic. Additionally, the systems, components, and/or logic may be comprised of software that is loaded into memory and subsequently executed by a processor or server or other computing component, as described below. The systems, components, and/or logic may also be comprised of various combinations of hardware, software, firmware, and the like, some examples of which are described below. These are merely a few examples of different structures that may be used to form the above described systems, components, and/or logic. Other configurations may also be used.

The discussion refers to processors and servers. In one example, the processor and server include a computer processor with associated memory and timing circuitry, not separately shown. They are functional parts of the systems or devices to which they belong, activated by other components or items in these systems and facilitating their functions.

In addition, some user interface displays are also discussed. They may take a number of different forms and may have a number of different user-actuatable input mechanisms provided on them. For example, the user-actuatable input mechanisms may be text boxes, check boxes, icons, links, drop-down menus, search boxes, and the like. They can also be actuated in a number of different ways. For example, they may be actuated using a pointing device (such as a trackball or mouse). They may be actuated by hardware buttons, switches, a joystick or keyboard, finger switches or finger pads, and so forth. They may also be actuated using a virtual keyboard or other virtual actuators. Furthermore, where the screens on which they are displayed are touch-sensitive screens, they can be actuated using touch gestures. In addition, where the devices that display them have speech recognition components, they can be actuated using voice commands.

Some data stores are also discussed. It should be noted that they may each be divided into multiple data repositories. All of these may be local to the system accessing them, all of these may be remote, or some may be local while others are remote. All of these configurations are contemplated herein.

Further, the diagram shows a number of blocks having functionality attributed to each block. It should be noted that fewer blocks may be used and thus the functions are performed by fewer components. Further, as functionality is distributed among more components, more blocks may be used.

Fig. 9 is a block diagram of the architecture 100 shown in the preceding figures, except that elements thereof are provided in a cloud computing architecture 500. Cloud computing provides computing, software, data access, and storage services without requiring end users to know the physical location or configuration of the system providing the services. In various embodiments, cloud computing delivers services over a wide area network (such as the internet) using appropriate protocols. For example, cloud computing providers deliver applications over a wide area network, and may access these applications through a web browser or any other computing component. The software or components of architecture 100 and corresponding data may be stored on a server at a remote location. The computing resources in a cloud computing environment may be integrated into a remote data center or may be distributed. Cloud computing infrastructures can provide services through shared data centers even though they appear to be a single point of access for users. Accordingly, the components and functionality described herein may be provided by a service provider located at a remote location using a cloud computing architecture. Alternatively, they may be provided by a conventional server, or may be installed directly or otherwise on the client device.

The description is intended to include both public cloud computing and private cloud computing. Cloud computing (both public and private) provides a substantially seamless pool of resources, also reducing the need to manage and configure the underlying hardware infrastructure.

The public cloud is managed by a provider and typically supports multiple customers using the same infrastructure. Furthermore, a public cloud (rather than a private cloud) may free end users from managing hardware. Private clouds may be managed by the organization itself, and the infrastructure is typically not shared with other organizations. Organizations still maintain hardware to some extent, such as installation and repair.

In the example shown in fig. 9, some items are similar to those shown in the previous figures and are numbered similarly. Fig. 9 specifically illustrates that computing system 102 may be located in cloud 502 (which may be public, private, or a combination where portions are public and others are private). Thus, user 106 accesses these systems through cloud 502 using user device 108.

FIG. 9 also depicts another example of a cloud architecture. Fig. 9 illustrates that it is also contemplated that some elements of computing system 102 may be disposed in cloud 502 while other elements may not. For example, data store 118 can be disposed outside of cloud 502 and accessed through cloud 502. In another example, knowledge model 158 and other context sources 168 (or other items) may be outside of cloud 502. Wherever they are located, they may be directly accessible by the device 108 over a network (wide area network or local area network), they may be hosted at a remote site by a service, or they may be provided as a service through a cloud or accessed by a connectivity service located within the cloud. All of these architectures are contemplated herein.

It should also be noted that the architecture 100, or portions thereof, may be disposed on a variety of different devices. Some of these devices include servers, desktop computers, laptops, tablets, or other mobile devices, such as palmtops, cell phones, smart phones, multimedia players, personal digital assistants, and the like.

FIG. 10 is a simplified block diagram of one illustrative example of a handheld or mobile computing device that may be used as a user's or customer's handheld device 16 in which the present system may be deployed (or portions thereof). Fig. 11-12 are examples of handheld or mobile devices.

FIG. 10 provides a general block diagram of the components of a client device 16 that can run components of computing system 102 or user device 108, or that interacts with architecture 100, or both. In the device 16, a communication link 13 is provided that allows the handheld device to communicate with other computing devices and, in some examples, provides a channel for receiving information automatically (such as by scanning). Examples of communication link 13 include an infrared port, a serial/USB port, a cable network port (such as an Ethernet port), and a wireless network port allowing communication through one or more communication protocols, including General Packet Radio Service (GPRS), LTE, HSPA+ and other 3G and 4G radio protocols, 1Xrtt and Short Message Service (which are wireless services used to provide cellular access to a network), as well as Wi-Fi protocols and Bluetooth protocols, which provide local wireless connections to networks.

In other examples, the application or system is received on a removable Secure Digital (SD) card connected to SD card interface 15. The SD card interface 15 and communication link 13 communicate with a processor 17 (which may also include processors or servers from other figures) along a bus 19, which bus 19 is also connected to a memory 21 and input/output (I/O) components 23, as well as a clock 25 and a positioning system 27.

In one embodiment, I/O components 23 are provided to implement input and output operations. The I/O components 23 of various embodiments of the device 16 may include input components such as buttons, touch sensors, multi-touch sensors, optical or video sensors, voice sensors, touch screens, proximity sensors, microphones, tilt sensors, and gravity switches, and output components such as a display device, speakers, and/or a printer port. Other I/O components 23 may also be used.

Clock 25 illustratively includes a real-time clock component that outputs a time and date. It may also, for example, provide timing functionality for the processor 17.

Positioning system 27 illustratively includes components that output the current geographic location of device 16. This may include, for example, a Global Positioning System (GPS) receiver, a LORAN system, a dead reckoning system, a cellular triangulation system, or another positioning system. It may also include, for example, mapping software or navigation software that generates desired maps, navigation routes, and other geographic functions.

Memory 21 stores operating system 29, network settings 31, applications 33, application configuration settings 35, data store 37, communication drivers 39, and communication configuration settings 41. Memory 21 may include all types of tangible volatile and non-volatile computer-readable storage devices. It may also include computer storage media (as described below). Memory 21 stores computer-readable instructions that, when executed by processor 17, cause the processor to perform computer-implemented steps or functions according to the instructions. Similarly, device 16 may have a client system 24 that may run various applications or embody part or all of architecture 100. Processor 17 may also be activated by other components to facilitate their functionality.

Examples of network settings 31 include proxy information, Internet connection information, and mappings. Application configuration settings 35 include settings that customize the application for a particular enterprise or user. Communication configuration settings 41 provide parameters for communicating with other computers and include items such as GPRS parameters, short message parameters, connection usernames, and passwords.

The applications 33 may be applications that have been previously stored on the device 16 or applications that were installed during use, although these applications may be part of the operating system 29 or may also be hosted external to the device 16.

Fig. 11 shows one example where device 16 is a tablet computer 600. In FIG. 11, a computer 600 is shown having a user interface display screen 602. The screen 602 may be a touch screen (so touch gestures from a user's finger may be used to interact with an application) or a pen-enabled interface that receives input from a pen or stylus. It may also use an on-screen virtual keyboard. Of course, it may also be connected to a keyboard or other user input device by a suitable connection mechanism, such as a wireless link or a USB port, for example. The computer 600 may also illustratively receive speech input.

FIG. 12 shows that the device may be a smartphone 71. The smartphone 71 has a touch-sensitive display 73 that displays icons or tiles or other user input mechanisms 75. The mechanisms 75 may be used by a user to run applications, make phone calls, perform data transfer operations, and the like. In general, the smartphone 71 is built on a mobile operating system and offers more advanced computing capability and connectivity than a feature phone.

Note that other forms of the device 16 are possible.

FIG. 13 is an example of a computing environment in which the architecture 100, or portions thereof, can be deployed. With reference to FIG. 13, an example system for implementing some embodiments includes a general purpose computing device in the form of a computer 810. Components of computer 810 may include, but are not limited to, a processing unit 820 (which may include a processor or server from the previous figures), a system memory 830, and a system bus 821 that couples various system components including the system memory to the processing unit 820. The system bus 821 may be any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus, also known as mezzanine bus. The memory and programs described in the foregoing figures may be deployed in corresponding portions of FIG. 13.

Computer 810 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 810 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media is distinct from, and does not include, a modulated data signal or carrier wave. It includes hardware storage media including volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, Digital Versatile Disks (DVD) or other optical disk storage, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 810. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a transport mechanism and includes any information delivery media. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.

The system memory 830 includes computer storage media in the form of volatile and/or nonvolatile memory such as Read Only Memory (ROM) 831 and Random Access Memory (RAM) 832. A basic input/output system 833 (BIOS), containing the basic routines that help to transfer information between elements within computer 810, such as during start-up, is typically stored in ROM 831. RAM 832 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 820. By way of example, and not limitation, FIG. 13 illustrates operating system 834, application programs 835, other program modules 836, and program data 837.

The computer 810 may also include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only, FIG. 13 illustrates a hard disk drive 841 that reads from or writes to non-removable, nonvolatile magnetic media, and an optical disk drive 855 that reads from or writes to a removable, nonvolatile optical disk 856 or other optical media. Other removable/non-removable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM, and the like. The hard disk drive 841 is typically connected to the system bus 821 through a non-removable memory interface such as interface 840, and optical disk drive 855 is typically connected to the system bus 821 by a removable memory interface, such as interface 850.

Alternatively, or in addition, the functions described herein may be performed, at least in part, by one or more hardware logic components. By way of example, and not limitation, exemplary types of hardware logic components that may be used include Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), System On Chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.

The drives and their associated computer storage media discussed above and illustrated in FIG. 13 provide storage of computer readable instructions, data structures, program modules and other data for the computer 810. In FIG. 13, for example, hard disk drive 841 is illustrated as storing operating system 844, application programs 845, other program modules 846, and program data 847. Note that these components can either be the same as or different from operating system 834, application programs 835, other program modules 836, and program data 837. Operating system 844, application programs 845, other program modules 846, and program data 847 are given different numbers here to illustrate that, at a minimum, they are different copies.

A user may enter commands and information into the computer 810 through input devices such as a keyboard 862, a microphone 863, and a pointing device 861, such as a mouse, trackball or touch pad. Other input devices (not shown) may include a joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit 820 through a user input interface 860 that is coupled to the system bus, but may be connected by other interface and bus structures, such as a parallel port, game port or a Universal Serial Bus (USB). A visual display 891 or other type of display device is also connected to the system bus 821 via an interface, such as a video interface 890. In addition to the visual display, computers may also include other peripheral output devices such as speakers 897 and printer 896, which may be connected through an output peripheral interface 895.

The computer 810 operates in a networked environment using logical connections to one or more remote computers, such as a remote computer 880. The remote computer 880 may be a personal computer, a hand-held device, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 810. The logical connections depicted in FIG. 13 include a Local Area Network (LAN) 871 and a Wide Area Network (WAN) 873, but may also include other networks. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet.

When used in a LAN networking environment, the computer 810 is connected to the LAN 871 through a network interface or adapter 870. When used in a WAN networking environment, the computer 810 typically includes a modem 872 or other means for establishing communications over the WAN 873, such as the Internet. The modem 872, which may be internal or external, may be connected to the system bus 821 via the user input interface 860, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 810, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation, FIG. 13 illustrates remote application programs 885 as residing on remote computer 880. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used.

It should also be noted that the different examples described herein may be combined in different ways. That is, portions of one or more examples may be combined with portions of one or more other examples. All of which are contemplated herein.

Example 1 is a computing system, comprising:

a Natural Language Processor (NLP) that receives a text input and context information, the text input indicating a given received chat message and the context information identified based on a previously received chat message received prior to the given received chat message, the natural language processor generating an NLP output based on the text input and the context information, the NLP output identifying concepts in the given received chat message;

an expiration indicator generator that generates an expiration indicator corresponding to the identified concept;

context filtering logic that filters the NLP output based on the expiration indicator to obtain a filtered NLP output and provides the filtered NLP output back to the natural language processor as context information for a subsequently received chat message received after the given received chat message; and

a robot controller that receives the NLP output from the natural language processor and generates a response output based on the NLP output.
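
As a rough illustration of Example 1, the following Python sketch shows how the recited components could be wired together. The class and function names (ChatbotSystem, NlpOutput, handle_message, and so on) are hypothetical and are not part of the specification; the sketch only traces how the NLP output, the expiration indicators, and the filtered context flow between the natural language processor, the expiration indicator generator, the context filtering logic, and the robot controller.

```python
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class Concept:
    """A concept identified in a received chat message (hypothetical layout)."""
    name: str
    expiration: dict = field(default_factory=dict)  # filled in by the expiration indicator generator


@dataclass
class NlpOutput:
    """Concepts identified in the current chat message."""
    concepts: List[Concept]


class ChatbotSystem:
    """Hypothetical wiring of the components recited in Example 1."""

    def __init__(self,
                 nlp: Callable[[str, List[Concept]], NlpOutput],
                 make_expiration: Callable[[Concept], dict],
                 is_relevant: Callable[[Concept], bool],
                 respond: Callable[[NlpOutput], str]):
        self.nlp = nlp                           # natural language processor
        self.make_expiration = make_expiration   # expiration indicator generator
        self.is_relevant = is_relevant           # expiration criteria / context filtering logic
        self.respond = respond                   # robot controller
        self.context: List[Concept] = []         # filtered NLP output fed back as context

    def handle_message(self, text: str) -> str:
        # 1. The NLP receives the text input plus the (unexpired) context information.
        nlp_output = self.nlp(text, self.context)
        # 2. Attach an expiration indicator to each identified concept.
        for concept in nlp_output.concepts:
            concept.expiration = self.make_expiration(concept)
        # 3. Filter the NLP output and feed it back as context for the next message.
        self.context = [c for c in nlp_output.concepts + self.context if self.is_relevant(c)]
        # 4. The robot controller generates the response output.
        return self.respond(nlp_output)
```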

Example 2 is the computing system of any or all of the preceding examples, wherein the context filtering logic comprises:

an expiration criteria processor configured to identify a relevance of the contextual information to the subsequently received chat message based on the expiration indicator.

Example 3 is the computing system of any or all of the preceding examples, wherein the expiration indicator generator is configured to generate a concept-level expiration indicator corresponding to each concept of the plurality of concepts identified in the given received chat message.

Example 4 is the computing system of any or all of the preceding examples, wherein the plurality of concepts together comprise overall context information, and wherein the expiration indicator generator is configured to generate an overall expiration indicator corresponding to the overall context information.
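
Examples 3 and 4 distinguish concept-level expiration indicators from an overall indicator for the context information as a whole. A minimal, hypothetical data layout (the field names are assumptions made for illustration only) might look like the following sketch.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class ContextInformation:
    """Hypothetical container for context carried forward to the next chat message."""
    # One expiration indicator per identified concept (Example 3).
    concept_expirations: Dict[str, dict] = field(default_factory=dict)
    # A single indicator covering the overall context information (Example 4).
    overall_expiration: dict = field(default_factory=dict)
    # The identified concepts themselves, keyed by name elsewhere in the sketch.
    concepts: List[str] = field(default_factory=list)
```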

Example 5 is the computing system of any or all of the preceding examples, wherein the context filtering logic comprises:

a context filtering system that filters the contextual information based on the relevance to the subsequently received chat message.

Example 6 is the computing system of any or all of the preceding examples, wherein the context filtering system is configured to filter the context information by deleting the identified concepts from the context information based on the relevance.

Example 7 is the computing system of any or all of the preceding examples, wherein the context filtering system is configured to filter the context information by adjusting weights in the context information associated with the identified concepts based on the relevance.
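
Examples 6 and 7 describe two ways the context filtering system can act on the relevance determination: deleting an expired concept outright, or down-weighting it. The sketch below illustrates both strategies under the assumption that relevance is expressed as a score between 0 and 1; the function names, the threshold, and the sample concepts are hypothetical.

```python
from typing import Dict


def filter_by_deletion(context: Dict[str, float],
                       relevance: Dict[str, float],
                       threshold: float = 0.5) -> Dict[str, float]:
    """Example 6 style: drop concepts whose relevance has fallen below a threshold."""
    return {concept: weight for concept, weight in context.items()
            if relevance.get(concept, 0.0) >= threshold}


def filter_by_weight_adjustment(context: Dict[str, float],
                                relevance: Dict[str, float]) -> Dict[str, float]:
    """Example 7 style: keep every concept but scale its weight by its relevance."""
    return {concept: weight * relevance.get(concept, 0.0)
            for concept, weight in context.items()}


# Usage: context maps each concept to a weight the NLP can consume on the next turn.
context = {"order_pizza": 1.0, "deliver_to_home": 0.8}
relevance = {"order_pizza": 0.9, "deliver_to_home": 0.2}
print(filter_by_deletion(context, relevance))           # keeps only "order_pizza"
print(filter_by_weight_adjustment(context, relevance))  # down-weights "deliver_to_home"
```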

Example 8 is the computing system of any or all of the preceding examples, wherein the expiration indicator generator comprises:

a timestamp generator that generates a timestamp indicating when the concept was identified by the natural language processor.

Example 9 is the computing system of any or all of the preceding examples, wherein the expiration indicator generator comprises:

a location stamp generator that generates a location stamp indicating a geographic location at which the received chat message was generated.

Example 10 is the computing system of any or all of the preceding examples, wherein the expiration indicator generator comprises:

a conversation round counter that generates a round count stamp indicating a conversation round corresponding to the given received chat message.
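
Examples 8 through 10 recite three kinds of expiration indicators: a timestamp, a location stamp, and a round count stamp. One possible way to bundle and generate them is sketched below; the names are hypothetical, and only a simple time-based expiration test is shown as an illustration rather than as the claimed criteria.

```python
import time
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class ExpirationIndicator:
    """Hypothetical bundle of the indicators recited in Examples 8-10."""
    timestamp: float                          # Example 8: when the concept was identified
    location: Optional[Tuple[float, float]]   # Example 9: where the chat message was generated
    conversation_round: int                   # Example 10: round count for the message


def generate_expiration_indicator(conversation_round: int,
                                  location: Optional[Tuple[float, float]] = None
                                  ) -> ExpirationIndicator:
    """Stamp a newly identified concept with time, place, and conversation round."""
    return ExpirationIndicator(timestamp=time.time(),
                               location=location,
                               conversation_round=conversation_round)


def is_expired_by_time(indicator: ExpirationIndicator,
                       max_age_seconds: float = 300.0) -> bool:
    """Illustrative criterion: a concept expires once it is older than max_age_seconds."""
    return (time.time() - indicator.timestamp) > max_age_seconds
```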

Example 11 is a computer-implemented method, comprising:

receiving, at a natural language processor, a text input indicating a given received chat message and contextual information, the contextual information identified based on previously received chat messages received prior to the given received chat message;

generating, with a natural language processor, an NLP output based on the text input and the context information, the NLP output identifying a concept in a given received chat message;

generating an expiration indicator corresponding to the identified concept;

filtering the NLP output based on the expiration indicator to obtain a filtered NLP output;

providing the filtered NLP output back to the natural language processor as context information for a subsequently received chat message received after the given received chat message; and

generating, with the robot controller, a response output based on the NLP output.
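
The steps of Example 11 repeat for each incoming chat message, with the filtered NLP output of one turn becoming the context information for the next. The sketch below is a hypothetical driver loop over those steps; the callables passed in (nlp, make_expiration, is_unexpired, respond) stand in for the natural language processor, the expiration indicator generator, the context filtering logic, and the robot controller, and their names are assumptions rather than terms from the specification.

```python
from typing import Callable, Dict, List


def process_chat_messages(messages: List[str],
                          nlp: Callable[[str, Dict[str, dict]], Dict[str, dict]],
                          make_expiration: Callable[[str], dict],
                          is_unexpired: Callable[[dict], bool],
                          respond: Callable[[Dict[str, dict]], str]) -> List[str]:
    """Hypothetical end-to-end loop over the steps of Example 11."""
    context: Dict[str, dict] = {}   # filtered NLP output carried between turns
    responses: List[str] = []
    for text in messages:
        # Steps 1-2: the NLP receives the text input and the context information
        # and produces an output identifying concepts in the message.
        nlp_output = nlp(text, context)
        # Step 3: generate an expiration indicator for each identified concept.
        for concept in list(nlp_output):
            nlp_output[concept] = make_expiration(concept)
        # Steps 4-5: filter the combined output and feed it back as context
        # for the subsequently received chat message.
        merged = {**context, **nlp_output}
        context = {c: ind for c, ind in merged.items() if is_unexpired(ind)}
        # Step 6: the robot controller generates the response output.
        responses.append(respond(nlp_output))
    return responses
```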

Example 12 is the computer-implemented method of any or all of the preceding examples, wherein filtering comprises:

identifying a relevance of the contextual information to the subsequently received chat message based on the expiration indicator.

Example 13 is the computer-implemented method of any or all of the preceding examples, wherein generating the expiration indicator comprises:

generating a concept-level expiration indicator corresponding to each concept of the plurality of concepts identified in the given received chat message.

Example 14 is the computer-implemented method of any or all of the preceding examples, wherein the plurality of concepts together comprise overall context information, and wherein generating the expiration indicator comprises:

generating an overall expiration indicator corresponding to the overall context information.

Example 15 is the computer-implemented method of any or all of the preceding examples, wherein filtering comprises:

filtering the contextual information based on the relevance to the subsequently received chat message.

Example 16 is the computer-implemented method of any or all of the preceding examples, wherein filtering comprises:

filtering the contextual information by deleting the identified concepts from the contextual information based on the relevance.

Example 17 is the computer-implemented method of any or all of the preceding examples, wherein filtering comprises:

filtering the contextual information by adjusting weights in the contextual information associated with the identified concepts based on the relevance.

Example 18 is the computer-implemented method of any or all of the preceding examples, wherein generating the expiration indicator comprises:

generating a timestamp indicating when the concept was identified by the natural language processor.

Example 19 is the computer-implemented method of any or all of the preceding examples, wherein generating the expiration indicator comprises:

generating a location stamp indicating a geographic location at which the given received chat message was generated.

Example 20 is a chat robot computing system, comprising:

a Natural Language Processor (NLP) that receives a text input indicating a first chat message and context information identified based on a previously received chat message received prior to the first chat message, the natural language processor generating an NLP output based on the text input and the context information, the NLP output identifying concepts in the first chat message;

an expiration indicator generator that generates an expiration indicator corresponding to the identified concept;

an expiration criteria processor configured to identify a relevance of the context information to subsequently received chat messages based on the expiration indicator;

context filtering logic that filters the NLP output based on the relevance to obtain a filtered NLP output and provides the filtered NLP output back to the natural language processor as context information for the subsequently received chat message received after the first chat message; and

a robot controller that receives the NLP output from the natural language processor and generates a response output based on the NLP output.

Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are described as example forms of implementing the claims.
