Method, apparatus and system for generating Bayesian inferences with spiking neural networks

Document No.: 1302105    Publication date: 2020-08-07

Note: This technology, "Method, apparatus and system for generating Bayesian inferences with spiking neural networks," was created by A. Paul and N. Srinivasa on 2018-02-23. Summary: Techniques and mechanisms for performing Bayesian inference using spiking neural networks. In an embodiment, a parent node of a spiking neural network receives a periodic first bias signal. The parent node communicates a likelihood signal to a child node, where the parent node and the child node correspond to a first condition and a second condition, respectively. Based on a phase change applied to the first bias signal, the likelihood signal indicates a probability of the first condition. The child node also receives a signal indicating an instance of the second condition; based on that indication and a second bias signal, the child node signals to the parent node that an adjustment is to be made to the phase change applied to the first bias signal. After the adjustment, the likelihood signal indicates an updated probability of the first condition.

1. A computer device for generating Bayesian inferences using a spiking neural network, the computer device comprising circuitry for performing the steps of:

receiving a first bias signal at a first node of the spiking neural network, wherein the first node corresponds to a first condition of a Bayesian network;

receiving a second bias signal at a second node of the spiking neural network, wherein the second node corresponds to a second condition of the Bayesian network;

applying a change to a phase of the first bias signal;

passing a third signal from the first node to the second node in response to the change in phase of the first bias signal, wherein the third signal indicates a likelihood of the first condition;

receiving, at the second node, a signal indicative of an instance of the second condition;

passing a fourth signal from the second node to the first node in response to the instance of the second condition, the fourth signal based on the second bias signal; and

adjusting an amount of change in a phase of the first bias signal based on the fourth signal.

2. The computer device of claim 1, wherein a spike rate of the third signal varies over time based on the change in phase of the first bias signal, wherein a magnitude of a change caused by the spike rate of the third signal indicates a likelihood of the first condition.

3. The computer device of claim 1, wherein the fourth signal is further based on a conditional probability of the second condition given the first condition.

4. The computer device of claim 3, wherein the third signal is communicated via a synapse coupled between the first node and the second node, wherein a weight assigned to the synapse indicates a conditional probability of the second condition given the first condition.

5. The computer device of claim 3, wherein the fourth signal is based on a ratio of a conditional probability to a probability of the second condition.

6. The computer device of claim 1, further comprising circuitry to apply another change to the phase of the second bias signal, wherein the fourth signal is further based on the change in the phase of the second bias signal.

7. The computer device of claim 1, further comprising circuitry to perform the steps of:

receiving a third bias signal at a third node of the spiking neural network, the third node corresponding to a third condition of the Bayesian network;

communicating the third signal from the first node to the third node;

receiving, at the third node, a signal indicative of an instance of the third condition;

in response to an instance of the third condition, passing a fifth signal from the third node to the first node, the fifth signal based on the third bias signal; and

further adjusting an amount of change in the phase of the first bias signal based on the fifth signal.

8. The computer device of claim 7, wherein the circuitry is to adjust the amount of change in the phase of the first bias signal based on the fifth signal while adjusting the change in the phase of the first bias signal based on the fourth signal.

9. The computer device of claim 1, further comprising circuitry to perform the steps of:

receiving a third bias signal at a third node of the spiking neural network, the third node corresponding to a third condition of the Bayesian network;

applying a change to a phase of the third bias signal;

communicating a fifth signal from the third node to the second node, wherein the fifth signal indicates a likelihood of the third condition based on a change in phase of the third bias signal;

in response to an instance of the second condition, passing a sixth signal from the second node to the third node, the sixth signal based on the second bias signal; and

adjusting an amount of change in a phase of the third bias signal based on the sixth signal.

10. The computer device of claim 1, further comprising circuitry to select, based on the adjusted amount of change, one condition of the Bayesian network in preference to one or more other conditions of the Bayesian network.

11. At least one non-transitory machine-readable medium comprising instructions that, when executed by a machine, cause the machine to perform operations for generating Bayesian inferences using a spiking neural network, the operations comprising:

receiving a first bias signal at a first node of the spiking neural network, wherein the first node corresponds to a first condition of a Bayesian network;

receiving a second bias signal at a second node of the spiking neural network, wherein the second node corresponds to a second condition of the Bayesian network;

applying a change to a phase of the first bias signal;

passing a third signal from the first node to the second node in response to the change in phase of the first bias signal, wherein the third signal indicates a likelihood of the first condition;

receiving, at the second node, a signal indicative of an instance of the second condition;

passing a fourth signal from the second node to the first node in response to the instance of the second condition, the fourth signal based on the second bias signal; and

adjusting an amount of change in a phase of the first bias signal based on the fourth signal.

12. The at least one non-transitory machine readable medium of claim 11, wherein a spike rate of the third signal varies over time based on the change in phase of the first bias signal, wherein a magnitude of a change caused by the spike rate of the third signal indicates a likelihood of the first condition.

13. The at least one non-transitory machine readable medium of claim 11, wherein the fourth signal is further based on a conditional probability of the second condition given the first condition.

14. The at least one non-transitory machine readable medium of claim 13, wherein the third signal is communicated via a synapse coupled between the first node and the second node, wherein a weight assigned to the synapse indicates a conditional probability of the second condition given the first condition.

15. The at least one non-transitory machine readable medium of claim 13, wherein the fourth signal is based on a ratio of a conditional probability to a probability of the second condition.

16. The at least one non-transitory machine-readable medium of claim 11, the operations further comprising: applying another change to the phase of the second bias signal, wherein the fourth signal is further based on the change in the phase of the second bias signal.

17. The at least one non-transitory machine-readable medium of claim 11, the operations further comprising:

receiving a third bias signal at a third node of the spiking neural network, the third node corresponding to a third condition of the Bayesian network;

communicating the third signal from the first node to the third node;

receiving, at the third node, a signal indicative of an instance of the third condition;

in response to an instance of the third condition, passing a fifth signal from the third node to the first node, the fifth signal based on the third bias signal; and

further adjusting an amount of change in the phase of the first bias signal based on the fifth signal.

18. The at least one non-transitory machine readable medium of claim 17, wherein the amount of change in the phase of the first bias signal is adjusted based on the fifth signal while the change in the phase of the first bias signal is adjusted based on the fourth signal.

19. The at least one non-transitory machine-readable medium of claim 11, the operations further comprising:

receiving a third bias signal at a third node of the spiking neural network, the third node corresponding to a third condition of the Bayesian network;

applying a change to a phase of the second bias signal;

communicating a fifth signal from the second node to the third node, wherein the fifth signal indicates a likelihood of the second condition based on a change in phase of the second bias signal;

receiving, at the third node, a signal indicative of an instance of the third condition;

in response to an instance of the third condition, passing a sixth signal from the third node to the second node, the sixth signal based on the third bias signal; and

adjusting an amount of change in a phase of the second bias signal based on the sixth signal.

20. The at least one non-transitory machine-readable medium of claim 11, the operations further comprising: selecting, based on the adjusted amount of change, one condition of the Bayesian network in preference to one or more other conditions of the Bayesian network.

21. A method for generating Bayesian inferences using a spiking neural network, the method comprising:

receiving a first bias signal at a first node of the spiking neural network, wherein the first node corresponds to a first condition of a Bayesian network;

receiving a second bias signal at a second node of the spiking neural network, wherein the second node corresponds to a second condition of the Bayesian network;

applying a change to a phase of the first bias signal;

passing a third signal from the first node to the second node in response to the change in phase of the first bias signal, wherein the third signal indicates a likelihood of the first condition;

receiving, at the second node, a signal indicative of an instance of the second condition;

passing a fourth signal from the second node to the first node in response to the instance of the second condition, the fourth signal based on the second bias signal; and

adjusting an amount of change in a phase of the first bias signal based on the fourth signal.

22. The method of claim 21, wherein a spike rate of the third signal varies over time based on the change in phase of the first bias signal, wherein a magnitude of a change caused by the spike rate of the third signal indicates a likelihood of the first condition.

23. The method of claim 21, wherein the fourth signal is further based on a conditional probability of the second condition given the first condition.

24. The method of claim 21, further comprising: applying another change to the phase of the second bias signal, wherein the fourth signal is further based on the change in the phase of the second bias signal.

25. The method of claim 21, further comprising: selecting, based on the adjusted amount of change, one condition of the Bayesian network in preference to one or more other conditions of the Bayesian network.

Background

Embodiments described herein relate generally to spiking neural networks, and more particularly, but not exclusively, to techniques for performing Bayesian inference operations with spiking neural networks.

Spiking neural networks (or "SNNs") are increasingly being adapted to provide next-generation solutions for various applications. SNNs rely on signaling techniques in which information is conveyed using time-based relationships between signal spikes. Compared with typical deep learning architectures, such as those built on Convolutional Neural Networks (CNNs) or Recurrent Neural Networks (RNNs), SNNs offer an economy of communication that in turn allows orders-of-magnitude improvements in power efficiency.

While neural networks are generally effective for applications such as image recognition, difficulties have arisen in adapting such networks to higher-order analytical reasoning such as Bayesian inference processing. Bayesian networks are straightforward in principle but difficult to implement in practice, due to the exponential relationship between the number of terms in a given problem and the circuitry and/or computational load required to solve that problem. To address these difficulties, prior approaches variously require auxiliary nodes, non-local computation, or additional special-purpose support circuitry. As the size, diversity, and pervasiveness of artificial-intelligence networks grow, incremental improvements to solutions that provide higher-order analysis are expected to offer increasing added value.

Drawings

Embodiments of the invention are illustrated by way of example and not by way of limitation in the figures of the accompanying drawings, in which:

Fig. 1 shows diagrams each illustrating features of a simplified neural network according to an embodiment.

Fig. 2A is a functional block diagram illustrating elements of a spiking neural network according to an embodiment.

Fig. 2B is a functional block diagram illustrating elements of a neural network node, according to an embodiment.

Fig. 3 is a flow diagram illustrating features of a method for operating a spiking neural network, in accordance with an embodiment.

Fig. 4 is a functional block diagram illustrating features of a spiking neural network for performing bayesian inference operations, according to an embodiment.

Fig. 5 shows a timing diagram that variously illustrates signals generated using a spiking neural network, in accordance with embodiments.

FIG. 6 is a functional block diagram illustrating a computing device according to one embodiment.

FIG. 7 is a functional block diagram illustrating an exemplary computer system according to one embodiment.

Detailed Description

Embodiments discussed herein variously provide techniques and mechanisms for using spiking neural networks to determine conditional likelihood information, where the determined conditional likelihoods represent corresponding probabilities of a Bayesian network used to generate Bayesian inferences. Various embodiments build on a neuromorphic model of a spiking neural network that is based on interactions between neural oscillations. Neurons each represent a corresponding variable of interest, while their synaptic connections represent the conditional dependency structure between these variables. Spiking activity oscillates, and the amplitude of the oscillation represents the corresponding probability. When (a subset of) the random variables are observed at some of the leaf nodes of the graph, the evidence propagates from the leaf nodes to the central nodes as equivalent phase shifts; the combination of the propagating wave and the phase shifts modulates the oscillations of neurons throughout the graph, and thereby modulates the corresponding probabilities of the Bayesian network to generate Bayesian inferences.

The plurality of nodes of the spiking neural network may each represent or otherwise correspond to a respective condition, including, for example, a condition local to the spiking neural network or a condition of a system to be evaluated by the spiking neural network. For a given node, a corresponding condition may result in a signal being received by (or passed from) that node. For example, a given node may be configured to receive a signal and recognize that a predefined pattern and/or timing of the signal indicates an instance of the condition to which that node corresponds. In some embodiments, the spiking neural network is used to perform one or more Bayesian inference operations, e.g., where some or all of the plurality of nodes each correspond to a respective condition (or "variable") of the Bayesian network. As used herein, "Bayesian inference" and "Bayesian inference operations" variously refer to processes that change values, signal characteristics, and/or other system states to indicate updated probabilities of conditions of a Bayesian network. Such updated probabilities may be used when the Bayesian network subsequently selects one condition over one or more other such conditions, e.g., where the Bayesian inference operation further comprises such a selection.

A plurality of nodes of the spiking neural network may each receive a respective bias signal, at least some component of which is sinusoidal or otherwise varies periodically over time. Such nodes may each receive the same bias signal, or alternatively some or all of the nodes may each receive a different respective bias signal. In embodiments, the periodic components of some or all such bias signals have the same frequency. Based on such biasing, the plurality of nodes can variously exhibit resonance with respect to one another, e.g., in terms of the nodes' respective membrane potentials and/or synaptic signaling. Such resonance between nodes may be based on the coupling of the nodes to one another, and further based on common-frequency components of the one or more bias signals variously provided to the nodes.

The respective node types of two given nodes, referred to herein as a "parent" type and a "child" type, may be based at least in part on a functional relationship between the nodes. For a given first node and second node, the first node may act as a parent node with respect to the second node, with the second node in turn acting as a child node with respect to the first node. A parent node represents an event, entity, and/or state that (with a given probability) causally precedes the entity or event represented by the corresponding child node. In some embodiments, a given node is a child node with respect to one node while being a parent node with respect to another node. Alternatively or additionally, a node may be a parent node to each of a plurality of respective child nodes, and/or a child node to each of a plurality of respective parent nodes. During operation of the spiking neural network, a parent node may send a signal, referred to herein as a "likelihood signal," to a corresponding child node. Alternatively or additionally, a child node may send another signal, referred to herein as an "adjustment signal," to the parent node.

As used herein, a "likelihood signal" refers to a signal that passes from a parent node to a child node, where a characteristic of the signal indicates a likelihood (e.g., a current probability) of a condition to which the parent node corresponds. Such signal characteristics may include, for example, the rate (e.g., frequency), amplitude, predefined pattern, and/or timing (e.g., relative to some reference time) of at least some signal spikes or other components of the likelihood signal. For example, a node may provide a likelihood signal that includes a periodic component during a period indicative of some time period, the periodic component based at least in part on the periodic characteristics of the bias signal for that node. The periodic component may be further based on one or more other likelihood signals received by that node from the respective parent node (the one or more other likelihood signals each including the respective periodic component). In one embodiment, the "rate-amplitude" characteristic of such likelihood signals indicates the likelihood of the characteristic to which that node corresponds.

As used herein, "rate-amplitude" refers to a characteristic of a signal having a rate of a certain spike ("or spike rate") that varies during at least some period of time. The spike rate may be a moving average rate based on the number of signal spikes occurring within the time period of the sampling window. The rate of spikes may include a periodic component that oscillates or otherwise varies between the rate of first signal spikes and second signal spikes. The rate-amplitude of such a signal is the amplitude of the change caused by the spike rate of the signal for a given period of time.

As used herein, a "regulation signal" refers herein to a signal communicated from a child node to a parent node to indicate that the parent node is to regulate the phase of a bias signal provided to that parent node. The parent node may be configured to receive the adjustment signal and be aware that some predefined spike pattern, timing, and/or other characteristic of the adjustment signal is used to specify or otherwise indicate an adjustment to the value of the phase change applied to the bias signal. Such adjustment signals may be based at least in part on the current probability of the condition to which the child node corresponds. The probability may be indicated by a characteristic of the membrane potential at the sub-node, such as rate-amplitude, for example, where the characteristic is based on a change in phase, if any, of a bias signal applied to the sub-node. In some embodiments, given the second condition, the adjustment signal is further based at least in part on a conditional probability of the first condition (where the first condition and the second condition correspond to the child node and the parent node, respectively). The conditional probability may be indicated by a weight assigned to a synapse by which the child node receives the likelihood signal. The adjustment signal may be communicated from a child node to a parent node based on that child node receiving a corresponding "instance signal".

As used herein, an "instance signal" refers to a signal provided to a node to indicate an instance of a condition to which the node corresponds. For example, the instance signal may be received by the node from a source external to the spiking neural network. Alternatively, the instance signal may be generated based on operation of one or more other nodes of the spiking neural network. In response to the instance signal, the receiving node may each communicate one or more adjustment signals to a corresponding parent node.

The techniques described herein may be implemented in one or more electronic devices. Non-limiting examples of electronic devices that may utilize the techniques described herein include any kind of mobile and/or stationary device, such as cameras, cellular phones, computer terminals, desktop computers, e-readers, facsimile machines, automated teller machines, laptop computers, netbook computers, notebook computers, internet appliances, payment terminals, personal digital assistants, media players and/or recorders, servers (e.g., blade servers, rack-mount servers, combinations thereof, etc.), set-top boxes, smart phones, tablet personal computers, ultra-mobile personal computers, wired phones, combinations thereof, and so forth. More generally, the techniques described herein may be employed in any of a variety of electronic devices including spiking neural networks.

In some embodiments, the nodes of the spiking neural network may be of the leaky integrate-and-fire (LIF) type, e.g., where the membrane potential v_m of a given node j, based on one or more spike signals received at that node j, can rise to a spike and then decay over time. Such a membrane potential v_m may be determined, for example, according to the following equation:

τ_m · dv_m/dt = (v_rest − v_m) + Σ_i w_ij · I_ij + J_b    (1)

where v_rest is the rest potential toward which the membrane potential v_m decays, τ_m is the time constant for the exponential decay of v_m, w_ij is the synaptic weight of the synapse from another node i to node j, I_ij is a spike signal (or "spike train") communicated to node j via that synapse, and J_b is the value of, for example, a bias current or other signal provided to node j from some external node or source. The spiking neural network may operate based on a predefined threshold voltage V_threshold, with node j configured to output a signal spike in response to its membrane potential v_m exceeding V_threshold.
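Below is a minimal discrete-time sketch of equation (1) in Python, assuming forward-Euler integration and a sinusoidal bias J_b carrying an adjustable phase change, in keeping with the periodic bias signals described above; all numeric constants and names are illustrative assumptions, not values from the document.

```python
import numpy as np

def simulate_lif(phase, t_end=1.0, dt=1e-4, v_rest=0.0, v_th=1.0, tau_m=0.02,
                 bias_amp=2.0, omega=2 * np.pi * 5.0, weights=(), spike_inputs=()):
    """Forward-Euler integration of  tau_m * dv_m/dt = (v_rest - v_m)
    + sum_i w_ij * I_ij + J_b,  with J_b(t) = bias_amp * sin(omega*t + phase).
    The node emits a spike and resets to v_rest whenever v_m reaches v_th."""
    v, spikes = v_rest, []
    for k in range(int(t_end / dt)):
        t = k * dt
        j_b = bias_amp * np.sin(omega * t + phase)                  # periodic bias signal
        syn = sum(w * i[k] for w, i in zip(weights, spike_inputs))  # weighted spike inputs
        v += (dt / tau_m) * ((v_rest - v) + syn + j_b)
        if v >= v_th:                                               # threshold crossing
            spikes.append(t)
            v = v_rest
    return spikes

# A phase change applied to the bias signal shifts the node's spike timing:
s0, s1 = simulate_lif(phase=0.0), simulate_lif(phase=np.pi / 2)
print(f"first spike at phase 0: {s0[0]:.4f} s; at phase pi/2: {s1[0]:.4f} s")
```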

Fig. 1 illustrates an exemplary diagram of a simplified neural network 100, providing an illustration of connections 120 between a first set of nodes 110 (e.g., neurons) and a second set of nodes 130 (e.g., neurons). A neural network (such as the simplified neural network 100) may be organized, in whole or in part, into multiple layers, including, for example, an input layer and an output layer. It will be understood that the simplified neural network 100 depicts only two layers and a small number of nodes, but other forms of neural networks may include a large number of nodes, layers, connections, and paths configured in various ways. For example, the neural network may instead be a monolithic single layer with an arbitrary connection graph representing the causal link strengths between the respective conditions variously represented by the nodes of that layer.

Data provided into the neural network 100 may first be processed by the synapses of the input neurons. The interaction between the input, a neuron's synapse, and the neuron itself determines whether output is provided, via an axon, to the synapse of another neuron. Modeling synapses, neurons, axons, etc. can be accomplished in various ways. In an example, neuromorphic hardware includes multiple separate processing elements (e.g., neural cores) as synthetic neurons and a messaging structure for passing output to other neurons. Whether a particular neuron "fires" to provide data to a further connected neuron depends on the activation function applied by the neuron and the weight of the synaptic connection w_ij from neuron i (e.g., located in the layer of the first set of nodes 110) to neuron j (e.g., located in the layer of the second set of nodes 130). The input received by neuron i is depicted as the value x_i, and the output produced from neuron j is depicted as the value y_j. Thus, the processing performed in a neural network is based on weighted connections, thresholds, and evaluations performed among neurons, synapses, and other elements of the network.

In an example, the neural network 100 is established from a network of spiking neural network cores, where the cores communicate via short packetized spike messages sent from core to core. For example, each neural network core may implement some number of primitive nonlinear time-domain computational elements as neurons, such that when the activation of a neuron exceeds some threshold level, the neuron generates a spike message that is propagated to a fixed set of fan-out neurons contained in destination cores. The network may distribute the spike messages to all destination neurons, and in response those neurons update their activations in a transient, time-dependent manner, similar to the operation of real biological neurons.

Neural network 100 further illustrates the receipt of a spike, represented by the value x_i, at neuron i in the first set of neurons (e.g., a neuron in the first set of nodes 110). The output of the neural network 100 is also shown as a spike, represented by the value y_j, which reaches neuron j in the second set of neurons (e.g., a neuron in the second set of nodes 130) via a path established by the connections 120. In a spiking neural network, all communication occurs over event-driven action potentials, or spikes. In an example, a spike conveys no information other than the spike time and the identity of the source and destination neuron pair. Computation can occur in various ways in each respective neuron as a result of the dynamic, nonlinear integration of weighted spike inputs using real-valued state variables. The time-domain sequence of spikes generated by or for a particular neuron may be referred to as that neuron's "spike train."

In the example of a spiking neural network, activation occurs via spike trains, which means that time is a factor that must be considered. Furthermore, in a spiking neural network, each neuron may provide functionality similar to that of a biological neuron, in that an artificial neuron receives its inputs via synaptic connections to one or more "dendrites" (part of the physical structure of a biological neuron), and those inputs affect the internal membrane potential of the artificial neuron's "soma" (cell body). In a spiking neural network, an artificial neuron "fires" (e.g., produces an output spike) when its membrane potential crosses a firing threshold. Thus, the effect of inputs on a spiking neural network neuron is to increase or decrease its internal membrane potential, making the neuron more or less likely to fire. Furthermore, in a spiking neural network, input connections may be excitatory or inhibitory. A neuron's membrane potential may also be affected by changes in the neuron's own internal state ("leakage").

Fig. 1 also illustrates an example inference path 140 in a spiking neural network, such as may be implemented by the form of neural network 100 or other forms of neural networks. The inference path 140 includes a pre-synaptic neuron 142 configured to produce a pre-synaptic spike train x_i representing a spiking input. A spike train is a time series of discrete spike events, which provides a set of times specifying when a neuron fires.

As shown, the spike train x_i is produced by the pre-synaptic neuron (e.g., neuron 142) and is evaluated for processing based on the characteristics of a synapse 144. For example, the synapse may apply one or more weights, such as the weight w_ij, which are applied in evaluating the spike train x_i. The spike train x_i enters the synapse, e.g., synapse 144 having weight w_ij. This weight scales the effect of a pre-synaptic spike on the post-synaptic neuron (e.g., neuron 146). If the integrated contribution of all input connections to a post-synaptic neuron exceeds a threshold, the post-synaptic neuron 146 fires and produces a spike. As shown, y_j is the post-synaptic spike train produced by the post-synaptic neuron (e.g., neuron 146) in response to some number of input connections. As shown, the post-synaptic spike train y_j is passed from neuron 146 on to other post-synaptic neurons.

Fig. 2A illustrates features of a spiking neural network 200 for implementing Bayesian inference operations, according to an embodiment. Fig. 2A also includes a legend 201 showing the respective symbols used for a network node, a bias signal, a likelihood signal, an instance signal, and an adjustment signal. Spiking neural network 200 is an example of an embodiment in which a parent node passes a likelihood signal to a corresponding child node, and the child node subsequently uses an adjustment signal to inform the parent node that a phase adjustment is to be made to a bias current. The spiking neural network 200 may include, for example, some or all of the features of the neural network 100.

As shown in Fig. 2A, spiking neural network 200 includes a plurality of nodes variously coupled to one another via synapses, each node to receive a respective bias signal that is sinusoidal or otherwise periodic. By way of illustration and not limitation, the nodes 210, 220, 230 of the network 200 may variously receive respective bias signals 212, 222, 232. Based at least in part on such bias signals, the plurality of nodes may variously exhibit resonant coupling, e.g., at least with respect to the nodes' synaptic signaling and/or respective membrane potentials. For example, respective signals at nodes 210, 220, 230 may each exhibit at least some resonance at a first frequency. Such resonant coupling may facilitate representing probability by way of a phase change applied to a bias signal.

In the illustrated example embodiment, node 210 and node 230 are to act as a parent node and a child node, respectively, with respect to each other. Node 210 is further coupled to receive one or more likelihood signals 224, each from a respective other node of the spiking neural network (such as node 220), e.g., where node 210 is also to act as a child node for each such other node. Node 210 may apply a change to the phase of bias signal 212, where the phase change (and/or a signal based on the phase change) indicates a probability of the condition to which node 210 corresponds. Based on that phase change, node 210 may communicate a likelihood signal 218 to node 230.

Node 230 may be configured to utilize adjustment signal 234 to indicate adjustments, if any, to be made to the phase change applied to bias signal 212 by node 210. The membrane voltage of node 230 may be generated based on the bias signal 232, e.g., where a periodic component of the membrane voltage is based on the phase change (if any) applied to the bias signal 232 by node 230. In embodiments in which node 230 also acts as a parent node relative to some other node or nodes, such a phase change applied to bias signal 232 may be based on one or more other adjustment signals (not shown), each received by node 230 from such other node or nodes.

Node 230 may be further coupled to receive an instance signal 236, the instance signal 236 indicating an instance of the condition to which node 230 corresponds. The instance signal 236 may be provided, for example, from some sensor, test unit, or other source external to the spiking neural network 200 (or alternatively, from some other node of the spiking neural network 200). Some embodiments are not limited with respect to the particular source from which the instance signal 236 is received. Based on the indication provided by the instance signal 236, node 230 may indicate (via the adjustment signal 234) an amount by which the phase change for bias signal 212 is to be adjusted. Based on adjustment signal 234, node 210 may adjust the phase change to bias signal 212. Such adjusting may include determining whether a change is to be applied to the phase of the bias signal 212, determining an amount of phase to change, and/or determining whether an amount of phase change is to be adjusted. Such adjustment of the phase change at node 210 may result in likelihood signal 218 indicating a different value for the probability of the condition to which node 210 corresponds. As a result, the spiking neural network 200 may implement Bayesian inference operations that change the respective values of one or more probabilities.

Although some embodiments are not limited in this regard, node 210 may be further coupled to receive one or more other adjustment signals (not shown) each from a respective other child node — for example, where likelihood signal 218 is also communicated to each other such child node. In such embodiments, the phase change applied to the bias signal 212 may be further adjusted based on one or more other such adjustment signals.

Fig. 2B illustrates features of spiking neural network nodes 240, 250 each participating in a bayesian inference operation, according to an embodiment. In fig. 2B, the functional blocks shown each represent respective circuitry, executing software, and/or other logic for providing their corresponding functionality. Nodes 240, 250 may include respective features of nodes 210, 230, for example.

With respect to each other's operation, nodes 240, 250 are to act as parent and child nodes, respectively. Nodes 240, 250 may correspond to a condition A and to a condition B, respectively, where condition B has at least some dependency on condition A. Conditions A and B may be respective parameters of a Bayesian network implemented with the spiking neural network.

Such a spiking neural network may be initialized according to a Bayesian network design that defines, for each given node of a plurality of nodes, an initial probability of the condition to which that given node corresponds. Such a probability may be indicated based at least in part on a phase change to be applied to the respective bias signal for that given node. For each child node of the plurality of nodes, the Bayesian network design may define a respective conditional probability of the condition corresponding to that child node given another condition corresponding to a respective parent node of that child node. Such a conditional probability may be indicated, for example, by a weight assigned to the synapse by which the child node receives a likelihood signal. The particular probability values of a given initialization state may be determined based on conventional Bayesian network design techniques, and some embodiments are not limited in this regard.
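One way to picture such an initialization, as a hypothetical sketch: prior probabilities become initial phase changes on each node's bias signal, and conditional probabilities become weights on the parent-to-child synapses. The node names, data layout, and the linear probability-to-phase mapping below are all assumptions; the document says only that a phase change indicates a probability, not how.

```python
import math

def prob_to_phase(p):
    """Hypothetical linear mapping of a probability in [0, 1] onto a phase change
    in [0, pi] radians (an assumed mapping, not specified by the document)."""
    return p * math.pi

# Initial priors, one per node/condition (values are illustrative).
priors = {"A": 0.3, "B": 0.2}
phase = {node: prob_to_phase(p) for node, p in priors.items()}

# Conditional probabilities, one per parent->child synapse, stored as weights.
synapse_weight = {("A", "B"): 0.7}  # w_ba encodes P(B|A)

print(phase, synapse_weight)
```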

Referring to node 240, a periodic bias signal J_a may be received at a block 242, where functional block 242 applies a phase change φ_a to the periodic bias signal J_a. At a given moment, the phase change φ_a may be based on one or more adjustment signals (such as the illustrative adjustment signal C_ba shown), each communicated to node 240 from a respective child node. The bias signal J_a may be provided to another block 246, with block 246 further operating based on, for example, one or more likelihood signals each from a respective other node (not shown). By way of illustration and not limitation, node 240 may receive a likelihood signal L_x1a from another node acting as a parent node relative to node 240, where a weight w_x1a 244 (assigned to the synapse by which node 240 receives L_x1a) may be applied by node 240 to L_x1a.

Block 246 may generate the membrane voltage V_m-a as a function f_a of the bias signal J_a, of the phase change φ_a applied thereto, and of the likelihood signal L_x1a (e.g., as weighted by w_x1a). The function f_a may be adapted, for example, from any of a variety of leaky integrate-and-fire (LIF) techniques used to generate spikes, e.g., where the function f_a includes features of equation (1) described herein. Another block 248 of node 240 may generate a likelihood signal L_a based on the membrane voltage V_m-a.

For example, block 248 may generate the likelihood signal L_a in response to V_m-a crossing or otherwise reaching a threshold V_th-a. Likelihood signal L_a may indicate the current probability of the condition A to which node 240 corresponds. Signal L_a may be sent to node 250 (and, in some embodiments, to one or more other child nodes of node 240). Node 250 may then pass an adjustment signal C_ba to node 240, the adjustment signal C_ba indicating an amount by which the phase change φ_a is to be adjusted. Adjusting the phase change φ_a may result in the likelihood signal L_a subsequently indicating an updated value of the probability of condition A.

For example, node 250 may apply a synaptic weight w_ba 254 to likelihood signal L_a. In addition, node 250 may receive a bias signal (not shown), such as bias signal 232, and apply a phase change φ_b 252 to that bias signal. In such embodiments, the adjustment signal C_ba may be based at least in part on the phase change φ_b 252 and the synaptic weight w_ba 254, e.g., where a block 256 determines a scaling factor β_ba based on the synaptic weight w_ba 254 and the phase change φ_b 252. Note that in some embodiments, the adjustment signal C_ba may be independent of the likelihood signal L_a.

The value of synaptic weight w_ba 254 may represent or otherwise indicate the conditional probability P(B|A) of condition B given condition A. Alternatively or additionally, the phase change φ_b 252 may indicate the probability P(B) of condition B. The initial values of the probabilities P(B) and P(B|A) may be predefined as part of the initialization state of the spiking neural network. P(B) may be updated later during operation of the spiking neural network, e.g., in the case where node 250 is also a parent node to one or more other nodes, while the conditional probability P(B|A) remains a fixed value (e.g., of the corresponding synaptic weight w_ba 254). Any such updating of P(B) may include adjusting the phase change φ_b 252 using techniques similar to those used for adjusting the phase change φ_a.

In one embodiment, block 256 scales factor βbaIs determined as a function of the ratio { P (B | A)/P (A) } — e.g., according to the following function:

βba=[{P(B|A)/P(A)}–1](2)

in such embodiments, the scaling factor βbaIndicating whether the association (dependency) between condition a and condition B is positive or negative. Based on such correlation, signal C is adjustedbaThe phase change phi can be signaledaWhether to increase or decrease.

Block 256 may pass the scaling factor β_ba to another block 258 of node 250, e.g., where block 258 generates adjustment signal C_ba based on β_ba. For example, the adjustment signal C_ba may indicate or otherwise be based on the product [β_ba]·[Δφ_ba], where Δφ_ba is a phase adjustment value corresponding to both the phase change φ_a and a given instance of the condition B to which node 250 corresponds. Δφ_ba may correspond to an increment by which P(A) is to be varied in response to a given instance of condition B. For example, Δφ_ba may be based on initially defined parameters that are part of the Bayesian network design. In one embodiment, Δφ_ba is variable, e.g., based on an instance signal N_b used to indicate an instance of condition B. For example, at any given time, Δφ_ba may represent or otherwise indicate a detected intensity (e.g., a signal spike rate) of the instance signal N_b. In other embodiments, Δφ_ba indicates or is otherwise based on a maximum allowable value of the phase change φ_a, a range of allowable values of the phase change φ_a, or the like. Node 240 may be preconfigured to identify a characteristic of C_ba as indicating the amount by which the phase change φ_a is to be adjusted.
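As a worked illustration of equation (2) and the product [β_ba]·[Δφ_ba] described above, the following sketch computes a child node's contribution to the parent's phase update; the function names are hypothetical, and Δφ_ba is treated here as a fixed design parameter rather than a signal-derived quantity.

```python
import math

def scaling_factor(p_b_given_a, p_a):
    """Equation (2): beta_ba = P(B|A)/P(A) - 1; its sign says whether the
    dependency between conditions A and B is positive or negative."""
    return p_b_given_a / p_a - 1.0

def adjustment(p_b_given_a, p_a, delta_phi_ba):
    """Amount indicated by adjustment signal C_ba: the product [beta_ba]*[delta_phi_ba]."""
    return scaling_factor(p_b_given_a, p_a) * delta_phi_ba

# Example: P(B|A) = 0.7, P(A) = 0.3, design increment delta_phi_ba = 0.05*pi.
print(adjustment(0.7, 0.3, 0.05 * math.pi))  # positive: phase change phi_a should grow
```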

Accordingly, node 250 may generate the adjustment signal C_ba based on both the phase change φ_b 252 and the weight w_ba 254, where the weight w_ba 254 indicates P(B|A), and where, based on the phase change φ_b 252, the membrane voltage of node 250 (and/or a likelihood signal L_b (not shown) from node 250) indicates P(B). For example, the adjustment signal C_ba may be based on the scaling factor β_ba and the phase adjustment value Δφ_ba. Although some embodiments are not limited in this respect, node 240 may be further coupled to some other child node (not shown) that also receives likelihood signal L_a, e.g., where that other child node corresponds to a condition C.

In such embodiments, the other child node may include additional circuitry and/or software to generate, in a similar manner, another adjustment signal C_ca based on a phase change φ_c (applied to a bias signal by that child node) and a weight w_ca applied by that child node to likelihood signal L_a. Weight w_ca may indicate the conditional probability P(C|A), e.g., where, based on the phase change φ_c, the membrane voltage of (and/or a likelihood signal L_c from) that other child node indicates P(C). Adjustment signal C_ca may be based on a scaling factor β_ca and a phase adjustment value Δφ_ca, which have, for example, features corresponding to those of the scaling factor β_ba and the phase adjustment value Δφ_ba. Block 242 may be coupled to further receive the adjustment signal C_ca, where block 242 is configured to further adjust the phase change φ_a based on the adjustment value indicated by adjustment signal C_ca. Although some embodiments are not limited in this regard, the phase change φ_b 252 may also be adjusted during operation of the spiking neural network, e.g., where such adjustment is analogous to that of the phase change φ_a.

The phase change φ_a may be limited to some predefined range of allowable values. Such a limit on the phase change φ_a may be implemented, for example, by the parent node (such as node 240). In an embodiment, the phase change φ_a is allowed to vary by an amount equal to or less than π (pi) radians, e.g., where the phase change φ_a is always in the range 0 to π (or, for example, in the range −π/2 to π/2). In some embodiments, such a phase change is allowed to vary by an amount equal to or less than 0.8π radians, for example, equal to or less than 0.7π. Limiting the phase change φ_a to a range such as 0.1π to 0.9π may facilitate a relatively more linear relationship between changes to the phase change φ_a (each based on a respective adjustment signal) and corresponding changes in the probability indicated by likelihood signal L_a. As a result, simultaneous changes in probabilities (each responsive to a respective different child node) may add to each other (or offset each other) more linearly.
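A sketch of the parent-side phase update under such a limit follows; the [0.1π, 0.9π] bounds follow one of the ranges mentioned above, while the update rule itself (summing child adjustments, then clamping) is an assumption.

```python
import math

PHI_MIN, PHI_MAX = 0.1 * math.pi, 0.9 * math.pi  # one allowable range from the text

def update_phase(phi_a, adjustments):
    """Apply the summed adjustments indicated by all child nodes, then clamp so
    the phase change stays inside its predefined range of allowable values."""
    phi_a += sum(adjustments)
    return min(max(phi_a, PHI_MIN), PHI_MAX)

print(update_phase(0.4 * math.pi, [0.03 * math.pi, -0.01 * math.pi]))
```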

Fig. 3 illustrates features of a method 300 for operating a spiking neural network, according to an embodiment. The method 300 may be performed using one of the spiking neural networks 100, 200, for example, where the method 300 includes passing one or more signals between the node 240 and the node 250. To illustrate certain features of various embodiments, the method 300 is described herein with reference to the spiking neural network 400 shown in Fig. 4. However, such description may be extended to apply to any of a variety of other spiking neural networks configured to adjust a phase change applied to a bias current, where, based on that phase change, the membrane voltage of (and/or a likelihood signal from) a node indicates a probability of the condition to which that node corresponds.

The method 300 may include receiving (at 310) a first bias signal and a second bias signal at a first node and a second node, respectively, of a spiking neural network. The first node may correspond to a first condition, where the second node corresponds to a second condition, e.g., one having a dependency relationship with the first condition. Referring now to Fig. 4, a spiking neural network 400 may be configured to perform Bayesian inference operations in accordance with an embodiment. The spiking neural network 400 may include, for example, some or all of the features of the neural network 200. Spiking neural network 400 is an example of an embodiment in which a plurality of nodes are each to receive a respective bias signal, and in which synapses are variously coupled between respective ones of the plurality of nodes, each such synapse corresponding to a respective dependency relationship of the Bayesian network. Resonant coupling of the plurality of nodes may facilitate applying a phase change to a bias signal as a mechanism for determining (e.g., including updating) the probability of a given variable.

In Fig. 4, spiking neural network 400 includes nodes 410, 412, 414, 416, 418 that each correspond to a respective condition (parameter) of a Bayesian network. In the illustrated example embodiment, the Bayesian network includes conditions by which determinations can be made regarding a house or other residential property. For example, node 410 may correspond to a condition G, "garbage can tipped over," e.g., where node 412 corresponds to a condition D, "dog barking," and node 414 corresponds to a condition W, "window broken." Node 416 may correspond to a condition R, "raccoon on the property," e.g., where node 418 corresponds to a condition B, "thief on the property." In such an embodiment, nodes 410, 412 may each be respective child nodes of node 416, e.g., to reflect dependency relationships in which condition R may result in (and may be indicated as resulting in) one or both of condition G or condition D. Alternatively or additionally, nodes 412, 414 may each be respective child nodes of node 418, e.g., to reflect dependency relationships in which condition B may result in (and may be indicated as resulting in) one or both of condition D or condition W. The various parent-child dependency relationships between respective ones of the nodes 410, 412, 414, 416, 418 may be represented by respective synapses variously coupled to communicate likelihood signals 430, 432, 434, 436. In some embodiments, likelihood signals 430, 432 are equal to each other and/or likelihood signals 434, 436 are equal to each other.

Although some embodiments are not limited in this respect, method 300 may further comprise selecting one condition of the Bayesian network in preference to one or more other conditions of the Bayesian network, where the selecting is based on the adjusting at 360. For example, evaluation logic coupled to the spiking neural network (or alternatively, part of the spiking neural network) may receive likelihood signals each from a respective different node of the spiking neural network. In such embodiments, the selecting may include detecting whether one of the received likelihood signals satisfies one or more predefined test criteria. By way of illustration and not limitation, such test criteria (e.g., including a minimum threshold probability value, a minimum threshold difference between two probability values, or the like) may define when the condition corresponding to a given likelihood signal is deemed sufficiently likely (either individually or in relation to the probability of one or more other conditions) to justify selection of that condition. In response to detecting that one of the received likelihood signals satisfies the test criteria, the evaluation logic may generate a signal indicating selection of the condition corresponding to the node that provides that likelihood signal.

Referring again to Fig. 4, one or more additional synaptic signals (such as the illustrative signals 450, 452 shown) may signal to other nodes of the network 400 (or to evaluation circuitry external to the network 400) whether an instance of a particular one of the conditions R, B is indicated, at least at some threshold probability. Evaluation of the signals 450, 452 may detect whether a probability indicated by one of the nodes 416, 418 satisfies one or more test criteria. For example, such a test may detect whether one of P(R) or P(B) is above some minimum threshold probability value. Alternatively or additionally, such a test may detect whether one of the values [P(R) − P(B)] or [P(B) − P(R)] is above some minimum threshold probability difference. In response to such detection, the evaluation circuitry may output or otherwise generate a signal indicating selection of one of the nodes 416, 418 (and thus of the condition to which that node corresponds).
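The kind of test such evaluation circuitry might apply can be sketched as follows; the threshold values and the exact decision rule here are illustrative assumptions, not parameters from the document.

```python
def select_condition(probs, min_prob=0.6, min_margin=0.2):
    """Given e.g. probs = {"R": P(R), "B": P(B)}, select a condition only if its
    probability clears a minimum threshold AND beats every other candidate by a
    minimum margin; otherwise report that no selection is justified yet."""
    best = max(probs, key=probs.get)
    if probs[best] >= min_prob and all(
            probs[best] - p >= min_margin for c, p in probs.items() if c != best):
        return best
    return None

print(select_condition({"R": 0.75, "B": 0.35}))  # -> "R"
print(select_condition({"R": 0.55, "B": 0.50}))  # -> None (criteria not met)
```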

Referring again to method 300, the receiving at 310 may include nodes 410, 412, 414, 416, 418 receiving respective bias signals J_bg, J_bd, J_bw, J_br, J_bb that are each sinusoidal or otherwise periodic. For example, the bias signals J_bg, J_bd, J_bw, J_br, J_bb may all have the same frequency ω. The method 300 may further include applying (at 320) a change to the phase of the first bias signal. For example, the initial configuration of spiking neural network 400 may include or otherwise be based on respective predefined initial values for the probabilities P(G), P(D), P(W), P(R), and P(B) of conditions G, D, W, R, and B. For example, P(R) may be indicated by the rate-amplitude (or other characteristic) of likelihood signal 430 or of the membrane voltage at node 416. The indication of P(R) may be based at least in part on a phase change that node 416 applies to bias signal J_br. Similarly, P(B) may be indicated by a characteristic of likelihood signal 434 or of the membrane voltage at node 418, e.g., where such indication of P(B) is based at least in part on a phase change that node 418 applies to bias signal J_bb. In such embodiments, the indication of P(G) by the membrane voltage of node 410 may be based on a phase change that node 410 applies to bias signal J_bg, e.g., where the indication of P(D) by the membrane voltage of node 412 is based on a phase change that node 412 applies to bias signal J_bd. Similarly, the indication of P(W) by the membrane voltage of node 414 may be based on a phase change that node 414 applies to bias signal J_bw.

The initial configuration of spiking neural network 400 may further include or otherwise be based on conditional probabilities P(G|R), P(D|R), P(D|B), and P(W|B). For example, P(G|R) may be represented or otherwise indicated by the synaptic weight to be applied by node 410 to likelihood signal 430, e.g., where P(D|R) is indicated by the synaptic weight to be applied by node 412 to likelihood signal 432. Similarly, P(D|B) may be indicated by the synaptic weight to be applied by node 412 to likelihood signal 434, e.g., where P(W|B) is indicated by the synaptic weight to be applied by node 414 to likelihood signal 436. Some or all such probabilities may be provided a priori as input parameters based on a predefined Bayesian network design. Operation of the spiking neural network may signal a change in a given probability by adjusting, based on one or more instance signals, a phase change applied to a bias signal.

During the initialization state of spiking neural network 400, the same phase change (if any) may be applied simultaneously to each of J_bg, J_bd, J_bw, J_br, J_bb, e.g., where nodes 410, 412, 414, 416, 418 exhibit some baseline (e.g., in-phase) state of coupled resonance. As a result, P(R) and P(B) may be at least initially equal to each other, P(G), P(D), and P(W) may be at least initially equal to each other, and so on.

In an embodiment, the method 300 further comprises: a third signal (likelihood signal) is passed (at 330) from the first node to the second node in response to a change in phase of the first bias signal. Based on the change, the third signal indicates a likelihood of the first condition. A characteristic of the third signal (such as a rate-amplitude) may indicate a likelihood of the first condition. Referring again to spiking neural network 400, the passing at 330 may include: node 416 communicates one or both of likelihood signals 430, 432, and/or node 418 communicates one or both of likelihood signals 434, 436.

The method 300 may further include receiving (at 340), at the second node, a signal (an instance signal) indicative of an instance of the second condition. By way of illustration and not limitation, instance signals 420, 422, 424 may be communicated to nodes 410, 412, 414 (respectively). Node 410 may be configured to identify a spike pattern, timing, and/or other characteristic of instance signal 420 as indicating an instance of condition G. Similarly, a characteristic of instance signal 422 may indicate an instance of condition D to node 412, and/or a characteristic of instance signal 424 may indicate an instance of condition W to node 414.

In response to an instance of the second condition, the method 300 may pass (at 350) a fourth signal (an adjustment signal) from the second node to the first node, the fourth signal being based on the second bias signal. The fourth signal may be further based on a conditional probability of the second condition given the first condition. For example, the third signal may be communicated via a synapse coupled between the first node and the second node, where the weight assigned to the synapse indicates the conditional probability of the second condition given the first condition. In some embodiments, the fourth signal is based on a ratio of the conditional probability to a probability of the second condition.

Referring again to spiking neural network 400, the passing at 350 may include: node 410 transmits adjustment signal 440, node 412 transmits one or both of adjustment signals 442, 444, and/or node 414 transmits adjustment signal 446. The generation of some or all of the adjustment signals 440, 442, 444, 446 may include operations such as those performed at 350, for example.

Based on the fourth signal, the method 300 may adjust (at 360) an amount of change in the phase of the first bias signal. For example, node 416 may, based on one or both of adjustment signals 440, 442, adjust the amount of phase change applied to bias signal J_br. Alternatively or additionally, node 418 may, based on one or both of adjustment signals 444, 446, adjust the amount of phase change applied to bias signal J_bb. In response to such adjustment of a phase change, a characteristic of one or more of likelihood signals 430, 432, 434, 436 may be altered, thereby indicating a change in the probability of condition R or in the probability of condition B.

Although some embodiments are not limited in this regard, one or more other nodes (other than the second node) may each act as a child of the first node. In such embodiments, the method 300 may further comprise: receiving a third bias signal at a third node corresponding to a third condition; and passing the third signal from the first node to the third node. For example, nodes 416, 410, 412 of spiking neural network 400 may be the first node, the second node, and the third node (respectively), e.g., where the third signal comprises both of likelihood signals 430, 432. In such embodiments, the method 300 may further comprise receiving, at the third node, a signal (e.g., instance signal 422) indicative of an instance of the third condition. In response to the instance of the third condition, the third node may communicate a fifth signal (e.g., adjustment signal 442) to the first node, the fifth signal based on the third bias signal (and on any phase change applied to the third bias signal). The amount of change in the phase of the first bias signal may then be further adjusted based on the fifth signal. In such embodiments, adjusting the amount of change in the phase of the first bias signal based on the fifth signal may be performed concurrently with the adjusting at operation 360.

Alternatively or additionally, one or more other nodes (other than the first node) may each act as a parent node of the second node. In such embodiments, the method 300 may further comprise: receiving a third bias signal at a third node corresponding to a third condition; and applying a change to the phase of the third bias signal. For example, nodes 416, 412, 418 of spiking neural network 400 may be the first node, the second node, and the third node (respectively). In such embodiments, the method 300 may further pass a fifth signal (e.g., likelihood signal 434) from the third node to the second node, wherein the fifth signal indicates a likelihood of the third condition (such as condition B) based on the change in phase of the third bias signal (e.g., Jbb). In response to the instance of the second condition, the second node may communicate a sixth signal (e.g., adjustment signal 444) to the third node, the sixth signal based on the second bias signal. Based on the sixth signal, the third node may adjust the amount of change in the phase applied to the third bias signal.

In some embodiments, the second node also acts as a parent node of a third node corresponding to a third condition. In such embodiments, the method 300 may further comprise: receiving a third bias signal at the third node; and applying a change to the phase of the second bias signal. Similar to the communicating at 330, the second node may communicate a fifth signal to the third node, wherein the fifth signal indicates a likelihood of the second condition based on the change in phase of the second bias signal. Similar to the receiving at 340 and the communicating at 350, the third node may receive a signal indicative of an instance of the third condition and, in response to that instance, may communicate a sixth signal to the second node, the sixth signal based on the third bias signal. Based on the sixth signal, the second node may adjust the amount of change in the phase applied to the second bias signal.

Fig. 5 shows timing diagrams 500, 510, 520, 530, each illustrating a respective signal characteristic with respect to a time axis 505 during operations to perform Bayesian inference according to an embodiment. The signal characteristics may be ones exhibited at one of spiking neural networks 100, 200, 400, e.g., based on respective signals variously communicated with nodes 240, 250. The scales (e.g., for timing, voltage, and frequency) shown in timing diagrams 500, 510, 520, 530 are merely illustrative of some embodiments and may vary depending on implementation-specific details.

Timing diagram 500 illustrates spiking produced by a membrane voltage Vm 502 at a parent node, such as one of nodes 210, 240, 416, 418. The same spiking (or otherwise corresponding spiking) may be exhibited by a likelihood signal that the parent node generates based on Vm 502. The spiking may have at least some periodic component, where the amplitude, wavelength, and/or other characteristics of the periodic component indicate the probability of the condition corresponding to the parent node. For example, as illustrated by timing diagram 510, a frequency 512 of the spiking produced by membrane voltage Vm 502 may exhibit an oscillating component. Frequency 512 may be a moving average of spikes occurring within a given time window.
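The moving-average computation can be sketched as follows; the spike times, window length, and function name `moving_rate` are illustrative assumptions:

```python
# Sketch of the moving-average rate of timing diagram 510: frequency 512
# is taken as the number of spikes inside a sliding window.
def moving_rate(spike_times, t: float, window: float = 1.0) -> float:
    """Spikes per unit time within the window (t - window, t]."""
    count = sum(1 for s in spike_times if t - window < s <= t)
    return count / window

spikes = [0.1, 0.3, 0.5, 0.6, 0.9, 1.4, 1.8, 2.0, 2.1, 2.2]
print(moving_rate(spikes, 1.0))   # 5.0 over the window ending at t = 1
print(moving_rate(spikes, 2.0))   # 3.0 over the window ending at t = 2
```

Sampling this rate over time yields the oscillation whose amplitude, as described next, encodes the probability.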

Timing diagram 520 illustrates an amplitude 522 of the oscillation exhibited by frequency 512 (where amplitude 522 is the rate-amplitude of Vm 502). As illustrated by timing diagram 520, amplitude 522 may have a first value (e.g., 4 kHz) during a first time period, which includes, for example, the illustrated period between time 0 and time 6. The first value of amplitude 522 may be based on a corresponding amount of phase change applied to a bias signal at the parent node.

Likelihood signals based on Vm 502 may be passed from the parent node to a child node. The child node may then detect an instance of another condition (the condition corresponding to the child node) based on an instance signal provided to the child node. In response to such an instance, the child node may signal the parent node to adjust the amount of phase change applied to the bias signal. For example, as shown in timing diagram 530, the child node may communicate a spike pattern 534 of an adjustment signal Cb 532. In response to spike pattern 534, the value of amplitude 522 may change. In the example scenario shown, the value of amplitude 522 increases to a second value (e.g., 13 kHz) after the time of spike pattern 534. This changed value of amplitude 522 may indicate the current probability of the condition corresponding to the parent node.

FIG. 6 illustrates a computing device 600 according to one embodiment. The computing device 600 houses a board 602. The board 602 may include a number of components, including but not limited to a processor 604 and at least one communication chip 606. Processor 604 is physically and electrically coupled to board 602. In some implementations, at least one communication chip 606 is also physically and electrically coupled to the board 602. In a further implementation, the communication chip 606 is part of the processor 604.

Depending on its applications, computing device 600 may include other components that may or may not be physically and electrically coupled to board 602. These other components include, but are not limited to, volatile memory (e.g., DRAM), non-volatile memory (e.g., ROM), flash memory, a graphics processor, a digital signal processor, a crypto processor, a chipset, an antenna, a display, a touchscreen controller, a battery, an audio codec, a video codec, a power amplifier, a Global Positioning System (GPS) device, a compass, an accelerometer, a gyroscope, a speaker, a camera, and a mass storage device (such as a hard disk drive, a Compact Disc (CD), a Digital Versatile Disc (DVD), and so forth).

The communication chip 606 enables wireless communication for the transfer of data to and from the computing device 600. The term "wireless" and its derivatives may be used to describe circuits, devices, systems, methods, techniques, communication channels, etc., that may communicate data through the use of modulated electromagnetic radiation through a non-solid medium. The term does not imply that the associated devices do not contain any wires, although in some embodiments they might not. The communication chip 606 may implement any of a number of wireless standards or protocols, including but not limited to Wi-Fi (the IEEE 802.11 family), WiMAX (the IEEE 802.16 family), IEEE 802.20, long term evolution (LTE), Ev-DO, HSPA+, HSDPA+, HSUPA+, EDGE, GSM, GPRS, CDMA, TDMA, DECT, Bluetooth, derivatives thereof, as well as any other wireless protocols that are designated as 3G, 4G, 5G, and beyond. The computing device 600 may include a plurality of communication chips 606. For instance, a first communication chip 606 may be dedicated to shorter-range wireless communications such as Wi-Fi and Bluetooth, and a second communication chip 606 may be dedicated to longer-range wireless communications such as GPS, EDGE, GPRS, CDMA, WiMAX, LTE, Ev-DO, and others.

Processor 604 of computing device 600 includes an integrated circuit die packaged within processor 604. The term "processor" may refer to any device or portion of a device that processes electronic data from registers and/or memory to transform that electronic data into other electronic data that may be stored in registers and/or memory. The communication chip 606 also includes an integrated circuit die packaged within the communication chip 606.

In various implementations, the computing device 600 may be a laptop, a netbook, a notebook, an ultrabook, a smartphone, a tablet, a personal digital assistant (PDA), an ultra mobile PC, a mobile phone, a desktop computer, a server, a printer, a scanner, a monitor, a set-top box, an entertainment control unit, a digital camera, a portable music player, or a digital video recorder. In further implementations, the computing device 600 may be any other electronic device that processes data.

Some embodiments may be provided as a computer program product or software which may include a machine-readable medium having stored thereon instructions which may be used to program a computer system (or other electronic devices) to perform a process according to the embodiments. A machine-readable medium includes any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer). For example, a machine-readable (e.g., computer-readable) medium includes a machine (e.g., computer) readable storage medium (e.g., read only memory ("ROM"), random access memory ("RAM"), magnetic disk storage media, optical storage media, flash memory devices, etc.), a machine (e.g., computer) readable transmission medium (e.g., electrical, optical, acoustical or other form of propagated signals (e.g., infrared signals, digital signals, etc.)), and the like.

Figure 7 illustrates a diagrammatic representation of a machine in the exemplary form of a computer system 700 within which a set of instructions may be executed for causing the machine to perform any one or more of the methodologies described herein. In alternative embodiments, the machine may be connected (e.g., networked) to other machines in a local area network (LAN), an intranet, an extranet, or the Internet.

The exemplary computer system 700 includes a processor 702, a main memory 704 (e.g., read-only memory (ROM), flash memory, dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM) or Rambus DRAM (RDRAM), etc.), a static memory 706 (e.g., flash memory, static random access memory (SRAM), etc.), and a secondary memory 718 (e.g., a data storage device), which communicate with each other via a bus 730.

The processor 702 represents one or more general-purpose processing devices such as a microprocessor, central processing unit, or the like. More particularly, the processor 702 may be a complex instruction set computing (CISC) microprocessor, a reduced instruction set computing (RISC) microprocessor, a very long instruction word (VLIW) microprocessor, a processor implementing other instruction sets, or a processor implementing a combination of instruction sets. The processor 702 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), a network processor, or the like. The processor 702 is configured to execute the processing logic 726 for performing the operations described herein.

The computer system 700 may further include a network interface device 708. The computer system 700 may also include a video display unit 710 (e.g., a liquid crystal display (LCD), a light emitting diode (LED) display, or a cathode ray tube (CRT)), an alphanumeric input device 712 (e.g., a keyboard), a cursor control device 714 (e.g., a mouse), and a signal generation device 716 (e.g., a speaker).

Secondary memory 718 may include a machine-accessible storage medium (or more specifically, a computer-readable storage medium) 732 having stored thereon one or more sets of instructions (e.g., software 722) embodying any one or more of the methodologies or functions described herein. The software 722 may also reside, completely or at least partially, within the main memory 704 and/or within the processor 702 during execution thereof by the computer system 700; the main memory 704 and the processor 702 also constitute machine-readable storage media. The software 722 may further be transmitted or received over a network 720 via the network interface device 708.

While the machine-accessible storage medium 732 is shown in an exemplary embodiment to be a single medium, the term "machine-readable storage medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term "machine-readable storage medium" shall also be taken to include any medium that is capable of storing or encoding a set of instructions for execution by the machine and that cause the machine to perform any one or more of the embodiments. The term "machine-readable storage medium" shall accordingly be taken to include, but not be limited to, solid-state memories, and optical and magnetic media.

Example 1 is a computer device for generating Bayesian inferences with a spiking neural network, the computer device comprising circuitry for performing the steps of: receiving a first bias signal at a first node of the spiking neural network, wherein the first node corresponds to a first condition of a Bayesian network; receiving a second bias signal at a second node of the spiking neural network, wherein the second node corresponds to a second condition of the Bayesian network; and applying a change to a phase of the first bias signal. The circuitry further performs: passing a third signal from the first node to the second node in response to the change in phase of the first bias signal, wherein the third signal indicates a likelihood of the first condition; receiving, at the second node, a signal indicative of an instance of the second condition; in response to the instance of the second condition, passing a fourth signal from the second node to the first node, the fourth signal based on the second bias signal; and adjusting an amount of change in the phase of the first bias signal based on the fourth signal.

In example 2, the subject matter of example 1 optionally includes: wherein the spike rate of the third signal varies over time based on a change in the phase of the first bias signal, wherein a magnitude of the change caused by the spike rate of the third signal indicates a likelihood of the first condition.

In example 3, the subject matter of any one or more of examples 1 and 2 optionally includes: wherein the fourth signal is further based on a conditional probability of the second condition given the first condition.

In example 4, the subject matter of example 3 optionally includes: wherein the third signal is communicated via a synapse coupled between the first node and the second node, wherein a weight assigned to the synapse indicates a conditional probability of the second condition given the first condition.

In example 5, the subject matter of example 3 optionally includes: wherein the fourth signal is based on a ratio of the conditional probability to the probability of the second condition.

In example 6, the subject matter of any one or more of examples 1 and 2 optionally includes: the computer device further includes circuitry for applying another change to the phase of the second bias signal, wherein the fourth signal is further based on the change in the phase of the second bias signal.

In example 7, the subject matter of any one or more of examples 1 and 2 optionally includes: the computer device further comprising circuitry for performing the steps of: receiving a third bias signal at a third node of the spiking neural network, the third node corresponding to a third condition of the Bayesian network; passing the third signal from the first node to the third node; receiving, at the third node, a signal indicative of an instance of the third condition; in response to the instance of the third condition, communicating a fifth signal from the third node to the first node, the fifth signal based on the third bias signal; and further adjusting the amount of change in the phase of the first bias signal based on the fifth signal.

In example 8, the subject matter of example 7 optionally includes: wherein the circuitry is to adjust the amount of change in the phase of the first bias signal based on the fifth signal concurrently with adjusting it based on the fourth signal.

In example 9, the subject matter of any one or more of examples 1 and 2 optionally includes: the computer device further comprising circuitry for performing the steps of: receiving a third bias signal at a third node of the spiking neural network, the third node corresponding to a third condition of the Bayesian network; applying a change to the phase of the third bias signal; communicating a fifth signal from the third node to the second node, wherein the fifth signal indicates a likelihood of the third condition based on the change in phase of the third bias signal; in response to the instance of the second condition, passing a sixth signal from the second node to the third node, the sixth signal based on the second bias signal; and adjusting the amount of change in the phase of the third bias signal based on the sixth signal.

In example 10, the subject matter of any one or more of examples 1 and 2 optionally includes: the computer device further comprising circuitry for performing the steps of: receiving a third bias signal at a third node of the spiking neural network, the third node corresponding to a third condition of the Bayesian network; applying a change to the phase of the second bias signal; passing a fifth signal from the second node to the third node, wherein the fifth signal indicates a likelihood of the second condition based on the change in phase of the second bias signal; receiving, at the third node, a signal indicative of an instance of the third condition; in response to the instance of the third condition, passing a sixth signal from the third node to the second node, the sixth signal based on the third bias signal; and adjusting the amount of change in the phase of the second bias signal based on the sixth signal.

In example 11, the subject matter of any one or more of examples 1 and 2 optionally includes: the computer device further comprises circuitry for performing the steps of: based on the adjusted amount of change, one condition of the bayesian network is selected over one or more other conditions of the bayesian network.

Example 12 is at least one non-transitory machine-readable medium comprising instructions that, when executed by a machine, cause the machine to perform operations for generating bayesian inferences using a spiking neural network, the operations comprising: receiving a first bias signal at a first node of a spiking neural network, wherein the first node corresponds to a first condition of a Bayesian network; receiving a second bias signal at a second node of the spiking neural network, wherein the second node corresponds to a second condition of the bayesian network; applying a change to a phase of the first bias signal; and communicating a third signal from the first node to the second node in response to a change in phase of the first bias signal, wherein the third signal indicates a likelihood of the first condition. The operations further include: receiving, at a second node, a signal indicative of an instance of a second condition; in response to an instance of a second condition, passing a fourth signal from the second node to the first node, the fourth signal based on a second bias signal; and adjusting the amount of change in the phase of the first bias signal based on the fourth signal.

In example 13, the subject matter of example 12 optionally includes: wherein the spike rate of the third signal varies over time based on a change in the phase of the first bias signal, wherein a magnitude of the change caused by the spike rate of the third signal indicates a likelihood of the first condition.

In example 14, the subject matter of any one or more of example 12 and example 13 optionally includes: wherein the fourth signal is further based on a conditional probability of the second condition given the first condition.

In example 15, the subject matter of example 14 optionally includes: wherein the third signal is communicated via a synapse coupled between the first node and the second node, wherein a weight assigned to the synapse indicates a conditional probability of the second condition given the first condition.

In example 16, the subject matter of example 14 optionally includes: wherein the fourth signal is based on a ratio of the conditional probability to the probability of the second condition.

In example 17, the subject matter of any one or more of example 12 and example 13 optionally includes: the operations further comprise: applying another change to the phase of the second bias signal, wherein the fourth signal is further based on the change in the phase of the second bias signal.

In example 18, the subject matter of any one or more of example 12 and example 13 optionally includes: the operations further comprise: receiving a third bias signal at a third node of the spiking neural network, the third node corresponding to a third condition of the Bayesian network; passing the third signal from the first node to the third node; receiving, at the third node, a signal indicative of an instance of the third condition; in response to the instance of the third condition, communicating a fifth signal from the third node to the first node, the fifth signal based on the third bias signal; and further adjusting the amount of change in the phase of the first bias signal based on the fifth signal.

In example 19, the subject matter of example 18 optionally includes: wherein the amount of change in the phase of the first bias signal is adjusted based on the fifth signal concurrently with being adjusted based on the fourth signal.

In example 20, the subject matter of any one or more of example 12 and example 13 optionally includes: the operations further comprise: receiving a third bias signal at a third node of the spiking neural network, the third node corresponding to a third condition of the Bayesian network; applying a change to the phase of the third bias signal; communicating a fifth signal from the third node to the second node, wherein the fifth signal indicates a likelihood of the third condition based on the change in phase of the third bias signal; in response to the instance of the second condition, passing a sixth signal from the second node to the third node, the sixth signal based on the second bias signal; and adjusting the amount of change in the phase of the third bias signal based on the sixth signal.

In example 21, the subject matter of any one or more of example 12 and example 13 optionally includes: the operations further comprise: receiving a third bias signal at a third node of the spiking neural network, the third node corresponding to a third condition of the Bayesian network; applying a change to the phase of the second bias signal; passing a fifth signal from the second node to the third node, wherein the fifth signal indicates a likelihood of the second condition based on the change in phase of the second bias signal; receiving, at the third node, a signal indicative of an instance of the third condition; in response to the instance of the third condition, passing a sixth signal from the third node to the second node, the sixth signal based on the third bias signal; and adjusting the amount of change in the phase of the second bias signal based on the sixth signal.

In example 22, the subject matter of any one or more of example 12 and example 13 optionally includes: the operations further comprise: based on the adjusted amount of change, one condition of the bayesian network is selected in preference to one or more other conditions of the bayesian network.

Example 23 is a method for generating bayesian inferences using a spiking neural network, the method comprising: receiving a first bias signal at a first node of a spiking neural network, wherein the first node corresponds to a first condition of a Bayesian network; receiving a second bias signal at a second node of the spiking neural network, wherein the second node corresponds to a second condition of the bayesian network; applying a change to a phase of the first bias signal; and communicating a third signal from the first node to the second node in response to a change in phase of the first bias signal, wherein the third signal indicates a likelihood of the first condition. The method further comprises the following steps: receiving, at a second node, a signal indicative of an instance of a second condition; in response to an instance of a second condition, passing a fourth signal from the second node to the first node, the fourth signal based on a second bias signal; and adjusting the amount of change in the phase of the first bias signal based on the fourth signal.

In example 24, the subject matter of example 23 optionally includes: wherein the spike rate of the third signal varies over time based on a change in the phase of the first bias signal, wherein a magnitude of the change caused by the spike rate of the third signal indicates a likelihood of the first condition.

In example 25, the subject matter of any one or more of example 23 and example 24 optionally includes: wherein the fourth signal is further based on a conditional probability of the second condition given the first condition.

In example 26, the subject matter of example 25 optionally includes: wherein the third signal is communicated via a synapse coupled between the first node and the second node, wherein a weight assigned to the synapse indicates a conditional probability of the second condition given the first condition.

In example 27, the subject matter of example 25 optionally includes: wherein the fourth signal is based on a ratio of the conditional probability to the probability of the second condition.

In example 28, the subject matter of any one or more of example 23 and example 24 optionally includes: the method further comprises the following steps: applying another change to the phase of the second bias signal, wherein the fourth signal is further based on the change in the phase of the second bias signal.

In example 29, the subject matter of any one or more of example 23 and example 24 optionally includes: the method further comprising: receiving a third bias signal at a third node of the spiking neural network, the third node corresponding to a third condition of the Bayesian network; passing the third signal from the first node to the third node; receiving, at the third node, a signal indicative of an instance of the third condition; in response to the instance of the third condition, communicating a fifth signal from the third node to the first node, the fifth signal based on the third bias signal; and further adjusting the amount of change in the phase of the first bias signal based on the fifth signal.

In example 30, the subject matter of example 29 optionally includes: wherein the amount of change in the phase of the first bias signal is adjusted based on the fifth signal concurrently with being adjusted based on the fourth signal.

In example 31, the subject matter of any one or more of example 23 and example 24 optionally includes: the method further comprising: receiving a third bias signal at a third node of the spiking neural network, the third node corresponding to a third condition of the Bayesian network; applying a change to the phase of the third bias signal; communicating a fifth signal from the third node to the second node, wherein the fifth signal indicates a likelihood of the third condition based on the change in phase of the third bias signal; in response to the instance of the second condition, passing a sixth signal from the second node to the third node, the sixth signal based on the second bias signal; and adjusting the amount of change in the phase of the third bias signal based on the sixth signal.

In example 32, the subject matter of any one or more of example 23 and example 24 optionally includes: the method further comprising: receiving a third bias signal at a third node of the spiking neural network, the third node corresponding to a third condition of the Bayesian network; applying a change to the phase of the second bias signal; passing a fifth signal from the second node to the third node, wherein the fifth signal indicates a likelihood of the second condition based on the change in phase of the second bias signal; receiving, at the third node, a signal indicative of an instance of the third condition; in response to the instance of the third condition, passing a sixth signal from the third node to the second node, the sixth signal based on the third bias signal; and adjusting the amount of change in the phase of the second bias signal based on the sixth signal.

In example 33, the subject matter of any one or more of example 23 and example 24 optionally includes: the method further comprises the following steps: based on the adjusted amount of change, one condition of the bayesian network is selected in preference to one or more other conditions of the bayesian network.

Techniques and architectures for providing the functionality of spiking neural networks are described herein. In the description above, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of certain embodiments. It will be apparent, however, to one skilled in the art that certain embodiments may be practiced without these specific details. In other instances, structures and devices are shown in block diagram form in order to avoid obscuring the description.

Reference in the specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment.

Some portions of the detailed description herein are presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the computer arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of steps leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.

It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the discussion herein, it is appreciated that throughout the description, discussions utilizing terms such as "processing" or "computing" or "calculating" or "determining" or "displaying" or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.

Certain embodiments also relate to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer-readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magneto-optical disks, read-only memories (ROMs), random access memories (RAMs) such as dynamic RAMs (DRAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions, and coupled to a computer system bus.

The algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these systems will appear from the description herein. Moreover, some embodiments are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of such embodiments as described herein.

In addition to what is described herein, various modifications may be made to the disclosed embodiments and implementations thereof without departing from their scope. Accordingly, the specification and examples herein should be considered as illustrative and not restrictive. The scope of the invention should be determined only by reference to the claims that follow.
