Dynamic control of communication connections of computing devices based on detected events

Document No.: 91098 — Publication date: 2021-10-08

Note: This technology, "Dynamic control of communication connections of computing devices based on detected events," was designed and created by 朱渝, 崔维又, 王斌, 朱婴娜, and 张德军 on 2019-11-26. Abstract: The technology disclosed herein provides improvements over existing systems by dynamically controlling a communication connection between two or more computing devices based on the detection of a particular event. Instead of requiring a device (such as an IoT device) to be continuously connected to a network or remote device, the techniques disclosed herein enable the device to remain offline until a particular event is detected. The disclosed techniques may avoid always-on network configurations while providing the desired functionality for IoT devices. For example, some devices connect to other computing devices or to a network only in response to detection of an event, and then disconnect after exchanging certain information. Such techniques can significantly reduce the power consumption of a device by invoking a connection only at the appropriate time.

1. A method, comprising:

receiving sensor data from an accelerometer, the sensor data including acceleration information of a vehicle, the vehicle including a first computing device;

detecting, by the first computing device, using at least the acceleration information, that a magnitude of acceleration of the first computing device exceeds a threshold magnitude of acceleration in a predetermined direction;

in response to detecting that the magnitude of acceleration exceeds the threshold magnitude of acceleration in the predetermined direction, initiating a data connection between the first computing device and a second computing device;

receiving, by the first computing device, at least one of timing information or location information from the second computing device; and

terminating the data connection between the first computing device and the second computing device.

2. The method of claim 1, further comprising:

detecting, by the first computing device, a magnitude of acceleration of the first computing device in a second predetermined direction using at least the acceleration information,

wherein the data connection between the first computing device and the second computing device is initiated in response to determining that the magnitude of acceleration of the first computing device in the second predetermined direction is less than the magnitude of acceleration of the first computing device in the predetermined direction.

3. The method of claim 1, further comprising:

detecting, by the first computing device, a magnitude of acceleration of the first computing device in a second predetermined direction using at least the acceleration information,

wherein the data connection between the first computing device and the second computing device is initiated in response to:

detecting that the magnitude of acceleration of the first computing device exceeds the threshold magnitude of acceleration in the predetermined direction, and

determining that the magnitude of acceleration of the first computing device in the predetermined direction is greater than the magnitude of acceleration of the first computing device in the second predetermined direction by a predetermined difference.

4. The method of claim 1, wherein the timing information comprises a time and date when the magnitude of acceleration of the first computing device exceeds the threshold magnitude of acceleration in the predetermined direction.

5. The method of claim 1, wherein the location information comprises a set of coordinates of the first computing device when the magnitude of acceleration of the first computing device exceeds the threshold magnitude of acceleration in the predetermined direction.

6. The method of claim 1, wherein the data connection between the first computing device and the second computing device is terminated in response to completing communication of at least one of: the timing information, the location information, or the acceleration information.

7. The method of claim 1, wherein the first computing device comprises a dashboard camera device, the method further comprising adding to video data one or more of: the received acceleration information, the received timing information, or the received location information, the video data defining a video clip generated by the dashboard camera device after the detecting.

8. The method of claim 1, wherein initiating the data connection comprises initiating a wireless pairing of the first computing device with an electronic wristband device, a smart watch, or a smartphone.

9. A method, comprising:

receiving video data from an imaging sensor, the video data representing a plurality of scenes corresponding to surroundings of a first vehicle;

analyzing the video data to identify a second vehicle in at least one scene of the plurality of scenes;

analyzing the video data to determine an acceleration of the second vehicle;

detecting, by a first computing device, that a magnitude of acceleration of the second vehicle exceeds a threshold magnitude of acceleration in a predetermined direction;

in response to detecting that the magnitude of acceleration of the second vehicle exceeds the threshold magnitude of acceleration in the predetermined direction, initiating a data connection between the first computing device and a second computing device;

receiving, by the first computing device, at least one of timing information or location information; and

terminating the data connection between the first computing device and the second computing device.

10. The method of claim 9, wherein the video data defines a plurality of image frames at a defined frame rate, each image frame of the plurality of image frames corresponding to a scene of the plurality of scenes, and wherein analyzing the video data to determine the acceleration of the second vehicle comprises:

determining a series of positions of the second vehicle in respective consecutive image frames of the plurality of image frames; and

generating an estimate of the acceleration using the series of positions and the defined frame rate.

11. The method of claim 9, wherein the timing information includes a time and date when the magnitude of acceleration of the second vehicle exceeds the threshold magnitude of acceleration in the predetermined direction.

12. The method of claim 9, wherein the location information includes a set of coordinates of the first computing device when the magnitude of acceleration of the second vehicle exceeds the threshold magnitude in the predetermined direction.

13. The method of claim 9, wherein the first computing device comprises a dashboard camera device, the method further comprising adding to the second video data one or more of: the received acceleration information, the received timing information, or the received location information, the second video data defining a video clip generated by the dashboard camera device after the detecting.

14. A computing device, comprising:

one or more processors; and

a memory in communication with the one or more processors, the memory having computer-executable instructions stored thereon that, when executed by the one or more processors, cause the computing device to perform operations comprising:

receiving sensor data indicative of a measure of a physical property of an object;

using the sensor data to detect an occurrence of a defined event, the defined event comprising a change in the physical property of the object, wherein the change satisfies one or more criteria;

in response to detecting that the physical property of the object satisfies the one or more criteria, initiating a data connection between the computing device and a second computing device;

receiving, from the second computing device, at least one of timing information or location information corresponding to the occurrence of the defined event; and

terminating the data connection between the computing device and the second computing device.

15. The computing device of claim 14, wherein the physical property comprises one of acceleration, linear velocity, angular velocity, position, temperature, or pressure, and wherein the defined event comprises an accident involving a vehicle.

Background

Some computing devices, such as Internet of Things (IoT) devices, have become commonplace in many different industries. For example, IoT devices are used in home appliances, warehouse tracking systems, the trucking industry, and the like. Despite the benefits provided by IoT devices, current systems have a number of drawbacks. For example, power consumption has been a constant concern for IoT devices. Some IoT devices are designed to be powered on at all times, while others have a low-power state intended to reduce power consumption. However, even designs with low-power states are still inefficient when a device requires network access. In view of such inefficiencies, there is a continuing need to improve the manner in which such devices operate and consume power.

Some IoT devices rely on remote computing devices, such as cloud-based platforms, for certain types of operations. For example, the process for provisioning data to the IoT device may be triggered by a cloud-based system. In another example, some cloud-based platforms may grant permissions to requesting IoT devices. Cloud-based systems are suitable for large-scale applications, which may include IoT devices deployed at fixed locations. However, such a configuration is not optimal for all usage scenarios, especially when usage scenarios require greater mobility of the IoT devices.

Generally, when IoT devices are used in mobile usage scenarios, maintaining communication connections with other devices and services can be expensive from a cost perspective and a power consumption perspective. In most cases, IoT devices deployed for mobile usage scenarios require continuous network connectivity. In some cases, the cost of maintaining a continuous network connection may prevent the use of IoT devices in some usage scenarios. Further, regardless of cost, the amount of computing resources necessary to maintain a continuous network connection may be prohibitive.

It is with respect to these and other technical challenges that the disclosure made herein is presented.

Disclosure of Invention

The technology disclosed herein provides improvements over existing systems by dynamically controlling a communication connection between two or more computing devices based on the detection of a particular event. Instead of requiring a device (such as an IoT device) to be continuously connected to a network or remote device, the techniques disclosed herein enable the device to remain offline until a particular event is detected. In response to detection of an event, the device may initiate a connection for the purpose of communicating data related to the event. Once the devices exchange data related to the event, the devices may terminate the connection for the purpose of conserving power and reducing the use of other computing resources.

In one illustrative example, an IoT device is configured to establish an active communication session in response to detection of an event. This configuration provides a mechanism with improved power-consumption efficiency by avoiding the need for a continuous network connection. Additionally, this configuration may reduce the attack surface exposed to remote threats by providing network connectivity for only a limited period of time. In response to detection of a predetermined event, a connection between the local device and the remote device may be established. The predetermined event may include, for example, a collision, a mechanical failure, or a change in an environmental metric. A collision may be detected by using an accelerometer in communication with the local device and/or by analyzing video data to determine the acceleration of a physical object. A mechanical fault may be detected by using any sensor that can detect movement of an object in a predetermined direction or a sensor that can measure an environmental property exceeding a threshold. In another example, the event may include detecting a particular object having a particular physical property, such as a particular facial feature of a person, predetermined text on a license plate, custom paint or wrapping on a vehicle, a particular type of damage, and so forth.

Once an event is detected, the device can control when a connection is initiated and when it is terminated. Additionally, the system may control the type of information exchanged based on the details of the detected event. For example, if a collision is detected between two vehicles, the IoT camera may automatically establish a connection with the remote device and retrieve time and date information. The time and date information may be used to generate a record of the event with an accurate time stamp as well as other data, such as video data and/or location data. Additionally, the IoT camera may automatically send basic information about the event, such as license plate information and video and audio data related to the event. Other types of specific data (such as location information and accelerometer measurements) may be communicated based on the severity and/or other attributes of the detected event. Then, upon determining that the selected information has been transmitted, the IoT camera may automatically terminate the connection to save power and other resources. Control of the connection may also reduce exposure of the device to unsecured external access.
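For illustration only, the connect-exchange-disconnect cycle described above can be sketched in Python. The event fields, severity cutoff, and `transport` interface below are hypothetical placeholders, not part of the disclosed system.

```python
SEVERITY_FULL_PAYLOAD = 0.8  # hypothetical severity score cutoff


def handle_detected_event(event, transport, camera):
    """Illustrative connect-exchange-disconnect cycle for one detected event."""
    transport.connect()  # the connection exists only for this exchange
    try:
        # Retrieve trusted time and date to timestamp the local record.
        timestamp = transport.request("time_and_date")
        record = {
            "timestamp": timestamp,
            "license_plate": event.plate,
            "video": camera.clip_around(event.time),
        }
        # Severity-gated extras, per the description above.
        if event.severity >= SEVERITY_FULL_PAYLOAD:
            record["location"] = transport.request("location")
            record["acceleration"] = event.accel_trace
        transport.send(record)
    finally:
        # Terminate to save power and reduce exposure to external access.
        transport.disconnect()
```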

In another example, if a mechanical failure is detected, the IoT device may establish a connection with a remote device and transmit specific information about the mechanical failure. For example, the IoT device may be programmed to detect specific sounds that indicate a fault (such as a flat tire, a broken fan belt, etc.). Once such an event is detected, the system may establish a connection with the remote device and communicate specific information regarding the severity and/or other conditions associated with the event. Once the selected information is communicated, the IoT device may terminate the connection for the purpose of saving power and reducing exposure of the device to unsecured external access.

The techniques of this disclosure are not limited to a particular IoT device or a particular computing device. The disclosed technology is applicable to any group of computing devices in which one of the computing devices is a trusted device configured to provide reliable data and metadata to another of the computing devices. Any two computing devices configured to wirelessly transmit and receive information, including a network device, may participate: either device may cause a connection to be established between the two devices in response to an event and cause the connection to be terminated upon completion of communication of a particular type of data related to the event.

It should be noted that the above-described subject matter may be implemented as a computer-controlled apparatus, a computer-implemented method, a computing device, or as an article of manufacture such as a computer-readable storage medium. These and various other features will become apparent from a reading of the following detailed description and a review of the associated drawings.

This summary is provided to introduce a brief description of some aspects of the disclosed technology in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended that this summary be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.

Drawings

FIG. 1A presents an example of a traffic scene involving two vehicles in accordance with one or more embodiments of the present disclosure.

Fig. 1B presents an example of a collision following the example traffic scenario shown in fig. 1A in accordance with one or more embodiments of the present disclosure.

Fig. 2 presents an example of a time dependence of acceleration magnitude of a vehicle in the example traffic scenario shown in fig. 1A, in accordance with one or more embodiments of the present disclosure.

Fig. 3 presents an example of data flow and operations that may be implemented in a system of devices present in a vehicle involved in a defined event, in accordance with one or more embodiments of the present disclosure.

FIG. 4A presents an example of a dashboard camera that can be utilized with the disclosed technology in accordance with one or more embodiments.

FIG. 4B presents another example of a dashboard camera that can be utilized with the disclosed technology in accordance with one or more embodiments.

Fig. 5A presents another example of data flow and operations that may be implemented in a system of devices present in a vehicle involved in a defined event, in accordance with one or more embodiments of the present disclosure.

FIG. 5B presents an example of a computing device that can be utilized with the disclosed technology in accordance with one or more embodiments.

Fig. 5C presents yet another example of data flow and operations that may be implemented in a system of devices present in a vehicle involved in a defined event, in accordance with one or more embodiments of the present disclosure.

Fig. 6A presents an example of a traffic scenario involving three vehicles in accordance with one or more embodiments of the present disclosure. One of the vehicles has a computing device that may include an imaging sensor and an accelerometer or another type of inertial sensor.

Fig. 6B presents an example of a collision following the example traffic scenario shown in fig. 6A in accordance with one or more embodiments of the present disclosure.

Fig. 6C presents an example of a field of view of a computing device included in a vehicle facing a collision between two other vehicles in accordance with one or more embodiments of the present disclosure.

Fig. 7 presents an example of an image of a collision included in a video clip generated by a computing device included in a vehicle positioned a distance from the collision, in accordance with one or more embodiments of the present disclosure.

FIG. 8 presents an example of a method for dynamically controlling a communication connection of a computing device based on a detected event in accordance with one or more embodiments of the present disclosure.

FIG. 9 presents an example of another method for dynamically controlling a communication connection of a computing device in accordance with one or more embodiments of the present disclosure.

FIG. 10 presents an example of a computing environment for implementing dynamic control of a communication connection of a computing device in accordance with one or more embodiments of the present disclosure.

Detailed Description

The present disclosure recognizes and addresses the problems of existing devices that utilize data connections, as well as other technical challenges. Conventional methods of managing data connections, and general methods for provisioning data to such devices, do not perform well and are costly to operate. Given the inefficiencies caused by continuous network connections and the lack of control over the amount of data a portable device can transfer, known methods can limit the availability of devices.

As described in more detail below, the present disclosure provides an improvement over existing systems by controlling a connection between two or more devices in response to the detection of a predetermined event. By controlling the connection between two devices based on the detection of an event, the system may minimize the amount of resources required to maintain a communication connection. In addition to controlling the state of a connection based on a particular event, the techniques disclosed herein may also control the type of data being transferred between two devices based on the type of event. By controlling the communication connection between two devices and controlling the amount and/or type of data being transferred between two or more devices, systems utilizing the techniques disclosed herein may improve efficiency with respect to power consumption and the use of other computing resources (such as processing power, network bandwidth, etc.).

For purposes of illustration, consider a scenario in which a device (such as an IoT device) is configured with an accelerometer and a camera. Such a device may be used as a dashboard camera for detecting conditions of the vehicle, such as the vehicle's acceleration, speed, direction, etc. The device may also be configured with a communication module, such as a Bluetooth, Wi-Fi, or Wi-Fi Direct device, to establish a connection with one or more remote devices, such as a smartphone. In some cases, the device may be configured to remain disconnected from the remote device until an event related to a condition of the vehicle is detected.

Using one or more sensors, the device may detect an event, such as a collision, a mechanical failure, or another predetermined event. In response to detecting the event, the device may connect to a remote device. The connection between the device and the remote device may permit the provision of specific information between the two devices. In one illustrative example, a device may transmit a request for time and date information from a remote device in response to detection of an event. This scenario may be beneficial in situations where the remote device is networked and has access to accurate time and date information. The time and date information may be used to create a record of the event with the date and time information received from the remote device. In some configurations, when video data is recorded, the device may store time and date information associated with the video data to create a record of the event. In some configurations, video data or other sensor data (such as accelerometer measurements) may be transmitted to the remote device in response to detection of an event. Additionally, the device may also communicate other metadata with the remote device, such as location information. By allowing devices to exchange specific information, the data defining an event can be embedded with several different tags and other context data, providing a single, easily accessible record of the event. After the device exchanges the predetermined information with the remote device, the device may terminate the connection with the remote device. Termination of the connection allows each device, in some cases, to operate in a low-power state to conserve power.

The devices may also utilize the exchanged information to generate summary data characterizing the circumstances of the defined event. For example, the dashboard camera may generate video data defining a video clip of the collision. The dashboard camera may then tag the video data by adding the received information to it. Additionally, or in another configuration, the device may generate and provide an Augmented Reality (AR) video clip as a summary. The IoT device may generate and add various markers to the AR video clip in order to draw attention to specific elements of the defined event.

Fig. 1A and 1B illustrate the above-described scenario in an example involving two vehicles. In this example, the first vehicle 110 includes a device 114 (such as a dashboard camera) and a remote device 118, such as a mobile phone. In this example, the first vehicle 110 collides with a second vehicle 120, causing the device 114 to establish a connection with the remote device 118 to allow the devices to exchange data about the accident. As described in more detail below, once the devices exchange data associated with the incident, the connection is terminated.

Fig. 1A illustrates a traffic scenario involving a first vehicle 110 and a second vehicle 120. As illustrated, the two vehicles move toward each other at the street intersection 130. The first vehicle 110 may attempt to cross the intersection 130 after the traffic light 140 turns red. The second vehicle 120 may enter the intersection after the traffic light 150 turns green. As a result, the first vehicle 110 accelerates in a first direction (the X direction) and the second vehicle 120 accelerates in a second direction (the Y direction) that is substantially perpendicular to the first direction.

As illustrated in fig. 1B, such a scenario may result in a collision between the first vehicle 110 and the second vehicle 120. Although vehicle 110 and vehicle 120 are depicted as automobiles, the disclosed techniques may also be used in conjunction with other types of vehicles, such as, but not limited to, motorized watercraft, drones, ATVs, autonomous vehicles, and the like. For purposes of illustration, device 114 is also referred to herein as an "IoT device 114" or a "dashboard camera 114". Device 114 may be mounted on the dashboard or rear-view mirror of the first vehicle 110. In this example, the dashboard camera 114 is disconnected from any network device or other type of provisioning device. Further, in this example, the dashboard camera 114 does not have location information or sensors for detecting geographic location. The dashboard camera 114 does, however, include an accelerometer that detects the level of acceleration of the dashboard camera 114 and the first vehicle 110. For efficiency purposes, it is not feasible to maintain a continuous communication connection between the dashboard camera 114 and the remote device 118. Thus, in this example, the dashboard camera 114 generally operates offline with respect to the remote device 118.

As shown in fig. 1B, the dashboard camera 114 may detect a collision by detecting the magnitude of acceleration in a predetermined direction. Acceleration may be monitored using sensor data generated by an accelerometer (not depicted in fig. 1A) in communication with the dashboard camera 114. The accelerometer may be integrated into the dashboard camera 114 or may be integrated into the vehicle 110.

By detecting the magnitude of the acceleration in a predetermined direction (such as the X direction), the incidence of false positives caused by accelerations in other directions (e.g., the Y or Z direction) can be reduced. For example, acceleration detected in the X direction, as opposed to the Z direction, may allow the device to distinguish a frontal collision from a pothole or bump in the road.

In some embodiments, the dashboard camera 114 may detect the magnitude of acceleration of the dashboard camera 114 in a second predetermined direction (e.g., the Y or Z direction). This detection is in addition to the detection of the magnitude of the acceleration of the dashboard camera 114 in the predetermined direction (e.g., the X direction). In such embodiments, the data connection between the dashboard camera 114 and the remote device (such as a watch or smartphone) is initiated in response to determining that the magnitude of the acceleration in the second predetermined direction is less than the magnitude of the acceleration in the predetermined direction. In some embodiments, the data connection may be initiated in response to determining that the magnitude of the acceleration in the second predetermined direction is less than the magnitude of the acceleration in the predetermined direction by a predetermined difference. In some configurations, the predetermined difference may be adjusted based on one or more factors. For example, the predetermined difference may be increased when the dashboard camera 114 receives or generates context data indicating several false positive detections of events. Conversely, the predetermined difference may be reduced when the dashboard camera receives or generates context data indicating several missed (i.e., false negative) detections of one or more events.

In some embodiments, the data connection may be initiated in response to: (1) detecting that the magnitude of the acceleration exceeds a threshold magnitude of acceleration in a predetermined direction (e.g., the X direction), and (2) determining that the magnitude of the acceleration in a second predetermined direction (e.g., the Y or Z direction) is less than the magnitude of the acceleration in the predetermined direction by a predetermined difference. Similar to the examples described above, the predetermined difference may be adjusted based on one or more factors: it may be increased when the dashboard camera 114 receives or generates context data indicating several false positive detections of events, and reduced when the dashboard camera receives or generates context data indicating several missed (i.e., false negative) detections of one or more events.
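As an illustration only, the detection predicate and the adaptive adjustment of the predetermined difference might be sketched as follows; the 5g threshold, 1g difference, and adjustment step are hypothetical values, not values specified by the disclosure.

```python
G = 9.81  # gravitational acceleration, m/s^2


def should_connect(a_primary, a_secondary, threshold=5 * G, min_difference=1 * G):
    """Return True when the acceleration pattern matches the defined event.

    a_primary   -- acceleration magnitude in the predetermined direction (e.g., X)
    a_secondary -- magnitude in a second predetermined direction (e.g., Y or Z)
    """
    exceeds_threshold = a_primary > threshold
    dominant_direction = (a_primary - a_secondary) > min_difference
    return exceeds_threshold and dominant_direction


def tune_difference(min_difference, false_positives, missed_events, step=0.1 * G):
    """Adjust the predetermined difference from observed detection context:
    widen it after false positives, narrow it after missed detections."""
    if false_positives > missed_events:
        return min_difference + step
    if missed_events > false_positives:
        return max(0.0, min_difference - step)
    return min_difference
```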

The result of monitoring the acceleration of the first vehicle 110 in the X direction in the coordinate system 160 is depicted in fig. 2. Although a Cartesian coordinate system is shown, it should be appreciated that any coordinate system may be used with the described examples, including but not limited to an orthogonal coordinate system, a polar coordinate system, or cylindrical and spherical coordinate systems.

The magnitude of acceleration in the X direction as a function of time, a_x(t), may include periods of time during which the first vehicle 110 moves at a substantially constant speed (a_x(t) = 0). During other periods, a_x(t) may have a substantially constant magnitude. For example, as illustrated in fig. 2, during a first time period Δt_1 beginning at time t_1, the first vehicle 110 may move in the X direction at a first acceleration magnitude 220. For example, during time period Δt_1, the first vehicle 110 may be leaving the highway and decelerating, or may be passing another vehicle. During another time period Δt_2 beginning at time t_2, the first vehicle 110 may move in the X direction at a second acceleration magnitude 230. For example, during time period Δt_2, the first vehicle 110 may be decelerating when entering a school zone during restricted hours.

However, at time t_3, a_x(t) may begin to increase sharply, exceeding a threshold magnitude a_th at time t_c. Such a threshold magnitude may be quantified as a multiple of gravitational acceleration (g). For example, the threshold magnitude may be 2g, 3g, 4g, 5g, and so on. Time t_3 may correspond to the time at which the first vehicle 110 begins to decelerate at the intersection 130 as a result of the second vehicle 120 also being at the intersection 130; see fig. 1B. Time t_c indicates the time of the collision between the first vehicle 110 and the second vehicle 120.
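For concreteness, a minimal sketch of locating the crossing time t_c in a uniformly sampled accelerometer trace is shown below; the sample values and 100 Hz rate are illustrative assumptions.

```python
def first_crossing_time(samples, dt, a_th):
    """Return the time t_c at which a_x(t) first exceeds a_th, or None.

    samples -- accelerometer magnitudes a_x at uniform spacing dt seconds
    """
    for i, a in enumerate(samples):
        if a > a_th:
            return i * dt
    return None


# Worked example: a 5g threshold with a 100 Hz accelerometer trace.
G = 9.81
trace = [0.0, 0.2 * G, 0.5 * G, 2.0 * G, 6.1 * G, 4.0 * G]
print(first_crossing_time(trace, dt=0.01, a_th=5 * G))  # -> 0.04 seconds
```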

In response to an event, such as a collision, it is beneficial to have a record of location data and timestamp data. Such information may supplement, for example, a video segment (video track) recorded by the dashboard camera 114. However, the dashboard camera 114 may not have access to accurate time and date information, and may also lack location information. To address this need, the techniques disclosed herein enable the dashboard camera 114 to create an on-demand connection with the remote device 118 in response to detection of an event (such as the collision depicted in fig. 1B). In some configurations, the dashboard camera 114 may initiate a data connection with the remote device 118 in response to detecting that a magnitude of acceleration of the first computing device exceeds a threshold magnitude of acceleration in a predetermined direction.

The dashboard camera 114 may be wirelessly connected to the remote device 118. Such a connection may be achieved, for example, by pairing the dashboard camera 114 and the remote device 118, which is illustrated in this example as a smartphone. The remote device 118 may be any device (IoT device or otherwise) that has access to reliable location information, date and time information, or both. In other examples, the remote device 118 may be a tablet computer, a computerized wristband, a portable GPS locator, or the like.

After the remote device 118 and the dashboard camera 114 are connected, the dashboard camera 114 may receive location data and timestamp data from the remote device 118. Location data, timestamp data, and any other data may be received at the dashboard camera 114 during a particular communication session between the dashboard camera 114 and the remote device 118. The timestamp data may include, for example, the time t_c (fig. 2) and/or the date on which the collision was detected. In some cases, the remote device 118 may transmit other types of timing information in addition to or in lieu of the timestamp data. For example, the other timing information may include one or more of t_1, t_2, or t_3 (fig. 2). Such timing information may be used to determine whether the first vehicle 110 is being operated erratically. Such timing information may also be used to identify signs of operator distraction in the first vehicle 110. The timing information may include the date and/or time of day of the detected event.

The data received from the remote device 118 constitutes metadata that may be used to tag the video data from the dashboard camera 114. As such, the received timestamp data may be integrated into video data defining a video clip generated by the dashboard camera 114 in response to detection of the incident. The received location data may also be added to such video data. In some cases, acceleration information may also be added to the video data. The acceleration information may include, for example, an acceleration magnitude or an acceleration direction, or both. Such metadata is custom metadata and may be incorporated into metadata fields corresponding to frames of the video clip.
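Because the disclosure does not fix a container format for this custom metadata, the sketch below illustrates one simple possibility: writing the received metadata as a sidecar JSON file next to the clip. The field names are hypothetical.

```python
import json


def tag_video_clip(clip_path, frame_rate, event_metadata):
    """Store received metadata as a sidecar file next to the video clip.

    event_metadata might hold, for example:
      {"timestamp": "2019-11-26T14:03:22Z",
       "location": {"lat": 47.64, "lon": -122.13, "alt_m": 12.0},
       "acceleration": {"magnitude_g": 5.3, "direction": "x"}}
    """
    sidecar = {
        "clip": clip_path,
        "frame_rate": frame_rate,
        "custom_metadata": event_metadata,  # hypothetical field layout
    }
    with open(clip_path + ".meta.json", "w") as f:
        json.dump(sidecar, f, indent=2)
```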

In some scenarios, the dashboard camera 114 may request specific information from the remote device 118. For example, the dashboard camera 114 may request accurate time and date information and/or location information. The remote device 118, such as a mobile phone, may meet such a request by sending provisioning information to the dashboard camera 114. Thus, in such a scenario, provisioning may be requested by the dashboard camera 114, rather than occurring automatically upon establishment of the wireless connection between the dashboard camera 114 and the remote device 118.

After the transfer of the location data and the timestamp data from the remote device 118 is complete, the dashboard camera 114 may be disconnected from the remote device 118. In addition to disconnecting from the remote device 118, the dashboard camera 114 may transition to a low power state after the communication session is ended.

Fig. 3 presents an example data flow between the dashboard camera 114 and the remote device 118. As illustrated in fig. 3, the accelerometer 310 may generate sensor data 312 and may send the sensor data 312 to the dashboard camera 114. The accelerometer 310 may be integrated into a vehicle and may transmit the sensor data 312 via a bus architecture within the vehicle. The sensor data 312 includes acceleration information, such as the magnitude and/or direction of the vehicle's acceleration.

The dashboard camera 114 may receive the sensor data 312. The dashboard camera 114 may then perform operation 322 to detect the defined event using the acceleration information included in the sensor data 312. To this end, the dashboard camera 114 may analyze the acceleration information included in the sensor data 312 and may determine that a magnitude of the acceleration of the vehicle exceeds a threshold magnitude of acceleration in a predetermined direction. Such a threshold magnitude may be, for example, 2g, 3g, 4g, 5g, and so forth.

In some embodiments, as illustrated in fig. 4A, the dashboard camera 114 may include an input/output (I/O) interface (not depicted in fig. 4A) that may couple the dashboard camera 114 to the bus architecture to receive acceleration information. The dashboard camera 114 may also include an event detection module 415, which event detection module 415 may use the received acceleration information to detect defined events. To this end, the event detection module 415 may apply detection criteria to the acceleration information. The detection criterion specifies that the magnitude of the acceleration in a predetermined direction must exceed a threshold magnitude (e.g. 5 g). The detection criteria may be maintained in one or more memory devices 425 (referred to as detection rule(s) 425) in the dashboard camera 114.

Returning to fig. 3, in response to detecting a defined event (e.g., a collision of the vehicle with another vehicle), the dashboard camera 114 may initiate a communication session between the dashboard camera 114 and the remote device 118. The communication session may be initiated by pairing the dashboard camera 114 and the remote device 118. Accordingly, the dashboard camera 114 and the remote device 118 may exchange pairing messages 332 to establish a wireless connection between the dashboard camera 114 and the remote device 118. As illustrated in fig. 4A, the dashboard camera 114 may include a communication component 410 that may permit the exchange of the pairing messages 332. Prior to exchanging the pairing messages, the event detection module 415 may power up the communication component 410.

After the remote device 118 and the dashboard camera 114 are connected, the remote device 118 may send data 334, which may include location data and/or time and date data ("timing data"), to the dashboard camera 114. For example, the dashboard camera 114 may wirelessly receive the location data and/or timing data via the communication component 410 shown in fig. 4A. The dashboard camera 114 may maintain the location data and timing data in one or more memory devices 430 (referred to as event data 430) shown in fig. 4A. The timing data may include timestamp data, such as a time and date when the magnitude of the acceleration of the vehicle including the dashboard camera 114 exceeded the threshold magnitude in the predetermined direction. The location data may include, for example, coordinates representing the location of the dashboard camera 114 when the magnitude of the acceleration of the vehicle exceeded the threshold magnitude in the predetermined direction. The coordinates may include latitude and longitude. In some cases, altitude may also be included in the coordinates.

It should be noted that in some embodiments, the dashboard camera 114 may request specific information from the remote device 118. For example, the dashboard camera 114 may request the data 334 or another type of data from the remote device 118. The remote device 118 may respond by transmitting the requested data. Thus, in such embodiments, the provisioning may be requested by the dashboard camera 114 rather than being caused by the remote device 118.

In response to completion of the communication of the location data and/or timing data, the dashboard camera 114 may perform operation 324 to terminate the wireless connection between the dashboard camera 114 and the remote device 118. Completion of such communication may be indicated by end of transmission (EOT) signaling or another type of signaling conveying termination of the information transmission, such as end of file (EOF) signaling. Such signaling may be received at the dashboard camera 114 from the remote device 118. Terminating such a connection may include transitioning the dashboard camera 114 to a low power state, thus reducing power consumption. To this end, the event detection module 415 shown in fig. 4A may power down one or more elements of the dashboard camera 114 while maintaining the event detection module 415 powered on.
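As a sketch only: receiving provisioned data until end-of-transmission signaling arrives, then tearing down the connection and powering down, might look like the following. The `connection` object and EOT byte are assumptions for illustration.

```python
EOT = b"\x04"  # end-of-transmission marker; real signaling is transport-specific


def receive_until_eot(connection):
    """Collect provisioned data until the remote device signals completion,
    then tear down the connection and enter a low-power state."""
    chunks = []
    while True:
        chunk = connection.recv()  # hypothetical blocking receive
        if chunk == EOT:
            break
        chunks.append(chunk)
    connection.close()  # terminate the wireless connection
    power_down_radios()  # keep only the event detection module powered
    return b"".join(chunks)


def power_down_radios():
    """Placeholder for platform-specific power management."""
    pass
```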

The data received from the remote device 118 constitutes metadata that can be used to tag imaging data defining a video clip representing a defined event. Such metadata is custom metadata and may be incorporated into metadata fields corresponding to image frames of the video clip. The dashboard camera 114 may generate the imaging data in response to detection of a defined event. As shown in fig. 4A, for example, the dashboard camera 114 may include a camera component 405 to acquire images of the surroundings of the vehicle 110. The camera component 405 may use visible light and/or infrared electromagnetic radiation to acquire images. The camera component 405 may include: lenses, filters, and/or other optical elements; one or more focusing mechanisms; and an imaging sensor permitting capture of both still and moving pictures. The imaging sensor may include one or more photodetectors, active amplifiers, and the like. For example, the imaging sensor may include a charge-coupled device (CCD) camera; an active pixel sensor or other type of complementary metal-oxide-semiconductor (CMOS) based photodetector; a multi-channel photodiode array; combinations thereof; and the like.

Thus, the received location data may be integrated into the imaging data after the defined event is detected. The received timing data may also be added to the imaging data, in addition to or instead of the location data. In some cases, acceleration information may be added to the imaging data. The dashboard camera 114 may perform operation 326 to add the received location data or timing data, or both, to the video clip. In some embodiments, as illustrated in fig. 4A, the dashboard camera 114 may include an enhancement module 420 that may perform operation 326.

It should be noted that although the accelerometer 310 in fig. 3 is external to the dashboard camera 114, the present disclosure is not limited to such an implementation. There are configurations in which the accelerometer may be included in a set of inertial sensors 450 integrated into the dashboard camera 114, as illustrated in fig. 4B.

The disclosed techniques are not limited to the example of actively provisioning data shown in fig. 3. These techniques need not be implemented as a remote device 118 and a dashboard camera 114 in a vehicle scenario. The disclosed techniques are indeed applicable to other types of IoT devices, where one of such IoT devices is a trusted device configured to provide reliable data and metadata to another of the IoT devices. The trusted device may be verified via an authentication process. Further, after the connection is established, the initially disconnected IoT device may receive metadata, in addition to other data, from the trusted IoT device. The metadata may permit characterization of the predetermined event that resulted in provisioning of the disconnected IoT device.

The disclosed techniques are not limited to supplying data in response to a collision between vehicles or between a vehicle and another object. Data may be supplied to a computing device (IoT device or otherwise) in response to detecting other types of defined events, such as a vehicle rollover, a vehicle electromechanical failure, a battery fire in an electric vehicle, and so on. Such failures may include, for example, damage to a vehicle tire or a catastrophic increase in operating temperature in the internal combustion engine of a vehicle. Fig. 5A, 5B, and 5C illustrate a broader description of a process for dynamically controlling a communication connection in response to detecting an event.

In particular, fig. 5A illustrates an example of data flow and operations that may be implemented in a system of devices present in a vehicle involved in a defined event. As illustrated, a set of sensors 510 may generate sensor data 512. The set of sensors 510 may be integrated into the vehicle and may include, for example, inertial sensors, temperature sensors, pressure sensors, charge sensors, and the like. Thus, each sensor of the set of sensors 510 may generate a measure of a physical property indicative of an operating condition of the vehicle.

In some configurations, the inertial sensors may include accelerometers and gyroscopes. Thus, the sensor data 512 includes linear acceleration information and angular acceleration information. Such information may include the magnitude or direction of acceleration or both of the vehicle. The sensor data 512 may also include orientation information defining the pitch, yaw, and heading of the vehicle.

The set of sensors 510 may send the sensor data 512 to computing device A 520. The set of sensors 510 may transmit the sensor data 512 via a bus architecture within the vehicle. Computing device A 520 may be an IoT device or another type of device having processing circuitry.

Computing device A 520 may receive the sensor data 512. Computing device A 520 may then implement operation 522 to detect a defined event using the sensor data 512 and a set of occurrence criteria. In particular, computing device A 520 may apply at least one criterion of the set of occurrence criteria to the sensor data 512. As a result, computing device A 520 may determine whether the sensor data 512 satisfies the at least one occurrence criterion. A determination that the sensor data 512 satisfies the at least one occurrence criterion results in the detection of a defined event.

By way of illustration, the at least one occurrence criterion may include a first criterion specifying that a magnitude of acceleration in a predetermined direction must exceed a threshold magnitude (e.g., 5g). Accordingly, computing device A 520 may apply such a criterion to the acceleration information included in the sensor data 512. In some cases, the acceleration information indicates that a magnitude of acceleration of the vehicle in the predetermined direction exceeds the threshold magnitude. Computing device A 520 then determines that the magnitude criterion is satisfied, and thus detects a defined event.

As another illustration, the sensor data 512 may include pressure data that identifies the pressure of each tire of the vehicle that includes the set of sensors 510. The at least one occurrence criterion may include a second criterion specifying that a change in tire pressure must exceed a threshold magnitude (e.g., 35 psi). Accordingly, computing device A 520 may apply such a magnitude criterion to the received pressure data. When the received pressure data indicates that the pressure of a tire has changed by an amount that exceeds the threshold amount, computing device A 520 determines that the magnitude criterion is satisfied, and thus detects a defined event.
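A minimal sketch of such generic occurrence criteria follows, assuming hypothetical sensor field names; the 5g and 35 psi values mirror the examples above, while the engine-temperature cutoff is purely illustrative.

```python
G_THRESHOLD = 5.0          # acceleration criterion, in multiples of g
PRESSURE_THRESHOLD = 35.0  # tire-pressure change criterion, in psi

# Each rule maps a dict of sensor readings to a boolean.
DETECTION_RULES = {
    "collision": lambda s: s.get("accel_x_g", 0.0) > G_THRESHOLD,
    "flat_tire": lambda s: abs(s.get("tire_pressure_delta_psi", 0.0)) > PRESSURE_THRESHOLD,
    "overheat": lambda s: s.get("engine_temp_c", 0.0) > 120.0,  # illustrative value
}


def detect_defined_events(sensor_data):
    """Apply each occurrence criterion to the sensor data and return the
    names of any defined events whose criteria are satisfied."""
    return [name for name, rule in DETECTION_RULES.items() if rule(sensor_data)]


# Example: a pressure drop large enough to satisfy the flat-tire criterion.
print(detect_defined_events({"tire_pressure_delta_psi": -36.0}))  # ['flat_tire']
```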

Other criteria may also be specified in order to detect defined events, such as a vehicle rollover. For example, a criterion for detecting a defined event may be satisfied when a change in the magnitude of the acceleration vector is at least equal to a threshold magnitude. As another example, another criterion for detecting an accident may be satisfied when the change in direction of the acceleration vector is at least equal to a threshold. Other criteria may also be used to identify abnormal accelerations, such as excessive angular acceleration about an axis of the vehicle 110, as may be the case when the first vehicle 110 rolls over in a crash.

Regardless of the type of defined event detected, computing device A 520 may initiate a communication session between computing device A 520 and computing device B 530. Computing device B 530 may be any device (IoT device or otherwise) that has access to some trusted information. For example, the trusted information may include reliable timing information or location information, or both. Computing device B 530 may be, for example, a wristband electronic device, a smartphone, a portable Global Positioning System (GPS) receiver device, and so on. Computing device B 530 includes components (not depicted in fig. 5A) that permit wireless communication between computing device B 530 and external devices. Thus, a communication session may be initiated by pairing computing device A 520 and computing device B 530. Pairing such devices includes exchanging pairing messages 532 to establish a wireless connection between computing device A 520 and computing device B 530.

After computing device B 530 and computing device A 520 are connected, computing device B 530 may send trusted data 534 to computing device A 520. For example, computing device A 520 may receive the trusted data wirelessly via the communication component 410 shown in fig. 5B. Computing device A 520 may maintain the trusted data 534 in the event data 430 shown in fig. 5B. The trusted data 534 may include location data or timing data or both. The timing data may include timestamp data, such as a time and date when the measure of the physical property indicative of the operating condition of the vehicle satisfied the occurrence criterion. The location data may include, for example, a set of coordinates of computing device A 520 when the measure of such a physical property satisfied the occurrence criterion. The set of coordinates may include a latitude and a longitude. In some cases, altitude may also be included in the set of coordinates.

In response to completion of the communication of the trusted data 534, computing device A 520 may perform operation 524 to terminate the wireless connection between computing device A 520 and computing device B 530. Completion of such communication may be indicated by, for example, EOT signaling or another type of signaling conveying termination of the information transmission. Such signaling may be received at computing device A 520 from computing device B 530. Terminating such a connection may include transitioning computing device A 520 to a low-power state, thus reducing power consumption. To do so, one or more components of computing device A 520 may be powered down while the event detection module 415 (fig. 5B) remains powered on.

The trusted data 534 received from computing device B 530 constitutes metadata that may be used to tag imaging data defining a video clip representing a defined event. Such metadata is custom metadata and may be incorporated into metadata fields corresponding to image frames of the video clip. Computing device A 520 may generate the imaging data in response to detection of a defined event. The imaging data may be generated by means of an imaging sensor included in the camera component 405 (fig. 5B). Thus, the received location data may be integrated into the imaging data after the defined event is detected. The received timing data may also be added to the imaging data, in addition to or instead of the location data. In some cases, acceleration information may be added to the imaging data. Computing device A 520 may implement operation 526 to add the received trusted data 534 to the video clip. To this end, as illustrated in fig. 5B, computing device A 520 may include an enhancement module 420.

It should be noted that although the sensors 510 in fig. 5A are illustrated as being external to computing device A 520, the disclosure is not limited in this regard. There may be configurations in which the sensors 510 are included in a set of sensors 590 integrated into computing device A 520, as illustrated in fig. 5C. In some embodiments, the set of sensors 590 can also include an imaging sensor that can generate imaging data defining a video clip.

The disclosed technology is not limited to detecting defined events involving a vehicle that includes a dashboard camera or another type of IoT device that can generate imaging data. The disclosed technology may also detect accidents involving other vehicles.

By way of illustration, fig. 6A presents the following scenario: a vehicle 610 is stopped at the intersection 130 while two other vehicles 620 and 630 are traversing the intersection 130. Vehicle 610 is oriented in the y direction, and the other vehicles 620 and 630 move in the x direction. The field of view of vehicle 610 may include both vehicle 620 and vehicle 630. Vehicle 610 may include computing device A 520 and computing device B 530, which may be initially unpaired. Computing device A 520 may include the camera component 405 (fig. 5B), which may generate imaging data representative of the surroundings of vehicle 610. For example, the imaging data may be generated while computing device A 520 is operating in a low-power state.

Both the vehicle 620 and the vehicle 630 may move at substantially the same speed, with the vehicle 620 being behind the vehicle 630 with respect to the direction of movement. At a later time, the vehicle 630 may suddenly decelerate. Thus, the vehicle 620 may also decelerate to avoid a collision with the vehicle 630. As illustrated in fig. 6B, in some cases, the deceleration of the vehicle 620 may be insufficient to avoid the collision.

Computing device A 520 may generate imaging data defining a video clip that records, for example, the movement of vehicles 620 and 630 before and after the collision. Thus, the imaging data may represent a series of scenes corresponding to the movement of vehicle 620 and vehicle 630 across the intersection 130. The series of scenes is generated from the vantage point of vehicle 610. The imaging data may define a plurality of image frames at a defined frame rate, wherein each image frame of the plurality of image frames corresponds to one scene of the series of scenes. Fig. 6C illustrates one such scene at the time of the collision.

Computing device A 520 may also analyze the imaging data to track one or both of vehicle 620 and vehicle 630 across the series of scenes. For example, computing device A 520 may identify vehicle 620 in a first scene of the series of scenes, and may then continue to identify vehicle 620 in scenes after the first scene. In some configurations, computing device A 520 may analyze each image frame in the imaging data as the image frame is generated. In other configurations, computing device A 520 may analyze image frames in batches, analyzing successive sets of image frames as generation of each set of frames is completed. In some embodiments, the imaging data may be analyzed by the event detection module 415 (fig. 5B), which may be included in computing device A 520.

As part of the analysis of the imaging data, computing device A 520 may determine a series of positions of vehicle 620 in respective consecutive image frames of the plurality of image frames. Further, computing device A 520 may determine a series of times for respective positions in the series of positions. Computing device A 520 may then use the series of positions and the series of times to generate an estimate of the acceleration of vehicle 620.
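For illustration, assuming the per-frame positions have already been converted to physical units, the acceleration estimate can be formed from second finite differences of consecutive positions; the numbers in the example are made up.

```python
def estimate_acceleration(positions, frame_rate):
    """Estimate acceleration from per-frame positions via second differences.

    positions  -- position of the tracked vehicle (meters) in consecutive frames
    frame_rate -- frames per second of the video clip
    Returns one acceleration estimate (m/s^2) per interior frame.
    """
    dt = 1.0 / frame_rate
    return [
        (positions[i + 1] - 2 * positions[i] + positions[i - 1]) / dt ** 2
        for i in range(1, len(positions) - 1)
    ]


# Example: a vehicle braking hard, with positions sampled at 30 fps.
x = [10.0, 9.5, 9.02, 8.56, 8.13]  # positions along x, in meters
print(estimate_acceleration(x, frame_rate=30.0))
```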

Using the estimate of the acceleration of vehicle 620, computing device A 520 may determine that the magnitude of the acceleration exceeds a threshold magnitude of acceleration in a predetermined direction. In response, computing device A 520 may initiate a data connection between computing device A 520 and computing device B 530.

After the data connection is established, computing device A 520 may receive at least one of timing information or location information from computing device B 530. The timing information may include, for example, a time and date when the magnitude of the acceleration of vehicle 620 exceeded the threshold magnitude of acceleration in the predetermined direction. Such a time may be, for example, time t_c (fig. 2). The location information may include, for example, a set of coordinates of computing device A 520 when the magnitude of the acceleration of the second vehicle exceeded the threshold in the predetermined direction. The set of coordinates may include a longitude and a latitude. In some cases, altitude may also be included in the set of coordinates.

Computing device A 520 may terminate such a data connection after the communication of the timing information or location information has been completed. Completion of the communication of such information may be indicated by signaling received at computing device A 520 from computing device B 530. The signaling may include, for example, EOT signaling, EOF signaling, and the like.

Computing device A 520 may generate video data defining a video clip of the collision between the vehicle 620 and the vehicle 630. To this end, computing device A 520 may include the camera component 405 shown in fig. 5B. In some configurations, computing device A 520 may initiate recording of the video segment at a first frame rate in response to detection of the collision. The video segment may be generated in slow motion, for example, to record more details of the collision. Thus, the first frame rate can be greater than a second frame rate configured for recording video segments in the absence of a collision.

Regardless of how the video data is generated, the video data may be processed to generate and provide a summary of the collision. In one configuration, computing device A 520 may tag the video data by adding the received timing information or the received location information, or both, to the video data. The enhancement module 420 (fig. 5B), which may be included in computing device A 520, may tag the video data.
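A minimal sketch of such tagging is shown below, assuming the timing and location information is written as a sidecar JSON record alongside the video file; the field names and file-naming convention are hypothetical, not the format used by the enhancement module 420.

```python
# Sketch: tagging a video clip with received timing and location
# information as a sidecar JSON record. The schema and the file-naming
# convention are hypothetical, not the enhancement module's format.
import json

def tag_video(clip_path: str, timing: dict, location: dict) -> str:
    tag = {
        "clip": clip_path,
        "event_time": timing.get("time"),        # e.g., an ISO-8601 string
        "event_date": timing.get("date"),
        "coordinates": {
            "latitude": location.get("lat"),
            "longitude": location.get("lon"),
            "altitude": location.get("alt"),     # optional, may be None
        },
    }
    sidecar = clip_path + ".event.json"
    with open(sidecar, "w") as f:
        json.dump(tag, f, indent=2)
    return sidecar
```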

Such a summary may also be presented as an Augmented Reality (AR) video clip of the collision. In the AR video clip, various markers may be added to draw attention to specific elements of the collision. In one configuration, computing device A 520 may add imaging data defining one or more User Interface (UI) elements to the video data. The enhancement module 420 (fig. 5B), which may be included in computing device A 520, may generate the UI elements and may add the UI elements to the video data. In one example, a first UI element and a second UI element of the plurality of UI elements may represent an operating condition of the vehicle 620 and an operating condition of the vehicle 630, respectively. Additionally, or in another example, a particular UI element of the plurality of UI elements may indicate a trajectory of one of the vehicles 620 or 630.

Fig. 7 presents an example of an image frame 700 of an AR video clip corresponding to the collision between the vehicle 620 and the vehicle 630. The image frame 700 includes a first UI element 710, the first UI element 710 indicating a magnitude of acceleration of the vehicle 620 in the x-direction at the time of the collision. The image frame 700 also includes a second UI element 720, the second UI element 720 representing a direction of movement of the vehicle 620 after the collision. The image frame also includes a third UI element 730, the third UI element 730 representing the trajectory of the vehicle 630 after the collision.
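For purposes of illustration, the following sketch shows how UI elements analogous to the elements 710, 720, and 730 might be drawn onto an image frame, assuming the OpenCV library is available; the pixel coordinates, colors, and labels are placeholders.

```python
# Sketch: drawing UI elements analogous to 710, 720, and 730 on an image
# frame with OpenCV. Pixel coordinates, colors, and labels are placeholders.
import cv2
import numpy as np

def annotate_frame(frame: np.ndarray, accel_x_g: float) -> np.ndarray:
    # Element like 710: acceleration magnitude in the x-direction.
    cv2.putText(frame, f"ax = {accel_x_g:.1f} g", (20, 40),
                cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 0, 255), 2)
    # Element like 720: direction of movement of the vehicle 620 after impact.
    cv2.arrowedLine(frame, (300, 400), (420, 340), (0, 255, 0), 3)
    # Element like 730: trajectory of the vehicle 630 after impact.
    pts = np.array([[500, 380], [560, 350], [640, 330]], dtype=np.int32)
    cv2.polylines(frame, [pts], isClosed=False, color=(255, 0, 0), thickness=2)
    return frame
```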

In some embodiments, rather than detecting a collision, computing device A 520 may rely on imaging data of traffic in the surroundings of the vehicle 110 to detect a particular event. For example, the defined event may be the presence of a particular vehicle or vehicle feature in an image of traffic. Thus, computing device A 520 may generate images of traffic within the field of view of a camera device included in computing device A 520. Computing device A 520 may then analyze an image to determine whether a defined vehicle feature is present; for example, computing device A 520 may analyze the image with the event detection module 415 shown in fig. 5B. The defined vehicle feature may include at least one of a defined license plate, a defined make, a defined model, a defined paint type, a defined wrap type, or a defined damage type. The data maintained in the event data 430 may define the vehicle features. A determination that the defined vehicle feature is present in the image causes the defined event to be detected.
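A minimal sketch of the comparison step is shown below, assuming an upstream detector that extracts candidate license-plate strings from a traffic image; the detector callable and the representation of the event data 430 as a set of normalized plate strings are assumptions of the sketch.

```python
# Sketch: checking detected license plates against vehicle features
# defined in the event data 430. The plate-reading stage is an assumed
# upstream component; the event data is modeled here as a set of
# normalized plate strings.
from typing import Callable, Iterable, Optional

def detect_defined_event(
    image,
    read_plates: Callable[[object], Iterable[str]],  # hypothetical ANPR stage
    watched_plates: set,                             # from the event data 430
) -> Optional[str]:
    """Return the matching plate if a defined vehicle feature is present."""
    for plate in read_plates(image):
        if plate.upper().replace(" ", "") in watched_plates:
            return plate  # the defined event is detected
    return None
```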

The disclosed technology provides many other functionalities. In some cases, a connection between the IoT device and the computing device may not be established in response to a particular event. In such cases, the computing device may generate the location information and the timing information separately from the IoT device, and the IoT device may generate another type of information. For example, a connection between a dashboard camera and a smartphone may fail to be established in response to an accident. The smartphone can then generate location information and a timestamp, and the dashboard camera can generate a second timestamp and a video segment of the incident. The second timestamp may be shifted (advanced or delayed) because the timing device of the dashboard camera and the timing device of the smartphone need not be synchronized. The smartphone may also generate other types of information that complement or supplement the information generated by the dashboard camera, such as velocity, acceleration, a user identification (ID), other credentials, combinations thereof, and the like.

Later, when a connection between the dashboard camera and the smartphone can be established, the information generated at the dashboard camera and the information generated at the smartphone may be merged. For purposes of illustration, this technique is referred to herein as a "stitching" feature, in which information from multiple sources, including the dashboard camera and the smartphone (or another type of remote device), is combined to generate a consolidated report of the incident. Merging such information may include, for example, determining the supplemental information generated at the smartphone and the dashboard camera, and synchronizing and combining such information. Combining the supplemental information may include, for example, transmitting first information available at the smartphone but not present at the dashboard camera to the dashboard camera, and transmitting second information available at the dashboard camera but not present at the smartphone to the smartphone.

The merged information can serve as a complete representation of the incident. In some configurations, the supplemental information may come from a plurality of external sources, for example, a dashboard camera of another vehicle, a traffic camera, or any other sensor that can generate data used in a consolidated report of an event. In some configurations, a smartphone or dashboard camera may incorporate information about the incident, including video data, barometric data, temperature data, weather forecast data, weather history data, precipitation metrics, location data, and the like. Such data and other context data can be used to build confidence in a report about an event, and all of this context data may be stitched together by one or more common data points, such as a common timestamp.
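For purposes of illustration only, the following sketch shows one plausible form of the stitching merge, aligning a dashboard-camera record and a smartphone record of the same incident on their timestamps and combining their complementary fields; the field names and the offset-estimation rule are assumptions of the sketch.

```python
# Sketch of the "stitching" feature: merging a dashboard-camera record
# and a smartphone record of the same incident into one consolidated
# report. Field names and the offset-estimation rule are illustrative
# assumptions.
def stitch(cam_record: dict, phone_record: dict) -> dict:
    # The camera clock need not be synchronized with the phone clock, so
    # estimate the offset from the two timestamps of the same incident.
    offset = phone_record["timestamp"] - cam_record["timestamp"]
    return {
        "incident_time": phone_record["timestamp"],  # phone clock as reference
        "clock_offset_s": offset,
        "location": phone_record.get("location"),    # phone-only information
        "speed": phone_record.get("speed"),
        "user_id": phone_record.get("user_id"),
        "video": cam_record.get("video_path"),       # camera-only information
    }
```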

The stitching feature and the related merging of information described above may be implemented at the time of a particular event, when a connection between the dashboard camera and the smartphone can be established. In other cases, the merging of information may be performed offline, after the necessary information has been accessed separately from the dashboard camera and the smartphone. Although the stitching feature is described with reference to an incident, the present disclosure is not limited in this regard. The stitching feature may be applied to any event disclosed herein. Further, the stitching feature may be applied to other types of IoT devices and remote devices besides dashboard cameras and smartphones.

A computing device having access to the consolidated information can generate and present a summary of the incident. In one example, the summary may be presented as an Augmented Reality (AR) video clip of the incident, where various markers may be added to draw attention to specific elements of the incident. In some cases, the smartphone may also send the merged information to another computing device that is remotely located relative to the smartphone in response to another defined event. The other computing device may correspond to a first responder or another type of authority.

As the computing power of IoT devices increases, IoT devices may generate richer data and/or metadata in response to particular events. For example, upon detection of a particular event, the dashboard camera may initiate video recording at a higher frame rate relative to recording in the absence of the particular event. In addition, or in some cases, the dashboard camera may detect license plates associated with an AMBER Alert or with most-wanted criminals, and may then transmit a timestamp of the time such detection occurred and a video clip of such an event. More generally, a dashboard camera may monitor traffic to search for a defined set of license plates. In response to identifying one or more license plates from the set, the dashboard camera may transmit the timestamp and, optionally, the video clip following the identification. Devices connected to the dashboard camera may utilize the timestamps and video clips in a variety of ways.

In other embodiments, some IoT devices may recognize audio and images, such as faces. The reference audio and the reference images to be identified may be maintained in a specific audio profile and a specific image profile, respectively. In response to identified audio or an identified image, the IoT device may connect to the remote device or another trusted computing device. As disclosed herein, the IoT device may receive certain data and metadata from the remote device after the connection. After the data and metadata are received, the dashboard camera 114 may disconnect from the remote device 118.

Additionally, or in some further embodiments, the predetermined event may be movement in a predetermined pattern detected by the dashboard camera 114. For example, the driver may get lost and drive around in circles, and such a pattern may indicate a predetermined event. As another example, the driver may be weaving, which may also indicate a predetermined event.
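One plausible heuristic for the "driving in circles" pattern, provided here only as an illustrative sketch, compares the length of the traveled path with the net displacement over a window of positions; the 0.2 ratio threshold is an assumption.

```python
# Sketch: a plausible heuristic for the "driving in circles" pattern.
# Over a window of positions, a traveled path much longer than the net
# displacement suggests circling. The 0.2 ratio is an assumed threshold.
import math
from typing import List, Tuple

def looks_like_circling(track: List[Tuple[float, float]], ratio: float = 0.2) -> bool:
    if len(track) < 2:
        return False
    path = sum(math.dist(track[i], track[i + 1]) for i in range(len(track) - 1))
    net = math.dist(track[0], track[-1])
    return path > 0 and (net / path) < ratio
```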

Figs. 8 and 9 illustrate flow diagrams of example methods in accordance with aspects of the technology disclosed herein. Each example method is illustrated as a collection of blocks in a logical flow graph, which represents a sequence of operations that can be implemented in hardware, software, or a combination thereof. In the context of software, the blocks represent computer-readable instructions that, when executed by one or more processors, perform the recited operations. Generally, computer-readable instructions include routines, programs, objects, components, data structures, and the like that perform or implement particular functions. The order in which the operations are described is not intended to be construed as a limitation, and any number of the described blocks can be combined in any order and/or in parallel to implement a process. Other techniques described throughout this disclosure may be interpreted accordingly.

Fig. 8 presents an example of a method for provisioning information to a computing device in accordance with one or more embodiments of the present disclosure. In some embodiments, the computing device may include an IoT device, such as a mobile device, a wearable device, a consumer electronics device, and the like. A computing system may implement the example method 800 in whole or in part. The computing system includes or is functionally coupled to one or more processors, one or more memory devices, other types of computing resources, combinations thereof, and the like. Such processor(s), memory device(s), and computing resource(s), alone or in a particular combination, permit or otherwise support implementation of the example method 800. The computing resources may include an operating system (O/S); software for configuring and/or controlling a virtualized environment; firmware; central processing unit(s) (CPU(s)); graphics processing unit(s) (GPU(s)); virtual memory; disk space; downstream bandwidth and/or upstream bandwidth; interface(s) (I/O interface devices); programming interface(s) (such as application programming interfaces (APIs)); controller device(s); a power supply; combinations of the foregoing; and the like.

At block 810, a processing device included in a computing device may receive sensor data from an accelerometer. The sensor data may include acceleration information of a vehicle that includes the computing device. The acceleration information may include the magnitude of the acceleration or the direction of the acceleration, or both. In some embodiments, the accelerometer may also be included within the computing device. In one embodiment, the computing device may be computing device A 520 and the accelerometer may be part of the sensor(s) 510 (see fig. 5A).

At block 820, the processing device may use at least the acceleration information to detect that a magnitude of the acceleration of the computing device exceeds a threshold magnitude of acceleration in a predetermined direction. Such a threshold magnitude may be equal to, for example, 2g, 3g, 4g, or 5g. At block 830, the processing device may initiate a data connection between the computing device and a second computing device. The data connection may be initiated in response to detecting that the magnitude of the acceleration of the computing device exceeds the threshold magnitude of acceleration in the predetermined direction. In some embodiments, the second computing device may be a mobile device or a wearable device. For example, the mobile device may be implemented as a smartphone. The wearable device may be, for example, a smart watch or an electronic wristband.
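A minimal sketch of the detection at block 820 is shown below: the accelerometer sample is projected onto the predetermined direction, and the magnitude along that direction is compared against a threshold such as 3g. Representing samples and directions as 3-vectors is an assumption of the sketch.

```python
# Sketch of block 820: project an accelerometer sample onto the
# predetermined direction and compare the magnitude along that direction
# against a threshold such as 3g. Representing samples and directions as
# 3-vectors is an assumption of this sketch.
G = 9.81  # standard gravity, m/s^2

def exceeds_threshold(sample_ms2, direction, threshold_g=3.0):
    """sample_ms2 and direction are (x, y, z) tuples; returns a bool."""
    norm = sum(c * c for c in direction) ** 0.5
    unit = tuple(c / norm for c in direction)             # normalize direction
    along = sum(a * u for a, u in zip(sample_ms2, unit))  # projected component
    return abs(along) > threshold_g * G
```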

At block 840, the processing device may receive at least one of timing information or location information from the second computing device. The timing information may include, for example, a time and date when the magnitude of the acceleration of the computing device exceeds the threshold magnitude of acceleration in the predetermined direction. The location information may include, for example, a set of coordinates of the computing device when the magnitude of the acceleration of the computing device exceeds the threshold in the predetermined direction. The set of coordinates may include a combination of longitude, latitude, and altitude.

At block 850, the processing device may determine whether communication of at least one of timing information or location information is complete. In response to a negative determination, the flow of the example method 800 may return to block 840. In the alternative, the flow of the example method may continue to block 860 in response to a positive determination. At block 860, the processing device may terminate the data connection between the computing device and the second computing device.
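The following sketch illustrates one possible shape of blocks 830 through 860, under an assumed transport object that exposes connect, receive, and close operations and signals completion with an "EOT" message; none of these names are part of the disclosed techniques.

```python
# Sketch of blocks 830-860: open the data connection, receive timing
# and/or location information until completion is signaled, then
# terminate. The transport object, its connect/receive/close methods,
# and the "EOT" sentinel are assumptions of this sketch.
def exchange_event_info(transport) -> list:
    received = []
    transport.connect()                        # block 830
    try:
        while True:                            # blocks 840-850
            message = transport.receive()
            if message is None or message.get("type") == "EOT":
                break                          # communication complete
            received.append(message)
    finally:
        transport.close()                      # block 860
    return received
```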

At block 870, the processing device may add one or more of the received timing information or the received location information to the video data defining the video segment. Additionally, or in some embodiments, the processing device may add acceleration information included in the received sensor data to the video data. In some cases, a video segment may be generated in response to the detection at block 820. In one example, a computing device may be implemented in a dashboard camera device and may generate imaging data defining a video clip.

Fig. 9 presents an example of another method, for generating a record of a vehicle accident using a computing device, in accordance with one or more embodiments of the present disclosure. In some embodiments, the computing device may include an IoT device, such as a mobile device, a wearable device, a consumer electronics device, and the like. A computing system may implement the example method 900 in whole or in part. The computing system includes or is functionally coupled to one or more processors, one or more memory devices, other types of computing resources, combinations thereof, and the like. Such processor(s), memory device(s), and computing resource(s), alone or in a particular combination, permit or otherwise support implementation of the example method 900. The computing resources may include an operating system (O/S); software for configuring and/or controlling a virtualized environment; firmware; central processing unit(s) (CPU(s)); graphics processing unit(s) (GPU(s)); virtual memory; disk space; downstream bandwidth and/or upstream bandwidth; interface(s) (I/O interface devices); programming interface(s) (such as application programming interfaces (APIs)); controller device(s); a power supply; combinations of the foregoing; and the like.

At block 910, a processing device included in a computing device may receive video data from an imaging sensor. The video data may represent a plurality of scenes corresponding to the surroundings of a vehicle. For example, the vehicle may be the vehicle 110, and the plurality of scenes may include the scene illustrated in fig. 6A. In one configuration, the imaging sensor may be functionally coupled to the computing device (see, e.g., fig. 5A). In another configuration, the imaging sensor may be integrated into the computing device (see, e.g., fig. 5B).

At block 920, the processing device may analyze the video data to identify a second vehicle in at least one set of scenes from the plurality of scenes. More specifically, the processing device may identify the second vehicle in a first scene of the plurality of scenes, and may continue to identify that vehicle across other scenes subsequent to the first scene. In other words, the processing device may track the second vehicle across the set of scenes of the plurality of scenes. In some configurations, the video data defines a plurality of image frames at a defined frame rate, each image frame of the plurality of image frames corresponding to one scene of the plurality of scenes. Accordingly, analyzing the video data to identify the second vehicle in at least one group of the plurality of scenes may include determining a series of positions of the second vehicle in respective consecutive image frames of the plurality of image frames. Continuing with the example in the description of block 910 above, the processing device may identify the second vehicle in the scene shown in fig. 6A, and may track that vehicle from the vantage point of the vehicle 110 across subsequent scenes.
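For purposes of illustration, the sketch below shows one simple association strategy for such tracking: each new detection is matched to the nearest previously recorded centroid, subject to a gating distance. The per-frame detector and the gating threshold are assumptions of the sketch.

```python
# Sketch of block 920: tracking the second vehicle across frames by
# nearest-centroid association. The per-frame detector returning
# candidate bounding-box centroids is an assumed upstream component.
import math
from typing import Iterable, List, Tuple

Point = Tuple[float, float]

def track_vehicle(
    frames: Iterable[object],
    detect_centroids,          # hypothetical detector: frame -> list of Points
    start: Point,
    max_jump: float = 50.0,    # assumed gating distance, in pixels
) -> List[Point]:
    """Return per-frame positions of the vehicle first seen at `start`."""
    positions = [start]
    for frame in frames:
        candidates = detect_centroids(frame)
        if not candidates:
            continue  # vehicle not detected in this frame
        nearest = min(candidates, key=lambda c: math.dist(c, positions[-1]))
        if math.dist(nearest, positions[-1]) <= max_jump:
            positions.append(nearest)
    return positions
```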

At block 930, the processing device may analyze the video data to determine an acceleration of the second vehicle. To do so, the processing device may access the series of positions of the second vehicle, and may then generate an estimate of the acceleration using the series of positions and the defined frame rate. At block 940, using the estimate of the acceleration of the second vehicle, the processing device may detect that a magnitude of the acceleration exceeds a threshold magnitude of acceleration in a predetermined direction.

At block 950, the processing device may initiate a data connection between the computing device and a second computing device. The data connection may be initiated in response to detecting that the magnitude of the acceleration of the second vehicle exceeds the threshold magnitude of acceleration in the predetermined direction. In some embodiments, the second computing device may be a mobile device or a wearable device. At block 960, the processing device may receive at least one of timing information or location information from the second computing device. The timing information may include, for example, a time and date when the magnitude of the acceleration of the second vehicle exceeds the threshold magnitude of acceleration in the predetermined direction. The location information may include, for example, coordinates of the computing device when the magnitude of the acceleration of the second vehicle exceeds the threshold in the predetermined direction. The coordinates may include a longitude and a latitude. In some cases, an altitude may also be included in the coordinates. For purposes of illustration, the coordinates may also be referred to herein as a set of coordinates.

At block 970, the processing device may determine whether communication of at least one of the received timing information or the received location information is complete. In response to a negative determination, the flow of the example method 900 may return to block 960. In the alternative, the flow of example method 900 may continue to block 980 in response to a positive determination. At block 980, the processing device may terminate the data connection between the computing device and the second computing device.

Fig. 10 illustrates additional details of an example computer architecture for components capable of executing the program components described above for provisioning information to IoT devices and other computing devices. The computer architecture shown in fig. 10 includes a computing device 1000 that, in some configurations, may implement an IoT device operating in accordance with aspects of the present disclosure. In an example embodiment, the computing device 1000 may implement the dashboard camera 114 (see, e.g., fig. 1) or a network appliance. In other configurations, the computing device 1000 may implement another type of computing device, such as computing device A 520. In other examples, the computing device 1000 may implement or may constitute a gaming console, server computer, workstation, desktop computer, laptop computer, tablet computer, personal digital assistant (PDA), e-reader, digital cellular telephone, or other computing device, and may be used to execute any of the software components presented herein. The computer architecture shown in fig. 10 may be used to execute any of the software components described above to implement a reactive mechanism for provisioning information to IoT devices or other types of computing devices. Although some of the components described herein are specific to IoT devices, it should be appreciated that such components and others may be part of a remote computer, such as the remote device 118.

The computing device 1000 includes a baseboard 1002, or "motherboard," which is a printed circuit board to which a number of components or devices may be connected by way of a system bus or other electrical communication paths. In one embodiment, one or more central processing units (CPUs) 1004 operate in conjunction with a chipset 1006. The CPU(s) 1004 may be standard programmable processors that perform the arithmetic and logical operations necessary for the operation of the computing device 1000.

The CPU 1004 performs operations by transitioning from one discrete physical state to the next through the manipulation of switching elements that differentiate between and change these states. A switching element may generally include electronic circuitry, such as a flip-flop, that maintains one of two binary states, and may include electronic circuitry, such as a logic gate, that provides an output state based on a logical combination of the states of one or more other switching elements. These basic switching elements may be combined to create more complex logic circuits, including registers, adders-subtractors, arithmetic logic units, floating-point units, and the like.

The chipset 1006 provides an interface between the CPU 1004 and the remainder of the components and devices on the baseboard 1002. The chipset 1006 may provide an interface to RAM 1008, used as the main memory in the computing device 1000. The chipset 1006 may also provide an interface to a computer-readable storage medium, such as a read-only memory ("ROM") 1010 or non-volatile RAM ("NVRAM"), for storing data that helps to start up the computing device 1000 and to transfer information between the various components and devices. The ROM 1010 or NVRAM may also store other software components necessary for the operation of the computing device 1000 in accordance with the embodiments described herein.

The computing device 1000 may operate in a networked environment using logical connections to remote computing devices and computer systems through a network, such as the network 1055. The chipset 1006 may include functionality for providing network connectivity through a network interface controller (NIC) 1012, such as a gigabit Ethernet adapter. The NIC 1012 is capable of connecting the computing device 1000 to other computing devices over the network 1055. For purposes of illustration, the NIC 1012 is also referred to herein as a wireless communication module. The NIC 1012 may include the circuitry necessary for establishing wired or wireless communication between the computing device 1000 and any other computer or network. The wireless communication may be in accordance with various radio-technology protocols, including, for example, Bluetooth™, NFC, and other short-range point-to-point wireless communication protocols. The combination of the chipset 1006 and the NIC 1012 may implement one or more of the communication components 410.

It should be appreciated that multiple NICs 1012 may be present in the computing device 1000, connecting the computing device to other types of networks and remote computer systems. The network 1055 permits the computing device 1000 to communicate with remote services and servers. The NIC 1012 or other components may be utilized to provide network or Internet access to an external computing device 1050. The computing device 1050 may implement the remote device 118 or computing device B 530 as described herein. In some embodiments, the computing device 1050 may include at least some of the components included in the computing device 1000. For example, the computing device 1050 may include the CPU(s) 1004, the mass storage device 1026, the NIC 1012, and the chipset 1006. At least such components may permit operation of the computing device 1050 in accordance with the functionality described herein.

The computing device 1000 may be connected to a mass storage device 1026 that provides non-volatile storage for the computing device. The mass storage device 1026 may store system programs, application programs, other program modules, and data, which have been described in greater detail herein. The mass storage device 1026 may be connected to the computing device 1000 through a storage controller 1015 connected to the chipset 1006. The mass storage device 1026 may consist of one or more physical storage units. The storage controller 1015 may interface with the physical storage units through a serial attached SCSI ("SAS") interface, a serial advanced technology attachment ("SATA") interface, a fiber channel ("FC") interface, or another type of interface for physically connecting and transferring data between computers and physical storage units. It should also be appreciated that the mass storage device 1026, other storage media, and the storage controller 1015 may include multimedia card (MMC) components, eMMC components, secure digital (SD) components, PCI Express components, and the like.

The computing device 1000 may store data on the mass storage device 1026 by transforming the physical state of the physical storage units to reflect the information being stored. The specific transformation of physical state may depend on various factors in different embodiments of this description. Examples of such factors may include, but are not limited to, the technology used to implement the physical storage units and whether the mass storage device 1026 is characterized as primary or secondary storage, and the like.

For example, the computing device 1000 may store information to the mass storage device 1026 by issuing instructions through the storage controller 1015 to alter the magnetic characteristics of a particular location within a magnetic disk drive unit, the reflective or refractive characteristics of a particular location in an optical storage unit, or the electrical characteristics of a particular capacitor, transistor, or other discrete component in a solid-state storage unit. Other transformations of physical media are possible without departing from the scope and spirit of the present description, with the foregoing examples provided only to facilitate this description. The computing device 1000 may further read information from the mass storage device 1026 by detecting the physical states or characteristics of one or more particular locations within the physical storage units.

In addition to the mass storage device 1026 described above, the computing device 1000 may have access to other computer-readable media to store and retrieve information, such as program modules, data structures, or other data. Thus, although the event detection module 415, the enhancement module 420, the detection rule(s) 425, the event data 430, and other modules are depicted as data and software stored in the mass storage device 1026, these components and/or other modules may be stored, at least in part, in other computer-readable storage media of the computing device 1000. Although the description of computer-readable media contained herein refers to a mass storage device, such as a solid-state drive, a hard disk, or a CD-ROM drive, it should be appreciated by those skilled in the art that computer-readable media can be any available computer storage media or communication media that can be accessed by the computing device 1000.

Communication media include computer-readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and include any delivery media. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media include wired media, such as a wired network or direct-wired connection, and wireless media, such as acoustic, RF, infrared, and other wireless media. Combinations of any of the above should also be included within the scope of computer-readable media.

By way of example, and not limitation, computer storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for the storage of information, such as computer-readable instructions, data structures, program modules, or other data. For example, computer storage media include, but are not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other solid-state memory technology, CD-ROM, digital versatile disks ("DVD"), HD-DVD, BLU-RAY, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by the computing device 1000. For purposes of the claims, the phrase "computer storage medium," and variations thereof, does not include waves or signals themselves and/or communication media.

The mass storage device 1026 may store an operating system 1027 that is used to control the operation of the computing device 1000. According to one embodiment, the operating system comprises a gaming operating system. According to another embodiment, the operating system comprises any software that can control the processor. According to a further embodiment, the operating system may comprise a UNIX or ANDROID operating system. It should be appreciated that other operating systems may also be utilized. The mass storage device 1026 may store other systems or application programs and data utilized by the computing device 1000, such as the event detection module 415, the enhancement module 420, the detection rule(s) 425, the event data 430, and/or any of the other software components and data described above. The event data 430 may store data generated by the event detection module 415 or the enhancement module 420, or both. For example, if a Bluetooth pairing is initiated between two computers (e.g., computing device A 520 and computing device B 530), data characterizing such pairing (e.g., the message 532) may be stored in the event data 430. The mass storage device 1026 may also store other programs and data not specifically identified herein.

In one embodiment, the mass storage device 1026 or other computer-readable storage media are encoded with computer-executable instructions that, when loaded into the computing device 1000, transform the computer from a general-purpose computing system into a special-purpose computer capable of implementing the embodiments described herein. These computer-executable instructions transform the computing device 1000 by specifying how the CPU(s) 1004 transition between states, as described above. According to one embodiment, the computing device 1000 may have access to computer-readable storage media storing computer-executable instructions that, when executed by the computing device 1000, perform the various techniques described above with respect to figs. 1A-7 and/or any of the techniques disclosed herein. The computing device 1000 may also include computer-readable storage media for performing any of the other computer-implemented operations described herein.

The computing device 1000 may also include one or more input/output controllers 1016 for receiving and processing input from a number of input devices, such as a keyboard, a mouse, a microphone, a headset, a touchpad, a touchscreen, an electronic pen, or any other type of input device. As also shown, the input/output controller 1016 is in communication with an input/output device 1025. The input/output controller 1016 may provide output to a display, such as a computer monitor, a flat-panel display, a digital projector, a printer, a plotter, or another type of output device. The input/output controller 1016 may provide input communication with other electronic devices, such as a game controller and/or audio devices. Additionally or alternatively, a video output 1022 may be in communication with the chipset 1006 and operate independently of the input/output controller 1016.

The computing device 1000 may also include a set of sensors 1024. The sensor(s) 1024 may include one or a combination of: inertial sensors (e.g., an accelerometer or a gyroscope, or both); an imaging sensor; a pressure sensor; a temperature sensor; a charge sensor; and the like. Thus, in some configurations, the sensor(s) 1024 may implement the accelerometer, the sensor(s) 510, or the sensor(s) 590. In other configurations, at least one of the sensor(s) 1024 may constitute the camera assembly 405. It should be noted that the computing device 1000 may not include all of the components shown in fig. 10, may include other components that are not explicitly shown in fig. 10, or may utilize an architecture completely different from that shown in fig. 10.

Based on the foregoing, it should be appreciated that techniques for reactively provisioning IoT devices with information have been disclosed herein. Although the subject matter presented herein has been described in language specific to computer structural features, methodological and transformative acts, specific computing machinery, and computer-readable media, it is to be understood that the subject matter set forth in the appended claims is not necessarily limited to the specific features, acts, or media described herein. Rather, the specific features, acts, and media are disclosed as example forms of implementing the claimed subject matter.

The following examples are provided to supplement the present disclosure.

Example a: a method, comprising: receiving sensor data from an accelerometer, the sensor data including acceleration information of a vehicle, the vehicle including a first computing device; detecting, by the first computing device, using at least the acceleration information, that a magnitude of the acceleration of the first computing device exceeds a threshold magnitude of the acceleration in a predetermined direction; in response to detecting that the magnitude of the acceleration exceeds a threshold magnitude of acceleration in a predetermined direction, initiating a data connection between the first computing device and the second computing device; receiving, by the first computing device, at least one of timing information or location information from the second computing device; and terminating the data connection between the first computing device and the second computing device.

Example B: the method of example a, further comprising: detecting, by the first computing device, a magnitude of acceleration of the first computing device in a second predetermined direction using at least the acceleration information, wherein the data connection between the first computing device and the second computing device is initiated in response to determining that the magnitude of acceleration of the first computing device in the second predetermined direction is less than the magnitude of acceleration of the first computing device in the predetermined direction.

Example C: the method according to examples a and B, further comprising: detecting, by the first computing device, a magnitude of acceleration of the first computing device in a second predetermined direction using at least the acceleration information, wherein the data connection between the first computing device and the second computing device is initiated in response to: the method further includes detecting that a magnitude of the acceleration of the first computing device exceeds a threshold magnitude of the acceleration in a predetermined direction, and determining that the magnitude of the acceleration of the first computing device in the predetermined direction is greater than the magnitude of the acceleration of the first computing device in a second predetermined direction by a predetermined difference.

Example D: The method of examples A-C, wherein the timing information includes a time and date when the magnitude of the acceleration of the first computing device exceeds the threshold magnitude of acceleration in the predetermined direction.

Example E: the method according to examples a-D, wherein the location information comprises a set of coordinates of the first computing device when a magnitude of acceleration of the first computing device exceeds a threshold in a predetermined direction.

Example F: the method according to examples a-E, wherein the data connection between the first computing device and the second computing device is terminated in response to completing the communication of at least one of: timing information, location information, or acceleration information.

Example G: the method according to examples a-F, wherein the first computing device comprises a dashboard camera device, the method further comprising adding to the video data one or more of: the received acceleration information, the received timing information, or the received location information, the video data defining a video clip generated by the dashboard camera device after the detection.

Example H: the method of examples a-G, wherein initiating the data connection comprises initiating a wireless pairing of the first computing device with an electronic wristband device, a smart watch, or a smartphone.

Example I: a method, comprising: receiving video data from an imaging sensor, the video data representing a plurality of scenes corresponding to a periphery of a first vehicle; analyzing the video data to identify a second vehicle in at least one of the plurality of scenes; analyzing the video data to determine an acceleration of the second vehicle; detecting, by the first computing device, that a magnitude of acceleration of the second vehicle exceeds a threshold magnitude of acceleration in a predetermined direction; in response to detecting that a magnitude of acceleration of the second vehicle exceeds a threshold in a predetermined direction, initiating a data connection between the first computing device and the second computing device; receiving, by a first computing device, at least one of timing information or location information; and terminating the data connection between the first computing device and the second computing device.

Example J: the method of example I, wherein the video data defines a plurality of image frames at a defined frame rate, each image frame of the plurality of image frames corresponding to a scene of the plurality of scenes, and wherein analyzing the video data to determine the acceleration of the second vehicle comprises: determining a series of positions of a second vehicle in respective successive image frames of the plurality of image frames; and generating an estimate of the acceleration using the series of positions and the defined frame rate.

Example K: The method according to examples I and J, wherein the timing information includes a time and date when the magnitude of the acceleration of the second vehicle exceeds the threshold magnitude of acceleration in the predetermined direction.

Example L: the method of examples I-K, wherein the location information includes a set of coordinates of the first computing device when a magnitude of acceleration of the second vehicle exceeds a threshold magnitude in a predetermined direction.

Example M: the method according to examples I-L, wherein the first computing device comprises a dashboard camera device, the method further comprising adding to the second video data one or more of: the received acceleration information, the received timing information, or the received location information, the second video data defining a video clip generated by the dashboard camera device after the detection.

Example N: a computing device, comprising: one or more processors; and a memory in communication with the one or more processors, the memory having computer-executable instructions stored thereon that, when executed by the one or more processors, cause the computing device to perform operations comprising: receiving sensor data indicative of a measure of a physical property of an object; using the sensor data to detect an occurrence of a defined event, the defined event comprising a change in a physical property of the object, wherein the change satisfies one or more criteria; in response to detecting that the physical property of the object satisfies the one or more criteria, initiating a data connection between the computing device and the second computing device; receiving, from the second computing device, at least one of timing information or location information corresponding to an occurrence of a defined event; and terminating the data connection between the computing device and the second computing device.

Example O: The computing device of example N, wherein the physical property comprises one of acceleration, linear velocity, angular velocity, position, temperature, or pressure, and wherein the defined event comprises an accident involving a vehicle.

Example P: the computing device of examples N and O, wherein the location information comprises coordinates of the computing device when the defined change in the physical property has a magnitude that exceeds a threshold amount.

Example Q: the computing device of examples N-P, wherein the computing device comprises a camera device, the operations comprising adding one or more of the received timing information or the received location information to second video data, the second video data defining a video clip generated by the camera device after the detecting.

Example R: the computing device of examples N to Q, the operations further comprising adding imaging data defining one or more User Interface (UI) elements to the second video data, wherein a particular UI element of the one or more UI elements represents an operating condition of the second vehicle.

Example S: The computing device of examples N through R, wherein the computing device comprises a camera device, the operations further comprising: in response to the detecting, initiating recording of a video segment at a first frame rate, the first frame rate being greater than a second frame rate configured for recording video segments in the absence of a defined event.

Example T: the computing device of examples N to R, wherein the detecting comprises: generating an image of traffic within a field of view of a camera device; analyzing the image for the presence of defined vehicle features, wherein the defined vehicle features include at least one of: a defined license plate, a defined brand, a defined model, a defined paint type, a defined packaging type, or a defined damage type; and determining that the defined vehicle feature is present in the image.

The above-described subject matter is provided by way of illustration only and should not be construed as limiting. Various modifications and changes may be made to the subject matter described herein without following the example configurations and applications illustrated and described and without departing from the scope of the present disclosure, which is set forth in the claims below.
