Collision warning system and method for a safety operator of an autonomous vehicle

Document No.: 147843; publication date: 2021-10-26

Reading note: This invention, "Collision warning system and method for a safety operator of an autonomous vehicle", was created by Zhu Fan (朱帆) on 2020-12-25. Its main content is as follows. Embodiments disclose a collision warning system and method for a safety operator of an autonomous vehicle. According to one embodiment, a system senses the environment of an autonomously driven vehicle (ADV), including one or more obstacles. The system determines from a planned trajectory whether the ADV will potentially collide with one or more of the obstacles. If so, the system determines a time to collision based on the planned trajectory and the one or more obstacles. If the determined time to collision is less than a threshold, or the time to collision decreases for a predetermined number of consecutive planning cycles, the system generates an alert signal to alert an operator of the ADV. The system sends the alert signal to a user interface of the ADV to warn the operator of the potential collision.

1. A collision warning method for an autonomously driven vehicle, ADV, the method comprising:

sensing an environment of an Autonomously Driven Vehicle (ADV), the environment including one or more obstacles;

determining, based on a planned trajectory of the ADV, whether the ADV will potentially collide with the one or more obstacles;

determining a time to collision based on the planned trajectory and the one or more obstacles if it is determined that the ADV will potentially collide;

generating an alert signal to alert an operator of the ADV if the determined time to collision is less than a threshold or the time to collision decreases for a predetermined number of consecutive planning cycles; and

sending the alert signal to a user interface of the ADV to alert the operator of the potential collision.

2. The computer-implemented method of claim 1, wherein the alert signal is sent to a user interface of the ADV over a Controller Area Network (CAN) bus to alert the operator.

3. The computer-implemented method of claim 1, wherein the threshold is about 2 seconds and the predetermined number of consecutive planning cycles is about 5.

4. The computer-implemented method of claim 1, further comprising: displaying the alert signal on a display device of the ADV, or sounding an alarm through a speaker device of the ADV.

5. The computer-implemented method of claim 1, wherein the ADV applies hard braking if it is determined that the time to collision is less than the threshold.

6. The computer-implemented method of claim 5, wherein the ADV applies light braking if it is determined that the time to collision is greater than the threshold but the time to collision is decreasing for the predetermined number of consecutive planning cycles.

7. The computer-implemented method of claim 6, wherein the light braking is approximately 1 m/s^2 and the hard braking is approximately 3 m/s^2.

8. A non-transitory machine-readable medium having instructions stored therein, which when executed by one or more processors, cause the one or more processors to perform the collision warning method for an autonomously driven vehicle of any one of claims 1-7.

9. A data processing system comprising:

one or more processors; and

a memory coupled to the one or more processors and storing instructions that, when executed by the one or more processors, cause the one or more processors to perform the collision warning method for an autonomously driven vehicle of any one of claims 1-7.

10. A computer program product comprising a computer program which, when executed by a processor, implements a collision warning method for an autonomously driven vehicle as claimed in any one of claims 1 to 7.

Technical Field

Embodiments of the present disclosure relate generally to operating an autonomously driven vehicle. More particularly, embodiments of the present disclosure relate to collision warning systems and methods for safety operators of Autonomous Driving Vehicles (ADVs).

Background

A vehicle operating in an autonomous mode (e.g., unmanned) may relieve some of the driving-related responsibilities of the occupants, particularly the driver. When operating in an autonomous mode, the vehicle may navigate to various locations using onboard sensors, allowing the vehicle to travel with minimal human interaction or in some situations without any passengers.

The safety operator of an autonomously driven vehicle can help prevent traffic accidents and the resulting fatalities. In case of an obvious risk of collision, the safety operator can take over from the autonomous driving system. However, it is difficult for a safety operator to remain fully attentive at all times. Therefore, a safety mechanism that alerts the safety operator when extra attention should be paid is very important.

Disclosure of Invention

In a first aspect, there is provided a collision warning method for an Autonomously Driven Vehicle (ADV), the method comprising:

sensing an environment of an Autonomously Driven Vehicle (ADV), the environment including one or more obstacles;

determining, based on a planned trajectory of the ADV, whether the ADV will potentially collide with the one or more obstacles;

determining a time to collision based on the planned trajectory and the one or more obstacles if it is determined that the ADV will potentially collide;

generating an alert signal to alert an operator of the ADV if the determined time to collision is less than a threshold or the time to collision decreases for a predetermined number of consecutive planning cycles; and

sending the alert signal to a user interface of the ADV to alert the operator of the potential collision.

In a second aspect, there is provided a non-transitory machine-readable medium having instructions stored therein, which when executed by one or more processors, cause the one or more processors to perform the collision warning method for an autonomously driven vehicle as described in the first aspect.

In a third aspect, there is provided a data processing system comprising:

one or more processors; and

a memory coupled to the one or more processors and storing instructions that, when executed by the one or more processors, cause the one or more processors to perform a collision warning method for an autonomously driven vehicle as described in the first aspect.

In a fourth aspect, a computer program product is provided, comprising a computer program which, when executed by a processor, implements the collision warning method for an autonomously driven vehicle as described in the first aspect.

According to the present disclosure, it is possible to effectively monitor whether a collision will occur and, where a collision is likely, to issue a warning signal that prompts the safety operator, so that the operator does not need to maintain constant attention and driving safety is improved.

Drawings

Embodiments of the present disclosure are illustrated by way of example and not limitation in the figures of the accompanying drawings in which like references indicate similar elements.

FIG. 1 is a block diagram illustrating a networked system according to one embodiment.

FIG. 2 is a block diagram illustrating an example of an autonomously driven vehicle according to one embodiment.

FIGS. 3A-3B are block diagrams illustrating an example of an autonomous driving system for use with an autonomously driven vehicle according to one embodiment.

FIG. 4 is a block diagram illustrating a collision warning module according to one embodiment.

FIG. 5 illustrates an example of an ADV in a potential collision, according to one embodiment.

FIG. 6 illustrates a time diagram of an ADV in another potential collision, according to one embodiment.

FIG. 7 is a block diagram illustrating an operator interface of an ADV according to one embodiment.

FIG. 8 is a flow diagram illustrating a method performed by an ADV according to one embodiment.

Detailed Description

Various embodiments and aspects of the disclosure will be described with reference to details discussed below, and the accompanying drawings will illustrate the various embodiments. The following description and drawings are illustrative of the disclosure and are not to be construed as limiting the disclosure. Numerous specific details are described to provide a thorough understanding of various embodiments of the present disclosure. However, in certain instances, well-known or conventional details are not described in order to provide a concise discussion of embodiments of the present disclosure.

Reference in the specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the disclosure. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment.

Embodiments disclose a system and method for sending warnings/alerts of potential collisions to a safety operator of an autonomously driven vehicle (ADV). According to one embodiment, a system senses the environment of an ADV, including one or more obstacles. The system determines from a planned trajectory whether the ADV will potentially collide with one or more of the obstacles. If so, the system determines a time to collision based on the planned trajectory and the one or more obstacles. If the determined time to collision is less than a threshold, or the time to collision decreases for a predetermined number of consecutive planning cycles, the system generates an alert signal alerting an operator of the ADV. The system sends the alert signal to a user interface of the ADV to alert the operator of the potential collision.

FIG. 1 is a block diagram illustrating an autonomous vehicle network configuration according to one embodiment of the present disclosure. Referring to FIG. 1, a network configuration 100 includes an autonomously driven vehicle (ADV) 101 that may be communicatively coupled to one or more servers 103-104 over a network 102. Although one ADV is shown, multiple ADVs may be coupled to each other and/or to servers 103-104 via network 102. The network 102 may be any type of wired or wireless network, such as a local area network (LAN), a wide area network (WAN) such as the Internet, a cellular network, a satellite network, or a combination thereof. Servers 103-104 may be any kind of server or server cluster, such as Web or cloud servers, application servers, backend servers, or a combination thereof. Servers 103-104 may be data analysis servers, content servers, traffic information servers, map and point of interest (MPOI) servers, location servers, etc.

An ADV refers to a vehicle that may be configured into an autonomous mode in which the vehicle navigates through the environment with little or no input from a driver. Such an ADV may include a sensor system having one or more sensors configured to detect information about the environment in which the vehicle operates. The vehicle and its associated controller use the detected information to navigate through the environment. ADV 101 may operate in a manual mode, a fully autonomous mode, or a partially autonomous mode.

In one embodiment, ADV 101 includes, but is not limited to, an autonomous driving system (ADS) 110, a vehicle control system 111, a wireless communication system 112, a user interface system 113, and a sensor system 115. ADV 101 may also include certain common components included in a typical vehicle, such as an engine, wheels, steering wheel, transmission, etc., which may be controlled by the vehicle control system 111 and/or the ADS 110 using a variety of communication signals and/or commands, such as acceleration signals or commands, deceleration signals or commands, steering signals or commands, braking signals or commands, etc.

Components 110-115 may be communicatively coupled to each other via an interconnect, a bus, a network, or a combination thereof. For example, components 110-115 may be communicatively coupled to each other via a controller area network (CAN) bus. A CAN bus is a vehicle bus standard designed to allow microcontrollers and devices to communicate with each other in applications without a host computer. It is a message-based protocol, designed originally for multiplex electrical wiring within automobiles, but it is also used in many other contexts.

Referring now to FIG. 2, in one embodiment, the sensor system 115 includes, but is not limited to, one or more cameras 211, a Global Positioning System (GPS) unit 212, an inertial measurement unit (IMU) 213, a radar unit 214, and a light detection and ranging (LIDAR) unit 215. The GPS unit 212 may include a transceiver operable to provide information regarding the position of the ADV. The IMU unit 213 may sense position and orientation changes of the ADV based on inertial acceleration. The radar unit 214 may represent a system that utilizes radio signals to sense objects within the local environment of the ADV. In some embodiments, in addition to sensing objects, the radar unit 214 may additionally sense the speed and/or heading of the objects. The LIDAR unit 215 may sense objects in the environment in which the ADV is located using lasers. The LIDAR unit 215 may include, among other system components, one or more laser sources, a laser scanner, and one or more detectors. The cameras 211 may include one or more devices to capture images of the environment surrounding the ADV. The cameras 211 may be still cameras and/or video cameras. A camera may be mechanically movable, for example, by mounting the camera on a rotating and/or tilting platform.

The sensor system 115 may also include other sensors, such as sonar sensors, infrared sensors, steering sensors, throttle sensors, brake sensors, and audio sensors (e.g., microphones). The audio sensor may be configured to capture sound from the environment surrounding the ADV. The steering sensor may be configured to sense a steering angle of a steering wheel, wheels, or a combination thereof of the vehicle. The throttle sensor and the brake sensor sense a throttle position and a brake position of the vehicle, respectively. In some cases, the throttle sensor and the brake sensor may be integrated into an integrated throttle/brake sensor.

In one embodiment, the vehicle control system 111 includes, but is not limited to, a steering unit 201, a throttle unit 202 (also referred to as an acceleration unit), and a brake unit 203. The steering unit 201 is used to adjust the direction or heading of the vehicle. The throttle unit 202 is used to control the speed of the motor or engine, and thus the speed and acceleration of the vehicle. The brake unit 203 decelerates the vehicle by providing friction to slow the wheels or tires of the vehicle. Note that the components shown in FIG. 2 may be implemented in hardware, software, or a combination thereof.

Referring back to FIG. 1, the wireless communication system 112 allows communication between ADV 101 and external systems, such as devices, sensors, other vehicles, etc. For example, the wireless communication system 112 may wirelessly communicate with one or more devices directly or via a communication network, such as with servers 103-104 over network 102. The wireless communication system 112 may use any cellular communication network or a wireless local area network (WLAN), e.g., using WiFi, to communicate with another component or system. The wireless communication system 112 may communicate directly with a device (e.g., a mobile device of a passenger, a display device, a speaker within vehicle 101), for example, using an infrared link, Bluetooth, etc. The user interface system 113 may be part of peripheral devices implemented within vehicle 101, including, for example, a keyboard, a touch screen display device, a microphone, and a speaker, etc.

Some or all of the functions of ADV 101 may be controlled or managed by ADS 110, particularly when operating in an autonomous driving mode. The ADS 110 includes the necessary hardware (e.g., processors, memory, storage) and software (e.g., operating systems, planning and routing programs) to receive information from the sensor system 115, the control system 111, the wireless communication system 112, and/or the user interface system 113, process the received information, plan a route or path from an origin to a destination, and then drive the vehicle 101 based on the planning and control information. Alternatively, the ADS 110 may be integrated with the vehicle control system 111.

For example, a user as a passenger may specify a start location and a destination of a trip, e.g., via a user interface. ADS 110 obtains the trip-related data. For example, ADS 110 may obtain location and route information from an MPOI server, which may be part of servers 103-104. The location server provides location services, and the MPOI server provides map services and the POIs of certain locations. Alternatively, such location and MPOI information may be cached locally in a persistent storage device of ADS 110.

ADS 110 may also obtain real-time traffic information from a traffic information system or server (TIS) while ADV 101 travels along the route. Note that servers 103-104 may be operated by a third-party entity. Alternatively, the functionalities of servers 103-104 may be integrated with ADS 110. Based on the real-time traffic information, MPOI information, and location information, as well as real-time local environment data detected or sensed by the sensor system (e.g., obstacles, objects, nearby vehicles), ADS 110 can plan an optimal route and drive vehicle 101, for example, via the control system 111, according to the planned route to reach the specified destination safely and efficiently.

Server 103 may be a data analytics system that performs data analytics services for a variety of clients. In one embodiment, data analytics system 103 includes a data collector 121 and a machine learning engine 122. The data collector 121 collects driving statistics 123 from a variety of vehicles, either ADVs or conventional vehicles driven by human drivers. The driving statistics 123 include information indicating the driving commands issued (e.g., throttle, brake, steering commands) and responses of the vehicles (e.g., speeds, accelerations, decelerations, directions) captured by sensors of the vehicles at different points in time. The driving statistics 123 may further include information describing the driving environments at different points in time, such as routes (including start and destination locations), MPOIs, weather conditions, and road conditions, such as slow traffic on an expressway, heavy congestion, car accidents, road construction, temporary detours, unknown obstacles, etc.

Based on the driving statistics 123, the machine learning engine 122 generates or trains a set of rules, algorithms, and/or predictive models 124 for various purposes, including algorithms that generate collision alerts/alarms for ADVs. The algorithm/model 124 may then be uploaded to the ADV for real-time utilization by the ADV during autonomous driving.

FIGS. 3A and 3B are block diagrams illustrating an example of an ADS for use with an ADV according to one embodiment. System 300 may be implemented as part of ADV 101 of FIG. 1, including, but not limited to, ADS 110, control system 111, and sensor system 115. Referring to FIGS. 3A-3B, ADS 110 includes, but is not limited to, a positioning module 301, a perception module 302, a prediction module 303, a decision module 304, a planning module 305, a control module 306, a routing module 307, and a collision warning module 308.

Some or all of modules 301-308 may be implemented in software, hardware, or a combination thereof. For example, these modules may be installed in persistent storage 352, loaded into memory 351, and executed by one or more processors (not shown). Note that some or all of these modules may be communicatively coupled to or integrated with some of the modules of the vehicle control system 111 of FIG. 2. Some of modules 301-308 may be integrated together as an integrated module.

The positioning module 301 (also referred to as a map and route module) determines the current location of the ADV 300 (e.g., using the GPS unit 212) and manages any data related to the user's trip or route. A user may log in and specify a start location and a destination of a trip, for example, via a user interface. The positioning module 301 communicates with other components of the ADV 300, such as the map and route information 311, to obtain the trip-related data. For example, the positioning module 301 may obtain location and route information from a location server and a map and POI (MPOI) server. The location server provides location services, and the MPOI server provides map services and the POIs of certain locations, which may be cached as part of the map and route information 311. While the ADV 300 is moving along the route, the positioning module 301 may also obtain real-time traffic information from a traffic information system or server.

Based on the sensor data provided by sensor system 115 and the localization information obtained by the positioning module 301, the perception module 302 determines a perception of the surrounding environment. The perception information may represent what an ordinary driver would perceive of the surroundings while driving a vehicle. The perception can include, for example, a lane configuration, traffic light signals, the relative position of another vehicle, pedestrians, buildings, crosswalks, or other traffic-related signs (e.g., no-passing signs, yield signs), etc., for example, in the form of objects. The lane configuration includes information describing one or more lanes, such as the shape of the lane (e.g., straight or curved), the width of the lane, how many lanes there are in the road, one-way or two-way lanes, merging or splitting lanes, exit lanes, and so forth.

The perception module 302 may include a computer vision system or functionality of a computer vision system to process and analyze images captured by one or more cameras in order to identify objects and/or features in the environment of the ADV. The objects may include traffic signals, road boundaries, other vehicles, pedestrians, and/or obstacles, etc. Computer vision systems may use object recognition algorithms, video tracking, and other computer vision techniques. In some embodiments, the computer vision system may map the environment, track objects, estimate the speed of objects, and the like. The perception module 302 may also detect objects based on other sensor data provided by other sensors, such as radar and/or LIDAR.

For each object, the prediction module 303 predicts how the object will behave under the circumstances. The prediction is performed based on the perception data perceiving the driving environment at that point in time, in view of a set of map/route information 311 and traffic rules 312. For example, if the object is a vehicle traveling in the opposite direction and the current driving environment includes an intersection, the prediction module 303 will predict whether the vehicle will likely travel straight or make a turn. If the perception data indicates that the intersection has no traffic light, the prediction module 303 may predict that the vehicle may have to come to a complete stop before entering the intersection. If the perception data indicates that the vehicle is currently in a left-turn-only lane or a right-turn-only lane, the prediction module 303 may predict that the vehicle will more likely turn left or turn right, respectively.

For each object, the decision module 304 makes a decision on how to handle the object. For example, for a particular object (e.g., another vehicle in the intersection) and its metadata describing the object (e.g., speed, direction, turning angle), the decision module 304 decides how to encounter the object (e.g., overtake, yield, stop, pass). The decision module 304 may make such decisions according to a set of rules, such as traffic rules or driving rules 312, which may be stored in persistent storage 352.

The routing module 307 is configured to provide one or more routes or paths from a start location to a destination location. For a given trip from a start location to a destination location, for example, received from a user, the routing module 307 obtains route and map information 311 and determines all possible routes or paths from the start location to the destination location. The routing module 307 may generate a reference line in the form of a topographic map for each of the routes it determines from the start location to reach the destination location. A reference line refers to an ideal route or path without any interference from others, such as other vehicles, obstacles, or traffic conditions. That is, if there is no other vehicle, pedestrian, or obstacle on the road, the ADV should exactly or closely follow the reference line. The topographic maps are then provided to the decision module 304 and/or the planning module 305. The decision module 304 and/or the planning module 305 examines all of the possible routes to select and modify one of the best routes in view of other data provided by other modules, such as traffic conditions from the positioning module 301, the driving environment perceived by the perception module 302, and traffic conditions predicted by the prediction module 303. The actual path or route used to control the ADV may be close to or different from the reference line provided by the routing module 307, depending upon the specific driving environment at the point in time.

Based on the decision for each of the perceived objects, the planning module 305 plans a path or route for the ADV, as well as driving parameters (e.g., distance, speed, and/or turning angle), using the reference line provided by the routing module 307 as a basis. That is, for a given object, the decision module 304 decides what to do with the object, while the planning module 305 determines how to do it. For example, for a given object, the decision module 304 may decide to pass the object, while the planning module 305 may determine whether to pass on the left side or the right side of the object. Planning and control data is generated by the planning module 305, including information describing how the vehicle 300 would move in the next moving cycle (e.g., the next route/path segment). For example, the planning and control data may instruct the vehicle 300 to move 10 meters at a speed of 30 miles per hour (mph), then change to the right lane at a speed of 25 mph.
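The planning and control data described above can be pictured as a short sequence of path segments. The sketch below is illustrative only; the `PathSegment` type and its field names are assumptions for this example, not structures defined in the disclosure.

```python
from dataclasses import dataclass

@dataclass
class PathSegment:
    """One segment of planning and control data (illustrative fields)."""
    distance_m: float  # how far the vehicle should travel in this segment
    speed_mph: float   # target speed while traversing the segment
    lane_change: str   # "none", "left", or "right"

# The example from the text: move 10 meters at 30 mph,
# then change to the right lane at 25 mph.
plan = [
    PathSegment(distance_m=10.0, speed_mph=30.0, lane_change="none"),
    PathSegment(distance_m=0.0, speed_mph=25.0, lane_change="right"),
]
```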

Based on the planning and control data, the control module 306 controls and drives the ADV by sending appropriate commands or signals to the vehicle control system 111 according to the route or path defined by the planning and control data. The planning and control data includes sufficient information to drive the vehicle from a first point to a second point of the route or path using appropriate vehicle settings or driving parameters (e.g., throttle, brake, steering commands) at different points in time along the route or path.

In one embodiment, the planning phase is performed in multiple planning cycles (also referred to as command cycles), for example in each 100 millisecond (ms) interval. For each planning or command cycle, one or more control commands will be issued based on the planning and control data. That is, for every 100 milliseconds, the planning module 305 plans the next route segment or path segment, including, for example, the target location and the time required for the ADV to reach the target location. Alternatively, the planning module 305 may further specify a particular speed, direction, and/or steering angle, etc. In one embodiment, the planning module 305 plans a route segment or path segment for the next predetermined time period (e.g., 5 seconds). For each planning cycle, the planning module 305 plans the target location for the current cycle (e.g., the next 5 seconds) based on the target locations planned in the previous cycle. The control module 306 then generates one or more control commands (e.g., throttle, brake, steering control commands) based on the current cycle of planning and control data.
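Under the timing described above (one 100 ms command cycle, with each cycle planning roughly the next 5 seconds), a horizon spans many planning cycles. A minimal sketch, using only the two constants stated in the text (the function name is illustrative):

```python
PLANNING_CYCLE_S = 0.1    # one planning/command cycle every 100 ms
PLANNING_HORIZON_S = 5.0  # each cycle plans roughly the next 5 seconds

def cycles_per_horizon() -> int:
    """Number of planning cycles that fit inside one planning horizon."""
    return round(PLANNING_HORIZON_S / PLANNING_CYCLE_S)
```

So each 5-second horizon is replanned about 50 times before it elapses, which is why a TTC trend across consecutive cycles (used below) is available well before any single horizon runs out.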

Note that the decision module 304 and the planning module 305 may be integrated as an integrated module. The decision module 304/planning module 305 may include a navigation system or functionalities of a navigation system to determine a driving path for the ADV. For example, the navigation system may determine a series of speeds and directional headings to effect movement of the ADV along a path that substantially avoids perceived obstacles while the ADV is driven along a roadway-based path leading to a final destination. The destination may be set according to user inputs via the user interface system 113. The navigation system may update the driving path dynamically while the ADV is in operation. The navigation system may incorporate data from the GPS system and one or more maps to determine the driving path for the ADV.

The decision module 304/planning module 305 may further include a collision avoidance system or functionalities of a collision avoidance system to identify, evaluate, and avoid or otherwise negotiate potential obstacles in the environment of the ADV. For example, the collision avoidance system may effect changes in the navigation of the ADV by operating one or more subsystems in the control system 111 to undertake swerving maneuvers, turning maneuvers, braking maneuvers, etc. The collision avoidance system may automatically determine feasible obstacle avoidance maneuvers on the basis of surrounding traffic patterns, road conditions, etc. The collision avoidance system may be configured such that a swerving maneuver is not undertaken when other sensor systems detect vehicles, construction barriers, etc. in the region adjacent to the ADV that would be swerved into. The collision avoidance system may automatically select the maneuver that is both available and maximizes the safety of the occupants of the ADV. The collision avoidance system may select an avoidance maneuver predicted to cause the least amount of acceleration in the passenger cabin of the ADV.

FIG. 4 is a block diagram illustrating an example of a collision warning module according to one embodiment. The collision warning module 308 may generate a collision warning/alert for the ADV to alert a safety operator of a potential collision. Referring to FIG. 4, the collision warning module 308 may include sub-modules such as an obstacle sensor 401, a collision determiner 403, a collision time calculator 405, a collision time threshold determiner 407, a count determiner 409, and an alert signal generator/transmitter 411. The obstacle sensor 401 may sense the environment of the ADV, including one or more obstacles within the field of view of the ADV. The collision determiner 403 may determine, based on the planned trajectory of the ADV, whether the ADV is on a collision course with any of the one or more obstacles. The collision time calculator 405 may calculate a collision distance from the ADV to any of the one or more obstacles. The collision distance may be converted into a time to collision (TTC), a time-based value, based on the current speed and heading of the ADV and/or the predicted speed and heading of any of the one or more obstacles predicted to collide with the ADV. The TTC value is the time remaining before a collision between the vehicle and the obstacle occurs if both maintain their current path and speed. The collision time threshold determiner 407 may determine a threshold TTC below which a warning alarm is generated. In one embodiment, the TTC threshold is approximately two seconds. The count determiner 409 may determine a count threshold, i.e., the number of consecutive planning cycles over which the TTC must continuously decrease before an alert signal is generated. In one embodiment, the preset count is about 5. The alert signal generator/transmitter 411 may generate and/or transmit an alert to a user interface of the ADV to alert the safety operator operating the ADV.
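The distance-to-TTC conversion performed by the collision time calculator 405 can be sketched as follows. This is a minimal sketch assuming a constant closing speed along the collision path; the function signature is illustrative, not taken from the disclosure.

```python
def time_to_collision(gap_m: float, ego_speed_mps: float,
                      obstacle_speed_mps: float = 0.0) -> float:
    """Convert a collision distance into a TTC value, assuming the ADV and
    the obstacle both hold their current speed along the collision path."""
    closing_mps = ego_speed_mps - obstacle_speed_mps
    if closing_mps <= 0.0:
        # The gap is not shrinking, so no collision occurs at current speeds.
        return float("inf")
    return gap_m / closing_mps
```

A stationary obstacle 30 m ahead of an ADV moving at 10 m/s yields a TTC of 3 seconds; a lead vehicle moving at the same speed yields an infinite TTC.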
Note that although the collision warning module 308 is illustrated as a separate module, the collision warning module 308 and the planning module 305 may be integrated modules.

Figure 5 illustrates an example of an ADV in a potential collision, according to one embodiment. Referring to fig. 5, in scene 500, ADV 101 has a currently planned trajectory 501. Trajectory 501 includes the speed configuration and heading of ADV 101 at any point along the path of trajectory 501. ADV 101 may use a sensor to sense the distance to obstacle 510 in the collision path of ADV 101. In one embodiment, ADV 101 determines a time to collision based on the distance to obstacle 510 and the current speed of the ADV. The time to collision is compared to a predetermined threshold (e.g., about 2 seconds) to determine whether to trigger an alarm alert. For example, ADV 101 may travel at a speed of 10 m/s, 30 meters from obstacle 510. Thus, the TTC of ADV 101 may be calculated as 30 m / (10 m/s) = 3 seconds. With a 2-second threshold, the 3-second TTC of ADV 101 does not breach the threshold and no alarm condition is triggered. Thus, ADV 101 will not issue an alarm alert.

In another embodiment, ADV 101 may calculate TTC threshold region 511, i.e., the region within which ADV 101 would be inside the TTC threshold. Here, the region may be determined as 2 seconds × 10 m/s = 20 meters. Thus, if ADV 101 enters threshold region 511, i.e., comes within 20 meters of obstacle 510 while traveling at 10 m/s, ADV 101 will trigger an alarm alert to warn the safety operator of a potential collision. In another embodiment, when ADV 101 enters threshold region 511, ADV 101 will automatically apply hard braking (approximately 3 m/s²).
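The threshold-region computation above (TTC threshold multiplied by current speed) can be expressed as a small sketch; the function names and the braking constant are illustrative assumptions based on the approximate values given in the text:

```python
HARD_BRAKE_MPS2 = 3.0  # approximate hard-braking deceleration from the text


def threshold_zone_m(ttc_threshold_s: float, speed_mps: float) -> float:
    """Distance within which the ADV is inside the TTC threshold region."""
    return ttc_threshold_s * speed_mps


def in_threshold_zone(distance_m: float, ttc_threshold_s: float,
                      speed_mps: float) -> bool:
    """True if the current gap already lies inside the threshold region."""
    return distance_m <= threshold_zone_m(ttc_threshold_s, speed_mps)


print(threshold_zone_m(2.0, 10.0))         # 20.0 meters
print(in_threshold_zone(30.0, 2.0, 10.0))  # False: 30 m gap > 20 m region
```

Note that the region scales with speed: at 20 m/s the same 2-second threshold corresponds to a 40-meter region.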

Figure 6 illustrates a time diagram of an ADV in another potential collision, according to one embodiment. Scenario 600 shows the time span of five planning cycles (0-4) of ADV 101. In this scenario, ADV 101 has a currently planned trajectory 601 that follows vehicle 610 (or obstacle 610). Based on the sensors of ADV 101, ADV 101 may determine that it is within x seconds TTC of obstacle 610 during planning cycle 0. In the next planning cycle, the ADV may determine a subsequent TTC to obstacle 610. In one embodiment, if ADV 101 determines that the TTC is continuously decreasing for a predetermined number of planning cycles (a count threshold), e.g., the TTC of each planning cycle is less than that of the immediately preceding planning cycle, ADV 101 triggers an alert alarm to the safety operator of ADV 101. For example, if the TTCs over 5 planning cycles are (x, x-0.5, x-1.0, x-1.5, x-2.0) and the predetermined threshold for the number of planning cycles is 5, an alarm/warning signal will be triggered.

Note that in this case, each of the 5 consecutive planning cycles has a smaller TTC than its immediately preceding planning cycle, triggering an alarm signal. In another embodiment, when ADV 101 determines that a count threshold of successively lower TTCs (e.g., 5) has been reached, ADV 101 will automatically apply light braking (approximately 1 m/s²) to slow ADV 101.

In another embodiment, a counter is used to count the number of planning cycles over which the TTC successively decreases, and an alarm is triggered if the counter reaches a predetermined number of planning cycles (e.g., a count threshold of 5). In some embodiments, the count threshold and the TTC threshold may be combined so that either triggering condition (as shown in figs. 5-6) may be applied to trigger an alarm alert. In this case, an alarm may be triggered if ADV 101 is within the TTC threshold (as shown in fig. 5), or if the TTC of ADV 101 has decreased for the count threshold number of consecutive planning cycles. Although a count threshold of 5 and a TTC threshold of 2 seconds are used, any values may be used.
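One possible reading of the combined trigger (TTC below the threshold, or a count-threshold run of consecutively decreasing TTCs) is the following stateful sketch. The class name and the exact counting convention are assumptions; the text leaves the precise counting scheme open:

```python
class CollisionAlertChecker:
    """Tracks TTC across planning cycles and decides when to raise an alert."""

    def __init__(self, ttc_threshold_s: float = 2.0, count_threshold: int = 5):
        self.ttc_threshold_s = ttc_threshold_s
        self.count_threshold = count_threshold
        self.prev_ttc = None
        self.decrease_count = 0

    def update(self, ttc_s: float) -> bool:
        """Feed the TTC of one planning cycle; return True to raise an alert."""
        if self.prev_ttc is not None and ttc_s < self.prev_ttc:
            self.decrease_count += 1  # TTC shrank versus the previous cycle
        else:
            self.decrease_count = 0   # the run of decreases is broken
        self.prev_ttc = ttc_s
        # Either condition triggers the alarm.
        return (ttc_s < self.ttc_threshold_s
                or self.decrease_count >= self.count_threshold)
```

For example, feeding the TTC sequence 5.0, 4.5, 4.0, 3.5, 3.0, 2.5 raises an alert on the sixth cycle after five consecutive decreases, even though every TTC is still above the 2-second threshold; conversely, a single TTC of 1.5 seconds raises an alert immediately.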

Figure 7 is a block diagram illustrating an operator interface of an ADV according to one embodiment. Referring to fig. 7, an operator interface 700 of ADV 101 includes a drive wheel 701, one or more display devices 702 and 703, and a speaker device 704. Here, operator interface 700 is within the field of view of the safety operator of ADV 101. In one embodiment, if ADV 101 triggers an alert signal (which may be generated by collision alert module 308 as part of ADS 110 of fig. 3A), collision alert module 308 may send the alert signal to operator interface 700 of ADV 101 via a Controller Area Network (CAN) bus of ADV 101. In another embodiment, the alert signal may be displayed on an output device coupled to the ADS 110 of FIG. 3A. In one embodiment, the alert signal may be displayed as an alert symbol, an alert color indication, or an alert statement on any of the display devices 702 and 703, and/or an audible alert may be triggered by the speaker device 704. Although an operator interface is illustrated, the alert signal may alert the safety operator through other interfaces within the safety operator's field of view of ADV 101, such as tactile feedback, e.g., vibration feedback through drive wheel 701, etc.

Figure 8 is a flow diagram illustrating a method performed by an ADV according to one embodiment. Process 800 may be performed by processing logic that may comprise software, hardware, or a combination thereof. For example, the process 800 may be performed by the collision warning module 308 of FIG. 4. Referring to fig. 8, at block 801, processing logic perceives an environment of an autonomously driven vehicle (ADV), the environment including one or more obstacles. At block 802, processing logic determines whether the ADV will potentially collide with one or more obstacles based on the planned trajectory. At block 803, if it is determined that the ADV will potentially collide, processing logic determines a time to collision based on the planned trajectory and the one or more obstacles. At block 804, if the determined time to collision is less than the threshold or the time to collision decreases for a predetermined number of consecutive planning cycles, processing logic generates an alert signal to alert an operator of the ADV. At block 805, processing logic sends the alert signal to a user interface of the ADV to alert the operator of the potential collision.
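Blocks 801-805 can be condensed into a single per-cycle function. The following is a reduced, self-contained sketch under simplifying assumptions (the obstacle is abstracted to a distance, the operator interface to a returned flag); it is not the actual ADS implementation:

```python
def collision_warning_step(distance_m, speed_mps, ttc_history,
                           ttc_threshold_s=2.0, count_threshold=5):
    """One planning cycle of the flow in blocks 801-805, reduced to scalars.

    Returns (alert, updated_history); `alert` stands in for the signal sent
    to the operator interface at block 805.
    """
    # Blocks 801-803: perception and TTC determination, reduced to arithmetic.
    ttc = distance_m / speed_mps if speed_mps > 0 else float("inf")
    history = (ttc_history + [ttc])[-count_threshold:]
    # Block 804: alert if TTC is below the threshold, or if TTC has decreased
    # across a full window of `count_threshold` consecutive cycles.
    decreasing = (len(history) == count_threshold and
                  all(b < a for a, b in zip(history, history[1:])))
    alert = ttc < ttc_threshold_s or decreasing
    return alert, history
```

A single safe cycle (30 m gap at 10 m/s, TTC of 3 s) returns no alert, while five successive cycles with a steadily shrinking gap return an alert on the fifth cycle even though the TTC never drops below 2 seconds.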

In one embodiment, the alert signal is sent over a Controller Area Network (CAN) bus to a user interface of the ADV to alert the operator. In one embodiment, the threshold is about 2 seconds and the predetermined number of consecutive planning cycles is about 5.

In one embodiment, the processing logic further displays the alert signal on a display device of the ADV or sounds an alarm through a speaker of the ADV. In one embodiment, if the time to collision is determined to be less than the threshold, the ADV applies hard braking.

In another embodiment, if it is determined that the time to collision is greater than the threshold, but the time to collision is decreasing for a predetermined number of consecutive planning cycles, the ADV applies light braking. In another embodiment, light braking is about 1 m/s² and hard braking is about 3 m/s².

Note that some or all of the components shown and described above may be implemented in software, hardware, or a combination thereof. For example, these components may be implemented as software installed and stored in a persistent storage device, which may be loaded and executed by a processor (not shown) in memory to perform the processes or operations described throughout this application. Alternatively, these components may be implemented as executable code programmed or embedded into special-purpose hardware, such as an integrated circuit (e.g., an application specific IC or ASIC), a Digital Signal Processor (DSP) or a Field Programmable Gate Array (FPGA), which is accessible via corresponding drivers and/or operating systems from applications. Further, these components may be implemented as specific hardware logic within a processor or processor core as part of an instruction set accessible via one or more specific instruction software components.

Some portions of the preceding detailed description have been presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, considered to be a self-consistent sequence of operations leading to a desired result. The operations are those requiring physical manipulations of physical quantities.

It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the above discussion, it is appreciated that throughout the description, discussions utilizing terms such as those set forth in the appended claims refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.

Embodiments of the present disclosure also relate to apparatuses for performing the operations herein. Such an apparatus may be implemented by a computer program stored in a non-transitory computer readable medium. A machine-readable medium includes any mechanism for storing information in a form readable by a machine (e.g., a computer). For example, a machine-readable (e.g., computer-readable) medium includes a machine (e.g., computer) readable storage medium (e.g., read only memory ("ROM"), random access memory ("RAM"), magnetic disk storage media, optical storage media, flash memory devices).

The processes or methods described in the foregoing figures may be performed by processing logic that comprises hardware (e.g., circuitry, dedicated logic, etc.), software (e.g., embodied on a non-transitory computer readable medium), or a combination of both. Although the processes or methods are described above in terms of some sequential operations, it should be understood that some of the operations described may be performed in a different order. Further, some operations may be performed in parallel rather than sequentially.

Embodiments of the present disclosure are not described with reference to any particular programming language. It will be appreciated that various programming languages may be used to implement the teachings of the embodiments of the disclosure as described herein.

In the foregoing specification, embodiments of the disclosure have been described with reference to specific exemplary embodiments thereof. It will be evident that various modifications may be made thereto without departing from the broader spirit and scope of the disclosure as set forth in the following claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
