System and method for controlling a vehicle based on expected lane departure

Publication date: 2019-12-31

This technology, "System and method for controlling a vehicle based on expected lane departure," was created by R·泽瑞哈拉姆, M·沙里亚里, M·R·侯赛因, J·萨契戴夫, and A·塔克马尔 on 2019-05-30. Abstract:

The motor vehicle includes at least one sensor configured to detect lane markers proximate the vehicle and to detect a velocity, an acceleration, and a yaw rate of the vehicle. The vehicle also includes a controller in communication with the at least one sensor and configured to selectively control a steering intervention system in a first mode and a second mode. The controller is configured to calculate a plurality of lane departure estimates at a respective plurality of time instances, arbitrate among the plurality of lane departure estimates to calculate a predicted time to lane departure, calculate a lane departure confidence value associated with the predicted time to lane departure, and control the steering intervention system to be in the second mode in response to the confidence value exceeding a first threshold and the predicted time to lane departure being below a second threshold.

1. A motor vehicle comprising:

at least one sensor for detecting lane markings near the vehicle, detecting a velocity of the vehicle, detecting a yaw rate of the vehicle, and detecting an acceleration of the vehicle; and

a controller in communication with the at least one sensor and configured to selectively control a steering intervention system in a first mode and a second mode, the controller further configured to calculate a plurality of lane departure estimates at a respective plurality of time instances, arbitrate among the plurality of lane departure estimates to calculate a predicted time to lane departure, calculate a lane departure confidence value associated with the predicted time to lane departure, and control the steering intervention system in the second mode in response to the confidence value exceeding a first threshold and the predicted time to lane departure being below a second threshold.

2. The motor vehicle of claim 1, wherein the controller is further configured to calculate an initial time to lane departure parameter based on a kinematic model, and calculate a predicted time to lane departure and a lane departure confidence value by filtering the initial time to lane departure parameter.

3. The motor vehicle of claim 2, wherein the controller is further configured to filter the initial time to lane departure parameter using an estimation algorithm.

4. A motor vehicle in accordance with claim 3, wherein the estimation algorithm comprises an unscented Kalman filter.

5. A motor vehicle in accordance with claim 2, wherein said kinematic model is based on a measured velocity of the vehicle obtained from the at least one sensor, a measured acceleration of the vehicle, a measured yaw rate of the vehicle, a detected lane marker position relative to the vehicle, a detected lane marker heading relative to the vehicle, and a detected lane curvature.

6. A motor vehicle in accordance with claim 1, wherein said steering intervention system comprises an audible, visual, or tactile operator notification system, and wherein in said first mode said steering intervention system provides no notification and in said second mode said steering intervention system provides a notification.

7. A motor vehicle according to claim 1, wherein the steering intervention system comprises at least one actuator configured to control vehicle steering, and wherein in the first mode the steering intervention system does not control the actuator to provide steering torque and in the second mode the steering intervention system controls the actuator to provide steering torque.

8. The motor vehicle of claim 1, wherein the at least one sensor comprises an optical camera, a LiDAR system, or a RADAR system.

Technical Field

The present disclosure relates to vehicles having a steering intervention system configured to automatically provide intervention to avoid or prevent unintended lane departure.

Introduction

A vehicle control system may comprise an arrangement of a path tracking control system, a lane boundary keeping control system, a steering torque assist control system, and a steering angle assist control system. Such travel control systems rely on various sensors, controllers, and actuators, and may include the use of a visual lane detection system.

Disclosure of Invention

A motor vehicle according to the present disclosure includes at least one sensor and a controller. The at least one sensor is configured to detect lane markings near the vehicle, detect a velocity of the vehicle, detect a yaw rate of the vehicle, and detect an acceleration of the vehicle. The controller is in communication with the at least one sensor and is configured to selectively control a steering intervention system in a first mode and a second mode. The controller is further configured to calculate a plurality of lane departure estimates at a respective plurality of time instances, arbitrate among the plurality of lane departure estimates to calculate a predicted time to lane departure, calculate a lane departure confidence value associated with the predicted time to lane departure, and control the steering intervention system to be in the second mode in response to the confidence value exceeding a first threshold and the predicted time to lane departure being below a second threshold.

In an exemplary embodiment, the controller is further configured to calculate an initial time to lane departure parameter based on a kinematic model, and to calculate a predicted time to lane departure and a lane departure confidence value by filtering the initial time to lane departure parameter. In such embodiments, the controller may be further configured to filter the initial time to lane departure parameter using an estimation algorithm (e.g., an unscented Kalman filter). In such embodiments, the kinematic model may be based on a measured velocity of the vehicle obtained from the at least one sensor, a measured acceleration of the vehicle, a measured yaw rate of the vehicle, a detected lane marker position relative to the vehicle, a detected lane marker heading relative to the vehicle, and a detected lane curvature.

In an exemplary embodiment, the steering intervention system includes an audible, visual, or tactile operator notification system. In the first mode, the steering intervention system provides no notification, and in the second mode, the steering intervention system provides a notification.

In an exemplary embodiment, the steering intervention system comprises at least one actuator configured to control the steering of the vehicle. In the first mode, the steering intervention system does not control the actuator to provide the steering torque, and in the second mode, the steering intervention system controls the actuator to provide the steering torque.

In an exemplary embodiment, the at least one sensor comprises an optical camera, a LiDAR system, or a RADAR system.

A method of controlling a host motor vehicle according to the present disclosure includes providing the host vehicle with at least one sensor in communication with at least one controller, and a steering intervention system. The method also includes obtaining, from the at least one sensor, a measured velocity of the host vehicle, a measured acceleration of the host vehicle, a measured yaw rate of the host vehicle, a detected lane marker position relative to the host vehicle, a detected lane marker heading relative to the host vehicle, and a detected lane curvature. The method also includes calculating, via the at least one controller, an initial time to lane crossing parameter from a kinematic model based on the measured velocity, the measured acceleration, the measured yaw rate, the lane marker position, the lane marker heading, and the lane curvature. The method also includes filtering, via the at least one controller, the initial time to lane crossing parameter to obtain a final time to lane crossing value and a confidence parameter associated with the final time to lane crossing value. The method further includes automatically controlling, via the at least one controller, the steering intervention system in a steering intervention mode in response to the final time to lane crossing being below a first threshold and the confidence parameter exceeding a second threshold.

In an exemplary embodiment, filtering includes applying an unscented Kalman filter.

In an exemplary embodiment, the steering intervention system comprises an audible, visual or tactile operator notification system, and wherein controlling the steering intervention system in the steering intervention mode comprises controlling the steering intervention system to provide the notification.

In an exemplary embodiment, the steering intervention system comprises at least one actuator configured to control the vehicle steering, and wherein controlling the steering intervention system in the steering intervention mode comprises controlling the steering intervention system to provide a corrective steering torque.

In an exemplary embodiment, filtering includes modifying one or more unreasonable time to lane crossing calculations.

In an exemplary embodiment, the method further comprises fusing, via the at least one controller, the initial time to lane crossing parameter with vehicle kinematics information, vehicle dynamics information, vehicle state information, and host vehicle lane information.

Embodiments in accordance with the present disclosure provide a number of advantages. For example, the present disclosure provides a system and method for accurate and timely intervention based on an expected departure from a current driving lane.

The above and other advantages and features of the present disclosure will become apparent from the following detailed description of the preferred embodiments, which is to be read in connection with the accompanying drawings.

Drawings

FIG. 1 is a schematic illustration of a vehicle according to an embodiment of the present disclosure;

FIG. 2 is a logic diagram of a method of calculating a lane departure estimate for a vehicle in accordance with an embodiment of the present disclosure;

FIG. 3 is a logic diagram of a system for controlling a vehicle according to a first embodiment of the present disclosure; and

FIG. 4 is a logic diagram of a system for controlling a vehicle according to a second embodiment of the present disclosure.

Detailed Description

Embodiments of the present disclosure are described herein. However, it is to be understood that the disclosed embodiments are merely examples and that other embodiments may take various and alternative forms. The figures are not necessarily to scale; certain features may be exaggerated or minimized to show details of particular components. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as representative. Various features shown and described with reference to any one of the figures may be combined with features shown in one or more other figures to produce embodiments not explicitly shown or described. The combinations of features shown provide representative embodiments for typical applications. However, various combinations and modifications of the features consistent with the teachings of the present disclosure may be desired for particular applications or implementations.

Referring now to FIG. 1, a system 10 for controlling a vehicle according to the present disclosure is shown in schematic diagram form. The system 10 includes a motor vehicle 12. The motor vehicle 12 includes a propulsion system 14, and the propulsion system 14 may include, in various embodiments, an internal combustion engine, an electric machine such as a traction motor, and/or a fuel cell propulsion system. The motor vehicle 12 additionally includes a steering system 16. Although depicted as including a steering wheel for purposes of illustration, in some embodiments within the scope of the present disclosure, steering system 16 may omit a steering wheel. The motor vehicle 12 additionally includes a plurality of wheels 18 and associated wheel brakes 20 configured to provide braking torque to the wheels 18. In various embodiments, the wheel brakes 20 may include friction brakes, regenerative braking systems, such as electric motors, and/or other suitable braking systems.

The propulsion system 14, steering system 16, and wheel brakes 20 are in communication with or under the control of at least one controller 22. Although depicted as a single unit for purposes of illustration, the controller 22 may additionally include one or more other controllers, collectively referred to as a "controller". The controller 22 may include a microprocessor or Central Processing Unit (CPU) in communication with various types of computer-readable storage devices or media. The computer-readable storage devices or media may include volatile and non-volatile memory such as Read Only Memory (ROM), Random Access Memory (RAM), and Keep Alive Memory (KAM). KAM is a persistent or non-volatile memory that can be used to store various operating variables while the CPU is powered down. The computer-readable storage devices or media may be implemented using any of a number of known memory devices, such as PROMs (programmable read-only memory), EPROMs (erasable PROMs), EEPROMs (electrically erasable PROMs), flash memory, or any other electric, magnetic, optical, or combination memory devices capable of storing data, some of which represent executable instructions used by the controller 22 to control the vehicle.

The controller 22 is in communication with a plurality of sensors 24. In the exemplary embodiment, the sensors 24 include one or more sensors, such as RADAR, LiDAR, optical cameras, thermal imagers, and ultrasonic sensors, configured to capture information about traffic lanes near the vehicle 12. In addition, the sensors 24 include one or more sensors configured to detect the velocity, acceleration, and yaw rate of the vehicle 12. Such sensors may include one or more inertial measurement units. The sensors 24 may also include additional sensors or any combination of the above, as appropriate.

The controller 22 is provided with a lane departure algorithm 26, as will be discussed in further detail below. The lane departure algorithm 26 is configured to calculate an expected time until the vehicle 12 departs from the current driving lane. The controller 22 is in communication with an intervention system 28, which is configured to perform an assistive, corrective, or other automated action based on the expected lane departure.

In a first exemplary embodiment, the intervention system 28 includes a Human Machine Interface (HMI) element configured to generate a notification to a vehicle occupant, such as an audio notification, a visual notification, a haptic notification, or any other suitable notification. In such embodiments, the controller 22 may be configured to control the intervention system 28 to generate the notification in response to the lane departure condition calculated by the lane departure algorithm 26 being satisfied. Such embodiments may be referred to as lane departure warning systems.

In a second exemplary embodiment, the intervention system 28 includes an actuator configured to selectively apply steering torque to the steering system 16. In such embodiments, the controller 22 may be configured to control the intervention system 28 to apply a corrective steering torque to move the vehicle 12 away from the lane markings in response to the lane departure condition calculated by the lane departure algorithm 26 being satisfied. Such embodiments may be referred to as lane keeping systems.

In a third exemplary embodiment, the controller 22 is provided with an Automatic Driving System (ADS) for automatically controlling the propulsion system 14, steering system 16, and wheel brakes 20 to control vehicle acceleration, steering, and braking, respectively, without human intervention. In such embodiments, the lane departure algorithm may be incorporated into the ADS. In such embodiments, the intervention system 28 includes an actuator configured to selectively apply a steering torque to the steering system 16, and the ADS is configured to control the intervention system 28 in response to inputs from the plurality of sensors 24.

Known configurations for lane departure algorithms may involve detecting upcoming road geometries, comparing the detected geometries to a database containing a plurality of predefined road geometries having associated lane departure equations, arbitrating among the plurality of predefined road geometries, and calculating a time to lane departure based on the resulting lane departure equations. Such a configuration may be computationally intensive and may yield noisy estimates.

Embodiments according to the present disclosure are configured to calculate lane departure based on a high-fidelity kinematic model. In an exemplary embodiment, the kinematic model may be described in a vehicle-centered coordinate system as follows:

[kinematic model equations not reproduced in the source]

where the x-axis is the longitudinal (fore-aft) axis of the vehicle, the y-axis is the lateral (side-to-side) axis of the vehicle, $a$ refers to the vehicle acceleration, $V$ refers to the vehicle velocity, and the remaining symbol refers to the yaw rate of the vehicle.

Assuming that the velocity and yaw rate remain constant, the vehicle position can then be calculated as:

[position equations $x_{veh}(t)$ and $y_{veh}(t)$ not reproduced in the source]

The lane estimate reported by the camera can be represented in the vehicle-centered coordinate system as:

$y_{Lane} = C_0 + C_1 l + C_2 l^2 + C_3 l^3$

where $l$ is the look-ahead distance. Substituting $x_{veh}(t)$ for $l$ above, we obtain:

$y_{Lane}(t) = C_0 + C_1 x_{veh}(t) + C_2 x_{veh}(t)^2 + C_3 x_{veh}(t)^3$

where $C_0$, $C_1$, $C_2$, and $C_3$ are the third-order polynomial coefficients mapped to the detected lane marker.

The distance to lane crossing (DLC) can then be defined as:

$\Delta r_{veh}(t) = y_{veh}(t) - y_{Lane}(t)$

Substituting the expressions for $y_{veh}(t)$ and $y_{Lane}(t)$ above, and considering the second-order Taylor expansion of the resulting equation around $t = 0$, a second-order approximation of the time to lane crossing (TTLC) based on the kinematic model can be expressed as:

[TTLC expression not reproduced in the source]
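The TTLC expression itself is not reproduced in the source; the following is a minimal sketch of what a second-order approximation of this kind typically looks like, introducing $d_0$, $d_1$, and $d_2$ for the value and first two time derivatives of $\Delta r_{veh}$ at $t = 0$ (these symbols are supplied here for illustration and do not appear in the source):

```latex
% Second-order expansion of the distance to lane crossing about t = 0:
%   d_0 = \Delta r_{veh}(0), \quad d_1 = \dot{\Delta r}_{veh}(0), \quad d_2 = \ddot{\Delta r}_{veh}(0)
\Delta r_{veh}(t) \approx d_0 + d_1 t + \tfrac{1}{2} d_2 t^2

% TTLC is then taken as the smallest positive root of the quadratic:
\tfrac{1}{2} d_2 \,\mathrm{TTLC}^2 + d_1 \,\mathrm{TTLC} + d_0 = 0
\quad\Longrightarrow\quad
\mathrm{TTLC} = \min\Bigl\{\, t > 0 \;:\; t = \frac{-d_1 \pm \sqrt{d_1^2 - 2\, d_2\, d_0}}{d_2} \,\Bigr\}
```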

A predictive model may then be defined based on the approximate TTLC from the kinematic model. In the exemplary embodiment below, the predictive model assumes linear propagation, or integration, of the TTLC between successive time steps. In the predictive model, $v_x$ denotes the velocity of the host vehicle, $a_x$ denotes the acceleration of the host vehicle, $C_0$ denotes the relative distance of the host vehicle from the relevant lane marking, $C_1$ denotes the heading of the lane relative to the host vehicle, and $C_2$ denotes the curvature of the lane relative to the host vehicle.

[predictive (state propagation) model equations not reproduced in the source]

The measurement model can then be expressed as:

$\mathrm{TTLC}_t^{meas} = \mathrm{TTLC}_t + \epsilon_{ttlc}$

where the measured (kinematic) TTLC equals the TTLC state plus a measurement noise term $\epsilon_{ttlc}$; the remaining definitions are not reproduced in the source.

More accurate and timely intervention may be achieved using such kinematic models, as will also be discussed in further detail below in conjunction with fig. 2-4.
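As an illustration of how a kinematic TTLC measurement of this kind could be computed, the sketch below propagates the vehicle position under a constant yaw rate and linearly varying speed, evaluates the camera lane polynomial at the propagated longitudinal position, and reports the first predicted crossing of the lane marker. All function and variable names, the propagation scheme, and the example values are assumptions for illustration only, not taken from the source.

```python
import numpy as np

def kinematic_ttlc(vx, ax, yaw_rate, c0, c1, c2, c3, horizon=5.0, dt=0.02):
    """Approximate time to lane crossing (TTLC) from a simple kinematic model.

    vx, ax   : host longitudinal velocity [m/s] and acceleration [m/s^2]
    yaw_rate : host yaw rate [rad/s]
    c0..c3   : lane-marker polynomial coefficients in vehicle coordinates,
               y_lane = c0 + c1*x + c2*x^2 + c3*x^3
    Returns the first time within `horizon` at which the predicted vehicle
    path reaches the lane marker, or np.inf if no crossing is predicted.
    """
    t = np.arange(0.0, horizon, dt)
    v = vx + ax * t                      # speed assumed to vary linearly
    psi = yaw_rate * t                   # heading change under constant yaw rate
    # Forward-Euler propagation of the vehicle position in vehicle coordinates
    x_veh = np.cumsum(v * np.cos(psi)) * dt
    y_veh = np.cumsum(v * np.sin(psi)) * dt
    # Lane-marker lateral position evaluated at the propagated longitudinal position
    y_lane = c0 + c1 * x_veh + c2 * x_veh**2 + c3 * x_veh**3
    # Distance to lane crossing; a sign change indicates the marker is reached
    dlc = y_veh - y_lane
    crossing = np.where(np.sign(dlc) != np.sign(dlc[0]))[0]
    return t[crossing[0]] if crossing.size else np.inf

# Example: 25 m/s with a slight drift toward a marker 1.2 m to the right
print(kinematic_ttlc(vx=25.0, ax=0.0, yaw_rate=-0.01,
                     c0=-1.2, c1=0.02, c2=0.0, c3=0.0))
```

In the overall scheme, a value computed in this way would play the role of the raw, model-based TTLC that the filter described next consumes as its measurement.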

Further, the estimated TTLC may be filtered using an unscented Kalman filter (UKF) as follows.

Generate and augment the state sigma points:

[equation not reproduced in the source]

Calculate the sigma points for the next time step using the predictive model:

[equation not reproduced in the source]

Predict the state mean and state covariance:

[equation not reproduced in the source]

Use the measurement model to predict the sigma points in the measurement space:

[equation not reproduced in the source]

The state and covariance matrices are then updated based on the actual measurements:

Cross-correlation matrix: [equation not reproduced in the source]

Kalman gain: [equation not reproduced in the source]

Residual/innovation: $y_{t+1} = z - z_{t+1|t}$

State update: $x_{t+1|t+1} = x_{t+1|t} + K_{t+1|t} \cdot y_{t+1}$

Covariance update: [equation not reproduced in the source]

it can be seen that the above-described pattern predicts TTLC at a subsequent time step based on measurements at the current time step. In a subsequent time step, the prediction is updated, while also the cross-correlation between the prediction models is used to update the covariance. Unexpected TTLC behavior may thus be detected based on changes in other states. By using the covariance at each time step, the confidence parameter calculated by the TTLC at the respective time step is thus obtained.

Referring now to FIG. 2, a system and method of controlling a vehicle according to the present disclosure is shown in logic diagram form. Vehicle motion parameters 40 are obtained, including vehicle speed, acceleration, and yaw rate. The motion parameters 40 may be obtained from one or more sensors (e.g., an accelerometer or IMU associated with the vehicle). The motion parameters 40 are input to a trajectory approximation algorithm 42. The trajectory approximation algorithm 42 includes a vehicle model 44 and imposes vehicle motion constraints or physical constraints 46. The trajectory approximation algorithm 42 outputs vehicle state and trajectory parameters 48 and a predicted vehicle trajectory 50.

Lane criteria 52 are obtained, including the detected lane marker position, lane heading, and lane curvature. The lane criteria 52 may be obtained from one or more sensors (e.g., optical cameras or LiDAR). The lane criteria 52 and the predicted vehicle trajectory 50 are input to a lane crossing calculation 54. The lane crossing calculation 54 includes an adjustment and transformation step 56, a distance to lane crossing determination step 58, and a relative lane-to-vehicle model step 60. The lane crossing calculation 54 outputs adjusted lane information 62 and a distance to lane crossing parameter 64.

The distance to lane crossing parameter 64 is input to a time to lane crossing calculation 66. The time to lane crossing calculation 66 includes an adjustment step 68 and a solver step 70. The time to lane crossing calculation 66 outputs a model-based approximate time to lane crossing 72.

The vehicle state and trajectory parameters 48, the adjusted lane information 62, and the time to lane crossing 72 are input to an estimation and confidence calculation 74, for example as described by the equations above. The estimation and confidence calculation 74 includes a first step 76 for determining enhanced lane states and correlations, a second step 78 for prediction and state propagation, a third step 80 for updating the predictions based on measurements and model probabilities, and a fourth step 82 for checking the estimates for convergence. If the estimates have not converged, the calculation 74 returns to the first step 76. The estimation and confidence calculation 74 outputs a TTLC parameter 84 and an associated confidence factor 86. The confidence factor 86 represents the confidence that the vehicle will cross the lane divider at the time indicated by the TTLC parameter 84.

Thus, the estimation and confidence calculation 74 acts as a supervisory estimator, receiving various information including its own TTLC estimate. By fusing vehicle kinematics and dynamics information, lane information, and vehicle state in this supervisory estimator, untrusted TTLC calculations and erroneous lane departure predictions may be robustly filtered out to provide an accurate and continuous estimate of the TTLC. Advantageously, the estimation and confidence calculation 74 is reconfigurable, e.g., easily modified to accommodate and include other inputs in place of, or in addition to, the vehicle state and trajectory parameters 48, the adjusted lane information 62, and the time to lane crossing 72.

The TTLC parameter 84 and the confidence factor 86 are input to an intervention system 88. In a first exemplary embodiment, the intervention system 88 comprises a driver notification system configured to provide an audible, visual, tactile, or other notification to the driver that a lane crossing is imminent. In a second exemplary embodiment, the intervention system 88 comprises a lane keeping assist system configured to control the vehicle steering system, for example by applying a corrective steering torque via an actuator, to prevent crossing lane markings. In a third exemplary embodiment, the intervention system 88 comprises a lane centering system configured to control the vehicle steering system to maintain a desired lane, for example, according to an autonomous driving system. In other embodiments, other intervention systems may be implemented.

Referring now to FIG. 3, an exemplary embodiment of a lane keeping assist system 100 according to the present disclosure is shown in schematic form. The lane keeping assist system 100 includes a first sensor 102 configured to detect features external to the vehicle. The first sensor 102 is arranged to detect information related to the lane. In various exemplary embodiments, the first sensor 102 comprises an optical camera, a LiDAR system, a RADAR system, other sensors, or a combination thereof. The lane keeping assist system 100 additionally includes a second sensor 104 configured to detect vehicle motion parameters, such as vehicle speed, acceleration, and yaw rate. In an exemplary embodiment, the second sensor 104 includes an accelerometer or IMU. The predictive TTLC algorithm 106, for example as described above, receives lane information from the first sensor 102 and motion parameters from the second sensor 104. The TTLC algorithm 106 outputs the TTLC parameter and the confidence factor as discussed above with reference to FIG. 2. One or more intervention criteria 108 are evaluated to determine whether lane keeping assist intervention is required. If the intervention criteria 108 are met and lane keeping assist intervention is desired, an activation command is passed to the lane keeping control algorithm 110. The lane keeping control algorithm 110 generates a steering command, such as a torque command or a target steering angle command, and sends the steering command to an actuator 112, such as a power steering system actuator.
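As a sketch of the intervention-criteria check described above, the snippet below activates lane keeping control only when the predicted TTLC falls below a time threshold and the confidence factor exceeds a confidence threshold, in line with the claimed second-mode condition. The threshold values, gain, and sign convention are illustrative assumptions, not calibration values from the source.

```python
# Illustrative thresholds; actual calibration values are not given in the source.
TTLC_THRESHOLD_S = 1.5        # intervene if the predicted crossing is sooner than this
CONFIDENCE_THRESHOLD = 0.8    # ...and the TTLC estimate is trusted at least this much

def should_intervene(ttlc: float, confidence: float) -> bool:
    """Second-mode (intervention) condition from the TTLC estimate and its confidence."""
    return confidence > CONFIDENCE_THRESHOLD and ttlc < TTLC_THRESHOLD_S

def lane_keeping_step(ttlc: float, confidence: float, lane_offset_m: float) -> float:
    """Return a corrective steering torque command [Nm]; 0.0 means no intervention."""
    if not should_intervene(ttlc, confidence):
        return 0.0                       # first mode: no steering torque applied
    KP = 2.0                             # illustrative proportional gain
    return -KP * lane_offset_m           # steer away from the marker being approached

print(lane_keeping_step(ttlc=1.2, confidence=0.9, lane_offset_m=-0.3))
```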

Referring now to FIG. 4, an exemplary embodiment of a lane centering control system 120 is shown in schematic form. The lane centering control system 120 includes a first sensor 122 configured to detect features external to the vehicle. The first sensor 122 is arranged to detect information relating to traffic lanes close to the vehicle. In various exemplary embodiments, the first sensor 122 includes an optical camera, a LiDAR system, a RADAR system, other sensors, or a combination thereof. The lane centering control system 120 additionally includes a second sensor 124 configured to detect vehicle motion parameters, such as vehicle speed, acceleration, and yaw rate. In an exemplary embodiment, the second sensor 124 includes an accelerometer or IMU. The lane centering control system 120 additionally includes a map 126 containing information relating to road curvature, for example stored in a non-transitory data store. The predictive TTLC algorithm 128, for example as described above, receives lane information from the first sensor 122, motion parameters from the second sensor 124, and road curvature information from the map 126. The TTLC algorithm 128 outputs a TTLC parameter and a confidence factor as discussed above with respect to FIG. 2. In addition, a mission planner algorithm 130, such as a path planning module of an autonomous driving system, receives lane information from the first sensor 122, motion parameters from the second sensor 124, and road curvature information from the map 126. The mission planner algorithm 130 outputs a desired trajectory to the lane centering control algorithm 132. The lane centering control algorithm 132 includes a path following control module 134 and a lane departure mitigation control module 136. The lane departure mitigation control module 136 receives the TTLC parameter and the confidence factor from the TTLC algorithm 128. The lane centering control algorithm 132 combines the outputs from the path following control module 134 and the lane departure mitigation control module 136 to generate a steering command, e.g., a torque command or a target steering angle command, and sends the steering command to an actuator 138, e.g., a power steering system actuator.
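The source states only that the lane centering control algorithm 132 combines the outputs of the two modules; the snippet below shows one plausible way to blend them, shifting weight toward the lane departure mitigation command as the predicted TTLC shrinks while the confidence remains high. The blending rule, thresholds, and names are assumptions for illustration, not the claimed method.

```python
def lane_centering_command(path_follow_torque: float,
                           mitigation_torque: float,
                           ttlc: float,
                           confidence: float,
                           ttlc_threshold: float = 1.5,
                           conf_threshold: float = 0.8) -> float:
    """Blend path-following and lane-departure-mitigation steering commands.

    When lane departure is imminent and the TTLC estimate is trusted, weight
    shifts toward the mitigation command; otherwise the path-following command
    passes through unchanged.
    """
    if confidence <= conf_threshold or ttlc >= ttlc_threshold:
        return path_follow_torque
    # Weight grows from 0 to 1 as the predicted crossing time approaches zero.
    w = min(1.0, max(0.0, 1.0 - ttlc / ttlc_threshold))
    return (1.0 - w) * path_follow_torque + w * mitigation_torque

print(lane_centering_command(path_follow_torque=0.5, mitigation_torque=2.0,
                             ttlc=0.6, confidence=0.9))
```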

It can be seen that the present disclosure provides a system and method for accurate and timely intervention based on an expected departure from a current driving lane.

While exemplary embodiments are described above, it is not intended that these embodiments describe all possible forms encompassed by the claims. The words used in the specification are words of description rather than limitation, and it is understood that various changes may be made without departing from the spirit and scope of the disclosure. As previously mentioned, features of the various embodiments may be combined to form further exemplary aspects of the disclosure that may not be explicitly described or illustrated. Although various embodiments may be described as providing advantages or being preferred over other embodiments or prior art implementations with respect to one or more desired characteristics, those of ordinary skill in the art will recognize that one or more features or characteristics may be omitted, depending on the particular application and implementation, to achieve desired overall system attributes. These attributes may include, but are not limited to, cost, strength, durability, life cycle cost, marketability, appearance, packaging, size, maintainability, weight, manufacturability, ease of assembly, and the like. As such, embodiments described as less advantageous than other embodiments or prior art implementations with respect to one or more characteristics are not outside the scope of the present disclosure and may be desirable for particular applications.
