Vehicle travel control device


Abstract: This technology, a vehicle travel control device, was created by 坂下真介, 堀笼大介, 石桥真人, and 宝神永一 on 2020-03-09. A vehicle travel control device (100) includes an arithmetic device (110) and component control devices (200-500) that control the operation of travel components mounted on a vehicle. The arithmetic device (110) includes a vehicle exterior environment recognition unit (111), route setting units (112-115), a vehicle motion determination unit (116), and a driving assistance image generation unit (150). The vehicle exterior environment recognition unit (111) recognizes the vehicle exterior environment; the route setting units (112-115) set a route on which the vehicle should travel; the vehicle motion determination unit (116) determines a target motion of the vehicle for tracking the set route; and the driving assistance image generation unit (150) generates a display image for assisting driving, using the images captured by a camera (70) and the information on the vehicle exterior environment recognized by the vehicle exterior environment recognition unit (111).

1. A vehicle travel control device for controlling travel of a vehicle, characterized in that:

the vehicle travel control device includes an arithmetic device and a component control device,

the component control device controls the operation of a travel component mounted on the vehicle based on a calculation result of the arithmetic device,

the arithmetic device includes a vehicle exterior environment recognition unit, a route setting unit, a target motion determination unit, and a driving assistance image generation unit,

the vehicle exterior environment recognition unit recognizes a vehicle exterior environment based on an output of a camera that is provided on the vehicle and captures images of the vehicle exterior environment,

the route setting unit sets a route on which the vehicle should travel based on the vehicle exterior environment recognized by the vehicle exterior environment recognition unit,

the target motion determination unit determines a target motion of the vehicle for tracking the route set by the route setting unit, and

the driving assistance image generation unit generates a display image for assisting driving using the captured images of the camera and information on the vehicle exterior environment recognized by the vehicle exterior environment recognition unit.

2. The vehicle travel control device according to claim 1, characterized in that:

the driving assistance image generation unit receives information of an obstacle from the vehicle exterior environment recognition unit, synthesizes images captured by the camera to generate an image representing a peripheral area including the vehicle, and superimposes a display for emphasizing the obstacle on the generated image.

3. The vehicle travel control device according to claim 1 or 2, characterized in that:

the arithmetic device includes a physical quantity calculation unit,

the physical quantity calculation unit calculates a target physical quantity to be generated by the travel component in order to achieve the target motion determined by the target motion determination unit, and

the component control device calculates a control amount for the travel component so as to realize the target physical quantity calculated by the physical quantity calculation unit, and outputs a control signal to the travel component.

4. The vehicle travel control device according to any one of claims 1 to 3, characterized in that:

the vehicle exterior environment recognition unit recognizes the vehicle exterior environment by deep learning.

Technical Field

The technology disclosed herein belongs to the technical field related to vehicle travel control devices.

Background

Conventionally, there have been known vehicle travel control devices that control a plurality of travel-related in-vehicle devices mounted on a vehicle.

For example, Patent Document 1 discloses, as a vehicle travel control device, a control system in which a plurality of in-vehicle devices are divided in advance into a plurality of domains according to their functions, and which is hierarchically organized, within each domain, into device control units that control the individual in-vehicle devices and a domain control unit that supervises those device control units. The control system further includes an integrated control unit that is located above the domain control units and supervises them.

In Patent Document 1, each device control unit calculates a control amount for the corresponding in-vehicle device and outputs, to that device, a control signal for realizing the control amount.

Patent Document 1: Japanese Laid-Open Patent Publication No. 2017-61278

Disclosure of Invention

Technical problems to be solved by the invention

Recently, the development of automated driving systems has been promoted at the national level. In general, an automated driving system acquires vehicle exterior environment information with a camera or the like and calculates the route on which the vehicle should travel from the acquired information. The system then controls the travel components so that the vehicle follows that route.

Also, vehicles provided with a Human Machine Interface (HMI) unit for assisting driving are increasing. The HMI unit, for example, synthesizes images captured by cameras provided on the vehicle body, generates a video showing the situation around the vehicle, and displays it on a display. By viewing the video on the display, the driver can grasp the situation around the vehicle at a glance. However, if the arithmetic device of the automated driving system and the HMI unit are provided separately, the configuration becomes complicated and the cost rises, and the wiring for transmitting the data output by the cameras also becomes complicated, which is undesirable.

The technology disclosed herein was devised in view of these problems, and its object is to display a video for assisting driving with a simple configuration, in a vehicle travel control device that operates travel components so as to follow a route calculated by an arithmetic device.

Technical solution for solving technical problem

In order to solve the above problem, the technology disclosed herein adopts the following configuration for a vehicle travel control device that controls travel of a vehicle. The vehicle travel control device includes an arithmetic device and a component control device, and the component control device controls the operation of a travel component mounted on the vehicle based on a calculation result of the arithmetic device. The arithmetic device includes a vehicle exterior environment recognition unit that recognizes a vehicle exterior environment based on an output of a camera that is provided on the vehicle and captures images of the vehicle exterior environment; a route setting unit that sets a route on which the vehicle should travel based on the vehicle exterior environment recognized by the vehicle exterior environment recognition unit; a target motion determination unit that determines a target motion of the vehicle for tracking the route set by the route setting unit; and a driving assistance image generation unit that generates a display image for assisting driving using the captured images of the camera and the information on the vehicle exterior environment recognized by the vehicle exterior environment recognition unit.

According to this configuration, in addition to performing the calculations for operating the travel components mounted on the vehicle, the arithmetic device includes a driving assistance image generation unit that generates a display image for assisting driving. The video for assisting driving is thus generated inside the arithmetic device itself, so no HMI unit has to read in enormous raw data such as camera images. Therefore, a video for assisting the driver can be displayed without separately providing a large HMI unit in addition to the arithmetic device. Even when an HMI unit is provided in the vehicle, it does not need a function for generating a driving assistance video from enormous raw data such as camera images. Further, since the high-volume output of the camera only needs to be transmitted to the arithmetic device, the configuration for transmitting data within the vehicle is simple.

In the vehicle travel control device, the driving assistance image generation unit may receive information of an obstacle from the vehicle exterior environment recognition unit, synthesize images captured by the camera to generate an image indicating a peripheral area including the vehicle, and superimpose a display for emphasizing the obstacle on the generated image.

According to this configuration, the image indicating the peripheral area including the vehicle and highlighting the obstacle can be generated by the arithmetic device.

In the vehicle travel control device, the following configuration may be adopted: the arithmetic device includes a physical quantity calculation unit that calculates a target physical quantity to be generated by the travel member in order to achieve the target motion determined by the target motion determination unit, and the member control device calculates a control quantity for the travel member so as to achieve the target physical quantity calculated by the physical quantity calculation unit, and outputs a control signal to the travel member.

According to this configuration, the arithmetic device only computes as far as the physical quantities to be realized, and the actual control amounts for the travel components are calculated by the component control device. This reduces the amount of computation in the arithmetic device and raises its computation speed. The component control device, for its part, only has to calculate the actual control amounts and output the control signals to the travel components, so its processing is fast. As a result, the responsiveness of the travel components to the vehicle exterior environment can be improved.

Further, because the component control device calculates the control amounts, the arithmetic device only needs to calculate approximate physical quantities, so its computation cycle can be slower than that of the component control device. This improves the calculation accuracy of the arithmetic device.

Further, by having the component control device calculate the control amounts, slight changes in the vehicle exterior environment can be handled by the component control device adjusting the control amounts, without going through the arithmetic device.

The arithmetic device of the vehicle travel control device may also be configured such that the vehicle exterior environment recognition unit recognizes the vehicle exterior environment by deep learning.

In this configuration, the vehicle exterior environment recognition unit recognizes the vehicle exterior environment by deep learning, so the amount of computation in the arithmetic device is particularly large. Therefore, having the control amounts for the travel components calculated by the component control device, which is separate from the arithmetic device, brings out all the more appropriately the effect of improving the responsiveness of the travel components to the vehicle exterior environment.

Effects of the invention

As described above, according to the technology disclosed herein, in the vehicle travel control device that controls the operation of the travel means so as to track the route calculated by the arithmetic device, the image for assisting the driving can be displayed with a simple configuration.

Drawings

Fig. 1 is a diagram schematically showing the configuration of a vehicle controlled by a vehicle travel control device according to an exemplary embodiment;

Fig. 2 is a schematic diagram showing the configuration of the engine;

Fig. 3 is a block diagram showing the control system of the automobile;

Fig. 4 shows an example of the configuration of the arithmetic device;

Fig. 5 shows an example of an image for driving assistance.

Detailed Description

Hereinafter, exemplary embodiments will be described in detail with reference to the accompanying drawings. In the present disclosure, "components," as in "travel components," refers to devices such as actuators and sensors mounted on a vehicle.

Fig. 1 schematically shows the configuration of a vehicle 1 controlled by a vehicle travel control device 100 (hereinafter referred to as the travel control device 100) according to the present embodiment. The vehicle 1 is an automobile capable of manual driving, in which it runs in response to the driver's operation of the accelerator and the like; of assisted driving, in which it runs while assisting the driver's operations; and of automated driving, in which it runs without any operation by the driver.

The vehicle 1 includes an engine 10 having a plurality of (four in the present embodiment) cylinders 11 as a drive source, a transmission 20 connected to the engine 10, a brake device 30 that brakes the rotation of the front wheels 50 serving as drive wheels, and a steering device 40 that steers the front wheels 50 serving as steered wheels.

The engine 10 is, for example, a gasoline engine. As shown in Fig. 2, each cylinder 11 of the engine 10 is provided with an injector 12 for supplying fuel into the cylinder 11 and a spark plug 13 for igniting the air-fuel mixture of the fuel and intake air supplied into the cylinder 11. Each cylinder 11 of the engine 10 is also provided with an intake valve 14, an exhaust valve 15, and a valve train 16 that adjusts the opening and closing actions of the intake valve 14 and the exhaust valve 15. The engine 10 further has pistons 17 that reciprocate in the cylinders 11 and a crankshaft 18 connected to the pistons 17 via connecting rods. The engine 10 may instead be a diesel engine, in which case the spark plugs 13 may be omitted. The injectors 12, the spark plugs 13, and the valve train 16 are examples of powertrain-related components.

The transmission 20 is, for example, a stepped automatic transmission. The transmission 20 is disposed on one side in the bank direction of the engine 10. The transmission 20 includes an input shaft (not shown) coupled to the crankshaft 18 of the engine 10, and an output shaft (not shown) coupled to the input shaft via a plurality of reduction gears (not shown). The output shaft is connected to an axle 51 of the front wheel 50. The rotation of the crankshaft 18 is changed in speed by the transmission 20 and transmitted to the front wheel 50. The transmission 20 is an example of a power train related component.

The engine 10 and the transmission 20 form a power transmission device that generates the driving force for running the vehicle 1. The operations of the engine 10 and the transmission 20 are controlled by a powertrain ECU (Electronic Control Unit) 200. For example, when the vehicle 1 is in the manual driving state, the powertrain ECU 200 controls the fuel injection amount and injection timing of the injectors 12, the ignition timing of the spark plugs 13, the opening timings and durations of the intake valves 14 and exhaust valves 15 via the valve train 16, and the like, based on detection values such as that of an accelerator opening sensor SW1, which detects the accelerator opening corresponding to how far the driver depresses the accelerator pedal. Also in the manual driving state, the powertrain ECU 200 adjusts the gear position of the transmission 20 based on the required driving force calculated from the accelerator opening and on the detection result of a shift sensor SW2, which detects the driver's operation of the shift lever. When the vehicle 1 is in the assisted driving state or the automated driving state, the powertrain ECU 200 basically calculates control amounts for the respective travel components (here, the injectors 12 and the like) so as to achieve the target driving force calculated by the arithmetic device 110 described later, and outputs control signals to those travel components. The powertrain ECU 200 is an example of a component control device.

The brake device 30 includes a brake pedal 31, a brake actuator 33, a booster 34 connected to the brake actuator 33, a master cylinder 35 connected to the booster 34, a Dynamic Stability Control (DSC) device 36 for adjusting the braking force, and brake pads 37 that actually brake the rotation of the front wheels 50. A brake disc 52 is provided on each axle 51 of the front wheels 50. The brake device 30 is an electric brake: it operates the brake actuator 33 according to the operation amount of the brake pedal 31 detected by a brake sensor SW3, and operates the brake pads 37 via the booster 34 and the master cylinder 35. The brake device 30 clamps the brake disc 52 with the brake pads 37 and brakes the rotation of the front wheel 50 with the frictional force generated between the pads and the disc. The brake actuator 33 and the DSC device 36 are examples of brake-related components.

The operation of the brake device 30 is controlled by a brake microcomputer 300 and a DSC microcomputer 400. For example, when the vehicle 1 is in the manual driving state, the brake microcomputer 300 controls the operation amount of the brake actuator 33 based on detection values such as that of the brake sensor SW3, which detects the amount by which the driver operates the brake pedal 31. The DSC microcomputer 400 controls the operation of the DSC device 36 and applies braking force to the front wheels 50 regardless of the driver's operation of the brake pedal 31. When the vehicle 1 is in the assisted driving state or the automated driving state, the brake microcomputer 300 basically calculates control amounts for the respective travel components (here, the brake actuator 33) so as to achieve the target braking force calculated by the arithmetic device 110 described later, and outputs control signals to those travel components. The brake microcomputer 300 and the DSC microcomputer 400 are examples of component control devices; they may also be implemented as a single microcomputer.

The steering device 40 includes a steering wheel 41 operated by the driver, an Electric Power Assist Steering (EPAS) device 42 for assisting the driver's steering operations, and a pinion shaft 43 connected to the EPAS device 42. The EPAS device 42 includes an electric motor 42a and a reduction gear 42b that reduces the speed of the driving force of the electric motor 42a and transmits it to the pinion shaft 43. The steering device 40 is a steer-by-wire type steering device: it operates the EPAS device 42 according to the operation amount of the steering wheel 41 detected by a steering angle sensor SW4, and turns the front wheels 50 by rotating the pinion shaft 43. The pinion shaft 43 and the front wheels 50 are coupled by a rack bar (not shown), and the rotation of the pinion shaft 43 is transmitted to the front wheels via the rack bar. The EPAS device 42 is an example of a steering-related component.

The operation of the steering device 40 is controlled by an EPAS microcomputer 500. For example, when the vehicle 1 is in the manual driving state, the EPAS microcomputer 500 controls the operation amount of the electric motor 42a based on detection values such as that of the steering angle sensor SW4. When the vehicle 1 is in the assisted driving state or the automated driving state, the EPAS microcomputer 500 basically calculates control amounts for the respective travel components (here, the EPAS device 42) so as to achieve the target steering angle calculated by the arithmetic device 110 described later, and outputs control signals to those travel components. The EPAS microcomputer 500 is an example of a component control device.

In the present embodiment, power train ECU200, brake microcomputer 300, DSC microcomputer 400, and EPAS microcomputer 500 are configured to be able to communicate with each other, and details thereof will be described later. In the following description, power train ECU200, brake microcomputer 300, DSC microcomputer 400, and EPAS microcomputer 500 may be simply referred to as component control devices.

In the present embodiment, as shown in Fig. 3, the travel control device 100 includes an arithmetic device 110 that, to enable assisted driving and automated driving, calculates the route on which the vehicle 1 should travel and determines the motion of the vehicle 1 for following that route. The arithmetic device 110 is a microprocessor composed of one or more chips and includes a CPU, memory, and the like. Note that Fig. 3 shows the configuration for performing the functions according to the present embodiment (the route generation functions described later) and does not show every function of the arithmetic device 110.

Fig. 4 shows an example of the configuration of the arithmetic device 110. In the configuration example of fig. 4, the arithmetic device 110 includes a processor 3 and a memory 4. The memory 4 stores software, i.e., modules, that can be executed by the processor 3. The functions of the parts shown in fig. 3 are realized by the processor 3 executing the respective modules stored in the memory 4. The memory 4 stores data representing a model used for the processing of each part shown in fig. 3. It should be noted that there may be a plurality of processors 3 and memories 4.

As shown in Fig. 3, the arithmetic device 110 determines the target motion of the vehicle 1 based on outputs from a plurality of sensors and the like, and controls the operation of the components. The sensors and the like that output information to the arithmetic device 110 include: a plurality of cameras 70 provided on the vehicle body and the like of the vehicle 1 to capture images of the vehicle exterior environment; a plurality of radars 71 provided on the vehicle body and the like of the vehicle 1 to detect people, objects, and the like outside the vehicle; a position sensor SW5 that detects the position of the vehicle 1 (vehicle position information) using the Global Positioning System (GPS); a vehicle state sensor SW6, made up of the outputs of sensors that detect vehicle behavior, such as a vehicle speed sensor, an acceleration sensor, and a yaw rate sensor, which acquires the state of the vehicle 1; and an occupant state sensor SW7, made up of an in-vehicle camera and the like, which acquires the state of the occupants of the vehicle 1. In addition, communication information from other vehicles around the host vehicle and traffic information from a navigation system, received by a vehicle exterior communication unit 72, are input to the arithmetic device 110.

The cameras 70 are arranged so that, together, they can image the surroundings of the vehicle 1 over 360° in the horizontal direction. Each camera 70 captures an optical image of the vehicle exterior environment, generates image data, and outputs it to the arithmetic device 110. The cameras 70 are an example of information acquisition means that acquire vehicle exterior environment information.

Like the cameras 70, the radars 71 are arranged so that their detection ranges together cover 360° around the vehicle 1 in the horizontal direction. The type of the radars 71 is not particularly limited; for example, millimeter-wave radars or infrared radars can be used. The radars 71 are an example of information acquisition means that acquire vehicle exterior environment information.

In assisted driving or automated driving, the arithmetic device 110 sets a travel route for the vehicle 1 and sets a target motion of the vehicle 1 such that the vehicle 1 follows that travel route. To set the target motion of the vehicle 1, the arithmetic device 110 includes a vehicle exterior environment recognition unit 111, a candidate route generation unit 112, a vehicle behavior estimation unit 113, an occupant behavior estimation unit 114, a route determination unit 115, a vehicle motion determination unit 116, a driving force calculation unit 117, a braking force calculation unit 118, and a steering angle calculation unit 119. The vehicle exterior environment recognition unit 111 recognizes the vehicle exterior environment based on the outputs of the cameras 70 and the like. The candidate route generation unit 112 calculates one or more candidate routes on which the vehicle 1 can travel, based on the vehicle exterior environment recognized by the vehicle exterior environment recognition unit 111. The vehicle behavior estimation unit 113 estimates the behavior of the vehicle 1 based on the output of the vehicle state sensor SW6. The occupant behavior estimation unit 114 estimates the behavior of the occupants of the vehicle 1 based on the output of the occupant state sensor SW7. The route determination unit 115 determines the route on which the vehicle 1 should travel. The vehicle motion determination unit 116 determines the target motion of the vehicle 1 for tracking the route determined by the route determination unit 115. The driving force calculation unit 117, the braking force calculation unit 118, and the steering angle calculation unit 119 calculate the target physical quantities (for example, driving force, braking force, and steering angle) that the travel components should generate to achieve the target motion determined by the vehicle motion determination unit 116. The candidate route generation unit 112, the vehicle behavior estimation unit 113, the occupant behavior estimation unit 114, and the route determination unit 115 together constitute a route setting unit that sets the route on which the vehicle 1 should travel, based on the vehicle exterior environment recognized by the vehicle exterior environment recognition unit 111.
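To make the division of labor concrete, the following is a minimal Python sketch of how the units listed above might be wired together. All class and method names are hypothetical, since the patent describes functional blocks rather than an API, and the rule-based and backup routes introduced in the next paragraph are omitted for brevity.

```python
# Hypothetical wiring of the units of the arithmetic device 110.
# The collaborating objects are injected; only the data flow is shown.

class ArithmeticDevice:
    def __init__(self, recognizer, route_setter, motion_determiner,
                 quantity_calcs, image_generator):
        self.recognizer = recognizer                # unit 111
        self.route_setter = route_setter            # units 112-115
        self.motion_determiner = motion_determiner  # unit 116
        self.quantity_calcs = quantity_calcs        # units 117-119, name -> fn
        self.image_generator = image_generator      # unit 150

    def step(self, camera_frames, radar_points, vehicle_state, occupant_state):
        # Recognize the vehicle exterior environment (roads, obstacles).
        env = self.recognizer.recognize(camera_frames, radar_points)
        # Set the route the vehicle should travel and the target motion.
        route = self.route_setter.set_route(env, vehicle_state, occupant_state)
        motion = self.motion_determiner.determine(route)
        # Target physical quantities go to the component control devices
        # (powertrain ECU 200, brake microcomputer 300, EPAS microcomputer 500).
        targets = {name: calc(motion) for name, calc in self.quantity_calcs.items()}
        # The driving assistance video is generated in the same device and
        # sent to the in-vehicle display as an ordinary video signal.
        video = self.image_generator.generate(camera_frames, env)
        return targets, video
```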

As safety functions, the arithmetic device 110 also has a rule-based route generation unit 120 and a backup unit 130. The rule-based route generation unit 120 recognizes objects outside the vehicle according to predetermined rules and generates a travel route that avoids those objects. The backup unit 130 generates a travel route for guiding the vehicle 1 to a safe area such as a road shoulder.

The arithmetic device 110 further includes a driving assistance image generation unit 150 that generates a display image for assisting driving.

<Vehicle exterior environment recognition unit>

The vehicle exterior environment recognition unit 111 receives the outputs of the cameras 70, the radars 71, and the like mounted on the vehicle 1, and recognizes the vehicle exterior environment. The vehicle exterior environment to be recognized includes at least roads and obstacles. Here, the vehicle exterior environment recognition unit 111 estimates the vehicle exterior environment, including roads and obstacles, by comparing three-dimensional information on the surroundings of the vehicle 1, based on the data from the cameras 70 and the radars 71, with a vehicle exterior environment model. The vehicle exterior environment model is, for example, a learned model generated by deep learning, which can recognize roads, obstacles, and the like in three-dimensional information on the vehicle's surroundings.

For example, the vehicle exterior environment recognition unit 111 identifies free space, that is, regions where no object is present, by image processing on the images captured by the cameras 70. The image processing here uses, for example, a learned model generated by deep learning. A two-dimensional map representing the free space is then generated. The vehicle exterior environment recognition unit 111 also acquires, from the outputs of the radars 71, information on the people and objects present around the vehicle 1. This information is positioning information that includes the positions, speeds, and the like of those people and objects. The vehicle exterior environment recognition unit 111 then combines the generated two-dimensional map with the positioning information on the people and objects to generate a three-dimensional map representing the situation around the vehicle 1; for this, information on the installation positions and imaging directions of the cameras 70 and on the installation positions and transmission directions of the radars 71 is used. The vehicle exterior environment recognition unit 111 estimates the vehicle environment, including roads and obstacles, by comparing the generated three-dimensional map with the vehicle exterior environment model. The deep learning uses a multilayer neural network (DNN: Deep Neural Network); one well-known example is the convolutional neural network (CNN).
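As an illustration of this fusion step, the sketch below combines a camera-derived free-space grid with radar positioning information into a single map of the surroundings (a two-dimensional simplification of the three-dimensional map described above). The grid size, resolution, and data formats are assumptions made for the example, not taken from the patent.

```python
import numpy as np

GRID = 200   # cells per side; vehicle at the grid centre (assumed)
RES = 0.25   # metres per cell (assumed)

def fuse_maps(free_space: np.ndarray, radar_objects) -> np.ndarray:
    """free_space: (GRID, GRID) bool map from the camera pipeline,
    True where the segmentation model found drivable space.
    radar_objects: iterable of (x, y) object positions in metres,
    in vehicle coordinates, from the radar positioning information."""
    # Start from the camera view: free cells 0.0, everything else 1.0.
    fused = np.where(free_space, 0.0, 1.0)
    for x, y in radar_objects:
        i = int(GRID / 2 + x / RES)
        j = int(GRID / 2 + y / RES)
        if 0 <= i < GRID and 0 <= j < GRID:
            fused[i, j] = 1.0   # mark radar-detected person/object as occupied
    return fused
```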

<Candidate route generation unit>

The candidate route generation unit 112 generates candidate routes on which the vehicle 1 can travel, based on the output of the vehicle exterior environment recognition unit 111, the output of the position sensor SW5, the information transmitted from the vehicle exterior communication unit 73, and the like. For example, the candidate route generation unit 112 generates travel routes that, on a road recognized by the vehicle exterior environment recognition unit 111, avoid the obstacles recognized by that unit. The output of the vehicle exterior environment recognition unit 111 includes, for example, lane information on the lane in which the vehicle 1 travels. The lane information includes information on the shape of the lane itself and information on objects on the lane. The information on the shape of the lane includes the lane's shape (straight or curved, and the curvature if curved), the lane width, the number of lanes, the width of each lane, and so on. The information on objects includes each object's position and speed relative to the vehicle, its attributes (type and direction of movement), and so on. Object types include, for example, vehicles, pedestrians, roads, and lane markings.

Here, the candidate route generation unit 112 calculates a plurality of candidate routes using a state lattice method and selects one or more candidate routes from among them based on the route cost of each candidate. Alternatively, another method may be used for the route calculation.

The candidate route generation unit 112 sets a virtual grid region on the lane based on the lane information. The grid region has a plurality of grid points, each of which identifies a position on the lane. The candidate route generation unit 112 sets a predetermined grid point as the target arrival position, and then calculates a plurality of candidate routes by a route search using the grid points in the grid region. In the state lattice method, a route branches from each grid point to arbitrary grid points ahead of it in the vehicle traveling direction, so each candidate route is set to pass through a sequence of grid points. Each candidate route also includes time information indicating when each grid point is passed, speed information on the speed, acceleration, and the like at each grid point, information on the movement of other vehicles, and so on.

The candidate route generation unit 112 selects one or more travel routes from the plurality of candidate routes based on the route cost. The route cost here includes, for example, the degree of lane centering, the acceleration of the vehicle, the steering angle, and the possibility of collision. When the candidate route generation unit 112 selects a plurality of travel routes, the route determination unit 115 narrows them down to one.
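A toy version of this search makes the mechanics concrete: candidate paths are enumerated as sequences of lateral grid offsets, each path is priced with a cost combining lane centering, steering effort, and collision risk, and the cheapest one survives. The cost weights and grid geometry below are invented for illustration; the patent names only the kinds of cost terms.

```python
import itertools
import numpy as np

def candidate_paths(n_steps=4, lateral_moves=(-1, 0, 1)):
    """Enumerate paths as cumulative lateral offsets, one per step ahead."""
    for moves in itertools.product(lateral_moves, repeat=n_steps):
        yield np.cumsum(moves)

def route_cost(path, obstacles, w_center=1.0, w_steer=0.5, w_collide=100.0):
    center = np.sum(np.abs(path))                      # lane-centering term
    steer = np.sum(np.abs(np.diff(path, prepend=0)))   # steering-effort term
    collide = sum(1 for step, lat in enumerate(path)   # collision term
                  if (step + 1, lat) in obstacles)
    return w_center * center + w_steer * steer + w_collide * collide

obstacles = {(2, 0)}  # an object two steps ahead in the ego lane
best = min(candidate_paths(), key=lambda p: route_cost(p, obstacles))
print(best)  # [ 0 -1  0  0]: briefly leaves the lane centre to pass the object
```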

<Vehicle behavior estimation unit>

The vehicle behavior estimation unit 113 estimates the state of the vehicle from the outputs of sensors that detect vehicle behavior, such as the vehicle speed sensor, the acceleration sensor, and the yaw rate sensor. The vehicle behavior estimation unit 113 uses a vehicle six-axis model that represents the behavior of the vehicle.

Here, the vehicle six-axis model models the accelerations of the running vehicle along three axes, fore-aft, left-right, and up-down, and its angular velocities about three axes, pitch, roll, and yaw. In other words, rather than confining the vehicle's motion to the plane of classical vehicle dynamics (fore-aft and left-right movement in the X-Y plane plus yaw about the Z axis), the model also reproduces the pitch (about the Y axis) and roll (about the X axis) motions of the vehicle body, which rides on the four wheels through the suspension, as well as the movement along the Z axis (the up-down movement of the vehicle body). It is thus a numerical model that reproduces the vehicle's behavior using six axes in total.

The vehicle behavior estimation unit 113 applies the vehicle six-axis model to the travel routes generated by the candidate route generation unit 112 and estimates the behavior of the vehicle 1 when traveling along each travel route.
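To make the six modelled quantities concrete, here is a minimal sketch of the state such a model produces per route point, together with a deliberately crude stand-in for the estimation step. The patent does not disclose the model's internals; the curvature-based relations and the roll-rate constant below are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SixAxisState:
    ax: float          # fore-aft acceleration [m/s^2]
    ay: float          # left-right acceleration [m/s^2]
    az: float          # up-down acceleration [m/s^2]
    pitch_rate: float  # angular velocity about the lateral axis [rad/s]
    roll_rate: float   # angular velocity about the longitudinal axis [rad/s]
    yaw_rate: float    # angular velocity about the vertical axis [rad/s]

def estimate_behavior(curvatures: List[float], speed: float) -> List[SixAxisState]:
    """Route given as curvature per point [1/m], constant speed [m/s]."""
    states, prev_ay = [], 0.0
    for k in curvatures:
        ay = speed ** 2 * k          # steady-state lateral acceleration
        states.append(SixAxisState(
            ax=0.0, ay=ay, az=0.0,
            pitch_rate=0.0,
            roll_rate=0.3 * (ay - prev_ay),  # body roll follows lateral load
            yaw_rate=speed * k))
        prev_ay = ay
    return states
```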

<Occupant behavior estimation unit>

The occupant behavior estimation unit 114 estimates, in particular, the driver's health state and emotion from the detection results of the occupant state sensor SW7. Health states include, for example, good health, mild fatigue, poor physical condition, and reduced consciousness. Emotions include, for example, happy, normal, bored, impatient, and unhappy.

For example, the occupant behavior estimation unit 114 extracts the driver's face image from an image captured by a camera installed in the vehicle cabin and identifies the driver. The extracted face image and the identified driver information are given as inputs to a human model. The human model is, for example, a learned model generated by deep learning that outputs health-state and emotion information from the face image of each person who may drive the vehicle 1. The occupant behavior estimation unit 114 outputs the driver's health state and emotion information produced by the human model.
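The contract of such a human model might look like the following interface sketch. The patent gives no architecture, so the class shape, label sets, and the injected classifier are all assumptions.

```python
from typing import Callable, Tuple
import numpy as np

HEALTH = ("healthy", "mild fatigue", "poor condition", "reduced consciousness")
EMOTION = ("happy", "normal", "bored", "impatient", "unhappy")

class HumanModel:
    """Maps a driver's face image to (health state, emotion)."""
    def __init__(self, net: Callable):
        self.net = net  # learned classifier, e.g. a CNN trained per driver

    def infer(self, face: np.ndarray, driver_id: str) -> Tuple[str, str]:
        health_logits, emotion_logits = self.net(face, driver_id)
        return (HEALTH[int(np.argmax(health_logits))],
                EMOTION[int(np.argmax(emotion_logits))])
```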

When a biological information sensor such as a skin temperature sensor, a heart rate sensor, a blood flow sensor, or a sweat sensor is used as the occupant state sensor SW7 for acquiring information of the driver, the occupant behavior estimation unit 114 measures biological information of the driver based on an output of the biological information sensor. In this case, the human model outputs health state and emotion information for each person who may become a driver of the vehicle 1, using the biological information as an input. The occupant behavior estimation unit 114 outputs the health state and emotion information of the driver, which have been output by the human model.

Further, a model that estimates, for each person who may drive the vehicle 1, the emotion that person feels in response to the behavior of the vehicle 1 may also be used as the human model. In this case, the output of the vehicle behavior estimation unit 113, the driver's biological information, and the estimated emotional state may be managed in chronological order to construct the model. Such a model can predict, for example, the relationship between the driver's arousal level and the vehicle behavior.

The occupant behavior estimation unit 114 may also use a human body model as the human model. The human body model specifies, for example, the mass of the head (for example, 5 kg) and the strength of the neck muscles against G forces in the front-rear and left-right directions. Given a vehicle body motion (acceleration G or jerk) as input, the human body model outputs the expected physical information and subjective information of the occupant. The physical information is, for example, comfortable/moderate/unpleasant; the subjective information is, for example, predictable/unexpected. By referring to the human body model, a vehicle behavior that tips the occupant's head even slightly backward, for example, is unpleasant for the occupant, so a travel route producing it can be left unselected. Conversely, a vehicle behavior that pitches the head forward, as in a bow, is easy for the occupant to brace against and does not cause immediate discomfort, so such a travel route can be selected. Alternatively, by referring to the human body model, the target motion can be determined so that the occupant's head is not shaken about, or so that the motion feels lively and brisk.
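The head-motion rule just described can be read as an asymmetric comfort constraint, sketched below. The sign convention and threshold values are invented for illustration; the patent states the rule only qualitatively.

```python
def route_is_comfortable(head_jerks, backward_limit=-1.5, forward_limit=3.0):
    """head_jerks: estimated fore-aft head jerk per route point [m/s^3];
    negative values tip the head backward (penalised harder), positive
    values pitch it forward, which the occupant can brace against."""
    return all(backward_limit <= j <= forward_limit for j in head_jerks)
```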

The occupant behavior estimation unit 114 applies the human model to the vehicle behavior estimated by the vehicle behavior estimation unit 113, and estimates how the current driver's health state and emotion would change in response to that vehicle behavior.

<Route determination unit>

The route determination unit 115 determines the route on which the vehicle 1 should travel, based on the output of the occupant behavior estimation unit 114. When the candidate route generation unit 112 has generated only one route, the route determination unit 115 adopts that route as the route on which the vehicle 1 should travel. When the candidate route generation unit 112 has generated a plurality of routes, the route determination unit 115 takes the output of the occupant behavior estimation unit 114 into account and selects, for example, the candidate route that the occupants (particularly the driver) find most comfortable, that is, one that does not strike the driver as tediously overcautious, such as a route that creeps around an obstacle too warily.

<Rule-based route generation unit>

The rule-based route generation unit 120 recognizes objects outside the vehicle according to predetermined rules, without using deep learning, based on the outputs of the cameras 70 and the radars 71, and generates a travel route that avoids those objects. Like the candidate route generation unit 112, the rule-based route generation unit 120 calculates a plurality of candidate routes using the state lattice method and selects one or more of them based on the route cost of each candidate. The rule-based route generation unit 120 calculates the route cost based on rules such as, for example, not intruding within several meters of an object. The rule-based route generation unit 120 may also use another method for the route calculation.
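A sketch of such a keep-out rule is shown below: any candidate route that comes within a margin of a recognized object is priced out. The margin and penalty values are assumptions standing in for the patent's "several meters".

```python
def rule_based_cost(path_xy, objects_xy, margin=3.0, penalty=1e6):
    """path_xy, objects_xy: (x, y) positions in metres, vehicle coordinates."""
    cost = 0.0
    for px, py in path_xy:
        for ox, oy in objects_xy:
            if (px - ox) ** 2 + (py - oy) ** 2 < margin ** 2:
                cost += penalty  # hard rule: do not intrude near the object
    return cost
```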

Information on the route generated by the rule-based route generation unit 120 is input to the vehicle motion determination unit 116.

<Backup unit>

Based on the outputs of the cameras 70 and the radars 71, the backup unit 130 generates a travel route for guiding the vehicle 1 to a safe area such as a road shoulder when a sensor or the like fails or when an occupant is in poor physical condition. For example, the backup unit 130 sets, from the information of the position sensor SW5, a safe area in which the vehicle 1 can make an emergency stop, and generates a travel route to that safe area. Like the candidate route generation unit 112, the backup unit 130 calculates a plurality of candidate routes using the state lattice method and selects one or more of them based on the route cost of each candidate. The backup unit 130 may also use another method for the route calculation.

The information on the route generated by the backup unit 130 is input to the vehicle motion determination unit 116.

<Vehicle motion determination unit>

The vehicle motion determination unit 116 determines the target motion for the travel route determined by the route determination unit 115. The target motion refers to steering and acceleration/deceleration for tracking a travel path. The vehicle motion determination unit 116 refers to the vehicle six-axis model, and calculates the motion of the vehicle body with respect to the travel route selected by the route determination unit 115.

The vehicle motion determination unit 116 likewise determines a target motion for tracking the travel route generated by the rule-based route generation unit 120.

The vehicle motion determination unit 116 determines a target motion for tracking the travel route generated by the backup unit 130.

When the travel route determined by the route determination unit 115 deviates significantly from the travel route generated by the rule-based route generation unit 120, the vehicle motion determination unit 116 selects the travel route generated by the rule-based route generation unit 120 as the route on which the vehicle 1 should travel.

When it is estimated that the sensor or the like (particularly, the camera 70 or the radar 71) is malfunctioning or the physical condition of the occupant is poor, the vehicle motion determination unit 116 selects the travel path generated by the backup unit 130 as the path on which the vehicle 1 should travel.

<Physical quantity calculation unit>

The physical quantity calculation unit is composed of a driving force calculation unit 117, a braking force calculation unit 118, and a steering angle calculation unit 119. The driving force calculation unit 117 calculates a target driving force to be generated by the power transmission device (the engine 10 and the transmission 20) in order to achieve the target motion. The braking force calculation unit 118 calculates a target braking force to be generated by the brake device 30 in order to achieve the target motion. The steering angle calculation unit 119 calculates a target steering angle to be generated by the steering device 40 in order to achieve the target motion.
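A hedged sketch of this step is given below: a target acceleration maps to driving or braking force through the vehicle mass, and a target yaw rate maps to a steering angle through a kinematic bicycle model. The mass, wheelbase, and the bicycle-model relation itself are illustrative assumptions; the patent does not specify how the targets are derived.

```python
import math

MASS = 1500.0    # vehicle mass [kg] (assumed)
WHEELBASE = 2.7  # [m] (assumed)

def target_quantities(target_accel, target_yaw_rate, speed):
    """target_accel [m/s^2], target_yaw_rate [rad/s], speed [m/s]."""
    force = MASS * target_accel
    driving_force = max(force, 0.0)    # -> powertrain ECU 200
    braking_force = max(-force, 0.0)   # -> brake microcomputer 300
    # Kinematic bicycle model: yaw_rate = v * tan(delta) / L.
    v = max(speed, 0.1)                # avoid division issues near standstill
    steering_angle = math.atan2(WHEELBASE * target_yaw_rate, v)  # -> EPAS 500
    return driving_force, braking_force, steering_angle
```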

<Peripheral component operation setting unit>

The peripheral component operation setting unit 140 sets the operations of body-related components of the vehicle 1, such as the lamps and doors, based on the output of the vehicle motion determination unit 116. For example, the peripheral component operation setting unit 140 sets the direction of the lamps when the vehicle 1 follows the travel route determined by the route determination unit 115. Also, when guiding the vehicle 1 to the safe area set by the backup unit 130, the peripheral component operation setting unit 140 sets operations such as flashing the hazard lamps and unlocking the doors once the vehicle has reached the safe area.

<Driving assistance image generation unit>

In the present embodiment, the arithmetic device 110 includes the driving assistance image generation unit 150, which generates images for assisting driving to be shown on the in-vehicle display 700 and the like. As described above, in the arithmetic device 110, the vehicle exterior environment recognition unit 111 receives the outputs of the cameras 70 that capture the vehicle exterior environment, recognizes that environment, and estimates roads, obstacles, and the like by model prediction. The driving assistance image generation unit 150 then receives the outputs of the cameras 70 and the vehicle exterior environment information from the vehicle exterior environment recognition unit 111, and generates a video for assisting driving. For example, the driving assistance image generation unit 150 synthesizes the images captured by the cameras 70 to generate an image covering the area surrounding the vehicle, receives the estimated obstacle information from the vehicle exterior environment recognition unit 111, and superimposes a display emphasizing the obstacles on the generated image. The captured images may be still images or moving images. The arithmetic device 110 then transmits a video signal (for example, an RGB signal) representing the video generated by the driving assistance image generation unit 150 to the in-vehicle display 700. This video signal has a far smaller data amount than the output of the cameras 70. The in-vehicle display 700 uses the video signal transmitted from the arithmetic device 110 to display the video for assisting driving.
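The image-generation step might look like the OpenCV sketch below. The homographies that map each camera into a common top view, the obstacle box format, and the stitching rule are assumptions; the patent says only that the images are coordinate-converted, combined, and overlaid with an emphasis display.

```python
import cv2
import numpy as np

def assist_view(frames, homographies, obstacles, size=(400, 400)):
    """frames: list of camera images (np.uint8, BGR); homographies: one 3x3
    matrix per camera into the common top view; obstacles: (x, y, w, h)
    boxes in top-view pixels from the vehicle exterior environment
    recognition unit 111."""
    view = np.zeros((size[1], size[0], 3), dtype=np.uint8)
    for frame, H in zip(frames, homographies):
        warped = cv2.warpPerspective(frame, H, size)
        view = np.maximum(view, warped)  # crude stitching of the warped views
    for x, y, w, h in obstacles:
        # Emphasis display superimposed on the composite image (cf. A1 in Fig. 5).
        cv2.rectangle(view, (x, y), (x + w, y + h), (0, 0, 255), 2)
    return view  # one frame of the video sent toward the in-vehicle display 700
```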

Fig. 5 shows an example of the video displayed on the in-vehicle display 700. In the example of Fig. 5, the in-vehicle display 700 shows, side by side, a composite image V1 presenting the vehicle and its surroundings in plan view and a video V2 from a camera provided at the front of the vehicle body. The composite image V1 is generated by coordinate-converting and combining the image data of a plurality of cameras provided around the vehicle body. In the example of Fig. 5, the vehicle exterior environment recognition unit 111 has estimated that a person at the front left of the vehicle is an obstacle, and a display A1 emphasizing that person is superimposed on the composite image V1.

By providing the driving assistance image generation unit 150 in the arithmetic device 110 in this way, a video for assisting the driver can be displayed without providing a large HMI (Human Machine Interface) unit in addition to the arithmetic device 110. That is, even when an HMI unit is provided in the vehicle, it need not be given a function for generating a driving assistance video from enormous raw data such as camera images. Further, since the high-volume output of the cameras 70 only needs to be transmitted to the arithmetic device 110, the configuration for transmitting data within the vehicle is simple.

<Output destinations of the arithmetic device>

The calculation results of the arithmetic device 110 are output to the powertrain ECU 200, the brake microcomputer 300, the EPAS microcomputer 500, and a body system microcomputer 600. Specifically, information on the target driving force calculated by the driving force calculation unit 117 is input to the powertrain ECU 200; information on the target braking force calculated by the braking force calculation unit 118 is input to the brake microcomputer 300; information on the target steering angle calculated by the steering angle calculation unit 119 is input to the EPAS microcomputer 500; and information on the operations of the body-related components set by the peripheral component operation setting unit 140 is input to the body system microcomputer 600.

The arithmetic device 110 transmits a video signal (for example, RGB signal) representing the video generated by the driving assistance video generating unit 150 to the in-vehicle display 700.

As described above, the powertrain ECU 200 basically calculates the fuel injection timing of the injectors 12 and the ignition timing of the spark plugs 13 so as to achieve the target driving force, and outputs control signals to those travel components. The brake microcomputer 300 basically calculates the control amount for the brake actuator 33 so as to achieve the target braking force, and outputs a control signal to the brake actuator 33. The EPAS microcomputer 500 basically calculates the amount of current to supply to the EPAS device 42 so as to achieve the target steering angle, and outputs a control signal to the EPAS device 42.
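The local loop that a component control device closes around its target might look like the following sketch, here a proportional-integral controller standing in for the brake microcomputer 300. The gains, cycle time, and the availability of a force measurement are assumptions.

```python
class BrakeActuatorController:
    """Stand-in for the brake microcomputer 300 tracking a target braking force."""
    def __init__(self, kp=0.8, ki=0.2, dt=0.005):  # fast local cycle (assumed)
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def control_signal(self, target_force, measured_force):
        """target_force: from the arithmetic device 110; measured_force:
        local feedback. Returns the command for the brake actuator 33."""
        error = target_force - measured_force
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral
```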

In this way, in the present embodiment, the arithmetic device 110 calculates only the target physical quantities that the travel components should produce, and the control amounts for the travel components are calculated by the component control devices 200 to 500. This reduces the amount of computation in the arithmetic device 110 and raises its computation speed. Each of the component control devices 200 to 500 only has to calculate the actual control amounts and output control signals to the travel components (the injectors 12 and the like), so its processing is fast. As a result, the responsiveness of the travel components to the vehicle exterior environment can be improved.

In addition, because the component control devices 200 to 500 calculate the control amounts, the arithmetic device 110 only needs to calculate approximate physical quantities, so its computation speed can be slower than that of the component control devices 200 to 500. This improves the calculation accuracy of the arithmetic device 110.

As described above, the travel control device 100 of the present embodiment includes the arithmetic device 110 and the component control devices 200 to 500. The component control devices 200 to 500 control the operation of the travel components (the injectors 12 and the like) mounted on the vehicle 1 based on the calculation results of the arithmetic device 110. The arithmetic device 110 includes the vehicle exterior environment recognition unit 111, which recognizes the vehicle exterior environment based on the outputs of the cameras 70 and the radars 71 that acquire vehicle exterior environment information; the route setting unit (the candidate route generation unit 112 and others), which sets the route on which the vehicle 1 should travel based on the vehicle exterior environment recognized by the vehicle exterior environment recognition unit 111; the vehicle motion determination unit 116, which determines the target motion of the vehicle 1 for tracking the set route; and the driving assistance image generation unit 150, which generates a display image for assisting driving using the images captured by the cameras 70 and the vehicle exterior environment information recognized by the vehicle exterior environment recognition unit 111. The video for assisting driving is thus generated inside the arithmetic device 110 itself, with no HMI unit having to read in enormous raw data such as camera images. Therefore, a video for assisting the driver can be displayed without separately providing a large HMI unit in addition to the arithmetic device 110. Even when an HMI unit is provided in the vehicle, it does not need a function for generating a driving assistance video from enormous raw data such as camera images. Further, since the high-volume output of the cameras only needs to be transmitted to the arithmetic device, the configuration for transmitting data within the vehicle is simple.

The arithmetic device 110 also includes the physical quantity calculation units 117 to 119, which calculate the target physical quantities that the travel components should generate to achieve the target motion determined by the vehicle motion determination unit 116; the component control devices 200 to 500 calculate the control amounts for the travel components so as to achieve the target physical quantities calculated by the physical quantity calculation units 117 to 119, and output control signals to the travel components. The arithmetic device 110 thus only computes as far as the physical quantities to be realized, while the actual control amounts for the travel components are calculated by the component control devices 200 to 500. This reduces the amount of computation in the arithmetic device 110 and raises its computation speed. The component control devices 200 to 500, for their part, only have to calculate the actual control amounts and output control signals to the travel components, so their processing is fast. As a result, the responsiveness of the travel components to the vehicle exterior environment can be improved.

In particular, in the present embodiment, the vehicle exterior environment recognition unit 111 recognizes the vehicle exterior environment by deep learning, so the amount of computation in the arithmetic device 110 is especially large. Having the control amounts for the travel components calculated by the component control devices 200 to 500, which are separate from the arithmetic device 110, therefore brings out all the more appropriately the effect of improving the responsiveness of the travel components to the vehicle exterior environment.

<Other controls>

When the vehicle 1 is in the assisted driving state, the driving force calculation unit 117, the braking force calculation unit 118, and the steering angle calculation unit 119 may change the target driving force and the like according to the state of the driver of the vehicle 1. For example, when the driver is enjoying driving (the driver's emotion is "happy"), the target driving force and the like may be reduced to bring the behavior as close as possible to manual driving. Conversely, when the driver is in poor physical condition, the target driving force and the like may be increased to bring the behavior as close as possible to automated driving.

(Other embodiments)

The technology disclosed herein is not limited to the embodiments described above; substitutions may be made within the scope of the claims without departing from their gist.

For example, in the above embodiment, the route determination unit 115 determines the route on which the vehicle 1 should travel. Not limited to this, the route determination unit 115 may be omitted, and the vehicle motion determination unit 116 may determine the route on which the vehicle 1 should travel. That is, the vehicle motion determination unit 116 may also serve as part of the route setting unit and the target motion determination unit.

In the above embodiment, the driving force calculation unit 117, the braking force calculation unit 118, and the steering angle calculation unit 119 calculate a target physical quantity such as a target driving force. Not limited to this, the driving force calculation unit 117, the braking force calculation unit 118, and the steering angle calculation unit 119 may be omitted, and the vehicle motion determination unit 116 may calculate the target physical quantity. That is, the vehicle motion determination unit 116 may also serve as both the target motion determination unit and the physical quantity calculation unit.

The above embodiments are merely examples and should not be used to limit the scope of the present disclosure. The scope of the present disclosure is defined by the appended claims, and all changes and modifications that fall within the meaning and range of equivalency of the claims are intended to be embraced therein.

Industrial Applicability

The technology disclosed herein is useful as a vehicle travel control device that controls travel of a vehicle.

Description of Reference Numerals

1 vehicle

12 injector (travel component)

13 spark plug (travel component)

16 valve train (travel component)

20 transmission (travel component)

33 brake actuator (travel component)

42 EPAS device (travel component)

100 vehicle travel control device

110 arithmetic device

111 vehicle exterior environment recognition unit

112 candidate route generation unit (route setting unit)

113 vehicle behavior estimation unit (route setting unit)

114 occupant behavior estimation unit (route setting unit)

115 route determination unit (route setting unit)

116 vehicle motion determination unit (target motion determination unit)

117 driving force calculation unit (physical quantity calculation unit)

118 braking force calculation unit (physical quantity calculation unit)

119 steering angle calculation unit (physical quantity calculation unit)

150 driving assistance image generation unit

200 powertrain ECU (component control device)

300 brake microcomputer (component control device)

400 DSC microcomputer (component control device)

500 EPAS microcomputer (component control device)
