Vehicle computing system

Document No.: 98397  Publication date: 2021-10-12

Reading note: This technology, "Vehicle computing system" (车辆用运算系统), was created by 堀笼大介, 坂下真介, 石桥真人, 宝神永一, 三谷明弘 and 土山净之 on 2020-03-03. Its main content is as follows: The vehicle arithmetic system includes a single information processing unit (1). The information processing unit (1) includes a vehicle exterior environment estimation unit (10), a route generation unit (30), and a target motion determination unit (40). The vehicle exterior environment estimation unit (10) receives the output of a sensor that acquires vehicle exterior environment information and estimates a vehicle exterior environment including a road and an obstacle; the route generation unit (30) generates, based on the output of the vehicle exterior environment estimation unit (10), a travel route on which the vehicle avoids the estimated obstacle on the estimated road; the target motion determination unit (40) determines a target motion of the vehicle when the vehicle travels along the travel route generated by the route generation unit (30).

1. A vehicle arithmetic system that is mounted on a vehicle and executes an arithmetic operation for controlling the travel of the vehicle, characterized in that:

the vehicle arithmetic system includes a single information processing unit,

the information processing unit includes a vehicle exterior environment estimation unit, a route generation unit, and a target motion determination unit,

the vehicle exterior environment estimation unit receives an output of a sensor that acquires vehicle exterior environment information and estimates a vehicle exterior environment including a road and an obstacle,

the route generation unit generates a travel route for the vehicle to avoid the estimated obstacle on the estimated road, based on the output of the vehicle exterior environment estimation unit,

the target motion determination unit determines, based on the output of the route generation unit, a target motion of the vehicle when the vehicle travels along the travel route generated by the route generation unit.

2. The vehicle arithmetic system according to claim 1, characterized in that:

the information processing unit includes an energy management unit,

the energy management unit calculates a driving force, a braking force, and a steering angle for achieving the target motion determined by the target motion determination unit.

3. The vehicle arithmetic system according to claim 2, characterized in that:

the energy management unit compares the calculated driving force, braking force, and steering angle with a vehicle energy model, and generates an operation signal for each actuator so as to generate the driving force, braking force, and steering angle.

4. The vehicle arithmetic system according to claim 1, characterized in that:

the information processing unit includes a driver state estimation unit,

the driver state estimation unit receives an output of a sensor that measures a driver state and estimates the driver state including at least one of physical behavior and a health state,

the route generation unit generates a route that matches the driver state estimated by the driver state estimation unit.

5. The vehicle arithmetic system according to claim 4, characterized in that:

the driver state estimation unit estimates the driver state by comparing the output of the sensor that measures the driver state with a human model.

6. The vehicle arithmetic system according to claim 4, characterized in that:

the target motion determination unit determines, using the output of the driver state estimation unit, a target motion including a planar motion of the vehicle and a change in the posture of the vehicle body in the vertical direction when the vehicle travels along the travel route generated by the route generation unit.

7. The vehicle arithmetic system according to claim 1, characterized in that:

the vehicle exterior environment estimation unit estimates the vehicle exterior environment by comparing three-dimensional information of the vehicle surroundings, which is obtained from an output of a sensor that acquires vehicle exterior environment information, with a vehicle exterior environment model.

8. The vehicle arithmetic system according to claim 1, characterized in that:

the target motion determination unit estimates, with reference to a vehicle six-axis model obtained by modeling the acceleration of the traveling vehicle in three directions, namely the front-rear, left-right, and vertical directions, and the angular velocity about three axes, namely pitch, roll, and yaw, the planar motion and the change in the vehicle body posture in the vertical direction that occur when the vehicle travels along the travel route generated by the route generation unit, and determines the estimated planar motion and change in the vehicle body posture in the vertical direction as the target motion of the vehicle.

Technical Field

The technology disclosed herein relates to a vehicular arithmetic system used for, for example, automatic driving of a vehicle.

Background

Patent document 1 discloses a control system that controls a plurality of in-vehicle devices, such as an engine and a steering gear, mounted on a vehicle. The control system has a hierarchical configuration of a collective control unit, domain control units, and device control units for controlling the plurality of in-vehicle devices.

Patent document 1: Japanese Laid-Open Patent Publication No. 2017-061278

Disclosure of Invention

Technical problems to be solved by the invention

In order to realize highly accurate automatic driving, it is necessary to control the movement of the vehicle not only in accordance with the environment around the vehicle but also in accordance with various information such as the driver state and the vehicle state by comprehensive judgment. Therefore, it is necessary to process a large amount of data from a camera, a sensor, an off-vehicle network, or the like at high speed, determine the optimum motion of the vehicle at every moment, and operate each actuator.

The technology disclosed herein was conceived to solve the above problems, and its object is to provide a vehicle arithmetic system that realizes highly accurate automatic driving.

Technical solution for solving technical problem

Specifically, the technology disclosed herein is a vehicle arithmetic system that is mounted on a vehicle and executes arithmetic operations for controlling the travel of the vehicle. The vehicle arithmetic system includes a single information processing unit, which includes: a vehicle exterior environment estimation unit that receives an output of a sensor that acquires vehicle exterior environment information and estimates a vehicle exterior environment including a road and an obstacle; a route generation unit that generates, based on the output of the vehicle exterior environment estimation unit, a travel route on which the vehicle avoids the estimated obstacle on the estimated road; and a target motion determination unit that determines a target motion of the vehicle when the vehicle travels along the travel route generated by the route generation unit.

According to this configuration, in the vehicle arithmetic system, the single information processing unit includes the vehicle exterior environment estimation unit, the route generation unit, and the target motion determination unit. The vehicle exterior environment estimation unit receives the output of a sensor that acquires vehicle exterior environment information and estimates the vehicle exterior environment including a road and an obstacle; the route generation unit generates, based on the output of the vehicle exterior environment estimation unit, a travel route on which the vehicle avoids the estimated obstacle on the estimated road; and the target motion determination unit determines a target motion of the vehicle when the vehicle travels along the travel route generated by the route generation unit. That is, the functions of vehicle exterior environment estimation, route generation, and target motion determination are all realized by an information processing unit implemented in a single piece of hardware. This makes it possible to optimally control the functions as a whole while realizing high-speed data transmission between them. Therefore, by integrating the processing for automatic driving into a single information processing unit, highly accurate automatic driving can be realized.

Further, the following may be possible: the information processing unit includes an energy management unit that calculates a driving force, a braking force, and a steering angle for achieving the target motion determined by the target motion determination unit.

According to this configuration, the vehicle exterior environment estimation, the route generation, the target motion determination, and the energy management can all be realized by an information processing unit implemented in a single piece of hardware. The vehicle arithmetic system can thereby control the vehicle motion with high accuracy in accordance with the environment around the vehicle. Further, by integrating the processing for automatic driving into a single information processing unit, highly accurate automatic driving that takes vehicle behavior and energy consumption into account can be realized.

Further, the following may be possible: the energy management unit compares the calculated driving force, braking force, and steering angle with a vehicle energy model, and generates operation signals for the respective actuators so as to generate the driving force, braking force, and steering angle.

According to this configuration, in the vehicle arithmetic system, the energy management unit can generate the operation signal to each actuator based on the output of the target motion determination unit.

Further, it may be such that: the information processing unit includes a driver state estimation unit that receives an output of a sensor that measures a driver state and estimates the driver state including at least one of physical behavior and a health state, and the route generation unit generates a route that matches the driver state estimated by the driver state estimation unit.

According to this configuration, the vehicle exterior environment estimation, the route generation, the target motion determination, and the driver state estimation can all be realized by an information processing unit implemented in a single piece of hardware. The route generation unit generates a route that matches the driver state estimated by the driver state estimation unit. In this way, the motion of the vehicle can be controlled through a comprehensive determination based not only on the environment around the vehicle but also on the driver's state.

Further, the following may be possible: the driver state estimation unit estimates the driver state by comparing the output of the sensor that measures the driver state with a human model.

According to this configuration, the driver state estimation unit receives the output of a sensor that measures the driver state, such as a camera disposed in the vehicle interior, and estimates the driver state using the human model. In this way, by performing a comprehensive determination based not only on the environment around the vehicle but also on the driver's state, the motion of the vehicle can be controlled more accurately.

Further, it may be such that: the target motion determination unit determines, using the output of the driver state estimation unit, a target motion including a planar motion of the vehicle and a change in the posture of the vehicle body in the vertical direction when the vehicle travels along the travel route generated by the route generation unit.

According to this configuration, the target motion of the vehicle is determined using not only the output of the route generation unit but also the output of the driver state estimation unit. In this way, the comprehensive determination based not only on the environment around the vehicle but also on the driver state is applied not only to the generation of the route but also to the determination of the target motion.

Further, it may be such that: the vehicle exterior environment estimation unit estimates the vehicle exterior environment by comparing three-dimensional information of the vehicle surroundings, which is obtained from an output of a sensor that acquires vehicle exterior environment information, with a vehicle exterior environment model.

According to this configuration, the vehicle exterior environment estimation unit receives the output of sensors that acquire vehicle exterior environment information, such as the camera and the radar mounted on the vehicle, and estimates the vehicle exterior environment including the road and the obstacle by comparing the three-dimensional information around the vehicle with the vehicle exterior environment model. In this way, the motion of the vehicle can be accurately controlled by arithmetic processing using the vehicle exterior environment model.

Further, it may be such that: the target motion determination unit estimates, with reference to a vehicle six-axis model obtained by modeling the acceleration of the traveling vehicle in three directions, namely the front-rear, left-right, and vertical directions, and the angular velocity about three axes, namely pitch, roll, and yaw, the planar motion and the change in the vehicle body posture in the vertical direction that occur when the vehicle travels along the travel route generated by the route generation unit, and determines the estimated planar motion and change in the vehicle body posture in the vertical direction as the target motion of the vehicle.

According to this configuration, the motion of the vehicle can be accurately controlled by the arithmetic processing using the six-axis model of the vehicle.

Effects of the invention

According to the present disclosure, the functions of vehicle exterior environment estimation, route generation, and target motion determination can all be realized by an information processing unit implemented in a single piece of hardware. This makes it possible to optimally control the functions as a whole while realizing high-speed data transmission between them. Therefore, by integrating the processing for automatic driving into a single information processing unit, highly accurate automatic driving can be realized.

Drawings

Fig. 1 shows a functional configuration of a vehicle arithmetic system according to an embodiment;

fig. 2 shows a configuration example of an information processing unit;

fig. 3 shows a specific example of the vehicle actuator and its control device.

Detailed Description

Fig. 1 is a block diagram showing a functional configuration of a vehicle arithmetic system according to an embodiment. Fig. 2 is a configuration example of an information processing unit. As shown in fig. 1 and 2, the vehicle arithmetic system includes an information processing unit 1 mounted on a vehicle 2. The information processing unit 1 receives various signals and data related to the vehicle 2 as input, and performs arithmetic processing using a learned model generated by, for example, deep learning on the basis of the signals and data to determine the target motion of the vehicle 2. Then, an operation signal to each actuator 200 of the vehicle 2 is generated based on the determined target motion.
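The flow just described, from sensor inputs to a target-motion decision and on to actuator operation signals, can be sketched end to end as follows. This is a minimal illustrative sketch under stated assumptions; every function name and data structure here is hypothetical and is not specified by the patent.

```python
# Illustrative sketch of the information processing unit's data flow
# (all names are hypothetical stand-ins for the units in fig. 1).

def estimate_environment(sensor_frames):
    """Stand-in for the vehicle exterior environment estimation unit (10)."""
    return {"road": sensor_frames["camera"], "obstacles": sensor_frames["radar"]}

def generate_route(environment):
    """Stand-in for the route generation unit (30): keep only road
    positions that are not occupied by an estimated obstacle."""
    return [p for p in environment["road"] if p not in environment["obstacles"]]

def decide_target_motion(route):
    """Stand-in for the target motion determination unit (40)."""
    return {"route": route, "speed": 10.0 if route else 0.0}

def run_pipeline(sensor_frames):
    env = estimate_environment(sensor_frames)
    route = generate_route(env)
    return decide_target_motion(route)

# The obstacle cell (2) reported by the "radar" is excluded from the route.
motion = run_pipeline({"camera": [0, 1, 2, 3], "radar": [2]})
```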

In the configuration example of fig. 2, the information processing unit 1 includes a processor 3 and a memory 4. The memory 4 stores software, i.e., modules, that can be executed by the processor 3. The functions of the parts shown in fig. 1 are realized by the processor 3 executing the modules stored in the memory 4. Further, the memory 4 stores data representing each model shown in fig. 1. It should be noted that there may be a plurality of processors 3 and memories 4.

The functions of the information processing unit 1 may be implemented by a single chip or by a plurality of chips. When implemented by a plurality of chips, the plurality of chips may be mounted on the same substrate or may be mounted on different substrates. However, in the present embodiment, the information processing unit 1 is configured in a single housing.

< Example of inputs to the information processing unit >

The information processing unit 1 receives as input the outputs of cameras, sensors, and switches mounted on the vehicle, as well as signals and data from outside the vehicle. The inputs include, for example: the outputs of a camera 101, a radar 102, and the like mounted on the vehicle as sensors that acquire vehicle exterior environment information; a signal 111 of a positioning system such as GPS; data 112 for navigation transmitted from, for example, a network outside the vehicle; the output of a camera 120 or the like provided in the vehicle interior as a sensor that acquires driver information; the outputs of sensors 130 that detect the behavior of the vehicle; and the outputs of sensors 140 that detect the operation of the driver.

The camera 101 mounted on the vehicle captures the situation around the vehicle and outputs the captured image data. The radar 102 mounted on the vehicle emits radio waves to the surroundings of the vehicle and receives the reflected waves from objects. From the transmitted and received waves, the radar 102 also measures the distance from the vehicle to an object and the relative speed of the object with respect to the vehicle. Other examples of sensors for acquiring vehicle exterior environment information include a laser radar (lidar), an ultrasonic sensor, and the like.

The sensor for acquiring the driver information includes, for example, a biological information sensor such as a skin temperature sensor, a heart rate sensor, a blood flow sensor, or a sweat sensor, in addition to the camera 120 installed in the vehicle interior.

Examples of the sensors 130 for detecting the behavior of the vehicle include a vehicle speed sensor, an acceleration sensor, and a yaw rate sensor. The sensors 140 that detect the operation of the driver include, for example, a steering wheel angle sensor, an accelerator opening sensor, and a brake sensor.

< Example of outputs of the information processing unit >

The information processing unit 1 outputs an operation signal to a control device that controls each actuator 200 of the vehicle. Examples of the control device include an engine control device, a brake control device, and a steering control device. Each control device is realized by, for example, an ECU (Electronic Control Unit), and the information processing unit 1 and the ECUs are connected to each other through an on-vehicle network such as a CAN (Controller Area Network).

Fig. 3 is a diagram showing a specific example of the actuators. In fig. 3, 201 denotes an engine, 202 denotes a transmission, 203 denotes a brake, and 204 denotes a steering gear. A powertrain ECU 211, a DSC (Dynamic Stability Control) microcomputer 212, a brake microcomputer 213, and an EPAS (Electric Power-Assisted Steering) microcomputer 214 are examples of the control devices.

The information processing unit 1 calculates a driving force, a braking force, and a steering angle of the vehicle for achieving the determined target motion. For example, the powertrain ECU 211 controls the ignition timing and the fuel injection amount of the engine 201 according to the calculated driving force, and the EPAS microcomputer 214 controls the steering of the steering gear 204 based on the calculated steering angle.
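The division of labor described here, with the information processing unit computing driving force, braking force, and steering angle and the ECUs acting on them, can be sketched with a toy calculation. The mass and wheelbase values, the simple F = m·a split, and the bicycle-model steering formula are all assumptions for illustration; the patent does not specify how these quantities are computed.

```python
# Hypothetical conversion from a target motion to actuator-level commands.
import math

VEHICLE_MASS_KG = 1500.0   # assumed vehicle mass
WHEELBASE_M = 2.7          # assumed wheelbase

def actuator_commands(target_accel_mps2, path_curvature_1pm):
    """Split the target longitudinal acceleration into driving/braking
    force (F = m * a) and derive a steering angle from the path
    curvature using a kinematic bicycle model."""
    force = VEHICLE_MASS_KG * target_accel_mps2
    driving_force = max(force, 0.0)    # only one of the two is non-zero
    braking_force = max(-force, 0.0)
    steering_angle = math.atan(WHEELBASE_M * path_curvature_1pm)
    return driving_force, braking_force, steering_angle

# A 2 m/s^2 deceleration on a straight path becomes a pure braking command.
drive, brake, steer = actuator_commands(-2.0, 0.0)
```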

Other control devices include, for example, a vehicle body system microcomputer 221 that controls vehicle body devices such as airbags and doors, and a driver assistance HMI (Human Machine Interface) unit 223 that controls an in-vehicle display 222.

Next, the functional configuration of the information processing unit 1 shown in fig. 1 will be described in detail. The information processing unit 1 executes so-called model predictive control (MPC) in processing such as route generation. In short, model predictive control works as follows: an evaluation function that produces multivariate outputs from multivariate inputs is prepared in advance, and the evaluation function is solved as a convex optimization problem (multivariate analysis: a mathematical technique for efficiently solving multivariate problems), thereby extracting a well-balanced solution. A relational expression (referred to as a model) for obtaining the multivariate outputs from the multivariate inputs is first constructed by a designer based on the physical phenomenon concerned. The relational expression is then optimized by neural network learning (so-called unsupervised learning), or by adjusting it while observing the inputs and outputs from a statistical perspective.
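The model predictive control loop described above can be sketched minimally: a model predicts the outcome of each candidate input sequence over a short horizon, an evaluation function scores the prediction, and only the first input of the best sequence is applied before re-planning. A production system would solve this as a convex optimization; exhaustive sampling over a tiny candidate set is used here purely to keep the example self-contained, and all numbers are invented.

```python
# Minimal MPC sketch: speed tracking with a trivial integrator model.
import itertools

def predict(speed, accel_sequence, dt=0.5):
    """Model: integrate each candidate acceleration into future speeds."""
    speeds = []
    for a in accel_sequence:
        speed += a * dt
        speeds.append(speed)
    return speeds

def cost(speeds, accels, target=15.0):
    """Evaluation function balancing tracking error against control effort."""
    tracking = sum((v - target) ** 2 for v in speeds)
    effort = sum(a ** 2 for a in accels)
    return tracking + 0.1 * effort

def mpc_step(speed, horizon=3, candidates=(-2.0, 0.0, 2.0)):
    """Score every candidate input sequence, apply only the first input."""
    best = min(itertools.product(candidates, repeat=horizon),
               key=lambda seq: cost(predict(speed, seq), seq))
    return best[0]

# Below the 15 m/s target, the best-balanced first input is to accelerate.
accel = mpc_step(10.0)
```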

When the vehicle leaves the factory, models developed by the manufacturer are installed on the vehicle. The installed models can also be optimized into models suited to the individual user in accordance with the driving situations of the vehicle user. Alternatively, the models may be updated through software updates by a dealer or the like.

Here, the outputs of the camera 101 and the radar 102 mounted on the vehicle are transmitted to the vehicle exterior environment estimation unit 10. A signal 111 of a positioning system such as a GPS and data 112 for navigation, for example, transmitted from a network outside the vehicle are transmitted to the route searching unit 61. The output of the camera 120 provided in the vehicle interior is transmitted to the driver state estimating unit 20. The output of the sensors 130 that detect the vehicle behavior is sent to the vehicle state measuring unit 62. The output of the sensor 140 that detects the operation of the driver is transmitted to the driver operation sensing unit 63.

< Vehicle exterior environment estimation unit >

The vehicle exterior environment estimation unit 10 receives the outputs of the camera 101, the radar 102, and the like mounted on the vehicle, and estimates the vehicle exterior environment. The vehicle exterior environment to be estimated includes at least a road and an obstacle. Here, the vehicle exterior environment estimation unit 10 estimates the vehicle exterior environment including the road and the obstacle by comparing three-dimensional information around the vehicle, obtained from the data of the camera 101 and the radar 102, with the vehicle exterior environment model 15. The vehicle exterior environment model 15 is a learned model generated by, for example, deep learning, and can recognize a road, an obstacle, and the like from three-dimensional information around the vehicle.

For example, the object recognition/map generation unit 11 performs image processing on the image captured by the camera 101 to identify free space, that is, areas where no object is present, from the image. The image processing here uses a learned model generated by, for example, deep learning. A two-dimensional map representing the free space is then generated. The object recognition/map generation unit 11 also acquires information on persons and objects present in the vicinity of the vehicle from the output of the radar 102. This information includes the positions, speeds, and the like of the persons and objects.

The estimation unit 12 combines the two-dimensional map output by the object recognition/map generation unit 11 with the information on persons and objects to generate a three-dimensional map showing the situation around the vehicle. Here, information on the installation position and imaging direction of the camera 101 and on the installation position and emission direction of the radar 102 is used. The estimation unit 12 estimates the vehicle exterior environment including the road and the obstacle by comparing the generated three-dimensional map with the vehicle exterior environment model 15.
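The fusion step performed by the estimation unit 12 can be sketched in simplified form: a camera-derived free-space map is combined with radar object information, here as a flat grid of cells rather than a true three-dimensional map. All data structures and names are illustrative assumptions, not the patent's representation.

```python
# Illustrative fusion of a camera free-space map with radar object tracks.

def fuse(free_space_map, radar_objects):
    """Mark each radar-detected object on a copy of the camera map.

    free_space_map: dict mapping (x, y) cells to True when drivable.
    radar_objects:  list of dicts with a cell position and relative velocity.
    """
    fused = dict(free_space_map)          # do not mutate the camera map
    for obj in radar_objects:
        fused[obj["position"]] = False    # an occupied cell is not drivable
    return fused

camera_map = {(0, 0): True, (0, 1): True, (1, 0): True, (1, 1): True}
objects = [{"position": (1, 1), "velocity": (-1.0, 0.0)}]
fused_map = fuse(camera_map, objects)
# Cell (1, 1) becomes non-drivable because the radar reports an object there.
```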

< Driver state estimation unit >

The driver state estimation unit 20 estimates the driver's health state, emotion, or physical behavior from an image captured by a camera 120 provided in the vehicle interior. Examples of the health state include healthy, mildly fatigued, in poor physical condition, and reduced consciousness. Examples of the emotion include pleasant, normal, bored, impatient, and unpleasant.

For example, the driver state measurement unit 21 extracts a face image of the driver from an image captured by the camera 120 provided in the vehicle interior and identifies the driver. The extracted face image and the identified driver information are provided as inputs to the human model 25. The human model 25 is a learned model generated by, for example, deep learning, and outputs health state and emotion information from the face image of each person who may be the driver of the vehicle. The estimation unit 22 outputs the driver's health state and emotion information output by the human model 25.

When a biological information sensor such as a skin temperature sensor, a heart rate sensor, a blood flow sensor, or a sweat sensor is used as the sensor for acquiring driver information, the driver state measurement unit 21 measures the biological information of the driver from the output of the biological information sensor. In this case, the human model 25 takes the biological information as input and outputs health state and emotion information for each person who may be the driver of the vehicle. The estimation unit 22 outputs the driver's health state and emotion information output by the human model 25.

In addition, the human model 25 may be a model that estimates, for each person who may be the driver of the vehicle, the emotion that the person holds toward the behavior of the vehicle. In this case, the model can be constructed by managing, in chronological order, the outputs of the sensors 130 that detect the behavior of the vehicle, the outputs of the sensors 140 that detect the operation of the driver, the biological information of the driver, and the estimated emotional state. The model can predict, for example, the relationship between the driver's feeling of excitement (arousal level) and the vehicle behavior.

The driver state estimation unit 20 may use a human body model as the human model 25. The human body model specifies, for example, the mass of the head (for example, 5 kg) and the muscular strength of the neck that withstands G forces in the front-rear and left-right directions. When a vehicle body motion (acceleration G, jerk) is input, the human body model outputs the expected physical information and subjective information of the occupant. The occupant's physical information is, for example, very comfortable/moderate/unpleasant, and the subjective information is, for example, unexpected/predictable. By referring to the human body model, a vehicle body behavior in which the head tilts slightly backward, for example, feels unpleasant to the occupant, so such a travel route can be left unselected. On the other hand, a vehicle body behavior in which the head leans forward as in a bow leads the occupant to take a posture resisting the behavior and is not immediately unpleasant, so such a travel route can be selected. Alternatively, by referring to the human body model, the target motion can be determined so as to keep the occupant's head from swaying, or so as to give the occupant a brisk, lively feeling.
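The way the human body model can veto or permit candidate behaviors might be sketched as a comfort penalty that, as described above, penalizes backward head tilt more heavily than forward lean. The weights and the penalty form are invented for illustration; the patent gives only the qualitative behavior.

```python
# Toy occupant-comfort score in the spirit of the human body model.

def comfort_penalty(accel_mps2, jerk_mps3):
    """Higher value = less comfortable. Positive longitudinal acceleration
    pushes the head backward (penalized strongly); negative acceleration
    pitches it forward, which occupants can brace against (penalized
    weakly). Jerk is always unpleasant."""
    if accel_mps2 > 0:
        head_term = 2.0 * accel_mps2    # backward tilt: immediately unpleasant
    else:
        head_term = 0.5 * -accel_mps2   # forward lean: tolerated longer
    return head_term + abs(jerk_mps3)

def prefer(route_a, route_b):
    """Pick the (accel, jerk) motion profile that is more comfortable."""
    return min((route_a, route_b), key=lambda r: comfort_penalty(*r))

# Equal-magnitude profiles: gentle braking beats acceleration of the head.
best = prefer((1.0, 0.2), (-1.0, 0.2))
```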

< Route search unit >

The route search unit 61 searches for a wide area route of the vehicle using a signal 111 of a positioning system such as a GPS and data 112 for navigation transmitted from a network outside the vehicle, for example.

< Vehicle state measurement unit >

The vehicle state measurement unit 62 measures the state of the vehicle based on the outputs of the sensors 130 that detect the behavior of the vehicle, such as a vehicle speed sensor, an acceleration sensor, and a yaw rate sensor. Then, an in-vehicle environment model 65 representing the in-vehicle environment is generated. The environment in the vehicle includes physical quantities that affect the body of the occupant, such as humidity, temperature, vibration, and acoustic noise. The in-vehicle environment estimation unit 64 estimates and outputs an in-vehicle environment from the in-vehicle environment model 65.

< Driver operation sensing unit >

The driver operation sensing unit 63 senses the operation of the driver based on the output of the sensors 140 that detect the operation of the driver, such as a steering wheel angle sensor, an accelerator opening sensor, and a brake sensor.

< Route generation unit >

The route generation unit 30 generates a travel route of the vehicle based on the output of the vehicle exterior environment estimation unit 10 and the output of the route search unit 61. For example, the route generation unit 30 generates a travel route that avoids the obstacle estimated by the vehicle exterior environment estimation unit 10 on the road estimated by the vehicle exterior environment estimation unit 10. The output of the vehicle exterior environment estimation unit 10 includes, for example, lane information relating to the lane in which the vehicle travels. The lane information includes information on the shape of the lane itself and information on objects on the lane. The information on the shape of the lane includes the shape of the lane (straight or curved, and the curvature of the curve), the width of the roadway, the number of lanes, the width of each lane, and the like. The information on an object includes the relative position and relative speed of the object with respect to the vehicle, the attributes (kind, moving direction) of the object, and the like. The kinds of objects include, for example, a vehicle, a pedestrian, a road, and a lane line.

Here, the route generation unit 30 calculates a plurality of candidate routes by a state lattice method and selects one or more candidate routes from among them based on the route cost of each candidate route. However, other route generation methods may be used.

The route generation unit 30 sets a virtual grid region on the roadway based on the lane information. The grid region has a plurality of grid points, which determine positions on the roadway. The route generation unit 30 sets a predetermined grid point as the target arrival position using the output of the route search unit 61. A plurality of candidate routes are then calculated by a route search over the grid points in the grid region. In the state lattice method, a route branches from a grid point to arbitrary grid points ahead of it in the vehicle traveling direction, so each candidate route is set so as to pass through a plurality of grid points in sequence. Each candidate route also includes time information indicating the time of passing each grid point, speed information on the speed, acceleration, and the like at each grid point, information on the motion of other vehicles, and so on.

The route generation unit 30 selects one or more travel routes from the plurality of candidate routes according to the route cost. The path cost here includes, for example, the degree of lane centering, the acceleration of the vehicle, the steering angle, the possibility of collision, and the like. When the route generation unit 30 selects a plurality of travel routes, the target motion determination unit 40 and the energy management unit 50, which will be described later, select one travel route.
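The state lattice search and route-cost selection described above can be sketched as follows. Lane positions stand in for grid points, a one-lane-per-step transition rule stands in for the branching rule, and the cost combines lane-centering, steering, and collision-possibility terms. The grid layout, transition rule, and cost weights are all illustrative assumptions.

```python
# Sketch of candidate-route enumeration over a lattice and cost selection.
import itertools

LANES = (0, 1, 2)   # lateral grid positions; lane 1 is the lane center
STEPS = 3           # longitudinal grid points ahead of the vehicle

def candidate_routes(start_lane=1):
    """Enumerate lane sequences reachable one lane-change per step."""
    routes = []
    for seq in itertools.product(LANES, repeat=STEPS):
        path = (start_lane,) + seq
        if all(abs(a - b) <= 1 for a, b in zip(path, path[1:])):
            routes.append(path)
    return routes

def route_cost(path, obstacles=frozenset()):
    """Cost terms: stay centered, steer little, never hit an obstacle."""
    centering = sum(abs(lane - 1) for lane in path)
    steering = sum(abs(a - b) for a, b in zip(path, path[1:]))
    collision = sum(100 for i, lane in enumerate(path) if (i, lane) in obstacles)
    return centering + steering + collision

def best_route(obstacles):
    return min(candidate_routes(), key=lambda p: route_cost(p, obstacles))

# An obstacle in the center lane two steps ahead forces a cheap detour.
route = best_route(frozenset({(2, 1)}))
```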

< target motion determination unit >

The target motion determination unit 40 determines the target motion for the travel route selected by the route generation unit 30. The target motion refers to the steering and acceleration/deceleration for tracking the travel route. The target motion determination unit 40 refers to the vehicle six-axis model 45 and calculates the vehicle body motion for the travel route selected by the route generation unit 30.

Here, the vehicle six-axis model 45 models the accelerations of the traveling vehicle along the three axes "front-back", "left-right", and "up-down", and the angular velocities about the three axes "pitch", "roll", and "yaw". That is, instead of capturing the motion of the vehicle only on the plane of classical vehicle dynamics (the front-back and left-right (X-Y) movement and the yaw movement about the Z axis), the model also captures the pitch (about the Y axis) and roll (about the X axis) movements of the vehicle body mounted on the four wheels via the suspension, and the movement along the Z axis (the up-and-down movement of the vehicle body). In other words, it is a numerical model that reproduces the vehicle behavior using six axes in total.
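The six quantities the model tracks can be laid out as a simple state container. This is only a sketch of the state the text describes, not the model itself; the field names and units (m/s² and rad/s) are assumptions. The helper shows the distinction the text draws: a classical planar model sees only X-Y translation and yaw.

```python
from dataclasses import dataclass

@dataclass
class SixAxisState:
    """Vehicle body state along the six axes named in the text:
    translational acceleration on X (front-back), Y (left-right),
    Z (up-down), and angular velocity about X (roll), Y (pitch),
    Z (yaw)."""
    ax: float          # longitudinal acceleration [m/s^2]
    ay: float          # lateral acceleration [m/s^2]
    az: float          # vertical (heave) acceleration [m/s^2]
    roll_rate: float   # angular velocity about X [rad/s]
    pitch_rate: float  # angular velocity about Y [rad/s]
    yaw_rate: float    # angular velocity about Z [rad/s]

def is_planar_only(s: SixAxisState, eps: float = 1e-9) -> bool:
    """True if the state could be captured by a classical planar model
    (X-Y translation plus yaw only), i.e. heave, roll, and pitch are zero."""
    return abs(s.az) < eps and abs(s.roll_rate) < eps and abs(s.pitch_rate) < eps

# During cornering, lateral acceleration couples with roll and heave,
# which is exactly what the six-axis model captures and a planar model misses
cornering = SixAxisState(ax=0.0, ay=3.0, az=0.2,
                         roll_rate=0.05, pitch_rate=0.01, yaw_rate=0.3)
```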

The target motion determination unit 40 calculates the vehicle body motion by referring to the vehicle six-axis model 45 and determines the target motion using the calculation result. That is, the target motion determination unit 40 refers to the vehicle six-axis model 45 to estimate the planar motion and the change in vehicle body posture in the vertical direction that occur when the vehicle travels along the travel route generated by the route generation unit 30, and adopts the estimated planar motion and posture change as the target motion of the vehicle. In this way, for example, a so-called diagonal roll state can be produced during cornering.

For example, the target motion determination unit 40 may input the vehicle body motion (acceleration G, jerk) calculated by referring to the vehicle six-axis model 45 into a human body model to estimate the expected physical state and subjective state of the occupant. Further, for example, when the route generation unit 30 has selected a plurality of travel routes, the target motion determination unit 40 may select one travel route based on the estimated physical and subjective states of the occupant.

When the driver operation sensing unit 63 senses an operation by the driver, the target motion determination unit 40 determines the target motion according to the driver's operation instead of according to the travel route selected by the route generation unit 30.
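That priority rule can be expressed as a small decision function. The function name and the returned tags below are illustrative placeholders, assuming the sensed driver operation arrives as an optional value (absent when no operation is detected).

```python
def decide_target_motion(driver_operation, selected_route):
    """Priority rule described in the text: when the driver operation
    sensing unit reports an operation, the target motion follows the
    driver's operation; otherwise it follows the selected travel route.
    The returned tags stand in for the actual motion plans."""
    if driver_operation is not None:
        return ("follow_driver", driver_operation)
    return ("follow_route", selected_route)

autonomous = decide_target_motion(None, "route_A")
override = decide_target_motion({"steering_deg": 5.0}, "route_A")
```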

< energy management unit >

The energy management unit 50 calculates the driving force, braking force, and steering angle for realizing the target motion determined by the target motion determination unit 40. It then generates operation signals to the respective actuators 200 so as to produce the calculated driving force, braking force, and steering angle.

For example, the vehicle kinetic energy operation unit 51 calculates the physical quantities, such as torque, that the drive system (engine, motor, transmission), the steering system (steering), and the braking system (brake) must generate to achieve the target motion determined by the target motion determination unit 40. The control amount calculation unit 52 calculates the control amount for each actuator so as to maximize energy efficiency while achieving the target motion determined by the target motion determination unit 40. For example, it calculates the opening and closing timing of the intake and exhaust valves, the fuel injection timing of the injectors, and so on that minimize fuel consumption while realizing the engine torque determined by the vehicle kinetic energy operation unit 51. The energy management here uses a vehicle thermal model 55 and a vehicle energy model 56. For example, the calculated physical quantities are compared against the vehicle energy model 56, and the operation amounts of the actuators are allocated so as to reduce the energy consumption.
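The comparison against the energy model can be sketched as picking, among operating points that all deliver the required torque, the one the model predicts to consume the least energy. The candidate operating points and the toy energy model below are entirely illustrative assumptions, not parameters from the source.

```python
def pick_min_energy(candidates, energy_model):
    """Choose the actuator operating point with the lowest predicted
    energy consumption among candidates that all achieve the required
    torque, mimicking the comparison against the vehicle energy model
    described in the text."""
    return min(candidates, key=energy_model)

# Hypothetical engine operating points, each assumed to deliver the
# torque determined by the vehicle kinetic energy operation unit
candidates = [
    {"valve_timing_deg": 10, "injection_ms": 2.4},
    {"valve_timing_deg": 15, "injection_ms": 2.1},
    {"valve_timing_deg": 20, "injection_ms": 2.3},
]

def fuel_per_cycle(op):
    # Toy stand-in for the vehicle energy model 56:
    # fuel consumption taken as proportional to injection duration
    return op["injection_ms"]

best = pick_min_energy(candidates, fuel_per_cycle)
```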

Specifically, for example, the energy management unit 50 calculates the operating conditions that minimize energy loss while achieving, on the travel route selected by the route generation unit 30, the target motion determined by the target motion determination unit 40. For example, the energy management unit 50 calculates the running resistance of the vehicle on the travel route selected by the route generation unit 30 and obtains the loss along that route. The running resistance includes tire friction, drivetrain loss, and air resistance. The energy management unit 50 then determines the operating conditions for generating the driving force needed to overcome this loss: for example, the injection and ignition timing that minimize fuel consumption in the internal combustion engine, the shift pattern with less energy loss in the transmission, and the operating condition of the lock-up mechanism of the torque converter. Alternatively, when deceleration is requested, it calculates the combination of the pedal-brake amount and engine-brake amount from the vehicle model, and the regenerative energy from the regeneration model of the drive-assist motor, that realizes the deceleration curve, and thereby finds the operating conditions that minimize energy loss.
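The running-resistance calculation can be sketched with the three loss terms the text names. All coefficient values below (rolling resistance, drag area, air density, drivetrain loss fraction) are illustrative assumptions for a generic passenger car, not figures from the source, and drivetrain loss is modelled crudely as a fraction of the wheel force.

```python
def running_resistance(v, mass, c_rr=0.012, cda=0.7, rho=1.2,
                       drivetrain_loss=0.1):
    """Driving force [N] needed to overcome the losses named in the
    text for a vehicle at speed v [m/s] on level ground:
    tire rolling friction, air resistance, and drivetrain loss."""
    g = 9.81
    rolling = c_rr * mass * g          # tire friction
    aero = 0.5 * rho * cda * v ** 2    # air resistance
    wheel_force = rolling + aero
    # Drivetrain loss approximated as a fixed fraction of the wheel force
    return wheel_force * (1.0 + drivetrain_loss)

# Force needed to hold 100 km/h on level ground for a 1500 kg car
force = running_resistance(v=100 / 3.6, mass=1500)
```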

Then, the energy management unit 50 generates an operation signal for each actuator 200 based on the obtained operating condition, and outputs the operation signal to the control device of each actuator 200.

As described above, according to the vehicle arithmetic system of the present embodiment, the information processing unit 1 includes the vehicle exterior environment estimation unit 10, the route generation unit 30, and the target motion determination unit 40. The vehicle exterior environment estimation unit 10 receives the output of a sensor that acquires vehicle exterior environment information and estimates the vehicle exterior environment; the route generation unit 30 generates a route for the vehicle based on the output of the vehicle exterior environment estimation unit 10; and the target motion determination unit 40 determines the target motion of the vehicle based on the output of the route generation unit 30. That is, the functions of vehicle exterior environment estimation, route generation, and target motion determination are all realized by the information processing unit 1, which is configured as a single piece of hardware.

In this way, the functions can be controlled optimally as a whole while data is transmitted between them at high speed. For example, in a configuration in which each function is realized by a separate ECU, inter-ECU communication would be required to exchange a large amount of data between the functions, yet the communication speed of in-vehicle networks currently in use (CAN, Ethernet (registered trademark)) is only about 2 Mbps to 100 Mbps. In contrast, the information processing unit 1, configured as a single piece of hardware, can realize data transmission speeds of several Gbps to several tens of Gbps.
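The bandwidth gap can be made concrete with a small calculation. The 10 MB payload per frame is an illustrative assumption; only the link speeds restate figures from the text, and protocol overhead is ignored.

```python
def transfer_time_ms(payload_mb, link_mbps):
    """Time [ms] to move a payload over a link, ignoring protocol
    overhead: megabytes -> megabits, divided by link speed in Mbps."""
    return payload_mb * 8 / link_mbps * 1000

# Moving a hypothetical 10 MB of environment data per processing frame:
can_like = transfer_time_ms(10, 100)     # 100 Mbps in-vehicle network -> 800 ms
on_chip = transfer_time_ms(10, 10_000)   # 10 Gbps inside one unit -> 8 ms
```

At in-vehicle network speeds the transfer alone would dwarf a typical control cycle, whereas inside a single unit it fits comfortably, which is the argument the text makes for integration.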

Therefore, by integrating the processing for automatic driving into a single information processing unit 1, automatic driving with high accuracy can be realized.

In the present embodiment, the information processing unit 1 further includes the energy management unit 50. That is, vehicle exterior environment estimation, route generation, target motion determination, and energy management are all realized by the information processing unit 1, which is configured as a single piece of hardware. Therefore, by integrating the processing for automatic driving into the single information processing unit 1, automatic driving that takes both the vehicle behavior and the energy consumption into account can be achieved with high accuracy.

(examples of other controls)

The route generation unit 30 may also generate the travel route of the vehicle using the output of the driver state estimation unit 20. For example, the driver state estimation unit 20 outputs data indicating the driver's emotion to the route generation unit 30, and the route generation unit 30 selects a travel route using that data. For example, when the emotion is "enjoying", a route on which the behavior of the vehicle is smooth is selected; when the emotion is "bored", a route on which the behavior of the vehicle changes more dynamically is selected.

Alternatively, the route generation unit 30 may refer to the human model 25 included in the driver state estimation unit 20 and select, from among the plurality of candidate routes, the route that most heightens the driver's emotion (e.g., raises wakefulness).

When the route generation unit 30 determines, from the vehicle exterior environment estimated by the vehicle exterior environment estimation unit 10, that the vehicle is in danger, the route generation unit 30 may generate an emergency danger-avoidance route regardless of the driver's state. When the route generation unit 30 determines, based on the output of the driver state estimation unit 20, that the driver is unable to drive or has difficulty driving (for example, the driver has lost consciousness), the route generation unit 30 may generate a route for moving the vehicle to a safe place.

When the target motion determination unit 40 determines, based on the output of the driver state estimation unit 20, that the driver is unable to drive or has difficulty driving (for example, the driver has lost consciousness), the target motion determination unit 40 may determine the target motion so as to move the vehicle to a safe place. In this case, the route generation unit 30 generates a plurality of travel routes including a route for moving the vehicle to a safe place, and when the target motion determination unit 40 determines that the driver is unable to drive or has difficulty driving, it may select the route for moving the vehicle to the safe place (switching from manual driving to automatic driving).

(other embodiments)

In the above-described embodiment, the target motion of the vehicle is decided by the single information processing unit 1 based on various signals and data relating to the vehicle, and the operation signals to the actuators 200 of the vehicle are generated based on the decided target motion. However, for example, the information processing unit 1 may perform the computation only up to the determination of the target motion, and another information processing unit may generate the operation signals to the actuators 200 of the vehicle. In this case, the single information processing unit 1 does not include the energy management unit 50; it decides the target motion of the vehicle based on various signals and data relating to the vehicle and outputs data indicating the decided target motion. The other information processing unit then receives the data output from the information processing unit 1 and generates the operation signals to the actuators 200 of the vehicle.

-description of symbols-

1 information processing unit

2 vehicle

10 vehicle exterior environment estimating unit

15 vehicle external environment model

20 driver state estimating unit

25 human model

30 route generation unit

40 target motion determination section

45 vehicle six-axis model

50 energy management unit

56 vehicle energy model