Vehicle control device and electronic control system


Abstract (designed and created by 坂本英之, 广津铁平, 植田泰辅, and 山本英达; dated 2020-01-15): The invention provides a vehicle control device and an electronic control system that are highly reliable and can transfer control safely even when an operation abnormality occurs in a sensor around the vehicle or in an operation block that performs sensor fusion processing. The vehicle control device comprises: a 1st operation block that performs sensor fusion processing based on raw data output from a plurality of external sensors; a 2nd operation block that performs sensor fusion processing based on object data generated by individually processing the raw data output from the plurality of external sensors; and a 3rd operation block that diagnoses the output result of the 1st operation block using the output result of the 1st operation block and the output result of the 2nd operation block.

1. A vehicle control device, characterized by comprising:

a 1st operation block that performs sensor fusion processing based on raw data output from a plurality of external sensors;

a 2nd operation block that performs sensor fusion processing based on object data generated by individually processing the raw data output from the plurality of external sensors; and

a 3rd operation block that diagnoses the output result of the 1st operation block using the output result of the 1st operation block and the output result of the 2nd operation block.

2. The vehicle control device according to claim 1, wherein

the 3rd operation block diagnoses the 1st operation block as abnormal when the group of objects around the vehicle output as the sensor fusion result of the 2nd operation block is not included in the group of objects around the vehicle output as the sensor fusion result of the 1st operation block.

3. The vehicle control device according to claim 1, wherein

the 2nd operation block is an operation block that performs sensor fusion processing based on object data generated by processing the raw data output from the plurality of external sensors, and

the 3rd operation block diagnoses the 1st operation block as normal when the group of objects around the vehicle output as the sensor fusion result of the 2nd operation block is included in the group of objects around the vehicle output as the sensor fusion result of the 1st operation block.

4. The vehicle control device according to claim 1, wherein

the 1st operation block includes a machine learning unit that receives the data output from the plurality of external sensors,

the 1st operation block outputs a raw data fusion result by inputting the raw data to the machine learning unit,

the 2nd operation block receives object data obtained by individually processing the raw data output from the plurality of external sensors, and outputs an object data fusion result by performing sensor fusion processing on the object data, and

the 3rd operation block diagnoses the raw data fusion result by treating the object data fusion result as correct.

5. The vehicle control device according to any one of claims 1 to 4, wherein

the 2nd operation block uses a lockstep microcomputer.

6. The vehicle control device according to claim 5, wherein

the lockstep microcomputer compares the data output from the plurality of external sensors, takes a majority vote, and performs object data fusion processing based on the adopted data.

7. The vehicle control device according to any one of claims 1 to 4, wherein

when the 3rd operation block determines that the 1st operation block is abnormal, vehicle trajectory tracking control is performed using a trajectory generated by a 2nd microcomputer based on the sensor fusion result output by the 2nd operation block.

8. The vehicle control device according to claim 7, wherein

the trajectory generated by the 2nd microcomputer is a retreat trajectory.

9. The vehicle control device according to claim 1, wherein

the 2nd operation block performs sensor fusion processing based on data preprocessed by an operation block within each external sensor.

10. The vehicle control device according to claim 1, wherein

the data input to the 2nd operation block is object identification data generated by the operation block within each external sensor.

11. The vehicle control device according to claim 1, comprising:

a 1st microcomputer having the 1st operation block; and

a 2nd microcomputer having the 2nd operation block and the 3rd operation block.

12. The vehicle control device according to claim 1, comprising:

a 1st microcomputer having the 1st operation block;

a 2nd microcomputer having the 2nd operation block; and

a 3rd microcomputer having the 3rd operation block.

13. The vehicle control device according to any one of claims 1 to 4, wherein

the power supplies of the 1st operation block and the 2nd operation block are independent of each other.

14. The vehicle control device according to claim 2 or 3, wherein

the 3rd operation block prohibits output of the sensor fusion result calculated by the 1st operation block when the 1st operation block is diagnosed as abnormal.

15. The vehicle control device according to claim 2 or 3, wherein

when the 3rd operation block diagnoses the 1st operation block as abnormal, the 1st operation block stops outputting the sensor fusion result calculated by the 1st operation block.

16. The vehicle control device according to claim 11 or 12, wherein

the 1st microcomputer and the 2nd microcomputer each generate a vehicle tracking trajectory based on the sensor fusion result.

17. The vehicle control device according to claim 11 or 12, wherein

the 1st microcomputer and the 2nd microcomputer each generate a vehicle tracking trajectory based on the sensor fusion result, and

the vehicle is controlled according to the vehicle tracking trajectory output from the 1st microcomputer as long as the 3rd operation block diagnoses the 1st operation block as normal.

18. The vehicle control device according to claim 4, wherein

the machine learning unit uses a deep neural network algorithm.

19. A vehicle control system, characterized by comprising:

a 1st arithmetic unit that performs sensor fusion processing based on raw data output from a plurality of external sensors;

a 2nd arithmetic unit that performs sensor fusion processing based on object data generated by individually processing the raw data output from the plurality of external sensors; and

a 3rd arithmetic unit that diagnoses the output result of the 1st arithmetic unit using the output result of the 1st arithmetic unit and the output result of the 2nd arithmetic unit.

Technical Field

The present invention relates to a vehicle control device and an electronic control system for an automatic driving system.

Background

In order to realize a high-level automatic driving system, an automatic driving ECU (Electronic Control Unit), the high-level control device that governs automatic driving, is required to have the following functions: an arithmetic processing unit (hereinafter referred to as a microcomputer) mounted inside the ECU detects objects around the vehicle body by performing sensor fusion processing, and determines a trajectory for the vehicle so that it does not contact surrounding objects. Here, sensor fusion processing is a technique for realizing a high-level recognition function that cannot be obtained from a single sensor, by comprehensively processing the detection data of two or more sensors with different detection principles, for example a camera image and a radar. In recent years, with the advancement of automatic driving, higher accuracy of external recognition and lower delay in sensor-side processing have been demanded. As a countermeasure, a control method has been studied in which the raw data output from the sensors is input to a neural network implemented in an arithmetic block of the automatic driving ECU, which performs sensor fusion processing and outputs object data. When this method is used, a means of diagnosing the appropriateness of the sensor fusion result obtained by such batch processing is required.

As such a technique, there is, for example, the technique described in Patent Document 1.

Prior Art Documents

Patent Document

Patent Document 1: Japanese Kohyo Publication No. 2017-513162

Disclosure of Invention

Problems to be solved by the invention

The technique of Patent Document 1 compares fused data based on the object data output from each peripheral sensor with the object data resulting from fusion of the raw data, and discards and recalculates the data if the difference is outside an error tolerance range.

Here, when the result of raw data fusion and the result of object data fusion are compared, the detectable objects may differ from each other because the two data processing methods differ. Depending on the scene, therefore, a difference may arise between the detected objects even though both outputs are normal. It is thus difficult to diagnose the reliability of each set of data by simply comparing the raw data fusion result with the object data fusion result. Patent Document 1 does not consider this point. The present invention has been made in view of the above problem, and its object is to diagnose, with high accuracy, the output data of the sensors in an automatic driving control system or of the arithmetic block performing sensor fusion processing.

Means for solving the problems

The present invention includes a plurality of means for solving the above problems. As one example, one embodiment of the present invention adopts the following configuration.

A vehicle control device comprising: a 1st operation block that performs sensor fusion processing based on raw data output from a plurality of external sensors; a 2nd operation block that performs sensor fusion processing based on object data generated by individually processing the raw data output from the plurality of external sensors; and a 3rd operation block that diagnoses the output result of the 1st operation block using the output result of the 1st operation block and the output result of the 2nd operation block.

Advantageous Effects of Invention

According to an aspect of the present invention, a vehicle control device and an electronic control system with improved safety can be realized.

Drawings

Fig. 1 is a diagram showing the internal configuration of the vehicle control device (1st ECU) 11 and the connections of the sensors and actuators in Embodiment 1.

Fig. 2 is a diagram showing the diagnostic method for the sensor fusion result in Embodiment 1.

Fig. 3 is a diagram showing the diagnosis and the control flow based on the diagnosis result in Embodiment 1.

Fig. 4 is a diagram showing the internal configuration of the vehicle control device (1st ECU) 11 and the connections of the sensors and actuators in Embodiment 2.

Fig. 5 is a diagram showing the internal configurations of the vehicle control device (1st ECU) 11 and the vehicle control device (2nd ECU) 12 and the connections of the sensors and actuators in Embodiment 3.

Detailed Description

Hereinafter, embodiments of the present invention will be described with reference to the drawings. The embodiments are merely examples for implementing the present invention and do not limit its technical scope. In the drawings, common components are given the same reference numerals.

(Configuration of the automatic driving system)

First, the configuration (not shown) of the automatic driving system (vehicle control system) to which the present invention is applied will be described. The automatic driving system of the present invention includes a vehicle control device that controls the behavior of the vehicle based on information obtained from external sensors.

In the present invention, the external sensors are assumed to be, for example, a camera, a radar, a laser radar (lidar), and the like, but other sensors may be used. In the present invention, the vehicle control system is connected to a group of actuators provided in the vehicle and controls the vehicle by driving the actuator group. The actuator group includes, for example, brake control, engine control, and power steering control, but is not limited to these.

(Embodiment 1)

Hereinafter, Embodiment 1 will be described with reference to Figs. 1 to 3.

Fig. 1 is a diagram showing the internal configuration of the vehicle control device (1st ECU) 11 and the connections of the sensors and actuators in Embodiment 1.

As described above, in order to realize automatic driving, a vehicle control device has in recent years been required to generate highly accurate trajectories by taking in sensor data of higher accuracy and greater information volume and performing sensor fusion.

A plurality of sensors are used to recognize the surroundings of the own vehicle: for example, a camera for obtaining image data, and a laser radar (lidar) and a radar for obtaining distance data. A camera is highly reliable for color-based recognition such as road markings, but has the problem that recognition is difficult in rain, fog, or strong sunlight.

A radar, on the other hand, is useful for detecting distance but has the problem of low resolution. Each sensor thus has its own advantages and disadvantages, and the surrounding environment cannot be recognized with a single sensor alone.

Here, sensor fusion is a technique for realizing a high-level recognition function that cannot be obtained from a single sensor by comprehensively processing the detection data of two or more sensors, for example a camera image and radar returns. Conventionally, high reliability has been ensured in sensor fusion by preprocessing each sensor's data (object data conversion) and then fusing the results (object data fusion). Recently, however, in order to recognize the surrounding environment at a higher level, the need to perform sensor fusion on the raw data obtained from each sensor has grown. This is described concretely below.

In the vehicle control device 11 of the present embodiment, a large amount of raw data that has not been preprocessed is input from each sensor. The microcomputer 111 included in the vehicle control device 11 performs raw data fusion processing based on the raw data input from each sensor. Specifically, the operation block 211 of the microcomputer 111 performs data fusion processing on the raw data 104, 204, and 304 obtained from the sensors to generate a raw data fusion result 701.

The vehicle control device 11 also receives object data from the sensors in addition to the raw data. The 1st sensor 1, the 2nd sensor 2, and the 3rd sensor 3 generate object data 103, 203, and 303 from their sensor data by means of the information processing units 102, 202, and 302 included in the respective sensors. The microcomputer 112 included in the vehicle control device 11 performs object data fusion processing based on the object data input from each sensor.

Specifically, the operation block 212 of the microcomputer 112 performs data fusion processing on the object data 103, 203, and 303 obtained from the respective sensors to generate an object data fusion result 702. The flow of fusion is illustrated below, using the following sensors as an example.

For example, the camera 1 is a sensor module incorporating the sensor 101 and the information processing unit 102. The raw data acquired by the sensor 101 is sent to the information processing unit 102 and to the operation block 211 in the 1st ECU. The information processing unit 102 also generates object data based on the raw data acquired from the sensor 101 and sends the object data 103 to the operation block 212 in the 1st ECU.

The radar 2 is a sensor module incorporating the sensor 201 and the information processing unit 202. The raw data acquired by the sensor 201 is sent to the information processing unit 202 and to the operation block 211 in the 1st ECU. The information processing unit 202 also generates object data 203 based on the raw data acquired from the sensor 201 and sends it to the operation block 212 in the 1st ECU.

The laser 3 is a sensor module incorporating the sensor 301 and the information processing unit 302. The raw data acquired by the sensor 301 is sent to the information processing unit 302 and to the operation block 211 in the 1st ECU. The information processing unit 302 also generates object data 303 based on the raw data acquired from the sensor 301 and sends it to the operation block 212 in the 1st ECU. Examples of the information processing unit include a microcomputer and an FPGA (Field Programmable Gate Array).

As described above, the raw data obtained from each sensor is input to the operation block 211. The operation block 211 fuses the raw data input from the sensors to generate a raw data fusion result 701 as data on surrounding objects. The operation block 211 therefore needs to be able to process a large amount of data in a batch. The raw data fusion result 701 is output by machine learning; that is, for example, the raw data from each sensor is input to an NN (Neural Network) or DNN (Deep Neural Network) provided in the operation block 211, which generates the raw data fusion result 701 by fusing the raw data of the sensors. Because the raw data fusion result 701 calculated by machine learning such as an NN or DNN contains a large amount of information, a highly accurate trajectory can be generated from it. The operation block 211 transmits the generated raw data fusion result 701 to the microcomputer 113. Further, the microcomputer 111 predicts the behavior of objects around the own vehicle based on the raw data fusion result 701, generates trajectory data 711 for the vehicle based on the result of the behavior prediction, and transmits the trajectory data 711 to the trajectory tracking control unit 214.
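The patent discloses no concrete network architecture or code for the operation block 211, so the following is only a minimal, hypothetical sketch of the kind of batch raw-data fusion described above: raw samples from three sensors are concatenated and passed through a toy two-layer network that emits per-cell object scores. All names, shapes, weights, and thresholds are illustrative placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def fuse_raw(camera_raw, radar_raw, lidar_raw, w1, w2):
    """Sketch of operation block 211: batch-fuse raw sensor streams with a toy NN."""
    x = np.concatenate([camera_raw, radar_raw, lidar_raw])  # batch the raw streams
    h = np.maximum(w1 @ x, 0.0)                             # hidden layer (ReLU)
    return 1.0 / (1.0 + np.exp(-(w2 @ h)))                  # sigmoid score per grid cell

camera_raw = rng.normal(size=640)   # e.g. a flattened image patch (placeholder)
radar_raw  = rng.normal(size=128)   # e.g. range-Doppler samples (placeholder)
lidar_raw  = rng.normal(size=256)   # e.g. a point-cloud slice (placeholder)

w1 = rng.normal(scale=0.05, size=(64, 640 + 128 + 256))  # untrained toy weights
w2 = rng.normal(scale=0.05, size=(16, 64))

result_701 = fuse_raw(camera_raw, radar_raw, lidar_raw, w1, w2)  # "raw data fusion result 701"
detected_cells = np.flatnonzero(result_701 > 0.5)  # cells treated as occupied
```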

The operation block 212 is mounted on the highly reliable microcomputer 112, which has a lockstep function. The lockstep microcomputer 112 has CPUs and memory, and its duplicated CPU subsystems are controlled by a dual control circuit so as to operate in complete clock-by-clock synchronization; its reliability is therefore high.
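Real lockstep is a hardware mechanism that runs duplicated cores in clock-level synchronization and compares them; as a loose software analogy only (not anything disclosed in the patent), the principle can be pictured as running the same computation on two channels and suppressing the output on any mismatch:

```python
def lockstep_run(fn, *args):
    """Software analogy of lockstep: duplicate the computation, compare, and
    suppress the output on mismatch (hardware does this clock by clock)."""
    out_a = fn(*args)  # channel A
    out_b = fn(*args)  # channel B (a physically duplicated core in hardware)
    if out_a != out_b:
        raise RuntimeError("lockstep mismatch: output suppressed")
    return out_a

assert lockstep_run(sum, [1, 2, 3]) == 6
```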

The microcomputer 112 processes the object data preprocessed in the camera 1, the radar 2, and the laser 3.

The object data 103, 203, and 303, preprocessed by the information processing units 102, 202, and 302 provided in the sensors such as the camera 1, the radar 2, and the laser 3, are fused by the operation block 212 to generate an object data fusion result 702.

The microcomputer 112 then transmits the generated object data fusion result 702 to the microcomputer 113. The microcomputer 112 also predicts the behavior of objects around the own vehicle based on the object data fusion result 702, generates trajectory data 712 for the own vehicle based on the result of the behavior prediction, and transmits the trajectory data 712 to the trajectory tracking control unit 214.
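The patent does not specify how the operation block 212 associates and merges the object lists; nearest-neighbour association within a distance gate is one common rule-based approach, sketched below under that assumption. The field names, the 2.0 m gate, and the averaging merge are all illustrative.

```python
import math

GATE_M = 2.0  # association gate in metres (illustrative value)

def associate(objs_a, objs_b):
    """Pair objects from two sensors whose positions fall within the gate."""
    pairs, used = [], set()
    for a in objs_a:
        best, best_d = None, GATE_M
        for j, b in enumerate(objs_b):
            d = math.hypot(a["x"] - b["x"], a["y"] - b["y"])
            if j not in used and d < best_d:
                best, best_d = j, d
        if best is not None:
            used.add(best)
            pairs.append((a, objs_b[best]))
    return pairs

def fuse_objects(camera_objs, radar_objs):
    """Sketch of operation block 212: merge associated detections by averaging."""
    return [{"x": (a["x"] + b["x"]) / 2, "y": (a["y"] + b["y"]) / 2}
            for a, b in associate(camera_objs, radar_objs)]

camera_objs = [{"x": 10.2, "y": 1.1}, {"x": 25.0, "y": -2.9}]
radar_objs  = [{"x": 10.5, "y": 1.0}, {"x": 24.6, "y": -3.1}]
print(fuse_objects(camera_objs, radar_objs))  # "object data fusion result 702"
```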

As described above, the present embodiment shows an example of a vehicle control device that includes the 1st microcomputer 111 on which the operation block 211 is mounted, the 2nd microcomputer 112 on which the operation block 212 is mounted, and the 3rd microcomputer 113 on which the operation block 213 for comparison diagnosis is mounted.

The raw data fusion result 701 obtained by the sensor fusion performed in the operation block 211 and the object data fusion result 702 obtained by the sensor fusion performed in the operation block 212 are diagnosed by the comparison diagnosis function provided in the operation block 213. Trajectory tracking control is then performed based on the diagnosis result of the operation block 213. The specific diagnosis will be described later.

The trajectory tracking control unit 214 generates a trajectory tracking control command based on the trajectory information calculated by the trajectory generation unit of the microcomputer 111 or that of the microcomputer 112. When the comparison diagnosis finds the operation block 211 normal, the trajectory tracking control unit 214 generates and sends control commands to the actuator control ECUs 13, 14, and 15 so that the own vehicle tracks the trajectory data 711 generated by the microcomputer 111.

When the operation block 211 is diagnosed as abnormal, the trajectory tracking control unit 214 generates and sends control commands to the actuator control ECUs 13, 14, and 15 so that the own vehicle tracks the trajectory data 712 generated by the microcomputer 112.

Fig. 2 is a diagram showing the diagnostic method for the sensor fusion result in Embodiment 1.

Fig. 2 shows an example of the sensor fusion results of the own vehicle 801 when other vehicles 802 and 803 travel near the own vehicle 801 and another vehicle 804 travels far from it.

Figs. 2-1a and 2-1b on the left side of Fig. 2 show the raw data fusion result 701 obtained by the operation block 211 performing sensor fusion on the raw data obtained from each sensor. Figs. 2-2a and 2-2b on the right side of Fig. 2 show the object data fusion result 702 obtained by the operation block 212 fusing the data preprocessed in each sensor.

The upper row of Fig. 2 shows a case where the diagnosis result is normal, and the lower row a case where it is abnormal. The specific manner of diagnosis is described below.

As described above, the sensor data obtained from the sensors 101, 201, and 301 undergo information processing in the information processing units 102, 202, and 302 of the respective sensors. The object data 103, 203, and 303 of each sensor obtained by this information processing are sent to the operation block 212, which fuses them to generate the object data fusion result 702.

The sensor fusion performed by the operation block 212 uses the operation blocks (processors) provided on the sensor side and a lockstep microcomputer. Further, to secure reliability, the lockstep microcomputer preferably adopts the object data output from the sensors by majority vote, as in the sketch below.
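A minimal sketch of such a 2-of-3 majority vote, assuming three sensors and simple position-gated matching; the gate value, the dictionary structure, and all names are illustrative rather than taken from the patent:

```python
import math

GATE_M = 2.0  # "same object" gate in metres (illustrative)

def seen_by(obj, sensor_objs):
    """True if some detection in sensor_objs lies within the gate of obj."""
    return any(math.hypot(obj["x"] - o["x"], obj["y"] - o["y"]) < GATE_M
               for o in sensor_objs)

def majority_adopt(camera_objs, radar_objs, lidar_objs):
    """Adopt an object only if at least 2 of the 3 sensors report it."""
    sensor_lists = [camera_objs, radar_objs, lidar_objs]
    adopted = []
    for i, objs in enumerate(sensor_lists):
        others = [l for j, l in enumerate(sensor_lists) if j != i]
        for obj in objs:
            votes = 1 + sum(seen_by(obj, other) for other in others)
            if votes >= 2 and not seen_by(obj, adopted):  # skip duplicates
                adopted.append(obj)
    return adopted  # data handed on to object data fusion

camera = [{"x": 10.0, "y": 1.0}]
radar  = [{"x": 10.3, "y": 1.1}]
lidar  = []  # lidar missed the object; the 2-of-3 vote still adopts it
print(majority_adopt(camera, radar, lidar))
```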

As shown in Fig. 2-2a, the object data fusion result 702 can be expected to recognize, with high reliability, the other vehicles 802 and 803 traveling around the own vehicle. On the other hand, since the data is processed beforehand in order to ensure reliability, it is sometimes difficult to recognize another vehicle traveling far away (for example, the other vehicle 804 of Fig. 2-2a).

The operation block 211, by contrast, takes in the data obtained from each sensor directly (for example, the raw data 104, 204, and 304), performs raw data fusion processing, and outputs the raw data fusion result 701. Fig. 2-1a shows an example of this output. Since the operation block 211 processes a large amount of raw data from each sensor, the amount of information on the recognizable object group increases.

Here, the object group refers to the objects existing around the vehicle, for example vehicles and pedestrians. The operation block 211 can therefore recognize not only the other vehicles 802 and 803 traveling near the own vehicle but also, in Fig. 2-1a, the other vehicle 804 traveling far away, which cannot be recognized in Fig. 2-2a.

Basically, the object data fusion result 702 generated by the sensor fusion in the operation block 212 contains a smaller amount of information about the surroundings than the raw data fusion result 701 output from the operation block 211, into which a large amount of raw data is input. For example, the raw data fusion result 701 generated by the operation block 211 is at the LVDS communication level, with a data volume of 10 Mbps or more, whereas the object data fusion result 702 generated by the operation block 212 is at the CAN-FD communication level, less than 10 Mbps.

If the raw data fusion result 701 and the object data fusion result 702 were simply compared in order to judge the reliability of the outputs and recognition of the operation block 211 and the operation block 212, then, as described above, the amount of data and the recognizable object groups would always differ, and the diagnosis would always conclude that the result is abnormal.

Therefore, in the present embodiment, the operation block 212 adopts a flow that uses the plural sensor-side processors and a lockstep microcomputer, and, on the premise that its recognition of other vehicles in the vicinity is highly reliable, the operation block 213 diagnoses whether the operation block 211 recognizes the same nearby external information as that recognized by the operation block 212.

Specifically, when the object data fusion result 702 output by the operation block 212 is included in the raw data fusion result 701 output by the operation block 211, the operation block 213 sets the diagnosis result to normal. When the object data fusion result 702 output by the operation block 212 is not included in the raw data fusion result 701 output by the operation block 211, the operation block 213 sets the diagnosis result to abnormal.

This is explained concretely below, taking Fig. 2 as an example.

In the upper diagrams of Fig. 2, the distant other vehicle 804 is recognized in the raw data fusion result 701 of the operation block 211 (Fig. 2-1a) but is not recognized by the operation block 212 (Fig. 2-2a); since both blocks recognize the vehicles traveling nearby, however, the diagnosis result is normal.

In the lower diagrams of Fig. 2, on the other hand, the operation block 212 recognizes the nearby other vehicle 802 (Fig. 2-2b) whereas the operation block 211 does not (Fig. 2-1b). Thus, when the object data fusion result 702 generated by the operation block 212 is not included in the raw data fusion result 701 generated by the operation block 211, the operation block 213 sets the diagnosis result to abnormal. In this case, it is determined that there is an error in the raw data fusion result 701 and that the object data fusion result 702 output by the highly reliable operation block 212 is correct.

In short, if there is an object that can be recognized in the object data fusion result 702 calculated by the operation block 212 but not in the raw data fusion result 701 calculated by the operation block 211, the operation block 213 sets the diagnosis result to abnormal. Conversely, when every object recognizable in the object data fusion result 702 calculated by the operation block 212 can also be recognized in the raw data fusion result 701 calculated by the operation block 211, the diagnosis result is set to normal. This inclusion test is sketched below.
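A minimal sketch of this inclusion test as performed by the operation block 213, assuming objects are matched by position within a tolerance; the tolerance, data structures, and example coordinates are illustrative, since the patent specifies only the set-inclusion criterion itself:

```python
import math

TOL_M = 2.0  # position tolerance for treating two detections as one object (illustrative)

def diagnose(result_701, result_702):
    """Operation block 213: normal iff every object in the object data fusion
    result 702 also appears in the raw data fusion result 701."""
    for obj in result_702:
        matched = any(math.hypot(obj["x"] - r["x"], obj["y"] - r["y"]) < TOL_M
                      for r in result_701)
        if not matched:
            return "abnormal"  # block 211 missed a high-confidence nearby object
    return "normal"            # 702 is included in 701; extra distant objects are fine

# Upper row of Fig. 2: 701 additionally sees the distant vehicle 804 -> normal.
r701 = [{"x": 10, "y": 1}, {"x": 25, "y": -3}, {"x": 80, "y": 0}]
r702 = [{"x": 10, "y": 1}, {"x": 25, "y": -3}]
assert diagnose(r701, r702) == "normal"

# Lower row of Fig. 2: 701 misses the nearby vehicle 802 -> abnormal.
assert diagnose(r701[1:], r702) == "abnormal"
```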

When the operation block 213 diagnoses the operation block 211 as abnormal, a failure or malfunction of a sensor may have occurred.

When such a failure is detected, the operation block 213 or the operation block 211 preferably prohibits the output of the raw data fusion result 701; alternatively, the operation block 211 stops outputting the raw data fusion result. The vehicle then travels along the vehicle trajectory generated from the output of the operation block 212. The reason is that the output of the operation block 212 is determined by a majority vote on the reliability of the data of the plural sensors, so that a highly reliable output is obtained even if one sensor fails. By continuing to use the output result of the operation block 212, highly reliable trajectory tracking control can be performed.

More preferably, the microcomputer 112 may generate retreat trajectory data 712, and the vehicle may select a retreat operation. The retreat operation is described in detail later.

In the present diagnostic method, the sensor fusion result 701 output by the operation block 211 is diagnosed by treating the sensor fusion result output by the operation block 212 as correct. In order for the output result of the operation block 212 to serve as this reference, the vehicle control device and system preferably have the following configuration.

Whereas the fusion processing using machine learning (for example, a DNN) in the operation block 211 is generated by empirical matching, the fusion in the operation block 212 is preferably generated by a rule-based algorithm. The fusion processing using machine learning is processing based on relationships that change with the data (for example, a DNN), while the fusion processing in the operation block 212 is processing based on predetermined rules (for example, on a lockstep microcomputer). The raw data fusion result can thus be diagnosed against a trajectory calculated by a system independent of the machine learning fusion processing. As a result, a more reliable diagnosis of the raw data fusion result can be realized.

The operation block 212 is preferably executed on a highly reliable lockstep microcomputer. The reliability of the raw data fusion result 701 used for trajectory generation in the operation block 211 can then be diagnosed against the object data fusion result 702 output with high reliability by the operation block 212. As a result, a highly reliable diagnosis of the raw data fusion result can be performed.

In addition, the operation block 212 preferably takes a majority vote over the outputs of the plural sensors and electronic control processors (which use principles, approaches, and algorithms different from those of the operation block 211). With this configuration, the operation block 212 performs its operation on data confirmed to be reliable, and is therefore highly reliable.

The operation block 212 must use an electronic control processor different from that of the operation block 211. Even if one sensor or one electronic control processor fails, an accurate determination can still be made by majority vote among the remaining sensors and electronic control processors.

The plural sensors that supply sensor data to the vehicle control device 11 are preferably powered by at least two batteries; for example, the camera 1 receives power from battery A and the radar 2 from battery B. With this configuration, the power supply of the vehicle control device 11 is made redundant, and a vehicle trajectory can be generated with higher reliability.

Fig. 3 is a diagram showing the diagnosis and the control flow based on the diagnosis result in Embodiment 1. The fusion result 701 of the operation block 211 provided in the microcomputer 111 and the fusion result 702 of the operation block 212 provided in the microcomputer 112 are compared by the operation block 213 provided in the microcomputer 113.

When the result is determined to be normal, the trajectory data 711 obtained by the microcomputer 111 through sensor fusion and trajectory generation is used. For the operation block 214 to execute trajectory tracking control based on the raw data fusion result 701 calculated by the microcomputer 111, an actuator control command for trajectory tracking control is transmitted from the vehicle control device 11 to the actuator side.

On the other hand, when the object data fusion result 702 generated by the sensor fusion of the operation block 212 is not included in the raw data fusion result 701 output by the operation block 211, an abnormality is diagnosed. When the operation block 213 thus diagnoses the operation result of the operation block 211 as abnormal, the trajectory data 712 obtained by the microcomputer 112 through sensor fusion and trajectory generation is used, and based on this data the operation block 214 transmits an actuator drive command (actuator control command) for trajectory tracking control to the actuator ECUs.

When the operation block 213 diagnoses the output of the raw data fusion result 701 calculated by the operation block 211 as abnormal, an abnormality may have occurred in one of the sensors, and continuing to travel in this state carries a high risk. Therefore, when the diagnosis result is determined to be abnormal, the trajectory data 712 generated by the microcomputer 112 can be generated as a retreat trajectory. Here, retreat means that, when a core component fails and its function stops during driving, the automatic driving system either performs an emergency avoidance that compensates for the lost function so that automatic driving can continue, or continues automatic driving for a certain period despite the failure of the core component and hands over to the driver safely and smoothly. A retreat trajectory is the safe evacuation or emergency evacuation trajectory generated to realize such an operation. With this configuration, when the operation block 213 detects that a failure has occurred in a sensor, the microcomputer 112 generates a retreat trajectory, so that the vehicle can be stopped safely.
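The switching flow of Fig. 3 can be summarized in a short sketch; the function names, the waypoint-list representation of a trajectory, and the actuator callback are hypothetical illustrations of the described behavior, not a disclosed implementation:

```python
def select_trajectory(diagnosis, trajectory_711, retreat_712):
    """Follow 711 while block 211 is normal; fall back to the retreat trajectory 712."""
    return trajectory_711 if diagnosis == "normal" else retreat_712

def control_step(diagnosis, trajectory_711, retreat_712, send_actuator_cmd):
    """Sketch of block 214: issue tracking commands for the selected trajectory."""
    for waypoint in select_trajectory(diagnosis, trajectory_711, retreat_712):
        send_actuator_cmd(waypoint)  # command toward the actuator control ECUs 13-15

cmds = []
control_step("abnormal", [(0, 0), (5, 0)], [(0, 0), (3, 1), (4, 2)], cmds.append)
print(cmds)  # follows the retreat trajectory 712 after an abnormal diagnosis
```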

In the present embodiment, the operation block 211 and the operation block 212 perform their trajectory generation operations in parallel and without interruption, using different algorithms. Therefore, even when the operation block 213 diagnoses an abnormality in the raw data fusion result 701 of the operation block 211, control can switch to the trajectory generated from the object data fusion result 702 of the operation block 212, and the occurrence of a time lag can be suppressed.

If the operation block 212 were also to perform the high-level arithmetic processing performed in the operation block 211 (a fully redundant system), both the operation block 211 and the operation block 212 would need to be high-performance processors capable of bearing a heavy computational load. Generating a retreat trajectory, however, does not require such a processor and can be computed on a low-cost one. Therefore, when the operation block 212 generates only the retreat trajectory, the cost of the vehicle control device can be reduced.

With this configuration, unstable trajectory tracking control caused by an abnormality in a sensor, in a microcomputer provided in a sensor, or in one of the operation blocks can be prevented.

The present embodiment thus includes: a 1st operation block (operation block 211) that performs sensor fusion processing based on raw data output from a plurality of external sensors; a 2nd operation block (operation block 212) that performs sensor fusion processing based on object data generated by individually processing the raw data output from the plurality of external sensors; and a 3rd operation block (operation block 213) that diagnoses the output result of the 1st operation block using the output result of the 1st operation block and the output result of the 2nd operation block.

The output result of the operation block 211 is assumed to contain a larger amount of information than that of the operation block 212. If the output results of the operation block 211 and the operation block 212 were simply compared, a difference would arise even between normal outputs because of this difference in information volume, and many outputs might be discarded. In the present embodiment, therefore, the above configuration makes it possible to judge the information-rich output result of the operation block 211 against the highly reliable output result of the operation block 212.

Preferably, the 3rd operation block (operation block 213) determines that the 1st operation block (operation block 211) is abnormal when the group of objects around the vehicle output as the sensor fusion result of the 2nd operation block (operation block 212) is not included in the group of objects around the vehicle output as the sensor fusion result of the 1st operation block (operation block 211). Conversely, when the object group around the vehicle output as the sensor fusion result of the 2nd operation block (operation block 212) is included in the object group around the vehicle output as the sensor fusion result of the 1st operation block (operation block 211), the 3rd operation block (operation block 213) determines that the 1st operation block (operation block 211) is normal. With this configuration, the output of the operation block 211 can be diagnosed against the highly reliable, smaller-volume data output by the operation block 212, and even though the operation block 211 holds more information than the operation block 212, it can be diagnosed as normal as long as the minimum necessary information appears in both outputs. As a result, trajectory generation can use the high-precision information of the operation block 211 while reliability is ensured.

(Embodiment 2)

Fig. 4 is a diagram showing the internal configuration of the vehicle control device (1st ECU) 11 (in particular, the microcomputers 111 and 112) and the connections of the sensors and actuators in Embodiment 2.

In the present embodiment, as in Embodiment 1, all calculations related to trajectory generation for the automatic driving system are performed in the 1st ECU. Raw data fusion is performed in the operation block 211 of the microcomputer 111, and the raw data fusion result 701 is output.

Here, unlike Embodiment 1, the microcomputer 112 in the present embodiment includes both the operation block 212, which outputs the object data fusion result, and the operation block 213.

With this configuration, the operation block 211 and the operation block 212 can operate with different algorithms, so the operation block 213 can perform its comparison against more reliable sensor data, and the 1st operation block can generate a trajectory using highly reliable data. In general, a highly reliable microcomputer is used as the microcomputer 112 that generates the object data fusion result, in order to ensure safety; reliability can therefore also be expected of the operation block 213 mounted on it. Configurations identical to those of Embodiment 1 are not described again.

(Embodiment 3)

Fig. 5 is a diagram showing the internal configurations of the vehicle control device (1st ECU) 11 and the vehicle control device (2nd ECU) 12 and the connections of the sensors and actuators in Embodiment 3.

In the present embodiment, the raw data fusion result 701 is calculated in the operation block 211 of the microcomputer 111 provided in the 1st ECU, while the microcomputer 112 of the 2nd ECU includes the operation block 212, which calculates the object data fusion result 702, and the operation block 213. That is, the raw data fusion result 701 and the object data fusion result 702 are output by different ECUs.

With this configuration, the power supplies and the like can be made redundant, and highly reliable diagnosis becomes possible.

Here, the 1st ECU and the 2nd ECU may have entirely different functions. For example, the 1st ECU may be an automatic driving vehicle control ECU, and the 2nd ECU an automatic parking or ADAS (Advanced Driver Assistance System) ECU. Configurations identical to those of Embodiment 1 are not described again.

The present invention has been described above through Embodiments 1 to 3. The sensors mounted on the illustrated vehicle are examples of sensors to which the present invention is applicable, and do not limit the sensors of vehicles to which the present invention can be applied.

As described above, according to the present embodiments, a highly reliable vehicle control device and electronic control system can be realized.

Description of the symbols

1 camera

2 radar

3 laser

11 vehicle control device

13 3rd ECU

14 4th ECU

15 5th ECU

101 sensor 1

201 sensor 2

301 sensor 3

102 information processing unit of sensor 1

202 information processing unit of sensor 2

302 information processing unit of sensor 3

103 object data output from the 1 st sensor

203 object data output from the 2 nd sensor

303 object data output from the 3 rd sensor

104 raw data output from the 1 st sensor

204 raw data output from the 2 nd sensor

304 raw data output from the 3 rd sensor

211 1st operation block

212 2nd operation block

213 3rd operation block (diagnosis block)

214 trajectory tracking control unit (control command generation unit)

701 original data fusion result

702 object data fusion results

711 trajectory data generated based on the raw data fusion result

712 trajectory data generated based on the object data fusion result
