Obstacle detection device

Document No.: 1713378    Publication date: 2019-12-13

Note: This invention, "Obstacle detection device" (障碍物检知装置), was designed and created by 松浦充保, 原田岳人, 前田优, and 柳川博彦 on 2018-03-15. Summary: A position acquisition unit (261) acquires relative position information between the host vehicle and an obstacle on the basis of received waves received by a ranging sensor (21). A shape recognition unit (262) performs shape recognition of the obstacle on the basis of image information acquired by an imaging unit (22). A detection processing unit (263) detects the obstacle on the basis of the relative position information acquired by the position acquisition unit and the shape recognition result of the shape recognition unit. The detection processing unit determines whether the height of the obstacle is equal to or greater than a predetermined height; when the height of the obstacle in the shape recognition result is smaller than the predetermined height, the detection processing unit discards the relative position information corresponding to that obstacle.

1. An obstacle detection device (20) configured to be mounted on a vehicle (10) to detect an obstacle (B) present outside the vehicle, the obstacle detection device comprising:

at least one ranging sensor (21) configured to output a signal corresponding to a distance to the obstacle by transmitting a probe wave toward the outside of the host vehicle and receiving a received wave including a reflected wave of the probe wave reflected by the obstacle;

an imaging unit (22) configured to acquire image information corresponding to an image of the periphery of the host vehicle;

a vehicle state acquisition unit (260) configured to acquire travel state information corresponding to a travel state of the host vehicle;

a position acquisition unit (261) configured to acquire relative position information corresponding to a relative position of the obstacle with respect to the host vehicle, based on an output of the ranging sensor;

a shape recognition unit (262) configured to perform shape recognition of the obstacle based on the image information acquired by the imaging unit and the travel state information acquired by the vehicle state acquisition unit; and

a detection processing unit (263) configured to detect the obstacle based on the relative position information acquired by the position acquisition unit and a shape recognition result of the shape recognition unit,

wherein the detection processing unit is configured to discard the relative position information corresponding to the obstacle when a height dimension of the obstacle in the shape recognition result is smaller than a predetermined dimension.

2. The obstacle detection device according to claim 1, wherein

the at least one ranging sensor includes a first ranging sensor and a second ranging sensor disposed at positions different from each other,

the first ranging sensor and the second ranging sensor are arranged in such a positional relationship that the reflected wave of the probe wave transmitted from one of them and reflected by the obstacle can be received as the received wave of the other, and

the position acquisition unit is configured to acquire the relative position information by triangulation based on the positions of the first ranging sensor and the second ranging sensor in a case where the first ranging sensor and the second ranging sensor each receive, as the received wave, the reflected wave of the probe wave transmitted by the first ranging sensor and reflected by the obstacle.

3. The obstacle detection device according to claim 2, wherein

the first ranging sensor and the second ranging sensor are provided on a surface of the host vehicle on a traveling direction side,

in a case where the height dimension of the obstacle in the shape recognition result is equal to or greater than the predetermined dimension and the relative position information corresponding to the obstacle is acquired from a direct wave rather than from a first indirect wave and a second indirect wave, the detection processing unit recognizes that the obstacle has a wall surface (BW) intersecting a vehicle center axis (VL) of the host vehicle and that the wall surface is likely to approach the host vehicle as the host vehicle travels,

the first indirect wave is the received wave received by the first ranging sensor and results from the reflected wave of the probe wave transmitted from the second ranging sensor being reflected by the obstacle,

the second indirect wave is the received wave received by the second ranging sensor and results from the reflected wave of the probe wave transmitted from the first ranging sensor being reflected by the obstacle, and

the direct wave is the received wave received by the first ranging sensor and results from the reflected wave of the probe wave transmitted from the first ranging sensor being reflected by the obstacle.

4. The obstacle detection device according to claim 1, wherein

the ranging sensor is configured to output the signal corresponding to the distance to the obstacle by receiving, as the received wave, the reflected wave of the probe wave that the ranging sensor itself transmitted and that was reflected by the obstacle, and

the position acquisition unit is configured to acquire the relative position information by triangulation based on positions of the ranging sensor and distances to the obstacle acquired at mutually different times while the host vehicle travels.

5. The obstacle detection device according to any one of claims 1 to 4, wherein

the shape recognition unit is configured to recognize a three-dimensional shape of the obstacle by acquiring three-dimensional positions of a plurality of feature points in the image information, based on the travel state information acquired by the vehicle state acquisition unit and a plurality of pieces of image information acquired in time series by the imaging unit as the host vehicle moves.

6. An obstacle detection device (20) configured to be mounted on a vehicle (10) to detect an obstacle (B) present outside the vehicle, the obstacle detection device comprising:

at least one ranging sensor (21) configured to output a signal corresponding to a distance to the obstacle by transmitting a probe wave toward the outside of the host vehicle and receiving a received wave including a reflected wave of the probe wave reflected by the obstacle;

an imaging unit (22) configured to acquire image information corresponding to an image of the periphery of the host vehicle;

a distance acquisition unit (264) configured to acquire distance information corresponding to a distance from the host vehicle to the obstacle, based on an output of the ranging sensor;

a shape recognition unit (265) configured to perform shape recognition of the obstacle based on the image information acquired by the imaging unit; and

a distance correction unit (266) configured to correct the distance information corresponding to the obstacle, based on a mounting position of the ranging sensor in a vehicle height direction, when a height dimension of the obstacle in a shape recognition result of the shape recognition unit is smaller than a predetermined dimension.

7. The obstacle detection device according to claim 6, wherein

the distance acquisition unit is configured to acquire a horizontal distance from an end surface (V1, V2, V3) of the host vehicle on which the ranging sensor is mounted to the obstacle,

the distance correction unit is configured to correct the horizontal distance by the following equation when the height dimension of the obstacle in the shape recognition result is smaller than the predetermined dimension:

DC = (DC0^2 - SH^2)^(1/2)

where DC0 is the horizontal distance as acquired by the distance acquisition unit before correction by the distance correction unit, DC is the horizontal distance after correction by the distance correction unit, and SH is the distance in the vehicle height direction between the base end position of the obstacle and the mounting position.
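The correction above is a Pythagorean relation and can be sketched in a few lines. This is a hedged illustration of the claim-7 equation only, not the patented implementation; the function name and the guard for DC0 ≤ SH are added assumptions.

```python
import math

def correct_horizontal_distance(dc0, sh):
    """Apply DC = (DC0^2 - SH^2)^(1/2).

    dc0: horizontal distance as first acquired (actually the slant range
         from the sensor to the low obstacle's base).
    sh:  vehicle-height-direction distance between the obstacle's base end
         position and the sensor's mounting position.
    """
    if dc0 <= sh:
        # Slant range no longer than the height offset: treat the obstacle
        # as being at the end surface (added guard, not from the claim).
        return 0.0
    return math.sqrt(dc0 ** 2 - sh ** 2)
```

For example, a slant range of 1.3 m with a 0.5 m height offset corrects to a horizontal distance of 1.2 m.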

8. The obstacle detection device according to claim 7, wherein

the at least one ranging sensor includes a first ranging sensor and a second ranging sensor provided at positions different from each other on a travel-side end surface, which is the end surface of the host vehicle located on the traveling direction side,

the first ranging sensor and the second ranging sensor are arranged in such a positional relationship that the reflected wave of the probe wave transmitted from one of them and reflected by the obstacle can be received as the received wave of the other,

the distance acquisition unit is configured to acquire, as a travelable distance (DC), the distance in the traveling direction from the travel-side end surface to the obstacle, which is the horizontal distance, by triangulation based on the positions of the first ranging sensor and the second ranging sensor in a case where the first ranging sensor and the second ranging sensor each receive, as the received wave, the reflected wave of the probe wave transmitted by the first ranging sensor and reflected by the obstacle, and

the distance correction unit is configured to correct the travelable distance when the height dimension of the obstacle in the shape recognition result of the shape recognition unit is smaller than the predetermined dimension.

9. The obstacle detection device according to any one of claims 6 to 8, wherein

the obstacle detection device further includes a vehicle state acquisition unit (24) configured to acquire travel state information corresponding to a travel state of the host vehicle, and

the shape recognition unit is configured to recognize a three-dimensional shape of the obstacle by acquiring three-dimensional positions of a plurality of feature points in the image information, based on the travel state information acquired by the vehicle state acquisition unit and a plurality of pieces of image information acquired in time series by the imaging unit as the host vehicle moves.

10. The obstacle detection device according to any one of claims 6 to 8, wherein

the shape recognition unit extracts, from the image information, a straight edge corresponding to the distance information acquired by the distance acquisition unit, and recognizes, based on the texture of the image around the straight edge, whether the obstacle is a step whose height dimension is smaller than the predetermined dimension, and

the distance correction unit is configured to correct the distance information corresponding to the obstacle when the shape recognition unit recognizes that the obstacle is the step.

11. The obstacle detection device according to any one of claims 1 to 10, wherein

the height dimension is a protruding height of the obstacle from a road surface.

12. The obstacle detection device according to any one of claims 1 to 11, wherein

the ranging sensor is an ultrasonic sensor.

Technical Field

the present disclosure relates to an obstacle detection device configured to be mounted on a host vehicle to detect an obstacle present outside the host vehicle.

Background

The device described in Japanese Patent Application Laid-Open No. 2014-58247 includes a sonar, comprising an irradiation unit, a reception unit, and a position detection unit, together with an object determination unit. The sonar may also be referred to as a "ranging sensor". The irradiation unit irradiates ultrasonic waves toward the outside of the vehicle. The reception unit receives a reflected wave from an object. The position detection unit detects the position of an object based on the round-trip time of the ultrasonic wave. The object determination unit determines a feature related to the height of the object based on a change in the detection state of the object specified from the reflected wave.

As described above, in such a device, the distance from the ranging sensor, or from the host vehicle on which it is mounted, to the obstacle is acquired based on the reflected wave of the probe wave reflected by the obstacle. The detection result of the reflected wave contains information corresponding to the distance between the ranging sensor and the object, but contains substantially no information corresponding to the height of the object. Therefore, the conventional device cannot accurately obtain information on the height of the object.

On the other hand, the detection result of the reflected wave is affected by the height dimension of the obstacle, so there is still room to improve the detection accuracy of the conventional device. For example, the obstacle to be detected may be a low obstacle protruding from the road surface, such as a curb. In that case, a non-negligible error can arise between the acquired distance and the actual horizontal distance between the host vehicle and the obstacle.
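To make the magnitude of that error concrete, here is a small worked example. The 0.5 m mounting height and 1.0 m gap are illustrative assumptions, not values from the disclosure: the echo travels along the hypotenuse to the base of a low obstacle, so the sensor reports the slant range rather than the horizontal gap.

```python
import math

mount_height = 0.5     # m: assumed height of the ranging sensor above the road
true_horizontal = 1.0  # m: assumed actual horizontal gap to a curb's base

# The echo path to a low obstacle's base is the hypotenuse of a right
# triangle, so the measured range exceeds the horizontal distance.
measured = math.sqrt(true_horizontal ** 2 + mount_height ** 2)
error = measured - true_horizontal  # roughly 0.12 m, hardly negligible when parking
```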

Disclosure of Invention

The present disclosure has been made in view of the circumstances exemplified above.

According to one aspect of the present disclosure, an obstacle detection device is configured to be mounted on a host vehicle to detect an obstacle present outside the host vehicle.

The obstacle detection device includes:

at least one distance measuring sensor configured to output a signal corresponding to a distance to the obstacle by transmitting a probe wave toward the outside of the host vehicle and receiving a received wave including a reflected wave of the probe wave reflected by the obstacle;

an imaging unit configured to acquire image information corresponding to an image of the periphery of the host vehicle;

a vehicle state acquisition unit configured to acquire travel state information corresponding to a travel state of the host vehicle;

a position acquisition unit configured to acquire relative position information corresponding to a relative position of the obstacle with respect to the host vehicle, based on an output of the distance measuring sensor;

a shape recognition unit configured to perform shape recognition of the obstacle based on the image information acquired by the imaging unit and the travel state information acquired by the vehicle state acquisition unit; and

a detection processing unit configured to detect the obstacle based on the relative position information acquired by the position acquisition unit and a shape recognition result of the shape recognition unit.

The detection processing unit is configured to discard the relative position information corresponding to the obstacle when a height dimension of the obstacle in the shape recognition result is smaller than a predetermined dimension.
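The discarding step just described can be sketched as follows. This is a minimal illustration only; the function name, the dictionary layout, and the 0.15 m threshold are assumptions for the example, not values taken from the disclosure.

```python
MIN_HEIGHT_M = 0.15  # assumed "predetermined dimension" (e.g. around curb height)

def filter_position_fixes(position_fixes, shape_results, min_height=MIN_HEIGHT_M):
    """Keep a relative-position fix only when shape recognition does not
    report the corresponding obstacle's height as below the threshold."""
    kept = []
    for fix in position_fixes:
        shape = shape_results.get(fix["obstacle_id"])
        if shape is not None and shape["height_m"] < min_height:
            continue  # discard: recognized as a low obstacle
        kept.append(fix)
    return kept
```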

according to another aspect of the present disclosure, an obstacle detection device is mounted on a host vehicle to detect an obstacle present outside the host vehicle.

The obstacle detection device includes:

at least one distance measuring sensor configured to output a signal corresponding to a distance to the obstacle by transmitting a probe wave toward the outside of the host vehicle and receiving a received wave including a reflected wave of the probe wave reflected by the obstacle;

an imaging unit configured to acquire image information corresponding to an image of the periphery of the host vehicle;

a distance acquisition unit configured to acquire distance information corresponding to a distance from the host vehicle to the obstacle, based on an output of the distance measuring sensor;

a shape recognition unit configured to perform shape recognition of the obstacle based on the image information acquired by the imaging unit; and

a distance correction unit configured to correct the distance information corresponding to the obstacle, based on a mounting position of the distance measuring sensor in a vehicle height direction, when a height dimension of the obstacle in a shape recognition result of the shape recognition unit is smaller than a predetermined dimension.

The parenthesized reference numerals attached to the elements above merely illustrate an example of the correspondence between each element and the specific configurations described in the embodiments below.

Drawings

Fig. 1 is a plan view showing a schematic configuration of a vehicle on which an obstacle detection device according to an embodiment is mounted.

Fig. 2 is a functional block diagram of the first embodiment of the obstacle detecting device shown in fig. 1.

Fig. 3 is a schematic diagram for explaining an outline of the operation of the obstacle detecting device shown in fig. 2.

Fig. 4A is a schematic diagram for explaining an outline of the operation of the obstacle detecting device shown in fig. 2.

Fig. 4B is a schematic diagram for explaining an outline of the operation of the obstacle detecting device shown in fig. 2.

Fig. 5 is a flowchart showing an operation example of the obstacle detecting device shown in fig. 2.

Fig. 6 is a flowchart showing an operation example of the obstacle detecting device shown in fig. 2.

Fig. 7 is a schematic diagram for explaining an outline of an operation of the obstacle detecting device according to the second embodiment shown in fig. 1.

Fig. 8 is a schematic diagram for explaining an outline of the operation of the second embodiment of the obstacle detecting device shown in fig. 1.

Fig. 9 is a schematic diagram for explaining an outline of an operation of the obstacle detecting device according to the third embodiment shown in fig. 1.

Fig. 10 is a flowchart showing an operation example of the third embodiment of the obstacle detecting device shown in fig. 1.

Fig. 11 is a functional block diagram of the obstacle detecting device according to the fourth embodiment shown in fig. 1.

Fig. 12A is a schematic diagram for explaining an outline of the operation of the obstacle detecting device shown in fig. 11.

Fig. 12B is a schematic diagram for explaining an outline of the operation of the obstacle detecting device shown in fig. 11.

Fig. 12C is a schematic diagram for explaining an outline of the operation of the obstacle detecting device shown in fig. 11.

Fig. 13A is a schematic diagram for explaining an outline of the operation of the obstacle detecting device shown in fig. 11.

Fig. 13B is a schematic diagram for explaining an outline of the operation of the obstacle detecting device shown in fig. 11.

Fig. 14 is a flowchart showing an operation example of the obstacle detecting device shown in fig. 11.

Fig. 15 is a flowchart showing an operation example of the fifth embodiment of the obstacle detecting device shown in fig. 1.

Fig. 16 is a flowchart showing an operation example of the fifth embodiment of the obstacle detecting device shown in fig. 1.

Fig. 17 is a flowchart showing an operation example of the fifth embodiment of the obstacle detecting device shown in fig. 1.

Detailed Description

Hereinafter, embodiments will be described with reference to the drawings. Modifications applicable to any of the embodiments are described together after the series of embodiment descriptions, rather than being interleaved with them.

Referring to fig. 1, a vehicle 10 is a so-called four-wheel vehicle, and includes a vehicle body 11 having a substantially rectangular shape in plan view. Hereinafter, a virtual straight line that passes through the center of the vehicle 10 in the vehicle width direction and is parallel to the vehicle overall length direction in the vehicle 10 is referred to as a vehicle center axis VL. The vehicle overall length direction is a direction orthogonal to the vehicle width direction and orthogonal to the vehicle height direction. The vehicle height direction is a direction that defines the vehicle height of the vehicle 10, and is a direction parallel to the direction in which gravity acts when the vehicle 10 is placed on a horizontal surface. In fig. 1, the vehicle overall length direction is the vertical direction in the drawing, and the vehicle width direction is the horizontal direction in the drawing.

"front", "rear", "left", "right" in the vehicle 10 are defined as indicated by arrows in fig. 1. That is, the vehicle overall length direction is synonymous with the front-rear direction. The vehicle width direction is synonymous with the left-right direction. The vehicle height direction is synonymous with the vertical direction. However, as will be described later, the vehicle height direction, i.e., the vertical direction, may not be parallel to the direction in which gravity acts depending on the mounting conditions or the traveling conditions of the vehicle 10.

A front bumper 13 is attached to a front surface portion 12, which is a front end portion of the vehicle body 11. A rear bumper 15 is attached to a rear surface portion 14, which is a rear end portion of the vehicle body 11. Door panels 17 are provided on the side surface portions 16 of the vehicle body 11. In the specific example shown in fig. 1, two door panels 17 are provided on each of the left and right sides, for a total of four. A door mirror 18 is mounted on each of the front left and right door panels 17.

An obstacle detection device 20 is mounted on the vehicle 10 and is configured to detect an obstacle B present outside the vehicle 10. Hereinafter, the vehicle 10 on which the obstacle detection device 20 is mounted is referred to as the "host vehicle 10".

Specifically, the obstacle detecting device 20 includes a distance measuring sensor 21, an imaging unit 22, a vehicle speed sensor 23, a shift position sensor 24, a steering angle sensor 25, a control unit 26, and a display 27. Hereinafter, each part constituting the obstacle detection device 20 will be described in detail. In fig. 1, the electrical connection between the parts constituting the obstacle detection device 20 is omitted for simplicity of illustration.

The distance measuring sensor 21 is provided so as to output a signal corresponding to the distance to the obstacle B by transmitting a probe wave toward the outside of the host vehicle 10 and receiving a received wave including a reflected wave of the probe wave reflected by the wall surface BW of the obstacle B. Specifically, in the present embodiment, the distance measuring sensor 21 is a so-called ultrasonic sensor, configured to transmit an ultrasonic probe wave and to receive a received wave including ultrasonic waves.

The obstacle detection device 20 includes at least one distance measuring sensor 21. Specifically, in the present embodiment, a plurality of distance measuring sensors 21 are attached to the vehicle body 11. Each of the distance measuring sensors 21 is arranged offset from the vehicle center axis VL to one side in the vehicle width direction. In addition, at least some of the distance measuring sensors 21 are provided so as to transmit probe waves in a direction intersecting the vehicle center axis VL.

Specifically, the front bumper 13 is mounted with a first front sonar SF1, a second front sonar SF2, a third front sonar SF3, and a fourth front sonar SF4 as the distance measuring sensors 21. Similarly, the rear bumper 15 is mounted with a first rear sonar SR1, a second rear sonar SR2, a third rear sonar SR3, and a fourth rear sonar SR4 as the distance measuring sensors 21.

Further, a first side sonar SS1, a second side sonar SS2, a third side sonar SS3, and a fourth side sonar SS4 are attached to the side surface portion 16 of the vehicle body 11 as the distance measuring sensors 21. Hereinafter, when none of the first front sonar SF1, the second front sonar SF2, the third front sonar SF3, the fourth front sonar SF4, the first rear sonar SR1, the second rear sonar SR2, the third rear sonar SR3, the fourth rear sonar SR4, the first side sonar SS1, the second side sonar SS2, the third side sonar SS3, and the fourth side sonar SS4 is being specified in particular, the singular expression "distance measuring sensor 21" or the expression "the plurality of distance measuring sensors 21" is used.

Taking one of the distance measuring sensors 21 as a "first distance measuring sensor" and another as a "second distance measuring sensor", "direct wave" and "indirect wave" are defined as follows. A received wave received by the first distance measuring sensor that is the reflected wave of a probe wave transmitted from the first distance measuring sensor itself and reflected by the obstacle B is referred to as a "direct wave". In contrast, a received wave received by the first distance measuring sensor that is the reflected wave of a probe wave transmitted from the second distance measuring sensor and reflected by the obstacle B is referred to as an "indirect wave".

The first front sonar SF1 is provided at the left end portion in the front side surface V1 of the front bumper 13 to send a probe wave to the left front of the own vehicle 10. The second front sonar SF2 is provided at the right end portion in the front side surface V1 of the front bumper 13 to transmit a probe wave to the front right of the own vehicle 10. The first front sonar SF1 and the second front sonar SF2 are arranged symmetrically with respect to the vehicle center axis VL.

The third front sonar SF3 and the fourth front sonar SF4 are aligned in the vehicle width direction at a position closer to the center in the front side surface V1 of the front bumper 13. The third front sonar SF3 is disposed between the first front sonar SF1 and the vehicle center axis VL in the vehicle width direction, and transmits a probe wave substantially forward of the host vehicle 10. The fourth front sonar SF4 is disposed between the second front sonar SF2 and the vehicle center axis VL in the vehicle width direction, and transmits a probe wave substantially forward of the host vehicle 10. The third front sonar SF3 and the fourth front sonar SF4 are arranged symmetrically with respect to the vehicle center axis VL.

As described above, the first front sonar SF1 and the third front sonar SF3 are arranged at different positions from each other in a plan view. The first front sonar SF1 and the third front sonar SF3, which are adjacent to each other in the vehicle width direction, are set in such a positional relationship that the reflected wave of a probe wave transmitted from one of them and reflected by the obstacle B can be received as a received wave of the other.

That is, the first front sonar SF1 is arranged so as to be able to receive both a direct wave corresponding to a probe wave transmitted by itself and an indirect wave corresponding to a probe wave transmitted by the third front sonar SF3. Similarly, the third front sonar SF3 is arranged to be able to receive both a direct wave corresponding to the probe wave transmitted by itself and an indirect wave corresponding to the probe wave transmitted by the first front sonar SF1.
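As a rough sketch of how such a direct/indirect echo pair can be turned into a two-dimensional obstacle position by triangulation: the direct round trip gives the first sonar's range, the indirect path gives the second sonar's range, and the obstacle lies at the intersection of the two range circles. The function, the coordinate convention, and the 343 m/s sound speed are illustrative assumptions, not details taken from the disclosure.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, assumed: air at roughly 20 degrees C

def triangulate(pos_a, pos_b, t_direct, t_indirect):
    """Locate an obstacle from one direct and one indirect echo (sketch).

    pos_a, pos_b: sensor x-positions (m) along the bumper, same height assumed.
    t_direct:     round-trip time of sonar A's own echo (A -> obstacle -> A).
    t_indirect:   travel time of sonar A's probe arriving at sonar B via the
                  obstacle (A -> obstacle -> B).
    Returns (x, y) with y pointing away from the bumper, or None when the
    two range circles do not intersect (inconsistent echoes).
    """
    r_a = SPEED_OF_SOUND * t_direct / 2.0    # A-to-obstacle distance
    r_b = SPEED_OF_SOUND * t_indirect - r_a  # B-to-obstacle distance
    base = pos_b - pos_a
    # Along-baseline offset of the obstacle, measured from sensor A.
    x = (r_a ** 2 - r_b ** 2 + base ** 2) / (2.0 * base)
    y_sq = r_a ** 2 - x ** 2
    if y_sq < 0.0:
        return None
    return (pos_a + x, math.sqrt(y_sq))
```

For sensors 1 m apart and an obstacle midway between them, the two echo times are equal and the recovered position sits on the perpendicular bisector of the baseline.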

Similarly, the third front sonar SF3 and the fourth front sonar SF4 are arranged at different positions from each other in a plan view. The third front sonar SF3 and the fourth front sonar SF4 adjacent to each other in the vehicle width direction are set in a positional relationship such that a reflected wave of a probe wave transmitted from one side reflected by the obstacle B can be received as a received wave of the other side.

Similarly, the second front sonar SF2 and the fourth front sonar SF4 are arranged at different positions from each other in a plan view. The second front sonar SF2 and the fourth front sonar SF4 adjacent to each other in the vehicle width direction are set in a positional relationship such that a reflected wave of a probe wave transmitted from one side reflected by the obstacle B can be received as a received wave of the other side.

The first rear sonar SR1 is provided at the left end portion in the rear-side surface V2 of the rear bumper 15 to transmit a probe wave to the left rear of the own vehicle 10. The second rear sonar SR2 is provided at the right end portion in the rear-side surface V2 of the rear bumper 15 to transmit a probe wave to the right rear of the own vehicle 10. The first rear sonar SR1 and the second rear sonar SR2 are arranged symmetrically with respect to the vehicle center axis VL.

The third rear sonar SR3 and the fourth rear sonar SR4 are aligned in the vehicle width direction at a position closer to the center in the rear side surface V2 of the rear bumper 15. The third rear sonar SR3 is disposed between the first rear sonar SR1 and the vehicle central axis VL in the vehicle width direction, and transmits a probe wave to substantially the rear of the own vehicle 10. The fourth rear sonar SR4 is disposed between the second rear sonar SR2 and the vehicle central axis VL in the vehicle width direction, and transmits a probe wave to substantially the rear of the own vehicle 10. The third rear sonar SR3 and the fourth rear sonar SR4 are disposed symmetrically with respect to the vehicle center axis VL.

As described above, the first rear sonar SR1 and the third rear sonar SR3 are arranged at different positions from each other in a plan view. The first rear sonar SR1 and the third rear sonar SR3, which are adjacent to each other in the vehicle width direction, are set in such a positional relationship that the reflected wave of a probe wave transmitted from one of them and reflected by the obstacle B can be received as a received wave of the other.

That is, the first rear sonar SR1 is arranged so as to be able to receive both a direct wave corresponding to the probe wave transmitted by itself and an indirect wave corresponding to the probe wave transmitted by the third rear sonar SR3. Similarly, the third rear sonar SR3 is arranged so as to be able to receive both a direct wave corresponding to the probe wave transmitted by itself and an indirect wave corresponding to the probe wave transmitted by the first rear sonar SR1.

Similarly, the third rear sonar SR3 and the fourth rear sonar SR4 are arranged at different positions from each other in a plan view. The third rear sonar SR3 and the fourth rear sonar SR4 adjacent to each other in the vehicle width direction are set in a positional relationship such that a reflected wave of a probe wave transmitted from one side reflected by the obstacle B can be received as a received wave of the other side.

Similarly, the second rear sonar SR2 and the fourth rear sonar SR4 are arranged at different positions from each other in a plan view. The second rear sonar SR2 and the fourth rear sonar SR4 adjacent to each other in the vehicle width direction are set in a positional relationship such that a reflected wave of a probe wave transmitted from one side reflected by the obstacle B can be received as a received wave of the other side.

The first side sonar SS1, the second side sonar SS2, the third side sonar SS3, and the fourth side sonar SS4 are provided to transmit probe waves from the vehicle side surface V3, which is the outer surface of the side surface portion 16, toward the side of the host vehicle 10. The first side sonar SS1, the second side sonar SS2, the third side sonar SS3, and the fourth side sonar SS4 are each provided so as to be able to receive only direct waves.

The first side sonar SS1 is disposed between the left door mirror 18 and the first front sonar SF1 in the front-rear direction to transmit a probe wave to the left of the host vehicle 10. The second side sonar SS2 is disposed between the right door mirror 18 and the second front sonar SF2 in the front-rear direction, and transmits a probe wave to the right of the host vehicle 10. The first side sonar SS1 and the second side sonar SS2 are provided symmetrically with respect to the vehicle center axis VL.

The third side sonar SS3 is disposed between the door panel 17 on the left rear side and the first rear sonar SR1 in the front-rear direction, and transmits a probe wave to the left of the host vehicle 10. The fourth side sonar SS4 is disposed between the door panel 17 on the right rear side and the second rear sonar SR2 in the front-rear direction, and transmits a probe wave to the right of the host vehicle 10. The third side sonar SS3 and the fourth side sonar SS4 are provided symmetrically with respect to the vehicle center axis VL.

The plurality of distance measuring sensors 21 are electrically connected to the control unit 26. That is, each of the plurality of distance measuring sensors 21 transmits a probe wave under the control of the control unit 26, generates a signal corresponding to the reception result of the received wave, and transmits it to the control unit 26. Hereinafter, the information included in a signal corresponding to the reception result of a received wave is referred to as "reception information". The reception information includes information associated with the reception intensity of the received wave and information associated with the distance between each of the plurality of distance measuring sensors 21 and the obstacle B. The information associated with the distance to the obstacle B includes information associated with the time difference from the transmission of the probe wave to the reception of the received wave.

The imaging unit 22 is configured to capture an image of the periphery of the host vehicle 10 and acquire image information corresponding to the image. In the present embodiment, the imaging unit 22 is a digital camera device and includes an image sensor such as a CCD. CCD is an abbreviation of Charge-Coupled Device.

In the present embodiment, the host vehicle 10 is equipped with a plurality of imaging units 22, that is, a front camera CF, a rear camera CB, a left side camera CL, and a right side camera CR. When no particular one of the front camera CF, the rear camera CB, the left side camera CL, and the right side camera CR is specified, the singular expression "imaging unit 22" or the expression "the plurality of imaging units 22" is used below.

The front camera CF is mounted on the front face portion 12 of the vehicle body 11 to acquire image information corresponding to an image in front of the own vehicle 10. The rear camera CB is mounted on a rear face portion 14 of the vehicle body 11 to acquire image information corresponding to an image of the rear of the own vehicle 10. The left side camera CL is mounted on the left door mirror 18 to acquire image information corresponding to an image of the left side of the own vehicle 10. The right-side camera CR is mounted on the right-side door mirror 18 to acquire image information corresponding to an image of the right side of the own vehicle 10.

The plurality of imaging units 22 are electrically connected to the control unit 26. That is, each of the plurality of imaging units 22 acquires image information under the control of the control unit 26, and transmits the acquired image information to the control unit 26.

The vehicle speed sensor 23, the shift position sensor 24, and the steering angle sensor 25 are electrically connected to the control unit 26. The vehicle speed sensor 23 is provided to generate a signal corresponding to the traveling speed of the host vehicle 10 and send the signal to the control unit 26. The running speed of the host vehicle 10 will be hereinafter simply referred to as "vehicle speed". The shift position sensor 24 is provided to generate a signal corresponding to a shift position of the host vehicle 10 and send it to the control portion 26. The steering angle sensor 25 is provided to generate a signal corresponding to a steering angle of the host vehicle 10 and send the signal to the control unit 26.

The control unit 26 is disposed inside the vehicle body 11. The control unit 26 is a so-called in-vehicle microcomputer, and includes a CPU, a ROM, a RAM, a nonvolatile RAM, and the like, which are not shown. The nonvolatile RAM is, for example, a flash ROM or the like. Hereinafter, the CPU, ROM, RAM, and nonvolatile RAM of the control unit 26 are simply referred to as "CPU", "ROM", "RAM", and "nonvolatile RAM".

The control unit 26 is configured to be able to realize various control operations by having the CPU read and execute programs from the ROM or the nonvolatile RAM. The programs include those corresponding to the routines described later. Various data used when executing the programs are stored in advance in the ROM or the nonvolatile RAM. The various data include, for example, initial values, look-up tables, maps, and the like.

The control unit 26 is configured to execute the obstacle detection operation based on the signals and information received from each of the plurality of distance measuring sensors 21, each of the plurality of imaging units 22, the vehicle speed sensor 23, the shift position sensor 24, the steering angle sensor 25, and the like. The display 27 is disposed in the cabin of the host vehicle 10. The display 27 is electrically connected to the control unit 26, and displays information on the obstacle detection operation under the control of the control unit 26.

(first embodiment)

Next, the functional block configurations of the obstacle detection device 20 and the control unit 26 in the first embodiment will be described with reference to fig. 2 in addition to fig. 1. The control unit 26 is configured to detect the obstacle B based on the reception result of the received wave by the distance measuring sensor 21, the imaging result of the image by the imaging unit 22, and various signals received from various sensors such as the vehicle speed sensor 23. Specifically, as shown in fig. 2, the control unit 26 includes a vehicle state acquisition unit 260, a position acquisition unit 261, a shape recognition unit 262, and a detection processing unit 263 as functional components.

The vehicle state acquisition unit 260 is provided to acquire running state information corresponding to the running state of the host vehicle 10 by receiving various signals from the vehicle speed sensor 23, the shift position sensor 24, the steering angle sensor 25, and the like shown in fig. 1. The running state information includes the vehicle speed, the steering angle, the shift position, and the like. The running state information also covers the case where the host vehicle 10 is stopped, that is, where the vehicle speed is 0 km/h. In the present embodiment, the vehicle state acquisition unit 260 is an interface provided between various sensors such as the vehicle speed sensor 23 and the CPU, and transmits to the CPU the various signals received from the various sensors such as the vehicle speed sensor 23, or signals obtained by subjecting those signals to predetermined processing. In fig. 2, various sensors such as the vehicle speed sensor 23 are not shown for simplicity of illustration.

When the obstacle detection device 20 detects an obstacle B located in front of the host vehicle 10, any two of the first front sonar SF1, the second front sonar SF2, the third front sonar SF3, and the fourth front sonar SF4 that are adjacent to each other are set as the first distance measuring sensor and the second distance measuring sensor. In contrast, when the obstacle detection device 20 detects the obstacle B located behind the host vehicle 10, any two of the first rear sonar SR1, the second rear sonar SR2, the third rear sonar SR3, and the fourth rear sonar SR4 that are adjacent to each other are used as the first distance measuring sensor and the second distance measuring sensor, respectively.

The position acquisition unit 261 is configured to acquire relative position information corresponding to the positional relationship between the host vehicle 10 and the obstacle B by triangulation based on the positions of the first distance measuring sensor and the second distance measuring sensor when the first distance measuring sensor and the second distance measuring sensor receive, as a received wave, a reflected wave of the probe wave transmitted by the first distance measuring sensor reflected by the obstacle B. That is, the position acquisition unit 261 acquires relative position information based on the output of each of the plurality of distance measuring sensors 21.

The relative position information is information corresponding to the relative position of the obstacle B with respect to the host vehicle 10, acquired based on the received wave at each of the plurality of distance measuring sensors 21. The relative position information includes distance information and orientation information. The distance information is information corresponding to the distance of the obstacle B from the host vehicle 10. The orientation information is information corresponding to the orientation of the obstacle B with respect to the host vehicle 10, that is, the angle formed by the vehicle center axis VL and a line segment directed from the host vehicle 10 toward the obstacle B.

The shape recognition unit 262 is provided to perform shape recognition of the obstacle B based on the image information acquired by the imaging unit 22 and the traveling state information acquired by the vehicle state acquisition unit 260. Specifically, in the present embodiment, the shape recognition unit 262 is configured to recognize the three-dimensional shape of the obstacle B by acquiring the three-dimensional positions of a plurality of feature points in the image information based on a plurality of pieces of image information acquired in time series as the host vehicle 10 moves. That is, the shape recognition unit 262 three-dimensionally recognizes characteristic shapes of objects and the like in the images based on a plurality of images sequentially captured by the imaging unit 22 while the host vehicle 10 is moving.

The characteristic shapes include straight edges, such as horizontal edges and vertical edges. A "straight edge" is a pixel column that continues for a predetermined length or more along the contour of an object or the like in an image. A "horizontal edge" is a straight edge in the image that is parallel to the horizon line. A "vertical edge" is a straight edge in the image that is parallel to a vertical line. The "contour of an object or the like" includes not only the contour of the obstacle B but also contours in road markings such as dividing lines.

Specifically, the shape recognition unit 262 is configured to be able to recognize the characteristic shapes three-dimensionally by a so-called mobile stereo technique or an SFM technique. SFM is an abbreviation of Structure From Motion. The mobile stereo technique and the SFM technique were publicly known at the time of filing of the present application. Therefore, detailed description of the mobile stereo technique and the SFM technique is omitted in the present specification.
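As an illustration only (not the device's actual implementation), the depth-from-motion principle behind mobile stereo can be sketched with a simplified two-frame pinhole model. The function name, the purely lateral camera motion, and all numerical values below are assumptions for the sketch; real SFM handles general vehicle motion and many feature points:

```python
def feature_height(u1, u2, v1, baseline, f, cam_h):
    """Height of a tracked feature point above the road, from two frames.

    Simplified pinhole model: the camera translates sideways by `baseline`
    metres (known from vehicle odometry) between frames; image y grows downward.
    u1, u2 : horizontal pixel coordinates of the feature in frames 1 and 2
    v1     : vertical pixel coordinate of the feature in frame 1
    f      : focal length in pixels
    cam_h  : camera mounting height above the road (m)
    """
    disparity = u1 - u2              # pixel shift caused by the camera motion
    z = f * baseline / disparity     # depth of the feature point (m)
    y_below_axis = v1 * z / f        # vertical offset below the optical axis (m)
    return cam_h - y_below_axis      # protrusion height above the road (m)


# A 5 cm-high feature 4 m ahead, seen by a camera mounted at 1.2 m:
h = feature_height(u1=100.0, u2=80.0, v1=230.0, baseline=0.1, f=800.0, cam_h=1.2)
```

With these values the disparity of 20 px gives a depth of 4 m, and the recovered height is 0.05 m, which is exactly the kind of quantity compared against the predetermined height later in this embodiment.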

The detection processing unit 263 is provided to detect the obstacle B based on the relative position information acquired by the position acquisition unit 261 and the shape recognition result of the shape recognition unit 262. Specifically, in the present embodiment, the detection processing unit 263 is configured to discard the relative position information corresponding to the obstacle B when the height dimension of the obstacle B in the shape recognition result of the shape recognition unit 262 is smaller than a predetermined height.

(action summary)

The following describes an outline of the operation of the obstacle detecting device 20, i.e., the control unit 26, with reference to fig. 1 to 6. In the following description of the operation, in order to avoid complication of illustration and description, the vehicle 10 is assumed to travel straight ahead, and illustration of each part is appropriately omitted.

Figs. 3, 4A, and 4B show how the host vehicle 10 detects an obstacle B present in front. As shown in fig. 3, the obstacle detection device 20 detects an obstacle B existing ahead using the first front sonar SF1, the second front sonar SF2, the third front sonar SF3, and the fourth front sonar SF4. In addition, the obstacle detection device 20 recognizes the three-dimensional shape of the obstacle B existing in front using the front camera CF.

When the host vehicle 10 moves backward, the obstacle detection device 20 detects an obstacle B existing behind using the first rear sonar SR1, the second rear sonar SR2, the third rear sonar SR3, and the fourth rear sonar SR4. Further, the obstacle detection device 20 recognizes the three-dimensional shape of the obstacle B existing behind using the rear camera CB. However, the obstacle detection operation during backward movement is basically the same as that during forward movement. Therefore, the following describes an outline of the operation of the obstacle detection device 20, taking the obstacle detection operation during forward travel as an example.

Fig. 3 shows a case where the obstacle B is positioned between the third front sonar SF3 and the fourth front sonar SF4 in the vehicle width direction. In this case, the relative position of the obstacle B with respect to the host vehicle 10 is acquired by the third front sonar SF3 and the fourth front sonar SF4 receiving a reflected wave of the probe wave WS, transmitted from the third front sonar SF3 or the fourth front sonar SF4 and reflected by the wall surface BW of the obstacle B. Hereinafter, the outline of the operation will be described assuming a case where the probe wave WS is transmitted from the third front sonar SF3, the received wave WR1 corresponding to the probe wave WS is received by the third front sonar SF3, and the received wave WR2 corresponding to the probe wave WS is received by the fourth front sonar SF4.

The received wave WR1, which is the direct wave at the third front sonar SF3, is received by the third front sonar SF3 when the probe wave WS transmitted from the third front sonar SF3 is reflected by the wall surface BW of the obstacle B. On the other hand, the received wave WR2, which is the indirect wave at the fourth front sonar SF4, is received by the fourth front sonar SF4 when the probe wave WS transmitted from the third front sonar SF3 is reflected by the wall surface BW of the obstacle B.

Let T1 be the time required from the transmission of the probe wave WS by the third front sonar SF3 to the reception of the received wave WR1. Let T2 be the time required from the transmission of the probe wave WS by the third front sonar SF3 to the reception of the received wave WR2 by the fourth front sonar SF4. Let c be the speed of sound. In this case, if D1 is the distance from the third front sonar SF3 to the wall surface BW of the obstacle B along the propagation direction of the received wave WR1, then D1 = 0.5 × T1 × c. Further, if D2 is the distance from the fourth front sonar SF4 to the wall surface BW of the obstacle B along the propagation direction of the received wave WR2, then D2 = (T2 − 0.5 × T1) × c.
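The two timing relations above can be written directly as code. This is a minimal sketch; the function name and variable names are our own, and the speed of sound is an illustrative assumed value:

```python
def propagation_distances(t1, t2, c=340.0):
    """One-way propagation distances D1 and D2 from the measured times.

    t1 : time from transmission of WS to reception of WR1 at SF3 (s)
    t2 : time from transmission of WS to reception of WR2 at SF4 (s)
    c  : speed of sound (m/s); 340 m/s is a typical assumed value
    """
    d1 = 0.5 * t1 * c          # direct wave: half the round-trip time
    d2 = (t2 - 0.5 * t1) * c   # indirect wave: total time minus the outbound leg
    return d1, d2


# Example with D1 = 1.7 m and D2 = 2.0 m:
t1 = 2 * 1.7 / 340.0           # direct-wave round trip
t2 = 0.5 * t1 + 2.0 / 340.0    # outbound leg plus the indirect return leg
d1, d2 = propagation_distances(t1, t2)
```

Running the example recovers D1 = 1.7 m and D2 = 2.0 m, confirming that the formulas invert the timing model.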

If the point on the wall surface BW of the obstacle B at which the probe wave WS is estimated to have been reflected is called the "detection point P", then D1 is the distance from the third front sonar SF3 to the detection point P, and D2 is the distance from the fourth front sonar SF4 to the detection point P. The horizontal positions of the third front sonar SF3 and the fourth front sonar SF4 on the host vehicle 10 are fixed. Thus, the relative position of the detection point P with respect to the host vehicle 10 is acquired by triangulation using the horizontal positions of the third front sonar SF3 and the fourth front sonar SF4 and the calculated distances D1 and D2.
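The triangulation step amounts to intersecting two circles: each distance constrains the detection point P to a circle centered on the respective sonar. A minimal sketch (function name, coordinate conventions, and the assumption that P lies on the obstacle side of the sensor baseline are ours):

```python
import math

def triangulate(sensor_a, sensor_b, d1, d2):
    """Detection point P from two sonar positions and the distances D1, D2.

    sensor_a, sensor_b : (x, y) horizontal mounting positions of the sonars
    d1, d2             : distances from each sonar to P
    Assumes P lies to the left of the baseline running from sensor_a to sensor_b.
    """
    ax, ay = sensor_a
    bx, by = sensor_b
    s = math.hypot(bx - ax, by - ay)            # sonar spacing
    a = (d1 ** 2 - d2 ** 2 + s ** 2) / (2 * s)  # offset of P along the baseline
    h = math.sqrt(max(d1 ** 2 - a ** 2, 0.0))   # offset of P normal to the baseline
    ux, uy = (bx - ax) / s, (by - ay) / s       # unit vector along the baseline
    return (ax + a * ux - h * uy, ay + a * uy + h * ux)


# Sonars 0.4 m apart on the bumper, obstacle 1 m ahead of their midpoint:
p = triangulate((0.0, 0.0), (0.4, 0.0), 1.04 ** 0.5, 1.04 ** 0.5)
```

With equal distances the point lands on the perpendicular bisector of the baseline, here at (0.2, 1.0), i.e. 1 m directly ahead of the sensor pair.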

The travelable distance DC of the host vehicle 10 in the forward direction is the horizontal distance, in the traveling direction of the host vehicle 10, from the front side surface V1 to the detection point P. As shown in fig. 3, when the host vehicle 10 is traveling straight ahead, the travelable distance DC is the distance in the front-rear direction from the front side surface V1 to the detection point P. Moreover, the travelable distance DC is smallest when the host vehicle 10 is traveling straight. Therefore, from the viewpoint of reducing the processing load and the like, no problem arises even if the travelable distance DC is set to the distance in the front-rear direction from the front side surface V1 to the detection point P regardless of the steering angle.

Fig. 4A shows a state in which the host vehicle 10 travels toward an obstacle B having a large height dimension. Fig. 4B shows a state in which the host vehicle 10 travels toward an obstacle B having a small height dimension. An obstacle B having a large height dimension as shown in fig. 4A is, for example, a wall. An obstacle B having a small height dimension as shown in fig. 4B, that is, an obstacle B whose projection height from the road surface RS is small, is, for example, a step or a curb.

The height dimension of the obstacle B in the present embodiment corresponds to the projection height of the obstacle B from the road surface RS, that is, the projection length of the obstacle B from the road surface RS in the vehicle height direction. The height dimension of the obstacle B may also be referred to as the distance between the base end portion and the tip end portion of the obstacle B in the vehicle height direction. In the examples of figs. 4A and 4B, the base end corresponds to the lower end, and the tip end corresponds to the upper end.

In figs. 4A and 4B, the arrow indicating the travelable distance DC represents the horizontal distance between the host vehicle 10 and the obstacle B, that is, the shortest distance between the host vehicle 10 and the obstacle B in plan view. The direction of the travelable distance DC is defined to be parallel to the road surface RS. Note that, depending on the inclination of the road surface RS, the vehicle height direction, i.e., the vertical direction of the vehicle, may not be parallel to the direction in which gravity acts.

The distance measuring sensor 21 is mounted on the vehicle body 11. The vehicle body 11 is located above the road surface RS. Therefore, the mounting height of the distance measuring sensor 21, that is, the position of the distance measuring sensor 21 in the vehicle height direction, corresponds to the distance in the vehicle height direction from the road surface RS to the distance measuring sensor 21.

Hereinafter, the mounting height of the distance measuring sensor 21 is referred to as the "sensor mounting height". The sensor mounting height is a predetermined value corresponding to the distance from the vehicle body 11 to the road surface RS and to the mounting position of the distance measuring sensor 21 on the vehicle body 11. Specifically, the sensor mounting height is the height of the mounting position of the distance measuring sensor 21 from the road surface RS when the host vehicle 10 is placed on a road surface RS parallel to the horizontal plane.

As shown in fig. 4A, when the height dimension is larger than the sensor mounting height, the wall surface BW of the obstacle B is present at the same height as the distance measuring sensor 21. Therefore, the received wave WR that reaches the distance measuring sensor 21 propagates substantially horizontally. Accordingly, in this case, the distance information of the obstacle B acquired using the distance measuring sensor 21 is basically accurate information corresponding to the actual horizontal distance between the host vehicle 10 and the obstacle B, that is, the travelable distance DC.

On the other hand, as shown in fig. 4B, when the height dimension is smaller than the sensor mounting height, the upper end of the obstacle B is located lower than the distance measuring sensor 21. That is, the wall surface BW of the obstacle B is not present at the same height as the distance measuring sensor 21. In this case, the received wave WR reaching the distance measuring sensor 21 travels obliquely upward from the obstacle B toward the distance measuring sensor 21. Therefore, when the height dimension of the obstacle B is smaller than the sensor mounting height, the distance information of the obstacle B acquired using the distance measuring sensor 21 becomes inaccurate information containing a large error.
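The size of that error can be illustrated with simple geometry: the sonar actually measures the slant path to the reflection point, not the horizontal distance. A hedged sketch, under the simplifying assumption that the echo comes from a single point at height `reflect_h` (real echoes are more complex):

```python
import math

def apparent_distance(true_dc, sensor_h, reflect_h):
    """Range a sonar reports when the reflection point sits below its mounting height.

    true_dc   : actual horizontal distance to the obstacle (m)
    sensor_h  : sensor mounting height above the road (m)
    reflect_h : height of the reflection point on the obstacle (m)
    """
    if reflect_h >= sensor_h:
        return true_dc                          # echo path is horizontal
    drop = sensor_h - reflect_h
    return math.sqrt(true_dc ** 2 + drop ** 2)  # oblique echo path, > true_dc


# Sensor at 0.5 m, reflection near the road surface, true distance 0.3 m:
d = apparent_distance(0.3, 0.5, 0.0)
```

Here the reported range is about 0.58 m, nearly double the true 0.3 m, which shows why the distance information becomes unreliable at close range for low obstacles.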

Further, as described above, an obstacle B whose height dimension is smaller than the sensor mounting height may protrude so little that the host vehicle 10 can pass directly over it. Examples of such objects include a relatively low step of about 5 cm, a manhole cover, and the like. Since such an obstacle B does not hinder the traveling of the host vehicle 10 at all, there is little need to recognize it as an "obstacle" in the driving assistance operation.

Therefore, in the present embodiment, when the height dimension of the obstacle B in the shape recognition result using the front camera CF is equal to or larger than the predetermined height, the obstacle detection device 20 validates the relative position information corresponding to the obstacle B and stores the validated relative position information in the nonvolatile RAM. On the other hand, when the height dimension of the obstacle B in the shape recognition result using the front camera CF is smaller than the predetermined height, the obstacle detection device 20 invalidates the relative position information corresponding to the obstacle B and discards it.

This makes it possible to suppress, as much as possible, unnecessary reporting operations and the like caused by recognizing as an obstacle B an object whose projection height is so low that it does not obstruct the travel of the host vehicle 10 and can be passed over directly. The "predetermined height" for suppressing such erroneous recognition may be set to about 5 to 10 cm, for example.

(example of operation)

A specific operation example corresponding to the above outline of the operation of the present embodiment will be described below with reference to flowcharts. In the drawings and the following description, "step" is abbreviated simply as "S".

Fig. 5 is a flowchart showing an example of the shape recognition operation of the obstacle B based on the image information acquired by the imaging unit 22. The image recognition routine shown in fig. 5 corresponds to the operation of the shape recognition unit 262. This image recognition routine is similarly executed in the second to fourth embodiments to be described later. The image recognition routine is started by the CPU at predetermined time intervals after a predetermined start condition is satisfied.

When the image recognition routine shown in fig. 5 is started, first, in S501, the CPU acquires image information from the imaging unit 22. In addition, the CPU stores the acquired image information in time series to the nonvolatile RAM.

Next, in S502, the CPU executes the image recognition operation of the shape recognition unit 262 using the mobile stereo technique or the SFM technique. Thereby, the three-dimensional shapes of objects and the like in the images are recognized. Specifically, for example, the height of the obstacle B can be recognized. Next, in S503, the CPU stores the image recognition result of the shape recognition unit 262 in the nonvolatile RAM, and temporarily ends the present routine.

Fig. 6 is a flowchart showing an example of the detection operation of the obstacle B based on the relative position information acquired by the two adjacent distance measuring sensors 21 and the image information acquired by the imaging unit 22. The obstacle detection routine shown in fig. 6 corresponds to the operation of the position acquisition unit 261 and the detection processing unit 263. The obstacle detection routine is similarly executed in the second embodiment and the third embodiment described later. The obstacle detection routine is started by the CPU at predetermined time intervals after a predetermined start condition is satisfied.

When the obstacle detection routine shown in fig. 6 is started, first, in S601, the CPU selects two adjacent distance measuring sensors 21 and acquires reception information from the two selected distance measuring sensors 21. In the above example, the two adjacent distance measuring sensors 21 are the third front sonar SF3 and the fourth front sonar SF4. That is, in S601, the probe wave is transmitted from the third front sonar SF3, and received waves are received by the third front sonar SF3 and the fourth front sonar SF4.

Next, in S602, the CPU determines whether or not the intensities of the received waves at the two adjacent distance measuring sensors 21 are both equal to or greater than a predetermined threshold value. If the condition that the intensities of the received waves at the two adjacent distance measuring sensors 21 are both equal to or greater than the predetermined threshold value is not satisfied (i.e., NO in S602), the above-described triangulation cannot be performed. Thus, in this case, the CPU skips all the processing of S603 and thereafter, and temporarily ends the present routine.

Hereinafter, the description of the present routine is continued assuming that the condition that the intensities of the received waves at the two adjacent distance measuring sensors 21 are both equal to or greater than the predetermined threshold value is satisfied (i.e., YES in S602). In this case, the CPU advances the process to S603 and thereafter.

In S603, the CPU acquires the relative position information of the obstacle B based on the acquired reception information. In the above example, in S603, the CPU acquires the detection point P corresponding to the obstacle B. Next, in S604, the CPU acquires the distance to the obstacle B. In the above example, in S604, the CPU acquires the travelable distance DC. The relative position information and the travelable distance DC acquired in S603 and S604 are temporarily stored in the nonvolatile RAM.

Next, in S605, the CPU acquires the height H of the obstacle B corresponding to the received wave having an intensity equal to or higher than the threshold value, based on the image recognition result stored in the nonvolatile RAM. In S606, the CPU determines whether or not the height H acquired in S605 is smaller than a predetermined height Hth1. The predetermined height Hth1 is, for example, 5 cm.

When the height H is smaller than the predetermined height Hth1 (i.e., YES in S606), the CPU advances the process to S607, and then temporarily ends the present routine. In S607, the CPU invalidates and discards the relative position information and the travelable distance DC acquired this time in S603 and S604. That is, the CPU deletes from the nonvolatile RAM the records of the relative position information and the travelable distance DC acquired this time in S603 and S604.

On the other hand, when the height H is equal to or greater than the predetermined height Hth1 (i.e., NO in S606), the CPU skips the process of S607 and temporarily ends the present routine. In this case, the relative position information and the travelable distance DC of the obstacle B, which corresponds to a received wave with an intensity equal to or higher than the threshold value and has a height dimension equal to or greater than the predetermined height Hth1, are used for the driving assistance operation of the host vehicle 10.
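The decision flow of S601 to S607 can be condensed into a small function. This is a simplified sketch only: signal acquisition and triangulation are replaced by parameters, and the threshold values are illustrative, not the device's actual values:

```python
def detection_cycle(intensity_a, intensity_b, rel_pos, travel_dist,
                    obstacle_height, intensity_th=1.0, height_th=0.05):
    """One pass of the obstacle detection routine, reduced to its decisions.

    Returns the record to keep for the driving assistance operation, or
    None when the cycle discards its measurements.
    """
    # S602: both adjacent sonars need a sufficiently strong echo,
    # otherwise triangulation cannot be performed.
    if intensity_a < intensity_th or intensity_b < intensity_th:
        return None
    # S605/S606/S607: a recognized height below Hth1 means the object can
    # be passed over directly, so the fix is invalidated and discarded.
    if obstacle_height < height_th:
        return None
    # S603/S604: the relative position and travelable distance are retained.
    return {"relative_position": rel_pos, "travelable_distance": travel_dist}


kept = detection_cycle(2.0, 2.0, (0.2, 1.0), 0.8, obstacle_height=0.30)
dropped = detection_cycle(2.0, 2.0, (0.2, 1.0), 0.8, obstacle_height=0.03)
```

A 30 cm obstacle is retained while a 3 cm step is discarded, mirroring the branch at S606.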

(second embodiment)

The obstacle detecting device 20 of the second embodiment will be explained below. In the following description of the second embodiment, differences from the first embodiment will be mainly described. In the second embodiment and the first embodiment described above, the same reference numerals are given to the same or equivalent portions. Therefore, in the following description of the second embodiment, with respect to the constituent elements having the same reference numerals as those of the above-described first embodiment, the description of the above-described first embodiment can be appropriately referred to unless technically contradictory or otherwise specifically added. The same applies to a third embodiment and the like described later.

The configuration of the present embodiment is the same as that of the first embodiment described above. The present embodiment relates to the detection operation of the obstacle B using the first side sonar SS1, the second side sonar SS2, the third side sonar SS3, and the fourth side sonar SS4.

Referring to fig. 7, the first side sonar SS1, the second side sonar SS2, the third side sonar SS3, and the fourth side sonar SS4 output signals corresponding to the distance to an obstacle B located on the side of the host vehicle 10. The left side camera CL and the right side camera CR acquire image information corresponding to images of the sides of the host vehicle 10. When the obstacle detection device 20 is used for the parking assist operation, these sonars and cameras are used for parking space detection and the like.

As described above, each of the first side sonar SS1, the second side sonar SS2, the third side sonar SS3, and the fourth side sonar SS4 can detect the distance to a facing obstacle B by a direct wave. The obstacle detection device 20 can recognize the shape of an obstacle B located on the side of the host vehicle 10 using the first side sonar SS1, the second side sonar SS2, the third side sonar SS3, and the fourth side sonar SS4.

Fig. 7 illustrates a case where an obstacle B exists on the right of the second side sonar SS2 and the right side camera CR. An outline of the detection operation for an obstacle B located on the right side of the host vehicle 10 is described below with reference to the example of fig. 7.

As shown in fig. 7, the second side sonar SS2 outputs a signal corresponding to the distance to the obstacle B by receiving, as the received wave WR, a reflected wave of the probe wave WS transmitted by itself and reflected by the obstacle B. The obstacle detection device 20 repeatedly acquires the distance DD to the obstacle B, based on the received wave WR that the second side sonar SS2 repeatedly receives at predetermined time intervals, while the host vehicle 10 is traveling. The predetermined time is, for example, several hundred milliseconds. The obstacle detection device 20 acquires the sonar positions, that is, the positions of the second side sonar SS2 corresponding to each of the plurality of distances DD, based on the traveling state information of the host vehicle 10 and the transmission timing of the probe wave WS or the reception timing of the received wave WR.

The obstacle detection device 20 can roughly estimate the outline shape of the obstacle B in a plan view based on the plurality of distances DD acquired as described above and the sonar positions corresponding to the respective distances DD. For example, the obstacle detection device 20 treats the plurality of distances DD as a point sequence on two-dimensional coordinates with the sonar position on the horizontal axis and the distance DD on the vertical axis. The obstacle detection device 20 estimates the reflection points PR corresponding to the respective distances DD by performing predetermined triangulation-based processing on this point sequence.

The reflection point PR is a position on the obstacle B at which the received wave WR is estimated to have been reflected. That is, the reflection point PR is an estimated position on the obstacle B corresponding to the distance DD obtained from a single reception of the received wave WR. The outline shape of the obstacle B in a plan view is roughly estimated from a point sequence including a plurality of reflection points PR. The reflection point PR is a point estimated to lie on the wall surface BW of the obstacle B facing the host vehicle 10, and corresponds to the relative position information of the obstacle B.

Estimating the external shape of the obstacle B in a plan view using direct waves as described above was already known as of the filing of the present application. For example, see the specifications of U.S. Patent No. 7739046, U.S. Patent No. 7843767, U.S. Patent No. 8130120, and the like.

The obstacle detection device 20 can acquire the reflection point PR by triangulation based on the sonar positions and distances DD of the second side sonar SS2 acquired at different times while the host vehicle 10 is traveling. Fig. 8 shows an outline of an example of acquiring such a reflection point PR.

That is, referring to fig. 8, the position of the second side sonar SS2 indicated by the solid line is its position at the time of the current reception of the received wave WR. The position of the second side sonar SS2 indicated by the broken line is its position at the time of the previous reception of the received wave WR. The current reception is taken as the Nth, and the previous one as the (N-1)th. The distance DD acquired the previous time is DD(N-1), and the distance DD acquired this time is DD(N).

The time interval between the acquisition of the previous distance DD(N-1) and the acquisition of the current distance DD(N) is sufficiently small, as described above. Therefore, the position on the wall surface BW that reflected the probe wave corresponding to the distance DD(N-1) can be assumed to be the same as the position that reflected the probe wave corresponding to the distance DD(N). The obstacle detection device 20 therefore acquires, as the reflection point PR, an intersection of a first circle centered on the position of the second side sonar SS2 at the time the distance DD(N-1) was acquired, with radius DD(N-1), and a second circle centered on the position of the second side sonar SS2 at the time the distance DD(N) was acquired, with radius DD(N).
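As a hedged illustration, the two-circle intersection described above can be sketched in Python. The function name, the coordinate convention (x along the travel direction, y toward the obstacle side), and the choice of the intersection on the positive-y side are assumptions of this sketch, not part of the embodiment.

```python
import math

def reflection_point(p_prev, d_prev, p_curr, d_curr):
    """Estimate the reflection point PR as an intersection of two circles:
    one centered on the previous sonar position p_prev with radius d_prev
    (distance DD(N-1)), one centered on the current sonar position p_curr
    with radius d_curr (distance DD(N))."""
    (x1, y1), (x2, y2) = p_prev, p_curr
    dx, dy = x2 - x1, y2 - y1
    d = math.hypot(dx, dy)  # distance the sonar moved between receptions
    if d == 0 or d > d_prev + d_curr or d < abs(d_prev - d_curr):
        return None  # circles do not intersect: no consistent wall point
    # distance from p_prev to the midpoint of the chord of intersection
    a = (d_prev ** 2 - d_curr ** 2 + d ** 2) / (2 * d)
    h = math.sqrt(max(d_prev ** 2 - a ** 2, 0.0))
    mx, my = x1 + a * dx / d, y1 + a * dy / d
    # two candidate intersections; pick the one on the obstacle side
    # (assumed here to be the positive-y side of the sonar track)
    return (mx - h * dy / d, my + h * dx / d)
```

For a sonar moving along the x-axis from (0, 0) to (1, 0) facing a wall point at (0.5, 2), both measured distances equal the slant range to that point, and the function recovers (0.5, 2.0).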

As described above, the obstacle detection device 20 can acquire the relative position information and the rough plan-view shape of an obstacle B located on the side of the host vehicle 10 by using the first side sonar SS1, the second side sonar SS2, the third side sonar SS3, and the fourth side sonar SS4. However, the height of the obstacle B cannot be obtained in detail by these sonars alone.

On the other hand, the obstacle detection device 20 can acquire the height of the obstacle B using the left camera CL and the right camera CR. Specifically, as shown in fig. 7, when the obstacle B exists on the right side of the host vehicle 10, the obstacle detection device 20 can acquire the height of the obstacle B using the right camera CR. That is, for example, the obstacle detection device 20 can recognize the height of the obstacle B by image processing techniques such as the above-described moving stereo technique or SFM technique.

Fig. 7 shows a situation in which the obstacle detection device 20 searches for a parking space on the right side of the host vehicle 10. In this situation, the obstacle B may be an object whose protruding height is so low that the host vehicle 10 can pass directly over it. Examples of such objects include a step of about 5 cm, a manhole cover, and the like.

In this case, the obstacle B does not actually become an obstacle during the parking assist operation. That is, the area including the obstacle B can be set as a parking space. Even if the obstacle B exists on the parking path to the parking space, it does not interfere with the parking operation. Therefore, it is not necessary to retain the relative position information corresponding to the obstacle B.

Therefore, when the height dimension of the obstacle B in the shape recognition result using the left camera CL and the right camera CR is equal to or larger than the predetermined height, the obstacle detection device 20 validates the relative position information corresponding to the obstacle B and stores the validated relative position information in the nonvolatile RAM. On the other hand, when the height of the obstacle B in the shape recognition result using the left camera CL and the right camera CR is smaller than the predetermined height, the obstacle detection device 20 invalidates the relative position information corresponding to the obstacle B and discards it. According to the present embodiment, it is possible to realize a more appropriate parking assist operation and to reduce both the calculation load on the CPU and the storage consumption in the nonvolatile RAM.
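The validate-or-discard step above can be sketched as follows. The record layout and the 0.15 m threshold are illustrative assumptions; the embodiment only requires some predetermined height.

```python
def validate_relative_positions(records, hth_m=0.15):
    """Keep only sonar-derived position records whose image-recognized
    obstacle height is at least hth_m; the rest are discarded instead of
    being stored in the nonvolatile RAM (sketch; record layout assumed)."""
    return [r for r in records if r["height_m"] >= hth_m]

records = [
    {"pos": (1.2, 0.8), "height_m": 0.05},  # e.g. manhole cover: discarded
    {"pos": (2.0, 0.6), "height_m": 0.60},  # e.g. wall: kept
]
kept = validate_relative_positions(records)
```

Only the second record survives the filter; the low object is dropped before it can affect parking space selection.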

(third embodiment)

The obstacle detecting device 20 of the third embodiment will be explained below. In the following description of the third embodiment, differences from the first embodiment will be mainly described.

The structure of the present embodiment is the same as that of the first embodiment described above. As shown in fig. 9, the present embodiment corresponds to the detection operation for an obstacle B when the host vehicle 10 travels while approaching a wall-shaped obstacle B that stands inclined with respect to the vehicle center axis VL. The obstacle B in this case is hereinafter referred to as an "inclined wall".

In the example of fig. 9, for simplicity of explanation, it is assumed that the host vehicle 10 is traveling straight ahead, and the obstacle B as an inclined wall is present in the right front of the host vehicle 10. In the figure, detectable ranges of the second front sonar SF2 and the fourth front sonar SF4 are indicated by two-dot chain lines.

In the example of fig. 9, the object center axis BL in the inclined wall intersects the vehicle center axis VL. The object center axis BL is a center axis of the obstacle B along the vehicle traveling direction in a plan view. In this example, the object center axis BL is parallel to a wall surface BW of the obstacle B facing the host vehicle 10 in a plan view.

As shown in fig. 9, there may be a case where the angle formed by the object center axis BL and the vehicle center axis VL is small and the obstacle B as an inclined wall exists only in the detectable range of the second front sonar SF2. In this case, the direct wave at the second front sonar SF2 can be received, but the indirect waves between the second front sonar SF2 and the fourth front sonar SF4 cannot be received. That is, in this case, triangulation using the second front sonar SF2 and the fourth front sonar SF4 does not hold.

In the example shown in fig. 9, the relative position information corresponding to the obstacle B is acquired based on the direct wave at the second front sonar SF2. The direct wave is a received wave WR received by the second front sonar SF2, caused by the reflected wave of the probe wave WS transmitted from the second front sonar SF2 and reflected by the obstacle B.

Specifically, for example, the obstacle detection device 20 can estimate the rightmost position in a plan view within the detectable range of the second front sonar SF2 as the detection point P. Alternatively, for example, the obstacle detection device 20 can estimate the position on the central axis of the probe wave WS as the detection point P. Alternatively, for example, the obstacle detection device 20 can estimate the detection point P based on the positions and detected distances of the second front sonar SF2 at different times, as in the second embodiment described above.

Such relative position information is not acquired based on the first indirect wave, that is, the received wave received by the second front sonar SF2 and caused by the reflected wave of the probe wave transmitted from the fourth front sonar SF4 and reflected by the obstacle B. Nor is the relative position information acquired based on the second indirect wave, that is, the received wave received by the fourth front sonar SF4 and caused by the reflected wave of the probe wave transmitted from the second front sonar SF2 and reflected by the obstacle B. Therefore, such relative position information is hereinafter described as being "based only on the direct wave at the second front sonar SF2".

The detected distance to the wall surface BW of the obstacle B based only on the direct wave at the second front sonar SF2 may not, by itself, be usable for the driving assistance of the host vehicle 10. However, the relative position information of the end portion BE, which is ahead in the traveling direction on the obstacle B as an inclined wall, can be estimated based on the shape recognition result from the image information acquired by the front camera CF and on the detected distance based only on the direct wave at the second front sonar SF2. Therefore, even if the height dimension of the obstacle B in the shape recognition result based on the image information acquired by the imaging unit 22 is equal to or larger than the predetermined height, the obstacle detection device 20 recognizes that the obstacle B is an inclined wall when the detection point P is based only on the direct wave at the second front sonar SF2.

In the present embodiment, the second front sonar SF2 and the fourth front sonar SF4 are provided on the front surface portion 12, which is the surface on the traveling direction side of the host vehicle 10. The obstacle detection device 20, that is, the detection processing unit 263 shown in fig. 2, recognizes that the obstacle B is an inclined wall when the height dimension of the obstacle B in the shape recognition result using the front camera CF is equal to or larger than the predetermined height and the acquired relative position information is based only on the direct wave at the second front sonar SF2. The inclined wall has a wall surface BW that intersects the vehicle center axis VL of the host vehicle 10, and the wall surface BW may approach the host vehicle 10 as the host vehicle 10 travels.

When recognizing that the obstacle B is an inclined wall, the obstacle detection device 20 executes predetermined processing. The predetermined processing is, for example, processing that invalidates and discards the relative position information corresponding to the obstacle B, as in the first embodiment described above. Alternatively, the predetermined processing is, for example, processing that notifies the driver of the host vehicle 10 of the presence of an inclined wall ahead via the display 27 or the like. Alternatively, the predetermined processing is, for example, to search the shape recognition result based on the image information for a straight edge extending forward and passing near the detection point P, form an extension line from the detection point P along the straight edge, and estimate the relative position of a vertical edge intersecting the extension line as the relative position of the end portion BE.
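A minimal sketch of the recognition condition described in this embodiment. The function name, the `only_direct_wave` flag, the returned labels, and the 0.15 m threshold are assumptions for illustration only.

```python
def classify_obstacle(height_m, only_direct_wave, hth_m=0.15):
    """Label a detected obstacle following the third embodiment's logic:
    an obstacle at or above the predetermined height whose distance was
    obtained only from the direct wave of a single front corner sonar is
    treated as an inclined wall (triggering the predetermined processing);
    shorter obstacles have their relative position information discarded."""
    if height_m < hth_m:
        return "discard"        # first-embodiment handling: drop the data
    if only_direct_wave:
        return "inclined_wall"  # execute the predetermined processing
    return "obstacle"           # normal obstacle: keep relative position
```

For instance, a 1 m high wall detected only by the direct wave of SF2 is labeled an inclined wall, while the same wall confirmed by indirect waves is a normal obstacle.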

(example of operation)

Fig. 10 is a flowchart showing a specific operation example according to the present embodiment. The obstacle recognition routine shown in fig. 10 is started by the CPU at predetermined time intervals after a predetermined start condition is satisfied. As a premise for starting the obstacle recognition routine shown in fig. 10, it is assumed that the image recognition routine shown in fig. 5 and the obstacle detection routine shown in fig. 6 have already been executed.

In the present embodiment, the content of determination at S602 in the obstacle detection routine shown in fig. 6 is a determination as to whether or not the intensity of the received wave of any one of the two adjacent distance measuring sensors 21 selected is equal to or greater than a predetermined threshold value. That is, in the present embodiment, the processing of S603 and S604 is also executed when the direct wave of only one of the two adjacent distance measuring sensors 21 selected has an intensity equal to or higher than a predetermined threshold value. Therefore, in this case as well, the relative positional information of the obstacle B including the distance to the obstacle B is acquired based on the direct wave as described above.

When the obstacle recognition routine shown in fig. 10 is started, first, in S1001, the CPU determines whether the distance to the obstacle B has been validly acquired. That is, in S1001, the CPU determines whether, for the obstacle B whose relative position information was acquired, the height H is equal to or greater than the predetermined height Hth1 and the relative position information has been provisionally validated.

If the distance to the obstacle B has not been validly acquired (i.e., no in S1001), the CPU skips all the processing from S1002 onward and once ends the present routine. On the other hand, if the distance to the obstacle B has been validly acquired (i.e., yes in S1001), the CPU advances the processing to S1002 and beyond.

In S1002, the CPU determines whether the acquired distance is based only on the direct wave at the first front sonar SF1 or the second front sonar SF2. When the distance to the obstacle B is acquired based only on the direct wave at the second front sonar SF2, the obstacle B is an inclined wall located in the right front of the host vehicle 10, as shown in fig. 9. On the other hand, when the distance to the obstacle B is acquired based only on the direct wave at the first front sonar SF1, the obstacle B is an inclined wall located in the left front of the host vehicle 10.

If the acquired distance is based only on the direct wave (i.e., yes in S1002), the CPU advances the processing to S1003 and then once ends the present routine. In S1003, the CPU recognizes that the obstacle B detected this time is an inclined wall and executes the predetermined processing described above. On the other hand, if the acquired distance is based on indirect waves (i.e., no in S1002), the CPU skips the processing of S1003 and once ends the present routine.

(fourth embodiment)

Next, the functional block configuration of the obstacle detection device 20 and the control unit 26 according to the fourth embodiment will be described with reference to fig. 11. In the following description of the fourth embodiment, differences from the first embodiment will be mainly described. In the first and fourth embodiments, the structure of fig. 1 is common. Therefore, in the following description of the fourth embodiment, fig. 1 and 3 can be appropriately referred to.

As shown in fig. 1, the obstacle detecting device 20 of the present embodiment is also configured to be mounted on the vehicle 10 to detect an obstacle B present outside the vehicle 10. Referring to fig. 1, an obstacle detecting device 20 of the present embodiment includes a distance measuring sensor 21, an imaging unit 22, a vehicle speed sensor 23, a shift position sensor 24, a steering angle sensor 25, a control unit 26, and a display 27. The distance measuring sensor 21 and the imaging unit 22 are the same as those of the first embodiment.

The obstacle detection device 20 includes at least one distance measuring sensor 21. The control unit 26 is configured to detect the obstacle B based on the reception result of the received waves by the distance measuring sensor 21, the image captured by the imaging unit 22, and various signals received from various sensors such as the vehicle speed sensor 23. Specifically, as shown in fig. 11, the control unit 26 includes a vehicle state acquisition unit 260, a distance acquisition unit 264, a shape recognition unit 265, and a distance correction unit 266 as functional components.

The distance acquisition unit 264 is provided to acquire distance information corresponding to the distance from the host vehicle 10 to the obstacle B based on the output of the distance measuring sensor 21. Specifically, the distance acquisition unit 264 is configured to be able to acquire the distance to the obstacle B, as in the above-described embodiments.

The shape recognition unit 265 is configured to perform shape recognition of the obstacle B based on the image information acquired by the imaging unit 22. That is, the shape recognition unit 265 has a function of recognizing the three-dimensional shape of an object from a plurality of pieces of image information acquired in time series, similarly to the shape recognition unit 262 in the first embodiment described above.

The distance correction unit 266 is provided to correct the distance information corresponding to the obstacle B based on the sensor mounting height when the height of the obstacle B in the shape recognition result of the shape recognition unit 265 is smaller than a predetermined size. The "predetermined size" may be set to, for example, about 10 to 25 cm, as described later.

(action summary)

Fig. 12A shows a state in which the vehicle 10 travels toward an obstacle B having a large height, that is, an obstacle B having a sufficiently higher protruding height from the road surface RS than the mounting height of the distance measuring sensor 21.

The obstacle B having a large height dimension as shown in fig. 12A is, for example, a wall or the like. As shown in fig. 12A, when the height of the obstacle B is large and the wall surface BW of the obstacle B is present at the same height as the distance measurement sensor 21, the distance information of the obstacle B using the distance measurement sensor 21 is substantially accurate information corresponding to the actual horizontal distance between the host vehicle 10 and the obstacle B.

Fig. 12B and 12C show the state in which the height of the obstacle B in fig. 12A is lower than the sensor mounting height. The obstacle B having a small height dimension as shown in fig. 12B and 12C is, for example, a step, a bumper, a curb, or the like. Fig. 12C shows a state in which the host vehicle 10 is closer to the obstacle B than the state shown in fig. 12B.

As shown in fig. 12B, when the height dimension of the obstacle B is small and the wall surface BW of the obstacle B is not present at the same height as the distance measuring sensor 21, the distance information of the obstacle B obtained with the distance measuring sensor 21 may include a non-negligible error relative to the actual horizontal distance between the host vehicle 10 and the obstacle B. As is clear from a comparison between fig. 12B and fig. 12C, the smaller the actual horizontal distance between the obstacle B and the host vehicle 10, the larger the error in the distance information.

Therefore, in the present embodiment, the distance correcting unit 266 corrects the distance to the obstacle B acquired by the distance acquiring unit 264 when the height of the obstacle B in the shape recognition result of the shape recognizing unit 265 is smaller than the predetermined size. This enables more accurate recognition of the relative position of the obstacle B, which has a low height of protrusion from the road surface RS, with respect to the host vehicle 10. Examples of such an obstacle B include a bumper and a curb. Therefore, the "predetermined height" for correcting the distance information in the obstacle B may be set to, for example, about 10 to 25 cm.

Fig. 13A and 13B show an outline of the distance correction by the distance correction unit 266. In this example, it is assumed that the obstacle B is located between the third front sonar SF3 and the fourth front sonar SF4 in the vehicle width direction. The outline of the acquisition and correction of the detected distance will be described below with reference to the examples of fig. 13A and 13B.

The distance acquisition unit 264 acquires, by triangulation using the third front sonar SF3 and the fourth front sonar SF4, the horizontal distance to the obstacle B from the end surface of the host vehicle 10 on which the distance measuring sensors 21 facing the obstacle B are mounted. In this example, the end surface of the host vehicle 10 is the front side surface V1 of the front bumper 13. The acquired horizontal distance is the travelable distance DC.

As shown in fig. 12A, when the height of the upper end portion of the obstacle B is sufficiently higher than the sensor mounting heights of the third front sonar SF3 and the fourth front sonar SF4, the travelable distance DC acquired by the distance acquisition unit 264 can be assumed to be an accurate horizontal distance. On the other hand, as shown in fig. 13B, when the height of the upper end portion of the obstacle B is lower than the sensor mounting height, the travelable distance acquired by the distance acquisition unit 264 cannot be an accurate horizontal distance and instead becomes a distance DC0 along an oblique direction in a side view. This DC0 is hereinafter referred to as the "pre-correction distance".

The pre-correction distance DC0 corresponds to the hypotenuse of a right triangle whose base is the corrected travelable distance DC to be acquired and whose height is SH. SH is the distance in the vehicle height direction between the base end position of the obstacle B and the sensor mounting positions of the third front sonar SF3 and the fourth front sonar SF4. SH can also be regarded as the sensor mounting height. The distance correction unit 266 corrects the acquired horizontal distance, that is, the travelable distance DC, when the height of the obstacle B in the shape recognition result of the shape recognition unit 265 is smaller than the predetermined size. That is, the distance correction unit 266 calculates the corrected travelable distance DC by the equation DC = (DC0^2 - SH^2)^(1/2).
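The correction follows directly from the right-triangle relation. In this Python sketch, clamping physically inconsistent readings (DC0 less than or equal to SH) to zero is an assumption of the example, not something specified in the embodiment.

```python
import math

def corrected_travelable_distance(dc0, sh):
    """Convert the slant-range reading DC0 (the hypotenuse) into the
    horizontal travelable distance DC, where SH is the sensor mounting
    height above the base of the low obstacle:
        DC = (DC0^2 - SH^2)^(1/2)
    """
    if dc0 <= sh:
        # A reading no longer than the mounting height leaves no
        # horizontal component; clamp instead of taking a complex root.
        return 0.0
    return math.sqrt(dc0 ** 2 - sh ** 2)
```

For example, with a pre-correction reading of 1.0 m and a 0.6 m sensor height, the corrected horizontal distance is 0.8 m.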

(example of operation)

Fig. 14 is a flowchart showing a specific operation example according to the present embodiment. The obstacle detection routine shown in fig. 14 is started by the CPU at predetermined time intervals after a predetermined start condition is satisfied. Further, as a precondition for starting the obstacle detection routine shown in fig. 14, it is assumed that the image recognition routine shown in fig. 5 has already been executed. That is, the obstacle detection routine shown in fig. 14 is obtained by changing a part of the obstacle detection routine shown in fig. 6.

In the obstacle detection routine shown in fig. 14, S601 to S603 are the same as those in the obstacle detection routine shown in fig. 6. Therefore, descriptions of S601 to S603 are omitted.

After the processing of S603, the CPU executes the processing of S1404. In S1404, the CPU acquires the travelable distance DC. When the determination at S1406 described later is yes and the correction processing of S1407 is executed, the travelable distance DC acquired at S1404 corresponds to the pre-correction distance DC0 described above.

After the processing of S1404, the CPU executes the processing of S1405. In S1405, the CPU acquires, based on the image recognition result stored in the nonvolatile RAM, the height H of the obstacle B corresponding to the received wave whose intensity is equal to or greater than the threshold value. That is, the processing content of S1405 is the same as that of S605 in the obstacle detection routine shown in fig. 6.

After the processing of S1405, the CPU executes the processing of S1406. In S1406, the CPU determines whether the height H acquired in S1405 is smaller than the predetermined height Hth2. The predetermined height Hth2 is, for example, 20 cm. That is, the processing in the present embodiment corrects the travelable distance DC when the height dimension of the obstacle B is lower than the sensor mounting height yet large enough that the host vehicle 10 cannot pass over the obstacle. Therefore, the predetermined height Hth2, which is the threshold value for the determination at S1406, is set in consideration of the sensor mounting height and is normally a value larger than the threshold value Hth1 of S606.

If the height H acquired in S1405 is smaller than the predetermined height Hth2 (i.e., yes in S1406), the CPU executes the processing of S1407 and then once ends the present routine. In S1407, the CPU takes the travelable distance acquired in S1404 as the pre-correction distance DC0 and calculates the corrected travelable distance DC by the equation DC = (DC0^2 - SH^2)^(1/2). On the other hand, if the height H acquired in S1405 is equal to or greater than the predetermined height Hth2 (i.e., no in S1406), the CPU skips the processing of S1407 and once ends the present routine.

(fifth embodiment)

Next, the obstacle detection device 20 of the fifth embodiment will be described. This embodiment corresponds to a mode in which the processing load of image recognition is reduced compared with the fourth embodiment, which uses the moving stereo technique or the SFM technique.

The functional block configuration of the present embodiment is the same as that of the fourth embodiment. Therefore, in the description of the structure of the present embodiment, fig. 1 and 11 and the description thereof can be appropriately referred to. In the description of the outline of the operation of the present embodiment, fig. 12A to 13B and the description thereof may be referred to as appropriate. In the following description of the fifth embodiment, differences from the fourth embodiment will be mainly described.

The shape recognition unit 265 is configured to perform shape recognition of the obstacle B based on the image information acquired by the imaging unit 22. However, in the present embodiment, unlike the first to fourth embodiments, the shape recognition unit 265 has a function of extracting a characteristic shape of an object from image information corresponding to one image and a function of recognizing a pattern in a texture image.

That is, the shape recognition unit 265 extracts a straight edge corresponding to the distance information acquired by the distance acquisition unit 264. The shape recognition unit 265 recognizes the obstacle B corresponding to the straight edge based on the texture image around the extracted straight edge. Specifically, the shape recognition unit 265 compares texture images in two image regions adjacent to each other with a straight line edge therebetween, and determines whether or not the obstacle B corresponding to the straight line edge is a step having a small height dimension. Hereinafter, this step is referred to as "low step".

In this way, in the present embodiment, the shape recognition unit 265 can easily determine whether the obstacle B is a low step based on the image information acquired by the imaging unit 22. When the shape recognition unit 265 recognizes that the obstacle B is a low step, the distance correction unit 266 corrects the distance information corresponding to the obstacle B. The distance information is corrected in the same manner as in the fourth embodiment.

(example of operation)

Fig. 15 to 17 are flowcharts showing specific operation examples according to the present embodiment. The distance acquisition routine shown in fig. 15 corresponds to the operation of the distance acquisition unit 264. The distance acquisition routine is started by the CPU at predetermined time intervals after a predetermined start condition is satisfied.

When the distance acquisition routine shown in fig. 15 is started, first, in S1501, the CPU selects two adjacent distance measuring sensors 21 and acquires reception information from the two selected distance measuring sensors 21. Next, in S1502, the CPU determines whether or not the intensities of the received waves at both of the two adjacent distance measuring sensors 21 are equal to or greater than a predetermined threshold value.

If the condition that the intensities of the received waves at the two adjacent distance measuring sensors 21 are both equal to or greater than the predetermined threshold value is not satisfied (i.e., no in S1502), the above-described triangulation does not hold. Therefore, in this case, the CPU skips the processing of S1503 and S1504 and once ends the present routine. On the other hand, if the condition is satisfied at both of the two adjacent distance measuring sensors 21 (i.e., yes in S1502), the CPU executes the processing of S1503 and S1504 and then once ends the present routine.

In S1503, the CPU acquires the relative position information of the obstacle B based on the acquired reception information. Specifically, as shown in fig. 13A, the CPU acquires the detection point P corresponding to the obstacle B. Next, in S1504, the CPU acquires the distance information corresponding to the obstacle B. That is, in S1504, the CPU acquires the travelable distance DC. In both S1503 and S1504, the CPU stores the acquisition results in the nonvolatile RAM.

The image recognition routine shown in fig. 16 corresponds to a part of the operation of the shape recognition unit 265. The image recognition routine is started by the CPU at predetermined time intervals after a predetermined start condition is satisfied.

When the image recognition routine shown in fig. 16 is started, first, in S1601, the CPU acquires image information from the image capturing unit 22. In addition, the CPU stores the acquired image information in the nonvolatile RAM. Next, in S1602, the CPU extracts a feature shape such as a straight edge in the stored image information and a pattern in the texture image. Next, in S1603, the CPU stores the extraction result of S1602 in the nonvolatile RAM, and temporarily ends the present routine.

The obstacle detection routine shown in fig. 17 corresponds to a part of the operation of the shape recognition unit 265 and the operation of the distance correction unit 266. The obstacle detection routine is started by the CPU at predetermined time intervals after a predetermined start condition is satisfied.

When the obstacle detection routine shown in fig. 17 is started, first, the CPU reads out the relative position information acquired by the execution of the distance acquisition routine shown in fig. 15 from the nonvolatile RAM in S1701. Thereby, a two-dimensional map of the detection point P obtained by the distance measuring sensor 21 is acquired. Next, in S1702, the CPU reads out the straight line edge acquired by the execution of the image recognition routine shown in fig. 16 from the nonvolatile RAM.

Next, in S1703, the CPU determines whether or not there is a straight edge corresponding to the detection point P. If there is no straight edge corresponding to the detection point P (i.e., no in S1703), the CPU skips all the processing of and after S1704 and temporarily ends the routine. On the other hand, if there is a straight edge corresponding to the detection point P (i.e., yes in S1703), the CPU advances the process to S1704 and S1705.

In S1704, the CPU compares the texture images in the two image regions adjacent to the straight-line edge to recognize whether or not the obstacle B corresponding to that edge is a low step. Specifically, when the textures in the two adjacent image regions on either side of the edge match, the CPU recognizes that the obstacle B is a low step. On the other hand, when the textures do not match, the CPU recognizes that the obstacle B is a three-dimensional object taller than a low step.
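The texture comparison in S1704 can be sketched as follows. The use of mean and standard deviation of intensity as the texture measure, the band width, and the tolerance are assumptions for illustration only; the disclosure does not specify the matching criterion.

```python
import numpy as np

def is_low_step(image, edge_row, band=8, tol=10.0):
    """Classify the obstacle behind a horizontal straight-line edge.

    Compares simple texture statistics (mean and standard deviation of
    pixel intensity) in two bands adjacent to the edge row.  Matching
    textures indicate a low step (the same surface is visible on both
    sides); differing textures indicate a taller three-dimensional object.
    band and tol are illustrative parameters.
    """
    upper = image[max(edge_row - band, 0):edge_row, :]
    lower = image[edge_row:edge_row + band, :]
    mean_diff = abs(upper.mean() - lower.mean())
    std_diff = abs(upper.std() - lower.std())
    return bool(mean_diff < tol and std_diff < tol)
```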

In S1705, the CPU determines whether the recognition result of the obstacle B is a low step. If the obstacle B is recognized as a low step (i.e., yes in S1705), the CPU executes the process of S1706 and then temporarily ends the routine. In S1706, the CPU sets the travelable distance acquired in S1504 as the pre-correction distance DC0 and, in the same manner as in the fourth embodiment described above, calculates the corrected travelable distance DC = (DC0^2 - SH^2)^(1/2). If the obstacle B is recognized as a three-dimensional object with a large height dimension (i.e., no in S1705), the CPU skips the process of S1706 and temporarily ends the routine.
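The correction in S1706 is the Pythagorean conversion of the slant distance measured by the sensor (mounted at height SH) into a horizontal distance. A minimal sketch, with hypothetical function and parameter names:

```python
import math

def corrected_travelable_distance(dc0, sh):
    """Correct the travelable distance for a low obstacle.

    dc0: slant distance from the sensor to the low obstacle (pre-correction
         travelable distance DC0).
    sh:  mounting height of the ranging sensor above the road surface.
    Returns the horizontal travelable distance DC = sqrt(DC0**2 - SH**2).
    """
    if dc0 <= sh:
        raise ValueError("measured distance must exceed the mounting height")
    return math.sqrt(dc0 ** 2 - sh ** 2)
```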

(Effect)

The detection result of the obstacle B based on the relative position information acquired by the distance measuring sensor 21 is directly affected by the height dimension of the obstacle B, as in the conventional art. However, when the height of the obstacle B is obtained from the detection result of the distance measuring sensor 21 itself, as described in japanese patent application laid-open No. 2014-58247, the error becomes large. This is because the basic function of the distance measuring sensor 21 is to output a signal corresponding to the distance to the obstacle B; information on the height of the obstacle B is essentially not included in that output.

On the other hand, information about the height direction of the obstacle B can be obtained from the image recognition result based on the image information acquired by the imaging unit 22. Therefore, in each of the above embodiments, the obstacle detection device 20 detects the obstacle B by integrating the detection result based on the relative position information acquired by the distance measuring sensor 21 with the image recognition result based on the image information acquired by the imaging unit 22. This enables the obstacle B present outside the vehicle 10 to be detected more appropriately.

(Modification examples)

The present disclosure is not limited to the above embodiments, which may therefore be modified as appropriate. Hereinafter, typical modifications will be described, focusing mainly on the differences from the above-described embodiments.

The present disclosure is not limited to the specific device configuration shown in the above-described embodiments. That is, for example, the vehicle 10 is not limited to a four-wheeled automobile. Specifically, the vehicle 10 may be a three-wheeled vehicle, or may be a six-wheeled or eight-wheeled vehicle such as a truck. The type of vehicle 10 may be an automobile equipped with only an internal combustion engine, an electric vehicle or a fuel cell vehicle not equipped with an internal combustion engine, or a hybrid vehicle. The shape of the vehicle body 11 is not limited to a box shape, that is, a substantially rectangular shape in plan view. The number of the door panels 17 is not particularly limited.

The arrangement and number of the distance measuring sensors 21 when the distance measuring sensors 21 are ultrasonic sensors are not limited to the above specific examples. That is, for example, referring to fig. 1, when the third front sonar SF3 is disposed at the center position in the vehicle width direction, the fourth front sonar SF4 may be omitted. Similarly, when the third rear sonar SR3 is disposed at the center in the vehicle width direction, the fourth rear sonar SR4 may be omitted. The third side sonar SS3 and the fourth side sonar SS4 may also be omitted.

The distance measuring sensor 21 is not limited to the ultrasonic sensor. That is, for example, the distance measuring sensor 21 may be a laser radar sensor or a millimeter wave radar sensor. Similarly, the image sensor constituting the imaging unit 22 is not limited to the CCD sensor. That is, for example, a CMOS sensor may be used instead of the CCD sensor. CMOS is an abbreviation for Complementary MOS.

The arrangement and number of the imaging units 22 are not limited to the above examples. That is, for example, the front camera CF may be disposed in the vehicle cabin. Specifically, the front camera CF may be mounted on, for example, the vehicle interior mirror. One or two front cameras CF may be provided. That is, the obstacle detection device 20 may have a compound-eye stereo camera configuration. The left camera CL and the right camera CR may be disposed at positions other than the door mirrors 18. Alternatively, the left camera CL and the right camera CR may be omitted.

In each of the above embodiments, the control unit 26 is configured such that the CPU reads a program from the ROM or the like and executes it. However, the present disclosure is not limited to such a configuration. That is, for example, the control unit 26 may be a digital circuit configured to be able to perform the above-described operations, for example an ASIC such as a gate array. ASIC is an abbreviation of Application Specific Integrated Circuit.

The present disclosure is not limited to the specific operation examples and processing methods shown in the above embodiments. For example, the storage location of the recognition result or the like may be a storage medium other than the nonvolatile RAM, such as a RAM and/or a magnetic storage medium.

In the above-described specific examples, the processing performed while the host vehicle 10 is traveling forward has been described. However, the present disclosure may also be suitably applied to reverse travel of the host vehicle 10. That is, the processing during reverse travel is basically the same as the processing during forward travel described above, except that the distance measuring sensor 21 and the imaging unit 22 provided on the rear face portion 14 side of the host vehicle 10 are used.

The processing contents in the shape recognition unit 262 are not limited to the above example. That is, for example, compound-eye stereo processing or integrated processing of SFM and compound-eye stereo may be used. Such processing was known or well known at the time of filing of the present application; see, for example, Japanese patent application laid-open Nos. 2007-263657 and 2007-263669.

When the relative position information and the travelable distance DC are invalidated in S607, the invalidated data need not be discarded. That is, for example, the invalidation in S607 may be a process of storing the relative position information and the travelable distance DC acquired this time in S603 and S604 in the nonvolatile RAM, together with information indicating that they have been invalidated.

The determination of S606 may be performed before the determination of S1406. In this modification, the CPU determines whether or not the height H acquired in S1405 is smaller than the predetermined height Hth1 before the determination in S1406.

In this modification, the CPU executes the process of S607 when the height H is smaller than the predetermined height Hth 1. That is, the relative position information and the acquisition result of the travelable distance DC are invalidated. After that, the routine is temporarily ended. On the other hand, if the height H is equal to or greater than the predetermined height Hth1, the CPU advances the process to S1406. That is, when the height of the obstacle B is equal to or greater than Hth1 and less than Hth2, the CPU corrects the travelable distance DC in the process of S1407.
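The decision flow of this modification (invalidate below Hth1, correct between Hth1 and Hth2, otherwise use the acquired distance as-is) can be sketched as follows. The function name, return labels, and threshold values are placeholders, not values from the disclosure.

```python
def handle_obstacle(height, hth1=0.05, hth2=0.5):
    """Decide how to treat a detection based on the obstacle height H.

    height < Hth1:        invalidate the relative position information
                          and the travelable distance DC (as in S607).
    Hth1 <= height < Hth2: correct the travelable distance DC (as in S1407).
    height >= Hth2:        use the acquired distance without correction.
    The threshold values are illustrative placeholders.
    """
    if height < hth1:
        return "invalidate"
    if height < hth2:
        return "correct"
    return "keep"
```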

The predetermined height Hth1 and the predetermined height Hth2 may be the same value.

The correction of the travelable distance DC in S1407 and the like is not limited to the calculation using the above-described equations. Specifically, for example, the correction of the travelable distance DC may be performed as follows.

As described above, the error of the distance information increases as the actual horizontal distance between the obstacle B and the host vehicle 10 decreases. In addition, the smaller the value of the height H, the larger the error of the distance information.

Therefore, a correction value map DC_AMD(DC, H), which takes as parameters the travelable distance DC acquired in S1404 and the height H acquired in S1405, can be created in advance by appropriate experiments or the like. The corrected travelable distance DC can then be acquired by performing a predetermined operation on the correction value DC_AMD obtained from the map and the pre-correction travelable distance DC acquired in S1404. Specifically, for example, the correction value DC_AMD may be added to, or multiplied by, the pre-correction travelable distance DC acquired in S1404.
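A minimal sketch of such a map-based correction follows. The grid axes, correction values, nearest-neighbor lookup, and additive combination are all assumptions for illustration; a real map DC_AMD(DC, H) would be calibrated by experiment as the text describes, and interpolation could be used instead of nearest-neighbor lookup.

```python
# Hypothetical correction-value map DC_AMD(DC, H): values for illustration only.
DC_GRID = [1.0, 2.0, 4.0]   # pre-correction travelable distance DC [m]
H_GRID = [0.1, 0.3, 0.5]    # obstacle height H [m]
DC_AMD = [                  # additive correction values [m]
    [0.30, 0.20, 0.10],
    [0.20, 0.10, 0.05],
    [0.10, 0.05, 0.00],
]

def nearest_index(grid, value):
    """Index of the grid point closest to value."""
    return min(range(len(grid)), key=lambda i: abs(grid[i] - value))

def corrected_distance(dc, h):
    """Look up DC_AMD at the nearest grid point and add it to DC."""
    i = nearest_index(DC_GRID, dc)
    j = nearest_index(H_GRID, h)
    return dc + DC_AMD[i][j]
```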

In fig. 4A, 4B, 12A, 12B, and 12C, the obstacle B may be disposed above the road surface RS, for example as a wall extending downward from the ceiling or as a vertically movable flap gate. In this state, a space is formed between the obstacle B and the road surface RS. Hereinafter, this space is referred to as the "lower space".

When the above-described examples are applied to this situation, for example, the height H acquired in S605 is the height of the above-described lower space, that is, the height above the road surface RS of the horizontal edge corresponding to the lower end of the obstacle B. The determination in S606 then becomes a determination as to whether or not the height H of the lower space is equal to or less than a predetermined height Hth3.

When the height H of the lower space exceeds the predetermined height Hth3, the lower end of the obstacle B is much higher than the sensor mounting height, so the same detected-distance error as described above occurs. Therefore, in this case, the travelable distance DC is corrected. On the other hand, when the height H of the lower space is equal to or less than the predetermined height Hth3, the wall surface BW of the obstacle B squarely faces the distance measuring sensor 21. Therefore, in this case, the travelable distance DC is not corrected.

For example, depending on the height of the host vehicle 10 on which the obstacle detection device 20 is mounted, the host vehicle may not be able to pass through the space below a wall extending downward from the ceiling. Alternatively, for example, the host vehicle 10 may be unable to stop below the obstacle B when the obstacle B is a flap gate that has stopped partway while rising. In this regard, according to the present modification, the distance between the host vehicle 10 and an obstacle B whose lower space cannot be passed through can be acquired more accurately in these cases.

In fig. 4A and the like, the obstacle B may be a beam protruding downward from the ceiling. In this situation, the host vehicle 10 does not interfere with the obstacle B. Therefore, the relative position information and the travelable distance DC corresponding to the obstacle B need not be corrected, and no trouble is caused even if they are invalidated. Accordingly, the CPU may execute the same invalidation process as in S607 when the height H of the lower space exceeds a predetermined height Hth4.
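The handling of obstacles hanging down from above, combining the Hth3 and Hth4 cases described above, can be sketched as follows. The function name, return labels, and threshold values are placeholders; the disclosure only requires Hth4 to correspond to a lower space the vehicle cannot interfere with.

```python
def treat_overhead_obstacle(lower_space_h, hth3=0.8, hth4=2.5):
    """Decide how to treat an obstacle extending downward from the ceiling.

    lower_space_h is the height H of the space between the obstacle's
    lower end and the road surface RS.
    H > Hth4:          the vehicle cannot interfere with the obstacle,
                       so the data may be invalidated (as in S607).
    Hth3 < H <= Hth4:  the detected-distance error described above occurs,
                       so the travelable distance DC is corrected.
    H <= Hth3:         the wall surface BW squarely faces the sensor,
                       so no correction is applied.
    Both threshold values are illustrative placeholders.
    """
    if lower_space_h > hth4:
        return "invalidate"
    if lower_space_h > hth3:
        return "correct"
    return "no_correction"
```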

The CPU may switch the correction processing method between the case where the obstacle B protrudes upward from the road surface RS and the case where the obstacle B extends downward from the ceiling. That is, when the obstacle B protrudes upward from the road surface RS, the correction processing method is the same as that in fig. 14 (i.e., S1406 and S1407).

The above-described case classification may be performed by the CPU based on the image processing result. That is, the CPU may determine, based on the image processing result, whether the obstacle B corresponding to the extracted horizontal edge protrudes upward from the road surface RS or extends downward from the ceiling.

The predetermined heights Hth3 and Hth4 may be stored in advance in the ROM or the nonvolatile RAM. Alternatively, the predetermined height Hth3 may be changed according to the height of the host vehicle 10 on which the obstacle detection device 20 is mounted. That is, in the obstacle detection device 20, the predetermined height Hth3, a value corresponding to the vehicle height of the host vehicle 10 on which the device is mounted, can be stored in the nonvolatile RAM in a rewritable manner. The rewriting of the predetermined height Hth3 may be performed as appropriate by the manufacturer, seller, manager, or user of the host vehicle 10 or the obstacle detection device 20.

The term "acquisition" may be appropriately changed to "estimation", "detection", "calculation", and the like. Each inequality in the determination processes may or may not include equality. That is, for example, "smaller than a prescribed size" may be changed to "equal to or smaller than a prescribed size". Likewise, "equal to or larger than a prescribed size" may be changed to "larger than a prescribed size". Likewise, "smaller than the prescribed height" may be changed to "equal to or smaller than the prescribed height". Likewise, "equal to or greater than the threshold" may be changed to "greater than the threshold".

The modification is not limited to the above-described examples. In addition, a plurality of modifications may be combined with each other. Further, the above-described embodiments may be combined with each other.
