Head-up display system


Abstract (invented by 藤田裕司 and 下田望, 2019-09-13): Provided is a head-up display system that appropriately corrects the display position of an image in response to vehicle vibration without giving the driver a sense of incongruity. In the head-up display system (1), a gyro sensor (43) is provided to detect vibration of the vehicle (2). An image data generation unit (132) performs pitch correction of the display position of displayed targets based on angular velocity information in two axis directions acquired by the gyro sensor (43). When the vehicle (2) travels through a curve in an inclined state, the pitch correction is suppressed or suspended for a constant display target (82) displayed at a fixed position within the image display range (80), and a live-action overlay target (81), displayed so as to overlap a specific object (72) detected by the front sensing device (30), is dimmed or extinguished.

1. A head-up display system comprising: a head-up display device that is mounted on a vehicle and displays an image in front of the vehicle; and a front sensing device that detects an object in front of the vehicle, wherein

the head-up display device includes an image data generation unit that generates image data and an image display unit that emits image light of the image data,

the image data generated by the image data generation unit includes a constant display target displayed at a fixed position within an image display range, and a live-action overlay target displayed so as to overlap a specific object detected by the front sensing device,

a gyro sensor is provided in the vehicle to detect vibration of the vehicle,

the image data generation unit performs pitch correction of the display positions of the targets displayed on the image display unit based on angular velocity information in two axis directions acquired by the gyro sensor, and

when the vehicle travels through a curve in an inclined state, the pitch correction is suppressed or suspended for the constant display target, and the display brightness of the live-action overlay target is dimmed or extinguished.

2. The head-up display system according to claim 1, wherein

when the pitch correction for the constant display target and the display brightness of the live-action overlay target are changed, the change is performed by multiplying by a damping term G, where G = 1 when the angular velocity |ωyaw| in the yaw direction acquired by the gyro sensor is smaller than a threshold ω0, and G gradually approaches 0 when |ωyaw| becomes larger than the threshold ω0.

3. A head-up display system comprising: a head-up display device that is mounted on a vehicle and displays an image in front of the vehicle; and a front sensing device that detects an object in front of the vehicle, wherein

the head-up display device includes an image data generation unit that generates image data and an image display unit that emits image light of the image data,

the image data generated by the image data generation unit includes a constant display target displayed at a fixed position within an image display range, and a live-action overlay target displayed so as to overlap a specific object detected by the front sensing device,

a gyro sensor is provided in the vehicle to detect vibration of the vehicle,

the image data generation unit performs pitch correction of the display positions of the targets displayed on the image display unit based on angular velocity information in three axis directions acquired by the gyro sensor, and

the gyro sensor acquires a pitch component ωpitch, a yaw component ωyaw, and a roll component ωroll of the angular velocity; the roll component ωroll is time-integrated to obtain a roll angle θroll; an angular velocity ωc' = ωpitch - ωyaw·tan θroll is obtained; and the pitch correction of the display positions of the constant display target and the live-action overlay target is performed based on ωc'.

4. A head-up display system comprising: a head-up display device that is mounted on a vehicle and displays an image in front of the vehicle; and a viewpoint detecting device that detects the position of a driver's viewpoint, wherein

the head-up display device includes an image data generation unit that generates image data and an image display unit that emits image light of the image data,

a gyro sensor and an acceleration sensor are provided in the vehicle to detect rotational shake and displacement shake as vibration components of the vehicle,

the image data generation unit corrects the display position of the target displayed on the image display unit based on the angular velocity information acquired by the gyro sensor and the acceleration information acquired by the acceleration sensor, and

a rotation radius is obtained on the assumption that the vertical displacement at the acceleration sensor is caused by the rotational shake of the vehicle, and the rotational shake and displacement shake of the vehicle at the driver's position are calculated using the information on the driver's viewpoint position detected by the viewpoint detecting device, thereby correcting the display position of the target.

5. The head-up display system according to claim 4, wherein

the rotation radius is obtained from the vertical displacements of two acceleration sensors provided at front and rear positions of the vehicle.

Technical Field

The present invention relates to a head-up display system that is mounted on a vehicle and appropriately corrects the display position of an image.

Background

In recent years, vehicle video display devices (head-up displays; hereinafter, HUD) that display video information as a virtual image in front of the vehicle through the windshield have been put into practical use. By presenting information to the driver as the displayed video, such a device can support the driver's operation of the vehicle.

For example, Patent Document 1 describes correcting the display position of a displayed image in response to vehicle vibration. It proposes a configuration in which the display-position correction is applied to an emphasized image (an image displayed so as to have a predetermined positional relationship with a specific object in the real scene) but not to a non-emphasized image (an image displayed without such a positional relationship).

Documents of the prior art

Patent Document 1: Japanese Patent Laid-Open Publication No. 2017-13590

Disclosure of Invention

Patent Document 1 mentions, for example, a 3-axis acceleration sensor as a vibration detection means, but does not describe how to correct the display using the sensor detection values. Furthermore, through their study, the present inventors found the following phenomenon: when pitching (rotational shake) is corrected using a gyro sensor, the display image (target) moves in a direction unrelated to pitching while the vehicle travels through a curve. This gives the driver a sense of incongruity and must therefore be avoided. In addition, to perform the correction with high accuracy, not only the pitching but also the shake component in the vertical direction (displacement shake) must be corrected. Camera shake correction is a known similar technique, but it cannot be applied directly; the correction must take the driver's viewpoint position into account.

The present invention aims to provide a head-up display system that appropriately corrects the display position of an image without giving a sense of incongruity to the driver in response to vibration of a vehicle.

In the head-up display system of the present invention, a gyro sensor is provided to detect vibration of the vehicle. The image data generation unit performs pitch correction of the display position of the displayed target based on angular velocity information in two axis directions acquired by the gyro sensor. When the vehicle travels through a curve in an inclined state, the pitch correction is suppressed or suspended for the constant display target displayed at a fixed position within the image display range, and the live-action overlay target, displayed so as to overlap the specific object detected by the front sensing device, is dimmed or extinguished.

In addition, in the head-up display system of the present invention, a gyro sensor and an acceleration sensor are provided to detect rotational shake and displacement shake as vibration components of the vehicle. The image data generation unit corrects the display position of the displayed target based on the angular velocity information acquired by the gyro sensor and the acceleration information acquired by the acceleration sensor. A rotation radius is obtained on the assumption that the vertical displacement at the acceleration sensor is caused by rotational shake of the vehicle, and the rotational shake and displacement shake at the driver's position are calculated using the viewpoint position of the driver detected by the viewpoint detecting device, thereby correcting the display position of the target.

According to the present invention, it is possible to provide a head-up display system that appropriately corrects the display position of an image without giving a sense of incongruity to the driver in response to vibration of the vehicle.

Drawings

Fig. 1 is a diagram showing an outline of a HUD system 1 mounted on a vehicle.

Fig. 2A is a block diagram showing the overall structure of the HUD system 1.

Fig. 2B is a block diagram showing the internal structure of the HUD device 10.

Fig. 2C is a block diagram showing the internal configuration of the viewpoint detecting device 20.

Fig. 2D is a block diagram showing an internal structure of the front sensing device 30.

Fig. 3 is a diagram illustrating the type of video data to be displayed.

Fig. 4A is a diagram showing vibration of the vehicle.

Fig. 4B is a diagram showing the swing of the display image due to the vibration of the vehicle.

Fig. 5A is a diagram showing the measurement of pitching (Embodiment 1).

Fig. 5B is a diagram showing the movement of the display position accompanying pitching.

Fig. 5C is a diagram illustrating correction of the display position.

Fig. 6 is a diagram illustrating an example of image display when the vehicle is traveling around a curve.

Fig. 7A is a diagram illustrating the rotation of the vehicle about its three axes.

Fig. 7B is a diagram showing the vehicle turning through a curve without the vehicle body being tilted.

Fig. 7C is a diagram showing the vehicle turning through a curve with the vehicle body tilted.

Fig. 8A is a diagram illustrating correction of the constant display target by handling method 1.

Fig. 8B is a diagram illustrating correction of the live-action overlay target by handling method 1.

Fig. 8C is a diagram showing the correction function (damping term) used in handling method 1.

Fig. 9 is a diagram showing an example of the display after correction by handling method 1.

Fig. 10 is a diagram illustrating the effect of extinguishing the live-action overlay target.

Fig. 11 is a flowchart of handling method 1.

Fig. 12 is a flowchart of handling method 2.

Fig. 13A is a diagram showing one of the two vibration components (rotational shake) (Embodiment 2).

Fig. 13B is a diagram showing the other vibration component (displacement shake) (Embodiment 2).

Fig. 14A is a diagram illustrating the correction calculation (fulcrum S at the rear of the vehicle).

Fig. 14B is a diagram illustrating the correction calculation (fulcrum S at the front of the vehicle).

Fig. 15 is a diagram illustrating a method of measuring the distance L between the sensor and the driver.

Fig. 16 is a flowchart of Embodiment 2.

Fig. 17 is a diagram illustrating correction calculation in the modification.

Fig. 18 is a diagram illustrating a method of measuring the vertical displacement hd of the driver.

(symbol description)

1: head-up display (HUD) system; 2: vehicle; 3: windshield; 5, 5': driver (driver's eyes); 8: virtual image; 10: HUD device; 11, 21, 31: control unit; 20: viewpoint detecting device; 30: front sensing device; 42: acceleration sensor; 43: gyro sensor; 51: image display device; 80: image display range; 81, 82: display targets; 132: image data generation unit.

Detailed Description

Embodiments of a head-up display system (hereinafter, referred to as a HUD system) according to the present invention will be described with reference to the drawings.

Fig. 1 shows an outline of the HUD system 1 mounted on a vehicle. The HUD system 1 includes: the HUD device 10, which is the main body having the image display function; the viewpoint detecting device 20, which detects the driver's viewpoint position; and the front sensing device 30, which detects an object in front of the vehicle.

The HUD device 10 is mounted in the lower part of the instrument panel of the vehicle 2 and projects the image generated by the image display device onto the windshield 3 of the vehicle 2 via a mirror 52. The image reflected by the windshield 3 enters the driver's eyes 5', and the driver perceives it as a virtual image 8 located beyond the windshield 3. The mirror driving unit 53 in the HUD device 10 adjusts the display position (in the height direction) of the virtual image 8 by pivoting the mirror 52 according to the height (A, B, C) of the driver's eyes 5'. Through this adjustment, the driver can see the virtual image 8 at an easily viewable position.

The viewpoint detecting device 20 is installed, for example, on the instrument panel and measures the position (distance and height) of the driver's eyes 5'. It is also used for a driver monitoring system (DMS). The front sensing device 30 is installed, for example, at the upper part of the windshield 3; it detects an object (specific object) in front of the vehicle 2 and measures the distance to the object. The HUD device 10 determines the image to be displayed based on the detection information from the viewpoint detecting device 20 and the front sensing device 30, and displays it at a position where the driver can easily view it. Hereinafter, the virtual image 8 seen by the driver is also simply referred to as "image 8".

Fig. 2A is a block diagram showing the overall structure of the HUD system 1, which includes the HUD device 10, the viewpoint detecting device 20, and the front sensing device 30. Figs. 2B to 2D show the internal structure of each device. Each of the devices 10, 20, and 30 includes a control unit (ECU) 11, 21, or 31 comprising an information acquisition unit, a CPU, a memory, and interfaces, and is connected to the vehicle ECU 60 via a communication bus 61 such as a CAN (Controller Area Network).

Fig. 2B is a block diagram showing the internal structure of the HUD device 10. The information acquisition unit 12 acquires vehicle information from various sensors mounted on the vehicle: a vehicle speed sensor 41 that acquires speed information of the vehicle 2, an acceleration sensor 42 that acquires acceleration information representing vibration or sway of the vehicle 2, and a gyro sensor 43 that acquires angular velocity information (gyro information). To generate map information indicating the position and traveling direction of the vehicle 2, the vehicle is also provided with a GPS receiver 44 that acquires GPS (Global Positioning System) signals and a VICS receiver 45 that receives VICS (Vehicle Information and Communication System) signals. Various other sensors, not shown, such as an engine start sensor, a steering angle sensor, a temperature sensor, and an illuminance sensor, are mounted as well.

The CPU 13 of the HUD device 10 includes an image data generation unit 132 and an audio data generation unit 131, which generate the image data and audio data provided to the driver based on the input vehicle information. The memory 14 stores the programs executed by the CPU 13, various control data, and the image data to be displayed. The audio data generated by the audio data generation unit 131 is output from the speaker 54 via the audio interface 15. The image data generated by the image data generation unit 132 is displayed on the image display unit 50 via the display interface 16. The image display unit 50 includes: an image display device 51, which generates image light with a light source such as an LED or a laser, an illumination optical system, and a display element such as a liquid crystal element; and the mirror 52, which directs the generated image light toward the windshield 3. The mirror adjustment unit 133 in the CPU 13 adjusts the rotation of the mirror 52 via the mirror driving unit 53. The communication unit 134 is connected to the communication bus 61 via the communication interface 17 and exchanges detection data and control data with the viewpoint detecting device 20 and the front sensing device 30.

Fig. 2C is a block diagram showing the internal configuration of the viewpoint detecting device 20. The information acquisition unit 22 acquires distance information up to the driver's viewpoint position with a distance detector 46 mounted in the vehicle, and transmits it to the CPU 24. A TOF (Time of Flight) sensor or the like is used as the distance detector 46. In addition, the in-vehicle camera 47 captures an image of the vehicle interior, which is transferred to the CPU 24 via the camera interface 23, and the image analysis unit 241 detects the driver's viewpoint. The memory 25 stores the program executed by the CPU 24 and the detection information. The communication unit 242 transmits the driver's distance and viewpoint information to the HUD device 10 via the communication interface 26.

Fig. 2D is a block diagram showing the internal structure of the front sensing device 30. The information acquisition unit 32 acquires distance information to an object in front of the vehicle (a vehicle, sign, pedestrian, etc.) with a distance detector 48 mounted on the vehicle, and transmits it to the CPU 34. A TOF (Time of Flight) sensor or the like is used as the distance detector 48. In addition, the image ahead is captured by the vehicle exterior camera 49, transferred to the CPU 34 via the camera interface 33, and the object is detected by the image analysis unit 341. The memory 35 stores the program executed by the CPU 34 and the detection information. The communication unit 342 transmits the distance information and detection information of the object ahead to the HUD device 10 via the communication interface 36.

Fig. 3 illustrates the types of image data to be displayed. It shows the real scene 70 in front of the vehicle as viewed from the driver's seat, in which a vehicle 71 is traveling ahead. The dashed line 80 indicates the image display range of the HUD device 10, within which two kinds of targets are displayed.

The live-action overlay target 81 is a target displayed so as to overlap a specific object (vehicle, person, sign, etc.) in the real scene. Here, a warning (a ring) calling attention to the preceding vehicle is displayed. Other live-action overlay targets 81 include route information, white lines, road shoulders, shop information (pop-up information), and the like, displayed so as to overlap the road, buildings, and so on.

The constant display target 82 is a target displayed at a fixed position within the image display range 80 without overlapping the real scene. Here, speed information is displayed at the lower-right position of the image display range 80. Other constant display targets 82, such as remaining fuel, temperature, and destination information, are each displayed at their own fixed positions.

The image data generation unit 132 of the HUD device 10 generates the image data of a target to be displayed using a 3D rendering library and sets its display position. When the live-action overlay target 81 is displayed, an object (vehicle, person, sign, etc.) present in the real scene 70 is detected by the front sensing device 30 and its position is calculated. The image data generation unit 132 then draws the target so that it overlaps the detected object.

Figs. 4A and 4B illustrate the sway of the display image caused by vehicle vibration. When the vehicle 2 vibrates in the direction of the arrow due to road surface irregularities or the like, as shown in Fig. 4A, the image 8 viewed by the driver also sways with the vibration. The sway is caused by the up-and-down vibration of the vehicle 2.

Fig. 4B shows the sway of the display image as seen from inside the vehicle. Since the image display range 80 of the HUD device 10 is set with reference to the vehicle, when the vehicle sways, the image display range 80 moves up and down, and the positions of the displayed targets 81 and 82 move up and down with it. This sway is undesirable not only because it gives the driver a sense of discomfort but also because the live-action overlay target 81 becomes displaced from the object (e.g., the vehicle 71) in the real scene 70. Moreover, the live-action overlay target 81 is meant to be displayed overlapping the object ahead, but when the object detection of the front sensing device 30 cannot track the speed of the vehicle's sway, it is difficult to keep the target overlapped.

Hereinafter, methods of correcting the display image against vehicle vibration (sway) are described for two cases: Embodiment 1 targets pitching (the rotational component), and Embodiment 2 targets both pitching and displacement shake (the translational component).

Embodiment 1

In Embodiment 1, a case is described in which pitching (the rotational component) is the correction target and the correction processing is performed using a gyro sensor.

Fig. 5A to 5C are diagrams illustrating a basic operation of the pitch correction using the gyro sensor.

Fig. 5A illustrates the measurement of pitching. The gyro sensor 43 provided in the vehicle 2 measures the rotation about the y-axis, that is, the angular velocity ωpitch in the pitch direction. The CPU 13 integrates ωpitch to obtain the amount of angular change (pitch angle) θpitch.

Fig. 5B shows the movement of the display position accompanying pitching. The image display range 80 viewed by the driver 5 moves upward by Δz (position A → position B) according to the pitch angle θpitch. Accordingly, the display positions of the targets 81 and 82 seen by the driver 5 also move upward by Δz.

Fig. 5C illustrates the correction of the display position by the image data generation unit 132. At position B, the display positions of the targets 81 and 82 within the image display range 80 are shifted downward by Δz. This keeps the display positions of the targets seen by the driver 5 unchanged. That is, by shifting the display position of a target according to the pitch angle θpitch, the sway of the target caused by vehicle vibration is suppressed.
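For illustration only, the counter-shift can be related to the pitch angle through the distance to the virtual image. The following Python sketch assumes a hypothetical virtual-image distance D_IMG and display scale PX_PER_M; neither value comes from the patent text.

    import math

    D_IMG = 2.5       # assumed eye-to-virtual-image distance [m] (hypothetical)
    PX_PER_M = 400.0  # assumed display scale at the image plane [px/m] (hypothetical)

    def correction_offset_px(theta_pitch: float) -> float:
        """Vertical counter-shift for a pitch angle theta_pitch [rad].
        The HUD draws targets shifted by -delta_z so that their apparent
        position stays fixed for the driver (Fig. 5C)."""
        delta_z = D_IMG * math.tan(theta_pitch)  # apparent shift at the image plane [m]
        return -delta_z * PX_PER_M               # counter-shift in display pixels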

However, the present inventors found that when the vehicle travels through a curve, the image display range moves in a direction unrelated to pitching.

Fig. 6 illustrates an example of the image display while the vehicle travels through a curve. In this example, the vehicle turns right at an intersection with the vehicle body tilted to the left. In the image display range 80, the target 81 is displayed so as to overlap the pedestrian 72 ahead. However, even though the road surface is flat and the vehicle is not swaying vertically, the target is displayed shifted upward from the position 81' where it should appear, making the display hard to view.

We examined this phenomenon. When the vehicle body turns at an intersection or curve while tilted in the left-right direction, the gyro sensor 43 detects an apparent pitch component, for the reason described below. The control unit 11 of the HUD device then shifts the displayed target in the vertical direction to correct this pitch component, which is considered to cause the phenomenon.

Fig. 7A to 7C are diagrams for explaining the reason why an apparent pitch component occurs.

First, as shown in Fig. 7A, the rotation of the vehicle about its three axes is defined as follows. Taking the traveling direction of the vehicle 2 as the x-axis, the left-right direction as the y-axis, and the vertical direction as the z-axis, rotation about the x-axis is the roll direction, rotation about the y-axis is the pitch direction, and rotation about the z-axis is the yaw direction. Here, consider a case where the vehicle 2 turns to the left.

Fig. 7B shows the vehicle turning through a curve without the vehicle body being tilted. In this case, the gyro sensor 43 detects only the angular velocity ω in the yaw direction.

Fig. 7C shows the vehicle turning through a curve with the vehicle body tilted by an angle θroll in the roll direction with respect to the vertical axis. In this case, since the gyro sensor 43 detects angular velocity with reference to the three axes of the vehicle, not only an angular velocity ωyaw in the yaw direction but also an angular velocity ωpitch in the pitch direction is produced.

The apparent angular velocity ωpitch in the pitch direction is given by equation (1-1).

ωpitch=ω·sinθroll (1-1)

The angular velocity ωyaw in the yaw direction is given by equation (1-2).

ωyaw=ω·cosθroll (1-2)

From equations (1-1) and (1-2), equation (1-3) follows.

ωpitch=ωyaw·tanθroll (1-3)

In this way, the gyro sensor detects not only the angular velocity ωyaw in the yaw direction but also an angular velocity ωpitch in the pitch direction determined by the roll angle θroll. As a result, the control unit 11 of the HUD device shifts the displayed target in the vertical direction in order to correct the pitch component ωpitch.
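As a rough numerical illustration (values assumed, not from the patent): turning through a curve at ω = 20°/s with the body rolled by θroll = 5° gives an apparent pitch rate of ωpitch = 20°/s × sin 5° ≈ 1.7°/s by equation (1-1); integrated over a 3-second turn, this amounts to a pitch angle of roughly 5°, easily large enough to shift the display visibly even though the vehicle is not actually pitching.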

To avoid this unnecessary display correction during curve traveling, the following handling method 1 or handling method 2 is applied.

[Handling method 1]

In handling method 1, the correction targets are divided into the constant display target and the live-action overlay target, and the correction amount or the display brightness is changed for each.

Figs. 8A to 8C illustrate the correction of the targets in handling method 1. First, the correction of the constant display target 82 in Fig. 8A is described.

When the pitch angle detected by the gyro sensor 43 is θpitch, the pitch angle used for correcting the constant display target is corrected to θc using equation (2).

θc = θpitch · {1 + exp(-a(ωyaw + ω0))}^(-1) · {1 + exp(a(ωyaw - ω0))}^(-1)

   = θpitch · G (2)

Here, the damping term G in equation (2) is the correction function shown in Fig. 8C: G = 1 when the angular velocity |ωyaw| in the yaw direction is smaller than the threshold ω0, and G gradually approaches 0 when |ωyaw| becomes larger than ω0. By adjusting the coefficient a in the damping term G, the transition between G = 1 and G = 0 can be made smooth. The function G is not limited to the above expression; any function that transitions smoothly between G = 1 and G = 0 with the threshold ω0 as the boundary may be used.
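A minimal Python sketch of the damping term G follows; the values chosen for the threshold ω0 and the coefficient a are illustrative only, since the patent leaves both tunable.

    import math

    OMEGA_0 = 0.2  # threshold yaw rate omega_0 [rad/s]; illustrative value
    A = 20.0       # steepness coefficient a; illustrative value

    def _sigmoid(x: float) -> float:
        """Numerically stable logistic function."""
        if x >= 0:
            return 1.0 / (1.0 + math.exp(-x))
        z = math.exp(x)
        return z / (1.0 + z)

    def damping_G(omega_yaw: float) -> float:
        """Damping term G of equations (2) and (3): ~1 for |omega_yaw| < OMEGA_0,
        smoothly falling to ~0 for |omega_yaw| > OMEGA_0."""
        return _sigmoid(A * (omega_yaw + OMEGA_0)) * _sigmoid(-A * (omega_yaw - OMEGA_0))

    # Usage: theta_c = theta_pitch * damping_G(omega_yaw)   # equation (2)
    #        P_c     = P * damping_G(omega_yaw)             # equation (3)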

Fig. 8A shows an example of correction using equation (2). When the angular velocity ωyaw in the yaw direction is small, the detected pitch angle θpitch is used as-is as the correction pitch angle θc; when ωyaw is large, θc approaches 0. That is, when the vehicle makes a sharp turn through a curve, the correction of the target's display position is suppressed or suspended even if the gyro sensor detects a pitch angle θpitch.

Next, the correction of the live-action overlay target 81 in Fig. 8B is described. Since the display position of the live-action overlay target follows the object in the real scene, it does not depend on the pitch angle θpitch detected by the gyro sensor 43. Instead, the display brightness of the live-action overlay target is changed according to the yaw angular velocity ωyaw detected by the gyro sensor 43.

When the brightness before the change is P, the brightness Pc after the change is given by equation (3).

Pc = P · {1 + exp(-a(ωyaw + ω0))}^(-1) · {1 + exp(a(ωyaw - ω0))}^(-1)

   = P · G (3)

Here, the damping term G in equation (3) is the same function as in equation (2): G = 1 when the angular velocity |ωyaw| in the yaw direction is smaller than the threshold ω0, and G approaches 0 when |ωyaw| is larger than ω0.

Fig. 8B shows an example of correction using equation (3). When the angular velocity ωyaw is small, the display brightness Pc equals the original brightness P; when ωyaw is large, Pc approaches 0. That is, when the yaw angular velocity is large, the target display is dimmed or extinguished. The reason is that when the vehicle makes a sharp turn through a curve, the object to be overlapped in the real scene is likely to move in the left-right direction, so it is considered appropriate to stop the overlay display itself.

Fig. 9 shows an example of the display after correction by handling method 1. When the angular velocity ωyaw in the yaw direction is greater than the threshold ω0, the constant display target 82 (speed display, etc.) remains fixed without moving in response to pitching. On the other hand, the live-action overlay target 81 (warning, etc.) is extinguished by setting its brightness close to 0.

Fig. 10 illustrates the effect of extinguishing the live-action overlay target, showing the view outside the vehicle as seen from the driver's seat. While the orientation of the vehicle body changes sharply at a curve or intersection, the driver looks not at the area 73 directly ahead of the vehicle but toward the direction of travel 74. If a moving target (warning, etc.) 81 is displayed in the image display range 80 during this time, it may draw the driver's attention and distract from driving. Furthermore, when the detection speed of the front sensing device 30 is low, it cannot keep up with a sudden change in the vehicle body's orientation, and the warning 81 may be displayed at a position different from the object because of the detection delay. Extinguishing the live-action overlay target 81 during a sharp change of direction avoids these problems.

Fig. 11 is a flowchart of handling method 1. The control unit 11 of the HUD device executes the correction processing in the following flow.

S101: The angular velocities (ωyaw, ωpitch) in the yaw direction and the pitch direction are acquired from the gyro sensor 43.

S102: The angular velocity ωpitch in the pitch direction is filtered. Specifically, the offset is removed by a high-pass filter (HPF) and high-frequency noise is removed by a low-pass filter (LPF).

S103: The filtered ωpitch is integrated to calculate the pitch angle θpitch, which is further multiplied by an attenuation coefficient to prevent divergence.

S104: The pitch angle θpitch is corrected by equation (2) to calculate θc. That is, when the angular velocity |ωyaw| in the yaw direction is greater than the threshold ω0, θc is brought close to 0.

S105: The target brightness P is corrected by equation (3) to calculate Pc. That is, when the angular velocity |ωyaw| in the yaw direction is greater than the threshold ω0, the target brightness Pc is brought close to 0.

S106: The image data generation unit 132 moves the position of the constant display target based on the corrected pitch angle θc.

S107: The brightness of the live-action overlay target is changed according to the corrected brightness Pc. The targets after these corrections are displayed on the image display unit 50.

S108: it is determined whether or not the display of the video is completed, and if the display is continued, the process returns to S101 and the above-described process is repeated.
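The flow of S101 to S107 can be sketched as follows in Python. The first-order filter constants, the sensor period DT, and the hud object with its two methods are hypothetical assumptions, and damping_G is the function sketched earlier.

    DT = 0.01    # assumed sensor sampling period [s]
    LEAK = 0.98  # attenuation coefficient applied to the integrator (prevents divergence)

    class OnePoleHPF:
        """First-order high-pass filter: removes the sensor offset (S102)."""
        def __init__(self, alpha: float = 0.99):
            self.alpha, self.prev_x, self.y = alpha, 0.0, 0.0
        def step(self, x: float) -> float:
            self.y = self.alpha * (self.y + x - self.prev_x)
            self.prev_x = x
            return self.y

    class OnePoleLPF:
        """First-order low-pass filter: removes high-frequency noise (S102)."""
        def __init__(self, alpha: float = 0.2):
            self.alpha, self.y = alpha, 0.0
        def step(self, x: float) -> float:
            self.y += self.alpha * (x - self.y)
            return self.y

    hpf, lpf = OnePoleHPF(), OnePoleLPF()
    theta_pitch = 0.0

    def step_method1(omega_yaw, omega_pitch, base_brightness, hud):
        """One iteration of S101-S107 for a single sensor sample."""
        global theta_pitch
        w = lpf.step(hpf.step(omega_pitch))           # S102: filter omega_pitch
        theta_pitch = LEAK * theta_pitch + w * DT     # S103: leaky integration
        g = damping_G(omega_yaw)                      # damping term G
        theta_c = theta_pitch * g                     # S104: equation (2)
        p_c = base_brightness * g                     # S105: equation (3)
        hud.shift_constant_targets(theta_c)           # S106 (hypothetical API)
        hud.set_overlay_brightness(p_c)               # S107 (hypothetical API)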

[Handling method 2]

Next, handling method 2 is described as a modification of handling method 1. In handling method 2, the gyro sensor 43 measures the angular velocity about three axes, and the substantial pitch angle θc' is calculated taking into account the roll component ωroll of the angular velocity. Of the pitch component ωpitch, yaw component ωyaw, and roll component ωroll measured by the gyro sensor 43, the roll component ωroll is time-integrated to obtain the roll angle θroll.

The angular velocity ωc' used for the pitch correction is obtained by equation (4).

ωc’=ωpitch-ωyaw·tanθroll (4)

The angular velocity ωc' obtained by equation (4) is the substantial pitch component, from which the apparent component caused by the yaw component ωyaw and the roll angle θroll has been removed.

This ωc' is time-integrated to obtain the substantial pitch angle θc', and the display targets are corrected by shifting their positions according to θc'. In this case, the live-action overlay target is processed in the same way as the constant display target, without changing (extinguishing) its brightness.

Fig. 12 is a flowchart of handling method 2.

S111: The angular velocities (ωroll, ωyaw, ωpitch) in the roll, yaw, and pitch directions are acquired from the gyro sensor 43.

S112: The angular velocities in the respective directions (ωroll, ωyaw, ωpitch) are filtered. Specifically, the offset is removed by the HPF and high-frequency noise is removed by the LPF.

S113: The filtered ωroll is integrated to calculate the roll angle θroll, which is further multiplied by an attenuation coefficient to prevent divergence.

S114: The angular velocity ωc' is calculated from equation (4). At this time, the phases of ωpitch, ωyaw, and θroll are aligned.

S115: The angular velocity ωc' is time-integrated to calculate the substantial pitch angle θc', which is further multiplied by an attenuation coefficient to prevent divergence.

S116: The image data generation unit 132 moves the positions of the constant display target and the live-action overlay target according to the pitch angle θc'.

S117: it is determined whether or not the display of the video is completed, and if the display is continued, the process returns to S111, and the above-described process is repeated.
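A corresponding sketch of S111 to S116, reusing the filter classes, LEAK, and DT from the handling-method-1 sketch; only the roll-axis filter pair is shown, and the hud method is again a hypothetical interface.

    import math

    class Method2Corrector:
        """Sketch of handling method 2 (equation (4)); one instance keeps the
        integrator states between sensor samples."""
        def __init__(self):
            self.theta_roll = 0.0
            self.theta_c = 0.0
            self.hpf, self.lpf = OnePoleHPF(), OnePoleLPF()

        def step(self, omega_roll, omega_yaw, omega_pitch, hud):
            wr = self.lpf.step(self.hpf.step(omega_roll))       # S112 (roll axis shown;
                                                                #  pitch/yaw filtered likewise)
            self.theta_roll = LEAK * self.theta_roll + wr * DT  # S113
            omega_c = omega_pitch - omega_yaw * math.tan(self.theta_roll)  # S114: eq. (4)
            self.theta_c = LEAK * self.theta_c + omega_c * DT   # S115
            hud.shift_all_targets(self.theta_c)                 # S116 (hypothetical API)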

Comparing handling method 1 and handling method 2: since handling method 1 does not need to consider the roll direction, it can be implemented with a 2-axis gyro sensor, and the user can freely tune the threshold ω0 and the coefficient a to preference. Handling method 2 requires a 3-axis gyro sensor, but both kinds of targets can be kept displayed, corrected by the substantial pitch angle.

When the HUD device 10 is equipped with a 3-axis gyro sensor, the user can select between handling method 1 and handling method 2. The two methods may also be assigned according to the type of live-action overlay target. For example, because a pop-up information target containing text, such as store information, may draw the driver's gaze and delay awareness of danger, handling method 1 is preferably assigned to such targets, and handling method 2 to targets that do not contain text.

According to Embodiment 1, the phenomenon in which the display target moves in a direction unrelated to pitching during curve traveling can be avoided. This prevents the driver's attention from being diverted when the vehicle changes direction sharply, and contributes to maintaining safe driving.

Embodiment 2

In Embodiment 1, pitching (the rotational component) was handled as the correction target, but vehicle vibration contains both a rotational component and a translational component. Embodiment 2 handles both components. For this purpose, the gyro sensor 43 and the acceleration sensor 42 are used as the vehicle's shake sensors.

Figs. 13A and 13B illustrate the two vibration components to be corrected. Fig. 13A shows the rotational component, expressed by the pitch angle θpitch and referred to as "rotational shake". Fig. 13B shows the vertical translational component, expressed by the displacement h and referred to as "displacement shake". Both are factors that shift the display position of the target in the vertical direction.

To correct these, a camera-shake correction technique is applied using the gyro sensor 43 and the acceleration sensor 42. In camera shake correction, however, the shake is obtained at the imaging surface, and since the positional relationship between the sensor and the imaging surface is fixed, the position of an observer need not be considered. In Embodiment 2, by contrast, the shake must be obtained at the driver's position (viewpoint). Because the driver's position varies with height and posture, it is measured in real time by the viewpoint detecting device 20.

Figs. 14A and 14B illustrate the correction calculation in Embodiment 2. The gyro sensor 43 and the acceleration sensor 42 (collectively, the gyro/acceleration sensors) are installed in front of the driver 5. Two shake situations are shown: in Fig. 14A the fulcrum S of the rotational shake is at the rear of the vehicle, and in Fig. 14B it is at the front. To handle both the rotational shake and the displacement shake of the vehicle, the parameters used in the correction calculation are as follows.

The pitch component ωpitch of the angular velocity obtained from the gyro sensor 43 is integrated to calculate the pitch angle θpitch. The distance between the gyro/acceleration sensors 43, 42 and the driver 5 is L. The vertical displacement at the position of the sensors is hg, and the vertical displacement at the position of the driver 5 is hd. The radius of rotation when hg is regarded as being produced by the rotational shake is R. The correction calculation obtains the pitch angle θpitch as the rotational shake amount and the displacement hd at the driver's position as the displacement shake amount.

Among these parameters, the following relations hold.

hg = R·θpitch (where θpitch << 1) (5-1)

dhg/dt (= Vz) = R·(dθpitch/dt) (5-2)

d²hg/dt² (= αz) = R·(d²θpitch/dt²) (5-3)

The radius of rotation R is obtained from equation (5-2) or (5-3), and hg is obtained by substituting R into equation (5-1).

When equation (5-2) is used, the velocity dhg/dt is the value Vz obtained by removing the gravitational acceleration from the z component (αz) of the acceleration sensor 42 by filter processing (an HPF or the like) and time-integrating the result, and dθpitch/dt is the pitch component ωpitch of the gyro sensor 43.

When equation (5-3) is used, the acceleration d²hg/dt² is the value obtained by removing the gravitational acceleration from the z component (αz) of the acceleration sensor 42 by filter processing, and d²θpitch/dt² is the value dωpitch/dt obtained by time-differentiating the pitch component of the gyro sensor 43.
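A minimal Python sketch of this step follows. The inputs are assumed to be already filtered as described above, and the rule for choosing between equations (5-2) and (5-3) is an added assumption, since the text allows either.

    def radius_and_hg(theta_pitch, omega_pitch, domega_dt, v_z, alpha_z):
        """Rotation radius R via eq. (5-2) or (5-3), then hg via eq. (5-1)."""
        if abs(omega_pitch) > 1e-3:       # assumed heuristic: use eq. (5-2) when the
            R = v_z / omega_pitch         # pitch rate is safely away from zero
        else:
            R = alpha_z / domega_dt       # eq. (5-3)
        hg = R * theta_pitch              # eq. (5-1), valid for theta_pitch << 1
        return R, hg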

Fig. 15 illustrates the method of measuring the distance L between the gyro/acceleration sensors 43, 42 and the driver 5. Since L varies with the driver's height and posture, it is measured in real time using the distance measuring function (distance detector 46) of the viewpoint detecting device 20. In the figure, the distances La and Lb to drivers 5a and 5b are measured.

When the parameters hg, R, and L have been obtained as described above, the vertical displacement hd at the position of the driver 5 is calculated by equation (5-4).

hd=hg·(R-L)/R (5-4)

In correcting the display position of a target, the target position is shifted according to θpitch as the rotational shake amount and hd as the displacement shake amount.
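As a rough numerical illustration (values assumed, not from the patent): with hg = 20 mm at the sensor, R = 2.0 m, and a measured distance L = 1.5 m, equation (5-4) gives hd = 20 × (2.0 - 1.5)/2.0 = 5 mm; the driver, sitting closer to the fulcrum than the sensor, experiences a smaller displacement shake.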

Fig. 16 is a flowchart of Embodiment 2.

S201: The angular velocities (ωroll, ωyaw, ωpitch) about the three axes are acquired from the gyro sensor 43.

S202: The pitch component ωpitch is filtered (the offset is removed by the HPF and high-frequency noise by the LPF), and the filtered ωpitch is integrated to calculate the pitch angle θpitch (multiplied by an attenuation coefficient to prevent divergence).

S203: The filtered ωpitch is differentiated to calculate dωpitch/dt.

S204: The 3-axis accelerations (αx, αy, αz) are acquired from the acceleration sensor 42.

S205: The vertical acceleration component αz is filtered (the gravitational acceleration is removed by the HPF and high-frequency noise by the LPF) and then integrated to calculate the vertical velocity Vz (multiplied by an attenuation coefficient to prevent divergence).

S206: The rotation radius R and the vertical displacement hg at the sensor are calculated using αz, Vz, ωpitch, and dωpitch/dt obtained above; R is calculated from equation (5-2) or (5-3), and hg from equation (5-1).

S207: the distance L between the gyro/acceleration sensors 43 and 42 and the driver 5 is measured by the viewpoint detecting device 20.

S208: the vertical displacement hd of the driver 5 is calculated from the expression (5-4) using R, hg, and L obtained as described above.

S209: The display position of the target is changed according to the rotational shake amount θpitch and the displacement shake amount hd.

S210: it is determined whether or not the display is finished, and if the display is continued, the process returns to S201 and the above-described process is repeated.
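The whole Fig. 16 flow can be sketched as follows. The gyro, accel, viewpoint, and hud objects are hypothetical interfaces, and OnePoleHPF/OnePoleLPF, LEAK, DT, and radius_and_hg come from the earlier sketches.

    class Embodiment2Corrector:
        """Sketch of S201-S210; one instance keeps the filter and integrator
        states between sensor samples."""
        def __init__(self):
            self.theta = 0.0    # pitch angle theta_pitch
            self.vz = 0.0       # vertical velocity Vz
            self.w_prev = 0.0   # previous filtered omega_pitch (for S203)
            self.g_hpf, self.g_lpf = OnePoleHPF(), OnePoleLPF()
            self.a_hpf, self.a_lpf = OnePoleHPF(), OnePoleLPF()

        def step(self, gyro, accel, viewpoint, hud):
            _, _, w_raw = gyro.read()                      # S201: (roll, yaw, pitch)
            w = self.g_lpf.step(self.g_hpf.step(w_raw))    # S202: filter omega_pitch
            self.theta = LEAK * self.theta + w * DT        #       integrate to theta_pitch
            dw_dt = (w - self.w_prev) / DT                 # S203: differentiate
            self.w_prev = w
            _, _, az_raw = accel.read()                    # S204: (ax, ay, az)
            az = self.a_lpf.step(self.a_hpf.step(az_raw))  # S205: remove gravity and noise
            self.vz = LEAK * self.vz + az * DT             #       integrate to Vz
            R, hg = radius_and_hg(self.theta, w, dw_dt, self.vz, az)  # S206
            L = viewpoint.distance_to_driver()             # S207 (hypothetical API)
            hd = hg * (R - L) / R                          # S208: equation (5-4)
            hud.shift_targets(self.theta, hd)              # S209: apply both corrections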

In addition, the following modification of Embodiment 2 can be realized, in which the gyro sensor 43 and two acceleration sensors 42 are provided and the rotation radius is obtained by another method.

Fig. 17 illustrates the correction calculation in the modification. Acceleration sensors 42a and 42b are mounted at front and rear positions of the vehicle, separated by a fixed distance D. The pitch angle obtained from the gyro sensor 43 is θpitch. The vertical displacement at the position of the front acceleration sensor 42a is hga, and that at the rear acceleration sensor 42b is hgb. With the fulcrum S of the rotational shake set, the radius of rotation for the displacement hga is Ra and that for hgb is Rb.

The relationship between the parameters is expressed by the following equation.

hga = Ra·θpitch (where θpitch << 1) (6-1a)

hgb = Rb·θpitch (where θpitch << 1) (6-1b)

dhga/dt = Ra·(dθpitch/dt) (6-2a)

dhgb/dt = Rb·(dθpitch/dt) (6-2b)

d²hga/dt² = Ra·(d²θpitch/dt²) (6-3a)

d²hgb/dt² = Rb·(d²θpitch/dt²) (6-3b)

Ra - Rb = D (6-4)

The radii of rotation Ra and Rb are obtained from equations (6-2a) and (6-2b), or from equations (6-3a) and (6-3b).

First, the case of using equations (6-2a) and (6-2b) is as follows.

(dhga/dt)/Ra=(dhgb/dt)/Rb (6-5)

Here, writing the velocity components as (dhga/dt) = Va and (dhgb/dt) = Vb, the radii of rotation Ra and Rb are obtained from equations (6-4) and (6-5) as equations (6-6a) and (6-6b).

Ra=D·Va/(Va-Vb) (6-6a)

Rb=D·Vb/(Va-Vb) (6-6b)

Next, the case of using equations (6-3a) and (6-3b) is as follows.

(d²hga/dt²)/Ra = (d²hgb/dt²)/Rb (6-7)

Here, writing the acceleration components as (d²hga/dt²) = αa and (d²hgb/dt²) = αb, the radii of rotation Ra and Rb are obtained from equations (6-4) and (6-7) as equations (6-8a) and (6-8b).

Ra=D·αa/(αa-αb) (6-8a)

Rb=D·αb/(αa-αb) (6-8b)
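A minimal sketch of the two-sensor radius calculation follows; the choice between the velocity form (6-6a/b) and the acceleration form (6-8a/b) is passed in as a flag, and the guard against Va = Vb (a pure translation, where the radius is effectively infinite) is an added assumption.

    def radii_from_two_sensors(va, vb, aa, ab, D, use_velocity=True):
        """Radii of rotation Ra, Rb from two acceleration sensors mounted a fixed
        distance D apart, via eqs. (6-6a/b) or (6-8a/b)."""
        num_a, num_b = (va, vb) if use_velocity else (aa, ab)
        diff = num_a - num_b
        if abs(diff) < 1e-9:                   # equal values: pure displacement shake,
            return float('inf'), float('inf')  # radius effectively infinite
        return D * num_a / diff, D * num_b / diff  # (6-6a/b) or (6-8a/b)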

The signs of the rotation radii Ra and Rb are determined by the positional relationship between the acceleration sensors 42a, 42b and the fulcrum S. From equations (6-1a) and (6-1b), Rx (x = a, b) is positive when θpitch and the displacement hgx (x = a, b) have the same polarity (direction), and negative when they differ. In the case of Fig. 17, Ra and Rb are both positive.

Fig. 18 is a diagram illustrating a method of measuring the vertical displacement hd of the driver. The distance L between the acceleration sensor 42a in front and the driver 5 is measured using the distance measuring function (distance detector 46) of the viewpoint detecting device 20.

When the values of the parameters hga, Ra, and L are obtained as described above, the vertical displacement hd at the position of the driver 5 is calculated by the expression (6-9).

hd=hga·(Ra-L)/Ra (6-9)

In the target correction, the target position is shifted according to the rotational shake amount θpitch and the displacement shake amount hd. According to this modification, the value of the gyro sensor 43 does not have to be used each time the rotation radius is obtained.

According to Embodiment 2, both rotational shake and displacement shake are detected as vehicle vibration and the driver's viewpoint position is taken into account, so the display position of the display target can be corrected with high accuracy.
