Head-up display device

Document No.: 1493702 | Publication date: 2020-02-04

Reading note: This technology, "Head-up display device," was created by 下田望 and 泷泽和之 on 2018-05-25. Its main content is as follows. The invention can reduce the impairment of the field of view caused by abnormal display when a HUD failure occurs. The AR-HUD includes a vehicle information acquisition unit (10), a control unit (20), and an image display device (30). The vehicle information acquisition unit (10) acquires the various kinds of vehicle information that the vehicle is capable of detecting. Based on the vehicle information acquired by the vehicle information acquisition unit (10), the control unit (20) controls the display of an image shown in a display area visible from the driver's seat of the vehicle through the windshield. The image display device (30) generates the image based on instructions from the control unit (20). The control unit (20) acquires device information, which is information used to determine whether the device is abnormal, determines whether the device is abnormal based on the acquired device information, and, when the device is determined to be abnormal, performs a display content change process that changes the display content of the image projected onto the windshield.

1. A head-up display device that displays an image in a display region visible from a driver's seat of a vehicle through a windshield, comprising:

a vehicle information acquisition unit for acquiring the various kinds of vehicle information that the vehicle is capable of detecting;

a control unit for controlling the display of the image; and

an image display device for generating the image,

the control unit acquires device information that is information used to determine whether a device is abnormal, determines whether the device is abnormal based on the acquired device information, and performs a display content change process for changing the display content of the image when it determines that the device is abnormal.

2. A head-up display device as claimed in claim 1, wherein:

the control unit cuts off electric power supplied to the head-up display device as the display content changing process.

3. A head-up display device as claimed in claim 2, wherein:

the head-up display device has a storage unit for storing information, and

the control unit generates a flag indicating that the electric power of the head-up display device has been shut off before the electric power is shut off, stores the flag in the storage unit, confirms whether the flag is stored in the storage unit when the electric power is newly supplied to the head-up display device, and acquires the device information before an initial operation to determine whether the device is abnormal or not when the flag is confirmed to be stored in the storage unit.

4. A head-up display device as claimed in claim 1, wherein:

the control unit periodically acquires the device information while the display content changing process is being executed, determines whether or not the device is abnormal, and terminates the display content changing process being executed when it is determined that the device is not abnormal.

5. A head-up display device that displays an image in a display region visible from a driver's seat of a vehicle through a windshield, comprising:

a vehicle information acquisition unit for acquiring the various kinds of vehicle information that the vehicle is capable of detecting;

a control unit for controlling the display of the image; and

an image display device for generating the image,

the control unit determines whether an externally connected unit is abnormal, and performs a display content change process for changing the display content of the image when the unit is determined to be abnormal,

the unit outputs the vehicle information acquired by the vehicle information acquisition unit.

6. A head-up display device as claimed in claim 5, wherein:

the control unit cuts off electric power supplied to the head-up display device as the display content changing process.

7. A head-up display device as claimed in claim 1 or 5, wherein:

the control unit turns off the light source with which the image display device projects the image, as the display content change process.

8. A head-up display device as claimed in claim 1 or 5, wherein:

the control unit adjusts the amount of light emitted by the light source with which the image display device projects the image, as the display content change process.

9. A head-up display device as claimed in claim 1 or 5, wherein:

the control unit reduces the area of the display region as the display content change process.

10. A head-up display device as claimed in claim 1 or 5, wherein:

the head-up display device has a functional film whose transmissivity for the image projected onto the windshield by the image display device changes according to the level of the applied voltage, and

the control unit, as the display content change process, controls the voltage applied to the functional film so that the image is not transmitted.

Technical Field

The present invention relates to a technique of a head-up display device, and more particularly to a technique suitable for a head-up display device using AR (augmented reality).

Background

In a vehicle such as an automobile, a technique of using a Head Up Display (hereinafter, sometimes referred to as a HUD) that displays information by projecting the information on a windshield or the like is known.

The HUD is a device that, as described above, projects onto the windshield travel information such as the vehicle speed and engine speed, or information such as car navigation information. The driver can check this information without moving the line of sight to the instruments built into the dashboard (the so-called instrument panel), so the amount of line-of-sight movement can be greatly reduced.

The HUD as described above is a device that contributes to safe driving, but little consideration has been given to cases where a display failure occurs in the HUD. As a result, even if a failure occurs in the display of the HUD, the driver cannot take any action, and the display remains in an abnormal state until repairs are completed.

In a system for displaying such vehicle information, a technique is known in which information necessary for driving is displayed instead when, for example, a display device or a control device fails (for example, see patent document 1).

Disclosure of Invention

Technical problem to be solved by the invention

Consider, for example, a case where a failure occurs in the control circuit of the display system of a HUD while the vehicle is traveling. There are various modes in which a failure in the display-system control circuit makes the display of the HUD abnormal. If the display of the HUD simply disappears without any other symptom, safe driving is not hindered.

On the other hand, there are cases where a display is produced that hinders safe driving by the driver: for example, display abnormalities such as an increase in the display luminance of the HUD, information displayed on the HUD that is no longer updated, or the display of a large amount of unnecessary information.

When such a display abnormality occurs in the HUD, the driver's attention may be drawn to the display of the HUD, or the field of view may be impaired. As a result, a vehicle, person, sign, obstacle, or the like that should be noticed while driving may be overlooked or noticed only belatedly.

In recent years, HUDs have tended to have larger display areas from the viewpoint of driving assistance, and in a HUD with a large display area, when the above-described display abnormality occurs, the impairment of the driver's field of view can become a serious problem.

An object of the present invention is to provide a technique capable of reducing the impairment of the field of view caused by abnormal display due to HUD failures.

The above and other objects and novel features of the present invention will be apparent from the description of the present specification and the accompanying drawings.

Means for solving the problems

Of the inventions disclosed in this application, a typical one is briefly described as follows.

That is, a typical head-up display device displays an image in a display region that is visible from the driver's seat of the vehicle through the windshield.

The head-up display device includes a vehicle information acquisition unit, a control unit, and an image display device. The vehicle information acquisition unit acquires the vehicle information that the vehicle is capable of detecting. The control unit controls the display of the image. The image display device generates the image.

The control unit acquires device information that is information used to determine whether the device is abnormal, determines whether the device is abnormal based on the acquired device information, and performs a display content change process for changing the display content of the image when it determines that the device is abnormal.

In particular, the control unit cuts off the electric power supplied to the head-up display device as the display content change process. Alternatively, as the display content change process, the control unit turns off the light source with which the image display device projects the image.
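As a sketch only (not part of the patent text), the two display-content-change alternatives above can be modeled as a small dispatch over a HUD state; all function, field, and enum names here are illustrative assumptions:

```python
from enum import Enum, auto

class ChangeAction(Enum):
    CUT_POWER = auto()          # cut off power supplied to the HUD (claims 2, 6)
    LIGHT_SOURCE_OFF = auto()   # turn off the projection light source (claim 7)

def apply_display_content_change(action: ChangeAction, hud: dict) -> dict:
    """Apply the selected display-content-change process to a HUD state dict."""
    if action is ChangeAction.CUT_POWER:
        hud["power_on"] = False
        hud["light_source_on"] = False   # no power implies nothing is projected
    elif action is ChangeAction.LIGHT_SOURCE_OFF:
        hud["light_source_on"] = False   # power stays on; projection stops
    return hud
```

Either action removes the projected virtual image from the windshield, which is the common goal of the display content change process.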

Effects of the invention

The effects obtained by this typical invention are briefly described as follows.

(1) Information required for safe driving can be accurately displayed according to the driving condition of the vehicle.

(2) Owing to (1), a contribution can be made to safe driving.

Drawings

Fig. 1 is an explanatory diagram showing an outline of an example of an operation concept in the AR-HUD according to embodiment 1.

Fig. 2 is a functional block diagram showing an overview of an overall configuration example of the AR-HUD according to embodiment 1.

Fig. 3 is an explanatory diagram showing an outline of an example of a hardware configuration related to acquisition of vehicle information in the AR-HUD of fig. 2.

Fig. 4 is a functional block diagram showing the details of a configuration example of the AR-HUD of fig. 2.

Fig. 5 is a flowchart showing an outline of an example of the initial operation of the AR-HUD of fig. 2.

Fig. 6 is a flowchart showing an outline of an example of a normal operation of the AR-HUD of fig. 2.

Fig. 7 is an explanatory diagram showing an example of the configuration of the self-failure determination table included in the self-failure determination unit shown in fig. 4.

Fig. 8 is an explanatory diagram showing an example of the configuration of another self-failure determination table included in the self-failure determination unit shown in fig. 4.

Fig. 9 is an explanatory diagram showing an example of display of the windshield at the time of occurrence of a failure, which is performed based on the failure-time display determination unit included in the control unit of fig. 4.

Fig. 10 is a flowchart showing an example of the process of step S23 in fig. 6, that is, the process of confirming the occurrence of a failure and changing the display content.

Fig. 11 is an explanatory diagram showing an example of an abnormal display screen when a failure occurs in an AR-HUD based on the study by the present inventors.

Fig. 12 is an explanatory view showing another example of an abnormal display screen when a failure occurs in an AR-HUD, based on the study by the present inventors.

Fig. 13 is an explanatory diagram showing another example of the display on the windshield at the time of occurrence of a failure, which is performed by the failure-time display determination unit included in the control unit of fig. 4.

Fig. 14 is an explanatory diagram showing an example of the structure of the AR-HUD according to embodiment 2.

Detailed Description

In all the drawings for explaining the embodiments, the same components are denoted by the same reference numerals in principle, and redundant explanations thereof are omitted. In addition, hatching may be added even in plan views to make the drawings easier to understand.

(embodiment mode 1)

The embodiments will be described in detail below.

< action on AR-HUD >

Fig. 1 is an explanatory diagram schematically illustrating an example of an operation concept in a HUD device (hereinafter, may be referred to as "AR-HUD") that realizes the AR function according to embodiment 1.

As shown in fig. 1, the AR-HUD 1 as a head-up display device reflects an image displayed on an image display device 30, which includes a projector, an LCD (Liquid Crystal Display), or the like, with mirrors 51 and 52, and projects the image through an opening 7 onto the windshield 3 of the vehicle 2. The mirrors 51 and 52 are, for example, free-form surface mirrors or mirrors whose shape is asymmetrical with respect to the optical axis.

By viewing this image, the driver 5 sees the image projected onto the windshield 3 as a virtual image beyond the transparent windshield 3 in front of him or her. In the present embodiment, the position of the image projected onto the windshield 3 is adjusted, for example, by adjusting the angle of the mirror 52, so that the display position of the virtual image seen by the driver 5 can be adjusted in the vertical direction. The AR function can be realized by adjusting the display position of the virtual image so that it overlaps with the scenery outside the vehicle (roads, buildings, people, and the like).

In addition, the AR-HUD 1 can enlarge the display area of the image projected onto the windshield 3 to display more information on the windshield 3, for example by making the area of the mirror 52 larger.

< example of AR-HUD construction >

Fig. 2 is a functional block diagram schematically showing an example of the overall configuration of the AR-HUD1 according to embodiment 1.

As shown in fig. 2, the AR-HUD1 mounted on the vehicle 2 includes: the vehicle information acquiring unit 10, the control unit 20, the image display device 30, the mirror driving unit 50, the mirror 52, the speaker 60, and the like. In the example of fig. 2, the shape of the vehicle 2 is shown as a passenger car, but the present invention is not limited thereto, and can be suitably applied to a normal vehicle.

The vehicle information acquisition unit 10 is configured by information acquisition devices such as various sensors described later provided at various parts of the vehicle 2, and detects various events occurring in the vehicle 2 or detects and acquires values of various parameters in the running condition at predetermined intervals, thereby acquiring and outputting the vehicle information 4.

As shown in the figure, the vehicle information 4 can include, for example: speed information or shift position information of the vehicle 2, steering wheel angle information, lamp lighting information, outside light information, distance information, infrared ray information, engine operation/stop information, camera image information inside and outside the vehicle, acceleration gyro information, GPS (Global Positioning System) information, navigation information, vehicle-to-vehicle communication information, road-to-vehicle communication information, and the like.

The control unit 20 has a function of controlling the operation of the AR-HUD 1, and is implemented by, for example, a CPU (Central Processing Unit) and software executed on it, or by hardware such as a microcomputer or an FPGA (Field Programmable Gate Array).

As shown in fig. 2, the control unit 20 drives the image display device 30 to generate an image to be displayed as a virtual image based on the vehicle information 4 and the like acquired from the vehicle information acquisition unit 10, and appropriately reflects the image by the mirror 52 and the like to project the image on the windshield 3. Then, control is performed to adjust the display position of the virtual image display region 6.

As described above, the image display device 30 is a device including, for example, a projector and an LCD, generates an image for displaying a virtual image based on an instruction from the control unit 20, and projects or displays the image.

The mirror driving unit 50 adjusts the angle of the mirror 52 based on an instruction from the control unit 20, and adjusts the position of the display region 6 of the virtual image in the vertical direction.

The speaker 60 performs audio output for the AR-HUD 1. For example, it can provide voice guidance from the navigation system, or audio output for notifying the driver 5 of a warning or the like by the AR function.

< example of hardware configuration >

Fig. 3 is an explanatory diagram showing an outline of an example of a hardware configuration related to acquisition of the vehicle information 4 in the AR-HUD1 of fig. 2.

Here, the hardware configuration of a part of the vehicle information acquisition unit 10 and the control unit 20 is mainly shown. The acquisition of the vehicle information 4 is performed by information acquisition devices such as various sensors connected to the ECU21 under the Control of an ECU (Electronic Control Unit) 21, for example.

These information acquisition devices include, for example: a vehicle speed sensor 101, a shift position sensor 102, a steering wheel angle sensor 103, a headlight sensor 104, an illuminance sensor 105, a chromaticity sensor 106, a distance measuring sensor 107, an infrared sensor 108, an engine start sensor 109, an acceleration sensor 110, a gyro sensor 111, a temperature sensor 112, a wireless receiver for road-to-vehicle communication 113, a wireless receiver for vehicle-to-vehicle communication 114, a camera (inside the vehicle) 115, a camera (outside the vehicle) 116, a GPS receiver 117, and a VICS (Vehicle Information and Communication System; registered trademark, the same applies hereinafter) receiver 118.

It is not necessary to provide all of these devices, and other kinds of devices may also be provided. The vehicle information 4 obtainable from the devices actually provided can be used as appropriate.

The vehicle speed sensor 101 acquires speed information of the vehicle 2 of fig. 2. The shift position sensor 102 acquires current gear information of the vehicle 2. The steering wheel angle sensor 103 acquires steering wheel angle information.

The headlight sensor 104 acquires lamp lighting information, i.e., the on/off state of the headlights. The illuminance sensor 105 and the chromaticity sensor 106 acquire outside light information. The distance measuring sensor 107 acquires distance information between the vehicle 2 and external objects.

The infrared sensor 108 acquires infrared information such as the presence or absence of an object at close range to the vehicle 2 and its distance. The engine start sensor 109 detects engine operation/stop information.

The acceleration sensor 110 and the gyro sensor 111 acquire acceleration gyro information including acceleration and angular velocity as information of the posture and behavior of the vehicle 2. The temperature sensor 112 acquires temperature information inside and outside the vehicle.

The wireless receiver for road-to-vehicle communication 113 and the wireless receiver for vehicle-to-vehicle communication 114 acquire road-to-vehicle communication information received through road-to-vehicle communication between the vehicle 2 and a road or a sign, a signal, or the like, and vehicle-to-vehicle communication information received through vehicle-to-vehicle communication between the vehicle 2 and another vehicle in the vicinity, respectively.

The camera (inside vehicle) 115 and the camera (outside vehicle) 116 capture moving images of the conditions inside and outside the vehicle and acquire camera image information inside the vehicle and camera image information outside the vehicle, respectively. The camera (in-vehicle) 115 captures, for example, the posture, eye position, and movement of the driver 5 in fig. 1. By analyzing the obtained moving image, for example, the fatigue state of the driver 5 or the position of the line of sight can be grasped.

Further, the camera (outside of the vehicle) 116 captures the surrounding conditions such as the front and rear of the vehicle 2. By analyzing the obtained moving image, it is possible to grasp, for example, the presence or absence of a moving object such as another vehicle or a person in the vicinity, a road surface condition such as a building or a terrain, rain, snow, ice, unevenness, or the like, a road sign, or the like.

The GPS receiver 117 and the VICS receiver 118 acquire GPS information obtained by receiving GPS signals and VICS information obtained by receiving VICS signals, respectively. They may also be implemented as part of a car navigation system that acquires and uses such information.

< example of configuration of control section >

Fig. 4 is a functional block diagram showing the details of a configuration example of the AR-HUD 1 shown in fig. 2.

The control unit 20 includes, in more detail: the ECU21, the audio output unit 22, the nonvolatile memory 23, the memory 24, the light source adjusting unit 25, the distortion correcting unit 26, the display element driving unit 27, the mirror adjusting unit 29, the self-failure determination unit 80, the unit failure determination unit 81, and the failure-time display determination unit 82.

The ECU21 acquires the vehicle information 4 via the vehicle information acquisition portion 10 as shown in fig. 3, and records, saves, or reads the acquired information in the nonvolatile memory 23 or the memory 24 as necessary.

The nonvolatile memory 23 may store setting information such as setting values and parameters for various controls. The ECU21 also executes a dedicated program or the like to generate video data to be displayed as a virtual image on the AR-HUD 1.

The audio output unit 22 outputs audio information via the speaker 60 as necessary. The light source adjusting unit 25 adjusts the light emission amount of the video display device 30.

The distortion correcting unit 26 corrects distortion of the image generated by the ECU21 due to the curvature of the windshield 3 by image processing when the image is projected onto the windshield 3 of the vehicle 2 by the image display device 30. The display element driving unit 27 transmits a driving signal corresponding to the image data corrected by the distortion correcting unit 26 to the image display device 30, and generates an image to be projected. When the position of the virtual image display region 6 itself needs to be adjusted, the mirror adjustment unit 29 changes the angle of the mirror 52 by the mirror driving unit 50 to move the virtual image display region 6 up and down.

The self-failure determination unit 80 determines the presence or absence of an abnormality in hardware or the like in the AR-HUD1, that is, a device abnormality. When determining that the device abnormality of the AR-HUD1 has occurred, the self-failure determination unit 80 generates an abnormality occurrence signal and outputs the signal to the failure-time display determination unit 82.

The unit failure determination unit 81 determines whether or not a unit mounted on the vehicle 2 other than the AR-HUD 1 has failed. When the unit failure determination unit 81 determines that an abnormality has occurred in another unit, it generates an abnormality occurrence signal and outputs the signal to the failure-time display determination unit 82.

The failure-time display determination unit 82 receives the abnormality occurrence signal output from the self-failure determination unit 80 or the unit failure determination unit 81, and controls the image display device 30 so that the displayed virtual image does not hinder the driver's driving.

< processing contents >

Fig. 5 is a flowchart showing an outline of an example of the initial operation in the AR-HUD1 of fig. 2.

When the ignition switch is turned on and the power supply of the AR-HUD1 is turned on in the stopped vehicle 2 (step S01), the AR-HUD1 first acquires the vehicle information 4 by the vehicle information acquisition unit 10 based on an instruction from the control unit 20 (step S02).

Then, the control unit 20 calculates an appropriate brightness level based on the outside light information acquired by the illuminance sensor 105, the chromaticity sensor 106, and the like in the vehicle information 4 (step S03), and the light source adjusting unit 25 controls the light emission amount of the image display device 30 so as to achieve the calculated brightness level (step S04). For example, the brightness level is raised when the outside light is bright, and lowered when the outside light is dark.
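The brightness-level calculation of steps S03 and S04 can be sketched as a simple mapping from measured illuminance to a display level. The tiers, lux boundaries, and level values below are illustrative assumptions, not values given in the patent:

```python
def brightness_level(illuminance_lux: float) -> int:
    """Return a display brightness level (0-100) that rises with outside light.

    All boundary and level values are illustrative assumptions.
    """
    if illuminance_lux < 50:     # night or tunnel: dim the virtual image
        return 20
    if illuminance_lux < 5000:   # overcast daytime: moderate brightness
        return 60
    return 100                   # direct sunlight: maximum brightness
```

The light source adjusting unit would then drive the image display device's light emission to match the returned level.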

Then, the ECU21 determines and generates a video to be displayed as a virtual image, for example, an initial image or the like (step S05), the distortion correcting unit 26 performs a process of correcting distortion on the generated video (step S06), and the display element driving unit 27 drives and controls the display element of the video display device 30 to generate a video to be projected (step S07). This causes the image to be projected on the windshield 3, and the driver 5 can view a virtual image.

When the startup and activation of each unit, including the series of initial operations described above, are completed in the entire AR-HUD 1, a HUD-ON signal is output, and the control unit 20 determines whether or not this signal has been received (step S08).

If not, the control unit waits a certain time for the HUD-ON signal to be received (step S09), and this waiting is repeated until it is determined in the processing of step S08 that the HUD-ON signal has been received.

When it is determined in the process of step S08 that the HUD-ON signal has been received, a normal operation of the AR-HUD1 (step S10), which will be described later, is started, and a series of initial operations are terminated.
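The HUD-ON wait loop of steps S08 and S09 can be sketched as a polling loop. The function name, polling interval, and the timeout escape hatch are assumptions added for illustration only; the loop described in the patent simply repeats until the signal arrives:

```python
import time

def wait_for_hud_on(signal_received, poll_interval_s: float = 0.01,
                    timeout_s: float = 1.0) -> bool:
    """Sketch of steps S08/S09: poll for the HUD-ON signal, waiting a short
    interval between checks. The timeout is a sketch-only addition."""
    deadline = time.monotonic() + timeout_s
    while not signal_received():          # step S08: has HUD-ON arrived?
        if time.monotonic() >= deadline:
            return False                  # sketch-only escape hatch
        time.sleep(poll_interval_s)       # step S09: wait, then check again
    return True                           # proceed to normal operation (S10)
```

On a `True` return the control unit would start the normal operation of fig. 6 and end the initial operation.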

< example of normal operation >

Fig. 6 is a flowchart showing an outline of an example of a normal operation in the AR-HUD1 of fig. 2.

In the normal operation, the basic processing flow is substantially the same as the initial operation shown in fig. 5. First, the AR-HUD1 acquires the vehicle information 4 through the vehicle information acquisition unit 10 based on an instruction from the control unit 20 (step S21).

Next, the control unit 20 determines whether or not to execute the process of step S23, which will be described later (step S22). In the determination of step S22, it is decided that the process of step S23 is to be executed when a preset time has elapsed since the previous execution.

When the control unit 20 determines that the set time has elapsed since the previous execution, it performs the failure occurrence status confirmation and display content change processing (step S23). This processing confirms the failure occurrence status of the AR-HUD 1 and changes the content displayed by the AR-HUD 1 accordingly. The processing of step S23 will be described in detail with reference to fig. 10, described later.

In the process of step S22, if the set time has not elapsed from the previous process, the control unit 20 performs the brightness level adjustment process based on the outside light information acquired by the illuminance sensor 105, the chromaticity sensor 106, and the like in the vehicle information 4 (step S24).
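The interval-based branch of step S22 can be sketched as a pure timing decision. The function name and the one-second default interval are illustrative assumptions:

```python
def next_step(last_check_s: float, now_s: float,
              interval_s: float = 1.0) -> str:
    """Decide which step the normal-operation loop executes this iteration.

    Runs the failure check (S23) only when the preset interval has elapsed
    since the previous check; otherwise runs brightness adjustment (S24).
    """
    if now_s - last_check_s >= interval_s:
        return "S23"   # failure occurrence confirmation and display change
    return "S24"       # brightness level adjustment from outside light
```

The caller would record the time of each S23 execution and pass it back as `last_check_s` on the next loop iteration.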

Then, the ECU21 changes the video displayed as a virtual image from the current video as necessary based on the latest vehicle information 4 acquired in the process of step S21, and determines and generates a changed video (step S25).

The manner of changing the display content based on the vehicle information 4 can take many forms, depending on the content of the acquired vehicle information 4, combinations thereof, and the like. For example, the numerical value of the normally displayed speed may change as the speed information changes, an arrow graphic for route guidance based on the navigation information may be displayed or removed, or the shape or display position of the arrow may change.

Thereafter, adjustment correction processing for maintaining the visibility and the appropriateness of the display content and the like is performed in accordance with the traveling condition of the vehicle 2. When the position of the virtual image display region 6 itself needs to be adjusted, the mirror driving unit 50 changes the angle of the mirror 52, and performs mirror adjustment processing for moving the virtual image display region 6 up and down (step S26).

When the power supply is turned OFF or the like in association with the stop or the like of the vehicle 2 while the series of normal operations is being performed, a HUD-OFF signal is output to the AR-HUD1, and the control unit 20 determines whether or not the signal has been received (step S27).

If the HUD-OFF signal is not received, the process returns to step S21, and a series of normal operations are repeated until the HUD-OFF signal is received. When it is determined that the HUD-OFF signal has been received, the series of normal operations is terminated.

Next, a description will be given of the techniques by which the self-failure determination unit 80 and the unit failure determination unit 81 determine an abnormality in the processing of step S23 of fig. 6.

< example of configuration of self-failure determination table >

Fig. 7 is an explanatory diagram illustrating an example of the configuration of the self-failure determination table TB1 included in the self-failure determination unit 80 of fig. 4.

The self-failure determination unit 80 determines whether or not an abnormality has occurred in the AR-HUD 1 by referring to the self-failure determination table TB1. The self-failure determination table TB1 is stored, for example, in a memory (not shown) included in the self-failure determination unit 80. Alternatively, it may be stored in the nonvolatile memory 23 included in the control unit 20 of fig. 4, in a memory (not shown) included in the control unit 20, or the like.

As shown in fig. 7, the data structure of the self-failure determination table TB1 consists of parameters, states, normal ranges, threshold values, current values, and the like. The current values of the parameters in the self-failure determination table TB1 serve as the device information.

A parameter is information referred to when the self-failure determination unit 80 determines whether or not an abnormality has occurred in the AR-HUD 1, and is data acquired by the self-failure determination unit 80.

These parameters include information such as the temperature of the image display device 30, the temperature of the CPU serving as the control unit 20, the light emission intensity (luminance intensity) of the light emitted from the image display device 30, and the angle of the mirror 52. The parameters are acquired periodically at certain intervals.

The threshold for the CPU temperature is set to a temperature reached before a failure or the like occurs in the CPU, that is, a preventive temperature reached before the CPU falls into a failure such as complete runaway. If this threshold were set high, for example at a temperature just before the CPU fails, the abnormality would only be detected immediately before runaway or malfunction of the CPU occurs, and control of the AR-HUD 1 might already be difficult; the threshold is therefore given a margin.

The state indicates an acquisition state of the parameter, and is readable when the information of the parameter can be acquired, and is unreadable when a failure occurs in the sensor or the like and the parameter cannot be acquired.

The normal range indicates a range of normal values of each parameter. The current value is a value when the self failure determination unit 80 has acquired the parameter. Thus, the current value is updated each time the self failure determination unit 80 acquires the parameter.

The self failure determination unit 80 compares the current value of each acquired parameter with the threshold value preset for that parameter, and determines that an abnormality has occurred in the AR-HUD1 when the current value exceeds the threshold value.

For example, when the temperature of the video display device 30 exceeds 80°C, which is its threshold value, a failure of the video display device 30 may occur; the self failure determination unit 80 therefore determines that a failure has occurred, or is likely to occur, in the AR-HUD1, generates an abnormality occurrence signal, and outputs it.
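As a minimal sketch, the determination described above can be expressed as follows. The table rows, field names, and numeric values here are illustrative assumptions, not values taken from the actual device:

```python
# Hypothetical sketch of the self failure determination based on a table
# like TB1. Each row holds the acquisition state, threshold, and current
# value of one parameter; all names and numbers are assumptions.
tb1 = {
    "display_temp_c": {"state": "readable", "threshold": 80.0, "current": 75.0},
    "cpu_temp_c":     {"state": "readable", "threshold": 90.0, "current": 95.0},
}

def check_self_failure(table):
    """Return True (an abnormality occurrence signal) if any parameter is
    unreadable or its current value exceeds its threshold."""
    for name, row in table.items():
        if row["state"] != "readable":       # the sensor itself has failed
            return True
        if row["current"] > row["threshold"]:  # e.g. CPU temperature too high
            return True
    return False

print(check_self_failure(tb1))  # True: cpu_temp_c exceeds its threshold
```

Treating an unreadable state as an abnormality mirrors the description above: a sensor that cannot be read is itself a sign of failure.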

In addition to the parameters shown in fig. 7, other parameters may be acquired, which indicate, for example, the operating state of the sensor or the like that acquired the parameters of fig. 7.

For example, temperature sensors are used to acquire the temperature of the image display device 30 and the temperature of the CPU. Therefore, as other parameters, information such as the continuous operating time, the accumulated operating time, and the accumulated error count of each sensor such as a temperature sensor can be acquired.

The continuous operating time indicates, for example, the time during which the temperature sensor has been operating continuously. The accumulated operating time is an accumulated value of the operating time of the temperature sensor or the like. The accumulated error count is the number of measurement errors that occurred when temperature information was measured by the temperature sensor or the like.

Thresholds are also set in advance for these other parameters. When such a threshold is exceeded, not only for the parameters shown in fig. 7 but also for the continuous operating time, the accumulated error count, and the like, the sensor may be close to the end of its lifetime; as a result, it is determined that there is a high possibility of a failure or the like occurring in the AR-HUD1, and an abnormality occurrence signal is generated and output.

< Example of Structure of Other-Unit Failure Determination Table >

Fig. 8 is an explanatory diagram showing an example of the configuration of the other-unit failure determination table TB2 included in the unit failure determination unit 81 of fig. 4.

As shown in fig. 8, the other-unit failure determination table TB2 has information such as the communication destination, the predetermined communication interval, the failure determination threshold, the destination, the upper limit of the number of consecutive errors, and the current number of consecutive errors.

The communication destination indicates a unit that periodically communicates with the control unit 20. In this case, the communication destination is, for example, a unit included in the vehicle information acquisition unit 10 that acquires the vehicle information 4 of fig. 2, such as the GPS receiver 117, the vehicle speed sensor 101, or the camera 116 of fig. 3.

The predetermined communication interval indicates the time of the communication interval scheduled between the control unit 20 and the unit. The failure determination threshold is a threshold used when determining that there is an abnormality in the communication between the control unit 20 and the unit.

The unit failure determination unit 81 compares the actual communication interval of each unit with the failure determination threshold, and when the time during which communication is interrupted exceeds the failure determination threshold, it generates and outputs an abnormality occurrence signal.

The destination indicates a unit to which the control unit 20 transmits data. The upper limit of the number of consecutive errors indicates the upper limit value of the number of consecutive transmission errors between the control unit 20 and the unit. The current number of consecutive errors indicates the number of transmission errors that have currently occurred consecutively between the control unit 20 and the unit.

The unit failure determination unit 81 compares the current number of consecutive errors with the upper limit of the number of consecutive errors, and generates and outputs an abnormality occurrence signal when the current number of consecutive errors exceeds the upper limit. Here, the time of the communication interval between the control unit 20 and the unit and the current number of consecutive errors serve as the unit information.
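The two determinations of TB2 can be sketched as follows; the function name, parameter names, and example values are illustrative assumptions, not taken from the actual device:

```python
# Hypothetical sketch of the unit failure determination of TB2: a unit is
# judged abnormal when its communication has been interrupted longer than
# the failure determination threshold, or when the number of consecutive
# transmission errors exceeds the upper limit.
def check_unit_failure(elapsed_since_last_comm, failure_threshold,
                       consecutive_errors, error_upper_limit):
    if elapsed_since_last_comm > failure_threshold:
        return True   # communication interruption exceeded the threshold
    if consecutive_errors > error_upper_limit:
        return True   # too many consecutive transmission errors
    return False

# e.g. a unit expected to report every 1.0 s, threshold 5.0 s, error limit 3
print(check_unit_failure(6.2, 5.0, 0, 3))  # True: interval exceeded
print(check_unit_failure(0.9, 5.0, 2, 3))  # False: within both limits
```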

Next, the operation of the display content changing process performed by the failure-time display determination unit 82 when a failure has occurred will be described.

< display example >

Fig. 9 is an explanatory diagram showing an example of the display on the windshield 3 produced by the failure-time display determination unit 82 provided in the control unit 20 of fig. 4 when an abnormality occurs.

Fig. 9 schematically shows an example of a state in which the driver 5 of the vehicle 2 of fig. 2 views, from the driver's seat through the windshield 3, the landscape ahead together with the display area 6 in which a virtual image generated by the AR-HUD1 is projected on the windshield 3.

Upon receiving the abnormality occurrence signal, the failure-time display determination unit 82 executes a process of turning off the AR-HUD1 itself. Since the AR-HUD1 is off, nothing is displayed in the display area 6 indicated by the dotted lines in fig. 9. This ensures the field of view of the driver 5, and safe driving can be maintained.

The shutdown of the AR-HUD1 may be performed, for example, by a unit that operates in cooperation with the AR-HUD1, specifically an ECU or the like included in an automated driving system that assists the automated driving of the vehicle 2, cutting off the electric power supplied to the AR-HUD1. In this case, the failure-time display determination unit 82 outputs a control signal to the ECU or the like included in the above-described unit. Upon receiving the control signal from the failure-time display determination unit 82, the ECU performs the process of cutting off the power supply to the AR-HUD1.

Alternatively, the failure-time display determination unit 82 itself may perform the process of cutting off the power supply to the AR-HUD1 when a failure occurs. When the AR-HUD1 is shut down by the failure-time display determination unit 82, the unit outputs a flag indicating that the shutdown process has been performed.

The flag output by the failure-time display determination unit 82 is stored in, for example, the nonvolatile memory 23 of fig. 4. When the flag is present at the time of starting the AR-HUD1, the control unit 20 performs self-diagnosis and, after confirming that there is no abnormality in the various parameters, performs the initial operation.
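This startup check can be sketched as follows; the dictionary standing in for the nonvolatile memory 23 and the function names are assumptions for illustration:

```python
# Hypothetical sketch of the startup flag check: when a shutdown flag was
# stored in nonvolatile memory, a self-diagnosis must pass before the
# initial operation is performed.
def self_diagnosis_ok():
    # Placeholder: in the device this would re-check the various
    # parameters (e.g. the TB1 entries) for abnormalities.
    return True

def startup(nonvolatile):
    """nonvolatile: a dict standing in for the nonvolatile memory 23."""
    if nonvolatile.get("shutdown_flag"):
        if not self_diagnosis_ok():
            return "halt"                      # abnormality still present
        nonvolatile["shutdown_flag"] = False   # clear the flag after a clean check
    return "initial_operation"

mem = {"shutdown_flag": True}
print(startup(mem))  # initial_operation
```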

As another process for not displaying the virtual image in the display region 6, the failure-time display determination unit 82 performs, for example, a process of adjusting the light emission amount of the video display device 30 by controlling the light source adjustment unit 25 via the ECU.

Specifically, when the image display device 30 is an LCD, the backlight, which is the light source of the LCD, is turned off. As a result, the AR-HUD1 enters a state in which it continues the operation of displaying a virtual image, but nothing is displayed in the display area 6.

When the backlight is turned off and nothing is displayed in the display region 6, there is an advantage that the display of the virtual image in the display region 6 can later be resumed simply by turning the backlight on again.

For example, in fig. 7, since the temperature of the CPU exceeds the threshold value, the backlight of the LCD is turned off, and no virtual image is displayed in the display region 6. In this case, the AR-HUD1 itself is not shut down, and therefore the control unit 20 continues to operate.

As shown in fig. 6, while the AR-HUD1 is operating, the check of the failure occurrence state (the processing of step S22) is executed at certain intervals. Thus, even after the backlight of the LCD is turned off, the monitoring of the parameters can be continued. After the backlight of the LCD is turned off, if it is determined by the process of step S22 that the temperature of the CPU is equal to or lower than the threshold value, the control unit 20 can perform control to turn on the backlight again.

Further, the backlight may be turned on immediately when the current value of the parameter becomes equal to or lower than the threshold value, or may be turned on after a predetermined time has elapsed since the current value of the parameter became equal to or lower than the threshold value.
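The two recovery behaviors can be sketched as a single function with a hold time; a hold time of zero gives immediate recovery. The interface is an assumption for illustration:

```python
# Hypothetical sketch of the backlight recovery decision: the backlight
# turns on either immediately when the parameter returns to or below the
# threshold (hold_time=0), or only after it has stayed there for a
# predetermined time.
def backlight_should_turn_on(current, threshold, below_since, now, hold_time=0.0):
    """below_since: the time at which the value first fell to <= threshold,
    or None if it is still above the threshold."""
    if current > threshold or below_since is None:
        return False
    return (now - below_since) >= hold_time

# immediate recovery (hold_time defaults to 0)
print(backlight_should_turn_on(78.0, 80.0, below_since=10.0, now=10.0))  # True
# delayed recovery: only 2 s of the required 5 s have elapsed
print(backlight_should_turn_on(78.0, 80.0, below_since=10.0, now=12.0,
                               hold_time=5.0))                           # False
```

The hold time acts as a simple hysteresis so that a value hovering near the threshold does not flicker the backlight on and off.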

This makes it possible to restore the display of the virtual image on the display area 6 without performing processes such as shutdown and restart of the AR-HUD1, thereby improving the convenience of the driver 5.

Next, the confirmation of the failure occurrence state and the display content changing process will be described in detail.

< example of confirmation of failure occurrence status and display content changing processing >

Fig. 10 is a flowchart showing an example of the process of step S23 in fig. 6, that is, the process of confirming the occurrence of a failure and changing the display content.

Fig. 10 shows, as an example of the display content changing process, a process in which, for example, the backlight of the LCD is turned off so that a virtual image is not displayed in the display region 6.

First, the self failure determination unit 80 and the unit failure determination unit 81 each acquire their target parameters (step S31), and store the acquired parameters in the self failure determination table TB1 of fig. 7 and the other-unit failure determination table TB2 of fig. 8, respectively (step S32).

Next, the control unit 20 determines whether or not the special display for a failure is being executed (step S33). In fig. 10, the special display is a display different from the normal virtual-image display of the AR-HUD1, such as turning off the backlight of the LCD.

In the process of step S33, if the special display is not being performed, that is, if it is determined that the backlight of the LCD is on, the self failure determination unit 80 and the unit failure determination unit 81 refer to the self failure determination table TB1 of fig. 7 and the other-unit failure determination table TB2 of fig. 8, respectively, and determine whether or not the current values of the acquired parameters exceed their threshold values (step S34).

When the current value of the parameter does not exceed the threshold value, the confirmation of the failure occurrence state and the display content changing process are ended. In the processing of step S34, when the current value of a parameter exceeds its threshold value, the parameter is acquired again and it is determined whether the re-acquired current value still exceeds the threshold value (step S35). The process of step S35 is repeated a predetermined number of times. It is repeatedly executed in order to prevent a situation in which a failure is determined to have occurred when the current value of a parameter exceeds the threshold value only once, for example due to a malfunction of a sensor or the like.

If it is determined in the process of step S35 that the current value of the parameter does not exceed the threshold value, the process stands by for a predetermined time (step S36) and then returns to step S34. If it is determined in the process of step S35 that the current value of the parameter exceeds the threshold value, the special display, that is, turning off the backlight of the LCD, is performed (step S37).
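The retry confirmation of steps S34 to S37 can be sketched as follows. This is a simplified illustration (in the device, step S36 stands by and returns to step S34 rather than ending), and all names are assumptions:

```python
# Hypothetical sketch of the repeated confirmation of steps S34-S37: a
# single threshold excess is not treated as a failure; the parameter is
# re-read a predetermined number of times, and the special display is
# executed only if every re-read still exceeds the threshold.
def confirm_failure(read_value, threshold, retries=3):
    """read_value is a callable returning the current parameter value."""
    for _ in range(retries):
        if read_value() <= threshold:
            return False  # transient glitch (e.g. sensor malfunction): no failure
        # in the device, a predetermined standby time would elapse here (S36)
    return True  # consistently above threshold -> perform special display (S37)

readings = iter([85.0, 85.0, 85.0])
print(confirm_failure(lambda: next(readings), 80.0))  # True
```

This debouncing avoids blanking the display for a one-off spurious reading while still reacting within a bounded number of acquisition cycles.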

In addition, in the process of step S33, when it is determined that the special display is being performed, that is, the backlight of the LCD is off, the self failure determination unit 80 and the unit failure determination unit 81 refer to the self failure determination table TB1 of fig. 7 and the other-unit failure determination table TB2 of fig. 8, respectively, and determine whether or not the current values of the acquired parameters exceed their threshold values (step S38).

In the process of step S38, when the current value of the parameter exceeds the threshold value, the confirmation of the failure occurrence state and the display content changing process are ended, and the backlight remains off. When the current value of the parameter does not exceed the threshold value in the process of step S38, no failure has occurred, so the special display for the failure is released, that is, the backlight of the LCD is turned on (step S39).

After that, the control unit 20 determines the content displayed in the display area 6 of the AR-HUD1 (step S40), and executes the display processing of the determined content (step S41).

Through the above steps, the failure occurrence status confirmation and display content change processing are completed.

In fig. 10, an example is shown in which both the self failure determination unit 80 and the unit failure determination unit 81 detect an abnormality, but the device abnormality detection may be performed only by the self failure determination unit 80. Alternatively, only the unit failure determination unit 81 may detect a unit failure.

< about problem >

Here, a problem when an abnormal display occurs due to a failure or the like in the AR-HUD1 will be described.

Fig. 11 is an explanatory diagram showing an example of an abnormal display screen when a failure occurs in the AR-HUD1, based on the study by the present inventors. Fig. 12 is an explanatory diagram showing another example of an abnormal display screen when a failure occurs in an AR-HUD, based on the study by the present inventors.

Fig. 11(a) is an example of abnormal display of a guidance display generated by navigation in the display area 150 of the virtual image viewed by the driver from the driver's seat through the windshield. In this case, as shown in the upper part of fig. 11(a), although the vehicle 151 has already passed the left-turn point P1 of the intersection, the left-turn guidance continues to be displayed. In addition, the numerical value indicating the distance to the left-turn point P1 is also displayed erroneously.

Fig. 11(b) shows an example in which the entire or most of the display area 150 is displayed in a smeared manner due to abnormal display. Fig. 12 shows a state in which a large number of unnecessary and irrelevant objects are displayed in the entire display area 150.

As shown in figs. 11 and 12, when the AR-HUD1 malfunctions and produces an abnormal display, the driver may be distracted by the display in the display area 150 and neglect the driving action. Moreover, as shown in figs. 11(b) and 12, a poor front view may result.

On the other hand, with the AR-HUD1 of the present invention, a failure of the AR-HUD1 is detected as early as possible, and, as shown in fig. 9 and the like, the virtual image in the display region 6 can be prevented from being displayed. This makes it possible to avoid situations in which the driver 5 is distracted by looking at the display area 6, or cannot see important safety information because of a poor forward view caused by abnormal display.

With the above, the field of view of the driver 5 can be secured even when the AR-HUD1 fails, and therefore, the safety of the vehicle during traveling can be improved.

In embodiment 1, the virtual image is not displayed in the display region 6 when a failure occurs; alternatively, for example, the display in the display region 6 may be maintained as it is while the brightness, that is, the luminance of the display in the display region 6, is minimized.

Alternatively, the region in which the virtual image is displayed may be made smaller, in other words, its area may be reduced.

< other display example >

Fig. 13 is an explanatory diagram showing another example of the display to the windshield 3 at the time of occurrence of a failure by the failure display determination unit 82 provided in the control unit 20 in fig. 4.

In fig. 13, the broken line indicates the display area 6 during normal operation before a failure occurs, and the solid line indicates the display area 6a in the case where a failure has occurred. During normal operation, a virtual image is displayed in the display region 6; when a failure occurs, the display is switched to the display region 6a, in which the area of the display region is reduced. In the reduced display region 6a, a warning such as a failure notification can be displayed to notify the driver 5.

This ensures the forward field of view of the driver 5. This reduction of the display area is effective particularly for an AR-HUD that displays a virtual image in a wide range.

This also makes the information displayed in the display area inconspicuous, so that the impairment of the front view can be reduced.

(embodiment mode 2)

< example of AR-HUD construction >

Fig. 14 is an explanatory diagram showing an example of the structure of the AR-HUD1 according to embodiment 2.

In embodiment 2, an example in which a functional film is used is described as a technique for preventing a virtual image from being displayed when a failure occurs.

In this case, the AR-HUD1 has a structure in which a functional film 120 is newly added to the structure of embodiment 1 shown in figs. 1 and 2. The functional film 120 switches between transparent and white according to whether a voltage is applied. For example, the functional film has a property of turning white and blocking light when a voltage is applied, and of being transparent and transmitting light when no voltage is applied.

The functional film 120 is provided in the opening 7 of fig. 1 of embodiment 1. During normal operation, the applied voltage is controlled so that the film is transparent as shown in fig. 14(a), and light is transmitted. When a failure occurs, the applied voltage is controlled so that the film turns white as shown in fig. 14(b), and light is not transmitted.

Thus, when a failure occurs, no virtual image is displayed in the display region 6, as shown in fig. 9. The control of the voltage applied to the functional film 120 is performed by, for example, the failure-time display determination unit 82 of fig. 4.
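The voltage control of the functional film reduces to a single binary decision, sketched below; the function name and returned state labels are assumptions for illustration:

```python
# Hypothetical sketch of the functional film control of embodiment 2: the
# film transmits light when no voltage is applied and turns white (blocking
# the projection light) when a voltage is applied, so a failure simply
# drives the voltage on.
def set_functional_film(failure_occurred):
    """Return the voltage state to apply to the functional film 120."""
    # normal operation: no voltage -> transparent, virtual image visible
    # failure:          voltage on -> white, projection light is blocked
    return "voltage_on_white" if failure_occurred else "voltage_off_transparent"

print(set_functional_film(False))  # voltage_off_transparent
print(set_functional_film(True))   # voltage_on_white
```

Because blocking relies only on applying a voltage to the film, the virtual image can be hidden even if the image display device 30 itself continues to emit light.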

This can prevent the occurrence of a poor forward view due to the driver 5 looking at the display area 6 or due to abnormal display.

With the above, it is possible to contribute to safe driving.

The invention completed by the present inventors has been specifically described above based on the embodiments, but the present invention is not limited to the above embodiments, and various modifications can be made without departing from the scope of the invention.

The present invention is not limited to the above-described embodiments, and various modifications can be made. For example, the above-described embodiments are detailed descriptions for facilitating understanding of the present invention, and are not limited to having all of the described configurations.

In addition, a part of the structure of one embodiment may be replaced with the structure of another embodiment, or the structure of another embodiment may be added to the structure of one embodiment. Further, a part of the configuration of each embodiment can be added, deleted, or replaced with another configuration.

Description of reference numerals

1 AR-HUD

2 vehicle

3 windscreen

4 vehicle information

5 driver

6 display area

7 opening part

10 vehicle information acquisition unit

20 control part

22 sound output unit

23 non-volatile memory

24 memory

25 light source adjusting part

26 distortion correcting unit

27 display element driving section

29 mirror adjusting part

30 image display device

50 mirror driving unit

51 reflecting mirror

52 mirror

60 loudspeaker

80 self-failure determination unit

81 unit failure determination unit

82 failure-time display determination unit

101 vehicle speed sensor

102 shift position sensor

103 steering wheel angle sensor

104 headlamp sensor

105 illuminance sensor

106 colorimetric sensor

107 ranging sensor

108 infrared sensor

109 engine start sensor

110 acceleration sensor

111 gyroscope sensor

112 temperature sensor

113 radio receiver for road-to-vehicle communication

114 radio receiver for vehicle-to-vehicle communication

116 camera

117 GPS receiver

118 VICS receiver

120 functional film.
