Dirt detection system, LiDAR unit, sensing system for vehicle, and vehicle


Note: This invention, "Dirt detection system, LiDAR unit, sensing system for vehicle, and vehicle", was created by 小野田幸央 on 2020-01-20. Abstract: The dirt detection system (6a) is configured to detect dirt adhering to the outer cover of a vehicle lamp. A camera (43a), a LiDAR unit (44a), and a millimeter wave radar (45a) for detecting the surrounding environment of the vehicle are mounted in each vehicle lamp. The dirt detection system (6a) has: a thermal image camera (62a) configured to acquire thermal image data representing the outer cover; a lamp cleaner (63a) configured to remove dirt adhering to the outer cover; and a lamp cleaner control unit (64a) configured to determine whether dirt is attached to the outer cover based on the thermal image data, and to drive the lamp cleaner (63a) in accordance with a determination that dirt is attached to the outer cover.

1. A dirt detection system for detecting dirt adhering to an outer cover of a vehicle lamp equipped with a sensor for detecting the surrounding environment of a vehicle,

the dirt detection system comprising:

a thermal image camera configured to acquire thermal image data representing the outer cover;

a lamp cleaner configured to remove dirt adhering to the outer cover; and

a lamp cleaner control unit configured to determine whether dirt is attached to the outer cover based on the thermal image data, and to drive the lamp cleaner in accordance with a determination that dirt is attached to the outer cover.

2. The dirt detection system according to claim 1, wherein

the thermal image camera is disposed in a space formed by a housing and the outer cover of the vehicle lamp.

3. The dirt detection system according to claim 1 or 2, wherein

the lamp cleaner control unit is configured to determine whether dirt adheres to the outer cover based on the thermal image data when no pedestrian is present within a predetermined range from the vehicle.

4. The dirt detection system according to any one of claims 1 to 3, wherein

the lamp cleaner control unit is configured to:

determine a high temperature region at or above a threshold temperature based on the thermal image data,

determine whether the identified high temperature region is greater than or equal to a predetermined area, and

determine that dirt is attached to the outer cover when the high temperature region is greater than or equal to the predetermined area.

5. The dirt detection system according to claim 4, wherein

the lamp cleaner control unit is configured to determine the threshold temperature in accordance with an outside air temperature outside the vehicle.

6. A vehicle having the dirt detection system of any one of claims 1 to 5.

7. A LiDAR unit comprising:

a 1st light emitting unit configured to emit a 1st laser beam having a 1st peak wavelength;

a 1st light receiving unit configured to receive reflected light of the 1st laser beam and to photoelectrically convert the reflected light of the 1st laser beam;

a 2nd light emitting unit configured to emit a 2nd laser beam having a 2nd peak wavelength different from the 1st peak wavelength;

a 2nd light receiving unit configured to receive reflected light of the 2nd laser beam and to photoelectrically convert the reflected light of the 2nd laser beam;

a 1st generating unit configured to generate 1st point group data based on an emission time of the 1st laser beam and a light reception time of the reflected light of the 1st laser beam; and

a 2nd generating unit configured to generate 2nd point group data based on an emission time of the 2nd laser beam and a light reception time of the reflected light of the 2nd laser beam,

wherein a detection wavelength range of the 1st light receiving unit and a detection wavelength range of the 2nd light receiving unit do not overlap each other.

8. The LiDAR unit according to claim 7, wherein

the emission intensity of the 2nd laser beam is smaller than the emission intensity of the 1st laser beam.

9. A vehicle sensor system configured to detect dirt adhering to an outer cover of a vehicle lamp provided in a vehicle,

the vehicle sensor system comprising:

the LiDAR unit according to claim 7 or 8, configured to be disposed in a space formed by a housing of the vehicle lamp and the outer cover, and to acquire 1st point group data and 2nd point group data indicating a surrounding environment outside the vehicle;

a lamp cleaner configured to remove dirt adhering to the outer cover; and

a lamp cleaner control unit configured to determine whether dirt adheres to the outer cover based on the 2nd point group data, and to drive the lamp cleaner in accordance with a determination that dirt adheres to the outer cover.

10. The vehicle sensor system according to claim 9, wherein

the 2nd point group data shows only a surrounding environment within a predetermined distance from the LiDAR unit, and

when dirt adheres to the outer cover, the lamp cleaner control unit determines a point group indicated by the 2nd point group data to be the dirt adhering to the outer cover.

11. The vehicle sensor system according to claim 9, wherein

the 2nd point group data shows a surrounding environment outside the vehicle, and

when dirt adheres to the outer cover, the lamp cleaner control unit determines a point group indicated by the 2nd point group data that is present within a predetermined distance from the LiDAR unit to be the dirt adhering to the outer cover.

12. A vehicle having the vehicle sensor system according to any one of claims 9 to 11.

Technical Field

The invention relates to a dirt detection system, a LiDAR unit, a vehicle sensor system, and a vehicle. In particular, the present invention relates to a dirt detection system and a vehicle sensor system for detecting dirt adhering to an outer cover of a vehicle lamp provided in a vehicle.

Background

Currently, research on automatic driving techniques for automobiles is being actively conducted in various countries, and legislation that allows a vehicle (hereinafter, "vehicle" means an automobile) to travel on public roads in an automatic driving mode is being studied in various countries. In the automatic driving mode, a vehicle system automatically controls the traveling of the vehicle. Specifically, in the automatic driving mode, the vehicle system automatically performs at least one of steering control (control of the traveling direction of the vehicle), braking control, and acceleration control (control of braking, acceleration, and deceleration of the vehicle) based on information indicating the surrounding environment of the vehicle (surrounding environment information) obtained from sensors such as a camera and a radar (for example, a laser radar or a millimeter wave radar). In the manual driving mode described below, on the other hand, the driver controls the traveling of the vehicle, as in most conventional vehicles. Specifically, in the manual driving mode, the traveling of the vehicle is controlled in accordance with the driver's operations (steering operation, braking operation, and acceleration operation), and the vehicle system does not automatically perform steering control, braking control, or acceleration control. The driving mode of a vehicle is not a concept that exists only in some vehicles but a concept that exists in all vehicles, including conventional vehicles having no automatic driving function, and is classified according to, for example, the vehicle control method.

As described above, it is expected that vehicles traveling in the automatic driving mode (hereinafter appropriately referred to as "automatic driving vehicles") and vehicles traveling in the manual driving mode (hereinafter appropriately referred to as "manual driving vehicles") will travel together on roads in the future.

As an example of an automatic driving technique, patent document 1 discloses an automatic follow-up traveling system in which a following vehicle automatically travels following a preceding vehicle. In this automatic follow-up traveling system, the preceding vehicle and the following vehicle each have an illumination system; text information for preventing another vehicle from cutting in between the preceding vehicle and the following vehicle is displayed on the illumination system of the preceding vehicle, and text information indicating automatic follow-up traveling is displayed on the illumination system of the following vehicle.

Patent document 1: Japanese Laid-Open Patent Publication No. H9-277887

Disclosure of Invention

In addition, in the development of automatic driving techniques, the detection accuracy of the surrounding environment of the vehicle must be improved significantly. In this regard, mounting a plurality of different types of sensors (e.g., a camera, a LiDAR unit, a millimeter wave radar, etc.) on a vehicle has been studied. For example, disposing a plurality of sensors at each of the four corners of the vehicle has been studied. Specifically, mounting a LiDAR unit, a camera, and a millimeter wave radar in each of four vehicle lamps arranged at the four corners of the vehicle has been studied.

A LiDAR unit disposed in a vehicle lamp acquires point group data representing the surrounding environment of the vehicle through a transparent outer cover. Similarly, a camera disposed in the vehicle lamp acquires image data representing the surroundings of the vehicle through the transparent outer cover. Therefore, when dirt (mud, dust, etc.) adheres to the outer cover of the vehicle lamp, the surrounding environment of the vehicle may not be accurately identified based on the point group data of the LiDAR unit and/or the image data of the camera. Thus, when sensors such as a LiDAR unit and a camera are disposed in a vehicle lamp, a method is needed for detecting dirt that adheres to the outer cover and adversely affects the detection accuracy of the sensors.

An object of the present invention is to provide a system capable of suppressing a decrease in the detection accuracy of a sensor disposed in a vehicle lamp.

A dirt detection system according to one aspect of the present invention is configured to detect dirt adhering to an outer cover of a vehicle lamp. A sensor for detecting the surrounding environment of the vehicle is mounted in the vehicle lamp.

The dirt detection system has:

a thermal image camera configured to acquire thermal image data representing the outer cover;

a lamp cleaner configured to remove dirt adhering to the outer cover; and

a lamp cleaner control unit configured to determine whether dirt is attached to the outer cover based on the thermal image data, and to drive the lamp cleaner in accordance with a determination that dirt is attached to the outer cover.

According to the above configuration, after whether dirt is attached to the outer cover is determined based on the thermal image data, the lamp cleaner is driven in accordance with a determination that dirt is attached to the outer cover. In this way, dirt adhering to the outer cover can be detected based on the thermal image data acquired from the thermal image camera. In this regard, dirt such as mud absorbs light emitted from the illumination unit or laser light emitted from the LiDAR unit, so the temperature of the dirt becomes higher than that of the outer cover. Therefore, dirt attached to the outer cover can be detected based on the thermal image data.

Therefore, since dirt adhering to the outer cover can be reliably detected, a decrease in the detection accuracy of a sensor (particularly a LiDAR unit, a camera, or the like) disposed in the space formed by the housing and the outer cover of the vehicle lamp can be suppressed.

In addition, the thermal image camera may be disposed in a space formed by the housing of the vehicle lamp and the outer cover.

According to the above configuration, since the thermal image camera is disposed in the space formed by the housing and the outer cover of the vehicle lamp, whether dirt is attached to the outer cover can be reliably determined based on the thermal image data representing the outer cover.

Further, the lamp cleaner control unit may be configured to determine whether dirt adheres to the outer cover based on the thermal image data when no pedestrian is present within a predetermined range from the vehicle.

According to the above configuration, since the determination process is executed when no pedestrian is present within the predetermined range from the vehicle, a situation in which a pedestrian appears in the thermal image data can be reliably prevented. In this way, a situation in which a heat-radiating pedestrian is determined to be dirt adhering to the outer cover (i.e., false detection of dirt) can be reliably prevented.

In addition, the lamp cleaner control unit may be configured to:

determine a high temperature region at or above a threshold temperature based on the thermal image data,

determine whether the identified high temperature region is greater than or equal to a predetermined area, and

determine that dirt is attached to the outer cover when the high temperature region is greater than or equal to the predetermined area.

According to the above configuration, whether dirt is attached to the outer cover can be reliably determined based on the thermal image data representing the outer cover.
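
The determination steps above map directly onto a simple image-processing routine. The following is a minimal sketch in Python, assuming the thermal image is available as a 2-D NumPy array of temperatures in degrees Celsius; the function name and the pixel-count reading of "predetermined area" are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def dirt_detected(thermal_image: np.ndarray,
                  threshold_temp_c: float,
                  min_area_px: int) -> bool:
    """Decide whether dirt is attached to the outer cover.

    thermal_image: 2-D array of per-pixel temperatures [deg C].
    threshold_temp_c: threshold temperature for a "high temperature region".
    min_area_px: predetermined area, expressed here as a pixel count.
    """
    high_temp_mask = thermal_image >= threshold_temp_c   # high temperature region
    # Dirt is determined to be attached when the high temperature
    # region reaches the predetermined area.
    return int(high_temp_mask.sum()) >= min_area_px
```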

Further, the lamp cleaner control unit may be configured to determine the threshold temperature in accordance with an outside air temperature outside the vehicle.

According to the above configuration, since the threshold temperature is determined in accordance with the outside air temperature, an optimal dirt determination process corresponding to the outside air temperature can be executed. That is, a situation in which dirt adhering to the outer cover goes undetected because of the outside air temperature can be reliably prevented.

A vehicle having the above-described dirt detection system may be provided.

According to the above, a vehicle can be provided in which a decrease in detection accuracy of a sensor disposed in a vehicle lamp can be suppressed.

A LiDAR unit according to an embodiment of the present invention includes:

a 1st light emitting unit configured to emit a 1st laser beam having a 1st peak wavelength;

a 1st light receiving unit configured to receive reflected light of the 1st laser beam and to photoelectrically convert the reflected light of the 1st laser beam;

a 2nd light emitting unit configured to emit a 2nd laser beam having a 2nd peak wavelength different from the 1st peak wavelength;

a 2nd light receiving unit configured to receive reflected light of the 2nd laser beam and to photoelectrically convert the reflected light of the 2nd laser beam;

a 1st generating unit configured to generate 1st point group data based on an emission time of the 1st laser beam and a light reception time of the reflected light of the 1st laser beam; and

a 2nd generating unit configured to generate 2nd point group data based on an emission time of the 2nd laser beam and a light reception time of the reflected light of the 2nd laser beam.

The detection wavelength range of the 1st light receiving unit and the detection wavelength range of the 2nd light receiving unit do not overlap each other.

According to the above configuration, the LiDAR unit can generate the 1st point group data associated with the 1st laser beam and the 2nd point group data associated with the 2nd laser beam. In this way, a LiDAR unit capable of acquiring two different sets of point group data can be provided. For example, the surrounding environment of a vehicle on which the LiDAR unit is mounted can be identified using one of the two sets of point group data (e.g., the 1st point group data), while information other than the surrounding environment of the vehicle (for example, information on dirt adhering to the outer cover) can be identified using the other set (e.g., the 2nd point group data).
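
As a rough illustration of the non-overlap requirement, the sketch below models the two receiver channels in Python. The 905 nm / 1550 nm peak wavelengths and the 300 nm to 1100 nm band of the 1st light receiving unit are taken from embodiment 2 below; the band assumed for the 2nd light receiving unit is an illustrative value chosen only so that the two bands do not overlap.

```python
from dataclasses import dataclass

@dataclass
class ReceiverChannel:
    peak_wavelength_nm: float      # peak wavelength of the associated laser
    band_nm: tuple[float, float]   # detection wavelength range of the receiver

# Channel 1 follows embodiment 2 (905 nm laser, Si photodiode receiver).
ch1 = ReceiverChannel(905.0, (300.0, 1100.0))
# The band for channel 2 is an assumption; only its non-overlap with
# channel 1 is required (1550 nm laser).
ch2 = ReceiverChannel(1550.0, (1400.0, 1700.0))

def bands_disjoint(a: ReceiverChannel, b: ReceiverChannel) -> bool:
    """True when the two detection wavelength ranges do not overlap."""
    return a.band_nm[1] < b.band_nm[0] or b.band_nm[1] < a.band_nm[0]

assert bands_disjoint(ch1, ch2)  # each receiver sees only its own laser
```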

In addition, the emission intensity of the 2nd laser beam may be smaller than the emission intensity of the 1st laser beam.

According to the above configuration, since the emission intensity of the 2nd laser beam is smaller than the emission intensity of the 1st laser beam, the surrounding environment indicated by the 1st point group data and the surrounding environment indicated by the 2nd point group data can be made different from each other. For example, surrounding environment information outside the vehicle can be acquired using the 1st point group data, while information on dirt adhering to the outer cover can be acquired using the 2nd point group data.

A vehicle sensor system according to one aspect of the present invention is configured to detect dirt adhering to an outer cover of a vehicle lamp provided in a vehicle.

The vehicle sensor system includes:

the LiDAR unit configured to be disposed in a space formed by the housing and the outer cover of the vehicle lamp and to acquire 1st point group data and 2nd point group data indicating the surrounding environment outside the vehicle;

a lamp cleaner configured to remove dirt adhering to the outer cover; and

a lamp cleaner control unit configured to determine whether dirt adheres to the outer cover based on the 2nd point group data, and to drive the lamp cleaner in accordance with a determination that dirt adheres to the outer cover.

According to the above configuration, after whether dirt adheres to the outer cover is determined based on the 2nd point group data, the lamp cleaner is driven in accordance with a determination that dirt adheres to the outer cover. In this way, dirt adhering to the outer cover can be detected based on the 2nd point group data, one of the two sets of point group data acquired from the LiDAR unit. In this regard, when dirt such as rain, snow, or mud adheres to the outer cover, a point group representing that dirt appears in the 2nd point group data, so the dirt can be detected based on the point group. Since dirt adhering to the outer cover can thus be reliably detected, a decrease in the detection accuracy of a sensor such as the LiDAR unit disposed in the vehicle lamp can be suppressed.

Additionally, the 2nd point group data may show only the surrounding environment within a predetermined distance from the LiDAR unit. In this case, when dirt adheres to the outer cover, the lamp cleaner control unit may determine a point group indicated by the 2nd point group data to be the dirt adhering to the outer cover.

With the above configuration, a point group indicated by the 2nd point group data is determined to be dirt adhering to the outer cover. Since a point group representing an object existing outside the vehicle does not appear in such 2nd point group data, whether dirt is attached to the outer cover can be determined from the presence or absence of a point group in the 2nd point group data.

In addition, the 2nd point group data may show the surrounding environment outside the vehicle. In this case, when dirt adheres to the outer cover, the lamp cleaner control unit may determine a point group indicated by the 2nd point group data that is present within a predetermined distance from the LiDAR unit to be the dirt adhering to the outer cover.

According to the above configuration, a point group indicated by the 2nd point group data that is present within a predetermined distance from the LiDAR unit is determined to be dirt adhering to the outer cover. In this way, even when the 2nd point group data shows the surrounding environment outside the vehicle, whether dirt is attached to the outer cover can be determined from the presence or absence of a point group within the predetermined distance.
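
Read this way, the dirt test on the 2nd point group data reduces to a range filter. A minimal sketch in Python follows, assuming the point group is an (N, 3) array of x, y, z coordinates in the LiDAR frame; the 0.2 m cutoff stands in for the "predetermined distance", which the text does not quantify.

```python
import numpy as np

def cover_is_dirty(point_group_2: np.ndarray, max_dist_m: float = 0.2) -> bool:
    """Return True if the 2nd point group contains returns close enough
    to the LiDAR unit to be deposits on the outer cover.

    point_group_2: (N, 3) array of x, y, z [m] in the LiDAR frame.
    max_dist_m: predetermined distance (illustrative value).
    """
    if point_group_2.size == 0:
        return False  # no points at all -> nothing on the cover
    distances = np.linalg.norm(point_group_2, axis=1)
    # Points within the predetermined distance are determined to be dirt.
    return bool((distances <= max_dist_m).any())
```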

In addition, a vehicle having the above-described vehicle sensor system may be provided.

According to the above, a vehicle can be provided in which a decrease in detection accuracy of a sensor disposed in a vehicle lamp can be suppressed.

ADVANTAGEOUS EFFECTS OF INVENTION

According to the present invention, it is possible to provide a system capable of suppressing a decrease in detection accuracy of a sensor disposed in a vehicle lamp.

Drawings

Fig. 1 is a schematic diagram of a vehicle including a vehicle system according to embodiment 1 of the present invention.

Fig. 2 is a block diagram showing a vehicle system according to embodiment 1.

Fig. 3 (a) is a block diagram showing the front left sensor system, and (b) is a block diagram showing the front left dirt detection system.

Fig. 4 is a flowchart for explaining a method of detecting dirt adhering to the outer cover.

Fig. 5 is a schematic view of a vehicle equipped with the vehicle system according to embodiment 2.

Fig. 6 is a block diagram showing a vehicle system according to embodiment 2.

Fig. 7 is a block diagram showing the front left sensor system.

Fig. 8 is a block diagram showing the configuration of the LiDAR unit according to embodiment 2.

Fig. 9 is a schematic diagram of the LiDAR unit according to embodiment 2.

Fig. 10 is a flowchart for explaining a method of detecting dirt adhering to the cover according to embodiment 2.

Fig. 11 is a diagram showing the 1st and 2nd laser beams emitted from the LiDAR unit.

Detailed Description

(embodiment 1)

Embodiment 1 of the present invention (hereinafter simply referred to as "the present embodiment") will be described below with reference to the drawings. In the description of the present embodiment, description of components having the same reference numerals as components already described is omitted for convenience. The dimensions of the members shown in the drawings may differ from the actual dimensions of the members for convenience of description.

In the description of the present embodiment, for convenience of description, the terms "left-right direction", "front-back direction", and "up-down direction" may be appropriately used. These directions are relative directions set with respect to the vehicle 1 shown in fig. 1. Here, the "front-rear direction" is a direction including the "front direction" and the "rear direction". The "left-right direction" is a direction including the "left direction" and the "right direction". The "up-down direction" is a direction including an "up direction" and a "down direction". Note that, the up-down direction is not shown in fig. 1, but the up-down direction is a direction perpendicular to the front-back direction and the left-right direction.

First, a vehicle 1 and a vehicle system 2 according to the present embodiment will be described with reference to fig. 1 and 2. Fig. 1 is a schematic diagram showing a plan view of a vehicle 1 having a vehicle system 2. Fig. 2 is a block diagram showing the vehicle system 2.

As shown in fig. 1, the vehicle 1 is a vehicle (automobile) capable of traveling in an automatic driving mode, and includes a vehicle system 2, a left front lamp 7a, a right front lamp 7b, a left rear lamp 7c, and a right rear lamp 7d.

As shown in fig. 1 and 2, the vehicle system 2 includes at least: a vehicle control unit 3, a front left sensor system 4a (hereinafter simply referred to as "sensor system 4a"), a front right sensor system 4b (hereinafter simply referred to as "sensor system 4b"), a rear left sensor system 4c (hereinafter simply referred to as "sensor system 4c"), and a rear right sensor system 4d (hereinafter simply referred to as "sensor system 4d").

In addition, the vehicle system 2 further has a front left dirt detection system 6a (hereinafter simply referred to as "dirt detection system 6a"), a front right dirt detection system 6b (hereinafter simply referred to as "dirt detection system 6b"), a rear left dirt detection system 6c (hereinafter simply referred to as "dirt detection system 6c"), and a rear right dirt detection system 6d (hereinafter simply referred to as "dirt detection system 6d").

The vehicle system 2 includes a sensor 5, an HMI (Human Machine Interface) 8, a GPS (Global Positioning System) 9, a wireless communication unit 10, and a storage device 11. In addition, the vehicle system 2 has a steering actuator 12, a steering device 13, a brake actuator 14, a brake device 15, an acceleration actuator 16, and an acceleration device 17.

The vehicle control unit 3 is configured to control the traveling of the vehicle 1. The vehicle control unit 3 is constituted by, for example, at least one electronic control unit (ECU). The electronic control unit includes a computer system (e.g., an SoC (System on a Chip)) including one or more processors and one or more memories, and an electronic circuit including active elements such as transistors and passive elements. The processor includes at least one of a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a GPU (Graphics Processing Unit), and a TPU (Tensor Processing Unit). The CPU may be composed of a plurality of CPU cores. The GPU may be composed of a plurality of GPU cores. The memory includes a ROM (Read Only Memory) and a RAM (Random Access Memory). The ROM may store a vehicle control program. For example, the vehicle control program may include an artificial intelligence (AI) program for automatic driving. The AI program is a program (trained model) constructed by supervised or unsupervised machine learning (in particular, deep learning) using a multilayer neural network. The RAM may temporarily store the vehicle control program, vehicle control data, and/or surrounding environment information indicating the surrounding environment of the vehicle. The processor may be configured to load a program designated from the various vehicle control programs stored in the ROM onto the RAM and execute various processes in cooperation with the RAM. The computer system may be a non-von Neumann computer such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field-Programmable Gate Array), or may be a combination of a von Neumann computer and a non-von Neumann computer.

The sensor systems 4a to 4d are each configured to detect the surrounding environment of the vehicle 1. In the description of the present embodiment, the sensor systems 4a to 4d have the same components. Therefore, the sensor system 4a will be described below with reference to fig. 3 (a). Fig. 3 (a) is a block diagram showing the sensor system 4a.

As shown in fig. 3 (a), the sensor system 4a includes: a control unit 40a, an illumination unit 42a, a camera 43a, a LiDAR (Light Detection and Ranging) unit 44a (an example of a laser radar), and a millimeter wave radar 45a. The control unit 40a, the illumination unit 42a, the camera 43a, the LiDAR unit 44a, and the millimeter wave radar 45a are disposed in a space Sa formed by the housing 24a and the translucent cover 22a of the left front lamp 7a shown in fig. 1. The control unit 40a may instead be disposed at a predetermined place of the vehicle 1 other than the space Sa. For example, the control unit 40a may be configured integrally with the vehicle control unit 3.

The control unit 40a is configured to control the operations of the illumination unit 42a, the camera 43a, the LiDAR unit 44a, and the millimeter wave radar 45a. In this regard, the control unit 40a functions as an illumination unit control unit 420a, a camera control unit 430a, a LiDAR unit control unit 440a, and a millimeter wave radar control unit 450a. The control unit 40a is constituted by at least one electronic control unit (ECU). The electronic control unit includes a computer system (e.g., an SoC) including one or more processors and one or more memories, and an electronic circuit composed of active elements such as transistors and passive elements. The processor includes at least one of a CPU, an MPU, a GPU, and a TPU. The memory includes a ROM and a RAM. The computer system may be a non-von Neumann computer such as an ASIC or an FPGA.

The illumination unit 42a is configured to emit light toward the outside (front) of the vehicle 1, thereby forming a light distribution pattern. The illumination unit 42a has a light source for emitting light and an optical system. The light source may be constituted by a plurality of light emitting elements arranged in a matrix (for example, N rows × M columns, N > 1, M > 1). The light emitting element is, for example, an LED (Light Emitting Diode), an LD (Laser Diode), or an organic EL element. The optical system may include at least one of a mirror configured to reflect light emitted from the light source toward the front of the illumination unit 42a and a lens configured to refract light emitted directly from the light source or light reflected by the mirror.

The illumination unit control unit 420a is configured to control the illumination unit 42a such that the illumination unit 42a emits a predetermined light distribution pattern toward the front region of the vehicle 1. For example, the illumination unit control unit 420a may change the light distribution pattern emitted from the illumination unit 42a according to the driving mode of the vehicle 1.

The camera 43a is configured to detect the surrounding environment of the vehicle 1. Specifically, the camera 43a is configured to acquire image data indicating the surrounding environment of the vehicle 1 and then transmit the image data to the camera control unit 430a. The camera control unit 430a may determine the surrounding environment information based on the transmitted image data. Here, the surrounding environment information may include information on an object existing outside the vehicle 1. For example, the surrounding environment information may include information on the attributes of an object existing outside the vehicle 1 and information on the distance, direction, and/or position of the object with respect to the vehicle 1. The camera 43a includes an imaging element such as a CCD (Charge-Coupled Device) or a CMOS (Complementary MOS) sensor. The camera 43a may be configured as a monocular camera or as a stereo camera. When the camera 43a is a stereo camera, the control unit 40a can determine, using parallax, the distance between the vehicle 1 and an object (for example, a pedestrian) present outside the vehicle 1 based on two or more sets of image data acquired by the stereo camera.

The LiDAR unit 44a is configured to detect the surrounding environment of the vehicle 1. In particular, the LiDAR unit 44a is configured to acquire point cloud data indicating the surrounding environment of the vehicle 1 and then transmit the point cloud data to the LiDAR unit control unit 440a. The LiDAR unit control unit 440a may determine the surrounding environment information based on the transmitted point cloud data.

More specifically, the LiDAR unit 44a acquires information on the time of flight (TOF) ΔT1 of the laser light at each emission angle (horizontal angle θ, vertical angle φ). Based on the time of flight ΔT1 at each emission angle, the LiDAR unit 44a can acquire information on the distance D between the LiDAR unit 44a and an object present outside the vehicle 1 at that emission angle.
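
The patent does not spell the conversion out, but the standard time-of-flight relation is D = c · ΔT1 / 2, the factor of two accounting for the out-and-back path. A one-line sketch in Python, offered as an assumption on that textbook relation:

```python
C = 299_792_458.0  # speed of light [m/s]

def distance_from_tof(delta_t1_s: float) -> float:
    """Distance D for one emission angle from the round-trip time ΔT1."""
    return C * delta_t1_s / 2.0

# A 200 ns round trip corresponds to an object roughly 30 m away.
assert abs(distance_from_tof(200e-9) - 29.98) < 0.01
```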

In addition, the LiDAR unit 44a has, for example: a light emitting unit configured to emit laser light; an optical deflector configured to scan the laser light in the horizontal and vertical directions; an optical system such as a lens; and a light receiving unit configured to receive the laser light reflected by an object. The peak wavelength of the laser light emitted from the light emitting unit is not particularly limited. For example, the laser light may be invisible light (infrared light) having a peak wavelength of about 900 nm. The light emitting unit is, for example, a laser diode. The optical deflector is, for example, a MEMS (Micro Electro Mechanical Systems) mirror or a polygon mirror. The light receiving unit is, for example, a photodiode. The LiDAR unit 44a may also acquire point cloud data without scanning the laser light with the optical deflector; for example, it may acquire point cloud data by a phased array method or a flash method. The LiDAR unit 44a may also acquire point cloud data by mechanically rotating the light emitting unit and the light receiving unit.

The millimeter wave radar 45a is configured to detect radar data indicating the surrounding environment of the vehicle 1. Specifically, the millimeter wave radar 45a is configured to acquire radar data and then transmit the radar data to the millimeter wave radar control unit 450a. The millimeter wave radar control unit 450a is configured to acquire the surrounding environment information based on the radar data. The surrounding environment information may include information on an object existing outside the vehicle 1, for example, information on the position and direction of the object with respect to the vehicle 1 and information on the relative speed of the object with respect to the vehicle 1.

For example, the millimeter wave radar 45a can acquire the distance and direction between the millimeter wave radar 45a and an object existing outside the vehicle 1 by a pulse modulation method, an FM-CW (Frequency Modulated-Continuous Wave) method, or a dual-frequency CW method. When the pulse modulation method is used, the millimeter wave radar 45a acquires information on the time of flight ΔT2 of the millimeter wave and can then acquire information on the distance D between the millimeter wave radar 45a and an object existing outside the vehicle 1 based on that time of flight ΔT2. The millimeter wave radar 45a can also acquire information on the direction of the object with respect to the vehicle 1 based on the phase difference between the phase of the millimeter wave (received wave) received by one receiving antenna and the phase of the millimeter wave (received wave) received by another receiving antenna adjacent to the one receiving antenna. Further, the millimeter wave radar 45a can acquire information on the relative speed V of the object with respect to the millimeter wave radar 45a based on the frequency f0 of the transmitted wave radiated from the transmitting antenna and the frequency f1 of the received wave received by the receiving antenna.
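
Again, the text names the methods without giving formulas. The sketch below fills them in from textbook radar theory as an assumption: distance from the pulse round trip, as with the LiDAR, and relative speed from the Doppler shift between the transmitted frequency f0 and the received frequency f1.

```python
C = 299_792_458.0  # propagation speed of the millimeter wave [m/s]

def radar_distance(delta_t2_s: float) -> float:
    """Pulse modulation method: distance from the round-trip time ΔT2."""
    return C * delta_t2_s / 2.0

def relative_speed(f0_hz: float, f1_hz: float) -> float:
    """Relative speed V from the Doppler shift, V ≈ c * (f1 - f0) / (2 * f0).
    Positive values mean the object is approaching."""
    return C * (f1_hz - f0_hz) / (2.0 * f0_hz)

# A 5 kHz Doppler shift on a 76.5 GHz carrier is roughly 9.8 m/s.
print(round(relative_speed(76.5e9, 76.5e9 + 5e3), 1))  # -> 9.8
```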

Each of the sensor systems 4b to 4d similarly includes a control unit, an illumination unit, a camera, a LiDAR unit, and a millimeter wave radar. These devices of the sensor system 4b are disposed in a space Sb formed by the housing 24b and the translucent cover 22b of the right front lamp 7b shown in fig. 1. These devices of the sensor system 4c are disposed in a space Sc formed by the housing 24c and the translucent cover 22c of the left rear lamp 7c. These devices of the sensor system 4d are disposed in a space Sd formed by the housing 24d and the translucent cover 22d of the right rear lamp 7d.

Next, the dirt detection systems 6a to 6d will be described. The dirt detection systems 6a to 6d are each configured to detect dirt (e.g., mud, dust, etc.) adhering to the corresponding cover and to remove the detected dirt. Specifically, the dirt detection system 6a is configured to detect dirt adhering to the cover 22a and remove the dirt. Similarly, the dirt detection system 6b is configured to detect dirt adhering to the cover 22b and remove the dirt. The dirt detection system 6c is configured to detect dirt adhering to the cover 22c and remove the dirt. The dirt detection system 6d is configured to detect dirt adhering to the cover 22d and remove the dirt.

The dirt detection systems 6a to 6d have the same components. Therefore, the dirt detection system 6a will be described below with reference to fig. 3 (b). Fig. 3 (b) is a block diagram showing the dirt detection system 6a.

As shown in fig. 3 (b), the dirt detection system 6a includes a thermal image camera 62a, a lamp cleaner 63a, and a lamp cleaner control unit 64a. The thermal image camera 62a is, for example, a thermal imager, and is configured to acquire thermal image data. Heat-generating objects (particularly objects that radiate infrared rays) present around the thermal image camera 62a can be visualized by the thermal image data captured by the thermal image camera 62a. The thermal image camera 62a has an imaging element having light receiving sensitivity to infrared rays (in particular, far infrared rays).

The thermal image camera 62a is disposed in the space Sa (see fig. 1) and is configured to acquire thermal image data representing the cover 22a. In particular, the thermal image camera 62a may be disposed in the vicinity of the LiDAR unit 44a disposed in the space Sa. The thermal image camera 62a may be configured to image the area of the cover 22a through which the laser light emitted from the LiDAR unit 44a passes. In the present embodiment, the thermal image camera 62a may be configured not only to detect dirt adhering to the cover 22a but also to detect heat-radiating objects, such as pedestrians, present around the vehicle 1. In this way, the vehicle control unit 3 can determine that the attribute of an object existing around the vehicle 1 is a person based on the thermal image data transmitted from the thermal image camera 62a.

The lamp cleaner 63a is configured to remove dirt adhering to the cover 22a and is disposed in the vicinity of the cover 22a. The lamp cleaner 63a may be configured to remove dirt adhering to the cover 22a by spraying a cleaning liquid or air toward the cover 22a.

The lamp cleaner control unit 64a is configured to control the thermal image camera 62a and the lamp cleaner 63a. The lamp cleaner control unit 64a is configured to receive the thermal image data from the thermal image camera 62a and determine whether dirt is attached to the cover 22a based on the received thermal image data. The lamp cleaner control unit 64a is further configured to drive the lamp cleaner 63a in response to a determination that dirt adheres to the cover 22a.

The lamp cleaner control unit 64a is constituted by at least one electronic control unit (ECU). The electronic control unit includes a computer system (e.g., an SoC) including one or more processors and one or more memories, and an electronic circuit composed of active elements such as transistors and passive elements. The processor includes at least one of a CPU, an MPU, a GPU, and a TPU. The memory includes a ROM and a RAM. The computer system may be a non-von Neumann computer such as an ASIC or an FPGA.

Returning to fig. 2, the sensor 5 may have an acceleration sensor, a velocity sensor, a gyro sensor, and the like. The sensor 5 is configured to detect a traveling state of the vehicle 1 and output traveling state information indicating the traveling state of the vehicle 1 to the vehicle control unit 3. In addition, the sensor 5 may have an outside air temperature sensor that detects the outside air temperature outside the vehicle 1.

The HMI 8 is constituted by an input unit that receives input operations from the driver and an output unit that outputs travel information and the like to the driver. The input unit includes: a steering wheel, an accelerator pedal, a brake pedal, a driving mode changeover switch that switches the driving mode of the vehicle 1, and the like. The output unit is a display (e.g., a head-up display (HUD)) that displays various types of travel information. The GPS 9 is configured to acquire current position information of the vehicle 1 and output the acquired current position information to the vehicle control unit 3.

The wireless communication unit 10 is configured to receive information on other vehicles around the vehicle 1 from those vehicles and to transmit information on the vehicle 1 to them (vehicle-to-vehicle communication). The wireless communication unit 10 is also configured to receive infrastructure information from infrastructure equipment such as traffic signals and marker lights and to transmit travel information of the vehicle 1 to the infrastructure equipment (road-to-vehicle communication). The wireless communication unit 10 is further configured to receive information on a pedestrian from a portable electronic device (a smartphone, a tablet, a wearable device, etc.) carried by the pedestrian and to transmit travel information of the vehicle 1 to the portable electronic device (pedestrian-to-vehicle communication). The vehicle 1 may communicate directly with other vehicles, infrastructure equipment, or portable electronic devices in a peer-to-peer mode, or may communicate via a communication network such as the internet.

The storage device 11 is an external storage device such as an HDD (Hard Disk Drive) or an SSD (Solid State Drive). The storage device 11 may store 2-dimensional or 3-dimensional map information and/or a vehicle control program. For example, the 3-dimensional map information may be composed of 3D mapping data (point group data). The storage device 11 is configured to output the map information and the vehicle control program to the vehicle control unit 3 in response to a request from the vehicle control unit 3. The map information and the vehicle control program may be updated through the wireless communication unit 10 via a communication network.

When the vehicle 1 travels in the automatic driving mode, the vehicle control unit 3 automatically generates at least one of a steering control signal, an acceleration control signal, and a braking control signal based on the travel state information, the surrounding environment information, the current position information, the map information, and the like. The steering actuator 12 is configured to receive a steering control signal from the vehicle control unit 3 and control the steering device 13 based on the received steering control signal. The brake actuator 14 is configured to receive a brake control signal from the vehicle control unit 3 and control the brake device 15 based on the received brake control signal. The acceleration actuator 16 is configured to receive an acceleration control signal from the vehicle control unit 3 and control the acceleration device 17 based on the received acceleration control signal. As described above, the vehicle control unit 3 automatically controls the travel of the vehicle 1 based on the travel state information, the surrounding environment information, the current position information, the map information, and the like. That is, in the automatic driving mode, the travel of the vehicle 1 is automatically controlled by the vehicle system 2.

On the other hand, when the vehicle 1 travels in the manual driving mode, the vehicle control unit 3 generates a steering control signal, an acceleration control signal, and a braking control signal in accordance with manual operations of an accelerator pedal, a brake pedal, and a steering wheel by the driver. As described above, in the manual driving mode, the steering control signal, the acceleration control signal, and the brake control signal are generated by the manual operation of the driver, and thus the traveling of the vehicle 1 is controlled by the driver.

Next, the driving mode of the vehicle 1 will be explained. The driving mode is composed of an automatic driving mode and a manual driving mode. The automatic driving mode is constituted by a full automatic driving mode, an advanced driving assistance mode, and a driving assistance mode. In the full-automatic driving mode, the vehicle system 2 automatically performs all the travel controls of the steering control, the braking control, and the acceleration control, and the driver is not in a state in which the vehicle 1 can be driven. In the advanced driving assistance mode, the vehicle system 2 automatically performs all the travel control of the steering control, the braking control, and the acceleration control, and the driver does not drive the vehicle 1 although the driver is in a state in which the vehicle 1 can be driven. In the driving assistance mode, the vehicle system 2 automatically performs a part of travel control among steering control, braking control, and acceleration control, and the vehicle 1 is driven by the driver with driving assistance of the vehicle system 2. On the other hand, in the manual driving mode, the vehicle system 2 does not automatically perform the running control, and the vehicle 1 is driven by the driver without the driving assistance of the vehicle system 2.

(description of dirt detection method)

Next, a method of detecting dirt adhering to the cover 22a of the left front lamp 7a will be described with reference to fig. 4. Fig. 4 is a flowchart for explaining the method of detecting dirt adhering to the cover 22a (hereinafter referred to as the "dirt detection method"). In the present embodiment, only the dirt detection process performed by the dirt detection system 6a is described, but note that the dirt detection processes performed by the dirt detection systems 6b to 6d are the same as the dirt detection process performed by the dirt detection system 6a.

As shown in fig. 4, in step S1, the vehicle control unit 3 determines whether an object (particularly, a pedestrian) is present outside the vehicle 1 based on the surrounding environment information transmitted from the sensor systems 4a to 4d. If the determination result of step S1 is YES, this determination process is repeatedly executed until the determination result of step S1 becomes NO. On the other hand, if the determination result of step S1 is NO, the process proceeds to step S2.

Next, in step S2, the lamp cleaner control unit 64a activates the thermal image camera 62a. If the thermal image camera 62a has already been activated, the process of step S2 is skipped. Next, in step S3, the lamp cleaner control unit 64a acquires thermal image data representing the cover 22a from the thermal image camera 62a. In particular, the thermal image data may show the area of the cover 22a through which the laser light emitted from the LiDAR unit 44a passes.

Next, in step S4, the lamp cleaner control unit 64a acquires, from the vehicle control unit 3, information on the outside air temperature outside the vehicle 1 detected by the outside air temperature sensor. The lamp cleaner control unit 64a then determines a threshold temperature corresponding to the outside air temperature. For example, when the outside air temperature is low, the threshold temperature may be set to a low temperature. Conversely, when the outside air temperature is high, the threshold temperature may be set to a high temperature.
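
The patent fixes no concrete mapping from outside air temperature to threshold, only its direction (lower outside temperature, lower threshold). One plausible rule, shown purely as an assumption, is a constant offset above ambient:

```python
def threshold_temperature(outside_air_temp_c: float,
                          offset_c: float = 15.0) -> float:
    """Threshold used in step S5. The dirt heats up above the cover
    temperature, so the threshold tracks the outside air temperature;
    the 15 degC offset is an illustrative value, not from the patent."""
    return outside_air_temp_c + offset_c
```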

Next, the lamp cleaner control unit 64a determines whether there is a high temperature region at or above the threshold temperature based on the thermal image data (step S5). Here, the thermal image data (heat map) shows the temperature distribution of the imaged scene. The lamp cleaner control unit 64a can therefore detect, based on the thermal image data, whether a high temperature region having a temperature at or above the threshold temperature exists on the imaged cover 22a. If the determination result of step S5 is YES, the process proceeds to step S6. On the other hand, if the determination result of step S5 is NO, the lamp cleaner control unit 64a determines in step S7 that dirt is not attached to the cover 22a, and the process ends.

Next, in step S6, the lamp cleaner control unit 64a determines whether the high temperature region present in the thermal image data is greater than or equal to a predetermined area. If the determination result of step S6 is YES, the lamp cleaner control unit 64a determines that dirt is attached to the cover 22a (step S8). On the other hand, if the determination result of step S6 is NO, the lamp cleaner control unit 64a determines that dirt is not attached to the cover 22a, and the process ends. In step S6, it may instead be determined whether the high temperature region is formed of an aggregate of a predetermined number of pixels or more.

Then, in step S9, the lamp cleaner control unit 64a drives the lamp cleaner 63a to remove the dirt adhering to the cover 22a. Specifically, the lamp cleaner control unit 64a drives the lamp cleaner 63a so that a cleaning liquid or air is sprayed from the lamp cleaner 63a toward the cover 22a.

After the process of step S9 is performed, the process returns to step S5. As described above, the processing of steps S5 to S9 is repeatedly executed until it is determined that dirt is not attached to the cover 22 a. Note that the present process may be ended after the process of step S9 is executed.
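
Putting the flow of fig. 4 together, one pass of steps S1 to S9 might look like the following Python sketch. The `vehicle`, `camera`, and `cleaner` objects and all of their method names are hypothetical driver interfaces introduced here for illustration; the pixel-count area test and the threshold offset repeat the assumptions of the earlier sketches.

```python
import time

def dirt_detection_cycle(vehicle, camera, cleaner, min_area_px: int = 50) -> None:
    """One pass of the dirt detection flow of fig. 4 (steps S1-S9)."""
    if vehicle.pedestrian_nearby():                    # S1: an object is present
        return                                         # retry later
    camera.activate()                                  # S2 (no-op if already active)
    while True:
        frame = camera.get_thermal_image()             # S3: thermal image of the cover
        threshold = vehicle.outside_air_temp_c() + 15.0  # S4 (assumed offset)
        high_temp = frame >= threshold                 # S5: high temperature region?
        if not high_temp.any():
            return                                     # S7: no dirt on the cover
        if int(high_temp.sum()) < min_area_px:         # S6: area below the
            return                                     # predetermined size -> no dirt
        cleaner.spray()                                # S8 dirty -> S9 clean the cover
        time.sleep(1.0)                                # let the spray act, then recheck
```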

As described above, according to the present embodiment, whether dirt adheres to the cover 22a is determined based on the thermal image data, and the lamp cleaner 63a is driven in accordance with a determination that dirt adheres to the cover 22a. In this way, dirt adhering to the cover 22a can be detected based on the thermal image data acquired from the thermal image camera 62a. In this regard, dirt such as mud absorbs light emitted from the illumination unit 42a and laser light emitted from the LiDAR unit 44a, so the surface temperature of the dirt becomes higher than the surface temperature of the cover 22a. Therefore, when dirt adheres to the cover 22a, a high temperature region can be detected from the thermal image data, and the dirt adhering to the cover 22a can thus be detected.

Therefore, since dirt adhering to the cover 22a can be reliably detected, a decrease in detection accuracy of the LiDAR unit 44a and the camera 43a disposed in the space Sa formed by the cover 22a and the housing 24a can be suppressed.

In addition, in the present embodiment, since the determination process of step S1 is executed, a situation in which a pedestrian or the like appears in the thermal image data can be reliably prevented. In this way, a situation in which a heat-radiating pedestrian or the like is erroneously determined to be dirt adhering to the cover 22a (i.e., false detection of dirt) can be reliably prevented.

In addition, in the present embodiment, since the threshold temperature is determined in accordance with the outside air temperature outside the vehicle 1 in the process of step S4, a situation in which dirt adhering to the cover 22a goes undetected because of the outside air temperature can be reliably prevented.

(embodiment 2)

Embodiment 2 of the present invention (hereinafter simply referred to as "the present embodiment") will be described below with reference to the drawings. In the description of the present embodiment, description of components having the same reference numerals as components already described in embodiment 1 is omitted for convenience. The dimensions of the members shown in the drawings may differ from the actual dimensions of the members for convenience of description.

In the description of the present embodiment, for convenience of description, the terms "left-right direction", "front-back direction", and "up-down direction" may be appropriately used. These directions are relative directions set with respect to the vehicle 1A shown in fig. 5. Here, the "front-rear direction" is a direction including the "front direction" and the "rear direction". The "left-right direction" is a direction including the "left direction" and the "right direction". The "up-down direction" is a direction including an "up direction" and a "down direction". Note that, the up-down direction is not shown in fig. 5, but the up-down direction is a direction perpendicular to the front-back direction and the left-right direction.

First, a vehicle 1A and a vehicle system 2A according to the present embodiment will be described with reference to fig. 5 and 6. Fig. 5 is a schematic diagram showing a plan view of the vehicle 1A having the vehicle system 2A. Fig. 6 is a block diagram showing the vehicle system 2A.

As shown in fig. 5, the vehicle 1A is a vehicle (automobile) capable of traveling in an automatic driving mode, and includes a vehicle system 2A, a left front lamp 7a, a right front lamp 7b, a left rear lamp 7c, and a right rear lamp 7d.

As shown in fig. 5 and 6, the vehicle system 2A includes at least a vehicle control unit 3, a front left sensor system 104a (hereinafter simply referred to as "sensor system 104a"), a front right sensor system 104b (hereinafter simply referred to as "sensor system 104b"), a rear left sensor system 104c (hereinafter simply referred to as "sensor system 104c"), and a rear right sensor system 104d (hereinafter simply referred to as "sensor system 104d").

The vehicle system 2A includes the sensor 5, HMI 8, GPS 9, wireless communication unit 10, and storage device 11. In addition, the vehicle system 2A has a steering actuator 12, a steering device 13, a brake actuator 14, a brake device 15, an acceleration actuator 16, and an acceleration device 17.

The vehicle control unit 3 is configured to control the traveling of the vehicle 1A. The vehicle control unit 3 is constituted by at least one Electronic Control Unit (ECU), for example.

The sensor systems 104a to 104d are each configured to detect the surrounding environment of the vehicle 1A. In the description of the present embodiment, the sensor systems 104a to 104d have the same components. Therefore, the sensor system 104a will be described below with reference to fig. 7. Fig. 7 is a block diagram showing the sensor system 104a.

As shown in fig. 7, the sensor system 104a has a control unit 140a, an illumination unit 142a, a camera 143a, a LiDAR unit 144a (an example of a laser radar), a millimeter wave radar 145a, and a lamp cleaner 146a. The control unit 140a, the illumination unit 142a, the camera 143a, the LiDAR unit 144a, and the millimeter wave radar 145a are disposed in a space Sa formed by the housing 24a and the translucent cover 22a of the left front lamp 7a shown in fig. 5. The lamp cleaner 146a, on the other hand, is disposed outside the space Sa, in the vicinity of the left front lamp 7a. The control unit 140a may be disposed at a predetermined place of the vehicle 1A other than the space Sa. For example, the control unit 140a may be configured integrally with the vehicle control unit 3.

The control unit 140a is configured to control the operations of the illumination unit 142a, the camera 143a, the LiDAR unit 144a, the millimeter wave radar 145a, and the lamp cleaner 146a. In this regard, the control unit 140a functions as an illumination unit control unit 520a, a camera control unit 530a, a LiDAR unit control unit 540a, a millimeter wave radar control unit 550a, and a lamp cleaner control unit 560a.

The control unit 140a is constituted by at least one Electronic Control Unit (ECU). The electronic control unit includes a computer system (e.g., SoC or the like) including 1 or more processors and 1 or more memories, and an electronic circuit composed of active elements such as transistors and passive elements. The processor includes at least one of a CPU, MPU, GPU and TPU. The memory includes ROM and RAM. The computer system may be a non-von neumann computer such as an ASIC or FPGA.

The illumination unit 142a is configured to emit light toward the outside (forward) of the vehicle 1A, thereby forming a light distribution pattern. The illumination unit 142a has a light source for emitting light and an optical system. The light source may be constituted by a plurality of light emitting elements arranged in a matrix (for example, N rows × M columns, N > 1, M > 1). The light-emitting element is, for example, an LED, an LD, or an organic EL element. The optical system may include at least one of a mirror configured to reflect light emitted from the light source toward the front of the illumination unit 142a and a lens configured to refract light directly emitted from the light source or light reflected by the mirror.

The illumination unit control unit 520a is configured to control the illumination unit 142a such that the illumination unit 142a forms a predetermined light distribution pattern in the front region of the vehicle 1A. For example, the illumination unit control unit 520a may change the light distribution pattern formed by the illumination unit 142a according to the driving mode of the vehicle 1A.

The camera 143a is configured to detect the surrounding environment of the vehicle 1A. Specifically, the camera 143a is configured to acquire image data indicating the surrounding environment of the vehicle 1A and then transmit the image data to the camera control unit 530a. The camera control unit 530a may determine the surrounding environment information based on the transmitted image data. Here, the surrounding environment information may include information relating to an object existing outside the vehicle 1A, for example, information related to the attributes of the object and information related to the distance, direction, and/or position of the object with respect to the vehicle 1A. The camera 143a includes an imaging element such as a CCD or a CMOS (Complementary MOS) sensor. The camera 143a may be configured as a monocular camera or as a stereo camera. When the camera 143a is a stereo camera, the control unit 140a can determine the distance between the vehicle 1A and an object existing outside the vehicle 1A (for example, a pedestrian) based on the parallax between 2 or more sets of image data acquired by the stereo camera, as sketched below.
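For illustration only, the parallax relation used by a rectified stereo pair can be sketched as follows; the focal length, baseline, and disparity values are assumed examples, not disclosed parameters of the camera 143a.

```python
# Illustrative sketch of the parallax relation for a rectified stereo pair:
# D = f * B / d. All numeric values below are assumptions for illustration.

def stereo_distance_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Distance to an object from focal length, baseline, and disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite distance")
    return focal_px * baseline_m / disparity_px

# Example: 1400 px focal length, 0.12 m baseline, 28 px disparity -> 6.0 m
print(stereo_distance_m(1400.0, 0.12, 28.0))
```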

The LiDAR unit 144a is configured to detect the surrounding environment of the vehicle 1A. Specifically, the LiDAR unit 144a is configured to acquire point cloud data indicating the surrounding environment of the vehicle 1A and then transmit the point cloud data to the LiDAR unit control unit 540a. The LiDAR unit control unit 540a may determine the surrounding environment information based on the transmitted point cloud data.

Here, the structure of the LiDAR unit 144a according to the present embodiment will be described with reference to fig. 8. As shown in fig. 8, the LiDAR unit 144a has a plurality of 1 st light emitting parts 75a, a plurality of 1 st light receiving parts 76a, a plurality of 2 nd light emitting parts 77a, a plurality of 2 nd light receiving parts 78a, a motor 79a, and a control unit 70a.

Each of the 1 st light emitting parts 75a includes: a light emitting element configured to emit a 1 st laser beam having a 1 st peak wavelength; and optical components such as lenses. The 1 st peak wavelength is, for example, 905 nm. The light emitting element is, for example, a laser diode that emits an infrared laser beam having a peak wavelength of 905 nm.

Each of the 1 st light receiving parts 76a includes: a light receiving element configured to receive the reflected light of the 1 st laser beam reflected by an object outside the vehicle 1A and to photoelectrically convert the reflected light of the 1 st laser beam; and optical components such as lenses. The light receiving element is, for example, an Si photodiode having light receiving sensitivity to light in a wavelength band of 300 nm to 1100 nm. Accordingly, the detection wavelength range of the 1 st light receiving part 76a is 300 nm to 1100 nm.

Each of the plurality of 2 nd light emitting parts 77a has: a light emitting element configured to emit a 2 nd laser light having a 2 nd peak wavelength; and optical components such as lenses. The 2 nd peak wavelength is 1550nm, for example. The light emitting element is, for example, a laser diode that emits an infrared laser beam having a peak wavelength of 1550 nm.

Each of the plurality of 2 nd light receiving parts 78a includes: a light receiving element configured to receive the reflected light of the 2 nd laser beam reflected by dirt (e.g., rain, snow, mud, dust, etc.) adhering to the cover 22a and to photoelectrically convert the reflected light of the 2 nd laser beam; optical components such as lenses; and a wavelength filter. The light receiving element is, for example, an InGaAs photodiode having light receiving sensitivity to light in a wavelength band of 800 nm to 1700 nm. The wavelength filter is configured to block at least light in a wavelength range of 800 nm to 1200 nm. Accordingly, the detection wavelength range of the 2 nd light receiving part 78a is 1200 nm to 1700 nm. Therefore, in the present embodiment, the detection wavelength range (300 nm to 1100 nm) of the 1 st light receiving part 76a and the detection wavelength range (1200 nm to 1700 nm) of the 2 nd light receiving part 78a do not overlap each other.

As a result, the 1 st light receiving unit 76a can detect the 1 st laser beam but cannot detect the 2 nd laser beam, and the 2 nd light receiving unit 78a can detect the 2 nd laser beam but cannot detect the 1 st laser beam. This prevents either light receiving unit from detecting both the 1 st laser light and the 2 nd laser light, as the following check illustrates.
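A minimal sketch, assuming nothing beyond the detection bands stated above, confirming that each peak wavelength falls inside exactly one receiver's band:

```python
# Illustrative check of the wavelength separation described above, using the
# detection bands stated in the text (1st receiver: 300-1100 nm; 2nd
# receiver, behind its blocking filter: 1200-1700 nm).

BAND_RX1 = (300.0, 1100.0)   # nm, Si photodiode (1st light receiving part 76a)
BAND_RX2 = (1200.0, 1700.0)  # nm, InGaAs photodiode + filter (2nd part 78a)

def detects(band: tuple, wavelength_nm: float) -> bool:
    lo, hi = band
    return lo <= wavelength_nm <= hi

for wavelength, label in [(905.0, "1st laser (905 nm)"), (1550.0, "2nd laser (1550 nm)")]:
    print(label, "-> rx1:", detects(BAND_RX1, wavelength),
          "rx2:", detects(BAND_RX2, wavelength))
# 1st laser (905 nm) -> rx1: True rx2: False
# 2nd laser (1550 nm) -> rx1: False rx2: True
```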

As shown in fig. 9, the LiDAR unit 144a has a housing 340a and a LiDAR unit body portion 343a housed within the housing 340a. The 1 st light emitting parts 75a, the 1 st light receiving parts 76a, the 2 nd light emitting parts 77a, and the 2 nd light receiving parts 78a are housed in the LiDAR unit body portion 343a. For example, the light emitting parts and the light receiving parts may be arranged in a straight line along the rotation axis Ax. In the figure, three of each of the 1 st and 2 nd light emitting parts and the 1 st and 2 nd light receiving parts are shown for convenience of illustration, but the number of light emitting parts and light receiving parts is not particularly limited. For example, the LiDAR unit 144a may have eight each of the 1 st and 2 nd light emitting parts and the 1 st and 2 nd light receiving parts.

The 1 st light emitting parts 75a may be configured to emit the 1 st laser light (light pulses) at the same timing. Furthermore, each of the 1 st light emitting parts 75a may be configured to emit the 1 st laser beam at a different vertical angle φ in the vertical direction. In this case, the angular pitch Δφ between the vertical angle φ of the 1 st laser beam emitted from one 1 st light emitting part 75a and the vertical angle φ of the 1 st laser beam emitted from the adjacent 1 st light emitting part 75a can be set to a predetermined angle.

The 1 st light emitting parts 75a are configured to emit the 1 st laser beam at a plurality of different horizontal angles θ in the horizontal direction. For example, the angular range in the horizontal direction may be 100°, and the angular pitch Δθ in the horizontal direction may be 0.2°. In this case, each of the 1 st light emitting parts 75a is configured to emit the 1 st laser light at an angular pitch of 0.2° in the horizontal direction.

Similarly, the 2 nd light emitting parts 77a may be configured to emit the 2 nd laser light (light pulses) at the same timing. Further, each of the 2 nd light emitting parts 77a may be configured to emit the 2 nd laser beam at a different vertical angle φ in the vertical direction. In this case, the angular pitch Δφ between the vertical angle φ of the 2 nd laser beam emitted from one 2 nd light emitting part 77a and the vertical angle φ of the 2 nd laser beam emitted from the adjacent 2 nd light emitting part 77a can be set to a predetermined angle.

The 2 nd light emitting parts 77a are configured to emit the 2 nd laser light at a plurality of different horizontal angles θ in the horizontal direction. For example, the angular range in the horizontal direction may be 100°, and the angular pitch Δθ in the horizontal direction may be 0.2°. In this case, each of the 2 nd light emitting parts 77a is configured to emit the 2 nd laser light at an angular pitch of 0.2° in the horizontal direction.

The motor 79a is configured to rotationally drive the LiDAR unit body portion 343a about the rotation axis Ax. Through this rotational driving, the 1 st light emitting parts 75a and the 2 nd light emitting parts 77a can emit laser light at a plurality of different horizontal angles θ in the horizontal direction. For example, the angular range in the horizontal direction may be 100°, and the angular pitch Δθ in the horizontal direction may be 0.2°. In this case, each of the 1 st light emitting parts 75a can emit the 1 st laser light at an angular pitch of 0.2° in the horizontal direction, and each of the 2 nd light emitting parts 77a can emit the 2 nd laser light at an angular pitch of 0.2° in the horizontal direction.
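A small sketch of the scan grid implied by these numbers: a 100° range at a 0.2° pitch yields 501 emission angles. Centering the range on 0° is an assumption made purely for illustration.

```python
# Horizontal scan grid from the example values above: 100-degree range,
# 0.2-degree pitch. The centering on 0 degrees is an assumption.

ANGLE_RANGE_DEG = 100.0
ANGLE_PITCH_DEG = 0.2

horizontal_angles = [
    -ANGLE_RANGE_DEG / 2.0 + i * ANGLE_PITCH_DEG
    for i in range(int(round(ANGLE_RANGE_DEG / ANGLE_PITCH_DEG)) + 1)
]
print(len(horizontal_angles))                       # 501 angles per sweep
print(horizontal_angles[0], horizontal_angles[-1])  # -50.0 ... 50.0 (approx.)
```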

The control unit 70a includes a motor control unit 71a, a light emission control unit 72a, a 1 st generating unit 73a, and a 2 nd generating unit 74a. The control unit 70a is constituted by at least one Electronic Control Unit (ECU). The electronic control unit includes a computer system (e.g., an SoC or the like) including one or more processors and one or more memories, and an electronic circuit composed of active elements such as transistors and of passive elements. The processor includes at least one of a CPU, an MPU, a GPU, and a TPU. The memory includes a ROM and a RAM. The computer system may also be a non-von Neumann computer such as an ASIC or an FPGA.

The motor control unit 71a is configured to control driving of the motor 79a. The light emission control unit 72a is configured to control light emission of each of the 1 st light emitting parts 75a and the 2 nd light emitting parts 77a.

The 1 st generating unit 73a is configured to receive a signal corresponding to the reflected light of the 1 st laser beam output from the 1 st light receiving unit 76a, and to determine the light receiving time of the reflected light of the 1 st laser beam based on the received signal. The 1 st generating unit 73a is also configured to determine the emission time of the 1 st laser beam based on the signal output from the light emission control unit 72a.

The 1 st generating unit 73a is configured to acquire, for each emission angle (horizontal angle θ, vertical angle φ) of the 1 st laser beam, information relating to the time of flight (TOF) ΔT1, that is, the time difference between the emission time of the 1 st laser beam and the light receiving time of the reflected light of the 1 st laser beam reflected by an object. The 1 st generating unit 73a is configured to generate 1 st point cloud data indicating the distance D between the LiDAR unit 144a and the object at each emission angle, based on the information on the flight time ΔT1 at each emission angle. The generated 1 st point cloud data is transmitted to the LiDAR unit control unit 540a.
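A minimal sketch of the standard time-of-flight relation underlying this step, D = c · ΔT1 / 2 per emission angle; the angle and flight-time values are made-up examples, not data from the LiDAR unit 144a.

```python
# Time-of-flight to distance: the round trip covers 2 * D, so D = c * dT1 / 2.

C_M_PER_S = 299_792_458.0  # speed of light

def tof_to_distance_m(delta_t1_s: float) -> float:
    """Convert a round-trip flight time to a one-way distance."""
    return C_M_PER_S * delta_t1_s / 2.0

# One entry of the 1st point cloud: (horizontal angle, vertical angle, D).
entry = (10.0, -1.5, tof_to_distance_m(200e-9))
print(entry)  # 200 ns of round-trip time corresponds to roughly 30 m
```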

The 2 nd generating unit 74a is configured to receive a signal corresponding to the reflected light of the 2 nd laser beam output from the 2 nd light receiving unit 78a, and to determine the light receiving time of the reflected light of the 2 nd laser beam based on the received signal. The 2 nd generating unit 74a is also configured to determine the emission time of the 2 nd laser beam based on the signal output from the light emission control unit 72a.

The 2 nd generating unit 74a is configured to acquire, for each emission angle (horizontal angle θ, vertical angle φ) of the 2 nd laser beam, information relating to the time of flight (TOF) ΔT2, that is, the time difference between the emission time of the 2 nd laser beam and the light receiving time of the reflected light of the 2 nd laser beam reflected by an object. The 2 nd generating unit 74a is configured to generate 2 nd point cloud data indicating the distance D between the LiDAR unit 144a and the object at each emission angle, based on the information on the flight time ΔT2 at each emission angle, and to transmit the generated 2 nd point cloud data to the LiDAR unit control unit 540a.

According to this configuration, the LiDAR unit 144a can generate the 1 st point cloud data associated with the 1 st laser light and the 2 nd point cloud data associated with the 2 nd laser light. In this way, a LiDAR unit 144a capable of acquiring 2 different sets of point cloud data can be provided. The surrounding environment information of the vehicle 1A can be acquired using the 1 st point cloud data of the 2 sets. On the other hand, information other than the surrounding environment information of the vehicle 1A (for example, information about dirt adhering to the cover 22a, described later) can be acquired using the 2 nd point cloud data of the 2 sets.

In the present embodiment, the LiDAR unit 144a mechanically drives the light emitting parts and the light receiving parts to acquire the 1 st and 2 nd point cloud data, but the configuration of the LiDAR unit 144a is not limited to this. For example, the LiDAR unit 144a may have an optical deflector configured to scan the 1 st laser light and the 2 nd laser light in the horizontal direction and the vertical direction. The optical deflector is, for example, a MEMS (Micro Electro Mechanical Systems) mirror or a polygon mirror. Alternatively, the LiDAR unit 144a may acquire the 1 st and 2 nd point cloud data by a phased array method or a flash method.

Next, returning to fig. 7, the millimeter wave radar 145a and the lamp cleaner 146a will be described. The millimeter wave radar 145a is configured to detect radar data indicating the surrounding environment of the vehicle 1A. Specifically, the millimeter wave radar 145a is configured to acquire radar data and then transmit the radar data to the millimeter wave radar control unit 550a. The millimeter wave radar control unit 550a is configured to acquire the surrounding environment information based on the radar data. The surrounding environment information may include information related to an object existing outside the vehicle 1A, for example, information relating to the position and direction of the object with respect to the vehicle 1A and information relating to the relative speed of the object with respect to the vehicle 1A.

For example, the millimeter wave radar 145a can acquire the distance and direction from the millimeter wave radar 145a to an object existing outside the vehicle 1A by a pulse modulation method, an FM-CW (Frequency Modulated-Continuous Wave) method, or a dual-frequency CW method. In the case of the pulse modulation method, the millimeter wave radar 145a acquires information on the time of flight ΔT2 of the millimeter wave and can then acquire information on the distance D between the millimeter wave radar 145a and the object existing outside the vehicle 1A based on that information. The millimeter wave radar 145a can also acquire information on the direction of the object with respect to the vehicle 1A based on the phase difference between the phase of the millimeter wave (received wave) received by one receiving antenna and the phase of the millimeter wave (received wave) received by another receiving antenna adjacent to it. In addition, the millimeter wave radar 145a can acquire information relating to the relative speed V of the object with respect to the millimeter wave radar 145a based on the frequency f0 of the transmission wave radiated from the transmission antenna and the frequency f1 of the reception wave received by the reception antenna.
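A hedged sketch of the Doppler relation implied by the f0/f1 description, V = c · (f1 − f0) / (2 · f0); the 76 GHz carrier is an assumed value typical of automotive millimeter wave radar, not a figure stated in the text.

```python
# Relative speed from the Doppler shift between transmit frequency f0 and
# receive frequency f1. All numeric values below are assumptions.

C_M_PER_S = 299_792_458.0

def relative_speed_m_per_s(f0_hz: float, f1_hz: float) -> float:
    """Positive when the received frequency is raised (object approaching)."""
    return C_M_PER_S * (f1_hz - f0_hz) / (2.0 * f0_hz)

f0 = 76.0e9                 # assumed transmit frequency
f1 = f0 + 10_140.0          # assumed Doppler-shifted receive frequency
print(round(relative_speed_m_per_s(f0, f1), 2), "m/s")  # ~20.0 m/s
```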

The lamp cleaner 146a is configured to remove dirt adhering to the cover 22a, and is disposed in the vicinity of the cover 22a (see fig. 11). The lamp cleaner 146a may remove dirt attached to the cover 22a by spraying cleaning liquid or air toward the cover 22a.

The lamp cleaner control unit 560a is configured to control the lamp cleaner 146a. Specifically, the lamp cleaner control unit 560a is configured to determine whether or not dirt (e.g., rain, snow, mud, dust, etc.) adheres to the cover 22a based on the 2 nd point cloud data transmitted from the LiDAR unit control unit 540a, and to drive the lamp cleaner 146a in response to a determination that dirt adheres to the cover 22a.

Each of the sensor systems 104b to 104d similarly includes a control unit, an illumination unit, a camera, a LiDAR unit, a millimeter wave radar, and a lamp cleaner. These devices of the sensor system 104b are disposed in a space Sb formed by the housing 24b and the translucent cover 22b of the right front lamp 7b shown in fig. 5. These devices of the sensor system 104c are disposed in a space Sc formed by the housing 24c and the translucent cover 22c of the left rear lamp 7c. These devices of the sensor system 104d are disposed in a space Sd formed by the housing 24d and the translucent cover 22d of the right rear lamp 7d.

(Dirt detection method according to the present embodiment)

Next, a method of detecting dirt adhering to the cover 22a of the left front lamp 7a will be described below with reference mainly to fig. 10. Fig. 10 is a flowchart for explaining the method of detecting dirt adhering to the cover 22a (hereinafter referred to as the "dirt detection method") according to the present embodiment. In the present embodiment, only the dirt detection process performed by the sensor system 104a is described, but note that the dirt detection processes performed by the sensor systems 104b to 104d are the same as that performed by the sensor system 104a.

As shown in fig. 10, in step S11, the LiDAR unit control unit 540a controls the LiDAR unit 144a in accordance with an instruction from the lamp cleaner control unit 560a such that the 2 nd laser light L2 is emitted outward from the plurality of 2 nd light emitting parts 77a of the LiDAR unit 144a. Here, the LiDAR unit 144a emits the 2 nd laser beam L2 at each emission angle (horizontal angle θ, vertical angle φ). The emission intensity I2 of the 2 nd laser beam L2 emitted from the 2 nd light emitting parts 77a is smaller than the emission intensity I1 of the 1 st laser beam L1 emitted from the 1 st light emitting parts 75a. As shown in fig. 11, the emission intensity I1 of the 1 st laser beam L1 is set to a magnitude such that the reflected light of the 1 st laser beam L1 reflected by an object existing outside the vehicle 1A can be detected by the 1 st light receiving unit 76a. At this intensity, the maximum reaching distance of the 1 st laser light L1 is in the range of several tens of meters to several hundreds of meters. On the other hand, the emission intensity I2 of the 2 nd laser light L2 is set so that the maximum reaching distance of the 2 nd laser light L2 is in the vicinity of the cover 22a. That is, at the emission intensity I2, the reflected light of the 2 nd laser light L2 reflected by an object existing outside the vehicle 1A cannot be detected by the 2 nd light receiving unit 78a, but the reflected light of the 2 nd laser beam L2 reflected by the cover 22a can be detected by the 2 nd light receiving unit 78a.

Next, in step S12, the 2 nd light receiving parts 78a of the LiDAR unit 144a each receive the reflected light of the 2 nd laser light L2 reflected by the cover 22a.

Next, in step S13, the 2 nd generating unit 74a acquires information on the flight time ΔT2, that is, the time difference between the emission time of the 2 nd laser light L2 at each emission angle and the light receiving time of the reflected light of the 2 nd laser light L2 reflected by the cover 22a. Then, the 2 nd generating unit 74a generates 2 nd point cloud data indicating the distance D between the LiDAR unit 144a and the cover 22a at each emission angle, based on the information on the flight time ΔT2 at each emission angle. The generated 2 nd point cloud data is then transmitted to the lamp cleaner control unit 560a via the LiDAR unit control unit 540a.

Next, in step S14, the lamp cleaner control unit 560a determines whether or not a point group satisfying a predetermined condition exists in the 2 nd point cloud data. Here, the predetermined condition is a condition relating to dirt adhering to the cover 22a. When dirt such as mud adheres to the cover 22a, the 2 nd laser beam L2 reflected by the dirt is detected by the 2 nd light receiving unit 78a, so the dirt attached to the cover 22a appears as a point group in the 2 nd point cloud data. On the other hand, since the cover 22a is translucent, when no dirt is present on the cover 22a, the 2 nd point cloud data does not include a point group consisting of a predetermined number or more of points.

As described above, when dirt adheres to the cover 22a, a point group caused by the dirt appears in the 2 nd point cloud data. Therefore, for example, when a point group consisting of a predetermined number or more of points exists in the 2 nd point cloud data, it is determined that a point group related to dirt exists in the 2 nd point cloud data, as the sketch below illustrates.
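A sketch of the step S14 criterion as described; the threshold of 5 points stands in for the "predetermined number of points", which the text does not specify.

```python
# Dirt is inferred when the 2nd point cloud contains a point group of at
# least a predetermined number of points. The threshold is a placeholder.

MIN_DIRT_POINTS = 5  # hypothetical "predetermined number of points"

def dirt_point_group_exists(point_cloud_2nd: list) -> bool:
    """Each entry: (horizontal angle, vertical angle, distance D)."""
    return len(point_cloud_2nd) >= MIN_DIRT_POINTS

clean_cover = []  # a clean translucent cover returns no points
muddy_cover = [(0.2 * i, 0.0, 0.05) for i in range(12)]
print(dirt_point_group_exists(clean_cover))  # False -> step S15
print(dirt_point_group_exists(muddy_cover))  # True  -> step S16, then S17
```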

If the determination result in step S14 is YES, the lamp cleaner control unit 560a determines that dirt is attached to the cover 22a (step S16). On the other hand, if the determination result in step S14 is NO, the lamp cleaner control unit 560a determines that dirt is not attached to the cover 22a (step S15).

Then, in step S17, the lamp cleaner control unit 560a drives the lamp cleaner 146a to remove the dirt adhering to the cover 22a. Specifically, the lamp cleaner control unit 560a drives the lamp cleaner 146a so that cleaning liquid or air is ejected from the lamp cleaner 146a toward the cover 22a.

After the dirt removal process for the cover 22a is performed by the lamp cleaner 146a (after the process of step S17 is executed), the process returns to step S11. In this way, the processes of steps S11 to S17 are repeatedly executed until it is determined that dirt is not attached to the cover 22a. Alternatively, the process may be ended after the process of step S17 is executed.
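For orientation, a compact sketch of the S11 to S17 loop as described, under the assumption that cleaning reduces the number of near-range returns; acquire_2nd_point_cloud() is a hypothetical stand-in for steps S11 to S13, and the threshold is a placeholder.

```python
# End-to-end sketch of the dirt detection loop: acquire the 2nd point cloud,
# decide (S14-S16), clean (S17), and repeat until the cover is judged clean.

import random

MIN_DIRT_POINTS = 5  # hypothetical "predetermined number of points"

def acquire_2nd_point_cloud(dirt_level: int) -> list:
    """Stand-in for S11-S13: returns simulated near-range returns."""
    return [(random.uniform(-50.0, 50.0), 0.0, 0.05) for _ in range(dirt_level)]

dirt_level = 12  # simulated initial amount of dirt on the cover
while True:
    cloud = acquire_2nd_point_cloud(dirt_level)   # S11-S13
    if len(cloud) < MIN_DIRT_POINTS:              # S14 -> S15
        print("cover judged clean; loop ends")
        break
    print("dirt detected; driving lamp cleaner")  # S16, S17
    dirt_level //= 2  # cleaning removes dirt in this simulation
```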

As described above, according to the present embodiment, whether or not dirt adheres to the cover 22a is determined based on the 2 nd point cloud data, and the lamp cleaner 146a is driven in accordance with the determination that dirt adheres to the cover 22a. In other words, dirt adhering to the cover 22a can be detected based on the 2 nd point cloud data among the 2 sets of point cloud data acquired from the LiDAR unit 144a. When dirt such as rain, snow, dust, or mud adheres to the cover 22a, a point group indicating that dirt appears in the 2 nd point cloud data, so the dirt can be detected based on that point group. Since dirt adhering to the cover 22a can thus be detected with high accuracy, a decrease in the detection accuracy of sensors such as the LiDAR unit 144a disposed in the left front lamp 7a can be suppressed.

In addition, according to the present embodiment, since the emission intensity I2 of the 2 nd laser light L2 is small, the 2 nd point cloud data shows only the surroundings within a predetermined distance from the LiDAR unit 144a; specifically, it shows the cover 22a present within that distance. Because the 2 nd point cloud data therefore contains no point group indicating an object existing outside the vehicle 1A, whether or not dirt is attached to the cover 22a can be determined simply from the presence or absence of a point group in the 2 nd point cloud data.

In the present embodiment, the emission intensity I2 of the 2 nd laser beam is set so that the maximum reaching distance of the 2 nd laser light L2 is in the vicinity of the cover 22a, but the present invention is not limited to this. For example, the emission intensity I2 of the 2 nd laser light may be greater than or equal to the emission intensity I1 of the 1 st laser light. In this case, like the 1 st point cloud data, the 2 nd point cloud data indicates objects outside the vehicle 1A. In the processing of step S14, the lamp cleaner control unit 560a may then determine whether or not a point group satisfying the predetermined condition exists within a predetermined distance from the LiDAR unit 144a in the 2 nd point cloud data; the cover 22a is located within this predetermined distance. The lamp cleaner control unit 560a may determine that dirt is attached to the cover 22a when a point group satisfying the predetermined condition exists within the predetermined distance, and may determine that dirt is not attached to the cover 22a when no such point group exists within the predetermined distance. In this way, even when the 2 nd point cloud data shows the surrounding environment outside the vehicle 1A, whether or not dirt adheres to the cover 22a can be determined, as the sketch below illustrates.
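A sketch of this distance-gated variant; the 0.1 m gate and the point threshold are placeholder assumptions, since the text only states that the cover lies within a "predetermined distance".

```python
# Variant of step S14 for a full-intensity 2nd laser: only points closer
# than a predetermined distance (within which the cover sits) count toward
# the dirt decision. Both constants below are assumed values.

COVER_GATE_M = 0.1      # hypothetical predetermined distance to the cover
MIN_DIRT_POINTS = 5     # hypothetical predetermined number of points

def dirt_detected_gated(point_cloud_2nd: list) -> bool:
    """Each entry: (horizontal angle, vertical angle, distance D)."""
    near_points = [p for p in point_cloud_2nd if p[2] <= COVER_GATE_M]
    return len(near_points) >= MIN_DIRT_POINTS

# Mixed scene: distant objects plus returns from mud on the cover.
scene = [(5.0, 0.0, 23.7), (5.2, 0.0, 23.6)] + [(i * 0.2, 0.0, 0.06) for i in range(8)]
print(dirt_detected_gated(scene))  # True: 8 near points exceed the threshold
```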

While the embodiments of the present invention have been described above, the technical scope of the present invention should of course not be construed as being limited to the description of these embodiments. The embodiments are merely examples, and it will be understood by those skilled in the art that various modifications of the embodiments can be made within the scope of the invention described in the claims. The technical scope of the present invention should be determined based on the scope of the invention described in the claims and the scope of its equivalents.

The present application appropriately incorporates by reference the contents disclosed in Japanese Patent Application No. 2019-026549 filed on February 18, 2019, and in Japanese Patent Application No. 2019-026550 filed on February 18, 2019.
