Vehicle sensing system and vehicle

Document No.: 53971  Publication date: 2021-09-28

Note: This technology, "Vehicle sensing system and vehicle", was designed and created by Yusuke Totsuka, Yuta Maruyama, and Takanori Namba on 2020-01-20. Abstract: The sensor system (4a) is configured to detect dirt adhering to a cover of a vehicle lamp provided in a vehicle. The sensor system (4a) includes: a LiDAR unit (44a) configured to be disposed in a space formed by a housing and a cover of the vehicle lamp and to acquire point group data representing the surrounding environment of the vehicle; a lamp cleaner (46a) configured to remove dirt adhering to the cover; and a lamp cleaner control unit (460a) configured to acquire reflected light intensity information relating to the intensities of a plurality of reflected lights reflected by a road surface after being emitted from the LiDAR unit (44a), determine whether or not dirt adheres to the cover based on the acquired reflected light intensity information, and drive the lamp cleaner (46a) in accordance with the determination that dirt adheres to the cover.

1. A vehicle sensor system configured to detect dirt adhering to a cover of a vehicle lamp provided in a vehicle,

the vehicle sensor system includes:

a LiDAR unit configured to be disposed in a space formed by a housing and a cover of the vehicle lamp and to acquire point group data indicating an environment surrounding the vehicle;

a lamp cleaner configured to remove dirt adhering to the cover; and

a lamp cleaner control unit configured to acquire reflected light intensity information relating to intensities of a plurality of reflected lights reflected by a road surface after being emitted from the LiDAR unit, determine whether or not dirt adheres to the cover based on the acquired reflected light intensity information, and drive the lamp cleaner in accordance with the determination that the dirt adheres to the cover.

2. The vehicular sensor system according to claim 1, wherein,

the lamp cleaner control unit is configured to determine whether or not dirt adheres to the cover based on a comparison between the acquired reflected light intensity information and a predetermined threshold value.

3. The vehicular sensor system according to claim 2, wherein,

the lamp cleaner control unit is configured to determine whether or not dirt adheres to the cover based on a comparison between the intensity of each of the plurality of reflected lights and the predetermined threshold value.

4. The vehicular sensor system according to claim 2, wherein,

the lamp cleaner control unit is configured to determine whether or not dirt adheres to the cover based on a comparison between an average value or a median value of the intensities of the plurality of reflected lights and the predetermined threshold value.

5. The vehicular sensor system according to any one of claims 2 to 4,

the predetermined threshold value is associated with the intensity of reflected light from the road surface measured when dirt is not attached to the cover.

6. The vehicular sensor system according to claim 1, wherein,

the lamp cleaner control unit is configured to acquire and store the reflected light intensity information when the vehicle is parked,

the lamp cleaner control unit is configured to determine whether or not dirt adheres to the cover based on a comparison between the newly acquired reflected light intensity information and the stored reflected light intensity information.

7. The vehicular sensor system according to any one of claims 1 to 6,

the lamp cleaner control unit is configured to determine whether or not dirt adheres to the cover based on the acquired reflected light intensity information when the road surface is dry.

8. A vehicle having the vehicular sensor system according to any one of claims 1 to 7.

Technical Field

The invention relates to a vehicle sensor system and a vehicle.

Background

Currently, research on automatic driving technology for automobiles is being actively conducted in various countries, and legislation that would allow a vehicle (hereinafter, "vehicle" means an automobile) to travel on public roads in an automatic driving mode is also under study in various countries. In the automatic driving mode, the vehicle system automatically controls the traveling of the vehicle. Specifically, the vehicle system automatically performs at least one of steering control (control of the traveling direction of the vehicle), brake control (braking of the vehicle), and accelerator control (acceleration and deceleration of the vehicle) based on information indicating the surrounding environment of the vehicle (surrounding environment information) obtained from sensors such as a camera and a radar (for example, a laser radar or a millimeter wave radar). In the manual driving mode described below, by contrast, the driver controls the traveling of the vehicle, as in most conventional vehicles. Specifically, the traveling of the vehicle is controlled in accordance with the driver's operations (steering operation, brake operation, and accelerator operation), and the vehicle system does not automatically perform steering control, brake control, or accelerator control. Note that the driving mode is not a concept that exists only in some vehicles, but a concept that exists in all vehicles, including conventional vehicles without an automatic driving function, and is classified according to, for example, the vehicle control method.

As described above, it is expected that vehicles traveling on roads in the automatic driving mode (hereinafter referred to as "automatic driving vehicles" as appropriate) and vehicles traveling in the manual driving mode (hereinafter referred to as "manual driving vehicles" as appropriate) will coexist in the future.

As an example of the automatic driving technique, patent document 1 discloses an automatic follow-up running system in which a following vehicle automatically follows a preceding vehicle. In this automatic follow-up running system, each of the preceding vehicle and the following vehicle has an illumination system; text information for preventing another vehicle from cutting in between the preceding vehicle and the following vehicle is displayed on the illumination system of the preceding vehicle, and text information indicating that the vehicle is in automatic follow-up running is displayed on the illumination system of the following vehicle.

Patent document 1: Japanese Laid-Open Patent Publication No. 9-277887

Disclosure of Invention

In the development of automatic driving technology, it is necessary to significantly improve the accuracy with which the surrounding environment of the vehicle is detected. In this regard, mounting a plurality of different types of sensors (e.g., a camera, a LiDAR unit, a millimeter wave radar, etc.) on a vehicle is being studied. For example, it has been studied to dispose a plurality of sensors at each of the 4 corners of the vehicle; specifically, to mount a LiDAR unit, a camera, and a millimeter wave radar in each of 4 vehicle lamps arranged at the 4 corners of the vehicle.

A LiDAR unit disposed in a vehicle lamp acquires point group data representing the surrounding environment of the vehicle through a transparent cover. Similarly, a camera disposed in the vehicle lamp acquires image data representing the surroundings of the vehicle through the transparent cover. Therefore, when dirt (rain, snow, mud, etc.) adheres to the cover of the vehicle lamp, there is a possibility that the surrounding environment of the vehicle cannot be accurately determined based on the point group data of the LiDAR unit and/or the image data of the camera. Accordingly, when sensors such as a LiDAR unit and a camera are disposed in a vehicle lamp, it is necessary to study a method for detecting dirt that adheres to the cover and adversely affects the detection accuracy of the sensors.

An object of the present invention is to provide a vehicle sensor system and a vehicle capable of suppressing a decrease in the detection accuracy of a sensor disposed in a vehicle lamp.

A vehicle sensor system according to an aspect of the present invention is configured to detect dirt adhering to a cover of a vehicle lamp provided in a vehicle.

The vehicle sensor system includes:

a LiDAR unit configured to be disposed in a space formed by a housing and a cover of the vehicle lamp and to acquire point group data indicating an environment surrounding the vehicle;

a lamp cleaner configured to remove dirt adhering to the cover; and

a lamp cleaner control unit configured to acquire reflected light intensity information relating to intensities of a plurality of reflected lights reflected by a road surface after being emitted from the LiDAR unit, determine whether or not dirt adheres to the cover based on the acquired reflected light intensity information, and drive the lamp cleaner in accordance with the determination that the dirt adheres to the cover.

According to the above configuration, whether or not dirt adheres to the cover is determined based on the reflected light intensity information, and the lamp cleaner is driven in accordance with the determination that dirt adheres to the cover. When dirt such as rain, snow, or mud adheres to the cover, the dirt reduces the intensity of the reflected light; therefore, dirt adhering to the cover can be detected based on the reflected light intensity information.

Therefore, dirt adhering to the cover can be reliably detected, and a decrease in the detection accuracy of a sensor such as the LiDAR unit disposed in the vehicle lamp can be suppressed.

The lamp cleaner control unit may be configured to determine whether or not dirt adheres to the cover based on a comparison between the acquired reflected light intensity information and a predetermined threshold value.

According to the above configuration, dirt adhering to the cover can be detected based on a comparison between the acquired reflected light intensity information and a predetermined threshold value.

Further, the lamp cleaner control unit may be configured to determine whether or not dirt adheres to the cover based on a comparison between the intensity of each of the plurality of reflected lights and the predetermined threshold value.

According to the above configuration, dirt adhering to the cover can be detected based on comparison between the intensities of the plurality of reflected lights and a predetermined threshold value.

The lamp cleaner control unit may be configured to determine whether or not dirt adheres to the cover based on a comparison between an average value or a median value of the intensities of the plurality of reflected lights and the predetermined threshold value.

According to the above configuration, dirt adhering to the cover can be detected based on a comparison between an average value or a median value of the intensities of the plurality of reflected lights and a predetermined threshold value.
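The beam-wise and aggregate comparison schemes described above can be sketched in Python as follows. The threshold I_TH, the weak-beam ratio, the example intensities, and all function names are illustrative assumptions; the text does not specify concrete values.

```python
from statistics import mean, median

I_TH = 0.40        # assumed threshold derived from a clean-cover measurement
WEAK_RATIO = 0.5   # assumed fraction of weak beams that indicates dirt

def dirt_detected_per_beam(intensities):
    """Per-beam scheme: compare each reflected light's intensity with the threshold."""
    weak = sum(1 for i in intensities if i < I_TH)
    return weak / len(intensities) >= WEAK_RATIO

def dirt_detected_aggregate(intensities, use_median=False):
    """Aggregate scheme: compare the average (or median) intensity with the threshold."""
    stat = median(intensities) if use_median else mean(intensities)
    return stat < I_TH

clean = [0.8, 0.9, 0.7, 0.85]   # assumed intensities with a clean cover
dirty = [0.2, 0.3, 0.1, 0.9]    # assumed intensities with a partly dirty cover
```

The aggregate form is cheaper but can mask localized dirt that attenuates only a few beams, which is why the per-beam comparison is presented as a separate variant.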

The predetermined threshold value may be related to the intensity of reflected light from the road surface measured when dirt is not attached to the cover.

According to the above configuration, since the predetermined threshold value is associated with the intensity of the reflected light from the road surface measured when dirt is not attached to the cover, the dirt attached to the cover can be detected based on the comparison between the acquired reflected light intensity information and the predetermined threshold value.

Further, the lamp cleaner control unit may be configured to acquire and store the reflected light intensity information when the vehicle is parked.

The lamp cleaner control unit may be configured to determine whether or not dirt adheres to the cover based on a comparison between the newly acquired reflected light intensity information and the stored reflected light intensity information.

According to the above configuration, dirt adhering to the cover can be detected based on comparison between the newly acquired reflected light intensity information and the reflected light intensity information acquired when the vehicle was parked last time.
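A minimal sketch of this baseline comparison, assuming a stored list of per-beam reference intensities and an illustrative 30 % drop criterion (neither of which is specified in the text):

```python
# Compare freshly measured beam intensities with a reference stored at the
# previous parking event. The drop ratio and degraded fraction are assumptions.

DROP_RATIO = 0.7        # a beam counts as degraded below 70 % of its reference
DEGRADED_FRACTION = 0.5  # judge dirt if at least half of the beams degraded

def dirt_detected(current, reference):
    """Judge dirt by how many beams fell well below their stored reference."""
    degraded = sum(1 for i_n, i_ref in zip(current, reference)
                   if i_n < DROP_RATIO * i_ref)
    return degraded / len(current) >= DEGRADED_FRACTION

stored = [0.8, 0.9, 0.7, 0.85]    # reference acquired while parked, cover clean
now_dirty = [0.3, 0.4, 0.2, 0.8]  # assumed current measurement
```

Comparing against a stored baseline rather than a fixed threshold compensates for differences in road surface reflectivity between locations.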

Further, the lamp cleaner control unit may be configured to determine whether or not dirt adheres to the cover based on the acquired reflected light intensity information when the road surface is dry.

In addition, according to another aspect of the present invention, a vehicle having the vehicle sensor system is provided.

According to the above configuration, it is possible to provide a vehicle in which a decrease in the detection accuracy of a sensor disposed in a vehicle lamp is suppressed.

Advantageous Effects of Invention

According to the present invention, it is possible to provide a vehicle sensor system and a vehicle that can suppress a decrease in detection accuracy of a sensor disposed in a vehicle lamp.

Drawings

Fig. 1 is a schematic diagram of a vehicle including a vehicle system according to an embodiment of the present invention (hereinafter, referred to as the present embodiment).

Fig. 2 is a block diagram showing a vehicle system according to the present embodiment.

Fig. 3 is a block diagram showing the front left sensor system.

Fig. 4 is a flowchart for explaining a method of detecting dirt adhering to the cover according to embodiment 1.

Fig. 5 is a diagram illustrating laser light emitted from the LiDAR unit at each of a plurality of vertical angles.

Fig. 6 is a table showing an example of the results of comparison between the intensity I_n of the n-th reflected light and the threshold value I_th.

Fig. 7 is a flowchart for explaining a series of processing for acquiring reflected light intensity information when the vehicle is parked.

Fig. 8 is a flowchart for explaining a method of detecting dirt adhering to the cover according to embodiment 2.

Fig. 9 is a table showing an example of the results of comparison between the intensity I_n of the n-th reflected light measured this time and the intensity I_ref_n of the n-th reflected light measured previously.

Detailed Description

Embodiments of the present invention (hereinafter simply referred to as "the present embodiments") will be described below with reference to the drawings. For convenience of description, the description of components having the same reference numerals as components already described is omitted. Also for convenience of description, the dimensions of the members shown in the drawings may differ from their actual dimensions.

In the description of the present embodiment, for convenience, the terms "left-right direction", "front-rear direction", and "up-down direction" may be used as appropriate. These directions are relative directions set with respect to the vehicle 1 shown in fig. 1. Here, the "front-rear direction" includes the "front direction" and the "rear direction". The "left-right direction" includes the "left direction" and the "right direction". The "up-down direction" includes the "up direction" and the "down direction". Although the up-down direction is not shown in fig. 1, it is perpendicular to the front-rear direction and the left-right direction.

First, a vehicle 1 and a vehicle system 2 according to the present embodiment will be described with reference to fig. 1 and 2. Fig. 1 is a schematic diagram showing a plan view of a vehicle 1 having a vehicle system 2. Fig. 2 is a block diagram showing the vehicle system 2.

As shown in fig. 1, the vehicle 1 is a vehicle (automobile) capable of traveling in an automatic driving mode, and includes a vehicle system 2, a left front lamp 7a, a right front lamp 7b, a left rear lamp 7c, and a right rear lamp 7d.

As shown in fig. 1 and 2, the vehicle system 2 includes at least a vehicle control unit 3, a front left sensor system 4a (hereinafter, simply referred to as "sensor system 4a"), a front right sensor system 4b (hereinafter, simply referred to as "sensor system 4b"), a rear left sensor system 4c (hereinafter, simply referred to as "sensor system 4c"), and a rear right sensor system 4d (hereinafter, simply referred to as "sensor system 4d").

The vehicle system 2 also includes a sensor 5, an HMI (Human Machine Interface) 8, a GPS (Global Positioning System) 9, a wireless communication unit 10, and a storage device 11. In addition, the vehicle system 2 has a steering actuator 12, a steering device 13, a brake actuator 14, a brake device 15, an acceleration actuator 16, and an acceleration device 17.

The vehicle control unit 3 is configured to control the traveling of the vehicle 1. The vehicle control unit 3 is constituted by, for example, at least one Electronic Control Unit (ECU). The electronic control unit includes a computer system (e.g., an SoC (System on a Chip)) including one or more processors and one or more memories, and an electronic circuit including active elements such as transistors and passive elements. The processor includes at least one of a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a GPU (Graphics Processing Unit), and a TPU (Tensor Processing Unit). The CPU may be composed of a plurality of CPU cores. The GPU may be composed of a plurality of GPU cores. The memory includes a ROM (Read Only Memory) and a RAM (Random Access Memory). The ROM may store a vehicle control program. For example, the vehicle control program may include an artificial intelligence (AI) program for automatic driving. The AI program is a program (trained model) constructed by supervised or unsupervised machine learning (in particular, deep learning) using a multilayer neural network. The RAM may temporarily store the vehicle control program, vehicle control data, and/or surrounding environment information indicating the surrounding environment of the vehicle. The processor may be configured to load a program selected from the vehicle control programs stored in the ROM into the RAM and execute various processes in cooperation with the RAM. The computer system may be a non-von Neumann computer such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field-Programmable Gate Array), or a combination of a von Neumann computer and a non-von Neumann computer.

The sensor systems 4a to 4d are each configured to detect the surrounding environment of the vehicle 1. In the present embodiment, the sensor systems 4a to 4d have the same configuration; therefore, the sensor system 4a will be described below with reference to fig. 3. Fig. 3 is a block diagram showing the sensor system 4a.

As shown in fig. 3, the sensor system 4a has a control unit 40a, an illumination unit 42a, a camera 43a, a LiDAR (Light Detection and Ranging) unit 44a (an example of a laser radar), a millimeter wave radar 45a, and a lamp cleaner 46a. The control unit 40a, the illumination unit 42a, the camera 43a, the LiDAR unit 44a, and the millimeter wave radar 45a are disposed in a space Sa formed by the housing 24a and the translucent cover 22a of the left front lamp 7a shown in fig. 1. On the other hand, the lamp cleaner 46a is disposed outside the space Sa, in the vicinity of the left front lamp 7a. The control unit 40a may be disposed at a predetermined place of the vehicle 1 other than the space Sa; for example, the control unit 40a may be configured integrally with the vehicle control unit 3.

The control unit 40a is configured to control the operations of the illumination unit 42a, the camera 43a, the LiDAR unit 44a, the millimeter wave radar 45a, and the lamp cleaner 46a, respectively. In this regard, the control unit 40a functions as an illumination unit control unit 420a, a camera control unit 430a, a LiDAR unit control unit 440a, a millimeter wave radar control unit 450a, and a lamp cleaner control unit 460a.

The control unit 40a is constituted by at least one Electronic Control Unit (ECU). The electronic control unit includes a computer system (e.g., an SoC) including one or more processors and one or more memories, and an electronic circuit composed of active elements such as transistors and passive elements. The processor includes at least one of a CPU, an MPU, a GPU, and a TPU. The memory includes a ROM and a RAM. The computer system may be a non-von Neumann computer such as an ASIC or an FPGA.

The illumination unit 42a is configured to emit light toward the outside (front) of the vehicle 1, thereby forming a light distribution pattern. The illumination unit 42a has a light source that emits light and an optical system. The light source may be constituted by a plurality of light emitting elements arranged in a matrix (for example, N rows × M columns, N > 1, M > 1). The light emitting element is, for example, an LED (Light Emitting Diode), an LD (Laser Diode), or an organic EL element. The optical system may include at least one of a mirror configured to reflect light emitted from the light source toward the front of the illumination unit 42a and a lens configured to refract light emitted directly from the light source or light reflected by the mirror.

The illumination unit control unit 420a is configured to control the illumination unit 42a such that the illumination unit 42a forms a predetermined light distribution pattern in the front region of the vehicle 1. For example, the illumination unit control unit 420a may change the light distribution pattern emitted from the illumination unit 42a according to the driving mode of the vehicle 1.

The camera 43a is configured to detect the surrounding environment of the vehicle 1. Specifically, the camera 43a is configured to acquire image data indicating the surrounding environment of the vehicle 1 and then transmit the image data to the camera control unit 430a. The camera control unit 430a may determine the surrounding environment information based on the transmitted image data. Here, the surrounding environment information may include information on an object existing outside the vehicle 1. For example, the surrounding environment information may include information on the attribute of the object and information on the distance, direction, and/or position of the object with respect to the vehicle 1. The camera 43a includes an imaging element such as a CCD (Charge-Coupled Device) or a CMOS (Complementary MOS) sensor. The camera 43a may be configured as a monocular camera or a stereo camera. When the camera 43a is a stereo camera, the control unit 40a can determine, using parallax, the distance between the vehicle 1 and an object (for example, a pedestrian) existing outside the vehicle 1 based on two or more sets of image data acquired by the stereo camera.
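The parallax-based ranging mentioned above can be illustrated with the standard rectified-stereo relation Z = f·B/d. The focal length, baseline, and disparity values below are assumptions for illustration, not parameters from this document.

```python
# Depth from disparity for a rectified stereo pair (illustrative sketch).
# focal_px: focal length in pixels; baseline_m: distance between the two
# camera centers in meters; disparity_px: pixel offset of the same object
# between the left and right images. All example values are assumed.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Z = f * B / d: depth grows as disparity shrinks."""
    return focal_px * baseline_m / disparity_px

# An object with 15 px of disparity, seen through a 1000 px focal length and
# 0.3 m baseline, lies at 1000 * 0.3 / 15 = 20 m.
z = depth_from_disparity(1000.0, 0.3, 15.0)
```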

The LiDAR unit 44a is configured to detect the surrounding environment of the vehicle 1. Specifically, the LiDAR unit 44a is configured to acquire point group data indicating the surrounding environment of the vehicle 1 and then transmit the point group data to the LiDAR unit control unit 440a. The LiDAR unit control unit 440a may determine the surrounding environment information based on the transmitted point group data.

More specifically, the LiDAR unit 44a acquires information relating to the time of flight (TOF) ΔT1 of the laser light at each emission angle (horizontal angle θ, vertical angle φ). Based on the information relating to the time of flight ΔT1 at each emission angle, the LiDAR unit 44a can acquire information relating to the distance D, at each emission angle, between the LiDAR unit 44a and an object existing outside the vehicle 1.

In addition, the LiDAR unit 44a has, for example: a light emitting unit configured to emit laser light; an optical deflector configured to scan the laser light in the horizontal and vertical directions; an optical system such as a lens; and a light receiving unit configured to receive the laser light reflected by an object. The peak wavelength of the laser light emitted from the light emitting unit is not particularly limited; for example, the laser light may be invisible (infrared) light having a peak wavelength of about 900 nm. The light emitting unit is, for example, a laser diode. The optical deflector is, for example, a MEMS (Micro Electro Mechanical Systems) mirror or a polygon mirror. The light receiving unit is, for example, a photodiode. The LiDAR unit 44a may also acquire point group data without scanning the laser light with the optical deflector; for example, it may acquire point group data in a phased-array manner or a flash manner. The LiDAR unit 44a may also acquire point group data by mechanically rotating the light emitting unit and the light receiving unit.
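The time-of-flight measurement described above maps directly to a distance: the laser pulse travels to the object and back, so D = c·ΔT1/2. A minimal sketch, with the round-trip time chosen purely for illustration:

```python
# Distance from the round-trip time of flight dT1 of a laser pulse.

C = 299_792_458.0  # speed of light in m/s

def distance_from_tof(dt1_s: float) -> float:
    """D = c * dT1 / 2; the pulse covers the distance twice."""
    return C * dt1_s / 2.0

# A 200 ns round trip corresponds to roughly 30 m.
d = distance_from_tof(200e-9)
```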

The millimeter wave radar 45a is configured to detect the surrounding environment of the vehicle 1. Specifically, the millimeter wave radar 45a is configured to acquire radar data indicating the surrounding environment of the vehicle 1 and then transmit the radar data to the millimeter wave radar control unit 450a. The millimeter wave radar control unit 450a is configured to acquire the surrounding environment information based on the radar data. The surrounding environment information may include information on an object existing outside the vehicle 1, for example, information on the position and direction of the object with respect to the vehicle 1 and information on the relative speed of the object with respect to the vehicle 1.

For example, the millimeter wave radar 45a can acquire the distance and direction between the millimeter wave radar 45a and an object existing outside the vehicle 1 by a pulse modulation method, an FM-CW (Frequency Modulated-Continuous Wave) method, or a dual-frequency CW method. In the case of the pulse modulation method, the millimeter wave radar 45a acquires information on the time of flight ΔT2 of the millimeter wave and can then acquire information on the distance D between the millimeter wave radar 45a and the object based on that information. The millimeter wave radar 45a can acquire information on the direction of the object with respect to the vehicle 1 based on the phase difference between the millimeter wave (received wave) received by one receiving antenna and the millimeter wave (received wave) received by an adjacent receiving antenna. Further, the millimeter wave radar 45a can acquire information on the relative speed V of the object with respect to the millimeter wave radar 45a based on the frequency f0 of the transmission wave radiated from the transmitting antenna and the frequency f1 of the reception wave received by the receiving antenna.
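The two radar relations above can be sketched as follows: distance follows from the time of flight, and relative speed follows from the Doppler shift between the transmitted frequency f0 and the received frequency f1. The 76.5 GHz carrier and the sign convention are assumptions for illustration.

```python
# Illustrative sketch of the millimeter wave radar relations described above.

C = 299_792_458.0  # speed of light in m/s

def distance_from_tof(dt2_s: float) -> float:
    """Pulse modulation: D = c * dT2 / 2."""
    return C * dt2_s / 2.0

def relative_speed(f0_hz: float, f1_hz: float) -> float:
    """Doppler: V = c * (f1 - f0) / (2 * f0), positive for an approaching object."""
    return C * (f1_hz - f0_hz) / (2.0 * f0_hz)

f0 = 76.5e9                  # assumed automotive radar carrier frequency
f1 = f0 + 2 * 20.0 * f0 / C  # shift produced by an object approaching at 20 m/s
v = relative_speed(f0, f1)
```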

The lamp cleaner 46a is configured to remove dirt adhering to the cover 22a and is disposed in the vicinity of the cover 22a (see fig. 5). The lamp cleaner 46a can remove dirt adhering to the cover 22a by spraying a cleaning liquid or air toward the cover 22a.

The lamp cleaner control unit 460a is configured to control the lamp cleaner 46a. The lamp cleaner control unit 460a is configured to determine whether or not dirt (e.g., rain, snow, mud, dust, etc.) adheres to the cover 22a based on reflected light intensity information relating to the intensities of a plurality of reflected lights reflected by the road surface after being emitted from the LiDAR unit 44a. Further, the lamp cleaner control unit 460a is configured to drive the lamp cleaner 46a in response to the determination that dirt adheres to the cover 22a.

Each of the sensor systems 4b to 4d similarly includes a control unit, an illumination unit, a camera, a LiDAR unit, a millimeter wave radar, and a lamp cleaner. The devices of the sensor system 4b are disposed in the space Sb formed by the housing 24b and the translucent cover 22b of the right front lamp 7b shown in fig. 1. The devices of the sensor system 4c are disposed in the space Sc formed by the housing 24c and the translucent cover 22c of the left rear lamp 7c. The devices of the sensor system 4d are disposed in the space Sd formed by the housing 24d and the translucent cover 22d of the right rear lamp 7d.

Returning to fig. 2, the sensor 5 may have an acceleration sensor, a velocity sensor, a gyro sensor, and the like. The sensor 5 is configured to detect a traveling state of the vehicle 1 and output traveling state information indicating the traveling state of the vehicle 1 to the vehicle control unit 3. In addition, the sensor 5 may have an outside air temperature sensor that detects the outside air temperature outside the vehicle 1.

The HMI 8 is constituted by an input unit that receives input operations from the driver and an output unit that outputs travel information and the like to the driver. The input unit includes a steering wheel, an accelerator pedal, a brake pedal, a driving mode changeover switch for switching the driving mode of the vehicle 1, and the like. The output unit is a display (e.g., a Head-Up Display (HUD)) that displays various types of travel information. The GPS 9 is configured to acquire current position information of the vehicle 1 and output the acquired current position information to the vehicle control unit 3.

The wireless communication unit 10 is configured to receive information about other vehicles around the vehicle 1 from those vehicles and to transmit information about the vehicle 1 to them (vehicle-to-vehicle communication). The wireless communication unit 10 is also configured to receive infrastructure information from infrastructure equipment such as traffic signals and marker lamps and to transmit the travel information of the vehicle 1 to the infrastructure equipment (road-to-vehicle communication). Further, the wireless communication unit 10 is configured to receive information about a pedestrian from a portable electronic device (a smartphone, a tablet, a wearable device, etc.) carried by the pedestrian and to transmit the travel information of the vehicle 1 to the portable electronic device (pedestrian-to-vehicle communication). The vehicle 1 may communicate with other vehicles, infrastructure equipment, or portable electronic devices directly in a peer-to-peer mode, or via a communication network such as the Internet.

The storage device 11 is an external storage device such as a Hard Disk Drive (HDD) or an SSD (Solid State Drive). The storage device 11 may store 2-dimensional or 3-dimensional map information and/or a vehicle control program. For example, the 3-dimensional map information may be constituted by 3D mapping data (point group data). The storage device 11 is configured to output the map information and the vehicle control program to the vehicle control unit 3 in response to a request from the vehicle control unit 3. The map information and the vehicle control program can be updated via the wireless communication unit 10 through a communication network.

When the vehicle 1 travels in the automatic driving mode, the vehicle control unit 3 automatically generates at least one of a steering control signal, an acceleration control signal, and a braking control signal based on the travel state information, the surrounding environment information, the current position information, the map information, and the like. The steering actuator 12 is configured to receive a steering control signal from the vehicle control unit 3 and control the steering device 13 based on the received steering control signal. The brake actuator 14 is configured to receive a brake control signal from the vehicle control unit 3 and control the brake device 15 based on the received brake control signal. The acceleration actuator 16 is configured to receive an acceleration control signal from the vehicle control unit 3 and control the acceleration device 17 based on the received acceleration control signal. As described above, the vehicle control unit 3 automatically controls the travel of the vehicle 1 based on the travel state information, the surrounding environment information, the current position information, the map information, and the like. That is, in the automatic driving mode, the travel of the vehicle 1 is automatically controlled by the vehicle system 2.

On the other hand, when the vehicle 1 travels in the manual driving mode, the vehicle control unit 3 generates a steering control signal, an acceleration control signal, and a braking control signal in accordance with manual operations of an accelerator pedal, a brake pedal, and a steering wheel by the driver. As described above, in the manual driving mode, the steering control signal, the acceleration control signal, and the brake control signal are generated by the manual operation of the driver, and thus the traveling of the vehicle 1 is controlled by the driver.

Next, the driving mode of the vehicle 1 will be explained. The driving mode is composed of an automatic driving mode and a manual driving mode. The automatic driving mode is constituted by a full automatic driving mode, an advanced driving assistance mode, and a driving assistance mode. In the full-automatic driving mode, the vehicle system 2 automatically performs all the travel controls of the steering control, the braking control, and the acceleration control, and the driver is not in a state in which the vehicle 1 can be driven. In the advanced driving assistance mode, the vehicle system 2 automatically performs all the travel control of the steering control, the braking control, and the acceleration control, and the driver does not drive the vehicle 1 although the driver is in a state in which the vehicle 1 can be driven. In the driving assistance mode, the vehicle system 2 automatically performs a part of travel control among steering control, braking control, and acceleration control, and the vehicle 1 is driven by the driver with driving assistance of the vehicle system 2. On the other hand, in the manual driving mode, the vehicle system 2 does not automatically perform the running control, and the vehicle 1 is driven by the driver without the driving assistance of the vehicle system 2.

(Dirt detection method according to embodiment 1)

Next, a method of detecting dirt adhering to the cover 22a of the left lamp 7a will be described below with reference mainly to fig. 4. Fig. 4 is a flowchart for explaining a method of detecting dirt adhering to the cover 22a (hereinafter referred to as a "dirt detection method") according to embodiment 1. In the present embodiment, only the dirt detection process performed by the sensor system 6a will be described, but note that the dirt detection processes performed by the sensor systems 6b to 6d are the same as the dirt detection process performed by the sensor system 6a.

As shown in fig. 4, in step S1, the vehicle control unit 3 determines whether or not the road surface around the vehicle 1 is dry, based on the surrounding environment information transmitted from the sensor systems 4a to 4 d. If the determination result at step S1 is NO, the present determination process is repeatedly executed until the determination result at step S1 becomes YES. For example, when the vehicle 1 is traveling, the road surface around the vehicle 1 changes gradually, and therefore the process of step S1 may be executed until it is determined that the road surface around the vehicle 1 is dry. On the other hand, if the determination result at step S1 is YES, the process proceeds to step S2.

Next, in step S2, the LiDAR unit control unit 440a controls the LiDAR unit 44a such that the LiDAR unit 44a emits laser light L toward the road surface R at each horizontal angle θ (see fig. 5). As already described, the LiDAR unit 44a is configured to emit laser light at a plurality of emission angles, each defined by a horizontal angle θ in the horizontal direction and a vertical angle φ in the vertical direction. Point group data indicating the distance at each emission angle is generated by acquiring information on the time of flight ΔT at each emission angle. In the dirt detection process according to the present embodiment, the LiDAR unit 44a emits the laser light in a predetermined layer (at a predetermined vertical angle φ) at which the road surface R is measured. Here, as shown in fig. 5, the predetermined layer corresponds to the layer of the laser light L indicated by the solid line. That is, the vertical angle φ of the laser light is fixed at the predetermined vertical angle for scanning the road surface R, while the horizontal angle θ of the laser light is changed. Specifically, when the angular range in the horizontal direction is 45° and the angular pitch Δθ in the horizontal direction is 0.2°, the LiDAR unit 44a emits laser light toward the road surface R at each of 226 horizontal angles θ. When the horizontal angle of the laser light emitted n-th (n is an integer, 1 ≤ n ≤ 226) is denoted by θn and the horizontal angle of the laser light emitted (n−1)-th is denoted by θn−1, the relation θn = θn−1 + Δθ holds, where Δθ is set to 0.2° as described above.
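The scan geometry described above can be sketched as follows. This is a minimal illustration only; the function name and the use of Python are assumptions for the example, not part of the embodiment:

```python
def horizontal_scan_angles(angle_range_deg=45.0, pitch_deg=0.2, start_deg=0.0):
    """Return the horizontal angles theta_n used for one road-surface scan.

    Applying theta_n = theta_{n-1} + delta_theta over a 45-degree range with a
    0.2-degree pitch yields 45 / 0.2 + 1 = 226 emission angles.
    """
    count = int(round(angle_range_deg / pitch_deg)) + 1
    return [start_deg + i * pitch_deg for i in range(count)]

# One scan of the predetermined layer: 226 horizontal angles, 0.2 degrees apart.
angles = horizontal_scan_angles()
```

With the default parameters, `angles` contains 226 values whose consecutive differences equal the angular pitch Δθ = 0.2°.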

The intensity of the laser light emitted from the LiDAR unit 44a in the processing of step S2 may be greater than the intensity of the laser light emitted from the LiDAR unit 44a when the point group data is acquired. In this regard, the present dirt detection method acquires information on the intensity of the reflected light reflected by an object rather than information on the distance to the object, and therefore the intensity of the laser light emitted from the LiDAR unit 44a is preferably higher than the normal laser light intensity in order to improve the accuracy of the reflected light intensity information. Likewise, the light receiving sensitivity of the light receiving unit with respect to the reflected light in the processing of step S2 may be greater than the light receiving sensitivity of the light receiving unit with respect to the reflected light when the point group data is acquired.

Next, in step S3, the LiDAR unit 44a receives the reflected lights, reflected by the road surface R, at each of the 226 horizontal angles θ (θ1, θ2, ..., θ226). The LiDAR unit 44a then generates reflected light intensity information relating to the intensities In of the plurality of reflected lights at the respective horizontal angles θn, and transmits the generated reflected light intensity information to the lamp cleaner control unit 460a via the LiDAR unit control unit 440a. In this way, in step S4, the lamp cleaner control unit 460a acquires the reflected light intensity information from the LiDAR unit 44a. Here, the reflected light intensity information includes information relating to the intensity In of the reflected light of the laser light emitted n-th (n = 1 to 226).

Next, in step S5, the lamp cleaner control unit 460a compares each of the intensities In of the 226 reflected lights with a prescribed threshold value Ith. Specifically, the lamp cleaner control unit 460a determines whether each of the intensities In of the 226 reflected lights is less than the prescribed threshold value Ith (In < Ith). Here, the prescribed threshold value Ith is correlated with the intensity I of the reflected light from the road surface R measured when dirt is not attached to the cover 22a. For example, the prescribed threshold value Ith may be set to a value of X% of the intensity I of the reflected light from the road surface R measured when dirt is not attached to the cover 22a. Here, X is preferably set to a value between 40 and 70 (more preferably, a value between 60 and 70), but the value of X is not particularly limited. That is, the prescribed threshold value Ith is not particularly limited. The prescribed threshold value Ith is stored in advance in the memory of the control unit 40a. In addition, the prescribed threshold value Ith may be updated over time in consideration of deterioration of the cover 22a with age.

Next, the lamp cleaner control unit 460a determines whether the number of reflected light intensities In that are less than the prescribed threshold value Ith is greater than or equal to a predetermined number (step S6). As shown in fig. 6, the lamp cleaner control unit 460a determines whether each of the reflected light intensities I1 to I226 is less than the threshold value Ith, and then counts the number of reflected light intensities In that are less than the threshold value Ith. The lamp cleaner control unit 460a then determines whether the counted number of reflected light intensities In is greater than or equal to the predetermined number.

If the determination result in step S6 is YES, the lamp cleaner control unit 460a determines that dirt G (see fig. 5) adheres to the cover 22a (step S8). Here, the dirt G is, for example, rain, snow, mud, dust, or the like. On the other hand, if the determination result in step S6 is NO, the lamp cleaner control unit 460a determines that dirt G is not attached to the cover 22a (step S7), and then ends the present process.
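The decision of steps S5 to S8 can be sketched as follows. This is a hedged illustration only; the concrete threshold ratio (65%), the predetermined count, and all names are assumptions for the example:

```python
def dirt_attached(intensities, i_clean, x_percent=65.0, min_count=10):
    """Embodiment-1 style decision: count the reflected-light intensities I_n
    that fall below the prescribed threshold Ith = X% of the clean-cover
    intensity I (step S5), and report dirt when the count reaches a
    predetermined number (steps S6 and S8)."""
    i_th = i_clean * x_percent / 100.0                    # prescribed threshold Ith
    below = sum(1 for i_n in intensities if i_n < i_th)   # count of I_n < Ith
    return below >= min_count                             # True: dirt adheres (S8)

# Clean cover: all intensities stay near the clean reference value I.
clean_scan = [100.0] * 226
# Dirty cover: a block of angles drops to ~60% of the reference value.
dirty_scan = [100.0] * 200 + [60.0] * 26
```

Here `dirt_attached(clean_scan, 100.0)` yields a "no dirt" result (step S7), while `dirt_attached(dirty_scan, 100.0)` yields a "dirt attached" result (step S8).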

Then, in step S9, the lamp cleaner control unit 460a drives the lamp cleaner 46a to remove the dirt G adhering to the cover 22a. Specifically, the lamp cleaner control unit 460a drives the lamp cleaner 46a so that washer fluid or air is ejected from the lamp cleaner 46a toward the cover 22a.

After the dirt removal process is performed on the cover 22a by the lamp cleaner 46a (after the process of step S9 is executed), the process returns to step S2. In this way, the processing from step S2 to step S9 is repeatedly executed until it is determined that dirt G is not attached to the cover 22a. Note that the present process may instead be ended after the process of step S9 is executed.

As described above, according to the present embodiment, whether or not dirt adheres to the cover 22a is determined based on the reflected light intensity information relating to the intensities In of the plurality of reflected lights, and the lamp cleaner 46a is driven in accordance with the determination that dirt adheres to the cover 22a. In this way, dirt adhering to the cover 22a can be detected based on the reflected light intensity information. In this regard, when dirt such as rain, snow, or mud adheres to the cover 22a, the dirt reduces the intensity of the reflected light, and therefore the dirt adhering to the cover 22a can be detected from the intensity of the reflected light. In particular, experimental results show that the intensity of the reflected light when dirt adheres to the cover 22a is between 60% and 70% of the intensity I of the reflected light from the road surface R measured when dirt is not attached to the cover 22a. Since dirt adhering to the cover 22a can thus be reliably detected, a decrease in the detection accuracy of a sensor such as the LiDAR unit 44a disposed in the left lamp 7a can be suppressed.

In addition, according to the present embodiment, as explained with the process of step S1, the processes of steps S2 to S9 (in other words, the dirt detection process) are executed when the road surface R around the vehicle 1 is dry. In this regard, when the road surface R is wet, the laser light emitted from the LiDAR unit 44a is specularly reflected by the road surface R. The intensity of the light that is reflected by the road surface R and then enters the light receiving unit of the LiDAR unit 44a therefore becomes very small, and it may not be possible to accurately determine, based on the reflected light intensity information, whether dirt is attached to the cover 22a. In contrast, according to the present embodiment, the determination process of whether or not dirt adheres to the cover 22a is executed when the road surface R is dry, and therefore whether or not dirt adheres to the cover 22a can be determined with high accuracy based on the reflected light intensity information.

In the present embodiment, the comparison processing of step S5 determines whether each of the intensities In of the 226 reflected lights is less than the prescribed threshold value Ith; however, the comparison processing of step S5 is not particularly limited to this. For example, it may be determined whether the mean or median of the intensities In of the 226 reflected lights is less than the prescribed threshold value Ith. When it is determined that the mean or median of the reflected light intensities In is greater than or equal to the prescribed threshold value Ith, the lamp cleaner control unit 460a may determine in step S7 that dirt G is not attached to the cover 22a. On the other hand, when it is determined that the mean or median of the reflected light intensities In is less than the prescribed threshold value Ith, the lamp cleaner control unit 460a may determine in step S8 that dirt G adheres to the cover 22a. Note that in this case the processing of step S6 is omitted.
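The mean/median variant above can be sketched in a few lines. This is an illustrative assumption of one possible implementation, using the median case:

```python
import statistics

def dirt_attached_by_median(intensities, i_th):
    """Variant of step S5: compare the median of all reflected-light
    intensities I_n with the prescribed threshold Ith, instead of counting
    individual sub-threshold intensities (step S6 is omitted)."""
    return statistics.median(intensities) < i_th
```

For example, a scan whose median intensity sits at 60% of the clean reference falls below a threshold of Ith = 65 and is judged as "dirt attached", while a clean scan at the reference level is not.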

In the present embodiment, the angular range and the angular pitch in the horizontal direction of the LiDAR unit 44a are 45° and 0.2°, respectively, but the present embodiment is not limited thereto. The angular range and the angular pitch of the LiDAR unit 44a in the horizontal direction may each be set to any value.

(Dirt detection method according to embodiment 2)

Next, a method of detecting dirt adhering to the cover 22a of the left lamp 7a according to embodiment 2 will be described below mainly with reference to figs. 7 and 8. Fig. 7 is a flowchart for explaining a series of processing for acquiring reflected light intensity information when the vehicle 1 is parked. Fig. 8 is a flowchart for explaining a method of detecting dirt adhering to the cover 22a (dirt detection method) according to embodiment 2. In the present embodiment as well, only the dirt detection process performed by the sensor system 6a will be described, but note that the dirt detection processes performed by the sensor systems 6b to 6d are the same as the dirt detection process performed by the sensor system 6a.

First, a series of processing for acquiring the reflected light intensity information when the vehicle 1 is parked will be described below with reference to fig. 7. As shown in fig. 7, when the vehicle 1 is being parked (YES in step S10), the vehicle control unit 3 determines whether or not the road surface around the vehicle 1 is dry based on the surrounding environment information transmitted from the sensor systems 4a to 4d (step S11). If the determination result of step S10 or S11 is NO, the determination processing is repeatedly executed until the determination results of steps S10 and S11 become YES. On the other hand, if the determination result of step S11 is YES, the process proceeds to step S12. When the vehicle 1 is traveling in the advanced driving assistance mode or the full automatic driving mode, the vehicle control unit 3 may itself decide to park the vehicle 1; in this case, after the vehicle control unit 3 decides to park the vehicle 1, the processing from step S11 onward is executed. On the other hand, when the vehicle 1 is traveling in the manual driving mode or the driving assistance mode, the vehicle control unit 3 may determine whether the vehicle 1 is currently being parked based on the surrounding environment information (for example, the presence of a parking lot) and the travel information (for example, reverse travel) of the vehicle 1.

Next, in step S12, the LiDAR unit control unit 440a controls the LiDAR unit 44a such that the LiDAR unit 44a emits laser light L toward the road surface R at each horizontal angle θ (see fig. 5). Next, in step S13, the LiDAR unit 44a receives the reflected lights, reflected by the road surface R, at each of the 226 horizontal angles θ (θ1, θ2, ..., θ226). The LiDAR unit 44a then generates reflected light intensity information relating to the intensities In of the plurality of reflected lights at the respective horizontal angles θn, and transmits the generated reflected light intensity information to the lamp cleaner control unit 460a via the LiDAR unit control unit 440a. In this way, the lamp cleaner control unit 460a can acquire the reflected light intensity information (step S14). Then, the lamp cleaner control unit 460a stores the acquired reflected light intensity information in the memory of the control unit 40a or in the storage device 11 (see fig. 2) (step S15). In this way, the reflected light intensity information measured when the vehicle 1 is parked is stored in the vehicle 1.

Next, a dirt detection method according to embodiment 2 will be described below with reference to fig. 8. The dirt detection method shown in fig. 8 is executed, for example, when the vehicle 1 parked in the parking lot is next started. As shown in fig. 8, in step S20, the vehicle control unit 3 determines whether or not the road surface around the vehicle 1 is dry, based on the surrounding environment information transmitted from the sensor systems 4a to 4d. If the determination result at step S20 is YES, the process proceeds to step S21. On the other hand, when the determination result at step S20 is NO, the determination process at step S20 is repeatedly executed.

Next, in step S21, the LiDAR-unit control unit 440a controls the LiDAR unit 44a such that the LiDAR unit 44a emits the laser light L toward the road surface R at each horizontal angle θ.

Next, in step S22, the LiDAR unit 44a receives the reflected lights, reflected by the road surface R, at each of the 226 horizontal angles θ (θ1, θ2, ..., θ226). The LiDAR unit 44a then generates reflected light intensity information relating to the intensities In of the plurality of reflected lights at the respective horizontal angles θn, and transmits the generated reflected light intensity information to the lamp cleaner control unit 460a via the LiDAR unit control unit 440a. In this way, in step S23, the lamp cleaner control unit 460a acquires the reflected light intensity information from the LiDAR unit 44a.

Next, in step S24, the lamp cleaner control unit 460a compares the reflected light intensity information measured this time with the reflected light intensity information measured previously and stored in the vehicle 1. In this regard, the lamp cleaner control unit 460a compares each of the intensities In of the 226 reflected lights measured this time with the corresponding one of the intensities Iref_n of the 226 reflected lights measured previously. Specifically, the lamp cleaner control unit 460a determines whether the ratio (percentage) of the intensity In of the n-th reflected light measured this time to the intensity Iref_n of the n-th reflected light measured previously is less than 50% (n = 1, ..., 226). That is, the reflected light intensity In is compared with the reflected light intensity Iref_n based on the following formula (1).

(In / Iref_n) × 100% < 50% ··· (1)

Then, the lamp cleaner control unit 460a determines whether the number of reflected light intensities In satisfying the above formula (1) is greater than or equal to a predetermined number (step S25). As shown in fig. 9, the lamp cleaner control unit 460a compares each of the reflected light intensities I1 to I226 with the corresponding one of the reflected light intensities Iref_1 to Iref_226, thereby counting the number of reflected light intensities In satisfying the above formula (1).

If the determination result in step S25 is YES, the lamp cleaner control unit 460a determines that dirt G (see fig. 5) adheres to the cover 22a (step S27). On the other hand, if the determination result in step S25 is NO, the lamp cleaner control unit 460a determines that dirt G is not attached to the cover 22a (step S26), and the process ends.
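The comparison of steps S24, S25, S26, and S27 can be sketched as follows. This is a hedged illustration; the predetermined count and all names are assumptions for the example, while the 50% ratio follows formula (1):

```python
def dirt_attached_vs_reference(current, reference, ratio_percent=50.0, min_count=10):
    """Embodiment-2 style decision: count the angles at which the ratio of the
    current intensity I_n to the stored reference intensity Iref_n falls below
    50% (formula (1)), and report dirt when the count reaches a predetermined
    number (step S25)."""
    hits = sum(
        1
        for i_n, i_ref in zip(current, reference)
        if (i_n / i_ref) * 100.0 < ratio_percent   # formula (1)
    )
    return hits >= min_count
```

For instance, with a stored reference scan of uniform intensity, a current scan in which 20 angles drop to 40% of the reference is judged as "dirt attached" (step S27), whereas a uniform drop to only 90% is not (step S26).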

Then, in step S28, the lamp cleaner control unit 460a drives the lamp cleaner 46a to remove the dirt G adhering to the cover 22a. Specifically, the lamp cleaner control unit 460a drives the lamp cleaner 46a so that washer fluid or air is ejected from the lamp cleaner 46a toward the cover 22a.

After the dirt removal process is performed on the cover 22a by the lamp cleaner 46a (after the process of step S28 is executed), the process returns to step S21. In this way, the processing from step S21 to step S28 is repeated until it is determined that dirt G is not attached to the cover 22a. Note that the present process may instead be ended after the process of step S28 is executed.

As described above, according to the present embodiment, the dirt G adhering to the cover 22a can be detected based on a comparison between the reflected light intensity information measured previously and the reflected light intensity information measured this time. Since the dirt G adhering to the cover 22a can thus be reliably detected, a decrease in the detection accuracy of a sensor such as the LiDAR unit 44a disposed in the left lamp 7a can be suppressed.

In the present embodiment, the processing of steps S24 and S25 determines whether the ratio (percentage) of the intensity In of the n-th reflected light measured this time to the intensity Iref_n of the n-th reflected light measured previously is less than 50%, and counts the number of reflected light intensities In satisfying the above formula (1); however, the present embodiment is not limited to this. For example, it may be determined whether the ratio (percentage) of the reflected light intensity In to the reflected light intensity Iref_n is less than X% (here, 0% < X% < 100%). In addition, it may be determined whether the difference ΔIn between the reflected light intensity Iref_n and the reflected light intensity In is greater than or equal to a prescribed threshold value Ith.
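The difference-based variant can be sketched similarly. This is a hedged example that assumes dirt is indicated at the n-th angle when the intensity drop ΔIn = Iref_n − In reaches the threshold, which is the reading consistent with formula (1); the function name is illustrative only:

```python
def drop_indicates_dirt(i_ref_n, i_n, i_th):
    """Difference-based variant of formula (1): flag the n-th angle when the
    intensity drop delta_In = Iref_n - I_n is greater than or equal to the
    prescribed threshold Ith."""
    return (i_ref_n - i_n) >= i_th
```

With a reference intensity of 100 and a threshold of 50, a current intensity of 40 is flagged while a current intensity of 90 is not.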

While the embodiments of the present invention have been described above, it is needless to say that the technical scope of the present invention should not be construed as being limited to the description of the embodiments. This embodiment is merely an example, and it will be understood by those skilled in the art that various modifications of the embodiment can be made within the scope of the invention described in the claims. The technical scope of the present invention should be determined based on the scope of the invention described in the claims and the equivalent scope thereof.

The present application appropriately incorporates by reference the disclosure of the Japanese patent application filed on February 18, 2019 (Japanese Patent Application No. 2019-026548).
