Vehicle-mounted camera system and vehicle lamp

Document No.: 991712  Publication date: 2020-10-20

Note: This technology, "Vehicle-mounted camera system and vehicle lamp," was designed and created by 远藤修, 真野光治, and 田中秀忠 on 2019-02-05. Summary: An in-vehicle camera system (100) includes: a camera (110) that captures an image of the area in front of the vehicle; and a processor (120) that processes the image from the camera (110). The processor (120) classifies the object detection processing for the area in front of the vehicle into tasks with later deadlines and tasks with earlier deadlines for execution.

1. An in-vehicle camera system, characterized by comprising:

a camera that captures an image of the area in front of the vehicle; and

a processor that processes the image from the camera;

wherein the processor classifies the object detection processing for the area in front of the vehicle into a task with a later deadline and a task with an earlier deadline, and executes the tasks.

2. The in-vehicle camera system according to claim 1,

wherein the task with the later deadline includes a process of detecting a distant vehicle.

3. The in-vehicle camera system according to claim 1 or 2,

wherein the task with the later deadline includes at least one of the following processes:

(i) a process of determining straight travel or turning,

(ii) a process of detecting a distant area,

(iii) a process of detecting a distant light spot,

(iv) a process of tracking a distant light spot.

4. The in-vehicle camera system according to any one of claims 1 to 3,

wherein the task with the earlier deadline includes a process of detecting a nearby vehicle.

5. The in-vehicle camera system according to any one of claims 1 to 4,

wherein the task with the earlier deadline includes the following processes:

(i) a process of extracting a high-luminance region,

(ii) a process of generating a feature value of the high-luminance region,

(iii) a process of detecting a vehicle based on the feature value.

6. An in-vehicle camera system, characterized by comprising:

a camera that captures an image of the area in front of the vehicle; and

a processor that processes the image from the camera;

wherein the processor executes:

a high-speed task that processes every frame of the camera image, and

a low-speed task that processes the camera image at a rate of once every multiple frames.

7. The in-vehicle camera system according to claim 6,

wherein the low-speed task includes a process of detecting a relatively distant preceding vehicle, and the high-speed task includes a process of detecting a relatively close preceding vehicle.

8. A vehicle lamp, characterized by

comprising the in-vehicle camera system according to any one of claims 1 to 7.

Technical Field

The present invention relates to an in-vehicle camera system.

Background

In recent years, ADB (Adaptive Driving Beam) technology has been introduced, which dynamically and adaptively controls the light distribution pattern of a high beam based on the conditions around the vehicle. ADB reduces the glare imparted to a vehicle or a pedestrian by detecting the presence or absence of a preceding vehicle, an oncoming vehicle (hereinafter collectively referred to as a preceding vehicle), or a pedestrian in front of the host vehicle, and dimming or blocking the light in the region corresponding to that vehicle or pedestrian.

Implementing the ADB function requires: a sensor that measures the conditions in front of the vehicle; a processor that processes the sensor output to generate a light distribution pattern; and a light source that illuminates the area in front of the vehicle according to the light distribution pattern.

Candidate sensors for automotive use include cameras, LiDAR (Light Detection and Ranging; Laser Imaging Detection and Ranging), millimeter-wave radar, and ultrasonic sonar. Among these, the camera, being the least expensive, is currently the most widely used. A common method detects a preceding vehicle by pattern-matching light spots contained in the camera image of the area in front of the vehicle against the headlamps or tail lamps of vehicles (hereinafter collectively referred to as light sources).

[ Prior art documents ]

[ patent document ]

Patent document 1: Japanese Patent Laid-Open Publication No. 2009-032030

Disclosure of Invention

[ problems to be solved by the invention ]

However, images captured at night include not only the light sources of preceding vehicles but also other light sources (for example, street lamps, reflectors, and building illumination), and distinguishing between them is not easy, requiring complicated signal processing in the processor. If detection of a preceding vehicle is delayed, glare is imparted to that vehicle, so high-speed detection is required. This demands a fast processor or a dedicated image processing engine, which increases the cost of the system.

The present invention has been made in view of the above circumstances, and an exemplary object of one aspect thereof is to reduce the cost of an in-vehicle camera system.

[ means for solving the problems ]

One aspect of the present invention relates to an in-vehicle camera system. The in-vehicle camera system includes: a camera that captures an image of the area in front of the vehicle; and a processor that processes the image from the camera. The processor classifies the object detection processing for the area in front of the vehicle into a task with a later deadline and a task with an earlier deadline for execution.

Targets include traffic participants such as vehicles and pedestrians, as well as signs, road shapes, and the like. The time allowed for detection (the allowable delay) differs depending on, for example, the type of preceding vehicle (preceding or oncoming), its traveling position, and the traveling conditions. According to this aspect, the object detection processing is divided into tasks with later deadlines, which are allowed longer processing times, and tasks with earlier deadlines, which require shorter processing times, so that each object can be detected within the delay appropriate to it even when an inexpensive processor with a low processing speed is used.

A task with an earlier deadline may be scheduled more frequently.

The task with the later deadline may include a process of detecting a distant vehicle. It may also include at least one of (i) a process of determining straight travel or turning, (ii) a process of detecting a distant area, (iii) a process of detecting a distant light spot, and (iv) a process of tracking a distant light spot.

The task with the earlier deadline may include a process of detecting a nearby vehicle. It may include (i) a process of extracting a high-luminance region, (ii) a process of generating a feature value of the high-luminance region, and (iii) a process of detecting a vehicle based on the feature value.

Another aspect of the invention also relates to an in-vehicle camera system. The in-vehicle camera system includes: a camera that captures an image of the area in front of the vehicle; and a processor that processes the image from the camera. The processor executes a high-speed task that processes every frame of the camera image and a low-speed task that processes the camera image at a rate of once every multiple frames.

The low-speed task may include a process of detecting a relatively distant preceding vehicle, and the high-speed task may include a process of detecting a relatively close preceding vehicle.

Another aspect of the invention relates to a vehicle lamp. The vehicle lamp may include the above-described in-vehicle camera system.

[ Effect of the invention ]

According to the present invention, the cost of the in-vehicle camera system can be reduced.

Drawings

Fig. 1 is a block diagram of an in-vehicle camera system of the embodiment.

Fig. 2 (a) to 2 (d) are diagrams illustrating examples of a task having a later deadline and a task having an earlier deadline.

Fig. 3 (a) to 3 (c) are diagrams illustrating another example of a task with a later deadline and a task with an earlier deadline.

Fig. 4 is a flowchart showing a process of the in-vehicle camera system according to the embodiment.

Fig. 5 is a timing chart for explaining the operation of the in-vehicle camera system.

Fig. 6 is a block diagram of the vehicular lamp of the embodiment.

Detailed Description

The present invention will be described below based on preferred embodiments with reference to the accompanying drawings. The same or equivalent constituent elements, members, and processes shown in the respective drawings are denoted by the same reference numerals, and overlapping descriptions are appropriately omitted. The embodiments are not intended to limit the present invention, and are merely examples, and all the features or combinations thereof described in the embodiments are not necessarily essential to the present invention.

Fig. 1 is a block diagram of an in-vehicle camera system 100 of the embodiment. The in-vehicle camera system 100 includes a camera 110 and a processor 120, and detects a "preceding vehicle." The "preceding vehicle" as the detection target typically refers to a vehicle traveling ahead in the same direction and an oncoming vehicle.

The camera 110 photographs the front of the vehicle. The processor 120 receives the image IMG of the camera 110 and performs object detection in front of the vehicle.

The processor 120 includes a CPU (Central Processing Unit), a GPGPU (General-Purpose Graphics Processing Unit), and the like, and may be a general-purpose microcomputer controllable by software, or dedicated hardware such as an AI (Artificial Intelligence) chip.

The processor 120 classifies the object detection processing for the area in front of the vehicle into a task with a later deadline and a task with an earlier deadline, and executes them.

The processor 120 includes a preprocessing unit 122, a task assigning unit 124, a high-speed task executing unit 126, and a low-speed task executing unit 128. In addition, these blocks represent functions of the processor 120 and do not exist as distinguishable hardware.

The preprocessing unit 122 preprocesses the image IMG from the camera 110. The preprocessing includes noise removal, binarization, and light spot detection by labeling.

The task assigning unit 124 classifies the multiple sub-processes included in object detection in front of the vehicle into tasks with later deadlines and tasks with earlier deadlines, and hands them to the high-speed task execution unit 126 and the low-speed task execution unit 128, respectively. The high-speed task execution unit 126 executes the tasks with earlier deadlines, and the low-speed task execution unit 128 executes the tasks with later deadlines. The output unit 130 outputs information on the preceding vehicle based on the processing results of the high-speed task execution unit 126 and the low-speed task execution unit 128.
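As a rough illustration of this division of labor, the blocks 122 to 130 might be sketched as follows. This is a hypothetical outline, not the patented implementation; all function names and the (angle, features) spot representation are assumptions.

```python
# Hypothetical sketch of the processor-internal blocks (122-130).
# A light spot is represented as (angle_from_vanishing_deg, features).

def preprocess(image):
    """Block 122: noise removal, binarization, labeling -> light spots.
    Stub: assumes the input is already a list of light spots."""
    return image

def assign_tasks(spots, boundary_deg=2.5):
    """Block 124: split spots into earlier-deadline (near area, outside the
    boundary) and later-deadline (distant area, inside the boundary) groups."""
    early = [s for s in spots if abs(s[0]) > boundary_deg]
    late = [s for s in spots if abs(s[0]) <= boundary_deg]
    return early, late

def high_speed_task(spots):
    """Block 126: executed every frame (deadline ~33 ms)."""
    return [("near_vehicle", s) for s in spots]

def low_speed_task(spots):
    """Block 128: executed once every several frames (deadline ~200 ms)."""
    return [("distant_vehicle", s) for s in spots]

def output(high_results, low_results):
    """Block 130: merge the preceding-vehicle information."""
    return high_results + low_results
```

The sketch only shows the data flow; in a real system each block would operate on image regions and feature data rather than bare angles.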

Typically, a later deadline can be set for sub-processes related to objects or information that change slowly, whereas an earlier deadline must be set for sub-processes related to objects or information that change quickly. For example, an earlier deadline is set for objects or information that can change from one frame of the camera image to the next. Conversely, a later deadline can be set for objects or information that change slowly across multiple frames.

Several examples of tasks with a later deadline are described below.

1. Detection of vanishing points (distant areas)

Light spots in the image appear near the vanishing point (distant area) and move radially outward, or converge toward the vanishing point and disappear. The process (sub-process) of detecting the vanishing point is therefore important for detecting preceding vehicles. When the vehicle travels straight, the vanishing point lies directly ahead (the 0°, or 12 o'clock, direction); on a curve, it shifts to the right or left of straight ahead. Since straight travel does not change abruptly to a curve, nor a curve abruptly to straight travel, the vanishing point moves continuously and slowly, at a rate slower than the camera frame rate (33 ms at 30 fps). Therefore, a later deadline can be set for the sub-process related to detection of the vanishing point (distant area).

2. Detection and determination of straight-ahead and turning

As described above, transitions between straight travel and turning also occur gradually over multiple frames, so the deadline for the sub-processes related to their detection can likewise be set later.

3. Vehicle detection in the distant area

When the position information of preceding vehicles is used for an ADB lamp, the purpose is to avoid causing glare. When a preceding vehicle is sufficiently far from the host vehicle (for example, 300 m or more ahead), the influence of glare is small even if detection, and hence light blocking, is delayed. Thus, the deadline can be set later for sub-processes associated with light spots contained in the distant area. For example, a predetermined angular range around the vanishing point may be defined as the distant area.

Next, a task with an earlier deadline will be described.

4. Vehicle detection in the area close to the host vehicle (near area)

The closer a preceding vehicle is to the host vehicle, the higher the risk of glare caused by a detection delay. In other words, the closer the vehicle, the faster the detection must be. Thus, the deadline must be set earlier for sub-processes associated with light spots in the near area. For example, the region outside a predetermined angular range around the vanishing point may be defined as the near area.

Fig. 2 (a) to 2 (d) are diagrams illustrating examples of a task with a later deadline and a task with an earlier deadline. This example describes an encounter with an oncoming vehicle on a straight road. Fig. 2 (a) is a top view of the straight road. The host vehicle 200 travels from right to left in the figure. The preceding vehicle 202, 400 m ahead of the host vehicle 200, travels at the same speed as the host vehicle 200.

The oncoming vehicle 204 approaches through positions A, B, and C in turn; at positions A, B, and C it is labeled 204A, 204B, and 204C for distinction. Here, for simplicity of explanation, the positions of the host vehicle 200 and the preceding vehicle 202 are fixed. Fig. 2 (b) to 2 (d) show the camera images when the oncoming vehicle 204 is at positions A, B, and C, respectively.

For example, when the host vehicle 200 and the oncoming vehicle 204 each travel at 60 km/h, the relative speed is 120 km/h, so it takes 9 seconds to go from position A to position B, and about 3 seconds from position B to position C. The angle from the center is about 0.6° at position A, about 2.5° at position B, and about 20° at position C.
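These figures can be verified with a short calculation. The ~3.5 m lateral offset of the oncoming lane used below is an assumed value chosen to reproduce the angles in the text; the speeds and distances come from the example above.

```python
import math

def closing_time_s(distance_m, v_own_kmh=60.0, v_oncoming_kmh=60.0):
    """Time for the oncoming vehicle to cover distance_m at the relative speed."""
    relative_mps = (v_own_kmh + v_oncoming_kmh) * 1000.0 / 3600.0
    return distance_m / relative_mps

def angle_from_center_deg(distance_m, lateral_offset_m=3.5):
    """Apparent angle of the oncoming headlights from the image center,
    assuming an ~3.5 m lateral lane offset (hypothetical value)."""
    return math.degrees(math.atan2(lateral_offset_m, distance_m))

# Relative speed 60 + 60 = 120 km/h (about 33.3 m/s), so covering about
# 300 m from position A to position B takes about 9 s, and the apparent
# angle grows from ~0.6 deg (far) to ~20 deg (near) as distance shrinks.
```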

When traveling straight, the distant area falls at the center of the image. Therefore, a boundary 210 may be set at a predetermined angle to the left and right of the image center, with a later deadline for light spots inside the boundary 210 and an earlier deadline for light spots outside it. When the predetermined angle is set to ±2.5°, an earlier deadline is set for the sub-processes associated with detection of the oncoming vehicles 204B and 204C at positions B and C in Fig. 2 (a). For sub-processes related to light spots requiring an earlier deadline, the deadline may be set to 33 ms, that is, every frame.

In contrast, a later deadline is set for the sub-process associated with detection of the oncoming vehicle 204A at position A in Fig. 2 (a). Since the preceding vehicle 202 always remains inside the boundary 210, it also receives a later deadline. For sub-processes related to light spots whose deadline can be set later, the deadline may be several hundred ms; for example, with a 200 ms deadline, one frame may be processed every six frames.
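A minimal sketch of this deadline assignment might look as follows. The ±2.5° boundary and the 33 ms / 200 ms deadlines come from the text; the function name and the treat-the-boundary-as-outside convention (so that position B at ~2.5° gets the earlier deadline, as in the example) are assumptions.

```python
def spot_deadline_ms(angle_from_vanishing_deg, boundary_deg=2.5):
    """Assign a processing deadline to a light spot by its angular position.

    Spots strictly inside the boundary (distant area) get the later deadline
    (200 ms, i.e. one frame in six at 30 fps); spots on or outside the
    boundary get the earlier deadline (33 ms, i.e. every frame).
    """
    if abs(angle_from_vanishing_deg) < boundary_deg:
        return 200  # later deadline: handled by the low-speed task
    return 33       # earlier deadline: handled by the high-speed task
```

With this convention, the oncoming vehicle at position A (~0.6°) is handled by the low-speed task, while at positions B (~2.5°) and C (~20°) it is handled by the high-speed task.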

Fig. 3 (a) to 3 (c) are diagrams illustrating another example of a task with a later deadline and a task with an earlier deadline. This example describes an encounter with an oncoming vehicle on a curve. On a right curve, the distant area (vanishing point) shifts to the right of the image center; on a left curve, it shifts to the left. Straight travel, right turning, left turning, and the position of the distant area may be determined based on sensor information, or based on a comparison between past and current frames. As described above, the sub-process related to detection of the distant area can be given a later deadline.

During a turn, the boundary 210 may be set at a predetermined angle to the left and right of the distant position, with a later deadline inside the boundary 210 and an earlier deadline outside it.

Fig. 4 is a flowchart showing the processing of the in-vehicle camera system 100 according to the embodiment. Here, a task with a later deadline (referred to as a low-speed task) is executed once every multiple frames, and a task with an earlier deadline (referred to as a high-speed task) is executed for every frame.

The high-speed task S100 includes an input process S102. In the input process S102, an image from the camera 110 is captured, and outputs or information from other sensors are also taken in.

Next, the captured image is preprocessed (S104). The preprocessing includes binarization, labeling, and the like, and extracts light spots.

It is then determined whether the current frame falls in a low-speed task period (S106); if so (yes in S106), the low-speed task S200 is executed. For example, the low-speed task is executed once every six frames. If not (no in S106), the remaining high-speed task is executed.

In the low-speed task S200, the tasks with later deadlines are processed. First, straight travel or turning is determined (S202): straight travel, right turning, or left turning is judged based on sensor information and a comparison between past and current frames, and in the case of turning, the curvature and the like are calculated.

Next, the distant area (the boundary described above) is specified based on the result of the straight travel/turning determination of S202 (S204). As described above, this boundary is used to assign each light spot to either the task with the later deadline or the task with the earlier deadline.

Next, the distant light spots contained in the distant area are extracted, and feature data including position, shape, color, brightness, and the like are calculated (S206). The distant light spots are then processed with a predetermined algorithm (S208) to determine distant vehicles (S210).

For example, the process S208 may include tracking a distant light spot and acquiring its trajectory. Specifically, the trajectory is obtained by linking light spots with matching feature data across multiple frames.

Then, in process S210, when the trajectory matches that of a distant vehicle, the light spot may be determined to be a distant vehicle. In this case, the trajectory of the distant light spot is tracked once every six frames.
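The linking step of S208 might be sketched as below. The (x, y, brightness) feature layout, the tolerance value, and the greedy nearest-match strategy are all illustrative assumptions, not the patented algorithm.

```python
def link_spots(prev_spots, curr_spots, tol=2.0):
    """Link light spots across two frames by matching feature data.

    Each spot is a (x, y, brightness) tuple; a previous spot is linked to the
    closest unused current spot whose per-feature difference is within `tol`.
    Repeating this over successive frames accumulates per-spot trajectories.
    Returns a list of (prev_index, curr_index) links.
    """
    links = []
    used = set()
    for i, p in enumerate(prev_spots):
        best, best_d = None, tol
        for j, c in enumerate(curr_spots):
            if j in used:
                continue
            # Crude feature distance: worst per-component difference.
            d = max(abs(p[k] - c[k]) for k in range(3))
            if d <= best_d:
                best, best_d = j, d
        if best is not None:
            used.add(best)
            links.append((i, best))
    return links
```

A real implementation would likely also carry spot identity forward when a spot briefly disappears, and weight position and brightness differences separately.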

When the low-speed task S200 is completed, the remaining processing of the high-speed task S100 is executed (S108 to S114). The remaining processing involves vehicle determination for the light spots outside the distant area.

First, high-luminance regions outside the distant area are extracted (S108). Feature values (position, area, center of gravity, brightness, color, etc.) are then calculated for the extracted high-luminance regions (S110), and the presence or absence of a vehicle is determined based on the high-luminance regions and their feature values (S112). Finally, information on the vehicles detected in processes S210 and S112 is output (S114).
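The extraction and feature-value steps S108 to S110 can be illustrated with a standard binarize-and-label routine. This is a generic sketch (fixed threshold, 4-connectivity, grayscale list-of-lists input), not the system's actual image pipeline.

```python
def extract_regions(img, threshold=200):
    """Binarize a grayscale image and label 4-connected high-luminance
    regions, returning per-region feature values (area, centroid, mean
    brightness) as in steps S108-S110."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for y in range(h):
        for x in range(w):
            if img[y][x] >= threshold and not seen[y][x]:
                # Flood-fill one connected high-luminance region.
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and not seen[ny][nx] \
                                and img[ny][nx] >= threshold:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                area = len(pixels)
                cy = sum(p[0] for p in pixels) / area
                cx = sum(p[1] for p in pixels) / area
                mean = sum(img[p[0]][p[1]] for p in pixels) / area
                regions.append({"area": area, "centroid": (cy, cx), "mean": mean})
    return regions
```

The vehicle determination of S112 would then compare these feature values (e.g. paired spots of similar size and brightness) against patterns expected of headlamps and tail lamps.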

The execution order of the processes in the flowchart of Fig. 4 can be rearranged as long as no problems arise in the processing. For example, processes S106 and S200 may be executed at the end of process S100.
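The per-frame control flow of Fig. 4 can be summarized in a short sketch. All function bodies here are stubs standing in for the real sub-processes (spots are represented by their angles, and the 2.5° boundary is the example value from earlier); only the branching structure mirrors the flowchart.

```python
# Hypothetical sketch of the per-frame flow (S100/S200); names and data
# shapes are illustrative stubs, not the actual implementation.
LOW_SPEED_PERIOD = 6  # the low-speed task runs once every six frames

def preprocess(image):                        # S102-S104: binarize, label
    return image                              # stub: input is already a spot list

def run_low_speed_task(spots, state):         # S200: later-deadline sub-processes
    state["distant"] = [s for s in spots if s < 2.5]   # S206-S210 (stub)

def run_high_speed_task(spots, state):        # S108-S112: earlier-deadline work
    return [s for s in spots if s >= 2.5]     # near-area spots (stub)

def process_frame(frame_index, image, state):
    spots = preprocess(image)
    if frame_index % LOW_SPEED_PERIOD == 0:   # S106: low-speed period?
        run_low_speed_task(spots, state)
    near = run_high_speed_task(spots, state)
    return near + state.get("distant", [])    # S114: merged vehicle info
```

Note that between low-speed periods the distant-vehicle result is simply reused, which is exactly why the later deadline is acceptable for slowly changing targets.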

Fig. 5 is a timing chart explaining the operation of the in-vehicle camera system 100. In this example, the execution cycle of the high-speed task equals the frame period of 33 ms, while the execution cycle of the low-speed task is 200 ms, so one out of every six consecutive frames is its processing target. In Fig. 5, when the first frame number is denoted n, the (n + k × 6)-th frames (k = 0, 1, …) are the processing targets of the low-speed task. Generalizing, the low-speed task can target the (n + k × m)-th frames.

The high-speed task for the i-th frame is scheduled within part of the following (i + 1)-th frame period.

The low-speed task for the (n + k × m)-th frame is scheduled into the idle time slots not occupied by the high-speed task during the following m frame periods, from the (n + k × m + 1)-th to the (n + (k + 1) × m)-th. In the example of Fig. 5, m = 6, and the low-speed task for the (n + k × 6)-th frame is executed during the (n + k × 6 + 1)-th to (n + k × 6 + 5)-th frame periods.
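A small helper makes the slot assignment concrete: given a frame index, it returns which captured frame's low-speed task may occupy the idle slots in that frame period. The defaults n = 0 and m = 6 follow the Fig. 5 example; the function name is hypothetical.

```python
def low_speed_source_frame(i, n=0, m=6):
    """Return the frame whose low-speed task may run in idle slots during
    frame period i, following the rule that the low-speed task for frame
    n + k*m executes during frame periods n + k*m + 1 .. n + (k+1)*m.

    Returns None for frame periods at or before the first target frame n.
    """
    if i <= n:
        return None
    k = (i - n - 1) // m
    return n + k * m
```

For example, with n = 0 and m = 6, frame periods 1 through 6 service the low-speed task captured at frame 0, and periods 7 through 12 service the one captured at frame 6.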

The in-vehicle camera system 100 described above can be used particularly favorably for a vehicle lamp. Fig. 6 is a block diagram of the vehicle lamp 300 of the embodiment. The vehicle lamp 300 includes a light distribution pattern generating unit 310 and a light distribution variable lamp 320 in addition to the vehicle-mounted camera system 100.

The light distribution variable lamp 320 may employ a known technique or a technique that becomes available in the future. For example, array-type and scanning-type light distribution variable lamps have been proposed. An array-type lamp includes a plurality of semiconductor light sources arranged in an array and controls the on/off state and brightness of each source individually. A scanning-type lamp includes a semiconductor light source and a scanning optical system that scans the emitted light across the area in front of the vehicle, controlling the on/off state and brightness of the light source in synchronization with the scanning operation.

Information on the preceding vehicles detected by the in-vehicle camera system 100 is supplied to the light distribution pattern generating unit 310. The light distribution pattern generating unit 310 generates a light distribution pattern PAT composed of a combination of light-blocking (or dimmed) regions and irradiated regions, based on the position information of the vehicles ahead.
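A simplified, one-dimensional version of this pattern generation might look as follows. The beam width, shading margin, and angular step are all assumed values, and a real ADB pattern is two-dimensional and supports dimming rather than only on/off segments.

```python
def make_pattern(vehicle_angles_deg, beam_half_width_deg=20.0,
                 shade_margin_deg=1.0, step_deg=1.0):
    """Sketch of light distribution pattern generation (block 310).

    Builds a 1-D pattern over the beam's angular range as a list of
    (angle, on) pairs, turning off every segment within `shade_margin_deg`
    of a detected vehicle so that no glare is imparted to it.
    """
    pattern = []
    a = -beam_half_width_deg
    while a <= beam_half_width_deg:
        on = all(abs(a - v) > shade_margin_deg for v in vehicle_angles_deg)
        pattern.append((a, on))
        a += step_deg
    return pattern
```

An array-type lamp would map each (angle, on) pair to one semiconductor light source; a scanning-type lamp would modulate the source in synchronization with the scan position.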

The light distribution pattern generating unit 310 may be mounted on a processor different from the processor 120 of the in-vehicle camera system 100, or may be mounted on the same processor.

The present invention has been described above based on the embodiments. It should be understood by those skilled in the art that this embodiment is merely an example, and various modifications are possible in the combination of the respective constituent elements or the respective processes, and such modifications are also within the scope of the present invention. Hereinafter, such a modification will be described.

The classification into tasks with earlier deadlines (high-speed tasks) and tasks with later deadlines (low-speed tasks) described above is merely an example and is not limiting. For example, when the processor has spare processing capacity, the sub-processes for the straight travel/turning determination (S202 in Fig. 4) and distant area detection (S204) may be assigned to the low-speed task, while the sub-processes targeting distant light spots (S206, S208, S210) may be assigned to the high-speed task.

Although the present invention has been described using specific terms based on the embodiments, the embodiments merely illustrate one aspect of the principle and application of the present invention, and many modifications and changes of arrangement are permitted without departing from the spirit of the present invention defined in the claims.

[ description of reference numerals ]

100 vehicle-mounted camera system

110 camera

120 processor

122 preprocessing unit

124 task assigning unit

126 high-speed task execution unit

128 low-speed task execution unit

130 output unit

300 vehicle lamp

310 light distribution pattern generating unit

320 light distribution variable lamp

[ Industrial availability ]

The present invention relates to an in-vehicle camera system.
