Sensing device and method for detecting distance information for a vehicle

Document number: 1926658 · Publication date: 2021-12-03

Note: this invention, "Sensing device and method for detecting distance information for a vehicle" (用于车辆的用于检测距离信息的传感装置和方法), was created by T·鲁查茨 and S·沃伦伯格 on 2020-01-15. Abstract: The invention relates to a sensing device (290) and a method for detecting distance information for a vehicle (10), wherein the distance information is detected on the basis of strip-shaped regions (191, 210) and distance histograms formed for them, more precisely with respect to the intersection regions of these strip-shaped regions (191, 210). Provision is made here for the detected strip-shaped regions (191, 210) of the scene each to correspond to a strip-shaped region of the sensor (291) of the sensing device (290), and for the light received therein to be determined simultaneously for a plurality of corresponding strip-shaped regions.

1. A sensing device (290) for a vehicle (10) for detecting distance information, the sensing device having:

-a sensor (291) set up to receive light (16) reflected by a scene in the surroundings;

-a processing unit (292) being set up to determine a plurality of first distance histograms (191) from the received light (16), wherein a respective first distance histogram (191) of the plurality of first distance histograms (191) is assigned a respective first strip-shaped region of the scene, wherein the first distance histogram (191) comprises intensities of reflections within a range of distances caused by objects in the assigned first strip-shaped region;

and the processing unit is further set up to determine a plurality of second distance histograms (201) from the received light (16), wherein a respective second distance histogram (201) of the plurality of second distance histograms (201) is assigned a respective second strip-shaped region of the scene, wherein the second distance histogram (201) comprises intensities of reflections within a range of distances caused by objects in the assigned second strip-shaped region;

and the processing unit is further set up to determine distance information of regions of the scene from the plurality of first distance histograms (191) and the plurality of second distance histograms (201), wherein the regions of the scene comprise an intersection region of one of the first strip-shaped regions and one of the second strip-shaped regions,

wherein the strip-shaped regions of the scene correspond to the strip-shaped regions of the sensor (291), respectively,

characterized in that the sensing device (290) is set up such that the light received therein is determined simultaneously for a plurality of corresponding strip-shaped regions.

2. The sensing device (290) according to claim 1,

wherein the sensor (291) is set up such that the light (16) received therein is determined simultaneously for at least 50% of the corresponding strip-shaped regions.

3. The sensing device (290) according to claim 1 or 2,

wherein the corresponding strip-shaped regions of the sensor (291) are subdivided into regions in which the received light (16) is determined simultaneously and regions in which the received light (16) is not determined simultaneously.

4. The sensing device (290) according to claim 3,

wherein the regions in which the received light (16) is not determined simultaneously lie at least partially in an edge region of the sensor (291).

5. The sensing device (290) according to any of the preceding claims,

wherein the sensor (291) comprises a sensor matrix having sensor elements (300) arranged in rows (Z) and columns (S), which are each set up for receiving light (16), wherein the rows (Z) correspond to a first strip-shaped region of the scene and the columns (S) correspond to a second strip-shaped region of the scene.

6. The sensing device (290) according to claim 5,

wherein the sensor elements (300) in a row (Z) and a column (S) are interconnected separately, and the distance histogram is determined from the total signal of the correspondingly interconnected sensor elements (300).

7. The sensing device (290) according to any of claims 5 to 6,

wherein at least those sensor elements (300) whose received light (16) is determined simultaneously each comprise a current mirror device (316), which is connected to the row lines (304-308) and to the column lines (310-314), to which further current mirror devices (316) of further sensor elements (300) are also connected.

8. The sensing device (290) according to any of claims 5 to 7,

wherein at least those sensor elements (300) whose received light (16) is determined simultaneously each comprise at least two photodetector elements (320, 340), each of which receives light (16), and wherein one of the photodetector elements (320, 340) is connected to the row line (304-308) and the other photodetector element is connected to the column line (310-314).

9. The sensing device (290) of claim 8,

wherein more than two photodetector elements (320, 340) are provided and said photodetector elements are combined into two groups, wherein the photodetector elements (320, 340) of one group are connected to the row lines (304-308) and the photodetector elements (320, 340) of the other group are connected to the column lines (310-314), and wherein the at least two photodetector elements (320, 340) of one group enclose between them the at least one photodetector element (320, 340) of the other group.

10. A method for detecting distance information for a vehicle (10), the method having:

-receiving light (16) reflected by a scene in the surrounding environment with a sensor (291);

-determining a plurality of first distance histograms (191) from the received light (16), wherein a respective first distance histogram (191) of the plurality of first distance histograms (191) is assigned a respective first strip-shaped region of the scene, wherein the first distance histogram (191) comprises intensities of reflections within a distance range caused by objects in the assigned first strip-shaped region;

-determining a plurality of second distance histograms (201) from the received light (16), wherein a respective second distance histogram (201) of the plurality of second distance histograms (201) is assigned a respective second strip-shaped region of the scene, wherein the second distance histogram (201) comprises intensities of reflections within a distance range caused by objects in the assigned second strip-shaped region;

-determining distance information for regions of the scene from the plurality of first distance histograms (191) and the plurality of second distance histograms (201), wherein the regions of the scene comprise an intersection region of one of the first strip-shaped regions and one of the second strip-shaped regions,

wherein the strip-shaped regions of the scene correspond to the strip-shaped regions of the sensor (291), respectively,

characterized in that the light (16) received therein is determined simultaneously for a plurality of corresponding strip-shaped regions.

Technical Field

The present invention relates to a sensor device and a method for detecting distance information for a vehicle, in particular for a motor vehicle such as a passenger car or a truck. The detection principle can generally be based on the detection of visible or invisible light, that is to say electromagnetic radiation.

Background

In the context of vehicles, it is known to detect a scene inside the vehicle or in the surroundings of the vehicle by means of an optical sensor. In particular, it is known to inject light having predetermined characteristics into a scene and to receive the light reflected by the scene by means of a light-sensitive sensor. From the received light, measurement signals can be generated and evaluated in order to obtain distance information or, in other words, spacing information.

This information can then be used by various driver assistance systems, as described in paragraph [0002] of the applicant's prior publication DE 102013002671 A1. DE 102013002671 A1, with reference to WO 2008/1547361 A1, also discloses that the lighting system of the vehicle, which is available anyway, and in particular an LED-based lighting system, can be used as a light source for generating and injecting the light, which can then be detected in reflected form by a sensor (see, for example, paragraph [0003] of DE 102013002671 A1). The lighting system may (also within the scope of the present application) comprise, for example, daytime running lights, high beams, low beams, turn signals, fog lights, and the like.

Furthermore, paragraph [0007] of DE 102013002671 A1 and the subsequent detailed description recognize that strip-shaped regions can be formed in order to shorten the evaluation time and reduce the requirements on the sensor, a distance histogram being determined for each of these regions. Based on these distance histograms, distance information can then be determined for the intersection points of the strip-shaped regions.

The present application builds on the findings of DE 102013002671 A1, so that the teaching of this document, and in particular its general description, the subject matter of its claims, and primarily the depictions of its Figs. 17 to 22, are incorporated herein by reference in their entirety.

It has nevertheless been recognized that optimal distance detection is not achieved under the teaching of DE 102013002671 A1.

Disclosure of Invention

The object of the invention is therefore to further improve the strip-based light detection of DE 102013002671 A1.

This object is achieved by a sensing device according to claim 1 and a method according to claim 10. Advantageous embodiments are specified in the dependent claims. It goes without saying that the foregoing description also applies to the present invention unless otherwise indicated or apparent.

Unlike the solution of DE 102013002671 A1, which provides for sequential measurement-value detection for the individual strip-shaped regions (as follows from the measurement-time specification at the end of its paragraph [0067]), the invention provides that the measured values of these strip-shaped regions are detected at least partially simultaneously, i.e. in parallel. For example, all of the strip-shaped regions, or at least 10% of the total number of strip-shaped regions provided, can be detected or read out in parallel.

Measured values are understood here to mean, in particular, measurement signal values generated at predetermined points in time and obtained for the corresponding strip-shaped regions. A measured value or measurement signal value can also be a distribution of values with respect to a specific point in time. The measured value is preferably one of the above-mentioned distance histograms or can be used to form such a distance histogram.

One advantage of the disclosed solution is that the observed scene can be detected at least partially, and preferably completely, simultaneously. The simultaneously detected light, or the measured values generated simultaneously from it, thus depict the scene at the same point in time. This reduces the motion blur that can occur with sequential detection when there is relative motion between the sensor and the scene. In other words, in the solution proposed herein motion blur can occur at most during the comparatively short parallel exposure or readout time, whereas in the known solution it can occur during the accumulated exposure time of all sequentially evaluated strip-shaped regions.
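The timing argument can be made concrete with a small sketch; the strip count and exposure time below are illustrative assumptions, not values from the disclosure:

```python
def blur_window_sequential(n_strips, t_exposure):
    # Sequential readout: relative motion can blur the result during the
    # accumulated exposure time of all strip-shaped regions.
    return n_strips * t_exposure

def blur_window_parallel(t_exposure):
    # Parallel readout: motion can blur the result only during a single,
    # comparatively short exposure window.
    return t_exposure

# Hypothetical example: 64 strip-shaped regions, 1 ms exposure each.
seq = blur_window_sequential(64, 1e-3)  # worst-case blur window
par = blur_window_parallel(1e-3)
```

With these assumed numbers, the worst-case blur window shrinks by a factor equal to the number of strip-shaped regions read out in parallel.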

Another advantage is that, owing to the parallel measurement-value detection for at least some of the strip-shaped regions, the injected light is utilized better. In contrast to previous solutions, comparatively short injected light pulses or pulse trains are now sufficient, between which comparatively long pauses can also be made. This opens up savings potential on the hardware side, in particular with respect to the illumination source used.

It should be noted that, in order to actually implement the teaching of DE 102013002671 A1, a plurality of detection cycles with associated illumination cycles may be necessary, for example one detection cycle with an associated illumination cycle for the first strip-shaped regions (for example in the row direction) and the next detection cycle with an associated illumination cycle for the second strip-shaped regions (for example in the column direction). In the present solution, by contrast, a single detection and illumination cycle can be sufficient, since the light of a plurality of strip-shaped regions is detected in parallel.

In detail, an (optical) sensing device for a vehicle for detecting distance information is proposed, the sensing device having:

- an (optical) sensor which is set up to receive (visible or invisible) light reflected by a scene in the surroundings (of the sensing device);

- an (e.g. electronic) processing unit which is set up to determine a plurality of first distance histograms from the received light, wherein a respective first distance histogram of the plurality of first distance histograms is assigned a respective first strip-shaped region of the scene, wherein the first distance histogram comprises the intensities of reflections within a distance range caused by objects in the assigned first strip-shaped region;

and the processing unit is further set up to determine a plurality of second distance histograms from the received light, wherein a respective second distance histogram of the plurality of second distance histograms is assigned a respective second strip-shaped region of the scene, wherein the second distance histogram comprises intensities of reflections within a distance range caused by objects in the assigned second strip-shaped region;

and the processing unit is further set up to determine distance information for a region of the scene from the plurality of first distance histograms and the plurality of second distance histograms, wherein the region of the scene comprises an intersection region of one of the first strip-shaped regions and one of the second strip-shaped regions,

wherein the strip-shaped regions of the scene correspond to the strip-shaped regions of the sensor, respectively,

and wherein the sensing device (and in particular its processing unit) is set up such that the light received therein is determined simultaneously for a plurality of corresponding strip-shaped regions (in other words, these corresponding strip-shaped regions are read out and/or their measured values are determined simultaneously).

It can thus be provided that there is an assignment or correspondence between the regions of the scene, i.e. the strip-shaped regions used to detect the scene, and the strip-shaped regions of or on the sensor. In particular, provision can be made for the scene to be detected by means of (e.g. virtual) strip-shaped regions of the type described above, and for these strip-shaped regions to be formed or provided by corresponding regions of the sensor, in particular by the sensor elements present therein as set forth below. In other words, the first and second strip-shaped regions may be virtual, but may result from a corresponding grouping or arrangement of sensor elements within the sensor which subdivides the scene into correspondingly detectable strip-shaped regions. In particular, these strip-shaped regions may correspond to the rows and columns of a sensor subdivided in the form of a matrix (that is to say, the corresponding strip-shaped regions of the sensor may be its rows and columns).

The sensor can typically enable the scene to be mapped by means of a two-dimensional detection area which is subdivided into corresponding strip-shaped regions or which can be assigned to the above-mentioned strip-shaped regions that are to be used to detect the scene.
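As an illustration of this mapping, the row and column sums of a toy sensor frame can play the role of the strip-shaped measurement signals; the 4 x 4 frame and its values are invented for the example:

```python
# Toy 4x4 "sensor frame": entry [z][s] is the intensity received by the
# sensor element in row z and column s; all values are invented.
frame = [
    [0, 1, 0, 0],
    [2, 5, 1, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 3],
]

# First strip-shaped regions <-> sensor rows,
# second strip-shaped regions <-> sensor columns.
row_signals = [sum(row) for row in frame]        # one value per row line
col_signals = [sum(col) for col in zip(*frame)]  # one value per column line

# A bright reflection dominates row 1 AND column 1; the intersection
# region of these two strips localizes it to a single scene cell.
```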

The sensor may comprise a plurality of sensor elements, each having at least one photodetector (or a plurality of photodetector elements). These sensor elements may be SiPMs (silicon photomultipliers), which are preferably constructed from a plurality of smaller photodetectors (for example from so-called SPADs, single-photon avalanche diodes). Alternatively, the sensor elements may be formed by so-called PIN diodes.

These photodetectors may be arranged in a matrix or grid and thus have mutually perpendicular row and column directions, as do the sensor and the detection area defined by it.

The determination of the received light of the corresponding strip-shaped regions can also be referred to as reading out or evaluating these regions, wherein the latter preferably also comprises forming a corresponding distance histogram (or, in other words, region-by-region measurement-value generation).

All measured values or measurement signals can generally be determined in a time-resolved manner. The sensing device may also comprise a memory device in which the (preferably time-resolved) measurement signals can be stored. Preferably, a time-resolved signal is obtained for each pixel and/or sensor element; preferably, the time-resolved sum signals of rows and columns can be obtained simultaneously. Such a sum signal can then be converted into a distance-resolved signal, i.e. into a distance histogram, using the methods described herein.
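The conversion of a time-resolved sum signal into a distance-resolved signal can be sketched using the round-trip relation d = c * t / 2; the bin values and function names are illustrative:

```python
C_LIGHT = 299_792_458.0  # speed of light in m/s

def to_distance(t_seconds):
    # A reflection arriving after round-trip time t corresponds to the
    # distance c * t / 2 (the light travels to the object and back).
    return C_LIGHT * t_seconds / 2.0

def distance_histogram(time_bins_s, sum_signal):
    # Pair each intensity sample of a row/column sum signal with the
    # distance of its time bin.
    return [(to_distance(t), i) for t, i in zip(time_bins_s, sum_signal)]

# Hypothetical 4-bin sum signal sampled every 100 ns.
bins = [0.0, 1e-7, 2e-7, 3e-7]
signal = [0, 3, 9, 1]
hist = distance_histogram(bins, signal)  # intensity peak near 30 m
```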

The parallelism or simultaneity of the readout can be achieved in that each strip-shaped region is assigned its own (electrical) line, into which all sensor elements within that strip-shaped region (or within the corresponding strip-shaped region of the sensor) feed their signals. The lines may be row and column lines as set forth below.

It can be provided that all corresponding strip-shaped regions (of the sensor) can be read out simultaneously, but at least all sensor regions corresponding to the first strip-shaped regions, or all corresponding to the second strip-shaped regions, can be read out simultaneously.

According to one embodiment, provision can be made for the light received therein to be determined simultaneously for at least 50% (and preferably at least 25% or at least 10%) of these corresponding strip-shaped regions.

In general, provision can also be made for the corresponding strip-shaped regions of the sensor to be (e.g. virtually) subdivided into regions in which the received light is determined simultaneously and regions in which it is not. In the latter regions, sequential evaluation or readout can be provided instead.

In this respect, provision may also be made for the regions in which the received light is not determined simultaneously to lie at least partially within the (at least one) edge region of the sensor. An edge region of the sensor can be understood to mean, for example, a row and/or column region (or a region in general) which comprises at most 10% or at most 5% of the total number and/or total area of the rows and/or columns of the sensor (or of the corresponding strip-shaped regions in general), which is preferably remote from the geometric center of the sensor, and which in particular additionally comprises the rows and/or columns forming the outermost edges (or, in general, the strip-shaped regions forming the outermost edges).
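One possible reading of this edge-region criterion can be sketched as follows; the exact split of the 10% budget between the two sensor edges is an assumption:

```python
def edge_row_indices(n_rows, fraction=0.10):
    # The edge region comprises at most `fraction` of all rows, split
    # between the two sensor edges, and always includes the outermost
    # row on each side (one possible reading of the criterion).
    k = max(1, int(n_rows * fraction) // 2)
    return set(range(k)) | set(range(n_rows - k, n_rows))

edges = edge_row_indices(100)  # rows 0-4 and 95-99 for a 100-row sensor
```

Rows outside this set would then be read out simultaneously, while the edge rows could be evaluated sequentially.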

The advantage of this variant is that simultaneous and thus more accurate detection can be reserved for the central region of the sensor, while less precise sequential detection can take place in the edge regions, in which comparatively fewer events relevant to the vehicle are observed.

As already indicated, the sensor may comprise a sensor matrix having sensor elements arranged in rows and columns, which are each set up to receive light, wherein the rows correspond to a first strip-shaped region of the scene and the columns correspond to a second strip-shaped region of the scene.

In general, the sensor, and in particular its sensor elements, may each detect the (light) intensity of the received light. The sensor elements may provide pixel values and/or define individual pixels or image points of the sensor matrix. Simultaneously recorded pixel values, but also sequentially recorded pixel values relating to the same detection process, may be considered jointly and used, for example, to derive a common distance histogram.

The sensor elements in these rows and columns are preferably interconnected separately (e.g. by connection to a common electrical (signal) line and in particular to a common row or column line), and the distance histogram is preferably determined from the total signal of the correspondingly interconnected sensor elements. By means of the interconnection, corresponding first and second strip-shaped regions of the sensor may be formed.

According to a further embodiment, at least those sensor elements whose received light is determined simultaneously each comprise a current mirror device, which is connected to the row line and to the column line, to which the current mirror devices of further sensor elements can also be connected. In this way, the measurement signals obtained by the individual sensor elements can be combined row-wise or column-wise, and a corresponding distance histogram can then be formed therefrom. This variant is particularly suitable when the sensor elements are SiPMs. In general, it is a reliable and advantageous way of forming or detecting the strip-shaped regions. The current mirror device can be formed in a conventional manner by two, for example parallel, (semiconductor) transistors and is explained below by way of example with reference to the figures.

Another variant provides that the sensor elements each comprise at least one PIN detector, which is connected to two resistors in order to feed the current obtained from it into the lines assigned to the strip-shaped regions (in particular into the signal lines and the row lines). The PIN detector can be connected to a transimpedance amplifier, and the output voltage of the transimpedance amplifier can be applied to the corresponding resistors.

Provision may also be made for: at least those sensor elements whose received light is determined simultaneously (that is to say read simultaneously) comprise at least two photodetector elements which receive light separately (that is to say are able to read separately or provide measurement signals separately), and wherein one of these photodetector elements is connected to a row line and the other photodetector element is connected to a column line. Especially when the sensor elements are sipms, these photodetector elements may be constructed as the aforementioned SPADS. In particular, but not limited to the latter case, the sensor elements may comprise up to sixteen or up to thirty-two photodetector elements.

By means of this variant, additional hardware elements, such as the current mirrors mentioned above, can be saved, and a reliable parallel readability of the strip-shaped regions can still be achieved.

Provision may also be made for: more than two photodetector elements are provided for each sensor element (e.g. sixteen or thirty-two, see above) and the photodetector elements are combined into two, preferably equally large groups, wherein the photodetector elements of one group are connected to a row line and the photodetector elements of one group are connected to a column line, and wherein at least two photodetector elements of one group enclose at least one photodetector element of the other group between them. In other words, the photodetector elements of the two groups may be arranged nested with one another and/or alternately (preferably in the row direction and in the column direction). That is, typically the photodetector elements of the two groups may be arranged in a checkerboard pattern. By means of a corresponding arrangement in groups, a high resolution can be ensured despite the subdivision of the individual sensor elements into individual detection regions (that is to say into individual photodetector elements).

The invention also relates to a method for detecting distance information for a vehicle, having:

-receiving, with a sensor, light reflected by a scene in the surroundings;

-determining a plurality of first distance histograms from the received light, wherein a respective first distance histogram of the plurality of first distance histograms is assigned a respective first strip-shaped region of the scene, wherein the first distance histogram comprises the intensities of reflections within a distance range caused by objects in the assigned first strip-shaped region;

-determining a plurality of second distance histograms from the received light, wherein a respective second distance histogram of the plurality of second distance histograms is assigned a respective second strip-shaped region of the scene, wherein the second distance histogram comprises the intensities of reflections within a distance range caused by objects in the assigned second strip-shaped region;

-determining distance information for a region of the scene from the plurality of first distance histograms and the plurality of second distance histograms, wherein the region of the scene comprises an intersection region of one of the first strip-shaped regions and one of the second strip-shaped regions,

wherein the strip-shaped regions of the scene correspond to the strip-shaped regions of the sensor, respectively,

and wherein the light received therein is determined simultaneously for a plurality of corresponding strip-shaped regions (in other words, a plurality of corresponding strip-shaped regions of the sensor are read out simultaneously).

All the above and the following explanations regarding the features of the sensing device are equally applicable to the same features of the method. In particular, the method may comprise any other steps and any other features in order to provide all the functions, operating states or effects described herein in connection with the sensing device. In particular, the method may be implemented with a sensing device according to any of the embodiments above and below.

According to a further embodiment of the sensing device and the method, the scene is illuminated with a light-emitting diode (LED) light source of the vehicle. The LED light source may comprise a lighting device of the vehicle for illuminating the surroundings or the interior space of the vehicle. The lighting devices may include, for example, daytime running lights, high beams, low beams, turn signals, fog lights, and the like. The LED light source can be operated using a modulation method, and a distance histogram can be determined from the modulation signal and the received signal of the sensor matrix. The modulation method may, for example, be a frequency-modulated continuous-wave (FMCW) method, in which the frequency used to modulate the LED light source is changed from an initial frequency to a final frequency within a certain time; the modulation frequency is preferably changed continuously from the initial frequency to the final frequency. A random frequency modulation method may also be used, in which the frequency with which the LED light source is modulated is varied randomly or pseudo-randomly. The LED light source may furthermore be operated with a single-frequency modulation method, in which the frequency for modulating the LED light source is constant. Finally, the LED light source can be operated with a pulse modulation method. Depending on the modulation method used, different evaluation methods can be employed to determine the distance histogram. For example, correlation methods can be used which correlate the signal used to modulate the LED light source with the received signal of the sensor matrix. In another evaluation method, the modulation signal can be mixed with the received signal and the distance determined from the mixing frequency.
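As a minimal sketch of the correlation-based evaluation, the following correlates an invented pseudo-random on/off modulation sequence with an ideal, noise-free echo; the sample period and all sequence values are assumptions, not parameters from the disclosure:

```python
C_LIGHT = 299_792_458.0  # speed of light in m/s

def cross_correlate(mod, rx):
    # Correlate the modulation signal with the received signal for every
    # non-negative lag; the strongest lag estimates the echo delay.
    n = len(mod)
    return [sum(mod[i] * rx[i + lag] for i in range(n - lag))
            for lag in range(n)]

# Invented pseudo-random on/off sequence and an ideal echo delayed by
# three samples (no noise, no attenuation).
mod = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0]
delay = 3
rx = [0] * delay + mod[:len(mod) - delay]

corr = cross_correlate(mod, rx)
best_lag = corr.index(max(corr))

dt = 1e-9                                # assumed sample period: 1 ns
distance = C_LIGHT * best_lag * dt / 2.0  # round-trip -> one-way distance
```

The correlation values over all lags form exactly the kind of distance histogram described above, here for a single strip-shaped region.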
In principle, a distance histogram represents a distance-resolved echo map generated by the object or objects in a strip-shaped detection region. These distance-resolved echo maps can be processed into spatially resolved images in a manner similar to computed tomography, each position of the image being assigned a corresponding distance. All of the methods mentioned can be applied even when at least individual strip-shaped regions are read out simultaneously.
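The tomography-like assignment can be sketched in a strongly simplified form, with the histograms reduced to sets of echo distances; the scene values are invented for the example:

```python
def assign_distances(row_hists, col_hists):
    # Back-projection in the spirit of computed tomography: a distance
    # that appears both in the histogram of row z and in the histogram
    # of column s is assigned to their intersection region (z, s).
    image = {}
    for z, rh in enumerate(row_hists):
        for s, ch in enumerate(col_hists):
            common = rh & ch
            if common:
                image[(z, s)] = min(common)  # keep the nearest common echo
    return image

# Invented scene: an object at 30 m seen in row 0 / column 0 and another
# at 12 m in row 1 / column 2; histograms reduced to sets of distances.
row_hists = [{30.0}, {12.0}, set()]
col_hists = [{30.0}, set(), {12.0}]
image = assign_distances(row_hists, col_hists)
```

Real histograms carry intensities per distance bin rather than bare sets, and ambiguities between multiple objects at the same distance need additional treatment; the sketch only shows the intersection principle.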

Since these distance histograms each cover an entire row or column of the scene, sensors with a lower resolution (row or column sensors) suffice; or, if a matrix sensor is used, only a small number of row and column measurements is required instead of one measurement per image point. A high resolution can thus be achieved with a small number of sensors or measurements.
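The saving can be quantified directly: a matrix of N rows and M columns requires N + M strip measurements instead of N * M per-pixel measurements; the 64 x 64 example is illustrative:

```python
def strip_measurements(n_rows, n_cols):
    # One distance histogram per row plus one per column.
    return n_rows + n_cols

def per_pixel_measurements(n_rows, n_cols):
    # One measurement for every image point.
    return n_rows * n_cols

# Hypothetical 64 x 64 detection grid.
saving = per_pixel_measurements(64, 64) / strip_measurements(64, 64)
```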

According to the invention, there is also provided an apparatus for detecting distance information for a vehicle. The apparatus comprises a light source which is designed to illuminate a scene in the surroundings of or within the vehicle. The light source is preferably a lighting device of the vehicle. Furthermore, the lighting device preferably comprises light-emitting diodes for generating the light, since light-emitting diodes can be modulated at a sufficiently high frequency to provide light that can be used for the determination of the distance histograms as described herein. The apparatus further comprises a sensing device of the above-mentioned type for receiving light that originates from the light source and is reflected by the scene. Finally, the apparatus comprises a processing unit which controls the light source and determines a plurality of first distance histograms and a plurality of second distance histograms of the above-mentioned type from the received light.

Drawings

Subsequently, the present invention will be described in detail with reference to the accompanying drawings.

Fig. 1 schematically shows a vehicle and an object in the surroundings of the vehicle according to an embodiment of the invention.

Fig. 2 shows the steps of a method for determining a distance to an object according to an embodiment of the invention.

Fig. 3 shows the steps of a method for determining the velocity of an object according to an embodiment of the invention.

Fig. 4 schematically shows a circuit of a light-emitting diode light source according to an embodiment of the invention, which is designed to emit light for distance measurement.

Fig. 5 schematically shows the arrangement of the components of the LED light source of Fig. 4 in a common semiconductor housing.

Fig. 6 shows the steps of a method for determining the distance of an object according to a further embodiment of the invention.

Fig. 7 shows a first detection region of a sensor of an apparatus for determining a position of an object according to an embodiment of the invention.

Fig. 8 shows a second detection area of a sensor of the device for determining the position of an object.

Fig. 9 shows an overlap of the first and second detection regions of Figs. 7 and 8, as detected by the sensors of the device for determining the position of an object according to an embodiment of the invention.

Fig. 10 shows the second detection area of Fig. 8 with additional blurring.

Fig. 11 shows the overlap of the first and second detection regions of Fig. 9 with the additional blurring of the second detection region.

Fig. 12 shows an emitter segment used by a light source of an apparatus for detecting a position of an object according to an embodiment of the present invention.

Fig. 13 shows a receiver section used by a sensor of an apparatus for determining a position of an object according to an embodiment of the invention.

Fig. 14 shows an overlap of the transmitter segment of fig. 12 and the receiver segment of fig. 13.

Fig. 15 shows a near field view produced by a staggered arrangement of light-emitting diodes according to an embodiment of the invention.

Fig. 16 shows a far field view of the transmitter segment of fig. 15.

Fig. 17 shows method steps of a further method for determining distance information according to an embodiment of the invention.

Fig. 18 shows a scene with objects in the surroundings of the vehicle.

Fig. 19 shows a distance histogram of the rows of the scene of fig. 18.

Fig. 20 shows a distance histogram of the columns of the scene of fig. 18.

Fig. 21 shows a vehicle according to an embodiment of the invention, which simultaneously measures the distance to a vehicle traveling ahead and transmits data.

Fig. 22 shows the steps of a method for determining the distance to an object and for transmitting transmission data according to an embodiment of the invention.

Fig. 23 shows a schematic section of an exemplary sensor device according to the invention, which carries out a method according to an embodiment of the invention.

Fig. 24 shows a schematic section of a sensor device according to a further embodiment of the invention.

Detailed Description

Fig. 1 shows a vehicle 10 with a device for determining distance information. The device comprises a light source 11 which is designed for illuminating an object 17 in the surroundings of the vehicle 10. The light source 11 may include, for example, a daytime running light, a dipped light, a turn signal, a tail light, a high beam, a fog light, or a backup light of the vehicle 10. The light source may also include one or more light emitting diodes that generate light for illuminating the surroundings of the vehicle 10 or generate signaling light, such as turn signal or brake lights. Furthermore, the light source 11 may also include an illumination device for illuminating the interior space of the vehicle 10, such as dashboard illumination or passenger compartment illumination. The apparatus for determining distance information further comprises: an optical sensor 12 for receiving light reflected by the object 17; and a processing unit 13 coupled to the light source 11 and the sensor 12. If, in the arrangement shown in fig. 1, the object 17 is located, for example, at a distance 18 in an area in front of the vehicle 10, the light 15 emitted by the light source 11 is reflected by the object 17 and received as reflected light 16 by the sensor 12. The operation of the device for determining distance information is described later with reference to fig. 2.

Fig. 2 shows a method 20 for the vehicle 10 for determining the distance 18 between the vehicle 10 and the object 17. In step 21, the light source 11 of the vehicle 10 is driven with a modulation signal. The modulation signal is generated by the processing unit 13. The light 15 emitted by the light source 11 is reflected by the object 17 and received by the sensor 12 as reflected light 16 (step 22). In step 23, a received signal is generated from the received reflected light 16. The received signal may comprise, for example, an analog or digital electrical signal. In step 24, the received signal is combined with the modulation signal in the processing unit 13. For example, the modulation signal and the received signal may be correlated or mixed, as will be described in detail later. In step 25, the distance 18 to the object 17 is determined from the combined signal, for example the correlation signal or the mixed signal. The distance to the object 17 determined in this way can be provided, for example, to a driver assistance system 14 of the vehicle 10. The driver assistance system 14 may comprise, for example, an adaptive cruise control system, a brake assistance system, a parking assistance system or a collision warning system. The object 17 may also be located in the interior of the vehicle 10 and be illuminated by a corresponding illumination device in the interior, and the reflected light from the object may be received with a corresponding sensor. In this way, the distance to an object in the interior of the vehicle can be determined, for example in order to recognize operating gestures (gesture control) or in order to detect the current position of an occupant's head, for example in the event of an accident, in order to trigger a corresponding protection mechanism, such as an airbag.
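The correlation evaluation of steps 21 to 25 can be sketched in a few lines. The following Python sketch is illustrative only: the pseudo-random modulation sequence, the 1 GHz sample rate and the 165 ns echo delay (roughly 25 m) are example values taken over from the description, and all function names are our own.

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def estimate_distance_m(modulation, received, sample_rate_hz):
    """Distance from the lag of the correlation maximum between the
    emitted modulation signal and the received signal."""
    corr = np.correlate(received, modulation, mode="full")
    lag = int(corr.argmax()) - (len(modulation) - 1)  # lag in samples
    delay_s = max(lag, 0) / sample_rate_hz            # round-trip time
    return delay_s * C / 2.0                          # light travels out and back

# Pseudo-random binary modulation sampled at 1 GHz, echoed with the
# 165 ns delay that corresponds to roughly 25 m.
rng = np.random.default_rng(0)
tx = rng.integers(0, 2, 4096).astype(float)
rx = np.concatenate([np.zeros(165), tx])[:len(tx)]
distance = estimate_distance_m(tx, rx, 1e9)
```

The peak of the correlation sits at the round-trip delay of 165 samples, which at 1 GHz corresponds to about 24.7 m.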

In order to be able to use the above-described method in a vehicle for different driver assistance systems, it may be necessary to use different transmission and reception methods for different applications. These methods may be selected, for example, according to the desired distance range or application. For this purpose, an operating state of the vehicle 10 can be determined and the transmission and reception method can be selected as a function of this operating state, i.e. the modulation method for generating the modulation signal and the evaluation method (for example mixing or correlation) are selected as a function of the operating state. The modulation method may comprise, for example, a frequency modulated continuous wave method, a random frequency modulation method, a single-frequency modulation method or a pulse modulation method. These methods will be described in detail later. The operating state of the vehicle may include, for example: the speed of the vehicle; an activation state of a light source of the vehicle, which indicates whether the light source is switched on for illuminating the surroundings of the vehicle or for outputting an optical signal; the direction of travel of the vehicle; previously determined position or distance information of objects in the surroundings of the vehicle; weather conditions in the surroundings of the vehicle; or the type of assistance system of the vehicle to which the distance information is provided.
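Purely as an illustration, the selection of a modulation method from the operating state could be sketched as a simple dispatch. The decision criteria follow the preferences stated later in this description (pulse modulation with the light source off, single-frequency modulation for short disturbance-free ranges, random frequency modulation under interference, FMCW otherwise); the flags and names are hypothetical simplifications.

```python
from enum import Enum, auto

class Modulation(Enum):
    FMCW = auto()
    RANDOM_FREQUENCY = auto()
    SINGLE_FREQUENCY = auto()
    PULSE = auto()

def select_modulation(light_source_on, interference_detected,
                      short_range_application):
    """Choose a modulation method from a (much simplified) operating
    state, following the preferences stated in the description."""
    if not light_source_on:
        # Invisible short pulses work even with the light source off.
        return Modulation.PULSE
    if short_range_application:
        # Interior or parking use: simple constant-frequency modulation.
        return Modulation.SINGLE_FREQUENCY
    if interference_detected:
        # Unique random signatures suppress other modulated sources.
        return Modulation.RANDOM_FREQUENCY
    # Default: FMCW offers the best signal quality.
    return Modulation.FMCW
```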

In the frequency modulated continuous wave method, also called FMCW (Frequency Modulated Continuous Wave) or chirp method, the modulation frequency is changed from an initial frequency to a final frequency within a certain time. Preferably, the modulation frequency is changed continuously from the initial frequency to the final frequency. As will be shown later, this method can be used not only for distance measurement but also for velocity measurement of the object 17. The generation of a modulation signal whose frequency changes continuously from an initial frequency to a final frequency over a certain time is known from the prior art, and the method can therefore be implemented simply, for example by playing out a synthetically generated waveform. With this method, the distance 18 to the object 17 can be measured continuously, so that the method is particularly suitable for continuously switched-on light sources 11. A frequency ramp is obtained by continuously changing the modulation frequency, and thus the transmission frequency of the light source 11, from the initial frequency to the final frequency. By mixing the transmitted signal with the received signal received by the sensor 12, not only the distance 18 of the object 17 but also its velocity can be measured directly. If a light-emitting diode (LED) with a typical response time of 5-10 nsec is used as light source 11, a modulation frequency of up to 100 MHz, for example, can be used. FMCW modulation may thus continuously sweep a transmit frequency from 10 MHz to 100 MHz, for example, over a time period of 0.5-400 μs. In the case of the frequency modulated continuous wave method (FMCW), the distance measurement can optionally be carried out by means of mixing or by means of a correlation method.

When the frequency modulated continuous wave method is used, a mixer may be used to compare the transmitted and received signals. An object at a particular distance produces a mixing frequency proportional to that distance. The spatial resolution for a plurality of objects is determined by the resolution of the frequency measurement and thus by the measurement time. Such a mixing method may be implemented, for example, as an analog circuit in an integrated circuit. If, for example, distances between 0 m and 40 m are to be measured, the light needs approximately 3.3 nsec/m x 40 m x 2 = 264 nsec for this distance out and back along arrows 15 and 16 of fig. 1. This yields a useful signal length of approximately 500 nsec for the FMCW signal. A modulation frequency of 10 MHz is therefore too low for this method, so that preferably a frequency offset between 50 and 100 MHz should be used, which varies linearly within 500 nsec. If the distance 18 between the vehicle 10 and the object 17 is, for example, 25 m, the received signal is delayed by 165 nsec with respect to the transmitted signal. As described above, the transmitted signal has a frequency slope of 50 MHz/500 nsec = 100 kHz/nsec due to the modulation. With a signal delay of 165 nsec, the received signal thus has a frequency 16.5 MHz lower than the transmitted signal. By mixing the transmitted signal with the received signal, a frequency of 16.5 MHz is obtained for the exemplary distance of 25 m. In general, a frequency of 0.66 MHz per meter of distance is obtained by this mixing.
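The mixing arithmetic of this paragraph can be reproduced directly: the beat frequency is the ramp slope multiplied by the round-trip delay. A minimal sketch, with function names of our own choosing:

```python
C = 299_792_458.0  # speed of light in m/s

def fmcw_beat_frequency_hz(distance_m, sweep_hz, sweep_time_s):
    """Beat frequency at the mixer output for an object at the given
    distance: the echo is delayed by the round-trip time of flight,
    so its instantaneous frequency lags the transmit ramp by
    slope * delay."""
    slope_hz_per_s = sweep_hz / sweep_time_s
    delay_s = 2.0 * distance_m / C
    return slope_hz_per_s * delay_s

# Example values from the text: 50 MHz sweep in 500 nsec, object at 25 m.
beat_25m = fmcw_beat_frequency_hz(25.0, 50e6, 500e-9)
beat_per_meter = fmcw_beat_frequency_hz(1.0, 50e6, 500e-9)
```

With the exact speed of light this yields about 16.7 MHz at 25 m and about 0.67 MHz per meter, matching the 16.5 MHz and 0.66 MHz/m obtained above with the rounded 3.3 nsec/m constant.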

In the case of distance measurement by means of the frequency-modulated continuous wave method, it is also possible to correlate the transmitted signal and the received signal with one another in order to determine the distance to the object 17 on the basis of the correlation signal thus generated. For this purpose, the transmitted modulation signal is correlated with the received signal with a temporal offset. A correlation maximum is obtained at an offset time which is proportional to the distance 18 of the object 17. Since substantially noisy signals are evaluated, the value of the correlation maximum is a measure of the signal strength, so that different objects 17 can be distinguished. The resolution in terms of distance can be determined, for example, by the sampling frequency of the received signal. To generate the correlation signal, a plurality of correlation coefficients may be generated. Each of the plurality of correlation coefficients is assigned a respective offset time. Each correlation coefficient is formed by correlating the modulation signal, shifted by the respectively assigned offset time, with the received signal. The distance to the object is determined from the plurality of correlation coefficients, for example by determining absolute or local maxima and from the assigned offset time. It is advantageous here that the high temporal significance of the continuously modulated FMCW signal, which comprises a plurality of mutually independent frequencies, is exploited. To achieve a high sampling rate of the received signal, a one-bit conversion may be advantageous, for example. To generate a binary-valued received signal, a received signal with a first signal value may be generated if the level of the received light is below a certain intensity, and a received signal with a second signal value may be generated if the level of the received light reaches or exceeds that intensity.
For this purpose, for example, a limiting amplifier can be used which generates a received signal with a defined level, for example a sequence of the binary values zero and one, depending on the received light. Since the significance lies in the time domain, little information is lost by this binary conversion; the significance of the amplitude is in any case unreliable because of the amplitude modulation to be expected from the object 17. Because the received signal is reduced to a binary-valued signal, the corresponding correlator can be constructed comparatively simply and can be designed to handle long sequences. This improves the correlation result. If the received signal is present in digital or binary-valued form, it is advantageous for the comparison pattern of the modulated transmission signal likewise to be digital. This can be achieved, for example, by using a synthetically generated digital signal for the modulation of the light source. Such a signal can be generated with constant quality and from only one transmit clock. If the receive clock is the same as the transmit clock, errors that occur, for example, due to temperature drift can be compensated. By using a correlation method, a long signal sequence can be used; the usable frequency offset is therefore not limited by the transit time of the signal over the distance to be measured. As described above, this method can be implemented purely digitally and therefore cost-effectively. For example, the modulation signal may be emitted with a length of 50-500 μs, during which the frequency is increased from 10 MHz to 100 MHz. Such a modulation signal can be generated, for example, by a shift register in which the synthetically generated signal is stored. The clock frequency with which the transmitted signal is clocked out and the received signal is clocked in synchronously can be 1 GHz, for example, and can thus be realized with comparatively little effort.
Measurement times of 50 to 500 μs are fast enough for most driver assistance applications that a multiplexing method can also be implemented with multi-channel sensors. Multiple measurements may also be performed and averaged in order to further improve the signal quality.
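The one-bit correlation receiver described above can be sketched as follows. This is a toy illustration rather than the actual implementation: the binary chirp pattern, the noise level and the 40-sample echo delay are invented example values.

```python
import numpy as np

def binarize(signal, threshold):
    """One-bit conversion as a limiting amplifier performs it:
    0 below the threshold intensity, 1 at or above it."""
    return (np.asarray(signal) >= threshold).astype(int)

def correlation_coefficients(modulation_bits, received_bits, max_lag):
    """One correlation coefficient per candidate offset time (in
    samples): each coefficient correlates the received sequence,
    shifted by the assigned offset, with the modulation sequence."""
    n = len(modulation_bits) - max_lag
    return [int(np.dot(received_bits[lag:lag + n], modulation_bits[:n]))
            for lag in range(max_lag + 1)]

# Toy binary chirp pattern, echoed 40 samples later with receiver noise.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 1024)
tx_bits = binarize(np.sin(2 * np.pi * 60 * t ** 2), 0.0)
analog_rx = np.concatenate([np.zeros(40), tx_bits.astype(float)])[:1024]
analog_rx += rng.normal(0.0, 0.3, 1024)   # receiver noise
rx_bits = binarize(analog_rx, 0.5)        # one-bit conversion
coeffs = correlation_coefficients(tx_bits, rx_bits, 100)
best_lag = int(np.argmax(coeffs))         # offset time in samples
```

Despite the hard one-bit quantization of both sequences, the correlation maximum still lands at the true echo delay, which illustrates why little information is lost by the binary conversion.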

A random frequency modulation method may also be used to generate the modulation signal used to drive the light source 11. Here, the transmission frequency changes randomly within a frequency band over a certain time. This method is also known as Random Frequency Modulation (RFM). In order to determine the distance to the object 17, the above-described correlation method can be used in an analogous manner. The random frequency modulation method provides very high immunity to interference, for example against scattered light and against other measurement systems. Furthermore, multiple measurement channels can measure simultaneously, since crosstalk from other measurement channels is suppressed by the correlation evaluation. The modulation frequency and the time length of the transmission signal may be selected similarly to those of the frequency modulated continuous wave method. The random frequency modulation method can therefore be used in particular when a plurality of light sources illuminate a scene or space simultaneously and measurements are to be made with all light sources at the same time. For example, with the random frequency modulation method, measurements can be carried out simultaneously with all headlights of the vehicle 10. Each light source is given its own unique signature, which can then be distinguished by the correlation method. Data encoded into the modulation signal may also be transmitted simultaneously to other vehicles or to a receiver at the roadside. Continuous distance measurement requires continuous operation of the light source 11, so that the method is particularly suitable for continuously switched-on light sources, such as daytime running lights or headlights during night driving. It is also possible to combine the frequency modulated continuous wave method described above with the random frequency modulation method.
For example, the frequency modulated continuous wave method may be used first because of its better signal quality. If a source of interference is detected, for example a modulated light source of another vehicle, it is possible to switch to the random frequency modulation method. If data transmission is required, it is likewise possible to switch at least temporarily to the random frequency modulation method. The same light source 11 and sensor 12 can be used for the frequency modulated continuous wave method and for the random frequency modulation method.
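The separation of several light sources by unique signatures can be illustrated with two random binary sequences whose echoes overlap at the receiver. The delays of 30 and 70 samples and the sequence lengths are invented example values; correlation against each signature recovers its own delay because the other source only contributes an uncorrelated background.

```python
import numpy as np

rng = np.random.default_rng(2)
sig_a = rng.integers(0, 2, 2048).astype(float)   # signature of light source A
sig_b = rng.integers(0, 2, 2048).astype(float)   # signature of light source B

# The receiver sees the superposition of both echoes: A delayed by 30
# samples, B delayed by 70 samples (invented example delays).
rx = (np.concatenate([np.zeros(30), sig_a])[:2048]
      + np.concatenate([np.zeros(70), sig_b])[:2048])

def best_lag(received, signature, max_lag=128):
    """Offset time of the correlation maximum for one signature; the
    other light source only raises the uncorrelated background level."""
    n = len(signature) - max_lag
    coeffs = [float(np.dot(received[k:k + n], signature[:n]))
              for k in range(max_lag)]
    return int(np.argmax(coeffs))
```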

In the method for determining the distance to the object 17, a single-frequency modulation method can also be used to generate the modulation signal for driving the light source 11. The single-frequency modulation method uses a constant modulation frequency and can therefore be implemented particularly easily. However, it is comparatively susceptible to disturbances by fog, spray, dust or extraneous light sources, and is therefore suitable in particular for applications in which such disturbances cannot occur, for example because of the installation location, or in which temporary malfunctions can be tolerated, such as distance measurement in an interior space or in parking assistance, where occasional incorrect measurements do not have a negative effect and the required range is small, or where the generation of spray is insignificant due to the low speed of the vehicle. For continuous distance measurement, a permanently active light source is likewise required in the single-frequency method, so that it can preferably be used in conjunction with, for example, the daytime running lights or dipped headlights of a vehicle. The distance determination, i.e. the evaluation in the single-frequency modulation method, can be carried out, for example, as a phase measurement which determines the phase difference between the modulation signal and the received signal. This phase measurement can be made digitally by an AND operation, for example by comparing the received signal with the modulation signal. Suitable modulation frequencies typically lie in the range of 5-20 MHz. Within this range, the uniqueness of the phase estimate based on single-frequency modulation can be ensured.
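The phase evaluation of the single-frequency method reduces to two formulas: the distance follows from the measured phase difference, and the unambiguous range is half the modulation wavelength because the phase wraps at 2π. A minimal sketch with names of our own choosing:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def distance_from_phase(phase_shift_rad, modulation_freq_hz):
    """Distance from the phase difference between the modulation signal
    and the received signal; the factor 1/2 accounts for the round
    trip of the light."""
    wavelength_m = C / modulation_freq_hz
    return (phase_shift_rad / (2.0 * math.pi)) * wavelength_m / 2.0

def unambiguous_range_m(modulation_freq_hz):
    """Largest distance measurable without phase ambiguity: the phase
    wraps at 2*pi, i.e. at half the modulation wavelength."""
    return C / (2.0 * modulation_freq_hz)
```

At 10 MHz the unambiguous range is about 15 m, consistent with the short-range interior and parking applications named above.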

Finally, a pulse modulation method may be used to generate the modulation signal for driving the light source 11. Measurement by the pulse modulation method is possible in particular even when the light source 11 is switched off, since the short light pulses of the pulse modulation method can be designed such that they are invisible or hardly visible to an observer. If the light source is switched on, it can likewise be operated with the pulse modulation method in that the light source is switched off for the duration of a pulse, in other words in that "negative" light pulses are generated. The pulse modulation method is therefore particularly suitable when measurements are to be made at a low measurement frequency of, for example, 10 to 100 Hz and the light used for the measurements should not be perceptible. Light sources that are not switched on at the measuring time, such as a low beam, turn signal, tail light, brake light or reversing light, may be switched on in short pulses with a length of, for example, 10 to 100 nsec, which are not noticed by a human observer because of the low average power. With the light source switched on, the light can be switched off for a correspondingly short time of, for example, 10 to 100 nsec, thereby forming a negative light pulse which can likewise be detected by the sensor 12. When the pulse modulation method is used, the distance 18 to the object 17 can be determined, for example, with the above-described correlation method. In particular, a pulse modulation can be used which consists of a pulse sequence with high temporal significance owing to non-uniform pulse intervals. The received signal generated from the received light can in turn be correlated with the modulation signal; alternatively, a mathematical description of the pulses of the pulse modulation can be used as the correlation pattern. The received signal can be recorded over the entire measurement distance as an echo map.
By means of an oversampling method, a plurality of such echo maps can be recorded and summed into a distance histogram. The echo pulses can then be identified by pulse evaluation and the exact distance 18 can be determined, for example, with a center-of-gravity calculation. This method is suitable for both positive and negative pulses.
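The oversampling evaluation can be sketched as summing echo maps into a distance histogram and refining the peak position with a center-of-gravity calculation. The bin width of 0.5 m and the toy echo maps below are invented example values; function names are our own.

```python
import numpy as np

def distance_histogram(echo_maps):
    """Sum several oversampled echo maps (one echo profile per
    measurement) into a single distance histogram."""
    return np.sum(np.asarray(echo_maps, dtype=float), axis=0)

def centroid_distance_m(histogram, meters_per_bin):
    """Refine the position of the strongest echo with a
    center-of-gravity calculation over the neighboring bins."""
    hist = np.asarray(histogram, dtype=float)
    peak = int(hist.argmax())
    lo, hi = max(peak - 2, 0), min(peak + 3, len(hist))
    bins = np.arange(lo, hi)
    weights = hist[lo:hi]
    return float(np.dot(bins, weights) / weights.sum()) * meters_per_bin

# Toy example: three noise-free echo maps with a pulse centered on
# bin 10; at 0.5 m per bin the centroid corresponds to 5.0 m.
echo_maps = [[0] * 9 + [1, 2, 1] + [0] * 8 for _ in range(3)]
estimated_m = centroid_distance_m(distance_histogram(echo_maps), 0.5)
```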

As already mentioned above in connection with the frequency modulated continuous wave method, the velocity of the object 17 can be determined in addition to the distance 18 to the object 17. This is described in detail below with reference to fig. 3. Fig. 3 shows a method 30 for determining the velocity of the object 17. In step 31, the light source 11 of the vehicle 10 is driven with a frequency-modulated signal. In step 32, reflected light 16 is received, which was emitted by the light source 11 and reflected by an object 17 in the surroundings of the vehicle 10. In step 33, a received signal is generated from the received light 16. In step 34, the difference frequency between the frequency of the frequency-modulated signal used to drive the light source 11 and the frequency of the received signal is determined by mixing the two signals. In step 35, velocity information of the object 17 is determined from the mixed signal, i.e. on the basis of the difference frequency. The frequency-modulated signal may in particular be generated according to the above-described frequency modulated continuous wave method, in which the modulation frequency is changed from an initial frequency to a final frequency within a certain time. As mentioned above, distance information about the object 17 may also be determined from the frequencies of the received signal and the frequency-modulated signal, for example by mixing or correlating the signals. The frequency of the frequency-modulated signal is preferably in the range of 10 to 200 MHz.

Subsequently, the method will be described in detail by way of example for modulation by means of the frequency modulated continuous wave method (FMCW). In the case of FMCW modulation, a continuous frequency sweep from, for example, 10 MHz to 100 MHz is modulated within 40 μs. For a distance of 200 m between the vehicle 10 and the object 17, an offset of 1.32 μs or 2.97 MHz results. A further frequency component in the mixed signal is obtained from the relative velocity v according to the Doppler formula:

f0 = f · (1 + v/c)

where f is the modulation frequency, f0 is the frequency of the received signal and c is the speed of light. In the following table, the Doppler shifts for different velocities of the object are shown.

The table shows that the Doppler frequency depends on the modulation frequency: higher modulation frequencies also result in higher Doppler frequencies. The FMCW modulation may therefore be modified, for example, such that the frequency is swept from 10 MHz to 100 MHz within, for example, 20 μs and the frequency of 100 MHz is then held for the next 20 μs. The Doppler frequency can then be measured at 100 MHz. The Doppler frequency can be determined directly, for example, by mixing the transmit frequency with the receive frequency. For practical reasons, however, the Doppler frequency may instead be determined by mixing the received signal with a further signal whose frequency deviates from the frequency of the frequency-modulated transmission signal by a predetermined value. For example, the received signal may be mixed with a signal whose frequency is 100 kHz lower than that of the frequency-modulated transmission signal. For the Doppler frequencies in the example shown in the table, frequencies between 100000 and 100024 Hz are then obtained for speeds between 0 and 260 km/h. These significantly higher frequencies can be measured more easily and can be resolved within a measurement duration of, for example, 20 μs.
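A small sketch of the Doppler arithmetic. Note that the one-way form f0 = f·(1 + v/c) is used here because it reproduces the numerical example above (about 24 Hz at 260 km/h and 100 MHz); the 100 kHz mixing offset is the example value from the text, and the function names are our own.

```python
C = 299_792_458.0  # speed of light in m/s

def doppler_shift_hz(speed_m_s, modulation_freq_hz):
    """Doppler shift of the received signal for a relative speed v,
    in the one-way form f0 = f * (1 + v / c) that reproduces the
    numerical example in the text."""
    return modulation_freq_hz * speed_m_s / C

def mixer_output_hz(speed_m_s, modulation_freq_hz, offset_hz=100_000.0):
    """Mixing the received signal with a reference lying offset_hz
    below the transmit frequency moves the small Doppler shift onto
    an easily measurable carrier."""
    return offset_hz + doppler_shift_hz(speed_m_s, modulation_freq_hz)
```

At 100 MHz modulation, speeds between 0 and 260 km/h map onto mixer output frequencies between 100000 and about 100024 Hz, as stated above.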

As described above, the light source 11 of the vehicle 10 is to be modulated in a frequency range of, for example, 10 MHz to 100 MHz. LED light sources that use semiconductor light-emitting diodes to generate the light 15 are particularly suitable for this. Light-emitting diodes that generate ultraviolet or blue light in particular have such a large modulation bandwidth. For light that is color-converted to white or to other colors, such as red or green, the light-emitting diodes may additionally have phosphor coatings which convert the ultraviolet or blue light into light of other colors. The high-frequency light used for distance or speed measurement is then in particular the blue light of the light-emitting diode. The current through the light-emitting diode is in the range of a few amperes in order to achieve a corresponding illumination range. In order to achieve efficient modulation, the drive circuit of the light-emitting diodes must be designed accordingly. Fig. 4 shows a light-emitting diode light source 40, also referred to as a modulation circuit, with a corresponding design. The LED light source 40 comprises a light-emitting diode 41, a switching element 42 and an energy storage element 43. As described above, the light-emitting diode 41 may preferably be a light-emitting diode that generates blue light or at least has a blue light component. The switching element 42 may comprise, for example, a transistor, in particular a field-effect transistor. The energy storage element 43 may comprise, for example, a capacitor. The switching element 42 is driven by a modulation signal 44. The power supply comprises a ground connection (GND) 45 and a supply connection (Vcc) 46.
If the switching element 42 is switched on by the modulation signal 44, a current flows from the supply voltage connection 46 via the light-emitting diode 41 to the ground connection 45 and, in addition, a further current from the charge stored in the energy storage element 43 flows from its first connection 47 via the switching element 42 and the light-emitting diode 41 to its second connection 48. Because of the high switching frequency, a construction with the shortest possible lines should be sought, in particular between the elements 41, 42 and 43, so that the inductance of the lines is as low as possible and therefore the losses, the susceptibility to interference and, in particular, the emitted interference radiation are as low as possible. In the switched-off state of the switching element 42, the energy storage element 43 is charged via the supply voltage connection 46 and the ground connection 45. In the conductive state of the switching element 42, the energy storage element provides a very large current through the light-emitting diode 41 for a short time. The connections between the energy storage element 43, the switching element 42 and the light-emitting diode 41 should therefore be kept as short as possible. If the lines in the circuit of the light-emitting diode 41, the switching element 42 and the energy storage element 43 are too long, these lines represent inductances that oppose any change in current. Very high voltages are then required in order to produce the fast current changes that the modulation represents. Even a line length of a few millimeters can have a significant effect here. Part of the energy stored in these lines during modulation is absorbed in the lines and converted into heat, while another part is emitted as interference radiation.
In order to generate, for example, 10 W of light with the light-emitting diode 41, a current of about 10 amperes through the light-emitting diode 41 is required. If the light pulse is to be, for example, 50 nsec long, about 200 volts is required in the case of a wiring structure in which the light-emitting diode 41, the switching element 42 and the capacitor 43 are arranged as separate elements on a printed circuit board. This corresponds to an energy demand of 200 V x 10 A x 50 nsec = 0.1 mJ. In the case of a structure in SMD technology, for example, 60 V and 10 A are required, i.e. an energy demand of 30 μJ. In the case of the optimized structure shown later in connection with fig. 5, however, only 8 V and 10 A are required, i.e. an energy demand of 4 μJ. In all cases, about 40 W is absorbed in the light-emitting diode 41. The efficiency is thus 50% in the optimized structure, about 6% in the SMD structure, and only 2% in the wiring structure on the printed circuit board.
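The energy and efficiency figures in this paragraph follow from E = U x I x t and from the ratio of the power absorbed in the light-emitting diode to the total drive power. A minimal arithmetic sketch, with function names of our own choosing:

```python
def pulse_energy_j(voltage_v, current_a, pulse_duration_s):
    """Energy drawn from the supply per light pulse: E = U * I * t."""
    return voltage_v * current_a * pulse_duration_s

def drive_efficiency(led_power_w, voltage_v, current_a):
    """Fraction of the drive power that is absorbed in the LED itself;
    the remainder is lost in the line inductances."""
    return led_power_w / (voltage_v * current_a)
```

For the three structures above: 200 V x 10 A x 50 nsec gives 0.1 mJ (efficiency 40 W / 2000 W = 2%), 60 V gives 30 μJ (about 6%), and 8 V gives 4 μJ (50%).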

Fig. 5 shows the optimized structure of the LED light source 40. The LED light source 40 comprises the light-emitting diode 41, the switching element 42 and the energy storage element 43. The switching element 42 is coupled in series with the light-emitting diode 41. The energy storage element 43 is coupled in parallel with the series circuit of the light-emitting diode 41 and the switching element 42. If the switching element 42 is switched on, a current path through the light-emitting diode 41 is closed, which extends from the first connection 47 of the energy storage element 43 via the first line portion 50 to the switching element 42 and from there via the second line portion 51 to the light-emitting diode 41. The current path continues via the third line portion 52 to the second connection 48 of the energy storage element 43. As shown in fig. 5, the elements 41, 42 and 43 are arranged in a common housing 54. In other words, the semiconductor elements 41 and 42 and the capacitor 43, which have no housings of their own, are mounted in the common housing 54. The lengths of the connections 50 to 52 can thereby be made correspondingly short. For example, the total current path connecting the energy storage element 43, the light-emitting diode 41 and the switching element 42 may have a length of less than 12 mm, preferably less than 9 mm. Each of the connections 50, 51 and 52 may be, for example, 1 to 3 mm long. The connections 50 to 52 can, together with the connection terminals 44 to 46, form a so-called leadframe, which provides the external terminals 44 to 46 of the LED light source 40 on the one hand and the connections 50 to 52 for coupling the elements 41 to 43 on the other hand. Because of the short lengths of the connections 50 to 52, a high efficiency of the LED light source 40 can be achieved.
A plurality of LED light sources can be implemented in the housing 54 by arranging a plurality of light-emitting diodes 41, switching elements 42 and energy storage elements 43 in a corresponding manner on a common leadframe in the common housing 54. The light-emitting diode 41 may generate light with a wavelength of less than 760 nm, preferably less than 500 nm, i.e. in particular blue light. A phosphor coating may also be provided in the housing 54, which converts the ultraviolet or blue light generated by the light-emitting diode 41 into light of other colors. The LED light source 40, or a plurality of such LED light sources, may be used in the lighting device 11 of the vehicle 10, for example in order to illuminate the surroundings of the vehicle 10 or to generate a light signal, such as a flashing light or a brake light.

In the above-described method and apparatus, existing lighting devices of the vehicle, such as low beam, fog, blinkers, brake or back-up lamps, are used in order to generate a modulated light signal which is reflected by objects in the surroundings of the vehicle and received by sensors on the vehicle. From the received signals of the sensors and knowledge of the modulation signals for operating the lighting devices of the vehicle, the distance or speed of the object can be determined. Since the main function of the lighting device is to illuminate the surroundings of the vehicle or to output an optical signal, such as a flashing signal or a brake signal, a method 60 is described later, which at the same time ensures the determination of the distance information. For this purpose, in step 61, the operating state of the vehicle is first detected. The operating state of the vehicle may be, for example, a target state of a lighting device of the vehicle, which indicates whether the lighting device should be switched on or off. The detection of the operating state may also comprise a determination of the brightness of the environment in the surroundings or within the vehicle or a determination of the distance measurement range for which distance information is to be determined. Based on the operating state thus determined, a modulated transmission signal is generated in step 62. For example, if the target state of the lighting device indicates that the lighting device should be turned on, a first modulated transmission signal may be generated. Furthermore, if the target state indicates that the lighting device should be switched off, a second modulated transmission signal may be generated, which is inverted with respect to the first modulated transmission signal. In this way, for example, a modulated transmission signal comprising short light pulses which are not sufficiently energetic to be seen by an observer can be generated with the lighting device switched off. 
Conversely, if the lighting device should be on, a modulated transmission signal may be generated that switches the lighting device off for pulses so short that they are not noticed by an observer, so that the lighting device appears to be continuously on. In step 63, the generated transmission signal is used to operate the lighting device 11 of the vehicle 10. In step 64, reflected light 16 is received, which was emitted by the lighting device 11 as light 15 and reflected by the object 17. Based on the received light 16, a receive signal is generated in step 65. In step 66, the receive signal is combined with the transmission signal, and in step 67 the distance of the object 17 is determined from the combination.
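The combination in steps 66 and 67 can be illustrated with a small sampled-signal sketch. The sample rate, pulse shape and distances below are illustrative assumptions, not values from the text: the receive signal is correlated with the transmission signal, and the lag of the correlation peak gives the round-trip time of flight.

```python
import numpy as np

C = 299_792_458.0   # speed of light in m/s
FS = 1e9            # hypothetical sample rate: 1 GS/s, i.e. 1 ns resolution

def simulate_echo(tx: np.ndarray, distance_m: float, attenuation: float = 0.1) -> np.ndarray:
    """Delay the transmitted signal by the round-trip time of flight (step 64)."""
    delay_samples = int(round(2 * distance_m / C * FS))
    rx = np.zeros_like(tx)
    rx[delay_samples:] = attenuation * tx[:len(tx) - delay_samples]
    return rx

def estimate_distance(tx: np.ndarray, rx: np.ndarray) -> float:
    """Combine receive and transmission signal (steps 66/67): the lag of the
    correlation peak gives the round-trip time of flight."""
    corr = np.correlate(rx, tx, mode="full")
    lag = np.argmax(corr) - (len(tx) - 1)
    return lag / FS * C / 2

# Modulated transmission signal (step 62): a single 50 ns pulse in a 4 us frame.
tx = np.zeros(4000)
tx[100:150] = 1.0

rx = simulate_echo(tx, distance_m=150.0)
print(estimate_distance(tx, rx))  # ~150 m, within one sample (~0.15 m)
```

The one-sample quantization of the estimate corresponds to the basic timing of the receiver; finer resolution would require interpolation around the correlation peak.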

The amount of light that can be emitted without being seen by an observer depends mainly on the overall brightness of the surroundings of the vehicle and on the contrast at the emission level. During the day, the lighting device may therefore emit a significantly greater unnoticed amount of light than at night. At the same time, the signal-to-noise ratio is generally significantly worse during the day due to interfering sunlight, so that a higher transmission power is required during the day than at night. During the day, for example, up to 2 mJ of light can be emitted without being noticed by an observer. In this method, the average power of the modulation signal can therefore be set as a function of the operating state, in particular the ambient brightness. Furthermore, the transmission energy may be set as a function of the distance measurement range for which distance information is to be determined. The required transmission energy depends, for example, on the application using the distance information: a driver assistance system for distance control or a collision avoidance system may require a larger distance measurement range than a parking system.

The modulated transmission signal may comprise, for example, a pulse-modulated signal. The pulse-modulated signal may have a pulse duration in the range of 1 to 500 ns, preferably 10 to 100 ns. The repetition frequency of the pulses of the pulse-modulated signal may be in the range of 1 to 1000 Hz, preferably 10 to 100 Hz.

The lighting device of the vehicle may, for example, comprise a light-emitting diode light source or a plurality of light-emitting diodes as described above. In the case of a white light-emitting diode, the dominant blue component may be used as the modulated carrier: it is modulated at high frequency with the modulated transmission signal and remains present in the spectrum of the white light-emitting diode. The phosphor of the light-emitting diode cannot follow the fast modulation, since its response is comparatively slow. In this way, light is produced that appears uniformly white to human perception, while its blue component carries the desired modulation.

Further lighting devices of the vehicle can be controlled as a function of the operating state of the vehicle and the modulated transmission signal. Suppose, for example, that the vehicle 10 is traveling on a rural road with a driver assistance system such as adaptive cruise control switched on, while the headlights of the vehicle are switched off. In this case, a modulated transmission signal comprising brief light pulses is generated, so that distance information about objects in front of the vehicle can be provided to the adaptive cruise control system. It is therefore not necessary to switch on the driving lights of the vehicle, that is to say to supply full power to all the light-emitting diode lighting devices of the headlights, which can be advantageous in particular for electric vehicles. Adaptive cruise control systems in particular require a large measurement range. If, as described above, the headlights are switched off during the day, the high beam with its high energy can be used to emit measuring pulses with a large range. If the vehicle is driving in the dark, the high beam is instead modulated by briefly reducing its brightness, so that a large measurement range can likewise be achieved. However, if another vehicle is oncoming in the dark, the high beam may no longer be operated, in order not to dazzle the driver of the oncoming vehicle. In this case, the light-emitting diodes of the low beam can be modulated by briefly reducing their brightness in order to determine the distance information. At the same time, the light-emitting diodes of the high beam can be modulated with short pulses in order to determine the distance information without dazzling the oncoming traffic. In other words, some light-emitting diodes are briefly switched on (here the otherwise switched-off light-emitting diodes of the high beam) and others are briefly switched off (here the light-emitting diodes of the low beam). Thus, a large measurement range can be achieved without the light-emitting diodes of the high beam dazzling or disturbing the oncoming vehicle.

In the above-described method and apparatus, the distance or speed of the object 17 is determined using lighting devices 11 that are present on the vehicle 10 anyway, such as the low beam, the daytime running light or the high beam of the vehicle 10. The following describes how, in addition, position information of the object 17 relative to the vehicle 10, that is to say directional information, can be determined using the above-described method.

According to one embodiment, the sensors 12 of the vehicle 10 comprise at least two first sensors for receiving light that is generated by the lighting device 11 of the vehicle and reflected by a scene, comprising objects 17, in the surroundings of the vehicle. Each of the at least two first sensors is assigned a respective first detection region of the scene. The first detection regions are arranged in a line along a first direction. Fig. 7 shows 15 first detection regions, which are assigned to 15 first sensors and are arranged in the horizontal direction. Two of these 15 first detection regions are identified by reference numerals 71 and 72. The sensors 12 further comprise at least two second sensors for receiving light reflected by the scene, wherein each of the at least two second sensors is assigned a respective second detection region of the scene. The second detection regions are arranged in a line along a second direction, which is different from the first direction. Fig. 8 shows two second detection regions 81 and 82, which are arranged in a line in the vertical direction. Fig. 8 also shows further detection regions that are likewise arranged pairwise in a line in the vertical direction, for example two third detection regions 83 and 84. The processing unit 13 is designed to determine the position of the object 17 in the surroundings of the vehicle 10 from the signals of the first and second sensors. One of the first detection regions, e.g. region 71, partially overlaps one of the second detection regions, e.g. region 81. Additionally, one of the first detection regions, i.e. region 71, may partially overlap another of the second detection regions, e.g. region 82, as shown in fig. 9. The third detection regions 83, 84, monitored by corresponding third sensors, may be arranged such that one of the first detection regions, e.g. detection region 71, partially overlaps one of the second detection regions, e.g. region 81, another of the second detection regions, e.g. region 82, one of the third detection regions, e.g. region 83, and another of the third detection regions, e.g. region 84.

The localization of the object 17 by means of the overlapping detection regions described above will now be explained in more detail. First, note that with, for example, five detection regions, only five different position regions of the object 17 can be distinguished if the detection regions do not overlap. By overlapping the detection regions as shown in fig. 9, however, eight different position regions of the object 17 can be distinguished with the detection regions 71 and 81-84. If only a sensor assigned to one of the detection regions 81-84 detects the object 17, the object 17 is located within the part of the region assigned to that sensor which does not overlap the region 71. In this way, four different position regions of the object 17 can already be distinguished. If the object 17 is detected in one of the regions 81-84 and additionally in the region 71, the object 17 must be located in one of the four overlap regions that result from the overlap of region 71 with region 81, 82, 83 or 84. Thereby, four further position regions of the object 17 can be distinguished. If the sensors are arranged such that the detection regions shown in figs. 7 and 8 can each be monitored separately, the overlap shown in fig. 9 yields a total of 56 different regions in which the object 17 can be individually detected, using the 15 first sensors required for the regions of fig. 7 and the 16 sensors for the regions of fig. 8.
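The counting argument above can be reproduced with a small sketch. The rectangular geometry is an illustrative stand-in for fig. 9: detection regions 81-84 are modelled as vertical strips, region 71 as a horizontal strip crossing them, and the set of sensors that respond to a given object position forms its "signature"; each distinct signature is one distinguishable position region.

```python
# Hypothetical 2D geometry modelled on Fig. 9: regions 81-84 are vertical
# strips, region 71 is a horizontal strip crossing all of them.
vertical = {81: (0, 1), 82: (1, 2), 83: (2, 3), 84: (3, 4)}  # x-ranges
horizontal = {71: (1, 2)}                                    # y-range

def signature(x, y):
    """Set of sensors that would detect an object at position (x, y)."""
    s = {a for a, (lo, hi) in vertical.items() if lo <= x < hi}
    s |= {a for a, (lo, hi) in horizontal.items() if lo <= y < hi}
    return frozenset(s)

# Sample the plane finely and count the distinct non-empty signatures:
sigs = set()
for i in range(40):
    for j in range(30):
        sigs.add(signature(i / 10, j / 10))
sigs.discard(frozenset())
print(len(sigs))  # 8 distinguishable position regions, as in the text
```

Each of the four strips contributes one signature alone (object outside region 71) and one together with region 71, giving the eight regions named above.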

The second detection regions can additionally overlap one another in the vertical direction and can also overlap further detection regions, for example the third detection regions 83, 84, in the horizontal direction. This can be achieved, for example, by a deliberate defocusing ("blurring") of the assigned sensors. Fig. 10 shows this overlap of the second, third and further detection regions. In combination with the first detection regions of fig. 7, a multitude of different regions is thus available for the localization of the object 17, as shown in fig. 11. The resolution of the localization of the object 17 can be increased further by overlapping the first detection regions with one another, which is, however, not shown in fig. 11 for the sake of clarity. Figs. 9 and 11 also show that a particularly high resolution for the localization of the object 17 can be achieved in the center, that is to say in the region in which the horizontally arranged and the vertically arranged detection regions intersect. This can be exploited advantageously by many driver assistance systems of the vehicle, since a high resolution is particularly beneficial in the straight-ahead direction of the vehicle, while lower resolution in the edge regions can usually be tolerated.

The detection regions of fig. 7 to 11 are perpendicular to the measuring direction, that is to say to the arrow 16 of fig. 1.

With reference to figs. 12-14, a further possibility for determining position information of the object 17 relative to the vehicle 10 is described.

The lighting device 11 of the vehicle 10 has at least a first light source and a second light source, which can be operated independently of each other. The first light source is designed to illuminate a first illumination area of a scene in the surroundings or in the interior of the vehicle 10. The second light source is designed to illuminate a second illumination area of the scene, the first illumination area being different from the second illumination area. A plurality of illumination areas 121-127 are shown in fig. 12. For example, the first illumination area may be area 121 and the second illumination area may be area 122. The sensors 12 comprise at least a first sensor and a second sensor for receiving light reflected by the scene. The first sensor is assigned a first detection area of the scene and the second sensor a second detection area of the scene, the first detection area being different from the second detection area. Six detection areas 131-136 are shown in fig. 13. The first detection area may be, for example, area 131, and the second detection area may be, for example, area 132. The processing unit 13 actuates the first and second light sources and, if appropriate, further light sources for generating the illumination areas 123-127, and determines the position of the object 17 in the surroundings of the vehicle 10 on the basis of the signals of the first and second sensors and, if appropriate, of further sensors assigned to the detection areas 133-136, as well as on the basis of the actuation of the light sources. The areas 121-127 and 131-136 lie, for example, in the plane of the arrows 15 and 16 of fig. 1.

These detection areas can be arranged, for example, in alignment with the illumination areas, that is to say the detection area 131 substantially corresponds to the illumination area 121, the detection area 132 substantially corresponds to the illumination area 122, and so on. Each of the detection areas may span a predetermined angular range, such as 10°, or 20° as shown in figs. 12 and 13. The segments formed in this way can be sampled in sequence in a time-division multiplex method. Since the distance measurement within a segment can be performed in a short time, for example within 50 μs, using the distance measurement methods described above, in particular the frequency-modulated continuous-wave method or the random frequency modulation method, the entire angular range covered by the segments can be sampled quickly. If an angular range of, for example, 120° is to be sampled in 10° segments, the entire angular range can be sampled in 600 μs at a measurement time of 50 μs per segment. Even with a longer measurement time of 500 μs, the entire angular range of 120° can be sampled in 6 ms. Typical driver assistance applications require measurement times in the range of 30 ms to 50 ms, so that sufficiently fast sampling is possible. The resolution of the sampling can be improved by not providing a dedicated transmitter and receiver for each angular segment, but instead using segments that each overlap by half. Fig. 14 shows such an overlap of the illumination areas 121-127 and the detection areas 131-136. Both the illumination areas and the detection areas each span an angular range of 20°. The staggered overlap of the illumination areas 121-127 and the detection areas 131-136 yields twelve 10° segments, which can be sampled with seven light sources and six sensors. Since a time-division multiplex method is used, crosstalk between adjacent segments is unimportant and the segments can be arranged directly side by side: at any time only one transmitter-receiver pair is operating, so that it can be unambiguously determined in which segment a signal originates. In other words, the first detection area 131 covers a partial area of the first illumination area 121 and a partial area of the second illumination area 122; the second detection area 132 covers another partial area of the second illumination area 122, while being disjoint from the first illumination area 121.
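The interleaving described above can be checked with a short sketch. The concrete angular positions are illustrative assumptions based on fig. 14: seven 20° illumination areas and six 20° detection areas, each offset by 10°, intersect in twelve 10° segments.

```python
# Hypothetical angular layout modelled on Fig. 14 (angles in degrees):
# seven 20-degree illumination areas and six 20-degree detection areas,
# staggered by 10 degrees.
illumination = [(10 * i, 10 * i + 20) for i in range(0, 13, 2)]  # 0-20 ... 120-140
detection = [(10 * i, 10 * i + 20) for i in range(1, 12, 2)]     # 10-30 ... 110-130

# Every non-empty illumination/detection intersection is one sampled segment:
segments = []
for il in illumination:
    for de in detection:
        lo, hi = max(il[0], de[0]), min(il[1], de[1])
        if hi > lo:
            segments.append((lo, hi))

print(len(segments))  # 12 segments of 10 degrees each
```

Each detection area overlaps exactly two adjacent illumination areas by 10°, so six sensors and seven light sources yield the twelve segments named in the text.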

With the arrangement of illumination and detection areas described above in connection with figs. 12-14, information for estimating the visibility range can additionally be obtained if detection areas that are not assigned to the active illumination area are evaluated at the same time. For example, for a distance measurement, the light source illuminating area 121 and the sensor observing area 131 are operated, yielding a measurement in the overlap between the illumination area 121 and the detection area 131. Simultaneously, or likewise in a time-division multiplex method, the sensor assigned to the detection area 136 is interrogated. If this sensor also reports a distance signal based on the light emitted for illumination area 121, that signal can only originate from secondary scattered light. If a signal occurs with the illumination and detection areas this far apart, there is, for example, very dense fog. If the segments are close together, that is to say when, for example, the detection area 133 provides a distance signal, measurable secondary scattering occurs even at low particle densities. By evaluating areas at different separations, the fog density can thus be assessed quite well, and the current visibility range can be estimated from it.

In order to illuminate the surroundings of the vehicle 10 in segments as described above, a plurality of light sources is required. For this purpose, for example, the plurality of light-emitting diodes of a low beam or, in particular, of a daytime running light with a linear structure can be used. In order to achieve a uniform appearance, in particular with a linear daytime running light, light-emitting diodes arranged at a distance from one another in the daytime running light can be interconnected in groups, each group illuminating one of the illumination areas; the light-emitting diodes in between illuminate the other illumination areas. In other words, the first light source used to generate the illumination area 121 may comprise at least a first light-emitting diode and a second light-emitting diode, and the second light source illuminating the illumination area 122 may likewise comprise one or more light-emitting diodes. The first and second light-emitting diodes of the first light source and the light-emitting diode of the second light source are arranged in a row, with the light-emitting diode of the second light source arranged between the first and second light-emitting diodes of the first light source. Since the brightness of the light-emitting diodes may vary during the distance measurement, this staggered arrangement ensures that the brightness differences are not perceived by an observer. Alternatively, interesting design effects can be achieved if the brightness differences are visible to the observer.

Fig. 15 shows the near field of the illumination areas resulting from the staggered arrangement of the light-emitting diodes. The light strip 151 comprises 21 light-emitting diodes. It may be, for example, the light strip of a daytime running light and has a length of, for example, 42 cm. With the light strip 151, illumination areas or segments each spanning an angle of 20° are illuminated. Each segment is produced by three light-emitting diodes with a spacing of 14 cm. Fig. 15 shows the segments that can be illuminated by the individual light-emitting diodes. The far field of the segments generated by the light-emitting diodes of the light strip 151 is shown in fig. 16, in which the illumination areas 121-127, each spanning about 20°, can be clearly seen.
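The grouping described for the light strip 151 can be sketched as follows. The LED indexing and the 2 cm pitch are inferred from the stated 21 LEDs on 42 cm and should be treated as illustrative assumptions: every seventh LED belongs to the same segment, so the three LEDs of a group are 14 cm apart.

```python
# Hypothetical strip: 21 LEDs on 42 cm (2 cm pitch); each of the 7 segments
# is driven by 3 LEDs spaced 14 cm apart, i.e. every 7th LED (cf. Fig. 15).
PITCH_CM = 2.0
groups = {segment: [segment + 7 * k for k in range(3)] for segment in range(7)}

for segment, leds in groups.items():
    positions = [i * PITCH_CM for i in leds]
    # Consecutive LEDs of one group are 14 cm apart, as stated in the text:
    assert positions[1] - positions[0] == 14.0
    assert positions[2] - positions[1] == 14.0

print(groups[0])  # [0, 7, 14]: LEDs 0, 7 and 14 drive the first segment
```

The interleaving means neighbouring LEDs always belong to different segments, which is why brightness variations during a measurement are spread evenly along the strip.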

Various assistance systems of a vehicle may require image information of the surroundings which provides a high-resolution image of the scene in front of the vehicle from the vehicle's perspective, wherein each region or image point of the image information is assigned a corresponding distance value to an object in that region. Such image information may be necessary, for example, in order to detect whether an obstacle is above or below a specific size, such as an obstacle on the lane that cannot be driven over, for example a curb. Fig. 17 illustrates a method 170 for determining such distance information. In step 171, a scene in the surroundings of the vehicle is illuminated. The method 170 can be used not only outside the vehicle but also in its interior, for example to recognize a gesture of the driver. The light reflected by the scene is received with the sensors 12 of the vehicle 10. In step 172, a plurality of first distance histograms is determined from the received light. Each of the first distance histograms is assigned a respective first strip-shaped region of the scene and comprises the intensities of reflections within a distance range caused by objects in the assigned first strip-shaped region. In step 173, a plurality of second distance histograms is determined from the received light. Each of the second distance histograms is assigned a respective second strip-shaped region of the scene and comprises the intensities of reflections within the distance range caused by objects in the assigned second strip-shaped region. In step 174, distances are determined for regions of the scene from the plurality of first distance histograms and the plurality of second distance histograms. Such a region of the scene comprises the intersection of one of the first strip-shaped regions with one of the second strip-shaped regions. The first strip-shaped regions are preferably parallel to one another along their longitudinal direction, as are the second strip-shaped regions, and the longitudinal direction of the first strip-shaped regions is preferably perpendicular to that of the second strip-shaped regions. The first strip-shaped regions may comprise rows of the scene in front of or inside the vehicle, and the second strip-shaped regions may comprise columns of the scene. To determine the pluralities of first and second distance histograms, the sensor 12 may comprise a receiver matrix whose rows and columns can be selectively interconnected such that the received signal is formed either by the sum of all elements of a column or by the sum of all elements of a row. All rows and columns can then be measured separately. The distance measurement can be carried out, for example, by one of the methods described above, by correspondingly modulating the light source of the vehicle and correlating or mixing the received signal of one of the rows or columns with the transmission signal of the lighting device 11. The receiver matrix may have, for example, 300 rows and 700 columns, that is to say a total of 1000 rows and columns. With a measurement time of, for example, 50 μs per row or column, these 1000 measurements can be carried out within 50 ms; according to the explanations below, an at least partially and preferably completely simultaneous measurement of all rows and columns is provided in the present case.

Each row or column thus provides a distance-resolved echo map, a so-called distance histogram. These can be processed into a pixel-resolved image using methods similar to those of computed tomography. To reduce the processing effort, the same approach can be used to select specific regions of interest: the corresponding receiving elements of the receiver matrix are interconnected for such a region, and only this region is observed and evaluated.

Switching between the different regions to be evaluated can be effected dynamically, so that different driving situations can be accommodated.

This method will now be described by way of example with reference to figs. 18 to 20. Fig. 18 shows a scene in the surroundings of the vehicle: a vehicle 182 is on a lane 181. The scene is subdivided into a plurality of regions in matrix form. In the example of fig. 18, the scene is subdivided into 14 rows and 19 columns, giving a total of 266 regions. This small number of rows and columns is chosen in figs. 18 to 20 for the sake of clarity; a practical implementation may have, for example, at least 100 rows and at least 200 columns, preferably 300 rows and 700 columns. The sensor 12 accordingly preferably comprises a sensor matrix with corresponding row and column resolution. The lighting device 11 of the vehicle 10 illuminates the scene shown in fig. 18, preferably with a light-emitting diode light source, and the received light is evaluated with one of the modulation methods described above, for example the frequency-modulated continuous-wave method, the random frequency modulation method, the single-frequency modulation method or the pulse modulation method. By interconnecting the receiver matrix row-wise or column-wise, distance-resolved echo maps are produced for the rows and columns.

Fig. 19 shows the corresponding distance-resolved echo maps of the 14 rows of the scene of fig. 18. The echo map of the fifth row from the bottom of the scene of fig. 18 will now be described in detail by way of example; in fig. 19 it is identified by reference numeral 191. As can be seen from fig. 19, this echo map shows an increased signal level in the range of 60 to 110 meters, whereas in the ranges of 10 to 60 meters and 110 to 150 meters there is essentially no signal level. This means that in this row there is at least one object located in the range of 60 to 110 m; there may, however, be several objects within this range. The position of the object in the horizontal direction, that is to say in which column region the object is located, cannot be determined from the echo map of fig. 19.

Fig. 20 shows the corresponding echo maps for the 19 columns of the scene of fig. 18. Consider, for example, the sixth column from the left of fig. 18, whose echo map is identified by reference numeral 201 in fig. 20. The echo map 201 of the sixth column indicates the presence of one or more objects in the range of 60 to 110 meters. The echo maps of the columns in turn contain no information about the distribution of the objects within a column.

From the totality of these echo maps, corresponding distance information about objects in the scene of fig. 18 can be determined for each of the 266 individual regions of the scene. The region-specific information can be obtained from the row- and column-wise distance-resolved echo maps, for example by means of a two-dimensional Fourier transform.
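As an illustration of how region-resolved distances can follow from row and column histograms, the following sketch uses a simple per-bin rank-1 estimate instead of the two-dimensional Fourier transform named above. It is exact when a single compact object dominates a distance bin and is only an illustrative stand-in for the actual reconstruction method, not the patent's algorithm.

```python
import numpy as np

def reconstruct(row_hist: np.ndarray, col_hist: np.ndarray) -> np.ndarray:
    """Illustrative per-bin estimate of region-resolved intensity from row and
    column histograms: cell (i, j) of bin b gets row_i * col_j / total."""
    n_rows, n_bins = row_hist.shape
    n_cols = col_hist.shape[0]
    image = np.zeros((n_rows, n_cols, n_bins))
    for b in range(n_bins):
        total = row_hist[:, b].sum()  # equals col_hist[:, b].sum()
        if total > 0:
            image[:, :, b] = np.outer(row_hist[:, b], col_hist[:, b]) / total
    return image

# A single made-up object in rows 2-4, columns 6-8, distance bin 14:
truth = np.zeros((14, 19, 30))
truth[2:5, 6:9, 14] = 1.0
est = reconstruct(truth.sum(axis=1), truth.sum(axis=0))
print(np.allclose(est, truth))  # True: one object per bin is recovered exactly
```

With several objects in the same distance bin, the rank-1 estimate produces ghost intersections; resolving those ambiguities is what the tomography-like processing mentioned above addresses.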

The distance-resolved echo maps in figs. 19 and 20 are dimensionless and can, for example, indicate a relative quantity stating what percentage of the row- or column-shaped area region has the corresponding distance from the vehicle.

Both with pulse modulation and with the random frequency modulation method (RFM), information can be encoded into the emitted signal 15 and decoded by a receiver. This can be used, for example, for communication between vehicles, so-called car-to-car communication, or for communication between the vehicle 10 and infrastructure objects such as traffic lights or traffic control systems. Fig. 22 illustrates a method 220 that can be used to transmit digital information simultaneously with a distance measurement. Fig. 21 shows the vehicle 10 together with another vehicle 210 and an infrastructure object 211. Using the method 220 of fig. 22, the distance between the vehicles 10 and 210 can be measured and, at the same time, information, in particular digital information, can be transmitted to the vehicle 210 or the infrastructure object 211.

In step 221, a modulated signal is generated in accordance with transmission data to be transmitted by the vehicle 10. In step 222, the light source 11 of the vehicle 10 is operated with the modulated signal. In step 223, light 16 is received, which was emitted by the light source 11 as light 15 and reflected by the vehicle 210 or other objects in the surroundings of the vehicle 10. In step 224, a receive signal is generated from the received light; it may comprise, for example, an analog electrical signal or a digital signal. In step 225, the receive signal is combined with the modulated signal, for example by means of the correlation methods described above, and in step 226 the distance between the vehicle 10 and the vehicle 210 is determined from the combined signal. The modulation method used to generate the modulated signal may comprise, in particular, a random frequency modulation method or a pulse modulation method. In the frequency modulation method, the modulation frequency is varied in accordance with the transmission data; in the pulse modulation method, the pulse interval or the pulse length is varied in accordance with the transmission data. The modulated signal may additionally be generated on the basis of random data.

The data to be transmitted by the vehicle 10 are thus carried in the modulation of the transmission signal. For example, as shown in fig. 21, a bit sequence 213 can be transmitted from the vehicle 10 to the vehicle 210 traveling ahead and to the infrastructure object 211 by means of the modulated transmission signal, as indicated by the light propagation arrows 15 and 212. A receiver in the vehicle 210 or in the infrastructure object 211 can receive the modulated transmission signal, demodulate it, thereby recover the transmission data 213 and process them further. The encoding of the transmission data 213 into the modulated transmission signal will now be described in detail by way of example for a pulse modulation method and a random frequency modulation method (RFM).

In the pulse modulation method, light pulses are transmitted at a pulse repetition rate. The pulse interval is typically long compared to the pulse length of the light pulses. Since a constant pulse repetition rate may be disadvantageous for the distance measurement, the spacing between the pulses may be varied within a certain range, for example to avoid beat effects. For transmitting data, this variation of the spacing between the pulses can be subdivided into a statistical component and a systematic component. For example, pulses with a length of 50 ns and a pulse repetition rate of 25 kHz, that is to say a pulse interval of 40 μs, can be used. For measuring distances of, for example, up to 250 m, the pulse interval should not fall below 250 m × 6.6 ns/m × 2 = 3.3 μs. The pulse interval can therefore be varied between 3.3 μs and 76.7 μs. In a system with time-of-flight distance measurement and a basic timing of 25 ns, this yields 2936 variation possibilities. Of these, 512 can be used, for example, in order to transmit 9 bits per pulse. Of these 9 bits, 6 may carry the transmission data to be transmitted, while the remaining 3 bits are varied statistically. The interval between pulses then fluctuates over 512 × 25 ns = 12.8 μs, from 33.6 to 46.4 μs. Thus, 6 bits of transmission data are transmitted per average pulse interval of 40 μs, yielding a net data rate of 150 kbit/s.
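The pulse-interval encoding described above can be sketched as follows. The function names are illustrative; the constants (25 ns basic timing, 512 steps above a 33.6 μs minimum, 6 data bits plus 3 statistical bits) are the ones stated in the text.

```python
BASE_NS = 25            # basic timing of 25 ns
MIN_INTERVAL_US = 33.6  # lower edge of the 512-step window named in the text

def encode_interval(data_6bit: int, random_3bit: int) -> float:
    """Map 6 data bits plus 3 statistical bits to a pulse interval in us.
    512 steps of 25 ns span 12.8 us on top of the 33.6 us minimum."""
    assert 0 <= data_6bit < 64 and 0 <= random_3bit < 8
    step = (data_6bit << 3) | random_3bit  # 9-bit symbol, 0..511
    return MIN_INTERVAL_US + step * BASE_NS / 1000

def decode_interval(interval_us: float) -> int:
    """Recover the 6 data bits; the 3 statistical bits are discarded."""
    step = round((interval_us - MIN_INTERVAL_US) * 1000 / BASE_NS)
    return step >> 3

for data in (0, 42, 63):
    for rnd in (0, 5, 7):
        assert decode_interval(encode_interval(data, rnd)) == data

# 6 bits per 40 us average interval -> net data rate of 150 kbit/s:
print(6 / 40e-6 / 1000)  # 150.0
```

Placing the statistical bits in the least significant positions means the random jitter only shifts the interval by at most 7 × 25 ns, while the data bits select coarser, well-separated interval steps.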

In the random frequency modulation method (RFM), the frequency can be varied, for example, from 10 MHz to 100 MHz within 40 μs. In the random modulation method without data transmission, a plurality of frequencies is selected statistically at random from the frequency band; these are then modulated in succession and thus produce the frequency deviation that is relevant for the measurement. For transmitting data, the frequency selection is no longer purely random but contains at least one systematic component. For example, frequencies may be synthesized in frequency steps of 10 kHz from the band of 10 to 100 MHz, giving 9000 different frequencies. Of these, 512 can in turn be used as the significant frequencies, so that a frequency band of approximately 175 kHz results for each piece of information. Typical FM receivers can distinguish frequencies 50 kHz apart without problems, so that the transmitted information can easily be decoded if a frequency separation of at least 50 kHz is maintained. This still leaves 125 kHz, or ±62.5 kHz, for the random variation used to reduce the effect of interference.
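A corresponding sketch for the RFM variant maps a 9-bit symbol to one of 512 frequency bands of roughly 175 kHz within the 10 to 100 MHz band. The function names and the exact band mapping are illustrative assumptions; the optional ±62.5 kHz random offset mentioned in the text is omitted here for determinism.

```python
BAND_LO_MHZ = 10.0
STEP_KHZ = 175.0  # approximate band per symbol: 90 MHz / 512 bands

def symbol_to_frequency(symbol: int) -> float:
    """Map a 9-bit symbol to the centre of its ~175 kHz band, in MHz.
    A random offset of up to +/-62.5 kHz could be added within the band."""
    assert 0 <= symbol < 512
    return BAND_LO_MHZ + (symbol + 0.5) * STEP_KHZ / 1000

def frequency_to_symbol(freq_mhz: float) -> int:
    """Recover the symbol from the received frequency."""
    return int((freq_mhz - BAND_LO_MHZ) * 1000 / STEP_KHZ)

for s in (0, 17, 255, 511):
    assert frequency_to_symbol(symbol_to_frequency(s)) == s
print(symbol_to_frequency(511))  # highest band centre, just below 100 MHz
```

Because the band centres are at least 175 kHz apart, a receiver that resolves 50 kHz can decode the symbol even when the random in-band offset is applied.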

Fig. 23 shows a schematic section of an exemplary sensor device according to the invention, which carries out the method according to an embodiment of the invention. It shows a sensor 291 comprising a plurality of individual sensor elements 300, each of which provides a measurement signal in the form of an intensity value derived from the received light.

The sensor elements 300 are arranged in a matrix. There are thus row and column directions, indicated by a coordinate system in which the rows are denoted by Z and the columns by S in fig. 23. A segment of three rows and three columns is shown by way of example; a much larger number of each can also be provided (see the examples above). The position of a pixel or sensor element 300 can be indicated by row and column coordinates, as shown by way of example in fig. 23.

Each row Z and each column S forms a strip-shaped region of the type described herein, so that distance histograms are formed row-wise and column-wise accordingly. Furthermore, each sensor element 300 provides a measurement signal that can be read out both in the row direction and in the column direction in the manner described below, in particular so that the row-wise and column-wise distance histograms can be formed.
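The row-wise and column-wise formation of distance histograms can be sketched as follows (a minimal sketch with assumed array shapes and variable names; in the device, the per-pixel intensities per distance bin result from the time-of-flight measurement):

```python
import numpy as np

# Sketch: per-pixel intensities, resolved into time-of-flight distance
# bins (hypothetical data layout; 3x3 matrix as in fig. 23).
n_rows, n_cols, n_bins = 3, 3, 64
intensity = np.random.rand(n_rows, n_cols, n_bins)

# Each row Z and each column S is a strip-shaped region: summing the
# pixel signals along the other axis yields one distance histogram
# per row and one per column.
row_histograms = intensity.sum(axis=1)   # shape (n_rows, n_bins)
col_histograms = intensity.sum(axis=0)   # shape (n_cols, n_bins)

# A peak at the same distance bin d in the histogram of row r and in
# the histogram of column c localizes a reflection at pixel (r, c),
# i.e. at the intersection of the two strip-shaped regions.
```

This mirrors the intersection principle of the claims: distance information is assigned to the crossing region of a row strip and a column strip.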

The sensor elements 300 are of identical construction. For clarity, however, the components of the sensor elements 300 explained below are not provided with a separate reference numeral for each sensor element 300.

Each sensor element 300 comprises a photodetector element 302 in the form of an SiPM 302. The SiPM 302 generates an electrical measurement signal as a function of the received light or the received light intensity. For each sensor element 300, the measurement signal is to be taken into account both row-wise and column-wise (i.e. in order to enable the formation of row-wise and column-wise distance histograms). To this end, each SiPM 302 is connected to one of the row lines 304-308 and to one of the column lines 310-314.

The signals applied to the row lines 304-308 and the column lines 310-314 can be used to form the row-wise and column-wise distance histograms.

The connection to the row lines 304-308 and to the column lines 310-314 is provided via a current mirror 316, which, in the example shown, is formed by a conventional circuit of two semiconductor transistors 318. The column lines 310-314 are each fed via one of the semiconductor transistors 318 and the row lines 304-308 via the other.

This configuration is a simple and reliable variant for achieving the desired simultaneous determination of the light received by the sensor elements 300 in the strip-shaped regions, or more precisely in the individual rows Z and columns S.

In fig. 24, a schematic section of a sensor device 290 according to another embodiment of the invention is shown. Again, sensor elements 300 are shown in a matrix-shaped arrangement (not all sensor elements are provided with corresponding reference numerals), whereas their internal structure is only roughly sketched, and only in selected, schematically outlined partial regions 319.

It can be seen that each sensor element 300 comprises a plurality of photodetector elements 320, 340 that are combined into two groups. More precisely, a first group of photodetector elements 320, shown light, and a second group of photodetector elements 340, shown dark, are provided, wherein each group may comprise, for example, sixteen individual photodetector elements 320, 340. The photodetector elements 320, 340 may be constructed in the form of the aforementioned SPADs, which are combined into or comprised by one or more SiPMs.

It can be seen that the photodetector elements 320, 340 of the respective groups are arranged in a checkerboard pattern, such that two photodetector elements 320, 340 of one group that are adjacent in the column or row direction enclose a photodetector element of the other group between them. The photodetector elements 320, 340 of the two groups thus alternate in the column and row directions.

The photodetector elements 320 of one group provide measurement signals that are applied to a row line (see the exemplarily labeled row line 304). The photodetector elements 340 of the other group provide measurement signals that are applied to a column line (see the exemplarily labeled column line 310).
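The checkerboard assignment of the photodetector elements within a sensor element to the row group and the column group can be sketched as follows (a minimal sketch; the function name and the assignment of position (0, 0) to the row group are assumptions):

```python
# Sketch: checkerboard assignment of the photodetector elements inside
# one sensor element to the row-line group (320) and the column-line
# group (340), based on their local (r, c) position.
def group_of(r: int, c: int) -> str:
    # Elements alternate in both directions like the colors of a
    # checkerboard; (0, 0) is arbitrarily assigned to the row group.
    return "row" if (r + c) % 2 == 0 else "column"
```

With this assignment, every second element in each local row and column feeds the row line, and the remaining elements feed the column line, so both readout directions sample the same sensor element simultaneously.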

This ensures that, without additional hardware components in the form of current mirrors or other amplification circuits (although such amplification circuits may optionally likewise be provided), the measurement signals of the sensor elements 300 can be read out not only row-wise but also column-wise, and in particular simultaneously.

It is also illustrated again in fig. 24 that measurements are taken for each row Z (illustratively denoted Z1 to Zn) and each column S (illustratively denoted S1 to Sm) and that, in particular, the column-wise and row-wise distance histograms are formed as described.

Finally, fig. 24 also outlines that the sensor device 290 can be subdivided into a sensor 291, which comprises the sensor elements 300 and thus the units providing the measurement signals, and a processing unit 292, shown by way of example, which determines the column-wise and row-wise distance histograms; its functionality has been explained in connection with the preceding figures and it can be constructed substantially as described there.

For the sake of completeness, it should be mentioned that the outermost sensor elements 300, i.e. those in the rows P1,1 to P1,m and Pn,1 to Pn,m and in the columns P1,1 to Pn,1 and P1,m to Pn,m (and optionally also further rows and columns adjacent thereto), form edge regions in which, instead of a simultaneous detection or triggering of the corresponding sensor elements 300, a sequential detection or triggering is also possible. This is based on the idea that relevant objects are detected less frequently in these regions than, for example, in the central region of the matrix-shaped arrangement of sensor elements.

List of reference numerals

10 vehicle

11 light source

12 optical sensor

13 processing unit

14 driver assistance system

15 light

16 reflected light

17 object

18 distance

20 method

21-25 steps

30 method

31-35 steps

40 light emitting diode light source

41 light emitting diode

42 switching element

43 energy storage element

44 modulated signal

45 ground connection terminal

46 power supply connection terminal

47 first connection end

48 second connection end

50-52 connection

54 outer casing

60 method

61-67 steps

71, 72 detection area

81-84 detection area

121-127 illumination area

131-136 detection region

151 lamp bank

170 method

171-174 steps

181 lanes

182 vehicle

191 first distance histogram

201 second distance histogram

210 vehicle

211 infrastructure object

212 light propagation arrow

213 sending data

220 method

221-226 step

291 sensor

292 processing unit

300 sensor element

S column

Z row

302 SiPM

320, 340 photodetector elements

304-308 row lines

310-314 column line

316 current mirror

318 semiconductor transistor

319 partial region
