Distance measuring system and calibration method of distance measuring sensor


Abstract (created by 林久纮 and 市川纪元, 2021-04-09): The invention provides a distance measuring system and a calibration method for its distance measuring sensors. The workload of the operator performing calibration work on the distance measuring sensors can be reduced, and calibration can be performed easily regardless of the measurement environment. A distance measuring system in which a plurality of distance measuring sensors (1a, 1b) are provided to generate a distance image of an object in a measurement area includes a cooperative processing device (2) that performs alignment between the distance measuring sensors and synthesizes the distance data from the plurality of sensors to display the distance image of the object. To align the distance measuring sensors, the cooperative processing device (2) uses the plurality of sensors to acquire the movement line, that is, the trajectory of a person (9) moving in the measurement area, and calibrates the installation position of each sensor so that the movement lines acquired by the individual sensors coincide in a common coordinate system.

1. A distance measuring system in which a plurality of distance measuring sensors are provided to generate a distance image of an object in a measurement area, wherein

the distance measuring sensors measure the distance to the object based on the propagation time of light,

the distance measuring system comprises a cooperative processing device that performs alignment between the distance measuring sensors, synthesizes the distance data from the plurality of distance measuring sensors, and displays the distance image of the object, and

in order to align the distance measuring sensors, the cooperative processing device acquires, by the plurality of distance measuring sensors, a movement line, that is, the trajectory of a person moving in the measurement area, and calibrates the installation position of each distance measuring sensor so that the movement lines acquired by the individual sensors coincide in a common coordinate system.

2. The distance measuring system according to claim 1, wherein

the cooperative processing device includes:

a coordinate conversion unit that converts the distance data from the plurality of distance measuring sensors into position data in a common coordinate system using sensor setting information of the plurality of distance measuring sensors;

an image synthesis unit that synthesizes the converted position data to generate a single distance image;

a display unit that displays the synthesized distance image;

a person detection unit that detects, from the distance data input from the plurality of distance measuring sensors, a movement line of a person that is effective for calibration; and

a calibration unit that corrects the sensor setting information used by the coordinate conversion unit based on the distance image obtained by synthesizing the movement lines of the person.

3. The distance measuring system according to claim 2, wherein

the person detection unit acquires height information of a person as accompanying information of the detected person, and

when aligning the movement lines of the person acquired by the plurality of distance measuring sensors, the calibration unit calculates the similarity of the movement lines and, with reference to the height information acquired by the person detection unit and the time information at which the distance data was acquired, performs the alignment between movement lines whose height information or time information matches.

4. The distance measuring system according to claim 2, wherein

the person detection unit acquires the distance from the distance measuring sensor to a person as accompanying information of the detected person, and evaluates the reliability of the movement line of the detected person based on that distance, and

the calibration unit performs the alignment between the movement lines evaluated as highly reliable by the person detection unit.

5. The distance measuring system according to claim 2, wherein

the person detection unit acquires the amount of the point group included in the person region as accompanying information of the detected person, and evaluates the reliability of the movement line of the detected person based on that amount, and

the calibration unit performs the alignment between the movement lines evaluated as highly reliable by the person detection unit.

6. The distance measuring system according to claim 2, wherein

the person detection unit acquires the detection direction of a person within the angle of view as accompanying information of the detected person, and evaluates the reliability of the movement line of the detected person based on that detection direction, and

the calibration unit performs the alignment between the movement lines evaluated as highly reliable by the person detection unit.

7. The distance measuring system according to claim 2, wherein

the person detection unit acquires the presence or absence of an obstacle in front of a person as accompanying information of the detected person, and evaluates the reliability of the movement line of the detected person based on that presence or absence, and

the calibration unit performs the alignment between the movement lines evaluated as highly reliable by the person detection unit.

8. The distance measuring system according to any one of claims 4 to 7, wherein

the display unit displays each movement line with a density or color that varies according to the reliability of the movement line evaluated by the person detection unit.

9. The distance measuring system according to claim 2, further comprising

a user adjustment unit that allows a user to select a movement line effective for calibration from among the movement lines of the person detected by the person detection unit, and to finely adjust the sensor setting information corrected by the calibration unit.

10. A calibration method for distance measuring sensors in which a plurality of distance measuring sensors are provided to generate a distance image of an object in a measurement area, characterized in that

the distance measuring sensors measure the distance to the object based on the propagation time of light, and

the calibration method comprises the steps of:

detecting, by the plurality of distance measuring sensors, a person moving in the measurement area and acquiring a movement line, which is the trajectory of the person, in order to align the distance measuring sensors; and

calibrating the sensor installation information of each distance measuring sensor so that the movement lines acquired by the individual sensors coincide in a common coordinate system.

11. The calibration method for distance measuring sensors according to claim 10, wherein

in the step of acquiring the movement line, height information of the person is acquired as accompanying information of the detected person, and

in the step of performing the calibration, the similarity of the movement lines acquired by the plurality of distance measuring sensors is calculated and, with reference to the height information of the detected person and the time information at which the distance was measured, the alignment is performed between movement lines whose height information or time information matches.

12. The calibration method for distance measuring sensors according to claim 10, wherein

in the step of acquiring the movement line, the distance from the distance measuring sensor to the person is acquired as accompanying information of the detected person, and the reliability of the movement line of the detected person is evaluated based on that distance, and

in the step of performing the calibration, the alignment is performed between the movement lines evaluated as highly reliable in the acquiring step.

13. The calibration method for distance measuring sensors according to claim 10, wherein

in the step of acquiring the movement line, the amount of the point group included in the person region is acquired as accompanying information of the detected person, and the reliability of the movement line of the detected person is evaluated based on that amount, and

in the step of performing the calibration, the alignment is performed between the movement lines evaluated as highly reliable in the acquiring step.

14. The calibration method for distance measuring sensors according to claim 10, wherein

in the step of acquiring the movement line, the detection direction of the person within the angle of view is acquired as accompanying information of the detected person, and the reliability of the movement line of the detected person is evaluated based on that detection direction, and

in the step of performing the calibration, the alignment is performed between the movement lines evaluated as highly reliable in the acquiring step.

15. The calibration method for distance measuring sensors according to claim 10, wherein

in the step of acquiring the movement line, the presence or absence of an obstacle in front of the person is acquired as accompanying information of the detected person, and the reliability of the movement line of the detected person is evaluated based on that presence or absence, and

in the step of performing the calibration, the alignment is performed between the movement lines evaluated as highly reliable in the acquiring step.

Technical Field

The present invention relates to a distance measuring system that measures the distance to an object using a plurality of distance measuring sensors, and to a method for calibrating the distance measuring sensors.

Background

A distance measuring sensor (hereinafter also referred to as a TOF sensor) is known that measures the distance to an object based on the propagation time of light (hereinafter also referred to as the TOF method). A movement path can be obtained by, for example, detecting a person from feature amounts of the distance data acquired by the TOF sensor and tracking the temporal change of the detected person. The principle of the TOF sensor is to calculate the distance to an object by measuring the time from when irradiation light emitted from a light source is reflected by the object until it returns to a light receiving unit. Since the distance and the angle of view that a single TOF sensor can measure are limited, a plurality of sensors are arranged when measuring a large space.

In this regard, for example, the distance image camera described in patent document 1 is provided with a plurality of camera units (TOF sensors), giving it an angle of view larger than that of a single imaging unit and a distance image with high distance accuracy. Its structure is described as "having: a two-dimensional position correction unit that corrects two-dimensional position information of each pixel based on the average distance information obtained by the distance information replacement unit and the two-dimensional pixel position of each pixel of each distance image; and a distance image synthesizing unit that obtains a synthesized distance image in which the distance images are combined by converting the two-dimensional position information and the distance information of each pixel corrected by the two-dimensional position correction unit into a common three-dimensional coordinate system."

Patent document 1 describes that, when the distance images of the camera units (TOF sensors) are coordinate-converted and combined, "each distance image is combined by converting the X, Y, and Z values of each pixel of each distance image into a camera coordinate system or a world coordinate system based on camera parameters (internal and external) obtained by calibration at the time of installation of each camera unit 10". As a general method of such calibration, it is known to arrange a specific object (marker) in the measurement space, measure the position of the marker with each camera unit (TOF sensor), and perform coordinate conversion so that a common coordinate value is obtained. In reality, however, it is sometimes difficult to arrange the markers appropriately.

For example, reflective tape made of a retro-reflective material is known to be used as the calibration marker, but this tape must be attached to the floor of the measurement site. As the number of TOF sensors increases, so does the operator's workload. Furthermore, depending on the measurement environment, the floor surface may have irregularities or obstacles, making it difficult to attach the reflective tape at the desired positions.

Moreover, the technique described in patent document 1 combines the distance images of a plurality of camera units, but each camera unit is installed in the same direction as viewed from an object (a box) whose surface is perpendicular to the irradiation direction of each camera unit. The image synthesis is therefore limited to that positional relationship, and so is the required calibration.

Patent document 1: Japanese Patent Laid-Open No. 2012-247226

Disclosure of Invention

An object of the present invention is to provide a distance measuring system and a calibration method that reduce the operator's workload in calibrating the distance measuring sensors and that allow calibration to be performed easily regardless of the measurement environment.

The present invention provides a distance measuring system in which a plurality of distance measuring sensors are provided to generate a distance image of an object in a measurement area, the system including a cooperative processing device that performs alignment between the distance measuring sensors, synthesizes the distance data from the plurality of sensors, and displays the distance image of the object. In order to align the distance measuring sensors, the cooperative processing device acquires, by the plurality of distance measuring sensors, the trajectory of a person moving in the measurement area (hereinafter referred to as a "movement line"), and calibrates the installation position of each distance measuring sensor so that the movement lines acquired by the individual sensors coincide in a common coordinate system.

The present invention further provides a method for calibrating distance measuring sensors when a plurality of distance measuring sensors are provided to generate a distance image of an object in a measurement area, the method including: detecting a person moving in the measurement area by the plurality of distance measuring sensors and acquiring the person's trajectory (movement line) in order to align the distance measuring sensors; and calibrating the sensor installation information of each distance measuring sensor so that the movement lines acquired by the individual sensors coincide in a common coordinate system.

The present invention provides the following effects: the operator's workload for the calibration work of the distance measuring sensors is reduced, and calibration can be performed easily regardless of the measurement environment.

Drawings

Fig. 1 is a diagram showing a configuration of a distance measuring system according to the present embodiment.

Fig. 2 is a diagram showing a configuration of a distance measuring sensor (TOF sensor).

Fig. 3 is a diagram illustrating the principle of distance measurement by the TOF method.

Fig. 4 is a diagram showing a configuration of the cooperative processing device.

Fig. 5A is a diagram illustrating a calibration method using reflective tape.

Fig. 5B is a diagram illustrating a calibration method using reflective tape.

Fig. 6A is a diagram illustrating a calibration method using movement line data.

Fig. 6B is a diagram illustrating a calibration method using movement line data.

Fig. 7 is a diagram showing the evaluation of the reliability of movement line data and an example of its display.

Fig. 8 is a flowchart showing the procedure of the calibration process.

Description of reference numerals

1, 1a, 1b: distance measuring sensors (TOF sensors),

2: cooperative processing device,

3: network,

4: floor surface,

8: reflective tape,

9: object (person),

9a, 9b, 91 to 94: movement lines,

11: light emitting unit,

12: light receiving unit,

13: light emission control unit,

14: distance calculation unit,

21: data input unit,

22: coordinate conversion unit,

23: image synthesis unit,

24: display unit,

25: person detection unit,

26: calibration unit.

Detailed Description

Hereinafter, embodiments of the present invention will be described. In the calibration of the distance measuring sensors according to the present embodiment, trajectory data (movement line data) of a person moving in the measurement space is acquired by each distance measuring sensor, and alignment between the sensors (correction of installation position information) is performed so that the movement line data acquired by the individual sensors coincides in a common coordinate system.

Fig. 1 is a diagram showing the configuration of the distance measuring system according to the present embodiment. The distance measuring system comprises a plurality of distance measuring sensors (hereinafter also "TOF sensors" or simply "sensors") 1a and 1b and a cooperative processing device 2 that controls them, connected via a network 3. The cooperative processing device 2 synthesizes the distance data acquired by the sensors 1 to generate a single distance image, and performs the calibration process that corrects the position information of the sensors 1. A personal computer (PC) or a server, for example, is used as the cooperative processing device 2.

In the example shown in fig. 1, two sensors 1a and 1b are attached to a ceiling 5, and the distance to an object 9 (here, a person) on a floor 4 is measured to generate a distance image as the movement trajectory (movement line) of the person 9. Since the distance and angle of view that a single sensor can measure are limited, arranging a plurality of sensors not only enlarges the measurement area but also allows the position of the object 9 to be measured with high accuracy. To achieve this, the coordinate conversion of each sensor's measurement values must be performed with high accuracy, which is why calibration between the sensors is required.

Fig. 2 is a diagram showing the structure of the distance measuring sensor (TOF sensor) 1. The distance measuring sensor 1 includes: a light emitting unit 11 that emits infrared pulsed light from a light source such as a laser diode (LD) or a light emitting diode (LED); a light receiving unit 12 that receives the pulsed light reflected from the object with a CCD sensor, CMOS sensor, or the like; a light emission control unit 13 that controls the on/off state and emission amount of the light emitting unit 11; and a distance calculation unit 14 that calculates the distance to the object from the detection signal (light reception data) of the light receiving unit 12. The distance data calculated by the distance calculation unit 14 is transmitted to the cooperative processing device 2. The light emission control unit 13 of the distance measuring sensor 1 starts light emission in response to a measurement command signal from the cooperative processing device 2.

Fig. 3 is a diagram illustrating the principle of distance measurement by the TOF method. The distance measuring sensor (TOF sensor) 1 emits irradiation light 31 for distance measurement from the light emitting unit 11 toward the object 9 (e.g., a person). The light receiving unit 12 receives the reflected light 32 from the object 9 with a two-dimensional sensor 12a. The two-dimensional sensor 12a consists of a two-dimensional array of pixels such as CCD elements, and the distance calculation unit 14 calculates two-dimensional distance data from the light reception data of each pixel.

The object 9 is located at a distance D from the light emitting unit 11 and the light receiving unit 12. With the speed of light c and the time difference t between the emission of the irradiation light 31 from the light emitting unit 11 and the reception of the reflected light 32 by the light receiving unit 12, the distance D to the object 9 is given by D = c × t / 2. In practice, rather than measuring the time difference t directly, the distance calculation unit 14 emits an irradiation pulse of a predetermined width, and the two-dimensional sensor 12a receives it while shifting the timing of an exposure gate. The distance D is then calculated from the amounts of light received (accumulated) at the different gate timings (exposure gate method).
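As a concrete illustration of these relations, the following Python sketch computes D = c × t / 2 and one common two-gate formulation of the exposure gate method; the pulse width and the gate accumulations s1 and s2 are hypothetical illustration values, not quantities specified in this document.

C = 299_792_458.0  # speed of light c [m/s]

def distance_from_time(t: float) -> float:
    """D = c * t / 2: one-way distance from the round-trip time t [s]."""
    return C * t / 2.0

def distance_exposure_gate(s1: float, s2: float, pulse_width: float) -> float:
    """One common two-gate formulation of the exposure gate method: gate 1
    opens with the emitted pulse and gate 2 immediately after it, so the
    fraction of reflected light accumulated in gate 2 encodes the delay."""
    return (C * pulse_width / 2.0) * (s2 / (s1 + s2))

print(distance_from_time(20e-9))                # 20 ns round trip -> about 3.0 m
print(distance_exposure_gate(3.0, 1.0, 30e-9))  # delay of Tp/4 -> about 1.12 m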

Fig. 4 is a diagram showing the configuration of the cooperative processing device 2. The cooperative processing device 2 includes: a data input unit 21 that receives the distance data from the distance measuring sensors 1a and 1b; a coordinate conversion unit 22 that converts each input distance data item into position data in a common coordinate system; an image synthesis unit 23 that synthesizes the position data to generate a single distance image; and a display unit 24 that displays the synthesized distance image. For calibration of the sensors 1a and 1b, the cooperative processing device 2 further includes: a person detection unit 25 that detects a person's movement line effective for calibration from the input distance data of each sensor; and a calibration unit 26 that corrects the conversion parameters (sensor setting information) used by the coordinate conversion unit 22 based on the synthesized image. A transmission unit, not shown, transmits a measurement command signal to each of the sensors 1a and 1b.

The cooperative processing device 2 realizes the above functions by storing programs for arithmetic processing such as coordinate conversion, image synthesis, and calibration in ROM and loading them into RAM for execution by a CPU (not shown). The person detection process and the calibration process may be adjusted as appropriate by an operator (user) via a user adjustment unit (not shown) while observing the movement line image displayed on the display unit 24.

Next, the calibration method will be described. In the present embodiment, the movement line of a person is used as the measurement object (marker) for the calibration process, but for comparison, a method using reflective tape will be described first.

Fig. 5A and 5B are diagrams illustrating a calibration method using reflective tape.

Fig. 5A (1) shows a state in which the reflective tape 8 is arranged (attached) on the floor surface 4 of the measurement space. The sensors 1a and 1b are located at horizontal positions (x1, y1) and (x2, y2) in the measurement space (expressed in xyz coordinates). The installation heights z of the two are assumed equal for simplicity, but a difference can be corrected by calculation. The azimuth angles of the measurement directions (the center directions of the angles of view) of the sensors 1a and 1b are denoted θ1 and θ2. The pitch angles of the two are likewise assumed equal, but a difference can be corrected by calculation. The reflective tape 8 is made of a retro-reflective material, which reflects incident light back in the incident direction, and is attached to the floor 4 in, for example, a cross shape.

Fig. 5A (2) shows the state in which the sensors 1a and 1b measure the distance to the reflective tape 8. The position of the tape 8 measured by the sensor 1a is denoted 8a, and the position measured by the sensor 1b is denoted 8b (drawn with a double line for distinction). The measurement positions 8a and 8b are obtained by coordinate-converting the distance data from each sensor using the installation positions (x1, y1), (x2, y2) and the azimuth angles θ1, θ2, and are displayed in the common coordinate system; in other words, they are virtual measurement images of the reflective tape 8. Although the same tape 8 is measured, the measurement positions (measurement images) may fail to coincide, as with 8a and 8b. This is because the installation positions (x1, y1), (x2, y2) and azimuth angles θ1, θ2 of the sensors contain errors. If the installation height or pitch angle information also contains errors, the measurement positions 8a and 8b will not coincide with the floor surface 4.
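The conversion just described can be sketched in Python as follows, under the planar assumption used in the figure (equal installation heights, so that only the position (x, y) and azimuth θ matter). All numbers, including the pose error of sensor 1b, are illustrative, not values from this document.

import numpy as np

def local_to_common(pts_local, x, y, theta):
    """Sensor-local XY points -> common frame, using the installation
    position (x, y) and azimuth theta (planar case of the figure)."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    return pts_local @ R.T + np.array([x, y])

def common_to_local(pts, x, y, theta):
    """Inverse conversion; used here only to simulate a measurement."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    return (pts - np.array([x, y])) @ R

# Cross-shaped tape on the floor (true positions, illustrative numbers).
tape = np.array([[3.0, 2.0], [3.5, 2.0], [3.25, 1.75], [3.25, 2.25]])
true_pose_b = (5.0, 0.0, np.deg2rad(135))     # sensor 1b's actual pose
assumed_pose_b = (5.2, 0.1, np.deg2rad(130))  # erroneous setting information
measured_b = common_to_local(tape, *true_pose_b)       # what 1b measures
img_8b = local_to_common(measured_b, *assumed_pose_b)  # where 8b is drawn
print(img_8b - tape)  # nonzero residual: the mismatch of Fig. 5A (2)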

In the calibration process, the installation position and azimuth angle information of the sensors is corrected so that the measurement positions 8a and 8b of the reflective tape 8 coincide. Coordinate conversion is then performed with the corrected setting information, the virtual measurement images are displayed again, and this is repeated until the images match. The calibration procedure is as follows.

Fig. 5B (1) shows the state after viewpoint conversion: the measurement positions (measurement images) 8a and 8b on the xy plane, with the measurement space viewed from the z direction (directly above). The positions and directions of the two deviate from each other.

Fig. 5B (2) shows the state in which the azimuth angle information of the sensor is corrected by rotation in order to align the measurement positions 8a and 8b. Here, the azimuth angle θ1 of the sensor 1a is held fixed, and the directions of the measurement positions 8a and 8b (the directions of the crosses) are matched by correcting the azimuth angle information of the sensor 1b from θ2 to θ2'.

Fig. 5B (3) shows the state in which the position information of the sensor is corrected by translation in order to align the measurement positions 8a and 8b. The position (x1, y1) of the sensor 1a is held fixed, and the position information of the sensor 1b is corrected from (x2, y2) to (x2', y2') so that the measurement positions 8a and 8b coincide.
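Under the same planar assumption, this two-step correction can be sketched as follows. The function works on matched points of the two images in the common frame; estimating the rotation from mean heading differences of matched segments is one simple formulation chosen here for illustration (it ignores angle wrap-around), not the specific method prescribed by this document.

import numpy as np

def rotate_then_translate(img_a, img_b, sensor_b_pos):
    """Fig. 5B style correction with sensor 1a held fixed: estimate the
    azimuth correction of sensor 1b from the mean heading difference of
    matched segments, then the position correction from the mean residual.
    An azimuth error rotates 1b's image about the sensor's own position."""
    da, db = np.diff(img_a, axis=0), np.diff(img_b, axis=0)
    dtheta = np.mean(np.arctan2(da[:, 1], da[:, 0])
                     - np.arctan2(db[:, 1], db[:, 0]))
    c, s = np.cos(dtheta), np.sin(dtheta)
    R = np.array([[c, -s], [s, c]])
    pos = np.asarray(sensor_b_pos)
    rotated = (img_b - pos) @ R.T + pos
    shift = (img_a - rotated).mean(axis=0)
    # Corrections: theta2 -> theta2 + dtheta, (x2, y2) -> (x2, y2) + shift
    return dtheta, shift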

The calibration method using the reflective tape 8 described above requires the work of attaching the tape, which serves as the marker, at the measurement site. As the number of sensors increases, so does the burden of this attachment work, and depending on the measurement environment the floor surface may be uneven or obstructed, making attachment difficult. The present embodiment is therefore characterized by using the movement line data of a moving person instead of the reflective tape. A calibration method using movement line data is described below.

Fig. 6A and 6B are diagrams illustrating a calibration method using movement line data.

Fig. 6A (1) shows a state in which a person 9 moves on the floor 4 of the measurement space. The installation positions (x1, y1), (x2, y2) and measurement directions (azimuth angles) θ1, θ2 of the sensors 1a and 1b are the same as in fig. 5A. From time t0 to t2, the person 9 moves across the floor 4 as shown by the dashed line.

Fig. 6A (2) shows the state in which the sensors 1a and 1b measure the distance to the person 9. As the distance to the person 9, for example, the head is extracted from the person image and the distance data to the head is used. Data of the movement trajectory (movement line) of the person 9 moving on the floor 4 is thus acquired. The movement line of the person 9 measured by the sensor 1a is denoted 9a, and that measured by the sensor 1b is denoted 9b (drawn with a double line for distinction). Here too, the movement lines 9a and 9b are displayed in the common coordinate system as virtual measurement images of the movement line of the person 9, by coordinate-converting the distance data from each sensor using the installation positions (x1, y1), (x2, y2) and azimuth angles θ1, θ2. Although the same person 9 is moving, the movement lines (measurement images) may fail to coincide, as with 9a and 9b. This is because the installation positions (x1, y1), (x2, y2) and azimuth angles θ1, θ2 of the sensors contain errors. If the installation height or pitch angle information also contains errors, the movement lines 9a and 9b will not coincide in the height direction either.
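A movement line acquired this way is simply a timestamped sequence of head positions together with accompanying information. The following Python sketch shows one possible container for it; all class and field names are hypothetical, chosen only for illustration.

from dataclasses import dataclass, field
from typing import List

@dataclass
class LinePoint:
    """One sample of a movement line: the head position at a given time."""
    t: float  # acquisition time [s]
    x: float  # head position in the common frame [m]
    y: float
    z: float  # head height above the floor [m]

@dataclass
class MovementLine:
    """A movement line plus the accompanying information used later for
    matching and reliability evaluation (all names are hypothetical)."""
    sensor_id: str
    points: List[LinePoint] = field(default_factory=list)

    def height(self) -> float:
        """Median head height, used to match the same person across sensors."""
        zs = sorted(p.z for p in self.points)
        return zs[len(zs) // 2]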

In the calibration process, the installation position and azimuth angle information of the sensors is corrected so that the movement lines 9a and 9b of the person 9 coincide. Coordinate conversion is then performed with the corrected setting information, the movement lines are displayed again, and this is repeated until they match. The calibration procedure is as follows.

Fig. 6B (1) shows the state after viewpoint conversion: the movement lines 9a and 9b on the xy plane, with the measurement space viewed from the z direction (directly above). The positions and directions of the two deviate from each other. In this example, the start time t0 of the movement line 9a differs from the start time t1 of the movement line 9b, so the lengths of the movement lines also differ.

Fig. 6B (2) shows the state in which the azimuth angle information of the sensor is corrected by rotation in order to align the movement lines 9a and 9b. Here, the azimuth angle θ1 of the sensor 1a is held fixed, and the directions of the movement lines 9a and 9b are matched by correcting the azimuth angle information of the sensor 1b from θ2 to θ2'. At this point, the time information is consulted, and the correction is made so that the sections common to both movement lines (the section from time t1 to time t2) become parallel.

Fig. 6B (3) shows the state in which the position information of the sensor is corrected by translation in order to align the movement lines 9a and 9b. The position (x1, y1) of the sensor 1a is held fixed, and the position information of the sensor 1b is corrected from (x2, y2) to (x2', y2') so that the movement lines 9a and 9b coincide. In this example, the sections of the movement lines 9a and 9b from time t1 to time t2 are brought into agreement.
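The use of the time information can be sketched as follows: the two timestamped movement lines are resampled on a shared time grid over their overlapping interval (t1 to t2 in the figure), so that equal-time points correspond, after which a rotate-then-translate correction such as the one sketched after Fig. 5B can be applied to the common sections. The sampling step of 0.1 s is an arbitrary illustration value.

import numpy as np

def time_matched_sections(t_a, line_a, t_b, line_b, step=0.1):
    """Resample two timestamped movement lines (arrays of shape (N, 2))
    on a shared time grid over their overlapping interval, so that
    equal-time points correspond between the two lines."""
    lo, hi = max(t_a[0], t_b[0]), min(t_a[-1], t_b[-1])
    grid = np.arange(lo, hi + 1e-9, step)
    def resample(t, line):
        return np.stack([np.interp(grid, t, line[:, k]) for k in (0, 1)],
                        axis=1)
    return grid, resample(t_a, line_a), resample(t_b, line_b)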

As described above, the present embodiment performs calibration using movement line data, that is, the movement trajectory of a person, so the operator does not need to attach reflective tape to the floor as in the comparative example. The operator's workload for the calibration work is therefore reduced, and calibration can be performed easily regardless of the measurement environment. Moreover, movement line data of various shapes can easily be obtained for calibration, so improved calibration accuracy can be expected.

In addition, since the movement line data in the present embodiment is based on the person's head, calibration can be performed at the height of the head. Compared with calibration on a floor surface with attached reflective tape as in the comparative example, this is better suited to calibration when a person is the measurement target, and improved accuracy can be expected.

In the present embodiment, a specific person may be asked to walk in order to acquire movement line data, but the movement of arbitrary persons in the measurement space can also be used. It is then necessary to collect various movement line data with the distance measuring sensors and to extract from it the movement line data effective for calibration. Furthermore, since an operator (user) may be the one extracting valid movement line data, the display method of the movement line data must also be considered. In view of this, the present embodiment performs the following processing.

(1) Height information is acquired as accompanying information of the persons detected from the distance data, and movement line data of persons with matching heights is extracted for alignment. This makes it possible to single out the same person for alignment even when a plurality of unspecified persons are moving in the measurement space.

(2) With reference to the time information at which the distance data was acquired, held as accompanying information of the movement line data, the movement lines are aligned so that points on the lines at the same time coincide. Accordingly, when the movement line data is displayed as a moving image, the display timings are synchronized.

(3) The reliability of the movement line data is evaluated, and highly reliable movement line data is extracted. Reliability here means the expected measurement accuracy of the detected person data: it is high when the person is close to the sensor, when the point group is large, or when the detection direction is close to the center of the angle of view. Conversely, the received light intensity of the TOF method decreases with distance from the sensor and toward the edges of the angle of view, lowering the reliability of the measurement values. Reliability also drops when the detected person region becomes small and the point group (the number of detection pixels of the light receiving unit) shrinks, or when an obstacle in front of the person hides part of the movement line (occlusion). Once the reliability of the movement line data has been evaluated, the movement lines are displayed on the display unit 24 in a differentiated manner according to the evaluation: for example, a highly reliable movement line is drawn darker and a less reliable one lighter (the display color may be changed instead).

(4) When the movement line data of the plurality of sensors is displayed on the display unit 24, the display can be switched on and off for each sensor. In addition, movement line data measured multiple times in the past is stored, and desired data can be read out and displayed. Performing the calibration adjustment with data from multiple measurements improves calibration accuracy.

The reliability of the movement line data described in (3) above is explained with reference to the drawings.

Fig. 7 is a diagram showing the evaluation of the reliability of movement line data and an example of its display. Four examples 91 to 94 of movement lines measured by the sensor 1a are shown. Movement line 91 is close to the sensor 1a; movement line 92 is detected at the edge of the angle of view; movement line 93 is far from the sensor 1a; and movement line 94 has an obstacle 95 in front of its path. Compared with movement line 91, movement line 92 receives less light because it lies at the edge of the angle of view, movement line 93 contains fewer points because it is distant, and part of movement line 94 is missing. Therefore, when these movement lines 91 to 94 are displayed, the highly reliable movement line 91 is drawn darker and the less reliable movement lines 92 to 94 lighter. Alternatively, the color of each movement line may be changed according to its reliability. The operator can thus select highly reliable movement lines from among the plurality displayed and use them for calibration.

In addition, when using movement line data, the shape of the movement line should be considered. If a movement line is short, alignment in direction (rotation) becomes difficult, so a certain minimum length is required. If the movement line is straight, alignment in the direction perpendicular to it is well defined, but alignment in the parallel direction becomes ambiguous. A curved movement line is therefore preferable and gives higher reliability.
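The criteria of Fig. 7 and the shape considerations above could be combined into a single score, as in the following sketch. The weights and thresholds are hypothetical illustration values, not parameters given in this document.

import numpy as np

def movement_line_reliability(points, dist_to_sensor, n_points, view_offset,
                              occluded):
    """Combine the Fig. 7 criteria and the shape considerations into one
    score in (0, 1]. points: (N, 2) line in the common frame; view_offset:
    0.0 at the center of the angle of view, 1.0 at its edge."""
    score = 1.0
    score *= 1.0 / (1.0 + dist_to_sensor / 5.0)      # nearer the sensor is better
    score *= min(1.0, n_points / 500.0)              # larger point group is better
    score *= 1.0 - 0.5 * min(1.0, abs(view_offset))  # central detection is better
    if occluded:
        score *= 0.3                                 # part of the line is missing
    seg = np.diff(points, axis=0)
    length = np.linalg.norm(seg, axis=1).sum()
    if length < 2.0:                                 # too short to fix rotation
        score *= 0.2
    chord = np.linalg.norm(points[-1] - points[0])
    straightness = chord / max(length, 1e-9)         # 1.0 = perfectly straight
    score *= 1.0 - 0.5 * straightness                # curved lines score higher
    return score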

Fig. 8 is a flowchart showing the procedure of the calibration process of the present embodiment. The calibration process is carried out by the cooperative processing device 2 issuing commands to each distance measuring sensor. The processing content is described below step by step.

S101: the cooperation processing device 2 sets installation parameters of the respective distance measuring sensors 1. The setting parameters include the sensor installation position (x, y, z) and the measurement direction (azimuth angle) (θ x, θ y, θ z).

S102: in response to an instruction from the cooperative processing apparatus 2, each sensor 1 acquires distance data of the measurement space for a predetermined time and transmits the distance data to the cooperative processing apparatus 2.

S103: the human detection unit 25 of the cooperation processing apparatus 2 detects a human from the received distance data. In person detection, the position of the head of a person is detected by an image recognition technique. As the accompanying information, the time, height, and dot group amount (the number of pixels included in the person region) of the detected person are acquired and held. If a plurality of persons are detected, position information and accompanying information are acquired for each person.

S104: further, the human detection unit 25 evaluates the reliability of the detected human (motion line data). This is to extract the most accurate data to evaluate when used in the calibration process, and to evaluate on the condition that a person is close to the sensor, a person with a large number of point groups, or the detection direction is close to the center of the angle of view.

S105: the coordinate conversion unit 22 converts the position data of the person detected by each sensor into a common coordinate space. In the coordinate conversion, the setting parameters set in S101 are used.

S106: it is determined whether the person data after the coordinate conversion is sufficient. That is, it is determined whether the incidental information (time and height) of the person detected by each sensor matches each other between the sensors. If the data is sufficient, the process proceeds to S107, and if not, the process returns to S102 to acquire the distance data again.

S107: the image synthesizing unit 23 synchronizes the acquisition times of the position data of the person from the sensors after the coordinate conversion in S105, synthesizes the position data into a common coordinate space, and draws the coordinate space on the display unit 24. That is, the action lines acquired by the sensors are displayed. If there are a plurality of detected persons, a plurality of groups of action lines are displayed.

S108: the calibration unit 26 calculates the similarity of the moving lines acquired by the sensors. That is, portions where the shapes (patterns) of the motion lines are similar to each other are extracted. Therefore, the moving line portions from the respective sensors corresponding to the time are compared, and the similarity of the moving lines is obtained by a pattern matching method.

S109: the calibration unit 26 aligns (moves, rotates) the positions of the sensors so that the lines coincide with each other, for a portion where the similarity (correspondence) of the lines is high. That is, the installation parameters of the sensors are corrected to the installation position (x ', y', z ') and the measurement direction (azimuth angle) (θ x', θ y ', θ z'). Here, when there are a plurality of (3 or more) sensors, a sensor to be a reference is determined, and on the other hand, 1 sensor is subjected to position alignment, or the corrected sensors are sequentially subjected to position alignment of other uncorrected sensors.

S110: the calibration result is again subjected to coordinate conversion of the position of the line of action by the coordinate conversion unit 22, and is drawn on the display unit 24. The operator observes the corrected position of the line of action to determine whether the position is sufficient. If the result is sufficient, the calibration process is terminated, and if the result is insufficient, the process returns to S107, and the alignment is repeated.

In the reliability evaluation of S104 and the calibration of S109 in the above flow, the operator may assist via the user adjustment unit while observing the movement lines displayed on the display unit 24. In S104, the operator judges the reliability of the movement lines and selects highly reliable ones, improving the efficiency of the subsequent calibration processing. In the calibration step of S109, the operator manually fine-tunes the installation parameters, further improving the accuracy of the calibration process.

As described above, in the calibration of the distance measuring sensors in the present embodiment, movement line data of a person moving in the measurement space is acquired by each distance measuring sensor, and alignment between the sensors (correction of installation position information) is performed so that the movement line data acquired by the individual sensors coincides in a common coordinate system. This relieves the operator of setting up markers (reflective tape) for the calibration work, and calibration can be performed easily regardless of the measurement environment.
