Three-dimensional imaging system based on TOF

Document No.: 1648919 · Published: 2019-12-24

Reading note: this technology, "A three-dimensional imaging system based on TOF", was designed and created by Wan Yunwu, Jia Renyao, Shi Jiangtao, Chen Xiaodong, Huang Shouqiang, and Ye Song on 2019-10-12. Main content: the invention discloses a TOF (time-of-flight) based three-dimensional imaging system comprising a laser emission module, a photosensitive sensor, an ARM processor module, an FPGA (field-programmable gate array) processing module, a memory module, a gigabit network port, a display module, and a temperature sensor. The active light source emits a beam of infrared light; the optical signal propagates along its emission direction, is reflected back by an obstacle, and is received through the lens, with the influence of ambient light filtered out; the photosensitive sensor senses the position and distance of the target from the amount of returned light. The invention achieves three-dimensional imaging of targets within ten meters with high measurement accuracy; it offers a relatively long measurement range, tolerates a certain amount of ambient light while preserving accuracy, and attains a high frame rate; it is simple, effective, and easy to use.

1. A TOF-based three-dimensional imaging system, characterized by comprising a laser emission module, a photosensitive sensor, an ARM processor module, an FPGA processing module, a memory module, a gigabit network port, a display module, and a temperature sensor;

the laser emission module, the memory module, the FPGA processing module, and the gigabit network port are each connected to the ARM processor module; the gigabit network port is connected to the display module, the laser emission module is connected to the temperature sensor, and the photosensitive sensor is connected to the FPGA processing module;

the ARM processor module controls the laser emission module to emit an optical signal; the optical signal is a 940 nm narrow-band laser, modulated by a 12 MHz sinusoidal signal to obtain a modulated optical signal; a diffuser at the emission end of the light source spreads the energy-concentrated laser, increasing its divergence angle; the laser emission module emits the modulated optical signal toward a target, which reflects it to the photosensitive sensor;

the photosensitive sensor receives the modulated optical signal reflected by the target; before reaching the photosensitive sensor, the signal passes through a narrow-band filter that removes all wavelengths other than 940 nm, so the sensor responds only to 940 nm light, yielding a filtered signal;

the photosensitive sensor converts the received filtered signal into a quantity of electrons, records that quantity as a voltage to obtain a voltage signal, converts the voltage signal into a digital signal through its internal analog-to-digital converter, and transmits the digital signal to the FPGA processing module; each pixel accumulates an electron count S in each of four exposure windows, the phase of successive exposure windows differing by 90°;

the FPGA processing module calculates the phase difference between the emitted light and the reflected light from the electron count accumulated at each pixel, and resolves that phase difference into depth information D and intensity information I;

the FPGA processing module stores the calculated depth information D and intensity information I in the memory module and calibrates the measured depth information;

the calibrated depth information and intensity information are transmitted to the display module through the gigabit network port;

the display module displays the calibrated depth and intensity information, representing it as a distance image, an intensity image, and a point cloud image.

2. The TOF-based three-dimensional imaging system of claim 1, wherein the diffuser is a 60° × 45° diffuser.

3. The TOF-based three-dimensional imaging system of claim 1, wherein the FPGA processing module calculates the phase difference and resolves it into the depth information D and intensity information I as follows:

Step 1: obtain the electron quantities from the four exposures in one period and label them, in order, S0′, S1′, S2′, and S3′;

Step 2: the phase difference φ follows from Equations 2–5, and the arctangent of Equation 6 is evaluated in the FPGA to yield φ. Specifically:

S1: the optical signal received by the lidar is described by Equation 1:

r(t) = A·sin(ωt − φ) + B    (Equation 1)

where A denotes the intensity of the received optical signal, B the background light intensity, ω the angular frequency of the modulation, and φ the phase difference of the received light;

S2: to increase the accuracy of the calculated phase difference φ, 4n exposures are used to compute one distance, where n is a preset value; the phase difference φ of the received light relative to the emitted light, and the intensity A of the received light, are computed from the four sets of integrated electron counts using Equations 2–5;

the formulas are:

S0′ = A·cos φ + B    (Equation 2)
S1′ = A·sin φ + B    (Equation 3)
S2′ = −A·cos φ + B    (Equation 4)
S3′ = −A·sin φ + B    (Equation 5)

S3: since the optical signal is sinusoidally modulated, obtaining the phase difference φ requires solving an inverse trigonometric function; the phase difference between the emitted and reflected light is given by Equation 6, and the intensity of the reflected light by Equation 7:

φ = arctan((S1′ − S3′) / (S0′ − S2′))    (Equation 6)

I = A = √((S1′ − S3′)² + (S0′ − S2′)²) / 2    (Equation 7)

S4: the depth information of the target is computed from the calculated phase difference φ; the derivation is as follows:

SS1: the distance D between the target and the system is the time of flight of the light multiplied by the speed of light c and divided by 2, since the light travels the path twice; D is given by Equation 11:

D = c·t / 2    (Equation 11)

where t is the time of flight of the light;

SS2: the time of flight is obtained indirectly: the phase difference φ between the received reflected light and the emitted light is measured, and the flight time t follows from Equation 10:

t = φ / (2πf)    (Equation 10)

where f is the modulation frequency of the light intensity;

combining Equation 10 and Equation 11:

D = c·φ / (4πf)    (Equation 12)

substituting Equation 6 into Equation 12 gives the formula for the distance D between the target and the system:

D = (c / (4πf)) · arctan((S1′ − S3′) / (S0′ − S2′))

the distance D of the target from the system is the depth information D.

4. The TOF-based three-dimensional imaging system of claim 1, wherein the display module is further configured to set the corresponding operating parameters; the operating parameters comprise integration time, measurement distance, test frequency, ROI region, and offset.

5. The TOF-based three-dimensional imaging system of claim 1, wherein the temperature sensor detects the temperature of the laser emission module in real time, and the ARM processor module turns off the laser emission module when the temperature is too high.

6. The TOF-based three-dimensional imaging system of claim 5, wherein the temperature is determined to be too high when it exceeds a preset value.

7. The TOF-based three-dimensional imaging system of claim 1, wherein the measured depth information is calibrated as follows: the number of accumulated electrons corresponding to each distance is recorded on a standard calibration rig to form a table, and the final calibration step determines the actual measured distance by looking up that table.

Technical Field

The invention belongs to the field of three-dimensional imaging and laser radar, relates to a three-dimensional imaging technology, and particularly relates to a three-dimensional imaging system based on TOF.

Background

A three-dimensional image captures not only planar image information but also depth information, and has wide application. Common three-dimensional imaging methods include binocular and structured-light schemes. These schemes achieve high resolution and measurement accuracy, but their measurement distance is short and they are strongly affected by ambient light. Existing TOF-based three-dimensional imaging methods suffer from complex circuit design and compensation procedures, heavy computation, difficulty in reaching a high frame rate, and susceptibility to background light. Mechanically scanned lidar, in turn, has a very complex structural design.

To address the above drawbacks, the following solution is provided.

Disclosure of Invention

The invention aims to provide a three-dimensional imaging system based on TOF.

The purpose of the invention can be realized by the following technical scheme:

A TOF-based three-dimensional imaging system comprises a laser emission module, a photosensitive sensor, an ARM processor module, an FPGA processing module, a memory module, a gigabit network port, a display module, and a temperature sensor;

the laser emission module, the memory module, the FPGA processing module, and the gigabit network port are each connected to the ARM processor module; the gigabit network port is connected to the display module, the laser emission module is connected to the temperature sensor, and the photosensitive sensor is connected to the FPGA processing module;

the ARM processor module controls the laser emission module to emit an optical signal; the optical signal is a 940 nm narrow-band laser, modulated by a 12 MHz sinusoidal signal to obtain a modulated optical signal; a diffuser at the emission end of the light source spreads the energy-concentrated laser, increasing its divergence angle; the laser emission module emits the modulated optical signal toward a target, which reflects it to the photosensitive sensor;

the photosensitive sensor receives the modulated optical signal reflected by the target; before reaching the photosensitive sensor, the signal passes through a narrow-band filter that removes all wavelengths other than 940 nm, so the sensor responds only to 940 nm light, yielding a filtered signal;

the photosensitive sensor converts the received filtered signal into a quantity of electrons, records that quantity as a voltage to obtain a voltage signal, converts the voltage signal into a digital signal through its internal analog-to-digital converter, and transmits the digital signal to the FPGA processing module; each pixel accumulates an electron count S in each of four exposure windows, the phase of successive exposure windows differing by 90°;

the FPGA processing module calculates the phase difference between the emitted light and the reflected light from the electron count accumulated at each pixel, and resolves that phase difference into depth information D and intensity information I;

the FPGA processing module stores the calculated depth information D and intensity information I in the memory module and calibrates the measured depth information;

the calibration method is as follows: the number of accumulated electrons corresponding to each distance is recorded on a standard calibration rig to form a table, and the final calibration step determines the actual measured distance by looking up that table; the calibrated depth information and intensity information are transmitted to the display module through the gigabit network port;

the display module displays the calibrated depth and intensity information, representing it as a distance image, an intensity image, and a point cloud image.

Further, the diffuser is a 60° × 45° diffuser.

Further, the FPGA processing module calculates the phase difference and resolves it into the depth information D and intensity information I as follows:

Step 1: obtain the electron quantities from the four exposures in one period and label them, in order, S0′, S1′, S2′, and S3′;

Step 2: the phase difference φ follows from Equations 2–5, and the arctangent of Equation 6 is evaluated in the FPGA to yield φ. Specifically:

S1: the optical signal received by the lidar is described by Equation 1:

r(t) = A·sin(ωt − φ) + B    (Equation 1)

where A denotes the intensity of the received optical signal, B the background light intensity, ω the angular frequency of the modulation, and φ the phase difference of the received light;

S2: to increase the accuracy of the calculated phase difference φ, 4n exposures are used to compute one distance, where n is a preset value; the phase difference φ of the received light relative to the emitted light, and the intensity A of the received light, are computed from the four sets of integrated electron counts using Equations 2–5;

the formulas are:

S0′ = A·cos φ + B    (Equation 2)
S1′ = A·sin φ + B    (Equation 3)
S2′ = −A·cos φ + B    (Equation 4)
S3′ = −A·sin φ + B    (Equation 5)

S3: since the optical signal is sinusoidally modulated, obtaining the phase difference φ requires solving an inverse trigonometric function; the phase difference between the emitted and reflected light is given by Equation 6, and the intensity of the reflected light by Equation 7:

φ = arctan((S1′ − S3′) / (S0′ − S2′))    (Equation 6)

I = A = √((S1′ − S3′)² + (S0′ − S2′)²) / 2    (Equation 7)

S4: the depth information of the target is computed from the calculated phase difference φ; the derivation is as follows:

SS1: the distance D between the target and the system is the time of flight of the light multiplied by the speed of light c and divided by 2, since the light travels the path twice; D is given by Equation 11:

D = c·t / 2    (Equation 11)

where t is the time of flight of the light;

SS2: the time of flight is obtained indirectly: the phase difference φ between the received reflected light and the emitted light is measured, and the flight time t follows from Equation 10:

t = φ / (2πf)    (Equation 10)

where f is the modulation frequency of the light intensity;

combining Equation 10 and Equation 11:

D = c·φ / (4πf)    (Equation 12)

substituting Equation 6 into Equation 12 gives the formula for the distance D between the target and the system:

D = (c / (4πf)) · arctan((S1′ − S3′) / (S0′ − S2′))

the distance D of the target from the system is the depth information D.

Further, the display module is also used to set the corresponding operating parameters; the operating parameters comprise integration time, measurement distance, test frequency, ROI region, and offset.

Further, the temperature sensor detects the temperature of the laser emission module in real time, and the ARM processor module turns off the laser emission module when the temperature is too high.

Further, the temperature is determined to be too high when it exceeds a preset value.

The invention has the beneficial effects that:

the invention transmits a beam of infrared light by transmitting the active light source, the light signal is transmitted along the transmission direction, meets the barrier, is reflected back and is received by the lens, the influence of the ambient light is filtered, and the position and the distance of the target are sensed by the photoreceptor by receiving the light receiving amount. The invention can realize three-dimensional imaging of the target within ten meters and can realize higher measurement precision; the invention has a longer measuring range, can adapt to certain ambient light influence while ensuring the measuring precision, and can obtain a higher frame rate; the invention is simple, effective and easy to use.

Drawings

In order to facilitate understanding for those skilled in the art, the present invention will be further described with reference to the accompanying drawings.

FIG. 1 is a block diagram of the system of the present invention.

Detailed Description

A TOF-based three-dimensional imaging system comprises a laser emission module, a photosensitive sensor, an ARM processor module, an FPGA processing module, a memory module, a gigabit network port, a display module, and a temperature sensor;

the laser emission module, the memory module, the FPGA processing module, and the gigabit network port are each connected to the ARM processor module; the gigabit network port is connected to the display module, the laser emission module is connected to the temperature sensor, and the photosensitive sensor is connected to the FPGA processing module;

the ARM processor module controls the laser emission module to emit an optical signal; the optical signal is a 940 nm narrow-band laser, modulated by a 12 MHz sinusoidal signal to obtain a modulated optical signal; a diffuser at the emission end of the light source spreads the energy-concentrated laser, increasing its divergence angle; the laser emission module emits the modulated optical signal toward a target, which reflects it to the photosensitive sensor;

the photosensitive sensor receives the modulated optical signal reflected by the target; before reaching the photosensitive sensor, the signal passes through a narrow-band filter that removes all wavelengths other than 940 nm, so the sensor responds only to 940 nm light, yielding a filtered signal;

the photosensitive sensor converts the received filtered signal into a quantity of electrons, records that quantity as a voltage to obtain a voltage signal, converts the voltage signal into a digital signal through its internal analog-to-digital converter, and transmits the digital signal to the FPGA processing module; each pixel accumulates an electron count S in each of four exposure windows, the phase of successive exposure windows differing by 90°;

the FPGA processing module calculates the phase difference between the emitted light and the reflected light from the electron count accumulated at each pixel, and resolves that phase difference into depth information D and intensity information I;

the FPGA processing module stores the calculated depth information D and intensity information I in the memory module and calibrates the measured depth information;

the calibration method is as follows: the number of accumulated electrons corresponding to each distance is recorded on a standard calibration rig to form a table, and the final calibration step determines the actual measured distance by looking up that table; the calibrated depth information and intensity information are transmitted to the display module through the gigabit network port;

the display module displays the calibrated depth and intensity information, representing it as a distance image, an intensity image, and a point cloud image.

The diffuser is a 60° × 45° diffuser.

The FPGA processing module calculates the phase difference and resolves it into the depth information D and intensity information I as follows:

Step 1: obtain the electron quantities from the four exposures in one period and label them, in order, S0′, S1′, S2′, and S3′;

Step 2: the phase difference φ follows from Equations 2–5, and the arctangent of Equation 6 is evaluated in the FPGA to yield φ. Specifically:

S1: the optical signal received by the lidar is described by Equation 1:

r(t) = A·sin(ωt − φ) + B    (Equation 1)

where A denotes the intensity of the received optical signal, B the background light intensity, ω the angular frequency of the modulation, and φ the phase difference of the received light;

S2: to increase the accuracy of the calculated phase difference φ, 4n exposures are used to compute one distance, where n is a preset value; the phase difference φ of the received light relative to the emitted light, and the intensity A of the received light, are computed from the four sets of integrated electron counts using Equations 2–5;

the formulas are:

S0′ = A·cos φ + B    (Equation 2)
S1′ = A·sin φ + B    (Equation 3)
S2′ = −A·cos φ + B    (Equation 4)
S3′ = −A·sin φ + B    (Equation 5)

S3: since the optical signal is sinusoidally modulated, obtaining the phase difference φ requires solving an inverse trigonometric function; the phase difference between the emitted and reflected light is given by Equation 6, and the intensity of the reflected light by Equation 7:

φ = arctan((S1′ − S3′) / (S0′ − S2′))    (Equation 6)

I = A = √((S1′ − S3′)² + (S0′ − S2′)²) / 2    (Equation 7)

S4: the depth information of the target is computed from the calculated phase difference φ; the derivation is as follows:

SS1: the distance D between the target and the system is the time of flight of the light multiplied by the speed of light c and divided by 2, since the light travels the path twice; D is given by Equation 11:

D = c·t / 2    (Equation 11)

where t is the time of flight of the light;

SS2: the time of flight is obtained indirectly: the phase difference φ between the received reflected light and the emitted light is measured, and the flight time t follows from Equation 10:

t = φ / (2πf)    (Equation 10)

where f is the modulation frequency of the light intensity;

combining Equation 10 and Equation 11:

D = c·φ / (4πf)    (Equation 12)

substituting Equation 6 into Equation 12 gives the formula for the distance D between the target and the system:

D = (c / (4πf)) · arctan((S1′ − S3′) / (S0′ − S2′))

the distance D of the target from the system is the depth information D.

The display module is also used to set the corresponding operating parameters; the operating parameters comprise integration time, measurement distance, test frequency, ROI region, and offset.

The temperature sensor detects the temperature of the laser emission module in real time, and the ARM processor module turns off the laser emission module when the temperature is too high.

Specifically, the temperature is judged too high when it exceeds a preset value.

In a specific application, as shown in FIG. 1, the TOF-based three-dimensional imaging system includes a laser emission module, a photosensitive sensor, an ARM processor module, an FPGA processing module, a memory module, a gigabit network port, a display module, and a temperature sensor;

the laser emission module, the memory module, the FPGA processing module, and the gigabit network port are each connected to the ARM processor module; the gigabit network port is connected to the display module, the laser emission module is connected to the temperature sensor, and the photosensitive sensor is connected to the FPGA processing module.

The ARM processor module controls the laser emission module to emit an optical signal; the optical signal is a 940 nm narrow-band laser, modulated by a 12 MHz sinusoidal signal to obtain a modulated optical signal; a diffuser at the emission end of the light source spreads the energy-concentrated laser, increasing its divergence angle;

the diffuser is a 60° × 45° diffuser; the laser emission module emits the modulated optical signal toward a target, which reflects it to the photosensitive sensor; the photosensitive sensor receives the modulated optical signal reflected by the target; before reaching the photosensitive sensor, the signal passes through a narrow-band filter that removes all wavelengths other than 940 nm, so the sensor responds only to 940 nm light, yielding a filtered signal;

the photosensitive sensor converts the received filtered signal into a quantity of electrons, records that quantity as a voltage to obtain a voltage signal, converts the voltage signal into a digital signal through its internal analog-to-digital converter, and transmits the digital signal to the FPGA processing module; each pixel accumulates an electron count S in each of four exposure windows, the phase of successive exposure windows differing by 90°;

In the FPGA, the phase difference between the emitted light and the reflected light is calculated from the number of electrons accumulated at each pixel, as follows:

Step 1: over one period, obtain the electron quantities from the four exposures and label them S0′, S1′, S2′, and S3′;

Step 2: the phase difference φ follows from Equations 2–5, and the arctangent of Equation 6 is evaluated in the FPGA to yield φ.

The optical signal received by the lidar can be described by Equation 1:

r(t) = A·sin(ωt − φ) + B    (Equation 1)

where A denotes the intensity of the received optical signal, B the background light intensity, ω the angular frequency of the modulation, and φ the phase difference of the received light;

to increase the accuracy of the calculated phase difference φ, one distance is computed from 4n exposures; n is 10 in this example implementation. The phase difference φ of the return light relative to the emitted light, and the intensity A of the received light, are computed from the four sets of integrated electron counts using Equations 2–5:

S0′ = A·cos φ + B    (Equation 2)
S1′ = A·sin φ + B    (Equation 3)
S2′ = −A·cos φ + B    (Equation 4)
S3′ = −A·sin φ + B    (Equation 5)
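The 4n-exposure accumulation (n = 10 in the example above) can be sketched by summing the raw exposures into the four phase bins before any phase arithmetic; this is a minimal illustration, not the FPGA implementation, and the function name is hypothetical:

```python
def accumulate_bins(exposures, n=10):
    """Sum 4*n raw exposures into the four phase bins S0', S1', S2', S3'.
    `exposures` is a flat sequence of length 4*n ordered S0, S1, S2, S3, S0, ...
    Averaging over n periods reduces shot noise in each bin."""
    assert len(exposures) == 4 * n
    bins = [0.0] * 4
    for k, value in enumerate(exposures):
        bins[k % 4] += value  # every 4th exposure lands in the same bin
    return bins
```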

Since the optical signal is sinusoidally modulated, obtaining the phase difference φ requires solving an inverse trigonometric function. The phase difference between the emitted and reflected light is given by Equation 6, and the intensity of the reflected light by Equation 7:

φ = arctan((S1′ − S3′) / (S0′ − S2′))    (Equation 6)

I = A = √((S1′ − S3′)² + (S0′ − S2′)²) / 2    (Equation 7)

As Equations 6 and 7 show, with sine-wave modulation the calculation uses the differences (S1′ − S3′) and (S0′ − S2′), so the background light component of the reflected light is subtracted out; the lidar of the present invention is therefore naturally resistant to background light.

The depth information of the target is computed from the calculated phase difference φ by Equation 8, where D is the measured distance (the depth information D), c is the speed of light, and f is the modulation frequency:

D = c·φ / (4πf)    (Equation 8)

According to Equation 8, the maximum measurement distance of the lidar system is tied to the laser modulation frequency: the lower the modulation frequency, the farther the detection distance. The modulating signal is a sine, which is periodic with period 2π (Equation 9):

sin(x + 2kπ) = sin(x), for any integer k    (Equation 9)

so phase shifts differing by an integer multiple of 2π cannot be distinguished; at a modulation frequency of 12 MHz, the farthest detection range is therefore c/(2f) ≈ 12.5 meters.
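As a quick check of this relationship, the unambiguous range c/(2f) can be computed for a few modulation frequencies (a minimal sketch; using the exact speed of light gives about 12.49 m at 12 MHz, which the text rounds to 12.5 m):

```python
# Unambiguous range of a continuous-wave TOF system: D_max = c / (2f).
# The phase wraps every 2*pi, so any target beyond D_max aliases back
# into the [0, D_max) interval.
C = 299_792_458.0  # speed of light, m/s

def unambiguous_range(mod_freq_hz: float) -> float:
    """Maximum distance measurable without phase wrap-around, in meters."""
    return C / (2.0 * mod_freq_hz)

for f in (12e6, 24e6):
    print(f"{f / 1e6:.0f} MHz -> {unambiguous_range(f):.2f} m")
```

Halving the modulation frequency doubles the range but, for the same phase noise, also doubles the distance error, which is why the 12 MHz choice trades range against precision.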

The time of flight of the light is computed indirectly, from the phase difference between the emitted and received light. The derivation of Equation 8 is as follows:

the phase difference φ between the received reflected light and the emitted light is measured, from which the flight time t of the light is obtained indirectly:

t = φ / (2πf)    (Equation 10)

where f is the modulation frequency of the light intensity;

the distance D between the target and the system is the time of flight multiplied by the speed of light c and divided by 2, since the light travels the path twice:

D = c·t / 2    (Equation 11)

Combining Equations 10 and 11:

D = c·φ / (4πf)    (Equation 12)

The distance of the target from the system is calculated according to Equation 12.
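The four-phase arithmetic derived above can be sketched per pixel as follows. This is a minimal numerical illustration, not the FPGA implementation; the sample values are hypothetical, and a four-quadrant `atan2` is used in place of the plain arctangent so the phase is recovered over the full 0 to 2π range:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_pixel(s0: float, s1: float, s2: float, s3: float, f_mod: float):
    """Recover phase, intensity, and distance from four 90-degree-spaced
    exposure bins S0'..S3' (Equations 6, 7, and 12)."""
    # Background light B appears equally in all four bins, so the
    # differences (s1 - s3) and (s0 - s2) cancel it out.
    q = s1 - s3
    i = s0 - s2
    phase = math.atan2(q, i) % (2 * math.pi)      # Equation 6, four-quadrant
    amplitude = math.hypot(q, i) / 2.0            # Equation 7
    distance = C * phase / (4 * math.pi * f_mod)  # Equation 12
    return phase, amplitude, distance

# Hypothetical pixel: signal amplitude A = 100, background B = 40,
# true phase pi/3; build the four bins per Equations 2-5.
A, B, phi = 100.0, 40.0, math.pi / 3
samples = [A * math.cos(phi - k * math.pi / 2) + B for k in range(4)]
phase, amp, dist = tof_pixel(*samples, f_mod=12e6)
# phase recovers pi/3, amp recovers 100, dist is about 2.08 m
```

Note that the recovered amplitude is independent of B, which is the background-light immunity the text describes.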

The FPGA processing module stores the calculated depth information D and intensity information I in the memory module and calibrates the measured depth information. The calibration method is to record, on a standard calibration rig, the number of accumulated electrons corresponding to each distance, forming a table; the final calibration step determines the actual measured distance by looking up that table. The ARM reads the distance information from the memory and applies temperature compensation, depth calibration, and depth compensation to it; the compensated phase value representing the depth is multiplied by a constant to obtain the distance between the target and the system. The distance and intensity information are transmitted to the display module through the gigabit network port, and the display module can choose to display the target's distance information, intensity information, and point cloud image.
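The table-based calibration described above might be sketched as a lookup with linear interpolation between calibration points. The table values here are hypothetical; in practice they come from measurements on the standard calibration rig:

```python
import bisect

# Hypothetical calibration table: (raw measured distance, true distance), meters,
# sorted by raw distance, as recorded on a standard calibration rig.
CAL_TABLE = [(0.5, 0.48), (2.0, 1.95), (5.0, 4.92), (10.0, 9.90)]

def calibrate(raw_m: float) -> float:
    """Map a raw distance to a calibrated distance by table lookup with
    linear interpolation; values outside the table are clamped to its ends."""
    raws = [r for r, _ in CAL_TABLE]
    if raw_m <= raws[0]:
        return CAL_TABLE[0][1]
    if raw_m >= raws[-1]:
        return CAL_TABLE[-1][1]
    j = bisect.bisect_right(raws, raw_m)          # first entry above raw_m
    (r0, t0), (r1, t1) = CAL_TABLE[j - 1], CAL_TABLE[j]
    return t0 + (t1 - t0) * (raw_m - r0) / (r1 - r0)
```

Interpolating between table entries keeps the table small while still correcting the smooth systematic error between calibration points.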

The measured distance information, intensity information, and point cloud image are displayed on the display module, which also sets the corresponding operating parameters. The temperature sensor detects the temperature of the laser emission module in real time; if the temperature is too high, the ARM processor module turns the laser emission module off, where "too high" specifically means exceeding a preset value.
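The over-temperature cutoff could be sketched as a threshold check; adding hysteresis (a lower re-enable threshold) prevents the laser from toggling rapidly around the limit. The class name and both thresholds are hypothetical; the source only specifies a single preset shutdown value:

```python
class LaserThermalGuard:
    """Over-temperature cutoff with hysteresis: the laser turns off above
    off_above_c and may only turn back on below on_below_c."""

    def __init__(self, off_above_c: float = 70.0, on_below_c: float = 60.0):
        self.off_above_c = off_above_c
        self.on_below_c = on_below_c
        self.enabled = True

    def update(self, temp_c: float) -> bool:
        """Feed one temperature reading; return whether the laser may be on."""
        if self.enabled and temp_c > self.off_above_c:
            self.enabled = False      # too hot: shut the laser off
        elif not self.enabled and temp_c < self.on_below_c:
            self.enabled = True       # cooled down: allow re-enable
        return self.enabled
```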

The active light source emits a beam of infrared light; the optical signal propagates along its emission direction, is reflected back by an obstacle, and is received through the lens, with the influence of ambient light filtered out; the photosensitive sensor senses the position and distance of the target from the amount of returned light. The invention achieves three-dimensional imaging of targets within ten meters with high measurement accuracy; it offers a relatively long measurement range, tolerates a certain amount of ambient light while preserving accuracy, and attains a high frame rate; it is simple, effective, and easy to use.

The foregoing is merely exemplary and illustrative of the present invention, and various modifications, additions, and substitutions may be made by those skilled in the art to the specific embodiments described without departing from the scope of the invention as defined in the appended claims.
