Target object ranging method and device

Document No.: 1612743    Publication date: 2020-01-10

Note: This technique, a target object ranging method and device, was designed and created by Pan Mingxing on 2019-10-24. The invention discloses a distance measuring method and device for a target object. A first actual width of the target object is determined from an image acquired by an image acquisition device; relevant state parameters of the target object at the previous moment are then acquired, a first relative distance is obtained from those state parameters, and a second actual width of the target object is obtained from the first relative distance. A relative difference is then computed from the first and second actual widths, and the first relative distance is adjusted to a second relative distance according to that difference. The actual width of the target object is thus obtained in two different ways, and the comparison of the two widths is used to constrain and adjust the relative distance between the two movable devices, making that distance more accurate, improving ranging precision, and helping to ensure driving safety.

1. A method of ranging a target object, the method comprising:

determining a first actual width of the target object according to an image acquired by an image acquisition device;

acquiring relevant state parameters of the target object at the previous moment;

obtaining a first relative distance according to the relevant state parameter, wherein the first relative distance is the distance between the image acquisition device and the target object at the current moment;

obtaining a second actual width of the target object according to the first relative distance;

obtaining a relative difference value according to the first actual width and the second actual width;

and adjusting the first relative distance to be a second relative distance according to the relative difference.

2. The method of claim 1, wherein said determining a first actual width of said target object comprises:

obtaining the width of the target object in the image;

determining the initial width of the target object according to the width and the projection relation of the target object in the image; the projection relation comprises a mapping relation between a preset reference surface and the image;

and determining the first actual width according to the initial width.

3. The method of claim 2, wherein said determining the first actual width from the initial width comprises:

inputting the image into a preset model, and determining the type of the target object;

and adjusting the initial width of the target object according to the type of the target object to obtain the first actual width.

4. The method of claim 1, wherein,

the acquiring of the relevant state parameter of the target object at the previous moment includes: acquiring a relevant distance parameter and a relevant speed parameter of the target object at a previous moment;

the obtaining a first relative distance according to the relevant state parameter includes:

and obtaining the first relative distance according to the relevant distance parameter at the previous moment and the relevant speed parameter at the previous moment.

5. The method of claim 4, wherein the obtaining the first relative distance according to the relevant distance parameter at the previous moment and the relevant speed parameter at the previous moment comprises:

obtaining a time difference between the current time and the previous time;

obtaining the relative movement distance of the current moment according to the time difference and the related speed parameter of the previous moment;

and obtaining the first relative distance based on the relevant distance parameter at the previous moment, the relative movement distance at the current moment, and distance noise.

6. The method of claim 2, wherein said obtaining a second actual width of the target object from the first relative distance comprises:

obtaining a focal length of the image acquisition device;

and processing the focal length, the width of the target object in the image and the first relative distance to obtain the second actual width.

7. The method of claim 1, wherein said adjusting the first relative distance to a second relative distance according to the relative difference comprises:

determining whether the relative difference value is smaller than a preset threshold value;

if so, determining the first relative distance as the second relative distance;

if not, adjusting the distance noise according to the relative difference value, and obtaining the second relative distance according to the adjusted distance noise and the relevant state parameter; wherein a difference between a third actual width obtained from the second relative distance and the first actual width is smaller than the preset threshold value.

8. A ranging apparatus for a target object, comprising:

the first determining module is used for determining a first actual width of the target object according to the image acquired by the image acquisition device;

the acquisition module is used for acquiring the relevant state parameters of the target object at the previous moment;

the first processing module is used for obtaining a first relative distance according to the relevant state parameter, wherein the first relative distance is the distance between the image acquisition device and the target object at the current moment;

the second processing module is used for obtaining a second actual width of the target object according to the first relative distance;

the comparison module is used for obtaining a relative difference value according to the first actual width and the second actual width;

and the adjusting module is used for adjusting the first relative distance into a second relative distance according to the relative difference.

9. An electronic device, comprising:

a processor; and

a memory having stored therein computer program instructions which, when executed by the processor, cause the processor to perform the method of any of claims 1-7.

10. A computer-readable storage medium, the storage medium storing a computer program for performing the method of any of the preceding claims 1-7.

Technical Field

The application relates to the technical field of automatic driving, in particular to a distance measuring method and device for a target object.

Background

With the continuous development of science and technology, autonomous driving has also developed at a rapid pace. An autonomous vehicle does not need to be equipped with a driver; the whole driving process is controlled automatically by a computer.

One of the major concerns of autonomous-driving research is ranging, i.e., measuring the distance between a preceding autonomous device and the current autonomous device. The accuracy of the distance measurement directly affects driving safety and driving efficiency. For example, if the distance between a leading and a trailing vehicle is measured inaccurately, the trailing vehicle can easily collide with the leading vehicle, causing a traffic accident and seriously affecting driving safety. Similarly, an inaccurately measured distance between leading and trailing unmanned aerial vehicles may cause accidents such as collisions and crashes.

Therefore, how to improve the ranging accuracy is a problem that needs to be solved urgently at present.

Disclosure of Invention

The present application is proposed to solve the above-mentioned technical problems.

According to an aspect of the present application, there is provided a ranging method of a target object, the method including: determining a first actual width of the target object according to an image acquired by an image acquisition device;

acquiring relevant state parameters of the target object at the previous moment; obtaining a first relative distance according to the relevant state parameter, wherein the first relative distance is the distance between the image acquisition device and the target object at the current moment; obtaining a second actual width of the target object according to the first relative distance; obtaining a relative difference value according to the first actual width and the second actual width; and adjusting the first relative distance to be a second relative distance according to the relative difference.

According to another aspect of the present application, there is provided a ranging apparatus for a target object, including: the first determining module is used for determining a first actual width of the target object according to the image acquired by the image acquisition device; the acquisition module is used for acquiring the relevant state parameters of the target object at the previous moment; the first processing module is used for obtaining a first relative distance according to the relevant state parameter, wherein the first relative distance is the distance between the image acquisition device and the target object at the current moment; the second processing module is used for obtaining a second actual width of the target object according to the first relative distance; the comparison module is used for obtaining a relative difference value according to the first actual width and the second actual width; and the adjusting module is used for adjusting the first relative distance into a second relative distance according to the relative difference.

According to still another aspect of the present application, there is provided an electronic apparatus including: a processor; and a memory having stored therein computer program instructions which, when executed by the processor, cause the processor to perform the method as described above.

According to yet another aspect of the application, there is provided a computer readable medium having stored thereon computer program instructions which, when executed by a processor, cause the processor to perform the method as described above.

Compared with the prior art, the method and device determine a first actual width of the target object from an image acquired by an image acquisition device, and then acquire the relevant state parameters of the target object at the previous moment. Because these state parameters truly reflect the actual state of the target object during driving, a first relative distance, i.e. the distance between the image acquisition device and the target object at the current moment, can be obtained from them. A second actual width of the target object is then obtained from the first relative distance, a relative difference is computed from the first and second actual widths, and the first relative distance is adjusted to a second relative distance according to that difference. The actual width of the target object is thus obtained in two different ways, and the comparison of the two widths is used to constrain and adjust the relative distance between the two movable devices, making that distance more accurate, improving ranging precision, and ensuring driving safety.

The foregoing description is only an overview of the technical solutions of the present invention, and the embodiments of the present invention are described below in order to make the technical means of the present invention more clearly understood and to make the above and other objects, features, and advantages of the present invention more clearly understandable.

Drawings

The above and other objects, features and advantages of the present application will become more apparent by describing in more detail embodiments of the present application with reference to the attached drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the principles of the application. In the drawings, like reference numbers generally represent like parts or steps.

Fig. 1 is a flowchart illustrating a method for measuring a distance to a target object according to an exemplary embodiment of the present disclosure.

Fig. 2 is a schematic flowchart of a process before determining a first actual width of a target object according to another exemplary embodiment of the present application.

FIG. 3A is a projection relationship diagram provided by an exemplary embodiment of the present application;

fig. 3B is a schematic illustration of an unmanned vehicle provided by an exemplary embodiment of the present application.

Fig. 4 is a schematic flow chart of obtaining the first relative distance according to an exemplary embodiment of the present application.

FIG. 5 is a schematic illustration of a second actual width provided by an exemplary embodiment of the present application.

Fig. 6 is a flowchart of a method for adjusting a first relative distance to a second relative distance according to a relative difference according to an exemplary embodiment of the present application.

Fig. 7 is a schematic diagram of a ranging apparatus for a target object according to an exemplary embodiment of the present application.

Fig. 8 is another schematic diagram of a ranging apparatus for a target object according to an exemplary embodiment of the present disclosure.

Fig. 9 is an exemplary block diagram of the first processing module 730 according to an exemplary embodiment of the present application.

Fig. 10 is an exemplary block diagram of an adjustment module 760 provided in an exemplary embodiment of the present application.

Fig. 11 is a block diagram of an electronic device provided in an exemplary embodiment of the present application.

Detailed Description

Hereinafter, example embodiments according to the present application will be described in detail with reference to the accompanying drawings. It should be understood that the described embodiments are only some embodiments of the present application and not all embodiments of the present application, and that the present application is not limited by the example embodiments described herein.

Summary of the application

Existing ranging approaches are generally classified, according to the number of image acquisition devices used, into monocular ranging, binocular ranging, trinocular ranging, and so on. For example, in monocular ranging a single image acquisition device (mounted on a trailing movable device) measures the distance between the trailing movable device and a leading movable device (also referred to herein as the "target object").

In monocular ranging, the accuracy of distance measurement directly affects driving safety and driving efficiency; if the measurement is inaccurate, driving safety is seriously compromised.

In view of the above problems, the present application aims to improve the accuracy of monocular ranging. To that end, it provides a ranging method for a target object: a first actual width of the target object is determined from an image acquired by an image acquisition device, and the relevant state parameters of the target object at the previous moment are then acquired. Because these parameters truly reflect the actual state of the target object during driving, a first relative distance, i.e. the distance between the image acquisition device and the target object at the current moment, can be obtained from them. A second actual width of the target object is then obtained from the first relative distance, a relative difference is computed from the first and second actual widths, and the first relative distance is adjusted to a second relative distance according to that difference. The actual width of the target object is thus obtained in two different ways, and the comparison of the two widths constrains and adjusts the relative distance between the two movable devices, making it more accurate, improving ranging precision, and ensuring driving safety.

Exemplary method

Fig. 1 is a flowchart illustrating a ranging method for a target object according to an exemplary embodiment of the present disclosure. This embodiment can be applied to movable devices capable of autonomous movement, including unmanned vehicles, unmanned aerial vehicles, robotic arms, mobile robots, and the like.

The present embodiment applies to monocular ranging: while two movable devices travel one behind the other, the trailing movable device measures the distance to the leading movable device (the target object). It is noted that the trailing device captures images with the image acquisition device to perform ranging, so the distance between the image acquisition device and the target object is equivalent to the distance between the trailing device and the target object.

A ranging method of a target object described in one or more embodiments of the present application is shown in fig. 1, and includes the following steps:

step 101, determining a first actual width of the target object according to the image acquired by the image acquisition device.

The image capturing device of this embodiment may be a camera, an infrared camera, or other devices, and of course, the specific type is not limited, and any device having an image capturing function should be included in the scope of this embodiment.

The image obtained in this example contains the following information: the width of the target object in the image (which may be considered as a virtual width), the morphology of the target object (shape, state, appearance, etc.). In addition, the image acquisition device has a focal length, which is set in advance.

The first actual width characterizes the true width of the target object and is derived from one or more of the items of information contained in the image listed above. The specific implementation is described in detail later and is not repeated here.

Taking the unmanned vehicle as an example, the unmanned vehicle has a virtual vehicle width in the captured image, and the first actual width refers to a real vehicle width of the unmanned vehicle.

And 102, acquiring relevant state parameters of the target object at the previous moment.

In particular, the relevant state parameters reflect the actual state presented by the target object during driving. Because these parameters may differ from moment to moment, the relevant state parameters of the target object at each moment include: a relevant distance parameter, a relevant speed parameter, and process noise. The process noise in turn comprises speed noise and distance noise. The phrase "relevant state parameters at the previous moment" in this embodiment refers to the parameters of the target object at the moment immediately before the current moment. By the time the target object reaches the current moment, its parameters at the previous moment have already been presented and can therefore be acquired; the specific implementation is described in detail later and is not repeated here.

The relevant speed parameter refers to the driving speed of the target object, and the relevant speed parameter at each moment is related to the relevant state parameters at the previous moment: the relevant speed at an earlier moment can affect the relevant speed at a later moment. Taking the current moment as an example, the relevant speed parameter at the current moment is related to the relevant speed parameter at the previous moment, the speed noise at the current moment, and so on.

The relevant distance parameter refers to the distance between the image acquisition device and the target object, and the relevant distance parameter at each moment is related to the relevant state parameter at the previous moment. Taking the current time as an example, the relevant distance parameter of the current time is related to the relevant distance parameter of the previous time, the relevant speed parameter of the previous time, the time difference between the previous time and the current time, the distance noise and the like.

The speed noise at each moment is used to regulate the speed at that moment; the distance noise at each moment is used to regulate the distance at that moment.

Further, during the driving process of the target object, the relevant state parameters presented at each moment may be different, and the relevant state parameters of the target object at the previous moment may affect the relevant state parameters of the target object at the current moment. Therefore, the related state parameter at the previous time is required to be obtained as the basic parameter for obtaining the first relative distance. The specific implementation process will be described later, and will not be described herein again.
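The state parameters described above can be grouped into a simple record. The following is an illustrative sketch only; the class and field names are assumptions, not taken from the patent:

```python
from dataclasses import dataclass


@dataclass
class StateParams:
    """Relevant state parameters of the target object at one moment."""
    distance: float        # relevant distance parameter: device-to-target distance
    velocity: float        # relevant speed parameter: relative driving speed
    distance_noise: float  # process noise regulating the distance at this moment
    velocity_noise: float  # process noise regulating the speed at this moment


# Example: parameters at the previous moment, later used as the basis
# for obtaining the first relative distance at the current moment.
prev = StateParams(distance=20.0, velocity=1.5,
                   distance_noise=0.0, velocity_noise=0.0)
```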

Step 103, obtaining a first relative distance according to the relevant state parameter.

Specifically, the first relative distance is a distance between the image acquisition device and the target object at the present time.

The obtained related state parameters can reflect the actual state of the target object in the driving process, so that the first relative distance obtained by the related state parameters can truly and accurately reflect the relative distance between the image acquisition device and the target object.

And 104, acquiring a second actual width of the target object according to the first relative distance.

The second actual width also characterizes the real width of the target object, and is obtained on the basis of the first relative distance. Whereas the first actual width is derived from information contained in the image, the second actual width is derived from the real state parameters of the target object: the two actual widths have different origins.
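Claim 6 obtains the second actual width by processing the focal length, the width of the target object in the image, and the first relative distance. Under a standard pinhole camera model — an assumption here, since the exact formula is given later with fig. 5 — similar triangles give actual width = image width × distance / focal length:

```python
def second_actual_width(width_in_image: float, first_relative_distance: float,
                        focal_length: float) -> float:
    """Back-project the image-plane width to a real-world width.

    Pinhole model: width_in_image / focal_length == actual_width / distance,
    with width_in_image and focal_length in pixels, distance in metres.
    """
    return width_in_image * first_relative_distance / focal_length


# A target 100 px wide in the image, 20 m away, with a 1000 px focal
# length, has a second actual width of 100 * 20 / 1000 = 2.0 m.
```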

And 105, obtaining a relative difference value according to the first actual width and the second actual width.

Comparing the two, the relative difference can be obtained. If the relative difference is small (e.g., smaller than a predetermined threshold), it indicates that both actual widths are relatively accurate. If the relative difference is large (e.g., above a predetermined threshold), it reflects a deviation in the measured first relative distance, and further adjustment is required.

And 106, adjusting the first relative distance to a second relative distance according to the relative difference.

The manner of adjustment depends on the magnitude of the relative difference. By adopting the relative difference as the adjustment criterion, this embodiment can further refine the relative distance to the target object.

Through the above analysis, this embodiment determines a first actual width of the target object from the image acquired by the image acquisition device, then acquires the relevant state parameters of the target object at the previous moment. Because these parameters truly reflect the actual state of the target object during driving, a first relative distance, i.e. the distance between the image acquisition device and the target object at the current moment, is obtained from them. A second actual width of the target object is then obtained from the first relative distance, a relative difference is computed from the first and second actual widths, and the first relative distance is adjusted to a second relative distance according to that difference. Obtaining the actual width in two different ways and using the comparison of the widths to constrain and adjust the relative distance between the two movable devices makes that distance more accurate, improves ranging precision, and ensures driving safety.
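Steps 104-106 can be sketched end-to-end. The patent adjusts the distance noise and re-derives the distance when the widths disagree; as a simplification, the sketch below instead solves directly for the distance that makes the back-projected width match the image-based first actual width — a hypothetical shortcut, not the claimed noise-adjustment procedure:

```python
def adjust_relative_distance(first_actual_width: float,
                             first_relative_distance: float,
                             width_in_image: float,
                             focal_length: float,
                             threshold: float) -> float:
    # Step 104: second actual width from the predicted distance (pinhole model).
    second_actual_width = width_in_image * first_relative_distance / focal_length
    # Step 105: relative difference between the two independently obtained widths.
    relative_difference = abs(first_actual_width - second_actual_width)
    # Step 106: keep the distance if the widths agree, otherwise correct it
    # so that the back-projected width matches the first actual width.
    if relative_difference < threshold:
        return first_relative_distance
    return first_actual_width * focal_length / width_in_image
```

For example, with a 2.0 m first actual width, a 100 px image width and a 1000 px focal length, a predicted distance of 20 m is kept (the widths agree), while a predicted 25 m is corrected back toward 20 m.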

On the basis of the above-mentioned embodiment shown in fig. 1, as an alternative implementation manner of this embodiment, before determining the first actual width of the target object in step 101, referring to fig. 2, the following steps need to be performed:

step 201, the width of the target object in the image is obtained.

Here, the width of the target object in the image refers to a virtual width, to distinguish it from the actual widths in this application. Once the image is captured it contains the target object, so the virtual width of the target object in the image can be obtained by recognizing the image. Taking unmanned vehicles as an example, referring to fig. 3B, the trailing vehicle captures an image of the rear end of the leading vehicle, and the virtual vehicle width of the leading vehicle can be obtained by recognizing that image. Notably, the virtual vehicle width of the leading vehicle can be obtained from the image even under extreme conditions (e.g., the leading vehicle is turning, or is driving on a lane line).

Step 202, determining the initial width of the target object according to the width of the target object in the image and the projection relation.

The projection relation comprises a mapping relation between a preset reference plane and the image, together with an enlargement ratio, which may be set in advance. Referring to fig. 3A, the image can be enlarged onto the preset reference plane according to the enlargement ratio; the width of the target object in the image is enlarged correspondingly on the preset reference plane, and the enlarged width is taken as the initial width. Specifically, the preset reference plane may be set as the ground.
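A minimal sketch of step 202, assuming the projection relation reduces to a single preset enlargement ratio between the image plane and the ground reference plane (the patent's mapping may be richer, e.g. a full homography; the function name and units are assumptions):

```python
def initial_width_on_reference_plane(width_in_image: float,
                                     enlargement_ratio: float) -> float:
    # Enlarge the image-plane width onto the preset reference plane
    # (e.g. the ground) by the preset enlargement ratio; the enlarged
    # width serves as the initial width of the target object.
    return width_in_image * enlargement_ratio


# A 90 px image width with a ratio of 0.02 m per pixel gives roughly 1.8 m.
```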

Step 203, determining a first actual width according to the initial width.

In particular, the initial width of the target object is also used to characterize the true width (first actual width) of the target object. Therefore, in a specific implementation process, the initial width can be directly used as the first actual width.

However, if an image is captured under an extreme condition at an unfavorable angle (for example, while the leading vehicle is turning), the initial width of the target object may deviate from its true width.

In view of this, after obtaining the initial width of the target object, the following operations are performed: inputting the image into a preset model, and determining the type of a target object; and adjusting the initial width of the target object according to the type of the target object to obtain a first actual width.

The target object in the image presents its own form, such as its shape, state, and appearance. Different target objects may have different appearances, particular shapes, or their own brand logos. Therefore, after the image is input into the preset model, the model can determine the type of the target object from its form.

To determine the type of the target object, a base model (such as a convolutional neural network (CNN), an RNN, or the like) is trained in advance on the sample forms and sample types of a large number of related objects to obtain the preset model. The image is then input into the preset model, which processes the form of the target object in the image and outputs its type.

The type of the target object can determine its actual width. Taking unmanned vehicles as an example, the type refers to the vehicle model: if the unmanned vehicle in fig. 3B is model A of a certain brand, its vehicle width is fixed. The actual width corresponding to that model can then serve as the standard for adjusting the initial width to obtain the first actual width of the unmanned vehicle, correcting any deviation in the initial width. In the adjustment, the width difference between the model's true width and the initial width is determined and compared with a preset threshold. If the width difference is smaller than the preset threshold, either the model's true width or the initial width may be selected; if it is larger, the model's true width replaces the initial width. Of course, other adjustment methods, such as combining (adding or subtracting) the width difference with the initial width, also fall within the scope of the present invention.
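The adjustment described above can be sketched as follows, with a hypothetical lookup table of per-type true widths; the table contents, threshold value, and names are illustrative, not from the patent:

```python
# Hypothetical table: true vehicle width (in metres) per recognized type.
TYPE_WIDTHS = {"brand_x_model_a": 1.8}


def first_actual_width(initial_width: float, vehicle_type: str,
                       threshold: float = 0.2) -> float:
    true_width = TYPE_WIDTHS.get(vehicle_type)
    if true_width is None:
        # Unknown type: fall back to the initial width directly.
        return initial_width
    if abs(true_width - initial_width) < threshold:
        # Small deviation: either width is acceptable; keep the initial one.
        return initial_width
    # Large deviation: replace the initial width with the type's true width.
    return true_width
```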

On the basis of the embodiment shown in fig. 1, as an optional implementation of this embodiment, in step 102 the relevant distance parameter, the relevant speed parameter, and the process noise of the target object at the previous moment are acquired.

This is done because, during driving, the relevant state parameters of the target object at the previous moment influence those at the current moment. Taking an unmanned vehicle as an example: with the trailing vehicle's driving unchanged, if the leading vehicle decelerated at the previous moment, the distance between the two vehicles at the current moment is reduced. Taking the actual influence of the previous moment's state parameters into account, and obtaining the first relative distance at the current moment on that basis, improves the accuracy of the first relative distance.

On the basis of obtaining the relevant state parameter, as an optional implementation manner of this embodiment, in the process of step 103, the following operations are implemented: and obtaining a first relative distance according to the relevant distance parameter at the previous moment and the relevant speed parameter at the previous moment.

More specifically, referring to fig. 4, the above implementation process includes the following specific operation steps:

step 401, obtaining a time difference between a current time and a previous time.

The time difference between the previous time and the current time may be in units of "milliseconds", and the time difference may be any value, for example, 2ms, 5ms, and the like.

And step 402, obtaining the relative movement distance of the current moment according to the time difference and the related speed parameter of the previous moment.

The process noise can be divided into: distance noise and velocity noise. The distance noise at each moment can be used for regulating and controlling the distance precision at each moment, and the speed noise at each moment can be used for regulating and controlling the speed precision at each moment.

In addition, the relevant state parameters at each moment influence the relevant state parameters at the immediately following moment. Therefore, in the process of calculating the relevant speed parameter at the previous moment, the speed noise at the previous moment and the relevant speed parameter at the moment immediately before the previous moment need to be referred to. Specifically, the relevant speed parameter at the moment immediately before the previous moment and the speed noise at the previous moment may be summed to obtain the relevant speed parameter at the previous moment. For ease of understanding, this can be written as V_{k-1} = V_{k-2} + W_{v,k-1}, where k-1 represents the previous moment, V_{k-1} represents the relevant speed parameter at the previous moment, W_{v,k-1} represents the speed noise at the previous moment, k-2 represents the moment immediately before the previous moment (also referred to as the previous moment of the previous moment), and V_{k-2} represents the relevant speed parameter at that moment.

In the implementation process of obtaining the relative movement distance at the current moment according to the time difference and the relevant speed parameter at the previous moment, the product of the time difference and the relevant speed parameter at the previous moment can be used as the relative movement distance at the current moment. Following the example above, the relative movement distance at the current moment is V_{k-1}·Δt, where Δt represents the time difference between the current moment and the previous moment.

In the above operation, the relative movement distance is obtained by combining the relevant state parameters at the previous time (the relevant speed parameter, the time difference, and the like at the previous time), so that the influence of the change of the relevant state parameters at the previous time on the relative movement distance can be comprehensively considered, and the accuracy of the relative movement distance at the current time can be improved.

In step 403, a first relative distance is obtained based on the relative distance parameter at the previous time, the relative movement distance at the current time, and the distance noise at the current time.

Specifically, the relative distance parameter at the previous moment is used to represent the distance between the target object and the image acquisition device at the previous moment. The relative movement distance at the current moment represents the distance that the target object moves relative to the image acquisition device within the time difference. Therefore, the first relative distance can be obtained by summing the relative distance parameter at the previous moment, the relative movement distance at the current moment, and the distance noise at the current moment.

Further, the distance noise at the current time is obtained by: obtaining distance noise at a previous moment; and obtaining the distance noise of the current moment according to the distance noise and the time difference of the previous moment. Specifically, the distance noise at the current time can be obtained by summing the distance noise at the previous time and the time difference.

For ease of understanding, in conjunction with the notation introduced above, this embodiment obtains the first relative distance as S_k = S_{k-1} + V_{k-1}·Δt + W_{sk1}, where k denotes the current moment, S_k represents the first relative distance at the current moment, S_{k-1} represents the relative distance parameter at the previous moment, V_{k-1}·Δt represents the relative movement distance at the current moment, and W_{sk1} represents the distance noise at the current moment.
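The prediction of the first relative distance can be sketched as follows. The function and parameter names are illustrative; the arithmetic follows the two formulas above:

```python
def predict_first_relative_distance(
    s_prev: float,         # S_{k-1}: relative distance at the previous moment (m)
    v_before_prev: float,  # V_{k-2}: speed parameter two moments back (m/s)
    w_v_prev: float,       # W_{v,k-1}: speed noise at the previous moment (m/s)
    w_s_curr: float,       # W_{sk1}: distance noise at the current moment (m)
    dt: float,             # Δt: time difference between the moments (s)
) -> float:
    """Compute S_k = S_{k-1} + V_{k-1} * Δt + W_{sk1}."""
    v_prev = v_before_prev + w_v_prev   # V_{k-1} = V_{k-2} + W_{v,k-1}
    relative_movement = v_prev * dt     # movement within the time difference
    return s_prev + relative_movement + w_s_curr
```

With zero noise, the prediction reduces to the familiar distance-plus-speed-times-time update.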

As can be seen, in the above operation, the first relative distance is obtained by combining the relevant state parameter at the previous time (the relative distance parameter, the relative movement distance, and the like at the previous time) and the distance noise at the current time, so that the influence of the change in the relevant state parameter at the previous time on the relative movement distance can be comprehensively considered, and the accuracy of the first relative distance at the current time can be improved.

Through the analysis, the relevant state parameters at the previous moment and various parameters at the current moment are comprehensively considered in the implementation process of the first relative distance, and the parameters can comprehensively reflect the real-time driving state of the target object, so that the first relative distance is obtained by taking the parameters as the basis, the relative position relation between the target object and the image acquisition device can be accurately reflected, the distance measurement accuracy can be realized, and the driving safety is further ensured.

On the basis of the embodiment shown in fig. 1, as an optional implementation manner of this embodiment, the step 104 specifically includes the following operations: obtaining the focal length of the image acquisition device; and processing the focal length, the width of the target object in the image and the first relative distance to obtain a second actual width.

Wherein, the focal length of the image acquisition device is set in advance.

The second actual width is obtained according to the pinhole imaging principle. Specifically, the ratio of the focal length to the first relative distance is equal to the ratio of the width of the target object in the image to the second actual width, which is obtained from this relationship.

Referring to fig. 5, the width of the target object in the image captured by the image acquisition device is p, the first relative distance between the target object and the image acquisition device is S_k, and the focal length of the image acquisition device is f, which is set in advance. The magnification ratio can be calculated from the distance and the focal length as f/S_k. The width p of the target object in the image is then scaled according to this ratio to obtain the second actual width D of the target object.

Namely:

f/S_k = p/D

which can be rearranged as:

D = p·S_k/f
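The pinhole relation above can be sketched directly; the only assumption is that p and f are expressed in the same image-plane units:

```python
def second_actual_width(p: float, s_k: float, f: float) -> float:
    """Pinhole model: f / S_k = p / D  =>  D = p * S_k / f.

    p   -- width of the target object in the image (same units as f)
    s_k -- first relative distance between target and camera (m)
    f   -- focal length of the image acquisition device (same units as p)
    """
    return p * s_k / f
```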

referring to fig. 6, on the basis of the embodiment shown in fig. 1, as an alternative implementation manner of this embodiment, the step 106 specifically includes the following operations:

step 601, judging whether the relative difference is smaller than a preset threshold value.

Specifically, the specific value of the preset threshold needs to be adjusted according to experience and actual conditions, and this embodiment is not limited herein.

After the determination is performed, the obtained determination result may be one of the following two results:

First, the relative difference is smaller than the preset threshold, indicating that the first actual width and the second actual width are relatively close; in this case, step 602 may be performed.

Second, the relative difference is greater than or equal to the preset threshold, indicating that there may be a deviation in the first relative distance; in this case, step 603 is executed.

Step 602, if yes, determining the first relative distance as the second relative distance.

Wherein the second relative distance is used to characterize the distance between the target object and the image acquisition device at the current time. The second relative distance is basic data for performing the subsequent driving operation, so that the requirement on the accuracy of the second relative distance is high. The higher the accuracy of the second relative distance is, the more the driving safety and the driving efficiency can be ensured. If the relative difference is smaller than the preset threshold, it indicates that the accuracy of the obtained first relative distance is higher, so that the first relative distance can be directly determined as the second relative distance.

Therefore, the first relative distance between the two mobile devices is constrained by the relative difference value between the first actual width and the second actual width and the preset threshold value so as to obtain the second relative distance, the obtained second relative distance can be used for representing the relative position relation between the target object and the image acquisition device more accurately, the distance measurement accuracy can be realized, and the driving safety is further guaranteed.

Step 603, if not, adjusting the distance noise according to the relative difference, and obtaining a second relative distance according to the adjusted distance noise and the relevant state parameters.

And the difference value between the third actual width and the first actual width obtained by the second relative distance is smaller than a preset threshold value.

Specifically, the relative difference and the distance noise have a mapping relationship. The corresponding distance noise is acquired from the mapping relationship according to the obtained relative difference, and the distance noise at the current moment is adjusted accordingly. For example, the distance noise W_{sk1} at the current moment is adjusted to the adjusted distance noise W_{sk2}.

After the adjustment, the second relative distance may be obtained with reference to the manner in which the first relative distance is obtained. Specifically, the second relative distance can be obtained by summing the relative distance parameter at the previous moment, the relative movement distance at the current moment, and the adjusted distance noise. Following the formula above, the second relative distance is S_k' = S_{k-1} + V_{k-1}·Δt + W_{sk2}, where S_k' denotes the second relative distance and W_{sk2} denotes the adjusted distance noise.

Further, obtaining the focal length of the image acquisition device; and processing the focal length, the width of the target object in the image and the second relative distance to obtain a third actual width. The present embodiment is configured such that the difference between the third actual width and the first actual width is smaller than a preset threshold. That is to say, in this embodiment, the difference between the third actual width and the first actual width is smaller than the preset threshold as the constraint condition, so as to constrain the second relative distance, and the obtained second relative distance can be more accurate to represent the relative position relationship between the target object and the image acquisition device, so that the accuracy of distance measurement can be realized, and the driving safety can be further ensured.

It should be noted that, in order to obtain the second relative distance more accurately, the difference between the third actual width and the first actual width may be smaller than a preset threshold as a constraint condition, and the distance noise is adjusted to perform the above steps for multiple times until the constraint condition is satisfied.
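The repeated adjustment described in the note above can be sketched as an iterative loop. The proportional noise update used here is an illustrative assumption: the disclosure only specifies that a mapping from the relative difference yields a noise adjustment, not the update rule itself.

```python
def refine_relative_distance(
    s_pred: float,       # S_{k-1} + V_{k-1} * Δt: noise-free prediction (m)
    w_s: float,          # initial distance noise W_{sk1} (m)
    p: float,            # width of the target object in the image
    f: float,            # focal length (same units as p)
    first_width: float,  # first actual width, determined from the image (m)
    threshold: float,    # preset threshold on the width difference (m)
    max_iters: int = 20,
) -> float:
    """Adjust the distance noise until the re-projected (third actual) width
    agrees with the first actual width, then return the second relative distance."""
    for _ in range(max_iters):
        s_k = s_pred + w_s
        width = p * s_k / f              # third actual width via the pinhole model
        if abs(width - first_width) < threshold:
            return s_k                   # constraint condition satisfied
        # Illustrative update: nudge the noise so the widths move closer.
        w_s += (first_width - width) * f / p
    return s_pred + w_s
```

Because the pinhole model is linear in the distance, this particular update converges after a single correction; a table-driven mapping from the relative difference, as the embodiment describes, may require several passes.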

Exemplary devices

Fig. 7 illustrates a block diagram of a ranging apparatus 700 of a target object according to an embodiment of the present application.

As shown in fig. 7, a ranging apparatus 700 for a target object according to an embodiment of the present invention includes: a first determining module 710, configured to determine a first actual width of the target object according to the image acquired by the image acquisition apparatus; an obtaining module 720, configured to obtain a relevant state parameter of the target object at a previous time; the first processing module 730, configured to obtain a first relative distance according to the relevant state parameter, where the first relative distance is a distance between the image acquisition device and the target object at the current time; a second processing module 740, configured to obtain a second actual width of the target object according to the first relative distance; a comparing module 750, configured to obtain a relative difference according to the first actual width and the second actual width; an adjusting module 760 for adjusting the first relative distance to the second relative distance according to the relative difference.

Fig. 8 illustrates the modules arranged before the first determining module 710 according to an embodiment of the application. Specifically, the apparatus further comprises: a first obtaining module 810, configured to obtain a width of the target object in the image;

a second determining module 820, configured to determine an initial width of the target object according to the width of the target object in the image and the projection relationship; the projection relation comprises a mapping relation between a preset reference surface and the image.

A third determining module 830, configured to determine the first actual width according to the initial width.

In one example, the first determining module 710 is specifically configured to input an image into a preset model, and determine a type of the target object; and adjusting the initial width of the target object according to the type of the target object to obtain the first actual width.

In an example, the obtaining module 720 is specifically configured to obtain a distance parameter and a speed parameter of the target object at a previous time; the first processing module 730 is specifically configured to obtain the first relative distance according to the relevant distance parameter at the previous time and the relevant speed parameter at the previous time.

Fig. 9 illustrates an example block diagram of a first processing module 730 according to an embodiment of this application. As shown in fig. 9, in an example, the first processing module 730 specifically includes: a second obtaining module 910, configured to obtain a time difference between the current time and the previous time; a third obtaining module 920, configured to obtain the relative movement distance of the current time according to the time difference and the related speed parameter of the previous time; a fourth obtaining module 930, configured to obtain the first relative distance based on the relative distance parameter at the previous time, the relative movement distance at the current time, and the distance noise.

In one example, the second processing module 740 includes: a fifth obtaining module, configured to obtain a focal length of the image acquisition device; a sixth obtaining module, configured to process the focal length, the width of the target object in the image, and the first relative distance to obtain the second actual width.

Fig. 10 illustrates an example block diagram of an adjustment module 760 according to an embodiment of the present application. As shown in fig. 10, in one example, the adjustment module 760 includes: a determining module 1010, configured to determine whether the relative difference is smaller than a preset threshold; a first adjusting submodule 1020, configured to determine, if yes, the first relative distance as the second relative distance; and a second adjusting submodule 1030, configured to, if not, adjust the distance noise according to the relative difference, and obtain the second relative distance according to the adjusted distance noise and the relevant state parameter, wherein the difference between the third actual width obtained from the second relative distance and the first actual width is smaller than the preset threshold.

Exemplary Mobile device

FIG. 11 illustrates a block diagram of a removable device according to an embodiment of the present application.

As shown in fig. 11, the removable device (electronic device 10) includes one or more processors 11 and memory 12.

The processor 11 may be a Central Processing Unit (CPU) or other form of processing unit having data processing capabilities and/or instruction execution capabilities, and may control other components in the removable device to perform desired functions.

Memory 12 may include one or more computer program products that may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. Volatile memory can include, for example, Random Access Memory (RAM) and/or cache memory. Non-volatile memory may include, for example, Read Only Memory (ROM), a hard disk, flash memory, and the like. One or more computer program instructions may be stored on a computer-readable storage medium and executed by the processor 11 to implement the ranging method for a target object of the various embodiments of the present application and/or other desired functions. Various contents such as an input signal, a signal component, a noise component, etc. may also be stored in the computer-readable storage medium.

In one example, the removable device may further include: an input device 13 and an output device 14, which are interconnected by a bus system and/or other form of connection mechanism (not shown).

For example, when the removable device is a first device or a second device, the input means 13 may be a microphone or a microphone array as described above for capturing an input signal of a sound source. When the electronic device is a stand-alone device, the input means 13 may be a communication network connector for receiving the acquired input signals from the first device and the second device.

The input device 13 may also include, for example, a keyboard, a mouse, and the like.

The output device 14 may output various information including determined distance information, direction information, and the like to the outside. The output devices 14 may include, for example, a display, speakers, a printer, and a communication network and its connected remote output devices, among others.

Of course, for simplicity, only some of the components of the removable device relevant to the present application are shown in fig. 11, omitting components such as buses, input/output interfaces, and the like. In addition, the removable device may include any other suitable components, depending on the particular application.

Exemplary computer program product and computer-readable storage Medium

In addition to the above-described methods and apparatus, embodiments of the present application may also be a computer program product comprising computer program instructions that, when executed by a processor, cause the processor to perform the steps in the ranging method for a target object according to various embodiments of the present application described in the above-mentioned "exemplary methods" section of this specification.

The computer program product may include program code for carrying out operations for embodiments of the present application in any combination of one or more programming languages, including an object-oriented programming language such as Java or C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.

Furthermore, embodiments of the present application may also be a computer-readable storage medium having stored thereon computer program instructions that, when executed by a processor, cause the processor to perform the steps in the ranging method for a target object according to various embodiments of the present application described in the "exemplary methods" section above in this specification.

A computer-readable storage medium may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may include, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.

The foregoing describes the general principles of the present application in conjunction with specific embodiments, however, it is noted that the advantages, effects, etc. mentioned in the present application are merely examples and are not limiting, and they should not be considered essential to the various embodiments of the present application. Furthermore, the foregoing disclosure of specific details is for the purpose of illustration and description and is not intended to be limiting, since the foregoing disclosure is not intended to be exhaustive or to limit the disclosure to the precise details disclosed.

The block diagrams of devices, apparatuses, and systems referred to in this application are given only as illustrative examples and are not intended to require or imply that the connections, arrangements, and configurations must be made in the manner shown in the block diagrams. These devices, apparatuses, and systems may be connected, arranged, or configured in any manner, as will be appreciated by those skilled in the art. Words such as "including," "comprising," "having," and the like are open-ended words that mean "including, but not limited to," and are used interchangeably therewith. The word "or" as used herein means, and is used interchangeably with, the word "and/or," unless the context clearly dictates otherwise. The phrase "such as" is used herein to mean, and is used interchangeably with, the phrase "such as but not limited to."

It should also be noted that in the devices, apparatuses, and methods of the present application, the components or steps may be decomposed and/or recombined. These decompositions and/or recombinations are to be considered as equivalents of the present application.

The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present application. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the application. Thus, the present application is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

The foregoing description has been presented for purposes of illustration and description. Furthermore, the description is not intended to limit embodiments of the application to the form disclosed herein. While a number of example aspects and embodiments have been discussed above, those of skill in the art will recognize certain variations, modifications, alterations, additions and sub-combinations thereof.
