Image synchronization method and device, equipment and computer storage medium

Document No.: 1909871    Publication date: 2021-11-30

Note: This technology, "Image synchronization method and device, equipment and computer storage medium", was created by Wu Jiacheng, Sun Dongliang and Zhang Shuai on 2021-09-26. Abstract: The embodiments of the present disclosure provide an image synchronization method and apparatus, a device, and a computer storage medium. The method includes: acquiring multiple groups of candidate images captured by multiple image acquisition devices for a target object, together with a time parameter corresponding to each candidate image; selecting one candidate image from each group of candidate images as an image to be analyzed, and constructing an image group to be analyzed from the multiple images to be analyzed; and, in response to the time parameters of the images to be analyzed all satisfying a preset synchronization condition, determining the image group to be analyzed as a synchronized image group corresponding to the target object.

1. An image synchronization method, characterized in that the method comprises:

acquiring a plurality of groups of candidate images acquired by a plurality of image acquisition devices for a target object, and a time parameter corresponding to each candidate image; wherein each image acquisition device corresponds to one group of candidate images in the plurality of groups of candidate images, and each group of candidate images comprises at least one candidate image;

respectively selecting one candidate image from each group of candidate images as an image to be analyzed, and constructing an image group to be analyzed based on a plurality of images to be analyzed;

and determining the image group to be analyzed as a synchronous image group corresponding to the target object in response to the time parameters of the images to be analyzed all satisfying a preset synchronization condition.

2. The method of claim 1, further comprising:

synchronously sending image acquisition trigger signals to the plurality of image acquisition devices;

when a candidate image acquired in response to the image acquisition trigger signal is received from any one of the image acquisition devices, recording the time parameter corresponding to the candidate image;

caching, in the order of the time parameters, the candidate images and the time parameters as correspondences into the corresponding image buffer queue, so as to construct each group of candidate images; wherein each image acquisition device corresponds to one image buffer queue.

3. The method of claim 2, wherein said synchronously sending image acquisition trigger signals to the plurality of image acquisition devices comprises:

acquiring a preset frame frequency corresponding to each image acquisition device, and determining a signal emission frequency based on a plurality of preset frame frequencies; wherein the signal emission frequency is less than or equal to any one of the preset frame frequencies;

and synchronously sending the image acquisition trigger signals to the plurality of image acquisition devices according to the signal emission frequency.

4. The method according to claim 2, wherein said selecting one of the candidate images from each group of candidate images as the image to be analyzed comprises:

selecting, from the group of candidate images cached in each image buffer queue, the candidate image with the earliest time parameter as an image to be analyzed, so as to obtain a plurality of images to be analyzed.

5. The method according to claim 4, wherein after constructing the image group to be analyzed based on the plurality of images to be analyzed, and before determining the image group to be analyzed as a synchronous image group corresponding to the target object in response to the time parameters of the images to be analyzed all satisfying a preset synchronization condition, the method further comprises:

determining the image to be analyzed with the latest time parameter in the image group to be analyzed as a target image;

calculating a time difference value between the time parameter corresponding to the target image and the time parameter corresponding to any other image to be analyzed in the image group to be analyzed;

and determining that the time parameters of the images to be analyzed all satisfy the preset synchronization condition in response to the time difference values all being smaller than or equal to a preset time threshold.

6. The method according to claim 5, wherein after calculating the time difference between the time parameter corresponding to the target image and the time parameter corresponding to any other image to be analyzed in the image group to be analyzed, the method further comprises:

discarding a first image to be analyzed in response to the time difference value between the time parameter corresponding to the target image and the time parameter corresponding to the first image to be analyzed in the image group to be analyzed being larger than the preset time threshold; wherein the first image to be analyzed is any image to be analyzed other than the target image in the image group to be analyzed;

continuing to select, from the first image buffer queue to which the first image to be analyzed belongs, the image with the earliest time parameter as a second image to be analyzed;

and updating the image group to be analyzed based on the second image to be analyzed, and continuing to execute the determination processing of the target image and the judgment processing of the preset synchronization condition based on the updated image group to be analyzed.

7. The method according to any one of claims 1 to 6, wherein after determining the image group to be analyzed as a synchronous image group corresponding to the target object in response to the time parameters of the images to be analyzed all satisfying a preset synchronization condition, the method further comprises:

preprocessing each image to be analyzed in the synchronous image group to obtain a processed synchronous image group meeting preset model input conditions;

and inputting the processed synchronous image group into a target algorithm model for image analysis processing.

8. The method according to any one of claims 2 to 6, wherein after determining the image group to be analyzed as a synchronous image group corresponding to the target object in response to the time parameters of the images to be analyzed all satisfying a preset synchronization condition, the method further comprises:

and continuing to construct a next image group to be analyzed based on the image with the earliest time parameter in each image buffer queue, and executing the determination processing of a next synchronous image group corresponding to the target object.

9. The method according to claim 1, wherein the plurality of image capturing devices are disposed at a plurality of angular orientations relative to the target object, and one image acquisition device is arranged in each angular direction.

10. A computer device comprising a memory and a processor, the memory storing a computer program operable on the processor, wherein the processor, when executing the program, is configured to:

acquiring a plurality of groups of candidate images acquired by a plurality of image acquisition devices for a target object, and a time parameter corresponding to each candidate image; wherein each image acquisition device corresponds to one group of candidate images in the plurality of groups of candidate images, and each group of candidate images comprises at least one candidate image;

respectively selecting one candidate image from each group of candidate images as an image to be analyzed, and constructing an image group to be analyzed based on a plurality of images to be analyzed;

and determining the image group to be analyzed as a synchronous image group corresponding to the target object in response to the time parameters of the images to be analyzed all satisfying a preset synchronization condition.

11. The computer device of claim 10, wherein the processor is further configured to:

synchronously sending image acquisition trigger signals to the plurality of image acquisition devices;

when a candidate image acquired in response to the image acquisition trigger signal is received from any one of the image acquisition devices, recording the time parameter corresponding to the candidate image;

caching, in the order of the time parameters, the candidate images and the time parameters as correspondences into the corresponding image buffer queue, so as to construct each group of candidate images; wherein each image acquisition device corresponds to one image buffer queue.

12. The computer apparatus of claim 11, wherein, when synchronously transmitting image acquisition trigger signals to the plurality of image acquisition devices, the processor is configured to:

acquiring a preset frame frequency corresponding to each image acquisition device, and determining a signal emission frequency based on a plurality of preset frame frequencies; wherein the signal emission frequency is less than or equal to any one of the preset frame frequencies;

and synchronously sending the image acquisition trigger signals to the plurality of image acquisition devices according to the signal emission frequency.

13. The computer device of claim 11, wherein when selecting one of the candidate images from each of the groups of candidate images as the image to be analyzed, the processor is configured to:

selecting, from the group of candidate images cached in each image buffer queue, the candidate image with the earliest time parameter as an image to be analyzed, so as to obtain a plurality of images to be analyzed.

14. The computer device of claim 13, wherein after constructing a group of images to be analyzed based on a plurality of images to be analyzed, and before determining the group of images to be analyzed as a group of synchronized images corresponding to the target object in response to the time parameters of the images to be analyzed all satisfying a preset synchronization condition, the processor is further configured to:

determining the image to be analyzed with the latest time parameter in the image group to be analyzed as a target image;

calculating a time difference value between the time parameter corresponding to the target image and the time parameter corresponding to any other image to be analyzed in the image group to be analyzed;

and determining that the time parameters of the images to be analyzed all satisfy the preset synchronization condition in response to the time difference values all being smaller than or equal to a preset time threshold.

15. The computer device of claim 14, wherein after calculating the time difference between the time parameter corresponding to the target image and the time parameter corresponding to any other image to be analyzed in the group of images to be analyzed, the processor is further configured to:

discarding a first image to be analyzed in response to the time difference value between the time parameter corresponding to the target image and the time parameter corresponding to the first image to be analyzed in the image group to be analyzed being larger than the preset time threshold; wherein the first image to be analyzed is any image to be analyzed other than the target image in the image group to be analyzed;

continuing to select, from the first image buffer queue to which the first image to be analyzed belongs, the image with the earliest time parameter as a second image to be analyzed;

and updating the image group to be analyzed based on the second image to be analyzed, and continuing to execute the determination processing of the target image and the judgment processing of the preset synchronization condition based on the updated image group to be analyzed.

16. The computer device according to any one of claims 10 to 15, wherein after determining the group of images to be analyzed as one group of synchronized images corresponding to the target object in response to the time parameters of the images to be analyzed each satisfying a preset synchronization condition, the processor is further configured to:

preprocessing each image to be analyzed in the synchronous image group to obtain a processed synchronous image group meeting preset model input conditions;

and inputting the processed synchronous image group into a target algorithm model for image analysis processing.

17. The computer device according to any one of claims 11 to 15, wherein after determining the group of images to be analyzed as one group of synchronized images corresponding to the target object in response to the time parameters of the images to be analyzed each satisfying a preset synchronization condition, the processor is further configured to:

and continuing to construct a next image group to be analyzed based on the image with the earliest time parameter in each image buffer queue, and executing the determination processing of a next synchronized image group corresponding to the target object.

18. The computer apparatus of claim 10, wherein the plurality of image capture devices are disposed at a plurality of angular orientations relative to the target object, and one image acquisition device is arranged in each angular direction.

19. A computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, is configured to:

acquiring a plurality of groups of candidate images acquired by a plurality of image acquisition devices for a target object, and a time parameter corresponding to each candidate image; wherein each image acquisition device corresponds to one group of candidate images in the plurality of groups of candidate images, and each group of candidate images comprises at least one candidate image;

respectively selecting one candidate image from each group of candidate images as an image to be analyzed, and constructing an image group to be analyzed based on a plurality of images to be analyzed;

and determining the image group to be analyzed as a synchronous image group corresponding to the target object in response to the time parameters of the images to be analyzed all satisfying a preset synchronization condition.

20. A computer program comprising computer instructions executable by an electronic device, wherein the computer instructions, when executed by a processor in the electronic device, are configured to:

acquiring a plurality of groups of candidate images acquired by a plurality of image acquisition devices for a target object, and a time parameter corresponding to each candidate image; wherein each image acquisition device corresponds to one group of candidate images in the plurality of groups of candidate images, and each group of candidate images comprises at least one candidate image;

respectively selecting one candidate image from each group of candidate images as an image to be analyzed, and constructing an image group to be analyzed based on a plurality of images to be analyzed;

and determining the image group to be analyzed as a synchronous image group corresponding to the target object in response to the time parameters of the images to be analyzed all satisfying a preset synchronization condition.

Technical Field

The present disclosure relates to the field of intelligent video analysis, and in particular, to an image synchronization method, an image synchronization apparatus, a computer storage medium, and a computer program product.

Background

At present, in specific scenes such as live-broadcast scenes and table game scenes, a plurality of cameras arranged in different angular directions are commonly used for cooperative processing: the plurality of cameras are triggered to shoot the same target object from different angles and return a plurality of pictures, and information analysis and information fusion are then carried out on the plurality of pictures, so that the perceived target object is more consistent with what human vision sees in the real world. Therefore, the multiple cameras need to comply with strict frame synchronization requirements.

However, due to timing errors caused by network transmission or by the camera hardware itself, the multiple images returned by the multiple cameras may not be frame-synchronized.

Disclosure of Invention

The embodiment of the disclosure provides an image synchronization method, an image synchronization device, image synchronization equipment and a computer storage medium.

The technical solutions of the embodiments of the present disclosure are implemented as follows:

the embodiment of the disclosure provides an image synchronization method, which includes:

acquiring a plurality of groups of candidate images acquired by a plurality of image acquisition devices for a target object and a time parameter corresponding to each candidate image; wherein each image acquisition device corresponds to one group of candidate images in the plurality of groups of candidate images, and each group of candidate images comprises at least one candidate image; respectively selecting one candidate image from each group of candidate images as an image to be analyzed, and constructing an image group to be analyzed based on a plurality of images to be analyzed; and determining the image group to be analyzed as a synchronous image group corresponding to the target object in response to the time parameters of the images to be analyzed all satisfying a preset synchronization condition.

In this way, by acquiring the multiple groups of candidate images acquired by the multiple image acquisition devices for the target object and the time parameter corresponding to each candidate image, an image group to be analyzed, constructed from one candidate image of the group of candidate images acquired by each image acquisition device, can be subjected to a synchronization judgment based on the time parameters, until a synchronized image group whose time parameters satisfy the preset synchronization condition is determined.

In some embodiments, the method further comprises: synchronously sending image acquisition trigger signals to the plurality of image acquisition devices; when a candidate image acquired in response to the image acquisition trigger signal is received from any one of the image acquisition devices, recording the time parameter corresponding to the candidate image; and caching, in the order of the time parameters, the candidate images and the time parameters as correspondences into the corresponding image buffer queue, so as to construct each group of candidate images; wherein each image acquisition device corresponds to one image buffer queue.

In some embodiments, said synchronously sending image acquisition trigger signals to said plurality of image acquisition devices comprises: acquiring a preset frame frequency corresponding to each image acquisition device, and determining a signal emission frequency based on a plurality of preset frame frequencies; wherein the signal emission frequency is less than or equal to any one of the preset frame frequencies; and synchronously sending the image acquisition trigger signals to the plurality of image acquisition devices according to the signal emission frequency.

Therefore, the transmitting frequency of the image acquisition trigger signal is determined according to the frame frequency of the image acquisition device, and more image data can be obtained in unit time.

In some embodiments, the selecting one of the candidate images from each of the groups of candidate images as an image to be analyzed includes: selecting, from the group of candidate images cached in each image buffer queue, the candidate image with the earliest time parameter as an image to be analyzed, so as to obtain a plurality of images to be analyzed.

In some embodiments, after constructing the image group to be analyzed based on the plurality of images to be analyzed, and before determining the image group to be analyzed as one synchronized image group corresponding to the target object in response to the time parameters of the images to be analyzed all satisfying the preset synchronization condition, the method further includes: determining the image to be analyzed with the latest time parameter in the image group to be analyzed as a target image; calculating a time difference value between the time parameter corresponding to the target image and the time parameter corresponding to any other image to be analyzed in the image group to be analyzed; and determining that the time parameters of the images to be analyzed all satisfy the preset synchronization condition in response to the time difference values all being smaller than or equal to a preset time threshold.

Therefore, the image to be analyzed with the latest time parameter in the image group to be analyzed is determined as the target image, its time parameter is then compared with the time parameter corresponding to each other image to be analyzed, and a group of frame-synchronized images of the target object is determined when the differences between the time parameters all fall within the preset error range.

In some embodiments, after calculating the time difference between the time parameter corresponding to the target image and the time parameter corresponding to any other image to be analyzed in the image group to be analyzed, the method further includes: discarding a first image to be analyzed in response to the time difference value between the time parameter corresponding to the target image and the time parameter corresponding to the first image to be analyzed being larger than the preset time threshold, the first image to be analyzed being any image to be analyzed other than the target image in the image group to be analyzed; continuing to select, from the first image buffer queue to which the first image to be analyzed belongs, the image with the earliest time parameter as a second image to be analyzed; and updating the image group to be analyzed based on the second image to be analyzed, and continuing to execute the determination processing of the target image and the judgment processing of the preset synchronization condition based on the updated image group to be analyzed.

In this way, when there is an image to be analyzed whose time-parameter difference does not fall within the preset error range, that image is discarded, a new image to be analyzed with the earliest time parameter is taken from the corresponding image buffer queue to update the image group to be analyzed, and the determination of the target image and the judgment of the preset synchronization condition are repeated until image frame synchronization is determined.

In some embodiments, after the determining the image group to be analyzed as one synchronized image group corresponding to the target object in response to that the time parameters of the images to be analyzed all satisfy the preset synchronization condition, the method further includes: preprocessing each image to be analyzed in the synchronous image group to obtain a processed synchronous image group meeting preset model input conditions; and inputting the processed synchronous image group into a target algorithm model for image analysis processing.

In some embodiments, after the determining the image group to be analyzed as one synchronized image group corresponding to the target object in response to the time parameters of the images to be analyzed all satisfying the preset synchronization condition, the method further includes: continuing to construct a next image group to be analyzed based on the image with the earliest time parameter in each image buffer queue, and executing the determination processing of the next synchronized image group corresponding to the target object.

In some embodiments, the plurality of image capture devices are disposed at a plurality of angular orientations relative to the target object, and one image acquisition device is arranged in each angular direction.

An embodiment of the present disclosure provides an image synchronization apparatus, including:

the acquisition module is configured to acquire a plurality of groups of candidate images acquired by a plurality of image acquisition devices for a target object and a time parameter corresponding to each candidate image; wherein each image acquisition device corresponds to one group of candidate images in the plurality of groups of candidate images, and each group of candidate images comprises at least one candidate image;

the selection module is configured to select one candidate image from each group of candidate images as an image to be analyzed;

the construction module is configured to construct an image group to be analyzed based on a plurality of images to be analyzed;

and the determining module is configured to determine the image group to be analyzed as a synchronous image group corresponding to the target object in response to the time parameters of the images to be analyzed all satisfying a preset synchronization condition.

An embodiment of the present disclosure provides a computer device, including a memory and a processor, where the memory stores a computer program executable on the processor, and the processor implements the steps of the method when executing the program.

The disclosed embodiments provide a computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method described above.

The embodiments of the present disclosure provide an image synchronization method, apparatus and device, and a storage medium. Multiple groups of candidate images acquired by multiple image acquisition devices for a target object, together with the time parameter corresponding to each candidate image, are acquired, and a synchronization judgment based on the time parameters is performed on an image group to be analyzed constructed from one candidate image of each group, until a synchronized image group whose time parameters satisfy a preset synchronization condition is determined. In this way, the images from the cameras are judged for synchronization based on their timestamps, which ensures that each group of images output by the cameras to the algorithm for detection, recognition and information fusion is synchronized, thereby meeting the strict frame synchronization requirement.

It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.

Drawings

Fig. 1 is a schematic flow chart of a first implementation of an image synchronization method according to an embodiment of the present disclosure;

fig. 2A is a schematic flow chart illustrating an implementation process of an image synchronization method according to an embodiment of the present disclosure;

fig. 2B is a schematic flow chart illustrating an implementation of the image synchronization method according to the embodiment of the disclosure;

fig. 3A is a schematic flow chart illustrating an implementation of an image synchronization method according to an embodiment of the present disclosure;

fig. 3B is a schematic flow chart illustrating an implementation of the image synchronization method according to the embodiment of the present disclosure;

fig. 4 is a schematic structural diagram of an image synchronization apparatus according to an embodiment of the present disclosure;

fig. 5 is a schematic structural diagram of a computer device provided in an embodiment of the present disclosure.

Detailed Description

In order to make the objects, technical solutions and advantages of the embodiments of the present disclosure more clear, specific technical solutions of the present disclosure will be described in further detail below with reference to the accompanying drawings in the embodiments of the present disclosure. The following examples are intended to illustrate the present disclosure, but are not intended to limit the scope of the present disclosure.

In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is understood that "some embodiments" may be the same subset or different subsets of all possible embodiments, and may be combined with each other without conflict.

In the following description, the terms "first", "second" and "third" are used only to distinguish similar objects and do not denote a particular order; where permissible, the specific order or sequence may be interchanged, so that the embodiments of the disclosure described herein can be practiced in an order other than that shown or described herein.

Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs. The terminology used herein is for the purpose of describing embodiments of the disclosure only and is not intended to be limiting of the disclosure.

Before the embodiments of the present disclosure are described in further detail, the terms and expressions referred to in the embodiments of the present disclosure are explained; they have the following meanings.

1) Callback function: a function that is passed as a parameter to another function. "Callback" literally means having the system call back a function we have specified in advance; the procedure is stored so that it can be invoked later at a particular time.

At present, in specific scenes such as live-broadcast scenes and table game scenes, a plurality of cameras arranged in different angular directions are commonly used for cooperative processing: the plurality of cameras are triggered to shoot the same target object from different angles and return a plurality of pictures, and information analysis and information fusion are then carried out on the plurality of pictures, so that the perceived target object is more consistent with what human vision sees in the real world. Therefore, the multiple cameras need to comply with strict frame synchronization requirements.

However, due to timing errors caused by network transmission or by the camera hardware itself, the multiple images returned by the multiple cameras may not be frame-synchronized.

The embodiments of the present disclosure provide an image synchronization method, an image synchronization apparatus, a device, and a computer storage medium. Multiple groups of candidate images acquired by multiple image acquisition devices for a target object, together with the time parameter corresponding to each candidate image, are acquired, and a synchronization judgment based on the time parameters is performed on an image group to be analyzed constructed from one candidate image of each group, until a synchronized image group whose time parameters satisfy a preset synchronization condition is determined. In this way, the images from the cameras are judged for synchronization based on their timestamps, which ensures that each group of images output by the cameras to the algorithm for detection, recognition and information fusion is synchronized, thereby meeting the strict frame synchronization requirement.

The embodiments of the present disclosure provide an image synchronization method applied to a computer device, on which an end-to-end visual model generation platform is deployed; the platform embeds a general artificial-intelligence model training framework commonly used in vision fields such as object detection and image classification.

An exemplary application of the image synchronization apparatus provided by the embodiments of the present disclosure is described below. The image synchronization apparatus includes, but is not limited to, a computer, a notebook computer, a tablet computer, a multimedia device, a mobile internet device or other types of devices, as well as a server, a distributed computing node, or other devices with computing capability.

The functions implemented by the method can be implemented by calling a program code by a processor in a computer device, and the program code can be stored in a computer storage medium.

The embodiment of the present disclosure provides an image synchronization method, fig. 1 is a schematic view illustrating an implementation flow of the image synchronization method provided by the embodiment of the present disclosure, and as shown in fig. 1, the method for executing image synchronization includes the following steps:

s101, acquiring a plurality of groups of candidate images acquired by a plurality of image acquisition devices aiming at a target object and a time parameter corresponding to each candidate image; each image acquisition device corresponds to one group of candidate images in a plurality of groups of candidate images, and each group of candidate images comprises at least one candidate image.

In the embodiment of the present disclosure, the image synchronization apparatus may first acquire the plurality of groups of candidate images acquired by the plurality of image acquisition apparatuses for the target object, and the time parameter corresponding to each candidate image.

In some embodiments, a target object refers to a target person in a current scene (e.g., a table game scene or a stage scene); the image acquisition device may be a device for photographing the target object, such as a camera, independent of the image synchronization device.

Here, in order to acquire motion information, body characteristic information and the like of the target object in each angular direction, a plurality of image capturing devices may be provided; the plurality of image capturing devices are respectively arranged facing the target person in the various angular directions, so that the action information and body characteristic information of the target person in each angular direction under the current scene can be captured.

In one embodiment, the plurality of image acquisition devices are in communication connection with the image synchronization device for information interaction; each image acquisition device may be connected to the image synchronization device by wire or wirelessly, or some of the image acquisition devices may be connected by wire while the others are connected wirelessly.

For example, assuming that the current scene is a table game scene, the image synchronization device is a computer device and the image acquisition devices are cameras, cameras are arranged in various angular directions around the gaming area of the game table, including above, below, front, rear, left and right; the cameras and the computer are located in the same local area network or are all connected to the same wireless hotspot and can exchange information with the computer, for example, the computer may send instructions to the cameras or receive image data acquired by the cameras for a target person.

In some embodiments, each image capturing device has its corresponding device identifier, and the image synchronization device may store the images returned by each image capturing device according to the device identifiers, so as to obtain multiple sets of candidate images corresponding to multiple image capturing devices, in other words, a set of candidate images corresponding to each image capturing device. Here, at least one candidate image is included in each set of candidate images.

In the embodiment of the present disclosure, when receiving images acquired by a plurality of image acquisition devices for a target object, an image synchronization device may record a time parameter corresponding to each candidate image, and store the candidate images and the time parameters in a form of a corresponding relationship.

The time parameter may be a receiving time recorded by the image synchronization apparatus when receiving the image, and the time parameter may also be an acquisition time when the image acquisition apparatus acquires the image, in other words, the time parameter may be a receiving timestamp or an acquisition timestamp of the candidate image, which is not specifically limited in this application.

It should be noted that, when the image synchronization device executes the image synchronization method provided by the present disclosure, the candidate images and the corresponding time parameters may be acquired in real time, that is, the candidate images returned by the image acquisition devices are received in real time and the time parameters are recorded in real time; or offline, that is, historically cached candidate images returned by the image acquisition devices are read together with the historically recorded time parameters.

S102, selecting one candidate image from each group of candidate images as an image to be analyzed, and constructing an image group to be analyzed based on a plurality of images to be analyzed.

In the embodiment of the present disclosure, after acquiring a plurality of sets of candidate images acquired by a plurality of image acquisition devices for a target object and a time parameter corresponding to each of the candidate images, construction of a set of images to be analyzed may be performed based on one of each set of candidate images.

In some embodiments, one candidate image may be selected from each set of candidate images acquired by each image acquisition device as an image to be analyzed, and the image set to be analyzed may be constructed based on the plurality of images to be analyzed.

When selecting the images to be analyzed, the selection may follow the order of the time parameters, that is, the candidate image with the earliest time parameter in each group of candidate images is selected as the image to be analyzed.

S103, in response to the fact that the time parameters of the images to be analyzed all meet preset synchronization conditions, determining the image group to be analyzed as a synchronization image group corresponding to the target object.

In the embodiment of the present disclosure, after the images to be analyzed are selected from each group of candidate images and constructed to form the image group to be analyzed, the images to be analyzed in the image group to be analyzed may be subjected to the synchronous determination of the time parameter.

In some embodiments, the image synchronization apparatus may preset a synchronization condition under which the plurality of images to be analyzed in the image group to be analyzed are considered frame-synchronized. The synchronization condition can be expressed in terms of the time parameters; in other words, if the time parameters satisfy the condition, the images to be analyzed are considered frame-synchronized.

In the embodiment of the present disclosure, the image synchronization apparatus may determine whether the time parameter corresponding to each image to be analyzed in the image group to be analyzed satisfies a preset synchronization condition, and determine that the image group to be analyzed is a synchronization image group when the time parameters corresponding to a plurality of images to be analyzed all satisfy the preset synchronization condition. Here, the plurality of images in the group of synchronized images are frame-synchronized with each other.

It is to be understood that, after one synchronized image group of the target image is obtained, the image synchronization apparatus may proceed with the determination processing of the next synchronized image group.

The image synchronization device may continue to select, from the remaining candidate images in the group of candidate images acquired by each image acquisition device, the candidate image with the earliest time parameter as an image to be analyzed for the next round, construct a new image group to be analyzed based on the plurality of images to be analyzed, and then continue to judge whether the time parameters of the images to be analyzed satisfy the preset synchronization condition, until the next synchronized image group of the target object is obtained.

Thereafter, the image synchronization method of S102 and S103 described above is repeated, and determination processing of the next synchronization image group of the target object is performed.

In some embodiments, in the embodiments of the present disclosure, after determining one synchronization image group of the target object, the image synchronization apparatus may input the synchronization image group into a subsequent algorithm model for information analysis and information fusion.

It can be understood that different algorithm models have different requirements on the image input format. Therefore, in the embodiment of the present disclosure, preprocessing, such as image cropping and image format conversion, may be performed on the multiple frame-synchronized images in the synchronized image group, so as to obtain multiple frame-synchronized images that satisfy the input requirements of the algorithm model. The frame-synchronized image group that meets the algorithm's input requirements is then input into the algorithm model for information analysis and information fusion.
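As one possible illustration of such preprocessing (the target input size, the RGB conversion and the use of OpenCV and NumPy are assumptions made for this sketch, not requirements of the disclosure; the actual preset model input conditions depend on the target algorithm model), each frame-synchronized image might be resized and normalized before the group is stacked for the model:

```python
import numpy as np
import cv2  # assumed available; any image library with resize/convert would do

def preprocess_sync_group(sync_group, input_size=(224, 224)):
    """Resize each frame-synchronized image to a hypothetical model input size and
    convert it to float32 RGB in [0, 1]; the exact steps depend on the target model."""
    processed = []
    for image in sync_group:                      # sync_group: list of BGR uint8 images
        resized = cv2.resize(image, input_size)
        rgb = cv2.cvtColor(resized, cv2.COLOR_BGR2RGB)
        processed.append(rgb.astype(np.float32) / 255.0)
    return np.stack(processed)                    # shape: (num_views, H, W, 3)
```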

In the image synchronization method described above, multiple groups of candidate images acquired by multiple image acquisition devices for a target object, together with the time parameter corresponding to each candidate image, are acquired, and a synchronization judgment based on the time parameters is performed on an image group to be analyzed constructed from one candidate image of each group, until a synchronized image group whose time parameters satisfy the preset synchronization condition is determined. In this way, the images from the cameras are judged for synchronization based on their timestamps, which ensures that each group of images output by the cameras to the algorithm for detection, recognition and information fusion is synchronized, thereby meeting the strict frame synchronization requirement.

Based on the above embodiment, in an embodiment of the present disclosure, fig. 2A is a schematic flow chart illustrating an implementation process of an image synchronization method provided in the embodiment of the present disclosure, and as shown in fig. 2A, the method for performing image synchronization further includes:

s201, synchronously sending an image acquisition trigger signal to a plurality of image acquisition devices.

In the embodiment of the present disclosure, the image synchronization device is in communication connection with the plurality of image capturing devices. To help the images captured by the plurality of image capturing devices achieve frame synchronization, a certain synchronization means may also be applied during image capture, such as synchronously triggered acquisition.

Here, the synchronously triggered acquisition may be synchronous hard triggering, for example, a hardware switch that is switched on synchronously to trigger synchronous acquisition; or synchronous soft triggering, for example, synchronously sending image acquisition trigger signals.

The image synchronization device may adopt a means of synchronously sending image acquisition trigger signals to the plurality of image acquisition devices to promote frame synchronization of images acquired by the plurality of image acquisition devices.

In some embodiments, to ensure that more images can be acquired while the image frames are synchronized, the image synchronization device may determine the frequency of the image acquisition trigger signal based on the frame frequency of the image acquisition device.

Fig. 2B is a schematic view of a third implementation flow of an image synchronization method provided in an embodiment of the present disclosure, and as shown in fig. 2B, in the embodiment of the present disclosure, a method for synchronously sending an image acquisition trigger signal to a plurality of image acquisition devices may include the following steps:

s201a, acquiring a preset frame frequency corresponding to each image acquisition device, and determining a signal emission frequency based on a plurality of preset frame frequencies; and the signal transmitting frequency is less than or equal to any preset frame frequency.

S201b, synchronously sending image acquisition trigger signals to a plurality of image acquisition devices according to the signal emission frequency.

It should be noted that, in the embodiment of the present disclosure, the frame frequencies corresponding to the plurality of image capturing devices may be the same or different.

In the embodiment of the disclosure, in order to ensure that the image capturing device can capture images to the maximum extent, the image synchronization device may obtain frame frequencies corresponding to the plurality of image capturing devices, respectively, and determine the transmission frequency of the image capturing trigger signal based on the minimum frame frequency of the plurality of frame frequencies.

In an embodiment, if the frame frequencies corresponding to the plurality of image capturing devices are the same, the frame frequency of any image capturing device may be determined as the transmission frequency of the image capturing trigger signal, at this time, the transmission frequency of the image capturing trigger signal reaches the maximum value of the frame frequency of the image capturing device, and then the image capturing device may capture and obtain more image data in a unit time.

In another embodiment, if the frame frequencies corresponding to the plurality of image capturing devices are different, the minimum frame frequency in the image capturing devices may be determined as the emission frequency of the image capturing trigger signal, and at this time, the emission frequency of the image capturing trigger signal may reach the maximum value under the condition of ensuring good and effective image capturing.

Here, after determining the transmission frequency of the image acquisition trigger signal, the image synchronization device may synchronously send the image acquisition trigger signal to the plurality of image acquisition devices at that transmission frequency.
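To make the relationship between the preset frame frequencies and the signal transmission frequency concrete, the following minimal Python sketch (the names `preset_frame_rates`, `devices` and `send_trigger` are hypothetical, not part of this disclosure) picks the minimum preset frame frequency as the transmission frequency and sends the trigger to every device at that rate:

```python
import time

def determine_trigger_frequency(preset_frame_rates):
    """The signal transmission frequency must not exceed any preset frame frequency,
    so the minimum preset frame rate is used (the largest value that is still valid)."""
    return min(preset_frame_rates)

def send_triggers_synchronously(devices, preset_frame_rates, send_trigger, duration_s=1.0):
    """Send the image acquisition trigger signal to all devices at the derived frequency.
    `send_trigger(device)` is assumed to deliver one trigger signal to one device."""
    interval_s = 1.0 / determine_trigger_frequency(preset_frame_rates)
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        for device in devices:       # one trigger per device, sent back to back
            send_trigger(device)
        time.sleep(interval_s)       # wait one trigger period before the next round
```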

S202, when a candidate image which is returned by any image acquisition device and acquired in response to an image acquisition trigger signal is received, recording a time parameter corresponding to the candidate image.

In the embodiment of the present disclosure, after the image synchronization device synchronously sends the image capturing trigger signal to the plurality of image capturing devices, each image capturing device may perform image capturing processing for the target object in response to the trigger signal.

It should be noted that, because of network transmission, the image acquisition trigger signal may not reach all image acquisition devices at the same time, so the plurality of image acquisition devices may fail to acquire images synchronously; defects in the hardware of the image acquisition devices themselves may likewise prevent synchronous image acquisition.

After the image acquisition device performs the image acquisition processing on the target object in response to the image acquisition trigger signal, the image synchronization device may receive the image acquired by the image acquisition device on the target object, that is, the candidate image.

In some embodiments, the image capturing device may record the time of capture of the candidate image and return the candidate image and the corresponding capture time to the image synchronization device, at which point the image synchronization device may record the capture time and determine it as the time parameter corresponding to the candidate image.

It is understood that, since the time for transmitting the images from each image capturing device to the image synchronization device is substantially the same, in order to reduce the amount of data calculation on the image capturing device side, in the embodiment of the present disclosure, the image synchronization device may record the receiving time when receiving any candidate image returned by any image capturing device in the plurality of image capturing devices, and determine the receiving time as the time parameter corresponding to any candidate image.

Here, since the time for transmitting the images from the respective image capturing devices to the image synchronizing device is substantially the same, the interval of the capturing time between different images can be expressed well using the interval of the receiving time, while the amount of data calculation on the image capturing device side can be reduced.

S203, caching, in the order of the time parameters, the candidate images and the time parameters as correspondences into the corresponding image buffer queue to construct each group of candidate images; each image acquisition device corresponds to one image buffer queue.

After the receiving time corresponding to each candidate image is determined as the time parameter corresponding to the candidate image, a corresponding relationship between each candidate image and the corresponding time parameter may be established and stored.

Here, each image capturing apparatus has its corresponding device identifier, and the image synchronization apparatus can accurately determine which image capturing apparatus the candidate image is from when the captured image is received, based on the device identifier.

In the embodiment of the present disclosure, in order to better organize the images acquired by the multiple image acquisition devices, the image synchronization device may allocate a storage area to each image acquisition device to store the correspondence between the candidate images returned by that device and their time parameters. The storage areas can be distinguished by the device identifiers of the image acquisition devices.

It can be understood that the later an image is acquired, the later the image synchronization device receives the returned image. In order to better reflect the order of image acquisition or reception, the image synchronization device sets an image buffer queue for each image acquisition device to store its images. Each image buffer queue stores the correspondences between candidate images and time parameters in the order of the time parameters, following a first-in, first-out rule.

In other words, each image buffer queue is used for storing a group of candidate images obtained by shooting the target object by each image acquisition device in one angle direction.

In the embodiment of the present disclosure, the image synchronization apparatus may register a callback function in the execution of the related code; the callback function records the image receiving timestamp and stores the image and the timestamp, as a correspondence, in the corresponding image buffer queue.

Here, the callback function may be set after the code step of receiving the candidate image returned by the image capturing apparatus. Therefore, when a candidate image returned by any image acquisition device is received, the receiving time of the candidate image, namely the time parameter corresponding to the candidate image, can be recorded through the callback function, and the candidate image and the time parameter are stored in the image cache queue corresponding to the image acquisition device in a corresponding relationship mode.
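The buffering described above can be sketched in Python as follows; the `FrameBuffers` class, the use of the receive time as the time parameter, and the callback name are illustrative assumptions rather than the disclosed implementation:

```python
import time
from collections import deque, namedtuple

# Each cached entry pairs a candidate image with its time parameter (here: receive time).
CachedFrame = namedtuple("CachedFrame", ["image", "time_param"])

class FrameBuffers:
    """One FIFO image buffer queue per image acquisition device, keyed by device ID."""

    def __init__(self, device_ids):
        self.queues = {dev_id: deque() for dev_id in device_ids}

    def on_image_received(self, device_id, image):
        """Callback invoked when a device returns a candidate image: record the receive
        time as the time parameter and enqueue the (image, time) pair. Because frames
        are enqueued in receive order, each queue stays sorted by time parameter."""
        self.queues[device_id].append(CachedFrame(image, time.monotonic()))
```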

Therefore, in the embodiment of the disclosure, the emission frequency of the image capturing trigger signal is determined according to the frame frequency of the image capturing device, so that more image data can be obtained in unit time.

Based on the above embodiment, in yet another embodiment of the present disclosure, fig. 3A is a schematic flow chart of an implementation of an image synchronization method provided by the embodiment of the present disclosure, and as shown in fig. 3A, the method for performing image synchronization may include the following steps:

s301, selecting a candidate image with the most front time parameter sequence from a group of candidate images cached in each image cache queue as an image to be analyzed to obtain a plurality of images to be analyzed.

S302, determining the image to be analyzed with the latest time parameter in the image group to be analyzed as a target image.

S303, calculating a time difference value between the time parameter corresponding to the target image and the time parameter corresponding to any other image to be analyzed in the image group to be analyzed.

S304, in response to the time difference values all being smaller than or equal to a preset time threshold, determining that the time parameters of the images to be analyzed all satisfy the preset synchronization condition.

In the embodiment of the present disclosure, in the process of performing frame synchronization judgment on images acquired by multiple image acquisition devices, one candidate image may be selected from an image buffer queue for storing candidate images of the image acquisition devices, respectively, as an image to be analyzed.

It can be understood that the images are processed and analyzed in the order of their acquisition times. Here, based on the image caching rules of the image buffer queues described in the image synchronization method of S201-S202, the candidate image with the earliest time parameter can be selected from the group of candidate images in each image buffer queue as an image to be analyzed, and an image group to be analyzed is constructed based on the multiple images to be analyzed.

Then, the image synchronization device may determine one image to be analyzed from the plurality of images to be analyzed in the image group to be analyzed as a target image, and then determine synchronization between the plurality of image frames to be analyzed based on a time difference between a time parameter corresponding to the target image and a time parameter corresponding to any of the other remaining images to be analyzed.

The target image may be the image to be analyzed whose time parameter is ranked last among the plurality of images to be analyzed; in other words, the target image is the image that the image synchronization device received latest. The target image may also be the image to be analyzed with the earliest or a middle-ranked time parameter, which is not specifically limited in this application.

In the case where the target image is the image to be analyzed with the latest time parameter among the images to be analyzed, the time difference between the time parameter corresponding to the target image and the time parameter corresponding to each of the other images to be analyzed is calculated, and whether the time parameter corresponding to each image to be analyzed satisfies the preset synchronization condition is determined based on the comparison of each time difference with the preset time threshold.

Here, the preset time threshold is an error range, preset by the image synchronization apparatus, within which the difference between time parameters is regarded as satisfying the preset frame synchronization condition. The preset time threshold may be set based on the frame rate of the camera; for example, for a camera with a frame rate of 10 FPS, the preset time threshold may be set to 20 ms.
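The disclosure does not prescribe a formula for the preset time threshold; as one hedged illustration, it could be taken as a fixed fraction of the frame period, with the fraction 0.2 chosen here only so that the 10 FPS / 20 ms example above is reproduced:

def preset_time_threshold_ms(frame_rate_fps, fraction=0.2):
    """Illustrative rule of thumb: threshold = fraction of the frame period.

    The fraction value 0.2 is an assumption made for this sketch; it merely
    reproduces the example of a 20 ms threshold for a 10 FPS camera.
    """
    frame_period_ms = 1000.0 / frame_rate_fps
    return fraction * frame_period_ms


assert abs(preset_time_threshold_ms(10) - 20.0) < 1e-9  # 100 ms frame period -> 20 ms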

Under the condition that the time difference value between the time parameter corresponding to the target image and the time parameter corresponding to each of the other images to be analyzed is smaller than or equal to the preset time threshold, it can be determined that the time parameter corresponding to each of the images to be analyzed meets the preset synchronization condition, that is, the images to be analyzed are frame-synchronized, and then the image group to be analyzed can be determined as a synchronized image group.

Therefore, in the embodiment of the present disclosure, when judging the frame synchronization of a plurality of images acquired by a plurality of image acquisition devices, one image to be analyzed may be taken from each image buffer queue to construct an image group to be analyzed, the image to be analyzed whose time parameter is the latest may be determined therefrom as the target image, and its time parameter may then be compared with the time parameter corresponding to each of the other images to be analyzed, so that a group of frame-synchronized images of the target object is determined when all the time parameter differences fall within the preset error range.
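A minimal sketch of this synchronization judgment, assuming each image buffer queue is non-empty and holds (timestamp in milliseconds, image) pairs ordered by timestamp (the names below are illustrative, not taken from the disclosure):

def check_frame_sync(buffer_queues, threshold_ms):
    """Judge the preset synchronization condition on the heads of the buffer queues.

    buffer_queues: one timestamp-ordered, non-empty queue per image acquisition
    device, each entry being a (timestamp_ms, image) pair.
    Returns (is_synced, images_to_analyze, stale_indices), where stale_indices
    lists the queues whose head image is too old relative to the target image.
    """
    # One image to be analyzed per queue: the front (earliest) candidate image.
    images_to_analyze = [queue[0] for queue in buffer_queues]
    timestamps = [ts for ts, _ in images_to_analyze]

    # Target image: the image to be analyzed with the latest time parameter.
    target_ts = max(timestamps)

    # Synchronized only if every time difference is within the preset threshold.
    stale_indices = [i for i, ts in enumerate(timestamps)
                     if target_ts - ts > threshold_ms]
    return len(stale_indices) == 0, images_to_analyze, stale_indices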

Based on the above embodiment, in yet another embodiment of the present disclosure, Fig. 3B is a schematic flow chart of an implementation of an image synchronization method proposed in the embodiment of the present disclosure, and as shown in Fig. 3B, after a time difference between a time parameter corresponding to a target image and a time parameter corresponding to any other image to be analyzed in an image group to be analyzed is calculated, that is, after S303, the method for performing image synchronization may further include the following steps:

S305, in response to the time difference value between the time parameter corresponding to the target image and the time parameter corresponding to a first image to be analyzed in the image group to be analyzed being larger than the preset time threshold, discarding the first image to be analyzed; the first image to be analyzed is any other image to be analyzed except the target image in the image group to be analyzed.

S306, continuing to select, from the first image buffer queue to which the first image to be analyzed belongs, a second image to be analyzed whose time parameter is ranked first.

S307, updating the image group to be analyzed based on the second image to be analyzed, and continuing to execute the determination processing of the target image and the judgment processing of the preset synchronization condition based on the updated image group to be analyzed.

When the target image is the image to be analyzed whose time parameter is ranked last among the plurality of images to be analyzed, the time difference between the time parameter corresponding to the target image and the time parameter corresponding to each of the other images to be analyzed is calculated; when the time difference between the time parameter corresponding to one or more images to be analyzed in the image group to be analyzed and the time parameter corresponding to the target image is greater than the preset time threshold, it is determined that the time parameters corresponding to those one or more images to be analyzed do not meet the preset synchronization condition.

In addition, the remaining images to be analyzed, whose time difference from the time parameter corresponding to the target image is less than or equal to the preset time threshold, may be determined to satisfy the preset synchronization condition.

Here, the image synchronization device may discard the one or more images to be analyzed that are determined not to satisfy the preset synchronization condition, and then continue to select, as a new image to be analyzed, the next candidate image from the image buffer queue to which each discarded image belongs; the next image to be analyzed is still the candidate image whose time parameter is ranked first in that image buffer queue.

In some embodiments, the image group to be analyzed may be updated based on the newly selected image to be analyzed, the images to be analyzed that satisfied the preset frame synchronization condition in the previous round, and the target image of the previous round; that is, a new image group to be analyzed is reconstructed. The images to be analyzed still consist of the candidate image with the earliest time parameter selected from the group of candidate images in each image buffer queue.

In the embodiment of the present disclosure, the determination processing of a new target image and the judgment processing of whether the preset synchronization condition is satisfied, which are executed based on the time parameters corresponding to the images to be analyzed, may be continued based on the updated image group to be analyzed. In other words, S302, S303 and S304, or S302, S303, S305, S306 and S307, are cyclically executed until the synchronized image group of the target object is determined.

Thus, in the embodiment of the present disclosure, the image to be analyzed whose time parameter is the latest in the image group to be analyzed may be determined as the target image, and its time parameter may be compared with the time parameter corresponding to each of the other images to be analyzed. When there is an image to be analyzed whose time parameter difference does not satisfy the preset error range, that image is discarded, a new image to be analyzed with the earliest time parameter is taken from the corresponding image buffer queue to update the image group to be analyzed, and the determination of the target image and the judgment of the preset synchronization condition are repeated to determine the frame-synchronized images.
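The discard-and-refill loop of S305-S307 can be sketched as follows, reusing check_frame_sync from the previous sketch and assuming each buffer queue is a collections.deque so that a discarded head can be removed with popleft():

def find_synchronized_group(buffer_queues, threshold_ms):
    """Repeatedly discard stale head images until one synchronized image group is found.

    buffer_queues: one collections.deque of (timestamp_ms, image) pairs per device.
    Returns the list of (timestamp_ms, image) pairs forming a synchronized image
    group, or None if some buffer queue runs out of candidate images first.
    """
    while all(len(queue) > 0 for queue in buffer_queues):
        is_synced, group, stale_indices = check_frame_sync(buffer_queues, threshold_ms)
        if is_synced:
            return group  # one synchronized image group for the target object
        # Discard every first image to be analyzed that violates the preset
        # synchronization condition; the next earliest candidate image in the
        # same queue becomes the second image to be analyzed.
        for i in stale_indices:
            buffer_queues[i].popleft()
    return None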

Based on the above embodiments, in still another embodiment of the present disclosure, a method of performing image synchronization may include the steps of:

S401, acquiring a preset frame frequency corresponding to each of a plurality of image acquisition devices, and determining a signal emission frequency based on the plurality of preset frame frequencies; wherein the signal emission frequency is less than or equal to any one of the preset frame frequencies.

S402, synchronously sending image acquisition trigger signals to a plurality of image acquisition devices according to the signal emission frequency.

S403, when a candidate image which is returned by any image acquisition device and acquired in response to the image acquisition trigger signal is received, recording the time parameter corresponding to the candidate image.

S404, caching the candidate images and the time parameters, in a corresponding relation and according to the order of the time parameters, to any image buffer queue so as to construct any group of candidate images; each image acquisition device corresponds to one image buffer queue.

S405, selecting, from the group of candidate images cached in each image buffer queue, the candidate image whose time parameter is ranked first as an image to be analyzed to obtain a plurality of images to be analyzed, and constructing an image group to be analyzed based on the plurality of images to be analyzed.

S406, determining the image to be analyzed whose time parameter is ranked last in the image group to be analyzed as a target image.

S407, calculating a time difference value between the time parameter corresponding to the target image and the time parameter corresponding to any other image to be analyzed in the image group to be analyzed.

S408, judging whether the time difference values are all smaller than or equal to a preset time threshold; if yes, executing S409; if not, jumping to S411.

S409, determining that the time parameters of the images to be analyzed all meet the preset synchronization condition, and determining the image group to be analyzed as a synchronized image group corresponding to the target object.

S410, preprocessing each image to be analyzed in the synchronous image group to obtain a processed synchronous image group meeting the preset model input condition, and inputting the processed synchronous image group into a target algorithm model for image analysis processing.

S411, discarding a first image to be analyzed whose time parameter differs from the time parameter corresponding to the target image by more than the preset time threshold; the first image to be analyzed is any other image to be analyzed except the target image in the image group to be analyzed.

S412, continuing to select, from the first image buffer queue to which the first image to be analyzed belongs, a second image to be analyzed whose time parameter is ranked first.

S413, updating the image group to be analyzed based on the second image to be analyzed, and cyclically executing S406 to S413.

As can be seen, based on the image synchronization method in S401 to S413, by acquiring a plurality of groups of candidate images acquired by a plurality of image acquisition devices for a target object and the time parameter corresponding to each candidate image, an image group to be analyzed, constructed from one candidate image in the group of candidate images acquired by each image acquisition device, can be judged for synchronization according to the time parameters until a synchronized image group whose time parameters all meet the preset synchronization condition is determined. Therefore, the frame synchronization judgment for the multiple cameras is performed based on timestamps, which ensures that the group of picture frames output by the cameras to the algorithm for detection, identification and information fusion is synchronized, thereby meeting the strict frame synchronization requirement.
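Putting S401-S413 together, a hedged end-to-end sketch might look as follows; the device interface (trigger()/read()), the preprocess() function and the target_model() callable are assumptions introduced only for illustration, and the helper functions are the ones sketched earlier:

import time
from collections import deque


def run_image_sync_pipeline(devices, preset_frame_rates_hz, threshold_ms,
                            preprocess, target_model, num_rounds=100):
    """Illustrative pipeline: trigger, buffer, synchronize, preprocess, analyze."""
    trigger_hz = select_trigger_frequency(preset_frame_rates_hz)      # S401
    trigger_period_s = 1.0 / trigger_hz
    queues = [deque() for _ in devices]                               # one buffer queue per device

    for _ in range(num_rounds):
        # S402: synchronously send the image acquisition trigger signal.
        for device in devices:
            device.trigger()
        time.sleep(trigger_period_s)

        # S403-S404: record the receiving time and buffer in timestamp order.
        for queue, device in zip(queues, devices):
            image = device.read()                                     # may return None
            if image is not None:
                queue.append((time.monotonic() * 1000.0, image))

        # S405-S409, S411-S413: search for a synchronized image group.
        group = find_synchronized_group(queues, threshold_ms)
        if group is not None:
            for queue in queues:                                      # consume the used heads
                queue.popleft()
            # S410: preprocess the group and feed it to the target algorithm model.
            target_model(preprocess([image for _, image in group]))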

Fig. 4 is a schematic structural component diagram of an image synchronization apparatus provided in an embodiment of the present disclosure, and as shown in Fig. 4, the image synchronization apparatus 400 includes:

an obtaining module 401 configured to obtain a plurality of groups of candidate images acquired by a plurality of image acquisition apparatuses for a target object and a time parameter corresponding to each of the candidate images; wherein each image acquisition apparatus corresponds to one group of candidate images in the plurality of groups of candidate images, and each group of candidate images comprises at least one candidate image;

a selecting module 402 configured to select one of the candidate images from each group of candidate images as an image to be analyzed;

a construction module 403 configured to construct a group of images to be analyzed based on a plurality of the images to be analyzed;

a determining module 404, configured to determine, in response to that the time parameters of the images to be analyzed all satisfy a preset synchronization condition, the image group to be analyzed as a synchronization image group corresponding to the target object.

In some embodiments, the image synchronization apparatus 400 further comprises: a sending module 405 configured to synchronously send an image capturing trigger signal to the plurality of image capturing devices.

In some embodiments, the recording module 406 is configured to record the time parameter corresponding to the candidate image when receiving the candidate image acquired by any one of the image acquisition apparatuses in response to the image acquisition trigger signal.

In some embodiments, the caching module 407 is configured to cache the candidate images and the time parameters to any image caching queue in a corresponding relationship according to the sequence of the time parameter sorting so as to construct any group of candidate images; each image acquisition device corresponds to one image buffer queue.

In some embodiments, the sending module 405 is specifically configured to obtain a preset frame frequency corresponding to each of the image capturing devices, and determine a signal transmitting frequency based on a plurality of preset frame frequencies; wherein the signal emission frequency is less than or equal to any one of the preset frame frequencies; and synchronously sending the image acquisition trigger signals to the plurality of image acquisition devices according to the signal emission frequency.

In some embodiments, the selecting module 402 is configured to select, from a group of candidate images buffered in each of the image buffer queues, a candidate image with the top time parameter ranking as an image to be analyzed, so as to obtain the multiple images to be analyzed.

In some embodiments, the determining module 404 is further configured to, before the image group to be analyzed is determined as one synchronized image group corresponding to the target object in response to the time parameters of the images to be analyzed all satisfying the preset synchronization condition, determine, as the target image, the image to be analyzed whose time parameter is ranked last in the image group to be analyzed.

In some embodiments, the calculating module 408 is configured to calculate a time difference between the time parameter corresponding to the target image and the time parameter corresponding to any other image to be analyzed in the image group to be analyzed.

In some embodiments, the determining module 404 is further configured to determine that the time parameters of the images to be analyzed all satisfy a preset synchronization condition in response to that the time difference values are all smaller than or equal to a preset time threshold.

In some embodiments, the discarding module 409 is configured to, after calculating a time difference between the time parameter corresponding to the target image and a time parameter corresponding to any other image to be analyzed in the image group to be analyzed, discard the first image to be analyzed in response to that the time difference between the time parameter corresponding to the target image and the time parameter corresponding to the first image to be analyzed in the image group to be analyzed is greater than the preset time threshold; the first image to be analyzed is any other image to be analyzed except the target image in the image group to be analyzed.

In some embodiments, the selecting module 402 is further configured to continue to select, from the first image buffer queue to which the first image to be analyzed belongs, a second image to be analyzed whose time parameter is ranked first.

In some embodiments, the updating module 410 is configured to update the image group to be analyzed based on the second image to be analyzed, and continue to execute the determination process of the target image and the determination process of the preset synchronization condition based on the updated image group to be analyzed.

In some embodiments, the preprocessing module 411 is configured to preprocess each image to be analyzed in the synchronous image group, so as to obtain a processed synchronous image group that meets a preset model input condition.

In some embodiments, the input module 412 is configured to input the processed synchronized image group into a target algorithm model for image analysis processing.

In some embodiments, the constructing module 403 is further configured to continue to construct a next image group to be analyzed based on the image whose time parameter is ranked first in each of the image buffer queues, and to perform the determination processing of a next synchronized image group corresponding to the target object.

In some embodiments, the plurality of image acquisition devices are disposed at a plurality of angular orientations relative to the target object; one image acquisition device is arranged in each angular direction.
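The module decomposition of Fig. 4 can be mirrored, purely as an illustrative skeleton with simplified bodies (not the apparatus claimed by the disclosure), as:

from collections import deque


class ImageSynchronizationApparatus:
    """Illustrative skeleton mapping the modules of Fig. 4 to methods."""

    def __init__(self, num_devices, threshold_ms):
        # caching module 407: one buffer queue of (timestamp_ms, image) per device
        self.queues = [deque() for _ in range(num_devices)]
        self.threshold_ms = threshold_ms

    def record(self, device_index, timestamp_ms, image):
        # recording module 406 + caching module 407
        self.queues[device_index].append((timestamp_ms, image))

    def select_and_construct(self):
        # selecting module 402 + construction module 403: head of every queue
        return [queue[0] for queue in self.queues if queue]

    def determine(self, group):
        # calculating module 408 + determining module 404
        target_ts = max(ts for ts, _ in group)
        return all(target_ts - ts <= self.threshold_ms for ts, _ in group)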

It should be noted that the above description of the embodiment of the apparatus, similar to the above description of the embodiment of the method, has similar beneficial effects as the embodiment of the method. For technical details not disclosed in the embodiments of the apparatus of the present disclosure, reference is made to the description of the embodiments of the method of the present disclosure.

It should be noted that, in the embodiment of the present disclosure, if the image synchronization method is implemented in the form of a software functional module and is sold or used as a standalone product, the image synchronization method may also be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present disclosure may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a terminal, a server, etc.) to execute all or part of the methods described in the embodiments of the present disclosure. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a hard disk drive, a Read Only Memory (ROM), a magnetic disk, or an optical disk. Thus, embodiments of the present disclosure are not limited to any specific combination of hardware and software.

Correspondingly, the embodiment of the present disclosure further provides a computer program product, where the computer program product includes computer-executable instructions, and after the computer-executable instructions are executed, the steps in the image synchronization method provided by the embodiment of the present disclosure can be implemented.

The embodiment of the present disclosure further provides a computer storage medium, where computer-executable instructions are stored on the computer storage medium, and when executed by a processor, the computer-executable instructions implement the steps of the image synchronization method provided in the foregoing embodiment.

Fig. 5 is a schematic diagram of a composition structure of a computer device provided in the embodiment of the present disclosure. As shown in Fig. 5, a computer device 500 provided in the embodiment of the present disclosure may include a processor 501 and a memory 502 storing instructions executable by the processor 501; further, the computer device 500 may further include a communication interface 503, and a bus 504 for connecting the processor 501, the memory 502 and the communication interface 503.

In the embodiment of the present disclosure, the Processor 501 may be at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a Central Processing Unit (CPU), a controller, a microcontroller, and a microprocessor. It is understood that the electronic devices for implementing the above-described processor functions may be other devices, and the embodiments of the present disclosure are not particularly limited. The computer device 500 may further comprise a memory 502, which memory 502 may be connected to the processor 501, wherein the memory 502 is adapted to store executable program code comprising computer operating instructions, and wherein the memory 502 may comprise a high speed RAM memory and may further comprise a non-volatile memory, such as at least two disk memories.

In an embodiment of the present disclosure, a bus 504 is used to connect the communication interface 503, the processor 501 and the memory 502, and to enable intercommunication among these devices.

In an embodiment of the present disclosure, memory 502 is used to store instructions and data.

Further, in an embodiment of the present disclosure, the processor 501 is configured to execute the image synchronization method, where the method includes:

acquiring a plurality of groups of candidate images acquired by a plurality of image acquisition devices aiming at a target object and a time parameter corresponding to each candidate image; wherein each image acquisition device corresponds to one group of candidate images in the plurality of groups of candidate images, and each group of candidate images comprises at least one candidate image;

respectively selecting one candidate image from each group of candidate images as an image to be analyzed, and constructing an image group to be analyzed based on a plurality of images to be analyzed;

and determining the image group to be analyzed as a synchronized image group corresponding to the target object in response to the time parameters of the images to be analyzed all meeting the preset synchronization condition.

In practical applications, the Memory 502 may be a volatile Memory (volatile Memory), such as a Random-Access Memory (RAM); or a non-volatile Memory (non-volatile Memory), such as a Read-Only Memory (ROM), a flash Memory (flash Memory), a Hard Disk (Hard Disk Drive, HDD) or a Solid-State Drive (SSD); or a combination of the above types of memories and provides instructions and data to the processor 501.

In addition, each functional module in this embodiment may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware or a form of a software functional module.

Based on such understanding, the technical solution of the present embodiment, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, which includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor to execute all or part of the steps of the method of the present embodiment. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.

The embodiment of the present disclosure provides an image synchronization device which, by acquiring a plurality of groups of candidate images acquired by a plurality of image acquisition devices for a target object and the time parameter corresponding to each candidate image, can perform synchronization judgment, according to the time parameters, on an image group to be analyzed constructed from one candidate image in the group of candidate images acquired by each image acquisition device, until a synchronized image group whose time parameters all meet the preset synchronization condition is determined. Therefore, the frame synchronization judgment for the multiple cameras is performed based on timestamps, which ensures that the group of picture frames output by the cameras to the algorithm for detection, identification and information fusion is synchronized, thereby meeting the strict frame synchronization requirement.

The disclosed embodiments provide a computer-readable storage medium on which a program is stored, which when executed by a processor implements the image synchronization method as described above.

Specifically, the program instructions corresponding to an image synchronization method in the present embodiment may be stored on a storage medium such as an optical disc, a hard disc, a usb disk, or the like, and when the program instructions corresponding to an image synchronization method in the storage medium are read or executed by an electronic device, the image synchronization method is implemented.

Accordingly, the disclosed embodiments further provide a computer program product, which includes computer-executable instructions for implementing the steps in the image synchronization method proposed by the disclosed embodiments.

As will be appreciated by one skilled in the art, embodiments of the present disclosure may be provided as a method, system, or computer program product. Accordingly, the present disclosure may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present disclosure may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.

The present disclosure is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.

These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart block or blocks.

These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process, such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.

The above description is only for the preferred embodiment of the present disclosure, and is not intended to limit the scope of the present disclosure.
