Video frame insertion method, terminal and storage medium

Document No.: 1849910    Publication date: 2021-11-16

Note: This technology, "Video frame insertion method, terminal and storage medium", was designed and created by Xu Peng on 2020-05-12. Its main content is as follows: The invention discloses a video frame interpolation method, a terminal and a storage medium. The video frame interpolation method calculates a forward optical flow and a backward optical flow of a target image pair, obtains the occluded regions in the images from the forward optical flow and the backward optical flow, corrects the forward optical flow and the backward optical flow according to those occluded regions, and uses the corrected forward and backward optical flows to generate a target frame to be inserted into the target image pair. Because a separate optical-flow correction is performed for the occluded regions during generation of the target frame, the frame interpolation effect in the presence of occlusion can be improved.

1. A video frame interpolation method, comprising:

acquiring a forward optical flow corresponding to a target image pair, and acquiring a backward optical flow corresponding to the target image pair according to the forward optical flow, wherein the forward optical flow is from a first image to a second image in the target image pair, and the backward optical flow is from the second image to the first image;

acquiring a first occlusion area in the first image according to the forward optical flow, and acquiring a second occlusion area in the second image according to the backward optical flow;

correcting a forward optical flow value corresponding to the first occlusion area to generate a corrected forward optical flow, and correcting a backward optical flow value corresponding to the second occlusion area to generate a corrected backward optical flow;

generating a target frame for insertion between the first image and the second image from the corrected forward optical flow and the corrected backward optical flow.

2. The method of claim 1, wherein said obtaining a backward optical flow corresponding to the target image pair according to the forward optical flow comprises:

acquiring each first forward optical flow value corresponding to each pixel point in the first image;

rounding up each first forward optical flow value to obtain each integer intermediate optical flow value corresponding to each pixel point in the first image;

and obtaining backward optical flow values of a plurality of pixel points in the second image according to the coordinates of the pixel points in the first image, the first forward optical flow values and the integer intermediate optical flow values.

3. The method of claim 2, wherein said obtaining backward optical flow values for a plurality of pixels in said second image based on coordinates of said pixels in said first image, said respective first forward optical flow values, and said respective integer intermediate optical flow values comprises:

according to the formula F1→0(x + ⌈F0→1(x)⌉) = -F0→1(x), obtaining backward optical flow values of a plurality of pixel points in the second image;

wherein F1→0(x + ⌈F0→1(x)⌉) is the backward optical flow value of the pixel point whose coordinate in the second image is x + ⌈F0→1(x)⌉, x is the coordinate of a pixel point in the first image, F0→1(x) is the first forward optical flow value of the pixel point with coordinate x in the first image, and ⌈F0→1(x)⌉ is the integer intermediate optical flow value corresponding to the pixel point with coordinate x in the first image (i.e., F0→1(x) rounded up).

4. The method according to claim 1, wherein said obtaining a first occlusion region of said first image according to said forward optical flow comprises:

extracting at least one pixel point set from each pixel point in the first image, wherein the pixel point set comprises at least two pixel points, and the pixel point set comprises each pixel point of which the forward optical flow value points to the same point in the second image;

and acquiring each pixel value corresponding to each pixel point in the pixel point set and a second pixel value of a point, in the second image, of a forward optical flow value of each pixel point in the pixel point set, and marking the pixel points except the pixel point with the pixel value closest to the second pixel value in the pixel point set as a first shielding area.

5. The method according to claim 1, wherein said obtaining a second occlusion region of said second image according to said backward optical flow comprises:

acquiring a third pixel point corresponding to each backward optical flow value;

and marking the pixel points except the third pixel point in the second image as a second shielding area.

6. The method of claim 1, wherein the modifying the forward optical flow value corresponding to the first occlusion region comprises:

acquiring a first non-shielding area with at least one preset size, wherein the distance between the first non-shielding area and a first target pixel point in the first shielding area is smaller than a first preset value;

acquiring pixel values corresponding to the first non-shielding areas respectively, and correcting the forward optical flow value of the first target pixel point into a target forward optical flow value, wherein the target forward optical flow value is the forward optical flow value corresponding to the first non-shielding area closest to the pixel value of the first target pixel point;

the correcting the backward optical flow value corresponding to the second shielding area comprises:

acquiring a second non-shielding area with at least one preset size, wherein the distance between the second non-shielding area and a second target pixel point in the second shielding area is smaller than a second preset value;

and obtaining pixel values corresponding to the second non-shielding areas respectively, and correcting the backward optical flow value of the second target pixel point into a target backward optical flow value, wherein the target backward optical flow value is the backward optical flow value corresponding to the second non-shielding area closest to the pixel value of the second target pixel point.

7. The method of claim 1, wherein said generating a target frame for insertion between said first image and said second image from said modified forward optical flow and said modified backward optical flow comprises:

generating a first intermediate frame from the modified forward optical flow and the first image, and generating a second intermediate frame from the modified backward optical flow and the second image;

and generating the target frame according to the first intermediate frame and the second intermediate frame.

8. The method of claim 7, wherein said generating a first intermediate frame from said modified forward optical flow and said first image comprises:

acquiring a first intermediate optical flow from the first image to the target frame according to the modified forward optical flow;

acquiring pixel values corresponding to a plurality of pixel points on a first intermediate frame according to a first preset formula and the first intermediate optical flow to generate the first intermediate frame;

said generating a second intermediate frame from said modified backward optical flow and said second image comprises:

acquiring a second intermediate optical flow from the second image to the target frame according to the modified backward optical flow;

and acquiring pixel values corresponding to a plurality of pixel points on a second intermediate frame according to a second preset formula and the second intermediate optical flow to generate the second intermediate frame.

9. The method of claim 8, wherein the first preset formula is: I0(m+Δm) = It1(m+Δm+F0→t(m))

wherein I0(m+Δm) is the pixel value of the point with coordinates m+Δm on the first image, It1(m+Δm+F0→t(m)) is the pixel value of the pixel point with coordinate m+Δm+F0→t(m) on the first intermediate frame, F0→t(m) is the first intermediate optical flow value corresponding to the pixel point with coordinate m on the first image, m is an integer, and Δm is a value such that Δm+F0→t(m) is an integer, with 0 ≤ Δm < 1;

the second preset formula is: I1(n+Δn) = It2(n+Δn+F1→t(n))

wherein I1(n+Δn) is the pixel value of the point with coordinates n+Δn on the second image, It2(n+Δn+F1→t(n)) is the pixel value of the pixel point with coordinate n+Δn+F1→t(n) on the second intermediate frame, F1→t(n) is the second intermediate optical flow value corresponding to the pixel point with coordinate n on the second image, n is an integer, and Δn is a value such that Δn+F1→t(n) is an integer, with 0 ≤ Δn < 1.

10. The method of claim 8, wherein said obtaining pixel values corresponding to each pixel point on the first intermediate frame according to the first predetermined formula and the first intermediate optical flow comprises:

when the first intermediate optical flow values corresponding to a plurality of first pixel points in the first image point to the same pixel point on the first intermediate frame, acquiring a third pixel point, among the plurality of first pixel points, which is not in the first occlusion area, and taking the pixel value of the pixel point corresponding to the third pixel point on the first intermediate frame as the pixel value of the pixel point corresponding to the plurality of first pixel points on the first intermediate frame;

the obtaining of the pixel value corresponding to each pixel point on the second intermediate frame according to the second preset formula and the second intermediate optical flow includes:

when the second intermediate optical flow values corresponding to a plurality of second pixel points in the second image point to the same pixel point on the second intermediate frame, acquiring a fourth pixel point, among the plurality of second pixel points, which is not in the second occlusion area, and taking the pixel value of the pixel point corresponding to the fourth pixel point on the second intermediate frame as the pixel value of the pixel point corresponding to the plurality of second pixel points on the second intermediate frame.

11. A terminal, characterized in that the terminal comprises: a processor, and a storage medium communicatively connected to the processor, wherein the storage medium is adapted to store a plurality of instructions, and the processor is adapted to invoke the instructions in the storage medium to perform the steps of the video frame interpolation method of any one of claims 1-10 above.

12. A storage medium storing one or more programs, the one or more programs being executable by one or more processors to perform the steps of the video frame interpolation method of any one of claims 1-10.

Technical Field

The present invention relates to the field of computer vision technologies, and in particular, to a video frame insertion method, a terminal, and a storage medium.

Background

Video frame interpolation inserts additional frames between the existing frames of a video, which improves the smoothness of the video and the viewing experience. Optical-flow-based video frame interpolation methods in the prior art generally work well, but they perform poorly when an object in the frame images is occluded, and especially when the object moves by a large amount between frames.

Thus, there is a need for improvements and enhancements in the art.

Disclosure of Invention

In view of the above-mentioned defects in the prior art, the present invention provides a video frame interpolation method, a terminal and a storage medium, and aims to solve the problem of poor frame interpolation effect of the video frame interpolation method based on optical flow in the prior art.

In order to solve the technical problems, the technical scheme adopted by the invention is as follows:

a video frame interpolation method, wherein the video frame interpolation method comprises:

acquiring a forward optical flow corresponding to a target image pair, and acquiring a backward optical flow corresponding to the target image pair according to the forward optical flow, wherein the forward optical flow is from a first image to a second image in the target image pair, and the backward optical flow is from the second image to the first image;

acquiring a first occlusion area in the first image according to the forward optical flow, and acquiring a second occlusion area in the second image according to the backward optical flow;

correcting a forward optical flow value corresponding to the first occlusion area to generate a corrected forward optical flow, and correcting a backward optical flow value corresponding to the second occlusion area to generate a corrected backward optical flow;

generating a target frame for insertion between the first image and the second image from the corrected forward optical flow and the corrected backward optical flow.

The video frame interpolation method, wherein the obtaining of the backward optical flow corresponding to the target image pair according to the forward optical flow comprises:

acquiring each first forward optical flow value corresponding to each pixel point in the first image;

rounding up each first forward optical flow value to obtain each integer intermediate optical flow value corresponding to each pixel point in the first image;

and obtaining backward optical flow values of a plurality of pixel points in the second image according to the coordinates of the pixel points in the first image, the first forward optical flow values and the integer intermediate optical flow values.

The video frame interpolation method, wherein the obtaining backward optical flow values of a plurality of pixel points in the second image according to the coordinates of the pixel points in the first image, the first forward optical flow values and the integer intermediate optical flow values comprises:

according to the formula F1→0(x + ⌈F0→1(x)⌉) = -F0→1(x), obtaining backward optical flow values of a plurality of pixel points in the second image;

wherein F1→0(x + ⌈F0→1(x)⌉) is the backward optical flow value of the pixel point whose coordinate in the second image is x + ⌈F0→1(x)⌉, x is the coordinate of a pixel point in the first image, F0→1(x) is the first forward optical flow value of the pixel point with coordinate x in the first image, and ⌈F0→1(x)⌉ is the integer intermediate optical flow value corresponding to the pixel point with coordinate x in the first image (i.e., F0→1(x) rounded up).

The video frame interpolation method, wherein the obtaining a first occlusion region of the first image according to the forward optical flow comprises:

extracting at least one pixel point set from each pixel point in the first image, wherein the pixel point set comprises at least two pixel points, and the pixel point set comprises each pixel point of which the forward optical flow value points to the same point in the second image;

and acquiring each pixel value corresponding to each pixel point in the pixel point set and a second pixel value of a point, in the second image, of a forward optical flow value of each pixel point in the pixel point set, and marking the pixel points except the pixel point with the pixel value closest to the second pixel value in the pixel point set as a first shielding area.

The video frame interpolation method, wherein the obtaining a second occlusion region of the second image according to the backward optical flow comprises:

acquiring a third pixel point corresponding to each backward optical flow value;

and marking the pixel points except the third pixel point in the second image as a second shielding area.

The video frame interpolation method, wherein the modifying the forward optical flow value corresponding to the first occlusion region includes:

acquiring a first non-shielding area with at least one preset size, wherein the distance between the first non-shielding area and a first target pixel point in the first shielding area is smaller than a first preset value;

acquiring pixel values corresponding to the first non-shielding areas respectively, and correcting the forward optical flow value of the first target pixel point into a target forward optical flow value, wherein the target forward optical flow value is the forward optical flow value corresponding to the first non-shielding area closest to the pixel value of the first target pixel point;

the correcting the backward light flow value corresponding to each pixel point in the second shielding area comprises:

acquiring a second non-shielding area with at least one preset size, wherein the distance between the second non-shielding area and a second target pixel point in the second shielding area is smaller than a second preset value;

and obtaining pixel values corresponding to the second non-shielding areas respectively, and correcting the backward optical flow value of the second target pixel point into a target backward optical flow value, wherein the target backward optical flow value is the backward optical flow value corresponding to the second non-shielding area closest to the pixel value of the second target pixel point.

The video frame interpolation method, wherein the generating a target frame for interpolation between the first image and the second image according to the modified forward optical flow and the modified backward optical flow comprises:

generating a first intermediate frame from the modified forward optical flow and the first image, and generating a second intermediate frame from the modified backward optical flow and the second image;

and generating the target frame according to the first intermediate frame and the second intermediate frame.

The video frame interpolation method, wherein the generating a first intermediate frame according to the modified forward optical flow and the first image comprises:

acquiring a first intermediate optical flow from the first image to the target frame according to the modified forward optical flow;

acquiring pixel values corresponding to a plurality of pixel points on a first intermediate frame according to a first preset formula and the first intermediate optical flow to generate the first intermediate frame;

said generating a second intermediate frame from said modified backward optical flow and said second image comprises:

acquiring a second intermediate optical flow from the second image to the target frame according to the modified backward optical flow;

and acquiring pixel values corresponding to a plurality of pixel points on a second intermediate frame according to a second preset formula and the second intermediate optical flow to generate the second intermediate frame.

The video frame interpolation method, wherein the first preset formula is: I0(m+Δm) = It1(m+Δm+F0→t(m))

wherein I0(m+Δm) is the pixel value of the point with coordinates m+Δm on the first image, It1(m+Δm+F0→t(m)) is the pixel value of the pixel point with coordinate m+Δm+F0→t(m) on the first intermediate frame, F0→t(m) is the first intermediate optical flow value corresponding to the pixel point with coordinate m on the first image, m is an integer, and Δm is a value such that Δm+F0→t(m) is an integer, with 0 ≤ Δm < 1;

the second preset formula is: I1(n+Δn) = It2(n+Δn+F1→t(n))

wherein I1(n+Δn) is the pixel value of the point with coordinates n+Δn on the second image, It2(n+Δn+F1→t(n)) is the pixel value of the pixel point with coordinate n+Δn+F1→t(n) on the second intermediate frame, F1→t(n) is the second intermediate optical flow value corresponding to the pixel point with coordinate n on the second image, n is an integer, and Δn is a value such that Δn+F1→t(n) is an integer, with 0 ≤ Δn < 1.

The video frame interpolation method, wherein the obtaining of the pixel value corresponding to each pixel point on the first intermediate frame according to the first preset formula and the first intermediate optical flow includes:

when the first intermediate optical flow values corresponding to a plurality of first pixel points in the first image point to the same pixel point on the first intermediate frame, acquiring a third pixel point, among the plurality of first pixel points, which is not in the first occlusion area, and taking the pixel value of the pixel point corresponding to the third pixel point on the first intermediate frame as the pixel value of the pixel point corresponding to the plurality of first pixel points on the first intermediate frame;

the obtaining of the pixel value corresponding to each pixel point on the second intermediate frame according to the second preset formula and the second intermediate optical flow includes:

when the second intermediate optical flow values corresponding to a plurality of second pixel points in the second image point to the same pixel point on the second intermediate frame, acquiring a fourth pixel point, among the plurality of second pixel points, which is not in the second occlusion area, and taking the pixel value of the pixel point corresponding to the fourth pixel point on the second intermediate frame as the pixel value of the pixel point corresponding to the plurality of second pixel points on the second intermediate frame.

A terminal, wherein the terminal comprises: a processor and a storage medium communicatively connected to the processor, wherein the storage medium is adapted to store a plurality of instructions, and the processor is adapted to invoke the instructions in the storage medium to perform the steps of any one of the video frame interpolation methods described above.

A storage medium having one or more programs stored thereon, wherein the one or more programs are executable by one or more processors to implement the steps of the video frame interpolation method described in any of the above.

Advantageous effects: Compared with the prior art, the present invention provides a video frame interpolation method, a terminal and a storage medium. The video frame interpolation method calculates a forward optical flow and a backward optical flow for a target image pair into which a target frame is to be interpolated, obtains the occluded regions in the images from the forward optical flow and the backward optical flow, corrects the forward optical flow and the backward optical flow according to those occluded regions, and generates the target frame using the corrected forward and backward optical flows. Because a separate optical-flow correction is performed for the occluded regions while the target frame is generated, the frame interpolation effect in the presence of occlusion is improved.

Drawings

FIG. 1 is a flowchart of an embodiment of a video frame interpolation method according to the present invention;

FIG. 2 is a schematic diagram of one implementation of generating a backward optical flow from a forward optical flow;

FIG. 3 is a schematic diagram of another implementation of generating a backward optical flow from a forward optical flow;

FIG. 4 is a schematic diagram of an occlusion situation;

FIG. 5 is a comparison diagram of the frame interpolation effect of the video frame interpolation method provided by the present invention and the prior art;

fig. 6 is a schematic structural diagram of an embodiment of a terminal provided in the present invention.

Detailed Description

In order to make the objects, technical solutions and effects of the present invention clearer and more explicit, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit it.

The video frame insertion method provided by the invention can be applied to a terminal, which can be, but is not limited to, a personal computer, a notebook computer, a mobile phone, a tablet computer, a vehicle-mounted computer or a portable wearable device. When the terminal acquires a target image pair into which a target frame is to be inserted, the target frame can be generated by the video frame insertion method provided by the invention.

Example one

Referring to fig. 1, fig. 1 is a flowchart illustrating a video frame interpolation method according to a first embodiment of the present invention.

The video frame interpolation method comprises the following steps:

s100, acquiring a forward optical flow corresponding to a target image pair, and acquiring a backward optical flow corresponding to the target image pair according to the forward optical flow.

In the present embodiment, after a target image pair into which a target frame is to be inserted is acquired, the forward optical flow of the target image pair is acquired first. Specifically, the target image pair includes two images: a first image and a second image. The forward optical flow is the optical flow from the first image to the second image, and the backward optical flow is the optical flow from the second image to the first image; that is, the forward optical flow reflects the displacement of each point of the first image over the time period from the first image to the second image, and the backward optical flow reflects the displacement of each point of the second image over the time period from the second image to the first image. For example, if the forward optical flow value corresponding to the pixel point with coordinate x in the first image is F0→1(x), then after moving over the time from the first image to the second image this point has the coordinate x + F0→1(x) on the second image. It is easy to see that the point whose coordinate on the second image is x + F0→1(x) has the coordinate x on the first image after moving over the time from the second image to the first image, i.e., the backward optical flow value of the point with coordinate x + F0→1(x) in the second image is F1→0(x + F0→1(x)) = -F0→1(x).

In practical applications, when the backward optical flow is obtained, the optical flow values at integer coordinate positions need to be known. The forward optical flow contains the forward optical flow values at the integer coordinate positions of the first image, but the forward optical flow values themselves are in general floating-point numbers; that is, x is an integer, but x + F0→1(x) is usually not an integer, so the backward optical flow cannot be obtained directly from the formula F1→0(x + F0→1(x)) = -F0→1(x).

In one possible implementation, as shown in fig. 2 (where I0 is the first image and I1 is the second image), the backward optical flow value of the pixel point with coordinate x in the second image is directly set to -F0→1(x), i.e., F1→0(x) = -F0→1(x). Combining this with the formula F1→0(x + F0→1(x)) = -F0→1(x) gives F1→0(x + F0→1(x)) = -F0→1(x) = F1→0(x); that is, obtaining the backward optical flow value in this way implicitly assumes the default condition x + F0→1(x) ≈ x. When the relative motion of an object between the first image and the second image is small, F0→1(x) is also small and the error introduced by this approach is not large; but when the motion is large, the error increases and the frame interpolation result shows obvious artifacts.

In another possible implementation manner, the obtaining, according to the forward optical flow, a backward optical flow corresponding to the target image pair includes:

s110, obtaining each first forward optical flow value corresponding to each pixel point in the first image.

As already described above, the forward optical flow includes a forward optical flow value of a point of which coordinates are integers in the first image, and in the art, an image is composed of pixel points, and the coordinates of the pixel points in the image are all integers, so that after the forward optical flow is obtained, each first forward optical flow value corresponding to each pixel point in the first image can be obtained.

S120, rounding up each first forward optical flow value to obtain each integer intermediate optical flow value corresponding to each pixel point in the first image.

As explained above, each forward optical flow value in the forward optical flow is a floating-point number. After the first forward optical flow values are obtained, each first forward optical flow value is rounded up; that is, for each forward optical flow value, the smallest integer that is not less than that value is taken as its integer intermediate optical flow value.

S130, obtaining backward optical flow values of a plurality of pixel points in the second image according to the coordinates of the pixel points in the first image, the first forward optical flow values and the integer intermediate optical flow values.

Specifically, as shown in FIG. 3, in S130 the backward optical flow values of a plurality of pixel points in the second image are obtained according to the formula F1→0(x + ⌈F0→1(x)⌉) = -F0→1(x), where F1→0(x + ⌈F0→1(x)⌉) is the backward optical flow value of the pixel point whose coordinate in the second image is x + ⌈F0→1(x)⌉, x is the coordinate of a pixel point in the first image, F0→1(x) is the first forward optical flow value of the pixel point with coordinate x in the first image, and ⌈F0→1(x)⌉ is the integer intermediate optical flow value corresponding to the pixel point with coordinate x in the first image (i.e., F0→1(x) rounded up). It is easy to see that, since the coordinate x of every pixel point in the first image is an integer, x + ⌈F0→1(x)⌉ is also an integer, so the backward optical flow values of a plurality of integer coordinate points (i.e., a plurality of pixel points) in the second image can be obtained. Obtaining the backward optical flow in this way implies the default condition that the backward optical flow value at the integer coordinate x + ⌈F0→1(x)⌉ in the second image equals the backward optical flow value at the floating-point coordinate x + F0→1(x), which lies within one pixel of it. The error of the backward optical flow value is therefore controlled within one pixel, which improves the frame interpolation effect.
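As an illustration of steps S110 to S130, the following sketch derives a backward optical flow from a forward optical flow by forward-mapping every pixel with its ceiling-rounded (integer intermediate) flow value. It is a minimal example in Python/NumPy; the (H, W, 2) array layout with (dx, dy) components per pixel, the per-component ceiling, and the function name are assumptions made for the sketch, not part of the patent.

import numpy as np

def backward_flow_from_forward(forward_flow):
    # forward_flow: (H, W, 2) float array of (dx, dy) per pixel of the first image (assumed layout).
    h, w, _ = forward_flow.shape
    backward_flow = np.zeros_like(forward_flow)
    filled = np.zeros((h, w), dtype=bool)               # True where a backward flow value was assigned

    ys, xs = np.mgrid[0:h, 0:w]                         # integer pixel coordinates of the first image
    int_flow = np.ceil(forward_flow).astype(np.int64)   # integer intermediate optical flow values

    tx = xs + int_flow[..., 0]                          # integer target column in the second image
    ty = ys + int_flow[..., 1]                          # integer target row in the second image
    inside = (tx >= 0) & (tx < w) & (ty >= 0) & (ty < h)

    # F1->0(x + ceil(F0->1(x))) = -F0->1(x); if several sources hit the same
    # target pixel, the last write simply wins in this sketch.
    backward_flow[ty[inside], tx[inside]] = -forward_flow[ys[inside], xs[inside]]
    filled[ty[inside], tx[inside]] = True
    return backward_flow, filled

Pixels of the second image for which filled stays False receive no backward flow value; as described below in S221 to S222, they are exactly the candidates for the second occlusion area.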

Referring to fig. 1, after the forward optical flow and the backward optical flow are obtained, the video frame interpolation method further includes:

s200, acquiring a first shielding area in the first image according to the forward optical flow, and acquiring a second shielding area in the second image according to the backward optical flow.

In this embodiment, the area that is visible in the first image but occluded in the second image is referred to as the first occlusion area of the first image, and the area that is visible in the second image but occluded in the first image is referred to as the second occlusion area of the second image. Accordingly, in this embodiment the first occlusion area and the second occlusion area are further obtained, and the forward optical flow and the backward optical flow are corrected according to the first occlusion area and the second occlusion area, respectively.

Specifically, the acquiring a first occlusion region in the first image according to the forward optical flow comprises:

s211, extracting at least one pixel point set from each pixel point in the first image, wherein the pixel point set comprises at least two pixel points, and the forward optical flow value in the pixel point set points to each pixel point in the same point in the second image.

If the forward optical flow values of two or more pixel points in the first image point to the same point in the second image, then after the time interval some of those pixel points are occluded in the second image. As shown in FIG. 4, the forward optical flow values of the pixel points with coordinates x and y in image I0 both point to the point z in image I1, i.e., x + F0→1(x) = y + F0→1(y); then one of the pixel points x and y is occluded in image I1.

In this embodiment, the pixel point sets of pixel points whose forward optical flow values point to the same point in the second image are therefore extracted; that is, within each pixel point set, the forward optical flow values of all pixel points point to the same point in the second image.

S212, obtaining each pixel value corresponding to each pixel point in the pixel point set and a second pixel value of a point, in the second image, of a forward optical flow value of each pixel point in the pixel point set, and marking the pixel points except the pixel point, in the pixel point set, of which the pixel value is closest to the second pixel value, as a first shielding area.

After the pixel points that are visible in the first image but possibly occluded in the second image are obtained, which of them are actually occluded is determined from their pixel values and the pixel value of the point in the second image pointed to by their forward optical flow values. Specifically, the pixel point in the pixel point set whose pixel value is closest to the second pixel value is determined not to be occluded, i.e., the forward optical flow value of that pixel point is considered real, and the other pixel points in the pixel point set are marked as the first occlusion area. The coordinates of the point in the second image pointed to by the forward optical flow values of the pixel points in the set may not be integers, i.e., that point may not be a pixel point; in that case its pixel value may be obtained by bilinear interpolation and used as the second pixel value, or the pixel value of the pixel point closest to that point may be used as the second pixel value. There may be a plurality of pixel point sets, and when all pixel points marked as belonging to the first occlusion area have been obtained from these sets, the first occlusion area of the first image is obtained. For example, in FIG. 4 the forward optical flow values of the pixel points with coordinates x and y in image I0 both point to the point z in image I1; the pixel values I0(x) and I0(y) of the pixel points with coordinates x and y and the pixel value I1(z) of the point z in image I1 are then obtained, and if I1(z) is closer to I0(x), it is determined that the pixel point with coordinate y in I0 is occluded in image I1, and that pixel point is marked as part of the first occlusion area.
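To make S211 and S212 concrete, the sketch below builds the first occlusion mask by grouping the pixels of the first image whose forward flow lands on the same pixel of the second image and keeping, in each group, only the pixel whose value is closest to the value observed in the second image. Grayscale float images, nearest-pixel sampling of the second image (the patent also allows bilinear interpolation), and grouping by the rounded target coordinate are simplifying assumptions of this sketch.

import numpy as np
from collections import defaultdict

def first_occlusion_mask(i0, i1, forward_flow):
    # i0, i1: (H, W) grayscale float images; forward_flow: (H, W, 2) of (dx, dy).
    h, w = i0.shape
    groups = defaultdict(list)                     # target pixel in I1 -> list of source pixels in I0

    for y in range(h):
        for x in range(w):
            tx = x + forward_flow[y, x, 0]
            ty = y + forward_flow[y, x, 1]
            key = (int(round(ty)), int(round(tx))) # nearest pixel of I1 (simplification)
            if 0 <= key[0] < h and 0 <= key[1] < w:
                groups[key].append((y, x))

    occluded = np.zeros((h, w), dtype=bool)
    for (ty, tx), sources in groups.items():
        if len(sources) < 2:
            continue                               # no mapping conflict, nothing to mark here
        second_value = float(i1[ty, tx])           # the "second pixel value" of S212
        # keep the source whose pixel value is closest to the second pixel value ...
        keep = min(sources, key=lambda p: abs(float(i0[p]) - second_value))
        for p in sources:                          # ... and mark the remaining sources as occluded
            if p != keep:
                occluded[p] = True
    return occluded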

The acquiring a second occlusion region in the second image according to the backward optical flow comprises:

s221, acquiring third pixel points corresponding to all backward light flow values;

s222, marking the pixel points except the third pixel point in the second image as a second shielding area.

The second occlusion area of the second image is the area that is visible in the second image but occluded in the first image. Because that area is occluded in the first image, it has no corresponding forward optical flow value in the first image; that is, no forward optical flow points to the points of the second occlusion area in the second image. Consequently, when the backward optical flow is obtained from the forward optical flow and a second occlusion area exists in the second image, only part of the pixel points of the second image receive backward optical flow values, and the backward optical flow values of the points in the second occlusion area cannot be obtained. When the second occlusion area is obtained, the third pixel points corresponding to the backward optical flow values are therefore acquired; these third pixel points are also visible in the first image, while the pixel points other than the third pixel points are occluded in the first image, so the pixel points of the second image other than the third pixel points are marked as the second occlusion area.
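A minimal sketch of S221 and S222 under the same assumed flow layout: a pixel of the second image belongs to the second occlusion area exactly when the ceiling-rounded forward flow never reaches it, i.e., when it receives no backward optical flow value.

import numpy as np

def second_occlusion_mask(forward_flow):
    # forward_flow: (H, W, 2) float array of (dx, dy) per pixel of the first image (assumed layout).
    h, w, _ = forward_flow.shape
    ys, xs = np.mgrid[0:h, 0:w]
    int_flow = np.ceil(forward_flow).astype(np.int64)
    tx, ty = xs + int_flow[..., 0], ys + int_flow[..., 1]
    inside = (tx >= 0) & (tx < w) & (ty >= 0) & (ty < h)
    hit = np.zeros((h, w), dtype=bool)
    hit[ty[inside], tx[inside]] = True             # third pixel points: reached by some forward flow
    return ~hit                                    # True on the second occlusion area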

The video frame interpolation method further comprises the following steps:

s300, correcting the forward optical flow value corresponding to the first shielding area to generate a corrected forward optical flow, and correcting the backward optical flow value corresponding to the second shielding area to generate a corrected backward optical flow.

According to the foregoing description, the pixel points in the first occlusion area of the first image are occluded in the second image, so their real forward optical flow values cannot be obtained; similarly, the pixel points in the second occlusion area of the second image are occluded in the first image, so the backward optical flow contains no backward optical flow values for them. The video frame interpolation method provided by the present invention therefore corrects the forward optical flow and the backward optical flow.

Specifically, the modifying the forward optical flow value corresponding to the first occlusion region includes:

s311, a first non-shielding area with at least one preset size and a distance between the first non-shielding area and a first target pixel point in the first shielding area being smaller than a first preset value is obtained.

Because the motion of objects between the first image and the second image is rigid, the same semantic object has the same motion, i.e., the forward optical flow within one semantic object is substantially the same, and one semantic object generally has similar pixel values. In this embodiment, for each first target pixel point in the first occlusion area, at least one first non-occlusion area around the first target pixel point is therefore obtained, and which of the first non-occlusion areas belongs to the same semantic object as the first target pixel point is determined from the pixel values of the first non-occlusion areas. Specifically, a first non-occlusion area of a preset size whose distance from the first target pixel point is smaller than the first preset value is obtained. The first non-occlusion area may be a single pixel point outside the first occlusion area, i.e., the preset size of the first non-occlusion area is 1 × 1, in which case obtaining the first non-occlusion area amounts to obtaining the pixel point outside the first occlusion area that is closest to the first target pixel point. The first non-occlusion area may also be an area formed by a plurality of pixel points outside the first occlusion area, with a preset size of, for example, 3 × 3 or 5 × 3; in this case the distance between the first non-occlusion area and the first target pixel point is the distance between the central pixel point of the first non-occlusion area and the first target pixel point, i.e., a non-occlusion area of the preset size centered on a pixel point whose distance from the first target pixel point is smaller than the first preset value is obtained. The first preset value may be, for example, two pixel points or five pixel points, and it is easy to see that in this case there may be a plurality of first non-occlusion areas. In one possible implementation, the first preset value may be determined from the minimum of the distances between the non-occlusion areas of the preset size in the first image and the first target pixel point; specifically, the non-occlusion area of the preset size with the minimum distance from the first target pixel point may be taken directly as the first non-occlusion area. This means that in this case there may also be a plurality of first non-occlusion areas, i.e., several non-occlusion areas of the preset size whose distances from the first target pixel point are equal and minimal, which depends on the precision used in the distance calculation.

S312, obtaining pixel values corresponding to the first non-occlusion regions, and modifying the forward optical flow value of the first target pixel point to a target forward optical flow value, where the target forward optical flow value is a forward optical flow value corresponding to the first non-occlusion region closest to the pixel value of the first target pixel point.

After each first non-blocking area is obtained, obtaining pixel values corresponding to each first non-blocking area, specifically, the pixel value corresponding to the first non-blocking area may be an average pixel value of all pixel points included in the first non-blocking area, a pixel value of a center pixel point in the first non-blocking area, or a pixel value of a pixel point in the first non-blocking area that is closest to the first target pixel point, and the like. Comparing the pixel value corresponding to each first non-shielding area with the pixel value of the first target pixel point, obtaining a first non-shielding area with the pixel value closest to the pixel value of the first target pixel point, and correcting the forward optical flow value of the first target pixel point to be the forward optical flow value corresponding to the first non-shielding area (namely, the target forward optical flow value). Specifically, the forward optical flow value corresponding to the first non-occlusion region may be an average forward optical flow value of all pixel points included in the first non-occlusion region, a forward optical flow value of a center pixel point in the first non-occlusion region, or a forward optical flow value of a pixel point closest to the first target pixel point in the first non-occlusion region, or the like.

The first target pixel point refers to each pixel point in the first occlusion area; after the forward optical flow values of all pixel points in the first occlusion area have been corrected, the corrected forward optical flow is generated.
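The following sketch illustrates S311 and S312 with the simplest choices the text allows: 1 × 1 non-occlusion areas (single pixels) and a fixed search radius standing in for the first preset value. Grayscale float images and a brute-force neighbourhood search are assumptions made for clarity, not the patent's required implementation.

import numpy as np

def correct_flow_in_occlusion(flow, occluded, image, radius=5):
    # flow: (H, W, 2); occluded: (H, W) bool occlusion mask; image: (H, W) grayscale float.
    corrected = flow.copy()
    h, w = occluded.shape
    for y, x in zip(*np.nonzero(occluded)):        # every target pixel of the occlusion area
        best, best_diff = None, None
        y0, y1 = max(0, y - radius), min(h, y + radius + 1)
        x0, x1 = max(0, x - radius), min(w, x + radius + 1)
        for ny in range(y0, y1):
            for nx in range(x0, x1):
                if occluded[ny, nx]:
                    continue                        # only non-occluded candidates qualify
                diff = abs(float(image[ny, nx]) - float(image[y, x]))
                if best_diff is None or diff < best_diff:
                    best, best_diff = (ny, nx), diff
        if best is not None:
            corrected[y, x] = flow[best]            # adopt the flow of the closest-valued neighbour
    return corrected

The same routine applies symmetrically to S321 and S322 by passing the backward optical flow, the second occlusion area and the second image.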

The correcting the backward optical flow value corresponding to the second occlusion area comprises:

s321, obtaining a second non-shielding area with at least one preset size, wherein the distance between the second non-shielding area and a second target pixel point in the second shielding area is smaller than a second preset value.

Similar to the first occlusion region, in this embodiment, for each second target pixel point in the second occlusion region, at least one second non-occlusion region around the second target pixel point is obtained, and which region in each second non-occlusion region and the second target pixel point belong to the same semantic object is determined according to a pixel value of each second non-occlusion region. Specifically, a second non-blocking area with a preset size, where a distance from the second target pixel point is smaller than a second preset value, is obtained, where the second non-blocking area may be a single pixel point not in the second blocking area, that is, the preset size of the second non-blocking area is 1 × 1, and obtaining the second non-blocking area with a preset size, where a distance from the second target pixel point is smaller than the second preset value, is to obtain a pixel point not in the second blocking area closest to the second target pixel point; the second non-blocking area may also be an area formed by a plurality of pixels that are not in the second blocking area, where the preset size may be 3 × 3, or 5 × 3, and so on, and at this time, the distance between the second non-blocking area and the second target pixel is the distance between the central pixel point of the second non-blocking area and the second target pixel, that is, the non-blocking area with the preset size, which is centered on the pixel point whose distance from the second target pixel is smaller than the second preset value, is obtained. The second preset value may be equal to or different from the first preset value, and the second preset value may be two pixel points, five pixel points, or the like. In a possible implementation manner, the second preset value may be determined according to a minimum value of distances between non-occlusion regions of preset sizes in the second image and the second target pixel, specifically, the non-occlusion region of the preset size with the minimum distance between the non-occlusion region and the second target pixel may be directly obtained as the second non-occlusion region, which means that under this condition, a plurality of second non-occlusion regions may also exist, that is, the distance between the non-occlusion regions of the preset sizes and the distance between the second target pixels are the minimum and equal, which is related to the accuracy of selection in distance calculation.

And S322, obtaining pixel values corresponding to the second non-shielding areas respectively, and correcting the backward optical flow value of the second target pixel point to be a target backward optical flow value, wherein the target backward optical flow value is the backward optical flow value corresponding to the second non-shielding area closest to the pixel value of the second target pixel point.

After each second non-blocking area is obtained, obtaining pixel values corresponding to each second non-blocking area, specifically, the pixel values corresponding to the second non-blocking area may be an average pixel value of all pixel points included in the second non-blocking area, a pixel value of a center pixel point in the second non-blocking area, or a pixel value of a pixel point in the second non-blocking area that is closest to the second target pixel point, and the like. Comparing the pixel value corresponding to each second non-shielding area with the pixel value of the second target pixel point, acquiring a second non-shielding area with the pixel value closest to the pixel value of the second target pixel point, and correcting the backward optical flow value of the second target pixel point to be the backward optical flow value corresponding to the second non-shielding area (namely the target backward optical flow value). Specifically, the backward optical flow value corresponding to the second non-occlusion region may be an average backward optical flow value of all pixel points included in the second non-occlusion region, a backward optical flow value of a center pixel point in the second non-occlusion region, or a backward optical flow value of a pixel point closest to the second target pixel point in the second non-occlusion region, or the like.

And the second target pixel points are all pixel points in the second shielding area, and after the backward optical flow value of each pixel point in the second shielding area is corrected, the corrected backward optical flow is generated.

After obtaining the modified forward optical flow and the modified backward optical flow, the video frame interpolation method further includes:

s400, generating a target frame for inserting between the first image and the second image according to the corrected forward optical flow and the corrected backward optical flow.

Said generating a target frame for insertion between said first image and said second image from said modified forward optical flow and said modified backward optical flow comprises:

generating a first intermediate frame from the modified forward optical flow and the first image, and generating a second intermediate frame from the modified backward optical flow and the second image;

and generating the target frame according to the first intermediate frame and the second intermediate frame.

The target frame is a frame that lies chronologically between the first image and the second image. According to the time interval between the target frame and the first image, the optical flow from the first image to the target frame and the optical flow from the second image to the target frame can be acquired. Because the objects in the first image and the second image are moving, some pixel points that are visible on the target frame may be occluded in the first image; the pixel values of those pixel points cannot be obtained from the optical flow from the first image to the target frame, but those pixel points are very likely visible in the second image, so their pixel values can be obtained from the optical flow from the second image to the target frame. Likewise, some pixel points visible on the target frame may be occluded in the second image; their pixel values cannot be obtained from the optical flow from the second image to the target frame, but they are very likely visible in the first image, so their pixel values can be obtained from the optical flow from the first image to the target frame. Therefore, the first intermediate frame is obtained from the optical flow from the first image to the target frame, the second intermediate frame is obtained from the optical flow from the second image to the target frame, and the two frames are fused, i.e., the second intermediate frame is used to fill in the pixel values missing from the first intermediate frame, or the first intermediate frame is used to fill in the pixel values missing from the second intermediate frame, so as to generate the complete target frame.

Specifically, the generating a first intermediate frame from the modified forward optical flow and the first image comprises:

s411, acquiring a first intermediate optical flow from the first image to the target frame according to the corrected forward optical flow.

Specifically, after the time between the target frame and the first image and the time between the target frame and the second image are determined, a first intermediate optical flow from the first image to the target frame may be obtained according to the modified forward optical flow, which is the prior art and is not described herein again.

S412, obtaining pixel values corresponding to a plurality of pixel points on a first intermediate frame according to a first preset formula and the first intermediate optical flow, and generating the first intermediate frame.

The first intermediate optical flow value corresponding to the pixel point with coordinate m on the first image reflects the displacement of that pixel point between the first image and the first intermediate frame, and is denoted F0→t(m). That is, the pixel point with coordinate m on the first image has the coordinate m + F0→t(m) on the first intermediate frame, so the pixel value of the pixel point with coordinate m + F0→t(m) on the first intermediate frame is equal to the pixel value of the pixel point with coordinate m on the first image, which can be expressed as I0(m) = It1(m + F0→t(m)), where I0(m) is the pixel value of the pixel point with coordinate m on the first image, It1(m + F0→t(m)) is the pixel value of the pixel point with coordinate m + F0→t(m) on the first intermediate frame, and F0→t(m) is the first intermediate optical flow value of the pixel point with coordinate m on the first image.

As explained earlier, optical flow values are floating-point numbers and often not integers, i.e., m + F0→t(m) is not always an integer, whereas when the first intermediate frame is generated the pixel values of the pixel points with integer coordinates on the first intermediate frame need to be obtained. In this embodiment, the pixel values of a plurality of pixel points on the first intermediate frame are therefore obtained by a first preset formula, which is: I0(m+Δm) = It1(m+Δm+F0→t(m))

wherein I0(m+Δm) is the pixel value of the point with coordinates m+Δm on the first image, It1(m+Δm+F0→t(m)) is the pixel value of the pixel point with coordinate m+Δm+F0→t(m) on the first intermediate frame, F0→t(m) is the first intermediate optical flow value corresponding to the pixel point with coordinate m on the first image, m is an integer, and Δm is a value such that Δm+F0→t(m) is an integer, with 0 ≤ Δm < 1. In this way, the pixel values of a plurality of integer coordinate points on the first intermediate frame can be obtained, where the pixel value of the point with coordinates m+Δm on the first image can be obtained by bilinear interpolation.
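The sketch below applies the first preset formula: for every integer pixel m of the first image, Δm is the fractional complement of the first intermediate flow, the first image is sampled by bilinear interpolation at m + Δm, and the value is written to the integer coordinate m + Δm + F0→t(m) of the first intermediate frame. The conflict rule of S4121 (prefer a source outside the first occlusion area) is folded in. Grayscale images, the (dx, dy) flow layout and the NaN convention for still-missing pixels are assumptions of this sketch.

import numpy as np

def bilinear(img, y, x):
    # sample a grayscale image at a fractional coordinate (y, x)
    h, w = img.shape
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    x0, y0 = max(x0, 0), max(y0, 0)
    fx, fy = x - x0, y - y0
    top = img[y0, x0] * (1 - fx) + img[y0, x1] * fx
    bot = img[y1, x0] * (1 - fx) + img[y1, x1] * fx
    return top * (1 - fy) + bot * fy

def warp_to_intermediate(i0, flow_0t, occluded0):
    # i0: (H, W) grayscale; flow_0t: (H, W, 2) first intermediate flow; occluded0: first occlusion mask
    h, w = i0.shape
    frame = np.full((h, w), np.nan)                # NaN marks intermediate-frame pixels with no value yet
    from_occluded = np.ones((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            dx, dy = flow_0t[y, x]
            ddx, ddy = np.ceil(dx) - dx, np.ceil(dy) - dy          # Δm: makes Δm + F0→t(m) an integer
            ty, tx = int(round(y + ddy + dy)), int(round(x + ddx + dx))
            if not (0 <= ty < h and 0 <= tx < w):
                continue
            # S4121: a source outside the first occlusion area overrides a value
            # previously written by a source inside it.
            if np.isnan(frame[ty, tx]) or (from_occluded[ty, tx] and not occluded0[y, x]):
                frame[ty, tx] = bilinear(i0, y + ddy, x + ddx)      # I0 sampled at m + Δm
                from_occluded[ty, tx] = occluded0[y, x]
    return frame

The second intermediate frame of S421 to S4221 is produced in exactly the same way by passing the second image, the second intermediate optical flow and the second occlusion area.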

As explained above, some pixel points that are visible on the first intermediate frame may be occluded in the first image; these pixel points are not pointed to by the first intermediate optical flow, so their pixel values cannot be obtained from the first image and the first intermediate optical flow, i.e., the pixel values of some pixel points are missing in the first intermediate frame. In this embodiment, a second intermediate frame is therefore generated from the modified backward optical flow and the second image, and the missing pixel values are obtained by using the second intermediate frame.

Said generating a second intermediate frame from said modified backward optical flow and said second image comprises:

s421, acquiring a second intermediate optical flow from the second image to the target frame according to the corrected backward optical flow.

After the time from the target frame to the first image and the second image is determined, a second intermediate optical flow from the second image to the target frame may be obtained according to the modified backward optical flow, which is the prior art and is not described herein again.

S422, pixel values corresponding to a plurality of pixel points on a second intermediate frame are obtained according to a second preset formula and the second intermediate optical flow, and the second intermediate frame is generated.

Similarly to the first intermediate frame, the second intermediate optical flow value corresponding to the pixel point with coordinate n on the second image reflects the displacement of that pixel point between the second image and the second intermediate frame, and is denoted F1→t(n). That is, the pixel point with coordinate n on the second image has the coordinate n + F1→t(n) on the second intermediate frame, so the pixel value of the pixel point with coordinate n + F1→t(n) on the second intermediate frame is equal to the pixel value of the pixel point with coordinate n on the second image, which can be expressed as I1(n) = It2(n + F1→t(n)), where I1(n) is the pixel value of the pixel point with coordinate n on the second image, It2(n + F1→t(n)) is the pixel value of the pixel point with coordinate n + F1→t(n) on the second intermediate frame, and F1→t(n) is the second intermediate optical flow value of the pixel point with coordinate n on the second image.

As explained earlier, optical flow values are floating-point numbers and often not integers, i.e., n + F1→t(n) is not always an integer, whereas when the second intermediate frame is generated the pixel values of the pixel points with integer coordinates on the second intermediate frame need to be obtained. In this embodiment, the pixel values of a plurality of pixel points on the second intermediate frame are therefore obtained by a second preset formula, which is: I1(n+Δn) = It2(n+Δn+F1→t(n))

wherein I1(n+Δn) is the pixel value of the point with coordinates n+Δn on the second image, It2(n+Δn+F1→t(n)) is the pixel value of the pixel point with coordinate n+Δn+F1→t(n) on the second intermediate frame, F1→t(n) is the second intermediate optical flow value corresponding to the pixel point with coordinate n on the second image, n is an integer, and Δn is a value such that Δn+F1→t(n) is an integer, with 0 ≤ Δn < 1. In this way, the pixel values of a plurality of integer coordinate points on the second intermediate frame can be obtained, where the pixel value of the point with coordinates n+Δn on the second image can be obtained by bilinear interpolation.

In practical applications, a situation that a pixel point on the first image is blocked on the first intermediate frame and a pixel point on the second image is blocked on the second intermediate frame may also occur, and therefore, in this embodiment, the obtaining a pixel value corresponding to each pixel point on the first intermediate frame according to the first preset formula and the first intermediate optical flow includes:

S4121, when the first intermediate optical flow values corresponding to a plurality of first pixel points in the first image point to the same pixel point on the first intermediate frame, obtaining, among the plurality of first pixel points, a third pixel point that is not located in the first occlusion region, and taking the pixel value of the pixel point corresponding to the third pixel point on the first intermediate frame as the pixel value of the pixel point corresponding to the plurality of first pixel points on the first intermediate frame.

When the first intermediate optical flow values corresponding to a plurality of first pixel points in the first image point to the same pixel point on the first intermediate frame, this indicates that the plurality of first pixel points include a pixel point that is visible in the first image but occluded in the first intermediate frame; the first occlusion region is the region that is visible in the first image and occluded in the second image. A third pixel point of the plurality of first pixel points that is not in the first occlusion region is therefore obtained, and the pixel value of the pixel point corresponding to the third pixel point on the first intermediate frame is taken as the pixel value of the pixel point corresponding to the plurality of first pixel points on the first intermediate frame.

The obtaining of the pixel value corresponding to each pixel point on the second intermediate frame according to the second preset formula and the second intermediate optical flow includes:

S4221, when the second intermediate optical flow values corresponding to a plurality of second pixel points in the second image point to the same pixel point on the second intermediate frame, obtaining, among the plurality of second pixel points, a fourth pixel point that is not in the second occlusion region, and taking the pixel value of the pixel point corresponding to the fourth pixel point on the second intermediate frame as the pixel value of the pixel point corresponding to the plurality of second pixel points on the second intermediate frame.

When the second intermediate optical flow values corresponding to a plurality of second pixel points in the second image point to the same pixel point on the second intermediate frame, this indicates that the plurality of second pixel points include a pixel point that is visible in the second image but occluded in the second intermediate frame; the second occlusion region is the region that is visible in the second image and occluded in the first image. A fourth pixel point of the plurality of second pixel points that is not in the second occlusion region is therefore obtained, and the pixel value of the pixel point corresponding to the fourth pixel point on the second intermediate frame is taken as the pixel value of the pixel point corresponding to the plurality of second pixel points on the second intermediate frame.
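
The collision handling of steps S4121/S4221 can be sketched as follows. The routine splats a source image to its intermediate frame and, whenever several source pixels land on the same target pixel, keeps the contribution of a source pixel that lies outside the corresponding occlusion region. Rounding the target coordinate to the nearest pixel and the tie-breaking order are assumptions of this sketch, not requirements of the embodiment.

import numpy as np
from collections import defaultdict

def splat_with_occlusion(image, flow_to_t, occlusion_mask):
    # Collect, for every integer target pixel of the intermediate frame, all
    # source pixels whose intermediate flow points at it, then prefer a source
    # pixel that is not marked as occluded (i.e. visible in the intermediate frame).
    h, w = image.shape[:2]
    candidates = defaultdict(list)
    for y in range(h):
        for x in range(w):
            dy, dx = flow_to_t[y, x]
            ty, tx = int(round(y + dy)), int(round(x + dx))
            if 0 <= ty < h and 0 <= tx < w:
                candidates[(ty, tx)].append((y, x))
    frame = np.zeros_like(image)
    filled = np.zeros((h, w), dtype=bool)
    for (ty, tx), sources in candidates.items():
        visible = [s for s in sources if not occlusion_mask[s]]
        sy, sx = visible[0] if visible else sources[0]
        frame[ty, tx] = image[sy, sx]
        filled[ty, tx] = True
    return frame, filled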

As can be seen from the above description, the first intermediate frame contains the pixel values of some pixel points of the target frame, obtained from the first image and the corrected forward optical flow, and the second intermediate frame contains the pixel values of some pixel points of the target frame, obtained from the second image and the corrected backward optical flow. The first intermediate frame and the second intermediate frame are fused to obtain the pixel values of all pixel points of the target frame, and the target frame is then generated. As mentioned above, either the pixel values of all pixel points of the first intermediate frame may be retained and the missing pixel values supplemented from the corresponding pixel points of the second intermediate frame, or the pixel values of all pixel points of the second intermediate frame may be retained and the missing pixel values supplemented from the corresponding pixel points of the first intermediate frame. Since the backward optical flow is derived from the forward optical flow, the corrected forward optical flow is more accurate than the corrected backward optical flow, that is, the first intermediate frame is more accurate than the second intermediate frame. In this embodiment, the pixel values of the pixel points of the first intermediate frame are therefore retained preferentially, and for the pixel points of the first intermediate frame whose pixel values cannot be obtained, the pixel values of the pixel points with the corresponding coordinates in the second intermediate frame are used for supplementation.
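
A minimal sketch of this fusion rule, assuming each intermediate frame comes with a boolean mask of the pixels it was able to fill (as returned by the splatting sketches above):

import numpy as np

def fuse_intermediate_frames(frame0, filled0, frame1, filled1):
    # Prefer the first intermediate frame (derived from the more accurate
    # corrected forward flow); fill its missing pixels from the second one.
    target = np.where(filled0[..., None], frame0, frame1)
    holes = ~(filled0 | filled1)   # pixels that neither intermediate frame supplied
    return target, holes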

As shown in fig. 5, fig. 5 is a comparison of the frame interpolation effect of the video frame interpolation method provided by this embodiment with that of a conventional method. In fig. 5, the first row shows the first image of the image pair into which the target frame is to be interpolated, the fourth row shows the second image of that image pair, the second row shows the target frame generated by the conventional method, and the third row shows the target frame generated by the video frame interpolation method provided by this embodiment. As is apparent from fig. 5, compared with the conventional method, the target frame generated by the video frame interpolation method provided by this embodiment exhibits no foreground ghosting or edge shadows, and the effect is better.

In summary, this embodiment provides a video frame interpolation method that computes the forward optical flow and the backward optical flow of the target image pair into which a target frame is to be interpolated, obtains the occlusion regions in the images from the forward and backward optical flows, corrects the forward and backward optical flows according to those occlusion regions, and generates the target frame using the corrected forward and backward optical flows.

It should be understood that, although the steps in the flowcharts shown in the figures of this specification are displayed in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, the execution of these steps is not strictly limited to the order shown, and they may be performed in other orders. Moreover, at least a portion of the steps in the flowcharts may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments, and whose execution order is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least a portion of the sub-steps or stages of other steps.

It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, databases, or other media used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).

Example Two

Based on the above embodiments, the present invention further provides a terminal, a schematic block diagram of which may be as shown in fig. 6. The terminal comprises a processor, a memory, a network interface, a display screen and a temperature sensor which are connected through a system bus. The processor of the terminal is configured to provide computing and control capabilities. The memory of the terminal comprises a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The network interface of the terminal is used for connecting to and communicating with an external terminal through a network. The computer program is executed by the processor to implement the video frame interpolation method. The display screen of the terminal may be a liquid crystal display screen or an electronic ink display screen, and the temperature sensor of the terminal is arranged in the terminal in advance and used for detecting the current operating temperature of the internal devices.

It will be appreciated by those skilled in the art that the block diagram of fig. 6 is only a block diagram of a portion of the structure associated with the inventive arrangements and does not constitute a limitation of the terminal to which the inventive arrangements are applied; a particular terminal may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.

In one embodiment, a terminal is provided, comprising a memory and a processor, the memory having a computer program stored therein, the processor when executing the computer program implementing at least the following steps:

acquiring a forward optical flow corresponding to a target image pair, and acquiring a backward optical flow corresponding to the target image pair according to the forward optical flow, wherein the forward optical flow is from a first image to a second image in the target image pair, and the backward optical flow is from the second image to the first image;

acquiring a first occlusion area in the first image according to the forward optical flow, and acquiring a second occlusion area in the second image according to the backward optical flow;

correcting a forward optical flow value corresponding to the first shielding area to generate a corrected forward optical flow, and correcting a backward optical flow value corresponding to the second shielding area to generate a corrected backward optical flow;

generating a target frame for insertion between the first image and the second image from the modified forward optical flow and the modified backward optical flow.
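
Read together, these steps form the pipeline sketched below. The helper functions correspond to the sketches given elsewhere in this description or, in the case of estimate_forward_flow, stand for any off-the-shelf optical-flow estimator; the linear scaling of the corrected flows by the interpolation time t is a common simplification and an assumption of this sketch rather than a requirement of the embodiment.

def interpolate_target_frame(first_image, second_image, t=0.5):
    # 1. forward flow F_{0->1} and the backward flow F_{1->0} derived from it
    forward = estimate_forward_flow(first_image, second_image)   # placeholder
    backward, reached = reverse_flow(forward)
    # 2. occlusion regions of both images
    occ_first = first_occlusion_region(first_image, second_image, forward)
    occ_second = second_occlusion_region(reached)
    # 3. flow correction inside the occlusion regions
    forward_c = correct_flow(forward, occ_first, first_image)
    backward_c = correct_flow(backward, occ_second, second_image)
    # 4. intermediate frames at time t and their fusion into the target frame
    frame0, filled0 = splat_intermediate_frame(first_image, t * forward_c)
    frame1, filled1 = splat_intermediate_frame(second_image, (1 - t) * backward_c)
    target, _ = fuse_intermediate_frames(frame0, filled0, frame1, filled1)
    return target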

Wherein the obtaining of the backward optical flow corresponding to the target image pair according to the forward optical flow comprises:

acquiring each first forward optical flow value corresponding to each pixel point in the first image;

rounding up each first forward optical flow value to obtain each integer intermediate optical flow value corresponding to each pixel point in the first image;

and obtaining backward optical flow values of a plurality of pixel points in the second image according to the coordinates of the pixel points in the first image, the first forward optical flow values and the integer intermediate optical flow values.

Wherein the obtaining backward optical flow values of a plurality of pixels in the second image according to the coordinates of each pixel in the first image, the first forward optical flow values, and the integer intermediate optical flow values comprises:

according to the formula F1→0(x + F'0→1(x)) = -F0→1(x), obtaining backward optical flow values of a plurality of pixel points in the second image;

wherein F1→0(x + F'0→1(x)) is the backward optical flow value of the pixel point whose coordinates in the second image are x + F'0→1(x), x is the coordinate of a pixel point in the first image, F0→1(x) is the first forward optical flow value of the pixel point with coordinate x in the first image, and F'0→1(x) is the integer intermediate optical flow value corresponding to the pixel point with coordinate x in the first image.
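
A sketch of this flow-reversal step is given below. It assumes, consistent with the formula above, that the pixel of the second image located at x plus the integer intermediate optical flow simply receives the negated forward flow; pixels that no forward flow reaches keep a zero flow and are returned in a separate mask (they form the second occlusion region in the next step). Whether "rounding up" means the ceiling or nearest-integer rounding makes little practical difference; the sketch follows the text and uses the ceiling.

import numpy as np

def reverse_flow(forward_flow):
    # forward_flow: (H, W, 2) array of (dy, dx) displacements from the first
    # image to the second image.
    h, w = forward_flow.shape[:2]
    backward_flow = np.zeros_like(forward_flow)
    reached = np.zeros((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            dy, dx = forward_flow[y, x]
            # integer intermediate optical flow value (rounded-up forward flow)
            iy, ix = int(np.ceil(dy)), int(np.ceil(dx))
            ty, tx = y + iy, x + ix
            if 0 <= ty < h and 0 <= tx < w:
                backward_flow[ty, tx] = (-dy, -dx)
                reached[ty, tx] = True
    return backward_flow, reached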

Wherein said obtaining a first occlusion region of said first image from said forward optical flow comprises:

extracting at least one pixel point set from each pixel point in the first image, wherein the pixel point set comprises at least two pixel points, and the pixel point set comprises each pixel point of which the forward optical flow value points to the same point in the second image;

and acquiring each pixel value corresponding to each pixel point in the pixel point set and a second pixel value of the point in the second image to which the forward optical flow values of the pixel points in the pixel point set point, and marking the pixel points in the pixel point set other than the pixel point whose pixel value is closest to the second pixel value as a first occlusion region.

Wherein said obtaining a second occlusion region of said second image from said backward optical flow comprises:

acquiring a third pixel point corresponding to each backward optical flow value;

and marking the pixel points in the second image other than the third pixel points as a second occlusion region.
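
Both occlusion tests can be sketched as follows. For the first image, pixels whose forward flow collides on the same second-image pixel are compared against that second-image pixel's value, and all but the closest one are marked occluded; the sum of absolute colour differences used here, and the rounding of flow targets to the nearest pixel, are assumptions of the sketch. For the second image, the occlusion region is simply the set of pixels that the reversed flow never reached.

import numpy as np
from collections import defaultdict

def first_occlusion_region(first_image, second_image, forward_flow):
    h, w = first_image.shape[:2]
    occluded = np.zeros((h, w), dtype=bool)
    groups = defaultdict(list)   # second-image pixel -> first-image pixels mapping to it
    for y in range(h):
        for x in range(w):
            dy, dx = forward_flow[y, x]
            ty, tx = int(round(y + dy)), int(round(x + dx))
            if 0 <= ty < h and 0 <= tx < w:
                groups[(ty, tx)].append((y, x))
    for (ty, tx), sources in groups.items():
        if len(sources) < 2:
            continue
        # keep only the source pixel whose value is closest to the target pixel
        errors = [np.sum(np.abs(first_image[s].astype(float)
                                - second_image[ty, tx].astype(float))) for s in sources]
        keep = sources[int(np.argmin(errors))]
        for s in sources:
            if s != keep:
                occluded[s] = True
    return occluded

def second_occlusion_region(reached):
    # `reached` is the mask returned by reverse_flow: pixels of the second
    # image that no backward flow value was assigned to are occluded.
    return ~reached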

Wherein the correcting the forward optical flow value corresponding to the first occlusion region includes:

acquiring at least one first non-occlusion region of a preset size, where the distance between the first non-occlusion region and a first target pixel point in the first occlusion region is smaller than a first preset value;

acquiring the pixel values respectively corresponding to the first non-occlusion regions, and correcting the forward optical flow value of the first target pixel point into a target forward optical flow value, where the target forward optical flow value is the forward optical flow value corresponding to the first non-occlusion region whose pixel value is closest to the pixel value of the first target pixel point;

the correcting of the backward optical flow value corresponding to the second occlusion region comprises:

acquiring at least one second non-occlusion region of a preset size, where the distance between the second non-occlusion region and a second target pixel point in the second occlusion region is smaller than a second preset value;

and acquiring the pixel values respectively corresponding to the second non-occlusion regions, and correcting the backward optical flow value of the second target pixel point into a target backward optical flow value, where the target backward optical flow value is the backward optical flow value corresponding to the second non-occlusion region whose pixel value is closest to the pixel value of the second target pixel point.
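
The correction of both flows can be sketched as below; it is written for a single flow field and its occlusion mask, so it serves the forward and the backward case alike. For simplicity, the "non-occlusion regions of a preset size" are reduced to individual non-occluded pixels inside a search radius, and the sum of absolute colour differences is used as the distance between pixel values; both simplifications, like the parameter name search_radius, are assumptions of this sketch.

import numpy as np

def correct_flow(flow, occlusion_mask, image, search_radius=5):
    # For every occluded pixel, copy the flow of the nearby non-occluded pixel
    # whose colour is closest to that of the occluded pixel.
    h, w = flow.shape[:2]
    corrected = flow.copy()
    for y, x in zip(*np.nonzero(occlusion_mask)):
        best, best_err = None, np.inf
        for ny in range(max(0, y - search_radius), min(h, y + search_radius + 1)):
            for nx in range(max(0, x - search_radius), min(w, x + search_radius + 1)):
                if occlusion_mask[ny, nx]:
                    continue
                err = np.sum(np.abs(image[ny, nx].astype(float) - image[y, x].astype(float)))
                if err < best_err:
                    best, best_err = (ny, nx), err
        if best is not None:
            corrected[y, x] = flow[best]
    return corrected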

Wherein said generating a target frame for insertion between the first image and the second image from the modified forward optical flow and the modified backward optical flow comprises:

generating a first intermediate frame from the modified forward optical flow and the first image, and generating a second intermediate frame from the modified backward optical flow and the second image;

and generating the target frame according to the first intermediate frame and the second intermediate frame.

Wherein said generating a first intermediate frame from said modified forward optical flow and said first image comprises:

acquiring a first intermediate optical flow from the first image to the target frame according to the modified forward optical flow;

acquiring pixel values corresponding to a plurality of pixel points on a first intermediate frame according to a first preset formula and the first intermediate optical flow to generate the first intermediate frame;

said generating a second intermediate frame from said modified backward optical flow and said second image comprises:

acquiring a second intermediate optical flow from the second image to the target frame according to the modified backward optical flow;

and acquiring pixel values corresponding to a plurality of pixel points on a second intermediate frame according to a second preset formula and the second intermediate optical flow to generate the second intermediate frame.

Wherein the first preset formula is as follows: I0(m + Δm) = It1(m + Δm + F0→t(m))

where I0(m + Δm) is the pixel value of the point with coordinates m + Δm on the first image, It1(m + Δm + F0→t(m)) is the pixel value of the pixel point with coordinates m + Δm + F0→t(m) on the first intermediate frame, F0→t(m) is the first intermediate optical flow value corresponding to the pixel point with coordinate m on the first image, m is an integer, and Δm is such that Δm + F0→t(m) is an integer, with 0 ≤ Δm < 1;

the second preset formula is as follows: I1(n + Δn) = It2(n + Δn + F1→t(n))

where I1(n + Δn) is the pixel value of the point with coordinates n + Δn on the second image, It2(n + Δn + F1→t(n)) is the pixel value of the pixel point with coordinates n + Δn + F1→t(n) on the second intermediate frame, F1→t(n) is the second intermediate optical flow value corresponding to the pixel point with coordinate n on the second image, n is an integer, and Δn is such that Δn + F1→t(n) is an integer, with 0 ≤ Δn < 1.

Wherein the obtaining of the pixel value corresponding to each pixel point on the first intermediate frame according to the first preset formula and the first intermediate optical flow comprises:

when the first intermediate optical flow values corresponding to a plurality of first pixel points in the first image point to the same pixel point on the first intermediate frame, acquiring, among the plurality of first pixel points, a third pixel point that is not in the first occlusion region, and taking the pixel value of the pixel point corresponding to the third pixel point on the first intermediate frame as the pixel value of the pixel point corresponding to the plurality of first pixel points on the first intermediate frame;

the obtaining of the pixel value corresponding to each pixel point on the second intermediate frame according to the second preset formula and the second intermediate optical flow includes:

when the second intermediate optical flow values corresponding to a plurality of second pixel points in the second image point to the same pixel point on the second intermediate frame, acquiring, among the plurality of second pixel points, a fourth pixel point that is not in the second occlusion region, and taking the pixel value of the pixel point corresponding to the fourth pixel point on the second intermediate frame as the pixel value of the pixel point corresponding to the plurality of second pixel points on the second intermediate frame.

Example Three

The present invention also provides a storage medium storing one or more programs, which are executable by one or more processors to implement the steps of the video frame interpolation method described in the above embodiments.

Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.
