Image generation method and device and electronic equipment

Document No.: 1923235    Publication date: 2021-12-03

Reading note: This technology, "Image generation method and apparatus, and electronic device" (图像生成方法、装置和电子设备), was designed and created by 张彪, 王科 and 孔亮 on 2021-08-25. Abstract: The embodiments of the present disclosure disclose an image generation method and apparatus, and an electronic device. One embodiment of the method comprises: selecting at least one pixel from the pixels of a first canvas as a first type pixel, wherein the first canvas is used for recording a target two-dimensional image generated based on a three-dimensional model; rendering the first type pixels based on a ray tracing algorithm to obtain first type pixel values; and performing image restoration on the first canvas recorded with the first type pixel values to obtain second type pixel values of second type pixels, so as to generate the target two-dimensional image, wherein the pixels in the first canvas other than the first type pixels are the second type pixels. Thus, a new image generation approach is provided.

1. An image generation method, comprising:

selecting at least one pixel from pixels of a first canvas as a first type pixel, wherein the first canvas is used for recording a target two-dimensional image generated based on a three-dimensional model;

rendering the first type of pixels based on a ray tracing algorithm to obtain a first type of pixel value;

and performing image restoration on the first canvas recorded with the first type pixel value to obtain a second type pixel value of a second type pixel so as to generate the target two-dimensional image, wherein the pixels except the first type pixel in the first canvas are the second type pixel.

2. The method of claim 1, wherein the selecting at least one pixel from the pixels of the first canvas as the first type pixel comprises:

based on at least one piece of relevant information of the pixels in the first canvas, at least one pixel is selected as a first type pixel.

3. The method of claim 2, wherein the at least one related information comprises: pixel location and/or estimated pixel energy value.

4. The method of claim 2, wherein selecting at least one pixel as the first type pixel based on at least one type of information about the pixels in the first canvas comprises:

selecting target rows from the first canvas, wherein the difference value of the row sequence numbers of at least one pair of adjacent target rows in the first canvas is not less than 2;

selecting target columns from the first canvas, wherein the difference value of the column sequence numbers of at least one pair of adjacent target columns in the first canvas is not less than 2;

and determining pixels positioned on the target row and the target column as the first type pixels.

5. The method of claim 4, wherein the difference of the row sequence numbers of the adjacent target rows in the first canvas is a preset first difference, and the difference of the column sequence numbers of the adjacent target columns in the first canvas is a preset second difference.

6. The method of claim 2, wherein selecting at least one pixel as the first type pixel based on at least one type of information about the pixels in the first canvas comprises:

acquiring a related two-dimensional image of a target two-dimensional image;

and selecting at least one pixel as a first type pixel based on the energy distribution graph of the related two-dimensional image.

7. The method of claim 6, wherein the first canvas comprises a first number of pixels; and

the acquiring of the relevant two-dimensional image of the target two-dimensional image comprises:

determining a second canvas comprising a second number of pixels, wherein the second number is less than the first number;

rendering the three-dimensional model to each pixel of the second canvas based on a ray tracing algorithm to obtain the related two-dimensional image.

8. The method of claim 7, wherein selecting at least one pixel as a first type pixel based on the energy profile of the associated two-dimensional image comprises:

determining pixels with energy values larger than an energy threshold value in the energy distribution map as pixels of a third type;

and determining the first type pixel in the first canvas according to the first corresponding relation and the third type pixel, wherein the first corresponding relation is used for indicating the corresponding relation between the pixels in the energy distribution map and the pixels in the first canvas.

9. The method according to claim 1, wherein the image repairing the first canvas recorded with the first type pixel value to obtain the second type pixel value of the second type pixel comprises:

and repairing the first canvas recorded with the first type of pixel values based on a preset image repairing mode.

10. The method according to claim 9, wherein the repairing the first canvas recorded with the first type of pixel values based on a preset image repairing manner comprises:

selecting an image restoration mode corresponding to the pixel selection mode from at least two preset image restoration modes, according to the pixel selection mode adopted for determining the first type pixels and a preset second corresponding relation;

repairing the first canvas recorded with the first type pixel value by adopting the selected image repairing mode;

and the preset second corresponding relation is used for representing the corresponding relation between the pixel selection mode and the image restoration mode.

11. The method according to claim 10, wherein selecting an image restoration method corresponding to the pixel selection method from at least two preset image restoration methods according to the pixel selection method used for determining the first type of pixels and the preset second corresponding relationship comprises:

and selecting the image restoration mode based on the interpolation as the corresponding image restoration mode in response to the fact that the adopted determination mode comprises a pixel selection mode based on the pixel position.

12. An image generation apparatus, comprising:

a selecting unit, configured to select at least one pixel from pixels of a first canvas as a first type pixel, wherein the first canvas is used for recording a target two-dimensional image generated based on a three-dimensional model;

the rendering unit is used for rendering the first type of pixels based on a ray tracing algorithm to obtain a first type of pixel value;

and the repairing unit is used for performing image repairing on the first canvas recorded with the first type pixel value to obtain a second type pixel value of a second type pixel so as to generate the target two-dimensional image, wherein the pixels except the first type pixel in the first canvas are the second type pixel.

13. An electronic device, comprising:

one or more processors;

a storage device for storing one or more programs,

wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-11.

14. A computer-readable medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-11.

Technical Field

The present disclosure relates to the field of computer technologies, and in particular, to an image generation method and apparatus, and an electronic device.

Background

With the development of computers, users can implement various functions using electronic devices. For example, a user may use an electronic device to render a realistic picture based on a virtual world.

In some scenarios, generating a realistic picture based on a virtual world needs to be implemented by means of rendering. Rendering is one of the important research directions in computer graphics and typically refers to the process of generating two-dimensional images based on three-dimensional models. Here, a three-dimensional model may describe a three-dimensional object or a virtual scene defined in a language or a data structure, and may include information on geometry, viewpoint, texture, lighting, and the like.

Disclosure of Invention

This disclosure is provided to introduce concepts in a simplified form that are further described below in the detailed description. This disclosure is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.

In a first aspect, an embodiment of the present disclosure provides an image generation method, including: selecting at least one pixel from pixels of a first canvas as a first type pixel, wherein the first canvas is used for recording a target two-dimensional image generated based on a three-dimensional model; rendering the first type of pixels based on a ray tracing algorithm to obtain a first type of pixel value; and performing image restoration on the first canvas recorded with the first type pixel value to obtain a second type pixel value of a second type pixel so as to generate the target two-dimensional image, wherein the pixels except the first type pixel in the first canvas are the second type pixel.

In a second aspect, an embodiment of the present disclosure provides an image generating apparatus, including: the device comprises a selecting unit and a judging unit, wherein the selecting unit is used for selecting at least one pixel from pixels of a first canvas as a first type pixel, and the first canvas is used for recording a target two-dimensional image generated based on a three-dimensional model; the rendering unit is used for rendering the first type of pixels based on a ray tracing algorithm to obtain a first type of pixel value; and the repairing unit is used for performing image repairing on the first canvas recorded with the first type pixel value to obtain a second type pixel value of a second type pixel so as to generate the target two-dimensional image, wherein the pixels except the first type pixel in the first canvas are the second type pixel.

In a third aspect, an embodiment of the present disclosure provides an electronic device, including: one or more processors; storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the image generation method of the first aspect.

In a fourth aspect, the disclosed embodiments provide a computer readable medium, on which a computer program is stored, which when executed by a processor, implements the steps of the image generation method according to the first aspect.

According to the image generation method, the image generation device and the electronic equipment, the first type pixel can be selected from the first canvas, and then the first type pixel is rendered according to the ray tracing algorithm; and then, based on the first type pixel value, image restoration is carried out on the two-dimensional image recorded by the first canvas to obtain a complete two-dimensional image. Thus, a new image generation method is provided.

Specifically, two-dimensional image rendering is decomposed into two stages: the first stage screens out a part of the pixels of the picture and renders them based on ray tracing; the second stage fills in the pixels left unrendered in the first stage based on image inpainting. Generally, the time spent on image restoration is far shorter than the time spent on image rendering and can be neglected. Therefore, the number of pixels rendered by the ray tracing algorithm is reduced, so the time and amount of computation consumed by two-dimensional image rendering drop greatly and the rendering speed improves. In other words, a part of the pixels is deliberately screened out in the first stage and not ray traced, and image restoration fills them in during the second stage, which accelerates the whole rendering process.

In contrast, in some related art, a ray-tracing-based rendering system renders an image of a specified size pixel by pixel. Although the rendering effect is realistic, the algorithm is essentially a process of computing Monte Carlo integrals, which is computationally very expensive. To speed it up, means such as hardware acceleration (for example, GPU rendering) and parallel computing (multiple threads rendering different pixels at the same time) may be adopted, but these do not depart from the idea of pixel-by-pixel rendering.

It should be noted that an image restoration method (or image restoration algorithm) is designed for image restoration scenes and can usually restore a digital image in which part of the content is missing; because the missing content of such an image is irregular and unknown, the restoration is usually difficult. In the present application, since the selection of the first type pixels can be controlled, repairing the first canvas is generally easier than repairing a digital image with arbitrarily missing content. In other words, applying an image restoration method (or image restoration algorithm) designed for general image restoration scenes to the first canvas can be understood as using an over-qualified tool for an easier problem. Therefore, bringing an image restoration method into rendering allows the target two-dimensional image to be generated faster and with a more accurate result.

Drawings

The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and features are not necessarily drawn to scale.

FIG. 1 is a flow diagram of one embodiment of an image generation method according to the present disclosure;

FIGS. 2, 3 and 4 are schematic diagrams of an application scenario of the image generation method according to the present disclosure;

FIG. 5 is a schematic structural diagram of one embodiment of an image generation apparatus according to the present disclosure;

FIG. 6 is an exemplary system architecture to which the image generation method of one embodiment of the present disclosure may be applied;

FIG. 7 is a schematic diagram of a basic structure of an electronic device provided according to an embodiment of the present disclosure.

Detailed Description

Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.

It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.

The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.

It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.

It is noted that references to "a", "an", and "the" modifications in this disclosure are intended to be illustrative rather than limiting, and that those skilled in the art will recognize that "one or more" may be used unless the context clearly dictates otherwise.

The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.

Referring to fig. 1, a flow of one embodiment of an image generation method according to the present disclosure is shown. The image generation method as shown in fig. 1 includes the steps of:

Step 101, selecting at least one pixel from the pixels of the first canvas as a first type pixel.

In this embodiment, an execution subject (e.g., a server and/or a terminal device) of the image generation method may select at least one pixel from pixels of the first canvas as the first type pixel.

Here, the first canvas is used to record a target two-dimensional image generated based on a three-dimensional model.

Here, the three-dimensional model may indicate a three-dimensional object and/or a virtual scene.

Here, the target two-dimensional image may be understood as the image to be generated. At the time step 101 is performed, the target two-dimensional image does not exist yet. It is to be understood that the embodiment corresponding to fig. 1 includes a process of generating the target two-dimensional image.

It is commonly understood that the relationship between the first canvas and the target two-dimensional image is analogous to the relationship between the canvas and the oil painting in a scene where a person paints an oil painting. However, the first canvas in this embodiment is not a perceivable physical object; in the ray tracing algorithm, the canvas may be understood as a concept set up for calculation.

Ray tracing is a rendering algorithm in three-dimensional computer graphics. It traces rays emitted from a camera, instead of rays emitted from a light source, in order to render a mathematical model of the scene.

The physical principle of ray tracing can be briefly described as follows. In geometric optics, the physical behavior of light can be studied by simplifying light rays into straight lines and ignoring their wave nature. Similarly, in computer graphics, the same simplification can be used for illumination rendering. The illumination information received by the human eye amounts to a finite number of pixels; for most people this is on the order of 500 million pixels. The image information a person receives can therefore be divided into about 500 million pixels, that is, into about 500 million very thin rays; by tracing these rays backwards, the information of the scene object each ray hits (position, orientation, material, illumination color and brightness, etc.) can be determined.

Ray tracing follows from the above physical principle. The eye is abstracted into a camera, the retina is abstracted into a canvas (or a display screen), the 500 million pixels are simplified into canvas pixels, and a ray is cast from the camera position through each pixel of the canvas to trace the illumination information at the intersections of the ray with the scene objects.

Here, the size of the first canvas may be smaller than the 500 million pixels the human eye can perceive. The size of the first canvas may be set according to the actual application scenario and is not limited herein.

In this embodiment, a part of the pixels may be selected as the first type pixels in various ways, which is not limited herein. As an example, a pixel may be randomly chosen from the first canvas as a first type of pixel.

It can be understood that, since the first canvas includes both the first type pixels and the second type pixels, the number of first type pixels is smaller than the total number of pixels in the first canvas; details are not repeated here. In other words, the first type pixels form a proper subset of the pixels in the first canvas.

Step 102, rendering the first type pixels based on a ray tracing algorithm to obtain first type pixel values.

As an example, rays are emitted from the viewpoint (i.e., the camera assumed in the principle of the ray tracing algorithm) towards each first type pixel; at each hit point, an operation based on the Bidirectional Reflectance Distribution Function (BRDF) of the material is performed, followed by diffuse reflection, specular reflection or refraction; these steps are repeated recursively until the ray escapes the scene or reaches the maximum number of bounces, and Monte Carlo integration is then performed over the rays generated in this process to obtain the first type pixel values.
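
For illustration only, the following Python sketch shows one way this loop could be organized: only the pixels marked in a first type mask are rendered, each by averaging several Monte Carlo samples. The functions trace_radiance and camera.generate_ray are hypothetical placeholders for a path tracer and a camera model and are not defined by this disclosure.

```python
import numpy as np

# Hypothetical scene interface: trace_radiance(origin, direction, scene, max_bounces)
# is assumed to return the radiance carried by one ray after recursive BRDF sampling.
def render_first_type_pixels(scene, camera, canvas_shape, first_type_mask,
                             samples_per_pixel=64, max_bounces=4):
    """Render only the masked (first type) pixels of the first canvas."""
    height, width = canvas_shape
    canvas = np.zeros((height, width, 3), dtype=np.float32)
    for y, x in zip(*np.nonzero(first_type_mask)):
        accum = np.zeros(3, dtype=np.float32)
        for _ in range(samples_per_pixel):
            # Jitter the sample position inside the pixel for Monte Carlo integration.
            u = (x + np.random.rand()) / width
            v = (y + np.random.rand()) / height
            origin, direction = camera.generate_ray(u, v)   # assumed camera API (hypothetical)
            accum += trace_radiance(origin, direction, scene, max_bounces)
        canvas[y, x] = accum / samples_per_pixel             # Monte Carlo estimate of the pixel value
    return canvas
```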

Step 103, performing image restoration on the first canvas recorded with the first type pixel values to obtain second type pixel values of the second type pixels, so as to generate the target two-dimensional image.

Here, image restoration may refer to restoring the missing part of an image based on the known (background) information of the image.

Optionally, various image inpainting algorithms may be used for the image restoration, for example, interpolation-based image inpainting, deep-learning-based image inpainting, and the like.
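
As a concrete illustration of one off-the-shelf restoration mode (not necessarily the algorithm used by this disclosure), the sketch below fills the second type pixels with OpenCV's classical inpainting; the mask marks everything that is not a first type pixel, and the canvas is assumed to hold floating-point values in [0, 1].

```python
import cv2
import numpy as np

def fill_second_type_pixels(canvas_bgr, first_type_mask, radius=3):
    """Fill the unrendered (second type) pixels of the first canvas by image inpainting.

    canvas_bgr: (H, W, 3) float canvas in [0, 1]; first_type_mask: (H, W) bool array.
    """
    # OpenCV expects an 8-bit mask whose nonzero entries mark the pixels to repair,
    # i.e. everything that is NOT a first type pixel.
    repair_mask = (~first_type_mask).astype(np.uint8) * 255
    image_8u = np.clip(canvas_bgr * 255.0, 0, 255).astype(np.uint8)
    # INPAINT_TELEA is a fast-marching method; cv2.INPAINT_NS is an alternative flag.
    return cv2.inpaint(image_8u, repair_mask, radius, cv2.INPAINT_TELEA)
```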

Referring to fig. 2, 3 and 4, fig. 2, 3 and 4 show exemplary application scenarios of the corresponding embodiment of fig. 1.

In fig. 2, the first type pixels and the second type pixels in the first canvas are shown. The black areas in fig. 2 indicate the second type pixels, i.e. the pixels that are not to be rendered using the ray tracing algorithm; the blank areas in fig. 2 indicate the first type pixels, i.e. the pixels to be rendered using the ray tracing algorithm. It should be noted that, in practice, the human eye cannot distinguish detail at the pixel level, so the actual first type and second type pixels are difficult to show. The canvas size, pixel size, and choice of pixels in fig. 2 are merely exemplary and drawn for ease of illustration.

Figure 3 shows the first canvas with holes (black areas), in which the first type pixels have already been rendered and recorded with the first type pixel values.

Then, the first canvas shown in fig. 3 is subjected to image restoration, and the second type pixels with missing pixel values in the first canvas can be completed. Thereby, the first canvas shown in fig. 4 may be obtained. The first canvas shown in figure 4 may be understood as the target two-dimensional image.

It should be noted that, in the image generation method provided in this embodiment, the first type pixel may be selected from the first canvas, and then the first type pixel is rendered by using the ray tracing algorithm; and then, based on the first type pixel value, image restoration is carried out on the two-dimensional image recorded by the first canvas to obtain a complete two-dimensional image. Thus, a new image generation method is provided.

Specifically, two-dimensional image rendering is decomposed into two stages: the first stage screens out a part of the pixels of the picture and renders them based on ray tracing; the second stage fills in the pixels left unrendered in the first stage based on image inpainting. Generally, the time spent on image restoration is far shorter than the time spent on image rendering and can be neglected. Therefore, the number of pixels rendered by the ray tracing algorithm is reduced, so the time and amount of computation consumed by two-dimensional image rendering drop greatly and the rendering speed improves. In other words, a part of the pixels is deliberately screened out in the first stage and not ray traced, and image restoration fills them in during the second stage, which accelerates the whole rendering process.

In contrast, in some related art, a ray-tracing-based rendering system renders an image of a specified size pixel by pixel. Although the rendering effect is realistic, the algorithm is essentially a process of computing Monte Carlo integrals, which is computationally very expensive. To speed it up, means such as hardware acceleration (for example, GPU rendering) and parallel computing (multiple threads rendering different pixels at the same time) may be adopted, but these do not depart from the idea of pixel-by-pixel rendering.

It should be noted that an image restoration method (or image restoration algorithm) is designed for image restoration scenes and can usually restore a digital image in which part of the content is missing; because the missing content of such an image is irregular and unknown, the restoration is usually difficult. In the present application, since the selection of the first type pixels can be controlled, repairing the first canvas is generally easier than repairing a digital image with arbitrarily missing content. In other words, applying an image restoration method (or image restoration algorithm) designed for general image restoration scenes to the first canvas can be understood as using an over-qualified tool for an easier problem. Therefore, bringing an image restoration method into rendering allows the target two-dimensional image to be generated faster and with a more accurate result.

In some embodiments, step 101 may include: based on at least one piece of relevant information of the pixels in the first canvas, at least one pixel is selected as a first type pixel.

It should be noted that, based on at least one kind of related information, a part of pixels are selected as first-type pixels, and the first-type pixels suitable for an actual application scene may be selected by referring to one or more kinds of related information.

In some embodiments, the at least one related information may include, but is not limited to: pixel location and/or estimated pixel energy value.

Here, whether to select a pixel may be determined based on the position where the pixel is located.

Here, estimating the pixel energy values may include roughly estimating, from the three-dimensional model, the pixel values corresponding to the pixels of the target two-dimensional image, transforming the estimated pixel values to obtain a spectrogram, and estimating from the spectrogram the energy values corresponding to the pixels of the target two-dimensional image.
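
The disclosure does not pin down the exact transform; as one hedged interpretation, the sketch below derives a per-pixel energy map from a rough grayscale estimate by taking its Fourier spectrum, suppressing the lowest frequencies, and mapping the magnitude of the remainder back to image space, so that detailed regions receive higher energy values.

```python
import numpy as np

def estimate_pixel_energy(rough_estimate_gray):
    """Estimate a per-pixel energy map from a rough grayscale estimate of the image.

    A minimal sketch under the assumption that 'energy' means local high-frequency
    content; the spectrogram here is simply the 2-D Fourier spectrum.
    """
    spectrum = np.fft.fftshift(np.fft.fft2(rough_estimate_gray))
    h, w = spectrum.shape
    cy, cx = h // 2, w // 2
    # Suppress a small low-frequency block around the centre of the spectrum.
    spectrum[cy - 2:cy + 3, cx - 2:cx + 3] = 0
    # The magnitude of the high-pass result serves as the per-pixel energy value.
    energy = np.abs(np.fft.ifft2(np.fft.ifftshift(spectrum)))
    return energy / (energy.max() + 1e-12)  # normalized energy distribution map
```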

It should be noted that selecting at least one pixel as a first type pixel based on the pixel position generally makes the distribution of the first type pixels regular, which facilitates setting the image restoration mode in a targeted manner, enables fast image restoration, and ensures good restoration accuracy.

It should be noted that, when at least one pixel is selected as a first type pixel based on the estimated pixel energy values, regions carrying more image information can be distinguished from regions carrying less by referring to the pixel energy values; ray tracing rendering is used for the former, and image restoration can be used for the latter, which generally achieves a faster rendering speed.

In some embodiments, the selecting at least one pixel as a first type pixel based on at least one piece of relevant information about the pixels in the first canvas comprises: selecting target rows from the first canvas; selecting target columns from the first canvas; and determining the pixels located on both a target row and a target column as the first type pixels.

Here, the difference of the row sequence numbers of at least one pair of adjacent target rows in the first canvas is not less than 2. That is, at least one row in the first canvas is discarded, and the remaining rows are taken as target rows.

As an example, the row numbered 1 in the first canvas is selected as a target row, the row numbered 2 is not selected, and the row numbered 3 is selected as a target row. Then, after selection, the rows numbered 1 and 3 become a pair of adjacent target rows.

Here, the difference of the column sequence numbers of at least one pair of adjacent target columns in the first canvas is not less than 2. That is, at least one column in the first canvas is discarded, and the remaining columns are taken as target columns.

As an example, the column numbered 1 in the first canvas is selected as a target column, the column numbered 2 is not selected, and the column numbered 3 is selected as a target column. Then, after selection, the columns numbered 1 and 3 become a pair of adjacent target columns.

It should be noted that selecting target rows and target columns and taking the pixels at their intersections as the first type pixels keeps the distribution of the first type pixels as regular as possible, which facilitates a targeted implementation of image restoration, thereby further ensuring the image restoration quality and improving the image generation speed.

In some embodiments, the difference of the row sequence numbers of adjacent target rows in the first canvas is a preset first difference, and the difference of the column sequence numbers of adjacent target columns in the first canvas is a preset second difference.

As an example, the preset first difference may be 2, and the preset second difference may also be 2. In this case, it can be understood that a target row is selected every other row and a target column is selected every other column.

It should be noted that, when the distance between each pair of adjacent target rows is the preset first difference and the distance between each pair of adjacent target columns is the preset second difference, the first type pixels are distributed uniformly, which reduces the image restoration difficulty, ensures the image restoration accuracy (avoiding differences visible to the naked eye between the generated image and an image rendered with the original pixel-by-pixel method), and increases the image restoration speed as much as possible.
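
A minimal sketch of this selection rule, assuming both preset differences are 2; the function and variable names are illustrative, not part of the disclosure:

```python
import numpy as np

def make_first_type_mask(height, width, row_step=2, col_step=2):
    """Build a boolean mask of first type pixels on the first canvas.

    Pixels lying on both a target row and a target column are first type pixels;
    row_step and col_step play the role of the preset first and second differences.
    """
    mask = np.zeros((height, width), dtype=bool)
    target_rows = np.arange(0, height, row_step)
    target_cols = np.arange(0, width, col_step)
    mask[np.ix_(target_rows, target_cols)] = True
    return mask

# Example: on a 400 x 400 canvas with both differences set to 2,
# only a quarter of the pixels are rendered by ray tracing.
mask = make_first_type_mask(400, 400)
print(mask.sum(), "of", mask.size, "pixels are first type")  # 40000 of 160000
```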

In some embodiments, said selecting at least one pixel as a first type pixel based on at least one piece of relevant information about pixels in the first canvas comprises: acquiring a related two-dimensional image of a target two-dimensional image; and selecting at least one pixel as a first type pixel based on the energy distribution graph of the related two-dimensional image.

Here, the relevant two-dimensional image may be a relevant image of the target two-dimensional image, and the image content of the relevant two-dimensional image may have a preset relevant relationship with the image content of the target two-dimensional image.

As an example, the target two-dimensional image is 400 × 400, and a reduced image (for example, 100 × 100) of the target two-dimensional image may be the correlated two-dimensional image.

As an example, an image at an intermediate stage in the rendering process (e.g., an intermediate result that does not include global lighting information) may be taken as the relevant two-dimensional image described above.

Here, a spectrogram of the related two-dimensional image, which may also be referred to as a power map or an energy distribution map, may be computed (e.g., using a Fourier transform).

Based on the energy distribution map, at least one pixel may be selected as the first type pixel in various ways. For example, the pixels whose energy values fall in the top fifty percent are taken as the first type pixels.

It should be noted that, when the target two-dimensional image has not yet been generated, selecting at least one pixel as a first type pixel by virtue of the energy distribution of the related two-dimensional image neatly sidesteps the problem of how to obtain the energy distribution of the target two-dimensional image, and makes it possible to use the energy distribution to decide which regions to render. In some scenes, the more low-frequency regions an image contains, the larger the acceleration headroom, and selecting the pixels to render based on an energy map yields a more pronounced acceleration.

In some embodiments, the first canvas comprises a first number of pixels.

In some embodiments, said obtaining a correlated two-dimensional image of the target two-dimensional image comprises: determining a second canvas comprising a second number of pixels, wherein the second number is less than the first number; rendering the three-dimensional model to each pixel of the second canvas based on a ray tracing algorithm to obtain the related two-dimensional image.

Here, the second canvas is used to record the above-mentioned related two-dimensional image generated based on the three-dimensional model.

It will be appreciated that the size of the second canvas is smaller than the size of the first canvas. In other words, the associated two-dimensional image obtained based on the second canvas may be regarded as a reduced image of the target two-dimensional image.

The process of rendering the three-dimensional model to each pixel in the second canvas may refer to the rendering process described in step 102, and is not described herein again.

It should be noted that, because the size of the second canvas is smaller than the size of the first canvas, rendering the second canvas takes much less time than rendering the complete first canvas. With the approach of first rendering the second canvas and then determining the first type pixels in the first canvas based on the energy distribution of the second canvas, the combined time of the two rendering passes is generally still smaller than the time of rendering the complete first canvas. Therefore, the generation time of the target two-dimensional image can be reduced. Moreover, since the pixels of the second canvas and the pixels of the first canvas have a clear correspondence, the energy distribution of the target two-dimensional image can be accurately estimated from the energy distribution of the second canvas, so that accurate first type pixels can be determined.

In some embodiments, said selecting at least one pixel as a first type pixel based on the energy profile of the associated two-dimensional image comprises: determining pixels with energy values larger than an energy threshold value in the energy distribution map as pixels of a third type; and determining the first type pixel in the first canvas according to the first corresponding relation and the third type pixel.

Alternatively, the energy threshold may be preset, or may be derived from the energy values in the energy distribution map. As an example, the median energy value in the energy distribution map may be used as the energy threshold.

Here, the first correspondence is used to indicate a correspondence between pixels in the energy profile and pixels in the first canvas.

Here, a proportional relationship of the first canvas to the second canvas may be determined according to a relationship between the first canvas pixel number and the second canvas pixel number. From the proportional relationship, a correspondence between pixels in the second canvas and pixels in the first canvas may be determined.

By way of example, if the first canvas size is 400 x 400, the second canvas size is 100 x 100. The top left 4 x 4 pixel block in the first canvas may then correspond to the top left 1 x 1 pixel in the second canvas.
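
Continuing the 400 × 400 / 100 × 100 example, the sketch below (with illustrative names) thresholds the energy distribution map at its median to obtain the third type pixels and expands them into first type pixels on the first canvas through the block-wise first correspondence:

```python
import numpy as np

def first_type_mask_from_energy(energy_map, canvas_shape, threshold=None):
    """Map high-energy pixels of the related image to first type pixels of the first canvas.

    energy_map: energy distribution map of the related (smaller) two-dimensional image.
    canvas_shape: (height, width) of the first canvas, assumed to be an integer
    multiple of the energy map shape (e.g. 400 x 400 versus 100 x 100).
    """
    if threshold is None:
        threshold = np.median(energy_map)       # one possible choice of energy threshold
    third_type = energy_map > threshold         # third type pixels in the energy map
    scale_y = canvas_shape[0] // energy_map.shape[0]
    scale_x = canvas_shape[1] // energy_map.shape[1]
    # First correspondence: each energy-map pixel maps to a scale_y x scale_x block.
    return np.repeat(np.repeat(third_type, scale_y, axis=0), scale_x, axis=1)
```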

It should be noted that regions with high energy are selected for rendering, while regions with low energy are not rendered. Intuitively, regions with high frequency content and rich detail, which would be hard to complete later by image restoration, are selected for rendering, which ensures the accuracy of the generated image; flat, low-frequency regions whose image content changes little are not rendered, which reduces the number of rendered pixels and shortens the rendering time.

In some embodiments, the step 103 may include: and repairing the first canvas recorded with the first type of pixel values based on a preset image repairing mode.

Here, there may be one preset image repairing manner, or at least two of them.

As an example, an image restoration method based on interpolation may be adopted to perform image restoration.

As an example, an image restoration method based on deep learning may be adopted to perform image restoration.

It should be noted that, the image restoration is performed based on a preset image restoration method, and the image restoration effect and the restoration speed can be ensured according to the actual application requirements.

In some embodiments, the repairing the first canvas recorded with the first type pixel values based on a preset image repairing manner includes: selecting an image restoration mode corresponding to the pixel selection mode from at least two preset image restoration modes, according to the pixel selection mode adopted for determining the first type pixels and a preset second corresponding relation; and repairing the first canvas recorded with the first type pixel values by adopting the selected image restoration mode.

Here, the preset second corresponding relationship is used to represent a corresponding relationship between a pixel selection method and an image restoration method.

The pixel selection mode may indicate a mode of selecting a pixel. As can be seen from the above, the pixel selection manner may include at least one of the following: pixel location based approach, and estimated pixel energy value based approach.

The pixel selection may affect the speed of the first stage (i.e., the rendering stage indicated by step 102) and may also affect the repairing effect of the second stage (i.e., the image repairing stage indicated by step 103).

Intuitively, if fewer first type pixels are rendered in the first stage, less information is available for the second-stage image repair, and the repair difficulty of the second stage increases; if more first type pixels are rendered in the first stage, more information is available for the second-stage image repair, and the repair difficulty of the second stage is lower. A higher repair difficulty usually means a larger amount of computation for the repair, and the repair quality may also degrade.

It should be noted that determining the image restoration method according to the adopted pixel selection method helps find a balance between the rendering speed of the first stage and the image restoration difficulty of the second stage. Taking both into account, the first stage becomes faster and the second-stage restoration becomes easier, which improves the overall efficiency of generating the target two-dimensional image.
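
A trivial sketch of the preset second correspondence as a lookup table. Only the position-based entry is stated by this disclosure; the remaining entry and all names are illustrative assumptions:

```python
# Preset second correspondence: pixel selection mode -> image restoration mode.
SECOND_CORRESPONDENCE = {
    "pixel_position": "interpolation",          # stated below: regular layouts suit interpolation
    "estimated_pixel_energy": "deep_learning",  # assumption: irregular holes may need a learned model
}

def choose_restoration_mode(selection_mode: str) -> str:
    """Select the image restoration mode matching the adopted pixel selection mode."""
    return SECOND_CORRESPONDENCE.get(selection_mode, "interpolation")
```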

In some embodiments, the selecting, according to the pixel selection method used for determining the first type of pixel and the preset second corresponding relationship, an image restoration method corresponding to the pixel selection method from among at least two preset image restoration methods may include: and selecting the image restoration mode based on the interpolation as the corresponding image restoration mode in response to the fact that the adopted determination mode comprises a pixel selection mode based on the pixel position.

Here, first type pixels selected based on pixel position are generally unrelated to the image content, and the positional relationship between the selected first type pixels tends to have some regularity. In this case, determining the second type pixel values with an interpolation-based image restoration method fits the distribution pattern of the first type pixels well, and the second type pixel values can be restored more quickly.

Among the various image restoration methods, interpolation-based image restoration consumes relatively little computation and time. When the interpolation-based restoration method matches the position-based pixel selection method, performing the restoration by interpolation ensures the image restoration quality while greatly reducing the consumption of time and computing resources.
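
As a sketch of why the regular layout suits interpolation (assuming the stride-2 row and column selection above and a floating-point canvas; names are illustrative), the code below reconstructs the second type pixel values by bilinear interpolation over the rendered grid, channel by channel:

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def repair_by_interpolation(canvas, row_step=2, col_step=2):
    """Interpolation-based repair when the first type pixels lie on a regular grid.

    canvas: (H, W, 3) float array in which only every row_step-th row and every
    col_step-th column holds rendered first type pixel values.
    """
    height, width, channels = canvas.shape
    rows = np.arange(0, height, row_step)
    cols = np.arange(0, width, col_step)
    known = canvas[np.ix_(rows, cols)]                     # values at first type pixels
    yy, xx = np.meshgrid(np.arange(height), np.arange(width), indexing="ij")
    points = np.stack([yy.ravel(), xx.ravel()], axis=-1)
    repaired = np.empty_like(canvas)
    for c in range(channels):                              # interpolate each color channel
        interp = RegularGridInterpolator((rows, cols), known[..., c],
                                         method="linear", bounds_error=False,
                                         fill_value=None)  # extrapolate at the borders
        repaired[..., c] = interp(points).reshape(height, width)
    return repaired
```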

With further reference to fig. 5, as an implementation of the methods shown in the above figures, the present disclosure provides an embodiment of an image generating apparatus, which corresponds to the embodiment of the method shown in fig. 1, and which is particularly applicable in various electronic devices.

As shown in fig. 5, the image generating apparatus of the present embodiment includes: a selecting unit 501, a rendering unit 502 and a repairing unit 503. The selecting unit is used for selecting at least one pixel from the pixels of a first canvas as a first type pixel, wherein the first canvas is used for recording a target two-dimensional image generated based on a three-dimensional model; the rendering unit is used for rendering the first type pixels based on a ray tracing algorithm to obtain first type pixel values; and the repairing unit is used for performing image repair on the first canvas recorded with the first type pixel values to obtain second type pixel values of second type pixels so as to generate the target two-dimensional image, wherein the pixels in the first canvas other than the first type pixels are the second type pixels.

In this embodiment, specific processing of the selecting unit 501, the rendering unit 502, and the repairing unit 503 of the image generating apparatus and technical effects thereof can refer to related descriptions of step 101, step 102, and step 103 in the corresponding embodiment of fig. 1, which are not described herein again.

In some embodiments, the selecting at least one pixel from the pixels of the first canvas as the first type pixel includes: based on at least one piece of relevant information of the pixels in the first canvas, at least one pixel is selected as a first type pixel.

In some embodiments, the at least one related information comprises: pixel location and/or estimated pixel energy value.

In some embodiments, said selecting at least one pixel as a first type pixel based on at least one piece of relevant information about pixels in the first canvas comprises: selecting target rows from the first canvas, wherein the difference value of the row sequence numbers of at least one pair of adjacent target rows in the first canvas is not less than 2; selecting target columns from the first canvas, wherein the difference value of the column sequence numbers of at least one pair of adjacent target columns in the first canvas is not less than 2; and determining pixels positioned on the target row and the target column as the first type pixels.

In some embodiments, the difference of the row sequence numbers of adjacent target rows in the first canvas is a preset first difference, and the difference of the column sequence numbers of adjacent target columns in the first canvas is a preset second difference.

In some embodiments, said selecting at least one pixel as a first type pixel based on at least one piece of relevant information about pixels in the first canvas comprises: acquiring a related two-dimensional image of a target two-dimensional image; and selecting at least one pixel as a first type pixel based on the energy distribution graph of the related two-dimensional image.

In some embodiments, the first canvas comprises a first number of pixels; and the acquiring of the relevant two-dimensional image of the target two-dimensional image comprises: determining a second canvas comprising a second number of pixels, wherein the second number is less than the first number; rendering the three-dimensional model to each pixel of the second canvas based on a ray tracing algorithm to obtain the related two-dimensional image.

In some embodiments, said selecting at least one pixel as a first type pixel based on the energy profile of the associated two-dimensional image comprises: determining pixels with energy values larger than an energy threshold value in the energy distribution map as pixels of a third type; and determining the first type pixel in the first canvas according to the first corresponding relation and the third type pixel, wherein the first corresponding relation is used for indicating the corresponding relation between the pixels in the energy distribution map and the pixels in the first canvas.

In some embodiments, the image repairing the first canvas recorded with the first type of pixel values to obtain the second type of pixel values of the second type of pixels includes: and repairing the first canvas recorded with the first type of pixel values based on a preset image repairing mode.

In some embodiments, the repairing the first canvas recorded with the first type pixel values based on a preset image repairing manner includes: selecting an image restoration mode corresponding to the pixel selection mode from at least two preset image restoration modes, according to the pixel selection mode adopted for determining the first type pixels and a preset second corresponding relation; and repairing the first canvas recorded with the first type pixel values by adopting the selected image restoration mode; wherein the preset second corresponding relation is used for representing the correspondence between pixel selection modes and image restoration modes.

In some embodiments, the selecting, according to the pixel selection method used for determining the first type of pixel and the preset second corresponding relationship, an image restoration method corresponding to the pixel selection method used from at least two preset image restoration methods includes: and selecting the image restoration mode based on the interpolation as the corresponding image restoration mode in response to the fact that the adopted determination mode comprises a pixel selection mode based on the pixel position.

Referring to fig. 6, fig. 6 illustrates an exemplary system architecture to which the image generation method of one embodiment of the present disclosure may be applied.

As shown in fig. 6, the system architecture may include terminal devices 601, 602, 603, a network 604, and a server 605. The network 604 serves to provide a medium for communication links between the terminal devices 601, 602, 603 and the server 605. Network 604 may include various types of connections, such as wire, wireless communication links, or fiber optic cables, to name a few.

The terminal devices 601, 602, 603 may interact with the server 605 via the network 604 to receive or send messages or the like. The terminal devices 601, 602, 603 may have various client applications installed thereon, such as a web browser application, a search-type application, and a news-information-type application. The client application in the terminal device 601, 602, 603 may receive the instruction of the user, and complete the corresponding function according to the instruction of the user, for example, add the corresponding information in the information according to the instruction of the user.

The terminal devices 601, 602, 603 may be hardware or software. When the terminal devices 601, 602, 603 are hardware, they may be various electronic devices having a display screen and supporting web browsing, including but not limited to smart phones, tablet computers, e-book readers, MP3 players (Moving Picture Experts Group Audio Layer III, mpeg compression standard Audio Layer 3), MP4 players (Moving Picture Experts Group Audio Layer IV, mpeg compression standard Audio Layer 4), laptop portable computers, desktop computers, and the like. When the terminal device 601, 602, 603 is software, it can be installed in the electronic devices listed above. It may be implemented as multiple pieces of software or software modules (e.g., software or software modules used to provide distributed services) or as a single piece of software or software module. And is not particularly limited herein.

The server 605 may be a server providing various services, for example, receiving an information acquisition request sent by the terminal devices 601, 602, and 603, and acquiring the presentation information corresponding to the information acquisition request in various ways according to the information acquisition request. And the relevant data of the presentation information is sent to the terminal devices 601, 602, 603.

It should be noted that the image generation method provided by the embodiment of the present disclosure may be executed by a terminal device, and accordingly, the image generation apparatus may be disposed in the terminal device 601, 602, 603. In addition, the image generation method provided by the embodiment of the present disclosure may also be executed by the server 605, and accordingly, the image generation apparatus may be provided in the server 605.

It should be understood that the number of terminal devices, networks, and servers in fig. 6 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.

Referring now to fig. 7, shown is a schematic diagram of an electronic device (e.g., a terminal device or a server of fig. 6) suitable for use in implementing embodiments of the present disclosure. The terminal device in the embodiments of the present disclosure may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle terminal (e.g., a car navigation terminal), and the like, and a stationary terminal such as a digital TV, a desktop computer, and the like. The electronic device shown in fig. 7 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.

As shown in fig. 7, the electronic device may include a processing device (e.g., central processing unit, graphics processor, etc.) 701, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM)702 or a program loaded from a storage device 708 into a Random Access Memory (RAM) 703. In the RAM 703, various programs and data necessary for the operation of the electronic apparatus 700 are also stored. The processing device 701, the ROM 702, and the RAM 703 are connected to each other by a bus 704. An input/output (I/O) interface 705 is also connected to bus 704.

Generally, the following devices may be connected to the I/O interface 705: input devices 706 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 707 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 708 including, for example, magnetic tape, hard disk, etc.; and a communication device 709. The communication device 709 may allow the electronic device to communicate wirelessly or by wire with other devices to exchange data. While fig. 7 illustrates an electronic device having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.

In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program carried on a non-transitory computer readable medium, the computer program containing program code for performing the method illustrated by the flow chart. In such embodiments, the computer program may be downloaded and installed from a network via the communication means 709, or may be installed from the storage means 708, or may be installed from the ROM 702. The computer program, when executed by the processing device 701, performs the above-described functions defined in the methods of the embodiments of the present disclosure.

It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.

In some embodiments, the clients, servers may communicate using any currently known or future developed network Protocol, such as HTTP (HyperText Transfer Protocol), and may interconnect with any form or medium of digital data communication (e.g., a communications network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), the Internet (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.

The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.

The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: selecting at least one pixel from pixels of a first canvas as a first type pixel, wherein the first canvas is used for recording a target two-dimensional image generated based on a three-dimensional model; rendering the first type of pixels based on a ray tracing algorithm to obtain a first type of pixel value; and performing image restoration on the first canvas recorded with the first type pixel value to obtain a second type pixel value of a second type pixel so as to generate the target two-dimensional image, wherein the pixels except the first type pixel in the first canvas are the second type pixel.

Computer program code for carrying out operations for the present disclosure may be written in any combination of one or more programming languages, including but not limited to object oriented programming languages such as Java, Smalltalk and C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).

The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.

The units described in the embodiments of the present disclosure may be implemented by software or hardware. The name of a unit does not, in some cases, constitute a limitation of the unit itself; for example, the selecting unit may also be described as "a unit that selects at least one pixel as a first type pixel".

The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.

In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.

The foregoing description is merely a description of the preferred embodiments of the present disclosure and of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the disclosure is not limited to technical solutions formed by the particular combination of the above features, and also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the concept of the disclosure, for example, technical solutions formed by replacing the above features with (but not limited to) features with similar functions disclosed in the present disclosure.

Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.

Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
