Rendering wide gamut, two-dimensional (2D) images on three-dimensional (3D) enabled displays

Document No.: 991717 | Publication date: 2020-10-20

Reading note: This technology, "Rendering wide gamut, two-dimensional (2D) images on three-dimensional (3D) enabled displays," was designed and created by A. N. Pena, T. Davis, and M. J. Richards on 2019-12-10. Its main content comprises: A system and method for displaying image data, comprising: receiving 2D video data; generating from the video data a first plurality of intensity values of virtual primaries of a first virtual gamut and a second plurality of intensity values of a second virtual gamut, the first plurality of intensity values being below a luminance threshold and approximating a predetermined gamut, and the second plurality of intensity values being above the luminance threshold; converting the first plurality of intensity values to a third plurality of intensity values for predetermined color primaries of a first projection head of the display system and converting the second plurality of intensity values to a fourth plurality of intensity values for predetermined color primaries of a second projection head of the display system; and dynamically adjusting pixel levels of a spatial modulator of the display system based on the third plurality of intensity values and the fourth plurality of intensity values.

1. A dual head display system, comprising:

a first projection head;

a second projection head;

at least one spatial modulator; and

an electronic processor configured to:

receive 2D video data;

generate, from the video data, a first plurality of intensity values of virtual primaries of a first virtual gamut and a second plurality of intensity values of virtual primaries of a second virtual gamut, the first plurality of intensity values being below a luminance threshold and approximating a predetermined gamut, and the second plurality of intensity values being above the luminance threshold;

convert the first plurality of intensity values and the second plurality of intensity values to a third plurality of intensity values of the predetermined color primaries of the first projection head and a fourth plurality of intensity values of the predetermined color primaries of the second projection head, respectively; and

dynamically adjust pixel levels of the at least one spatial modulator of the first projection head and the second projection head based on the third plurality of intensity values and the fourth plurality of intensity values;

wherein the luminance threshold is a threshold vector comprising a threshold for each color channel of the first virtual color gamut, and wherein each threshold is determined based on a transformation of the first virtual color gamut into the predetermined color gamut.

2. The system of claim 1, wherein any negative values in the first plurality of intensity values and the second plurality of intensity values are clipped.

3. The system of any of claims 1 or 2, wherein any negative values in the third and fourth pluralities of intensity values are clipped.

4. The system of any of claims 1 to 3, wherein the dual head display system is configured to display a 2D image based on the third and fourth pluralities of intensity values.

5. A method for displaying image data, the method comprising:

receiving 2D video data;

generating, from the video data, a first plurality of intensity values of virtual primaries of a first virtual gamut and a second plurality of intensity values of virtual primaries of a second virtual gamut, the first plurality of intensity values of the first virtual gamut being below a luminance threshold and approximating a predetermined gamut, and the second plurality of intensity values being above the luminance threshold;

converting the first plurality of intensity values and the second plurality of intensity values to a third plurality of intensity values for a predetermined color primary of a first projection head of a display system and converting the first plurality of intensity values and the second plurality of intensity values to a fourth plurality of intensity values for a predetermined color primary of a second projection head of the display system; and

dynamically adjusting pixel levels of at least one spatial modulator of the first projection head and the second projection head of the display system based on the third plurality of intensity values and the fourth plurality of intensity values; wherein the luminance threshold is a threshold vector comprising a threshold for each color channel, and wherein each threshold is determined based on a transformation of the first virtual color gamut into the predetermined color gamut.

6. The method of claim 5, wherein any negative values in the first and second plurality of intensity values are clipped.

7. The method of any of claims 5 or 6, wherein any negative values in the third and fourth pluralities of intensity values are clipped.

8. The method of any of claims 5 to 7, wherein the converting of the first and second pluralities of intensity values to the third plurality of intensity values and to the fourth plurality of intensity values is performed via a mixing function.

9. The method of any of claims 5 to 8, further comprising: modifying one or both of the first plurality of intensity values and the second plurality of intensity values when at least one intensity value is outside of an achievable gamut volume of the first virtual gamut and the second virtual gamut.

10. The method of any of claims 5 to 9, further comprising: modifying one or both of the third plurality of intensity values and the fourth plurality of intensity values when at least one intensity value is outside of an achievable gamut volume of the first virtual gamut and the second virtual gamut.

11. The method of any of claims 5 to 10, further comprising: defining the first virtual color gamut to approximate a predetermined color gamut associated with a predetermined color space based on a combination of a plurality of display primary colors associated with a light source; and defining the second virtual color gamut based on a residual power of the light source and the first virtual color gamut.

12. A non-transitory computer-readable medium storing instructions that, when executed by a processor of a computer, cause the computer to perform operations of the method of any of claims 5 to 11.

Technical Field

The present application relates generally to rendering wide gamut images.

Background

A display capable of displaying three-dimensional (3D) images may use two different sets of primary colors (6P) to display left and right eye images that, when viewed together, create the appearance of a 3D image. Such displays may also be used to display two-dimensional (2D) images.

Disclosure of Invention

Various aspects of the present disclosure relate to systems and methods for improved rendering of 2D images on a 6P 3D projector and display system, particularly a spectrally separated 3D projector and display system.

In one exemplary aspect of the disclosure, a dual head projection system is provided that includes a first projection head, a second projection head, at least one spatial modulator, and an electronic processor. The electronic processor is configured to: receive 2D video data; generate, from the video data, a first plurality of intensity values of virtual primaries of a first virtual gamut and a second plurality of intensity values of virtual primaries of a second virtual gamut, the first plurality of intensity values being below a luminance threshold and approximating a predetermined gamut, and the second plurality of intensity values being above the luminance threshold; convert the first plurality of intensity values to a third plurality of intensity values for the predetermined color primaries of the first projection head and convert the second plurality of intensity values to a fourth plurality of intensity values for the predetermined color primaries of the second projection head; and dynamically adjust pixel levels of spatial modulators of the first projection head and the second projection head based on the third plurality of intensity values and the fourth plurality of intensity values.

In another exemplary aspect of the present disclosure, a method for displaying image data is provided. The method comprises the following steps: receiving 2D video data; generating, from the video data, a first plurality of intensity values of virtual primaries of the first virtual gamut and a second plurality of intensity values of the second virtual gamut, the first plurality of intensity values of the first virtual gamut being below a luminance threshold and approximating a predetermined gamut and the second plurality of intensity values being above the luminance threshold; converting the first plurality of intensity values to a third plurality of intensity values for a predetermined color primary of a first projection head of the display system and converting the second plurality of intensity values to a fourth plurality of intensity values for a predetermined color primary of a second projection head of the display system; and dynamically adjusting pixel levels of spatial modulators of the first projection head and the second projection head of the display system based on the third plurality of intensity values and the fourth plurality of intensity values.

In another exemplary aspect of the present disclosure, a method for displaying image data is provided. The method comprises the following steps: receiving video data; generating a first plurality of intensity values associated with a first virtual color gamut based on intensity levels of the video data; generating a second plurality of intensity values associated with a second virtual color gamut based on a comparison between the intensity level and at least one predetermined threshold; generating a third plurality of intensity values and a fourth plurality of intensity values associated with the plurality of primary display colors based on the first plurality of intensity values and the second plurality of intensity values; and providing the third plurality of intensity values and the fourth plurality of intensity values to the at least one spatial light modulator.

In another exemplary aspect of the disclosure, a non-transitory computer-readable medium storing instructions that, when executed by a processor of a computer, cause the computer to perform operations comprising: receiving 2D video data comprising tristimulus pixel values of predetermined primary colors of a predetermined color space; generating from the video data a first plurality of intensity values of virtual primaries of the first virtual gamut and a second plurality of intensity values of the second virtual gamut to approximate the predetermined gamut, the first plurality of intensity values being below a luminance threshold of the first virtual gamut and the second plurality of intensity values being above the luminance threshold; converting the first plurality of intensity values to a third plurality of intensity values for the predetermined color primaries and the second plurality of intensity values to a fourth plurality of intensity values for the predetermined color primaries via a mixing function; and dynamically adjusting a pixel level of at least one spatial modulator of the dual head projection system based on the third plurality of intensity values and the fourth plurality of intensity values.

In this way, aspects of the present disclosure provide for the rendering of 2D images from a 3D projection device, and improvements in at least the technical fields of image projection, signal processing, and the like.

Drawings

The accompanying figures, in which like reference numerals refer to identical or functionally similar elements throughout the separate views and which together with the detailed description below are incorporated in and form part of the specification, serve to further illustrate the conceptual embodiments and to explain various principles and advantages of those embodiments.

Fig. 1A is a spectral diagram of a 6P projection system according to aspects of the present disclosure.

Fig. 1B is a block diagram of a projection system according to aspects of the present disclosure.

Fig. 1C is a block diagram of a projection system according to aspects of the present disclosure.

Fig. 2 is a block diagram of a controller included in the systems of fig. 1B and 1C, in accordance with various aspects of the present disclosure.

Fig. 3 is a flow diagram illustrating a method implemented by the controller of fig. 2 in accordance with various aspects of the present disclosure.

Fig. 4 is a chromaticity diagram in accordance with various aspects of the present disclosure.

Fig. 5A is a chromaticity diagram in accordance with various aspects of the present disclosure.

Fig. 5B is a chromaticity diagram in accordance with various aspects of the present disclosure.

Fig. 6A is a chromaticity diagram in accordance with various aspects of the present disclosure.

Fig. 6B is a chromaticity diagram in accordance with various aspects of the present disclosure.

Fig. 7A is a chromaticity diagram in accordance with various aspects of the present disclosure.

Fig. 7B is a chromaticity diagram in accordance with various aspects of the present disclosure.

Fig. 8A is a chromaticity diagram in accordance with various aspects of the present disclosure.

Fig. 8B is a chromaticity diagram in accordance with various aspects of the present disclosure.

Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help improve understanding of embodiments of the present disclosure.

Apparatus and method components have been represented where appropriate by reference numerals in the drawings, showing only those specific details that are pertinent to understanding the embodiments of the present disclosure so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.

Detailed Description

As previously mentioned, 2D images may be projected/displayed using some 6P displays configured to project 3D images. When projecting a 3D image, one set of three primary colors (red, green, and blue) is used for the left-eye image and another set of three primary colors is used for the right-eye image, for a total of six primary colors used to display the left- and right-eye images. 3D glasses used with such displays may have corresponding filters (e.g., bandpass filters) to allow each eye to see the appropriate image. By driving each pair of primary light sources with the same data, a two-dimensional image can be displayed by the 3D display without the viewer needing to wear 3D glasses. For example, the 2D red data values are used to drive the red1 and red2 primaries. Likewise, the 2D green data values are used to drive the green1 and green2 primaries, and the 2D blue data values are used to drive the blue1 and blue2 primaries. The system is calibrated using the combined primary colors, and an image can be generated. However, the resulting gamut may be very limited with respect to a desired gamut (e.g., the established Rec2020 gamut).
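The paired-primary driving scheme described above can be sketched as follows. This is a minimal illustrative sketch; the function name and primary labels (red1, red2, and so on) are hypothetical conveniences and not identifiers from this disclosure:

```python
def drive_6p_from_2d(r, g, b):
    """Map one 2D RGB pixel onto the six primaries of a 6P display.

    Each 2D channel value drives both members of its primary pair,
    so the left-eye and right-eye images are identical and no 3D
    glasses are needed to view the result.
    """
    return {
        "red1": r, "red2": r,
        "green1": g, "green2": g,
        "blue1": b, "blue2": b,
    }

# A mid-gray 2D pixel drives all six primaries at the same level.
levels = drive_6p_from_2d(0.5, 0.5, 0.5)
```

Because the two members of each pair are locked together, the combined primaries define a single fixed gamut, which is the limitation the virtual-gamut approach below addresses.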

The present disclosure and its aspects may be embodied in various forms, including hardware or circuitry controlled by computer-implemented methods, computer program products, computer systems and networks, user interfaces, and application programming interfaces; as well as hardware implemented methods, signal processing circuits, memory arrays, application specific integrated circuits, field programmable gate arrays, and the like. The preceding summary is intended only to give a general idea of the various aspects of the disclosure, and not to limit the scope of the disclosure in any way.

In the following description, numerous details are set forth, such as circuit configurations, waveform timing, circuit operations, etc., in order to provide an understanding of one or more aspects of the present disclosure. It will be apparent to those skilled in the art that these specific details are merely exemplary and are not intended to limit the scope of the present application.

Further, while the present disclosure focuses primarily on the received video data being an example of Rec2020, it will be understood that this is only one example of an implementation and that other color spaces may be utilized. It will also be appreciated that the disclosed systems and methods may be used in any projection system to improve the rendering of 2D images on a six primary color display.

For ease of description, some or all of the example systems presented herein are illustrated with a single example of each component described. Some examples may not describe or show all of the components of the system. Other example embodiments may include more or fewer of each of the illustrated components, may combine some components, or may include additional or alternative components. For example, in some embodiments, the system 100 of fig. 1B and 1C below may include more than one light source 102.

As described above, some 3D displays called 6P systems simultaneously display a left-eye image and a right-eye image using two different sets of primary colors. Fig. 1A is a spectral diagram 1 of a 6P system according to some embodiments. The diagram includes three short wavelengths 2A, 3A, and 4A (referred to herein as short primaries) and three long wavelengths 2B, 3B, and 4B (referred to herein as long primaries). The example display systems described herein are configured to use the short primaries 2A, 3A, and 4A for the left-eye image (e.g., via a designated left projector), and the long primaries 2B, 3B, and 4B for the right-eye image (e.g., via a designated right projector). It should be appreciated that in further embodiments, a combination of short and long primaries may be used for each eye image. As explained in more detail below, each projector outputs a modulated light output (of the projector's given primary colors) onto a display or viewing screen. In the embodiments described herein, the left-eye image and the right-eye image are displayed simultaneously.

Fig. 1B and 1C are each block diagrams of an exemplary display system 100 according to some embodiments. Each system includes at least some similarly configured components, which are also labeled as such. The display system 100 is configured to display 3D and 2D video data received from a video data source 101. The display system 100 may be any kind of system configured to display an image, for example, a projection system or a Light Emitting Diode (LED) display system. Display system 100 includes a light source 102, illumination optics 104, a splitter 106, one or more modulators 108, a combiner 110, projection optics 112, and a controller 114. Although fig. 1B and 1C illustrate one light source 102, a display system 100 according to some embodiments may contain multiple light sources 102. The components of system 100 may be housed in a single projection device (e.g., a single projector), or in some embodiments, in multiple devices. For example, in some embodiments, the light sources, modulators, and other components of display system 100 may be split into two or more separate, coordinated projection devices.

In the illustrated example, the light source 102 is driven by the controller 114 to produce an illumination beam comprising six primary colors. The illumination beam is directed through illumination optics 104 and into color separator 106. A color separator 106 separates the illumination beam into six primary color beams and directs each primary color beam to an associated one of Spatial Light Modulators (SLMs) 108. As described in more detail below, the modulator 108 modulates the primary illumination light beams based on input from the controller 114. Projection optics 112 focus the modulated light beam to form an imaging light beam 116. The imaging beam 116 is then projected to produce an image on, for example, a viewing surface (not shown). In the exemplary system of fig. 1B, the left-eye image and the right-eye image may be projected alternately (also referred to as "time-division-multiplexed").

In some embodiments, each primary color may be associated with a single modulator 108. Alternatively, as shown in fig. 1B, the number of modulators may be reduced, for example, by using a field sequential modulation scheme. In some embodiments, such as in a dual modulation projector, the modulator may include multiple modulators for each primary color. In some embodiments, each modulator 108 is associated with a set of primary colors. For example, as described above with respect to fig. 1A, a 6P system as shown in fig. 1C may include a left projector and a right projector. FIG. 1C shows a dual head display system 100 that includes separate modulators 108A and 108B, projection optics 112A and 112B, and two resulting imaging beams 116A and 116B, each of which is designated for a left eye channel and a right eye channel, respectively. Modulator 108A, projection optics 112A, and resulting imaging beam 116A may be considered as components of a left projector (projection head), and modulator 108B, projection optics 112B, and resulting imaging beam 116B may be considered as components of a right projector (projection head). As described above, the light output from both channels is displayed simultaneously to produce a single resulting image on a display or screen. Further, although fig. 1B and 1C show the video data source 101 separate from the display system 100, in some embodiments the video data source 101 may be internal to the display system 100 (e.g., in a memory associated with the display system 100).

Fig. 2 is a block diagram of the controller 114 according to some embodiments. The controller 114 includes an electronic processor 205, a memory 210, and an input/output interface 215. The electronic processor 205 obtains and provides information (e.g., from the memory 210 and/or the input/output interface 215) and processes the information by executing one or more software instructions or modules, which can be stored, for example, in a Random Access Memory (RAM) area of the memory 210 or a Read Only Memory (ROM) of the memory 210 or another non-transitory computer readable medium (not shown). The software may include firmware, one or more applications, program data, filters, rules, one or more program modules, and other executable instructions. The electronic processor 205 may include multiple cores or a single processing unit. The electronic processor 205 is configured to retrieve from the memory 210 and execute, among other things, software related to the control processes and methods described herein.

Memory 210 may include one or more non-transitory computer-readable media and include a program storage area and a data storage area. As described herein, the program storage area and the data storage area may include a combination of different types of memory. Memory 210 may take the form of any non-transitory computer-readable medium.

The input/output interface 215 is configured to receive input and provide system output. The input/output interface 215 obtains information and signals from and provides information and signals to devices (e.g., light source 102, one or more modulators 108, and video data source 101) internal and external to the display system 100 (e.g., via one or more wired and/or wireless connections).

Fig. 3 is a flow diagram illustrating an example method 300 of operating a projection system in accordance with some embodiments. As an example, the method 300 is described as being performed by the controller 114 shown in fig. 1B and 1C, and in particular the electronic processor 205 shown in fig. 2.

At block 302, the electronic processor 205 receives video data from a video data source (e.g., the video data source 101 shown in figs. 1B and 1C). The video data may include a series of tristimulus pixel values from a video content stream or file. In some embodiments, the video data comprises pixel values in a color space (or gamut) such as Rec2020 (also referred to as ITU-R Recommendation BT.2020). At block 304, the electronic processor 205 generates a first plurality of intensity values for the virtual primaries of a first gamut from the video data, and at block 306, generates a second plurality of intensity values for the virtual primaries of a second gamut from the video data. In particular, the gamut volume for 2D images is divided into two virtual gamuts: gamut A and gamut B. Each gamut includes virtual primaries that are a particular mix of the predetermined primaries. As explained in more detail below, gamut A is optimized to approximate a predetermined gamut, e.g., a defined standard color space (e.g., Rec2020), while gamut B is used for any residual energy from the predetermined primaries. In other words, gamut A is used for lower luminance levels, and gamut B is added as applicable to achieve higher luminance levels. In some embodiments, gamut A is optimized to achieve the largest possible gamut.

Returning to FIG. 3, at block 308, the electronic processor 205 converts the first plurality of intensity values to a third plurality of intensity values for a predetermined color primary for the first eye channel (e.g., the channel of the first projection head) and converts the second plurality of intensity values to a fourth plurality of intensity values for a predetermined color primary for the second eye channel (e.g., the channel of the second projection head).

In some embodiments, a mixing function is applied to gamut A and gamut B to optimize each gamut to approximate the color space of the video data (Rec2020 in this example). In other words, in these embodiments, the electronic processor 205 converts the first plurality of intensity values and the second plurality of intensity values to a third plurality of intensity values of the predetermined color primaries of the first eye channel (e.g., the channel of the first projection head), e.g., via one or more mixing functions. The electronic processor 205 also converts the first plurality of intensity values and the second plurality of intensity values to a fourth plurality of intensity values of the predetermined color primaries of the second eye channel (e.g., the channel of the second projection head), e.g., via one or more mixing functions. Equations [1] and [2] below show the mixing functions performed at block 308 for the left-eye and right-eye channels, respectively.

[R_L, G_L, B_L]^T = [B_AL] [R_A, G_A, B_A]^T + [B_BL] [R_B, G_B, B_B]^T    [1]

and

[R_S, G_S, B_S]^T = [B_AS] [R_A, G_A, B_A]^T + [B_BS] [R_B, G_B, B_B]^T    [2]

The [R_L, G_L, B_L] matrix corresponds to the third plurality of intensity values, e.g., the primaries of the right-eye channel, and the [R_S, G_S, B_S] matrix corresponds to the fourth plurality of intensity values, e.g., the primaries of the left-eye channel, where R represents the red primary, G represents the green primary, B represents the blue primary, the subscript L represents a "long" wavelength primary, and the subscript S represents a "short" wavelength primary. In some embodiments, the right-eye channel may instead include the short wavelength primaries and the left-eye channel the long wavelength primaries. In both equations, the gamut A intensities of each channel (the [R_A, G_A, B_A] matrix) and the gamut B intensities (the [R_B, G_B, B_B] matrix) are scaled by the mixing matrices ([B_AL], [B_BL], [B_AS], and [B_BS]). The specific values of the mixing matrices may be predetermined values determined based on the positions of the primary colors and the predetermined color space. The particular values may also depend on the projection/display system used (e.g., the type of projection head). An example method of determining each mixing matrix is described in more detail below. In this example, the following mixing matrix values are used:

[Mixing matrix values for B_AL, B_BL, B_AS, and B_BS: rendered as images in the source; the numeric values are not recoverable.]
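The mixing step of equations [1] and [2] can be sketched in pure Python as follows. The matrix values used here are hypothetical placeholders (an identity matrix and a zero matrix), since the disclosure's actual mixing matrix values are rendered as images in the source:

```python
def matvec(m, v):
    """Multiply a 3x3 matrix (a list of rows) by a 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def mix(gamut_a, gamut_b, b_a, b_b):
    """Blend gamut A and gamut B intensities into one projection
    head's primaries via two 3x3 mixing matrices (equations [1]/[2])."""
    ya = matvec(b_a, gamut_a)
    yb = matvec(b_b, gamut_b)
    return [a + b for a, b in zip(ya, yb)]

# Hypothetical mixing matrices for the long-primary head: pass gamut A
# through unchanged and ignore gamut B entirely.
B_AL = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
B_BL = [[0, 0, 0], [0, 0, 0], [0, 0, 0]]

long_head = mix([0.2, 0.4, 0.6], [0.1, 0.0, 0.0], B_AL, B_BL)
```

With real mixing matrices, both gamut A and gamut B contribute to each head, which is how the two virtual gamuts are folded back onto the six physical primaries.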

Returning to fig. 3, at block 310, the electronic processor 205 dynamically adjusts the pixel levels of at least one spatial modulator (e.g., the modulator 108 shown in figs. 1B and 1C) based on the third plurality of intensity values and the fourth plurality of intensity values. In embodiments where the system 100 is a dual head projection system, the pixel levels of the modulator of each projection head are adjusted.

As described above, gamut A is optimized to approximate a predetermined gamut, e.g., a defined standard color space (e.g., Rec2020), while gamut B is used for any residual energy from the predetermined primaries. Two exemplary methods implemented by the processor 205 for processing received video data into gamuts A and B are described below.

One way to optimize gamut A is by scaling (compressing) the chromaticities of the video data to fit the achievable gamut volume of the light source 102, referred to herein as gamut scaling. In the gamut scaling method, two functions are defined:

f_l(C) = C if (C < 0.5), otherwise 0.5

f_u(C) = C - f_l(C)

C ∈ {R_2020, G_2020, B_2020}

The variable C represents a pixel value of the received video data, which is assumed here to be tristimulus Rec2020 data. The function f_l(C) represents the look-up table for gamut A, while the function f_u(C) represents the look-up table for gamut B. In the present example, the function f_l(C) defines a linear slope increasing from 0 to 0.5 and a flat line above 0.5, and the function f_u(C) defines a flat line from 0 to 0.5 and a linear slope increasing above 0.5. The value 0.5 corresponds to the luminance threshold. In some embodiments, different luminance thresholds may be used.

For each pixel value of the received video data, gamut a and gamut B are given by:

in other words, for the incoming video data R2020,G2020And B2020Pixel values with luminance levels below 0.5 (corresponding to 50% of the total luminance range of system 100 and 100% of the luminance range of gamut a) are contained in gamut a, while pixel values above 0.5 (indicating that these pixel values are outside the luminance range of gamut a) are assigned to gamut B. The resulting gamut A and gamut B signals are then converted toThe defined primaries described above with respect to block 308 and equations 1 and 2.
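The gamut scaling split can be sketched in plain Python as follows, assuming the 0.5 luminance threshold used above; the function and variable names are illustrative:

```python
THRESHOLD = 0.5  # luminance threshold: 100% of gamut A's range

def f_l(c):
    """Gamut A look-up: linear up to the threshold, flat above it."""
    return c if c < THRESHOLD else THRESHOLD

def f_u(c):
    """Gamut B look-up: the residual energy above the threshold."""
    return c - f_l(c)

def split_pixel(rgb_2020):
    """Split one Rec2020 pixel into gamut A and gamut B intensities."""
    return [f_l(c) for c in rgb_2020], [f_u(c) for c in rgb_2020]

gamut_a, gamut_b = split_pixel([0.3, 0.7, 0.5])
# 0.3 stays entirely in gamut A; 0.7 saturates gamut A at 0.5 and
# leaves a 0.2 residual for gamut B.
```

Note that f_l(C) + f_u(C) = C for every channel, so splitting the signal across the two virtual gamuts loses no energy.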

Another way to process the received video data into gamuts A and B is by clipping the chromaticities to fit the achievable gamut volume of the light source 102. First, the transformation relationships [C]_A and [C]_B, from the color space of the source to gamut A and gamut B respectively, are defined. The derivation may be performed as follows.

Given the normalized primary matrices of the left and right channels, any point reproducible by both channels can be defined as:

[X, Y, Z]^T = [NPM]_LLL [R_L, G_L, B_L]^T + [NPM]_SSS [R_S, G_S, B_S]^T

wherein the matrix of X, Y, and Z corresponds to the arbitrary point, [NPM]_LLL and [NPM]_SSS correspond to the normalized primary matrices of the right-eye channel and the left-eye channel, respectively, and R_L, G_L, B_L and R_S, G_S, B_S correspond to the non-normalized intensity values of the right-eye channel and the left-eye channel, respectively.

The mixing functions defined in equations 1 and 2 above are then substituted for the non-normalized matrices of the right-eye and left-eye channels, respectively.

Converting the above formula into terms of the base matrix, the formula will become:

[X, Y, Z]^T = [PM]_A [R_A, G_A, B_A]^T + [PM]_B [R_B, G_B, B_B]^T

wherein

[PM]_A = [NPM]_LLL [B_AL] + [NPM]_SSS [B_AS]

and

[PM]_B = [NPM]_SSS [B_BS] + [NPM]_LLL [B_BL].
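The composition of [PM]_A and [PM]_B from the normalized primary matrices and mixing matrices can be sketched as below. Identity matrices stand in for the real [NPM] and [B] values, which are not reproduced in the source; only the structure of the computation is illustrated:

```python
def matmul(a, b):
    """3x3 matrix product (matrices as lists of rows)."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def matadd(a, b):
    """Elementwise sum of two 3x3 matrices."""
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

def primary_matrix(npm_1, b_1, npm_2, b_2):
    """[PM] = [NPM]_1 [B_1] + [NPM]_2 [B_2], per the formulas above."""
    return matadd(matmul(npm_1, b_1), matmul(npm_2, b_2))

# Stand-in identity matrices (hypothetical values, for illustration).
I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

PM_A = primary_matrix(I3, I3, I3, I3)  # [NPM]_LLL[B_AL] + [NPM]_SSS[B_AS]
```

With real values, [PM]_A and [PM]_B map gamut A and gamut B intensities directly to XYZ tristimulus coordinates, which is what makes the inverse transformations [C]_A and [C]_B below well defined.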

The normalized primary matrix for Rec2020 is known as follows:

[NPM]_2020 [numeric values rendered as an image in the source]

The conversion from a source Rec2020 vector to gamut A and gamut B is then:

[R_A, G_A, B_A]^T = [C]_A [R_2020, G_2020, B_2020]^T

wherein

[C]_A = [PM]_A^(-1) [NPM]_2020,

and

[R_B, G_B, B_B]^T = [C]_B [R_2020, G_2020, B_2020]^T

wherein

[C]_B = [PM]_B^(-1) [NPM]_2020.

As described above, gamut A is used for lower luminance levels, while gamut B is used to achieve higher luminance levels when applicable. In contrast to the gamut scaling method, in which the luminance range of gamut A is the same on all channels, a threshold vector is used here as the luminance threshold, which means that the luminance range of gamut A varies with each channel. A threshold vector representing the transition between the "A" and "B" gamuts in the Rec2020 space can be found as follows:

[TR TG TB]^T = [C]A^-1 [1 1 1]^T

in other words, each threshold in the threshold vector is determined based on a transformation of the first virtual gamut into a predetermined gamut.
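The conversion matrices and threshold vector appear only as images in the source text; a plausible sketch, assuming the conversion is [C]A = [PM]A^-1·[NPM]2020 and the threshold vector is the Rec2020 signal at which each gamut-A channel saturates, follows. The `PM_A_example` matrix is a hypothetical placeholder:

```python
import numpy as np

# Assumed reconstruction: the Rec2020 source vector is mapped into a
# virtual gamut via C_A = inv(PM_A) @ NPM_2020, and the threshold vector
# is the Rec2020 pixel value that drives every gamut-A channel to 1.

NPM_2020 = np.array([[0.6370, 0.1446, 0.1689],   # standard Rec2020 NPM
                     [0.2627, 0.6780, 0.0593],
                     [0.0000, 0.0281, 1.0610]])

def conversion_matrix(pm):
    """Map Rec2020 tristimulus values into a virtual gamut's drive values."""
    return np.linalg.inv(pm) @ NPM_2020

def threshold_vector(c_a):
    """Rec2020 value at which each gamut-A channel reaches full drive (1)."""
    return np.linalg.inv(c_a) @ np.ones(3)

# Hypothetical gamut-A primary matrix, for demonstration only.
PM_A_example = np.array([[0.62, 0.16, 0.15],
                         [0.27, 0.68, 0.06],
                         [0.00, 0.03, 1.06]])
C_A = conversion_matrix(PM_A_example)
T = threshold_vector(C_A)
```

Consistent with the text, each component of T depends on the whole transformation, so the luminance range of gamut A differs per channel.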

The following function is then defined:

fl(C) = C if (C < TC), otherwise TC

fu(C) = C - fl(C)

C ∈ {R2020, G2020, B2020}

The variable C represents the pixel value of the received video data, which is again assumed to be tristimulus Rec2020 data. Similar to the gamut scaling method above, the function fl(C) again represents the look-up table of gamut A, while the function fu(C) represents the look-up table of gamut B. In the present example, the function fl(C) defines a linear slope that increases from 0 to the threshold of a particular channel as defined in the threshold vector, and a flat line once that threshold is exceeded, while the function fu(C) defines a flat line from 0 to the particular threshold and a linear slope increasing above that threshold. Again, the particular thresholds here correspond to the luminance thresholds of each channel of gamut A, respectively. In some embodiments, the functions fl(C) and fu(C) may define transitions other than the linear slope described above (as long as the two functions sum to the input value C). For example, one or both of the functions fl(C) and fu(C) may define a curve rather than a straight line.

For each pixel value of the received video data, gamut A and gamut B are obtained as follows:

CA = fl(C) and CB = fu(C), for each of C ∈ {R2020, G2020, B2020}

In other words, for the input video data R2020, G2020, and B2020, pixel values below the corresponding threshold TR, TG, or TB of the threshold vector are included within gamut A, while pixel values exceeding the threshold (meaning outside the luminance range of gamut A) are assigned to gamut B.

Any resulting intensity values for gamut A and gamut B that are negative are then clipped to 0:

[RA GA BA] → [max(RA, 0) max(GA, 0) max(BA, 0)]

[RB GB BB] → [max(RB, 0) max(GB, 0) max(BB, 0)]

The resulting gamut A and gamut B signals are then converted to the defined primaries as described above with respect to block 308 and equations 1 and 2.
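The per-channel threshold split and the negative-value clipping described above might be sketched as follows; the helper names and the threshold values used for testing are illustrative, not from the source:

```python
# Per-channel variant of the split: each Rec2020 channel has its own
# threshold taken from the threshold vector, and any negative intensity
# values produced downstream are clipped to 0.

def split_channel(c: float, t: float):
    """Return the (gamut A, gamut B) parts of one channel value."""
    low = c if c < t else t          # f_l(C)
    return low, c - low              # f_u(C) = C - f_l(C)

def split_pixel_vec(rgb, thresholds):
    """Split one pixel using a per-channel threshold vector."""
    pairs = [split_channel(c, t) for c, t in zip(rgb, thresholds)]
    gamut_a = [p[0] for p in pairs]
    gamut_b = [p[1] for p in pairs]
    return gamut_a, gamut_b

def clip_negative(values):
    """Clip any negative intensity values to 0."""
    return [max(v, 0.0) for v in values]
```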

The gamut clipping method clips the colors to the edges of gamuts A and B, which can be made approximately the same for any projector system. In other words, the colors produced by one projector can be faithfully reproduced by any other projector, regardless of whether the source primaries are the same.

Alternatively, the mixing function may be applied to gamuts A and B before clipping the negative values. By applying the mixing function to each gamut, the values of each gamut are converted into the predetermined primary colors according to the color space of the received video data.

Clipping can then be applied as follows:

[R′A G′A B′A] → [max(R′A, 0) max(G′A, 0) max(B′A, 0)]

[R′B G′B B′B] → [max(R′B, 0) max(G′B, 0) max(B′B, 0)]

This approach can maximize the gamut coverage for a particular primary color of any projector system.
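A minimal sketch of this alternative ordering (apply the mixing conversion first, then clip the negatives) follows; the 3x3 mixing matrix is hypothetical and stands in for the conversion to a projection head's predetermined primaries:

```python
import numpy as np

# Mix-then-clip ordering: convert the virtual-gamut values to the
# projection head's predetermined primaries first, then clip negative
# results to the achievable (non-negative) range. MIX is hypothetical.

MIX = np.array([[ 0.95,  0.05,  0.00],
                [-0.02,  1.00,  0.02],
                [ 0.01, -0.03,  1.02]])

def mix_then_clip(gamut_values):
    """Apply the mixing matrix, then clip negative intensities to 0."""
    mixed = MIX @ np.asarray(gamut_values, dtype=float)
    return np.maximum(mixed, 0.0)
```

Clipping after mixing operates in the output primaries' space, which is why this ordering can preserve more of the achievable gamut for a specific projector.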

Fig. 4 shows a chromaticity diagram 400. The diagram 400 includes color gamuts associated with the display system 100 and a target color gamut (here, Rec2020). Gamut 402 is the Rec2020 gamut. Gamut 404 is the color gamut defined by the (long) primaries RL, GL, and BL of the right-eye channel (the LLL primaries), while gamut 406 is the color gamut defined by the (short) primaries RS, GS, and BS of the left-eye channel (the SSS primaries). Gamut 408 is the gamut defined by driving both the right-eye and left-eye channels at the same value (the "WCG" gamut). As shown, gamut 408 is significantly different from the Rec2020 gamut 402. Gamut 410 is virtual gamut A and gamut 412 is virtual gamut B. As described above, gamut A is defined as a combination of virtual primaries to approximate the Rec2020 gamut 402 as closely as possible, while gamut B is defined by the remaining energy output of the virtual primaries after subtracting the energy used for gamut A. As shown, gamut A (gamut 410) matches the Rec2020 gamut 402 more closely than the summed gamut 408 does.

Fig. 5A shows a chromaticity diagram 500A that includes the color gamuts 402, 404, 406, 408, 410, and 412 described above with respect to fig. 4. Chromaticity diagram 500A includes input chromaticities (black) and the resulting output chromaticities (gray) obtained using the above-described gamut clipping method. A line connecting an input chromaticity point to an output chromaticity point (e.g., the line connecting input point 502A to output point 502B) indicates how the input color is altered (if needed). Note that in the illustrated diagram 500A, the maximum input is less than the luminance threshold, so gamut B is not used and the second clipping method described above is applied.

Fig. 5B illustrates a chromaticity diagram 500B that includes the color gamuts 402, 404, 406, 408, 410, and 412 described above with respect to fig. 4. Chromaticity diagram 500B includes an input chromaticity (black) and a resulting output chromaticity (gray) obtained using the above-described gamut clipping method. In the illustrated graph 500B, the maximum input is greater than the luminance threshold, and thus gamut B is utilized.

As described above, the mixing matrix is a derived value determined based on the positions of the primary colors compared to Rec2020. The mixing matrix is specifically defined to place the short and long primaries of the right-eye and left-eye channels, respectively, at specific desired locations. An example method of determining the mixing matrix for each primary color is described below.

Fig. 6A shows an enhanced view 600A of the chromaticity diagram 400 of fig. 4. As shown in view 600A, the short red primary RS and the long red primary RL are both outside of the Rec2020 gamut 402. It is necessary to place gamut A within the Rec2020 gamut 402. To determine the mixing matrix for the red primary, one approach is to bind the primaries to the Rec2020 gamut. This may be achieved by mixing the short red wavelength (RS) with the long green wavelength for gamut A, and the short red wavelength with the long red wavelength (RL) for gamut B. By finding the point of minimum distance between the resulting chromaticity and the Rec2020 boundary, the distance between the long-red and short-red wavelengths and the Rec2020 boundary may be minimized. This position is then used to define the mixing matrix. For example, the line in enhanced view 600A represents a mixture of the long red and short green wavelengths. Point A represents the location where the blend intersects the Rec2020 gamut.

In order not to be limited to the Rec2020 gamut, another approach is to maximize the coverage of the resulting gamut. Fig. 6B shows an enhanced view 600B of the chromaticity diagram 400 of fig. 4. Here, a line is drawn from between the blue and green primaries of the Rec2020 gamut to the point of shortest distance on the mixture of the long red (RL) and short red (RS) wavelengths. This results in using the long red for gamut A and the short red for gamut B.

Fig. 7A shows an enhanced view 700A of the chromaticity diagram 400 of fig. 4. As shown in view 700A, the short green primary GS and the long green primary GL both lie outside the Rec2020 color gamut 402. Again, it is necessary to place gamut A within the Rec2020 gamut 402. Here, a primary may be bound to the Rec2020 gamut 402 by defining the intersection of line B, drawn between the two green primaries, and the boundary of Rec2020. This position is then used to define the mixing matrix. Point C represents the point at which the green primary of gamut A is located.

In order not to be limited to the Rec2020 gamut, another approach is to maximize the coverage of the resulting gamut. Fig. 7B shows an enhanced view 700B of the chromaticity diagram 400 of fig. 4. Here, a line is drawn from between the red and blue primaries of the Rec2020 gamut to the point of shortest distance on the mixture of the long green (GL) and short green (GS) wavelengths. This results in the use of the short green wavelength.

Fig. 8A shows an enhanced view 800A of the chromaticity diagram 400 of fig. 4. As shown in view 800A, the short blue primary BS and the long blue primary BL are both located within the Rec2020 color gamut 402. Both primaries are bound outward toward the boundary of the Rec2020 gamut 402. Here, the primaries are bound to the Rec2020 gamut 402 by defining the intersection of line D, drawn between the long blue and short green wavelengths, and the Rec2020 boundary. This position is then used to define the mixing matrix. Point E represents the point at which the blue primary of gamut A is located.

Another approach is to not bind the blue primaries to the Rec2020 gamut, e.g., to maximize the coverage of the resulting gamut. Fig. 8B shows an enhanced view 800B of the chromaticity diagram 400 of fig. 4. Here, a line is drawn from between the red and green primaries of the Rec2020 gamut to the point of shortest distance on the mixture of the long blue (BL) and short green (GS) wavelengths. This position is then used to define the mixing matrix. As shown, the long blue primary is used for gamut A, and the short blue primary is used for gamut B.
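The "point of shortest distance" constructions in figs. 6B, 7B, and 8B reduce to projecting a reference chromaticity onto the mixture line between two primaries. A generic 2D helper for that projection is sketched below; the coordinates used for testing are illustrative, not measured primaries:

```python
# Geometric helper for the primary-placement step: find the point on the
# mixture line between two primaries (e.g. BL and GS) that is closest to
# a reference chromaticity on the Rec2020 boundary. Works on plain (x, y)
# chromaticity coordinates.

def closest_on_segment(p0, p1, target):
    """Project `target` onto the segment p0-p1, clamped to the endpoints."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    length_sq = dx * dx + dy * dy
    if length_sq == 0.0:
        return p0                       # degenerate segment: both primaries equal
    # Parametric position of the orthogonal projection along the segment.
    t = ((target[0] - p0[0]) * dx + (target[1] - p0[1]) * dy) / length_sq
    t = max(0.0, min(1.0, t))           # clamp so the mix stays physically valid
    return (p0[0] + t * dx, p0[1] + t * dy)
```

The clamped parameter t is the mixing ratio between the two primaries, which is the quantity the mixing matrix encodes.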

In the foregoing specification, specific embodiments have been described. However, one of ordinary skill in the art appreciates that various modifications and changes can be made without departing from the scope of the present disclosure as set forth in the claims below. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of present teachings. For example, the techniques disclosed herein may be applied to projection systems designed for non-3D content. For example, a dual head projection system may utilize the disclosed techniques. Furthermore, the disclosed techniques may be applied to more than two projection heads.

Moreover, in this document, relational terms such as first and second, top and bottom, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms "comprises," "comprising," "has," "having," "includes," "including," "contains," "containing," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises, has, includes, or contains a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by "comprises a," "has a," "includes a," or "contains a" does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises, has, includes, or contains the element. The terms "a" and "an" are defined as one or more unless the context clearly indicates otherwise. The terms "substantially", "essentially", "approximately", "about" or any other version thereof are defined as being approximately as understood by one of ordinary skill in the art, and in one non-limiting embodiment the term is defined to be within 10%, in another embodiment within 5%, in another embodiment within 1%, and in another embodiment within 0.5%. The term "coupled," as used herein, is defined as connected, although not necessarily directly, and not necessarily mechanically. A device or structure that is "configured" in a certain way is configured in at least that way, but may also be configured in ways that are not listed.

It will be appreciated that some embodiments may be comprised of one or more general-purpose or special-purpose processors (or "processing devices"), such as microprocessors, digital signal processors, custom processors, and Field Programmable Gate Arrays (FPGAs), and unique stored program instructions (including software and firmware) that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions of the methods and/or apparatus described herein. Alternatively, some or all functions could be implemented by a state machine that has no stored program instructions, or in one or more Application Specific Integrated Circuits (ASICs), in which each function or some combinations of certain of the functions are implemented as custom logic. Of course, a combination of the two approaches may be used.

Furthermore, embodiments may be implemented as a computer-readable storage medium having computer-readable code stored thereon for programming a computer (e.g., comprising a processor) to perform a method as described and claimed herein. Examples of such computer-readable storage media include, but are not limited to, hard disks, CD-ROMs, optical storage devices, magnetic storage devices, ROMs (read-only memories), PROMs (programmable read-only memories), EPROMs (erasable programmable read-only memories), EEPROMs (electrically erasable programmable read-only memories), and flash memories. Moreover, it is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, when guided by the concepts and principles disclosed herein will be readily capable of generating such software instructions and programs and ICs with minimal experimentation.

The various aspects of the invention can be understood from the following exemplary embodiments (EEEs):

1. A dual head display system, comprising:

a first projection head;

a second projection head;

at least one spatial modulator; and

an electronic processor configured to:

receiving 2D video data;

generating from the video data a first plurality of intensity values of virtual primaries of a first virtual gamut and a second plurality of intensity values of a second virtual gamut, the first plurality of intensity values being below a luminance threshold and approximating a predetermined gamut and the second plurality of intensity values being above the luminance threshold;

converting the first plurality of intensity values to a third plurality of intensity values for a predetermined color primary of the first projection head and converting the second plurality of intensity values to a fourth plurality of intensity values for a predetermined color primary of the second projection head; and

dynamically adjusting pixel levels of the at least one spatial modulator of the first projection head and the second projection head based on the third plurality of intensity values and the fourth plurality of intensity values.

2. The system of EEE 1, wherein the brightness threshold is a threshold vector including a threshold for each color channel of the first virtual color gamut.

3. The system of EEE 2, wherein each threshold is determined based on a transformation of the first and second virtual color gamuts to the predetermined color gamut.

4. The system of any of EEEs 1-3, wherein any negative values of the first plurality of intensity values and the second plurality of intensity values are clipped.

5. The system of any of EEEs 1-4, wherein any negative values of the third plurality of intensity values and the fourth plurality of intensity values are clipped.

6. A method for displaying image data, the method comprising:

receiving 2D video data;

generating, from video data, a first plurality of intensity values of virtual primaries of a first virtual gamut and a second plurality of intensity values of a second virtual gamut, the first plurality of intensity values of the first virtual gamut being below a luminance threshold and approximating a predetermined gamut and the second plurality of intensity values being above the luminance threshold;

converting the first plurality of intensity values to a third plurality of intensity values for a predetermined color primary of a first projection head of a display system and converting the second plurality of intensity values to a fourth plurality of intensity values for a predetermined color primary of a second projection head of the display system; and

dynamically adjusting pixel levels of spatial modulators of the first projection head and the second projection head of the display system based on the third plurality of intensity values and the fourth plurality of intensity values.

7. The method of EEE 6, wherein the brightness threshold is a threshold vector including a threshold for each color channel.

8. The method according to EEE 7, wherein each threshold is determined based on a transformation of the first virtual gamut into the predetermined gamut.

9. The method according to any one of EEEs 6 to 8, wherein any negative values of the first and second plurality of intensity values are clipped.

10. The method according to any one of EEEs 6 to 9, wherein any negative values of the third plurality of intensity values and the fourth plurality of intensity values are clipped.

11. The method according to any one of EEEs 6 to 10, wherein converting the first plurality of intensity values into the third plurality of intensity values and converting the second plurality of intensity values into the fourth plurality of intensity values is performed via a mixing function.

12. A method for displaying image data, the method comprising:

receiving video data;

generating a first plurality of intensity values associated with a first virtual color gamut based on intensity levels of the video data;

generating a second plurality of intensity values associated with a second virtual color gamut based on a comparison between the intensity levels and at least one predetermined threshold;

generating a third plurality of intensity values and a fourth plurality of intensity values based on the first plurality of intensity values and the second plurality of intensity values, the third plurality of intensity values and the fourth plurality of intensity values each configured to approximate a predetermined color space associated with the video data; and

providing the third plurality of intensity values and the fourth plurality of intensity values to at least one spatial light modulator.

13. The method of EEE 12, wherein the second plurality of intensity values is set to zero if the intensity level does not exceed the predetermined threshold.

14. The method of EEE 12 or 13, wherein when the intensity level exceeds the predetermined threshold, the second plurality of intensity values is set based on an amount by which the intensity level exceeds the predetermined threshold.

15. The method according to any one of EEEs 12 to 14, further comprising: modifying one or both of the first plurality of intensity values and the second plurality of intensity values when at least one intensity value is outside of an achievable gamut volume of the first virtual gamut and the second virtual gamut.

16. The method according to any one of EEEs 12 to 15, further comprising: modifying one or both of the third plurality of intensity values and the fourth plurality of intensity values when at least one intensity value is outside of an achievable gamut volume of the first virtual gamut and the second virtual gamut.

17. The method according to any of EEEs 12 to 16, wherein the at least one predetermined threshold is a vector based on a relation between the first virtual gamut and the second virtual gamut in the predetermined color space.

18. The method according to any of EEEs 12 to 17, further comprising defining the first virtual color gamut to approximate a predetermined color gamut associated with the predetermined color space based on a combination of a plurality of primary display colors associated with a light source; and defining the second virtual color gamut based on the residual power of the light source and the first virtual color gamut.

19. The method according to any one of EEEs 12 to 18, wherein the at least one spatial light modulator is part of a dual head projector.

20. The method of EEE 19, wherein the dual head projector displays a 2-D image based on the third plurality of intensity values and the fourth plurality of intensity values.

21. A non-transitory computer-readable medium storing instructions that, when executed by a processor of a computer, cause the computer to perform operations comprising:

receiving 2D video data comprising tristimulus pixel values of predetermined primary colors of a predetermined color space;

generating, from the video data, a first plurality of intensity values of virtual primaries of a first virtual gamut and a second plurality of intensity values of a second virtual gamut to approximate a predetermined gamut, the first plurality of intensity values being less than a luminance threshold of the first virtual gamut and the second plurality of intensity values being above the luminance threshold;

converting the first plurality of intensity values to a third plurality of intensity values for the predetermined color primary and the second plurality of intensity values to a fourth plurality of intensity values for the predetermined color primary via a mixing function; and

dynamically adjusting a pixel level of at least one spatial modulator of a dual head projection system based on the third plurality of intensity values and the fourth plurality of intensity values.
