Display device
Reading note: This technology, "Display device," was devised by Kazuo Yamamoto, Toshiya Hamada, Kuniaki Takahashi, and Nobuo Hattori on 2014-06-06. Its main content is as follows: The present technology relates to a display device. The display device includes: an interface configured to output luminance capability information of the display device to a reproducing device and to receive video data and luminance characteristic information from the reproducing device, wherein the reproducing device processes decoded video data according to the luminance capability information, the received video data is the processed decoded video data, and the luminance characteristic information represents a luminance characteristic; and circuitry configured to control display of the video based on the received video data and the luminance characteristic information. The present technology can be applied to a player that reproduces content.
1. A display device, comprising:
an interface configured to:
output luminance capability information of the display device to a reproducing device, and
receive video data and luminance characteristic information from the reproducing device, wherein the reproducing device processes decoded video data according to the luminance capability information, the received video data is the processed decoded video data, and the luminance characteristic information represents a luminance characteristic; and
circuitry configured to control display of video based on the received video data and the luminance characteristic information.
2. The display device according to claim 1,
wherein the luminance capability information indicates a luminance performance of the display device corresponding to a high dynamic range.
3. The display device according to any one of the preceding claims,
wherein the luminance characteristic information includes a reference screen luminance parameter.
4. The display device according to any one of the preceding claims,
wherein the circuitry is further configured to initiate a comparison of the luminance characteristic information and the luminance capability information and a determination of whether the brightness of the received video data requires adjustment.
5. The display device according to any one of the preceding claims, further comprising a memory containing EDID,
wherein the EDID contains information of 4k video.
6. The display device according to claim 5, wherein the 4k video has a resolution of 2160p or higher.
7. The display device according to any one of claims 1 to 4, further comprising a memory containing EDID, wherein the EDID contains information of 4k resolution, wherein the horizontal/vertical resolution is 4096/2160 pixels or 3840/2160 pixels.
8. The display device according to any one of claims 1 to 4, wherein the display device includes EDID stored in a memory, and the EDID contains the luminance capability information of the display device.
9. A display device, comprising:
an interface configured to:
output luminance capability information of the display device to a reproducing device, and
receive video data and luminance characteristic information included in data of each frame from the reproducing device, wherein the decoded video data is processed by the reproducing device based on the luminance capability information, the received video data is the processed decoded video data, and the luminance characteristic information represents a luminance characteristic; and
circuitry configured to control display of video based on the received video data and the luminance characteristic information included in the data of each frame.
10. The display device according to claim 9,
wherein the luminance capability information indicates a luminance performance of the display device corresponding to a high dynamic range.
11. The display device according to claim 9 or 10,
wherein the luminance characteristic information includes a reference screen luminance parameter.
12. The display device according to any one of claims 9 to 11,
wherein the circuitry is further configured to initiate a comparison of the luminance characteristic information and the luminance capability information and a determination of whether the brightness of the received video data requires adjustment.
13. The display device according to any one of claims 9 to 12,
wherein the interface is HDMI, and the data of each frame is an HDMI InfoFrame.
14. The display device according to claim 10,
wherein the interface is HDMI, and the data of each frame is an HDMI InfoFrame; and
wherein the luminance characteristic information includes a reference screen luminance parameter.
15. The display device according to claim 10,
wherein the interface is HDMI, and the data of each frame is an HDMI InfoFrame; and
wherein the circuitry is further configured to initiate a comparison of the luminance characteristic information and the luminance capability information and a determination of whether the brightness of the received video data requires adjustment.
16. The display device according to any one of claims 9 to 15,
wherein the display device includes an EDID stored in a memory, and the EDID contains the luminance capability information of the display device.
17. The display device according to any one of claims 9 to 16,
wherein the EDID contains information of 4k video.
18. The display device of claim 17, wherein the 4k video has a resolution of 2160p or higher.
19. The display device according to any one of claims 9 to 16,
wherein the EDID contains information of 4k resolution, wherein the horizontal/vertical resolution is 4096/2160 pixels or 3840/2160 pixels.
Technical Field
The present technology relates to a reproducing apparatus, a reproducing method, and a recording medium, and more particularly, to a reproducing apparatus, a reproducing method, and a recording medium, which can display content having a wide dynamic range of luminance with appropriate luminance.
Background
A Blu-ray (registered trademark) Disc (hereinafter referred to as a BD as appropriate) is used as a recording medium for content such as movies. Heretofore, a video recorded on a BD has been produced with standard brightness (100 nits) on the premise of being viewed on a display having standard brightness, and its dynamic range is compressed accordingly.
The video serving as the main video is captured by a high-quality camera and has a dynamic range equal to or greater than the dynamic range that can be displayed on a display having standard brightness. Needless to say, the dynamic range of the main video is degraded by this compression.
Reference list
Patent document
Patent document 1: JP 2009-58692A
Patent document 2: JP 2009-89209A
Disclosure of Invention
Technical problem
Due to advances in the technology of displays, such as organic Electroluminescent (EL) displays and Liquid Crystal Displays (LCDs), displays are commercially available that are brighter than standard displays, with luminances of 500 nits and 1000 nits.
In view of the above, the present technology has been conceived, and it makes it possible to display content having a wide dynamic range of luminance with appropriate luminance.
Solution to the problem
A reproduction apparatus according to an aspect of the present technology includes: a readout unit configured to read out the encoded data, the luminance characteristic information, and the luminance conversion definition information from a recording medium on which the encoded data of the extended video that is a video having a second luminance range wider than the first luminance range is recorded, the luminance characteristic information representing the luminance characteristic of the extended video, and the luminance conversion definition information used when luminance conversion of the extended video to a standard video that is a video having the first luminance range is performed; a decoding unit configured to decode the encoded data; a conversion unit configured to convert an extended video obtained by decoding the encoded data into a standard video according to the luminance conversion definition information; and an output unit configured to output the data of the extended video and the luminance characteristic information to a display apparatus capable of displaying the extended video, and configured to output the data of the standard video to a display apparatus incapable of displaying the extended video.
The luminance characteristic information and the luminance conversion definition information may be inserted into a stream including the encoded data as auxiliary information of the encoded data and recorded in the recording medium.
The encoded data may be encoded data of HEVC, and the luminance characteristic information and the luminance conversion definition information may both be SEI of the HEVC stream.
The luminance conversion definition information may be first tone mapping information in which any one of 0, 2, and 3 is set as a value of tone_map_model_id, and the luminance characteristic information may be second tone mapping information in which 4 is set as the value of tone_map_model_id.
The tone_map_model_id of the first tone mapping information and the tone_map_model_id of the second tone mapping information may each be set with the same value representing a recording mode of the recording medium.
Information relating to reproduction of the encoded data may be further recorded in the recording medium, the information including a flag indicating whether the extended video is recorded as the main video. The decoding unit may decode the encoded data when the flag indicates that the extended video is recorded as the main video.
The recording medium may be a blu-ray disc. The flag may be included in a clip information file used as reproduction-related information.
The recording medium may be a blu-ray disc. The flag may be included in a playlist file serving as reproduction-related information.
According to an aspect of the present technology, encoded data of an extended video that is a video having a second luminance range wider than a first luminance range, luminance characteristic information representing the luminance characteristics of the extended video, and luminance conversion definition information used when luminance conversion from the extended video to a standard video that is a video having the first luminance range is performed are read out from a recording medium on which they are recorded. The encoded data is decoded. The extended video obtained by decoding the encoded data is converted into a standard video according to the luminance conversion definition information. The data of the extended video and the luminance characteristic information are output to a display apparatus capable of displaying the extended video, and the data of the standard video is output to a display apparatus incapable of displaying the extended video.
A reproduction apparatus according to another aspect of the present technology includes: a readout unit configured to read out encoded data, luminance characteristic information, and luminance conversion definition information from a recording medium on which encoded data of a standard video (the standard video being a video having a first luminance range) obtained by performing luminance conversion of an extended video that is a video having a second luminance range wider than the first luminance range is recorded, luminance characteristic information representing luminance characteristics of the extended video, and luminance conversion definition information used when luminance conversion of the standard video into the extended video is performed; a decoding unit configured to decode the encoded data; a conversion unit configured to convert a standard video obtained by decoding the encoded data into an extended video according to the luminance conversion definition information; and an output unit configured to output the data of the extended video and the luminance characteristic information to a display apparatus capable of displaying the extended video, and configured to output the data of the standard video to a display apparatus incapable of displaying the extended video.
The luminance characteristic information and the luminance conversion definition information may be inserted into a stream including the encoded data as auxiliary information of the encoded data and recorded in the recording medium.
The encoded data may be encoded data of HEVC, and the luminance characteristic information and the luminance conversion definition information may both be SEI of the HEVC stream.
The luminance conversion definition information may be first tone mapping information in which any one of 0, 2, and 3 is set as a value of tone_map_model_id, and the luminance characteristic information may be second tone mapping information in which 4 is set as the value of tone_map_model_id.
The tone_map_model_id of the first tone mapping information and the tone_map_model_id of the second tone mapping information may each be set with the same value representing a recording mode of the recording medium.
Information relating to reproduction of the encoded data may be further recorded in the recording medium, the information including a flag indicating whether the extended video is recorded as the main video. The decoding unit may decode the encoded data when the flag indicates that the extended video is recorded as the main video.
The recording medium may be a blu-ray disc. The flag may be included in a clip information file used as reproduction-related information.
The recording medium may be a blu-ray disc. The flag may be included in a playlist file serving as reproduction-related information.
According to another aspect of the present technology, encoded data of a standard video (the standard video being a video having a first luminance range) obtained by performing luminance conversion of an extended video that is a video having a second luminance range wider than the first luminance range, luminance characteristic information representing the luminance characteristics of the extended video, and luminance conversion definition information used when luminance conversion from the standard video to the extended video is performed are read out from a recording medium on which they are recorded. The encoded data is decoded. The standard video obtained by decoding the encoded data is converted into an extended video according to the luminance conversion definition information. The data of the extended video and the luminance characteristic information are output to a display apparatus capable of displaying the extended video, and the data of the standard video is output to a display apparatus incapable of displaying the extended video.
Advantageous effects of the invention
According to the present technology, it is possible to display content having a wide dynamic range of luminance with appropriate luminance.
Drawings
Fig. 1 is a diagram showing an exemplary configuration of a recording/reproducing system according to an embodiment of the present technology;
fig. 2 is a diagram showing an example of signal processing in mode-i;
fig. 3 is a diagram showing the flow of signals processed in mode-i;
fig. 4 is a diagram showing an example of signal processing in mode-ii;
fig. 5 is a diagram showing a flow of signals processed in mode-ii;
fig. 6 is a diagram illustrating a configuration of an access unit of HEVC;
fig. 7 is a diagram showing a syntax of tone mapping information;
fig. 8 is a diagram showing an example of information serving as tone mapping definition information and HDR information;
fig. 9 is a diagram showing an example of a tone curve drawn by tone mapping information of tone_map_model_id = 0;
fig. 10 is a diagram showing an example of a step function drawn by tone mapping information of tone_map_model_id = 2;
fig. 11 is a diagram showing an example of a polyline function drawn by tone mapping information of tone_map_model_id = 3;
fig. 12 is a diagram showing an example of each piece of information contained in HDR information;
fig. 13 is a diagram showing an example of a management structure of an AV stream having a BD-ROM format;
fig. 14 is a diagram showing the structure of Main Path and Sub Path;
fig. 15 is a diagram showing an example of a management structure of a file;
fig. 16 is a diagram showing a syntax of a playlist file;
fig. 17 is a diagram showing a syntax of a clip information file;
fig. 18 is a diagram showing the syntax of ProgramInfo () in fig. 17;
fig. 19 is a diagram illustrating a syntax of StreamCodingInfo in fig. 18;
fig. 20 is a block diagram showing an exemplary configuration of a recording apparatus;
fig. 21 is a block diagram showing an exemplary configuration of an encoding processing unit in fig. 20;
fig. 22 is a diagram showing an example of signal processing performed by the HDR-STD conversion unit;
fig. 23 is a diagram showing an example of tone mapping;
fig. 24 is a block diagram showing an exemplary configuration of a reproduction apparatus;
fig. 25 is a block diagram showing an exemplary configuration of a decoding processing unit in fig. 24;
fig. 26 is a block diagram showing an exemplary configuration of a display device;
fig. 27 is a flowchart showing a recording process of the recording apparatus;
fig. 28 is a flowchart showing the encoding process in mode-i executed in step S2 in fig. 27;
fig. 29 is a flowchart showing the encoding process in mode-ii performed in step S3 in fig. 27;
fig. 30 is a flowchart showing the database information generation processing executed in step S4 in fig. 27;
fig. 31 is a flowchart showing a reproduction process of the reproduction apparatus;
fig. 32 is a flowchart showing the decoding process in mode-i executed in step S44 in fig. 31;
fig. 33 is a flowchart showing the decoding process in mode-ii performed in step S45 in fig. 31;
fig. 34 is a flowchart showing a display process of the display device;
fig. 35 is a diagram showing an example of the syntax of AppInfoPlayList() contained within the playlist file in fig. 16;
fig. 36 is a diagram showing the syntax of PlayList() contained within the playlist file in fig. 16;
fig. 37 is a diagram showing the syntax of PlayList() in fig. 36;
fig. 38 is a diagram showing the syntax of STN_table() in fig. 37;
fig. 39 is a diagram showing the syntax of stream_attributes() in fig. 38;
fig. 40 is a diagram showing an example of allocation of PSRs;
fig. 41 is a diagram showing an example of signal processing in mode-i in the case where adjustment of the luminance of the HDR video is performed on the reproducing apparatus side;
fig. 42 is a diagram showing an example of signal processing in mode-ii in the case where adjustment of the luminance of the HDR video is performed on the reproducing apparatus side;
fig. 43 is a block diagram showing an exemplary configuration of the HDR video output unit in fig. 25;
fig. 44 is a flowchart showing the decoding process in mode-i performed in step S44 in fig. 31;
fig. 45 is a flowchart showing the decoding process in mode-ii performed in step S45 in fig. 31;
fig. 46 is a flowchart showing a display process of the display device;
fig. 47 is a diagram showing one example of identification based on information transmitted and received through HDMI;
fig. 48 is a diagram showing another example of identification based on information transmitted and received through HDMI;
fig. 49 is a diagram showing an example of HDR EDID;
fig. 50 is a diagram showing an example of the HDR InfoFrame;
fig. 51 is a flowchart showing the setting processing of HDR EDID of the display device;
fig. 52 is a flowchart showing a reproduction process of the reproduction apparatus;
fig. 53 is a flowchart showing the HDR/raw output processing executed in step S227 in fig. 52;
fig. 54 is a flowchart showing the HDR/processed output processing executed in step S228 in fig. 52;
fig. 55 is a flowchart showing the STD output process executed in step S229 in fig. 52;
fig. 56 is a flowchart showing a display process of the display device.
Fig. 57 is a block diagram showing an exemplary configuration of a computer.
Detailed Description
Hereinafter, modes for implementing the present technology are described. The description is provided in the following order.
1. Recording/reproducing system
2. HEVC
3. BD format
4. Configuration of each device
5. Operation of each device
6. Modifications
7. Exemplary case of adjusting luminance on the reproduction apparatus side
8. Exemplary application to HDMI
9. Other modifications
<1. Recording/reproducing system>
Fig. 1 is a diagram showing an exemplary configuration of a recording/reproducing system according to an embodiment of the present technology.
The recording/reproducing system in fig. 1 includes a recording apparatus, a reproducing apparatus, and a display apparatus.
The reproducing apparatus and the display apparatus are connected to each other through an HDMI cable. The display apparatus is, for example, a TV.
The content may be recorded on the optical disc serving as the recording medium, and the optical disc on which the content is recorded is provided to the reproducing apparatus side.
When the optical disc is inserted, the reproducing apparatus reads out and reproduces the content recorded on it.
A High Dynamic Range (HDR) video, which is a video having a dynamic range equal to or larger than the dynamic range (luminance range) that can be displayed on a display having standard luminance, is input to the recording apparatus.
The recording apparatus can record the input main HDR video on the optical disc as it is.
Standard video (STD video) is video with a dynamic range that can be displayed on a display with standard brightness. In the case where the dynamic range of the STD video is 0-100%, the dynamic range of the HDR video is represented as a range of 0% to 101% or more, for example, 0-500% or 0-1000%.
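As a small illustration of the ranges described above, the percentages can be read against an assumed standard-display peak. The constant and function below are illustrative assumptions (taking 100% to equal 100 nits), not values taken from the patent:

```python
# Illustrative only: express a dynamic-range percentage as peak luminance,
# assuming that 100% corresponds to a 100-nit standard display.
STD_WHITE_NITS = 100.0  # assumed standard peak luminance

def range_percent_to_nits(percent: float) -> float:
    """Convert a dynamic-range upper bound in percent (e.g. 400 for a
    0-400% range) to peak luminance in nits under the assumption above."""
    return percent / 100.0 * STD_WHITE_NITS

# An HDR range of 0-400% then corresponds to a 400-nit peak,
# while the STD range of 0-100% corresponds to a 100-nit peak.
```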
Also, the recording apparatus can record the main HDR video on the optical disc after converting the input main HDR video into an STD video, that is, after converting it into a video having a dynamic range that can be displayed on a display having standard luminance.
The video recorded by the recording apparatus is encoded by, for example, High Efficiency Video Coding (HEVC).
Information representing luminance characteristics of a main HDR video and information used when converting the HDR video into an STD video or when converting the STD video into the HDR video are inserted within encoded data of HEVC as Supplemental Enhancement Information (SEI). An HEVC stream, which is encoded data of HEVC with SEI inserted, is recorded on the
The reproducing apparatus communicates with the display apparatus through the HDMI cable, acquires information regarding the display performance of the display apparatus, and specifies whether the display apparatus is an apparatus including an HDR monitor or an apparatus including an STD monitor.
Also, the reproducing apparatus reads out the HEVC stream recorded on the optical disc and decodes the encoded data contained in the HEVC stream.
For example, when the video data obtained by decoding is data of an HDR video, and when the display apparatus includes an HDR monitor, the reproducing apparatus outputs the data of the HDR video to the display apparatus as it is.
On the other hand, when the video data obtained by decoding is data of an HDR video, and when the display apparatus includes an STD monitor, the reproducing apparatus converts the HDR video obtained by decoding into an STD video and outputs the data of the STD video.
When the video data obtained by decoding is data of an STD video, and when the display apparatus includes an HDR monitor, the reproducing apparatus converts the STD video obtained by decoding into an HDR video and outputs the data of the HDR video to the display apparatus.
On the other hand, when the video data obtained by decoding is data of an STD video, and when the display apparatus includes an STD monitor, the reproducing apparatus outputs the data of the STD video to the display apparatus as it is.
The display apparatus receives the video data transmitted from the reproducing apparatus and displays the images of the content on the monitor.
For example, when information representing the luminance characteristics of the main HDR video is transmitted together with the video data, the display apparatus recognizes that the video data transmitted from the reproducing apparatus is data of an HDR video.
In this case, the display apparatus displays the images of the HDR video in accordance with the characteristics specified by the transmitted information.
Since the luminance characteristics of the main HDR video can be specified, the creator of the content can display an image with desired luminance.
Generally, a display device (e.g., TV) recognizes a video input from the outside as a video having a dynamic range of 0-100%. Also, when the display of the display device has a wider dynamic range than the input video, the display device displays an image while disadvantageously expanding the luminance according to the characteristics of the display. By specifying the luminance characteristics and by adjusting the luminance of the HDR video in accordance with the specified characteristics, it is possible to prevent the adjustment of the luminance unintended by the creator from being performed on the display device side.
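The decision described above, whether the brightness of incoming video needs adjustment, comes down to comparing the declared luminance characteristic with the display's reported capability. A hedged sketch follows; representing both pieces of information as a single peak-nits number, and the function name itself, are illustrative assumptions:

```python
# Hedged sketch of the comparison described above; reducing the
# characteristic and capability information to peak nits is an assumption.
def needs_adjustment(characteristic_peak_nits: float,
                     capability_peak_nits: float) -> bool:
    """Return True when the video's declared luminance characteristic
    exceeds what the display reports it can show, in which case the
    brightness of the received video requires adjustment."""
    return characteristic_peak_nits > capability_peak_nits
```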
Also, a reproducing apparatus that outputs video to a display apparatus (e.g., a TV) typically outputs the video after converting the luminance according to the characteristics of the transmission line, and the display apparatus having received the video displays the images after converting the luminance of the received video according to the characteristics of its display. Since the luminance is not converted in the reproducing apparatus and the HDR video is output from the reproducing apparatus as it is, the number of luminance conversions is reduced, and images having luminance closer to that of the main video can be displayed on the display apparatus.
Meanwhile, when the video data transmitted from the reproducing apparatus is data of an STD video, the display apparatus displays the images of the STD video. The fact that an STD video is transmitted from the reproducing apparatus means that the display apparatus is an apparatus including an STD monitor.
Hereinafter, the mode in which the main HDR video is recorded as it is on the optical disc is referred to as mode-i, as appropriate.
Also, the mode in which the main HDR video is recorded on the optical disc after being converted into an STD video is referred to as mode-ii.
[ Signal processing in mode-i ]
Fig. 2 is a diagram showing an example of signal processing in mode-i.
The processing on the left side enclosed by the solid line L1 represents the encoding processing performed within the recording apparatus, and the processing on the right side enclosed by the solid line L2 represents the decoding processing performed within the reproducing apparatus.
When the main HDR video is input, the recording apparatus detects the luminance of the main HDR video and generates HDR information, which is information representing the luminance characteristics of the main HDR video.
As shown in fig. 2, the recording apparatus encodes the main HDR video by HEVC.
As shown in fig. 2, the recording apparatus also converts the main HDR video into an STD video and thereby generates tone mapping definition information for HDR-STD conversion.
The tone mapping definition information is information defining a correlation between each pixel value representing luminance of a dynamic range of 0-400% or the like (i.e., a dynamic range wider than the standard dynamic range) and each pixel value representing luminance of a dynamic range of 0-100% as the standard dynamic range.
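As a minimal sketch of such definition information, the correlation can be held as an explicit table from wide-dynamic-range code values to standard-range code values. The table contents below are invented toy values for illustration only:

```python
# Toy sketch: tone mapping definition information as an explicit
# correlation between wide-dynamic-range and standard-range code values.
toy_definition = {0: 0, 256: 128, 512: 200, 1023: 255}  # invented values

def convert_pixel(definition: dict, wide_value: int) -> int:
    """Look up the standard-range code value correlated with a
    wide-range code value, assuming the definition lists it explicitly."""
    return definition[wide_value]
```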
As shown in fig. 2, the recording apparatus inserts the HDR information and the tone mapping definition information into the encoded data as SEI of HEVC to generate an HEVC stream. The recording apparatus records the generated HEVC stream on the optical disc in the BD format and provides the optical disc to the reproducing apparatus.
As described above, by using the form in which the information is inserted into the stream as SEI of HEVC, the information representing the luminance characteristics of the main HDR video and the information used when converting the HDR video into an STD video are supplied to the reproducing apparatus.
The reproducing apparatus reads out the HEVC stream from the optical disc and extracts the HDR information and the tone mapping definition information from the SEI of the HEVC stream. The reproducing apparatus also decodes the encoded data of HEVC.
Also, as shown in fig. 2, when the display apparatus includes an HDR monitor, the reproducing apparatus adds the HDR information to the HDR video data obtained by decoding the encoded data and outputs the result to the display apparatus.
On the other hand, as shown in fig. 2, when the display apparatus includes an STD monitor, the reproducing apparatus converts the HDR video obtained by decoding the encoded data into an STD video by using the tone mapping definition information for HDR-STD conversion extracted from the HEVC stream, and outputs the data of the STD video.
As described above, HDR video data obtained by decoding encoded data of HEVC is output to the display apparatus together with the HDR information, or is output to the display apparatus after being converted into an STD video.
Fig. 3 is a diagram showing a flow of processes from when the main HDR video is input to the recording apparatus until the video data is output from the reproducing apparatus.
As shown in fig. 3, the main HDR video is supplied to the reproducing apparatus together with the HDR information and the tone mapping definition information for HDR-STD conversion, which are generated in the recording apparatus on the basis of the main HDR video.
When the display apparatus includes an HDR monitor, in the reproducing apparatus, the HDR information is added to the HDR video data obtained by decoding the encoded data of HEVC, and the data is output to the display apparatus.
On the other hand, when the display apparatus includes an STD monitor, in the reproducing apparatus, the HDR video obtained by decoding the encoded data of HEVC is converted into an STD video, and the data of the STD video is output to the display apparatus.
As described above, in mode-i, the main HDR video is recorded as it is on the optical disc. Also, in accordance with the performance of the display apparatus serving as the output destination, switching is performed between outputting the HDR video as it is after adding the HDR information and outputting the HDR video after converting it into an STD video.
[ Signal processing in mode-ii ]
Fig. 4 is a diagram showing an example of signal processing in mode-ii.
When the main HDR video is input, the recording apparatus detects the luminance of the main HDR video and generates HDR information, as in mode-i.
As shown in fig. 4, the recording apparatus converts the main HDR video into an STD video.
As shown in fig. 4, the recording apparatus generates tone mapping definition information for STD-HDR conversion on the basis of the conversion.
Also, as shown in fig. 4, the recording apparatus encodes the STD video by HEVC.
As shown in fig. 4, the recording apparatus inserts the HDR information and the tone mapping definition information into the encoded data as SEI of HEVC to generate an HEVC stream, records the generated HEVC stream on the optical disc in the BD format, and provides the optical disc to the reproducing apparatus.
The reproducing apparatus reads out the HEVC stream from the optical disc and extracts the HDR information and the tone mapping definition information from the SEI of the HEVC stream. The reproducing apparatus also decodes the encoded data of HEVC.
Also, as shown in fig. 4, when the display apparatus includes an HDR monitor, the reproducing apparatus converts the STD video obtained by decoding the encoded data into an HDR video by using the tone mapping definition information for STD-HDR conversion, adds the HDR information, and outputs the data of the HDR video to the display apparatus.
On the other hand, as shown in fig. 4, when the display apparatus includes an STD monitor, the reproducing apparatus outputs the STD video data obtained by decoding the encoded data to the display apparatus as it is.
As described above, STD video data obtained by decoding encoded data of HEVC is output to the display apparatus together with the HDR information after being converted into an HDR video, or is output to the display apparatus as it is.
Fig. 5 is a diagram showing a flow of processes from when the main HDR video is input to the recording apparatus until the video data is output from the reproducing apparatus.
As shown in fig. 5, the main HDR video is converted into an STD video in the recording apparatus and is then supplied to the reproducing apparatus together with the HDR information and the tone mapping definition information for STD-HDR conversion, which are generated on the basis of the main HDR video.
When the display apparatus includes an HDR monitor, in the reproducing apparatus, the STD video obtained by decoding the encoded data of HEVC is converted into an HDR video, the HDR information is added, and the data is output to the display apparatus.
On the other hand, when the display apparatus includes an STD monitor, in the reproducing apparatus, the STD video data obtained by decoding the encoded data of HEVC is output to the display apparatus as it is.
As described above, in mode-ii, the main HDR video is converted into an STD video and recorded on the optical disc. Also, in accordance with the performance of the display apparatus serving as the output destination, switching is performed between outputting the STD video after converting it into an HDR video and adding the HDR information, and outputting the STD video as it is.
The detailed configurations and operations of the recording apparatus, the reproducing apparatus, and the display apparatus are described later.
<2. HEVC>
Herein, a description of HEVC is provided.
Fig. 6 is a diagram illustrating a configuration of an access unit of HEVC;
an HEVC stream is configured by an access unit, which is a set of Network Abstraction Layer (NAL) units. Video data of a single picture is included within a single access unit.
As shown in fig. 6, a single access unit is configured by an access unit delimiter (AU delimiter), a Video Parameter Set (VPS), a Sequence Parameter Set (SPS), a Picture Parameter Set (PPS), SEI, a Video Coding Layer (VCL), an end of sequence (EOS), and an end of stream (EOS).
The AU delimiter indicates the head of the access unit. The VPS includes metadata representing the content of the bitstream. The SPS includes information, such as the picture size and the coding tree block (CTB) size, that the HEVC decoder needs to refer to throughout the decoding process of the sequence. The PPS includes information that the HEVC decoder needs to refer to in order to perform the decoding process of a picture. The VPS, SPS, and PPS are used as header information.
The SEI is auxiliary information including timing information of each picture, information on random access, and the like. The HDR information and the tone mapping definition information are contained within tone mapping information, which is one kind of SEI. The VCL is the data of a single picture. The end of sequence (EOS) represents the end position of a sequence, and the end of stream (EOS) represents the end position of the stream.
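The access-unit layout described above can be sketched as a simple ordered list. This is not a NAL parser; it only records the order and roles of the units named in the text:

```python
# Sketch of the HEVC access-unit layout described above (order only).
ACCESS_UNIT_ORDER = [
    "AU delimiter",       # head of the access unit
    "VPS", "SPS", "PPS",  # used as header information
    "SEI",                # auxiliary info, incl. tone mapping information
    "VCL",                # data of a single picture
    "end of sequence",    # end position of a sequence
    "end of stream",      # end position of the stream
]

def is_header_info(unit: str) -> bool:
    """VPS, SPS, and PPS serve as header information."""
    return unit in ("VPS", "SPS", "PPS")
```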
Fig. 7 is a diagram illustrating syntax of tone mapping information.
By using the tone mapping information, the color of a picture obtained by decoding is converted in accordance with the performance of the display serving as the output destination of the picture. It is to be noted that the line numbers and colons on the left side in fig. 7 are shown for convenience of description; they are not information contained in the tone mapping information. The main information contained in the tone mapping information is described below.
tone_map_id on the second line is identification information of the tone mapping information. The object of the tone mapping information is identified by tone_map_id.
For example, an ID for mode-i and an ID for mode-ii are secured. When the recording mode is mode-i, the ID of mode-i is set in tone_map_id of the tone mapping information within the SEI inserted into the encoded data of the HDR video. Also, when the recording mode is mode-ii, the ID of mode-ii is set in tone_map_id of the tone mapping information within the SEI inserted into the encoded data of the STD video. On the optical disc, one of the ID of mode-i and the ID of mode-ii is set in tone_map_id.
tone_map_model_id represents the model of the tone map used for the conversion.
In the recording apparatus, one piece of tone mapping information in which any one of 0, 2, and 3 is set as the value of tone_map_model_id and one piece of tone mapping information in which 4 is set as the value of tone_map_model_id are generated.
As shown in fig. 8, tone mapping information in which any one of 0, 2, and 3 is set as the value of tone_map_model_id is used as the tone mapping definition information for HDR-STD conversion or STD-HDR conversion, and tone mapping information in which 4 is set as the value of tone_map_model_id is used as the HDR information.
Fig. 9 is a diagram showing an example of a tone curve drawn by tone mapping information of tone_map_model_id = 0.
The abscissa axis in fig. 9 represents coded_data (RGB values before conversion) and the ordinate axis represents target_data (RGB values after conversion). When the tone curve in fig. 9 is used, as indicated by the open arrow #151, RGB values equal to or lower than coded_data D1 are converted into the RGB value represented by min_value. Also, as indicated by the open arrow #152, RGB values equal to or higher than coded_data D2 are converted into the RGB value represented by max_value.
Tone mapping information of tone_map_model_id = 0 is used as tone mapping definition information for HDR-STD conversion. When tone mapping information of tone_map_model_id = 0 is used, luminance equal to or higher than max_value and equal to or lower than min_value (luminance represented by RGB values) is lost; however, the load of the conversion process becomes lighter.
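The clipping behaviour described for tone_map_model_id = 0 can be sketched as below. The linear segment between D1 and D2 is an assumption made for illustration; the SEI itself carries min_value and max_value:

```python
# Hedged sketch of the tone_map_model_id = 0 conversion in fig. 9:
# values at or below D1 clip to min_value, values at or above D2 clip
# to max_value; the linear mid-segment is an illustrative assumption.
def tone_map_model_0(coded_data: int, d1: int, d2: int,
                     min_value: int, max_value: int) -> int:
    if coded_data <= d1:
        return min_value
    if coded_data >= d2:
        return max_value
    # linear segment between (D1, min_value) and (D2, max_value)
    return min_value + (coded_data - d1) * (max_value - min_value) // (d2 - d1)
```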
Fig. 10 is a diagram showing an example of a step function drawn by tone mapping information of tone_map_model_id = 2.
When the step function in fig. 10 is used, a single target_data value is assigned to each interval of coded_data values.
Tone mapping information of tone_map_model_id = 2 is used as tone mapping definition information for STD-HDR conversion or for HDR-STD conversion. Since the data amount of tone mapping information of tone_map_model_id = 2 is larger than that of the other models, its generation imposes a heavier load, although the conversion itself is simple.
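A step function of this kind can be sketched as a table lookup. The field name start_of_coded_interval follows the HEVC tone mapping SEI, but the toy implementation and table values here are illustrative:

```python
# Hedged sketch of tone_map_model_id = 2: a step function given as a
# table of interval starts (start_of_coded_interval per the HEVC SEI).
def tone_map_model_2(coded_data: int, start_of_coded_interval: list) -> int:
    """Return the largest target value whose interval start does not
    exceed coded_data; the table is assumed sorted ascending."""
    target = 0
    for t, start in enumerate(start_of_coded_interval):
        if coded_data >= start:
            target = t
    return target

# toy table: four target values covering a 10-bit coded range
starts = [0, 256, 512, 768]
```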
Fig. 11 is a diagram showing an example of a polyline function drawn by tone mapping information of tone_map_model_id = 3.
When the polyline function in fig. 11 is used, for example, coded_data = D11 is converted into target_data = D11' and coded_data = D12 is converted into target_data = D12'. Tone mapping information of tone_map_model_id = 3 is used as tone mapping definition information for STD-HDR conversion or for HDR-STD conversion.
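The polyline model can be sketched as piecewise-linear interpolation through pivot points. The representation of pivots as (coded_value, target_value) pairs is an illustrative assumption:

```python
# Hedged sketch of tone_map_model_id = 3: piecewise-linear interpolation
# through (coded_value, target_value) pivot points, assumed sorted.
def tone_map_model_3(coded_data: float, pivots: list) -> float:
    x0, y0 = pivots[0]
    if coded_data <= x0:
        return y0
    for x1, y1 in pivots[1:]:
        if coded_data <= x1:
            # interpolate on the segment between consecutive pivots
            return y0 + (coded_data - x0) * (y1 - y0) / (x1 - x0)
        x0, y0 = x1, y1
    return y0  # beyond the last pivot, hold its target value
```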
As described above, tone mapping information in which any one of 0, 2, and 3 is set as the value of tone_map_model_id is used as the tone mapping definition information for STD-HDR conversion or HDR-STD conversion and is transmitted from the recording apparatus to the reproducing apparatus.
Fig. 12 is a diagram showing an example of each piece of information contained in HDR information.
The axis of abscissa in fig. 12 represents pixel values of RGB. When the bit length is 10 bits, each pixel value is a value in the range of 0-1023. The ordinate axis in fig. 12 represents luminance. The curve L11 represents the relationship between the pixel value and the luminance of a display having a standard luminance. The dynamic range of a display with standard brightness is 0-100%.
ref_screen_luminance_white represents the luminance (cd/m²) of a standard display. extended_range_white_level represents the maximum value of the luminance of the dynamic range after extension. In the case of fig. 12, 400 is set as the value of extended_range_white_level.
nominal_black_level_code_value represents the pixel value of black (luminance 0%) in a display having standard luminance, and nominal_white_level_code_value represents the pixel value of white (luminance 100%).
In the case of fig. 12, extended_white_level_code_value represents the pixel value of white in the dynamic range after extension.
The luminance characteristics of the HDR video are represented by a curve L12, in which values nominal _ black _ level _ code _ value, nominal _ white _ level _ code _ value, and extended _ white _ level _ code _ value are 0%, 100%, and 400%, respectively.
As described above, the luminance characteristics of the main HDR video are represented by tone mapping information in which 4 is set as the value of tone_map_model_id, and are transmitted from the recording device to the reproducing device together with the tone mapping definition information.
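For reference, the five HDR-information parameters above can be gathered in a plain container. The field names follow the text; the sample values are illustrative assumptions (100 cd/m² reference white, 400% extended range, hypothetical 10-bit code values), not values mandated by the format.

```python
# Sketch: the HDR-information parameters described above, as a container.
from dataclasses import dataclass

@dataclass
class HDRInfo:
    ref_screen_luminance_white: int       # cd/m2 of the reference display
    extended_range_white_level: int       # max luminance after expansion, in %
    nominal_black_level_code_value: int   # pixel value of black (0%)
    nominal_white_level_code_value: int   # pixel value of white (100%)
    extended_white_level_code_value: int  # pixel value of extended-range white

# Illustrative instance mirroring fig. 12 (0% / 100% / 400%).
info = HDRInfo(100, 400, 64, 940, 1023)
```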
<3, BD Format >
Here, a description of the BD-ROM format is provided.
[ data management Structure ]
FIG. 13 is a diagram showing an example of a management structure of an AV stream having a BD-ROM format.
Management of AV streams, including HEVC streams, is performed using two layers, namely playlists (PlayList) and clips (Clip). In some cases, the AV stream may be recorded not only on the optical disc but also in a local storage of the reproducing device.
A pair of a single AV stream and clip information is managed as information related to the AV stream. A pair of a single AV stream and clip information is called a clip.
An AV stream is developed on a time axis, and an access point of each clip is specified within a playlist mainly by a time stamp. For example, the clip information is used to find an address at which decoding starts in the AV stream.
A playlist is a set of reproduction parts of an AV stream. A single reproduction section within an AV stream is called a play item. The play item is represented on the time axis by a pair of IN point and OUT point within the reproduction section. As shown in fig. 13, the playlist is configured of a single or a plurality of play items.
The first playlist on the left in fig. 13 is configured with two play items, and by the two play items, the front and rear portions of the AV stream contained in the clip on the left are referred to.
The second playlist on the left is configured with a single play item, and by play item, the entire AV stream contained in the clip on the right is referred to.
The third playlist on the left is configured with two play items, and with two play items, a certain part of the AV stream contained in the clip on the left and a certain part of the AV stream contained in the clip on the right are referred to.
For example, when a play item on the left included in the first play list on the left is indicated as a target of reproduction by a disk navigation program, reproduction of the front portion of the AV stream included in the clip on the left referred to by the play item is performed. As described above, the playlist serves as reproduction management information for managing reproduction of the AV stream.
In a playlist, a reproduction path made up of a line of one or more play items (PlayItem) is called a Main Path. Also, in a playlist, a reproduction path running in parallel with the Main Path and made up of a line of one or more sub play items (SubPlayItem) is called a Sub Path.
Fig. 14 is a diagram showing the structure of Main Path and Sub Path.
The playlist includes a single Main Path and one or more Sub Paths. The playlist in fig. 14 has a Main Path made up of a line of three play items, and three Sub Paths.
The play items configuring the Main Path are each provided with an ID in order from the top. The Sub Paths are also provided with IDs in order from the top, namely SubPath_id = 0, SubPath_id = 1, and SubPath_id = 2.
In the example in fig. 14, a single SubPlayItem is contained within the Sub Path of SubPath_id = 0, two SubPlayItems are contained within the Sub Path of SubPath_id = 1, and a single SubPlayItem is contained within the Sub Path of SubPath_id = 2.
The AV stream referred to by a single playitem includes at least a video stream (main image data). The AV stream may include one or more audio streams reproduced simultaneously (in synchronization) with the video stream contained within the AV stream, or may not include any audio stream.
The AV stream may include one or more streams of bitmap subtitle data (presentation graphics (PG)) reproduced in synchronization with a video stream contained within the AV stream, or may not include any stream of subtitle data.
The AV stream may include one or more streams of Interactive Graphics (IG) reproduced in synchronization with a video stream contained in the AV stream file, or may not include any stream of interactive graphics. The IG stream is used to display graphics, such as buttons operated by a user.
In an AV stream referred to by a single play item, a video stream, and an audio stream, PG stream, and IG stream synchronized with the video stream are multiplexed.
A SubPlayItem refers to a video stream, an audio stream, a PG stream, or the like of a stream different from the AV stream to which the play item refers.
As described above, reproduction of an AV stream including an HEVC stream is performed using a playlist and clip information. The playlist and clip information including information related to reproduction of the AV stream are appropriately referred to as database information.
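The two-layer management model described above can be sketched as data structures. All class and field names here are illustrative, not BD-ROM syntax: a playlist is a set of play items, each pointing at a section (IN/OUT pair) of the AV stream of one clip.

```python
# Sketch of the playlist/clip management model (names are illustrative).
from dataclasses import dataclass

@dataclass
class Clip:
    av_stream: str   # e.g. "00001.m2ts"
    clip_info: str   # e.g. "00001.clpi"

@dataclass
class PlayItem:
    clip: Clip
    in_time: int     # start of the reproduction section (time stamp)
    out_time: int    # end of the reproduction section

@dataclass
class PlayList:
    play_items: list

# Like the first playlist in fig. 13: two play items referring to the
# front and rear portions of the AV stream of one clip.
clip = Clip("00001.m2ts", "00001.clpi")
pl = PlayList([PlayItem(clip, 0, 600), PlayItem(clip, 600, 1200)])
```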
[ Directory structure ]
Fig. 15 is a diagram showing an example of a management structure of files recorded on the optical disc.
Each file recorded on the optical disc is managed hierarchically by a directory structure. One root directory is created on the optical disc.
The BDMV directory is located below the root directory.
An index file provided with the name "index.bdmv" and a MovieObject file provided with the name "MovieObject.bdmv" are stored in the BDMV directory.
The PLAYLIST directory, CLIPINF directory, STREAM directory, etc. are located under the BDMV directory.
The PLAYLIST directory stores playlist files describing playlists. Each playlist file is named by a combination of 5 digits and the extension ".mpls". The playlist file shown in fig. 15 is provided with the file name "00000.mpls".
The CLIPINF directory stores clip information files. Each clip information file is named by a combination of 5 digits and the extension ".clpi". The three clip information files shown in fig. 15 are provided with the file names "00001.clpi", "00002.clpi", and "00003.clpi".
Stream files are stored in the STREAM directory. Each stream file is named by a combination of 5 digits and the extension ".m2ts". The three stream files shown in fig. 15 are provided with the file names "00001.m2ts", "00002.m2ts", and "00003.m2ts".
A clip information file and a stream file whose file names share the same 5 digits constitute a single clip. The clip information file "00001.clpi" is used when the stream file "00001.m2ts" is reproduced, and the clip information file "00002.clpi" is used when the stream file "00002.m2ts" is reproduced. As will be described later, information related to HDR video processing is contained in the clip information file used for reproducing an AV stream including an HEVC stream.
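The 5-digit pairing rule can be expressed as a one-line helper. The function name is an assumption for illustration only.

```python
# Sketch: pairing a stream file with its clip information file by the
# shared 5-digit prefix, as described above.
def clip_info_for(stream_file):
    stem, ext = stream_file.rsplit(".", 1)
    # A clip's stream file is "NNNNN.m2ts"; its info file is "NNNNN.clpi".
    assert ext == "m2ts" and len(stem) == 5 and stem.isdigit()
    return stem + ".clpi"
```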
[ grammar of Each document ]
Here, the main parts of the syntax of each file are described.
Fig. 16 is a diagram showing the syntax of a playlist file.
The PLAYLIST file is stored in the PLAYLIST directory in fig. 15, and is a file provided with an extension ". mpls".
AppInfoPlayList () stores parameters related to reproduction control of a playlist, for example, reproduction restriction.
PlayList () stores parameters related to Main Path and Sub Path.
PlayListMark () stores mark information of a play list, in other words, PlayListMark () stores information related to a mark as a jumping destination (jumping point) in a user operation, a command, or the like, which commands chapter jumping.
Fig. 17 is a diagram showing the syntax of a clip information file.
The clip information file is stored in the CLIPINF directory in fig. 15, and is a file provided with an extension ". clpi".
ClipInfo () stores information, for example, information representing the type of AV stream configuring a clip, information representing the recording rate of the AV stream, and the like.
The SequenceInfo () includes information indicating the position of source packets configuring the AV stream on the time axis, information indicating the clock time of display, and the like.
ProgramInfo () includes information related to PID configuring an AV stream of a clip, information related to encoding of the AV stream, and the like.
Fig. 18 is a diagram illustrating the syntax of ProgramInfo () in fig. 17.
number_of_program_sequences represents the number of program sequences described in ProgramInfo(). A program sequence is made up of a line of source packets constituting a program.
SPN_program_sequence_start[i] represents the source packet number at the beginning of each program sequence.
StreamCodingInfo includes information related to encoding of an AV stream configuring the clip.
Fig. 19 is a diagram illustrating the syntax of StreamCodingInfo in fig. 18.
stream_coding_type represents the coding method of an elementary stream contained in the AV stream. For example, in StreamCodingInfo of clip information used for reproducing an HEVC stream, a value indicating that the coding method is HEVC is set as stream_coding_type.
video_format represents the video scanning method. In video_format used for reproducing an HEVC stream, a value indicating a 4K scanning method, for example 2160p (2160-line progressive), is set as video_format.
frame_rate represents the frame rate of the video stream.
aspect_ratio represents the aspect ratio of the video.
cc_flag is a 1-bit flag and indicates whether closed caption data is contained within the video stream.
HDR_flag is a 1-bit flag indicating whether an HDR video is recorded as the main video. For example, HDR_flag = 1 indicates that an HDR video is recorded as the main video, and HDR_flag = 0 indicates that an STD video is recorded as the main video.
mode_flag is a 1-bit flag representing the recording mode of the HEVC stream, and becomes valid when HDR_flag = 1. For example, mode_flag = 1 indicates that the recording mode is mode-i, and mode_flag = 0 indicates that the recording mode is mode-ii.
As described above, the clip information includes: a flag indicating whether an HEVC stream contained within an AV stream reproduced using clip information is a stream in which a main video is an HDR video; and a flag representing a recording mode of the HEVC stream.
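The semantics of the two flags can be summarized in a small sketch. The function is illustrative, not part of any specification; it simply restates the flag meanings given above.

```python
# Sketch: interpreting HDR_flag and mode_flag as described above.
def describe_stream(hdr_flag, mode_flag):
    if hdr_flag == 0:
        return "STD video recorded as main video"
    # mode_flag is only valid when HDR_flag = 1.
    mode = "mode-i" if mode_flag == 1 else "mode-ii"
    return "HDR video, recording mode " + mode
```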
By referring to the flags included in the clip information, the reproducing device can recognize, without analyzing the HEVC stream itself, whether the main video of the stream is an HDR video and, if so, its recording mode.
<4, configuration of each apparatus >
Here, the configuration of each device is described.
[ Configuration of the recording device ]
Fig. 20 is a block diagram showing an exemplary configuration of the
The
The
In the
The
The
Fig. 21 is a block diagram showing an exemplary configuration of the
The
The HDR
When the recording mode is mode-i,
The HDR-
Fig. 22 is a diagram showing an example of signal processing performed by the HDR-
As shown at the end of the
The HDR-
Also, as shown at the end of the
Fig. 23 is a diagram showing an example of tone mapping.
As shown in fig. 23, for example, by compressing the high luminance component and expanding the intermediate and low luminance components, the RGB signal of the HDR video is converted into the RGB signal of the STD video. As shown in fig. 23, information representing a function F that correlates the RGB signal of the HDR video and the RGB signal of the STD video is generated by the definition
Returning to the description of fig. 21, when the recording mode is mode-ii, the HDR-
From the information supplied from the HDR-
For example, when the tone _ map _ model _ id is used to be 0, the definition
Also, when the tone _ map _ model _ id is used equal to 2, the definition
Also, when tone _ map _ model _ id is used 3, the definition
Depending on the recording mode, the HEVC
[ Configuration of reproducing device 2 ]
Fig. 24 is a block diagram showing an exemplary configuration of the
The reproducing
The
The
The
The
The
The
The
The
Fig. 25 is a block diagram showing an exemplary configuration of the
The
The HEVC stream read out from the
The
On the other hand, when the recording mode is the mode-ii, and when the HDR video is output to the
Also, the
The
The HDR-
The STD-
In outputting the HDR video to the
In outputting the STD video to the
The
[ arrangement of display device 3 ]
Fig. 26 is a block diagram showing an exemplary configuration of the
The
The
For example, the
The
The
<5 operation of each apparatus >
Herein, the operation of each device having the above-described configuration is described.
[ RECORDING PROCESSING ]
First, with reference to the flowchart in fig. 27, the recording process of the
In step S1, the
When it is determined in step S1 that the recording mode is mode-i, in step S2, the
On the other hand, when it is determined in step S1 that the recording mode is mode-ii, in step S3, the
In step S4, the database information generation unit 21A executes database information generation processing. The playlist file and clip information file generated by the database information generation processing are supplied to the
In step S5, the
Next, with reference to the flowchart in fig. 28, the encoding process in mode-i performed in step S2 in fig. 27 is described.
In step S11, the HDR
In step S12,
In step S13, the HDR-
In step S14, the definition
In step S15, the HEVC
Next, with reference to the flowchart in fig. 29, the encoding process in mode-ii performed in step S3 in fig. 27 is described.
In step S21, the HDR
In step S22, the HDR-
In step S23, the definition
In step S24, the
In step S25, the HEVC
Next, with reference to the flowchart in fig. 30, the database information generation process executed in step S4 in fig. 27 is described.
In step S31, the database information generating unit 21A of the
In step S32, the database information generation unit 21A generates clip information including the HDR _ flag and the mode _ flag within StreamCodingInfo of ProgramInfo (). In this example, since the main video is the HDR video, database information generation unit 21A sets 1, which indicates that the main video is the HDR video, as a value HDR _ flag.
Also, in step S2 in fig. 27, when the encoding process is performed in mode-i, the database information generation unit 21A sets 1 indicating that the recording mode is mode-i to the value mode _ flag. On the other hand, in step S3 in fig. 27, when the encoding process is performed in the mode-ii, the database information generation unit 21A sets 0 indicating that the recording mode is the mode-ii as the value mode _ flag. Subsequently, the process returns to step S4 in fig. 27, and then, processing is performed.
In the
[ REPRODUCING PROCESSING ]
Next, with reference to the flowchart in fig. 31, the reproduction processing of the
At a predetermined time, for example, at the start of reproduction of the
In step S41, the
In step S42, the
In step S43, the
When it is determined in step S43 that the recording mode is mode-i, in step S44, the decoding process in mode-i is performed.
On the other hand, when it is determined in step S43 that the recording mode is mode-ii, in step S45, the decoding process in mode-ii is performed.
After the decoding process is performed in step S44 or step S45, the process ends.
It is noted that here, although it is determined whether the recording mode is mode-i according to the value of mode_flag, the decision may be made according to the tone_map_id of the tone mapping information inserted within the HEVC stream.
Next, with reference to the flowchart in fig. 32, the decoding process in mode-i performed in step S44 in fig. 31 is described.
In step S61,
In step S62, the
In step S63, the
When it is determined in step S63 that the display included in the
On the other hand, when it is determined in step S63 that the display contained in the
In step S66, the STD
In step S67, after the HDR video is output in step S64, or after the STD video is output in step S66, the
Upon determining in step S67 that the reproduction is not ended, the
Next, with reference to the flowchart in fig. 33, the decoding process in mode-ii performed in step S45 in fig. 31 is described.
In step S81,
In step S82, the
In step S83, the
When it is determined in step S83 that the display contained within the
In step S85, the HDR
On the other hand, when it is determined in step S83 that the display included in the
In step S87, after the HDR video is output in step S84, or after the STD video is output in step S86, the
[ Display processing ]
Next, with reference to a flowchart in fig. 34, a display process of the
Herein, a case where the
In step S101, the
In step S102, the
For example, when the dynamic range of the HDR video specified by the HDR information is 0-400% and the dynamic range of the display is 0-500%, the HDR video can be displayed as it is. On the other hand, when the dynamic range of the display is 0-300%, the luminance of the HDR video must be adjusted before display.
When it is determined in step S102 that the HDR video can be displayed as it is, in step S103, the
On the other hand, when it is determined in step S102 that the HDR video cannot be displayed as it is, in step S104, the
After displaying the image of the HDR video in step S103 or step S104, in step S105, the
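The capability check of step S102 can be sketched as a comparison of percentage dynamic ranges, following the example values in the text. The function name is an illustrative assumption.

```python
# Sketch of the decision in steps S102-S104: compare the dynamic range
# declared in the HDR information with the display's own capability
# (100% = standard luminance).
def can_display_as_is(hdr_range_percent, display_range_percent):
    """True if the display covers the full dynamic range of the HDR video."""
    return display_range_percent >= hdr_range_percent

# Example from the text: a 0-400% HDR video fits on a 0-500% display,
# but on a 0-300% display the luminance must be adjusted first.
```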
Through the above sequence processing, the
Also, the
In reproducing the HDR video, by enabling the luminance characteristics of the main HDR video to be specified by the HDR information, the creator of the content can display the image of the HDR video with desired luminance.
<6, modification >
[ storage position of flag ]
Although it is described above that the HDR _ flag and the mode _ flag are stored in clip information, the HDR _ flag and the mode _ flag may be stored in a playlist.
First example of a storage location
Fig. 35 is a diagram showing an example of syntax of AppInfoPlayList () contained within the playlist file in fig. 16.
As described above, AppInfoPlayList () stores parameters related to reproduction control of a playlist, for example, reproduction restriction. In the example in fig. 35, after MVC _ Base _ view _ R _ flag, HDR _ flag and mode _ flag are continuously described.
As described above, in AppInfoPlayList () of a playlist file, an HDR _ flag and a mode _ flag can be described.
Second example of storage location
Fig. 36 is a diagram showing the syntax of PlayList () contained within the PlayList file in fig. 16.
number_of_PlayItems indicates the number of play items in the playlist. In the case of the example in fig. 14, the number of play items is 3. PlayItem_id is assigned a number as its value, starting from 0, in the order in which PlayItem() appears in the playlist.
number_of_SubPaths represents the number of Sub Paths within the playlist. In the case of the example in fig. 14, the number of Sub Paths is 3. SubPath_id is assigned a number as its value, starting from 0, in the order in which SubPath() appears in the playlist.
As shown in fig. 36, in the playlist, PlayItem() is described as many times as the number of play items, and SubPath() is described as many times as the number of Sub Paths.
Fig. 37 is a diagram showing the syntax of PlayItem() in fig. 36.
Clip_Information_file_name[0] indicates the name of the clip information file of the clip referred to by the play item. Clip_codec_identifier[0] indicates the codec system of the clip.
IN_time denotes the start position of the reproduction section of the play item, and OUT_time denotes the end position. After OUT_time, UO_mask_table(), PlayItem_random_access_mode, and still_mode are included.
The STN _ table () contains information of the AV stream referred to by the playitem. In the case of having a Sub Path reproduced in association with a play item, AV stream information referred to by subplayitems constituting the Sub Path is also included.
Fig. 38 is a diagram illustrating syntax of STN _ table () in fig. 37.
number_of_video_stream_entries represents the number of video streams entered (registered) in STN_table(). number_of_audio_stream_entries represents the number of first audio streams entered in STN_table(). number_of_audio_stream2_entries represents the number of second audio streams entered in STN_table().
number_of_PG_textST_stream_entries represents the number of PG_textST streams entered in STN_table(). A PG_textST stream is a presentation graphics (PG) stream, i.e., run-length encoded bitmap subtitles, or a text subtitle file (textST). number_of_IG_stream_entries represents the number of interactive graphics (IG) streams entered in STN_table().
stream_entry() and stream_attributes(), as information on each of the video stream, the first audio stream, the second audio stream, the PG_textST stream, and the IG stream, are described in STN_table(). The PID of the stream is contained in stream_entry(), and the attribute information of the stream is contained in stream_attributes().
Fig. 39 is a diagram showing an example of the description related to a video stream within stream_attributes() in fig. 38.
In the example of stream_attributes() in fig. 39, stream_coding_type, video_format, and frame_rate are described as attribute information of the video stream, followed by HDR_flag and mode_flag. Note that stream_coding_type represents the coding method of the video stream, video_format represents the video format, and frame_rate represents the frame rate of the video.
As described above, the HDR_flag and the mode_flag may be described in STN_table() of the playlist file.
The HDR_flag and the mode_flag may also be described at another position in the playlist file, other than AppInfoPlayList() and STN_table(). In a similar manner, the HDR_flag and the mode_flag may be described at another position within the clip information file, instead of in StreamCodingInfo described with reference to fig. 19.
The positions where the HDR_flag and the mode_flag are described are selectable; for example, one of the HDR_flag and the mode_flag may be described in a clip information file and the other in a playlist file.
[ PSR ]
Fig. 40 is a diagram showing an example of allocation of PSRs.
As described above, the
The HDR_capability_flag is stored in PSR29, which is the PSR with the PSR number 29. For example, a value of 1 of the HDR_capability_flag of PSR29 indicates that the reproducing device supports processing of HDR video.
For example, when an optical disc in which the value of HDR flag of clip information is set to 1 is inserted, in other words, when an optical disc in which main HDR video is recorded is inserted, the
PSR25 is the PSR with the PSR number 25, serving as a PSR to record information indicating the HDR video support of the connected display. Information indicating the performance of the display is recorded therein.
For example, the HDR_display_capability_flag and information indicating a luminance specification are stored in PSR25, the PSR for the HDR display function. A value of 1 of the HDR_display_capability_flag indicates that the connected display is capable of displaying HDR video, and a value of 0 indicates that the connected display cannot display HDR video.
For example, information indicating the degree (in percentage) of brightness that can be displayed is stored as information indicating the brightness specification.
Instead of using PSR25 for the HDR display function, the HDR_display_capability_flag and the information representing the luminance specification may be stored in PSR23, which is a PSR for the display function.
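The register assignments above can be modelled in a short sketch. The register numbers (PSR29, PSR25) follow the text; the dict layout and field encoding are illustrative assumptions, since the actual bit layout is not given here.

```python
# Sketch: player-status registers as described above (layout is assumed).
psr = {
    29: {"HDR_capability_flag": 1},                     # player supports HDR
    25: {"HDR_display_capability_flag": 1,              # display supports HDR
         "luminance_percent": 400},                     # displayable brightness
}

def player_supports_hdr(psr):
    return psr[29]["HDR_capability_flag"] == 1

def display_supports_hdr(psr):
    return psr[25]["HDR_display_capability_flag"] == 1
```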
<7, exemplary case of adjusting luminance on the reproducing apparatus side >
In the above description, when the HDR video transmitted from the reproducing
[ Signal processing in mode-i ]
Fig. 41 is a diagram showing an example of signal processing in mode-i in the case where adjustment of the luminance of the HDR video is performed by the reproducing
Among the processes shown in fig. 41, the process performed by the
The reproducing
Also, as shown at the end of
For example, when the dynamic range of the HDR video represented by the HDR information is 0-400%, and when the dynamic range representing the
When adjusting the luminance of the HDR video, the reproducing
The reproducing
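The player-side luminance adjustment described here can be sketched with a simple linear scale standing in for the actual tone mapping, which the text does not fully specify; the function name and scaling rule are illustrative assumptions.

```python
# Sketch of player-side adjustment: compress an HDR video whose dynamic
# range exceeds the display capability down to that capability.
def adjust_luminance(sample_percent, hdr_max, display_max):
    if hdr_max <= display_max:
        return sample_percent              # fits as-is, no adjustment needed
    # Illustrative linear compression in place of real tone mapping.
    return sample_percent * display_max / hdr_max

# Example from the text: a 0-400% video on a 0-300% display.
```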
[ Signal processing in mode-ii ]
Fig. 42 is a diagram showing an example of signal processing in mode-ii in the case where adjustment of the luminance of the HDR video is performed by the reproducing
Among the processes shown in fig. 42, the process performed by the
The reproducing
Also, as shown at the end of
As shown at the end of
The reproducing
As described above, when the
From the HDR information, the
[ Configuration of reproducing device 2 ]
Fig. 43 is a block diagram showing an exemplary configuration of the HDR
The HDR
The
In accordance with the adjustment result of the
Decoding process of
Herein, the decoding process in mode-i performed in step S44 in fig. 31 is described with reference to the flowchart in fig. 44. In the process in fig. 44, the luminance adjustment of the HDR video is appropriately performed.
Among the processes shown in fig. 44, the processes of steps S151 to S153 and S158 to S160 are the same as those of steps S61 to S63 and S65 to S67 of fig. 32, respectively. Duplicate descriptions are appropriately omitted.
In step S151, the
In step S152, the
In step S153, the
Upon determining in step S153 that the display included in the
When it is determined in step S154 that the HDR video cannot be displayed as it is, in step S155, the
In step S156, the
In step S157, the HDR
When it is determined in step S154 that the HDR video can be displayed as it is, the processing of steps S155 and S156 is skipped. In the above case, in step S157, the HDR
In step S160, it is determined whether the reproduction is ended, and upon determining that the reproduction is ended, the process is ended. Then, the process returns to step S44 in fig. 31, and then, processing is performed.
Next, with reference to the flowchart in fig. 45, the decoding process in mode-ii performed in step S45 in fig. 31 is described. In the process in fig. 45, the luminance adjustment of the HDR video is appropriately performed.
Among the processes shown in fig. 45, the processes of steps S171 to S174, S179, and S180 are the same as the processes of steps S81 to S84, S86, and S87 of fig. 33, respectively. Duplicate descriptions are appropriately omitted.
In step S171, the
In step S172, the
In step S173, the
When it is determined in step S173 that the display included in the
In step S175, the
When it is determined in step S175 that the HDR video cannot be displayed as it is, in step S176, the
In step S177, the
In step S178, the HDR
When it is determined in step S175 that the HDR video can be displayed as it is, the processing of steps S176 and S177 is skipped. In the above case, in step S178, the HDR
In step S180, it is determined whether the reproduction is ended, and upon determining that the reproduction is ended, the process is ended. Then, the process returns to step S45 in fig. 31, and then, processing is performed.
[ display processing by display device 3 ]
Next, with reference to a flowchart in fig. 46, a display process of the
After the processing in fig. 44 or 45 performed by the
In step S191, the
In step S192, the
In step S193, the
As described above, when the reproducing
When the luminance of the HDR video needs to be adjusted, the user of the reproducing
Alternatively, the
When the
The following conditions apply: the parameter for adjustment differs between the luminance adjustment performed by the
By causing the
Notification of whether the luminance adjustment of the HDR video is performed on the reproducing
<8, exemplary application to HDMI >
[ HDR EDID and HDR InfoFrame ]
Fig. 47 is a diagram showing one example of identification based on information transmitted and received through HDMI.
As shown on the left side in fig. 47, the reproducing
When the HDR EDID is contained in the EDID read out from the
As shown on the right in fig. 47, the reproducing
The HDR InfoFrame is an InfoFrame that includes information related to the specification of HDR video. HDR information representing the luminance characteristics of the HDR video is transmitted using the HDR InfoFrame. The reproducing device adds the HDR InfoFrame to the video data of the HDR video and transmits it to the display device.
When the HDR InfoFrame is added to the video data transmitted from the reproducing
Fig. 48 is a diagram showing another example of identification based on information transmitted and received through HDMI.
As shown on the left side in fig. 48, when HDR EDID is not contained in EDID read out from the
On the other hand, as shown on the right in fig. 48, when the HDR InfoFrame is not added to the video data transmitted from the reproducing
As described above, the HDR information can be transmitted from the
Fig. 49 is a diagram showing an example of HDR EDID.
Included within the HDR EDID are information representing the maximum brightness of the display, information representing the maximum extension level, and the raw/processed flag-1. The raw/processed flag-1 indicates whether the HDR video should be output as it is (raw) or output after its luminance has been adjusted as necessary (processed).
A value of 1 of the raw/processed flag-1 indicates that raw output is requested, in other words, that the display device itself adjusts the luminance of the HDR video as necessary. A value of 0 indicates that processed output is requested.
For example, if there is a function of adjusting the luminance of the HDR video, the
Also, the
For example, if there is no function for adjusting the luminance of the HDR video, the
The decoding processing in fig. 32 or fig. 33, which performs luminance adjustment not on the
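The output-mode selection driven by the HDR EDID can be sketched as follows; the function and field names are illustrative assumptions, and the fallback behaviour (raw output when the player cannot adjust luminance) follows the text.

```python
# Sketch: choosing the player's output mode from the sink's HDR EDID.
def choose_output_mode(hdr_edid, player_can_adjust):
    """Return 'std', 'raw', or 'processed' output."""
    if hdr_edid is None:
        return "std"                       # sink reports no HDR capability
    if hdr_edid["raw_processed_flag_1"] == 1:
        return "raw"                       # display adjusts luminance itself
    # Display asks for adjusted output; fall back to raw output if the
    # player has no luminance-adjustment function.
    return "processed" if player_can_adjust else "raw"
```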
Hereinafter, the output of the reproducing
Fig. 50 is a diagram showing an example of the HDR InfoFrame.
The HDR InfoFrame includes ref_screen_luminance_white, extended_range_white_level, nominal_black_level_code_value, nominal_white_level_code_value, and extended_white_level_code_value as parameters of the HDR information, as well as the raw/processed flag-2.
The raw/processed flag-2 indicates whether the output HDR video is the original HDR video on which no luminance adjustment has been made, or an HDR video on which luminance adjustment has been made.
A value of 1 of the raw/processed flag-2 indicates that the output HDR video is the original HDR video on which no luminance adjustment was performed on the reproducing device side, and a value of 0 indicates that luminance adjustment was performed on the reproducing device side.
Also, the
In the decoding process in fig. 32 or 33 in which the luminance adjustment is not performed on the reproducing
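Assembling the InfoFrame fields above can be sketched as a payload dict; the helper name and grouping are assumptions for illustration, while the field names and the flag-2 semantics follow the text.

```python
# Sketch: building an HDR InfoFrame payload from the HDR information.
def make_hdr_infoframe(hdr_info, adjusted):
    frame = dict(hdr_info)                  # the five HDR-information values
    # flag-2 = 1 means raw output (no player-side luminance adjustment).
    frame["raw_processed_flag_2"] = 0 if adjusted else 1
    return frame

frame = make_hdr_infoframe({"ref_screen_luminance_white": 100,
                            "extended_range_white_level": 400},
                           adjusted=False)
```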
[ Processing of reproducing device 2 and display device 3 ]
Here, the processing of the reproducing
First, with reference to a flowchart in fig. 51, a process of the
In step S211, the
In step S212, the
Next, with reference to the flowchart in fig. 52, the reproduction processing of the
In step S221, the
In step S222, the
In step S223, the
In step S224, the
When it is determined that HDR EDID is contained in step S224, the
In step S226, the
When it is determined in step S226 that the raw output is requested, in step S227, the
When it is determined in step S226 that the original output is not requested, in step S228, the
On the other hand, when it is determined in step S224 that HDR EDID is not contained, in step S229, the
After the video data is output in step S227, S228, or S229, the process ends.
Next, with reference to the flowchart in fig. 53, the HDR/raw output processing performed in step S227 in fig. 52 is described.
In step S241, the
In step S242, the
In step S243, the
When it is determined in step S243 that the recording mode is mode-ii, in step S244, the STD-
In step S245, the HDR
In step S246, the HDR
In step S247, the
Next, referring to the flowchart in fig. 54, the HDR/processed output processing performed in step S228 in fig. 52 is described.
In step S261, the
In step S262, the
In step S263, the
When it is determined in step S263 that the recording mode is mode-ii, in step S264, the STD-
In step S265, the
When it is determined in step S265 that the HDR video cannot be displayed as it is, in step S266, the
In step S267, the
In step S268, the HDR
For example, when the luminance of the HDR video is not adjusted, the HDR
On the other hand, when adjusting the luminance of the HDR video, the HDR
In step S269, the HDR
In step S270, the
Next, with reference to the flowchart in fig. 55, the STD output process executed in step S229 in fig. 52 is described.
As described above, the processing in fig. 55 is a processing of outputting video data to a display apparatus which is different from the
In step S281, the
In step S282, the
In step S283, the
When it is determined in step S283 that the recording mode is mode-i, in step S284, the HDR-
In step S285, the STD
In step S286, the
Next, with reference to a flowchart in fig. 56, a display process of the
The HDR InfoFrame is added to the video data transmitted by the
In step S301, the
In step S302, the controller 110 determines whether the data of the HDR video is data on which the raw output was performed, according to the raw/processed flag-2 contained in the HDR InfoFrame.
When the value of the raw/processed flag-2 is 1, the controller 110 determines that the data of the HDR video is data on which the raw output was performed. Likewise, when the value is 0, the controller 110 determines that the data is data on which the processed output was performed.
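On the display side, this check reduces to reading one flag from the received HDR InfoFrame. A small sketch, modeling the InfoFrame as a dictionary (the branch labels only name the routing; what each branch does at the display is described in steps S303 and S304):

```python
def display_decision(hdr_infoframe: dict) -> str:
    """Steps S302-S304 sketch: the display routes the received HDR video
    according to raw/processed flag-2 in the HDR InfoFrame
    (1 = raw output was performed, 0 = processed output was performed)."""
    if hdr_infoframe["raw_processed_flag_2"] == 1:
        return "display path for raw output (S303)"
    return "display path for processed output (S304)"
```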
When it is determined in step S302 that the data of the HDR video is data on which the raw output was performed, in step S303, the
On the other hand, when it is determined in step S302 that the data of the HDR video is the data on which the processed output is performed, in step S304, the
After displaying the image of the HDR video in step S303 or in step S304, in step S305, the
Through the above sequence processing, the
<9. Other modifications>
Although the HDR information is added when the HDR video is transmitted from the reproducing
Also, a case where the reproducing
Also, although the content reproduced by the
[ exemplary configuration of computer ]
The above-described sequence processing may be implemented by hardware or by software. When the sequence processing is implemented by software, a program constituting the software is installed from a program recording medium onto a computer embedded in dedicated hardware, a general-purpose personal computer, or the like.
Fig. 57 is a block diagram showing an exemplary hardware configuration of a computer that executes the above-described sequence processing by a program.
The
Also, an input/
In the computer configured in the above manner, for example, the
The program executed by the
Note that the program executed by the computer may be a program that performs processing chronologically according to the order described in this specification or may be a program that performs processing in a parallel manner or at a necessary time (for example, at the time of making a call).
The embodiments of the present technology are not limited to the above-described embodiments, and various changes may be made without departing from the scope of the present technology.
It is to be noted that in the present specification, a system means a set of multiple elements (devices, modules (components), and the like), regardless of whether all of the elements are accommodated in the same housing. Therefore, multiple devices accommodated in separate housings and connected to each other through a network, and a single device in which multiple modules are accommodated in a single housing, are each a system.
Exemplary combination of configurations
The present technology can also be configured in the following manner.
(1) A reproduction apparatus comprising:
a readout unit configured to read out the encoded data, the luminance characteristic information, and the luminance conversion definition information from a recording medium on which the encoded data of the extended video that is a video having a second luminance range wider than the first luminance range is recorded, the luminance characteristic information representing the luminance characteristic of the extended video, and the luminance conversion definition information used when luminance conversion of the extended video to a standard video that is a video having the first luminance range is performed;
a decoding unit configured to decode the encoded data;
a conversion unit configured to convert an extended video obtained by decoding the encoded data into a standard video according to the luminance conversion definition information; and
an output unit configured to output the data of the extended video and the luminance characteristic information to a display apparatus capable of displaying the extended video, and configured to output the data of the standard video to a display apparatus incapable of displaying the extended video.
(2) The reproduction apparatus according to (1),
wherein the luminance characteristic information and the luminance conversion definition information are inserted into a stream including the encoded data as auxiliary information of the encoded data and recorded in the recording medium.
(3) The reproduction apparatus according to the above (2),
wherein the encoded data is encoded data of HEVC, and the luminance characteristic information and the luminance conversion definition information are SEI of the HEVC stream.
(4) The reproduction apparatus according to the above (3),
wherein the luminance conversion definition information is first tone mapping information in which any one of
wherein the luminance characteristic information is second tone mapping information in which 4 is set as the value of tone_map_model_id.
(5) The reproduction apparatus according to the above (4),
wherein the tone_map_model_id of the first tone mapping information and the tone_map_model_id of the second tone mapping information are each set with the same value representing a recording mode of the recording medium.
(6) The reproduction apparatus according to any one of (1) to (5),
wherein information relating to reproduction of the encoded data is further recorded in the recording medium, the information including a flag indicating whether recording of the extended video is in progress as a main video, and
wherein the decoding unit decodes the encoded data when the flag indicates that recording of the extended video is ongoing as a main video.
(7) The reproduction apparatus according to the above (6),
wherein the recording medium is a Blu-ray disc, and
wherein the flag is included in a clip information file used as reproduction-related information.
(8) The reproduction apparatus according to the above (6),
wherein the recording medium is a Blu-ray disc, and
wherein the flag is included in a playlist file used as reproduction-related information.
(9) A reproduction method, comprising:
a step of reading out the encoded data, the luminance characteristic information, and the luminance conversion definition information from a recording medium on which the encoded data of the extended video as a video having a second luminance range wider than the first luminance range is recorded, the luminance characteristic information indicating the luminance characteristic of the extended video, and the luminance conversion definition information used when performing luminance conversion of the extended video into a standard video as a video having the first luminance range;
a step of decoding the encoded data;
a step of converting an extended video obtained by decoding the encoded data into a standard video according to the luminance conversion definition information;
outputting the data of the extended video and the luminance characteristic information to a display device capable of displaying the extended video; and
and outputting data of the standard video to a display device incapable of displaying the extended video.
(10) A recording medium,
wherein the following are recorded:
Encoded data of an extended video which is a video having a second luminance range wider than the first luminance range,
luminance characteristic information representing luminance characteristics of the extended video, and
luminance conversion definition information used when luminance conversion of an extended video to a standard video as a video having a first luminance range is performed, and
wherein a reproducing apparatus that reproduces the recording medium performs the following processing:
reading out the encoded data, the luminance characteristic information, and the luminance conversion definition information from the recording medium,
decoding the encoded data,
converting an extended video obtained by decoding the encoded data into a standard video according to the luminance conversion definition information,
outputting the data of the extended video and the luminance characteristic information to a display device capable of displaying the extended video; and
and outputting the data of the standard video to a display device which cannot display the extended video.
(11) A reproduction apparatus comprising:
a readout unit configured to read out encoded data, luminance characteristic information, and luminance conversion definition information from a recording medium on which encoded data of a standard video (the standard video being a video having a first luminance range) obtained by performing luminance conversion of an extended video that is a video having a second luminance range wider than the first luminance range is recorded, luminance characteristic information representing luminance characteristics of the extended video, and luminance conversion definition information used when luminance conversion of the standard video into the extended video is performed;
a decoding unit configured to decode the encoded data;
a conversion unit configured to convert a standard video obtained by decoding the encoded data into an extended video according to the luminance conversion definition information; and
an output unit configured to output the data of the extended video and the luminance characteristic information to a display apparatus capable of displaying the extended video, and configured to output the data of the standard video to a display apparatus incapable of displaying the extended video.
(12) The reproduction apparatus according to the above (11),
wherein the luminance characteristic information and the luminance conversion definition information are inserted into a stream including the encoded data as auxiliary information of the encoded data and recorded in the recording medium.
(13) The reproduction apparatus according to the above (12),
wherein the encoded data is encoded data of HEVC, and the luminance characteristic information and the luminance conversion definition information are SEI of the HEVC stream.
(14) The reproduction apparatus according to the above (13),
wherein the luminance conversion definition information is first tone mapping information in which any one of
wherein the luminance characteristic information is second tone mapping information in which 4 is set as the value of tone_map_model_id.
(15) The reproduction apparatus according to (14),
wherein the tone_map_model_id of the first tone mapping information and the tone_map_model_id of the second tone mapping information are each set with the same value representing a recording mode of the recording medium.
(16) The reproduction apparatus according to any one of (11) to (15),
wherein information relating to reproduction of the encoded data is further recorded in the recording medium, the information including a flag indicating whether recording of the extended video is in progress as a main video, and
wherein the decoding unit decodes the encoded data when the flag indicates that recording of the extended video is ongoing as a main video.
(17) The reproduction apparatus according to (16),
wherein the recording medium is a Blu-ray disc, and
wherein the flag is included in a clip information file used as reproduction-related information.
(18) The reproduction apparatus according to (16),
wherein the recording medium is a Blu-ray disc, and
wherein the flag is included in a playlist file used as reproduction-related information.
(19) A reproduction method, comprising:
a step of reading out the encoded data, the luminance characteristic information, and the luminance conversion definition information from a recording medium on which encoded data of a standard video (the standard video being a video having a first luminance range) obtained by performing luminance conversion of an extended video that is a video having a second luminance range wider than the first luminance range is recorded, the luminance characteristic information representing a luminance characteristic of the extended video, and the luminance conversion definition information used when luminance conversion of the standard video into the extended video is performed;
a step of decoding the encoded data;
a step of converting a standard video obtained by decoding the encoded data into an extended video according to the luminance conversion definition information;
outputting the data of the extended video and the luminance characteristic information to a display device capable of displaying the extended video; and
and outputting data of the standard video to a display device incapable of displaying the extended video.
(20) A recording medium,
wherein the following are recorded:
Encoded data of a standard video obtained by performing luminance conversion of an extended video which is a video having a second luminance range wider than the first luminance range (the standard video being a video having the first luminance range),
luminance characteristic information representing luminance characteristics of the extended video, and
luminance conversion definition information used when luminance conversion from standard video to extended video is performed, and
wherein a reproducing apparatus that reproduces the recording medium performs the following processing:
reading out the encoded data, the luminance characteristic information, and the luminance conversion definition information from the recording medium,
decoding the encoded data,
converting a standard video obtained by decoding the encoded data into an extended video according to the luminance conversion definition information,
outputting the data of the extended video and the luminance characteristic information to a display device capable of displaying the extended video; and
and outputting the data of the standard video to a display device which cannot display the extended video.
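Configurations (1) to (10) and (11) to (20) describe two complementary pipelines: a mode-i medium records the extended (HDR) video as master and the player converts down to standard (STD) video for incapable displays, while a mode-ii medium records the STD video and the player converts up to HDR for capable displays. A combined sketch, assuming hypothetical function names for decode and the two conversions:

```python
def reproduce(encoded_master, recording_mode, display_supports_hdr,
              decode, hdr_to_std, std_to_hdr, luminance_info):
    """End-to-end sketch of the two reproduction pipelines.

    mode-i:  master is extended (HDR) video; down-convert for STD displays.
    mode-ii: master is standard (STD) video; up-convert for HDR displays.
    The conversions use the luminance conversion definition information
    read from the recording medium.
    """
    video = decode(encoded_master)
    if display_supports_hdr:
        if recording_mode == "mode-ii":
            video = std_to_hdr(video)          # restore the extended video
        # Output the extended video together with its luminance characteristics.
        return video, luminance_info
    if recording_mode == "mode-i":
        video = hdr_to_std(video)              # down-convert for the display
    # Output the standard video; no luminance characteristic info is attached.
    return video, None
```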
REFERENCE SIGNS LIST
1: recording apparatus
2: reproducing apparatus
3: display device
11: optical disk
21: controller
21A: database information generation unit
22: encoding processing unit
23: magnetic disk drive
31: HDR information generating unit
32: HEVC (high efficiency video coding) encoder
33: HDR-STD conversion unit
34: definition information generating unit
35: HEVC stream generation unit
51: controller
52: magnetic disk drive
53: memory device
56: decoding processing unit
58: HDMI communication unit
71: parameter extraction unit
72: HEVC decoder
73: HDR-STD conversion unit
74: STD-HDR conversion unit
75: Output unit