Image pickup apparatus, control method therefor, and program


Note: this technique, 摄像设备及其控制方法和程序 (Image pickup apparatus, control method therefor, and program), was created by 西口达也 and 大森昇 on 2020-02-18. Its main content is as follows. According to the present invention, even when recording RAW image data, the image capturing apparatus records developed image data corresponding to a selected dynamic range into a RAW image file. To this end, the image pickup apparatus includes: an image pickup unit; a selection unit for selecting a dynamic range; a development unit configured to perform development processing on the RAW image data obtained by the image pickup unit; and a control unit configured to control, when recording the RAW image data obtained by the image pickup unit as a RAW image file, such that if the selection unit selects a first dynamic range, image data obtained by subjecting the RAW image data to development processing for the first dynamic range by the development unit is recorded as the RAW image file together with the RAW image data, and if the selection unit selects a second dynamic range, image data obtained by subjecting the RAW image data to development processing for the second dynamic range by the development unit is recorded as the RAW image file together with the RAW image data.

1. An image pickup apparatus characterized by comprising:

an image pickup unit;

a selection unit for selecting a dynamic range;

a development unit configured to perform development processing on the RAW image data obtained by the image pickup unit; and

a control unit for controlling, when the RAW image data obtained by the image pickup unit is recorded as a RAW image file, so that, in a case where the selection unit selects a first dynamic range, image data obtained by subjecting the RAW image data to development processing for the first dynamic range by the development unit is recorded as the RAW image file together with the RAW image data, and

in a case where the selection unit selects a second dynamic range, image data obtained by subjecting the RAW image data to development processing for the second dynamic range by the development unit is recorded as the RAW image file together with the RAW image data.

2. The image pickup apparatus according to claim 1, wherein, when the developed image data is recorded as an image file,

the control means controls so that, in a case where the selection means selects the first dynamic range, image data obtained by the development means performing development processing for the first dynamic range on RAW image data obtained by the image pickup means is recorded as an image file having a first format, and

in a case where the selection means selects the second dynamic range, image data obtained by subjecting the RAW image data to development processing for the second dynamic range by the development means is recorded as an image file having a second format.

3. The image pickup apparatus according to claim 2, wherein,

in a case where the selection section selects the first dynamic range, the control section records the image data processed by the development section as a JPEG file, and

in a case where the selection section selects the second dynamic range, the control section records the image data processed by the development section as an HEIF file.

4. The image capturing apparatus according to claim 2 or 3, further comprising:

encoding means for encoding the image data subjected to the development processing by the development means,

wherein the image data subjected to the development processing for the first dynamic range and the image data subjected to the development processing for the second dynamic range are encoded using different encoding formats.

5. The image capturing apparatus according to claim 4, wherein the image data subjected to the development processing for the first dynamic range is encoded in JPEG format, and the image data subjected to the development processing for the second dynamic range is encoded in HEVC format.

6. The image capturing apparatus according to any one of claims 1 to 5, wherein the second dynamic range is wider than the first dynamic range.

7. The image capturing apparatus according to claim 6, wherein the first dynamic range is SDR, and the second dynamic range is HDR.

8. The image capturing apparatus according to any one of claims 1 to 7, wherein,

in a case where the first dynamic range is selected, the development section performs development processing so that image data whose color components each have 8 bits is generated, and

in a case where the second dynamic range is selected, the development section performs development processing so that image data whose color components each exceed 8 bits is generated.

9. The image capturing apparatus according to any one of claims 1 to 8, further comprising:

generation means for generating image data having a plurality of sizes from the image data obtained by the development processing,

wherein, when the RAW image data obtained by the image pickup means is recorded as a RAW image file, the control means records the image data having the plurality of sizes generated by the generation means as the RAW image file together with the RAW image data.

10. The apparatus according to claim 9, wherein the plurality of sizes to be generated by said generation means include at least a thumbnail size and the image size represented by the RAW image data.

11. The apparatus according to claim 1, wherein, when the RAW image data obtained by said image pickup means is recorded as a RAW image file,

the control means controls so that, in a case where developed image data obtained by subjecting the RAW image data to the development processing for the first dynamic range by the development means is recorded as an image file different from the RAW image file, compressed image data obtained by subjecting the developed image data subjected to the development processing for the first dynamic range by the development means to compression encoding processing with a first encoding format is recorded as an image for display in the RAW image file and the image file, respectively, and

in a case where developed image data obtained by subjecting the RAW image data to the development processing for the second dynamic range by the development means is recorded as an image file different from the RAW image file, compressed image data obtained by subjecting the developed image data subjected to the development processing for the second dynamic range by the development means to compression encoding processing having a second encoding format is recorded as an image for display in the RAW image file and the image file, respectively.

12. The apparatus according to claim 11, wherein said control means controls such that:

compressed image data obtained by subjecting development image data obtained by subjecting the RAW image data to development processing for the first dynamic range by the development means to compression encoding processing having the first encoding format by the encoding means is recorded as an image file different from the RAW image file, and

compressed image data obtained by subjecting development image data obtained by subjecting the RAW image data to development processing for the second dynamic range by the development means to compression encoding processing having the second encoding format is recorded as an image file different from the RAW image file.

13. The apparatus according to claim 11 or 12, wherein said control means controls such that:

developed image data obtained by subjecting the RAW image data to development processing for the first dynamic range by the development means is recorded as an image file having a first file format different from that of the RAW image data, and

developed image data obtained by subjecting the RAW image data to development processing for the second dynamic range by the development means is recorded as an image file having a second file format different from that of the RAW image data.

14. The apparatus according to claim 11 or 12, wherein said control means controls such that:

developed image data obtained by subjecting the RAW image data to development processing for the first dynamic range by the development means is recorded as an image file having a second file format different from that of the RAW image data, and

developed image data obtained by subjecting the RAW image data to development processing for the second dynamic range by the development means is recorded as an image file having the second file format different from that of the RAW image data.

15. The apparatus according to any one of claims 11 to 14, wherein when recording RAW image data obtained by the image capturing means as a RAW image file, the control means controls so that an encoding format of an image for display to be recorded in the RAW image file together with the RAW image data is switched in accordance with the dynamic range selected by the selection means.

16. The apparatus according to any one of claims 11 to 15, wherein in a case where developed image data obtained by subjecting the RAW image data to development processing by the developing means is recorded as an image file different from the RAW image file, the control means controls such that image data to be recorded in the image file as an image for display is recorded as image data for display of the RAW image data.

17. The apparatus according to any one of claims 11 to 16, wherein in a case where developed image data obtained by subjecting the RAW image data to development processing by the development means is recorded as an image file different from the RAW image file, the control means controls such that developed image data to be recorded as a main image of the image file is recorded as an image for display of the RAW image file.

18. A control method of an image pickup apparatus including an image pickup unit, characterized by comprising:

a selection step for selecting a dynamic range;

a development step of performing development processing on the RAW image data obtained by the image pickup unit; and

a control step of controlling, when recording the RAW image data obtained by the image pickup unit as a RAW image file, so that, in a case where a first dynamic range is selected in the selection step, image data obtained by subjecting the RAW image data to development processing for the first dynamic range in the development step is recorded as the RAW image file together with the RAW image data, and, in a case where a second dynamic range is selected in the selection step, image data obtained by subjecting the RAW image data to development processing for the second dynamic range in the development step is recorded as the RAW image file together with the RAW image data.

19. A program to be loaded into and executed by a computer including an image pickup unit, thereby causing the computer to execute:

a selection step for selecting a dynamic range;

a development step of performing development processing on the RAW image data obtained by the image pickup unit; and

a control step of controlling, when recording the RAW image data obtained by the image pickup unit as a RAW image file, so that, in a case where a first dynamic range is selected in the selection step, image data obtained by subjecting the RAW image data to development processing for the first dynamic range in the development step is recorded as the RAW image file together with the RAW image data, and, in a case where a second dynamic range is selected in the selection step, image data obtained by subjecting the RAW image data to development processing for the second dynamic range in the development step is recorded as the RAW image file together with the RAW image data.

Technical Field

The invention relates to an image pickup apparatus, a control method thereof, and a program.

Background

Recent image capturing apparatuses such as digital cameras are capable of capturing an HDR (high dynamic range) image and recording the HDR image on a recording medium. HDR is a technique of generating an image having a wider dynamic range than SDR (standard dynamic range). Further, a RAW image refers to an undeveloped original image.

When recording the content of an HDR video, the content is recorded together with identification information indicating whether the content is an HDR video (for example, Patent Document 1).

Reference list

Patent document

Patent Document 1: Japanese Patent Laid-Open No. 2018-7194

Disclosure of Invention

Technical problem

A general RAW image file may contain JPEG-compressed data as a developed image for display. However, since JPEG cannot represent HDR image quality, the display image cannot be saved with HDR image quality. Therefore, even when a RAW image captured in HDR is displayed on an HDR display, the RAW image must first be developed in order to check the image with HDR image quality.

The present invention has been made in view of the above problems, and provides a technique by which, even when RAW image data is recorded in an image pickup apparatus that supports a plurality of dynamic ranges such as SDR and HDR, developed image data corresponding to a selected dynamic range is recorded in a RAW image file, and therefore, an image having the selected dynamic range can be checked when the RAW image file is played back.

Means for solving the problems

In order to solve the above problem, the image pickup apparatus of the present invention has, for example, the following arrangement. Namely, the image pickup apparatus includes: an image pickup unit; a selection unit for selecting a dynamic range; a development unit for performing development processing on the RAW image data obtained by the image pickup unit; and a control unit configured to control, when recording the RAW image data obtained by the image pickup unit as a RAW image file, so that if the selection unit selects a first dynamic range, image data obtained by subjecting the RAW image data to development processing for the first dynamic range by the development unit is recorded as the RAW image file together with the RAW image data, and if the selection unit selects a second dynamic range, image data obtained by subjecting the RAW image data to development processing for the second dynamic range by the development unit is recorded as the RAW image file together with the RAW image data.
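
As a minimal sketch of this arrangement (hypothetical function names; the actual development and encoding steps are detailed in the embodiments below), the recording control can be pictured as follows:

```python
def record_raw_file(raw_data, dynamic_range, develop, encode, write_raw_file):
    """Record RAW data plus a developed display image matching the selected range."""
    if dynamic_range == "SDR":                       # first dynamic range
        developed = develop(raw_data, gamma="sRGB", bits_per_component=8)
        display_image = encode(developed, codec="JPEG")
    else:                                            # second dynamic range (HDR)
        developed = develop(raw_data, gamma="PQ", bits_per_component=10)
        display_image = encode(developed, codec="HEVC")
    # Both the undeveloped RAW data and the developed image go into the same
    # RAW image file, so playback can show the selected dynamic range without
    # developing the RAW data again.
    write_raw_file(raw=raw_data, display=display_image)
```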

Advantageous effects of the invention

According to the present invention, even when RAW image data is recorded in an image capturing apparatus that supports a plurality of types of dynamic ranges such as SDR and HDR, development image data corresponding to a selected dynamic range is recorded in a RAW image file, and therefore, for example, when simple display (such as file list display) is performed, an image having the selected dynamic range can be checked.

Alternatively, the present invention can prevent management from becoming complicated and can maintain playback compatibility when multiple types of files are recorded in an image pickup apparatus that supports multiple types of dynamic ranges.

Other features and advantages of the present invention will become apparent from the following description taken in conjunction with the accompanying drawings. Note that throughout the drawings, the same reference numerals denote the same or similar components.

Drawings

The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description, serve to explain the principles of the invention.

Fig. 1A is an external view of an image capturing/displaying apparatus according to an embodiment;

fig. 1B is an external view of an image capturing/displaying apparatus according to an embodiment;

fig. 2 is a block diagram showing a configuration of an image capturing/displaying apparatus according to an embodiment;

fig. 3 is a diagram showing a connection configuration with an external device;

fig. 4A-1 is a flowchart of LV photographing mode processing according to an embodiment;

fig. 4A-2 is a flowchart of LV photographing mode processing according to an embodiment;

fig. 4A-3 is a flowchart of LV photographing mode processing according to an embodiment;

fig. 4A-4 is a flowchart of LV photographing mode processing according to an embodiment;

fig. 4B-1 is a flowchart of quick review display processing;

fig. 4B-2 is a flowchart of quick review display processing;

fig. 5A is a sequence diagram of an HDMI connection process according to the embodiment;

fig. 5B is a sequence diagram of an HDMI connection process according to the embodiment;

fig. 5C is a sequence diagram of an HDMI connection process according to the embodiment;

fig. 6A-1 is a flowchart of an HDR shooting menu process according to an embodiment;

fig. 6A-2 is a flowchart of an HDR shooting menu process according to an embodiment;

fig. 6B-1 is a flowchart of HDR shooting menu processing according to the embodiment;

fig. 6B-2 is a flowchart of HDR shooting menu processing according to an embodiment;

fig. 7A is a flowchart of an HDR shooting process according to an embodiment;

fig. 7B is a flowchart of an HDR shooting process according to an embodiment;

fig. 8A is a diagram showing a configuration of a RAW file according to an embodiment;

fig. 8B is a diagram showing an example of ImageData (image data) area in the RAW file;

fig. 8C is a diagram showing an example of ImageData area in the RAW file;

fig. 8D is a diagram showing an example of ImageData area in the RAW file;

fig. 8E is a diagram showing an example of ImageData area in the RAW file;

fig. 9A is a flowchart of playback mode processing according to an embodiment;

fig. 9B is a flowchart of playback mode processing according to an embodiment;

fig. 9C is a flowchart of playback mode processing according to an embodiment;

fig. 9D is a flowchart of playback mode processing according to an embodiment;

fig. 9E-1 is a flowchart of playback mode processing according to an embodiment;

fig. 9E-2 is a flowchart of playback mode processing according to an embodiment;

fig. 9F is a flowchart of playback mode processing according to an embodiment;

fig. 9G is a flowchart of playback mode processing according to an embodiment;

fig. 9H is a flowchart of playback mode processing according to an embodiment;

fig. 10A is a flowchart of an HDMI playback process according to an embodiment;

fig. 10B is a flowchart of an HDMI playback process according to an embodiment;

fig. 11A is a flowchart of playback menu processing according to an embodiment;

fig. 11B is a flowchart of playback menu processing according to an embodiment;

fig. 12 is a flowchart of development processing according to an embodiment;

fig. 13 is a view showing a CxCy plane;

fig. 14A is a flowchart of the tone correction parameter generation process;

fig. 14B is a flowchart of the tone correction parameter generation process;

fig. 15A is a diagram showing a tone correction amount;

fig. 15B is a diagram showing a tone correction amount;

fig. 16A is a diagram showing an example of the appearance of an SDR;

fig. 16B is a diagram showing an example of the appearance of HDR;

fig. 17A is a diagram showing the configuration of the HEIF file;

fig. 17B is a diagram showing an example of ImageData area in the HEIF file;

fig. 17C is a diagram showing an example of ImageData area in the HEIF file;

fig. 17D is a diagram showing an example of ImageData area in the HEIF file;

fig. 18 is a diagram showing the configuration of a JPEG file;

fig. 19A is a flowchart of a shooting process according to a modification; and

fig. 19B is a flowchart of a shooting process according to a modification.

Detailed Description

The embodiments will be described in detail below with reference to the accompanying drawings. Note that the following embodiments are not intended to limit the scope of the claimed invention. A plurality of features are described in the embodiments, but the invention does not require all of these features, and the features may be combined as appropriate. Further, in the drawings, the same or similar configurations are denoted by the same reference numerals, and redundant description thereof will be omitted.

Fig. 1A and 1B show the appearance of the digital camera 100 as an example of the apparatus according to the present invention. Fig. 1A is a front perspective view of the digital camera 100, and fig. 1B is a rear perspective view of the digital camera 100.

Referring to fig. 1A and 1B, the display unit 28 is a display unit provided on the back of the camera, and displays images and various information. The out-of-finder display unit 43 is a display unit provided on the top surface of the camera, and displays various setting values of the camera, such as the shutter speed and the f-value. The shutter button 61 is an operation unit used by the user to issue a shooting instruction. The mode switch 60 is an operation unit for switching between various types of modes. The terminal cover 40 is a cover for protecting a connector (not shown) for a connection cable that connects an external device to the digital camera 100.

The main electronic dial 71 is a rotary operation member included in the operation unit 70. For example, the user can change a setting value such as the shutter speed or the f-value by rotating the main electronic dial 71. The power switch 72 is an operation member for turning the power of the digital camera 100 on and off. The sub electronic dial 73 is a rotary operation member included in the operation unit 70 and used to perform operations such as moving a selection frame and image feeding. The cross key 74 (four-way key) is included in the operation unit 70. The user can press the upper, lower, right, and left portions of the cross key 74, and pressing one of them makes it possible to perform the operation corresponding to the pressed portion. The setting button 75 is a button included in the operation unit 70, and is mainly used to decide a selected item. The LV button 76 is an operation button included in the operation unit 70 and used to turn live view (hereinafter referred to as LV) on and off in the still image shooting mode; in the moving image shooting mode, the button is used to issue an instruction to start or stop moving image shooting (recording). The zoom-in button 77 is an operation button included in the operation unit 70 and used to turn the zoom-in mode on and off in live view display in the shooting mode and to change the magnification in the zoom-in mode; in the playback mode, the button functions as a zoom-in button for enlarging the playback image and increasing the magnification. The zoom-out button 78 is an operation button included in the operation unit 70 and used to reduce the magnification of the enlarged playback image and shrink the displayed image. The playback button 79 is an operation button included in the operation unit 70 and used to switch between the shooting mode and the playback mode. Pressing the playback button 79 during the shooting mode shifts the camera to the playback mode and causes the display unit 28 or the external apparatus 300 to display the latest image among the images recorded on the recording medium 200.

The quick return mirror 12 is moved up and down by an actuator (not shown) according to an instruction from the system control unit 50. The communication terminal 10 allows the digital camera 100 to communicate with the (detachable) lens side. The eyepiece finder 16 is an observation finder for checking the focus and composition of an optical image of an object obtained through the lens unit 150 by observing the focusing screen 13. The cover 202 is a cover of the slot that accommodates the recording medium 200. The grip portion 90 is a holding unit shaped so that the user can easily grip it with his/her right hand while holding the digital camera 100.

Fig. 2 is a block diagram showing an example of the arrangement of the digital camera 100 according to the present embodiment.

Referring to fig. 2, the lens unit 150 includes an interchangeable lens.

Although the lens 103 is generally composed of a plurality of lenses, fig. 2 shows only one lens as the lens 103 for the sake of simplicity. The communication terminal 6 is a communication terminal used by the lens unit 150 to communicate with the digital camera 100 side. The communication terminal 10 is a communication terminal used by the digital camera 100 to communicate with the lens unit 150 side. The lens unit 150 communicates with the system control unit 50 via the communication terminals 6 and 10, and causes the internal lens system control circuit 4 to control the diaphragm 1 via the diaphragm drive circuit 2, and adjusts the focus by moving the position of the lens 103 via the AF drive circuit 3.

The AE sensor 17 measures the brightness of the object through the lens unit 150. The focus detection unit 11 outputs defocus amount information to the system control unit 50. The system control unit 50 controls the lens unit 150 to perform phase difference AF based on the input defocus amount information.

At the time of exposure, live view shooting, and moving image shooting, the quick return mirror 12 (hereinafter referred to as the mirror 12) is moved up and down by an actuator (not shown) in accordance with an instruction from the system control unit 50. The mirror 12 is a mirror for switching an incident beam from the lens 103 between the eyepiece finder 16 side and the imaging unit 22 side. The mirror 12 is typically used to reflect the light beam to be directed to the eyepiece finder 16. At the time of shooting or live view display, the mirror 12 is flipped up and retracted from the light beam (mirror up) to guide the light beam to the image pickup unit 22. In addition, the central portion of the mirror 12 is a half-transparent half mirror to transmit part of the light. The mirror 12 transmits part of the light beam to be incident on the focus detection unit 11 for focus detection.

A user (photographer) can check the focus and composition of an optical image of a subject obtained through the lens unit 150 by observing the focusing screen 13 via the pentaprism 14 and the eyepiece finder 16.

The shutter 101 is a focal plane shutter that can freely control the exposure time of the image pickup unit 22 under the control of the system control unit 50.

The image pickup unit 22 is an image sensor formed of a CCD or CMOS device that converts an optical image into an electric signal. Filters of the color components R, G, and B are periodically arranged two-dimensionally on the imaging plane of the image pickup unit 22. In each set of 2 × 2 adjacent filters, G component filters are arranged at the two diagonal positions, and an R component filter and a B component filter are arranged at the remaining two positions. These 2 × 2 filters are tiled over the imaging plane of the image pickup unit 22. This array is commonly referred to as the Bayer array. Therefore, an image represented by the signal (analog signal) output from the image pickup unit 22 is also a pixel signal having the Bayer array. The A/D converter 23 converts the analog signal of each pixel output from the image pickup unit 22 into, for example, a 10-bit digital signal. Note that the image data at this stage is Bayer-array image data (one pixel has one component, and each component has 10 bits) as described above, and is undeveloped image data. Therefore, the image data at this stage is referred to as RAW image data. Note that Bayer-array image data after compensation of defective pixels may also be referred to as RAW image data. Note also that in the present embodiment, the A/D converter 23 converts the analog signal into 10-bit digital data, but the number of bits is not particularly limited as long as it exceeds 8 bits. The larger the number of bits, the more tones can be expressed.
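
As an illustration of the Bayer layout just described, the following minimal Python sketch (hypothetical helper names; the 2 × 2 phase shown is only one of several possible variants) maps a pixel position to its color component and models the 10-bit sample range:

```python
import numpy as np

def bayer_channel(row: int, col: int) -> str:
    """Return the color component at (row, col) for a GRBG Bayer phase:
    G on the diagonal of every 2x2 cell, R and B on the remaining two."""
    if row % 2 == col % 2:
        return "G"
    return "R" if row % 2 == 0 else "B"

# One 10-bit sample per pixel (values 0..1023), i.e. undeveloped RAW data.
raw = np.random.randint(0, 1024, size=(4, 6), dtype=np.uint16)

print([bayer_channel(0, c) for c in range(4)])  # ['G', 'R', 'G', 'R']
print([bayer_channel(1, c) for c in range(4)])  # ['B', 'G', 'B', 'G']
```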

The image processing unit 24 performs resize processing (such as predetermined pixel interpolation or reduction) and color conversion processing on the data from the A/D converter 23 or the data from the memory control unit 15. In addition, the image processing unit 24 performs predetermined arithmetic processing using the captured image data, and the system control unit 50 performs exposure control and focus control based on the obtained arithmetic result. By this operation, the system control unit 50 performs TTL (through-the-lens) AF (auto focus) processing, TTL AE (auto exposure) processing, and TTL EF (electronic flash pre-emission) processing. The image processing unit 24 also performs predetermined arithmetic processing using the captured image data, and performs TTL AWB (auto white balance) processing based on the obtained arithmetic result. Further, the image processing unit 24 encodes/decodes image data under the control of the system control unit 50. The encoding formats include JPEG and HEVC: JPEG is used to encode image data having 8 bits per color component, while HEVC is used to encode image data having more than 8 bits per color component.
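
The bit-depth-based choice between the two encoding formats (and, per claim 3, between the JPEG and HEIF containers) can be summarized by the following sketch; the function names are hypothetical, not part of the apparatus:

```python
def choose_codec(bits_per_component: int) -> str:
    # JPEG handles 8-bit components; deeper data goes to HEVC (H.265).
    return "JPEG" if bits_per_component <= 8 else "HEVC"

def choose_container(dynamic_range: str) -> str:
    # Claim 3: SDR shots are recorded as JPEG files, HDR shots as HEIF files.
    return "JPEG" if dynamic_range == "SDR" else "HEIF"

assert choose_codec(8) == "JPEG" and choose_codec(10) == "HEVC"
assert choose_container("HDR") == "HEIF"
```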

The output data from the A/D converter 23 is written to the memory 32 via the image processing unit 24 and the memory control unit 15, or directly via the memory control unit 15. The memory 32 stores the image data obtained by the image pickup unit 22 and converted into digital data by the A/D converter 23, and the image data to be displayed on the display unit 28 or the external apparatus 300. The memory 32 has a storage capacity large enough to store a predetermined number of still images, and moving images and sound lasting for a predetermined time.

The memory 32 also functions as a memory (video memory) for image display. The D/A converter 19 converts the data for image display stored in the memory 32 into an analog signal and supplies it to the display unit 28. In this way, the image data for display written in the memory 32 is displayed by the display unit 28 via the D/A converter 19. The display unit 28 displays an image corresponding to the analog signal from the D/A converter 19 on a display device such as an LCD. By having the D/A converter 19 convert the digital signals, once A/D-converted by the A/D converter 23 and stored in the memory 32, back into analog signals and sequentially transferring them to the display unit 28 for display, the display unit 28 functions as an electronic viewfinder and performs through-image display (live view display).

The in-finder liquid crystal display unit 41 displays a frame (AF frame) indicating a focus point at which autofocusing is currently performed, an icon indicating a setting state of the camera, and the like via the in-finder display unit drive circuit 42. The out-of-finder display unit 43 displays various setting values of the camera, such as a shutter speed and an f-value, via the out-of-finder display unit drive circuit 44.

The digital output I/F 90 directly supplies the image data for display stored in the memory 32 to the external device 300 in the form of a digital signal. For example, the digital output I/F 90 outputs moving image data in streaming form in accordance with a communication protocol conforming to the HDMI (High-Definition Multimedia Interface) standard. The external device 300 thus displays the image data for display written in the memory 32.

The nonvolatile memory 56 is a memory capable of electrically erasing and recording data. For example, an EEPROM or the like is used as the nonvolatile memory 56. The nonvolatile memory 56 stores constants, programs, and the like for the operation of the system control unit 50. The program in this case is a program for executing various flowcharts described later in the present embodiment.

The system control unit 50 is a controller having at least one processor, and controls the entire digital camera 100. The system control unit 50 realizes the respective processes of the present embodiment (described later) by executing the programs recorded on the above-described nonvolatile memory 56. The system memory 52 is a RAM, into which constants, variables, the programs read out from the nonvolatile memory 56, and the like used for the operation of the system control unit 50 are loaded. The system control unit 50 also performs display control by controlling the memory 32, the D/A converter 19, the digital output I/F 90, the display unit 28, and the like. Further, the system control unit 50 converts captured image data into a form to be recorded on the recording medium 200, and performs recording control to record the image data on the recording medium 200 via the recording medium I/F 18.

The system timer 53 is a time measuring unit for measuring time used for various types of control and time of a built-in timer.

The mode switch 60, the first shutter switch 62, the second shutter switch 64, and the operation unit 70 are operation means for inputting various types of operation instructions to the system control unit 50.

The mode switch 60 switches the operation mode of the system control unit 50 to, for example, one of a still image recording mode, a moving image shooting mode, and a playback mode. The still image recording mode includes the following modes: an automatic shooting mode, an automatic scene discrimination mode, a manual mode, an aperture priority mode (Av mode), and a shutter speed priority mode (Tv mode). It also includes various types of scene modes having shooting settings specific to particular shooting scenes, a program AE mode, and a custom mode. The mode switch 60 directly switches the operation mode to one of these modes. Alternatively, after temporarily switching to the shooting-mode list screen using the mode switch 60, the user may select one of the plurality of modes displayed there and switch to the selected mode by using another operation member. Similarly, the moving image shooting mode may include a plurality of modes.

The shutter button 61 operated by the user includes a first shutter switch 62 and a second shutter switch 64. When the user operates the shutter button 61 halfway, i.e., performs a so-called half-press operation (pre-shooting instruction), the first shutter switch 62 is turned on and generates a first shutter switch signal SW1. Upon receiving the first shutter switch signal SW1, the system control unit 50 starts operations such as AF (auto focus) processing, AE (auto exposure) processing, AWB (auto white balance) processing, and EF (electronic flash pre-emission) processing. When the user operates the shutter button 61 fully, i.e., performs a so-called full-press operation (shooting instruction), the second shutter switch 64 is turned on and generates a second shutter switch signal SW2. In response to the second shutter switch signal SW2, the system control unit 50 starts a series of shooting processes from reading out the signal from the image pickup unit 22 to writing image data to the recording medium 200.
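
A minimal sketch of this two-stage behaviour (hypothetical camera interface; SW1 corresponds to the half press and SW2 to the full press):

```python
def handle_shutter(camera, sw1_on: bool, sw2_on: bool) -> None:
    # Half press (SW1): start the shooting-preparation operations.
    if sw1_on and not sw2_on:
        camera.run_af_ae_awb_ef()
    # Full press (SW2): run the shooting sequence, from sensor readout
    # through writing the image file to the recording medium.
    if sw2_on:
        camera.shoot_and_record()
```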

When the user selectively operates various function icons displayed on the display unit 28 or the external apparatus 300, the respective operation members of the operation unit 70 are assigned appropriate functions corresponding to the scenes and function as various function buttons. Examples of the function buttons are an end button, a return button, an image feed button, a jump button, a zoom-out button, and an attribute change button. For example, when the menu button 70e is pressed, the display unit 28 or the external apparatus 300 displays a menu screen on which various settings can be made. The user can intuitively make various settings by using the menu screen displayed on the display unit 28 or the external device 300, four-way buttons for up, down, left, and right, and setting buttons.

Note that the display unit 28 according to the present embodiment has an SDR-quality image display function, that is, each of the color components R, G, and B can be displayed with 8 bits (256 tones). Note also that when the external apparatus 300 is connected to the digital camera 100, the external apparatus 300, instead of the display unit 28, is set as the output target device for captured images and live images. Further, when the user explicitly selects one of the display unit 28 and the external apparatus 300 by operating the operation unit 70, the selected one is set as the output target device.

The operation unit 70 is an operation member serving as an input unit for accepting operations from the user. The operation unit 70 includes at least the following operation units: the shutter button 61, the main electronic dial 71, the power switch 72, the sub electronic dial 73, the cross key 74, the setting button 75, the LV button 76, the zoom-in button 77, the zoom-out button 78, and the playback button 79. The cross key 74 is a direction button whose upper, lower, right, and left portions can be pressed. In the present embodiment, the cross key 74 is described as an integrated operation unit, but the up, down, right, and left buttons may be independent buttons. In the following description, the upper or lower portion will be referred to as the up/down key, and the left or right portion will be referred to as the left/right key. The operation unit 70 also includes the following operation units.

The AF-ON button 70b is a button switch included in the operation unit 70. The user can designate execution of AF by pressing the AF-ON button 70b. The pressing direction of the AF-ON button 70b is parallel to the direction (optical axis) of the object light incident from the lens 103 on the image pickup unit 22.

The quick setting button 70c (hereinafter referred to as Q button 70c) is a push button switch included in the operation unit 70. When the Q button 70c is pressed, a quick setting menu as a setting item list settable in each operation mode is displayed. For example, when the Q button 70c is pressed in the shooting standby state of the live view shooting, setting items such as an electronic front curtain shutter, brightness of a monitor, WB of an LV screen, two-point enlargement, and silent shooting are displayed in a list in such a manner that these setting items are superimposed on the LV. The user selects an arbitrary item in the quick setting menu by using the up/down keys and presses the setting button. The user can thus change the setting corresponding to the selected setting item and proceed to the operation mode corresponding to the item.

The active frame switching button 70d is a button switch included in the operation unit 70. When the active frame switching button 70d is pressed during two-point enlargement processing (described later), the active enlargement position (frame) can be switched between the two enlarged portions. Further, the active frame switching button 70d is assigned a different function depending on the operation mode: when it is pressed in the playback mode, a protection attribute can be given to the displayed image.

The menu button 70e is a button switch included in the operation unit 70; when it is pressed, a menu screen on which various settings can be made is displayed on the display unit 28 or the external apparatus 300.

The function buttons 70f are three button switches included in the operation unit 70, and are assigned different functions. Each of the function buttons 70f is disposed in a position operable by a finger (middle finger, ring finger, or little finger) of the right hand holding the grip portion 90. The pressing direction is parallel to the direction (optical axis) of the object light incident from the lens 103 to the imaging unit 22.

The information button 70g is a button switch included in the operation unit 70 and used to switch various information display operations.

The power supply control unit 80 includes, for example, a battery detection circuit, a DC-DC converter, and a switch circuit for switching blocks to be supplied with power, and detects the presence/absence of a battery, the type of the battery, and the remaining amount of the battery. Based on the detection result and an instruction from the system control unit 50, the power supply control unit 80 controls the DC-DC converter and supplies a required voltage to the units including the recording medium 200 for a required period of time.

The power supply unit 30 is constituted by a primary battery (such as an alkaline battery or a lithium battery), a secondary battery (such as a NiCd battery, a NiMH battery, or a Li battery), an AC adapter, and the like. The recording medium I/F18 is an interface with a recording medium 200 such as a memory card or a hard disk. The recording medium 200 is a recording medium (such as a memory card) for recording a captured image, and is formed of a semiconductor memory, a magnetic disk, or the like.

The communication unit 54 is connected to an external device wirelessly or via a wired cable, and transmits and receives a video signal and an audio signal. The communication unit 54 may also be connected to a wireless LAN (local area network) and the internet. The communication unit 54 can transmit images (including through images) captured by the image capturing unit 22 and images recorded on the recording medium 200, and can receive image data and other various information from an external apparatus.

The posture detection unit 55 detects the posture of the digital camera 100 in the gravity direction. It is possible to discriminate whether the image captured by the image capturing unit 22 is an image captured while the digital camera 100 is held in the horizontal direction or the vertical direction, based on the posture detected by the posture detection unit 55. The system control unit 50 may add direction information corresponding to the posture detected by the posture detection unit 55 to an image file of the image captured by the image capturing unit 22, or record the image while rotating the image. As the posture detection unit 55, an acceleration sensor, a gyro sensor, or the like can be used.

Note that the operation unit 70 also includes a touch panel 70a that can detect a touch on the display unit 28. The touch panel 70a and the display unit 28 may be integrated. For example, the touch panel 70a is configured so that its light transmittance does not obstruct the display of the display unit 28, and is mounted on the upper layer of the display surface of the display unit 28. The input coordinates on the touch panel 70a are associated with the display coordinates on the display unit 28. This makes it possible to provide a GUI (graphical user interface) that allows the user to directly operate the screen displayed on the display unit 28. The system control unit 50 can detect the following operations on, or states of, the touch panel 70a:

a new touch on the touch panel 70a by a finger or a pen that was not touching it, i.e., a start of a touch (hereinafter referred to as "touch-down");

a state in which the user is touching the touch panel 70a with his/her finger or pen (hereinafter referred to as "touch continuation");

moving a finger or a pen while the user touches the touch panel 70a with his/her finger or pen (hereinafter referred to as "touch movement");

releasing a finger or pen from the touch panel 70a, i.e., ending the touch (hereinafter referred to as "touch stop"); and

a state where nothing touches the touch panel 70a (hereinafter referred to as "not touched").

When the "touch down" is detected, the system control unit 50 simultaneously detects the "touch continuation". After "touching", the system control unit 50 normally continuously detects "touch continuation" unless "touch stop" is detected. When detecting "touch continuation", the system control unit 50 also detects "touch movement". Even when the system control unit 50 detects "touch continuation", the unit does not detect "touch movement" unless the touch position is moved. The system control unit 50 detects "not touched" when detecting "touch stop" of all fingers or pens touching the touch panel.

Information indicating such operations or states and the coordinates of the position touched by a finger or a pen on the touch panel 70a are notified to the system control unit 50 via the internal bus. Based on the notified information, the system control unit 50 determines which specific operation (touch operation) has been performed on the touch panel 70a. For "touch movement", the system control unit 50 can determine the movement direction of the finger or pen moving on the touch panel, for each vertical component and each horizontal component on the touch panel 70a, based on the change in the position coordinates. If a "touch movement" over a predetermined distance is detected, the system control unit 50 determines that a slide operation has been performed. An operation of quickly moving a finger over a certain distance while touching the touch panel 70a and then releasing it is called a "flick". In other words, a flick is an operation of quickly tracing over the touch panel 70a as if flicking it with a finger. When a "touch movement" at or above a predetermined speed over at or above a predetermined distance is detected and a "touch stop" is detected immediately afterward, the system control unit 50 determines that a flick has been performed (it can determine that a flick was performed following a slide operation). Further, a touch operation of simultaneously touching a plurality of points (e.g., two points) and bringing the touched positions closer to each other is referred to as "pinch-in", and a touch operation of moving the touched positions away from each other is referred to as "pinch-out". "Pinch-in" and "pinch-out" are collectively referred to as a pinch operation (or simply a pinch). The touch panel 70a may be any of the following types: a resistive film type, a capacitance type, a surface acoustic wave type, an infrared type, an electromagnetic induction type, an image recognition type, a photoelectric sensor type, and the like. Some types detect a touch by detecting contact with the touch panel, and other types detect a touch by detecting the approach of a finger or a pen to the touch panel; any of these types may be used.
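
As a rough sketch of the flick/slide discrimination above (the thresholds are assumptions; the text only says "predetermined" speed and distance):

```python
import math

SLIDE_MIN_DIST_PX = 30.0       # assumed "predetermined distance"
FLICK_MIN_SPEED_PX_S = 1000.0  # assumed "predetermined speed"

def classify_touch_move(x0, y0, t0, x1, y1, t1, touch_stop: bool) -> str:
    """Classify a touch-move from (x0, y0) at time t0 to (x1, y1) at t1."""
    dist = math.hypot(x1 - x0, y1 - y0)
    speed = dist / max(t1 - t0, 1e-6)
    if touch_stop and dist >= SLIDE_MIN_DIST_PX and speed >= FLICK_MIN_SPEED_PX_S:
        return "flick"       # fast move ending in touch-stop
    if dist >= SLIDE_MIN_DIST_PX:
        return "slide"       # predetermined distance exceeded
    return "touch-move"
```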

Note that the present invention is not limited to the image pickup apparatus main body, and is also applicable to a control apparatus that communicates with an image pickup apparatus (including a network camera) by wired or wireless communication and remotely controls the image pickup apparatus. Examples of devices that remotely control a camera device are smart phones, tablet PCs, and desktop PCs. Based on an operation or processing performed on the control apparatus side, the control apparatus can remotely control the image pickup apparatus by notifying the image pickup apparatus of a command for causing the image pickup apparatus to perform various operations or settings. It is also possible to receive a live view image captured by the image capturing apparatus via wired or wireless communication and display the received image on the control apparatus side.

Note that the embodiment has been described by taking as an example a case where the present invention is applied to a digital camera, but the present invention is not limited thereto. For example, the present invention is applicable to any device including a display unit, such as a PDA, a portable telephone terminal, a portable image viewer, a printer device including a display, a digital photo frame, a music player, a game machine, or an electronic book reader.

Fig. 3 is a diagram showing a connection example of the digital camera 100 and the external device 300. When the digital camera 100 and the external device 300 are connected by the connection cable 302, the display unit 28 of the digital camera 100 is turned off, and the display 301 of the external device 300 displays content that has been displayed on the digital camera 100.

Fig. 4A-1 to 4A-4 are flowcharts showing LV photographing mode processing of the digital camera 100. This processing is realized by developing a program recorded in the nonvolatile memory 56 in the system memory 52 and executing the program by the system control unit 50.

First, the HDR photographing mode and the SDR photographing mode according to the present embodiment will be explained. In the digital camera 100 of the present embodiment, the HDR photographing mode or the SDR photographing mode can be set by a menu operation or the like by the user. These modes allow the user to set whether image data of HDR quality or image data of SDR quality is finally obtained, and various control operations in the following processes are performed according to the set mode. Hereinafter, photographing in the HDR photographing mode and the SDR photographing mode will in some cases be referred to as "HDR photographing" and "SDR photographing", respectively. However, as described later, it is also possible to set recording so that only the RAW format is used; therefore, when shooting is performed in the HDR photographing mode, an HDR image is not necessarily recorded.

In S401, the system control unit 50 determines whether the setting made by the user on the operation unit 70 is the HDR photographing mode. The system control unit 50 advances the process to S402 if it is determined that the HDR photographing mode is set, and advances the process to S422 if it is determined that the SDR photographing mode is set.

In S402, the system control unit 50 determines whether the external device 300 is connected to the digital camera 100. The system control unit 50 advances the process to S403 if it is determined that the external apparatus 300 is connected, and advances the process to S404 if it is determined that the external apparatus 300 is not connected.

In S403, the system control unit 50 performs processing of connecting the digital camera 100 and the external apparatus 300. Then, the system control unit 50 advances the process to S404. Details of this connection process will be described later with reference to fig. 5A to 5C. Note that if the external device supports HDR connection, HDR connection is made, otherwise SDR connection is made.

In S404, the system control unit 50 causes the image processing unit 24 to perform HDR-quality development processing on the real-time RAW image data captured by the image pickup unit 22 and converted into a digital signal by the A/D converter 23. An image obtained by the HDR-quality development processing will be referred to as an HDR image.

Note that the HDR image data of the present embodiment is data in which one pixel is formed of three components (e.g., Luv or YCbCr), and each component is represented by 10 bits (1024 tones) in the present embodiment. A gamma curve for HDR images (e.g., PQ or HLG of ITU-R Recommendation BT.2100) is applied to the HDR image data.
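
For reference, the PQ curve mentioned here is standardized (ITU-R BT.2100 / SMPTE ST 2084); a minimal sketch of encoding a linear luminance value into a 10-bit PQ code:

```python
# PQ (perceptual quantizer) constants from SMPTE ST 2084 / ITU-R BT.2100.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_encode(y: float) -> float:
    """Map luminance y (normalized so 1.0 = 10000 cd/m^2) to a [0, 1] signal."""
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

# 100 cd/m^2 (typical SDR reference white) lands near code 520 of 1023.
print(round(pq_encode(100 / 10000) * 1023))
```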

In S405, the system control unit 50 determines whether the apparatus for displaying the LV image (the display unit 28 or the external device 300) supports HDR. The system control unit 50 advances the process to S406 if it is determined that the device does not support HDR, and advances the process to S409 if it is determined that the device supports HDR.

In S406, the system control unit 50 checks the HDR assist display setting. If it is determined that assist 1 is set, the system control unit 50 advances the process to S407; if assist 2 is set, it advances the process to S408. Assist 1 is a setting for checking a high-luminance area of the HDR image, and performs processing that allocates many tones (code values) to the high-luminance range of the HDR image. Assist 2 is a setting for checking the intermediate-luminance range of the HDR image, and performs processing that allocates many tones to the intermediate-luminance range of the HDR image.
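
The patent does not give the actual assist curves, so the following is only a schematic sketch under the assumption of piecewise-linear SDR mappings that spend most of the 8-bit output codes on the range each assist setting targets:

```python
def assist_tone_map(y: float, assist: int) -> float:
    """Map HDR relative luminance y in [0, 1] to an SDR signal in [0, 1]."""
    if assist == 1:
        # Assist 1: give 70% of the SDR codes to the top half of the range.
        return y * 0.6 if y < 0.5 else 0.3 + (y - 0.5) * 1.4
    # Assist 2: give 70% of the SDR codes to the mid range 0.2..0.6.
    if y < 0.2:
        return y * 0.75
    if y < 0.6:
        return 0.15 + (y - 0.2) * 1.75
    return 0.85 + (y - 0.6) * 0.375
```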

In S407, the system control unit 50 performs HDR → SDR conversion processing on the HDR image data obtained by the development processing in S404, in accordance with the setting of assist 1. In addition, the system control unit 50 displays the SDR-quality LV image data obtained by resizing the converted data to a size suitable for the output target apparatus (the display unit 28 or the external device 300), and advances the process to S410.

In S408, the system control unit 50 performs HDR → SDR conversion processing on the HDR image data obtained by the development processing in S404, in accordance with the setting of assist 2. In addition, the system control unit 50 displays the SDR-quality LV image data obtained by resizing the converted data to a size suitable for the output target apparatus (the display unit 28 or the external device 300), and advances the process to S410.

The SDR-quality image data (SDR image data) in S407 and S408 is image data having 8 bits per component. A gamma curve for SDR images (e.g., the gamma curve of the sRGB standard) is applied to the SDR-quality image data. Note that the gamma curve of the sRGB standard is, strictly, a curve whose dark portion is linear and whose remaining portion is a power function with exponent 2.4, but a simple power curve with exponent 2.2 may be used instead.
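
A minimal sketch of that sRGB-type curve (linear segment near black, 2.4-exponent elsewhere, using the standard IEC 61966-2-1 constants):

```python
def srgb_encode(l: float) -> float:
    """Map linear light l in [0, 1] to the nonlinear sRGB signal in [0, 1]."""
    if l <= 0.0031308:
        return 12.92 * l                   # linear dark portion
    return 1.055 * l ** (1 / 2.4) - 0.055  # 2.4-exponent portion

# 18% grey encodes to roughly code 118 of 255 in 8 bits.
print(round(srgb_encode(0.18) * 255))
```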

In S409, the system control unit 50 performs processing of resizing the HDR image data obtained by the development processing in S404 to a size suitable for the output target device (the display unit 28 or the external apparatus 300), displays the resized HDR-quality image (hereinafter referred to as an HDR_LV image) as the real-time image, and advances the process to S410.

In S410, the system control unit 50 determines whether the menu display button 70e is pressed. The system control unit 50 advances the process to S411 if it is determined that the button is pressed, and advances the process to S412 if it is determined that the button is not pressed. In S411, the system control unit 50 performs shooting menu processing and advances the processing to S412. Details of this shooting menu process will be described later with reference to fig. 6A-1 and 6A-2 and fig. 6B-1 and 6B-2.

In S412, the system control unit 50 determines whether the information display button 70g is pressed. The system control unit 50 advances the process to S413 if it is determined that the button is pressed, and advances the process to S414 if it is determined that the button is not pressed. In S413, the system control unit 50 switches the display of the shooting information, and advances the process to S414. Examples of the shooting information are a histogram and a highlight warning.

In S414, the system control unit 50 determines whether the shutter button 61 is half-pressed based on whether the signal SW1 is received. The system control unit 50 advances the process to S420 if it is determined that the button is not half-pressed, and advances the process to S415 if it is determined that the button is half-pressed.

In S415, the system control unit 50 performs the AE/AF processing described with reference to fig. 2, and advances the processing to S416. In S416, the system control unit 50 determines whether the shutter button 61 is fully pressed based on whether the signal SW2 is received. The system control unit 50 advances the process to S417 if it is determined that the button is not fully pressed, and advances the process to S418 if it is determined that the button is fully pressed. In S417, the system control unit 50 determines whether the half-pressed state of the shutter button 61 is held. The system control unit 50 returns the process to S415 if the half-press state is maintained, and advances the process to S420 if it is determined that the half-press state is not maintained. In S418, the system control unit 50 performs the HDR photographing process, and records an image data file corresponding to a preset recording format on a recording medium. Fig. 8A shows a data structure of a file to be recorded. Then, the system control unit 50 advances the process to S419. Note that details of this HDR photographing process will be described later with reference to fig. 7A and 7B. In S419, the system control unit 50 performs the quick review display process, and advances the process to S420. The details of this quick review display process will be described later with reference to fig. 4B-1 and 4B-2.

In S420, the system control unit 50 determines whether the LV button 76 is pressed. The system control unit 50 advances the process to S421 if it is determined that the button is pressed, and advances the process to S422 if it is determined that the button is not pressed.

In S421, the system control unit 50 compresses the image data (image data having three components per pixel and 10 bits per component) developed at HDR image quality in S404 by HEVC (H.265) compression, records the compressed data as an HDR moving image file, and advances the process to S438.

In S422, the system control unit 50 determines whether the external device 300 is connected to the digital camera 100. The system control unit 50 advances the process to S423 if it is determined that the external apparatus 300 is connected, and advances the process to S424 if it is determined that the external apparatus 300 is not connected. In S423, the system control unit 50 performs processing of connecting the digital camera 100 and the external apparatus 300, and advances the processing to S424. Details of this connection process will be described later with reference to fig. 5A to 5C. Note that since the SDR photographing mode is set, the external device is connected through the SDR connection.

In S424, the system control unit 50 causes the image processing unit 24 to develop the image captured by the image capturing unit 22 and converted into a digital signal by the A/D converter 23 at SDR image quality (one pixel has three components, and each component has 8 bits (256 tones)), and advances the process to S425.

In S425, the system control unit 50 generates an SDR-quality real-time image (SDR_LV image) by performing processing of resizing the SDR image obtained by the development processing in S424 to a size suitable for the resolution of the output destination device (the display unit 28 or the external apparatus 300), and displays the generated SDR_LV image.

In S426, the system control unit 50 determines whether the menu display button 70e is pressed. The system control unit 50 advances the process to S427 if it is determined that the button is pressed, and advances the process to S428 if it is determined that the button is not pressed. In S427, the system control unit 50 performs shooting menu processing, and advances the processing to S428. Details of this shooting menu process in S427 will be described later with reference to fig. 6A-1 and 6A-2 and fig. 6B-1 and 6B-2.

In S428, the system control unit 50 determines whether the information display button 70g is pressed. The system control unit 50 advances the process to S429 if it is determined that the button is pressed, and advances the process to S430 if it is determined that the button is not pressed. In S429, the system control unit 50 switches the display of the shooting information, and advances the process to S430. Examples of the photographing information are a histogram and a highlight warning table.

In S430, the system control unit 50 determines whether the shutter button 61 is in the half-pressed state. The system control unit 50 advances the process to S436 if it is determined that the button is not in the half-pressed state, and advances the process to S431 if it is determined that the button is in the half-pressed state.

In S431, the system control unit 50 performs the AF/AE processing described with reference to fig. 2, and advances the processing to S432. In S432, the system control unit 50 determines whether the shutter button 61 is in the fully-pressed state based on whether the signal SW2 is received. The system control unit 50 advances the process to S433 if it is determined that the button is not in the fully-pressed state, and advances the process to S434 if it is determined that the button is in the fully-pressed state. In S433, the system control unit 50 determines whether the half-pressed state of the shutter button 61 is held based on whether the signal SW1 is received. The system control unit 50 returns the process to S431 if it is determined that the half-press state is maintained, and advances the process to S436 if it is determined that the half-press state is not maintained.

In S434, the system control unit 50 performs SDR shooting processing and advances the processing to S435. In this SDR shooting processing, the system control unit 50 develops the RAW image data obtained by SDR shooting at SDR image quality, generates JPEG image data by JPEG-encoding the SDR-quality image, and records the data as a JPEG file on the recording medium. If the recording setting specifies that only the SDR image is recorded as a JPEG file, only the JPEG file is recorded. If the recording setting specifies that both a JPEG file and a RAW image file are recorded, the JPEG file is recorded, and data obtained by encoding the RAW image data obtained by SDR shooting, together with the JPEG image data, is recorded on the recording medium as a RAW image file having the RAW image file format shown in fig. 8A. In this RAW image file, ImageData 809 in the data structure shown in fig. 8A has the format shown in fig. 8B. That is, the display images of the respective sizes are JPEG-encoded with 8-bit precision, and the encoded data are integrated and stored as one file. Then, in S435, the system control unit 50 performs the quick review display process, and advances the process to S436. The details of this quick review display process will be described later with reference to fig. 4B-1 and 4B-2.
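A minimal sketch of this recording decision is shown below; write_jpeg_file, encode_raw, and write_raw_file are hypothetical helpers, and the setting strings merely stand in for the camera's recording settings.

```python
def record_sdr_still(raw_payload: bytes, jpeg_images: dict, setting: str) -> None:
    """Record the outputs of SDR shooting (S434) according to the recording
    setting. jpeg_images maps display-image sizes ("THM", "MPF", "main") to
    8-bit JPEG payloads, mirroring ImageData 809 in fig. 8B.
    """
    write_jpeg_file(jpeg_images["main"])      # a JPEG file is recorded in either case
    if setting == "JPEG+RAW":
        # The encoded RAW data and the JPEG display images are integrated
        # and stored as one RAW image file (figs. 8A and 8B).
        write_raw_file(encode_raw(raw_payload), jpeg_images)
```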

In S436, the system control unit 50 determines whether the LV button 76 is pressed. The system control unit 50 advances the process to S437 if it is determined that the button is pressed, and advances the process to S438 if it is determined that the button is not pressed. In S437, the system control unit 50 compresses the SDR image obtained by the SDR quality development processing in S425 by H.264 compression, records the compressed image as an SDR moving image file, and advances the process to S438.

In S438, the system control unit 50 determines whether the playback button 79 is pressed. The system control unit 50 advances the process to S439 if it is determined that the playback button 79 is pressed, and advances the process to S440 if it is determined that the playback button 79 is not pressed. In S439, the system control unit 50 performs playback mode processing and advances the processing to S440. Details of this playback mode processing will be described later with reference to fig. 9A to 9H and fig. 10A and 10B.

In S440, the system control unit 50 determines whether there is an LV mode termination instruction. The system control unit 50 returns the process to S401 if it is determined that there is no LV mode termination instruction, and terminates the process if it is determined that there is a termination instruction.

Fig. 4B-1 and 4B-2 are flowcharts showing the quick review display process of the system control unit 50. This processing is realized by loading a program recorded in the nonvolatile memory 56 into the system memory 52 and executing the program by the system control unit 50.

In S451, the system control unit 50 determines whether the quick review display is set. The system control unit 50 advances the process to S452 if it is determined that the quick review display is set, and terminates the process if it is determined that the quick review display is not set.

In S452, the system control unit 50 determines whether or not shooting is performed in the HDR shooting mode. The system control unit 50 advances the process to S453 if it is determined that photographing is performed in the HDR photographing mode, and advances the process to S460 if it is determined that photographing is performed in the SDR photographing mode.

In S453, the system control unit 50 determines whether the means for quick review display (the display unit 28 or the external device 300) supports HDR. The system control unit 50 advances the process to S454 if it is determined that the device does not support HDR, and advances the process to S457 if it is determined that the device supports HDR.

In S454, the system control unit 50 determines whether shooting was performed by RAW still image shooting. The system control unit 50 advances the process to S455 if it is determined that shooting was performed by RAW still image shooting, and advances the process to S456 if it is determined that shooting was performed by HEIF still image shooting.

In S455, the system control unit 50 performs HDR → SDR conversion on the HDR image 828 for display in the HDR RAW image by the same processing as in S406 to S408, adjusts the image size to a size suitable for the output target device (the display unit 28 or the external apparatus 300), displays the image at SDR image quality, and advances the processing to S463.

In S456, the system control unit 50 performs HDR → SDR conversion on the HDR image for display in the HEIF image, adjusts the image size to a size suitable for the output target apparatus (the display unit 28 or the external device 300), displays the image at SDR image quality, and advances the process to S463.

In S457, the system control unit 50 determines whether shooting was performed by RAW still image shooting. The system control unit 50 advances the process to S458 if it is determined that shooting was performed by RAW still image shooting, and advances the process to S459 if it is determined that shooting was performed by HEIF still image shooting. In S458, the system control unit 50 adjusts the size of the HDR image 828 for display in the HDR RAW image to a size suitable for the output target apparatus (the display unit 28 or the external device 300), displays the image at HDR image quality, and advances the process to S463. In S459, the system control unit 50 adjusts the size of the HDR image for display in the HEIF image to a size suitable for the output target apparatus (the display unit 28 or the external device 300), displays the image at HDR image quality, and advances the process to S463.

In S460, the system control unit 50 determines whether shooting was performed by RAW still image shooting. The system control unit 50 advances the process to S461 if it is determined that shooting was performed by RAW still image shooting, and advances the process to S462 if it is determined that shooting was performed by JPEG still image shooting. In S461, the system control unit 50 adjusts the size of the SDR image 823 for display in the SDR RAW image to a size suitable for the output target apparatus (the display unit 28 or the external device 300), displays the image at SDR image quality, and advances the process to S463. In S462, the system control unit 50 adjusts the size of the SDR image for display in the JPEG image to a size suitable for the output target apparatus (the display unit 28 or the external device 300), displays the image at SDR image quality, and advances the process to S463.
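The branching of S452 to S462 can be summarized by the following sketch, which returns which container and display image are used and at which quality the quick review is shown; the string labels are illustrative only.

```python
def quick_review_source(shooting_mode: str, device_supports_hdr: bool,
                        still_type: str) -> tuple:
    """Summarize the quick review branching of S452-S462.

    shooting_mode is "HDR" or "SDR"; still_type is "RAW" or the non-RAW
    still format (HEIF for HDR shooting, JPEG for SDR shooting).
    Returns (source container, display image used, display quality).
    """
    if shooting_mode == "HDR":
        container = "RAW file" if still_type == "RAW" else "HEIF file"
        if device_supports_hdr:
            return container, "HDR display image", "HDR"                    # S458 / S459
        return container, "HDR display image converted to SDR", "SDR"       # S455 / S456
    container = "RAW file" if still_type == "RAW" else "JPEG file"
    return container, "SDR display image", "SDR"                            # S461 / S462
```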

In S463, the system control unit 50 determines whether the shutter button 61 is pressed. The system control unit 50 advances the process to S464 if it is determined that the button is not pressed, and terminates the process if it is determined that the button is pressed.

In S464, the system control unit 50 determines whether the time set as the quick review display time has elapsed. If it is determined that the time has not elapsed, the system control unit 50 returns the process to S463, and if it is determined that the time has elapsed, terminates the process.

Fig. 5A is a sequence chart showing a control procedure of the digital camera 100 and the external device 300 when the digital camera 100 and the external device 300 are connected. The description will be made by assuming that the digital camera 100 and the external device 300 are connected by an HDMI connection.

In S501, the system control unit 50 instructs the digital output I/F90 to start transmitting the +5V signal. As a result, the digital output I/F90 begins transmitting a +5V signal. The transmitted +5V signal is transmitted to the external device 300 through a +5V signal line (not shown) of the connection cable 302. The external device 300 receives the +5V signal of the connection cable 302, and advances the process to S502.

In S502, the external apparatus 300 determines that the digital camera 100 has confirmed the connection of the external apparatus 300, and advances the process to S503.

In S503, the external apparatus 300 starts transmitting an HPD signal from an HPD signal line (not shown) of the connection cable 302. The digital output I/F90 of the digital camera 100 receives the transmitted HPD signal via the connection cable 302. Upon receiving the HPD signal, the digital output I/F90 notifies the system control unit 50 of HPD reception.

In S504, the system control unit 50 detects a connection response from the external apparatus 300 by the notification of the HPD, and advances the process to S505.

In S505, the system control unit 50 controls the digital output I/F90 to transmit the EDID request signal from the connection cable 302. The transmitted EDID request signal is transmitted to the external device 300 through an EDID signal line (not shown) of the connection cable 302. The external device 300 receives the EDID request signal and advances the process to S506.

In S506, the external device 300 transmits EDID from the EDID signal line (not shown) of the connection cable 302. The digital output I/F90 of the digital camera 100 receives the EDID via the connection cable 302. Upon receiving the EDID, the digital output I/F90 notifies the system control unit 50 of the reception of the EDID.

In S507, the system control unit 50 receives the notification of the reception of the EDID, and instructs the digital output I/F90 to copy the EDID received in S506 into the memory 32. After the copying is completed, the system control unit 50 analyzes the EDID loaded into the memory 32, determines the capability of the video signals acceptable to the external device 300, and advances the process to S508.

In S508, if HDR is enabled in the current settings and the video signal capability of the external apparatus 300 determined in S507 supports an HDR signal, the system control unit 50 decides to output an HDR signal to the external apparatus 300; otherwise, it decides to output an SDR signal. The system control unit 50 then advances the process to S509.
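A sketch of this decision follows, assuming a hypothetical EDID parser supports_hdr_eotf; a real implementation would look for an HDR Static Metadata block (e.g., PQ support) in the CTA-861 extension of the EDID.

```python
def decide_output_signal(hdr_setting_enabled: bool, edid: bytes) -> str:
    """Decide whether to output HDR or SDR after analyzing the sink's EDID
    (S507-S508). supports_hdr_eotf is a hypothetical helper standing in for
    an EDID/CTA-861 parser.
    """
    if hdr_setting_enabled and supports_hdr_eotf(edid):
        return "HDR"
    return "SDR"
```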

In S509, the system control unit 50 instructs the digital output I/F90 to start transmitting the HDR or SDR video signal determined in S508. The digital output I/F90 that has received the video signal transmission start instruction starts transmitting a video signal through the connection cable 302, and advances the process to S510.

In S510, the digital camera 100 outputs a video signal to the TMDS signal line (not shown) of the connection cable 302. The external device 300 receives the video signal via the TMDS signal line (not shown) of the connection cable 302, and advances the process to S511.

In S511, the external apparatus 300 analyzes the video signal received in S510, switches the driving condition of the display 301 to a setting capable of displaying the video signal, and advances the process to S512. In S512, the external apparatus 300 displays the video signal received in S510 on the display 301 of the external apparatus 300.

Fig. 5B is a sequence diagram showing a process of switching the video output of the digital camera 100 and the external apparatus 300 from an SDR image to an HDR image.

In this sequence, it is assumed that the connection between the digital camera 100 and the external apparatus 300 is completed in the sequence explained with reference to fig. 5A.

In S521, the system control unit 50 instructs the digital output I/F90 to transmit an SDR video signal, and advances the process to S522.

In S522, the digital camera 100 outputs the SDR video signal to the TMDS signal line (not shown) of the connection cable 302. The external apparatus 300 receives the SDR video signal via the TMDS signal line (not shown) of the connection cable 302, and advances the process to S523.

In S523, the external device 300 displays the SDR video received in S522 on the display 301 of the external device 300.

When the digital camera 100 is outputting the SDR signal, the display 301 of the external device 300 displays the SDR image by repeating S521 to S523.

When the digital camera 100 switches the video output to the external device 300 from the SDR image to the HDR image, the processes from S524 are performed.

In S524, the system control unit 50 instructs the digital output I/F90 to stop the SDR video signal, and advances the process to S525.

In S525, the system control unit 50 stops the video signal to the TMDS signal line (not shown) of the connection cable 302. The external apparatus 300 stops receiving the SDR video signal via the TMDS signal line (not shown) of the connection cable 302, and advances the process to S526.

In S526, since the reception of the video from the digital camera 100 has stopped, the external apparatus 300 stops displaying the video on the display 301 of the external apparatus 300.

In S527, the system control unit 50 instructs the digital output I/F90 to transmit the HDR video signal, and advances the process to S528.

In S528, the system control unit 50 outputs the HDR video signal to the TMDS signal line (not shown) of the connection cable 302. The external device 300 receives the HDR video signal via the TMDS signal line (not shown) of the connection cable 302, and advances the process to S529.

In S529, the external apparatus 300 analyzes the video signal received in S528, switches the driving conditions of the display 301 to a setting capable of displaying an HDR video signal, and advances the process to S530.

In S530, the external device 300 displays the HDR video signal received in S528 on the display 301 of the external device 300.

The processing time from S529 to S530 varies depending on the performance of the external device 300, and thus it takes about 1 to 5 seconds before the video is displayed.

Fig. 5C is a sequence diagram showing a process of switching the video output of the digital camera 100 and the external apparatus 300 from an HDR image to an SDR image.

In this sequence, it is assumed that the connection between the digital camera 100 and the external apparatus 300 is completed in the sequence explained with reference to fig. 5A.

In S541, the system control unit 50 instructs the digital output I/F90 to transmit the HDR video signal, and advances the process to S542. In S542, the system control unit 50 outputs the HDR video signal to the TMDS signal line (not shown) of the connection cable 302. Further, the external apparatus 300 receives the HDR video signal via the TMDS signal line (not shown) of the connection cable 302, and advances the process to S543.

In S543, the external device 300 displays the HDR video received in S542 on the display 301 of the external device 300.

When the digital camera 100 is outputting the HDR signal, the display 301 of the external device 300 displays the HDR signal by repeating S541 to S543.

When the digital camera 100 switches the video output to the external device 300 from the HDR image to the SDR image, the processes from S544 on are performed.

In S544, the system control unit 50 instructs the digital output I/F90 to stop the HDR video signal, and advances the process to S545. In S545, the system control unit 50 stops the video signal to the TMDS signal line (not shown) of the connection cable 302. The external apparatus 300 stops receiving the HDR video signal via the TMDS signal line (not shown) of the connection cable 302, and advances the process to S546.

In S546, the external apparatus 300 stops displaying the video on the display 301 of the external apparatus 300 because the reception of the video from the digital camera 100 has stopped.

In S547, the system control unit 50 instructs the digital output I/F90 to transmit an SDR video signal, and advances the process to S548.

In S548, the system control unit 50 outputs the SDR video signal to the TMDS signal line (not shown) of the connection cable 302. The external device 300 receives the SDR video signal via the TMDS signal line (not shown) of the connection cable 302, and advances the process to S549.

In S549, the external apparatus 300 analyzes the video signal received in S548, switches the driving condition of the display 301 to a setting capable of displaying the SDR video signal, and advances the process to S550. In S550, the external device 300 displays the SDR video signal received in S548 on the display 301 of the external device 300.

Note that the processing time from S549 to S550 varies depending on the performance of the external device 300, and therefore it takes about 1 to 5 seconds before the video is displayed.

Fig. 6A-1 and 6A-2 and fig. 6B-1 and 6B-2 are flowcharts showing details of the shooting menu processing in S411 and S427 of fig. 4A-1 to 4A-4. This processing is realized by loading a program recorded in the nonvolatile memory 56 into the system memory 52 and executing the program by the system control unit 50.

In S601, the system control unit 50 determines whether to perform HDR shooting based on whether the user has enabled the HDR shooting mode. If it is determined that HDR shooting is not to be performed, the system control unit 50 advances the process to S602 and displays a menu for normal SDR shooting. If it is determined that HDR shooting is to be performed, the system control unit 50 advances the process to S603 and displays a menu for HDR shooting. In S603, the system control unit 50 displays functions that are not used in HDR shooting in a disabled state, such as a grayed-out state, on the menu.

In S604, the system control unit 50 determines whether the setting item indicating whether to perform HDR shooting is selected by the user. The system control unit 50 advances the process to S605 if it is determined that the item is selected, and advances the process to S611 otherwise. In S605, the system control unit 50 determines whether the user switches the setting indicating whether to perform HDR shooting to "valid". The system control unit 50 advances the process to S606 if it is determined that the setting is switched to "valid", and otherwise advances the process to S607. In S606, the system control unit 50 changes the setting indicating whether or not HDR shooting is performed to "valid", and stores the setting value in the system memory 52.

If the setting indicating whether or not to perform HDR shooting is "valid", the system control unit 50 determines in S607 whether the user has changed the HDR auxiliary display setting. The system control unit 50 advances the process to S608 if it is determined that the setting is changed, and advances the process to S609 otherwise. Note that if the setting indicating whether or not HDR shooting is performed is "invalid", it is desirable that the HDR auxiliary display setting cannot be changed.

In S608, the system control unit 50 changes the HDR auxiliary display setting for shooting to "perform" or "do not perform", and stores the setting value in the system memory 52. There may be two or more variations of the "perform" setting of the HDR auxiliary display.

Note that when the HDR shooting setting or the HDR auxiliary display setting is changed on the menu screen, the result of the change may be reflected on the display at the timing when the menu screen is switched to the live view screen. When these settings are changed not on the menu screen but on the live view screen by using, for example, a specific button of the operation unit 70, the change may likewise be reflected at the timing of the change (the timing of pressing the button).

In S609, the system control unit 50 determines whether the user has issued a termination instruction for the HDR setting menu display processing. If it is determined that the termination instruction is issued, the system control unit 50 advances the process to S610.

In S610, the system control unit 50 determines whether the user selects a setting item of still image recording image quality. If it is determined that the selection is made, the system control unit 50 advances the process to S611, otherwise, advances the process to S651.

In S611, the system control unit 50 determines whether the user has input an HDR photographing instruction. The system control unit 50 advances the process to S612 if it is determined that an instruction input for HDR shooting is issued, and advances the process to S614 if it is determined that there is no instruction input.

The system control unit 50 displays a screen for HDR shooting in S612, and accepts user selection of recording image quality for HDR shooting in S613. As the recording image quality settings for HDR shooting, three file formats may be prepared: a RAW image, an HDR still image file, and a RAW + HDR still image file in which the two images are output simultaneously. Examples of the image size are Large (close to the number of pixels read from the sensor), Middle (slightly smaller than Large), and Small (smaller than Middle). Examples of the compression rate for reducing the file size are high image quality (low compression rate), standard, and low image quality (high compression rate).
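These options can be modeled, for example, by a simple structure like the following; the field values are illustrative labels, not the camera's actual identifiers.

```python
from dataclasses import dataclass

@dataclass
class StillRecordingQuality:
    """User-selectable still recording image quality for HDR shooting (S613)."""
    file_format: str   # "RAW", "HDR still image", or "RAW + HDR still image"
    image_size: str    # "Large", "Middle", or "Small"
    compression: str   # "high quality (low compression)" ... "low quality (high compression)"

# Example selection: simultaneous RAW + HEIF output at full size.
quality = StillRecordingQuality(
    file_format="RAW + HDR still image",
    image_size="Large",
    compression="high quality (low compression)",
)
```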

The system control unit 50 displays a screen for SDR shooting in S614, and accepts user selection of recording image quality for SDR shooting in S615. The same options as those for HDR shooting are prepared as the recording image quality settings for SDR shooting.

In S651, the system control unit 50 determines whether the user has selected a setting item of moving image recording image quality. The system control unit 50 advances the process to S652 if it is determined that the setting item of the moving image recording image quality is selected, and advances the process to S657 otherwise.

In S652, the system control unit 50 checks the setting indicating whether or not the user performs HDR shooting. If the setting is "valid", the system control unit 50 advances the process to S653; otherwise, it advances the process to S655.

The system control unit 50 displays a screen for HDR shooting in S653, and accepts user selection of recording image quality for HDR shooting in S654. As the recording image quality settings for HDR shooting, four file formats are prepared: a RAW moving image, a RAW moving image + proxy moving image, an HDR moving image file, and a RAW + proxy moving image + HDR moving image file output simultaneously. Examples of the image size are 8K, 4K, FullHD, HD, and VGA. Examples of the compression rate for reducing the file size range from high image quality (low compression rate) such as ALL-I to standard or low image quality (high compression rate) such as IPB. The frame rate and a broadcast system such as NTSC/PAL may also be selected.

The system control unit 50 displays a screen for SDR shooting in S655 in the same manner as in S653, and accepts user selection of recording image quality for SDR shooting in S656. The same options as those for HDR shooting are prepared as the recording image quality settings for SDR shooting.

In S657, the system control unit 50 determines whether the user selects the setting item of the HDR output. The system control unit 50 advances the process to S658 if it is determined that the user has selected the setting item of the HDR output, and advances the process to S660 otherwise. In S658, the system control unit 50 determines whether the user has switched the HDR output setting to "valid". The system control unit 50 advances the process to S659 if it is determined that the setting is switched to "valid", and otherwise advances the process to S660. In S659, the system control unit 50 changes the HDR output setting to "valid", and stores the setting value in the system memory 52.

In S660, the system control unit 50 determines whether the user selects the setting item of the playback viewing assistance. The system control unit 50 advances the process to S661 if it is determined that the setting item of the playback viewing assistance is selected, and advances the process to S663 otherwise. In S661, the system control unit 50 determines whether the user has switched the setting of the playback viewing assistance to "valid". The system control unit 50 advances the process to S662 if it is determined that the setting is switched to "valid", and advances the process to S663 otherwise. In S662, the system control unit 50 changes the setting of the playback viewing assistance to "valid", and stores the setting value in the system memory 52.

In S663, the system control unit 50 determines whether the user has selected the setting item of SDR conversion for transmission. If it is determined that the selection is made, the system control unit 50 advances the process to S664; otherwise, it advances the process to S665. In S664, the system control unit 50 determines whether the user has switched the setting of SDR conversion for transmission to "valid". If it is determined that the setting is switched to "valid", the system control unit 50 changes the SDR conversion setting for transmission to "valid" in S664, and advances the process to S665.

In S665, the system control unit 50 determines whether the user has selected a setting item related to other HDR shooting. If it is determined that the selection is made, the system control unit 50 advances the process to S666, otherwise, advances the process to S667. In S666, the system control unit 50 changes the other processing to "valid", and advances the process to S667.

In S667, the system control unit 50 determines whether the user has instructed to leave the menu. If determined as "not leaving", the system control unit 50 returns the process to S660, and if determined as "leaving", terminates the process.

Fig. 7A and 7B are flowcharts showing details of the HDR shooting process of the system control unit 50. In this flow, the RAW data written in the memory 32 is developed at HDR image quality by the image processing unit 24.

An image pickup apparatus such as a digital still camera or a digital video camera has a white balance function of correcting the tone of a captured image according to the light source used at the time of shooting. The white balance function corrects differences between hues that vary with the light source (natural light sources such as a clear sky and a cloudy sky, and artificial light sources such as fluorescent lamps and incandescent lamps) so that white looks the same regardless of the light source. In S701 to S703, the white balance coefficients necessary for white balance processing are calculated. In the present embodiment, in order to prevent loss of highlight detail in a high-luminance region such as the sky, it is assumed that shooting is performed at an exposure lower than the exposure appropriate for the brightness of a person or the like.

First, in S701, the system control unit 50 obtains RAW image data via the memory control unit 15.

In S702, the system control unit 50 performs in-white-search-frame determination processing on the obtained RAW image data to find pixels that appear white, in order to calculate the white balance coefficients.

In S703, the system control unit 50 calculates the white balance coefficients based on the result of the in-white-search-frame determination.

Details of the processing in S702 and S703 will be described with reference to the flowchart shown in fig. 12.

As described previously, each pixel of the RAW image data has a signal component of only one of R, G, and B. Since conversion into color signals is necessary to perform the white search, the system control unit 50 performs de-Bayer (demosaicing) processing (S1201), thereby generating signals of all the R, G, and B channels for each pixel. There are various de-Bayering methods; the signals may be generated by, for example, linear interpolation using a low-pass filter. RAW image data is generally affected by noise, so the optical black (OB) level is not 0 but has some value. The system control unit 50 therefore performs processing of subtracting the OB value from the de-Bayered signals (S1202). Then, the system control unit 50 calculates the color evaluation values Cx and Cy from the obtained RGB signals by the following equations (S1203):

Cx = (R - B) / Yi
Cy = (R + B - (G1 + G2)) / Yi
Yi = (R + G1 + G2 + B) / 4

where G1 and G2 are the two G component values in the 2 × 2 pixels of the Bayer array, Cx represents the color temperature, Cy represents the green-direction correction amount, and Yi is the luminance value.
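A minimal sketch of the S1203 computation under the equations reconstructed above (the formulation itself is an assumption consistent with the symbol definitions in the text):

```python
import numpy as np

def color_evaluation_values(r, g1, g2, b):
    """Compute the color evaluation values Cx, Cy of S1203 per pixel.

    r, g1, g2, b are the de-Bayered, OB-subtracted channel values of one
    2 x 2 Bayer cell (scalars or NumPy arrays).
    """
    yi = (r + g1 + g2 + b) / 4.0        # luminance value Yi
    cx = (r - b) / yi                    # correlates with color temperature
    cy = (r + b - (g1 + g2)) / yi        # green-direction correction amount
    return cx, cy
```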

Fig. 13 shows the Cx-Cy plane. As shown in fig. 13, by capturing white with the image pickup apparatus in advance under light sources ranging from a high color temperature (e.g., daytime) to a low color temperature (e.g., sunset) and plotting the color evaluation values Cx and Cy on the coordinates, a white axis 1200 serving as a reference for detecting white can be obtained. Since actual light sources show slight variations in white, the system control unit 50 gives a certain width to both sides of the white axis 1200 (S1204). The frame obtained by widening the white axis in this way is called a white search frame 1201.

In S1205, the system control unit 50 plots the de-Bayered pixels in the Cx-Cy coordinate system and determines whether each pixel falls inside the white search frame. In S1206, the system control unit 50 performs shading processing that limits, in the luminance direction, the pixels inside the white search frame to be used as integration targets. This processing prevents the calculation accuracy of the white balance coefficients from dropping because colors that are too dark are susceptible to noise. It likewise prevents the accuracy from dropping because colors that are too bright can break the balance of the R/G ratio or B/G ratio and deviate from the correct color when one of the channels saturates at the sensor. The luminance range of the pixels targeted by the shading processing differs between SDR and HDR; that is, the pixels used in the white balance coefficient calculation (described later) differ between SDR and HDR. This is because the reproducibility of high-luminance regions is higher in HDR than in SDR. In the present embodiment, pixels up to +1 EV on the bright side are targets in SDR, whereas pixels up to +2 EV are targets in HDR. This enables calculation of white balance coefficients optimized for HDR.

In S1207, the system control unit 50 calculates the integrated values SumR, SumG, and SumB of the color evaluation values of the pixels that exist inside the white search frame and remain after the shading processing. Then, in S1208, the system control unit 50 calculates the white balance coefficients WBCoR, WBCoG, and WBCoB from the calculated integrated values by the following equations:

WBCoR = SumY × 1024 / SumR
WBCoG = SumY × 1024 / SumG
WBCoB = SumY × 1024 / SumB

where SumY = (SumR + 2 × SumG + SumB) / 4. The "1024" on the right side of each equation indicates that the precision of one color component is 10 bits.

Note that as the white balance coefficient, a value for a shooting mode (SDR shooting or HDR shooting) set by the user, or a value for both SDR and HDR may be calculated.

The description will return to fig. 7A. In S704 to S706, the system control unit 50 calculates a tone correction table necessary for tone correction processing. Details of the tone correction will be described with reference to the flowcharts shown in fig. 14A and 14B.

In S1221, the system control unit 50 performs WB processing by using the WB coefficients generated by the processing in S701 to S703 of fig. 7A. In S1222, the system control unit 50 performs histogram detection. More specifically, the system control unit 50 applies the white balance gain value obtained in S1221 to the entire image data, and generates a histogram as luminance information from the pixel values subjected to gamma correction processing. The gamma correction processing may use a known lookup table, and it is preferable to use the same gamma characteristic as that used in development. However, a simplified gamma characteristic, such as one using line approximation, may also be used to save processing time and memory. Note that the peripheral portion of an image is often unimportant and is also affected by the peripheral light falloff of the imaging lens, so the histogram may be formed by excluding pixels of the peripheral portion.

In S1223, the system control unit 50 performs face detection preprocessing. This process makes the face easier to detect by performing reduction processing, gamma processing, and the like on the image data. In S1224, the system control unit 50 performs face detection processing on the preprocessed image data by using a well-known method. In this face detection processing, the position and size of the face-like region (face region), and the reliability of detection can be obtained.

In S1225, the system control unit 50 calculates a tone correction amount (A) for compensating for the exposure correction amount (reduction amount) as a first tone correction amount. In this step, the system control unit 50 calculates a tone correction amount having input/output characteristics by which dark portions of the image are brought to proper exposure while high-luminance pixels at or above a predetermined luminance level are not corrected (at least the exposure correction amount is not fully compensated). This further suppresses loss of highlight detail in bright portions after tone correction. The tone correction amount may be prepared as a plurality of correction tables corresponding to exposure correction amounts.

In S1226, if there is a face area whose reliability is higher than a preset evaluation threshold value among the face areas detected by the face detection processing in S1224, the system control unit 50 determines that a face is detected. The system control unit 50 advances the process to S1227 if it is determined that a face is detected, and advances the process to S1231 if it is determined that a face is not detected.

In S1227, the system control unit 50 calculates partial regions of the detected face region as face luminance obtaining regions. A face luminance obtaining region is a region for obtaining the luminance of a bright portion of the face, and the number, positions, and the like of the regions are not particularly limited. In S1228, the system control unit 50 calculates, for each face luminance obtaining region, the average value of each of the R, G, and B pixels contained in the region. Further, the system control unit 50 applies the white balance gain value to the respective average values of the R, G, and B pixels in the same manner as in the histogram detection, gamma-corrects them, and converts the result into a luminance value Y by the following equation.

Y=0.299×R+0.587×G+0.114×B

Note that as the white balance gain value to be applied in the histogram detection and the face detection, it is preferable to use a gain value used in WB processing for the same image data. Ideally, the same brightness gamma as that used in development is also preferably used, but a simplified gamma characteristic, such as a gamma characteristic using line approximation, may also be used to save processing time and memory.

In S1229, the system control unit 50 converts the luminance values obtained for the respective face luminance obtaining regions in S1228 into values corresponding to proper exposure. This processing corrects the detected face luminance, which is lower than it would be if the image were captured at proper exposure, because the image data was captured at an exposure lower than proper exposure. The conversion of the luminance values may be performed so as to compensate for the exposure correction amount (reduction amount) determined by the exposure control, and may also be performed by using the tone correction amount calculated in S1225.

In S1230, the system control unit 50 calculates typical values of the detected face luminance values. As typical values, for example, the maximum value of the luminance values in the respective face luminance obtaining regions of the detected face region may be obtained.

When the system control unit 50 determines in S1226 that no face region is detected, the processing in S1231 is performed. In S1231, the system control unit 50 detects histogram feature amounts. A histogram feature amount may be, for example, the level (SD) to which pixels having a cumulative frequency of 1% from the dark side belong, or the level (HL) to which pixels having a cumulative frequency of 1% from the bright side belong. Subsequently, in S1232, the system control unit 50 converts the histogram feature amounts calculated in S1231 into values corresponding to image capture at proper exposure. This processing corrects the histogram feature amounts, which are detected to be lower than they would be if the image were captured at proper exposure, because the image data was captured at an exposure lower than proper exposure. The conversion of the luminance values may be performed so as to compensate for the exposure correction amount (reduction amount) determined by the exposure control, and may also be performed by using the tone correction amount calculated in S1225.

In S1233, the system control unit 50 calculates a target correction amount. The system control unit 50 calculates a typical luminance value of the face or a target luminance level with respect to the histogram feature amount. Then, the system control unit 50 generates a lookup table (input/output characteristic) defining an output luminance level with respect to an input luminance level as a tone correction amount (B) from the target luminance level and the minimum value and the maximum value of luminance in the image data by using spline interpolation. The tone correction amount (B) is a second tone correction amount.

The target tone correction amount for HDR may differ from that for SDR. For example, fig. 16A shows the appearance in SDR, and fig. 16B shows the appearance in HDR. Although the luminance values of the subject (person) are the same, the background is at most 100 cd/m² in SDR but exceeds 100 cd/m² in HDR. Therefore, even though the subject luminance is the same, the subject sometimes looks darker in HDR. This phenomenon, called brightness contrast, is caused by human visual characteristics. The subject luminance is the same in fig. 16A and 16B, but the difference between the subject luminance and the background luminance is larger in fig. 16B than in fig. 16A, so the viewer perceives the subject in fig. 16B as darker. That is, HDR can render high-luminance areas such as the sky brighter, which increases the likelihood that a subject appears darker than in SDR. In the present embodiment, therefore, the tone characteristic shown in fig. 15A is used for SDR, whereas for HDR a tone correction amount that lifts dark portions is applied by using the tone characteristic shown in fig. 15B, so that a good appearance can be obtained. Note that the tone correction of the present embodiment has been explained taking correction that compensates for underexposure as an example; for the purpose of image rendering, however, similar tone correction may be performed as brightness correction.

The target luminance levels for the typical face luminance value and the histogram feature amounts of the image data may be set to empirically determined fixed values. However, different target luminance levels may also be set according to the values of the typical luminance value and the histogram feature amounts. In this case, a lookup table defining the relationship between the input level and the target luminance level may be prepared for each parameter (the typical luminance value and each histogram feature amount) for which a target luminance level is set.

The correction characteristic that achieves conversion to the target luminance levels defined above is obtained by a method such as spline interpolation, and is saved as the tone correction amount (B) in the form of a lookup table (or a relational expression) as necessary.

In S1234, the system control unit 50 combines the tone correction amount (A) calculated in S1225 and the tone correction amount (B) calculated in S1233. For example, the system control unit 50 first applies the tone correction amount (A) to each input luminance level and then applies the tone correction amount (B) to the corrected luminance level, thereby obtaining the resulting luminance value and forming a lookup table of output luminance values for the respective input luminance levels.

In S1235, the system control unit 50 performs processing (limiter processing) of limiting the upper limit of the combined correction amount (combined tone correction amount) obtained in S1234. Combining the tone correction amount (A) and the tone correction amount (B) increases the total correction amount and can make noise in the corrected image conspicuous, so the overall correction amount is limited. The limiter processing may be implemented by preparing the maximum correction amount allowed for each luminance value in the form of a table, and replacing output levels that exceed the maximum correction amount among the values of the lookup table formed in S1234 with output levels corresponding to the maximum correction amount. Note that as the tone correction amount, a value for the shooting mode set by the user (SDR shooting or HDR shooting), or values for both SDR and HDR, may be calculated.
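The composition of S1234 and the limiter of S1235 amount to chaining two lookup tables and clamping the result, as in the following sketch; the per-level maximum table is an assumed stand-in for the table the text mentions.

```python
import numpy as np

def compose_tone_luts(lut_a: np.ndarray, lut_b: np.ndarray,
                      max_output: np.ndarray) -> np.ndarray:
    """Combine tone correction amounts (A) and (B) into one lookup table
    (S1234) and apply the limiter of S1235.

    lut_a and lut_b map an input luminance level to an output level
    (integer arrays of equal length); max_output holds the maximum output
    level allowed for each input level.
    """
    combined = lut_b[lut_a]                  # apply (A) first, then (B)
    return np.minimum(combined, max_output)  # limiter processing
```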

The description will return to fig. 7A and 7B. In S707, the system control unit 50 performs development by using the calculated white balance coefficients, tone correction parameters, and various HDR parameters. The HDR developed image is generated by using, for example, a color matrix, camera OETF curve data, color adjustment parameters, noise reduction parameters, and sharpness parameters as the other development parameters. As an example of the camera OETF (gamma curve), the inverse characteristic of the EOTF (electro-optical transfer function) of PQ (perceptual quantization) of Recommendation ITU-R BT.2100 is assumed; however, the OOTF (opto-optical transfer function) may also be incorporated on the camera side. Alternatively, the OETF of HLG (Hybrid Log-Gamma) of the same Recommendation ITU-R BT.2100 may be used.
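For reference, the PQ inverse EOTF cited here has the following closed form; the constants are the published SMPTE ST 2084 / BT.2100 values.

```python
import numpy as np

def pq_inverse_eotf(luminance_cd_m2: np.ndarray) -> np.ndarray:
    """BT.2100 PQ inverse EOTF: absolute luminance (cd/m^2) -> PQ signal in [0, 1]."""
    m1 = 2610 / 16384            # 0.1593017578125
    m2 = 2523 / 4096 * 128       # 78.84375
    c1 = 3424 / 4096             # 0.8359375
    c2 = 2413 / 4096 * 32        # 18.8515625
    c3 = 2392 / 4096 * 32        # 18.6875
    y = np.clip(luminance_cd_m2 / 10000.0, 0.0, 1.0)  # normalize to 10000 cd/m^2 peak
    return ((c1 + c2 * y**m1) / (1 + c3 * y**m1)) ** m2
```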

In S708, the system control unit 50 generates an MPF (Multi Picture Format) image for simple display, such as a two-picture comparison image, by resizing the image developed in S707, and compression-encodes the generated image in the HEVC format.

In S709, the system control unit 50 further adjusts the size of the MPF image generated in S708, thereby generating a thumbnail image for index display or the like having a smaller number of pixels than that of the MPF image, and compresses the generated image.

In S710, the system control unit 50 compression-encodes the HDR image developed in S707 as a main image. Various methods may be used for this compression; for example, 10-bit YUV422 data may be compressed by H.265 (ISO/IEC 23008-2 HEVC).

In S711, the system control unit 50 determines the recorded image quality set by the user. The system control unit 50 advances the process to S712 if it is determined that recording only the RAW image is set, advances the process to S713 if it is determined that recording only the HDR image is set, and advances the process to S714 if it is determined that recording both the RAW image and the HDR image is set.

In S712, the system control unit 50 generates a RAW image file having the structure shown in fig. 8A by compressing the RAW image and adding a header, and records the file on the recording medium 200 via the recording medium I/F18. Various compression methods may be used; for example, lossless compression, which is reversible and free of deterioration, or lossy compression, which is irreversible but reduces the file size, may be used. Further, the in-white-search-frame determination result obtained in S1205, the histogram obtained in S704, and the face detection result obtained in S705 are recorded in the header. The in-white-search-frame determination result recorded here is the result before the shading processing in S1206 is performed; therefore, the same determination result is recorded in both HDR shooting and SDR shooting. In addition, when the HDR shooting mode is set by the user, the HDR development parameters (such as the white balance coefficients obtained in fig. 12 and the tone correction amounts obtained in figs. 14A and 14B), and the MPF image for display generated in S708 by encoding the HDR-developed image data in the HEVC format, are also recorded as metadata as shown in fig. 8C. As described previously, the contents of these data differ depending on whether the shooting mode is HDR shooting or SDR shooting. Note that in SDR shooting, the development parameters obtained by using the above-described in-white-search-frame determination and the tone characteristics for SDR are recorded. Note also that even when HDR shooting is performed, SDR development parameters can be generated by performing the processing in S702 to S706 for SDR, and both sets of parameters can be recorded. Generating the development parameters for both HDR and SDR increases the processing load, however, so this may be done when there is headroom in the processing load, such as during single shooting rather than continuous shooting.

When there is margin in the processing load, such as when single shooting is performed, it is also possible to generate an SDR-quality main image, MPF image, and thumbnail image by using the SDR development parameters in addition to the HDR display images, and to record both the HDR display images and the SDR display images in the same file (fig. 8D).

When a thumbnail image is displayed, the image is small, so it is sufficient that the type of the image is identifiable. Accordingly, only the thumbnail image may be generated and saved as an SDR-developed image in S709 (fig. 8E). With this arrangement, even a display device or PC that does not support decoding of H.265, the HDR compression method, can display at least the thumbnail image.

In S713, the system control unit 50 adds static or dynamic metadata to the developed HDR image compression-encoded in the HEVC format, and records the encoded image as a HEIF (High Efficiency Image File Format) file on the recording medium 200 via the recording medium I/F18. Examples of the static metadata are the x and y coordinates of the three primary colors and the white point of a CEA-861.3-compliant display, and the maximum luminance value, minimum luminance value, maximum content luminance level, and maximum frame-average luminance level of the mastering display. An example of the dynamic metadata is the dynamic tone mapping metadata of color volume conversion defined by SMPTE ST 2094. Note that when the HDR characteristic is expressed by a PQ signal, a depth of 10 bits or more is preferable. However, since the conventional JPEG format has only 8 bits, a new container needs to be adopted for still-image HDR. The present embodiment uses the container of HEIF, an image file format developed by MPEG (Moving Picture Experts Group) and defined by MPEG-H Part 12 (ISO/IEC 23008-12). HEIF is characterized in that it can store not only a main image but also thumbnail images, a plurality of time-related images, and metadata such as EXIF or XMP in one file. It can also store a 10-bit image sequence encoded by HEVC, which makes HEIF convenient.

In S714 and S715, the processes in S712 and S713 are performed in sequence, thereby recording both the RAW image and the HDR image.

Fig. 8A shows the structure of a RAW image file of still RAW image data to be recorded on the recording medium 200 in the above-described recording process. The container file format of the RAW image file described below is the ISO base media file format defined by ISO/IEC 14496-12. This container format has a tree structure, and each node is called a box. Further, each box may have a plurality of boxes as sub-elements.

The RAW image data file 801 has a box ftyp 802 for describing the file type in the header, and also has a box moov 803 containing all metadata, a box mdat 808 containing the media data body of the track (image data), and other boxes 807. The box moov 803 has, as sub-elements, a box uuid 804 for storing MetaData 805 and a trak box 806 for storing information referring to ImageData. The MetaData 805 describes metadata of the image, such as the creation date/time of the image, shooting conditions, information indicating whether shooting was performed by HDR or SDR, the above-described detection metadata, and other shooting information. The box mdat 808 has ImageData 809, the captured still image data, as a child element.
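Because the container is the ISO base media file format, the top-level boxes can be walked with a generic parser; the following is a minimal sketch that omits 64-bit "largesize" boxes and nested traversal.

```python
import struct

def iter_boxes(data: bytes):
    """Iterate over top-level ISO base media boxes (ftyp, moov, mdat, ...)
    of a file with the structure of fig. 8A, yielding (type, payload).
    """
    offset = 0
    while offset + 8 <= len(data):
        size, box_type = struct.unpack_from(">I4s", data, offset)
        if size < 8:                     # malformed or unsupported size field
            break
        yield box_type.decode("ascii", "replace"), data[offset + 8: offset + size]
        offset += size

# Example: locate the mdat box that carries ImageData 809.
# for box_type, payload in iter_boxes(raw_bytes):
#     if box_type == "mdat":
#         image_data = payload
```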

Note that image data to be recorded in ImageData 809 of a RAW image taken by SDR is different from that of a RAW image taken by HDR.

Fig. 8B shows ImageData 809 to be recorded in a RAW image shot by SDR. ImageData 809 shown in fig. 8B has a THM image 821, an MPF image 822, and a main image 823 (these images are each developed at SDR image quality and compressed by JPEG), and a RAW image 824 and RAW development parameters 825. Each SDR quality image is an image with 8 bits (256 tones) for each color component. Note that the RAW development parameters 825 shown in fig. 8B include at least the development parameters for SDR development.

Fig. 8C shows ImageData 809 to be recorded in a RAW image in which only HDR images are included as display images when shooting by HDR. ImageData 809 shown in fig. 8C has a THM image 826, an MPF image 827, and a main image 828 (these images are each developed at HDR image quality and compressed by HEVC), and a RAW image 824 and RAW development parameters 825. Each HDR quality image is an image having 10 bits (1024 tones) for each color component. The RAW development parameters 825 shown in figs. 8C, 8D, and 8E include at least the development parameters for HDR development.

Fig. 8D shows ImageData 809 to be recorded in a RAW image having both HDR images and SDR images as display images when shooting by HDR. ImageData 809 shown in fig. 8D has a THM image 821, an MPF image 822, and a main image 823 (each developed at SDR image quality and compressed by JPEG), a THM image 826, an MPF image 827, and a main image 828 (each developed at HDR image quality and compressed by HEVC), and a RAW image 824 and RAW development parameters 825.

Fig. 8E shows ImageData 809 to be recorded in a RAW image when shooting by HDR, in which only the THM image is an SDR image and the MPF image and the main image are HDR-quality display images. ImageData 809 shown in fig. 8E has: a THM image 821 developed at SDR image quality and compressed by JPEG, an MPF image 827 and a main image 828 each developed at HDR image quality and compressed by HEVC, and a RAW image 824 and RAW development parameters 825.

The file formats illustrated here are merely one embodiment, and other boxes may exist as needed. The display images may also be stored in the box moov 803 or in the other boxes 807.

Since the file format as described above is used, the development parameters for an SDR image are recorded in a RAW image file shot as SDR, and the development parameters for an HDR image are recorded in a RAW image file shot as HDR. Therefore, even when the RAW image is developed later, the development can be performed by reflecting the shooting settings. For example, a device that performs RAW development (the device may be the digital camera 100, or another device such as a PC) refers to the MetaData 805 of the RAW image, and determines whether the image was shot as HDR or SDR. If it is determined that the RAW image was shot as HDR, the RAW image is developed as an HDR image by using the development parameters for an HDR image contained in the file. If it is determined that the RAW image was shot as SDR, the RAW image is developed as an SDR image by using the development parameters for an SDR image contained in the file. To make such processing possible, the digital camera 100 records the development parameters for an SDR image in a RAW image file shot as SDR, and records the development parameters for an HDR image in a RAW image file shot as HDR. Note that the device that performs RAW development can record a developed still HDR image by using the above-described HEIF container.

Further, the same determination results are recorded as detection metadata in both HDR shooting and SDR shooting. Therefore, even a RAW image file captured in the HDR shooting mode can be developed as an SDR image by using the recorded detection data. Consequently, even a device supporting only SDR images can appropriately display a RAW image file captured in the HDR shooting mode.

Fig. 9A is a flowchart showing details of the playback mode processing performed by the system control unit 50 using the display unit 28. This processing is realized by loading a program recorded in the nonvolatile memory 56 into the system memory 52 and executing the program by the system control unit 50.

In S901, the system control unit 50 determines whether playback is index playback or normal playback. If it is determined in S901 that the playback is index playback, the system control unit 50 advances the process to S902. In S902, the system control unit 50 decides the number of images to be played back.

In S903, the system control unit 50 decides an image to be played back. In S904, the system control unit 50 performs processing of drawing an image to be played back.

In S905, the system control unit 50 determines whether the rendering of all the images to be displayed is completed. If it is determined that the drawing is not completed, the system control unit 50 returns the process to S903 and continues the drawing processing. If it is determined that the rendering processing is completed, the system control unit 50 advances the process to S906. In S906, the system control unit 50 performs image output processing to the display unit 28 and terminates the display processing. After that, the system control unit 50 performs operation acceptance processing.
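As a minimal sketch, the S901 to S906 flow amounts to the following (the helper callables stand in for the per-step processing and are assumptions, not part of the embodiment; 36 is the index-playback threshold used later in this embodiment):

def playback_mode(index_playback, images, draw, output):
    # S902: in index playback several images are drawn; in normal playback
    # a single image is drawn. The count logic here is a simplification.
    count = min(len(images), 36) if index_playback else 1
    drawn = [draw(image) for image in images[:count]]  # S903-S905 loop
    output(drawn)                                      # S906 image output

# Usage with trivial stand-ins for the drawing and output processing:
playback_mode(True, ["IMG_0001.HIF", "IMG_0002.JPG"],
              draw=str.upper, output=print)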

Fig. 9B is a flowchart showing details of the drawing processing, performed by the system control unit 50, within the playback mode processing using the display unit 28.

In S911, the system control unit 50 obtains information of the image to be played back. In S912, the system control unit 50 decides the image to be played back. In S913, the system control unit 50 reads out the image to be played back from the recording medium 200. In S914, the system control unit 50 performs processing for decompressing the image to be played back. In S915, the system control unit 50 collects data of the respective pixels from the image data decompressed in S914. The collected data is luminance data or the like, and is used for histogram processing and highlight warning processing.

In S916, the system control unit 50 determines whether the image to be played back is an HDR image or an SDR image. The system control unit 50 advances the process to S917 if it is determined that the image to be played back is an HDR image, and to S920 if it is determined that the image is an SDR image. In S917, the system control unit 50 checks the HDR assist display setting for playback. The system control unit 50 advances the process to S918 if assist 1 is set, and to S919 if assist 2 is set. In S918, the system control unit 50 performs HDR → SDR conversion processing on the image decompressed in S914 according to the setting of assist 1. In S919, the system control unit 50 performs HDR → SDR conversion processing on the image decompressed in S914 according to the setting of assist 2.
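The S916 to S919 branch can be sketched as follows. The concrete difference between assist 1 and assist 2 is not specified at this level, so the two settings are modeled here simply as different conversion gains, which is purely an assumption for illustration:

def hdr_to_sdr(pixels, gain):
    # Trivial stand-in: scale 10-bit HDR code values into 8 bits and clip.
    return [min(255, int(p * gain / 4)) for p in pixels]

def convert_for_display(pixels, is_hdr, assist_setting):
    if not is_hdr:
        return pixels                        # SDR image: no conversion
    if assist_setting == 1:
        return hdr_to_sdr(pixels, gain=1.0)  # S918: assist 1 (assumed gain)
    return hdr_to_sdr(pixels, gain=0.8)      # S919: assist 2 (assumed gain)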

In S920, the system control unit 50 performs processing for enlarging or reducing the image decompressed in S914, or the image subjected to the SDR conversion processing in S918 or S919, to a size suitable for the display unit 28. Then, in S921, the system control unit 50 decides the placement of the generated image and terminates the rendering processing.

Fig. 9C to 9H are flowcharts showing details of the reading target image selection processing of the system control unit 50.

In S926, the system control unit 50 checks the information of the obtained image and determines whether the image can be played back. The system control unit 50 advances the process to S927 if it is determined that the image can be played back, and advances the process to S936 if it is determined that the image cannot be played back.

In S927, the system control unit 50 determines whether the image to be played back is a still image. The system control unit 50 advances the process to S928 if it is determined that the image to be played back is a still image, and otherwise, advances the process to S935.

In S928, the system control unit 50 determines whether the image to be played back is a RAW image. The system control unit 50 advances the process to S929 if it is determined that the image to be played back is a RAW image, and if not, advances the process to S930.

In S929, the system control unit 50 determines whether the RAW image was captured in HDR or in SDR. The system control unit 50 makes this determination by using the metadata in the RAW file described with reference to figs. 8A to 8E. The system control unit 50 advances the process to S931 if it is determined that the RAW image was captured in HDR, and to S932 if it is determined that it was captured in SDR.

In S930, the system control unit 50 determines whether the still image determined in S928 not to be a RAW image is an image captured in HDR or in SDR. In the present embodiment, an image captured in HDR is recorded by HEIF and an image captured in SDR is recorded by JPEG, so whether the image is an HDR image or an SDR image is determined by whether the format is HEIF or JPEG. However, it is also possible to make this determination by using the metadata in the HEIF.
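Under the convention of this embodiment (HDR stills recorded by HEIF, SDR stills by JPEG), the S930 determination can be sketched as follows; deciding by file extension is a simplification, and, as noted above, the metadata in the HEIF could be consulted instead:

def is_hdr_still(path):
    # Hypothetical helper: classify a developed still image by its container.
    name = path.lower()
    if name.endswith((".hif", ".heif", ".heic")):
        return True      # recorded by HEIF: treated as an HDR image here
    if name.endswith((".jpg", ".jpeg")):
        return False     # recorded by JPEG: treated as an SDR image here
    raise ValueError("not a developed still image: " + path)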

In S931, the system control unit 50 selects image data to be used in playback from a RAW image captured in HDR. In S932, the system control unit 50 selects image data to be used in playback from a RAW image captured in SDR. In S933, the system control unit 50 selects image data to be used in playback from a still image developed in HDR. In S934, the system control unit 50 selects image data to be used in playback from a still image developed in SDR. In S935, the system control unit 50 selects image data to be displayed from the moving image file. In S936, the system control unit 50 performs playback-image non-display processing. In this processing, information indicating that the image is unplayable is displayed to notify the user that the image cannot be played back.

Fig. 9D is a flowchart showing how the system control unit 50 selects image data to be used in playback from a RAW image captured in HDR.

In S941, the system control unit 50 determines whether playback is index playback or normal playback. The system control unit 50 advances the process to S942 if it is determined that the playback is the index playback, and advances the process to S943 if it is determined that the playback is the normal playback.

In S942, the system control unit 50 determines image data to be used according to the number of images to be played back through index playback. Although the threshold value is 36 in the present embodiment, this number is merely an example, and thus the number may be set appropriately by the user or may be decided according to the size of the display unit 28. If it is determined that the number of images to be displayed is 36 or more, the system control unit 50 advances the process to S945, and if it is determined that the number is less than 36, advances the process to S944.

In S943, the system control unit 50 decides "HDR main image for display (HEVC)" (828) as image data to be used in playback. In S944, the system control unit 50 decides "HDR MPF image for display (HEVC)" (827) as image data to be used in playback. In S945, the system control unit 50 decides "HDR THM image for display (HEVC)" (826) as image data to be used in playback.
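In code form, S941 to S945 reduce to choosing one of the three stored representations from the playback type and the image count; a minimal sketch, with the threshold 36 being this embodiment's example value:

def select_hdr_raw_display_image(index_playback, num_images):
    if not index_playback:
        return "HDR main image for display (HEVC)"  # S943, item 828
    if num_images >= 36:
        return "HDR THM image for display (HEVC)"   # S945, item 826
    return "HDR MPF image for display (HEVC)"       # S944, item 827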

Figs. 9E-1 and 9E-2 are flowcharts showing the processing of selecting image data to be used in playback from a RAW image captured in HDR when the RAW image also has SDR images for display.

In S951, the system control unit 50 determines whether playback is index playback or normal playback. The system control unit 50 advances the process to S952 if it is determined that the playback is the index playback, and advances the process to S953 if it is determined that the playback is the normal playback.

In S952, the system control unit 50 determines image data to be used according to the number of playback images for index playback. Assume that the threshold value determined in this case is 36. The system control unit 50 advances the process to S955 if it is determined that the number of playback images is 36 or more, and advances the process to S954 if it is determined that the number is less than 36.

In S953, S954, and S955, the system control unit 50 determines whether the RAW image to be played back contains an SDR image. This determination uses the metadata in the RAW file explained with reference to fig. 8A to 8E.

In S956, the system control unit 50 decides "HDR main image for display (HEVC)" (828) as image data to be used in playback. In S957, the system control unit 50 decides "SDR main image (JPEG) for display" (823) as image data to be used in playback. In S958, the system control unit 50 decides "HDR MPF image for display (HEVC)" (827) as image data to be used in playback. In S959, the system control unit 50 decides "SDR MPF image for display (JPEG)" (822) as image data to be used in playback. In S960, the system control unit 50 decides "HDR THM image for display (HEVC)" (826) as image data to be used in playback. In S961, the system control unit 50 decides "SDR THM image for display (JPEG)" (821) as image data to be used in playback.
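The flow of figs. 9E-1 and 9E-2 thus extends the selection above with the S953 to S955 check. The sketch below assumes that the SDR rendition is preferred whenever the file contains one, which is consistent with the purpose of the figure (playing back RAW files that carry SDR images for display):

def select_display_image(index_playback, num_images, has_sdr):
    if not index_playback:
        size = "main"                                    # S953 branch
    elif num_images >= 36:
        size = "THM"                                     # S955 branch
    else:
        size = "MPF"                                     # S954 branch
    if has_sdr:
        return "SDR %s image for display (JPEG)" % size  # S957/S959/S961
    return "HDR %s image for display (HEVC)" % size      # S956/S958/S960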

Fig. 9F is a flowchart showing how the system control unit 50 selects image data to be used in playback from a still image developed in HDR.

In S971, the system control unit 50 determines whether the playback is the index playback or the normal playback. The system control unit 50 advances the process to S972 if it determines that the playback is the index playback, and advances the process to S973 if it determines that the playback is the normal playback.

In S972, the system control unit 50 determines image data to be used according to the number of images to be played back in index playback. In this embodiment, the threshold for the number is 36. If it is determined that the number of playback images is 36 or more, the system control unit 50 advances the process to S975, and if it is determined that the number is less than 36, the system control unit 50 advances the process to S974.

In S973, the system control unit 50 decides "HDR main image (HEVC)" (not shown) as image data to be used in playback. In S974, the system control unit 50 decides "HDR MPF image (HEVC)" (not shown) as image data to be used in playback. In S975, the system control unit 50 decides "HDR THM image (HEVC)" (not shown) as image data to be used in playback.

Fig. 9G is a flowchart for selecting image data to be used in playback from a RAW image captured in SDR.

In S981, the system control unit 50 determines whether the playback is the index playback or the normal playback. The system control unit 50 advances the process to S982 if it is determined that the playback is the index playback, and advances the process to S983 if it is determined that the playback is the normal playback.

In S982, the system control unit 50 determines image data to be used according to the number of playback images in the index playback. In this embodiment, the threshold for the number is 36. The system control unit 50 advances the process to S985 if it is determined that the number of images to be displayed is 36 or more, and advances the process to S984 if it is determined that the number is less than 36.

In S983, the system control unit 50 decides "SDR main image for display (JPEG)" (823) as image data to be used in playback. In S984, the system control unit 50 decides "SDR MPF image for display (JPEG)" (822) as image data to be used in playback. In S985, the system control unit 50 decides "SDR THM image for display (JPEG)" (821) as image data to be used in playback.

Fig. 9H is a flowchart for selecting image data to be used in playback from a still image developed in SDR.

In S991, the system control unit 50 determines whether playback is index playback or normal playback. The system control unit 50 advances the process to S992 if it is determined that the playback is the index playback, and advances the process to S993 if it is determined that the playback is the normal playback.

In S992, the system control unit 50 determines image data to be used according to the number of images to be played back in index playback. In this embodiment, the threshold for the number is 36. If it is determined that the number of playback images is 36 or more, the system control unit 50 advances the process to S995, and if it is determined that the number of playback images is less than 36, the system control unit 50 advances the process to S994.

In S993, the system control unit 50 decides "SDR main image (JPEG)" (not shown) as image data to be used in playback. In S994, the system control unit 50 decides "SDR MPF image (JPEG)" (not shown) as image data to be used in playback. In S995, the system control unit 50 decides "SDR THM image (JPEG)" (not shown) as image data to be used in playback.

Fig. 10A is a flowchart showing details of the playback mode processing using the external apparatus 300. This processing is realized by loading a program recorded in the nonvolatile memory 56 into the system memory 52 and executing the program by the system control unit 50.

In S1001, the system control unit 50 determines whether the external device 300 is connected to the digital camera 100. The system control unit 50 advances the process to S1002 if it is determined that the external apparatus 300 is connected, and advances the process to S1005 if it is determined that the external apparatus 300 is not connected.

In S1002, the system control unit 50 determines whether the playback HDR setting is valid. As the playback setting, "perform HDR playback", "do not perform HDR playback", or "synchronize with shooting mode" can be selected. When "perform HDR playback" is set and the external apparatus 300 supports HDR, the HDR output mode is used regardless of whether the image to be played back is an HDR image or an SDR image. "Do not perform HDR playback" selects the SDR output mode. "Synchronize with shooting mode" is a mode in which the playback output follows the shooting mode: in the HDR shooting mode, in which "HDR shooting" is set to "perform", HDR output is performed in playback, and in the SDR shooting mode, in which "HDR shooting" is set to "do not perform", SDR output is performed in playback. Note that "synchronize with shooting mode" is the default setting and is maintained even if the user changes the shooting mode; the synchronization is canceled only when the user changes the playback setting from "synchronize with shooting mode" to "perform HDR playback" or "do not perform HDR playback". File formats such as "HEIF (playback)" and "JPEG (playback)" may also be used as options instead of "perform HDR playback" and "do not perform HDR playback". Likewise, file formats such as "HEIF (shooting)" and "JPEG (shooting)" may be used as options instead of "perform HDR shooting" and "do not perform HDR shooting".

In S1002, if "perform HDR playback" is selected, the system control unit 50 advances the process to S1003, and if "do not perform HDR playback" is selected, advances the process to S1005. Further, if "synchronize with shooting mode" is selected, the system control unit 50 advances the process to S1003 if "HDR shooting" set in S606 is "perform", and to S1005 if it is "do not perform".

In S1003, the system control unit 50 determines whether the external device 300 is a display supporting HDR. If it is determined that the external apparatus 300 is a display supporting HDR, the system control unit 50 advances the process to S1004, and if not, advances the process to S1005.

In S1004, the system control unit 50 outputs an HDR signal to the external apparatus 300. In S1005, the system control unit 50 outputs an SDR signal to the external device 300.
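Taken together, S1002 to S1005 reduce to the following decision; a sketch in which the setting names are the translated labels above and hdr_capable stands for the S1003 capability check of the external apparatus 300:

def decide_output_signal(playback_setting, shooting_is_hdr, hdr_capable):
    if playback_setting == "perform HDR playback":
        want_hdr = True
    elif playback_setting == "do not perform HDR playback":
        want_hdr = False
    else:                        # "synchronize with shooting mode" (default)
        want_hdr = shooting_is_hdr
    return "HDR" if want_hdr and hdr_capable else "SDR"  # S1004 / S1005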

S1006 to S1011 are the same as S901 to S906 of fig. 9A, and thus a description thereof will be omitted.

Fig. 10B is a flowchart showing details of the rendering process (S1009) when the HDR signal is output to the external apparatus 300.

S1021 to S1025, S1028, and S1029 are the same as S911 to S915, S920, and S921 described with reference to fig. 9B, and thus the description thereof will be omitted.

In S1026, the system control unit 50 determines whether the image to be played back is an HDR image or an SDR image. The system control unit 50 advances the process to S1028 if it is determined that the image to be played back is an HDR image, and advances the process to S1027 if it is determined that the image to be played back is an SDR image.

In S1027, the system control unit 50 performs SDR → HDR conversion processing. The following S1028 and S1029 are the same as S920 and S921 in fig. 9B. Note that the details of the rendering processing (S1009) when the SDR signal is output to the external apparatus 300 are the same as those of fig. 9B, and thus the description thereof will be omitted.

Fig. 11A is a flowchart showing details of the playback menu processing. This processing is realized by loading a program recorded in the nonvolatile memory 56 into the system memory 52 and executing the program by the system control unit 50.

In S1101, the system control unit 50 determines whether the user has selected RAW development via a RAW development setting item (not shown). The system control unit 50 advances the process to S1103 if it is determined that RAW development is not set, and to S1102 if it is determined that RAW development is set.

In S1103, the system control unit 50 determines whether HDR → SDR conversion is set via an SDR conversion setting item (not shown) for HDR files. The system control unit 50 advances the process to S1105 if it is determined that HDR → SDR conversion is not set, and to S1104 if it is determined that it is set.

In S1105, the system control unit 50 determines whether file transfer is set via a file transfer setting item (not shown). The system control unit 50 advances the process to S1107 if it is determined that file transfer is not set, and to S1106 if it is determined that file transfer is set.

In S1107, the system control unit 50 determines whether to leave the menu. If determined as "not leaving", the system control unit 50 returns the process to S1101, and if determined as "leaving", the system control unit 50 terminates the playback menu process.

In S1106, the system control unit 50 performs transmission processing on the image file designated by the user. When an HDR image file is to be transferred and the receiving destination can only display SDR, the HDR → SDR conversion of S1104 may be performed in the camera so that the HDR image file is transferred as an SDR image file.

In S1102, the system control unit 50 performs RAW development on the RAW image file designated by the user. Details of this RAW development processing will be described below with reference to the block diagram shown in fig. 11B. Note that the image processing unit 24 includes the respective processing units shown in fig. 11B, but these processing units may also be realized by a program executed by the system control unit 50.

The system control unit 50 causes the image processing unit 24 to perform RAW development processing on the captured RAW image 1101 recorded on the recording medium 200. A RAW image is a set of pixels in a Bayer array, so each pixel has intensity data of only one color component. Note that RAW images include RAW(SDR) obtained by SDR shooting and RAW(HDR) obtained by HDR shooting. A RAW(SDR) image can be developed by either SDR development or HDR development; likewise, a RAW(HDR) image can be developed by either HDR development or SDR development. The white balance unit 1102 performs processing for making white areas white. In HDR development of RAW(HDR), white balance processing is performed by using the HDR white balance coefficients recorded in the file as HDR development elements. On the other hand, when SDR development is performed, white balance processing is performed by generating SDR white balance coefficients from the white search frame determination results stored in the file as detection elements. When the white balance coefficients of both HDR and SDR are recorded in the RAW file, the desired coefficients may of course be used as appropriate.
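As an illustration of the white balance step, the sketch below applies one gain per color channel to Bayer data. The RGGB layout and the coefficient values are assumptions for illustration; in the actual processing the coefficients come from the file as described above:

def apply_white_balance(bayer, gains, pattern="RGGB"):
    # bayer: 2-D list of raw photosite values; gains: per-channel multipliers.
    out = [row[:] for row in bayer]
    for y, row in enumerate(out):
        for x, value in enumerate(row):
            channel = pattern[(y % 2) * 2 + (x % 2)]  # color at this photosite
            out[y][x] = value * gains[channel]
    return out

# Usage: HDR development would pass the HDR coefficients stored in the file.
balanced = apply_white_balance([[100, 80], [90, 60]],
                               {"R": 2.0, "G": 1.0, "B": 1.8})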

The color interpolation unit 1103 performs noise reduction and interpolates the color mosaic image, thereby generating a color image in which every pixel has three components (e.g., R, G, and B color information). The generated color image is processed by the matrix conversion unit 1104 and the gamma conversion unit 1105, thereby generating a basic color image. After that, the color brightness adjustment unit 1106 performs processing for improving the appearance of the image on the generated color image. For example, a night scene is detected and image correction such as saturation emphasis is performed in accordance with the scene. Tone correction is performed in the same manner. However, when HDR development is performed on RAW(HDR), tone correction is performed by using the HDR tone correction amount stored in the file as an HDR development element. In contrast, when SDR development is performed, tone correction is performed by calculating an SDR tone correction amount using the face detection results and histogram recorded in the file as detection elements. When the tone correction amounts of both HDR and SDR are recorded in the RAW file, the desired amount may of course be used as appropriate.
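Only the ordering of the stages is taken from fig. 11B; as a sketch, the development chain can be modeled as a simple function pipeline, with identity functions standing in for the actual processing of each unit:

def develop(raw_image, stages):
    image = raw_image
    for stage in stages:        # each stage transforms the running image
        image = stage(image)
    return image

pipeline = [
    lambda img: img,  # white balance unit 1102
    lambda img: img,  # color interpolation unit 1103 (with noise reduction)
    lambda img: img,  # matrix conversion unit 1104
    lambda img: img,  # gamma conversion unit 1105
    lambda img: img,  # color brightness adjustment unit 1106 (tone correction)
]
developed = develop([[100, 80], [90, 60]], pipeline)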

After the desired color adjustment, the compression unit 1107 compresses the high-resolution image by a method such as JPEG or HEVC, and the recording unit 1108 generates a developed image to be recorded on a recording medium such as a flash memory. Note that the HEIF container described above can store a plurality of images, and thus can store an SDR developed image in addition to an HDR developed image.

In S1104, the system control unit 50 performs SDR conversion on the HDR image file designated by the user. Since an HDR image is generated in a color space with a PQ OETF and a BT.2020 color gamut, tone mapping and gamut mapping into an SDR color space such as γ2.2 or sRGB are required. A known method can be used for this. For example, when tone mapping that matches proper exposure to SDR is performed, a result brighter than SDR may be obtained.
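Such a conversion can be sketched as follows: decode the PQ signal to linear light (the SMPTE ST 2084 constants below are the standard ones), scale against SDR reference white, clip, and re-encode with γ2.2. The hard clip stands in for a real tone-mapping knee, and gamut mapping from BT.2020 is omitted; both simplifications are assumptions of this sketch:

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(e):
    # SMPTE ST 2084 EOTF: normalized code value e in [0, 1] -> cd/m^2.
    p = e ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

def pq_to_sdr(e, sdr_white_nits=100.0):
    x = pq_to_nits(e) / sdr_white_nits  # 1.0 == SDR reference white
    y = min(x, 1.0)                     # hard clip; real tone maps use a knee
    return y ** (1 / 2.2)               # gamma-2.2 encode into SDR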

(modification example)

When recording a RAW image by HDR shooting in the above-described embodiment, the HDR-developed main image, the HDR-developed MPF image for display, and the HDR-developed THM (thumbnail) image are recorded in the RAW image file together with the RAW image data, as shown in fig. 8C. Alternatively, as shown in fig. 8D, not only the HDR-developed main image, MPF image, and THM image but also the SDR-developed main image, MPF image, and THM image are recorded together with the RAW image. Otherwise, as shown in fig. 8E, the HDR-developed main image and MPF image and the SDR-developed THM image are recorded together with the RAW image.

In the present modification, in the shooting processing, an HDR image file or an SDR image file is associated with a RAW image file and recorded together with it. When a RAW image file is recorded in association with an HDR or SDR image file, if the development method or compression encoding method differs between the HDR or SDR image contained in the RAW image file and the HDR or SDR image contained in the associated image file, management becomes complicated and playback compatibility cannot be maintained. Therefore, in the present modification, images developed by the same method and encoded by the same compression encoding method are recorded as the main image, the MPF image, and the THM image in both the RAW image file and the image file associated with it.

Figs. 19A and 19B are flowcharts of the shooting processing according to this modification, which corresponds to the HDR shooting processing in S418 (figs. 7A and 7B) and the SDR shooting processing in S434. This processing is realized by loading a program recorded in the nonvolatile memory 56 into the system memory 52 and executing the program by the system control unit 50.

First, in S1901, the system control unit 50 obtains RAW data. The process is the same as S701.

Then, in S1902, the system control unit 50 determines whether the setting of the captured image quality is the HDR capturing mode. The system control unit 50 advances the process to S1903 if it determines that the HDR photographing mode is set, and advances the process to S1909 if it determines that the SDR photographing mode is set.

In S1903, the system control unit 50 performs the same processing as S702 to S710. That is, by using the RAW data obtained in S1901, various parameters are calculated and detection processing is performed, and by performing HDR development processing on the obtained RAW data, an HDR main image, an HDR MPF image, and an HDR THM image are generated. Then, the 10-bit data of each of the HDR main image, the HDR MPF image, and the HDR THM image is compression-encoded in the HEVC format, thereby generating HDR compressed image data (HEVC).

Subsequently, in S1904, the system control unit 50 determines the recorded image quality set by the user, in the same manner as in S711. The system control unit 50 advances the process to S1905 if only a RAW image file is to be recorded, to S1906 if only an HDR image file is to be recorded, and to S1907 if both a RAW image file and an HDR image file are to be recorded.

In S1905, the system control unit 50 records the RAW image data obtained in S1901 as a RAW image file having the container file format shown in fig. 8A, in the same manner as in S712. In S1905, the HDR main image, HDR MPF image, and HDR THM image generated and HEVC-encoded in S1903 are recorded in ImageData 809 as images for display together with the RAW image data, as shown in fig. 8C.

In S1906, the system control unit 50 records the HDR main image, HDR MPF image, and HDR THM image generated and compression-encoded in S1903 as an image file having the HEIF format in the same manner as in S713. That is, the HDR main image is recorded as the main image of the HEIF file, and the HDR MPF image and the HDR THM image are recorded in the HEIF file as images for display.

The HEIF file will now be explained. Fig. 17A shows the structure of an image file having the HEIF format. The container file format of the image file illustrated below is the ISO base media file format defined by ISO/IEC 14496-12. The container format of the file therefore has a tree structure of nodes called boxes, and each box may have a plurality of boxes as sub-elements. The HEIF image data file 1701 has a box ftyp 1702 describing the file type at its head, a box meta 1703 containing metadata, and a box mdat 1708 containing the media data body (image data) of the track. The box meta 1703 has, as sub-elements, a trak box 1706 storing information that refers to ImageData and a box MetaData 1705-1 storing metadata other than the metadata defined by EXIF. The box mdat 1708 has, as sub-elements, a box MetaData 1705-2 storing metadata such as the shooting date/time and shooting conditions of the image as defined by EXIF, and ImageData 1709, which is the captured still image data. The image data recorded in ImageData 1709 by SDR shooting differs from that recorded by HDR shooting. In S1906, the images illustrated in fig. 17C are recorded in ImageData 1709: a THM image 1726 for display and an MPF image 1727 for display, each developed at HDR image quality and compressed by HEVC, and a main image 1728 developed at HDR image quality and compressed by HEVC.
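For reference, the box hierarchy of fig. 17A can be pictured as the following nested structure; the payload strings are placeholders describing the contents, not a parser of the actual boxes:

heif_layout = {
    "ftyp": "file type",                              # box 1702
    "meta": {                                         # box 1703
        "trak": "references into ImageData",          # box 1706
        "MetaData": "metadata other than EXIF",       # box 1705-1
    },
    "mdat": {                                         # box 1708
        "MetaData": "EXIF shooting date/conditions",  # box 1705-2
        "ImageData": ["THM 1726 (HEVC)",              # box 1709, fig. 17C:
                      "MPF 1727 (HEVC)",              # the HDR shooting case
                      "main 1728 (HEVC)"],
    },
}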

In S1907, the system control unit 50 records the RAW image data obtained in S1901 as a RAW image file having the container file format shown in fig. 8A, in the same manner as in S1905. Then, in S1908, the system control unit 50 records the HDR main image, HDR MPF image, and HDR THM image, each generated and HEVC-encoded in S1903, as an image file having the HEIF format, in the same manner as in S1906. That is, when recording the RAW image file of the RAW image data and the HDR-developed image file (HEIF file), the same image, subjected to the same development processing and encoded in the same encoding format (HEVC), is recorded as the MPF image for display and the THM image for display. Further, the image data of the main image for display recorded in the RAW image file and the image data of the main image recorded in the developed image file (HEIF file) are the same image encoded in the same encoding format (HEVC). Note that the system control unit 50 associates the RAW image file recorded in S1907 with the developed image file (HEIF file) recorded in S1908.

As described above, in the RAW file and the HEIF file recorded in association with each other, the same image subjected to the same HDR development processing and encoded in the same encoding format (HEVC) is recorded as the image for display. This makes it possible to prevent management from becoming complicated and to maintain playback compatibility.

Further, as the images for display recorded in the RAW file in S1905 and in the HEIF file in S1906, images subjected to the same HDR development processing and the same format (HEVC) encoding processing as in S1907 and S1908 are recorded. Therefore, even when the recording format is changed, it is possible to prevent the inconvenience in which another playback device or the like cannot play back an image in a particular image file among image files captured with the same HDR shooting settings, and playback compatibility can thus be maintained.

In S1909, the system control unit 50 causes the image processing unit 24 to perform SDR development processing on the RAW image data obtained in S1901, thereby generating an SDR main image, an SDR MPF image, and an SDR THM image. Then, SDR compressed image data (JPEG) is generated by compression-encoding each image in the JPEG format. Since these are SDR images, the developed and compression-encoded image data is 8-bit YUV420 data. Further, the calculation of various parameters and the detection processing are performed in the same manner as in HDR shooting.

Subsequently, in S1910, the system control unit 50 determines the recorded image quality set by the user. The system control unit 50 advances the process to S1911 if only a RAW image file is to be recorded, and to S1912 if only an SDR image file is to be recorded. Further, the system control unit 50 advances the process to S1913 if a RAW image file and an SDR image file are to be recorded, and to S1915 if a RAW image file and a plurality of types of SDR image files are to be recorded.

In S1911, the system control unit 50 compresses the RAW image, adds a header, and records it as a RAW image file having the container file structure shown in fig. 8A on the recording medium 200 via the recording medium I/F 18, in the same manner as in S1905. Unlike in S1905, however, the SDR images generated in S1909 are recorded as the image data for display, and the SDR development parameters generated in S1909 are recorded as the RAW development parameters. That is, in S1911, the data is recorded in ImageData 809 of the RAW image file as shown in fig. 8B: the RAW image data (lossless or lossy compression) 824 obtained in S1901 and the SDR RAW development parameters generated in S1909 are recorded in ImageData 809 of the RAW image file 801, and the SDR THM image (JPEG) 821, SDR MPF image (JPEG) 822, and SDR main image (JPEG) 823, each generated in S1909, are recorded in ImageData 809 of the RAW image file 801 as the images for display.

In S1912, the system control unit 50 records the SDR-developed image data (THM image, MPF image, and main image) generated and JPEG-encoded in S1909 as a JPEG file on the recording medium 200 via the recording medium I/F 18. Fig. 18 shows the file structure of the JPEG format. The image data file 1800 having the JPEG format has metadata 1804 such as EXIF in its header, and also has a THM image 1801, an MPF image 1802, and a main image 1803, each of SDR image quality and compressed by JPEG. The file format shown in this example is one embodiment and may contain other information as needed.

In S1913, the system control unit 50 records the RAW image data obtained in S1901 and the JPEG-compressed SDR image data generated in S1909 as a RAW image file on the recording medium 200 via the recording medium I/F 18, in the same manner as in S1911. Then, in S1914, the system control unit 50 records the JPEG-compressed SDR image data generated in S1909 as a JPEG image file on the recording medium 200 via the recording medium I/F 18, in the same manner as in S1912. As described above, when the RAW image file of the RAW image data and the SDR-developed image file (JPEG file) are recorded, the same image, subjected to the same development processing and encoded in the same encoding format (JPEG), is recorded as the MPF image for display and the THM image for display. Further, the image data of the image for display recorded in the RAW image file and the image data of the main image recorded in the developed image file (JPEG file) are the same image subjected to the same development processing and encoded in the same encoding format (JPEG). Note that the system control unit 50 records the RAW image file recorded in S1913 and the developed image file (JPEG file) recorded in S1914 in association with each other.

As described above, in the RAW file and the JPEG file recorded in association with each other, the same image subjected to the same SDR development processing and encoded in the same encoding format (JPEG) is recorded as the image for display. This makes it possible to prevent management from becoming complicated and to maintain playback compatibility.

In S1915, the system control unit 50 records the RAW image data obtained in S1901 and the JPEG-compressed SDR image data generated in S1909 as a RAW image file on the recording medium 200 via the recording medium I/F 18, in the same manner as in S1911. Then, in S1916, the system control unit 50 records the JPEG-compressed SDR image data generated in S1909 as a JPEG image file on the recording medium 200 via the recording medium I/F 18, in the same manner as in S1912. Further, in S1917, the system control unit 50 records the JPEG-compressed SDR image data generated in S1909 as an HEIF file on the recording medium 200 via the recording medium I/F 18. In S1917, unlike in S1906, the image data is recorded in ImageData 1709 as shown in fig. 17B: the SDR THM image (JPEG) 1721 and the SDR MPF image (JPEG) 1722 generated in S1909 are recorded as images for display, and the SDR main image (JPEG) 1723 generated in S1909 is recorded as the main image.

Note that the system control unit 50 associates the RAW file recorded in S1915, the JPEG file recorded in S1916, and the HEIF file recorded in S1917 with one another. As described above, in the RAW, JPEG, and HEIF files recorded in association with one another, the same image subjected to the same SDR development processing and encoded in the same encoding format (JPEG) is recorded as the image for display. This makes it possible to prevent management from becoming complicated and to maintain playback compatibility. In addition, in the RAW files recorded in S1911, S1913, and S1915, the JPEG files recorded in S1912, S1914, and S1916, and the HEIF file recorded in S1917, images for display subjected to the same SDR development processing and encoded in the same format (JPEG) are recorded. Therefore, even if the recording format changes, it is possible to prevent the inconvenience in which another playback device or the like cannot play back an image in a particular image file among image files captured with the same SDR shooting settings, and playback compatibility can thus be maintained.

Note that, in the above description, the SDR images are recorded in JPEG files in S1912 and S1914, but the SDR images may also be recorded as HEIF files in the same manner as in S1917.

Note also that, in the above description, the HDR THM image and HDR MPF image for display recorded in the RAW file in S1907 and those recorded in the HEIF file in S1908 are the same image data. However, they need not be exactly the same images as long as they are generated by encoding images subjected to the HDR development processing in the same encoding format. Likewise, the images for display recorded in the RAW file in S1913 and those recorded in the JPEG file in S1914 need not be exactly the same images as long as they are generated by encoding images subjected to the SDR development processing in the same encoding format. This also applies to S1915.

Further, in S1907, the HDR main image data recorded as the main image in the HEIF file in S1908 is recorded as an image for display of the RAW image file. However, different image data, for example image data having a different size, may be recorded instead of the same image data. Similarly, the SDR main image data recorded as the main image in the JPEG file in S1914 need not be recorded in S1913 as the image for display of the RAW image file, and different image data may be recorded instead. This also applies to S1915.

Further, when the HEIF file is recorded, a JPEG image may be recorded as the thumbnail image for display instead of an HEVC image, as shown in fig. 17D. When a JPEG image is recorded as the thumbnail for display, even a display device or a PC that does not support decoding of H.265, the HDR compression method used here, can display at least the thumbnail image.

(other embodiments)

The preferred embodiments of the present invention have been described above, but the present invention is not limited to these embodiments, and various modifications and changes can be made within the spirit and scope of the present invention. For example, in the above-described embodiments, image data whose color components exceed 8 bits is encoded by HEVC (High Efficiency Video Coding). However, the encoding method is not particularly limited as long as it can encode an image whose color components exceed 8 bits. Further, the above-described embodiments have been explained by assuming that the present invention is applied to a digital camera. However, the present invention is not limited to this, and may also be applied to a computer having an image capturing function (such as a smartphone or a laptop PC with a camera).

The present invention is not limited to the above-described embodiments, and various changes and modifications can be made within the spirit and scope of the present invention. Accordingly, the appended claims are made to disclose the scope of the invention.

The present application claims priority from Japanese Patent Application No. 2019-36396 filed on February 28, 2019, and Japanese Patent Application No. 2019-85969 filed on April 26, 2019, the entire contents of which are incorporated herein by reference.
