Electronic device and method for playing high dynamic range video

Document No.: 24550    Publication date: 2021-09-21

Reading note: This technology, "Electronic device and method for playing high dynamic range video" (播放高动态范围视频的电子装置及其方法), was created by 柳珍奉, 金民起, 金智民, 金炯奭, 李贤泽, and 廉东铉 on 2020-01-31. Its main content is as follows: An electronic device is provided. The electronic device includes: a communication circuit configured to communicate with an external electronic device; a processor operatively connected to the communication circuit; and a memory operatively connected to the processor, the memory storing instructions that, when executed, cause the processor to control the electronic device to: acquire environment information related to a video playing environment of an external electronic device; decode a High Dynamic Range (HDR) video; perform color conversion on the decoded HDR video based on the environment information; encode the color-converted video into a Standard Dynamic Range (SDR) format; and transmit the encoded video to the external electronic device via the communication circuit.

1. An electronic device, the electronic device comprising:

a communication circuit configured to communicate with an external electronic device;

a processor operatively connected with the communication circuit; and

a memory operatively connected with the processor,

wherein the memory is configured to store instructions that, when executed, cause the processor to control the electronic device to:

acquire environment information related to a video playing environment of the external electronic device;

decode a High Dynamic Range (HDR) video;

perform color conversion on the decoded HDR video based on the environment information;

encode the color-converted video into a Standard Dynamic Range (SDR) format; and

transmit the encoded video to the external electronic device via the communication circuit.

2. The electronic device according to claim 1, wherein the environment information includes color space information on the external electronic device and information on a maximum light emission luminance of the external electronic device.

3. The electronic device according to claim 2, wherein the color space information includes color gamut information about the external electronic device and gamma information about the external electronic device.

4. The electronic device of claim 1, wherein the instructions, when executed, cause the processor to control the electronic device to:

perform a capability negotiation process for establishing a connection with the external electronic device through the communication circuit, and

wherein the capability negotiation process includes a process of receiving the environment information from the external electronic device.

5. The electronic device of claim 1, wherein the instructions comprise instructions that, when executed, cause the processor to control the electronic device to: set a tone mapping coefficient based on the environment information; and perform the color conversion using the set tone mapping coefficient.

6. The electronic device of claim 5, wherein the memory stores a table that associates the environment information with the tone mapping coefficients, and

wherein the instructions include instructions that, when executed, cause the processor to control the electronic device to: search for the tone mapping coefficients in the table based on the environment information.

7. The electronic device of claim 1, wherein the instructions, when executed, cause the processor to control the electronic device to:

receive an input for setting a tone mapping coefficient while transmitting a video to the external electronic device; and

perform the color conversion using a tone mapping coefficient set based on the input.

8. The electronic device of claim 1, wherein the instructions, when executed, cause the processor to control the electronic device to:

generate at least one of Video Usability Information (VUI) and Supplemental Enhancement Information (SEI) based on the environment information; and

transmit a bitstream including the generated at least one of the VUI and the SEI to the external electronic device.

9. An electronic device, the electronic device comprising:

a communication circuit configured to communicate with an external electronic device;

a processor operatively connected with the communication circuit; and

a memory operatively connected with the processor,

wherein the memory stores instructions that, when executed, cause the processor to control the electronic device to:

acquire environment information related to a High Dynamic Range (HDR) video playback environment of the external electronic device;

request, via the communication circuit, an HDR video from an HDR video providing apparatus based on the environment information;

receive the HDR video from the HDR video providing apparatus;

repackage the received HDR video; and

send the repackaged HDR video to the external electronic device via the communication circuit.

10. The electronic device of claim 9, wherein the environment information comprises: information on HDR specifications supported by the external electronic device, color space information on the external electronic device, and information on a maximum light emission luminance of the external electronic device.

11. The electronic device of claim 9, wherein the instructions, when executed, cause the processor to control the electronic device to:

determine whether the external electronic device supports a video specification of the HDR video based on the environment information;

decode the HDR video based on a determination that the external electronic device does not support the video specification of the HDR video;

re-encode the decoded HDR video into a video specification supported by the external electronic device based on the environment information; and

transmit the re-encoded HDR video to the external electronic device.

12. The electronic device of claim 9, wherein the instructions, when executed, cause the processor to control the electronic device to:

perform a capability negotiation process for establishing a connection with the external electronic device through the communication circuit, and

wherein the capability negotiation process includes a process of receiving the environment information from the external electronic device.

13. The electronic device of claim 9, wherein the instructions comprise instructions that, when executed, cause the processor to control the electronic device to: set a tone mapping coefficient based on the environment information; and perform the color conversion using the set tone mapping coefficient.

14. The electronic device of claim 9, wherein the instructions, when executed, cause the processor to control the electronic device to:

search for the environment information using identification information identifying the external electronic device.

15. A method, performed at an electronic device, of playing a High Dynamic Range (HDR) video using an external electronic device, the method comprising:

acquiring environment information related to a video playing environment of the external electronic device;

decoding the HDR video;

performing color conversion on the decoded HDR video based on the environment information;

encoding the color-converted video into a Standard Dynamic Range (SDR) format; and

transmitting the encoded video to the external electronic device.

Technical Field

The present disclosure relates to techniques for an electronic device to play High Dynamic Range (HDR) video with an external electronic device.

Background

Various types of display devices have been developed and put on the market. In particular, techniques have been developed to achieve color representation coverage as high as what the user's eye perceives when viewing a scene.

As one such method, a display apparatus that generates and displays HDR video has been developed. HDR is a technology for displaying video in which gradations of light and shadow are classified in finer detail, similar to how the human eye recognizes an object. Techniques that do not use HDR may be referred to as Standard Dynamic Range (SDR).

Further, with the development of communication technology, a technology of outputting a video stored in an electronic device, or a video received from an external server or the like, on a large screen using an external electronic device has come into wide use. The UHD Alliance, an international consortium, defines the luminance standard for premium 4K HDR video as 1,000 nits or higher, whereas many displays used to output SDR video have a luminance range of only about 100 nits. Furthermore, HDR video comes in multiple formats, and electronic devices often support only some of them.

Disclosure of Invention

Technical problem

When HDR video is transmitted in the SDR format, an electronic device according to the related art may cause a sense of incongruity to a user viewing the screen.

Solution to the problem

Embodiments of the present disclosure address at least the above problems and/or disadvantages and provide at least the advantages described below. Accordingly, an example aspect of the present disclosure provides a method for providing an HDR video that minimizes and/or reduces the feeling of incongruity of a user viewing a screen even though the HDR video is transmitted in an SDR format, and an electronic device therefor.

Another example aspect of the present disclosure provides a method of providing HDR video using an external electronic device supporting different HDR formats, and an electronic device thereof.

According to one example aspect of the present disclosure, an electronic device is provided. The electronic device may include a communication circuit configured to communicate with an external electronic device, a processor operatively connected with the communication circuit, and a memory operatively connected with the processor. The memory may store instructions that, when executed, cause the processor to control the electronic device to: acquire environment information related to a video playing environment of an external electronic device; decode a High Dynamic Range (HDR) video; perform color conversion on the decoded HDR video based on the environment information; encode the color-converted video into a Standard Dynamic Range (SDR) format; and transmit the encoded video to the external electronic device via the communication circuit.

According to another example aspect of the present disclosure, an electronic device is provided. The electronic device may include a communication circuit configured to communicate with an external electronic device, a processor operatively connected with the communication circuit, and a memory operatively connected with the processor. The memory may store instructions that, when executed, cause the processor to control the electronic device to: acquire environment information related to an HDR video playing environment of an external electronic device; request, via the communication circuit, HDR video from an HDR video providing apparatus based on the environment information; receive the HDR video from the HDR video providing apparatus; repackage the received HDR video; and transmit the repackaged HDR video to the external electronic device via the communication circuit.

According to another example aspect of the present disclosure, a method is provided. The method can comprise the following steps: acquiring environment information related to a video playing environment of an external electronic device; decoding the HDR video; performing color conversion on the decoded HDR video based on the environment information; encoding the color-converted video into an SDR format; and transmitting the encoded video to an external electronic device.

According to another example aspect of the present disclosure, a method is provided. The method can comprise the following steps: acquiring environment information related to an HDR video playing environment of an external electronic device; receiving an HDR video from an HDR video providing apparatus based on the environment information; repackaging the received HDR video; and transmitting the repackaged HDR video to an external electronic device.

According to another example aspect of the present disclosure, a non-transitory computer-readable storage medium is provided. A non-transitory computer readable storage medium may store a computer program for causing an electronic device to perform the methods disclosed in the present disclosure.

Technical effects

According to embodiments disclosed in the present disclosure, a method and an electronic device for providing an HDR video may minimize and/or reduce the feeling of incongruity of a user viewing the screen when the HDR video is transmitted in an SDR format.

Further, according to an embodiment disclosed in the present disclosure, an electronic device may provide HDR video using an external electronic device that supports a different HDR format than the electronic device.

In addition, various effects directly or indirectly determined by the present disclosure may be provided.

Drawings

The above and other aspects, features and advantages of certain embodiments of the present disclosure will become more apparent from the following detailed description when taken in conjunction with the accompanying drawings in which:

FIG. 1 is a diagram illustrating an example system for playing videos, according to an embodiment;

fig. 2 is a block diagram showing an example structure of an electronic apparatus according to the embodiment;

fig. 3 is a flow diagram illustrating an example process of playing HDR video, according to an embodiment;

fig. 4 is a signal sequence diagram illustrating an example process of establishing a connection between a source apparatus and a sink apparatus according to an embodiment;

FIG. 5 is a diagram illustrating an example structure for generating and representing HDR video according to an embodiment;

FIG. 6 is a flow diagram illustrating an example process of performing color conversion according to an embodiment;

FIG. 7 is a flow diagram illustrating an example process of playing HDR video according to another embodiment;

FIG. 8 is a diagram illustrating an example method for determining a format of a provided HDR image, according to an embodiment;

fig. 9 is a flowchart illustrating an example process of transmitting a bitstream including Video Usability Information (VUI) and Supplemental Enhancement Information (SEI) to an external electronic device, according to an embodiment;

fig. 10 is a diagram illustrating an example of VUI and SEI included in a bitstream according to an embodiment;

fig. 11 is a signal sequence diagram illustrating an example process of performing tone mapping according to user input according to an embodiment;

FIG. 12 is a block diagram illustrating an example electronic device in a network environment, in accordance with various embodiments; and

fig. 13 is a block diagram illustrating an example program according to various embodiments.

Detailed Description

Hereinafter, various example embodiments of the present disclosure may be described with reference to the accompanying drawings. Accordingly, those of ordinary skill in the art will recognize that various modifications, equivalents, and/or substitutions can be made to the various example embodiments described herein without departing from the scope and spirit of the present disclosure.

Fig. 1 is a diagram illustrating an example system for playing a video, according to an embodiment.

The electronic device 10 according to the embodiment may play an HDR video (or HDR content) 11. Playing a video may include, for example, an operation of outputting successive images on a display device (e.g., a Liquid Crystal Display (LCD), a touch screen, an Organic Light Emitting Diode (OLED) panel, a Plasma Display Panel (PDP), etc.). Here, the type of the display device is not limited.

The electronic device 10 may be a device capable of communicating with another device in a wired and/or wireless manner. For example, but not limited to, the electronic device 10 may include a set-top box, a home automation control panel, a security control panel, a media box, a game console, an audio device, an e-book reader, a server, a workstation, a Portable Multimedia Player (PMP), an MPEG Audio Layer 3 (MP3) player, a wearable device, a smart phone, a Personal Digital Assistant (PDA) terminal, a computing device such as a laptop computer or a tablet Personal Computer (PC), and the like. Herein, the electronic device is not limited thereto.

The electronic apparatus 10 may play the HDR video 11 stored in a memory (e.g., the memory 130 of fig. 2) provided in the electronic apparatus 10, or may receive and play the HDR video 11 from the HDR video providing apparatus 20. In the present disclosure, HDR video may refer to, for example, video (e.g., still images or moving images) configured based on the HDR technique.

Here, the HDR technique may include, for example, a technique for video in which shadow gradations are classified in more detail, similar to how the human eye recognizes an object. For example, HDR can distinguish luminances up to 1000 nits to represent shading in detail, so that, compared with video implemented in the standard dynamic range, the gray scale in dark portions is not saturated as much and the gray scale in bright portions is not clipped as much. SDR may refer to, for example, a way of representing video within a standard dynamic range without applying HDR techniques. SDR video may refer to, for example, video generated based on SDR.

The electronic device 10 may play the video 12 corresponding to the HDR video 11 using the external electronic device 30. When streaming the HDR video 11, the electronic device 10 may transmit the SDR video or the HDR video (e.g., the video 12) converted based on the output setting value corresponding to the HDR video 11 to the external electronic device 30.

The video 12 output by the external electronic device 30 may have the same or similar content as the HDR video 11. The electronic device 10 may transmit data representing the HDR video 11 to the external electronic device 30 over a wired communication cable such as a High Definition Multimedia Interface (HDMI) cable. The electronic device 10 may transmit a bitstream representing the HDR video 11 to the external electronic device 30 through wireless communication. Wireless communication may refer to, for example, but not limited to, short-range wireless communication (e.g., a wireless fidelity (Wi-Fi) mode, a Zigbee mode, a Near Field Communication (NFC) mode, or a Bluetooth mode) or a mobile communication mode (e.g., third generation (3G), third generation partnership project (3GPP), Long Term Evolution (LTE), fifth generation (5G), etc.). Herein, the above-described wireless communication manners are merely exemplary, and the embodiment is not limited thereto.

The external electronic device 30 may include, for example, but not limited to, a television, a monitor, a tablet Personal Computer (PC), a smart phone, a laptop computer, a PC, a Portable Multimedia Player (PMP), a digital photo frame, a digital signage, a device with a display such as a kiosk or a navigation terminal, and the like. Herein, the external electronic device is not limited thereto.

Depending on the communication environment or the connection scheme, the connection between the electronic device 10 and the external electronic device 30 may not support transmission of video in the HDR format and may support only transmission of video in the SDR format. In this case, when transcoding the HDR video 11 to SDR video, the electronic device 10 may convert the HDR video 11 to the SDR format and may transmit the converted video to the external electronic device 30.

When converting the HDR video 11 into the SDR format, the electronic device 10 may perform a conversion process based on an output setting value corresponding to the SDR video. For example, when streaming HDR video, the electronic device 10 may generate video converted to SDR format based on, for example, but not limited to, at least one of a maximum light emission brightness (or light emission brightness range), a gamma curve, a color gamut, a color space, etc., of the external electronic device 30.

Various formats have been developed for HDR. For example, formats such as Dolby Vision, HDR10, HDR10+, and Hybrid Log-Gamma (HLG) exist. However, not all devices can support all of the various HDR formats. Depending on the device specification, there may be supported HDR formats and unsupported HDR formats. When the external electronic device 30 does not support the format of the HDR video 11 streamed to the electronic device 10 or the format of the HDR video 11 stored in the electronic device 10, the external electronic device 30 may not be able to normally output the video 12 even though the communication connection between the electronic device 10 and the external electronic device 30 supports transmission of video in the HDR format.

Accordingly, the electronic device 10 may request HDR video in a format supported by the external electronic device 30 from the HDR video providing device 20, or may convert the HDR video 11 into a format supported by the external electronic device 30. That is, the electronic device 10 may transmit the HDR video to the external electronic device 30 using a format supported by the external electronic device 30.

According to an embodiment, the external electronic device 30 may output the video 12 having the same or similar level of image quality as the HDR image 11 without additional tasks for the received data.

Fig. 2 is a block diagram showing an example structure of the electronic device 10 according to an embodiment. Referring to fig. 2, the electronic device 10 may include a processor (e.g., including processing circuitry) 110, a communication circuit 120, and a memory 130.

The memory 130 may store instructions that, when executed, cause the processor 110 to perform operations performed by the electronic device 10. The processor 110 may execute instructions to control components of the electronic device 10 or may process data to perform operations performed by the electronic device 10.

The communication circuit 120 may transmit and/or receive video under the control of the processor 110. For example, communication circuit 120 may receive HDR video or may transmit HDR video or SDR video.

Herein, the communication circuit 120 may receive environment information about the external electronic device 200 (e.g., the external electronic device 30 of fig. 1) from the external electronic device 200. The environment information may refer to, for example, information related to a video playback environment in which the external electronic device 200 plays back a video (e.g., the video 12 of fig. 1). For example, but not limited to, the environment information may include color space information of the external electronic device 200, information on the maximum light emission luminance of the external electronic device 200, and the like. The color space information may include color gamut information and gamma information. As another example, the environment information may include information on formats of HDR video that the external electronic device 200 can support.

According to an embodiment, when an HDR image is converted into an SDR image and the SDR image is transmitted, the external electronic device 200 may not properly represent pixels included in the HDR image because the HDR image can have high luminance values. For example, when the maximum screen luminance value included in the HDR image is 1000 nits and the maximum light emission luminance of the external electronic device 200 is 100 nits, a problem may occur in which the external electronic device 200 cannot express pixels brighter than 100 nits. Accordingly, the processor 110 may convert the HDR video such that the maximum light emission luminance of the images included in the HDR video becomes 100 nits or less.

However, when the light emission luminance of an image included in the HDR video is simply linearly mapped into the maximum light emission luminance range of the external electronic device 200, the HDR video may not be represented properly. For example, portions representing dark regions of the HDR video may be output in a state that is difficult for a user to distinguish and recognize. Accordingly, the communication circuit 120 may receive environment information including color space information of the external electronic device 200 from the external electronic device 200, and the processor 110 may perform color conversion on the HDR video based on the color space information included in the received environment information. For example, the color conversion may be referred to as color space conversion.
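
The following minimal sketch (Python, not code from the disclosure) illustrates the problem described above: naively rescaling the 1000-nit example into a 100-nit display range compresses shadow detail by the same factor as highlights, which is why an environment-aware color conversion is used instead.

```python
# Minimal illustration of naive linear luminance scaling; the nit values
# reused here come from the 1000-nit / 100-nit example in the text.

HDR_MAX_NITS = 1000.0   # assumed maximum luminance encoded in the HDR video
SDR_MAX_NITS = 100.0    # assumed maximum light emission luminance of the external device

def linear_map(luminance_nits: float) -> float:
    """Linearly rescale a source luminance into the external device's range."""
    return luminance_nits * (SDR_MAX_NITS / HDR_MAX_NITS)

# Two shadow tones that are distinguishable in the HDR source...
shadow_a, shadow_b = 2.0, 4.0                        # nits
print(linear_map(shadow_a), linear_map(shadow_b))    # 0.2 and 0.4 nits
# ...end up only 0.2 nits apart after scaling, which is likely too small a
# difference for a viewer to distinguish on the external device.
```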

Further, according to another embodiment, the communication circuit 120 may receive environment information that further includes information on formats of HDR video supportable by the external electronic device 200. The processor 110 may request, from an HDR video providing apparatus (e.g., the HDR video providing apparatus 20 of fig. 1) via the communication circuit 120, HDR video configured with a format supportable by the external electronic device 200, based on the received environment information.

The processor 110 may include various processing circuitry and may repackage the HDR video into a form in which the communication circuit 120 can transmit it to the external electronic device 200, and may transmit the repackaged HDR video to the external electronic device 200 via the communication circuit 120.

Fig. 3 is a flow diagram 300 illustrating an example process of playing HDR video, according to an embodiment.

The electronic device 10 of fig. 1 may perform operation 310 of acquiring environment information regarding the external electronic device 30 of fig. 1. According to an embodiment, the electronic device 10 may perform a capability negotiation process to establish a communication connection with the external electronic device 30, and may receive the environment information from the external electronic device 30 in the course of performing the capability negotiation process.

According to another embodiment, when the electronic device 10 wants to play HDR video on the external electronic device 30, it may send a request for environment information to the external electronic device 30. The electronic device 10 may then receive the environment information from the external electronic device 30 as a response to the request.

According to another embodiment, the electronic device 10 may recognize identification information capable of identifying the external electronic device 30, and may search for environment information about the external electronic device 30 using the identification information. Here, the identification information may refer to information that can identify the type of the external electronic device 30, for example, a classification according to the characteristics of the external electronic device 30. For example, the identification information may include the model name of the external electronic device 30. The electronic device 10 may send a query including at least a portion of the identification information to an external server to search for the environment information. The electronic device 10 may also use at least a portion of the identification information to search for the environment information in a table stored in a memory of the electronic device 10 (e.g., memory 130 of fig. 2), as in the sketch below.
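
A hedged sketch of that lookup follows; the table schema, field names, model name, and server fallback are illustrative assumptions rather than values from the disclosure.

```python
# Illustrative environment-information lookup keyed by model name.
from typing import Optional, TypedDict

class EnvironmentInfo(TypedDict):
    color_gamut: str          # e.g. "BT.709" or "DCI-P3"
    gamma: str                # e.g. "BT.1886"
    max_luminance_nits: int   # maximum light emission luminance
    hdr_formats: list[str]    # HDR formats the device can play

# Table assumed to be stored in the electronic device's memory (memory 130).
LOCAL_TABLE: dict[str, EnvironmentInfo] = {
    "EXAMPLE-TV-55X": {
        "color_gamut": "DCI-P3",
        "gamma": "BT.1886",
        "max_luminance_nits": 600,
        "hdr_formats": ["HDR10", "HLG"],
    },
}

def lookup_environment_info(model_name: str) -> Optional[EnvironmentInfo]:
    """Search the local table first; querying an external server with part of
    the identification information would be the fallback (omitted here)."""
    return LOCAL_TABLE.get(model_name)

print(lookup_environment_info("EXAMPLE-TV-55X"))
```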

Further, the electronic device 10 may perform operation 320 of decoding the HDR video received in an encoded state. Further, the electronic device 10 may perform transcoding to encode the decoded HDR video into a form that can be sent to the external electronic device 30. Here, if the decoded HDR video is encoded and transmitted without change while the playback environment of the external electronic device 30 is not suitable for playing the HDR video, a large deterioration in the image quality of the HDR video may occur. Accordingly, when transcoding the HDR video, the electronic device 10 may correct the HDR video based on the environment information about the external electronic device 30.

According to an embodiment, to correct the HDR video based on the environment information, the electronic device 10 may perform a color conversion operation 330 on the decoded HDR video. The color conversion operation 330 may include, for example, a process of performing Color Space Conversion (CSC). In the color conversion process according to an embodiment, the electronic device 10 may perform tone mapping of video included in the HDR video according to the environment information of the external electronic device 30. To perform the tone mapping, the electronic device 10 may determine tone mapping coefficients used in the tone mapping process based on the environment information.

The communication connection between the electronic device 10 and the external electronic device 30 may support transmission of SDR video while not supporting transmission of HDR video. Alternatively, due to the playback environment of the external electronic device 30, the HDR video may not be playable and only SDR video may be playable. To transmit video to the external electronic device 30 using the SDR format, the electronic device 10 may perform operation 340 of encoding the color-converted HDR video into the SDR format. In operation 350, the electronic device 10 may transmit a bitstream including the video encoded into the SDR format to the external electronic device 30.
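
Put together, operations 310 to 350 can be summarized in the following sketch; the decoder, encoder, and sink objects are stand-ins for whichever media framework the device actually uses, and only the order of operations comes from the flowchart.

```python
# Sketch of the transcoding pipeline of flowchart 300; every object here is
# an assumed placeholder, not an API from the disclosure.

def color_convert(frame, environment_info):
    """Placeholder for the environment-aware color conversion of operation 330."""
    return frame

def stream_hdr_as_sdr(hdr_bitstream, sink, decoder, encoder):
    environment_info = sink.get_environment_info()                     # operation 310
    frames = decoder.decode(hdr_bitstream)                             # operation 320
    converted = [color_convert(f, environment_info) for f in frames]   # operation 330
    sdr_bitstream = encoder.encode_sdr(converted)                      # operation 340
    sink.send(sdr_bitstream)                                           # operation 350
```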

Fig. 4 is a signal sequence diagram 400 illustrating an example process of establishing a connection between a source apparatus 401 and a sink apparatus 402, according to an embodiment.

According to an embodiment, in order to establish a communication connection between a source apparatus 401 (e.g., electronic apparatus 10 of fig. 1) and a sink apparatus 402 (e.g., external electronic apparatus 30 of fig. 1), the source apparatus 401 may perform a capability negotiation procedure with the sink apparatus 402.

The source apparatus 401 and the sink apparatus 402 may perform an apparatus discovery operation 410 for discovering apparatuses. When the sink apparatus 402 is discovered, the source apparatus 401 may perform operation 420 of establishing a connection between the source apparatus 401 and the sink apparatus 402. For example, but not limited to, source device 401 may establish a Wi-Fi Direct connection or an optional Tunneled Direct Link Setup (TDLS) connection with sink device 402.

When a connection is established between source apparatus 401 and sink apparatus 402, source apparatus 401 may perform a capability negotiation procedure (e.g., 430, 440, 450, 460, and 470 of fig. 4) with sink apparatus 402. According to an embodiment, source device 401 may perform operation 430 of exchanging protocol (e.g., Real Time Streaming Protocol (RTSP)) options with sink device 402. Source device 401 may send a GET_PARAMETER request 440 for the protocol to sink device 402. Further, source device 401 may receive a response 450 to the GET_PARAMETER request 440, and may send a SET_PARAMETER request 460 to sink device 402 to receive a response 470 to the SET_PARAMETER request 460.

The source device 401 may obtain environmental information related to playing video of the sink device 402 from the sink device 402 using a request and a response transmitted and received in the process of performing a capability negotiation process with the sink device 402.
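
As a hedged illustration of how those RTSP messages can carry the environment information, the sketch below builds a GET_PARAMETER request; the parameter names (sink_color_space, sink_max_luminance, sink_hdr_formats) and the URL are hypothetical and would depend on the protocol profile actually in use.

```python
# Assumed RTSP GET_PARAMETER request asking the sink to report its playback
# environment; the parameter names below are illustrative, not standard ones.

def build_get_parameter_request(cseq: int, presentation_url: str) -> str:
    wanted = ["sink_color_space", "sink_max_luminance", "sink_hdr_formats"]
    body = "\r\n".join(wanted) + "\r\n"
    return (
        f"GET_PARAMETER {presentation_url} RTSP/1.0\r\n"
        f"CSeq: {cseq}\r\n"
        "Content-Type: text/parameters\r\n"
        f"Content-Length: {len(body)}\r\n"
        "\r\n"
        f"{body}"
    )

print(build_get_parameter_request(2, "rtsp://192.168.0.10/wfd1.0"))
```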

According to an embodiment, after the capability negotiation process between source device 401 and sink device 402 is completed, source device 401 may stream video to sink device 402 in operation 490 when a stream setup request and play request 480 is received from sink device 402. Fig. 4 illustrates one embodiment, and the process by which source device 401 initiates streaming may be changed. For example, source device 401 may initiate streaming independently of a request from sink device 402.

Fig. 5 is a diagram illustrating an example structure for generating and representing HDR video, according to an embodiment.

An apparatus for generating HDR video may perform HDR control on video to generate the HDR video (reference numeral 510). According to the example shown in fig. 5, the HDR-controlled video may have a linear relationship 511 between code values and light emission luminance values. In some cases, the HDR control operation may be performed based on a non-linear relationship. The apparatus for generating the HDR video may be the HDR video providing apparatus 20 of fig. 1 or a separate apparatus. For example, 10-bit HDR video with a maximum light emission luminance level of 1000 nits may be generated.

HDR video represented in a linear fashion can be converted into non-linear HDR video by an opto-electronic transfer function (OETF) conversion 520 using the OETF 521. The non-linear HDR video may be encoded by encoding operation 530 based on a standard for streaming HDR video. For example, the HDR video may be represented using transfer (gamma) codes (e.g., the ST-2084 Perceptual Quantizer (PQ), BT-2100 HLG gamma, etc.) and may be encoded with a compression codec such as HEVC or VP9 for streaming to the electronic device 10 of fig. 1 (reference numeral 540).

The electronic device 10 may decode the streaming HDR video (reference numeral 550). Further, after decoding the HDR video, the electronic device 10 may perform an electro-optical transfer function (EOTF) conversion 560 with the EOTF 561 to linearize the nonlinear HDR video.
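
For reference, a minimal implementation of the ST-2084 (PQ) curve mentioned above is sketched below, using the usual normalization to a 10,000-nit peak; this is a textbook rendering of the standard transfer functions, not code from the disclosure or its pipeline.

```python
# ST-2084 (PQ) EOTF and its inverse, normalized to a 10,000-nit peak.

M1 = 2610 / 16384            # 0.1593017578125
M2 = 2523 / 4096 * 128       # 78.84375
C1 = 3424 / 4096             # 0.8359375
C2 = 2413 / 4096 * 32        # 18.8515625
C3 = 2392 / 4096 * 32        # 18.6875
PEAK_NITS = 10000.0

def pq_eotf(signal: float) -> float:
    """Non-linear PQ code value in [0, 1] -> linear luminance in nits."""
    p = signal ** (1.0 / M2)
    y = (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)
    return y * PEAK_NITS

def pq_inverse_eotf(nits: float) -> float:
    """Linear luminance in nits -> non-linear PQ code value in [0, 1]."""
    y = (nits / PEAK_NITS) ** M1
    return ((C1 + C2 * y) / (1.0 + C3 * y)) ** M2

print(round(pq_inverse_eotf(1000.0), 4))   # ~0.7518, the PQ code value for 1000 nits
print(round(pq_eotf(0.7518)))              # back to roughly 1000 nits
```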

When the electronic device 10 is to output the HDR video on an external electronic device 31 that supports an environment capable of playing the HDR video (e.g., an external electronic device having a display with a maximum light emission luminance of 1000 nits that can play 10-bit video, when playing an HDR video whose maximum light emission luminance level is 1000 nits), the HDR video may be played without performing video optimization work on it.

However, in some cases, when the HDR video is transmitted to the external electronic device 32 without being changed, the HDR video may not be played on the external electronic device 32. For example, when the HDR video has to be transmitted over a communication mode capable of transmitting only the SDR video format, the HDR video cannot be transmitted because the communication specification for transmitting HDR video is not supported. When the external electronic device 32 has a different bit-depth rendering capability than the electronic device 10, for example, when the external electronic device 32 can only play 8-bit SDR video, the external electronic device may not be able to normally play the received HDR video. When there is a difference between the maximum light emission luminance level of the external electronic device 32 and that of the electronic device 10 (for example, when the maximum light emission luminance level included in the HDR video received by the electronic device 10 is 1000 nits and the maximum light emission luminance of the display of the external electronic device 32 is 600 nits), light emission luminance values higher than the maximum light emission luminance of the external electronic device 32 cannot be expressed normally. Likewise, when the color representation capabilities of the electronic device 10 and the external electronic device 30 are different from each other (e.g., Rec.709 vs. DCI-P3), the HDR video cannot be played normally if the external electronic device 32 receives the HDR video of the electronic device 10 without change. The electronic device 10 may therefore perform an HDR video optimization operation 580 for optimizing the HDR video for the playback environment of the external electronic device 30 and displaying the video 590. In performing the video optimization operation 580 on the HDR video, the electronic device 10 may use the environment information received from the external electronic device 32. According to an embodiment, the HDR video optimization operation 580 may include a process of performing color conversion based on the environment information.

Fig. 6 is a flowchart 600 illustrating an example process of performing color conversion according to an embodiment.

According to an embodiment, the electronic device 10 of fig. 1 may perform operation 610 of receiving an HDR video comprising an HDR image from the HDR video providing device 20 of fig. 1. According to another embodiment, the electronic device 10 may select HDR video stored in the electronic device 10.

The electronic device 10 may perform operation 620 of determining tone mapping coefficients for performing tone mapping of the HDR image. The tone mapping coefficients may be determined based on the environment information.

According to an embodiment, the electronic device 10 may further determine a color conversion coefficient for performing color conversion of the video based on the environment information in operation 620. According to an embodiment, the electronic device 10 may determine the color conversion coefficient based on color information contained in the environment information and color information of the HDR image.

The electronic device 10 may perform operation 630 of calculating a light emission luminance ratio of the HDR image using the tone mapping coefficients. In operation 640, the electronic device 10 may apply the determined light emission luminance ratio value to the HDR image to obtain an HDR video including the tone-mapped image.
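
The sketch below illustrates operations 620 through 640 with a simple knee-based tone curve; the specific curve, the knee value, and the nit figures are assumptions for illustration, since the disclosure only states that a coefficient is derived from the environment information and a light emission luminance ratio is then applied to the image.

```python
# Illustrative tone mapping: identity below an assumed knee point, linear
# roll-off of the remaining source range into the remaining display headroom.

SOURCE_MAX_NITS = 1000.0   # assumed mastering peak of the HDR video
SINK_MAX_NITS = 100.0      # assumed maximum light emission luminance from the environment information
KNEE = 0.6                 # assumed tone mapping coefficient (fraction of the sink range kept linear)

def tone_mapped_nits(pixel_nits: float) -> float:
    knee_nits = KNEE * SINK_MAX_NITS
    if pixel_nits <= knee_nits:
        return pixel_nits
    excess = pixel_nits - knee_nits
    return knee_nits + (SINK_MAX_NITS - knee_nits) * excess / (SOURCE_MAX_NITS - knee_nits)

def luminance_ratio(pixel_nits: float) -> float:
    """Operation 630: the per-pixel ratio that operation 640 multiplies into the image."""
    return 1.0 if pixel_nits <= 0.0 else tone_mapped_nits(pixel_nits) / pixel_nits

print(luminance_ratio(40.0))                 # 1.0: shadows and midtones are left untouched
print(round(luminance_ratio(1000.0), 3))     # 0.1: the 1000-nit peak is brought down to 100 nits
```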

Fig. 7 is a flow diagram 700 illustrating an example process of playing HDR video, in accordance with another embodiment. For example, fig. 7 shows an embodiment in which the electronic device 10 is capable of transmitting HDR video to the external electronic device 30 using an HDR format. Fig. 8 is a diagram illustrating an example method for determining a format of a provided HDR image, according to an embodiment.

The electronic device 10 may perform operation 710 of obtaining the environment information from the external electronic device. According to an embodiment, the environment information may include information on HDR formats supported by the external electronic device. The electronic device 10 may perform operation 720 of requesting HDR video from the HDR video providing device 20 of fig. 1.

According to an embodiment, in operation 720, the electronic device 10 may request the HDR video from the HDR video providing device 20 based on the environment information. Referring to fig. 8, an embodiment is shown by way of example in which the HDR video formats 810 that may be provided by HDR video providing apparatus 20 are HDR10, HDR10+, and HLG. The HDR video providing apparatus 20 may provide HDR video to the electronic device 10 using a format that is included both in the available HDR video formats 810 and in the formats 820 supported by the electronic device. Referring to fig. 8, when the electronic device 10 provides a device description of the electronic device 10 to the HDR video providing device 20, the HDR video providing device 20 may select one of HDR10 and HDR10+, which are included both in the formats 820 supported by the electronic device and in the available HDR video formats 810. Further, the video provided by the HDR video providing apparatus 20 may be selected in consideration of compatibility between HDR formats. For example, a video in the HDR10+ format may be considered playable both by devices that support HDR10+ and by devices that support only HDR10. The HDR video providing apparatus 20 may provide HDR video configured with the selected format to the electronic apparatus 10. When a plurality of formats can be selected, the HDR video providing apparatus 20 may select one of them in order of priority. For example, the device description may also include information regarding the priority of the formats 820 supported by the electronic device.

Referring to fig. 8, when HDR10+ has a high priority, the HDR video providing apparatus 20 may provide HDR video configured based on the HDR10+ format to the electronic apparatus 10. However, as shown in fig. 8, when the formats 830 supported by the external electronic device include neither HDR10 nor HDR10+, the external electronic device 30 cannot represent the HDR video even though the electronic device 10 transmits the HDR video to the external electronic device 30. Accordingly, the electronic apparatus 10 may provide the HDR video providing apparatus 20 with the information about the formats 830 supported by the external electronic apparatus that is included in the environment information. For example, when the device description of the electronic device 10 is provided to the HDR video providing device 20, the electronic device 10 may change the information related to playing HDR video included in the device description to the corresponding information about the external electronic device 30, and may transmit the changed device description to the HDR video providing device 20. According to the example shown in fig. 8, the electronic apparatus 10 may request, from the HDR video providing apparatus 20, HDR video in the HLG format, which is not supported by the electronic apparatus 10 but is supported by the external electronic apparatus 30, and may receive HDR video in the HLG format from the HDR video providing apparatus 20.
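
The format selection described for fig. 8 can be sketched as follows; the helper name and the priority ordering are assumptions, and only the rule (pick a format that the provider offers and the playback device supports, by priority) comes from the text.

```python
# Illustrative format selection for the FIG. 8 example.
from typing import Optional

def select_hdr_format(available: list[str],
                      supported_by_sink: list[str],
                      priority: list[str]) -> Optional[str]:
    """Return the highest-priority format offered by the provider and supported by the sink."""
    candidates = set(available) & set(supported_by_sink)
    for fmt in priority:
        if fmt in candidates:
            return fmt
    return None

# Provider offers HDR10, HDR10+ and HLG (formats 810); the external
# electronic device supports only HLG (formats 830), so HLG is chosen.
print(select_hdr_format(["HDR10", "HDR10+", "HLG"], ["HLG"],
                        priority=["HDR10+", "HDR10", "HLG"]))   # -> HLG
```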

Referring again to fig. 7, the electronic device 10 may perform operation 730 of receiving the HDR video as a response to the request in operation 720. The electronic device 10 may perform operation 740 of repackaging the received HDR video. Operation 740 may include an operation of repackaging the data to transmit the HDR video according to the format of the received HDR video and the requirements of the communication connection protocol between the electronic device 10 and the external electronic device 30. The electronic device 10 may control its communication circuitry to perform operation 750 of sending the repackaged HDR video to the external electronic device 30.

Fig. 9 is a flowchart 900 illustrating an example process of transmitting a bitstream including VUI and SEI according to the external electronic device 30, according to an embodiment. Fig. 10 is a diagram illustrating an example of VUI and SEI included in a bitstream 1000 according to an embodiment.

According to an embodiment, the electronic device 10 of fig. 1 may perform operation 910 of decoding a received HDR video. The electronic device 10 may perform operation 920 of obtaining at least one of VUI and SEI for the HDR video from the decoded HDR video. By referring to at least one of the VUI and the SEI, the electronic device 10 may identify to which specification the HDR video conforms.

Referring to fig. 10, according to the HEVC specification for transmitting HDR video, a syntax indicating VUI 1010 and a syntax indicating SEI 1020 may be included in a bitstream 1000. According to the example shown in FIG. 10, the VUI 1010 may be represented by syntax elements such as colour_primaries, transfer_characteristics, and matrix_coeffs, and the syntax elements may have values such as 9, 16, and 9, respectively. Further, the SEI 1020 may indicate information such as the maximum light emission luminance level in max_display_mastering_luminance or the like, in conjunction with a message syntax element such as mastering_display_colour_volume.

For example, the transfer_characteristics element may have a value of 16 for the PQ format and a value of 18 for the HLG format.
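
As a hedged sketch of the VUI generation described in the next paragraph (operation 930), the following code chooses VUI values matching a target format; the mapping (9 for BT.2020 primaries and matrix, 16 for PQ, 18 for HLG, 1 for BT.709) follows the HEVC code points mentioned above, while the dataclass and the SDR fallback are illustrative choices.

```python
# Illustrative VUI color signalling chosen per target transfer function.
from dataclasses import dataclass

@dataclass
class VuiColorSignalling:
    colour_primaries: int
    transfer_characteristics: int
    matrix_coeffs: int

def vui_for_target(target: str) -> VuiColorSignalling:
    if target == "PQ":    # HDR10-style signalling: BT.2020 primaries, PQ transfer
        return VuiColorSignalling(9, 16, 9)
    if target == "HLG":   # HLG transfer with BT.2020 primaries
        return VuiColorSignalling(9, 18, 9)
    return VuiColorSignalling(1, 1, 1)   # fallback: BT.709 / SDR signalling

print(vui_for_target("HLG"))
```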

When the electronic device 10 transmits a bitstream including VUI and SEI corresponding to a value of an HDR specification not supported by the external electronic device 30 to the external electronic device 30, the external electronic device 30 may determine that the received bitstream is configured with a format not supported by the external electronic device 30 and may output an error message instead of playing HDR video. Accordingly, referring again to fig. 9, the electronic device 10 may perform operation 930 of generating at least one of VUI and SEI corresponding to video supportable by the external electronic device 30 based on the environment information received from the external electronic device 30. The electronic device 10 may control its communication circuit to perform operation 940 of transmitting the bitstream having the generated at least one of the VUI and the SEI to the external electronic device 30.

Fig. 11 is a signal sequence diagram illustrating an example process of performing tone mapping according to user input according to an embodiment.

According to an embodiment, source device 401 may perform operation 1110 of establishing a connection with sink device 402. Source device 401 may perform operation 1120 of obtaining video comprising a tone-mapped image to be streamed to sink device 402.

When the source device 401 streams the obtained video to the sink device 402 in operation 1130, the sink device 402 may play the streamed video in operation 1140.

According to an embodiment, in a state where the sink device 402 plays a video in operation 1140, the source device 401 may receive a user input for the played video in operation 1150. Source device 401 may display a user interface for receiving user input. For example, the source apparatus 401 may output a visual image object for adjusting or selecting a tone mapping coefficient on a display provided in the source apparatus 401 or a display provided in the sink apparatus 402. Source device 401 may receive user input through an input device (e.g., a touch screen display, physical buttons, etc.) provided in source device 401.

According to another embodiment, unlike operation 1150, sink device 402 may receive user input and may transmit the received user input to source device 401.

Source device 401 may perform operation 1160 of determining tone mapping coefficients based on user input. In operation 1170, the source device 401 may perform tone mapping based on the tone mapping coefficients determined in response to the user input. In operation 1180, the source device 401 may stream video including the tone-mapped image to the sink device 402 based on the changed tone mapping coefficient. The sink device 402 may perform an operation 1190 of playing the video including the tone-mapped image in response to the user input.
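
A minimal sketch of operations 1150 through 1180 follows; the slider range, the coefficient formula, and the source object's methods are assumptions used only to show how a user input can feed back into the tone mapping.

```python
# Illustrative mapping from a UI control to a tone mapping coefficient.

def coefficient_from_slider(slider_value: int, low: float = 0.3, high: float = 0.9) -> float:
    """Map a 0-100 slider position to a coefficient between low and high."""
    slider_value = max(0, min(100, slider_value))
    return low + (high - low) * slider_value / 100.0

def on_user_input(slider_value: int, source) -> None:
    coeff = coefficient_from_slider(slider_value)     # operation 1160: determine the coefficient
    frames = source.tone_map(coefficient=coeff)       # operation 1170: re-run tone mapping
    source.stream(frames)                             # operation 1180: stream the updated video

print(coefficient_from_slider(50))   # 0.6
```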

Fig. 12 is a block diagram illustrating an example electronic device 1201 (e.g., electronic device 10 of fig. 1) in a network environment 1200 in accordance with various embodiments. Referring to fig. 12, an electronic device 1201 (e.g., electronic device 10 of fig. 1) may communicate with an electronic device 1202 (e.g., external electronic device 30 of fig. 1) via a first network 1298 (e.g., a short-range wireless communication network) in a network environment 1200 or may communicate with an electronic device 1204 or a server 1208 (e.g., HDR video providing device 20 of fig. 1) via a second network 1299 (e.g., a long-range wireless communication network). According to an embodiment, electronic device 1201 may communicate with electronic device 1204 through server 1208. According to an embodiment, electronic device 1201 may include processor 1220, memory 1230, input device 1250, sound output device 1255, display device 1260, audio module 1270, sensor module 1276, interface 1277, haptic module 1279, camera module 1280, power management module 1288, battery 1289, communication module 1290, user identification module 1296, or antenna module 1297. According to some embodiments, at least one of the components of electronic device 1201 (e.g., display device 1260 or camera module 1280) may be omitted or one or more other components may be added to electronic device 1201. According to some embodiments, some of the above components may be implemented in one integrated circuit. For example, the sensor module 1276 (e.g., a fingerprint sensor, an iris sensor, or an illumination sensor) may be embedded in the display device 1260 (e.g., a display).

The processor 1220 may run, for example, software (e.g., the program 1240) to control at least one other component (e.g., a hardware component or a software component) of the electronic device 1201 that is connected to the processor 1220 and may perform various data processing or calculations. According to an embodiment, as part of the data processing or operation, processor 1220 may load commands or data received from another component (e.g., sensor module 1276 or communication module 1290) into volatile memory 1232, process the commands or data loaded into volatile memory 1232, and store the resulting data in non-volatile memory 1234. According to an embodiment, the processor 1220 may include a main processor 1221 (e.g., a central processing unit or an application processor) and an auxiliary processor 1223 (e.g., a graphics processing unit, an image signal processor, a sensor hub processor, or a communication processor) that is operatively independent of or combined with the main processor 1221. Additionally or alternatively, the auxiliary processor 1223 may consume less power than the main processor 1221, or be specific to a specified function. The auxiliary processor 1223 may be implemented separately from the main processor 1221 or as part of the main processor 1221.

Secondary processor 1223 may control at least some of the functions or states associated with at least one of the components of electronic device 1201, for example, display device 1260, sensor module 1276, or communication module 1290, in place of primary processor 1221 when primary processor 1221 is in an inactive (e.g., sleep) state, or secondary processor 1223 may control at least some of the functions or states associated with at least one of the components of electronic device 1201, for example, display device 1260, sensor module 1276, or communication module 1290, with primary processor 1221 when primary processor 1221 is in an active state (e.g., application running). According to an embodiment, the auxiliary processor 1223 (e.g., an image signal processor or a communication processor) may be implemented as part of another component (e.g., a camera module 1280 or a communication module 1290) that is functionally related to the auxiliary processor 1223.

Memory 1230 may store various data used by at least one component of electronic device 1201, such as processor 1220 or sensor module 1276. For example, the data may include software (e.g., program 1240) and input data or output data for commands related to the software. The memory 1230 can include volatile memory 1232 or nonvolatile memory 1234.

Program 1240 may be stored as software in memory 1230, and program 1240 may include, for example, an Operating System (OS) 1242, middleware 1244, or applications 1246.

Input device 1250 may receive commands or data from outside of electronic device 1201 (e.g., a user) that are to be used by components of electronic device 1201, such as processor 1220. Input device 1250 may include, for example, a microphone, a mouse, a keyboard, or a digital pen (e.g., a stylus).

The sound output device 1255 may output a sound signal to the outside of the electronic device 1201. The sound output device 1255 may include, for example, a speaker or a receiver. The speaker may be used for general purposes such as playing multimedia or playing a record and the receiver may be used for incoming calls. According to embodiments, the receiver and the speaker may be implemented integrally or separately.

Display device 1260 may visually provide information to an exterior (e.g., a user) of electronic device 1201. The display device 1260 may include, for example, a display, a holographic device, or a projector, and control circuitry for controlling the respective devices. According to embodiments, the display device 1260 may include touch circuitry configured to sense touch or sensor circuitry (e.g., pressure sensors) for measuring pressure intensity from the touch.

The audio module 1270 may convert sound and electrical signals bidirectionally. According to an embodiment, the audio module 1270 may obtain sound through the input device 1250, or may output sound through the sound output device 1255 or through an external electronic device (e.g., the electronic device 1202, such as a speaker or an earphone) directly or wirelessly connected to the electronic device 1201.

The sensor module 1276 may generate electrical signals or data values corresponding to operating conditions (e.g., power or temperature) internal to the electronic device 1201 or external environmental conditions (e.g., user conditions). According to an embodiment, the sensor module 1276 may include, for example, a gesture sensor, a gyroscope sensor, an atmospheric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.

Interface 1277 may support one or more particular protocols that will enable electronic device 1201 to connect directly or wirelessly with external electronic devices (e.g., electronic device 1202). According to an embodiment, interface 1277 may include, for example, a High Definition Multimedia Interface (HDMI), a Universal Serial Bus (USB) interface, a Secure Digital (SD) card interface, or an audio interface.

Connection end 1278 may include a connector that physically connects electronic device 1201 to an external electronic device (e.g., electronic device 1202). According to an embodiment, connection end 1278 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (e.g., a headphone connector).

The haptic module 1279 may convert the electrical signal into a mechanical stimulus (e.g., vibration or motion) or an electrical stimulus that may be recognized by the user via his sense of touch or movement. According to embodiments, the haptic module 1279 may include, for example, a motor, a piezoelectric element, or an electrical stimulator.

The camera module 1280 may capture still images or moving images. According to an embodiment, the camera module 1280 may include, for example, at least one or more lenses, an image sensor, an image signal processor, or a flash.

Power management module 1288 may manage power to electronic device 1201. According to an embodiment, the power management module 1288 may be implemented as at least part of a Power Management Integrated Circuit (PMIC), for example.

Battery 1289 may provide power to at least one component of electronic device 1201. According to an embodiment, the battery 1289 may include, for example, a non-rechargeable battery (primary battery), a rechargeable battery (secondary battery), or a fuel cell.

Communication module 1290 may support establishing a direct (e.g., wired) or wireless communication channel between electronic device 1201 and an external electronic device (e.g., electronic device 1202, electronic device 1204, or server 1208), and performing communication via the established communication channel. The communication module 1290 may include at least one communication processor operating independently of the processor 1220 (e.g., application processor) and supporting direct (e.g., wired) or wireless communication. According to an embodiment, communication module 1290 may include a wireless communication module 1292 (e.g., a cellular communication module, a short-range wireless communication module, or a Global Navigation Satellite System (GNSS) communication module) or a wired communication module 1294 (e.g., a Local Area Network (LAN) communication module or a power line communication module). Respective ones of the above-described communication modules may communicate with external electronic devices via a first network 1298 (e.g., a short-range communication network such as Bluetooth, Wi-Fi Direct, or Infrared Data Association (IrDA)) or a second network 1299 (e.g., a long-range communication network such as a cellular network, the internet, or a computer network (e.g., LAN or WAN)). The various communication modules described above may be implemented as one component (e.g., a single chip) or separately as separate components (e.g., chips). The wireless communication module 1292 may identify and authenticate the electronic device 1201 in a communication network, such as the first network 1298 or the second network 1299, using subscriber information, such as an International Mobile Subscriber Identity (IMSI), stored in the subscriber identity module 1296.

The antenna module 1297 may transmit or receive signals or power to or from the outside (e.g., an external electronic device). According to an embodiment, antenna module 1297 may include an antenna that includes a radiating element comprised of a conductive material or conductive pattern formed in or on a substrate (e.g., a PCB). According to an embodiment, antenna module 1297 may include multiple antennas. In this case, at least one antenna suitable for a communication scheme used in the communication network may be selected from the plurality of antennas by, for example, communication module 1290. Signals or power may then be transmitted or received between the communication module 1290 and the external electronic device via the selected at least one antenna. According to an embodiment, additional components other than the radiating element, such as a Radio Frequency Integrated Circuit (RFIC), may be additionally formed as part of the antenna module 1297.
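As a non-limiting illustration of the antenna selection described above, the following minimal Java sketch picks, from several antennas, one that supports the band used by the current communication scheme. The antenna names and bands are hypothetical and are not part of the disclosed embodiments.

```java
import java.util.List;
import java.util.Optional;

/** Illustrative sketch of selecting an antenna suited to the communication
 *  scheme in use; the antennas and bands below are hypothetical examples. */
public class AntennaSelectorSketch {

    record Antenna(String name, List<String> supportedBands) {}

    /** Returns the first antenna that supports the requested band, if any. */
    static Optional<Antenna> selectFor(String band, List<Antenna> antennas) {
        return antennas.stream()
                .filter(a -> a.supportedBands().contains(band))
                .findFirst();
    }

    public static void main(String[] args) {
        List<Antenna> antennas = List.of(
                new Antenna("ant0", List.of("2.4GHz", "5GHz")),     // short-range radio
                new Antenna("ant1", List.of("sub-6GHz", "mmWave")) // cellular radio
        );
        System.out.println(selectFor("5GHz", antennas)
                .map(Antenna::name)
                .orElse("no suitable antenna"));
    }
}
```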

At least some of the components may be interconnected and exchange signals (e.g., commands or data) with each other via an inter-peripheral communication scheme (e.g., a bus, General Purpose Input/Output (GPIO), Serial Peripheral Interface (SPI), or Mobile Industry Processor Interface (MIPI)).

According to an embodiment, commands or data may be transmitted or received between the electronic device 1201 and the external electronic device 1204 through the server 1208 connected to the second network 1299. Each of the electronic devices 1202 and 1204 may be a device of the same type as, or a different type from, the electronic device 1201. According to an embodiment, all or some of the operations performed by the electronic device 1201 may be performed by one or more of the external electronic device 1202, the external electronic device 1204, or the server 1208. For example, when the electronic device 1201 should perform a function or service automatically, or in response to a request from a user or another device, the electronic device 1201 may, instead of or in addition to executing the function or service itself, request one or more external electronic devices to perform at least part of the function or service. The one or more external electronic devices that receive the request may perform at least part of the requested function or service, or an additional function or service related to the request, and transfer a result of the performance to the electronic device 1201. The electronic device 1201 may provide the result, as it is or after additional processing, as at least part of a response to the request. To this end, for example, cloud computing, distributed computing, or client-server computing technology may be used.
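As a non-limiting illustration of the client-server offloading described above, the following minimal Java sketch delegates a piece of work to a remote endpoint and falls back to local execution when the request fails. The endpoint URL, payload, and local processing are hypothetical and do not represent the disclosed embodiments.

```java
import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

/** Sketch of delegating a function to an external server and falling back
 *  to on-device execution; endpoint and payload are hypothetical. */
public class OffloadSketch {

    private static final HttpClient CLIENT = HttpClient.newHttpClient();

    static String process(String input) {
        try {
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://example.com/process")) // hypothetical endpoint
                    .POST(HttpRequest.BodyPublishers.ofString(input))
                    .build();
            HttpResponse<String> response =
                    CLIENT.send(request, HttpResponse.BodyHandlers.ofString());
            if (response.statusCode() == 200) {
                return response.body();           // result produced by the external device
            }
        } catch (IOException | InterruptedException e) {
            // fall through and perform the function locally instead
        }
        return processLocally(input);
    }

    static String processLocally(String input) {
        return input.toUpperCase();               // stand-in for the on-device implementation
    }

    public static void main(String[] args) {
        System.out.println(process("hello"));
    }
}
```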

Fig. 13 is a block diagram 1300 illustrating an example program 1240 in accordance with various embodiments. According to an embodiment, the program 1240 may include an operating system (OS) 1242 for controlling one or more resources of the electronic device 1201 of FIG. 12, middleware 1244, and an application 1246 executable on the OS 1242. The OS 1242 may include, for example, Android™, iOS™, Windows™, Symbian™, Tizen™, or Bada™. For example, at least a portion of the program 1240 may be preloaded on the electronic device 1201 at the time of manufacture, or may be downloaded from or updated by an external electronic device (e.g., the electronic device 1202 or 1204 or the server 1208 of FIG. 12) while the electronic device 1201 is used by a user.

The OS 1242 may control management (e.g., allocation or retrieval) of one or more system resources (e.g., processes, memory, or power) of the electronic device 1201. Additionally or alternatively, the OS 1242 may include one or more drivers for driving other hardware devices of the electronic device 1201 (e.g., the input device 1250, the sound output device 1255, the display device 1260, the audio module 1270, the sensor module 1276, the interface 1277, the haptic module 1279, the camera module 1280, the power management module 1288, the battery 1289, the communication module 1290, the subscriber identity module 1296, or the antenna module 1297 of FIG. 12).

The middleware 1244 may provide various functions to the application 1246 so that functions or information provided from one or more resources of the electronic apparatus 1201 can be used by the application 1246. The middleware 1244 may include, for example, an application manager 1301, a window manager 1303, a multimedia manager 1305, a resource manager 1307, a power manager 1309, a database manager 1311, a package manager 1313, a connection manager 1315, a notification manager 1317, a location manager 1319, a graphic manager 1321, a security manager 1323, a phone manager 1325, or a voice recognition manager 1327.

The application manager 1301 may, for example, manage the life cycle of the application 1246. The window manager 1303 may, for example, manage one or more Graphical User Interface (GUI) resources used on a screen. The multimedia manager 1305 may, for example, identify one or more formats required to play media files, and may encode or decode a corresponding media file using a codec suitable for the identified format. The resource manager 1307 may, for example, manage the source code of the application 1246 or the space of the memory 1230 of FIG. 12. The power manager 1309 may, for example, manage the capacity, temperature, or power of the battery 1289, and may determine or provide information required for the operation of the electronic device 1201 using corresponding information among the capacity, temperature, or power of the battery 1289. According to an embodiment, the power manager 1309 may communicate with a basic input/output system (BIOS) (not shown) of the electronic device 1201.
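As a non-limiting illustration of the format-to-codec selection attributed to the multimedia manager 1305, the following Java sketch looks up a decoder registered for a media file's MIME type. The registry contents and decoder names are hypothetical and do not reflect any particular platform API.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Optional;

/** Sketch of format-based codec selection: identify a media format and
 *  return a codec registered for it. Entries below are illustrative only. */
public class CodecSelectorSketch {

    // Hypothetical mapping from MIME types to decoder names.
    private final Map<String, String> decodersByMime = new LinkedHashMap<>();

    public CodecSelectorSketch() {
        decodersByMime.put("video/hevc", "hevc.decoder");
        decodersByMime.put("video/avc", "avc.decoder");
        decodersByMime.put("audio/mp4a-latm", "aac.decoder");
    }

    /** Returns the decoder registered for the given MIME type, if any. */
    public Optional<String> findDecoder(String mimeType) {
        return Optional.ofNullable(decodersByMime.get(mimeType));
    }

    public static void main(String[] args) {
        CodecSelectorSketch selector = new CodecSelectorSketch();
        System.out.println(selector.findDecoder("video/hevc").orElse("no decoder found"));
    }
}
```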

The database manager 1311 may, for example, generate, search, or modify a database to be used by the application 1246. The package manager 1313 may, for example, manage installation or update of an application distributed in the form of a package file. The connection manager 1315 may, for example, manage a wireless connection or a direct connection between the electronic device 1201 and an external electronic device. The notification manager 1317 may, for example, provide a function of notifying a user of the occurrence of a specified event (e.g., an incoming call, a message, or an alert). The location manager 1319 may, for example, manage location information of the electronic device 1201. The graphic manager 1321 may, for example, manage one or more graphic effects to be provided to a user, or a user interface related to the graphic effects.
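As a non-limiting illustration of the generate/search/modify operations attributed to the database manager 1311, the following Java sketch uses plain JDBC against an in-memory SQLite database. It assumes a SQLite JDBC driver (e.g., sqlite-jdbc) is on the classpath; the table and data are illustrative only.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

/** Sketch of creating, populating, and querying a small database on behalf
 *  of an application; connection string and schema are illustrative. */
public class AppDatabaseSketch {
    public static void main(String[] args) throws SQLException {
        // ":memory:" keeps the example self-contained; a real manager would use a file path.
        try (Connection db = DriverManager.getConnection("jdbc:sqlite::memory:")) {
            try (PreparedStatement create = db.prepareStatement(
                    "CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, body TEXT)")) {
                create.execute();                               // generate
            }
            try (PreparedStatement insert = db.prepareStatement(
                    "INSERT INTO notes (body) VALUES (?)")) {
                insert.setString(1, "hello");
                insert.executeUpdate();                         // modify
            }
            try (PreparedStatement query = db.prepareStatement(
                    "SELECT id, body FROM notes WHERE body LIKE ?")) {
                query.setString(1, "%hello%");
                try (ResultSet rows = query.executeQuery()) {   // search
                    while (rows.next()) {
                        System.out.println(rows.getInt("id") + ": " + rows.getString("body"));
                    }
                }
            }
        }
    }
}
```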

The security manager 1323 may, for example, provide system security or user authentication. The telephony manager 1325 may, for example, manage a voice call or video call function provided by the electronic device 1201. The voice recognition manager 1327 may, for example, transmit voice data of the user to the server 1208 of FIG. 12, and may receive from the server 1208 a command corresponding to a function to be executed in the electronic device 1201 based at least in part on the voice data, or text data converted based at least in part on the voice data. According to an embodiment, the middleware 1244 may dynamically delete some existing components or add new components. According to an embodiment, at least a portion of the middleware 1244 may be included as part of the OS 1242 or may be implemented as software separate from the OS 1242.

The applications 1246 may include, for example, a home application 1351, a dialer application 1353, an SMS/MMS application 1355, an Instant Messaging (IM) application 1357, a browser application 1359, a camera application 1361, an alarm application 1363, a contacts application 1365, a voice recognition application 1367, an email application 1369, a calendar application 1371, a media player application 1373, an album application 1375, a watch application 1377, a health application 1379 (e.g., an application for measuring biometric information such as an amount of exercise or blood glucose), or an environmental information application 1381 (e.g., an application for measuring information about atmospheric pressure, humidity, or temperature). According to an embodiment, the applications 1246 may further include an information exchange application (not shown) capable of supporting information exchange between the electronic device 1201 and an external electronic device. The information exchange application may include, for example, a notification relay application configured to transmit specified information (e.g., a call, a message, or an alarm) to the external electronic device, or a device management application configured to manage the external electronic device. For example, the notification relay application may transmit, to the external electronic device, notification information corresponding to a specified event (e.g., mail reception) occurring in another application (e.g., the email application 1369) of the electronic device 1201. Additionally or alternatively, the notification relay application may receive notification information from the external electronic device and provide the received notification information to a user of the electronic device 1201.

The device management application may control, for example, power supply (e.g., on/off of power) of an external electronic device that communicates with the electronic device 1201 and power supply of each of some components of the electronic device 1201 (e.g., the display device 1260 or the camera module 1280), or may control functions (e.g., brightness, resolution, or focal length) of each of some components of the electronic device 1201 (e.g., the display device 1260 or the camera module 1280). Additionally or alternatively, the device management application may support installing, deleting, or updating applications running in the external electronic device.

Electronic devices according to various embodiments disclosed in the present disclosure may be various types of electronic devices. The electronic device may comprise, for example, a portable communication device (e.g., a smartphone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance. The electronic device according to the embodiment of the present disclosure is not limited to the above-described device.

It is to be understood that the various embodiments of the present disclosure and the terms used therein are not intended to limit the technical features disclosed herein to the specific embodiments; rather, the disclosure should be construed as covering various modifications, equivalents, or alternatives of the embodiments. With regard to the description of the figures, similar or related components may be assigned similar reference numerals. As used herein, a singular form of a noun corresponding to an item may include one or more of the items unless the context clearly indicates otherwise. In the present disclosure, each of the expressions "A or B", "at least one of A and B", "at least one of A or B", "A, B, or C", "one or more of A, B, and C", and "one or more of A, B, or C" may include any and all combinations of one or more of the associated listed items. Expressions such as "first", "second", "the first", and "the second" may be used merely to distinguish one component from another and do not limit the corresponding components in other aspects (e.g., importance or order). It will be understood that, if an element (e.g., a first element) is referred to, with or without the term "operatively" or "communicatively", as "coupled with", "coupled to", "connected with", or "connected to" another element (e.g., a second element), the element may be coupled with the other element directly (e.g., by wire), wirelessly, or via a third element.

The term "module" as used in this disclosure may include units implemented in hardware, software, or firmware, and may be used interchangeably with the terms "logic," logic block, "" component, "and" circuitry. A "module" may be the smallest unit of an integrated component or may be a part thereof. A "module" may be the smallest unit or part thereof for performing one or more functions. For example, according to an embodiment, a "module" may comprise an Application Specific Integrated Circuit (ASIC).

Various embodiments of the disclosure may be implemented as software (e.g., the program 1240) including instructions stored in a machine-readable storage medium (e.g., the internal memory 1236 or the external memory 1238) readable by a machine (e.g., the electronic device 1201). For example, a processor (e.g., the processor 1220) of the machine (e.g., the electronic device 1201) may call at least one of the stored instructions from the machine-readable storage medium and execute it, which allows the machine to perform at least one function based on the at least one called instruction. The instructions may include code generated by a compiler or code executable by an interpreter. The machine-readable storage medium may be provided in the form of a non-transitory storage medium. Here, a non-transitory storage medium is tangible and does not include a signal (e.g., an electromagnetic wave); the term "non-transitory" does not distinguish between a case where data is semi-permanently stored in the storage medium and a case where data is temporarily stored in the storage medium.

According to an embodiment, a method according to various embodiments disclosed in the present disclosure may be included and provided in a computer program product. The computer program product may be traded as a product between a seller and a buyer. The computer program product may be distributed in the form of a machine-readable storage medium (e.g., a compact disc read-only memory (CD-ROM)), or may be distributed online through an application store (e.g., Play Store™), or distributed (e.g., downloaded or uploaded) directly between two user devices (e.g., smartphones). If distributed online, at least part of the computer program product may be temporarily generated or at least temporarily stored in a machine-readable storage medium, such as a memory of a manufacturer's server, a server of the application store, or a relay server.

According to various embodiments, each component (e.g., module or program) of the above-described components may comprise one or more entities. According to various embodiments, at least one or more of the above components or operations may be omitted, or one or more components or operations may be added. Alternatively or additionally, some components (e.g., modules or programs) may be integrated in one component. In such a case, the integrated components may perform the same or similar functions performed by each corresponding component prior to integration. Operations performed by a module, program, or other component may be performed sequentially, in parallel, repeatedly, or heuristically, or at least some operations may be performed in a different order, omitted, or other operations may be added, according to various embodiments.

While the disclosure has been shown and described with reference to various exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the disclosure, including the appended claims and their equivalents.
