Display control device, imaging device, display control method, and display control program

Document No. 54711; published 2021-09-28.

Abstract: The invention provides a display control device, an imaging device, a display control method, and a display control program that can efficiently insert a specific image for reducing moving image blur, thereby improving display quality and reducing power consumption. During a period from display of a first frame of moving image data on a display unit (23) to display of a second frame of the moving image data after the first frame, a system control unit (11) performs insertion control for displaying a specific image for reducing moving image blur on the display unit (23), and determines whether or not to execute the insertion control based on the movement amount of a moving object included in the moving image data and the observation angle of one pixel of the moving image data displayed on the display unit (23). (Designed and created by 高桥宏辅 and 河口武弘 on 2020-02-13.)

1. A display control device that performs display control of moving image data, the display control device comprising:

a specific image insertion control unit that performs insertion control of displaying a specific image different from the moving image data on a display unit during a period from when a first frame of the moving image data is displayed on the display unit to when a second frame of the moving image data after the first frame is displayed; and

an insertion execution control unit that determines whether to execute the insertion control based on a movement amount of a moving object included in the moving image data and an observation angle of one pixel of the moving image data displayed on the display unit.

2. The display control apparatus according to claim 1,

the movement amount is a movement angle of a line of sight of an observer who observes the moving object,

the insertion execution control unit determines not to execute the insertion control when the movement angle is equal to or smaller than the observation angle, and determines to execute the insertion control when the movement angle exceeds the observation angle.

3. The display control apparatus according to claim 1 or 2,

when it has been determined that the insertion control is to be executed, the specific image insertion control unit controls the display time of the specific image within the period in accordance with the movement amount and the observation angle.

4. The display control apparatus according to claim 3,

the movement amount is a movement angle of a line of sight of an observer who observes the moving object,

the specific image insertion control unit determines the display time of the specific image in the period based on a ratio between the movement angle and the observation angle.

5. The display control apparatus according to claim 4,

when the movement angle is denoted by L and the observation angle is denoted by H, the specific image insertion control unit sets a time (1 - H/L) times the period as the display time.

6. The display control apparatus according to any one of claims 1 to 5,

the insertion execution control unit further determines not to execute the insertion control when an exposure time of one frame of the moving image data is equal to or longer than a frame time based on a frame rate of the moving image data.

7. The display control apparatus according to any one of claims 1 to 6,

the insertion execution control unit further determines not to execute the insertion control when the focus evaluation value of the moving image data is equal to or less than a predetermined threshold value.

8. The display control apparatus according to any one of claims 1 to 7,

the insertion execution control unit sets an actual movement amount of the moving object, calculated from the moving image data, as the movement amount of the moving object.

9. The display control apparatus according to any one of claims 1 to 7,

the insertion execution control unit sets a tracking visual limit movement amount of a person as the movement amount of the moving object.

10. The display control apparatus according to claim 9,

the display control device includes a measurement unit that measures a tracking visual limit movement amount of an observer of the display unit,

the insertion execution control unit sets any one of a plurality of tracking visual limit movement amounts as the movement amount of the moving object in accordance with the measurement result of the measurement unit.

11. The display control apparatus according to any one of claims 1 to 10,

the display control device includes a viewing angle calculation unit that calculates a viewing angle of an observer of the display unit with respect to the display unit,

the observation angle and the movement amount are calculated from the viewing angle.

12. The display control apparatus according to any one of claims 1 to 11,

the observation angle and the movement amount are calculated from a resolution of the moving image data, a resolution of the display unit, and a viewing angle of an observer of the display unit with respect to the display unit.

13. The display control apparatus according to any one of claims 1 to 12,

the specific image is a black image.

14. An imaging device includes:

the display control apparatus of any one of claims 1 to 13;

the display section; and

an image pickup element for picking up an image of a subject,

the moving image data is a live preview image of the subject captured by the image pickup element.

15. A display control method for controlling display of moving image data, the display control method comprising:

a specific image insertion control step of performing insertion control of displaying a specific image different from the moving image data on a display unit during a period from when a first frame of the moving image data is displayed on the display unit to when a second frame of the moving image data after the first frame is displayed; and

an insertion execution control step of determining whether to execute the insertion control based on a movement amount of a moving object included in the moving image data and an observation angle of one pixel of the moving image data displayed on the display unit.

16. The display control method according to claim 15,

the movement amount is a movement angle of a line of sight of an observer who observes the moving object,

in the insertion execution control step, it is determined that the insertion control is not executed when the movement angle is equal to or smaller than the observation angle, and it is determined that the insertion control is executed when the movement angle exceeds the observation angle.

17. The display control method according to claim 15 or 16,

in the specific image insertion control step, when it has been determined to perform the insertion control,

controlling a display time of the specific image in the period according to the movement amount and the observation angle.

18. The display control method according to claim 17,

the movement amount is a movement angle of a line of sight of an observer who observes the moving object,

in the specific image insertion control step, the display time of the specific image in the period is determined according to a ratio of the movement angle to the observation angle.

19. The display control method according to claim 18,

in the specific image insertion control step, when the movement angle is denoted by L and the observation angle is denoted by H, a time (1 - H/L) times the period is set as the display time.

20. The display control method according to any one of claims 15 to 19,

in the insertion execution control step, it is further determined that the insertion control is not to be executed when an exposure time of one frame of the moving image data is equal to or longer than one frame time based on the frame rate of the moving image data.

21. The display control method according to any one of claims 15 to 20,

in the insertion execution control step, it is further determined that the insertion control is not executed when the focus evaluation value of the moving image data is equal to or less than a predetermined threshold value.

22. The display control method according to any one of claims 15 to 21,

in the insertion execution control step, an actual movement amount of the moving object calculated from the moving image data is set as the movement amount of the moving object.

23. The display control method according to any one of claims 15 to 21,

in the insertion execution control step, a tracking visual limit movement amount of a person is set as the movement amount of the moving object.

24. The display control method according to claim 23,

the display control method includes a measurement step of measuring a tracking visual limit movement amount of an observer of the display unit,

in the insertion execution control step, any one of a plurality of tracking visual limit movement amounts is set as the movement amount of the moving object in accordance with a measurement result of the measurement step.

25. The display control method according to any one of claims 15 to 24,

the display control method includes a viewing angle calculation step of calculating a viewing angle of an observer of the display unit with respect to the display unit,

the observation angle and the movement amount are calculated from the viewing angle.

26. The display control method according to any one of claims 15 to 25,

the observation angle and the movement amount are calculated from a resolution of the moving image data, a resolution of the display unit, and a viewing angle of an observer of the display unit with respect to the display unit.

27. The display control method according to any one of claims 15 to 26,

the specific image is a black image.

28. A display control program for causing a computer to execute a process comprising:

determining, based on a movement amount of a moving object included in moving image data and an observation angle of one pixel of the moving image data displayed on a display unit, whether or not to execute insertion control of displaying a specific image different from the moving image data on the display unit during a period from when a first frame of the moving image data is displayed on the display unit to when a second frame of the moving image data after the first frame is displayed.

Technical Field

The present invention relates to a display control device, an imaging device, a display control method, and a display control program.

Background

As a method of reducing the moving image blur that occurs when a moving object included in a display image is tracked and observed on a display device, there is black insertion processing, in which a black image is displayed between frames of a moving image. For example, there is a method of displaying a black image between frames of a moving image by lighting a backlight intermittently instead of keeping it always lit. By performing the black insertion processing, the display characteristics of a hold-type display such as a liquid crystal display device can be brought close to those of an impulse-type display, and moving image blur can be reduced. Patent documents 1, 2, and 3 disclose black insertion techniques.

Prior art documents

Patent document

Patent document 1: Japanese Patent Laid-Open No. 2014-035525

Patent document 2: Japanese Patent Laid-Open No. 2014-032412

Patent document 3: WO2008/102828

Disclosure of Invention

Technical problem to be solved by the invention

In the black insertion processing, the higher the insertion frequency of the black image, the greater the effect of reducing moving image blur. However, when the movement amount of a moving object between frames is equal to or less than the display resolution of the display device, the blur reduction effect cannot be obtained, and disadvantages such as increased power consumption and reduced display luminance remain. Patent documents 1, 2, and 3 describe performing black insertion processing in consideration of the movement of a moving object, but none of them considers the display resolution.

The present invention has been made in view of the above circumstances, and an object thereof is to provide a display control device, an imaging device, a display control method, and a display control program that can efficiently insert a specific image for reducing motion blur to improve display quality and reduce power consumption.

Means for solving the technical problem

A display control device according to the present invention is a display control device that performs display control of moving image data, and includes: a specific image insertion control unit that performs insertion control of displaying a specific image different from the moving image data on a display unit during a period from when a first frame of the moving image data is displayed on the display unit to when a second frame of the moving image data is displayed after the first frame; and an insertion execution control unit that determines whether to execute the insertion control based on a movement amount of a moving object included in the moving image data and an observation angle of one pixel of the moving image data displayed on the display unit.

The imaging device of the present invention includes the above display control device, the display unit, and an image pickup element, and the moving image data is a live preview image of a subject captured by the image pickup element.

A display control method according to the present invention is a display control method for moving image data, including: a specific image insertion control step of performing insertion control of displaying a specific image different from the moving image data on a display unit during a period from display of a first frame of the moving image data on the display unit to display of a second frame of the moving image data after the first frame; and an insertion execution control step of determining whether or not to execute the insertion control based on a movement amount of a moving object included in the moving image data and an observation angle of one pixel of the moving image data displayed on the display portion.

A display control program of the present invention is a program for causing a computer to execute processing including: determining, based on a movement amount of a moving object included in moving image data and an observation angle of one pixel of the moving image data displayed on a display unit, whether or not to execute insertion control of displaying a specific image different from the moving image data on the display unit during a period from when a first frame of the moving image data is displayed on the display unit to when a second frame of the moving image data after the first frame is displayed.

Effects of the invention

According to the present invention, it is possible to provide a display control device, an imaging device, a display control method, and a display control program that can efficiently perform insertion of a specific image for reducing motion blur and achieve improvement in display image quality and reduction in power consumption.

Drawings

Fig. 1 is a diagram showing a schematic configuration of a digital camera 100 as an embodiment of an imaging apparatus according to the present invention.

Fig. 2 is a functional block diagram of the system control unit 11 of the digital camera 100 shown in fig. 1.

Fig. 3 is a schematic diagram for explaining the observation angle.

Fig. 4 is a schematic diagram for explaining the amount of movement of a moving object contained in moving image data.

Fig. 5 is a flowchart for explaining an operation performed by the system control unit 11 during the live preview display control or the recorded moving image playback control.

Fig. 6 is a flowchart for explaining a first modification of the operation of the system control unit 11 shown in fig. 1.

Fig. 7 is a flowchart for explaining a second modification of the operation of the system control unit 11 shown in fig. 1.

Fig. 8 is a diagram showing a modification of the functional blocks of the system control unit 11 shown in fig. 2.

Fig. 9 is a diagram showing an external appearance of a smartphone 200 as an embodiment of the imaging device of the present invention.

Fig. 10 is a block diagram showing the structure of the smartphone 200 shown in fig. 9.

Detailed Description

Hereinafter, embodiments of the present invention will be described with reference to the drawings.

Fig. 1 is a diagram showing a schematic configuration of a digital camera 100 as an embodiment of an imaging apparatus according to the present invention.

The digital camera 100 shown in fig. 1 includes a lens device 40, and the lens device 40 includes an imaging lens 1, a diaphragm 2, a lens control unit 4, a lens driving unit 8, and a diaphragm driving unit 9.

The lens device 40 may be detachable from the main body of the digital camera 100, or may be integrated with the main body of the digital camera 100.

The image pickup lens 1 and the diaphragm 2 constitute an image pickup optical system, and the image pickup lens 1 includes a focus lens, a zoom lens, and the like that are movable in the optical axis direction.

The focus lens is a lens for adjusting the focus of the image pickup optical system and is composed of a single lens or a plurality of lenses. When the focus lens is moved in the optical axis direction, the position of the principal point of the focus lens is changed in the optical axis direction, and the focal position on the object side is changed. Further, as the focus lens, a liquid lens whose focus can be adjusted by changing the position of the principal point in the optical axis direction by electric control may be used.

The lens control unit 4 of the lens device 40 is configured to be able to communicate with the system control unit 11 of the digital camera 100 by wire or wirelessly.

The lens control unit 4 controls the focus lens included in the imaging lens 1 by the lens driving unit 8 in accordance with an instruction from the system control unit 11, and changes the position of the principal point of the focus lens (changes the focal length), or controls the aperture amount of the diaphragm 2 by the diaphragm driving unit 9.

The digital camera 100 further includes an imaging element 5, such as a CCD (Charge-Coupled Device) image sensor or a CMOS (Complementary Metal-Oxide-Semiconductor) image sensor, which images a subject through the imaging optical system.

The image pickup device 5 has an image pickup surface on which a plurality of pixels are two-dimensionally arranged, and converts an object image formed on the image pickup surface by an image pickup optical system into pixel signals by the plurality of pixels and outputs the pixel signals. Hereinafter, a set of pixel signals output from each pixel of the image pickup device 5 is also referred to as an image pickup image signal.

The system control unit 11, which centrally controls the entire electronic control system of the digital camera 100, drives the image pickup device 5 by the image pickup device driving unit 10, and outputs an object image picked up by the image pickup optical system of the lens device 40 as a picked-up image signal.

When the digital camera 100 is set to the shooting mode, the system control unit 11 performs live preview display control for starting continuous shooting of a subject by the imaging device 5 and displaying a live preview image based on moving image data composed of a plurality of captured image signals output from the imaging device 5 by the continuous shooting on the display unit 23. The system control unit 11 performs recorded moving image playback control for reading moving image data stored in the storage medium 21 and displaying a moving image based on the moving image data on the display unit 23.

An instruction signal from the user is input to the system control unit 11 through the operation unit 14. The system control unit 11 centrally controls the entire digital camera 100, and has a hardware configuration including various processors that execute programs to perform processing.

The various processors include a CPU (Central Processing Unit), which is a general-purpose processor that executes a program to perform various processes; a programmable logic device (PLD), such as an FPGA (Field Programmable Gate Array), whose circuit configuration can be changed after manufacture; and a dedicated electric circuit, such as an ASIC (Application Specific Integrated Circuit), which has a circuit configuration designed exclusively for executing a specific process. More specifically, each of these various processors has a circuit structure in which circuit elements such as semiconductor elements are combined. The system control unit 11 may be constituted by one of these various processors, or by a combination of two or more processors of the same type or different types (for example, a combination of a plurality of FPGAs or a combination of a CPU and an FPGA).

The electronic control system of the digital camera 100 includes: a main memory 16 composed of a RAM (Random Access Memory); a memory control unit 15 that controls data storage in and data reading from the main memory 16; a digital signal processing unit 17 that performs digital signal processing on the captured image signal output from the imaging element 5 to generate captured image data in various formats such as the JPEG (Joint Photographic Experts Group) format; an external memory control unit 20 that controls data storage in and data reading from the storage medium 21; and the display unit 23.

The display unit 23 is configured by, for example, a liquid crystal display device, an organic EL (electroluminescence) display device, or the like. The display unit 23 includes one or both of a first display device, which is an external display device provided on the back surface of the main body of the digital camera 100 opposite to the lens device 40 side, and a second display device, which is an internal display device built into an eyepiece finder (not shown).

The storage medium 21 is a semiconductor memory such as a flash memory built in the digital camera 100, a portable semiconductor memory or the like which is detachable from the digital camera 100.

The memory control unit 15, the digital signal processing unit 17, the external memory control unit 20, and the display unit 23 are connected to each other via a control bus 24 and a data bus 25, and are controlled in accordance with instructions from the system control unit 11.

Fig. 2 is a functional block diagram of the system control unit 11 of the digital camera 100 shown in fig. 1.

The system control unit 11 functions as a specific image insertion control unit 11A, an insertion execution control unit 11B, and a viewing angle information acquisition unit 11C by executing a program including a display control program. The system control unit 11 in this specification constitutes a display control device.

In the above-described live preview display control or recorded moving image playback control, the specific image insertion control unit 11A performs insertion control of displaying, on the display unit 23, a specific image for reducing moving image blur that is different from the moving image data to be displayed, during the period from when a first frame of the moving image data to be displayed is displayed on the display unit 23 to when the second frame following the first frame is displayed (hereinafter, the length of this period is referred to as one frame time).

The specific image for reducing moving image blur is an image for reducing the moving image blur that occurs when a person tracks and observes a moving object, and is specifically a black image. The specific image may be any image other than the first frame to be displayed whose luminance is at a level that leaves no afterimage of the first frame; for example, a white image, a gray image, a random noise image, or the like may be used instead of the black image. The black image is displayed by, for example, turning off the backlight of the display unit 23.

The insertion execution control unit 11B determines whether to execute the above-described insertion control based on the movement amount L of the moving object included in the moving image data to be displayed and the observation angle H of one pixel of that moving image data. The movement amount L of the moving object includes a movement amount L1, which is the horizontal component, and a movement amount L2, which is the vertical component.

Fig. 3 is a schematic diagram for explaining the observation angle. Fig. 4 is a schematic diagram for explaining the amount of movement of a moving object contained in moving image data.

Fig. 3 shows one frame 23a of the moving image data displayed on the display unit 23. Assuming that the resolution of the frame 23a coincides with the maximum display resolution of the display unit 23, the observation angle H is defined as the angle formed by two straight lines connecting the observer's eyes to both ends, in the horizontal or vertical direction, of one display pixel of the display unit 23 (that is, of one pixel 23p constituting the frame 23a). For example, the observation angle H is obtained by dividing the observer's horizontal viewing angle with respect to the display unit 23 (defined as the angle formed by lines connecting the observer's eyes and both horizontal ends of the display surface of the display unit 23) by the number of horizontal display pixels of the display unit 23.
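As an illustration of the relationship just described (a sketch with hypothetical function names and example values, not part of the patent), the observation angle H of one pixel is simply the observer's viewing angle divided by the pixel count along the same direction:

```python
def observation_angle(viewing_angle_deg: float, num_display_pixels: int) -> float:
    """Angle (in degrees) subtended by one display pixel: the observer's
    viewing angle of the display surface divided by the number of display
    pixels along the same direction."""
    return viewing_angle_deg / num_display_pixels

# e.g. a 30-degree horizontal viewing angle on a 1920-pixel-wide display
H = observation_angle(30.0, 1920)  # 0.015625 degrees per pixel
```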

The observer's viewing angle with respect to the display unit 23 may vary depending on whether the display unit 23 being used is the first display device or the second display device. To cope with this change in usage, the main memory 16 stores viewing angle information for the observer when the first display device is used and viewing angle information for the observer when the second display device is used.

Fig. 4 shows one frame 23a and the next frame 23b of the moving image data displayed on the display unit 23. A moving object M is included in the frames 23a and 23b. The horizontal movement amount L1 of the moving object M is defined as the horizontal movement angle of the line of sight, during one frame time, of an observer who follows the moving object M. In the example of fig. 4, the moving object M moves by two pixels in the horizontal direction between the two frames 23a and 23b. Therefore, in this example, the movement amount L1 is calculated as twice the observation angle H.

Similarly, the vertical movement amount L2 of the moving object M is defined as the vertical movement angle of the line of sight, during one frame time, of an observer who follows the moving object M. In the example of fig. 4, the moving object M moves by one pixel in the vertical direction between the two frames 23a and 23b. Therefore, in this example, the movement amount L2 is calculated as equal to the observation angle H.
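Continuing the sketch (hypothetical names, not from the patent), the movement amounts L1 and L2 follow directly from the per-axis pixel displacement of the moving object multiplied by the observation angle H of one pixel:

```python
def movement_angles(dx_pixels: int, dy_pixels: int, H: float) -> tuple:
    """Horizontal and vertical movement angles (L1, L2) of the line of
    sight over one frame time: the pixel displacement of the moving
    object times the observation angle H of one pixel."""
    return abs(dx_pixels) * H, abs(dy_pixels) * H

# The fig. 4 example: two pixels horizontally, one pixel vertically
L1, L2 = movement_angles(2, 1, 0.015625)  # L1 = 2*H, L2 = 1*H
```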

By the above definitions, the upper limit of the movement amount L1 or L2 is the observation angle H multiplied by the number of display pixels of the display unit 23 in the horizontal or vertical direction, respectively. However, when the movement amount of a moving object within one frame time exceeds the limit up to which a person can visually track it, that is, the tracking visual limit movement amount, the person cannot perceive the movement of the object beyond that limit. Therefore, the upper limits of the settable movement amounts L1 and L2 are preferably set to the tracking visual limit movement amount.

The viewing angle information acquiring unit 11C acquires the information of the observer's viewing angle with respect to the display unit 23 from the main memory 16. The insertion execution control unit 11B sets the observation angle H based on this information, and determines whether or not to execute the insertion control by the specific image insertion control unit 11A based on the set observation angle H and the separately set movement amounts L1 and L2.

Fig. 5 is a flowchart for explaining the operation performed by the system control unit 11 during the live preview display control or the recorded moving image playback control. When one frame (hereinafter referred to as the current frame) of the moving image data to be displayed is generated, the viewing angle information acquiring unit 11C first determines which of the first display device and the second display device is being used, and acquires from the main memory 16 the information of the observer's viewing angle corresponding to the display device in use (step S1).

For example, the viewing angle information acquisition unit 11C determines that the second display device is being used in a situation where the eyes of the observer are in contact with an eyepiece window, not shown, and determines that the first display device is being used in a situation where the eyes of the observer are not in contact with the eyepiece window.

Next, the insertion execution control unit 11B sets the observation angle H based on the viewing angle information acquired in step S1 (step S2). For example, the insertion execution control unit 11B calculates and sets the observation angle H from the viewing angle information acquired in step S1 and the information on the maximum display resolution of the display unit 23. Alternatively, the viewing angle information and the observation angle H may be stored in the main memory 16 in advance in association with each other, and the insertion execution control unit 11B may read from the main memory 16 and set the observation angle H corresponding to the viewing angle information acquired in step S1.

Next, the insertion execution control unit 11B acquires and sets the movement amount of the moving object included in the current frame (step S3). For example, the insertion execution control unit 11B performs moving object detection processing on the moving image data to be displayed, calculates the horizontal movement amount L1 of the detected moving object by multiplying its number of moved pixels in the horizontal direction by the observation angle H, calculates the vertical movement amount L2 by multiplying its number of moved pixels in the vertical direction by the observation angle H, and sets these values.

In addition, at the initial point in time when only one or a few frames of the moving image data have been acquired, the moving object detection processing cannot be performed. Therefore, at this initial point, the insertion execution control unit 11B may acquire the tracking visual limit movement amount from the main memory 16 and set it as the movement amounts L1 and L2.

Next, the insertion execution control unit 11B acquires information on the exposure time of the current frame and determines whether or not the exposure time is equal to or longer than one frame time (the value obtained by dividing one second by the frame rate of the moving image data to be displayed) (step S4).

When the exposure time is one frame time or more, the motion blur contained in the moving image data itself is assumed to be large, and the blur reduction effect of performing the insertion control is therefore likely to be small. Accordingly, when the exposure time is one frame time or more (step S4: YES), the insertion execution control unit 11B determines not to execute the insertion control of the specific image (step S5). As a result, the specific image insertion control unit 11A does not perform the insertion control, and the current frame is displayed for the entire one-frame time.

On the other hand, when the exposure time is less than one frame time, a blur reduction effect from the insertion control can be expected depending on the amount of movement of the moving object. Therefore, when the exposure time is less than one frame time (step S4: NO), the insertion execution control unit 11B determines whether or not each of the movement amounts L1 and L2 is equal to or smaller than the observation angle H (step S6).

In a state where the movement amounts L1 and L2 are each equal to or smaller than the observation angle H, the moving image blur reduction effect of performing the insertion control cannot be obtained in principle. Therefore, when the movement amounts L1 and L2 are each equal to or smaller than the observation angle H (step S6: YES), the insertion execution control unit 11B shifts the process to step S5.

On the other hand, in a state where either one of the movement amounts L1 and L2 exceeds the observation angle H, the moving image blur can be reduced by performing the insertion control. Therefore, when either one of the movement amounts L1 and L2 exceeds the observation angle H (step S6: NO), the insertion execution control unit 11B determines to execute the insertion control (step S7).
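The decision of steps S4 to S7 can be summarized as a small predicate. The following is an illustrative sketch with assumed names and units, not the device's actual implementation:

```python
def should_execute_insertion(exposure_time: float, frame_time: float,
                             l1: float, l2: float, h: float) -> bool:
    """Steps S4-S7: skip the insertion control when the exposure time is
    one frame time or more, or when neither movement amount exceeds H."""
    if exposure_time >= frame_time:   # step S4: YES -> step S5
        return False
    return l1 > h or l2 > h           # step S6: NO -> step S7, YES -> step S5
```

For example, a 1/120 s exposure at 60 fps with L1 = 0.05 degrees against H = 0.01 degrees leads to executing the insertion control, while a 1/30 s exposure at the same frame rate does not.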

When it is determined in step S7 that the insertion control is to be executed, the specific image insertion control unit 11A selects the larger one of the movement amounts L1 and L2 set in step S3 (either one when they are the same value). Then, when the movement amount L1 is selected, the specific image insertion control unit 11A determines the display time BT of the specific image within the display time FT of the current frame by the calculation of the following expression (A) based on the observation angle H set in step S2 and the movement amount L1, and when the movement amount L2 is selected, determines the display time BT of the specific image within the display time FT of the current frame by the calculation of the following expression (B) based on the observation angle H set in step S2 and the movement amount L2 (step S8).

BT=FT×{1-(H/L1)} (A)

BT=FT×{1-(H/L2)} (B)

After determining the display time BT in step S8, the specific image insertion control unit 11A performs control to display the specific image on the display unit 23 for the time BT in place of the current frame after the time (FT - BT) has elapsed from the start of the display of the current frame (step S9). Thus, within the frame period in which the current frame is to be displayed, the specific image is displayed for the display time BT and the current frame is displayed for the remaining time. After step S5 or step S9, when the frame following the current frame is generated, the process returns to step S1.
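A minimal sketch of the scheduling in steps S8 and S9, applying expressions (A) and (B) with L taken as the larger of L1 and L2 (the function name and return convention are assumptions for illustration):

```python
def frame_schedule(ft: float, l1: float, l2: float, h: float):
    """Step S8/S9: BT = FT * (1 - H/L) with L = max(L1, L2)
    (expressions (A) and (B)); the current frame is shown for FT - BT,
    then the specific image for the remaining BT."""
    l = max(l1, l2)
    bt = ft * (1.0 - h / l)
    return ft - bt, bt  # (current-frame time, specific-image time)
```

For instance, when the larger movement amount is twice the observation angle, the specific image occupies half of the frame period.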

As described above, according to the digital camera 100, the insertion control of the specific image is executed only when either one of the movement amounts L1 and L2 exceeds the observation angle H. In other words, in a situation where no moving image blur reduction effect can be expected, the insertion control is not executed. Therefore, compared with the case where the insertion control is always performed, the periods during which the luminance enhancement processing of the display image (which is needed while the insertion control is performed) is unnecessary become longer, and power consumption can be reduced.

In fig. 5, the process of step S4 is not essential, and the process may shift to step S6 after the process of step S3. Even with this operation, whether to perform the insertion control can be determined from the relationship between the movement amounts L1 and L2 of the moving object and the observation angle H. Therefore, reduction of moving image blur and low power consumption can be achieved at the same time.

The viewing angle information acquisition unit 11C in the present embodiment may, in a state where the first display device is used as the display unit 23, measure the distance between the display unit 23 and the face of the observer using a distance sensor or the like provided on the back surface of the main body of the digital camera 100, calculate the angle of view of the observer with respect to the display unit 23 from the distance, and thereby acquire the angle-of-view information. In this configuration, the viewing angle information acquisition unit 11C constitutes a viewing angle calculation unit. According to this configuration, the angle of view of the observer in the state of using the first display device can be brought close to an accurate value, so the execution frequency of the insertion control can be further optimized.
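One plausible way to obtain the angle of view from the measured eye-to-display distance is the standard flat-screen relation 2·atan(w / 2d); the disclosure does not specify the exact formula, so the following is an assumption for illustration:

```python
import math

def viewing_angle_deg(display_width_mm: float, eye_distance_mm: float) -> float:
    """Angle of view (degrees) subtended by the display at the measured
    distance, via 2*atan(w / (2*d)); a flat screen viewed head-on is assumed."""
    return math.degrees(2.0 * math.atan(display_width_mm / (2.0 * eye_distance_mm)))
```

For example, a 100 mm wide display viewed from 50 mm subtends 90 degrees; moving the eye farther away shrinks the angle and, through the observation angle H, reduces the cases where the insertion control fires.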

Fig. 6 is a flowchart for explaining a first modification of the operation of the system control unit 11 shown in fig. 1. The flowchart shown in fig. 6 is the same as that shown in fig. 5, except that step S11 is added between step S3 and step S4. In fig. 6, the same processes as those in fig. 5 are denoted by the same reference numerals, and the description thereof is omitted.

After step S3, the insertion execution control unit 11B corrects the observation angle H set in step S2 and the movement amounts L1 and L2 set in step S3 in accordance with the resolution of the moving image data to be displayed and the maximum display resolution of the display unit 23 (step S11).

For example, consider a case where the resolution of the moving image data is 1/4 of the maximum display resolution of the display unit 23. In this case, each frame of the moving image data is enlarged by a factor of 2 in each of the horizontal and vertical directions and displayed on the display unit 23; that is, one pixel of the moving image data is displayed using 4 display pixels of the display unit 23. In this case, the observation angle at which the observer observes one pixel of the moving image data is 2 times the observation angle H set in step S2. Likewise, the movement amounts of the line of sight when the observer traces a moving object included in the moving image data are 2 times the movement amounts L1 and L2 set in step S3.

Specifically, in step S11, the insertion execution control unit 11B multiplies each of the observation angle H, the movement amount L1, and the movement amount L2 set in steps S2 and S3 by a correction coefficient (here, 2) obtained by dividing the maximum display resolution of the display unit 23 by the resolution of the moving image data in each direction. In step S6 and step S9 performed after step S11, the processing is performed using the corrected observation angle H and the corrected movement amounts L1 and L2.
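A sketch of the step S11 correction, assuming resolutions are given as per-axis pixel counts (the function name and that convention are illustrative assumptions):

```python
def corrected(h: float, l1: float, l2: float,
              display_pixels: int, video_pixels: int):
    """Step S11: scale H, L1 and L2 by the ratio of the display's
    per-axis pixel count to the moving image's per-axis pixel count
    (2 in the example where the video is enlarged twofold)."""
    k = display_pixels / video_pixels
    return h * k, l1 * k, l2 * k
```

Note that the comparison in step S6 and the ratio H/L in step S9 use the same coefficient on both sides, so the correction mainly matters when H or the movement amounts enter later steps individually.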

In this way, by correcting the observation angle H and the movement amounts L1 and L2 in accordance with the resolution of the moving image data and the maximum display resolution of the display unit 23, the insertion control can be performed efficiently regardless of the resolution of the moving image data.

Fig. 7 is a flowchart for explaining a second modification of the operation of the system control unit 11 shown in fig. 1. The flowchart shown in fig. 7 is the same as that shown in fig. 5, except that step S21 and step S22 are added before step S1. In fig. 7, the same processes as those in fig. 5 are denoted by the same reference numerals, and the description thereof is omitted.

When the current frame is generated, the insertion execution control unit 11B acquires the focus evaluation value of the current frame (step S21). The insertion execution control unit 11B calculates, for example, a contrast value of the current frame, and acquires the contrast value as a focus evaluation value. Alternatively, if the image pickup device 5 includes the phase difference detection pixels, the insertion execution control unit 11B calculates the defocus amount using the signals of the phase difference detection pixels included in the picked-up image signal that is the generation source of the current frame, and acquires the defocus amount as the focus evaluation value.

Next, the insertion execution control unit 11B determines whether or not the focus evaluation value is equal to or less than a predetermined threshold value (step S22). When the focus evaluation value is equal to or less than the threshold value (step S22: YES), the process proceeds to step S5. On the other hand, when the focus evaluation value exceeds the threshold value (step S22: NO), the process proceeds to step S1.

As described above, when the focus evaluation value of the current frame is equal to or less than the threshold value, it is determined that the insertion control of the specific image is not to be performed, and the normal control of displaying the current frame for one frame time is performed. A frame with a low focus evaluation value is an image with a large blurred portion, so in this case it is difficult to obtain a moving image blur reduction effect from performing the insertion control. Therefore, in this case, the processing of steps S1 to S4 and S6 to S9 is omitted, whereby the possibility that the insertion control is executed can be reduced to reduce power consumption. Further, since the processing of steps S1 to S4 and S6 to S9 is not performed, the load on the system control unit 11 can be reduced.
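The gating of steps S21 and S22 amounts to a simple threshold check placed before the rest of the flow; the following sketch (assumed names) illustrates it:

```python
def passes_focus_gate(focus_value: float, threshold: float) -> bool:
    """Steps S21-S22: a frame whose focus evaluation value is at or below
    the threshold is treated as largely blurred, and steps S1-S4 and
    S6-S9 (and hence the insertion control) are skipped for it."""
    return focus_value > threshold  # True: proceed to step S1
```

The threshold itself would be tuned to whichever focus evaluation value is used (a contrast value or a defocus amount converted to a common scale).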

Fig. 8 is a diagram showing a modification of the functional blocks of the system control unit 11 shown in fig. 2. The functional blocks of the system control unit 11 shown in fig. 8 are the same as those in fig. 2, except that the measurement unit 11D is added. The measurement unit 11D is realized by a processor executing the display control program.

The measurement unit 11D measures the above-described tracking visual limit movement amount of the observer of the display unit 23. For example, the measurement unit 11D displays a predetermined pattern (for example, a pattern in which 3 lines are arranged at intervals) on the display unit 23 and moves the pattern at a plurality of speeds (numbers of pixels moved per frame time). Then, the user is asked to input, for each speed, whether or not the 3 lines included in the moving pattern can be recognized without overlapping, and the maximum speed at which the 3 lines can be recognized without overlapping is determined. Then, this maximum speed is converted into a movement angle of the line of sight of the observer, and the movement angle is stored as the measurement result of the tracking visual limit movement amount.
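The conversion at the end of this measurement can plausibly reuse the per-pixel observation angle: multiply the fastest still-resolvable pattern speed by the angle one pixel subtends. This is an assumed sketch, as the disclosure does not spell out the conversion:

```python
def tracking_limit_angle(max_pixels_per_frame: int, h: float) -> float:
    """Convert the fastest pattern speed the observer could still resolve
    (pixels moved per frame time) into a line-of-sight movement angle,
    stored as the tracking visual limit movement amount."""
    return max_pixels_per_frame * h
```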

In the modification shown in fig. 8, the tracking visual limit movement amount set as the movement amounts L1 and L2 at the initial point in time of starting the generation of frames of moving image data is replaced with the tracking visual limit movement amount measured by the measurement unit 11D. Similarly, the tracking visual limit movement amount set as the upper limit value of the movement amounts L1 and L2 is replaced with the tracking visual limit movement amount measured by the measurement unit 11D.

According to the configuration of this modification, the tracking visual limit movement amount can be optimized according to the tracking observation ability of the observer, and therefore it is possible to determine whether or not the insertion control corresponding to the observer is executable.

The operations shown in fig. 5 to 8 have been described as being performed in the digital camera 100 during live preview display control or during moving image recording/reproduction control. However, the operations shown in fig. 5 to 8 can be similarly applied to a case where moving image data is reproduced by an electronic device having a display unit, for example, a case where moving image data recorded in a recording medium is reproduced by a television. In order to realize the operation of fig. 7 in a device other than the digital camera, information of the focus evaluation value may be recorded in association with each frame of the moving image data recorded in the recording medium.

Next, a structure of a smartphone will be described as an embodiment of the imaging apparatus of the present invention.

Fig. 9 is a diagram showing an external appearance of a smartphone 200 as an embodiment of the imaging device of the present invention.

A smartphone 200 shown in fig. 9 includes a flat-plate-shaped casing 201, and a display input unit 204 in which a display panel 202 serving as a display unit and an operation panel 203 serving as an input unit are integrated with each other is provided on one surface of the casing 201.

The housing 201 includes a speaker 205, a microphone 206, an operation unit 207, and a camera unit 208. The configuration of the housing 201 is not limited to this, and for example, a configuration in which the display portion and the input portion are independent from each other, or a configuration having a folding structure or a sliding mechanism may be employed.

Fig. 10 is a block diagram showing the structure of the smartphone 200 shown in fig. 9.

As shown in fig. 10, the smartphone 200 includes, as main constituent elements, a wireless communication unit 210, a display input unit 204, a communication unit 211, an operation unit 207, a camera unit 208, a storage unit 212, an external input/output unit 213, a GPS (Global Positioning System) receiving unit 214, a motion sensor unit 215, a power supply unit 216, and a main control unit 220.

The smartphone 200 has a wireless communication function of performing mobile wireless communication with a base station apparatus BS, which is not shown, via a mobile communication network NW, which is not shown, as a main function.

The wireless communication unit 210 performs wireless communication with the base station apparatus BS accommodated in the mobile communication network NW in accordance with the instruction of the main control unit 220. The wireless communication is used to transmit and receive various file data such as audio data and image data, e-mail data, and the like, and to receive network data and stream data and the like.

The display input unit 204 is a so-called touch panel that visually conveys information to the user by displaying images (still images and moving images), character information, and the like under the control of the main control unit 220, and that detects user operations on the displayed information; it includes the display panel 202 and the operation panel 203.

The Display panel 202 uses an LCD (Liquid Crystal Display), an OELD (Organic Electro-Luminescence Display), or the like as a Display device.

The operation panel 203 is a device mounted so that an image displayed on the display surface of the display panel 202 can be visually recognized, and detects one or more coordinates operated by a finger or a pen tip of the user. When this device is operated by the user's finger or a pen tip, a detection signal generated by the operation is output to the main control unit 220. The main control unit 220 then detects the operation position (coordinates) on the display panel 202 based on the received detection signal.

As shown in fig. 10, in the smartphone 200 illustrated as an embodiment of the imaging apparatus of the present invention, the display panel 202 and the operation panel 203 are integrated to form the display input unit 204, and the operation panel 203 is disposed so as to completely cover the display panel 202.

When this configuration is adopted, the operation panel 203 may also have a function of detecting a user operation to an area other than the display panel 202. In other words, the operation panel 203 may include a detection region (hereinafter, referred to as a display region) for an overlapping portion with the display panel 202 and a detection region (hereinafter, referred to as a non-display region) for an outer edge portion other than the overlapping portion with the display panel 202.

The size of the display area may be completely matched with the size of the display panel 202, but it is not always necessary to match the sizes. The operation panel 203 may include two sensing regions, i.e., an outer edge portion and the other inner portion. The width of the outer edge portion is appropriately designed according to the size of the housing 201 and the like.

The position detection system used in the operation panel 203 includes a matrix switch system, a resistive film system, a surface acoustic wave system, an infrared system, an electromagnetic induction system, a capacitance system, and the like, and any of these systems can be used.

The communication unit 211 includes the speaker 205 and the microphone 206, converts the user's voice input through the microphone 206 into audio data that can be processed by the main control unit 220 and outputs the audio data to the main control unit 220, and decodes audio data received by the wireless communication unit 210 or the external input/output unit 213 and outputs the decoded audio from the speaker 205.

As shown in fig. 9, for example, the speaker 205 may be mounted on the same surface as the surface on which the display input unit 204 is provided, and the microphone 206 may be mounted on a side surface of the housing 201.

The operation unit 207 is a hardware key using a key switch or the like, and receives an instruction from a user. For example, as shown in fig. 9, the operation unit 207 is a push button switch which is mounted on a side surface of the housing 201 of the smartphone 200, is turned on when pressed by a finger or the like, and is turned off by a restoring force of a spring or the like when the finger is separated.

The storage unit 212 stores a control program and control data of the main control unit 220, application software, address data in which a name, a telephone number, and the like of a communication partner are associated, data of transmitted and received e-mails, Web data downloaded by Web browsing, content data downloaded, and streaming data and the like. The storage unit 212 is composed of an internal storage unit 217 built in the smartphone and an external storage unit 218 having a detachable external memory slot.

Each of the internal storage unit 217 and the external storage unit 218 constituting the storage unit 212 is implemented using a storage medium such as a flash Memory type (flash Memory type), a hard disk type (hard disk type), a multi-media micro card type (multi media card micro type), a card type Memory (e.g., MicroSD (registered trademark) Memory, etc.), a RAM (Random Access Memory), or a ROM (Read Only Memory).

The external input/output unit 213 functions as an interface with all external devices connected to the smartphone 200, and is used to directly or indirectly connect to other external devices by communication or the like (for example, Universal Serial Bus (USB), IEEE1394, or the like) or via a network (for example, the Internet, wireless LAN, Bluetooth (registered trademark), RFID (Radio Frequency Identification), infrared communication (Infrared Data Association: IrDA (registered trademark)), UWB (Ultra Wideband) (registered trademark), ZigBee (registered trademark), or the like).

Examples of the external device connected to the smartphone 200 include a wired/wireless headset, a wired/wireless external charger, a wired/wireless data port, a memory card (Memory Card) connected via a card slot, a SIM (Subscriber Identity Module) card/UIM (User Identity Module) card, an external audio/video device connected via an audio/video I/O (Input/Output) terminal, a wirelessly connected external audio/video device, a wired/wireless connected smartphone, a wired/wireless connected personal computer, and an earphone.

The external input/output unit 213 can transfer data received from such an external device to each component inside the smartphone 200, or can transfer data inside the smartphone 200 to the external device.

The GPS receiving unit 214 receives GPS signals transmitted from the GPS satellites ST1 to STn in accordance with an instruction from the main control unit 220, and executes a positioning calculation process based on the received GPS signals to detect a position including latitude, longitude, and altitude of the smartphone 200. The GPS receiving unit 214 can detect the position using the position information even when the position information can be acquired from the wireless communication unit 210 or the external input/output unit 213 (for example, wireless LAN).

The motion sensor unit 215 includes, for example, a 3-axis acceleration sensor, and detects physical movement of the smartphone 200 in accordance with an instruction from the main control unit 220. By detecting physical movement of the smartphone 200, the moving direction or acceleration of the smartphone 200 is detected. The detection result is output to the main control unit 220.

The power supply unit 216 supplies power stored in a battery (not shown) to each unit of the smartphone 200 in accordance with an instruction from the main control unit 220.

The main control unit 220 includes a microprocessor, operates according to the control program and the control data stored in the storage unit 212, and centrally controls each unit of the smartphone 200. The main control unit 220 has a mobile communication control function and an application processing function for controlling each unit of the communication system in order to perform voice communication or data communication via the wireless communication unit 210.

The application processing function is realized by the main control unit 220 operating in accordance with the application software stored in the storage unit 212. Examples of the application processing function include an infrared communication function for controlling the external input/output unit 213 and performing data communication with the opposing device, an electronic mail function for transmitting and receiving electronic mail, and a web browsing function for browsing a web page.

The main control unit 220 also has an image processing function of displaying a video on the display input unit 204 based on image data (data of still images or moving images) such as received data or downloaded stream data.

The image processing function is a function of decoding the image data by the main control unit 220, performing image processing on the decoded result, and displaying an image on the display input unit 204.

The main control unit 220 performs display control on the display panel 202 and operation detection control for detecting a user operation via the operation unit 207 and the operation panel 203.

By executing the display control, the main control section 220 displays software keys such as icons and scroll bars for starting application software, or displays a window for creating an email.

The scroll bar is a software key for receiving an instruction to move a display portion of an image with respect to a large image or the like that cannot be completely stored in the display area of the display panel 202.

By executing the operation detection control, the main control section 220 detects a user operation through the operation section 207, receives an operation on the icon and an input of a character string to the input field of the window through the operation panel 203, and receives a scroll request for a display image through a scroll bar.

In addition, by executing the operation detection control, the main control section 220 has the following touch panel control functions: the operation position of the operation panel 203 is determined as an overlapping portion (display region) overlapping the display panel 202 or an outer edge portion (non-display region) not overlapping the display panel 202 other than the overlapping portion, and the sensing region of the operation panel 203 or the display position of the software key is controlled.

The main control unit 220 may detect a gesture operation on the operation panel 203 and execute a preset function in accordance with the detected gesture operation.

The gesture operation is not a conventional simple touch operation, but is an operation of drawing a trajectory with a finger or the like, simultaneously designating a plurality of positions, or combining these operations to draw a trajectory for at least one of the plurality of positions.

The camera unit 208 includes components other than the external memory control unit 20, the storage medium 21, the display unit 23, and the operation unit 14 in the digital camera shown in fig. 1.

The captured image data generated by the camera unit 208 can be stored in the storage unit 212 or output through the external input/output unit 213 or the wireless communication unit 210.

In the smartphone 200 shown in fig. 9, the camera unit 208 is mounted on the same surface as the display input unit 204, but the mounting position of the camera unit 208 is not limited thereto, and may be mounted on the back surface of the display input unit 204.

The camera unit 208 can be used for various functions of the smartphone 200. For example, it is possible to display an image acquired by the camera section 208 on the display panel 202, or to use the image of the camera section 208 as one of the operation inputs of the operation panel 203.

The GPS receiving unit 214 can also detect the position by referring to the image from the camera unit 208 when detecting the position. Further, by referring to the image from the camera unit 208, the optical axis direction of the camera unit 208 of the smartphone 200 and the current usage environment can be determined without using the 3-axis acceleration sensor, or in combination with the 3-axis acceleration sensor. Of course, the image from the camera unit 208 can also be utilized within the application software.

Further, the position information acquired by the GPS receiving unit 214, the sound information acquired by the microphone 206 (which may be converted into text information by audio-to-text conversion by the main control unit 220 or the like), the posture information acquired by the motion sensor unit 215, and the like may be added to the image data of a still image or a moving image and stored in the storage unit 212, or may be output through the external input/output unit 213 or the wireless communication unit 210.

In the smartphone 200 configured as described above, power saving can be achieved by optimizing the execution frequency of the insertion control of the specific image.

As described above, the following matters are disclosed in the present specification.

(1)

A display control device that performs display control of moving image data, the display control device comprising:

a specific image insertion control unit that performs insertion control of displaying a specific image different from the moving image data on a display unit during a period from when a first frame of the moving image data is displayed on the display unit to when a second frame of the moving image data is displayed after the first frame; and

an insertion execution control unit that determines whether to execute the insertion control based on a movement amount of a moving object included in the moving image data and an observation angle of one pixel of the moving image data displayed on the display unit.

(2)

The display control apparatus according to (1), wherein,

the movement amount is a movement angle of a line of sight of an observer who observes the moving object,

the insertion execution control unit determines not to execute the insertion control when the movement angle is equal to or smaller than the observation angle, and determines to execute the insertion control when the movement angle exceeds the observation angle.

(3)

The display control apparatus according to (1) or (2), wherein,

when the insertion control has been determined to be executed, the specific image insertion control unit controls the display time of the specific image in the period based on the movement amount and the observation angle.

(4)

The display control apparatus according to (3), wherein,

the movement amount is a movement angle of a line of sight of an observer who observes the moving object,

the specific image insertion control unit determines the display time of the specific image in the period based on a ratio of the movement angle to the observation angle.

(5)

The display control apparatus according to (4), wherein,

when the movement angle is set to L and the observation angle is set to H, the specific image insertion control unit sets a time 1-H/L times the period as the display time.

(6)

The display control apparatus according to any one of (1) to (5), wherein,

the insertion execution control unit may determine not to execute the insertion control when an exposure time of one frame of the moving image data is equal to or longer than a frame time based on a frame rate of the moving image data.

(7)

The display control apparatus according to any one of (1) to (6), wherein,

the insertion execution control unit further determines not to execute the insertion control when the focus evaluation value of the moving image data is equal to or less than a predetermined threshold value.

(8)

The display control apparatus according to any one of (1) to (7), wherein,

the insertion execution control section sets an actual movement amount of the moving object calculated from the moving image data as the movement amount of the moving object.

(9)

The display control apparatus according to any one of (1) to (7), wherein,

the insertion execution control unit sets a human tracking visual limit movement amount as the movement amount of the moving object.

(10)

The display control apparatus according to (9), wherein,

the display control device includes a measurement unit that measures a tracking visual limit movement amount of an observer of the display unit,

the insertion execution control unit sets, as the movement amount of the moving object, the tracking visual limit movement amount based on the measurement result of the measurement unit.

(11)

The display control apparatus according to any one of (1) to (10), wherein,

the display control device includes a viewing angle calculation unit that calculates a viewing angle of the display unit by an observer of the display unit,

the observation angle and the movement amount are calculated from the angle of view.

(12)

The display control apparatus according to any one of (1) to (11), wherein,

the observation angle and the movement amount are calculated based on the resolution of the moving image data, the resolution of the display unit, and the angle of view of the display unit by the observer.

(13)

The display control apparatus according to any one of (1) to (12), wherein,

the specific image is a black image.

(14)

An imaging device includes:

the display control device according to any one of (1) to (13);

the display unit; and

an image pickup element for picking up an image of a subject,

the moving image data is a live preview image of the subject captured by the image sensor.

(15)

A display control method for controlling display of moving image data, the display control method comprising:

a specific image insertion control step of performing insertion control of displaying a specific image different from the moving image data on a display unit during a period from display of a first frame of the moving image data on the display unit to display of a second frame of the moving image data after the first frame; and

an insertion execution control step of determining whether to execute the insertion control based on a movement amount of a moving object included in the moving image data and an observation angle of one pixel of the moving image data displayed on the display unit.

(16)

The display control method according to (15), wherein,

the movement amount is a movement angle of a line of sight of an observer who observes the moving object,

in the insertion execution control step, it is determined that the insertion control is not to be executed when the movement angle is equal to or smaller than the observation angle, and it is determined that the insertion control is to be executed when the movement angle exceeds the observation angle.
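Read procedurally, the threshold test in this insertion execution control step amounts to a single comparison. The sketch below is an illustrative Python reading, not claim language; the function name and the use of degrees are assumptions:

```python
def should_insert_specific_image(movement_angle_deg: float,
                                 observation_angle_deg: float) -> bool:
    """Illustrates the decision rule of (16): the insertion control is
    skipped when the line-of-sight movement angle per frame is equal to
    or smaller than the observation angle of one displayed pixel, and
    executed when it exceeds that angle."""
    return movement_angle_deg > observation_angle_deg
```

Intuitively, when the moving object shifts by less than one displayed pixel per frame, motion blur is imperceptible and the power cost of inserting the specific image is avoided.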

(17)

The display control method according to (15) or (16), wherein,

in the specific image insertion control step, when it has been determined to execute the insertion control,

the display time of the specific image in the period is controlled according to the movement amount and the observation angle.

(18)

The display control method according to (17), wherein,

the movement amount is a movement angle of a line of sight of an observer who observes the moving object,

in the specific image insertion control step, the display time of the specific image in the period is determined based on a ratio of the movement angle to the observation angle.

(19)

The display control method according to (18), wherein,

in the specific image insertion control step, when the movement angle is L and the observation angle is H, a time equal to (1 - H/L) times the period is set as the display time.
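The display-time formula of (19) can be sketched as follows. This is an illustrative Python reading under the assumption that the result is clamped at zero when the movement angle does not exceed the observation angle (the case where, per (16), insertion would not run anyway); the clamp and all identifiers are assumptions, not claim language:

```python
def specific_image_display_time(period_s: float,
                                movement_angle_deg: float,
                                observation_angle_deg: float) -> float:
    """Illustrates (19): with movement angle L and observation angle H,
    the specific (e.g. black) image is displayed for (1 - H/L) of the
    frame period; larger movement relative to one pixel's observation
    angle yields a longer black-frame interval."""
    L = movement_angle_deg
    H = observation_angle_deg
    ratio = max(0.0, 1.0 - H / L)  # assumed clamp for L <= H
    return period_s * ratio
```

For example, at a 60 fps period with a movement angle twice the observation angle, the specific image would occupy half of the frame period.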

(20)

The display control method according to any one of (15) to (19), wherein,

in the insertion execution control step, it is further determined that the insertion control is not executed when the exposure time for one frame of the moving image data is equal to or longer than one frame time determined by the frame rate of the moving image data.

(21)

The display control method according to any one of (15) to (20), wherein,

in the insertion execution control step, it is further determined that the insertion control is not executed when the focus evaluation value of the moving image data is equal to or less than a predetermined threshold value.
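The two additional suppression conditions of (20) and (21) can be sketched together. Combining them into one helper, and all names, are illustrative assumptions, not language from the claims:

```python
def insertion_suppressed(exposure_time_s: float,
                         frame_rate_hz: float,
                         focus_value: float,
                         focus_threshold: float) -> bool:
    """Illustrates (20) and (21): skip the insertion control when the
    per-frame exposure time already spans a whole frame period (motion
    blur is baked into each frame), or when the focus evaluation value
    is at or below a threshold (the image is too defocused for
    black-frame insertion to sharpen perceived motion)."""
    frame_time_s = 1.0 / frame_rate_hz
    return exposure_time_s >= frame_time_s or focus_value <= focus_threshold
```

Both checks gate the insertion decision of (15) before the movement-angle comparison, saving power when insertion cannot improve perceived quality.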

(22)

The display control method according to any one of (15) to (21), wherein,

in the insertion execution control step, an actual movement amount of the moving object calculated from the moving image data is set as the movement amount of the moving object.

(23)

The display control method according to any one of (15) to (21), wherein,

in the insertion execution control step, a tracking visual limit movement amount of the person is set as the movement amount of the moving object.

(24)

The display control method according to (23), wherein,

the display control method includes a measurement step of measuring a tracking visual limit movement amount of an observer of the display unit,

in the insertion execution control step, any one of the tracking visual limit movement amounts is set as the movement amount of the moving object in accordance with the measurement result of the measurement step.

(25)

The display control method according to any one of (15) to (24), wherein,

the display control method includes a viewing angle calculation step of calculating an angle of view of the display unit as seen by an observer of the display unit,

the observation angle and the movement amount are calculated from the angle of view.

(26)

The display control method according to any one of (15) to (25), wherein,

the observation angle and the movement amount are calculated based on the resolution of the moving image data, the resolution of the display unit, and the angle of view of the display unit by the observer.
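The calculation described here can be sketched as follows. The small-angle approximation (dividing the display's angle of view evenly across its pixels) and all identifiers are assumptions for illustration; the patent does not fix a specific formula:

```python
def pixel_observation_angle(display_angle_deg: float,
                            display_pixels: int) -> float:
    """Observation angle H of one display pixel: the observer's angle
    of view of the display divided by the number of pixels across it
    (small-angle approximation)."""
    return display_angle_deg / display_pixels

def movement_angle(pixels_per_frame: float,
                   image_pixels: int,
                   display_pixels: int,
                   display_angle_deg: float) -> float:
    """Movement angle of the line of sight per frame: the object's
    per-frame movement in moving-image pixels, rescaled to display
    pixels, multiplied by the observation angle of one display pixel."""
    display_movement = pixels_per_frame * display_pixels / image_pixels
    return display_movement * pixel_observation_angle(display_angle_deg,
                                                      display_pixels)
```

For instance, an object moving 10 pixels per frame in a 6000-pixel-wide image shown on a 3000-pixel-wide display covers 5 display pixels per frame, so its movement angle is 5 times the single-pixel observation angle.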

(27)

The display control method according to any one of (15) to (26), wherein,

the specific image is a black image.

(28)

A display control program for causing a computer to execute a display control method of moving image data,

the display control method comprising:

a specific image insertion control step of performing insertion control of displaying a specific image different from the moving image data on a display unit during a period from display of a first frame of the moving image data on the display unit to display of a second frame of the moving image data after the first frame; and

an insertion execution control step of determining whether to execute the insertion control based on a movement amount of a moving object included in the moving image data and an observation angle of one pixel of the moving image data displayed on the display unit.

While various embodiments have been described above with reference to the drawings, the present invention is, needless to say, not limited to these examples. It is apparent that those skilled in the art can conceive of various modifications and alterations within the scope of the claims, and it is understood that such modifications and alterations naturally fall within the technical scope of the present invention. Furthermore, the constituent elements of the above embodiments may be combined arbitrarily without departing from the spirit of the invention.

In addition, the present application is based on a Japanese patent application filed on February 20, 2019 (Japanese Patent Application No. 2019-…).

Industrial applicability

The present invention can be suitably applied to electronic devices having an imaging function and a display function, such as digital cameras and smartphones.

Description of the symbols

100-digital camera, 1-photographic lens, 2-diaphragm, 4-lens control section, 5-imaging element, 8-lens drive section, 9-diaphragm drive section, 10-imaging element drive section, 11-system control section, 14-operation section, 15-memory control section, 16-main memory, 17-digital signal processing section, 20-external memory control section, 21-storage medium, 23-display section, 24-control bus, 25-data bus, 40-lens device, 11A-specific image insertion control section, 11B-insertion execution control section, 11C-field angle information acquisition section, 11D-measurement section, 23a, 23b-frames, 23p-pixel, H-observation angle, m-moving object, L1, L2-movement amounts, 200-smartphone, 201-housing, 202-display panel, 203-operation panel, 204-display input section, 205-speaker, 206-microphone, 207-operation section, 208-camera section, 210-wireless communication section, 211-communication section, 212-storage section, 213-external input/output section, 214-GPS receiving section, 215-motion sensor section, 216-power supply section, 217-internal storage section, 218-external storage section, 220-main control section, ST1 to STn-GPS satellites.
