Electronic device, control method for electronic device, program, and storage medium

Document No.: 174375 | Publication date: 2021-10-29

Note: This technology, "Electronic device, control method for electronic device, program, and storage medium", was created by 横山俊一 on 2020-03-04. Its main content is as follows: The electronic device according to the present invention includes: a detection unit configured to detect a tilt of a display surface; a receiving unit configured to receive a user operation; and a control unit configured to perform control to display a part of an image on the display surface as a display range, to change the position of the display range in response to a change in the tilt of the display surface, and to change the position of the display range in response to a user operation, wherein the control unit corrects a second movement amount, by which the display range is moved in response to the change in the tilt of the display surface, based on a first movement amount by which the display range is moved by the user operation.

1. An electronic device, comprising:

a detection unit configured to detect a tilt of a display surface;

a receiving unit configured to receive a user operation; and

a control unit configured to perform control to display a part of an image on the display surface as a display range, change a position of the display range in response to a change in the tilt of the display surface, and change the position of the display range in response to a user operation,

wherein the control unit corrects a second movement amount, by which the display range is moved in response to a change in the tilt of the display surface, based on a first movement amount by which the display range is moved by the user operation.

2. The electronic device according to claim 1, wherein, in a case where the display surface is oriented in a specified direction, the control unit determines a display range that corresponds to the tilt of the display surface and in which the first movement amount is not reflected.

3. The electronic device according to claim 2, wherein the specified direction is a horizontal direction.

4. The electronic device according to claim 2 or 3, wherein the control unit does not change the position of the display range in response to a user operation in a state where the display surface is oriented in the specified direction.

5. The electronic device according to any one of claims 2 to 4, wherein the control unit determines the display range such that the display range moves continuously to a position corresponding to the tilt of the display surface in response to a change in the tilt of the display surface that brings the direction in which the display surface is oriented closer to the specified direction.

6. The electronic device according to any one of claims 1 to 5, wherein

the first movement amount is a movement amount in a vertical direction, and

the second movement amount is a movement amount by which the display range is moved in the vertical direction in response to a change in the tilt of the display surface in the elevation direction.

7. The electronic device according to any one of claims 1 to 6, wherein, in a case where the tilt of the display surface is changed, the control unit selectively performs one of processing for moving the display range by the second movement amount and processing for resetting the first movement amount and then determining the display range.

8. The electronic device according to claim 7, wherein the case where the tilt of the display surface is changed is a case where the tilt of the display surface changes by a change amount larger than a threshold value.

9. The electronic device according to claim 7 or 8, wherein, in a case where the tilt of the display surface is changed, the control unit performs control to display, on the display surface, a confirmation screen asking whether or not the first movement amount is to be reset.

10. The electronic device according to any one of claims 1 to 9, further comprising:

a posture detection unit configured to detect a posture of a user,

wherein, in a case where the posture of the user changes, the control unit selectively performs one of processing for moving the display range by the second movement amount and processing for resetting the first movement amount and then determining the display range.

11. The electronic device according to claim 10, wherein the posture detection unit detects the height at which the electronic device is located as a parameter corresponding to the posture of the user.

12. The electronic device according to claim 10 or 11, wherein the case where the posture of the user changes is a case where the posture of the user changes by a change amount larger than a threshold value.

13. The electronic device according to any one of claims 10 to 12, wherein, in a case where the posture of the user changes, the control unit performs control to display, on the display surface, a confirmation screen asking whether or not the first movement amount is to be reset.

14. The electronic device according to any one of claims 1 to 13, wherein the image is a virtual reality (VR) image.

15. A method of controlling an electronic device, comprising:

a detection step of detecting a tilt of a display surface;

a receiving step of receiving a user operation; and

a control step of performing control to display a part of an image as a display range on the display surface, change a position of the display range in response to a change in a tilt of the display surface, and change the position of the display range in response to a user operation,

wherein, in the control step, a second movement amount by which the display range is moved in response to a change in the tilt of the display surface is corrected based on a first movement amount by which the display range is moved by a user operation.

16. A program that causes a computer to function as each unit of the electronic apparatus according to any one of claims 1 to 14.

17. A storage medium storing a program that causes a computer to function as each unit of the electronic apparatus according to any one of claims 1 to 14.

Technical Field

The present invention relates to an electronic apparatus, a control method of an electronic apparatus, a program, and a storage medium, and particularly to a control method for causing an image having a wide video range to be displayed.

Background

Techniques have conventionally been proposed in which a part of an image having a wide video range, such as a hemispherical image or a celestial sphere image, is displayed on a display unit as a display range, and the position of the display range is changed in response to a change in the orientation of the display unit (the tilt of the display surface), a user operation, or the like. Image delivery techniques for delivering the images so used have also been proposed.

Patent document 1 discloses a technique in which the direction along the time axis of a 360° video and its reproduction speed are controlled according to the direction and magnitude of the tilt of a head mounted display.

CITATION LIST

Patent document

Patent document 1: Japanese Patent Laid-Open No. 2017-111539

Disclosure of Invention

Technical problem

According to the conventional technique, after the position of the display range has been changed by changing the tilt of the display surface, the position can be changed further by performing a user operation. However, once the position has been changed in this way, returning the display surface to its original tilt does not return the display range to its original position, so the display range cannot easily be moved to an intended position. In other words, the user cannot move the display range to a desired position by intuitively changing the tilt of the display surface.

The present invention provides a technique that enables the position of a display range to be easily changed to a desired position.

Means for solving the problems

The electronic device according to the present invention includes: a detection unit configured to detect a tilt of a display surface; a receiving unit configured to receive a user operation; and a control unit configured to perform control to display a part of an image on the display surface as a display range, to change the position of the display range in response to a change in the tilt of the display surface, and to change the position of the display range in response to a user operation, wherein the control unit corrects a second movement amount, by which the display range is moved in response to the change in the tilt of the display surface, based on a first movement amount by which the display range is moved by the user operation.

Advantageous effects of the invention

According to the present invention, the position of the display range can be easily changed to a desired position.

Drawings

Fig. 1 is a hardware block diagram showing an example of the configuration of an electronic apparatus according to the first embodiment.

Fig. 2 is a diagram showing an example of the configuration of a video transmission system according to the first embodiment.

Fig. 3 is a diagram showing an example of a screen of the video player according to the first embodiment.

Fig. 4 is a diagram showing an example of a method of determining a display range according to the first embodiment.

Fig. 5 is a diagram showing an example of information transmitted from the electronic apparatus according to the first embodiment.

Fig. 6 is a flowchart showing an example of the display control process.

Fig. 7 is a diagram showing an example of a problem to be solved by the first embodiment.

Fig. 8 is a diagram showing an example of the direction correspondence relationship according to the first embodiment.

Fig. 9 is a diagram showing an example of a confirmation screen according to the first embodiment.

Fig. 10 is a diagram showing an example of a gesture of a user according to the second embodiment.

Fig. 11 is a flowchart showing an example of display control processing according to the second embodiment.

Detailed Description

< first embodiment >

A first embodiment of the present invention will be described below. Fig. 1 is a hardware block diagram showing an example of the configuration of an electronic apparatus 100 according to the present embodiment. The electronic apparatus 100 is a smartphone, a head mounted display (HMD), or the like. When the electronic apparatus 100 is not a head mounted display, it can also function as an HMD by being attached to a head mounted adapter. Such a head-mounted adapter is also called "VR (virtual reality) goggles".

The electronic apparatus 100 (CPU 101) displays a part of an image (object image) as a display range on the display surface of the display unit 110. Note that the display unit 110 may be a display device separate from the electronic apparatus 100. For example, the electronic apparatus 100 may be a personal computer (PC), a game machine, any reproduction apparatus, or the like without the display unit 110, and the display unit 110 may be an HMD or the like connected to the electronic apparatus 100.

The object image is a hemispherical image, a celestial sphere image, or the like, and has a wider video range (a range in which video exists; an effective video range) than the display range when displayed at a normal magnification. The effective video range can also be called the "imaging range". The normal magnification is, for example, a magnification at which neither enlargement nor reduction is performed. The celestial sphere image is also referred to as an "omnidirectional image", a "360° panoramic image", or the like. The effective video range of the object image corresponds to a maximum field of view of 360 degrees in the vertical direction (vertical angle, angle with respect to the zenith, pitch angle, or elevation angle) and a maximum field of view of 360 degrees in the horizontal direction (horizontal angle or azimuth angle). Note that the effective video range of the object image may correspond to a pitch angle of less than 360 degrees (e.g., 180 degrees (±90 degrees)), and may also correspond to an azimuth angle of less than 360 degrees (e.g., 180 degrees (±90 degrees)). The object image may be a still image or a moving image. The present embodiment describes an example in which the object image is a moving image.

The electronic apparatus 100 (CPU 101) changes the position of the display range in response to a change in the orientation of the electronic apparatus 100 (a change in the tilt of the display surface of the display unit 110). This enables the user to change the position of the display range to a desired position by intuitively changing the orientation of the electronic apparatus 100 in the vertical or horizontal direction. The electronic apparatus 100 (CPU 101) also changes the position of the display range in response to a user operation (a display position changing operation). This enables the user to change the position of the display range to an intended position without changing the orientation of the electronic apparatus 100, the user's own posture, or the like. By appropriately using both changes in the orientation of the electronic apparatus 100 and display position changing operations, the user can conveniently change the position of the display range. Examples of the display position changing operation include a touch operation (such as a tap, flick, or slide operation) performed on a touch panel provided on the display surface of the display unit 110. The display position changing operation may also be an operation performed on a controller or the like connected to the electronic apparatus 100.

In a state where the electronic apparatus 100 serving as an HMD is attached to the user's head, the user can visually recognize the display range displayed on the display unit 110 without holding the electronic apparatus 100 by hand. In this case, when the user moves his or her head or entire body, the orientation of the electronic apparatus 100 (the display unit 110) also changes. The orientation of the electronic apparatus 100 can therefore also be described as "the orientation of the user's head (the direction in which the user's line of sight is oriented)".

A display method that changes the position of the display range in response to a change in the orientation of the electronic apparatus 100 (the display unit 110) is referred to as "VR display" or the like. VR display enables the user to feel a visual sensation (immersion or presence) as if the user were actually inside the object image (VR space). The object image can also be described as "an image having an effective video range forming at least a part of a virtual space (VR space)". Such a method of displaying an object image, as used by VR display, is referred to as a "VR view" or the like, and an image that can be VR-displayed is referred to as a "VR image" or the like. Note that the object image may or may not be a VR image.

As shown in fig. 1, in the electronic apparatus 100, a CPU101, a DRAM 102, a ROM 103, a RAM 104, a display control unit 105, an operation unit 106, a direction detection unit 107, a communication IF 108, a display unit 110, and an EEPROM 111 are connected to an internal bus 109. The above-described units connected to the internal bus 109 can exchange data with each other via the internal bus 109.

The display unit 110 is a display device such as a liquid crystal panel. The ROM 103 stores various programs and various data; for example, a program for controlling the overall processing (operation) of the electronic apparatus 100 and an application program for the video player described later are stored in advance in the ROM 103. The CPU 101 executes programs stored in the ROM 103 to control the processing of each unit of the electronic apparatus 100. The DRAM 102 is the main memory of the electronic apparatus 100. The RAM 104 serves as a work memory for the CPU 101. The EEPROM 111 is a nonvolatile memory that retains stored information even when the power of the electronic apparatus 100 is turned off. The communication IF 108 communicates with a network 120 such as the Internet in response to instructions from the CPU 101.

The operation unit 106 is an input device that receives operations (user operations). The operation unit 106 includes, for example, a pointing device such as a touch panel. The touch panel outputs coordinate information corresponding to the contact position touched by the user's finger, a stylus, or the like, and is, for example, stacked on the display surface of the display unit 110 so as to be configured integrally with the display unit 110. Note that the display surface, the touch panel, and the like may or may not have a planar structure. The CPU 101 controls the processing of each unit of the electronic apparatus 100 in response to operations performed on the operation unit 106. Specifically, the operation unit 106 generates a control signal based on an operation performed on it and supplies the control signal to the CPU 101, and the CPU 101 controls the processing of each unit of the electronic apparatus 100 based on that control signal. The electronic apparatus 100 can thus be made to operate based on user operations.

The display control unit 105 generates a display signal (such as an image signal or a drive signal for driving the display unit 110) for causing the display unit 110 to display an image, and outputs the display signal to the display unit 110. The CPU101 generates a display control signal corresponding to an image to be displayed on the display unit 110, and supplies the display control signal to the display control unit 105. The display control unit 105 generates a display signal based on the display control signal supplied from the CPU101, and supplies the display signal to the display unit 110. The display unit 110 displays an image on a display surface based on a display signal supplied from the display control unit 105.

The direction detection unit 107 detects the orientation of the electronic apparatus 100 (the inclination of the display unit 110 (the display surface of the display unit 110); the direction in which the electronic apparatus 100 is oriented) and supplies the detection result to the CPU 101. In the present embodiment, the direction detection unit 107 notifies the CPU101 of the direction (elevation angle and azimuth angle) toward which the electronic apparatus 100 (the display unit 110 (the display surface of the display unit 110)) is facing as a detection result. The CPU101 may determine (detect) the orientation (tilt) of the electronic apparatus 100 or determine (detect) whether the orientation of the electronic apparatus 100 is changed based on the information supplied thereto from the direction detection unit 107. As the direction detection unit 107, at least one of a plurality of sensors such as an acceleration sensor, a gyro sensor, a geomagnetic sensor, and an orientation sensor may be used. As the direction detection unit 107, a combination of a plurality of sensors may be used. Note that when the display unit 110 is a display device separate from the electronic apparatus 100, the direction detection unit 107 is provided in the display unit 110, and an interface that acquires a detection result from the direction detection unit 107 is provided in the electronic apparatus 100.

Fig. 2 is a diagram showing an example of the configuration of the video transmission system according to the present embodiment. In the system of Fig. 2, the electronic apparatus 100 is connected to the video delivery server 10 via the network 120. The video delivery server 10 stores a video (object video) corresponding to the above-described object image. The electronic apparatus 100 transmits information on the display range and the display time position to the video delivery server 10. The display time position is the time position to be displayed, among the time positions within the period from the start of the object video to its end. Upon receiving the information on the display range and the display time position, the video delivery server 10 extracts the image data corresponding to them from the object video data (the video data of the object video) and transmits the image data to the electronic apparatus 100. The electronic apparatus 100 then displays an image based on the received image data. This processing is performed at each update timing at which the display on the electronic apparatus 100 is updated.

Note that the image size of the image data extracted by the video delivery server 10 is not particularly limited. Enlargement or reduction of the image may be performed by the video delivery server 10 or by the electronic apparatus 100. When the video delivery server 10 performs the enlargement or reduction, the image size of the extracted image data may vary; when it does not, the image size does not vary. The electronic apparatus 100 may also acquire the entire object video data, extract the image data corresponding to the display range and the display time position from it, and display that image data. The object video data may also be acquired from a device other than the video delivery server 10.
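To make the exchange concrete, the per-update cycle described above can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the `DeliveryServer` class, its `extract` method, and the dictionary field names are hypothetical stand-ins for the server-side processing and the transmitted information.

```python
class DeliveryServer:
    """Hypothetical stand-in for the video delivery server 10."""

    def __init__(self, video_frames):
        # video_frames: display time position (seconds) -> full-range frame data
        self.video_frames = video_frames

    def extract(self, azimuth, elevation, time_position):
        # Extract the image data corresponding to the requested display range
        # (azimuth, elevation) at the requested display time position.
        frame = self.video_frames[time_position]
        return {"azimuth": azimuth, "elevation": elevation, "frame": frame}


def update_display(server, azimuth, elevation, time_position):
    # One update cycle: the apparatus sends display range and display time
    # position information, then displays the image data it receives back.
    image_data = server.extract(azimuth, elevation, time_position)
    return image_data


server = DeliveryServer({0: "frame-at-0s", 1: "frame-at-1s"})
shown = update_display(server, azimuth=45, elevation=30, time_position=1)
```

In a real system the `extract` call would be a network request over the network 120, repeated at every display update timing.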

Fig. 3 is a diagram showing an example of a screen displayed on the electronic apparatus 100. The screen in Fig. 3 is a screen of the video player. As shown in Fig. 3, the screen of the video player has a display area 301, a reproduction button 302, a pause button 303, a stop button 304, and display position change buttons 305. The display area 301 is the area in which the object video is displayed. The reproduction button 302 receives a user operation for reproducing the object video, the pause button 303 receives a user operation for pausing it, and the stop button 304 receives a user operation for stopping it. Reproduction control (reproduction, pause, and stop) is performed by the CPU 101. The display position change buttons 305 receive user operations for changing the position of the display range. In other words, in the present embodiment, a user operation performed on a display position change button 305 corresponds to the above-described display position changing operation.

The operation unit 106 generates a control signal based on a user operation performed on the reproduction button 302, the pause button 303, the stop button 304, a display position change button 305, or the like, and supplies the control signal to the CPU 101. The CPU 101 then controls the processing of the respective units of the electronic apparatus 100 based on the supplied control signal. For example, when a user operation is performed on the reproduction button 302, the CPU 101 performs control based on the control signal so that the display time position is sequentially updated and the information (the information on the display range and the display time position) is sequentially transmitted to the video delivery server 10. When a user operation is performed on the pause button 303, the CPU 101 performs control so that the display time position is not updated while the information continues to be sequentially transmitted to the video delivery server 10. When a user operation is performed on the stop button 304, the CPU 101 performs control so that transmission of the information to the video delivery server 10 is stopped and display of the object video ends.

When a user operation is performed on any of the display position change buttons 305, the CPU 101 performs control based on the control signal supplied from the operation unit 106 so that the display range is updated and the information (the information on the display range and the display time position) is sequentially transmitted to the video delivery server 10. In the example of Fig. 3, the display position change buttons 305 include an up arrow button 305a, a down arrow button 305b, a left arrow button 305c, and a right arrow button 305d. The display range moves up by one step each time the user operates the up arrow button 305a and moves down by one step each time the user operates the down arrow button 305b. Likewise, the display range moves left by one step each time the user operates the left arrow button 305c and moves right by one step each time the user operates the right arrow button 305d.

Figs. 4(A) to 4(C) are diagrams each showing an example of a method of determining the display range. They show an example in which a hemispherical virtual space 401 is formed from the object video. It is assumed here that when the object video is reproduced by the electronic apparatus 100, the electronic apparatus 100 is located at the center of a reference plane (bottom plane) 402 of the virtual space 401. Note that the shape of the virtual space formed by the object video (object image) is not limited to a hemisphere; for example, a fully spherical virtual space may also be formed.

In Fig. 4(A), an arrow 403 indicates the direction in which the electronic apparatus 100 faces in the virtual space 401. The direction of the arrow 403 can also be described as "the direction in which the user looks (the viewing direction; the viewpoint direction in the virtual space 401)". On the surface of the virtual space 401, a range 404 in the direction of the arrow 403 is determined as the display range. The direction of the arrow 403 and the position of the display range 404 are represented by an azimuth angle θ and an elevation angle Φ. The azimuth angle θ is the angle between a reference line 405 on the reference plane 402 and the line connecting the center point T of the reference plane 402 and the point P at which a perpendicular from the center of the display range 404 reaches the reference plane 402. The elevation angle Φ is the angle between the line connecting the center point T and the point P and the line connecting the center of the display range 404 and the center point T.
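As a worked illustration of this parameterization, the viewing direction given by (θ, Φ) can be converted into a unit vector in the virtual space 401. The axis convention below (the reference line 405 along the x-axis, the zenith along the z-axis) is an assumption made for this sketch; the description itself does not fix a coordinate system.

```python
import math

def viewing_vector(azimuth_deg, elevation_deg):
    """Unit vector for the viewing direction (arrow 403 of Fig. 4(A)),
    given the azimuth angle θ and elevation angle Φ in degrees.
    Assumed convention: reference line 405 is the x-axis, zenith is +z."""
    theta = math.radians(azimuth_deg)
    phi = math.radians(elevation_deg)
    x = math.cos(phi) * math.cos(theta)
    y = math.cos(phi) * math.sin(theta)
    z = math.sin(phi)
    return (x, y, z)
```

For example, (θ, Φ) = (0°, 0°) points along the reference line on the reference plane, and (θ, Φ) = (0°, 90°) points at the zenith of the hemisphere.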

In Fig. 4(B), the user points the electronic apparatus 100 at an object 406 in the virtual space 401 (object video). Thus, a display range 407 including the object 406 is set, and the display range 407 is displayed in the display area 301. In Fig. 4(C), the user points the electronic apparatus 100 at an object 408 in the virtual space 401, so a display range 409 including the object 408 is set and displayed in the display area 301. As shown in Figs. 4(B) and 4(C), by changing the orientation of the electronic apparatus 100, the user can freely change the position (azimuth angle θ and elevation angle Φ) of the display range and can view any display range in the display area 301.

As also described above, the user can freely change the position of the display range by operating any of the display position change buttons 305 (the up arrow button 305a, the down arrow button 305b, the left arrow button 305c, and the right arrow button 305d). For example, the elevation angle Φ increases by 1 degree each time the user operates the up arrow button 305a and decreases by 1 degree each time the user operates the down arrow button 305b. The azimuth angle θ increases by 1 degree each time the user operates the left arrow button 305c and decreases by 1 degree each time the user operates the right arrow button 305d.
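A minimal sketch of this one-degree-per-press behavior follows, assuming the sign conventions stated above (up/left increase Φ/θ, down/right decrease them). The wrap-around at 360° is an added assumption, since the text does not specify how the angles behave at their limits.

```python
STEP_DEG = 1  # one step per button press, as described in the text

def apply_button(azimuth, elevation, button):
    """Update (θ, Φ) for one press of a display position change button 305.
    Wrap-around at 360 degrees is an assumed detail, not stated in the text."""
    if button == "up":        # up arrow button 305a
        elevation += STEP_DEG
    elif button == "down":    # down arrow button 305b
        elevation -= STEP_DEG
    elif button == "left":    # left arrow button 305c
        azimuth += STEP_DEG
    elif button == "right":   # right arrow button 305d
        azimuth -= STEP_DEG
    return azimuth % 360, elevation % 360
```

The accumulated difference between these button-driven angles and the device-orientation angles is what the embodiment later calls the movement amount (θ2, Φ2).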

Note that the object video data forming the hemispherical virtual space 401 is stored in the video delivery server 10. The electronic apparatus 100 transmits the information (the azimuth angle θ and the elevation angle Φ) generated by the direction detection unit 107 to the video delivery server 10 using the communication IF 108. The video delivery server 10 then extracts, from the object video data, the image data within the display range determined by the azimuth angle θ and the elevation angle Φ received from the electronic apparatus 100 via the network 120, and transmits the extracted image data to the electronic apparatus 100 via the network 120. The electronic apparatus 100 receives the image data (the image data within the display range) using the communication IF 108. Then, the CPU 101 of the electronic apparatus 100 generates a display control signal based on the received image data, the display control unit 105 generates a display signal based on the display control signal, and the display unit 110 displays an image based on the display signal. Accordingly, an image within the display range is displayed on the display surface of the display unit 110 (in the display area 301). Various proposed techniques can be used for the process of extracting (generating) the image data to be displayed on the electronic apparatus from the object video data.

Fig. 5 is a diagram showing an example of the information (data) transmitted from the electronic apparatus 100 to the video delivery server 10. In the present embodiment, the electronic apparatus 100 transmits the azimuth angle θ and the elevation angle Φ as display range information (information on the display range). The electronic apparatus 100 also transmits, as display time position information (information on the display time position), the time elapsed from the start of the object video to the display time position. The count value in Fig. 5 is incremented at each display update timing at which the display on the electronic apparatus 100 is updated, and is managed by the electronic apparatus 100. The electronic apparatus 100 transmits the display range information and the display time position information to the video delivery server 10 every time the count value is incremented.

In the example of Fig. 5, it is assumed that the count value is incremented every second, i.e., the display on the electronic apparatus 100 is updated every second. At count values 0 to 5, the object video is paused, and the display time position stays at 0:00 (minutes:seconds). At count value 6 and subsequent count values, the object video is reproduced, and the display time position advances second by second from 0:01. At count values 0 to 2, the azimuth angle θ and the elevation angle Φ are both 0°, and video data in the direction defined by them is displayed. At count value 3 and subsequent count values, the azimuth angle θ and the elevation angle Φ change, and the direction of the image data displayed on the electronic apparatus 100 (i.e., the position of the display range) changes accordingly.
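The rows of Fig. 5 can be sketched as records built at each count increment. The function and field names below are illustrative assumptions; only the count value, the angles θ and Φ, and the display time position behavior (paused at 0:00 through count 5, then advancing second by second from 0:01) come from the example.

```python
def build_record(count, azimuth, elevation, paused, pause_time=0):
    """One row of the Fig. 5 table: display range information plus display
    time position information for a given count value. Assumes one count
    increment per second and reproduction starting at count 6, as in the
    example; all names here are illustrative."""
    if paused or count < 6:
        seconds = pause_time
    else:
        seconds = count - 5  # display time position advances second by second
    return {
        "count": count,
        "azimuth": azimuth,
        "elevation": elevation,
        "display_time": f"{seconds // 60}:{seconds % 60:02d}",  # minutes:seconds
    }
```

A record like this would be transmitted to the video delivery server 10 at every display update timing.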

Fig. 6 is a flowchart showing an example of the display control processing performed in the electronic apparatus 100. The CPU 101 loads the application program for the video player stored in the ROM 103 into the RAM 104 and executes it to realize each processing step in the flowchart of Fig. 6. It is assumed here that the object video is predetermined. In addition, device orientation information L1 indicating the direction (azimuth angle θ1 and elevation angle Φ1) in which the electronic apparatus 100 (the display surface of the display unit 110) is oriented is denoted as "device orientation information L1(θ1, Φ1)". Movement amount information L2 indicating the movement amount (azimuth angle change amount θ2 and elevation angle change amount Φ2) by which the display range (viewing direction) is moved by the display position changing operation (a user operation performed on a display position change button 305) is denoted as "movement amount information L2(θ2, Φ2)". The display range information L (information indicating the display range (azimuth angle θ and elevation angle Φ)) to be transmitted to the video delivery server 10 is denoted as "display range information L(θ, Φ)".

Before describing the display control processing in fig. 6, the problem to be solved by the present embodiment (the display control processing in fig. 6) will be described. Consider a case where the position of the display range can be changed arbitrarily both in response to a change in the orientation of the display unit (the tilt of the display surface) and in response to the display position changing operation. In this case, after the position of the display range is changed by changing the tilt of the display surface, the position of the display range may be changed further by performing the display position changing operation. Once the position of the display range has been changed in this way, however, it does not return to its original position even if the tilt of the display surface is returned to the original tilt, and therefore the position of the display range cannot easily be changed to the intended position. In other words, the user cannot change the position of the display range to a desired position by intuitively changing the tilt of the display surface.

For example, consider the following case: the electronic apparatus, facing the horizontal direction in a state where the range satisfying azimuth angle θ = 0° and elevation angle Φ = 0° is displayed as shown in fig. 7(A), is tilted upward by 30° as shown in fig. 7(B). In this case, the display range is moved vertically, i.e., updated from the range satisfying azimuth angle θ = 0° and elevation angle Φ = 0° to the range satisfying azimuth angle θ = 0° and elevation angle Φ = 30°. Then, when the display position changing operation is performed, the display range is moved further. For example, as shown in fig. 7(C), the display range is moved vertically by a change amount Φ2 = 30°, i.e., updated from the range satisfying azimuth angle θ = 0° and elevation angle Φ = 30° to the range satisfying azimuth angle θ = 0° and elevation angle Φ = 60°. Then, when the electronic apparatus is returned to the horizontal direction, the change amount Φ2 remains at 30°, so the display range does not return to the range satisfying azimuth angle θ = 0° and elevation angle Φ = 0° but instead becomes the range satisfying azimuth angle θ = 0° and elevation angle Φ = 30°. Therefore, even when the electronic apparatus is oriented in the horizontal direction, the user cannot return the display range to the range satisfying azimuth angle θ = 0° and elevation angle Φ = 0°.

Therefore, in the display control processing of fig. 6, the amount of movement by which the display range is moved in response to a change in the tilt of the display surface or the like is corrected based on the amount of movement (the change amounts θ2 and Φ2) by which the display range is moved by the display position changing operation. This enables the position of the display range to be changed easily to a desired position.

In step S601 of fig. 6, the CPU 101 acquires the device orientation information L1(θ1, Φ1) from the direction detection unit 107 and records it in the RAM 104. In the state of fig. 7(B), the device orientation information L1(θ1 = 0°, Φ1 = 30°) is acquired. The CPU 101 also records the display range information L(θ, Φ) and the movement amount information L2(θ2, Φ2) in the RAM 104. In step S601, there is no movement of the display range due to the display position changing operation. Therefore, the CPU 101 records the display range information L(θ = θ1, Φ = Φ1) and the movement amount information L2(θ2 = 0°, Φ2 = 0°) in the RAM 104. Then, the CPU 101 transmits the display range information L(θ = θ1, Φ = Φ1) and the display time position information to the video delivery server 10 as the information at the count value 0. The display time position information at the count value 0 indicates, for example, the time position (0:00) of the first frame of the object video. Note that the display time position information at the count value 0 may also indicate another time position. For example, information indicating the time position at which reproduction of the object video was previously stopped may be used as the display time position information at the count value 0. Such a time position may be managed by at least one of the electronic device 100 and the video delivery server 10.

In step S602, the CPU101 receives image data (image data within the display range) from the video delivery server 10 using the communication IF 108, and records the received image data in the RAM 104. Then, the CPU101 displays an image (image within the display range) based on the image data stored in the RAM 104 on the display unit 110 using the display control unit 105.

In step S603, the CPU101 determines whether a display position changing operation (a user operation performed on any of the display position changing buttons 305) is performed. The determination in step S603 can be made by monitoring the control signal output from the operation unit 106. When it is determined that the display position changing operation is performed, the process proceeds to step S604, and when it is determined that the display position changing operation is not performed, the process proceeds to step S607.

In step S604, the CPU 101 acquires the movement amount information L2(θ2, Φ2) corresponding to the display position changing operation from the operation unit 106. Then, the CPU 101 updates the movement amount information L2(θ2, Φ2) stored in the RAM 104 with the movement amount information L2(θ2, Φ2) acquired from the operation unit 106. In the state of fig. 7(C), the movement amount information L2(θ2 = 0°, Φ2 = 30°) is acquired.

In step S605, the CPU 101 updates the display range information L(θ, Φ) stored in the RAM 104 based on the device orientation information L1(θ1, Φ1) and the movement amount information L2(θ2, Φ2) stored in the RAM 104. Specifically, the updated azimuth angle θ and the updated elevation angle Φ are calculated using Expressions 1 and 2 below, and the display range information L(θ, Φ) stored in the RAM 104 is updated accordingly. When there is a change from the state of fig. 7(B) to the state of fig. 7(C), the updated azimuth angle θ = 0° and the updated elevation angle Φ = 60° are calculated, and the display range information is updated to L(θ = 0°, Φ = 60°).

θ = θ1 + θ2 … (Expression 1)

Φ = Φ1 + Φ2 … (Expression 2)
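As a minimal sketch of Expressions 1 and 2, the update of step S605 can be written as a simple additive composition (the function and variable names are illustrative, not taken from the embodiment):

```python
def update_display_range(theta1, phi1, theta2, phi2):
    """Compute the display range information L(theta, phi) from the device
    orientation information L1(theta1, phi1) and the movement amount
    information L2(theta2, phi2), per Expression 1 (theta = theta1 + theta2)
    and Expression 2 (phi = phi1 + phi2)."""
    return theta1 + theta2, phi1 + phi2

# Change from the state of fig. 7(B) to the state of fig. 7(C):
# the device is tilted up by 30 degrees (phi1 = 30), and the display
# position changing operation adds a further 30 degrees (phi2 = 30).
theta, phi = update_display_range(theta1=0, phi1=30, theta2=0, phi2=30)
print(theta, phi)  # 0 60
```

This reproduces the example in the text: the display range information is updated to L(θ = 0°, Φ = 60°).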

In step S606, the CPU 101 determines the correspondence relationship (direction correspondence relationship) between the elevation angle Φ1 of the electronic apparatus 100 (the display surface of the display unit 110) and the elevation angle Φ of the display range. Fig. 8(A) shows the direction correspondence relationship in an initial state, such as before the display position changing operation is performed or in the state indicated by the movement amount information L2(θ2 = 0°, Φ2 = 0°). In the direction correspondence relationship in the initial state, the elevation angle Φ of the display range is equal to the elevation angle Φ1 of the electronic apparatus 100. When there is a change from the state of fig. 7(B) to the state of fig. 7(C), the direction correspondence relationship shown in fig. 8(B) is determined in step S606. The elevation angle Φn at point A in fig. 8(B) is the elevation angle Φ = Φ1 + Φ2 = 60° indicated by the display range information L(θ, Φ) stored in the RAM 104 at the time step S606 is performed. The elevation angle Φ1n at point A is the elevation angle Φ1 = 30° indicated by the device orientation information L1(θ1, Φ1) stored in the RAM 104 at the time step S606 is performed.

Here, a case where the display range is determined using the direction correspondence relationship shown in fig. 8(B) is considered. In this case, the display range is determined such that it moves continuously to a position corresponding to the tilt of the electronic apparatus 100 in response to a change in the tilt of the electronic apparatus 100 that brings the direction in which the electronic apparatus 100 (the display surface of the display unit 110) is oriented closer to a specified direction. In the present embodiment, the specified direction is the horizontal direction (a direction in which the elevation angle Φ1 of the electronic apparatus is 0° or 180°). The "position corresponding to the tilt of the electronic apparatus 100" is a position where the elevation angle Φ of the display range is equal to the elevation angle Φ1 of the electronic apparatus 100 regardless of the change amount Φ2 (the movement amount information L2(θ2, Φ2)).

Note that the elevation angle Φ represented by the direction correspondence relationship in fig. 8(B) is an elevation angle obtained by correcting the elevation angle Φ1 based on the change amount Φ2. Therefore, the direction correspondence relationship in fig. 8(B) can also be said to be "a correspondence relationship in which the amount of change in the elevation angle Φ caused by a change in the elevation angle Φ1 is corrected based on the change amount Φ2". The change amount Φ2 can also be said to be "the movement amount by which the display range is moved in the vertical direction (the elevation direction) by the display position changing operation". The amount of change in the elevation angle Φ caused by a change in the elevation angle Φ1 can also be said to be "the movement amount by which the display range moves in the vertical direction (the elevation direction) in response to a change in the tilt of the display surface in the elevation direction".
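The corrected correspondence of fig. 8(B) can be sketched as a mapping from the device elevation Φ1 to the display-range elevation Φ that passes through the origin (the specified horizontal direction, where Φ2 is no longer reflected) and through point A (Φ1n, Φn). The figure itself is not reproduced here, so linear interpolation between these two points is an assumption; it is, however, consistent with point B in step S609 (Φ1 = 15° → Φ = 30°):

```python
def make_direction_correspondence(phi1_n, phi_n):
    """Return the direction correspondence of fig. 8(B) as a function
    phi_of(phi1).  Passes through (0, 0) and point A (phi1_n, phi_n);
    the linear shape between them is an assumption, not a detail stated
    in the embodiment."""
    def phi_of(phi1):
        if phi1_n == 0:
            # Initial correspondence of fig. 8(A): phi equals phi1.
            return phi1
        return phi1 * phi_n / phi1_n
    return phi_of

# Correspondence determined after the change to the state of fig. 7(C):
# point A is (phi1_n = 30, phi_n = 60).
corr = make_direction_correspondence(phi1_n=30, phi_n=60)
print(corr(30))  # 60.0  (point A)
print(corr(15))  # 30.0  (point B of step S609)
print(corr(0))   # 0.0   (horizontal direction: phi2 not reflected)
```

As the device tilt returns toward horizontal, the looked-up Φ shrinks toward Φ1, so the correction introduced by the display position changing operation is smoothly removed.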

In step S607, the CPU 101 acquires the device orientation information L1(θ1, Φ1) from the direction detection unit 107 and records it in the RAM 104 (i.e., updates the device orientation information L1(θ1, Φ1)). At this time, the CPU 101 determines whether the tilt of the electronic apparatus 100 (the display surface of the display unit 110) has changed. In the present embodiment, the CPU 101 determines whether the tilt of the electronic apparatus 100 has changed by a change amount larger than a threshold. Specifically, the CPU 101 compares the device orientation information L1(θ1, Φ1) before the update with the device orientation information L1(θ1, Φ1) after the update to determine whether the amount of change in the elevation angle Φ1 is greater than the threshold. When it is determined that the amount of change in the elevation angle Φ1 is greater than the threshold, the processing proceeds to step S608, and when it is determined that the amount of change in the elevation angle Φ1 is equal to or less than the threshold, the processing proceeds to step S613.

Note that the threshold may be zero or greater than zero. The threshold may be a value predetermined by the manufacturer or the like, or it may not be; for example, it may be a value (variable value) specified by the user. In the present embodiment, when the elevation angle Φ1 does not change even though the azimuth angle θ1 changes, it is determined that the tilt of the electronic apparatus 100 has not changed. However, in this case, it may instead be determined that the tilt of the electronic apparatus 100 has changed. In other words, a change in the azimuth angle θ1 may additionally be considered in determining whether the tilt of the electronic device 100 has changed.
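The threshold comparison of step S607 can be sketched as follows (the function name and the sample threshold values are illustrative):

```python
def tilt_changed(phi1_before, phi1_after, threshold=0.0):
    """Step S607: report a tilt change only when the change amount of the
    elevation angle phi1 exceeds the threshold.  The threshold may be
    zero, a manufacturer preset, or a user-specified (variable) value."""
    return abs(phi1_after - phi1_before) > threshold

print(tilt_changed(30, 30))               # False -> proceed to step S613
print(tilt_changed(30, 15))               # True  -> proceed to step S608
print(tilt_changed(30, 29, threshold=2))  # False (within the threshold)
```

With a threshold greater than zero, small sensor jitter in Φ1 does not trigger the reset/correction branch at every loop iteration.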

In step S608, the CPU 101 determines whether the movement amount information L2(θ2, Φ2) (the movement amount by which the display range has been moved by the display position changing operation) is to be reset. In the present embodiment, the CPU 101 displays a confirmation screen asking whether the movement amount information L2(θ2, Φ2) is to be reset on the display surface using the display control unit 105. Then, the CPU 101 determines whether the movement amount information L2(θ2, Φ2) is to be reset in response to a user operation on the confirmation screen. Fig. 9 shows an example of the confirmation screen. The confirmation screen 901 is displayed superimposed on the screen of the video player (fig. 3). To ask the user whether the movement amount information L2(θ2, Φ2) is to be reset, the confirmation screen 901 displays a message such as "Reset the viewpoint change made by screen operation?". Further, the confirmation screen 901 displays a YES button 902 and a NO button 903. The user presses the YES button 902 when the movement amount information L2(θ2, Φ2) is to be reset, and presses the NO button 903 when it is not to be reset. The operation unit 106 notifies the CPU 101 of the pressing of the YES button 902, the NO button 903, and the like. When it is determined that the movement amount information L2(θ2, Φ2) is to be reset, the processing proceeds to step S612. When it is determined that the movement amount information L2(θ2, Φ2) is not to be reset, the processing proceeds to step S609. Therefore, in the present embodiment, one of the processing in step S609 and the processing in step S612 is selected and performed.

Note that the method of selecting one of the processing in step S609 and the processing in step S612 is not limited to the above-described method of displaying the confirmation screen. For example, the process to be performed (one of the process in step S609 and the process in step S612) may be set in advance at the start of the display control process of fig. 6 or the like. It is also possible that the processing always proceeds from step S608 to step S609 without involving selection of one of the processing in step S609 and the processing in step S612. It is also possible that the processing always advances from step S608 to step S612.

In step S609, the CPU 101 acquires, from the direction correspondence relationship determined in step S606, the elevation angle Φ corresponding to the elevation angle Φ1 stored in the RAM 104 (the device orientation information L1(θ1, Φ1) updated in step S607). For example, when the direction correspondence relationship in fig. 8(B) has been determined in step S606 and the elevation angle Φ1 = 15° is stored in the RAM 104, the elevation angle Φ = 30° at point B is acquired. Then, the CPU 101 updates the display range information L(θ, Φ) stored in the RAM 104 so that it indicates the acquired elevation angle Φ. As a result, the amount of change in the elevation angle Φ caused by the change in the elevation angle Φ1 is corrected based on the change amount Φ2, and the display range moves by the corrected change amount (the amount of change in the elevation angle Φ). When the electronic apparatus 100 (the display surface of the display unit 110) is oriented in the horizontal direction (the specified direction), the change amount Φ2 is not reflected, and a display range based on the elevation angle Φ1 (a display range satisfying Φ = Φ1) is determined.

In step S610, the CPU 101 determines whether the electronic apparatus 100 (the display surface of the display unit 110) is oriented in the horizontal direction (the specified direction). Specifically, the CPU 101 determines whether the elevation angle Φ updated in step S609 has the reference value (0° or 180°). In the direction correspondence relationship determined in step S606 (fig. 8(B)), when the elevation angle Φ of the display range has the reference value, the elevation angle Φ of the display range is equal to the elevation angle Φ1 of the electronic apparatus 100. Therefore, when the elevation angle Φ updated in step S609 has the reference value, the electronic apparatus 100 faces the horizontal direction. When it is determined that the elevation angle Φ has the reference value, the processing proceeds to step S611, and when it is determined that the elevation angle Φ does not have the reference value, the processing proceeds to step S613.

In step S611, the CPU 101 updates (initializes) the movement amount information L2(θ2, Φ2) stored in the RAM 104 to the movement amount information L2(0, 0). Further, the CPU 101 returns (initializes) the direction correspondence relationship from the one determined in step S606 to the direction correspondence relationship in fig. 8(A). In other words, the CPU 101 resets the movement amount information L2(θ2, Φ2) and the direction correspondence relationship.

The processing in step S611 (resetting of the movement amount information L2(θ2, Φ2)) is always performed after the processing in step S612. Therefore, in step S612, the CPU 101 updates the display range information L(θ, Φ) stored in the RAM 104 to the device orientation information L1(θ1, Φ1) stored in the RAM 104. In other words, the CPU 101 updates the display range information L(θ, Φ) to information in which the movement amount information L2(θ2, Φ2) is not reflected.

In step S613, the CPU 101 determines whether a stop operation (a user operation on the stop button 304) has been performed. The determination in step S613 can be made by monitoring the control signal output from the operation unit 106. When it is determined that the stop operation has been performed, the CPU 101 ends the display of the object video using the display control unit 105 and ends the display control processing in fig. 6. When it is determined that the stop operation has not been performed, the processing proceeds to step S614.

In step S614, the CPU 101 updates the display time position information. Specifically, the CPU 101 determines whether a reproduction operation (a user operation on the reproduction button 302) or a pause operation (a user operation on the pause button 303) has been performed. This determination can be made by monitoring the control signal output from the operation unit 106. During the period after a reproduction operation is performed and before a pause operation is performed, the CPU 101 sequentially updates the display time position information so that the display time position advances, i.e., the time elapsed from the start of the object video to the display time position increases. Note that the CPU 101 does not update the display time position information during the period after a pause operation is performed and before a reproduction operation is performed.
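The update of step S614 can be sketched as follows, assuming one count per second as in the fig. 5 example (function and variable names are illustrative):

```python
def advance_display_time(display_time_s, playing):
    """Step S614, assuming one count per second (see fig. 5): advance the
    display time position only while the object video is being reproduced;
    leave it unchanged while the video is paused."""
    return display_time_s + 1 if playing else display_time_s

# Counts 0 to 5 of the fig. 5 example: paused, display time stays at 0:00.
t = 0
for _ in range(6):
    t = advance_display_time(t, playing=False)
print(t)  # 0
# Counts 6 to 8: reproducing, display time advances second by second.
for _ in range(3):
    t = advance_display_time(t, playing=True)
print(t)  # 3
```

This matches the fig. 5 table: the display time position holds at 0:00 through count 5 and reaches 0:03 at count 8.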

In step S615, the CPU101 increments the count value by one.

In step S616, the CPU 101 transmits the display time position information updated in step S614 and the display range information L(θ, Φ) stored in the RAM 104 to the video delivery server 10 as the information on the count value updated in step S615.

As described above, according to the present embodiment, based on the amount of movement of the display range by the display position changing operation, the following processing is performed: the amount of movement by which the display range moves in response to a change in the inclination or the like of the display surface is corrected. This enables the position of the display range to be easily changed to a desired position. For example, even after the position of the display range is changed by the display position changing operation, the position of the display range can be returned to the original position by returning the tilt of the display surface to the original tilt.

In the example described above, the amount of change in the elevation angle Φ of the display range caused by a change in the elevation angle Φ1 of the display surface is corrected based on the change amount Φ2 (the amount of change in the elevation angle Φ of the display range) caused by the display position changing operation, but the correction is not limited thereto. For example, the amount of change in the azimuth angle θ of the display range caused by a change in the azimuth angle θ1 of the display surface may also be corrected based on the change amount θ2 (the amount of change in the azimuth angle θ of the display range) caused by the display position changing operation. Only one of the azimuth angle θ and the elevation angle Φ may be corrected as appropriate, or both may be corrected. In addition, the specified direction may be a direction different from the horizontal direction. For example, the specified direction may be a direction in which the azimuth angle θ1 is 90°, or a direction satisfying azimuth angle θ1 = elevation angle Φ1 = 45°.

It is also possible that the CPU101 does not change the position of the display range in response to the display position changing operation in a state where the display surface is oriented in a specified direction (such as a horizontal direction). By so doing, even when the display position changing operation is inadvertently performed, the user can continue viewing in the specified direction.

< second embodiment >

A second embodiment of the present invention will be described below. Note that a detailed description will be given below of points (such as configuration or processing) different from those of the first embodiment, and a description of the same points as those of the first embodiment will be omitted.

In the present embodiment, the direction detection unit 107 also detects the posture (body position) of the user of the electronic apparatus 100 (posture detection). Specifically, the direction detection unit 107 detects the height H at which the electronic apparatus 100 is located using a gyro sensor. Fig. 10 shows an example of the correspondence between the height H at which the electronic apparatus 100 is located and the posture of the user. As shown in fig. 10, when the posture of the user changes between a standing posture, a sitting posture, a lying posture, and the like, the height H at which the electronic apparatus 100 is located also changes. Therefore, the height at which the electronic apparatus 100 is located can be said to be "a parameter corresponding to the posture of the user". When the posture of the user has changed, it is highly likely that the user intends to perform the display position changing operation again. Therefore, in the present embodiment, the CPU 101 determines whether the posture of the user has changed based on the detection result from the direction detection unit 107, and when it is determined that the posture of the user has changed, the CPU 101 resets the movement amount information L2(θ2, Φ2) as appropriate.
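The posture-detection idea can be sketched as follows. The boundary heights and the change threshold are hypothetical values chosen for illustration, not taken from fig. 10:

```python
def estimate_posture(height_m):
    """Illustrative mapping from the height H (in meters) at which the
    device is held to the user's posture.  The boundary values are
    hypothetical, not values stated in the embodiment."""
    if height_m >= 1.2:
        return "standing"
    if height_m >= 0.6:
        return "sitting"
    return "lying"

def posture_changed(h_before, h_after, threshold=0.3):
    """Step S1113: treat the posture as changed only when the change
    amount of H exceeds a threshold (threshold value hypothetical)."""
    return abs(h_after - h_before) > threshold

print(estimate_posture(1.4))      # standing
print(estimate_posture(0.9))      # sitting
print(posture_changed(1.4, 0.9))  # True -> proceed to step S1114
```

When `posture_changed` reports a change, the flow of fig. 11 proceeds to the reset confirmation (steps S1114 onward), mirroring steps S608 onward of the first embodiment.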

Fig. 11 is a flowchart showing an example of the display control processing according to the present embodiment. The CPU 101 loads an application program for a video player stored in the ROM 103 into the RAM 104 and executes it, thereby realizing the respective processing steps in the flowchart of fig. 11.

The processing steps in steps S1101 to S1112 are the same as those in steps S601 to S612 in fig. 6.

In step S1113, the CPU101 determines whether the posture (body position) of the user has changed using the direction detection unit 107. In the present embodiment, the CPU101 determines whether the posture of the user changes by a change amount larger than a threshold. Specifically, the direction detection unit 107 detects the height H at which the electronic apparatus 100 is located and reports the height H to the CPU 101. Then, the CPU101 determines whether the amount of change in the reported height H (such as the amount of change from the previously reported height H or the amount of change per a given period of time) is greater than a threshold. When it is determined that the amount of change in the height H is greater than the threshold, the process proceeds to step S1114, and when it is determined that the amount of change in the height H is equal to or less than the threshold, the process proceeds to step S1119. Note that the threshold may be zero or may be greater than zero. The threshold may or may not be a value predetermined by the manufacturer or the like. For example, the threshold may be a value (variable value) specified by the user.

The processing steps in steps S1114 to S1122 are the same as those in steps S608 to S616 in fig. 6.

As described above, according to the present embodiment, when it is determined that the posture of the user has changed, it is highly likely that the user intends to perform the display position changing operation again, and therefore the movement amount information L2(θ2, Φ2) is reset as appropriate. This can improve convenience.

Note that each of the various control operations assumed to be performed by the CPU101 may also be performed by one hardware item, or alternatively, a plurality of hardware items (e.g., a plurality of processors or circuits) may share processing to control the entire device.

Although the present invention has been described in detail based on the preferred embodiments of the present invention, the present invention is not limited to these specific embodiments and includes various forms within the scope not departing from the spirit of the present invention. In addition, the above-described embodiments only show the embodiments of the present invention, and the embodiments can also be appropriately combined.

< other examples >

The present invention can also be realized by a process in which a program that realizes one or more functions of the above-described embodiments is supplied to a system or apparatus via a network or a storage medium, and the program is read and executed by one or more processors in a computer of the system or apparatus. The invention may also be implemented as a circuit (e.g., an ASIC) that performs one or more functions.

The present invention is not limited to the above-described embodiments, and various changes and modifications may be made without departing from the spirit and scope of the invention. Accordingly, the claims are made to disclose the scope of the invention.

This application claims priority from Japanese patent application 2019-.

[ list of reference numerals ]

100 electronic device

101 CPU

106 operation unit

107 direction detection unit
