Program, information processing method, information processing system, head-mounted display device, and information processing device

Document No.: 1327769 — Published: 2020-07-14

Note: This technology, "Program, information processing method, information processing system, head-mounted display device, and information processing device", was created by 谏山英世 and 仓林修一 on 2018-09-27. Abstract: Provided is a program that makes it possible to appropriately display GUI objects to a user in a virtual reality space. The present invention is a program executed by a processing device for performing rendering processing of a virtual space including a predetermined GUI object to be displayed on a display unit of a head-mounted display device, the program causing the processing device to execute the steps of: determining a viewpoint position of a user in the virtual space based on a position of the head-mounted display device in a real space; determining a movable range of the user in the virtual space; determining a position of the GUI object in a region of the virtual space outside the determined movable range; determining a size of the GUI object in the virtual space by using a perspective characteristic curve such that the GUI object at the determined position has a constant display size on the display unit; and generating rendering data of the virtual space including the GUI object.

1. A program executed by a processing apparatus for performing rendering processing of a virtual space including a virtual object to be displayed on a display unit of a head-mounted display apparatus, the virtual object including a predetermined GUI object, the program causing the processing apparatus to execute:

determining a viewpoint position of a user in the virtual space based on a position of the head mounted display device in a real space obtained by using a sensor;

determining a movable range of a user in the virtual space;

determining a position of the GUI object in a virtual space region outside the determined movable range;

determining a size of the GUI object in the virtual space by using a correspondence between a depth and a display size of a unit length in the virtual space on the display unit such that the GUI object at the determined position has a constant display size on the display unit; and

generating rendering data for the virtual space including the virtual object, the virtual object including the GUI object having the determined size.

2. The program of claim 1, wherein the processing device is included in the head mounted display device.

3. The program according to claim 1, wherein the processing device is included in an information processing device communicably connected with the head-mounted display device.

4. The program according to any one of claims 1 to 3,

wherein the rendering process includes a rendering process for rendering the virtual object corresponding to a position of a controller held by a user in the real space based on the position of the controller, and

wherein the movable range of the user includes a movable range of the virtual object corresponding to a position of the controller.

5. The program according to any one of claims 1 to 4, wherein in the step of determining the position of the GUI object, the position of the GUI object is determined such that the GUI object having the constant display size viewed from a viewpoint position of a user does not overlap with other virtual objects.

6. The program of any one of claims 1 to 5, wherein the GUI object comprises at least one of a selection window, a timing window, a progress bar, and a scroll bar.

7. An information processing method for performing rendering processing of a virtual space including a virtual object to be displayed on a display unit of a head-mounted display apparatus, the virtual object including a predetermined GUI object, the information processing method comprising the steps of:

determining a viewpoint position of a user in the virtual space based on a position of the head mounted display device in a real space obtained by using a sensor;

determining a movable range of a user in the virtual space;

determining a position of the GUI object in a virtual space region outside the determined movable range;

determining a size of the GUI object in the virtual space by using a correspondence between a depth and a display size of a unit length in the virtual space on the display unit such that the GUI object at the determined position has a constant display size on the display unit; and

rendering the virtual space including the virtual object on the display unit, the virtual object including the GUI object having the determined size.

8. An information processing system including a head-mounted display device including a display unit for displaying a virtual space to a user, the information processing system comprising:

a sensor to obtain a position of the head mounted display device in real space;

a processing unit configured to perform rendering processing of the virtual space including a virtual object to be displayed on the display unit, the virtual object including a predetermined GUI object;

a storage unit configured to store data representing a correspondence relationship between a depth and a display size on the display unit of a unit length in the virtual space,

wherein the processing unit is configured to:

determining a viewpoint position of a user in the virtual space based on the position obtained by the sensor;

determining a movable range of a user in the virtual space;

determining a position of the GUI object in a virtual space region outside the determined movable range;

determining a size of the GUI object in the virtual space by using the data stored by the storage unit such that the GUI object at the determined position has a constant display size on the display unit; and

rendering the virtual space including the virtual object on the display unit, the virtual object including the GUI object having the determined size.

9. The information processing system according to claim 8,

wherein the information processing system includes a controller held by a user,

wherein the rendering process includes a rendering process for rendering the virtual object corresponding to the position of the controller based on the position of the controller in real space, and

wherein the movable range of the user includes a movable range of the virtual object corresponding to a position of the controller.

10. The information processing system according to claim 8 or 9, wherein the data stored by the storage unit is prepared in advance by measuring a display size displayed on the display unit for a unit length in the virtual space for each depth.

11. The information processing system according to any one of claims 8 to 10,

wherein the information processing system includes an information processing apparatus communicably connected with the head-mounted display apparatus, and

wherein the processing unit is implemented by the head mounted display device and the information processing device.

12. A head-mounted display device including a display unit for displaying a virtual space to a user, the head-mounted display device comprising:

a processing unit configured to perform rendering processing of the virtual space including a virtual object to be displayed on the display unit, the virtual object including a predetermined GUI object; and

a storage unit configured to store data representing a correspondence relationship between a depth and a display size on the display unit of a unit length in the virtual space,

wherein the processing unit is configured to:

determining a viewpoint position of a user in the virtual space based on a position of the head mounted display device in a real space obtained by using a sensor;

determining a movable range of a user in the virtual space;

determining a position of the GUI object in a virtual space region outside the determined movable range;

determining a size of the GUI object in the virtual space by using the data stored by the storage unit such that the GUI object at the determined position has a constant display size on the display unit; and

rendering the virtual space including the virtual object on the display unit, the virtual object including the GUI object having the determined size.

13. An information processing apparatus for performing rendering processing of a virtual space including a virtual object to be displayed on a display unit of a head-mounted display apparatus, the virtual object including a predetermined GUI object, the information processing apparatus comprising:

a processing unit configured to perform rendering processing of the virtual space to be displayed on the display unit; and

a storage unit configured to store data representing a correspondence relationship between a depth and a display size on the display unit of a unit length in the virtual space,

wherein the processing unit is configured to:

determining a viewpoint position of a user in the virtual space based on a position of the head mounted display device in a real space obtained by using a sensor;

determining a movable range of a user in the virtual space;

determining a position of the GUI object in a virtual space region outside the determined movable range;

determining a size of the GUI object in the virtual space by using the data stored by the storage unit such that the GUI object at the determined position has a constant display size on the display unit; and

transmitting, to the head-mounted display device, data for rendering the virtual space including the virtual object on the display unit, the virtual object including the GUI object having the determined size.

Technical Field

The present invention relates to a program and the like, and particularly to a program and the like for performing display on a display unit of a head-mounted display device.

Background

There is known a Head Mounted Display (HMD) that is worn on the head of a user and displays an image (video) on a display unit, such as a display, disposed in front of the eyes. The HMD enables a Virtual Reality (VR) world to be provided to a user wearing the HMD by providing images and audio in a virtual space to the user.

Recently, applications such as games, which a user plays while viewing a screen displayed on an HMD, have also been developed. Such applications display background and character images that allow the user to experience the virtual space, as well as images of GUI elements (GUI objects) such as selection message windows and the like. For example, patent literature 1 discloses an input method for performing input by using a GUI arranged in a virtual space.

Disclosure of Invention

Problems to be solved by the invention

In a VR application in which an image of a virtual object corresponding to the user's own arm, hand, or the like is displayed on the display unit of the HMD, arranging a GUI object in the virtual space raises the following problems. In such VR applications, a GUI object arranged in the virtual space must be placed at a position relatively close to the user so that the user can recognize the GUI object. However, in the case where the GUI object interferes with a virtual object corresponding to the user's own arm, hand, or the like, the virtual reality space (virtual space) provided by the VR application gives the user an intuitive and strong sense of unnaturalness. In particular, a user experiencing the VR world perceives it as if the user's arms, hands, etc. penetrate or are buried in GUI objects, which greatly impairs the sense of immersion.

On the other hand, there is also a method of displaying the GUI object in a part of the display unit as if it were completely separated from the virtual space; however, displaying the GUI object in this manner likewise greatly impairs the sense of immersion.

The present invention has been made to solve the above-described problems, and a main object thereof is to provide a program or the like that enables a GUI object to be appropriately displayed to a user in a virtual reality space.

Means for solving the problems

In order to achieve the above object, a program according to an aspect of the present invention is a program executed by a processing apparatus for performing rendering processing of a virtual space including a predetermined GUI object to be displayed on a display unit of a head-mounted display apparatus, the program characterized by causing the processing apparatus to execute: determining a viewpoint position of a user in the virtual space based on a position of the head mounted display device in a real space obtained by using a sensor; determining a movable range of a user in the virtual space; determining a position of the GUI object in a virtual space region outside the determined movable range; determining a size of the GUI object in the virtual space by using a correspondence between a depth and a display size of a unit length in the virtual space on the display unit such that the GUI object at the determined position has a constant display size on the display unit; and generating rendering data of the virtual space including the GUI object having the determined size.
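The size-determination step above can be sketched as follows. This is a minimal illustration, not the patented implementation: the function `display_size_per_unit` is a hypothetical stand-in for the claimed correspondence between depth and the display size of a unit length (the perspective characteristic curve of Fig. 8), and all names are illustrative.

```python
def gui_world_size(target_px, depth, display_size_per_unit):
    """Return the size the GUI object must have in virtual-space units
    so that it appears as target_px pixels on the display unit when
    placed at the given depth from the user's viewpoint."""
    px_per_unit = display_size_per_unit(depth)  # on-screen pixels of 1 world unit
    return target_px / px_per_unit

# Under ideal perspective projection the curve is inversely
# proportional to depth, e.g. px_per_unit = k / depth:
k = 1000.0
curve = lambda d: k / d

# A 200-px window needs 0.4 world units at depth 2.0, and 1.0 at depth 5.0,
# so the farther object is drawn larger in the world but equal on screen.
assert abs(gui_world_size(200, 2.0, curve) - 0.4) < 1e-9
assert abs(gui_world_size(200, 5.0, curve) - 1.0) < 1e-9
```

Because the world-space size grows exactly as fast as perspective foreshortening shrinks it, the rendered GUI object keeps a constant apparent size on the display unit regardless of where outside the movable range it is placed.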

Further, in the present invention, it is preferable that the processing device is included in the head-mounted display device.

Further, in the present invention, it is preferable that the processing device is included in an information processing device communicably connected to the head mounted display device.

Further, in the present invention, it is preferable that the rendering process includes a rendering process for rendering the virtual object corresponding to the position of the controller based on the position of the controller held by the user in the real space, and the movable range of the user includes a movable range of the virtual object corresponding to the position of the controller.

Further, in the present invention, it is preferable that in the step of determining the position of the GUI object, the position of the GUI object is determined such that the GUI object having the constant display size viewed from the viewpoint position of the user does not overlap with other virtual objects.
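The non-overlap placement described above can be sketched as a screen-space test: candidate positions (already constrained to lie outside the movable range) are checked against the projected rectangles of the other virtual objects. This simplifies the geometry to axis-aligned 2D rectangles; the function and data names are hypothetical, not from the patent.

```python
def overlaps(a, b):
    """Axis-aligned screen rectangles given as (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def place_gui(candidates, gui_wh, others):
    """Return the first candidate screen position (x, y) at which the
    GUI rectangle of size gui_wh overlaps none of the rectangles of
    the other projected virtual objects; None if all candidates fail."""
    w, h = gui_wh
    for x, y in candidates:
        if not any(overlaps((x, y, w, h), o) for o in others):
            return (x, y)
    return None

others = [(100, 100, 50, 50)]  # one other object on screen
# (90, 90) would intersect it; (200, 100) is clear.
assert place_gui([(90, 90), (200, 100)], (40, 40), others) == (200, 100)
```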

Further, in the present invention, preferably, the GUI object includes at least one of a selection window, a timing window, a progress bar, and a scroll bar.

Further, in order to achieve the above object, an information processing method according to an aspect of the present invention is an information processing method for performing rendering processing of a virtual space including a predetermined GUI object to be displayed on a display unit of a head-mounted display device, the information processing method characterized by comprising the steps of: determining a viewpoint position of a user in the virtual space based on a position of the head mounted display device in a real space obtained by using a sensor; determining a movable range of a user in the virtual space; determining a position of the GUI object in a virtual space region outside the determined movable range; determining a size of the GUI object in the virtual space by using a correspondence between a depth and a display size of a unit length in the virtual space on the display unit such that the GUI object at the determined position has a constant display size on the display unit; and rendering the virtual space including the GUI object having the determined size on the display unit.

Further, in order to achieve the above object, an information processing system according to an aspect of the present invention is an information processing system including a head-mounted display device including a display unit for displaying a virtual space to a user, characterized by comprising: a sensor to obtain a position of the head mounted display device in real space; a processing unit configured to perform rendering processing of the virtual space including a predetermined GUI object to be displayed on the display unit; a storage unit configured to store data representing a correspondence between a depth and a display size on the display unit of a unit length in the virtual space, wherein the processing unit is configured to: determining a viewpoint position of a user in the virtual space based on the position obtained by the sensor; determining a movable range of a user in the virtual space; determining a position of the GUI object in a virtual space region outside the determined movable range; determining a size of the GUI object in the virtual space by using the data stored by the storage unit such that the GUI object at the determined position has a constant display size on the display unit; and rendering the virtual space including the GUI object having the determined size on the display unit.

Further, in the present invention, it is preferable that the information processing system includes a controller held by a user, the rendering process includes a rendering process for rendering the virtual object corresponding to a position of the controller based on the position of the controller in a real space, and the movable range of the user includes a movable range of the virtual object corresponding to the position of the controller.

Further, in the present invention, it is preferable that the data stored by the storage unit is prepared in advance by measuring a display size displayed on the display unit for a unit length in the virtual space for each depth.
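Such pre-measured data might be held as a small table of (depth, measured on-screen size of a unit length) samples and interpolated at query time. The sample values below are invented for illustration; the table shape and function name are assumptions, not the patent's data format.

```python
# Hypothetical calibration table, prepared in advance by measuring the
# on-screen size (px) of a 1-unit segment at each sampled depth.
CURVE = [(1.0, 800.0), (2.0, 400.0), (4.0, 200.0), (8.0, 100.0)]

def px_per_unit(depth):
    """Linearly interpolate the measured perspective curve,
    clamping to the nearest sample outside the measured range."""
    pts = sorted(CURVE)
    if depth <= pts[0][0]:
        return pts[0][1]
    if depth >= pts[-1][0]:
        return pts[-1][1]
    for (d0, s0), (d1, s1) in zip(pts, pts[1:]):
        if d0 <= depth <= d1:
            t = (depth - d0) / (d1 - d0)
            return s0 + t * (s1 - s0)

assert px_per_unit(2.0) == 400.0   # exact sample
assert px_per_unit(3.0) == 300.0   # midway between 400 and 200
```

Measuring the curve rather than deriving it analytically lets the stored data absorb any lens distortion or projection quirks of the particular display unit.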

Further, in the present invention, it is preferable that the information processing system includes an information processing apparatus communicably connected with the head mounted display apparatus, and the processing unit is realized by the head mounted display apparatus and the information processing apparatus.

Further, in order to achieve the above object, a head-mounted display device according to an aspect of the present invention is a head-mounted display device including a display unit for displaying a virtual space to a user, characterized by comprising: a processing unit configured to perform rendering processing of the virtual space including a predetermined GUI object to be displayed on the display unit; and a storage unit configured to store data representing a correspondence between a depth and a display size on the display unit of a unit length in the virtual space, wherein the processing unit is configured to: determining a viewpoint position of a user in the virtual space based on a position of the head mounted display device in a real space obtained by using a sensor; determining a movable range of a user in the virtual space; determining a position of the GUI object in a virtual space region outside the determined movable range; determining a size of the GUI object in the virtual space by using the data stored by the storage unit such that the GUI object at the determined position has a constant display size on the display unit; and rendering the virtual space including the GUI object having the determined size on the display unit.

Further, in order to achieve the above object, an information processing apparatus according to an aspect of the present invention is an information processing apparatus for performing rendering processing of a virtual space including a predetermined GUI object to be displayed on a display unit of a head-mounted display apparatus, characterized by comprising: a processing unit configured to perform rendering processing of the virtual space to be displayed on the display unit; and a storage unit configured to store data representing a correspondence between a depth and a display size on the display unit of a unit length in the virtual space, wherein the processing unit is configured to: determining a viewpoint position of a user in the virtual space based on a position of the head mounted display device in a real space obtained by using a sensor; determining a movable range of a user in the virtual space; determining a position of the GUI object in a virtual space region outside the determined movable range; determining a size of the GUI object in the virtual space by using the data stored by the storage unit such that the GUI object at the determined position has a constant display size on the display unit; and transmitting, to the head-mounted display apparatus, data for rendering the virtual space including the GUI object having the determined size on the display unit.

ADVANTAGEOUS EFFECTS OF INVENTION

The present invention makes it possible to appropriately display a GUI object to a user in a virtual reality space.

Drawings

Fig. 1 is a general configuration diagram of an information processing system according to an embodiment of the present invention.

Fig. 2 is a block diagram showing a hardware configuration of an information processing apparatus according to an embodiment of the present invention.

Fig. 3 is a block diagram showing a hardware structure of an HMD according to an embodiment of the present invention.

Fig. 4 is a schematic external view of a controller according to an embodiment of the present invention.

Fig. 5 is a functional block diagram of an information processing system according to an embodiment of the present invention.

Fig. 6 is a diagram illustrating an example of GUI objects displayed on a display unit according to an embodiment of the present invention.

Fig. 7 is a diagram illustrating an example of a virtual body displayed on a display unit according to an embodiment of the present invention.

Fig. 8 is a diagram showing an example of a perspective characteristic curve (perspective property profile) according to an embodiment of the present invention.

Fig. 9 is a flowchart illustrating information processing using the information processing system according to an embodiment of the present invention.

Detailed Description

An information processing system according to an embodiment of the present invention will be described below with reference to the drawings. In the VR system (virtual reality space) provided to the user by the information processing system according to the present embodiment, an image of a virtual hand corresponding to the user's own hand is displayed on the display unit of the HMD in a case where the hand of the user wearing the HMD falls within the visual field area of the user.

Fig. 1 is a general configuration diagram of an information processing system 1 according to an embodiment of the present invention. The information processing system 1 includes an information processing apparatus 10, a head mounted display apparatus (HMD) 20 worn on the head of a user, a controller 30 held by the user, an image pickup apparatus 40, and an output apparatus 50. The information processing system 1 enables a user wearing the HMD 20 to experience virtual reality.

The image pickup device 40 includes a camera equipped with an infrared sensor, and transmits images picked up at a predetermined frame rate to the information processing device 10. Preferably, the image pickup device 40 is attached to the top of the output device 50. As will be described later, in order to enable the user to experience virtual reality, the image pickup device 40 is configured to pick up an image of the HMD 20 worn by the user and transmit the picked-up image to the information processing device 10. In one example, the image pickup device 40 picks up images of infrared rays emitted by the HMD 20 worn by the user at a predetermined frame rate, and sequentially transmits the picked-up images to the information processing device 10, which sequentially obtains the images. In another example, the camera of the image pickup device 40 is equipped with an image sensor used in a general camera, such as a CCD sensor or a CMOS sensor, instead of or in addition to the infrared sensor.

The information processing apparatus 10 executes VR applications, which are various applications such as games that provide a virtual reality space to a user wearing the HMD 20. The information processing apparatus 10 is electrically connected to the HMD 20, the controller 30, the image pickup apparatus 40, and the output apparatus 50. The information processing apparatus 10 may be connected to each of these apparatuses by known wired communication using a cable or the like, or by known wireless communication such as a wireless LAN.

Fig. 2 is a block diagram showing a hardware configuration of the information processing apparatus 10 according to the embodiment of the present invention. The information processing apparatus 10 includes a processor 11, a storage apparatus 12, an input/output interface (input/output IF) 13, and a communication interface (communication IF) 14. These constituent devices are connected via a bus 15. Note that interfaces are interposed between the bus 15 and the respective constituent devices as necessary. Note that the information processing apparatus 10 may be constituted by a plurality of electronic apparatuses.

The processor 11 is a processing device such as a CPU for controlling the overall operation of the information processing device 10. Alternatively, an electronic circuit such as an MPU or the like may be used as the processor 11. The processor 11 executes various processes by loading programs and data stored in the storage device 12 and executing the programs. Further, the processor 11 includes a GPU that performs rendering processing, and the information processing apparatus 10 generates a virtual space image to be presented to a user wearing the HMD20 and transmits the virtual space image to the HMD 20.

The input/output IF 13 is an interface for connecting input/output devices such as the image pickup device 40, a display, a keyboard, and a mouse. The communication IF 14 is a communication interface for connecting to other computers or devices by wired communication or wireless communication. For example, the information processing apparatus 10 is connected to the HMD20 via the communication IF 14. Although these IFs are described above for convenience of explanation, the kinds of interfaces used by the respective devices are not limited to those described above, and different interfaces may be provided for the respective devices.

The storage device 12 includes a primary storage device and a secondary storage device. The main storage is a semiconductor memory such as a RAM. The RAM is a volatile storage medium that allows high-speed reading and writing of information, and is used as a storage area and a work area when the processor 11 processes information. The main storage may include a ROM as a read-only non-volatile storage medium. In this case, the ROM stores programs such as firmware and the like. The auxiliary storage device stores various programs and data used by the processor 11 when executing the respective programs. The auxiliary storage device is, for example, a hard disk device; however, the secondary storage may be any type of non-volatile storage device or non-volatile memory capable of storing information, and may also be removable. For example, the secondary storage device stores an Operating System (OS), middleware, application programs, various data that can be referred to when executing these programs, and the like. The storage device 12 stores the VR applications.

Fig. 3 is a block diagram showing a hardware configuration of the HMD 20 according to an embodiment of the present invention. The HMD 20 includes a processor 21, a storage device 22, a display device (display unit) 23, light emitting markers (LED lamps) 24, and a communication interface (communication IF) 25. These constituent devices are connected via a bus 26.

The processor 21 is a processing device such as a CPU for controlling the overall operation of the HMD 20. Alternatively, an electronic circuit such as an MPU or the like may be used as the processor 21. The processor 21 performs various processes by loading programs and data stored in the storage device 22 and executing the programs. The HMD20 displays (renders) the virtual space image received from the information processing apparatus 10 on the display apparatus 23. Alternatively, the processor 21 may include a GPU for performing rendering processing. In this case, the HMD20 may generate a virtual space image on behalf of the information processing apparatus 10.

The storage device 22 includes a primary storage device and a secondary storage device. The main storage is a semiconductor memory such as a RAM. The RAM is a volatile storage medium that allows high-speed reading and writing of information, and is used as a storage area and a work area when the processor 21 processes information. The main storage may include a ROM as a read-only non-volatile storage medium. In this case, the ROM stores programs such as firmware and the like. The auxiliary storage device stores various programs and data used by the processor 21 when executing the respective programs. The auxiliary storage device is, for example, a hard disk device; however, the secondary storage may be any type of non-volatile storage device or non-volatile memory capable of storing information, and may also be removable. For example, the secondary storage device stores an Operating System (OS), middleware, application programs, various data that can be referred to when executing these programs, and the like.

The display device 23 is a non-transmissive display such as a liquid crystal display or an organic EL display, and displays virtual space images to a user wearing the HMD 20. Thus, for example, the HMD 20 may be a general video see-through HMD. The display device 23 is a display device in a Head Mounted Display (HMD), and will hereinafter be referred to as the display unit 23 for convenience of explanation.

The LED lamps 24 are composed of infrared LEDs, and a plurality of LED lamps 24 are attached to the housing of the HMD 20 and the like. The LED lamps 24 are attached so as to enable tracking of the HMD 20 as it moves with the head movement of the user. The LED lamps 24 may be light emitting elements that emit light having a specific color, as long as tracking of the HMD 20 can be achieved.

The communication IF 25 is a communication interface for connecting to other computers or devices by wired communication or wireless communication. For example, the HMD20 is connected to the information processing apparatus 10 via the communication IF 25.

In one variation, the HMD 20 is equipped with a posture sensor (not shown). In this case, the posture sensor includes a gyro sensor and an acceleration sensor. The posture sensor detects sensor information including the position and rotation angle of the HMD 20, the direction in which the display unit 23 is facing, and the like, and transmits the sensor information to the information processing apparatus 10, which sequentially obtains the sensor information. In another modification, the HMD 20 is equipped with a camera (not shown) constituted by a CCD sensor, a CMOS sensor, or the like. In this case, the camera sequentially transmits the captured images to the information processing apparatus 10, which sequentially obtains the images.

Fig. 4 is a schematic external view of a controller 30 according to an embodiment of the present invention. The controller 30 is a stick-shaped controller including a grip 31, operation buttons 32, and an LED lamp 33. The grip 31 is the part that is gripped when a user operates the controller 30. The operation buttons 32 are one or more buttons of the kind provided on a general game machine. The LED lamp 33 is composed of LEDs that emit light having a predetermined color, and is attached to an end of the controller.

In the present embodiment, the controller 30 includes two controllers corresponding to the two hands, respectively, and the user operates by moving his or her arm or hand while holding the grip portion 31 or by operating the operation buttons 32. In one example, the information processing apparatus 10 determines the position and movement of the controller 30 by tracking the position of the LED lamp 33 in the captured images obtained from the image pickup apparatus 40. In one example, the controller 30 transmits information on the button pressed by the user to the information processing apparatus 10.

The controller 30 may be equipped with a motion sensor including a gyro sensor and an acceleration sensor. In this case, the controller 30 transmits sensor information including the position, the rotation angle, and the like of the controller 30 detected by the motion sensor to the information processing apparatus 10.

The output device 50 is a general display that outputs images and audio. In one example, the output device 50 is a liquid crystal display, a display that utilizes organic EL, or a plasma display. In one example, the output device 50 is configured to output the same images as the virtual space images presented to the user wearing the HMD20.

Fig. 5 is a functional block diagram of the information processing system 1 according to the embodiment of the present invention. The information processing system 1 includes a position determination unit 61, a storage unit 62, and a processing unit 63. These functions are realized by the processor 11 of the information processing apparatus 10 or the processor 21 of the HMD20 executing a program. Thus, at least one of the information processing apparatus 10 and the HMD20 has the various functions shown in fig. 5. Since the various functions are realized by a program, a part of one component (function) may be included in another component. Note that, depending on the circumstances, part or all of the respective functions may be realized by hardware, by configuring an electronic circuit or the like for realizing the functions.

The position determination unit 61 determines the position and tilt of the HMD20 from the obtained sensor information by using the image pickup device 40 as a sensor for position determination. The position determination unit 61 determines the position and tilt of the HMD20 by identifying the positions of the plurality of LED lamps 24 attached to the HMD20 in captured images of the HMD20 worn by the user, which are sequentially obtained from the image pickup device 40.

In one example, the position of the HMD20 determined by the position determination unit 61 is determined based on the position of the HMD20 relative to the imaging device 40. In one example, the position determination unit 61 further determines the direction in which the display unit 23 of the HMD20 is facing by identifying the positions of the plurality of LED lamps 24 attached to the HMD20.

As described above, the position determination unit 61 may perform tracking processing with high accuracy by using a structure that tracks the position and tilt of the HMD20 through a combination of the image pickup device 40 for identifying the positions of the LED lamps 24 and the attitude sensor for detecting the position, rotation angle, and the like of the HMD20. Note, however, that the above-described tracking processing is an example, and other known techniques may also be used. In another example, the position determination unit 61 determines the position and tilt of the HMD20 by using a combination of the image pickup device 40 for identifying the positions of the LED lamps 24 and a sensor that irradiates an object with reference light and measures the reflected light.

In one modification, the position determination unit 61 determines the position and tilt of the HMD20 from the obtained sensor information by using only a camera included in the HMD20 as a sensor for position determination. The position determination unit 61 determines the position and tilt of the HMD20 by recognizing the position of the HMD20 with respect to an imaging target object in the captured images sequentially obtained from the camera of the HMD20.

Further, the position determination unit 61 determines the position of the controller 30 from the obtained sensor information by using the image pickup device 40 as a sensor for position determination. The position determination unit 61 determines the position and tilt of the controller 30 by identifying the position of the LED lamp 33 attached to the controller 30 in captured images of the controller 30 held by the user, which are sequentially obtained from the image pickup device 40. In the case where the captured images include the LED lamps 33 attached to the two controllers 30, respectively, the position determination unit 61 determines the respective positions and tilts of the two controllers.

In one example, a motion sensor included in the controller 30 is used as a sensor for position determination. In this case, the position determination unit 61 determines the position and tilt of the controller 30 by further using the sensor information detected by the motion sensor. As described above, the position determination unit 61 may perform tracking processing with high accuracy by using a structure that tracks the position and tilt of the controller 30 through a combination of the imaging device 40 for identifying the position of the LED lamp 33 and the motion sensor for detecting the position, rotation angle, and the like of the controller 30.

The position determination unit 61 continues to determine the position of the HMD20 or the controller 30 by using sensor information sequentially obtained from the sensors used for position determination. In one example, the position determination unit 61 continues to periodically determine the position and tilt of the HMD20 at predetermined intervals from the captured images sequentially obtained from the imaging device 40. In another example, the position determination unit 61 continues to periodically determine the position and tilt of the controller 30 at predetermined intervals from the captured images sequentially obtained from the image pickup device 40.

The storage unit 62 stores programs, data, and the like in the storage device 12 or the storage device 22. The storage unit 62 stores information about a virtual space and information about virtual objects arranged in the virtual space.

The processing unit 63 executes various processes for executing a VR application such as a game or the like. The processing unit 63 includes a user environment determination unit 64, a virtual object control unit 65, a user operation processing unit 66, and a display image generation unit 67.

Initially, according to the content defined by the VR application, the user environment determination unit 64 constructs a three-dimensional virtual space, and determines an initial position of the user in the virtual space based on the position of the HMD20 in the real space obtained by using the sensor for position determination. The initial position of the user is a user position in an initial state such as a state at startup of the VR system. The user position includes a viewpoint position of the user, and the user environment determination unit 64 sets a virtual camera at the viewpoint position of the user. Here, since the image viewed from the virtual camera is the image viewed by the user, it will be understood that the region photographed by the virtual camera (i.e., the visual field region of the virtual camera) corresponds to the visual field region of the user. The user environment determination unit 64 determines the initial position of the user and the visual field area of the user by using the position and tilt of the HMD20 determined by the position determination unit 61 to determine the position of the user and the direction in which the user is facing.

At this time, the user environment determination unit 64 determines the movable range of the user, which is a range in which the user is allowed to move in the virtual space, by using the initial position of the user. For example, in the case where the VR application is a performance experience VR game, the user environment determination unit 64 constructs a three-dimensional virtual space within the performance venue, determines the seat position and the visual field area of the user, and determines the movable range of the user within the performance venue.

In one example, the position determination unit 61 determines the position of the HMD20 based on the position of the HMD20 relative to the imaging device 40, and the user environment determination unit 64 determines the initial position of the user in the virtual space from the position of the HMD20 thus determined. In one example, the user environment determination unit 64 determines the movable range of the user based on a region in which the user can move in the real space. For example, the user environment determination unit 64 sets the boundary in the depth direction of the movable range of the user at a position closer than the position in the virtual space corresponding to the position of the image pickup device 40 existing in the real space. As described above, in the present embodiment, the user environment determination unit 64 determines the movable range of the user at the time of startup of the VR system (such as at the time of constructing a virtual space) from the position of the HMD20 in the real space and the environment of the virtual space constructed based on the content of the VR application.
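The determination of the movable range from the user's position and the position of the image pickup device can be sketched as follows. This is a hedged illustration only: the function and class names, the lateral bounds, and the margin value are assumptions for the example, not part of the embodiment.

```python
# Hypothetical sketch of movable-range determination at VR system startup.
# The depth boundary is set at a position closer than the virtual-space
# position corresponding to the image pickup device 40 in the real space.
from dataclasses import dataclass


@dataclass
class MovableRange:
    x_min: float  # lateral bounds in the virtual space (illustrative)
    x_max: float
    z_min: float  # depth bounds in the virtual space
    z_max: float


def determine_movable_range(user_pos, camera_pos_virtual, margin=0.3):
    """Return the user's movable range; positions are (x, y, z) tuples.

    The z (depth) limit is placed `margin` closer than the virtual-space
    position corresponding to the image pickup device 40.
    """
    z_limit = camera_pos_virtual[2] - margin  # closer than the camera
    return MovableRange(
        x_min=user_pos[0] - 1.0,  # lateral extent is an assumed value
        x_max=user_pos[0] + 1.0,
        z_min=user_pos[2],
        z_max=z_limit,
    )
```

The margin and lateral extent would in practice be derived from the region in which the user can move in the real space.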

Further, after determining the initial position, the user environment determination unit 64 determines the visual field area of the virtual camera by determining the position of the user and the direction in which the user is facing from the position and tilt of the HMD20 determined by the position determination unit 61. Here, in the present embodiment, the length in the virtual space constructed by the user environment determination unit 64 is substantially the same as the length in the real space. When the user wearing the HMD20 moves 0.1m in one direction in the real space, the user environment determination unit 64 moves the virtual camera in the virtual space by 0.1m in the direction in the virtual space corresponding to the direction. In one example, the user environment determination unit 64 proceeds to determine the viewpoint position of the user and the visual field area of the virtual camera by using the position and tilt of the HMD20 periodically determined by the position determination unit 61 at predetermined intervals. Note, however, that the length in the virtual space constructed by the user environment determination unit 64 does not have to be the same as the length in the real space, as long as the length corresponds to the length in the real space.
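The correspondence between real-space and virtual-space movement described above can be sketched as follows; the function name and the optional scale parameter are illustrative assumptions (the embodiment uses a 1:1 scale, but the length need only correspond to the real-space length).

```python
# Illustrative sketch: moving the virtual camera by the distance the HMD20
# moved in the real space, scaled by a correspondence factor (1.0 here).
def update_virtual_camera(camera_pos, hmd_delta, scale=1.0):
    """camera_pos: (x, y, z) in the virtual space; hmd_delta: (dx, dy, dz)
    movement of the HMD20 in the real space, in metres."""
    return tuple(c + scale * d for c, d in zip(camera_pos, hmd_delta))
```

For example, a 0.1 m movement of the HMD in one real-space direction moves the virtual camera 0.1 units in the corresponding virtual-space direction.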

The virtual object control unit 65 configures a virtual object of a set size at a set position according to contents defined by the VR application. For example, in the case where the VR application is a performance experience VR game, the virtual object control unit 65 configures a virtual object corresponding to an object (such as a wall, a seat, and a fence of a building) constituting a performance venue, and a virtual object corresponding to a character (such as a pop idol and other spectators). In a case where any virtual object is arranged within the visual field area of the virtual camera, a display image generation unit 67, which will be described later, generates a virtual space image including the virtual object as viewed from the virtual camera.

In the present embodiment, the virtual object includes a predetermined GUI object such as a selection window for allowing the user to perform a selection operation by using the controller 30 or a message window for prompting the user for settings, or the like. For convenience of explanation, the present embodiment will be explained in the case where only one GUI object is arranged in the virtual space; however, a plurality of GUI objects may be configured. In one example, the GUI object includes at least one of a selection window, a timing window, a progress bar, and a scroll bar.

Fig. 6 shows a virtual space image generated by the display image generation unit 67, which includes an example of a GUI object according to an embodiment of the present invention, specifically, an example of a GUI object in a performance experience VR game. As shown in fig. 6, the virtual object control unit 65 arranges the GUI object at a predetermined position in the virtual space in the same manner as other virtual objects. This is because, if the processing unit 63 were configured to display the GUI object on the display unit 23 of the HMD20 as if the GUI object were completely separated from the virtual world, the sense of immersion felt by the user experiencing the VR world would be impaired. Note, however, that the information processing system 1 configures the GUI object so that the user recognizes the GUI object in the virtual space; user operations are performed without touching the GUI object in the virtual space.

The GUI object shown in fig. 6 includes a question to the user, "Start the performance again?", and a "Yes" window and a "No" window for accepting a reply from the user. The user operation processing unit 66, which will be described later, accepts a selection operation made by the user using the controller 30 to determine whether to start the performance again. As described above, the GUI object in the present embodiment is a virtual object that the user must recognize in order to facilitate the user's selection operation by using the controller 30.

The user operation processing unit 66 configures a rod-like virtual penlight corresponding to the controller 30 at a position in the virtual space corresponding to the position of the controller 30 in the real space, by using the positions and tilts of the two controllers 30 determined by the position determination unit 61. At this time, the user operation processing unit 66 estimates the positions of the arm and hand holding the penlight from the position and tilt of the virtual penlight, and also configures a virtual hand and a virtual arm holding the virtual penlight. Thus, the user operation processing unit 66 configures a virtual body including the virtual penlight, the virtual hand, and the virtual arm at positions in the virtual space corresponding to the controller 30, the user's hand, and the user's arm in the real space. Note, however, that the virtual body may be an object including at least one of a virtual penlight, a virtual hand, and a virtual arm. Further, since the above-described function of the user operation processing unit 66 is to configure a virtual object according to a user operation, the function may be partially or entirely performed by the virtual object control unit 65. Note that the rod-shaped virtual penlight is an example of a rod-shaped virtual object corresponding to the stick-shaped controller 30, and the shape is not limited to this shape. Further, in one example, the user operation processing unit 66 continues to update the position of the virtual body by using the position and tilt of the controller 30 periodically determined at predetermined intervals by the position determination unit 61.
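The configuration of the virtual body from the controller position can be sketched as follows. This is a hedged illustration: the function name and, in particular, the offsets used to estimate the hand and arm positions are invented for the example and are not part of the embodiment.

```python
# Hypothetical sketch: placing the virtual body (penlight, hand, arm) at
# virtual-space positions corresponding to the controller 30, the user's
# hand, and the user's arm. Offsets below are illustrative assumptions.
def configure_virtual_body(controller_pos, controller_tilt):
    """controller_pos: (x, y, z); controller_tilt: rotation angles."""
    penlight = {"pos": controller_pos, "tilt": controller_tilt}
    hand = {"pos": (controller_pos[0],
                    controller_pos[1] - 0.05,   # just below the grip
                    controller_pos[2])}
    arm = {"pos": (controller_pos[0],
                   controller_pos[1] - 0.30,    # estimated from the hand
                   controller_pos[2] + 0.10)}
    return {"penlight": penlight, "hand": hand, "arm": arm}
```

In a real implementation the hand and arm poses would be estimated from the controller's tilt as well as its position, e.g. by inverse kinematics.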

Further, the user operation processing unit 66 performs predetermined processing in the VR application according to the content defined by the VR application and information on the operation button 32 pressed by the user. At this time, for example, the controller 30 transmits operation information, which is information on the operation button 32 pressed by the user, to the information processing apparatus 10, and the user operation processing unit 66 performs predetermined processing by using the operation information received by the information processing apparatus 10. As another example, in the case where the GUI object shown in fig. 6 is displayed, the user operation processing unit 66 may determine whether to start the performance again, based on the operation information received from the controller 30.

The display image generation unit 67 generates the virtual space images viewed from the virtual camera that are to be presented to the user and displayed on the display unit 23 of the HMD20. The display image generation unit 67 changes the display size of each virtual object according to the distance of the configured virtual object from the virtual camera. For example, in the case where the display image generation unit 67 generates images of virtual objects having the same size, a closer virtual object is displayed larger and a farther virtual object is displayed smaller.

In the case where any virtual object exists in the field of view region of the virtual camera, the display image generation unit 67 generates a virtual space image including the virtual object existing in the field of view region. Also, in the case where a virtual body exists in the visual field area of the virtual camera, the display image generation unit 67 generates a virtual space image including the virtual body existing in the visual field area.

Fig. 7 illustrates an example of the virtual body displayed by the display unit 23 according to an embodiment of the present invention, specifically, an example of the virtual body displayed by the display unit 23 in a performance experience VR game. As will be understood from the figure, in the present embodiment, the display image generation unit 67 generates a virtual space image including the virtual body existing in the visual field area of the virtual camera, the virtual body including a virtual arm 71, a virtual hand 72, and a virtual penlight 73. With this structure, in the present embodiment, the information processing system 1 can make the user recognize the virtual hand and arm, which the user perceives via the display unit 23, as the user's own hand and arm, and the virtual penlight as a penlight held by the user himself or herself. This makes it possible for the user to experience the sense of immersion that is an attractive feature of VR.

Here, the movable range of the user determined by the user environment determination unit 64 includes the movable range of the virtual object corresponding to the controller 30 in the real space. In the present embodiment, the movable range of the user determined by the user environment determination unit 64 includes the region in the virtual space where the virtual body including the virtual penlight, the virtual hand, and the virtual arm can be configured.

In the present embodiment, the virtual object control unit 65 determines the position at which the GUI object is to be configured at a predetermined position having a depth larger than the movable range of the user. With this structure, in the present embodiment, the information processing system 1 prevents interference between the virtual body and the GUI object. This makes it possible to prevent the user's arm from appearing to penetrate or be buried in the GUI object, for example.

However, in a case where the virtual object control unit 65 configures the GUI object at a position having a depth larger than the movable range of the user, the display size of the GUI object in the virtual space image generated by the display image generation unit 67 becomes small. That is, the display size of the GUI object displayed on the display unit 23 by the HMD20 becomes small. The GUI object is a selection window that the user must recognize for allowing the user to perform a selection operation by using the controller 30, a message window for prompting the user, or the like. Therefore, it is not preferable that the display size is small.

In the present embodiment, as will be explained below, the virtual object control unit 65 determines the size of the GUI object in the virtual space such that the display size of the GUI object on the display unit 23 at the determined position is constant. For example, the size in the virtual space is a vertical length and a horizontal length in the virtual space.

The storage unit 62 stores a perspective characteristic curve, which is data indicating the correspondence relationship between a depth representing a distance from the virtual camera and the display size on the display unit 23 per unit length in the virtual space. In this embodiment, the perspective characteristic curve is generated before the VR application becomes available to general users (e.g., during development of the VR application). Since the perspective characteristic curve generally varies depending on the VR application, it is necessary to generate a perspective characteristic curve for each VR application. The perspective characteristic curve is prepared in advance by displaying a unit length serving as the basis of perspective on the display unit 23, changing the depth of the unit length, and measuring the relationship between the depth and the display size of the unit length on the display unit 23. The unit length indicates a length used as a basis in the virtual space; the unit length is not limited to any particular length, and may be any length capable of expressing the relationship between the depth and the display size of the unit length on the display unit 23.
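The measured curve can be held as a table of (depth, display length) pairs, interpolated between measurements. The sketch below is illustrative: the sample values are invented, and a real curve must be measured per VR application as described above.

```python
# Hedged sketch: a perspective characteristic curve stored as measured
# (depth, pixels displayed per unit length) pairs, with linear
# interpolation between measurement points. Sample values are invented.
import bisect

CURVE = [(1.0, 400.0), (2.0, 200.0), (4.0, 100.0), (8.0, 50.0)]


def display_length_per_unit(depth):
    """Return the display length (pixels) of one unit length at `depth`."""
    depths = [p[0] for p in CURVE]
    i = bisect.bisect_left(depths, depth)
    if i == 0:
        return CURVE[0][1]       # clamp before the first measurement
    if i == len(CURVE):
        return CURVE[-1][1]      # clamp past the last measurement
    (d0, v0), (d1, v1) = CURVE[i - 1], CURVE[i]
    t = (depth - d0) / (d1 - d0)
    return v0 + t * (v1 - v0)    # linear interpolation
```

As the text notes, the curve need not be a closed-form function; a table of measurements with interpolation is one concrete realization.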

FIG. 8 is an example of a perspective characteristic curve according to an embodiment of the present invention. The perspective characteristic curve shown in fig. 8 is a function representing the relationship between the length (unit length) of each side of a grid serving as the basis of perspective in the virtual space and the depth from the virtual camera. The horizontal axis of fig. 8 represents depth, and indicates a deeper position, i.e., a position farther from the virtual camera, as the value becomes larger. The vertical axis of fig. 8 represents the length d displayed on the display unit 23 per unit length at a given depth, and indicates that the displayed length is longer, for example, that the number of pixels on the display unit 23 is larger, as the value becomes larger.

Now, the relationship between the movable range of the user in the depth direction and the position at which the GUI object is configured will be described by using fig. 8. D1 denotes the position farthest from the virtual camera in the movable range of the user determined by the user environment determination unit 64, that is, the depth limit position in the movable range of the user, which is the reach limit of the virtual body of the user. D2 denotes the depth position at which the GUI object is displayed, which is a position set by the virtual object control unit 65 and having a depth greater than D1.

By changing the size of a virtual object in accordance with the depth position of the virtual object using the above-described perspective characteristic curve, the display size on the display unit 23 can be made constant at any depth position. In a preferred example, where an edge constituting a virtual object at a given depth is to be displayed on the display unit 23 with a length denoted by length, and d denotes the length displayed on the display unit 23 per unit length at that depth, the length vrscale of the edge in the virtual space required for this purpose is represented by the following Equation 1, where grid_length represents the unit length:

vrscale = (length / d) × grid_length … (Equation 1)

In the present embodiment, the virtual object control unit 65 determines the size of the GUI object in the virtual space by using the perspective characteristic curve, for example, according to Equation 1, so that the display size of the GUI object at the determined position will be constant on the display unit 23. For example, the display size of the GUI object on the display unit 23 is represented by the number of pixels on the screen of the display unit 23, and a constant display size refers to a certain fixed size that the user is able to recognize.
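The size determination can be sketched as follows, under the assumption that the perspective characteristic curve supplies d, the display length in pixels of one unit length at the GUI object's depth. The function name and sample numbers are illustrative, not from the embodiment.

```python
# Sketch of Equation 1 applied to a GUI object: given the desired constant
# display size in pixels and the display length per unit length at the
# GUI object's depth (read from the perspective characteristic curve),
# compute the size the object must have in the virtual space.
GRID_LENGTH = 1.0  # the unit length in the virtual space (assumed value)


def vr_scale(target_px, d_px_per_unit, grid_length=GRID_LENGTH):
    """Equation 1: vrscale = (length / d) * grid_length."""
    return target_px / d_px_per_unit * grid_length
```

For example, a window edge that should always appear 300 px long, placed at a depth where one unit length is drawn as 100 px, must be made 3 unit lengths long in the virtual space.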

As a result, in the case where the GUI object exists in the visual field range of the virtual camera, the display image generation unit 67 generates a virtual space image including the GUI object having a constant display size. With this structure, in the present embodiment, it is possible to display a GUI object having a constant size on the display unit 23 with the GUI object arranged at a position not within the reach of the user's hand. This makes it possible to display GUI objects having high visibility and not interfering with the virtual body of the user. Note that the virtual object control unit 65 may also determine the position of the GUI object so that the GUI object having a constant display size does not overlap with any other virtual object in the visual field area of the virtual camera.

The perspective characteristic curve need not be a function, and may be data representing the correspondence between a plurality of depth distances and the display size on the display unit 23 of the unit length in the virtual space at the respective depth distances.

Next, information processing using the information processing system 1 according to the embodiment of the present invention will be described by using a flowchart shown in fig. 9. The information processing shown in fig. 9 is realized by causing the information processing apparatus 10 to execute a program, by causing the HMD20 to execute a program, or by causing the information processing apparatus 10 to execute a program and also causing the HMD20 to execute a program. Although it is assumed in the following description that the information processing apparatus 10 mainly performs information processing, the HMD20 may perform a part or all of the information processing.

The information processing apparatus 10 starts information processing, for example, when a VR system provided by the information processing system 1 is started. First, in step 901, the information processing apparatus 10 determines the viewpoint position of the user in the virtual space by determining the position of the user and the direction in which the user is facing from the position and tilt of the HMD20 determined by using the sensor. The information processing apparatus 10 configures a virtual camera at the determined viewpoint position of the user.

Then, in step 902, the information processing apparatus 10 determines a movable range of the user as a range in which the user is allowed to move in the virtual space, based on the environment of the virtual space constructed and the viewpoint position of the user.

Then, in step 903, the information processing apparatus 10 determines the position at which the GUI object is to be configured at a predetermined position having a depth larger than the movable range of the user in the virtual space region that falls outside the movable range of the user.

Then, in step 904, the information processing apparatus 10 determines the size of the GUI object in the virtual space so that the display size of the GUI object on the display unit 23 at the configured position will be constant as viewed from the determined viewpoint position of the user. At this time, the information processing apparatus 10 determines the size of the GUI object in the virtual space by using data representing the correspondence between the depth and the display size on the display unit 23 of the unit length in the virtual space, such as the perspective characteristic curve.

Then, in step 905, the information processing apparatus 10 generates rendering data of a virtual space including the GUI object having the determined size. Specifically, the information processing apparatus 10 generates a virtual space image to be displayed on the display unit 23 of the HMD20 as viewed from the virtual camera. The information processing apparatus 10 transmits the generated virtual space image to the HMD20, and the HMD20 displays the virtual space image on the display unit 23. Alternatively, however, the information processing apparatus 10 may generate data such as a drawing command for generating a virtual space image to be displayed on the display unit 23 by the HMD20 as viewed from a virtual camera.

Then, unless the VR system provided by the information processing system 1 is terminated (step 906), in step 907, the viewpoint position of the user and the visual field area of the virtual camera are updated based on the position and tilt of the HMD20 that are periodically determined at predetermined intervals by using the sensors, and the process returns to step 904. At this time, the information processing apparatus 10 updates the position at which the virtual camera is arranged to the viewpoint position of the user. In step 904, the information processing apparatus 10 determines the size of the GUI object in the virtual space so that the display size of the GUI object on the display unit 23 at the configured position will be constant from the updated viewpoint position of the user.

This information processing is information processing by the information processing apparatus 10 in the case where a GUI object exists in the visual field area of the virtual camera. In one preferred example, the information processing apparatus 10 determines whether or not a GUI object is present in the visual field area of the virtual camera before step 905, and performs step 905 only if a GUI object is present in the visual field area of the virtual camera. In another example, the information processing apparatus 10 determines whether or not a GUI object is present in the visual field area of the virtual camera before step 904, and executes step 904 and step 905 only in the case where the GUI object is present in the visual field area of the virtual camera.
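Steps 901 to 907, including the preferred field-of-view check, can be restated as the following loop. All names here are placeholders standing in for the corresponding units of the information processing system 1; this is a structural sketch, not the embodiment's implementation.

```python
# Illustrative restatement of steps 901-907 of fig. 9 as a loop. The
# `system` object is a placeholder for the sensor polling, rendering,
# and field-of-view test performed by the information processing system 1.
def run_vr_loop(system):
    viewpoint = system.determine_viewpoint()              # step 901
    movable = system.determine_movable_range(viewpoint)   # step 902
    gui_pos = system.place_gui_beyond(movable)            # step 903
    while not system.terminated():                        # step 906
        if system.in_field_of_view(gui_pos):              # preferred check
            system.size_gui_constant(gui_pos)             # step 904
            system.render(gui_pos)                        # step 905
        viewpoint = system.update_viewpoint()             # step 907
```

In the preferred example described above, steps 904 and 905 are executed only while the GUI object lies within the virtual camera's visual field area, which the `in_field_of_view` guard expresses.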

Next, the operational advantage of the information processing system 1 according to the embodiment of the present invention will be described. In the present embodiment, at the time of startup of the VR system, the user environment determination unit 64 determines the viewpoint position of the user and the movable range of the user based on the position of the HMD20 in the real space. After the movable range of the user is thus determined according to the environment, the virtual object control unit 65 configures a GUI object (GUI element) at a predetermined position having a depth greater than the movable range. Then, by using the perspective characteristic curve, the virtual object control unit 65 determines the size of the GUI object in the virtual space so that the display size of the GUI object on the display unit 23 at the determined position will be constant.

Further, in the present embodiment, the display image generation unit 67 generates a virtual space image including the virtual body existing in the visual field area of the virtual camera, the virtual body including a virtual arm 71, a virtual hand 72, and a virtual penlight 73.

With this structure, in the present embodiment, it is possible for the user to recognize the virtual hand and arm, which the user perceives via the display unit 23, as the user's own hand and arm, and the virtual penlight as a penlight held by the user himself or herself. This makes it possible for the user to experience the sense of immersion that is an attractive feature of VR.

Further, with this structure, in the present embodiment, it is possible to display a GUI object having a constant size on the display unit 23 with the GUI object being arranged at a position that is not within the reach of the user's hand. This makes it possible to display GUI objects having high visibility and not interfering with the virtual body of the user. Further, with this structure, in the present embodiment, it is possible to provide high visibility in various situations while providing the user with a sense of immersion, as compared with a structure in which GUI objects are displayed on the display unit 23 of the HMD20 in a state completely separated from the VR space.

Further, the information processing system 1, the information processing method, the program for executing the method, and the like according to the present embodiment configured as described above are generally available for VR applications such as VR games that display UI elements (e.g., windows) independent of the background constituting a VR space.

Further, in the present embodiment, the virtual object control unit 65 may determine the position of the GUI object such that the GUI object, which has a constant display size as viewed from the viewpoint position of the user, does not overlap any other virtual object. This makes it possible to display the GUI object with high visibility.
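One simple way to realize such overlap avoidance is to test candidate positions in screen space. The sketch below is an assumed approach, not the method claimed in the specification: it represents the GUI object and the other virtual objects by their projected axis-aligned screen rectangles and picks the first candidate position whose rectangle touches no obstacle. All names are illustrative.

```python
def rects_overlap(a, b):
    """Axis-aligned screen-space rectangle overlap test.
    Each rectangle is (x_min, y_min, x_max, y_max)."""
    return not (a[2] <= b[0] or b[2] <= a[0] or a[3] <= b[1] or b[3] <= a[1])

def choose_gui_position(candidates, gui_half_w, gui_half_h, obstacle_rects):
    """Return the first candidate center (x, y) whose GUI rectangle
    overlaps no obstacle rectangle; None if every candidate is blocked."""
    for cx, cy in candidates:
        rect = (cx - gui_half_w, cy - gui_half_h, cx + gui_half_w, cy + gui_half_h)
        if not any(rects_overlap(rect, o) for o in obstacle_rects):
            return (cx, cy)
    return None
```

Since the display size of the GUI object is held constant, its screen rectangle has fixed half-extents, and only the candidate centers need to be searched.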

A program according to an embodiment of the present invention realizes the operation of the information processing system 1 by causing the processor 11 or the processor 21 to execute the program. Thus, the operational advantage of the program is the same as that of the information processing system 1.

The present invention, in another embodiment thereof, may also be a computer-readable storage medium storing a program for realizing the functions of the above-described embodiments of the present invention or the information processing shown in the flowcharts. The present invention, in still another embodiment thereof, may also be a server which can provide a computer with a program for realizing the functions of the above-described embodiments of the present invention or the information processing shown in the flowchart. In still another embodiment thereof, the present invention may also be a virtual machine that realizes the functions of the above-described embodiments of the present invention or the information processing shown in the flowchart.

In the processes or operations described above, any process or operation may be freely modified as long as no contradiction arises in those processes or operations (e.g., a contradiction in which a step uses data that is not yet available at that step). Further, the above-described embodiments are examples for explaining the present invention, and the present invention is not limited to these embodiments. The present invention can be embodied in various forms without departing from the gist of the invention. For example, the outer shape of each of the respective devices is not limited to the illustrated outer shape.

Description of the reference numerals

1 information processing system

10 information processing apparatus

11 processor

12 storage device

13 input/output IF

14 communication IF

20 head-mounted display device

21 processor

22 storage device

23 display unit

24 LED lamp

25 communication IF

30 controller

31 grip part

32 operating button

33 LED lamp

61 position determination unit

62 storage unit

63 processing unit

64 user environment determination unit

65 virtual object control unit

66 user operation processing unit

67 display image generating unit

71 virtual arm

72 virtual hand

73 virtual pen torch
