Image display system, image display method, and computer-readable medium

Document No.: 1480936 — Publication date: 2020-02-28

Description: This technology, "Image display system, image display method, and computer-readable medium", was designed and created by 罗宾·鲍彻 (Robin Boucher) on 2019-08-19. The main content is as follows: The invention provides an image display system, an image display method, and a computer-readable medium that display an image causing little discomfort to the user. An image display system that displays a virtual space image on a display includes a processing device configured to perform: acquiring detection data from a plurality of position detection sensors; calculating a distance between a first position detection sensor and a second position detection sensor of the plurality of position detection sensors based on the detection data; setting, based on the distance, a boundary for determining a motion of the user; determining, in accordance with the motion of the user, whether or not a positional relationship between the first position detection sensor and the boundary satisfies a condition; and in response to the positional relationship satisfying the condition, performing, in the virtual space, an activity corresponding to the motion of the user.

1. An image display system for displaying a virtual space image on a display, the image display system comprising a processing device,

wherein the processing device is configured to perform the following processes:

acquiring detection data from a plurality of position detection sensors;

calculating a distance between a first position detection sensor and a second position detection sensor of the plurality of position detection sensors based on the detection data;

setting, based on the distance, a boundary for determining a motion of the user;

determining, in accordance with the motion of the user, whether or not a positional relationship between the first position detection sensor and the boundary satisfies a condition; and

in response to the positional relationship satisfying the condition, performing, in the virtual space, an activity corresponding to the motion of the user.

2. The image display system of claim 1,

wherein the processing device is configured to

determine that the positional relationship satisfies the condition in response to the first position detection sensor crossing or overlapping the boundary.

3. The image display system according to claim 1 or 2,

wherein the first position detection sensor is worn on the user.

4. The image display system according to any one of claims 1 to 3,

wherein the user's motion corresponding to the activity includes movement in a first direction,

the boundary includes a first boundary,

the first boundary is set apart in the first direction with respect to the first position detection sensor, and

the processing device is configured to

determine that the motion of the user has started in response to the first position detection sensor crossing or overlapping the first boundary.

5. The image display system according to any one of claims 1 to 4,

wherein the user's motion corresponding to the activity includes movement in a second direction,

the boundary further includes a second boundary,

the second boundary is set apart in the second direction with respect to the first position detection sensor, and

the processing device is configured to determine that the motion of the user is completed in response to the first position detection sensor crossing or overlapping the second boundary.

6. The image display system of claim 5,

wherein the processing device is configured to

acquire a speed of the first position detection sensor when the first position detection sensor crosses or overlaps the boundary, based on detection data from the position detection sensors or detection data from a speed detection sensor worn by the user, and reflect the acquired speed in the activity.

7. The image display system according to any one of claims 1 to 6,

wherein the processing device is configured to

move an object in the virtual space in response to the positional relationship satisfying the condition.

8. The image display system according to claim 7,

wherein the processing device is configured to

move an object in the virtual space in a direction corresponding to a movement direction of the first position detection sensor, in response to the first position detection sensor crossing or overlapping a boundary set apart in the movement direction.

9. The image display system according to any one of claims 1 to 8,

wherein the activity differs according to the positional relationship between the first position detection sensor and the boundary.

10. The image display system according to any one of claims 1 to 8,

wherein the first position detection sensor is worn on a foot of the user.

11. An image display method for displaying a virtual space image on a display using a processing device,

wherein the image display method comprises the following steps:

acquiring detection data from a plurality of position detection sensors;

calculating a distance between a first position detection sensor and a second position detection sensor of the plurality of position detection sensors based on the detection data;

setting, based on the distance, a boundary for determining a motion of the user;

determining, in accordance with the motion of the user, whether or not a positional relationship between the first position detection sensor and the boundary satisfies a condition; and

in response to the positional relationship satisfying the condition, performing, in the virtual space, an activity corresponding to the motion of the user.

12. A non-transitory computer-readable medium storing a program, wherein

the program causes a computer to execute:

acquiring detection data from a plurality of position detection sensors;

calculating a distance between a first position detection sensor and a second position detection sensor of the plurality of position detection sensors based on the detection data;

setting, based on the distance, a boundary for determining a motion of the user;

determining, in accordance with the motion of the user, whether or not a positional relationship between the first position detection sensor and the boundary satisfies a condition; and

in response to the positional relationship satisfying the condition, performing, in the virtual space, an activity corresponding to the motion of the user.

Technical Field

The invention relates to an image display system, an image display method, and a computer-readable medium.

Background

The following types of games are known: the movement of a user in the real space is detected using various sensors and the like, and reflected on an object in a virtual space. An input device used in a fishing game is disclosed in Japanese Patent Laid-Open No. 10-214155. The input device has a built-in sensor capable of detecting acceleration and inclination, and is connected to a game processing device. When the input device is swung like an actual fishing rod being cast while its trigger button is on, an image of the fishing line being cast onto the water surface is displayed on a display connected to the game processing device.

In the above system, when the input device is moved only slightly while the trigger button is on, the image of the fishing line being cast onto the water surface may be displayed contrary to the user's intention. When there is such a deviation between the motion the user performs in the real world and the image in which it is reflected, the user feels a sense of discomfort.

Disclosure of Invention

An object of the present invention is to display an image that causes little discomfort to the user when the user's movement is reflected as movement in a virtual space.

This summary is provided to show selected portions of concepts described further below in a simplified form. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.

In one general aspect, an image display system that displays a virtual space image on a display is provided. The image display system includes a processing device configured to perform: acquiring detection data from a plurality of position detection sensors; calculating a distance between a first position detection sensor and a second position detection sensor of the plurality of position detection sensors based on the detection data; setting, based on the distance, a boundary for determining a motion of the user; determining, in accordance with the motion of the user, whether or not a positional relationship between the first position detection sensor and the boundary satisfies a condition; and in response to the positional relationship satisfying the condition, performing, in the virtual space, an activity corresponding to the motion of the user.

In another general aspect, an image display method for displaying a virtual space image on a display using a processing device is provided. The image display method includes: acquiring detection data from a plurality of position detection sensors; calculating a distance between a first position detection sensor and a second position detection sensor of the plurality of position detection sensors based on the detection data; setting, based on the distance, a boundary for determining a motion of the user; determining, in accordance with the motion of the user, whether or not a positional relationship between the first position detection sensor and the boundary satisfies a condition; and in response to the positional relationship satisfying the condition, performing, in the virtual space, an activity corresponding to the motion of the user.

In another general aspect, a non-transitory computer-readable medium storing a program is provided. The program causes a computer to execute: acquiring detection data from a plurality of position detection sensors; calculating a distance between a first position detection sensor and a second position detection sensor of the plurality of position detection sensors based on the detection data; setting, based on the distance, a boundary for determining a motion of the user; determining, in accordance with the motion of the user, whether or not a positional relationship between the first position detection sensor and the boundary satisfies a condition; and in response to the positional relationship satisfying the condition, performing, in the virtual space, an activity corresponding to the motion of the user.

Other features and aspects will become apparent from the following detailed description, the accompanying drawings, and the claims.

Drawings

Fig. 1 is a diagram schematically showing a first embodiment of an image display system.

Fig. 2 is a block diagram of the image display system of fig. 1.

Fig. 3 is a diagram for explaining the boundary setting method in the first embodiment.

Fig. 4 is a diagram illustrating a boundary utilization scheme according to the first embodiment.

Fig. 5 is a diagram illustrating a boundary utilization scheme according to the first embodiment.

Fig. 6A is a diagram showing a graph of data used in calculating the object movement distance in the first embodiment, and Fig. 6B is a diagram showing a map of data used in calculating the object movement distance in the first embodiment.

Fig. 7 is a diagram of an example of a game screen in the first embodiment.

Fig. 8 is a flowchart illustrating a boundary setting procedure in the first embodiment.

Fig. 9 is a flowchart for explaining the progress of the game in the first embodiment.

Figs. 10A to 10C are diagrams showing, for the boundary utilization scheme in the second embodiment, rearward movement of the object, movement of the object to the rear right, and movement of the object to the rear left, respectively.

Figs. 11A to 11C are diagrams showing, for the boundary utilization scheme in the second embodiment of the image display system, forward movement of the object, forward-left movement of the object, and forward-right movement of the object, respectively.

Fig. 12 is a diagram showing an increase in the object in relation to the boundary utilization scheme in the second embodiment.

Fig. 13 is a flowchart for explaining the progress of the game in the second embodiment.

Figs. 14A to 14C are diagrams showing, for the boundary utilization scheme in a modified example of the image display system, the setting of a boundary, the determination of the start of a motion, and the determination of the completion of a motion, respectively.

Figs. 15A to 15C are diagrams showing, for the boundary utilization scheme in a modified example of the image display system, the setting of a boundary, the determination of the type of activity when the ankle position is lower than the boundary, and the determination of the type of activity when the ankle position is higher than the boundary, respectively.

Like reference numerals refer to like elements throughout the drawings and detailed description. The drawings may not be to scale and the relative sizes, proportions and depictions of elements in the drawings may be exaggerated for clarity, illustration and convenience.

Detailed Description

This disclosure provides a general understanding of the described methods, devices, and/or systems. Variations and equivalents of the described methods, devices, and/or systems will be apparent to those skilled in the art. The order of the operations described is illustrative and may be changed in ways apparent to those skilled in the art, except for operations that must occur in a certain order. Descriptions of functions and configurations well known to those skilled in the art may be omitted.

The illustrated embodiments may be modified in various ways and are not limited to the forms described. Rather, the described embodiments are provided so that this disclosure will be thorough and complete and will fully convey its scope to those skilled in the art.

[ first embodiment ]

An embodiment of the image display system will be described.

As shown in FIG. 1, the image display system includes a housing 11, a tracking system 20, a head-mounted display 30 (HMD), and a game processing device 50. The tracking system 20 has one or more tracking assistance devices 21 and a plurality of tracking sensors 22. The game processing device 50 corresponds to the detection data acquisition unit, the distance calculation unit, the setting unit, the determination unit, and the activity execution unit. The tracking sensor 22 corresponds to a position detection sensor.

The housing 11 simulates a swing. The housing 11 includes a support body 12 and a seat portion 15 suspended from the support body 12 by hanging portions 13. The seat portion 15 includes a seat surface and a back surface located on the opposite side of the seat surface. A tracking sensor 22 is provided on the back surface of the seat portion 15. In addition, the swing range of the seat portion 15 is limited to a predetermined range so that the seat portion 15 does not swing greatly. For example, a member for restricting the swing amplitude may be provided on the hanging portions 13 or the seat portion 15, or the weight of the hanging portions 13 or the seat portion 15 may be increased.

A user 200 wearing an HMD30 on his or her head is seated on the seat portion 15. A tracking sensor 22 is worn near each ankle of the user 200. The tracking sensor 22 may instead be worn at a toe, a heel, or the like. The tracking sensor 22 may be worn directly on the body of the user 200, or may be attached to the user 200 by a fitting member such as a band.

The game processing device 50 is installed in the space where the housing 11 is disposed or at another location, and is connected to the HMD30 by a communication cable or wirelessly, enabling bidirectional data transmission and reception. The game processing device 50 displays the virtual space image on the HMD30 and runs the game.

The tracking system 20 uses the tracking sensors 22 together with devices disposed in the tracked space. A tracking assistance device 21 is disposed in the space where the housing 11 is disposed. The tracking assistance device 21 and the HMD30 cooperate to transmit detection data for detecting the position of the HMD30 to the game processing device 50. In addition, the tracking assistance device 21 and the tracking sensors 22 cooperate to transmit detection data for detecting the positions of the tracking sensors 22 to the game processing device 50.

The game processing device 50 detects the movement of the feet of the user 200 seated on the seat portion 15 based on the data transmitted from at least one of the tracking assistance device 21 and the tracking sensors 22. The game processing device 50 detects, for example, a motion in which the user moves both feet simultaneously or moves each foot separately, and determines whether or not the detected motion satisfies a predetermined condition. When it determines that the detected motion satisfies the predetermined condition, the game processing device 50 executes an activity of a virtual character (avatar) corresponding to the user in the virtual space. In the present embodiment, the description takes as an example a game in which the user 200 performs a motion of kicking a foot forward in the real world, and the virtual character corresponding to the user 200 executes a "shoe flying" activity (tossing a shoe from a swing), sending a shoe object worn on its foot flying forward in the virtual space.

The HMD30, the tracking sensor 22, and the tracking support device 21 will be described in detail with reference to fig. 2.

The HMD30 is a wearable computer, and may be a non-transmissive display including a casing that covers both eyes, or a transmissive display. In a non-transmissive head-mounted display, an image for the left eye and an image for the right eye may be displayed on one or more displays, or a common image for both eyes may be displayed. In a transmissive display, an image captured by a camera provided on the head-mounted display or elsewhere may be displayed, or the display may be formed of a half mirror or a transparent material so that the real world can be viewed through it. The HMD30 may be a display fixed to a casing, a frame, or the like, or may be a multifunction phone terminal such as a smartphone detachably fixed to a predetermined casing. The HMD30 displays images such as virtual reality (VR), augmented reality (AR) in which content in a virtual space is provided while the real world remains visible, and mixed reality (MR) combining the two. In this embodiment, the HMD30 will be described as a non-transmissive display.

The HMD30 includes an information processing unit 31, a measurement device 32, and a display 33. The information processing unit 31 executes processing for advancing the game. The information processing unit 31 need not perform all of its processing in software. For example, the information processing unit 31 may include a dedicated hardware circuit (e.g., an application-specific integrated circuit: ASIC) that performs at least part of its processing in hardware. That is, the information processing unit 31 may be configured as circuitry including one or more processors operating according to a computer program (software), one or more dedicated hardware circuits executing at least part of the various processes, or a combination thereof.

The processor includes an arithmetic processing device such as a CPU, an MPU, or a GPU, and a storage medium such as a RAM or a ROM. The information processing unit 31 also includes a storage medium (memory) such as an HDD or an SSD. At least one of these storage media stores program code or instructions configured to cause the CPU to execute processing. The storage media, that is, computer-readable media, include all available media that can be accessed by a general-purpose or special-purpose computer. The measurement device 32 is a device that detects at least the direction of the HMD30, and is, for example, an inertial measurement unit (IMU). For example, the inertial measurement unit includes a gyro sensor, an acceleration sensor, and the like, and detects at least one of the rotation angle, angular velocity, acceleration, and the like around the X axis, the Y axis, and the Z axis. The display 33 is an organic EL display, a liquid crystal display, or the like.

The tracking sensor 22 is a sensor for detecting its own position and direction. For example, the tracking sensor 22 has an inertial measurement unit, as does the HMD30.

The tracking sensor 22 outputs a signal corresponding to its own position and direction in cooperation with the plurality of tracking assistance devices 21, so that the position of the tracking sensor 22 in the space can be detected. As an example of the tracking assistance device 21, a multi-axis laser oscillator can be used. In this case, the tracking assistance devices 21 are disposed diagonally above the space in which the housing 11 is disposed. Each tracking assistance device 21 emits pulsed laser light. The tracking sensor 22 includes a sensor that detects the laser light, and detects its own position and direction while synchronizing with the tracking assistance devices 21 by means of a synchronization pulse. For the tracking sensor 22 and the tracking assistance device 21, for example, the Vive Tracker (registered trademark) and Vive Base Station supplied by HTC Corporation (registered trademark) can be used.

The game processing device 50 includes a position specifying unit 51, an image processing unit 52, a progress management unit 53, and a data storage unit 60. The position specifying unit 51 specifies the direction of the HMD30 from data acquired from at least one of the HMD30 and the tracking assistance device 21. The position specifying unit 51 may specify the position of the HMD30 in addition to its direction. The position specifying unit 51 also specifies the direction and position of each tracking sensor 22 based on data acquired from at least one of the tracking sensor 22 and the tracking assistance device 21.

The image processing unit 52 displays the virtual space image on the display 33 based on the orientation and/or position of the HMD30 specified by the position specifying unit 51. Further, the image processing unit 52 displays a part of the body of the virtual character corresponding to the user as a part of the virtual space.

The progress management unit 53 manages the progress of the game and performs the processes other than those performed by the position specifying unit 51 and the image processing unit 52. The progress management unit 53 starts the game when a game start condition is satisfied. The game start condition is, for example, an instruction input by the user or a game administrator. Further, the progress management unit 53 ends the game when a game end condition is satisfied. The game end condition can be changed according to the game; for example, the elapsed time from the start of the game reaching a time limit, the user's score reaching a threshold value, or a task being completed.

The data storage unit 60 is a storage medium such as an HDD (Hard Disk Drive) or an SSD (Solid State Drive). The data storage unit 60 stores a game processing program and other programs. The position specifying unit 51, the image processing unit 52, and the progress management unit 53 execute computer-readable instructions described in these programs to advance the game.

The data storage unit 60 stores setting information 61, game space data 62, and game progress data 63. The setting information 61 includes boundary information set according to the position of the tracking sensor 22.

The game space data 62 is data for rendering the space in which the game is played. For example, it includes data for rendering the background of the game battlefield and data for rendering objects within the virtual space. Such objects include objects such as enemies moving in the game battlefield and objects displayed only when a predetermined condition is satisfied. The game space data 62 also includes position information of such objects in the virtual space.

The game progress data 63 is data used for managing the progress of the game, and is updated as the game progresses. The game progress data 63 contains user attribute information associated with the user. The user attribute information includes at least one of setting information of the user, parameters of the user, a game medium (or game content), and attribute information associated with the game medium. The setting information of the user includes the user's gender, age or age group, location, e-mail address, and the like. The user parameters are, for example, parameters of the virtual character corresponding to the user: game results such as a score and the number of wins and losses, and battle parameters such as a level, a status, an ability, a skill, a talent, attack power, defense power, HP, life, physical strength, recovery power, spells, and a job. The game medium is electronic data (content) used in the game, which the user can acquire, own, use, manage, exchange, combine, enhance, sell, discard, give away, and so on within the game. It includes, for example, any medium such as cards, items, virtual currency, tickets, characters, and avatars. The attribute information of the game medium is parameters of the game medium and the like, similar to the user attribute information. In addition, instead of the game medium itself, the parameters of the user and the attribute information associated with the game medium may be acquired, owned, used, managed, exchanged, combined, enhanced, sold, discarded, given away, and so on. Further, the game progress data 63 may include information on enemies. The information on an enemy includes, for example, information indicating how easily the enemy is defeated, such as the enemy's physical strength value.
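Purely as an illustrative sketch of how this user attribute information might be organized (every field name below is an assumption, not part of the described system):

```python
from dataclasses import dataclass, field

@dataclass
class GameMedium:
    """Electronic content used in the game (card, item, ticket, avatar, ...)."""
    kind: str                                        # e.g. "card", "item"
    attributes: dict = field(default_factory=dict)   # per-medium parameters

@dataclass
class UserAttributes:
    """User-related portion of the game progress data 63."""
    settings: dict = field(default_factory=dict)     # gender, age group, location, e-mail
    parameters: dict = field(default_factory=dict)   # score, level, HP, attack power, ...
    media: list = field(default_factory=list)        # owned GameMedium instances
```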

An input operation unit 65 is connected to the game processing device 50 so as to be able to transmit and receive data. The input operation unit 65 is, for example, a mouse, a keyboard, or a pointing device such as a touch panel device. The input operation unit 65 is connected to the game processing device 50 by a communication cable or wirelessly, and is used by the user or a game administrator.

Referring to fig. 3, a method of setting the boundaries used to determine an activity of the user 200 is described. The user 200 is seated on the seat portion 15 with a tracking sensor 22 worn near each ankle. Only the right foot is shown in fig. 3. At this time, the sole of the foot (shoe) of the user 200 need not be in contact with the floor 500.

The game processing device 50 acquires detection data from the tracking sensors 22 to specify their positions. In this way, the position of the tracking sensor 22 attached to the seat portion 15, the position of the tracking sensor 22 worn on the left foot, and the position of the tracking sensor 22 worn on the right foot are obtained. These are respectively defined as a seating position P1 (X1, Y1, Z1), a left foot position P2 (X2, Y2, Z2) (not shown in fig. 3), and a right foot position P3 (X3, Y3, Z3).

The direction that is parallel to the floor 500 and in which the face of the user 200 seated on the seat portion 15 is oriented is referred to as the "X direction (front-rear direction)". The direction parallel to the floor 500 and orthogonal to the X direction is referred to as the "Z direction (left-right direction)". The normal direction of the floor 500 is defined as the "Y direction (vertical direction)". The game processing device 50 is not limited to these directions and can set an arbitrary coordinate system.

The game processing device 50 calculates the distance Ly in the Y direction between the seating position P1 and the left foot position P2, or the distance Ly in the Y direction between the seating position P1 and the right foot position P3; both distances Ly may also be calculated. In this way, the length of the lower leg of the user 200 from the knee to the ankle is measured. Alternatively, for example, the distance between the Y coordinate of the seating position P1 and the midpoint of the Y coordinates of the left foot position P2 and the right foot position P3 may be calculated.

The game processing device 50 calculates a corrected distance Ly2 by multiplying the distance Ly by a predetermined ratio R2, where the ratio R2 is larger than 0 and smaller than 1. Next, the game processing device 50 sets the rear boundary 102 (first boundary) at a position separated rearward, in the reverse X direction, by the corrected distance Ly2 from the left foot position P2 or the right foot position P3 serving as the reference position. When both the distance Ly between the seating position P1 and the left foot position P2 and the distance Ly between the seating position P1 and the right foot position P3 are calculated, a rear boundary 102 for the left foot and a rear boundary 102 for the right foot may be set using the respective distances Ly.

The rear boundary 102 may be a boundary on the X coordinate. For example, the rear boundary 102 may be a Y-Z plane having the X direction as its normal direction and parallel to the Y direction and the Z direction. The ratio R2 is determined in consideration of, for example, the typical angle formed when the user 200 seated on the seat portion 15, as a preparatory motion for the activity of sending the shoe object flying in the virtual space, rotates the lower leg about the knee and swings the foot backward (in the reverse X direction). Since the ratio R2 is substantially constant, the rear boundary 102 is set farther from the reference position for a user 200 with long legs than for a user 200 with short legs, for example.

In addition, the game processing device 50 calculates a corrected distance Ly1 by multiplying the distance Ly by a predetermined ratio R1, where the ratio R1 is larger than 0 and smaller than 1. The ratio R1 may be different from or the same as the ratio R2. The ratios R1 and R2 are basically constant, but can be adjusted to suit users of various body shapes. Next, the game processing device 50 sets the front boundary 101 (second boundary) at a position separated forward, in the X direction, by the corrected distance Ly1 from the left foot position P2 or the right foot position P3 serving as the reference position. When both the distance Ly in the Y direction between the seating position P1 and the left foot position P2 and the distance Ly in the Y direction between the seating position P1 and the right foot position P3 are calculated, a front boundary 101 for the left foot and a front boundary 101 for the right foot may be set using the respective distances Ly.

The front boundary 101 may be a boundary on the X coordinate. For example, the front boundary 101 may be a Y-Z plane having the X direction as its normal direction and parallel to the Y direction and the Z direction. The ratio R1 is determined in consideration of, for example, the typical angle formed when the user 200 seated on the seat portion 15 rotates the lower leg about the knee and kicks the foot forward (in the X direction) as the motion of the activity for sending the object flying in the virtual space. Thus, the front boundary 101 is set farther from the reference position for a user 200 with long legs than for a user 200 with short legs, for example.
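The boundary-setting computation above can be summarized in a short sketch. This is a minimal illustration and not the patented implementation; the function name, tuple layout, and the concrete values of the ratios R1 and R2 are assumptions.

```python
# Minimal sketch of the boundary setup (all names and ratio values are
# illustrative assumptions; the boundaries are Y-Z planes, so an X
# coordinate is enough to represent each one).

def set_boundaries(seat_pos, foot_pos, r1=0.6, r2=0.4):
    """seat_pos: (x, y, z) of the sensor on the seat portion 15 (P1).
    foot_pos: (x, y, z) of the sensor on a foot (P2 or P3), the reference.
    r1, r2:   predetermined ratios with 0 < R < 1 (assumed values)."""
    # The Y-direction distance Ly approximates the lower-leg length
    # from the knee to the ankle.
    ly = abs(seat_pos[1] - foot_pos[1])
    ly1 = ly * r1  # corrected distance for the front boundary 101
    ly2 = ly * r2  # corrected distance for the rear boundary 102
    front_x = foot_pos[0] + ly1  # front boundary 101, ahead in the X direction
    rear_x = foot_pos[0] - ly2   # rear boundary 102, behind in the reverse X direction
    return front_x, rear_x
```

Because Ly scales with leg length, a long-legged user automatically gets boundaries farther from the reference position, as described above.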

The front boundary 101 and the rear boundary 102 set in this way reflect the length of the lower leg, that is, the user's body shape. Suppose instead that whether to execute an activity were determined from, for example, a change in the movement direction of the user's ankle position: the activity might then be executed at a timing the user does not intend, such as when the user moves a foot only slightly, and the user would feel a sense of discomfort. Likewise, if the boundaries used to determine execution of an activity were the same for all users, the activity would be easy to trigger for some users and hard to trigger for others. In contrast, by setting the boundaries according to the user's body shape as in the present embodiment, activities can be executed as the user intends, reducing the user's sense of discomfort.

The progress of the game using the front boundary 101 and the rear boundary 102 will be described with reference to figs. 4 and 5. As described above, the swing of the seat portion 15 is restricted, so the seat portion 15 sways back and forth only slightly in accordance with the movement of the user 200. Only the right foot is shown in figs. 4 and 5.

As shown in fig. 4, when starting the shoe-flying game, the user 200 rotates the lower leg about the knee so that the lower leg is positioned below the seat portion 15, in preparation for kicking the foot forward. The game processing device 50 specifies the left foot position P2 (not shown) and the right foot position P3, and determines whether at least one of them crosses the rear boundary 102.

When it is determined that at least one of the left foot position P2 and the right foot position P3 crosses the rear boundary 102, the game processing device 50 determines that the shoe-flying motion has started.

After determining that the shoe-flying motion has started, the game processing device 50 determines whether at least one of the left foot position P2 and the right foot position P3 crosses the front boundary 101 while continuing to track them.

As shown in fig. 5, when it is determined that at least one of the left foot position P2 (not shown) and the right foot position P3 has crossed the front boundary 101, the game processing device 50 determines that the shoe-flying motion is completed. The game processing device 50 then makes the virtual character's shoe object fly in the kicking direction in front of the user 200 in the virtual space and fall along a predetermined trajectory. Specifically, the game processing device 50 makes the object fall in the direction in which the user kicked, based on the detection data of the tracking sensors 22. For example, when the kicking direction is the front left as viewed from the user 200, the game processing device 50 makes the shoe object fall toward the front left.

Further, the game processing device 50 acquires the velocity of the tracking sensor 22 at the time the shoe-flying motion is completed. The velocity is at least one of a speed (movement distance per unit time), an angular velocity, an acceleration, and the like. The game processing device 50 moves the shoe object based on data relating the velocity to the movement distance (flying distance) of the object. The velocity may be calculated from the distance between the pair of tracking sensors 22, or may be detected by a velocity detection sensor built into the tracking sensor 22 or an external velocity detection sensor. When the velocity is calculated from the distance, the velocity V is obtained by dividing the distance Ly between the tracking sensor 22 attached to the seat portion 15 and the left foot position P2 or the right foot position P3 by the time T over which the lower leg rotates (V = Ly/T). The time T may be a predetermined time (constant), or may be the time measured from when the foot crosses the rear boundary 102 to when it crosses the front boundary 101.

Fig. 6A is a graph showing an example of data relating the movement speed of the foot to the movement distance of the shoe object. In this graph, speed is associated with distance such that the distance increases in stages as the speed increases. As shown in the map of fig. 6B, speed and distance may instead be associated so that the distance increases continuously as the speed increases. Instead of, or in addition to, such data, the game processing device 50 may calculate the movement distance using a predetermined arithmetic expression.
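The two forms of speed-to-distance data, together with the V = Ly/T estimate above, might look like the following sketch; the thresholds, distances, and gain are placeholder values and are not taken from the figures.

```python
import bisect

# Hypothetical stepwise table in the spirit of fig. 6A: the flying
# distance (m) increases in stages at each speed threshold (m/s).
SPEED_THRESHOLDS = [0.5, 1.0, 1.5, 2.0]
STEP_DISTANCES = [1.0, 2.0, 4.0, 6.0, 8.0]  # one more entry than thresholds

def distance_stepwise(speed):
    """Fig. 6A style: distance grows in stages as speed grows."""
    return STEP_DISTANCES[bisect.bisect_right(SPEED_THRESHOLDS, speed)]

def distance_continuous(speed, gain=4.0, cap=10.0):
    """Fig. 6B style: distance grows continuously with speed."""
    return min(speed * gain, cap)

def speed_from_rotation(ly, t):
    """V = Ly / T: the seat-to-foot distance Ly divided by the time T
    over which the lower leg rotates (T measured or a constant)."""
    return ly / t
```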

Fig. 7 schematically shows an example of the screen 110 visually confirmed by the user 200. An image of the virtual space 111 is displayed on the screen 110. A shoe object 112 and a part of a virtual character 113 are displayed in the virtual space 111.

After one shoe-flying motion is completed and the shoe object 112 has moved, the game processing device 50 displays the shoe object 112 on the virtual character's foot again so that the user can fly a shoe once more. When the falling position or the movement trajectory of the shoe object 112 satisfies a score assignment condition, a score is given to the user. The score given may differ depending on the falling position or the movement trajectory. Alternatively, a task is completed when the falling position or the movement trajectory of the shoe object 112 satisfies an achievement condition.

As described above, since the user performs the shoe-flying motion while actually seated on the seat portion 15 of the swing, the seat portion 15 sways in accordance with the user's motion. Further, by detecting this sway with the HMD30, the game processing device 50 can move the image displayed on the HMD30 in accordance with the actual sway. This suppresses the occurrence of VR sickness, and gives the user a sense of immersion as if he or she were flying a shoe while sitting on a virtual swing.

The procedure of the game processing of the present embodiment will be described.

The procedure for setting the boundaries will be described with reference to fig. 8. The game processing device 50 acquires detection data from the tracking sensors 22 and the like, and specifies the positions of the tracking sensors 22 (sensor positions) (step S1). Specifically, the seating position P1, the left foot position P2, and the right foot position P3 are specified.

When determining that a game is newly started or that a new user starts a game, the game processing device 50 sets the boundaries in accordance with the positions of the tracking sensors 22 (step S2). Specifically, the front boundary 101 and the rear boundary 102 are set using the seating position P1 and at least one of the left foot position P2 and the right foot position P3. The front boundary 101 and the rear boundary 102 may be common to both feet, or may be set for each foot. The position information of the front boundary 101 and the rear boundary 102 is stored in the data storage unit 60 as the setting information 61. At this time, the position information of the front boundary 101 and the rear boundary 102 may be stored in association with identification information of the user. When a user for whom the position information of the front boundary 101 and the rear boundary 102 has been set plays the game, that position information may be read from the data storage unit 60 and used to play the game.

Next, the progress of the game will be described with reference to fig. 9. The game processing device 50 starts the game in response to a trigger such as an input operation on the input operation unit 65 (step S11). The game processing device 50 performs initialization, for example, resetting the boundary position information used in the previous game. The game processing device 50 acquires the detection data of the measurement device 32 of the HMD30 and displays, on the HMD30, an image of the display range corresponding to the detection data.

While advancing the game based on the game progress data 63, the game processing device 50 determines whether the left foot position P2 or the right foot position P3 crosses the rear boundary 102, based on the position information of the rear boundary 102 (step S12). When neither the left foot position P2 nor the right foot position P3 crosses the rear boundary 102 (step S12: NO), the game processing device 50 repeats step S12.

When it is determined that the left foot position P2 or the right foot position P3 crosses the rear boundary 102 (step S12: YES), the game processing device 50 determines that the shoe-flying motion of the corresponding foot has started (step S13).

While the game advances, the game processing device 50 determines whether the left foot position P2 or the right foot position P3 crosses the front boundary 101 (step S14). When it is determined that neither the left foot position P2 nor the right foot position P3 crosses the front boundary 101 (step S14: NO), the game processing device 50 proceeds to step S17. In step S14, it may instead be determined whether the left foot position P2 or the right foot position P3 crosses the front boundary 101 within a predetermined time, for example 1 to 2 seconds, and the process may proceed to step S17 when it is determined that neither crosses the front boundary 101 within the predetermined time (step S14: NO).

When the game processing device 50 determines that at least one of the left foot position P2 and the right foot position P3 has crossed the front boundary 101 (step S14: YES), it determines that the shoe-flying motion is completed (step S15).

Then, the game processing device 50 executes the shoe-flying activity in the virtual space (step S16). At this time, the game processing device 50 calculates the movement distance of the object from the movement speed using, for example, the map of fig. 6B. For example, the game processing device 50 renders the virtual character's foot and the shoe object, and renders the shoe object falling along a parabola in the direction in which the foot was kicked out, according to the calculated movement distance. Further, after the shoe object leaves the virtual character's foot, the game processing device 50 renders a new shoe object on the virtual character's foot.

The game processing device 50 determines whether the game has ended based on the game end condition (step S17). When determining that the end condition is satisfied (step S17: YES), the game processing device 50 ends the game. When the end condition is not satisfied (step S17: NO), the process returns to step S12.
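Steps S12 to S17 amount to a small two-stage loop: wait for a foot to cross the rear boundary 102, then wait for a foot to cross the front boundary 101. The sketch below is one possible reading of the flow; the polling and rendering callables are hypothetical placeholders, not part of the described system.

```python
def run_kick_mode(get_feet_x, rear_x, front_x, game_ended, fly_shoe):
    """get_feet_x() -> (left_x, right_x): current foot sensor X coordinates.
    rear_x / front_x: X coordinates of boundaries 102 and 101.
    fly_shoe(): executes the shoe-flying activity (step S16).
    All callables are assumed placeholders."""
    while not game_ended():                      # step S17
        left_x, right_x = get_feet_x()
        # Steps S12/S13: the motion starts when either foot crosses the
        # rear boundary 102 (moves behind it in the reverse X direction).
        if left_x > rear_x and right_x > rear_x:
            continue
        # Steps S14/S15: the motion completes when either foot crosses
        # the front boundary 101.
        while not game_ended():
            left_x, right_x = get_feet_x()
            if left_x >= front_x or right_x >= front_x:
                fly_shoe()                       # step S16
                break
```

For brevity, the sketch omits the optional 1-to-2-second time limit of step S14, under which the process would return to step S12 if the front boundary is not crossed in time.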

The advantages of the first embodiment will be explained.

(1) The distance between the tracking sensor 22 worn on the user's foot and the tracking sensor 22 attached to the housing 11 is calculated, and the front boundary 101 and the rear boundary 102 are set based on this distance. The positions of the front boundary 101 and the rear boundary 102 are thus set according to the user's body shape. When the positional relationship between the tracking sensor 22 worn on the user's foot and the front boundary 101 or the rear boundary 102 satisfies a predetermined condition, an activity corresponding to the user's motion is executed in the virtual space. The position at which a user's motion starts in the real world differs according to the user's body shape. Therefore, by using boundaries that reflect the user's body shape, execution of an activity contrary to the user's intention is suppressed, and the user's sense of discomfort can be reduced.

(2) The game processing device 50 acquires the speed of the tracking sensor 22 worn on the user's foot at the time the sensor crosses the front boundary 101. The magnitude of the speed is reflected in the movement distance of the shoe object in the shoe-flying activity. The user can thus adjust the movement distance of the shoe object in the virtual space by adjusting the movement speed of the foot. The game can therefore be diversified, for example by giving a score according to the falling position of the shoe object or according to the movement distance.

(3) When the tracking sensor 22 worn on the user's foot crosses the front boundary 101, the game processing device 50 moves the shoe object away from the virtual character's foot in the virtual space. The timing at which the movement of the object starts can therefore be brought closer to the timing the user expects, and the user can intuitively perform a kicking motion as an input operation.

(4) A tracking sensor 22 is worn on the user's foot. This makes it possible to reflect the movement of the foot in movement in the virtual space. Therefore, a novel game can be provided.

[ second embodiment ]

Next, a second embodiment of the image display system will be described. The boundary setting method in the second embodiment is the same as in the first embodiment, but the way the set boundaries are used is changed from the first embodiment. Hereinafter, the same portions as in the first embodiment are denoted by the same reference numerals, and detailed description thereof is omitted.

In the second embodiment, the front boundary 101 and the rear boundary 102 are used to determine whether to start moving an object in the virtual space. Specifically, when the positional relationship between the left foot position P2 and the right foot position P3 and at least one of the front boundary 101 and the rear boundary 102 satisfies a predetermined condition, the game processing device 50 moves the seat object corresponding to the seat portion 15 and the virtual character. Since the user's viewpoint in the virtual space corresponds to the viewpoint of the virtual character, the user's viewpoint in the virtual space also moves along with the movement of the virtual character. Conventionally, a virtual character is moved either by moving it in the virtual space along with the movement of the user's viewpoint in the real world, or automatically according to the game scenario. Alternatively, for example, the real-world user operates a hand-held controller and instructs movement using a GUI (Graphical User Interface) in the virtual space, such as an icon. In the former case, the virtual character may move contrary to the user's intention. In the latter case, to move the virtual character to a target position, the controller must be operated to instruct movement using a GUI such as an icon that does not exist in the real world, which may impair the sense of immersion. In contrast, in the present embodiment, the user moves the virtual character by sitting on the swing and moving the feet. The virtual character can thus be moved by a more intuitive operation than with the conventional methods. The object to be moved may be the entire housing object in the virtual space corresponding to the housing 11, instead of or in addition to the seat object and the virtual character. In the present embodiment, the movement of the seat object and the virtual character is determined in a mode (flight mode) different from the mode in which the shoe-flying motion is performed (kick mode).

The front boundary 101 and the rear boundary 102 shown in figs. 10A to 12 may be boundaries in the X direction. For example, the front boundary 101 and the rear boundary 102 are Y-Z planes having the X direction as their normal direction. The game processing device 50 determines whether the left foot position P2 (not shown) and the right foot position P3 (not shown) cross a boundary at the same time.

As shown in fig. 10A, when it is determined that both feet cross the front boundary 101, the game processing device 50 specifies the direction in which the user 200 extends the feet. When it is determined that the direction in which the user extends the feet is the X direction, that is, straight ahead of the user, the game processing device 50 moves the seat object and the virtual character in the virtual space rearward (in the reverse X direction), the direction opposite to the direction in which the feet are extended. At this time, the game processing device 50 may move the seat object and the virtual character 113 horizontally rearward, or may swing them rearward. Moving the objects in the direction opposite to the direction in which the user extends the feet in this way can give the user the impression that the seat object and the virtual character are propelled by a thrust arising as a reaction to the kicking motion.

The method by which the game processing device 50 specifies the direction in which the user extends the feet is not particularly limited. For example, the direction may be determined from the left foot position P2 before the game starts, at the time the boundaries are set. Alternatively, the direction in which the left foot position P2 or the right foot position P3 projects may be determined with the seating position P1 as a reference. Alternatively, the midpoint of the left foot position P2 and the right foot position P3 may be calculated, and the direction in which the feet are extended may be determined from the direction from the reference seating position P1 toward that midpoint.
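A minimal sketch of the last option above (midpoint of P2 and P3 relative to P1) follows. The sign convention (positive angle = toward the user's left) and the threshold angles are assumptions made for illustration.

```python
import math

def extension_angle(seat_pos, left_pos, right_pos):
    """Angle, in degrees, of the direction in which both feet are extended,
    in the horizontal X-Z plane and measured from the forward X direction.
    Computed from the midpoint of P2 and P3 relative to the seating
    position P1 (the third option described above)."""
    mid_x = (left_pos[0] + right_pos[0]) / 2.0 - seat_pos[0]
    mid_z = (left_pos[2] + right_pos[2]) / 2.0 - seat_pos[2]
    return math.degrees(math.atan2(mid_z, mid_x))

def seat_move_command(angle_deg, lateral_threshold=60.0, bias_threshold=10.0):
    """Map an extension angle to a movement of the seat object and the
    virtual character: opposite to the extension direction (figs. 10A-10C),
    or directly sideways past the assumed lateral threshold."""
    if angle_deg >= lateral_threshold:
        return "move left"        # feet well to the left -> move left
    if angle_deg <= -lateral_threshold:
        return "move right"       # feet well to the right -> move right
    if angle_deg > bias_threshold:
        return "move rear-right"  # feet to the front left (fig. 10B)
    if angle_deg < -bias_threshold:
        return "move rear-left"   # feet to the front right (fig. 10C)
    return "move rear"            # feet straight ahead (fig. 10A)
```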

As shown in fig. 10B, when it is determined that both the left foot position P2 (not shown) and the right foot position P3 (not shown) cross the front boundary 101 and that the direction in which the user 200 extends both feet is the front left, the game processing device 50 moves the seat object and the virtual character 113 to the rear right in the virtual space.

As shown in fig. 10C, when it is determined that both the left foot position P2 (not shown) and the right foot position P3 (not shown) cross the front boundary 101 and that the direction in which the user 200 extends both feet is the front right, the game processing device 50 moves the seat object and the virtual character 113 to the rear left in the virtual space.

In figs. 10A to 10C, the seat object and the virtual character 113 move in the direction opposite to the direction in which the user extends the feet, but they may instead move in the same direction as the feet are extended. For example, when the user extends both feet straight ahead, the seat object and the virtual character 113 may move forward; when the direction in which the user extends both feet is the front left, they may move to the front left.

The movement direction of the seat object and the virtual character 113 is not limited to the directions described above. For example, the seat object and the virtual character 113 may move directly sideways, to the user's right or left. For example, when the direction in which the user extends the feet is a predetermined angle or more to the right of the forward direction, the seat object and the virtual character 113 may be moved to the right; when it is a predetermined angle or more to the left, they may be moved to the left.

As shown in fig. 11A, when the game processing device 50 determines that the user 200 has bent the feet so that the lower legs are positioned below the seat portion 15 and that the left foot position P2 (not shown) and the right foot position P3 (not shown) cross the rear boundary 102, it specifies the direction of the lower legs. When the game processing device 50 determines that the direction in which the lower legs extend is the reverse X direction, that is, the direction opposite to the X direction, it moves the seat object and the virtual character 113 forward, in the X direction, in the virtual space.

As shown in fig. 11B, when it is determined that both the left foot position P2 (not shown) and the right foot position P3 (not shown) have crossed the rear boundary 102, the game processing device 50 specifies the direction of the lower legs. When the game processing device 50 determines that the direction of the lower legs is the rear right, it moves the seat object and the virtual character forward to the left.

As shown in fig. 11C, when it is determined that both the left foot position P2 (not shown) and the right foot position P3 (not shown) cross the rear boundary 102, the game processing device 50 specifies the direction of the lower legs. When the game processing device 50 determines that the direction of the lower legs is the rear left, it moves the seat object and the virtual character forward to the right.

On the other hand, when the user 200 bends the right foot while extending the left foot, bends the left foot while extending the right foot, and thus kicks the left foot and the right foot alternately, the seat object and the virtual character move upward, in the Y direction, in the virtual space.

As shown in fig. 12, when the game processing device 50 determines that the left foot position P2 (not shown) has crossed the front boundary 101 and the right foot position P3 (not shown) has crossed the rear boundary 102, it renders the seat object and the virtual character moving upward. While the state in which one foot crosses the front boundary 101 and the other foot crosses the rear boundary 102 continues to be repeated within a predetermined time, the rendering of the seat object and the virtual character moving upward continues. Alternatively, the rendering of the seat object and the virtual character 113 moving upward may be started when the number of times one foot crosses the front boundary 101 and the other foot crosses the rear boundary 102 reaches a predetermined number.

The procedure of the game processing according to the second embodiment will be described with reference to fig. 13.

When the game is started (step S20), the game processing device 50 performs initialization such as resetting the previous history. The game processing device 50 then determines whether both feet cross a boundary at the same time (step S21).

When the game processing device 50 determines that both feet have crossed either the front boundary 101 or the rear boundary 102 (step S21: YES), it acquires the direction in which the feet (lower legs) are extended (step S22). Then, the game processing device 50 displays on the HMD30 an image in which the seat object and the virtual character move in accordance with the acquired direction (step S23).

On the other hand, when the game processing device 50 determines in step S21 that the user's feet do not cross a boundary at the same time (step S21: NO), it determines whether each foot crosses a different boundary (step S25). When the game processing device 50 determines that each foot has crossed a different boundary (step S25: YES), it displays on the HMD30 an image in which the seat object and the virtual character rise (step S26), and the process proceeds to step S24. When the game processing device 50 determines that the feet have not crossed different boundaries (step S25: NO), the process proceeds to step S24.

The game processing device 50 determines whether the game has ended (step S24). When it determines that the game has ended (YES in step S24), the process ends. On the other hand, when it determines that the game has not ended (NO in step S24), the process returns to step S21.
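The overall flow of fig. 13 can be summarized by the following Python sketch. The `dev` object and its method names are assumed interfaces standing in for the game processing device 50; they are illustrative only.

```python
def game_loop(dev):
    dev.initialize()                                 # S20: reset stored history
    while True:
        if dev.both_feet_cross_same_boundary():      # S21
            direction = dev.get_leg_direction()      # S22
            dev.render_move(direction)               # S23: move seat and character
        elif dev.feet_cross_different_boundaries():  # S25
            dev.render_ascend()                      # S26: raise seat and character
        if dev.game_ended():                         # S24
            break                                    # YES in S24: end the process
```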

The advantages of the second embodiment will be explained.

(5) When the user extends or bends the legs so that the positional relationships between the left foot position P2 and the right foot position P3 on the one hand and at least one of the front boundary 101 and the rear boundary 102 on the other satisfy a predetermined condition, the seat portion object and the virtual character move. Thus, the user can move the virtual character in the virtual space by an intuitive operation at a timing intended by the user.

(6) When the left foot position P2 and the right foot position P3 cross the front boundary 101, the game processing device 50 moves the seat portion object and the virtual character rearward, that is, in the direction opposite to the direction in which both feet are extended. When the left foot position P2 and the right foot position P3 cross the rear boundary 102, the game processing device 50 moves the seat portion object and the virtual character forward, that is, in the direction opposite to the direction in which both feet are extended. As described above, by using boundaries that reflect the body shape of the user, it is possible to suppress situations in which an activity is executed contrary to the user's intention.

(7) The game processing device 50 changes the movement mode of the seat portion object and the virtual character according to the difference in the positional relationships of the left foot position P2 and the right foot position P3 with respect to the boundaries. That is, when both the left foot position P2 and the right foot position P3 cross the same boundary, the seat portion object and the virtual character move rearward or forward, and when the left foot position P2 and the right foot position P3 cross different boundaries, the seat portion object and the virtual character rise. This enables the boundaries to be used for determining the type of activity.

The above embodiments can be modified and implemented as follows. The above embodiments and the following modifications can be combined with each other within a range not technically contradictory.

In each of the above embodiments, the tracking sensor 22 is worn near the ankle of the user. Instead of this or in addition to this, the value of the detection data of a tracking sensor 22 worn at a position other than near the ankle may be reflected in the activity. For example, in the "flight mode", the moving speed, the moving distance, and the like of the seat portion object and the virtual character may be increased based on the detection data of a tracking sensor 22 worn at any position of the upper body of the user, including the head. In the "kick mode", the movement of the upper body may be reflected in the activity, for example, by increasing the moving distance of the shoe object in accordance with the movement of the upper body of the user. Further, in the "flight mode", when the left foot position P2 and the right foot position P3 cross the front boundary 101 and the position of a tracking sensor 22 worn on the head, or on a part of the upper body other than the head, has moved rearward with respect to its initial position at the start of the game, that is, when the head or the upper body is inclined in the direction opposite to the direction in which the feet are extended, the moving speed, the moving distance, and the like of the seat portion object and the virtual character may be increased to reflect the movement, as sketched below.
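A minimal sketch of the "flight mode" speed boost follows, assuming X is the forward axis. The displacement threshold and the multiplier are assumed values, not taken from the embodiment.

```python
LEAN_THRESHOLD = 0.05  # assumed rearward head displacement in meters
BOOST_FACTOR = 1.5     # assumed speed multiplier

def flight_speed(base_speed, head_x, initial_head_x, both_feet_cross_front):
    """Boost the movement speed when both feet cross the front boundary 101
    while the head (or upper-body) sensor has moved rearward of its initial
    position recorded at the start of the game."""
    leaned_back = (initial_head_x - head_x) > LEAN_THRESHOLD
    if both_feet_cross_front and leaned_back:
        return base_speed * BOOST_FACTOR
    return base_speed
```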

In each of the above embodiments, whether to start executing an activity and whether the executed activity is complete are determined using the tracking sensor 22 worn near the ankle of the user. Instead of this or in addition to this, at least one of these determinations may be made using a tracking sensor 22 worn at a position other than near the ankle.

An example of a procedure for executing, in the virtual space, an activity of throwing an object using tracking sensors 22 worn on the arm will be described with reference to figs. 14A to 14C. As shown in fig. 14A, the user 200 wears the tracking sensors 22 on the elbow and the wrist, respectively. The game processing device 50 determines the wrist position P5 and the elbow position P6 based on data acquired from at least one of the tracking sensor 22 and the tracking assistance device 21. The game processing device 50 calculates the length from the elbow to the wrist of the user 200, that is, the length Lx1 of the forearm, based on the distance between the tracking sensors 22. Further, the game processing device 50 sets the rear boundary 102 at a position separated from the wrist position P5 toward the shoulder of the user 200 by the length Lx1 multiplied by a predetermined ratio R5. In this example, the ratio R5 is "1" or more. Further, the game processing device 50 sets the front boundary 101 at a position separated from the wrist position P5 in the elbow direction by the length Lx1 multiplied by a predetermined ratio R6.
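This boundary setup can be sketched in Python as follows. The ratio values are assumptions (the embodiment only states that R5 is "1" or more), and for simplicity the sketch places the front boundary 101 on the opposite side of the wrist from the rear boundary 102 along the forearm axis.

```python
import math

R5 = 1.2  # assumed value of the predetermined ratio R5 (source: R5 >= 1)
R6 = 0.8  # assumed value of the predetermined ratio R6

def set_arm_boundaries(wrist_pos, elbow_pos):
    """Compute the forearm length Lx1 from the wrist and elbow sensors and
    place the rear boundary 102 on the shoulder side of the wrist position
    P5 and the front boundary 101 on the opposite side (an assumption)."""
    lx1 = math.dist(wrist_pos, elbow_pos)  # forearm length Lx1
    # Unit vector along the forearm, from the wrist toward the elbow.
    axis = tuple((e - w) / lx1 for w, e in zip(wrist_pos, elbow_pos))
    rear_102 = tuple(w + R5 * lx1 * a for w, a in zip(wrist_pos, axis))
    front_101 = tuple(w - R6 * lx1 * a for w, a in zip(wrist_pos, axis))
    return front_101, rear_102
```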

As shown in fig. 14B, the game processing device 50 determines whether the wrist position P5 has crossed the rear boundary 102. When the game processing device 50 determines that the wrist position P5 has crossed the rear boundary 102, it determines that the motion has started. Activities corresponding to this motion include, for example, an activity of throwing an object such as a ball, and an activity of casting an object such as a bait onto the water surface. Instead of throwing an object, the motion may be an arm-bending motion such as hitting a ball or dancing.

As shown in fig. 14C, after the start of the motion, the game processing device 50 determines whether the wrist position P5 has crossed the front boundary 101. When the game processing device 50 determines that the wrist position P5 has crossed the front boundary 101, it determines that the motion is complete. The game processing device 50 then reflects the activity in the virtual space. For example, when the bait is cast, the game processing device 50 moves the bait forward along the calculated trajectory. Similarly, when the ball is thrown, the game processing device 50 moves the ball forward along the calculated trajectory.
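A compact sketch of this start/complete determination as a small state machine follows. It tracks the wrist along a single axis (X forward, assumed); the class and its members are illustrative names, not from the embodiment.

```python
class ThrowDetector:
    """Tracks the wrist position P5 along one axis and reports when the
    throwing motion starts (rear boundary 102 crossed) and completes
    (front boundary 101 crossed)."""

    def __init__(self, front_x, rear_x):
        self.front_x = front_x   # front boundary 101
        self.rear_x = rear_x     # rear boundary 102
        self.prev_x = None
        self.started = False

    def _crossed(self, boundary_x, curr_x):
        # The position passed from one side of the boundary plane to the other.
        return (self.prev_x is not None and
                (self.prev_x - boundary_x) * (curr_x - boundary_x) < 0)

    def update(self, wrist_x):
        """Returns True once, on the frame the motion completes."""
        done = False
        if not self.started and self._crossed(self.rear_x, wrist_x):
            self.started = True          # wind-up: crossed rear boundary 102
        elif self.started and self._crossed(self.front_x, wrist_x):
            self.started = False
            done = True                  # release: crossed front boundary 101
        self.prev_x = wrist_x
        return done
```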

In addition, the boundaries may be set using the length of the entire arm or the length of the upper arm instead of the length of the forearm. When calculating the length of the entire arm, the tracking sensors 22 are worn on the wrist and the shoulder. When calculating the length of the upper arm, the tracking sensors 22 are worn on the elbow and the shoulder. The game processing device 50 calculates the length of the entire arm or the length of the upper arm, multiplies the calculated length by a predetermined ratio, and sets a boundary with the position of the wrist, the shoulder, or the elbow as a reference. A boundary set in this way can be used for determining whether to execute an activity corresponding to a motion of rotating the entire arm or a motion of bending the arm. For example, in a golf game, a baseball game, a tennis game, a billiards game, or the like, the start of a motion, its completion, the type of motion, and the like may be determined.

In the above embodiments, the game provided by the system is a game that the user plays while seated. Instead, the game may be one that the user plays in a standing position.

As shown in fig. 15A, the game processing device 50 calculates the length Ly5 of the lower leg from the positions of the tracking sensors 22. Further, the game processing device 50 sets the boundary 120 at a position separated from the ankle position P7 toward the knee of the user 200 by the length Ly5 of the lower leg multiplied by a predetermined ratio R7. The ratio R7 in this example is less than "1".

As shown in fig. 15B, in a game including an activity of kicking an object, such as soccer, the game processing device 50 may determine that the activity intended by the user is dribbling when the kicking motion starts from an ankle position P7 lower than the boundary 120.

As shown in fig. 15C, the game processing device 50 may determine that the activity intended by the user is shooting when the kicking motion starts from an ankle position P7 higher than the boundary 120.
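The determination of figs. 15A to 15C reduces to a height comparison, as in the following sketch. The value of R7 is an assumption (the embodiment only states that it is less than "1").

```python
R7 = 0.5  # assumed value of the predetermined ratio R7 (source: R7 < 1)

def boundary_120_height(ankle_y, lower_leg_length_ly5):
    """Boundary 120: offset from the ankle position P7 toward the knee by
    the lower-leg length Ly5 multiplied by the ratio R7 (fig. 15A)."""
    return ankle_y + R7 * lower_leg_length_ly5

def classify_kick(ankle_y_at_start, boundary_y):
    """Dribbling when the kick starts below the boundary 120 (fig. 15B),
    shooting when it starts above it (fig. 15C)."""
    return 'dribble' if ankle_y_at_start < boundary_y else 'shoot'
```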

In the second embodiment, when the positional relationship between the tracking sensor 22 worn near the ankle and the preset boundary satisfies the predetermined condition, the predetermined activity is executed in the virtual space. Instead of this or in addition to this, the virtual character or the like may be moved using a tracking sensor 22 worn at a position other than near the ankle. For example, the virtual character or the like may be moved when the positional relationship between a tracking sensor 22 worn at any position on the upper body and a boundary set for the upper body satisfies a predetermined condition. Likewise, the virtual character or the like may be moved when the positional relationship between a tracking sensor 22 worn on the head and a boundary set for the head satisfies a predetermined condition. The movement of the virtual character or the like may be upward, downward, leftward, rightward, or the like; it may follow the direction of the upper-body movement, or it may be in another direction.

In each of the above embodiments, the motion is determined to have started when the left foot position P2 or the right foot position P3 crosses the rear boundary 102, and determined to be complete when the left foot position P2 or the right foot position P3 crosses the front boundary 101. Instead, an activity such as flying the shoe may be executed as soon as the left foot position P2 or the right foot position P3 crosses the rear boundary 102, or as soon as the left foot position P2 or the right foot position P3 crosses the front boundary 101. According to this aspect, the processing load of the activity determination process can be reduced.

In the second embodiment, the rising seat portion object and virtual character are rendered while the state in which one foot crosses the front boundary 101 and the other foot crosses the rear boundary 102 is repeated within a predetermined time. Instead of this or in addition to this, the descending seat portion object and virtual character may be rendered when the positional relationship between the tracking sensor 22 of one foot and the front boundary 101 satisfies a predetermined condition and the positional relationship between the tracking sensor 22 of the other foot and the rear boundary 102 satisfies a predetermined condition. Alternatively, the descending seat portion object and virtual character may be rendered when the positional relationship between the tracking sensor 22 of either foot and the front boundary 101 or the rear boundary 102 satisfies a predetermined condition, or when the positional relationships between the tracking sensors 22 of both feet and the front boundary 101 or the rear boundary 102 satisfy a predetermined condition. The specific motion of the user at this time is not particularly limited. For example, in addition to the motion of moving the feet alternately, a motion of moving both feet forward and backward simultaneously, a motion of moving both feet leftward and rightward simultaneously, or a motion of pedaling a bicycle with both feet may be used. Alternatively, instead of rendering only ascent or only descent, the seat portion object and the virtual character may be rendered as repeatedly ascending and descending.

In each of the above embodiments, the moving distance of the shoe object may be changed according to the length of the lower leg. For example, when the lower leg is long, a correction may be applied such that the moving distance of the shoe object is longer than when the lower leg is short. At a constant angular velocity, the centrifugal force acting when the lower leg rotates around the knee becomes larger as the lower leg becomes longer. Therefore, increasing the moving distance of the shoe object for a longer lower leg can approximate the behavior in the real world. Conversely, when the lower leg is long, a correction may be applied such that the moving distance of the shoe object is shorter than when the lower leg is short.
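One possible form of this correction is a simple proportional scaling, as sketched below. The reference length and the linear scaling law are assumptions; the embodiment only states that a longer lower leg may map to a longer (or, conversely, shorter) travel distance.

```python
REFERENCE_LOWER_LEG = 0.40  # assumed reference lower-leg length in meters

def corrected_shoe_distance(base_distance, lower_leg_length):
    """Scale the shoe object's travel in proportion to the lower-leg length,
    mimicking the larger foot speed of a longer leg at the same angular
    velocity; invert the ratio for the opposite correction."""
    return base_distance * (lower_leg_length / REFERENCE_LOWER_LEG)
```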

In each of the above embodiments, the motion corresponding to the shoe-flying activity is determined to have started when at least one of the left foot position P2 and the right foot position P3 crosses the rear boundary 102. Alternatively, the motion may be determined to have started when at least one of the left foot position P2 and the right foot position P3 overlaps the rear boundary 102. Similarly, the motion corresponding to the shoe-flying activity is determined to be complete when at least one of the left foot position P2 and the right foot position P3 crosses the front boundary 101. Alternatively, the motion may be determined to be complete when at least one of the left foot position P2 and the right foot position P3 overlaps the front boundary 101.
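The "overlaps" variant can be expressed as a proximity test rather than a side-to-side crossing test, as in the sketch below. The tolerance value is an assumption, not taken from the embodiment.

```python
OVERLAP_EPS = 0.02  # assumed tolerance in meters (not from the source)

def overlaps(foot_x, boundary_x, eps=OVERLAP_EPS):
    """True while the foot position coincides with the boundary plane
    within a tolerance, instead of requiring the position to pass from
    one side of the boundary to the other."""
    return abs(foot_x - boundary_x) <= eps
```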

In each of the above embodiments, the seat portion 15 serving as a swing sways slightly in the front-rear direction. Instead, the seat portion 15 may be swung back and forth in accordance with a swinging motion of the user. In this case, the determination of the user's shoe-flying motion can be made by moving the front boundary 101 and the rear boundary 102 together with the seating position P1 as a reference.

In each of the above embodiments, the length of the lower leg is calculated from the seating position P1 to the left foot position P2 or the right foot position P3, each of which is an ankle position. Alternatively, when the height of the seat portion 15 is adjusted via the hanging portion 13 so that the feet of the user 200 touch the floor 500, the height of the seating position P1 may be used as the length of the lower leg.

In each of the above embodiments, the length of the lower leg of the user is calculated. Instead, the length of the user's thigh may be calculated. The length of the thigh can be approximated by the distance between the coordinates of the seating position P1 and the coordinates of the left foot position P2 or the right foot position P3. Since a long thigh suggests a long lower leg, the boundaries may be set according to the length of the thigh. In this case as well, boundaries matching the user's build can be set.

In each of the above embodiments, the boundary is set with reference to the left foot position P2 or the right foot position P3. Instead of this or in addition to this, the boundary may be set with reference to a tracking sensor 22 not worn on the body of the user, for example a tracking sensor 22 attached to the seat portion 15.

In each of the above embodiments, the motion is determined to have started when the foot position crosses the rear boundary 102, and determined to be complete when the foot position crosses the front boundary 101. Instead of this or in addition to this, the type of the user's motion may be determined based on the positional relationship between the foot or arm and the boundary. For example, in a baseball game, a hitting motion, a pitching motion, and the like may be distinguished from the positional relationship between the arm and the boundary. In a dance game, a turn, a jump, or the like may be determined based on the positional relationship between the foot or arm and the boundary.

In each of the above embodiments, the front boundary 101 and the rear boundary 102 are set with reference to the feet of the user. Instead of this or in addition to this, in fig. 3, an upper boundary and a lower boundary delimiting the Y direction may be set, or a left boundary and a right boundary delimiting the Z direction may be set. The number of boundaries may be one, or may be three or more. Further, a boundary may be a point or a line instead of a plane.

In each of the above embodiments, the positional relationship is determined to satisfy the predetermined condition when the left foot position P2 or the right foot position P3 of the user crosses either the front boundary 101 or the rear boundary 102. Instead, the game processing device 50 may determine whether to execute the activity according to whether the foot position is approaching or moving away from the boundary.
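This approach/recede variant amounts to comparing successive distances to the boundary, as in the following sketch; the function name and the single-axis simplification are assumptions for illustration.

```python
def boundary_trend(prev_x, curr_x, boundary_x):
    """Classify the foot's motion relative to a boundary plane by comparing
    successive distances rather than detecting a crossing."""
    prev_d = abs(prev_x - boundary_x)
    curr_d = abs(curr_x - boundary_x)
    if curr_d < prev_d:
        return 'approaching'
    if curr_d > prev_d:
        return 'receding'
    return 'stationary'
```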

In each of the above embodiments, the tracking system 20 is, as one example, a system in which the tracking assistance device 21 serving as a light emitter and the tracking sensor 22 serving as a light receiver cooperate with each other. Any other tracking system 20 may be used as long as the position and orientation of the tracking sensor 22 can be detected. For example, an outside-in tracking system other than that of the above embodiments may be used, such as a system in which an auxiliary device serving as a light receiver is provided in the game space and cooperates with the tracking sensor 22 serving as a light emitter to calculate the position and direction of the tracking sensor 22. Alternatively, an inside-out tracking system may be used in which a sensor mounted on the tracking sensor 22 scans the real space to determine the user's position.

In each of the above embodiments, the HMD30 having a light receiver detects the position and direction of the HMD30 in cooperation with the tracking assistance device 21 having a light emitter. Instead, an HMD30 having a light emitter may detect the position and direction of the HMD30 in cooperation with a tracking assistance device 21 having a light receiver. Alternatively, an inside-out tracking system may be used here as well.

In each of the above embodiments, the HMD30 is an example of a display used by the user. Instead of this or in addition to this, the display may be one that is not worn on the body of the user. For example, a stationary display, a display of a portable game machine, a display incorporated in an arcade game machine, or the like can be used. When such a display is used, a signal based on a user operation may be received from an input operation unit (controller), and the display range on the display may be changed accordingly.

In each of the above embodiments, the HMD30 is a device separate from the game processing device 50. Instead, the HMD30 may incorporate the game processing device 50. Furthermore, the HMD30 may be a standalone HMD provided with a device for detecting its own position.

In the above embodiments, the housing is a swing as one example, but the housing may be other than a swing. The housing may have a seat portion on which the user sits, or may be operated in a standing or lying posture. The housing may be a vehicle such as an automobile, a bicycle, an airplane, another flying object, a submarine, a roller coaster, or a rocket, may be a shooting device such as a gun or a rocket launcher, or may be play equipment such as a slide. Alternatively, the system may provide a game, or an application other than a game, without using the housing 11.

In each of the above embodiments, the system is a system in which a user plays a game. Instead, the system may be an experience-oriented system, a system for viewing content such as movies, a system for the user to communicate with other users, a learning system, a training system, a simulation system in the medical field, or the like.
