Information processing program, information processing method, information processing apparatus, and information processing system

Document No.: 213518    Publication date: 2021-11-05

Note: This technology, "Information processing program, information processing method, information processing apparatus, and information processing system", was created by 金森沙绘罗 and 森月健吾 on 2020-02-28. Abstract: The present invention makes it possible to provide a player with simpler operations in a computer game in which a display area is movable. An information processing program that realizes, with a computer: an arrangement function for arranging a first object and a second object in a virtual space; a display control function of causing a display medium to display a prescribed region of the virtual space; a region moving function for moving the prescribed region based on an input of a continuous movement instruction operation by a player; a selection function of selecting the first object when the first object is displayed at a prescribed position in the prescribed region while the movement instruction operation is continuing; and an action execution function of causing the second object to execute an action that can be executed on the first object selected while the movement instruction operation is continuing, in a case where the movement instruction operation is completed.

1. An information processing program for realizing, with a computer:

an arrangement function for arranging a first object and a second object in a virtual space;

a display control function of causing a display medium to display a prescribed region of the virtual space;

a region moving function for moving the prescribed region based on an input of a continuous movement instruction operation by a player;

a selection function of selecting the first object when the first object is displayed at a prescribed position in the prescribed region while the movement instruction operation is continuing; and

an action execution function of causing the second object to execute an action that can be executed on the first object selected while the movement instruction operation is continuing, in a case where the movement instruction operation is completed.

2. The information processing program according to claim 1, wherein

the arrangement function also arranges a virtual camera in the virtual space,

the display control function causes the display medium to display, as the prescribed region, a region that the virtual camera captures in a prescribed direction from an arrangement position of the virtual camera itself, and

the region moving function moves the prescribed region by changing the prescribed direction in which the virtual camera shoots, based on an input of the movement instruction operation.

3. The information processing program according to claim 2, the information processing program further implementing, with the computer, an object moving function for moving a position of the second object based on an input of a movement instruction operation for the second object, the movement instruction operation for the second object being an operation different from the movement instruction operation,

wherein the arrangement function determines the arrangement position of the virtual camera based on the position of the second object moved by the object moving function.

4. The information processing program according to any one of claims 1 to 3, wherein

the input of the movement instruction operation is accepted via an input accepting means stacked on the display medium.

5. The information processing program according to any one of claims 1 to 4, wherein

the display control function causes the display medium to display information about the first object selected by the selection function.

6. The information processing program according to claim 5, wherein

the display control function includes, in the information about the first object, at least one of information indicating an action that can be performed on the first object and information indicating a status of the first object.

7. The information processing program according to any one of claims 1 to 6, wherein

the selection function does not select the first object when the position of the first object and the position of the second object are separated by at least a prescribed distance.

8. The information processing program according to any one of claims 1 to 7, wherein,

in a case where there are a plurality of actions that can be performed on the first object selected while the movement instruction operation is continuing,

the display control function causes the display medium to display the plurality of actions that can be performed as options, and

the action execution function determines an action to be executed by the second object based on an input of an operation of the player for selecting an action from the options.

9. An information processing method executed by a computer, the information processing method comprising:

an arrangement step of arranging a first object and a second object in a virtual space;

a display control step of causing a display medium to display a prescribed region of the virtual space;

a region moving step of moving the prescribed region based on an input of a continuous movement instruction operation by a player;

a selection step of selecting the first object when the first object is displayed at a prescribed position in the prescribed region while the movement instruction operation is continuing; and

an action execution step of causing the second object to execute an action that can be executed on the first object selected while the movement instruction operation is continuing, in a case where the movement instruction operation is completed.

10. An information processing apparatus comprising:

an arrangement means for arranging a first object and a second object in a virtual space;

a display control means for causing a display medium to display a prescribed region of the virtual space;

a region moving means for moving the prescribed region based on an input of a continuous movement instruction operation by a player;

a selection means for selecting the first object when the first object is displayed at a prescribed position in the prescribed region while the movement instruction operation is continuing; and

an action execution means for causing the second object to execute an action that can be executed on the first object selected while the movement instruction operation is continuing, in a case where the movement instruction operation is completed.

11. An information processing system comprising a terminal and a server communicatively connected to the terminal, wherein

the terminal includes:

a display medium for displaying; and

an input accepting device stacked on the display medium, and

the server includes:

an arrangement means for arranging a first object and a second object in a virtual space;

a display control means for causing the display medium to display a prescribed region of the virtual space;

a region moving means for moving the prescribed region based on an input of a continuous movement instruction operation by a player accepted by the input accepting device;

a selection means for selecting the first object when the first object is displayed at a prescribed position in the prescribed region while the movement instruction operation is continuing; and

an action execution means for causing the second object to execute an action that can be executed on the first object selected while the movement instruction operation is continuing, in a case where the movement instruction operation is completed.

Technical Field

The invention relates to an information processing program, an information processing method, an information processing apparatus, and an information processing system.

Background

In a computer game, a player performs various operations, and the game is played based on these various operations. For example, the player performs various operations such as an operation for moving an operable object, an operation for selecting another object, and an operation for causing the operable object to perform an action.

Further, in the case of a game with three-dimensional (3D) display, it is common that the player further performs an operation for moving the display area (for example, see Patent Document 1).

By enabling various operations including such an operation for moving the display area to be performed in the game, the variety of operations of the player can be increased, thereby further enhancing the complexity of the game.

Documents of the prior art

Patent Document

Patent Document 1: Japanese Patent No. 5485702

Disclosure of Invention

Problems to be solved by the invention

However, in a case where various operations, including an operation for moving the display area, are possible as described above, the player has to perform these various operations sequentially or in parallel. Therefore, the player has sometimes found these operations laborious.

The present invention has been made in view of the above circumstances, and an object of the present invention is to provide a player with simpler operations in a computer game in which a display area is movable.

Means for solving the problems

In order to achieve the above object, an information processing program according to an aspect of the present invention is an information processing program that realizes, with a computer:

an arrangement function for arranging a first object and a second object in a virtual space;

a display control function of causing a display medium to display a prescribed region of the virtual space;

a region moving function for moving the prescribed region based on an input of a continuous movement instruction operation by a player;

a selection function of selecting the first object when the first object is displayed at a prescribed position in the prescribed region while the movement instruction operation is continuing; and

an action execution function of causing the second object to execute an action that can be executed on the first object selected while the movement instruction operation is continuing, in a case where the movement instruction operation is completed.

Advantageous Effects of Invention

The present invention makes it possible to provide a player with simpler operations in a computer game in which a display area is movable.

Drawings

Fig. 1 is a block diagram showing a hardware configuration of an information processing apparatus according to an embodiment of the present invention.

Fig. 2 is a functional block diagram showing a functional configuration for executing a composite operation implementation process among the functional configurations of the information processing apparatus shown in fig. 1.

Fig. 3 is a flowchart showing the flow of the composite operation implementation process executed by the information processing apparatus of fig. 1 having the functional configuration of fig. 2.

Fig. 4 is a schematic diagram showing a presentation example of positions and the like of respective objects as an example of presentation in the composite operation implementation process.

Fig. 5 is a schematic diagram showing a presentation example of selection of a first object and the like as an example of presentation in the composite operation implementation process.

Fig. 6 is a schematic diagram showing a presentation example of an action by a second object and the like as an example of presentation in the composite operation implementation process.

Fig. 7 is a schematic diagram showing a presentation example of object information and the like as an example of presentation in the composite operation implementation process.

Fig. 8 is a schematic diagram showing another presentation example of object information and the like as an example of presentation in the composite operation implementation process.

Fig. 9 is a schematic diagram showing a presentation example of another action by a second object and the like as an example of presentation in the composite operation implementation process.

Detailed Description

Embodiments of the present invention will be described below with reference to the accompanying drawings.

[ Brief description of the embodiments ]

An object of the present embodiment is to provide a player with simpler operations in a computer game in which a display area is movable.

For this reason, in the present embodiment, a first object (e.g., a character other than a character operable by a player) and a second object (e.g., a character operable by a player) are arranged in a virtual space (e.g., a 3D space in which a computer game is played). Further, in the present embodiment, a prescribed region of the virtual space is displayed on a display medium (e.g., a touch screen).

In addition, in the present embodiment, the prescribed region is moved based on an input of a continuous movement instruction operation by the player (for example, a slide operation by the player). Further, in the present embodiment, in a case where the first object is displayed at a prescribed position in the prescribed area while the movement instruction operation is continuing, the first object is selected. Further, in the present embodiment, in the case where the movement instruction operation is completed, the second object is caused to perform an action that can be performed on the first object selected while the movement instruction operation is continuing.

According to the present embodiment configured as described above, it is possible to realize a plurality of processing steps by only a single operation (i.e., a movement instruction operation), whereas conventionally, a plurality of processing steps are realized by a combination of a plurality of operations.

Specifically, a single operation (i.e., a movement instruction operation) can realize a plurality of processing steps such as the following: (1) an operation for moving the displayed prescribed area (conventionally realized by a slide operation); (2) an operation for selecting a first object related to the execution of the action (conventionally realized by a tap operation); and (3) an operation for instructing the second object to perform the action (conventionally realized by a tap operation). Conventionally, these processing steps are realized by a combination of three operations, such as one slide operation and multiple tap operations.

Further, because the present embodiment realizes the above-described operation method, it can solve the conventional problem that the player finds it laborious to perform various operations sequentially or in parallel.

That is, the present embodiment makes it possible to provide a player with simpler operations in a computer game in which the display area is movable.

[ Hardware configuration ]

Next, the hardware configuration of the present embodiment will be described with reference to fig. 1. Fig. 1 is a block diagram showing the hardware configuration of an information processing apparatus 1 according to an embodiment of the present invention.

As shown in fig. 1, the information processing apparatus 1 includes a Central Processing Unit (CPU) 11, a Read Only Memory (ROM) 12, a Random Access Memory (RAM) 13, a bus 14, an input/output interface 15, a touch screen 16, an input unit 17, an output unit 18, a storage unit 19, a communication unit 20, and a drive 21.

The CPU 11 executes various processes in accordance with a program recorded in the ROM 12 or a program loaded from the storage unit 19 into the RAM 13.

The RAM 13 also appropriately stores data and the like necessary for the CPU 11 to execute various processes.

The CPU 11, the ROM 12, and the RAM 13 are connected to each other via the bus 14. The input/output interface 15 is also connected to the bus 14. The touch screen 16, the input unit 17, the output unit 18, the storage unit 19, the communication unit 20, and the drive 21 are connected to the input/output interface 15.

The touch screen 16 is configured by stacking a touch operation accepting unit 162 and a display unit 161.

The display unit 161 is constituted by a liquid crystal display or another type of display, and displays various images, such as images related to the game, in a form recognizable by the player.

The touch operation accepting unit 162 is configured by, for example, a capacitive or resistive-film (pressure-sensitive) position input sensor stacked on the display unit 161, and detects the coordinates of the position at which a touch operation is performed. Here, a touch operation refers to an operation of bringing an object into contact with or into proximity to the touch operation accepting unit 162. The object brought into contact with or into proximity to the touch operation accepting unit 162 is, for example, a finger of the player or a stylus pen.

Examples of the touch operation include a tap operation, a slide operation, and a flick operation. Note, however, that both the slide operation and the flick operation are a series of operations that proceeds from a state in which the object begins to contact or approach the touch screen 16 (hereinafter referred to as the "first sliding state"), through a state in which the position of the object moves while its contact with or proximity to the touch screen 16 is maintained (hereinafter referred to as the "second sliding state"), to a state in which the object no longer contacts or approaches the touch screen 16 (hereinafter referred to as the "third sliding state"). Therefore, in the following description, such a series of continuously performed moving operations will be collectively referred to as a "slide operation". That is, the "slide operation" in the following description is a broad concept including the above-described flick operation and the like, as well as the operation generally called a slide operation.
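
As an illustration of how the three sliding states described above might be tracked in code, the following is a minimal TypeScript sketch. It assumes pointer callbacks of the kind a touch framework would provide; the names SlideState and SlideTracker are hypothetical and do not appear in the embodiment.

```typescript
// Hypothetical sketch of tracking the three sliding states described above.
enum SlideState {
  None,   // no contact or proximity
  First,  // contact/approach has just started (first sliding state)
  Second, // position is moving while contact is maintained (second sliding state)
  Third,  // contact has just ended (third sliding state)
}

class SlideTracker {
  private state: SlideState = SlideState.None;
  private lastX = 0;
  private lastY = 0;

  onTouchStart(x: number, y: number): SlideState {
    this.state = SlideState.First;
    this.lastX = x;
    this.lastY = y;
    return this.state;
  }

  // Returns the movement delta accumulated while in the second sliding state.
  onTouchMove(x: number, y: number): { state: SlideState; dx: number; dy: number } {
    const dx = x - this.lastX;
    const dy = y - this.lastY;
    this.lastX = x;
    this.lastY = y;
    this.state = SlideState.Second;
    return { state: this.state, dx, dy };
  }

  onTouchEnd(): SlideState {
    this.state = SlideState.Third; // the composite operation completes here
    return this.state;
  }
}
```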

The input unit 17 is configured by various buttons, direction keys, a microphone, and the like, and accepts input of various information in accordance with an instruction operation performed by an administrator or the like of the information processing apparatus 1. Alternatively, the input unit 17 may be realized by an input device (such as a game controller, a keyboard, a mouse, or the like) that is independent of the main unit that accommodates the other units of the information processing apparatus 1.

The output unit 18 outputs sound data to a connected speaker (not shown). The speaker outputs the sound data received from the output unit 18 in a form recognizable by the player, such as music, sound effects, and voice.

The storage unit 19 is constituted by a semiconductor memory such as a Dynamic Random Access Memory (DRAM) or the like, and stores various data.

The communication unit 20 enables communication with other devices. For example, the communication unit 20 communicates with a server apparatus (not shown) from which a program for executing the game is downloaded, and with other information processing apparatuses 1 (not shown). The communication is performed, for example, via a network such as a Local Area Network (LAN), the Internet, or a mobile phone network, or a network combining these kinds of networks. Further, the communication may be performed via a relay device, or directly between devices without any intervening relay device. Note, however, that in the present embodiment the communication unit 20 is not an essential component and may be omitted from the hardware.

The drive 21 is provided as needed. A removable medium 100 formed of a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is loaded in the drive 21 as appropriate. The removable medium 100 stores the program for executing the game and various data such as image data. The program and the various data, such as image data, read from the removable medium 100 by the drive 21 are installed in the storage unit 19 as necessary.

The information processing apparatus 1 having such a hardware configuration can be realized by an electronic instrument having an information processing function, such as a smartphone or a portable game machine.

[ Functional configuration ]

Next, the functional configuration of the information processing apparatus 1 will be described with reference to fig. 2.

Fig. 2 is a functional block diagram showing the functional configuration for executing the composite operation implementation process among the functional configurations of the information processing apparatus 1 shown in fig. 1.

Here, the composite operation implementation process refers to a series of processing steps by which a single continuous movement instruction operation by the player (for example, the slide operation described earlier) is handled as an operation combining a plurality of operations (hereinafter referred to as a "composite operation"), so that a plurality of processing steps, each corresponding to one of those operations, are executed with the single operation.

In the case of executing the composite operation implementation process, as shown in fig. 2, the CPU 11 functions as an operation detection unit 111, a game execution control unit 112, an output control unit 113, a shooting area moving unit 114, an object selection unit 115, and an action execution unit 116. In addition to the processing described below, these functional blocks also perform, as necessary, the calculation processing and the exchange of information between the functional blocks required to carry out the composite operation implementation process.

Further, a game execution data storage unit 191 and a parameter storage unit 192 are provided in an area of the storage unit 19.

The operation detection unit 111 detects an input of an operation by the player accepted via the touch screen 16 or the input unit 17. Further, the operation detection unit 111 determines the content of the detected operation input by the player. For example, in a case where an input of an operation by the player is accepted via the touch screen 16, the operation detection unit 111 determines at which coordinates on the touch screen 16 the operation is accepted. In this case, the operation detection unit 111 determines the content of the operation based on, for example, a change in the coordinates at which the operation is accepted.

Further, for example, when an input of an operation from the player is accepted via the controller implementing the input unit 17, the operation detection unit 111 determines with which button or direction key the operation is accepted. In this case, the operation detection unit 111 determines the content of the operation based on the type of button or direction key with which the operation is accepted.

Then, the operation detection unit 111 outputs the content of the player's operation input, determined as described above, to the other functional blocks as appropriate.

The game execution control unit 112 controls the execution of the game as a whole by performing processing for running the game. Specifically, the game execution control unit 112 controls how the game is executed based on the game software stored in the game execution data storage unit 191, described later, and on the content of the player's operation input from the operation detection unit 111. Further, the game execution control unit 112 outputs the progress of the game under this control to the other functional blocks as appropriate.

Further, the game execution control unit 112 manages prescribed parameters that change as the game is executed. Examples of the prescribed parameters include parameters indicating the condition (such as the level and the life value) of an ally character, which is an object operable by the player, parameters indicating the items and equipment available to the ally character, and parameters indicating past game results.

These prescribed parameters are stored in a parameter storage unit 192 to be described later. Further, in the case where processing involving a change in these prescribed parameters occurs in the game (for example, processing involving an increase or decrease in the value of a parameter, or processing resulting in a change in a flag indicating the status of a parameter), the game execution control unit 112 changes the prescribed parameters based on the result of the processing. For example, in the case where a process involving a successful normal attack of an enemy character on an ally character occurs, the life value of the ally character is reduced according to the normal attack.

Further, the game execution control unit 112 updates the prescribed parameters stored in the parameter storage unit 192 to reflect these changes, and continues executing the game in accordance with the updated prescribed parameters.
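
As an illustration of the parameter management described above, the following is a minimal TypeScript sketch of a prescribed parameter (here, a life value) being reduced by a successful normal attack and then being ready to be written back to storage. The type name, field names, and damage value are illustrative assumptions.

```typescript
// Hypothetical representation of prescribed parameters for a character.
interface CharacterParams {
  level: number;
  lifeValue: number;
}

// Reduce the life value according to a normal attack; never below zero.
function applyNormalAttack(target: CharacterParams, damage: number): CharacterParams {
  return { ...target, lifeValue: Math.max(0, target.lifeValue - damage) };
}

// Example: an ally character at HP 120 hit for 35 damage ends at HP 85.
// The result would then be persisted in the parameter storage unit.
const ally: CharacterParams = { level: 5, lifeValue: 120 };
const updated = applyNormalAttack(ally, 35);
```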

It is sufficient that the game executed by the game execution control unit 112 is a game in which a plurality of objects appear; the content of the game is not particularly limited. That is, the present embodiment can be applied to any game, without limitation as to game content, game style, and the like.

As an example, the present embodiment assumes a case where the game execution control unit 112 executes a third-person shooter (TPS) in which various movable objects having three-dimensional shapes appear in a virtual space, which is a virtual three-dimensional space constructed in a prescribed global coordinate system.

In this game, the movable objects respectively have local coordinate systems set thereto, and the movable objects are arranged in the virtual space in a state where the coordinates in the respective local coordinate systems are converted into coordinates in the global coordinate system.

Among the movable objects, those not operable by the player will be referred to as "first objects" in the following description. To move a first object, the game execution control unit 112 changes the coordinates of the first object in its local coordinate system based on a prescribed algorithm. The changed coordinates in the local coordinate system are then converted into coordinates in the global coordinate system, thereby realizing movement of the first object in the virtual space.

Further, the movable object operable by the player will be referred to as the "second object". To move the second object, the game execution control unit 112 changes the coordinates of the second object in its local coordinate system based on a movement instruction operation for the second object by the player. The changed coordinates in the local coordinate system are then converted into coordinates in the global coordinate system, thereby realizing movement of the second object in the virtual space.

Further, in order to realize the third-person viewpoint, the game execution control unit 112 arranges a virtual camera in the virtual space in a manner similar to the movable objects. For example, in order to realize a viewpoint looking down at the second object from behind and above the second object's head, the game execution control unit 112 arranges the virtual camera at a position behind and above the second object's head. The arrangement position of the virtual camera moves with the movement of the second object. That is, to move the virtual camera, the game execution control unit 112 changes the coordinates of the virtual camera in its local coordinate system based on the movement instruction operation for the second object by the player. The changed coordinates in the local coordinate system are then converted into coordinates in the global coordinate system, thereby realizing movement of the virtual camera in the virtual space.
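
The local-to-global coordinate conversion and the camera-follow behavior described above might look roughly like the following TypeScript sketch. The rotation about a single vertical axis and the concrete camera offset are simplifying assumptions; a full implementation would use complete transformation matrices.

```typescript
// Minimal sketch: converting a local-coordinate offset into global (world)
// coordinates, and keeping the virtual camera behind and above the second
// object's head. The concrete offset values are illustrative assumptions.
type Vec3 = { x: number; y: number; z: number };

function add(a: Vec3, b: Vec3): Vec3 {
  return { x: a.x + b.x, y: a.y + b.y, z: a.z + b.z };
}

// Rotate a local offset around the Y (vertical) axis by the object's heading,
// then translate by the object's global position (local -> global conversion).
function localToGlobal(objectPos: Vec3, headingRad: number, localOffset: Vec3): Vec3 {
  const cos = Math.cos(headingRad);
  const sin = Math.sin(headingRad);
  const rotated: Vec3 = {
    x: localOffset.x * cos + localOffset.z * sin,
    y: localOffset.y,
    z: -localOffset.x * sin + localOffset.z * cos,
  };
  return add(objectPos, rotated);
}

// Camera placed behind (-z) and above (+y) the second object's head.
const CAMERA_LOCAL_OFFSET: Vec3 = { x: 0, y: 1.8, z: -3.0 }; // assumed values

function updateCameraPosition(secondObjectPos: Vec3, headingRad: number): Vec3 {
  return localToGlobal(secondObjectPos, headingRad, CAMERA_LOCAL_OFFSET);
}
```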

Further, the shooting direction of the virtual camera in the virtual space (i.e., the orientation of the virtual camera) changes based on the composite operation by the player.

Further, an image obtained by the virtual camera shooting in the shooting direction from its arrangement position, corresponding to the area of the virtual space determined by the angle of view of the virtual camera (hereinafter referred to as the "shooting area"), is displayed on the touch screen 16 by the output control unit 113, described later.

The output control unit 113 performs control processing for generating an image corresponding to the progress situation of the game based on the progress situation of the game controlled by the game execution control unit 112, image data stored in the game execution data storage unit 191 to be described later, the arrangement position and shooting direction of the virtual camera described above, and the coordinates of each movable object in the global coordinate system, and causing the touch screen 16 to display the generated image. That is, the output control unit 113 also functions as a drawing unit for executing processing for drawing a virtual space, respective movable objects, a user interface, and the like at the time of game execution.

In addition, the output control unit 113 performs processing for generating music, sound effects, voice, and the like for the game from the progress status of the game controlled by the game execution control unit 112 and the sound data stored in the game execution data storage unit 191, and causing a speaker connected to the output unit 18 to output the generated sound and the like.

The shooting area moving unit 114 performs processing for moving the shooting area shot by the virtual camera by changing the shooting direction of the virtual camera in the virtual space based on the composite operation by the player. For this purpose, the shooting area moving unit 114 monitors the content of the player's operation input from the operation detection unit 111. When the player performs a composite operation, the shooting area moving unit 114 outputs an instruction for changing the shooting direction of the virtual camera to the other functional blocks. Here, the composite operation is the slide operation described earlier. The shooting area moving unit 114 outputs the instruction to the other functional blocks so that the shooting direction of the virtual camera is changed to a direction corresponding to the change in the coordinates of the touched position in the second sliding state.

Accordingly, the game execution control unit 112 changes the shooting direction of the virtual camera in the virtual space. Further, the output control unit 113 generates an image corresponding to the shooting area shot in the changed shooting direction, and causes the touch screen 16 to display the image. Thus, movement of the shooting area shot by the virtual camera is realized.
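
As a sketch of how the shooting area moving unit 114 might map the coordinate change in the second sliding state to a change in the shooting direction, consider the following TypeScript fragment. The sensitivity constants and the pitch clamp are illustrative assumptions, not values from the embodiment.

```typescript
// Sketch: changing the virtual camera's shooting direction from the slide
// delta detected in the second sliding state.
const YAW_SENSITIVITY = 0.005;   // radians per pixel of horizontal slide (assumed)
const PITCH_SENSITIVITY = 0.005; // radians per pixel of vertical slide (assumed)

interface CameraDirection {
  yaw: number;   // horizontal component of the shooting direction, radians
  pitch: number; // vertical component of the shooting direction, radians
}

function applySlideDelta(dir: CameraDirection, dx: number, dy: number): CameraDirection {
  const pitchLimit = Math.PI / 2 - 0.01; // keep the camera from flipping over
  return {
    yaw: dir.yaw + dx * YAW_SENSITIVITY,
    pitch: Math.max(-pitchLimit, Math.min(pitchLimit, dir.pitch + dy * PITCH_SENSITIVITY)),
  };
}
```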

The object selection unit 115 performs processing for selecting a first object based on the composite operation by the player. For this purpose, the object selection unit 115 monitors the content of the player's operation input from the operation detection unit 111. When the player performs a composite operation, the object selection unit 115 outputs an instruction to the other functional blocks so that a cursor for selecting a first object is displayed in the first sliding state. Accordingly, the output control unit 113 generates an image corresponding to the cursor (for example, an image of a black circle), and causes the touch screen 16 to display it. The cursor image is superimposed on the image corresponding to the shooting area at a prescribed position of the shooting area shot by the virtual camera, for example, at the central position of the shooting area.

Further, the object selection unit 115 outputs an instruction for selecting the first object to the other functional blocks in a case where, in the second sliding state, the shooting area moves as a result of the processing of the shooting area moving unit 114, or the first object itself moves, such that the first object comes to be displayed at the prescribed position where the cursor image is displayed.
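
One plausible way to test whether a first object is "displayed at the prescribed position" is to project its global coordinates to screen coordinates and measure the distance from the cursor position, as in the following hypothetical TypeScript sketch; the projection function and the tolerance radius are assumptions, not details taken from the embodiment.

```typescript
// Sketch: testing whether a first object is displayed at the prescribed
// position where the cursor is shown.
type Vec3 = { x: number; y: number; z: number };
type ScreenPoint = { x: number; y: number };

const SELECTION_RADIUS_PX = 24; // assumed tolerance around the cursor position

function isDisplayedAtCursor(
  objectGlobalPos: Vec3,
  cursorPos: ScreenPoint,
  // Supplied by the rendering layer: returns screen coordinates, or null
  // when the position falls outside the shooting area.
  project: (p: Vec3) => ScreenPoint | null,
): boolean {
  const screen = project(objectGlobalPos);
  if (screen === null) return false; // not within the shooting area at all
  const dx = screen.x - cursorPos.x;
  const dy = screen.y - cursorPos.y;
  return Math.hypot(dx, dy) <= SELECTION_RADIUS_PX;
}
```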

Accordingly, the game execution control unit 112 sets the first object to the selected state. Further, the game execution control unit 112 determines the information regarding the selected first object to be displayed on the touch screen 16 (hereinafter referred to as "object information"). The kind of information to be displayed as the object information is not limited.

For example, the game execution control unit 112 determines that information indicating an action executable by the second object on the selected first object is to be displayed as the object information. The action executable by the second object may be a single fixed action, or may differ depending on the selected first object or on the items with which the second object is equipped; this can be defined arbitrarily by the game developer. Alternatively, the game execution control unit 112 may determine that information indicating the status or the like of the selected first object is to be displayed as the object information. For example, the game execution control unit 112 determines that information representing the name of the first object and a parameter of the first object (for example, the current life value of the first object) is to be displayed as the object information.

Accordingly, the output control unit 113 causes the touch screen 16 to display the object information that the game execution control unit 112 has determined is to be displayed. The object information is superimposed on the image corresponding to the shooting area at a prescribed position of the shooting area shot by the virtual camera, for example, in the vicinity of the display position of the cursor. In this case, in order for the player to recognize that the first object is currently selected, the output control unit 113 may, for example, change the cursor image to a different image (for example, an image surrounding the first object).

The action execution unit 116 executes processing for causing the second object to execute the executable action based on the composite operation by the player. For this purpose, the action execution unit 116 monitors the content of the player's operation input from the operation detection unit 111. In a case where the player performs a composite operation and the third sliding state is reached while a first object is selected, the action execution unit 116 outputs an instruction to the other functional blocks so that the second object executes the executable action. Accordingly, the game execution control unit 112 causes the second object to execute the executable action. Further, the output control unit 113 generates an image corresponding to the action performed by the second object (for example, an image representing the motion of the second object performing the action, an image representing the motion of the first object in response to the action, or an image representing an effect associated with the action), and causes the touch screen 16 to display the image.

Here, as described above, the action executable by the second object may differ depending on the selected first object. For example, in a case where the selected first object is an enemy character, the action is an action performed on the enemy character that is distinct from simple movement of a game character, such as an attack with a sword, a continuous attack with a sword, a shot with an arrow, continuous shots with arrows, an attack with magic, a continuous attack with magic, a defense with a shield, or a strike with a shield. Alternatively, the action may be an action of readying (or equipping) a sword, arrows, or a shield in order to attack the enemy character. Note that these are merely examples; for example, in a case where the selected first object is a character other than an enemy character, or an item or the like, the action may be an action of talking with that character, acquiring that item, or the like.
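
A simple way to realize such per-object actions is a lookup table keyed by the kind of the selected first object, as in the following TypeScript sketch. The table entries mirror the examples given above, but the table itself, and the object kinds, are illustrative assumptions.

```typescript
// Sketch: choosing the executable action from the kind of the selected
// first object.
type ObjectKind = 'enemy' | 'npc' | 'item' | 'tree';

const EXECUTABLE_ACTIONS: Record<ObjectKind, string> = {
  enemy: 'equip weapon', // ready a weapon to attack the enemy character
  npc: 'talk',           // talk with the character
  item: 'pick up',       // acquire the item
  tree: 'equip axe',     // ready an axe to chop the tree
};

function executableActionFor(kind: ObjectKind): string {
  return EXECUTABLE_ACTIONS[kind];
}
```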

As described above, the composite operation implementation process is performed through cooperation between the respective functional blocks, which makes it possible to handle a single slide operation as a composite operation. Therefore, as described earlier, a plurality of processing steps conventionally realized by a combination of three operations, such as one slide operation and multiple tap operations, can be realized by only a single slide operation.

The game execution data storage unit 191 stores various data necessary for the game execution control unit 112 to execute the game. Examples of such data include the game software, which is the program for executing the game, and the image data, music data, and sound data for generating game images and sounds. Further, in a case where at least some of the characters and the background are displayed using three-dimensional computer graphics, as in the present embodiment, the game execution data storage unit 191 also stores polygon data, texture information, and the like for realizing rendering based on three-dimensional computer graphics.

As described above in the description of the game execution control unit 112, the parameter storage unit 192 stores the prescribed parameters.

Note that although these various data for running the game may simply be stored in the game execution data storage unit 191 of the storage unit 19 in advance, the data may also be read from the removable medium 100 by the drive 21 as appropriate. Alternatively, these various data may be transmitted to the information processing apparatus 1 from a server apparatus (not shown) or another information processing apparatus 1 (not shown) by communication via the communication unit 20. That is, these various data can be downloaded as needed when the game is installed or updated.

[ Operation and presentation examples ]

Next, the operation of the composite operation implementation process performed by the information processing apparatus 1 will be described with reference to fig. 3. Fig. 3 is a flowchart explaining the flow of the composite operation implementation process. In addition, transitions between presentation examples on the touch screen 16 associated with this operation will be described with reference to fig. 4 to fig. 7, which are schematic diagrams showing these transitions.

The composite operation implementation process is executed in response to the player's operation of starting the game. The respective functional blocks described above perform, as appropriate, the processes necessary for the composite operation implementation process, including processes not specifically mentioned in the following description of the flowchart of fig. 3.

In step S11, the object selection unit 115 determines whether or not a composite operation by the player has started. In a case where the composite operation has started, the determination in step S11 results in yes, and the process advances to step S12. On the other hand, in a case where the composite operation has not started, the determination in step S11 results in no, and the determination in step S11 is repeated.

In step S12, the shooting area moving unit 114 starts processing for moving the shooting area based on the composite operation.

In step S13, the output control unit 113 starts processing for displaying a cursor for selecting the first object.

An example of presentation on the touch screen 16 in this case will be explained with reference to fig. 4. First, at a timing before the composite operation starts (i.e., a timing at which the determination in step S11 results in no), for example, the presentation shown in part (A) of fig. 4 is provided. That is, the virtual space, the first object OB11, and the second object OB21 are displayed as the image corresponding to the shooting area. Further, a software button B simulating direction keys, for accepting a movement instruction operation for the second object OB21 from the player, is displayed at a prescribed position. When the player touches a position corresponding to a prescribed direction on the software button B, the operation detection unit 111 determines that the content of the operation is a movement instruction operation in that direction. Then, the game execution control unit 112 moves the second object OB21 based on the movement instruction operation. Note that, as described earlier, when the second object OB21 moves, the arrangement position of the virtual camera also moves.

Further, when the composite operation has started and steps S12 and S13 have been performed, for example, the presentation shown in part (B) of fig. 4 is provided. That is, the shooting area moves based on the second sliding state of the composite operation performed with the player's finger. Note that the player's finger F and an arrow are schematically shown in these figures to illustrate how the display area moves when the player performs the composite operation. Further, in addition to the presentation in part (A) of fig. 4, a cursor image C is displayed at the central portion of the shooting area as the image corresponding to the cursor.

Referring back to fig. 3, in step S14, the object selection unit 115 determines whether a first object is displayed at the prescribed position where the cursor image C is displayed, as a result of movement of the shooting area or movement of the first object. In a case where a first object is displayed at the prescribed position where the cursor image C is displayed, the determination in step S14 results in yes, and the process advances to step S16. On the other hand, in a case where no first object is displayed at the prescribed position where the cursor image C is displayed, the determination in step S14 results in no, and the process advances to step S15.

In step S15, the action execution unit 116 determines whether the composite operation has passed through the third sliding state and is thus completed. In a case where the composite operation has completed, the determination in step S15 results in yes, and the process ends. On the other hand, in a case where the second sliding state is maintained and the composite operation has not completed, the determination in step S15 results in no, and the process returns to step S14 and is repeated from the determination in step S14.

In step S16, the object selection unit 115 starts processing for selecting the first object displayed at the prescribed position where the cursor image C is displayed.

In step S17, the object selection unit 115 starts processing for displaying object information corresponding to the first object displayed at the prescribed position where the cursor image C is displayed.

An example of presentation on the touch screen 16 in this case will be explained with reference to fig. 5. As a result of performing steps S16 and S17, for example, the presentation shown in part (A) of fig. 5 is provided. That is, in addition to the presentation in part (B) of fig. 4, in order for the player to recognize that the first object OB11 is currently selected with the cursor image C, the cursor image C is changed to an image surrounding the first object OB11 and is displayed as such. Further, object information IN11 and object information IN12 are displayed as the object information. Here, the object information IN11 is information indicating the status or the like of the first object OB11; as an example, the text "broad-leaved tree" indicating the name of the first object OB11 is displayed in the figure. Further, the object information IN12 is information representing an action performable by the second object OB21 on the selected first object OB11; as an example, the text "arm", representing an action performable by the second object OB21, is displayed in the figure. This means that the second object OB21 can perform an action of equipping an axe in preparation for chopping the first object OB11.

As another example, in a case where another first object, the first object OB12, is selected, the presentation shown in part (B) of fig. 5 is provided. That is, in order for the player to recognize that the first object OB12, instead of the first object OB11 in the presentation of part (A) of fig. 5, is currently selected with the cursor image C, the cursor image C is changed to an image surrounding the first object OB12 and is displayed as such. Further, object information IN21 and object information IN22 are displayed as the object information. Here, the object information IN21 is information indicating the status or the like of the first object OB12; as an example, the text "wild boar" indicating the name of the first object OB12 and the text "HP: 120" indicating the current life value of the first object OB12 are displayed in the figure. Further, the object information IN22 is information representing an action that the second object OB21 can perform on the selected first object OB12; as an example, the text "arm weapon", representing an action performable by the second object OB21, is displayed in the figure. This means that the second object OB21 can perform an action of equipping a weapon in preparation for an attack on the first object OB12.

Referring back to fig. 3, in step S18, the object selection unit 115 determines whether the first object has come to be displayed outside the prescribed position where the cursor image C is displayed, as a result of movement of the shooting area or movement of the first object. In a case where the first object is displayed outside the prescribed position where the cursor image C is displayed, the determination in step S18 results in yes, and the process advances to step S21. On the other hand, in a case where the first object is still displayed at the prescribed position where the cursor image C is displayed, the determination in step S18 results in no, and the process advances to step S19.

In step S19, the action execution unit 116 determines whether the composite operation has passed through the third sliding state and is thus completed. In a case where the composite operation has completed, the determination in step S19 results in yes, and the process advances to step S20. On the other hand, in a case where the second sliding state is maintained and the composite operation has not completed, the determination in step S19 results in no, and the process returns to step S18 and is repeated from the determination in step S18.

In step S20, the action execution unit 116 executes processing for causing the second object to execute the executable action. Then, the process ends.

An example of presentation on the touch screen 16 in this case will be explained with reference to fig. 6. Here, the presentation shown in part (A) of fig. 6 is the same as the presentation shown in part (B) of fig. 5. In this case, as a result of performing step S20, for example, the presentation shown in part (B) of fig. 6 is provided. That is, the display state of the second object OB21 changes to a state in which the second object OB21 is equipped with a weapon (here, a sword). In the figure, the player's finger F is schematically shown by a broken line to illustrate the state in which the composite operation has passed through the third sliding state and is completed.

Further, when the operations up to step S20 are performed in a composite operation implementation process newly performed thereafter (i.e., when the first object OB12 is selected again and the second object OB21 is caused to perform the executable action), for example, as shown in part (C) of fig. 6, an attack on the first object OB12 with the weapon equipped by the second object OB21 is performed, whereby the life value of the first object OB12 is reduced.

As described above, merely by the composite operation performed by the player, processing such as the following can be realized: displaying the object information of the selected first object, equipping a weapon corresponding to the selected first object, performing an attack on the first object, and moving the shooting area corresponding to the display area. That is, simpler operations can be provided to the player in a computer game in which the display area is movable.

Referring back to fig. 3, in step S21, the object selection unit 115 ends the processing for selecting the first object displayed at the prescribed position where the cursor image C is displayed. The cursor image C is then changed from the image surrounding the first object back to the black circle image. On the other hand, part of the object information whose display was started in step S17 continues to be displayed.

In step S22, the object selection unit 115 determines whether a first object (which may be the first object that was once selected and whose object information is currently displayed, or may be another first object) is displayed again at the prescribed position where the cursor image C is displayed, as a result of movement of the shooting area or movement of the first object. In a case where a first object is displayed again at the prescribed position where the cursor image C is displayed, the determination in step S22 results in yes, and the process returns to step S16 and is repeated. On the other hand, in a case where no first object is displayed again at the prescribed position where the cursor image C is displayed, the determination in step S22 results in no, and the process advances to step S23.

In step S23, the action execution unit 116 determines whether the composite operation has passed through the third sliding state and is thus completed. In a case where the composite operation has completed, the determination in step S23 results in yes, and the process ends. On the other hand, in a case where the second sliding state is maintained and the composite operation has not completed, the determination in step S23 results in no, and the process advances to step S24.

In step S24, the object selection unit 115 determines whether the first object that was once selected and whose object information is currently displayed has come to be located outside the shooting area, as a result of movement of the shooting area or movement of the first object. In a case where the first object is located outside the shooting area, the determination in step S24 results in yes, and the process advances to step S25. On the other hand, in a case where the first object is located within the shooting area, the determination in step S24 results in no, and the process returns to step S22 and is repeated.

In step S25, the processing for displaying the currently displayed object information ends. Then, the process returns to step S14 and is repeated.

An example of presentation on the touch screen 16 in this case will be explained with reference to fig. 7. As shown in part (A) of fig. 7, in this example, the following case is assumed: the first object OB13 is selected in step S16, and the object information IN31 and the object information IN32 are displayed as the object information in step S17. Here, the object information IN31 is information indicating the status or the like of the first object OB13; as an example, text indicating the name of the first object OB13 is displayed in the figure. Further, the object information IN32 is information representing an action that the second object OB21 can perform on the selected first object OB13; as an example, the text "talk", representing an action performable by the second object OB21, is displayed in the figure. This means that the second object OB21 can perform an action of talking with the first object OB13.

In this case, as a result of performing step S21, for example, the presentation shown in part (B) of fig. 7 is provided. That is, the cursor image C is changed from the image surrounding the first object back to the black circle image. On the other hand, of the object information whose display was started in step S17, the object information IN31 continues to be displayed. Therefore, the player can continue to refer to the object information IN31. However, since the first object OB13 is no longer in the selected state, the second object OB21 cannot perform the action of talking with the first object OB13. Thus, the display of the object information IN32 is terminated.

In this case, when the first object OB13 has come to be located outside the shooting area and step S25 is further performed, the display of the object information IN31 is also terminated. That is, the display of all object information is terminated. The presentation therefore becomes the same as that in part (B) of fig. 4, and the above-described process can be repeated for a new first object.
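
As a rough summary of the flow of fig. 3, the following TypeScript sketch shows how the second and third sliding states might drive the processing described above. All type names and callbacks are hypothetical, and details such as the partial retention of object information after step S21 (steps S22 to S25) are omitted.

```typescript
// Hypothetical hooks corresponding to the flowchart steps of fig. 3.
interface CompositeHooks {
  moveShootingArea(dx: number, dy: number): void; // step S12: move the shooting area
  objectAtCursor(): string | null;                // steps S14/S18/S22: object at the cursor?
  select(id: string): void;                       // steps S16/S17: select and show object info
  deselect(): void;                               // step S21: end the selection
  executeAction(id: string): void;                // step S20: perform the executable action
}

// Called each frame while the second sliding state continues; returns the
// identifier of the currently selected first object (or null).
function onSecondSlidingState(h: CompositeHooks, selected: string | null,
                              dx: number, dy: number): string | null {
  h.moveShootingArea(dx, dy);
  const hit = h.objectAtCursor();
  if (hit !== null && hit !== selected) h.select(hit);
  if (hit === null && selected !== null) h.deselect();
  return hit;
}

// Called once when the third sliding state is reached (operation completed).
function onThirdSlidingState(h: CompositeHooks, selected: string | null): void {
  if (selected !== null) h.executeAction(selected);
}
```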

[ Modified examples ]

Although the embodiment of the present invention has been described above, the embodiment is merely an example, and does not limit the technical scope of the present invention. The present invention may be embodied in various other forms, and various modifications such as omissions and substitutions may be made without departing from the spirit of the invention. These embodiments and modifications thereof are included in the scope and spirit of the present invention disclosed in the present specification and the like, and are included in the scope of the present invention described in the claims and equivalents thereof.

For example, the embodiment of the present invention may be modified as in the following modification examples. Further, the following modifications may be combined with each other.

< First modification >

In the above-described embodiment, the object selection unit 115 performs steps S16 and S17 in a case where it is determined in step S14 that the first object is displayed at the prescribed position where the cursor image C is displayed as a result of movement of the shooting area or movement of the first object. That is, the object selection unit 115 performs the processing for selecting the first object and the processing for displaying the object information corresponding to the first object in a case where the first object exists within a range that lies in the shooting direction of the virtual camera and is included in the shooting area. However, without being limited to this embodiment, the processing may be performed in consideration of other criteria. For example, the processing may also be performed in consideration of the distance between the position of the virtual camera and the position of the first object.

For example, the process may be configured such that, in a case where the first object is displayed at the prescribed position where the cursor image C is displayed and the distance between the position of the virtual camera and the position of the first object is within the prescribed distance, the determination in step S14 results in yes, and steps S16 and S17 are performed.

Conversely, the process may be configured such that, in a case where the first object is displayed at the prescribed position where the cursor image C is displayed but the distance between the position of the virtual camera and the position of the first object is not within the prescribed distance, the determination in step S14 results in no, and steps S16 and S17 are not performed. Alternatively, as another variation, the process may be configured such that, in that same case, the determination in step S14 results in yes, but only one of steps S16 and S17 is executed. For example, the process may be configured such that, although the processing for selecting the first object in step S16 is performed, the processing for displaying the object information corresponding to the first object in step S17 is not performed.

This makes it possible to prevent steps S16 and S17 from being performed for a first object that is located too far away. For this purpose, the prescribed distance used for the judgment should be, for example, a distance at which no unnatural impression arises even when the second object performs the executable action. This makes it possible to prevent, for example, the second object from performing an attacking action with a sword on a first object that is located too far away for the sword to reach.

Note that the distance between the position of the virtual camera and the position of the first object used for the determination may be calculated from the difference between the coordinates of the virtual camera and the coordinates of the first object, each obtained by converting coordinates in the local coordinate system into coordinates in the global coordinate system.

Further, the above modification may be modified further. For example, instead of the distance between the position of the virtual camera and the position of the first object, processing similar to that in the above-described modification may be performed based on the distance between the position of the second object and the position of the first object. Also in this case, as described above, there is the advantage that the second object can be prevented from performing an attacking action with a sword on a first object that is located so far away that the sword cannot reach it.
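
The following is a minimal Python sketch of this distance criterion. The class names, object attributes, helper functions, and the value of the prescribed distance are assumptions introduced for illustration only and are not the actual implementation; the reference position may be either the virtual camera or the second object, as described above.

```python
import math
from dataclasses import dataclass

# The prescribed distance used for the judgment; as an assumption, a
# distance the second object's sword can plausibly reach.
PRESCRIBED_DISTANCE = 10.0

@dataclass
class Placed:
    local_position: tuple  # coordinates in the object's local coordinate system
    local_origin: tuple    # origin of that local system, in global coordinates

def to_global(placed: Placed) -> tuple:
    # Convert local-coordinate-system coordinates into the global coordinate
    # system (modeled here as a simple translation by the local origin).
    return tuple(c + o for c, o in zip(placed.local_position, placed.local_origin))

def within_prescribed_distance(reference: Placed, first_object: Placed) -> bool:
    # Extension of step S14: the reference may be the virtual camera (this
    # modification) or the second object (the further modification above).
    return math.dist(to_global(reference), to_global(first_object)) <= PRESCRIBED_DISTANCE

# Usage in step S14 (pseudocode; the called functions are hypothetical):
# if displayed_at_cursor(first_object) and within_prescribed_distance(camera, first_object):
#     select_object(first_object)                # step S16
#     display_object_information(first_object)   # step S17
```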

< second modification >

In the above-described embodiment, a sliding operation performed by the player on the touch screen 16 is handled as a composite operation, and a plurality of processing steps are executed based on that single composite operation. However, without being limited to this embodiment, other operations by the player may also be handled as a composite operation. For example, an operation for moving the photographing area other than the sliding operation (one whose start and end can be recognized) may be handled as a composite operation.

For example, in the case of an operation with an operating mechanism (such as a cross key or an analog stick) of a controller connected to the input unit 17, when the start of an input via one of these operating mechanisms is detected, the processing is executed by regarding this as detection of the aforementioned first sliding state. While a moving operation in a prescribed direction via one of these operating mechanisms is continuously detected, the processing is executed by regarding this as detection of the aforementioned second sliding state. Furthermore, at the timing when the detection of the moving operation in the prescribed direction via one of these operating mechanisms has ended (or the timing when a prescribed time has elapsed since the end of the detection), the processing is executed by regarding this as detection of the aforementioned third sliding state. Also in this case, the above-described composite-operation implementing process can be implemented.

Alternatively, the above-described composite-operation implementing process can be implemented even in the case of an operation using a plurality of devices. For example, it can be implemented even when an operation for moving the second object or the like is performed by using a keyboard or the like connected to the input unit 17, and an operation for moving the photographing area is performed by physically moving a pointing device (for example, a mouse) connected to the input unit 17. In this case, when the movement of the pointing device has started, the processing is executed by regarding this as detection of the aforementioned first sliding state. While movement of the pointing device in a prescribed direction is continuously detected, the processing is executed by regarding this as detection of the aforementioned second sliding state. Furthermore, at the timing when the detection of the movement of the pointing device in the prescribed direction has ended (or the timing when a prescribed time has elapsed since the end of the detection), the processing is executed by regarding this as detection of the aforementioned third sliding state.

Further, the above-described composite-operation implementing process can also be implemented in the case of a controller that detects, with an acceleration sensor or a gyro sensor, movement or tilt applied by the player to the controller itself. In this case, the detection of movement of the pointing device described above should be replaced with the detection of movement or tilt of the controller itself.
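
As an illustration only, the device-independent handling described above might be organized as in the following Python sketch. The event structure and its fields are hypothetical; each input source (touch slide, cross key, analog stick, pointing device, or motion sensor) would be adapted to fill them in.

```python
from enum import Enum, auto

class SlideState(Enum):
    FIRST = auto()   # an input has just started
    SECOND = auto()  # movement in a prescribed direction is continuing
    THIRD = auto()   # movement has ended (or a prescribed time has elapsed)

class InputEvent:
    # Hypothetical device-independent event: a touch slide, a stick tilt, a
    # mouse movement, or a controller tilt would each be translated into
    # these three flags by its own adapter.
    def __init__(self, started: bool = False, moving: bool = False, ended: bool = False):
        self.started = started
        self.moving = moving
        self.ended = ended

def classify(event: InputEvent):
    # Map an input event onto the three sliding states used by the
    # composite-operation implementing process.
    if event.started:
        return SlideState.FIRST
    if event.moving:
        return SlideState.SECOND
    if event.ended:
        return SlideState.THIRD
    return None
```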

< third modification >

In the above-described embodiment, it is assumed that the second object can perform only one action on the first object selected by the object selection unit 115. However, without being limited to this embodiment, the second object may be capable of performing a plurality of actions. This will be explained with reference to fig. 8.

Fig. 8 shows a presentation example of object information, depicting a situation similar to those in parts (A) and (B) of fig. 5 and part (A) of fig. 7. The presentation shown in fig. 8 is provided as a result of performing steps S16 and S17, for example.

That is, in addition to the presentation in part (B) of fig. 4, in order for the player to recognize that the first object OB14 is currently selected with the cursor image C, the cursor image C is changed to an image surrounding the first object OB14 and is displayed. Further, object information IN41, object information IN42, and object information IN43 are displayed as the object information. Here, the object information IN41 is information indicating the status or the like of the first object OB14; as an example, the text "fire" indicating the name of the first object OB14 is displayed in the figure. Further, the object information IN42 and the object information IN43 are each information indicating an action that the second object OB21 can perform on the selected first object OB14. As an example, the text "heating" and the text "cooking", indicating actions executable by the second object OB21, are shown in the figure.

In this modification, as described above, there are a plurality of actions that the second object can execute, and each action is displayed as object information. When the player then performs an operation for selecting one of the actions, the second object is caused to perform the selected action. For example, after it is determined in step S23 that the composite operation has passed through the third sliding state and is completed, the player performs a tap operation on the object information corresponding to the action to be selected. The second object is then caused to perform the selected action based on the tap operation.

As described above, by allowing a plurality of actions to be performed, the player's options can be increased while the above-described composite-operation implementing process is still realized.
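
A minimal sketch of this tap-based action selection follows. The handler names, the display object, and the action structure are all hypothetical, not part of the embodiment itself.

```python
def on_composite_operation_completed(first_object, second_object, display):
    # After step S23 (third sliding state detected, composite operation
    # complete), list the executable actions as object information.
    actions = second_object.executable_actions(first_object)  # e.g. ["heating", "cooking"]
    for action in actions:
        display.show_object_information(action)  # corresponds to IN42, IN43, ...

def on_tap(tapped_action, first_object, second_object):
    # Cause the second object to perform the tapped action on the first
    # object selected while the movement instruction operation continued.
    second_object.perform(tapped_action, target=first_object)
```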

Further, as another modification, in the case where there are a plurality of actions that the second object can perform, one of the actions may be selected based on the condition of the second object, for example. For instance, one of these actions may be selected based on the category of weapon that the second object owns (or is equipped with). In the case where the second object is equipped with a sword, as described with reference to part (B) of fig. 6, an attack using the sword is selected as the action that the second object performs. On the other hand, fig. 9 shows another presentation example in which, for example, the second object is equipped with a bow and arrows. In the case where the second object is equipped with a bow and arrows as shown in fig. 9, an attack using the bow and arrows is selected as the action that the second object performs. As described above, by selecting one of these actions based on the category of weapon that the second object owns (or is equipped with), the operation for selecting an action can be omitted while the above-described composite-operation implementing process is still realized.
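
A sketch of this automatic selection might look as follows; the weapon categories and action names are assumptions based only on the examples in figs. 6 and 9.

```python
# Hypothetical mapping from the equipped weapon category to the action that
# is selected automatically, so that no explicit selecting operation is needed.
ACTION_BY_WEAPON_CATEGORY = {
    "sword": "attack_with_sword",        # the case of part (B) of fig. 6
    "bow_and_arrow": "attack_with_bow",  # the case of fig. 9
}

def select_action(second_object):
    # Choose one executable action from the condition of the second object
    # (here, the category of the weapon it is equipped with).
    return ACTION_BY_WEAPON_CATEGORY.get(second_object.equipped_weapon_category)
```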

< fourth modification >

In the above-described embodiment, it is assumed that the functional blocks for executing the composite-operation implementing process are implemented by a single apparatus (i.e., the information processing apparatus 1). However, without being limited to this embodiment, the functional blocks for executing the composite-operation implementing process may be implemented through cooperation among a plurality of apparatuses. For example, some of the functional blocks implemented by the information processing apparatus 1 may be distributed to a server apparatus, whereby these functional blocks are implemented in the form of a client-server system. In this case, the server apparatus may be a single server apparatus or a combination of a plurality of server apparatuses, such as a cloud server.

In this configuration, the information processing apparatus 1 and the server apparatus communicate as appropriate to transmit and receive information. The information processing apparatus 1 transmits an operation accepted from the player via its touch screen 16 to the functional blocks of the server apparatus. The functional blocks of the server apparatus execute processing based on the operation received from the information processing apparatus 1, and the server apparatus transmits the result of the processing to the information processing apparatus 1. The information processing apparatus 1 then displays an image based on the received result on its touch screen 16.

Also in this case, the above-described composite-operation implementing process can be implemented.
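
As an illustrative sketch only, the exchange above could be expressed with a simple line-delimited JSON protocol. The message format and the game-logic call are assumptions, not the actual implementation.

```python
import json
import socket

def client_send_operation(sock: socket.socket, operation: dict) -> dict:
    # Information processing apparatus 1 side: forward the operation accepted
    # via the touch screen 16 and return the result to be displayed.
    sock.sendall(json.dumps(operation).encode() + b"\n")
    return json.loads(sock.makefile().readline())

def server_handle_one(sock: socket.socket, game_state) -> None:
    # Server apparatus side: execute the processing based on the received
    # operation (game_state.apply is a hypothetical game-logic call) and
    # send back the result of the processing.
    operation = json.loads(sock.makefile().readline())
    result = game_state.apply(operation)
    sock.sendall(json.dumps(result).encode() + b"\n")
```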

< fifth modification >

The above-described embodiment has been explained assuming a game in which a single player participates. However, without being limited to this embodiment, the above-described embodiment is also applicable to a multiplayer game in which a plurality of players participate. In this case, a plurality of second objects respectively corresponding to the plurality of players are provided in a single virtual space, and each player operates his or her own second object. The above-described composite-operation implementing process is then executed for each operation of each second object. In this case, the second object operated by a certain player may be handled as a first object for the other players.
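
A brief sketch of this object relationship, using hypothetical classes and fields:

```python
# In a shared virtual space, the second object of one player is included
# among the first objects visible to every other player.
def first_objects_for(player, all_players, environment_objects):
    other_second_objects = [p.second_object for p in all_players if p is not player]
    return environment_objects + other_second_objects
```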

< sixth modification >

In the above-described embodiment, processing such as the following can be realized only through the composite operation performed by the player: displaying object information of the selected first object, equipping a weapon corresponding to the selected first object, performing an attack on the first object, and moving the photographing area corresponding to the display area. However, without being limited to this embodiment, these kinds of processing may also be realized when the player performs a separate operation other than the composite operation. That is, while a certain process can be realized by the composite operation, the operation for realizing that process need not be limited to the composite operation, and the process may also be realized by other operations.

For example, in a case where an operation other than the composite operation (for example, a flick operation) is accepted via the touch screen 16, processing such as the following may be executed: displaying object information of the selected first object, equipping a weapon corresponding to the selected first object, and performing an attack on the first object. Alternatively, these kinds of processing may be executed in a case where an operation is accepted via one of the various buttons, direction keys, and the like included in the input unit 17.
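
For illustration, such a binding of processes to multiple operations might be organized as below; the process and trigger names are hypothetical placeholders.

```python
# Each process can be driven by the composite operation or, per this
# modification, by a separate operation bound to it.
PROCESSES = {
    "show_object_information": lambda obj: print(f"show info for {obj}"),
    "equip_corresponding_weapon": lambda obj: print(f"equip weapon for {obj}"),
    "attack_first_object": lambda obj: print(f"attack {obj}"),
}

TRIGGERS = {
    # the composite operation drives all three processes in sequence
    "composite_operation": ["show_object_information",
                            "equip_corresponding_weapon",
                            "attack_first_object"],
    # separate operations drive individual processes
    "flick_operation": ["show_object_information"],
    "button_press": ["attack_first_object"],
}

def handle_operation(trigger: str, selected_object) -> None:
    for name in TRIGGERS.get(trigger, []):
        PROCESSES[name](selected_object)
```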

< other modifications >

The embodiment and some modifications of the present invention have been described above. However, it is to be noted that the present invention is not limited to the above-described embodiment and modifications; the present invention includes modifications, improvements, and the like within a range in which the object of the present invention can be achieved.

Further, the series of processing steps described above may be executed by hardware or by software.

In other words, the functional structure shown in fig. 2 is merely an example and is not particularly limited. That is, it is sufficient that the information processing apparatus 1 is provided with functions that make it possible to execute the series of processing steps described above as a whole, and the choice of specific functional blocks for realizing these functions is not limited to the example in fig. 2.

Further, the functional blocks may be realized by separate hardware, by separate software, or by a combination of hardware and software.

The functional structure in the embodiment is realized by using a processor that performs calculation processing. The processors usable in the present embodiment include not only a processor constituted by one of various processing apparatuses, such as a single processor, a multiprocessor, or a multi-core processor, but also a combination of one of these various processing apparatuses and a processing circuit such as an Application Specific Integrated Circuit (ASIC) or a Field Programmable Gate Array (FPGA).

In the case where a series of processing steps is executed by software, a program constituting the software is installed in a computer or the like from a network or a recording medium.

The computer may be a computer embedded in special-purpose hardware. Alternatively, the computer may be a computer capable of executing various functions when various programs are installed therein, such as a general-purpose personal computer.

The recording medium containing such a program is realized by the removable medium 100 in fig. 1, which is distributed separately from the main unit of the apparatus in order to provide the program to the player, or by a recording medium or the like provided to the player in a state of being embedded in the main unit of the apparatus in advance. The removable medium 100 is implemented, for example, by a magnetic disk (including a floppy disk), an optical disc, or a magneto-optical disk. The optical disc is realized by, for example, a compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), or a Blu-ray (registered trademark) disc. The magneto-optical disk is realized by, for example, a MiniDisc (MD). Further, the recording medium provided to the player in a state embedded in the main unit of the apparatus in advance is realized by, for example, the ROM 12 in fig. 1 or the semiconductor memory included in the storage unit 19 in fig. 1, in which the program is recorded.

Note that, in this specification, the steps defining the program recorded in the recording medium include not only processes executed sequentially in order but also processes executed in parallel or individually, which are not necessarily executed sequentially. Further, the steps performed according to the program recorded in the recording medium may be performed in any order within a scope not departing from the spirit of the present invention.

Description of the reference numerals

1 information processing apparatus

11 CPU

12 ROM

13 RAM

14 bus

15 input/output interface

16 touch screen

161 display unit

162 touch operation acceptance unit

17 input unit

18 output unit

19 storage unit

20 communication unit

21 driver

100 removable media

111 operation detecting unit

112 game running control unit

113 output control unit

114 shooting area moving unit

115 object selection unit

116 action execution unit

181 game execution data storage unit

182 parameter storage unit
