Information processing apparatus

Document No.: 1804637    Publication date: 2021-11-05

Abstract: This technology, "Information processing apparatus", was created by B.纽恩费尔特, 西川洋平, 石川贵久, 楠本晶彦, 古山将佳寿, 藤原雅宏, S.特隆贝塔, A. on 2020-03-18. Its main content is as follows: The first ring buffer 142 records game moving images and time information of the game software 110 currently being executed. In the case where an event is generated in the game, the event information acquisition unit 120 acquires, from the game software 110, event information including an event code identifying the event. The second ring buffer 144 records the acquired event information and time information synchronized with the first ring buffer 142.

1. An information processing apparatus comprising:

a first ring buffer which records game moving images and time information of game software currently being executed;

an event information acquisition section that acquires, in a case where an event is generated in a game, event information including an event code that identifies the generated event from the game software; and

a second ring buffer that records the acquired event information and the time information synchronized with the first ring buffer.

2. The information processing apparatus according to claim 1, comprising:

a content extracting section that extracts a game moving image from a start point to an end point of the game moving image recorded in the first ring buffer; and

an event extracting section that extracts, from the event information recorded in the second ring buffer, the event information from the start point to the end point.

3. The information processing apparatus according to claim 2, further comprising:

a receiving section that receives specification of the start point and the end point from a user.

4. The information processing apparatus according to claim 2 or 3, further comprising:

a transmission processing section that transmits the extracted game moving image to a first server device and transmits the extracted event information to a second server device.

5. The information processing apparatus according to any one of claims 1 to 4, comprising:

a state management section that updates state information indicating a play situation of the user based on the acquired event information; and

a screen capture generation section that acquires a screen capture of a game screen, wherein,

the screen capture generation section adds, as metadata to the screen capture, at least a part of the latest state information at the time of acquiring the screen capture.

Technical Field

The present invention relates to a technique of recording a moving image of a game.

Background

When a user posts moving image data to a moving image sharing website, the user adds metadata, such as a title and a comment, to the moving image data. Also, when a user views moving images on the moving image sharing site, the user inputs a search keyword and selects a moving image from the list of moving images retrieved from the site.

With conventional moving image sharing services, users cannot easily obtain the moving images they desire. This is because, when a user searches for a moving image, it is difficult to input an appropriate search keyword, and appropriate metadata is not added to each of the moving images published to the moving image sharing website. The moving image sharing site therefore cannot provide moving images matching the user's demand even when it accumulates many moving images.

PTL 1 discloses an information processing apparatus including: a recording section in which moving image data of game software currently being executed is recorded, a metadata acquisition section that acquires metadata indicating an execution state of the game software, and a content generation section that extracts, as content data, the game moving image data from a start point to an end point of the game moving image data recorded in the recording section. The content generation section refers to time information indicating a timing at which the metadata is acquired, and adds the metadata collected during an extraction period from the start point to the end point to the content data. In addition, the time information of the metadata is used for the purpose of specifying the metadata collected within the extraction period, and is not added to the game moving image data (content data). When the content data is uploaded to the distribution server, the added metadata is used to search for content.

Reference list

Patent document

[PTL 1]

JP 2015-198404A

Disclosure of Invention

Technical problem

In playing a game, a user may desire to view game moving images related to the user's own play scene. At this time, when the reproduction period of a game moving image distributed from the moving image distribution server is long, it takes a long time to find the scene that the user desires to view, and it is therefore preferable that there be a mechanism capable of jumping to the desired scene. Also, regarding prevention of spoilers, it is preferable that there be a mechanism capable of specifying a spoiler portion in the game moving image according to the play situation of the user.

It is an object of the present invention to provide the techniques necessary to implement the above mechanisms.

Solution to the problem

In order to solve the above problem, an information processing apparatus in one mode of the present invention includes: a first ring buffer that records a game moving image and time information of game software currently being executed, an event information acquisition section that acquires event information including an event code identifying a generated event from the game software in a case where the event is generated in the game, and a second ring buffer that records the acquired event information and the time information synchronized with the first ring buffer.

In addition, any combination of the above-described constituent elements and any expression of the present invention converted among a method, an apparatus, a system, a recording medium, a computer program, and the like are also effective as aspects of the present invention.

Drawings

Fig. 1 is a diagram depicting an information processing system according to an embodiment.

Fig. 2 is a diagram depicting a hardware configuration of the information processing apparatus.

Fig. 3 is a diagram depicting functional blocks of the information processing apparatus.

Fig. 4 depicts diagrams each depicting an example of a screen of the output device.

Fig. 5 is a diagram depicting a relationship between a game moving image and event information.

Fig. 6 is a diagram depicting functional blocks of a search server.

Detailed Description

In the information processing system of the embodiment, the user plays game software (console game) installed in the information processing apparatus. When an event is generated in a game, game software outputs event information including an event code identifying the generated event to the system software side. An event is generated when the progress of play of a game changes, when the behavior of a player character or a game character changes, or the like.

The information processing apparatus causes the game moving images to be output to a television or the like and, at the same time, automatically records the game moving images in the background, and the user can upload the recorded game moving images to the distribution server. When the user manually determines the start point and the end point of the recorded game moving image, the information processing apparatus extracts the game moving image from the start point to the end point and uploads the extracted game moving image to the distribution server. At this time, the information processing apparatus transmits the event information acquired between the start point and the end point to the event server together with the time information of the event information. When the search server accepts a search request from a viewing user, the search server refers to time series data in which events generated between the start point and the end point of a game moving image are related to their generation times, and searches for a game moving image matching the play situation of the viewing user.

Fig. 1 depicts an information processing system 1 according to an embodiment of the present invention. The information processing system 1 includes a plurality of information processing apparatuses 10a, 10b, and 10c (hereinafter, each is referred to as "information processing apparatus 10" without particularly distinguishing one from another) and a content server 12, and they are connected to one another through a network 3 such as the internet or a Local Area Network (LAN). An access point (hereinafter referred to as "AP") 8 has the functions of a wireless access point and a router, and the information processing apparatus 10 is connected to the AP 8 wirelessly or by wire to be communicably connected to the content server 12 existing on the network 3.

The content server 12 provides a content sharing service for game moving images and the like, and is depicted in the drawing as a concept in which a distribution server 14, an event server 16, and a search server 18 are incorporated with one another. The distribution server 14 receives the upload of the game moving image from the distributing user and distributes the game moving image to the viewing user. The event server 16 converts the event information transmitted from the information processing apparatus 10 into time series data suitable for searching. When the search server 18 receives a moving image search request from a viewing user, the search server 18 searches for a game moving image to be distributed from the distribution server 14 based on the time series data of the event.

The distribution server 14, the event server 16, and the search server 18 may each be configured as a separate server that communicates with the others via the network 3, or these servers may be integrally formed with one another. Also, the distribution server 14 and the event server 16 may be integrally formed with each other, the distribution server 14 and the search server 18 may be integrally formed with each other, or the event server 16 and the search server 18 may be integrally formed with each other.

The information processing apparatus 10 is connected wirelessly or by wire to the input device 6 operated by the user, and the input device 6 outputs the user's operation information to the information processing apparatus 10. When the information processing apparatus 10 receives the operation information from the input device 6, the information processing apparatus 10 causes the processing of the system software and the game software to reflect the operation information, and causes the output device 4 to output the processing result. In the information processing system 1, the information processing apparatus 10 is a game device (game console) that executes a game, and the input device 6 may be an apparatus, such as a game controller, that supplies the operation information of the user to the information processing apparatus 10. In addition, the input device 6 may be an input interface such as a keyboard or a mouse.

The secondary storage device 2 is a mass storage device such as a Hard Disk Drive (HDD) or a flash memory, and may be an external storage device connected to the information processing device 10 through a Universal Serial Bus (USB) or the like, or may be a built-in storage device. The output device 4 may be a television including a display that outputs images and a speaker that outputs sounds, or may be a computer display. The output device 4 may be connected to the information processing device 10 by a wired cable, or may be connected to the information processing device 10 by wireless.

A camera 7 as an imaging device is disposed in the vicinity of the output device 4, and images a space around the output device 4. Fig. 1 depicts an example in which the camera 7 is attached to the upper part of the output device 4, while the camera 7 may be arranged at the side or lower part of the output device 4, and in any case the camera 7 is arranged at a position where the camera 7 can image a user positioned in front of the output device 4. The camera 7 may be a stereo camera.

The distribution server 14 provides a service of sharing the game image uploaded from the information processing apparatus 10. The distribution server 14 has the following functions: the game images accumulated therein are distributed according to a request from the viewing user, and the game images provided by the distribution user to the distribution server 14 in real time are broadcast. There may be multiple distribution servers 14. The image distribution service provided by the distribution server 14 may restrict users receiving the service to registered members, or may be open to the public.

The event server 16 acquires an event information group related to the uploaded game moving image from the information processing apparatus 10. The event server 16 converts the event information group into event time series data suitable for the search. The event time series data is a data string in which events generated between the start time and the end time of the game moving image and the generation times thereof are associated with each other. The event server 16 provides event time series data to the search server 18.

When the search server 18 receives a search request for a game moving image from the information processing apparatus 10 of a viewing user, the search server 18 refers to the event time series data and performs a search process over the game moving images accumulated in the content server 12. The search request includes state information indicating the play situation of the game of the viewing user, and the search server 18 searches for a game moving image to be distributed according to the relationship between the state information and the event time series data.

Fig. 2 depicts a hardware configuration of the information processing apparatus 10. The information processing apparatus 10 includes a main power button 20, a power-on Light Emitting Diode (LED) 21, a standby LED 22, a system controller 24, a clock 26, a device controller 30, a media drive 32, a USB module 34, a flash memory 36, a wireless communication module 38, a wired communication module 40, a subsystem 50, and a main system 60.

The main system 60 includes a main Central Processing Unit (CPU), a memory and a memory controller serving as a main storage device, a Graphics Processing Unit (GPU), and the like. The GPU is mainly used for arithmetic processing of the game program. The main CPU has the functions of starting an Operating System (OS) and executing, in an environment provided by the OS, a game program installed in the secondary storage device 2. The subsystem 50 includes a sub CPU, a memory and a memory controller serving as a main storage device, and the like, and does not include a GPU.

The main CPU has a function of executing a game program installed in the secondary storage device 2 or on a Read Only Memory (ROM) medium 44, whereas the sub CPU does not have such a function. However, the sub CPU has a function of accessing the secondary storage device 2, as well as functions of transmitting data to and receiving data from the server device 5. The sub CPU has only these limited processing functions and can therefore operate with lower power consumption than the main CPU. These functions of the sub CPU are performed while the main CPU is in a standby state.

The main power button 20 is an input portion that receives operation input from the user; it is provided on the front face of the housing of the information processing apparatus 10 and is operated to turn on or off the power supply to the main system 60 of the information processing apparatus 10. The power-on LED 21 lights up when the main power button 20 is turned on, and the standby LED 22 lights up when the main power button 20 is turned off. The system controller 24 detects pressing of the main power button 20 by the user.

The clock 26 is a real-time clock that generates current date and time (day-and-time) information and supplies the date and time information to the system controller 24, the subsystem 50, and the main system 60.

The device controller 30 is configured as a large-scale integrated circuit (LSI) that, like a south bridge, performs delivery and reception of information between devices. As depicted in the figure, devices such as the system controller 24, the media drive 32, the USB module 34, the flash memory 36, the wireless communication module 38, the wired communication module 40, the subsystem 50, and the main system 60 are connected to the device controller 30. The device controller 30 absorbs differences in electrical characteristics and in data transfer speed between the devices, and controls the timing of data transfer.

The media drive 32 is a drive device to which a ROM medium 44 recording application software such as a game and license information is attached; the media drive 32 drives the ROM medium 44 and reads programs and data from it. The ROM medium 44 is a read-only recording medium such as an optical disc, a magneto-optical disc, or a Blu-ray disc.

The USB module 34 is a module connected to external devices through USB cables. The USB module 34 may be connected to the secondary storage device 2 and the camera 7 through respective USB cables. The flash memory 36 is a secondary storage device constituting an internal storage. The wireless communication module 38 wirelessly communicates with, for example, the input device 6 using a communication protocol such as the Bluetooth (registered trademark) protocol or the Institute of Electrical and Electronics Engineers (IEEE) 802.11 protocol. The wired communication module 40 communicates with external devices by wire and is connected to the network 3 through, for example, the AP 8.

Fig. 3 depicts functional blocks of the information processing apparatus 10. The information processing apparatus 10 includes a processing section 100 and a communication section 102. The processing section 100 includes game software 110, an event information acquisition section 120, a routing section 122, a state management section 124, an output processing section 126, a first recording control section 128, a second recording control section 130, a recording section 140, a memory 146, and a sharing processing section 150. The recording section 140 includes a first ring buffer 142 and a second ring buffer 144. The sharing processing section 150 includes a receiving section 152, a content extracting section 154, an event extracting section 156, a screen capture generation section 158, a transmission processing section 160, and an operation screen generating section 162. The communication section 102 is expressed as a configuration having the functions of both the wireless communication module 38 and the wired communication module 40 depicted in Fig. 2, the wireless communication module 38 taking charge of communication with the input device 6 and the wired communication module 40 taking charge of communication with the content server 12.

In Fig. 3, elements depicted as functional blocks that perform various processes may each be configured, in terms of hardware, by circuit blocks, memories, and other LSIs, or may each be implemented, in terms of software, by a program loaded into a memory. Accordingly, those skilled in the art will appreciate that these functional blocks may be implemented in various forms by hardware only, software only, or a combination thereof, and the form is not limited to any one of these.

The game software 110 includes at least a game program, image data, audio data, and configuration files. The game program receives operation information of the input device 6 by the user, and executes an arithmetic process of moving the player character in the virtual space. The output processing section 126 generates image data and audio data of the game, and causes the output device 4 to output the image data and the audio data. The output processing section 126 may include a Graphics Processing Unit (GPU) that performs a rendering process or the like.

When a preset event is generated in the progress of the game, the game program generates event information including an event code identifying the generated event, and outputs the event information to the event information acquisition section 120. The game developer may embed various events in the game as follows.

Tasks, goals, actions, etc. that a player may perform in a game are all referred to as "activities," and when a player begins an activity, an event is generated indicating the start of the activity, and when a player ends the activity, an event is generated indicating the end of the activity. For example, for an activity that "fights" with an enemy character, a combat start event is generated when the player starts a combat, and a combat end event is generated when the player ends the combat.

An event code is assigned to each event, and the game program generates event information in which game data indicating the game state at the time of event generation is added to the event code identifying the generated event. The game program may cause the game data to include the location of the event generation, information about the opponent, and the like. For example, for a battle start event, the game data may include at least information specifying a scene number in the game and information indicating the battle position. For a battle end event, the game data may include information specifying the scene number in the game, information indicating the battle position, information indicating whether the activity has been completed, and the experience value the player gained from the battle. The activity is completed when its final purpose, for example, defeating the final boss of the battle, is achieved.

Also, during play of the game, when the player obtains a weapon, an event indicating that the player obtained the weapon is generated. When the player uses a weapon, an event indicating that the player used the weapon is generated, and when an enemy character uses a weapon against the player, an event indicating that the weapon was used against the player is generated. The game program generates event information in which game data (such as the type of weapon and the location where the weapon was obtained or used) is added to the event code, and outputs the event information to the event information acquisition section 120. In this way, game developers can define various events and incorporate these events into the game.
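For illustration, the event information described above can be pictured as a small record pairing an event code with game data. The following is a minimal sketch in Python; the event codes and field names (battleStart, scene_number, and so on) are hypothetical examples chosen for this sketch, not values defined by the present embodiment.

```python
from dataclasses import dataclass, field
from typing import Any, Dict

@dataclass
class EventInfo:
    """One piece of event information output by the game program."""
    event_code: str                                            # identifies the generated event
    game_data: Dict[str, Any] = field(default_factory=dict)   # game state at generation time

# A battle start event: the game data carries the scene number and the battle position.
battle_start = EventInfo("battleStart", {"scene_number": 12, "position": (104.5, 33.0)})

# A battle end event: the game data additionally records completion and gained experience.
battle_end = EventInfo(
    "battleEnd",
    {"scene_number": 12, "position": (104.5, 33.0), "completed": True, "exp_gained": 250},
)
```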

The configuration file of the game software 110 describes a list of the event codes needed to manage the user's play situation. In this embodiment, the information required to manage the play situation includes the event codes related to activities but not the weapon-related event codes. Which types of events are used to manage the user's play situation depends on the management policy for the play situation, and weapon-related events as well as other types of events may also be reflected in the management of the user's play situation.

Before play of the game begins, the list of event codes is provided to the routing section 122. The event information acquisition section 120 acquires event information from the game software 110 and delivers the event information to the routing section 122. When event information is delivered to the routing section 122, the routing section 122 supplies all of the event information to the second recording control section 130 and, according to the event code list, supplies the event information necessary for managing the play situation of the user to the state management section 124.

The state management section 124 records the latest state information indicating the play situation of the user in the memory 146. The state management section 124 updates the state information based on the event information. For example, the state management section 124 may manage the play situation of the user using the following pieces of state information.

Activity_A

This indicates the most recently started activity.

Activity_B

This indicates the activities that have been played so far.

Activity_C

This indicates the activities that have been cleared so far.

Activity_D

This indicates the activities that can be played.

Activity_E

This indicates the activities that cannot be played.

All of the above pieces of state information are managed using activity identification information (activity IDs). The state management section 124 manages "Activity_A", "Activity_B", "Activity_C", "Activity_D", and "Activity_E" using activity IDs by acquiring in advance, according to the configuration file, a conversion table that converts event codes into activity IDs. When event information related to an activity is supplied from the routing section 122 to the state management section 124, the state management section 124 updates the state information. For example, when a start event for a new activity is supplied, the state management section 124 updates "Activity_A" and "Activity_B". Also, when an activity that has been played is completed, the state management section 124 updates "Activity_C".
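As a minimal sketch of how the state management section 124 might maintain these pieces of state information, the following assumes a hypothetical conversion table from event codes to activity IDs and simple set-based bookkeeping; the actual update rules are defined per title by the configuration file.

```python
class StateManager:
    """Keeps the latest play-situation state (Activity_A ... Activity_E) up to date."""

    def __init__(self, code_to_activity):
        self.code_to_activity = code_to_activity  # conversion table: event code -> activity ID
        self.activity_a = None    # most recently started activity
        self.activity_b = set()   # activities played so far
        self.activity_c = set()   # activities cleared so far
        self.activity_d = set()   # activities that can be played
        self.activity_e = set()   # activities that cannot be played

    def on_event(self, event_code, game_data):
        activity_id = self.code_to_activity.get(event_code)
        if activity_id is None:
            return  # not an activity-related event
        if event_code.endswith("Start"):
            self.activity_a = activity_id
            self.activity_b.add(activity_id)
        elif event_code.endswith("End") and game_data.get("completed"):
            self.activity_c.add(activity_id)

# Usage: events routed from the routing section update the state kept in memory.
sm = StateManager({"battleStart": "act-001", "battleEnd": "act-001"})
sm.on_event("battleStart", {})
sm.on_event("battleEnd", {"completed": True})
```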

In the information processing apparatus 10 of the present embodiment, the output processing section 126 generates image data and audio data of the game, causes the output device 4 to output the image data and the audio data, and supplies the image data and the audio data to the first recording control section 128. Hereinafter, the image data and the audio data of the game will be simply referred to as the "game moving image" or "game moving image data". The information processing apparatus 10 has a function of recording, in the background, the game moving image output from the output device 4, and the first recording control section 128 records the game moving image data in the first ring buffer 142 together with time information (a time stamp).

The first ring buffer 142 is generated by setting a first start address and a first end address in a storage area of the secondary storage device 2. This ring buffer may be set in advance when the information processing apparatus 10 is shipped. The first recording control section 128 records the game moving image of the currently executed game software, generated by the output processing section 126, in the first ring buffer 142 together with a time stamp.

The first recording control section 128 records the image data at successive addresses starting from the first start address of the first ring buffer 142; when the recording reaches the first end address, the first recording control section 128 returns to the first start address and continues recording by overwriting, and repeats this process. For example, the first ring buffer 142 is set so as to record 20 minutes of game moving images, and time information (a time stamp) is given to each recorded game moving image. Recording the game moving images in the first ring buffer 142 in the background enables the user to cut out the play moving image of the last 20 minutes and upload it to the distribution server 14.

When an event is generated in the game, the event information acquisition section 120 acquires event information including an event code identifying the generated event from the game software 110, and supplies the event information to the routing section 122. The routing section 122 transmits all the pieces of the event information supplied thereto to the second recording control section 130, and the second recording control section 130 records the pieces of the event information in the second ring buffer 144 together with time information (time stamp).

The second ring buffer 144 is generated by setting a second start address and a second end address in a storage area of the secondary storage device 2. Like the first ring buffer 142, this ring buffer may be set in advance when the information processing apparatus 10 is shipped. The second recording control section 130 records the pieces of event information at successive addresses starting from the second start address of the second ring buffer 144; when the recording reaches the second end address, the second recording control section 130 returns to the second start address and continues recording by overwriting, and repeats this process. The second ring buffer 144 is set so as to record event information for the same period of time (for example, 20 minutes) as the first ring buffer 142, and time information (a time stamp) is given to each recorded piece of event information. The time stamps recorded in the second ring buffer 144 are synchronized with the time stamps recorded in the first ring buffer 142. Thus, when a game moving image is recorded in the first ring buffer 142 and event information is recorded in the second ring buffer 144 at the same timing, the same time stamp is given to both.
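A minimal sketch of the two synchronized ring buffers follows, assuming fixed-capacity deques and a shared clock so that a frame and an event recorded at the same timing receive the same time stamp; in the embodiment itself the buffers occupy reserved regions of the secondary storage device 2 rather than memory.

```python
import time
from collections import deque

class TimedRingBuffer:
    """Fixed-capacity buffer; the oldest entries are overwritten once capacity is reached."""

    def __init__(self, capacity):
        self.entries = deque(maxlen=capacity)  # deque drops the oldest entry automatically

    def record(self, timestamp, payload):
        self.entries.append((timestamp, payload))

    def extract(self, start, end):
        """Return all entries whose time stamp lies between start and end (inclusive)."""
        return [(t, p) for t, p in self.entries if start <= t <= end]

# Both buffers are stamped with the same clock, so their timelines can be matched later.
video_buffer = TimedRingBuffer(capacity=20 * 60 * 60)   # e.g. 20 minutes of frames at 60 fps
event_buffer = TimedRingBuffer(capacity=10_000)

now = time.monotonic()
video_buffer.record(now, b"<encoded frame>")
event_buffer.record(now, {"event_code": "battleStart", "scene_number": 12})
```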

The process in which the user A uploads a game moving image to the distribution server 14 during play of the game will be described below. When the user A presses a predetermined button (share button) of the input device 6, a plurality of options related to sharing of the game image are displayed on the output device 4.

Fig. 4(a) depicts an example of a sharing menu screen 200 displayed superimposed on the game screen. When the user A operates the share button provided on the input device 6 during play of the game, the receiving section 152 receives the operation information of the share button, and the operation screen generating section 162 generates the sharing menu screen 200 to be displayed superimposed on the game screen. On the sharing menu screen 200, the following four options related to sharing of game images are indicated:

-uploading a video clip;

-uploading a screenshot;

-broadcasting a game play; and

- starting a share play.

At this time, the progress of the game is temporarily suspended, and the first recording control section 128 and the second recording control section 130 simultaneously suspend writing to the first ring buffer 142 and the second ring buffer 144, respectively. A record of the 20 minutes prior to the operation of the share button is thus stored in each of the first ring buffer 142 and the second ring buffer 144.

While the sharing menu screen 200 is displayed, the operation information of the input device 6 is used for operating the sharing menu screen 200. To upload the game moving image, the user A selects "upload video clip". The user A thereafter selects the distribution server 14 to which to upload the game moving image. Fig. 1 depicts only a single distribution server 14, but there may be multiple distribution servers 14, and in such a case the user A needs to select one or more distribution servers 14. When the user A selects the distribution server 14, the operation screen generation section 162 generates a screen for editing the game moving image.

Fig. 4(b) depicts a trim edit screen 202. In the embodiment, up to 20 minutes of game moving images are recorded in the first ring buffer 142, and, depending on the distribution server 14, only game moving images shorter than 20 minutes may be allowed to be uploaded. Accordingly, the user A cuts the game moving image to an uploadable length on the trim edit screen 202. The user A operates the input device 6 and thereby designates a start point and an end point for cutting the game moving image. When the receiving section 152 receives the designation of the start point and the end point from the user A, the receiving section 152 notifies the content extraction section 154 and the event extraction section 156 of the designation. The content extraction section 154 extracts the game moving image and the time stamps from the start point to the end point of the game moving image recorded in the first ring buffer 142, and the event extraction section 156 extracts the event information and the time stamps from the same start point to the same end point of the event information recorded in the second ring buffer 144. Hereinafter, the extracted event information from the start point to the end point is referred to as an "event information group". Note that, regardless of the moving image period permitted by the distribution server 14, the user A can determine the start point and the end point and cut out the play scene that the user likes best.

Fig. 5 depicts the relationship between the extracted game moving image and the extracted event information. Since the first ring buffer 142 and the second ring buffer 144 use common time stamps, the timelines of the extracted game moving image and the extracted event information can be matched with each other. The content extraction section 154 adds at least game identification information (a game ID) and content identification information (a content ID) to the extracted game moving image as attribute information. Note that information specifying the user A may also be added to the extracted game moving image. The content ID must be unique within the information processing system 1. The content extraction section 154 supplies the content ID to the event extraction section 156, and the event extraction section 156 adds the game ID and the content ID to the extracted event information group as attribute information.
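A minimal sketch of the trimming step, reusing the TimedRingBuffer above: the same start and end time stamps are applied to both buffers, and the game ID and a content ID are attached to both extracts so that the server side can relate them. The helper name and the use of a UUID as the content ID are assumptions for illustration.

```python
import uuid

def trim(video_buffer, event_buffer, start, end, game_id):
    """Extract the clip and the matching event information group between start and end."""
    content_id = str(uuid.uuid4())      # must be unique within the information processing system
    clip = {
        "game_id": game_id,
        "content_id": content_id,
        "frames": video_buffer.extract(start, end),
    }
    event_group = {
        "game_id": game_id,
        "content_id": content_id,       # the same content ID ties the events to the clip
        "events": event_buffer.extract(start, end),
    }
    return clip, event_group

# The clip is then sent to the distribution server and the event group to the event server.
```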

The transmission processing section 160 transmits the game moving image extracted by the content extraction section 154 to the distribution server 14. Game moving images are uploaded to the distribution server 14 from the information processing apparatuses 10 of a plurality of users, and a user who desires distribution of a game moving image accesses the distribution server 14 and views the game moving image.

The transmission processing section 160 transmits the event information group extracted by the event extraction section 156 to the event server 16. The event information group is a set of combinations of event codes and game data; since the event codes and the game data are defined by each game developer, they differ from game to game. The event server 16 therefore converts the event codes and game data included in the event information group into event codes and game data commonly used by the content server 12, using a conversion table provided by the game developer.

This conversion process will be described. When a player begins an activity, an activity start event is generated. For example, in game X the event code of the activity start event may be "missionStart", and in game Y the event code of the activity start event may be "taskStart"; the event server 16 converts both into "start". The event server 16 thus retains a table for converting "missionStart" into "start" for game X, and a table for converting "taskStart" into "start" for game Y. In this manner, the event server 16 converts the event information of every game into event codes and game data commonly used by the content server 12. The converted event codes and game data are referred to as "event time series data", and the scene number included in the game data before the conversion is converted into an event ID by the conversion table. The event server 16 supplies the event time series data to the search server 18, and the search server 18 can, by analyzing the event time series data, specify the type and generation time of each event generated in the game moving image.
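As a sketch of the conversion performed by the event server 16, the following assumes per-title dictionaries supplied by the game developers; "missionStart" and "taskStart" are the example codes from the text, while the common codes, the scene-number mapping, and the data layout are illustrative assumptions.

```python
# Per-title conversion tables provided by the game developers (illustrative).
CONVERSION_TABLES = {
    "game_x": {"missionStart": "start", "missionEnd": "end"},
    "game_y": {"taskStart": "start", "taskEnd": "end"},
}

def to_time_series(game_id, event_group, scene_to_id):
    """Convert a game-specific event information group into common event time series data."""
    table = CONVERSION_TABLES[game_id]
    series = []
    for timestamp, event in sorted(event_group, key=lambda item: item[0]):
        series.append({
            "time": timestamp,
            "code": table[event["event_code"]],              # e.g. "missionStart" -> "start"
            "event_id": scene_to_id[event["scene_number"]],  # scene number -> common event ID
        })
    return series

# Usage: an event information group extracted between the start and end points.
series = to_time_series(
    "game_x",
    [(12.0, {"event_code": "missionStart", "scene_number": 3}),
     (95.5, {"event_code": "missionEnd", "scene_number": 3})],
    scene_to_id={3: "act-003"},
)
```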

The above description covers the process in which the user A transmits the game moving image and the event information group to the content server 12. A process in which the user B operating the information processing apparatus 10b accesses the content server 12 and views a game moving image will be described below. In the information processing system 1, during game play of the user B, the user B can receive distribution of a game moving image matching the current play situation by performing a predetermined operation.

In the information processing apparatus 10b of the user B, the state management section 124 similarly manages, in the memory 146, state information indicating the play situation of the user B. "Activity_A", "Activity_B", "Activity_C", "Activity_D", and "Activity_E" as pieces of state information are managed using activity IDs.

During play of the game, when the user B performs a predetermined operation, the user B can receive a candidate list of game moving images matching the current play situation. When the information processing apparatus 10b receives the predetermined operation of the user B, the information processing apparatus 10b transmits a search request for a game moving image to the search server 18. At this time, the information processing apparatus 10b causes the search request to include the game ID of the game currently being played and the state information managed by the state management section 124.

Fig. 6 depicts functional blocks of the search server 18. The search server 18 includes a search processing section 210 and a communication section 212. The search processing section 210 includes an event time-series data acquisition section 220, a search request acquisition section 222, a state information acquisition section 224, a candidate extraction section 226, and a recording device 230.

In Fig. 6, elements depicted as functional blocks that perform various processes may each be configured, in terms of hardware, by circuit blocks, memories, and other LSIs, or may each be implemented, in terms of software, by a program loaded into a memory. Accordingly, those skilled in the art will appreciate that these functional blocks may be implemented in various forms by hardware only, software only, or a combination thereof, and the form is not limited to any one of these.

The event time-series data acquiring section 220 acquires event time-series data of the game moving image from the event server 16 and stores the event time-series data in the recording device 230. Preferably, when the event information is transmitted from the information processing apparatus 10 to the event server 16, the event server 16 immediately generates event time series data and provides the event time series data to the search server 18. The event time series data has a game ID and a content ID added thereto.

The search request acquisition section 222 receives the search request for a game moving image from the information processing apparatus 10b of the user B as a viewing user. The state information acquisition section 224 acquires, from the search request, the game ID of the game the user B is currently playing and the state information managed for that game.

The candidate extraction section 226 specifies the event time series data to which the game ID of the game the user B is currently playing is added, and searches for a game moving image matching the state information of the user B. In the state information, the activity ID of "Activity_A" specifies the activity currently being played, and the candidate extraction section 226 can thereby specify event time series data including the end event of the activity indicated by the activity ID of "Activity_A". A game moving image containing video of that activity being completed can thus be specified.

The activity IDs included in "Activity_B" indicate the activities that have been played so far. From the viewpoint of preventing spoilers, the candidate extraction section 226 can therefore avoid selecting any event time series data that includes an event ID not included in "Activity_B". Activities that the user B has not yet played are thereby prevented from being included in the candidate moving images. When the candidate extraction section 226 specifies a plurality of pieces of event time series data matching the state information of the user B, the candidate extraction section 226 notifies the distribution server 14 of the content IDs of those pieces of event time series data. The distribution server 14 specifies the game moving images from the content IDs and transmits a list of thumbnails of the game moving images to the information processing apparatus 10b. When the user B selects one of the thumbnails, the distribution server 14 distributes the selected game moving image. This enables the user B to view a game moving image matching the current play situation without inputting any search keyword.
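A minimal sketch of the matching performed by the candidate extraction section 226, under the same assumptions as the previous sketches: a clip is a candidate if its time series data contains an end event for the viewer's current activity (Activity_A) and contains no activity the viewer has not yet played (spoiler prevention via Activity_B).

```python
def find_candidates(time_series_by_content, current_activity, played_activities):
    """Return the content IDs of clips matching the viewer's play situation."""
    candidates = []
    for content_id, series in time_series_by_content.items():
        ids_in_clip = {e["event_id"] for e in series}
        completes_current = any(
            e["code"] == "end" and e["event_id"] == current_activity for e in series
        )
        # Spoiler prevention: skip clips containing activities the viewer has not played yet.
        no_spoilers = ids_in_clip <= played_activities
        if completes_current and no_spoilers:
            candidates.append(content_id)
    return candidates

# Usage: the search server replies with these content IDs; the distribution server
# then returns thumbnails of the corresponding game moving images.
example = find_candidates(
    {"clip-1": [{"code": "end", "event_id": "act-001"}]},
    current_activity="act-001",
    played_activities={"act-001"},
)
```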

For example, in the case where the reproduction period of the game moving image is long, reproduction may be adapted to start at the time position of the start event of the activity indicated by "Activity_A". Also, in the case where a spoiler scene is included in the game moving image, the distribution server 14 may specify the spoiler part and avoid distributing that part. In this way, the distribution server 14 can specify, based on the event time series data and the state information of the user B, scenes to be preferentially distributed and scenes not to be distributed.

The process of associating the event time series data with the uploaded game moving images has been described above.

Referring back to Fig. 4(a), the user A may upload a screen capture. When the user A selects "upload screen capture" and captures (or selects) a screen capture to be uploaded, the screen capture generation section 158 acquires a screen capture of the game screen. At this time, the screen capture generation section 158 reads at least a part of the latest state information from the memory 146 at the time of acquiring the screen capture and adds that part to the screen capture image as metadata. Because the screen capture generation section 158 adds the metadata to the screen capture of the game, the screen capture can be managed, from the viewpoint of preventing spoilers, so that it is not shown to other users.
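A minimal sketch of how the screen capture generation section 158 might attach the latest state information as metadata, reusing the StateManager sketch above; the metadata keys and the side-car dictionary (rather than embedding into the image file itself) are assumptions for illustration.

```python
def capture_with_metadata(game_screen_image, state_manager):
    """Attach part of the latest state information to a screen capture as metadata."""
    metadata = {
        "activity_a": state_manager.activity_a,          # activity being played at capture time
        "activity_c": sorted(state_manager.activity_c),  # activities cleared so far
    }
    return {"image": game_screen_image, "metadata": metadata}

# The metadata lets the server hide the capture from users for whom it would be a spoiler.
```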

The present invention has been described above based on an embodiment. The embodiment is illustrative, and it will be understood by those skilled in the art that various modifications may be made to the constituent elements and processes thereof, and that such modifications are also within the scope of the present invention.

Industrial applicability

The present invention is applicable to the technical field in which game moving images are recorded.

List of reference numerals

1: information processing system

10: information processing apparatus

12: content server

14: distribution server

16: event server

18: search server

100: treatment section

102: communication part

110: game software

120: event information acquisition section

122: routing section

124: state management section

126: output processing section

128: first recording control section

130: second recording control section

140: recording part

142: first ring buffer

144: second ring buffer

146: memory device

150: shared processing section

152: receiving part

154: content extraction section

156: event extraction section

158: screen shot generation section

160: transmission processing section

162: operation screen generating section

210: search processing section

212: communication part

220: event time series data acquisition section

222: search request acquisition section

224: status information acquisition section

226: candidate extraction section

230: a recording apparatus.
