Shared augmented reality games within a shared coordinate space

Document No.: 440416 | Publication date: 2021-12-24

Note: This disclosure, Shared augmented reality games within a shared coordinate space, was created by J. M. Cahill, J. D. Merriam, T. F. Olafsson, T. J. Schutz, and M. M. Persson on 2020-04-12. Abstract: A system and method for sharing an AR game within a shared coordinate space created between devices that initially have disjoint relative coordinate spaces is described herein. Once the shared coordinate space is created, the AR video game may provide a first mode in which users participate in game play actions that produce results according to pre-established game rules. The AR video game may also provide a second mode ("sandbox mode") in which users engage in non-destructive game play actions that produce no results once the second mode has terminated. Further described herein is a system and method for using geographic location information within an AR session, where a virtual action may be initiated by one user, the virtual action causing a corresponding virtual action to be displayed on a map of a virtual environment that parallels the physical environment, the map being displayed on another user's user gaming device.

1. A system for sharing an augmented reality game within a shared coordinate space, comprising:

a computer comprising a processor and a memory having computer-executable instructions stored thereon that, when executed by the processor, cause the computer to: create a shared coordinate space in the augmented reality game between a plurality of user gaming devices;

use the created shared coordinate space to provide a first mode of the augmented reality game to a plurality of users associated with the plurality of user gaming devices, wherein in the first mode at least some of the plurality of users participate in game play actions that produce results according to pre-established game rules; and

use the created shared coordinate space to provide a second mode of the augmented reality game to the plurality of users associated with the plurality of user gaming devices, wherein in the second mode at least some of the plurality of users engage in non-destructive game play actions that produce no results once the second mode has terminated.

2. The system of claim 1, wherein the plurality of user gaming devices includes a first user gaming device and a second user gaming device, and wherein creating the shared coordinate space in the augmented reality game between the plurality of user gaming devices comprises, by the second user gaming device:

performing augmented reality tracking to establish a relative coordinate space of the second user gaming device;

identifying a spatially aligned image displayed on the first user gaming device and receiving an identifier of the augmented reality game;

recording a location of the second user gaming device, within the relative coordinate space of the second user gaming device, associated with the identification of the spatially aligned image, and a timestamp associated with a clock of the second user gaming device;

sending a request for information to the first user gaming device, the request including the timestamp;

receiving, from the first user gaming device in response to the request, information including a location of the first user gaming device within a relative coordinate space of the first user gaming device at or near the timestamp and a spatial origin of the first user gaming device;

calculating an offset between a coordinate space of the second user gaming device and a coordinate space of the first user gaming device based at least in part on the received information to create the shared coordinate space; and

displaying the augmented reality game using the shared coordinate space and the identifier.

3. The system of claim 2, wherein the spatially aligned image is displayed at a predetermined size and with a plurality of features comprising predefined specific groupings of pixels of predefined colors and predefined intensities, the plurality of features allowing the second user gaming device to determine its position relative to the first user gaming device in the form of a six-degree position.

4. The system of claim 2, wherein a clock of the second user gaming device is synchronized to a clock of the first user gaming device by the second user gaming device.

5. The system of claim 2, wherein the identifier of the augmented reality game is displayed on the first user gaming device and comprises a multi-dimensional barcode.

6. The system of claim 2, wherein the augmented reality game comprises a multi-party augmented reality construction video game.

7. The system of claim 2, the memory having further computer-executable instructions stored thereon that, when executed by the processor, cause the computer to:

display a virtual object associated with the augmented reality game on a display of the first user gaming device and a display of the second user gaming device.

8. The system of claim 2, wherein the second user gaming device comprises a mobile phone.

9. A method for sharing an augmented reality game within a shared coordinate space, comprising:

creating a shared coordinate space in an augmented reality game between a plurality of user gaming devices;

using the created shared coordinate space to provide a first mode of the augmented reality game to a plurality of users associated with the plurality of user gaming devices, wherein in the first mode at least some of the plurality of users participate in game play actions that produce results according to pre-established game rules; and

using the created shared coordinate space to provide a second mode of the augmented reality game to the plurality of users associated with the plurality of user gaming devices, wherein in the second mode at least some of the plurality of users engage in a non-destructive game play action that produces no result once the second mode has terminated.

10. The method of claim 9, wherein the plurality of user gaming devices includes a first user gaming device and a second user gaming device, and creating the shared coordinate space in the augmented reality game between the plurality of user gaming devices comprises:

performing, by the first user gaming device and the second user gaming device, augmented reality tracking to establish separate relative coordinate spaces for the first user gaming device and the second user gaming device;

providing, by the first user gaming device, an identifier of the augmented reality game;

displaying, by the first user gaming device, a spatially aligned image;

storing, by the first user gaming device, location information about the first user gaming device and an associated timestamp for at least a portion of the time that the spatially aligned image is displayed;

identifying, by the second user gaming device, the spatially aligned image displayed on the first user gaming device and receiving the identifier of the augmented reality game;

recording, by the second user gaming device, a location of the second user gaming device, within the relative coordinate space of the second user gaming device, associated with the identification of the spatially aligned image, and a timestamp associated with a clock of the second user gaming device;

sending, by the second user gaming device, a request for information to the first user gaming device, the request including the timestamp;

receiving, by the first user gaming device, the request for information from the second user gaming device, the request including the timestamp;

providing, by the first user gaming device, location information regarding the first user gaming device at or near the timestamp and a spatial origin of the first user gaming device in response to the request;

receiving, by the second user gaming device and in response to the request, information from the first user gaming device including a location of the first user gaming device within a relative coordinate space of the first user gaming device at or near the timestamp and a spatial origin of the first user gaming device;

calculating, by the second user gaming device, an offset between a coordinate space of the second user gaming device and a coordinate space of the first user gaming device based at least in part on the received information to create the shared coordinate space; and

displaying, by the second user gaming device, the augmented reality game utilizing the shared coordinate space and the identifier.

11. The method of claim 10, wherein the spatially aligned image is displayed at a predetermined size and with a plurality of features comprising predefined specific groupings of pixels of predefined colors and predefined intensities, the plurality of features allowing the second user gaming device to determine its position relative to the first user gaming device in the form of a six-degree position.

12. A method of using geographic location information within an augmented reality session, comprising:

joining a plurality of user gaming devices to the augmented reality session;

determining, at a first user gaming device of the plurality of user gaming devices, that a second user gaming device of the plurality of user gaming devices has a physical location that is within a threshold physical distance of a physical location of the first user gaming device; and

at the first user gaming device, initiating a virtual action that causes a corresponding virtual action to be displayed on a map of a virtual environment that parallels at least a portion of the physical environment, the map being displayed on the second user gaming device.

13. The method of claim 12, wherein the corresponding virtual action comprises movement of a virtual object representing a location of the first user gaming device.

14. The method of claim 12, wherein the corresponding virtual action comprises an animation of a virtual object to be displayed on the map.

15. The method of claim 12, wherein the corresponding virtual action is displayed on a map of the virtual environment at the second user gaming device.

Background

An Augmented Reality (AR) system, such as a video game, displays real-world images overlaid with a virtual experience (e.g., interactive three-dimensional object(s)). Thus, the AR system enables participants to view real-world images in conjunction with contextually relevant computer-generated images. Images from the real world and computer-generated images are combined and presented to the user such that they appear to share the same physical space. In AR applications where multiple participants share the same physical environment, inconsistent positioning of computer-generated images relative to real-world images can cause significant interference, which degrades the AR experience.

Summary

Described herein is a system for sharing an augmented reality game within a shared coordinate space, comprising: a computer comprising a processor and a memory having computer-executable instructions stored thereon that, when executed by the processor, cause the computer to: create a shared coordinate space in an augmented reality game between a plurality of user gaming devices; use the created shared coordinate space to provide a first mode of the augmented reality game to a plurality of users associated with the plurality of user gaming devices, wherein in the first mode at least some of the plurality of users participate in game play actions that produce results according to pre-established game rules; and use the created shared coordinate space to provide a second mode of the augmented reality game to the plurality of users associated with the plurality of user gaming devices, wherein in the second mode at least some of the plurality of users engage in non-destructive game play actions that produce no results once the second mode has terminated.

Further described herein is a method of using geographic location information within an augmented reality game, comprising: adding a plurality of user gaming devices to an augmented reality game; determining, at a first user gaming device of the plurality of user gaming devices, that a second user gaming device of the plurality of user gaming devices has a physical location that is within a threshold physical distance of the physical location of the first user gaming device; and initiating, at the first user gaming device, a virtual action that causes a corresponding virtual action to be displayed on a map of a virtual environment that parallels at least a portion of the physical environment, the map being displayed on the second user gaming device.

This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.

Drawings

Fig. 1 is a functional block diagram illustrating a system for sharing an augmented reality game within a shared coordinate space.

Fig. 2 is a functional block diagram illustrating a system for creating a shared coordinate space in an augmented reality game between at least two devices having disjoint relative coordinate spaces.

Figs. 3 and 4 are exemplary user interfaces.

Fig. 5 is a functional block diagram illustrating a system for using geographic location information within an AR session.

Fig. 6 is an exemplary user interface.

Fig. 7 is a flow diagram illustrating a method for sharing an augmented reality game within a shared coordinate space.

Fig. 8 is a flow chart illustrating a method for using geographic location information within an AR session.

Fig. 9 is a flow diagram illustrating a method of creating a shared coordinate space in an augmented reality game between at least two devices having disjoint relative coordinate spaces by a first user gaming device.

Fig. 10 is a flow diagram illustrating a method of creating a shared coordinate space in an augmented reality game between at least two devices having disjoint relative coordinate spaces by a second user gaming device.

Figs. 11 and 12 are flow diagrams illustrating a method of creating a shared coordinate space in an augmented reality game between at least two devices having disjoint relative coordinate spaces.

Fig. 13 is a functional block diagram illustrating an exemplary computing system.

Detailed Description

Various technologies pertaining to shared augmented reality gaming within a shared coordinate space are now described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of one or more aspects. It may be evident, however, that such aspect(s) may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to facilitate describing one or more aspects. In addition, it is to be understood that functionality that is described as being performed by a particular system component may be performed by multiple components. Similarly, a single component may be configured to perform functionality described as being implemented by multiple components.

The present disclosure supports various products and processes that perform or are configured to perform various actions with respect to a shared augmented reality game within a shared coordinate space. The following are one or more exemplary systems and methods.

Aspects of the present disclosure are directed to the technical problem of sharing augmented reality games within a shared coordinate space. Technical features associated with solving this problem involve using the displayed spatially aligned image and the AR game identifier to create a shared coordinate space in an augmented reality game between a plurality of user gaming devices that each initially have disjoint relative coordinate spaces; providing a first mode of the augmented reality game to a plurality of users associated with the plurality of user gaming devices using the created shared coordinate space, wherein in the first mode at least some of the plurality of users participate in game play actions that produce results according to pre-established game rules; and using the created shared coordinate space to provide a second mode of the augmented reality game to a plurality of users associated with the plurality of user gaming devices, wherein in the second mode at least some of the plurality of users engage in non-destructive game play actions that produce no result once the second mode has terminated. Accordingly, aspects of these technical features exhibit the technical effect of more efficiently and effectively sharing augmented reality games within a shared coordinate space to conserve computing resource(s) and/or bandwidth.

Furthermore, the term "or" is intended to mean an inclusive "or" rather than an exclusive "or". That is, unless specified otherwise or clear from context, the phrase "X employs A or B" is intended to mean any of the natural inclusive permutations. That is, the phrase "X employs A or B" is satisfied by any of the following examples: X employs A; X employs B; or X employs both A and B. In addition, the articles "a" and "an" as used in this application and the appended claims should generally be construed to mean "one or more" unless specified otherwise or clear from context to be directed to a singular form.

The terms "component" and "system" and various forms thereof (e.g., component, system, subsystem, etc.) as used herein are intended to refer to a computer-related entity, either hardware, a combination of hardware and software, or software in execution. For example, a component may be, but is not limited to: a process running on a processor, an object, an instance, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a computer and the computer can be a component. One or more components may reside within a process and/or thread of execution and a component may be localized on one computer and/or distributed between two or more computers. Furthermore, as used herein, the term "exemplary" is intended to mean serving as an illustration or example of something, and is not intended to indicate a preference.

"User gaming device" refers to a mobile personal computing device, including, for example, a mobile phone, laptop computer, tablet, phablet, personal digital assistant ("PDA"), e-reader, wearable computer, head-mounted display (HMD), or any other mobile computing device having components for displaying and/or interacting with an augmented reality game (e.g., an AR session). A "real object" is an object that exists in the environment surrounding the AR participant. A "virtual object" is a computer-generated construct that does not exist in the participant's physical surroundings, but can be experienced (e.g., seen, heard, etc.) via AR technology.

AR systems, such as video games, display real-world images overlaid with virtual experiences (e.g., interactive three-dimensional object(s)). Thus, the AR system enables participants to view real-world images in conjunction with contextually relevant computer-generated images. Aligning user gaming devices to achieve a shared AR experience can be a complex problem, which often results in inconsistent positioning of computer-generated images relative to real-world images. For example, a shared coordinate space in an augmented reality session between two devices having disjoint relative coordinate spaces may be created as disclosed in co-pending U.S. patent application Ser. No. 16/277,591, entitled "Aligning Location for a Shared Reality Experience," filed on February 15, 2019, which is hereby incorporated by reference herein.

A system and method for experiencing an AR experience (e.g., an AR video game) with a shared coordinate space is described herein. In a first mode ("game play mode"), once two or more users have created a shared coordinate space in an AR game, the users participate in the AR video game, with the action(s) producing outcome(s) according to pre-established game rule(s). In a second mode ("sandbox mode"), within the shared coordinate space of the AR game, two or more users participate in a non-destructive session of the AR game in which the action(s) only produce temporary or limited outcome(s) in accordance with at least some of the pre-established game rule(s). However, since the second mode is non-destructive, the result(s) are reset upon expiration of the second mode and/or in response to a game reset action.

Referring to fig. 1, a system 100 for sharing an augmented reality game within a shared coordinate space is illustrated. The system 100 may initially create a shared coordinate space in an AR game between two or more user gaming devices. Once the shared coordinate space in the AR game has been created, during a first mode ("game play mode"), the users use their user gaming devices to participate in the AR video game, wherein the action(s) generate result(s) according to pre-established game rule(s). In a second mode ("sandbox mode"), within the shared coordinate space of the AR game, two or more users participate in a non-destructive session of the AR video game in which the action(s) produce only temporary or limited outcome(s) in accordance with at least some of the pre-established game rule(s). However, since the second mode is non-destructive, the result(s) may be reset upon exit from the second mode and/or in response to a game reset action.

Aligning positions for shared augmented reality experience

The system 100 includes a shared coordinate space creation component 110 that creates a shared coordinate space in an augmented reality game between at least two devices having disjoint relative coordinate spaces. Initially, the two devices synchronize their clocks (e.g., to each other and/or to a network time server) and separately start AR tracking. Upon initiating AR tracking, each device establishes its own relative coordinate space.

During an AR session (e.g., an AR game), the first device displays a spatially aligned image that is visible to the second device and, optionally, an AR game identifier for the AR game. While displaying the spatially aligned image, the first device tracks its position (e.g., a six-degree position) at different time intervals using timestamps. The second device identifies the spatially aligned image and records its own position (e.g., a six-degree position) and a timestamp coinciding with the moment the spatially aligned image was identified. "Six-degree positioning" (also referred to as "six degrees of freedom") refers to the freedom of movement of an object in three-dimensional space along three orthogonal spatial axes (e.g., x, y, and z) and changes in orientation of the object about three orthogonal rotational axes (e.g., yaw, pitch, and roll).
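Although the disclosure does not prescribe a data representation, a six-degree position as defined above can be sketched as a simple structure (a rough illustration only; all names are hypothetical):

```python
from dataclasses import dataclass


@dataclass
class Pose6DoF:
    """A six-degrees-of-freedom pose: position along three orthogonal
    spatial axes plus orientation about three rotational axes."""
    x: float      # position along the x axis (e.g., meters)
    y: float      # position along the y axis
    z: float      # position along the z axis
    yaw: float    # rotation about the vertical axis (e.g., radians)
    pitch: float  # rotation about the lateral axis
    roll: float   # rotation about the longitudinal axis


# A device one meter right and 2.5 meters forward of its spatial origin,
# turned slightly about the vertical axis.
pose = Pose6DoF(x=1.0, y=0.0, z=-2.5, yaw=0.1, pitch=0.0, roll=0.0)
```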

Thereafter, the second device sends a request for information to the first device along with the timestamp of when the second device recognized the spatially aligned image. In response to the request, the first device sends the second device the location of the first device (e.g., a six-degree position) at or near that timestamp and the spatial origin of the first device.

Using the received location and spatial origin of the first device, the second device may calculate an offset between the second device and the first device, thereby establishing a shared coordinate space between the first device and the second device. The second device may then display the virtual image(s) of the AR game being displayed by the first device in the shared coordinate space. In some embodiments, the shared coordinate space may be accurate to within about one millimeter, i.e., the second device may align its coordinate space with the coordinate space of the first device to approximately millimeter accuracy.
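The offset calculation described above can be sketched in a simplified, translation-only form (a real implementation would also resolve rotation between the two spaces; the function names and numeric values below are illustrative, not part of the disclosure):

```python
def compute_offset(host_pos, joiner_pos, observed_host_offset):
    """Translation-only offset between two relative coordinate spaces.

    host_pos             -- host location in its own space at the timestamp
    joiner_pos           -- joiner location in its own space at the same instant
    observed_host_offset -- joiner-to-host vector derived from observing
                            the spatially aligned image
    Returns the translation that maps joiner-space points into host space.
    """
    # Position of the host expressed in the joiner's coordinate space.
    host_in_joiner = [j + o for j, o in zip(joiner_pos, observed_host_offset)]
    # Difference between the two descriptions of the same physical point.
    return [h - hj for h, hj in zip(host_pos, host_in_joiner)]


def to_shared(point_joiner, offset):
    """Map a joiner-space point into the shared (host) coordinate space."""
    return [p + o for p, o in zip(point_joiner, offset)]


offset = compute_offset(
    host_pos=[5.0, 0.0, 2.0],
    joiner_pos=[1.0, 0.0, 1.0],
    observed_host_offset=[2.0, 0.0, 0.0],
)
```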

Referring to fig. 2, a system 200 for creating a shared coordinate space in an augmented reality session (e.g., an AR game) between at least two devices (having disjoint relative coordinate spaces) is illustrated. The system 200 may facilitate the first user gaming device 210 to share the AR game initially hosted on the first user gaming device 210 with one or more second user gaming devices 220. The first user gaming device 210 and the second user gaming device 220 are coupled to each other, for example, using a high-speed wireless network connection. In some embodiments, the shared coordinate space creation component 110 comprises a portion of the components of the system 200. In some embodiments, the shared coordinate space creation component 110 includes all components of the system 200.

In some embodiments, the system 200 may overcome the constraints of some conventional shared AR gaming systems that utilize stationary reference objects to spatially anchor user gaming devices in a shared AR game. By utilizing stationary reference objects, the mobility of AR game users is severely limited. For example, the user cannot roam the park at will and decide to participate in the shared AR game. Furthermore, many conventional shared AR systems lack the ability to spatially pinpoint the user gaming device, resulting in inconsistent positioning of computer-generated images relative to real-world images and less-enjoyable shared AR games.

The first user gaming device 210 includes a shared AR game invitation component 230, the shared AR game invitation component 230 providing information to the second user gaming device 220 regarding a particular AR game and the coordinate space of the first user gaming device 210. The second user gaming device 220 includes a join shared AR game component 240 that coordinates communications with the first user gaming device 210 to create a shared coordinate space with the first user gaming device 210, for example, by aligning the coordinate space of the second user gaming device 220 with the coordinate space of the first user gaming device 210.

To share the AR game, the clocks of the first user gaming device 210 and the second user gaming device 220 may be synchronized. In some embodiments, the clocks of the first user gaming device 210 and the second user gaming device 220 are synchronized with each other. In some embodiments, the clocks of the first and second user gaming devices 210, 220 are synchronized with the AR system 250 (e.g., a local AR system and/or a cloud-based AR system). In some embodiments, the clocks of the first user gaming device 210 and the second user gaming device 220 are synchronized with a network time server (not shown).
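The clock synchronization described above could, for example, follow a simplified NTP-style round trip. This is a sketch under stated assumptions (a symmetric network delay, and a hypothetical `read_peer_clock` callable standing in for the actual network exchange), not the disclosed implementation:

```python
import time


def estimate_clock_offset(read_peer_clock, local_clock=time.monotonic):
    """Estimate the offset of a peer's clock relative to the local clock
    using one request/response round trip.

    read_peer_clock -- illustrative callable returning the peer's clock
                       reading at the moment it handled the request
    """
    t0 = local_clock()            # local time when the request is sent
    peer_time = read_peer_clock() # peer's clock reading
    t1 = local_clock()            # local time when the reply arrives
    # Assuming symmetric delay, the peer read its clock at the midpoint
    # of the round trip, so compare it against (t0 + t1) / 2.
    return peer_time - (t0 + t1) / 2.0
```

A device would add the estimated offset to its local readings when exchanging timestamps with the peer.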

The first user gaming device 210 and the second user gaming device 220 separately begin AR tracking (e.g., identify feature points) such that each user gaming device 210, 220 has its own relative coordinate space. For example, the first user gaming device 210 has its own spatial origin (e.g., 0,0,0 of a three-dimensional Cartesian coordinate system) and the second user gaming device 220 has its own spatial origin (e.g., 0,0,0 of another three-dimensional Cartesian coordinate system). In some embodiments, the coordinates are expressed as Cartesian coordinates (X, Y, Z). In some embodiments, the coordinates are expressed as global coordinates (latitude, longitude, altitude).

The user of the first user gaming device 210 may place virtual object(s) and/or virtual encounter(s) within the AR game. Thereafter, the user of the first user gaming device 210 may initiate sharing of the AR game with the user of the second user gaming device 220. The first user gaming device 210 and the second user gaming device 220 may share information (e.g., player identifiers) about the respective users directly and/or through the AR system 250.

The shared AR game invitation component 230 may display a spatially aligned image on the first user gaming device 210 that is viewable by the second user gaming device 220. The user of the first user gaming device 210 may further provide an AR game identifier to the user of the second user gaming device 220 to identify a particular AR game. In some embodiments, the AR game identifier may be a globally unique identifier (e.g., a GUID). In some embodiments, the AR game identifier may be a multi-dimensional barcode (e.g., a Quick Response ("QR") code, Aztec code, Data Matrix code, DataGlyph, MaxiCode, PDF417 code, Ultra Code, UCC RSS-2D barcode, and/or other optical code) displayed on the first user gaming device 210 by the shared AR game invitation component 230 and viewable by the second user gaming device 220. In some embodiments, the AR game identifier may be received electronically, for example, based on explicit user input (e.g., a share AR game identifier command) electronically communicated from the first user gaming device 210 to the second user gaming device 220, and/or based on the proximity of the second user gaming device 220 to the first user gaming device 210 and the display of the spatially aligned image on the first user gaming device 210.

Turning briefly to fig. 3, an exemplary user interface 300 displayed on the first user gaming device 210 is illustrated. The user interface 300 includes a spatially aligned image 310 that allows the second user gaming device 220 to determine its position relative to the first user gaming device 210. In some embodiments, the spatially aligned image 310 is displayed at a predetermined size and with a plurality of features (e.g., predefined particular groupings of pixels of predefined colors and/or predefined intensities) that allow the second user gaming device 220 to determine its position (e.g., a six-degree position) relative to the first user gaming device 210. The user interface 300 further includes a multi-dimensional barcode 320 (e.g., an identifier) that uniquely identifies the particular AR game.

Referring back to fig. 2, in displaying the spatially aligned images, the shared AR game invitation component 230 tracks (e.g., stores) location information about the first user gaming device 210 (e.g., information about the six degree position of the first user gaming device) at various time intervals using timestamps (e.g., associated with each location of the first user gaming device 210).
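The timestamped location tracking described above can be sketched as a small ring buffer supporting the "at or near the timestamp" lookup used later in the exchange (an illustrative sketch, not the patented implementation; the class and method names are hypothetical):

```python
from collections import deque


class PoseHistory:
    """Bounded buffer of (timestamp, pose) samples recorded while the
    spatially aligned image is displayed."""

    def __init__(self, maxlen=256):
        # deque(maxlen=...) discards the oldest sample once full.
        self._samples = deque(maxlen=maxlen)

    def record(self, timestamp, pose):
        """Store one timestamped pose sample."""
        self._samples.append((timestamp, pose))

    def at_or_near(self, timestamp):
        """Return the stored pose whose timestamp is closest to the
        requested timestamp, or None if nothing has been recorded."""
        if not self._samples:
            return None
        _, pose = min(self._samples, key=lambda s: abs(s[0] - timestamp))
        return pose
```

The host device would record into such a buffer while displaying the image, then answer the joiner's request by calling `at_or_near` with the joiner's timestamp.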

In some embodiments, the second user gaming device 220 may infer the user's desire to initiate joining of the AR game. For example, based at least in part on the proximity of the second user gaming device 220 to the first user gaming device 210, the join shared AR game component 240 may infer a desire of the user to initiate joining of the AR game.

In some embodiments, the user of the second user gaming device 220 may explicitly initiate the joining of the AR game. Turning briefly to fig. 4, the exemplary user interface 400 includes a "join friend's AR session" control 410 that, when selected, causes the join shared AR game component 240 to initiate joining of the AR game.

Referring back to fig. 2, the join shared AR game component 240 of the second user gaming device 220 may utilize the displayed spatially aligned image (e.g., using one or more cameras (not shown) of the second user gaming device 220) to spatially link the second user gaming device 220 to the first user gaming device 210. The join shared AR game component 240 may record the location (e.g., a six-degree position) of the second user gaming device 220 and a timestamp of when the second user gaming device 220 was spatially linked to the first user gaming device 210 (i.e., when the spatially aligned image was identified).

Thereafter, the second user gaming device 220 sends a request for information to the first user gaming device 210 along with a timestamp of when the spatial link by the second user gaming device 220 occurred. In response to the request, the first user gaming device 210 sends the second user gaming device 220 its location (e.g., a six degree position) at or near the timestamp at which the second user gaming device 220 was spatially linked to the first user gaming device 210, along with the spatial origin of the first user gaming device 210.

Using the received location and spatial origin of the first user gaming device 210, along with the spatial origin of the second user gaming device 220 and the location recorded when the second user gaming device 220 was spatially linked to the first user gaming device 210, the joining shared AR game component 240 may calculate an offset between the second user gaming device 220 and the first user gaming device 210. The joining shared AR game component 240 may utilize the calculated offset to establish a shared coordinate space between the first user gaming device 210 and the second user gaming device 220 by aligning the coordinate space of the second user gaming device 220 with the coordinate space of the first user gaming device 210.
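The offset calculation can be illustrated with a translation-only sketch (a full implementation would also align orientation, which is omitted here for clarity); all function and variable names are illustrative, not taken from the patent:

```python
import numpy as np

def compute_offset(first_pos_in_a, second_pos_in_b, first_rel_second):
    """Translation component of the coordinate-space alignment.

    first_pos_in_a:   first device's position in its own space A at the timestamp
    second_pos_in_b:  second device's recorded position in its own space B
                      at the moment of spatial linking
    first_rel_second: first device's position relative to the second device,
                      derived from the spatially aligned image (expressed in B)
    """
    first_pos_in_b = second_pos_in_b + first_rel_second
    # Adding this offset to any B-space point yields the A-space point.
    return first_pos_in_a - first_pos_in_b

def to_shared(point_in_b, offset):
    """Map a point from the second device's space B into the shared space A."""
    return point_in_b + offset
```

With the offset in hand, the second device renders the AR game's virtual objects in the first device's coordinate space, so both devices agree on where each object sits.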

The second user gaming device 220 and the first user gaming device 210 may then display and/or interact with the virtual image(s) of the AR game in the shared coordinate space. Once the shared coordinate space has been established, each user gaming device 210, 220 provides current location and interaction information to the other device(s) and/or the AR system 250.

The system 200 may thus allow two or more user gaming devices to align their respective views of the same virtual object such that the virtual object has the same orientation regardless of the perspective (e.g., viewed from any of the two or more user gaming devices). In some embodiments, the system 200 may be used to subsequently share the AR game with a plurality of other user gaming devices, where each of the plurality of other user gaming devices aligns its respective coordinate space with the coordinate space of the first user gaming device 210.

In some embodiments, system 200 may be used as part of a multiplayer AR gaming experience. For example, the AR gaming experience may allow the first user to begin building a virtual object or scene that includes multiple virtual objects utilizing virtual building blocks. A first user may utilize system 200 to share an AR game experience with one or more other users by providing an AR game identifier and displaying spatially aligned images that are viewable (e.g., simultaneously viewable and/or sequentially viewable) by the user gaming devices of the one or more other users. In this way, multiple users may participate in building the same virtual object or scene at the same time, such that all users see the same virtual object or scene in the same orientation, thereby enabling an accurate and immersive multiplayer AR gaming experience.

In some embodiments, the system 200 may be used to "anchor" virtual objects in highly changing environments (such as a home desktop). That is, the system 200 may coordinate the spatial alignment locking of devices regardless of the characteristics of the physical environment.

In some embodiments, both the first user gaming device 210 and the second user gaming device 220 each include a shared AR game invitation component 230 and a join shared AR game component 240. In this way, each device has the ability to initiate an invitation to join the shared AR game and to join the shared AR game.

For purposes of explanation and not limitation, the system 200 has been discussed with reference to spatially aligned images used by the joining shared AR game component 240 of the second user gaming device 220 to determine the spatial position of the second user gaming device 220 relative to the first user gaming device 210. In some embodiments, spatial sound may be utilized instead of spatially aligned images to allow the joining shared AR game component 240 to determine the spatial position of the second user gaming device 220 relative to the first user gaming device 210.

Game mode

Referring back to fig. 1, the system 100 further includes an augmented reality gaming mode component 120 that provides a first mode of an augmented reality game ("gaming mode") in which at least some of the plurality of users participate in game play actions that produce outcomes in accordance with pre-established game rules.

In some embodiments, once the shared coordinate space in the AR game has been established, the user may experience one of two modes during the AR game. During a first mode ("gaming mode"), a user uses a user gaming device to participate in the AR video game, where action(s) produce result(s) according to pre-established game rule(s).

For example, an AR game may include an open AR virtual game in which the user(s) virtually place piece(s) and go on adventures. Users can decide for themselves what they want to do. During the gaming mode, the user may participate in a creative mode game in which players may be given unlimited resources to build whatever they choose. During the gaming mode, the user may further participate in a survival mode, wherein players explore the world and mine resources to feed, house, and/or defend themselves.

Sandbox mode

The system 100 also includes an augmented reality sandbox mode component 130 that provides a second mode of an augmented reality game in which at least some of the plurality of users engage in non-destructive game play actions that do not produce a result once the second mode has terminated (e.g., session based, not persistent).

In a second mode ("sandbox mode" or "edit mode"), within the shared coordinate space of the AR game, two or more users participate in a non-destructive session of the AR game in which the action(s) produce only temporary or limited outcome(s) in accordance with at least some of the pre-established game rule(s). However, since the second mode is non-destructive, at least some of the outcome(s) may be reset upon exit of the second mode and/or in response to a game reset action.

In some embodiments, the AR game may include an open AR virtual game in which the user(s) virtually place piece(s) and go on adventures. In sandbox mode, the user may experience the AR game fully (e.g., according to the game rule(s)) or in a limited manner (e.g., based on predefined sandbox rule(s)).

In some embodiments, the user may selectively switch between the gaming mode and the sandbox mode.
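The relationship between the two modes can be sketched as a small state model in which entering sandbox mode snapshots the persistent world, so that sandbox edits are non-destructive and are discarded on exit; the class, method, and state names below are illustrative assumptions, not taken from the patent:

```python
import copy

class ARSession:
    """Sketch of gaming-mode vs. sandbox-mode state handling."""

    def __init__(self):
        self.world = {}        # persistent game state, e.g. placed pieces
        self.mode = "game"
        self._snapshot = None

    def enter_sandbox(self):
        # Snapshot the persistent world; sandbox results are session-based.
        self._snapshot = copy.deepcopy(self.world)
        self.mode = "sandbox"

    def place_piece(self, position, piece):
        # Game play action, permitted in either mode.
        self.world[position] = piece

    def exit_sandbox(self):
        # Non-destructive: sandbox results produce no outcome once the
        # mode terminates, so the pre-sandbox world is restored.
        self.world = self._snapshot
        self._snapshot = None
        self.mode = "game"
```

A game-reset action within sandbox mode could similarly restore the snapshot without leaving the mode.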

Sharing geographical location information

Turning to fig. 5, a system 500 for using geographic location information within an AR session is illustrated. The system 500 includes a plurality of user gaming devices 510 coupled to an augmented reality gaming system 520 via a network 530, such as the internet. Each of the user gaming devices 510 has an associated physical location. In some embodiments, the augmented reality gaming system 520 may utilize the latitude and longitude provided by the GPS of the user gaming device 510.

The augmented reality gaming system 520 may join multiple user gaming devices 510 into an augmented reality game, session, and/or experience. Using the information regarding the physical locations of the plurality of user gaming devices 510, the augmented reality gaming system 520 may determine, with respect to the first user gaming device 510, that the second user gaming device 510 has a physical location that is within a threshold physical distance (e.g., 10 feet, 100 feet, 1 mile) of the physical location of the first user gaming device 510.
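The threshold test can be sketched with a great-circle (haversine) distance over the GPS latitude/longitude fixes; the patent does not specify a distance formula, so the choice of haversine here is an assumption:

```python
import math

def within_threshold(lat1, lon1, lat2, lon2, threshold_m):
    """True if two GPS fixes are within threshold_m metres of each other
    (haversine great-circle distance)."""
    r = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a)) <= threshold_m
```

The augmented reality gaming system 520 could run such a check between the first user gaming device's fix and each other device's fix to decide which devices are eligible to receive the corresponding virtual action.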

The user of the first user gaming device 510 may initiate a virtual action that causes a corresponding virtual action to be displayed on a map of the virtual environment that is displayed via the second user gaming device 510 parallel to at least a portion of the physical environment.

In some embodiments, the virtual action includes an animation of an icon or other location indicator indicating the relative physical location of the first user gaming device 510. For example, the icon may be a moving or waving icon indicating the location of the first user gaming device 510 and the willingness of the user of the first user gaming device 510 to participate in a shared augmented reality gaming session. In this manner, the system 500 may facilitate a map-based collaborative experience.

In some embodiments, the virtual action includes an animated geo-location map indicator that is visible to other users (e.g., public participants) via their user gaming devices 510. For example, an animation including fireworks instantiated on the map of the virtual environment is a visual effect that is visible to other users based on the determined distance.

In some embodiments, within game play of augmented reality gaming system 520, shared geographic location augmentation with social effect(s) may be facilitated such that the effect(s) of a particular augmentation are applicable only to users of user gaming devices 510 within a threshold distance (e.g., within a particular radius). In some embodiments, these shared geographic location enhancements may be applied over time and confer additional benefits as more users participate in game play.

Referring briefly to fig. 6, an exemplary user interface 600 is illustrated. The user interface 600 displays a map of the virtual environment parallel to at least a portion of the physical environment as displayed by the second user gaming device. The user interface 600 includes a location indicator associated with the second user gaming device 610 (e.g., the user's own location) and a location indicator associated with the first user gaming device 620 (e.g., the first user's location).

The virtual action may be displayed via the user interface 600 in response to an action initiated by the first user on the first user gaming device 510. For example, the location indicator associated with the first user gaming device 620 may become animated or otherwise provide a geographic-location-based signal to the second user via the user interface 600.

The user interface 600 also includes a virtual item 630 (e.g., a virtual firework) that is placed or animated based on an action initiated by the first user on the first user gaming device 510.
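The receiving-side behavior of user interface 600 can be sketched as a minimal map model that tracks location indicators and queues virtual actions for rendering; the class and action names are illustrative assumptions:

```python
class VirtualEnvironmentMap:
    """Sketch of the parallel-map display on a receiving user gaming device."""

    def __init__(self):
        self.indicators = {}   # device_id -> (latitude, longitude)
        self.animations = []   # pending virtual actions to render

    def update_indicator(self, device_id, lat, lon):
        """Place or move a location indicator on the map."""
        self.indicators[device_id] = (lat, lon)

    def receive_virtual_action(self, device_id, action):
        """Queue a corresponding virtual action for a known sender.

        For example, "wave" might animate the sender's location indicator,
        while "fireworks" might instantiate a virtual firework at the
        sender's position on the map.
        """
        if device_id in self.indicators:
            self.animations.append((device_id, action))
```

Actions from devices without a known indicator (e.g., outside the threshold distance) are simply ignored in this sketch.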

Fig. 7 illustrates an exemplary method for sharing an augmented reality game within a shared coordinate space. Fig. 8 illustrates an exemplary method of using geographic location information within an AR session. Fig. 9-12 illustrate exemplary methods related to creating a shared coordinate space in an augmented reality session between at least two devices having disjoint relative coordinate spaces. While the methods are shown and described as a series of acts performed in a sequence, it will be understood that the methods are not limited by the order of the sequence. For example, some acts may occur in different orders than described herein. Additionally, an action may occur concurrently with another action. Moreover, in some instances, not all acts may be required to implement a methodology described herein.

Further, the acts described herein may be computer-executable instructions that may be implemented by one or more processors and/or stored on one or more computer-readable media. Computer-executable instructions may include routines, subroutines, programs, threads of execution, and the like. Additionally, the results of the acts of the methods may be stored in a computer readable medium, displayed on a display device, and the like.

Referring to fig. 7, a method 700 of sharing an augmented reality game (e.g., an AR video game) within a shared coordinate space is illustrated. In some embodiments, method 700 is performed by system 100.

At 710, a shared coordinate space in an augmented reality game between a plurality of user gaming devices is created. At 720, a first mode of the augmented reality game is provided to a plurality of users associated with the plurality of user gaming devices using the created shared coordinate space. In a first mode, at least some of the plurality of users participate in game play actions that produce results according to pre-established game rules. At 730, a second mode of the augmented reality game is provided to a plurality of users associated with the plurality of user gaming devices using the created shared coordinate space. In the second mode, at least some of the plurality of users engage in non-destructive game play actions that produce no result once the second mode has terminated (e.g., session-based, rather than permanent). In some embodiments, a user may selectively switch between a first mode ("gaming mode") and a second mode ("sandbox mode").

Turning to fig. 8, a method 800 of using geographic location information within an AR session is illustrated. In some embodiments, method 800 is performed by system 500.

At 810, a plurality of user gaming devices join an augmented reality game. At 820, at a first user gaming device of the plurality of user gaming devices, at least one other user gaming device of the plurality of user gaming devices having a physical location within a threshold physical distance of the physical location of the first user gaming device is determined.

At 830, at the first user gaming device, a virtual action is initiated that causes a corresponding virtual action to be displayed on a map of the virtual environment that is parallel to at least a portion of the physical environment. The map is displayed on the second user gaming device. At 840, at the second user gaming device, the corresponding virtual action is displayed on the map of the virtual environment.

Turning to fig. 9, a method 900 of creating a shared coordinate space in an augmented reality game between at least two devices having disjoint relative coordinate spaces is illustrated. In some embodiments, the method 900 is performed by the first user gaming device 210.

At 910, augmented reality tracking is performed to establish a relative coordinate space of the first user gaming device. At 920, the first user gaming device displays the spatially aligned image. At 930, an identifier of the augmented reality game is provided. At 940, location information about the first user gaming device and associated timestamps for at least a portion of the time that the spatially aligned images were displayed are stored.

At 950, a request for information is received from the second user gaming device, the request including a timestamp. At 960, in response to the request, location information regarding the first user gaming device at or near the timestamp and a spatial origin of the first user gaming device is provided.

Referring to fig. 10, a method 1000 of creating a shared coordinate space in an augmented reality game between at least two devices having disjoint relative coordinate spaces is illustrated. In some embodiments, the method 1000 is performed by the second user gaming device 220.

At 1010, augmented reality tracking is performed to establish a relative coordinate space of the second user gaming device. At 1020, a spatially aligned image displayed on the first user gaming device is identified and an identifier of an augmented reality game is received.

At 1030, the location of the second user gaming device within the coordinate space of the second user gaming device associated with the identification of the spatially aligned image and a timestamp associated with the clock of the second user gaming device are recorded.

At 1040, a request for information is sent to the first user gaming device, the request including the timestamp. At 1050, in response to the request, information is received from the first user gaming device that includes a location of the first user gaming device within a relative coordinate space of the first user gaming device at or near the timestamp and a spatial origin of the first user gaming device.

At 1060, an offset between the coordinate space of the second user gaming device and the coordinate space of the first user gaming device is calculated based at least in part on the received information (e.g., the location of the first user gaming device within the relative coordinate space of the first user gaming device at or near the timestamp and the spatial origin of the first user gaming device) and used to create the shared coordinate space. At 1070, the shared coordinate space and the identifier are used to display the augmented reality game.

Next, referring to fig. 11 and 12, a method 1100 of creating a shared coordinate space in an augmented reality session between at least two devices having disjoint relative coordinate spaces is illustrated. In some embodiments, method 1100 is performed by system 200.

At 1104, augmented reality tracking is performed by the first user gaming device and the second user gaming device to establish separate relative coordinate spaces for the first user gaming device and the second user gaming device.

At 1108, an identifier of the augmented reality game is provided by the first user gaming device. At 1112, the spatially aligned image is displayed by the first user gaming device. At 1116, location information regarding the first user gaming device (e.g., location within the relative coordinate space of the first user gaming device) and an associated timestamp for at least a portion of the time that the spatially aligned image was displayed by the first user gaming device is stored by the first user gaming device.

At 1120, the spatially aligned image displayed on the first user gaming device is identified by the second user gaming device. An identifier of the augmented reality game is further received by the second user gaming device.

At 1124, the location of the second user gaming device within the coordinate space of the second user gaming device associated with the identification of the spatially aligned image and a timestamp associated with the clock of the second user gaming device is recorded by the second user gaming device. At 1128, a request for information is sent by the second user gaming device to the first user gaming device. The request includes a timestamp.

At 1132, a request for information is received by the first user gaming device from the second user gaming device. At 1136, in response to the request, the first user gaming device provides (to the second user gaming device) location information regarding the first user gaming device at or near the timestamp and a spatial origin of the first user gaming device.

At 1140, information comprising the location of the first user gaming device at or near the timestamp within the relative coordinate space of the first user gaming device and the spatial origin of the first user gaming device is received from the first user gaming device in response to the request by the second user gaming device. At 1144, the second user gaming device calculates an offset between the coordinate space of the second user gaming device and the coordinate space of the first user gaming device based at least in part on the received information (e.g., the location of the first user gaming device within the relative coordinate space of the first user gaming device at or near the timestamp and the spatial origin of the first user gaming device) to create a shared coordinate space. At 1148, the second user gaming device displays/participates in the augmented reality game using the shared coordinate space and the identifier.

Described herein is a system for sharing an augmented reality game within a shared coordinate space, comprising: a computer comprising a processor and a memory having computer-executable instructions stored thereon that, when executed by the processor, cause the computer to: creating a shared coordinate space in an augmented reality game between a plurality of user gaming devices; providing a first mode of the augmented reality game to a plurality of users associated with the plurality of user gaming devices using the created shared coordinate space, wherein in the first mode at least some of the plurality of users participate in game play actions that produce results according to pre-established game rules; and using the created shared coordinate space to provide a second mode of the augmented reality game to a plurality of users associated with the plurality of user gaming devices, wherein in the second mode at least some of the plurality of users engage in non-destructive game play actions that produce no result once the second mode has terminated.

The system may further comprise: wherein the pre-established rules are determined and agreed upon by at least some of the plurality of users. The system may further comprise: wherein creating a shared coordinate space in an augmented reality game between a plurality of user gaming devices comprises: the plurality of user gaming devices includes a first user gaming device and a second user gaming device, and wherein by the second user gaming device: performing augmented reality tracking to establish a relative coordinate space of a second user gaming device; identifying a spatially aligned image displayed on a first user gaming device and receiving an identifier of an augmented reality game; recording a location of the second user gaming device within a coordinate space of the second user gaming device associated with the identification of the spatially aligned image and a timestamp associated with a clock of the second user gaming device; sending a request for information to a first user gaming device, the request including a timestamp; receiving, from the first user gaming device, information including a location of the first user gaming device within a relative coordinate space of the first user gaming device at or near the timestamp and a spatial origin of the first user gaming device in response to the request; calculating an offset between the coordinate space of the second user gaming device and the coordinate space of the first user gaming device based at least in part on the received information to create a shared coordinate space; and displaying the augmented reality game using the shared coordinate space and the identifier.

The system may further comprise: wherein the spatially aligned image is displayed in a predetermined size and in a plurality of features (including a predefined particular grouping of pixels of a predefined color and a predefined intensity) that allow the second user gaming device to determine its position (in the form of a six degree position) relative to the first user gaming device. The system may further comprise: wherein the clock of the second user gaming device is synchronized to the clock of the first user gaming device by the second user gaming device. The system may further comprise: wherein the game identifier is displayed on the first user gaming device and includes a multi-dimensional barcode.

The system may further comprise: wherein the augmented reality game comprises a multi-party augmented reality construction video game. The system may include a memory having stored thereon further computer-executable instructions that, when executed by the processor, cause the computer to: a virtual object associated with the augmented reality game is displayed on a display of the first user gaming device and a display of the second user gaming device. The system may further comprise: wherein the second user gaming device comprises a mobile phone.

Described herein is a method for sharing an augmented reality game within a shared coordinate space, comprising: creating a shared coordinate space in an augmented reality game between a plurality of user gaming devices; providing a first mode of the augmented reality game to a plurality of users associated with the plurality of user gaming devices using the created shared coordinate space, wherein in the first mode at least some of the plurality of users participate in game play actions that produce results according to pre-established game rules; and using the created shared coordinate space to provide a second mode of the augmented reality game to a plurality of users associated with the plurality of user gaming devices, wherein in the second mode at least some of the plurality of users engage in non-destructive game play actions that produce no result once the second mode has terminated.

The method may further comprise: wherein the plurality of user gaming devices includes a first user gaming device and a second user gaming device, and creating the shared coordinate space in the augmented reality game between the plurality of user gaming devices comprises: performing, by the first user gaming device and the second user gaming device, augmented reality tracking to establish separate relative coordinate spaces for the first user gaming device and the second user gaming device; providing, by a first user gaming device, an identifier of an augmented reality game; displaying, by the first user gaming device, the spatially aligned image; storing, by the first user gaming device, location information about the first user gaming device and an associated timestamp for at least a portion of the time that the spatially aligned image is displayed; identifying, by the second user gaming device, the spatially aligned image displayed on the first user gaming device and receiving an identifier of the augmented reality game; recording, by the second user gaming device, a location of the second user gaming device within a coordinate space of the second user gaming device associated with the identification of the spatially aligned image and a timestamp associated with a clock of the second user gaming device; sending, by the second user gaming device, a request for information to the first user gaming device, the request including a timestamp; receiving, by a first user gaming device, a request for information from a second user gaming device, the request including a timestamp; providing, by the first user gaming device, location information regarding the first user gaming device at or near the timestamp and a spatial origin of the first user gaming device in response to the request; receiving, by the second user gaming device and in response to the request, information from the first user gaming device including a location of the first user gaming device within a 
relative coordinate space of the first user gaming device at or near the timestamp and a spatial origin of the first user gaming device; calculating, by the second user gaming device, an offset between a coordinate space of the second user gaming device and a coordinate space of the first user gaming device based at least in part on the received information to create a shared coordinate space; and displaying, by the second user gaming device, the augmented reality game using the shared coordinate space and the identifier.

The method may further comprise wherein the spatially aligned image is displayed in a predetermined size and in a plurality of features (comprising a predefined particular grouping of pixels of a predefined color and a predefined intensity) that allow the second user gaming device to determine its position (in a six degree positioning form) relative to the first user gaming device.

The method may further comprise: the clock of the second user gaming device is further synchronized to the clock of the first user gaming device. The method may further comprise: wherein the game identifier is displayed on the first user gaming device and includes a multi-dimensional barcode. The method may further comprise: wherein the augmented reality game comprises a multi-party augmented reality construction video game. The method may further comprise: virtual objects associated with an augmented reality game are displayed on a first user gaming device and a second user gaming device.

Described herein is a method of using geographic location information within an augmented reality session, comprising: adding a plurality of user gaming devices to an augmented reality game; determining, at a first user gaming device of the plurality of user gaming devices, that at least one other user gaming device of the plurality of user gaming devices has a physical location that is within a threshold physical distance of the physical location of the first user gaming device; and initiating, at the first user gaming device, a virtual action that causes a corresponding virtual action to be displayed on a map of the virtual environment that is parallel to at least a portion of the physical environment, the map being displayed on the second user gaming device.

The method may further comprise: wherein the corresponding virtual action comprises movement of a virtual object representing the location of the first user gaming device. The method may further comprise: wherein the corresponding virtual action comprises an animation of a virtual object to be displayed on the map. The method may further comprise: wherein at the second user gaming device, the corresponding virtual action is displayed on a map of the virtual environment.

Referring to fig. 13, an example general purpose computer or computing device 1302 (e.g., a mobile phone, desktop, laptop, tablet, watch, server, handheld device, programmable consumer or industrial electronic, set-top box, gaming system, computing node, etc.) is illustrated. For example, computing device 1302 may be used in system 100 for sharing augmented reality games within a shared coordinate space and/or system 500 for using geographic location information within an AR session.

The computer 1302 includes one or more processors 1320, memory 1330, a system bus 1340, mass storage device(s) 1350, and one or more interface components 1370. The system bus 1340 communicatively couples at least the above-described system components. However, it is to be appreciated that in its simplest form, the computer 1302 can include one or more processors 1320 coupled to the memory 1330, the one or more processors 1320 executing various computer-executable acts, instructions, and/or components stored in the memory 1330. The instructions may be, for example, instructions for implementing the functions described as being performed by one or more of the components described above or instructions for implementing one or more of the methods described above.

Processor(s) 1320 may be implemented with a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any processor, controller, microcontroller, or state machine. Processor(s) 1320 may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, a multi-core processor, one or more microprocessors in conjunction with a DSP core, or any other such configuration. In one embodiment, the processor(s) 1320 may be a graphics processor.

Computer 1302 can include or otherwise interact with a variety of computer-readable media to facilitate controlling computer 1302 to implement one or more aspects of the claimed subject matter. Computer readable media can be any available media that can be accessed by computer 1302 and includes both volatile and nonvolatile media, and removable and non-removable media. Computer-readable media may include two distinct and mutually exclusive types, namely computer storage media and communication media.

Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes storage devices such as memory devices (e.g., Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), etc.), magnetic storage devices (e.g., hard disks, floppy disks, magnetic cassettes, magnetic tape, etc.), optical disks (e.g., Compact Disks (CDs), Digital Versatile Disks (DVDs), etc.), and solid state devices (e.g., Solid State Drives (SSDs), flash memory drives (e.g., cards, sticks, key drives), etc.), or any other similar medium that stores (as opposed to conveys or communicates) desired information accessible by computer 1302. Accordingly, computer storage media excludes modulated data signals as well as the media described below with respect to communication media.

Communication media embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media.

Memory 1330 and mass storage device(s) 1350 are examples of computer-readable storage media. Depending on the exact configuration and type of computing device, memory 1330 may be volatile (such as RAM), nonvolatile (such as ROM, flash memory, etc.), or some combination of the two. By way of example, the basic input/output system (BIOS), containing the basic routines to transfer information between elements within the computer 1302, such as during start-up, can be stored in nonvolatile memory, while volatile memory can act as external cache memory to facilitate processing by the processor(s) 1320, among other things.

The mass storage device(s) 1350 includes removable/non-removable, volatile/nonvolatile computer storage media that can be used to store a wide variety of data relative to the memory 1330. For example, mass storage device(s) 1350 includes, but is not limited to, one or more devices such as a magnetic or optical disk drive, floppy disk drive, flash memory, solid state drive, or memory stick.

The memory 1330 and mass storage device(s) 1350 may include or have stored therein an operating system 1360, one or more applications 1362, one or more program modules 1364, and data 1366. The operating system 1360 acts to control and allocate resources of the computer 1302. Applications 1362 include one or both of system and application software and may utilize management of resources by operating system 1360 to perform one or more actions through program modules 1364 and data 1366 stored in memory 1330 and/or mass storage device(s) 1350. Thus, the application 1362 may turn the general-purpose computer 1302 into a special-purpose machine in accordance with the logic provided thereby.

All or portions of the claimed subject matter can be implemented using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof to control a computer to implement the disclosed functionality. By way of example and not limitation, the system 100 or portions thereof can be or form part of an application 1362 and include one or more modules 1364 and data 1366 stored in memory and/or mass storage device(s) 1350 whose functionality can be implemented when executed by the one or more processors 1320.

According to a particular embodiment, the processor(s) 1320 may correspond to a system on a chip (SOC) or similar architecture that includes, or in other words integrates, hardware and software on a single integrated circuit die. In this case, the SOC can include one or more processors and memory, among other things, similar to the processor(s) 1320 and memory 1330. Conventional processors include a minimal amount of hardware and software and rely extensively on external hardware and software. By contrast, an SOC implementation of a processor is more powerful because it embeds hardware and software therein to enable specific functions with minimal or no reliance on external hardware and software. For example, the system 100 and/or associated functionality may be embedded within hardware in an SOC architecture.

The computer 1302 also includes one or more interface components 1370 that are communicatively coupled to the system bus 1340 and facilitate interaction with the computer 1302. By way of example, the interface component 1370 can be a port (e.g., serial, parallel, PCMCIA, USB, FireWire, etc.) or an interface card (e.g., voice, video, etc.), among others. In one example implementation, the interface component 1370 may be embodied as a user input/output interface that enables a user to enter commands and information into the computer 1302, e.g., by way of one or more input devices (e.g., a pointing device such as a mouse, a trackball, a stylus, a touch pad, a keyboard, a microphone, a joystick, a game pad, a satellite dish, a scanner, a camera, other computers, etc.), for example, in the form of one or more gestures or voice inputs. In another example implementation, the interface component 1370 may be embodied as an output peripheral interface that provides output to a display (e.g., LCD, LED, plasma, etc.), speakers, printer, and/or other computer, etc. Further, the interface component 1370 may be embodied as a network interface that enables communication with other computing devices (not shown), such as over wired or wireless communication links.

What has been described above includes examples of aspects of the claimed subject matter. It is, of course, not possible to describe every conceivable combination of components or methodologies for purposes of describing the claimed subject matter, but one of ordinary skill in the art may recognize that many further combinations and permutations of the claimed subject matter are possible. Accordingly, the disclosed subject matter is intended to embrace all such alterations, modifications and variations that fall within the spirit and scope of the appended claims. Furthermore, to the extent that the term "includes" is used in either the detailed description or the claims, such term is intended to be inclusive in a manner similar to the term "comprising" as "comprising" is interpreted when employed as a transitional word in a claim.
