Method and device for realizing free roaming in virtual space

Document No.: 330684 · Publication date: 2021-12-03

Reading note: This technology, "Method and device for realizing free roaming in virtual space" (一种在虚拟空间中实现自由游走的方法及其装置), was designed and created by Li Peilun (李沛伦) on 2021-09-03. Abstract: The invention provides a method for realizing free roaming in a virtual space, comprising the following steps: collecting a control signal of an external device; constructing a virtual camera and updating a transformation matrix of the virtual camera in the virtual space in real time according to the control signal of the external device; updating a virtual scene using a preset clipping matrix of the virtual camera, based on the transformation matrix updated in real time; and converting the virtual scene into a visible image. The method and the device enable a user to roam freely through a virtual scene by operating the external device, improving the user experience.

1. A method for realizing free roaming in a virtual space, comprising the steps of:

collecting a control signal of an external device;

constructing a virtual camera, and updating a transformation matrix of the virtual camera in a virtual space in real time according to a control signal of the external equipment;

updating a virtual scene using a preset clipping matrix of the virtual camera, based on the transformation matrix of the virtual camera updated in real time in the virtual space; and

converting the virtual scene into a visible image.

2. The method of claim 1, wherein the step of collecting the control signal of the external device comprises:

continuously collecting the control signal of the external device by polling.

3. The method of claim 2, wherein the step of updating the transformation matrix of the virtual camera in the virtual space in real time according to the control signal of the external device comprises:

when one or more polled control signals of the external device are non-zero, calculating the product of the polling interval duration and the control signal value as the movement position value and/or the rotation angle value of the virtual camera; and

updating the transformation matrix of the virtual camera in the virtual space according to the movement position value and/or the rotation angle value of the virtual camera.

4. The method for realizing free roaming in a virtual space according to claim 3, wherein the clipping matrix is generated according to a near clipping plane, a far clipping plane and a field of view preset for the virtual camera.

5. The method of claim 1, wherein the virtual scene is constructed by loading a real three-dimensional model through WebGL.

6. The method for realizing free roaming in a virtual space according to claim 1, further comprising the steps of:

receiving, at a remote server, the collected control signal of the external device; and

converting, at the remote server, the virtual scene into a visible image through video-stream rendering, and sending the visible image to a local client as a stream.

7. The method for realizing free roaming in a virtual space according to claim 6, wherein the rendered video stream is transmitted via WebRTC.

8. An apparatus for realizing free roaming in a virtual space, comprising:

a signal acquisition module configured to collect a control signal of an external device;

a virtual camera module configured to construct a virtual camera and to update a transformation matrix of the virtual camera in a virtual space in real time according to the control signal of the external device;

a virtual scene updating module configured to update a virtual scene using a preset clipping matrix of the virtual camera, based on the transformation matrix of the virtual camera updated in real time in the virtual space; and

a rendering module configured to convert the virtual scene into a visible image.

9. The apparatus for realizing free roaming in a virtual space according to claim 8, further comprising:

a remote server configured to receive the control signal of the external device collected by the signal acquisition module, and to send the visible image to a local client as a stream.

10. A computer program product comprising computer instructions, characterized in that the computer instructions, when executed by a processor, implement the steps of the method according to any one of claims 1-7.

Technical Field

The invention relates to the technical field of computer image processing, and in particular to a method and a device for realizing free roaming in a virtual space.

Background

With the development of three-dimensional virtual space technology, ways of touring a three-dimensional virtual space have gradually emerged. The mainstream mode of virtual space touring on the current market is point-location roaming: a series of panoramic observation points is constructed, and a virtual space is built at each observation point. The WebGL skybox is one example; its principle is to paste a group of six pictures onto the six faces of a skybox cube and observe the scene from inside the box with a virtual camera, producing a panoramic visual effect.

Such schemes can be further subdivided by whether a real three-dimensional model is used. One approach tours a real three-dimensional model, generating a skybox at a real three-dimensional spatial position for panoramic observation, which yields richer visual effects; the other does not use a real three-dimensional model and therefore has a poorer visual effect. Both use point-location observation, which has an obvious problem: switching between points is not smooth, so the user experience hits a bottleneck. A method and an apparatus for realizing free roaming in a virtual space are therefore needed, to give users a freer virtual-space browsing experience.

It is to be noted that the information disclosed in the background section above is only for enhancement of understanding of the background of the invention, and may therefore contain information that does not form part of the prior art already known to a person of ordinary skill in the art.

Disclosure of Invention

To solve at least some of the above problems in the prior art, a first object of the present invention is to provide a method for realizing free roaming in a virtual space. A second object of the present invention is to provide an apparatus for realizing free roaming in a virtual space. Further objects of the present invention are to provide a computer apparatus, a computer-readable storage medium, and a computer program product.

To achieve the above objects, according to an embodiment of the present invention, a first aspect of the present invention provides a method for realizing free roaming in a virtual space, comprising the following steps: collecting a control signal of an external device; constructing a virtual camera, and updating a transformation matrix of the virtual camera in a virtual space in real time according to the control signal of the external device; updating a virtual scene using a preset clipping matrix of the virtual camera, based on the transformation matrix updated in real time; and converting the virtual scene into a visible image.

The step of collecting the control signal of the external device may comprise continuously collecting the control signal by polling.

The step of updating the transformation matrix of the virtual camera in real time according to the control signal of the external device may comprise: when one or more polled control signals of the external device are non-zero, calculating the product of the polling interval duration and the control signal value as the movement position value and/or rotation angle value of the virtual camera; and updating the transformation matrix of the virtual camera in the virtual space according to the movement position value and/or rotation angle value.

The clipping matrix is generated from a near clipping plane, a far clipping plane and a field of view preset for the virtual camera. The virtual scene may be constructed by loading a real three-dimensional model through WebGL.

According to an embodiment of the present invention, the method for realizing free roaming in a virtual space may further comprise receiving the collected control signal of the external device at a remote server, in which case the steps of constructing the virtual camera, updating its transformation matrix in real time according to the control signal, and updating the virtual scene are executed at the remote server; the remote server converts the virtual scene into a visible image through video-stream rendering and sends it to the local client as a stream.

The virtual camera built at the remote server may be a UE4 camera, and the rendered video stream may be transmitted via WebRTC.

To achieve the above objects, according to an embodiment of the present invention, a second aspect of the present invention provides an apparatus for realizing free roaming in a virtual space, comprising: a signal acquisition module, which collects a control signal of an external device; a virtual camera module, which constructs a virtual camera and updates a transformation matrix of the virtual camera in a virtual space in real time according to the control signal of the external device; a virtual scene updating module, which updates the virtual scene using a preset clipping matrix of the virtual camera, based on the transformation matrix updated in real time; and a rendering module, which converts the virtual scene into a visible image.
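The four modules above map naturally onto interfaces. The sketch below is illustrative only: the type names, signatures, and the stub camera module are assumptions for this document, not part of the invention.

```typescript
// Illustrative module interfaces for the apparatus; all names are assumptions.
type ControlSignal = { forward: number; turn: number };
type Matrix4 = number[]; // 16 entries, column-major

interface SignalAcquisitionModule { collect(): ControlSignal; }
interface VirtualCameraModule { update(s: ControlSignal, dtSeconds: number): Matrix4; }
interface VirtualSceneUpdateModule { apply(transform: Matrix4, clip: Matrix4): void; }
interface RenderModule { present(): Uint8Array; } // the visible image, e.g. RGBA pixels

// A stub camera module: integrates forward speed into a z-axis translation
// (turn is ignored in this minimal stub).
class StubCamera implements VirtualCameraModule {
  private z = 0;
  update(s: ControlSignal, dtSeconds: number): Matrix4 {
    this.z += s.forward * dtSeconds;
    // Identity matrix with a z translation in the last column (column-major).
    return [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, this.z, 1];
  }
}
```

Repeated calls to `update` accumulate movement, matching the polling scheme described above.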

According to an embodiment of the present invention, the apparatus for realizing free roaming in a virtual space may further comprise a remote server that receives the control signal of the external device collected by the signal acquisition module, and the virtual camera module, the virtual scene updating module and the rendering module may be disposed at the remote server.

The virtual camera constructed by the virtual camera module may be a UE4 camera. The rendering module may convert the virtual scene into a visible image by WebRTC video-stream rendering and send the visible image to the local client as a stream.

To achieve the above objects, a third aspect of the present invention provides a computer apparatus comprising: a processor; a storage device; and a computer program stored on the storage device and executable on the processor, wherein the steps of the above method for realizing free roaming in a virtual space are implemented when the computer program is executed by the processor.

To achieve the above objects, according to an embodiment of the present invention, a fourth aspect of the present invention provides a computer-readable storage medium on which a computer program is stored, which, when executed by a processor, implements the steps of the above method for realizing free roaming in a virtual space.

To achieve the above objects, according to an embodiment of the present invention, a fifth aspect of the present invention provides a computer program product comprising computer instructions that, when executed by a processor, implement the steps of the above method for realizing free roaming in a virtual space.

The method and the apparatus for realizing free roaming in a virtual space provided by the present invention enable a user to browse a virtual scene freely by operating an external device, including moving, jumping, rotating and the like, improving the user experience.

Drawings

The above and other features of the present invention are described in detail below with reference to exemplary embodiments illustrated in the accompanying drawings. The drawings are given by way of illustration only and thus do not limit the invention. In the drawings:

fig. 1 shows an exemplary system architecture to which an embodiment of the method for realizing free roaming in a virtual space according to the present invention may be applied.

Fig. 2 shows a flowchart of a method for realizing free roaming in a virtual space according to an embodiment of the invention.

Fig. 3 is a detailed flowchart, refining fig. 2, of updating the transformation matrix of the virtual camera in the virtual space in real time according to the control signal of the external device.

FIG. 4 illustrates a detailed flowchart of one exemplary embodiment of converting a virtual scene into a visible image.

Fig. 5 is a flowchart of a method for realizing free roaming in a virtual space by means of server-side rendering according to another embodiment of the present invention.

Fig. 6 shows a schematic block diagram of an apparatus for realizing free roaming in a virtual space according to an embodiment of the present invention.

Fig. 7 shows a schematic block diagram of an apparatus for realizing free roaming in a virtual space according to another embodiment of the present invention.

FIG. 8 illustrates a block diagram of a computer system that may be used to implement an apparatus according to an embodiment of the invention.

Detailed Description

The present invention is described in detail below with reference to specific embodiments, so that those skilled in the art can practice it based on this disclosure. The embodiments described below are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art from the embodiments described herein without inventive effort fall within the scope of the present invention. It should be noted that, in the absence of conflict, the embodiments and the features of the embodiments in this specification may be combined with one another.

The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. The terms "first", "second" and the like are used solely to distinguish one feature, step, operation, element and/or component from another, and denote neither a particular technical meaning nor a necessary order. The term "plurality" may refer to two or more, and "at least one" may refer to one, two or more. Any reference to a feature, step, operation, element and/or component is generally to be understood as one or more, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising", when used herein, specify the presence of stated features, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components and/or groups thereof. The term "and/or" includes any and all combinations of one or more of the associated listed items. The suffixes "module" and "unit" are used merely for convenience of description, may be used interchangeably, and carry no distinguishing meaning or function.

Detailed descriptions of prior art that is apparent to those skilled in the art are omitted. It should further be understood that the description of each embodiment focuses on its differences from the other embodiments; the same or similar parts of different embodiments may be cross-referenced and, for brevity, are not repeated.

As schematically illustrated in fig. 1, an exemplary system architecture 100 is shown to which an embodiment of the method of the present invention for realizing free roaming in a virtual space may be applied. The system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 provides communication between the terminal devices 101, 102, 103 and the server 105, and may include various connection types, such as wired links, wireless links, or fiber-optic cables.

The user may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104. Various communication client applications may be installed on the terminal devices 101, 102, 103, such as image and video capture applications, text input applications, web browsers, domain-specific applications, search applications, instant messaging tools, mailbox clients, social platform software, and the like.

In a specific implementation, the terminal devices 101, 102, and 103 may be implemented as hardware or software according to actual needs. When implemented as hardware, the terminal devices 101, 102, 103 may be various electronic devices having (touch) displays and supporting various inputs of voice, text, etc., including but not limited to personal computers (including notebook computers and desktop computers), tablet computers, smart phones, in-vehicle terminals, e-book readers, video players, and the like. When implemented as software, the terminal devices 101, 102, 103 may be installed in a suitable electronic device, implemented as a plurality of software or software modules (e.g. to provide distributed services), or may be implemented as a single software or software module. It should be understood that the examples of terminal devices 101, 102, 103 depicted in fig. 1 and described above are provided herein as examples only and should not be construed as being particularly limiting.

The server 105 may be a server providing various services, such as a background server providing analysis, response, and support for various information, such as control signals, voice or text information, input by the terminal devices 101, 102, 103. The background server may analyze and process the received control signal, voice, or target text, and feed back the processing result to the terminal devices 101, 102, and 103 through the network 104.

In a specific implementation, the server 105 may be implemented as hardware or software according to actual needs. When implemented as hardware, the server 105 may be implemented as a distributed server cluster of multiple servers or as a single server. When implemented as software, the server 105 may be implemented as multiple pieces of software or software modules (e.g., to provide distributed services) or as a single piece of software or software module. It should be understood that the example server 105 depicted in fig. 1 and described above is by way of example only and should not be construed as being particularly limiting.

It should be noted that the method for realizing free-roaming in a virtual space provided in the embodiment of the present application may be executed by the terminal devices 101, 102, and 103, may be executed by the server 105, or may be executed by the terminal devices 101, 102, and 103 and the server 105 in cooperation. Accordingly, the means for realizing free-play in the virtual space may be provided in the terminal apparatuses 101, 102, and 103, the server 105, or the terminal apparatuses 101, 102, and 103 and the server 105.

It is to be understood that, when the method for implementing free-roaming in a virtual space provided by the embodiment of the present application is executed by the terminal devices 101, 102, 103, the system architecture 100 may not include the network 104 and the server 105.

It should be understood that the number and variety of terminal devices, networks, and servers in fig. 1 are merely illustrative. In particular implementations, there may be any number and variety of terminal devices, networks, and servers, depending on the actual needs.

As shown in fig. 2, a method for realizing free roaming in a virtual space according to an embodiment of the present invention includes the following steps: step S201, collecting a control signal of an external device, where the control signal may be collected continuously by polling; step S202, constructing a virtual camera and updating a transformation matrix of the virtual camera in the virtual space in real time according to the control signal of the external device; step S203, updating the virtual scene using a preset clipping matrix of the virtual camera, based on the transformation matrix updated in real time (generating and updating the virtual scene can readily be done with known tools, so the details are not repeated here); and step S204, converting the virtual scene into a visible image.

In step S201, the control signal of the external device may be collected continuously by polling. The external device may be any device, common or uncommon, capable of inputting a control signal, such as a mouse, a touch screen, a joystick, a gamepad, or a sensor that senses body movements or facial expressions. It may be connected by wire through an interface such as USB or HDMI, or wirelessly through Bluetooth, infrared, and the like.
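For a gamepad or joystick in a browser client, such polling might look like the sketch below. The `applyDeadzone` helper, which suppresses stick noise, is an illustrative assumption, not part of the patent; the loop uses the browser Gamepad API (`navigator.getGamepads`).

```typescript
// Illustrative polling sketch: read raw axis values and suppress stick noise.
function applyDeadzone(raw: number, threshold = 0.1): number {
  if (Math.abs(raw) < threshold) return 0;
  // Rescale so the output still spans the full [-1, 1] range outside the deadzone.
  const sign = raw < 0 ? -1 : 1;
  return (sign * (Math.abs(raw) - threshold)) / (1 - threshold);
}

// Browser-only sketch: poll the first connected gamepad (call this once per frame).
function pollGamepads(onSignal: (x: number, y: number) => void): void {
  const nav = (globalThis as any).navigator;
  const pads = nav && nav.getGamepads ? nav.getGamepads() : [];
  const pad = pads && pads[0];
  if (pad) onSignal(applyDeadzone(pad.axes[0]), applyDeadzone(pad.axes[1]));
}
```

Outside a browser, `pollGamepads` simply finds no gamepad and does nothing, so the helper can be exercised on its own.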

Referring to fig. 3, a detailed flowchart of step S202, updating the transformation matrix of the virtual camera in the virtual space in real time according to the control signal of the external device, is shown. As shown in fig. 3, in step S301 it is determined whether the control signal collected by polling in step S201 is zero; if it is zero, polling simply continues. If one or more control signals of the external device are found to be non-zero, then in step S302 the non-zero control signals are converted into virtual camera parameters using a packaged library: for example, the product of the interval between two polls and the control signal value is calculated as the movement position value and/or rotation angle value of the virtual camera. Concretely, suppose the external device is a joystick that can be moved forward, backward, left and right and rotated about its axis, each movement and rotation corresponding to its own signal output port. Depending on the extent of the stick's movement and/or rotation, a port outputs a signal with an intensity in the range [0, 1]. If the ports are polled at a frequency of N (fps), then in each interval of 1/N the product of the signal intensity and the time interval 1/N is obtained; following the idea of integration in calculus, at any moment the accumulated movement distance (movement position value) in each direction and/or the accumulated rotation angle (rotation angle value) can thus be obtained. Subsequently, in step S303, the transformation matrix of the virtual camera in the virtual space is updated according to the movement position value and/or rotation angle value.
The virtual camera can then update its clipped view of the virtual space accordingly, letting the user steer toward exactly the view of the virtual space they want to see.
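The accumulation described above, a discrete integral of signal intensity over polling intervals, can be sketched in a few lines. This is a minimal illustration assuming normalized signal samples and a fixed polling rate:

```typescript
// Discrete integration: at a polling rate of N fps, each polled sample
// contributes signal * (1/N) to the accumulated movement distance
// (movement position value) or rotation angle (rotation angle value).
function accumulate(samples: number[], pollRateFps: number): number {
  const dt = 1 / pollRateFps; // polling interval duration
  return samples.reduce((total, s) => total + s * dt, 0);
}

// Example: a stick held at full deflection (signal = 1) for 60 polls at
// 60 fps accumulates one unit of movement, i.e. one unit per second.
```

Multiplying the accumulated value by a speed factor would convert it into world-space units, a detail left to the concrete implementation.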

Referring to fig. 4, a detailed flowchart of one exemplary embodiment of converting a virtual scene into a visible image is shown. As shown in fig. 4, a virtual scene is first constructed in step S401 by loading a real three-dimensional model, which may be done in a manner known in the art, e.g. through WebGL. Subsequently, in step S402, the position and viewing direction of the virtual camera are updated according to the movement position value and/or rotation angle value determined in step S302, and from the camera position and viewing direction the virtual camera forms its transformation matrix, i.e. the scene's view matrix. In step S403, a clipping matrix (i.e. a projection matrix) is formed from the preset near clipping plane (near plane), far clipping plane (far plane) and field of view (FoV) of the virtual camera. In step S404, the virtual scene is transformed into a visible image by the two generated matrices, the view matrix and the clipping matrix, and finally presented to the user on the screen.
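As a concrete illustration of step S403, a WebGL-style perspective clipping matrix can be built from the near plane, far plane and vertical field of view as below. The column-major layout and sign conventions follow common WebGL practice and are an assumption of this sketch, not prescribed by the patent:

```typescript
// Column-major perspective projection ("clipping") matrix, WebGL convention.
// fovYRadians: vertical field of view; aspect: viewport width / height.
function perspectiveMatrix(
  fovYRadians: number,
  aspect: number,
  near: number,
  far: number,
): number[] {
  const f = 1 / Math.tan(fovYRadians / 2); // focal-length factor from the FoV
  const rangeInv = 1 / (near - far);       // maps [near, far] onto clip depth
  return [
    f / aspect, 0, 0, 0,
    0, f, 0, 0,
    0, 0, (near + far) * rangeInv, -1,
    0, 0, 2 * near * far * rangeInv, 0,
  ];
}
```

Multiplying the view matrix by this clipping matrix takes scene coordinates into clip space, after which the rasterizer produces the visible image.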

The method for realizing free roaming in a virtual space described above with reference to figs. 2 to 4 uses local rendering. A method according to another embodiment of the invention can also be realized by server-side rendering, described below with reference to fig. 5.

Fig. 5 is a flowchart of a method for realizing free roaming in a virtual space by means of server-side rendering according to another embodiment of the present invention. Step S501 is the same as step S201 in fig. 2: the local client collects the control signal of the external device. Subsequently, in step S502, the remote server receives the control signal collected in step S501 and pushed to it by the local client. In step S503, the remote server constructs a virtual camera, which may be a UE4 camera, converts the external-device control signal into a movement position value and/or rotation angle value of the UE4 camera, realizes forward/backward/left/right movement, view rotation, jumping and the like, and updates the transformation matrix of the virtual camera in the virtual space in real time. In step S504, the remote server updates the virtual scene according to the transformation matrix updated in real time; as above, this can readily be done with known tools, so the details are not repeated here. In step S505, the remote server converts the virtual scene into a visible image by video-stream rendering and sends it to the local client as a stream; the stream may be transmitted and pushed back to the local client via WebRTC, after which the browser presents the virtual scene to the user as video.
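One way the local client's control signals might be serialized for pushing to the remote server (step S502) is sketched below. The message fields and the JSON transport are assumptions for illustration only, whether the payload is carried over a WebRTC data channel or a WebSocket:

```typescript
// Illustrative control-signal message pushed from the local client to the
// remote server on each poll. Field names are assumptions, not from the patent.
interface ControlMessage {
  seq: number;                    // monotonically increasing; lets the server drop stale input
  timestampMs: number;            // client time, usable for the integration interval
  axes: { x: number; y: number }; // movement signal, normalized
  rotate: number;                 // rotation signal, normalized
  jump: boolean;
}

function encodeControl(msg: ControlMessage): string {
  return JSON.stringify(msg);
}

function decodeControl(payload: string): ControlMessage {
  return JSON.parse(payload) as ControlMessage;
}
```

The server would decode each message, feed it to the UE4 camera update of step S503, and stream the rendered frames back.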

Fig. 6 shows a schematic block diagram of an apparatus 600 for realizing free roaming in a virtual space according to an embodiment of the present invention; the apparatus 600 may be applied to various electronic devices. As shown in fig. 6, the apparatus 600 includes: a signal acquisition module 601, which collects a control signal of an external device; a virtual camera module 602, which constructs a virtual camera and updates a transformation matrix of the virtual camera in a virtual space in real time according to the control signal of the external device; a virtual scene updating module 603, which updates the virtual scene using a preset clipping matrix of the virtual camera, based on the transformation matrix updated in real time; and a rendering module 604, which converts the virtual scene into a visible image.

Fig. 7 shows a schematic block diagram of an apparatus 700 for realizing free roaming in a virtual space according to another embodiment of the present invention; the apparatus 700 implements the server-side-rendering method shown in fig. 5. As shown in fig. 7, compared with the apparatus 600 of fig. 6, the apparatus 700 includes, in addition to a signal acquisition module 601 that is the same as or similar to that of fig. 6, a remote server that receives the control signal of the external device collected by the signal acquisition module 601. A virtual camera module 602', a virtual scene updating module 603' and a rendering module 604', the same as or similar to the virtual camera module 602, the virtual scene updating module 603 and the rendering module 604, are all disposed at the remote server and implement the corresponding steps S503, S504 and S505 of the method shown in fig. 5.

Referring to FIG. 8, there is shown a schematic block diagram of a computer system that may be used to implement an apparatus of an embodiment of the invention. It should be noted that the apparatus shown in fig. 8 is only an example, and should not be construed as limiting the embodiments of the present application in any way. The computer system shown in fig. 8 includes a Central Processing Unit (CPU)801, which can perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM)802 or a program loaded from a storage unit 808 into a Random Access Memory (RAM) 803. In the RAM 803, various programs and data necessary for the operation of the computer system are also stored. The CPU 801, ROM 802, and RAM 803 are connected to each other via a bus 804. An input/output (I/O) interface 805 is also connected to bus 804.

The following components are connected to the I/O interface 805: an input unit 806 including a keyboard, a mouse, a microphone, a touch screen, and the like; an output unit 807 including a display such as a liquid crystal display or light-emitting-diode display, a speaker, and the like; a storage unit 808 including a hard disk and the like; and a communication unit 809 including a network interface card such as a WAN/LAN card or a modem. The communication unit 809 performs communication processing via a network such as the internet or a local area network. A drive 810 may also be connected to the I/O interface 805 as needed. A removable medium 811 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory is mounted on the drive 810 as necessary, so that a computer program read from it can be installed into the storage unit 808 as needed.

In another aspect, the present application also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments, or may exist separately without being assembled into the apparatus. The computer-readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to perform the steps of the above method for realizing free roaming in a virtual space.

In yet another aspect, the present application further provides a computer program product comprising computer instructions which, when executed by a processor, implement the steps of the above-described method for realizing free walking in a virtual space according to the present invention.

In particular, the embodiments described above with reference to the flowcharts in the figures may be implemented as computer software programs. For example, the embodiments disclosed in the present specification include a computer program product containing program instructions or code for executing the method for realizing free walking in a virtual space according to the present invention shown in the flowcharts of the drawings. In such an embodiment, the computer program may be downloaded and installed from a network via the communication unit 809 and/or installed from the removable medium 811. The method of the present invention is executed when the computer program is executed by the Central Processing Unit (CPU) 801.

It should be noted that the computer readable medium described herein can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: a computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM), a flash memory, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.

In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.

Computer program code for carrying out operations for aspects of the present application may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, or C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).

The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. Each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.

The units or modules referred to in the embodiments of the present application may be implemented by software, or may be implemented by hardware. The above units or modules may also be provided in the processor, and may be described as: a processor includes a signal acquisition module, a virtual camera module, a virtual scene generation module, and a rendering module. The names of these units or modules do not in some cases constitute a limitation of the unit or module itself, for example, the signal acquisition module may also be described as a "module that acquires signals".
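As a purely illustrative sketch (all class names below are hypothetical and are not the claimed apparatus), the four modules named above could be composed in a processor-hosted pipeline as follows:

```python
class SignalAcquisitionModule:
    """A module that acquires signals from the external device."""
    def acquire(self):
        # Poll the external device; a fixed sample stands in here.
        return {"dx": 0.5, "dy": 0.0}

class VirtualCameraModule:
    """Maintains the virtual camera's state in the virtual space."""
    def __init__(self):
        self.position = [0.0, 0.0]
    def update(self, signal):
        # Update the camera from the acquired control signal.
        self.position[0] += signal["dx"]
        self.position[1] += signal["dy"]
        return self.position

class VirtualSceneGenerationModule:
    """Updates the virtual scene for the current camera state."""
    def update_scene(self, camera_position):
        return {"camera": list(camera_position)}

class RenderingModule:
    """Converts the virtual scene into a visual picture."""
    def render(self, scene):
        return f"frame@{scene['camera']}"

class Processor:
    """A processor comprising the four modules, run once per frame."""
    def __init__(self):
        self.signals = SignalAcquisitionModule()
        self.camera = VirtualCameraModule()
        self.scene = VirtualSceneGenerationModule()
        self.renderer = RenderingModule()
    def step(self):
        pos = self.camera.update(self.signals.acquire())
        return self.renderer.render(self.scene.update_scene(pos))

frame = Processor().step()
```

In practice `step` would run in a loop, with acquisition polled continuously as described in claim 2; the sketch only shows how the module boundaries map onto the names given above.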

All documents mentioned in this specification are herein incorporated by reference as if each were incorporated by reference in its entirety.

Furthermore, it should be understood that various changes or modifications can be made by those skilled in the art after reading the above description of the present invention, and such equivalents also fall within the scope of the present invention.
