Virtual map display method, apparatus, device, and storage medium

Document No.: 199588    Publication date: 2021-11-05

Reading note: This technology, "Virtual map display method, apparatus, device, and storage medium," was designed and created by Ma Tianmu on 2021-08-05. Its main content is as follows: The application discloses a virtual map display method, apparatus, device, and storage medium, belonging to the field of computer technology. The method includes: displaying a planar virtual map of the virtual world on a graphical user interface; and, in response to detecting a map mode switching operation, switching the planar virtual map to a stereoscopic virtual map of the virtual world, wherein the stereoscopic virtual map is displayed in the form of a spatial geometric solid and is formed by splicing, in three-dimensional space, the map elements that constitute the planar virtual map. Because the map elements are assembled into a virtual map shaped as a spatial geometric solid, the displayed stereoscopic virtual map has a visual sense of depth and a pleasing appearance, improving the visual effect of the virtual map.

1. A virtual map display method, the method comprising:

displaying a planar virtual map of a virtual world on a graphical user interface;

in response to detecting a map mode switching operation, switching the planar virtual map to a stereoscopic virtual map of the virtual world;

wherein the stereoscopic virtual map is displayed in the form of a spatial geometric solid and is formed by splicing, in three-dimensional space, the map elements that constitute the planar virtual map.

2. The method of claim 1, wherein a first control is displayed on the graphical user interface and is used for switching map modes; and the switching the planar virtual map to the stereoscopic virtual map in response to detecting the map mode switching operation comprises:

in response to detecting a touch operation on the first control, switching the planar virtual map to the stereoscopic virtual map.

3. The method of claim 1, wherein a map thumbnail is displayed on the graphical user interface and is used for switching map modes; and the switching the planar virtual map to the stereoscopic virtual map in response to detecting the map mode switching operation comprises:

in response to detecting a touch operation on the map thumbnail, switching the planar virtual map to the stereoscopic virtual map.

4. The method of claim 1, wherein the switching the planar virtual map to the stereoscopic virtual map in response to detecting the map mode switching operation comprises:

in response to detecting a target gesture operation on the planar virtual map, switching the planar virtual map to the stereoscopic virtual map.

5. The method of claim 1, further comprising:

in response to detecting a map panning operation directed from a first map edge toward a second map edge, panning the planar virtual map on the graphical user interface if the planar virtual map has already been displayed up to the first map edge;

wherein the first map edge is located opposite the second map edge, and the panned planar virtual map displays the peripheral area adjoining the second map edge.

6. The method of claim 1, further comprising:

in response to detecting a position-clicking operation on the stereoscopic virtual map, displaying a position marker at the selected target position and displaying at least one function control on the graphical user interface, wherein one function control indicates one marching type of a virtual queue;

and in response to a touch operation on any one of the function controls, displaying, on the stereoscopic virtual map, marching information matching the marching type indicated by the selected function control.

7. The method of claim 1, further comprising:

displaying position markers on the stereoscopic virtual map, wherein each position marker marks one selected target position on the stereoscopic virtual map;

in response to detecting a selection operation on any one of the position markers, displaying at least one function control on the graphical user interface, wherein one function control indicates one marching type of a virtual queue;

and in response to a touch operation on any one of the function controls, displaying, on the stereoscopic virtual map, marching information matching the marching type indicated by the selected function control.

8. The method according to claim 6 or 7, wherein the displaying, in response to a touch operation on any one of the function controls, marching information matching the marching type indicated by the selected function control on the stereoscopic virtual map comprises:

in response to a touch operation on a second control, displaying a starting position list and a virtual queue list on the graphical user interface, wherein the starting position list shows at least one currently available starting position and the virtual queue list shows at least one currently available virtual queue;

and displaying, on the stereoscopic virtual map, the marching route of the selected virtual queue based on the selected starting position and the selected target position.

9. The method according to any one of claims 1 to 8, further comprising:

in response to detecting the map mode switching operation, switching the movement mode of a virtual camera in the virtual world from planar movement to orbital movement around the spatial geometric solid.

10. The method according to any one of claims 1 to 8, further comprising:

acquiring spatial position coordinates of each map element forming the planar virtual map;

determining a reference position, and acquiring the rotation angle of each map element relative to the reference position;

and splicing the map elements according to the types of the map elements constituting the planar virtual map and the spatial position coordinates and rotation angles of the map elements, to obtain the stereoscopic virtual map.

11. The method of claim 10, wherein the determining a reference position and acquiring a rotation angle of each map element relative to the reference position comprises at least one of:

determining a zero-longitude position as the reference position, acquiring the longitude of each map element relative to the reference position, and determining the rotation angle of each map element at its respective position according to that longitude; and

determining a zero-latitude position as the reference position, acquiring the latitude of each map element relative to the reference position, and determining the rotation angle of each map element at its respective position according to that latitude.

12. The method of claim 9, further comprising:

acquiring the radius of the spatial geometric solid according to the width of a single map element constituting the planar virtual map and the total number of map elements constituting the planar virtual map;

and determining the orbit height at which the virtual camera performs orbital movement according to the radius of the spatial geometric solid and a target offset.

13. A virtual map display apparatus, characterized in that the apparatus comprises:

a first display module configured to display a planar virtual map of a virtual world on a graphical user interface;

a second display module configured to switch the planar virtual map to a stereoscopic virtual map of the virtual world in response to detecting a map mode switching operation;

wherein the stereoscopic virtual map is displayed in the form of a spatial geometric solid and is formed by splicing, in three-dimensional space, the map elements that constitute the planar virtual map.

14. A computer device, comprising a processor and a memory, wherein at least one program code is stored in the memory and is loaded and executed by the processor to implement the virtual map display method according to any one of claims 1 to 12.

15. A computer-readable storage medium, wherein at least one program code is stored therein and is loaded and executed by a processor to implement the virtual map display method according to any one of claims 1 to 12.

Technical Field

The present application relates to the field of computer technologies, and in particular, to a method, an apparatus, a device, and a storage medium for displaying a virtual map.

Background

The war strategy game is a battle strategy simulation game based on a virtual world. The functionality provided by this type of game includes, but is not limited to: displaying a virtual map, creating a virtual queue (such as a marching army), creating a game league, displaying a marching route, and the like. Since displaying the virtual map is the basis for implementing the other functions described above, how to present the virtual map to the player has become a research hotspot in the field.

Disclosure of Invention

The embodiments of the present application provide a virtual map display method, apparatus, device, and storage medium. The technical solutions are as follows:

in one aspect, a virtual map display method is provided, and the method includes:

displaying a planar virtual map of the virtual world on the graphical user interface;

in response to detecting a map mode switching operation, switching the planar virtual map to a stereoscopic virtual map of the virtual world;

wherein the stereoscopic virtual map is displayed in the form of a spatial geometric solid and is formed by splicing, in three-dimensional space, the map elements that constitute the planar virtual map.

In another aspect, there is provided a virtual map display apparatus, the apparatus including:

a first display module configured to display a planar virtual map of a virtual world on a graphical user interface;

a second display module configured to switch the planar virtual map to a stereoscopic virtual map of the virtual world in response to detecting a map mode switching operation;

wherein the stereoscopic virtual map is displayed in the form of a spatial geometric solid and is formed by splicing, in three-dimensional space, the map elements that constitute the planar virtual map.

In some embodiments, a first control is displayed on the graphical user interface and is used for switching map modes; and the second display module is configured to:

in response to detecting a touch operation on the first control, switch the planar virtual map to the stereoscopic virtual map.

In some embodiments, a map thumbnail is displayed on the graphical user interface and is used for switching map modes; and the second display module is configured to:

in response to detecting a touch operation on the map thumbnail, switch the planar virtual map to the stereoscopic virtual map.

In some embodiments, the second display module is configured to:

in response to detecting a target gesture operation on the planar virtual map, switch the planar virtual map to the stereoscopic virtual map.

In some embodiments, the first display module is further configured to:

in response to detecting a map panning operation directed from a first map edge toward a second map edge, pan the planar virtual map on the graphical user interface if the planar virtual map has already been displayed up to the first map edge;

wherein the first map edge is located opposite the second map edge, and the panned planar virtual map displays the peripheral area adjoining the second map edge.

In some embodiments, the apparatus further comprises:

a third display module configured to, in response to detecting a position-clicking operation on the stereoscopic virtual map, display a position marker at the selected target position and display at least one function control on the graphical user interface, wherein one function control indicates one marching type of a virtual queue;

the second display module is configured to, in response to a touch operation on any one of the function controls, display, on the stereoscopic virtual map, marching information matching the marching type indicated by the selected function control.

In some embodiments, the third display module is further configured to display position markers on the stereoscopic virtual map, wherein each position marker marks one selected target position on the stereoscopic virtual map; and, in response to detecting a selection operation on any one of the position markers, display at least one function control on the graphical user interface, wherein one function control indicates one marching type of a virtual queue;

the second display module is configured to, in response to a touch operation on any one of the function controls, display, on the stereoscopic virtual map, marching information matching the marching type indicated by the selected function control.

In some embodiments, the third display module is further configured to display a starting position list and a virtual queue list on the graphical user interface in response to a touch operation on a second control of the at least one function control, wherein the starting position list shows at least one currently available starting position and the virtual queue list shows at least one currently available virtual queue;

the second display module is configured to display, on the stereoscopic virtual map, the marching route of the selected virtual queue based on the selected starting position and the selected target position.

In some embodiments, the apparatus further comprises:

an acquisition module configured to acquire spatial position coordinates of respective map elements constituting the planar virtual map;

a first processing module configured to determine a reference position, and acquire a rotation angle of each map element relative to the reference position;

and a second processing module configured to splice the map elements according to the types of the map elements constituting the planar virtual map and the spatial position coordinates and rotation angles of the map elements, to obtain the stereoscopic virtual map.

In some embodiments, the first processing module is configured to perform at least one of:

determining a zero-longitude position as the reference position, acquiring the longitude of each map element relative to the reference position, and determining the rotation angle of each map element at its respective position according to that longitude; and

determining a zero-latitude position as the reference position, acquiring the latitude of each map element relative to the reference position, and determining the rotation angle of each map element at its respective position according to that latitude.
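The longitude-based variant above can be sketched as a minimal Python example. This is an illustrative interpretation rather than code from the application; the function name, the use of degrees, and the normalization to [0, 360) are all assumptions:

```python
def rotation_angle(longitude: float, reference_longitude: float = 0.0) -> float:
    """Derive a map element's rotation angle from its longitude offset
    relative to the reference (zero-longitude) position.

    Illustrative sketch: a zero-longitude element gets angle 0, and the
    angle grows with the longitude offset, wrapping at 360 degrees.
    """
    return (longitude - reference_longitude) % 360.0
```

The latitude-based variant would be analogous, with latitude replacing longitude and the rotation applied about a different axis.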

In some embodiments, the apparatus further comprises:

a third processing module configured to switch the movement mode of the virtual camera in the virtual world from planar movement to orbital movement around the spatial geometric solid in response to detecting the map mode switching operation.

In some embodiments, the apparatus further comprises:

a fourth processing module configured to acquire the radius of the spatial geometric solid according to the width of a single map element constituting the planar virtual map and the total number of map elements constituting the planar virtual map,

and to determine the orbit height at which the virtual camera performs orbital movement according to the radius of the spatial geometric solid and the target offset.
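The radius and orbit-height computation described above can be illustrated with a small sketch, assuming the spatial geometric solid is a cylinder whose circumference is tiled edge-to-edge by one ring of map elements. The function names and the additive interpretation of the target offset are assumptions for illustration:

```python
import math

def cylinder_radius(tile_width: float, tiles_per_ring: int) -> float:
    # If one ring of the cylinder is tiled by `tiles_per_ring` elements of
    # width `tile_width`, the circumference is their total width, and
    # radius = circumference / (2 * pi).
    return tile_width * tiles_per_ring / (2 * math.pi)

def orbit_height(radius: float, target_offset: float) -> float:
    # The camera's orbit sits at the solid's radius plus a chosen offset,
    # keeping the camera outside the map surface.
    return radius + target_offset
```

A larger target offset moves the camera farther from the surface, widening its field of view over the stereoscopic map.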

In another aspect, a computer device is provided, the device comprising a processor and a memory, the memory having stored therein at least one program code, the at least one program code being loaded and executed by the processor to implement the virtual map display method described above.

In another aspect, there is provided a computer-readable storage medium having at least one program code stored therein, the at least one program code being loaded and executed by a processor to implement the virtual map display method described above.

In another aspect, a computer program product or computer program is provided, comprising computer program code stored in a computer-readable storage medium. A processor of a computer device reads the computer program code from the computer-readable storage medium and executes it, causing the computer device to perform the virtual map display method described above.

The stereoscopic virtual map displayed by the present application has the shape of a spatial geometric solid; that is, the map elements are spliced into a virtual map shaped as a spatial geometric solid. A virtual map of this shape has a visual sense of depth and a pleasing appearance, improving the visual effect of the virtual map.

Drawings

To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings based on these drawings without creative effort.

Fig. 1 is a schematic diagram of an implementation environment related to a virtual map display method provided in an embodiment of the present application;

fig. 2 is a flowchart of a virtual map display method according to an embodiment of the present application;

fig. 3 is a schematic diagram of a map mode switching provided in an embodiment of the present application;

fig. 4 is a schematic view of a planar virtual map provided in an embodiment of the present application;

FIG. 5 is a schematic diagram of a map interaction provided by an embodiment of the present application;

FIG. 6 is a schematic diagram of another map mode switch provided in an embodiment of the present application;

FIG. 7 is a schematic diagram of another map mode switch provided in an embodiment of the present application;

FIG. 8 is a schematic diagram of another map interaction provided by embodiments of the present application;

FIG. 9 is a schematic diagram of another map interaction provided by embodiments of the present application;

FIG. 10 is a schematic diagram of another map interaction provided by embodiments of the present application;

FIG. 11 is a schematic illustration of a march provided by an embodiment of the present application;

FIG. 12 is a schematic view of a field of view of a virtual camera provided in an embodiment of the present application;

fig. 13 is a flowchart of another virtual map display method provided in the embodiment of the present application;

FIG. 14 is a schematic diagram of a map element provided by an embodiment of the present application;

fig. 15 is a schematic diagram of a moving path of a virtual camera provided in an embodiment of the present application;

fig. 16 is a schematic view of a stereoscopic virtual map provided in an embodiment of the present application;

fig. 17 is a schematic structural diagram of a virtual map display apparatus according to an embodiment of the present application;

fig. 18 is a schematic structural diagram of a computer device according to an embodiment of the present application.

Detailed Description

To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.

The terms "first," "second," and the like in this application are used to distinguish between items that are identical or similar in function. It should be understood that "first," "second," and "nth" imply no logical or temporal dependency and do not limit quantity or order of execution. It will be further understood that, although the following description uses the terms first, second, and so on to describe various elements, these elements should not be limited by these terms.

These terms are only used to distinguish one element from another. For example, a first element can be termed a second element and, similarly, a second element can be termed a first element without departing from the scope of the various examples. The first element and the second element may both be elements and, in some cases, may be separate and distinct elements.

"At least one" means one or more; for example, at least one element may be any integer number of elements equal to or greater than one, such as one element, two elements, or three elements. "At least two" means two or more; for example, at least two elements may be any integer number of elements equal to or greater than two, such as two elements or three elements.

To facilitate understanding of the technical processes of the embodiments of the present application, abbreviations or key terms that may be mentioned in the embodiments of the present application are introduced below.

1. SLG (Simulation Game): referred to herein as a war strategy game, a battle strategy simulation game based on a virtual world. The functionality provided by this type of game includes, but is not limited to: displaying a virtual map, creating a virtual queue (such as a marching army), creating a game league, displaying a marching route, and the like. In some embodiments of a war strategy game, a player continuously extends his or her own territory in the game world, such as virtual land or a virtual city, by acquiring virtual resources, cultivating virtual characters and virtual queues, and forming a game league with other players.

2. Game scene: a scene arranged by the game engine using various resources (such as art model resources). Put another way, a game scene is the virtual world displayed (or provided) by a game engine running on a terminal. In some embodiments, the virtual world includes the sky, land, ocean, and so on, with the land containing environmental elements such as deserts and cities.

3. Land parcel: a unit map element constituting the virtual map, such as a hexagonal parcel or a square parcel.

4. Virtual camera: used by the terminal to observe the game scene; the game scene that a player sees through the terminal display screen is captured by the virtual camera.

5. Virtual map: in a war strategy game, an image that depicts the natural geography, administrative regions, and social and economic conditions of the virtual world at a certain scale using symbols, colors, character labels, and the like. For example, a virtual building in the virtual world is displayed in the virtual map, where the virtual building is created according to a player's operations or provided by default game settings. That is, the virtual map is the main player interaction scene in a war strategy game. The virtual map includes elements such as landforms, player or neutral buildings, and virtual queues. In some embodiments, the virtual map may also be referred to as a large map, a virtual large map, a game large map, and the like; the name of the virtual map is not limited in this embodiment.

6. Virtual queue: a group composed of at least one virtual character in a war strategy game. Virtual queues are used for marching purposes and are therefore also referred to as marching queues. In some embodiments, marching purposes include, but are not limited to: moving, occupying, exploring, assisting, garrisoning, and the like. Virtual characters in a virtual queue include, but are not limited to: virtual characters controlled by players, and virtual characters provided by the war strategy game, for example, NPCs (Non-Player Characters).

7. Game league: a league of virtual characters controlled by at least one player in a war strategy game. In some embodiments, a game league is used to assist a player in completing tasks in the game, such as helping the player achieve a marching purpose or create a virtual building.

8. Marching: in a war strategy game, after a player selects a destination and assigns a virtual queue, the virtual queue marches to the destination along a marching path calculated by the server, and then completes subsequent behaviors such as attacking or garrisoning. The marching route represents the route of a virtual queue composed of at least one virtual character, and consists of a starting position, a target position, and a marching path. In some embodiments, the marching route may be a straight line connecting the starting position and the target position, or a curve connecting them; this is not limited in the embodiments of the present application.
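As a hypothetical illustration of the straight-line case, a marching route can be sampled by linear interpolation between the starting position and the target position. The function name and step count below are invented for this sketch and do not come from the application:

```python
def march_route(start, target, steps=4):
    """Sample points along a straight marching route from `start` to
    `target`, both given as (x, y) map coordinates.

    Returns steps + 1 evenly spaced waypoints, including both endpoints.
    """
    (x0, y0), (x1, y1) = start, target
    return [(x0 + (x1 - x0) * i / steps, y0 + (y1 - y0) * i / steps)
            for i in range(steps + 1)]
```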

The following describes an implementation environment related to the virtual map display method provided by the embodiment of the present application.

Fig. 1 is a schematic diagram of an implementation environment of a virtual map display method provided in an embodiment of the present application, and referring to fig. 1, the implementation environment includes: a terminal 101 and a server 102.

The terminal 101 has a target application installed and running that supports virtual world display; in some embodiments, the target application is an SLG. The terminal 101 is a terminal used by a user. The server 102 provides background services for target applications supporting virtual map display.

In some embodiments, the target application may be a stand-alone application. The terminal 101 may log in to the target application based on account information input by the user, and interaction between the user and the terminal 101 is realized through the target application. In addition, the target application may also be a sub-application running within another application; for example, the sub-application may be an applet. This is not specifically limited in the embodiments of the present application.

In some embodiments, the server 102 may be an independent physical server, a server cluster or distributed system formed by multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, a CDN (Content Delivery Network), and big data and artificial intelligence platforms. The terminal 101 may be, but is not limited to, a smartphone, tablet computer, notebook computer, desktop computer, smart speaker, smart watch, and the like. The terminal 101 and the server 102 may be directly or indirectly connected through wired or wireless communication, which is not limited in the present application.

In some embodiments, the server 102 undertakes the primary computing work and the terminal 101 the secondary computing work; or the server 102 undertakes the secondary computing work and the terminal 101 the primary computing work; or the server 102 and the terminal 101 perform cooperative computing using a distributed computing architecture. In addition, those skilled in the art will appreciate that the number of terminals may be greater or smaller: there may be only one terminal, or tens, hundreds, or more. The number of terminals is not limited in the embodiments of the application.

In some embodiments, the virtual map display method provided by the embodiments of the present application is applied to a game scene, such as a game scene provided by a war strategy game.

In the related art, it is common to present a virtual map of planar structure to the player. On the one hand, a planar virtual map has a poor visual effect; on the other hand, it greatly restricts the player's game strategy, resulting in a poor interactive effect and low interactive efficiency. Taking game strategy as an example: because the planar virtual map is tiled, its left and right edges are not connected, nor are its upper and lower edges, which greatly limits the player's gameplay. For example, if a player located near one edge of the map tries to attack an area near the opposite edge, the virtual queue under the player's control must march through the central area of the planar virtual map.

Therefore, the embodiment of the application provides a new virtual map display method. The following embodiments describe the virtual map display method provided in the embodiments of the present application in detail.

Fig. 2 is a flowchart of a virtual map display method according to an embodiment of the present application. The method is applied to the implementation environment described above, and is described with the terminal as the execution subject in a game scene. Referring to fig. 2, in some embodiments, the method flow includes the following steps.

201. The terminal displays a planar virtual map of the virtual world on the graphical user interface.

Currently, a map (referred to herein as a virtual map) is typically provided in most game scenes. Generally, important places, route information, and the like in the virtual world are marked in the virtual map, so a player can quickly learn the current position of the virtual character under his or her control and the route leading to a destination.

The planar virtual map is an image that represents the virtual world on a plane in two-dimensional form.

In some embodiments, the terminal has a target application installed and running that supports virtual map display. When the target application runs, a partial area of the planar virtual map is displayed in the graphical user interface of the terminal. Optionally, the user adjusts which area of the virtual map is displayed by performing a map panning operation on the graphical user interface. The map panning operation includes, but is not limited to, a sliding operation or a dragging operation.

202. In response to detecting a map mode switching operation, the terminal switches the planar virtual map to a stereoscopic virtual map of the virtual world, wherein the stereoscopic virtual map is displayed in the form of a spatial geometric solid and is formed by splicing, in three-dimensional space, the map elements constituting the planar virtual map.

The stereoscopic virtual map is an image that represents the virtual world in three-dimensional form in three-dimensional space.

In some embodiments, depending on design requirements, the types of spatial geometric solids include, but are not limited to: cylinders, cubes, spheres, ellipsoids, tori, tetrahedra, and the like.

In other embodiments, the stereoscopic virtual map corresponds to a default scale, and the map mode switching operation triggers switching from the planar virtual map to the stereoscopic virtual map at that default scale. After the terminal presents the stereoscopic virtual map, the player can adjust the scale of the stereoscopic virtual map by performing a map zooming operation, so as to present stereoscopic virtual maps of different sizes.

In other embodiments, in a stereoscopic virtual map scene, different portions of the stereoscopic virtual map are displayed in response to detecting a slide operation on the stereoscopic virtual map. That is, the player can view different portions of the stereoscopic virtual map by performing a sliding operation to achieve quick browsing of the map. Optionally, the spatial geometry presents the player with different map portions in a rotating manner.

Taking the spatial geometry as a cylinder as an example, fig. 3 shows a switching from the planar virtual map 30 to the cylindrical virtual map 31. In the embodiment of the present application, the switching of the map mode is triggered by a map mode switching operation. In some embodiments, the map mode switching operation includes, but is not limited to: gesture operation, touch operation on a specific control or a specific graphic, which is not specifically limited herein.

It should be noted that the stereoscopic virtual map and the planar virtual map do not differ in content, but differ in the following respects.

One difference is that the stereoscopic virtual map is displayed in a spatial geometric form and has a stereoscopic effect; that is, the embodiment of the present application presents, in a three-dimensional space, the large game map that was originally displayed on a plane. This is visually more realistic, conforms to a player's everyday understanding of planets in the real world, creates a planet-like visual experience, and has a better visual impression.

The second difference is that the edge zones of the planar virtual map in the related art are not connected, which greatly limits the player's game play. For example, referring to FIG. 4, a player at the edge of the map can only control the virtual queue to march to the middle area before further interaction can be achieved. For example, if a player wants to march from an area A at the edge of the map to another area B at the edge of the map, the virtual queue must be controlled to march along a particular route, for example through the middle area C.

The stereoscopic virtual map is a large game map spliced into a spatial geometric shape; that is, the game map originally displayed on a plane is wrapped into a spatial geometry. The spatial geometric shape makes the stereoscopic virtual map connected, which avoids the problem in the related art that, because the planar virtual map is physically tiled, the left and right edges of the map are not connected, and neither are the upper and lower edges. Based on a large game map in the shape of a spatial geometry, a player at a given position can start marching to a destination from different directions; the player's formulation of game strategy is not overly restricted, which enriches game play.

Taking the spatial geometry as a cylinder as an example, referring to fig. 5: because the edge zones of the planar virtual map 50 are not connected, the planar virtual map 50 does not support players at the map edge marching in the directions shown in the drawing; players at the map edge generally need to interact through the middle area shown in the planar virtual map 50, that is, they can achieve further interaction only after controlling the virtual queue to march to the middle area. Fig. 5 also shows a cylindrical virtual map 51. After the cylindrical virtual map 51 is adopted, the relative positions of the two players shown in the planar virtual map 50 are as shown in the cylindrical virtual map 51: each player can initiate marching from multiple directions, and the game play is rich rather than limited to a fixed marching route.
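Because the left and right map edges meet on the cylinder, the shorter of the two horizontal routes between two positions may cross the seam. A minimal sketch of that direction choice, assuming the map is addressed by column index (the function name and the column addressing are illustrative, not part of the embodiment):

```python
def shortest_march_direction(col_a, col_b, map_width):
    """Return (steps, direction) for the shorter of the two routes
    around a horizontally connected (cylindrical) map."""
    direct = (col_b - col_a) % map_width   # marching east
    wrapped = (col_a - col_b) % map_width  # marching west, across the seam
    if direct <= wrapped:
        return direct, "east"
    return wrapped, "west"

# On a 100-column cylindrical map, a player at column 5 reaching column 95
# marches 10 columns west across the seam instead of 90 columns east.
print(shortest_march_direction(5, 95, 100))  # → (10, 'west')
```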

In other embodiments, the embodiments of the present application support switching the map mode in the following ways. That is, the above-described step 202 "the terminal switches the planar virtual map to the stereoscopic virtual map of the virtual world in response to detection of the map mode switching operation" includes the following steps.

2021. A first control is displayed on a graphical user interface of the terminal and used for switching a map mode; and responding to the detection of the touch operation of the first control, and switching the plane virtual map into the three-dimensional virtual map by the terminal.

In some embodiments, the planar virtual map is displayed in full screen on a graphical user interface of the terminal; or, the planar virtual map is displayed on the graphical user interface in a non-full screen form.

Optionally, the first control is in the form of a virtual key, and the first control is displayed at an edge position of the graphical user interface, such as the lower left corner, the lower right corner, or the upper right corner of the graphical user interface. In addition, the touch operation on the first control may be a click operation or a long-press operation on the first control, which is also not specifically limited in the embodiment of the present application.

2022. A map thumbnail is displayed on a graphical user interface of the terminal, and the map thumbnail is used for switching map modes; and responding to the detected touch operation on the map thumbnail, and switching the plane virtual map into the three-dimensional virtual map.

In some embodiments, referring to fig. 6, the map thumbnail 501 is a scaled-down version of the planar virtual map 50. The player touches the map thumbnail 501 to trigger the terminal to switch from the planar virtual map 50 to the stereoscopic virtual map 51.

Optionally, the map thumbnail is displayed at an edge position of the graphical user interface, such as the lower left corner, the lower right corner, or the upper right corner of the graphical user interface, which is not specifically limited in this embodiment of the present application. In addition, the touch operation on the map thumbnail may be a click operation or a long-press operation on the map thumbnail, and the embodiment of the present application also does not specifically limit this.

2023. And in response to detecting the target gesture operation on the plane virtual map, switching the plane virtual map into the stereoscopic virtual map.

The embodiment of the application also supports that the user triggers the map mode switching through target gesture operation. In some embodiments, referring to fig. 7, the target gesture operation may be a finger sliding operation performed by the player on a display screen of the terminal while the terminal is displaying the planar virtual map. For example, the finger sliding operation may be a sliding operation in which two fingers gradually move away from each other, or a sliding operation in which two fingers gradually move closer to each other, which is not limited in this embodiment of the present application.

In other embodiments, the present application further comprises the following steps.

203. In the case that the planar virtual map has been shown to a first map edge, in response to detecting a map translation operation pointing from the first map edge to a second map edge, the terminal translates and displays the planar virtual map on the graphical user interface; wherein the first map edge is opposite in position to the second map edge, and the translated planar virtual map shows the peripheral area of the second map edge.

Without switching the map mode, if the planar virtual map is already shown to the first map edge, the embodiment of the application supports the player to continue to pan the planar virtual map, wherein the panning operation points to the second map edge from the first map edge to show the area near the second map edge.

Taking the first map edge as the left map edge 80 shown in fig. 8 as an example, if the planar virtual map has already been shown to the left map edge, the embodiment of the present application supports the player in continuing to translate the planar virtual map through a rightward translation operation; that is, the second map edge is the right map edge. At this time, the main base 81 of a certain player is displayed at the left map edge. As shown in fig. 8, the translated planar virtual map includes the main base 82 of another player located near the right map edge. That is, when a player is at map edge A, the player can continue browsing the map through a map translation operation, for example to view players located near map edge B, where map edge A is opposite in position to map edge B.
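The wrap-around panning described in step 203 amounts to advancing the horizontal scroll offset modulo the map width; a minimal sketch under that assumption (the names are illustrative):

```python
def pan_map(offset_x, delta_x, map_width):
    """Advance the horizontal scroll offset, wrapping past the map edge
    so the area near the opposite edge comes into view."""
    return (offset_x + delta_x) % map_width

# Panning 30 units right from offset 1000 on a 1024-unit-wide map
# wraps to offset 6, i.e. the region just inside the opposite edge.
print(pan_map(1000, 30, 1024))  # → 6
```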

In other embodiments, besides the player can initiate the marching instructions on the planar virtual map, the player can also be functionally supported to initiate the marching instructions on the stereoscopic virtual map. Namely, the embodiment of the present application further includes the following steps.

204. In response to detecting a position clicking operation on the stereoscopic virtual map, the terminal displays a position mark at the selected target position and displays at least one function control on the graphical user interface, each function control being used for indicating a marching type of the virtual queue; and in response to a touch operation on any one function control, marching information matching the marching type indicated by the selected function control is displayed on the stereoscopic virtual map.

Optionally, the position clicking operation may be a click operation performed by the player on the stereoscopic virtual map; that is, whichever position on the stereoscopic virtual map the player clicks is marked and taken as the target position.

As shown in fig. 9, the position mark 90 is in the form of a bubble, and besides, the position mark 90 may be in other forms, such as a flag form or a text form, which is not particularly limited in the embodiment of the present application.

Optionally, the at least one function control is uniformly displayed on a function panel, and the function panel is displayed on the stereoscopic virtual map in a top-level display manner.

Optionally, one marching type corresponds to one marching purpose, where the marching purpose includes, but is not limited to, movement, occupation, exploration, and the like. Correspondingly, one marching type corresponds to one function control; fig. 9 shows a function control 91 indicating the virtual queue to move, a function control 92 indicating the virtual queue to occupy, and a function control 93 indicating the virtual queue to explore.

205. The terminal displays position marks on the stereoscopic virtual map, wherein one position mark is used for marking one selected target position on the stereoscopic virtual map; in response to detecting a selection operation on any one of the position marks, at least one function control is displayed on the graphical user interface, each function control being used for indicating a marching type of the virtual queue; and in response to a touch operation on any one function control, marching information matching the marching type indicated by the selected function control is displayed on the stereoscopic virtual map.

In this manner, the embodiment of the present application supports the player in marking target positions in advance; optionally, the player may mark a plurality of target positions on the stereoscopic virtual map in advance. One position mark is displayed at each target position so as to prompt the player. Among the displayed position marks, the player can select the target position indicated by any one position mark, and the selection operation triggers the terminal to display at least one function control so that the player can determine the marching type.

In some embodiments, taking as an example that the player selects the march type indicated by the function control 91 in fig. 9, the terminal displays a start position list and a virtual queue list on the graphical user interface in response to a touch operation on the function control 91; the starting position list is used for showing at least one starting position which can be currently used by a player, and the virtual queue list is used for showing at least one virtual queue which can be currently used by the player. And displaying the marching route of the selected virtual queue on the stereoscopic virtual map based on the selected starting position and the selected target position.

In some embodiments, the start position list and the virtual queue list are displayed on the stereoscopic virtual map in a top-level display manner. As shown in fig. 10, two start positions that are currently available are shown in the start position list, which are city 1 and city 2, respectively; three virtual queues that are currently available are shown in the virtual queue list, queue 1, queue 2, and queue 3. The player can select from different starting positions and virtual queues, according to the starting position and the target position selected by the player, the terminal (or the server) automatically completes route searching, calculates the marching route of the selected virtual queue, and presents the planned marching route for the player on the three-dimensional virtual map. Therein, a marching route 10 is shown in fig. 10. After the player confirms that the marching route is correct, the player can initiate marching by touching the confirmation control 11 in fig. 10.
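The automatic route search is not spelled out in the embodiment; one plausible sketch is a breadth-first search on a grid whose left and right columns are joined, mirroring the connected edges of the stereoscopic map (the grid representation, `march_route`, and the `blocked` parameter are assumptions for illustration):

```python
from collections import deque

def march_route(start, goal, n_cols, n_rows, blocked=frozenset()):
    """Breadth-first search for a shortest route on a grid whose left and
    right edges are joined (cylindrical map); returns the list of cells
    from start to goal, or None if the goal is unreachable."""
    def neighbors(c, r):
        yield (c + 1) % n_cols, r   # east, wrapping across the seam
        yield (c - 1) % n_cols, r   # west, wrapping across the seam
        if r + 1 < n_rows:
            yield c, r + 1          # south (no vertical wrap in this sketch)
        if r > 0:
            yield c, r - 1          # north
    prev = {start: None}
    queue = deque([start])
    while queue:
        cur = queue.popleft()
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        for nxt in neighbors(*cur):
            if nxt not in prev and nxt not in blocked:
                prev[nxt] = cur
                queue.append(nxt)
    return None

# On a 10-column cylindrical map, the route from column 0 to column 9
# crosses the seam in a single step instead of traversing nine columns.
print(march_route((0, 0), (9, 0), 10, 5))  # → [(0, 0), (9, 0)]
```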

The first point to be described is that after the march is started, the player can switch back to the planar virtual map again, or can stay in the stereoscopic virtual map scene continuously. In some embodiments, the manner of switching from the stereoscopic virtual map to the planar virtual map may be to touch the first control in the step 2021 again, or to execute a gesture operation opposite to the target gesture operation in the step 2023, which is not limited in this embodiment of the application.

The second point to be noted is that, as shown in fig. 11, the interaction mode in the stereoscopic virtual map scene is not different from the interaction mode in the planar virtual map scene, and the player does not increase the learning cost by using the stereoscopic virtual map. In the three-dimensional virtual map scene or the planar virtual map scene, a player needs to select a target position first, then send a virtual queue to the target position according to a planned marching route, and control the virtual queue to march from a starting position to the target position. Alternatively, the virtual queue may be marched to the target location by walking or riding in a virtual vehicle. Exemplary virtual vehicles include, but are not limited to, vehicles, horses, and the like, and this is not particularly limited in the embodiments of the present application.

In other embodiments, after switching from the planar virtual map scene to the stereoscopic virtual map scene, the moving manner of the virtual camera may be changed. Namely, the embodiment of the present application further includes the following steps.

206. In response to detecting the map mode switching operation, the terminal switches the moving mode of the virtual camera in the virtual world from planar movement to orbital movement around the space geometry.

Taking the spatial geometry as a cylinder as an example, in response to detecting a map mode switching operation, the moving mode of the virtual camera is switched from planar movement to orbital movement around the cylinder; taking the space geometry as a sphere as an example, in response to detecting the map mode switching operation, the moving mode of the virtual camera is switched from planar movement to orbital movement around the sphere.

The virtual camera is in orbit movement, and the height of the orbit is variable. Accordingly, the track heights of the virtual cameras are different, and the map content presented by the stereoscopic virtual map is also different. For example, as the track height of the virtual camera is gradually increased, the player can see a map with a larger arc. Fig. 12 shows the field of view of the virtual camera at different track heights. As shown in fig. 12, when the track height of the virtual camera is low, the visual field 121 is small, i.e., the player sees a map with a small arc; when the track height of the virtual camera is high, the field of view 122 is large, i.e., the player sees a map with a large arc.
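The widening field of view can be quantified under a simple tangent-line model (our assumption, not stated in the embodiment): sightlines from a camera at height h above a cylinder of radius r graze the surface, so the visible arc around the axis is 2·arccos(r/(r+h)), which grows toward 180 degrees as the track rises.

```python
import math

def visible_arc_degrees(radius, track_height):
    """Arc of the cylinder surface (in degrees) visible to a camera
    orbiting at `track_height` above the surface; sightlines are tangent
    to the cylinder, so the half-angle is acos(r / (r + h))."""
    return 2 * math.degrees(math.acos(radius / (radius + track_height)))

# The visible arc widens monotonically as the orbit height increases.
for h in (10, 100, 1000):
    print(round(visible_arc_degrees(100, h), 1))
```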

In some embodiments, in a stereoscopic virtual map scene, the player may gradually adjust the track height at which the virtual camera stays by performing a map zooming operation on the display screen of the terminal. Optionally, the map zooming operation includes the two fingers gradually moving apart and the two fingers gradually moving closer. For example, the two fingers gradually moving apart corresponds to adjusting the track height of the virtual camera lower, and the two fingers gradually moving closer corresponds to adjusting the track height of the virtual camera higher.
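The zoom-to-height mapping can be sketched as a clamped offset (the function name, sign convention, and clamp range are illustrative assumptions):

```python
def adjust_track_height(height, pinch_delta, min_height, max_height):
    """Map a pinch gesture to the virtual camera's track height: a positive
    delta (fingers moving apart) lowers the orbit to zoom in, a negative
    delta (fingers moving closer) raises it, clamped to the allowed range."""
    return max(min_height, min(max_height, height - pinch_delta))

# Pinching out by 40 units from height 100 zooms in to height 60;
# pinching in by 40 zooms out, clamped at the 120-unit ceiling.
print(adjust_track_height(100, 40, 20, 120))   # → 60
print(adjust_track_height(100, -40, 20, 120))  # → 120
```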

By displaying the stereoscopic virtual map, on the one hand, a realistic visual experience can be brought to the player, so that the game environment is immersive and the visual effect is better; on the other hand, the spatial geometric shape makes the stereoscopic virtual map connected, so the problems in the related art that the left and right edges and the upper and lower edges of the physically tiled planar virtual map are not connected do not exist. This makes it convenient for the player to formulate various game strategies, greatly improves the interaction effect, and yields high interaction efficiency.

In detail, from the perspective of the game environment, the large game map spliced into the spatial geometric shape conforms to a player's everyday understanding of planets in the real world, creates a planet-like visual experience, and has a better visual impression. For example, when the virtual camera reaches a certain height, an effect similar to the horizon can be seen, and the visual impression of the stereoscopic virtual map is better than that of a planar virtual map. From a gameplay perspective, the stereoscopic virtual map can provide more game play for the player. For example, in a planar virtual map of the related art, if a player in an edge zone of the map attacks another edge zone of the map, the player generally needs to pass through the middle area of the map; with the stereoscopic virtual map, the player can either march through the middle area or initiate marching along various other routes that do not cross the middle area.

Fig. 13 is a flowchart of another virtual map display method provided in an embodiment of the present application. The method is applied in the foregoing implementation environment; the virtual map display method is described with the terminal as the execution subject. Referring to fig. 13, in some embodiments, the method flow includes the following steps.

1301. Spatial position coordinates of respective map elements constituting the planar virtual map are acquired.

In some embodiments, the map elements required to construct the planar virtual map are identical to the map elements required to construct the stereoscopic virtual map. In other words, the three-dimensional virtual map is formed by splicing the map elements forming the planar virtual map in a three-dimensional space. Taking the example that the three-dimensional virtual map is displayed in a cylindrical shape, the three-dimensional virtual map is formed by splicing map elements forming the planar virtual map in a cylindrical space.

Optionally, the map elements are plots, that is, both the planar virtual map and the stereoscopic virtual map are formed by splicing the plots.

Optionally, the spatial position coordinates of each map element may define a relative position relationship of each map element in the three-dimensional space, so as to facilitate subsequent parcel stitching according to the position relationship. In addition, the terminal is installed and operated with a target application program supporting virtual map display, and the target application program may preset spatial position coordinates of each map element, which is not specifically limited in this embodiment of the present application.

1302. And determining a reference position, and acquiring the rotation angle of each map element relative to the reference position.

In the embodiment of the present application, since the stereoscopic virtual map is spliced, in addition to the spatial position coordinates of each map element, it is also necessary to determine the rotation angle of each map element at its respective position relative to the reference position, so as to splice the stereoscopic virtual map with a spatial geometry.

Optionally, the reference position may be a zero-longitude position of the space geometry, may also be a zero-latitude position of the space geometry, and may also be a position where both longitude and latitude of the space geometry are zero, which is not specifically limited in this embodiment of the present application. Accordingly, determining a reference position, and acquiring a rotation angle of each map element relative to the reference position, comprises at least one of the following:

1302-1, determining a zero longitude position as a reference position, and acquiring the longitude of each map element relative to the reference position; the rotation angle of each map element at the respective position is determined based on the longitude of each map element relative to the reference position.

The longitude of each map element relative to the zero longitude position is obtained, and then the rotation angle of each map element at each position is determined according to the reference position and the longitude of each map element. For example, the difference between the longitude of the reference position and the longitude of each map element is used as the rotation angle of each map element at each position. For example, assuming that the map element in the zero longitude position is not rotated, the map element located on the east 45 degree meridian would need to be rotated 45 degrees in its position to achieve more accurate stitching.

1302-2, determining the zero latitude position as a reference position, and acquiring the latitude of each map element relative to the reference position; and determining the rotation angle of each map element at each position according to the latitude of each map element relative to the reference position.

The latitude of each map element relative to the zero-latitude position is obtained, and then the rotation angle of each map element at its position is determined according to the latitude of each map element. For example, the difference between the latitude of the reference position and the latitude of each map element is used as the rotation angle of each map element at its position. For example, assuming that the map element at the zero-latitude position is not rotated, the map element on the 45-degree parallel of south latitude needs to be rotated 45 degrees at its position to achieve more accurate stitching.

1303. And splicing the map elements according to the types of the map elements forming the plane virtual map, and the spatial position coordinates and the rotation angles of all the map elements to obtain the three-dimensional virtual map.

As shown in fig. 14, the map element types include, but are not limited to, regular quadrilateral parcels and regular hexagonal parcels. After the spatial position coordinates and rotation angles of the map elements are obtained, the map splicing is completed according to the types of the map elements; different types of map elements are spliced in different ways. Fig. 14 shows the two different splicing approaches for the two types of map elements.
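For square parcels wrapped onto a cylinder, steps 1301 to 1303 can be sketched together: each column of the planar map corresponds to an angle about the axis, which yields both the spatial position coordinates and the rotation of every parcel relative to the zero-longitude column (the function and its layout parameters are illustrative assumptions):

```python
import math

def place_tiles_on_cylinder(n_cols, n_rows, tile_width, tile_height):
    """For each square parcel of the planar map, compute its position on the
    cylinder surface and its rotation about the cylinder axis, taking the
    parcel column at angle 0 as the unrotated reference."""
    radius = (tile_width * n_cols) / (2 * math.pi)  # cross-section perimeter = 2*pi*r
    placements = []
    for row in range(n_rows):
        for col in range(n_cols):
            angle = 2 * math.pi * col / n_cols      # rotation relative to column 0
            x = radius * math.cos(angle)            # position on the circular cross-section
            z = radius * math.sin(angle)
            y = row * tile_height                   # height along the cylinder axis
            placements.append(((x, y, z), math.degrees(angle)))
    return placements

# Column 0 sits at angle 0 (unrotated); the column one quarter of the way
# around the cylinder is rotated 90 degrees, matching the meridian rule above.
placements = place_tiles_on_cylinder(8, 2, 10.0, 10.0)
print(round(placements[2][1]))  # → 90
```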

In some embodiments, assuming that the zero-longitude position is the left edge of the map or the right edge of the map, after the map elements are spliced according to the step 1303, a stereoscopic virtual map with a cylindrical shape is formed, and the stereoscopic virtual map realizes that the left edge of the map and the right edge of the map which are not adjacent to each other in the plane map scene are spliced together.

In other embodiments, assuming that the zero-latitude position is the upper edge of the map or the lower edge of the map, after the map elements are spliced according to the step 1303, a stereoscopic virtual map with a cylindrical shape is formed, and the stereoscopic virtual map realizes that the upper edge of the map and the lower edge of the map which are not adjacent to each other in the planar map scene are spliced together.

1304. Acquiring the radius of the space geometric body according to the width of a single map element forming the plane virtual map and the total number of the map elements forming the plane virtual map; and determining the track height of the virtual camera in the virtual world for track movement according to the radius of the space geometry and the target offset.

Optionally, step 1304 may be performed after step 1303, or before step 1303, for example, before step 1301, or before step 1302, which is not specifically limited in this embodiment of the present application.

In some embodiments, taking the spatial geometry as a cylinder as an example: first, the number of map elements in a cross-section of the cylinder is calculated according to the total number of map elements and the height of the cylinder; the cross-sectional perimeter equals the width of a unit map element multiplied by the number of cross-sectional map elements, and combining this with the circle perimeter formula, perimeter = 2πr, yields the radius r of the cylinder.
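The radius derivation above can be written out directly (the assumption that the map is laid out in equal rows is ours, for illustration):

```python
import math

def cylinder_radius(tile_width, total_tiles, rows):
    """Radius implied by wrapping the planar map into a cylinder:
    tiles per cross-section = total / rows, and the cross-sectional
    circumference (tiles_per_row * tile_width) equals 2*pi*r."""
    tiles_per_row = total_tiles // rows
    return (tiles_per_row * tile_width) / (2 * math.pi)

# A 360-tile map arranged in 10 rows of 36 tiles, each 10 units wide,
# wraps into a cylinder whose circumference is 360 units.
r = cylinder_radius(10.0, 360, 10)
print(round(2 * math.pi * r))  # → 360
```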

Fig. 15 shows the moving path and the track height of the virtual camera. It should be noted that, the moving path of the virtual camera may change correspondingly when the shape of the space geometry is different. Fig. 15 illustrates the movement path of the virtual camera by taking the spatial geometry as a cylinder as an example. The target offset amount may be set in advance by the target application program, and this embodiment of the present application is not particularly limited thereto.

In another embodiment, referring to fig. 16, taking longitude as an example: on the spatial geometry, the portion from the 0-degree meridian westward to the 180-degree meridian is west longitude, and the portion from the 0-degree meridian eastward to the 180-degree meridian is east longitude; the east 180-degree meridian and the west 180-degree meridian are the same meridian. That is, both east longitude and west longitude range between 0 degrees and 180 degrees. Accordingly, whenever the longitude of a map element exceeds 0 degrees or 180 degrees, an east-west longitude conversion is needed: for example, if the longitude of a map element is east longitude 190 degrees, it should be converted to west longitude 170 degrees; for another example, if the longitude of a map element is east longitude -20 degrees, it should be converted to west longitude 20 degrees.
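This east-west conversion is equivalent to folding an arbitrary angle into a signed value within ±180 degrees; a minimal sketch:

```python
def normalize_longitude(deg):
    """Fold an arbitrary longitude into the 0-180 degree east/west
    convention described above: east 190 becomes west 170, east -20
    becomes west 20 (east and west 180 name the same meridian)."""
    wrapped = ((deg + 180.0) % 360.0) - 180.0  # signed value in [-180, 180)
    hemisphere = "east" if wrapped >= 0 else "west"
    return abs(wrapped), hemisphere

print(normalize_longitude(190.0))  # → (170.0, 'west')
print(normalize_longitude(-20.0))  # → (20.0, 'west')
```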

Through the above map splicing method, the embodiment of the present application can obtain an accurate stereoscopic virtual map. Because the stereoscopic virtual map has a certain curvature, when the camera is far away, the player can see curvature similar to that of a real planet, and can even see a horizon effect, so the visual experience is more realistic.

By displaying the stereoscopic virtual map, on the one hand, a realistic visual experience can be brought to the player, so that the game environment is immersive and the visual effect is better; on the other hand, the spatial geometric shape makes the stereoscopic virtual map connected, so the problems in the related art that the left and right edges and the upper and lower edges of the physically tiled planar virtual map are not connected do not exist. This makes it convenient for the player to formulate various game strategies, greatly improves the interaction effect, and yields high interaction efficiency.

In detail, from the perspective of the game environment, the large game map spliced into the spatial geometric shape conforms to a player's everyday understanding of planets in the real world, creates a planet-like visual experience, and has a better visual impression. For example, when the virtual camera reaches a certain height, an effect similar to the horizon can be seen, and the visual impression of the stereoscopic virtual map is better than that of a planar virtual map. From a gameplay perspective, the stereoscopic virtual map can provide more game play for the player. For example, in a planar virtual map of the related art, if a player in an edge zone of the map attacks another edge zone of the map, the player generally needs to pass through the middle area of the map; with the stereoscopic virtual map, the player can either march through the middle area or initiate marching along various other routes that do not cross the middle area.

Fig. 17 is a block diagram of a virtual map display apparatus according to an embodiment of the present application. Referring to fig. 17, the apparatus includes:

a first display module 1701 configured to display a planar virtual map of a virtual world on a graphic user interface;

a second display module 1702 configured to switch the planar virtual map to a stereoscopic virtual map of the virtual world in response to detecting a map mode switching operation;

the three-dimensional virtual map is displayed in a space geometric form, and map elements forming the plane virtual map are spliced in a three-dimensional space to form the three-dimensional virtual map.

By displaying the stereoscopic virtual map, on the one hand, a realistic visual experience can be brought to the player, so that the game environment is immersive and the visual effect is better; on the other hand, the spatial geometric shape makes the stereoscopic virtual map connected, so the problems in the related art that the left and right edges and the upper and lower edges of the physically tiled planar virtual map are not connected do not exist. This makes it convenient for the player to formulate various game strategies, greatly improves the interaction effect, and yields high interaction efficiency.

In detail, from the perspective of the game environment, the large game map spliced into the spatial geometric shape conforms to a player's everyday understanding of planets in the real world, creates a planet-like visual experience, and has a better visual impression. For example, when the virtual camera reaches a certain height, an effect similar to the horizon can be seen, and the visual impression of the stereoscopic virtual map is better than that of a planar virtual map. From a gameplay perspective, the stereoscopic virtual map can provide more game play for the player. For example, in a planar virtual map of the related art, if a player in an edge zone of the map attacks another edge zone of the map, the player generally needs to pass through the middle area of the map; with the stereoscopic virtual map, the player can either march through the middle area or initiate marching along various other routes that do not cross the middle area.

In some embodiments, a first control is displayed on the graphical user interface, and the first control is used for switching map modes; the second display module configured to:

in response to detecting a touch operation on the first control, switch the planar virtual map to the stereoscopic virtual map.

In some embodiments, a map thumbnail is displayed on the graphical user interface, and the map thumbnail is used for switching map modes; the second display module configured to:

in response to detecting a touch operation on the map thumbnail, switch the planar virtual map to the stereoscopic virtual map.

In some embodiments, the second display module is configured to:

in response to detecting a target gesture operation on the planar virtual map, switch the planar virtual map to the stereoscopic virtual map.

In some embodiments, the first display module is further configured to:

in response to detecting a map panning operation pointing from a first map edge toward a second map edge, pan the planar virtual map on the graphical user interface if the planar virtual map has already been displayed up to the first map edge;

wherein the first map edge is located opposite the second map edge, and the panned planar virtual map displays the peripheral area of the second map edge.

In some embodiments, the apparatus further comprises:

a third display module configured to, in response to detecting a position-clicking operation on the stereoscopic virtual map, display a position marker at the selected target position and display at least one function control on the graphical user interface; one function control is used to indicate one marching type of a virtual queue;

the second display module is configured to, in response to a touch operation on any one function control, display, on the stereoscopic virtual map, marching information matching the marching type indicated by the selected function control.

In some embodiments, the third display module is further configured to display position markers on the stereoscopic virtual map, wherein each position marker marks one selected target position on the stereoscopic virtual map; and, in response to detecting a selection operation on any one position marker, display at least one function control on the graphical user interface; one function control is used to indicate one marching type of a virtual queue;

the second display module is configured to, in response to a touch operation on any one function control, display, on the stereoscopic virtual map, marching information matching the marching type indicated by the selected function control.

In some embodiments, the third display module is further configured to display a starting position list and a virtual queue list on the graphical user interface in response to a touch operation on a second control of the at least one function control; the starting position list shows at least one currently available starting position, and the virtual queue list shows at least one currently available virtual queue;

the second display module is configured to display the marching route of the selected virtual queue on the stereoscopic virtual map based on the selected starting position and the selected target position.
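As an illustrative sketch (not part of the claimed method), a marching route between a selected starting position and a selected target position on a spherical map can be sampled along the great circle joining them. The function names below, and the representation of positions as (longitude, latitude) pairs, are assumptions for illustration only:

```python
import math

def lonlat_to_unit(lon_deg: float, lat_deg: float):
    """Convert a (longitude, latitude) position to a unit vector on the sphere."""
    lon, lat = math.radians(lon_deg), math.radians(lat_deg)
    return (math.cos(lat) * math.cos(lon),
            math.cos(lat) * math.sin(lon),
            math.sin(lat))

def slerp(p0, p1, t: float):
    """Spherical linear interpolation between two unit vectors."""
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(p0, p1))))
    omega = math.acos(dot)
    if omega < 1e-9:          # start and target coincide
        return p0
    s0 = math.sin((1.0 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return tuple(s0 * a + s1 * b for a, b in zip(p0, p1))

def march_route(start, target, samples: int = 16):
    """Sample points of a great-circle marching route between two (lon, lat) positions."""
    p0, p1 = lonlat_to_unit(*start), lonlat_to_unit(*target)
    return [slerp(p0, p1, i / (samples - 1)) for i in range(samples)]
```

The sampled points can then be scaled by the sphere radius and rendered as the route polyline on the stereoscopic virtual map.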

In some embodiments, the apparatus further comprises:

an acquisition module configured to acquire spatial position coordinates of respective map elements constituting the planar virtual map;

a first processing module configured to determine a reference position, and acquire a rotation angle of each map element relative to the reference position;

a second processing module configured to splice the map elements according to the types of the map elements constituting the planar virtual map, as well as the spatial position coordinates and rotation angles of the map elements, to obtain the stereoscopic virtual map.

In some embodiments, the first processing module is configured to perform at least one of:

determining a zero-longitude position as the reference position, acquiring the longitude of each map element relative to the reference position, and determining the rotation angle of each map element at its position according to that longitude;

determining a zero-latitude position as the reference position, acquiring the latitude of each map element relative to the reference position, and determining the rotation angle of each map element at its position according to that latitude.
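The placement described above can be sketched as follows. This is a minimal, hypothetical illustration (the function name `place_element` and the choice of a yaw/pitch rotation pair are assumptions, not part of the disclosure): each element's rotation relative to the zero-longitude / zero-latitude reference is simply its angular offset, which keeps the element's face tangent to the sphere surface:

```python
import math

def place_element(lon_deg: float, lat_deg: float, radius: float):
    """Spatial position and rotation of one map element on the spherical map.

    Position: the point on a sphere of the given radius at the element's
    longitude/latitude. Rotation: yaw about the polar axis equal to the
    longitude offset from the reference, and pitch about the local east
    axis equal to the latitude offset.
    """
    lon, lat = math.radians(lon_deg), math.radians(lat_deg)
    position = (radius * math.cos(lat) * math.cos(lon),
                radius * math.cos(lat) * math.sin(lon),
                radius * math.sin(lat))
    rotation = (lon_deg, lat_deg)  # (yaw, pitch) relative to the reference position
    return position, rotation

# An element a quarter turn around the equator sits at (0, r, 0), yawed 90 degrees:
pos, rot = place_element(90.0, 0.0, 10.0)
```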

In some embodiments, the apparatus further comprises:

a third processing module configured to switch the movement mode of the virtual camera in the virtual world from planar movement to orbital movement around the spatial geometric solid, in response to detecting the map mode switching operation.

In some embodiments, the apparatus further comprises:

a fourth processing module configured to obtain the radius of the spatial geometric solid according to the width of a single map element constituting the planar virtual map and the total number of map elements constituting the planar virtual map;

and to determine, according to the radius of the spatial geometric solid and a target offset, the orbit height at which the virtual camera performs orbital movement.
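By way of a non-limiting sketch of the computation above, and under the added assumption (not stated in the disclosure) that the planar map is a square grid of tiles, so that one equatorial ring holds the square root of the total element count: the equator's circumference is the element width times that per-ring count, the radius follows from the circumference, and the orbit height is the radius plus the target offset:

```python
import math

def orbit_height(element_width: float, total_elements: int, target_offset: float) -> float:
    """Orbit height of the virtual camera around the spherical map.

    Assumption for illustration: a square tile grid, so one equatorial ring
    holds sqrt(total_elements) tiles. Then:
        circumference = element_width * tiles_per_ring
        radius        = circumference / (2 * pi)
        orbit height  = radius + target_offset
    """
    tiles_per_ring = math.isqrt(total_elements)        # assumed tiles per equatorial ring
    circumference = element_width * tiles_per_ring
    radius = circumference / (2.0 * math.pi)
    return radius + target_offset

# 64 tiles of width 1.0 -> 8 tiles around the equator -> radius 8 / (2*pi),
# plus a target offset of 2.0 above the surface:
height = orbit_height(1.0, 64, 2.0)
```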

All the above optional technical solutions may be combined arbitrarily to form the optional embodiments of the present disclosure, and are not described herein again.

It should be noted that: in the virtual map display device provided in the above embodiment, when displaying a virtual map, only the division of the above functional modules is taken as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules to complete all or part of the above described functions. In addition, the virtual map display apparatus and the virtual map display method provided by the above embodiments belong to the same concept, and specific implementation processes thereof are detailed in the method embodiments and are not described herein again.

Fig. 18 shows a block diagram of a computer device 1800 provided in an exemplary embodiment of the present application. The computer device 1800 may be a portable mobile terminal, such as a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The computer device 1800 may also be referred to by other names, such as user equipment, portable terminal, laptop terminal, or desktop terminal.

Generally, computer device 1800 includes: a processor 1801 and a memory 1802.

The processor 1801 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on. The processor 1801 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1801 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1801 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content required to be displayed by the display screen. In some embodiments, the processor 1801 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.

Memory 1802 may include one or more computer-readable storage media, which may be non-transitory. Memory 1802 may also include high speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1802 is used to store at least one program code for execution by processor 1801 to implement the virtual map display method provided by the method embodiments herein.

In some embodiments, computer device 1800 may also optionally include: a peripheral interface 1803 and at least one peripheral. The processor 1801, memory 1802, and peripheral interface 1803 may be connected by a bus or signal line. Each peripheral device may be connected to the peripheral device interface 1803 by a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1804, display 1805, camera assembly 1806, audio circuitry 1807, positioning assembly 1808, and power supply 1809.

The peripheral interface 1803 may be used to connect at least one peripheral associated with I/O (Input/Output) to the processor 1801 and the memory 1802. In some embodiments, the processor 1801, memory 1802, and peripheral interface 1803 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1801, the memory 1802, and the peripheral device interface 1803 may be implemented on separate chips or circuit boards, which is not limited in this embodiment.

The Radio Frequency circuit 1804 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 1804 communicates with communication networks and other communication devices via electromagnetic signals. The rf circuit 1804 converts electrical signals into electromagnetic signals for transmission, or converts received electromagnetic signals into electrical signals. In some embodiments, the radio frequency circuitry 1804 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuitry 1804 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the rf circuit 1804 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.

The display screen 1805 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1805 is a touch display screen, the display screen 1805 also has the ability to capture touch signals on or over the surface of the display screen 1805. The touch signal may be input to the processor 1801 as a control signal for processing. At this point, the display 1805 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display 1805 may be one, disposed on a front panel of the computer device 1800; in other embodiments, the number of the display screens 1805 may be at least two, respectively disposed on different surfaces of the computer device 1800 or in a foldable design; in other embodiments, the display 1805 may be a flexible display disposed on a curved surface or on a folded surface of the computer device 1800. Even more, the display 1805 may be arranged in a non-rectangular irregular figure, i.e. a shaped screen. The Display 1805 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or the like.

The camera assembly 1806 is used to capture images or video. In some embodiments, camera assembly 1806 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1806 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.

The audio circuitry 1807 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1801 for processing or inputting the electric signals to the radio frequency circuit 1804 to achieve voice communication. The microphones may be multiple and placed at different locations on the computer device 1800 for stereo sound capture or noise reduction purposes. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1801 or the radio frequency circuitry 1804 to sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, audio circuitry 1807 may also include a headphone jack.

The positioning component 1808 is used to locate the current geographic location of the computer device 1800 to implement navigation or LBS (Location Based Service). The positioning component 1808 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.

The power supply 1809 is used to supply power to the various components in the computer device 1800. The power supply 1809 may be an alternating current supply, a direct current supply, a disposable battery, or a rechargeable battery. When the power supply 1809 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery charged through a wired line, or a wireless rechargeable battery charged through a wireless coil. The rechargeable battery may also support fast-charge technology.

In some embodiments, computer device 1800 also includes one or more sensors 1810. The one or more sensors 1810 include, but are not limited to: acceleration sensor 1811, gyro sensor 1812, pressure sensor 1813, fingerprint sensor 1814, optical sensor 1815, and proximity sensor 1816.

The acceleration sensor 1811 may detect the magnitude of acceleration in three coordinate axes of a coordinate system established with the computer apparatus 1800. For example, the acceleration sensor 1811 may be used to detect components of gravitational acceleration in three coordinate axes. The processor 1801 may control the display 1805 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1811. The acceleration sensor 1811 may also be used for acquisition of motion data of a game or a user.

The gyro sensor 1812 may detect a body direction and a rotation angle of the computer device 1800, and the gyro sensor 1812 may cooperate with the acceleration sensor 1811 to collect a 3D motion of the user on the computer device 1800. The processor 1801 may implement the following functions according to the data collected by the gyro sensor 1812: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.

Pressure sensors 1813 may be disposed on the side bezel of computer device 1800 and/or underneath display 1805. When the pressure sensor 1813 is disposed on a side frame of the computer device 1800, a user's holding signal to the computer device 1800 can be detected, and the processor 1801 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 1813. When the pressure sensor 1813 is disposed at the lower layer of the display screen 1805, the processor 1801 controls the operability control on the UI interface according to the pressure operation of the user on the display screen 1805. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.

The fingerprint sensor 1814 is used to collect the fingerprint of the user, and the processor 1801 identifies the user according to the fingerprint collected by the fingerprint sensor 1814, or the fingerprint sensor 1814 identifies the user according to the collected fingerprint. Upon recognizing that the user's identity is a trusted identity, the processor 1801 authorizes the user to perform relevant sensitive operations, including unlocking a screen, viewing encrypted information, downloading software, paying, and changing settings, etc. The fingerprint sensor 1814 may be disposed on the front, back, or side of the computer device 1800. When a physical key or vendor Logo is provided on the computer device 1800, the fingerprint sensor 1814 may be integrated with the physical key or vendor Logo.

The optical sensor 1815 is used to collect the ambient light intensity. In one embodiment, the processor 1801 may control the display brightness of the display screen 1805 based on the ambient light intensity collected by the optical sensor 1815. Specifically, when the ambient light intensity is high, the display brightness of the display screen 1805 is increased; when the ambient light intensity is low, the display brightness of the display 1805 is reduced. In another embodiment, the processor 1801 may also dynamically adjust the shooting parameters of the camera assembly 1806 according to the intensity of the ambient light collected by the optical sensor 1815.

A proximity sensor 1816, also known as a distance sensor, is typically provided on the front panel of the computer device 1800. The proximity sensor 1816 is used to collect the distance between the user and the front of the computer device 1800. In one embodiment, when the proximity sensor 1816 detects that the distance between the user and the front of the computer device 1800 gradually decreases, the processor 1801 controls the display 1805 to switch from the bright-screen state to the off-screen state; when the proximity sensor 1816 detects that the distance between the user and the front of the computer device 1800 gradually increases, the processor 1801 controls the display 1805 to switch from the off-screen state to the bright-screen state.

Those skilled in the art will appreciate that the configuration illustrated in FIG. 18 is not intended to be limiting with respect to the computer device 1800 and may include more or fewer components than those illustrated, or some components may be combined, or a different arrangement of components may be employed.

In an exemplary embodiment, there is also provided a computer readable storage medium, such as a memory, including program code executable by a processor in a computer device to perform the virtual map display method in the above embodiments. For example, the computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a Compact Disc Read-Only Memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, and the like.

In an exemplary embodiment, there is also provided a computer program product or a computer program comprising computer program code stored in a computer-readable storage medium, the computer program code being read by a processor of a computer device from the computer-readable storage medium, the processor executing the computer program code to cause the computer device to perform the virtual map display method described above.

It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.

The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.
