Card object-based interaction method and device, computer equipment and storage medium

Document No.: 576945    Publication date: 2021-05-25

Note: This technology, "Card object-based interaction method and device, computer equipment and storage medium" (基于牌类对象的交互方法、装置、计算机设备及存储介质), was designed and created by 文晗, 陈印超, 张雅, 梁皓辉, 林琳, 李熠琦 and 钱杉杉 on 2021-01-22. Its main content is as follows: The embodiments of this application disclose a card object-based interaction method and apparatus, a computer device, and a storage medium, belonging to the field of computer technology. The method includes: displaying an interactive interface; in response to a sliding operation detected in the interactive interface satisfying a dealing condition, performing a dealing operation on the target card objects that the sliding track of the sliding operation passes through in the personal display area, where the dealing condition is that the sliding operation is released in the shared display area after moving from the personal display area to the shared display area; and displaying the target card objects in the shared display area. With the method provided by the embodiments of this application, the user only needs to perform a single sliding operation in the interactive interface to automatically determine the card objects to be dealt and to automatically deal them out, without first selecting the card objects and then clicking a confirmation option, which simplifies the operation and improves operation efficiency.

1. A card object-based interaction method, the method comprising:

displaying an interactive interface, wherein the interactive interface comprises a personal display area and a shared display area, the personal display area is used for displaying card objects held by a first user identifier which is currently logged in, the shared display area is used for displaying card objects issued by a plurality of user identifiers which participate in interaction, and the plurality of user identifiers comprise the first user identifier;

in response to a sliding operation detected in the interactive interface satisfying a dealing condition, performing a dealing operation on a target card class object through which a sliding track of the sliding operation passes in the personal display area, wherein the dealing condition is that the sliding operation is released in the shared display area after moving from the personal display area to the shared display area;

displaying the target card class object in the shared display area.

2. The method of claim 1, wherein the performing, in response to the sliding operation detected in the interactive interface satisfying the dealing condition, a dealing operation on a target card class object through which a sliding track of the sliding operation passes in the personal display area comprises:

in response to detecting the sliding operation in the personal presentation area, determining a sliding trajectory of the sliding operation;

in response to the sliding track intersecting a display area corresponding to any card object in the personal display area, determining that card object as a target card object;

and in response to detecting that the sliding operation is released in the shared display area after moving from the personal display area to the shared display area, executing a dealing operation on the target card class object.

3. The method of claim 1, wherein the performing, in response to the sliding operation detected in the interactive interface satisfying the dealing condition, a dealing operation on a target card class object through which a sliding track of the sliding operation passes in the personal display area comprises:

in response to detecting the sliding operation in the personal display area, highlighting the target card class object when the sliding operation moves out of the personal display area;

in response to detecting that the sliding operation is released within the shared display area, performing a dealing operation on the target card class object.

4. The method of claim 3, wherein there are a plurality of target card objects; when the sliding operation moves out of the personal display area, the distance between any two adjacent target card objects is smaller than the distance between any two adjacent other card objects in the personal display area, and the display positions of the plurality of target card objects are higher than the display positions of the other card objects.

5. The method of claim 1, wherein the performing, in response to the sliding operation detected in the interactive interface satisfying the dealing condition, a dealing operation on a target card class object through which a sliding track of the sliding operation passes in the personal display area comprises:

in response to detecting the sliding operation in the personal display area, determining a target card class object through which a sliding track of the sliding operation passes in the personal display area;

after detecting that the sliding operation moves out of the personal display area, controlling the target card class object to move along with the sliding track until the sliding operation is released, and performing a dealing operation on the target card class object.

6. The method of claim 5, wherein the determining, in response to detecting the sliding operation in the personal display area, a target card class object through which a sliding track of the sliding operation passes in the personal display area comprises:

in response to detecting the sliding operation in the personal display area, setting a state of the target card class object through which the sliding track of the sliding operation passes to a selected state;

and the controlling, after detecting that the sliding operation moves out of the personal display area, the target card class object to move along with the sliding track until the sliding operation is released, and performing a dealing operation on the target card class object comprises:

after detecting that the sliding operation moves out of the personal display area, controlling the target card class object in the selected state to move along with the sliding track;

in response to detecting that the sliding operation is released within the shared display area, performing the dealing operation on the target card class object in the selected state.

7. The method of claim 6, wherein after the controlling the target card class object in the selected state to move along with the sliding track after detecting that the sliding operation moves out of the personal display area, the method further comprises:

in response to detecting that the sliding operation is released within the personal display area, restoring the state of the target card class object to a non-selected state.

8. The method of claim 6, wherein there are a plurality of target card class objects, each card class object containing an indicator; after the performing the dealing operation on the target card class object in the selected state in response to detecting that the sliding operation is released within the shared display area, the method further comprises:

in response to an indicator combination formed by the indicators contained in the plurality of target card objects not satisfying a play condition, restoring the states of the plurality of target card objects to the non-selected state.

9. The method of claim 1, wherein there are a plurality of target card class objects, each card class object containing an indicator; and the displaying the target card class object in the shared display area comprises:

determining, according to an indicator combination formed by the indicators contained in the plurality of target card objects, an indicator arrangement order corresponding to the indicator combination;

sorting the plurality of target card objects according to the indicator arrangement order;

displaying the sorted target card objects in the shared display area.

10. The method of claim 1, wherein after the displaying the target card class object in the shared display area, the method further comprises:

adjusting display positions of the remaining card objects in the personal display area so that the distances between every two adjacent card objects among the remaining card objects are equal.

11. An interaction device based on card objects, the device comprising:

a display module, used for displaying an interactive interface, wherein the interactive interface comprises a personal display area and a shared display area, the personal display area is used for displaying card objects held by a currently logged-in first user identifier, the shared display area is used for displaying card objects dealt by a plurality of user identifiers participating in interaction, and the plurality of user identifiers comprise the first user identifier;

an execution module, used for performing, in response to a sliding operation detected in the interactive interface satisfying a dealing condition, a dealing operation on a target card class object through which a sliding track of the sliding operation passes in the personal display area, wherein the dealing condition is that the sliding operation is released in the shared display area after moving from the personal display area to the shared display area;

the display module is further used for displaying the target card class object in the shared display area.

12. The apparatus of claim 11, wherein the execution module comprises:

a determination unit, configured to determine a sliding track of the sliding operation in response to detecting the sliding operation in the personal display area;

the determination unit is further configured to determine any card object as a target card object in response to the sliding track intersecting a display area corresponding to that card object in the personal display area;

an execution unit, configured to perform a dealing operation on the target card class object in response to detecting that the sliding operation is released in the shared display area after moving from the personal display area to the shared display area.

13. The apparatus of claim 11, wherein the execution module comprises:

a display unit, configured to highlight the target card class object in response to detecting the sliding operation in the personal display area and when the sliding operation moves out of the personal display area;

an execution unit, configured to perform a dealing operation on the target card class object in response to detecting that the sliding operation is released in the shared display area.

14. A computer device comprising a processor and a memory, the memory having stored therein at least one computer program, the at least one computer program being loaded and executed by the processor to perform operations performed in the card object-based interaction method of any one of claims 1 to 10.

15. A computer-readable storage medium, having at least one computer program stored therein, the at least one computer program being loaded and executed by a processor to perform operations performed in the card object-based interaction method of any one of claims 1 to 10.

Technical Field

The embodiment of the application relates to the technical field of computers, in particular to an interaction method and device based on card objects, computer equipment and a storage medium.

Background

With the development of Internet technology and the growing demand for entertainment, card games have become increasingly popular among users. In a round of a game, multiple users interact based on the card objects they hold. When a game starts, each user obtains a certain number of card objects, the users deal out their held card objects in turn, and the user who first deals out all held card objects wins the game.

When dealing out a held card object, a user first needs to select the target card object to be dealt and then click a confirmation option to deal out the selected target card object. Multiple operations therefore need to be performed in the process of dealing out card objects, and the operation efficiency is low.

Disclosure of Invention

The embodiment of the application provides an interaction method and device based on card objects, computer equipment and a storage medium, and the operation efficiency can be improved. The technical scheme is as follows:

in one aspect, an interaction method based on card class objects is provided, the method comprising:

displaying an interactive interface, wherein the interactive interface comprises a personal display area and a shared display area, the personal display area is used for displaying card objects held by a first user identifier which is currently logged in, the shared display area is used for displaying card objects issued by a plurality of user identifiers which participate in interaction, and the plurality of user identifiers comprise the first user identifier;

in response to a sliding operation detected in the interactive interface satisfying a dealing condition, performing a dealing operation on a target card class object through which a sliding track of the sliding operation passes in the personal display area, wherein the dealing condition is that the sliding operation is released in the shared display area after moving from the personal display area to the shared display area;

displaying the target card class object in the shared display area.

In another aspect, an interaction apparatus based on card objects is provided, the apparatus comprising:

a display module, used for displaying an interactive interface, wherein the interactive interface comprises a personal display area and a shared display area, the personal display area is used for displaying card objects held by a currently logged-in first user identifier, the shared display area is used for displaying card objects dealt by a plurality of user identifiers participating in interaction, and the plurality of user identifiers comprise the first user identifier;

an execution module, used for performing, in response to a sliding operation detected in the interactive interface satisfying a dealing condition, a dealing operation on a target card class object through which a sliding track of the sliding operation passes in the personal display area, wherein the dealing condition is that the sliding operation is released in the shared display area after moving from the personal display area to the shared display area;

the display module is further used for displaying the target card class object in the shared display area.

In one possible implementation, there are a plurality of target card class objects; when the sliding operation moves out of the personal display area, the distance between any two adjacent target card class objects is smaller than the distance between any two adjacent other card class objects in the personal display area, and the display positions of the plurality of target card class objects are higher than the display positions of the other card class objects.

In another possible implementation manner, the execution module includes:

a determination unit, configured to determine, in response to detecting a sliding operation in the personal display area, a target card class object through which a sliding track of the sliding operation passes in the personal display area;

an execution unit, configured to control the target card class object to move along with the sliding track after detecting that the sliding operation moves out of the personal display area until the sliding operation is released, and to perform a dealing operation on the target card class object.

In another possible implementation, the determination unit is configured to set, in response to detecting the sliding operation in the personal display area, a state of the target card class object through which the sliding track of the sliding operation passes to a selected state;

the execution unit is configured to control the target card class object in the selected state to move along with the sliding track after detecting that the sliding operation moves out of the personal display area, and to perform the dealing operation on the target card class object in the selected state in response to detecting that the sliding operation is released within the shared display area.

In another possible implementation manner, the apparatus further includes:

a restoring module, configured to restore the state of the target card class object to a non-selected state in response to detecting that the sliding operation is released in the personal display area.

In another possible implementation, there are a plurality of target card class objects, each card class object containing an indicator; the apparatus further includes:

a restoring module, configured to restore the states of the plurality of target card objects to the non-selected state in response to an indicator combination formed by the indicators contained in the plurality of target card objects not satisfying a play condition.

In another possible implementation, there are a plurality of target card class objects, each card class object containing an indicator; the display module is configured to determine, according to an indicator combination formed by the indicators contained in the plurality of target card objects, an indicator arrangement order corresponding to the indicator combination; sort the plurality of target card objects according to the indicator arrangement order; and display the sorted target card objects in the shared display area.

In another possible implementation manner, the apparatus further includes:

an adjusting module, configured to adjust display positions of the remaining card objects in the personal display area so that the distances between every two adjacent card objects among the remaining card objects are equal.

In another aspect, a computer device is provided, which includes a processor and a memory, wherein at least one computer program is stored in the memory, and the at least one computer program is loaded and executed by the processor to implement the operations performed in the card object-based interaction method according to the above aspect.

In another aspect, a computer-readable storage medium is provided, in which at least one computer program is stored, the at least one computer program being loaded and executed by a processor to implement the operations performed in the card object-based interaction method according to the above aspect.

In yet another aspect, a computer program product or a computer program is provided, the computer program product or the computer program comprising computer program code, the computer program code being stored in a computer readable storage medium. The processor of the computer device reads the computer program code from the computer-readable storage medium, and the processor executes the computer program code, so that the computer device implements the operations performed in the card object-based interaction method as described in the above aspect.

The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:

With the method, apparatus, computer device and storage medium provided by the embodiments of this application, the user only needs to perform a single sliding operation in the interactive interface to automatically determine the card objects to be dealt and to automatically deal them out, without first performing an operation of selecting the card objects and then an operation of clicking a confirmation option, which simplifies the operation and improves operation efficiency.

Drawings

In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.

FIG. 1 is a schematic diagram of an implementation environment provided by an embodiment of the present application;

FIG. 2 is a flowchart of an interaction method based on card objects provided by an embodiment of the present application;

FIG. 3 is a flowchart of an interaction method based on card objects provided by an embodiment of the present application;

FIG. 4 is a schematic diagram of an interactive interface provided by an embodiment of the present application;

FIG. 5 is a schematic structural diagram of a touch screen provided by an embodiment of the present application;

FIG. 6 is a flowchart of determining a touch position provided by an embodiment of the present application;

FIG. 7 is a schematic diagram of an interactive interface provided by an embodiment of the present application;

FIG. 8 is a schematic diagram of an interactive interface provided by an embodiment of the present application;

FIG. 9 is a schematic diagram of an interactive interface provided by an embodiment of the present application;

FIG. 10 is a schematic diagram of an interactive interface provided by an embodiment of the present application;

FIG. 11 is a flowchart of a playing-card-based interaction method provided by an embodiment of the present application;

FIG. 12 is a schematic structural diagram of a terminal provided by an embodiment of the present application;

FIG. 13 is a schematic structural diagram of an interaction device based on card objects provided by an embodiment of the present application;

FIG. 14 is a schematic structural diagram of an interaction device based on card objects provided by an embodiment of the present application;

FIG. 15 is a schematic structural diagram of a terminal provided by an embodiment of the present application;

FIG. 16 is a schematic structural diagram of a server provided by an embodiment of the present application.

Detailed Description

To make the objects, technical solutions and advantages of the embodiments of the present application more clear, the embodiments of the present application will be further described in detail with reference to the accompanying drawings.

As used herein, "at least one" includes one, two, or more; "a plurality" includes two or more; "each" refers to each one of the corresponding plurality; and "any" refers to any one of the plurality. For example, if a plurality of card objects includes 3 card objects, "each" refers to each of the 3 card objects, and "any" refers to any one of the 3 card objects, which can be the first, the second, or the third card object.

Before the embodiments of the present application are explained in detail, the concepts involved are explained as follows:

1. Card class object: a virtual object used for user interaction in a card game. Optionally, a card class object is presented in the form of an image. In a card game, one deck of card class objects comprises a plurality of card class objects. Optionally, each card class object comprises an indicator, the indicator comprises the data, suit and the like of the card class object, and different card class objects contain different indicators. In addition, card class objects differ in rank, and the ranks of different card class objects can be distinguished according to the indicator contained in each card class object. For example, the indicator includes data and a suit; card class objects can be ranked according to the data in the indicator, or card class objects can be ranked according to the suit in the indicator. For example, when the card class object is a playing card, the indicator of the playing card includes the data and suit of the playing card; different suits have different ranks, the suits being, in descending order of rank, spades, hearts, clubs and diamonds; and different data have different ranks, the data being, in descending order of rank, A, K, Q, J, 10, 9, 8 and the like.
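By way of a purely illustrative example (not part of the disclosed solution), the indicator of a card class object can be modeled as a small data structure. The following TypeScript sketch uses hypothetical names (Suit, Card, compareCards) and a hypothetical rank encoding to show one possible way of comparing two card objects by their indicators:

```typescript
// Illustrative sketch only: a possible data model for a card class object's indicator.
// Suit, Card and compareCards are hypothetical names, not taken from the disclosure.
enum Suit { Diamonds = 0, Clubs = 1, Hearts = 2, Spades = 3 } // ascending rank

interface Card {
  rank: number; // the "data" of the indicator: 2..14, where 11=J, 12=Q, 13=K, 14=A
  suit: Suit;   // the suit of the indicator
}

// Compare two cards: higher data wins; equal data are ordered by suit rank.
function compareCards(a: Card, b: Card): number {
  if (a.rank !== b.rank) return a.rank - b.rank;
  return a.suit - b.suit;
}

const result = compareCards({ rank: 14, suit: Suit.Spades }, { rank: 13, suit: Suit.Hearts });
console.log(result > 0); // true: the A of spades outranks the K of hearts
```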

2. Card object interaction: a plurality of users participate in each card game. When the card game starts, a deck of card objects is dealt to the participating users so that each user holds a certain number of card objects; after the card-playing order of the users is determined, the users deal out their held card objects in turn for the other users to view, and the user who first deals out all held card objects wins the game.

A play condition is set in the card game, and each user should satisfy the play condition when dealing out card objects; the play condition specifies the data of the indicators contained in the card objects dealt by the user, the number of the card objects dealt, and the like.

The card object-based interaction method provided by the embodiments of this application can be applied to a terminal. Optionally, the terminal is a smartphone, a tablet computer, a notebook computer, a desktop computer, a smart speaker, a smart watch, or the like, but is not limited thereto.

Fig. 1 is a schematic diagram of an implementation environment provided by an embodiment of the present application. Referring to fig. 1, the implementation environment includes a plurality of terminals 101 (3 are taken as an example in fig. 1) and a server 102. The terminal 101 and the server 102 are connected via a wireless or wired network.

The server 102 provides an interaction environment for the plurality of terminals 101, the plurality of terminals 101 perform interaction based on the logged-in user identifiers, each terminal 101 displays the card objects held by the currently logged-in user identifier, and sends out the held card objects in turn, so that interaction among the plurality of users is realized.

Alternatively, each terminal 101 has installed thereon a target application served by the server 102, and a plurality of terminals 101 can interact through the card objects in the target application. Optionally, the target application is a target application in an operating system of the terminal 101, or a target application provided by a third party. For example, the target application is a game application having a game function, but of course, the game application can also have other functions, such as a comment function, a shopping function, a chat function, and the like. Optionally, the server 102 is a background server of the target application or a cloud server providing services such as cloud computing and cloud storage.

The method provided by the embodiment of the application can be used for various scenes.

For example, in a card game scenario:

A card game application is installed on a terminal and is logged in based on a user identifier. A virtual room is allocated to the user identifier, and the user identifier interacts with other user identifiers in the virtual room. When a game starts, the server deals a certain number of card objects to each user identifier, and the plurality of user identifiers deal out their held card objects in turn according to the play condition in the card game application. When each user identifier deals out held card objects, the card object-based interaction method provided by the embodiments of this application can be used to quickly select target card objects from the held card objects and deal out the selected target card objects, which reduces the operations the user needs to perform and improves operation efficiency.

Fig. 2 is a flowchart of an interaction method based on card objects according to an embodiment of the present application, which is applied to a terminal, and as shown in fig. 2, the method includes:

201. The terminal displays an interactive interface.

The interactive interface comprises a personal display area and a shared display area, wherein the personal display area is used for displaying card objects held by a first user identifier which is currently logged in, the shared display area is used for displaying card objects issued by a plurality of user identifiers which participate in interaction, and the plurality of user identifiers comprise the first user identifier.

202. In response to the sliding operation detected in the interactive interface satisfying the dealing condition, the terminal performs a dealing operation on the target card class object through which the sliding track of the sliding operation passes in the personal display area.

The target card class object is a card class object selected from the personal display area, and the dealing condition is that the sliding operation is released in the shared display area after moving from the personal display area to the shared display area.

The sliding operation is a continuous operation from the personal display area to the shared display area. The track formed by connecting the positions that the sliding operation passes through in the interactive interface during sliding is the sliding track of the sliding operation, and the card objects that the sliding track passes through during the sliding are determined as target card objects.

203. The terminal displays the target card class object in the shared display area.

After the target card class object is issued, the target card class object is moved to the shared display area from the personal display area to be displayed, so that other user identifications participating in interaction can view the issued target card class object, and interaction among multiple users is realized.

According to the method provided by the embodiment of the application, the user can automatically determine the card class object to be dealt by only executing one sliding operation in the interactive interface, and automatically execute the operation of dealing the card class object, and the user does not need to execute the operation of selecting the card class object and then clicking the confirmation option, so that the operation is simplified, and the operation efficiency is improved.

Fig. 3 is a flowchart of an interaction method based on a card object according to an embodiment of the present application, which is applied in a terminal, and as shown in fig. 3, the method includes:

301. The terminal displays an interactive interface.

The terminal is a terminal logged in with a first user identifier. The interactive interface comprises a personal display area and a shared display area; the personal display area is used for displaying the card objects held by the currently logged-in first user identifier, and the shared display area is used for displaying the card objects dealt by a plurality of user identifiers participating in the interaction, for example a discard area or a card-exchange area in the card game. The first user identifier is included in the plurality of user identifiers participating in the interaction. Optionally, the personal display area is a partial area at the bottom of the interactive interface, and the shared display area is a partial area in the middle of the interactive interface.

In one possible implementation manner, the terminal displays the card class object held by the currently logged-in first user identifier in a personal display area in the interactive interface. The card objects held by the first user identification are displayed in the personal display area, so that the user can view the card objects held by the user in the personal display area, and convenience in display of the card objects is guaranteed.

Optionally, the first user identifier holds a plurality of card type objects, in the personal display area, partial areas of every two adjacent card type objects in the plurality of card type objects are overlapped, and the distances between every two adjacent card type objects are equal.

In the embodiment of the application, the sizes of different card objects are equal, and the distance between every two adjacent card objects in the personal display area can represent the size of the overlapping area between the two card objects. That is, the larger the distance between the two card type objects is, the smaller the overlapping area between the two card type objects is, and the smaller the distance between the two card type objects is, the larger the overlapping area between the two card type objects is.

Because the currently logged-in first user identifier may hold a large number of card objects, the card objects held by the first user identifier might not all fit in the personal display area if laid out side by side. Therefore, every two adjacent card objects partially overlap and the distances between every two adjacent card objects are equal, which ensures that the card objects are displayed evenly in the personal display area and improves the display effect of the personal display area.

Optionally, the distance between two adjacent card objects is expressed in either of the following ways: the distance between the two card objects is the distance between the centers of the two card objects; or, if the card objects are rectangles, the distance between the two card objects is the distance between corresponding sides of the two rectangles.

Optionally, when the plurality of card objects are displayed in the personal display area, the distance between every two adjacent card objects in the plurality of card objects is a fixed distance. Wherein the fixed distance is an arbitrarily set numerical value.

Optionally, before a plurality of card objects are displayed in the personal display area, a correspondence between the number of card objects and the distance is queried according to the number of card objects held by the currently logged-in first user identifier, the distance between every two adjacent card objects is determined, and the card objects held by the first user identifier are then displayed according to the determined distance.

In the correspondence between the number of card objects and the distance, the larger the number of card objects, the smaller the corresponding distance, that is, the smaller the distance between two adjacent card objects when the plurality of card objects are displayed; the smaller the number of card objects, the larger the corresponding distance, that is, the larger the distance between two adjacent card objects when the plurality of card objects are displayed.

In the embodiments of this application, the display size of each card object is fixed and the size of the personal display area is fixed. If the distance between every two adjacent card objects is too large, not all of the card objects held by the first user identifier can be displayed in the personal display area. Therefore, on the premise that the personal display area can display all of the card objects, the distance between two adjacent displayed card objects is determined according to the number of card objects, so that the card objects are displayed as completely as possible.
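As a purely illustrative sketch of this idea (not part of the claimed solution), the spacing can be derived directly from the number of cards so that the whole hand fits in the area; the area width, card width and clamping bounds below are hypothetical values:

```typescript
// Illustrative sketch: derive the spacing between adjacent card objects from the number
// of cards so that all cards fit in the personal display area. All numbers are assumptions.
function cardSpacing(cardCount: number, areaWidth: number, cardWidth: number,
                     maxSpacing = 60, minSpacing = 12): number {
  if (cardCount <= 1) return maxSpacing;
  // The hand occupies cardWidth + (cardCount - 1) * spacing pixels in total.
  const fit = (areaWidth - cardWidth) / (cardCount - 1);
  return Math.max(minSpacing, Math.min(maxSpacing, fit)); // more cards -> smaller spacing
}

console.log(cardSpacing(14, 700, 90)); // ~46.9 px between adjacent cards for a 14-card hand
```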

Optionally, in the personal display area, the card objects are arranged in descending order of their indicators, and the indicator contained in each card object is displayed when that card object is displayed.

When a plurality of card objects are displayed in the personal display area, every two adjacent card objects partially overlap, so each card object may display only a partial area while the rest is hidden; the displayed partial area of each card object includes its indicator, so that the user can distinguish different card objects by the indicators of the plurality of card objects displayed in the personal display area.

In the embodiments of this application, each card class object contains an indicator, the indicator can distinguish the ranks of different card class objects, and different card class objects contain different indicators. According to the indicator contained in each card class object, the rank relationship among the card class objects can be determined, and the card class objects are then displayed in order of their indicators to ensure that the card class objects displayed in the personal display area are orderly.
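For illustration only, such an ordering can be produced with a simple comparator over the indicator data and suit; the Card shape and rank encoding in the following TypeScript sketch are hypothetical:

```typescript
// Illustrative sketch: sort a hand so that card objects are displayed in descending
// indicator order, as described above. The Card shape and encodings are assumptions.
interface Card { rank: number; suit: number }

function sortHandDescending(hand: Card[]): Card[] {
  return [...hand].sort((a, b) => (b.rank - a.rank) || (b.suit - a.suit));
}

console.log(sortHandDescending([{ rank: 9, suit: 1 }, { rank: 14, suit: 3 }, { rank: 9, suit: 2 }]));
// -> the A first, then the two 9s ordered by suit
```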

In one possible implementation manner, the terminal has a target application installed thereon, and the step 301 includes: and the terminal logs in the target application based on the first user identification and displays an interactive interface. Wherein, the interactive interface is an interface provided by the target application.

Optionally, the process of obtaining the card object by the user identification includes: the server allocates a virtual room for the first user identifier logged in by the terminal, and respectively issues a plurality of card objects for the plurality of user identifiers in the virtual room, so that the terminal receives the plurality of card objects issued by the server and displays the plurality of card objects in the personal display area.

Wherein, the server is used for providing service for the target application. After the user identifications log in the target application, when a game of the card class starts, the server respectively issues a plurality of card class objects for the user identifications in the same virtual room.

Optionally, the process of issuing the card class object by the server includes: after receiving card object acquisition requests of a plurality of user identifications in the virtual room, the server issues a plurality of card objects for each user identification in the virtual room. The card object obtaining request is used for requesting to obtain the card object.

For example, after a plurality of user identifiers log in the same virtual room, for any terminal, the interactive interface includes a preparation option, a card object acquisition request is sent to the server in response to a trigger operation on the preparation option, and the server issues a card object for each user identifier in the virtual room after acquiring the card object acquisition request corresponding to each user identifier in the virtual room.

In addition, the interactive interface also displays the plurality of user identifiers participating in the interaction, the virtual assets owned by each user identifier, the number of card objects held by each user identifier, and a statistical list of the card objects in the deck that have not yet been dealt out. Optionally, an interactive scene can also be displayed in the interactive interface, and the interactive scene is a two-dimensional scene or a three-dimensional scene.

As shown in FIG. 4, the interactive interface displayed by the terminal includes the user nicknames of 3 user identifiers and the amount of virtual assets owned by each. The 14 card class objects held by the currently logged-in first user identifier are displayed in the personal display area, while the other two user identifiers hold 9 and 7 card class objects respectively. A statistical list of the card class objects in the deck that have not yet been dealt out is displayed at the top of the interactive interface; for example, the statistical list shows that 4 card class objects containing the indicator 10 have not yet been dealt out. In addition, while the 3 user identifiers deal out their held card objects in turn, at the card-playing stage of the currently logged-in first user identifier, the card objects dealt out by the previous user identifier are displayed in the shared display area of the interactive interface; for example, the 4 card objects dealt out by the previous user identifier are 3 card objects containing the indicator 6 and 1 card object containing the indicator 3. Moreover, at the card-playing stage of each user identifier, a countdown of the current card-playing stage is displayed in the interactive interface.

302. In response to the sliding operation detected in the interactive interface satisfying the dealing condition, the terminal performs a dealing operation on the target card class object through which the sliding track of the sliding operation passes in the personal display area.

The target card class objects are card class objects selected from the personal display area; optionally, there are one or more target card class objects. The dealing condition is that the sliding operation is released in the shared display area after moving from the personal display area to the shared display area.

The sliding operation is a continuous operation from the personal display area to the shared display area. The track formed by connecting the positions that the sliding operation passes through in the interactive interface during sliding is the sliding track of the sliding operation, and the card objects that the sliding track passes through during the sliding are determined as target card objects.

In one possible implementation, this step 302 includes the following three ways.

The first way comprises the following steps 3021-3023:

3021. The terminal determines a sliding track of the sliding operation in response to detecting the sliding operation in the personal display area.

A sliding operation is detected in the personal display area, and the track formed by connecting the positions that the sliding operation passes through is determined as the sliding track of the sliding operation.

In one possible implementation, the terminal is configured with a touch screen. When a human body contacts the touch screen, the touch screen generates a signal at the touch position, the terminal determines the touch position corresponding to the signal according to the detected signal, and, in response to a plurality of adjacent touch positions in the personal display area being acquired in real time, determines the plurality of adjacent touch positions as the sliding track of the sliding operation.

The touch screen is used for displaying the interactive interface. A plurality of adjacent touch positions are acquired in real time on the touch screen; if the plurality of adjacent touch positions are in the personal display area, this indicates that a sliding operation is detected in the personal display area, and the track formed by the plurality of adjacent touch positions is the sliding track of the sliding operation.

As shown in FIG. 5, the touch screen is connected with driving buffers and driving electrodes, the driving electrodes are disposed on the four sides of the touch screen, and the driving electrodes output driving pulses so that the touch screen forms a low-voltage alternating electric field. Because the human body is conductive, when the human body contacts the touch screen, the human body and a conductor layer in the touch screen form a coupling capacitance, currents emitted by the driving electrodes on the four sides of the touch screen flow to the contact position, and charge signals are generated between the inner and outer layers of the touch screen through the intermediate metal oxide. The terminal receives the charge signals through receiving electrodes and subsequently determines the touch position according to the charge information.

Optionally, the process of acquiring the touch position corresponding to the detected signal includes: determining raw data of the touch position according to the detected signal, the raw data including the coordinate range corresponding to the touch position and the pressure of each point; removing interference from the raw data; establishing a touch area according to the pressure of each point in the coordinate range; and determining the touch area as the contact position corresponding to the signal.

In the embodiments of this application, when a human body contacts the touch screen, the contact position is actually a contact area rather than a single point, so the touch area is determined from the detected signal and is taken as the contact position corresponding to the signal. The process of determining the contact position corresponding to the detected signal is shown in FIG. 6.
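As a purely illustrative sketch of this step (not the claimed implementation), one possible way to reduce the per-point pressure data to a single contact position is a pressure-weighted centroid; the data layout and the noise threshold below are assumptions:

```typescript
// Illustrative sketch: derive a contact position from raw per-point pressure data.
// RawPoint and the threshold value are hypothetical, not taken from the disclosure.
interface RawPoint { x: number; y: number; pressure: number }

function contactPosition(points: RawPoint[], noiseThreshold = 0.05): { x: number; y: number } | null {
  // Remove interference: discard points whose pressure is below the threshold.
  const valid = points.filter(p => p.pressure > noiseThreshold);
  if (valid.length === 0) return null;
  // Treat the touch area as a pressure-weighted region and use its centroid
  // as the contact position corresponding to the detected signal.
  const total = valid.reduce((s, p) => s + p.pressure, 0);
  return {
    x: valid.reduce((s, p) => s + p.x * p.pressure, 0) / total,
    y: valid.reduce((s, p) => s + p.y * p.pressure, 0) / total,
  };
}
```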

3022. The terminal determines any card class object as a target card class object in response to the sliding track intersecting the display area corresponding to that card class object in the personal display area.

In the embodiments of this application, the card objects held by the currently logged-in first user identifier are displayed in the personal display area, and each card object is displayed in a corresponding display area. If the sliding track intersects the display area corresponding to any card object, this indicates that the card object is selected by the sliding operation, and the card object is determined as a target card object.

As shown in FIG. 4, the sliding track intersects the display areas of a plurality of adjacent card objects, and the plurality of adjacent card objects are determined as target card objects; alternatively, as shown in FIG. 7, the sliding track intersects the display areas of a plurality of non-adjacent card objects, and the plurality of non-adjacent card objects are determined as target card objects.

In one possible implementation, this step 3022 includes: in response to the sliding track intersecting a display area corresponding to any card class object in the personal display area, determining that card class object as a target card class object, and displaying the target card class object in the personal display area in a highlighted manner.

By highlighting the selected target card class object in the personal display area, the target card class object is distinguished from the other, unselected card class objects, so that the user can directly view the selected target card class object, which enhances the display effect.

Optionally, the display positions of the target card class objects are higher than the display positions of the other card class objects in the personal display area. As shown in FIG. 4, 4 target card class objects are selected, including 3 target card class objects containing the indicator 9 and 1 target card class object containing the indicator 6, and the display positions of the 4 target card class objects are higher than those of the other card class objects so as to highlight the target card class objects.
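For illustration only (with hypothetical names and a simplified hit test that treats each display area as an axis-aligned rectangle), the intersection of the sliding track with the card display areas can be checked as follows:

```typescript
// Illustrative sketch: mark as target card objects the cards whose display areas are
// crossed by the sliding track. Rect, TrackPoint and the hit test are assumptions.
interface Rect { x: number; y: number; width: number; height: number }
interface TrackPoint { x: number; y: number }

function pointInRect(p: TrackPoint, r: Rect): boolean {
  return p.x >= r.x && p.x <= r.x + r.width && p.y >= r.y && p.y <= r.y + r.height;
}

// Returns the indices of the card display areas that the sliding track passes through.
function targetCardIndices(track: TrackPoint[], cardAreas: Rect[]): number[] {
  return cardAreas
    .map((area, index) => ({ index, hit: track.some(p => pointInRect(p, area)) }))
    .filter(entry => entry.hit)
    .map(entry => entry.index);
}
```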

3023. In response to detecting that the sliding operation moves from the personal display area to the shared display area and is then released in the shared display area, the terminal performs a dealing operation on the target card class object through which the sliding track of the sliding operation passes.

In the embodiments of this application, if the sliding operation is detected to move from the personal display area to the shared display area and is then released in the shared display area, this indicates that the sliding operation stops in the shared display area, that is, the target card objects selected by the sliding operation need to be displayed in the shared display area. Therefore, a dealing operation is performed on the selected target card objects, and the target card objects are moved from the personal display area to the shared display area for display. The user only needs to perform a single sliding operation in the interactive interface to automatically determine the card objects to be dealt and to automatically deal them out, which simplifies the operation and improves operation efficiency.

In a possible implementation manner, the terminal acquires a position where a sliding track of the sliding operation passes in real time, and in response to that the starting position of the sliding track is in the personal display area and the ending position of the sliding track is in the shared display area, it is determined that the sliding operation is released in the shared display area after moving from the personal display area to the shared display area.
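As a purely illustrative sketch (not the claimed implementation), the dealing condition can be evaluated from the first and last positions of the sliding track; Rect and the point-in-rectangle test mirror the earlier sketch and are assumptions:

```typescript
// Illustrative sketch: check the dealing condition from the start and end of the track.
interface Rect { x: number; y: number; width: number; height: number }
interface Point { x: number; y: number }

const inside = (p: Point, r: Rect) =>
  p.x >= r.x && p.x <= r.x + r.width && p.y >= r.y && p.y <= r.y + r.height;

// The condition is met when the track starts in the personal display area and the
// operation is released (the last track point) inside the shared display area.
function meetsDealingCondition(track: Point[], personalArea: Rect, sharedArea: Rect): boolean {
  if (track.length < 2) return false;
  return inside(track[0], personalArea) && inside(track[track.length - 1], sharedArea);
}
```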

The second way comprises the following steps 3024-3025:

3024. The terminal, in response to detecting the sliding operation in the personal display area, highlights the target card class object through which the sliding track of the sliding operation passes when the sliding operation moves out of the personal display area.

A sliding operation is detected in the personal display area, and the card objects that the sliding track of the sliding operation passes through in the personal display area are determined as the selected target card objects. When the sliding operation moves out of the personal display area, this indicates that the selection of the target card objects is complete, so the target card objects are highlighted to enhance the display effect of the selected target card objects, and the user can tell whether a target card object is selected by checking its display style.

In one possible implementation manner, the terminal detects the position passed by the sliding track of the sliding operation in real time, and determines that the sliding operation moves out of the personal display area in response to detecting that the previous position passed by the sliding track is in the personal display area and the current detected position is outside the personal display area.
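A purely illustrative sketch of this detection (with hypothetical names, again treating the personal display area as an axis-aligned rectangle) is shown below:

```typescript
// Illustrative sketch: detect the moment the sliding operation leaves the personal
// display area by comparing the previous and current track positions.
interface Rect { x: number; y: number; width: number; height: number }
interface Point { x: number; y: number }

const contains = (r: Rect, p: Point) =>
  p.x >= r.x && p.x <= r.x + r.width && p.y >= r.y && p.y <= r.y + r.height;

// True exactly when the previous position was still inside the personal display area
// and the newly detected position is outside it.
function justLeftPersonalArea(prev: Point, current: Point, personalArea: Rect): boolean {
  return contains(personalArea, prev) && !contains(personalArea, current);
}
```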

In one possible implementation manner, the target card class objects are multiple, when the sliding operation is moved out of the personal display area, the distance between any two adjacent target card class objects is smaller than the distance between any two adjacent other card class objects in the personal display area, and the display positions of the multiple target card class objects are higher than those of the other card class objects.

Optionally, among the plurality of card objects displayed in the personal display area, every two adjacent card objects partially overlap. When the sliding operation moves out of the personal display area, the distance between any two adjacent target card objects is smaller than the distance between any two adjacent other card objects in the personal display area; accordingly, every two adjacent target card objects among the plurality of target card objects partially overlap, and the overlapping area of any two adjacent target card objects is larger than the overlapping area of any two adjacent other card objects.

Optionally, each card object includes an indicator. When the sliding operation moves out of the personal display area, every two adjacent target card objects among the plurality of target card objects partially overlap, and only the indicator of the topmost target card object among the plurality of target card objects is displayed.

As shown in FIG. 8, the selected 4 target card class objects include 3 target card class objects containing the indicator 9 and 1 target card class object containing the indicator 6. When the sliding operation moves out of the personal display area, the distance between any two adjacent target card class objects among the 4 target card class objects is smaller than the distance between any two adjacent other card class objects in the personal display area, and the display positions of the 4 target card class objects are higher than those of the other card class objects. Every two adjacent target card class objects among the 4 target card class objects partially overlap, only the indicator 6 of the topmost target card class object among the 4 is displayed, and the indicators of the other 3 target card class objects are no longer displayed.

3025. The terminal performs a dealing operation on the target card class object in response to detecting that the sliding operation is released in the shared display area.

This step is similar to step 3023 described above and will not be described further herein.

The third method comprises the following steps 3026-3027:

3026. The terminal determines a target card class object through which the sliding track of the sliding operation passes in the personal display area in response to detecting the sliding operation in the personal display area.

When a sliding operation is detected in the personal display area, the sliding track of the sliding operation can be determined, and the card class objects that the sliding track passes through in the personal display area are the target card class objects.

In one possible implementation, this step 3026 includes: in response to detecting the sliding operation in the personal display area, setting the state of the target card class object through which the sliding track of the sliding operation passes to the selected state.

The selected state is used for indicating that the corresponding card class object is selected.

Optionally, after the state of the target card class object is set to the selected state, the target card class object is highlighted in the personal display area. For example, when the state of the target card class object is the selected state, the target card class object is displayed in an enlarged manner, or the display position of the target card class object in the personal display area is higher than that of the other card class objects in the personal display area.

By setting the state of the target card objects that the sliding track passes through to the selected state, the selected target card objects are displayed prominently in the personal display area, so that the user can directly distinguish the selected target card objects from the unselected card objects among the card objects displayed in the personal display area, which improves the display effect of the interactive interface.

In one possible implementation, this step includes: determining any card object as a target card object in response to the sliding track of the sliding operation intersecting the display area corresponding to that card object in the personal display area, and setting the state of the target card object to the selected state.
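For illustration only, the selected state and the raised display position can be tracked together in a small view model; the CardView shape and the 20-pixel raise in the following sketch are hypothetical presentation choices, not taken from the disclosure:

```typescript
// Illustrative sketch: track the selected state of card objects and raise the selected
// ones in the personal display area. CardView and raiseBy are assumptions.
interface CardView { id: number; baseY: number; y: number; selected: boolean }

function setSelected(card: CardView, selected: boolean, raiseBy = 20): void {
  card.selected = selected;
  // A selected card is displayed higher than unselected cards; restoring the state
  // moves it back to its original position.
  card.y = selected ? card.baseY - raiseBy : card.baseY;
}
```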

3027. After detecting that the sliding operation moves out of the personal display area, the terminal controls the target card class object to move along with the sliding track until the sliding operation is released, and then performs a dealing operation on the target card class object.

After the sliding operation is detected to move out of the personal display area, the target card class object is controlled to move along with the sliding track of the sliding operation. The release of the sliding operation indicates that the sliding operation has stopped; the dealing operation is then performed on the target card class object, and the target card class object is no longer controlled to move along with the sliding track. In this process, the target card object moves along with the sliding track of the sliding operation until the sliding operation is released, so that an effect of dragging the target card objects to move and deal them out is presented in the interactive interface, which improves the display effect of the interactive interface.

In one possible implementation manner, after detecting that the sliding operation moves out of the personal display area, the terminal detects the position where the sliding track of the sliding operation passes in real time, and controls the target card object to move according to the detected position so as to present a picture of the target card object moving along with the sliding track.
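
As a sketch of this drag-and-release behavior, the TypeScript fragment below moves the selected cards with each trajectory sample and decides what to do on release; the names and the stacked layout are assumptions of the sketch, not details given by the embodiment:

```typescript
interface Point { x: number; y: number; }
interface DraggedCard { id: string; x: number; y: number; selected: boolean; }

// Moves the selected cards with each trajectory sample so that they appear to
// be dragged as a stack toward the shared display area.
function followTrajectory(selected: DraggedCard[], sample: Point, stackGap: number): void {
  selected.forEach((card, index) => {
    card.x = sample.x + index * stackGap; // small horizontal offset per card
    card.y = sample.y;
  });
}

// On release: deal if released over the shared display area, otherwise restore
// the non-selected state so the hand returns to its pre-selection style.
function onRelease(
  selected: DraggedCard[],
  releasedInSharedArea: boolean,
  deal: (cards: DraggedCard[]) => void,
): void {
  if (releasedInSharedArea) {
    deal(selected);
  } else {
    selected.forEach(card => { card.selected = false; });
  }
}
```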

In one possible implementation, after the target card class object through which the sliding track of the sliding operation passes in the personal display area is determined, the state of that target card class object is set to the selected state. In that case, step 3027 includes the following steps 30271 and 30272:

30271. After detecting that the sliding operation moves out of the personal display area, the terminal controls the target card class object in the selected state to move along with the sliding track.

30272. And in response to detecting that the sliding operation is released in the shared display area, executing a sending-out operation on the target card class object in the selected state.

As shown in fig. 9, after it is detected that the sliding operation moves out of the personal display area, the target card class object in the selected state is controlled to move along with the sliding track. After the sliding operation moves to the shared display area, the target card class object in the selected state moves to the shared display area along with the sliding track. On the basis of the display in fig. 9, if it is detected that the sliding operation is released in the shared display area, the dealing operation is performed on the target card class object in the selected state, so that the target card class object is subsequently displayed in the shared display area, as shown in fig. 10.

Optionally, after step 30271, the method further comprises: and restoring the state of the target card class object to a non-selected state in response to detecting that the sliding operation is released in the personal display area.

When the sliding operation is released in the personal display area, the currently selected target card class object is deselected, so the state of the target card class object is restored to the non-selected state, that is, the target card class object is displayed in the personal display area in its style before selection.

After the sliding operation is detected to move out of the personal display area, the target card class object in the selected state is controlled to move along with the sliding track. The sliding operation can then be moved back to the personal display area and released there to cancel the selection of the target card class object. The selected target card class objects can therefore be changed at will while the sliding operation is being performed, which ensures the flexibility of the interaction method and improves the operation experience of the user.

Optionally, after step 30272, the method further comprises: restoring the states of the target card class objects to the non-selected state in response to the indication mark combination formed by the indication marks contained in the target card class objects not satisfying the dealing condition.

There are multiple target card class objects, and each card class object contains an indication mark. The dealing condition specifies, for example, the values of the indication marks contained in the card class objects dealt by the user and the number of those card class objects. When the indication mark combination formed by the indication marks contained in the target card class objects does not satisfy the dealing condition, the target card class objects cannot be dealt, so their states are restored to the non-selected state and the user can subsequently select card class objects again.

303. The terminal determines, according to the indication mark combination formed by the indication marks contained in the target card class objects, the indication mark arrangement order corresponding to the indication mark combination.

Each card class object contains an indication mark. In the embodiment of the present application, the indication marks contained in a plurality of card class objects can form a plurality of indication mark combinations, for example, a combination of two identical indication marks, a combination of three identical indication marks, a combination of four identical indication marks, a combination of five or more consecutive indication marks, and a combination of three identical indication marks plus one different indication mark. For example, when the target card class objects are playing cards, the indication mark combinations may form card types such as straights, bombs, pairs and three-of-a-kind in a card game. Different indication mark combinations correspond to different indication mark arrangement orders, and the indication mark arrangement order indicates the display order of the card class objects containing the corresponding indication marks.

According to the indication mark combination formed by the indication marks contained in the target card class objects, the indication mark arrangement order corresponding to that combination can be obtained, which also means that the combination satisfies the dealing condition for card class objects. The display order of the target card class objects can therefore be adjusted according to the determined indication mark arrangement order when the target card class objects are subsequently dealt.

In one possible implementation, this step 303 includes: and inquiring the corresponding relation between the indication mark combination and the indication mark arrangement sequence according to the indication mark combination formed by the indication marks contained in the target card objects, and determining the indication mark arrangement sequence corresponding to the indication mark combination.

The correspondence between indication mark combinations and indication mark arrangement orders records the various indication mark combinations that satisfy the dealing condition and their corresponding indication mark arrangement orders. The correspondence is queried according to the indication mark combination formed by the indication marks contained in the target card class objects. If an indication mark arrangement order corresponding to the combination is found, the combination satisfies the dealing condition, and the display order of the target card class objects is subsequently adjusted according to the determined indication mark arrangement order when the target card class objects are dealt.
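
A minimal TypeScript sketch of such a lookup, assuming the correspondence is keyed by the sorted ranks of the indication marks; the table entries and all names are illustrative assumptions rather than the embodiment's data:

```typescript
// A card's indication mark split into rank and suit, e.g. { rank: '9', suit: 'spade' }.
interface CardMark { rank: string; suit: string; }

// Correspondence between indication mark combinations that satisfy the dealing
// condition and their indication mark arrangement orders. The key is built from
// the sorted ranks; the two entries are illustrative only.
const arrangementOrders = new Map<string, string[]>([
  ['6,6,6,9', ['6', '6', '6', '9']], // three 6s with a single 9
  ['9,9', ['9', '9']],               // a pair of 9s
]);

function combinationKey(cards: CardMark[]): string {
  return cards.map(card => card.rank).sort().join(',');
}

// Returns the arrangement order if the combination satisfies the dealing
// condition, or null so the caller can restore the non-selected state.
function lookupArrangement(cards: CardMark[]): string[] | null {
  return arrangementOrders.get(combinationKey(cards)) ?? null;
}
```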

304. And the terminal sorts the target card objects according to the indication mark arrangement sequence.

Because each target card class object contains an indication mark, the target card class objects are sorted according to the indication mark arrangement order and the indication marks they contain, so that the order of the sorted target card class objects is consistent with the indication mark arrangement order.

For example, the indication mark contained in target card class object 1 is "spade 9", the indication mark contained in target card class object 2 is "heart 6", the indication mark contained in target card class object 3 is "spade 6", and the indication mark contained in target card class object 4 is "diamond 6". If the indication mark arrangement order corresponding to the indication mark combination formed by the indication marks contained in these 4 target card class objects is "spade 6", "heart 6", "diamond 6", "spade 9", the display order of the sorted target card class objects is target card class object 3, target card class object 2, target card class object 4, target card class object 1.
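
A sketch of this sorting step in TypeScript; the suit-priority tiebreak is an assumption introduced to reproduce the example above, not a rule stated by the embodiment:

```typescript
interface CardMark { rank: string; suit: string; }

// Sorts the target cards so that their display order matches the determined
// indication mark arrangement order; cards of the same rank are ordered by a
// suit priority, which is an assumption of this sketch.
function sortByArrangement(
  cards: CardMark[],
  arrangement: string[],  // e.g. ['6', '6', '6', '9']
  suitPriority: string[], // e.g. ['spade', 'heart', 'diamond', 'club']
): CardMark[] {
  return [...cards].sort((a, b) => {
    const byRank = arrangement.indexOf(a.rank) - arrangement.indexOf(b.rank);
    if (byRank !== 0) return byRank;
    return suitPriority.indexOf(a.suit) - suitPriority.indexOf(b.suit);
  });
}

// With the example above, [spade 9, heart 6, spade 6, diamond 6] becomes
// [spade 6, heart 6, diamond 6, spade 9].
```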

305. And the terminal displays the sequenced target card class objects in the shared display area.

Before the terminal displays the sequenced target card objects in the shared display area, the terminal cancels the display of the target card objects in the personal display area so as to show the effect that the target card objects move from the personal display area to the shared display area for display.

It should be noted that, in the embodiment of the present application, the target card class objects are sorted first and then displayed according to the sorted order; in another embodiment, steps 303 to 305 need not be executed, and the target card class objects can be displayed in the shared display area in other manners.

306. The terminal adjusts the display positions of the remaining card class objects in the personal display area so that the distance between any two adjacent card class objects among the remaining card class objects is equal.

After the display of the selected at least one target card class object is cancelled, the display positions corresponding to the at least one target card class object become vacant. The display positions of the remaining card class objects in the personal display area are therefore adjusted so that the distance between any two adjacent card class objects among the remaining card class objects is equal, that is, the remaining card class objects are displayed uniformly in the personal display area, which ensures the display effect of the personal display area.
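
One way to realize this even redistribution is sketched below in TypeScript; the area and card-width parameters are hypothetical:

```typescript
interface CardSlot { id: string; x: number; }

// Spreads the remaining cards evenly across the personal display area so that
// the gap between any two adjacent cards is equal.
function redistributeHand(
  remaining: CardSlot[],
  areaLeft: number,
  areaWidth: number,
  cardWidth: number,
): void {
  if (remaining.length === 0) return;
  if (remaining.length === 1) {
    remaining[0].x = areaLeft + (areaWidth - cardWidth) / 2; // center a single card
    return;
  }
  // Equal step between the left edges of adjacent cards.
  const step = (areaWidth - cardWidth) / (remaining.length - 1);
  remaining.forEach((card, index) => {
    card.x = areaLeft + index * step;
  });
}
```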

According to the method provided by the embodiment of the application, the user only needs to perform one sliding operation in the interactive interface to automatically determine the card class objects to be dealt and to automatically perform the dealing operation, without first performing an operation of selecting card class objects and then an operation of clicking a confirmation option. The operation is thereby simplified and the operation efficiency is improved. A single sliding operation in the interactive interface also enables quick selection and playing of cards, which reduces the operation cost of the player.

Taking the card class objects as playing cards and the shared display area as a discard area as an example, a flow chart of a playing-card-based interaction method is provided, as shown in fig. 11. The flow includes:

1. Detect that the finger of the user moves on the touch screen, and determine whether a sliding operation is being performed according to whether the detected touch positions of the finger form a continuous displacement (a sketch of such a displacement check follows this flow).

2. In response to detecting the sliding operation, the playing cards through which the sliding track of the sliding operation passes are determined to be selected playing cards, the selected playing cards move along with the sliding track, and the selected playing cards are dragged to the discard area. When a plurality of playing cards are selected, the playing cards are stacked together while being dragged to the discard area.

3. When the finger leaves the touch screen, that is, when the release of the sliding operation is detected, the selected playing cards are displayed in the discard area.
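
The displacement check in step 1 could be implemented roughly as follows; the threshold value and the use of only the first and latest touch samples are simplifying assumptions of this TypeScript sketch:

```typescript
interface TouchSample { x: number; y: number; timestamp: number; }

// Decides whether the finger movement counts as a sliding operation by checking
// for a displacement larger than a small threshold from the touch-down point.
function isSlidingOperation(samples: TouchSample[], minDisplacement = 10): boolean {
  if (samples.length < 2) return false;
  const first = samples[0];
  const last = samples[samples.length - 1];
  return Math.hypot(last.x - first.x, last.y - first.y) >= minDisplacement;
}
```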

The card object-based interaction method provided by the embodiment of the application is applied to a terminal, as shown in fig. 12, the terminal includes a touch sensing control system, an optical sensor, an input/output system, a power supply, an external interface, a touch screen, a controller and a processor, and the input/output system includes a display controller and an optical sensor controller. The touch sensing control system is electrically connected with the display controller, and the optical sensor is electrically connected with the optical sensor controller.

The touch screen is used for displaying a user interface, and in the embodiment of the application, the user interface is an interactive interface. The touch sensing control system is connected with the display controller and controls the trigger operation detected on the touch screen, so that a user can perform touch operation on a user interface displayed by the terminal, and man-machine interaction is realized. The optical sensor is used for collecting the intensity of ambient light, and the optical sensor controller is used for controlling the optical sensor to collect the intensity of the ambient light and controlling the user interface displayed on the touch screen based on the input/output system, such as controlling the display brightness of the user interface. The terminal is provided with a power supply that supplies power to the various components in the terminal. The external interface is used for receiving external information and controlling the user interface displayed by the touch screen through the received external information. The processor and the controller in the terminal jointly control the terminal so as to realize the card object-based interaction method provided by the embodiment of the application.

The user performs an operation on the terminal, so that the terminal can implement the card object-based interaction method provided in the above embodiment.

Fig. 13 is a schematic structural diagram of an interaction device based on card objects according to an embodiment of the present application, and as shown in fig. 13, the device includes:

the display module 1301 is configured to display an interactive interface, where the interactive interface includes a personal display area and a shared display area, the personal display area is configured to display a card object held by a currently logged-in first user identifier, the shared display area is configured to display a card object issued by multiple user identifiers participating in interaction, and the multiple user identifiers include the first user identifier;

an executing module 1302, configured to, in response to that the sliding operation detected in the interactive interface meets a dealing condition, perform a dealing operation on a target card class object through which a sliding track of the sliding operation in the personal display area passes, where the dealing condition is: the sliding operation is released in the shared display area after being moved from the personal display area to the shared display area;

the display module 1301 is further configured to display the target card class object in the shared display area.

In one possible implementation, as shown in fig. 14, the executing module 1302 includes:

a determination unit 1321 configured to determine a slide trajectory of the slide operation in response to detection of the slide operation in the personal presentation area;

the determining unit 1321 is further configured to determine any card class object as a target card class object in response to the intersection of the sliding track and a display area corresponding to any card class object in the personal display area;

the execution unit 1322 is configured to, in response to detecting that the sliding operation is released in the shared display area after moving from the personal display area to the shared display area, perform a dealing operation on the target card class object.

In another possible implementation, as shown in fig. 14, the executing module 1302 includes:

a display unit 1323 for highlighting the target card class object in response to detecting the sliding operation in the personal presentation area and when the sliding operation is moved out of the personal presentation area;

an execution unit 1322, configured to, in response to detecting that the sliding operation is released within the shared display area, perform a dealing operation on the target card class object.

In another possible implementation manner, the target card class objects are multiple, when the sliding operation is moved out of the personal display area, the distance between any two adjacent target card class objects is smaller than the distance between any two adjacent other card class objects in the personal display area, and the display positions of the multiple target card class objects are higher than those of the other card class objects.

In another possible implementation, as shown in fig. 14, the executing module 1302 includes:

a determination unit 1321 configured to determine, in response to detection of a slide operation in the personal presentation area, a target card class object through which a slide trajectory of the slide operation in the personal presentation area passes;

and the execution unit 1322 is used for controlling the target card class object to move along with the sliding track after detecting that the sliding operation moves out of the personal display area until the sliding operation is released, and executing a dealing operation on the target card class object.

In another possible implementation, the determining unit 1321 is configured to set, in response to detecting a sliding operation in the personal presentation area, a state of a target card class object through which a sliding trajectory of the sliding operation passes, to a selected state;

the execution unit 1322 is used for controlling the target card class object in the selected state to move along with the sliding track after detecting that the sliding operation moves out of the personal display area; and in response to detecting that the sliding operation is released in the shared display area, executing a sending-out operation on the target card class object in the selected state.

In another possible implementation manner, as shown in fig. 14, the apparatus further includes:

and the restoring module 1303 is used for restoring the state of the target card class object to the non-selected state in response to the fact that the sliding operation is detected to be released in the personal display area.

In another possible implementation manner, there are multiple target card class objects, and each card class object contains an indication mark; the apparatus further includes:

and the restoring module 1303, configured to restore the states of the multiple target card class objects to the non-selected state in response to the indication mark combination formed by the indication marks contained in the multiple target card class objects not satisfying the dealing condition.

In another possible implementation manner, there are multiple target card class objects, and each card class object contains an indication mark; the display module 1301 is configured to determine, according to the indication mark combination formed by the indication marks contained in the target card class objects, the indication mark arrangement order corresponding to the indication mark combination; sort the target card class objects according to the indication mark arrangement order; and display the sorted target card class objects in the shared display area.

In another possible implementation manner, as shown in fig. 14, the apparatus further includes:

an adjusting module 1304, configured to adjust the display positions of the remaining card class objects in the personal display area so that the distances between any two adjacent card class objects among the remaining card class objects are equal.

It should be noted that: the card object-based interaction device provided in the above embodiment is only illustrated by the division of the above functional modules, and in practical applications, the function distribution may be completed by different functional modules according to needs, that is, the internal structure of the terminal is divided into different functional modules to complete all or part of the functions described above. In addition, the card object-based interaction device provided by the above embodiment and the card object-based interaction method embodiment belong to the same concept, and specific implementation processes thereof are detailed in the method embodiment and are not described herein again.

The embodiment of the present application further provides a computer device, which includes a processor and a memory, where the memory stores at least one computer program, and the at least one computer program is loaded and executed by the processor to implement the operations performed in the card object-based interaction method according to the above embodiment.

Optionally, the computer device is provided as a terminal. Fig. 15 shows a block diagram of a terminal 1500 according to an exemplary embodiment of the present application. The terminal 1500 may be a portable mobile terminal such as: a smart phone, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III, motion video Experts compression standard Audio Layer 3), an MP4 player (Moving Picture Experts Group Audio Layer IV, motion video Experts compression standard Audio Layer 4), a notebook computer, or a desktop computer. Terminal 1500 may also be referred to as user equipment, a portable terminal, a laptop terminal, a desktop terminal, or other names.

The terminal 1500 includes: a processor 1501 and memory 1502.

Processor 1501 may include one or more processing cores, such as a 4-core processor, an 8-core processor, or the like. The processor 1501 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). Processor 1501 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1501 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, processor 1501 may also include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.

The memory 1502 may include one or more computer-readable storage media, which may be non-transitory. The memory 1502 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1502 is used to store at least one computer program for execution by processor 1501 to implement the card object based interaction methods provided by method embodiments herein.

In some embodiments, the terminal 1500 may further include: a peripheral interface 1503 and at least one peripheral. The processor 1501, memory 1502, and peripheral interface 1503 may be connected by buses or signal lines. Various peripheral devices may be connected to peripheral interface 1503 via buses, signal lines, or circuit boards. Specifically, the peripheral device includes: at least one of a radio frequency circuit 1504, a display 1505, a camera assembly 1506, an audio circuit 1507, a positioning assembly 1508, and a power supply 1509.

The peripheral interface 1503 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 1501 and the memory 1502. In some embodiments, the processor 1501, memory 1502, and peripheral interface 1503 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1501, the memory 1502, and the peripheral interface 1503 may be implemented on separate chips or circuit boards, which is not limited in this embodiment.

The Radio Frequency circuit 1504 is used to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuitry 1504 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1504 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1504 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1504 can communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1504 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.

The display screen 1505 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1505 is a touch display screen, the display screen 1505 also has the ability to capture touch signals on or over the surface of the display screen 1505. The touch signal may be input to the processor 1501 as a control signal for processing. In this case, the display screen 1505 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, display 1505 may be one, provided on the front panel of terminal 1500; in other embodiments, display 1505 may be at least two, each disposed on a different surface of terminal 1500 or in a folded design; in other embodiments, display 1505 may be a flexible display disposed on a curved surface or a folded surface of terminal 1500. Even further, the display 1505 may be configured in a non-rectangular irregular pattern, i.e., a shaped screen. The Display 1505 can be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), and other materials.

The camera assembly 1506 is used to capture images or video. Optionally, the camera assembly 1506 includes a front camera and a rear camera. The front camera is arranged on the front panel of the terminal, and the rear camera is arranged on the back of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1506 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.

The audio circuitry 1507 may include a microphone and speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1501 for processing or inputting the electric signals to the radio frequency circuit 1504 to realize voice communication. For stereo capture or noise reduction purposes, multiple microphones may be provided, each at a different location of the terminal 1500. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1501 or the radio frequency circuit 1504 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuitry 1507 may also include a headphone jack.

The positioning component 1508 is used to locate the current geographic position of the terminal 1500 for navigation or LBS (Location Based Service). The positioning component 1508 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.

Power supply 1509 is used to power the various components in terminal 1500. The power supply 1509 may be alternating current, direct current, disposable or rechargeable. When the power supply 1509 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.

In some embodiments, the terminal 1500 also includes one or more sensors 1510. The one or more sensors 1510 include, but are not limited to: acceleration sensor 1511, gyro sensor 1512, pressure sensor 1513, fingerprint sensor 1514, optical sensor 1515, and proximity sensor 1516.

The acceleration sensor 1511 may detect the magnitude of acceleration on three coordinate axes of the coordinate system established with the terminal 1500. For example, the acceleration sensor 1511 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 1501 may control the display screen 1505 to display the user interface in a landscape view or a portrait view based on the gravitational acceleration signal collected by the acceleration sensor 1511. The acceleration sensor 1511 may also be used for acquisition of motion data of a game or a user.

The gyroscope sensor 1512 can detect the body direction and the rotation angle of the terminal 1500, and the gyroscope sensor 1512 and the acceleration sensor 1511 cooperate to collect the 3D motion of the user on the terminal 1500. The processor 1501 may implement the following functions according to the data collected by the gyro sensor 1512: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.

Pressure sensor 1513 may be disposed on a side frame of terminal 1500 and/or underneath display 1505. When the pressure sensor 1513 is disposed on the side frame of the terminal 1500, the holding signal of the user to the terminal 1500 may be detected, and the processor 1501 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 1513. When the pressure sensor 1513 is disposed at a lower layer of the display screen 1505, the processor 1501 controls the operability control on the UI interface in accordance with the pressure operation of the user on the display screen 1505. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.

The fingerprint sensor 1514 is configured to capture a fingerprint of the user, and the processor 1501 identifies the user based on the fingerprint captured by the fingerprint sensor 1514, or the fingerprint sensor 1514 identifies the user based on the captured fingerprint. Upon recognizing that the user's identity is a trusted identity, the processor 1501 authorizes the user to perform relevant sensitive operations including unlocking the screen, viewing encrypted information, downloading software, paying, and changing settings, etc. The fingerprint sensor 1514 may be disposed on the front, back, or side of the terminal 1500. When a physical key or vendor Logo is provided on the terminal 1500, the fingerprint sensor 1514 may be integrated with the physical key or vendor Logo.

The optical sensor 1515 is used to collect ambient light intensity. In one embodiment, processor 1501 may control the brightness of display screen 1505 based on the intensity of ambient light collected by optical sensor 1515. Specifically, when the ambient light intensity is high, the display brightness of the display screen 1505 is increased; when the ambient light intensity is low, the display brightness of the display screen 1505 is adjusted down. In another embodiment, the processor 1501 may also dynamically adjust the shooting parameters of the camera assembly 1506 based on the ambient light intensity collected by the optical sensor 1515.

A proximity sensor 1516, also called a distance sensor, is provided on the front panel of the terminal 1500. The proximity sensor 1516 is used to collect the distance between the user and the front surface of the terminal 1500. In one embodiment, when the proximity sensor 1516 detects that the distance between the user and the front surface of the terminal 1500 gradually decreases, the processor 1501 controls the display 1505 to switch from the screen-on state to the screen-off state; when the proximity sensor 1516 detects that the distance between the user and the front surface of the terminal 1500 gradually increases, the processor 1501 controls the display 1505 to switch from the screen-off state to the screen-on state.

Those skilled in the art will appreciate that the configuration shown in fig. 15 does not constitute a limitation of terminal 1500, and may include more or fewer components than shown, or some components may be combined, or a different arrangement of components may be employed.

Optionally, the computer device is provided as a server. Fig. 16 is a schematic structural diagram of a server according to an embodiment of the present application, where the server 1600 may generate a relatively large difference due to different configurations or performances, and may include one or more processors (CPUs) 1601 and one or more memories 1602, where the memories 1602 store at least one computer program, and the at least one computer program is loaded and executed by the processors 1601 to implement the methods provided by the above method embodiments. Of course, the server may also have components such as a wired or wireless network interface, a keyboard, and an input/output interface, so as to perform input/output, and the server may also include other components for implementing the functions of the device, which are not described herein again.

The embodiment of the present application further provides a computer-readable storage medium, in which at least one computer program is stored, and the at least one computer program is loaded and executed by a processor to implement the operations performed in the card object-based interaction method according to the above embodiment.

Embodiments of the present application also provide a computer program product or a computer program comprising computer program code stored in a computer readable storage medium. The processor of the computer apparatus reads the computer program code from the computer-readable storage medium, and the processor executes the computer program code, so that the computer apparatus implements the operations performed in the card object-based interaction method according to the above-described embodiment.

It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.

The above description is only an alternative embodiment of the present application and should not be construed as limiting the present application, and any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.
