Interaction control method, electronic equipment and system

Document No.: 1798084    Publication date: 2021-11-05

Abstract (designed and created by 高源 on 2021-07-14): The application discloses an interaction control method, an electronic device, and a system, relating to the field of terminals. The method includes: when the cloud server delivers the installation package of a first application to the electronic device, it also delivers first gesture navigation information of the first application. When the electronic device runs the first application and displays a first page, the electronic device may receive an input of a first gesture type directed at the first page. When the electronic device determines first information through the first gesture navigation information and the first gesture type based on the first page, the electronic device may cause the first application to execute a first function based on the first information, where the first function is the function that the first application associates with the first gesture type on the first page. In this case, the electronic device does not execute a system navigation function in response to the input of the first gesture type. This improves the accuracy with which the electronic device executes the operation the user wants to trigger and the efficiency of user operation.

1. An interaction control method, characterized in that the method comprises:

the electronic device acquires a first identifier in response to receiving a first input; the first identifier is an identifier corresponding to a first page in a first application;

the electronic device displays the first page based on the first input;

after the electronic device receives a second input directed at the first page, the electronic device determines a first gesture type corresponding to the second input;

the electronic device determines first information from first gesture navigation information based on the first gesture type and the first identifier; the first gesture navigation information is gesture navigation information corresponding to the first application;

the electronic device causes the first application to execute a first function based on the first information;

the electronic device acquires a second identifier in response to receiving a third input; the second identifier is an identifier corresponding to a second page in the first application;

the electronic device displays the second page based on the third input;

after the electronic device receives a fourth input directed at the second page, the electronic device determines a first gesture type corresponding to the fourth input;

the electronic device determines second information from the first gesture navigation information based on the first gesture type and the second identifier;

the electronic device causes the first application to execute a second function based on the second information; wherein the first function and the second function are different.

2. The method of claim 1, further comprising:

the electronic device acquires a third identifier in response to receiving a fifth input; the third identifier is an identifier corresponding to a third page in the first application;

the electronic device displays the third page based on the fifth input;

after the electronic device receives a sixth input in a first designated area of the third page, the electronic device determines a first gesture type corresponding to the sixth input;

the electronic device determines first information from the first gesture navigation information based on the first gesture type and the third identifier; the electronic device causes the first application to execute a first function based on the first information.

3. The method of claim 2, further comprising:

after the electronic device receives a seventh input in a second designated area of the third page, the electronic device determines a first gesture type corresponding to the seventh input;

the electronic device acquires the second information based on the first gesture type; the electronic device causes the first application to execute the second function based on the second information.

4. The method of claim 1, further comprising:

the electronic device acquires a fourth identifier in response to receiving an eighth input; the fourth identifier is an identifier corresponding to a fourth page in the first application;

the electronic device displays the fourth page based on the eighth input;

after the electronic device receives a ninth input directed at the fourth page, the electronic device determines a first gesture type corresponding to the ninth input; wherein the sliding speed of the ninth input is within a first speed threshold;

the electronic device determines first information from the first gesture navigation information based on the first gesture type and the fourth identifier; the electronic device causes the first application to execute a first function based on the first information.

5. The method of claim 4, further comprising:

after the electronic device receives a tenth input directed at the fourth page, the electronic device determines a first gesture type corresponding to the tenth input; wherein the sliding speed of the tenth input is within a second speed threshold;

the electronic device acquires the second information based on the first gesture type; the electronic device causes the first application to execute the second function based on the second information.

6. The method of any of claims 1-5, wherein the first function comprises:

opening a side menu bar of the page or displaying a floating window.

7. The method of any of claims 1-5, wherein the second function comprises:

returning to the previous page, returning to the home screen interface, or entering the recent tasks page.

8. The method of any of claims 1-5, wherein the first gesture type comprises:

a gesture type of sliding right from the left edge of a display screen of the electronic device, a gesture type of sliding left from the right edge of the display screen, a gesture type of sliding up from the lower edge of the display screen, or a gesture type of sliding up and pausing at the lower edge of the display screen.

9. The method of claim 1, wherein before the electronic device stores the first gesture navigation information of the first application, the method further comprises:

the electronic device receives an eleventh input for installing the first application and sends an installation request to a cloud server;

the electronic device receives an installation package of the first application and the first gesture navigation information from the cloud server;

the electronic device installs the first application based on the installation package of the first application and stores the first gesture navigation information; the first gesture navigation information comprises the first identifier, the second identifier, a first gesture type corresponding to the first page, the first information corresponding to the first page, and the second information corresponding to the second page.

10. An electronic device, comprising a communication apparatus, a memory, a processor coupled to the memory, a plurality of applications, and one or more programs; wherein the processor, when executing the one or more programs, causes the electronic device to implement the method of any one of claims 1-9.

11. A computer storage medium, characterized in that the storage medium has stored therein a computer program comprising executable instructions that, when executed by a processor, cause the processor to perform the method of any one of claims 1-9.

12. A computer program product, characterized in that, when the computer program product is run on an electronic device, it causes the electronic device to perform the method according to any one of claims 1-9.

Technical Field

The present application relates to the field of terminals, and in particular, to an interaction control method, an electronic device, and a system.

Background

With the development of terminal technology, in addition to receiving and responding to operations on physical keys and/or virtual controls on a display screen to perform corresponding operations, an electronic device can also receive a user's gesture input and quickly execute the instruction corresponding to that gesture input, simplifying the user's operations. For example, the electronic device may receive a gesture of the user sliding right from the left edge of the display screen or sliding left from the right edge of the display screen, and in response to the sliding gesture, return from the currently displayed page to the previous page.

However, gesture inputs defined by some applications may conflict with the gesture navigation defined by the electronic device. For example, some applications trigger a "floating window" function when the electronic device receives and responds to a gesture input of the user sliding left from the right edge of the display screen, while the electronic device maps the same gesture input to returning from the currently displayed page to the previous page. Then, in the display interface of such an application, when the electronic device receives a gesture input of the user sliding left from the right edge of the display screen, the instruction to return to the previous page is executed and the "floating window" function cannot be triggered. As a result, the accuracy with which the electronic device executes the operation the user wants to trigger is low, and user operation is very inconvenient.

Disclosure of Invention

The application provides an interaction control method, an electronic device, and a system. When the electronic device runs a first application, displays a first page, and receives an input of a first gesture type directed at the first page, the electronic device causes the first application to execute the function that the first application associates with the first gesture type on the first page, rather than the system navigation function that otherwise corresponds to the first gesture type. This improves the accuracy with which the electronic device executes the operation the user wants to trigger and the efficiency of user operation.

In a first aspect, the present application provides an interaction control method, including: the electronic device acquires a first identifier in response to receiving a first input, the first identifier being an identifier corresponding to a first page in a first application. The electronic device displays the first page based on the first input. After the electronic device receives a second input directed at the first page, the electronic device determines a first gesture type corresponding to the second input. The electronic device determines first information from first gesture navigation information based on the first gesture type and the first identifier, the first gesture navigation information being gesture navigation information corresponding to the first application. The electronic device causes the first application to execute a first function based on the first information. The electronic device acquires a second identifier in response to receiving a third input, the second identifier being an identifier corresponding to a second page in the first application. The electronic device displays the second page based on the third input. After the electronic device receives a fourth input directed at the second page, the electronic device determines a first gesture type corresponding to the fourth input. The electronic device determines second information from the first gesture navigation information based on the first gesture type and the second identifier. The electronic device causes the first application to execute a second function based on the second information, where the first function and the second function are different. In this way, the accuracy with which the electronic device executes the operation the user wants to trigger can be improved, and the efficiency of user operation is improved.
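In implementation terms, the lookup described above amounts to a small dispatch table keyed by page identifier and gesture type. The Kotlin sketch below is illustrative only; all names (GestureType, GestureNavInfo, onGesture) are assumptions, and the patent does not prescribe any particular API.

```kotlin
// Illustrative sketch of the per-page gesture dispatch in the first aspect.
// All names are hypothetical; the claims do not fix an implementation.
enum class GestureType {
    SLIDE_RIGHT_FROM_LEFT_EDGE,
    SLIDE_LEFT_FROM_RIGHT_EDGE,
    SLIDE_UP_FROM_BOTTOM_EDGE,
    SLIDE_UP_AND_PAUSE_FROM_BOTTOM_EDGE
}

// First gesture navigation information: (page identifier, gesture type) -> information.
class GestureNavInfo(private val entries: Map<Pair<String, GestureType>, String>) {
    fun lookup(pageId: String, gesture: GestureType): String? = entries[pageId to gesture]
}

fun onGesture(pageId: String, gesture: GestureType, navInfo: GestureNavInfo) {
    val firstInformation = navInfo.lookup(pageId, gesture)
    if (firstInformation != null) {
        // First information found: hand the event to the application so it can
        // run its own function for this page (e.g., open a side menu bar).
        println("app function on $pageId: $firstInformation")
    } else {
        // No entry for this page and gesture type: fall back to the system
        // navigation function (second information / second function).
        println("system navigation for $gesture")
    }
}

fun main() {
    val navInfo = GestureNavInfo(
        mapOf(("Activity1" to GestureType.SLIDE_RIGHT_FROM_LEFT_EDGE) to "open side menu bar")
    )
    onGesture("Activity1", GestureType.SLIDE_RIGHT_FROM_LEFT_EDGE, navInfo) // app function
    onGesture("Activity2", GestureType.SLIDE_RIGHT_FROM_LEFT_EDGE, navInfo) // system navigation
}
```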

In one possible implementation, the method further includes: the electronic device acquires a third identifier in response to receiving a fifth input, the third identifier being an identifier corresponding to a third page in the first application. The electronic device displays the third page based on the fifth input. After the electronic device receives a sixth input in a first designated area of the third page, the electronic device determines a first gesture type corresponding to the sixth input. The electronic device determines first information from the first gesture navigation information based on the first gesture type and the third identifier. The electronic device causes the first application to execute a first function based on the first information. In this way, the electronic device can query the first gesture navigation information based on an input in a designated area of the page so that the first application executes the first function, which improves the accuracy with which the electronic device executes the operation the user wants to trigger and the efficiency of user operation.

In one possible implementation, the method further includes: after the electronic device receives a seventh input in a second designated area of the third page, the electronic device determines a first gesture type corresponding to the seventh input. The electronic device acquires the second information based on the first gesture type. The electronic device causes the first application to execute the second function based on the second information. In this way, the electronic device can execute the second function based on an input in a designated area of the page, which improves the accuracy with which the electronic device executes the operation the user wants to trigger and the efficiency of user operation.
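A minimal sketch of this designated-area variant follows, assuming the areas are axis-aligned rectangles and that the gesture's starting point decides the routing; the patent does not specify area shapes, coordinates, or a default.

```kotlin
// Hypothetical hit test: the same gesture type is routed by where it starts.
data class Area(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}

fun routeByArea(startX: Float, startY: Float, firstArea: Area, secondArea: Area): String =
    when {
        firstArea.contains(startX, startY) -> "first function (set by the application)"
        secondArea.contains(startX, startY) -> "second function (system navigation)"
        else -> "second function (system navigation)" // default is an assumption
    }
```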

In one possible implementation, the method further includes: the electronic device acquires a fourth identifier in response to receiving an eighth input, the fourth identifier being an identifier corresponding to a fourth page in the first application. The electronic device displays the fourth page based on the eighth input. After the electronic device receives a ninth input directed at the fourth page, the electronic device determines a first gesture type corresponding to the ninth input, where the sliding speed of the ninth input is within a first speed threshold. The electronic device determines first information from the first gesture navigation information based on the first gesture type and the fourth identifier. The electronic device causes the first application to execute a first function based on the first information. In this way, the electronic device can query the first gesture navigation information based on the speed of an input so that the first application executes the first function, which improves the accuracy with which the electronic device executes the operation the user wants to trigger and the efficiency of user operation.

In one possible implementation, the method further includes: after the electronic device receives a tenth input directed at the fourth page, the electronic device determines a first gesture type corresponding to the tenth input, where the sliding speed of the tenth input is within a second speed threshold. The electronic device acquires the second information based on the first gesture type. The electronic device causes the first application to execute the second function based on the second information. In this way, the electronic device can execute the second function based on the speed of an input, which improves the accuracy with which the electronic device executes the operation the user wants to trigger and the efficiency of user operation.
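One way to realize this speed-threshold routing is sketched below under stated assumptions: the sliding speed is distance over time, a swipe within the first threshold goes to the application function, and one within the second threshold goes to system navigation. The concrete threshold values and the direction of the comparison are invented for illustration; the patent names the thresholds but gives no values.

```kotlin
// Hypothetical thresholds in pixels per millisecond; values are assumptions.
const val FIRST_SPEED_THRESHOLD = 0.5f
const val SECOND_SPEED_THRESHOLD = 2.0f

fun routeBySpeed(distancePx: Float, durationMs: Long): String {
    val speed = distancePx / durationMs.coerceAtLeast(1L)
    return when {
        speed <= FIRST_SPEED_THRESHOLD -> "first function (set by the application)" // ninth input
        speed <= SECOND_SPEED_THRESHOLD -> "second function (system navigation)"    // tenth input
        else -> "second function (system navigation)"
    }
}
```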

In a possible implementation manner, the first function specifically includes: opening a side menu bar of the page or displaying a floating window.

In a possible implementation manner, the second function specifically includes: returning to the previous page, returning to the home screen interface, or entering the recent tasks page.

In a possible implementation, the first gesture type specifically includes: a gesture type of sliding right from the left edge of a display screen of the electronic device, a gesture type of sliding left from the right edge of the display screen, a gesture type of sliding up from the lower edge of the display screen, or a gesture type of sliding up and pausing at the lower edge of the display screen.

In a possible implementation manner, before the electronic device stores the first gesture navigation information of the first application, the method further includes: the electronic device receives an eleventh input for installing the first application and sends an installation request to the cloud server. The electronic device receives the installation package of the first application and the first gesture navigation information from the cloud server. The electronic device installs the first application based on the installation package of the first application and stores the first gesture navigation information. The first gesture navigation information includes the first identifier, the second identifier, the first gesture type corresponding to the first page, the first information corresponding to the first page, and the second information corresponding to the second page.
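A sketch of the install-time handling, under assumptions: the gesture navigation information arrives as JSON next to the installation package, and the device persists it keyed by package name so later dispatch can query it. The file names and layout are illustrative, not taken from the patent.

```kotlin
import java.io.File

// Hypothetical persistence of the first gesture navigation information,
// keyed by the application's package name.
fun onFirstApplicationReceived(
    packageName: String,
    apkBytes: ByteArray,
    gestureNavJson: String,
    dataDir: File
) {
    dataDir.mkdirs()
    // 1. Keep the installation package (actual installation is out of scope here).
    File(dataDir, "$packageName.apk").writeBytes(apkBytes)
    // 2. Store the gesture navigation information for later lookup by page
    //    identifier and gesture type.
    File(dataDir, "$packageName-gesture-nav.json").writeText(gestureNavJson)
}
```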

In a second aspect, an embodiment of the present application provides a communication system, including an electronic device and a cloud server, where the cloud server stores first gesture navigation information. When the cloud server establishes a communication connection with the electronic device, the electronic device is caused to perform the method in any one of the possible implementation manners of the first aspect. In this way, the accuracy with which the electronic device executes the operation the user wants to trigger can be improved, and the efficiency of user operation is improved.

In a third aspect, an embodiment of the present application provides an electronic device, including a communication apparatus, a memory, a processor coupled to the memory, a plurality of application programs, and one or more programs. The processor, when executing the one or more programs, causes the electronic device to perform the method in any one of the possible implementation manners of the first aspect. In this way, the accuracy with which the electronic device executes the operation the user wants to trigger can be improved, and the efficiency of user operation is improved.

In a fourth aspect, the present application provides a computer storage medium storing a computer program, the computer program including executable instructions that, when executed by a processor, cause the processor to perform the method in any one of the possible implementation manners of the first aspect. In this way, the accuracy with which the electronic device executes the operation the user wants to trigger can be improved, and the efficiency of user operation is improved.

In a fifth aspect, the present application provides a computer program product which, when run on an electronic device, causes the electronic device to execute the method in any one of the possible implementation manners of the first aspect. In this way, the accuracy with which the electronic device executes the operation the user wants to trigger can be improved, and the efficiency of user operation is improved.

Drawings

Fig. 1 is a schematic architecture diagram of a communication system according to an embodiment of the present application;

fig. 2A is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present disclosure;

fig. 2B is a schematic diagram of a hardware structure of a cloud server according to an embodiment of the present disclosure;

fig. 3 is a schematic flowchart illustrating an interaction control method according to an embodiment of the present application;

4A-4C are a set of schematic user interfaces provided by embodiments of the present application;

fig. 5 is a schematic flowchart of another interaction control method according to an embodiment of the present application;

6A-6B are a set of schematic user interfaces provided by embodiments of the present application;

fig. 7 is a schematic view of a specific process for acquiring first gesture navigation information according to an embodiment of the present disclosure;

8A-8C are a set of schematic user interfaces provided by embodiments of the present application;

fig. 9 is a schematic diagram of a software structure provided in an embodiment of the present application;

fig. 10 is a schematic view of a first gesture guidance information storage according to an embodiment of the present application.

Detailed Description

The terminology used in the following embodiments of the present application is for the purpose of describing particular embodiments only and is not intended to limit the present application. As used in the specification and the appended claims of this application, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein covers any and all possible combinations of one or more of the listed items. In the embodiments of the present application, the terms "first" and "second" are used for descriptive purposes only and should not be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of this application, unless stated otherwise, "plurality" means two or more.

First, a communication system 10 provided in an embodiment of the present application is described.

Referring to fig. 1, fig. 1 schematically illustrates an architecture of a communication system 10 according to an embodiment of the present disclosure.

As shown in fig. 1, the communication system 10 may include an electronic device 100 and a cloud server 200.

The electronic device 100 may be an electronic device such as a mobile phone, a tablet computer, a PC, an ultra-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), a smart watch, a smart bracelet, a wearable device, an Augmented Reality (AR)/Virtual Reality (VR) device, and the like, which is not limited in the present application.

The cloud server 200 may store installation packages of one or more applications and gesture navigation information of each application. The one or more applications include a first application, and the gesture navigation information includes first gesture navigation information of the first application. The first gesture navigation information may include identifiers of one or more pages in the first application, the gesture types that can trigger, on each such page, a corresponding function set by the first application for that page (e.g., a gesture type of sliding right from the left edge of the display screen, a gesture type of sliding left from the right edge of the display screen, etc.), and the designated information corresponding to each such gesture type on the designated page. The cloud server 200 may establish communication connections with a plurality of electronic devices 100 and process the tasks those devices request. The cloud server 200 may distinguish electronic devices 100 by a device identifier such as the account registered by the user on the electronic device 100 (e.g., an Honor account) or the International Mobile Equipment Identity (IMEI) of the electronic device 100.

The electronic device 100 may establish a communication connection with the cloud server 200 through a communication technology such as a 2G network, a 3G network, a 4G network, a 5G network, or a wireless local area network (WLAN). Based on this communication connection, the electronic device 100 may exchange data with the cloud server 200. For example, the electronic device 100 may send a data request to the cloud server 200 (e.g., requesting the cloud server 200 to issue the installation package of the first application and the first gesture navigation information of the first application). The cloud server 200 may receive the data request sent by the electronic device 100 (e.g., a data request including the first application identifier and the device identifier of the electronic device 100) based on the communication connection, and may send the installation package of the first application, the first gesture navigation information of the first application, and the like to the electronic device 100 based on the received data request.
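The request and response described here could be carried by records like the following; the field names are assumptions for illustration only, not part of the patent.

```kotlin
// Hypothetical shapes of the device-to-cloud exchange described above.
data class InstallRequest(
    val appId: String,    // identifier of the first application
    val deviceId: String  // e.g., registered account identifier or IMEI
)

data class InstallResponse(
    val installPackage: ByteArray, // installation package of the first application
    val gestureNavInfo: String     // first gesture navigation information
)
```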

It should be noted that the architecture of the communication system 10 shown in the embodiment of the present application is only used for exemplary explanation of the present application, and should not be construed as limiting in any way.

The following describes a flow of a gesture input method provided by the present application.

The electronic device 100 may trigger a system navigation function in response to a gesture input. For example, the electronic device 100 may return to the previous page in response to a gesture sliding right from the left edge of the display screen or sliding left from the right edge of the display screen, return to the home screen interface in response to a gesture sliding up from the lower edge of the display screen, and enter the recent tasks page in response to a gesture sliding up and pausing at the lower edge of the display screen. However, some applications map the same gestures to different functions; for example, a gesture sliding right from the left edge of the display screen may open a side menu bar, and a gesture sliding left from the right edge may open a floating window. When the electronic device 100 receives such a gesture input, it preferentially executes the system navigation function rather than the function the application associates with the gesture. For example, on receiving a gesture input sliding right from the left edge of the display screen, the electronic device 100 returns to the previous page instead of displaying the application's side menu bar. The electronic device 100 therefore cannot reliably execute the function the user wants to trigger, which is very inconvenient for the user.

Therefore, an embodiment of the present application provides an interaction control method. When the cloud server 200 receives and responds to a data request sent by the electronic device 100 and issues the installation package of the first application, it also issues the first gesture navigation information of the first application. When the electronic device 100 runs the first application and displays the first page, the electronic device 100 may receive an input of a first gesture type directed at the first page. When the electronic device 100 determines first information through the first gesture navigation information and the first gesture type based on the first page, the electronic device 100 may cause the first application to execute the first function based on the first information, where the first function is the function that the first application associates with the first gesture type on the first page. In this case, the electronic device 100 does not execute the system navigation function in response to the input of the first gesture type. This improves the accuracy with which the electronic device 100 executes the operation the user wants to trigger and the efficiency of user operation.

An electronic device 100 provided in an embodiment of the present application is described below.

Referring to fig. 2A, fig. 2A schematically illustrates a hardware structure of an electronic device 100 according to an embodiment of the present disclosure.

As shown in fig. 2A, the electronic device 100 may include a processor 101, a memory 102, a wireless communication module 103, a display 104, a sensor module 105, an audio module 106, a speaker 107, a mobile communication module 108, and the like. The modules may be connected by a bus or in other manners, and the embodiment of the present application takes the bus connection as an example.

It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may also include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.

The processor 101 may include one or more processor units, for example, the processor 101 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.

The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.

A memory may also be provided in the processor 101 for storing instructions and data. In some embodiments, the memory in the processor 101 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 101. If the processor 101 needs to use the instruction or data again, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 101, thereby increasing the efficiency of the system.

In some embodiments, processor 101 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a USB interface, etc.

The memory 102 is coupled to the processor 101 and is used for storing various software programs and/or sets of instructions. In a specific implementation, the memory 102 may include a volatile memory, such as a random access memory (RAM); it may also include a non-volatile memory, such as a ROM, a flash memory, a hard disk drive (HDD), or a solid state drive (SSD); the memory 102 may also comprise a combination of the above kinds of memories. The memory 102 may also store program code, so that the processor 101 may call the program code stored in the memory 102 to implement the method implemented in the electronic device 100 according to the embodiments of the present application. The memory 102 may store an operating system, such as an embedded operating system like uCOS, VxWorks, or RTLinux.

The wireless communication module 103 may provide solutions for wireless communication applied to the electronic device 100, including wireless local area network (WLAN) (e.g., wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 103 may be one or more devices integrating at least one communication processing module. The wireless communication module 103 receives electromagnetic waves via an antenna, performs frequency modulation and filtering on the electromagnetic wave signals, and transmits the processed signals to the processor 101. The wireless communication module 103 may also receive a signal to be transmitted from the processor 101, perform frequency modulation and amplification on it, and convert it into electromagnetic waves for radiation through the antenna. In some embodiments, the electronic device 100 may also transmit signals through a Bluetooth module (not shown in fig. 2A) or a WLAN module (not shown in fig. 2A) in the wireless communication module 103 to detect or scan devices near the electronic device 100, establish wireless communication connections with nearby devices, and transmit data. The Bluetooth module may provide a Bluetooth communication solution including one or more of classic Bluetooth (BR/EDR) or Bluetooth Low Energy (BLE), and the WLAN module may provide a WLAN communication solution including one or more of Wi-Fi direct, Wi-Fi LAN, or Wi-Fi softAP.

The display screen 104 may be used to display images, video, and the like. The display screen 104 may include a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and the like. In some embodiments, the electronic device 100 may include 1 or N display screens 104, where N is a positive integer greater than 1.

The sensor module 105 may include a touch sensor 105A and the like. The touch sensor 105A may also be referred to as a "touch device". The touch sensor 105A may be disposed on the display screen 104, and the touch sensor 105A and the display screen 104 form a touch screen, also called a "touch screen". The touch sensor 105A can be used to detect a touch operation acting on or near it. Optionally, the sensor module 105 may further include a gyroscope sensor (not shown in fig. 2A), an acceleration sensor (not shown in fig. 2A), and the like. The gyroscope sensor may be used to determine the motion posture of the electronic device 100; in some embodiments, the electronic device 100 may determine its angular velocity about three axes (i.e., the x, y, and z axes) via the gyroscope sensor. The acceleration sensor may be used to detect the magnitude of acceleration of the electronic device 100 in various directions (typically along the x, y, and z axes), and can detect the magnitude and direction of gravity when the electronic device 100 is stationary.

The audio module 106 may be used to convert digital audio information into an analog audio signal output and may also be used to convert an analog audio input into a digital audio signal. The audio module 106 may also be used to encode and decode audio signals. In some embodiments, the audio module 106 may also be disposed in the processor 101, or some functional modules of the audio module 106 may be disposed in the processor 101.

The speaker 107, which may also be referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music through the speaker 107 or to a hands free phone.

The mobile communication module 108 may provide a solution including 2G/3G/4G/5G wireless communication applied on the electronic device 100.

A cloud server 200 provided in an embodiment of the present application is described below.

Referring to fig. 2B, fig. 2B schematically illustrates a hardware structure of a cloud server 200 according to an embodiment of the present disclosure.

As shown in fig. 2B, the cloud server 200 may be applied to the communication system 10 described in fig. 1. The cloud server 200 may include one or more processors 201A, a communication interface 202A, and a memory 203A, where the processors 201A, the communication interface 202A, and the memory 203A may be connected by a bus or in other manners, and in this embodiment, the connection by the bus 204A is taken as an example.

The processor 201A may consist of one or more general-purpose processors, such as CPUs. The processor 201A may be used to run the program code related to the interaction control method.

The communication interface 202A may be a wired interface (e.g., an ethernet interface) or a wireless interface (e.g., a cellular network interface) for communicating with other nodes. In this embodiment, the communication interface 202A may be specifically configured to communicate with the electronic device 100, for example, receive a data request sent by the electronic device 100 (for example, request the cloud server 200 to issue an installation package of the first application and issue first gesture navigation information of the first application, and the like), send data information to the electronic device 100 (for example, send the installation package of the first application and the first gesture navigation information of the first application to the electronic device 100, and the like), and the like.

The memory 203A may include a volatile memory, such as a random access memory (RAM); it may also include a non-volatile memory, such as a ROM, a flash memory, a hard disk drive (HDD), or a solid state drive (SSD); the memory 203A may also comprise a combination of the above kinds of memories. In the embodiment of the present application, the memory 203A may store installation packages of one or more applications and gesture navigation information of each application. The one or more applications include a first application, and the gesture navigation information of each application includes first gesture navigation information of the first application. The installation package of the first application and the first gesture navigation information of the first application may be issued to the electronic device 100 through the communication interface 202A. The memory 203A may also store program code, so that the processor 201A calls the program code stored in the memory 203A to implement the method implemented in the cloud server 200 according to the embodiments of the present application.

It should be noted that the cloud server 200 shown in fig. 2B is only one implementation manner of the embodiment of the present application, and in practical applications, the cloud server 200 may further include more or less components, which is not limited herein.

An interactive control method provided in the embodiments of the present application is described below.

Referring to fig. 3, fig. 3 illustrates a specific flow of an interaction control method, where the method includes:

S301, the cloud server 200 stores installation packages of one or more applications, where the installation packages include an installation package of a first application. The first application may include one or more pages. When the first application includes a plurality of pages, the plurality of pages may include a first page and a second page.

S302, the cloud server 200 acquires and stores first gesture navigation information of the first application. The first gesture navigation information may include the identifier of each page, the gesture types that can trigger, on one or more of the pages, a designated function set by the first application for the respective page, and the designated information corresponding to each such gesture type on the designated page.

The gesture types may include one or more of a gesture type of sliding right from the left edge of the display screen, a gesture type of sliding left from the right edge of the display screen, a gesture type of sliding up from the lower edge of the display screen, a gesture type of sliding up and pausing at the lower edge of the display screen, and the like. The designated function may include opening a side menu bar of a designated page (e.g., the first page or the second page) or displaying a floating window.

Illustratively, taking the first application as a music application as an example, the first gesture navigation information may be as shown in table 1:

TABLE 1

  Page identifier | Gesture type                                          | Corresponding information
  Activity1       | first gesture type (sliding right from the left edge) | first information (gesture input information)
  Activity2       | none                                                  | second information (system navigation information)
  Activity3       | second gesture type (sliding left from the right edge)| first information (gesture input information)

As can be seen from Table 1, the first gesture navigation information may be stored in a "key-value pair" format. "Activity1" identifies the first page. The first page corresponds to a first gesture type (e.g., a gesture sliding right from the left edge of the display screen), and the information corresponding to the first gesture type on the first page is the first information (e.g., gesture input information). The first application sets the first gesture type to correspond to function 1 (e.g., opening a side menu bar of the page) on the first page. That is, when the electronic device 100 receives and responds to an input of the first gesture type on the first page (e.g., an input sliding right from the left edge of the display screen, which may also be referred to as a left-side-swipe input), the electronic device 100 may acquire the first information (e.g., gesture input information) and cause the first application to execute function 1 (e.g., open the side menu bar of the page).

"Activity2" identifies the second page. The second page has no corresponding gesture type; that is, the first application sets no gesture type, and no triggered function, for the second page. The information corresponding to the second page is therefore the second information (e.g., system navigation information). That is, when the electronic device 100 receives and responds to an input of the first gesture type on the second page (e.g., a left-side-swipe input), the electronic device 100 may acquire the second information and execute the system navigation function corresponding to that gesture type (which may also be referred to as the second function, e.g., returning to the previous page, returning to the home screen interface, or entering the recent tasks page).

"Activity3" identifies the third page. The third page corresponds to a second gesture type (e.g., a gesture type sliding left from the right edge of the display screen), and the information corresponding to the second gesture type on the third page is the first information (e.g., gesture input information). The first application sets the second gesture type to correspond to function 2 (e.g., displaying a floating window on the third page). That is, when the electronic device 100 receives and responds to an input of the second gesture type on the third page (e.g., an input sliding left from the right edge of the display screen, which may also be referred to as a right-side-swipe input), the electronic device 100 may acquire the first information and cause the first application to execute function 2 (e.g., display the floating window on the third page).
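Written out as data, the key-value storage of Table 1 might look like the following; the identifiers and strings mirror the music-application example and are illustrative only.

```kotlin
// Hypothetical in-memory form of Table 1. A page maps to its gesture type and
// information; a page with no entry (Activity2) falls back to system navigation.
data class NavEntry(val gestureType: String, val information: String)

val firstGestureNavInfo: Map<String, NavEntry> = mapOf(
    "Activity1" to NavEntry("slide right from left edge", "first information (gesture input)"),
    "Activity3" to NavEntry("slide left from right edge", "first information (gesture input)")
    // "Activity2" has no entry: second information (system navigation) applies.
)
```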

It should be noted that when the electronic device 100 receives a gesture input (e.g., a right-side-swipe input) on a designated page (e.g., the first page), and, based on the identifier of that page, the gesture input information, and the first gesture navigation information, the electronic device 100 finds that the page does not map the gesture type of that input (e.g., the gesture type of sliding left from the right edge of the display screen), the electronic device 100 may acquire the second information (e.g., system navigation information). Based on the second information, the electronic device 100 may execute the system navigation function corresponding to that gesture type on the designated page (e.g., return to the previous page, return to the home screen interface, or enter the recent tasks page). This case is not fully shown in Table 1.

It should be noted that table 1 is only an exemplary illustration of the present application, and in a specific implementation, table 1 may further include gesture input information of more or fewer pages, and may also include different gesture input information, which is not limited in the present application.

Specifically, regarding the process of acquiring the first gesture navigation information of the first application by the cloud server 200, the following embodiments will be described in detail, and are not described herein again.

S303, the electronic device 100 receives an input 1 (which may also be referred to as an eleventh input) to install the first application.

S304, the electronic device 100 sends an installation request to the cloud server 200.

Specifically, the electronic device 100 may receive input 1 (e.g., a click) on a control associated with installing the first application. In response to input 1, the electronic device 100 may send an installation request including the first application identifier and the device identifier of the electronic device 100 to the cloud server 200.

S305, the cloud server 200 sends the installation package of the first application and the first gesture navigation information to the electronic device 100.

Specifically, after the cloud server 200 receives an installation request including the first application identifier and the device identifier of the electronic device 100, the cloud server 200 may obtain, based on the installation request, an installation package of the first application stored on the cloud server 200 and first gesture navigation information corresponding to the first application. The cloud server 200 may transmit the installation package of the first application and the first gesture navigation information to the electronic device 100.
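On the server side, S305 reduces to a lookup over the stored installation packages and gesture navigation information. The sketch below assumes both are kept in in-memory maps, which is an illustration rather than the patent's storage design.

```kotlin
// Hypothetical server-side handling of an installation request (S304/S305).
class CloudStore(
    private val packages: Map<String, ByteArray>, // appId -> installation package
    private val gestureNav: Map<String, String>   // appId -> gesture navigation info (JSON)
) {
    fun handleInstallRequest(appId: String): Pair<ByteArray, String>? {
        val apk = packages[appId] ?: return null  // unknown application
        val nav = gestureNav[appId] ?: "{}"       // application may define no gestures
        return apk to nav
    }
}
```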

In one possible implementation, the electronic device 100 may pre-store the first gesture navigation information of the first application. The cloud server 200 only needs to send the installation package of the first application to the electronic device 100, and does not need to send the first gesture navigation information of the first application to the electronic device 100.

S306, the electronic device 100 installs the first application based on the installation package of the first application.

In one possible implementation, when the electronic device 100 installs the first application based on the installation package of the first application, the electronic device 100 may further store the first gesture navigation information of the first application in a database of the electronic device 100. The database may store gesture navigation information corresponding to one or more applications.

S307, the electronic device 100 may receive an input 2 (which may also be referred to as a first input) for the first application, and in response to input 2, the electronic device 100 acquires the identifier (which may also be referred to as a first identifier) of the first page.

Specifically, the electronic device 100 may receive input 2 (e.g., a click) acting on the first application icon. In response to input 2, the electronic device 100 acquires the identifier of the first page.

Illustratively, taking the first application as a music application as an example, as shown in fig. 4A, the electronic device 100 may display the interface 400 of the home screen. One or more application icons may be displayed in the interface 400. Among the one or more application icons may be a weather application icon, a stock application icon, a calculator application icon, a settings application icon, a mail application icon, a theme application icon, a music application icon 401, a video application icon, and so on.

Optionally, a status bar, a page indicator, and a tray icon area may also be displayed in the interface 400. The status bar may include one or more signal strength indicators for mobile communication signals (also called cellular signals), a signal strength indicator for wireless fidelity (Wi-Fi) signals, a battery status indicator, a time indicator, and the like. The page indicator may be used to indicate the positional relationship of the currently displayed page to other pages. The tray icon area includes a plurality of tray icons (e.g., a dialing application icon, a messaging application icon, a contacts application icon, and a camera application icon) that remain displayed during page switching. The page may also include a plurality of application icons and a page indicator; the page indicator may not be part of the page and may exist alone; the tray icons are also optional; the embodiments of the present application do not limit this.

The electronic device 100 may receive a touch operation (which may also be referred to as input 2, e.g., a click) applied to the music icon 401 by the user, and in response to the touch operation, the electronic device 100 may acquire an identifier of a first page issued by the music application.

S308, the electronic device 100 displays the first page.

For example, taking the first application as a music application as an example, when the electronic device 100 receives a touch operation of the user on the music icon 401 shown in fig. 4A, and in response to the touch operation acquires the identifier Activity1 of the first page issued by the music application, the electronic device 100 may display the user interface 410 (which may also be referred to as the first page) shown in fig. 4B.

As shown in fig. 4B, the user interface 410 may include a search box, one or more page options (e.g., a "singer" page option, a "ranking" page option, a "song list" page option, and a "station" page option), one or more song selection entries (e.g., a "song 1" selection entry 411, a "song 2" selection entry, and a "song 3" selection entry), and a music play bar displayed below the song selection entries. The search box may receive a touch operation of the user; in response to the operation, the electronic device 100 may display a text input box and receive the user's input in the text input box, so that the user can enter the music track to be searched for. The music play bar may receive a touch operation (e.g., a click) performed by the user on a control in the music play bar; in response to the touch operation, the electronic device 100 may play a music track.

S309, when the electronic device 100 receives a second input (e.g., a left-side-swipe input) directed at the first page, the electronic device 100 determines that the gesture type corresponding to the second input is the first gesture type, and the electronic device 100 may acquire the corresponding first information based on the identifier of the first page, the first gesture type, and the first gesture navigation information.

For example, take the first application as a music application, with the first gesture navigation information as shown in Table 1. The first application sets the first gesture type to correspond to function 1 on the first page (e.g., opening a side menu bar of the page). When the electronic device 100 receives a second input directed at the first page (e.g., a left-side-swipe input), the electronic device 100 may determine that the gesture type corresponding to the second input is the first gesture type (e.g., a gesture sliding right from the left edge of the display screen). The electronic device 100 may acquire the first information (e.g., gesture input information) based on the identifier of the first page, the first gesture type, and the first gesture navigation information. The first information may cause the first application to execute function 1 (which may also be referred to as a first function, e.g., opening a side menu bar of the page).

S310, the electronic device 100 may execute function 1 (which may also be referred to as a first function, e.g., opening a side menu bar of the page) through the first application based on the first information (e.g., gesture input information).

Illustratively, as shown in fig. 4B, taking the first application as a music application as an example, the electronic device 100 displays the user interface 410 (which may also be referred to as the first page). The electronic device 100 may receive a gesture input (which may also be referred to as a second input) sliding right from the left edge of the display screen on the user interface 410. In the system navigation function, the gesture input of sliding right from the left edge of the display screen is mapped to returning to the previous page; on the first page, however, the same gesture input is mapped to opening a side menu bar of the page (which may also be referred to as function 1). When the electronic device 100 responds to this gesture input on the user interface 410, according to step S309 above, the electronic device 100 may cause the music application to execute the function of opening the side menu bar of the first page (which may also be referred to as the first function) based on the gesture input information (which may also be referred to as the first information). The electronic device 100 may display the user interface 420 shown in fig. 4C, revealing the side menu bar page.

As shown in fig. 4C, the user interface 420 may include a side menu bar 421. The side menu bar 421 may include user account information of the first application and one or more option entries (e.g., a "my information" option entry, a "creator center" option entry, a "listen and save" option entry, a "mall" option entry, a "help and customer service" option entry, and a "settings" option entry). The electronic device 100 may receive a touch operation (e.g., a click) by the user on the one or more option entries, and in response to the touch operation, the electronic device 100 may display the page of the corresponding option entry.

Another interactive control method provided in the embodiments of the present application is described below.

Referring to fig. 5, fig. 5 illustrates a specific flow of another interactive control method, where the method includes:

S501, the cloud server 200 stores installation packages of one or more applications, where the installation packages include an installation package of a first application. The first application includes a plurality of pages, and the plurality of pages include a first page and a second page.

S502, the cloud server 200 acquires and stores first gesture navigation information of the first application. The first gesture navigation information may include the identifier of each page, the gesture types that can trigger, on one or more of the pages, a designated function set by the first application for the respective page, and the designated information corresponding to each such gesture type on the designated page.

Specifically, the description of this step may refer to the description in step S302 in the foregoing embodiment of fig. 3, and is not repeated here.

S503, the electronic device 100 receives an input 1 (which may also be referred to as an eleventh input) for installing the first application.

S504, the electronic device 100 sends an installation request to the cloud server 200.

Specifically, the description of this step may refer to the description in step S304 in the foregoing embodiment of fig. 3, and is not repeated here.

S505, the cloud server 200 sends the installation package of the first application and the first gesture navigation information to the electronic device 100.

Specifically, the description of this step may refer to the description in step S305 in the foregoing embodiment in fig. 3, and is not repeated here.

S506, the electronic device 100 installs the first application based on the installation package of the first application.

Specifically, the description of this step may refer to the description in step S306 in the embodiment of fig. 3, and is not repeated here.

S507, the electronic device 100 receives an input 5 (which may also be referred to as a third input) for the first application, and in response to the input 5, the electronic device 100 acquires an identifier (which may also be referred to as a second identifier) of the second page.

Specifically, the electronic device 100 may receive input 5 (e.g., a click) acting on a control in the first application for launching the second page. In response to the input 5, the electronic device 100 acquires the identifier of the second page.

Illustratively, taking the first application as a music application as an example, the electronic device 100 may display the user interface 410 shown in fig. 4B. For the process of displaying the user interface 410 by the electronic device 100, reference may be made to the foregoing step S307-step S308 in the embodiment of fig. 3 and the embodiments shown in fig. 4A-4B, which is not repeated herein. As shown in fig. 6A, the electronic device 100 may receive a touch operation (which may also be referred to as input 5) by the user acting on the "song 1" selection entry 601 in the user interface 410, and in response to the touch operation, the electronic device 100 may acquire the identifier of the second page.

S508, the electronic device 100 displays the second page.

For example, taking the first application as a music application as an example, when the electronic device 100 receives a touch operation of the user on the aforementioned "song 1" selection entry 601 shown in fig. 6A, in response to the touch operation the electronic device 100 acquires the identifier Activity2 of the second page issued by the music application, and the electronic device 100 may display a user interface 610 (which may also be referred to as a second page) as shown in fig. 6B.

As shown in fig. 6B, the user interface 610 (which may also be referred to as a second page) may include prompt information of a music track and one or more operation controls for the music track. The prompt information may be text information, such as the music track name "Dream It Possible" and the music track singer name "Delacey". The prompt information is not limited to text information, and may also be voice or another type of prompt information output by the electronic device 100, which is not limited in this application. The one or more operation controls for the music track may include one or more of a control for playing the current music track, a control for playing the previous music track, a control for playing the next music track, and the like. The electronic device 100 may receive and respond to a touch operation (e.g., a click) acting on the one or more operation controls for the music track, and execute the corresponding operation (e.g., playing the current music track, playing the previous music track, or playing the next music track, etc.).

S509, when the electronic device 100 receives a fourth input (e.g., a left-side slide input) on the second page, the electronic device 100 determines that the gesture type corresponding to the fourth input is the first gesture type. The electronic device 100 acquires the second information based on the identifier of the second page, the first gesture type, and the first gesture navigation information.

Illustratively, the first application is a music application, and the first gesture navigation information is the information shown in table 1. According to the first gesture navigation information, the first application has not set any gesture type or corresponding trigger function based on the second page. When the electronic device 100 receives a fourth input (e.g., a left-side slide input) directed to the second page, the electronic device 100 determines that the gesture type corresponding to the fourth input is the first gesture type (e.g., a gesture of sliding right from the left edge of the display screen). The electronic device 100 may acquire the second information (e.g., system navigation information) based on the identifier of the second page, the first gesture type, and the first gesture navigation information. The second information may cause the first application to execute the system navigation function corresponding to the above gesture type (which may also be referred to as a second function, e.g., returning to the upper-level page, returning to the home screen interface, or entering a recent tasks page, etc.).

S510, the electronic device 100 may execute, through the first application based on the second information (e.g., the system navigation information), the system navigation function corresponding to the first gesture type (which may also be referred to as a second function, e.g., returning to the upper-level page, returning to the home screen interface, or entering a recent tasks page, etc.).

Illustratively, as shown in fig. 6B, taking the first application as a music application as an example, the electronic device 100 displays the user interface 610 (which may also be referred to as a second page), and the electronic device 100 may receive a gesture input (which may also be referred to as a fourth input) on the user interface 610 of sliding right from the left edge of the display screen. In the system navigation function, the function corresponding to the gesture of sliding right from the left edge of the display screen is set to return to the upper-level page. When the electronic device 100 responds to the fourth input on the user interface 610, the electronic device 100 may execute the function of returning to the upper-level page (which may also be referred to as a second function) through the music application based on the second information (e.g., the system navigation information) determined in the foregoing step S509, and display the user interface 410 shown in fig. 4B.

Next, a method for acquiring the first gesture navigation information by the cloud server 200 according to the embodiment of the present application is described.

Referring to fig. 7, fig. 7 illustrates a specific flow of acquiring the first gesture navigation information by the cloud server 200.

S701, the test device 300 acquires the installation package of the first application from the cloud server 200. The first application comprises a plurality of pages, and the plurality of pages may include a first page.

S702, the test device 300 may analyze and record control data information of each page in the first application.

Specifically, for the hardware structure of the test device 300, reference may be made to the foregoing description of the structure of the electronic device 100 shown in fig. 1, which is not repeated herein. The test device 300 may analyze the control data information of each page in the first application through a preset algorithm mechanism (e.g., Java reflection technology), and may record the acquired control data information.

S703, the test device 300 queries the specified data information based on each page in the first application.

Specifically, the test device 300 may query, through a preset algorithm mechanism (e.g., Java reflection technology), whether the data information of each page in the first application includes the specified data information (e.g., slide-listener data information).

S704, the test device 300 may query the data information of the designated control (e.g., a drawer control) of the first page through a designated application (e.g., Systrace).

Specifically, the test device 300 may query, through the designated application (e.g., Systrace) and based on a preset algorithm mechanism (e.g., Java reflection technology), whether the data information of the first page includes a keyword of the designated control (e.g., "openDrawer(int)" for a drawer control). If so, the test device 300 may determine that the designated control (e.g., a drawer control) is included in the first page.
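As a hedged illustration of such a reflection-based query (the keyword mirrors a typical Android drawer control such as DrawerLayout, which exposes an openDrawer(int) method; the probe itself is an assumption, not this application's implementation):

```java
import java.lang.reflect.Method;

// Hypothetical reflection-based check: does a control class expose a drawer-style API?
public final class DrawerProbe {
    /** Returns true if the class (or a superclass) declares an openDrawer(int) method. */
    public static boolean looksLikeDrawer(Class<?> controlClass) {
        for (Class<?> c = controlClass; c != null; c = c.getSuperclass()) {
            for (Method m : c.getDeclaredMethods()) {
                if (m.getName().equals("openDrawer")
                        && m.getParameterCount() == 1
                        && m.getParameterTypes()[0] == int.class) {
                    return true;
                }
            }
        }
        return false;
    }
}
```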

S705, the test device 300 starts the first page in the first application, and causes the first application to respond, based on a preset instruction (e.g., an input instruction), to an input of the first gesture type on the first page.

Specifically, the test device 300 may start the first page in the first application through a preset mechanism (e.g., ActivityManager). Then, the test device 300 may cause the first application to respond, based on a preset instruction (e.g., an input instruction), to an input of the first gesture type (e.g., a gesture input of sliding right from the left edge of the display screen) on the first page.
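Assuming the test is driven from a host over adb (this application only names ActivityManager and an input instruction), the launch-and-inject step could be sketched as follows; the component name and the swipe coordinates are placeholders:

```java
import java.io.IOException;

// Hypothetical host-side driver: start the first page, then inject a left-edge right swipe.
public final class GestureInjector {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Start the first page via the ActivityManager ("am") command; component name is a placeholder.
        run("adb", "shell", "am", "start", "-n", "com.example.music/.MainActivity");
        Thread.sleep(2000); // crude wait for the page to come up
        // Inject the first gesture type: swipe right from the left edge (x1 y1 x2 y2 durationMs).
        run("adb", "shell", "input", "swipe", "0", "800", "500", "800", "300");
    }

    private static void run(String... cmd) throws IOException, InterruptedException {
        new ProcessBuilder(cmd).inheritIO().start().waitFor();
    }
}
```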

S706, the test device 300 analyzes, based on the specified data information (e.g., the slide-listener data information) and the data information of the designated control (e.g., a drawer control), whether the first application has set a function corresponding to the first gesture type based on the first page.

In one possible implementation, the test device 300 may compute a layout change of the designated control (e.g., a drawer control) through a designated detector (e.g., a layout analyzer), and thereby determine whether the first application has set a function corresponding to the first gesture type based on the first page.

For example, as shown in fig. 8A, before the test device 300 causes the first application to respond to the input of the first gesture type on the first page based on the preset instruction (e.g., the input instruction), the test device 300 may display a user interface 900. The user interface 900 may include a page 901 (which may also be referred to as a first page) and a property region 902. The property region 902 may be a region for displaying data information of the control. At this time, the designated control 901B (e.g., a drawer control) is not displayed in the page 901. The data information of the designated control may be as shown in the property region 902. The identifier "x" in the property region 902 may indicate that the change value of the designated control (e.g., drawer control) on the x-axis is 0 dp; the identifier "y" may indicate that its change value on the y-axis is 0 dp; the identifier "width" may indicate that its width value is 280 dp; and the identifier "height" may indicate that its height value is 780 dp.

As shown in fig. 8B, when the test device 300 causes the first application to respond to the input of the first gesture type (e.g., a gesture input of sliding right from the left edge of the display screen) on the first page based on the preset instruction (e.g., the input instruction), the page 901 may display the designated control 901B (e.g., a drawer control). At this time, as represented in the property region 902, the change value of "x" for the designated control is -280 dp. Accordingly, the test device 300 may determine that the first application has set a function corresponding to the first gesture type based on the first page.
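A minimal sketch of this comparison, using the example values from figs. 8A-8B (0 dp before the injected swipe, -280 dp after); the noise threshold is an assumption:

```java
// Hypothetical layout-delta check: did the drawer control move when the gesture was injected?
public final class LayoutDeltaCheck {
    /** Treat the page as handling the gesture if the drawer's x offset changed meaningfully. */
    public static boolean drawerResponded(float xBeforeDp, float xAfterDp) {
        final float MIN_DELTA_DP = 1f; // assumed noise threshold
        return Math.abs(xAfterDp - xBeforeDp) > MIN_DELTA_DP;
    }

    public static void main(String[] args) {
        // Values from the example: x goes from 0 dp to -280 dp after the injected swipe.
        System.out.println(drawerResponded(0f, -280f)); // true -> first page maps the gesture
    }
}
```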

In one possible implementation, the test device 300 may detect, based on a preset mechanism (e.g., Java reflection technology), whether one or more controls in the first page are invoked in response to the input of the first gesture type, so as to analyze whether the first application has set a function corresponding to the first gesture type based on the first page. If one or more controls in the first page are invoked in response to the input of the first gesture type, the test device 300 determines that the first application has set a function corresponding to the first gesture type based on the first page. Illustratively, as shown in fig. 8C, the test device 300 may display a user interface 930. The user interface 930 may include a region 931 and a region 932. The region 931 may indicate the running time period during which a central processing unit (CPU) executes the corresponding function in response to the input of the first gesture type, and the region 932 may indicate the function executed by the test device 300 in response to the input of the first gesture type. Taking the function corresponding to the input of the first gesture type being displaying a drawer control as an example, when the test device 300 executes the above function in response to the input of the first gesture type, the user interface 930 may display the name 932A "onDrawerSlide()" of the function in the region 932 and display, in the region 931, the duration region 931A during which the CPU executes the function in response to the first gesture; the test device 300 then determines that the first application has set a function corresponding to the first gesture type based on the first page.
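Where the callback route is taken, an AndroidX drawer control exposes exactly such a hook; the following sketch uses the real DrawerLayout listener API, with logging standing in for the test device's trace capture (the attach point is an assumption):

```java
import android.util.Log;
import android.view.View;
import androidx.drawerlayout.widget.DrawerLayout;

// Minimal sketch: observe whether onDrawerSlide() fires when the gesture is injected.
public final class DrawerSlideObserver {
    public static void attach(DrawerLayout drawerLayout) {
        drawerLayout.addDrawerListener(new DrawerLayout.SimpleDrawerListener() {
            @Override
            public void onDrawerSlide(View drawerView, float slideOffset) {
                // If this logs during the injected swipe, the page maps the first gesture type.
                Log.d("GestureProbe", "onDrawerSlide offset=" + slideOffset);
            }
        });
    }
}
```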

In one possible implementation, the test device 300 may record an image of the page change through a designated application (e.g., a screen recording application), and may analyze, based on the recorded image and a preset image algorithm, whether the first application has set a function corresponding to the first gesture type based on the first page.

S707, when the test device 300 determines through analysis that the first application has set a function corresponding to the first gesture type based on the first page, the test device 300 may record the identifier of the first page, the first gesture type, and the corresponding first information as the first gesture navigation information.

Specifically, the test device 300 may record the first gesture navigation information in the format of table 1 in the step S302, and for related descriptions, reference may be made to the embodiment shown in the step S302, which is not described herein again.

S708, the test device 300 sends the first gesture navigation information to the cloud server 200.

S709, the cloud server 200 stores the first gesture navigation information.

In one possible implementation, the first application may be pre-configured with first gesture navigation information. When acquiring the installation package of the first application, the cloud server 200 may also acquire the first gesture navigation information, and store the first gesture navigation information on the cloud server 200.

Next, a software structure provided in the embodiment of the present application is described.

Referring to fig. 9, fig. 9 is a block diagram illustrating an exemplary software structure provided in an embodiment of the present application.

The software system of the electronic device 100 may employ a hierarchical architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of the electronic device 100.

The layered architecture divides the software into several layers. Each layer has a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.

The application layer may include a series of application packages.

As shown in fig. 9, the application package may include a camera, a calendar, a memo, weather, a first application, and the like.

In an embodiment of the present application, the first application may be an application configured with corresponding first gesture navigation information. It is to be understood that the types of the first application and the plurality of other applications are only examples, and the embodiments of the present application are not limited thereto.

The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.

As shown in fig. 9, the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, a gesture input determination module, and the like.

The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.

The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.

The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.

The phone manager is used to provide communication functions of the electronic device 100. Such as management of call status (including on, off, etc.).

The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.

The notification manager enables an application to display notification information in the status bar, and can be used to convey notification-type messages that automatically disappear after a short stay without requiring user interaction. For example, the notification manager is used to notify of download completion, message alerts, and the like. The notification manager may also present notifications in the top status bar of the system in the form of a chart or scroll-bar text, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window, for example, prompting text information in the status bar, sounding a prompt tone, vibrating the electronic device, or flashing an indicator light.

The gesture input receiving module may be used to receive gesture input by a user acting on a specified page on the electronic device 100.

The gesture input determination module may acquire the first information (e.g., gesture input information) or the second information (e.g., system navigation information) based on the gesture input receiving module and the first gesture navigation information.

In embodiments of the present application, the gesture input receiving module may receive a gesture input by a user acting on a specified page (e.g., the first page of the first application) on the electronic device 100. The gesture input receiving module may send the gesture input information to the gesture input determination module. After acquiring the gesture input information, the gesture input determination module may determine, from the first gesture navigation information and based on the identifier of the specified page (e.g., the first page of the first application), that the specified page is mapped to the specified gesture type (e.g., the first gesture type) corresponding to the gesture input; in this case, the gesture input determination module may acquire the first information (e.g., gesture input information) and send it to the first application. After receiving the first information, the first application may execute the corresponding function (e.g., function 1 described in the foregoing embodiments). If the gesture input determination module determines, through the first gesture navigation information and based on the identifier of the specified page (e.g., the first page of the first application), that the specified page is not mapped to the specified gesture type (e.g., the first gesture type) corresponding to the gesture input, the gesture input determination module may acquire the second information (e.g., system navigation information) and send it to the first application. After receiving the second information, the first application may execute the system navigation function corresponding to the specified gesture type (e.g., the first gesture type) (which may also be referred to as a second function, such as returning to the upper-level page, returning to the home screen interface, or entering a recent tasks page, etc.).
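The determination described above reduces to one lookup plus one branch; a minimal self-contained sketch (the class names and the nested-map shape are assumptions, not this application's implementation):

```java
import java.util.Map;

// Hypothetical core of the gesture input determination module.
public final class GestureDispatcher {
    private final Map<String, Map<String, String>> navInfo; // pageId -> gestureType -> info

    public GestureDispatcher(Map<String, Map<String, String>> navInfo) {
        this.navInfo = navInfo;
    }

    /** Returns the information to hand to the first application. */
    public String determine(String pageId, String gestureType) {
        Map<String, String> byGesture = navInfo.get(pageId);
        String mapped = (byGesture == null) ? null : byGesture.get(gestureType);
        if (mapped != null) {
            return mapped;                       // first information: gesture input information
        }
        return "system navigation information";  // second information: fall back to system navigation
    }
}
```

With the table from the earlier sketch, determine("Activity1", "LEFT_EDGE_SWIPE_RIGHT") would yield the gesture input information, while determine("Activity2", "LEFT_EDGE_SWIPE_RIGHT") would fall through to the system navigation information.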

The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.

The core library comprises two parts: one part is the functions that the Java language needs to call, and the other part is the core libraries of Android.

The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.

The system library may include a plurality of functional modules, for example: a surface manager (surface manager), media libraries (Media Libraries), a three-dimensional graphics processing library (e.g., OpenGL ES), a 2D graphics engine (e.g., SGL), and the like.

The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.

The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, and the like.

The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.

The 2D graphics engine is a drawing engine for 2D drawing.

The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, and a sensor driver.

In some embodiments, the electronic device 100 may trigger different functions for the same gesture input based on different start coordinate information and the first gesture navigation information. For example, when the electronic device 100 acquires the third identifier and displays the third page of the first application in response to the received fifth input, the electronic device 100 may receive a sixth input (e.g., a left-side slide input) in the upper third area of the screen, and the electronic device 100 may determine that the gesture type corresponding to the sixth input is the first gesture type. Then, the electronic device 100 may query from the first gesture navigation information that the third page corresponds to the first gesture type (e.g., a gesture of sliding right from the left edge of the display screen), and the electronic device 100 may acquire the first information (e.g., gesture input information) and, based on the first information, cause the first application to execute, on the third page, function 1 corresponding to the first gesture type (which may also be referred to as a first function, e.g., opening a side menu bar of the page). When the electronic device 100 receives a seventh input (e.g., a left-side slide input) below the upper third area of the screen, the electronic device 100 may determine that the gesture type corresponding to the seventh input is the first gesture type. The electronic device 100 may acquire the second information (e.g., system navigation information) based on the first gesture type, and cause the first application to execute, based on the second information, the system navigation function corresponding to the first gesture type (which may also be referred to as a second function, e.g., returning to the upper-level page, returning to the home screen interface, or entering a recent tasks page, etc.).
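A minimal sketch of the start-coordinate branch in this embodiment; the upper-third boundary follows the example above, and the names and returned strings are illustrative:

```java
// Hypothetical start-coordinate branch: same gesture, different start area, different dispatch.
public final class RegionBasedDispatch {
    /** Returns the information type chosen for a left-edge swipe that started at downY. */
    public static String determine(float downY, float screenHeightPx) {
        if (downY <= screenHeightPx / 3f) {
            return "gesture input information";   // first information -> app-defined function 1
        }
        return "system navigation information";   // second information -> system navigation
    }
}
```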

In some embodiments, the electronic device 100 may initiate different functions for the same gesture input based on different motion speeds and the first gesture navigation information. For example, when the electronic device 100 acquires a fourth identifier and displays a fourth page of the first application in response to a received eighth input, the electronic device 100 may receive a ninth input (e.g., a left-side slide input) whose motion speed is within a preset speed threshold 1 (which may also be referred to as a first speed threshold), and the electronic device 100 determines that the gesture type corresponding to the ninth input is the first gesture type. Then, the electronic device 100 may query from the first gesture navigation information that the fourth page corresponds to the first gesture type (e.g., a gesture of sliding right from the left edge of the display screen), and the electronic device 100 may acquire the first information (e.g., gesture input information) and, based on the first information, cause the first application to execute, on the fourth page, function 1 corresponding to the first gesture type (which may also be referred to as a first function, e.g., opening a side menu bar of the page). When the electronic device 100 receives a tenth input (e.g., a left-side slide input) whose motion speed is within a preset speed threshold 2 (which may also be referred to as a second speed threshold), the electronic device 100 may determine that the gesture type corresponding to the tenth input is the first gesture type. The electronic device 100 may acquire the second information (e.g., system navigation information) based on the first gesture type, and, based on the second information, execute through the first application the system navigation function corresponding to the first gesture type on the fourth page (which may also be referred to as a second function, e.g., returning to the upper-level page).
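Similarly, a hedged sketch of the speed-based branch; the two thresholds are placeholder values, since this application does not specify them:

```java
// Hypothetical speed-based branch; thresholds are example values, not from this application.
public final class SpeedBasedDispatch {
    private static final float SPEED_THRESHOLD_1_PX_PER_MS = 1.0f; // assumed "speed threshold 1"
    private static final float SPEED_THRESHOLD_2_PX_PER_MS = 3.0f; // assumed "speed threshold 2"

    /** Returns which information the gesture resolves to, based on its average speed. */
    public static String classify(float distancePx, float durationMs) {
        float speed = distancePx / durationMs;
        if (speed <= SPEED_THRESHOLD_1_PX_PER_MS) {
            return "gesture input information";   // first information -> app-defined function 1
        }
        if (speed <= SPEED_THRESHOLD_2_PX_PER_MS) {
            return "system navigation information"; // second information -> system navigation
        }
        return "other";
    }
}
```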

In some embodiments, the first gesture navigation information may include an identifier of a specified page and a designated instruction corresponding to the specified page. Illustratively, taking the first application as a music application as an example, the first gesture navigation information may be recorded in the format shown in table 2:

TABLE 2

Page identifier | Designated instruction
Activity1       | Gesture-input-information sending instruction for the gesture of sliding right from the left edge of the display screen
Activity2       | None
Activity3       | Gesture-input-information sending instruction for the gesture of sliding left from the right edge of the display screen

As shown in table 2, "Activity1" is the identifier of the first page, and its corresponding instruction is a gesture-input-information sending instruction for the gesture of sliding right from the left edge of the display screen. "Activity2" is the identifier of the second page, and its corresponding instruction is "none", i.e., the second page has no corresponding designated instruction. "Activity3" is the identifier of the third page, and its corresponding instruction is a gesture-input-information sending instruction for the gesture of sliding left from the right edge of the display screen.

When the electronic device 100 receives a gesture input (a gesture input of sliding right from the left edge of the display screen) on a specified page (e.g., the first page), the electronic device 100 may acquire the system navigation instruction corresponding to the gesture input (e.g., a return-to-upper-level-page instruction). The electronic device 100 may determine, based on the first gesture navigation information shown in table 2, for example, that the specified page corresponds to a designated instruction (e.g., the gesture-input-information sending instruction for the gesture of sliding right from the left edge of the display screen). Then, when the gesture input on the specified page (e.g., the first page) is the designated gesture input (e.g., a gesture input of sliding right from the left edge of the display screen), the electronic device 100 may cause the first application to execute, based on the first page, the function corresponding to the designated gesture input (e.g., opening the side menu bar of the page) instead of executing the system navigation instruction corresponding to the gesture input.

When the electronic device 100 receives a gesture input (a gesture input of sliding right from the left edge of the display screen) on a specified page (e.g., the second page), the electronic device 100 may acquire the system navigation instruction corresponding to the gesture input (e.g., a return-to-upper-level-page instruction). When the electronic device 100 determines, based on the first gesture navigation information shown in table 2, for example, that the specified page has no corresponding designated instruction, the electronic device 100 may execute the system navigation instruction corresponding to the gesture input, so that the first application executes the system navigation function based on the second page (e.g., returns to the upper-level page).
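For illustration, the table 2 lookup could be sketched as a simple map from page identifier to designated instruction, falling back to the system navigation instruction when no entry exists; the names and instruction strings are assumptions:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical table 2 lookup: pageId -> designated instruction (absent means none).
public final class InstructionTable {
    private final Map<String, String> table = new HashMap<>();

    public InstructionTable() {
        table.put("Activity1", "send gesture input information for left-edge right swipe");
        table.put("Activity3", "send gesture input information for right-edge left swipe");
        // Activity2 intentionally absent: no designated instruction.
    }

    /** Designated instruction for the page, or the system navigation instruction if none. */
    public String resolve(String pageId, String systemNavInstruction) {
        String designated = table.get(pageId);
        return (designated != null) ? designated : systemNavInstruction;
    }
}
```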

It should be noted that the first gesture navigation information may include different gesture types corresponding to different pages in the first application, and the first application can set different trigger functions based on each page and the gesture type corresponding to each page. Illustratively, as shown in fig. 10, the first application may be a music application, where Activity1 is the identifier of a first page, Activity2 is the identifier of a second page, Activity3 is the identifier of a third page, and Activity4 is the identifier of a fourth page. The first page corresponds to the gesture type of sliding right from the left edge of the display screen, and the information corresponding to this gesture on the first page is gesture input information. The first application may set this gesture type to correspond to the function "open the side menu bar of the page" based on the first page, so the electronic device 100 may execute the function of opening the side menu bar of the page on the first page in response to a left-side slide gesture input, and execute a system navigation function (e.g., return to the upper-level page) in response to a right-side slide gesture input. The gesture type set for the second page is "none", so the electronic device 100 may execute a system navigation function (e.g., return to the upper-level page) on the second page in response to either a left-side slide input or a right-side slide input. The third page corresponds to the gesture type of sliding left from the right edge of the display screen, and the information corresponding to this gesture on the third page is gesture input information. The first application may set this gesture type to correspond to the function "display a floating window" based on the third page, so the electronic device 100 may execute a system navigation function (e.g., return to the upper-level page) on the third page in response to a left-side slide input, and execute the function of displaying a floating window in response to a right-side slide input. The fourth page corresponds to both the gesture type of sliding right from the left edge of the display screen and the gesture type of sliding left from the right edge of the display screen, and the information corresponding to these gestures on the fourth page is gesture input information. The first application may set, based on the fourth page, the gesture type of sliding left from the right edge of the display screen to correspond to the function "display a floating window", and the gesture type of sliding right from the left edge of the display screen to correspond to the function "open the side menu bar of the page"; thus, the electronic device 100 executes the function of opening the side menu bar of the page on the fourth page in response to a left-side slide gesture input, and may execute the function of displaying a floating window in response to a right-side slide gesture input. For each gesture type, the function set by the first application based on each page and the corresponding gesture type, and the implementation flow, reference may be made to the descriptions in the foregoing embodiments, which are not repeated herein.

As used in the above embodiments, the term "when …" may be interpreted to mean "if …" or "after …" or "in response to a determination of …" or "in response to a detection of …", depending on the context. Similarly, depending on the context, the phrase "at the time of determination …" or "if (a stated condition or event) is detected" may be interpreted to mean "if the determination …" or "in response to the determination …" or "upon detection (a stated condition or event)" or "in response to detection (a stated condition or event)".

In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, it may be implemented wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, digital subscriber line) or wirelessly (e.g., infrared, radio, microwave, etc.). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk), among others.

One of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by hardware related to instructions of a computer program, which may be stored in a computer-readable storage medium, and when executed, may include the processes of the above method embodiments. And the aforementioned storage medium includes: various media capable of storing program codes, such as ROM or RAM, magnetic or optical disks, etc.
