Terminal device control method and terminal device

Document No.: 1845422    Publication date: 2021-11-16

Reading note: the technique "Terminal device control method and terminal device" was designed and created by 袁德祥, 王晓林, 吕士朋, 周华 and 樊国祥 on 2021-08-19. The application discloses a control method of a terminal device and a terminal device, which are used to solve the problem that the terminal device cannot be controlled after the screen of a touch screen is damaged. The method is mainly applied to a terminal device with a damaged screen and comprises the following steps: after the screen is damaged, if a first user operation on a first entity key is detected, detecting the touch function of the screen and determining a closed area with a normal touch function; and mapping page content into the closed area for presentation. With this method, the terminal can be used temporarily after the screen is damaged, solving the problems that, after the screen of a touch screen is damaged, the terminal device cannot be controlled, the life cycle of the terminal may end abnormally, and normal services cannot be carried out in time.

1. A control method of a terminal device, applied to a terminal device with a damaged screen, the method comprising the following steps:

after the screen is damaged, if a first user operation on a first entity key is detected, detecting a touch function of the screen, and determining a closed area with a normal touch function;

and mapping page content to the closed area for presentation.

2. The method of claim 1, wherein after the detecting of the touch function of the screen and determining a closed area with a normal touch function, and before the mapping of the page content into the closed area for presentation, the method further comprises:

displaying the closed area;

and in response to a second user operation on a second entity key, adjusting the closed area.

3. The method of claim 1, wherein the detecting the touch function of the screen and determining the closed area with normal touch function comprises:

acquiring a plurality of lines drawn on the screen by a user;

determining the position of the damaged touch point according to the touch points of the lines;

and determining the closed area based on the position of the damaged touch point.

4. The method according to any one of claims 1-3, wherein before detecting the touch function of the screen and determining the closed area with normal touch function, the method further comprises:

and displaying an operation demonstration for detecting the touch function of the screen.

5. The method of claim 3, wherein prior to obtaining the plurality of lines drawn by the user in the screen, the method further comprises:

displaying a plurality of screen damage types;

in response to a third user operation on a third entity key, recording at least one selected screen damage type;

the determining the closed area based on the damaged touch point position includes:

and determining the closed area based on the position of the damaged touch point and the at least one recorded screen damage type.

6. The method of claim 5, wherein the obtaining of the plurality of lines drawn by the user on the screen further comprises:

and displaying a plurality of recommendation lines corresponding to the screen damage type based on the selected screen damage type.

7. The method of claim 5, further comprising:

and prompting an operation manner for cancelling the selection of the screen damage type.

8. The method of claim 1, further comprising:

if the first user operation on the first entity key is detected, recording an emergency mode identifier;

after the touch function of the screen is detected and the closed area with the normal touch function is determined, the method further comprises the following steps:

if a starting-up instruction is received, checking the emergency mode identification;

if the emergency mode identification is detected, acquiring the closed area;

and mapping page content to be displayed in the closed area.

9. The method of claim 8, wherein if the emergency mode identifier is detected, the method further comprises:

and prompting an operation manner for exiting the emergency mode.

10. A terminal device, comprising:

a display, a processor, and a memory;

the display is used for displaying a screen display area;

the memory is configured to store instructions executable by the processor;

the processor is configured to execute the instructions to implement the control method of the terminal device according to any one of claims 1-9.

Technical Field

The application relates to the technical field of touch terminals, in particular to a control method of a terminal device and the terminal device.

Background

Touch screens are employed in many terminal devices because of their convenience of operation, and mobile phones in particular increasingly adopt touch screens as their primary interface. A touch screen is easily damaged after being impacted by external force. After the screen is damaged, not only is the display effect affected, but touch operations in some areas may also fail.

For example, after a smart phone falls, cracks easily form on the screen, so that some touch points can no longer work normally. If a control happens to be displayed at a damaged touch point, the control loses its functionality because it cannot be touched. For instance, after the touch screen is broken, an unlock key displayed on the screen cannot be operated, and controls in a browsed interface cannot be activated, which makes the touch screen difficult to use. Therefore, how to control a terminal device whose touch screen is damaged remains to be solved.

Disclosure of Invention

The application aims to provide a control method of a terminal device, and a terminal device, for solving the problem that the terminal device cannot be controlled after the screen of a touch screen is damaged.

In a first aspect, the present application provides a control method for a terminal device, which is applied to a terminal device with a damaged screen, and the method includes:

after the screen is damaged, if a first user operation on a first entity key is detected, detecting a touch function of the screen, and determining a closed area with a normal touch function;

and mapping page content to the closed area for presentation.

In some embodiments, after the detecting the touch function of the screen and determining the closed area with normal touch function and before the mapping the page content to the closed area for display, the method further includes:

displaying the closed area;

and responding to a second user operation of a second entity key to adjust the closed area.

In some embodiments, the detecting the touch function of the screen and determining the closed area with a normal touch function includes:

acquiring a plurality of lines drawn on the screen by a user;

determining the position of the damaged touch point according to the touch points of the lines;

and determining the closed area based on the position of the damaged touch point.
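The application does not prescribe a specific algorithm for deriving the closed area. As a minimal illustrative sketch only, suppose the line-drawing test yields a grid marking each touch cell as working or damaged; the largest axis-aligned rectangle of working cells could then serve as the closed area (using the classic "largest rectangle in a histogram" technique):

```python
def largest_working_rect(grid):
    """Find the largest axis-aligned rectangle of working touch cells.

    grid[r][c] is True when the touch cell at row r, column c responded
    normally during the line-drawing test, False when it is damaged.
    Returns (top, left, height, width) of the best rectangle.
    """
    if not grid:
        return (0, 0, 0, 0)
    cols = len(grid[0])
    heights = [0] * cols        # consecutive working cells ending at each row
    best = (0, 0, 0, 0)
    best_area = 0
    for r, row in enumerate(grid):
        for c in range(cols):
            heights[c] = heights[c] + 1 if row[c] else 0
        # monotonic stack over this row's histogram of working-cell heights
        stack = []              # entries: (start_col, height)
        for c, h in enumerate(heights + [0]):
            start = c
            while stack and stack[-1][1] >= h:
                s, sh = stack.pop()
                area = sh * (c - s)
                if area > best_area:
                    best_area = area
                    best = (r - sh + 1, s, sh, c - s)
                start = s
            stack.append((start, h))
    return best
```

For example, a grid with one dead column yields the largest intact block to one side of it; the damaged column is excluded from the returned rectangle.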

In some embodiments, before the detecting the touch function of the screen and determining the closed region with a normal touch function, the method further includes:

and displaying an operation demonstration for detecting the touch function of the screen.

In some embodiments, before the obtaining the plurality of lines drawn in the screen by the user, the method further comprises:

displaying a plurality of screen damage types;

in response to a third user operation on a third entity key, recording at least one selected screen damage type;

the determining the closed area based on the damaged touch point position includes:

and determining the closed area based on the position of the damaged touch point and the at least one recorded screen damage type.
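The application only states that both the damaged touch points and the recorded damage types feed into the determination of the closed area. One hedged way to sketch this (the per-type safety margins below are purely hypothetical) is to shrink the detected area by a margin that grows with the severity of the recorded damage type:

```python
# Hypothetical safety margins per screen damage type, in touch-grid cells.
MARGINS = {"crack": 1, "shatter": 2, "dead_spot": 0}

def shrink_region(region, damage_types, margins=MARGINS):
    """Shrink a detected closed area (top, left, height, width) inward by
    the largest margin among the recorded damage types.  This is only an
    assumption about how the damage type might be used; the application
    does not fix a concrete rule."""
    top, left, h, w = region
    m = max((margins.get(t, 0) for t in damage_types), default=0)
    new_h = max(h - 2 * m, 0)
    new_w = max(w - 2 * m, 0)
    return (top + m, left + m, new_h, new_w)
```

With no damage type recorded the region is returned unchanged; a "shatter" type trims two cells from every edge.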

In some embodiments, before the obtaining of the plurality of lines drawn on the screen by the user, the method further includes:

and displaying a plurality of recommendation lines corresponding to the screen damage type based on the selected screen damage type.

In some embodiments, the method further comprises:

and prompting an operation manner for cancelling the selection of the screen damage type.

In some embodiments, the method further comprises:

and prompting an operation manner for cancelling the operation demonstration.

In some embodiments, the method further comprises:

if the first user operation on the first entity key is detected, recording an emergency mode identifier;

after the touch function of the screen is detected and the closed area with the normal touch function is determined, the method further comprises the following steps:

if a starting-up instruction is received, checking the emergency mode identification;

if the emergency mode identification is detected, acquiring the closed area;

and mapping page content to be displayed in the closed area.

In some embodiments, if the emergency mode identifier is detected, the method further comprises:

and prompting an operation manner for exiting the emergency mode.

In a second aspect, the present application provides a terminal device, including:

a display, a processor, and a memory;

the display is used for displaying a screen display area;

the memory is configured to store instructions executable by the processor;

the processor is configured to execute the instructions to implement the control method of the terminal device as described in any one of the above first aspects.

In a third aspect, the present application provides a computer-readable storage medium storing instructions which, when executed by a terminal device, enable the terminal device to perform the control method of the terminal device according to any one of the first aspect.

In a fourth aspect, the present application provides a computer program product comprising a computer program which, when executed by a processor, implements the control method of the terminal device according to any one of the first aspects above.

The embodiment of the application provides a control method of a terminal device. The method mainly comprises: entering a screen emergency mode after the terminal screen is damaged; determining a closed area with a normal touch function by detecting the touch function of the screen; and mapping the content of the terminal screen into that closed area. After the screen is damaged, the terminal can thus still be used temporarily in an emergency, or even for a long time. This solves the problems that, because the terminal cannot be controlled after the screen of the touch screen is damaged, the life cycle of the terminal ends abnormally and normal services cannot be carried out in time, and brings convenience to users.

Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the application. The objectives and other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.

Drawings

In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings used in the embodiments are briefly described below. Obviously, the drawings described below are only some embodiments of the present application, and other drawings can be obtained by those skilled in the art based on these drawings without creative effort.

Fig. 1 is a schematic structural diagram of a terminal device according to an embodiment of the present application;

fig. 2 is a block diagram of a software structure of a terminal device according to an embodiment of the present disclosure;

fig. 3 is a schematic flowchart of a control method for a terminal device according to an embodiment of the present application;

fig. 4 is a schematic flowchart of entering an emergency mode according to an embodiment of the present application;

fig. 5 is a schematic flowchart of entering the emergency mode after the terminal is powered on according to an embodiment of the present application;

fig. 6 is an animation effect diagram of an exemplary operation of detecting a touch function of a screen according to an embodiment of the present application;

FIG. 7 is an animation effect diagram of an exemplary manner of operation of a cancel operation provided in an embodiment of the present application;

fig. 8 is a schematic flowchart of determining a closed region in which the touch function of the screen is normal according to an embodiment of the present application;

FIG. 9 is a schematic flow chart illustrating a process for selecting a screen damage type according to an embodiment of the present application;

FIG. 10 is a diagram of an animation effect for selecting a type of screen damage provided by an embodiment of the present application;

FIG. 11a is an animation effect diagram of an operation manner for deselecting a screen damage type according to an embodiment of the present application;

FIG. 11b is an animation effect diagram showing a recommended line in a screen according to an embodiment of the present application;

FIG. 12 is a diagram illustrating an animation effect of determining a location of a damaged touch point according to an embodiment of the present disclosure;

FIG. 13 is a schematic flow chart illustrating the process of determining the enclosed area according to an embodiment of the present disclosure;

FIG. 14 is a diagram illustrating the effect of modifying a closed region according to an embodiment of the present disclosure;

FIG. 15 is a schematic flow chart illustrating a process for modifying a closed region according to an embodiment of the present disclosure;

FIG. 16 is a diagram illustrating an effect of mapping a page to a correction area according to an embodiment of the present application;

fig. 17 is an animation effect diagram of an operation manner of prompting to exit the screen emergency mode according to the embodiment of the present application.

Detailed Description

In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. The embodiments described are some, but not all embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.

Also, in the description of the embodiments of the present application, "/" indicates "or"; for example, A/B may indicate A or B. "And/or" in the text merely describes an association between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone. In addition, "a plurality" means two or more in the description of the embodiments of the present application.

The terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more such features, and in the description of the embodiments of the application, "a plurality" means two or more unless otherwise indicated.

With the popularization of intelligent terminals, touch screens are frequently adopted in many intelligent terminal devices due to their convenience of operation. However, a touch screen is easily damaged after being impacted by external force. Even a very small damaged area of the screen may prevent the terminal from performing touch operations such as unlocking the screen or answering a call, or even from exporting important data stored on the terminal. At present, terminal manufacturers still do not provide a corresponding solution for the case where a terminal screen is difficult to use because of damage.

In view of this, the embodiment of the present application provides a terminal device and a control method thereof. The following describes a control method of a terminal device provided in the present application with reference to an embodiment.

The inventive concept of the present application can be summarized as follows: after the terminal screen is damaged, the terminal can enter a screen emergency mode based on a user operation; in the emergency mode, a closed area with a normal touch function is determined by detecting the touch function of the screen, and the content of the terminal screen is then mapped into that closed area for display. By determining an area where the screen touch function is normal and controlling how screen content is displayed there, the terminal can still be used temporarily after the screen is damaged. This solves the problems that, after the screen of the touch screen is damaged, the terminal device cannot be controlled, the life cycle of the terminal may end abnormally, and normal services cannot be carried out in time.

After the inventive concept of the present application is introduced, the terminal device provided in the present application will be described below. Fig. 1 shows a schematic structural diagram of a terminal device 100. It should be understood that the terminal device 100 shown in fig. 1 is only an example, and the terminal device 100 may have more or less components than those shown in fig. 1, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.

A block diagram of a hardware configuration of a terminal device 100 according to an exemplary embodiment is exemplarily shown in fig. 1. As shown in fig. 1, the terminal device 100 includes: a Radio Frequency (RF) circuit 110, a memory 120, a display unit 130, a camera 140, a sensor 150, an audio circuit 160, a Wireless Fidelity (Wi-Fi) module 170, a processor 180, a bluetooth module 181, and a power supply 190.

The RF circuit 110 may be used to receive and transmit signals during information transmission and reception or during a call. It may receive downlink data from a base station and then deliver the downlink data to the processor 180 for processing, and may transmit uplink data to the base station. Typically, the RF circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like.

The memory 120 may be used to store software programs and data. The processor 180 performs the various functions of the terminal device 100 and data processing by executing software programs or data stored in the memory 120. The memory 120 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. The memory 120 stores an operating system that enables the terminal device 100 to operate. The memory 120 may store the operating system and various application programs, and may also store program code for executing the control method of the terminal device according to the embodiment of the present application.

The display unit 130 may be used to receive input numeric or character information and to generate signal input related to user settings and function control of the terminal device 100. In particular, the display unit 130 may include a touch screen 131 disposed on the front surface of the terminal device 100, which may collect touch operations by a user on or near it, such as clicking a button or dragging a scroll box.

The display unit 130 may also be used to display a Graphical User Interface (GUI) of information input by or provided to the user and various menus of the terminal apparatus 100. Specifically, the display unit 130 may include a display screen 132 disposed on the front surface of the terminal device 100. The display screen 132 may be configured in the form of a liquid crystal display, a light emitting diode, or the like. The display unit 130 may be used to display the mapping region and a screen display interface of the terminal device in the present application.

The touch screen 131 may cover the display screen 132, or the touch screen 131 and the display screen 132 may be integrated to implement the input and output functions of the terminal device 100, and after the integration, the touch screen may be referred to as a touch display screen for short. In the present application, the display unit 130 may display the application programs and the corresponding operation steps.

The camera 140 may be used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing elements convert the light signals into electrical signals which are then passed to the processor 180 for conversion into digital image signals.

The terminal device 100 may further comprise at least one sensor 150, such as an acceleration sensor 151, a distance sensor 152, a fingerprint sensor 153, a temperature sensor 154. The terminal device 100 may also be configured with other sensors such as a gyroscope, barometer, hygrometer, thermometer, infrared sensor, light sensor, motion sensor, and the like.

The audio circuitry 160, speaker 161, and microphone 162 may provide an audio interface between the user and the terminal device 100. The audio circuit 160 may transmit the electrical signal converted from received audio data to the speaker 161, which converts the electrical signal into a sound signal for output. The terminal device 100 may further be configured with volume buttons for adjusting the volume of the sound signal, which may also be combined with other buttons to adjust the closed area. On the other hand, the microphone 162 converts a collected sound signal into an electrical signal, which is received by the audio circuit 160 and converted into audio data; the audio data is then output to the RF circuit 110 for transmission to, for example, another terminal device, or output to the memory 120 for further processing.

Wi-Fi is a short-range wireless transmission technology. Through the Wi-Fi module 170, the terminal device 100 can help a user send and receive e-mails, browse web pages, access streaming media, and the like, providing wireless broadband Internet access for the user.

The processor 180 is a control center of the terminal device 100, connects various parts of the entire terminal device using various interfaces and lines, and performs various functions of the terminal device 100 and processes data by running or executing software programs stored in the memory 120 and calling data stored in the memory 120. In some embodiments, processor 180 may include one or more processing units; the processor 180 may also integrate an application processor, which mainly handles operating systems, user interfaces, applications, etc., and a baseband processor, which mainly handles wireless communications. It will be appreciated that the baseband processor described above may not be integrated into the processor 180. In the present application, the processor 180 may run an operating system, an application program, a user interface display, and a touch response, and the control method of the terminal device according to the embodiment of the present application. Further, the processor 180 is coupled with the display unit 130.

And the bluetooth module 181 is configured to perform information interaction with other bluetooth devices having a bluetooth module through a bluetooth protocol. For example, the terminal device 100 may establish a bluetooth connection with a wearable electronic device (e.g., a smart watch) having a bluetooth module via the bluetooth module 181, so as to perform data interaction.

The terminal device 100 also includes a power supply 190 (such as a battery) for powering the various components. The power supply may be logically connected to the processor 180 through a power management system to manage charging, discharging, power consumption, etc. through the power management system. The terminal device 100 may further be configured with a power button for powering on and off the terminal device, and locking the screen.

Fig. 2 is a block diagram of a software configuration of the terminal device 100 according to the embodiment of the present application.

The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system may be divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer, from top to bottom, respectively.

The application layer may include a series of application packages.

As shown in fig. 2, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc.

The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.

As shown in FIG. 2, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.

The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.

The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, dialed and answered calls, browsing history and bookmarks, phone books, short messages, etc.

The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying a picture.

The phone manager is used to provide the communication function of the terminal device 100. Such as management of call status (including on, off, etc.).

The resource manager provides various resources, such as localized strings, icons, pictures, layout files, video files, etc., to the application.

The notification manager allows an application to display notification information (e.g., a message digest or message content) in the status bar, can be used to convey notification-type messages, and can automatically disappear after a short stay without user interaction. For example, the notification manager is used to notify of download completion, message alerts, and the like. The notification manager may also present notifications in the form of a chart or scrolling text in the system's top status bar, such as notifications of applications running in the background, or notifications in the form of a dialog window on the screen. For example, text information is prompted in the status bar, a prompt tone sounds, the terminal device vibrates, or an indicator light blinks.

The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.

The core library comprises two parts: one part contains the functions that the Java language needs to call, and the other part is the core library of Android.

The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine performs functions such as object life-cycle management, stack management, thread management, security and exception management, and garbage collection.

The system library may include a plurality of functional modules, for example: a surface manager, media libraries, three-dimensional graphics processing libraries (e.g., OpenGL ES), and 2D graphics engines (e.g., SGL).

The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.

The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.

The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.

The 2D graphics engine is a drawing engine for 2D drawing.

The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, and a sensor driver.

The following exemplifies the workflow of the terminal device 100 software and hardware in conjunction with the user operation event.

When the touch screen 131 receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as the touch coordinates and the time stamp of the touch operation). The raw input events are stored at the kernel layer. The application framework layer obtains the raw input event from the kernel layer, identifies the input event, and then triggers control of the terminal device. When the touch screen is damaged, the touch function fails in the locally damaged area, so the user cannot trigger a touch event through touch operations in that area. Therefore, in the embodiment of the application, the touch function can be checked first, and a suitable area is then determined for displaying the page content of the terminal device, so that the user can perform touch operations in that area and trigger touch events.
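For a touch event in the working area to act on the page content mapped into it, the event's coordinates must be translated back to page coordinates. The application does not fix a particular transform; the following is a minimal sketch assuming the whole page is uniformly scaled into the working region:

```python
def map_touch_to_page(x, y, region, screen_w, screen_h):
    """Translate a touch inside the working region back to the full-page
    coordinate it represents, assuming a uniform scaling of the page into
    the region (one possible mapping scheme, not mandated by the text).

    region is (left, top, width, height) of the closed working area.
    Returns None when the touch falls outside the region, since damaged
    cells there cannot reliably report events anyway.
    """
    left, top, w, h = region
    if not (left <= x < left + w and top <= y < top + h):
        return None
    page_x = (x - left) * screen_w / w
    page_y = (y - top) * screen_h / h
    return (page_x, page_y)
```

For instance, with a 1080x1920 screen mapped into a 540x960 region at (100, 200), a tap at the region's origin resolves to the page origin, and a tap at the region's center resolves to the page center.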

The terminal device 100 in the embodiment of the present application may be an electronic device including, but not limited to, a smart phone, a tablet computer, a wearable electronic device (e.g., a smart watch), a notebook computer, and a television.

In order to facilitate understanding of the control method of the terminal device provided in the embodiment of the present application, the following further describes the control method with reference to the accompanying drawings. It should be noted that the method provided by the embodiment of the present application is suitable for controlling the terminal device after the screen of the terminal device is damaged.

Fig. 3 is a flowchart illustrating a control method for a terminal device according to an embodiment of the present application. As shown in fig. 3, the method comprises the steps of:

step 301: after the screen is damaged, if first user operation on a first entity key is detected, the touch function of the screen is detected, and a closed area with a normal touch function is determined.

In some embodiments, after the screen of the terminal device is damaged, the user may enter a screen emergency mode through a first user operation on a first entity key. To ensure the screen mode is handled consistently, before entering the screen emergency mode it is necessary to determine whether an emergency mode identifier exists. The specific steps are shown in fig. 4:

In step 401, a first user operation on a first entity key is detected, triggering the screen emergency mode. In step 402, it is determined whether an emergency mode identifier exists. If no identifier exists, the identifier is written in step 403 and the screen emergency mode is then entered in step 404; if the identifier already exists, step 404 is executed directly.

In the embodiment of the present application, the first physical key may be a single key or a combination of keys, set as needed.

In another embodiment, whether the terminal enters the screen emergency mode or the screen normal mode after power-on is likewise decided by checking whether the emergency mode identifier exists. This may be implemented as the steps shown in fig. 5:

In step 501, after a first user operation on the first entity key is detected, an emergency mode identifier is recorded. In step 502, a power-on instruction is received. In step 503, the terminal checks whether the emergency mode identifier exists: if the identifier is detected, the screen emergency mode is entered in step 504 and the following operations are executed; if it is not detected, the screen normal mode is entered in step 505.
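A minimal sketch of this flag-driven flow, assuming a simple file-based identifier. The flag path and storage mechanism here are illustrative; a real terminal would use its own persistent storage:

```python
import os
import tempfile

# Hypothetical flag location; real devices would persist this differently.
FLAG_PATH = os.path.join(tempfile.gettempdir(), "emergency_mode.flag")

def record_emergency_flag() -> None:
    """Steps 402/403 and 501: write the emergency mode identifier if absent."""
    if not os.path.exists(FLAG_PATH):
        with open(FLAG_PATH, "w") as f:
            f.write("1")

def select_mode_on_boot() -> str:
    """Steps 503-505: choose the display mode from the persisted identifier."""
    return "emergency" if os.path.exists(FLAG_PATH) else "normal"

record_emergency_flag()
print(select_mode_on_boot())  # emergency

os.remove(FLAG_PATH)          # exiting emergency mode deletes the identifier
print(select_mode_on_boot())  # normal
```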

In this way, the screen emergency mode is entered through the first user operation on the first entity key after the terminal screen is damaged.

In some embodiments, after the emergency mode is entered, the touch function of the screen is detected; however, a new user may not know how to perform this detection, so in the embodiment of the present application an operation demonstration of detecting the touch function is displayed. Fig. 6 shows the touch function detection presented to the user in the form of an animation after the emergency mode is entered. To facilitate user operation, the embodiment of the present application also presents an operation mode for cancelling the demonstration. As shown in fig. 7, while the animation plays, the screen prompts whether to exit the touch check operation by "clicking the volume entity key three times", so that the user can exit the touch check demonstration at any time.

In some embodiments, in order to determine the largest touchable area of the damaged screen, the touch function of the screen needs to be detected, and a closed area with a normal touch function is determined, which may be specifically implemented as the steps shown in fig. 8:

in step 801, a plurality of lines drawn in a screen by a user are acquired.

To determine the closed area conveniently and accurately for showing the page content, the screen damage type can be determined before the lines drawn by the user in the screen are acquired. One possible implementation displays a plurality of different screen damage types in the screen for the user to select; since the touch function may be abnormal, the user is prompted to make the selection through physical keys. This may be implemented as the steps shown in fig. 9:

upon entering the screen contingency mode, the user is prompted to select a screen damage type, and several typical screen damage types are provided in step 901.

In step 902, the user is prompted to select a type of screen damage.

In step 903, a user operation for a third entity key operation is detected.

In step 904, if a user operation for a third physical key operation is detected, the selected screen damage type data is recorded and marked.

In step 905, if the user operation for the third physical key operation is not detected, the screen damage type data is not recorded.

As shown in fig. 10, the left diagram in fig. 10 shows several possible screen damage types for the user to select, and the user can make the selection through the third physical key. For example, the user can move the focus with the volume up and volume down keys; the option the focus is on is surrounded with a frame so that the user knows the focus position, and the screen damage type at the focus position is then selected with the power key. The user can select one or more screen damage types according to the actual situation. As shown in the right diagram of fig. 10, the two screen damage types selected by the user are "lower left corner damage" and "middle damage", respectively.
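The key-driven selection described above can be sketched as follows. The key names, and the use of a confirm key to toggle a selection, are assumptions made for illustration:

```python
def move_focus(focus: int, key: str, num_options: int) -> int:
    """Move the highlighted option with the volume keys (hypothetical mapping)."""
    if key == "volume_up":
        return (focus - 1) % num_options
    if key == "volume_down":
        return (focus + 1) % num_options
    return focus

options = ["upper left corner", "lower left corner", "middle", "full screen"]
selected = set()
focus = 0
# Simulated key presses: move to "lower left corner", confirm,
# move to "middle", confirm — matching the right diagram of fig. 10.
for key in ["volume_down", "confirm", "volume_down", "confirm"]:
    if key == "confirm":           # e.g. the power key toggles the selection
        selected.add(options[focus])
    else:
        focus = move_focus(focus, key, len(options))
print(sorted(selected))  # ['lower left corner', 'middle']
```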

In some embodiments, if none of the provided options is appropriate, the user may also cancel the screen damage type selection. Therefore, in this embodiment of the application, at the stage when the user selects the screen damage type, the user may be prompted with the operation mode for cancelling the selection, for example through which physical key or key combination the cancellation is performed, as shown in fig. 11a. When the user does not need to select a screen damage type, the selection can be exited according to the prompted key, and no screen damage type is recorded.

If the user selects a screen damage type, a recommended line can be displayed in the screen based on the selected type, so that the user can draw a plurality of lines on the screen following the recommended line. For example, as shown in fig. 11b, if the screen damage type is lower left corner damage, several horizontal lines and several vertical lines are displayed in the lower left region of the screen, so that the touch function of the lower left corner region is inspected in detail.

In implementation, the recommended lines can be set according to the screen damage type and the actual situation, which is not limited in the embodiment of the present application. For example, the recommended lines for full-screen drawing can be set to 11 horizontal lines and 5 vertical lines. If the user does not select a screen damage type, the damaged area is unknown before the user draws lines, and the user is therefore advised to draw lines across the full screen; if the user selects a screen corner, the user may be advised to draw lines at the selected corner. Other screen damage types are handled in a similar manner.

Step 802: and determining the position of the damaged touch point according to the touch points of the lines drawn by the user.

During line drawing, if the user draws at a substantially uniform speed, the distances between the coordinates of adjacent touch points are substantially equal. If a distance between touch points becomes much larger, and/or a touch point deviates markedly from the line (for example, a sudden change in line direction can be understood as touch point deviation), the line has a break point or a drift point, and the position of the damaged touch point can thereby be determined. As shown in fig. 12, touch point breaks or drift occur where the crack is located, and the position of the damaged touch point can be detected from these features.
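A sketch of this detection, under the stated assumption that the user draws at a roughly uniform speed. The `gap_factor` threshold is a hypothetical tuning parameter, not a value given in this application:

```python
import math

def find_damaged_points(points, gap_factor=2.0):
    """Flag positions where the spacing between consecutive touch samples
    jumps well above the median step, indicating a break or drift point."""
    steps = [math.dist(points[i], points[i + 1]) for i in range(len(points) - 1)]
    median = sorted(steps)[len(steps) // 2]
    return [points[i + 1] for i, s in enumerate(steps) if s > gap_factor * median]

# Samples along a horizontal line with one gap where the crack swallowed input.
line = [(x, 100) for x in range(0, 50, 10)] + [(200, 100), (210, 100)]
print(find_damaged_points(line))  # [(200, 100)]
```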

Step 803: and determining a closed area based on the position of the damaged touch point.

In some embodiments, a maximum rectangular area may be found based on the location of the damaged touch point, where the damaged touch point location is not contained within the maximum rectangular area.
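One way to find such a maximum rectangle is the classical maximal-rectangle algorithm over a grid. This sketch assumes the screen has been discretized into cells and each damaged touch point mapped to a `(row, col)` position, which is an implementation choice not specified in this application:

```python
def largest_clear_rect(rows, cols, damaged):
    """Find the largest axis-aligned rectangle of grid cells containing no
    damaged touch point. Returns (top, left, bottom, right) inclusive,
    or None if every cell is damaged."""
    heights = [0] * cols
    best = (0, None)
    for r in range(rows):
        # Column heights: consecutive clear cells ending at row r.
        for c in range(cols):
            heights[c] = 0 if (r, c) in damaged else heights[c] + 1
        # Largest rectangle in this histogram via a monotonic stack.
        stack = []  # entries of (start_col, height)
        for c, h in enumerate(heights + [0]):  # sentinel flushes the stack
            start = c
            while stack and stack[-1][1] >= h:
                start, hh = stack.pop()
                area = hh * (c - start)
                if area > best[0]:
                    best = (area, (r - hh + 1, start, r, c - 1))
            stack.append((start, h))
    return best[1]

# A 4x6 grid with a damaged lower left corner.
damaged = {(2, 0), (3, 0), (3, 1)}
print(largest_clear_rect(4, 6, damaged))  # (0, 2, 3, 5)
```

The returned rectangle spans rows 0-3 and columns 2-5, avoiding the damaged lower left cells, which matches the requirement that no damaged touch point lies inside the maximum rectangular area.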

In some embodiments, if the user has selected a screen damage type, the closed area with a normal touch function can be determined from the determined positions of the damaged touch points together with the recorded at least one screen damage type. For example, if the user selected lower left corner damage when choosing the screen damage type, and the touch function detection also finds many damaged touch points in the lower left corner, then the lower left corner is a damaged area with high reliability, and the determined closed area with a normal touch function can avoid the lower left corner of the screen.

The lines drawn by the user can be horizontal lines, vertical lines, oblique lines, diagonal lines, square lines or a combination of several lines, and are all suitable for the embodiment of the application. The process of determining the closed region will be further described below with reference to the steps shown in fig. 13, taking the example of drawing horizontal and vertical lines.

In step 1301, the user draws a line laterally.

In step 1302, the coordinates of the touch point with the damaged horizontal line are recorded.

In step 1303, it is determined whether the horizontal scribing is completed, if not, the process returns to step 1301, and if yes, the user performs the vertical scribing in step 1304.

Completion of line drawing may be indicated by the user pressing a key, or judged according to whether the number of drawn lines reaches the recommended number: if the recommended number is reached, drawing is considered complete; otherwise it is judged incomplete.

In step 1305, the coordinates of the touch point with the damaged vertical line are recorded.

In step 1306, it is determined whether the vertical scribing is completed, if not, the process returns to step 1304, and if so, the closed region is determined in step 1307.

In step 1308, it is determined whether the enclosed area satisfies a preset condition. If the preset condition is not met, it is determined in step 1309 whether a user operation on the fourth entity key is triggered: if so, the process returns to step 1301; if not, the position of the closed area is stored in step 1310. If the preset condition is satisfied, the process goes directly to step 1310.

By triggering the fourth entity key, the user can continue drawing lines on the screen.

The preset condition can be judged by whether the closed area roughly matches the display ratio of the screen in the horizontal or vertical orientation. If the aspect ratio of the closed region is determined to be close to the aspect ratio of the screen, the preset condition is met; otherwise, it is not.
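A minimal sketch of such a preset-condition check; the 25% tolerance is an assumed value, not one specified in this application:

```python
def meets_preset_condition(region, screen, tolerance=0.25):
    """Accept the closed region when its aspect ratio is close to the
    screen's ratio in either orientation (step 1308's preset condition)."""
    rw, rh = region
    sw, sh = screen
    ratio = rw / rh
    return (abs(ratio - sw / sh) <= tolerance * (sw / sh)       # portrait
            or abs(ratio - sh / sw) <= tolerance * (sh / sw))   # landscape

print(meets_preset_condition((540, 960), (1080, 1920)))  # True
print(meets_preset_condition((900, 200), (1080, 1920)))  # False
```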

Therefore, the closed area with the normal touch function can be determined through detecting the touch function of the screen.

In some embodiments, after the closed region with the normal touch function is automatically determined, the closed region may be displayed, so that the closed region may be adjusted according to a user requirement, for example, the user may be prompted to adjust the displayed closed region by operating the second entity key.

One possible implementation uses the displayed closed region as the region to be adjusted: when the area of the displayed closed region is too small or its aspect ratio is unreasonable, the user operates an entity key to move a boundary line on the screen to a proper position and adjust the closed region, and the closed region modified by the user is stored as the finally determined closed region. Fig. 14 shows the effect of the closed region after correction; an edge of the displayed closed region that lies on the screen border needs no adjustment. Adjusting the enclosed area may include the steps shown in fig. 15:

In step 1501, the user is prompted with the key operation for exiting region correction.

In step 1502, a user operation on a second physical key is detected, and a boundary line on the screen is adjusted to an appropriate position and marked.

In step 1503, it is determined whether the closed region has been adjusted; if so, the confirmation key operation is performed in step 1504, and if not, the process returns to step 1502.

If the corrected closed area has been adjusted to a proper size, the adjustment is finished; otherwise the adjustment continues.

In step 1505, the corrected closed region position is stored.

In this way, without affecting actual touch operation, the user can enlarge or reduce the region, or adjust it to a more reasonable display aspect ratio, through physical key operations, and finally determine a more suitable closed area.

Step 302: and mapping the page content to be displayed in the finally determined closed area.

In some embodiments, the page content may be scaled to the size of the closed area for presentation; to keep the page from being stretched, the scaling may preserve the aspect ratio of the page.
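The aspect-ratio-preserving scaling can be computed with a single uniform scale factor, for example:

```python
def fit_page(page_w, page_h, region_w, region_h):
    """Scale the page to fit inside the closed region while preserving its
    aspect ratio, so content shrinks uniformly rather than stretching."""
    scale = min(region_w / page_w, region_h / page_h)
    return round(page_w * scale), round(page_h * scale)

# A 1080x1920 page mapped into a 500x800 closed region.
print(fit_page(1080, 1920, 500, 800))  # (450, 800)
```

Taking the minimum of the two per-axis ratios guarantees both scaled dimensions fit inside the region.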

In another embodiment, the page may not be zoomed, and a scroll bar is added when the page is displayed, so that the user can view the content of the page through the scroll bar.

Of course, when the closed area is small, the page content can be scaled to a certain extent and a scroll bar added as well, to help the user view the page content.

In some examples, after mapping the page content to the closed region presentation, the user may use the controls in the page in the closed region again, with the specific effect as shown in fig. 16. The screen area outside the closed area may be a black screen, may also be content that the user sets for display by himself, and may also be a default screen, such as a home screen, which is not limited in the embodiment of the present application.

In some embodiments, the emergency mode identifier is recorded when a first user operation on a first physical key is detected. If no closed area has been stored in advance, then after the closed area is determined by the touch function inspection, the page content is displayed in the closed area and the area is stored. After the next power-on, when a power-on instruction is received and the emergency mode identifier is found, the pre-stored closed area is read directly to display the page content.

Of course, after the device is powered on, if the emergency mode identifier is found, an operation mode for cancelling the emergency mode can be prompted. For example, a page such as that shown in fig. 17 may be displayed, prompting the operation for exiting the emergency mode. If the user chooses to exit the emergency mode, the emergency mode identifier can be deleted, the pre-stored closed area can be deleted, and the normal mode is then entered for display. The normal mode in this application refers to a mode that displays page content based on the screen size with a normally functioning touch screen; the emergency mode refers to displaying page content in the predetermined closed area.

The timing for prompting to exit the emergency mode may be determined as required: for example, after the first user operation on the first key, when the closed area is displayed, or when a page is mapped into the determined closed area for display. In short, the user can exit the emergency mode at any time.

In the embodiment of the present application, each entity key may be the same or different, and may include one or more key combinations, as long as different operations can be distinguished, which are all applicable to the embodiment of the present application. For example, the volume up key may be used as a first physical key, the volume down key may be used as a second physical key, and the volume up key and the volume down key may be combined to be used as a third physical key.

Of course, in addition to physical key operations, in the case of screen damage before the closed region is determined, the user can also control the terminal device with voice instructions or through machine vision. For example: starting the emergency mode by voice, selecting the screen damage type by voice, exiting the emergency mode by voice, exiting the operation demonstration by voice, cancelling the screen damage type selection, and the like.

Based on the foregoing description, in the embodiment of the application, by detecting the touch function of the screen, determining the closed region with the normal touch function, and mapping the content of the terminal screen into the closed region with the normal touch function for display, the problems that after the screen of the touch screen is damaged, the terminal equipment cannot be controlled, the life cycle of the terminal may be abnormally ended, and normal services cannot be timely developed are finally solved.

The embodiments provided in the present application are only a few examples of the general concept of the present application, and do not limit the scope of the present application. Any other embodiments extended according to the scheme of the present application without inventive efforts will be within the scope of protection of the present application for a person skilled in the art.

Further, while the operations of the methods of the present application are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in this particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions.

It will be apparent to those skilled in the art that embodiments of the present application may be provided as a method, terminal device or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.

The present application is described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.

These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.

These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.

It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.
