Capacitive sensor delay compensation

Document No.: 118970    Publication date: 2021-10-19

Note: This technique, "Capacitive sensor delay compensation," was designed and created by Philip Quinn, filed 2019-10-16. Abstract: One example method includes identifying, by one or more processors of a computing device and based on mutual capacitance data generated by a presence-sensitive display of the computing device, one or more mutual capacitance touch locations; identifying, by the one or more processors and based on self-capacitance data generated by the presence-sensitive display, one or more self-capacitance touch locations; determining, by the one or more processors, a motion between the one or more mutual capacitance touch locations and a corresponding touch location of the one or more self-capacitance touch locations; adjusting, by the one or more processors and based on the determined motion, the one or more mutual capacitance touch locations to obtain one or more adjusted mutual capacitance touch locations; and utilizing, by the one or more processors, the one or more adjusted mutual capacitance touch locations as user input.

1. A method, comprising:

identifying, by one or more processors of a computing device and based on mutual capacitance data generated by a presence-sensitive display of the computing device, one or more mutual capacitance touch locations;

identifying, by the one or more processors and based on self-capacitance data generated by the presence-sensitive display, one or more self-capacitance touch locations, each of the one or more mutual capacitance touch locations corresponding to a touch location of the one or more self-capacitance touch locations;

determining, by the one or more processors, a motion between the one or more mutual capacitance touch locations and a corresponding touch location of the one or more self-capacitance touch locations;

adjusting, by the one or more processors and based on the determined motion, the one or more mutual capacitance touch locations to obtain one or more adjusted mutual capacitance touch locations; and

utilizing, by the one or more processors, the one or more adjusted mutual capacitance touch locations as user input.

2. The method of claim 1, wherein identifying the one or more self-capacitance touch locations comprises:

identifying the one or more self-capacitance touch locations based on the one or more mutual capacitance touch locations and the self-capacitance data.

3. The method of claim 2, wherein identifying the one or more self-capacitance touch locations further comprises:

determining reconstructed self-capacitance data based on the self-capacitance data; and

identifying one or more touch locations in the reconstructed self-capacitance data that correspond to ones of the one or more mutual capacitance touch locations as the one or more self-capacitance touch locations.

4. The method of any of claims 1-3, wherein adjusting a particular mutual capacitance touch location of the one or more mutual capacitance touch locations comprises:

predicting a future location of the particular mutual capacitance touch location based on the motion and a delay target value.

5. The method of any of claims 1-4, wherein utilizing the one or more adjusted mutual capacitance touch locations as user input comprises:

providing the one or more adjusted mutual capacitance touch locations as user input to an application executing at the computing device.

6. The method of claim 5, further comprising:

outputting, for display at the presence-sensitive display, a graphical user interface based on instructions received from the application; and

outputting, for display at the presence-sensitive display and based on instructions received from the application, an updated graphical user interface modified based on the user input.

7. The method of any of claims 1-6, wherein the presence-sensitive display includes a capacitive touch panel.

8. The method of any of claims 1-7, wherein the one or more processors include a touch controller and an application processor.

9. The method of claim 8, wherein utilizing the one or more adjusted mutual capacitance touch locations as user input comprises:

outputting, by the touch controller to the application processor, the one or more adjusted mutual capacitance touch locations.

10. A computing device, comprising:

a presence-sensitive display; and

one or more processors configured to perform the method of any combination of claims 1-9.

11. A non-transitory computer-readable storage medium storing instructions that, when executed, cause one or more processors of a mobile computing device to perform the method of any combination of claims 1-9.

Background

Some computing devices may include proximity (e.g., touch or presence) sensors that can detect user input. For example, a computing device may include a presence-sensitive display (i.e., a display with a proximity sensor) that is capable of displaying graphical objects and receiving user input to enable a user to interact with the displayed graphical objects. Some example interactions include a user moving their finger across a presence-sensitive display to drag an object and/or cause a computing device to scroll.

One example of a proximity sensor is a capacitive sensor. A capacitive sensor panel is constructed from a matrix of row and column electrodes on either side of a dielectric material. The electrodes are typically constructed of a transparent conductive material such as Indium Tin Oxide (ITO) so that they can be placed over the display module and not seen by the user. The dielectric material is typically a glass substrate. The touch panel module may be attached to a surface of a display module, for example, a Liquid Crystal Display (LCD) or an Organic Light Emitting Diode (OLED) display, and may be disposed under a protective cover glass. The electrodes may be connected to a touch controller, which may drive the electrodes with voltage signals and sense the resulting capacitance changes.

When an electrode is driven with a voltage signal, the intrinsic capacitance of the electrode with respect to other objects (e.g., a human finger, another electrode, or ground) may be measured. Changes in the ambient environment may have an effect on the intrinsic capacitance of the electrode.

Disclosure of Invention

In general, techniques of this disclosure are directed to compensating for delays incurred through the use of capacitive proximity sensors to detect user input. In direct manipulation scenarios (e.g., on mobile phones, tablet computers, and smart watches), a presence-sensitive display may be attached on top of a display panel (e.g., an LCD or OLED panel) and enable a user to manipulate objects on the display in accordance with the movement of their finger. In such a scenario, latency may become an issue. In particular, it is desirable that the time between the user providing their input and observing a corresponding response from the system be as small as possible (e.g., because the user expects the system to be highly responsive to their input). For example, if a user is dragging an object rendered on a display with their finger, the user expects that the object will follow their finger accurately without lagging, jumping ahead, or stuttering. High latency not only frustrates the user from a visual perspective, but also results in sub-optimal human-machine interaction. For example, the system may incorrectly interpret input from the user or fail to detect the input entirely. Users may feel forced to adapt their input behavior in order to try to counteract some of the latency effects, for example by slowing down their input or tracing a different path with their fingers. Users may not be able to enter instructions or data into the system as quickly as they would otherwise want, or they may have to perform additional inputs in order to compensate for previous inputs that were misinterpreted by the system. These additional inputs may place further demands on the system, such as increased power consumption due to increased usage, increased processing costs, and reduced lifetime of the display panel and/or capacitive proximity sensor.

The capacitive proximity sensor may periodically scan for input. The time between successive scans defines the minimum interval before a change in input can be observed. The touch controller may perform signal processing and algorithmic calculations on the sensor data (e.g., to resolve the sensor data into high-resolution coordinates). The touch controller may output the coordinates directly for use by the application, or the coordinates may be further processed before being output for use by the application. The application may communicate a graphical response to the input via the graphics system and ultimately cause the display to update based on the response, which may be slowed by the update frequency (e.g., refresh rate) of the display.
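To make the cumulative effect of these stages concrete, the following is a toy latency budget. All rates and times are illustrative assumptions, not values from this disclosure:

```python
# Illustrative latency budget for the pipeline described above.
# All rates and times below are assumptions, not values from this disclosure.
scan_interval_ms = 1000 / 120       # assumed 120 Hz capacitive scan rate
processing_ms = 2.0                 # assumed controller signal processing time
display_refresh_ms = 1000 / 60      # assumed 60 Hz display refresh

# Worst case: the input lands just after a scan completes and the graphical
# response just misses a display refresh.
worst_case_ms = scan_interval_ms + processing_ms + display_refresh_ms
print(f"worst-case added latency: {worst_case_ms:.1f} ms")  # 27.0 ms
```

Even with these favorable assumptions, tens of milliseconds can accumulate before processing by the application or graphics system is counted, which is why prediction-based compensation can be worthwhile.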

Capacitive proximity sensors may be capable of performing several different types of input scans, each of which presents advantages and disadvantages. Some example input scan types include self-capacitance scans and mutual capacitance scans. Self-capacitance scanning offers the advantages of higher speed and lower power requirements, but can be affected by a "smearing" effect when multiple contacts are present (e.g., where a user performs one gesture using multiple fingers). Mutual capacitance scanning is slower and requires more power than self-capacitance scanning, but is not affected by smearing.

In accordance with one or more techniques of this disclosure, a computing device may utilize a combination of self-capacitance scanning and mutual capacitance scanning to obtain an estimate of the dynamic motion (e.g., input direction and speed) in a user input. For example, the computing device may identify self-capacitance touch locations based on the self-capacitance scan and mutual capacitance touch locations based on the mutual capacitance scan. The computing device can determine which of the self-capacitance touch locations correspond to mutual capacitance touch locations (e.g., and discard self-capacitance touch locations that do not correspond to any mutual capacitance touch location), and determine motion between each mutual capacitance touch location and its corresponding self-capacitance touch location. The computing device may utilize the obtained estimate of dynamic motion to adjust the estimate of the static touch location so as to anticipate where the actual input location will be when the input location is received by the application. In this manner, the computing device may compensate for delays (e.g., introduced during processing of the scan data). As such, the techniques of this disclosure enable the computing device to more smoothly and/or accurately track user input, thereby providing improved human-machine interaction and user experience.
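The adjustment step can be sketched as simple linear extrapolation. This is a minimal illustration, not the disclosed implementation: it assumes the self-capacitance sample is the newer of the pair, assumes locally linear motion, and `latency_target_s` is a hypothetical tuning value in the spirit of the delay target value of claim 4:

```python
def compensate(mutual_pos, self_pos, dt_s, latency_target_s):
    """Extrapolate a mutual capacitance touch location forward in time.

    mutual_pos:       (x, y) from the mutual capacitance scan
    self_pos:         (x, y) of the corresponding contact from the self scan
                      (assumed here to be the newer sample of the pair)
    dt_s:             seconds elapsed between the two scans
    latency_target_s: hypothetical delay target to compensate for
    """
    # Velocity estimated from the motion between the corresponding locations.
    vx = (self_pos[0] - mutual_pos[0]) / dt_s
    vy = (self_pos[1] - mutual_pos[1]) / dt_s
    # Adjusted (predicted) location, assuming locally linear motion.
    return (mutual_pos[0] + vx * latency_target_s,
            mutual_pos[1] + vy * latency_target_s)

# A contact moving down-right: 4 px and 6 px of motion over 8 ms,
# compensated for a 20 ms delay target.
adjusted = compensate((100.0, 200.0), (104.0, 206.0), 0.008, 0.020)
# adjusted is approximately (110.0, 215.0)
```

Because the two scan types can be interleaved closely in time, the velocity estimate here can be fresher than one computed from two successive mutual capacitance scans alone.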

The techniques of this disclosure may also balance increased responsiveness of the computing device to user input against power consumption and processing costs. Users may no longer need to adapt their input behavior in order to try to counteract some of the delay effects experienced on known systems. Users may be able to input instructions or data into the system faster and/or more efficiently, and they may avoid having to perform additional inputs that would normally compensate for previous inputs that were misinterpreted by the system. Not having to perform these additional corrective inputs may provide savings in power consumption and processing costs for the system, as well as improved lifetime of the display panel and/or capacitive proximity sensor.

In one example, a method includes identifying, by one or more processors of a computing device and based on mutual capacitance data generated by a presence-sensitive display of the computing device, one or more mutual capacitance touch locations; identifying, by the one or more processors and based on self-capacitance data generated by the presence-sensitive display, one or more self-capacitance touch locations; determining, by the one or more processors, a motion between the one or more mutual capacitance touch locations and a corresponding touch location of the one or more self-capacitance touch locations; adjusting, by the one or more processors and based on the determined motion, the one or more mutual capacitance touch locations to obtain one or more adjusted mutual capacitance touch locations; and utilizing, by the one or more processors, the one or more adjusted mutual capacitance touch locations as user input.

In another example, a computing device includes a presence-sensitive display; and one or more processors configured to: identify one or more mutual capacitance touch locations based on mutual capacitance data generated by the presence-sensitive display; identify one or more self-capacitance touch locations based on self-capacitance data generated by the presence-sensitive display; determine a motion between the one or more mutual capacitance touch locations and a corresponding touch location of the one or more self-capacitance touch locations; adjust the one or more mutual capacitance touch locations based on the determined motion to obtain one or more adjusted mutual capacitance touch locations; and utilize the one or more adjusted mutual capacitance touch locations as user input.

In another example, a non-transitory computer-readable storage medium stores instructions that, when executed, cause one or more processors of a computing device to: identify one or more mutual capacitance touch locations based on mutual capacitance data generated by a presence-sensitive display of the computing device; identify one or more self-capacitance touch locations based on self-capacitance data generated by the presence-sensitive display; determine a motion between the one or more mutual capacitance touch locations and a corresponding touch location of the one or more self-capacitance touch locations; adjust the one or more mutual capacitance touch locations based on the determined motion to obtain one or more adjusted mutual capacitance touch locations; and utilize the one or more adjusted mutual capacitance touch locations as user input.

The details of one or more examples of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the disclosure will be apparent from the description and drawings, and from the claims.

Drawings

Fig. 1 is a conceptual diagram illustrating an example computing device including a presence-sensitive display in accordance with one or more aspects of the present disclosure.

FIG. 2 is a block diagram illustrating further details of one example of the computing device of FIG. 1 in accordance with one or more techniques of this disclosure.

Fig. 3 is a conceptual diagram illustrating example self-capacitance data generated by a presence-sensitive display of a computing device in accordance with one or more techniques of this disclosure.

Fig. 4 is a conceptual diagram illustrating example mutual capacitance data generated by a presence-sensitive display of a computing device in accordance with one or more techniques of this disclosure.

Fig. 5 is a conceptual diagram illustrating a clipped set of mutual capacitance data in accordance with one or more techniques of this disclosure.

Fig. 6 is a timeline illustrating example operations of a computing device utilizing self-capacitance scanning and mutual capacitance scanning to receive user input in accordance with one or more techniques of this disclosure.

Fig. 7 is a flow diagram illustrating example operations of an example computing device to perform latency compensation in accordance with one or more aspects of the present disclosure.

Fig. 8 is a conceptual diagram illustrating reconstructed self-capacitance data generated based on the self-capacitance data of fig. 3 according to one or more techniques of this disclosure.

Fig. 9 is a conceptual diagram illustrating example capacitive scan data as an input object moves downward on a presence-sensitive display in accordance with one or more techniques of this disclosure.

Fig. 10 is a conceptual diagram illustrating example capacitive scan data as an input object moves upward on a presence-sensitive display in accordance with one or more techniques of this disclosure.

Detailed Description

Fig. 1 is a conceptual diagram illustrating an example computing device 2 that includes a presence-sensitive display 12. Fig. 1 illustrates only one particular example of computing device 2, and many other examples of computing device 2 may be used in other instances. In the example of fig. 1, computing device 2 may be a wearable computing device, a mobile computing device, or any other computing device capable of receiving user input. In other examples, computing device 2 may include a subset of the components shown in fig. 1, or may include additional components not shown in fig. 1.

In the example of fig. 1, computing device 2 may be a mobile phone. However, computing device 2 may also be any other type of computing device, such as a camera device, a tablet computer, a Personal Digital Assistant (PDA), a smart speaker, a laptop computer, a desktop computer, a gaming system, a media player, an e-book reader, a television platform, a car navigation system, or a wearable computing device (e.g., a computerized watch). As shown in fig. 1, computing device 2 includes User Interface Component (UIC) 10, UI module 14, user application module 16, and processor(s) 22.

UIC 10 may act as an input and/or output device for computing device 2. As shown in fig. 1, UIC 10 includes presence-sensitive display 12. UIC 10 may be implemented using various technologies. For example, UIC 10 may act as an input device using a presence-sensitive input screen, such as a capacitive touch screen or a projected capacitive touch screen. UIC 10 may also function as an output (e.g., display) device using any one or more display devices, such as a Liquid Crystal Display (LCD), dot matrix display, Light Emitting Diode (LED) display, Organic Light Emitting Diode (OLED) display, electronic ink, or similar monochrome or color display capable of outputting visual information to a user of computing device 2. In the example of fig. 1, presence-sensitive display 12 may be a presence-sensitive display capable of receiving user input and displaying graphical data.

UIC 10 may detect input (e.g., touch and non-touch input) from a user of computing device 2. For example, presence-sensitive display 12 may detect an indication of an input by detecting one or more gestures from a user (e.g., a user touching, pointing, and/or swiping with a finger or stylus at or near one or more locations of presence-sensitive display 12). UIC 10 may output information to the user in the form of a user interface that may be associated with functionality provided by computing device 2. Such user interfaces may be associated with computing platforms, operating systems, applications, and/or services (e.g., electronic messaging applications, chat applications, internet browser applications, mobile or desktop operating systems, social media applications, electronic games, menus, and other types of applications) executing at computing device 2 or accessible from computing device 2.

UI module 14 manages user interaction with UIC 10 and other components of computing device 2. In other words, UI module 14 may act as an intermediary between various components of computing device 2 to make decisions based on user input detected by UIC 10 and to generate output at UIC 10 in response to the user input. UI module 14 may receive instructions from an application, service, platform, or other module of computing device 2 to cause UIC 10 to output a user interface. UI module 14 may manage inputs received by computing device 2 as a user views and interacts with the user interface presented at UIC 10, and may update the user interface in response to receiving additional instructions from an application, service, platform, or other module of computing device 2 that is processing the user input.

Computing device 2 may include modules 14 and 16. Modules 14 and 16 may perform the described operations using software, hardware, firmware, or a mixture of software, hardware, and/or firmware residing in and/or executing at computing device 2. Computing device 2 may execute modules 14 and 16 with one or more processors, or as one or more virtual machines executing on underlying hardware. Modules 14 and 16 may execute as services or components of an operating system or computing platform, or as one or more executable programs at the application layer of the computing platform. Modules 14 and 16 may otherwise be remote to and accessible from computing device 2, for example, as one or more network services operating in a network cloud.

Processor(s) 22 may implement functionality and/or execute instructions within computing device 2. Examples of processor(s) 22 include, but are not limited to, one or more Digital Signal Processors (DSPs), general purpose microprocessors, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Accordingly, the term "processor," as used herein, may refer to any of the foregoing structures or any other structure suitable for implementation of the techniques described herein.

User application module 16 may execute at computing device 2 to perform any of a variety of operations. Examples of user application modules 16 include, but are not limited to, music applications, photo viewing applications, mapping applications, electronic messaging applications, chat applications, internet browser applications, social media applications, electronic games, menus, and/or other types of applications that may operate based on user input.

In operation, user application module 16 may cause UI module 14 to generate a Graphical User Interface (GUI) for display at UIC 10. UIC 10 may output the graphical user interface based on instructions from user application module 16. As shown in the example of fig. 1, user application module 16 may be a photo viewing application that causes UI module 14 to generate a GUI including a picture of a mountain for display at UIC 10.

The user may desire to provide user input to the user application module 16. For example, in the example of fig. 1 in which user application module 16 may be a photo viewing application, the user may want to zoom in or out on a picture of a mountain displayed at UIC 10. In this example, to zoom out, the user may perform a pinch gesture by placing their index finger at location 18A, placing their thumb at location 18B, and sliding their thumb and index finger together (in the direction shown by the arrows).

Presence-sensitive display 12 may detect the user input. For example, where presence-sensitive display 12 is a capacitive touch panel, presence-sensitive display 12 may detect user input via a mutual capacitance scan triggered based on the results of the self-capacitance scan. To illustrate, presence-sensitive display 12 may perform periodic self-capacitance scans to determine whether any input objects (e.g., fingers, styluses, etc.) are present near presence-sensitive display 12. In response to the self-capacitance scan indicating that at least one input object is in proximity to presence-sensitive display 12, presence-sensitive display 12 may perform a mutual capacitance scan to identify a touch point for each of the at least one input object. Further details of self-capacitance scanning are discussed below with reference to FIG. 3. Further details of mutual capacitance scanning are discussed below with reference to FIG. 4.
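The gating of the mutual capacitance scan on the self-capacitance scan described above might be sketched as the following per-frame policy. This is a hypothetical illustration, not actual controller firmware; `self_scan`, `mutual_scan`, and `threshold` are illustrative stand-ins:

```python
def scan_frame(self_scan, mutual_scan, threshold):
    """One sensing frame of the gating policy sketched above.

    self_scan:   callable returning the r + c self-capacitance profile
    mutual_scan: callable returning the r x c mutual capacitance cells;
                 invoked only when the cheap self scan indicates presence
    """
    profile = self_scan()
    if max(profile) <= threshold:
        return None              # no input object nearby: skip the costly scan
    return mutual_scan()         # at least one object present: full cell scan

# Example: an idle frame performs only the fast, low-power self scan.
idle = scan_frame(lambda: [0, 1, 0, 1], lambda: [[9]], threshold=3)
# idle is None; the mutual scan callable was never invoked
```

Skipping the mutual scan on idle frames is what lets the slower, higher-power scan type run only when it is actually needed.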

While identifying touch points using mutual capacitance scan data may be sufficient when the input object is static (i.e., not moving), such techniques may not be satisfactory for identifying touch points of moving objects. As discussed above, presence-sensitive display 12 may perform latency compensation to predict where an input object is moving (e.g., to account for delays introduced by processing scan data). Performing the delay compensation may involve determining a predicted touch point based on at least two temporally distinct touch points for a particular object (e.g., a first touch point for the object obtained at a first time and a second touch point for the object obtained at a second time after the first time). However, as discussed below, the relatively large amount of time between successive mutual capacitance scans can result in a large time gap between the at least two temporally distinct touch points of a particular object. This large time gap can reduce the effectiveness of the delay compensation.

In accordance with one or more techniques of this disclosure, computing device 2 may perform delay compensation using touch points determined based on mutual capacitance scan data and touch points determined based on self capacitance scan data. For example, computing device 2 may identify one or more mutual capacitance touch locations based on mutual capacitance data generated by presence-sensitive display 12 and one or more self-capacitance touch locations based on self-capacitance data generated by presence-sensitive display 12. Computing device 2 may determine a movement between one or more mutual capacitance touch locations and a corresponding touch location of the one or more self-capacitance touch locations. Based on the determined motion, computing device 2 may adjust one or more mutual capacitance touch locations to obtain one or more adjusted mutual capacitance touch locations (e.g., predicted touch points). Computing device 2 may utilize the one or more adjusted mutual capacitance touch positions as user input. For example, the adjusted mutual capacitance touch position may be provided to the user application module 16.

Computing device 2 may perform one or more actions based on the user input. For example, user application module 16 may cause UI module 14 to generate an updated GUI, modified based on the user input, for display at UIC 10. For instance, in the example of fig. 1 in which user application module 16 may be a photo viewing application and the user performs a pinch gesture to zoom out on the picture of the mountain, user application module 16 may cause UI module 14 to generate a GUI including a zoomed-out representation of the picture of the mountain for display at UIC 10.

Fig. 2 is a block diagram illustrating further details of one example of computing device 2 of fig. 1 in accordance with one or more techniques of this disclosure. As discussed above, computing device 2 may include UIC 10, which includes presence-sensitive display 12, and processor(s) 22. As shown in fig. 2, presence-sensitive display 12 may include electrodes 24, a touch controller 26, a display panel 28, and a display controller 30. As also shown in fig. 2, processor(s) 22 may execute UI module 14, user application module 16, and operating system 36.

Electrodes 24 may form a matrix of row and column electrodes on either side of a dielectric material. Electrodes 24 may be constructed of a transparent conductive material such as Indium Tin Oxide (ITO). In this way, electrodes 24 may be positioned above the display components (e.g., display panel 28) without being visible to a user. The dielectric material may be a glass substrate.

Touch controller 26 may perform one or more operations to sense user input via electrodes 24. For example, touch controller 26 may output voltage signals across the electrodes and sense the resulting changes in capacitance (e.g., due to the presence of a finger or other object on or near presence-sensitive display 12). When one of electrodes 24 is driven by a voltage signal, the intrinsic capacitance of that electrode with respect to other objects (such as a human finger, another electrode, or ground) may be measured. Changes in the ambient environment may have an effect on the intrinsic capacitance of the electrode. Touch controller 26 may output indications of the sensed user input to one or more other components of computing device 2, such as UI module 14.

The display panel 28 may be a display device capable of rendering a graphical user interface. Examples of display panel 28 include, but are not limited to, a Liquid Crystal Display (LCD), a dot matrix display, a Light Emitting Diode (LED) display, an Organic Light Emitting Diode (OLED) display, electronic ink, or similar monochrome or color display capable of outputting visual information to a user of computing device 2.

The display controller 30 may perform one or more operations to manage the operation of the display panel 28. For example, display controller 30 may receive instructions from UI module 14 that cause display controller 30 to control display panel 28 to render a particular graphical user interface.

As discussed above, UI module 14 may act as an intermediary between various components of computing device 2 to make decisions based on user input detected by UIC 10 and to generate output at UIC 10 in response to the user input. As shown in fig. 2, UI module 14 may include a touch driver 32 and a display driver 34. Touch driver 32 may interact with touch controller 26 and operating system 36 to process user input sensed via presence-sensitive display 12. Display driver 34 may interact with display controller 30 and operating system 36 to process output for display at display panel 28, which may vary based on user input received via electrodes 24.

Operating system 36, or components thereof, may manage interactions between applications and users of computing device 2. As shown in the example of FIG. 2, operating system 36 may manage operations between user application modules 16 and users of computing device 2. In some examples, UI module 14 may be considered a component of operating system 36.

As discussed above, presence-sensitive display 12 may detect user input using one or both of self-capacitance scanning and mutual capacitance scanning. In particular, electrodes 24, touch controller 26, and touch driver 32 may collectively operate to generate mutual capacitance data based on a mutual capacitance scan and to generate self-capacitance data based on a self-capacitance scan. Further details of self-capacitance scanning are discussed below with reference to FIG. 3. Further details of mutual capacitance scanning are discussed below with reference to FIG. 4.

Fig. 3 is a conceptual diagram illustrating example self-capacitance data generated by a presence-sensitive display of a computing device in accordance with one or more techniques of this disclosure. The self-capacitance data 300 of fig. 3 is discussed with reference to the computing device 2 of fig. 1 and 2. However, other computing devices may generate the self-capacitance data 300.

To perform a self-capacitance scan (also referred to as a surface capacitance scan), touch controller 26 may drive one of electrodes 24 with a signal and measure the capacitance of the entire electrode (e.g., with respect to ground). When another conductive object approaches the electrode, a capacitor is formed between them, changing the capacitance between the electrode and ground. Touch controller 26 may perform the full scan by driving all of electrodes 24 in each direction (e.g., all rows and then all columns) and measuring their capacitances. Where electrodes 24 include r row electrodes and c column electrodes, the self-capacitance scan produces r + c measurements, which are collectively referred to as self-capacitance data 300.
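The r + c structure of a self-capacitance scan can be illustrated with a toy model (assumed linear response, arbitrary units; not the controller's actual signal chain):

```python
def self_capacitance_profile(contact_grid):
    """Collapse a per-cell contact map into the r + c row and column values a
    self-capacitance scan observes (toy model, arbitrary units)."""
    row_values = [sum(row) for row in contact_grid]
    col_values = [sum(col) for col in zip(*contact_grid)]
    return row_values, col_values

# Two contacts on a 4 x 4 panel (roughly the pinch gesture of fig. 1):
grid = [[0, 0, 0, 0],
        [0, 5, 0, 0],
        [0, 0, 0, 5],
        [0, 0, 0, 0]]
rows, cols = self_capacitance_profile(grid)
# rows == [0, 5, 5, 0], cols == [0, 5, 0, 5]: r + c = 8 measurements,
# versus r * c = 16 for a full mutual capacitance scan of the same panel.
```

The per-axis profiles show where contacts lie along each axis, but not how the row and column activity pairs up, which is the root of the smearing problem discussed below.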

Self-capacitance data 300 of fig. 3 may represent self-capacitance data measured by presence-sensitive display 12 when a user performs the pinch gesture shown in fig. 1. As shown in fig. 3, self-capacitance data 300 includes row capacitance values 302 and column capacitance values 304. The darker cells indicate higher values for the row and column capacitance values 302 and 304. As can be seen in row and column capacitance values 302 and 304, a user may induce higher capacitance values by placing their fingers at locations 18A and 18B.

Fig. 4 is a conceptual diagram illustrating example mutual capacitance data generated by a presence-sensitive display of a computing device in accordance with one or more techniques of this disclosure. The mutual capacitance data 400 of fig. 4 is discussed with reference to the computing device 2 of fig. 1 and 2. However, other computing devices may generate the mutual capacitance data 400.

To perform mutual capacitance scanning, touch controller 26 may utilize the inherent capacitive coupling that exists between row and column electrodes of electrodes 24 at locations (i.e., cells) where they overlap. For example, touch controller 26 may drive individual ones (e.g., rows) of electrodes 24 and measure capacitance on intersecting ones (e.g., columns) of electrodes 24. Touch controller 26 may repeat this process until all cells have been sensed. Where electrodes 24 include r row electrodes and c column electrodes, the mutual capacitance scan produces r × c measurements, which are collectively referred to as mutual capacitance data 400. For mutual capacitance data 400, darker cells indicate higher values.

Self-capacitance sensing may be a fast and efficient method for sensing the presence of a contact near presence-sensitive display 12. However, because each electrode is measured in its entirety, self-capacitance sensing cannot resolve where along an electrode a contact occurs. Even with measurements of all rows and all columns together, self-capacitance sensing suffers from a "smearing" effect when there are multiple contacts. That is, each contact causes a peak in the capacitance of the closest row electrode and the closest column electrode. When there is more than one contact, multiple peaks are sensed on each axis, and combining the axes yields "smeared" ghost contact points in addition to the true contact points (as shown in FIG. 8).
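The smearing effect can be reproduced with toy data: two diagonal contacts create two peaks per axis, and the axis profiles alone cannot say which row peak belongs with which column peak. The peak positions below are illustrative.

```python
import numpy as np

# Peaks induced on each axis by contacts at (row 3, col 2) and (row 12, col 7).
row_caps = np.zeros(16)
col_caps = np.zeros(10)
row_caps[[3, 12]] = 1.0
col_caps[[2, 7]] = 1.0

# Self-capacitance data alone cannot pair row peaks with column peaks, so
# every combination is a candidate contact: two real points plus two ghosts.
row_peaks = np.flatnonzero(row_caps > 0.5)
col_peaks = np.flatnonzero(col_caps > 0.5)
candidates = [(int(i), int(j)) for i in row_peaks for j in col_peaks]
print(candidates)  # [(3, 2), (3, 7), (12, 2), (12, 7)]
```

Of the four candidates, (3, 2) and (12, 7) are real contacts while (3, 7) and (12, 2) are the smeared ghosts.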

Mutual capacitance sensing may be slower and less efficient than self-capacitance sensing because it involves sensing each cell individually. However, mutual capacitance sensing generates a complete "image" of the panel, which may allow touch controller 26 to cleanly separate each contact.

FIG. 5 is a cropped set of mutual capacitance data in accordance with one or more techniques of this disclosure. The mutual capacitance data 500 of fig. 5 may represent the area of mutual capacitance data surrounding a contact (e.g., a finger contact). As shown in fig. 5, bringing an input object, such as a finger, close to the presence-sensitive display changes the capacitance of several cells. As discussed herein, based on these changes in capacitance, touch controller 26 may identify a touch contact using the mutual capacitance data. For each identified touch contact, touch controller 26 may identify a centroid and the covered cells. In the example of fig. 5, touch controller 26 may identify the centroid as centroid 502 and the covered cells as all non-white cells (e.g., cells having capacitance values greater than a threshold).

Fig. 6 is a timeline illustrating example operations of a computing device utilizing both self-capacitance scanning and mutual capacitance scanning to receive user input in accordance with one or more techniques of this disclosure. As shown in fig. 6, touch controller 26 may perform self-capacitance scanning (SS) at some regular intervals (e.g., every 8 milliseconds) to generate self-capacitance data. Touch controller 26 may analyze the self-capacitance data to determine whether any contact can be detected. For example, touch controller 26 may determine that a contact may be detected if at least one of row capacitance values 302 exceeds a threshold value and/or at least one of column capacitance values 304 exceeds a threshold value.

In response to detecting at least one contact in the self-capacitance data, touch controller 26 may perform a mutual capacitance scan (MS) to generate mutual capacitance data. Touch controller 26 may analyze the mutual capacitance data to identify touch points. For example, touch controller 26 may identify a value in the mutual capacitance data as a touch point if the value is greater than a threshold. In some examples, touch controller 26 may utilize an image processing algorithm (e.g., watershed) to segment the mutual capacitance data into regions, where each region has one touch point. As discussed above, touch controller 26 may output an indication of the touch point to one or more other components of computing device 2, such as processor(s) 22.
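The segmentation step can be sketched as thresholding followed by connected-component grouping. This is a simple stand-in for the watershed-style algorithm mentioned above, with an illustrative grid and threshold.

```python
import numpy as np
from collections import deque

def find_touch_points(ms, threshold):
    """Group above-threshold cells into clusters and return each cluster's
    capacitance-weighted centroid (row, col) together with its covered cells."""
    above = ms > threshold
    seen = np.zeros_like(above, dtype=bool)
    touches = []
    for i, j in zip(*np.nonzero(above)):
        if seen[i, j]:
            continue
        cells, queue = [], deque([(int(i), int(j))])
        seen[i, j] = True
        while queue:  # BFS over 4-connected above-threshold neighbors
            y, x = queue.popleft()
            cells.append((y, x))
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < ms.shape[0] and 0 <= nx < ms.shape[1]
                        and above[ny, nx] and not seen[ny, nx]):
                    seen[ny, nx] = True
                    queue.append((ny, nx))
        w = np.array([ms[c] for c in cells])
        centroid = (np.array(cells, dtype=float) * w[:, None]).sum(axis=0) / w.sum()
        touches.append((tuple(centroid), cells))
    return touches

ms = np.zeros((8, 8))
ms[1:4, 1:4] = [[1, 2, 1], [2, 4, 2], [1, 2, 1]]  # one contact around (2, 2)
touches = find_touch_points(ms, threshold=0.5)
print(len(touches))  # 1 cluster, centroid at (2.0, 2.0)
```

Each returned pair corresponds to the centroid and covered-cell area described above for FIG. 5.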

However, in response to not detecting any contact in the self-capacitance data, touch controller 26 may refrain from performing a mutual capacitance scan. In this manner, touch controller 26 may utilize the self-capacitance scan as a preliminary low-resolution trigger to selectively perform the higher-resolution mutual capacitance scan. By performing a preliminary self-capacitance scan, touch controller 26 can eliminate the power and processing costs of a mutual capacitance scan when there is no contact on or near the panel.
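The gating logic amounts to a short loop. The scan-routine interface and the threshold here are hypothetical, not names from an actual driver.

```python
import numpy as np

SELF_THRESHOLD = 0.5  # assumed activation threshold

def scan_cycle(read_self_scan, read_mutual_scan):
    """One frame: run the cheap self-capacitance scan first and pay for the
    full mutual-capacitance scan only when a contact might be present."""
    row_caps, col_caps = read_self_scan()
    if row_caps.max() < SELF_THRESHOLD and col_caps.max() < SELF_THRESHOLD:
        return None  # no contact detected: skip the mutual scan entirely
    return read_mutual_scan()

# Hypothetical stand-ins for the touch controller's scan routines:
idle = scan_cycle(lambda: (np.zeros(16), np.zeros(10)),
                  lambda: np.ones((16, 10)))
print(idle)  # None: the mutual scan was skipped
```

With a contact present (any self-capacitance value above the threshold), the same call returns the full r x c mutual capacitance image instead.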

As shown in fig. 6, in the first two scan cycles, touch controller 26 may detect a contact based on the self-capacitance data. Touch controller 26 therefore performs a subsequent mutual capacitance scan in each of those cycles. The results of the mutual capacitance scans are processed and reported to other components of computing device 2. However, because no contact is detected in the third scan cycle, touch controller 26 omits the mutual capacitance scan.

During user interaction, the motion of the user's finger (e.g., relative to presence-sensitive display 12) may be an arbitrary, high-bandwidth signal. For example, a user's finger may move at ~200 mm/s when scrolling through a document while reading it; when quickly scrolling through a long list of items, the user's finger may move at ~400 mm/s. Similarly, a user's finger may change its direction or dynamics (velocity, acceleration, etc.) extremely quickly. When computing device 2 is unable to sense or respond to these changes quickly enough, the latency becomes perceptible to the user and degrades their experience.

For example, in the example of FIG. 6, the input on presence-sensitive display 12 may be sampled every 8 ms (~120 Hz). If the user's finger drags an object on presence-sensitive display 12 at 200 mm/s, the sampling interval means that the finger will travel 1.6 mm before touch controller 26 observes its new location. In some cases, the additional processing to respond to the sample may take another 24 ms, for a total of 32 ms, or 6.4 mm of travel.
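The arithmetic in this example can be checked directly, using the speeds and delays quoted above:

```python
speed_mm_per_s = 200.0   # finger speed while dragging
sampling_s = 0.008       # 8 ms sampling interval
processing_s = 0.024     # additional processing latency

# Distance the finger travels before the system visibly responds.
gap_mm = speed_mm_per_s * (sampling_s + processing_s)
print(gap_mm)  # 6.4 mm of travel over the 32 ms total latency
```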

This gap between the user's finger and the dragged object may be perceptible and may degrade the user experience. For example, if the user's finger suddenly stops, the object will continue to move as it "catches up" with the finger, or, with some latency-compensation techniques, the object may overshoot the finger (e.g., because computing device 2 cannot react quickly enough to changes in the input). The user may then have to provide further input to correct the overshoot.

Some delay compensation techniques utilize historical sequences of high resolution points reported by the touch sensor (e.g., reported by the touch controller 26 and/or touch driver 32) to estimate motion parameters and perform some form of interpolation or extrapolation. For example, new points may be extrapolated based on the last reported point and estimates of velocity and/or acceleration from previous reports for the last reported point.

Delay compensation techniques that utilize reported points for motion estimation may suffer from one or more drawbacks. As one example, touch controller 26 and/or touch driver 32 may report points at a relatively low frequency (typically 60 or 120 Hz). This means that a delay compensation algorithm must estimate the finger position from data that is at least 8 ms old, and therefore suffers from irregular overshoot/undershoot artifacts when the user's finger suddenly changes direction. As another example, the touch points reported by touch controller 26 and/or touch driver 32 may be optimized for stability and may not be suitable for differentiation. Although each reported touch point is a good estimate of location, their temporal sequence is not a good estimate of motion due to the design constraints of the touch processing algorithms. To estimate motion (e.g., velocity) from touch points, it may be necessary to apply considerable filtering, which introduces further delay. As a result of these drawbacks, many systems do not employ motion-estimation-based delay compensation techniques.

In accordance with one or more techniques of this disclosure, a computing device may utilize various properties of a presence-sensitive display to obtain an estimate of dynamic motion (e.g., its direction and speed) in user input. As one exemplary attribute, the interval between self-capacitance and mutual capacitance scans (e.g., 1 to 3ms) may be significantly shorter than the interval between successive mutual capacitance scans (e.g., 8 to 16 ms). As another example attribute, data from the self-capacitance scan (i.e., self-capacitance data) may be used to reconstruct cell-level data using the mutual capacitance data as a "mask".

At a high level, the self-capacitance scan and the mutual capacitance scan are two snapshots of the user input taken 1 to 3 ms apart (i.e., an effective sampling rate of 300 to 1000 Hz). If the corresponding touch locations can be estimated within each, a high-frequency estimate of the motion of each touch location can be derived. Estimating instantaneous motion using both self-capacitance data and mutual capacitance data may be preferable to estimating it from successive mutual capacitance scans (i.e., at least 8 ms apart), because the former may be more responsive to changes in the user's input dynamics (e.g., a sudden stop). Delay compensation techniques (e.g., extrapolation) using this estimate may thus produce more accurate and responsive results.

As discussed above, self-capacitance data may be used as a trigger to perform a mutual capacitance scan, but may not be used to determine motion or actual touch point location due to smearing. Thus, the self-capacitance data is typically discarded during conventional touch processing.

In accordance with one or more techniques of this disclosure, computing device 2 may determine a motion and/or an actual touch point location based on the self-capacitance data. For example, touch controller 26 may determine reconstructed self-capacitance data based on the self-capacitance data. Touch controller 26 may utilize the mutual capacitance data to mask the reconstructed self-capacitance data to reduce or eliminate smearing contact points. Touch controller 26 may utilize the masked reconstructed self-capacitance data to estimate motion of the touch location identified in the mutual capacitance data. In this manner, computing device 2 may utilize both self-capacitance data and mutual capacitance data to estimate motion of the touch location.

Fig. 7 is a flow diagram illustrating example operations of an example computing device to perform latency compensation in accordance with one or more aspects of the present disclosure. The operation of computing device 2 is described in the context of computing device 2 of fig. 1 and 2.

Computing device 2 may perform a self-capacitance scan to generate self-capacitance data (702). For example, touch controller 26 may perform a self-capacitance scan using electrodes 24 of presence-sensitive display 12 in the manner discussed above with reference to fig. 3. As discussed above, the self-capacitance data may include row and column capacitance values (e.g., row and column capacitance values 302 and 304). Touch controller 26 may perform the self-capacitance scan at a first time t1.

Computing device 2 may perform a mutual capacitance scan to generate mutual capacitance data (704). For example, touch controller 26 may perform a mutual capacitance scan using electrodes 24 of presence-sensitive display 12 in the manner discussed above with reference to fig. 4. As discussed above, the mutual capacitance data may include a capacitance value for each cell of presence-sensitive display 12. Touch controller 26 may perform the mutual capacitance scan at a second time t2.

Computing device 2 may identify one or more mutual capacitance touch locations based on the mutual capacitance data (706). For example, touch controller 26 may analyze the mutual capacitance data to identify clusters of values that exceed a threshold capacitance value. Each of the identified clusters may represent a mutual capacitance touch location. For each respective identified cluster, touch controller 26 may identify a respective estimate of the touch location (e.g., the centroid of the cluster) and a respective area of the touch location (e.g., which cells of presence-sensitive display 12 are covered by the respective touch location). In the example of FIG. 4, touch controller 26 may identify touch locations 402A and 402B as mutual capacitance touch locations.

Computing device 2 may generate reconstructed self-capacitance data based on the self-capacitance data (708). As mentioned above, the self-capacitance data comprises row and column data. To compare this data with the mutual capacitance data (which includes a value for each cell), it may be necessary to establish a value for each cell. The self-capacitance data can be thought of as defining a transportation polytope: it gives the margins (row and column sums) of an unknown matrix. The values estimated for that unknown matrix constitute the reconstructed self-capacitance data.

Although there may be many solutions for this unknown matrix, touch controller 26 may obtain an estimate by distributing the row and column self-capacitance data evenly across their respective lengths and multiplying them at each cell. For example, touch controller 26 may determine a reconstructed self-capacitance value a(i, j) for cell (i, j) from self-capacitance row data r and self-capacitance column data c according to the following equation:

a(i, j) = (r_i / |c|) · (c_j / |r|)

where |r| and |c| denote the numbers of row and column electrodes, respectively.

Note that while the above equation may not satisfy the conditions of a true transportation polytope solution, the numerical distribution may be sufficient in practice.
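Reading |r| and |c| as the electrode counts, the reconstruction is simply an outer product of the evenly distributed margins. A sketch with illustrative margin values:

```python
import numpy as np

def reconstruct_self_capacitance(r, c):
    """a[i, j] = (r[i] / |c|) * (c[j] / |r|): spread each margin evenly
    along its electrode, then multiply the two contributions per cell."""
    return np.outer(r / c.size, c / r.size)

# Margins from a single contact near row 3, column 2 (illustrative values).
r = np.zeros(16)
c = np.zeros(10)
r[2:5] = [1.0, 2.0, 1.0]
c[1:4] = [1.0, 2.0, 1.0]

a = reconstruct_self_capacitance(r, c)
peak = np.unravel_index(np.argmax(a), a.shape)
print(peak)  # (3, 2): the brightest reconstructed cell sits at the contact
```

With multiple contacts, the same outer product also produces the smeared ghost cells described below for FIG. 8.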

Fig. 8 is a conceptual diagram illustrating reconstructed self-capacitance data generated based on the self-capacitance data of fig. 3 according to one or more techniques of this disclosure. As discussed above, the self-capacitance data may be affected by smearing in the presence of multiple touch locations, and this smearing may carry over into the reconstructed self-capacitance data. As shown in FIG. 8, the reconstructed self-capacitance data 800 includes touch locations 802A-802D. Touch locations 802A and 802D may be real touch locations, while touch locations 802B and 802C may be smeared touch locations.

In some examples, touch controller 26 may determine reconstructed self-capacitance data for all cells. In other examples, touch controller 26 may determine reconstructed self-capacitance data for a subset of cells. For example, touch controller 26 may determine reconstructed self-capacitance data for a subset of cells that includes cells covered by a respective area of each mutual capacitance touch location. In this manner, touch controller 26 can remove smeared touch locations 802B and 802C.

Computing device 2 may identify one or more self-capacitance touch locations based on the reconstructed self-capacitance data and the mutual capacitance touch locations (710). Each of the one or more mutual capacitance touch locations may correspond to a touch location of the one or more self-capacitance touch locations (however, due to smearing, some of the one or more self-capacitance touch locations may not correspond to a touch location of the one or more mutual capacitance touch locations). For example, touch controller 26 may analyze the reconstructed self-capacitance data to identify clusters of values that exceed a threshold capacitance value. Each identified cluster may represent a candidate reconstructed self-capacitance touch location, which may be a true touch location or a smeared touch location. For each respective identified cluster, touch controller 26 may identify a respective estimate of the candidate touch location (e.g., the centroid of the cluster) and a respective area of the candidate touch location (e.g., which cells of presence-sensitive display 12 are covered by the respective candidate touch location). In the example of FIG. 8, touch controller 26 may identify touch locations 802A-802D as candidate reconstructed self-capacitance touch locations.

Touch controller 26 may perform one or more actions to remove the smear location. As one example, as discussed above, touch controller 26 may determine reconstructed self-capacitance data for a subset of cells covered by a respective area of each mutual capacitance touch location. As another example, touch controller 26 may identify, for each respective cluster in the mutual capacitance data, a respective cluster in the reconstructed self-capacitance data that has a centroid closest to a centroid of the respective cluster in the mutual capacitance data. The clusters identified in the reconstructed self-capacitance data for the respective clusters in the mutual capacitance data can be considered to represent reconstructed self-capacitance touch locations corresponding to the mutual capacitance touch locations represented by the respective clusters in the mutual capacitance data. In the example of FIG. 8, touch controller 26 can identify the cluster of cells surrounding touch location 802A as corresponding to mutual capacitance touch location 402A and the cluster of cells surrounding touch location 802D as corresponding to mutual capacitance touch location 402B. In this manner, touch controller 26 can identify one or more touch locations in the reconstructed self-capacitance data that correspond to locations in the one or more mutual capacitance touch locations. The touch location identified in the reconstructed self-capacitance data may be referred to as a reconstructed self-capacitance touch location or a self-capacitance touch location.
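The nearest-centroid matching can be sketched as follows; the centroid coordinates, including the two ghost clusters, are illustrative:

```python
import numpy as np

def pair_touch_locations(mutual_centroids, self_centroids):
    """For each mutual-capacitance centroid, keep the closest reconstructed
    self-capacitance centroid; unmatched self clusters are treated as smear."""
    pairs = []
    for m in mutual_centroids:
        d = [np.hypot(m[0] - s[0], m[1] - s[1]) for s in self_centroids]
        pairs.append((m, self_centroids[int(np.argmin(d))]))
    return pairs

mutual = [(3.0, 2.0), (12.0, 7.0)]                           # real contacts
selfc = [(3.2, 2.1), (3.0, 7.0), (12.0, 2.0), (11.8, 6.9)]   # includes two ghosts
pairs = pair_touch_locations(mutual, selfc)
print(pairs)  # [((3.0, 2.0), (3.2, 2.1)), ((12.0, 7.0), (11.8, 6.9))]
```

The ghost clusters at (3.0, 7.0) and (12.0, 2.0) are never selected, because each mutual-capacitance centroid is closer to its true self-capacitance counterpart.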

Computing device 2 may determine a motion between the one or more mutual capacitance touch locations and corresponding touch locations of the one or more self-capacitance touch locations (712). As discussed above, the self-capacitance data and the mutual capacitance data may be captured at times t1 and t2, respectively. The motion between corresponding points in the self-capacitance touch points (obtained from the self-capacitance data captured at time t1) and the mutual capacitance touch points (obtained from the mutual capacitance data captured at time t2) may reveal the motion of an input object (e.g., a user's finger) between the scans.

Fig. 9 is a conceptual diagram illustrating example capacitive scan data as an input object moves downward on a presence-sensitive display in accordance with one or more techniques of this disclosure. Fig. 9 includes mutual capacitance data 902, reconstructed self-capacitance data 904, and difference data 906. Mutual capacitance data 902 may represent mutual capacitance data captured via the techniques discussed above with reference to fig. 4. The reconstructed self-capacitance data 904 may represent reconstructed self-capacitance data captured by the techniques discussed above with reference to fig. 8. The difference data 906 may represent the difference between the reconstructed self-capacitance data 904 and the mutual capacitance data 902.

Fig. 10 is a conceptual diagram illustrating example capacitive scan data as an input object moves upward on a presence-sensitive display in accordance with one or more techniques of this disclosure. Fig. 10 includes mutual capacitance data 1002, reconstructed self-capacitance data 1004, and difference data 1006. Mutual capacitance data 1002 may represent mutual capacitance data captured via the techniques discussed above with reference to fig. 4. The reconstructed self-capacitance data 1004 may represent the reconstructed self-capacitance data captured by the techniques discussed above with reference to fig. 8. The difference data 1006 may represent the difference between the reconstructed self-capacitance data 1004 and the mutual-capacitance data 1002.

As can be seen from the difference data 906 and 1006, the difference data peaks at the leading edge of the motion (the downward edge in difference data 906 and the upward edge in difference data 1006). This is expected, as the input object (e.g., a finger) is moving in that direction and the mutual capacitance data is captured after the self-capacitance data.

In some examples, such as where the absolute values of the mutual capacitance data and the reconstructed self-capacitance data are on different scales, touch controller 26 may rescale the reconstructed self-capacitance data to the signal level of the mutual capacitance data, clamp it to [0, ∞), and compute the difference, for example as follows:

RSS = max(0, RS · (max(MS) / max(RS)))

Difference = MS − RSS

where RSS is the rescaled and clamped reconstructed self-capacitance data, RS is the unscaled reconstructed self-capacitance data, MS is the mutual capacitance data, and Difference is the difference data. The scaling factor shown here (the ratio of the maximum values) is one possible choice for matching the two signal levels.

In some examples, touch controller 26 may perform scaling (e.g., for greater accuracy) on each touch region identified in the mutual capacitance data separately.

Touch controller 26 may use any of a variety of techniques to estimate motion between the reconstructed self-capacitance touch locations and the mutual capacitance touch locations. As one example technique, touch controller 26 may mask the reconstructed self-capacitance data using the mutual capacitance data. For example, touch controller 26 may zero the values of cells in the reconstructed self-capacitance data that correspond to cells in the mutual capacitance data having values less than a threshold. Touch controller 26 may pair each touch location found in the reconstructed self-capacitance data (i.e., each self-capacitance touch location) with a location identified among the mutual capacitance touch locations (e.g., the closest location). Touch controller 26 may determine a vector between the paired locations that represents the direction and magnitude of motion between the reconstructed self-capacitance touch location and the mutual capacitance touch location in the pair.

As another example technique, touch controller 26 may adjust the mutual capacitance touch locations based on the difference data. For example, touch controller 26 may shift each respective mutual capacitance touch location by a weight calculated from the cells in the difference data that correspond to the cells covered by the respective mutual capacitance touch location. In the example of FIG. 9, the difference data 906 would shift the mutual capacitance touch location downward. In the example of FIG. 10, the difference data 1006 would shift the mutual capacitance touch location upward. In other words, touch controller 26 shifts the mutual capacitance touch location in the direction of the motion, with the direction and magnitude of the shift serving as an estimate of the motion.
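One plausible reading of this weighting (an assumption; the text does not give an exact formula) is to move each cluster's capacitance-weighted centroid toward the centroid of the difference data over the same covered cells:

```python
import numpy as np

def motion_adjusted_centroid(ms, diff, cells, gain=1.0):
    """Shift a cluster's capacitance-weighted centroid toward the centroid of
    the difference data over the same cells. The difference data peaks at the
    motion's leading edge, so the shift points along the direction of motion.
    The 'gain' parameter is a hypothetical tuning knob."""
    pts = np.array(cells, dtype=float)
    w_ms = np.array([ms[y, x] for y, x in cells])
    w_df = np.array([diff[y, x] for y, x in cells])
    c_ms = (pts * w_ms[:, None]).sum(axis=0) / w_ms.sum()
    if w_df.sum() <= 0:
        return c_ms  # no usable difference signal: leave the centroid as-is
    c_df = (pts * w_df[:, None]).sum(axis=0) / w_df.sum()
    return c_ms + gain * (c_df - c_ms)

ms = np.zeros((8, 8))
ms[2:5, 2:5] = [[1, 2, 1], [2, 4, 2], [1, 2, 1]]  # contact centered at (3, 3)
diff = np.zeros((8, 8))
diff[4, 2:5] = 1.0                                # peak on the leading (lower) edge
cells = [(y, x) for y in range(2, 5) for x in range(2, 5)]
adjusted = motion_adjusted_centroid(ms, diff, cells, gain=0.5)
print(adjusted)  # centroid row moves from 3.0 to 3.5: downward motion
```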

Computing device 2 may perform delay compensation for one or more mutual capacitance touch locations based on the motion (714). Given the touch location estimate from the mutual capacitance data and the motion estimate, touch controller 26 may perform delay compensation by advancing the mutual capacitance touch location in the direction of the motion. The amount of compensation may depend on the latency target value of the system and the subjective user experience. In some examples, the delay target value may be 8-16 ms.

In some examples, touch controller 26 may extrapolate the touch location to the latency target using the estimated direction and speed of motion. In some examples, touch controller 26 may use higher-order derivatives (as opposed to linear extrapolation), which may result in smoother behavior. In one example, touch controller 26 may extrapolate the position using the estimated velocity and acceleration as follows:

p* = p + vΔt + (1/2)aΔt²

where p* is the compensated touch position, p is the estimated static touch position, v is the estimated velocity, a is the estimated acceleration, and Δt is the latency target.

The compensated touch position for a particular mutual capacitance touch position may represent a prediction of where the input object that caused the particular mutual capacitance touch position will be at a future time. The future time may be represented by a delay target value of the system.
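The extrapolation step is a direct application of the equation above; the position, velocity, acceleration, and latency-target values below are illustrative:

```python
import numpy as np

def compensate(p, v, a, dt):
    """p* = p + v*dt + 0.5*a*dt^2: advance the touch location to where the
    finger is predicted to be one latency target (dt) in the future."""
    return p + v * dt + 0.5 * a * dt ** 2

p = np.array([120.0, 80.0])   # estimated static touch position (px)
v = np.array([0.0, 400.0])    # estimated velocity (px/s), e.g. a fast scroll
a = np.array([0.0, -200.0])   # estimated acceleration (px/s^2), slowing down
dt = 0.016                    # 16 ms latency target

p_star = compensate(p, v, a, dt)
print(p_star)  # y advances by ~6.37 px over the 16 ms target
```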

There may be some inherent noise (mainly electrical noise) in the system that introduces jitter. To account for this noise/jitter, touch controller 26 may apply filtering to these motion estimates (and/or a fairly conservative delay target). As one example, touch controller 26 may constrain the estimate by curve fitting, a Taylor series, or other linear system. As another example, touch controller 26 may process the delay compensated output through a Kalman filter to estimate and remove system noise.
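As a lightweight stand-in for the Kalman filter mentioned above, even an exponential moving average over the per-scan velocity estimates suppresses much of this jitter (the smoothing factor is an assumed tuning value):

```python
import numpy as np

class MotionSmoother:
    """Exponential moving average over per-scan velocity estimates; a simple
    jitter filter, not the Kalman filter a production system might use."""
    def __init__(self, alpha=0.3):
        self.alpha = alpha  # assumed smoothing factor: higher = more responsive
        self.v = None

    def update(self, v_raw):
        v_raw = np.asarray(v_raw, dtype=float)
        if self.v is None:
            self.v = v_raw
        else:
            self.v = self.alpha * v_raw + (1 - self.alpha) * self.v
        return self.v

smoother = MotionSmoother(alpha=0.3)
for noisy_v in ([0.0, 390.0], [0.0, 410.0], [0.0, 405.0], [0.0, 395.0]):
    v = smoother.update(noisy_v)
print(v)  # hovers near [0, 400] despite the per-scan noise
```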

The following numbered examples set forth one or more aspects of the present disclosure.

Example 1. A method, comprising: identifying, by one or more processors of a computing device and based on mutual capacitance data generated by a presence-sensitive display of the computing device, one or more mutual capacitance touch locations; identifying, by the one or more processors and based on self-capacitance data generated by the presence-sensitive display, one or more self-capacitance touch locations, each of the one or more mutual capacitance touch locations corresponding to one of the one or more self-capacitance touch locations; determining, by the one or more processors, a motion between the one or more mutual capacitance touch locations and corresponding touch locations of the one or more self-capacitance touch locations; adjusting, by the one or more processors and based on the determined motion, the one or more mutual capacitance touch locations to obtain one or more adjusted mutual capacitance touch locations; and utilizing, by the one or more processors, the one or more adjusted mutual capacitance touch locations as user input.

Example 2. The method of example 1, wherein identifying the one or more self-capacitance touch locations comprises: identifying the one or more self-capacitance touch locations based on the one or more mutual capacitance touch locations and the self-capacitance data.

Example 3. The method of example 2, wherein identifying the one or more self-capacitance touch locations further comprises: determining reconstructed self-capacitance data based on the self-capacitance data; and identifying, as the one or more self-capacitance touch locations, one or more touch locations in the reconstructed self-capacitance data that correspond to ones of the one or more mutual capacitance touch locations.

Example 4. The method of any of examples 1-3, wherein adjusting a particular mutual capacitance touch location of the one or more mutual capacitance touch locations comprises: predicting a future position of the particular mutual capacitance touch location based on the motion and a delay target value.

Example 5. The method of any of examples 1-4, wherein utilizing the one or more adjusted mutual capacitance touch locations as user input comprises: providing the one or more adjusted mutual capacitance touch locations as user input to an application executing at the computing device.

Example 6. The method of example 5, further comprising: outputting, based on instructions received from the application, a graphical user interface for display at the presence-sensitive display; and outputting, based on the instructions received from the application, an updated graphical user interface modified based on the user input for display at the presence-sensitive display.

Example 7. The method of any of examples 1-6, wherein the presence-sensitive display includes a capacitive touch panel.

Example 8. The method of any of examples 1-7, wherein the one or more processors include a touch controller and an application processor.

Example 9. The method of example 8, wherein utilizing the one or more adjusted mutual capacitance touch locations as user input comprises: outputting, by the touch controller and to the application processor, the one or more adjusted mutual capacitance touch locations.

Example 10. A computing device, comprising: a presence-sensitive display; and one or more processors configured to perform the method of any combination of examples 1-9.

Example 11. A non-transitory computer-readable storage medium storing instructions that, when executed, cause one or more processors of a computing device to perform the method of any combination of examples 1-9.

In one or more examples, the described features may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. The computer-readable medium may include a computer-readable storage medium, which corresponds to a tangible medium such as a data storage medium, or a communication medium including any medium that facilitates transfer of a computer program from one place to another, for example, according to a communication protocol. In this manner, the computer-readable medium may generally correspond to (1) a tangible computer-readable storage medium that is non-transitory, or (2) a communication medium such as a signal or carrier wave. A data storage medium may be any available medium that can be accessed by one or more computers or one or more processors to obtain instructions, code and/or data structures for implementing the techniques described in this disclosure. The computer program product may include a computer-readable medium.

By way of example, and not limitation, such computer-readable storage media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium. For example, if the instructions are transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, Digital Subscriber Line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. It should be understood, however, that computer-readable storage media and data storage media do not include connections, carrier waves, signals, or other transitory media, but refer to non-transitory, tangible storage media. Disk and disc, as used herein, include Compact Disc (CD), laser disc, optical disc, Digital Versatile Disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.

The instructions may be executed by one or more processors, such as one or more Digital Signal Processors (DSPs), general purpose microprocessors, Application Specific Integrated Circuits (ASICs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Thus, the term "processor," as used herein, may refer to any of the above structures or any other structure suitable for implementation of the techniques described herein. Further, in some aspects, the functionality described herein may be provided within dedicated hardware and/or software modules. Furthermore, the techniques may be fully implemented in one or more circuits or logic elements.

The techniques of this disclosure may be implemented in various devices or apparatuses, including a wireless handset, an Integrated Circuit (IC), or a collection of ICs (e.g., a chipset). Various components, modules, or units are described in this disclosure to emphasize functional aspects of devices configured to perform the disclosed techniques, and do not necessarily require implementation by different hardware units. Rather, as described above, the various units may be combined in a hardware unit comprising one or more processors as described above, or provided by a collection of interoperating hardware units, including appropriate software and/or firmware.

Various exemplary embodiments of the disclosure have been described. Any combination of the described systems, operations, or functions is contemplated. These and other examples are within the scope of the following claims.
