Gesture control of data processing apparatus


Abstract (created by A. Mathur, 2018-12-10): A method and system are disclosed, comprising: detecting a user gesture at a location remote from a data processing terminal, and identifying a first application based on the distance of the detected user gesture being within a first range of distances from the data processing terminal, wherein a plurality of applications are associated with different ranges of distances from the data processing terminal. The method and system may also cause performance of one of a plurality of functions of the first application based on the user gesture.

1. An apparatus, comprising:

means for receiving data indicative of a user gesture detected at a location remote from the data processing terminal;

means for identifying a first application based on a distance of the detected user gesture being within a first range of distances from the data processing terminal, wherein a plurality of applications are associated with different ranges of distances from the data processing terminal; and

means for causing execution of one of a plurality of functions of the first application based on the user gesture.

2. The apparatus of claim 1, further comprising:

means for detecting another remote user gesture at a distance within a second range from the data processing terminal;

means for identifying a second application based on the distance being within the second range, the second application being different from the first application; and

means for causing execution of one of a plurality of functions of the second application based on the other user gesture.

3. The apparatus of claim 1, wherein the different distance ranges do not substantially overlap.

4. An apparatus as claimed in any preceding claim, wherein the means for causing performance of the one of the plurality of functions is arranged to identify a type of the detected user gesture and to determine which of a plurality of predetermined functions associated with the identified application corresponds to the identified gesture type, the determined function then being caused to be performed.

5. An apparatus as claimed in any preceding claim, wherein a data transfer function is performed between associated applications in the event that a predetermined gesture moves from the first distance range into a different distance range.

6. An apparatus as claimed in any preceding claim, wherein, in the event that the predetermined gesture corresponds to a sharing function, the data transfer function is performed between the associated application and an application associated with another data processing terminal in the vicinity.

7. The apparatus of any preceding claim, wherein the associations between different ones of the plurality of applications and the different distance ranges are dynamically updated.

8. The apparatus of claim 7, wherein the association is dynamically updated by assigning a most frequently and/or most recently used application to a predetermined distance range of the different distance ranges.

9. The apparatus according to claim 8, wherein the predetermined one of the different distance ranges is a distance range closest to the data processing terminal.

10. The apparatus according to any of the preceding claims, wherein the apparatus is a data processing terminal.

11. The apparatus of claim 10, wherein the apparatus is a wearable data processing terminal.

12. The apparatus of claim 11, wherein the apparatus comprises one of: earphones, headphones, and watches.

13. An apparatus according to claim 11 or claim 12, wherein the means for causing performance of the function is arranged to issue corresponding instructions to a further data terminal in proximity to the wearable data processing terminal.

14. A method, comprising:

receiving data indicative of a user gesture detected at a location remote from the data processing terminal;

identifying a first application based on a distance of the detected user gesture being within a first range of distances from the data processing terminal, wherein a plurality of applications are associated with different ranges of distances from the data processing terminal; and

causing execution of one of a plurality of functions of the first application based on the user gesture.

15. A computer readable medium comprising computer program code stored thereon, the computer readable medium and the computer program code configured, when run on at least one processor, to:

receive data indicative of a user gesture detected at a location remote from the data processing terminal;

identify a first application based on a distance of the detected user gesture being within a first range of distances from the data processing terminal, wherein a plurality of applications are associated with different ranges of distances from the data processing terminal; and

cause execution of one of a plurality of functions of the first application based on the user gesture.

Technical Field

The present disclosure relates to gesture control of data processing apparatus, and in particular, but not exclusively, to wearable data processing apparatus.

Background

Data processing devices with smaller form factors are becoming increasingly popular. For example, wearable devices such as wireless headsets, earbuds, and smart watches are now relatively common. Other examples include devices associated with the so-called Internet of Things (IoT). The smaller form factor of such devices makes it difficult to control different types of functions in the way that, for example, larger touch-screen devices allow.

For example, a Bluetooth earbud may have surface area for only a single physical control button. This limits the types of functions that can be controlled on the earbud itself. The user may have to resort to the associated media player to manually select or control different applications, and the different functions of each, which is cumbersome.

Disclosure of Invention

A first aspect provides an apparatus comprising: means for receiving data indicative of a user gesture detected at a location remote from the data processing terminal; means for identifying a first application based on a distance of the detected user gesture being within a first range of distances from the data processing terminal, wherein a plurality of applications are associated with different ranges of distances from the data processing terminal; and means for causing execution of one of a plurality of functions of the first application based on the user gesture.

The apparatus may further include: means for detecting another remote user gesture at a distance within a second range from the data processing terminal; means for identifying a second application different from the first application based on the distance being within a second range; and means for causing execution of one of a plurality of functions of a second application based on the user gesture.

The different distance ranges may be substantially non-overlapping.

The means for causing performance of one of the plurality of functions may be arranged to identify a type of detected user gesture and to determine which of a plurality of predetermined functions associated with the identified application corresponds to the identified gesture type, the determined function then being caused to be performed.

In the event that the predetermined gesture moves from a first distance range to a different distance range, a data transfer function may be performed between the associated applications.

In the case where the predetermined gesture corresponds to a sharing function, the data transfer function may be executed between the associated application and an application associated with another data processing terminal in the vicinity.

Associations between different ones of the plurality of applications and different distance ranges may be dynamically updated. The association may be dynamically updated by assigning the most frequently and/or most recently used applications to predetermined ones of the different distance ranges. The predetermined distance range among the different distance ranges may be a distance range closest to the data processing terminal.

The apparatus may be a data processing terminal. For example, the apparatus may be a wearable data processing terminal. The apparatus may comprise one of an earphone, a headphone, and a watch. For example, where the apparatus is an earphone or headphone, it may also be configured to issue an audio notification to confirm the function and/or parameters associated with the function.

The means for causing performance of the function may be arranged to issue corresponding instructions to a further data terminal in proximity to the wearable data processing terminal.

The means for detecting a user gesture may be implemented by capacitive sensing, using one or more capacitive sensors of the data processing terminal. In some cases, an array of sensors may be used.

In another aspect, a method is provided, the method comprising: receiving data indicative of a user gesture detected at a location remote from the data processing terminal; identifying a first application based on a distance of the detected user gesture being within a first range of distances from the data processing terminal, wherein a plurality of applications are associated with different ranges of distances from the data processing terminal; and causing execution of one of a plurality of functions of the first application based on the user gesture. The preferred features of the first aspect may also be applied to this aspect.

Another aspect provides a computer readable medium comprising computer program code stored thereon, the computer readable medium and the computer program code configured, when run on at least one processor, to: receive data indicative of a user gesture detected at a location remote from the data processing terminal; identify a first application based on a distance of the detected user gesture being within a first distance range from the data processing terminal, wherein a plurality of applications are associated with different distance ranges from the data processing terminal; and cause execution of one of a plurality of functions of the first application based on the user gesture. The preferred features of the first aspect may also be applied to this aspect.

In another aspect, a non-transitory computer-readable medium having computer-readable code stored thereon is provided, which when executed by at least one processor causes the at least one processor to perform a method comprising: receiving data indicative of a detected user gesture at a location remote from the data processing terminal; identifying a first application based on a distance of the detected user gesture being within a first distance range from the data processing terminal, wherein a plurality of applications are associated with different distance ranges from the data processing terminal; and causing execution of one of a plurality of functions of the first application based on the user gesture. The preferred features of the first aspect may also be applied to this aspect.

Another aspect provides an apparatus having at least one processor and at least one memory having computer-readable code stored thereon, the computer-readable code when executed controlling the at least one processor to: receive data indicative of a user gesture detected at a location remote from the data processing terminal; identify a first application based on a distance of the detected user gesture being within a first distance range from the data processing terminal, wherein a plurality of applications are associated with different distance ranges from the data processing terminal; and cause execution of one of a plurality of functions of the first application based on the user gesture. The preferred features of the first aspect may also be applied to this aspect.

Drawings

Embodiments will now be described, by way of non-limiting example, with reference to the accompanying drawings, in which:

FIG. 1 is a schematic diagram of a user gesture, remote from a first apparatus and within one of a plurality of ranges of the apparatus, for controlling a function, in accordance with some embodiments;

FIG. 2 is a schematic diagram of a user gesture, remote from a second apparatus and within one of a plurality of ranges of the apparatus, for controlling a function, in accordance with some embodiments;

FIG. 3 is a schematic diagram of hardware components of the apparatus of FIG. 1 or FIG. 2, according to some embodiments;

FIG. 4 is a schematic diagram of functional components of the apparatus of FIG. 1 or FIG. 2, according to some embodiments;

FIG. 5 is an example of the mapping database shown in FIG. 4 in accordance with some embodiments;

FIG. 6 is a flow diagram of operations performed at the apparatus of FIG. 1 or FIG. 2, in accordance with some embodiments;

FIG. 7 is a flowchart of operations performed at the apparatus of FIG. 1 or 2 for sharing data between applications, in accordance with some embodiments;

FIG. 8 is a schematic diagram of a first apparatus, representing a gesture detected at the first apparatus to perform the sharing operation of the method of FIG. 7;

FIG. 9 is a flowchart of further operations performed at the apparatus of FIG. 1 or FIG. 2 for sharing data between an application of a first apparatus and another apparatus, in accordance with some embodiments; and

FIG. 10 is a schematic diagram of a first apparatus and another apparatus, representing a gesture detected at both apparatuses to perform the sharing operation of the method of FIG. 9.

Detailed Description

Embodiments herein relate to a method and system for gesture control of a data processing device or terminal, which may be a wearable data processing terminal or indeed any other type of data processing terminal, typically, but not exclusively, one that is portable and/or wireless.

For example, embodiments may relate to methods and systems for gesture control of wearable data processing terminals, such as earbuds, earphones, headphones, or other forms of wearable audio terminal that include speakers for placement near one or both of the user's ears. For example, embodiments described later relate to wireless earbuds that may communicate with other terminals associated with, for example, a media player, smartphone, or tablet computer. The communication method may be wireless, for example using Bluetooth or a similar wireless protocol.

In other embodiments, the wearable data processing terminal may be a limb-worn device such as a smart watch. Similarly, a smart watch may communicate with other terminals such as a media player, smartphone, or tablet computer. The communication method may be wireless, for example using Bluetooth or a similar wireless protocol.

The use and popularity of such wearable data processing terminals continues to grow. They provide a convenient way of accessing one or more applications and functions associated with such applications, particularly in the case of a user traveling or participating in a sporting activity, for example.

Such applications may include one or more of a music or other audio playback application, a health monitoring application, a voice call application, a text or multimedia communication application, a voice recognition application, and a podcast application.

For the avoidance of doubt, in the context of the present disclosure, an application includes any computer program or type of computer program that provides a distinct set of functionality, such as a music application, a health monitoring application, a voice telephony application, a text or multimedia messaging application, a voice recognition application, a podcast application, and so forth. Accordingly, the term may be considered broader than referring to a particular software application. For example, in some embodiments, there may be two different software applications that provide music, but they may have common music playing functionality, such as enable, disable, play, pause, stop, rewind, forward, next track, previous track, volume up, volume down, and so forth.

In some embodiments, the wearable device may include a processor and a memory, the memory providing one or more applications, such as the applications mentioned above. In other embodiments, the wearable device may wirelessly communicate with another data processing device (such as a smartphone or tablet) that provides the above-described applications and functions, the output of which is relayed back to the wearable device. In this case, the wearable device is in signal communication with the other data processing device.

A problem with certain data processing terminals, particularly but not exclusively wearable terminals, is that their form factor is small. This limits the ways in which a user can interact with the data processing terminal, or with another data processing device in signal communication with it. For example, there may be limited space for buttons or switches with which to input commands for the various types of functions that may be associated with different applications. Given the size of such data processing terminals, and the additional expense, it is often not feasible to provide a touch screen on them.

Furthermore, it is difficult to switch between different applications. For example, a user listening to music may wish to interact with a health monitoring application to measure their current heart rate. Typically, this requires the user to take out their phone or tablet computer, close the music application or minimize it to the background, and manually open the health monitoring application.

Using voice commands to interact with such data processing terminals is disadvantageous due to background noise and interference, and because speaking commands aloud may disturb others.

Accordingly, embodiments herein relate to the detection and interpretation of physical user gestures made remotely from a data processing terminal; i.e. a gesture that does not touch the data processing terminal.

Such gestures may be detected using capacitive coupling, which is a known technique. In some embodiments, a single capacitive sensor may be provided in or on the data processing terminal. In other embodiments, an array comprising a plurality of capacitive sensors may be provided for more accurately determining the spatial position of a user gesture relative to the data processing terminal.
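As an illustration of how a raw capacitance reading might be converted into a distance estimate, the following minimal sketch interpolates over a per-device calibration table. The table values, the read-out in picofarads, and the simple monotonic capacitance-to-distance relationship are all assumptions for illustration; a real terminal would calibrate per device and possibly per environment.

```python
# Minimal sketch: capacitance-to-distance via a calibration lookup table.
# (capacitance_pF, distance_cm) pairs; capacitance falls as the hand recedes.
CALIBRATION = [(12.0, 0.5), (8.0, 2.0), (5.0, 4.0), (3.0, 6.0)]

def estimate_distance_cm(capacitance_pf):
    """Piecewise-linear interpolation from capacitance to distance."""
    points = sorted(CALIBRATION, reverse=True)  # highest capacitance = closest
    if capacitance_pf >= points[0][0]:
        return points[0][1]
    for (c_hi, d_near), (c_lo, d_far) in zip(points, points[1:]):
        if c_lo <= capacitance_pf <= c_hi:
            t = (c_hi - capacitance_pf) / (c_hi - c_lo)
            return d_near + t * (d_far - d_near)
    return points[-1][1]  # beyond the reliable sensing range

print(estimate_distance_cm(6.5))  # 3.0 -> i.e. a middle distance range
```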

In general, embodiments relate to assigning or associating applications with respective distance ranges relative to a data processing terminal. Typically, the distance ranges are non-overlapping, such that a first range (e.g., the range closest to the data processing terminal) is associated with a first application and another range (e.g., beyond the first range) is associated with a different application. In some embodiments the ranges may overlap, as will be mentioned briefly later.

Particular gestures made within each range are also associated with corresponding functions of those applications, and thus a distinction may be made between a first gesture made within a first range, a second gesture made within the first range, a first gesture made within a second range, a second gesture made within the second range, and so on. This means that the user can control a number of different applications, each with a number of different functions, based on the gesture they make and its position relative to the data processing terminal. Furthermore, the user does not necessarily need to switch manually between different applications. The method and system provide a more intuitive way to interact with applications.
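The two-step selection just described can be summarized in a short sketch: the distance of a detected gesture picks the application, and the gesture type then picks the function. The range bounds and application names below are illustrative assumptions, not the mapping of any particular embodiment.

```python
# Minimal sketch: the gesture's distance selects the application.
def identify_application(distance_cm, ranges):
    """ranges: list of (lower_cm, upper_cm, application), non-overlapping."""
    for lower, upper, app in ranges:
        if lower <= distance_cm < upper:
            return app
    return None  # gesture made outside every configured range

ranges = [(0.1, 2.0, "music"), (2.0, 4.0, "voice_call"), (4.0, 6.0, "health")]
assert identify_application(1.0, ranges) == "music"       # first range
assert identify_application(3.0, ranges) == "voice_call"  # second range
assert identify_application(9.0, ranges) is None          # out of range
```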

In some embodiments, the association between distance ranges and applications is predefined; it may be factory set, set by the user, and/or updated by the user.

The association may be fixed or may change dynamically.

For example, the association may be dynamically updated such that the particular application that was most recently used is associated with a particular range of distances (e.g., the range of distances closest to the data processing terminal). In other embodiments, a particular application that is used most frequently, for example, within a predetermined time range, may be associated with a particular distance range. Other rules may be applied.
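One such rule, reassigning the most recently used application to the closest range, might look like the following sketch. The list layout, with ranges ordered closest-first, is an assumption for illustration.

```python
# Hedged sketch of a dynamic-update rule: promote the most recently used
# application to the range closest to the terminal, keeping all bounds.
def promote_most_recent(ranges, last_used_app):
    """ranges: list of (lower_cm, upper_cm, application), closest first."""
    apps = [app for _, _, app in ranges]
    if last_used_app in apps and apps[0] != last_used_app:
        apps.remove(last_used_app)
        apps.insert(0, last_used_app)          # move to the closest range
    return [(lo, hi, app) for (lo, hi, _), app in zip(ranges, apps)]

ranges = [(0.1, 2.0, "music"), (2.0, 4.0, "voice_call"), (4.0, 6.0, "health")]
print(promote_most_recent(ranges, "health"))
# [(0.1, 2.0, 'health'), (2.0, 4.0, 'music'), (4.0, 6.0, 'voice_call')]
```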

In some embodiments, where the data processing terminal emits audio, for example if it is an earbud or a set of headphones, a detected gesture may trigger an audio confirmation of the currently selected associated function (e.g., "play audio" or "monitor heart rate"). In some embodiments, confirmation of which range a gesture is currently within may be provided. For example, if the user's hand is within the second range, an audio confirmation of the application, such as "music player", may be generated before any particular gesture is detected. This may prevent accidental control of the wrong application, which may be particularly useful if dynamic updating is used.

Referring to fig. 1, for example, there is shown a wireless earbud headphone 10 comprising a body 20 and a flexible tip 30 for insertion into a human ear. The body 20 houses a system, to be described below, that includes a Radio Frequency (RF) transceiver for communicating with an associated media player, smartphone, or tablet computer. For example, the transceiver may be a Bluetooth transceiver. The earbud headphone 10 also includes, within the body 20, a capacitive sensor 40 for sensing user gestures near the sensor; the capacitive sensor 40 forms part of the system. It will be appreciated that the sensor will generate a signal indicative of the distance of the user's gesture from the sensor, and thus from the earbud headphone 10, and also indicative of the gesture type.

For example, the gesture type may be a flick gesture, comprising a sharp movement of a finger in the air. Another gesture type may be a swipe gesture, comprising a smooth movement in the air; a distinction can be made between horizontal and vertical swipes. Other gesture types may include, for example, one or more of a tap, arc, circle, or pointing gesture. Gestures made with different numbers of fingers, or with other parts of the body, may also be distinguished from one another. For example, a flick gesture made with two fingers may be distinguished from a flick gesture made with one finger.
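One simple way to distinguish such gesture types from sensor data is a heuristic over a stream of position samples: a flick is short and fast, a swipe is longer and smoother, and horizontal versus vertical is decided by the dominant axis. The sample format and thresholds below are assumptions for illustration; a real implementation might instead use a trained classifier.

```python
# Illustrative heuristic gesture classifier over (x_cm, y_cm) samples
# captured at a fixed sample rate (thresholds are assumed values).
def classify_gesture(samples, sample_rate_hz=50.0):
    dx = samples[-1][0] - samples[0][0]
    dy = samples[-1][1] - samples[0][1]
    duration_s = (len(samples) - 1) / sample_rate_hz
    speed = (dx ** 2 + dy ** 2) ** 0.5 / max(duration_s, 1e-6)
    if duration_s < 0.15 and speed > 20.0:  # short, sharp movement
        return "flick"
    if abs(dx) >= abs(dy):                  # dominant axis decides
        return "horizontal_swipe"
    return "vertical_swipe"

# A smooth 4 cm movement along x over ~0.5 s reads as a horizontal swipe.
samples = [(0.16 * i, 0.0) for i in range(26)]
print(classify_gesture(samples))  # horizontal_swipe
```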

More accurate gesture recognition may be achieved using multiple capacitive sensors, which may be arranged in or on the earbud headphone 10 in an array. Such an array can provide volumetric sensing of both distance and gesture type.

In the illustrated example, the user's hand 50 is shown near the earbud headphone 10, but at a distance from it. The user's hand 50 is making a gesture.

According to embodiments herein, the detected distance from the earbud headphone 10 determines which application the gesture relates to, based on one or more ranges of distances (hereinafter "ranges"). A first range is indicated by reference numeral 60 and defines a first region, which may be omnidirectional around the earbud headphone 10 or may be segmented around it. A second range is indicated by reference numeral 70 and defines a non-overlapping second region, which again may be omnidirectional or segmented. The gesture 50 is shown in the first range 60, and thus the system of the earbud headphone 10 will identify that the gesture relates to a first application. The gesture type will determine the function controlled. A gesture made in the second range 70 will be identified as relating to another application, and the gesture type will again determine the controlled function of that application.

A third range 80 is shown merely to indicate that any number of regions may be provided, at least within the distance that can be reliably sensed by the capacitive sensor 40.

In some embodiments, the sensor 40 senses gestures made in a limited volumetric space (i.e., not omnidirectionally). For example, the sensor 40 may be configured to sense only gestures made within a predetermined conical volume extending outwardly from the body 20 and away from the tip 30. Alternatively, the sensor 40 may be configured to sense gestures only substantially along a particular axis, or within segmented portions of the surrounding region. For example, it may be undesirable to sense the region directly below the earbud headphone 10, where the user's body will be.

Fig. 2 shows another embodiment in the form of a smart watch 85. The smart watch 85 includes a body 90, which houses the system of the embodiment, and may include a crown 92 housing a capacitive sensor. In some embodiments the crown 92 may be omitted, in which case the sensor is housed within the body 90. In a manner similar to the embodiment of fig. 1, capacitive sensing is employed to detect a gesture made within two or more respective regions 94, 96, 98, to determine the application to which the gesture relates, and to determine the function performed by the gesture. Here, a form of virtual crown may be provided which enables setting of, for example, the time, date, and stopwatch by detecting gestures in different regions. For example, a twist gesture made in the first region 94 may adjust the time using a time application, a twist gesture made in the second region 96 may adjust the date using a date application, a tap gesture made in the third region 98 may start a timer using a stopwatch application, and so on.

Fig. 3 is a schematic diagram of components of the earbud headphone 10 or the smart watch 85 shown in fig. 1 and 2, respectively. For ease of explanation, we will assume these components belong to the earbud headphone 10, but it should be understood that the following applies equally to the smart watch 85.

The earbud headphone 10 can have a processor 100, a memory 104 closely coupled to the processor and comprising a RAM 102 and a ROM 103, an audio output 108, and a network interface 110. In the case of the smart watch 85, for example, a display and one or more hardware keys may be used in place of the audio output 108. The earbud headphone 10 can include one or more network interfaces 110 for connecting to a network, for example using Bluetooth or a similar technology. The processor 100 is connected to each of the other components in order to control their operation.

The memory 104 may comprise non-volatile memory, a Hard Disk Drive (HDD), or a Solid State Drive (SSD). The ROM 103 of the memory 104 stores, among other things, an operating system 112 and may store software applications 114. The RAM 102 of the memory 104 may be used by the processor 100 for temporarily storing data. The operating system 112 may contain code which, when executed by the processor 100, implements software components of the earbud headphone 10.

The processor 100 may take any suitable form. For example, it may be a microcontroller, a plurality of microcontrollers, a processor, or a plurality of processors, and it may include processor circuitry.

In some embodiments, the earbud headphone 10 may also be associated with external software applications. These may be applications stored on a remote device 120 and may run partially or exclusively on the remote device. In some cases, these applications may be referred to as cloud-hosted applications. The earbud headphone 10 can communicate with the remote device 120 to utilize the software applications stored there.

For example, the earbud headphone 10 may send the remote device 120 a signal corresponding to a particular function of an application stored thereon. For example, a gesture indicating a volume increase or decrease may cause the earbud headphone 10 to send the remote device 120 a signal corresponding to the associated volume up or down function. The remote device 120 is configured to decode or interpret the signal and perform the volume up or down function locally. The resulting audio can then be relayed to the earbud headphone 10 at the appropriate volume. In other cases, no relaying is needed, for example when controlling a non-audio function such as opening a health application on the remote device 120.
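The relay can be pictured as a small command message travelling from the earbud to the remote device, which decodes it and applies the function locally. The JSON message format and function names below are illustrative assumptions; the actual transport would typically be a Bluetooth link.

```python
# Hedged sketch of earbud-to-remote-device command relaying.
import json

def encode_command(application, function):
    """Earbud side: serialise a gesture-derived command for the remote device."""
    return json.dumps({"app": application, "fn": function}).encode("utf-8")

def handle_command(payload, device_state):
    """Remote-device side: decode the command and apply it locally."""
    cmd = json.loads(payload.decode("utf-8"))
    if cmd["app"] == "music" and cmd["fn"] == "volume_up":
        device_state["volume"] = min(device_state["volume"] + 1, 10)
    elif cmd["app"] == "music" and cmd["fn"] == "volume_down":
        device_state["volume"] = max(device_state["volume"] - 1, 0)
    return device_state

state = {"volume": 5}
print(handle_command(encode_command("music", "volume_up"), state))
# {'volume': 6} -- the resulting audio is then relayed back to the earbud
```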

Fig. 4 illustrates example functional elements of a data processing terminal 130 according to some embodiments. The data processing terminal 130 may be, for example, an ear bud headphone 10 or a smart watch 85 as shown in fig. 1 and 2, respectively.

The data processing terminal 130 includes a capacitive proximity sensor 132, a proximity controller 134, a mapping database 136, an interaction layer 138, and an application stack 140 that includes a plurality of applications, such as two or more of a music playing application, a health monitoring application, a voice call application, a text or multimedia communication application, a voice recognition application, a podcast application, and the like.

These elements may be provided by software, firmware, hardware, or any combination thereof. For example, the proximity controller 134 and the interaction layer 138 may include the software application 114 stored on the memory 104 shown in FIG. 3.

Reference numeral 150 indicates the presence of a user's hand, remote from the data processing terminal 130, which, when in use, can be selectively positioned within one of the three spatial regions R1, R2 and R3, represented by respective ranges of distances.

The capacitive proximity sensor 132 may be any suitable sensor as described above.

The proximity controller 134 may be configured to control the allocation or association of three applications in the application stack 140 to the respective spatial regions R1, R2, and R3. These allocations are stored in the mapping database 136, where they may be updated from time to time; the mapping database 136 is illustrated schematically in fig. 5.

Referring to FIG. 5, the mapping database 136 may include, or represent in any suitable form, an allocation table that stores a respective application for each range. In the illustrated example, the range closest to the data processing terminal 130 (set to between 0.1 and 2 cm) is labeled R1, the next range (between 2 and 4 cm) is labeled R2, and the next range (between 4 and 6 cm) is labeled R3. R1 is assigned to a music application, R2 to a voice call (i.e., telephone) application, and R3 to a health application.

For each application, a plurality of gestures and their corresponding functions for that application are also stored.

For example, for a music application, the gestures "tap", "vertical swipe" and "horizontal swipe" are assigned to "enable/play/pause", "volume up/down" and "next/previous track", respectively.

For example, for a voice call application, the gestures "tap," "vertically swipe," and "horizontally swipe" are assigned to "enable/answer/end call," "volume up/down," and "next/previous contact," respectively.

For example, for the health application, the gestures "tap," "vertical swipe," and "horizontal swipe" are assigned to "enable/disable," "activity selector," and "date selector," respectively.
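In code, the allocation table of FIG. 5 could be held as a simple nested structure such as the sketch below. The range bounds and gesture-function assignments follow the example described above; the key names are illustrative, and the actual storage form of the mapping database 136 is left open.

```python
# One possible in-memory form of the mapping database 136 of FIG. 5.
MAPPING_DB = {
    "R1": {
        "bounds_cm": (0.1, 2.0),
        "application": "music",
        "gestures": {
            "tap": "enable/play/pause",
            "vertical_swipe": "volume up/down",
            "horizontal_swipe": "next/previous track",
        },
    },
    "R2": {
        "bounds_cm": (2.0, 4.0),
        "application": "voice call",
        "gestures": {
            "tap": "enable/answer/end call",
            "vertical_swipe": "volume up/down",
            "horizontal_swipe": "next/previous contact",
        },
    },
    "R3": {
        "bounds_cm": (4.0, 6.0),
        "application": "health",
        "gestures": {
            "tap": "enable/disable",
            "vertical_swipe": "activity selector",
            "horizontal_swipe": "date selector",
        },
    },
}
```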

It should be understood that a greater or lesser number of applications, gestures, and/or functions may be represented in mapping database 136. It should also be understood that different gestures may be represented. For example, a gesture for determining a battery level of the data processing terminal may be provided.

The allocation shown in fig. 5 may be factory set. The allocation may be fixed or may be updatable. For example, the data processing terminal 130 may be provided with associated setup software, which may be provided on disk, downloadable from a website, or even stored in the memory 104 for plug-and-play operation. The setup software may allow users to make these assignments and/or update them manually. As will be discussed later, the allocations may also be updated dynamically over time. The ranges themselves may likewise be controlled by the user or adjusted dynamically, for example based on environmental factors.

Returning to FIG. 4, the interaction layer 138 operates in response to detecting an object (e.g., the user's hand 150) in proximity and within one of the ranges R1, R2, or R3. The interaction layer 138 queries the mapping database 136 to identify the application assigned to the range, and the function assigned to the gesture. In the illustrated example, the second range R2 is identified, and thus the voice call application is identified. Having made this identification, the interaction layer 138 causes performance of the function associated with the gesture in the voice call application (App2) of the application stack 140, for example answering a call in response to a tap gesture.

At a subsequent stage, for example, the user may make an upward swipe gesture in the same range R2. In response, the interaction layer 138 will cause the volume of the voice call to increase.

For example, at a later stage, while the call is in progress, the user may move their hand into the third range R3. This may result in an audio confirmation (e.g., if the data processing terminal 130 is an audio device) by playing an audio clip such as "health" to inform the user of the change. The user may then make a flick gesture in the third range R3. In response, the interaction layer 138 will cause the health application to be enabled without interrupting the voice call. At a later stage, for example, the user may make a vertical swipe in the third range R3 and the interaction layer 138 will scroll through different activity types, each of which, in the case of an audio device, may be briefly announced as an audio confirmation, e.g., "walk".
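The sequence just described can be sketched as a small event loop in the interaction layer: entering a new range announces the application, and a recognized gesture triggers the mapped function. The compact database and the announce/perform callbacks below are illustrative assumptions standing in for real audio output and application calls.

```python
# Minimal sketch of the interaction layer 138 handling hover and gesture
# events of the form (distance_cm, gesture-or-None).
DB = {
    "R2": {"bounds_cm": (2.0, 4.0), "application": "voice call",
           "gestures": {"tap": "answer/end call",
                        "vertical_swipe": "volume up/down"}},
    "R3": {"bounds_cm": (4.0, 6.0), "application": "health",
           "gestures": {"tap": "enable/disable",
                        "vertical_swipe": "activity selector"}},
}

def find_range(distance_cm, db):
    for name, entry in db.items():
        lo, hi = entry["bounds_cm"]
        if lo <= distance_cm < hi:
            return name, entry
    return None, None

def interaction_layer(events, db, announce=print, perform=print):
    current = None
    for distance_cm, gesture in events:
        name, entry = find_range(distance_cm, db)
        if entry is None:
            continue
        if name != current:
            current = name
            announce(f"[audio] {entry['application']}")  # confirm new range
        if gesture in entry["gestures"]:
            perform(f"{entry['application']}: {entry['gestures'][gesture]}")

# Hover into R2, tap to answer, then move to R3 and tap to enable health.
interaction_layer([(3.0, None), (3.0, "tap"), (5.0, None), (5.0, "tap")], DB)
```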

Thus, applications that are hidden or disabled may be enabled without physical interaction.

In some embodiments, certain application functions may not be allowed, or may be modified, depending on the function currently running. For example, if a call is in progress, a gesture that plays a music track may not be allowed. Alternatively, the function may be modified according to the ongoing function, for example by queuing a music track ready to play once the call ends.
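A guard of this kind might be expressed as a small check before execution, as in the following sketch; the state flag and the deferral queue are assumptions for illustration.

```python
# Hedged sketch: disallow or defer a function that conflicts with the
# function currently running (here, playing music during a call).
def execute_or_defer(function, state, deferred):
    if function == "play_track" and state.get("call_in_progress"):
        deferred.append(function)  # ready to play once the call ends
        return "deferred"
    return f"executed {function}"

state, deferred = {"call_in_progress": True}, []
print(execute_or_defer("play_track", state, deferred))  # deferred
state["call_in_progress"] = False
print(deferred)  # ['play_track'] -- may now be executed
```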

In some embodiments, the correlations between ranges and applications, and/or between gestures and functions, may be stored elsewhere than in the mapping database 136. For example, they may be stored in a separate database, in the interaction layer 138, or in a remote data terminal or server. The arrangement of fig. 4 is merely illustrative.

In some embodiments, one or more of the functional modules shown in the data processing terminal 130 (other than the capacitive sensor 132) may be provided in a separate data processing terminal such as a smartphone or tablet computer. In this case, the separate data processing terminal may receive the sense signals from the capacitive sensor 132 and perform the steps outlined below, although for ease of explanation we will assume hereinafter that they are performed in the data processing terminal 130 itself.

Fig. 6 is a flowchart illustrating example processing operations that may be performed by the data processing terminal 130. Certain operations may be omitted or replaced with other operations.

The first operation 6.1 comprises receiving data indicative of a detected user gesture at a first location remote from the data processing terminal 130.

Another operation 6.2 includes identifying an application based on the distance of the detected user gesture being within a particular range.

Another operation 6.3 includes causing execution of one of a plurality of functions of the identified application based on the user gesture.

Fig. 7 is a flowchart illustrating example processing operations that may be performed by the data processing terminal 130 in another embodiment.

A first operation 7.1 comprises receiving data indicative of a detected user gesture spanning two distance ranges.

A second operation 7.2 comprises identifying two applications based on the two ranges.

A third operation 7.3 comprises causing a data transfer from one application to another application.

In this embodiment, a predetermined gesture that spans two, or possibly more, distance ranges may be interpreted as causing data sharing between the two corresponding applications, where available.
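Detecting such a cross-range gesture reduces to comparing the applications at the gesture's start and end points, as in this sketch; the range bounds, application names, and the notion of a "current item" to transfer are illustrative assumptions.

```python
# Illustrative detection of a share gesture spanning two distance ranges.
def application_for(distance_cm, ranges):
    for lo, hi, app in ranges:
        if lo <= distance_cm < hi:
            return app
    return None

def handle_gesture(start_cm, end_cm, ranges):
    src = application_for(start_cm, ranges)
    dst = application_for(end_cm, ranges)
    if src and dst and src != dst:
        return f"transfer current item from {src} to {dst}"
    return "ordinary single-range gesture"

ranges = [(0.1, 2.0, "music"), (2.0, 4.0, "voice_call")]
print(handle_gesture(0.5, 3.0, ranges))
# transfer current item from music to voice_call (e.g. track as ringtone)
```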

For example, fig. 8 shows the earbud headphone 10 of fig. 1 with the two ranges 60, 70 and a user gesture 50, which starts in the first (closest) range and moves in a lateral sliding motion into the second range. A particular "share" gesture may correspond to a share, copy, or move function of a particular application. A "share" gesture may require a particular start gesture and a particular end gesture, to prevent the share function from being triggered accidentally by an ordinary gesture that strays into an adjacent range.

For example, the first application associated with the first scope 60 may be a music application and the second application associated with the second scope 70 may be a voice call application. In this regard, the sharing gesture may cause a currently playing or selected music track in the first application to be shared with the second application, thereby using the music track as a ringtone for the second application.

In another example, the first application may be a multimedia application and the second application may be a text messaging application. The sharing gesture may cause the currently selected multimedia entity to be entered into the text message.

In another example, the first application may be a network settings application and the second application may be an application that requires network settings (e.g., proxy settings). The sharing gesture may cause a network setting to be shared between the two applications.

Many other examples are envisaged, such as simple file transfer between applications.

In each case, the sharing gesture may be associated with a share, copy, or move function of the first application and a share, paste, or download function of the second application.

Fig. 9 is a flowchart illustrating example processing operations that may be performed by the data processing terminal 130 in another embodiment.

The first operation 9.1 comprises receiving data indicative of a detected user gesture at a first location remote from the data processing terminal 130.

A second operation 9.2 comprises identifying an application based on the distance of the detected user gesture being within a certain range.

A third operation 9.3 comprises identifying the user gesture as indicating a neighborhood sharing function.

A fourth operation 9.4 comprises causing data sharing between the identified application and another device in the vicinity; the other device may be paired with the current device, but this is not required.

In this embodiment, the predetermined gesture indicating neighborhood sharing may allow sharing of data between different, but proximate, devices that are not otherwise physically connected. The predetermined gestures may include a start gesture and an end gesture.

For example, fig. 10 shows the earbud headphone 10 of fig. 1 and a second earbud headphone 10A, which may belong to different people. In this example, the dashed lines 150, 160 indicate the respective first ranges of the two earbud headphones. A user gesture 50 that starts in the first range 150 of the first earbud headphone 10 and ends in the first range 160 of the second earbud headphone 10A may cause inter-device sharing such that, for example, a music track being played or selected on the first earbud headphone is transmitted to the second earbud headphone 10A. In this case, the neighborhood sharing gesture may be associated with a share, copy, or move function of the first earbud headphone and a share, paste, or download function of the second earbud headphone.
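Inter-device sharing of this kind can be sketched as one device pushing its current item to a peer once the cross-device gesture is recognized. The Device class, its fields, and the send call are all illustrative assumptions; a real implementation would negotiate the transfer over a wireless link.

```python
# Hedged sketch of neighborhood sharing between two nearby earbuds.
class Device:
    def __init__(self, name, current_track=None):
        self.name = name
        self.current_track = current_track
        self.inbox = []

    def send_to(self, other):
        other.inbox.append(self.current_track)
        return f"{self.name} shared '{self.current_track}' with {other.name}"

def neighborhood_share(start_device, end_device):
    # Gesture started in start_device's first range, ended in end_device's.
    if start_device is not end_device:
        return start_device.send_to(end_device)
    return "gesture stayed at one device; no share"

earbud_a = Device("earbud 10", current_track="Track 7")
earbud_b = Device("earbud 10A")
print(neighborhood_share(earbud_a, earbud_b))  # earbud 10 shared 'Track 7'...
print(earbud_b.inbox)                          # ['Track 7']
```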

In some embodiments, the different distance ranges may partially overlap, although this may require that the function gestures of one corresponding application differ from those of the other, so that gestures made in the overlap region are not misinterpreted.

As previously mentioned, although the embodiments primarily assume that the detection, identification, and function-causing stages occur in the wearable device, they may instead be performed in another associated device, such as a smartphone or tablet computer, which receives data indicative of gestures from the wearable device and relays the result of the performed function, such as a changed music track or increased volume, back to the wearable device.

It should be understood that the above-described embodiments are illustrative only and do not limit the scope of the present invention. Other variations and modifications will be apparent to persons skilled in the art upon reading the present application.

Furthermore, the disclosure of the present application should be understood to include any novel feature or any novel combination of features disclosed herein either explicitly or implicitly or any generalisation thereof, and during the prosecution of the present application or of any application derived therefrom, new claims may be formulated to cover any such features and/or combination of such features.
