System and method for monitoring physical therapy and rehabilitation using a user interface

Document No.: 1660834  Publication date: 2019-12-27

Abstract: This invention, "System and method for monitoring physical therapy and rehabilitation using a user interface," was created by 柯特·威登霍弗, 贾斯廷·安东尼·克里尔, 布莱恩·詹姆斯·卡特尔柏格, and 约书亚·戴尔·豪沃德 on 2018-02-01. A system for monitoring a joint of a patient, the system comprising a plurality of sensors disposed in proximity to the joint and measuring or observing an action or physical quantity associated with the joint; and at least one communication module coupled to the sensors to receive data from the sensors and to communicate sensor information to an external device. The system also includes a patient device and a clinical device that can be used, for example, to monitor and display information obtained from the sensors, determine range of motion measurements from sensor data, display the progress of physical therapy, take a picture or video of a site on the patient, obtain a pain score, and include friends who provide encouragement during physical therapy.

1. A system for monitoring a patient, the system comprising:

a patient device configured and arranged to communicate with a sensor unit disposed on or within a patient, the patient device comprising a display, a camera, a memory, and a processor coupled to the display, camera, and memory, wherein the processor is configured and arranged to perform acts comprising:

directing a user on the display to take a picture or video of a site on the patient;

upon receiving a user input to take the picture or video, taking the picture or video; and

storing the picture or video in the memory.

2. The system of claim 1, wherein the actions further comprise performing pigment analysis using the photograph or video to assess a health condition of a site on a patient.

3. The system of claim 2, wherein the actions further comprise generating a visual or audible alert to the patient when the pigment analysis indicates that the site on the patient is likely to be infected.

4. The system of claim 2, wherein the actions further comprise sending a message to a clinical device with an indication of possible infection when the pigment analysis indicates that a site on the patient is likely to be infected.

5. The system of claim 1, wherein the actions further comprise sending the photograph or video to a clinical device to assess a health condition of a site on a patient.

6. The system of claim 1, wherein the photograph or video is a video, and the actions further comprise using the video to assess performance of an exercise.

7. The system of claim 1, wherein the actions further comprise overlaying one or more lines or angles on the photograph or video to represent a position or movement of a limb of the patient within the photograph or video.

8. A system for monitoring a patient, the system comprising:

a patient device configured and arranged to communicate with a sensor unit disposed on or within a patient, the patient device comprising a display, a memory, and a processor coupled to the display and memory, wherein the processor is configured and arranged to perform acts comprising:

instructing a user to input a pain score on the display; and

transmitting the pain score to a clinical device.

9. The system of claim 8, wherein the actions further comprise:

receiving a user selection of an exercise;

displaying on the display a graphical representation of the exercise and a user control that indicates that the user is performing the exercise when the user actuates the user control; and

monitoring sensor data from the sensor unit to monitor performance of the exercise while the user control is actuated.

10. The system of claim 9, wherein the actions further comprise:

displaying a repetition count of the exercise being performed during the performance of the exercise.

11. The system of claim 9, wherein the actions further comprise displaying a current or maximum realized value of a range of motion measurements associated with the exercise determined from the sensor data during performance of the exercise.

12. The system of claim 9, wherein the actions further comprise:

receiving a request for an exercise summary from the user; and

displaying a summary of a plurality of exercises on the display, the summary of the plurality of exercises indicating a number of repetitions performed for each exercise over a period of time.

13. The system of claim 9, wherein the actions further comprise:

receiving a request from the user for a progress report of a range of motion measurements; and

displaying on the display a graph of a plurality of values of a range of motion measurements obtained over a period of time.

14. The system of claim 13, wherein the actions further comprise:

displaying on the display a representation of a path to a target for the range of motion measurement and indicating a current progress of the patient toward the target.

15. A system for monitoring physical therapy of a patient, the system comprising:

a patient device configured and arranged to communicate with a sensor unit disposed on or within a patient, the patient device comprising a display, a memory, and a processor coupled to the display and memory, wherein the processor is configured and arranged to perform acts comprising:

instructing a user to enter one or more friends on the display;

sending a connection message to each of the one or more friends; and

receiving a message from at least one of the one or more friends encouraging the patient to continue the physical therapy.

16. The system of claim 15, wherein the actions further comprise:

sending a message to at least one of the one or more friends in response to a user input.

17. The system of claim 15, wherein the actions further comprise:

sending a challenge to at least one of the one or more friends in response to a user input.

18. The system of claim 15, wherein the actions further comprise:

sending a progress report to at least one of the one or more friends, wherein the progress report includes an indication of the patient's progress toward at least one physical therapy goal.

19. The system of claim 15, wherein the actions further comprise:

sending a progress report to at least one of the one or more friends, wherein the progress report includes an indication that the patient performed at least one physical therapy exercise.

20. The system of claim 19, wherein the progress report further comprises an indication of performance of the at least one physical therapy exercise by at least one of the one or more friends.

Technical Field

The present invention relates to the field of methods, systems and devices including user interfaces for monitoring physical therapy or rehabilitation. The invention also relates to a system and a method for monitoring physical therapy or rehabilitation with a user interface after surgery or implantation of an orthopaedic device.

Background

Joint replacement surgery is a common orthopedic procedure for joints such as shoulder, hip, knee, ankle and wrist. In the event that a patient's joint is worn or damaged, the joint may be replaced with an implant that can fuse with the bone structure and restore painless motion and function. Prior to implanting a prosthetic component in a joint of a patient, a surgeon typically resects at least a portion of the patient's natural bone to form a platform, recess, or cavity for receiving at least a portion of the implanted prosthetic component. During implantation of the prosthetic component, the muscles and tendons must be repositioned and reattached.

The patient must undergo physical therapy to recover from this major surgery. The patient must move regularly and work to keep the muscles that have been displaced flexible and balanced. Although the goal is for the patient to extend the range of motion, the risk of falling or over-stretching may increase, which may damage the implant and injure the patient. If patients do not drive their recovery and reach the required range of motion, they will find their joint stiff, which may require an additional procedure (manipulation under anesthesia, or MUA) to reach a range of motion sufficient to maintain an active lifestyle. Measuring or monitoring the progress of physical therapy can be problematic, but is very useful for maintaining patient dedication and participation.

Disclosure of Invention

One embodiment is a system for monitoring a patient. The system includes a patient device configured and arranged to communicate with a sensor unit disposed on or within a patient, the patient device including a display, a camera, a memory, and a processor coupled to the display, the camera, and the memory, wherein the processor is configured and arranged to perform acts including: directing a user on the display to take a picture or video of a site on the patient; upon receiving a user input to take the picture or video, taking the picture or video; and storing the picture or video in the memory.

In at least some embodiments, the actions further include performing pigment analysis using the photograph or video to assess the health of a site on the patient. In at least some embodiments, the actions further include: when the pigment analysis indicates that the site on the patient is likely to be infected, a visual or audible alert is generated to the patient. In at least some embodiments, the actions further include: when the pigment analysis indicates that the site on the patient is likely to be infected, a message is sent to the clinical device with an indication of likely infection.
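The disclosure does not specify a pigment-analysis algorithm. As a hedged illustration only, the following Python sketch assumes the site photograph is available as RGB pixel tuples and flags possible infection when the mean redness exceeds a threshold; the `mean_redness` metric and the threshold value are assumptions, not part of the disclosure.

```python
# Minimal, illustrative pigment (color) analysis. Assumes 8-bit RGB pixel
# tuples for the photographed site; "redness" here is the average excess of
# the red channel over the mean of the green and blue channels. The metric
# and threshold are hypothetical, chosen only to demonstrate the idea.

def mean_redness(pixels):
    """Average excess of the red channel over the green/blue mean (0-255 scale)."""
    if not pixels:
        return 0.0
    excess = [r - (g + b) / 2 for r, g, b in pixels]
    return sum(excess) / len(excess)

def likely_infected(pixels, threshold=60.0):
    """Return True when the site looks abnormally red (possible inflammation)."""
    return mean_redness(pixels) > threshold

# Example: a strongly red patch versus a near-neutral skin-tone patch.
red_patch = [(200, 60, 60)] * 10
pale_patch = [(180, 170, 175)] * 10
```

In a real system the result of such a check would drive the alert or clinical-device message described above.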

In at least some embodiments, the actions further include sending the photograph or video to a clinical device to assess the health of a site on the patient. In at least some embodiments, the photograph or video is a video, and the actions further include using the video to assess performance of an exercise. In at least some embodiments, the actions further include overlaying one or more lines or angles on the photograph or video to represent the position or movement of the patient's limb within the photograph or video.

Another embodiment is a system for monitoring a patient. The system includes a patient device configured and arranged to communicate with a sensor unit disposed on or within a patient, the patient device including a display, a memory, and a processor coupled to the display and the memory, wherein the processor is configured and arranged to perform acts including: instructing a user to input a pain score on the display; and transmitting the pain score to a clinical device.

In at least some embodiments, the actions further include: receiving a user selection of an exercise; and displaying on the display a graphical representation of an exercise and a user control indicating that the user is performing the exercise when the user actuates the user control; and monitoring sensor data from the sensor unit to monitor performance of the exercise while the user control is actuated. In at least some embodiments, the actions further include displaying a repeat count of an exercise being performed during performance of the exercise. In at least some embodiments, the actions further include displaying, during performance of the exercise, a current or maximum realized value of a range of motion measurements associated with the exercise determined from the sensor data.
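The counting and range-of-motion logic above is not spelled out in the disclosure. One simple, illustrative approach, sketched below in Python, is to count a repetition on each upward crossing of a flexion-angle threshold while tracking the maximum realized angle; the threshold and function names are assumptions.

```python
# Hedged sketch of repetition counting and range-of-motion tracking from a
# stream of flexion angles (degrees) reported by the sensor unit. A repetition
# is counted once per upward crossing of `rep_threshold`; the maximum angle
# seen is the "maximum realized value" of the range of motion measurement.

def summarize_exercise(angles, rep_threshold=80.0):
    """Return (repetition count, maximum realized flexion angle)."""
    reps = 0
    above = False
    for angle in angles:
        if angle >= rep_threshold and not above:
            reps += 1        # count once per upward crossing
            above = True
        elif angle < rep_threshold:
            above = False
    return reps, max(angles, default=0.0)

# Three knee-bend repetitions, peaking at 95, 100, and 90 degrees.
samples = [10, 40, 95, 30, 5, 50, 100, 20, 10, 85, 90, 15]
```

A patient device could run this over the sensor data captured while the user control is actuated, updating the on-screen repetition count live.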

In at least some embodiments, the actions further include receiving a request for an exercise summary from the user; and displaying a summary of a plurality of exercises on the display, the summary indicating a number of repetitions performed for each exercise over a period of time. In at least some embodiments, the actions further include receiving a request from the user for a progress report of a range of motion measurement; and displaying on the display a graph of a plurality of values of the range of motion measurement obtained over a period of time. In at least some embodiments, the actions further include displaying on the display a representation of a path to a target for the range of motion measurement and indicating the patient's current progress toward the target.
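The "path to target" display could be driven by a progress metric like the one sketched below; the linear percent-of-goal calculation and the function name are illustrative assumptions, since the disclosure does not define how progress is computed.

```python
# Illustrative progress metric for the path-to-target display: given the
# range-of-motion measurements collected over time and a therapy goal,
# compute how far the best measurement has moved from the starting value
# toward the goal, as a percentage capped to [0, 100].

def progress_toward_goal(measurements, goal):
    """Percent progress from the first measurement toward the goal."""
    if not measurements or goal <= measurements[0]:
        return 100.0
    start, best = measurements[0], max(measurements)
    pct = (best - start) / (goal - start) * 100.0
    return min(100.0, max(0.0, pct))

# Weekly knee-flexion readings (degrees) against a 120-degree goal:
# the patient has covered 30 of the 50 degrees needed, i.e. about 60%.
weekly = [70, 85, 95, 100]
```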

Yet another embodiment is a system for monitoring physical therapy of a patient. The system includes a patient device configured and arranged to communicate with a sensor unit disposed on or within a patient, the patient device including a display, a memory, and a processor coupled to the display and the memory, wherein the processor is configured and arranged to perform acts including: instructing a user to enter one or more friends on the display; sending a connection message to each of the one or more friends; and receiving a message from at least one of the one or more friends encouraging the patient to continue with the physical therapy.

In at least some embodiments, the actions further include sending a message to at least one of the one or more friends in response to a user input. In at least some embodiments, the actions further include: in response to the user input, a challenge is sent to at least one of the one or more friends. In at least some embodiments, the actions further include sending a progress report to at least one of the one or more friends, wherein the progress report includes an indication of the patient's progress toward at least one physical therapy goal. In at least some embodiments, the actions further include sending a progress report to at least one of the one or more friends, wherein the progress report includes an indication that the patient performed at least one physical therapy exercise. In at least some embodiments, the progress report further includes an indication of the performance of the at least one physical therapy exercise by at least one of the one or more friends.
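The disclosure does not define a message format for the progress reports sent to friends. As a sketch only, the fields below are assumptions chosen to match the behaviors described above (progress toward a goal, exercises performed); a real system would define its own schema.

```python
# Hypothetical progress-report payload a patient device might send to a
# friend's device. Field names and the summary format are assumptions.

from dataclasses import dataclass, field

@dataclass
class ProgressReport:
    patient_name: str
    goal: str
    percent_complete: float
    exercises_done: list = field(default_factory=list)

    def summary(self):
        """Human-readable line a friend's device could display."""
        done = ", ".join(self.exercises_done) or "none yet"
        return (f"{self.patient_name} is {self.percent_complete:.0f}% of the way "
                f"to '{self.goal}' (exercises: {done})")

report = ProgressReport("Pat", "120-degree knee flexion", 60.0,
                        ["heel slides", "quad sets"])
```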

Yet another embodiment is a method for performing the acts recited for any of the systems above. Yet another embodiment is a non-transitory computer-readable medium comprising instructions for performing the acts described with respect to any of the systems described above.

Drawings

Non-limiting and non-exhaustive embodiments of the present invention are described with reference to the following drawings. In the drawings, like reference numerals refer to like parts throughout the various figures unless otherwise specified.

For a better understanding of the present invention, reference is made to the following detailed description, which is to be read in connection with the accompanying drawings, wherein:

FIG. 1 is a schematic view of one embodiment of a system for monitoring the rehabilitation of a patient after implant surgery according to the present invention;

FIG. 2 is a schematic diagram of one embodiment of a computing device used in the system of FIG. 1, in accordance with the present invention;

FIG. 3A is a perspective side view of one embodiment of a sensor unit and a base detached from each other in accordance with the present invention;

FIG. 3B is a side view of the sensor unit and substrate of FIG. 3A engaged with one another in accordance with the present invention;

FIG. 3C is a top view of the sensor unit and substrate of FIG. 3A engaged with one another in accordance with the present invention;

FIG. 4A is a top view of one embodiment of the sensor unit of FIG. 3A, in accordance with the present invention;

FIG. 4B is a side view of the sensor unit of FIG. 4A according to the present invention;

FIG. 4C is an exploded view of the sensor unit of FIG. 4A according to the present invention;

FIG. 4D is an exploded view of one embodiment of a housing of the sensor unit of FIG. 4A, according to the present invention;

FIG. 5 is a diagram for one embodiment of a user interface for a mobile device to display a patient or clinician profile in accordance with the present invention;

FIG. 6 is a diagram for one embodiment of a user interface for a mobile device to display information obtained from a sensor unit, in accordance with the present invention;

FIG. 7 is a diagram of another embodiment of a user interface for a mobile device to display information obtained from a sensor unit, according to the present invention;

FIG. 8 is a diagram for one embodiment of a user interface for a mobile device to display a range of motion measurements, in accordance with the present invention;

FIG. 9 is a diagram for one embodiment of a user interface for a mobile device to display a summary of exercise repetitions in accordance with the present invention;

FIG. 10 is a diagram for one embodiment of a user interface for a mobile device to display information obtained from a sensor unit, in accordance with the present invention;

FIG. 11 is a diagram for one embodiment of a user interface for a mobile device displaying selectable videos to demonstrate an exercise in accordance with the present invention;

FIG. 12 is a diagram for one embodiment of a user interface for a mobile device to set an exercise reminder, in accordance with the present invention;

FIG. 13 is a diagram of one embodiment of a user interface for a mobile device to display information obtained from a sensor unit and a path toward a physical therapy target in accordance with the present invention;

FIG. 14 is a diagram of yet another embodiment of a user interface for a mobile device to display information obtained from a sensor unit, in accordance with the present invention;

FIG. 15 is a diagram for one embodiment of a user interface for a mobile device to set patient-specific settings of a sensor unit, in accordance with the present invention;

FIG. 16 is a diagram for one embodiment of a user interface for taking a picture or video, according to the present invention;

FIG. 17 is a diagram of another embodiment of a user interface for displaying information obtained from a sensor unit, according to the present invention;

FIG. 18 is a diagram of yet another embodiment of a user interface for displaying information obtained from a sensor unit, in accordance with the present invention;

FIG. 19 is a diagram of another embodiment of a user interface for displaying information obtained from a sensor unit, according to the present invention;

FIG. 20 is a flow chart of one embodiment of a method of taking a picture or video of an area on a patient according to the present invention;

FIG. 21 is a flow diagram of one embodiment of a method of inputting a pain score, according to the present invention;

FIG. 22 is a flow diagram for one embodiment of a method for displaying information requested by a patient, in accordance with the present invention; and

FIG. 23 is a flow diagram of one embodiment of a method for including friends in physical therapy according to the present invention.

Detailed Description

The present invention relates to the field of methods, systems and devices including user interfaces for monitoring physical therapy or rehabilitation. The invention also relates to a system and a method for monitoring physical therapy or rehabilitation with a user interface after surgery or implantation of an orthopaedic device.

As described herein, the system may be used to monitor physical therapy or recovery processes or rehabilitation of a patient after surgery, as well as to monitor or verify the extent of patient activity. The system includes one or more sensors that can be in communication with a processor that can generate information based on sensor readings and data, which can facilitate a patient or another user (e.g., a clinician, doctor, physiotherapist, nurse, nursing coordinator, or other suitable person) in monitoring the patient's activities, the status of the orthopedic implant or surrounding tissue, or the effect of rehabilitation or other therapy. However, it will be understood that the systems, devices, and methods described herein may be used in the context of other procedures or even rehabilitation or physical therapy without surgical intervention. The sensors described below are placed near a physical treatment or rehabilitation site, such as a surgical site or a body site to be rehabilitated.

The system may also provide an alert if the patient tissue is inflamed, or if the effectiveness or compliance of the physical or rehabilitation therapy is insufficient. The system includes a wearable device having one or more sensors. For example, one or more sensors may be provided on a wearable device that is applied to the skin of a patient.

In at least some embodiments, one or more sensors are in communication with a sensor processor on a device containing the sensors. In at least some embodiments, the sensor processor, or alternatively or additionally the sensor, is in communication with a processor of the patient device (e.g., a mobile phone, tablet, computer, etc.), or with a processor of the clinical device, e.g., a cell phone, tablet, computer, etc.

Fig. 1 shows one embodiment of a system 100 for monitoring orthopedic implants and rehabilitation after orthopedic replacement surgery. The system 100 includes one or more sensors 102, an optional sensor processor 104, a patient device 106 (e.g., a mobile phone, tablet, computer, etc.), a clinical device 108, and the network 60. In at least some embodiments, the one or more sensors 102 and preferably the sensor processor 104 (or one or more of the plurality of sensor processors) are disposed in a wearable device 112 external to the patient, such as a device that may be applied to the patient's skin or carried in a cradle or other article or textile worn by the patient. Alternatively, one or more of the sensors 102 and optionally the sensor processor may be implanted in the patient. In some embodiments, one or more sensors 102 are implanted and a sensor processor and optionally one or more additional sensors are provided in the wearable device.

Other embodiments of the system may include fewer or more components than shown in fig. 1, but the system generally includes a sensor 102 and a processor (e.g., sensor processor 104, patient device 106, or clinical device 108) that is in communication with the sensor and provides information based on sensor data. In the illustrated embodiment, the wearable device 112 includes the sensor 102 and the sensor processor 104, but it will be understood that other sensors that are not part of the wearable device 112 may be included. For example, one or more additional sensors may be combined into another wearable device, which may also include a sensor processor. It should also be understood that in some embodiments, the wearable device 112 may not include the sensor processor 104, or the sensor processor 104 may have limited capabilities (e.g., obtaining and transmitting sensor readings without, or with limited, analysis of the sensor readings).

In FIG. 1, solid lines represent communications between components in at least some embodiments of the system. The dashed lines indicate alternative or additional modes of communication between components. In addition to the communications shown in FIG. 1, in at least some embodiments, the sensor processor 104 or sensor 102 may also communicate directly with the clinical device. The communication may include, but is not limited to, wireless communication, wired communication, optical communication, ultrasonic communication, or a combination thereof. Satellite communication, cellular communication, Bluetooth™, Near Field Communication (NFC), the Infrared Data Association standard (IrDA), wireless fidelity (WiFi), and Worldwide Interoperability for Microwave Access (WiMAX) are non-limiting examples of wireless communications that may be used. Ethernet, Digital Subscriber Line (DSL), fiber to the home (FTTH), and Plain Old Telephone Service (POTS) are non-limiting examples of wired communications that may be used.

Network 60 may be any suitable type of network including, but not limited to, a Personal Area Network (PAN), a Local Area Network (LAN), a Metropolitan Area Network (MAN), a Wide Area Network (WAN), the internet, or any combination thereof. In at least some embodiments, the network 60 may be omitted to provide direct connections between components. It will be understood that other devices, such as servers or server farms, memory storage devices, etc., may be connected to the patient device 106 or the clinical device 108 through the network 60 or directly. For example, a server may be coupled to the patient device 106 or the clinical device 108 that stores patient or other medical information, applications, user interfaces, network interfaces, etc. for access by the patient device 106 or the clinical device 108.

The patient device 106 and the clinical device 108 may be any of a variety of devices, such as a computer (e.g., a laptop computer, a mobile medical station or computer, a server, a mainframe computer, or a desktop computer), a mobile device (e.g., a cellular phone or smartphone, a personal digital assistant, or a tablet computer), or any other suitable device. In at least some embodiments, the clinical device 108 may be incorporated into a medical kiosk or system.

Fig. 2 shows one embodiment of a computing device 201 for use as the patient device 106 or the clinical device 108. The computing device 201 includes a processor 214, a memory 216, a display 218, and an input device 220. The computing device 201 may be local to the user, or may include components that are not local to the user, including one or both of (or portions of) the processor 214 or the memory 216. For example, in some embodiments, a user may operate a terminal that is connected to a non-local processor or memory.

The computing device 201 may utilize any suitable processor 214 including one or more hardware processors that may be local to a user or non-local to a user or other component of the computing device. The processor 214 is configured to execute instructions provided to the processor. The instructions may include any step of a method or process described herein.

Any suitable memory 216 may be used for the computing device 201. Memory 216 illustrates one type of computer-readable medium, namely computer-readable storage media. Computer-readable storage media may include, but are not limited to, non-volatile, non-transitory, removable, and non-removable computer-readable media implemented in any method or technology for storage of information, such as computer-readable instructions, data structures, program modules, or other data. Examples of computer-readable storage media include RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks ("DVD") or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computing device.

Communication methods provide another type of computer-readable medium, namely communication media. Communication media typically embody computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave, data signal, or other transport mechanism and include any information delivery media. The terms "modulated data signal" and "carrier-wave signal" include a signal that has one or more of its characteristics set or changed in such a manner as to encode information, instructions, data, and the like in the signal. By way of example, communication media include wired media such as twisted pair, coaxial cable, fiber optics, waveguides, and other wired media, and wireless media such as acoustic, RF, infrared, Bluetooth™, near field communication, and other wireless media.

The display 218 may be any suitable display device, such as a monitor, screen, display, etc., and may include a printer. The input device 220 may be, for example, a keyboard, mouse, touch screen, trackball, joystick, voice recognition system, camera, microphone, or any combination thereof, and the like.

Returning to fig. 1, the sensor processor 104 may be any suitable processor including one or more hardware processors. The sensor processor 104 is configured to execute instructions provided to the processor. The sensor processor 104 is configured to receive sensor data from the sensors and communicate with the patient device 106, the network 60, the clinical device 108, or any combination thereof. Optionally, the sensor processor 104 may also process or analyze the sensor data and may store instructions thereon to perform such processing or analysis, including, for example, instructions to perform the steps of any of the processing or analysis described herein. In at least some embodiments, one or more sensors 102 may each include a processor, which may have some or all of the functionality of sensor processor 104.

One or more sensors 102 are provided to monitor the orthopedic implant and surrounding tissue, to monitor healing after orthopedic surgery whether or not an implant is used, to provide preparatory treatment prior to surgery, or any combination thereof. The present disclosure is exemplified with respect to an orthopedic knee implant, but it should be understood that the system may be used with other joint implants, such as implants for the shoulder, hip, ankle, wrist, or any other joint, or with any other orthopedic device or procedure, whether joint replacement, resurfacing of joint surfaces, soft tissue reconstruction, debridement, limb correction surgery, ligament replacement, or the like.

Any suitable type of sensor 102 may be used, including but not limited to accelerometers, magnetometers, gyroscopes, proximity sensors, infrared sensors, ultrasonic sensors, thermistors or other temperature sensors, cameras, piezoelectric or other pressure sensors, sonar sensors, external fluid sensors, skin discoloration sensors, pH sensors, microphones, and the like, or any combination thereof. In at least some embodiments, the system 100 includes at least one, two, three, four, five, six, or more different types of sensors 102. The system may include at least one, two, three, four, five, six, eight, ten, or more sensors 102. Other examples of suitable sensors and their arrangement and use may be found in U.S. patent application serial nos. 15/077,809 and 15/077,793 and U.S. provisional patent application serial nos. 62/136,892 and 62/136,925, which are all incorporated herein by reference.

One or more sensors 102 may be used to measure, monitor, or otherwise observe one or more aspects of the orthopedic device, surrounding tissue, or patient activity, among others. The following are examples of observations or measurements that may be made or interpreted using one or more sensors: number of steps, repetitive exercises, repetitive joint movements (e.g., joint rotation), type of exercise being performed, or other actions; stability or lack of stability; flexion angle or range of motion; the speed of movement; skin temperature; pulse or pulse curve or heart rate recovery time after activity; ultrasound images, flow measurements or doppler measurements; sonar images, flow measurements or doppler measurements; pressure or load bearing measurements; detecting lameness or body orientation (e.g., subluxation, posture, scoliosis) or a change in body orientation; monitoring joint vibration or influence; sleep condition or rest duration; gait analysis, body/limb/joint alignment, etc. The system 100 may observe or measure one or more of these items or any combination of items.

Further details of some of these measurements or observations are provided below. One or more sensors (e.g., accelerometers, gyroscopes, magnetometers, proximity sensors, etc.) may count steps or repetitions of an exercise or the number of joint movements or other actions experienced by the sensor and may be used to determine which type of exercise or movement is occurring. This may be used, for example, to monitor patient activity, to monitor compliance with exercise therapy or to monitor possible signs of pain or other conditions that may impede or aid rehabilitation. Sensor data may also be used to monitor changes in activity or trends in activity.
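The step-counting method is left open by the disclosure. A common, simple approach, sketched below as an assumption rather than the patented method, is to count peaks in accelerometer magnitude that exceed a threshold, with a short refractory gap so one stride is not counted twice.

```python
# Illustrative step counter over raw accelerometer samples. The threshold
# (above the ~9.8 m/s^2 gravity baseline) and the minimum sample gap between
# counted steps are assumed values chosen only for demonstration.

import math

def count_steps(samples, threshold=11.5, min_gap=3):
    """samples: (ax, ay, az) tuples in m/s^2; returns estimated step count."""
    steps, last_step = 0, -min_gap
    for i, (ax, ay, az) in enumerate(samples):
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude > threshold and i - last_step >= min_gap:
            steps += 1
            last_step = i
    return steps

# Roughly periodic impact spikes above the gravity baseline: three steps.
walk = [(0, 0, 9.8), (0, 0, 13.0), (0, 0, 9.5),
        (0, 0, 9.9), (0, 0, 12.5), (0, 0, 9.6),
        (0, 0, 9.8), (0, 0, 12.8), (0, 0, 9.7)]
```

The same peak-counting pattern extends to counting exercise repetitions or joint movements from other periodic sensor signals.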

One or more sensors (e.g., accelerometers, gyroscopes, magnetometers, proximity sensors, etc.) may sense or detect or calculate the range of motion or flexion of a sensor, joint, or other portion of the patient's body. This may be used, for example, to monitor patient rehabilitation, patient activity, to monitor compliance with exercise therapy or to monitor possible signs of pain or other conditions that may impede or assist rehabilitation. These sensors or other sensors may be used to monitor the shock or impact to the orthopaedic device or tissue surrounding the orthopaedic device. The sensor data may also be used to monitor changes in or trends in the range of motion or bending.

As an illustrative example, one or more accelerometers may measure acceleration arising from joint motion. By computing the center of rotation about which the device rotates, joint motion and range of motion or flexion can be assessed from the accelerations measured by accelerometers separated by a known distance. This information may be used for the same purposes as described in the previous examples.
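
As a highly simplified, non-limiting sketch of the known-separation idea above: if two accelerometers lie along a rotating limb segment and measure only centripetal acceleration, the difference in their readings depends on their separation but not on the unknown center of rotation. The readings and separation below are hypothetical.

```python
import math

def angular_rate(a_near, a_far, separation):
    """Estimate angular rate (rad/s) of a rotating segment from the
    difference in centripetal acceleration (m/s^2) measured by two
    accelerometers a known distance apart along the segment.
    a_centripetal = omega^2 * r, so a_far - a_near = omega^2 * separation."""
    return math.sqrt((a_far - a_near) / separation)

# Two sensors 0.1 m apart on a segment swinging at 2 rad/s:
# centripetal readings differ by omega^2 * d = 4 * 0.1 = 0.4 m/s^2.
print(round(angular_rate(1.0, 1.4, 0.1), 3))  # 2.0
```

A practical implementation would also have to account for gravity, tangential acceleration, and sensor noise, none of which are modeled here.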

In another illustrative example, a range of motion, rate of motion, number of repetitions, etc. may be measured using 1) an accelerometer and 2) a gyroscope or a magnetometer (which indicates direction relative to magnetic north). This information can be used for the same purposes as described in the previous two examples.
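
One common way to combine an accelerometer with a gyroscope, offered here only as an illustrative sketch (the disclosure does not prescribe a fusion algorithm), is a complementary filter: the gyroscope tracks fast changes but drifts, while the accelerometer-derived tilt angle is noisy but drift-free.

```python
def complementary_filter(angles_acc, rates_gyro, dt, alpha=0.98):
    """Fuse gyroscope rates (deg/s, fast but drifting) with
    accelerometer-derived tilt angles (deg, noisy but drift-free)
    to track a joint angle over time."""
    angle = angles_acc[0]
    estimates = [angle]
    for acc_angle, rate in zip(angles_acc[1:], rates_gyro[1:]):
        angle = alpha * (angle + rate * dt) + (1 - alpha) * acc_angle
        estimates.append(angle)
    return estimates

# Stationary joint held at 30 degrees: gyro reads ~0 deg/s,
# accelerometer reads ~30 deg, and the estimate stays at 30.
est = complementary_filter([30.0] * 5, [0.0] * 5, dt=0.01)
print(est[-1])  # 30.0
```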

In another illustrative example, a single sensor, such as an accelerometer, gyroscope, or magnetometer, may be used to measure or otherwise observe a range of motion, a rate of motion, a number of repetitions, and the like. In at least some embodiments, these measurements or other observations are determined using the sensor data and one or more assumptions about the sensor or sensor data based on, for example, identification of patterns in the sensor data or of upper and lower limits of ranges in the collected data. This information can be used in a similar manner to the previous three examples.

One or more sensors (e.g., thermistors or infrared sensors) may sense or detect or calculate temperature or a temperature change or trend. The temperature may be skin temperature or ambient temperature. Temperature measurements may, for example, be used to indicate the possibility of inflammation or pain or another condition that may hinder rehabilitation or the patient's health. Temperature measurements can also be used, for example, to monitor whether icing (cold therapy) is being performed effectively, which can help reduce inflammation and promote healing. These sensors may also or alternatively be used to sense, detect, or measure pulse, pulse changes, patient pulse trends, pulse contours, or heart rate recovery after patient activity (e.g., exercise or other exertion).
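
A temperature trend of the kind described above could be flagged, for example, by comparing a short moving average against a patient baseline. The sketch below is illustrative only; the window size, threshold, and readings are hypothetical, not values specified by this disclosure.

```python
def temperature_alert(readings, baseline, rise_threshold=1.0, window=3):
    """Flag possible inflammation when the moving average of the last
    `window` skin-temperature readings (deg C) exceeds the patient's
    baseline by more than rise_threshold."""
    if len(readings) < window:
        return False
    recent = sum(readings[-window:]) / window
    return recent - baseline > rise_threshold

history = [33.1, 33.0, 33.2, 34.4, 34.6, 34.5]
print(temperature_alert(history, baseline=33.1))  # True
```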

One or more sensors (e.g., ultrasonic or sonar sensors or cameras, etc.) may sense or detect or calculate particles or particle densities or particle density trends. These sensors may also be used to sense tissue surrounding the orthopaedic device, detect wear or dimensional changes of the orthopaedic device or surrounding tissue, etc. Ultrasonic and sonar transducers may also be used to determine the distance of other parts of the knee (or other joint) from the implant.

One or more sensors (e.g., piezoelectric sensors, strain gauges, or other pressure or load sensors) may sense or detect or calculate pressure or load on or around the sensor or orthopedic device. The sensor data may also be used to monitor changes in pressure or load bearing ranges or pressure or load bearing trends. These sensors or other sensors may be used to monitor the shock or impact to the orthopaedic device or tissue surrounding the orthopaedic device. Pressure or load sensors may also be used to detect swelling of tissue surrounding the orthopedic implant. Multiple pressure or load bearing sensors may also be used to detect bending (which may be indicated by uniaxial stretching of the tissue) and swelling (which may be indicated by biaxial stretching of the tissue).
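
The distinction drawn above, flexion indicated by uniaxial stretch versus swelling indicated by biaxial stretch, can be sketched as a simple classifier over two orthogonal strain readings. The threshold and readings below are hypothetical and purely illustrative.

```python
def classify_stretch(strain_axial, strain_transverse, threshold=0.02):
    """Distinguish joint flexion (stretch along one axis) from swelling
    (stretch along both axes) using two orthogonal strain readings."""
    axial = strain_axial > threshold
    transverse = strain_transverse > threshold
    if axial and transverse:
        return "swelling"
    if axial or transverse:
        return "bending"
    return "neutral"

print(classify_stretch(0.05, 0.001))  # bending: one axis stretched
print(classify_stretch(0.04, 0.05))   # swelling: both axes stretched
```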

Examples of sensors (including devices with implantable sensors), systems, devices, and methods for monitoring rehabilitation are described in U.S. patent application serial nos. 15/077,809 and 15/077,793 and U.S. provisional patent application serial nos. 62/136,892 and 62/136,925, all of which are incorporated herein by reference. Other examples of sensors and their use may be found in U.S. patent application No. 15/422,312 entitled "System and method for monitoring orthopedic implants and rehabilitation using wearable devices" and U.S. patent application No. 15/422,299 entitled "System and method for monitoring physical therapy and rehabilitation of joints," both filed on 2/1/2017, which are hereby incorporated by reference.

The sensor 102 and optional sensor processor 104 may be provided with power using any suitable power source, including but not limited to a primary battery, a rechargeable battery, a storage capacitor, other power storage devices, and the like, or any combination thereof. In some embodiments, power may be provided by a kinetic energy power source that utilizes the motion of the patient's body to generate power for the assembly or to charge a battery or storage capacitor or other power storage device coupled to the assembly. In some embodiments, a wireless power supply may be used in place of (or in addition to) a battery, storage capacitor, or other power storage device.

Additionally, a charging port may be provided for charging a battery or storage capacitor or other power storage device from a source such as a wall outlet. Alternatively or additionally, wireless charging systems and methods may also be used. It will be appreciated that in some embodiments, there may be multiple methods for providing power to a component or a power storage device associated with a component. All sensors and optional sensor processors may be coupled to the same power supply, or some sensors (even all sensors) and sensor processors may have separate power supplies.

In at least some embodiments, the sensors and optional sensor processor may be active at all times to measure, monitor, or otherwise observe. In other embodiments, one or more sensors and optional sensor processors may be activated periodically (with a period of, for example, 15 or 30 seconds; 1, 5, 10, 15, or 30 minutes; or 1, 2, 3, 4, 6, 7, or 24 hours) or randomly to measure, monitor, or otherwise observe. Alternatively, the period may be programmable. Additionally, the period may optionally be changed based on data from one or more sensors. In other embodiments, the sensor module, patient device, clinical device, or other device may manually or automatically activate one or more sensors and optional sensor processors. In at least some embodiments, the sensors and optional sensor processor may have different activation schedules (continuous, periodic, random, or manual). For example, a sensor for measuring temperature may do so periodically, a sensor for measuring step count or joint motion may operate continuously, and a sensor for measuring range of motion may be manually activated by a wearable device, patient device, or clinical device while the patient is performing rehabilitation exercises.
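
The per-sensor activation schedules described above (continuous, periodic, or manual) can be modeled with a small scheduler object. This is an illustrative sketch only; the class, tick-based timing, and example sensors are hypothetical.

```python
class SensorSchedule:
    """Minimal model of per-sensor activation: continuous sensors sample
    every tick, periodic ones every `period` ticks, manual ones only
    when explicitly triggered (e.g., by a patient or clinical device)."""
    def __init__(self, mode, period=None):
        self.mode = mode
        self.period = period
        self.triggered = False

    def should_sample(self, tick):
        if self.mode == "continuous":
            return True
        if self.mode == "periodic":
            return tick % self.period == 0
        if self.mode == "manual":
            fire, self.triggered = self.triggered, False  # consume trigger
            return fire
        return False

steps = SensorSchedule("continuous")
temp = SensorSchedule("periodic", period=60)
rom = SensorSchedule("manual")
rom.triggered = True  # e.g., the patient starts a rehabilitation exercise
print([s.should_sample(60) for s in (steps, temp, rom)])  # [True, True, True]
print(rom.should_sample(61))  # False: the manual trigger was consumed
```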

The systems and methods will be described herein with reference to orthopedic knee implants or other knee surgery. Similar systems and methods may be used for other joints including, but not limited to, finger joints, wrist joints, elbow joints, shoulder joints, hip joints, ankle joints, or toe joints. The system and method may be used to monitor physical therapy for any reason, including but not limited to rehabilitation in connection with other therapies, including therapies for ligament or fracture surgery.

Fig. 3A-3C illustrate one embodiment of a wearable device 312, the wearable device 312 including a sensor unit 322 and a base 324. The sensor unit 322 may be removed from the base 324 as shown in fig. 3A. As shown in figs. 3B and 3C, the wearable device 312 is placed on the skin of the patient and the base 324 is adhered to the skin.

The base 324 includes a flexible containment housing 326, a magnet 328, an optional opening 330 for a temperature sensor, an optional tab 332, an adhesive disposed on a bottom surface 334 of the housing, and an optional magnet retainer 336 disposed on the housing. When the sensor unit 322 is attached to the base 324, the magnet 328 of the base 324 magnetically attaches to a similar magnet 354 in the sensor unit 322 (FIG. 4C). The magnets 328, 354 are intended to maintain attachment of the sensor unit 322 to the base 324 during normal activities, exercise, and other physical therapy unless the patient or another person disengages the sensor unit from the base. Optionally, a magnet retainer 336 is secured to the magnet 328 (either entirely or only at its periphery) to retain the magnet on the housing 326.

In at least some embodiments, the housing 326 of the base 324 is sufficiently flexible for adhering to the patient's skin when the patient is moving during normal activities or physical therapy exercises. The housing may be made of any suitable material, including but not limited to flexible plastics such as silicone or polyurethane.

The housing 326 may also removably grip the sensor unit 322 to further maintain attachment of the sensor unit to the base 324. In the illustrated embodiment, the housing 326 defines a receiving cavity 338 with a sidewall 340 surrounding the cavity and a rim 342 surrounding the sidewall. In operation, the housing 326 receives a portion of the sensor unit 322, as shown in figs. 3B and 3C. In some embodiments, when a portion of the sensor unit 322 is received in the cavity 338, the sidewall 340 or the rim 342 may be resiliently flexible to expand and then compress over the perimeter of the received portion of the sensor unit 322. Preferably, at least the rim 342 or the sidewall 340 (or both) of the base 324 is made of a material that grips the sensor unit 322 by adhesion, compression, or the like, or any combination thereof. In at least some embodiments, the sensor unit 322 can have a recess 390 that can receive the rim 342 to further facilitate maintaining attachment of the sensor unit to the base 324. In at least some embodiments, the sidewall 340 slopes outward and downward from the rim 342 to form an undercut region below the rim. The sensor unit 322 may similarly be formed with a sloped housing to fit in the undercut below the rim 342 of the base 324 to further facilitate maintaining engagement between the sensor unit and the base. It will be appreciated that any other suitable type of mechanical fastener may be used, in addition to or instead of the magnets (or magnet and magnetically attractive material), to secure the sensor unit 322 to the base 324.

The adhesive may be applied directly to the base 324, or may be disposed on both sides of a substrate, one side of which is adhered to the base 324. Preferably, the adhesive is selected to be water resistant and to resist loss of adhesion due to exposure to perspiration. In at least some embodiments, the base 324 or the adhesive on the base is intended to remain adhered, without replacing or recoating the adhesive, under normal use conditions for at least one, two, three, five, seven, or ten days or two, three, or four weeks or more. In at least some embodiments, the adhesive is selected to maintain adhesion to the skin when the user is bathing. In at least some embodiments, the adhesive is selected to maintain adhesion to the skin when the user bathes, swims in a swimming pool, or sits in a whirlpool, hot tub, or rehabilitation pool.

The base 324 optionally includes tabs 332 disposed at any suitable location relative to the housing 326. The tab 332 may facilitate removal of the sensor unit 322 from the base 324 by pushing or pulling the tab 332 to deform the housing 326 to release the sensor unit. Preferably, the operation of the tab 332 to disengage the sensor unit 322 may be performed while maintaining the attachment of the base 324 to the patient's skin. In some embodiments, the operation of the tab 332 may also facilitate engagement of the sensor unit 322 with the base 324.

Fig. 4A-4C illustrate one embodiment of a sensor unit 322. Sensor unit 322 is shown to include an upper housing 350, a lower housing 352, a magnet 354, an electronics assembly 356, a power source 358, a light emitting device 360, and adhesives 362, 364. Additionally, in some embodiments, as shown in fig. 4D, the upper housing 350 can include a main housing 366 and a gripping element 368. In some embodiments, sensor unit 322 may include more or fewer components than those shown in figs. 4A-4D.

The upper and lower housings 350, 352 form a cavity in which at least the electronic components 356 and the power supply 358 are located. The upper and lower housings 350, 352 may be made of any suitable material, such as a metal or plastic material (preferably a rigid plastic material) or any combination thereof. In at least some embodiments, the upper and lower housings 350, 352 and the junction of the upper and lower housings are waterproof to resist water, perspiration, rain, and other fluids from entering the interior of the housings. In at least some embodiments, the sensor unit 322 is sufficiently waterproof to allow the patient to shower or swim without covering the sensor unit.

The optional gripping element 368 may have a rough or matte surface on at least a portion of the gripping element. This matte surface facilitates gripping of the sensor unit 322, particularly for engaging and disengaging the sensor unit from the base 324. In the illustrated embodiment, the gripping element 368 is a separate element that is overmolded, adhered, or otherwise attached to the main housing 366. The gripping element 368 may be made of a more flexible material than the main housing 366, such as silicone or polyurethane. In other embodiments, the gripping element 368 is formed as part of the main housing 366 by roughening or otherwise rendering at least a portion of the surface of the main housing non-smooth.

The magnet 354 is arranged for magnetic coupling to the magnet 328 of the base 324. In some embodiments, one of the magnets 354, 328 may be replaced with a magnetically attractive material that couples with the other magnet to magnetically couple the base 324 to the sensor unit 322. In the illustrated embodiment, the magnet 354 is attached to the lower housing 352 by an adhesive 364, which may be a layer of adhesive or an adhesive disposed on both sides of a substrate. In other embodiments, the magnet 354 may be attached to the lower housing 352 by any other suitable method, or may be disposed within the cavity formed by the upper and lower housings 350, 352.

Power source 358 may be any suitable power source. For example, power source 358 may be a primary (non-rechargeable) battery and may have an expected life of at least 7, 10, 20, 30, 60, 90, 100, 70, or 180 days or more under normal use. In some embodiments, the primary battery may be replaceable. In some embodiments, power source 358 is rechargeable, for example using a charging port or inductive charging device (e.g., an inductive pad or sleeve), or using WiFi or ultrasonic charging or any other suitable charging method. In some embodiments, the battery (whether a primary or a secondary battery) may include a magnetically attractive material to which the magnet 328 of the base 324 may magnetically couple.

The electronics assembly 356 may include any suitable components for operation of sensor unit 322. In the illustrated embodiment, the electronics assembly 356 includes a circuit board 368, the sensor processor 304, a temperature sensor 370, an accelerometer 372, at least one LED 374, a communication device 376, and a magnetic switch 378. Adhesive 362 may couple circuit board 368 to the lower housing 352. Other adhesives (not shown) may couple the circuit board or other components to the upper housing 350.

The sensor processor 304 may be similar to the sensor processor 104 described above and may have more or less capabilities than the sensor processor 104. In some embodiments, the sensor processor 304 may include an analysis algorithm for analyzing or partially analyzing the sensor data. In other embodiments, the sensor processor 304 may be primarily designed to receive, store, and transmit sensor data.

The illustrated sensor unit 322 includes a temperature sensor 370 and an accelerometer 372, but as noted above, other embodiments may include more or different sensors in any suitable combination. In the illustrated embodiment, the temperature sensor 370 is a thermistor that extends away from the circuit board 368 and through an opening 366 in the lower housing 352. When the sensor unit 322 engages the base 324, a portion of the temperature sensor 370 extends through the opening 330 in the base 324 such that the temperature sensor 370 is exposed to and may be in contact with the skin of the patient.

As described above, the communication device 376 operates with the sensor processor 304 to communicate with a patient device, clinical device, or other device. Any suitable communication method or protocol may be used, including but not limited to WiFi, Bluetooth™, near field communication, infrared, radio frequency, acoustic, optical, and the like.

In some embodiments, the electronics assembly 356 further includes a magnetic switch 378, such as a reed switch, coupled to the sensor processor 304 and actuated to place the sensor unit 322 in the active mode when positioned proximate to the magnet 328 of the base 324. In at least some embodiments, when the sensor unit 322 is removed from the base 324, the magnetic switch is actuated to place the sensor unit in an inactive or standby mode. Alternatively or additionally, the sensor unit 322 may include a button, mechanical switch, or other mechanism to place the sensor unit in the active mode or the inactive or standby mode, to switch between modes, or to turn the sensor unit on or off. Also, alternatively or additionally, signals from a patient device, clinical device, or other device in communication with the sensor unit 322 may be used to place the sensor unit 322 in one of these modes (or to switch between modes). In at least some embodiments, in the inactive or standby mode, the sensor unit 322 continues to receive signals from an external source (e.g., a patient or clinical device). In at least some embodiments, in the inactive or standby mode, the sensor unit 322 also maintains an internal clock.
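
The active/standby behavior described above can be sketched as a small state machine (illustrative only; the class and method names are hypothetical): attaching the unit near the base magnet activates it, removing it returns it to standby, and an external device may also switch modes directly.

```python
class SensorUnit:
    """Sketch of the mode switching described above."""
    def __init__(self):
        self.mode = "standby"

    def magnet_event(self, near_base_magnet):
        # A reed switch toggles the mode when the base magnet is near.
        self.mode = "active" if near_base_magnet else "standby"

    def external_command(self, mode):
        # A patient or clinical device may also switch modes directly.
        self.mode = mode

unit = SensorUnit()
unit.magnet_event(near_base_magnet=True)   # snapped onto the base
print(unit.mode)  # active
unit.magnet_event(near_base_magnet=False)  # removed from the base
print(unit.mode)  # standby
```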

At least one LED 374 is coupled to the light emitting device 360 to provide light to the light emitting device. In at least some embodiments, the light emitting device 360 includes a light emitter 380 and a light pipe 382 to direct light from the LED 374 to the light emitter. The light emitting device 360 provides an indication to the user or patient of the operation of the device. For example, the light emitting device 360 may be illuminated when the sensor unit 322 is operating or in an active mode. In some embodiments, the color of the emitted light may indicate which mode the sensor unit is currently in (active or inactive/standby), or may indicate the operation the sensor unit is performing (e.g., transmitting, sensing, not sensing, synchronizing with a patient or clinical device, etc.). In some embodiments, blinking or brightness of the light may be used instead of or in addition to color to indicate a mode or operation. As an example, a flashing blue light may indicate synchronization with a patient or clinical device, a green light may indicate the active mode, and an unlit light may indicate the inactive/standby mode.

U.S. patent application serial numbers 15/077,809 and 15/077,793 and U.S. provisional patent application serial numbers 62/136,892 and 62/136,925 (all of which are incorporated herein by reference) describe other features and apparatus that may be incorporated into the wearable devices and sensor units described herein. These patent applications also describe other wearable or implantable devices that may be used in the methods and systems described herein.

In some embodiments, a second sensor unit may be used. For example, the second sensor unit may be placed on or in the same leg on the other side of the joint. As another example, a second sensor unit may be placed on the other leg for detecting or observing lameness or other gait defects, or may be placed on the torso for detecting or observing body direction. When two or more replacements are implanted in the body, for example with multiple joint or vertebral replacements, a second sensor unit (or more further sensor units) may also be used to detect or observe, for example, subluxation, a change or defect in posture, scoliosis, etc.

The two sensor units may optionally communicate or synchronize with each other. In at least some embodiments, the two sensor units can be synchronized with each other and know the location of each sensor in space and their positions relative to each other in distance and orientation. As an example, the two sensor units may triangulate their positions using a patient or clinical device. In at least some embodiments, if one of the sensor units is replaced or removed from its base, the patient device or the other sensor unit may be notified. When a sensor unit is reattached to its base, or a new sensor unit is attached to the base, the system may determine the position or distance of that sensor unit relative to the other sensor unit.

The sensors in the two sensor units can be used to measure flexion angle and range of motion; to calculate vectors, angles, rays, planes, or distances; and the like. Temperature sensors on the two sensor units can be used to determine the temperature difference between two parts of the body. The sensors in the two sensor units may also be used to calculate an angle or other information that can be used to signal the patient if the patient moves outside the prescribed limits of the range of motion during physical therapy or rehabilitation.
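
The angle calculation from two sensor units (one above and one below the joint) reduces, in the simplest case, to the angle between the two units' orientation vectors. The sketch below is illustrative; the vector representation, function names, and range limits are hypothetical.

```python
import math

def flexion_angle(v_thigh, v_shank):
    """Angle (degrees) between the long-axis orientation vectors reported
    by sensor units above and below the knee."""
    dot = sum(a * b for a, b in zip(v_thigh, v_shank))
    norm = (math.sqrt(sum(a * a for a in v_thigh))
            * math.sqrt(sum(b * b for b in v_shank)))
    # Clamp to [-1, 1] to guard against floating-point rounding.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def out_of_range(angle, low, high):
    """Signal the patient if the joint moves outside the prescribed range."""
    return not (low <= angle <= high)

angle = flexion_angle((0.0, 1.0, 0.0), (1.0, 1.0, 0.0))
print(round(angle))                 # 45
print(out_of_range(angle, 0, 120))  # False: within the prescribed limits
```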

As described above, the sensor processor or sensor communicates with the patient device or clinical device to provide sensor data or information derived from the sensor data. The patient device may be a dedicated device, or may be an application on a smartphone, tablet, laptop or desktop computer, a Web or cloud application, or any other suitable apparatus. Communication between the implanted or wearable sensor unit and the patient or clinical device may occur at predetermined times (e.g., every 30 minutes or hourly or daily). In this way, the sensor unit can be synchronized with the patient or clinical device. Similarly, the patient device may be periodically synchronized with the clinical device. In some embodiments, if the sensor unit or patient device detects a condition that requires an alarm to be raised (e.g., a drop or increase in temperature), the sensor unit or patient device may communicate (or attempt to communicate) with the patient device or clinical device, respectively, to immediately provide an alert.

Alternatively or additionally, the communication between the sensor unit and the patient or clinical device may be constant or nearly constant (e.g., once every 1, 5, 10, 30, or 60 seconds). For example, a constant or near constant communication may be established when the sensor unit determines that the patient is performing an exercise, or when the patient manually actuates a control to the patient device or the sensor unit to indicate that the patient will begin exercising or otherwise desire the sensor unit to communicate or synchronize with the patient device.
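
The communication behavior described in the last two paragraphs, scheduled synchronization plus immediate alerts, can be sketched as a simple decision function. This is illustrative only; the function, its return labels, and the interval values are hypothetical.

```python
def plan_communication(now, last_sync, sync_interval, alert_pending):
    """Decide whether the sensor unit should contact the patient device:
    immediately for alert conditions (e.g., a temperature change),
    otherwise at the scheduled synchronization interval (seconds)."""
    if alert_pending:
        return "send_alert"
    if now - last_sync >= sync_interval:
        return "sync"
    return "idle"

print(plan_communication(now=1800, last_sync=0, sync_interval=1800,
                         alert_pending=False))  # sync
print(plan_communication(now=900, last_sync=0, sync_interval=1800,
                         alert_pending=True))   # send_alert
print(plan_communication(now=900, last_sync=0, sync_interval=1800,
                         alert_pending=False))  # idle
```

During exercise sessions, the same function could simply be called with a much shorter `sync_interval` to approximate the near-constant communication described above.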

Figs. 5-16 illustrate screen shots of one embodiment of an application or user interface for a patient device or a clinical device. The application or user interface shown is particularly useful for mobile devices such as smartphones or tablets, but may also be used with other devices such as desktop or laptop computers. FIG. 5 illustrates one embodiment of a patient or clinician profile page for the application or user interface. Elements of the page may include, but are not limited to, patient or clinician information; controls to enter an identification number or other identifying information for the wearable device so that the wearable device may be synchronized or otherwise coupled to the patient or clinical device; controls to enter or change passwords or to enter or access application settings; controls to access a calibration program to calibrate the wearable device; controls to access one or more other functions or pages; and so forth. Other information that may be displayed on this or another page may include, but is not limited to, account creation or account login controls, indications of wearable device status, controls to access help information or troubleshooting of common problems, or photos, videos, or text that explain how to apply the wearable device to the skin of a patient, how to care for a surgical wound, how to perform a particular exercise, or how to program or operate the wearable device.

FIG. 6 shows another page of the user interface or application that provides information, such as steps per day (or number of exercise repetitions, etc.) and temperature measurements, as shown in area 592. The user interface 590 may also include an area 594 that displays a chart of data, such as steps per hour, as shown in FIG. 6. The illustrated user interface allows the user to select among other charts, such as exercise history (labeled "ROM"), temperature or temperature trend, and number of impacts or shocks to the sensor module. It will be understood that other measurements or observations from the sensors described above may be charted. In at least some embodiments, the user can also select the time period of the chart to display data over, for example, minutes, hours, days, or weeks.

The user interface may be used to monitor patient activity and progress. The chart in area 594 may be used to display the patient's exercise history and progress. In some embodiments, the user interface may also allow the user to set goals, such as a number of steps or a number of exercise repetitions over a particular period (e.g., 1, 2, 4, 6, or 7 hours or 1 day or 1 week). The user interface may also display the current state of progress toward these goals. The user interface may also highlight notable events, such as a maximum number of steps or exercise repetitions, elevated temperature readings, a large number of shocks or impacts, and the like. The user interface may also highlight the achievement of a goal.
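
Goal tracking of the kind just described can be reduced to a small computation over the recorded counts. The sketch below is illustrative; the function, capping behavior, and sample counts are hypothetical rather than specified by the interface.

```python
def goal_progress(counts, goal):
    """Percent completion toward an activity goal (e.g., steps per day),
    capped at 100, plus a flag marking achievement of the goal."""
    total = sum(counts)
    percent = min(100, round(100 * total / goal))
    return percent, total >= goal

# Hourly step counts against a 3000-step daily goal.
percent, achieved = goal_progress([1200, 900, 1500], goal=3000)
print(percent, achieved)  # 100 True
```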

Fig. 7 illustrates another page of a user interface or application displaying information related to particular patient measurements that may be tracked to monitor rehabilitation or physical therapy. These patient measurements may include, but are not limited to, range of motion measurements, such as flexion and extension. In the illustrated page, the patient measurements are flexion and extension of the patient's knee. The page also shows a chart 596 that tracks the progress of these measurements. Progress may be tracked hourly, daily, weekly, or over any other time period. In some embodiments, the user interface or application allows the user to select or change the time period shown in the chart. The page in fig. 7 also provides information about other measurements, such as the percentage of exercise completion, skin temperature, number of steps, etc.

FIG. 8 shows another page in which the user interface or application may direct the user through calculating or otherwise determining a particular measurement. In the illustrated case, the femoral angle is measured.

FIG. 9 shows another page in which the user interface or application tracks a daily exercise program. In the illustrated embodiment, the exercises are sitting up, heel slides (hip and knee flexion), straight leg raises, and knee to chest. Other exercises may include, but are not limited to, standing exercises, ankle pumps, ankle circles, thigh squeezes (quadriceps setting), kicks (short arc quadriceps), knee bends (seated knee flexion), long knee extensions, seated kicks (long arc quadriceps), knee extension with straightening, knee dangling/swinging, hamstring sets (heel digs), hip squeezes (gluteal sets), walking, etc. These exercises are directed to knee rehabilitation; rehabilitation or physical therapy for other joints or body parts may, of course, include a different set of exercises. In addition, the page displays the percentage of completion for each set of repetitions (three sets in this case) that the patient is to perform.

FIG. 10 shows another page for a single exercise. The page shows the current measurements related to the exercise (flexion and extension in this case). The page also shows how the exercise is performed and may include a control for the patient to indicate that the exercise is about to begin. In some embodiments, the page may also provide an indication of the number of repetitions while the patient is exercising (or the number of repetitions still needed to reach the repetition goal). The page may also indicate patient measurements based on the exercise (e.g., the current measurement of the most recent repetition, the average measurement of the current repetition set, or the maximum measurement of a set of repetitions), and may also indicate the goal for the measurement. The page may include a gauge, bar graph, or the like to indicate what portion of the exercise goal has been met. The indication (e.g., bar graph, etc.) may also indicate what portion of the range of motion or other treatment goal has been met. In some embodiments, the page may display the average patient time to reach the range of motion goal, or the like, to motivate the patient.
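
The per-set summary such a page might display (repetitions performed, best measurement, and progress toward the repetition and range of motion goals) can be sketched as follows. The function and the sample flexion angles are purely illustrative.

```python
def set_summary(measurements, rep_goal, rom_goal):
    """Summarize one set of an exercise: reps done, percent of the rep
    goal, best range-of-motion measurement, and percent of the ROM goal."""
    reps = len(measurements)
    best = max(measurements) if measurements else 0
    return {
        "reps": reps,
        "rep_pct": min(100, round(100 * reps / rep_goal)),
        "best_rom": best,
        "rom_pct": min(100, round(100 * best / rom_goal)),
    }

# Flexion angles (degrees) recorded over one set of heel slides.
print(set_summary([85, 92, 98, 101], rep_goal=10, rom_goal=120))
# {'reps': 4, 'rep_pct': 40, 'best_rom': 101, 'rom_pct': 84}
```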

FIG. 11 shows a page with controls for accessing a video that may show how a patient performs an exercise. Fig. 12 shows a page in which the patient can set reminders to perform the workout. The patient may set a time reminder or may set a visual or audible alarm to remind the patient to exercise at a designated time.

Fig. 13 shows a page that indicates how far the patient has progressed in physical therapy or rehabilitation. The progress and significant events included in the indication may be based on the duration of rehabilitation (e.g., days or weeks); a physical measurement (e.g., flexion or extension) relative to a final target for that measurement; the number of repetitions or completions of one or more exercises toward an exercise goal; or the like. As shown in fig. 13, the page may also include a chart of measurements (similar to fig. 7) or number of repetitions or any other chart of relevant information.

FIG. 14 shows a page for a clinical device indicating information about a group of patients, such as the number or percentage of patients who completed an exercise or other goal, the number or percentage of patients who achieved a particular range of motion or other measurement goal, and so forth.

FIG. 15 illustrates a page of a patient or clinical device in which settings for a patient may be entered or altered. Such settings may include, for example, which exercises to perform, the number of repetitions of each exercise, the number of sets of each exercise per day, and the number of steps per day. The page may contain controls that allow these settings to be changed. Further, as shown in fig. 15, the settings may be related to a particular stage of rehabilitation or physical therapy. The page may allow switching between different stages so that settings may be viewed, entered, or changed for each stage. Another page of the patient or clinical device may display the actual results achieved by the patient and may compare the results to the settings or goals entered for the patient.

FIG. 22 illustrates one embodiment of a method for displaying information requested by a patient. In step 2202, the patient device receives input from the patient to display exercise or other information. In step 2204, the patient device displays the requested information. For example, the requested information may be a graphical representation of an exercise and a user control that, when actuated by the user, indicates that the user is performing the exercise. One example of such requested information is shown in fig. 10. The requested information may be a repetition count for the exercise being performed, or a summary of several exercises indicating the number of repetitions performed for each exercise over a period of time. Examples of such requested information are given in figs. 7, 9, 13 and 14. The requested information may be a progress report of range of motion measurements or a graph of the range of motion values obtained over a period of time. Examples of such requested information are given in figs. 7, 13 and 14. The requested information may indicate a path to a range of motion goal and the patient's current progress toward that goal, as shown, for example, in fig. 13.

Fig. 16 shows another page in which the patient is instructed to take a photograph or video of the patient's knee, or of another wound or physical therapy site, using a camera on the patient device or another device. The page may instruct the user how to compose the photograph or video. In at least some embodiments, the photograph or video may be sent to a clinical or other device over a network (see fig. 1) or by other methods. The clinician may use the photograph or video to assess the wound or physical therapy site. In some embodiments, the patient device may request a video of the patient performing an exercise. The video may be provided to a patient device or a clinical device to evaluate or review the performance of the exercise. For example, a clinician may assess whether the patient is performing the exercise correctly by observing the exercise, or may assess the progress of physical therapy or rehabilitation.

In some embodiments, the patient device is configured and arranged to perform pigmentation analysis or other wound analysis of the knee using a photograph or video. For example, the patient device may compare skin pigmentation at the wound site to skin pigmentation near the wound site to identify infection (e.g., a superficial or deep wound infection), rash, discoloration, or other problems. In at least some embodiments, pigmentation or other wound analysis can be combined with skin temperature information to assess infection (e.g., a superficial or deep wound infection), rash, discoloration, or other problems. If the analysis indicates a potential or actual problem, the patient device may provide a visual or audible alert to the patient and may also send an alert to the clinical device. In other embodiments, pigmentation or other wound analysis (with or without skin temperature information) may be performed by a clinical device or other device instead of (or in addition to) the patient device. The patient device or clinical device may include a white balance or light compensation algorithm to evaluate the photograph or video. The patient device may also include a calibration tool to facilitate calibration of lighting and other aspects of the photograph or video.
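One plausible way to implement the pigmentation comparison described above is to compare the relative redness of the wound region against nearby healthy skin. This is only an illustrative sketch: the pixel lists and the alert threshold are assumed values, and a real system would calibrate the threshold clinically and apply the white-balance correction mentioned above first.

```python
def mean_rgb(pixels):
    """Average (R, G, B) over a list of RGB tuples."""
    n = len(pixels)
    return (sum(p[0] for p in pixels) / n,
            sum(p[1] for p in pixels) / n,
            sum(p[2] for p in pixels) / n)

def redness_delta(wound_pixels, reference_pixels):
    """Difference in relative redness (red channel's share of total
    intensity) between the wound region and nearby healthy skin.
    A large positive delta may flag erythema for clinician review."""
    def red_share(rgb):
        total = sum(rgb)
        return rgb[0] / total if total else 0.0
    return red_share(mean_rgb(wound_pixels)) - red_share(mean_rgb(reference_pixels))

# Hypothetical pixel samples from the two regions of one photograph.
wound = [(190, 90, 85), (200, 95, 90), (185, 88, 84)]
healthy = [(170, 130, 110), (165, 128, 108), (172, 131, 112)]
ALERT_THRESHOLD = 0.05  # assumed; would require clinical calibration
print("alert" if redness_delta(wound, healthy) > ALERT_THRESHOLD else "ok")
```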

In some embodiments, instead of or in addition to taking a photograph or video, the patient device may display the area at which the camera is pointed for viewing by the patient. The displayed area, photograph, or video may be modified with overlaid lines or graphics corresponding to the patient's anatomy. These lines or graphics may move as the patient's leg or other body part moves. In some embodiments, patient measurements, such as flexion or extension or other range of motion measurements, may be calculated during motion and displayed on the patient device as the patient's limb moves.
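A flexion or extension angle of the kind described above can be computed from three tracked landmarks (e.g., hip, knee, and ankle) as the angle between two vectors meeting at the joint. The following 2-D sketch uses hypothetical landmark coordinates; the described system does not specify how landmarks are obtained:

```python
import math

def joint_angle(hip, knee, ankle):
    """Angle (degrees) at the knee between the knee-to-hip and
    knee-to-ankle vectors; 180 degrees is a fully extended leg."""
    v1 = (hip[0] - knee[0], hip[1] - knee[1])
    v2 = (ankle[0] - knee[0], ankle[1] - knee[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    mag = math.hypot(*v1) * math.hypot(*v2)
    # Clamp to [-1, 1] to guard against floating-point drift.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / mag))))

# Collinear hip, knee, ankle -> straight leg, 180 degrees.
print(round(joint_angle((0, 2), (0, 1), (0, 0)), 1))   # 180.0
# Right-angle bend -> 90 degrees.
print(round(joint_angle((0, 1), (0, 0), (1, 0)), 1))   # 90.0
```

Evaluating this on each video frame would let the overlay graphics and the displayed flexion/extension value update as the limb moves.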

FIG. 20 depicts one embodiment of a method of taking a photograph or video of a site on a patient. In step 2002, the patient device (or a clinician or other device or person) instructs the patient to take a photograph or video of the site (e.g., a physical therapy site or a surgical or wound site). In step 2004, the patient takes the photograph or video using the camera of the patient device (or the camera of another device) and stores the photograph or video. In some embodiments, in optional step 2006, the photograph or video is sent to a clinical or other device. In step 2008, an analysis may be performed on the photograph or video. The analysis may be performed by a patient device, a clinical device, or any other suitable device. For example, a pigmentation analysis may be performed, or an analysis relating to an exercise or a range of motion measurement may be performed. In some embodiments, a graphical marker, such as a line or angle, may be superimposed on the photograph or video based on the analysis.

The patient device, user interface, or application may include other features. For example, the patient device, user interface, or application may include controls for the patient to enter information or ratings regarding his or her experience in the hospital, the patient's experience during rehabilitation, how the patient feels about the cost of rehabilitation, whether the patient would recommend the wearable device or other aspects of treatment to family or friends, and so forth. The patient device, user interface, or application may include controls for entering a rating related to pain or other clinical aspects. For example, the patient may enter a pain score based on a rating scale provided on the device. Other scores that a patient, clinician, or other person may enter may be based on, for example, the Knee Society Score, the new Knee Society Score, other society scores, KOOS or PROMs (patient-reported outcome measures), the Oxford Knee Score, WOMAC, or any other suitable score or rating.

Fig. 21 illustrates one embodiment of a method of inputting a pain score. In step 2102, the patient device directs the user to enter a pain score, and the device receives the pain score. In step 2104, the pain score is sent to a clinical or other device.
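Receiving the pain score in step 2102 implies some validation before the score is sent on in step 2104. A minimal sketch follows; the 0-10 numeric rating scale is an assumption, since the described system does not specify which scale is used:

```python
def record_pain_score(raw):
    """Validate a patient-entered pain score on an assumed 0-10
    numeric rating scale and return it as an integer.

    Raises ValueError for non-numeric or out-of-range input, which a
    real UI would surface as a prompt to re-enter the score."""
    score = int(raw)
    if not 0 <= score <= 10:
        raise ValueError("pain score must be between 0 and 10")
    return score

print(record_pain_score("7"))   # 7
```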

The patient device, user interface, or application may include controls to add friends, create a network of friends, send messages to friends, send progress updates or other exercise information to friends or others, and so forth. Some friends may be other patients, and the patient device, user interface, or application may display a progress comparison with those friends, allow challenges to be issued to the friends, provide controls to send encouragement to the friends, or provide other social controls or interactive functionality. In addition, family, friends, and colleagues of the patient may use another application. That application may allow the user to send encouragement or messages to the patient and may also display progress updates or other exercise information that the patient has allowed to be shared. These applications and features help encourage patients to remain committed to their physical therapy or rehabilitation goals and objectives.

Fig. 23 illustrates one embodiment of a method for including friends in physical therapy. In step 2302, the patient is instructed to enter one or more friends into the patient device. In step 2304, a connection message is sent to the one or more friends to connect the patient to the patient's friend network. In step 2306, the patient receives an encouragement message from at least one of the one or more friends. In optional step 2308, the patient sends a message, challenge, or progress report to at least one of the one or more friends. The progress report may include, for example, an indication of the patient's progress toward at least one physical therapy goal, an indication of the patient's performance of at least one physical therapy exercise, or an indication of the performance of at least one physical therapy exercise by at least one of the one or more friends, or the like, or any combination thereof.

Fig. 17 shows a user interface 690 that may be suitable for a computer or network interface. The user interface shown includes a region 692 showing the results of a temperature measurement 692a, a step measurement 692b, a range of motion measurement 692c, specific exercises and tests 692d, and adverse events 692e. The results may include numerical information and graphical information. These results may also graphically or numerically illustrate the degree of success in performing an exercise (see, e.g., area 692d), and may also show the degree of compliance with rehabilitation activities (e.g., the number of repetitions of an exercise performed). Such an arrangement of information may facilitate monitoring of patient progress, identification of progress or lack of progress, identification of concerns (e.g., an increase in temperature or an increase in the number of shocks or impacts), and so forth.

Other information that may be displayed on one or more pages of the user interface may include any suitable patient rehabilitation progress data, including progress relative to a baseline and over a period of time. For example, the information may include baseline range of motion information for exercises such as seated leg raises, heel slides, standing raises, prone raises, and the like. The information may also include current range of motion information for the exercises. The information may also include step analysis information including, but not limited to, pre-operative and post-operative average cadence, maximum cadence, stride angle, and time spent walking, cycling, running, or in sedentary activity. Additional information may include skin temperature, ambient temperature, and temperature trends. The user interface may also provide information about how many times or how often the patient has fallen or experienced other significant events. The user interface may provide information from GPS readings of the wearable device or patient device to assess baseline activity, current activity, general activity after surgery or physical therapy, and the like.
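Average cadence of the kind listed above can be derived directly from step timestamps reported by the sensors; the following is a minimal sketch, and the timestamp format (seconds, sorted ascending) is an assumption:

```python
def average_cadence(step_times):
    """Average cadence in steps per minute from a sorted list of step
    timestamps in seconds. Returns 0.0 if fewer than two steps."""
    if len(step_times) < 2:
        return 0.0
    duration = step_times[-1] - step_times[0]
    # N steps span N-1 inter-step intervals.
    return (len(step_times) - 1) / duration * 60.0

# 11 steps, evenly spaced 0.5 s apart -> 10 intervals over 5 s.
steps = [i * 0.5 for i in range(11)]
print(average_cadence(steps))   # 120.0
```

Maximum cadence could be computed similarly over a sliding window of the same timestamps, which would support the pre- and post-operative comparisons described above.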

The user interface of the clinician device may also be used to conduct in-office range of motion testing. A clinician device or patient device may be used to create a video of the range of motion exercises.

Fig. 18 shows a user interface 790 for a clinician to monitor a plurality of patients. The area 792 includes information such as the patient's name, the surgery date, sensor data and test results 792a, the number of adverse events, the location of the orthopedic implant, and the like. The clinician may also track the number of surgeries 794, the rate of successful rehabilitation 796, and other suitable information, such as the total number of surgeries (e.g., the total number of knee replacements), the average time to a particular rehabilitation outcome (e.g., the average time to reach a specified range of motion), and so forth.

Fig. 19 shows another user interface for a clinician to monitor patients. The area 1592 includes information such as patient name, gender, surgery date, days post-surgery, measured or trending temperature, range of motion measurements, activity, number of steps, significant events, implant site, wearable device status, and the like. The clinician may also track the number of successful rehabilitations 894, the range of motion 896 achieved by a group of patients over time, and other suitable information. Controls may also be provided to access individual patient records 898 or patient alerts 899.

In at least some embodiments, the applications or user interfaces described herein may be network or application interfaces that are accessible when the patient device or clinician device accesses a server of a content provider. In at least some embodiments, that server, or another server or memory storage device, may store information for the network interface and may also store patient-specific information, including patient identification data, information obtained from sensor data, patient or clinician comments, or the like, or any other suitable data. In at least some embodiments, the patient-specific information can be accessed from a patient device, clinician device, or other device, and in some embodiments, credentials (e.g., a username or password or both) may need to be provided to access the information.

Additional user interfaces and methods of calculating or otherwise determining information related to a patient's physical therapy, rehabilitation, or condition are described in U.S. patent application No. 15/422,312, entitled "System and Method with a User Interface for Monitoring Physical Therapy and Rehabilitation," and U.S. patent application No. 15/422,299, entitled "System and Method for Monitoring Orthopedic Implants and Rehabilitation Using a Wearable Device," both filed on February 1, 2017, which are incorporated herein by reference.

The methods and systems described herein may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Accordingly, the methods and systems described herein may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. The systems referenced herein generally include memory and generally include methods for communicating with other devices, including mobile devices. The communication methods may include wired and wireless (e.g., RF, optical, or infrared) communication methods, and such methods provide another type of computer-readable medium, namely a communication medium. Wired communication may include communication over twisted pair, coaxial cable, optical fiber, waveguides, and the like, or any combination thereof. Wireless communication may include RF, infrared, acoustic, near field communication, Bluetooth™, and the like, or any combination thereof.

It will be understood that each block of the flowchart illustrations, and combinations of blocks in the flowchart illustrations and methods disclosed herein, can be implemented by computer program instructions. These program instructions may be provided to a processor to produce a machine, such that the instructions, which execute on the processor, create means for implementing the actions specified in the flowchart block or blocks disclosed herein. The computer program instructions may be executed by a processor to cause the processor to perform a series of operational steps to produce a computer-implemented process. The computer program instructions may also cause at least some of the operational steps to be performed in parallel. Moreover, some of the steps may also be performed on more than one processor, such as may be present in a multi-processor computer system. In addition, one or more processes may also be performed concurrently with other processes, or even in a different order than that shown, without departing from the scope or spirit of the invention.

The computer program instructions may be stored on any suitable computer readable medium, including but not limited to RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks ("DVD") or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer.

The above specification provides a description of the manufacture and use of the invention. Since many embodiments of the invention can be made without departing from the spirit and scope of the invention, the invention resides in the claims hereinafter appended.
