Modular system for multi-modality imaging and analysis

Document No. 118915 | Publication date: 2021-10-19

Note: This technology, "Modular system for multi-modality imaging and analysis", was created on 2020-01-17 by 拉尔夫·达科斯塔, 西蒙·特雷德韦尔, 康纳·赖特, 金伯林·丹皮坦, 托迪·丹尼斯, and 托迪·米尼. Abstract: A portable modular handheld imaging system is disclosed. The modular system includes a first housing portion and a second housing portion. The first housing portion includes at least one excitation light source. A first filter is configured to detect and allow a selected optical signal, responsive to illumination with excitation light, to pass to a first image sensor. A second filter is configured to detect and allow a selected optical signal, responsive to illumination of the target surface with white light, to pass to a second image sensor. The second housing portion is configured to releasably receive the first housing portion. The second housing portion includes a display and a processor configured to receive the detected fluorescent and white light optical signals and output a representation of the target surface to the display based on the detected optical signals.

1. A portable handheld imaging system, comprising:

at least one excitation light source configured to emit excitation light during fluorescence imaging;

a first filter configured to detect and allow passage of an optical signal to a first image sensor, the optical signal responsive to illumination of a target surface with the excitation light and having a wavelength corresponding to one or more of bacterial fluorescence, bacterial autofluorescence, tissue fluorescence, and tissue autofluorescence;

a white light source configured to emit white light during white light imaging;

a second filter configured to detect and allow passage of an optical signal to a second image sensor, the optical signal responsive to illumination of the target surface with the white light and having a wavelength in the visible range; and

a processor configured to receive the detected fluorescent and white light optical signals and output a representation of the target surface to a display based on the detected optical signals.

2. The system of claim 1, wherein the at least one excitation light source is configured to emit excitation light having a wavelength of about 350nm to about 400nm, about 400nm to about 450nm, about 450nm to about 500nm, about 500nm to about 550nm, about 550nm to about 600nm, about 600nm to about 650nm, about 650nm to about 700nm, about 700nm to about 750nm, about 750nm to about 800nm, about 800nm to about 850nm, about 850nm to about 900nm, and/or combinations thereof.

3. The system of claim 1 or claim 2, wherein the at least one excitation light source is configured to emit excitation light having a wavelength of about 400nm to about 450 nm.

4. The system of any one of claims 1 to 3, wherein the at least one excitation light source is configured to emit excitation light having a wavelength of about 405nm ± 10 nm.

5. The system of any one of claims 1 to 4, wherein the at least one excitation light source is coupled to a housing of the portable handheld imaging system.

6. The system of any of claims 1-5, wherein the first filter is further configured to block passage of optical signals having a wavelength of 405nm ± 10 nm.

7. The system of any one of claims 1 to 6, wherein the first filter is configured to allow optical signals having a wavelength between about 500nm and about 550nm and/or optical signals having a wavelength between about 600nm and about 660nm to pass through the first filter to the first image sensor.
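
In software terms, the pass/block behavior recited in claims 6 and 7 can be modeled as a simple band check. The band edges below are taken from the claims; the function itself is an illustrative sketch, not part of the disclosure:

```python
# Hypothetical model of the first (fluorescence) filter: pass the green
# (~500-550 nm) and red (~600-660 nm) emission bands of claim 7 while
# blocking the reflected 405 nm +/- 10 nm excitation light of claim 6.
PASS_BANDS_NM = [(500.0, 550.0), (600.0, 660.0)]
BLOCK_BAND_NM = (395.0, 415.0)  # 405 nm +/- 10 nm excitation

def passes_first_filter(wavelength_nm: float) -> bool:
    """Return True if light at this wavelength reaches the first image sensor."""
    lo, hi = BLOCK_BAND_NM
    if lo <= wavelength_nm <= hi:
        return False  # reflected excitation light is rejected
    return any(lo <= wavelength_nm <= hi for lo, hi in PASS_BANDS_NM)
```

For example, 520 nm (tissue autofluorescence) and 630 nm (bacterial porphyrin fluorescence) pass, while 405 nm excitation and out-of-band light do not.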

8. The system according to any one of claims 1 to 7, wherein the at least one excitation light source comprises first and second violet/blue LEDs, each LED configured to emit light having a wavelength of 405nm ± 10 nm.

9. The system of any one of claims 1 to 8, further comprising a housing having a display on a front side of the housing.

10. The system of claim 9, wherein the at least one excitation light source is located on a rear side of the housing.

11. The system according to any one of claims 1 to 10, wherein the at least one excitation light source comprises first and second violet/blue LEDs, each LED configured to emit light having a wavelength of 405nm ± 10 nm.

12. The system of claim 11, wherein the first and second violet/blue LEDs are located on opposite sides of a longitudinal axis of the housing, wherein the longitudinal axis passes through the top and bottom of the housing.

13. The system of claim 10, wherein the housing is a modular housing comprising a display unit and an optical unit.

14. The system of claim 13, wherein the optical unit is releasably attached to the display unit.

15. The system of claim 14, wherein the at least one excitation light source is contained in the optical unit.

16. The system of claim 15, wherein the white light source is contained in the optical unit.

17. The system of claim 16, wherein the display unit comprises an interface configured to releasably receive the optical unit.

18. The system of claim 17, wherein the interface is defined at least in part by a heat sink of the system.

19. The system of claim 18, wherein the heat sink surrounds an opening configured to releasably receive the optical unit.

20. The system of any one of claims 1-19, further comprising a thermal sensor configured to detect thermal information about the target surface.

21. The system of any one of claims 1 to 20, further comprising an ambient light sensor configured to indicate when ambient lighting conditions are sufficient to allow fluorescence imaging.
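
The ambient-light gate of claim 21 amounts to comparing a sensor reading against a darkness threshold. A minimal sketch, in which the threshold value is an assumption for illustration and not taken from the patent:

```python
# Hypothetical ambient-light check (claim 21): fluorescence imaging is only
# indicated as reliable when the ambient level is below a darkness threshold.
MAX_AMBIENT_LUX_FOR_FLUORESCENCE = 50.0  # assumed threshold, illustrative only

def fluorescence_imaging_allowed(ambient_lux: float) -> bool:
    """Return True when ambient lighting is dim enough for fluorescence imaging."""
    return ambient_lux <= MAX_AMBIENT_LUX_FOR_FLUORESCENCE
```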

22. The system of any one of claims 1 to 21, further comprising a range finder.

23. The system of any one of claims 1 to 22, further comprising a third filter configured to detect and allow passage of an optical signal to a third image sensor, the optical signal responsive to illumination of the target surface with the white light and having a wavelength in the visible range.

24. The system of claim 23, wherein the processor is further configured to receive image data from the second image sensor and the third image sensor and output a stereoscopic image or a three-dimensional image.
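
The stereoscopic output of claims 23-24 rests on standard stereo geometry: two laterally offset sensors see each point at slightly different pixel positions (disparity), from which depth follows as Z = f * B / d. The focal length and baseline values below are illustrative assumptions, not parameters from the patent:

```python
# Stereo triangulation behind claims 23-24: depth from per-pixel disparity.
def depth_from_disparity(disparity_px: float,
                         focal_length_px: float = 1400.0,  # assumed optics
                         baseline_mm: float = 20.0) -> float:  # assumed sensor spacing
    """Depth (mm) of a point observed with the given disparity (pixels)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_mm / disparity_px
```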

25. The system of any one of claims 1 to 24, further comprising a WiFi and/or bluetooth antenna.

26. The system of any one of claims 1 to 25, wherein the processor is configured to transmit and/or receive data wirelessly.

27. The system of any one of claims 1 to 26, further comprising a power source.

28. The system of any one of claims 1 to 27, wherein the excitation light source comprises a first excitation light source and a second excitation light source.

29. The system of claim 28, wherein the first excitation light source is configured to emit excitation light having a wavelength of about 350nm to about 400nm, about 400nm to about 450nm, about 450nm to about 500nm, about 500nm to about 550nm, about 550nm to about 600nm, about 600nm to about 650nm, about 650nm to about 700nm, about 700nm to about 750nm, about 750nm to about 800nm, about 800nm to about 850nm, about 850nm to about 900nm, and/or combinations thereof.

30. The system of claim 29, wherein the first excitation light source is configured to emit excitation light having a wavelength of about 400nm to about 450 nm.

31. The system of claim 30, wherein the first excitation light source is configured to emit excitation light having a wavelength of about 405nm ± 10 nm.

32. The system according to claim 28, wherein the second excitation light source is configured to emit excitation light having a wavelength of about 350nm to about 400nm, about 400nm to about 450nm, about 450nm to about 500nm, about 500nm to about 550nm, about 550nm to about 600nm, about 600nm to about 650nm, about 650nm to about 700nm, about 700nm to about 750nm, about 750nm to about 800nm, about 800nm to about 850nm, about 850nm to about 900nm, and/or combinations thereof.

33. The system of claim 32, wherein the second excitation light source is configured to emit excitation light having a wavelength of about 750nm to 800 nm.

34. The system of claim 33, wherein the second excitation light source is configured to emit excitation light having a wavelength between about 760nm and about 780 nm.

35. The system of claim 34, wherein the second excitation light source is configured to emit excitation light having a wavelength of about 760nm ± 10 nm.

36. The system of claim 34, wherein the second excitation light source is configured to emit excitation light having a wavelength of about 770nm ± 10 nm.

37. The system of claim 34, wherein the second excitation light source is configured to emit excitation light having a wavelength of about 780nm ± 10 nm.

38. The system of any one of claims 1 to 37, wherein the first image sensor and the second image sensor each comprise a Complementary Metal Oxide Semiconductor (CMOS) sensor.

39. The system of any one of claims 1 to 38, wherein the first filter is fixed to the first image sensor.

40. The system of any one of claims 1 to 39, wherein the second filter is fixed to the second image sensor.

41. The system of any one of claims 1 to 40, further comprising a source of infrared radiation.

42. The system of claim 41, wherein the system is configured to project infrared radiation onto the target surface and detect infrared radiation reflected from the target surface and any infrared fluorescence emitted by the target when excited by an appropriate excitation wavelength.

43. The system of claim 42, wherein the processor is further configured to generate a three-dimensional map of the target surface based on the detected reflected infrared radiation.

44. The system of claim 43, wherein the processor is further configured to generate a three-dimensional fluorescence image of the target surface based on the three-dimensional map, the two-dimensional white light image of the target surface, and the two-dimensional fluorescence image of the target surface.
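
Claim 44 combines three inputs: the 3-D map from reflected infrared, the 2-D white-light image, and the 2-D fluorescence image. One way to picture this is to treat each pixel as a 3-D vertex carrying a blended color. The array shapes and the 50/50 blend weight are assumptions for illustration, not the patented method:

```python
import numpy as np

# Sketch of claim 44: drape the 2-D white-light and fluorescence images over
# the 3-D map recovered from reflected IR (claim 43). Each pixel becomes a
# vertex (x, y, depth) with a blended RGB color.
def build_3d_fluorescence_model(depth_mm, white_rgb, fluor_rgb, blend=0.5):
    """depth_mm: (h, w); white_rgb, fluor_rgb: (h, w, 3). Returns vertices, colors."""
    h, w = depth_mm.shape
    ys, xs = np.mgrid[0:h, 0:w]
    vertices = np.stack([xs, ys, depth_mm], axis=-1).astype(float)  # (h, w, 3)
    colors = (1 - blend) * white_rgb + blend * fluor_rgb            # (h, w, 3)
    return vertices, colors
```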

45. A portable modular handheld imaging system comprising:

a first housing portion comprising:

at least one excitation light source configured to emit excitation light during fluorescence imaging,

a first filter configured to detect and allow passage of an optical signal to a first image sensor, the optical signal responsive to illumination of a target surface with the excitation light and having a wavelength corresponding to one or more of bacterial fluorescence, bacterial autofluorescence, tissue fluorescence, and tissue autofluorescence,

a white light source configured to emit white light during white light imaging, and

a second filter configured to detect and allow passage of an optical signal to a second image sensor, the optical signal responsive to illumination of the target surface with the white light and having a wavelength in the visible range; and

a second housing portion configured to releasably receive the first housing portion and comprising:

a display, and

a processor configured to receive the detected fluorescent and white light optical signals and output a representation of the target surface to the display based on the detected optical signals.

46. The system according to claim 45, wherein the at least one excitation light source is configured to emit excitation light having a wavelength of about 350nm to about 400nm, about 400nm to about 450nm, about 450nm to about 500nm, about 500nm to about 550nm, about 550nm to about 600nm, about 600nm to about 650nm, about 650nm to about 700nm, about 700nm to about 750nm, about 750nm to about 800nm, about 800nm to about 850nm, about 850nm to about 900nm, and/or combinations thereof.

47. The system according to claim 45 or claim 46, wherein the at least one excitation light source is configured to emit excitation light having a wavelength of about 400nm to about 450 nm.

48. The system according to any one of claims 45 to 47, wherein the at least one excitation light source is configured to emit excitation light having a wavelength of about 405nm ± 10 nm.

49. The system of any one of claims 45 to 48, wherein the first filter is further configured to block passage of optical signals having a wavelength of 405nm ± 10 nm.

50. The system of any one of claims 45 to 49, wherein the first filter is configured to allow optical signals having a wavelength between about 500nm and about 550nm and/or optical signals having a wavelength between about 600nm and about 660nm to pass through the first filter to the first image sensor.

51. The system according to any one of claims 45 to 50, wherein the at least one excitation light source comprises first and second violet/blue LEDs, each LED configured to emit light having a wavelength of 405nm ± 10 nm.

52. The system of any one of claims 45-51, wherein the second housing portion further comprises a power source.

53. The system of claim 52, wherein the second housing further comprises an outer surface having contacts for charging the power source.

54. The system of any one of claims 45-53, wherein the second housing further comprises a heat sink.

55. The system of claim 54, wherein the heat sink defines an opening in the second housing configured to releasably receive the first housing.

56. The system of any one of claims 45 to 55, wherein the first housing further comprises a thermal sensor configured to detect thermal information about the target surface.

57. The system of any one of claims 45 to 56, wherein the first housing further comprises an ambient light sensor configured to indicate when ambient lighting conditions are sufficient to allow fluorescence imaging.

58. The system of any one of claims 45 to 57, wherein the first housing further comprises a range finder.

59. The system of any one of claims 45 to 58, wherein the first housing further comprises a third filter configured to detect and allow passage of an optical signal to a third image sensor, the optical signal being responsive to illumination of the target surface with the white light and having a wavelength in the visible range.

60. The system according to any one of claims 45 to 59, wherein the first housing further comprises a second excitation light source configured to emit excitation light having a wavelength of about 350nm to about 400nm, about 400nm to about 450nm, about 450nm to about 500nm, about 500nm to about 550nm, about 550nm to about 600nm, about 600nm to about 650nm, about 650nm to about 700nm, about 700nm to about 750nm, about 750nm to about 800nm, about 800nm to about 850nm, about 850nm to about 900nm, and/or combinations thereof.

61. The system according to any one of claims 45 to 60, wherein the second excitation light source is configured to emit excitation light having a wavelength of about 750nm to 800 nm.

62. The system of claim 61, wherein the second excitation light source is configured to emit excitation light having a wavelength between about 760nm and about 780 nm.

63. The system according to claim 62, wherein the second excitation light source is configured to emit excitation light having a wavelength of about 760nm ± 10 nm.

64. The system according to claim 62, wherein the second excitation light source is configured to emit excitation light having a wavelength of about 770nm ± 10 nm.

65. The system according to claim 62, wherein the second excitation light source is configured to emit excitation light having a wavelength of about 780nm ± 10 nm.

66. The system of any one of claims 45-65, wherein the first housing further comprises a polarizing filter.

67. A portable modular handheld imaging system kit comprising:

a plurality of optical housing sections, each of the plurality of optical housing sections comprising:

at least one excitation light source configured to emit excitation light during fluorescence imaging,

a first filter configured to detect and allow passage of an optical signal to a first image sensor, the optical signal responsive to illumination of a target surface with the excitation light and having a wavelength corresponding to one or more of bacterial fluorescence, bacterial autofluorescence, tissue fluorescence, and tissue autofluorescence,

a white light source configured to emit white light during white light imaging, and

a second filter configured to detect and allow passage of an optical signal to a second image sensor, the optical signal responsive to illumination of the target surface with the white light and having a wavelength in the visible range; and

a base housing portion configured to releasably interchangeably receive each of the plurality of optical housing portions and comprising:

a display,

a power source configured to supply power to the at least one excitation light source and the white light source, and

a processor configured to receive the detected fluorescent and white light optical signals and output a representation of the target surface to the display based on the detected optical signals.

68. The kit according to claim 67, wherein in each of the plurality of optical housing portions, the at least one excitation light source is configured to emit excitation light having a wavelength of about 350nm to about 400nm, about 400nm to about 450nm, about 450nm to about 500nm, about 500nm to about 550nm, about 550nm to about 600nm, about 600nm to about 650nm, about 650nm to about 700nm, about 700nm to about 750nm, about 750nm to about 800nm, about 800nm to about 850nm, about 850nm to about 900nm, and/or combinations thereof.

69. The kit according to claim 67 or 68, wherein the at least one excitation light source is configured to emit excitation light having a wavelength of about 400nm to about 450 nm.

70. The kit according to any one of claims 67 to 69, wherein the at least one excitation light source is configured to emit excitation light having a wavelength of about 405nm ± 10 nm.

71. The kit of any one of claims 67 to 70, wherein the first filter is configured to allow optical signals having a wavelength between about 500nm and about 550nm and/or optical signals having a wavelength between about 600nm and about 660nm to pass through the first filter to the first image sensor.

72. The kit according to any one of claims 67 to 71, wherein the at least one excitation light source comprises first and second violet/blue LEDs, each LED configured to emit light having a wavelength of 405nm ± 10 nm.

73. The kit of any one of claims 67 to 72, wherein one of the plurality of optical housing portions further comprises a third filter configured to detect and allow passage of an optical signal to a third image sensor, the optical signal responsive to illumination of the target surface with the white light and having a wavelength in the visible range.

74. The kit of claim 73, wherein a second one of the plurality of optical housing portions further comprises a second excitation light source configured to emit excitation light having a different wavelength than the first excitation light source.

75. The kit of claim 74, wherein a second one of the plurality of optical housing portions is formed as an endoscope housing portion.

76. The kit of any one of claims 67 to 75, wherein one of the plurality of optical housing portions further comprises a range finder.

77. The kit of any one of claims 67 to 76, wherein one of the plurality of optical housing portions further comprises a thermal sensor configured to detect thermal information about the target surface.

78. The kit of any one of claims 67 to 77, wherein the first housing further comprises an ambient light sensor configured to indicate when ambient lighting conditions are sufficient to allow fluorescence imaging.

79. The kit of any one of claims 67 to 78, wherein the base housing further comprises an outer surface having contacts for charging the power source.

80. The kit of any one of claims 67 to 79, wherein the base housing further comprises a heat sink.

81. The kit of any one of claims 67 to 80, wherein the heat sink defines an opening in the base housing configured to releasably receive one of the plurality of optical housings.

82. The kit of any one of claims 67-81, further comprising a darkening drape configured to attach to one of the plurality of optical housings.

83. The kit of claim 82, wherein the darkening drape is configured to reduce ambient light in a field of view of the first image sensor.

84. The kit of claim 82, further comprising a plurality of darkening drapes, each darkening drape configured to attach to a respective one of the plurality of optical housings.

85. The kit of claim 75, further comprising a darkening drape configured to attach to the endoscope optical housing.

86. The kit of claim 85, wherein the darkening drape is configured to reduce ambient light in a field of view of the first image sensor.

87. The kit of claim 86, wherein the darkening drape is further configured to provide sterility in a surgical field and/or to protect the optical housing portion from contamination.

88. The kit of any one of claims 67 to 87, wherein one of the plurality of optical housing portions further comprises a polarizing filter.

89. The kit of any one of claims 67-88, wherein the base housing and one of the plurality of optical housing portions collectively form an imaging device, and further comprising a sterile drape configured to create a sterile barrier between the imaging device and an environment in which the imaging device is used.

90. The kit of any one of claims 67 to 89, further comprising one or more imaging or contrast agents.

91. A method of operating a modular handheld fluorescence-based imaging device, comprising:

selecting an optical housing comprising optical components including at least one excitation light source for fluorescence imaging;

connecting the selected optics housing to a base housing of the imaging device to provide power from a power source in the base housing to the optical components in the optics housing;

illuminating a target with the at least one excitation light source to cause one or more of a portion, a component, and a biomarker of the illuminated portion of the target to fluoresce, reflect, or absorb light;

filtering optical signals responsive to illumination of the target with the excitation light, wherein filtering the optical signals includes blocking reflected excitation light and allowing optical signals having wavelengths corresponding to one or more of bacterial fluorescence, bacterial autofluorescence, tissue autofluorescence, and exogenous tissue fluorescence to pass through a fluorescence filter contained in the optical housing;

detecting the filtered optical signal with an image sensor contained in the optics housing; and

displaying the detected filtered signal on at least one display of the base housing as a composite image of the illuminated portion of the target, the composite image including fluorescent representations of various tissue components present in the illuminated portion of the target.
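
The final displaying step of claim 91 fuses the filtered fluorescence channels into a composite frame. A minimal sketch, assuming the green channel carries the ~500-550 nm tissue band and the red channel the ~600-660 nm bacterial band; the channel assignments and overlay rule are illustrative, not the claimed image-processing method:

```python
import numpy as np

# Sketch of the composite display image of claim 91: stack two filtered
# single-channel fluorescence frames into one RGB frame for the display.
def composite_fluorescence_image(green_ch, red_ch):
    """green_ch, red_ch: (h, w) arrays. Returns an (h, w, 3) RGB composite."""
    h, w = green_ch.shape
    rgb = np.zeros((h, w, 3), dtype=green_ch.dtype)
    rgb[..., 0] = red_ch    # red: e.g. ~600-660 nm fluorescence band
    rgb[..., 1] = green_ch  # green: e.g. ~500-550 nm fluorescence band
    return rgb
```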

92. The method of claim 91, further comprising:

illuminating a target with a white light source contained in the optical housing;

filtering an optical signal with a visible light filter contained in the optical housing in response to illumination of the target with the white light; and

detecting the filtered optical signal with an image sensor contained in the optics housing.

93. An imaging system kit, comprising:

the imaging system of any of claims 1 to 66; and

a sterile drape configured to create a sterile barrier between the imaging system and an environment in which the imaging system is used.

94. The kit of claim 93, further comprising an imaging drape configured to reduce ambient light in an imaging environment of the imaging system.

95. The kit of claim 94, wherein the imaging drape comprises a connector element configured to receive an optical housing of the imaging system.

96. The kit of any one of claims 93 to 95, further comprising one or more imaging or contrast agents.

Technical Field

A system for multi-modality imaging and analysis is disclosed. In particular, the systems and methods may be adapted to collect data on biochemical, biological, and/or non-biological substances. For example, the data may include one or more of white light data, fluorescence data, thermal data, and infrared data, for applications such as wound care in humans and animals.

Background

Wound care is a significant clinical challenge. Healing and chronic non-healing wounds are associated with changes in many biological processes, including inflammation, hyperplasia, and remodeling of connective tissue, as well as the common major problem of bacterial infection. Some wound infections are not clinically evident and contribute to the growing economic burden associated with wound care, especially in the elderly population. Currently, the gold standard of wound assessment involves direct visual inspection of the wound site under white light, combined with the collection of bacterial swabs and tissue biopsies, which yields delayed, expensive, and often insensitive bacteriological results. This can affect the timing and effectiveness of treatment. Qualitative and subjective visual assessment provides only a general picture of the wound site; it does not reveal the underlying biological and molecular changes occurring at the tissue and cellular level. In clinical wound management, there is a need for a relatively simple, complementary method that uses such "biological and molecular" information to improve the early recognition of these occult changes. Early identification of at-risk wounds can guide therapeutic intervention and allow monitoring of treatment response over time, thereby greatly reducing morbidity and mortality, particularly for chronic wounds.

Wound care and management are major clinical challenges that place a tremendous burden on global health care [Bowler et al., Clin Microbiol Rev. 2001, 14: 244-; Cutting et al., Journal of Wound Care 1994, 3: 198-201; Dow et al., Ostomy/Wound Management 1999, 45: 23-40]. Wounds are generally classified as wounds without tissue loss (e.g., in surgery) and wounds with tissue loss, such as burns, traumatic wounds, abrasions, or wounds arising secondary to chronic disease (e.g., venous stasis, diabetic ulcers, or pressure sores) and iatrogenic wounds (e.g., skin graft donor sites and skin abrasions), as well as pilonidal sinuses, non-healing surgical wounds, and chronic cavity wounds. Wounds are also classified by the number of tissue layers involved: superficial wounds involve only the epidermis; partial-thickness wounds involve the epidermis and dermis; and full-thickness wounds involve subcutaneous fat or deeper tissue in addition to the epidermis and dermis. While restoration of tissue continuity following injury is a natural phenomenon, infection, quality of healing, speed of healing, fluid loss, and other complications that prolong healing time are major clinical challenges. Most wounds heal without any complications. Chronic non-healing wounds, however, involve progressive tissue loss and present a significant challenge to wound care practitioners and researchers. Unlike surgical incisions, where tissue loss is relatively small and wounds typically heal without significant complications, chronic wounds disrupt the normal healing process, which is often insufficient by itself to effect repair. Delayed healing is often the result of impaired physiological function at the wound site [Winter (1962) Nature 193: 293-294], commonly occurring in venous stasis and diabetic ulcers, or from prolonged local pressure in immunosuppressed and mobility-impaired elderly patients. These chronic conditions increase the cost of care and reduce the patient's quality of life. As these patient populations grow, the need for advanced wound care products will increase.

Conventional clinical assessment of acute and chronic wounds remains unsatisfactory. It is typically based on a complete patient history, qualitative and subjective clinical findings, and simple visual assessment under ambient white light with the "naked eye," sometimes supplemented by color photography to capture the overall appearance of the wound under white light illumination [Perednia (1991) J Am Acad Dermatol. 25: 89-108]. Treatment progress must also be re-evaluated regularly and the intervention modified as appropriate. Wound assessment terminology is not standardized, many questions surrounding wound assessment remain unanswered, the key wound parameters to be measured in clinical practice have not been agreed upon, and existing wound assessment techniques vary in accuracy and reliability. Visual assessment is often combined with swabs and/or tissue biopsies for bacterial culture to make a diagnosis. Collecting bacterial swabs at the time of wound examination has the significant advantage of identifying specific bacterial/microbial species [Bowler, 2001; Cutting, 1994; Dow, 1999; Dow G, in: Krasner et al., eds., Chronic Wound Care: A Clinical Source Book for Healthcare Professionals, 3rd edition, Wayne, PA: HMP Communications, 2001: 343-356]. However, multiple swabs and/or biopsies are often collected randomly from the wound, and some swabbing techniques may actually spread microbes around the wound during collection, affecting healing time and patient morbidity [Dow, 1999]. This is a particular problem for large chronic (non-healing) wounds: although many swabs may be collected, the efficiency of detecting bacterial presence with current swab and biopsy protocols is suboptimal (diagnostic insensitivity).
Thus, current methods of obtaining a swab or tissue biopsy from a wound site for subsequent bacterial culture rely on non-targeted or "blind" swab or punch-biopsy techniques, and have not been optimized to minimize trauma to the wound or to maximize the diagnostic yield of bacteriological testing. Furthermore, obtaining bacteriological swabs and biopsy samples can be laborious, invasive, painful, and expensive, and, more importantly, bacteriological culture results typically take about 2-3 days to return from the laboratory and may be inconclusive [Serena et al. (2008) Int J Low Extrem Wounds 7(1): 32-5; Gardner et al. (2007) Wounds 19(2): 31-38], thereby delaying accurate diagnosis and treatment [Dow, 1999]. Bacterial swabs therefore do not provide real-time detection of a wound's infection status. While wound debridement appears simple, if not done properly it may result in inappropriate treatment, increased patient morbidity, and longer hospital stays [Bowler, 2001; Cutting, 1994; Dow, 1999; Dow, 2001]. The lack of a non-invasive imaging method to objectively and rapidly assess wound repair at the biological level (in more detail than mere appearance or morphology), and to help target the collection of bacteriological swabs and tissue biopsy samples, is a major obstacle to clinical wound assessment and treatment. Alternative methods are therefore highly desirable.

As wounds (chronic and acute) heal, many key biological changes occur at the tissue and cellular level at the wound site [Cutting, 1994]. Wound healing involves a complex and dynamic interaction of biological processes, divided into four overlapping phases (hemostasis, inflammation, cell proliferation, and maturation or remodeling of connective tissue) that govern the pathophysiology of wound healing [The physiological basis of wound healing, in Developments in Wound Care, PJB Publications Ltd., 5-17, 1994]. A common major complication during the wound healing process (which lasts days to months) is infection by bacteria and other microorganisms [Cutting, 1994; Dow, 1999]. This can seriously impede the healing process and lead to severe complications. All wounds contain bacteria at levels ranging from contamination through colonization and critical colonization to infection, and the diagnosis of bacterial infection is based on clinical symptoms and signs (e.g., visual and odor cues).

The most commonly used terms for wound infection include wound contamination, wound colonization, wound infection, and, most recently, critical colonization. Wound contamination refers to the presence of bacteria within the wound without any host response [Ayton M. Nurs Times 1985, 81(46): suppl 16-19]; wound colonization refers to the presence of bacteria within the wound that multiply or initiate a host response [Ayton, 1985]; critical colonization refers to multiplication of bacteria leading to delayed wound healing, often associated with an exacerbation of pain not previously reported, but still without an apparent host response [Falanga et al., J Invest Dermatol 1994, 102(1): 125-27; Kingsley A, Nurs Stand 2001, 15(30): 50-54, 56, 58]. Wound infection refers to the deposition and proliferation of bacteria in tissue, accompanied by a host response [Ayton, 1985]. In practice, the term "critical colonization" may be used to describe a wound that is considered to be transitioning from colonization to local infection. However, the challenge in the clinical setting is to ensure that this condition is quickly and confidently recognized and that the bacterial bioburden is reduced as soon as possible, perhaps by the use of topical antibacterial drugs. Potential wound pathogens can be divided into different groups depending on their structure and metabolic capabilities, such as bacteria, fungi, spores, protozoa, and viruses [Cooper et al., Wound Infection and Microbiology, Medical Communications (UK) Ltd for Johnson & Johnson Medical, 2003]. Although viruses do not normally cause wound infection, bacteria can infect skin lesions formed during certain viral diseases. Such infections can occur in a variety of settings, including healthcare settings (hospitals, clinics) and at home or in chronic care facilities. Control of wound infection is increasingly complex, but treatment is not always guided by microbiological diagnosis.
The diversity of microorganisms and the high incidence of polymicrobial flora in most chronic and acute wounds call into question the value of identifying one or more bacterial pathogens from wound cultures. Early identification of the pathogens of a wound infection may help wound care providers take appropriate action. In addition, increased bacterial load causes incorrect collagen formation and leads to excessively vascularized, loose granulation tissue, often leading to wound breakdown [Sapico et al. (1986) Diagn Microbiol Infect Dis 5: 31-38].

Accurate and clinically relevant wound assessment is an important clinical tool, but this process remains a great challenge. Current visual assessment in clinical practice provides only a general view of the wound site (e.g., the presence of purulent material and scabs). Current best clinical practice fails to take full advantage of vital objective information about potentially critical biological changes occurring at the tissue and cell level (e.g., contamination, colonization, infection, matrix remodeling, inflammation, bacterial/microbial infection, and necrosis), as these indicators are i) not readily available at the time of wound examination, and ii) not currently integrated into conventional wound management processes. Direct visual assessment of wound health using white light relies on detection of color and topographical/texture changes in and around the wound, and may therefore be unreliable or incapable of detecting subtle changes in tissue remodeling. More importantly, because bacteria are invisible under white light illumination, direct visual assessment of wounds often fails to detect the presence of bacterial infection. Infections are diagnosed clinically with microbiological testing to identify the organisms and their susceptibility to antibiotics. While physical indications of bacterial infection (e.g., purulent exudate, crusting, swelling, erythema) can be readily observed in most wounds using white light, such observation is often greatly delayed, by which time the patient is already at increased risk of morbidity (and other infection-related complications) and mortality. Thus, standard white light direct visualization fails to detect the early presence of bacteria themselves or to identify the type of bacteria within the wound.

The implantation and transplantation of stem cells has recently attracted interest, for example, for wound care and therapy. However, tracking the proliferation of stem cells following implantation or transplantation is currently challenging. Tracking and identifying cancer cells also presents challenges. It would be desirable if these cells could be monitored in a minimally invasive or non-invasive manner.

It would also be useful to provide a method for detecting contamination of other target surfaces, including non-biological targets.

Disclosure of Invention

The present disclosure may address one or more of the above-identified problems and/or may demonstrate one or more of the above-identified desirable characteristics. Other features and/or advantages will become apparent from the following description.

According to one aspect of the present disclosure, a portable handheld imaging system is provided. The system includes at least one excitation light source configured to emit excitation light during fluorescence imaging. The first filter is configured to detect and allow passage of an optical signal to the first image sensor, the optical signal being responsive to illumination of the target surface with excitation light and having a wavelength corresponding to one or more of bacterial fluorescence, bacterial autofluorescence, tissue fluorescence, and tissue autofluorescence. The white light source is configured to emit white light during white light imaging. The second filter is configured to detect and allow passage of an optical signal to the second image sensor, the optical signal being responsive to illumination of the target surface with white light and having a wavelength in the visible range. And, the processor is configured to receive the detected fluorescent and white light optical signals and output a representation of the target surface to the display based on the detected optical signals.

In accordance with another aspect of the present disclosure, a portable modular handheld imaging system is provided. The modular system includes a first housing portion and a second housing portion. The first housing portion includes: at least one excitation light source configured to emit excitation light during fluorescence imaging; a first filter configured to detect and allow passage of an optical signal to the first image sensor, the optical signal responsive to illumination of the target surface with excitation light and having a wavelength corresponding to one or more of bacterial fluorescence, bacterial autofluorescence, tissue fluorescence, and tissue autofluorescence; a white light source configured to emit white light during white light imaging; and a second filter configured to detect and allow passage of an optical signal to the second image sensor, the optical signal being responsive to illumination of the target surface with white light and having a wavelength in the visible range. The second housing portion is configured to releasably receive the first housing portion and includes a display and a processor configured to receive the detected fluorescent and white light optical signals and output a representation of the target surface to the display based on the detected optical signals.

In accordance with another aspect of the present disclosure, a portable modular handheld imaging system kit is provided. The kit includes a plurality of optical housing portions and a base housing portion. Each of the plurality of optical housing sections includes: at least one excitation light source configured to emit excitation light during fluorescence imaging; a first filter configured to detect and allow passage of an optical signal to the first image sensor, the optical signal responsive to illumination of the target surface with excitation light and having a wavelength corresponding to one or more of bacterial fluorescence, bacterial autofluorescence, tissue fluorescence, and tissue autofluorescence; a white light source configured to emit white light during white light imaging; and a second filter configured to detect and allow passage of an optical signal to the second image sensor, the optical signal being responsive to illumination of the target surface with white light and having a wavelength in the visible range. The base housing portion is configured to releasably interchangeably receive each of the plurality of optical housing portions. The base housing portion includes: a display; a power supply configured to supply power to the at least one excitation light source and the white light source; and a processor configured to receive the detected fluorescent and white light optical signals and output a representation of the target surface to a display based on the detected optical signals.

In accordance with yet another aspect of the present disclosure, a method of operating a modular handheld fluorescence-based imaging device is provided. The method comprises the following steps: selecting an optical housing comprising optical components comprising at least one excitation light source for fluorescence imaging, and connecting the selected optical housing to a base housing of the imaging device to provide power from a power source in the base housing to the optical components in the optical housing. The method further comprises the following steps: illuminating the target with at least one excitation light source to cause one or more of a portion, a component, and a biomarker of the illuminated portion of the target to fluoresce, reflect, or absorb light; and filtering the optical signals in response to illumination of the target with the excitation light, wherein filtering the plurality of optical signals includes preventing the reflected excitation light from passing through and allowing optical signals having wavelengths corresponding to one or more of bacterial fluorescence, bacterial autofluorescence, tissue autofluorescence, and exogenous tissue fluorescence to pass through a fluorescence filter contained in the optical housing. The method further comprises the following steps: the filtered optical signals are detected with an image sensor contained in the optical housing, and a composite image of the illuminated portion of the target including fluorescent representations of various tissue components present in the illuminated portion of the target is displayed on at least one display of the base housing.

Drawings

The disclosure may be understood from the following detailed description, taken alone or in conjunction with the accompanying drawings. The accompanying drawings are included to provide a further understanding, and are incorporated in and constitute a part of this specification. The drawings illustrate one or more exemplary embodiments of the disclosure and, together with the description, serve to explain various principles and operations.

Fig. 1 is a front view of a first embodiment of a modular handheld imaging apparatus according to the present disclosure.

Fig. 2 is a rear view of the modular handheld imaging apparatus of fig. 1.

Fig. 3 is a front perspective view of the modular handheld imaging apparatus of fig. 1.

Fig. 4 is a rear perspective view of the modular handheld imaging apparatus of fig. 1.

Fig. 5A is a perspective view of a first optical housing detached from a base housing of a second embodiment of a modular handheld imaging system according to the present disclosure.

Fig. 5B is a perspective view of a second optical housing detached from a base housing of a third embodiment of a modular handheld imaging system according to the present disclosure.

Fig. 6 is a front view of a rendering of a fourth embodiment of a modular handheld imaging device according to the present disclosure.

Fig. 7 is a rendered rear view of the modular handheld imaging apparatus of fig. 6 according to the present disclosure.

Fig. 8 is an example of White Light (WL), Fluorescence (FL), and thermal images acquired according to an exemplary embodiment of a modular handheld imaging device of the present disclosure.

Fig. 9 is an example of measurements made by an exemplary embodiment of a modular handheld imaging apparatus according to the present disclosure.

Fig. 10A-10D are examples of images acquired and created in forming a three-dimensional fluoroscopic image of a target using an exemplary embodiment of a modular handheld imaging apparatus according to one aspect of the present disclosure.

Fig. 11A to 11E illustrate a charging station according to the present disclosure, shown separately (fig. 11A to 11C) and in use with an imaging device (fig. 11D and 11E).

Fig. 12 is an exploded view of an example embodiment of an optical housing of an imaging device according to an aspect of the present disclosure.

Fig. 13 is an example embodiment of a printed circuit board for use in an imaging device according to one aspect of the present disclosure.

Fig. 14A and 14B illustrate example hardware block diagrams used in the imaging apparatus of the present disclosure.

Fig. 15A-15F illustrate example embodiments of drapes according to the present disclosure, shown unconnected (fig. 15A-15C) and connected (fig. 15D-15F) to a portable handheld imaging device.

Fig. 15G-15H illustrate an example portable handheld imaging device connected to the drape of fig. 15D-15F according to the present disclosure.

Fig. 16A-16C illustrate an example embodiment of a sterile drape for use with an imaging apparatus (fig. 16A-16B) according to the present disclosure, and a sterile drape over the imaging apparatus when the imaging apparatus is connected to the darkened drape/imaging drape (fig. 16C).

Detailed Description

Wound progression is currently monitored manually. The National Pressure Ulcer Advisory Panel (NPUAP) developed the Pressure Ulcer Scale for Healing (PUSH) tool, which outlines a five-step method of characterizing pressure ulcers. The tool uses three parameters to determine a quantitative score, which is then used to monitor the pressure ulcer over time. Qualitative parameters include wound size, tissue type, exudate or drainage rate, and thermal readings after dressing removal. The wound may be further characterized by its odor and color. This assessment of wounds does not currently include critical biological and molecular information about the wound. Thus, all descriptions of wounds are subjective and are manually recorded by the attending physician or nurse.

There is a need for a robust, cost-effective method or apparatus based on non-invasive and rapid imaging for objectively assessing changes in a wound at the biological, biochemical and cellular levels, and for rapidly, sensitively and non-invasively detecting the early presence of bacteria/microorganisms within the wound. Such a method or device for detecting key biological tissue changes in wounds can be used as an aid in conjunction with conventional clinical wound management methods to guide key clinical pathological decisions in patient care. Such devices may be compact, portable, and capable of real-time non-invasive and/or non-contact interrogation of wounds in a safe and convenient manner, which may allow the handheld imaging device to seamlessly adapt to conventional wound management practices and be user-friendly to clinicians, nurses, and wound specialists. Handheld imaging devices may also be used in home care environments (including patient self-use) as well as in military battlefield environments. In addition, such image-based devices can provide the ability to monitor wound treatment response and healing in real time by incorporating valuable "bioinformatic" image guidance into the clinical wound assessment process. This may ultimately lead to potentially new diagnostics, treatment planning, treatment response monitoring, and thus "adaptive" intervention strategies, which may allow for enhanced wound healing responses at the individual patient level. Accurate identification of systemic, local, and molecular factors behind individual patient wound healing problems may allow for better tailored treatment.

The MolecuLight i:X device has made great strides in addressing many of the problems discussed above. The MolecuLight i:X device allows the clinician to quickly, safely, and easily view bacteria and measure wounds at the point of care. The basis and method of use of the MolecuLight i:X device are described in U.S. Patent No. 9,042,967, the national phase of application PCT/CA2009/000680, filed internationally on May 20, 2009, which claims the benefit of U.S. Provisional Application No. 61/054,780, filed on May 20, 2008, each of which is incorporated herein by reference in its entirety.

Another imaging device, disclosed for visualization of cancer, is described in U.S. Provisional Application No. 62/625,983 (filed on 3.2.2018), entitled "Devices, Systems, and Methods for Tumor Visualization and Removal," U.S. Provisional Application No. 62/625,967 (filed on 3.2.2018), entitled "Devices, Systems, and Methods for Tumor Visualization and Removal," and International Patent Application No. PCT/CA2019/000015 (filed on 1.2.2019), entitled "Devices, Systems, and Methods for Tumor Visualization and Removal," each of which is incorporated herein by reference in its entirety. Although disclosed in the context of visualizing cancer, the disclosed systems and methods relate to visualization and imaging of tissue autofluorescence and tissue fluorescence, and details regarding the construction, function, and operation of the exemplary devices described therein may be similar or identical to portions of the systems described herein.

Molecular light i: the X-ray device and devices disclosed in the present application utilize tissue autofluorescence imaging, which provides a unique means of obtaining biologically relevant information of normal and diseased tissues in real time, allowing differentiation between normal and diseased tissue states. The autofluorescence imaging device can be used for rapid, non-invasive, contactless real-time imaging of wounds to detect and utilize rich biological information of wounds, to overcome existing limitations, and to improve clinical care and management.

In the present application, systems, methods, and apparatus for fluorescence-based imaging are disclosed. One embodiment of the device is a portable optical digital imaging device. The device may utilize a combination of White Light (WL) imaging, Fluorescence (FL) imaging, Infrared (IR) imaging, thermal imaging, and/or three-dimensional mapping, and may provide real-time wound imaging, assessment, recording/documentation, monitoring, and/or care management. The device may be hand-held, compact and/or lightweight. For example, the apparatus may comprise: at least one excitation light source configured to emit excitation light during fluorescence imaging; a first filter configured to detect and allow passage of an optical signal to the first image sensor, the optical signal responsive to illumination of the target surface with excitation light and having a wavelength corresponding to one or more of bacterial fluorescence, bacterial autofluorescence, tissue fluorescence, and tissue autofluorescence; a white light source configured to emit white light during white light imaging; a second filter configured to detect and allow passage of an optical signal to a second image sensor, the optical signal responsive to illumination of the target surface with white light and having a wavelength in the visible range; and a processor configured to receive the detected fluorescent and white light optical signals and output a representation of the target surface to a display based on the detected optical signals. The device and method may be suitable for monitoring wounds in humans and animals.
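A processor configured as described above might combine the two detected signal streams roughly as follows. This is a minimal sketch, not the disclosed implementation: it assumes both sensors deliver registered HxWx3 8-bit RGB frames and that a simple alpha blend suffices for the displayed representation (a real device would also handle registration between the two sensors).

```python
import numpy as np

def compose_display_frame(wl_frame, fl_frame, fl_alpha=0.5):
    """Blend a white-light frame with a fluorescence frame for display.

    Hypothetical sketch: wl_frame and fl_frame are HxWx3 uint8 RGB arrays
    from the two image sensors; fl_alpha weights the fluorescence overlay.
    """
    wl = wl_frame.astype(np.float32)
    fl = fl_frame.astype(np.float32)
    blended = (1.0 - fl_alpha) * wl + fl_alpha * fl
    return np.clip(blended, 0, 255).astype(np.uint8)
```

With `fl_alpha=0` the output is the plain white-light image; with `fl_alpha=1` it is the fluorescence image alone, so a single slider could move the display between the two modes.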

In another exemplary embodiment, the device may be a modular handheld imaging device. In such embodiments, the device includes a base portion, also referred to herein as a base housing, and an optical portion, also referred to herein as an optical housing or optical housing portion. The optical portion is releasably received by the base portion and is interchangeable with other optical portions, each configured for a particular application or for capturing particular features and optical information from the object being imaged. Thus, the user may select an optical housing based on the capabilities required for imaging in a given situation.

The modular handheld imaging device may be packaged and/or sold as part of a kit in which a base portion and two or more optical housings are provided, each optical housing having optical characteristics different from those of the others. Properties that may vary from one optical housing to another include the following non-limiting examples, which may be included in any combination in each optical housing: the number of image sensors configured for white light imaging (i.e., combined with a filter for white light imaging); and the number of image sensors configured for fluorescence imaging, wherein different image sensors for fluorescence imaging may be paired with different filters to allow different ranges of fluorescence emissions to pass, each range configured to capture specific features of a target (e.g., blood vessels or microvessels, collagen, elastin, blood, bone, bacteria, malignancy, lymphatic vessels, immune cells, adipose tissue, cartilage, tendons, nerves, gastrointestinal tissue, skin, pre-malignant or benign tissue, bodily fluids, urine, blood, saliva, tears, mucus, mucosal tissue, dermal tissue, and exogenous fluorescent agents, drugs, etc.).

The image sensor is configured to capture still images or video.

The number and type of excitation light sources may also vary between optical housings. The excitation light source is configured to emit excitation light having a wavelength of about 350nm to about 400nm, about 400nm to about 450nm, about 450nm to about 500nm, about 500nm to about 550nm, about 550nm to about 600nm, about 600nm to about 650nm, about 650nm to about 700nm, about 700nm to about 750nm, about 750nm to about 800nm, about 800nm to about 850nm, about 850nm to about 900nm, about 900nm to about 950nm, about 950nm to about 1000nm, and/or combinations thereof. The shape of the optical housing may also vary from one housing to another depending on the particular application. For example, the particular shape may be useful for particular applications, e.g., into a confined anatomical space such as a recess, oral cavity, nasal cavity, anal region, abdominal region, ear, etc. In this case, the optical housing may have the form of an endoscopic accessory. The material forming the optical housing may vary from one housing to another. For example, the housing may have a flexible patient facing portion or a rigid patient facing portion, depending on the application in which the imaging device is to be used. In some embodiments, the optical housing may be made waterproof or water resistant. In some embodiments, the housing may be made of a material that is inherently resistant to bacterial growth, or a material having a surface texture or topology that is resistant to microbial growth, such as a rough nano-surface. The dimensions of the optical housing may vary depending on the size and number of components contained therein. Various exemplary embodiments of the optical housing may also include features such as an ambient light sensor, a range finder, a thermal imaging sensor, a structured light emitter, an infrared radiation source and detector for three-dimensional imaging, a laser for taking measurements, and the like, in any combination. 
Additionally or alternatively, the imaging device may also have an external channel embedded in the housing to enable delivery of tools, such as biopsy forceps, fiber optic spectroscopy probes, or other tools requiring fluorescence (FL) image-guided targeting, to collect tissue, ablate tissue, cauterize tissue, or interrogate fluorescent tissue.
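The interchangeable-housing concept above can be illustrated with a small configuration sketch. The housing names, excitation band assignments, and feature lists below are hypothetical placeholders for illustration only, not values taken from the disclosure:

```python
# Hypothetical catalogue of interchangeable optical housings. Each entry
# lists the excitation bands (nm) its LEDs emit and its extra features.
OPTICAL_HOUSINGS = {
    "wound-fl": {
        "excitation_nm": [(400, 450)],
        "features": ["rangefinder", "ambient_light_sensor"],
    },
    "nir-3d": {
        "excitation_nm": [(750, 800), (800, 850)],
        "features": ["structured_light", "thermal_sensor"],
    },
}

def housings_supporting(wavelength_nm):
    """Return the names of housings whose excitation bands cover the
    requested wavelength, so a user could pick a housing for a task."""
    return [name for name, spec in OPTICAL_HOUSINGS.items()
            if any(lo <= wavelength_nm <= hi
                   for lo, hi in spec["excitation_nm"])]
```

For example, a request for 405 nm excitation would match only the hypothetical "wound-fl" housing in this catalogue.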

The base portion/base housing includes an interface configured to releasably receive the optical housing. The optical housing includes a portion configured to be received into the base in a manner that provides electrical and power connections between components in the optical housing and the battery and processor in the base. This connection will enable data transfer between the optical housing and the base, which contains a processor configured to receive data from the image sensor. Further, the base may be connected to a PC to store or analyze data from the modular imaging apparatus.

In various exemplary embodiments, the base portion comprises a heat sink. In one exemplary embodiment, the heat sink forms a lip around the opening in the base, the lip configured to receive the optical housing.

In various example embodiments, a modular imaging apparatus includes elements in various configurations:

FL camera sensor-a camera sensor configured to detect fluorescent wavelengths is used for fluorescent imaging mode (FL). Light incident on the sensor passes through a dual band filter to allow visualization and capture of red and green fluorescent signals that may be present, such as signals generated in response to illumination of a target with excitation light. In some embodiments, the filter may be configured to identify additional fluorescence signals or fewer fluorescence signals.
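The red and green signals passed by the dual band filter could be separated in software roughly as follows. This is a minimal sketch assuming an 8-bit RGB FL frame and a hypothetical background threshold; the disclosure does not specify the segmentation method:

```python
import numpy as np

def extract_fluorescence_channels(fl_frame, threshold=30):
    """Separate red and green fluorescence signals from an RGB FL frame.

    Sketch assuming the dual band filter passes a red emission band and a
    green emission band; pixels at or below `threshold` (a hypothetical
    value) are treated as background.
    """
    red = fl_frame[..., 0].astype(np.int32)
    green = fl_frame[..., 1].astype(np.int32)
    red_mask = red > threshold
    green_mask = green > threshold
    return red_mask, green_mask
```

The two boolean masks could then be recolored and overlaid on a white-light frame for display.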

WL camera 1-when the modular imaging device is in White Light (WL) imaging mode, a first White Light (WL) camera sensor is used. Light incident on the sensor passes through a short pass filter to allow the sensor to image visible wavelengths. The short pass filter blocks Infrared (IR) light that may be present in the clinical environment. The short pass filter also blocks the IR emitted by the rangefinder (if present).

WL camera 2-second WL image sensor/camera sensor may be used as part of a stereoscopic or 3D imaging (target depth) configuration of the modular imaging device.

Light incident on the sensor passes through a short pass filter to allow the sensor to image visible wavelengths. The short pass filter blocks Infrared (IR) light that may be present in the clinical environment. The short pass filter also blocks the IR emitted by the rangefinder (if present). When present, the second WL camera sensor must be aligned with the first WL camera sensor.
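With two aligned WL camera sensors, target depth can be recovered from the pixel disparity between the two views using the standard pinhole stereo relation depth = f * B / d. The disclosure does not specify the depth computation; the sketch below uses hypothetical calibration values (baseline B in meters, focal length f in pixel units):

```python
def depth_from_disparity(disparity_px, baseline_m, focal_px):
    """Pinhole stereo model: depth = focal_px * baseline_m / disparity_px.

    Sketch of how two aligned WL sensors could yield target depth for the
    3D imaging configuration; inputs are hypothetical calibration values.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

For instance, with a 2 cm baseline and a 1000-pixel focal length, a 50-pixel disparity corresponds to a 0.4 m camera-to-target distance, which is why alignment of the two sensors matters: misalignment corrupts the measured disparity.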

Display-a high resolution, wide color gamut display with touch screen functionality may be provided. The touch screen functionality allows the user to manipulate the image and also allows the display to act as the primary User Interface (UI) for the clinician/device operator, allowing patient information to be entered which may be checked or registered with the captured image in some manner, whether on the camera or when the information is uploaded to the cloud or other storage.

Battery-rechargeable batteries, such as rechargeable lithium-ion batteries with an integrated gas gauge function, may be used to power the modular imaging device. As will be appreciated, other types of batteries or other power sources may be used.

Speaker-the speaker on the modular imaging device may be used to communicate with the user and may also generate camera clicks and/or other sounds that improve the user experience.

Battery status LED-indicates a low battery, and indicates the state of charge of the battery during a charging operation.

System status LED-indicates system status by using on/off states or different colors, providing a system OK/running indication or indicating that there is an internal system problem.

Wi-Fi antenna-enables Wi-Fi communication, which is used for cloud storage of images, field updating of system software, and pay-per-use management.

FL LED-the light sources of the modular device may comprise LEDs. In one example, excitation light, such as fluorescence excitation light, may be generated by fluorescent (FL) LEDs. The fluorescence excitation light may be used to elicit fluorescence from bacteria, i.e., in response to irradiation with the excitation light. The LED current is controlled with closed-loop control, where the set point of the control loop is managed by the MCU. The nominal FL LED drive current set point is established during device manufacturing to meet minimum target optical irradiance and uniformity requirements. The optical efficiency of an LED depends on temperature. A temperature sensor measures the printed circuit board (PCB) temperature near the LEDs; this measurement is used as an input to a control loop that adjusts the nominal drive current set point to compensate for temperature-dependent variations in LED irradiance efficiency. As will be appreciated, other types of fluorescence excitation light sources may be used instead of or in addition to FL LEDs.
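The temperature compensation of the drive current set point might look roughly like the following. The linear efficiency model and its coefficient are hypothetical placeholders, not values from the disclosure; a real control loop would use the calibrated temperature characteristic of the specific LED:

```python
def compensated_setpoint(nominal_ma, board_temp_c, ref_temp_c=25.0,
                         efficiency_coeff_per_c=-0.004):
    """Adjust the nominal FL LED drive-current set point for temperature.

    Hypothetical model: LED optical efficiency falls linearly by
    |efficiency_coeff_per_c| per degree C above ref_temp_c, so the drive
    current is raised to hold irradiance constant.
    """
    efficiency = 1.0 + efficiency_coeff_per_c * (board_temp_c - ref_temp_c)
    if efficiency <= 0:
        raise ValueError("linear efficiency model invalid at this temperature")
    return nominal_ma / efficiency
```

At the reference temperature the function returns the factory-calibrated nominal current; as the PCB warms, the returned set point rises to offset the lost irradiance.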

Ambient light sensor-an ambient light sensor is provided to monitor ambient light in the imaging environment near the imaging target. Fluorescence (FL) imaging requires a sufficiently dark environment to obtain a useful image. The ambient light sensor is used to provide feedback to the clinician regarding the ambient light level. The ambient light level measured before the system enters the FL imaging mode may be stored in the picture metadata; these light levels may be useful in post-analysis. During the white light imaging mode, the measured ambient light level may also be used to enable the WL flashlight or control its intensity. The ambient light sensor may be configured to indicate to a user when the imaging environment is dark enough to take a fluorescence image. This may take the form of an indication that the imaging environment is satisfactory and/or unsatisfactory, depending on the imaging mode.
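The gating and metadata behavior described above can be sketched as follows; the lux threshold is a hypothetical value, not one specified in the disclosure:

```python
def check_fl_readiness(ambient_lux, max_lux=1.0):
    """Decide whether the room is dark enough for fluorescence imaging.

    Returns (ok, metadata): `ok` drives the satisfactory/unsatisfactory
    indication to the user, and `metadata` holds the measured level for
    storage alongside the captured image. max_lux is a hypothetical
    darkness threshold.
    """
    ok = ambient_lux <= max_lux
    metadata = {"ambient_lux": ambient_lux, "fl_ready": ok}
    return ok, metadata
```

The same measured value could be reused in white-light mode to decide whether to enable the WL flashlight.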

Rangefinder-a rangefinder may be used to measure the distance between a camera sensor and the imaged target. Minimum blue irradiance and uniformity requirements are met only over a limited range of camera-to-target distances, so the rangefinder provides feedback to the clinician/user to guide them to image at the correct distance by indicating when the appropriate distance has been reached. The target distance may be stored in the picture metadata. The target distance may also be useful for a sticker detection algorithm, which may be used in the measurement process to determine the minimum and maximum expected sticker sizes, in sensor pixels, as a function of the distance between the sticker and the camera sensor. In some embodiments, a change in the measured target distance may be used to initiate a camera sensor refocusing action.
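The expected sticker size in sensor pixels follows from the pinhole camera model: apparent size scales inversely with distance. The sketch below uses a hypothetical calibrated focal length expressed in pixel units; the disclosure does not give the formula used by the sticker detection algorithm:

```python
def expected_sticker_px(sticker_mm, distance_mm, focal_px):
    """Expected on-sensor diameter, in pixels, of a calibration sticker
    of physical diameter sticker_mm at the given camera-to-sticker
    distance (pinhole model: size_px = focal_px * size / distance)."""
    return focal_px * sticker_mm / distance_mm

def sticker_px_bounds(sticker_mm, min_dist_mm, max_dist_mm, focal_px):
    """Min/max expected sticker sizes over the working distance range,
    e.g. for bounding a sticker detection search."""
    return (expected_sticker_px(sticker_mm, max_dist_mm, focal_px),
            expected_sticker_px(sticker_mm, min_dist_mm, focal_px))
```

The rangefinder reading narrows these bounds further: once the actual distance is known, the detector need only search near a single expected pixel size.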

Flashlight LED-during a white light imaging mode, one or more white light sources may be provided to illuminate the target. The white light source may comprise one or more white LEDs. Other white light sources may be used in addition to or in place of LEDs.

USB-C port-a USB-C port may be provided for battery charging, factory loading of software, factory testing and calibration of the device, and picture downloading.

Additional or alternative ports may be provided for information transfer and/or billing.

An exemplary embodiment of a modular handheld imaging device 100 is shown in figs. 1-5B. As shown in figs. 1-5B, in some example embodiments, the base portion 110 of the device 100 may have a generally square or rectangular shape. The front, or user-facing, side 115 of the base portion 110 includes a display screen 120 for displaying images and video captured by the device. Although depicted as square or rectangular, the device may take any shape that will reasonably support a display screen, such as a touch screen display. In addition to displaying the images captured by the imaging device 100, the display screen also serves as a user interface, allowing a user to control the functions of the device via touch screen input.

Positioned on the opposite side of the device, i.e., the patient-facing side 125 of the device, may be a handheld region 130 configured to facilitate a user's holding of the device during imaging. As shown in fig. 4, the handheld region may include a protrusion, or an area extending away from the base portion 110, sufficient to allow a user's fingers to grasp or wrap around it. Various other grip types and grip placements may be used. One factor to consider in locating such a handheld region is the user's ability to balance the imaging device while using it and entering commands via the touch screen display. The weight distribution of the imaging device is also a consideration in providing a user-friendly and ergonomic device. The patient-facing side 125 of the device may also include contacts 135 for wireless charging of the device.

As shown in fig. 11A-11E, a charging station 136 may be provided for wireless charging of the device 100. As shown in the example embodiment, the charging station 136 may include contacts, such as contact pins 137, for wirelessly charging the device 100. The contact pins 137 may be spring-loaded and may be separated from each other in a manner that prevents shorting due to inadvertent placement of other objects (e.g., small metal objects) on the contact pins 137. In one example, raised portions, such as protrusions, of the surface of the charging station 136 may separate the contact pins 137. Charging station 136 may also include an indicator light 138, which lights when device 100 is properly placed on charging station 136 for charging. Additionally or alternatively, the indicator light 138 may indicate when the device 100 is fully charged.

According to one aspect of the present disclosure, the patient-facing side 125 of the device 100 further includes an optics housing 140. As shown in fig. 5A to 5B, the optical housing 140 may be separated from the base portion 110. The optics housing portion 140 is shown as a rectangular housing configured to be received in a rectangular opening 145 in the base portion 110. However, both the optics housing portion 140 and the opening 145 may take other shapes, such as square, rectangular, oval, or circular. Further, the optical housing portion 140 need not have the same shape as the opening 145; a connector element shaped to be received in the opening 145 of the base portion 110 may instead serve as a bridge connecting the optical housing portion 140 to the base portion 110. The opening 145 is configured to releasably receive the optics housing portion 140. When the optics housing portion 140 is positioned in the opening 145, it may be locked in place such that the optics housing portion 140 is locked to the base portion 110. In this configuration, electrical contact is made between the base portion 110 and the optical components contained in the optical housing portion 140, and the components in the optical housing portion are powered by a power source (e.g., a battery) contained in the base portion 110.

In various exemplary embodiments, the base portion 110 includes a heat sink 150. In an exemplary embodiment, the heat sink 150 forms a lip around the opening 145 in the base portion 110 that is configured to receive the optical housing portion 140.

As shown in fig. 5A and 5B, the optical housing 140 may take on different shapes or configurations. For example, as shown in fig. 5A, the optical housing portion 140 has a substantially flat rectangular shape. The optical components are arranged in a substantially linear manner across the width of the optical housing. Fig. 5B shows a second optical housing 185 that includes an endoscope portion 190. Unlike the optics housing portion 140, the optical components contained in the second optics housing 185 are contained in the distal tip 195 of the endoscope portion 190 of the second optics housing 185 and are not arranged in a linear fashion. The arrangement of the optical components within each optical housing will vary based on the size and shape of the optical housing and the number and type of optical components contained within a given housing.

The optics housing 140 may include various optical components configured to facilitate collection of optical signals from a target being imaged. Properties that may vary from one optical housing to another include the following non-limiting examples, which may be included in any combination in each optical housing: the total number of image sensors; the number of image sensors configured for white light imaging (i.e., combined with a filter for white light imaging); and the number of image sensors configured for fluorescence imaging, wherein different image sensors for fluorescence imaging may be paired with different filters to allow different ranges of fluorescence emissions to pass, each range being configured to capture a specific feature of a target (e.g., blood vessels or microvessels, collagen, elastin, blood, bone, bacteria, malignancy, healthy or diseased cartilage, ligaments, tendons, connective tissue, lymphatic vessels, nerves, muscles, etc.).

The optical housing portion 140 may include one or more excitation light sources. The excitation light source can provide excitation light of a single wavelength selected to excite tissue autofluorescence emission and porphyrin-induced fluorescence emission in tumor/cancer cells. Additionally or alternatively, the excitation light source may provide a wavelength of excitation light selected to excite bacterial autofluorescence emission and/or exogenous fluorescence emission of one or more of tissue and bacteria in the wound. In one example, the excitation light may have a wavelength in the range of about 350nm to about 600nm, or about 350nm to about 450nm and about 550nm to about 600nm, for example 405nm or 572nm.

Alternatively, the excitation light source may be configured to provide excitation light of two or more wavelengths. As will be appreciated by those skilled in the art, the wavelength of the excitation light may be selected for different purposes. For example, by changing the wavelength of the excitation light, the depth to which the excitation light penetrates a target surface, such as an operating table or a wound, can be changed. As the depth of penetration increases with a corresponding increase in wavelength, light of different wavelengths may be used to excite tissue below the surface of the target surface. In one example, excitation light having a wavelength in the range of 350nm-450nm (e.g., 405nm) and excitation light having a wavelength in the range of 550nm-600nm (e.g., 572nm) may penetrate the target tissue to different depths, such as about 500μm to about 1mm and about 2.5mm, respectively. This will allow a user of the device, e.g. a physician, surgeon or pathologist, to visualize the target surface and the tissue cells below the surface of the target. Additionally or alternatively, excitation light having a wavelength in the near infrared/infrared range may be used, for example, excitation light having a wavelength between about 750nm and about 800nm, for example 760nm or 780nm. Furthermore, this type of light source may be used in conjunction with a second type of imaging/contrast agent, such as an infrared dye (e.g., IRDYE800, ICG), in order to penetrate tissue to a deeper level. This would enable, for example, visualization of vascularization, vascular perfusion, and blood pooling in the target tissue. In addition, visualization of vascular perfusion can be used to assess an anastomosis during reconstruction or to monitor the healing of a wound.

The imaging device 100 may comprise an additional light source, for example a white light source for White Light (WL) imaging the target surface. The use of white light provides an anatomical background for other images, such as fluorescence images. The white light source may comprise one or more white LEDs. Other white light sources may be used as appropriate. As will be appreciated by those of ordinary skill in the art, white light sources should be stable and reliable and not overheat during prolonged use.

The base portion 110 of the imaging device 100 may include controls that allow toggling/switching between white light imaging and fluorescence imaging. The controls may also enable the various excitation light sources to be used together or separately, in various combinations and/or sequentially. The controls may cycle through various different combinations of light sources, may control the light sources sequentially, may strobe the light sources, or may otherwise control the time and duration of use of the light sources. As will be appreciated by one of ordinary skill in the art, the control may be automatic, manual, or a combination thereof. As described above, the touch screen display 120 of the base portion 110 may be used as a user interface to allow control of the imaging device 100. Alternatively, it is contemplated that a separate control, such as a manual control (e.g., a button), may be used instead of or in addition to the touch screen control. For example, such manual controls may be located on the handle 130 to allow a user to easily actuate the controls while holding and using the imaging device.
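The mode cycling and strobing behavior described above might be sketched as follows. This is a software-only illustration; the mode names are hypothetical stand-ins for the device's actual light-source control hardware:

```python
import itertools
import time

class LightController:
    """Minimal sketch of cycling and strobing imaging light sources.

    Instead of driving real LEDs, this records the commanded states in
    `history` so the sequencing logic can be inspected.
    """
    MODES = ("white", "fluorescence_405", "fluorescence_572", "off")

    def __init__(self):
        self._cycle = itertools.cycle(self.MODES)
        self.active = "off"
        self.history = []

    def next_mode(self):
        """Advance to the next mode in the cycle (e.g., on a button press)."""
        self.active = next(self._cycle)
        self.history.append(self.active)
        return self.active

    def strobe(self, mode, pulses, on_s=0.0, off_s=0.0):
        """Pulse `mode` on and off `pulses` times with the given dwell times."""
        for _ in range(pulses):
            self.history.append(mode)
            time.sleep(on_s)
            self.history.append("off")
            time.sleep(off_s)
```

In a real device, appending to `history` would be replaced by driving the LED enable lines, and the dwell times would be synchronized with camera exposure.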

The optics housing portion 140 of the imaging device 100 may also contain one or more optical imaging filters configured to prevent reflected excitation light from reaching the camera sensor. In one example, the optical imaging filter may also be configured to allow passage of emissions having a wavelength corresponding to autofluorescence emissions of the tissue cells and fluorescence emissions of porphyrins induced in the tissue cells. In another example, the apparatus 100 may include one or more optical imaging filters configured to allow passage of emissions corresponding to autofluorescence emissions of bacteria contained in the target and exogenous fluorescence emissions of the bacteria resulting from use of a contrast agent on the surface of the target. The imaging device 100 may also include a filter configured to capture fluorescence and autofluorescence of both bacteria and tissue.

These optical filters may be selected to detect specific optical signals from the target/tissue/wound surface based on the desired wavelength of light. Spectral filtering (e.g., absorption, fluorescence, reflectance) of the detected optical signal may also be achieved using, for example, a Liquid Crystal Tunable Filter (LCTF) or an acousto-optic tunable filter (AOTF), which are solid-state electronically tunable spectral bandpass filters. Spectral filtering may also involve the use of continuously variable filters, and/or manual bandpass optical filters. These filter/filtering mechanisms may be placed in front of the imaging sensor to produce multispectral, hyperspectral and/or wavelength-selective imaging of tissue.
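A wavelength-stepped acquisition with a tunable bandpass filter (e.g., an LCTF or AOTF) could look like the following sketch, where `set_passband_nm` and `capture_frame` are hypothetical driver callbacks rather than an API from this disclosure:

```python
def scan_spectral_cube(capture_frame, set_passband_nm, bands):
    """Step a tunable bandpass filter through `bands` (center wavelengths
    in nm) and grab one frame per band, returning a wavelength-indexed
    mapping (a simple multispectral 'cube').

    `set_passband_nm(nm)` retunes the solid-state filter; `capture_frame()`
    acquires one filtered frame from the image sensor.
    """
    cube = {}
    for nm in bands:
        set_passband_nm(nm)          # retune the filter to this passband
        cube[nm] = capture_frame()   # acquire the frame seen through it
    return cube
```

Stacking the per-band frames in this way is the basic pattern behind multispectral and hyperspectral imaging of tissue with electronically tunable filters.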

The imaging device 100 may be modified by attaching fixed- or variable-orientation polarization filters (e.g., linear or circular polarizers, in combination with optical waveplates) to the excitation/illumination light sources and the imaging sensor. In this manner, the imaging device 100 may be used to image a target surface with white light reflectance and/or fluorescence imaging by polarized light illumination and unpolarized light detection, or vice versa, or by polarized light illumination and polarized light detection. This may allow for imaging of a wound with minimized specular reflection (e.g., glare from white light imaging), as well as the ability to image fluorescence polarization and/or anisotropy-related changes in connective tissue (e.g., collagen and elastin) in normal tissue within and around the wound. This may yield useful information about the spatial orientation of connective tissue fibers and tissue associated with wound remodeling during the healing process [Yasui et al. (2004) Appl. Opt. 43:2861-2867].

In one example embodiment shown in fig. 12, the imaging device 200 includes three camera sensors 260, 265, 270, and each sensor includes a fixed filter 261, 266, 271. For example, first and second white light sensors may be provided, each white light sensor being configured to receive a visible optical signal via a dedicated filter fixed to the respective sensor. In addition, a sensor for fluorescence imaging may be paired with a filter configured to allow the desired emission wavelengths to pass through to the fluorescence camera sensor. As previously described, different optical housing portions may contain different configurations of sensors, filters, and light sources that together are configured to create an image of a particular feature of a target.

Fig. 12 shows an exploded view of the optics housing 240 of the imaging device 200. As shown in fig. 12, the base portion 210 may include a heat sink 212 located behind the heat sink 250 of the optical housing 240. Optical housing 240 may further include three camera sensors 260, 265, 270, a Printed Circuit Board (PCB) 273, an external heat sink pad 252, a camera housing 244, three filters 261, 266, 271, a light diffuser 253 for a white light source, an internal pad/filter holder 274, windows 275a, 275b, 275c, tape 276 (or other means for securing the windows), and a lens assembly tip 280, which may include features that allow attachment of accessories.

As will be understood by those skilled in the art, the arrangement of components in the optical housing of the imaging device may take a variety of configurations. Such a configuration may be affected by the size of the device, the footprint of the device, and the number of components used. However, functional factors should also be considered when arranging the components. For example, problems such as light leakage from the light source of the device and/or ambient light entering the optical housing may interfere with proper or optimal operation of the device and may result in less desirable output, such as image artifacts, for example. The arrangement shown in fig. 12 is one in which the camera sensor is isolated to prevent light leakage from the light source and ambient light.

An example PCB 273 is shown in fig. 13. As shown, the PCB may include an excitation light source 302, such as two fluorescent LEDs, e.g., violet/blue LEDs having a wavelength between about 400nm and about 450nm, and in one example, a wavelength of about 405nm. Additional LEDs having the same wavelength may be provided, or only one LED may be used. Additionally, it is contemplated that additional excitation light sources having different wavelengths may be provided. The PCB 273 may also include two temperature sensors 304, a white light or flashlight LED 306 that provides white light for white light imaging, an ambient light sensor 308, and a rangefinder 312, for example, the rangefinder 312 may be a laser-based rangefinder.

While the apparatus 100 or 200 is held over a target tissue surface (e.g., a wound) to be imaged, the illumination source may illuminate narrow or broadband violet/blue wavelengths or other wavelengths or wavelength bands onto the tissue/wound surface, thereby generating a flat and uniform light field within the region of interest. The light also illuminates or excites the tissue down to a shallow depth. The excitation/illumination light interacts with normal and diseased tissue and may cause an optical signal (e.g., absorption, fluorescence, and/or reflection) to be generated within the target tissue, which is then captured by one of the camera sensors.

By varying the excitation and emission wavelengths accordingly, the imaging apparatus 100, 200 can interrogate the inner surface of the target tissue (e.g., wound) and target tissue components (e.g., connective tissue and bacteria in the wound) at a particular depth. Excitation of deeper tissue/bacterial fluorescence sources, for example, in a wound may be achieved by changing the violet/blue (about 400-500nm) wavelength light to green (about 500-540nm) wavelength light. Similarly, by detecting longer wavelengths, fluorescence emissions from tissue and/or bacterial sources deep in the tissue can be detected at the tissue surface. For wound assessment, the ability to interrogate surface and/or subsurface fluorescence may be useful, for example, to detect and potentially identify bacterial contamination, colonization, critical colonization, and/or infection that may occur at the surface and depth within the wound (e.g., in chronic non-healing wounds).

The handheld imaging device 100, 200 also includes an imaging lens and an image sensor in the optical housing portion 140, 240 of the device. The imaging lens or lens assembly may be configured to focus the filtered autofluorescence and fluorescence emissions on the image sensor. Wide angle imaging lenses or fish-eye imaging lenses are examples of suitable lenses. A wide angle lens may provide a 180 degree field of view. The lens may also provide optical magnification. The imaging device may require very high resolution so that distinctions can be made between very small cell populations. The image sensor is configured to detect filtered autofluorescence emissions of the tissue cells and fluorescence emissions of porphyrins induced in the tissue cells. The image sensor may have 4K video capability as well as auto focus and optical or digital zoom capability. CCD or CMOS imaging sensors may be used. In one example, a CMOS sensor in combination with a filter, i.e., a hyperspectral image sensor, such as those sold by XIMEA Corporation, may be used.

Exemplary filters include visible filters (https://www.ximea.com/en/products/hyperspectra-cameras-based-on-usb3-xspec/mq022hq-im-sm4x4-vis) and infrared filters (https://www.ximea.com/en/products/hyperspectra-cameras-based-on-usb3-xspec/mq022hg-im-sm5x5-nir). The handheld device 100, 200 may also include a processor configured to receive the detected emissions and output data regarding the detected filtered autofluorescence and/or exogenous fluorescence emissions. The processor may have the ability to run concurrent tasks seamlessly (including, but not limited to, wireless signal monitoring, battery monitoring and control, temperature monitoring, image acceptance/compression, and button press monitoring). The processor interfaces with internal memory, physical controls (e.g., buttons), optics, and wireless modules. The processor also has the capability to read analog signals.
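The processor's concurrent monitoring duties (wireless signal, battery, temperature, button presses) can be illustrated with a thread-per-monitor sketch. The sampler names and polling interval below are illustrative only and do not correspond to the device's actual firmware:

```python
import queue
import threading

def run_monitors(samplers, out_q, stop, interval_s=0.01):
    """Poll each named sampler on its own thread, pushing (name, reading)
    tuples onto a shared queue until `stop` is set.

    `samplers` maps a subsystem name (e.g., "battery", "temperature") to a
    zero-argument callable standing in for the real hardware read.
    """
    def worker(name, read):
        while not stop.is_set():
            out_q.put((name, read()))
            stop.wait(interval_s)   # sleep, but wake early if stopped
    threads = [threading.Thread(target=worker, args=(n, f), daemon=True)
               for n, f in samplers.items()]
    for t in threads:
        t.start()
    return threads
```

A consumer thread (or the UI loop) would drain `out_q` and react, e.g., warning on low battery or throttling on over-temperature.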

The imaging devices 100, 200 may also include a wireless module and be configured for fully wireless operation. The device can utilize high-throughput wireless signals and transmit high-definition video with minimal delay. The device can enable Wi-Fi and Bluetooth simultaneously: Wi-Fi for data transmission and Bluetooth for quick connection. The device may operate using the 5GHz wireless transmission band to isolate it from other devices. In addition, the device can operate as a soft access point, which eliminates the need for a connection to the internet and keeps the device and its companion module connected in isolation from other devices, supporting patient data security. The apparatus may be configured for wireless charging and include an inductive charging coil. Additionally or alternatively, the apparatus may include a port configured to receive a charging connection.

Fig. 14A and 14B show alternative embodiments of hardware block diagrams of the apparatus 100, 200. Fig. 14A and 14B illustrate example block diagrams showing various components of handheld imaging devices 100, 200 according to example embodiments of the present disclosure.

The components of the handheld imaging device 100, 200 may be grouped into an optical PCB and an electronic system. In the embodiment of fig. 14B, the optical PCB includes four fluorescence-wavelength LEDs, two infrared LEDs, and two white LEDs. The optical PCB further includes an ambient light sensor, a laser rangefinder, and a temperature sensor.

The optical PCB is operatively coupled with the electronic system 302. The electronic system may include, for example, but not limited to, electronic control components such as an application processor module, a real-time microcontroller unit (MCU), and a power management subsystem. The electronic system may further include components and systems that interface with other electronic components of the handheld imaging device. For example, the electronic system may include a CMOS camera interface and motor drive electronics for the optical filter system. The electronic system may also include connectors for fluorescence and white light cameras, respectively, to facilitate switching between the fluorescence and white light imaging modes discussed herein. Although only two cameras, a white light camera and a fluorescence camera, are shown in fig. 14B, the present disclosure contemplates the use of additional cameras, particularly white light cameras. For example, the example block diagram of fig. 14A discloses the presence of three cameras: two white light cameras and one fluorescence camera. The addition of further cameras is within the scope of the present disclosure.

Other electronic systems and components may also be supported, including memory (e.g., flash memory devices), rechargeable batteries (e.g., lithium-ion batteries), and inductive battery charging systems. Some components of the electronic system may include communication components (e.g., Wi-Fi and/or Bluetooth radio systems), as well as spatial-orientation components, such as one or more of a magnetometer, an accelerometer, and a gyroscope.

The electronic system may include various user controls, such as a power switch, a system status LED, a charge status LED, a picture capture switch, a video capture switch, and an imaging mode switch. The various user controls may interface with other components of the electronic system through a user interface module that provides signals to and from the user controls.

Other components in the electronic system may include drivers for fluorescent, infrared, and white light LEDs, a USB hub for uplink or downlink data signals, and/or a power supply from an external computer system to which the electronic system may be connected through the USB hub, such as a workstation or other computer. The electronic system may also include one or more devices that provide feedback to the user, such as, but not limited to, a speaker. Other feedback devices may include various audible and visual indicators, tactile feedback devices, displays, and other devices.

The modular handheld imaging apparatus 100, 200 of the present application may be used with a variety of accessories. For example, the apparatus 100, 200 may be used with a drape configured to darken an area around a target being imaged by blocking or reducing ambient light around the target. The drape may include an adapter configured to be mounted on a patient-facing side of the optical housing and configured to isolate and/or separate the optics of the optical housing from ambient light by forming a barrier between the optics of the optical housing and the ambient light. Examples of the types of drapes that may be used with the device may be found, for example, in U.S. provisional patent application No.62/669,009, filed on May 9, 2018, entitled "Darkening Drape, Packaging for Drape, Method of Use and Method for Dispensing Same"; international patent application PCT/CA2019/000061, filed on May 9, 2019, entitled "Imaging Drapes, Packaging for Drapes, Methods of Use of Imaging Drapes, and Methods for Providing Drapes"; U.S. design patent application No.29/647,110, filed on May 9, 2018, entitled "Darkening Drape"; and design application No.29/676,893, filed on January 15, 2019, entitled "Adapter for Providing a Darkening Drape", each of which is incorporated herein by reference in its entirety.

According to one exemplary embodiment, a darkening drape 500 is disclosed. Figures 15A-15C illustrate an example embodiment of an imaging drape 500. In fig. 15D-15F, the imaging drape 500 is shown connected to an imaging device, such as the imaging devices 100, 200 previously discussed. The drape 500 may be used in conjunction with any of the imaging devices disclosed herein. The drape 500 includes a connecting element 501. In an example embodiment, the connecting element 501 includes a ridge 510, the ridge 510 for forming a press-fit or snap-fit connection with the imaging device 100, 200. Protrusions (described below) on the imaging devices 100, 200 may engage the ridges 510 to provide a press-fit or snap-fit connection. Additionally, one or more protrusions 520 may be configured to engage with protrusions on the imaging device 100, 200 to grip the imaging device 100, 200 and better secure it to the drape. In some embodiments, the protrusions 520 are tooth-like members that engage with protrusions on the imaging device. When the imaging device is properly aligned and pressed down into the connecting element, the protrusion 520 clips onto a protrusion on the imaging device to provide a snap-fit connection.

The connecting element 501 may also include an end cup member 550 to help facilitate a snap-fit connection between the connecting element 501 and the imaging device. As shown in fig. 15A, the end cup member 550 may be a smooth member disposed on either end of the opening 533 formed in the connecting element 501. The end cup member 550 may provide a guide to center/position the imaging head/optics of an imaging device snap-fit into the connecting element 501.

Fig. 15A shows an opening 533 having a rectangular shape in which a protrusion 520 is provided on a long side of the rectangle, and an end cup member 550 is provided on a short side of the rectangle. However, it is also contemplated that the protrusions 520 may be disposed on the short sides of the rectangle and the end cup members 550 may be disposed on the long sides of the rectangle. Although fig. 15A shows two end cup members 550, only one end cup member 550 may be used on one side of the opening 533. Further, in some embodiments, the connecting element 501 may not include the end cup member 550. In this embodiment, the protrusion 520 may be disposed around most or the entire perimeter of the opening 533 on the connecting element 501. As described below, the connecting element 501 may be formed of injection molded plastic.

Figure 15A shows a top perspective view of the connecting element 501 secured to the drape. Fig. 15B shows a top view of the connecting element 501 and an exterior view of the drape, and fig. 15C shows a bottom view of the connecting element 501 and a view of the portable imaging environment formed by the interior of the drape.

The connecting element 501 may also include, on a top planar surface 503, a one-way valve, such as a flap valve 555, having a tab 554. As shown, the tab 554 is disposed on a top surface of the connecting element 501. Thus, the tab 554 is visible in the top view of fig. 15B, but not from the interior of the drape in the view of fig. 15C. The tab 554 helps keep the drape material out of the imaging field of view.

In the embodiment of fig. 15A-15F, the connecting element 501 may be formed from an injection molded plastic, such as polyethylene. Thus, the connecting element 501 may be a relatively rigid member. In some embodiments, connecting element 501 has a thickness of about 1.8 mm. The tab 554 may be formed of the same material as the rest of the connecting element 501, but may be less rigid than the rest of the connecting element 501. Thus, tab 554 may be thinner than the remainder of the connecting element 501. The drape body may be formed from the same material as the connecting element 501, but may not be injection molded, so that the drape body is not as stiff as the connecting element 501 (including the tab 554). In some embodiments, the drape body material is also thinner than that of the connecting element 501 (including tab 554). The drape body may be formed of a soft material welded to the relatively stiff material of the connecting element 501. This can reduce manufacturing costs by allowing the flap valve 555 to be integrated into the drape by being formed from the material of the drape body.

The connecting element 501 further comprises an opening 533 to provide FL and/or white light imaging within the interior environment of the drape from the imaging device 100, 200. In the embodiment of fig. 15A to 15C, the opening 533 is substantially rectangular in shape. However, it is further contemplated that other shapes may be used.

Fig. 15D to 15F show an example of the imaging apparatus 600 fixed to the connecting element 501 of the drape shown in fig. 15A to 15C. The imaging device 600 may be configured as described with respect to the devices 100, 200 discussed above. The imaging device 600 is securely fastened to the connection element 501 by a snap-fit connection that prevents/reduces any ambient light from entering the interior of the drape through the top of the drape. As described above, the protrusion 670 on the imaging device 600 may engage with the protrusion 520 and the ridge 510 on the connecting element 501 to provide a snap-fit connection.

Example embodiments of the modular handheld imaging device 600 are shown in fig. 15G and 15H. The imaging device 600 includes a base portion 610 having a generally square or rectangular shape. The front or user facing side 615 of the base body portion 610 includes a display screen 620 for displaying images and video captured by the device. The protrusion 670 protrudes outward from the optics head/housing 640, although it may alternatively be located on the base portion 610. Although fig. 15G shows the protrusion 670 disposed on top of the base body portion 610, it is also contemplated that the protrusion 670 may be disposed on the other side of the base body portion 610 depending on the location of the protrusion 520 on the connecting element 501.

Although depicted as square or rectangular, imaging device 600 may take any shape that will reasonably support a display screen such as a touch screen. In addition to displaying images captured by the imaging device 600, the display screen also serves as a user interface, allowing a user to control the functions of the device via touch screen input.

Positioned on the opposite side of the device, i.e., the patient-facing side 625 of the device, may be a handheld region 630 configured to facilitate a user's holding of the device during imaging. The patient-facing side of the device may also include contacts 635 for wireless charging of the device.

According to one aspect of the present disclosure, the patient-facing side of the device 600 further includes an optics housing 640. Optical housing 640 may be separate from base portion 610. The optical housing portion 640 is shown as a rectangular housing configured to be received in an opening of a connection element on a drape.

The optical housing 640 may take on different configurations. For example, as shown in fig. 15H, the optical housing portion 640 has a substantially flat rectangular shape. The optical components for FL and/or white light imaging are arranged in a substantially linear manner across the width of the optical housing. The optical assembly is described in more detail below.

According to another aspect of the present disclosure, the imaging apparatus 100, 200, 600 of the present disclosure may be used with a sterile drape. The sterile drape is configured to form a sterile barrier between the imaging device 100, 200, 600 and the environment in which the imaging device is used. Fig. 16A-16C illustrate an example embodiment of a sterile drape for use with the imaging apparatus of the present disclosure. As shown in fig. 16A, the sterile drape 700 may be configured to receive the body of the imaging device 800. When the sterile drape is positioned over the imaging device 800, the imaging device may be engaged with the darkening drape 500, as discussed above with respect to fig. 15A-15H and as shown in fig. 16B and 16C.

The optical housings may be configured such that a single adapter will fit all of the optical housings to attach the darkening drape. Alternatively, a separate adapter may be provided to engage each optical housing.

According to one aspect of the present disclosure, a modular handheld device may be used to obtain three-dimensional fluorescence images of a target. Systems and methods for obtaining such three-dimensional images are disclosed in U.S. Provisional Application No. 62/793,837, filed on January 17, 2019, entitled "Systems, Methods, and Devices for Three-Dimensional Imaging, Measurement, and Display of Wounds and Tissue Specimens," the entire contents of which are incorporated herein by reference.

Other uses of the device may include:

Clinical and research-based small- and large-animal (e.g., veterinary) imaging.

Detection and monitoring of contamination (e.g., bacterial contamination) during food/animal product preparation in the meat, poultry, dairy, fish, and agricultural industries.

Detection of "surface contamination" (e.g., bacterial or biological contamination) in public (e.g., healthcare) and private environments.

Multispectral imaging and detection of cancer in human and/or veterinary patients.

Use as a research tool for multispectral imaging and monitoring of cancer in experimental animal models of human disease (e.g., wounds and cancer).

Forensic detection, e.g., of latent fingerprints and biological fluids on non-biological surfaces.

Imaging and monitoring of plaque, caries, and cancer within the oral cavity.

Use as an imaging and monitoring device for clinical microbiology laboratories.

Testing of antibacterial agents (e.g., antibiotics) and disinfectants.

The apparatus may generally comprise: i) one or more excitation/illumination light sources and ii) one or more image sensors, which may be combined with one or more optical emission filters or spectral filtering mechanisms. The device may have a view/control screen (e.g., a touch screen), image capture, and zoom control. The apparatus may further have: iii) wired and/or wireless data transfer ports/modules, iv) power supplies and power/control switches.

The apparatus may include software that allows a user to control the apparatus, including control of imaging parameters, visualization of images, storage of image data and user information, transmission of images and/or related data, and/or related image analysis (e.g., diagnostic algorithms). The apparatus may also include software for measuring the imaging target and counting the number of various items found in the imaging target. For example, if the target is a wound, the apparatus may include software configured to calculate wound size, wound depth, wound perimeter, wound area, and wound volume, identify various types of tissue within the wound (collagen, elastin, blood vessels), and determine the percentage of each tissue within the wound. In addition, the device can determine the amount or number of bacteria in the wound and the bacterial load, differentiate between the various types of bacteria within the load, and identify their relative percentages. Examples of suitable software and methods are described, for example, in U.S. Provisional Patent Application No. 62/625,611, entitled "Wound Imaging and Analysis," filed on February 2, 2018, and International Patent Application No. PCT/CA2019/000002, entitled "Wound Imaging and Analysis," filed on January 15, 2019, each of which is incorporated herein by reference in its entirety.

The apparatus may be configured to co-register white light images, fluorescence images, thermal images, and other images of the target. The apparatus may be configured to create a three-dimensional map of the target. The apparatus may be configured to enhance color discrimination between different tissue types identified in the image. The apparatus may be configured to determine a tissue classification of the target based on different colors or image features captured in the fluorescence image. The device may be configured to demarcate diseased tissue from healthy tissue, providing a map that allows the user to selectively remove diseased tissue while preserving surrounding healthy tissue in a targeted manner.

Various types of filters, power supplies, light sources, excitation light sources, image sensors, and charging configurations may be present in the presently disclosed apparatus. Examples of such components are described in U.S. Patent No. 9,042,967, the U.S. national phase of International Application No. PCT/CA2009/000680, filed internationally on May 20, 2009, which claims the benefit of U.S. Provisional Application No. 61/054,780, filed on May 20, 2008, each of which is incorporated herein by reference in its entirety. Additional components are disclosed in U.S. Provisional Application No. 62/625,983, entitled "Devices, Systems, and Methods for Tumor Visualization and Removal," filed on February 3, 2018, and U.S. Provisional Application No. 62/625,967, entitled "Devices, Systems, and Methods for Tumor Visualization and Removal," filed on February 2, 2018, each of which is incorporated herein by reference in its entirety. Additional components are disclosed in U.S. Provisional Patent Application No. 62/793,764, entitled "Multi-Modal System for Visualization of Disease," filed on January 17, 2019, and U.S. Provisional Patent Application No. 62/857,155, entitled "Devices, Systems, and Methods for Tumor Visualization and Removal," filed on June 4, 2019, each of which is incorporated herein by reference in its entirety.

The imaging systems and methods disclosed herein may rely on tissue autofluorescence and bacterial autofluorescence, as well as autofluorescence of other target materials. Additionally or alternatively, the present application contemplates the use of exogenous contrast agents, which may be applied topically, ingested, or otherwise administered. Examples of such agents are disclosed in U.S. Patent No. 9,042,967, the U.S. national phase of International Application No. PCT/CA2009/000680, filed internationally on May 20, 2009, which claims the benefit of U.S. Provisional Application No. 61/054,780, filed on May 20, 2008, each of which is incorporated herein by reference in its entirety. Additional agents are disclosed in U.S. Provisional Application No. 62/625,983, entitled "Devices, Systems, and Methods for Tumor Visualization and Removal," filed on February 3, 2018, and U.S. Provisional Application No. 62/625,967, entitled "Devices, Systems, and Methods for Tumor Visualization and Removal," filed on February 2, 2018, each of which is incorporated herein by reference in its entirety. Additional agents are disclosed in U.S. Provisional Patent Application No. 62/793,764, entitled "Multi-Modal System for Visualization of Disease," filed on January 17, 2019, and U.S. Provisional Patent Application No. 62/857,155, entitled "Devices, Systems, and Methods for Tumor Visualization and Removal," filed on June 4, 2019, each of which is incorporated herein by reference in its entirety.

The device interface port may support wired (e.g., USB) or wireless (e.g., Bluetooth, WiFi, and similar modes) data transfer, or third-party add-on modules, to various external devices, such as: head-mounted displays, external printers, tablets, laptops, personal desktop computers, wireless devices allowing transmission of imaging data to remote sites/other devices, Global Positioning System (GPS) devices, devices allowing use of additional memory, and microphones.

The device can be used to guide wound debridement, identify bacterial types, and help determine the appropriate therapy/drug/antibiotic.

The device may also be attached to a mounting mechanism (e.g., a tripod or stand) to serve as a relatively stationary optical imaging device for white light, fluorescence and reflectance imaging of objects, materials and surfaces (e.g., a body). This may allow the device to be used on desks or tables, or for "assembly line" imaging of objects, materials and surfaces. In some embodiments, the mounting mechanism can be mobile.

Other features of the device may include digital image and video recording capabilities, audio, documentation methods (e.g., image storage and analysis software), and wired or wireless data transmission for remote medical/electronic healthcare needs.

In addition to providing detection of bacterial strains, the device may also be used to distinguish the presence and/or location of different bacterial strains (e.g., Staphylococcus aureus or Pseudomonas aeruginosa), for example, in wounds and surrounding tissue. This may be based on the different autofluorescence emission characteristics of different bacterial strains, including autofluorescence emission characteristics within the 490-550 nm and 610-640 nm emission bands when excited by violet/blue light (e.g., light around 405 nm). Other combinations of wavelengths may be used to distinguish other species in the image. This information can be used to select an appropriate treatment, such as the selection of antibiotics.
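The band-based discrimination described above can be sketched in a few lines. The following Python sketch assumes per-region mean intensities have already been extracted for the two emission bands; the threshold value and the mapping of red emission to porphyrin-producing species are illustrative assumptions, not device specifications:

```python
def classify_autofluorescence(green_490_550, red_610_640, red_threshold=0.2):
    """Classify a region by its autofluorescence emission under ~405 nm excitation.

    green_490_550: mean intensity in the 490-550 nm band
    red_610_640:   mean intensity in the 610-640 nm band
    The red_threshold value is an illustrative placeholder.
    """
    total = green_490_550 + red_610_640
    if total == 0:
        return "no signal"
    red_fraction = red_610_640 / total
    if red_fraction >= red_threshold:
        # Strong red emission is consistent with porphyrin-producing species.
        return "porphyrin-producing (red) species"
    return "green-emitting species"
```

In practice the band intensities would come from the filtered image sensor channels, and the threshold would be calibrated against known cultures rather than fixed a priori.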

The device can scan over any wound (e.g., on the body surface) so that the excitation light can illuminate the wound area. The device may then be used to inspect the wound so that the operator may view the wound in real time, for example, via a viewer on the imaging device or through an external display device (e.g., a heads-up display, a television display, a computer monitor, an LCD projector, or a head-mounted display). Images obtained from the device may also be transmitted in real time (e.g., via wireless communication) to a remote viewing site, e.g., for telemedicine purposes, or sent directly to a printer or computer memory. Imaging may be performed in routine clinical evaluation of a patient with a wound.

Prior to imaging, fiducial markers may be placed on the skin surface near the wound edge or perimeter (e.g., using a non-erasable fluorescent ink pen). For example, four dots may be placed on the normal skin surface near the wound edge or boundary, each dot in a different fluorescent ink color drawn with a separate non-erasable fluorescent ink pen; the pens may be provided as a kit to a clinical operator. These colors can be imaged by the device using excitation light and a multi-spectral band filter that matches the emission wavelengths of the four ink dots. Image analysis may then be performed by co-registering the fiducial markers to align the images. Thus, the user may not have to align the imaging device between different imaging sessions. This technique may facilitate longitudinal (i.e., over time) imaging of the wound, allowing a clinical operator to image the wound over time without the need to realign the imaging device during each image acquisition.
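One way to realize the co-registration step is to estimate an affine transform from the matched fiducial dots of two sessions by linear least squares. The sketch below uses NumPy; the helper names `estimate_affine` and `apply_affine` are illustrative, and a real implementation would first detect the dots by their ink colors:

```python
import numpy as np

def estimate_affine(src_points, dst_points):
    """Estimate the 2x3 affine transform mapping src fiducials onto dst.

    src_points, dst_points: (N, 2) arrays of matched fiducial coordinates,
    N >= 3 (e.g., the four colored ink dots detected in two sessions).
    Solved as a linear least-squares problem.
    """
    src = np.asarray(src_points, dtype=float)
    dst = np.asarray(dst_points, dtype=float)
    # Design matrix [x y 1]; solve for both output coordinates at once.
    A = np.hstack([src, np.ones((len(src), 1))])
    coeffs, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return coeffs.T  # rows: [a b tx], [c d ty]

def apply_affine(T, points):
    """Map (N, 2) points through the 2x3 affine transform T."""
    pts = np.asarray(points, dtype=float)
    return pts @ T[:, :2].T + T[:, 2]
```

With the transform in hand, the later session's image can be resampled into the earlier session's frame so that wound changes are compared pixel-for-pixel.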

Furthermore, to aid in intensity calibration of the fluorescence image, a disposable fluorescence standard "strip" may be placed in the field of view during wound imaging (e.g., by temporarily adhering the strip to the skin using a mild adhesive). The strip may be impregnated with one or several different concentrations of a fluorescent dye that produces a predetermined and calibrated fluorescence intensity when illuminated by an excitation light source, which may have a single (e.g., 405 nm) or multiple fluorescence emission wavelengths or wavelength bands, for image intensity calibration. The disposable strip may also carry four dots as described above (e.g., each of a different diameter or size and a different fluorescent ink color, each with a unique black dot placed next to it), drawn with separate non-erasable fluorescent ink pens. The strip is placed on the normal skin surface near the wound edge or boundary, and the device can be used to take white light and fluorescence images. The strip may provide a convenient way to take multiple images of a given wound over time and then align the images using image analysis. In addition, the fluorescence "intensity calibration" strip may also contain an additional linear measuring device, such as a straight ruler of fixed length, to help measure the spatial dimensions of the wound. Such a strip is one example of a calibration target that may be used with the apparatus to help calibrate or measure image parameters (e.g., wound size, fluorescence intensity, etc.); other similar calibration targets may be used.
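The intensity calibration afforded by such a strip can be modeled as a simple linear correction. A minimal sketch, assuming the strip carries several patches of known, factory-calibrated intensity (the function names and the linear gain/offset model are illustrative assumptions):

```python
def fit_gain_offset(measured, reference):
    """Least-squares gain/offset mapping camera readings from the strip's
    dye patches onto their known, calibrated intensities."""
    n = len(measured)
    mean_m = sum(measured) / n
    mean_r = sum(reference) / n
    cov = sum((m - mean_m) * (r - mean_r) for m, r in zip(measured, reference))
    var = sum((m - mean_m) ** 2 for m in measured)
    gain = cov / var
    offset = mean_r - gain * mean_m
    return gain, offset

def calibrate_image_value(raw, gain, offset):
    """Apply the strip-derived correction to a raw fluorescence reading."""
    return gain * raw + offset
```

Applying the same correction to every pixel puts images from different sessions (and different ambient conditions) on a common intensity scale before longitudinal comparison.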

It may be desirable to increase the consistency of the imaging results and reproduce the distance between the device and the wound, since tissue fluorescence intensity may vary slightly if the distance is changed over multiple imaging sessions. Thus, in embodiments, the device may have a range finder to determine a fixed or variable distance between the device and the wound surface.

The device can be used to take white light images of the entire wound and surrounding normal tissue, with a measuring device (e.g., a ruler) placed within the imaging field of view. This may allow for visual assessment of the wound and calculation/determination of quantitative parameters such as wound area, circumference, diameter, and topographic profile. Wound healing can be assessed by making planimetric measurements of the wound area at multiple time points (e.g., at clinical visits) until the wound heals. The time course of wound healing can be compared to the expected healing time calculated from multiple time-point measurements of the reduction in wound radius, using the equation R = √(A/π) (R, radius; A, planar wound area; π, the constant 3.14). This quantitative information about the wound can be used to track and monitor changes in the appearance of the wound over time in order to assess and determine the extent of wound healing caused by natural means or by any therapeutic intervention. This data may be stored electronically in the patient's health record for future reference. White light imaging may be performed while the operator is performing a preliminary clinical assessment of the patient.
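The effective-radius equation above, and a simple healing-rate index built on serial measurements, can be sketched as follows (the rate helper is an illustration of how the serial measurements might be combined, not part of the disclosure):

```python
import math

def effective_radius(area):
    """Effective wound radius R = sqrt(A / pi) from the planimetric area A."""
    return math.sqrt(area / math.pi)

def radius_reduction_rate(areas, days):
    """Average rate of effective-radius reduction (length units per day)
    across serial wound-area measurements, as a simple healing index."""
    radii = [effective_radius(a) for a in areas]
    return (radii[0] - radii[-1]) / (days[-1] - days[0])
```

For example, a wound whose area shrinks from 4π to π square units over a week has its effective radius halved from 2 to 1, a reduction rate of 1/7 unit per day.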

The device may be designed to detect all or most of tissue Autofluorescence (AF). For example, using a multi-spectral band filter, the device can image tissue autofluorescence and blood-related optical absorption emitted from the following tissue biomolecules, for example, at 405nm excitation: green-appearing collagens (types I, II, III, IV, V and others), green-yellow-orange elastin, reduced Nicotinamide Adenine Dinucleotide (NADH), Flavin Adenine Dinucleotide (FAD) emitting a blue-green autofluorescence signal, and bacteria/microorganisms, most of which appear to have extensive (e.g., green and red) autofluorescence emissions.

The image analysis may include calculating a ratio of red-to-green AF in the image. Intensity calculations can be obtained from a region of interest within the wound image. The pseudo-color image may be mapped onto a white light image of the wound.
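A minimal sketch of the ratio computation, assuming the red and green autofluorescence channels have already been separated into arrays; the ROI masking behavior and the small epsilon guard against division by zero are illustrative choices:

```python
import numpy as np

def red_green_ratio(red_channel, green_channel, roi_mask=None, eps=1e-6):
    """Per-pixel red/green autofluorescence ratio, optionally restricted
    to a region of interest; large values flag red (porphyrin-like)
    emission against the green tissue background."""
    red = np.asarray(red_channel, dtype=float)
    green = np.asarray(green_channel, dtype=float)
    ratio = red / (green + eps)  # eps avoids division by zero
    if roi_mask is not None:
        ratio = np.where(roi_mask, ratio, 0.0)
    return ratio
```

The resulting ratio map can then be pseudo-colored and overlaid on the co-registered white light image, as described above.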

The device may map the biodistribution of bacteria at the wound site and on the surrounding skin, and thus may assist microbiological testing by identifying specific tissue areas where a swab or biopsy is required. In addition, use of the imaging device may allow monitoring of the response of bacterially infected tissue to various treatments, including the use of antibiotics and other therapies such as photodynamic therapy (PDT), hyperbaric oxygen therapy (HOT), low-level light therapy, or anti-matrix metalloproteinase (MMP) therapy. The device can be used to observe bacterial biodistribution at the surface of a wound and within the depth of the tissue, and can also be used to observe surrounding normal tissue. Thus, the device may be used to indicate the spatial distribution of an infection.

In general, the device may be used to image and/or monitor a target, such as a skin target, a tumor target, a wound target, a confined anatomical space or cavity, an oral target, an otorhinolaryngological target, an ocular target, a genital target, an anal target, and any other suitable target on a subject.

The image analysis algorithm may provide one or more of the following features:

Patient digital image management

Integration of multiple image acquisition devices

Recording all imaging parameters, including all exogenous fluorescent contrast agents

Multiple scale and calibration settings

Internal spectral image unmixing calculation algorithm for quantitative determination of tissue/bacteria autofluorescence and exogenous agent fluorescence signals

Convenient annotation tools

Digital archive

Web publishing

Basic image processing and analysis

Complete image processing and quantitative analysis functions

The image stitching algorithm will allow stitching of a series of panoramic or partially overlapping wound images into a single image in either an automatic or manual mode.

Easy-to-use measuring tool

Intuitive setting of processing parameters

Convenient manual editor

Report generation

A powerful image report generator with specialized templates that can be integrated into an existing clinical report infrastructure or telemedicine/electronic healthcare patient medical data infrastructure. For example, reports can be exported as PDF, Word, Excel.

Large-scale automated solution library

Custom automated solutions for various areas of wound assessment, including quantitative image analysis.

Although image analysis algorithms, techniques or software have been described, the description also extends to computing devices, systems and methods for performing the image analysis.

Image guidance

The device may also be used to provide fluorescence image guidance, for example in surgery, even without the use of dyes or markers. Certain tissues and/or organs may have different fluorescence spectra (e.g., intrinsic fluorescence) when viewed using the imaging device, for example under certain excitation light conditions.

Food applications

The imaging device may also be used to monitor contamination of food products (e.g., meat products). This may be useful, for example, in the preparation of food/animal products for the meat, poultry, dairy, fish, and agricultural industries. The device may be used as part of an integrated, multidisciplinary approach to analytical laboratory services, and may provide capabilities including image-based contamination detection and guidance for obtaining samples for testing. The device can be used to detect, identify, and monitor levels of bacterial and other microbial contamination/adulteration in foods, such as meat, in real time. It may be used for bacterial contamination tracking in food processing plant environments, and thus may provide an image-based method for determining food safety and quality. In embodiments where the device is handheld, compact, and portable, the imaging device may be used in food preparation areas to assess the safety of food with respect to bacterial/microbial contamination. The device can also be used to relatively quickly detect and analyze bacteria/microorganisms in collected or sampled meat samples (and on preparation surfaces) during processing and in finished food products, for example as part of a food safety and quality regulatory inspection process. The device can be used in the meat, horticulture, and aquaculture industries to implement food safety inspection/detection procedures that meet food safety and quality requirements. The device can be used to detect food contaminants such as those found in the meat, poultry, dairy, and fish industries. This technique is useful in fecal contamination detection because fecal bacteria produce porphyrins that can be readily detected by the device.

The detection and accurate identification of food-borne pathogens, such as Listeria monocytogenes (LM), in food samples and processing lines may be critical to ensure food quality assurance and to track bacterial pathogen outbreaks in food supplies. Current detection methods used in food production and processing facilities typically rely on multiple random surface samplings of the equipment (e.g., swabs) followed by molecular-based diagnostic analysis (e.g., real-time polymerase chain reaction, RT-PCR), which typically provides quantitative confirmation of the presence of LM within 24-72 hours. However, because of time and cost constraints, pathogen contamination testing is typically performed only on randomly selected areas of a given food production facility at any one time; the resulting potential for undersampling during a "first-pass" surface wipe of equipment can leave pathogens undetected, with catastrophic health and economic consequences. Furthermore, several factors have prompted efforts to improve the cost-effectiveness of early and accurate detection of food-borne pathogens: i) the inability to rapidly sample all surface areas during a "first-pass" wipe to determine areas with a high probability of contamination, ii) the inability to visually record this initial screening process (no imaging method has been available to date), iii) delays in obtaining laboratory results, iv) the high costs associated with current methods, and v) most importantly, missed detection of potentially lethal pathogen contamination.

The device may be used to provide a relatively rapid and accurate method of detecting such pathogens. The device can be used with a multi-color fluorescent probe "cocktail" assay (e.g., a combination of two or more contrast agents) that can unambiguously identify (and make visible) only viable Listeria monocytogenes, distinguishing it from other Listeria species using highly specific genetic probe technology. This may allow for specific detection of live LM in real time, potentially minimizing the need for standard, time-consuming enrichment methods. This process can also be extended to include the detection of other pathogens of interest, including Enterobacter sakazakii, Campylobacter species (C. coli, C. jejuni, and C. lari), Escherichia coli (including lactose- and indole-negative strains), Salmonella species, Staphylococcus aureus and other members of the genus Staphylococcus, and Pseudomonas aeruginosa. Other bacteria may be detected by selecting an appropriate probe or combination of probes. For example, a combination of two or more contrast agents may be designed to be specific to a certain bacterium and may result in a unique detectable fluorescence signature when imaged using the imaging device.

The imaging device can be used (e.g., when combined with an applied exogenous bacteria-specific contrast agent, including a multi-targeting probe or combination of probes) for relatively rapid "first-pass" screening of food preparation and treatment surfaces for targeted swab and microbiological testing. The device may allow for relatively rapid image-based monitoring of any surface of equipment and food products, and may capture in real time fluorescent characteristics of food-borne bacteria/pathogens. As described above, the device can be used in conjunction with, for example, assays of "cocktails" (and combinations thereof) of multi-color fluorescent probes, which can unambiguously identify (and make visible) only viable listeria monocytogenes from other listeria species using highly specific genetic probe technology. Such probe "cocktails" may be designed to specifically target certain pathogens based on specific combinations of probes known to be sensitive to such pathogens and known to produce a characteristic fluorescent response. In addition to detecting these pathogens, the device also allows to distinguish the presence and/or location of different strains based on their different characteristic fluorescent responses.

Surface contamination

The imaging device may be used to detect surface contamination, for example "surface bacterial contamination" in a healthcare environment. The device can be used to detect and image the presence of bacteria/microorganisms and other pathogens on various surfaces/materials/instruments (particularly those associated with surgery) in hospitals, chronic care facilities, and nursing homes, where contamination is a primary source of infection. The device can be used in conjunction with standard strategies for the detection, identification, and enumeration of indicator organisms and pathogens.

The systems and methods disclosed herein may form a system as summarized below and may be capable of performing processes as summarized below:

A method/system/apparatus for illuminating a subject with light of a calibrated intensity and for capturing close-up fluorescent digital images, comprising:

-optical distance measuring device

-digital camera sensor with optical fluorescence filter

-one or more narrowband optical emitters

-a computing processor with memory

-user display screen

-user input control

Thereby:

-the light emitter is switched on,

-presenting a preview camera image to a user via a display screen,

-presenting the rangefinder value to a user via a display screen

The user can activate the camera to capture an image

Whereby the user can:

the light intensity on the object is set by adjusting the height of the device from the object according to the rangefinder value on the screen and an image is captured.
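The distance-gated capture flow outlined above reduces to a simple check of the live rangefinder reading against a target working distance. A sketch follows; the target and tolerance values are illustrative assumptions, not device specifications:

```python
def in_capture_range(distance_mm, target_mm=100.0, tolerance_mm=5.0):
    """True when the rangefinder reading is close enough to the target
    working distance; holding this distance keeps the calibrated light
    intensity on the subject consistent between imaging sessions."""
    return abs(distance_mm - target_mm) <= tolerance_mm

# Simulated preview loop: the user lowers the device until in range,
# at which point the capture is permitted.
readings = [140.0, 118.0, 103.0, 99.0]
capture_at = next(d for d in readings if in_capture_range(d))
```

In the device itself, the reading would be refreshed with each preview frame and the capture control enabled only while the check passes.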

A method/system/apparatus for capturing close-up digital images of consistent magnification and perspective, comprising:

-optical distance measuring device

-one or more similar digital camera sensors

-a computing processor with memory

-user display screen

-user input control

Thereby:

-presenting a preview camera image to a user via a display screen,

-presenting the rangefinder value to a user via a display screen

The user may activate one or the other camera to capture an image

Whereby the user can:

the view of the object is set by adjusting the height of the device from the object according to the rangefinder value on the screen, and an image is captured.

A method/system/apparatus for capturing measurement-ready close-up digital images of a subject, comprising:

-optical distance measuring device

-digital camera sensor

-a computing processor with memory

-user display screen

-user input control

-image processing software

Thereby:

-2 visible wound stickers are attached to the subject,

-presenting a preview camera image to a user via a display screen,

-presenting the rangefinder value to a user via a display screen

-if the positions of the 2 stickers are detected using image processing, they are continuously presented to the user via the display screen

When the stickers are detected, the user may activate the camera to capture an image.

Whereby the user can:

setting a view of the object by adjusting the height of the device from the object according to the rangefinder value on the screen,

-setting a view of the object by adjusting the position of the device relative to the object in order to detect the sticker, and

-capturing an image.
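The sticker-gated shutter logic of this third arrangement can be sketched as follows, where `detections` is a hypothetical list of per-frame sticker centroids reported by the image-processing step, with None standing in for a sticker not found in the current frame:

```python
def ready_to_capture(detections, required=2):
    """Enable image capture only when image processing has located the
    positions of all required wound stickers in the current frame."""
    found = [d for d in detections if d is not None]
    return len(found) >= required
```

The same gate generalizes to other fiducial schemes (e.g., the four ink dots described earlier) by raising the `required` count.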

One of ordinary skill in the art having the benefit of this disclosure will appreciate that the present disclosure provides various exemplary devices, systems, and methods for surgical and/or in vitro visualization of tumors and/or residual cancer cells on the surgical margin. Further modifications and alternative embodiments of various aspects of the disclosure will be apparent to those skilled in the art in view of this description.

Moreover, the apparatus and methods may include additional components or steps that have been omitted from the figures for clarity of illustration and/or operation. Accordingly, this description is to be construed as illustrative only and is for the purpose of teaching those skilled in the art the general manner of carrying out the disclosure. It should be understood that the various embodiments shown and described herein are to be considered exemplary. It will be apparent to those skilled in the art having the benefit of the description herein that the elements and materials, and the arrangement of such elements and materials, may be substituted for those illustrated and described herein, the parts and processes may be reversed, and certain features of the disclosure may be utilized independently. Changes may be made in the elements described herein without departing from the spirit and scope of the disclosure and the following claims, including the equivalents thereof.

It is understood that the specific examples and embodiments set forth herein are non-limiting and that modifications in structure, size, materials, and method may be made without departing from the scope of the disclosure.

Furthermore, the terminology of the present specification is not intended to be limiting of the present disclosure. For example, spatially relative terms, such as "under," "lower," "below," "upper," "bottom," "right," "left," "proximal," "distal," "front," and the like, may be used to describe one element or feature's relationship to another element or feature as illustrated in the figures. These spatially relative terms are intended to encompass different positions (i.e., locations) and orientations (i.e., rotational placements) of the device in use or operation in addition to the position and orientation depicted in the figures.

For the purposes of the present specification and appended claims, unless otherwise indicated, all numbers expressing quantities, percentages or proportions, and other numerical values used in the specification and claims, are to be understood as being modified in all instances by the term "about" (if not already modified). Accordingly, unless indicated to the contrary, the numerical parameters set forth in the specification and attached claims are approximations that may vary depending upon the desired properties sought to be obtained by the present disclosure. At the very least, and not as an attempt to limit the application of the doctrine of equivalents to the scope of the claims, each numerical parameter should at least be construed in light of the number of reported significant digits and by applying ordinary rounding techniques.

Notwithstanding that the numerical ranges and parameters setting forth the broad scope of the disclosure are approximations, the numerical values set forth in the specific examples are reported as precisely as possible. Any numerical value, however, inherently contains certain errors necessarily resulting from the standard deviation found in their respective testing measurements. Moreover, all ranges disclosed herein are to be understood to encompass any and all subranges subsumed therein.

It should be noted that, as used in this specification and the appended claims, the singular forms "a," "an," and "the" and any singular use of any word include plural referents unless expressly and unequivocally limited to one referent. As used herein, the term "include" and grammatical variations thereof are intended to be non-limiting such that recitation of items in a list is not to the exclusion of other like items that may be substituted or added to the listed items.

It should be understood that while the present disclosure has been described in detail with respect to various exemplary embodiments thereof, it should not be considered limited thereto since various modifications can be made without departing from the broad scope of the following claims, including their equivalents.
