


Title:
SYSTEMS AND METHODS FOR FLUORESCENCE IMAGING
Document Type and Number:
WIPO Patent Application WO/2023/230283
Kind Code:
A1
Abstract:
The present disclosure provides a system for fluorescence imaging, comprising: one or more light sources for illuminating a surgical scene, wherein the one or more light sources comprise (i) a first light source configured to generate a first set of light signals for fluorescence imaging and (ii) a second light source configured to generate a second set of light signals for at least one of RGB imaging, laser speckle imaging, and depth imaging; and one or more imaging devices for generating one or more images of the surgical scene based on a third set of light signals reflected, emitted, or received from the surgical scene, wherein the third set of light signals corresponds to at least one of the first set of light signals and the second set of light signals.

Inventors:
OBERLIN JOHN (US)
MAROIS MIKAEL (US)
DEMAIO EMANUEL (US)
IYER SANTOSH (US)
Application Number:
PCT/US2023/023615
Publication Date:
November 30, 2023
Filing Date:
May 25, 2023
Assignee:
ACTIV SURGICAL INC (US)
International Classes:
A61B90/30; A61B1/06; A61B5/00; H04N23/11; A61B1/04; G02B23/24; G03B15/02
Foreign References:
US20150182106A12015-07-02
US20160206202A12016-07-21
US20190175021A12019-06-13
US20200221933A12020-07-16
Attorney, Agent or Firm:
ELKINS, Madeline (US)
Claims:
CLAIMS

WHAT IS CLAIMED IS:

1. A system for medical imaging, comprising: one or more light sources for illuminating a surgical scene, wherein the one or more light sources comprise (i) a first light source configured to generate a first set of light signals for fluorescence imaging and (ii) a second light source configured to generate a second set of light signals for at least one of RGB imaging, laser speckle imaging, and depth imaging; and one or more imaging devices for generating one or more images of the surgical scene based on a third set of light signals reflected, emitted, or received from the surgical scene, wherein the third set of light signals corresponds to at least one of the first set of light signals and the second set of light signals.

2. The system of claim 1, wherein the first set of light signals for fluorescence imaging is configured to excite one or more biological materials in the surgical scene, thereby causing the one or more biological materials to emit one or more fluorescence signals that are detectable by the one or more imaging devices.

3. The system of any one of the preceding claims, wherein the third set of light signals comprises the one or more fluorescence signals emitted by the one or more biological materials.

4. The system of any one of the preceding claims, wherein the one or more fluorescence signals emitted by the one or more biological materials have a different wavelength than the first set of light signals used to excite the one or more biological materials.

5. The system of any one of the preceding claims, wherein the first set of light signals for fluorescence imaging has a wavelength ranging from about 440 nanometers (nm) to about 500 nanometers (nm).

6. The system of any one of the preceding claims, wherein the first set of light signals for fluorescence imaging has a wavelength of about 470 nanometers (nm) or about 445 nanometers (nm).

7. The system of any one of the preceding claims, wherein the one or more fluorescence signals emitted by the one or more biological materials have a wavelength of at least about 500 nanometers (nm).

8. The system of any one of the preceding claims, wherein the one or more images of the surgical scene comprise one or more fluorescence images of the one or more biological materials, wherein the one or more fluorescence images are generated without the use of any dyes.

9. The system of any one of the preceding claims, wherein the one or more biological materials comprise a tissue.

10. The system of any one of the preceding claims, wherein the one or more biological materials comprise bile, urine, fat, connective tissue, or cauterized tissue.

11. The system of any one of the preceding claims, wherein the first set of light signals does not cause blood in the surgical scene to fluoresce.

12. The system of any one of the preceding claims, wherein the first set of light signals causes one or more tissue regions in the surgical scene to fluoresce.

13. The system of any one of the preceding claims, wherein the one or more images of the surgical scene are usable to detect bile leaks from one or more bile ducts in the surgical scene during or after surgery.

14. The system of any one of the preceding claims, wherein the one or more images of the surgical scene are usable to infer a hemoglobin density in tissue and to correct one or more laser speckle maps based on the inferred hemoglobin density.

15. The system of claim 14, wherein the one or more laser speckle maps are generated using the second set of light signals.

16. The system of any one of the preceding claims, further comprising a filter disposed along a light path between (i) the one or more imaging devices and (ii) the first or second light source, wherein the filter is configured to filter out one or more light signals with a wavelength below about 500 nanometers (nm).

17. The system of claim 16, wherein the filter is configured to filter out the first set of light signals generated by the first light source.

18. The system of any one of the preceding claims, wherein the one or more imaging devices comprise a first imaging device for generating a first set of images based on the first set of light signals and a second imaging device for generating a second set of images based on the second set of light signals.

19. The system of any one of the preceding claims, further comprising an optical element disposed along a light path between (i) the one or more imaging devices and (ii) the first or second light source, wherein the optical element is configured to (i) direct a first subset of the light signals reflected, emitted, or received from the surgical scene to the first imaging device and (ii) direct a second subset of the light signals reflected, emitted, or received from the surgical scene to the second imaging device.

20. The system of claim 19, wherein the optical element comprises a lens, a mirror, or a prism.

21. The system of claim 20, wherein the optical element comprises a dichroic mirror or a dichroic prism.

22. The system of any one of the preceding claims, further comprising an image processing unit configured to generate an enhanced visualization of the surgical scene based on at least one of the first set of images and the second set of images.

23. The system of claim 22, wherein the image processing unit is configured to adjust, modify, correct, or update the second set of images based on the first set of images.

24. The system of claim 22 or 23, wherein the image processing unit is configured to overlay at least one image from the first set of images on at least one image from the second set of images.

25. The system of any one of claims 22-24, wherein the image processing unit is configured to overlay at least one image from the second set of images on at least one image from the first set of images.

26. The system of any one of the preceding claims, wherein the one or more imaging devices comprise at least one sensor configured for fluorescence imaging based on the first set of light signals.

27. The system of claim 26, wherein the at least one sensor comprises an RGB sensor.

28. The system of claim 27, wherein the one or more images are generated based on one or more fluorescence signals filtered through one or more bandpass filters on the RGB sensor.

29. The system of claim 28, wherein the one or more bandpass filters comprise a red bandpass filter, a blue bandpass filter, or a green bandpass filter.

30. The system of any one of the preceding claims, further comprising a processing unit configured to (i) identify one or more critical structures in or near the surgical scene or (ii) distinguish between different critical structures in or near the surgical scene, based at least in part on the one or more images captured using the one or more imaging devices.

31. The system of claim 30, wherein the one or more critical structures comprise a ureter, a bile duct, one or more blood vessels, an artery, a vein, one or more nerves, or one or more lymph nodes.

32. The system of any one of the preceding claims, wherein the one or more images are generated based on a quantification of fluorescent light emitted from one or more biological materials in or near the surgical scene.

33. The system of claim 32, wherein the quantification of fluorescent light is based at least in part on (i) an amount of fluorescent light emitted from the one or more biological materials and (ii) one or more characteristics of illumination light used to capture the one or more images.

34. The system of claim 33, wherein the one or more characteristics of the illumination light comprise at least one of illumination intensity, illumination gradient or bias across the surgical scene, or a distance between the surgical scene and a light source providing the illumination light.

35. The system of any one of the preceding claims, wherein the first set of light signals for fluorescence imaging is configured to excite one or more fluorescing materials in the surgical scene, wherein the one or more fluorescing materials comprise a vitamin.

36. The system of claim 35, wherein the vitamin comprises a B vitamin or any metabolites thereof.

37. The system of claim 36, wherein the B vitamin comprises riboflavin.

38. The system of claim 35, further comprising an image processing unit configured to (i) analyze one or more fluorescence signals generated or emitted by the vitamin and (ii) identify one or more critical structures in or near the surgical scene based on the analysis of the one or more fluorescence signals generated or emitted by the vitamin.

39. The system of claim 35, wherein the excitation of the one or more fluorescing materials produces a fluorescence signal, which fluorescence signal is usable to locate, identify, or visualize a structure or a feature in the surgical scene.

40. The system of claim 39, wherein the structure or the feature in the surgical scene comprises a ureter, a bile duct, or a lymph node of a subject.

41. A method, comprising: providing one or more B vitamins to a subject; providing an excitation light to a target region comprising the one or more B vitamins; receiving an emission light from the one or more B vitamins in the target region in response to the excitation light; and detecting, identifying, or visualizing a structure or a feature in the target region based at least in part on the emission light received from the one or more B vitamins.

42. The method of claim 41, wherein the emission light comprises fluorescent light.

43. The method of any one of claims 41 or 42, wherein the fluorescent light is generated from fluorescence of the one or more B vitamins.

44. The method of claim 42, wherein the one or more B vitamins are configured to fluoresce or autofluoresce in response to the excitation light to generate the fluorescent light.

45. The method of claim 41, wherein the one or more B vitamins are exogenous and non-naturally occurring in the subject’s body.

46. The method of any one of claims 41 to 45, wherein the excitation light has a wavelength ranging from about 200 nanometers to about 500 nanometers.

47. The method of any one of claims 41 to 46, wherein the excitation light has a wavelength of about 270 nanometers.

48. The method of any one of claims 41 to 47, wherein the emission light has a wavelength that is greater than or equal to about 500 nanometers.

49. The method of any one of claims 41 to 48, wherein the structure or feature in the target region comprises a ureter.

50. The method of any one of claims 41 to 49, wherein the structure or feature in the target region comprises a bile duct.

51. The method of any one of claims 41 to 50, wherein the structure or feature in the target region comprises one or more lymph nodes.

52. The method of claim 51, wherein the emission light received from the one or more B vitamins is usable to detect, identify, or visualize a plurality of lymph nodes in the subject’s body.

53. The method of any one of claims 41 to 52, wherein in (a), the one or more B vitamins are introduced into the subject’s body via oral ingestion or intravenous injection.

54. The method of any one of claims 41 to 53, wherein the one or more B vitamins in the target region comprise excess B vitamins that are concentrated in the target region.

55. The method of any one of claims 41 to 54, further comprising, prior to (d), filtering out the excitation light to aid in the detection, identification, or visualization of the structure or the feature in the target region based on the emission light from the one or more B vitamins.

56. The method of any one of claims 41 to 55, wherein the feature in the target region comprises a perfusion of a biological fluid through the structure in the target region.

57. The method of any one of claims 41 to 56, wherein the one or more B vitamins comprise a B2 vitamin.

58. A method, comprising: providing an image of a subject, wherein the image comprises an emission light from one or more B vitamins in a target region collected in response to an excitation light; and detecting, identifying, or visualizing a structure or a feature in the target region based at least in part on the emission light received from the one or more B vitamins.

59. The method of claim 58, wherein the emission light comprises fluorescent light.

60. The method of any one of claims 58 or 59, wherein the fluorescent light is generated from fluorescence of the one or more B vitamins.

61. The method of claim 59, wherein the one or more B vitamins are configured to fluoresce or autofluoresce in response to the excitation light to generate the fluorescent light.

62. The method of claim 58, wherein the one or more B vitamins are exogenous and non-naturally occurring in the subject’s body.

63. The method of any one of claims 58 to 62, wherein the excitation light has a wavelength ranging from about 200 nanometers to about 500 nanometers.

64. The method of any one of claims 58 to 63, wherein the excitation light has a wavelength of about 445 nanometers.

65. The method of any one of claims 58 to 64, wherein the emission light has a wavelength that is greater than or equal to about 500 nanometers.

66. The method of any one of claims 58 to 65, wherein the structure or feature in the target region comprises a ureter.

67. The method of any one of claims 58 to 66, wherein the structure or feature in the target region comprises a bile duct.

68. The method of any one of claims 58 to 67, wherein the structure or feature in the target region comprises one or more lymph nodes.

69. The method of claim 68, wherein the emission light received from the one or more B vitamins is usable to detect, identify, or visualize a plurality of lymph nodes in the subject’s body.

70. The method of any one of claims 58 to 69, wherein in (a), the one or more B vitamins are introduced into the subject’s body via oral ingestion or intravenous injection.

71. The method of any one of claims 58 to 70, wherein the one or more B vitamins in the target region comprise excess B vitamins that are concentrated in the target region.

72. The method of any one of claims 58 to 71, further comprising, prior to (d), filtering out the excitation light to aid in the detection, identification, or visualization of the structure or the feature in the target region based on the emission light from the one or more B vitamins.

73. The method of any one of claims 58 to 72, wherein the feature in the target region comprises a perfusion of a biological fluid through the structure in the target region.

74. The method of any one of claims 58 to 73, wherein the one or more B vitamins comprise a B2 vitamin.

75. The system of any one of claims 1 to 40, wherein the first light source and the second light source comprise the same light source.

76. The system of any one of claims 1 to 40 or 75, wherein the system further comprises a filter, wherein the filter is configured to reduce a presence of an excitation signal from the first set of light signals.

77. The system of claim 76, wherein the filter is an absorptive filter.

78. The system of claim 77, wherein the absorptive filter comprises an optical density of equal to or greater than 4.

79. The system of claim 78, wherein the optical density is about seven.

80. The system of claim 76, wherein the filter is a reflective filter.

81. The system of claim 80, wherein the reflective filter is a dichroic mirror.

82. The system of claim 18, wherein the first imaging device and the second imaging device are the same imaging device.

Description:
SYSTEMS AND METHODS FOR FLUORESCENCE IMAGING

CROSS REFERENCE

[0001] This application claims priority to U.S. Provisional Patent Application No. 63/346,578 filed on May 27, 2022, which application is incorporated herein by reference in its entirety for all purposes.

BACKGROUND

[0002] Medical imaging data may be used to aid in the diagnosis and/or treatment of different medical conditions, and the performance of various medical or surgical procedures. Such medical imaging data may be associated with various anatomical, physiological, or morphological features within a surgical scene.

SUMMARY

[0003] The systems and methods disclosed herein may be used to generate accurate and useful fluorescence imaging datasets that can be leveraged by medical or surgical operators for a variety of different applications or surgical procedures. The systems and methods of the present disclosure can be used to provide a medical or surgical operator, in some cases a surgeon, surgeon assistant, or nurse, with additional visual information of a surgical scene, including, for example, live fluorescence image overlays to enhance a medical operator’s ability to quickly and efficiently perform one or more steps of a live surgical procedure in an optimal manner. In some cases, the fluorescence images generated using the systems and methods of the present disclosure may also be used to improve the precision, flexibility, and control of autonomous and/or semi-autonomous robotic surgical systems.

[0004] The systems and methods of the present disclosure may be implemented for medical imaging of a surgical scene using a variety of different imaging modalities. The medical images obtained or generated using the presently disclosed systems and methods may comprise, for example, fluorescent images (including fluorescence images based on tissue fluorescence characteristics and/or fluorescence images generated based on fluorescent dyes or markers), RGB images, depth maps, time of flight (TOF) images, laser speckle contrast images, hyperspectral images, multispectral images, or laser doppler images. The medical images may also comprise, for example, fluorescent videos (including fluorescence videos based on tissue fluorescence characteristics and fluorescence videos generated based on fluorescent dyes or markers), time of flight (TOF) videos, RGB videos, dynamic depth maps, laser speckle contrast videos, hyperspectral videos, multispectral videos, or laser doppler videos. In some cases, the medical imagery may comprise one or more streams of imaging data comprising one or more medical images. The one or more streams of imaging data may comprise a series of medical images obtained successively or sequentially over a time period.

[0005] In some embodiments, the medical images may be processed to determine or detect one or more anatomical, physiological, or morphological processes or properties associated with the surgical scene or the subject undergoing a surgical procedure. As used herein, processing the medical images may comprise determining or classifying one or more features, patterns, or attributes of the medical images. In some embodiments, the medical images may be used to train or implement one or more medical algorithms or models for tissue tracking. In some embodiments, the systems and methods of the present disclosure may be used to augment various medical imagery with fluorescence information associated with a surgical scene.

[0006] In some embodiments, the one or more medical images may be used or processed to provide live guidance based on a detection of one or more tools, surgical phases, critical views, or one or more biological, anatomical, physiological, or morphological features in or near the surgical scene. In some embodiments, the one or more medical images may be used to enhance intra-operative decision making and provide supporting features (e.g., enhanced image processing capabilities or live data analytics) to assist a surgeon during a surgical procedure.

[0007] In some embodiments, the one or more medical images may be used to generate an overlay comprising (i) one or more RGB images or videos of the surgical scene and (ii) one or more additional images or videos of the surgical procedure, wherein the one or more additional images or videos comprise fluorescence data, laser speckle data, perfusion data, or depth information.

[0008] The fluorescence images generated using the presently disclosed systems and methods may provide several advantages over conventional imaging systems that do not utilize fluorescence. For example, fluorescence (e.g., fluorescence in a tissue or organ being imaged) may help better visualize and identify certain features, elements, or details in the surgical scene. For instance, a tissue or organ may fluoresce or autofluoresce when exposed to light within a relevant band (e.g., green or blue). The illuminated region of the body, organ, or tissues may further comprise one or more blood vessels therein or in proximity thereof. The blood and blood vessels may not be fluorescent, which may create a contrast between the blood vessel(s) and their surrounding tissues. Thus, the blood vessels may be located and identified with greater ease, certainty, precision, and resolution, and in less time.

[0009] In some examples, autofluorescence imaging utilizes native fluorescent properties of tissues (which fluoresce brightly when illuminated with blue light) to identify features in a surgical scene that do not fluoresce (e.g., blood). The autofluorescence images generated using systems and methods of the present disclosure may permit surgeons to easily identify blood vessels and visualize blood flow, which can appear dark relative to other tissues such as fat, which fluoresces and shows up brightly. The use of autofluorescence imaging can also produce images in which certain tissues (e.g., connective tissues) appear much brighter in comparison to other conventional imaging methods (e.g., narrow band imaging).
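
As an illustration of the contrast described above, the separation between fluorescing tissue and dark, non-fluorescing blood can be expressed as a Weber contrast computed from pixel intensities. The frame values and the threshold below are invented for illustration and are not taken from the disclosure:

```python
import numpy as np

# Toy autofluorescence frame (arbitrary units): fluorescing tissue such as
# fat appears bright, while blood appears dark. Values are invented.
frame = np.array([
    [200.0, 210.0, 40.0],
    [205.0, 195.0, 35.0],
    [198.0, 202.0, 38.0],
])

blood_mask = frame < 100          # dark, non-fluorescing pixels
tissue_mean = frame[~blood_mask].mean()
blood_mean = frame[blood_mask].mean()

# Weber contrast of the dark vessels against the fluorescing background
contrast = (tissue_mean - blood_mean) / tissue_mean
```

A high contrast value here corresponds to the ease of vessel identification described in the preceding paragraphs.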

[0010] In one aspect, the present disclosure provides a system for medical imaging. The system may comprise one or more light sources for illuminating a surgical scene. The one or more light sources may comprise (i) a first light source configured to generate a first set of light signals for fluorescence imaging and (ii) a second light source configured to generate a second set of light signals for at least one of RGB imaging, laser speckle imaging, and depth imaging. The system may further comprise one or more imaging devices for generating one or more images of the surgical scene based on a third set of light signals reflected, emitted, or received from the surgical scene. The third set of light signals may correspond to at least one of the first set of light signals and the second set of light signals.

[0011] In some embodiments, the first set of light signals for fluorescence imaging may be configured to excite one or more biological materials in the surgical scene, thereby causing the one or more biological materials to emit one or more fluorescence signals that are detectable by the one or more imaging devices.

[0012] In some embodiments, the third set of light signals may comprise the one or more fluorescence signals emitted by the one or more biological materials.

[0013] In some embodiments, the one or more fluorescence signals emitted by the one or more biological materials may have a different wavelength than the first set of light signals used to excite the one or more biological materials.

[0014] In some embodiments, the first set of light signals for fluorescence imaging may have a wavelength of at least about 100 nanometers (nm), 150 nm, 200 nm, 250 nm, 300 nm, 400 nm, 450 nm, 500 nm, or longer. In some cases, the wavelength may be at most about 600 nm, 500 nm, 450 nm, 400 nm, 300 nm, 250 nm, 200 nm, or shorter. In some examples, the wavelength may range from about 200 nm to about 500 nm. In some embodiments, the first set of light signals for fluorescence imaging may have a wavelength of about 470 nm. In some embodiments, the first set of light signals for fluorescence imaging may have a wavelength of about 445 nm. In some embodiments, the one or more fluorescence signals emitted by the one or more biological materials may have a wavelength of at least about 500 nm.

[0015] In some embodiments, the first set of light signals may not cause blood in the surgical scene to fluoresce. In some embodiments, the first set of light signals may cause one or more tissue regions or biological materials in the surgical scene to fluoresce.

[0016] In some embodiments, the one or more images of the surgical scene may comprise one or more fluorescence/autofluorescence images of the one or more biological materials. In some embodiments, the one or more biological materials may comprise a tissue. In some embodiments, the one or more biological materials may comprise bile, urine, fat, connective tissue, or cauterized tissue. The one or more fluorescence/autofluorescence images may be generated without the use of any dyes (e.g., ICG) or other fluorescent markers or fiducials. As used herein, fluorescence may comprise or be autofluorescence.

[0017] In some embodiments, the one or more images of the surgical scene may be usable to detect bile leaks from one or more bile ducts in the surgical scene during or after surgery. In some embodiments, the one or more images of the surgical scene may be usable to infer a hemoglobin density in tissue and to correct one or more laser speckle maps based on the inferred hemoglobin density. In some embodiments, the one or more laser speckle maps may be generated using the second set of light signals.
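
One way to picture the speckle-map correction described above is a simple rescaling in which low fluorescence intensity is read as high hemoglobin density. This is a hypothetical sketch only; the function, the intensity-based proxy, and the calibration constant `k` are assumptions, not the disclosed algorithm:

```python
import numpy as np

# Hypothetical correction: treat low fluorescence intensity as a proxy for
# high hemoglobin density, then attenuate the speckle contrast map so that
# strongly absorbing regions are not misread as regions of low flow.
def correct_speckle_map(speckle_map, fluorescence, k=0.5):
    # k is an assumed calibration constant, not a disclosed value.
    hb_density = 1.0 - fluorescence / fluorescence.max()
    return speckle_map * (1.0 - k * hb_density)

speckle = np.full((2, 2), 0.8)
fluo = np.array([[1.0, 1.0], [0.5, 1.0]])
corrected = correct_speckle_map(speckle, fluo)
```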

[0018] In some embodiments, the system may further comprise a filter disposed along a light path between (i) the one or more imaging devices and (ii) the first or second light source. The filter may be configured to filter out one or more light signals with a wavelength below about 500 nanometers (nm). In some embodiments, the filter may be configured to filter out the first set of light signals generated by the first light source.

[0019] In some embodiments, the system may further comprise an optical element disposed along a light path between (i) the one or more imaging devices and (ii) the first or second light source. The optical element may be configured to (i) direct a first subset of the light signals reflected, emitted, or received from the surgical scene to the first imaging device and (ii) direct a second subset of the light signals reflected, emitted, or received from the surgical scene to the second imaging device. In some embodiments, the optical element may comprise a lens, a mirror, or a prism. In some embodiments, the optical element may comprise a dichroic mirror or a dichroic prism. In some alternative embodiments, the optical element may comprise a trichroic mirror or a trichroic prism.
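
The behavior of such a dichroic element can be sketched as a split of a sampled spectrum at a cutoff wavelength, with one band routed to each imaging device. The 500 nm cutoff, the sample wavelengths, and the intensities are assumptions for illustration:

```python
import numpy as np

# Hypothetical dichroic split: light at or above the cutoff goes to one
# imaging device (e.g., the fluorescence channel), light below it to the
# other (e.g., the RGB/laser speckle channel).
def dichroic_split(wavelengths_nm, intensity, cutoff_nm=500):
    to_first = np.where(wavelengths_nm >= cutoff_nm, intensity, 0.0)
    to_second = np.where(wavelengths_nm < cutoff_nm, intensity, 0.0)
    return to_first, to_second

wl = np.array([445, 470, 520, 560])
inten = np.array([1.0, 0.8, 0.6, 0.4])
first, second = dichroic_split(wl, inten)
```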

[0020] In some embodiments, the system may further comprise an image processing unit configured to generate the one or more images of the surgical scene based on at least one of the first set of images and the second set of images. In some embodiments, the image processing unit may be configured to adjust, modify, correct, or update the second set of images based on the first set of images. In some embodiments, the image processing unit may be configured to overlay at least one image from the first set of images on at least one image from the second set of images. In some embodiments, the image processing unit may be configured to overlay at least one image from the second set of images on at least one image from the first set of images.

[0021] In some embodiments, the one or more imaging devices may comprise a first imaging device for generating a first set of images based on the first set of light signals and a second imaging device for generating a second set of images based on the second set of light signals. In some embodiments, the one or more imaging devices may comprise at least one sensor configured for fluorescence imaging based on the first set of light signals. In some embodiments, the at least one sensor may comprise an RGB sensor. In some embodiments, the one or more images may be generated based on one or more fluorescence signals filtered through one or more bandpass filters on the RGB sensor. In some embodiments, the one or more bandpass filters may comprise a red bandpass filter, a blue bandpass filter, and/or a green bandpass filter.
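
The overlay described above can be pictured as an alpha blend in which a single-channel fluorescence image, tinted green, is composited onto an RGB frame. The array shapes, the green tint, and the 0.4 alpha are assumptions chosen for illustration, not values from the disclosure:

```python
import numpy as np

# Illustrative alpha-blend overlay of a fluorescence image onto an RGB frame.
def overlay_fluorescence(rgb, fluorescence, alpha=0.4):
    # Normalize fluorescence to [0, 1] and map it to a green tint.
    f = fluorescence / max(fluorescence.max(), 1e-9)
    tint = np.zeros_like(rgb, dtype=float)
    tint[..., 1] = f                      # green channel carries the signal
    return (1.0 - alpha) * rgb + alpha * tint

rgb = np.full((2, 2, 3), 0.5)             # flat gray RGB frame
fluo = np.array([[0.0, 1.0], [0.5, 0.0]]) # single-channel fluorescence map
blended = overlay_fluorescence(rgb, fluo)
```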

[0022] Another aspect of the present disclosure provides a non-transitory computer readable medium comprising machine executable code that, upon execution by one or more computer processors, implements any of the methods above or elsewhere herein.

[0023] Another aspect of the present disclosure provides a system comprising one or more computer processors and computer memory coupled thereto. The computer memory comprises machine executable code that, upon execution by the one or more computer processors, implements any of the methods above or elsewhere herein.

[0024] In an aspect, provided herein is a method, comprising providing an image of a subject, wherein the image comprises an emission light from one or more B vitamins in a target region collected in response to an excitation light; and detecting, identifying, or visualizing a structure or a feature in the target region based at least in part on the emission light received from the one or more B vitamins.

[0025] In some embodiments, the emission light comprises fluorescent light. In some embodiments, the fluorescent light is generated from fluorescence of the one or more B vitamins. In some embodiments, the one or more B vitamins are configured to fluoresce or autofluoresce in response to the excitation light to generate the fluorescent light. In some embodiments, the one or more B vitamins are exogenous and non-naturally occurring in the subject’s body.

[0026] In some embodiments, the excitation light has a wavelength ranging from about 200 nanometers to about 500 nanometers. In some embodiments, the excitation light has a wavelength of about 445 nanometers. In some embodiments, the emission light has a wavelength that is greater than or equal to about 500 nanometers.

[0027] In some embodiments, the structure or feature in the target region comprises a ureter. In some embodiments, the structure or feature in the target region comprises a bile duct. In some embodiments, the structure or feature in the target region comprises one or more lymph nodes. In some embodiments, the emission light received from the one or more B vitamins is usable to detect, identify, or visualize a plurality of lymph nodes in the subject’s body. In some embodiments, the one or more B vitamins are introduced into the subject’s body via oral ingestion or intravenous injection. In some embodiments, the one or more B vitamins in the target region comprise excess B vitamins that are concentrated in the target region.

[0028] In some embodiments, the method further comprises, prior to (d), filtering out the excitation light to aid in the detection, identification, or visualization of the structure or the feature in the target region based on the emission light from the one or more B vitamins. In some embodiments, the feature in the target region comprises a perfusion of a biological fluid through the structure in the target region. In some embodiments, the one or more B vitamins comprise a B2 vitamin.

[0029] In some embodiments, the first light source and the second light source comprise the same light source. In some embodiments, the system further comprises a filter, wherein the filter is configured to reduce a presence of an excitation signal from the first set of light signals. In some embodiments, the filter is an absorptive filter. In some embodiments, the absorptive filter comprises an optical density equal to or greater than 4. In some embodiments, the optical density is about 7. In some embodiments, the filter is a reflective filter. In some embodiments, the reflective filter is a dichroic mirror. In some embodiments, the first imaging device and the second imaging device are the same imaging device.

[0030] Additional aspects and advantages of the present disclosure will become readily apparent to those skilled in this art from the following detailed description, wherein only illustrative embodiments of the present disclosure are shown and described. As will be realized, the present disclosure is capable of other and different embodiments, and its several details are capable of modifications in various obvious respects, all without departing from the disclosure. Accordingly, the drawings and description are to be regarded as illustrative in nature, and not as restrictive.

INCORPORATION BY REFERENCE

[0031] All publications, patents, and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication, patent, or patent application was specifically and individually indicated to be incorporated by reference.

To the extent publications and patents or patent applications incorporated by reference contradict the disclosure contained in the specification, the specification is intended to supersede and/or take precedence over any such contradictory material.

BRIEF DESCRIPTION OF THE DRAWINGS

[0032] The novel features of the invention are set forth with particularity in the appended claims. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings (also “Figure” and “FIG.” herein), of which:

[0033] FIG. 1 schematically illustrates an exemplary imaging module for imaging a surgical scene using one or more imaging modalities, in accordance with some embodiments.

[0034] FIG. 2 schematically illustrates an exemplary system configuration for implementing fluorescence imaging, in accordance with some embodiments.

[0035] FIG. 3 schematically illustrates an exemplary method for fluorescence imaging, in accordance with some embodiments.

[0036] FIG. 4 schematically illustrates a computer system that is programmed or otherwise configured to implement methods provided herein.

[0037] FIG. 5 shows an example of a ureter visualized with fluorescent light generated by administering vitamin B2 to an animal and using the imaging methods and systems of the present disclosure.

[0038] FIG. 6 shows an example of generating fluorescence in the ureter of an animal using intraureteral indocyanine green (ICG) stents and using the imaging methods and systems of the present disclosure.

[0039] FIG. 7 shows a sensitivity curve for vitamin B2 fluorescence detected using the imaging methods and systems of the present disclosure.

[0040] FIG. 8 shows an example of illuminating and visualizing the lymph nodes of an animal via vitamin B2 and the imaging methods and systems of the present disclosure.

DETAILED DESCRIPTION

[0041] While various embodiments of the invention have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous variations, changes, and substitutions may occur to those skilled in the art without departing from the invention. It should be understood that various alternatives to the embodiments of the invention described herein may be employed.

[0042] The term “real-time,” as used herein, generally refers to a simultaneous or substantially simultaneous occurrence of a first event or action with respect to an occurrence of a second event or action. A real-time action or event may be performed within a response time of less than one or more of the following: ten seconds, five seconds, one second, a tenth of a second, a hundredth of a second, a millisecond, or less relative to at least another event or action. A real-time action may be performed by one or more computer processors.

[0043] Whenever the term “at least,” “greater than” or “greater than or equal to” precedes the first numerical value in a series of two or more numerical values, the term “at least,” “greater than” or “greater than or equal to” applies to each of the numerical values in that series of numerical values. For example, greater than or equal to 1, 2, or 3 is equivalent to greater than or equal to 1, greater than or equal to 2, or greater than or equal to 3.

[0044] Whenever the term “no more than,” “less than,” or “less than or equal to” precedes the first numerical value in a series of two or more numerical values, the term “no more than,” “less than,” or “less than or equal to” applies to each of the numerical values in that series of numerical values. For example, less than or equal to 3, 2, or 1 is equivalent to less than or equal to 3, less than or equal to 2, or less than or equal to 1.

[0045] In some examples, provided herein are methods and systems for imaging, including fluorescent and autofluorescent imaging. In some cases, an organ or tissue of a subject may emit a light which can be detected using the systems described herein. Such light may illuminate a surgical scene for a surgical operator (e.g., a surgeon or assistant). In some cases, this may increase the precision of the surgical operator’s work/surgery. For example, this may reduce a chance of injury to an organ undergoing a surgery, such as by making the organs, tissues, and body parts easier for the surgical operator to see. In one example, the methods and systems may be used during surgery on a particular organ. The organ may be an internal organ of a mammal, animal, or human. The organ may be a part of the digestive system, cardiovascular system, nervous system, or any other part of the body. In some cases, the methods may be applied to the digestive system.

[0046] In some cases, the methods of the present disclosure may be used to illuminate and detect a light from a subject’s urinary tract, urethra, bladder, ureter, kidneys, and/or urine. In some cases, such an organ (e.g., a ureter of a subject) may emit a light (e.g., fluorescent light) due to the presence of a chemical, nutrient, hormone, or vitamin in the organ (e.g., in the ureter and/or in the urine inside the ureter). In some cases, the method may comprise administering vitamin B (e.g., vitamin B2) to a subject (e.g., before the surgery) to cause/generate fluorescence emission in the urine/ureter during the surgery, and illuminating, visualizing, and/or detecting such fluorescent light using the systems of the present disclosure, for example to assist with a surgical operation and increase the safety, efficacy, and speed thereof.

[0047] The methods and systems may be used during any suitable surgery. A few examples of surgeries which can benefit from the methods and systems of the present disclosure include a hysterectomy (e.g., a hysterectomy in which the urinary tract has, or is suspected of having, damage, an issue, or an imperfection), a surgery to remove or address a tumor in a body part/organ (e.g., in the bladder, urinary tract, kidneys, and/or one or both ureters), and a surgery in a pelvis of a subject (e.g., in the lower pelvis area, in some cases in an area surrounding or near the ureter(s)). The methods and systems may help prevent unintended damage or wounds during the surgery by enhancing visibility into the surgical scene. In some cases, a careful inspection of the area is needed by the surgical operator or surgeon so as not to cause damage. This process may in some cases be time consuming. For example, it may delay the surgical operator for a duration of time (e.g., at least 10, 20, 30, 40, 50, 60, 70, 80 minutes or longer).
Using the methods and systems of the present disclosure may reduce this duration of time by increasing the efficiency and speed of the operation/process. For example, a substance (in some cases, vitamin B (e.g., vitamin B2)) may be administered to a subject. This may cause the ureter of the subject to emit fluorescent light. The emitted fluorescent light may be detected using the methods and systems of the present disclosure. This may enhance the surgeon’s visibility during the operation and reduce the chances of the operator making a mistake and/or causing damage to the ureter and/or its surrounding areas. This may also increase the convenience, efficiency, and speed of the operation and reduce the amount of time that the operator may need to spend in performing the operation. In some cases, vitamin B may comprise or be vitamin B2.

[0048] Various substances may be suitable for causing an organ to emit fluorescent light. Examples include a urine-targeted fluorescent dye, indocyanine green (ICG) dye, and other dyes. The dyes may be delivered to the organ or surgical tissue in a variety of ways, including injection, stents, or oral ingestion. In some cases, artificial/non-natural fluorescent dyes may be avoided. In some cases, injections or stents may be avoided. In some cases, the substance causing fluorescence may be natural. In some cases, the substance causing fluorescence in the organ may be delivered via oral ingestion. In an example, the substance causing fluorescence in the organ (e.g., ureter) may comprise or be vitamin B, and it may be administered to the subject via oral ingestion (e.g., prior to the surgery). In some cases, vitamin B may comprise or be vitamin B2.

[0049] This method may be safe, non-invasive, and effective. Vitamin B may be administered to the subject at any suitable concentration/dose depending on the intended application, intended fluorescent intensity, and the imaging conditions used. In some cases, the concentration of vitamin B may be at least 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 30, 40, 50, 60, 70, 80, 90, 100 mg or more, dissolved in water. In some cases, the concentration of vitamin B may be at most about 100, 90, 80, 70, 60, 50, 40, 30, 20, 10, 9, 8, 7, 6, 5, 4, 3, 2, 1 mg or less. In some cases, the concentration of vitamin B administered to the subject may be from 0.1 to 100 mg. In an example, the concentration of vitamin B may be 10 mg. Other suitable additives may be added to the solution. A few examples of the concentrations and imaging setups to be used are provided in EXAMPLE 1 to EXAMPLE 4. In some cases, vitamin B may comprise or be vitamin B2.

[0050] The methods and systems presented herein address a significant unmet need in the art as related to surgical operations and medical devices for the same. As an example, iatrogenic ureteral injury is a potentially devastating complication of abdominal and pelvic surgery. The overall incidence of significant ureteral injury in gynecologic and colorectal surgeries has been estimated to be as high as 1.7% and 1.9%, respectively. Risk of injury is higher in patients with a complex history of radiation, prior surgery, inflammatory conditions, or a mass that may obscure anatomic planes. In such cases ureteral identification is more challenging. Following ureteral injury repair, the postoperative course can be prolonged due to the common need for urinary diversion in the form of ureteral stents, nephrostomy tubes, or urethral catheters. Such instrumentation can cause flank pain, hematuria, irritative urinary tract symptoms, and predisposition to urinary infections, all of which can significantly impact a patient's quality of life. The methods and systems of the present disclosure may help reduce the risk of injury, making operations safer and more efficient, as described elsewhere herein.

Fluorescence Imaging

[0051] In one aspect, the present disclosure provides a system for medical imaging. The system may be configured for fluorescence imaging. In some cases, fluorescence imaging may be performed without the use of dyes, fluorescent markers, or other fiducials. In some cases, fluorescence imaging may be implemented using selective wavelengths of light that induce biological fluorescence. As used herein, biological fluorescence (also referred to as tissue fluorescence or autofluorescence in general) may comprise a natural emission of light by biological structures in response to one or more light signals incident on at least a portion of the biological structures. In some cases, the emission of light may be based on light reflection or light absorption.

[0052] In some cases, emission may comprise or be natural emission (e.g., the light may be naturally emitted by the tissue or organ without the use of a dye or fluorophore). In some cases, the presence of a chemical, nutrient, hormone, molecule, mineral, protein, enzyme, or vitamin may generate or facilitate the generation of the light and its emission. The emitted light may be detected using the methods and systems described herein. In some cases, the method may further comprise administering such chemical, nutrient, hormone, molecule, mineral, protein, enzyme, or vitamin to a subject, in order to cause one or more tissues or organs in the subject to fluoresce, shine, or otherwise emit light which can be detected and/or visualized using the methods and systems presented herein. The methods may further comprise providing, obtaining, and/or using devices (e.g., medical devices) comprising modules for illuminating a scene comprising the organ or tissue to generate, emit, and/or detect the light.

[0053] Fluorescence may comprise autofluorescence. In some cases, fluorescence may comprise or be autofluorescence. The methods and systems of the present disclosure may comprise methods and systems for visualizing and imaging autofluorescence. Any of the methods and systems may be used for detecting, visualizing, and/or imaging fluorescence and/or autofluorescence. In some cases, an organ, tissue, or body part imaged using the methods and systems provided herein may naturally emit fluorescent light (e.g., without use of a dye, a fluorophore, a substance, a nutrient, a chemical, or the like). Some tissues or organs in the body may emit detectable fluorescent light which can be detected by fluorescent imaging using any method and system presented anywhere herein. For example, in some cases, fat tissues may be autofluorescent. Autofluorescent tissues may be visualized and/or imaged using fluorescent imaging methods and systems. In some cases, the terms fluorescent and autofluorescent may be used interchangeably depending on the context.

Light Sources

[0054] In some examples, a system of the present disclosure may comprise one or more light sources (e.g., a plurality of light sources). The one or more light sources (e.g., the plurality of light sources) may comprise (i) a first light source configured to generate a first set of light signals for fluorescence imaging and (ii) a second light source configured to generate a second set of light signals for at least one of RGB imaging, laser speckle imaging, and depth imaging. In some cases, the plurality of light sources may comprise at least one of a white light source, a laser speckle light source, and a fluorescence excitation light source. In other cases, the plurality of light sources may not or need not comprise a white light source, a laser light source, or a fluorescence excitation light source.

Beams / Pulses

[0055] In any of the embodiments described herein, the one or more light sources (e.g., the plurality of light sources) may be configured to generate one or more light beams. In such cases, the plurality of light sources may be configured to operate as a continuous wave light source. A continuous wave light source may be a light source that is configured to produce a continuous, uninterrupted beam of light with a stable output power.

[0056] In some cases, the plurality of light sources may be configured to continuously emit pulses of light and/or energy at predetermined intervals. In such cases, the light sources may only be switched on for limited time intervals and may alternate between a first power state and a second power state. The first power state may be a low power state or an OFF state. The second power state may be a high power state or an ON state.

[0057] Alternatively, the plurality of light sources may be operated in a continuous wave mode, and the one or more light beams generated by the plurality of light sources may be chopped (i.e., separated or discretized) into a plurality of light pulses using a mechanical component (e.g., a physical object or shuttering mechanism) that blocks the transmission of light at predetermined intervals. The mechanical component may comprise a movable plate that is configured to obstruct an optical path of one or more light beams generated by the plurality of light sources at one or more predetermined time periods.
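The chopping arrangement described above reduces to simple timing arithmetic: a wheel with N open slots rotating at R revolutions per second produces N·R pulses per second. The sketch below illustrates this; the slot count, rotation rate, and open fraction are illustrative assumptions, not parameters from this disclosure.

```python
# Sketch of chopper timing arithmetic for discretizing a continuous-wave
# beam into pulses. All numeric values are illustrative assumptions.

def chopper_pulse_train(n_slots: int, rotation_hz: float, open_fraction: float):
    """Return (pulse_rate_hz, pulse_width_s) for a rotating chopper wheel.

    n_slots: number of open apertures in the wheel
    rotation_hz: wheel rotation frequency, revolutions per second
    open_fraction: fraction of each slot cycle during which light passes
    """
    pulse_rate_hz = n_slots * rotation_hz          # pulses per second
    slot_period_s = 1.0 / pulse_rate_hz            # duration of one slot cycle
    pulse_width_s = open_fraction * slot_period_s  # time light is unblocked
    return pulse_rate_hz, pulse_width_s

# 10 slots spinning at 100 rev/s with a 50% open fraction:
rate, width = chopper_pulse_train(n_slots=10, rotation_hz=100.0, open_fraction=0.5)
# -> 1000 pulses per second, each 0.5 ms wide
```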

Fluorescence Excitation Light Source

[0058] In some cases, the first light source may comprise a fluorescence excitation light source. The fluorescence excitation light source may be used for fluorescence imaging. As used herein, fluorescence imaging may refer to the imaging of any fluorescent materials (e.g., autofluorescing biological materials such as tissues or organs) or fluorescing materials (e.g., dyes comprising a fluorescent substance like fluorescein, coumarin, cyanine, rhodamine, or any chemical analog or derivative thereof). Fluorescing materials may comprise fluorescing tissues, organs, or body parts. Fluorescing materials may comprise autofluorescent materials, tissues, and/or body parts. Fluorescence may be generated in an imaged organ naturally or as a result of injection, ingestion, or other uptake or administration of a substance to a subject whose organ is being visualized/imaged. In some cases, a substance causing fluorescence in the organ or tissue may be a vitamin. In an example, a substance causing or leading to fluorescence in the tissue or organ may comprise or be vitamin B2. The fluorescence excitation light source may be configured to generate a fluorescence excitation light beam. The fluorescence excitation light beam may cause a tissue or a fluorescent dye (e.g., indocyanine green) to fluoresce (i.e., emit light). The fluorescence excitation light beam may have a wavelength that ranges from about 200 nanometers (nm) to about 500 nanometers (nm). In some examples, the wavelength may be at least about 200 nm or about 250 nm. The fluorescence excitation light beam may be emitted onto a target tissue or biological material with native fluorescence properties. In some cases, the target region may emit fluorescent light signals with a wavelength that is greater than about 500 nanometers (nm).
[0059] In some cases, the fluorescence excitation light source may be configured to generate blue light having a wavelength of about 470 nm or about 445 nm to excite fluorescing/autofluorescing tissue, which can then emit light having wavelengths of at least about 500 nm. In some cases, a filter may be used to block the 470 nm light or 445 nm light from the camera so that the fluorescence imaging sensor only sees the 500+ nm light emitted by the fluorescing/autofluorescing tissue.

[0060] In some cases, the fluorescence excitation light source may be configured to generate one or more light pulses, light beams, or light waves. In some cases, the one or more light pulses, light beams, or light waves may be used for tissue fluorescence imaging and/or other types of imaging (e.g., RGB imaging, laser speckle imaging, or time of flight (TOF) imaging).

[0061] In some embodiments, the fluorescence excitation light source may be used to generate a plurality of fluorescence excitation light pulses. In such cases, the fluorescence excitation light source may be pulsed (i.e., switched ON and OFF at one or more predetermined intervals). In some cases, such pulsing may be synced to an opening and/or a closing of one or more camera shutters for synchronized fluorescence imaging.

[0062] In some embodiments, the fluorescence excitation light source may be used to generate a continuous light beam. In some cases, the fluorescence excitation light source may be continuously ON, and a property of the fluorescence excitation light may be modulated. For example, the continuous light beam may undergo an amplitude modulation. The amplitude modulated light beam may be used to obtain one or more measurements based on a phase difference between the fluorescence excitation light and the fluorescence light emitted and received from the tissue. The fluorescence measurements may be computed based at least in part on a phase shift observed between the fluorescence excitation light directed to the target region and the fluorescence light emitted and received from the target region. In other cases, when the fluorescence excitation light source is used to generate a continuous light beam, one or more movable mechanisms (e.g., an optical chopper or a physical shuttering mechanism such as an electromechanical shutter or gate) may be used to generate a series of fluorescence excitation pulses from the continuous light beam. The plurality of fluorescence excitation light pulses may be generated by using a movement of the electromechanical shutter or gate to chop, split, or discretize the continuous light beam into the plurality of fluorescence excitation light pulses. One advantage of having the fluorescence excitation light beam continuously on is that there are no delays in ramp-up and/or ramp-down (i.e., no delays associated with powering the beam on and off).
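The phase-based measurement described above follows the standard frequency-domain fluorometry relation tan(φ) = 2πf·τ, which links the modulation frequency f, the measured phase shift φ between excitation and emission, and the emission lifetime τ. The sketch below is illustrative only; the 20 MHz modulation frequency and 5 ns lifetime are assumed example values, not parameters from this disclosure.

```python
import math

# Frequency-domain relation between the phase shift of amplitude-modulated
# excitation light and the fluorescence lifetime: tan(phi) = 2*pi*f*tau.
# All numeric values below are illustrative assumptions.

def phase_from_lifetime(lifetime_s: float, mod_freq_hz: float) -> float:
    """Phase shift (radians) expected for a given emission lifetime."""
    omega = 2.0 * math.pi * mod_freq_hz
    return math.atan(omega * lifetime_s)

def lifetime_from_phase(phase_rad: float, mod_freq_hz: float) -> float:
    """Inverse relation: estimate the lifetime from a measured phase shift."""
    omega = 2.0 * math.pi * mod_freq_hz
    return math.tan(phase_rad) / omega

# An assumed 5 ns lifetime probed at 20 MHz modulation gives a measurable
# phase shift, and the phase can be inverted to recover the lifetime.
phi = phase_from_lifetime(5e-9, 20e6)
tau = lifetime_from_phase(phi, 20e6)
```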

[0063] In some embodiments, the fluorescence excitation light source may be located remote from a scope and operatively coupled to the scope via a light guide. For example, the fluorescence excitation light source may be located on or attached to a surgical tower. In other embodiments, the fluorescence excitation light source may be located on the scope and configured to provide the fluorescence excitation light to the scope via a scope-integrated light guide. The scope-integrated light guide may comprise a light guide that is attached to or integrated with a structural component of the scope. The light guide may comprise a thin filament of a transparent material, such as glass or plastic, which is capable of transmitting light signals through successive internal reflections. Alternatively, the fluorescence excitation light source may be configured to provide the fluorescence excitation light to the target region via one or more secondary illuminating scopes. In such cases, the system may comprise a primary scope that is configured to receive and direct light generated by other light sources (e.g., a white light source, a laser speckle light source, and/or a TOF light source). The one or more secondary illuminating scopes may be different than the primary scope. The one or more secondary illuminating scopes may comprise a scope that is separately controllable or movable by a medical operator or a robotic surgical system. The one or more secondary illuminating scopes may be provided in a first set of positions or orientations that is different than a second set of positions or orientations in which the primary scope is provided. In some cases, the fluorescence excitation light source may be located at a tip of the scope. In other cases, the fluorescence excitation light source may be attached to a portion of the surgical subject’s body.
The portion of the surgical subject’s body may be proximal to the target region being imaged using the medical imaging systems of the present disclosure. In any of the embodiments described herein, the fluorescence excitation light source may be configured to illuminate the target region through a rod lens. The rod lens may comprise a cylindrical lens configured to enable beam collimation, focusing, and/or imaging. In some cases, the fluorescence excitation light source may be configured to illuminate the target region through a series or a combination of lenses (e.g., a series of relay lenses).

[0064] In some embodiments, the system may comprise a fluorescence excitation light source configured to transmit the first set of light signals to the surgical scene. The fluorescence excitation light source may be configured to generate and transmit one or more fluorescence excitation light pulses to the surgical scene. In some cases, the fluorescence excitation light source may be configured to provide a spatially varying illumination to the surgical scene. In some cases, the fluorescence excitation light source may be configured to provide a temporally varying illumination to the surgical scene. In some cases, the timing of the opening and/or closing of one or more shutters associated with one or more imaging units may be adjusted based on the spatial and/or temporal variation of the illumination. In some cases, the image acquisition parameters for the one or more imaging units may be tuned based on the surgical application (e.g., type of surgical procedure), a scope type, or a cable length. In some cases, the fluorescence measurement and acquisition scheme may be tuned based on a distance between the surgical scene and one or more components of the fluorescence imaging systems disclosed herein.

[0065] In some cases, the fluorescence excitation light source may be configured to adjust an intensity of the first set of light signals. In some cases, the fluorescence excitation light source may be configured to adjust a timing at which the first set of light signals is transmitted. In some cases, the fluorescence excitation light source may be configured to adjust an amount of light directed to one or more regions in the surgical scene. In some cases, the fluorescence excitation light source may be configured to adjust one or more properties of the first set of light signals based on a type of surgical procedure, a type of tissue in the surgical scene, a type of scope through which the light signals are transmitted, or a length of a cable used to transmit the light signals from the fluorescence excitation light source to a scope. The one or more properties may comprise, for example, a pulse width, a pulse repetition frequency, or an intensity.

[0066] In some cases, the fluorescence excitation light source may be configured to generate a plurality of light pulses, light beams, or light waves for fluorescence imaging. In some cases, the fluorescence excitation light source may be configured to generate light pulses, light beams, or light waves having multiple different wavelengths or ranges of wavelengths.

[0067] In some examples, vitamin B (e.g., administered to a subject) may allow for fluorescent light to be detected from an organ of the subject (e.g., ureter). In some cases, the light emitted by vitamin B may be detected by exposing the organ/surgical scene comprising the organ (e.g., ureter) to a green, fluorescent light (e.g., a wavelength of 400-550 nanometers (nm)). Further, in some cases, the surgical scene may also be exposed to RGB light. In some examples, the RGB light and the green, fluorescent light may be overlaid (e.g., directly overlaid) and their beams may be combined (e.g., into a single fiber of a device of the present disclosure) before exposing the surgical scene comprising the organ. In some cases, vitamin B may comprise or be vitamin B2.

[0068] In some cases, light detected from the organ/surgical scene according to the methods presented herein may comprise blue, fluorescent light (e.g., a wavelength of 400-500 nanometers (nm)). In some cases, the blue light may be bright (e.g., dominantly bright such that it might outshine other bandwidths of light) and it may need to be filtered/normalized. The systems of the present disclosure may comprise light filters which may allow for filtering the blue light and/or reducing noise. The filter may comprise or be a noise reduction and/or absorption filter. Any suitable filter may be used. In some examples, the filter may comprise a 460 nm long-pass filter. The filter may comprise an optical density (OD) of at least 4, 5, 6, 7, 8, or greater. In some cases, a percentage of the blue light passes the filter, and the rest is absorbed by the filter. For example, at least about 10%, 20%, 30%, 40%, 50%, 60%, 70%, or a greater portion of the blue light may pass through the filter, and the rest may be absorbed.
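The optical density figures above translate into transmitted fractions via the standard relation T = 10^(−OD). The sketch below applies this relation to the OD values discussed in this disclosure (equal to or greater than 4, in some cases about 7); note that OD characterizes the filter's blocking band, not the pass band through which the emission light travels.

```python
# Standard optical-density relation: transmitted fraction T = 10**(-OD).
# OD values mirror those discussed above; this describes attenuation in
# the filter's blocking band.

def transmitted_fraction(optical_density: float) -> float:
    """Fraction of incident light that passes a filter of the given OD."""
    return 10.0 ** (-optical_density)

t_od4 = transmitted_fraction(4.0)  # 1e-4, i.e. 0.01% of the blocked light leaks through
t_od7 = transmitted_fraction(7.0)  # 1e-7
```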

[0069] In some examples, an organ or tissue may generate fluorescence due to the presence of vitamin B (e.g., as a result of administration of vitamin B to the subject). The organ or tissue may be visualized and/or imaged during a surgery using the methods and systems described anywhere herein. Imaging and/or visualization may be performed through one or more layers of tissue and/or animal fat. In some cases, a depth of at least about 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 15, 20, 30 mm or more may be visible and capable of being captured. In some cases, vitamin B may comprise or be vitamin B2.

[0070] Vitamin B may have a quantum yield of at least about 5%, 10%, 15%, 20%, 25%, 30%, 32%, 35%, 37%, 37.5%, 40%, 50%, or more. In some cases, vitamin B may have a quantum yield of 5% to 50%. In some cases, vitamin B may have a quantum yield of at most about 70%, 60%, 50%, 40%, 35%, 30%, 25%, 20%, 10%, or less. In an example, vitamin B had a quantum yield of about 37.5%. In an example, the quantum yield of intraureteral indocyanine green (ICG) stents was measured and compared to that of vitamin B. The ICG stents had a quantum yield of about 2% to 4%. In an example, an ICG stent used had a quantum yield of 2.7%. In an example, vitamin B had a quantum yield about 14 times greater than that of the ICG stents. Additionally, administering vitamin B to an animal is non-surgical and non-invasive. It may be a safe, non-invasive, and effective method for visualizing and imaging an organ of the subject by generating fluorescence therein. In some cases, vitamin B may comprise or be vitamin B2.
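As a quick arithmetic check on the comparison above, dividing the stated yields (about 37.5% for vitamin B2 and about 2.7% for the ICG stent) reproduces the "about 14 times greater" figure:

```python
# Ratio of the quantum yields reported in this disclosure: about 37.5%
# for vitamin B2 versus about 2.7% for an intraureteral ICG stent.
b2_quantum_yield = 0.375
icg_quantum_yield = 0.027
ratio = b2_quantum_yield / icg_quantum_yield
# ratio is about 13.9, consistent with "about 14 times greater"
```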

Fluorescence Excitation Light Modulator

[0071] In some embodiments, the system may comprise a light modulator. The light modulator may be configured to adjust one or more properties (e.g., illumination intensity, direction of propagation, travel path, etc.) of the fluorescence excitation light generated using the fluorescence excitation light source. In some cases, the light modulator may comprise a diverging lens that is positioned along a light path of the fluorescence excitation light. The diverging lens may be configured to modulate an illumination intensity of the fluorescence excitation light across the target region. In other cases, the light modulator may comprise a light diffusing element that is positioned along a light path of the fluorescence excitation light. The light diffusing element may likewise be configured to modulate an illumination intensity of the fluorescence excitation light across the target region. Alternatively, the light modulator may comprise a beam steering element configured to illuminate the target region and one or more regions proximal to the target region. The beam steering element may be used to illuminate a greater proportion of a scene comprising the target region with the fluorescence excitation light. In some cases, the beam steering element may comprise a lens or a mirror (e.g., a fast-steering mirror).

Fluorescence Excitation Parameter Optimizer

[0072] In some embodiments, the system may further comprise a parameter optimizer configured to adjust one or more pulse parameters and one or more camera parameters, based at least in part on a desired application, tissue type, scope type, or procedure type. The one or more fluorescence measurements obtained using the fluorescence imaging sensor may be based at least in part on the one or more pulse parameters and the one or more camera parameters. For example, the parameter optimizer may be used to implement a first set of pulse parameters and camera parameters for a first procedure, and to implement a second set of pulse parameters and camera parameters for a second procedure. The parameter optimizer may be configured to adjust the one or more pulse parameters and/or the one or more camera parameters to improve a resolution, accuracy, or tolerance of fluorescence sensing, and to increase the signal to noise ratio for fluorescence applications. In some cases, the parameter optimizer may be configured to determine the actual or expected performance characteristics of the fluorescence sensing system based on a desired selection or adjustment of one or more pulse parameters or camera parameters. Alternatively, the parameter optimizer may be configured to determine a set of pulse parameters and camera parameters required to achieve a desired resolution, accuracy, or tolerance for a desired biological material or tissue type or surgical operation.

[0073] In some cases, the parameter optimizer may be configured to adjust the one or more pulse parameters and the one or more camera parameters in real time. In other cases, the parameter optimizer may be configured to adjust the one or more pulse parameters and the one or more camera parameters offline. In some cases, the parameter optimizer may be configured to adjust the one or more pulse parameters and/or camera parameters based on a feedback loop. The feedback loop may be implemented using a controller (e.g., a programmable logic controller, a proportional controller, a proportional integral controller, a proportional derivative controller, a proportional integral derivative controller, or a fuzzy logic controller). In some cases, the feedback loop may comprise a real-time control loop that is configured to adjust the one or more pulse parameters and/or the one or more camera parameters based on a temperature of the fluorescence excitation light source or the fluorescence imaging sensor. In some embodiments, the system may comprise an image post processing unit configured to update the fluorescence measurements based on an updated set of fluorescence measurements obtained using the one or more adjusted pulse parameters or camera parameters.
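As one illustration of the temperature-based feedback loop described above, a proportional controller might derate the excitation intensity as the light source warms past a setpoint. The function name, gain, and setpoint values below are assumptions for this sketch, not values from the disclosure:

```python
# Minimal sketch of a proportional feedback loop that derates pulse
# intensity when the excitation source runs hot. The gain (kp) and
# setpoint are illustrative assumptions.

def adjust_intensity(current_intensity, temp_c, setpoint_c=45.0, kp=0.02):
    """Proportional control: lower the commanded intensity in proportion
    to the temperature error above the setpoint, clamped to [0, 1]."""
    error = temp_c - setpoint_c
    new_intensity = current_intensity - kp * error
    return max(0.0, min(1.0, new_intensity))

# Running 10 degrees above the setpoint lowers the commanded intensity:
print(adjust_intensity(0.9, temp_c=55.0))  # 0.9 - 0.02 * 10 = 0.7
```

In a real-time control loop, this update would run each frame against the measured temperature of the excitation source or imaging sensor, as described above.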

[0074] The parameter optimizer may be configured to adjust one or more pulse parameters. The one or more pulse parameters may comprise, for example, an illumination intensity, a pulse width, a pulse shape, a pulse count, a pulse on/off level, a pulse duty cycle, a fluorescence excitation light pulse wavelength, a light pulse rise time, and a light pulse fall time. The illumination intensity may correspond to an amount of power needed to provide a detectable light signal during a procedure. The pulse width may correspond to a duration of the pulses. The system may require a fluorescence excitation pulse of some minimal or maximal duration to guarantee a certain acceptable resolution. The pulse shape may correspond to a phase, an amplitude, or a period of the pulses. The pulse count may correspond to a number of pulses provided within a predetermined time period. Each of the pulses may have at least a predetermined amount of power (in Watts) in order to enable fluorescence measurements with reduced noise. The pulse on/off level may correspond to a pulse duty cycle. The pulse duty cycle may be a function of the ratio of pulse duration or pulse width (PW) to the total period (T) of the pulse waveform. The fluorescence excitation pulse wavelength may correspond to a wavelength of the fluorescence excitation light from which the fluorescence excitation light pulse is derived. The fluorescence excitation pulse wavelength may be predetermined or adjusted accordingly for each desired fluorescence imaging application. The pulse rise time may correspond to an amount of time for the amplitude of a pulse to rise to a desired or predetermined peak pulse amplitude. The pulse fall time may correspond to an amount of time for the peak pulse amplitude to fall to a desired or predetermined threshold value. The pulse rise time and/or the pulse fall time may be modulated to meet a certain threshold value.
In some cases, the fluorescence excitation light source may be pulsed from a lower power mode (e.g., 50%) to a higher power mode (e.g., 90%) to minimize rise time. In some cases, a movable plate or other mechanical object (e.g., a shutter) may be used to chop a continuous fluorescence excitation light beam into a plurality of fluorescence excitation light pulses, which can also minimize or reduce pulse rise time.
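The duty cycle relationship noted above, the ratio of pulse width (PW) to total period (T), can be expressed directly. The microsecond units and bounds checking here are illustrative assumptions:

```python
# Pulse duty cycle as described above: the ratio of pulse width (PW)
# to the total period (T) of the pulse waveform.

def duty_cycle(pulse_width_us, period_us):
    """Return the duty cycle as a fraction of the waveform period."""
    if period_us <= 0 or not (0 <= pulse_width_us <= period_us):
        raise ValueError("pulse width must lie within a positive period")
    return pulse_width_us / period_us

# A 2 microsecond pulse repeated every 10 microseconds is a 20% duty cycle:
print(duty_cycle(2.0, 10.0))  # 0.2
```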

[0075] The parameter optimizer may be configured to adjust one or more camera parameters. The camera parameters may include, for example, a number of shutters, shutter timing, shutter overlap, shutter spacing, and shutter duration. As used herein, a shutter may refer to a physical shutter and/or an electronic shutter. A physical shutter may comprise a movement of a shuttering mechanism (e.g., a leaf shutter or a focal-plane shutter of an imaging device or imaging sensor) in order to control exposure of light to the imaging device or imaging sensor. An electronic shutter may comprise turning one or more pixels of an imaging device or imaging sensor ON and/or OFF to control exposure. The number of shutters may correspond to a number of times in a predetermined time period during which the fluorescence imaging sensor or camera is shuttered open to receive fluorescence light pulses emitted from the target region. In some cases, two or more shutters may be used for a fluorescence excitation light pulse. Temporally spaced shutters can be used to determine or detect one or more features in the target region. In some cases, a first shutter may be used for a first pulse (e.g., an outgoing pulse), and a second shutter may be used for a second pulse (e.g., an incoming pulse). Shutter timing may correspond to a timing of shutter opening and/or shutter closing based on a timing of when a pulse is transmitted and/or received. The opening and/or closing of the shutters may be adjusted to capture one or more fluorescence pulses. In some cases, the shutter timing may be adjusted based on a path length of the fluorescence pulses or a target region of interest. 
Shutter timing modulation may be implemented to minimize the duty cycle of fluorescence excitation light source pulsing and/or camera shutter opening and closing, which can enhance the operating conditions of the fluorescence excitation light source and improve hardware longevity (e.g., by limiting or controlling the operating temperature). Shutter overlap may correspond to a temporal overlap of two or more shutters. Shutter overlap may increase peak Rx power at short pulse widths where peak power is not immediately attained. Shutter spacing may correspond to the temporal spacing or time gaps between two or more shutters. Shutter spacing may be adjusted to time the camera shutters for fluorescence imaging to receive the beginning and/or the end of the pulse. Shutter spacing may be optimized to increase the accuracy of fluorescence measurements at decreased Rx power. Shutter duration may correspond to a length of time during which the fluorescence imaging sensor or camera is shuttered open to receive fluorescence light signals emitted from the target region. Shutter duration may be modulated to minimize noise associated with a received fluorescence light signal, and to ensure that the fluorescence imaging sensor or camera receives a minimum amount of light needed for fluorescence imaging applications.
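Shutter overlap and shutter spacing, as defined above, reduce to simple interval arithmetic on the open and close times of two shutters. The helper below is a hypothetical sketch (names and microsecond units are assumptions, not from the disclosure):

```python
# Illustrative computation of the temporal relationship of two shutters,
# each given as an (open_time, close_time) pair in microseconds.

def shutter_relation(a, b):
    """Return ('overlap', amount) if the two shutter windows overlap in
    time, else ('spacing', gap) with the time gap between them."""
    (a_open, a_close), (b_open, b_close) = sorted([a, b])
    if b_open < a_close:
        return ("overlap", min(a_close, b_close) - b_open)
    return ("spacing", b_open - a_close)

print(shutter_relation((0.0, 5.0), (3.0, 8.0)))   # ('overlap', 2.0)
print(shutter_relation((0.0, 5.0), (7.0, 12.0)))  # ('spacing', 2.0)
```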

[0076] In some cases, hardware may be interchanged or adjusted in addition to or in lieu of software-based changes to pulse parameters and camera parameters, in order to achieve the desired fluorescence imaging capabilities for a particular application or type of tissue or biological material.

TOF Light Source

[0077] In some cases, the second light source may comprise a time of flight (TOF) light source. The TOF light source may comprise a laser or a light emitting diode (LED). The laser or the light emitting diode (LED) may be configured to generate a TOF light for depth measurements. The TOF light may comprise an infrared or near infrared light having a wavelength from about 700 nanometers (nm) to about 1 millimeter (mm). In some cases, the TOF light may comprise visible light having a wavelength from about 400 nm to about 700 nm. In some cases, the visible light may comprise blue light having a wavelength from about 400 nm to about 500 nm. Advantages of visible light for TOF applications include low penetration of tissue surfaces, which can improve the reliability and accuracy of TOF measurements. In contrast, IR light, which penetrates tissue surfaces deeper and induces multi-reflections, can (a) cause ambiguity as to which internal surface or subsurface is reflecting the IR light, and (b) introduce errors in any TOF measurements obtained. In some cases, the TOF light may comprise a plurality of light beams and/or light pulses having a plurality of wavelengths from about 400 nm to about 1 mm.

[0078] The terms “time of flight,” “time-of-flight,” “ToF,” or “TOF,” as used interchangeably herein, may generally refer to one or more measurements of a time taken by an object, a particle, or a wave to travel a distance through a medium (e.g., fluid, such as a liquid or gas). Examples of the wave may include acoustic waves and electromagnetic radiation. The time measurement(s) may be used to establish a velocity and/or a path length of the object, particle, or wave. In some cases, time of flight may refer to the time required for emitted electromagnetic radiation to travel from a source of the electromagnetic radiation to a sensor (e.g., a camera). In some cases, a time-of-flight measurement may correspond to the time required for the emitted electromagnetic radiation to reach a target tissue. Alternatively, a time-of-flight measurement may correspond to the time required for the emitted electromagnetic radiation to reach a target tissue and to be directed or re-directed (e.g., reflected) to a sensor. Such sensor, which may comprise a TOF sensor, may be adjacent to the source of the emitted electromagnetic radiation, or may be at a different location than the source. In some cases, a camera or an imaging sensor may be used to determine a time of flight based on a phase shift between an emitted and a received signal (e.g., electromagnetic radiation). Examples of time-of-flight cameras may include, but are not limited to, radio frequency (RF)-modulated light sources with phase detectors (e.g., Photonic Mixer Devices (PMD), Swiss Ranger™, CanestaVision™), range gated imagers (e.g., ZCam™), and/or direct time-of-flight imagers (e.g., light detection and ranging (LIDAR)).
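For the RF-modulated, phase-detecting TOF cameras mentioned above, the measured phase shift of the modulated light maps to distance as d = c·Δφ / (4π·f_mod), where f_mod is the modulation frequency. A minimal sketch (the 20 MHz modulation frequency in the example is an illustrative assumption):

```python
import math

# Depth from phase shift for an RF-modulated TOF camera: the phase shift
# between emitted and received light encodes the round-trip travel time,
# giving distance d = c * delta_phi / (4 * pi * f_mod).

C = 299_792_458.0  # speed of light, m/s

def depth_from_phase(delta_phi_rad, f_mod_hz):
    """Distance implied by a phase shift at the given modulation frequency."""
    return C * delta_phi_rad / (4 * math.pi * f_mod_hz)

# A pi/2 phase shift at 20 MHz modulation corresponds to about 1.87 m:
print(round(depth_from_phase(math.pi / 2, 20e6), 2))  # 1.87
```

Note that the phase wraps every 2π, so a single modulation frequency has a limited unambiguous range (c / (2·f_mod), about 7.5 m at 20 MHz).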

White Light Source

[0079] In some cases, the second light source may comprise a white light source. The white light source may be configured to generate one or more light beams or light pulses having one or more wavelengths that lie within the visible spectrum. The white light source may comprise a lamp (e.g., an incandescent lamp, a fluorescent lamp, a compact fluorescent lamp, a halogen lamp, a metal halide lamp, a fluorescent tube, a neon lamp, a high intensity discharge lamp, or a low-pressure sodium lamp), a light bulb (e.g., an incandescent light bulb, a fluorescent light bulb, a compact fluorescent light bulb, or a halogen light bulb), and/or a light emitting diode (LED). The white light source may be configured to generate a white light beam. The white light beam may be a polychromatic emission of light comprising one or more wavelengths of visible light. The one or more wavelengths of light may correspond to a visible spectrum of light. The one or more wavelengths of light may have a wavelength between about 400 nanometers (nm) and about 700 nanometers (nm). In some cases, the white light beam may be used to generate an RGB image of a target region.

Laser Speckle Light Source

[0080] In some cases, the second light source may comprise a laser speckle light source. The laser speckle light source may comprise one or more laser light sources. The laser speckle light source may comprise one or more light emitting diodes (LEDs) or laser light sources configured to generate one or more laser light beams with a wavelength between about 700 nanometers (nm) and about 1 millimeter (mm). In some cases, the one or more laser light sources may comprise two or more laser light sources that are configured to generate two or more laser light beams having different wavelengths. The two or more laser light beams may have a wavelength between about 700 nanometers (nm) and about 1 millimeter (mm). The laser speckle light source may comprise an infrared (IR) laser, a near-infrared laser, a short-wavelength infrared laser, a mid-wavelength infrared laser, a long-wavelength infrared laser, and/or a far-infrared laser. The laser speckle light source may be configured to generate one or more light beams or light pulses having one or more wavelengths that lie within the invisible spectrum. The laser speckle light source may be used for laser speckle imaging of a target region.

Imaging Devices

[0081] The system may further comprise one or more imaging devices for generating one or more images of the surgical scene based on a third set of light signals reflected, emitted, or received from the surgical scene. The third set of light signals may correspond to at least one of the first set of light signals and the second set of light signals. The imaging devices may comprise any imaging device configured to generate one or more medical images using light beams or light pulses transmitted to and reflected or emitted from a surgical scene. For example, the imaging devices may comprise a camera, a video camera, an imaging sensor for fluorescence or autofluorescence imaging, an infrared imaging sensor, an imaging sensor for laser speckle imaging, a charge coupled device (CCD) image sensor, a complementary metal oxide semiconductor (CMOS) image sensor, a depth camera, a three-dimensional (3D) depth camera, a stereo camera, a Red Green Blue Depth (RGB-D) camera, a time-of-flight (TOF) camera, and/or an infrared camera.

Fluorescence Imaging Sensor

[0082] The system may comprise a fluorescence imaging sensor configured to receive at least a portion of the plurality of light beams or light pulses that are reflected and/or emitted from the surgical scene. The portion may comprise one or more fluorescence signals emitted from the surgical scene. The fluorescence imaging sensor may be configured to obtain one or more fluorescence measurements associated with the fluorescence signals natively emitted from a biological material or tissue in the surgical scene.

[0083] In some cases, the fluorescence imaging sensor may be positioned along a common beam path of a plurality of light beams or light pulses reflected from and/or emitted from the surgical scene. The common beam path may be disposed between the surgical scene and an optical element used to split the plurality of light beams or light pulses into different subsets of light signals. In some cases, the plurality of light beams or light pulses reflected from the surgical scene may be split into (i) a first subset of light signals corresponding to the fluorescence emission light and (ii) a second subset of light signals corresponding to white light, laser speckle light, and/or TOF light. The first subset of light signals may be redirected along a beam path that is different than that of the second subset of light signals and/or the plurality of light beams or light pulses reflected from the surgical scene. In some cases, the fluorescence imaging sensor may be positioned along a discrete beam path of the first subset of light signals that is downstream of the optical element.

[0084] In some cases, the fluorescence imaging sensor may be positioned at a tip of a scope through which the plurality of light beams or light pulses are directed. In other cases, the fluorescence imaging sensor may be attached to a portion of the surgical subject’s body. The portion of the surgical subject’s body may be adjacent or proximal to the surgical scene being imaged or operated on.

Fluorescence Light Signals

[0085] In some embodiments, the first set of light signals for fluorescence imaging may be configured to excite one or more biological materials in the surgical scene, thereby causing the one or more biological materials to emit one or more fluorescence signals that are detectable by the one or more imaging devices. The one or more fluorescence signals emitted by the one or more biological materials may have a different wavelength than the first set of light signals used to excite the one or more biological materials. In some cases, the first set of light signals for fluorescence imaging may have a wavelength ranging from about 450 nanometers (nm) to about 500 nanometers (nm). In some cases, the first set of light signals for fluorescence imaging may have a wavelength of about 470 nanometers (nm). In some cases, the first set of light signals for fluorescence imaging may have a wavelength of about 445 nanometers (nm). In some cases, the one or more fluorescence signals emitted by the one or more biological materials may have a wavelength of at least about 500 nanometers (nm). In some embodiments, the first set of light signals may not cause blood in the surgical scene to fluoresce. In some embodiments, the first set of light signals may cause one or more tissue regions in the surgical scene to fluoresce.

[0086] As described above, one or more light sources may be used to illuminate a surgical scene. In some cases, the plurality of light sources may comprise (i) a first light source configured to generate a first set of light signals for fluorescence imaging and (ii) a second light source configured to generate a second set of light signals for at least one of RGB imaging, laser speckle imaging, and depth imaging.

[0087] In some embodiments, one or more imaging devices may be used to generate one or more images of the surgical scene based on a third set of light signals reflected, emitted, or received from the surgical scene. The third set of light signals may correspond to at least one of the first set of light signals and the second set of light signals. In some cases, the third set of light signals may be produced based on an interaction between one or more biological materials or tissues in the surgical scene and the first and second set of light signals. Such interaction may comprise, for example, a transmission of light, a reflection of light, or an absorption of light. In some cases, such interaction may comprise inducing or causing native fluorescence in tissues or biological materials. In some cases, the third set of light signals may comprise the one or more fluorescence signals emitted by the one or more biological materials.

[0088] In some cases, the one or more images generated using the one or more imaging devices may comprise one or more fluorescence images of the one or more biological materials. The one or more biological materials may comprise, for example, a tissue. In some cases, the one or more biological materials may comprise bile, urine, fat, connective tissue, or cauterized tissue. In any of the embodiments described herein, the one or more fluorescence images may be generated without the use of any dyes or other fluorescent markers or fiducials. Fluorescence may comprise or be autofluorescence. Fluorescence imaging may comprise or be autofluorescence imaging. A fluorescent tissue or organ may comprise or be an autofluorescent organ or tissue.

[0089] In some embodiments, the one or more images of the surgical scene may be usable to detect bile leaks from one or more bile ducts in the surgical scene during or after surgery. In some embodiments, the one or more images of the surgical scene may be usable to infer a hemoglobin density in tissue and to correct one or more laser speckle maps based on the inferred hemoglobin density.

[0090] In some embodiments, the one or more imaging devices may comprise a first imaging device for generating a first set of images based on the first set of light signals and a second imaging device for generating a second set of images based on the second set of light signals. In some embodiments, the one or more imaging devices may comprise at least one sensor configured for fluorescence imaging based on the first set of light signals. In some embodiments, the at least one sensor may comprise an RGB sensor. In some embodiments, the one or more images may be generated based on one or more fluorescence signals filtered through one or more bandpass filters on the RGB sensor. In some embodiments, the one or more bandpass filters may comprise a red bandpass filter, a blue bandpass filter, and/or a green bandpass filter.

[0091] In some embodiments, the system may further comprise an image processing unit configured to generate the one or more images of the surgical scene based on at least one of the first set of images and the second set of images. In some embodiments, the image processing unit may be configured to adjust, modify, correct, or update the second set of images based on the first set of images. In some embodiments, the image processing unit may be configured to overlay at least one image from the first set of images on at least one image from the second set of images. In some embodiments, the image processing unit may be configured to overlay at least one image from the second set of images on at least one image from the first set of images.

[0092] In some embodiments, the system may comprise an optical element disposed along a light path between (i) the one or more imaging devices and (ii) the first or second light source. The optical element may be configured to (i) direct a first subset of the light signals reflected, emitted, or received from the surgical scene to the first imaging device and (ii) direct a second subset of the light signals reflected, emitted, or received from the surgical scene to the second imaging device. In some embodiments, the optical element may comprise a lens, a mirror, or a prism. In some embodiments, the optical element may comprise a dichroic mirror or a dichroic prism. In some alternative embodiments, the optical element may comprise a trichroic mirror or a trichroic prism.

[0093] In some embodiments, the system may comprise a filter disposed along a light path between (i) the one or more imaging devices and (ii) the first or second light source. The filter may be configured to filter out one or more light signals with a wavelength below about 500 nanometers (nm). In some embodiments, the filter may be configured to filter out the first set of light signals generated by the first light source.

Combined Sensor

[0094] In some cases, a single imaging sensor may be used for multiple types of imaging (e.g., any combination of fluorescence imaging, TOF depth imaging, laser speckle imaging, and/or RGB imaging). In some cases, a single imaging sensor may be used for imaging based on multiple ranges of wavelengths, each of which may be specialized for a particular type of imaging or for imaging of a particular type of biological material or physiology.

[0095] In some cases, the fluorescence imaging sensors described herein may comprise an imaging sensor configured for fluorescence imaging and at least one of RGB imaging, laser speckle imaging, and TOF imaging. In some cases, the imaging sensor may be configured for fluorescence imaging and at least one of RGB imaging, perfusion imaging, and TOF imaging. In any of the embodiments described herein, the fluorescence imaging sensor may be configured to see and register non-fluorescent light.

[0096] In some cases, the fluorescence imaging sensor may be configured to capture fluorescence signals and laser speckle signals during alternating or different temporal slots. For example, the fluorescence imaging sensor may capture fluorescence signals at a first time instance, laser speckle signals at a second time instance, fluorescence signals at a third time instance, laser speckle signals at a fourth time instance, and so on. The fluorescence imaging sensor may be configured to capture a plurality of different types of optical signals at different times. The optical signals may comprise a fluorescence signal, a TOF depth signal, an RGB signal, and/or a laser speckle signal.
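The alternating-slot capture described above can be sketched as a round-robin assignment of imaging modalities to consecutive temporal slots. The function name and ordering are assumptions for illustration:

```python
from itertools import cycle, islice

# Sketch of time-multiplexed capture: one modality per temporal slot,
# cycling through the requested modalities in order.

def capture_schedule(modalities, n_slots):
    """Assign one imaging modality to each of n_slots temporal slots."""
    return list(islice(cycle(modalities), n_slots))

print(capture_schedule(["fluorescence", "laser_speckle"], 4))
# ['fluorescence', 'laser_speckle', 'fluorescence', 'laser_speckle']
```

The same scheme extends to more modalities (e.g., adding TOF depth or RGB slots) simply by lengthening the modality list.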

[0097] In other cases, the fluorescence imaging sensor may be configured to simultaneously capture fluorescence signals and laser speckle signals to generate one or more medical images comprising a plurality of spatial regions. The plurality of spatial regions may correspond to different imaging modalities. For example, a first spatial region of the one or more medical images may comprise a fluorescence image based on fluorescence measurements, and a second spatial region of the one or more medical images may comprise an image based on one or more of laser speckle signals, white light or RGB signals, and TOF depth measurements.

[0098] FIG. 1 schematically illustrates an example of an imaging module 110 for fluorescence imaging. The imaging module 110 may comprise a plurality of imaging units 120- 1, 120-2, 120-3, etc. The plurality of imaging units 120-1, 120-2, 120-3 may comprise one or more imaging sensors. The imaging sensors may be configured for different types of imaging (e.g., fluorescence imaging, time of flight imaging, RGB imaging, and/or laser speckle imaging). In some cases, the plurality of imaging units 120-1, 120-2, 120-3 may be integrated into the imaging module 110. In other cases, at least one of the plurality of imaging units 120-1, 120-2, 120-3 may be provided separately from the imaging module 110.

[0099] The imaging module may be configured to receive one or more signals reflected from a surgical scene 150. The one or more signals reflected from a surgical scene 150 may comprise one or more optical signals. The one or more optical signals may correspond to one or more light waves, light pulses, or light beams generated using one or more light sources. The one or more light sources may comprise one or more light sources for fluorescence imaging, TOF imaging, RGB imaging, and/or laser speckle imaging. The one or more optical signals may be generated when the one or more light waves, light pulses, or light beams generated using the plurality of light sources are transmitted to the surgical scene 150. The optical signals may be generated based on an interaction between a biological material or tissue in the surgical scene and the light waves, light pulses, or light beams generated using the plurality of light sources. Such interaction may comprise, for example, transmission of light, reflection of light, or absorption of light by the biological material or tissue. In some cases, the light waves, light pulses, or light beams generated using the plurality of light sources may cause the biological material to natively fluoresce as described elsewhere herein.
In some cases, the one or more light waves, light pulses, or light beams generated using the plurality of light sources may be transmitted to the surgical scene 150 via a scope (e.g., a laparoscope). In some cases, the optical signals from the surgical scene 150 may be transmitted back to the imaging module 110 via the scope. The optical signals (or a subset thereof) may be directed to the appropriate imaging sensor and/or the appropriate imaging units 120-1, 120-2, 120-3.

[00100] The imaging units 120-1, 120-2, 120-3 may be operatively coupled to an image processing module 140. As described in greater detail below, the image processing module 140 may be configured to generate one or more images of the surgical scene 150 based on the optical signals received at the imaging units 120-1, 120-2, 120-3. In some cases, the image processing module 140 may be provided separately from the imaging module 110. In other cases, the image processing module 140 may be integrated with or provided as a component within the imaging module 110.

[00101] FIG. 2 schematically illustrates an example of a fluorescence imaging system. The fluorescence imaging system may comprise an imaging module 210. The imaging module 210 may be operatively coupled to a scope 225. The scope 225 may be configured to receive one or more input light signals 229 from one or more light sources. The one or more input light signals 229 may be transmitted from the one or more light sources to the scope 225 via a light guide. The one or more input light signals 229 may comprise, for example, white light for RGB imaging, fluorescence excitation light for fluorescence imaging, infrared light for laser speckle imaging, and/or time of flight (TOF) light for TOF imaging. In some cases, the one or more input light signals 229 may comprise a first set of light signals for fluorescence imaging and a second set of light signals for at least one of RGB imaging, laser speckle imaging, and depth imaging. The one or more input light signals 229 may be transmitted through a portion of the scope 225 and directed to a target region 250 (e.g., a surgical scene).

[00102] In some cases, at least a portion of light signals transmitted to the target region 250 may cause a biological material or tissue in the target region 250 to natively fluoresce. One or more fluorescence signals 230 may be produced by the target region 250 in response to the transmitted light signals. The one or more fluorescence signals 230 may be received at the imaging module 210 via the scope 225. In some cases, a third set of light signals may be received at the imaging module 210 from the target region 250. The third set of light signals may correspond to at least one of the first set of light signals and the second set of light signals. The third set of light signals may comprise any optical signals that are reflected, emitted, or received from the surgical scene. In any of the embodiments described herein, the third set of light signals may comprise the one or more fluorescence signals 230 emitted from the target region 250.

[00103] The imaging module 210 may be configured to receive the third set of light signals, and to direct different subsets or portions of the third set of light signals to one or more imaging units (e.g., imaging modules 220-1, 220-2, 220-3) to enable various types of imaging based on different imaging modalities. In some cases, the one or more imaging devices may be releasably coupled to the imaging module 210. In other cases, the one or more imaging devices 220-1, 220-2, 220-3 may be integrated with the imaging module 210 or a housing or other structural component or subcomponent of the imaging module 210.

[00104] In some cases, the imaging module 210 may comprise one or more optical elements 235 for splitting the third set of light signals into the different subsets of light signals. Such splitting may occur based on a wavelength of the light signals, or a range of wavelengths associated with the light signals. The optical elements 235 may comprise, for example, a mirror, a lens, or a prism. The optical elements 235 may comprise a dichroic mirror, a trichroic mirror, a dichroic lens, a trichroic lens, a dichroic prism, and/or a trichroic prism. In some cases, the optical elements 235 may comprise a beam splitter, a prism, or a mirror. In some cases, the prism may comprise a trichroic prism assembly. In some cases, the mirror may comprise a fast-steering mirror. In some cases, one or more optical elements 235 may be placed adjacent to and/or in optical communication with each other to enable selective splitting and redirection of light signals to multiple different imaging devices or imaging sensors based on one or more properties of the light signals (e.g., wavelength, frequency, phase, intensity, etc.).

[00105] In some cases, the input light signals 229 generated by the plurality of light sources may comprise fluorescence excitation light having a wavelength that ranges from about 400 nanometers to at most about 500 nanometers, laser speckle light having a wavelength that ranges from about 800 nanometers to about 900 nanometers, and/or TOF light having a wavelength that ranges from about 800 nanometers to about 900 nanometers. In some cases, the fluorescence excitation light may have a wavelength of about 470 nanometers. In some cases, the fluorescence excitation light may have a wavelength of about 445 nanometers. In some cases, the laser speckle light may have a wavelength of about 852 nanometers. In some cases, the TOF light may have a wavelength of about 808 nanometers.
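The example wavelengths above can be collected into a simple configuration table. The following sketch is illustrative only: the center wavelengths (445/470, 852, and 808 nanometers) and band ranges come from the text, but the dictionary layout, band names, and the helper function are assumptions made for this example.

```python
# Illustrative wavelength bands for the light sources described above.
# The band limits and center wavelengths are taken from the disclosure;
# the data structure itself is an assumption for this sketch.
LIGHT_SOURCES = {
    "fluorescence_excitation": {"min_nm": 400, "max_nm": 500, "centers_nm": [445, 470]},
    "laser_speckle":           {"min_nm": 800, "max_nm": 900, "centers_nm": [852]},
    "tof":                     {"min_nm": 800, "max_nm": 900, "centers_nm": [808]},
}

def sources_for_wavelength(wavelength_nm):
    """Return the names of light sources whose band covers a given wavelength."""
    return [name for name, band in LIGHT_SOURCES.items()
            if band["min_nm"] <= wavelength_nm <= band["max_nm"]]
```

Note that the laser speckle and TOF bands overlap (both 800 to 900 nanometers), so a near-infrared wavelength such as 852 nanometers falls within both bands.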

[00106] The third set of light signals received from the target region 250 may be directed through the scope 225 to one or more optical elements 235 in the imaging module 210. In some cases, the one or more optical elements 235 may be configured to direct a first subset of the third set of light signals to an imaging unit 220-1 for fluorescence imaging. In some cases, the one or more optical elements 235 may be configured to direct a second subset of the third set of light signals to an imaging unit 220-2 for laser speckle imaging and/or TOF imaging. The first and second subsets of the third set of light signals may be separated based on a threshold wavelength. In some cases, the one or more optical elements 235 may be configured to permit a third subset of the third set of light signals to pass through to another imaging unit 220-3. The imaging unit 220-3 may comprise, for example, a camera for RGB imaging. In some cases, the imaging unit 220-3 may be a third-party imaging unit that may be coupled to the imaging module 210. In some embodiments, the imaging module 210 may comprise a filter for the first imaging unit 220-1. The filter may comprise, for example, a notch filter. The filter may be configured to block the fluorescence excitation light used to induce fluorescence so that the imaging unit 220-1 only sees the fluorescence signals produced by the biological material or tissue in the target region 250.

Time Sharing

[00107] In some embodiments, the system may further comprise a controller configured to control at least one of a gain, an exposure, a shutter timing, or a shutter size of at least one of the one or more imaging units. In some cases, the controller may be configured to control an exposure of a first imaging unit and a second imaging unit such that the first imaging unit receives the first set of light signals at a first point in time and the second imaging unit receives the second set of light signals at a second point in time. This may be referred to as time sharing among different sensors. The first set of light signals may comprise fluorescent light, which may be used by the first imaging unit for fluorescence imaging. The second set of light signals may comprise laser speckle light and/or TOF light, which may be used by the second imaging unit for laser speckle imaging and/or depth imaging. The first point in time may be different than the second point in time.
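The time-sharing behavior described above, in which the controller exposes the two imaging units at different points in time, might be sketched as a simple round-robin frame schedule. The alternating two-unit schedule, the unit names, and the slot-based model are all assumptions for illustration; the disclosure does not specify a particular scheduling scheme.

```python
# Hedged sketch of "time sharing" between two imaging units: the controller
# alternates which unit is exposed on each frame slot. The round-robin order
# and the unit names are assumptions, not the disclosed implementation.
def time_sharing_schedule(n_slots):
    """Assign alternating frame slots to the fluorescence and speckle/TOF units."""
    schedule = []
    for slot in range(n_slots):
        unit = "fluorescence_unit" if slot % 2 == 0 else "speckle_tof_unit"
        schedule.append((slot, unit))
    return schedule
```

Under this sketch, the fluorescence unit receives light at even slots and the speckle/TOF unit at odd slots, so the two units never expose at the same point in time.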

[00108] In some cases, the controller may be configured to control an exposure of an imaging device or sensor such that the imaging device or sensor receives a first subset of light signals for fluorescence imaging at a first point in time and a second subset of light signals for laser speckle imaging at a second point in time. This may be referred to as time sharing for a same sensor. In some cases, the fluorescent light signals may be associated with one or more biological materials (e.g., organs, tissue, or biological fluids such as blood). In other cases, the fluorescent light signals may be associated with one or more dyes (e.g., ICG dyes). The first point in time may be different than the second point in time.

Fluorescence excitation with RGB light pulses

[00109] In some cases, the one or more light sources may be used to generate one or more light pulses, light beams, or light waves that are usable for fluorescence imaging as well as other types of imaging (e.g., RGB imaging, laser speckle imaging, and/or TOF imaging). In some cases, one or more light pulses may be used for both RGB imaging and fluorescence imaging. In such cases, the one or more light pulses may cause one or more biological materials or tissues to natively fluoresce. In some cases, the imaging systems disclosed herein may comprise an image processing module. The image processing module may be configured to generate one or more images showing tissue fluorescence using one or more light signals emitted from natively fluorescing tissues. The one or more light signals emitted from the natively fluorescing tissues may be produced when the natively fluorescing tissues are excited using light pulses generated using a white light source, a TOF light source, and/or a laser speckle light source. The image processing module may be configured for visualization of tissue fluorescence based on one or more signals or measurements obtained using an RGB sensor, a TOF sensor, or a laser speckle imaging sensor.

Fluorescence Applications

[00110] In some embodiments, the fluorescence signals obtained using the systems and methods of the present disclosure may be used to visualize, detect, and/or monitor the movements of a biological material or a tissue in a target region being imaged. In some cases, the fluorescence measurements may be used to perform temporal tracking of perfusion characteristics or other features within a surgical scene.

[00111] In some cases, the measurements and/or the light signals obtained using one or more imaging sensors of the imaging module may be used for perfusion quantification. In some cases, the measurements and/or the light signals obtained using one or more imaging sensors may be used to generate, update, and/or refine one or more perfusion maps for the surgical scene.

[00112] In some cases, the fluorescence signals obtained using the systems and methods of the present disclosure may be used to provide a medical operator with a more accurate real-time visualization of a position or a movement of a particular point or feature within the surgical scene. In some cases, the fluorescence signals may provide a surgeon with spatial information about the surgical scene to optimally maneuver a scope, robotic camera, robotic arm, or surgical tool relative to one or more features within the surgical scene.

[00113] In some cases, the fluorescence signals obtained using the systems and methods of the present disclosure may be used to detect bile leaks from one or more bile ducts in the surgical scene during or after surgery. In some embodiments, the fluorescence signals may be used to infer a hemoglobin density in tissue and to correct one or more laser speckle maps based on the inferred hemoglobin density.

[00114] In any of the embodiments described herein, the one or more images of the surgical scene may be generated based on a quantification of fluorescent light emitted from one or more biological materials in or near the surgical scene. In some cases, the quantification of fluorescent light can be based at least in part on (i) an amount of fluorescent light emitted from the one or more biological materials and/or (ii) one or more characteristics of the illumination light used to capture the one or more images. The one or more characteristics of the illumination light may comprise, for instance, illumination intensity, illumination gradient or bias across the surgical scene, and/or a distance between the surgical scene and a light source providing the illumination light.
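A quantification of the kind described above, correcting a raw fluorescence reading for illumination intensity and source distance, might look like the following. The inverse-square distance model, the reference distance, and the linear intensity scaling are assumptions chosen for this sketch and are not the method disclosed here.

```python
# Sketch of normalizing a measured fluorescence value by illumination
# characteristics. The inverse-square distance correction and linear
# intensity scaling are illustrative assumptions only.
def normalized_fluorescence(measured, illumination_intensity, distance_mm,
                            reference_distance_mm=50.0):
    """Scale a raw fluorescence reading by illumination intensity and distance."""
    if illumination_intensity <= 0:
        raise ValueError("illumination intensity must be positive")
    distance_correction = (distance_mm / reference_distance_mm) ** 2
    return measured * distance_correction / illumination_intensity
```

In this sketch, doubling the scene-to-source distance quadruples the corrected reading, compensating for the inverse-square falloff of illumination.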

Image Processing Module

[00115] In some embodiments, the system may further comprise an image processing module operatively coupled to the first imaging unit and the second imaging unit. The image processing module may be configured to generate one or more enhanced or processed images of the surgical scene based on the first set of light signals and the second set of light signals.

[00116] In some cases, the image processing module may be configured to utilize image interpolation to account for a plurality of different frame rates and exposure times associated with the first and second imaging unit when generating the one or more images of the surgical scene. In some cases, the image processing module may be configured to quantify or visualize perfusion of a biological fluid in, near, or through the surgical scene based on the one or more images of the surgical scene. In some cases, the image processing module may be configured to generate one or more perfusion maps for one or more biological fluids in or near the surgical scene, based on the one or more images of the surgical scene. In some cases, the image processing module may be configured to update, refine, or normalize the one or more perfusion maps based on an inferred hemoglobin density that is derived from one or more fluorescence measurements or fluorescent signals obtained using fluorescence imaging. In some cases, the image processing module may be configured to update, refine, or normalize the one or more perfusion maps based on a distance between (i) a scope through which the plurality of light signals is transmitted and (ii) one or more pixels of the one or more images. In some cases, the image processing module may be configured to update, refine, or normalize the one or more perfusion maps based on a position, an orientation, or a pose of a scope through which the plurality of light signals is transmitted relative to one or more pixels of the one or more images. In some cases, the image processing module may be configured to update, refine, or normalize the one or more perfusion maps based on depth information or a depth map associated with the surgical scene. 
In some cases, the image processing module may be configured to determine a pose of a scope through which the plurality of light signals is transmitted relative to one or more pixels of the one or more images, based on depth information or a depth map. In some cases, the image processing module may be configured to update, refine, or normalize one or more velocity signals associated with the perfusion map based on the pose of the scope relative to the surgical scene. In some cases, the image processing module may be configured to update, refine, or normalize the one or more perfusion maps based on a type of tissue detected or identified within the surgical scene. In some cases, the image processing module may be configured to update, refine, or normalize the one or more perfusion maps based on an intensity of at least one of the first and second set of light signals. The intensity of the light signals may be a function of a distance between a scope through which the plurality of light signals is transmitted and one or more pixels in the surgical scene. In some cases, the image processing module may be configured to update, refine, or normalize the one or more perfusion maps based on a spatial variation of an intensity of at least one of the first and second set of light signals across the surgical scene. In some cases, the image processing module may be configured to infer a tissue type based on an intensity of one or more light signals reflected from the surgical scene, wherein the one or more reflected light signals comprise at least one of the first set of light signals and the second set of light signals. In some cases, the image processing module may be configured to use at least one of the first set of light signals and the second set of light signals to determine a time-varying motion of a biological material in or near the surgical scene.
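One of the corrections listed above, normalizing a perfusion map using a per-pixel depth map, might be sketched as follows. The inverse-square depth model, the reference depth, and the list-of-lists image representation are assumptions for illustration, not the disclosed algorithm.

```python
# Sketch of refining a perfusion map with a per-pixel depth map. Each
# perfusion value is rescaled by the squared ratio of its depth to a
# reference depth; this model is an assumption for illustration only.
def refine_perfusion_map(perfusion, depth, reference_depth=1.0):
    """Rescale each perfusion value by (depth / reference_depth) squared."""
    refined = []
    for row_p, row_d in zip(perfusion, depth):
        refined.append([p * (d / reference_depth) ** 2
                        for p, d in zip(row_p, row_d)])
    return refined
```

A pixel twice as far from the scope as the reference depth has its perfusion value boosted fourfold, compensating for the weaker signal returned from more distant tissue under this assumed model.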

[00117] In some cases, the imaging devices and/or the image processing module may be configured to (i) generate one or more fluorescence images based on the first set of light signals and/or the second set of light signals, and (ii) use the one or more fluorescence images to generate one or more machine-learning based inferences. The one or more machine-learning based inferences may comprise at least one of automatic video de-identification, image segmentation, automatic labeling of tissues or instruments in or near the surgical scene, and optimization of image data variability based on one or more normalized RGB or perfusion features. In some cases, the image processing module may be configured to (i) generate one or more fluorescence images based on at least one of the first set of light signals and the second set of light signals, and (ii) use the one or more fluorescence images to perform temporal tracking of perfusion and/or to implement speckle motion compensation.

[00118] In some cases, the image processing module may be operatively coupled to one or more 3D interfaces for viewing, assessing, or manipulating the one or more images. For example, the image processing module may be configured to provide the one or more images to the one or more 3D interfaces for viewing, assessing, or manipulating the one or more images. In some cases, the one or more 3D interfaces may comprise video goggles, a monitor, a light field display, or a projector.

[00119] In some cases, the image processing module may be configured to generate fluorescence images based at least in part on one or more fluorescence measurements obtained using the imaging devices or imaging sensors described herein. In some cases, the image processing module may be integrated with one or more imaging devices or imaging sensors. The fluorescence images may comprise an image or an image channel that contains information relating to fluorescence signals received or emitted from one or more surfaces or regions within the surgical scene. The fluorescence images may comprise fluorescence intensity or wavelength values for a plurality of points or locations within the surgical scene. The fluorescence intensity or wavelength values may be a function of and/or may correspond to a distance between (i) a tissue fluorescence imaging sensor or a tissue fluorescence imaging device and (ii) a plurality of points or locations within the surgical scene.

Overlay

[00120] In some cases, the image processing module may be configured to generate one or more image overlays comprising the one or more images generated using the image processing module. The one or more image overlays may comprise a superposition of at least a portion of a first image on at least a portion of a second image. The first image and the second image may be associated with different imaging modalities (e.g., fluorescence imaging, TOF imaging, laser speckle imaging, RGB imaging, etc.). The first image and the second image may correspond to a same or similar region or set of features of the surgical scene. Alternatively, the first image and the second image may correspond to different regions or sets of features of the surgical scene. The one or more images generated using the image processing module may comprise the first image and the second image.

[00121] In some cases, the image processing module may be configured to provide or generate an overlay of a perfusion map and a live image of a surgical scene. In some cases, the image processing module may be configured to provide or generate an overlay of a perfusion map and a pre-operative image of a surgical scene. In some cases, the image processing module may be configured to provide or generate an overlay of a pre-operative image of a surgical scene and a live image of the surgical scene, or an overlay of a live image of the surgical scene with a pre-operative image of the surgical scene. The overlay may be provided in real time as the live image of the surgical scene is being obtained during a live surgical procedure. In some cases, the overlay may comprise two or more live images or videos of the surgical scene. The two or more live images or videos may be obtained or captured using different imaging modalities (e.g., tissue fluorescence imaging, TOF imaging, RGB imaging, laser speckle imaging, etc.).
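An overlay of the kind described above, such as a perfusion map superimposed on a live image, can be sketched as a per-pixel alpha blend. The scalar alpha, equal image sizes, and list-of-lists grayscale representation are assumptions for this example; the disclosure does not specify a blending method.

```python
# Hedged sketch of an image overlay: a per-pixel alpha blend of a top layer
# (e.g., a perfusion map) over a base layer (e.g., a live RGB frame).
# The blending formula and scalar alpha are illustrative assumptions.
def overlay(base, top, alpha=0.5):
    """Blend two equally sized grayscale images: alpha*top + (1-alpha)*base."""
    return [[alpha * t + (1.0 - alpha) * b for b, t in zip(rb, rt)]
            for rb, rt in zip(base, top)]
```

Toggling the overlay on and off, as described above, would then amount to switching alpha between a nonzero value and zero.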

[00122] In some cases, the image processing module may be configured to provide augmented visualization by way of image or video overlays, or additional video data corresponding to different imaging modalities. An operator using the imaging systems and methods disclosed herein may select various types of imaging modalities or video overlays for viewing. In some examples, the imaging modalities may comprise, for example, tissue fluorescence imaging, ICG fluorescence imaging, RGB imaging, laser speckle imaging, time of flight depth imaging, or any other type of imaging using a predetermined range of wavelengths. The video overlays may comprise, in some cases, perfusion views and/or tissue fluorescence views. Such video overlays may be performed in real-time. The overlays may be performed live when a user toggles the overlay using one or more physical or graphical controls (e.g., buttons or toggles). The various types of imaging modalities and the corresponding visual overlays may be toggled on and off by the user as desired (e.g., by clicking a button or a toggle). In some cases, the image processing module may be configured to provide or generate a first processed image or video corresponding to a first imaging modality (e.g., tissue fluorescence) and a second processed video corresponding to a second imaging modality (e.g., laser speckle, TOF, RGB, etc.). The user may view the first processed video for a first portion of the surgical procedure, and switch or toggle to the second processed video for a second portion of the surgical procedure. Alternatively, the user may view an overlay comprising the first processed video and the second processed video, wherein the first and second processed video correspond to a same or similar time frame during which one or more steps of a surgical procedure are being performed. 
[00123] In some cases, the image processing module may be configured to process or pre-process medical imaging data (e.g., surgical images or surgical videos) in real-time as the medical imaging data is being captured. Such processing or pre-processing may comprise, for example, image alignment for a plurality of images obtained using different types of imaging modalities.

[00124] In some embodiments, the system may comprise a processing unit configured to (i) identify one or more critical structures in or near the surgical scene or (ii) distinguish between different critical structures in or near the surgical scene, based at least in part on the one or more images captured using the one or more imaging devices. The one or more critical structures may comprise, for example, a ureter, a bile duct, one or more blood vessels, an artery, a vein, one or more nerves, or one or more lymph nodes.

Calibration

[00125] In some embodiments, the system may further comprise a calibration module configured to calibrate (i) one or more light sources configured to provide the plurality of light signals or (ii) at least one of the first imaging unit and the second imaging unit. In some cases, the calibration module may be configured to calibrate the fluorescence imaging system by sampling multiple targets at multiple illumination wavelengths or intensities. In some cases, the calibration module may be configured to perform intrinsic calibration. Intrinsic calibration may comprise adjusting one or more intrinsic parameters associated with the first and/or second imaging units. The one or more intrinsic parameters may comprise, for example, a focal length, principal points, a distortion, and/or a field of view. In other cases, the calibration module may be configured to perform acquisition parameter calibration. Acquisition parameter calibration may comprise adjusting one or more operational parameters associated with the first and/or second imaging units. The one or more operational parameters may comprise, for example, a shutter width, an exposure, a gain, and/or a shutter timing.
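The two calibration parameter groups named above might be represented as simple data containers. The field names, units, and types below are assumptions; the disclosure lists the parameters but does not specify a schema.

```python
from dataclasses import dataclass

# Sketch of the intrinsic and acquisition calibration parameter groups
# described above. Field names and units are illustrative assumptions.
@dataclass
class IntrinsicParams:
    focal_length_mm: float
    principal_point: tuple      # (x, y) in pixels
    distortion: float
    field_of_view_deg: float

@dataclass
class AcquisitionParams:
    shutter_width_us: float
    exposure_ms: float
    gain_db: float
    shutter_timing_offset_us: float
```

A calibration routine would then adjust an instance of `IntrinsicParams` for intrinsic calibration and an instance of `AcquisitionParams` for acquisition parameter calibration, per imaging unit.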

Surgical Procedures

[00126] The systems and methods of the present disclosure may be implemented to perform fluorescence imaging for various types of surgical procedures. The surgical procedure may comprise one or more general surgical procedures, neurosurgical procedures, orthopedic procedures, and/or spinal procedures. In some cases, the one or more surgical procedures may comprise colectomy, cholecystectomy, appendectomy, hysterectomy, thyroidectomy, and/or gastrectomy. In some cases, the one or more surgical procedures may comprise hernia repair, and/or one or more suturing operations. In some cases, the one or more surgical procedures may comprise bariatric surgery, large or small intestine surgery, colon surgery, hemorrhoid surgery, and/or biopsy (e.g., liver biopsy, breast biopsy, tumor, or cancer biopsy, etc.).

[00127] FIG. 3 illustrates an exemplary method for fluorescence imaging. The method may comprise a step 310 comprising (a) generating a first set of light signals for fluorescence imaging and a second set of light signals for at least one of RGB imaging, laser speckle imaging, and depth imaging. The method may comprise another step 320 comprising (b) directing the first and second set of light signals to a target region. The method may comprise another step 330 comprising (c) receiving a third set of light signals reflected or emitted from the surgical scene. The method may comprise another step 340 comprising (d) generating one or more fluorescence images of the surgical scene based on the third set of light signals. The third set of light signals may correspond to at least one of the first set of light signals and the second set of light signals.
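The four steps of FIG. 3 can be rendered as a pipeline of placeholder functions. Every function body below is a hypothetical stand-in for hardware behavior; only the step structure (a) through (d) comes from the method described above.

```python
# Placeholder pipeline mirroring steps (a)-(d) of FIG. 3. All bodies are
# hypothetical stand-ins; real implementations would drive light sources
# and imaging sensors.
def generate_light_signals():                     # step (a)
    return "fluorescence_excitation", "rgb_speckle_tof"

def direct_to_target(first, second, scene):       # step (b)
    scene["illumination"] = (first, second)

def receive_light(scene):                         # step (c)
    return ["reflected:" + s for s in scene["illumination"]]

def generate_images(third_set):                   # step (d)
    return [{"modality": "fluorescence", "source": s} for s in third_set]

def fluorescence_imaging_pipeline(scene):
    first, second = generate_light_signals()
    direct_to_target(first, second, scene)
    return generate_images(receive_light(scene))
```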

[00128] In another aspect, the present disclosure provides systems and methods for imaging and critical structure identification based on fluorescence signals. In some cases, the fluorescence signals may comprise fluorescence signals that are generated by a fluorescing material. The fluorescing material may be introduced into a surgical scene or administered to a subject such that the fluorescing material is detectable within the surgical scene. In some cases, the surgical scene may comprise a ureter, a bile duct, or one or more lymph nodes of the subject.

[00129] In some embodiments, the fluorescing material may comprise a vitamin. The vitamin may comprise, for example, vitamin B. As used herein, vitamin B may refer to any type of B vitamin, including, for instance, B1 (thiamine), B2 (riboflavin), B3 (niacin), B5 (pantothenic acid), B6 (pyridoxine), B7 (biotin), B9 (folate), or B12 (cobalamin). In some cases, B2 (riboflavin) may automatically fluoresce when exposed to a light source that generates light having a wavelength ranging from about 300 nanometers to about 500 nanometers. The fluorescence light emitted by the B2 (riboflavin) in response to such excitation light may have an emission wavelength that is greater than about 350 nanometers. In some embodiments, the fluorescing material may be water soluble, and in some cases, the fluorescing material may be removable from the subject’s body via one or more bodily fluids (e.g., urine). In some cases, vitamin B may comprise or be vitamin B2.

[00130] In some embodiments, the fluorescence signals generated by the fluorescing material may be registered using one or more imaging sensors. The one or more imaging sensors can be used to generate an image of a surgical scene or any objects or features detectable within the surgical scene based at least in part on the fluorescence signals registered by various pixels or sub-pixels of the one or more imaging sensors. In some cases, a processor may be configured to process the fluorescence signals to identify various critical structures that are highlighted by, adjacent to, or proximal to the fluorescing material.

[00131] In some embodiments, B2 (riboflavin) can accumulate in a select location of a subject’s body after administration. When present in sufficient quantities, the B2 (riboflavin) can be used to detect and visualize the select location or any critical structures or features that are observable within the select location. In some cases, the select location can comprise the subject’s bile duct or the subject’s lymph nodes (e.g., sentinel nodes).

[00132] In some embodiments, B2 (riboflavin) can maintain its fluorescence characteristics after metabolization. In some cases, various metabolites of B2 (riboflavin) can also be detected based on the fluorescence characteristics of the various metabolites. The absorption bands of B2 (riboflavin) and its various metabolites may correspond to the ranges of wavelengths of light that can be generated using the light sources described herein. The emission bands of B2 (riboflavin) and its various metabolites may correspond to the ranges of wavelengths of light that can be detected using the imaging sensors described herein.

[00133] In any of the embodiments described herein, the fluorescence signals emitted by B2 (riboflavin) and/or its various metabolites can be analyzed by a processor to generate one or more metrics. The one or more metrics may comprise or may correspond to, for example, a quantitative analysis of the fluorescent materials present in a surgical scene, or the fluorescence signals emitted by said fluorescent materials. The quantitative analysis can be used to quantify an amount or a concentration of the fluorescent materials that are present in the surgical scene. In some cases, the one or more metrics may comprise, for example, a normalized reading of fluorescence. In some cases, the fluorescence signals can be analyzed as part of a multi-spectral analysis of a plurality of different light signals in different spectral bands. In some cases, a processor can be used to process (i) the fluorescence signals generated by B2 (riboflavin) or its various metabolites and (ii) an additional set of light signals to generate a medical image of a surgical scene. The additional set of light signals may comprise light having a different wavelength than that of the fluorescence signals generated or emitted by B2 (riboflavin) or its various metabolites. The fluorescence signals and the additional set of light signals can be collectively used to visualize a surgical scene using a plurality of different imaging modalities, which can be combined, toggled, or dynamically overlaid with each other as desired.
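A quantitative analysis of the kind described above, mapping a fluorescence reading to an estimated amount or concentration of the fluorescent material, might be sketched with a linear calibration. The linearity assumption, the calibration slope, and the background subtraction are all illustrative and not taken from the disclosure.

```python
# Sketch of quantifying a fluorescent material's concentration from a
# fluorescence intensity reading via a linear calibration. The slope,
# background term, and linear model are illustrative assumptions.
def estimate_concentration(fluorescence_intensity, calibration_slope, background=0.0):
    """Map a background-subtracted fluorescence intensity to a concentration."""
    corrected = max(fluorescence_intensity - background, 0.0)
    return corrected / calibration_slope
```

A normalized reading of fluorescence, as mentioned above, could then be obtained by dividing the corrected intensity by a reference intensity rather than by a calibration slope.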

[00134] In another aspect, the present disclosure provides a method for detecting, identifying, or visualizing a structure or a feature in the target region. In some embodiments, the method may comprise: (a) providing one or more B vitamins to a subject; (b) providing an excitation light to a target region comprising the one or more B vitamins; (c) receiving an emission light from the one or more B vitamins in the target region in response to the excitation light; and (d) detecting, identifying, or visualizing a structure or a feature in the target region based at least in part on the emission light received from the one or more B vitamins. In some embodiments, the emission light comprises fluorescent light. In some embodiments, the fluorescent light is generated from fluorescence of the one or more B vitamins. In some embodiments, the one or more B vitamins are configured to fluoresce or autofluoresce in response to the excitation light to generate the fluorescent light. In some embodiments, the one or more B vitamins are exogenous and non-naturally occurring in the subject’s body. In some embodiments, the excitation light has a wavelength ranging from about 200 nanometers to about 500 nanometers. In some embodiments, the excitation light has a wavelength of about 270 nanometers. In some embodiments, the emission light has a wavelength that is greater than or equal to about 500 nanometers. In some embodiments, the structure or feature in the target region comprises a ureter. In some embodiments, the structure or feature in the target region comprises a bile duct. In some embodiments, the structure or feature in the target region comprises one or more lymph nodes. In some embodiments, the emission light received from the one or more B vitamins is usable to detect, identify, or visualize a plurality of lymph nodes in the subject’s body. 
In some embodiments, the one or more B vitamins are introduced into the subject’s body via an intravenous injection or by way of oral ingestion. In some embodiments, the one or more B vitamins in the target region comprise excess B vitamins that are concentrated in the target region. In some embodiments, the method may comprise, prior to (d), filtering out the excitation light to aid in the detection, identification, or visualization of the structure or the feature in the target region based on the emission light from the one or more B vitamins. In some embodiments, the feature in the target region comprises a perfusion of a biological fluid through the structure in the target region. In some embodiments, the one or more B vitamins comprise a B2 vitamin.

[00135] Another aspect of the present disclosure provides a non-transitory computer readable medium comprising machine executable code that, upon execution by one or more computer processors, implements any of the methods above or elsewhere herein.

[00136] Another aspect of the present disclosure provides a system comprising one or more computer processors and computer memory coupled thereto. The computer memory comprises machine executable code that, upon execution by the one or more computer processors, implements any of the methods above or elsewhere herein.

Computer Systems

[00137] In another aspect, the present disclosure provides computer systems that are programmed or otherwise configured to implement methods of the disclosure. Referring to FIG. 4, the computer system 2001 may be programmed or otherwise configured to implement a method for fluorescence imaging. The computer system 2001 may be configured to, for example, control a transmission of a plurality of light signals to a surgical scene. At least a portion of the plurality of light signals may interact with one or more features in the surgical scene, and one or more light signals may be emitted or reflected from the surgical scene. The one or more emitted or reflected light signals may be received at an imaging module. One or more optical elements of the imaging module may be used to direct a first subset of the emitted or reflected light signals to a first imaging unit and a second subset of the emitted or reflected light signals to a second imaging unit. The system may be further configured to generate one or more images of the surgical scene based on at least the first subset and second subset of emitted or reflected light signals respectively received at the first and second imaging units. The first subset of reflected light signals may be used for fluorescence imaging. The second subset of reflected light signals may be used for RGB imaging, laser speckle imaging, and/or time of flight (TOF) imaging. The computer system 2001 can be an electronic device of a user or a computer system that is remotely located with respect to the electronic device. The electronic device can be a mobile electronic device.

[00138] The computer system 2001 may include a central processing unit (CPU, also "processor" and "computer processor" herein) 2005, which can be a single core or multi core processor, or a plurality of processors for parallel processing. The computer system 2001 also includes memory or memory location 2010 (e.g., random-access memory, read-only memory, flash memory), electronic storage unit 2015 (e.g., hard disk), communication interface 2020 (e.g., network adapter) for communicating with one or more other systems, and peripheral devices 2025, such as cache, other memory, data storage and/or electronic display adapters. The memory 2010, storage unit 2015, interface 2020 and peripheral devices 2025 are in communication with the CPU 2005 through a communication bus (solid lines), such as a motherboard. The storage unit 2015 can be a data storage unit (or data repository) for storing data. The computer system 2001 can be operatively coupled to a computer network ("network") 2030 with the aid of the communication interface 2020. The network 2030 can be the Internet, an internet and/or extranet, or an intranet and/or extranet that is in communication with the Internet. The network 2030 in some cases is a telecommunication and/or data network. The network 2030 can include one or more computer servers, which can enable distributed computing, such as cloud computing. The network 2030, in some cases with the aid of the computer system 2001, can implement a peer-to-peer network, which may enable devices coupled to the computer system 2001 to behave as a client or a server.

[00139] The CPU 2005 can execute a sequence of machine-readable instructions, which can be embodied in a program or software. The instructions may be stored in a memory location, such as the memory 2010. The instructions can be directed to the CPU 2005, which can subsequently program or otherwise configure the CPU 2005 to implement methods of the present disclosure.
Examples of operations performed by the CPU 2005 can include fetch, decode, execute, and writeback.

[00140] The CPU 2005 can be part of a circuit, such as an integrated circuit. One or more other components of the system 2001 can be included in the circuit. In some cases, the circuit is an application specific integrated circuit (ASIC).

[00141] The storage unit 2015 can store files, such as drivers, libraries, and saved programs. The storage unit 2015 can store user data, e.g., user preferences and user programs. The computer system 2001 in some cases can include one or more additional data storage units that are located external to the computer system 2001 (e.g., on a remote server that is in communication with the computer system 2001 through an intranet or the Internet).

[00142] The computer system 2001 can communicate with one or more remote computer systems through the network 2030. For instance, the computer system 2001 can communicate with a remote computer system of a user (e.g., a doctor, a surgeon, an operator, a healthcare provider, etc.). Examples of remote computer systems include personal computers (e.g., portable PCs), slate or tablet PCs (e.g., Apple® iPad, Samsung® Galaxy Tab), telephones, smart phones (e.g., Apple® iPhone, Android-enabled device, Blackberry®), or personal digital assistants. The user can access the computer system 2001 via the network 2030.

[00143] Methods as described herein can be implemented by way of machine (e.g., computer processor) executable code stored on an electronic storage location of the computer system 2001, such as, for example, on the memory 2010 or electronic storage unit 2015. The machine executable or machine-readable code can be provided in the form of software. During use, the code can be executed by the processor 2005. In some cases, the code can be retrieved from the storage unit 2015 and stored on the memory 2010 for ready access by the processor 2005. In some situations, the electronic storage unit 2015 can be precluded, and machine-executable instructions are stored on memory 2010.

[00144] The code can be pre-compiled and configured for use with a machine having a processor adapted to execute the code or can be compiled during runtime. The code can be supplied in a programming language that can be selected to enable the code to execute in a precompiled or as-compiled fashion.

[00145] Aspects of the systems and methods provided herein, such as the computer system 2001, can be embodied in programming. Various aspects of the technology may be thought of as "products" or "articles of manufacture" typically in the form of machine (or processor) executable code and/or associated data that is carried on or embodied in a type of machine readable medium. Machine-executable code can be stored on an electronic storage unit, such as memory (e.g., read-only memory, random-access memory, flash memory) or a hard disk. "Storage" type media can include any or all of the tangible memory of the computers, processors or the like, or associated modules thereof, such as various semiconductor memories, tape drives, disk drives and the like, which may provide non-transitory storage at any time for the software programming. All or portions of the software may at times be communicated through the Internet or various other telecommunication networks. Such communications, for example, may enable loading of the software from one computer or processor into another, for example, from a management server or host computer into the computer platform of an application server. Thus, another type of media that may bear the software elements includes optical, electrical, and electromagnetic waves, such as used across physical interfaces between local devices, through wired and optical landline networks and over various air-links. The physical elements that carry such waves, such as wired or wireless links, optical links, or the like, also may be considered as media bearing the software. As used herein, unless restricted to non-transitory, tangible "storage" media, terms such as computer or machine "readable medium" refer to any medium that participates in providing instructions to a processor for execution.

[00146] Hence, a machine readable medium, such as computer-executable code, may take many forms, including but not limited to, a tangible storage medium, a carrier wave medium or physical transmission medium. Non-volatile storage media including, for example, optical or magnetic disks, or any storage devices in any computer(s) or the like, may be used to implement the databases, etc. shown in the drawings. Volatile storage media include dynamic memory, such as main memory of such a computer platform. Tangible transmission media include coaxial cables, copper wire, and fiber optics, including the wires that comprise a bus within a computer system. Carrier-wave transmission media may take the form of electric or electromagnetic signals, or acoustic or light waves such as those generated during radio frequency (RF) and infrared (IR) data communications. Common forms of computer-readable media therefore include, for example: a floppy disk, a flexible disk, a hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD or DVD-ROM, any other optical medium, punch cards, paper tape, any other physical storage medium with patterns of holes, a RAM, a ROM, a PROM and an EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave transporting data or instructions, cables or links transporting such a carrier wave, or any other medium from which a computer may read programming code and/or data. Many of these forms of computer readable media may be involved in carrying one or more sequences of one or more instructions to a processor for execution.

[00147] The computer system 2001 can include or be in communication with an electronic display 2035 that comprises a user interface (UI) 2040 for providing, for example, a portal for a doctor or a surgeon to view one or more medical images associated with a live procedure. The portal may be provided through an application programming interface (API). A user or entity can also interact with various elements in the portal via the UI. Examples of UI's include, without limitation, a graphical user interface (GUI) and web-based user interface.

[00148] Methods and systems of the present disclosure can be implemented by way of one or more algorithms. An algorithm can be implemented by way of software upon execution by the central processing unit 2005. For example, the algorithm may be configured to generate one or more image overlays based on the one or more medical images generated using at least a portion of the light signals reflected or emitted from the surgical scene. The one or more image overlays may comprise, for example, fluorescence imaging data, TOF imaging data, laser speckle imaging data, and/or RGB imaging data associated with the surgical scene or one or more anatomical features or physiological characteristics of the surgical scene.
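One way such an overlay algorithm could work is by alpha-blending a normalized fluorescence intensity map onto an RGB frame. The sketch below is an illustrative assumption, not the disclosed algorithm; the green pseudocolor, the `alpha` weight, and the `threshold` parameter are all hypothetical choices.

```python
import numpy as np

def overlay_fluorescence(rgb, fluo, alpha=0.5, threshold=0.1):
    """Blend a normalized fluorescence map onto an RGB frame.

    rgb  : (H, W, 3) float array in [0, 1], the brightfield image
    fluo : (H, W) float array in [0, 1], normalized fluorescence intensity

    Pixels whose fluorescence exceeds `threshold` are tinted green,
    weighted by `alpha` and the local fluorescence intensity.
    """
    tint = np.zeros_like(rgb)
    tint[..., 1] = 1.0  # green pseudocolor channel for the overlay
    weight = alpha * np.clip(fluo, 0.0, 1.0)[..., None]
    weight[fluo[..., None] < threshold] = 0.0  # suppress background
    return rgb * (1.0 - weight) + tint * weight
```

The same blending pattern could in principle be reused for TOF or laser speckle overlays by substituting a different pseudocolor map for `tint`.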

EXAMPLES

EXAMPLE 1: visualizing a ureter of an animal via fluorescence generated by Vitamin B2

[00149] FIG. 5 shows an example in which a ureter of an animal (pig) is visualized during a surgical operation, using the methods and systems of the present disclosure. In this example, 10 milligrams (mg) of riboflavin (vitamin B2) was injected intravenously into an animal (pig) (1 milliliter (ml) of a 10 mg/ml riboflavin solution) during a surgery. The image was captured between 5 and 10 minutes after the injection. The fluorescence emitted from the organ lasted for about 40 minutes in this example. The animal had a liquid diet prior to the surgery. Images were acquired laparoscopically after creating an incision near the abdomen.

[00150] Picture 502 shows a fluorescent image of the surgical scene, including the ureter 504, surrounding tissue, and blood vessels 506. In this example, the excitation source was at 445 nanometers (nm), and a 460 nm long pass filter was used to block the excitation light in the light path of the RGB camera. The exposure time was less than one second, less than a fraction of a second, or less than 1/30 of a second. A strong signal to noise ratio was achieved without the need for any post-processing. In some cases, post-processing enhancements may optionally be performed on the captured images, which can improve the quality of the images, enhance visibility, and improve the signal to noise ratio. However, no such post-processing was applied to the images presented in FIG. 5. The ureter is emitting fluorescent light, making it brighter than the surrounding tissue and blood vessels 506. Picture 501 shows a brightfield image of the same surgical scene shown in picture 502. In the fluorescent image 502, the ureter can be seen easily and clearly during the surgery, which has various benefits as described throughout this disclosure.

EXAMPLE 2: visualizing the ureter of an animal via fluorescence generated using an intraureteral indocyanine green (ICG) stent

[00151] Fig. 6 shows an example fluorescent image of the ureter of an animal (pig). In some cases, a ureter of an animal can be visualized or illuminated by generating fluorescence inside the organ. In this example, an intraureteral indocyanine green (ICG) stent was implanted into the ureter of the animal (pig) in order to generate the fluorescence for illumination and visualization purposes. This was performed as a comparison to the method of EXAMPLE 1 (Fig. 5). The surgical scene in EXAMPLE 2 was illuminated with light in the infrared (IR) range in order to excite the ICG and cause it to emit fluorescent light, as shown in Fig. 6.

[00152] In some cases, using vitamin B2 to generate fluorescence in the ureter may have several advantages over using an ICG stent. For example, vitamin B2 is a stronger fluorophore than a typical ICG stent and may have a quantum efficiency greater than that of the ICG stent, for example 2X, 3X, 5X, 10X, 12X, 15X, 16X, 20X, 25X, or 30X greater or more. Another advantage of using vitamin B2 over an ICG stent is that it is non-invasive and does not require surgical implantation. Therefore, using vitamin B2 as a fluorophore and a method for illuminating an organ such as the ureter may reduce the time, delay, and complications associated with a surgery. Vitamin B2 may also be safer to use than ICG stents.

EXAMPLE 3

[00153] Fig. 7 presents an example sensitivity curve for the methods and systems of the present disclosure. In this example, vitamin B2 was dissolved and diluted in water at varying concentrations ranging from 0 to 2000 micrograms per milliliter (µg/mL). An imaging system described in this disclosure was used to detect the fluorescence generated as a result of exciting the vitamin B2, wherein the imaging system was pointed at the illuminated scene from a 10-centimeter (cm) distance. In some examples, injection of 10 mg of vitamin B2 (riboflavin) caused the urine of the animal to have a vitamin B2 concentration of from 10 µg/mL to 100 µg/mL on the chart shown in Fig. 7, generating the peak fluorescence achievable on the imaging system. In some cases, such fluorescence lasted for at least 10, 20, 30, 40, 60, 70, 80, or 90 minutes or longer.

EXAMPLE 4

[00154] Fig. 8 shows an example image in which the lymph nodes of an animal are visualized using the fluorescence methods and systems of the present disclosure. In this example, the animal (pig) had a diet that included vitamin B2. A fluorescence medical imaging system according to the present disclosure was used to capture the image shown in Fig. 8.

[00155] While preferred embodiments of the present invention have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. It is not intended that the invention be limited by the specific examples provided within the specification. While the invention has been described with reference to the aforementioned specification, the descriptions and illustrations of the embodiments herein are not meant to be construed in a limiting sense. Numerous variations, changes, and substitutions will now occur to those skilled in the art without departing from the invention. Furthermore, it shall be understood that all aspects of the invention are not limited to the specific depictions, configurations or relative proportions set forth herein which depend upon a variety of conditions and variables. It should be understood that various alternatives to the embodiments of the invention described herein may be employed in practicing the invention. It is therefore contemplated that the invention shall also cover any such alternatives, modifications, variations, or equivalents. It is intended that the following claims define the scope of the invention and that methods and structures within the scope of these claims and their equivalents be covered thereby.