CN109997174B - Wearable Spectral Inspection System - Google Patents


Info

Publication number
CN109997174B
CN109997174B
Authority
CN
China
Prior art keywords
light
user
electromagnetic radiation
head
wavelength
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201780071647.5A
Other languages
Chinese (zh)
Other versions
CN109997174A (en)
Inventor
N·E·萨梅茨
N·U·罗柏纳
A·克勒
M·拜伦洛特
E·拜伦洛特
C·M·哈利西斯
T·S·鲍尔斯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Magic Leap Inc
Original Assignee
Magic Leap Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Magic Leap Inc
Priority to CN202310647311.6A (published as CN116649967A)
Publication of CN109997174A
Application granted
Publication of CN109997174B

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/145Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue
    • A61B5/1455Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue using optical sensors, e.g. spectral photometrical oximeters
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0075Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence by spectroscopy, i.e. measuring spectra, e.g. Raman spectroscopy, infrared absorption spectroscopy
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
    • A61B5/1113Local tracking of patients, e.g. in a hospital or private home
    • A61B5/1114Tracking parts of the body
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
    • A61B5/1121Determining geometric values, e.g. centre of rotation or angular range of movement
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/145Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue
    • A61B5/1455Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue using optical sensors, e.g. spectral photometrical oximeters
    • A61B5/14551Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue using optical sensors, e.g. spectral photometrical oximeters for measuring blood gases
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/145Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue
    • A61B5/1455Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue using optical sensors, e.g. spectral photometrical oximeters
    • A61B5/14551Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue using optical sensors, e.g. spectral photometrical oximeters for measuring blood gases
    • A61B5/14552Details of sensors specially adapted therefor
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/145Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue
    • A61B5/1455Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue using optical sensors, e.g. spectral photometrical oximeters
    • A61B5/14551Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue using optical sensors, e.g. spectral photometrical oximeters for measuring blood gases
    • A61B5/14555Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue using optical sensors, e.g. spectral photometrical oximeters for measuring blood gases specially adapted for the eye fundus
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802Sensor mounted on worn items
    • A61B5/6803Head-worn items, e.g. helmets, masks, headphones or goggles
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient; User input means
    • A61B5/742Details of notification to user or communication with user or patient; User input means using visual displays
    • A61B5/743Displaying an image simultaneously with additional graphical information, e.g. symbols, charts, function plots
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient; User input means
    • A61B5/742Details of notification to user or communication with user or patient; User input means using visual displays
    • A61B5/744Displaying an avatar, e.g. an animated cartoon character
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/02Details
    • G01J3/0205Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows
    • G01J3/0248Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows using a sighting port, e.g. camera or human eye
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/02Details
    • G01J3/0256Compact construction
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/02Details
    • G01J3/0264Electrical interface; User interface
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/02Details
    • G01J3/10Arrangements of light sources specially adapted for spectrometry or colorimetry
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/02Details
    • G01J3/10Arrangements of light sources specially adapted for spectrometry or colorimetry
    • G01J3/108Arrangements of light sources specially adapted for spectrometry or colorimetry for measurement in the infrared range
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28Investigating the spectrum
    • G01J3/2823Imaging spectrometer
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28Investigating the spectrum
    • G01J3/42Absorption spectrometry; Double beam spectrometry; Flicker spectrometry; Reflection spectrometry
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28Investigating the spectrum
    • G01J3/42Absorption spectrometry; Double beam spectrometry; Flicker spectrometry; Reflection spectrometry
    • G01J3/427Dual wavelengths spectrometry
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/01Arrangements or apparatus for facilitating the optical investigation
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/25Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N21/31Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/25Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N21/31Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry
    • G01N21/35Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry using infrared light
    • G01N21/359Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry using infrared light using near infrared light
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N33/00Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/48Biological material, e.g. blood, urine; Haemocytometers
    • G01N33/483Physical analysis of biological material
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B6/00Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
    • G02B6/0001Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings specially adapted for lighting devices or systems
    • G02B6/0011Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings specially adapted for lighting devices or systems the light guides being planar or of plate-like form
    • G02B6/0075Arrangements of multiple light guides
    • G02B6/0076Stacked arrangements of multiple light guides of the same or different cross-sectional area
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/13Digital output to plotter ; Cooperation and interconnection of the plotter with other functional units
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods
    • A61B2017/00017Electrical control of surgical instruments
    • A61B2017/00216Electrical control of surgical instruments with eye tracking or head position tracking control
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/30Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
    • A61B2090/309Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure using white LEDs
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/02Details
    • G01J3/10Arrangements of light sources specially adapted for spectrometry or colorimetry
    • G01J2003/102Plural sources
    • G01J2003/106Plural sources the two sources being alternating or selectable, e.g. in two ranges or line:continuum
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28Investigating the spectrum
    • G01J3/2823Imaging spectrometer
    • G01J2003/2826Multispectral imaging, e.g. filter imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28Investigating the spectrum
    • G01J3/42Absorption spectrometry; Double beam spectrometry; Flicker spectrometry; Reflection spectrometry
    • G01J2003/425Reflectance
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0123Head-up displays characterised by optical features comprising devices increasing the field of view
    • G02B2027/0125Field-of-view increase by wavefront division
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking

Landscapes

  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Medical Informatics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Human Computer Interaction (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • Immunology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Dentistry (AREA)
  • Physiology (AREA)
  • Geometry (AREA)
  • Food Science & Technology (AREA)
  • Hematology (AREA)
  • Urology & Nephrology (AREA)
  • Medicinal Chemistry (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Ophthalmology & Optometry (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • User Interface Of Digital Computer (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)
  • Spectrometry And Color Measurement (AREA)

Abstract

In some embodiments, a system includes: a head-mounted frame removably coupleable to a user's head; one or more light sources coupled to the frame and configured to emit light of at least two different wavelengths toward a target object within the sources' field of emission; one or more electromagnetic radiation detectors coupled to the frame and configured to receive light reflected from the target object; and a controller operatively coupled to the light sources and detectors and configured to determine and display an output indicating an identity or property of the target object, that identity or property being determined from characteristics of the emitted light as measured by the detectors after reflection.
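The abstract does not specify how the controller relates detected light to a tissue property. As an illustration only (not the patent's implementation), a common two-wavelength scheme is pulse oximetry's "ratio of ratios": the pulsatile (AC) reflectance at each wavelength is normalized by its steady (DC) level, the two modulation ratios are divided, and the result is mapped to an oxygen-saturation estimate through an empirical calibration. All function names and the calibration constants below are hypothetical textbook values, not device-specific ones.

```python
# Illustrative controller logic for a two-wavelength reflectance measurement,
# in the style of pulse oximetry. Names and constants are hypothetical.

def modulation_ratio(ac: float, dc: float) -> float:
    """Pulsatile (AC) reflectance normalized by the steady (DC) component."""
    return ac / dc

def ratio_of_ratios(red_ac: float, red_dc: float,
                    ir_ac: float, ir_dc: float) -> float:
    """R = (AC_red / DC_red) / (AC_ir / DC_ir), the classic two-wavelength metric."""
    return modulation_ratio(red_ac, red_dc) / modulation_ratio(ir_ac, ir_dc)

def estimate_spo2(r: float) -> float:
    """Map R to an SpO2 percentage using a common empirical linear calibration.
    The coefficients (110, 25) are a textbook approximation only."""
    return 110.0 - 25.0 * r

r = ratio_of_ratios(red_ac=0.02, red_dc=1.0, ir_ac=0.04, ir_dc=1.0)
print(round(estimate_spo2(r), 1))  # → 97.5
```

In a real device the AC/DC decomposition would come from filtering a detector time series, and the calibration curve would be fit per hardware; the sketch shows only the arithmetic the two-wavelength design makes possible.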

Description

Wearable Spectral Inspection System

Priority Claim

This application claims priority to U.S. Provisional Application No. 62/398,454, filed September 22, 2016, which is incorporated herein by reference.

Incorporation by Reference

This application incorporates by reference the entire contents of each of the following U.S. patent applications: U.S. Patent Application No. 15/072,341; No. 14/690,401; No. 14/555,858; No. 14/555,585; No. 13/663,466; No. 13/684,489; No. 14/205,126; No. 14/641,376; No. 14/212,961; U.S. Provisional Patent Application No. 62/298,993 (U.S. Patent Application No. 15/425,837); and U.S. Patent Application No. 15/425,837.

Technical Field

The present disclosure relates to systems and methods for augmented reality using wearable components, and more particularly to configurations of augmented reality systems that identify materials through the properties of reflected light.

Background

Modern computing and display technologies have facilitated the development of systems for so-called "virtual reality" or "augmented reality" experiences, in which digitally reproduced images, or portions thereof, are presented to a user in a manner in which they appear, or may be perceived, as real. Virtual reality, or "VR", scenarios typically involve the presentation of digital or virtual image information without transparency to other actual real-world visual input; augmented reality, or "AR", scenarios typically involve the presentation of digital or virtual image information as an augmentation of the visualization of the actual world around the user, while still allowing the user to substantially perceive and view the real world.

Referring to FIG. 1, an augmented reality scene (4) is depicted in which a user of AR technology sees a real-world, park-like setting (6) featuring people, trees, buildings in the background, and a concrete platform (1120). In addition to these items, the user of the AR technology also perceives that he "sees" a robot statue (1110) standing upon the real-world platform (1120), and a cartoon-like avatar character (2) flying by, which appears to be a personification of a bumblebee, even though these elements (2, 1110) do not exist in the real world. It turns out that the human visual perception system is very complex, and producing a comfortable, natural-feeling, rich presentation of virtual image elements amongst other virtual or real-world imagery elements is challenging. For example, a head-worn AR display (or helmet-mounted display, or smart glasses) typically is at least loosely coupled to a user's head, and thus moves when the user's head moves. If the display system detects the user's head motion, the data being displayed can be updated to take the change in head pose into account. Certain aspects of suitable AR systems are disclosed, for example, in U.S. Patent Application Serial No. 14/205,126, entitled "System and method for augmented and virtual reality," the entire contents of which are incorporated herein by reference, as well as in the following additional disclosures, which relate to augmented and virtual reality systems such as those developed by Magic Leap, Inc. of Fort Lauderdale, Florida: U.S. Patent Application Serial Nos. 14/641,376; 14/555,585; 14/212,961; 14/690,401; 13/663,466; 13/684,489; and 62/298,993, each of which is hereby incorporated by reference herein in its entirety.

The systems and methods disclosed herein address various challenges and developments related to AR and VR technology.

Summary of the Invention

A mixed reality system is configured to perform spectroscopy. Mixed reality ("MR" for short) typically involves virtual objects that are integrated into, and responsive to, the natural world. For example, in an MR scenario, AR content may be occluded by real-world objects and/or be perceived as interacting with other objects (virtual or real) in the real world. Throughout this disclosure, references to AR, VR, or MR are not limiting on the invention, and the techniques may be applied in any context.

Some embodiments are directed to a wearable system for identifying a substance (for example, a tissue, a cell within a tissue, or a property within a cell or tissue) from wavelengths of light emitted from, and subsequently received/reflected/detected by, a head-mounted member removably coupleable to a user's head. Although this disclosure primarily refers to tissues or tissue properties as the subject of analysis according to various embodiments, the technologies, techniques, and components are not limited thereto. Some embodiments utilize one or more light sources, for example electromagnetic radiation emitters coupled to the head-mounted member, to emit light of one or more wavelengths in a user-selected direction. These embodiments permit continuous, and even passive, measurement. For example, a user wearing the head-mounted system may engage in a given activity while inward-facing sensors detect properties of the eye without interfering with the activity.

For example, a user may wear a system configured to look inward at the user's eyes and to identify or measure tissue properties of the eye, such as blood concentration in the blood vessels of the eye. In other examples of inward-facing systems, fluids such as intraocular fluid may be analyzed, rather than tissue properties alone. In other examples, the system may include sensors that look outward toward the external world and identify or measure tissue or material properties of things other than the eye, for example the user's extremities, or objects in the surrounding environment remote from the user.

In an outward-looking system, an eye-tracking camera coupled to the head-mounted member may determine the directional gaze at which the user is looking, and a processor or controller may correlate that gaze with observation of a real-world target object through images captured by a real-world capture system (for example, a camera or a depth sensor) coupled to the head-mounted member. A light source coupled to the head-mounted system emits light away from the user, for example infrared light from an electromagnetic radiation emitter, and in some embodiments emits light so as to create a radiation pattern in substantially the same direction as the gaze direction determined by the eye-tracking camera, such that the light is emitted onto the target object.

In some embodiments, the real-world capture system captures objects. For example, a depth sensor, such as a vertical-cavity surface-emitting laser, may determine the contour of an object by collecting time-of-flight signals that impinge upon the object. Once an object has been recognized at its outline by such a real-world capture system, the object may be highlighted and made available for labeling. In some embodiments, a camera system of a given field of view defines the area available for highlighting and labeling. For example, a camera correlated to the user's gaze may encompass a central visual field of view of 5 degrees, 10 degrees, or suitable increments, preferably up to 30 degrees, within which the light source will substantially emit light.
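The time-of-flight ranging underlying such a depth sensor reduces to converting a round-trip pulse time to distance, then looking for sharp depth discontinuities that mark an object's outline. The following is a minimal sketch under that premise; the grid layout, function names, and edge threshold are illustrative assumptions, not parameters from this disclosure:

```python
# Time-of-flight ranging: distance = (speed of light * round-trip time) / 2.
# Grid layout and the 5 cm edge threshold are illustrative assumptions.

C = 299_792_458.0  # speed of light, m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Convert a round-trip pulse time to a one-way distance in meters."""
    return C * round_trip_s / 2.0

def contour_mask(depth_map, threshold_m=0.05):
    """Mark pixels where depth jumps sharply from a neighbor -- a crude
    object-outline detector over a 2-D grid of per-pixel distances."""
    rows, cols = len(depth_map), len(depth_map[0])
    mask = [[False] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((0, 1), (1, 0)):
                rr, cc = r + dr, c + dc
                if rr < rows and cc < cols:
                    if abs(depth_map[r][c] - depth_map[rr][cc]) > threshold_m:
                        mask[r][c] = True
    return mask

# A pulse returning after ~6.67 nanoseconds corresponds to ~1 meter.
print(round(tof_distance_m(6.67e-9), 2))
```

Pixels flagged by such a mask would correspond to the highlighted outline made available for labeling.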

In some embodiments, such a system further comprises: one or more electromagnetic radiation detectors or photodetectors coupled to the head-mounted member and configured to receive reflected light that was emitted from the light source and reflected from the target object; and a controller operatively coupled to the one or more electromagnetic radiation emitters and the one or more electromagnetic radiation detectors, and configured to cause the one or more electromagnetic radiation emitters to emit pulses of light while also causing the one or more electromagnetic radiation detectors to detect levels of light absorption related to the emitted pulses of light, the absorption levels depending upon the reflected light of any particular pulse emission that is received.

In some embodiments, the system further includes a processor to match the wavelength of the reflected light received by the detector from the target object to a particular material, tissue type, or property of the underlying tissue. In some embodiments, other light properties are determined, such as polarization changes or scattering effects relative to the emitted and detected light, but for purposes of this description the wavelength property is used as the exemplary light property. For example, in some embodiments an inward-facing electromagnetic radiation emitter emits light in the infrared spectrum onto the user's retina, receives reflected light, and matches the wavelength of the reflected light to determine a physical property, such as the type of tissue or the oxygen saturation within the tissue. In some embodiments, the system includes an outward-facing light source and emits infrared light toward a target object (for example, an extremity of the user or of a third person), receives the reflected light, and matches the reflected-light wavelengths to determine the material being observed. For example, such an outward-facing system may detect the presence of cancerous cells among healthy cells. Because cancerous or otherwise abnormal cells reflect and absorb light differently than healthy cells do, light reflection at certain wavelengths can indicate the presence and quantity of abnormalities.

In some embodiments, the controller receives the captured target object from the real-world capture system and applies a label to the target object indicating the identified property. In some embodiments, the label is a textual label or prompt within the display of the head-mounted member. In some embodiments, the label is an audio prompt to the user. In some embodiments, the label is a virtual image of comparable tissue, such as that referenced in a medical text, superimposed near the target object for the user's ready comparative analysis.

In some embodiments, the head-mounted member may comprise an eyeglasses frame. The eyeglasses frame may be a binocular eyeglasses frame. The one or more radiation emitters may comprise a light source, such as a light-emitting diode. The one or more radiation emitters may comprise a plurality of light sources configured to emit electromagnetic radiation at two or more different wavelengths. The plurality of light sources may be configured to emit electromagnetic radiation at a first wavelength of about 660 nanometers and a second wavelength of about 940 nanometers. The one or more radiation emitters may be configured to emit the two different wavelengths of electromagnetic radiation sequentially. The one or more radiation emitters may be configured to emit the two predetermined wavelengths of electromagnetic radiation simultaneously. The one or more electromagnetic radiation detectors may comprise a device selected from the group consisting of a photodiode, a photodetector, and a digital camera sensor. The one or more electromagnetic radiation detectors may be positioned and oriented to receive light reflected after encountering the target object. The one or more electromagnetic radiation detectors may be positioned and oriented to receive light reflected after encountering the observed tissue or material; that is, whether facing inward toward the user's eyes or outward toward the user's environment, the one or more electromagnetic radiation detectors are oriented substantially in the same direction as the one or more electromagnetic radiation emitters.

The controller may further be configured to cause the plurality of light sources to emit a cyclic pattern of the first wavelength on, then the second wavelength on, then both wavelengths off, such that the one or more electromagnetic radiation detectors detect the first and second wavelengths separately. The controller may be configured to cause the plurality of light-emitting diodes to emit this cyclic pattern of first wavelength on, then second wavelength on, then both wavelengths off, in a cyclic pulsing pattern of about thirty times per second.
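The cyclic pattern above is a form of time-division multiplexing: each cycle has three phases, and the detector reading is attributed to whichever emitter phase is active (the dark phase lets ambient light be sampled separately). A minimal sketch, assuming equal phase durations and the illustrative names below, neither of which is specified by the disclosure:

```python
# Cyclic emission pattern: 660 nm on, then 940 nm on, then both off,
# repeated ~30 times per second. The detector is sampled in each phase
# so the two wavelengths (and ambient light) are read separately.
# Phase names and equal-duration timing are illustrative assumptions.

CYCLE_HZ = 30
PHASES = ("660nm_on", "940nm_on", "both_off")
PHASE_S = 1.0 / (CYCLE_HZ * len(PHASES))  # duration of each phase

def phase_at(t_seconds: float) -> str:
    """Which phase of the emission cycle is active at time t."""
    slot = int(t_seconds / PHASE_S) % len(PHASES)
    return PHASES[slot]

def demultiplex(samples):
    """Split a stream of (time, detector_reading) pairs into per-phase lists."""
    out = {p: [] for p in PHASES}
    for t, reading in samples:
        out[phase_at(t)].append(reading)
    return out

# One full cycle lasts 1/30 s; each of the three phases lasts 1/90 s.
print(phase_at(0.0), phase_at(0.012), phase_at(0.023))
```

Subtracting the both-off (ambient) readings from each wavelength's readings is the usual reason for including the dark phase.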

In some embodiments, the controller may be configured to calculate a ratio of the first-wavelength light measurement to the second-wavelength light measurement. In some embodiments, this ratio may be further converted to an oxygen saturation reading via a lookup table based at least in part upon the Beer-Lambert law. In some embodiments, the ratio is converted to a material identifier in an external lookup table, stored for example in an absorption database module on the head-mounted member or coupled to the head-mounted member on a local or remote processing module. For example, an absorption database module of absorption ratios or wavelength reflectances for particular tissues may be stored in a "cloud" storage system accessible to health care providers and accessed through the remote processing module. In some embodiments, the absorption database module may store the absorption properties (for example, wavelength ratios or wavelength reflectances) of certain foods, and be stored permanently on the local processing module to the head-mounted member.
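The ratio computation described above can be sketched with the "ratio of ratios" commonly used in pulse oximetry: each wavelength's pulsatile (AC) signal is normalized by its steady (DC) component before the red/infrared ratio is taken. The linear calibration below (SpO2 ≈ 110 − 25·R) is a textbook approximation assumed only for illustration; a real device would use a device-specific lookup table derived from the Beer-Lambert law and calibration data, as the text indicates:

```python
# Ratio-of-ratios oximetry sketch. The calibration spo2 = 110 - 25*R is
# a textbook pulse-oximetry approximation assumed for illustration; it
# is not a formula stated in this disclosure.

def ratio_of_ratios(ac_660, dc_660, ac_940, dc_940):
    """Normalize each wavelength's pulsatile (AC) signal by its steady
    (DC) component, then take the red/infrared ratio."""
    return (ac_660 / dc_660) / (ac_940 / dc_940)

def spo2_from_ratio(r: float) -> float:
    """Map the ratio to an oxygen-saturation estimate, clamped to 0-100%."""
    return max(0.0, min(100.0, 110.0 - 25.0 * r))

# R near 0.5 corresponds to high saturation; R near 1.0 to roughly 85%.
r = ratio_of_ratios(ac_660=0.02, dc_660=1.0, ac_940=0.04, dc_940=1.0)
print(r, spo2_from_ratio(r))
```

The same structure applies when the ratio indexes a material-identifier lookup table instead of an oxygen-saturation curve.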

In this manner, the controller may be configured to operate the one or more electromagnetic radiation emitters and the one or more electromagnetic radiation detectors to function as a broadly applicable head-mounted spectroscope. The controller is operatively coupled to an optical element that is coupled to the head-mounted member and viewable by the user, such that output of the controller indicating a wavelength property indicative of a particular tissue property or material may be viewed by the user through the optical element. The one or more electromagnetic radiation detectors may comprise a digital image sensor comprising a plurality of pixels, wherein the controller is configured to automatically detect the subset of pixels that receive light reflected after encountering, for example, tissue or cells within tissue. In some embodiments, such a subset of pixels is used to produce an output representing the target object within the field of view of the digital image sensor. For example, the output may be a displayed label indicating the absorption level of the tissue. In some embodiments, a comparative value is displayed as the output. For example, the output may be the percentage oxygen saturation of the blood at a first analysis time and the percentage oxygen saturation at a second analysis time, together with the rate of change recorded between the two times. In these embodiments, diseases such as diabetic retinopathy may be detected by identifying changes in the measured property over time.
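The two-time comparative output above amounts to reporting both readings plus their rate of change. A minimal sketch, where the timestamps, values, and units shown are illustrative assumptions:

```python
# Longitudinal comparison sketch: report saturation at two analysis
# times plus the rate of change between them. Times and values below
# are illustrative assumptions.

from datetime import datetime

def rate_of_change(t1: datetime, v1: float, t2: datetime, v2: float) -> float:
    """Percentage-points change per hour between two measurements."""
    hours = (t2 - t1).total_seconds() / 3600.0
    return (v2 - v1) / hours

t1, v1 = datetime(2017, 11, 1, 8, 0), 97.0   # morning reading, % SpO2
t2, v2 = datetime(2017, 11, 1, 20, 0), 94.0  # evening reading, % SpO2
rate = rate_of_change(t1, v1, t2, v2)
print(f"{v1}% -> {v2}%, {rate:+.2f} pp/hour")
```

A sustained negative trend in such a series is the kind of change over time that the text suggests could flag a condition for follow-up.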

In some embodiments, the controller may be configured to automatically detect the subset of pixels based at least in part upon reflected-light luminance differences between signals associated with the pixels. The controller may be configured to automatically detect the subset of pixels based at least in part upon reflected-light absorption differences between signals associated with the pixels. In such embodiments, such a subset may be isolated pixels labeled for further analysis, for example for additional irradiation or mapping, or a virtual image may be overlaid upon such pixels to provide visual contrast against the isolated pixels displaying other properties, thereby notifying the user of the differing properties of the sub-pixels identified by the system.

In some embodiments, system data collection is not only time-division multiplexed for pulsing and recording the light pulses, but is also collected passively multiple times a day. In some embodiments, a GPS or other similar mapping system is coupled to the system to correlate the user's location or the time of day with certain physiological data being collected. For example, a user may track physiological responses relative to certain places or activities throughout a day.

These and many other features and advantages of the present invention will be appreciated upon further consideration of the following drawings and description.

Brief Description of the Drawings

FIG. 1 illustrates certain aspects of an augmented reality presentation to a user.

FIGS. 2A-2D illustrate certain aspects of various augmented reality systems for wearable computing applications, featuring a head-mounted component operatively coupled to local and remote processing and data components.

FIG. 3 illustrates certain aspects of a connectivity paradigm between a wearable augmented or virtual reality system and certain remote processing and/or data storage resources.

FIGS. 4A-4D illustrate various aspects of pulse oximetry configurations and calibration curves pertaining to light scattering in the oxygenation of blood.

FIG. 5 illustrates a head-mounted spectroscopic inspection system integrating AR/VR functionality, according to some embodiments.

FIG. 6 illustrates various aspects of a wearable AR/VR system featuring an integrated spectroscopy module, according to some embodiments.

FIGS. 7A-7B are example light-saturation charts indicating identifying properties by wavelength.

FIG. 8 illustrates a method for identifying a material or material property with a head-mounted spectroscopic inspection system, according to some embodiments.

Detailed Description

Some AR and VR systems include processing capability, for example a controller or microcontroller, and also include a power supply for powering the functions of the various components; and given the fact that at least some of the components in a wearable computing system such as an AR or VR system are close to the body of the user operating them, there is an opportunity to utilize some of these system components to conduct certain physiological monitoring relative to the user. For example, physiological monitoring may be conducted by measuring light absorption.

In conventional light-absorption measurement techniques (for example, a pulse oximeter attachable to a person's finger, as shown in FIG. 4A, or in glucose detection), light is emitted in a controlled and fixed direction and received at a controlled and fixed receiver. The light is pulsed at different wavelengths through the interposed tissue structure while also being detected on the other side of the tissue structure (thereby measuring light properties such as absorption and scattering). In such systems, a measurement of the emitted light compared against a measurement of the detected light can provide an output proportional to, or read as, an estimated tissue or tissue property (for example, a pulse oximeter's estimated blood oxygen saturation level), or simply an output of the material or tissue type. As shown in FIG. 4D, a calibration curve depicting the ratio of the light of interest relative to other light can likewise predict a property of the underlying tissue, depending on the light incident upon it.

Raman spectroscopy is another technique that measures the inelastic scattering of photons released by irradiated molecules. When irradiated, particular molecules will exhibit particular shifts in wavelength, presenting unique scattering effects that can be used to measure and quantify the molecules within a sample.

FIG. 4B shows a plot of the absorption spectra of oxygenated (806) and deoxygenated (808) hemoglobin. As shown by these curves (806, 808), in the red wavelength range of the electromagnetic spectrum, for example at about 660 nm, there is a significant difference between the absorption of oxygenated and deoxygenated hemoglobin, while an inverted difference exists in the infrared wavelength range at about 940 nm. Pulsed radiation at such wavelengths, with detection using a pulse oximeter, is known to exploit this absorption difference in the determination of oxygen saturation for a particular user.
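The absorption behavior behind these curves follows the Beer-Lambert law, A = ε·c·l, with transmitted intensity I = I₀·10^(−A). The sketch below reproduces only the qualitative crossover described in the text (deoxygenated hemoglobin absorbs more at 660 nm, oxygenated more at 940 nm); the extinction values are placeholder magnitudes, not measured spectral data:

```python
# Beer-Lambert sketch: absorbance A = epsilon * concentration * path,
# transmitted fraction = 10**(-A). The extinction coefficients below are
# placeholder magnitudes chosen only to show the qualitative 660/940 nm
# crossover between Hb and HbO2; they are not measured data.

def absorbance(epsilon, concentration, path_cm):
    return epsilon * concentration * path_cm

def transmitted_fraction(a):
    return 10.0 ** (-a)

# Placeholder extinction coefficients (arbitrary units):
EPS = {("Hb", 660): 8.0, ("HbO2", 660): 1.0,
       ("Hb", 940): 1.0, ("HbO2", 940): 2.0}

for species in ("Hb", "HbO2"):
    for wavelength in (660, 940):
        a = absorbance(EPS[(species, wavelength)], 0.1, 1.0)
        print(species, wavelength, round(transmitted_fraction(a), 3))
```

The inverted ordering of the transmitted fractions at the two wavelengths is what makes the two-wavelength ratio sensitive to oxygen saturation.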

While a pulse oximeter (802) is typically configured to at least partially surround a tissue structure such as a finger (804) or an earlobe, certain tabletop-style systems have been proposed, such as that (812) shown in FIG. 4C, to observe absorption differences in the blood vessels of the eye, for example the retinal vessels, though such systems may also be configured to detect properties of other tissues.

Such a configuration (812) may be termed a flow oximeter or spectroscope system and may comprise components as shown, including a camera (816), a zoom lens (822), first (818) and second (820) light-emitting diodes (LEDs), and one or more beam splitters (814). It is valuable for certain users, such as high-altitude hikers, athletes, or persons with certain cardiovascular or respiratory problems, to be able to retrieve information pertaining to their blood oxygen saturation as they move about and conduct their activities throughout the day, or for a caregiver to be able to analyze tissue in real time for potential abnormalities. Most configurations, however, involve somewhat inconvenient encapsulation of a tissue structure, or are not portable or wearable, or do not account for other absorption properties indicative of other tissue states or materials, or do not incorporate the gaze at which the user is looking as part of the directionality of their sensors (in other words, they lack selectivity of target objects for identification and analysis by spectroscopy).

Advantageously, in some embodiments, a solution is presented herein that combines the convenience of wearable computing in the form of an AR or VR system with imaging means to determine, in real time, additional tissue identification and properties within the user's field of view.

Referring to FIGS. 2A-2D, some general componentry options are illustrated. In the portions of the detailed description that follow the discussion of FIGS. 2A-2D, various systems, subsystems, and components are presented for addressing the objectives of providing a high-quality, comfortably perceived display system for human VR and/or AR that accesses and creates external sources of information.

As shown in FIG. 2A, an AR system user (60) is depicted wearing a head-mounted component (58) featuring a frame (64) structure coupled to a display system (62) positioned in front of the user's eyes. A speaker (66) is coupled to the frame (64) in the depicted configuration and positioned adjacent the user's ear canal (in one embodiment, another speaker, not shown, is positioned adjacent the user's other ear canal to provide stereo/shapeable sound control). The display (62) is operatively coupled (68), such as by a wired lead or wireless connection, to a local processing and data module (70), which may be mounted in a variety of configurations, such as fixedly attached to the frame (64), fixedly attached to a helmet or hat (80) as shown in the embodiment of FIG. 2B, embedded in headphones, removably attached to the torso (82) of the user (60) in a backpack-style configuration as shown in the embodiment of FIG. 2C, or removably attached to the hip (84) of the user (60) in a belt-coupled configuration as shown in the embodiment of FIG. 2D.

The local processing and data module (70) may comprise a processor and controller (for example, a power-efficient processor or controller), as well as digital memory such as flash memory, both of which may be utilized to assist in the processing, caching, and storage of data, including: a) data captured from sensors that may be operatively coupled to the frame (64), such as electromagnetic emitters and detectors, image capture devices (such as cameras), microphones, inertial measurement units, accelerometers, compasses, GPS units, radio devices, and/or gyroscopes; and/or b) data acquired and/or processed using a remote processing module (72) and/or a remote data repository (74), which data may be passed to the display (62) after such processing or retrieval. The local processing and data module (70) may be operatively coupled (76, 78), such as via wired or wireless communication links, to the remote processing module (72) and the remote data repository (74), such that these remote modules (72, 74) are operatively coupled to each other and available as resources to the local processing and data module (70).

In one embodiment, the remote processing module (72) may comprise one or more relatively powerful processors or controllers configured to analyze and process data, emitted or received light properties, and/or image information. In one embodiment, the remote data repository (74) may comprise a relatively large-scale digital data storage facility, which may be available through the Internet or other networking configuration in a "cloud" resource configuration. In one embodiment, all data is stored and all computation is performed in the local processing and data module, allowing fully autonomous use from any remote modules.

Referring now to FIG. 3, a schematic illustrates the coordination between cloud computing assets (46) and local processing assets, which may reside, for example, in the head-mounted component (58) coupled to the user's head (120) and in the local processing and data module (70) coupled to the user's belt (308; the component 70 may therefore also be termed a "belt pack" 70), as shown in FIG. 3. In one embodiment, the cloud (46) assets, such as one or more server systems (110), are operatively coupled (115), such as via wired or wireless networking (wireless being generally preferred for mobility, wired being generally preferred for certain high-bandwidth or high-data-volume transfers that may be desired), directly to (40, 42) one or both of the local computing assets coupled to the user's head (120) and belt (308) as described above, such as processor and memory configurations. These computing assets local to the user may be operatively coupled to each other as well, via wired and/or wireless connectivity configurations (44), such as the wired coupling (68) discussed below in reference to FIG. 8.

In one embodiment, to maintain a low-inertia and small-size subsystem mounted to the user's head (120), the primary transfer between the user and the cloud (46) may be via the link between the belt-mounted (308) subsystem and the cloud, with the head-mounted (120) subsystem primarily data-tethered to the belt-based (308) subsystem using wireless connectivity, such as an ultra-wideband ("UWB") connection, as is currently employed, for example, in personal computing peripheral connectivity applications.

With efficient local and remote processing coordination, and an appropriate display device for the user, such as the user interface or user display system (62) shown in FIG. 2A, or variations thereof, aspects of one world pertinent to the user's current actual or virtual location may be transferred or "passed" to the user and updated in an efficient fashion. In other words, a map of the world may be continually updated at a storage location which may, for example, partially reside on the user's AR system and partially reside in cloud resources. The map (also referred to as a "passable world model") may be a large database comprising raster imagery, 3-D and 2-D points, parametric information, and other information about the real world. As more and more AR users continually capture information about their real environment (for example, through cameras, sensors, IMUs, etc.), the map becomes more and more accurate and complete.

With a configuration as described above, wherein there is one world model that can reside on cloud computing resources and be distributed from there, such a world can be "passable" to one or more users in a relatively low-bandwidth form, preferable to trying to pass around real-time video data or the like. In some embodiments, the augmented experience of a person standing near the statue (i.e., as shown in FIG. 1) may be informed by the cloud-based world model, a subset of which may be passed down to the person and his or her local display device to complete the view. A person sitting at a remote display device, which may be as simple as a personal computer sitting on a desk, can efficiently download that same section of information from the cloud and have it rendered on his or her display. Indeed, one person actually present in the park near the statue may take a remotely located friend for a walk in that park, with the friend joining through virtual and augmented reality. The system will need to know where the street is, where the trees are, and where the statue is; but with that information residing on the cloud, the joining friend can download aspects of the scenario from the cloud and then start walking along as an augmented reality rendered local relative to the person who is actually in the park.

Three-dimensional (3-D) points may be captured from the environment, and the pose (i.e., vector and/or origin position information relative to the world) of the cameras that capture those images or points may be determined, so that the points or images may be "tagged" with, or associated with, this pose information. Points captured by a second camera may then be utilized to determine the pose of the second camera. In other words, the second camera may be oriented and/or localized based on comparisons with the tagged images from the first camera. This knowledge may then be utilized to extract textures, make maps, and create a virtual copy of the real world (because at that point there are two cameras registered around it).

So, at a base level, in some embodiments a person-worn system can be utilized to capture both 3-D points and the 2-D images that produced those points, and these points and images may be sent out to cloud storage and processing resources. They may also be cached locally with embedded pose information (e.g., cached tagged images), so that the cloud may have ready (e.g., in available cache) tagged 2-D images (e.g., tagged with a 3-D pose), along with 3-D points. If a user is observing something dynamic (e.g., a scene with moving objects or features), he/she may also send additional information up to the cloud pertinent to the motion (for example, if looking at another person's face, the user can take a texture map of the face and push that up at an optimized frequency, even though the surrounding world is otherwise basically static). As noted above, more information on object recognizers and the passable world model may be found in U.S. Patent Application Serial No. 14/205,126, entitled "System and method for augmented and virtual reality", which is incorporated by reference herein in its entirety, along with the following additional disclosures, which relate to augmented and virtual reality systems such as those developed by Magic Leap, Inc. of Fort Lauderdale, Florida: U.S. Patent Application Serial No. 14/641,376; U.S. Patent Application Serial No. 14/555,585; U.S. Patent Application Serial No. 14/212,961; U.S. Patent Application Serial No. 14/690,401; U.S. Patent Application Serial No. 13/663,466; U.S. Patent Application Serial No. 13/684,489; and U.S. Patent Application Serial No. 62/298,993, each of which is hereby incorporated by reference herein in its entirety.

In some embodiments, use of such passable world information may permit the identification and labeling of objects by spectroscopy, which can then be passed among users. For example, in a clinical setting, a first caregiver operating a device implementing features of this disclosure may map and detect cancerous tissue on a patient, and assign and apply a virtual label — much like a meta tag — to that tissue. A second caregiver similarly wearing such a device may then view the same cluster of cancerous tissue cells and receive notice of the virtual label identifying those cells, without needing to engage in one or more of emitting light, receiving light, matching absorption properties to a tissue, and independently labeling the tissue.

GPS and other localization information may be utilized as inputs to such processing. It will be appreciated that highly accurate localization of the user's head, totems, hand gestures, haptic devices, etc. may be advantageous in displaying appropriate virtual content to the user, and in displaying passable virtual or augmented content among users in the passable world.

Referring to FIG. 5, a top orthogonal view of a head-mountable component (58) of a wearable computing configuration is shown, featuring various integrated components for an exemplary spectroscopy system. The configuration features: two display elements (62 — binocular, one for each eye); two forward-oriented cameras (124) for observing and detecting the world around the user, each camera (124) having an associated field of view (18, 22); at least one spectroscopy array (126, described in further detail in FIG. 6) having a field of view (20); a forward-oriented, relatively high-resolution picture camera (156) with a field of view (26); one or more inertial measurement units (102); and a depth sensor (154) with an associated field of view (24), for example as described in the previously incorporated references. Oriented toward the user's eyes (12, 13) and coupled to the frame of the head-mounted component (58) are eye-tracking cameras (828, 830) and inward-directed emitters and receivers (832, 834). It will be appreciated by those skilled in the art that the inward emitters and receivers (832, 834) emit and receive light directed at the eyes in radiation patterns (824, 826), in the same manner that the spectroscopy array (126) operates on outward objects within its field of view (20). These components, or combinations excluding some of them, are operatively coupled, such as by leads, to a controller (844), which in turn is operatively coupled (848) to a power supply (846) such as a battery.

In some embodiments, the display element (62) comprises one or more waveguides (e.g., a waveguide stack), which are optically transmissive and allow the user to "see" the world by receiving light from the world. The waveguides also receive light containing display information, propagate that light, and direct it to the user's eyes (12, 13), thereby displaying images to the user. Preferably, light propagating out of the waveguides provides particular, defined levels of wavefront divergence corresponding to different depth planes (e.g., light forming the image of an object at a particular distance from the user has a wavefront divergence that corresponds to, or substantially matches, the wavefront divergence of the light that would actually reach the user from that object). For example, the waveguides may have optical power and may be configured to output light with selectively variable levels of wavefront divergence. It will be appreciated that this wavefront divergence provides cues to accommodation for the eyes (12, 13). In addition, the display elements (62) utilize binocular disparity to provide further depth cues, e.g., cues to the vergence of the eyes (12, 13). Advantageously, the cues to accommodation and the cues to vergence may match, e.g., such that they both correspond to an object at the same distance from the user. This accommodation-vergence matching facilitates the long-term wearability of a system utilizing the head-mounted component (58).

With continued reference to FIG. 5, preferably each of the emitters (126, 832, 834) is configured to controllably emit electromagnetic radiation at two or more wavelengths, for example about 660 nm and about 940 nm, such as by LEDs, and preferably the radiation fields (824, 826) are oriented to illuminate the target object or surface. In some embodiments, the target object is inward, such as an eye (12, 13), and the radiation patterns (824, 826) may be fixed, or may be widened/narrowed in response to eye-tracking camera data points so as to be directed at particular areas of the eye. In some embodiments, the target object is outward (e.g., away from the user), and the radiation pattern within the field of view (20) of the spectroscopy array (126) conforms to the gaze of the eyes (12, 13) as determined from the eye-tracking cameras (828, 830).

In some embodiments, gaze may be understood as a vector extending from the user's eye, for example extending from the fovea through the lens of the eye. The emitters (832, 834) may output infrared light with respect to the user's eyes, and reflections from the eye (e.g., corneal reflections) may be monitored. A vector between the pupil center of the eye (e.g., the display system may determine the centroid of the pupil, for instance through infrared imaging) and the reflections from the eye may be used to determine the gaze of the eye. In some embodiments, when estimating the position of the eye, because the eye has a sclera and an eyeball, the geometry can be represented as two circles layered on top of each other. An eye pointing vector may be determined or calculated based on this information. The center of rotation of the eye may also be estimated, since the cross-section of the eye is circular and the sclera swings through a particular angle. This can yield vector distances, not merely ray traces, due to autocorrelation of the received signal against the known emitted signal. The output may be seen as a Purkinje image 1400, which may in turn be used to track movement of the eye.
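The pupil-center/corneal-reflection relationship described above can be sketched in code. This is a minimal, hypothetical simplification — it assumes the pupil centroid and the corneal glint have already been extracted from an infrared camera frame, and treats their offset in the image plane as a (normalized) gaze direction; the function name and coordinate convention are illustrative, not part of the disclosure.

```python
import numpy as np

def estimate_gaze_vector(pupil_center, corneal_glint):
    """Approximate a gaze direction from a pupil centroid and a corneal
    reflection (glint), both given as 2-D points in IR-camera coordinates.
    The pupil-minus-glint offset is taken as the gaze direction in the
    camera's image plane (a crude stand-in for a full eye model)."""
    pupil = np.asarray(pupil_center, dtype=float)
    glint = np.asarray(corneal_glint, dtype=float)
    offset = pupil - glint           # pupil-minus-glint offset vector
    norm = np.linalg.norm(offset)
    if norm == 0.0:                  # eye looking straight at the emitter
        return np.array([0.0, 0.0])
    return offset / norm             # unit direction in image coordinates

# Example: pupil centroid at (322, 241), glint at (310, 238)
direction = estimate_gaze_vector((322, 241), (310, 238))
```

In a complete system this image-plane direction would be calibrated against known fixation targets and combined with the estimated center of rotation to yield a 3-D gaze ray.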

Those skilled in the art will appreciate other ways of determining the radiation pattern within the field of view (20), such as via head pose information determined by one or more of the IMUs (102).

In some embodiments, the emitters may be configured to emit the wavelengths simultaneously or sequentially, with controlled pulsatile emission cycling. The one or more detectors (126, 828, 830) may comprise photodiodes, photodetectors, and/or digital camera sensors, and preferably are positioned and oriented to receive radiation that has encountered the targeted tissue, material, or object. The one or more electromagnetic radiation detectors (126, 828, 830) may comprise a digital image sensor comprising a plurality of pixels, wherein the controller (844) is configured to automatically detect a subset of the pixels that receive light reflected after encountering the target object, and is configured to use such subset of pixels to produce an output.

In some embodiments, the output depends upon matching the received light, relative to the emitted light, against targets from an absorption database of materials and material properties. For example, in some embodiments the absorption database comprises a plurality of absorption charts such as those shown in FIGS. 7A and 7B. It will be appreciated that a database comprising charts may comprise electronic representations or transformations of the information in those charts, and the use of the term chart herein includes such representations or transformations. FIGS. 7A and 7B are used merely as examples, but demonstrate various tissue properties that may be detected by a given system that emits light from a particular light source and receives light of particular wavelengths and/or light properties, in order to determine the probability that the observed target is a particular tissue, or has a particular property within a tissue. Other charts, such as saturation charts or calibration curves, may be selectively accessed by the user. For example, a user may select an absorption database for a particular light source or wavelength pattern, and then look around until the spectroscopy system identifies a material matching the desired properties. Such an embodiment may be termed a "closed search" — one that looks for specific properties and then searches a database for matches to the detected light properties — as opposed to an "open search" that looks at any target.
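A "closed search" of the kind described above can be sketched as a simple signature comparison. The database contents, material names, wavelengths, and tolerance below are entirely illustrative (not clinical data); the sketch only shows the shape of the matching step — measured per-wavelength absorbance compared against a selected target's stored signature.

```python
# Hypothetical absorption database: material -> absorbance at two probe
# wavelengths (values are illustrative only, not clinical data).
ABSORPTION_DB = {
    "oxygenated_hemoglobin":   {660: 0.15, 940: 0.65},
    "deoxygenated_hemoglobin": {660: 0.75, 940: 0.25},
}

def closed_search(measured, target, db=ABSORPTION_DB, tol=0.05):
    """Return True if the measured per-wavelength absorbances match the
    requested target material within `tol` at every wavelength.
    `measured` maps wavelength (nm) -> observed absorbance, computed
    elsewhere from the emitted vs. received light."""
    signature = db[target]
    return all(
        abs(measured.get(wl, float("inf")) - a) <= tol
        for wl, a in signature.items()
    )

# The wearer selects "oxygenated_hemoglobin" and scans until a match:
sample = {660: 0.17, 940: 0.63}
found = closed_search(sample, "oxygenated_hemoglobin")   # -> True
```

An "open search" would instead iterate over every entry in the database and report whichever signature lies closest to the measurement.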

The controller (844) may be configured to automatically detect the subset of pixels within the field of view (124 or 126, or 824, 826, FIG. 5) based at least in part upon differences in reflected light properties among the signals associated with the pixels. For example, the controller (844) may be configured to automatically detect the subset of pixels based at least in part upon differences in reflected light absorption among the signals associated with the pixels. Without being limited by theory, light striking an object will be reflected, transmitted (absorbed), or scattered upon hitting the object, such that R + T + S = 1 (R = reflection from the object, T = transmission into/absorption by the object, S = scattering from the object). If a particular subset of pixels reflects a higher proportion of light relative to the surrounding pixels, the controller may isolate those pixels, or record or register in a memory system the pixel locations associated with those differing properties. In some embodiments, the pixel locations are stored as dense or sparse mapping points in a passable world mapping system, such as a map accessible to additional users of head-mounted display systems, whereby the subset of pixels is passed to additional users and accessed by and/or displayed on a second user's display.
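The automatic pixel-subset detection described above can be sketched as an outlier test on per-pixel reflectance. This is a minimal sketch under stated assumptions: the per-pixel reflectance ratios R (with R + T + S = 1) have already been computed from the emitted and received light, and a fixed deviation threshold stands in for whatever criterion the controller actually applies.

```python
import numpy as np

def reflectance_outliers(reflectance, threshold=0.2):
    """Given a 2-D array of per-pixel reflectance ratios R, return the
    (row, col) indices of pixels whose reflectance differs from the
    image mean by more than `threshold` -- a crude stand-in for the
    controller's automatic subset detection."""
    r = np.asarray(reflectance, dtype=float)
    deviation = np.abs(r - r.mean())
    rows, cols = np.nonzero(deviation > threshold)
    return list(zip(rows.tolist(), cols.tolist()))

# A mostly uniform patch with one strongly reflective pixel:
patch = np.full((3, 3), 0.30)
patch[1, 1] = 0.80
subset = reflectance_outliers(patch)   # -> [(1, 1)]
```

The returned pixel locations are what would be registered in memory or stored as map points in the passable world system.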

Referring to FIG. 6, the spectroscopy array (126) may comprise a light source (612) emitting light (613) toward a target object (620). In some embodiments, the light source (612) is an electromagnetic emitter such as a light-emitting diode. In some embodiments, the direction of the emitted light (613) is substantially the same as the gaze orientation, or the head pose orientation, of the user (60). In some embodiments, a photodetector (614) captures light (615) reflected from the target object. In some embodiments, a processor (610), which may be the controller (844) depicted in FIG. 5, determines the absorption properties between the emitted light (613) and the reflected light (615), and matches properties from an absorption database (630). In some embodiments, the absorption database (630) is stored on a local processing module, such as module (70) shown in FIG. 2A; in some embodiments, the absorption database (630) is stored on a remote processing module, such as remote processing module (72) shown in FIG. 2A.

For simplicity, the object (620) is depicted in FIG. 6 as an apple, and while foods have their own respective light-absorption properties — and embodiments of the invention may be used to identify foods by their light properties — more complex uses are also envisioned. In some embodiments, the outward-facing spectroscopy array (126) identifies a tissue source (624), such as an arm, depicted for purposes of illustration. Emitted light (613) may strike the tissue source (624), and reflected light (615) may indicate the presence of irregular cells (626) among regular cells (625). When the light source (612) illuminates the tissue source (624), the irregular cells (626) return different light properties to the photodetector (614) than the regular cells (625). The irregular cells (626) may be cancerous, may be part of scar tissue, or may even be healthy cells of the tissue that simply indicate or possess differences from the surrounding cells — for example, indicating where a blood vessel or bone may be located within the tissue source (624). In some embodiments, the regular cells constitute the majority of the cells in the sample under analysis, the irregular cells constitute a minority of the sample, and the irregular cells exhibit detectable properties different from those of the regular cells. In some embodiments, real-world cameras capturing pixel-level images may label such irregular cells (626). As previously described, one such labeling system may apply a text image proximate to the irregular cells (626); another may apply a color overlay upon the irregular cells (626), as seen through the display element (62) (FIG. 5).

Thus, referring again to FIG. 5, a system for determining a tissue property or a material via a wearable computing system, such as one for AR or VR, is presented, comprising: a head-mounted component (58) removably coupleable to the user's head; one or more electromagnetic radiation emitters (126, 832, 834) coupled to the head-mounted component (58) and configured to emit light having at least two different wavelengths, in either an inward or outward direction; one or more electromagnetic radiation detectors (126, 828, 830) coupled to the head-mounted component and configured to receive light reflected after encountering a target object; and a controller (844) operatively coupled to the one or more electromagnetic radiation emitters (126, 832, 834) and the one or more electromagnetic radiation detectors (126, 828, 830) and configured to cause the one or more electromagnetic radiation emitters to emit pulses of light, while also causing the one or more electromagnetic radiation detectors to detect levels of light absorption related to the emitted pulses of light, and to produce a displayable output.

The head-mounted component (58) may comprise a frame configured to rest on the user's head, such as an eyeglass frame. The eyeglass frame may be a binocular eyeglass frame; alternative embodiments may use a monocular eyeglass frame. The one or more emitters (126, 832, 834) may comprise light sources emitting light at multiple wavelengths, such as at least one light-emitting diode or other electromagnetic radiation emitter. The plurality of light sources may be configured to emit at preferably two wavelengths of light, e.g., a first wavelength of about 660 nanometers and a second wavelength of about 940 nanometers.

In some embodiments, the one or more emitters (126, 832, 834) may be configured to emit the respective wavelengths of light sequentially. In some embodiments, the one or more emitters (126, 832, 834) may be configured to emit the respective wavelengths of light simultaneously. The one or more electromagnetic radiation detectors (126, 828, 830) may comprise a device selected from the group consisting of a photodiode, a photodetector, and a digital camera sensor. The controller (844) may further be configured to cause the plurality of light-emitting diodes to emit in a cyclic pattern of first wavelength on, then second wavelength on, then both wavelengths off, such that the one or more electromagnetic radiation detectors detect the first and second wavelengths separately. The controller (844) may be configured to cause the plurality of light-emitting diodes to emit this cyclic pattern — first wavelength on, then second wavelength on, then both wavelengths off — in a cyclic pulsing pattern of about thirty times per second. The controller (844) may be configured to calculate a ratio of the first-wavelength light measurement to the second-wavelength light measurement, wherein this ratio is converted to an oxygen saturation reading via a lookup table based at least in part upon the Beer-Lambert law.
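The emission cycle and ratio-to-saturation lookup described above can be sketched as follows. This is a hedged illustration, not the disclosed implementation: the lookup-table values are placeholders (a real table would come from Beer-Lambert-based calibration against reference oximetry), and the ratio is assumed to have been computed upstream from the two wavelength measurements.

```python
from itertools import cycle, islice

# One emission cycle as described: first wavelength on, then second
# wavelength on, then both off -- repeated ~30 times per second.
CYCLE = [(True, False), (False, True), (False, False)]

def cycle_states(n):
    """First n (led_660_on, led_940_on) states of the pulsed pattern."""
    return list(islice(cycle(CYCLE), n))

# Illustrative lookup table mapping the 660/940 measurement ratio to an
# SpO2 percentage (placeholder values, not a calibrated clinical table).
RATIO_TO_SPO2 = [(0.5, 100), (1.0, 97), (1.5, 92), (2.0, 85), (3.0, 70)]

def spo2_from_ratio(ratio):
    """Piecewise-linear interpolation into the lookup table."""
    pts = RATIO_TO_SPO2
    if ratio <= pts[0][0]:
        return pts[0][1]
    for (r0, s0), (r1, s1) in zip(pts, pts[1:]):
        if ratio <= r1:
            t = (ratio - r0) / (r1 - r0)
            return s0 + t * (s1 - s0)
    return pts[-1][1]           # ratio beyond table range

states = cycle_states(6)        # two full emission cycles
reading = spo2_from_ratio(1.0)  # -> 97
```

The "both off" phase lets the detector sample ambient light so it can be subtracted from the two wavelength measurements before the ratio is formed.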

The controller (844) may be configured to operate the one or more emitters (126, 832, 834) and the one or more electromagnetic radiation detectors (126, 828, 830) to function as a head-mounted spectrometer. The controller (844) is operatively coupled to an optical element (62), which is coupled to the head-mounted component (58) and viewable by the user, such that output from the controller (844) indicating a particular material property or tissue property may be viewed by the user through the optical element (62).

FIG. 7A is an exemplary light-property absorption chart that may be referenced by the absorption database (630, FIG. 6). As shown, various light source types in the visible spectrum, such as IR, NIR, or light-emitting diodes, may be optimal for detecting certain tissues and certain properties within tissues. In some embodiments, absorption ratios, or scatter within calibration curves, are calculated from the emitted light to the reflected light and applied against a given absorption database (630), such as the one depicted in FIG. 7A, to determine the underlying tissue and/or property, or to identify abnormalities.

FIG. 7B depicts a potential "overlap" of wavelengths. As shown, "oxygenated blood" may overlap with "deoxygenated blood" at certain wavelengths, muting the results that the spectroscopic process can provide. To avoid such potential overlap, in some embodiments light of a second, different wavelength is emitted to provide a second light source for measurement and comparison.

FIG. 8 illustrates a method (850) for using a wearable AR/VR system featuring spectroscopy components to identify a tissue, or a property within a tissue. The method (850) begins at (851) with the system orienting a light source toward a target object. In some embodiments, the orientation is such that the light source points inward toward the user's eye, and this orientation may be fixed, or may scan across the eye, such as scanning the retina. In some embodiments, the orientation is accomplished by determining the user's eye gaze or head pose, and orienting the light source in substantially the same direction, toward a target object within the field of view of such gaze or pose, or toward a feature landmark or the target object.

In some embodiments, at (852), the light source emits light in a radiation pattern toward the target object or surface. In some embodiments, the light is pulsed at timed intervals by a timer. In some embodiments, the light source emits light of at least one wavelength, and at (854) a radiation detector, such as a photodetector, receives the reflected light. In some embodiments, the detector is also operatively coupled to the timer, to indicate whether received light was initially pulsed at a particular time, in order to determine changes in the light properties upon reflection from the target object. In some embodiments, (852) begins simultaneously with the mapping at (853), although the sequence need not be so.

In some embodiments, at (853), a real-world capture system may begin to map the target object. In some embodiments, such mapping may include receiving passable world data for the target object. In some embodiments, the mapping may include depth-sensor analysis of the contours of the target object. In some embodiments, the mapping may include building a mesh model of items within the field of view and referencing them for potential labeling. In some embodiments, the target object is not a specific object within the field of view capturable by a depth sensor, but rather a depth plane within the field of view itself.

In some embodiments, at (855), the controller analyzes the emitted light as compared with the received light, for example under the Beer-Lambert law, an optical density relationship (described below), or the scattering pattern of a calibration curve. In some embodiments, at (856), the compared light properties are referenced against an absorption database, either stored locally on the system or accessed remotely by the system, to identify the tissue, or a property of the tissue, of the target object. In some embodiments, the absorption database may comprise saturation light charts, such as the chart shown in FIG. 4B, or may comprise calibration curves for particular wavelengths of light.

In some embodiments, at (854), the radiation detector does not receive light of a wavelength different from the wavelength of the light emitted at (852), and the controller cannot conduct a spectroscopic analysis. Such a situation could occur as in FIG. 7B, where the wavelengths for oxygenated and deoxygenated blood overlap over certain ranges. In some embodiments, at (854a), no wavelength difference is detected between the emitted and received light, and sub-step (854b) initiates emission of light at another, different wavelength from that emitted at (852). The information for the new emitted light and received light is then conveyed to the controller at (855).
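The fallback of sub-steps (854a)/(854b) can be sketched as a retry loop over probe wavelengths. This is a hypothetical sketch: the `emit` callable, the normalized-intensity convention (emitted intensity of 1.0), and the contrast threshold are all assumptions introduced for illustration, not part of the disclosure.

```python
def probe_with_fallback(emit, wavelengths, min_difference=1e-3):
    """Try each probe wavelength in turn until the received signal
    differs measurably from the emitted one, as in sub-steps
    (854a)/(854b): if a wavelength yields no usable contrast, emit
    at the next, different wavelength.

    `emit` is a hypothetical callable: emit(wavelength_nm) returns the
    received intensity relative to an emitted intensity of 1.0.
    Returns (wavelength, received) or None if every probe overlaps.
    """
    for wl in wavelengths:
        received = emit(wl)
        if abs(1.0 - received) >= min_difference:   # usable contrast
            return wl, received
    return None   # all probe wavelengths overlapped (as in FIG. 7B)

# Toy stand-in: 660 nm shows no contrast, 600 nm does.
responses = {660: 1.0, 600: 0.82}
result = probe_with_fallback(lambda wl: responses[wl], [660, 600])
# -> (600, 0.82)
```

The successful (wavelength, received) pair is what would be conveyed to the controller for the analysis at (855).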

In some embodiments, after mapping the target object at (853), and potentially concurrently with each of (852) through (856), at (857) the real-world cameras may additionally identify sub-pixels within the field of view that indicate irregularities. For example, in some embodiments, color contrast between pixels is detected during the real-world capture at (853), and at (857) those pixels are further altered to highlight such contrast, as potentially unhealthy cells. In some embodiments, the real-world capture at (853) detects irregular lines between clusters of pixels, and at (857) the pixels bounded by those irregular lines are labeled on the user display (for example, by a virtual color overlay).

In some embodiments, the method (850) terminates at (858), with the system displaying the tissue, or the material property of the tissue, to the user. In some embodiments, the display may comprise a text label virtually displayed proximate to the target object, an audio label describing the target object as determined from the absorption database (630), or a virtual image of a comparable tissue or object identified by the absorption database (630), juxtaposed proximate to the target object.

In some embodiments, a significant amount of the spectroscopy activity is implemented with software operated by the controller (844), such that the initial task of locating a desired target (e.g., a blood vessel, muscle tissue, bone tissue, or other tissue, and at a desired depth) is conducted using digital image processing (e.g., color, grayscale, and/or intensity thresholding analysis using various filters). Such targeting may be conducted using pattern recognition, shape recognition, or texture recognition. Cancerous cells, or other irregular cells, often have irregular borders. A camera system may identify a series of pixels within the camera's field of view (e.g., camera 124 and fields of view 18, 22 in FIG. 5) having an irregular, non-linear pattern, and with prompt attention identify a border of potentially unhealthy cells. Alternatively, the software and controller may be configured to determine the contrast/optical density relative to the target object, using the center intensity of the target object and the intensity of the surrounding objects/tissue, to determine abnormalities. Such measures may serve merely to identify a region of interest for a spectroscope scan consistent with this disclosure, and are not necessarily a means of identifying the tissue itself. Further, as described previously with reference to the irregular cells (626) of FIG. 6, the augmented reality system may overlay a label or color pattern within the border of the potentially unhealthy cells to label/highlight them relative to the surrounding healthy cells.

In some embodiments, the controller (844) may be used to calculate density ratios (contrast) and, from the density ratios of various pulse-oximetry properties in a blood vessel, to calculate oxygen saturation. The optical density ("O.D.") of a vessel at each of two or more emission wavelengths can be calculated using the following formula:

ODvessel = -log10(Iv / It)

where ODvessel is the optical density of the vessel, Iv is the intensity within the vessel, and It is the intensity of the surrounding tissue.
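The formula above can be stated as a one-line computation; the function name and sample intensities below are illustrative only:

```python
import math

def optical_density(vessel_intensity, tissue_intensity):
    """ODvessel = -log10(Iv / It), per the formula above.

    Intensities are assumed to be positive sensor readings on the same
    scale; equal intensities give an optical density of zero.
    """
    return -math.log10(vessel_intensity / tissue_intensity)
```

For example, a vessel reading half as bright as the surrounding tissue yields an optical density of about 0.30.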

The oxygen saturation (also called "SO2") in a vessel can be calculated as a linear ratio of the vessel optical densities at the two wavelengths (the OD ratio, or "ODR"), such that:

SO2 = ODR = OD(first wavelength) / OD(second wavelength)

In one embodiment, wavelengths of about 570 nm (sensitive to deoxygenated hemoglobin) and about 600 nm (sensitive to oxygenated hemoglobin) may be used in vessel oximetry, such that SO2 = ODR = OD600nm / OD570nm. A formula of this kind does not account for adjusting the ratio by a calibration coefficient.
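The uncalibrated ratio can be sketched as follows. The `calibration` parameter is not part of the formula in the text; it is included only to mark where a calibration coefficient would enter, per the caveat above:

```python
def oxygen_saturation_ratio(od_600nm, od_570nm, calibration=1.0):
    """SO2 estimate as the OD ratio at ~600 nm and ~570 nm.

    With calibration=1.0 this is the raw, uncalibrated ODR from the
    text; a real system would fit the calibration coefficient against
    reference measurements (placeholder, not from the patent).
    """
    return calibration * (od_600nm / od_570nm)
```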

The formulas above are merely examples of references for calculating material properties. Those skilled in the art will recognize many other tissue properties and relationships that the controller can determine.

It should be appreciated that utilizing the controller (844) to perform calculations and/or make determinations may involve performing the calculations locally on a processor within the controller (844). In some other embodiments, utilizing the controller (844) to perform calculations and/or make determinations may involve interfacing the controller with an external computing source (e.g., a source in the cloud (46), such as the server (110)).

Computer Vision

As noted above, the spectral inspection system may be configured to detect objects, or features (e.g., properties) of objects, in the user's surroundings. In some embodiments, computer vision techniques may be used to detect objects, or properties of objects, present in the environment. For example, as disclosed herein, a forward-facing camera of the spectral inspection system may be configured to image an object, and the system may be configured to perform image analysis on the image to determine the presence of features on the object. The system may analyze images acquired by the outward-facing imaging system, absorption determinations, and/or reflected and/or scattered light measurements to perform object recognition, object pose estimation, learning, indexing, motion estimation, image restoration, and the like. One or more computer vision algorithms may be selected as appropriate and used to perform these tasks. Non-limiting examples of computer vision algorithms include: scale-invariant feature transform (SIFT), speeded-up robust features (SURF), oriented FAST and rotated BRIEF (ORB), binary robust invariant scalable keypoints (BRISK), fast retina keypoint (FREAK), the Viola-Jones algorithm, the Eigenfaces approach, the Lucas-Kanade algorithm, the Horn-Schunck algorithm, the mean-shift algorithm, visual simultaneous localization and mapping (vSLAM) techniques, sequential Bayesian estimators (e.g., Kalman filters, extended Kalman filters, etc.), bundle adjustment, adaptive thresholding (and other thresholding techniques), iterative closest point (ICP), semi-global matching (SGM), semi-global block matching (SGBM), feature point histograms, various machine learning algorithms (such as support vector machines, k-nearest neighbors, naive Bayes, neural networks (including convolutional or deep neural networks), or other supervised/unsupervised models), and so forth.

As discussed herein, objects, or features (including properties) of objects, may be detected based on one or more criteria (e.g., absorbance at one or more wavelengths, light reflection, and/or light scattering). When the spectral inspection system detects the presence of those criteria in the surrounding environment, using computer vision algorithms or using data received from one or more sensor assemblies (which may or may not be part of the spectral inspection system), it may then signal the presence of the object or feature.

One or more of these computer vision techniques may also be used together with data acquired from other environmental sensors (such as, e.g., a microphone or a GPS sensor) to detect and determine various properties of the objects detected by the sensors.

Machine Learning

Various machine learning algorithms may be used to learn to identify the presence of objects or features of objects. Once trained, the machine learning algorithms may be stored by the spectral inspection system. Some examples of machine learning algorithms may include supervised or unsupervised machine learning algorithms, including regression algorithms (e.g., ordinary least squares regression), instance-based algorithms (e.g., learning vector quantization), decision tree algorithms (e.g., classification and regression trees), Bayesian algorithms (e.g., naive Bayes), clustering algorithms (e.g., k-means clustering), association rule learning algorithms (e.g., a-priori algorithms), artificial neural network algorithms (e.g., perceptrons), deep learning algorithms (e.g., deep Boltzmann machines or deep neural networks), dimensionality reduction algorithms (e.g., principal component analysis), ensemble algorithms (e.g., stacked generalization), and/or other machine learning algorithms. In some embodiments, individual models can be customized for individual data sets. For example, the wearable device can generate or store a base model. The base model may be used as a starting point to generate additional models specific to a data type (e.g., a particular user), a data set (e.g., a set of absorbance, light reflection, and/or light scattering values obtained at one or more wavelengths), conditional situations, or other variations. In some embodiments, the spectral inspection system can be configured to utilize a plurality of techniques to generate models for analysis of the aggregated data. Other techniques may include using predefined thresholds or data values.
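As one hedged illustration of the instance-based algorithms listed above, a k-nearest-neighbor classifier could match a measured per-wavelength absorbance vector against stored reference spectra. The reference values and labels below are placeholders for the example, not entries from the absorption database:

```python
import numpy as np

def knn_classify(spectrum, reference_spectra, labels, k=3):
    """Classify an absorbance spectrum by its k nearest reference spectra.

    `reference_spectra` is an (n, m) array of n reference absorbance
    vectors over m wavelengths; `labels` gives the material/tissue label
    for each row. A simple Euclidean-distance majority vote; all data
    shapes and labels here are illustrative assumptions.
    """
    dists = np.linalg.norm(reference_spectra - spectrum, axis=1)
    nearest = np.argsort(dists)[:k]
    votes = [labels[i] for i in nearest]
    return max(set(votes), key=votes.count)
```

A customized per-user model, as described above, could start from such a reference set and add that user's own measured spectra as further reference rows.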

The criteria for detecting an object or a feature of an object may include one or more threshold conditions. If analysis of the data acquired by a sensor (e.g., a camera or a photodetector) indicates that a threshold condition is passed, the spectral inspection system may provide a signal indicating detection of the presence of the object in the surrounding environment. The threshold conditions may involve quantitative and/or qualitative measures. For example, a threshold condition may include a score or percentage associated with the likelihood that the object and/or feature is present. The spectral inspection system may compare a score calculated from the sensor data with a threshold score. If the score is higher than the threshold level, the light field system may signal detection of the presence of the object or object feature. In some other embodiments, the spectral inspection system may signal the absence of the object or feature if the score is below the threshold.
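The score-versus-threshold comparison described above can be sketched in a few lines; the threshold value and return labels are illustrative assumptions:

```python
def detection_signal(score, threshold=0.8):
    """Map a detection score to a presence/absence signal.

    The score is assumed to come from upstream image or spectral
    analysis (e.g., a classifier probability in [0, 1]); the 0.8
    threshold is an arbitrary example, not a value from the patent.
    """
    return "present" if score >= threshold else "absent"
```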

It will be appreciated that each of the processes, methods, and algorithms described herein and/or depicted in the figures may be embodied in, and fully or partially automated by, code modules executed by one or more physical computing systems, hardware computer processors, or application-specific circuitry, and/or by electronic hardware configured to execute specific and particular computer instructions. The code modules may be compiled and linked into an executable program, installed in a dynamic-link library, or written in an interpreted programming language. In some embodiments, particular operations and methods may be performed by circuitry that is specific to a given function. In some embodiments, the code modules may be executed by the controller (844) (FIG. 5) and/or by hardware in the cloud (46) (e.g., the server (110)).

Further, certain embodiments of the functionality of the present disclosure are sufficiently mathematically, computationally, or technically complex that application-specific hardware or one or more physical computing devices (utilizing appropriate specialized executable instructions) may be necessary to perform the functionality (for example, due to the volume or complexity of the calculations involved) or to provide results substantially in real time. For example, a video may include many frames, each frame having millions of pixels, and specifically programmed computer hardware is necessary to process the video data to provide a desired image processing task or application in a commercially reasonable amount of time.

Code modules or any type of data may be stored on any type of non-transitory computer-readable medium, such as physical computer storage including hard drives, solid state memory, random access memory (RAM), read only memory (ROM), optical discs, volatile or non-volatile storage, and combinations of the same and/or the like. In some embodiments, the non-transitory computer-readable medium may be part of one or more of the local processing and data module (70, FIG. 2C), the remote processing module (72, FIG. 2D), and the remote data repository (74, FIG. 2D). The methods and modules (or data) may also be transmitted as generated data signals (e.g., as part of a carrier wave or other analog or digital propagated signal) on a variety of computer-readable transmission media, including wireless-based and wired/cable-based media, and may take a variety of forms (e.g., as part of a single or multiplexed analog signal, or as multiple discrete digital packets or frames). The results of the disclosed processes or process steps may be stored, persistently or otherwise, in any type of non-transitory tangible computer storage, or may be communicated via a computer-readable transmission medium.

Any processes, blocks, states, steps, or functionalities in the flow diagrams described herein and/or depicted in the attached figures should be understood as potentially representing code modules, segments, or portions of code that include one or more executable instructions for implementing specific functions (e.g., logical or arithmetical functions) or steps in the process. The various processes, blocks, states, steps, or functionalities can be combined, rearranged, added to, deleted from, modified, or otherwise changed from the illustrative examples provided herein. In some embodiments, additional or different computing systems or code modules may perform some or all of the functionalities described herein. The methods and processes described herein are also not limited to any particular sequence, and the blocks, steps, or states relating thereto can be performed in other sequences that are appropriate, for example, in serial, in parallel, or in some other manner. Tasks or events may be added to, or removed from, the disclosed example embodiments. Moreover, the separation of various system components in the embodiments described herein is for illustrative purposes and should not be understood as requiring such separation in all embodiments. It should be understood that the described program components, methods, and systems can generally be integrated together in a single computer product or packaged into multiple computer products. Many implementation variations are possible.

Various exemplary examples of the invention are described herein. Reference is made to these examples in a non-limiting sense; they are provided to illustrate more broadly applicable aspects of the invention. Various changes may be made to the invention described, and equivalents may be substituted, without departing from the spirit and scope of the invention. Many modifications may be made to adapt a particular situation, material, composition of matter, process, process act(s), or step(s) to the objective(s), spirit, or scope of the present invention. Further, as will be appreciated by those with skill in the art, each of the individual variations described and illustrated herein has discrete components and features that may be readily separated from, or combined with, the features of any of the other several embodiments without departing from the scope or spirit of the present invention. All such modifications are intended to be within the scope of the claims associated with this disclosure.

The invention includes methods that may be performed using the subject devices. The methods may comprise the act of providing such a suitable device. Such provision may be performed by the end user. In other words, the "providing" act merely requires the end user to obtain, access, approach, position, set up, activate, power up, or otherwise act to provide the requisite device in the subject method. The methods recited herein may be carried out in any order of the recited events that is logically possible, as well as in the recited order of events.

Exemplary aspects of the invention, together with details regarding material selection and manufacture, have been set forth above. As for other details of the present invention, these may be appreciated in connection with the above-referenced patents and publications, as well as through what is generally known or appreciated by those with skill in the art. The same may hold true with respect to method-based aspects of the invention in terms of additional acts as commonly or logically employed.

In addition, though the invention has been described in reference to several examples optionally incorporating various features, the invention is not to be limited to that which is described or indicated as contemplated with respect to each variation of the invention. Various changes may be made to the invention described, and equivalents may be substituted (whether recited herein or not included for the sake of brevity), without departing from the spirit and scope of the invention. In addition, where a range of values is provided, it is understood that every intervening value between the upper and lower limits of that range, and any other stated or intervening value in that stated range, is encompassed within the invention.

Also, it is contemplated that any optional feature of the inventive variations described may be set forth and claimed independently, or in combination with any one or more of the features described herein. Reference to a singular item includes the possibility that there are plural of the same items present. More specifically, as used herein and in the claims associated hereto, the singular forms "a," "an," "said," and "the" include plural referents unless specifically stated otherwise. In other words, use of the articles allows for "at least one" of the subject item in the description above as well as in the claims associated with this disclosure. It is further noted that such claims may be drafted to exclude any optional element. As such, this statement is intended to serve as antecedent basis for use of such exclusive terminology as "solely," "only," and the like in connection with the recitation of claim elements, or the use of a "negative" limitation.

Without the use of such exclusive terminology, the term "comprising" in claims associated with this disclosure shall allow for the inclusion of any additional element, irrespective of whether a given number of elements is enumerated in such claims, or whether the addition of a feature could be regarded as transforming the nature of an element set forth in such claims. Except as specifically defined herein, all technical and scientific terms used herein are to be given as broad a commonly understood meaning as possible while maintaining claim validity.

The breadth of the present invention is not to be limited to the examples provided and/or the subject specification, but rather only by the scope of the claim language associated with this disclosure.

Claims (17)

1. A wearable spectral inspection system, comprising: a head-mounted display system removably coupled to a head of a user; at least one eye-tracking camera configured to detect a gaze of the user; one or more light sources coupled to the head-mounted display system and configured to emit light having at least two different wavelengths into an illuminated field of view in substantially the same direction as the detected gaze; one or more electromagnetic radiation detectors coupled to the head-mounted component and configured to receive light reflected from a target object within the illuminated field of view; a controller operatively coupled to the one or more light sources and the one or more electromagnetic radiation detectors, the controller configured to cause the one or more light sources to emit pulses of light while also causing the one or more electromagnetic radiation detectors to detect levels of light absorption related to the emitted light pulses and the light reflected from the target object; an absorption database having light absorption properties of at least one material; and a graphics processor unit for displaying output to the user.

2. The system of claim 1, wherein the one or more light sources comprise a plurality of light-emitting diodes.

3. The system of claim 1, wherein the one or more light sources are configured to emit electromagnetic radiation at two or more predetermined wavelengths.

4. The system of claim 3, wherein the one or more light sources are configured to emit electromagnetic radiation at a first wavelength of about 660 nanometers and a second wavelength of about 940 nanometers.

5. The system of claim 3, wherein the one or more light sources are configured to sequentially emit electromagnetic radiation of the two predetermined wavelengths.

6. The system of claim 3, wherein the one or more light sources are configured to simultaneously emit electromagnetic radiation of the two predetermined wavelengths.

7. The system of claim 1, wherein the controller is further configured to cause the one or more light sources to emit in a cyclic pattern of the first wavelength on, then the second wavelength on, then both the first and second wavelengths off, such that the one or more electromagnetic radiation detectors detect the first and second wavelengths separately.

8. The system of claim 1, wherein the controller is configured to calculate a ratio of a first-wavelength light measurement to a second-wavelength light measurement, and wherein the system is configured to convert the ratio into a tissue property based on the absorption database.

9. The system of claim 8, wherein the controller is operatively coupled to an optical element that is coupled to the head-mounted component and viewable by the user, wherein the system is configured to provide output based on the tissue property, the output being viewable by the user through the optical element.

10. The system of claim 1, wherein the one or more electromagnetic radiation detectors comprise a device selected from the group consisting of a photodiode and a photodetector.

11. The system of claim 1, wherein the one or more electromagnetic radiation detectors comprise a digital image sensor.

12. The system of claim 11, wherein the digital image sensor comprises a plurality of pixels, wherein the controller is configured to automatically detect a subset of the pixels receiving light reflected after encountering a predetermined tissue property, and to produce an output showing the location of the subset of pixels, the output indicating the predetermined tissue property.

13. The system of claim 1, wherein the head-mounted component further comprises an inertial measurement unit positioning system.

14. The system of claim 13, wherein the inertial measurement system determines a pose orientation of the head of the user.

15. The system of claim 14, wherein the illuminated field of view is at least as wide as the pose orientation.

16. The system of claim 1, wherein the head-mounted display system comprises a waveguide stack configured to output light having selectively variable levels of wavefront divergence.

17. The system of claim 16, wherein the waveguide stack comprises a waveguide having optical power.
CN201780071647.5A 2016-09-22 2017-09-22 Wearable Spectral Inspection System Active CN109997174B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310647311.6A CN116649967A (en) 2016-09-22 2017-09-22 Wearable spectrum inspection system

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201662398454P 2016-09-22 2016-09-22
US62/398,454 2016-09-22
PCT/US2017/053067 WO2018057962A1 (en) 2016-09-22 2017-09-22 Augmented reality spectroscopy

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202310647311.6A Division CN116649967A (en) 2016-09-22 2017-09-22 Wearable spectrum inspection system

Publications (2)

Publication Number Publication Date
CN109997174A CN109997174A (en) 2019-07-09
CN109997174B true CN109997174B (en) 2023-06-02

Family

ID=61621033

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202310647311.6A Pending CN116649967A (en) 2016-09-22 2017-09-22 Wearable spectrum inspection system
CN201780071647.5A Active CN109997174B (en) 2016-09-22 2017-09-22 Wearable Spectral Inspection System

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202310647311.6A Pending CN116649967A (en) 2016-09-22 2017-09-22 Wearable spectrum inspection system

Country Status (9)

Country Link
US (4) US10558047B2 (en)
EP (2) EP4455840A3 (en)
JP (3) JP7148501B2 (en)
KR (4) KR102650592B1 (en)
CN (2) CN116649967A (en)
AU (2) AU2017331284B2 (en)
CA (1) CA3037725A1 (en)
IL (3) IL307292A (en)
WO (1) WO2018057962A1 (en)

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ES2992065T3 (en) 2016-08-16 2024-12-09 Insight Medical Systems Inc Sensory augmentation systems in medical procedures
EP4455840A3 (en) 2016-09-22 2025-01-01 Magic Leap, Inc. Augmented reality spectroscopy
WO2018211494A1 (en) * 2017-05-15 2018-11-22 Real View Imaging Ltd. System with multiple displays and methods of use
US20190012835A1 (en) * 2017-07-07 2019-01-10 Microsoft Technology Licensing, Llc Driving an Image Capture System to Serve Plural Image-Consuming Processes
US10730181B1 (en) * 2017-12-27 2020-08-04 X Development Llc Enhancing robot learning
US11017317B2 (en) 2017-12-27 2021-05-25 X Development Llc Evaluating robot learning
US11475291B2 (en) 2017-12-27 2022-10-18 X Development Llc Sharing learned information among robots
WO2019173283A1 (en) 2018-03-05 2019-09-12 Marquette University Method and apparatus for non-invasive hemoglobin level prediction
WO2019183399A1 (en) 2018-03-21 2019-09-26 Magic Leap, Inc. Augmented reality system and method for spectroscopic analysis
WO2019200362A1 (en) * 2018-04-12 2019-10-17 The Regents Of The University Of California Wearable multi-modal bio-sensing system
CN108953888B (en) * 2018-07-17 2019-06-28 东北大学 A multiple shock-absorbing gimbal device based on somatosensory interaction
EP3860424B1 (en) * 2018-10-03 2024-07-17 Verily Life Sciences LLC Dynamic illumination to identify tissue type
US10914945B2 (en) 2018-11-09 2021-02-09 Facebook Technologies, Llc Inconspicuous near-eye electrical circuits
US10792122B2 (en) * 2018-12-23 2020-10-06 Taiwan Main Orthopaedic Biotechnology Co., Ltd. Object developing and calibrating method in a surgical environment
US11513003B2 (en) 2019-08-07 2022-11-29 Apple Inc. Electronic devices with beam-steered infrared light sensing
WO2021092314A1 (en) 2019-11-06 2021-05-14 Hes Ip Holdings, Llc System and method for displaying an object with depths
EP4103088A4 (en) * 2020-02-10 2024-03-20 Insight Medical Systems, Inc. Systems and methods for sensory augmentation in medical procedures
KR20220105698A (en) 2021-01-20 2022-07-28 삼성전자주식회사 Augmented reality glasses lenses, and augmented reality glasses and system including the same
US20220280110A1 (en) * 2021-03-04 2022-09-08 Hi Llc Data Aggregation and Power Distribution in Time Domain-Based Optical Measurement Systems
DE102022119578A1 (en) * 2022-08-04 2024-02-15 Ams-Osram International Gmbh OPTICAL ASSEMBLY FOR DETECTING LASER RADIATION REFLECTED BY THE EYE
US12155421B2 (en) * 2022-05-18 2024-11-26 Rohde & Schwarz Gmbh & Co. Kg Augmented reality spectrum monitoring system
US12436386B2 (en) * 2022-06-23 2025-10-07 Rockwell Collins, Inc. Head wearable display device with image monitoring comprising at least one image monitor sensor optically coupled to a waveguide
US12376763B2 (en) * 2022-06-29 2025-08-05 Apple Inc. Non-contact respiration sensing
JP7742866B2 (en) 2023-07-24 2025-09-22 プライムプラネットエナジー&ソリューションズ株式会社 Battery material manufacturing method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102955255A (en) * 2011-09-26 2013-03-06 微软公司 Integrated eye tracking and display system
WO2014015378A1 (en) * 2012-07-24 2014-01-30 Nexel Pty Ltd. A mobile computing device, application server, computer readable storage medium and system for calculating a vitality indicia, detecting an environmental hazard, vision assistance and detecting disease
CN103827728A (en) * 2011-07-18 2014-05-28 谷歌公司 Identify objects of interest using optical occlusion
WO2015175681A1 (en) * 2014-05-15 2015-11-19 Fenwal, Inc. Head-mounted display device for use in a medical facility

Family Cites Families (76)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6222525B1 (en) 1992-03-05 2001-04-24 Brad A. Armstrong Image controllers with sheet connected sensors
US5377674A (en) * 1992-05-08 1995-01-03 Kuestner; J. Todd Method for non-invasive and in-vitro hemoglobin concentration measurement
US5701902A (en) * 1994-09-14 1997-12-30 Cedars-Sinai Medical Center Spectroscopic burn injury evaluation apparatus and method
US5670988A (en) 1995-09-05 1997-09-23 Interlink Electronics, Inc. Trigger operated electronic device
US6305804B1 (en) * 1999-03-25 2001-10-23 Fovioptics, Inc. Non-invasive measurement of blood component using retinal imaging
AT409042B (en) 1999-11-24 2002-05-27 Life Optics Handel Und Vertrie sehhilfe
US6554444B2 (en) * 2000-03-13 2003-04-29 Kansai Technology Licensing Organization Co., Ltd. Gazing point illuminating device
JP2002150803A (en) * 2000-03-13 2002-05-24 Kansai Tlo Kk Visual axis lighting device and operation lighting system
GR1004180B (en) * 2000-03-28 2003-03-11 Foundation for Research and Technology Hellas (FORTH) Method and system for characterization and mapping of tissue lesions
GB0021988D0 (en) 2000-09-07 2000-10-25 Nokia Mobile Phones Ltd Management of portable radiotelephones
US8328420B2 (en) * 2003-04-22 2012-12-11 Marcio Marc Abreu Apparatus and method for measuring biologic parameters
US20110077548A1 (en) * 2004-04-01 2011-03-31 Torch William C Biosensors, communicators, and controllers monitoring eye movement and methods for using them
WO2006017771A1 (en) 2004-08-06 2006-02-16 University Of Washington Variable fixation viewing distance scanned light displays
JP4834977B2 (en) 2004-11-12 2011-12-14 コニカミノルタホールディングス株式会社 See-through type head mounted display
JP5383190B2 (en) * 2005-08-08 2014-01-08 コーニング インコーポレイテッド Method for increasing the readout speed of a CCD detector
US8696113B2 (en) 2005-10-07 2014-04-15 Percept Technologies Inc. Enhanced optical and perceptual digital eyewear
US20070081123A1 (en) 2005-10-07 2007-04-12 Lewis Scott W Digital eyewear
US11428937B2 (en) 2005-10-07 2022-08-30 Percept Technologies Enhanced optical and perceptual digital eyewear
US20070270673A1 (en) * 2005-12-06 2007-11-22 Abrams Daniel J Ocular parameter sensing for cerebral perfusion monitoring and other applications
JP4961914B2 (en) 2006-09-08 2012-06-27 ソニー株式会社 Imaging display device and imaging display method
EP3415090B1 (en) * 2007-09-05 2025-02-12 Sensible Medical Innovations Ltd. Method, system and apparatus for using electromagnetic radiation for monitoring a tissue of a user
US8224020B2 (en) 2007-11-29 2012-07-17 Kabushiki Kaisha Toshiba Appearance inspection apparatus, appearance inspection system, and appearance inspection method
JP5537008B2 (en) 2007-11-29 2014-07-02 株式会社東芝 Appearance inspection device
JP2009157634A (en) 2007-12-26 2009-07-16 Fuji Xerox Co Ltd Irradiation control device, irradiation control program, and visual line analysis system
US20100113940A1 (en) * 2008-01-10 2010-05-06 The Ohio State University Research Foundation Wound goggles
US8443146B2 (en) 2008-09-18 2013-05-14 International Business Machines Corporation Techniques for cache injection in a processor system responsive to a specific instruction sequence
US9304319B2 (en) 2010-11-18 2016-04-05 Microsoft Technology Licensing, Llc Automatic focus improvement for augmented reality displays
JP5405445B2 (en) * 2010-12-17 2014-02-05 富士フイルム株式会社 Endoscope device
NZ725592A (en) 2010-12-24 2018-05-25 Magic Leap Inc An ergonomic head mounted display device and optical system
US10156722B2 (en) 2010-12-24 2018-12-18 Magic Leap, Inc. Methods and systems for displaying stereoscopy with a freeform optical system with addressable focus for virtual and augmented reality
CN103635891B (en) 2011-05-06 2017-10-27 奇跃公司 Massive simultaneous remote digital presence world
US9256711B2 (en) * 2011-07-05 2016-02-09 Saudi Arabian Oil Company Systems, computer medium and computer-implemented methods for providing health information to employees via augmented reality display
US8542351B2 (en) 2011-09-01 2013-09-24 Nike, Inc. Coating inspection device
WO2013049861A1 (en) 2011-09-29 2013-04-04 Magic Leap, Inc. Tactile glove for human-computer interaction
EP2771877B1 (en) 2011-10-28 2017-10-11 Magic Leap, Inc. System and method for augmented and virtual reality
BR112014012615A2 (en) 2011-11-23 2017-06-13 Magic Leap Inc three-dimensional augmented reality and virtual reality display system
AU2013243380B2 (en) 2012-04-05 2017-04-20 Magic Leap, Inc. Wide-field of view (FOV) imaging devices with active foveation capability
US20140039309A1 (en) 2012-04-26 2014-02-06 Evena Medical, Inc. Vein imaging systems and methods
US20140046291A1 (en) 2012-04-26 2014-02-13 Evena Medical, Inc. Vein imaging systems and methods
US9671566B2 (en) 2012-06-11 2017-06-06 Magic Leap, Inc. Planar waveguide apparatus with diffraction element(s) and system employing same
EP4130820B1 (en) 2012-06-11 2024-10-16 Magic Leap, Inc. Multiple depth plane three-dimensional display using a wave guide reflector array projector
US8971978B2 (en) * 2012-08-21 2015-03-03 Google Inc. Contact lens with integrated pulse oximeter
IL221863A (en) * 2012-09-10 2014-01-30 Elbit Systems Ltd Digital system for surgical video capturing and display
US9740006B2 (en) 2012-09-11 2017-08-22 Magic Leap, Inc. Ergonomic head mounted display device and optical system
US9039180B2 (en) * 2012-12-11 2015-05-26 Elwha LLC Self-aligning unobtrusive active eye interrogation
KR102141992B1 (en) 2013-01-15 2020-08-06 매직 립, 인코포레이티드 Ultra-high resolution scanning fiber display
JP6055688B2 (en) * 2013-01-31 2016-12-27 日本光電工業株式会社 Biosignal measurement system, biosignal measurement device, and control program for biosignal measurement device
EP2967322A4 (en) * 2013-03-11 2017-02-08 Magic Leap, Inc. System and method for augmented and virtual reality
EP2973532A4 (en) 2013-03-15 2017-01-18 Magic Leap, Inc. Display system and method
CA2894133C (en) * 2013-03-15 2016-11-01 Synaptive Medical (Barbados) Inc. Surgical imaging systems
EP2979128B1 (en) * 2013-03-25 2017-10-25 Intel Corporation Method for displaying an image projected from a head-worn display with multiple exit pupils
US10262462B2 (en) 2014-04-18 2019-04-16 Magic Leap, Inc. Systems and methods for augmented and virtual reality
US9874749B2 (en) * 2013-11-27 2018-01-23 Magic Leap, Inc. Virtual and augmented reality systems and methods
CN103479363B (en) 2013-09-30 2015-03-18 Shenzhen Breo Technology Co., Ltd. Method and system for measuring oxyhemoglobin saturation in blood
US9470906B2 (en) 2013-10-16 2016-10-18 Magic Leap, Inc. Virtual or augmented reality headsets having adjustable interpupillary distance
US10568522B2 (en) * 2013-10-23 2020-02-25 The Trustees Of Dartmouth College Surgical vision augmentation system
US20150257735A1 (en) * 2013-10-24 2015-09-17 Evena Medical, Inc. Systems and methods for displaying medical images
EP3075090B1 (en) 2013-11-27 2023-04-05 Magic Leap, Inc. Virtual and augmented reality systems and methods
US9857591B2 (en) 2014-05-30 2018-01-02 Magic Leap, Inc. Methods and system for creating focal planes in virtual and augmented reality
FR3013906B1 (en) 2013-11-28 2017-04-07 Commissariat Energie Atomique RADIO ANTENNA INTEGRATED IN MEANDERS
WO2015094191A1 (en) * 2013-12-17 2015-06-25 Intel Corporation Controlling vision correction using eye tracking and depth detection
EP4071537B1 (en) 2014-01-31 2024-07-10 Magic Leap, Inc. Multi-focal display system
CN111552079B (en) 2014-01-31 2022-04-15 奇跃公司 Multi-focus display system and method
US10203762B2 (en) 2014-03-11 2019-02-12 Magic Leap, Inc. Methods and systems for creating virtual and augmented reality
JP6550460B2 (en) 2014-05-09 2019-07-24 グーグル エルエルシー System and method for identifying eye signals, and continuous biometric authentication
EP3149539B1 (en) 2014-05-30 2025-04-30 Magic Leap, Inc. Virtual or augmented reality apparatus
JP6468287B2 (en) 2014-06-05 2019-02-13 株式会社ニコン Scanning projection apparatus, projection method, scanning apparatus, and surgery support system
JP6415901B2 (en) * 2014-08-27 2018-10-31 株式会社東芝 Electronic device and control method
WO2016054092A1 (en) * 2014-09-29 2016-04-07 Magic Leap, Inc. Architectures and methods for outputting different wavelength light out of waveguides
EP3210095B1 (en) * 2014-10-21 2019-12-11 Signify Holding B.V. System, method and computer program for hands-free configuration of a luminous distribution
IL297803B2 (en) 2015-01-26 2023-11-01 Magic Leap Inc Virtual and augmented reality systems and methods having improved diffractive grating structures
CN107851176A (en) 2015-02-06 2018-03-27 阿克伦大学 Optical imaging system and method thereof
US10180734B2 (en) 2015-03-05 2019-01-15 Magic Leap, Inc. Systems and methods for augmented reality
NZ773815A (en) 2015-03-16 2022-07-01 Magic Leap Inc Methods and systems for diagnosing and treating health ailments
USD758367S1 (en) 2015-05-14 2016-06-07 Magic Leap, Inc. Virtual reality headset
EP4455840A3 (en) 2016-09-22 2025-01-01 Magic Leap, Inc. Augmented reality spectroscopy

Also Published As

Publication number Publication date
CA3037725A1 (en) 2018-03-29
KR20240042181A (en) 2024-04-01
IL293014B1 (en) 2023-11-01
EP3516630B1 (en) 2024-09-18
EP3516630A4 (en) 2020-06-03
CN109997174A (en) 2019-07-09
JP2019529917A (en) 2019-10-17
JP7148501B2 (en) 2022-10-05
KR102650592B1 (en) 2024-03-22
US11460705B2 (en) 2022-10-04
AU2017331284A1 (en) 2019-04-11
KR102786412B1 (en) 2025-03-25
IL293014B2 (en) 2024-03-01
WO2018057962A1 (en) 2018-03-29
EP4455840A3 (en) 2025-01-01
CN116649967A (en) 2023-08-29
US20220026717A1 (en) 2022-01-27
JP2022192065A (en) 2022-12-28
KR20190051043A (en) 2019-05-14
IL265402B (en) 2022-06-01
JP7716446B2 (en) 2025-07-31
US10558047B2 (en) 2020-02-11
US20180081179A1 (en) 2018-03-22
IL265402A (en) 2019-05-30
KR102266343B1 (en) 2021-06-17
US20220404626A1 (en) 2022-12-22
EP4455840A2 (en) 2024-10-30
US11754844B2 (en) 2023-09-12
AU2022202370A1 (en) 2022-04-28
KR20210072157A (en) 2021-06-16
AU2017331284B2 (en) 2022-01-13
US11079598B2 (en) 2021-08-03
IL293014A (en) 2022-07-01
JP2023143907A (en) 2023-10-06
EP3516630A1 (en) 2019-07-31
KR20230072515A (en) 2023-05-24
IL307292A (en) 2023-11-01
US20200166760A1 (en) 2020-05-28

Similar Documents

Publication Publication Date Title
JP7716446B2 (en) Augmented Reality Spectroscopy
JP2019529917A5 (en)
KR102630754B1 (en) Augmented Reality Pulse Oximetry
US12174068B2 (en) Augmented reality system and method for spectroscopic analysis

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant