Priority Claim
This application is a non-provisional of, and claims priority to and the benefit of, U.S. Provisional Patent Application No. 62/663,689, filed on April 27, 2018, the entirety of which is incorporated herein by reference. This application is also a continuation-in-part of U.S. Patent Application No. 15/814,127, filed on November 15, 2017, which claims priority to and the benefit of U.S. Provisional Patent Application No. 62/489,289, filed on April 24, 2017, and U.S. Provisional Patent Application No. 62/489,876, filed on April 25, 2017, the entireties of which are incorporated herein by reference.
The present invention generally relates to a stereoscopic visualization camera and platform. The stereoscopic visualization camera may be referred to as a digital stereo microscope ("DSM"). The example camera and platform are configured to integrate microscope optical elements and video sensors into a self-contained head unit that is significantly smaller, lighter, and more maneuverable than prior-art microscopes, such as the surgical loupes 100 of FIG. 1 and the surgical microscope 200 of FIG. 2. The example camera is configured to transmit a stereoscopic video signal to one or more television monitors, projectors, holographic devices, smart glasses, virtual reality devices, or other visual display devices within a surgical environment.
A monitor or other visual display device may be positioned within the surgical environment so that it is easily within a surgeon's line of sight while the surgeon performs surgery on a patient. This flexibility enables the surgeon to place display monitors based on personal preference or habit. In addition, the flexibility and slim profile of the stereoscopic visualization camera disclosed herein reduce the area consumed above a patient. Altogether, compared to the surgical microscope 200 discussed above, the stereoscopic visualization camera and monitors (e.g., a stereoscopic visualization platform) enable a surgeon and surgical team to perform complex microsurgical procedures on a patient without their movement being dictated or restricted. The example stereoscopic visualization platform accordingly operates as an extension of the surgeon's eyes, enabling the surgeon to perform masterful microsurgery without contending with the stresses, constraints, and limitations imposed by previously known visualization systems.
The present invention generally refers herein to microsurgery. The example stereoscopic visualization camera may be used in virtually any microsurgical procedure, including, for example, cranial surgery, brain surgery, neurosurgery, spinal surgery, ophthalmic surgery, corneal transplantation, orthopedic surgery, ear-nose-throat surgery, oral surgery, plastic and reconstructive surgery, or general surgery.
The present invention also refers herein to a target site, scene, or field of view. As used herein, a target site or field of view includes an object (or a portion of an object) that is recorded or otherwise imaged by the example stereoscopic visualization camera. Generally, the target site, scene, or field of view is located a working distance away from a main objective assembly of the example stereoscopic visualization camera and is aligned with the camera. The target site may include a patient's biological tissue, bone, muscle, skin, or combinations thereof. In these instances, the target site may be three-dimensional by virtue of having a depth component corresponding to the progression of the patient's anatomy. The target site may also include one or more templates used for calibration or verification of the example stereoscopic visualization camera. The templates may be two-dimensional, such as a graphic design on paper (or a plastic sheet), or three-dimensional, such as to approximate a patient's anatomy in a particular region.
Reference is also made throughout to an x-direction, a y-direction, a z-direction, and a tilt direction. The z-direction is along an axis from the example stereoscopic visualization camera to the target site and generally refers to depth. The x-direction and the y-direction are in a plane incident to the z-direction and comprise a plane of the target site. The x-direction is along an axis that is 90° from the axis of the y-direction. Movement along the x-direction and/or the y-direction refers to in-plane movement and may refer to movement of the example stereoscopic visualization camera, movement of optical elements within the example stereoscopic visualization camera, and/or movement of the target site.
The tilt direction corresponds to movement along Euler angles (e.g., a yaw axis, a pitch axis, and a roll axis) relative to the x-direction, the y-direction, and/or the z-direction. For example, a fully aligned lens has substantially a 0° tilt with respect to the x-direction, the y-direction, and/or the z-direction. In other words, a face of the lens is at 90° to, or perpendicular to, light traveling along the z-direction. In addition, edges of the lens (if the lens has a rectangular shape) are parallel to the x-direction and the y-direction. The lens and/or an optical image sensor may be tilted through yaw, pitch, and/or roll movement. For example, a lens and/or optical image sensor may be tilted along a pitch axis relative to the z-direction so that it faces upward or downward. Light traveling along the z-direction then contacts the face of the lens (pitched upward or downward) at a non-perpendicular angle. Tilting a lens and/or optical image sensor along a yaw, pitch, or roll axis enables, for example, a focal point or zoom repeat point ("ZRP") to be adjusted.
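The pitch-tilt geometry described above can be made concrete with a short numerical sketch. The function below is illustrative only (its name and choice of rotation axis are assumptions, not part of the disclosure): it rotates a lens-face normal about a pitch axis and reports the resulting angle of incidence for light traveling along the z-direction, showing that a pitched lens receives that light at a non-perpendicular angle.

```python
import math

def incidence_angle_deg(pitch_deg: float) -> float:
    """Angle (degrees) between z-direction light and the lens-face normal
    after the lens is pitched by pitch_deg.

    At 0 degrees pitch the face normal is anti-parallel to the incoming
    light, so the light strikes the face perpendicularly (incidence 0).
    """
    pitch = math.radians(pitch_deg)
    # Rotate the face normal (0, 0, -1) about the x-axis by the pitch angle.
    normal = (0.0, -math.sin(pitch), -math.cos(pitch))
    light = (0.0, 0.0, 1.0)  # propagation along +z
    # Angle measured from the surface normal.
    cos_i = -sum(l * n for l, n in zip(light, normal))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_i))))
```

For a rigid rotation like this, the incidence angle simply equals the pitch angle, which is why even a small residual tilt shifts where a focal point or ZRP lands.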
I. Example Stereoscopic Visualization Camera
FIGS. 3 and 4 show diagrams of perspective views of a stereoscopic visualization camera 300, according to an example embodiment of the present invention. The example camera 300 includes a housing 302 configured to enclose optical elements, lens motors (e.g., actuators), and signal processing circuitry. The camera 300 has a width (along an x-axis) between 15 and 28 centimeters (cm), preferably approximately 22 cm. In addition, the camera 300 has a length (along a y-axis) between 15 and 32 cm, preferably approximately 25 cm. Further, the camera 300 has a height (along a z-axis) between 10 and 20 cm, preferably approximately 15 cm. The weight of the camera 300 is between 3 and 7 kilograms (kg), preferably approximately 3.5 kg.
The camera 300 also includes control arms 304a and 304b (e.g., operating handles) configured to control magnification level, focus, and other microscope features. The control arms 304a and 304b may include respective controls 305a and 305b for activating or selecting particular features. For example, the control arms 304a and 304b may include controls 305a and 305b for selecting a fluorescence mode, adjusting the amount/type of light projected onto a target site, and controlling a display output signal (e.g., selecting between 1080p or 4K output and/or stereoscopic output). In addition, the controls 305a and/or 305b may be used to initiate and/or perform a calibration procedure and/or to move a robotic arm connected to the stereoscopic visualization camera 300. In some instances, the controls 305a and 305b may include the same buttons and/or features. In other instances, the controls 305a and 305b may include different features. Further, the control arms 304a and 304b may also be configured as grips to enable an operator to position the stereoscopic visualization camera 300.
Each control arm 304 is connected to the housing 302 via a rotatable post 306b, as shown in FIG. 3. This connection enables the control arms 304 to rotate relative to the housing 302. This rotation gives a surgeon the flexibility to position the control arms 304 as desired, further enhancing the adaptability of the stereoscopic visualization camera 300 to work in concert with the performance of a surgery.
Although the example camera 300 shown in FIGS. 3 and 4 includes the two control arms 304a and 304b, it should be appreciated that the camera 300 may include only one control arm or no control arms. In instances where the stereoscopic visualization camera 300 does not include a control arm, controls may be integrated with the housing 302 and/or provided via a remote control.
FIG. 4 shows a bottom-up perspective view of a rear side of the stereoscopic visualization camera 300, according to an example embodiment of the present invention. The stereoscopic visualization camera 300 includes a mounting bracket 402 configured to connect to a support. As illustrated in more detail in FIGS. 5 and 6, the support may include an arm with one or more joints to provide significant maneuverability. The arm may be connected to a movable cart or secured to a wall or ceiling.
The stereoscopic visualization camera 300 also includes a power port 404 configured to receive a power adapter. Power may be received from an AC outlet and/or a battery on a cart. In some instances, the stereoscopic visualization camera 300 may include an internal battery to facilitate cordless operation. In these instances, the power port 404 may be used to charge the battery. In alternative embodiments, the power port 404 may be integrated with the mounting bracket 402 such that the stereoscopic visualization camera 300 receives power via wires (or other conductive routing materials) within the support.
FIG. 4 also shows that the stereoscopic visualization camera 300 may include a data port 406. The example data port 406 may include any type of port, including, for example, an Ethernet interface, a high-definition multimedia interface ("HDMI") interface, a universal serial bus ("USB") interface, a serial digital interface ("SDI"), a digital optical interface, an RS-232 serial communication interface, etc. The data port 406 is configured to provide a communicative connection between the stereoscopic visualization camera 300 and wires routed to one or more computing devices, servers, recording devices, and/or display devices. The communicative connection may transmit stereoscopic video signals or two-dimensional video signals for further processing, storage, and/or display. The data port 406 may also enable control signals to be sent to the stereoscopic visualization camera 300. For instance, an operator at a connected computer (e.g., a laptop, desktop, and/or tablet computer) may transmit control signals to the stereoscopic visualization camera 300 to direct operation, perform calibration, or change an output display setting.
In some embodiments, the data port 406 may be replaced (and/or supplemented) with a wireless interface. For example, the stereoscopic visualization camera 300 may transmit stereoscopic display signals to one or more display devices via Wi-Fi. The use of a wireless interface in combination with an internal battery enables the stereoscopic visualization camera 300 to be wire-free, thereby further improving maneuverability within a surgical environment.
The stereoscopic visualization camera 300 shown in FIG. 4 also includes a front working distance main objective lens 408 of a main objective assembly. The example lens 408 is the start of the optical path within the stereoscopic visualization camera 300. Light from a light source internal to the stereoscopic visualization camera 300 is transmitted through the lens 408 to a target site. In addition, light reflected from the target site is received in the lens 408 and passed to downstream optical elements.

II. Example Maneuverability of the Stereoscopic Visualization Camera
FIGS. 5 and 6 show diagrams of the stereoscopic visualization camera 300 used within a microsurgical environment 500, according to example embodiments of the present invention. As illustrated, the small footprint and maneuverability of the stereoscopic visualization camera 300 (especially when used in conjunction with a multi-degree-of-freedom arm) enable flexible positioning relative to a patient 502. The portion of the patient 502 within view of the stereoscopic visualization camera 300 includes a target site 503. A surgeon 504 can position the stereoscopic visualization camera 300 in virtually any orientation while leaving ample surgical space above the patient 502 (lying in a supine position). The stereoscopic visualization camera 300 is accordingly minimally intrusive (or non-intrusive), enabling the surgeon 504 to perform a life-changing microsurgical procedure without distraction or hindrance.
In FIG. 5, the stereoscopic visualization camera 300 is connected to a mechanical arm 506 via the mounting bracket 402. The arm 506 may include one or more rotatable or extendable joints with electromechanical brakes to facilitate easy repositioning of the stereoscopic visualization camera 300. To move the stereoscopic visualization camera 300, the surgeon 504 or an assistant 508 actuates brake releases on one or more joints of the arm 506. After the stereoscopic visualization camera 300 has been moved into a desired position, the brakes may be engaged to lock the joints of the arm 506 in place.
A significant feature of the stereoscopic visualization camera 300 is that it does not include eyepieces. This means that the stereoscopic visualization camera 300 does not have to be aligned with the eyes of the surgeon 504. This freedom enables the stereoscopic visualization camera 300 to be positioned and oriented in desirable positions that were not practical or possible with previously known surgical microscopes. In other words, the surgeon 504 can perform microsurgery using the best view for conducting the procedure, rather than being limited to merely the adequate view dictated by the eyepieces of a surgical microscope.
Returning to FIG. 5, the stereoscopic visualization camera 300 is connected via the mechanical arm 506, together with display monitors 512 and 514, to a cart 510 (collectively referred to as a stereoscopic visualization platform or stereoscopic robotic platform 516). In the illustrated configuration, the stereoscopic visualization platform 516 is self-contained and may be moved to any desired location within the microsurgical environment 500, including between surgical suites. The integrated platform 516 enables the stereoscopic visualization camera 300 to be moved and used on demand, without the time otherwise needed to configure the system by connecting the display monitors 512 and 514.
The display monitors 512 and 514 may include any type of display, including a high-definition television, an ultra-high-definition television, smart goggles, a projector, one or more computer screens, laptop computers, tablet computers, and/or smartphones. The display monitors 512 and 514 may be connected to mechanical arms to achieve flexible positioning similar to that of the stereoscopic visualization camera 300. In some instances, the display monitors 512 and 514 may include touchscreens to enable an operator to send commands to the stereoscopic visualization camera 300 and/or adjust a setting of a display.
In some embodiments, the cart 510 may include a computer 520. In these embodiments, the computer 520 may control a robotic mechanical arm connected to the stereoscopic visualization camera 300. Additionally or alternatively, the computer 520 may process video (or stereoscopic video) signals (e.g., a stream of images or frames) from the stereoscopic visualization camera 300 for display on the display monitors 512 and 514. For example, the computer 520 may combine or interleave left and right video signals from the stereoscopic visualization camera 300 to form a stereoscopic signal for displaying a stereoscopic image of a target site. The computer 520 may also be used to store video and/or stereoscopic video signals into a video file (stored to a memory) so that the performance of the surgery can be documented and played back. Further, the computer 520 may also send control signals to the stereoscopic visualization camera 300 to select settings and/or perform calibration.
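The disclosure does not specify how the left and right video signals are interleaved. One common scheme, row interleaving as consumed by passive line-polarized 3D displays, can be sketched as follows; the function and toy frame layout are illustrative assumptions, not the platform's actual implementation.

```python
def interleave_stereo(left, right):
    """Row-interleave two equally sized frames (lists of pixel rows):
    even rows come from the left view, odd rows from the right view.
    """
    if len(left) != len(right):
        raise ValueError("left and right frames must have the same height")
    return [left[r] if r % 2 == 0 else right[r] for r in range(len(left))]

# Toy 4x3 "frames": every left pixel is 10, every right pixel is 200.
left = [[10] * 3 for _ in range(4)]
right = [[200] * 3 for _ in range(4)]
stereo = interleave_stereo(left, right)  # rows alternate left, right, ...
```

A polarizing film on the monitor then routes even rows to one eye and odd rows to the other, reconstructing the stereoscopic image.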
In some embodiments, the microsurgical environment 500 of FIG. 5 includes an ophthalmic surgical procedure. In this embodiment, the mechanical arm 506 may be programmed to perform an orbital sweep of a patient's eye. Such a sweep enables the surgeon to examine a peripheral retina during vitreoretinal procedures. In contrast, with conventional optical microscopes, the only way a surgeon can view the peripheral retina is to push the side of the eye into the field of view using a technique known as scleral depression.
FIG. 6 shows a diagram of the microsurgical environment 500 with the patient 502 in a sitting position for a posterior-approach skull base neurosurgery. In the illustrated embodiment, the stereoscopic visualization camera 300 is placed into a horizontal position to face the back of the head of the patient 502. The mechanical arm 506 includes joints that enable the stereoscopic visualization camera 300 to be positioned as shown. In addition, the cart 510 includes the monitor 512, which can be aligned with the surgeon's natural viewing direction.
The absence of eyepieces enables the stereoscopic visualization camera 300 to be positioned horizontally and below a heads-up view of the surgeon 504. In addition, the relatively low weight and flexibility enable the stereoscopic visualization camera 300 to be positioned in ways unimaginable with other known surgical microscopes. The stereoscopic visualization camera 300 thereby provides a microsurgical view for any desired position and/or orientation of the patient 502 and/or the surgeon 504.
Although FIGS. 5 and 6 show two example embodiments for positioning the stereoscopic visualization camera 300, it should be appreciated that the stereoscopic visualization camera 300 may be positioned in any number of positions depending on the degrees of freedom of the mechanical arm 506. In some embodiments, it is entirely possible to position the stereoscopic visualization camera 300 facing upward (e.g., upside down).

III. Comparison of the Example Stereoscopic Visualization Platform to Known Surgical Microscopes
In comparing the stereoscopic visualization camera 300 of FIGS. 3 to 6 to the surgical microscope 200 of FIG. 2, the differences are readily apparent. The inclusion of the eyepieces 206 with the surgical microscope requires that a surgeon constantly orient his/her eyes to the oculars, which are in a fixed position relative to the scope head 201 and the patient. Further, the bulkiness and weight of surgical microscopes limit them to being positioned only in a generally vertical orientation relative to a patient. In contrast, the example stereoscopic visualization camera 300 does not include eyepieces and may be positioned in any orientation or position relative to a patient, thereby allowing the surgeon to move freely during surgery.
To enable other clinician staff to view a microsurgical target site, the surgical microscope 200 requires the addition of the second eyepieces 208. Generally, most known surgical microscopes 200 do not permit the addition of third eyepieces. In contrast, the example stereoscopic visualization camera 300 may be communicatively coupled to an unrestricted number of display monitors. Although FIGS. 5 and 6 above show the display monitors 512 and 514 connected to the cart 510, a surgical suite could be ringed with display monitors, all showing the microsurgical view recorded by the stereoscopic visualization camera 300. Thus, instead of restricting a view to one or two people (or requiring an eyepiece to be shared), an entire surgical team can view a magnified view of a target surgical site. Moreover, people in other rooms, such as training rooms and observation rooms, can be presented the same magnified view shown to the surgeon.
Compared to the stereoscopic visualization camera 300, the binocular surgical microscope 200 is more susceptible to being bumped or inadvertently moved. Because the surgeon places his or her head against the eyepieces 206 and 208 during surgery to look into the oculars, the scope head 201 receives constant force and periodic bumps. The addition of the second eyepieces 208 doubles the forces, coming from a second angle. Altogether, the surgeon's constant force and periodic bumps can cause the scope head 201 to move, requiring it to be repositioned. This repositioning delays the surgical procedure and frustrates the surgeon.
The example stereoscopic visualization camera 300 does not include eyepieces and is not intended to receive contact from a surgeon once it is locked into place. This corresponds to a significantly lower chance of the stereoscopic visualization camera 300 being accidentally moved or bumped during the surgeon's performance of a procedure.
To facilitate the second eyepieces 208, the surgical microscope 200 must be equipped with a beam splitter 210, which may include glass lenses and mirrors housed within precision metal tubes. The use of the beam splitter 210 reduces the light received at the first eyepieces, because some of the light is reflected to the second eyepieces 208. Further, the addition of the second eyepieces 208 and the beam splitter 210 increases the weight and bulkiness of the scope head 201.
Compared to the surgical microscope 200, the stereoscopic visualization camera 300 contains only the optical paths for its sensors, thereby reducing weight and bulk. In addition, the optical sensors receive the full incident light, because no beam splitter is needed to redirect a portion of the light. This means the images received by the optical sensors of the example stereoscopic visualization camera 300 are as bright and clear as possible.
Some models of surgical microscopes may enable a video camera to be attached. For example, the surgical microscope 200 of FIG. 2 includes a monoscopic video camera 212 connected to an optical path via a beam splitter 214. The video camera 212 may be monoscopic or stereoscopic, such as the Leica® TrueVision® 3D Visualization System ophthalmology camera. The video camera 212 records images received from the beam splitter 214 for display on a display monitor. The addition of the video camera 212 and the beam splitter 214 further adds to the weight of the scope head 201. In addition, the beam splitter 214 consumes additional light destined for the eyepieces 206 and/or 208.
Each of the beam splitters 210 and 214 fractionally divides the incident light, splitting it among three paths and thereby removing light from the surgeon's view. The surgeon's eyes have a finite sensitivity to low light, such that the light presented to him/her from the surgical site must be sufficient to allow the surgeon to perform the procedure. However, a surgeon cannot always increase the intensity of light applied to a target site on a patient, especially in ophthalmic procedures. A patient's eye has a finite tolerance for bright light before phototoxicity develops. Accordingly, limits are placed on the number and splitting fractions of beam splitters, and on the amount of light that can be split away from the first eyepieces 206, to enable the use of the auxiliary devices 208 and 212.
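The light-budget argument above reduces to simple multiplicative arithmetic. In the hypothetical sketch below (the split fractions are invented for illustration and are not taken from the disclosure), each beam splitter diverts a fraction of the remaining light to an auxiliary device, so the share reaching the primary eyepieces shrinks with every splitter added.

```python
def eyepiece_light_fraction(split_fractions):
    """Fraction of the incident light that still reaches the primary
    eyepieces after each beam splitter in the path diverts the given
    fraction of light to an auxiliary device.
    """
    remaining = 1.0
    for f in split_fractions:
        remaining *= 1.0 - f  # each splitter removes its share
    return remaining

# Hypothetical path: a 50/50 splitter feeding second eyepieces and a
# splitter sending 20% of the remainder to a video camera.
surgeon_share = eyepiece_light_fraction([0.5, 0.2])  # 0.5 * 0.8 = 0.4
```

With only 40% of the collected light left for the surgeon in this hypothetical, the motivation for a splitter-free sensor path is clear.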
The example stereoscopic visualization camera 300 of FIGS. 3 to 6 does not include beam splitters, such that the optical imaging sensors receive the full amount of light from a main objective assembly. This enables the use of sensors with low light sensitivity, or even optical sensors with sensitivity to wavelengths beyond visible light, because post-processing can make the images sufficiently bright and visible (and adjustable) for display on monitors.
Furthermore, because the optical elements defining the optical paths are self-contained within the stereoscopic visualization camera 300, those optical elements may be controlled through the camera. This control allows the placement and adjustment of the optical elements to be optimized for a three-dimensional stereoscopic display rather than for microscope eyepieces. This configuration of the camera permits control to be provided electronically from camera controls or from a remote computer. In addition, control may be provided automatically through one or more programs on the camera 300 configured to adjust the optical elements to maintain focus while zooming, or to adjust for optical defects and/or spurious parallax. In contrast, the optical elements of the surgical microscope 200 are external to the video camera 212 and are controlled only via operator input, which is generally optimized for viewing a target site through the eyepieces 206.
In a final comparison, the surgical microscope 200 includes an X-Y panning device 220 for moving a field of view or target scene. The X-Y panning device 220 is typically a large, heavy, and expensive electromechanical module, because it must rigidly support and move the surgical scope head 201. In addition, moving the scope head 201 changes the surgeon's position to match the new location of the eyepieces 206.
In contrast, the exemplary stereoscopic visualization camera 300 includes a memory storing instructions that, when executed, cause a processor to select pixel data from the optical image sensors to achieve X-Y panning across a wide pixel grid. Additionally, the exemplary stereoscopic visualization camera 300 may include a small motor or actuator that controls a main objective optical element to change a working distance to a target site without moving the camera 300.

IV. Example Optical Elements of the Stereoscopic Visualization Camera
FIGS. 7 and 8 show diagrams illustrating the optical elements within the exemplary stereoscopic visualization camera 300 of FIGS. 3 to 6, according to an example embodiment of the present disclosure. Acquiring a left view and a right view of a target site to construct a stereoscopic image may seem relatively straightforward. However, without careful design and compensation, many stereoscopic images suffer from alignment problems between the left and right views. When viewed over a long period of time, alignment problems can create confusion in an observer's brain due to the differences between the left and right views. This confusion can lead to headaches, fatigue, dizziness, and even nausea.
The exemplary stereoscopic visualization camera 300 reduces (or eliminates) alignment problems by having right and left optical paths in which certain optical elements are independently controlled and/or adjusted, while other left and right optical elements are fixed in a common carrier. In an example embodiment, certain left and right zoom lenses may be fixed to a common carrier to ensure that the left and right magnifications are substantially the same. However, front or rear lenses may be adjustable radially, rotationally, or axially, and/or may be tilted, to compensate for small differences in zoom magnification, visual defects, and/or spurious parallax (such as movement of a zoom repeat point). The compensation provided by the adjustable lenses produces optical paths that are almost perfectly aligned across the full range of zoom magnification.
Additionally or alternatively, pixel readout and/or rendering techniques may be used to reduce (or eliminate) alignment problems. For example, a right image (recorded by a right optical image sensor) may be adjusted up or down relative to a left image (recorded by a left optical image sensor) to correct vertical misalignment between the images. Similarly, a right image may be adjusted left or right relative to a left image to correct horizontal misalignment between the images.
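The specification describes this pixel-readout correction only at a high level. As a minimal sketch (in Python with NumPy; the function name, offsets, and window sizes are illustrative assumptions, not details from the specification), the correction can be modeled as reading an offset sub-window from each sensor's full pixel grid:

```python
import numpy as np

def read_aligned_window(frame, dx, dy, out_h, out_w):
    """Read an out_h x out_w window from a full sensor frame, offset from
    center by (dx, dy) pixels. Shifting the right sensor's window relative
    to the left one corrects vertical/horizontal misalignment.
    Positive dy moves the window down; positive dx moves it right."""
    y0 = (frame.shape[0] - out_h) // 2 + dy
    x0 = (frame.shape[1] - out_w) // 2 + dx
    return frame[y0:y0 + out_h, x0:x0 + out_w]

# Simulated 8x10 pixel grid with one bright pixel at row 4, column 5.
sensor = np.zeros((8, 10), dtype=np.uint8)
sensor[4, 5] = 255

# Read a 4x6 window shifted down 1 px and right 2 px.
window = read_aligned_window(sensor, dx=2, dy=1, out_h=4, out_w=6)
```

Because only the readout window moves, the vertical and horizontal offsets can be trimmed electronically during calibration without moving any optics.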
FIGS. 7 and 8 below show an example arrangement and positioning of optical elements that provides aligned optical paths nearly free of artifacts, spurious parallax, and distortion. As discussed later, certain of these optical elements may be moved during calibration and/or use to further align the optical paths and remove any remaining distortion, spurious parallax, and/or defects. In the illustrated embodiment, the optical elements are positioned in two parallel paths to produce a left view and a right view. Alternative embodiments may include optical paths that are folded, deflected, or otherwise non-parallel.
The illustrated paths correspond to the human visual system, such that the left and right views, as displayed on a stereoscopic display, appear separated by a distance that creates a convergence angle of approximately 6 degrees, which is comparable to the convergence angle of an adult's eyes viewing an object approximately 4 inches away, thereby producing stereopsis. In some embodiments, the image data generated from the left and right views is combined on the display monitors 512 and 514 to produce a stereoscopic image of a target site or scene. Alternative embodiments include other stereoscopic displays in which the left view is presented only to a viewer's left eye and the corresponding right view is presented only to the right eye. In an illustrative embodiment used for adjusting and verifying proper alignment and calibration, the two views are displayed overlapped to both eyes.
A stereoscopic view is superior to a monoscopic view because it more closely mimics the human visual system. A stereoscopic view provides depth perception, distance perception, and relative size perception to give a surgeon a true view of a target surgical site. For procedures such as retinal surgery, stereoscopic views are critical because the surgical movements and forces are so small that the surgeon cannot feel them. Providing a stereoscopic view helps a surgeon's brain magnify tactile sensation while perceiving depth as the brain senses even minute movements.
FIG. 7 shows a side view of the exemplary stereoscopic visualization camera 300 in which the housing 302 is shown transparent to expose the optical elements. FIG. 8 shows a diagram illustrating the optical paths provided by the optical elements shown in FIG. 7. As shown in FIG. 8, the optical paths include a right optical path and a left optical path. The optical paths in FIG. 8 are shown from a perspective facing a forward direction and looking down at the stereoscopic visualization camera 300. From this view, the left optical path appears on the right side of FIG. 8 and the right optical path is shown on the left side.
The optical elements shown in FIG. 7 are part of the left optical path. It should be appreciated that the right optical path is generally identical to the left optical path of FIG. 7 with respect to the relative positions and arrangement of the optical elements. As mentioned above, the interpupillary distance between the centers of the optical paths is between 58 mm and 70 mm, which may be scaled down to 10 mm to 25 mm. Each of the optical elements includes a lens having a certain diameter (for example, between 2 mm and 29 mm). Accordingly, a distance between the optical elements themselves is between 1 mm and 23 mm, preferably approximately 10 mm.
The exemplary stereoscopic visualization camera 300 is configured to acquire images of a target site 700 (also referred to as a scene, a field-of-view ("FOV"), or a target surgical site). The target site 700 includes an anatomical location on a patient. The target site 700 may also include laboratory biological samples, calibration slides/templates, etc. Images from the target site 700 are received at the stereoscopic visualization camera 300 via a main objective assembly 702, which includes the front working distance lens 408 (shown in FIG. 4) and a rear working distance lens 704.

A. Example Main Objective Assembly
The exemplary main objective assembly 702 may include any type of refractive or reflective assembly. FIG. 7 shows the objective assembly 702 as an achromatic refractive assembly in which the front working distance lens 408 is fixed and the rear working distance lens 704 is movable along the z-axis. The front working distance lens 408 may include a plano-convex ("PCX") lens and/or a meniscus lens. The rear working distance lens 704 may include an achromatic lens. In examples where the main objective assembly 702 includes an achromatic refractive assembly, the front working distance lens 408 may include a hemispherical lens and/or a meniscus lens. In addition, the rear working distance lens 704 may include an achromatic doublet, an achromatic doublet group, and/or an achromatic triplet.
The magnification of the main objective assembly 702 is between 6x and 20x. In some instances, the magnification of the main objective assembly 702 may vary slightly based on a working distance. For example, the main objective assembly 702 may have a magnification of 8.9x for a 200 mm working distance and a magnification of 8.75x for a 450 mm working distance.
The exemplary rear working distance lens 704 is configured to be movable relative to the front working distance lens 408 to change the spacing between them. The spacing between the lens 408 and the lens 704 determines the overall front focal length of the main objective assembly 702, and accordingly the location of a focal plane. In some embodiments, the focal length is the distance between the lens 408 and the lens 704 plus one-half the thickness of the front working distance lens 408.
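As a worked illustration of the stated relation (the numeric values are invented for the example; the specification gives no figures here):

```python
def front_focal_length_mm(lens_spacing_mm, front_lens_thickness_mm):
    """Per this embodiment, the front focal length equals the spacing
    between the rear working distance lens 704 and the front working
    distance lens 408, plus one-half the thickness of the lens 408."""
    return lens_spacing_mm + front_lens_thickness_mm / 2.0

# Illustrative values only: a 180 mm spacing and an 8 mm thick
# front lens give a 184 mm front focal length.
focal_length = front_focal_length_mm(180.0, 8.0)
```

Moving the rear lens 704 changes `lens_spacing_mm`, which is how the focal plane location (and thus the working distance) is adjusted.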
Altogether, the front working distance lens 408 and the rear working distance lens 704 are configured to provide an infinite conjugate image, thereby providing optimal focus for the downstream optical image sensors. In other words, an object located exactly at the focal plane of the target site 700 has its image projected at a distance of infinity, and is thereby infinity-coupled at the provided working distance. Generally, the object appears in focus along the optical path for a certain distance from the focal plane. Beyond a certain threshold distance, however, the object begins to appear fuzzy or out of focus.
FIG. 7 shows a working distance 706, which is the distance between an outer surface of the front working distance lens 408 and the focal plane at the target site 700. The working distance 706 may correspond to an angular field of view, where a longer working distance produces a wider field of view or larger viewable area. The working distance 706 accordingly sets the plane of the target site or scene that is in focus. In the illustrated example, the working distance 706 is adjustable from 200 mm to 450 mm by moving the rear working distance lens 704. In one example, when the working distance is 450 mm, the field of view is adjustable between 20 mm x 14 mm and 200 mm x 140 mm using the upstream zoom lenses.
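The text relates working distance to field of view only qualitatively. Under a standard geometric assumption (not a formula stated in the specification), the full angular field of view subtended by a linear field of view at a given working distance can be estimated as:

```python
import math

def angular_fov_deg(linear_fov_mm, working_distance_mm):
    """Estimate the full angular field of view, in degrees, subtended by
    a linear field of view centered at a given working distance.
    Plain geometry; coefficients and model are illustrative only."""
    half_angle = math.atan((linear_fov_mm / 2.0) / working_distance_mm)
    return 2.0 * math.degrees(half_angle)

# At the quoted 450 mm working distance, the 200 mm-wide view spans
# roughly 25 degrees and the 20 mm-wide view roughly 2.5 degrees.
wide = angular_fov_deg(200.0, 450.0)
narrow = angular_fov_deg(20.0, 450.0)
```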
The main objective assembly 702 shown in FIGS. 7 and 8 provides an image of the target site 700 for both the left and right optical paths. This means that the widths of the lenses 408 and 704 should be at least as wide as the left and right optical paths together. In alternative embodiments, the main objective assembly 702 may include separate left and right front working distance lenses 408 and separate left and right rear working distance lenses 704. The width of each pair of separate working distance lenses may be between 1/4 and 1/2 of the width of the lenses 408 and 704 shown in FIGS. 7 and 8. Moreover, each of the rear working distance lenses 704 may be independently adjustable.
In some embodiments, the main objective assembly 702 may be replaceable. For example, different main objective assemblies may be installed to change a working distance range, a magnification, a numerical aperture, and/or a refractive/reflective type. In such embodiments, the stereoscopic visualization camera 300 may change the positioning of the downstream optical elements, the properties of the optical image sensors, and/or the image processing parameters based on which main objective assembly is installed. An operator may use one of the controls 305 of FIG. 3 and/or a user input device to specify which main objective assembly is installed in the stereoscopic visualization camera 300.

B. Example Light Sources
To illuminate the target site 700, the exemplary stereoscopic visualization camera 300 includes one or more light sources. FIGS. 7 and 8 show three light sources, including a visible light source 708a, a near-infrared ("NIR") light source 708b, and a near-ultraviolet ("NUV") light source 708c. In other examples, the stereoscopic visualization camera 300 may include additional or fewer light sources (or no light sources). For example, the NIR and NUV light sources may be omitted. The exemplary light sources 708 are configured to generate light that is projected onto the target site 700. The generated light interacts with the target site and is reflected from it, with some of that light being reflected to the main objective assembly 702. Other examples may include external light sources or ambient light from the environment.
The exemplary visible light source 708a is configured to output light in the human-visible portion of the light spectrum, in addition to some light having wavelengths outside the visible region. The NIR light source 708b is configured to output light primarily at wavelengths slightly beyond the red portion of the visible spectrum, which is also referred to as "near-infrared." The NUV light source 708c is configured to output light primarily at wavelengths in the blue portion of the visible spectrum, which is referred to as "near-ultraviolet." The spectra of the light output by the light sources 708 are controlled by respective controllers, described below. The brightness of the light emitted by the light sources 708 may be controlled by a switching rate and/or an applied voltage waveform.
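The switching-rate form of brightness control can be sketched as a simple duty-cycle (PWM-style) model. This is a toy illustration of the principle only; the specification does not give this model, and the linear relation is an idealization:

```python
def average_output(duty_cycle, peak_output):
    """Idealized PWM-style brightness control: the time-averaged light
    output scales linearly with the fraction of each switching period
    the source is on. Toy model; not a formula from the specification."""
    if not 0.0 <= duty_cycle <= 1.0:
        raise ValueError("duty cycle must lie in [0, 1]")
    return duty_cycle * peak_output

# Running a source at a 50% duty cycle halves its average output.
half_brightness = average_output(0.5, 100.0)
```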
FIGS. 7 and 8 illustrate the visible light source 708a and the NIR light source 708b being provided directly to the target site 700 through the main objective assembly 702. As shown in FIG. 8, visible light from the visible light source 708a propagates along a visible path 710a. In addition, NIR light from the NIR light source 708b propagates along an NIR path 710b. Although the light sources 708a and 708b are shown behind the main objective assembly 702 (relative to the target site 700), in other examples the light sources 708a and 708b may be provided in front of the main objective assembly 702. In one embodiment, the light sources 708a and 708b may be provided on an outside of the housing 302, facing the target site 700. In yet other embodiments, the light sources 708 may, for example, be provided separately from the stereoscopic visualization camera 300 using a Koehler illumination setup and/or a darkfield illumination setup.
In contrast to the light sources 708a and 708b, a deflecting element 712 (e.g., a beamsplitter) reflects NUV light from the NUV light source 708c to the main objective assembly 702 using an epi-illumination setup. The deflecting element 712 may be coated or otherwise configured to reflect only light outside the NUV wavelength range, thereby filtering the NUV light. The NUV light from the NUV light source 708c propagates along an NUV path 710c.
In some embodiments, the NIR light source 708b and the NUV light source 708c may be used with excitation filters to further filter light that may not be blocked by a filter (e.g., the filter 740). The filters may be placed in front of the light sources 708b and 708c, before the main objective assembly 702 and/or after it. After filtering, the light from the NUV light source 708c and the NIR light source 708b includes wavelengths that excite fluorescence in a fluorescent site 914 (shown in FIG. 9) of an anatomical object. In addition, after filtering, the light from the NUV light source 708c and the NIR light source 708b may include wavelengths that are not in the same range as the light emitted by the fluorescent site 914.
Projecting the light from the light sources 708 through the main objective assembly provides the benefit that the illuminated field of view changes based on the working distance 706 and/or focal plane. Because the light passes through the main objective assembly 702, the angle at which the light is projected changes based on the working distance 706 and corresponds to the angular field of view. This configuration accordingly ensures that the field of view is properly illuminated by the light sources 708 regardless of the working distance or magnification.

C. Example Deflecting Element
The exemplary deflecting element 712 illustrated in FIGS. 7 and 8 is configured to transmit light of certain wavelengths from the NUV light source 708c through the main objective assembly 702 to the target site 700. The deflecting element 712 is also configured to reflect light received from the target site 700 to downstream optical elements, including a front lens set 714 used for zooming and recording. In some embodiments, the deflecting element 712 may filter the light received from the target site 700 through the main objective assembly 702 such that only light of certain wavelengths reaches the front lens set 714.
The deflecting element 712 may include any type of mirror or lens that reflects light in a specified direction. In one example, the deflecting element 712 includes a dichroic mirror or filter, which has different reflection and transmission characteristics at different wavelengths. The stereoscopic visualization camera 300 of FIGS. 7 and 8 includes a single deflecting element 712 that provides light for both the right and left optical paths. In other examples, the camera 300 may include separate deflecting elements for each of the right and left optical paths. In addition, a separate deflecting element may be provided for the NUV light source 708c.
FIG. 9 shows a diagram of the deflecting element 712 of FIGS. 7 and 8, according to an example embodiment of the present disclosure. For brevity, the main objective assembly 702 is not shown. In this example, the deflecting element 712 includes two parallel faces 902 and 904 for transmitting and reflecting light of certain wavelengths. The parallel faces 902 and 904 are set at a 45° angle with respect to the left and right optical paths (shown as path 906). The 45° angle is selected because it causes the reflected light to propagate at a 90° angle from the transmitted light, thereby providing optimal separation without causing the separated light to be detected in the downstream front lens set 714. In other embodiments, the angle of the deflecting element 712 may be between 10 degrees and 80 degrees without unintentionally propagating light of undesired wavelengths.
The exemplary NUV light source 708c is located behind the deflecting element 712 (relative to the target site 700). Light from the light source 708c propagates along a path 908 and contacts the deflecting element 712. NUV light around the primary wavelength range of the NUV light source 708c is transmitted through the deflecting element 712 along a path 910 to the target site 700. Light from the NUV light source 708c having wavelengths above (and below) the primary wavelength range of the NUV light source 708c is reflected along a path 912 to a light trap or unused region of the housing 302.
When the NUV light reaches the target site 700, one or more fluorescent sites 914 of an anatomical object absorb the NUV light. In some instances, the anatomical object may have been injected with a contrast agent configured to absorb NUV light and emit light having a different primary wavelength. In other instances, the anatomical object may naturally absorb NUV light and emit light having a different primary wavelength. At least some of the light reflected or emitted by the fluorescent site 914 propagates along a path 916 until it contacts the deflecting element 712. Most of this light is reflected off the face 904 along the path 906 to the front lens set 714. A portion of the light (including NUV light around the primary wavelength range of the NUV light source 708c) is transmitted through the deflecting element 712 along a path 918 to a light trap or unused region of the housing 302. The deflecting element 712 shown in FIG. 9 thereby uses one region of the spectrum to achieve optical stimulation of a fluorescent agent at the target site 700 while blocking much of the stimulation light from traveling to the downstream front lens set 714.
It should be appreciated that the reflectance and transmittance characteristics of the deflecting element 712 may be changed to meet other light spectrum requirements. In some instances, the housing 302 may include a slot that enables the deflecting element 712 and/or the NUV light source 708c to be replaced based on the desired light reflectance and transmittance characteristics. It should also be appreciated that a first path inside the deflecting element 712 between the paths 908 and 910, and a second path inside the deflecting element 712 between the paths 916 and 918, are each angled to schematically represent the refraction of light as it travels between the air and the interior of the deflecting element 712. The angles shown are not intended to represent the actual reflection angles.

D. Example Zoom Lenses
The exemplary stereoscopic visualization camera 300 of FIGS. 7 and 8 includes one or more zoom lenses to change a focal length and angle of view of the target site 700 to provide zoom magnification. In the illustrated example, the zoom lenses include the front lens set 714, a zoom lens assembly 716, and a lens barrel set 718. It should be appreciated that in other embodiments, the front lens set 714 and/or the lens barrel set 718 may be omitted. Alternatively, the zoom lenses may include additional lenses to provide additional magnification and/or image resolution.
The front lens set 714 includes a right front lens 720 for the right optical path and a left front lens 722 for the left optical path. The lenses 720 and 722 may each include a positive converging lens that directs light from the deflecting element 712 to the respective lenses in the zoom lens assembly 716. A lateral position of the lenses 720 and 722 accordingly defines the beam that propagates from the main objective assembly 702 and the deflecting element 712 to the zoom lens assembly 716.
One or both of the lenses 720 and 722 may be radially adjustable to match the optical axes of the left and right optical paths. In other words, one or both of the lenses 720 and 722 may be moved left/right and/or up/down in a plane incident to the optical path. In some embodiments, one or more of the lenses 720 and 722 may be rotated or tilted to reduce or eliminate optical image defects and/or spurious parallax. Moving either or both of the lenses 720 and 722 during zooming can cause the zoom repeat point ("ZRP") of each optical path to appear to remain stationary to a user. In addition to radial movement, one or both of the front lenses 720 and 722 may also be moved axially (along the respective optical path) to match the magnifications of the optical paths.
The exemplary zoom lens assembly 716 forms an afocal zoom system for changing the size of a field of view (e.g., a linear field of view) by changing the size of the beam propagated to the lens barrel set 718. The zoom lens assembly 716 includes a front zoom lens group 724 having a right front zoom lens 726 and a left front zoom lens 728. The zoom lens assembly 716 also includes a rear zoom lens group 730 having a right rear zoom lens 732 and a left rear zoom lens 734. The front zoom lenses 726 and 728 may be positive converging lenses, while the rear zoom lenses 732 and 734 include negative diverging lenses.
The size of the image beam for each of the left and right optical paths is determined based on the distances between the front zoom lenses 726 and 728, the rear zoom lenses 732 and 734, and the lens barrel set 718. Generally, the size of the optical paths decreases as the rear zoom lenses 732 and 734 move toward the lens barrel set 718 (along the respective optical paths), thereby decreasing the magnification. In addition, as the rear zoom lenses 732 and 734 move toward the lens barrel set 718, the front zoom lenses 726 and 728 may also move toward (or away from) the lens barrel set 718 (such as in a parabolic arc) to maintain the location of the focal plane on the target site 700 and thereby maintain focus.
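The parabolic compensator motion is described only qualitatively. As a toy model (the coefficients are invented for illustration; a real compensator curve would come from the lens prescription or calibration data), the front zoom group's position can be expressed as a parabolic function of the rear zoom group's position:

```python
def front_group_position(rear_group_position, a=0.08, c=5.0):
    """Toy compensator curve: the front zoom group tracks a parabolic
    arc as a function of the rear zoom group's position so the focal
    plane stays on the target site while magnification changes.
    Coefficients a and c are illustrative assumptions only."""
    return a * rear_group_position ** 2 + c

# Sample the curve at three rear-group positions (arbitrary units).
positions = [front_group_position(x) for x in (0.0, 5.0, 10.0)]
```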
The front zoom lenses 726 and 728 may be included in a first carrier (e.g., the front zoom group 724), while the rear zoom lenses 732 and 734 are included in a second carrier (e.g., the rear zoom group 730). Each of the carriers 724 and 730 may be moved on tracks (or rails) along the optical paths such that the left and right magnifications change at the same time. In this embodiment, any slight differences in magnification between the left and right optical paths may be corrected by moving the right front lens 720 and/or the left front lens 722. Additionally or alternatively, a right lens barrel 736 and/or a left lens barrel 738 of the lens barrel set 718 may be moved axially.
In alternative embodiments, the right front zoom lens 726 may be moved axially separately from the left front zoom lens 728. In addition, the right rear zoom lens 732 may be moved axially separately from the left rear zoom lens 734. Separate movement may enable small magnification differences to be corrected by the zoom lens assembly 716, especially when the front lens set 714 and the lens barrel set 718 are fixed along the optical paths. Moreover, in some embodiments, the right front zoom lens 726 and/or the left front zoom lens 728 may be radially and/or rotationally adjustable (and/or tiltable) to maintain an apparent location of a ZRP within the optical path. Additionally or alternatively, the right rear zoom lens 732 and/or the left rear zoom lens 734 may be radially and/or rotationally adjustable (and/or tiltable) to maintain an apparent location of a ZRP within the optical path.
The example lens barrel group 718 includes a right lens barrel 736 and a left lens barrel 738, which, together with the zoom lens assembly 716, form part of the afocal zoom system. The lens barrels 736 and 738 may include positive converging lenses configured to straighten or focus the light beams from the zoom lens assembly 716. In other words, the lens barrels 736 and 738 focus the infinity-coupled output of the zoom lens assembly 716.
In some examples, the lens barrel group 718 is fixed radially and axially within the housing 302. In other examples, the lens barrel group 718 may be movable axially along the optical path to provide increased magnification. Additionally or alternatively, each of the lens barrels 736 and 738 may be radially and/or rotationally adjustable (and/or tiltable) to, for example, correct for differences in optical properties (due to manufacturing or natural glass deviations) between the left and right lenses of the front lens group 714, the front zoom lens group 724, and/or the rear zoom lens group 730.
Altogether, the example front lens group 714, zoom lens assembly 716, and lens barrel group 718 are configured to achieve an optical zoom between 5X and approximately 20X, preferably a zoom level with diffraction-limited resolution. In some embodiments, the front lens group 714, the zoom lens assembly 716, and the lens barrel group 718 may provide a higher zoom range (e.g., 25X to 100X) if some loss of image quality is acceptable. In these embodiments, the stereoscopic visualization camera 300 may output a message to an operator indicating that a selected zoom level exceeds the optical range and is subject to a reduction in image quality.
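A controller implementing this behavior might perform a simple range check before accepting a zoom command. The following is a hedged sketch, not the camera's actual firmware; the function name is an assumption, and the bounds are taken from the ranges stated above:

```python
from typing import Optional

# Hypothetical sketch: warn the operator when a selected zoom level exceeds
# the diffraction-limited optical range (5X to ~20X per the description),
# while still allowing the extended range (up to 100X) described above.
DIFFRACTION_LIMITED_MAX = 20.0   # assumed upper bound of the optimal range
EXTENDED_MAX = 100.0             # assumed upper bound of the supported range

def check_zoom(requested: float) -> Optional[str]:
    """Return a warning message for the operator, or None if within range."""
    if requested > EXTENDED_MAX:
        raise ValueError(f"zoom {requested}X is not supported")
    if requested > DIFFRACTION_LIMITED_MAX:
        return (f"Selected zoom {requested}X exceeds the optical range "
                "and is subject to reduced image quality.")
    return None
```

A zoom of 10X would produce no message, while a zoom of 30X would produce the image-quality warning.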
In some embodiments, the lenses of the front lens group 714, the zoom lens assembly 716, the lens barrel group 718, and/or the main objective lens assembly 702 may each be constructed as a doublet from multiple optical subelements, using materials whose optical distortion parameters balance each other. The doublet construction reduces chromatic and optical aberrations. For example, the front working distance lens 408 and the rear working distance lens 704 may each be constructed as a doublet. In another example, the front lenses 720 and 722, the front zoom lenses 726 and 728, the rear zoom lenses 732 and 734, and the lens barrels 736 and 738 may each include a doublet.
In still other embodiments, the lenses of the front lens group 714, the zoom lens assembly 716, the lens barrel group 718, and/or the main objective lens assembly 702 may be tuned differently and/or have different properties to provide two parallel optical paths with different capabilities. For example, the right lenses of the zoom lens assembly 716 may be selected to provide 5X to 10X optical zoom for the right optical path, while the left lenses of the zoom lens assembly 716 are selected to provide 15X to 20X optical zoom for the left optical path. This configuration enables two different magnifications to be displayed simultaneously and/or on the same screen, albeit in a monoscopic view.

E. Example Filters
The example stereoscopic visualization camera 300 of FIGS. 7 and 8 includes one or more optical filters 740 (or filter assemblies) to selectively transmit desired wavelengths of light. FIG. 8 shows that a single filter 740 is applied to both the right and left optical paths. In other examples, each of the optical paths may have a separate filter. Including separate filters enables, for example, different wavelengths of light to be filtered from the left and right optical paths at the same time, which enables, for example, a fluorescence image to be displayed alongside a visible-light image.
FIG. 7 shows that the filter 740 comprises a wheel that rotates about its axis of rotation. In the illustrated embodiment, the filter 740 can accommodate three different pairs of optical filters. In other embodiments, however, the filter 740 may include more or fewer filter pairs. Generally, the light received from the target site 700 at the filter 740 comprises a broad spectrum of wavelengths. The lenses of the main objective lens assembly 702, the front lens group 714, the zoom lens assembly 716, and the lens barrel group 718 are configured to pass a relatively wide bandwidth of light, including wavelengths of interest to an operator as well as undesired wavelengths. In addition, the downstream optical image sensors are sensitive to particular wavelengths. The example filter 740 accordingly passes certain portions of the light spectrum and blocks others to achieve different desired characteristics.
As a wheel, the filter 740 comprises a mechanical device capable of changing position approximately four times per second. In other embodiments, the filter 740 may include a digital micromirror, which can redirect a light path at video frame rates, such as 60 times per second. In these other embodiments, each of the left and right optical paths would include a micromirror. The left and right micromirrors may be switched synchronously or simultaneously.
In some embodiments, the filter 740 may be synchronized to the light source 708 to achieve "time-interleaved" multispectral imaging. For example, the filter 740 may include an infrared cut filter, a near-infrared bandpass filter, and a near-ultraviolet cut filter. The different filter types are selected to work with the different spectra of the light source 708 and with the reflectance and transmittance characteristics of the deflection element 712 to pass light of specific desired wavelengths at predetermined times.
In one mode, the filter 740 and the light source 708 are configured to provide a visible-light mode. In this mode, the visible light source 708a transmits light in the visible region onto the target site 700, some of which is reflected to the main objective lens assembly 702. The reflected light may include some light beyond the visible spectrum, which can affect the optical image sensors. The visible light is reflected by the deflection element 712 and passes through the front lens group 714, the zoom lens assembly 716, and the lens barrel group 718. In this example, the filter 740 is configured to apply the infrared cut filter or the near-ultraviolet cut filter to the optical path to remove light beyond the visible spectrum, so that only light in the visible spectrum passes to a final optical element group 742 and an optical image sensor 744.
In another mode, the filter 740 and the light source 708 are configured to provide a narrow wavelength of fluorescent light to the optical sensors 744. In this mode, the NUV light source 708c transmits light from the deep-blue region of the spectrum to the target site 700. The deflection element 712 allows the desired deep-blue light to pass while reflecting undesired light. The deep-blue light interacts with the target site 700 to cause fluorescence to be emitted. In some examples, delta-aminolevulinic acid ("5ALA") and/or protoporphyrin IX is applied to the target site 700 so that fluorescence is emitted when the deep-blue light is received. In addition to the reflected deep-blue light and some visible light, the main objective lens assembly 702 also receives the fluorescence. The deep-blue light exits the right and left optical paths through the deflection element 712. Accordingly, only visible light and fluorescence pass through the front lens group 714, the zoom lens assembly 716, and the lens barrel group 718. In this example, the filter 740 is configured to apply the near-ultraviolet cut filter to the optical path to remove light outside the desired fluorescence spectrum (including visible light and any remaining NUV deep-blue light). As a result, only a narrow wavelength of fluorescent light reaches the optical image sensors 744, which enables the fluorescence to be more easily detected and distinguished based on relative intensity.
In yet another mode, the filter 740 and the light source 708 are configured to provide indocyanine green ("ICG") fluorescence to the optical sensors 744. In this mode, the NIR light source 708b transmits light in the far-red region of the visible spectrum (which is also considered near-infrared) to the target site 700. In addition, the visible light source 708a transmits visible light to the target scene 700. The visible and far-red light is absorbed by material with ICG at the target site, which then emits a highly stimulated fluorescence in the further-red region. In addition to the reflected NIR light and visible light, the main objective lens assembly 702 also receives the fluorescence. The light is reflected by the deflection element 712 to the front lens group 714, the zoom lens assembly 716, and the lens barrel group 718. In this example, the filter 740 is configured to apply the near-infrared bandpass filter to the optical path to remove light outside the desired fluorescence spectrum (including visible light and at least some NIR light). As a result, only fluorescence in the further-red region reaches the optical image sensors 744, which enables the fluorescence to be more easily detected and distinguished based on relative intensity.

Table 1
Table 1 above provides a summary of the different possible combinations of light sources and filters used to cause light of a specific desired wavelength to reach the optical light sensors 744. It should be appreciated that other types of filters and/or light sources may be used to further increase the different types of light received at the image sensors 744. For example, bandpass filters configured to pass narrow wavelengths of light may be used to correspond to particular biological stains or contrast agents applied to the target site 700. In some examples, the filter 740 may include cascaded filters, or more than one filter, to enable light from two different ranges to be filtered. For example, a first filter 740 may apply an infrared cut filter and a near-ultraviolet cut filter so that only visible light in a desired wavelength range passes to the optical sensors 744.
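The three imaging modes described above pair each light source with a matching filter. The following is an illustrative sketch of that source-to-filter mapping; the dictionary structure and key names are assumptions for illustration, not taken from the patent:

```python
# Illustrative mapping (names are assumptions) of the imaging modes
# described above to their light-source and filter combinations.
IMAGING_MODES = {
    "visible": {
        "sources": ("visible_708a",),
        "filter": "infrared_cut",          # or near-ultraviolet cut
    },
    "5ALA_fluorescence": {
        "sources": ("nuv_708c",),
        "filter": "near_ultraviolet_cut",
    },
    "ICG_fluorescence": {
        "sources": ("nir_708b", "visible_708a"),
        "filter": "near_infrared_bandpass",
    },
}

def select_filter(mode: str) -> str:
    """Return the filter to rotate into the optical path for a given mode."""
    return IMAGING_MODES[mode]["filter"]
```

For example, selecting the ICG fluorescence mode would enable both the NIR and visible light sources and rotate the near-infrared bandpass filter into the optical path.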
In other embodiments, separate filters 740 may be used for the left and right optical paths. For example, a right filter may comprise an infrared cut filter while a left filter comprises a near-infrared pass filter. This configuration enables the target site 700 to be viewed at visible wavelengths simultaneously with the ICG fluorescence wavelengths. In another example, a right filter may comprise an infrared cut filter while a left filter comprises a near-ultraviolet cut filter. In this configuration, the target site 700 can be displayed in visible light simultaneously with the 5ALA fluorescence. In these other embodiments, the right and left image streams may still be combined into a stereoscopic view, which combines a fluorescence view of particular anatomical structures with a view of the target site 700 in visible light.

F. Example Final Optical Element Group
The example stereoscopic visualization camera 300 of FIGS. 7 and 8 includes the final optical element group 742 to focus light received from the filter 740 onto the optical image sensors 744. The final optical element group 742 includes a right final optical element 745 and a left final optical element 747, each of which may comprise a positive converging lens. In addition to focusing the light, the optical elements 745 and 747 may also be configured to correct minor aberrations in the right and left optical paths before the light reaches the optical image sensors 744. In some examples, the lenses 745 and 747 may be movable radially and/or axially to correct magnification and/or focusing aberrations caused by the front lens group 714, the zoom lens assembly 716, and the lens barrel group 718. In one example, the left final optical element 747 may be moved radially while the right final optical element 745 remains fixed, to remove ZRP movement during magnification changes.

G. Example Image Sensors
The example stereoscopic visualization camera 300 of FIGS. 7 and 8 includes the image sensors 744 to acquire and/or record incident light received from the final optical element group 742. The image sensors 744 include a right optical image sensor 746 to acquire and/or record light propagating along the right optical path and a left optical image sensor 748 to acquire and/or record light propagating along the left optical path. Each of the left optical image sensor 748 and the right optical image sensor 746 comprises, for example, complementary metal-oxide-semiconductor ("CMOS") sensing elements, N-type metal-oxide-semiconductor ("NMOS") sensing elements, and/or semiconductor charge-coupled device ("CCD") sensing elements. In some embodiments, the left optical sensor 748 and the right optical sensor 746 are identical and/or have the same properties. In other embodiments, the left optical sensor 748 and the right optical sensor 746 comprise different sensing elements and/or properties to provide varied capabilities. For example, the right optical image sensor 746 (using a first color filter array) may be configured to be more sensitive to blue fluorescence, while the left optical image sensor 748 (using a second color filter array) is configured to be more sensitive to visible light.
FIG. 10 shows an example of the right optical image sensor 746 and the left optical image sensor 748 of the image sensors 744, according to an example embodiment of the present invention. The right optical image sensor 746 comprises a first two-dimensional grid or matrix 1002 of light-sensing elements (e.g., pixels). Likewise, the left optical image sensor 748 comprises a second two-dimensional pixel grid 1004 of light-sensing elements. Each of the pixels includes a filter that allows only light of a particular wavelength to pass and thereby contact an underlying photodetector. Filters for the different colors are distributed across the sensors 746 and 748 to provide light detection for all wavelengths across the grid. The photodetectors may be sensitive to visible light as well as to additional ranges above and below the visible spectrum.
The light-sensing elements of the grids 1002 and 1004 are configured to record a range of light wavelengths as a representation of the target site 700 within the field of view. Light incident on a light-sensing element causes an electric charge to accumulate. The charge is read to determine the amount of light received at that sensing element. In addition, since the filter characteristics of the sensing elements are known to within manufacturing tolerances, the wavelength range of the received light is also known. Directing the representation of the target site 700 onto the light-sensing elements causes the grids 1002 and 1004 of the respective optical image sensors 746 and 748 to spatially sample the target site 700. The resolution of this spatial sampling is one parameter that affects image quality and registration.
The number of pixels shown in the pixel grids 1002 and 1004 of FIG. 10 does not represent the actual number of pixels in the optical image sensors 746 and 748. Rather, the sensors typically have a resolution between 1280×720 pixels and 8500×4500 pixels (preferably approximately 2048×1560 pixels). However, not all of the pixels of the grids 1002 and 1004 are selected for image transmission. Instead, a subset, or pixel set, of the grids 1002 and 1004 is selected for transmission. For example, in FIG. 10, a pixel set 1006 is selected from the pixel grid 1002 for transmission as a right image, and a pixel set 1008 is selected from the pixel grid 1004 for transmission as a left image. As illustrated, the pixel set 1006 need not be located in the same position relative to its pixel grid 1002 as the pixel set 1008 is relative to the pixel grid 1004. Separate control of the pixel sets 1006 and 1008 enables the left and right images to be aligned and/or corrected for image defects and/or spurious parallax, such as ZRP movement.
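The windowing scheme described above, in which a movable pixel window is selected within the larger sensor grid, can be sketched as follows. This is an illustrative sketch under assumed function names, not the camera's actual software/firmware; the dimensions are those given in the description (a 1920×1080 window within a 2048×1560 grid):

```python
# Illustrative sketch of selecting a movable pixel window (e.g., 1920x1080)
# within a larger sensor grid (e.g., 2048x1560), clamped to the grid edges.
def clamp(value: int, lo: int, hi: int) -> int:
    return max(lo, min(value, hi))

def select_window(grid_w: int, grid_h: int, win_w: int, win_h: int,
                  offset_x: int, offset_y: int) -> tuple:
    """Return (x0, y0, x1, y1) of the pixel window at the requested offset,
    clamped so the window stays inside the sensor grid."""
    x0 = clamp(offset_x, 0, grid_w - win_w)
    y0 = clamp(offset_y, 0, grid_h - win_h)
    return (x0, y0, x0 + win_w, y0 + win_h)

# Shifting one window a few pixels relative to the other can compensate
# for parallax errors between the left and right views, such as ZRP movement.
left_window = select_window(2048, 1560, 1920, 1080, 64, 240)
right_window = select_window(2048, 1560, 1920, 1080, 70, 240)
```

Moving both windows together across the grid during use would produce the panning effect described below.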
Selecting a pixel set from a pixel grid enables a portion of the pixel grid to be chosen to compensate for image defects or spurious parallax and/or to bring the right and left optical images into closer alignment. In other words, the pixel sets may be moved or adjusted (in real time) relative to the pixel grids to improve image quality by reducing or eliminating spurious parallax. Alternatively, either or both of the left and right views of the stereoscopic image may instead be shifted in the image processing pipeline (for example, while rendering the views for display) to achieve the same effect. Rotational misalignment of the sensors may likewise be corrected there. A pixel set may also be moved across a pixel grid during use to provide the appearance of panning the field of view. In one example, a pixel set or window of 1920×1080 pixels may be selected from a pixel grid of 2048×1560 pixels. Software/firmware may control the position of the pixel window or set, and move that position, during setup and/or use. The resolution of the optical image sensors 746 and 748 is therefore specified based on the number of pixels in the length and width directions of the pixel set or window.

1. Color Sensing with the Example Image Sensors
As mentioned above, the optical image sensors 746 and 748 include pixels with different filters to detect particular colors of light. For example, some pixels are covered with filters that pass primarily red light, some pixels are covered with filters that pass primarily green light, and some pixels are covered with filters that pass primarily blue light. In some embodiments, a Bayer pattern is applied to the pixel grids 1002 and 1004. It should be appreciated, however, that in other embodiments a different color pattern optimized for particular wavelengths of light may be used. For example, one of the green filters in each sensing region may be replaced with a broadband filter or a near-infrared filter, thereby extending the sensing spectrum.
The Bayer pattern is implemented by grouping pixels into sets of two rows by two columns, with one pixel covered by a red filter, one pixel covered by a blue filter, and two pixels covered by green filters (each in a checkerboard pattern). The red and blue resolutions are therefore each one-quarter of the overall sensing region of interest, while the green resolution is one-half of the overall sensing region of interest.
Green may be assigned to half of the sensing region so that the optical image sensors 746 and 748 operate as luminance sensors and mimic the human visual system. Red and blue similarly mimic the chrominance sensing of the human visual system, although they are not as critical as the green sensing. Once the amounts of red, green, and blue are determined for a particular region, other colors in the visible spectrum can be determined by averaging the red, green, and blue values, as discussed in connection with the debayer program 1580a of FIG. 16 below.
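As a simplified illustration of the averaging step described above (this is a sketch, not the debayer program 1580a of FIG. 16), a full RGB triple can be estimated for each 2×2 Bayer cell by averaging the samples available in that cell:

```python
# Illustrative 2x2-cell debayer sketch: each RGGB cell yields one RGB
# triple by averaging the samples of each color present in the cell.
def debayer_cell(cell):
    """cell is a 2x2 list of (color, value) samples in a Bayer RGGB layout:
    [[('R', r), ('G', g1)],
     [('G', g2), ('B', b)]]
    Returns an (R, G, B) triple with the two green samples averaged."""
    samples = [s for row in cell for s in row]
    red = [v for c, v in samples if c == "R"]
    green = [v for c, v in samples if c == "G"]
    blue = [v for c, v in samples if c == "B"]
    return (sum(red) / len(red),
            sum(green) / len(green),
            sum(blue) / len(blue))

rgb = debayer_cell([[("R", 200), ("G", 120)],
                    [("G", 100), ("B", 40)]])  # -> (200.0, 110.0, 40.0)
```

Note how green contributes two of the four samples per cell, reflecting its luminance-like role in the pattern.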
In some embodiments, the optical image sensors 746 and 748 may sense color using stacked components rather than filters. For example, the sensing elements may comprise red, green, and blue sensing components stacked vertically within the area of a pixel. In another example, a prism splits the incident light into components using specially coated beamsplitters one or more times (typically at least twice, producing three component colors, referred to as "3-chip"), with sensing elements placed in each of the paths of the split beams. Other sensor types use a different pattern, such as replacing one of the green filters with a broadband filter or a near-infrared filter, thereby extending the sensing possibilities of the digital surgical microscope.

2. Sensing Light Beyond the Visible Range with the Example Image Sensors
The example sensing-element filters of the optical image sensors 746 and 748 are configured to also pass near-infrared light in a range that the sensing elements can detect. This enables the optical image sensors 746 and 748 to detect at least some light beyond the visible range. This sensitivity can reduce image quality in the visible portion of the spectrum because it "washes out" the image, reducing contrast in many types of scenes and negatively affecting color quality. Accordingly, the filter 740 may use the infrared cut filter to block near-infrared wavelengths while passing visible wavelengths to the optical image sensors 746 and 748.
However, this near-infrared sensitivity can be desirable. For example, a fluorescent agent such as ICG may be introduced into the target site 700. ICG is excited or activated by visible light or light of other wavelengths and emits fluorescence in the near-infrared range. As mentioned above, the NIR light source 708b provides NIR light and the visible light source 708a provides visible light to excite the agent with ICG. The emitted light is further along the red spectrum, and a near-infrared bandpass or high-pass filter may be used in the filter 740 to pass this red spectrum. The light from the red spectrum is then detected by the optical image sensors 746 and 748. By matching the spectral characteristics of the filter 740 to the light source 708 and the expected behavior of the fluorescent agent, the agent and biological structures (such as blood containing the agent) can be distinguished at the target site 700 from other structures that do not contain the agent.
It should be noted that, in this example, the NIR light source 708b has a different dominant wavelength than the near-infrared filter in the filter 740. Specifically, the NIR light source 708b has a dominant wavelength of approximately 780 nanometers ("nm"), around which most of its output spectrum exists. In contrast, the near-infrared filter of the filter 740 transmits light with wavelengths in a range of approximately 810 nm to 910 nm. Both the light from the NIR light source 708b and the light passed by the filter 740 are "near-infrared" wavelengths. However, the wavelengths are separated such that the example stereoscopic visualization camera 300 can stimulate with the light source 708 and detect with the optical image sensors 744 while filtering out the excitation light. This configuration accordingly enables the use of fluorescent agents.
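Under the wavelengths stated above, the separation between the excitation light and the filter passband can be checked numerically. The following is an illustrative sketch; the emission wavelength used in the example assertion is an assumed value inside the stated passband, not a figure from this document:

```python
# Illustrative check (values from the description above): the NIR source's
# dominant wavelength must fall outside the filter's passband so that the
# excitation light is blocked while the fluorescence emission is passed.
NIR_SOURCE_DOMINANT_NM = 780.0        # NIR light source 708b
NIR_BANDPASS_NM = (810.0, 910.0)      # near-infrared bandpass in filter 740

def passes_filter(wavelength_nm: float, band=NIR_BANDPASS_NM) -> bool:
    lo, hi = band
    return lo <= wavelength_nm <= hi

assert not passes_filter(NIR_SOURCE_DOMINANT_NM)  # excitation is blocked
assert passes_filter(845.0)  # an emission wavelength within the passband
```

The same separation principle applies to the NUV source and the near-ultraviolet cut filter discussed next.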
In another embodiment, an agent may be excited in the blue, violet, and near-ultraviolet regions and fluoresce in the red region. One example of such an agent involves the porphyrin accumulation in malignant gliomas caused by the introduction of 5ALA. In this example, it is necessary to filter out the blue light while passing the remainder of the spectrum. A near-ultraviolet cut filter is used for this scenario. As in the "near-infrared" case discussed above, the NUV light source 708c has a different dominant wavelength than the near-ultraviolet cut filter in the filter 740.

H. Example Lens Carriers
Section IV(D) above mentioned that at least some of the lenses of the front lens group 714, the zoom lens assembly 716, and/or the lens barrel group 718 may move along rails in one or more carriers. For example, the front zoom lens group 724 may comprise a carrier that moves the front zoom lenses 726 and 728 axially together.
FIGS. 11 and 12 show diagrams of example carriers, according to example embodiments of the present invention. In FIG. 11, the carrier 724 includes the right front zoom lens 726 and the left front zoom lens 728 within a support structure 1102. The carrier 724 includes a rail holder 1104 configured to movably connect to a rail 1106. A force "F" is applied to an actuation section 1108 to cause the carrier 724 to move along the rail 1106. The force "F" may be applied by a lead screw or other linear actuation device. As illustrated in FIG. 11, the force "F" is applied at an offset of the carrier 724. Friction between the rail 1106 and the carrier 724 produces a moment My that causes the support structure 1102 to move slightly about the Y-axis shown in FIG. 11. This slight movement can cause the right front zoom lens 726 and the left front zoom lens 728 to shift slightly in opposite directions, resulting in spurious parallax, which is an error in the disparity between the views of a stereoscopic image.
FIG. 12 shows another example of the carrier 724. In this example, the force "F" is applied symmetrically at a central structure 1202, which is connected to the rail holder 1104 and the support structure 1102. The force "F" produces a moment Mx that causes the carrier 724 to rotate or move slightly about the X-axis shown in FIG. 12. This rotational movement causes the right front zoom lens 726 and the left front zoom lens 728 to shift in the same direction by the same amount, thereby reducing (or eliminating) the occurrence of spurious parallax.
雖然圖11及圖12展示在一個載體內之透鏡726及728,但在其他實施例中透鏡726及728可各自在一載體內。在此等實例中,每一透鏡將在一單獨軌道或軌條上。可為透鏡中之每一者提供單獨導螺桿以沿著各別光學路徑提供獨立軸向移動。Although FIGS. 11 and 12 show the lenses 726 and 728 within one carrier, in other embodiments the lenses 726 and 728 may each be in its own carrier. In these examples, each lens would be on a separate track or rail. A separate lead screw may be provided for each of the lenses to provide independent axial movement along the respective optical path.
I.實例性撓曲部 I. Example flexure
章節IV(D)在上文提及可使前透鏡組714、變焦透鏡總成716及/或透鏡鏡筒組718之透鏡中之至少某些透鏡徑向移動、旋轉及/或傾斜。另外或另一選擇係,可使光學影像感測器746及748相對於其各別入射光學路徑軸向移動及/或傾斜。軸向及/或傾斜移動可由一或多個撓曲部提供。在某些實例中,該等撓曲部可係級聯的,使得一第一撓曲部提供在一第一方向上之運動且單獨撓曲部提供在一第二方向上之獨立運動。在另一實例中,一第一撓曲部提供沿著一縱傾軸線之傾斜且單獨撓曲部提供沿著一側傾軸線之傾斜。Section IV(D) mentioned above that at least some of the lenses of the front lens group 714, the zoom lens assembly 716, and/or the lens barrel group 718 can be moved radially, rotated, and/or tilted. Additionally or alternatively, the optical image sensors 746 and 748 can be moved axially and/or tilted relative to their respective incident optical paths. The axial and/or tilting movement can be provided by one or more flexures. In some examples, the flexures may be cascaded such that a first flexure provides motion in a first direction and a separate flexure provides independent motion in a second direction. In another example, a first flexure provides tilt about a pitch axis and a separate flexure provides tilt about a roll axis.
圖13展示根據本發明之一實例性實施例之一實例性雙撓曲部1300之一圖式。圖13中所圖解說明之撓曲部1300係用於光學影像感測器744且經組態以使右光學影像感測器746及左光學影像感測器748沿著其各別光軸獨立地移動以用於最後聚焦之目的。撓曲部1300包含一支撐樑1301以用於連接至實例性立體視覺化攝影機300之殼體302且為致動提供一剛性基底。撓曲部1300亦針對每一通道(例如,感測器746及748)包含一樑1302,樑1302在所有方向(惟運動方向1310除外)上係剛性的。樑1302連接至撓曲鉸鏈1303,撓曲鉸鏈1303使得樑1302能夠在一運動方向1310上移動(在此實例中為一平行四邊形平移)。FIG. 13 shows a diagram of an example dual flexure 1300 according to an example embodiment of the present invention. The flexure 1300 illustrated in FIG. 13 is used for the optical image sensors 744 and is configured to move the right optical image sensor 746 and the left optical image sensor 748 independently along their respective optical axes for final focusing. The flexure 1300 includes a support beam 1301 for connecting to the housing 302 of the example stereoscopic visualization camera 300 and for providing a rigid base for actuation. The flexure 1300 also includes, for each channel (for example, the sensors 746 and 748), a beam 1302 that is rigid in all directions except the direction of motion 1310. The beam 1302 is connected to flexure hinges 1303, which enable the beam 1302 to move in a direction of motion 1310 (in this example, a parallelogram translation).
一致動器裝置1304使樑1302在所要方向上撓曲一所要距離。致動器裝置1304針對每一通道包含一推動螺桿1306及一拉動螺桿1308,推動螺桿1306及拉動螺桿1308將相反力施加至樑1302,從而致使撓曲鉸鏈1303移動。舉例而言,可藉由使推動螺桿1306轉動以對樑1302進行推動而使樑1302向內移動。圖13中所圖解說明之撓曲部1300經組態以使右光學影像感測器746及左光學影像感測器748沿著其光軸獨立地軸向移動。The actuator device 1304 deflects the beam 1302 a desired distance in the desired direction. The actuator device 1304 includes a pushing screw 1306 and a pulling screw 1308 for each channel. The pushing screw 1306 and the pulling screw 1308 apply opposite forces to the beam 1302, thereby causing the flexure hinge 1303 to move. For example, the beam 1302 can be moved inward by rotating the pushing screw 1306 to push the beam 1302. The flexure 1300 illustrated in FIG. 13 is configured to move the right optical image sensor 746 and the left optical image sensor 748 independently along their optical axes axially.
在使樑1302撓曲至一所要位置中之後,嚙合一鎖定機構以阻止進一步移動,因而形成一剛性柱。該鎖定機構包含推動螺桿1306及其各別同心拉動螺桿1308,推動螺桿1306及其各別同心拉動螺桿1308在擰緊時形成產生樑1302之剛性柱之大的相對力。After the beam 1302 is flexed into a desired position, a locking mechanism is engaged to prevent further movement, thereby forming a rigid column. The locking mechanism includes the pushing screws 1306 and their respective concentric pulling screws 1308, which, when tightened, generate the large opposing forces that form the rigid column of the beam 1302.
雖然光學影像感測器746及748經展示為連接至相同撓曲部1300,但在其他實例中,該等感測器可連接至單獨撓曲部。舉例而言,返回至圖8,右光學影像感測器746連接至撓曲部750且左光學影像感測器748連接至撓曲部752。使用單獨撓曲部750及752使得光學影像感測器746及748能夠經單獨調整以(舉例而言)對準左光學視圖與右光學視圖及/或減少或消除假性視差。Although optical image sensors 746 and 748 are shown connected to the same flexure 1300, in other examples, the sensors may be connected to a separate flexure. For example, returning to FIG. 8, the right optical image sensor 746 is connected to the flexure 750 and the left optical image sensor 748 is connected to the flexure 752. The use of separate flexures 750 and 752 enables the optical image sensors 746 and 748 to be individually adjusted to, for example, align the left and right optical views and/or reduce or eliminate false parallax.
另外,雖然圖13展示連接至撓曲部1300之影像感測器746及748,但在其他實例中,前透鏡組714、變焦透鏡總成716、透鏡鏡筒組718及/或最後光學元件組742之透鏡可替代地連接至替代或額外撓曲部。在某些例項中,前透鏡組714、變焦透鏡總成716、透鏡鏡筒組718及/或最後光學元件組742之右透鏡及左透鏡中之每一者可連接至一單獨撓曲部1300以提供獨立徑向、旋轉及/或傾斜調整。In addition, although FIG. 13 shows the image sensors 746 and 748 connected to the flexure 1300, in other examples, the front lens group 714, the zoom lens assembly 716, the lens barrel group 718, and/or the last optical element group The lens of 742 can alternatively be connected to an alternative or additional flexure. In some examples, each of the right lens and the left lens of the front lens group 714, the zoom lens assembly 716, the lens barrel group 718, and/or the last optical element group 742 can be connected to a separate flexure 1300 to provide independent radial, rotation and/or tilt adjustment.
撓曲部1300可提供小於一微米之運動解析度。作為非常精細運動調整之一結果,來自右光學路徑及左光學路徑之影像對於一4K顯示監視器可具有數個或甚至一個像素之一對準準確性。藉由使左視圖及右視圖覆疊且用兩隻眼睛而非實立體鏡地觀察兩個視圖而在每一顯示器512、514上觀看此準確性。The flexure 1300 can provide a motion resolution of less than one micron. As a result of this very fine motion adjustment, the images from the right optical path and the left optical path can have an alignment accuracy of a few pixels, or even one pixel, on a 4K display monitor. This accuracy can be viewed on each display 512, 514 by overlaying the left and right views and observing the two views with both eyes rather than in a true stereoscopic manner.
在某些實施例中,撓曲部1300可包含標題為「SYSTEM FOR THE SUB-MICRON POSITIONING OF A READ/WRITE TRANSDUCER」之第5,359,474號美國專利中所揭示之撓曲部,該美國專利之全文以引用方式併入本文中。在又其他實施例中,前透鏡組714、變焦透鏡總成716、透鏡鏡筒組718及/或最後光學元件組742之透鏡在一徑向方向上可係固定的。替代地,在一光學路徑中具有一可調整偏轉方向之一偏轉元件(例如,一反射鏡)可用於操縱右光學路徑及/或左光學路徑以調整對準及/或假性視差。另外或另一選擇係,一傾斜/移位透鏡可提供於光學路徑中。例如,可用一可調整楔形透鏡控制一光軸之一傾斜。在額外實施例中,前透鏡組714、變焦透鏡總成716、透鏡鏡筒組718及/或最後光學元件組742之透鏡可包含具有可以電子方式改變之參數之動態透鏡。舉例而言,該等透鏡可包含由Invenios France SAS生產之Varioptic液體透鏡。In some embodiments, the flexure 1300 may include the flexure disclosed in U.S. Patent No. 5,359,474, entitled "SYSTEM FOR THE SUB-MICRON POSITIONING OF A READ/WRITE TRANSDUCER," the entirety of which is incorporated herein by reference. In still other embodiments, the lenses of the front lens group 714, the zoom lens assembly 716, the lens barrel group 718, and/or the final optical element group 742 may be fixed in a radial direction. Alternatively, a deflection element (for example, a mirror) having an adjustable deflection direction may be provided in an optical path to steer the right optical path and/or the left optical path to adjust alignment and/or false parallax. Additionally or alternatively, a tilt/shift lens may be provided in the optical path. For example, an adjustable wedge lens may be used to control a tilt of an optical axis. In additional embodiments, the lenses of the front lens group 714, the zoom lens assembly 716, the lens barrel group 718, and/or the final optical element group 742 may include dynamic lenses with electronically changeable parameters. For example, the lenses may include Varioptic liquid lenses produced by Invenios France SAS.
V.立體視覺化攝影機之實例性處理器 V. Example processor for the stereo visualization camera
實例性立體視覺化攝影機300經組態以記錄來自右光學路徑及左光學路徑之影像資料且將該影像資料輸出至監視器512及/或514以用於顯示為一立體影像。圖14展示根據本發明之一實例性實施例之用於獲取且處理影像資料的實例性立體視覺化攝影機300之模組之一圖式。應瞭解,該等模組圖解說明由特定硬體、控制器、處理器、驅動器及/或介面執行之操作、方法、演算法、常式及/或步驟。在其他實施例中,該等模組可經組合、進一步分割及/或移除。此外,該等模組中之一或多者(或一模組之部分)可提供於立體視覺化攝影機300外部,諸如在一遠端伺服器、電腦及/或分佈式運算環境中。The exemplary stereo visualization camera 300 is configured to record image data from the right optical path and the left optical path and output the image data to the monitor 512 and/or 514 for display as a stereo image. FIG. 14 shows a diagram of a module of an exemplary stereo visualization camera 300 for acquiring and processing image data according to an exemplary embodiment of the present invention. It should be understood that these modules illustrate operations, methods, algorithms, routines, and/or steps performed by specific hardware, controllers, processors, drivers, and/or interfaces. In other embodiments, the modules can be combined, further divided, and/or removed. In addition, one or more of these modules (or parts of a module) may be provided outside the stereo visualization camera 300, such as in a remote server, computer, and/or distributed computing environment.
在圖14之所圖解說明實施例中,圖7至圖13中之組件408、702至750及1300共同稱為光學元件1402。光學元件1402 (具體而言光學影像感測器746及748)以通信方式耦合至一影像擷取模組1404及一馬達與光照模組1406。影像擷取模組1404以通信方式耦合至一資訊處理器模組1408,資訊處理器模組1408可以通信方式耦合至一位於外部之使用者輸入裝置1410及一或多個顯示監視器512及/或514。In the illustrated embodiment of FIG. 14, the components 408, 702 to 750, and 1300 in FIGS. 7 to 13 are collectively referred to as an optical element 1402. The optical element 1402 (specifically, the optical image sensors 746 and 748) are communicatively coupled to an image capturing module 1404 and a motor and illumination module 1406. The image capture module 1404 is communicatively coupled to an information processor module 1408, and the information processor module 1408 can be communicatively coupled to an external user input device 1410 and one or more display monitors 512 and/ Or 514.
實例性影像擷取模組1404經組態以自光學影像感測器746及748接收影像資料。另外,影像擷取模組1404可定義在各別像素網格1002及1004內之像素組1006及1008。影像擷取模組1404亦可規定影像記錄性質,諸如圖框率及曝光時間。The example image capture module 1404 is configured to receive image data from the optical image sensors 746 and 748. In addition, the image capturing module 1404 can define pixel groups 1006 and 1008 in respective pixel grids 1002 and 1004. The image capture module 1404 can also specify image recording properties, such as frame rate and exposure time.
實例性馬達與光照模組1406經組態以控制一或多個馬達(或致動器)以改變光學元件1402中之一或多者之一徑向、軸向及/或傾斜位置。例如,一馬達或致動器可使一驅動螺桿轉動以使載體724沿著軌道1106移動,如圖11及圖12中所展示。一馬達或致動器亦可使圖13之撓曲部1300之推動螺桿1306及/或拉動螺桿1308轉動以調整一透鏡及/或光學影像感測器之一徑向、軸向或傾斜位置。馬達與光照模組1406亦可包含用於控制光源708之驅動器。The example motor and lighting module 1406 is configured to control one or more motors (or actuators) to change one or more of the optical elements 1402 in the radial, axial, and/or tilt positions. For example, a motor or actuator can rotate a drive screw to move the carrier 724 along the track 1106, as shown in FIGS. 11 and 12. A motor or actuator can also rotate the pushing screw 1306 and/or pulling the screw 1308 of the flexure 1300 in FIG. 13 to adjust a lens and/or a radial, axial, or tilt position of the optical image sensor. The motor and lighting module 1406 may also include a driver for controlling the light source 708.
實例性資訊處理器模組1408經組態以處理影像資料以用於顯示。例如,資訊處理器模組1408可提供對影像資料之色彩校正,自影像資料過濾缺陷,及/或再現影像資料以用於立體顯示。資訊處理器模組1408亦可藉由如下方式執行一或多個校準常式以校準立體視覺化攝影機300:將執行對光學元件之規定調整之指令提供至影像擷取模組1404及/或馬達與光照模組1406。資訊處理器模組1408可進一步判定改良影像對準及/或減少假性視差之指令且即時將該等指令提供至影像擷取模組1404及/或馬達與光照模組1406。The example information processor module 1408 is configured to process image data for display. For example, the information processor module 1408 can provide color correction of the image data, filter defects from the image data, and/or reproduce the image data for stereo display. The information processor module 1408 can also calibrate the stereoscopic camera 300 by executing one or more calibration routines in the following manner: providing instructions for performing prescribed adjustments to optical components to the image capturing module 1404 and/or the motor And the light module 1406. The information processor module 1408 can further determine instructions for improving image alignment and/or reducing false parallax and provide these instructions to the image capturing module 1404 and/or the motor and illumination module 1406 in real time.
實例性使用者輸入裝置1410可包含一電腦以提供用於改變立體視覺化攝影機300之操作之指令。使用者輸入裝置1410亦可包含用於選擇立體視覺化攝影機300之參數及/或特徵之控件。在一實施例中,使用者輸入裝置1410包含圖3之控制臂304。使用者輸入裝置1410可硬接線至資訊處理器模組1408。另外或另一選擇係,使用者輸入裝置1410以無線方式或以光學通信方式耦合至資訊處理器模組1408。The example user input device 1410 may include a computer to provide instructions for changing the operation of the stereo visualization camera 300. The user input device 1410 may also include controls for selecting parameters and/or features of the stereo visualization camera 300. In one embodiment, the user input device 1410 includes the control arm 304 of FIG. 3. The user input device 1410 can be hard-wired to the information processor module 1408. Additionally or alternatively, the user input device 1410 is coupled to the information processor module 1408 in a wireless manner or in an optical communication manner.
舉例而言,實例性顯示監視器512及514包含經組態以提供一種三維觀看體驗之電視及/或電腦監視器。舉例而言,該等顯示監視器可包含LG® 55LW5600電視。另一選擇係,顯示監視器512及514可包含一膝上型電腦螢幕、平板電腦螢幕、一智慧型電話螢幕、智慧護目鏡、一投影機、一全像顯示器等。For example, example display monitors 512 and 514 include television and/or computer monitors that are configured to provide a three-dimensional viewing experience. For example, the display monitors may include LG® 55LW5600 TVs. Alternatively, the display monitors 512 and 514 may include a laptop computer screen, a tablet computer screen, a smart phone screen, smart goggles, a projector, a holographic display, and so on.
以下章節更詳細地闡述影像擷取模組1404、馬達與光照模組1406及資訊處理器模組1408。The following sections describe the image capture module 1404, the motor and illumination module 1406, and the information processor module 1408 in more detail.
A.實例性影像擷取模組 A. Example image capture module
圖15展示根據本發明之一實例性實施例之影像擷取模組1404之一圖式。實例性影像擷取模組1404包含一影像感測器控制器1502,影像感測器控制器1502包含一處理器1504、一記憶體1506及一通信介面1508。處理器1504、記憶體1506及通信介面1508可經由一影像感測器控制器匯流排1512以通信方式耦合在一起。FIG. 15 shows a diagram of an image capturing module 1404 according to an exemplary embodiment of the present invention. The exemplary image capture module 1404 includes an image sensor controller 1502, and the image sensor controller 1502 includes a processor 1504, a memory 1506, and a communication interface 1508. The processor 1504, the memory 1506, and the communication interface 1508 can be communicatively coupled together via an image sensor controller bus 1512.
處理器1504可程式化有持久地儲存於記憶體1506內之一或多個程式1510。程式1510包含在經執行時致使處理器1504執行一或多個步驟、常式、演算法等之機器可讀指令。在某些實施例中,程式1510可自資訊處理器模組1408及/或自使用者輸入裝置1410傳輸至記憶體1506。在其他實例中,程式1510可直接自資訊處理器模組1408及/或自使用者輸入裝置1410傳輸至處理器1504。The processor 1504 can be programmed with one or more programs 1510 that are permanently stored in the memory 1506. The program 1510 includes machine-readable instructions that, when executed, cause the processor 1504 to execute one or more steps, routines, algorithms, etc. In some embodiments, the program 1510 can be transferred from the information processor module 1408 and/or from the user input device 1410 to the memory 1506. In other examples, the program 1510 may be directly transmitted from the information processor module 1408 and/or from the user input device 1410 to the processor 1504.
實例性影像感測器控制器1502以通信方式耦合至光學元件1402之右光學影像感測器746及左光學影像感測器748。除發送定時控制資料及/或程式化資料之外,影像感測器控制器1502亦經組態以將電力提供至光學影像感測器746及748。另外,影像感測器控制器1502經組態以自光學影像感測器746及748接收影像及/或診斷資料。The example image sensor controller 1502 is communicatively coupled to the right optical image sensor 746 and the left optical image sensor 748 of the optical element 1402. In addition to sending timing control data and/or programming data, the image sensor controller 1502 is also configured to provide power to the optical image sensors 746 and 748. In addition, the image sensor controller 1502 is configured to receive images and/or diagnostic data from the optical image sensors 746 and 748.
光學影像感測器746及748中之每一者含有可程式化暫存器以控制特定參數及/或特性。該等暫存器中之一或多者可規定圖10之各別像素網格1002及1004內之像素組1006及1008之一位置。該等暫存器可儲存相對於像素網格1002及1004之一原點或邊緣點之一起始位置之一值。該等暫存器亦可規定像素組1006及1008之一寬度及高度以界定一矩形所關注區域。影像感測器控制器1502經組態以讀取在規定像素組1006及1008內之像素之像素資料。在某些實施例中,光學影像感測器746及748之暫存器可促進其他形狀(諸如圓圈、卵形、三角形等)之像素組之指定。另外或另一選擇係,光學影像感測器746及748之暫存器可使得能夠針對像素網格1002及1004中之每一者同時規定多個像素組。Each of the optical image sensors 746 and 748 contains programmable registers to control specific parameters and/or characteristics. One or more of these registers can specify a position of the pixel groups 1006 and 1008 in the respective pixel grids 1002 and 1004 of FIG. 10. The registers can store a value relative to an origin or an edge point of the pixel grids 1002 and 1004. The registers can also specify a width and height of the pixel groups 1006 and 1008 to define a rectangular area of interest. The image sensor controller 1502 is configured to read the pixel data of the pixels in the specified pixel groups 1006 and 1008. In some embodiments, the registers of the optical image sensors 746 and 748 can facilitate the designation of pixel groups of other shapes (such as circles, ovals, triangles, etc.). Additionally or alternatively, the registers of the optical image sensors 746 and 748 can enable multiple pixel groups to be specified for each of the pixel grids 1002 and 1004 at the same time.
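The register layout described above (a start position plus a width and a height defining a rectangular region of interest) can be sketched as follows. This is a minimal illustrative model, not the register map of any actual sensor; all names are assumptions.

```python
# Minimal sketch of region-of-interest registers: a start offset from the
# pixel-grid origin plus a width and height define the rectangle of pixels
# the controller reads out. Names here are illustrative, not from a datasheet.
from dataclasses import dataclass

@dataclass
class RoiRegisters:
    x_start: int   # offset from the pixel-grid origin, in pixels
    y_start: int
    width: int     # rectangle dimensions, in pixels
    height: int

    def contains(self, x: int, y: int) -> bool:
        """Return True if pixel (x, y) lies inside the region of interest."""
        return (self.x_start <= x < self.x_start + self.width
                and self.y_start <= y < self.y_start + self.height)

    def pixel_count(self) -> int:
        """Number of pixels read out per frame for this region."""
        return self.width * self.height

# A hypothetical 1920x1080 window positioned inside a larger pixel grid.
roi = RoiRegisters(x_start=64, y_start=32, width=1920, height=1080)
```

A readout loop would then visit only the pixels for which `contains` is true, which is how a smaller region of interest reduces the data read per frame.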
像素網格1002及1004之像素之一光感測部分受嵌入式電路(其規定不同光感測模式)控制。該等模式包含一重設模式、一整合模式及一讀出模式。在該重設模式期間,一像素之一電荷儲存組件重設至一已知電壓位準。在該整合模式期間,該像素切換至一「接通」狀態。到達該像素之一感測區或元件之光致使一電荷累積在一電荷儲存組件(例如,一電容器)中。所儲存電荷量在整合模式期間對應於入射於感測元件上之光量。在該讀出模式期間,電荷量轉換為一數位值且經由嵌入式電路自光學影像感測器746及748讀出且傳輸至影像感測器控制器1502。為讀取每個像素,一給定區域中之每一像素之電荷儲存組件由切換式內部電路順序地連接至一讀出電路,該讀出電路執行將電荷自一類比值轉換至數位資料。在某些實施例中,像素類比資料轉換至12位元數位資料。然而,應瞭解,解析度可基於雜訊容限、穩定時間、圖框率及資料傳輸速度而係較小或較大。每一像素之數位像素資料可儲存至一暫存器。A light-sensing portion of the pixels of the pixel grids 1002 and 1004 is controlled by embedded circuitry that provides different light-sensing modes. These modes include a reset mode, an integration mode, and a readout mode. During the reset mode, a charge storage component of a pixel is reset to a known voltage level. During the integration mode, the pixel is switched to an "on" state. Light reaching a sensing region or element of the pixel causes a charge to accumulate in a charge storage component (for example, a capacitor). The amount of stored charge corresponds to the amount of light incident on the sensing element during the integration mode. During the readout mode, the amount of charge is converted into a digital value that is read out of the optical image sensors 746 and 748 via the embedded circuitry and transmitted to the image sensor controller 1502. To read each pixel, the charge storage component of each pixel in a given region is sequentially connected by switched internal circuitry to a readout circuit, which performs the conversion of the charge from an analog value to digital data. In some embodiments, the analog pixel data is converted to 12-bit digital data. However, it should be understood that the resolution may be lower or higher based on noise tolerance, settling time, frame rate, and data transmission speed. The digital pixel data of each pixel can be stored in a register.
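The readout-mode conversion above (stored charge to a 12-bit digital value) can be illustrated with a simple quantizer. The linear mapping and the full-scale voltage below are assumptions chosen for illustration; the actual analog front end is not specified in the text.

```python
# Illustrative sketch of readout-mode conversion: the analog charge (modeled
# here as a voltage fraction of full scale) is quantized to a 12-bit code in
# the range 0..4095, clamping at the rails. The linear transfer function and
# full-scale value are assumptions, not sensor specifics.
FULL_SCALE_V = 1.0          # assumed full-well voltage
BITS = 12
MAX_CODE = (1 << BITS) - 1  # 4095 for 12-bit data

def to_digital(voltage: float) -> int:
    """Quantize an analog pixel voltage to a 12-bit code."""
    v = min(max(voltage, 0.0), FULL_SCALE_V)   # clamp to the valid range
    return round(v / FULL_SCALE_V * MAX_CODE)
```

The clamping step mirrors the dynamic-range limitation discussed later: any charge at or beyond full scale produces the same maximum code, so brightness information above that level is lost.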
圖15之影像感測器控制器1502之實例性處理器1504經組態以自像素組1006及1008內之像素中之每一者接收像素資料(例如,指示與像素之一元件上之一入射光量對應之儲存於像素中之一電荷的數位資料)。處理器1504依據自右光學影像感測器746接收之像素資料形成一右影像。另外,處理器1504依據自左光學影像感測器748接收之像素資料形成一左影像。另一選擇係,處理器1504在向下游傳輸資料之前形成每一左影像及右影像之僅一部分(舉例而言,一個列或數個列)。在某些實施例中,處理器1504使用一暫存器位置來判定一影像內之每一像素之一位置。The example processor 1504 of the image sensor controller 1502 of FIG. 15 is configured to receive pixel data from each of the pixels in the pixel groups 1006 and 1008 (for example, digital data indicating the charge stored in a pixel, which corresponds to the amount of light incident on an element of the pixel). The processor 1504 forms a right image from the pixel data received from the right optical image sensor 746. In addition, the processor 1504 forms a left image from the pixel data received from the left optical image sensor 748. Alternatively, the processor 1504 forms only a portion (for example, one row or a few rows) of each left and right image before transmitting the data downstream. In some embodiments, the processor 1504 uses a register location to determine a position of each pixel within an image.
在形成右影像及左影像之後,處理器1504使右影像與左影像同步。處理器1504然後將右影像及左影像兩者傳輸至通信介面1508,通信介面1508將該等影像處理成用於經由一通信通道1514傳輸至資訊處理器模組1408之一格式。在某些實施例中,通信通道1514符合USB 2.0或3.0標準且可包括一銅或纖維光學電纜。通信通道1514可使得能夠每秒傳輸高達大致60對(或更多)左影像及右影像(具有1920×1080之一立體解析度及12位元之一資料轉換解析度)。使用一銅USB電纜使得電力能夠自資訊處理器模組1408提供至影像擷取模組1404。After forming the right image and the left image, the processor 1504 synchronizes the right image with the left image. The processor 1504 then transmits both the right image and the left image to the communication interface 1508, and the communication interface 1508 processes the images into a format for transmission to the information processor module 1408 via a communication channel 1514. In some embodiments, the communication channel 1514 complies with the USB 2.0 or 3.0 standard and may include a copper or fiber optic cable. The communication channel 1514 can enable transmission of up to approximately 60 pairs (or more) of left and right images (with a stereo resolution of 1920×1080 and a data conversion resolution of 12 bits) per second. Using a copper USB cable enables power to be supplied from the information processor module 1408 to the image capture module 1404.
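The figures quoted above (roughly 60 image pairs per second at 1920×1080 with 12-bit conversion) imply a raw payload rate that can be checked with back-of-the-envelope arithmetic. The calculation below is not from the source; it ignores protocol framing and overhead, and the USB rates are the nominal signaling rates.

```python
# Rough link-budget check for 60 stereo pairs/s at 1920x1080, 12 bits/pixel.
# Raw payload only; framing, overhead, and pixel packing are ignored.
WIDTH, HEIGHT = 1920, 1080
BITS_PER_PIXEL = 12
PAIRS_PER_SECOND = 60
IMAGES_PER_PAIR = 2          # one left and one right image

bits_per_image = WIDTH * HEIGHT * BITS_PER_PIXEL
bits_per_second = bits_per_image * IMAGES_PER_PAIR * PAIRS_PER_SECOND
gbps = bits_per_second / 1e9  # ~2.99 Gbit/s raw

USB2_GBPS = 0.48  # USB 2.0 nominal signaling rate
USB3_GBPS = 5.0   # USB 3.0 nominal signaling rate

fits_usb2 = bits_per_second < USB2_GBPS * 1e9
fits_usb3 = bits_per_second < USB3_GBPS * 1e9
```

At roughly 3 Gbit/s of raw pixel payload, this sketch suggests the quoted throughput sits within a USB 3.0-class link but not USB 2.0 unless the data is compressed or packed, which may be why the text names both standards.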
下文之章節進一步闡述藉由影像感測器控制器1502之處理器1504執行特定程式1510以獲取及/或處理來自光學影像感測器746及748之影像資料而提供之特徵。The following sections further describe the features provided by the processor 1504 of the image sensor controller 1502 executing specific programs 1510 to acquire and/or process the image data from the optical image sensors 746 and 748.
1.曝光實例 1. Exposure example
實例性處理器1504可控制或程式化光學影像感測器746及748處於上文所論述之整合模式中之一時間量。該整合模式出現稱為一曝光時間之一時間週期。處理器1504可藉由將一值寫入至光學影像感測器746及748之一曝光暫存器而設定該曝光時間。另外或另一選擇係,處理器1504可將發信號通知曝光時間之開始及結束之指令傳輸至光學影像感測器746及748。該曝光時間可係可在幾毫秒(「ms」)與幾秒之間程式化的。較佳地,該曝光時間大致係圖框率之倒數。The example processor 1504 can control or program the amount of time the optical image sensors 746 and 748 are in the integration mode discussed above. The integration mode occurs for a period of time referred to as an exposure time. The processor 1504 can set the exposure time by writing a value to an exposure register of the optical image sensors 746 and 748. Additionally or alternatively, the processor 1504 can transmit instructions to the optical image sensors 746 and 748 signaling the start and end of the exposure time. The exposure time can be programmable between a few milliseconds ("ms") and a few seconds. Preferably, the exposure time is approximately the reciprocal of the frame rate.
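The rule of thumb stated above, that the exposure time is approximately the reciprocal of the frame rate, can be illustrated with a one-line helper. The function name and the example rates are illustrative only.

```python
# Illustration of "exposure time ~ 1 / frame rate": the longest exposure
# that still fits inside one frame period at a given frame rate.
def max_exposure_ms(frames_per_second: float) -> float:
    """Frame period in milliseconds, the upper bound on exposure time."""
    return 1000.0 / frames_per_second

# At 60 frames/s each frame period is ~16.7 ms; at 50 frames/s it is 20 ms,
# so lowering the frame rate allows a longer exposure per frame.
```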
在某些實施例中,處理器1504可將一滾動快門方法應用於光學影像感測器746及748以讀取像素資料。在此方法下,像素組1006及1008之一給定像素列之曝光時間在已讀出且然後重設彼列中之像素之後就開始。短時間之後,下一列(其通常實體上最接近於剛剛設定之列)經讀取且因此在其曝光時間重新開始之情況下經重設。每一像素列之順序讀取繼續直至像素組1006及1008之最後或底部列已經讀取且重設為止。處理器1504然後返回至像素組1006及1008之頂部列以讀取下一影像之像素資料。In some embodiments, the processor 1504 can apply a rolling shutter method to the optical image sensors 746 and 748 to read the pixel data. Under this method, the exposure time for a given pixel row of the pixel groups 1006 and 1008 begins after the pixels in that row have been read out and then reset. A short time later, the next row (which is usually physically closest to the row just reset) is read and then reset, restarting its exposure time. The sequential reading of each pixel row continues until the last or bottom row of the pixel groups 1006 and 1008 has been read and reset. The processor 1504 then returns to the top row of the pixel groups 1006 and 1008 to read the pixel data for the next image.
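The row-by-row sequence of the rolling shutter method can be sketched as a simple schedule: each row starts its readout one row-time after the previous row. This is a schematic model under assumed timing values, not sensor firmware.

```python
# Schematic rolling-shutter model: rows are read out and reset in order, so
# row i begins its readout i row-times after row 0. The 15 us row time is an
# assumed illustrative value.
def rolling_shutter_schedule(num_rows: int, row_time_us: float):
    """Return (row_index, readout_start_us) pairs for one pass over the rows."""
    return [(row, row * row_time_us) for row in range(num_rows)]

schedule = rolling_shutter_schedule(num_rows=1080, row_time_us=15.0)
```

Under these assumed numbers the top and bottom rows are sampled about 16 ms apart, which is the skew that the global shutter method described next avoids by integrating all rows simultaneously.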
在另一實施例中,處理器1504應用一全域快門方法。在此方法下,處理器1504以類似於滾動快門方法之一方式實施讀出及重設。然而,在此方法中,整合針對像素組1006及1008中之所有像素同時發生。該全域快門方法具有與滾動快門方法相比較減少一影像中之缺陷之優點,此乃因同時曝光所有像素。相比之下,在滾動快門方法中,在曝光像素組之行之間存在一小時間延遲。在行曝光之間(尤其頂部行與底部行之間)的時間期間可形成小缺陷,其中可發生讀取之間的目標部位700處之小改變。In another embodiment, the processor 1504 applies a global shutter method. Under this method, the processor 1504 performs readout and reset in a manner similar to the rolling shutter method. However, in this method, integration occurs simultaneously for all pixels in the pixel groups 1006 and 1008. The global shutter method has the advantage of reducing defects in an image compared to the rolling shutter method because all pixels are exposed at the same time. In contrast, in the rolling shutter method there is a small time delay between exposing the rows of the pixel groups. Small defects can form during the time between row exposures (especially between the top and bottom rows), where small changes at the target site 700 can occur between reads.
2.動態範圍實例 2. Dynamic range example
實例性處理器1504可執行一或多個程式1510以偵測超出光學影像感測器746及748之一動態範圍之光。一般而言,極其明亮光完全填充一像素之一電荷儲存區域,因而致使關於確切亮度位準之影像資訊丟失。類似地,極其低光或缺乏光未能施予一像素中之一有意義電荷,此亦致使影像資訊丟失。依據此像素資料形成之影像因此不準確地反映目標部位700處之光強度。The example processor 1504 can execute one or more programs 1510 to detect light that exceeds one of the dynamic ranges of the optical image sensors 746 and 748. Generally speaking, extremely bright light completely fills a charge storage area of a pixel, thus causing image information about the exact brightness level to be lost. Similarly, extremely low light or lack of light fails to impart a meaningful charge to one of the pixels, which also results in loss of image information. The image formed based on this pixel data therefore does not accurately reflect the light intensity at the target site 700.
為偵測超出動態範圍之光,處理器1504可執行數個高動態範圍(「HDR」)程式1510中之一者,包含(舉例而言)一多重曝光程式、一多斜率像素整合程式及一多感測器影像融合程式。在一實例中,該多重曝光程式可利用與光學影像感測器746及748整合或嵌入在一起之HDR特徵。在此方法下,將像素組1006及1008放置至整合模式中達一正常曝光時間。像素組1006及1008之行經讀取且儲存於光學影像感測器746及748處之一記憶體及/或影像感測器控制器1502之記憶體1506中。在由處理器1504執行讀取之後,像素組1006及1008中之每一行再次接通比該正常曝光時間少之一第二曝光時間。處理器1504在該第二曝光時間之後讀取像素行中之每一者且將此像素資料與來自相同行之正常曝光時間之像素資料組合。處理器1504可應用色調映射以在來自正常長度曝光時間之像素資料與來自短長度曝光時間之像素資料之間進行選擇(或組合兩者)且將所得像素資料映射至與下游處理及顯示相容之一範圍。使用多重曝光程式,處理器1504能夠擴展光學影像感測器746及748之動態範圍且壓縮用於顯示之所得像素資料範圍。To detect light beyond the dynamic range, the processor 1504 can execute one of several high dynamic range ("HDR") programs 1510, including (for example) a multiple exposure program, a multi-slope pixel integration program, and A multi-sensor image fusion program. In one example, the multiple exposure program can utilize HDR features integrated or embedded with optical image sensors 746 and 748. In this method, the pixel groups 1006 and 1008 are placed in the integration mode for a normal exposure time. The rows of the pixel groups 1006 and 1008 are read and stored in one of the optical image sensors 746 and 748 and/or the memory 1506 of the image sensor controller 1502. After reading is performed by the processor 1504, each row in the pixel groups 1006 and 1008 is turned on again for a second exposure time which is less than the normal exposure time. The processor 1504 reads each of the pixel rows after the second exposure time and combines this pixel data with the pixel data from the same row for the normal exposure time. The processor 1504 can apply tone mapping to select (or combine both) pixel data from normal length exposure time and pixel data from short length exposure time and map the resulting pixel data to be compatible with downstream processing and display One range. Using the multiple exposure program, the processor 1504 can expand the dynamic range of the optical image sensors 746 and 748 and compress the pixel data range obtained for display.
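The per-pixel fusion step of the multiple-exposure approach above can be sketched as follows. The selection rule, the exposure-ratio rescaling, and the logarithmic tone curve are illustrative assumptions; the text does not specify the actual combining or tone-mapping functions.

```python
# Hedged sketch of multiple-exposure HDR fusion: pixels that saturate during
# the normal exposure are replaced by short-exposure values rescaled by the
# exposure ratio, then the extended range is compressed by a simple log tone
# map. Both the fusion rule and the tone curve are assumptions.
import math

MAX_CODE = 4095  # 12-bit sensor data

def fuse(normal_code: int, short_code: int, ratio: float) -> float:
    """Combine one pixel from a normal and a short exposure.

    ratio = normal_exposure_time / short_exposure_time (> 1).
    """
    if normal_code < MAX_CODE:   # not saturated: trust the normal exposure
        return float(normal_code)
    return short_code * ratio    # saturated: rescale the short exposure

def tone_map(value: float, ratio: float) -> float:
    """Compress the extended range [0, MAX_CODE * ratio] back to [0, MAX_CODE]."""
    return MAX_CODE * math.log1p(value) / math.log1p(MAX_CODE * ratio)
```

Fusing first and compressing second matches the order described in the text: the combined pixel data exceeds the sensor's native range, and tone mapping maps it back into a range compatible with downstream processing and display.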
處理器1504可針對相對暗光操作一類似程式。然而,替代第二曝光時間比正常時間少,第二曝光時間大於正常時間,因而更多時間提供像素以累積一電荷。處理器1504可使用色調映射來調整所讀取像素資料以補償較長曝光時間。The processor 1504 can operate a similar program for relatively dim light. However, instead of the second exposure time being shorter than the normal time, the second exposure time is longer than the normal time, thereby giving the pixels more time to accumulate a charge. The processor 1504 can use tone mapping to adjust the read pixel data to compensate for the longer exposure time.
3.圖框率實例 3. Frame rate example
實例性處理器1504可控制或規定光學影像感測器746及748之一圖框率。在某些實施例中,光學影像感測器746及748包含板上定時電路及可程式化控制暫存器以規定像素組1006及1008內之像素中之每一者將循環通過上文所論述之成像模式之每秒次數。每當像素組透過三個模式進展時形成一圖框或影像。一圖框率係整合、讀取且重設像素組1006及1008中之像素之每秒次數。The example processor 1504 can control or specify a frame rate of the optical image sensors 746 and 748. In some embodiments, the optical image sensors 746 and 748 include on-board timing circuits and programmable control registers to specify that each of the pixels in the pixel groups 1006 and 1008 will cycle through as discussed above The number of imaging modes per second. Each time the pixel group progresses through the three modes, a frame or image is formed. A frame rate is the number of times per second that the pixels in the pixel groups 1006 and 1008 are integrated, read, and reset.
處理器1504可與光學影像感測器746及748同步,使得在適當時間進行讀取。在其他實例中,處理器1504與光學影像感測器746及748異步。在此等其他實例中,光學影像感測器746及748可在一本地讀取之後將像素資料儲存至一暫時記憶體或佇列。然後可由處理器1504週期性地讀取該像素資料以達成右影像與左影像同步。The processor 1504 can be synchronized with the optical image sensors 746 and 748 to enable reading at an appropriate time. In other examples, the processor 1504 is asynchronous with the optical image sensors 746 and 748. In these other examples, the optical image sensors 746 and 748 can store the pixel data in a temporary memory or queue after a local read. The processor 1504 can then periodically read the pixel data to achieve synchronization of the right image and the left image.
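The periodic pairing of queued left and right frames described above can be sketched as a timestamp-matching pass. The function, the greedy matching policy, and the drop-unmatched behavior are assumptions for illustration; only the sub-200-microsecond figure comes from the text.

```python
# Illustrative sketch: pair left/right frames pulled from per-sensor queues
# by capture timestamp. Two frames form a stereo pair when their timestamps
# differ by no more than a tolerance such as the ~200 us lag noted in the
# text. The greedy policy and all names are assumptions.
TOLERANCE_US = 200

def pair_frames(left_ts, right_ts, tolerance_us=TOLERANCE_US):
    """Greedily match two sorted timestamp lists; returns (left, right) pairs."""
    pairs, i, j = [], 0, 0
    while i < len(left_ts) and j < len(right_ts):
        delta = left_ts[i] - right_ts[j]
        if abs(delta) <= tolerance_us:
            pairs.append((left_ts[i], right_ts[j]))
            i += 1
            j += 1
        elif delta > 0:
            j += 1   # right frame has no partner within tolerance; drop it
        else:
            i += 1   # left frame has no partner within tolerance; drop it
    return pairs
```

A matching pass like this lets the processor tolerate the slight free-running timing mismatch between the two sensors while still emitting synchronized stereo pairs downstream.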
以一時間順序方式處理圖框或影像(例如,創建一影像串流)會提供傳達為一視訊之一運動錯覺。實例性處理器1504經組態以程式化將一平滑視訊之外觀提供給一觀察者之一圖框率。太低之一圖框率使任何運動表現得不連貫或不均勻。高於一最大臨限值圖框率之電影品質對於一觀察者而言係不可辨別的。實例性處理器1504經組態以產生大致20至70個圖框/秒,較佳地針對典型外科手術視覺化介於50個圖框/秒與60個圖框/秒之間。Processing frames or images in a time-sequential manner (for example, creating an image stream) provides the illusion of motion conveyed as a video. The example processor 1504 is configured to program a frame rate that provides an observer with the appearance of smooth video. A frame rate that is too low makes any motion appear choppy or uneven. Movie quality above a maximum threshold frame rate is indiscernible to an observer. The example processor 1504 is configured to generate approximately 20 to 70 frames per second, preferably between 50 and 60 frames per second for typical surgical visualization.
4.感測器同步實例 4. Sensor synchronization example
圖15之實例性處理器1504經組態以控制光學影像感測器746及748之同步。例如,處理器1504可同時將電力提供至光學影像感測器746及748。處理器1504然後可將一時脈信號提供至光學影像感測器746及748兩者。該時脈信號使得光學影像感測器746及748能夠在一自由運行模式中獨立地但以一經同步及/或同時方式操作。因此,光學影像感測器746及748在幾乎相同時間記錄像素資料。實例性處理器1504自光學影像感測器746及748接收像素資料,建構影像及/或圖框之至少一分率且使影像及/或圖框(或其分率)同步以解釋任何稍微定時不匹配。通常,光學影像感測器746及748之間的遲滯小於200微秒。在其他實施例中,處理器1504可在(舉例而言)每一重設模式之後使用一同步接針來同時啟動光學影像感測器746及748。The example processor 1504 of FIG. 15 is configured to control the synchronization of the optical image sensors 746 and 748. For example, the processor 1504 can provide power to the optical image sensors 746 and 748 at the same time. The processor 1504 can then provide a clock signal to both of the optical image sensors 746 and 748. The clock signal enables the optical image sensors 746 and 748 to operate independently in a free-running mode but in a synchronized and/or simultaneous manner. Accordingly, the optical image sensors 746 and 748 record pixel data at nearly the same time. The example processor 1504 receives the pixel data from the optical image sensors 746 and 748, constructs at least a fraction of the images and/or frames, and synchronizes the images and/or frames (or fractions thereof) to account for any slight timing mismatch. Typically, the lag between the optical image sensors 746 and 748 is less than 200 microseconds. In other embodiments, the processor 1504 can use a synchronization pin to start the optical image sensors 746 and 748 simultaneously after, for example, each reset mode.
B.實例性馬達與光照模組 B. Example motor and lighting module
The example stereo visualization camera 300 of FIG. 15 includes the motor and lighting module 1406 to control one or more motors or actuators for moving the lenses of the optical elements 1402 and/or for controlling the light output of the light sources 708. The example motor and lighting module 1406 includes a motor and lighting controller 1520, which contains a processor 1522, a memory 1524, and a communication interface 1526 that are communicatively coupled together via a communication bus 1528. The memory 1524 stores one or more programs 1530 that are executable on the processor 1522 to control, adjust, and/or calibrate the lenses of the optical elements 1402 and/or the light sources 708. In some embodiments, the programs 1530 may be transmitted to the memory 1524 from the information processor module 1408 and/or the user input device 1410.
The communication interface 1526 is communicatively coupled to the communication interface 1508 of the image capture module 1404 and to a communication interface 1532 of the information processor module 1408. The communication interface 1526 is configured to receive command messages, timing signals, status messages, etc. from the image capture module 1404 and the information processor module 1408. For example, the processor 1504 of the image capture module 1404 may send timing signals to the processor 1522 to synchronize the timing between the lighting control and the exposure times of the optical image sensors 746 and 748. In another example, the information processor module 1408 may send command messages indicating that a particular light source 708 is to be activated and/or that a particular lens of the optical elements 1402 is to be moved. The commands may be responsive to input received from an operator via, for example, the user input device 1410. Additionally or alternatively, the commands may be responsive to a calibration routine and/or to real-time adjustments for reducing or eliminating image misalignment and/or defects, such as false parallax.
The example motor and lighting module 1406 includes drivers that provide power to control motors for adjusting an axial and/or radial position of the lenses of the optical elements 1402 and/or for controlling the light output from the light sources 708. Specifically, the motor and lighting module 1406 includes an NUV light driver 1534 to transmit an NUV signal to the NUV light source 708c, an NIR light driver 1536 to transmit an NIR signal to the NIR light source 708b, and a visible light driver 1538 to transmit a visible light signal to the visible light source 708a.
In addition, the motor and lighting module 1406 includes a filter motor driver 1540 to transmit a filter motor signal to a filter motor 1542, which controls the filter 740 of FIGS. 7 and 8. The motor and lighting module 1406 also includes a rear zoom lens motor driver 1544 to transmit a rear zoom lens motor signal to a rear zoom lens motor 1546, a front zoom lens motor driver 1548 to transmit a front zoom lens motor signal to a front zoom lens motor 1550, and a rear working distance lens motor driver 1552 to transmit a working distance lens motor signal to a working distance lens motor 1554. The motor and lighting module 1406 may also include a motor and/or actuator to move and/or tilt the deflection element 712.
The rear zoom lens motor 1546 is configured to rotate a drive screw, which causes the carrier 730 to move axially along a track or rail. The front zoom lens motor 1550 is configured to rotate a drive screw, which causes the carrier 724 to move axially along the track 1106 shown in FIGS. 11 and 12. The working distance lens motor 1554 is configured to rotate a drive screw, which causes the rear working distance lens 704 to move axially along a track or rail.
The light drivers 1534, 1536, and 1538 may include any type of lighting driver, transformer, and/or ballast. The drivers 1534, 1536, and 1538 are configured to output a pulse-width modulation ("PWM") signal to control an intensity of the light output by the light sources 708. In some embodiments, the processor 1522 may control the timing of the drivers 1534, 1536, and 1538 to correspond with the timing at which a particular filter is applied using the filter motor driver 1540.
The example drivers 1540, 1544, 1548, and 1552 may include, for example, stepper motor drivers and/or DC motor drivers. Likewise, the motors 1542, 1546, 1550, and/or 1554 may include a stepper motor, a DC motor, or other electric, magnetic, thermal, hydraulic, or pneumatic actuators. The motors 1542, 1546, 1550, and/or 1554 may include, for example, a rotary encoder, a slotted optical switch (e.g., a photointerrupter), and/or a linear encoder to report an angular position of a shaft and/or axle for feedback reporting and control. Alternative embodiments may include voice coil motors, piezoelectric motors, linear motors with suitable drivers, and equivalents thereof.
To control the drivers 1534, 1536, 1538, 1540, 1544, 1548, and 1552, the processor 1522 is configured to use a program 1530 to convert a command message into a digital and/or analog signal. The processor 1522 transmits the digital and/or analog signal to the appropriate driver, which outputs an analog power signal, such as a PWM signal, corresponding to the received signal. The analog power signal provides power to the appropriate motor or actuator, causing it to rotate (or otherwise move) by a desired amount.
The processor 1522 may receive feedback from the drivers 1534, 1536, 1538, 1540, 1544, 1548, and 1552, the motors 1542, 1546, 1550, and/or 1554, and/or the light sources 708. For the light sources, the feedback corresponds to, for example, a lighting level or light output. For the motors, the feedback corresponds to a position and/or an amount of movement of a motor (or other actuator). The processor 1522 uses a program 1530 to convert the received signals into digital feedback for determining, for example, a radial, tilt, and/or axial position of a lens based on the angular position of the corresponding motor or actuator shaft. The processor 1522 may then transmit a message with the position information to the information processor module 1408 for display to a user and/or for tracking a position of a lens of the optical elements 1402 for calibration.
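The conversion from encoder feedback to a lens position can be sketched as follows, assuming a lead-screw drive. All names and constants (`COUNTS_PER_REV`, `LEAD_MM`) are illustrative assumptions, not values from this disclosure:

```python
# Hypothetical sketch: converting rotary-encoder feedback into an axial
# lens position for a lead-screw drive. Constants are assumed.

COUNTS_PER_REV = 2000   # encoder counts per full shaft revolution (assumed)
LEAD_MM = 0.5           # axial travel per screw revolution in mm (assumed)

def axial_position_mm(encoder_counts: int, zero_offset_counts: int = 0) -> float:
    """Convert an absolute encoder count into lens travel from the zero point."""
    revolutions = (encoder_counts - zero_offset_counts) / COUNTS_PER_REV
    return revolutions * LEAD_MM

def counts_for_position(target_mm: float, zero_offset_counts: int = 0) -> int:
    """Inverse mapping: encoder target for a desired axial position."""
    return zero_offset_counts + round(target_mm / LEAD_MM * COUNTS_PER_REV)
```

With these assumed constants, 4000 counts corresponds to two screw revolutions, i.e., 1 mm of axial travel.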
In some embodiments, the motor and lighting module 1406 may include additional drivers to change an axial, tilt, and/or radial position of individual lenses within the optical elements 1402. For example, the motor and lighting module 1406 may include drivers that control motors for actuating the flexures 750 and 752 of the optical image sensors 746 and 748 to achieve tilt and/or radial/axial adjustment. In addition, the motor and lighting module 1406 may include drivers that control motors (or actuators) for individually tilting the front lenses 720 and 722, the front zoom lenses 726 and 728, the rear zoom lenses 732 and 734, the lens barrels 736 and 738, and/or the final optical elements 745 and 747, and/or for adjusting them radially along an x-axis or y-axis and/or axially. Independent adjustment of the lenses and/or sensors enables the motor and lighting controller 1520 to, for example, remove image defects and/or align the left and right images.
The following sections describe how the processor 1522 executes one or more programs 1530 to change a working distance, zoom, filter position, lens position, and/or light output.
1. Working distance example
The example processor 1522 of the motor and lighting module 1406 of FIG. 15 is configured to adjust a working distance of the stereo visualization camera 300. The working distance is set by adjusting a distance between the rear working distance lens 704 and the front working distance lens 408. The processor 1522 adjusts this distance by causing the rear working distance lens 704 to move relative to the front working distance lens 408. Specifically, the processor 1522 sends, to the rear working distance lens motor driver 1552, a signal that activates the working distance lens motor 1554 for a predetermined time proportional to the amount by which the rear working distance lens 704 is to move. The working distance lens motor 1554 turns a lead screw through a threaded attachment on a sliding track that holds the rear working distance lens 704. The working distance lens motor 1554 thereby moves the lens 704 a desired distance, adjusting the working distance. The working distance lens motor 1554 may provide a feedback signal to the processor 1522 for determining whether the rear working distance lens 704 moved the desired amount. If the movement was less or more than the desired amount, the processor 1522 may send further instructions to refine the position of the rear working distance lens 704. In some embodiments, the information processor module 1408 may determine the feedback control for the rear working distance lens 704.
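The refine-until-correct behavior described above can be sketched as a small feedback loop. The helper names (`read_encoder`, `command_move`) and the tolerance are hypothetical stand-ins for the driver 1552 / motor 1554 interface, not names from this disclosure:

```python
# Minimal closed-loop sketch of position refinement with feedback.
# `read_encoder` and `command_move` are illustrative stand-ins.

TOLERANCE = 5  # acceptable residual error in encoder counts (assumed)

def move_with_feedback(target_counts, read_encoder, command_move, max_tries=10):
    """Drive toward `target_counts`, re-issuing smaller corrective moves
    until the residual error is within tolerance."""
    for _ in range(max_tries):
        error = target_counts - read_encoder()
        if abs(error) <= TOLERANCE:
            return True          # position reached within tolerance
        command_move(error)      # command a corrective move
    return False                 # failed to converge

# Example with a simulated motor that only achieves 90% of each command:
state = {"pos": 0}
def read_encoder():
    return state["pos"]
def command_move(counts):
    state["pos"] += int(counts * 0.9)

assert move_with_feedback(1000, read_encoder, command_move)
```

Each iteration commands the remaining error, so the simulated under-shooting motor converges to within tolerance after a few corrections.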
To determine a position of the rear working distance lens 704, the processor 1522 may operate one or more calibration programs 1530. For example, upon startup, the processor 1522 may instruct the working distance lens motor 1554 to turn the lead screw so that the rear working distance lens 704 moves along a track or rail until a limit switch at one end of the range of motion is triggered. The processor 1522 may designate this stop position as a zero point for the encoder of the motor 1554. Knowing the current position of the rear working distance lens 704 and the corresponding encoder value, the processor 1522 is then able to determine the number of shaft rotations that will move the rear working distance lens 704 to a desired position. The number of shaft rotations is transmitted as an analog signal to the working distance lens motor 1554 (via the driver 1552), thereby moving the lens 704 to a specified position.
2. Zoom example
The example processor 1522 of FIG. 15 is configured to execute one or more programs 1530 to change a zoom level of the stereo visualization camera 300. As discussed above, zooming (e.g., a magnification change) is achieved by changing the positions of the front zoom group 724 and the rear zoom group 730 relative to each other and relative to the front lens group 714 and the lens barrel group 718. Similar to the calibration procedure described above for the rear working distance lens 704, the processor 1522 may calibrate the positions of the groups 724 and 730 along a track or rail. In particular, the processor 1522 sends instructions that cause the rear zoom lens motor 1546 and the front zoom lens motor 1550 to move the groups 724 and 730 (e.g., their carriers) along a rail (or track) to a stop position at a limit switch. The processor 1522 receives encoder feedback from the motors 1546 and 1550 to determine the encoder values associated with the stop positions of the groups 724 and 730. The processor 1522 may then zero the encoder values, or use the known encoder values at the stop positions, to determine how far to actuate the motors 1546 and 1550 to reach a desired position of the groups 724 and 730 along the track.
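The drive-to-limit-switch calibration above can be sketched as a homing routine for one carrier. `limit_switch` and `step_motor` are hypothetical stand-ins for the motor 1546/1550 interface, not names from this disclosure:

```python
# Hypothetical homing sketch: step toward a limit switch, then treat the
# stop position as the encoder zero point.

def home_axis(limit_switch, step_motor, max_steps=100_000):
    """Move one step at a time until the limit switch trips; return the
    number of steps taken, after which the encoder can be zeroed."""
    for steps in range(max_steps):
        if limit_switch():
            return steps       # hard stop reached: define encoder zero here
        step_motor(-1)         # one step toward the limit switch
    raise RuntimeError("limit switch never triggered")

# Simulated carrier starting 250 steps away from the switch:
pos = {"x": 250}
def limit_switch():
    return pos["x"] <= 0
def step_motor(d):
    pos["x"] += d

steps_taken = home_axis(limit_switch, step_motor)
```

After homing, subsequent moves can be commanded as offsets from the zeroed stop position.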
In addition to calibrating the stop positions, the processor 1522 may also execute programs 1530 that define the positions of the groups 724 and 730 for achieving a desired zoom level. For example, a known pattern of distance settings versus a set of desired zoom values may be stored as a program 1530 (or a lookup table) during a calibration procedure. The calibration procedure may include placing a template at the target site 700 and instructing the processor 1522 to move the groups 724 and 730 until a particular designated mark or character is a particular size in the right and left images or frames. For example, a calibration routine may determine the positions of the groups 724 and 730 on a rail at which the character "E" on a template at the target site 700 is displayed with a height of 10 pixels in the right and left images.
In some embodiments, the information processor module 1408 may perform visual analysis and send instructions to the processor 1522 specifying the desired movements of the groups 724 and 730 for zooming in or out. In addition, the information processor module 1408 may send instructions for moving the focal plane so that the target site 700 is in focus at the desired zoom level. For example, the instructions may include instructions to move the rear working distance lens 704 and/or to move the groups 724 and 730 together and/or individually. In some alternative embodiments, the processor 1522 may receive, from the user input device 1410 or another computer, calibration parameters for the rail positions of the front zoom group 724 and the rear zoom group 730 at particular zoom levels.
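One simple way such visual analysis could judge focus is a contrast metric over adjacent pixels. This is a deliberately simplified illustration, not the disclosure's actual analysis:

```python
# Hedged sketch of a contrast-style focus metric: the sum of absolute
# differences between horizontally adjacent pixels. Larger values suggest
# a sharper (better-focused) image.

def focus_metric(image):
    """`image` is a 2-D list of grayscale pixel values."""
    total = 0
    for row in image:
        for a, b in zip(row, row[1:]):
            total += abs(a - b)
    return total

sharp = [[0, 255, 0, 255],
         [255, 0, 255, 0]]
blurred = [[96, 128, 96, 128],
           [128, 96, 128, 96]]

assert focus_metric(sharp) > focus_metric(blurred)  # sharper image scores higher
```

An iterative focus search could move a lens back and forth and keep the position that maximizes this metric.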
The example processor 1522 and/or the information processor module 1408 may send instructions so that an image remains in focus while the magnification is changed. For example, the processor 1522 may use a program 1530 and/or a lookup table to determine how particular lenses are to move along an optical axis to hold the focus on the target site 700. The program 1530 and/or lookup table may specify magnification levels and/or set points along a rail, together with the corresponding lens adjustments needed to prevent the focal plane from moving.
Table 2 below shows an example program 1530 or lookup table that may be used by the processor 1522 to maintain focus while changing the magnification. The positions of the front zoom lens group 724 and the rear zoom lens group 730 are normalized to the stop positions of the respective groups 724 and 730 based on a length of a rail. To decrease the magnification, the rear zoom lens group 730 is moved toward the lens barrel group 718, thereby increasing its position along the rail. The front zoom lens group 724 is also moved; however, its movement is not necessarily equal to the movement of the rear zoom lens group 730. Instead, the movement of the front zoom lens group 724 accounts for changing the distance between the groups 724 and 730 to hold the position of the focal plane, thereby maintaining focus while the magnification changes. For example, to decrease the magnification level from 10X to 9X, the processor 1522 instructs the rear zoom lens group 730 to move from position 10 to position 11 along a rail. In addition, the processor 1522 instructs the front zoom lens group 724 to move from position 5 to position 4 along a rail (or the same rail as the group 730). The groups 724 and 730 are thus moved not only to change the magnification but also relative to each other to maintain focus.
Table 2
It should be appreciated that Table 2 provides one example of how the groups 724 and 730 may be moved. In other examples, Table 2 may include additional rows to account for more precise magnifications and/or positions of the groups 724 and 730. Additionally or alternatively, Table 2 may include a column for the rear working distance lens 704. For example, the rear working distance lens 704 may be moved instead of, or in conjunction with, the front zoom lens group 724 to maintain focus. Further, Table 2 may include rows specifying the positions of the groups 724 and 730 and of the rear working distance lens 704 for maintaining focus while the working distance is changed.
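The data of Table 2 is not reproduced here, but the two calibrated pairs stated in the text (10X at positions 5/10 and 9X at positions 4/11) are enough to sketch the lookup idea. The 8X row and the linear interpolation are assumptions added for illustration:

```python
# Hedged sketch of the Table 2 idea: a lookup table keyed by magnification
# giving normalized rail positions for the front (724) and rear (730) zoom
# groups. Only the 10X and 9X rows come from the text; the rest is assumed.

ZOOM_TABLE = {
    10.0: (5.0, 10.0),   # magnification: (front group 724, rear group 730)
    9.0:  (4.0, 11.0),
    8.0:  (3.2, 12.1),   # assumed values
}

def group_positions(mag):
    """Return (front, rear) rail positions, linearly interpolating
    between calibrated magnification entries."""
    mags = sorted(ZOOM_TABLE)
    if mag in ZOOM_TABLE:
        return ZOOM_TABLE[mag]
    lo = max(m for m in mags if m < mag)
    hi = min(m for m in mags if m > mag)
    t = (mag - lo) / (hi - lo)
    (f0, r0), (f1, r1) = ZOOM_TABLE[lo], ZOOM_TABLE[hi]
    return (f0 + t * (f1 - f0), r0 + t * (r1 - r0))

assert group_positions(9.0) == (4.0, 11.0)
assert group_positions(9.5) == (4.5, 10.5)  # midway between the 9X and 10X rows
```

Moving both groups by interpolated amounts mirrors the text's point that the two carriages move by unequal amounts to hold the focal plane.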
The values in Table 2 may be determined through calibration and/or received from a remote computer or the user input device 1410. During calibration, the information processor module 1408 may operate a calibration program 1560 that progresses through different magnifications and/or working distances. A processor 1562 of the information processor module 1408 may perform image processing on the images themselves, or on the received pixel data, to determine when a desired magnification is achieved using, for example, a template with predetermined shapes and/or characters. The processor 1562 determines whether the received image is in focus. In response to determining that the image is out of focus, the processor 1562 sends instructions to the processor 1522 to adjust the front zoom lens group 724 and/or the rear working distance lens 704. The adjustment may include iterative movement in the forward and reverse directions along an optical path until the processor 1562 determines that the image is in focus. To determine that an image is in focus, the processor 1562 may, for example, perform image analysis to search for the image with the least blur and/or analyze the pixel data for differences in light values between adjacent pixel regions (where larger differences correspond to a more focused image). After determining that an image is in focus at a desired working distance and magnification, the processor 1562 and/or the processor 1522 may then record the positions of the groups 724 and 730 and/or the rear working distance lens 704, together with the corresponding magnification level.
3. Filter position example
The example processor 1522 of the motor and lighting module 1406 of FIG. 15 is configured to move the filter 740 into the right and left optical paths based on received instructions. In some examples, the filter 740 may include a mirror array. In these examples, the processor 1522 sends instructions to the filter motor driver 1540 to actuate one or more motors 1542 to change the positions of the mirrors. In some instances, the driver 1540 may send a charge along one or more paths to the filter 740, causing particular mirror elements to switch to an on or off position. In these examples, the filter type selection is generally binary, based on which mirrors are actuated.
In other examples, the filter 740 may include a wheel with different types of filters, such as an infrared cut filter, a near-infrared bandpass filter, and a near-ultraviolet cut filter. In these examples, the wheel is rotated by the filter motor 1542. The processor 1522 determines the stop positions of the wheel that correspond to the divisions between the different filters. The processor 1522 also determines the rotary encoder values corresponding to each of the stop positions.
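The association between filter types and calibrated stop positions can be sketched as a small map plus a shortest-rotation calculation. The filter names and encoder counts below are assumptions, not values from this disclosure:

```python
# Illustrative sketch of a filter-wheel position map: each filter type is
# keyed to the encoder value of its calibrated stop position (assumed).

FILTER_STOPS = {
    "ir_cut":   0,
    "nir_pass": 1200,
    "nuv_cut":  2400,
}
COUNTS_PER_TURN = 3600  # encoder counts per full wheel revolution (assumed)

def counts_to_filter(target, current):
    """Shortest signed rotation (in counts) from `current` to the stop
    position of the requested filter."""
    delta = (FILTER_STOPS[target] - current) % COUNTS_PER_TURN
    if delta > COUNTS_PER_TURN // 2:
        delta -= COUNTS_PER_TURN   # rotate the other way if shorter
    return delta

assert counts_to_filter("nir_pass", 0) == 1200
assert counts_to_filter("ir_cut", 2400) == 1200
assert counts_to_filter("ir_cut", 1200) == -1200
```

Taking the shorter direction around the wheel minimizes the time the optical path spends between filters.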
The processor 1522 may operate a calibration program 1530, and/or the processor 1562 may operate a calibration program 1560, to determine the stop positions. For example, the processor 1522 may rotate the filter wheel 740 slowly while the processor 1562 determines when the light received at the pixels changes (using image analysis or by reading the pixel data from the image capture module 1404). A change in a light value at the pixels indicates a change in the filter type applied to the optical path. In some instances, the processor 1522 may change which light sources 708 are activated to create further differentiation at the pixels when a different filter type is applied.
4. Light control and filter examples
As disclosed above, the processor 1522 may control the light sources 708 in conjunction with the filter 740 so that light of a desired wavelength reaches the optical image sensors 746 and 748. In some examples, the processor 1522 may control or synchronize the timing between the activation of one or more of the light sources 708 and the activation of one or more of the filters 740. To synchronize the timing, a program 1530 may specify a delay time for activating a particular filter. The processor 1522 uses this program 1530 to determine when to transmit a signal activating the filter 740 relative to, for example, sending a signal that turns on a light source 708. The scheduled timing ensures that the appropriate filter 740 is applied when the specified light source 708 is activated. This configuration enables features highlighted by one light source 708 (such as fluorescence) to be displayed on top of, or in conjunction with, features displayed under a second light source 708 (such as white or ambient light).
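The delay-based scheduling above can be sketched as follows; the settle times and filter names are assumptions for illustration only:

```python
# Hedged sketch: given a light-source turn-on time, compute when to issue
# the filter command so the filter is in place before the light fires.

FILTER_SETTLE_MS = {        # time each filter needs to engage (assumed)
    "nir_pass": 15,
    "nuv_cut": 10,
}

def schedule(light_on_ms, filter_name):
    """Return (filter_cmd_time_ms, light_cmd_time_ms): the filter command
    is issued early enough that the filter settles before the light fires."""
    filter_cmd = light_on_ms - FILTER_SETTLE_MS[filter_name]
    return filter_cmd, light_on_ms

assert schedule(100, "nir_pass") == (85, 100)
```

Issuing the filter command ahead of the light command by the filter's settle time realizes the "appropriate filter when the light activates" guarantee described above.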
In some instances, the light sources 708 may be switched as quickly as the light filters 740 can be changed, enabling images recorded under different light to be displayed jointly, one atop another. For example, veins or other anatomical structures that fluoresce (due to an administered dye or contrast agent) may be displayed on top of an image recorded under ambient light. In this example, the veins are highlighted relative to the background anatomical features shown in visible light. In this instance, the processor 1562 of the information processor module 1408 and/or a graphics processing unit 1564 (e.g., a video or graphics card) combines or overlays one or more images recorded while one filter is applied with images recorded while a subsequent filter is applied.
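The overlay described above can be sketched with a simple per-pixel rule: fluorescing pixels above a threshold are blended into the white-light background. The threshold and blend ratio are assumptions, and plain grayscale lists keep the example dependency-free:

```python
# Illustrative sketch of overlaying a fluorescence frame on a white-light
# frame. Integer arithmetic (70% fluorescence, 30% background) keeps the
# blend deterministic.

THRESHOLD = 200  # fluorescence intensity treated as "highlighted" (assumed)

def overlay(background, fluorescence):
    """Blend fluorescence into the background wherever it exceeds THRESHOLD."""
    out = []
    for bg_row, fl_row in zip(background, fluorescence):
        out.append([
            (7 * fl + 3 * bg) // 10 if fl >= THRESHOLD else bg
            for bg, fl in zip(bg_row, fl_row)
        ])
    return out

bg = [[50, 60], [70, 80]]
fl = [[0, 255], [0, 210]]
assert overlay(bg, fl) == [[50, 196], [70, 171]]
```

Pixels below the threshold pass the background through unchanged, so only the fluorescing structures are highlighted.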
In some embodiments, the processor 1522 may activate multiple light sources 708 at the same time. The light sources 708 may be activated simultaneously or sequentially to "interleave" light of different wavelengths, so that different information can be extracted at the optical image sensors 746 and 748 using the appropriate pixels. Activating the light sources together can help illuminate a dark field. For example, some applications use UV light to stimulate fluorescence at the target site 700; however, UV light is perceived by an operator as very dark. The processor 1522 may therefore periodically activate the visible light source 708a to add some visible light to the field of view, so that the surgeon can observe the field of view, and some visible light can also be detected, without overwhelming the pixels that are sensitive to UV light. In another example, in some instances, alternating between the light sources 708 avoids washing out the pixels of the optical image sensors 746 and 748 that have overlapping sensitivities at the edges of their ranges.
5. Light intensity control
The example processor 1522 of FIG. 15 is configured to execute one or more programs 1530 to change an intensity or level of the illumination provided by the light sources 708. It should be appreciated that the depth of field depends on the illumination level at the target site 700; generally, higher illumination provides a greater depth of field. The processor 1522 is configured to ensure that an appropriate amount of illumination is provided for a desired depth of field without washing out or overheating the field of view.
The visible light source 708a is driven by the visible light driver 1538 and outputs light in the human-visible portion of the spectrum, as well as some light outside that region. The NIR light source 708b is driven by the NIR light driver 1536 and outputs light primarily at wavelengths referred to as near-infrared. The NUV light source 708c is driven by the NUV light driver 1534 and outputs light primarily at wavelengths deep in the blue portion of the visible spectrum, referred to as near-ultraviolet. The respective light drivers 1534, 1536, and 1538 are controlled by commands provided by the processor 1522. Control of the respective output spectra of the light sources 708 is achieved with PWM signals, in which a control voltage or current is switched between a minimum (e.g., off) and a maximum (e.g., on) value. The brightness of the light output from the light sources 708 is controlled by varying the switching rate and the percentage of time, per cycle, that the voltage or current of the PWM signal is at the maximum level.
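The duty-cycle relationship described above can be sketched as a mapping from a brightness fraction to PWM timer settings. The frequency and counter resolution are illustrative values, not parameters from this disclosure:

```python
# Hedged sketch of PWM brightness control: average light output scales
# with the fraction of each cycle spent at the maximum level (duty cycle).

def pwm_settings(brightness, freq_hz=20_000, counter_max=1000):
    """Map a 0.0-1.0 brightness to (period_us, on_counts) for a PWM timer."""
    if not 0.0 <= brightness <= 1.0:
        raise ValueError("brightness must be in [0, 1]")
    period_us = 1_000_000 // freq_hz             # cycle length in microseconds
    on_counts = round(brightness * counter_max)  # counts spent at maximum
    return period_us, on_counts

assert pwm_settings(0.25) == (50, 250)   # 25% duty cycle at 20 kHz
assert pwm_settings(1.0) == (50, 1000)   # fully on
```

A sufficiently high switching frequency is typically chosen so that the pulsing is invisible to both the observer and the exposure windows of the sensors.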
In some examples, the processor 1522 controls an output of the light sources 708 based on a size of the field of view or the zoom level. The processor 1522 may execute a program 1530 that specifies, for particular light-sensitivity settings, the light intensity as a function of zoom. For example, the program 1530 may include a lookup table that relates zoom levels to light intensity values. The processor 1522 uses the program 1530 to select the PWM signal for the light sources 708 based on the selected magnification level. In some examples, the processor 1522 may decrease the light intensity as the magnification increases, to maintain the amount of light provided to the field of view per unit area.
C. Example information processor module
The example information processor module 1408 within the stereo visualization camera 300 of FIG. 15 is configured to analyze and process the images/frames received from the image capture module 1404 for display. In addition, the information processor module 1408 is configured to interface with different devices and to translate control instructions into messages for the image capture module 1404 and/or the motor and lighting module 1406. The information processor module 1408 may also provide an interface for manual calibration and/or manage automatic calibration of the optical elements 1402.
如圖15中所展示,資訊處理器模組1408以通信方式耦合及/或電耦合至影像擷取模組1404及馬達與光照模組1406。舉例而言,除通信通道1566及1568之外,通信通道1514亦可包含USB 2.0或USB 3.0連接。如此,資訊處理器模組1408調節電力且將電力提供至模組1404及1406。在某些實施例中,資訊處理器模組1408將來自一壁式插座之110伏特交流電(「AC」)電力轉換成用於模組1404及1406之一5、10、12及/或24伏特直流電(「DC」)供應。另外或另一選擇係,資訊處理器模組1408自在立體視覺化攝影機300之殼體302內部之一電池及/或搬運車510處之一電池接收電力。As shown in FIG. 15, the information processor module 1408 is communicatively and/or electrically coupled to the image capture module 1404 and the motor and lighting module 1406. For example, in addition to the communication channels 1566 and 1568, the communication channel 1514 may include a USB 2.0 or USB 3.0 connection. As such, the information processor module 1408 conditions power and provides power to the modules 1404 and 1406. In some embodiments, the information processor module 1408 converts 110-volt alternating current ("AC") power from a wall outlet into a 5, 10, 12, and/or 24-volt direct current ("DC") supply for the modules 1404 and 1406. Additionally or alternatively, the information processor module 1408 receives power from a battery inside the housing 302 of the stereoscopic visualization camera 300 and/or a battery at the cart 510.
實例性資訊處理器模組1408包含通信介面1532以與影像擷取模組1404及馬達與光照模組1406雙向通信。資訊處理器模組1408亦包含處理器1562,處理器1562經組態以執行一或多個程式1560以處理自影像擷取模組1404接收之影像/圖框。程式1560可儲存於一記憶體1570中。另外,處理器1562可執行光學元件1402之校準及/或調整光學元件1402以對準右影像及左影像及/或移除視覺缺陷。The example information processor module 1408 includes a communication interface 1532 to communicate with the image capture module 1404 and the motor and illumination module 1406 bidirectionally. The information processor module 1408 also includes a processor 1562, which is configured to execute one or more programs 1560 to process images/frames received from the image capture module 1404. The program 1560 can be stored in a memory 1570. In addition, the processor 1562 may perform calibration of the optical element 1402 and/or adjust the optical element 1402 to align the right and left images and/or remove visual defects.
為將影像及/或圖框處理成一經再現三維立體顯示,實例性資訊處理器模組1408包含圖形處理單元1564。圖16展示根據本發明之一實例性實施例之圖形處理單元1564之一圖式。在操作期間,處理器1562自影像擷取模組1404接收影像及/或圖框。一解壓縮常式1602將影像/圖框自有助於跨越通信通道1514傳輸之一格式轉換或以其他方式改變為有助於影像處理之一格式。例如,可跨越通信通道1514在多個訊息中傳輸影像及/或圖框。實例性解壓縮常式1602組合來自該多個訊息之資料以重新彙編圖框/影像。在某些實施例中,解壓縮常式1602可使圖框及/或影像排佇直至由圖形處理單元1564請求為止。在其他實例中,處理器1562可在完全接收且解壓縮之後傳輸每一右影像/圖框對及左影像/圖框對。To process the images and/or frames into a rendered three-dimensional stereoscopic display, the example information processor module 1408 includes a graphics processing unit 1564. FIG. 16 shows a diagram of the graphics processing unit 1564 according to an example embodiment of the present disclosure. During operation, the processor 1562 receives images and/or frames from the image capture module 1404. A decompression routine 1602 converts or otherwise changes the images/frames from a format conducive to transmission across the communication channel 1514 into a format conducive to image processing. For example, the images and/or frames may be transmitted across the communication channel 1514 in multiple messages. The example decompression routine 1602 combines the data from the multiple messages to reassemble the frames/images. In some embodiments, the decompression routine 1602 may queue the frames and/or images until requested by the graphics processing unit 1564. In other examples, the processor 1562 may transmit each right and left image/frame pair after it is fully received and decompressed.
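Reassembling a frame whose data arrived split across multiple messages, as the decompression routine 1602 is described as doing, might look like the following sketch. The message tuple layout is an assumption made for illustration; the actual wire format is not given in the disclosure.

```python
def reassemble_frames(messages):
    """Group chunked messages by frame id and reassemble complete frames.

    Each message is (frame_id, chunk_index, total_chunks, payload_bytes).
    Returns a dict mapping frame_id -> concatenated payload for every
    frame whose chunks have all arrived, regardless of arrival order.
    """
    pending = {}   # frame_id -> {chunk_index: payload}
    complete = {}
    for frame_id, idx, total, payload in messages:
        chunks = pending.setdefault(frame_id, {})
        chunks[idx] = payload
        if len(chunks) == total:
            complete[frame_id] = b"".join(chunks[i] for i in range(total))
            del pending[frame_id]
    return complete
```

Frames still missing chunks simply stay in `pending`, mirroring how the routine can queue partial data until it is requested.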
實例性圖形處理單元1564使用一或多個程式1580 (圖15中所展示)來使影像準備再現。在圖15及圖16中展示程式1580之實例。可由圖形處理單元1564之一處理器執行程式1580。另一選擇係,圖16中所展示之程式1580中之每一者可由一單獨圖形處理器、微控制器及/或特殊應用積體電路(「ASIC」)執行。舉例而言,一解拜耳程式1580a經組態以跨越相鄰像素平滑化或平均化像素值以補償應用於圖7及圖8之右光學影像感測器746及左光學影像感測器748之像素網格1002及1004之一拜耳圖案。圖形處理單元1564亦可包含用於色彩校正及/或白色平衡調整之程式1580b、1580c及1580d。圖形處理單元1564亦包含一再現器程式1580e以用於使經色彩校正影像/圖框準備顯示在顯示監視器512及514上。圖形處理單元1564可進一步與一周邊輸入單元介面1574交互及/或包含一周邊輸入單元介面1574,周邊輸入單元介面1574經組態以組合、融合或以其他方式包含用於與目標部位700之立體顯示一起呈現之其他影像及/或圖形。更一般而言,在下文論述程式1580及資訊處理器模組1408之額外細節。The example graphics processing unit 1564 uses one or more programs 1580 (shown in FIG. 15) to prepare the images for rendering. Examples of the programs 1580 are shown in FIGS. 15 and 16. The programs 1580 may be executed by a processor of the graphics processing unit 1564. Alternatively, each of the programs 1580 shown in FIG. 16 may be executed by a separate graphics processor, microcontroller, and/or application-specific integrated circuit ("ASIC"). For example, a de-Bayer program 1580a is configured to smooth or average pixel values across adjacent pixels to compensate for a Bayer pattern applied to the pixel grids 1002 and 1004 of the right optical image sensor 746 and the left optical image sensor 748 of FIGS. 7 and 8. The graphics processing unit 1564 may also include programs 1580b, 1580c, and 1580d for color correction and/or white balance adjustment. The graphics processing unit 1564 also includes a renderer program 1580e for preparing the color-corrected images/frames for display on the display monitors 512 and 514. The graphics processing unit 1564 may further interact with and/or include a peripheral input unit interface 1574, which is configured to combine, fuse, or otherwise include other images and/or graphics for presentation together with the stereoscopic display of the target site 700. More generally, additional details of the programs 1580 and the information processor module 1408 are discussed below.
實例性資訊處理器模組1408可執行一或多個程式1560以檢查且改良立體視覺化攝影機300之延時。延時係指一事件出現在目標部位700處及使彼相同事件由顯示監視器512及514展示所花費之時間量。低延時提供立體視覺化攝影機300係一外科醫師之眼睛之一延伸之一感覺,而高延時趨向於自顯微外科手術程序分神。實例性處理器1562可追蹤在自光學影像感測器746及748讀取影像之間逝去了多少時間直至傳輸基於所讀取影像之經組合立體影像以用於顯示為止。高延時之偵測可致使處理器1562減少佇列時間,增加圖框率,及/或跳過某些色彩校正步驟。The example information processor module 1408 may execute one or more programs 1560 to check and improve the latency of the stereoscopic visualization camera 300. Latency refers to the amount of time between an event occurring at the target site 700 and that same event being shown by the display monitors 512 and 514. Low latency provides a sense that the stereoscopic visualization camera 300 is an extension of a surgeon's eyes, while high latency tends to distract from a microsurgical procedure. The example processor 1562 may track how much time elapses between reading the images from the optical image sensors 746 and 748 and transmitting the combined stereoscopic image based on the read images for display. Detection of high latency may cause the processor 1562 to reduce queue time, increase the frame rate, and/or skip certain color correction steps.
1.使用者輸入實例 1. Example User Inputs
圖15之資訊處理器模組1408之實例性處理器1562經組態以將使用者輸入指令轉換為用於馬達與光照模組1406及/或影像擷取模組1404之訊息。使用者輸入指令可包含改變立體視覺化攝影機300之光學態樣(包含一放大位準、一工作距離、一焦平面(例如,焦點)之一高度、一光照源708及/或濾波器740之一濾波器類型)之請求。該等使用者輸入指令亦可包含執行校準之請求,包含一影像對焦之指示及/或影像對準之指示及/或左影像與右影像之間的經對準ZRP之指示。該等使用者輸入指令可進一步包含對立體視覺化攝影機300之參數(諸如圖框率、曝光時間、色彩校正、影像解析度等)之調整。The example processor 1562 of the information processor module 1408 of FIG. 15 is configured to convert user input instructions into messages for the motor and lighting module 1406 and/or the image capture module 1404. The user input instructions may include requests to change optical aspects of the stereoscopic visualization camera 300, including a magnification level, a working distance, a height of a focal plane (e.g., focus), a lighting source 708, and/or a filter type of the filter 740. The user input instructions may also include requests to perform calibration, including indications of image focus and/or image alignment and/or of aligned ZRPs between the left and right images. The user input instructions may further include adjustments to parameters of the stereoscopic visualization camera 300, such as frame rate, exposure time, color correction, image resolution, etc.
可自一使用者輸入裝置1410接收該等使用者輸入指令,使用者輸入裝置1410可包含圖3之控制臂304之控件305及/或一遠端控件。使用者輸入裝置1410亦可包含一電腦、平板電腦等。在某些實施例中,經由一網路介面1572及/或一周邊輸入單元介面1574接收該等指令。在其他實施例中,可自一有線連接及/或一RF介面接收該等指令。The user input commands may be received from a user input device 1410, and the user input device 1410 may include the control 305 of the control arm 304 of FIG. 3 and/or a remote control. The user input device 1410 may also include a computer, a tablet computer, and so on. In some embodiments, the commands are received via a network interface 1572 and/or a peripheral input unit interface 1574. In other embodiments, the commands can be received from a wired connection and/or an RF interface.
實例性處理器1562包含用於判定一指令類型且判定將如何處理使用者輸入之程式1560。在一實例中,一使用者可按壓控件305之一按鈕以改變一放大位準。可繼續按壓該按鈕直至操作者已致使立體視覺化攝影機300達到一所要放大位準。在此等實例中,該等使用者輸入指令包含指示將(舉例而言)增加一放大位準之資訊。對於所接收之每一指令(或其中接收指示該指令之一信號之每一時間週期),處理器1562將指示放大率改變之一控制指令發送至馬達與光照處理器1406。處理器1522依據一程式1530判定將使用(舉例而言)表2使變焦透鏡組724及730移動多少。處理器1522因此將致使後變焦透鏡馬達1546及/或前變焦透鏡馬達1550使後變焦透鏡組730及/或前變焦透鏡組724移動由處理器1562規定之一量以達成所要放大位準的一信號或訊息傳輸至後變焦透鏡馬達驅動器1544及/或前變焦透鏡馬達驅動器1548。The example processor 1562 includes programs 1560 for determining an instruction type and determining how the user input is to be processed. In one example, a user may press a button of the controls 305 to change a magnification level. The button may be pressed continuously until the operator has caused the stereoscopic visualization camera 300 to reach a desired magnification level. In these examples, the user input instructions include information indicating that a magnification level is to be, for example, increased. For each instruction received (or each time period in which a signal indicative of the instruction is received), the processor 1562 sends a control instruction indicating the magnification change to the motor and lighting processor 1406. The processor 1522 determines, according to a program 1530 using (for example) Table 2, how far the zoom lens groups 724 and 730 are to be moved. The processor 1522 accordingly transmits, to the rear zoom lens motor driver 1544 and/or the front zoom lens motor driver 1548, a signal or message that causes the rear zoom lens motor 1546 and/or the front zoom lens motor 1550 to move the rear zoom lens group 730 and/or the front zoom lens group 724 by an amount specified by the processor 1562 to achieve the desired magnification level.
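A program such as 1530, which maps a requested magnification to zoom-lens positions, could be sketched as a small interpolated look-up table. The numeric values below are invented for illustration and do not reproduce the actual Table 2.

```python
from bisect import bisect_left

# Hypothetical mapping: (magnification, front_zoom_mm, rear_zoom_mm).
ZOOM_TABLE = [
    (1.0, 0.0, 0.0),
    (2.0, 3.5, 1.2),
    (4.0, 6.0, 2.8),
]

def lens_targets(mag):
    """Return (front, rear) lens positions for a requested magnification,
    linearly interpolating between the table entries and clamping to the
    table's range."""
    mags = [m for m, _, _ in ZOOM_TABLE]
    mag = max(mags[0], min(mags[-1], mag))
    i = bisect_left(mags, mag)
    if mags[i] == mag:                      # exact table hit
        return ZOOM_TABLE[i][1], ZOOM_TABLE[i][2]
    (m0, f0, r0), (m1, f1, r1) = ZOOM_TABLE[i - 1], ZOOM_TABLE[i]
    t = (mag - m0) / (m1 - m0)
    return f0 + t * (f1 - f0), r0 + t * (r1 - r0)
```

The processor would then send the returned positions as move commands to the front and rear zoom-lens motor drivers.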
應瞭解,在以上實例中,立體視覺化攝影機300基於使用者輸入而提供一改變,但亦做出自動調整以維持焦點及/或一高影像品質。例如,替代僅僅改變放大位準,處理器1522亦判定將如何使變焦透鏡組724及730移動以亦保持焦點,因而使一操作者免於必須手動地執行此任務。另外,處理器1562可在一放大位準改變時即時調整及/或對準右影像及左影像內之ZRP。舉例而言,此可藉由選擇或改變圖10之像素組1006及1008相對於像素網格1002及1004之位置來完成。It should be understood that in the above example, the stereoscopic visualization camera 300 provides a change based on user input, but also makes automatic adjustments to maintain focus and/or a high image quality. For example, instead of merely changing the magnification level, the processor 1522 also determines how to move the zoom lens groups 724 and 730 to also maintain the focus, thereby saving an operator from having to perform this task manually. In addition, the processor 1562 can instantly adjust and/or align the ZRP in the right and left images when a zoom level is changed. For example, this can be accomplished by selecting or changing the positions of the pixel groups 1006 and 1008 in FIG. 10 relative to the pixel grids 1002 and 1004.
在另一實例中,處理器1562可自使用者輸入裝置1410接收改變一圖框率之一指令。處理器1562將一訊息傳輸至影像擷取模組1404之處理器1504。繼而,處理器1504將指示新圖框率之內容寫入至右影像感測器746及左影像感測器748之暫存器。處理器1504亦可用新圖框率更新內部暫存器以改變讀取像素之一速度。In another example, the processor 1562 may receive a command from the user input device 1410 to change a frame rate. The processor 1562 transmits a message to the processor 1504 of the image capturing module 1404. Then, the processor 1504 writes the content indicating the new frame rate to the registers of the right image sensor 746 and the left image sensor 748. The processor 1504 can also update the internal register with the new frame rate to change the speed of reading pixels.
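The register update described above might be modeled as in the following sketch. The register address and the sensor class are hypothetical, since real image-sensor register maps are vendor-specific and are not given in this excerpt.

```python
FRAME_RATE_REG = 0x30  # invented register address for illustration

class MockSensor:
    """Stand-in for an optical image sensor's register file."""
    def __init__(self):
        self.regs = {}

    def write(self, addr, value):
        self.regs[addr] = value

def set_frame_rate(right_sensor, left_sensor, fps):
    """Mirror how processor 1504 writes a new frame rate to the
    registers of both the right (746) and left (748) sensors so the
    two eyes stay in step."""
    for sensor in (right_sensor, left_sensor):
        sensor.write(FRAME_RATE_REG, fps)
```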
在又一實例中,處理器1562可自使用者輸入裝置1410接收開始ZRP之一校準常式之一指令。作為回應,處理器1562可執行規定將如何操作校準之一程式1560。舉例而言,除用於驗證影像品質之一常式之外,程式1560亦可包含放大位準及/或工作距離之一進展或反覆。該常式可規定:對於每一放大位準,除ZRP之外,亦將驗證焦點。該常式亦可規定將如何調整變焦透鏡組724及730及/或後工作距離透鏡704以達成一對焦影像。該常式可進一步規定將如何針對放大位準使右影像及左影像之ZRP定中心。一旦影像品質已通過驗證,程式1560便可除像素組1006及1008之位置以及對應放大位準之外亦儲存變焦透鏡組724及/或730及/或後工作距離透鏡704之位置(至一查找表)。因此,當在一後續時間處請求相同放大位準時,處理器1562使用查找表來向馬達與光照模組1406規定變焦透鏡組724及/或730及/或後工作距離透鏡704之位置並向影像擷取模組1404規定像素組1006及1008之位置。應瞭解,在某些校準常式中,可徑向地/旋轉地調整光學元件1402之透鏡中之至少某些透鏡及/或使該等透鏡傾斜以將ZRP定中心及/或對準右影像及左影像。In yet another example, the processor 1562 may receive an instruction from the user input device 1410 to start a calibration routine for the ZRP. In response, the processor 1562 may execute a program 1560 that specifies how the calibration is to be operated. For example, in addition to a routine for verifying image quality, the program 1560 may include a progression or iteration of magnification levels and/or working distances. The routine may specify that, for each magnification level, focus is to be verified in addition to the ZRP. The routine may also specify how the zoom lens groups 724 and 730 and/or the rear working distance lens 704 are to be adjusted to achieve an in-focus image. The routine may further specify how the ZRPs of the right and left images are to be centered for the magnification level. Once the image quality has been verified, the program 1560 may store, to a look-up table, the positions of the zoom lens groups 724 and/or 730 and/or the rear working distance lens 704 in addition to the positions of the pixel sets 1006 and 1008 and the corresponding magnification level. Accordingly, when the same magnification level is requested at a subsequent time, the processor 1562 uses the look-up table to specify the positions of the zoom lens groups 724 and/or 730 and/or the rear working distance lens 704 to the motor and lighting module 1406, and to specify the positions of the pixel sets 1006 and 1008 to the image capture module 1404. It should be appreciated that in certain calibration routines, at least some of the lenses of the optical elements 1402 may be adjusted radially/rotationally and/or tilted to center the ZRPs and/or align the right and left images.
2.介面實例 2. Example Interfaces
為促進立體視覺化攝影機300與外部裝置之間的通信,實例性資訊處理器模組1408包含網路介面1572及周邊輸入單元介面1574。實例性網路介面1572經組態以使得遠端裝置能夠以通信方式耦合至資訊處理器模組1408以(舉例而言)儲存所記錄視訊,控制立體視覺化攝影機300之一工作距離、變焦位準、焦點、校準或其他特徵。在某些實施例中,該等遠端裝置可提供校準查找表之值或參數,或更一般而言提供具有經校準參數之程式1530。網路介面1572可包含一乙太網路介面、一區域網路介面及/或一Wi-Fi介面。To facilitate communication between the stereoscopic visualization camera 300 and external devices, the example information processor module 1408 includes a network interface 1572 and a peripheral input unit interface 1574. The example network interface 1572 is configured to enable remote devices to be communicatively coupled to the information processor module 1408 to, for example, store recorded video, or control a working distance, zoom level, focus, calibration, or other features of the stereoscopic visualization camera 300. In some embodiments, the remote devices may provide values or parameters for calibration look-up tables, or more generally provide programs 1530 with calibrated parameters. The network interface 1572 may include an Ethernet interface, a local area network interface, and/or a Wi-Fi interface.
實例性周邊輸入單元介面1574經組態而以通信方式耦合至一或多個周邊裝置1576且促進立體影像資料與周邊資料(諸如患者生理資料)之整合。周邊輸入單元介面1574可包含一Bluetooth®介面、一USB介面、一HDMI介面、SDI等。在某些實施例中,周邊輸入單元介面1574可與網路介面1572組合。The example peripheral input unit interface 1574 is configured to be communicatively coupled to one or more peripheral devices 1576 and facilitate the integration of stereoscopic image data with peripheral data (such as patient physiological data). The peripheral input unit interface 1574 may include a Bluetooth® interface, a USB interface, an HDMI interface, SDI, and so on. In some embodiments, the peripheral input unit interface 1574 can be combined with the network interface 1572.
舉例而言,周邊裝置1576可包含資料或視訊儲存單元、患者生理感測器、醫學成像裝置、輸注泵、透析機器及/或平板電腦等。周邊資料可包含來自一專用二維紅外線專業攝影機之影像資料、來自一使用者之膝上型電腦之診斷影像及/或來自一眼科裝置(諸如Alcon Constellation®系統及波技術光學波折射分析(ORA™)系統)之影像或患者診斷文字。For example, the peripheral devices 1576 may include data or video storage units, patient physiological sensors, medical imaging devices, infusion pumps, dialysis machines, and/or tablet computers, etc. The peripheral data may include image data from a dedicated two-dimensional infrared specialty camera, diagnostic images from a user's laptop computer, and/or images or patient diagnostic text from an ophthalmic device (such as the Alcon Constellation® system and wave-technology optical refraction analysis (ORA™) systems).
實例性周邊輸入單元介面1574經組態以將來自周邊裝置1576之資料轉換及/或格式化為供與立體影像一起使用之一適當數位形式。一旦呈數位形式,圖形處理單元1564便整合周邊資料與其他系統資料及/或立體影像/圖框。該資料與立體影像一起經再現以用於顯示在顯示監視器512及/或514上。The example peripheral input unit interface 1574 is configured to convert and/or format data from the peripheral device 1576 into an appropriate digital form for use with stereoscopic images. Once in digital form, the graphics processing unit 1564 integrates peripheral data with other system data and/or stereo images/frames. The data is reproduced together with the stereoscopic image for display on the display monitors 512 and/or 514.
為組態所包含之周邊資料與立體影像,處理器1562可控制一整合設置。在一實例中,處理器1562可致使圖形處理單元1564在顯示監視器512及/或514上顯示一組態面板。該組態面板可使得一操作者能夠將一周邊裝置1576連接至介面1574及處理器1562以隨後建立與裝置1576之通信。處理器1562然後可讀取哪一資料係可用的或使得操作者能夠使用該組態面板來選擇一資料目錄位置。該目錄位置中之周邊資料顯示在該組態面板中。該組態面板亦可給操作者提供使周邊資料與立體影像資料覆疊或顯示為一單獨圖片之一選項。To configure which peripheral data is included with the stereoscopic images, the processor 1562 may control an integration setting. In one example, the processor 1562 may cause the graphics processing unit 1564 to display a configuration panel on the display monitors 512 and/or 514. The configuration panel may enable an operator to connect a peripheral device 1576 to the interface 1574 and the processor 1562 to subsequently establish communication with the device 1576. The processor 1562 may then read which data is available, or enable the operator to use the configuration panel to select a data directory location. The peripheral data in that directory location is displayed in the configuration panel. The configuration panel may also provide the operator with an option to overlay the peripheral data with the stereoscopic image data or to display it as a separate picture.
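The "picture-in-picture" presentation option mentioned above reduces, at its simplest, to copying a small inset frame into a region of the larger frame. A minimal sketch follows; representing frames as nested lists of pixel values is an assumption for illustration only.

```python
def overlay_pip(main, inset, top, left):
    """Copy a small inset frame into a larger main frame at (top, left).

    Frames are lists of rows of pixel values. Rows are copied first so
    the original main frame is left untouched.
    """
    out = [row[:] for row in main]
    for r, inset_row in enumerate(inset):
        out[top + r][left:left + len(inset_row)] = inset_row
    return out
```

The same copy could place the peripheral data in a sub-window at the side or top of the main stereoscopic image window by choosing `top` and `left` accordingly.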
周邊資料(及覆疊格式)之選擇致使處理器1562讀取資料且將資料傳輸至圖形處理單元1564。圖形處理單元1564將周邊資料應用於立體影像資料以用於呈現為一覆疊圖形(諸如融合一術前影像或圖形與一即時立體影像)、一「畫中畫」及/或在主要立體影像窗側面或頂部之一子窗。The selection of the peripheral data (and the overlay format) causes the processor 1562 to read the data and transmit the data to the graphics processing unit 1564. The graphics processing unit 1564 applies the peripheral data to the stereoscopic image data for presentation as an overlay graphic (such as fusing a preoperative image or graphic with a real-time stereoscopic image), as a "picture-in-picture", and/or as a sub-window at the side or top of the main stereoscopic image window.
3.解拜耳程式實例 3. Example De-Bayer Program
圖16之實例性解拜耳程式1580a經組態以在每個像素值下產生具有針對紅色、綠色及藍色色彩之值之影像及/或圖框。如上文所論述,右光學影像感測器746及左光學影像感測器748之像素具有使在紅色波長範圍、藍色波長範圍或綠色波長範圍中之光通過之一濾波器。因此,每一像素僅含有光資料之一部分。因此,自影像擷取模組1404接收於資訊處理器模組1408中之每一影像及/或圖框具有含有紅色、藍色或綠色像素資料之像素。The example de-Bayer program 1580a of FIG. 16 is configured to generate images and/or frames having values for the red, green, and blue colors at each pixel. As discussed above, the pixels of the right optical image sensor 746 and the left optical image sensor 748 have a filter that passes light in the red, blue, or green wavelength range. Each pixel accordingly contains only a portion of the light data. As a result, each image and/or frame received in the information processor module 1408 from the image capture module 1404 has pixels containing red, blue, or green pixel data.
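A minimal nearest-neighbor-averaging sketch of recovering full RGB values from such a single-color-per-pixel mosaic is shown below. This is not the actual program 1580a; the RGGB layout and the 3x3 averaging window are illustrative assumptions, and production demosaicing is considerably more sophisticated.

```python
def debayer(mosaic, pattern=("RG", "GB")):
    """Recover an (R, G, B) tuple per pixel from a Bayer mosaic by
    averaging, for each color, the pixels of that color inside a 3x3
    window. `mosaic` is a 2-D list of intensities and `pattern` gives
    the repeating 2x2 color layout (RGGB here)."""
    h, w = len(mosaic), len(mosaic[0])

    def color_at(r, c):
        return pattern[r % 2][c % 2]

    out = []
    for r in range(h):
        row = []
        for c in range(w):
            samples = {"R": [], "G": [], "B": []}
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < h and 0 <= cc < w:
                        samples[color_at(rr, cc)].append(mosaic[rr][cc])
            row.append(tuple(sum(v) / len(v) if v else 0.0
                             for v in (samples["R"], samples["G"], samples["B"])))
        out.append(row)
    return out
```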
實例性解拜耳程式1580a經組態以對毗鄰及/或相鄰像素之紅色、藍色及綠色像素資料求平均以判定每一像素之更完整色彩資料。在一實例中,具有紅色資料之一像素及具有藍色資料之一像素位於具有綠色資料之兩個像素之間。該兩個像素之綠色像素資料經求平均且指派給具有紅色資料之像素及具有藍色資料之像素。在某些例項中,可基於具有紅色資料之像素及具有藍色資料之像素距各別綠色像素之一距離而對經求平均綠色資料進行加權。在計算之後,最初僅具有紅色或藍色資料之像素現在包含綠色資料。因此,在由圖形處理單元1564執行解拜耳程式1580a之後,每一像素含有針對一量定量之紅色、藍色及綠色光之像素資料。針對不同色彩之像素資料經摻合以判定色彩頻譜上之一所得色彩,再現器程式1580e可使用該所得色彩以用於顯示及/或顯示監視器512及514。在某些實例中,解拜耳程式1580a可判定所得色彩且儲存指示該色彩之資料或一識別符。The example de-Bayer program 1580a is configured to average the red, blue, and green pixel data of adjoining and/or neighboring pixels to determine more complete color data for each pixel. In one example, a pixel with red data and a pixel with blue data are located between two pixels with green data. The green pixel data of the two pixels is averaged and assigned to the pixel with red data and to the pixel with blue data. In some instances, the averaged green data may be weighted based on a distance of the pixel with red data and the pixel with blue data from the respective green pixels. After the calculation, pixels that initially had only red or blue data also contain green data. Accordingly, after the graphics processing unit 1564 executes the de-Bayer program 1580a, each pixel contains pixel data quantifying an amount of red, blue, and green light. The pixel data for the different colors is blended to determine a resulting color on the color spectrum, which the renderer program 1580e may use for display on the display monitors 512 and 514. In some examples, the de-Bayer program 1580a may determine the resulting color and store data or an identifier indicative of that color.
4.色彩校正實例 4. Example Color Correction
實例性色彩校正程式1580b、1580c及1580d經組態以調整像素色彩資料。感測器色彩校正程式1580b經組態以解釋或調整光學影像感測器746及748之色彩感測之可變性。使用者色彩校正程式1580c經組態以基於一操作者之感知及回饋而調整像素色彩資料。此外,顯示器色彩校正程式1580d經組態以基於一顯示監視器類型而調整 像素色彩資料。Example color correction programs 1580b, 1580c, and 1580d are configured to adjust pixel color data. The sensor color calibration program 1580b is configured to explain or adjust the color sensing variability of the optical image sensors 746 and 748. The user color calibration program 1580c is configured to adjust pixel color data based on the perception and feedback of an operator. In addition, the display color correction program 1580d is configured to adjust pixel color data based on a display monitor type.
為針對感測器可變性而校正色彩,實例性色彩校正程式1580b規定可由圖形處理單元1564及/或處理器1562執行之一校準常式。感測器校準包含將一經校準色彩圖表(諸如X-Rite公司之ColorChecker® Digital SG)放置於目標部位700處。處理器1562及/或圖形處理單元1564執行程式1580b,程式1580b包含將記錄色彩圖表之右影像及左影像之指令發送至影像擷取模組1404。可比較來自右影像及左影像之像素資料(在由解拜耳程式1580a處理之後)與相關聯於色彩圖表之像素資料(其可經由網路介面1572自一周邊單元1576及/或一遠端電腦儲存至記憶體1570)。處理器1562及/或圖形處理單元1564判定像素資料之間的差。該等差作為校準資料或參數儲存至記憶體1570。感測器色彩校正程式1580b將校準參數應用於後續右影像及左影像。To correct color for sensor variability, the example color correction program 1580b specifies a calibration routine executable by the graphics processing unit 1564 and/or the processor 1562. Sensor calibration includes placing a calibrated color chart (such as the ColorChecker® Digital SG from X-Rite, Inc.) at the target site 700. The processor 1562 and/or the graphics processing unit 1564 executes the program 1580b, which includes sending instructions to the image capture module 1404 to record right and left images of the color chart. The pixel data from the right and left images (after processing by the de-Bayer program 1580a) may be compared with pixel data associated with the color chart (which may be stored to the memory 1570 from a peripheral unit 1576 and/or a remote computer via the network interface 1572). The processor 1562 and/or the graphics processing unit 1564 determines differences between the pixel data. The differences are stored to the memory 1570 as calibration data or parameters. The sensor color correction program 1580b applies the calibration parameters to subsequent right and left images.
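Deriving calibration parameters as differences between measured chart patches and their known reference values, then applying them to later images, can be sketched as follows. Global per-channel offsets are one simple choice of parameter form; the disclosure does not specify the exact form, so treat this as an illustrative assumption.

```python
def fit_color_offsets(measured, reference):
    """Average the per-channel (R, G, B) differences between measured
    chart patches and their known reference values, yielding one global
    offset per channel."""
    n = len(measured)
    return tuple(
        sum(ref[ch] - mea[ch] for mea, ref in zip(measured, reference)) / n
        for ch in range(3)
    )

def apply_offsets(pixel, offsets):
    """Apply the stored calibration offsets to one (R, G, B) pixel."""
    return tuple(p + o for p, o in zip(pixel, offsets))
```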
在某些實例中,可在若干像素區域內對該等差求平均使得程式1580b找出可全域地應用於光學影像感測器746及748之所有像素以產生儘可能接近於色彩圖表之色彩的色彩校正資料之一最佳擬合。另外或另一選擇係,程式1580b可處理自使用者單元裝置1410接收之使用者輸入指令以校正色彩。該等指令可包含基於操作者偏好對紅色、藍色及綠色像素資料進行之區域及/或全域改變。In some examples, the differences may be averaged over a number of pixel regions such that the program 1580b finds a best fit of color correction data that can be applied globally to all pixels of the optical image sensors 746 and 748 to produce colors as close as possible to those of the color chart. Additionally or alternatively, the program 1580b may process user input instructions received from the user input device 1410 to correct color. The instructions may include regional and/or global changes to the red, blue, and green pixel data based on operator preference.
實例性感測器色彩校正程式1580b亦經組態以校正白色平衡。一般而言,白光應產生具有相等值之紅色、綠色及藍色像素。然而,像素之間的差可因在成像期間使用之光之色溫、像素中之每一者之濾波器及感測元件之固有態樣以及圖7及圖8之(舉例而言)偏轉元件712之頻譜濾波參數而產生。實例性感測器色彩校正程式1580b經組態以規定一校準常式以校正光失衡。The example sensor color correction program 1580b is also configured to correct white balance. In general, white light should produce red, green, and blue pixel values that are equal. However, differences between pixels may arise from the color temperature of the light used during imaging, from inherent aspects of each pixel's filter and sensing element, and from the spectral filtering parameters of, for example, the deflection element 712 of FIGS. 7 and 8. The example sensor color correction program 1580b is configured to specify a calibration routine to correct the light imbalance.
為執行白色平衡,處理器1562 (根據來自程式1580b之指令)可在顯示監視器512及/或514上顯示使一操作者將一中性卡放置在目標部位700處之一指令。處理器1562然後可指示影像擷取模組1404記錄該中性卡之一或多個影像。在由解壓縮常式1602及解拜耳程式1580a處理之後,程式1580b針對紅色、藍色及綠色資料中之每一者判定區域及/或全域白色平衡校準權重值,使得像素中之每一者具有紅色、藍色及綠色資料之實質上相等值。該等白色平衡校準權重值儲存至記憶體1570。在操作期間,圖形處理單元1564使用程式1580b來應用白色平衡校準參數以提供白色平衡。To perform white balance, the processor 1562 (according to the instructions from the program 1580b) can display an instruction on the display monitor 512 and/or 514 for an operator to place a neutral card at the target site 700. The processor 1562 can then instruct the image capture module 1404 to record one or more images of the neutral card. After being processed by the decompression routine 1602 and the Bayer program 1580a, the program 1580b determines the regional and/or global white balance calibration weight values for each of the red, blue, and green data, so that each of the pixels has The red, blue, and green data are substantially equal in value. The white balance calibration weight values are stored in the memory 1570. During operation, the graphics processing unit 1564 uses the program 1580b to apply white balance calibration parameters to provide white balance.
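Computing white-balance weights from an image of the neutral card so that red, green, and blue come out substantially equal can be sketched as below. Normalizing red and blue toward the green average is a common convention assumed here; the disclosure does not commit to a particular normalization.

```python
def white_balance_gains(neutral_pixels):
    """Derive per-channel gains from (R, G, B) pixels of a neutral
    (gray) card image. Green is kept fixed and the other channels are
    scaled so all three channel averages become equal."""
    n = len(neutral_pixels)
    avg = [sum(p[ch] for p in neutral_pixels) / n for ch in range(3)]
    return tuple(avg[1] / a for a in avg)   # gain = G_avg / channel_avg

def apply_gains(pixel, gains):
    """Apply white-balance gains to one (R, G, B) pixel."""
    return tuple(p * g for p, g in zip(pixel, gains))
```

Such gains could be applied digitally to the read-out pixel data, as in the digital-gain embodiment described below, or used as analog gains at the sensing elements.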
在某些實例中,程式1580b針對右光學影像感測器746及左光學影像感測器748個別地判定白色平衡校準參數。在此等實例當中,程式1580b可儲存左影像及右影像之單獨校準參數。在其他例項中,感測器色彩校正程式1580b判定右視圖與左視圖之間的一加權,使得色彩像素資料對於右光學影像感測器746及左光學影像感測器748幾乎完全相同。所判定權重可施加至白色平衡校準參數以供在立體視覺化攝影機300之操作期間之後續使用。In some examples, the program 1580b determines the white balance calibration parameters for the right optical image sensor 746 and the left optical image sensor 748 individually. In these examples, the program 1580b can store separate calibration parameters for the left and right images. In other examples, the sensor color correction program 1580b determines a weight between the right view and the left view, so that the color pixel data is almost the same for the right optical image sensor 746 and the left optical image sensor 748. The determined weight can be applied to the white balance calibration parameter for subsequent use during the operation of the stereo visualization camera 300.
在某些實施例中,圖16之感測器色彩校正程式1580b規定將應用白色平衡校準參數作為對右光學影像感測器746及左光學影像感測器748之像素之一數位增益。舉例而言,影像擷取模組1404之處理器1504將數位增益施加至自像素中之每一者讀取之像素資料。在其他實施例中,將應用白色平衡校準參數作為針對每一像素之色彩感測元件之一類比增益。In some embodiments, the sensor color correction program 1580b of FIG. 16 specifies that the white balance calibration parameters are applied as a digital gain to the pixels of the right optical image sensor 746 and the left optical image sensor 748. For example, the processor 1504 of the image capture module 1404 applies the digital gain to the pixel data read from each of the pixels. In other embodiments, the white balance calibration parameters are applied as an analog gain for the color sensing element of each pixel.
實例性感測器色彩校正程式1580b可在啟動不同光源708及/或不同濾波器類型之濾波器740時執行白色平衡及/或色彩校正。因此,記憶體1570可基於選擇哪一光源708而儲存不同校準參數。此外,感測器色彩校正程式1580b可針對不同類型之外部光執行白色平衡及/或色彩校正。一操作者可使用使用者輸入裝置1410來規定外部光源之特性及/或一類型。此校準使得立體視覺化攝影機300能夠針對不同光照環境提供色彩校正及/或白色平衡。The example sensor color correction program 1580b can perform white balance and/or color correction when the filters 740 of different light sources 708 and/or different filter types are activated. Therefore, the memory 1570 can store different calibration parameters based on which light source 708 is selected. In addition, the sensor color calibration program 1580b can perform white balance and/or color calibration for different types of external light. An operator can use the user input device 1410 to specify the characteristics and/or a type of the external light source. This calibration enables the stereo vision camera 300 to provide color correction and/or white balance for different lighting environments.
實例性程式1580b經組態以對光學影像感測器746及748中之每一者單獨執行校準。因此,程式1580b在操作期間將不同校準參數應用於右影像及左影像。然而,在某些實例中,可僅對一個感測器 746或748執行校準,其中校準參數用於另一感測器。The example program 1580b is configured to perform calibration on each of the optical image sensors 746 and 748 individually. Therefore, the program 1580b applies different calibration parameters to the right and left images during operation. However, in some instances, calibration may be performed on only one sensor 746 or 748, where the calibration parameters are used for the other sensor.
實例性使用者色彩校正程式1580c經組態以請求關於影像品質參數(諸如亮度、對比度、伽瑪、色調及/或飽和度)之操作者所提供回饋。可接收回饋作為來自使用者輸入裝置1410之指令。使用者做出之調整作為使用者校準參數儲存於記憶體1570中。隨後藉由使用者色彩校正程式1580c在針對光學影像感測器746及748之色彩校正之後將此等參數應用於右光學影像及左光學影像。The example user color correction program 1580c is configured to request operator-provided feedback regarding image quality parameters such as brightness, contrast, gamma, hue, and/or saturation. The feedback may be received as instructions from the user input device 1410. Adjustments made by the user are stored in the memory 1570 as user calibration parameters. These parameters are subsequently applied by the user color correction program 1580c to the right and left optical images after the color correction for the optical image sensors 746 and 748.
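Operator adjustments of brightness, contrast, and gamma on a normalized channel value might be applied as in this sketch. The specific transfer function is an illustrative assumption; the disclosure does not define one.

```python
def adjust_pixel(value, brightness=0.0, contrast=1.0, gamma=1.0):
    """Apply operator-chosen brightness/contrast/gamma to one channel
    value in [0, 1]: contrast pivots about mid-gray, brightness shifts
    the result, and the value is clamped before gamma correction."""
    v = (value - 0.5) * contrast + 0.5 + brightness
    v = min(1.0, max(0.0, v))
    return v ** (1.0 / gamma)
```

Stored user calibration parameters would simply be the `(brightness, contrast, gamma, ...)` settings applied to every channel of every pixel.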
圖16之實例性顯示器色彩校正程式1580d經組態以使用(舉例而言) Datacolor™ Spyder色彩核對器校正一顯示監視器之影像色彩。類似於程式1580b,程式1580d指示影像擷取模組1404記錄目標場景700處之一顯示器色彩模板之一影像。顯示器色彩校正程式1580d操作調整像素資料之一常式以匹配儲存於記憶體1570中之一查找表中之一預期顯示輸出。該經調整像素資料可作為顯示校準參數儲存至記憶體1570。在某些實例中,一攝影機或其他成像感測器可連接至周邊輸入單元介面1574,周邊輸入單元介面1574提供影像或自顯示監視器512及514記錄之關於色彩之其他回饋(其用於調整像素資料)。The example display color correction program 1580d of FIG. 16 is configured to correct the image color of a display monitor using, for example, a Datacolor™ Spyder color checker. Similar to the program 1580b, the program 1580d instructs the image capture module 1404 to record an image of a display color template at the target scene 700. The display color correction program 1580d operates a routine that adjusts the pixel data to match an expected display output stored in a look-up table in the memory 1570. The adjusted pixel data may be stored to the memory 1570 as display calibration parameters. In some examples, a camera or other imaging sensor may be connected to the peripheral input unit interface 1574, which provides images or other color-related feedback recorded from the display monitors 512 and 514 (which is used to adjust the pixel data).
5.立體影像顯示實例 5. Example Stereoscopic Image Display
圖16之圖形處理單元1564之實例性再現器程式1580e經組態以使右影像及/或圖框及左影像及/或圖框準備用於三維立體顯示。在藉由程式1580b、1580c及1580d對右影像及左影像之像素資料進行色彩校正之後,再現器程式1580e經組態以將左眼及右眼資料繪製成適合用於立體顯示之一格式且將最後所再現版本放置至一輸出緩衝器中以用於傳輸至顯示監視器512或514中之一者。The example renderer program 1580e of the graphics processing unit 1564 of FIG. 16 is configured to prepare the right image and/or frame and the left image and/or frame for 3D stereoscopic display. After performing color correction on the pixel data of the right and left images by the programs 1580b, 1580c, and 1580d, the renderer program 1580e is configured to render the left-eye and right-eye data into a format suitable for stereoscopic display and The final reproduced version is placed in an output buffer for transmission to one of the display monitors 512 or 514.
一般而言,再現器程式1580e接收一右影像及/或圖框及一左影像及/或圖框。再現器程式1580e將右影像及/或圖框以及左影像及/或圖框組合成一單個圖框。在某些實施例中,程式1580e操作一自上而下模式且將左影像資料之高度壓縮二分之一。程式1580e然後將經壓縮左影像資料放置於經組合圖框之一頂部二分之一中。類似地,程式1580e將右影像資料之高度壓縮二分之一且將經壓縮右影像資料放置於經組合圖框之一底部二分之一中。Generally speaking, the renderer program 1580e receives a right image and/or frame and a left image and/or frame. The renderer program 1580e combines the right image and/or frame and the left image and/or frame into a single frame. In some embodiments, the program 1580e operates in a top-down mode and compresses the height of the left image data by one-half. Program 1580e then places the compressed left image data in the top half of one of the combined frames. Similarly, the program 1580e compresses the height of the right image data by one half and places the compressed right image data in the bottom half of the combined frame.
In other embodiments, the renderer program 1580e operates in a side-by-side mode, in which the width of each of the left and right images is compressed by one-half and the two are combined into a single image, such that the left image data occupies the left half of the image and the right image data occupies the right half. In yet another alternative embodiment, the renderer program 1580e operates in a row-interleaved mode, in which every other line of the left and right frames is discarded and the remaining lines of the left and right frames are combined to form a complete stereoscopic image.
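The three frame-combining modes described above can be sketched as follows. This is an illustrative sketch only, not the actual renderer program 1580e: images are modeled as lists of pixel rows, and each halving simply drops every other row or column, whereas a real renderer would typically filter before decimating.

```python
# Sketch of the three stereoscopic frame-combining modes: top-bottom,
# side-by-side, and row-interleaved. All function names are illustrative.

def top_bottom(left, right):
    # Halve each image's height by keeping every other row,
    # then stack the left half-height image above the right one.
    return left[::2] + right[::2]

def side_by_side(left, right):
    # Halve each image's width by keeping every other column,
    # then place the left and right halves next to each other per row.
    return [l[::2] + r[::2] for l, r in zip(left, right)]

def row_interleave(left, right):
    # Discard every other line: even output rows come from the left
    # frame, odd output rows from the right frame.
    return [l if i % 2 == 0 else r for i, (l, r) in enumerate(zip(left, right))]

left = [["L"] * 4 for _ in range(4)]
right = [["R"] * 4 for _ in range(4)]
combined = row_interleave(left, right)
# combined alternates left and right rows: L, R, L, R
```

Each mode keeps the combined frame at the size of a single input frame, which is why one of the two dimensions (or half the lines) must be sacrificed.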
The example renderer program 1580e is configured to render the combined left and right images separately for each connected display monitor. For example, if both display monitors 512 and 514 are connected, the renderer program 1580e renders a first combined stereoscopic image for the display monitor 512 and a second combined stereoscopic image for the display monitor 514. The renderer program 1580e formats the first and second combined stereoscopic images so that they are compatible with the type and/or screen size of each display monitor and/or screen.
In some embodiments, the renderer program 1580e selects the image processing mode based on how the display monitor will present the stereoscopic data. For an operator's brain to correctly interpret the stereoscopic image data, the left-eye data of the stereoscopic image must reach the operator's left eye and the right-eye data must reach the operator's right eye. Generally, a display monitor provides a first polarization for the left-eye data and a second, opposite polarization for the right-eye data. The combined stereoscopic image must therefore match the polarization scheme of the display monitor.
FIG. 17 shows an example of the display monitor 512 according to an example embodiment of the present disclosure. For example, the display monitor 512 may be an LG® 55LW5600 three-dimensional television with a screen 1702. The example display monitor 512 uses a polarizing film on the screen 1702 such that all odd rows 1704 have a first polarization and all even rows 1706 have an opposite polarization. To be compatible with the display monitor 512 shown in FIG. 17, the renderer program 1580e must select the row-interleaved mode so that the left and right image data appear on alternating lines. In some instances, the renderer program 1580e may request (or otherwise receive) the display characteristics of the display monitor 512 before preparing the stereoscopic image.
To view the stereoscopic image displayed on the screen 1702, the surgeon 504 (introduced in FIG. 5) wears glasses 1712 that include a left lens 1714 with a first polarization matching that of the rows 1704 and a right lens 1716 with a second polarization matching that of the rows 1706. The left lens 1714 therefore passes most of the light from the left image data in the rows 1704 while blocking most of the light from the right image data. Likewise, the right lens 1716 passes most of the light from the right image data in the rows 1706 while blocking most of the light from the left image data. The amount of light from the "wrong" view that reaches each eye is known as "crosstalk" and is generally kept low enough to permit comfortable viewing. The surgeon 504 thus views the left image data recorded by the left optical image sensor 748 with the left eye while viewing the right image data recorded by the right optical image sensor 746 with the right eye. The surgeon's brain fuses the two views to form a perception of three-dimensional distance and/or depth. Use of such a display monitor is also advantageous for verifying the accuracy of the stereo visualization camera 300. If the surgeon or operator is not wearing the glasses, both the left and right views can be observed with both eyes. If a planar target is placed at the focal plane, the two images should in theory be aligned. If misalignment is detected, the processor 1562 can initiate a recalibration procedure.
The example renderer program 1580e is configured to render the left and right views for circular polarization. In other embodiments, however, the renderer program 1580e may provide a stereoscopic image compatible with linear polarization. Regardless of the polarization type used, the example processor 1562 may execute a program 1560 to verify or check the polarization of the stereoscopic image output by the renderer program 1580e. To check the polarization, the processor 1562 and/or the peripheral input unit interface 1574 inserts diagnostic data into the left and/or right image. For example, the processor 1562 and/or the peripheral input unit interface 1574 may overlay the text "left" on the left image and the text "right" on the right image. The processor 1562 and/or the peripheral input unit interface 1574 may then display a prompt instructing an operator to close one eye at a time while wearing the glasses 1712, to confirm that the left view is received at the left eye and the right view at the right eye. The operator provides confirmation via the user input device 1410 indicating whether the polarization is correct. If the polarization is incorrect, the example renderer program 1580e is configured to reverse the positions at which the left and right images are inserted into the combined stereoscopic image.
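The verification flow described above can be sketched as a simple label-confirm-swap routine. This is a hypothetical illustration only; the function names are not from the actual program 1560, and the overlay step is reduced to appending a text marker.

```python
# Sketch of the polarization check: label each view with diagnostic
# text, ask the operator which eye sees which label, and reverse the
# eye assignments if the report indicates they are swapped.

def label_views(left_frame, right_frame):
    # Stand-in for overlaying the "left"/"right" diagnostic text.
    return left_frame + ["[left]"], right_frame + ["[right]"]

def resolve_polarization(left_frame, right_frame, operator_says_correct):
    """Return the (left, right) pair to insert into the combined
    stereoscopic image, swapped when the operator reports that the
    labels appeared at the wrong eyes."""
    if operator_says_correct:
        return left_frame, right_frame
    return right_frame, left_frame

left, right = label_views(["L-pixels"], ["R-pixels"])
# Operator reports the views are swapped, so the renderer reverses them.
out_left, out_right = resolve_polarization(left, right, operator_says_correct=False)
```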
In still other embodiments, the example renderer program 1580e is configured to provide frame-sequential projection instead of forming a combined stereoscopic image. Here, the renderer program 1580e renders the left images and/or frames interleaved in time with the right images and/or frames, so that the left and right images are presented alternately to the surgeon 504. In these embodiments, the screen 1702 is not polarized. Instead, the left and right lenses of the glasses 1712 may be electronically or optically synchronized to their respective portions of the frame sequence, which presents the corresponding left and right views to a user so that depth can be discerned.
In some examples, the renderer program 1580e may provide particular right and left images for display on separate display monitors or in separate windows on one display monitor. Such a configuration can be especially beneficial when the lenses of the right and left optical paths of the optical elements 1402 are independently adjustable. In one example, a right optical path may be set at a first magnification level while a left optical path is set at a second magnification level. The example renderer program 1580e may accordingly display an image stream from the left view on the display monitor 512 and an image stream from the right view on the display monitor 514. In some instances, the left view may be shown in a first window on the display monitor 512 while the right view is shown in a second window (e.g., a picture-in-picture) on the same display monitor 512. Although not stereoscopic, the simultaneous display of the left and right images provides useful information to a surgeon.
In another example, the light source 708 and the filter 740 may be switched rapidly to generate alternating images under visible and fluorescent light. The example renderer program 1580e may combine the left and right views to provide a stereoscopic display under the different illumination sources, for example to highlight a blood vessel carrying a dye while showing the background in visible light.
In yet another example, a digital zoom may be applied to the right optical image sensor 746 and/or the left optical image sensor 748. Digital zoom generally affects the perceived resolution of an image and depends on factors such as display resolution and viewer preference. For example, the processor 1504 of the image capture module 1404 may apply digital zoom by forming interpolated pixels that are synthesized and interspersed between the digitally zoomed pixels. The processor 1504 may operate a program 1510 that coordinates the selection of pixels from the optical image sensors 746 and 748 and the interpolation of pixels. The processor 1504 transmits the right and left images, with digital zoom applied, to the information processor module 1408 for subsequent rendering and display.
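The interpolation idea described above can be sketched in one dimension as follows. This is a minimal illustration of synthesizing pixels interspersed between source pixels, assuming simple linear averaging; it is not the actual program 1510, which would operate on full two-dimensional pixel grids.

```python
# Sketch of 2x digital zoom on a single row of pixel intensities:
# each synthesized pixel is interspersed between two source pixels
# and takes their average (linear interpolation).

def digital_zoom_2x(row):
    out = []
    for a, b in zip(row, row[1:]):
        out.append(a)            # original pixel
        out.append((a + b) / 2)  # interpolated pixel between a and b
    out.append(row[-1])          # keep the final original pixel
    return out

zoomed = digital_zoom_2x([10, 20, 30])
# zoomed holds the original pixels with averages interspersed:
# [10, 15.0, 20, 25.0, 30]
```

The perceived resolution does not increase, since the interpolated pixels carry no new scene information, which is consistent with the dependence on display resolution and viewer preference noted above.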
In some embodiments, the processor 1504 receives an instruction from the processor 1562 that digitally zoomed images are to be recorded between images without digital zoom, to provide a picture-in-picture (or separate window) display of a digitally zoomed region of interest of the target site 700. The processor 1504 accordingly applies digital zoom to every other readout from the pixel grids 1002 and 1004. This enables the renderer program 1580e to display a full-resolution stereoscopic image and a digitally zoomed stereoscopic image simultaneously. Alternatively, the digitally zoomed image may be copied from the current image, scaled, and placed, during the rendering stage, in the appropriate position overlaid on top of the current image; this configuration avoids the "alternating" recording requirement.

6. Calibration Example
The example information processor module 1408 of FIGS. 14 to 16 may be configured to execute one or more calibration programs 1560 to calibrate, for example, a working distance and/or a magnification. For example, the processor 1562 may send instructions to the motor and lighting module 1406 to execute a calibration step that maps a working distance (measured in millimeters) from the main objective lens assembly 702 to the target site 700 to a known motor position of the working distance lens motor 1554. The processor 1562 performs the calibration by sequentially moving an object plane along the optical axis in discrete steps and refocusing the left and right images while recording the encoder counts and working distances. In some examples, the working distance may be measured by an external device that transmits the measured working distance values to the processor 1562 via the peripheral input unit interface 1574 and/or an interface to the user input device 1410. The processor 1562 may store the position of the rear working distance lens 704 (based on the position of the working distance lens motor 1554) and the corresponding working distance.
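The mapping described above can be sketched as a calibration table with interpolation between the recorded steps. The encoder counts and working distance values below are invented for illustration; the actual table would be populated by the calibration program 1560 from the recorded encoder counts and measured working distances.

```python
# Sketch of a working-distance calibration table: encoder counts
# recorded at discrete calibration steps map to measured working
# distances, and intermediate positions are linearly interpolated.

# (encoder_count, working_distance_mm) pairs recorded during calibration
CALIBRATION_POINTS = [(0, 200.0), (1000, 250.0), (2000, 325.0)]

def working_distance_mm(encoder_count):
    pts = sorted(CALIBRATION_POINTS)
    # Clamp queries outside the calibrated range.
    if encoder_count <= pts[0][0]:
        return pts[0][1]
    if encoder_count >= pts[-1][0]:
        return pts[-1][1]
    for (c0, w0), (c1, w1) in zip(pts, pts[1:]):
        if c0 <= encoder_count <= c1:
            t = (encoder_count - c0) / (c1 - c0)
            return w0 + t * (w1 - w0)

# Halfway between the first two calibration points: 225.0 mm
assert working_distance_mm(500) == 225.0
```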
The example processor 1562 may also execute a program 1560 to perform magnification calibration. The processor 1562 may use the motor and lighting module 1406 to set the optical elements 1402 to a selected magnification level and may record, for each magnification level, the position of the optical elements 1402 or the corresponding motor positions. The magnification level can be determined by measuring the height, in an image, of an object of known size. For example, the processor 1562 may measure an object as having a height of 10 pixels and use a look-up table to determine that a 10-pixel height corresponds to a 5X magnification.
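The look-up step described above can be sketched as follows. The table values are hypothetical, chosen only so that a 10-pixel height maps to 5X as in the example; a real table would be populated by the calibration program 1560.

```python
# Sketch of the magnification look-up: the measured pixel height of a
# known-size object is matched against the nearest calibration entry.

MAGNIFICATION_TABLE = {2: "1X", 6: "3X", 10: "5X", 20: "10X"}

def magnification_for_height(pixel_height):
    # Use the calibration entry whose recorded pixel height is closest
    # to the measured height of the known-size object.
    closest = min(MAGNIFICATION_TABLE, key=lambda h: abs(h - pixel_height))
    return MAGNIFICATION_TABLE[closest]

assert magnification_for_height(10) == "5X"
assert magnification_for_height(11) == "5X"  # nearest calibrated point
```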
To match the stereoscopic perspectives of two different imaging modalities, it is often desirable to model both as simple pinhole cameras. The perspective of a 3D computer model, such as an MRI brain tumor, can then be viewed from a user-adjustable direction and distance (e.g., as if the images were recorded by a synthesized stereoscopic camera). This adjustability can be used to match the perspective of the live surgical images, which must therefore be known. The example processor 1562 may calibrate one or more of these pinhole camera model parameters, such as, for example, a center of projection ("COP") of the right optical image sensor 746 and the left optical image sensor 748. To determine the center of projection, the processor 1562 determines a focal length from the center of projection to an object plane. First, the processor 1562 sets the optical elements 1402 to a magnification level. The processor 1562 then records measurements of the height of an image at three different distances along the optical axis: at the object plane, at a distance d closer than the object plane, and at a distance d farther than the object plane. The processor 1562 uses an algebraic formula based on similar triangles at the two most extreme positions to determine the focal length to the center of projection. The processor 1562 may determine the focal lengths at other magnifications using the same method, or by determining a ratio between the magnifications used for calibration. The processor may use a center of projection to match the perspective of an image of an object to be fused (such as an MRI tumor model) with the live stereoscopic surgical image. Additionally or alternatively, an existing camera calibration routine, such as OpenCV calibrateCamera, may be used to find the parameters described above as well as additional camera information, such as a distortion model of the optical elements 1402.
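The similar-triangles step described above can be sketched as follows. Under a pinhole model, the image height of a fixed object is inversely proportional to its distance Z from the center of projection, h = k / Z. From the two extreme measurement positions, separated by the known distance 2d, the distance from the COP to the object plane (midway between them) can be solved algebraically. The variable names and sample numbers below are illustrative only, not from the actual program 1560.

```python
# Sketch of solving for the COP-to-object-plane distance from the two
# extreme measurement positions of the three-distance procedure.

def cop_to_object_plane(h_near, h_far, d):
    """h_near, h_far: image heights measured at distances d closer than
    and d farther than the object plane. Returns the distance from the
    COP to the object plane, in the units of d."""
    # Pinhole model: h_near * z_near = h_far * z_far, with
    # z_far = z_near + 2*d. Solve for z_near:
    z_near = (2 * d * h_far) / (h_near - h_far)
    return z_near + d  # the object plane is midway between the positions

# Synthetic check: place the COP 400 units from the object plane with
# h = 40000 / Z, so h_near = 40000/390 and h_far = 40000/410.
d = 10.0
h_near, h_far = 40000 / 390.0, 40000 / 410.0
assert abs(cop_to_object_plane(h_near, h_far, d) - 400.0) < 1e-6
```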
The example processor 1562 may further calibrate the left and right optical axes. The processor 1562 determines an interpupillary distance between the left optical axis and the right optical axis for calibration. To determine the interpupillary distance, the example processor 1562 records left and right images with the pixel sets 1006 and 1008 centered on the pixel grids 1002 and 1004. The processor 1562 determines the ZRP positions of the left and right images (and/or the distance to a displaced object), which indicate the image misalignment and the degree of parallax. In addition, the processor 1562 scales the parallax and/or the distance based on the magnification level. Taking into account the degree of parallax and/or the scaled distance to the object in the display, the processor 1562 then uses a triangulation calculation to determine the interpupillary distance. The processor 1562 next associates the interpupillary distance with the optical axes at the specified magnification level as a calibration point.

VI. Image Alignment and False Parallax Adjustment Embodiments
Similar to human vision, a stereoscopic image comprises right and left views that converge at a point of interest. The right and left views are recorded from slightly different angles relative to the point of interest, which causes parallax between the two views. Items in the scene in front of or behind the point of interest exhibit parallax, from which the distance or depth of those items from the viewer can be inferred. The accuracy of the perceived distance depends, for example, on the clarity of the viewer's vision. Most people's vision exhibits some degree of imperfection, resulting in some inaccuracy between the right and left views. Nevertheless, these views can still achieve stereopsis, with the brain fusing the views with some degree of accuracy.
When the left and right images are recorded by a camera instead of being viewed directly by a person, the parallax between the combined images on a display screen produces stereopsis, giving a two-dimensional display the appearance of a three-dimensional stereoscopic image. Errors in the parallax can degrade the quality of the three-dimensional stereoscopic image. The inaccuracy of the observed parallax compared to a theoretically perfect parallax is referred to as false parallax. Unlike humans, cameras do not have a brain that automatically compensates for the inaccuracy.
If the false parallax becomes significant, the three-dimensional stereoscopic image can become unviewable, to the point of causing dizziness, headache, and nausea. Many factors can affect the parallax in a microscope and/or camera. For example, the optical channels of the right and left views may not be exactly equal: the channels may have mismatched focus, mismatched magnification, and/or misaligned points of interest. These problems can have different severity at different magnifications and/or working distances, which undermines efforts to correct them through calibration.
Known surgical microscopes, such as the surgical microscope 200 of FIG. 2, are configured to provide an adequate view through the eyepieces 206. Typically, the image quality of the optical elements of known surgical microscopes is insufficient for stereoscopic cameras. The reason is that manufacturers of surgical microscopes assume that viewing occurs primarily through the eyepieces. Any camera attachment (such as the camera 212) is either monoscopic, and therefore unaffected by false parallax, or stereoscopic with a low image resolution at which the false parallax is not noticeable.
International standards, such as ISO 10936-1:2000, Optics and optical instruments – Operation microscopes – Part 1: Requirements and test methods, have been developed to provide specification limits on the image quality of surgical microscopes. These specification limits are generally set for viewing through the eyepieces of a surgical microscope and do not take three-dimensional stereoscopic display into account. For example, regarding false parallax, ISO 10936-1:2000 specifies that the difference in the vertical axis between the left and right views should be less than 15 arcminutes. Small angular deviations of the axes are typically quantified in arcminutes (corresponding to 1/60 of a degree) or arcseconds (corresponding to 1/60 of an arcminute). For a typical surgical microscope with a working distance of 250 mm and a field of view of 35 mm (which corresponds to an angular field of view of about 8°), the 15-arcminute specification limit corresponds to roughly a 3% difference between the left and right views.
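The figures quoted above can be checked with a short calculation, sketched below: a 35 mm field at a 250 mm working distance subtends roughly an 8° angular field of view, and a 15-arcminute deviation is roughly 3% of that field.

```python
# Worked check of the 8-degree angular field of view and the
# 15-arcminute-to-3% conversion quoted in the text.
import math

working_distance_mm = 250.0
field_of_view_mm = 35.0

# Angular field subtended by the 35 mm field at 250 mm.
angular_fov_deg = 2 * math.degrees(
    math.atan((field_of_view_mm / 2) / working_distance_mm))

spec_limit_deg = 15 / 60.0  # 15 arcminutes expressed in degrees
fraction = spec_limit_deg / angular_fov_deg

assert abs(angular_fov_deg - 8.0) < 0.1  # approximately 8 degrees
assert abs(fraction - 0.03) < 0.005      # approximately a 3% difference
```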
This 3% difference is acceptable for eyepiece viewing, where a surgeon's brain can overcome a small degree of error. However, when viewed stereoscopically on a display monitor, the 3% difference produces a noticeable discrepancy between the left and right views. For example, when the left and right views are displayed together, a 3% difference makes an image appear disjointed and difficult to view for extended periods of time.
Another problem is that known surgical microscopes may meet the 15-arcminute specification limit at only one or a few magnification levels, and/or only individual optical elements may meet a particular specification limit. For example, individual lenses are manufactured to meet specific criteria. However, when individual optical elements are combined in an optical path, small deviations from the standard can be amplified rather than cancelled. This can be particularly significant when five or more optical elements are used in an optical path that includes a common main objective lens. In addition, it is very difficult to match the optical elements on parallel channels exactly. At best, during manufacturing, the optical elements of a surgical microscope are calibrated at only one or a few specific magnification levels to meet the 15-arcminute specification limit. The error between the calibration points can therefore be larger, even though the surgical microscope nominally meets the ISO 10936-1:2000 specification.
Moreover, the ISO 10936-1:2000 specification permits larger tolerances when additional components are added. For example, adding a second eyepiece (e.g., the eyepiece 208) increases the permitted false parallax by 2 arcminutes. Again, although this error may be acceptable for viewing through the eyepieces 206 and 208, the image misalignment becomes more significant when viewed stereoscopically through a camera.
Compared to known surgical microscopes, the example stereo visualization camera 300 disclosed herein is configured to automatically adjust at least some of the optical elements 1402 to reduce or eliminate false parallax. Embedding the optical elements within the stereo visualization camera 300 enables fine adjustments to be made automatically (and at times in real time) for three-dimensional stereoscopic display. In some embodiments, the example stereo visualization camera 300 can provide an accuracy of 20 to 40 arcseconds, which approaches a 97% reduction in optical error compared to the 15-arcminute accuracy of known surgical microscopes.
The accuracy improvement enables the example stereo visualization camera 300 to provide features that cannot be achieved with known stereoscopic microscopes. For example, many new microsurgical procedures rely on accurate measurements in a live surgical site for optimal sizing, positioning, matching, guidance, and diagnosis. These include determining the size of a blood vessel, the placement angle of a toric intraocular lens ("IOL"), the matching of vasculature from a preoperative image to a live view, the depth of a tumor beneath an artery, and so on. The example stereo visualization camera 300 therefore enables precise measurements, using, for example, graphical overlays or image analysis, to determine the size of anatomical structures.
Known surgical microscopes require a surgeon to place an object of known size, such as a micro-ruler, into the field of view. The surgeon compares the size of the object to the surrounding anatomical structures to determine an approximate size. However, this procedure is relatively slow, because the surgeon must place the object in position and then remove it after performing the measurement. In addition, the measurement provides only an approximation, since the size is based on the surgeon's subjective comparison and measurement. Some known stereoscopic cameras provide graphical overlays to determine size. However, if false parallax exists between the left and right views, the accuracy of these overlays is reduced.

A. ZRP as a Source of False Parallax
ZRP inaccuracy provides a significant source of error between the left and right images, producing false parallax. The ZRP, or zoom repeat point, refers to the point in a field of view that remains in the same position as the magnification level is changed. FIGS. 18 and 19 show examples of the ZRPs in left and right fields of view at different magnification levels. Specifically, FIG. 18 shows a left field of view 1800 at a low magnification level and a left field of view 1850 at a high magnification level, while FIG. 19 shows a right field of view 1900 at a low magnification level and a right field of view 1950 at a high magnification level.
It should be noted that FIGS. 18 and 19 show crosshairs 1802 and 1902 to provide an illustrative reference point for this disclosure. The crosshairs 1802 include a first crosshair 1802a positioned along a y-direction or y-axis and a second crosshair 1802b positioned along an x-direction or x-axis. Likewise, the crosshairs 1902 include a first crosshair 1902a positioned along a y-direction or y-axis and a second crosshair 1902b positioned along an x-direction or x-axis. In actual implementations, the example stereo visualization camera 300 generally does not include crosshairs, or does not add crosshairs to the optical paths by default, unless requested by an operator.
Ideally, each ZRP should be located at a central position or origin; for example, the ZRPs should be centered on the crosshairs 1802 and 1902. However, inaccuracies in the optical elements 1402 and/or slight misalignments between the optical elements 1402 cause the ZRP to lie away from the center of the crosshairs 1802 and 1902. In addition to the misalignment of the ZRPs between the left and right views, the degree of false parallax also corresponds to how far each of the left-view and right-view ZRPs lies from its respective center. Furthermore, inaccuracies in the optical elements 1402 can cause the ZRP to drift slightly as the magnification changes, contributing to an even greater degree of false parallax.
FIG. 18 shows three crescent-shaped objects 1804, 1806, and 1808 in the fields of view 1800 and 1850 of the target site 700 of FIG. 7. It should be understood that the fields of view 1800 and 1850 correspond to the linear fields of view of the optical image sensors 746 and 748. The objects 1804, 1806, and 1808 are placed in the field of view 1800 to illustrate how false parallax arises from misalignment between the left and right images. The object 1804 is positioned along the crosshair 1802a, above the crosshair 1802b. The object 1806 is positioned along the crosshair 1802b, to the left of the crosshair 1802a. The object 1808 is positioned slightly below the crosshair 1802b and to the right of the crosshair 1802a. A ZRP 1810 of the left field of view 1800 is located in a notch of the object 1808.
The left field of view 1800 is changed to the left field of view 1850 by increasing the magnification level (e.g., zooming) using the zoom lens assembly 716 of the example stereoscopic visualization camera 300. Increasing the magnification causes the objects 1804, 1806, and 1808 to appear to expand or grow, as shown in the field of view 1850. In the illustrated example, the field of view 1850 is at roughly 3X the magnification level of the field of view 1800.
Compared with the low-magnification field of view 1800, the objects 1804, 1806, and 1808 in the high-magnification field of view 1850 have increased in size by approximately 3X while also moving, relative to the ZRP 1810, to be 3X farther apart from one another. In addition, the positions of the objects 1804, 1806, and 1808 have shifted relative to the crosshairs 1802. Object 1804 is now displaced to the left of crosshair 1802a and slightly farther above crosshair 1802b. Object 1806 is now displaced farther to the left of crosshair 1802a and slightly above crosshair 1802b. Object 1808, in contrast, remains in generally the same (or nearly the same) position relative to the crosshairs 1802, with the ZRP 1810 in exactly (or nearly) the same position relative to the crosshairs 1802 and object 1808. In other words, as the magnification increases, the objects 1804, 1806, and 1808 (and everything else in the field of view 1850) appear to move away and outward from the ZRP 1810.
The same objects 1804, 1806, and 1808 are shown in the right fields of view 1900 and 1950 illustrated in FIG. 19. The location of the ZRP, however, is different. Specifically, a ZRP 1910 is located above crosshair 1902b and to the left of crosshair 1902a in the right fields of view 1900 and 1950. The ZRP 1910 is accordingly at a different location than the ZRP 1810 in the left fields of view 1800 and 1850. In the illustrated example, the left and right optical paths are assumed to be perfectly aligned at the first magnification level. The objects 1804, 1806, and 1808 shown in the right field of view 1900 are therefore in the same positions as the objects 1804, 1806, and 1808 in the left field of view 1800. Since the left and right views are aligned, there is no false parallax.
In the high-magnification field of view 1950, however, the objects 1804, 1806, and 1808 expand and move away from the ZRP 1910. Given the position of the ZRP 1910, object 1804 moves or shifts to the right and object 1806 moves or shifts downward. In addition, object 1808 moves downward and to the right compared with its position in the field of view 1900.
FIG. 20 shows a pixel diagram comparing the high-magnification left field of view 1850 with the high-magnification right field of view 1950. A grid 2000 represents an overlay of the positions of the objects 1804(L), 1806(L), and 1808(L) on the pixel grid 1004 of the left optical image sensor 748 with the positions of the objects 1804(R), 1806(R), and 1808(R) on the pixel grid 1002 of the right optical image sensor 746. FIG. 20 clearly shows that the objects 1804, 1806, and 1808 are located in different positions for the left field of view 1850 and the right field of view 1950. For example, object 1804(R) is located to the right of crosshair 1902a and above crosshair 1902b, while the same object 1804(L) is located to the left of crosshair 1802a and farther above crosshair 1802b.
The positional differences of the objects 1804, 1806, and 1808 correspond to false parallax, which results from imperfections in the optical alignment of the optical elements 1402 that produce the ZRPs 1810 and 1910 in different locations. Assuming there is no distortion or other imaging error, the false parallax shown in FIG. 20 is generally the same for all points in the image. When viewed through the eyepieces of a surgical microscope (such as the microscope 200 of FIG. 2), the positional differences of the objects 1804, 1806, and 1808 may not be noticeable. However, when viewed in a stereoscopic image on the display monitors 512 and 514, the differences become readily apparent and can cause headaches, nausea, and/or dizziness.
FIG. 21 shows a diagram illustrating false parallax with respect to a left ZRP and a right ZRP. The diagram includes a pixel grid 2100, which comprises an overlay of the right pixel grid 1002 and the left pixel grid 1004 of FIG. 10. In this illustrated example, a left ZRP 2102 of the left optical path is located at +4 along the x-axis and 0 along the y-axis. In addition, a right ZRP 2104 of the right optical path is located at -1 along the x-axis and 0 along the y-axis. An origin 2106 is shown at the intersection of the x-axis and the y-axis.
In this example, an object 2108 is aligned between the left and right images at a first, low magnification. When the magnification is increased 3X, the object 2108 increases in size and moves away from the ZRPs 2102 and 2104. An outline object 2110 shows a theoretical position of the object 2108 at the second, higher magnification if the ZRPs 2102 and 2104 were aligned with the origin 2106. Specifically, a notch of the object 2108 at the first magnification level is at position +2 along the x-axis. With 3X magnification, the notch moves 3X along the x-axis, such that the notch would be located at +6 along the x-axis at the higher magnification level. In addition, because the ZRPs 2102 and 2104 would theoretically be aligned at the origin 2106, the object 2110 would be aligned between the left and right views (shown as a single object in FIG. 21, given the overlay).
In this example, however, the misalignment of the left ZRP 2102 and the right ZRP 2104 causes the object 2110 to be misaligned between the left and right views at the higher magnification. Regarding the right optical path, the right ZRP 2104 is located at -1 along the x-axis, such that it is 3 pixels away from the notch of the object 2108 at the low magnification. At 3X magnification, this difference becomes 9 pixels, which is shown as object 2110(R). Similarly, the left ZRP 2102 is located at +4 pixels along the x-axis. At 3X magnification, the notch of the object 2108 moves from 2 pixels away to 6 pixels away, which is shown as object 2110(L) at -2 along the x-axis.
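The shifts in this example follow a simple linear model: under a magnification change M about a zoom repeat point, an image point at position p maps to zrp + M·(p − zrp). A minimal one-dimensional sketch (the function name is illustrative, not from the patent) reproduces the numbers above:

```python
def magnified_position(p, zrp, m):
    """Position of an image point p after a magnification change m
    about a zoom repeat point zrp (1-D pixel coordinates)."""
    return zrp + m * (p - zrp)

# The notch of object 2108 sits at +2 on the x-axis at low magnification.
left = magnified_position(2, zrp=4, m=3)    # left ZRP 2102 at +4
right = magnified_position(2, zrp=-1, m=3)  # right ZRP 2104 at -1

print(left)           # -2: object 2110(L)
print(right)          # 8: object 2110(R), now 9 pixels from the right ZRP
print(right - left)   # 10 pixels of false parallax between the two views
```

The same point thus lands at different pixel columns in the two views purely because the two ZRPs differ, even though both paths applied the same 3X magnification.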
The positional difference between the object 2110(L) and the object 2110(R) at the higher magnification corresponds to false parallax between the left and right views. If the right and left views are combined into a stereoscopic image for display, the positions of the object 2110 will be misaligned at each row (if the renderer program 1580e uses a row-interleaved mode). The misalignment is detrimental to producing stereopsis and can yield an image that appears blurry or that confuses an operator.

B. Other Sources of False Parallax
Although ZRP misalignment between the left and right optical paths is a significant source of false parallax, other error sources also exist. For example, false parallax can be caused by unequal magnification changes between the right and left optical paths. A magnification difference between the parallel optical paths can result from slight variations in the optical properties or characteristics of the lenses of the optical elements 1402. In addition, if each of the left front zoom lens 728 and the right front zoom lens 726, and each of the left rear zoom lens 734 and the right rear zoom lens 732, of FIGS. 7 and 8 is independently controlled, slight differences can arise from their positioning.
Referring back to FIGS. 18 and 19, a difference in magnification change produces differently sized objects, and different spacing between objects, for the left and right optical paths. If, for example, the left optical path has a greater magnification change, the objects 1804, 1806, and 1808 will appear larger than the objects 1804, 1806, and 1808 in the right field of view 1950 of FIG. 19 and will move a greater distance from the ZRP 1810. Even if the ZRP 1810 were aligned with the ZRP 1910, the difference in the positions of the objects 1804, 1806, and 1808 causes false parallax.
Another source of false parallax results from unequal focusing of the left and right optical paths. Generally, any focus difference between the left and right views can result in a perceived reduction in image quality and potential confusion as to whether the left or right view should dominate. If the focus difference is significant, it can produce an out-of-focus ("OOF") condition. OOF conditions are especially noticeable in stereoscopic images, where the left and right views are displayed in the same image. Moreover, OOF conditions are not easily corrected, because refocusing one out-of-focus optical path typically causes the other optical path to fall out of focus. Generally, a point must be determined at which both optical paths are in focus, which can include changing the positions of the left and right lenses along an optical path and/or adjusting a working distance from the target site 700.
FIG. 22 shows a diagram illustrating how an OOF condition can develop. The diagram relates perceived resolution (e.g., focus) to lens position relative to a best-resolution band 2202. In this example, the left rear zoom lens 734 is at position L1 and the right rear zoom lens 732 is at position R1. At positions L1 and R1, the rear zoom lenses 732 and 734 are within the best-resolution band 2202, such that the left and right optical paths have matched focus levels. There is, however, a difference between the positions L1 and R1, which corresponds to a distance ΔP. At a later time, the working distance 706 is changed such that a point becomes out of focus. In this example, both rear zoom lenses 732 and 734 move the same distance, to positions L2 and R2, so that the distance ΔP does not change. The change in position, however, causes a significant change in resolution ΔR, such that the left rear zoom lens 734 has a higher resolution (e.g., better focus) than the right rear zoom lens 732. The resolution difference ΔR corresponds to an OOF condition, which produces false parallax from the focus misalignment between the right and left optical paths.
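This behavior can be illustrated numerically. In the sketch below, the resolution-versus-position relationship is modeled as a bell-shaped curve — an assumption for illustration only; the patent does not specify the curve's shape. Two lens positions offset by a fixed ΔP match in focus while both sit near the curve's flat top, yet the same ΔP produces a large ΔR once the pair is translated onto the curve's steep flank:

```python
import math

def perceived_resolution(pos, best=0.0, width=4.0):
    """Illustrative focus-quality curve (arbitrary units): highest near
    the best-resolution band, falling off away from it."""
    return math.exp(-((pos - best) / width) ** 2)

l1, r1 = -0.75, 0.75          # inside the band, separated by delta_p = 1.5
l2, r2 = l1 + 5.0, r1 + 5.0   # same travel for both lenses; delta_p unchanged

delta_r_before = abs(perceived_resolution(l1) - perceived_resolution(r1))
delta_r_after = abs(perceived_resolution(l2) - perceived_resolution(r2))

print(delta_r_before)   # ~0: matched focus at L1 and R1
print(delta_r_after)    # large: an OOF condition at L2 and R2
```

The fixed offset ΔP is harmless while both lenses sit where the curve is flat, but the identical translation exposes the offset as a focus mismatch ΔR.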
Yet another source of false parallax can result from imaging an object that is moving at the target site 700. Here, false parallax is caused by small synchronization errors between the exposures of the right optical image sensor 746 and the left optical image sensor 748. If the left and right views are not recorded at the same time, the object appears displaced or misaligned between the two views. The combined stereoscopic image then shows the same object at two different positions for the left and right views.
Further, another source of false parallax involves a ZRP point that moves during magnification. The examples discussed above in Section IV(A) assume that the ZRPs of the left and right views do not move in the x-direction or the y-direction. However, if the zoom lenses 726, 728, 732, and/or 734 do not move exactly parallel to the optical path or optical axis (e.g., in the z-direction), the ZRPs can shift during magnification. As discussed above in connection with FIG. 11, the carrier 724 can shift or rotate slightly when a force is applied to the actuation section 1108. This rotation can cause the left and right ZRPs to move slightly when a magnification level is changed.
In one example, during a magnification change, the carrier 730 moves in a single direction, while the carrier 724 moves in the same direction for one portion of the magnification change and in the opposite direction for the remaining portion of the magnification change to achieve focus adjustment. If the axis of motion of the carrier 724 is slightly tilted or rotated relative to the optical axis, the ZRPs of the left and/or right optical paths will shift in one direction for the first portion and then in the reverse direction for the second portion of the magnification change. In addition, because force is applied unequally, the right front zoom lens 726 and the left front zoom lens 728 can experience differing degrees of ZRP shift between the left and right optical paths. Altogether, these changes in ZRP position misalign the optical paths and thereby produce false parallax.

C. Reducing False Parallax Facilitates Merging Digital Graphics and Images with a Stereoscopic View
As surgical microscopes become more digital, designers are adding features that overlay graphics, images, and/or other digital effects onto live-view images. For example, guidance overlays, stereoscopic magnetic resonance imaging ("MRI") image fusion, and/or external data can be combined with images recorded by a camera, or even displayed within the eyepieces themselves. False parallax reduces the accuracy of such overlays relative to the underlying stereoscopic image. For example, a surgeon generally needs a tumor visualized via MRI to be placed as accurately as possible (often in three dimensions) within a fused live surgical stereoscopic view. Otherwise, the preoperative tumor image provides little information to the surgeon, thereby reducing its usefulness.
For example, a surgical guide could be aligned with the right-view image but misaligned with the left view. The misalignment of the surgical guide between the two views is readily apparent to the operator. In another example, a surgical guide could be separately aligned with the left and right views in the information processor module 1408 before the graphics processing unit 1564 forms the combined stereoscopic image. However, the misalignment between the left and right views creates a misalignment between the guides, thereby reducing the effectiveness of the guides and creating confusion and delay during a microsurgical procedure.
U.S. Patent No. 9,552,660, entitled "IMAGING SYSTEM AND METHODS DISPLAYING A FUSED MULTIDIMENSIONAL RECONSTRUCTED IMAGE" and incorporated herein by reference, discloses how preoperative images and/or graphics can be visually fused with a stereoscopic image. FIGS. 23 and 24 show diagrams illustrating how false parallax causes digital graphics and/or images to lose accuracy when fused with a stereoscopic image. FIG. 24 shows a front view of a patient's eye 2402, and FIG. 23 shows a cross-sectional view of the eye along plane A-A of FIG. 24. In FIG. 23, the information processor module 1408 is instructed to determine a posterior distance d from a focal plane 2302 to an object of interest 2304 on, for example, a posterior capsule of the eye 2402. The information processor module 1408 operates a program 1560 that specifies, for example, that the distance d is determined by a triangulation calculation using image data from the left and right views of the eye 2402. A view 2306 is shown from the perspective of the left optical image sensor 748, and a view 2308 is shown from the perspective of the right optical image sensor 746. The left view 2306 and the right view 2308 are assumed to coincide at an anterior center 2310 of the eye 2402. In addition, the left view 2306 and the right view 2308 are two-dimensional views of the object 2304 projected onto the focal plane 2302 as a theoretical right projection 2312 and a theoretical left projection 2314. In this example, the processor 1562 determines the distance d to the object of interest 2304 by using a triangulation routine to calculate an intersection point of an extrapolation of the theoretical right projection 2312 with an extrapolation of the theoretical left projection 2314.
In this example, however, false parallax exists, which causes an actual left projection 2316 to be located a distance P to the left of the theoretical left projection 2314, as shown in FIGS. 23 and 24. The processor 1562 uses the actual left projection 2316 and the right projection 2312 with the triangulation routine to determine a distance to an intersection point 2320 of an extrapolation of the right projection 2312 with an extrapolation of the actual left projection 2316. The distance to the intersection point 2320 equals the distance d plus an error distance e. The false parallax accordingly produces an erroneous distance calculation from data acquired from a stereoscopic image. As shown in FIGS. 23 and 24, even a small degree of false parallax can produce a significant error. In the context of a fused image, the erroneous distance can result in inaccurate placement of a three-dimensional tumor visualization to be fused with a stereoscopic image. The inaccurate placement can delay a surgical procedure, hinder the surgeon's performance, or cause the entire visualization system to be disregarded. Worse, a surgeon might rely on the inaccurate placement of the tumor image and make a mistake during a microsurgical procedure.

D. The Example Stereoscopic Visualization Camera Reduces or Eliminates False Parallax
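The effect of the shift P on the triangulated distance can be sketched with elementary ray intersection. The geometry below is a two-dimensional simplification (view centers at a height h above the focal plane y = 0, depth measured below the focal plane); the function name and coordinate values are illustrative, not taken from the patent:

```python
def triangulated_depth(c_left, c_right, x_left, x_right, h):
    """Depth below the focal plane (y = 0) of the intersection of the
    rays from two view centers at height h through their respective
    focal-plane projections x_left and x_right (lateral coordinates)."""
    # Ray i: (c_i, h) + t * (x_i - c_i, -h).  Both rays start at the
    # same height, so they share the parameter t at the intersection.
    t = (c_right - c_left) / ((x_left - c_left) - (x_right - c_right))
    return h * (t - 1.0)

# Ideal projections of an object 20 units below the focal plane.
d = triangulated_depth(c_left=-10, c_right=10, x_left=-5, x_right=5, h=20)

# Actual left projection shifted a distance P = 1 to the left.
d_err = triangulated_depth(c_left=-10, c_right=10, x_left=-6, x_right=5, h=20)

print(d)          # 20.0: the true distance d
print(d_err - d)  # the error distance e caused by the 1-unit shift P
```

Note how a 1-unit lateral shift of one projection produces a depth error of several units: the triangulation amplifies small false parallax into a significant error e, consistent with FIGS. 23 and 24.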
The example stereoscopic visualization camera 300 of FIGS. 3 to 16 is configured to reduce or eliminate visual defects, false parallax, and/or the misaligned optical paths that typically give rise to false parallax. In some examples, the stereoscopic visualization camera 300 reduces or eliminates false parallax by aligning the ZRPs of the left and right optical paths to the respective centers of the pixel sets 1006 and 1008 of the right optical image sensor 746 and the left optical image sensor 748. Additionally or alternatively, the stereoscopic visualization camera 300 can align the optical paths of the left and right images. It should be appreciated that the stereoscopic visualization camera 300 can perform actions to reduce false parallax during calibration. In addition, the stereoscopic visualization camera 300 can reduce detected false parallax in real time during use.
FIGS. 25 and 26 illustrate a flow diagram showing an example procedure 2500 for reducing or eliminating false parallax, according to an example embodiment of the present disclosure. Although the procedure 2500 is described with reference to the flow diagram illustrated in FIGS. 25 and 26, it should be appreciated that many other methods of performing the steps associated with the procedure 2500 may be used. For example, the order of many of the blocks may be changed, certain blocks may be combined with other blocks, and many of the blocks described are optional. Further, the actions described in the procedure 2500 may be performed among multiple devices including, for example, the optical elements 1402, the image capture module 1404, the motor and lighting module 1406, and/or the information processor module 1408 of the example stereoscopic visualization camera 300. For example, the procedure 2500 may be performed by one of the programs 1560 of the information processor module 1408.
The example procedure 2500 begins when the stereoscopic visualization camera 300 receives an instruction to align the right and left optical paths (block 2502). The instruction may be received from the user input device 1410 in response to an operator requesting that the stereoscopic visualization camera 300 perform a calibration routine. In other examples, the instruction may be received from the information processor module 1408 after determining that the right and left images are misaligned. The information processor module 1408 may determine that the images are misaligned by executing a program 1560 that overlays the right and left images and determines differences between pixel values, where larger differences over large pixel areas indicate misaligned images. In some examples, the program 1560 may compare the pixel data of the left and right images without performing an overlay function, where, for example, the left pixel data is subtracted from the right pixel data to determine a severity of misalignment.
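A pixel-wise comparison of this sort can be sketched as follows. This is a minimal illustration using NumPy; the function name, frame size, and the 3-pixel shift are hypothetical, not specified by the patent:

```python
import numpy as np

def misalignment_score(left, right):
    """Mean absolute per-pixel difference between overlaid left and
    right frames; larger values over wide areas suggest misalignment."""
    diff = left.astype(np.int32) - right.astype(np.int32)
    return float(np.mean(np.abs(diff)))

rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
shifted = np.roll(frame, 3, axis=1)   # simulate a 3-pixel horizontal shift

print(misalignment_score(frame, frame))         # 0.0 for aligned views
print(misalignment_score(frame, shifted) > 10)  # True for shifted content
```

Thresholding such a score is one plausible way a program 1560 could decide when to trigger the alignment routine.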
After receiving the instruction to reduce false parallax, the example stereoscopic visualization camera 300 locates a ZRP of one of the left or right optical paths. For illustrative purposes, the procedure 2500 includes first determining the ZRP of the left optical path. In other embodiments, however, the procedure 2500 may first determine the ZRP of the right optical path. To determine the left ZRP, the stereoscopic visualization camera 300 moves at least one zoom lens (e.g., the left front zoom lens 728 and/or the left rear zoom lens 734) along the z-direction of the left optical path to a first magnification level (block 2504). In instances where the front zoom lenses 726 and 728 are connected to the same carrier 724 and the rear zoom lenses 732 and 734 are connected to the same carrier 730, movement of the left lenses causes the right lenses to move as well. However, only the movement of the left lenses is considered during this section of the procedure 2500.
At the first magnification level, the stereoscopic visualization camera 300 causes the left zoom lens to move along the z-direction (block 2506). For example, the movement may include moving back and forth around the first magnification level. For instance, if the first magnification level is 5X, the movement may be between 4X and 6X. The movement may also be in a single direction, such as from 5X to 4X. During this movement, the stereoscopic visualization camera 300 may adjust one or more other lenses to keep the target site 700 in focus. At block 2508, during the movement of the left zoom lens, the stereoscopic visualization camera 300 records a stream or sequence of images and/or frames 2509 of the target site 700 using, for example, the left optical image sensor 748. The images 2509 are recorded using an oversized pixel set 1008 configured to encompass an origin of the pixel grid 1004 and the possible locations of the left ZRP.
The example processor 1562 of the information processor module 1408 analyzes the image stream to locate a portion of a region that does not move in the x-direction or the y-direction between the images (block 2510). The portion of the region may comprise one or a few pixels and corresponds to the left ZRP. As discussed above, during a magnification change, objects move away from or toward the ZRP. Only an object at the ZRP retains its position relative to the field of view as the magnification changes. The processor 1562 may use the pixel data to calculate deltas between the images in the stream for each pixel. The region having the smallest delta across the image stream corresponds to the left ZRP.
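One way to implement this search is to stack the recorded frames and take, per pixel, the variation across the stream; the pixel with the smallest variation estimates the ZRP. The sketch below verifies the idea on a synthetic zoom sequence (the scene function, grid size, and ZRP location are invented for illustration and are not from the patent):

```python
import numpy as np

def locate_zrp(frames):
    """Return (row, col) of the pixel that changes least across a
    stream of frames recorded while the magnification changes."""
    variation = np.stack(frames).std(axis=0)
    return np.unravel_index(np.argmin(variation), variation.shape)

# Synthetic stream: a smooth scene zoomed about a known ZRP at (20, 44).
zr, zc = 20, 44
rows, cols = np.mgrid[0:64, 0:64]

def render(m):
    # Zooming in by m means pixel (r, c) samples the scene at
    # zrp + (pixel - zrp) / m, so only the ZRP pixel never changes.
    sr = zr + (rows - zr) / m
    sc = zc + (cols - zc) / m
    return np.sin(sr / 5.0) + np.cos(sc / 7.0)

frames = [render(m) for m in (1.0, 1.1, 1.2, 1.3)]
print(locate_zrp(frames))   # (20, 44)
```

In practice the delta computation would run on the oversized pixel set 1008 rather than a full synthetic grid, but the principle is the same: the minimum-variation pixel is the stationary point of the zoom.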
The example processor 1562 of the information processor module 1408 next determines the coordinates, relative to the pixel grid 1004, of the portion of the region that does not move between the images in the stream (e.g., determines a location of the left ZRP) (block 2512). In other examples, the processor 1562 of the information processor module 1408 determines a distance between the origin and the portion of the region (corresponding to the left ZRP). The distance is used to determine a location of the left ZRP on the pixel grid 1004. Once the location of the left ZRP is determined, the processor 1562 of the information processor module 1408 determines a pixel set (e.g., pixel set 1008) of the left optical image sensor 748 such that the left ZRP is located at a center (to within one pixel) of the pixel set (block 2514). At this point, the left ZRP is centered within the left optical path.
In some examples, blocks 2504 to 2514 may be performed iteratively, re-selecting the pixel set until the left ZRP is within one pixel of the origin and false parallax is minimized. After determining the pixel set, the processor 1562 of the information processor module 1408 stores at least one of the coordinates of the pixel set and/or the coordinates of the left ZRP to the memory 1570 as a calibration point (block 2516). The processor 1562 of the information processor module 1408 may associate the first magnification level with the calibration point such that the same pixel set is selected when the stereoscopic visualization camera 300 returns to the first magnification level.
FIG. 27 shows a diagram illustrating how the left ZRP is adjusted relative to the pixel grid of the left optical image sensor 748. Initially, an initial (e.g., oversized) pixel set 2702 centered on an origin 2704 is selected. The pixel set 2702 is large enough to record the possible ZRP locations in the image stream. In this illustrated example, a left ZRP 2706 is located above and to the right of the origin 2704. The processor 1562 of the information processor module 1408 determines a pixel set 2708 based on the location of the left ZRP 2706 such that the left ZRP 2706 is located or positioned at a center of the pixel set 2708.
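Choosing a pixel set such as 2708 amounts to computing a readout-window origin that puts the measured ZRP at the window's center pixel. A minimal sketch (the function name, grid dimensions, and coordinates are illustrative; a real implementation would also have to respect sensor readout constraints):

```python
def centered_pixel_set(zrp, size, grid):
    """Top-left corner of a size[0] x size[1] readout window centered
    on the ZRP, clamped so the window stays on the sensor's pixel grid."""
    top = min(max(zrp[0] - size[0] // 2, 0), grid[0] - size[0])
    left = min(max(zrp[1] - size[1] // 2, 0), grid[1] - size[1])
    return top, left

# ZRP measured at row 610, column 980 on a 1200 x 1920 grid; select a
# 1080 x 1280 window so the ZRP lands on the window's center pixel.
top, left = centered_pixel_set((610, 980), (1080, 1280), (1200, 1920))
print(top, left)                          # 70 340
print(top + 1080 // 2, left + 1280 // 2)  # 610 980: back to the ZRP
```

The clamping terms handle a ZRP measured near the sensor edge, where exact centering is impossible and the window can only be pushed as close as the grid allows.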
在於圖25中判定左ZRP且使左ZRP與一像素組之一原點對準之後,實例性程序2500在圖26中對準左影像與右影像。為對準該等影像,實例性處理器1562比較來自在使左ZRP與原點對準之後所記錄之左影像及右影像之像素資料。在某些實施例中,處理器1562使左影像與右影像覆疊以使用(舉例而言)一減法及/或模板方法來判定差。處理器1562選擇或判定右光學路徑之一像素組,使得所得右影像與左影像對準或重合(方塊2519)。After determining the left ZRP in FIG. 25 and aligning the left ZRP with an origin of a pixel group, the example procedure 2500 aligns the left image and the right image in FIG. 26. To align the images, the example processor 1562 compares the pixel data from the left and right images recorded after aligning the left ZRP with the origin. In some embodiments, the processor 1562 overlays the left image and the right image to determine the difference using, for example, a subtraction and/or template method. The processor 1562 selects or determines a pixel group of the right optical path so that the resulting right image is aligned with or coincides with the left image (block 2519).
在所圖解說明實施例中,實例性處理器1562判定右ZRP。該等步驟類似於在方塊2504至2512中針對左ZRP所論述之步驟。舉例而言,在方塊2518處,立體視覺化攝影機300使一右變焦透鏡移動至第一放大位準。在某些實施例中,右透鏡之放大位準不同於用於判定左ZRP之放大位準。資訊處理器模組1408之實例性處理器1562然後使右變焦透鏡在該放大位準周圍移動且在移動期間自右光學影像感測器746接收一影像串流2521 (方塊2520及2522)。資訊處理器模組1408之實例性處理器1562藉由定位不在影像之間移動的一區之一部分而自右影像串流判定右ZRP (方塊2524)。處理器1562接下來判定右ZRP之座標及/或一經對準像素組1006之一中心與右ZRP之間的一距離(方塊2526)。In the illustrated embodiment, the example processor 1562 determines the right ZRP. The steps are similar to the steps discussed for the left ZRP in blocks 2504-2512. For example, at block 2518, the stereo visualization camera 300 moves a right zoom lens to the first magnification level. In some embodiments, the magnification level of the right lens is different from the magnification level used to determine the left ZRP. The example processor 1562 of the information processor module 1408 then moves the right zoom lens around the magnification level and receives an image stream 2521 from the right optical image sensor 746 during the movement (blocks 2520 and 2522). The example processor 1562 of the information processor module 1408 determines the right ZRP from the right image stream by locating a part of an area that does not move between images (block 2524). The processor 1562 next determines the coordinates of the right ZRP and/or a distance between a center of the aligned pixel group 1006 and the right ZRP (block 2526).
處理器1562然後指示馬達與光照模組1406使右光學路徑中之至少一個透鏡在一x方向、一y方向及/或一傾斜方向中之至少一者上移動以使用(舉例而言)右ZRP之距離或座標使右ZRP與經對準像素組1006之中心對準(方塊2528)。換言之,使右ZRP移動以與經對準像素組1006之中心重合。在某些實例中,使右前透鏡720、右透鏡鏡筒736、右最後光學元件745及/或右影像感測器746在相對於右光學路徑之z方向之x方向、y方向及/或一傾斜方向上移動(舉例而言,使用一撓曲部)。移動程度與右ZRP距像素組1006之中心之距離成比例。在某些實施例中,處理器1562以數位方式改變右前透鏡720、右透鏡鏡筒736及/或右最後光學元件745之性質以具有與使透鏡移動相同之效應。處理器1562可重複步驟2520至2528及/或使用後續右影像來確認右ZRP與像素組1006之中心對準及/或反覆地判定使右ZRP與像素組之中心對準所需要之額外透鏡移動。The processor 1562 then instructs the motor and illumination module 1406 to move at least one lens in the right optical path in at least one of an x direction, a y direction, and/or a tilt direction, using, for example, the distance or the coordinates of the right ZRP, so as to align the right ZRP with the center of the aligned pixel group 1006 (block 2528). In other words, the right ZRP is moved to coincide with the center of the aligned pixel group 1006. In some examples, the right front lens 720, the right lens barrel 736, the right final optical element 745, and/or the right image sensor 746 are moved in the x direction, the y direction, and/or a tilt direction relative to the z direction of the right optical path (for example, using a flexure). The amount of movement is proportional to the distance of the right ZRP from the center of the pixel group 1006. In some embodiments, the processor 1562 digitally changes properties of the right front lens 720, the right lens barrel 736, and/or the right final optical element 745 to have the same effect as moving a lens. The processor 1562 may repeat steps 2520 to 2528 and/or use subsequent right images to confirm that the right ZRP is aligned with the center of the pixel group 1006, and/or iteratively determine the additional lens movement needed to align the right ZRP with the center of the pixel group.
實例性處理器1562將右像素組及/或右ZRP之座標作為一校準點儲存至記憶體1570 (方塊2530)。處理器1562亦可將經移動以對準右ZRP之右透鏡之一位置儲存至校準點。在某些實例中,右光學路徑之校準點連同第一放大位準而與左光學路徑之校準點一起經儲存。因此,當立體視覺化攝影機300隨後設定至第一放大位準時處理器1562將校準點內之資料應用於光學影像感測器746及748及/或一或多個光學元件1402之徑向定位。The example processor 1562 stores the coordinates of the right pixel group and/or the right ZRP as a calibration point in the memory 1570 (block 2530). The processor 1562 can also store a position of the right lens moved to align with the right ZRP to the calibration point. In some instances, the calibration points of the right optical path are stored together with the first magnification level and the calibration points of the left optical path. Therefore, when the stereo visualization camera 300 is subsequently set to the first magnification level, the processor 1562 applies the data in the calibration point to the optical image sensors 746 and 748 and/or the radial positioning of the one or more optical elements 1402.
在某些實例中,可針對不同放大位準及/或工作距離重複程序2500。因此,處理器1562判定是否針對另一放大位準或工作距離需要ZRP校準(方塊2532)。若將選擇另一放大位準,則程序2500返回至圖25中之方塊2504。然而,若不需要另一放大位準,則實例性程序結束。In some instances, the procedure 2500 may be repeated for different magnification levels and/or working distances. Therefore, the processor 1562 determines whether ZRP calibration is required for another magnification level or working distance (block 2532). If another magnification level is to be selected, the process 2500 returns to block 2504 in FIG. 25. However, if another magnification level is not required, the example procedure ends.
校準點中之每一者可儲存於一查找表中。該表中之每一列可對應於一不同放大位準及/或工作距離。該查找表中之行可提供左ZRP、右ZRP、左像素組及/或右像素組之座標。另外,一或多個行可規定光學元件1402之透鏡之相關位置(例如,徑向、旋轉、傾斜及/或軸向位置)以除經對準右影像與左影像之外亦達成在該放大位準下之聚焦。Each of the calibration points may be stored in a lookup table. Each row in the table may correspond to a different magnification level and/or working distance. The columns in the lookup table may provide the coordinates of the left ZRP, the right ZRP, the left pixel group, and/or the right pixel group. In addition, one or more columns may specify the corresponding positions (e.g., radial, rotational, tilt, and/or axial positions) of the lenses of the optical elements 1402 so as to achieve focus at that magnification level in addition to aligned right and left images.
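Such a lookup table can be modeled as a simple keyed mapping. The field names and numeric values below are illustrative assumptions; the patent text does not specify the stored format.

```python
# Hypothetical calibration lookup table: one row per
# (magnification, working distance) pair, as described above.
calibration = {
    # (magnification, working_distance_mm): stored calibration point
    (1.0, 250): {"left_zrp": (2012, 1988), "right_zrp": (1979, 2005),
                 "left_pixel_set": (1052, 1448), "right_pixel_set": (1019, 1465),
                 "lens_positions": {"right_front": (0.02, -0.01)}},
    (2.0, 250): {"left_zrp": (2010, 1990), "right_zrp": (1981, 2003),
                 "left_pixel_set": (1050, 1450), "right_pixel_set": (1021, 1463),
                 "lens_positions": {"right_front": (0.03, -0.02)}},
}

def lookup_calibration(mag, wd):
    """Return the calibration point stored for the closest
    (magnification, working distance) key."""
    key = min(calibration, key=lambda k: (abs(k[0] - mag), abs(k[1] - wd)))
    return calibration[key]

point = lookup_calibration(2.0, 250)
print(point["left_pixel_set"])  # (1050, 1450)
```

When the camera returns to a stored magnification level, the row's pixel-group coordinates and lens positions are re-applied, matching the behavior described for blocks 2516 and 2530.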
程序2500因此致使除目標部位之視圖以外之右ZRP及左ZRP在一個三維立體影像中對準至各別光學影像感測器746及748之像素網格而且對準至彼此。在某些例項中,左影像及右影像以及對應ZRP具有在一個像素內之一準確性及對準。藉由使左視圖及右視圖(例如,來自左光學路徑及右光學路徑之影像)覆疊且用兩隻眼睛而非立體地觀察兩個視圖,此準確性可係可在顯示器512或514上觀察到的。The procedure 2500 thus causes the right ZRP and the left ZRP, in addition to the views of the target site, to be aligned in a three-dimensional stereoscopic image with the pixel grids of the respective optical image sensors 746 and 748, and with each other. In some instances, the left and right images and the corresponding ZRPs have an accuracy and alignment within one pixel. By overlaying the left and right views (e.g., the images from the left and right optical paths) and viewing the two views with both eyes, rather than stereoscopically, this accuracy can be observed on the display 512 or 514.
應瞭解,在某些實例中,首先選擇一右像素組使得右ZRP與像素組之一原點對準或重合。然後,右光學影像與左光學影像可藉由使光學元件1402之一或多個右透鏡及/或左透鏡移動而對準。此替代程序仍提供在彼此之間且相對於光學影像感測器746及748定中心且對準之右ZRP及左ZRP。It should be understood that, in some examples, a right pixel group is first selected so that the right ZRP is aligned with or coincides with an origin of the pixel group. Then, the right optical image and the left optical image can be aligned by moving one or more of the right lens and/or the left lens of the optical element 1402. This alternative procedure still provides right ZRP and left ZRP centered and aligned with respect to the optical image sensors 746 and 748 between each other.
程序2500最終藉由確保左ZRP與右ZRP保持對準且右影像與左影像保持對準而在一完全光學放大率範圍內減少或消除立體視覺化攝影機300中之假性視差。換言之,右光學影像感測器746及左光學影像感測器748之雙重光學器件經對準使得左光學路徑與右光學路徑之間的在一影像之一中心處之視差在焦平面處係大致零。另外,實例性立體視覺化攝影機300跨越放大率範圍係等焦的,且跨越放大率及工作距離範圍係等中心的,此乃因每一光學路徑之ZRP已對準至各別像素組之一中心。因此,僅改變放大率將維持目標部位700之一聚焦處於光學影像感測器746及748兩者中,同時在同一中心點上經訓練。The procedure 2500 ultimately reduces or eliminates false parallax in the stereo visualization camera 300 across the full optical magnification range by ensuring that the left and right ZRPs remain aligned and that the right and left images remain aligned. In other words, the dual optics of the right optical image sensor 746 and the left optical image sensor 748 are aligned such that the parallax between the left and right optical paths at a center of an image is approximately zero at the focal plane. In addition, the example stereo visualization camera 300 is parfocal across the magnification range and parcentric across the magnification and working distance ranges, because the ZRP of each optical path has been aligned to a center of its respective pixel group. Therefore, changing only the magnification will keep the target site 700 in focus in both optical image sensors 746 and 748, while remaining trained on the same center point.
可在執行一外科手術程序之前及/或基於一操作者請求而在校準時執行以上程序2500。亦可在與一術前顯微外科手術影像及/或外科手術導引圖形之影像配準之前執行實例性程序2500。此外,可在立體視覺化攝影機300之操作期間自動即時執行實例性程序2500。The above procedure 2500 may be performed before a surgical procedure is carried out and/or at calibration time based on an operator request. The example procedure 2500 may also be performed before image registration with a preoperative microsurgical image and/or a surgical guidance graphic. In addition, the example procedure 2500 may be executed automatically in real time during operation of the stereo visualization camera 300.
1.模板匹配實例 1. Template matching example
在某些實施例中,資訊處理器模組1408之實例性處理器1562經組態以連同一或多個模板使用一程式1560來判定右ZRP及/或左ZRP之一位置。圖28展示圖解說明處理器1562如何使用一目標模板2802來判定一左ZRP之一位置之一圖式。在此實例中,圖28展示一第一左影像,該第一左影像包含與左光學影像感測器748之左像素網格1004之一原點2804或中心對準之模板2802。可藉由使立體視覺化攝影機300移動至適當位置而對準模板2802。另一選擇係,可使模板2802在目標部位700處移動直至對準為止。在其他實例中,模板2802可包含不需要與像素網格1004之一中心對準之另一樣式。舉例而言,模板可包含一圖形波樣式、一圖形呼吸描記器樣式、一患者之一外科手術部位之一視圖及/或具有視覺上可區分特徵(在x方向及y方向兩者上具有某種程度之非週期性)之一網格。模板經組態以阻止一週期性影像之一子組在複數個位置中完全對準至較大影像上,此使此等模板不適合用於匹配。適合用於模板匹配之一模板影像稱為一「模板可匹配」模板影像。In some embodiments, the example processor 1562 of the information processor module 1408 is configured to use a program 1560 in conjunction with one or more templates to determine a position of the right ZRP and/or the left ZRP. FIG. 28 shows a diagram illustrating how the processor 1562 uses a target template 2802 to determine a position of a left ZRP. In this example, FIG. 28 shows a first left image that includes the template 2802 aligned with an origin 2804 or center of the left pixel grid 1004 of the left optical image sensor 748. The template 2802 may be aligned by moving the stereo visualization camera 300 to an appropriate position. Alternatively, the template 2802 may be moved at the target site 700 until it is aligned. In other examples, the template 2802 may comprise another pattern that does not need to be aligned with a center of the pixel grid 1004. For example, the template may comprise a graphical wave pattern, a graphical spirograph pattern, a view of a surgical site of a patient, and/or a grid with visually distinguishable features (having some degree of non-periodicity in both the x and y directions). A template must avoid the situation in which a subset of a periodic image aligns fully onto the larger image at a plurality of positions, which would make such a template unsuitable for matching. A template image suitable for template matching is referred to as a "template-matchable" template image.
在一第一放大位準下使圖28中所展示之模板2802成像。一左ZRP 2806相對於模板2802而展示。ZRP 2806相對於原點2804具有Lx, Ly之座標。然而,在此時間點,處理器1562尚未識別左ZRP 2806。The template 2802 shown in FIG. 28 is imaged at a first magnification level. A left ZRP 2806 is shown relative to the template 2802. The ZRP 2806 has coordinates (Lx, Ly) relative to the origin 2804. However, at this point in time, the processor 1562 has not yet identified the left ZRP 2806.
為定位ZRP 2806,處理器1562致使一左變焦透鏡(例如,左前變焦透鏡728及/或左後變焦透鏡734)將放大率自第一放大位準改變至一第二放大位準,具體而言在此實例中,自1X改變至2X。圖29展示一第二左影像之一圖式,該第二左影像包含在放大位準加倍之情況下像素網格1004上之目標2802。自第一放大位準至第二放大位準,目標2802之部分之大小增加且均勻地膨脹遠離左ZRP 2806,左ZRP 2806相對於第一影像及第二影像保持固定。另外,像素網格1004之原點2804與左ZRP 2806之間的一距離保持相同。To locate the ZRP 2806, the processor 1562 causes a left zoom lens (e.g., the left front zoom lens 728 and/or the left rear zoom lens 734) to change the magnification from the first magnification level to a second magnification level, specifically, in this example, from 1X to 2X. FIG. 29 shows a diagram of a second left image that includes the target 2802 on the pixel grid 1004 with the magnification level doubled. From the first magnification level to the second magnification level, the portions of the target 2802 increase in size and expand uniformly away from the left ZRP 2806, which remains fixed between the first and second images. In addition, a distance between the origin 2804 of the pixel grid 1004 and the left ZRP 2806 remains the same.
實例性處理器1562依據圖29中所展示之第二影像合成一數位模板影像3000。為形成該數位模板影像,處理器1562複製圖29中所展示之第二影像且使該所複製影像比例縮放自第一放大率至第二放大率之放大率改變之倒數。舉例而言,若自第一影像至第二影像之放大率改變係2之一因數,則第二影像比例縮放½。圖30展示包含模板2802之數位模板影像3000之一圖式。圖30之數位模板影像3000中之模板2802比例縮放為與圖28中所展示之第一左影像之模板2802相同之大小。The example processor 1562 synthesizes a digital template image 3000 based on the second image shown in FIG. 29. To form the digital template image, the processor 1562 copies the second image shown in FIG. 29 and scales the copied image by the reciprocal of the magnification change from the first magnification to the second magnification. For example, if the magnification change from the first image to the second image is a factor of 2, the second image is scaled by ½. FIG. 30 shows a diagram of a digital template image 3000 including a template 2802. The template 2802 in the digital template image 3000 of FIG. 30 is scaled to the same size as the template 2802 of the first left image shown in FIG. 28.
實例性處理器1562使用數位模板影像3000來定位左ZRP 2806。圖31展示一圖式,該圖式展示疊加於記錄於像素網格1004中之第一左影像(或在第一放大位準下記錄之一後續左影像)之頂部上之數位模板影像3000。數位模板影像3000與第一左影像之組合產生一合成視圖,如圖31中所圖解說明。最初,數位模板影像3000定中心於像素網格1004之原點2804處。The example processor 1562 uses the digital template image 3000 to locate the left ZRP 2806. FIG. 31 shows a diagram showing the digital template image 3000 superimposed on top of the first left image recorded in the pixel grid 1004 (or a subsequent left image recorded at the first magnification level). The combination of the digital template image 3000 and the first left image produces a composite view, as illustrated in FIG. 31. Initially, the digital template image 3000 is centered at the origin 2804 of the pixel grid 1004.
實例性處理器1562比較數位模板影像3000與基本模板2802以判定其是否對準或匹配。實例性處理器1562然後使數位模板影像3000水平地或垂直地移動一或多個像素且執行另一比較。處理器1562使數位模板影像3000反覆地移動,從而針對每一位置編譯關於數位模板影像3000與基本模板2802匹配之緊密度如何之一度量矩陣。處理器1562選擇與最佳匹配度量對應的矩陣中之位置。在某些實例中,處理器1562使用OpenCV™模板匹配功能。The example processor 1562 compares the digital template image 3000 with the underlying template 2802 to determine whether they are aligned or matched. The example processor 1562 then moves the digital template image 3000 horizontally or vertically by one or more pixels and performs another comparison. The processor 1562 moves the digital template image 3000 iteratively, compiling, for each position, a matrix of metrics describing how closely the digital template image 3000 matches the underlying template 2802. The processor 1562 selects the position in the matrix corresponding to the best match metric. In some examples, the processor 1562 uses the OpenCV™ template matching function.
圖32展示其中數位模板影像3000與模板2802對準之一圖式。使數位模板影像3000移動以達成最佳匹配之距離經展示為Δx及Δy。知曉數位模板影像3000係以M1/M2 (第一放大位準除以第二放大位準)之一比例合成的,處理器1562使用下文之方程式(1)及(2)判定左ZRP 2806之座標(Lx, Ly)。FIG. 32 shows a diagram in which the digital template image 3000 is aligned with the template 2802. The distances by which the digital template image 3000 is moved to achieve the best match are shown as Δx and Δy. Knowing that the digital template image 3000 was synthesized at a scale of M1/M2 (the first magnification level divided by the second magnification level), the processor 1562 uses equations (1) and (2) below to determine the coordinates (Lx, Ly) of the left ZRP 2806.
Lx = Δx / (M1/M2) - 方程式(1) / Equation (1)
Ly = Δy / (M1/M2) - 方程式(2) / Equation (2)
在判定左ZRP 2806之座標(Lx, Ly)之後,實例性處理器1562選擇或判定具有與左ZRP 2806對準或重合之一原點之一像素子組,如上文連同圖25及圖26之程序2500所論述。在某些實施例中,處理器1562可反覆地使用模板匹配以收斂於一高度準確ZRP位置及/或像素子組上。此外,雖然以上實例論述定位左ZRP,但相同模板匹配程序可用於定位右ZRP。After determining the coordinates (Lx, Ly) of the left ZRP 2806, the example processor 1562 selects or determines a pixel subset having an origin aligned or coincident with the left ZRP 2806, as discussed above in conjunction with the procedure 2500 of FIGS. 25 and 26. In some embodiments, the processor 1562 may use template matching iteratively to converge on a highly accurate ZRP position and/or pixel subset. Furthermore, although the above example discusses locating the left ZRP, the same template matching procedure may be used to locate the right ZRP.
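The matching search and equations (1) and (2) can be illustrated with a small self-contained sketch. It uses a brute-force sum-of-squared-differences match in place of the OpenCV™ matchTemplate call mentioned above, and the array sizes and contents are synthetic assumptions for the example.

```python
import numpy as np

def match_offset(image, template):
    """Brute-force template match: slide `template` over `image` and
    return the (dx, dy) shift, measured from the centered placement,
    that minimizes the sum of squared differences (a simplified stand-in
    for OpenCV's matchTemplate)."""
    ih, iw = image.shape
    th, tw = template.shape
    best, best_pos = None, None
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            ssd = np.sum((image[y:y+th, x:x+tw] - template) ** 2)
            if best is None or ssd < best:
                best, best_pos = ssd, (x, y)
    cx, cy = (iw - tw) // 2, (ih - th) // 2  # centered placement
    return best_pos[0] - cx, best_pos[1] - cy

def zrp_from_offset(dx, dy, m1, m2):
    """Equations (1) and (2): Lx = dx / (M1/M2), Ly = dy / (M1/M2)."""
    scale = m1 / m2
    return dx / scale, dy / scale

# Synthetic check: embed a small patch off-center and recover the shift.
rng = np.random.default_rng(0)
image = rng.random((20, 20))
template = image[7:13, 9:15].copy()   # true top-left corner (x=9, y=7)
dx, dy = match_offset(image, template)
print((dx, dy))                                  # (2, 0)
print(zrp_from_offset(dx, dy, m1=1.0, m2=2.0))   # (4.0, 0.0)
```

Dividing the best-match shift by M1/M2 inverts the scaling applied when the digital template image was synthesized, yielding the ZRP coordinates relative to the grid origin.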
在某些實施例中,上文所闡述之模板匹配程式1560可用於對準左影像與右影像。在此等實施例中,在一放大位準下記錄左影像及右影像。舉例而言,該兩個影像可包含圖28之目標模板2802。右影像之一部分經選擇且與左影像覆疊。右影像之該部分然後在左影像周圍水平地及/或垂直地移位一或多個像素。實例性處理器1562在右影像之部分之每一位置處執行一比較以判定關於左影像存在之一匹配之緊密度如何。一旦判定一最佳位置,便判定右像素網格1002之一像素組1006使得右影像與左影像大體重合。可基於使右影像之部分移動以與左影像重合之量而判定像素組1006之位置。具體而言,處理器1562使用在x方向、y方向及/或傾斜方向上之一移動量來判定右像素組1006之對應座標。In some embodiments, the template matching program 1560 described above may be used to align the left and right images. In these embodiments, the left and right images are recorded at a magnification level. For example, the two images may include the target template 2802 of FIG. 28. A portion of the right image is selected and overlaid on the left image. That portion of the right image is then shifted horizontally and/or vertically around the left image by one or more pixels. The example processor 1562 performs a comparison at each position of the portion of the right image to determine how close a match exists with the left image. Once a best position is determined, a pixel group 1006 of the right pixel grid 1002 is determined such that the right image substantially coincides with the left image. The position of the pixel group 1006 may be determined based on the amount by which the portion of the right image was moved to coincide with the left image. Specifically, the processor 1562 uses the amount of movement in the x direction, the y direction, and/or a tilt direction to determine the corresponding coordinates of the right pixel group 1006.
2.右影像與左影像對準實例 2. Right and left image alignment example
在某些實施例中,圖14至圖16之資訊處理器模組1408之實例性處理器1562在顯示監視器512及/或514上顯示右影像與左影像之一覆疊。處理器1562經組態以接收使用者回饋以用於對準右影像與左影像。在此實例中,右影像及左影像之每一像素資料使用(舉例而言)圖形處理單元1564精確地映射至顯示監視器512之一各別像素。經覆疊左影像及右影像之顯示使任何假性視差對於一操作者顯而易見。一般而言,在不具有假性視差之情況下,左影像與右影像應幾乎完全對準。In some embodiments, the example processor 1562 of the information processor module 1408 of FIGS. 14-16 displays an overlay of the right image and the left image on the display monitors 512 and/or 514. The processor 1562 is configured to receive user feedback for aligning the right and left images. In this example, each pixel data of the right image and the left image is accurately mapped to a respective pixel of the display monitor 512 using, for example, the graphics processing unit 1564. The display of the overlaid left and right images makes any false parallax obvious to an operator. Generally speaking, without false parallax, the left and right images should be almost completely aligned.
若一操作者偵測到假性視差,則該操作者可致動控件305或使用者輸入裝置1410以使右影像或左影像移動以用於與右影像及左影像中之另一者對準。來自控件305之指令可致使處理器1562即時相應地調整左像素組或右像素組之位置,使得在顯示監視器512上顯示反映操作者輸入之後續影像。在其他實例中,該等指令可致使處理器1562經由徑向調整、旋轉調整、軸向調整或傾斜改變光學元件1402中之一或多者之一位置。操作者繼續經由控件305及/或使用者輸入裝置1410提供輸入直至左影像與右影像對準為止。在接收到一確認指令之後,處理器1562旋即將在經設定放大位準下反映影像對準之一校準點儲存至一查找表。If an operator detects false parallax, the operator can actuate the control 305 or the user input device 1410 to move the right image or the left image for alignment with the other of the right image and the left image . The instruction from the control 305 can cause the processor 1562 to adjust the position of the left pixel group or the right pixel group accordingly in real time, so that subsequent images reflecting the operator's input are displayed on the display monitor 512. In other examples, the instructions may cause the processor 1562 to change the position of one or more of the optical elements 1402 via radial adjustment, rotation adjustment, axial adjustment, or tilt. The operator continues to provide input via the control 305 and/or the user input device 1410 until the left image and the right image are aligned. After receiving a confirmation command, the processor 1562 is about to store a calibration point reflecting the image alignment under the set magnification level in a look-up table.
另外或另一選擇係,上文所闡述之模板匹配方法可用於執行影像對準同時聚焦於大致正交於立體視覺化攝影機300之一立體光軸之一平面目標上。此外,每當一「模板可匹配」場景在左光學路徑及右光學路徑兩者之視圖中時,模板匹配方法可用於即時對準左視圖與右視圖。在一實例中,一模板影像自(例如)左視圖之一子組複製,定中心於該視圖之中心上或附近。自一對焦影像之中心進行取樣確保目標部位700之一類似視圖將存在於另一視圖(在此實例中為右視圖)中。對於離焦影像,情形並非如此,使得在當前實施例中僅在一成功自動聚焦操作之後執行此對準方法。選定模板然後匹配於另一視圖(在此實例中為右視圖)之當前視圖(或其一複本)中且僅自結果獲取一y值。當該等視圖垂直對準時,模板匹配之y值在零像素處或附近。一非零y值指示兩個視圖之間的垂直不對準且應用使用y之相同值之一校正以選擇第一視圖之像素讀出組或將使用y之否定值之一校正應用於另一視圖之像素讀出組。另一選擇係,可在視覺化管線之其他部分中應用或在像素讀出組與該管線之間分裂校正。Additionally or alternatively, the template matching method described above may be used to perform image alignment while focused on a planar target that is substantially orthogonal to a stereo optical axis of the stereo visualization camera 300. Moreover, whenever a "template-matchable" scene is in the views of both the left and right optical paths, the template matching method may be used to align the left and right views in real time. In one example, a template image is copied from, for example, a subset of the left view, centered on or near the center of that view. Sampling from the center of an in-focus image ensures that a similar view of the target site 700 will exist in the other view (the right view in this example). For out-of-focus images this is not the case, so in the current embodiment this alignment method is performed only after a successful autofocus operation. The selected template is then matched against the current view (or a copy thereof) of the other view (the right view in this example), and only a y value is taken from the result. When the views are vertically aligned, the template-match y value is at or near zero pixels. A non-zero y value indicates vertical misalignment between the two views, and a correction using the same value of y is applied to the selection of the pixel readout group of the first view, or a correction using the negated value of y is applied to the pixel readout group of the other view. Alternatively, the correction may be applied in other parts of the visualization pipeline, or split between the pixel readout group and that pipeline.
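Applying the template-match y value to a pixel readout group can be sketched as follows. The sign convention for the offset is an assumption for the example; the text leaves it to the implementation.

```python
def apply_y_correction(y_offset, left_row, right_row, target="left"):
    """Shift one view's pixel readout row offset by the template-match
    y value: either the same value to the first view, or the negated
    value to the other view, as described above.  `y_offset` > 0 is
    assumed to mean the left view must move down to match the right."""
    if target == "left":
        return left_row + y_offset, right_row
    return left_row, right_row - y_offset

a = apply_y_correction(3, 1448, 1448, target="left")
b = apply_y_correction(3, 1448, 1448, target="right")
print(a)  # (1451, 1448)
print(b)  # (1448, 1445)
# Both conventions produce the same relative vertical alignment:
print(a[0] - a[1] == b[0] - b[1])  # True
```

Either correction (or a split between them) leaves the two readout windows with the same relative row offset, which is what removes the vertical misalignment.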
在某些實例中,操作者亦可手動地對準一右ZRP與像素網格1002之一原點。例如,在判定該右ZRP之一位置之後,處理器1562 (及/或周邊輸入單元介面1574或圖形處理單元1564)致使該右ZRP以圖形方式突出顯示於由顯示監視器512顯示之一右影像上。處理器1562亦可顯示指示像素網格1002之原點之一圖形。操作者使用控件305及/或使用者輸入裝置1410將右ZRP操縱至原點。處理器1562使用來自控件305及/或使用者輸入裝置1410之指令以使光學元件1402中之一或多者相應地移動。除以圖形方式顯示右ZRP及原點之當前位置之外,處理器1562亦可即時提供一右影像串流以提供關於定位之操作者經更新回饋。操作者繼續經由控件305及/或使用者輸入裝置1410提供輸入直至右ZRP經對準為止。在接收到一確認指令之後,處理器1562旋即將在經設定放大位準下反映光學元件1402之位置之一校準點儲存至一查找表。In some examples, an operator may also manually align a right ZRP with an origin of the pixel grid 1002. For example, after a position of the right ZRP is determined, the processor 1562 (and/or the peripheral input unit interface 1574 or the graphics processing unit 1564) causes the right ZRP to be graphically highlighted on a right image displayed by the display monitor 512. The processor 1562 may also display a graphic indicating the origin of the pixel grid 1002. The operator uses the controls 305 and/or the user input device 1410 to steer the right ZRP to the origin. The processor 1562 uses instructions from the controls 305 and/or the user input device 1410 to move one or more of the optical elements 1402 accordingly. In addition to graphically displaying the current positions of the right ZRP and the origin, the processor 1562 may also provide a right image stream in real time to give the operator updated feedback on the positioning. The operator continues to provide input via the controls 305 and/or the user input device 1410 until the right ZRP is aligned. Upon receiving a confirmation instruction, the processor 1562 stores a calibration point reflecting the positions of the optical elements 1402 at the set magnification level in a lookup table.
3.對準誤差之比較 3. Comparison of alignment errors
與具有立體攝影機之已知數位外科手術顯微鏡相比較,實例性立體視覺化攝影機300產生右影像與左影像之間的較少對準誤差。下文所論述之分析針對具有攝影機之一已知數位外科手術顯微鏡及實例性立體視覺化攝影機300比較因ZRP不對準產生之假性視差。最初,兩個攝影機設定在一第一放大位準下,其中一焦平面定位於一患者之眼睛之一第一位置上。方程式(3)在下文用於判定自每一攝影機至眼睛之工作距離(「WD」)。Compared with known digital surgical microscopes having stereoscopic cameras, the example stereo visualization camera 300 produces less alignment error between the right and left images. The analysis discussed below compares the false parallax resulting from ZRP misalignment for a known digital surgical microscope with a camera and for the example stereo visualization camera 300. Initially, both cameras are set at a first magnification level, with a focal plane positioned at a first position on a patient's eye. Equation (3) below is used to determine the working distance ("WD") from each camera to the eye.
WD = (IPD / 2) / tan(α) - 方程式(3) / Equation (3)
在此方程式中,IPD對應於瞳孔間距離,其係大致23 mm。另外,α係(舉例而言)右光學影像感測器746與左光學影像感測器748之間的一角度之二分之一,其在此實例中係2.50°。聚光角度係此角度之兩倍,其在此實例中係5°。所得工作距離係263.39 mm。In this equation, IPD corresponds to the interpupillary distance, which is approximately 23 mm. In addition, α is, for example, one-half of an angle between the right optical image sensor 746 and the left optical image sensor 748, which in this example is 2.50°. The convergence angle is twice this angle, which in this example is 5°. The resulting working distance is 263.39 mm.
攝影機放大至一第二放大位準且在患者之眼睛之一第二位置上經三角量測。在此實例中,第二位置處於與第一位置相同之距攝影機之實體距離處,但在第二放大位準下呈現。放大率改變由於ZRP中之一者或兩者相對於一感測器像素網格之一中心之不對準而產生虛假水平視差。對於已知攝影機系統,判定假性視差係(舉例而言) 3弧分,其對應於0.05°。在上文之方程式(3)中,使0.05°值與α相加,此產生258.22 mm之一工作距離。工作距離差係5.17 mm (263.39 mm – 258.22 mm),此對應於具有攝影機附件之已知數位外科手術顯微鏡之誤差。The cameras are zoomed to a second magnification level and triangulated on a second position on the patient's eye. In this example, the second position is at the same physical distance from the camera as the first position, but is viewed at the second magnification level. The magnification change produces false horizontal parallax because one or both of the ZRPs are misaligned relative to a center of a sensor pixel grid. For the known camera system, the false parallax is determined to be, for example, 3 arcminutes, which corresponds to 0.05°. In equation (3) above, adding the 0.05° value to α yields a working distance of 258.22 mm. The working distance difference is 5.17 mm (263.39 mm – 258.22 mm), which corresponds to the error of a known digital surgical microscope with a camera attachment.
相比之下,實例性立體視覺化攝影機300能夠將ZRP自動對準至一像素組或網格之一中心之一個像素內。若角視域係5°且利用連同一4k顯示監視器使用之一4k影像感測器來記錄,則一個像素準確性對應於0.00125° (5°/4000)或4.5弧秒。使用上文之方程式(3),使0.00125°值與α相加,此產生263.25 mm之一工作距離。立體視覺化攝影機300之工作距離差係0.14 mm (263.39 mm – 263.25 mm)。當與已知數位外科手術顯微鏡之5.17 mm誤差相比較時,實例性立體視覺化攝影機300使對準誤差減少97.5%。In contrast, the exemplary stereo visualization camera 300 can automatically align the ZRP to one pixel in the center of a pixel group or grid. If the angular field of view is 5° and is recorded with a 4k image sensor connected to the same 4k display monitor, then one pixel accuracy corresponds to 0.00125° (5°/4000) or 4.5 arc seconds. Using equation (3) above, the value of 0.00125° is added to α, which results in a working distance of 263.25 mm. The working distance difference of the stereo vision camera 300 is 0.14 mm (263.39 mm – 263.25 mm). When compared with the 5.17 mm error of a known digital surgical operating microscope, the exemplary stereo vision camera 300 reduces the alignment error by 97.5%.
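The working-distance arithmetic in this comparison follows directly from equation (3) and can be checked numerically. The 0.14 mm figure quoted above reflects slightly different intermediate rounding; recomputing from equation (3) gives about 0.13 mm, the same conclusion.

```python
import math

def working_distance(ipd_mm, alpha_deg):
    """Equation (3): WD = (IPD / 2) / tan(alpha)."""
    return (ipd_mm / 2) / math.tan(math.radians(alpha_deg))

wd_true = working_distance(23, 2.50)           # ideal alignment
wd_known = working_distance(23, 2.50 + 0.05)   # known system: 3-arcmin ZRP error
wd_cam = working_distance(23, 2.50 + 0.00125)  # camera 300: one-pixel (4.5 arcsec) error

print(round(wd_true, 2))                       # 263.39
print(round(wd_true - wd_known, 2))            # 5.17
print(round(wd_true - wd_cam, 2))              # 0.13 (the text rounds to 0.14)
```

The roughly 40-fold smaller angular error translates into a roughly 40-fold smaller working-distance error, consistent with the 97.5% reduction stated in the text.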
在某些實施例中,立體視覺化攝影機300在較高解析度下可係更準確的。在上文之實例中,解析度對於一5°視域係大約4.5弧秒。對於具有2°之一視域之一8K超高清晰度系統(其中8000個像素在4000列中之每一者中),立體視覺化攝影機300之解析度係大致1弧秒。此意味左視圖及右視圖之ZRP可對準至一個像素或1弧秒。此顯著比具有大約若干弧分之假性視差之已知數位顯微鏡系統精確。In some embodiments, the stereo visualization camera 300 can be even more accurate at higher resolutions. In the example above, the resolution is approximately 4.5 arcseconds for a 5° field of view. For an 8K ultra-high-definition system with a 2° field of view (with 8000 pixels in each of 4000 rows), the resolution of the stereo visualization camera 300 is approximately 1 arcsecond. This means that the ZRPs of the left and right views can be aligned to within one pixel, or 1 arcsecond. This is significantly more precise than known digital microscope systems, which have false parallax on the order of several arcminutes.
4.其他假性視差源之減少 4. Reduction of other sources of false parallax
以上實例論述實例性立體視覺化攝影機300如何減少由於不對準ZRP及/或左影像與右影像本身而產生之假性視差。立體視覺化攝影機300亦可經組態以減少其他假性視差源。舉例而言,立體視覺化攝影機300可藉由同時對在實質上相同時刻記錄影像之右光學影像感測器746及左光學影像感測器748進行計時而減少由運動引起之假性視差。The examples above discuss how the example stereo visualization camera 300 reduces false parallax resulting from misaligned ZRPs and/or from misalignment of the left and right images themselves. The stereo visualization camera 300 can also be configured to reduce other sources of false parallax. For example, the stereo visualization camera 300 can reduce motion-induced false parallax by clocking the right optical image sensor 746 and the left optical image sensor 748 together so that they record images at substantially the same instant.
實例性立體視覺化攝影機300亦可減少由左光學路徑與右光學路徑之間的不類似放大率引起之假性視差。舉例而言,立體視覺化攝影機300可基於左光學路徑而設定放大位準。立體視覺化攝影機300然後可進行自動調整,使得右影像之放大率匹配左影像之放大率。舉例而言,處理器1562可(舉例而言)使用影像資料以藉由量測在左影像及右影像中共同之特定特徵之間的一像素數目而計算控制參數。處理器1562然後可藉由數位比例縮放、插入內插像素及/或刪除外來像素而使左影像及右影像之放大位準等化。實例性處理器1562及/或圖形處理單元1564可重新再現右影像,使得放大率匹配至左影像。另外或另一選擇係,立體視覺化攝影機300可包含左光學元件及右光學元件1402之獨立調整。處理器1562可單獨控制左光學元件及右光學元件1402以達成相同放大率。在某些實例中,處理器1562可首先設定(舉例而言)左放大位準,然後單獨調整右光學元件1402以達成相同放大位準。The exemplary stereo visualization camera 300 can also reduce false parallax caused by the dissimilar magnification between the left optical path and the right optical path. For example, the stereo visualization camera 300 can set the magnification level based on the left optical path. The stereoscopic visualization camera 300 can then automatically adjust so that the magnification of the right image matches the magnification of the left image. For example, the processor 1562 may, for example, use the image data to calculate the control parameter by measuring the number of pixels between a specific feature that is common in the left image and the right image. The processor 1562 can then equalize the magnification levels of the left and right images by digital scaling, inserting interpolated pixels, and/or deleting extraneous pixels. The example processor 1562 and/or the graphics processing unit 1564 can reproduce the right image so that the magnification is matched to the left image. Additionally or alternatively, the stereo vision camera 300 may include independent adjustments of the left optical element and the right optical element 1402. The processor 1562 can individually control the left optical element and the right optical element 1402 to achieve the same magnification. In some instances, the processor 1562 may first set (for example) the left magnification level, and then individually adjust the right optical element 1402 to achieve the same magnification level.
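The magnification-equalization step (measuring the pixel separation of features common to both views and digitally rescaling one image) can be sketched as follows; the feature separations used are assumed values for the example.

```python
def magnification_scale_factor(left_feature_px, right_feature_px):
    """Estimate the digital scale to apply to the right image so its
    magnification matches the left image, from the pixel separation of
    the same pair of features measured in both views."""
    return left_feature_px / right_feature_px

# The same two landmarks are 400 px apart in the left view but only
# 392 px apart in the right view, so the right image is scaled up.
scale = magnification_scale_factor(400, 392)
print(round(scale, 4))  # 1.0204
```

The resulting factor drives the digital scaling (interpolating or discarding pixels) described above, or, in the hardware alternative, the independent adjustment of the right optical elements.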
實例性立體視覺化攝影機300可進一步減少由不類似焦點引起之假性視差。在一實例中,處理器1562可執行針對一給定放大率及/或工作距離判定每一光學路徑之一最佳焦點之一程式1560。處理器1562首先執行光學元件1402在一最佳解析度點下之一聚焦。處理器1562然後可檢查在一適合非物件平面位置處之OOF條件且匹配左影像及右影像之焦點。處理器1562接下來重新檢查在最佳解析度下之焦點且反覆地調整焦點直至左光學元件及右光學元件1402兩者同樣良好地聚焦於一物件平面上且遠離該物件平面而聚焦為止。The example stereo visualization camera 300 can further reduce false parallax caused by dissimilar focus. In one example, the processor 1562 may execute a program 1560 that determines a best focus for each optical path for a given magnification and/or working distance. The processor 1562 first performs a focusing of the optical elements 1402 at a point of best resolution. The processor 1562 may then check the OOF condition at a suitable non-object-plane position and match the focus of the left and right images. The processor 1562 next rechecks the focus at best resolution and iteratively adjusts the focus until the left and right optical elements 1402 are both equally well focused at an object plane and away from the object plane.
The example processor 1562 can measure and verify best focus by monitoring a signal related to the focus of one or both of the right and left images. For example, the graphics processing unit 1564 generates a "sharpness" signal simultaneously and/or synchronously for the left and right images. The signal changes as the focus changes and may be determined by an image analysis program, an edge detection analysis program, a pattern-intensity Fourier-transform bandwidth program, and/or a modulation transfer function ("MTF") measurement program. The processor 1562 adjusts a focus of the optical elements 1402 while monitoring for a maximum signal, which indicates a sharp image.
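One common way to produce such a sharpness signal is sketched below, assuming a simple gradient-energy metric (the text also names edge detection, Fourier bandwidth, and MTF approaches; this sketch is illustrative, not the platform's algorithm).

```python
import numpy as np

def sharpness(image: np.ndarray) -> float:
    """Gradient-energy sharpness metric: the sum of squared intensity
    differences between horizontally and vertically adjacent pixels.
    The value rises as the image comes into focus."""
    img = image.astype(float)
    gx = np.diff(img, axis=1)   # horizontal neighbor differences
    gy = np.diff(img, axis=0)   # vertical neighbor differences
    return float((gx ** 2).sum() + (gy ** 2).sum())

# A crisp step edge scores higher than the same edge after softening,
# so sweeping focus while tracking this value finds the focal peak.
sharp = np.zeros((32, 32))
sharp[:, 16:] = 255.0
blurred = np.zeros((32, 32))
blurred[:, 14:18] = np.linspace(0, 255, 4)   # softened transition
blurred[:, 18:] = 255.0
```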
To optimize the OOF condition, the processor 1562 can monitor the sharpness signals of both the left and right images. If the focus is moved away from the object plane and the signal related to, for example, the left image increases while the signal related to the right image decreases, the processor 1562 is configured to determine that the optical elements 1402 have moved out of focus. However, if the signals related to both the right and left images are relatively high and approximately equal, the processor 1562 is configured to determine that the optical elements 1402 are properly positioned for focusing.
5. Benefits of Low False Parallax
The example stereoscopic visualization camera 300 has several advantages over known digital surgical microscopes due to the low false parallax between the right and left images. For example, nearly complete alignment of the left and right images produces a nearly perfect stereoscopic display for a surgeon, thereby reducing eye fatigue. This allows the stereoscopic visualization camera 300 to serve as an extension of a surgeon's eyes rather than a cumbersome tool.
In another example, precisely aligned left and right images allow accurate digital measurement of the surgical site. For example, the size of a patient's lens capsule can be measured so that a properly sized intraocular lens ("IOL") can be determined and accurately implanted. In another example, the motion of a moving blood vessel can be measured so that an infrared fluorescein overlay can be accurately placed in a fused image. Here, the actual speed of motion is generally not of interest to the surgeon, but it is critical for the placement and real-time adjustment of the overlaid image. Properly matched scale, registration, and viewing angle of the overlaid image are all important for providing an accurately fused combination of the live stereoscopic image and an alternate-modality image.
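A measurement such as the capsule sizing above reduces to converting an on-screen pixel distance back to object-space units through the current magnification. A minimal sketch follows; the sensor pixel pitch and magnification values are illustrative assumptions, not parameters of the camera 300.

```python
import math

def object_distance_mm(p1, p2, pixel_pitch_um: float, magnification: float) -> float:
    """Convert a pixel-space distance between two image points into an
    object-space distance in millimetres: object size = image size / M."""
    px = math.dist(p1, p2)                   # separation in pixels
    image_mm = px * pixel_pitch_um / 1000.0  # separation on the sensor
    return image_mm / magnification          # back-project to the object plane

# Example: two points drawn 600 px apart, assuming 3.45 um pixels and
# 20x optical magnification, correspond to ~0.10 mm on the object plane.
d = object_distance_mm((100, 200), (700, 200), pixel_pitch_um=3.45, magnification=20.0)
```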
In some examples, the processor 1562 may enable an operator to draw measurement parameters on the display monitor 512. The processor 1562 receives the drawn on-screen coordinates and translates those coordinates into the stereoscopic image. The processor 1562 may determine measurement values by scaling the drawn dimensions on the display monitor 512 to the magnification level shown in the stereoscopic image. Measurements performed by the processor 1562 include point-to-point measurements between two or three positions displayed in the stereoscopic display, point-to-surface measurements, surface characterization measurements, volume determination measurements, velocity verification measurements, coordinate transformations, instrument and/or tissue tracking, and so forth.
VII. Example Robotic System for the Stereoscopic Visualization Camera
As discussed in connection with FIGS. 5 and 6, the example stereoscopic visualization camera 300 can be connected to a mechanical or robotic arm 506 as part of a stereoscopic visualization platform or stereoscopic robotic platform 516. The example robotic arm 506 is configured to enable an operator to position and/or orient the stereoscopic visualization camera 300 above and/or next to a patient during one or more procedures. The robotic arm 506 thus enables an operator to move the stereoscopic visualization camera 300 to a desired field of view ("FOV") of a target surgical site. Surgeons generally prefer to position and/or orient the camera in a FOV similar to their own, for easier visual orientation and for correspondence between the image displayed on a screen and the surgeon's FOV. The example robotic arm 506 disclosed herein provides structural flexibility and assisted control so that its positioning can coincide with a surgeon's FOV without blocking the surgeon's own FOV.
Compared with the stereoscopic robotic platform 516 disclosed herein, known stereoscopic microscope holding devices include a simple mechanical arm that is moved manually by an operator. These devices include multiple rotational joints equipped with electromechanical brakes that allow manual repositioning. In addition, to allow an operator to easily change a view without interrupting a procedure, some known holding devices have motorized joints. These motorized joints range in complexity from, for example, simple X-Y positioning up to devices that manipulate multiple independent rotational joints connected by rigid arms. During most procedures, it is desirable to obtain views from various directions quickly and easily. However, known stereoscopic microscope holding devices suffer from one or more problems.
Known stereoscopic microscope holding devices have limited positional, directional, and/or orientational accuracy, generally bounded by the surgeon's manual ability to maneuver the microscope to view images in a desirable manner. Holding devices with multiple joints can be especially cumbersome to operate because manipulating the device typically causes all joints to move simultaneously. Often, an operator watches how an arm moves. After the arm is positioned in a desired location, the operator checks whether the FOV of the imaging device is aligned with the desired location. In many cases, even when the device is properly aligned, the focus of the device must still be adjusted. Further, known stereoscopic microscope holding devices cannot provide a consistent FOV or focal plane with respect to other objects in a target surgical site, because the devices lack arm position memory or because such memory becomes inaccurate when a patient is moved or shifted during a procedure.
Known stereoscopic microscope holding devices generally have positioning systems whose control is independent of microscope parameters such as object-plane focal distance, magnification, and illumination. For these devices, positioning and, for example, zooming must be coordinated manually. In one example, an operator may reach a lens limit while focusing or changing a working distance. The operator must then manually change the position of the holding device and refocus the stereoscopic microscope.
Known stereoscopic microscope holding devices are intended only for observation of a surgical site. These known devices do not determine the position of, or distance from, tissue within a FOV to another object outside the FOV. Nor do these known devices compare tissue within a live surgical site with other objects to form an alternate viewing modality, such as combining an MRI image with a live view. Instead, views from known devices are displayed separately and are not aligned with other medical images or templates.
In addition, known stereoscopic microscope holding devices have parameters that may be inaccurate, because precision (other than for observation) is rarely emphasized. The requirements of ISO standard 10936-1:2000(E), "Optics and optical instruments – Operation microscopes – Part 1: Requirements and test methods," were largely derived so that a normal human operator using eyepieces can achieve reasonable stereoscopic optical image quality. The operator's brain combines the views into a mental image to achieve stereopsis. The views are generally not combined or otherwise compared with one another. As long as the operator sees an acceptable image and does not suffer harmful effects such as headaches, the operator's needs are met. The same holds for stereoscopic microscope holding devices, in which some instability, arm sag, and imprecise movement control are permitted. However, when high-resolution digital cameras are used with known holding devices, structural inaccuracies are easily observable and can reduce their usefulness, especially for microsurgical procedures.
As mentioned above, known stereoscopic microscope holding devices can sag under the weight of a camera. In general, known robotic positioning systems are calibrated to determine compliance or inaccuracy only for the system itself. These stereoscopic microscope holding devices do not account for any inaccuracy between the camera, or a camera mount, and the holding device. Sag is generally compensated by an operator manually positioning the camera while observing an image on a display. In systems that provide motorized motion, for example, a change in sag occurs when the camera's center of gravity ("CG") is repositioned to the opposite side of a rotational axis of an arm joint, where the restoring torque about the axis reverses direction. Any compliance or sag in the mechanism, which an operator had compensated by adjusting the position, direction, and/or orientation of the camera, then adds positional, directional, and/or orientational error. In some situations, for example when the camera is moved through a robotic singularity, the torque reversal occurs rapidly and the resulting camera image error shifts quickly and excessively. This error limits the ability of known stereoscopic microscope holding devices to, for example, accurately follow or track tissue or instruments in a site.
Known stereoscopic microscope holding devices include features for spatially locating and tracking surgical instruments and for providing a subsequent representative display of them on a monitor. However, these known systems require a prominently positioned additional stereotactic camera or triangulation device, as well as conspicuous fiducial devices on the instruments. The added devices add complexity, cost, and operational awkwardness.
The example stereoscopic robotic platform 516 disclosed herein includes the example stereoscopic visualization camera 300 connected to a mechanical or robotic arm 506. FIGS. 5 and 6 illustrate an example of the stereoscopic robotic platform 516. Stereoscopic images recorded by the camera 300 are displayed via one or more display monitors 512, 514. The robotic arm 506 is mechanically connected to a cart 510, which may also support one or more of the display monitors 512, 514. For example, the robotic arm may comprise an articulated robotic arm that is generally anthropomorphic in size, character, function, and operation.
FIG. 33 shows a side view of the microsurgical environment 500 of FIG. 5 according to an example embodiment of the present disclosure. In the illustrated example, the display monitor 512 may be connected to the cart 510 via a mechanical arm 3302 having one or more joints to enable flexible positioning. In some embodiments, the mechanical arm 3302 may be long enough to extend over a patient during surgery to provide a relatively close view for a surgeon.
FIG. 33 also illustrates a side view of the stereoscopic robotic platform 516, including the stereoscopic visualization camera 300 and the robotic arm 506. The camera 300 is mechanically coupled to the robotic arm 506 via a coupling plate 3304. In some embodiments, the coupling plate 3304 may include one or more joints that provide additional degrees of positioning and/or orientation of the camera 300. In some embodiments, the coupling plate 3304 must be moved or rotated manually by an operator. For example, the coupling plate 3304 may have a joint that enables the camera 300 to be quickly repositioned between having its optical axis along a z-axis (i.e., pointing downward toward a patient) and along an x-axis or y-axis (i.e., pointing laterally toward a patient).
The example coupling plate 3304 may include a sensor 3306 configured to detect forces and/or torques applied by an operator to move the camera 300. In some embodiments, an operator may position the camera 300 by grasping the control arms 304a and 304b (shown in FIG. 3). After the operator has gripped the control arms 304a and 304b, the user may position and/or orient the camera 300 with the assistance of the robotic arm 506. The sensor 3306 detects a force vector or torque angle provided by the operator. The example platform 516 disclosed herein uses the sensed force/torque to determine which joints of the robotic arm 506 should be rotated (and how quickly they should be rotated) to provide assisted movement of the camera 300 corresponding to the force/torque provided by the operator. The sensor 3306 may be located at an interface between the coupling plate 3304 and the camera 300 for detecting force and/or torque applied by an operator via the control arms 304.
In some embodiments, the sensor 3306 may include, for example, a six-degree-of-freedom haptic force-sensing module. In these embodiments, the sensor 3306 can detect translational force or motion along the x-, y-, and z-axes. The sensor 3306 can also separately detect rotational force or motion about a yaw axis, a pitch axis, and a roll axis. Decoupling the translational and rotational forces can make it easier for the stereoscopic robotic platform 516 to compute the forward and/or inverse kinematics used to control the robotic arm 506.
The example sensor 3306 may be configured to detect force because the robotic arm 506 cannot be moved by a user alone. Instead, the sensor 3306 detects the translational and rotational forces applied by a user, which the stereoscopic robotic platform 516 uses to determine which joints to rotate to provide assisted movement control of the robotic arm 506. In other examples, the robotic arm 506 may permit the operator to move it without assistance, or at least without initial assistance. In these other examples, the sensor 3306 detects motion imparted by the user, which the stereoscopic robotic platform 516 uses to subsequently cause one or more joints to rotate, thereby providing assisted movement. The time between the initial detection of motion, or of a force producing motion, and the stereoscopic robotic platform 516 causing joint rotation may be less than 200 milliseconds ("ms"), 100 ms, 50 ms, or as little as 10 ms, such that the user does not notice the initial period of unassisted movement of the robotic arm 506.
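The mapping from a sensed operator wrench to joint rotations is commonly done with an admittance-style law, sketched below. The Jacobian, gain, and speed limit here are illustrative assumptions, not values from the platform 516.

```python
import numpy as np

def assisted_joint_velocities(wrench: np.ndarray,
                              jacobian: np.ndarray,
                              admittance_gain: float = 0.01,
                              max_joint_speed: float = np.pi) -> np.ndarray:
    """Admittance-style assist: convert a 6-DOF wrench [Fx, Fy, Fz, Tx, Ty, Tz]
    sensed at the camera into joint velocities.  The gain turns the wrench
    into a desired Cartesian twist, the Jacobian pseudo-inverse distributes
    that twist over the joints, and each joint speed is clamped."""
    twist = admittance_gain * wrench            # desired camera velocity
    qdot = np.linalg.pinv(jacobian) @ twist     # joint-space command
    return np.clip(qdot, -max_joint_speed, max_joint_speed)

# A purely downward 20 N push, with an identity Jacobian standing in for
# the real arm geometry, commands only the joint aligned with z translation.
J = np.eye(6)
qdot = assisted_joint_velocities(np.array([0.0, 0.0, -20.0, 0.0, 0.0, 0.0]), J)
```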
The example sensor 3306 may output digital data indicative of rotational force/motion and digital data indicative of translational force/motion. In this example, the digital data may have 8-, 16-, 32-, or 64-bit resolution for the detected force/motion on each axis. Alternatively, the sensor 3306 may transmit an analog signal proportional to the sensed force and/or motion. The example sensor 3306 may transmit data at a periodic sampling rate of, for example, 1 ms, 5 ms, 10 ms, 20 ms, 50 ms, 100 ms, etc. Alternatively, the sensor 3306 may provide a nearly continuous stream of force/motion data.
In some embodiments, the example sensor 3306 may instead be located in one or more of the control arms 304a and 304b, or between the control arms 304a and 304b and the housing 302. In examples in which each of the control arms 304a and 304b includes a sensor 3306, the example stereoscopic robotic platform 516 may receive two sets of translational and rotational forces or motions. In these examples, the stereoscopic robotic platform 516 may average the values from the sensors 3306.
In the illustrated embodiment, a first end of the robotic arm 506 is mounted to the cart 510, and a second, opposite end of the robotic arm is mechanically connected to the stereoscopic visualization camera 300 (e.g., as a robotic end effector). FIG. 33 shows the robotic arm 506 holding the stereoscopic visualization camera 300 in an extended position, such as positioning the camera 300 over a surgical site while keeping the remainder of the platform 516 out of a surgeon's way. The cart 510 is configured to securely hold the stereoscopic robotic platform 516 and is weighted and balanced to prevent tipping in designated operating positions.
The example stereoscopic robotic platform 516 is configured to provide the following benefits.
1. Enhanced visualization. Communication between the robotic arm 506 and the stereoscopic visualization camera 300 enables the platform 516 to point and maneuver the camera 300 to visualize the surgical site quickly and more accurately. For example, the robotic arm 506 can move the camera 300 along its optical axis to extend the focus and zoom ranges beyond those contained in the camera alone. The relatively small size of the platform 516 provides Heads-Up Surgery® in a wider variety of surgical procedures and orientations, thereby improving surgical efficiency and surgeon ergonomics.
2. Enhanced dimensional performance. The example stereoscopic visualization camera 300, with its ability to accurately measure all points within a stereoscopic image, is configured to communicate measurement information to the robotic arm 506. The robotic arm 506 in turn includes accurate position, direction, and/or orientation determination capability and is registered to the camera 300, so that dimensions within and between images can each be accurately transformed into a coordinate system common to the stereoscopic robotic platform 516 and a patient's anatomy.
3. The quality and accuracy of the stereoscopic image data from the stereoscopic visualization camera 300 enable it to be combined with image or diagnostic data from external sources of various modalities to construct fused images. Surgeons can use these fused images to perform procedures more accurately and efficiently and to achieve better patient outcomes.
4. The stereoscopic visualization camera 300, the robotic arm 506, and/or an image and motion processor (e.g., the processor 1408 of FIG. 14) can be programmed to achieve beneficial procedural applications. For example, a particular visualization site position, direction, and/or orientation can be saved and then returned to later in a procedure. Precise motion paths can be programmed to, for example, follow a particular length or line of tissue. In other examples, pre-programmed waypoints can be set, thereby permitting an operator to change a position and/or orientation of the robotic arm 506 based on which steps are being performed during a medical procedure.
5. The stereoscopic robotic platform 516 provides inherently guided surgery by using and analyzing accurate image position information. This guidance can be communicated to other devices, such as another robotic system that performs at least portions of a surgical procedure. Components of the stereoscopic robotic platform 516 that share functionality with components of these other devices can be integrated together into one package to achieve efficiencies in performance, accuracy, and cost.
A. Robot Arm Embodiments
FIG. 34 illustrates an embodiment of the example robotic arm 506 according to an example embodiment of the present disclosure. In some embodiments, the robotic arm 506 is similar to, or includes, the model UR5 from Universal Robots A/S. The external surfaces of the robotic arm 506 comprise aluminum and plastic materials, which are compatible with use in an operating room and are easily cleaned.
Although the robotic arm 506 is described herein as electromechanical, in other examples the robotic arm 506 may be mechanical, hydraulic, or pneumatic. In some embodiments, the robotic arm 506 may have a hybrid actuation mechanism that, for example, uses a pinhole suction cup together with a control valve to hold and maneuver the camera 300. In addition, although the robotic arm 506 is described below as including a particular number of joints and links, it should be understood that the robotic arm 506 may include any number of joints, links of any length, and/or any type of joint or sensor.
As described herein, the robotic arm 506 is situated, and its joints are oriented, to provide an unobstructed view of an operating area while providing an operator with a 3D stereoscopic display for any surgical procedure on a patient. Movement of the robotic arm 506 during non-critical motions is fast enough to be convenient and safe for an operator. Movement of the robotic arm 506 during surgery is controlled to be deliberate and accurate. In addition, movement of the robotic arm is controlled to be smooth and predictable throughout the entire range of motion required for a surgical procedure. As described herein, movement of the robotic arm 506 can be controlled by remote controls or via manual manipulation of the arm itself. In some embodiments, the robotic arm 506 is configured so that it can be positioned with minimal force (e.g., via an assisted guidance feature) using, for example, only a single little finger.
In some embodiments, the robotic arm 506 may include mechanically or electronically locking brakes on the joints. The brakes may be engaged once the target or "pose" of the camera 300 has been set by an operator. The robotic arm 506 may include a lock/unlock switch or other input device to prevent undesired manual or accidental movement. When locked, the example robotic arm provides sufficient stability to enable the stereoscopic visualization camera 300 to provide a stable, sharp image. Additionally or alternatively, the robotic arm 506 may include one or more damping devices to absorb or attenuate vibration after the stereoscopic visualization camera 300 moves to a new pose. For example, the damping devices may include fluid-filled linear or rotary dampers, rubber-based vibration-isolation mount dampers, and/or tuned mass-spring dampers. Alternatively or additionally, the arm 506 may, for example, include electromechanical damping through the use of a proportional-integral-derivative ("PID") servo system.
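Electromechanical damping with a PID servo, as mentioned above, can be sketched as a discrete loop that drives a joint back toward its held pose. This is a generic PID illustration; the gains, time step, and joint inertia below are assumptions, not platform parameters.

```python
class PID:
    """Minimal discrete PID servo used to hold a joint at a setpoint,
    electromechanically damping residual vibration."""
    def __init__(self, kp: float, ki: float, kd: float, dt: float):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint: float, measured: float) -> float:
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Simulate a joint disturbed 0.05 rad from its held pose; the servo
# torque drives the error back toward zero over a 2 s, 1 kHz loop.
pid = PID(kp=40.0, ki=5.0, kd=8.0, dt=0.001)
angle, velocity = 0.05, 0.0
for _ in range(2000):
    torque = pid.update(0.0, angle)
    velocity += torque * 0.001 / 0.1   # assumed joint inertia 0.1 kg*m^2
    angle += velocity * 0.001
```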
The example robotic arm 506 may be configured with a stowed position to which one or more links return for transport and storage. A stowed position enables the robotic arm to be transported and stored within a compact footprint while still deploying the long reach needed in certain surgical procedures. Cables, such as those routed for the stereoscopic visualization camera 300, are provided along the robotic arm 506 so as to avoid interfering with a surgical procedure.
In the illustrated embodiment of FIG. 34, the robotic arm 506 includes six joints, labeled R1, R2, R3, R4, R5, and R6. In other embodiments, the robotic arm 506 may include fewer or additional joints. In addition, in some embodiments, at least some of the joints R1 to R6 are capable of +/-360° of rotational motion. Rotational motion may be provided by an electromechanical subsystem that includes, for each joint, an electric motor configured to drive a mechanical rotary joint through one or more anti-backlash joint gearboxes. Each of the joints R1 to R6 may include one or more rotation sensors to detect joint position. In addition, each joint may include a slip clutch and/or an electromechanical brake.
Each of the joints R1 to R6 may have an overall motion repeatability of approximately +/- 1/10 of a millimeter ("mm") with the camera 300 attached. The joints may have variable rotational speeds controllable between 0.5°/second and 180°/second. Together, this translates into camera movement between 1 mm/second and 1 meter/second. In some embodiments, the stereoscopic robotic platform 516 may have speed governors for one or more of the joints R1 to R6 that are engaged during a surgical procedure. Each of the joints R1 to R6 may be electrically connected to a power supply and/or command lines in a controller of the robotic arm 506. Wires for power and command signals may be routed inside the joints and links. In addition, one or more of the joints may include dampers, such as o-rings, for connection to the links. The dampers may, for example, reduce or absorb vibration in the robotic arm 506, vibration from the cart 510, and/or vibration imparted via the stereoscopic visualization camera 300.
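The stated 1 mm/second to 1 meter/second camera speed follows directly from v = ω·r applied to the quoted joint-speed range; the lever-arm radii in this quick check are illustrative assumptions, since the effective radius depends on arm configuration.

```python
import math

def tip_speed_m_per_s(joint_speed_deg_per_s: float, radius_m: float) -> float:
    """Linear speed of a point at radius r from a joint rotating at ω."""
    return math.radians(joint_speed_deg_per_s) * radius_m

# 180°/s about a ~0.318 m lever arm moves the camera at ~1 m/s, while
# 0.5°/s about a ~0.115 m lever arm moves it at ~1 mm/s, spanning the
# 1 mm/s to 1 m/s range quoted in the text.
fast = tip_speed_m_per_s(180.0, 0.318)
slow = tip_speed_m_per_s(0.5, 0.115)
```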
The joint R1 comprises a base joint that is mechanically coupled to a flange 3402, which is secured to a fixed structure 3404. The flange 3402 may include any type of mechanical connector. The fixed structure 3404 may include, for example, the cart 510 of FIG. 5, a wall, a ceiling, a table, etc. The joint R1 is configured to rotate about a first axis 3410 (which may comprise the z-axis).
The joint R1 is connected to the joint R2 via a link 3430. The example link 3430 includes a cylinder or other tubular structure configured to provide structural support for the downstream sections of the robotic arm 506. The link 3430 is configured to provide a rotationally rigid connection with the joint R2 so that the joint R2 can rotate while the link 3430 is held in place by its connection to the joint R1. The joint R2 may include, for example, a shoulder joint configured to rotate about an axis 3412. The example axis 3412 is configured to be perpendicular (or substantially perpendicular) to the axis 3410. Given that the joint R1 rotates about the z-axis, the axis 3412 is configured to lie in an x-y plane.
The joint R2 is mechanically coupled to the joint R3 via a link 3432. The link 3432 is configured to have a greater length than the link 3430 and to provide structural support for the downstream portion of the robotic arm 506. The joint R3 may include, for example, an elbow joint. Together with the joint R2, the joint R3 provides extendable positioning and/or orientation of the robotic arm 506. The joint R3 is configured to rotate about an axis 3414, which is perpendicular or orthogonal to the axis 3410 and parallel to the axis 3412.
The joint R3 is connected to the joint R4 via a link 3434, which provides structural support for the downstream portion of the robotic arm 506. The example joint R4 may be a first wrist joint configured to provide rotation about an axis 3416, which may be orthogonal to the axes 3412 and 3414. The joint R4 is mechanically connected to the joint R5 via a link 3436. The joint R5 may be a second wrist joint configured to provide rotation about an axis 3418, which is orthogonal to the axis 3416. The joint R5 is mechanically connected to the joint R6 via a link 3438. The joint R6 may be a third wrist joint configured to rotate about an axis 3420, which is orthogonal to the axis 3418. Altogether, the wrist joints R4 to R6 provide precise flexibility in positioning the stereoscopic visualization camera 300 described herein.
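The camera pose that results from a set of joint angles follows from composing, joint by joint, a rotation about each joint axis with a translation down the next link. A forward-kinematics sketch of such a chain is shown below; the axis pattern, link lengths, and frame conventions are illustrative assumptions, not dimensions from this disclosure:

```python
import math

def _rot(axis, t):
    """4x4 homogeneous rotation about a principal axis ('x', 'y', or 'z')."""
    c, s = math.cos(t), math.sin(t)
    if axis == 'z':
        return [[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
    if axis == 'y':
        return [[c, 0, s, 0], [0, 1, 0, 0], [-s, 0, c, 0], [0, 0, 0, 1]]
    return [[1, 0, 0, 0], [0, c, -s, 0], [0, s, c, 0], [0, 0, 0, 1]]

def _trans_z(d):
    """4x4 homogeneous translation along the local z-axis (one link)."""
    return [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, d], [0, 0, 0, 1]]

def _mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# Illustrative axis pattern: R1 spins about z (axis 3410), R2/R3 about parallel
# horizontal axes (3412/3414), and the wrist joints R4-R6 alternate.
AXES = ('z', 'y', 'y', 'x', 'y', 'x')

def connector_position(angles_deg, link_lengths_mm):
    """Position of connector 3450 for joint angles R1..R6 (degrees), obtained
    by composing rotate-then-extend transforms down the chain of links."""
    T = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
    for axis, q, link in zip(AXES, angles_deg, link_lengths_mm):
        T = _mul(T, _rot(axis, math.radians(q)))
        T = _mul(T, _trans_z(link))
    return (T[0][3], T[1][3], T[2][3])
```

With all angles at zero the connector sits directly above the base at the summed link length; rotating the base joint R1 alone leaves that point unchanged, while rotating the shoulder R2 by 90° swings the downstream links horizontal.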
The example robotic arm 506 includes a connector 3450, which is connected to the joint R6 via a link 3440. In some embodiments, the example link 3440 may include a sleeve that enables the joint R6 to rotate the connector 3450. As discussed herein, the connector 3450 may be configured to mechanically couple to the coupling plate 3304, or directly to the stereoscopic visualization camera 300 when a coupling plate is not used. The connector 3450 may include one or more screws to fasten the robotic arm 506 to the coupling plate 3304 and/or the stereoscopic visualization camera 300.
In some embodiments, the robotic arm 506 of the illustrated example may have a maximum reach of 85 mm in an orientation roughly similar to that of a human arm. The arm 506 may have a payload capacity of 5 kilograms. In addition, the arm 506 may be configured as a "collaborative" device for safe operation in the vicinity of humans. For example, the maximum force that the robotic arm 506 can apply to an external surface may be controlled. If a portion of the robotic arm unexpectedly contacts another object, the collision is detected and motion is stopped immediately. During an emergency stop scenario (for example, one in which power is lost), the joints R1 to R6 can be back-driven or manually rotated so that an operator can grasp a portion of the robotic system and swing it out of the way. For example, slip clutches within the joints limit the maximum torque that the joint motors can rotationally apply to the arm 506 during operation. When power is off, the slip clutches of the joints slip when manually manipulated, allowing an operator to quickly move the robotic arm 506 out of the way.
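At its simplest, the collaborative behavior described above (detect unexpected contact, then stop immediately) reduces to comparing each joint's measured torque against a configured limit. A minimal supervision sketch; the limits, units, and joint count are illustrative, not values from this disclosure:

```python
def supervise_joints(measured_torques_nm, torque_limits_nm):
    """Return ('stop', joint_index) at the first joint whose measured torque
    exceeds its collaborative limit (unexpected contact), else ('run', None).
    Joint indices follow the R1..R6 numbering used above."""
    for index, (torque, limit) in enumerate(zip(measured_torques_nm,
                                                torque_limits_nm), start=1):
        if abs(torque) > limit:
            return ('stop', index)  # brake all joints immediately
    return ('run', None)
```

A real controller would run this check at the servo rate and latch the stop state until an operator clears it; the sketch only shows the threshold comparison itself.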
FIGS. 35 to 40 illustrate example configurations of the robotic arm 506 and the stereoscopic visualization camera 300, according to example embodiments of the present disclosure. FIG. 35 shows a diagram of the robotic arm 506 connected to the cart 510 via the flange 3402. In this example, the stereoscopic visualization camera 300 is connected directly to the connector 3450. In this embodiment, the connector 3450 and/or the stereoscopic visualization camera 300 may include the sensor 3306 of FIG. 33 for sensing translational and/or rotational forces/motion applied to the stereoscopic visualization camera 300 by an operator. If the connector 3450 includes the sensor 3306, the output force/motion data may be transmitted to a controller through the robotic arm 506. If, for example, the sensor 3306 is located on the stereoscopic visualization camera 300, the output data may be transmitted to a separate controller along with the control data. In some embodiments, a controller may be provided in the cart 510 or separately at a server.
FIG. 36 shows an embodiment in which the robotic arm 506 is mounted to a ceiling 3404 via the flange 3402. The robotic arm may be suspended from the ceiling of an operating room to reduce floor-space clutter. The robotic arm 506, including its joints, can be positioned above and reach across the region in which surgical activity is performed without getting in the way of the surgeon and operating-room staff, while still providing functional positioning and/or orientation of the camera 300 and a clear view of the display monitors 512 and 514.
FIG. 37 shows an embodiment of the coupling plate 3304. In the illustrated example, a first end 3702 of the coupling plate 3304 is connected to the connector 3450 of the robotic arm 506. A second end 3704 of the coupling plate 3304 is connected to the stereoscopic visualization camera 300. The example coupling plate 3304 is configured to provide additional degrees of freedom for moving the stereoscopic visualization camera 300. The coupling plate 3304 also extends the maximum reach of the robotic arm 506. The coupling plate 3304 may have a length of between 10 cm and 100 cm.
The coupling plate 3304 may include one or more joints. In the illustrated example, the coupling plate 3304 includes joints R7, R8, and R9. The example joints are mechanical joints that provide rotation about respective axes. The joints R7 to R9 may comprise rotatable latch mechanisms that can be moved after an operator actuates a release button or lever. Each of the joints R7 to R9 may have its own release button, or a single button may release each of the joints R7 to R9.
The joints R7 to R9 may be connected together via respective links. In addition, a link 3718 is provided for connection to the connector 3450 of the robotic arm 506. The joint R7 is configured to rotate about an axis 3710, the joint R8 is configured to rotate about an axis 3712, and the joint R9 is configured to rotate about an axis 3714. The axes 3710 and 3714 are parallel to each other and orthogonal to the axis 3712. The joints R7 and R9 may be configured to provide +/- 360° of rotation. In other examples, the joints R7 and R9 may provide +/- 90°, +/- 180°, or +/- 270° of rotation about the respective axes 3710 and 3714. The joint R8 may provide +/- 90° of rotation about the axis 3712. In some examples, the joint R8 may only be settable to +90°, 0°, and -90°.
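The differing ranges of R7 to R9 can be captured in a small validation helper: R7 and R9 are clamped to a continuous range, while R8 (in the detented variant described above) snaps to its three allowed settings. A sketch, with the continuous-range default matching the +/- 360° case:

```python
def clamp_r7_r9(angle_deg, lo=-360.0, hi=360.0):
    """Clamp a requested R7 or R9 angle to the joint's continuous range."""
    return max(lo, min(hi, angle_deg))

def snap_r8(angle_deg):
    """Snap a requested R8 angle to its nearest allowed detent (+90, 0, -90)."""
    return min((-90.0, 0.0, 90.0), key=lambda detent: abs(detent - angle_deg))
```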
In some embodiments, the joints R7 to R9 may include motors that provide continuous movement. The joints R7 to R9 may also include control devices, such as switches or position sensors, that transmit or provide data indicative of a rotational position. In this manner, the joints R7 to R9 can be similar to the joints R1 to R6 of the robotic arm 506 and can provide auxiliary movement and position sensing for feedback control. Power and control for the joints R7 to R9 may be provided via wires routed through the robotic arm 506, via power/wire connectors within the connector 3450, and/or via wires external to the robotic arm 506.
FIG. 37 shows an example in which the stereoscopic visualization camera 300 is positioned in a horizontal orientation such that an optical axis 3720 is provided along a z-axis. The horizontal orientation may be used to image a patient who is lying down. By contrast, FIG. 38 shows an embodiment in which the joint R8 is rotated 90° to position the camera 300 in a vertical orientation such that the optical axis 3720 is provided along an x-axis, or along a y-axis orthogonal to the x-axis. The vertical orientation may be used to image a patient who is seated. It should be appreciated that the joint R8 enables the stereoscopic visualization camera 300 to be quickly reoriented between the horizontal and vertical positions based on the procedure.
In the illustrated examples of FIGS. 36 and 37, the example sensor 3306 may be located, for example, at the connector 3450 of the robotic arm (where the coupling plate 3304 attaches) and/or at the first end 3702 of the coupling plate (at the connection to the connector 3450). Alternatively or additionally, the example sensor 3306 may be located, for example, at the second end 3704 of the coupling plate (at the connection to the camera 300) and/or on the camera 300 at its connection to the second end 3704 of the coupling plate 3304.
FIG. 39 shows an example of the stereoscopic visualization camera 300 in the horizontal orientation and rotated +90° about the axis 3714 of the joint R9. FIG. 40 shows an example of the stereoscopic visualization camera 300 in the horizontal orientation and rotated -90° about the axis 3714 of the joint R9.
As illustrated in FIGS. 34 to 40, the example robotic arm 506 is configured to support the stereoscopic visualization camera 300 and to allow precise positioning, orientation, and aiming of the camera's optical axis. Because the stereoscopic visualization camera 300 has no oculars and does not need to be oriented toward a surgeon's eyes, many desirable positions and/or orientations for imaging become achievable that previously were impractical. A surgeon can operate using the view that is best for the procedure, rather than the view that is best for the surgeon's orientation relative to the oculars.
The example robotic arm 506, when used with the stereoscopic visualization camera 300, enables a surgeon to look around corners and into other locations that are not easily seen. The robotic arm 506 also allows the patient to be placed in different positions, including supine, prone, sitting, semi-sitting, etc. The robotic arm 506 accordingly enables the patient to be placed in the optimal position for a particular procedure. The example robotic arm 506, when used with the stereoscopic visualization camera 300, can be positioned to achieve the least obtrusive placement. The arm 506 and the camera 300 thus offer a surgeon numerous possibilities for visual position and orientation while being conveniently positioned and oriented out of the way.
The arrangement of the links and joints of the robotic arm 506 and/or the coupling plate 3304, together with the motorized six (or nine) degrees of freedom, generally allows the camera 300 to be positioned as desired, with the link and joint configuration not being unique to a given camera pose. As discussed in more detail below, the joints and links of the arm 506 and/or the plate 3304 can be manually repositioned and/or reoriented without changing the pose or FOV of the camera 300. For example, this configuration allows an elbow joint to be moved out of an obstructed sight line without changing the view of the surgical site through the camera 300. In addition, a control system can determine the position and pose of the camera 300 and can calculate and display alternative positions and/or orientations of the robotic arm 506 to avoid, for example, obstructing staff or displays. Using the various positions and/or orientations of the coupling plate 3304, together with the ability of an image processor to flip, invert, or otherwise reorient the displayed image, permits even more robotic arm 506 positions and/or orientations.
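The idea that the joint configuration is not unique to a camera pose can be seen even in a planar two-link simplification, where the familiar "elbow-up" and "elbow-down" solutions place the tool at the same point. A sketch of that simplification (the 0.4 m link lengths are illustrative assumptions):

```python
import math

def two_link_fk(q1, q2, l1=0.4, l2=0.4):
    """Planar 2-link arm: tool position for joint angles q1, q2 (radians)."""
    x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
    return (x, y)

def elbow_solutions(x, y, l1=0.4, l2=0.4):
    """Both inverse-kinematics solutions (elbow-up and elbow-down) that place
    the tool at (x, y): two distinct arm shapes, one camera position."""
    d2 = x * x + y * y
    c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)  # cosine of the elbow angle
    solutions = []
    for q2 in (math.acos(c2), -math.acos(c2)):
        q1 = math.atan2(y, x) - math.atan2(l2 * math.sin(q2),
                                           l1 + l2 * math.cos(q2))
        solutions.append((q1, q2))
    return solutions
```

Moving between the two solutions reshapes the arm (e.g., to clear an obstructed sight line) while the tool point, here standing in for the camera, stays put.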
The robotic arm 506 and/or the coupling plate 3304 are generally situated, and the joints positioned, such that joint singularities are avoided during any typical movement. Avoiding joint singularities provides better robotic control of hysteresis and backlash. In addition, the lengths and configuration of the links and joints of the robotic arm 506 and/or the coupling plate 3304 provide smooth movement along most any desired motion path. For example, repositioning and/or reorienting the robotic arm 506 allows it to change the direction of the camera 300's view of a target point within a surgical site without changing the focal point, thereby permitting a surgeon to view the same target point from different directions/orientations. In another example, the robotic arm 506 can change the working distance to a target point, without changing focus, by translating the camera 300 along the line of sight toward or away from the target point. Numerous similar motion paths can be obtained as desired using the robotic arm 506 and/or the coupling plate 3304 together with the stereoscopic visualization camera 300 of the stereoscopic robotic platform 516.
B. Robotic Control Embodiments
The example robotic arm 506 and/or coupling plate 3304 of FIGS. 34 to 40 may be controlled by one or more controllers. FIG. 41 illustrates an embodiment of the stereoscopic robotic platform 516 of FIGS. 3 to 40, according to an example embodiment of the present disclosure. The example stereoscopic robotic platform 516 includes the stereoscopic visualization camera 300 and the corresponding image capture module 1404 and motor and lighting module 1406 described in connection with FIGS. 14 and 15.
In the illustrated embodiment, the stereoscopic robotic platform 516 includes a server or processor 4102 located remote from the stereoscopic visualization camera 300. The processor 4102 may include, for example, a laptop computer, a workstation, a desktop computer, a tablet computer, a smartphone, etc., configured with one or more software programs defined by instructions stored in the memory 1570, which instructions, when executed by the processor 4102, cause the processor 4102 to perform the operations described herein. In this example, the example processor 4102 is configured to include the information processor module 1408, the image sensor controller 1502, and/or the motor and lighting controller 1520 of FIGS. 14 to 16 (or to perform the operations described in connection with those components).
In some examples, at least some of the operations of the image sensor controller 1502 and/or the motor and lighting controller 1520 may be shared with the image capture module 1404 and the motor and lighting module 1406, respectively. For example, the processor 4102 may generate commands for changing focus, magnification, and/or working distance, and may control the drivers 1534 to 1552 via a first portion of the motor and lighting controller 1520 within the motor and lighting module 1406 and a second portion of the motor and lighting controller 1520. Additionally or alternatively, a first portion of the information processor module 1408, operatively located in the processor 4102, is configured to receive individual left/right images and/or stereoscopic images from a second portion of the information processor module 1408 in the image capture module 1404. The first portion of the information processor module 1408 may be configured to process images for display on one or more of the display monitors 512 and/or 514, including visually fused images with graphical guides/text, image overlays from an MRI machine, X-ray, or other imaging device, and/or fluorescence images.
The processor 4102 is electrically and/or communicatively coupled to the image capture module 1404 and the motor and lighting module 1406 of the stereoscopic visualization camera 300 via a wire harness 4102. In some embodiments, the harness 4102 may be external to the robotic arm 506. In other embodiments, the wire harness 4102 may be internal to, or routed through, the robotic arm. In still other embodiments, the image capture module 1404 and the motor and lighting module 1406 may communicate wirelessly with the processor 4102, for example via Bluetooth®.
The example processor 4102 is also electrically and/or communicatively coupled to the sensor 3306 via the wire harness 4102. The processor 4102 is configured to receive, for example, rotational and/or translational output data from the sensor 3306. The data may include digital data and/or analog signals. In some embodiments, the processor 4102 receives from the sensor 3306 a nearly continuous output data stream indicative of the detected force and/or motion. In other examples, the processor 4102 receives output data at periodic sampling intervals. In still other examples, the processor 4102 periodically transmits a request message requesting the output data.
In the illustrated example, the processor 4102 is further communicatively coupled to at least one of a display monitor 512, input devices 1410a and 1410b, and other devices/systems 4104 (e.g., medical imaging devices such as an X-ray machine, a computed tomography ("CT") machine, a magnetic resonance imaging ("MRI") machine, a camera, a workstation for storing images or surgical guides, etc.). The input device 1410a may include a touchscreen device, and the input device 1410b may include a foot switch. The touchscreen input device 1410a may be integrated with the display monitor 512 and/or provided as a separate device on, for example, the cart 510 of FIG. 5. The example display monitor 512 is configured to display one or more user interfaces that include a stereoscopic video (or separate two-dimensional left and right videos) of a target surgical site recorded by the stereoscopic visualization camera 300.
The touchscreen input device 1410a is configured to provide one or more user interfaces for receiving user input related to control of the stereoscopic visualization camera 300, the coupling plate 3304, and/or the robotic arm 506. The input device 1410a may include one or more graphical control buttons, sliders, etc. configured to enable an operator to specify, set, or otherwise provide instructions for controlling the working distance, focus, magnification, illumination source and level, filters, and/or digital zoom of the stereoscopic visualization camera 300. The input device 1410a may also include one or more control buttons to enable an operator to select surgical guidance graphics/text, a video, and/or an image to be fused with, and/or otherwise superimposed on, the stereoscopic video shown on the display monitor 512. The input device 1410a may also include a user interface configured to enable an operator to input or create a surgical procedure visualization template.
The input device 1410a may further include one or more control buttons for controlling the robotic arm 506 and/or the coupling plate 3304, including options for controlling operational parameters such as speed, motion, deployment/stowing, calibration, and target lock, for storing a view position, and/or for changing or entering a new orientation of the camera 300. The user interface controls for the robotic arm 506 and/or the coupling plate 3304 may include controls for moving the camera 300 that are translated into commands for the individual joints R1 to R9. Additionally or alternatively, the user interface controls for the robotic arm 506 and/or the coupling plate 3304 may include controls for moving each of the joints R1 to R9 individually. Input received via the input device 1410a is transmitted to the processor 4102 for processing.
The example foot switch input device 1410b may include, for example, a foot pedal configured to receive input for controlling a position of the stereoscopic visualization camera 300, the coupling plate 3304, and/or the robotic arm 506. For example, the foot pedal input device 1410b may include controls for moving the camera 300 along the x-axis, y-axis, and/or z-axis. The foot pedal input device 1410b may also include controls for storing a position of the camera 300 and/or returning to a previously stored position. The foot pedal input device 1410b may further include controls for changing a focus, zoom, magnification, etc. of the camera 300.
In other embodiments, the stereoscopic robotic platform 516 may include additional and/or alternative input devices 1410, such as a joystick, a mouse, or another similar 2D or 3D manual input device. The input device 1410 is configured to provide input similar to an X-Y panning device, with the additional degrees of freedom giving the system flexibility of motion. Input devices with 3D capability (such as a 3D mouse or a six-degree-of-freedom controller) are well suited for flexible and convenient command input. A primary benefit of these user control devices is that the surgical image can be viewed easily while motion is occurring. In addition, a surgeon can watch what is happening around the entire surgical site and nearby areas to avoid, for example, the camera 300 colliding with surgical staff and/or nearby equipment.
Optionally, the input devices 1410 may include a head-mounted, eye-mounted, or glasses-mounted tracking device, a voice recognition device, and/or a gesture input device. These types of input devices 1410 facilitate "hands-free" operability, so that an operator does not need to touch anything with his or her sterile gloves. A gesture recognition control may be used, in which particular operator hand motions are recognized and translated into control signals for the camera 300, the coupling plate 3304, and/or the robotic arm 506. A similar capability is provided by a voice recognition device, in which a microphone senses a command from an operator, such as "move the camera left," recognizes the speech as a command, and converts it into the appropriate camera and/or robot control signals. An alternative embodiment includes an eye tracking device configured to determine the position of an operator's eyes relative to a 3D display, with the view adjustable depending on where in the displayed scene the operator is looking.
Other embodiments include a device configured to track the position of an operator's head within a reference frame (for example, via a trackable target or group of targets mounted on the operator's 3D glasses) and a foot switch used to activate "head tracking." The example tracking input device is configured to store a starting position of the operator's head at the time of activation and then to continuously detect the head position at some time interval. The tracking input device, together with the processor 4102, can calculate a movement delta vector between a current position and the starting position and convert that vector into a corresponding robotic arm or camera lens movement. For example, a tracking input device 1410 and the processor 4102 may convert left/right head movement into robotic arm movement such that an image on the screen moves left/right. The tracking input device 1410 and the processor 4102 may also convert up/down head movement into robotic arm or camera lens movement such that an image on the screen moves up/down, and may convert forward/backward head movement into robotic arm or camera lens movement such that an image on the screen zooms in/out. Other movement translations are possible, for example, converting head rotation into a "lock-to-target" motion of the robotic arm 506.
As described herein, lock-to-target is configured to maintain a focal point of the robotic platform 516 on the same point in a scene or FOV, within some tolerance, while the robotic arm 506 (and therefore the view) pivots in a direction that mimics the operator's head movement.
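The delta-vector computation described above can be sketched as follows; the gains, axis labels, and sign convention (head forward mapping to zoom in or out) are illustrative assumptions rather than values from this disclosure:

```python
def head_delta_to_command(start_xyz, current_xyz,
                          pan_gain=0.5, elevate_gain=0.5, zoom_gain=0.5):
    """Convert head displacement since 'head tracking' was activated into a
    camera motion request: left/right -> pan, up/down -> elevate,
    forward/back -> zoom. Positions are (x, y, z) in a shared reference frame."""
    dx = current_xyz[0] - start_xyz[0]  # left/right
    dy = current_xyz[1] - start_xyz[1]  # up/down
    dz = current_xyz[2] - start_xyz[2]  # forward/back
    return {'pan': pan_gain * dx,
            'elevate': elevate_gain * dy,
            'zoom': zoom_gain * dz}
```

In use, the tracker would sample the head position at a fixed interval and feed each sample through this mapping to produce the next robotic arm or lens command.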
Prior to a particular surgical procedure, a surgical plan is created that establishes the desired paths for instrumentation and visualization. In some embodiments, the input device 1410 is configured to follow such a predetermined path with little additional input from an operator. In this way, the operator can continue operating as pre-planned while the view of the surgical site changes automatically. In some embodiments, the surgical plan may include a set of pre-planned waypoints corresponding to camera positions, magnifications, focus settings, etc. An operator can actuate the input device 1410 to progress through the waypoints as the surgical procedure progresses (causing the processor 4102 to move the robotic arm 506, the coupling plate 3304, and/or the camera 300 as planned).
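The waypoint mechanism can be sketched as a simple cursor over a pre-planned list; each actuation of the input device advances to the next visualization state. The field names below are illustrative, not a format defined by this disclosure:

```python
class SurgicalPlan:
    """Cursor over pre-planned visualization waypoints. Each waypoint bundles
    a camera position with magnification and focus settings; advancing the
    cursor is what would trigger the planned robot/camera moves."""

    def __init__(self, waypoints):
        if not waypoints:
            raise ValueError("a plan needs at least one waypoint")
        self._waypoints = list(waypoints)
        self._index = 0

    def current(self):
        return self._waypoints[self._index]

    def advance(self):
        """Called when the operator actuates the input device; holds at the
        final waypoint rather than wrapping around."""
        if self._index < len(self._waypoints) - 1:
            self._index += 1
        return self.current()
```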
In the illustrated embodiment, the example sensor 3306 is an input device. The sensor 3306 is configured to detect an operator's movement of, or force applied to, the stereoscopic visualization camera 300 and to convert the detected force/movement into rotation and/or translation data. The sensor 3306 may include a motion-anticipating input device, such as a six-degree-of-freedom haptic force-sensing module or an opto-electric sensor (e.g., a force/torque sensor), which enables the robotic arm 506 to respond electromechanically to an operator's gentle push on the camera 300. The opto-electric sensor may include an electro-optical device configured to transform applied forces and/or torques into electrical signals, thereby enabling a desired force/torque input by an operator to be sensed and transformed into a motion request in the sensed linear and/or rotational direction. In other embodiments, other sensor types may be used for the sensor 3306. For example, the sensor 3306 may include a strain gauge or a piezoelectric device configured to sense a haptic request from an operator.
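A minimal sketch of converting a six-axis force/torque reading into a motion request, assuming a simple dead band plus a linear gain; both values are illustrative choices, not disclosed parameters of the sensor 3306.

```python
def force_to_motion_request(wrench, dead_band=0.5, gain=0.01):
    """Map a six-axis reading (Fx, Fy, Fz, Tx, Ty, Tz) to a motion
    request: components inside a small dead band are zeroed so sensor
    noise does not move the arm; the remainder scale linearly into
    linear/rotational velocity components."""
    return tuple(0.0 if abs(c) < dead_band else gain * c for c in wrench)
```

A gentle push mostly along one axis thus produces a motion request mostly along that axis, which is the "respond to a gentle push" behavior described above.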
In one embodiment, a surgeon holds one or more of the control arms 304 and actuates or presses a release button (which may be located on one or both of the control arms 304). Actuation of the release button causes the camera 300 to transmit to the processor 4102 a message indicating that the operator desires to begin an "assisted movement" mode. The processor 4102 configures the robotic arm 506 and/or the coupling plate 3304 to enable the surgeon to gently steer the camera 300 in a desired direction. During this movement, the processor 4102 causes the robotic arm 506 and/or the coupling plate to move the camera 300 in a "power steering" manner, safely supporting its weight and automatically determining which joints should be actuated and which joints should be braked in a coordinated manner to achieve the surgeon's desired movement.
In the illustrated example, the stereoscopic robotic platform 516 of FIG. 41 includes a robotic arm controller 4106 configured to control the robotic arm 506 and/or the coupling plate 3304. The robotic arm controller 4106 may include a processor, a server, a microcontroller, a workstation, etc., configured to convert one or more messages or instructions from the processor 4102 into one or more messages and/or signals that cause any of the joints R1 to R9 to rotate. The robotic arm controller 4106 is also configured to receive sensor information (such as joint positions and/or speeds from the robotic arm 506 and/or the coupling plate 3304) and to convert that sensor information into one or more messages for the processor 4102.
In some embodiments, the robotic arm controller 4106 is configured as a standalone module located between the processor 4102 and the robotic arm 506. In other embodiments, the robotic arm controller 4106 may be included within the robotic arm 506. In still other embodiments, the robotic arm controller 4106 may be included with the processor 4102.
The example robotic arm controller 4106 includes one or more instructions stored in a memory 4120 that are executable by a robotic processor 4122. The instructions may be configured as one or more software programs, algorithms, and/or routines. The memory 4120 may include any type of volatile or non-volatile memory. The example robotic processor 4122 is communicatively coupled to the processor 4102 and is configured to receive one or more messages related to the operation of the robotic arm 506 and/or the coupling plate 3304. The example robotic processor 4122 is also configured to transmit to the processor 4102 one or more messages indicating the positions and/or speeds of the joints R1 to R9. The one or more messages may also indicate that a joint has reached a travel limit or is blocked from moving.
The example processor 4102 is configured to determine which joints R1 to R9 to power in a coordinated manner such that the sum of all motions of all joints produces the desired image motion at the camera 300. In a "move the camera left" example, there may be complex motions of several joints that cause the camera's surgical image to appear, from a surgeon's relative viewpoint, to simply and smoothly pan left. It should be noted that, in the "move the camera left" example, the control signals to particular joints may differ significantly depending on position/orientation and on how the camera 300 is connected to the robotic arm 506 via the coupling plate 3304.
The memory 4120 may include one or more instructions specifying how to move the joints R1 to R9 based on a known position of a joint. The robotic arm controller 4106 is configured to execute one or more instructions to determine how to translate an indicated camera movement into joint movements. In one example, the robotic arm controller 4106 may receive a message from the processor 4102 indicating that the stereoscopic visualization camera 300 is to move downward along a z-axis and laterally in an x-y plane. In other words, the processor 4102 transmits the content of the input regarding the desired movement of the camera 300 received via the input device 1410. The example robotic arm controller 4106 is configured to translate the movement vector in three-dimensional coordinates into joint-position movement information that achieves the desired position/orientation. The robotic arm controller 4106 may determine or take into account the current positions of the links and joints of the robotic arm 506 and/or the coupling plate 3304 (and/or a position/orientation of the camera 300), together with the desired movement, to determine a movement delta vector. In addition, the robotic arm controller 4106 may perform one or more checks to ensure that the desired movement does not cause the camera 300 to enter, or advance close to, a restricted zone, as specified by one or more three-dimensional boundaries defined in the same coordinate system as the arm 506 and the coupling plate 3304. A zone close to a boundary may specify a reduced scale factor that the robotic arm controller 4106 applies when sending movement signals to the joints; the reduced scale factor causes the joints to move more slowly as the robotic arm 506 approaches a boundary, and not to move further past the boundary.
After performing the boundary check, the robotic arm controller 4106 uses the movement delta and the current position/orientation of each of the joints R1 to R9 to determine an optimal or near-optimal movement sequence for rotating one or more of the joints such that the robotic arm 506 moves the camera 300 into the specified position. The robotic arm controller 4106 may use, for example, an optimization routine that determines the minimum amount of joint movement required to satisfy the movement delta vector. After determining the joint movement amounts, the example robotic arm controller 4106 is configured to send one or more messages (indicating a rotation amount and a rotation speed, taking into account any scale factors) to a motor controller 4124. The robotic arm controller 4106 may transmit a sequence of messages to cause the robotic arm 506 and/or the coupling plate 3304 to move in a defined or coordinated sequence. The message sequence may also cause a joint speed to change as, for example, the robotic arm 506 approaches a virtual or physical boundary.
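The reduced scale factor near a boundary, described above, can be sketched as a simple linear taper; the slow-zone width is an illustrative assumption, not a disclosed value.

```python
def boundary_scale_factor(distance_to_boundary, slow_zone=0.05):
    """Scale joint speed commands down as the arm nears a virtual
    boundary, reaching zero at the boundary so the arm never crosses it.
    'distance_to_boundary' and 'slow_zone' share the same length unit."""
    if distance_to_boundary <= 0.0:
        return 0.0          # at or past the boundary: no motion allowed
    if distance_to_boundary >= slow_zone:
        return 1.0          # well clear of the boundary: full speed
    return distance_to_boundary / slow_zone  # linear taper inside the zone
```

Multiplying each commanded joint speed by this factor yields the "move more slowly near a boundary, stop at it" behavior described above.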
The example motor controller 4124 is configured to translate or convert received messages into analog signals, such as pulse-width modulation ("PWM") signals that cause one or more of the joints R1 to R9 to rotate. For example, the motor controller 4124 may select the input line to the appropriate joint motor, where a pulse duration is used to control a duration of motor rotation and a pulse frequency, duty cycle, and/or amplitude is used to control rotation speed. The motor controller 4124 may also provide power to the joint motors and the corresponding joint sensors.
In some embodiments, the robotic arm controller 4106, in combination with the motor controller 4124, is configured to receive or read joint-sensor position information and to determine the positions and orientations of the robot joints and the camera 300 through kinematics. Each joint R1 to R9 may include at least one sensor that detects and transmits data indicating joint position, joint rotation speed, and/or joint rotation direction. In some embodiments, the sensors transmit only position information, and the robotic arm controller 4106 determines speed/direction based on the difference in position information over time. The robotic arm controller 4106 may transmit the sensor data to the processor 4102 for use in determining movement information.
The robotic arm controller 4106 receives movement instructions from the processor 4102 and, through Jacobian-based forward and/or inverse kinematics, determines which motors and joints should be actuated, how fast, how far, and in what direction. The robotic arm controller 4106 then sends appropriate command signals to the motor power amplifiers in the motor controller 4124 to drive the joint motors in the robotic arm 506.
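The Jacobian-based velocity mapping can be illustrated with a planar two-link arm, a deliberately simplified stand-in for the nine-joint arm described above; the link lengths and poses are arbitrary examples.

```python
import math

def jacobian_2link(theta1, theta2, l1=1.0, l2=1.0):
    """2x2 Jacobian of a planar two-link arm's end-effector position with
    respect to its joint angles -- the forward-kinematics Jacobian that
    relates joint velocities to end-effector (camera) velocity."""
    j11 = -l1 * math.sin(theta1) - l2 * math.sin(theta1 + theta2)
    j12 = -l2 * math.sin(theta1 + theta2)
    j21 = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    j22 = l2 * math.cos(theta1 + theta2)
    return ((j11, j12), (j21, j22))

def joint_velocities(jac, vx, vy):
    """Invert the 2x2 Jacobian to get the joint velocities that produce
    the requested end-effector velocity (velocity-level inverse
    kinematics); assumes the arm is away from a singular pose."""
    (a, b), (c, d) = jac
    det = a * d - b * c
    return ((d * vx - b * vy) / det, (-c * vx + a * vy) / det)
```

For the real arm, the same idea applies with a larger (and generally non-square) Jacobian, typically inverted via a pseudo-inverse with the kinds of boundary and limit checks described above.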
The example robotic arm 506 receives the appropriate motor power signals and moves accordingly. The sensors and brakes in the arm 506 react to various operation and feedback information from the robotic arm controller 4106. In some embodiments, the robotic arm 506 is mechanically and communicatively connected to the coupling plate 3304, which transmits coupler status and orientation information to the robotic arm controller 4106.
In some embodiments, the example robotic arm 506 of FIG. 41 includes a coupler controller 4130. The example coupler controller 4130 is configured to bypass the robotic processor 4122 and relay control information between the processor 4102 and the coupling plate 3304. The coupler controller 4130 may receive messages from the processor 4102 and accordingly cause the joints R7 to R9 on the coupling plate 3304 to rotate. The coupler controller 4130 may also receive sensor information regarding joint position and/or speed and transmit one or more messages indicating that joint position and/or speed to the processor 4102. In these embodiments, the processor 4102 may transmit messages for controlling the robotic arm 506 and separate messages for the coupling plate 3304.
In some embodiments, the robotic arm controller 4106 is configured to determine how the joints R7 to R9 are to move. However, if the coupling plate 3304 is not communicatively coupled directly to the robotic arm 506, the robotic processor 4122 may transmit movement signals to the coupler controller 4130 via the processor 4102. In instances in which at least some operations of the robotic processor 4122 are located with the processor 4102, the coupler controller 4130 receives movement commands or signals from the processor 4102, just as the robotic arm 506 receives movement commands or signals from the processor 4102.
In the illustrated embodiment of FIG. 41, the example stereoscopic visualization camera 300, processor 4102, coupling plate 3304, robotic arm 506, robotic arm controller 4106, and/or input device 1410 receive power via an input power module 4140. The example module 4140 includes a power supply (such as power from a wall outlet) and/or an isolation transformer to prevent power-line anomalies from disrupting system performance. In some instances, the power supply may include a battery power supply.
The stereoscopic visualization platform 516 may also include an emergency stop switch 4142 configured to immediately cut off power. The switch 4142 may cut off power only to the robotic arm 506 and/or the coupling plate 3304. The processor 4102 can detect actuation of the emergency stop switch 4142 and cause the joint brakes to engage to prevent the robotic arm 506 from dropping. In some instances, the robotic arm 506 is configured to engage the joint brakes upon detecting a loss of power. In some embodiments, the joints R1 to R6 of the robotic arm 506 are configured to slip when a force above a threshold is applied, thereby enabling an operator, in an emergency, to quickly move the arm out of the way with or without power.
In some embodiments, the example processor 4102 is configured to display one or more graphical representations of the robotic arm 506, the coupling plate 3304, and/or the stereoscopic visualization camera 300. The processor 4102 may cause the graphical representations to be displayed in one or more user interfaces that provide control of the robotic arm 506, the coupling plate 3304, and/or the camera 300. The processor 4102 may position and orient the graphical representations to reflect the current positions of the robotic arm 506, the coupling plate 3304, and/or the camera. For example, the processor 4102 uses feedback messages from the robotic arm controller 4106 to determine which joints in the graphical representation to rotate, thereby changing the orientation and/or position shown on the display device. In some instances, the processor 4102 is configured to receive user input via the graphical representation, for example, by an operator moving a link, a joint, or the camera 300 within the graphical representation to a desired position. In the case of movement of the stereoscopic visualization camera 300, the processor 4102 may transmit new coordinates corresponding to where the camera is to move. In the case of a moved joint or link, the processor 4102 may transmit a message indicating the joint rotation and/or link position to the robotic arm controller 4106.
In some embodiments, the processor 4102 operates in conjunction with the robotic arm controller 4106 to adjust one or more lenses of the camera based on, or in coordination with, movement of the robotic arm 506 and/or the coupling plate 3304. For example, if the robotic arm 506 is moved toward a surgical site, the processor 4102 operates in conjunction with the robotic arm controller 4106 to change a working distance or focus, by moving one or more of the lenses of the stereoscopic visualization camera 300, to maintain focus. The processor 4102 operates in conjunction with the robotic arm controller 4106 to determine, for example, that movement of the robotic arm 506 causes a working distance to decrease. The processor 4102 operates in conjunction with the robotic arm controller 4106 to determine a new lens position based on the new working distance set by moving the robotic arm 506. This may include moving one or more lenses to adjust focus. In some embodiments, the processor 4102 may instruct the camera 300 to run a calibration routine for the new position of the robotic arm 506 to eliminate, for example, spurious parallax.
In some instances, an operator may change the position of one or more lenses of the stereoscopic visualization camera 300 and reach a lens travel limit. The lens position is sent from the camera 300 to the processor 4102, which determines that a limit has been reached. After detecting that a limit has been reached, the processor 4102 may cause the robotic arm 506 to move (via the controller 4106) based on the input from the operator, thereby extending the operator's command from lens movement to arm movement in order to reach a desired magnification or target region. In this way, the processor 4102, operating in conjunction with the robotic arm controller 4106, enables an operator to use only one user interface rather than switching between an interface for the robotic arm and one for the camera. It should be appreciated that the processor 4102 and/or the controller 4106 may check the desired movement against any predetermined movement limits to ensure that the movement will not cause the camera 300 or the robotic arm 506 to enter restricted patient or operator space. If a limit violation is detected, the processor 4102, in conjunction with the robotic arm controller 4106, may display a warning indicating the limit to the operator (via a user interface shown on the touchscreen input device 1410a and/or the display monitor 512) to indicate a reason for stopping the robotic arm 506.

C. Robotic Arm and Stereoscopic Camera Calibration Embodiments
As discussed above, the example stereoscopic visualization camera 300 is configured to provide high-resolution stereoscopic video images of a target surgical site at different magnifications. As part of the stereoscopic visualization platform 516, the stereoscopic visualization camera 300 operates in conjunction with the robotic arm 506 and/or the coupling plate 3304 to achieve precise, well-defined changes to image focus, working distance, magnification, etc. To achieve flexibility in image acquisition, the stereoscopic visualization platform 516 is configured to run one or more calibration, initialization, and/or reset routines. In some embodiments, the stereoscopic visualization camera 300, the robotic arm 506, the coupling plate 3304, or, more generally, the stereoscopic visualization platform 516, is calibrated during manufacturing and/or after installation. Calibrating the camera 300 together with the robotic arm 506 provides positioning information for the camera 300 relative to the robotic arm 506 and the operator space. After power-up of the stereoscopic visualization platform 516, in some embodiments, the camera 300 and/or the robotic arm 506 are configured to perform additional calibration/initialization to measure and verify a position and orientation of the camera 300 at that time.
The example processor 4102 is configured to store the results from calibration (e.g., calibration data) in, for example, the memory 1570 and/or the memory 4120 of FIG. 41. The calibration results may be stored in calibration registers and/or look-up tables ("LUTs") in the memory 1570 and/or 4120. The stored calibration data relates or maps optical, functional, and/or performance characteristics to attributes of the camera 300, the robotic arm 506, and/or the coupling plate 3304 (which may be adjusted, measured, and/or verified by an operator or by the processor 4102). For example, a working-distance actuator motor encoder position of the main objective assembly 702 (of FIG. 7) is mapped in a LUT to a working distance. In another example, a zoom-lens axial position along a linear encoder of the zoom lens assembly 716 is mapped in a LUT to a magnification level. For each of these examples, the example processor 4102 is configured to determine the appropriate level of an encoder characteristic, adjust it, and verify that the characteristic provides the specified or desired working distance and/or magnification. In some embodiments, the LUTs may be composite, with multiple performance characteristics mapped to multiple platform 516 attributes to achieve overall control of all relevant aspects of the camera 300, the robotic arm 506, and/or the coupling plate 3304.
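The encoder-position-to-working-distance LUT mapping described above can be sketched with linear interpolation between calibrated entries; the table values below are illustrative, not disclosed calibration data.

```python
def lut_lookup(lut, encoder_pos):
    """Linearly interpolate a calibration look-up table of
    (encoder_position, working_distance_mm) pairs, sorted by encoder
    position, to estimate the working distance at an arbitrary
    encoder position within the calibrated range."""
    for (e0, w0), (e1, w1) in zip(lut, lut[1:]):
        if e0 <= encoder_pos <= e1:
            t = (encoder_pos - e0) / (e1 - e0)  # fraction along segment
            return w0 + t * (w1 - w0)
    raise ValueError("encoder position outside calibrated range")
```

The same pattern applies to the zoom-lens-position-to-magnification LUT; a composite LUT would simply key several such output characteristics off several measured attributes at once.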
The combination of a robotic arm 506 with the example stereoscopic visualization camera 300 provides highly accurate position, direction, and/or orientation information for the target view relative to a reference frame of the robotic arm 506. The following sections describe how the stereoscopic visualization camera 300 is calibrated to define a visual tip. After a visual tip is determined, the stereoscopic visualization camera 300 is registered to a reference frame of the robotic arm 506 and/or the coupling plate 3304. Thus, after calibration and registration, a stereoscopic view of a surgical site is unified with integrated control of the stereoscopic visualization camera 300 (combined with position, direction, and orientation control of the robotic arm 506 and/or the coupling plate 3304).
In some embodiments, the example processor 4102 of FIG. 41 is configured to precisely integrate a registration of the stereoscopic visualization camera 300 (including its visual tip) with a position, direction, and/or orientation calibration of the robotic arm 506 to define a unified sense of position, direction, and/or orientation, relative to a specified coordinate system, for the acquired stereoscopic images and all points within them. The example processor 4102 is configured to use the inherent visual imaging data from the stereoscopic visualization camera 300 to coordinate or direct the physical positioning and/or orientation of the robotic arm 506 to provide visualization as needed by an operator. In addition, this direction and coordination provided by the processor 4102 serves to maintain preferred characteristics of the visualization, such as focus, working distance, predefined positioning, orientation, etc.
In some embodiments, calibration of the stereoscopic visualization camera 300, the robotic arm 506, and/or the coupling plate 3304 generally includes (i) determining and/or measuring inaccuracies in the functional parameters of the stereoscopic visualization platform 516 that affect the stereoscopic image; (ii) calibrating or adjusting the stereoscopic visualization platform 516 to minimize the inaccuracies of the stereoscopic image to, or below, a desired level; (iii) verifying, through simultaneous comparison of the dual channels of the stereoscopic image with each other or with a calibration template, that the adjustment has been made to within a desired level of calibration accuracy; and (iv) using the stereoscopic visualization platform 516 in performing its tasks, where a level of calibration accuracy is detectable and maintained.
In an alternative embodiment, the robotic arm 506 is provided with one or more fixed calibration fiducials for precisely calibrating a physical relationship of the joints and links of the robotic arm 506 to one another, and for calibrating a relationship of the visual tip of the camera 300 to the robotic arm 506 and/or an initial pose configuration. The robotic platform's fixed calibration fiducials can be used to register or integrate the robotic arm 506 with an external environment (such as an operating room) or with a patient or target space within an external environment. The fixed calibration fiducials may include a dedicated attachment combined with an external portion of a body of the robotic arm 506 or with known external features of the robotic arm 506 (such as mounting points, joints, corners, or the like).

1. Calibration of Stereoscopic Visualization Camera Embodiments
To match a stereoscopic view of a surgical site, the example processor 4102 and/or the stereoscopic visualization camera 300 are configured to execute one or more calibration routines. Example routines may be specified by one or more instructions stored in the memory 1570 that, when executed by the processor 4102, cause the processor 4102 to determine lens positions corresponding to particular working distances, magnifications (e.g., zoom levels), and focus levels. The instructions may also cause the processor 4102 to run one or more routines for determining a center of projection, a stereoscopic optical axis, and/or a ZRP of the stereoscopic visualization camera 300 at different working distances and/or magnifications. Calibration enables, for example, the processor 4102 to keep focus on a target surgical site while changing magnification and/or working distance.
FIG. 42 illustrates an example procedure 4200, or routine, for calibrating the stereoscopic visualization camera 300 according to an example embodiment of the present invention. Although the procedure 4200 is described with reference to the flowchart illustrated in FIG. 42, it should be appreciated that many other methods of performing the steps associated with the procedure 4200 may be used. For example, the order of many of the blocks may be changed, certain blocks may be combined with other blocks, and many of the blocks described are optional. Further, the actions described in the procedure 4200 may be performed among multiple devices including, for example, the optical elements 1402, the image capture module 1404, the motor and lighting module 1406, and/or the information processor module 1408 of the example stereoscopic visualization camera 300. For example, the procedure 4200 may be performed by one of the programs 1560 of the information processor module 1408.
The example procedure 4200 begins when the stereoscopic visualization camera 300 is powered or otherwise initialized (block 4202). The camera 300 may be mounted to the robotic arm 506. Alternatively, the procedure 4200 may be performed while the stereoscopic visualization camera 300 is connected to a fixed mount. The example procedure 4200 next performs ZRP alignment, as discussed above in connection with FIGS. 25 and 26 (block 4204). The example processor 4102 may align the ZRPs automatically, as discussed above, and/or operate in conjunction with an operator to provide alignment of the left and right optical paths on the image sensors 746, 748. In some examples, the processor 4102 and/or an operator may, via a motor, cause small movement or flexing of a flexure (e.g., the flexure 1300 of FIG. 13) with an accuracy sufficient to make very small adjustments to a tilt of a lens assembly so as to move a ZRP into alignment with a pixel-grid origin. During semi-manual alignment, the processor 4102 may cause the left and right images from the image sensors 746 and 748 to be overlaid on the display monitor 512. An operator may use the input device 1410 to adjust the images, causing the pixel sets of the sensors 746 and 748 to move accordingly until the ZRPs are properly aligned.
在對準期間,ZRP經設定以在一影像中心處對準以避免假性視差。對準至大約一顯示器上之一單個像素內係可能的。自左視圖及右視圖至一影像中心之對準程度在經覆疊影像中(包含在變焦操作期間)係可見的。在一8° FOV之一實例中,使用一4K影像感測器746、748及一對應4K顯示解析度之顯示監視器512 (包括大約4000個像素乘以大約2000個列)會產生8° / 4000個像素= 7弧秒之一系統解析度。然而,可在大多數任何放大率下執行ZRP對準,其中解析度(係相同數目個像素(例如4000))除以一經減小(或經增加)角FOV。舉例而言,攝影機300之一例示性實施例在一高放大率下產生大約2°之一角FOV。一8K UHD顯示監視器512以及感測器746及748在大約4000個列中具有大約8000個像素。此系統之解析度係2°/ 8000個像素= 1弧秒。此係比此項技術中之已知系統更佳之大約一數量級或更多,其中經組裝經個別地量測組件之準確性具有容差,該等容差具有以弧分為單位而量測之解析度。由於影像感測器及顯示監視器之密度一般隨著相同實體感測器或顯示空間中之像素變小而變得更高,因此立體視覺化攝影機300可調整性之準確性隨著像素大小變小而比例縮放。立體視覺化攝影機300之經增強高準確性對準提供更佳更準確數位效應。During alignment, the ZRPs are set to align at the center of an image to avoid spurious parallax. Alignment to within approximately a single pixel on a display is possible. The degree of alignment of the left and right views to an image center is visible in the overlaid images, including during zoom operations. In one example with an 8° FOV, using 4K image sensors 746, 748 and a display monitor 512 with a corresponding 4K display resolution (comprising approximately 4000 pixels by approximately 2000 rows) yields a system resolution of 8° / 4000 pixels = 7 arc-seconds. However, ZRP alignment may be performed at most any magnification, where the resolution is a reduced (or increased) angular FOV divided by the same number of pixels (e.g., 4000). For example, an illustrative embodiment of the camera 300 produces an angular FOV of approximately 2° at high magnification. An 8K UHD display monitor 512 and sensors 746 and 748 have approximately 8000 pixels in approximately 4000 rows. The resolution of this system is 2° / 8000 pixels = 1 arc-second. This is roughly an order of magnitude or more better than systems known in the art, in which the accuracy of assemblies of individually measured components carries tolerances having resolutions measured in units of arc-minutes. Because the density of image sensors and display monitors generally increases as the pixels within the same physical sensor or display area become smaller, the accuracy of the adjustability of the stereoscopic visualization camera 300 scales as pixel size shrinks. The enhanced high-accuracy alignment of the stereoscopic visualization camera 300 provides better, more accurate digital effects.
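The system-resolution arithmetic above (angular FOV divided by pixel count) can be sketched numerically. The function below is an illustrative aid, not part of the patent disclosure; it reproduces the two worked examples, which the text rounds to 7 and 1 arc-seconds:

```python
def angular_resolution_arcsec(fov_degrees: float, pixels: int) -> float:
    """Per-pixel angular resolution in arc-seconds (1 degree = 3600 arc-seconds)."""
    return fov_degrees * 3600.0 / pixels

# 8-degree FOV across a ~4000-pixel 4K sensor/display: ~7 arc-seconds per pixel
res_4k = angular_resolution_arcsec(8.0, 4000)   # 7.2 arc-seconds

# 2-degree FOV at high magnification across a ~8000-pixel 8K display: ~1 arc-second
res_8k = angular_resolution_arcsec(2.0, 8000)   # 0.9 arc-seconds
```

Arc-minute tolerances of prior-art assemblies (60 arc-seconds and up) are indeed roughly an order of magnitude coarser than these values.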
當左影像及右影像之ZRP在一影像中心處保持在一所要容差範圍內時ZRP之對準係完全的,且當自低放大率循環至高放大率時目標外科手術部位之影像保持為準確的。在ZRP貫穿立體視覺化攝影機300之放大能力而對準之後,針對放大位準中之每一者之像素組位置及/或透鏡位置儲存至(舉例而言)一LUT 4203或其他資料結構。在其他實例中,處理器4102將針對放大位準中之每一者之像素組位置及/或透鏡位置寫入至校準暫存器。Alignment of the ZRPs is complete when the ZRPs of the left and right images remain within a desired tolerance of an image center, and the image of the target surgical site remains accurate when cycling from low magnification to high magnification. After the ZRPs are aligned throughout the magnification range of the stereoscopic visualization camera 300, the pixel-set positions and/or lens positions for each of the magnification levels are stored to, for example, a LUT 4203 or other data structure. In other examples, the processor 4102 writes the pixel-set positions and/or lens positions for each of the magnification levels to calibration registers.
在ZRP對準之後,實例性處理器4102經組態以對工作距離及/或放大率(例如,變焦)進行校準(方塊4206)。如上文結合圖15之工作距離及變焦實例所論述,對工作距離之精確知曉在攝影機300中係重要的,使得機器人臂506可相對於所要座標精確地定位攝影機。在某些例項中,連同機器人臂506之機械尺寸一起使用一準確基準,以將來自一影像之物件平面資料變換至立體視覺化平台516之一各別座標系(在本文中稱為機器人空間)中。After ZRP alignment, the example processor 4102 is configured to calibrate the working distance and/or magnification (e.g., zoom) (block 4206). As discussed above in connection with the working distance and zoom example of FIG. 15, precise knowledge of the working distance is important in the camera 300 so that the robotic arm 506 can accurately position the camera with respect to desired coordinates. In some instances, an accurate datum is used together with the mechanical dimensions of the robotic arm 506 to transform object-plane data from an image into a respective coordinate system of the stereoscopic visualization platform 516 (referred to herein as robot space).
執行實例性校準程序4200以映射立體視覺化攝影機300之光學系統之工作距離,其中可以毫米為單位來計算或量測自一共模物鏡(「CMO」)透鏡組合件(例如,圖4及圖7之前工作距離主要物鏡透鏡408)之一前面至一物件平面之工作距離。將工作距離自一已知「原始」位置(諸如一實體停止或限制開關觸發位置)映射至一已知可量測參數(諸如(舉例而言)在一馬達軸件編碼裝置之計數中所量測之一聚焦馬達位置)。The example calibration procedure 4200 is performed to map the working distance of the optical system of the stereoscopic visualization camera 300, where the working distance from a front face of a common main objective ("CMO") lens assembly (e.g., the front working distance main objective lens 408 of FIGS. 4 and 7) to an object plane can be calculated or measured in millimeters. The working distance is mapped from a known "home" position (such as a physical stop or a limit-switch trigger position) to a known measurable parameter, such as, for example, a focus motor position measured in counts of a motor-shaft encoder device.
由處理器4102執行方塊4206處之校準,處理器4102在離散步驟中使物件平面沿著光軸順序地移動且使影像重聚焦同時記錄編碼器計數及工作距離,如連同圖43更詳細地論述。處理器4102在CMO之前面外部量測工作距離。編碼器計數及工作距離之映射儲存至LUT 4203或者一不同LUT及/或校準暫存器。給定一所要工作距離,此校準使得處理器4102能夠將一編碼器計數位置輸出至馬達控制器。立體視覺化攝影機300之例示性實施例使用每轉高計數軸件編碼裝置,其中工作距離之解析度係大約1微米/每一編碼器計數。替代實施例可包含不同編碼器解析度以視需要提供工作距離之更高或更低解析度。The calibration at block 4206 is performed by the processor 4102, which sequentially moves the object plane along the optical axis in discrete steps and refocuses the image while recording the encoder count and the working distance, as discussed in more detail in connection with FIG. 43. The processor 4102 measures the working distance externally, in front of the CMO face. The mapping of encoder counts to working distances is stored to the LUT 4203, or to a different LUT and/or calibration registers. Given a desired working distance, this calibration enables the processor 4102 to output an encoder-count position to the motor controller. Illustrative embodiments of the stereoscopic visualization camera 300 use a high-count-per-revolution shaft encoder device, where the resolution of the working distance is approximately 1 micron per encoder count. Alternative embodiments may include different encoder resolutions to provide higher or lower working-distance resolution as needed.
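The encoder-count-to-working-distance mapping described above can be sketched as a lookup table with linear interpolation between recorded samples. The table values, the function name, and the interpolation choice below are illustrative assumptions for exposition, not the actual format of the LUT 4203:

```python
from bisect import bisect_left

# Illustrative calibration samples: (working distance in mm, focus-motor encoder count).
# Real entries would come from the move/refocus/record loop described above.
WD_LUT = [(200.0, 0), (250.0, 52_000), (300.0, 101_500), (350.0, 148_000)]

def encoder_count_for_wd(wd_mm: float) -> int:
    """Return an encoder-count position for a desired working distance,
    linearly interpolating between the nearest calibrated samples."""
    wds = [wd for wd, _ in WD_LUT]
    if not wds[0] <= wd_mm <= wds[-1]:
        raise ValueError("working distance outside calibrated range")
    i = bisect_left(wds, wd_mm)
    if wds[i] == wd_mm:          # exact calibrated sample
        return WD_LUT[i][1]
    (wd0, c0), (wd1, c1) = WD_LUT[i - 1], WD_LUT[i]
    t = (wd_mm - wd0) / (wd1 - wd0)
    return round(c0 + t * (c1 - c0))
```

With a high-count encoder of roughly 1 micron per count, interpolation error between samples is small compared with the mechanical repeatability of the focus stage.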
圖43展示根據本發明之一實例性實施例之在離散步驟中使一物件平面移動之實例性立體視覺化攝影機300之一實施例。實例性立體視覺化攝影機300包含經組態以提供一目標外科手術部位之左視圖及右視圖的圖7之主要物鏡總成702 (例如,一單個CMO)。在所圖解說明實例中,主要物鏡總成702經展示為一消色差折射總成,該消色差折射總成具有在一殼體4302內之固定前工作距離透鏡408及可沿著z軸(或其他光軸)移動之可移動後工作距離透鏡704。後工作距離透鏡704之移動改變至前工作距離透鏡408之距離。透鏡408與透鏡704之間的間距判定主要物鏡總成702之總體前焦距4304及因此一前焦平面(或僅僅「焦點平面」) 4306之位置。前焦平面4306位於與距主要物鏡總成702之一主平面4308之焦距4304相等之一距離處。計量主平面4308之位置可係困難的,因此自殼體4302之底部表面至前焦平面之一距離定義為工作距離4310。工作距離4310因此準確地設定對焦之目標部位或場景之一平面。FIG. 43 shows an embodiment of the example stereoscopic visualization camera 300 moving an object plane in discrete steps, according to an example embodiment of the present invention. The example stereoscopic visualization camera 300 includes the main objective assembly 702 of FIG. 7 (e.g., a single CMO) configured to provide left and right views of a target surgical site. In the illustrated example, the main objective assembly 702 is shown as an achromatic refractive assembly having a fixed front working distance lens 408 within a housing 4302 and a movable rear working distance lens 704 that can move along the z-axis (or other optical axis). Movement of the rear working distance lens 704 changes its distance to the front working distance lens 408. The spacing between the lens 408 and the lens 704 determines the overall front focal length 4304 of the main objective assembly 702 and therefore the position of a front focal plane (or simply "focal plane") 4306. The front focal plane 4306 is located at a distance equal to the focal length 4304 from a principal plane 4308 of the main objective assembly 702. Measuring the position of the principal plane 4308 can be difficult, so the distance from the bottom surface of the housing 4302 to the front focal plane is defined as the working distance 4310. The working distance 4310 therefore accurately sets the plane of the target site or scene that is in focus.
使前焦平面4306處之一物件成像會形成位於距主要物鏡總成702之一背面或後面無限遠處之一共軛影像。包括攝影機300之光學器件714、716、718及感測器744R、744L (其橫向地分開一瞳孔間距離(「IPD」) 4312)之兩個平行光學路徑沿著各別左光軸4320及右光軸4322在稍微不同於主要物鏡總成702之一光軸4324之方向上產生左視圖及右視圖。該兩個光學路徑經調整使得其各別外部聚光左軸線及右軸線經設定以在一影像4330之FOV之中心處重合。此點4330在本文中稱為立體視覺化攝影機300在前焦平面4306處之「尖端」。Imaging an object at the front focal plane 4306 forms a conjugate image located at infinity behind a back face of the main objective assembly 702. Two parallel optical paths, comprising the optics 714, 716, 718 of the camera 300 and the sensors 744R, 744L (which are laterally separated by an interpupillary distance ("IPD") 4312), produce the left and right views along respective left and right optical axes 4320 and 4322 in directions slightly different from an optical axis 4324 of the main objective assembly 702. The two optical paths are adjusted so that their respective external converging left and right axes are set to coincide at the center of the FOV, at a point 4330 of an image. This point 4330 is referred to herein as the "tip" of the stereoscopic visualization camera 300 at the front focal plane 4306.
後工作距離透鏡704之一位置之調整引起主要物鏡總成702之前焦距4304之一改變。如圖解說明,後工作距離透鏡704之位置之一改變形成位於一新前焦平面4306’之位置處之一新工作距離4310’。後工作距離透鏡704之移動亦引起左光軸4320’與右光軸4322’之一重新對準,從而產生攝影機300之一經重定位尖端4330’。在攝影機300位於焦點平面4306上面或下面之情況下一物件之視覺化會減弱對該物件之一聚焦。Adjustment of a position of the rear working distance lens 704 causes a change in the front focal length 4304 of the main objective assembly 702. As illustrated, a change in the position of the rear working distance lens 704 establishes a new working distance 4310' at the location of a new front focal plane 4306'. Movement of the rear working distance lens 704 also causes a realignment of the left optical axis 4320' and the right optical axis 4322', producing a repositioned tip 4330' of the camera 300. Visualization of an object with the camera 300 positioned above or below the focal plane 4306 degrades the focus on that object.
以類似於工作距離校準之一方式,可藉由處理器4102使放大率變化同時量測已知大小之一物件之一影像高度而建構一類似LUT或一工作距離LUT 4203中之額外行。可藉由依據一已知「原始」位置(諸如一實體停止或限制開關觸發位置)判定馬達軸件編碼裝置之計數而將放大率量化。處理器4102可(舉例而言)以每一放大率位置處之一感測器像素數目來相對地量測影像高度。可藉由(舉例而言)將以像素計之高度除以以毫米計之高度而表徵放大率。In a manner similar to the working-distance calibration, a similar LUT, or additional entries in the working-distance LUT 4203, can be constructed by the processor 4102 varying the magnification while measuring an image height of an object of known size. The magnification can be quantified by determining the counts of the motor-shaft encoder device from a known "home" position (such as a physical stop or a limit-switch trigger position). The processor 4102 may, for example, measure the image height relatively, as a number of sensor pixels at each magnification position. The magnification may be characterized by, for example, dividing the height in pixels by the height in millimeters.
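The pixels-divided-by-millimeters characterization just described reduces to a single scale factor per encoder position. A minimal sketch, with assumed sample values, follows:

```python
def pixels_per_mm(image_height_px: float, object_height_mm: float) -> float:
    """Characterize magnification as the ratio of the measured image height
    in sensor pixels to the known physical height of the calibration object."""
    return image_height_px / object_height_mm

# A 10 mm calibration target imaged 800 pixels tall gives 80 px/mm at this
# zoom-motor encoder position; the factor would be stored alongside the
# encoder count in the LUT, one entry per sampled magnification.
scale = pixels_per_mm(800.0, 10.0)
```

The same factor is what later converts measured pixel disparities into millimeters during the COP and IPD calibrations.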
返回至圖42,在校準立體視覺化攝影機300之工作距離及放大率之後,實例性處理器4102經組態以判定一投影中心(方塊4208)。可使用將立體視覺化攝影機300模型化之一或多個常式來判定該投影中心(例如,COP),如上文結合圖15所論述。為匹配一外科手術部位之左與右立體視圖,通常期望使用在用於處理器4102之軟體、韌體、硬體及/或GPU程式碼中實施之一數學模型將實體攝影機300模型化。通常可自使用者可調整方向及距離再現且觀看一3D電腦模型之一視角(諸如一腦腫瘤之MRI影像) (例如,好似影像由一合成立體攝影機擷取)。模型之可調整性可由處理器4102使用以匹配一現場外科手術影像之一視角,該現場外科手術影像因此必須係已知的。Returning to FIG. 42, after calibrating the working distance and magnification of the stereoscopic visualization camera 300, the example processor 4102 is configured to determine a center of projection (block 4208). The center of projection (e.g., COP) may be determined using one or more routines that model the stereoscopic visualization camera 300, as discussed above in connection with FIG. 15. To match left and right stereoscopic views of a surgical site, it is generally desirable to model the physical camera 300 using a mathematical model implemented in software, firmware, hardware, and/or GPU code for the processor 4102. A perspective view of a 3D computer model (such as an MRI image of a brain tumor) can typically be rendered and viewed from user-adjustable directions and distances (e.g., as if the image were captured by a synthetic stereoscopic camera). The adjustability of the model can be used by the processor 4102 to match a perspective of a live surgical image, which perspective must therefore be known.
立體視覺化攝影機300及/或處理器4102之例示性實施例經組態以針對放大率及工作距離之每一值準確地量測且計算攝影機模型參數。此等值受立體視覺化攝影機300內所含有之單獨光學器件控制。雙重光學器件經對準使得左通道/視圖與右通道/視圖之間的在一影像之中心處之視差在焦平面4330處大致係零。另外,立體視覺化攝影機300跨越放大率範圍係等焦的,且跨越放大率及工作距離範圍係等中心的,此乃因每一左通道及右通道之ZRP已對準至其各別像素網格之中心(上文在方塊4202中所闡述)。換言之,僅改變放大率會使影像在兩個通道中保持對焦,且在同一中心點上經訓練。類似地,若目標及立體視覺化攝影機300保持固定,則僅改變一工作距離應不會引起影像中之垂直視差,僅增加左視圖與右視圖之間的水平視差。Illustrative embodiments of the stereoscopic visualization camera 300 and/or the processor 4102 are configured to accurately measure and calculate the camera model parameters for each value of magnification and working distance. These values are governed by the individual optics contained within the stereoscopic visualization camera 300. The dual optics are aligned such that the parallax between the left and right channels/views at the center of an image is approximately zero at the focal plane 4330. In addition, the stereoscopic visualization camera 300 is parfocal across the magnification range and is parcentric across the magnification and working-distance ranges, because the ZRP of each of the left and right channels has been aligned to the center of its respective pixel grid (described above in block 4202). In other words, changing only the magnification keeps the images in focus in both channels and trained on the same center point. Similarly, if the target and the stereoscopic visualization camera 300 remain fixed, changing only a working distance should not cause vertical parallax in the images, only an increase in horizontal parallax between the left and right views.
圖44圖解說明根據本發明之一實例性實施例之一圖表4400,圖表4400圖解說明可由處理器4102執行以用於判定立體視覺化攝影機300之一COP之一常式。在所圖解說明實例中,一針孔或經模型化攝影機300之一COP係沿著在針孔(O)之平面處之一光軸4402。為判定攝影機模型之COP,使用一虛擬針孔攝影機模型,其中處理器4102經組態以判定自COP至一物件平面之一實際焦距4404。在校準常式期間,處理器4102使攝影機300之放大率保持固定同時(舉例而言)以光學影像感測器744之一平面處之像素數目記錄一影像高度4406之量測,其中高度4408之一物件在沿著光軸4402之三個不同距離處:在物件平面處,及在小於物件平面距離之一距離「d」處,及在大於物件平面距離之一距離「d」處。處理器4102使用包含基於在兩個最極端位置處之類似三角形之代數的常式來判定至一COP 4410之焦距4404。處理器4102可基於替代放大率與用於校準之放大率之比率而判定在替代放大率下之焦距。FIG. 44 illustrates a diagram 4400, according to an example embodiment of the present invention, that illustrates a routine executable by the processor 4102 for determining a COP of the stereoscopic visualization camera 300. In the illustrated example, a COP of a pinhole or modeled camera 300 lies along an optical axis 4402 at the plane of the pinhole (O). To determine the COP of the camera model, a virtual pinhole camera model is used, in which the processor 4102 is configured to determine an actual focal distance 4404 from the COP to an object plane. During the calibration routine, the processor 4102 holds the magnification of the camera 300 fixed while recording, for example, a measurement of an image height 4406 as a number of pixels at a plane of the optical image sensor 744, with an object of height 4408 placed at three different distances along the optical axis 4402: at the object plane, at a distance "d" less than the object-plane distance, and at a distance "d" greater than the object-plane distance. The processor 4102 determines the focal distance 4404 to a COP 4410 using a routine comprising algebra based on similar triangles at the two most extreme positions. The processor 4102 may determine the focal distance at an alternative magnification based on the ratio of the alternative magnification to the magnification used for calibration.
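One way to carry out the similar-triangles algebra described above: in a pinhole model, the measured pixel height of a fixed object is inversely proportional to its axial distance Z from the COP, so heights measured at the two extreme positions (d closer and d farther than the object plane) satisfy h_near · (Z − d) = h_far · (Z + d), giving Z = d · (h_near + h_far) / (h_near − h_far). The code below is an illustrative reconstruction of that algebra, not the patent's actual routine:

```python
def cop_distance_to_object_plane(h_near_px: float, h_far_px: float,
                                 d_mm: float) -> float:
    """Distance from the center of projection (COP) to the in-focus object
    plane, from pixel heights of the same object measured d_mm closer than
    and d_mm farther than the object plane (pinhole model: height ~ 1/Z)."""
    return d_mm * (h_near_px + h_far_px) / (h_near_px - h_far_px)

# Synthetic check with a pinhole at Z = 300 mm and height proportional to 1/Z:
#   near (Z - 50 = 250 mm): 120000 / 250 = 480 px
#   far  (Z + 50 = 350 mm): 120000 / 350 ~ 342.86 px
Z = cop_distance_to_object_plane(480.0, 120000.0 / 350.0, 50.0)  # ~ 300.0 mm
```

Per the final sentence above, the focal distance at another magnification would then be scaled by the ratio of the magnifications rather than re-measured.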
返回至圖42,實例性處理器4102經組態以判定變化工作距離及放大率之COP。處理器4102使透鏡之馬達軸件編碼器計數與變化工作距離及放大率之COP在LUT 4203、一不同LUT或一或多個校準暫存器中相關。在某些實施例中,處理器4102可僅儲存一個放大率及/或工作距離之一COP之一關係且使用一個已知COP關係計算其他放大率及/或工作距離。Returning to FIG. 42, the example processor 4102 is configured to determine the COP over varying working distances and magnifications. The processor 4102 correlates the lens motor-shaft encoder counts with the COP over varying working distances and magnifications in the LUT 4203, a different LUT, or one or more calibration registers. In some embodiments, the processor 4102 may store only one COP relationship for one magnification and/or working distance and calculate the COP for other magnifications and/or working distances using the one known COP relationship.
在對一COP進行校準之後,實例性處理器4102經組態以校準立體左光軸及右光軸以及立體視覺化攝影機300之軸線之間的一瞳孔間距離(「IPD」) (方塊4210)。為表徵立體視覺化攝影機300之光學器件,左通道/視圖與右通道/視圖之間的IPD應係已知的。在實施例中,可將IPD設計至固持圖7中所展示之感測器及光學器件之機械組件中。因此機械地設定IPD。然而,實際光軸可不同於光學元件及其座架之機械軸。其他實施例使得IPD能夠在立體視覺化攝影機300內變化。After calibrating a COP, the example processor 4102 is configured to calibrate the stereoscopic left and right optical axes and an interpupillary distance ("IPD") between the axes of the stereoscopic visualization camera 300 (block 4210). To characterize the optics of the stereoscopic visualization camera 300, the IPD between the left and right channels/views should be known. In embodiments, the IPD may be designed into the mechanical assembly that holds the sensors and optics shown in FIG. 7; the IPD is thus set mechanically. However, the actual optical axes may differ from the mechanical axes of the optical elements and their mounts. Other embodiments enable the IPD to be varied within the stereoscopic visualization camera 300.
在某些應用中,期望精確地知曉立體光軸在立體視覺化攝影機300之一座標系上相對於一基準或機械軸之方向。此使得(舉例而言)處理器4102能夠透過機械構件使立體視覺化攝影機300精確地瞄準。該瞄準可相對於立體視覺化攝影機300之一參考系由與立體光軸重合地向外看之一幾何上定義之觀看向量表徵。另外,光學感測器744之左通道及右通道之計時包含於一觀看向量(包括立體視覺化攝影機300定向或姿勢)中。In some applications it is desirable to precisely know the direction of the stereoscopic optical axis relative to a datum or mechanical axis in a coordinate system of the stereoscopic visualization camera 300. This enables, for example, the processor 4102 to aim the stereoscopic visualization camera 300 precisely through mechanical means. The aiming may be characterized, relative to a reference frame of the stereoscopic visualization camera 300, by a geometrically defined viewing vector looking outward coincident with the stereoscopic optical axis. In addition, the clocking of the left and right channels of the optical sensor 744 is included in a viewing vector (comprising the orientation or pose of the stereoscopic visualization camera 300).
圖45展示根據本發明之一實例性實施例之圖解說明可如何量測且校準立體視覺化攝影機300之IPD之一光學示意圖之一平面圖。在所圖解說明實例中,一光軸4502與一機械軸4504完全對準。右影像感測器746與左影像感測器748 (如由一或多個攝影機模型約計)間隔開一IPD 4506。感測器746及748對準且聚焦於一物件4508 (在一目標外科手術部位中)上。物件4508放置於距感測器746及748之一焦距4510處,使得物件之平面處之視差理論上係零,如在焦點平面4512處之物件之左或右視圖之顯示中所繪示。在此例示性實例中,為了清晰,物件4508係一圓盤,該圓盤之前視圖經展示為物項4514。FIG. 45 shows a plan view of an optical schematic illustrating how the IPD of the stereoscopic visualization camera 300 may be measured and calibrated, according to an example embodiment of the present invention. In the illustrated example, an optical axis 4502 is fully aligned with a mechanical axis 4504. The right image sensor 746 and the left image sensor 748 (as approximated by one or more camera models) are separated by an IPD 4506. The sensors 746 and 748 are aligned and focused on an object 4508 (in a target surgical site). The object 4508 is placed at a focal distance 4510 from the sensors 746 and 748 such that the parallax at the plane of the object is theoretically zero, as depicted in the display of the left or right view of the object at the focal plane 4512. In this illustrative example, for clarity, the object 4508 is a disc whose front view is shown as item 4514.
圖45亦圖解說明其中物件4508沿著機械軸位移一距離「d」且經展示為物項4508'之另一實例。物件4508之位移產生視差,該視差在一左視圖4520之顯示中表現為PL且在一右視圖4522之顯示中表現為PR。在此實例中,機械軸與光軸重合且視差量值係相等的。可(舉例而言)藉由對左視圖4520與右視圖4522之間的差異之一像素數目進行計數且乘以在COP校準步驟中所判定之一放大因子(像素/mm)而量測視差。處理器4102可使用三角量測來計算IPD。對每一視圖之位移距離d及視差之量測之準確性有助於精確知曉立體視覺化攝影機300之IPD。FIG. 45 also illustrates another example in which the object 4508 is displaced a distance "d" along the mechanical axis and is shown as item 4508'. The displacement of the object 4508 produces parallax, which appears as PL in the display of a left view 4520 and as PR in the display of a right view 4522. In this example the mechanical axis coincides with the optical axis, and the parallax magnitudes are equal. The parallax can be measured, for example, by counting a number of pixels of difference between the left view 4520 and the right view 4522 and applying a magnification factor (pixels/mm) determined in the COP calibration step. The processor 4102 can use triangulation to calculate the IPD. Accuracy in measuring the displacement distance d and the parallax for each view helps in precisely knowing the IPD of the stereoscopic visualization camera 300.
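Under the FIG. 45 geometry (axes coincident, views converged with zero parallax at the focal plane), similar triangles give a total parallax of P = IPD · d / (Z + d) for an on-axis object displaced a distance d beyond the focal plane at focal distance Z. Inverting this relation is one way to recover the IPD; the sketch below assumes that convention and uses the calibrated pixels/mm factor to convert the measured pixel disparity to millimeters:

```python
def ipd_from_parallax(parallax_px: float, px_per_mm: float,
                      d_mm: float, z_mm: float) -> float:
    """Estimate the interpupillary distance (IPD) from the total left/right
    parallax of an on-axis object displaced d_mm beyond the focal plane.

    Geometry (axes coincident, views converged at the focal plane):
        P = IPD * d / (Z + d)   =>   IPD = P * (Z + d) / d
    """
    parallax_mm = parallax_px / px_per_mm   # convert pixels to mm at the focal plane
    return parallax_mm * (z_mm + d_mm) / d_mm

# Synthetic check with IPD = 25 mm, Z = 300 mm, d = 60 mm:
#   P = 25 * 60 / 360 = 4.1666.. mm, which at 80 px/mm is 333.33.. px
ipd = ipd_from_parallax(333.3333333333333, 80.0, 60.0, 300.0)  # ~ 25 mm
```

The exact sign and scaling conventions (and whether parallax is measured per view or as a left/right total) are assumptions for illustration; the disclosure only states that triangulation is used.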
圖46展示根據本發明之一實例性實施例之圖解說明可如何量測且校準立體視覺化攝影機300之光軸之一光學示意圖之一平面圖。在此實例中,光軸4502與機械軸4504不對準,相差經展示為4602之一角度(α)。右影像感測器746與左影像感測器748 (如由一或多個攝影機模型約計)經對準且聚焦於一物件4508 (在一目標外科手術部位中)上,物件4508放置於一焦距處使得物件4508之平面處之視差理論上係零,如在焦點平面4604處之物件4508之左或右視圖之顯示中所繪示。FIG. 46 shows a plan view of an optical schematic illustrating how the optical axis of the stereoscopic visualization camera 300 may be measured and calibrated, according to an example embodiment of the present invention. In this example, the optical axis 4502 is not aligned with the mechanical axis 4504, differing by an angle (α) shown as 4602. The right image sensor 746 and the left image sensor 748 (as approximated by one or more camera models) are aligned and focused on an object 4508 (in a target surgical site), the object 4508 being placed at a focal distance such that the parallax at the plane of the object 4508 is theoretically zero, as depicted in the display of the left or right view of the object 4508 at the focal plane 4604.
圖46亦圖解說明其中物件4508沿著機械軸位移距離「d」且經展示為物件4508'之另一實例。物件4508之位移產生視差,該視差在左視圖4610之顯示中表現為PL'且在右視圖4612之顯示中表現為PR'。在其中機械軸4504與光軸4502不重合之此實例中,視差量值係不相等的。實例性處理器4102經組態以經由三角量測計算IPD以及不對準角度α (例如,立體光軸)。對每一視圖之位移距離d及視差之量測之準確性使得處理器4102能夠準確地判定立體視覺化攝影機300之IPD及光軸。FIG. 46 also illustrates another example in which the object 4508 is displaced the distance "d" along the mechanical axis and is shown as object 4508'. The displacement of the object 4508 produces parallax, which appears as PL' in the display of the left view 4610 and as PR' in the display of the right view 4612. In this example, in which the mechanical axis 4504 and the optical axis 4502 do not coincide, the parallax magnitudes are unequal. The example processor 4102 is configured to calculate, via triangulation, the IPD as well as the misalignment angle α (and hence the stereoscopic optical axis). Accuracy in measuring the displacement distance d and the parallax for each view enables the processor 4102 to accurately determine the IPD and the optical axis of the stereoscopic visualization camera 300.
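Extending the triangulation to the misaligned geometry of FIG. 46, one convenient small-angle decomposition treats the sum of the two parallaxes as the IPD signature (as before) and their difference as the signature of the tilt: PL' + PR' = IPD · d / (Z + d) and PL' − PR' ≈ 2 Z α · d / (Z + d). This decomposition is an illustrative derivation consistent with the description, not a formula quoted from the patent:

```python
def ipd_and_axis_angle(p_left_mm: float, p_right_mm: float,
                       d_mm: float, z_mm: float):
    """From unequal left/right parallaxes of an object displaced d_mm along
    the mechanical axis, recover (ipd_mm, alpha_rad):
        sum        -> IPD:   (pL + pR) * (Z + d) / d
        difference -> tilt:  (pL - pR) * (Z + d) / (2 * Z * d)   (small angle)
    """
    scale = (z_mm + d_mm) / d_mm
    ipd_mm = (p_left_mm + p_right_mm) * scale
    alpha_rad = (p_left_mm - p_right_mm) * scale / (2.0 * z_mm)
    return ipd_mm, alpha_rad

# Equal parallaxes reduce to the aligned case of FIG. 45 (alpha == 0);
# unequal parallaxes indicate a tilt between mechanical and optical axes.
ipd0, a0 = ipd_and_axis_angle(2.0833333, 2.0833333, 60.0, 300.0)
ipd1, a1 = ipd_and_axis_angle(2.5, 1.6666667, 60.0, 300.0)
```

With the sample numbers, both cases recover an IPD of about 25 mm, and the unequal case yields a tilt of roughly 0.0083 rad (about half a degree).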
可採用一類似程序來量測(舉例而言)機械軸及光軸在垂直平面中之不對準。在(舉例而言)水平平面或垂直平面中之不對準組合可經組合使得可相對於機械軸準確地推演一觀看向量。在某些實施例中,可在變化位準之工作距離及/或放大率下量測IPD及光軸參數。IPD、光軸、工作距離及/或放大率之間的關係可由處理器4102儲存至LUT 4203、另一LUT及/或校準暫存器。A similar procedure can be employed to measure, for example, misalignment of the mechanical and optical axes in the vertical plane. Misalignments in, for example, the horizontal and vertical planes can be combined such that a viewing vector can be accurately derived relative to the mechanical axis. In some embodiments, the IPD and optical-axis parameters may be measured at varying levels of working distance and/or magnification. The relationships among the IPD, the optical axis, the working distance, and/or the magnification may be stored by the processor 4102 to the LUT 4203, another LUT, and/or calibration registers.
返回至圖42,在校準實例性立體視覺化攝影機300之光軸及/或IPD之後,實例性處理器4102經組態以完成校準過程以使得攝影機300能夠連接至機器人臂506 (方塊4212)。程序4200然後可結束。在某些實施例中,若重新初始化攝影機300及/或若無法驗證或證實校準中之任一者,則重複實例性程序4200之至少若干部分。Returning to FIG. 42, after calibrating the optical axes and/or IPD of the example stereoscopic visualization camera 300, the example processor 4102 is configured to complete the calibration process so that the camera 300 can be connected to the robotic arm 506 (block 4212). The procedure 4200 may then end. In some embodiments, at least portions of the example procedure 4200 are repeated if the camera 300 is reinitialized and/or if any of the calibrations cannot be verified or validated.
應瞭解,在某些實施例中,可手動地或半手動地執行程序4200之以上步驟。在其他實施例中,可由處理器4102自動地且連續地執行以上步驟。在某些實施例中,可透過對具有充足數目個物件(包括充足對比度以使得能夠在左視圖及右視圖兩者中進行識別)之一適合目標或任何目標之影像辨識進行量測。另外,處理器4102可判定或計算視差量測以用於估算立體視覺化攝影機300之光學元件之準確相對位置。處理器4102可在一即時基礎上執行光學量測。It should be understood that in some embodiments the above steps of the procedure 4200 may be performed manually or semi-manually. In other embodiments, the above steps may be performed automatically and continuously by the processor 4102. In some embodiments, measurements may be made via image recognition of a suitable target, or of any target, having a sufficient number of objects with sufficient contrast to enable identification in both the left and right views. In addition, the processor 4102 may determine or calculate parallax measurements for use in estimating the accurate relative positions of the optical elements of the stereoscopic visualization camera 300. The processor 4102 may perform the optical measurements on a real-time basis.
在某些實施例中,使用自動化反覆技術來執行校準及量測之此等或等效方法可增加準確性且減少校準及量測所需要之時間及/或努力。舉例而言,藉由編碼器計數量及LUT 4203準確地知曉工作距離(及因此位移d),如先前所闡述。亦藉由編碼器計數量及LUT 4203準確地知曉放大率及其(舉例而言)像素/mm轉換因子,如先前所闡述。對影像中之像素進行計數以用於差異或物件大小判定可準確地手動執行或自動化,舉例而言,如先前使用模板匹配所闡述。此等值之量測及儲存可經組合使得可由實例性處理器4102近乎即時準確地推演立體攝影機模型參數及觀看向量。In some embodiments, using automated iterative techniques to perform these or equivalent calibration and measurement methods can increase accuracy and reduce the time and/or effort required for calibration and measurement. For example, the working distance (and therefore the displacement d) is accurately known from the encoder counts and the LUT 4203, as set forth previously. The magnification, and for example its pixels/mm conversion factor, is likewise accurately known from the encoder counts and the LUT 4203, as set forth previously. Counting pixels in an image for disparity or object-size determination can be performed accurately, manually or automatically, for example using template matching as set forth previously. The measurement and storage of these values can be combined such that the stereoscopic camera model parameters and viewing vectors can be derived accurately by the example processor 4102 in near real time.
圖47圖解說明其中充分地表徵光學參數之一經校準立體視覺化攝影機300之一圖式。在所圖解說明實施例中,展示導向假想左影像感測器位置及右影像感測器位置4700之左光軸及右光軸,如經由一攝影機模型所判定。圖47亦展示中央立體光軸或觀看向量4702。攝影機模型之假想左視圖分量及右視圖分量定位於一焦距Z處。FIG. 47 illustrates a diagram of a calibrated stereoscopic visualization camera 300 in which the optical parameters are fully characterized. In the illustrated embodiment, the left and right optical axes leading to imaginary left and right image sensor positions 4700 are shown, as determined via a camera model. FIG. 47 also shows the central stereoscopic optical axis, or viewing vector, 4702. The imaginary left and right view components of the camera model are positioned at a focal distance Z.
另外,攝影機模型之左視圖分量與右視圖分量間隔開所量測有效IPD。在所圖解說明實例中,藉由如由立體視覺化攝影機300內之影像感測器746及748記錄之假想左視圖分量及右視圖分量以類似立體視角來觀看焦平面處之一物件。In addition, the left and right view components of the camera model are spaced apart by the measured effective IPD. In the illustrated example, an object at the focal plane is viewed from a like stereoscopic perspective by the imaginary left and right view components as is recorded by the image sensors 746 and 748 within the stereoscopic visualization camera 300.
2.立體視覺化之校準達成與額外影像之融合 2. Calibration of Stereoscopic Visualization Enables Fusion with Additional Images
實例性處理器4102經組態以使用校準參數/資訊以不僅提供高解析度清晰影像,而且對準現場立體影像與自外部裝置4104接收之一或多個影像/模型。與LUT 4203及/或校準暫存器中之攝影機模型參數有關之校準資料之映射使得處理器4102能夠創建在軟體、韌體、硬體及/或電腦程式碼中實施的立體視覺化攝影機300之一數學模型。在一實例中,處理器4102經組態以使用(舉例而言)連同圖42所論述之程序4200來接收、判定或存取攝影機模型參數。若已執行一校準,則處理器4102自一或多個記憶體1570及/或4120存取攝影機模型參數。處理器4102亦自裝置4104接收影像資料之一替代模態,諸如外科手術前影像、MRI影像、依據MRI或CT資料得出的外科手術部位之一3D模型、X射線影像及/或外科手術導板/模板。處理器4102經組態以使用攝影機模型參數再現替代模態資料之一合成立體影像。實例性處理器4102亦經組態以提供合成立體影像以用於經由監視器512顯示。在某些實例中,處理器4102經組態以融合合成立體影像與當前立體視覺化,其中每一模態之合意態樣係可見的及/或在完全相同視角中係覆疊的(好似由一單個視覺化裝置獲取)。The example processor 4102 is configured to use the calibration parameters/information not only to provide high-resolution, clear images, but also to register the live stereoscopic images with one or more images/models received from the external device 4104. The mapping of the calibration data related to the camera model parameters in the LUT 4203 and/or the calibration registers enables the processor 4102 to create a mathematical model of the stereoscopic visualization camera 300 implemented in software, firmware, hardware, and/or computer code. In one example, the processor 4102 is configured to receive, determine, or access the camera model parameters using, for example, the procedure 4200 discussed in connection with FIG. 42. If a calibration has already been performed, the processor 4102 accesses the camera model parameters from one or more memories 1570 and/or 4120. The processor 4102 also receives an alternate modality of image data from the device 4104, such as pre-surgical images, MRI images, a 3D model of the surgical site derived from MRI or CT data, X-ray images, and/or surgical guides/templates. The processor 4102 is configured to render a synthesized stereoscopic image of the alternate-modality data using the camera model parameters. The example processor 4102 is also configured to provide the synthesized stereoscopic image for display via the monitor 512. In some examples, the processor 4102 is configured to fuse the synthesized stereoscopic image with the current stereoscopic visualization, in which the desirable aspects of each modality are visible and/or overlaid in the identical perspective (as if acquired by a single visualization device).
在某些實施例中,圖47中所圖解說明之參數由處理器4102使用以使替代模態之一合成立體影像(舉例而言MRI影像資料)與立體視覺化攝影機300之立體視角匹配。因此,實例性處理器4102使用所儲存光學校準參數來進行立體影像合成。在一實例中,處理器4102使用光學校準參數來融合現場立體影像與使用一MRI裝置術前成像之一腦腫瘤之一個三維模型。實例性處理器4102使用光學校準參數來選擇與立體影像匹配的腦腫瘤之三維模型之對應位置、大小及/或定向。換言之,處理器4102選擇與由立體視覺化攝影機300記錄之視圖對應的三維模型之一部分。處理器4102亦可基於偵測到攝影機300之工作距離、放大率及/或定向如何改變而改變模型之哪一部分經顯示。In some embodiments, the parameters illustrated in FIG. 47 are used by the processor 4102 to match a synthesized stereoscopic image of an alternate modality (for example, MRI image data) to the stereoscopic perspective of the stereoscopic visualization camera 300. The example processor 4102 thus uses the stored optical calibration parameters for stereoscopic image synthesis. In one example, the processor 4102 uses the optical calibration parameters to fuse the live stereoscopic image with a three-dimensional model of a brain tumor imaged pre-operatively using an MRI device. The example processor 4102 uses the optical calibration parameters to select the corresponding position, size, and/or orientation of the three-dimensional model of the brain tumor to match the stereoscopic image. In other words, the processor 4102 selects the portion of the three-dimensional model corresponding to the view recorded by the stereoscopic visualization camera 300. The processor 4102 may also change which portion of the model is displayed based on detecting how the working distance, magnification, and/or orientation of the camera 300 changes.
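As a sketch of how the calibrated parameters of FIG. 47 could drive such a perspective match, the toy renderer below projects a 3D model point into hypothetical left and right virtual pinhole views separated by the measured IPD and converged at the focal distance Z. The function name and the simple convergent-pinhole geometry are illustrative assumptions, not the patent's rendering pipeline:

```python
def project_stereo(point_xyz, ipd_mm, z_mm, px_per_mm):
    """Project a model-space point (x, y, z in mm, with z the distance from
    the camera baseline) into left/right image offsets in pixels, relative
    to each view's image center. Both views converge at (0, 0, z_mm)."""
    x, y, z = point_xyz
    views = []
    for cam_x in (-ipd_mm / 2.0, ipd_mm / 2.0):
        # Where the ray from this camera through the point crosses the focal
        # plane, minus where the view is aimed (the fixation point at x = 0).
        x_at_focal = cam_x + (x - cam_x) * z_mm / z
        views.append((x_at_focal * px_per_mm, y * z_mm / z * px_per_mm))
    return tuple(views)

# A model point on the fixation axis at the focal distance lands at both
# image centers (zero disparity), mirroring the zero-parallax focal plane.
left, right = project_stereo((0.0, 0.0, 300.0), 25.0, 300.0, 80.0)
```

Points nearer or farther than Z acquire horizontal disparity, which is what lets the synthesized model view fuse depth-consistently with the live stereoscopic image.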
處理器4102可致使模型之一圖形表示與立體影像覆疊及/或致使模型之該圖形表示似乎與立體影像視覺上融合。由處理器4102執行之影像處理可包含使模型之圖形表示與現場立體視圖之間的邊界平滑化。影像處理亦可包含致使模型之圖形表示之至少一部分具有一經增加透明度以使得基本現場立體視圖能夠亦對於一外科醫師可見。The processor 4102 may cause a graphical representation of the model to be overlaid on the stereoscopic images and/or cause the graphical representation of the model to appear visually fused with the stereoscopic images. The image processing performed by the processor 4102 may include smoothing the boundary between the graphical representation of the model and the live stereoscopic view. The image processing may also include giving at least a portion of the graphical representation of the model an increased transparency so that the underlying live stereoscopic view is also visible to a surgeon.
在某些實例中,處理器4102經組態以針對一立體影像中之每個像素產生及/或再現一深度圖。處理器4102可使用校準參數來判定(舉例而言)一影像中之組織深度。處理器4102可使用深度資訊來進行影像辨識以記錄所關注組織及/或識別儀器位置從而在攝影機300與機器人臂506配接時避免無意接觸。深度資訊可由處理器4102輸出至(舉例而言)機器人縫合裝置、診斷設備、程序監測與記錄系統等以進行一協調且至少半自動化外科手術程序。In some examples, the processor 4102 is configured to generate and/or render a depth map for every pixel in a stereoscopic image. The processor 4102 can use the calibration parameters to determine, for example, tissue depth in an image. The processor 4102 can use the depth information for image recognition to register tissue of interest and/or to identify instrument positions so as to avoid unintentional contact when the camera 300 is mated with the robot arm 506. The depth information can be output by the processor 4102 to, for example, robotic suturing devices, diagnostic equipment, procedure monitoring and recording systems, etc., to carry out a coordinated and at least semi-automated surgical procedure.
3.機器人臂實施例之校準 3. Calibration of the robot arm embodiment
在如上文所論述而校準立體視覺化攝影機300之後,其可連接至機器人臂506及/或耦合板3304。如下文所闡述,精確知曉之相對於焦距Z之工作距離(由所儲存校準參數提供)由實例性處理器4102及/或機器人處理器4122使用以用於判定立體視覺化攝影機300之一位置及/或定向。立體視覺化攝影機300與機器人臂之組合經組態以提供各種工作距離之無縫轉變同時保持一目標外科手術部位之一焦點或視圖。After the stereo visualization camera 300 is calibrated as discussed above, it can be connected to the robot arm 506 and/or the coupling plate 3304. As explained below, the precisely known working distance relative to the focal length Z (provided by the stored calibration parameters) is used by the example processor 4102 and/or the robot processor 4122 to determine a position and/or orientation of the stereo visualization camera 300. The combination of the stereo visualization camera 300 and the robot arm is configured to provide seamless transitions across various working distances while maintaining focus on, or a view of, a target surgical site.
可執行下文所闡述之機器人臂506之校準程序而不管一機器人臂類型如何。舉例而言,可針對一鉸接式機器人系統執行校準程序,該鉸接式機器人系統包含經由旋轉關節彼此連接之機械連桿(自簡單的一個或兩個連桿及關節編號至包括六個或六個以上關節之關節結構)。亦可針對一笛卡爾機器人系統執行校準程序,該笛卡爾機器人系統包括具有線性關節之一支架,該支架使用具有X、Y及Z方向之一座標系。一笛卡爾機器人系統之一最後關節可包括一腕類型轉動關節。可進一步針對一圓柱形機器人系統執行校準程序,該圓柱形機器人系統包括在其基底處之一旋轉關節及用以一圓柱形工作空間之一或多個額外旋轉及/或線性關節。此外,可針對一極機器人系統執行校準程序,該極機器人系統包括經由可在一個以上旋轉軸線上操作之一關節連接至一基底之一臂且進一步包括一或多個線性或腕關節。可另外針對一選擇性順從組裝機器人臂(「SCARA」)系統執行校準程序,該選擇性順從組裝機器人臂系統包括以一主要圓柱形方式操作之一選擇性地順從臂(其用於組裝應用)。The calibration procedure of the robot arm 506 described below can be executed regardless of the robot arm type. For example, the calibration procedure can be performed for an articulated robot system that includes mechanical links connected to one another via rotary joints (ranging from simple one- or two-link-and-joint arrangements to joint structures comprising six or more joints). The calibration procedure can also be performed for a Cartesian robot system, which includes a gantry with linear joints that uses a coordinate system with X, Y, and Z directions. The final joint of a Cartesian robot system may comprise a wrist-type rotary joint. The calibration procedure can further be performed for a cylindrical robot system, which includes a rotary joint at its base and one or more additional rotary and/or linear joints serving a cylindrical workspace. In addition, the calibration procedure can be performed for a polar robot system, which includes an arm connected to a base via a joint that can operate about more than one rotational axis and further includes one or more linear or wrist joints. The calibration procedure can additionally be performed for a selective compliance assembly robot arm ("SCARA") system, which includes a selectively compliant arm that operates in a primarily cylindrical fashion (and is used for assembly applications).
圖48圖解說明根據本發明之一實例性實施例之用於校準機器人臂506之一實例性程序4800或常式。儘管參考圖48中所圖解說明之流程圖闡述程序4800,但應瞭解,可使用執行與程序4800相關聯之步驟之諸多其他方法。舉例而言,可改變方塊中之諸多方塊之次序,可組合特定方塊與其他方塊,且所闡述之方塊中之諸多方塊係選用的。此外,可在多個裝置當中執行程序4800中所闡述之動作,該多個裝置包含(舉例而言)圖14之實例性立體視覺化攝影機300之光學元件1402、影像擷取模組1404、馬達與光照模組1406、資訊處理器模組1408及/或圖41之關節R1至R9及機器人臂控制器4106。舉例而言,可由儲存於機器人臂控制器4106之記憶體4120中之一程式執行程序4800。FIG. 48 illustrates an example procedure 4800 or routine for calibrating the robot arm 506 according to an example embodiment of the present invention. Although the procedure 4800 is described with reference to the flowchart illustrated in FIG. 48, it should be appreciated that many other methods of performing the steps associated with the procedure 4800 may be used. For example, the order of many of the blocks may be changed, certain blocks may be combined with other blocks, and many of the blocks described are optional. In addition, the actions described in the procedure 4800 may be performed among multiple devices including, for example, the optical elements 1402, the image capture module 1404, the motor and lighting module 1406, and the information processor module 1408 of the example stereo visualization camera 300 of FIG. 14, and/or the joints R1 to R9 and the robot arm controller 4106 of FIG. 41. For example, the procedure 4800 may be executed by a program stored in the memory 4120 of the robot arm controller 4106.
在某些實施例中,將耦合板3304連接至機器人臂506 (方塊4802)。若不使用一耦合板3304,則將立體視覺化攝影機300直接連接至機器人臂506之連接或耦合介面3450。若使用耦合板3304,則將立體視覺化攝影機300連接至耦合板(方塊4804)。如上文所論述,耦合板3304之第一端3702連接至機器人臂506且耦合板3304之第二端3704連接至立體視覺化攝影機300。In some embodiments, the coupling plate 3304 is connected to the robot arm 506 (block 4802). If a coupling plate 3304 is not used, the stereo visualization camera 300 is connected directly to the connection or coupling interface 3450 of the robot arm 506. If the coupling plate 3304 is used, the stereo visualization camera 300 is connected to the coupling plate (block 4804). As discussed above, the first end 3702 of the coupling plate 3304 connects to the robot arm 506 and the second end 3704 of the coupling plate 3304 connects to the stereo visualization camera 300.
在實例性立體視覺化攝影機300連接至機器人臂506之後,實例性處理器4102及/或機器人臂控制器4106經組態以將攝影機及其觀看向量校準至起源於機器人臂506之固定基底3404周圍之一座標系中(方塊4806)。該座標系在本文中稱為「機器人空間(robot space)」或「機器人空間(robotic space)」。在此校準步驟期間,處理器4102及/或機器人臂控制器4106使用對機器人臂506之已知移動以在一目標外科手術部位之視覺化期間判定攝影機300之一觀看向量及物件平面之一定向及一位置。After the example stereo visualization camera 300 is connected to the robot arm 506, the example processor 4102 and/or the robot arm controller 4106 are configured to calibrate the camera and its viewing vector into a coordinate system originating about the fixed base 3404 of the robot arm 506 (block 4806). This coordinate system is referred to herein as "robot space" or "robotic space." During this calibration step, the processor 4102 and/or the robot arm controller 4106 use known movements of the robot arm 506 to determine an orientation and a position of a viewing vector of the camera 300 and of the object plane during visualization of a target surgical site.
在某些實施例中,存在攝影機300、耦合板3304及機器人臂506之機械特徵使得當機械地連接在一起時唯一地判定且知曉攝影機300、耦合板3304及機器人臂506之間的關係。在此等實施例中,處理器4102及/或機器人臂控制器4106依據攝影機300、耦合板3304及機器人臂506之已知機械幾何形狀來判定觀看向量之位置、方向及/或定向。In some embodiments, mechanical features of the camera 300, the coupling plate 3304, and the robot arm 506 exist such that, when they are mechanically connected together, the relationships among the camera 300, the coupling plate 3304, and the robot arm 506 are uniquely determined and known. In these embodiments, the processor 4102 and/or the robot arm controller 4106 determine the position, direction, and/or orientation of the viewing vector from the known mechanical geometry of the camera 300, the coupling plate 3304, and the robot arm 506.
在其中不存在機械特徵之其他實施例中,實例性處理器4102及/或機器人臂控制器4106經組態以執行一常式以準確地判定在機器人空間中攝影機300與機器人臂506之間的一空間關係。處理器4102及/或機器人臂控制器4106使立體視覺化攝影機300移動至一開始位置,該開始位置可包含一裝載位置、一重定向位置或一外科手術位置。立體視覺化攝影機300然後使攝影機自開始位置移動至大致視覺化位於固定機器人臂506之基底3404上之一校準目標之一位置。該校準目標可位於(舉例而言)搬運車510之一方便區處在機器人臂506之運動球體內之一位置中。舉例而言,校準目標之某些實例包含小球體或可相對於彼此(在二維或立體影像中)位於一唯一已知定向中之其他可唯一地辨識物件。球體之座標相對於搬運車510及固定基底3404係固定的且已知的,且因此在機器人空間中係已知的。處理器4102及/或機器人臂控制器4106經組態以將座標儲存至(舉例而言)記憶體4120。In other embodiments in which no such mechanical features exist, the example processor 4102 and/or the robot arm controller 4106 are configured to execute a routine to accurately determine a spatial relationship between the camera 300 and the robot arm 506 in robot space. The processor 4102 and/or the robot arm controller 4106 move the stereo visualization camera 300 to a starting position, which may include a loading position, a reorientation position, or a surgical position. The stereo visualization camera 300 is then moved from the starting position to a position that roughly visualizes a calibration target located on the base 3404 of the fixed robot arm 506. The calibration target may be located, for example, at a convenient area of the cart 510, in a position within the sphere of motion of the robot arm 506. Some examples of calibration targets include small spheres or other uniquely identifiable objects that can be placed in a uniquely known orientation relative to one another (in two-dimensional or stereoscopic images). The coordinates of the spheres are fixed and known relative to the cart 510 and the fixed base 3404, and are therefore known in robot space. The processor 4102 and/or the robot arm controller 4106 are configured to store the coordinates in, for example, the memory 4120.
在校準期間,處理器4102及/或機器人臂控制器4106接收關於工作距離、放大率、立體光軸及/或IPD之觀看向量資料4807。立體視覺化攝影機300經設定以同時視覺化校準目標處之球體且透過使用立體影像中之視差判定其位置。處理器4102及/或機器人臂控制器4106記錄在一初始座標系(舉例而言,X、Y及Z)中之球體相對於攝影機300 (亦即「攝影機空間」)中之一基準之位置。X、Y、Z位置可與一原點位置對應,且在一檔案或LUT中定義為原點或具有其他已知座標值。處理器4102及/或機器人臂控制器4106亦使用來自關節感測器之輸出資料來判定機器人臂506中之關節及連桿之位置及定向。處理器4102及/或機器人臂控制器4106亦接收位置資訊以判定耦合裝置3304之一位置及定向。總之,機器人臂506及耦合裝置3304之位置及定向使得處理器4102及/或機器人臂控制器4106能夠判定攝影機300之一姿勢。處理器4102及/或機器人臂控制器4106經組態以基於如由攝影機記錄之校準目標之球體之位置以及機器人臂506及/或耦合板3304之位置而執行攝影機空間與機器人空間之間的一座標變換。處理器4102及/或機器人臂控制器4106可將座標變換儲存至LUT 4203、機器人臂506之一不同LUT及/或一或多個校準暫存器。During calibration, the processor 4102 and/or the robot arm controller 4106 receive viewing vector data 4807 regarding the working distance, magnification, stereo optical axes, and/or IPD. The stereo visualization camera 300 is set to simultaneously visualize the spheres at the calibration target and determine their positions using the disparity in the stereoscopic images. The processor 4102 and/or the robot arm controller 4106 record the positions of the spheres in an initial coordinate system (for example, X, Y, and Z) relative to a datum in the camera 300 (that is, in "camera space"). The X, Y, Z positions may correspond to an origin position, defined as the origin in a file or LUT, or may have other known coordinate values. The processor 4102 and/or the robot arm controller 4106 also use the output data from the joint sensors to determine the positions and orientations of the joints and links in the robot arm 506. The processor 4102 and/or the robot arm controller 4106 also receive position information to determine a position and orientation of the coupling device 3304. Together, the positions and orientations of the robot arm 506 and the coupling device 3304 enable the processor 4102 and/or the robot arm controller 4106 to determine a pose of the camera 300. The processor 4102 and/or the robot arm controller 4106 are configured to perform a coordinate transformation between camera space and robot space based on the positions of the spheres of the calibration target as recorded by the camera and the positions of the robot arm 506 and/or the coupling plate 3304. The processor 4102 and/or the robot arm controller 4106 may store the coordinate transformation in the LUT 4203, in a different LUT of the robot arm 506, and/or in one or more calibration registers.
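The sphere-based registration described above can be sketched in code. This is a hypothetical illustration rather than the patent's actual implementation: `triangulate` assumes a rectified stereo pair with the focal length in pixels and a known baseline, and `fit_rigid_transform` recovers a camera-space-to-robot-space rotation and translation from corresponding sphere centers using a standard SVD-based (Kabsch) least-squares solution.

```python
import numpy as np

def triangulate(xl, xr, y, focal_px, baseline_m):
    """Recover a camera-space 3D point from left/right pixel coordinates.

    For a rectified stereo pair, disparity d = xl - xr gives depth
    Z = f * B / d, from which X and Y follow by back-projection.
    """
    d = xl - xr
    z = focal_px * baseline_m / d
    x = xl * z / focal_px
    y3 = y * z / focal_px
    return np.array([x, y3, z])

def fit_rigid_transform(cam_pts, robot_pts):
    """Least-squares rotation R and translation t with robot = R @ cam + t,
    computed from corresponding sphere centres (Kabsch/Umeyama method)."""
    cam = np.asarray(cam_pts, float)
    rob = np.asarray(robot_pts, float)
    cc, rc = cam.mean(axis=0), rob.mean(axis=0)
    # Cross-covariance of the centred point sets.
    H = (cam - cc).T @ (rob - rc)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection solution.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = rc - R @ cc
    return R, t
```

With the fitted R and t, any point measured in camera space maps into robot space as `R @ p + t`; storing (R, t) would play the role of the coordinate transformation kept in the LUT.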
在某些實施例中,使攝影機300移動以記錄位於一搬運車510、一天花板、一壁上及/或一外科手術區內之多個校準目標之影像。該等校準目標中之每一者可具有使得能夠識別其實體X、Y、Z位置之一唯一定向。處理器4102及/或機器人臂控制器4106針對校準目標中之每一者執行額外座標變換且將該等變換儲存至一或多個LUT及/或暫存器。In some embodiments, the camera 300 is moved to record images of multiple calibration targets located on a cart 510, a ceiling, a wall, and/or within a surgical area. Each of the calibration targets may have a unique orientation that enables its physical X, Y, Z position to be identified. The processor 4102 and/or the robot arm controller 4106 perform an additional coordinate transformation for each of the calibration targets and store the transformations in one or more LUTs and/or registers.
在其他實施例中,處理器4102及/或機器人臂控制器4106可使用替代方法來將攝影機300校準至機器人空間。在此內容脈絡中,「校準」被視為意味「配準」,其中處理器4102及/或機器人臂控制器4106經組態以在其中配準可變化之一寬廣空間內計算配準。舉例而言,可使用其中一單獨立體攝影機用於觀察且定位搬運車510上之校準目標以及裝設於攝影機300上及/或一患者或外科手術床上之類似校準目標的一系統。處理器4102及/或機器人臂控制器4106經組態以模型化且追蹤攝影機300,將攝影機300作為具有一觀看向量及工作距離之一外科手術儀器而模型化且追蹤。觀看向量及工作距離定義用於準確地視覺化一目標外科手術部位之參數。在此等其他實施例中,另一攝影機判定且報告每一此類儀器之座標系在一參考系(諸如立體攝影機300)中之位置及定向資訊。然後,使用線性代數,由處理器4102及/或機器人臂控制器4106計算儀器相對於彼此之姿勢及/或位置,因而引起攝影機300至機器人空間之一校準。In other embodiments, the processor 4102 and/or the robot arm controller 4106 may use alternative methods to calibrate the camera 300 to robot space. In this context, "calibration" is taken to mean "registration," where the processor 4102 and/or the robot arm controller 4106 are configured to compute the registration within a broad space in which the registration may vary. For example, a system may be used in which a separate stereoscopic camera observes and locates the calibration target on the cart 510 as well as similar calibration targets mounted on the camera 300 and/or on a patient or surgical bed. The processor 4102 and/or the robot arm controller 4106 are configured to model and track the camera 300 as a surgical instrument having a viewing vector and a working distance. The viewing vector and working distance define the parameters used to accurately visualize a target surgical site. In these other embodiments, the other camera determines and reports position and orientation information for the coordinate system of each such instrument in a frame of reference (such as that of the stereo camera 300). Then, using linear algebra, the processor 4102 and/or the robot arm controller 4106 compute the poses and/or positions of the instruments relative to one another, thereby yielding a calibration of the camera 300 to robot space.
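The linear-algebra step just described can be sketched minimally as follows, under the assumption (not stated in the text) that the external tracking camera reports each instrument's pose as a 4x4 homogeneous matrix in the tracker's own frame; the pose of the visualization camera relative to the robot base then follows from one matrix inverse and one multiplication.

```python
import numpy as np

def make_pose(rotation, translation):
    """Assemble a 4x4 homogeneous pose from a 3x3 rotation and a translation."""
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = translation
    return pose

def camera_in_robot_frame(tracker_to_robot, tracker_to_camera):
    """Express the camera pose in the robot frame:
    T_robot_camera = inv(T_tracker_robot) @ T_tracker_camera."""
    return np.linalg.inv(tracker_to_robot) @ tracker_to_camera
```

For instance, if the tracker sees the robot base at (1, 0, 0) and the camera at (1, 0, 2), both with identity rotations, the camera sits at (0, 0, 2) in the robot frame.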
在某些實施例中,處理器4102及/或機器人臂控制器4106亦經組態以對耦合板3304進行校準。在某些例項中,耦合板3304包含取決於關節R7至R9之一位置而啟動之一或多個開關。開關之已知位置由處理器4102及/或機器人臂控制器4106使用作為座標變換之一部分。另外或另一選擇係,藉由致使機器人臂506移動而校準耦合板3304同時監測來自攝影機300之影像以判定定向。在其中如圖37中所展示而定向耦合板3304之一實例中,命令機器人臂506在相對於一假定定向(舉例而言,使攝影機300沿著z軸移動)之一方向上移動。若該假定定向係如圖37中所展示,其中使攝影機300朝下瞄準,則機器人臂506之一向下移動應致使影像中之一物件隨著攝影機300愈來愈靠近而變得愈來愈大。若(舉例而言)影像中之物件替代地側向或上下移動,則處理器4102及/或機器人臂控制器4106經組態以偵測運動且判定假定定向係不正確的。處理器4102及/或機器人臂控制器4106可產生一誤差且提示一操作者正確定向及/或基於影像中之所偵測到之移動而判定正確定向。透過使用如先前所闡述之(舉例而言)影像匹配模板演算法來自動解密由攝影機300之移動引起之影像之改變。在某些實施例中,由處理器4102及/或機器人臂控制器4106使用匹配模板演算法判定耦合板3304處之關節定向(其儲存至一LUT以用於校準)。In some embodiments, the processor 4102 and/or the robot arm controller 4106 are also configured to calibrate the coupling plate 3304. In some instances, the coupling plate 3304 includes one or more switches that are activated depending on a position of the joints R7 to R9. The known positions of the switches are used by the processor 4102 and/or the robot arm controller 4106 as part of the coordinate transformation. Additionally or alternatively, the coupling plate 3304 is calibrated by causing the robot arm 506 to move while monitoring the images from the camera 300 to determine the orientation. In an example in which the coupling plate 3304 is oriented as shown in FIG. 37, the robot arm 506 is commanded to move in a direction relative to an assumed orientation (for example, moving the camera 300 along the z-axis). If the assumed orientation is as shown in FIG. 37, with the camera 300 aimed downward, a downward movement of the robot arm 506 should cause an object in the image to become progressively larger as the camera 300 moves closer. If, for example, the object in the image instead moves sideways or up and down, the processor 4102 and/or the robot arm controller 4106 are configured to detect the motion and determine that the assumed orientation is incorrect. The processor 4102 and/or the robot arm controller 4106 may generate an error and prompt an operator for the correct orientation, and/or may determine the correct orientation based on the detected movement in the images. Changes in the images caused by movement of the camera 300 are automatically interpreted using, for example, the template-matching image algorithms described previously. In some embodiments, the processor 4102 and/or the robot arm controller 4106 use the template-matching algorithm to determine the joint orientations at the coupling plate 3304, which are stored in a LUT for calibration.
圖49展示根據本發明之一實例性實施例之圖解說明如何將立體視覺化攝影機300及/或機器人臂506校準至機器人空間之一圖式。在所圖解說明實施例中,基於旋轉能力及/或長度而將關節R1至R9及對應連桿中之每一者模型化。記憶體4120可儲存與模型相關聯之數學參數。此外,處理器4102及/或機器人臂控制器4106可使用數學模型來判定(舉例而言)機器人臂506及/或攝影機300之一當前位置,該當前位置可用於計算將如何使關節基於一操作者所提供之既定移動而旋轉。FIG. 49 shows a diagram illustrating how the stereo visualization camera 300 and/or the robot arm 506 are calibrated to robot space according to an example embodiment of the present invention. In the illustrated embodiment, each of the joints R1 to R9 and the corresponding links is modeled based on rotational capability and/or length. The memory 4120 may store the mathematical parameters associated with the model. In addition, the processor 4102 and/or the robot arm controller 4106 may use the mathematical model to determine, for example, a current position of the robot arm 506 and/or the camera 300, which current position can be used to calculate how the joints are to be rotated based on an intended movement provided by an operator.
在所圖解說明實例中,關節R1提供於(0,0,0)之一座標位置處。關節R1至R9之間的長度與連桿之一長度對應。在所圖解說明實例中,將立體視覺化攝影機300模型化為連接至九個耦合器之一機器人末端執行器。使用十個齊次變換(其可包含矩陣乘法)之一序列將圖49中所展示之三維空間模型化。前六個座標系或關節R1至R6表示機器人臂506之正向運動學,且可使用一機器人臂之Denavit–Hartenberg參數來計算。接下來三個座標系或關節R7至R9表示自機器人臂506之工具尖端至耦合板3304之一尖端之變換。最後座標系R10表示自耦合板3304之工具尖端至立體視覺化攝影機300之控制點之變換。In the illustrated example, the joint R1 is provided at a coordinate position of (0, 0, 0). The lengths between the joints R1 to R9 correspond to the lengths of the respective links. In the illustrated example, the stereo visualization camera 300 is modeled as a robot end effector connected via nine couplings. The three-dimensional space shown in FIG. 49 is modeled using a sequence of ten homogeneous transformations (which may involve matrix multiplication). The first six coordinate systems, or joints R1 to R6, represent the forward kinematics of the robot arm 506 and can be computed using the Denavit-Hartenberg parameters of a robot arm. The next three coordinate systems, or joints R7 to R9, represent the transformation from the tool tip of the robot arm 506 to a tip of the coupling plate 3304. The final coordinate system R10 represents the transformation from the tool tip of the coupling plate 3304 to the control point of the stereo visualization camera 300.
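The ten-transform chain can be illustrated with standard Denavit-Hartenberg link matrices. The parameter values used below are placeholders for illustration only; the actual geometry of the arm 506 and coupling plate 3304 is not given in the text.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform for one link."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(dh_rows):
    """Chain per-joint transforms (R1..R10) into one base-to-control-point
    pose by repeated matrix multiplication."""
    T = np.eye(4)
    for theta, d, a, alpha in dh_rows:
        T = T @ dh_transform(theta, d, a, alpha)
    return T
```

For example, chaining two unit-length planar links with the second joint rotated 90° places the tool tip at (1, 1, 0).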
座標系或關節R7表示可在0°與90°之間改變的耦合板3304之縱傾關節。座標系或關節R8表示耦合板3304之側傾關節,且可取決於側傾組態而在-90°、0°及90°之間改變。耦合板之關節R7至R9可包含一電壓源及一電位計。機器人臂506之連接器3450及/或耦合器控制器4130可包含經組態以自該電位器接收一電壓輸出之一I/O工具尖端連接器。處理器4102及/或機器人臂控制器4106經組態以接收輸出電壓且因此判定耦合板3304之縱傾角度及側傾角度。處理器4102及/或機器人臂控制器4106組合耦合板之縱傾及側傾資訊與來自機器人臂506之關節R1至R6之感測器輸出資料以計算座標系R1至R10之位置從而判定機器人臂506、耦合板3304及/或攝影機300之三維位置。The coordinate system or joint R7 represents the pitch joint of the coupling plate 3304, which can vary between 0° and 90°. The coordinate system or joint R8 represents the roll joint of the coupling plate 3304 and can be set to -90°, 0°, or 90° depending on the roll configuration. The joints R7 to R9 of the coupling plate may include a voltage source and a potentiometer. The connector 3450 of the robot arm 506 and/or the coupler controller 4130 may include an I/O tool-tip connector configured to receive a voltage output from the potentiometer. The processor 4102 and/or the robot arm controller 4106 are configured to receive the output voltage and thereby determine the pitch angle and the roll angle of the coupling plate 3304. The processor 4102 and/or the robot arm controller 4106 combine the pitch and roll information of the coupling plate with the sensor output data from the joints R1 to R6 of the robot arm 506 to compute the positions of the coordinate systems R1 to R10, thereby determining the three-dimensional positions of the robot arm 506, the coupling plate 3304, and/or the camera 300.
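A sketch of reading the coupling-plate angles from the potentiometer voltages. The 0-5 V range and the linear mapping are assumptions made for illustration; the text states only that a voltage output encodes the pitch (0° to 90°) and roll (-90°, 0°, or 90°) positions.

```python
def pitch_from_voltage(v, v_min=0.0, v_max=5.0):
    """Map a potentiometer voltage linearly onto the 0°..90° pitch range,
    clamping out-of-range readings."""
    frac = (v - v_min) / (v_max - v_min)
    return 90.0 * min(max(frac, 0.0), 1.0)

def roll_from_voltage(v, v_min=0.0, v_max=5.0):
    """Map a voltage onto -90°..90°, then snap to the three roll detents
    (-90°, 0°, 90°) that the coupling plate supports."""
    frac = (v - v_min) / (v_max - v_min)
    angle = -90.0 + 180.0 * min(max(frac, 0.0), 1.0)
    return min((-90.0, 0.0, 90.0), key=lambda detent: abs(detent - angle))
```

A mid-range reading of 2.5 V then corresponds to a 45° pitch, while roll readings snap to the nearest detent.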
控制點表示在運動鏈最末端之座標系10,且就位置(基於其而選擇特徵)而言係可完全程式化的。舉例而言,若一操作者選擇一輔助驅動特徵,則處理器4102及/或機器人臂控制器4106經組態以沿著控制臂304之一旋轉軸線將表示攝影機300之控制點設定為在攝影機內側。在另一實例中,若一操作者選擇一鎖定至目標特徵,則處理器4102及/或機器人臂控制器4106經組態以將攝影機300之控制點設定至一光軸觀看向量之一原點。The control point represents the coordinate system R10 at the very end of the kinematic chain and is fully programmable with respect to position, depending on which features are selected. For example, if an operator selects an auxiliary drive feature, the processor 4102 and/or the robot arm controller 4106 are configured to set the control point representing the camera 300 to lie inside the camera, along a rotation axis of the control arm 304. In another example, if an operator selects a lock-to-target feature, the processor 4102 and/or the robot arm controller 4106 are configured to set the control point of the camera 300 to an origin of an optical-axis viewing vector.
返回至圖48,在將攝影機300校準至機器人空間之後,處理器4102及/或機器人臂控制器4106經組態以將機器人空間校準至患者空間(方塊4808)。患者空間之校準需要使得立體視覺化平台516能夠進行一患者之準確視覺化,其中需要機器人系統與患者之間的定向。在某些實施例中,此定向係固定的。在其他實施例中,感測且知曉定向(若為變化的)。在某些實施例中,使用一或多個基準4809將一患者放置於一手術室床上且配準至該床。舉例而言,若一患者正在經歷腦外科手術,則將其緊固至一床且將一外部框架固定至其顱骨。在其中已知位置之兩個或兩個以上非共線物件同時可見之一配置(諸如校準目標之配置)中,該框架可由立體視覺化攝影機300觀察到且可包括基準4809,使得框架及因此患者之顱骨之位置及定向能夠經判定。其他實施例可使用植入至一患者中且在MRI或類似影像中可見之基準4809。此等基準4809可用於準確地追蹤一患者之顱骨以及MRI影像且將患者之顱骨以及MRI影像配準至表示患者空間之一座標系。此外,其他實施例可使用患者本身所具有之特徵之影像辨識。舉例而言,使用生物計量資料、原位x射線或類似替代模態成像之面部或類似辨識可用於精確地定位患者之一位置及定向。在另一實例中,可使用如上文所闡述之一或多個深度圖計算以及由處理器4102及/或機器人臂控制器4106執行之表面匹配功能來判定一患者之面部之一表面之一模型。Returning to FIG. 48, after calibrating the camera 300 to robot space, the processor 4102 and/or the robot arm controller 4106 are configured to calibrate robot space to patient space (block 4808). Calibration of patient space is needed to enable the stereo visualization platform 516 to accurately visualize a patient, which requires the orientation between the robot system and the patient. In some embodiments, this orientation is fixed. In other embodiments, the orientation, if varying, is sensed and known. In some embodiments, one or more fiducials 4809 are used to place a patient on an operating room bed and register the patient to the bed. For example, if a patient is undergoing brain surgery, the patient is secured to a bed and an external frame is fixed to the patient's skull. In an arrangement in which two or more non-collinear objects of known position are simultaneously visible (such as the arrangement of a calibration target), the frame can be observed by the stereo visualization camera 300 and may include the fiducials 4809, such that the position and orientation of the frame, and therefore of the patient's skull, can be determined. Other embodiments may use fiducials 4809 implanted into a patient and visible in MRI or similar images. These fiducials 4809 can be used to accurately track a patient's skull as well as the MRI images, and to register the patient's skull and the MRI images to a coordinate system representing patient space. Furthermore, other embodiments may use image recognition of features possessed by the patient themselves. For example, facial or similar recognition using biometric data, in-situ x-rays, or similar alternate-modality imaging can be used to precisely determine a position and orientation of the patient. In another example, one or more depth-map calculations as described above, together with a surface-matching function executed by the processor 4102 and/or the robot arm controller 4106, can be used to determine a model of a surface of a patient's face.
在一實施例中,固定且判定一手術室床相對於機器人空間之一位置及定向。某些實施例包括以一已知位置及定向將床機械地配準至(舉例而言)搬運車510上之配件之一剛性框架。另一選擇係,該床相對於機器人臂506可係固定的且基準可用於判定位置及定向。舉例而言,機器人搬運車510及床可錨定至地板且在程序之持續時間內係固定的。In one embodiment, a position and orientation of an operating room bed relative to robot space are fixed and determined. Some embodiments include mechanically registering the bed, at a known position and orientation, to a rigid frame of fittings on, for example, the cart 510. Alternatively, the bed may be fixed relative to the robot arm 506 and fiducials may be used to determine the position and orientation. For example, the robot cart 510 and the bed can be anchored to the floor and remain fixed for the duration of the procedure.
在藉由攝影機300將患者之基準4809視覺化之後,可由處理器4102及/或機器人臂控制器4106解密且儲存機器人空間中之其位置及定向,其中達成自機器人空間至患者空間之座標系變換。應注意,自一個空間至另一空間之座標系變換一般係可選擇的且可逆的。舉例而言,將所要攝影機運動或姿勢變換至機器人空間中以使得處理器4102及/或機器人臂控制器4106能夠判定離散關節運動及定向可係更高效的。另一選擇係,在患者空間中之顯示監視器512上將資訊呈現給一外科醫師可係更容易且更高效的。處理器4102及/或機器人臂控制器4106可視需要將點及向量之位置變換至大多數任一座標系(舉例而言,一搬運車原點、一患者參考系、GPS及/或其他座標系)之各別者。After the patient's fiducials 4809 are visualized by the camera 300, their positions and orientations in robot space can be interpreted and stored by the processor 4102 and/or the robot arm controller 4106, whereby a coordinate-system transformation from robot space to patient space is achieved. It should be noted that coordinate-system transformations from one space to another are generally selectable and invertible. For example, it may be more efficient to transform a desired camera motion or pose into robot space so that the processor 4102 and/or the robot arm controller 4106 can determine discrete joint motions and orientations. Alternatively, it may be easier and more efficient to present information to a surgeon on the display monitor 512 in patient space. The processor 4102 and/or the robot arm controller 4106 can, as needed, transform the positions of points and vectors into nearly any coordinate system (for example, a cart origin, a patient reference frame, GPS, and/or other coordinate systems).
在某些實施例中,處理器4102及/或機器人臂控制器4106經組態以使用自動化反覆技術來執行機器人/患者空間校準及量測之此等或等效方法以增加準確性且減少校準時間。在例示性實施例中,處理器4102及/或機器人臂控制器4106準確地知曉立體視覺化攝影機300相對於基準之位移及定向。可準確地執行機器人臂506之運動,且可準確地分析基準之後續影像。處理器4102及/或機器人臂控制器4106可組合校準參數之視覺化及知識,使得可以一自動化方式準確地執行量測及因此校準。此(舉例而言)對於維持自一個外科手術程序及一個患者至下一外科手術程序及下一患者之準確校準係重要的。In some embodiments, the processor 4102 and/or the robot arm controller 4106 are configured to use automated iterative techniques to perform these or equivalent methods of robot/patient space calibration and measurement, to increase accuracy and reduce calibration time. In the illustrative embodiment, the processor 4102 and/or the robot arm controller 4106 accurately know the displacement and orientation of the stereo visualization camera 300 relative to the fiducials. The movement of the robot arm 506 can be executed accurately, and subsequent images of the fiducials can be analyzed accurately. The processor 4102 and/or the robot arm controller 4106 can combine the visualization with knowledge of the calibration parameters so that measurement, and therefore calibration, can be performed accurately in an automated manner. This is important, for example, for maintaining accurate calibration from one surgical procedure and one patient to the next.
在某些實例中,處理器4102及/或機器人臂控制器4106經組態以判定機器人臂506及/或攝影機300相對於患者空間及/或機器人空間之邊界。該等邊界表示在軟體中實施以阻止機器人臂506及/或攝影機300接觸或逃離經界定區或空間之虛擬限制。在某些實例中,該等限制在儲存於記憶體4120中之一或多個LUT或暫存器中定義為由處理器4102及/或機器人臂控制器4106應用於關節移動信號之比例因子。比例因子之量值隨著接近每一個別邊界之限制而減小至零。舉例而言,可基於操作者輸入而判定關節旋轉量及速度。然而,處理器4102及/或機器人臂控制器4106在將信號發送至適當關節之前使關節旋轉速度比例縮放比例因子。另外,處理器4102及/或機器人臂控制器4106可維持旋轉量,使得關節移動所要量但以一減小速度移動,直至關節到達邊界為止。應瞭解,其中應用一比例因子之一旋轉區中之一關節在所要移動係遠離邊界之情況下可不應用一比例因子。因此,處理器4102及/或機器人臂控制器4106可基於一當前位置及來自一操作者之所估計所要移動而將一比例因子應用於特定關節同時將「1」之一比例因子應用於其他關節。In some examples, the processor 4102 and/or the robot arm controller 4106 are configured to determine boundaries for the robot arm 506 and/or the camera 300 relative to patient space and/or robot space. The boundaries represent virtual limits, implemented in software, that prevent the robot arm 506 and/or the camera 300 from contacting or escaping a defined region or space. In some examples, the limits are defined in one or more LUTs or registers stored in the memory 4120 as scale factors that the processor 4102 and/or the robot arm controller 4106 apply to joint movement signals. The magnitude of a scale factor decreases to zero as the limit of each individual boundary is approached. For example, a joint rotation amount and speed may be determined based on operator input. However, the processor 4102 and/or the robot arm controller 4106 scale the joint rotation speed by the scale factor before sending the signal to the appropriate joint. In addition, the processor 4102 and/or the robot arm controller 4106 may maintain the rotation amount so that the joint moves by the desired amount but at a reduced speed until the joint reaches the boundary. It should be appreciated that, for a joint within a rotation region where a scale factor applies, the scale factor may not be applied when the desired movement is away from the boundary. Therefore, the processor 4102 and/or the robot arm controller 4106 may apply a scale factor to certain joints while applying a scale factor of "1" to other joints, based on a current position and an estimated desired movement from an operator.
該等比例因子嚴格地介於零與一之間,此使得能夠將其鏈接在一起且使得軟體能夠支援無限數目個可能邊界。該等比例因子可隨著接近一邊界而線性地減小,此致使關節R1至R9之旋轉隨著機器人臂506接近一邊界而逐漸減慢。在其他實例中,該等比例因子可隨著接近一邊界而指數減小。The scale factors lie strictly between zero and one, which enables them to be chained together and enables the software to support an unlimited number of possible boundaries. The scale factors may decrease linearly as a boundary is approached, which causes the rotation of the joints R1 to R9 to slow gradually as the robot arm 506 approaches a boundary. In other examples, the scale factors may decrease exponentially as a boundary is approached.
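The boundary scale factors can be sketched as follows. The falloff distance is a hypothetical parameter; the text specifies only that the factors lie between zero and one, may fall off linearly or exponentially near a boundary, and compose by chaining (a product of values in [0, 1] stays in [0, 1]).

```python
import math

def linear_scale(distance, falloff=0.2):
    """Ramps from 0 at the boundary up to 1 once `falloff` metres away."""
    return min(max(distance / falloff, 0.0), 1.0)

def exponential_scale(distance, falloff=0.2):
    """Smoother variant: decays toward 0 exponentially near the boundary."""
    return 1.0 - math.exp(-max(distance, 0.0) / falloff)

def combined_scale(distances, scale=linear_scale):
    """Chain per-boundary factors by multiplication, so any number of
    simultaneous boundaries can be supported."""
    product = 1.0
    for d in distances:
        product *= scale(d)
    return product

def scaled_speed(commanded_speed, distances):
    """Keep the commanded rotation amount; only the speed is reduced."""
    return commanded_speed * combined_scale(distances)
```

Far from every boundary the product is 1 and the commanded speed passes through unchanged; as any one boundary is approached, its factor alone drives the speed toward zero.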
一般而言,操作者通常將其注意力聚焦於外科手術場地或顯示監視器512上之立體影像上。如此,操作者通常不知曉機器人臂506及/或耦合板3304之個別連桿之位置。因此,當機器人臂506將要達到一限制或撞擊機器人臂506之另一部分時並非始終直觀的。關節限制因此可始終有效且阻止機器人臂506之任一部分撞到自身或將關節放置於一單個組態(諸如肘鎖)中。實例性處理器4102及/或機器人臂控制器4106經組態以基於機器人臂506之一當前位置而判定比例因子。處理器4102及/或機器人臂控制器4106亦可考量一操作者所提供之既定移動指令以判定將應用哪一比例因子。基於當前及/或預期移動,處理器4102及/或機器人臂控制器4106使用(舉例而言)一或多個LUT基於關節角度空間中之距離而計算比例因子。關節角度間距可定義關節角度之特定組合,已知該等關節角度導致關節鎖定或致使機器人臂506撞到自身。如此,關節角度間距判定係基於判定當前(及/或預期)關節移動且使當前(及/或預期)關節移動相對於彼此進行比較。Generally, an operator typically focuses attention on the surgical field or on the stereoscopic images on the display monitor 512. As such, the operator is usually unaware of the positions of the individual links of the robot arm 506 and/or the coupling plate 3304. It is therefore not always intuitive when the robot arm 506 is about to reach a limit or strike another part of the robot arm 506. Joint limits can therefore remain in effect at all times and prevent any part of the robot arm 506 from striking itself or placing the joints in a singular configuration (such as elbow lock). The example processor 4102 and/or the robot arm controller 4106 are configured to determine the scale factors based on a current position of the robot arm 506. The processor 4102 and/or the robot arm controller 4106 may also consider an intended movement command provided by an operator to determine which scale factor is to be applied. Based on the current and/or anticipated movement, the processor 4102 and/or the robot arm controller 4106 compute the scale factors based on distances in joint-angle space using, for example, one or more LUTs. Joint-angle distances can be defined for specific combinations of joint angles that are known to cause joint lock or to cause the robot arm 506 to strike itself. As such, the joint-angle distance determination is based on determining the current (and/or anticipated) joint movements and comparing them against one another.
除機器人臂506之邊界之外,記憶體4120亦可儲存阻止機器人臂506撞到搬運車510、阻止機器人臂撞到顯示監視器512及/或阻止攝影機300撞到機器人臂506之與笛卡爾限制有關之邊界。處理器4102及/或機器人臂控制器4106可使用(舉例而言)連同圖49所論述之座標系來判定及/或應用笛卡爾限制。在某些實例中,該等限制可係相對的或錨定至一特定連桿。如此,當使連桿在3D空間中移動時,其周圍之邊界相應地移動。在其他實例中,該等限制係靜態的且固定至圖49中所展示之3D空間內之特定座標平面或線。處理器4102及/或機器人臂控制器4106可藉由計算或判定笛卡爾空間中之比例因子且應用正向運動變換而應用該等限制。In addition to the boundaries for the robot arm 506, the memory 4120 may also store boundaries related to Cartesian limits that prevent the robot arm 506 from striking the cart 510, prevent the robot arm from striking the display monitor 512, and/or prevent the camera 300 from striking the robot arm 506. The processor 4102 and/or the robot arm controller 4106 may use, for example, the coordinate systems discussed in connection with FIG. 49 to determine and/or apply the Cartesian limits. In some examples, the limits may be relative, or anchored to a particular link; in that case, when the link is moved in 3D space, the boundary around it moves accordingly. In other examples, the limits are static and fixed to particular coordinate planes or lines within the 3D space shown in FIG. 49. The processor 4102 and/or the robot arm controller 4106 may apply the limits by computing or determining scale factors in Cartesian space and applying the forward kinematic transformation.
實例性處理器4102及/或機器人臂控制器4106亦可判定一患者邊界,其界定機器人臂506及/或攝影機300之任何點皆不可侵犯之一虛擬地方。可藉由針對機器人臂506及/或耦合板3304上之每一位置關節至一邊界平面之一位置之一距離計算笛卡爾空間中之比例因子而判定患者邊界。如圖50之定向5002中所展示之邊界平面實施為位於某一垂直Z位置處之一X,Y平面以達成非縱傾組態。針對縱傾組態,諸如圖50之定向5004中所展示之患者半坐,邊界平面取決於攝影機300面對之方向而設定為位於正或負X值處之一Y,Z平面。The example processor 4102 and/or robotic arm controller 4106 may also determine a patient boundary, which defines a virtual region that no point of the robotic arm 506 and/or camera 300 may violate. The patient boundary may be determined by calculating scale factors in Cartesian space for the distance from each positional joint on the robotic arm 506 and/or coupling plate 3304 to a position on a boundary plane. For a non-pitched configuration, the boundary plane shown in orientation 5002 of FIG. 50 is implemented as an X,Y plane located at some vertical Z position. For a pitched configuration, such as the half-seated patient shown in orientation 5004 of FIG. 50, the boundary plane is set as a Y,Z plane located at a positive or negative X value, depending on the direction the camera 300 faces.
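As a minimal sketch of the patient-boundary computation, the signed distance from each monitored arm position to a horizontal X,Y boundary plane can be mapped to a Cartesian scale factor. The slow-zone width, coordinate conventions, and function names here are illustrative assumptions.

```python
def distance_to_horizontal_plane(point_xyz, z_boundary):
    """Signed distance (m) from a joint/link position to an X,Y boundary
    plane at height z_boundary (positive = above the plane)."""
    return point_xyz[2] - z_boundary

def cartesian_scale(distance_m, slow_zone_m=0.10):
    """1.0 outside an assumed slow zone, ramping linearly to 0.0 at the
    plane; at or past the boundary the scale is 0 (motion stopped)."""
    if distance_m <= 0.0:
        return 0.0
    if distance_m >= slow_zone_m:
        return 1.0
    return distance_m / slow_zone_m

def arm_scale(joint_positions, z_boundary, slow_zone_m=0.10):
    """Most restrictive scale over all monitored arm/plate positions."""
    return min(
        cartesian_scale(distance_to_horizontal_plane(p, z_boundary), slow_zone_m)
        for p in joint_positions
    )
```

For the pitched (half-seated) configuration described above, the same idea applies with the signed distance taken along the X axis to a Y,Z plane instead.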
上文所論述之實例性邊界可作為預設邊界儲存至記憶體4120及/或在一外科手術程序之前由處理器4102及/或機器人臂控制器4106判定。在某些實施例中,可基於待執行之外科手術程序之一所輸入類型而存取或判定特定邊界。舉例而言,患者邊界可藉由攝影機300使患者成像且使用校準資訊/參數來定患者深度而判定。處理器4102及/或機器人臂控制器4106然後可創建一邊界且將該邊界應用於在患者上面或接近於患者之一指定位置。可在偵測到監視器、外科手術職員或外科手術儀器之後創建類似邊界。The example boundaries discussed above may be stored in the memory 4120 as preset boundaries and/or determined by the processor 4102 and/or robotic arm controller 4106 before a surgical procedure. In some embodiments, particular boundaries may be accessed or determined based on an input type of the surgical procedure to be performed. For example, the patient boundary may be determined by imaging the patient with the camera 300 and using calibration information/parameters to determine the patient's depth. The processor 4102 and/or robotic arm controller 4106 may then create a boundary and apply it to a designated location on or close to the patient. Similar boundaries may be created after a monitor, surgical staff, or a surgical instrument is detected.
例如,可圍繞使用一特定外科手術工具(諸如較大大小之工具或在被接觸之情況下造成特定風險之工具)來判定邊界。實例性處理器4102及/或機器人臂控制器4106可接收工具類型之一輸入及/或使用影像分析在立體影像中偵測工具。在其他實例中,處理器4102及/或機器人臂控制器4106計算關於一外科手術儀器之深度資訊以判定其大小、定向及/或位置。實例性處理器4102及/或機器人臂控制器4106將外科手術儀器之影像轉化為座標系,諸如結合圖49所論述之座標系。處理器4102及/或機器人臂控制器4106亦將具有小於「1」之一值之比例因子應用於與外科手術儀器之一位置對應之區,因而阻止機器人臂506及/或攝影機300無意地接觸外科手術工具。在某些例項中,處理器4102及/或機器人臂控制器4106可在一程序期間追蹤外科手術工具之一移動且因此改變邊界。For example, a boundary may be determined around the use of a particular surgical tool, such as a larger-sized tool or a tool that poses a particular risk if contacted. The example processor 4102 and/or robotic arm controller 4106 may receive an input of the tool type and/or detect the tool in the stereoscopic image using image analysis. In other examples, the processor 4102 and/or robotic arm controller 4106 calculates depth information about a surgical instrument to determine its size, orientation, and/or position. The example processor 4102 and/or robotic arm controller 4106 converts the image of the surgical instrument into a coordinate system, such as the coordinate system discussed in conjunction with FIG. 49. The processor 4102 and/or robotic arm controller 4106 also applies a scale factor having a value less than "1" to the region corresponding to the position of the surgical instrument, thereby preventing the robotic arm 506 and/or camera 300 from inadvertently contacting the surgical tool. In some instances, the processor 4102 and/or robotic arm controller 4106 may track a movement of the surgical tool during a procedure and change the boundary accordingly.
圖51圖解說明根據本發明之一實例性實施例之如何基於至一邊界之距離而對機器人臂506及/或耦合板3304之旋轉關節速度進行比例縮放之一實例。圖表5102展示關節R1之一旋轉速度且圖表5104展示相對於與靠近於一邊界之一區對應之一第一區帶5112 (其中一比例因子自「1」之一值減小)及與該邊界對應之一第二區帶5114 (其中比例因子減小至「0」之一值)之一肩角度(例如,旋轉位置) 5110。FIG. 51 illustrates an example of how the rotational joint speed of the robotic arm 506 and/or coupling plate 3304 is scaled based on the distance to a boundary, according to an example embodiment of the present disclosure. Graph 5102 shows a rotational speed of joint R1, and graph 5104 shows a shoulder angle (e.g., rotational position) 5110 relative to a first zone 5112 corresponding to a region close to a boundary (in which a scale factor decreases from a value of "1") and a second zone 5114 corresponding to the boundary itself (in which the scale factor decreases to a value of "0").
圖51展示當機器人臂506及特定而言關節R1致使至少一個連桿及/或立體視覺化攝影機300接近第一區帶5112時旋轉速度相對於至第二區帶5114之距離動態地比例縮放。然後,當至少一個連桿及/或立體視覺化攝影機300到達第二區帶5114時,比例因子減小至「0」之一值且停止朝向邊界之所有旋轉關節移動。換言之,當機器人臂506及立體視覺化攝影機300接近一限制或邊界時,處理器4102及/或機器人臂控制器4106致使關節R1至R9中之至少某些關節之旋轉速度減小且在到達第二區帶5114時最終達到「0」度/秒之一速度(如在圖表5102及5104中展示為介於20秒與30秒之間)。圖表亦展示當至少一個連桿及/或立體視覺化攝影機300移動遠離第二區帶5114時,處理器4102及/或機器人臂控制器4106使用「1」之一比例因子值,此乃因未接近第二區帶5114。FIG. 51 shows that when the robotic arm 506, and joint R1 in particular, causes at least one link and/or the stereoscopic visualization camera 300 to approach the first zone 5112, the rotational speed is dynamically scaled relative to the distance to the second zone 5114. Then, when the at least one link and/or stereoscopic visualization camera 300 reaches the second zone 5114, the scale factor decreases to a value of "0" and all rotational joint movement toward the boundary stops. In other words, when the robotic arm 506 and stereoscopic visualization camera 300 approach a limit or boundary, the processor 4102 and/or robotic arm controller 4106 causes the rotational speed of at least some of the joints R1 to R9 to decrease, finally reaching a speed of "0" degrees/second upon reaching the second zone 5114 (shown in graphs 5102 and 5104 as occurring between 20 and 30 seconds). The graphs also show that when the at least one link and/or stereoscopic visualization camera 300 moves away from the second zone 5114, the processor 4102 and/or robotic arm controller 4106 uses a scale factor value of "1", since the second zone 5114 is not being approached.
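The two-zone velocity scaling of FIG. 51 can be sketched as follows, assuming for illustration that the boundary lies at increasing joint angle and that the zones begin at 60° and 80°. As described above, only motion toward the boundary is scaled; motion away from it uses the full commanded speed.

```python
def scaled_joint_velocity(angle_deg, commanded_dps,
                          zone1_start=60.0, zone2_start=80.0):
    """Scale a commanded joint velocity (deg/s) near a boundary.

    Below zone1_start the full command is used (scale 1).  Inside the
    first zone the scale ramps linearly from 1 to 0; at/inside the second
    zone, motion *toward* the boundary stops while motion away from it
    passes through unscaled.  Zone angles are illustrative assumptions.
    """
    toward = commanded_dps > 0  # boundary assumed at increasing angle
    if not toward:
        return commanded_dps
    if angle_deg < zone1_start:
        return commanded_dps
    if angle_deg >= zone2_start:
        return 0.0
    scale = 1.0 - (angle_deg - zone1_start) / (zone2_start - zone1_start)
    return commanded_dps * scale
```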
在某些實施例中,處理器4102及/或機器人臂控制器4106經組態以致使顯示監視器512或其他使用者介面顯示表示機器人臂506之一狀態之一或多個圖形圖符。舉例而言,當機器人臂506及/或攝影機300位於其中比例因子具有「1」之一值之一區帶或區中時可顯示一綠色圖符。另外,當機器人臂506及/或攝影機300位於第一區帶5112內時可顯示一黃色圖符以指示關節旋轉速度減慢。此外,當機器人臂506到達第二區帶5114或一邊界/限制時可顯示一紅色圖符以指示超過該邊界之額外移動係不可能的。In some embodiments, the processor 4102 and/or the robot arm controller 4106 are configured to cause the display monitor 512 or other user interface to display one or more graphical icons representing a state of the robot arm 506. For example, when the robot arm 506 and/or the camera 300 are located in a zone or zone where the scale factor has a value of "1", a green icon may be displayed. In addition, when the robot arm 506 and/or the camera 300 are located in the first zone 5112, a yellow icon may be displayed to indicate that the joint rotation speed is slowing down. In addition, when the robot arm 506 reaches the second zone 5114 or a boundary/limit, a red icon may be displayed to indicate that additional movement beyond the boundary is impossible.
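A minimal sketch of the icon selection described above, assuming the icon color is derived directly from the current velocity scale factor:

```python
def boundary_status_icon(scale_factor: float) -> str:
    """Map the current velocity scale factor to a display icon color:
    green = unrestricted motion, yellow = slowed (first zone),
    red = at the boundary/limit (no further motion toward it)."""
    if scale_factor >= 1.0:
        return "green"
    if scale_factor > 0.0:
        return "yellow"
    return "red"
```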
返回至圖48,在判定機器人空間邊界之後,實例性處理器4102及/或機器人臂控制器4106經組態以使得機器人臂506能夠與立體視覺化攝影機300一起操作(方塊4812)。此可包含使得機器人臂506及立體視覺化攝影機300能夠在一外科手術程序期間使用。此亦可包含啟用特徵,諸如輔助驅動及/或鎖定至目標。另外或另一選擇係,此可包含啟用圖41之輸入裝置1410中之一或多者處之一或多個使用者控件。實例性程序4800在機器人臂506與立體視覺化攝影機300一起經啟用之後結束。若立體視覺化平台516經重新初始化、經歷一所偵測故障或無法證實校準,則可重複實例性程序4800。Returning to FIG. 48, after the robot-space boundaries are determined, the example processor 4102 and/or robotic arm controller 4106 are configured to enable the robotic arm 506 to operate together with the stereoscopic visualization camera 300 (block 4812). This may include enabling the robotic arm 506 and stereoscopic visualization camera 300 for use during a surgical procedure. It may also include enabling features such as assisted drive and/or lock-to-target. Additionally or alternatively, it may include enabling one or more user controls at one or more of the input devices 1410 of FIG. 41. The example procedure 4800 ends after the robotic arm 506 is enabled together with the stereoscopic visualization camera 300. The example procedure 4800 may be repeated if the stereoscopic visualization platform 516 is reinitialized, experiences a detected fault, or cannot verify calibration.
D.立體視覺化攝影機及機器人臂操作實施例 D. Stereoscopic Visualization Camera and Robotic Arm Operation Embodiments
實例性立體視覺化攝影機300經組態以連同機器人臂506及/或耦合板3304操作以提供經增強視覺化特徵。如下文更詳細地論述,經增強特徵包含一經延伸焦點、自動化聚焦尖端定位、提供一影像中之物件之間的距離之一量測、提供具有聯合視覺化之機器人運動、垂度補償、影像融合及視覺化位置/定向儲存。經增強視覺化特徵亦可包含機器人臂506之輔助驅動能力及在使得能夠改變機器人臂506及/或耦合板3304之一定向之同時使得攝影機能夠鎖定至一特定視圖上的一鎖定至目標能力。The example stereoscopic visualization camera 300 is configured to operate in conjunction with the robotic arm 506 and/or coupling plate 3304 to provide enhanced visualization features. As discussed in more detail below, the enhanced features include an extended focus, automated focus tip positioning, a measurement of the distance between objects in an image, robot motion with joint visualization, sag compensation, image fusion, and storage of visualization positions/orientations. The enhanced visualization features may also include an assisted drive capability for the robotic arm 506 and a lock-to-target capability that enables the camera to remain locked onto a particular view while the orientation of the robotic arm 506 and/or coupling plate 3304 is changed.
1.經延伸焦點實施例 1. Extended Focus Embodiments
在某些實施例中,機器人臂506及/或耦合板3304可提供攝影機300之一經延伸焦點。如上文結合圖43所論述,立體視覺化攝影機300包含主要物鏡總成702以用於改變一工作距離。為聚焦於外科手術部位中之一物件,主要物鏡總成702改變自恰好在物件前面至恰好在物件後面之一焦距。然而,在某些例項中,主要物鏡總成702在達成最佳焦點之前達到前工作距離透鏡408之一機械限制。In some embodiments, the robotic arm 506 and/or coupling plate 3304 may provide an extended focus for the camera 300. As discussed above in conjunction with FIG. 43, the stereoscopic visualization camera 300 includes the main objective assembly 702 for changing a working distance. To focus on an object in the surgical site, the main objective assembly 702 varies a focal distance from just in front of the object to just behind the object. However, in some instances, the main objective assembly 702 reaches a mechanical limit of the front working distance lens 408 before best focus is achieved.
實例性處理器4102及/或機器人臂控制器4106經組態以偵測何時達到一機械限制及/或判定將要達到透鏡408之一機械限制且因此替代地調整機器人臂506之一位置。處理器4102及/或機器人臂控制器4106經組態以使用機器人臂506以藉由計算攝影機300之一觀看向量且致使機器人臂506沿著光軸經致動而延伸焦點。處理器4102及/或機器人臂控制器4106使用立體視覺化攝影機300之上文所闡述之校準參數判定達成焦點所需要之一距離。舉例而言,如上文所論述,前工作距離透鏡408之一位置映射至主要物鏡總成702至一目標物件之一實體工作距離。該距離提供關於攝影機300之一中心距目標物件多遠之一估計。另外,校準參數可包含前工作距離透鏡408之馬達或編碼器節距與工作距離之間的一映射以提供達成一特定工作距離或焦點所需要之距離之一估計。因此,處理器4102及/或機器人臂控制器4106可讀取前工作距離透鏡408之一當前編碼器值且判定表示自攝影機300至目標物件之一垂直距離之一米數。換言之,處理器4102及/或機器人臂控制器4106將透鏡移動(在編碼器計數中)轉換為機器人空間中之一實體距離。處理器4102及/或機器人臂控制器4106然後判定將致使機器人臂506沿著光軸移動所判定距離之關節旋轉速度、方向及/或持續時間(例如,一移動順序)。處理器4102及/或機器人臂控制器4106然後將與移動順序對應之一或多個信號傳輸至適當關節以致使機器人臂506提供一經延伸焦點。在某些例項中,處理器4102及/或機器人臂控制器4106可在信號傳輸至機器人臂506及/或耦合板3304之關節R1至R9之前應用一比例因子。The example processor 4102 and/or robotic arm controller 4106 are configured to detect when a mechanical limit has been reached and/or determine that a mechanical limit of the lens 408 is about to be reached, and to adjust a position of the robotic arm 506 instead. The processor 4102 and/or robotic arm controller 4106 are configured to use the robotic arm 506 to extend the focus by calculating a viewing vector of the camera 300 and causing the robotic arm 506 to be actuated along the optical axis. The processor 4102 and/or robotic arm controller 4106 determine a distance needed to achieve focus using the calibration parameters of the stereoscopic visualization camera 300 described above. For example, as discussed above, a position of the front working distance lens 408 maps to a physical working distance from the main objective assembly 702 to a target object. This distance provides an estimate of how far a center of the camera 300 is from the target object. In addition, the calibration parameters may include a mapping between the motor or encoder pitch of the front working distance lens 408 and the working distance, to provide an estimate of the distance needed to achieve a particular working distance or focus. Accordingly, the processor 4102 and/or robotic arm controller 4106 can read a current encoder value of the front working distance lens 408 and determine a number of meters representing a vertical distance from the camera 300 to the target object. In other words, the processor 4102 and/or robotic arm controller 4106 converts lens movement (in encoder counts) into a physical distance in robot space. The processor 4102 and/or robotic arm controller 4106 then determines the joint rotational speeds, directions, and/or durations (e.g., a movement sequence) that will cause the robotic arm 506 to move the determined distance along the optical axis. The processor 4102 and/or robotic arm controller 4106 then transmits one or more signals corresponding to the movement sequence to the appropriate joints to cause the robotic arm 506 to provide an extended focus. In some instances, the processor 4102 and/or robotic arm controller 4106 may apply a scale factor before the signals are transmitted to the joints R1 to R9 of the robotic arm 506 and/or coupling plate 3304.
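The encoder-count-to-distance conversion described above can be sketched as follows, assuming for illustration a simple linear calibration mapping; an actual system would use the calibrated LUT and units of the camera 300.

```python
def encoder_to_working_distance_mm(counts, counts_per_mm=400.0,
                                   wd_at_zero_mm=200.0):
    """Convert a front working-distance lens encoder count into a physical
    working distance (mm), assuming a linear calibration mapping.  The
    counts-per-mm and zero-offset values are illustrative assumptions."""
    return wd_at_zero_mm + counts / counts_per_mm

def focus_move_mm(current_counts, counts_for_best_focus,
                  counts_per_mm=400.0):
    """Distance (mm) the arm must travel along the optical axis so the
    lens can stay inside its mechanical range (positive = move away
    from the target)."""
    return (counts_for_best_focus - current_counts) / counts_per_mm
```

The returned distance is what the movement-sequence planner would convert into joint rotational speeds, directions, and durations.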
應瞭解,焦點延伸引起機器人臂506之一自動化移動。換言之,機器人臂506可使攝影機300之運動繼續穿過最佳聚焦點。另外,在不具有來自一操作者使機器人臂移動之輸入(確切而言,關於改變一焦距之操作者影像)之情況下發生機器人臂506之移動。在某些例項中,處理器4102及/或機器人臂控制器4106可自動調整焦點以維持一清晰影像。It should be appreciated that the focus extension results in an automated movement of the robotic arm 506. In other words, the robotic arm 506 can continue the motion of the camera 300 through the point of best focus. Moreover, the movement of the robotic arm 506 occurs without an input from an operator to move the robotic arm (rather, only an operator input relating to changing a focal distance). In some instances, the processor 4102 and/or robotic arm controller 4106 may automatically adjust the focus to maintain a sharp image.
在某些實施例中,處理器4102及/或機器人臂控制器4106經組態以回應於經由輸入裝置1410進行之一單個按鈕按壓而使機器人臂506沿著攝影機之工作距離移動。此特徵使得一操作者能夠固定主要物鏡總成702之一馬達位置且藉由使機器人臂506及/或耦合板3304移動而獲得聚焦。藉由處理器4102及/或機器人臂控制器4106估計或判定自主要物鏡總成702之一前面至一目標之一距離而完成此「機器人自動聚焦」特徵或程序,如上文結合圖43所論述。處理器4102及/或機器人臂控制器4106經組態以與一回饋定律一起使用所判定距離來命令機器人臂506之一垂直速度(或沿著攝影機300之一光軸之速度)直至所判定距離達到「0」之一值為止。處理器4102及/或機器人臂控制器4106可在一程序期間隨時使用此自動聚焦演算法以使一目標物件對焦。在某些實施例中,處理器4102及/或機器人臂控制器4106可在搜尋一自動聚焦方向時使用自上一次使用自動聚焦以來機器人臂506及/或耦合板3304之移動作為一種子或起點,因而改良使一目標物件聚焦之速度及準確性。In some embodiments, the processor 4102 and/or robotic arm controller 4106 are configured to move the robotic arm 506 along the camera's working distance in response to a single button press via the input device 1410. This feature enables an operator to fix a motor position of the main objective assembly 702 and achieve focus by moving the robotic arm 506 and/or coupling plate 3304. This "robotic autofocus" feature or procedure is accomplished by the processor 4102 and/or robotic arm controller 4106 estimating or determining a distance from a front of the main objective assembly 702 to a target, as discussed above in conjunction with FIG. 43. The processor 4102 and/or robotic arm controller 4106 are configured to use the determined distance with a feedback law to command a vertical velocity of the robotic arm 506 (or a velocity along an optical axis of the camera 300) until the determined distance reaches a value of "0". The processor 4102 and/or robotic arm controller 4106 may use this autofocus algorithm at any time during a procedure to bring a target object into focus. In some embodiments, when searching for an autofocus direction, the processor 4102 and/or robotic arm controller 4106 may use the movement of the robotic arm 506 and/or coupling plate 3304 since the last use of autofocus as a seed or starting point, thereby improving the speed and accuracy of focusing on a target object.
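The feedback law described above can be sketched as a simple proportional controller that commands an axial speed until the determined distance reaches zero. The gain, speed limit, time step, and tolerance below are illustrative assumptions, not the controller actually used.

```python
def autofocus_step(distance_mm, gain=2.0, max_speed_mm_s=10.0):
    """One iteration of a proportional feedback law: command an axial
    speed (mm/s) that drives the lens-to-focus distance toward zero,
    clipped to an assumed maximum arm speed."""
    return max(-max_speed_mm_s, min(max_speed_mm_s, gain * distance_mm))

def simulate_autofocus(distance_mm, dt=0.05, tol=0.01, max_iters=1000):
    """Integrate the feedback law until the residual distance is ~0,
    returning the final residual distance (mm)."""
    for _ in range(max_iters):
        if abs(distance_mm) < tol:
            break
        distance_mm -= autofocus_step(distance_mm) * dt
    return distance_mm
```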
應瞭解,除了或替代使前透鏡組714、透鏡鏡筒組718及/或最後光學組742 (其中之每一者可係可藉由具有映射至位置、焦點、工作距離及/或放大率之編碼器計數之一各別馬達移動的)移動,實例性處理器4102及/或機器人臂控制器4106亦可經組態以致使機器人臂506及/或耦合板3304移動。舉例而言,當前透鏡組714、透鏡鏡筒組718及/或最後光學組742中之任一者將要接近一移動限制時,處理器4102及/或機器人臂控制器4106可致使機器人臂506沿著一光軸移動。在某些實例中,處理器4102及/或機器人臂控制器4106可致使機器人臂506首先移動至大致對焦或接近焦點之一位置,且然後調整前透鏡組714、透鏡鏡筒組718及/或最後光學組742以使目標影像進入接近理想焦點。It should be appreciated that, in addition to or instead of moving the front lens group 714, lens barrel group 718, and/or final optical group 742 (each of which may be movable by a respective motor having encoder counts mapped to position, focus, working distance, and/or magnification), the example processor 4102 and/or robotic arm controller 4106 may also be configured to cause the robotic arm 506 and/or coupling plate 3304 to move. For example, when any of the front lens group 714, lens barrel group 718, and/or final optical group 742 is about to approach a movement limit, the processor 4102 and/or robotic arm controller 4106 may cause the robotic arm 506 to move along an optical axis. In some examples, the processor 4102 and/or robotic arm controller 4106 may cause the robotic arm 506 to first move to a position of approximate or near focus, and then adjust the front lens group 714, lens barrel group 718, and/or final optical group 742 to bring the target image into near-ideal focus.
2.自動化聚焦尖端定位實施例 2. Automated Focus Tip Positioning Embodiments
在某些實施例中,機器人臂506及/或耦合板3304可連同立體視覺化攝影機300操作以提供自動化聚焦尖端定位。在此等實施例中,處理器4102及/或機器人臂控制器4106經組態以定位攝影機300以用於在不具有一特定影像及其內容之資訊或回饋之情況下視覺化一目標外科手術部位。處理器4102及/或機器人臂控制器4106可使用上文結合圖42及圖49所論述之經校準攝影機模型參數來執行開環攝影機定位。處理器4102及/或機器人臂控制器4106可致使機器人臂506定位立體視覺化攝影機300,使得攝影機之一焦點或尖端在一場景中。立體視覺化攝影機300基於關於機器人臂506及/或耦合板之一姿勢之校準資訊及攝影機300之光學校準參數而判定攝影機300相對於一座標系之一瞄準方向。處理器4102及/或機器人臂控制器4106可藉由與攝影機300之立體光軸重合地對準之一幾何上定義之觀看向量相對於機器人臂506之座標系表徵該瞄準。In some embodiments, the robotic arm 506 and/or coupling plate 3304 may operate in conjunction with the stereoscopic visualization camera 300 to provide automated focus tip positioning. In these embodiments, the processor 4102 and/or robotic arm controller 4106 are configured to position the camera 300 for visualizing a target surgical site without information or feedback about a particular image and its contents. The processor 4102 and/or robotic arm controller 4106 may perform open-loop camera positioning using the calibrated camera model parameters discussed above in conjunction with FIGS. 42 and 49. The processor 4102 and/or robotic arm controller 4106 may cause the robotic arm 506 to position the stereoscopic visualization camera 300 so that a focal point, or tip, of the camera is in a scene. The stereoscopic visualization camera 300 determines an aiming direction of the camera 300 relative to a coordinate system based on calibration information about a pose of the robotic arm 506 and/or coupling plate and on the optical calibration parameters of the camera 300. The processor 4102 and/or robotic arm controller 4106 may characterize that aim relative to the coordinate system of the robotic arm 506 by a geometrically defined viewing vector aligned coincident with the stereoscopic optical axis of the camera 300.
在某些實施例中,處理器4102及/或機器人臂控制器4106經組態以執行一初始化常式以將校準參數及/或其他記憶體資料對準至一實際實體參考位置,此可用於尖端定位。舉例而言,處理器4102及/或機器人臂控制器4106可致使機器人臂506及/或耦合板移動至「位置0」處之一硬停,其中所有位置資料域設定至0 (或在一個三維空間中之0,0,0)。相對於此點進行額外運動且根據(舉例而言)機器人臂506及/或耦合板3304之各種關節馬達之編碼器計數來更新位置資料。In some embodiments, the processor 4102 and/or robotic arm controller 4106 are configured to execute an initialization routine to align the calibration parameters and/or other memory data to an actual physical reference position, which can be used for tip positioning. For example, the processor 4102 and/or robotic arm controller 4106 may cause the robotic arm 506 and/or coupling plate to move to a hard stop at "position 0", where all position data fields are set to 0 (or 0,0,0 in a three-dimensional space). Additional movements are made relative to this point, and the position data is updated based on, for example, the encoder counts of the various joint motors of the robotic arm 506 and/or coupling plate 3304.
在其他實施例中,處理器4102及/或機器人臂控制器4106可基於一或多個視覺化參數而判定或設定攝影機300之一尖端位置。舉例而言,處理器4102及/或機器人臂控制器4106可使用一投影中心位置作為一觀看向量之一近端(例如,用於使攝影機300瞄準之一「起點」)。在某些外科手術系統中,一外科手術儀器上之此點稱為「後」點,且可相對於尖端而提供。處理器4102及/或機器人臂控制器4106自尖端及後點計算一觀看向量方向以判定攝影機300相對於機器人臂506之座標系之一瞄準。In other embodiments, the processor 4102 and/or robotic arm controller 4106 may determine or set a tip position of the camera 300 based on one or more visualization parameters. For example, the processor 4102 and/or robotic arm controller 4106 may use a projection center position as a proximal end of a viewing vector (e.g., a "starting point" for aiming the camera 300). In some surgical systems, this point on a surgical instrument is referred to as the "rear" point, and may be provided relative to the tip. The processor 4102 and/or robotic arm controller 4106 calculates a viewing vector direction from the tip and rear points to determine an aim of the camera 300 relative to the coordinate system of the robotic arm 506.
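The rear-point/tip-point geometry can be sketched as follows: the viewing vector is the unit vector from the projection-center ("rear") point through the focal ("tip") point, and the tip can in turn be placed a calibrated working distance along that vector. Coordinate values here are illustrative.

```python
import math

def viewing_vector(rear_xyz, tip_xyz):
    """Unit viewing vector from the 'rear' point through the 'tip' point,
    expressed in the robot coordinate system."""
    d = [t - r for r, t in zip(rear_xyz, tip_xyz)]
    n = math.sqrt(sum(c * c for c in d))
    if n == 0.0:
        raise ValueError("rear and tip points coincide")
    return tuple(c / n for c in d)

def tip_from_working_distance(rear_xyz, view_unit, working_distance):
    """Place the tip point a calibrated working distance along the
    viewing vector, starting at the rear point."""
    return tuple(r + working_distance * v
                 for r, v in zip(rear_xyz, view_unit))
```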
另外或另一選擇係,處理器4102及/或機器人臂控制器4106可判定一焦距以用於計算一立體影像之一焦點平面自投影中心之一範圍。焦點平面處之影像之中心係「尖端」點。處理器4102及/或機器人臂控制器4106可使用一經校準工作距離來判定自攝影機300至尖端點之實際、空間、實體距離。此外,處理器4102及/或機器人臂控制器4106可判定放大率,如上文關於放大率校準所論述。Additionally or alternatively, the processor 4102 and/or robotic arm controller 4106 may determine a focal length for calculating a range of a focal plane of a stereoscopic image from the projection center. The center of the image at the focal plane is the "tip" point. The processor 4102 and/or robotic arm controller 4106 may use a calibrated working distance to determine the actual, spatial, physical distance from the camera 300 to the tip point. Further, the processor 4102 and/or robotic arm controller 4106 may determine the magnification, as discussed above regarding magnification calibration.
3.距離量測實施例 3. Distance Measurement Embodiments
在某些實施例中,機器人臂506及/或耦合板3304可連同立體視覺化攝影機300來操作以提供一立體影像中之物件之間的距離量測及/或深度量測。舉例而言,處理器4102可使用變換至機器人空間之光學校準參數來在維度上判定相對於機器人臂506之座標系的攝影機300之一焦點或尖端之一中心。如上文結合圖45及圖46所論述,一影像中之任一點之一觀看向量及左/右視差資訊可由處理器4102使用以透過關於尖端或關於影像中之任何其他點之三角量測在三個維度上計算其位置。此三角量測使得處理器4102能夠將一影像中之任一點映射至機器人座標系。如此,處理器4102可相對於機器人臂506之同一座標空間計算多個物件之位置及/或深度及/或一物件之不同部分之位置,此使得能夠在物件之間判定一距離量測及/或深度量測。In some embodiments, the robotic arm 506 and/or coupling plate 3304 may operate in conjunction with the stereoscopic visualization camera 300 to provide distance measurements and/or depth measurements between objects in a stereoscopic image. For example, the processor 4102 may use optical calibration parameters transformed into robot space to dimensionally determine a center of a focal point, or tip, of the camera 300 relative to the coordinate system of the robotic arm 506. As discussed above in conjunction with FIGS. 45 and 46, a viewing vector and left/right disparity information for any point in an image may be used by the processor 4102 to calculate its position in three dimensions through triangulation with respect to the tip or with respect to any other point in the image. This triangulation enables the processor 4102 to map any point in an image to the robot coordinate system. As such, the processor 4102 can calculate the positions and/or depths of multiple objects, and/or the positions of different parts of an object, relative to the same coordinate space of the robotic arm 506, which enables a distance measurement and/or depth measurement to be determined between objects.
處理器4102可致使距離及/或深度量測資訊視覺上顯示於立體影像上方及/或連同立體影像來顯示。在某些例項中,一操作者可使用輸入裝置1410以藉由如下方式選擇兩個或兩個以上物件:使用一手指或外科手術儀器在一螢幕上選擇該等物件或直接指向患者中之實際物件。處理器4102接收選擇之指示且因此判定物件之座標及物件之間的距離。處理器4102然後可連同立體影像來顯示指示距離之一直尺圖形及/或值(及/或選定物件之一指示)。The processor 4102 may cause the distance and/or depth measurement information to be visually displayed over and/or together with the stereoscopic image. In some instances, an operator may use the input device 1410 to select two or more objects by selecting them on a screen with a finger or a surgical instrument, or by pointing directly at the actual objects in the patient. The processor 4102 receives an indication of the selection and accordingly determines the coordinates of the objects and the distance between them. The processor 4102 may then display, together with the stereoscopic image, a ruler graphic and/or a value indicating the distance (and/or an indication of the selected objects).
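Once two selected points have been triangulated into the robot coordinate system, the distance and depth measurements reduce to simple geometry, sketched here with assumed millimeter units:

```python
import math

def euclidean_mm(p, q):
    """Straight-line distance between two triangulated 3-D points that
    have been mapped into the robot coordinate system (units: mm)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def depth_difference_mm(p, q, axis=2):
    """Depth separation between two points along one coordinate axis
    (default: Z, i.e., along the vertical/optical direction)."""
    return abs(p[axis] - q[axis])
```

The returned values are what the display pipeline would render as the ruler graphic and/or numeric distance.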
此外,對物件之追蹤使得先前經成像(或提供於其他影像中)之其他物件之位置能夠經儲存且稍後進行比較。例如,攝影機300可移動至其中該等物件中之至少某些物件在當前FOV外側之一位置。然而,一操作者可指示處理器4102判定FOV內之一物件與當前在FOV外側之一先前經成像物件之間的一距離。In addition, tracking of objects allows the locations of other objects that have been previously imaged (or provided in other images) to be stored and compared later. For example, the camera 300 can be moved to a position where at least some of the objects are outside the current FOV. However, an operator can instruct the processor 4102 to determine a distance between an object in the FOV and a previously imaged object currently outside the FOV.
在某些實施例中,處理器4102可使用物件之座標來融合來自替代模態視覺化之數位影像或模型,諸如MRI影像、X射線影像、外科手術模板或指南、術前影像等。實例性處理器4102經組態以使用座標平面中之物件位置以及深度資訊來對替代模態視覺化進行恰當地比例縮放、定向及定位。處理器4102可選擇在一所顯示立體影像中具有完全相同特徵(例如,物件)的替代模態視覺化之至少一部分。例如,處理器4102可使用一影像分析常式來在一立體影像中定位一血管型樣、一疤痕、一畸形或其他可觀看實體結構或物件。處理器4102然後定位替代模態視覺化中之完全相同特徵。處理器4102選擇包含完全相同特徵的替代模態視覺化之一部分。處理器4102然後可使用座標、深度及/或立體影像中之特徵之間的距離來對替代模態視覺化之選定部分進行比例縮放、旋轉及/或定向。處理器然後可融合替代模態視覺化之經調整部分與立體影像。處理器4102可追蹤可識別物件如何相對於彼此及/或相對於FOV移動以判定將如何相應地更新融合影像。舉例而言,攝影機300移動至另一外科手術位置可致使處理器4102選擇外科手術前影像之另一部分以用於與其他外科手術位置之立體影像融合。In some embodiments, the processor 4102 may use the coordinates of objects to fuse digital images or models from alternate-modality visualizations, such as MRI images, X-ray images, surgical templates or guides, preoperative images, and so on. The example processor 4102 is configured to use the object positions in the coordinate plane, along with depth information, to appropriately scale, orient, and position the alternate-modality visualization. The processor 4102 may select at least a portion of the alternate-modality visualization having identical features (e.g., objects) to those in a displayed stereoscopic image. For example, the processor 4102 may use an image analysis routine to locate a blood vessel pattern, a scar, a deformity, or another viewable physical structure or object in a stereoscopic image. The processor 4102 then locates the identical feature in the alternate-modality visualization and selects the portion of the visualization containing that feature. The processor 4102 may then use the coordinates, depths, and/or distances between features in the stereoscopic image to scale, rotate, and/or orient the selected portion of the alternate-modality visualization. The processor may then fuse the adjusted portion of the alternate-modality visualization with the stereoscopic image. The processor 4102 may track how the identifiable objects move relative to each other and/or relative to the FOV to determine how the fused image should be updated accordingly. For example, moving the camera 300 to another surgical location may cause the processor 4102 to select another portion of the preoperative image for fusion with the stereoscopic image of that other surgical location.
在某些例項中,處理器4102及/或機器人臂控制器4106可致使機器人臂506移動以追蹤FOV中之一物件之一移動。處理器4102使用物件之座標位置來偵測移動或混亂。回應於所偵測到之移動或混亂,處理器4102及/或機器人臂控制器4106經組態以判定將如何使機器人臂506及/或耦合板3304移動以追蹤物件之移動或克服混亂。舉例而言,處理器4102及/或機器人臂控制器4106可使機器人臂506在一圓形路徑中移動以自多個方向視覺化一患者之視網膜上之一點以避免自工具之反射或混亂。In some instances, the processor 4102 and/or robotic arm controller 4106 may cause the robotic arm 506 to move to track a movement of an object in the FOV. The processor 4102 uses the coordinate position of the object to detect movement or obstruction. In response to the detected movement or obstruction, the processor 4102 and/or robotic arm controller 4106 are configured to determine how to move the robotic arm 506 and/or coupling plate 3304 to track the object's movement or overcome the obstruction. For example, the processor 4102 and/or robotic arm controller 4106 may move the robotic arm 506 in a circular path to visualize a point on a patient's retina from multiple directions so as to avoid reflections or obstruction from a tool.
4.影像融合實施例 4. Image Fusion Embodiments
如上文所論述,處理器4102經組態以將來自一替代模態之一影像融合至現場立體影像。舉例而言,若一外科醫師正在對具有一深腦腫瘤之一患者進行手術,則該外科醫師可指示處理器4102將在恰當位置中且在恰當深度及立體視角處之腦腫瘤之一MRI影像視覺化為顯示監視器512上之來自攝影機300之其現場影像。在某些實施例中,處理器4102經組態以使用FOV中之一或多個物件之距離及/或深度量測資訊來與替代模態視圖融合。處理器4102亦可使用立體光軸(例如,觀看向量)、IPD及/或在結合圖42所論述之校準步驟中計算且儲存至一或多個LUT之攝影機模型參數來提供成像融合。該等光學校準參數之使用使得處理器4102能夠顯示一替代模態影像,好似該影像由立體視覺化攝影機300獲取。處理器4102可使用攝影機之該等光學校準參數以基於攝影機300之一有效IPD而對替代模態影像進行模型化、比例縮放或修改,使得在距外科手術部位中之一焦點之一距離Z處觀看替代模態影像(給定攝影機300之所應用工作距離及放大率)。As discussed above, the processor 4102 is configured to fuse an image from an alternate modality into the live stereoscopic image. For example, if a surgeon is operating on a patient having a deep brain tumor, the surgeon may instruct the processor 4102 to visualize an MRI image of the brain tumor, in the proper position and at the proper depth and stereoscopic perspective, together with the live image from the camera 300 on the display monitor 512. In some embodiments, the processor 4102 is configured to use distance and/or depth measurement information of one or more objects in the FOV for fusion with the alternate-modality view. The processor 4102 may also use the stereoscopic optical axes (e.g., viewing vectors), the IPD, and/or the camera model parameters calculated in the calibration steps discussed in conjunction with FIG. 42 and stored in one or more LUTs to provide image fusion. The use of these optical calibration parameters enables the processor 4102 to display an alternate-modality image as if that image had been acquired by the stereoscopic visualization camera 300. The processor 4102 may use the camera's optical calibration parameters to model, scale, or modify the alternate-modality image based on an effective IPD of the camera 300, such that the alternate-modality image is viewed at a distance Z from a focal point in the surgical site (given the applied working distance and magnification of the camera 300).
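One relation a renderer can use for this effective-IPD modeling is the pinhole disparity equation, sketched below. The focal length, IPD, and depth values are illustrative and are not the camera 300's calibrated parameters.

```python
def stereo_disparity_px(depth_mm, ipd_mm, focal_px):
    """Horizontal disparity (pixels) of a point at depth_mm as seen by two
    pinhole views separated by ipd_mm -- the relation a renderer can use
    to draw an alternate-modality voxel at the same apparent depth as the
    live stereoscopic image.  Simple pinhole model, for illustration."""
    if depth_mm <= 0:
        raise ValueError("depth must be positive")
    return focal_px * ipd_mm / depth_mm
```

Points rendered with larger disparity appear closer to the viewer, so matching this disparity to the triangulated scene depth places the overlay at the correct apparent depth.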
圖52展示根據本發明之一實例性實施例之用於融合來自一替代模態視覺化之一影像與立體影像之一實例性程序5200之一圖式。儘管參考圖52中所圖解說明之流程圖闡述程序5200,但應瞭解,可使用執行與程序5200相關聯之步驟之諸多其他方法。舉例而言,可改變方塊中之諸多方塊之次序,可組合特定方塊與其他方塊,且所闡述之方塊中之諸多方塊係選用的。此外,可在多個裝置當中執行在程序5200中所闡述之動作,該多個裝置包含(舉例而言)圖14之實例性立體視覺化攝影機300之光學元件1402、影像擷取模組1404、馬達與光照模組1406、資訊處理器模組1408及/或圖41之關節R1至R9及機器人臂控制器4106。舉例而言,可由儲存於處理器4102之記憶體1570中之一程式執行程序5200。FIG. 52 shows a diagram of an example program 5200 for fusing an image and a stereo image from an alternative modal visualization according to an exemplary embodiment of the present invention. Although the procedure 5200 is described with reference to the flowchart illustrated in FIG. 52, it should be understood that many other methods of performing the steps associated with the procedure 5200 can be used. For example, the order of the blocks in the block can be changed, a specific block can be combined with other blocks, and many of the blocks described are optional. In addition, the actions described in the procedure 5200 can be executed in multiple devices, including, for example, the optical element 1402 of the exemplary stereo visualization camera 300 of FIG. 14, the image capturing module 1404, The motor and lighting module 1406, the information processor module 1408 and/or the joints R1 to R9 and the robot arm controller 4106 of FIG. 41. For example, the program 5200 can be executed by a program stored in the memory 1570 of the processor 4102.
程序5200之實例性處理器4102經組態以使用光學校準參數來將(舉例而言)一患者之先前所產生三維MRI資料再現為具有恰當視角之一立體影像作為由攝影機300記錄之一立體影像。處理器4102可自圖41之裝置4104接收(舉例而言)一替代模態視覺化,諸如MRI資料(方塊5202)。處理器4102亦可經由一輸入裝置1410接收指示替代模態視覺化將與由立體視覺化攝影機300記錄之立體影像融合之一輸入5203 (方塊5204)。In the procedure 5200, the example processor 4102 is configured to use the optical calibration parameters to render, for example, previously generated three-dimensional MRI data of a patient as a stereoscopic image with the proper perspective, as if it were a stereoscopic image recorded by the camera 300. The processor 4102 may receive, for example, an alternate-modality visualization, such as MRI data, from the device 4104 of FIG. 41 (block 5202). The processor 4102 may also receive, via an input device 1410, an input 5203 indicating that the alternate-modality visualization is to be fused with the stereoscopic image recorded by the stereoscopic visualization camera 300 (block 5204).
在程序5200期間,當一外科醫師針對一外科手術程序將攝影機300定位於一所要定向及位置處時,處理器4102獲得姿勢資料5205 (方塊5206)。姿勢資料5205可包含機器人臂506、耦合板3304及/或立體視覺化攝影機300之位置。處理器4102亦自一或多個LUT (諸如圖42之LUT 4203)存取與攝影機300有關之放大率及工作距離光學校準參數5207 (方塊5208)。處理器4102使用姿勢資料5205連同放大率及工作距離光學校準參數5207來判定攝影機300之一立體軸線及IPD (方塊5210)。處理器4102應用姿勢資料、立體軸線資料、IPD資料及/或光學校準參數以選擇MRI資料之至少一部分及/或對MRI資料之選定部分進行修改、比例縮放、定向、分割等,使得以如立體視覺化攝影機300所觀看的患者之大腦之一視圖之一視角提供該選定部分(方塊5212)。處理器4102經組態以應用立體光軸觀看向量及IPD以用於將MRI資料之選定部分再現為與攝影機300之當前現場視圖對應之一立體影像(方塊5214)。處理器4102然後可融合立體MRI影像與來自攝影機300之現場立體影像,如本文中所論述(方塊5216)。During the procedure 5200, when a surgeon positions the camera 300 at a desired orientation and position for a surgical procedure, the processor 4102 obtains pose data 5205 (block 5206). The pose data 5205 may include the positions of the robotic arm 506, the coupling plate 3304, and/or the stereoscopic visualization camera 300. The processor 4102 also accesses magnification and working-distance optical calibration parameters 5207 related to the camera 300 from one or more LUTs (such as the LUT 4203 of FIG. 42) (block 5208). The processor 4102 uses the pose data 5205 together with the magnification and working-distance optical calibration parameters 5207 to determine a stereoscopic axis and an IPD of the camera 300 (block 5210). The processor 4102 applies the pose data, stereoscopic axis data, IPD data, and/or optical calibration parameters to select at least a portion of the MRI data and/or to modify, scale, orient, segment, etc., the selected portion of the MRI data so that the selected portion is provided from the perspective of a view of the patient's brain as seen by the stereoscopic visualization camera 300 (block 5212). The processor 4102 is configured to apply the stereoscopic optical-axis viewing vector and the IPD to render the selected portion of the MRI data as a stereoscopic image corresponding to the current live view of the camera 300 (block 5214). The processor 4102 may then fuse the stereoscopic MRI image with the live stereoscopic image from the camera 300, as discussed herein (block 5216).
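The calibration lookup of blocks 5208 and 5210 can be sketched as follows. This is a hedged illustration only: the `CalibrationLUT` class, the table layout, and all numeric values are assumptions for demonstration, not the disclosed system's data; the real LUT 4203 would also key on magnification and return a stereo axis in addition to the IPD.

```python
# Hypothetical sketch of a working-distance -> IPD lookup table with
# linear interpolation between stored calibration entries.
from bisect import bisect_left

class CalibrationLUT:
    """Maps a working distance (mm) to a calibrated interpupillary
    distance (IPD, mm) by linear interpolation, clamping at the ends."""

    def __init__(self, entries):
        # entries: list of (working_distance_mm, ipd_mm), sorted by distance
        self.wd = [e[0] for e in entries]
        self.ipd = [e[1] for e in entries]

    def ipd_at(self, wd_mm):
        i = bisect_left(self.wd, wd_mm)
        if i == 0:
            return self.ipd[0]          # below table range: clamp
        if i >= len(self.wd):
            return self.ipd[-1]         # above table range: clamp
        w0, w1 = self.wd[i - 1], self.wd[i]
        t = (wd_mm - w0) / (w1 - w0)    # interpolation fraction
        return self.ipd[i - 1] * (1 - t) + self.ipd[i] * t

lut = CalibrationLUT([(200.0, 58.0), (300.0, 60.0), (400.0, 63.0)])
print(lut.ipd_at(250.0))  # midway between 58 and 60 -> 59.0
```

Interpolating between sparse calibration points keeps the stored table small while still yielding a parameter for any working distance the arm reaches.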
如上文所論述,處理器4102可使用一物件或特徵來定位或融合所再現MRI資料與來自立體視覺化攝影機300之立體影像。舉例而言,處理器4102可使用一或多個影像分析常式來識別一立體影像中之區別特徵或物件,定位所再現立體MRI資料中之相同區別特徵,且使所再現立體MRI資料位於攝影機立體影像之適當部分上方,使得特徵或物件對準且具有相同比例、大小、深度、定向等。處理器4102可使所再現立體MRI資料至少部分地透明以使得現場影像亦能夠可觀看。另外或另一選擇係,處理器4102可在所再現立體MRI資料之一邊框處調整一陰影以降低所再現立體MRI資料與攝影機立體影像之間的視覺對比度。圖52之實例性程序5200然後可結束。As discussed above, the processor 4102 may use an object or feature to register or fuse the rendered MRI data with the stereoscopic image from the stereoscopic visualization camera 300. For example, the processor 4102 may use one or more image analysis routines to identify a distinguishing feature or object in a stereoscopic image, locate the same distinguishing feature in the rendered stereoscopic MRI data, and position the rendered stereoscopic MRI data over the appropriate portion of the camera's stereoscopic image such that the features or objects are aligned and have the same scale, size, depth, orientation, etc. The processor 4102 may make the rendered stereoscopic MRI data at least partially transparent so that the live image remains viewable. Additionally or alternatively, the processor 4102 may adjust a shading at a border of the rendered stereoscopic MRI data to reduce the visual contrast between the rendered stereoscopic MRI data and the camera's stereoscopic image. The example procedure 5200 of FIG. 52 may then end.
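The partial-transparency overlay step described above can be sketched as a simple alpha blend. This is a minimal illustration under stated assumptions: images are NumPy float arrays in [0, 1], the feature-matched placement has already produced a `top_left` location, and the function name is not from the disclosed system.

```python
# Minimal sketch: place a rendered MRI patch at a matched feature
# location and alpha-blend it so the live image stays visible beneath.
import numpy as np

def fuse_overlay(live, overlay, top_left, alpha=0.4):
    """Alpha-blend `overlay` onto a copy of `live` at `top_left` (row, col)."""
    fused = live.copy()
    r, c = top_left
    h, w = overlay.shape[:2]
    region = fused[r:r + h, c:c + w]
    fused[r:r + h, c:c + w] = (1 - alpha) * region + alpha * overlay
    return fused

live = np.zeros((4, 4))       # stand-in for the live camera image
mri = np.ones((2, 2))         # stand-in for the rendered MRI patch
out = fuse_overlay(live, mri, (1, 1), alpha=0.5)
print(out[1, 1], out[0, 0])   # 0.5 inside the overlay, 0.0 outside
```

Raising `alpha` toward 1 emphasizes the MRI rendering; lowering it emphasizes the live view, matching the operator-adjustable transparency control mentioned in the text.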
實例性程序5200使得腦腫瘤能夠由外科醫師在相對於攝影機300之立體影像之一準確位置中視覺化。外科醫師可尤其在一外科手術程序中途使用此融合視覺化。舉例而言,外科醫師可以最佳地經闡述為低於一當前解剖位準之「x射線視覺」之一方式看到尚未曝露之腫瘤。可經由輸入裝置1410調整現場或所再現立體MRI影像之透明度之控制以最佳化融合影像之清晰度。實例性程序因此達成一腫瘤之一更安全、更準確且高效之切除。The example procedure 5200 enables a brain tumor to be visualized by the surgeon at an accurate position within the stereoscopic image relative to the camera 300. The surgeon may use this fused visualization especially midway through a surgical procedure. For example, the surgeon can see a tumor that is not yet exposed, in a manner best described as "x-ray vision" below the current anatomical level. Controls for the transparency of the live or rendered stereoscopic MRI image can be adjusted via the input device 1410 to optimize the clarity of the fused image. The example procedure thus achieves a safer, more accurate, and more efficient resection of a tumor.
在某些實施例中,若一FOV、焦點、工作距離及/或放大率改變,則可重複程序5200。在此等實施例中,處理器4102經組態以使用經更新姿勢資訊且自一查找表提取對應立體軸線及IPD以將MRI資料重新再現為一經更新準確立體影像。處理器4102經組態以將新再現之MRI資料融合至當前立體影像中,使得現場視圖及對應MRI資料位於恰當位置、深度及定向中。In some embodiments, if a FOV, focus, working distance, and/or magnification change, the procedure 5200 can be repeated. In these embodiments, the processor 4102 is configured to use the updated posture information and extract the corresponding stereo axis and IPD from a lookup table to reproduce the MRI data as an updated accurate stereo image. The processor 4102 is configured to fuse the newly reproduced MRI data into the current three-dimensional image, so that the scene view and the corresponding MRI data are located in the proper position, depth, and orientation.
在某些實施例中,實例性處理器4102經組態以與立體視覺化攝影機300、機器人臂506及/或耦合板3304一起操作以產生現場剖面融合視覺化。一外科手術部位之一剖面視覺化給一外科醫師提供不可以其他方式獲得之一顯著經改良視點。圖53展示具有一神經膠質母細胞瘤5302之一患者5300之一圖式,神經膠質母細胞瘤5302在一患者之頭部內側以幻影來圖解說明。具體而言,神經膠質母細胞瘤5302可位於患者之大腦5304中,其以光幻影線來展示。圖53之圖式係(舉例而言)來自一MRI裝置之術前診斷影像之典型,其中眾多影像切片經堆疊且再現及視覺化患者之顱骨5306之一內部之一3D模型。In some embodiments, the example processor 4102 is configured to operate with the stereo visualization camera 300, the robotic arm 506, and/or the coupling plate 3304 to produce a live profile fusion visualization. The visualization of a cross-section of a surgical site provides a surgeon with a significantly improved perspective that cannot be obtained in other ways. Figure 53 shows a diagram of a patient 5300 with a glioblastoma 5302. The glioblastoma 5302 is illustrated as a phantom on the inside of the head of a patient. Specifically, the glioblastoma 5302 can be located in the patient's brain 5304, which is displayed as a light phantom line. The diagram in FIG. 53 is, for example, a typical preoperative diagnostic image from an MRI device, in which a number of image slices are stacked and reproduce and visualize a 3D model of an interior of a patient's skull 5306.
在所圖解說明實例中,將透過大腦外科手術移除神經膠質母細胞瘤5302。圖54展示患者5300經歷一穿顱術程序5400以提供對顱骨5306之通達之一透視圖之一圖式。程序5400亦包含使用外科手術儀器5402進行之大腦解剖及回縮。一般而言,以一深圓錐形狀形成一外科手術通達部位5404以通達神經膠質母細胞瘤5302。In the illustrated example, the glioblastoma 5302 will be removed through brain surgery. FIG. 54 shows a diagram of a perspective view of a patient 5300 undergoing a craniotomy procedure 5400 to provide access to the skull 5306. Procedure 5400 also includes brain dissection and retraction using surgical instruments 5402. Generally speaking, a surgical access site 5404 is formed in a deep cone shape to access the glioblastoma 5302.
圖55展示根據本發明之一實例性實施例之立體視覺化平台516之一圖式,立體視覺化平台516包含立體視覺化攝影機300及機器人臂506以視覺化穿顱術程序5400。如圖解說明,穿顱術程序5400經設置使得機器人臂506經定位以使立體視覺化攝影機300沿著圓錐形外科手術部位5404之視覺化軸線5500瞄準穿過顱骨5306之頂部。手術外科醫師之一觀看一般穿過顱骨5306之頂部,如圖57中所展示。如自圖57可瞭解,外科手術之深度及(舉例而言)外科手術儀器5402之尖端難以看到。FIG. 55 shows a diagram of a stereoscopic visualization platform 516, according to an example embodiment of the present disclosure, which includes the stereoscopic visualization camera 300 and the robotic arm 506 for visualizing the craniotomy procedure 5400. As illustrated, the craniotomy procedure 5400 is set up such that the robotic arm 506 is positioned so that the stereoscopic visualization camera 300 is aimed through the top of the skull 5306 along the visualization axis 5500 of the conical surgical site 5404. A view of the operating surgeon likewise passes generally through the top of the skull 5306, as shown in FIG. 57. As can be appreciated from FIG. 57, the depth of the surgery and, for example, the tip of the surgical instrument 5402 are difficult to see.
圖55中所展示之實例性立體視覺化攝影機300提供沿著圓錐形外科手術通達部位之軸線5500向下觀看之一高度準確立體影像。如上文所論述,攝影機300之左視圖與右視圖之間的視差資訊針對通達部位中兩個視圖共同之所有點由處理器4102使用以依據一已知參考深度(諸如(舉例而言)物件平面)判定每一點之一深度。在所圖解說明實例中,左視圖與右視圖之間的視差在該參考深度處等於「0」之一值,此使得處理器4102能夠判定影像中之每一點之一深度圖。處理器4102可重新再現該深度圖,好似自一不同角度觀看該圖。此外,處理器4102經組態以基於接收到來自一外科醫師及/或一操作者之一指令而使深度圖之至少一部分為透明的。在所圖解說明實例中,處理器4102可使在圖57之剖面平面AA下面之深度圖之一部分為透明的,因而使得處理器4102能夠產生現場外科手術通達部位5404之一剖面圖。The example stereoscopic visualization camera 300 shown in FIG. 55 provides a highly accurate stereoscopic image looking down along the axis 5500 of the conical surgical access site. As discussed above, the disparity information between the left and right views of the camera 300 is used by the processor 4102, for all points in the access site common to the two views, to determine a depth of each point relative to a known reference depth (such as, for example, the object plane). In the illustrated example, the disparity between the left and right views is equal to a value of "0" at that reference depth, which enables the processor 4102 to determine a depth map comprising a depth for each point in the image. The processor 4102 can re-render the depth map as if viewing it from a different angle. In addition, the processor 4102 is configured to make at least a portion of the depth map transparent based on receiving an instruction from a surgeon and/or an operator. In the illustrated example, the processor 4102 may make transparent a portion of the depth map below the section plane AA of FIG. 57, thereby enabling the processor 4102 to generate a cross-sectional view of the live surgical access site 5404.
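The depth-from-disparity idea above can be sketched with a simple convergent-stereo model. This is a hedged illustration, not the disclosed computation: it assumes a pinhole model in which disparity is zero at the object (focus) plane, so depth is modeled as Z = f·B / (f·B/WD + d), which returns the working distance WD for d = 0 and smaller depths for positive (nearer) disparities. The parameter values are placeholders.

```python
# Hedged sketch: map a measured disparity (pixels) to a depth (mm),
# referenced so that zero disparity lands on the object plane.
def depth_from_disparity(d_px, focal_px, baseline_mm, working_distance_mm):
    fb = focal_px * baseline_mm
    return fb / (fb / working_distance_mm + d_px)

# At zero disparity the point lies on the object plane:
print(depth_from_disparity(0.0, 2000.0, 25.0, 300.0))  # 300.0
# A nearer point (positive disparity) maps to a smaller depth:
print(depth_from_disparity(20.0, 2000.0, 25.0, 300.0) < 300.0)  # True
```

Applying this per pixel to a disparity image yields the depth map that the text then re-renders from other angles or clips at a section plane.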
圖56展示圓錐形外科手術通達部位5404之一幻影視圖之一圖式。所圖解說明外科手術通達部位5404在此論述中為了清晰而包含階梯式圓錐形節段。在此實例中,部位5404之一掃視錐角由角度「α」指定。Figure 56 shows a schematic diagram of a phantom image of the accessible part 5404 of conical surgery. The illustrated surgical access site 5404 includes stepped conical segments for clarity in this discussion. In this example, one of the scan cone angles of part 5404 is specified by the angle "α".
圖58展示用於穿顱術程序5400之圓錐形外科手術通達部位5404之一圖式。外科手術儀器5402之大小及形狀之先前知識連同其位置、方向及/或定向之影像辨識使得處理器4102能夠產生圖58中所展示之剖面圖之影像資料。由圖57表示之立體視圖中之儀器5402之辨識在對患者之大腦5304進行手術時達成在圖58之剖面圖中之其精確放置及(舉例而言)對於外科醫師不可見之儀器之底側之視覺化。FIG. 58 shows a diagram of the conical surgical access site 5404 for the craniotomy procedure 5400. Prior knowledge of the size and shape of the surgical instrument 5402, together with image recognition of its position, direction, and/or orientation, enables the processor 4102 to generate the image data of the cross-sectional view shown in FIG. 58. Recognition of the instrument 5402 in the stereoscopic view represented by FIG. 57 achieves its precise placement in the cross-sectional view of FIG. 58 and enables visualization of, for example, the underside of the instrument that is not visible to the surgeon while operating on the patient's brain 5304.
在某些實施例中,處理器4102經組態以融合神經膠質母細胞瘤5302之一影像與接近現場或現場立體影像。如上文所論述,機器人臂506與攝影機300之組合提供一視圖相對於機器人參考系或機器人空間之高度準確位置、方向及/或定向資訊。在將機器人臂506及攝影機300配準或校準至患者5300之參考系之後,處理器4102產生外科手術通達部位5404之準確位置、方向及/或定向資訊及至患者之其各別位置。處理器4102使用影像融合來將神經膠質母細胞瘤5302之MRI影像之一選擇部分疊加至一剖面圖上,如圖59中所展示。另外,該影像融合達成其他相關MRI影像資料之視覺化,包含(舉例而言)大腦脈管系統或期望包含於影像中之其他結構。例示性外科手術程序以外科醫師能夠看到且理解神經膠質母細胞瘤5302之深度位置(以及儀器5402之一安全間距或定位)來繼續進行。In some embodiments, the processor 4102 is configured to fuse an image of the glioblastoma 5302 with a near-live or live stereoscopic image. As discussed above, the combination of the robotic arm 506 and the camera 300 provides highly accurate position, direction, and/or orientation information of a view relative to the robot reference frame or robot space. After registering or calibrating the robotic arm 506 and the camera 300 to the reference frame of the patient 5300, the processor 4102 generates accurate position, direction, and/or orientation information of the surgical access site 5404 and its respective location relative to the patient. The processor 4102 uses image fusion to superimpose a selected portion of the MRI image of the glioblastoma 5302 onto a cross-sectional view, as shown in FIG. 59. In addition, the image fusion enables visualization of other relevant MRI image data, including, for example, the cerebral vasculature or other structures desired to be included in the image. The exemplary surgical procedure continues with the surgeon able to see and understand the depth location of the glioblastoma 5302, in addition to a safe spacing or positioning of the instrument 5402.
圖59展示外科手術通達部位5404之一剖面圖之一圖式。在此實例中,神經膠質母細胞瘤5302之一部分5302'對於立體視覺化攝影機300係可見的。圖60展示正交於圖57之平面AA之一剖面圖之一圖式。該圖式可圖解說明由處理器4102基於與外科手術通達部位5404之現場視圖之MRI資料融合而產生之一剖面圖。處理器4102使用深度圖使得能夠再現在各種所要區段平面及區段平面組合處之外科手術通達部位5404,如圖61中所展示。再現使得處理器4102能夠依據MRI資料顯示包含可見部分5302'及剩餘部分之完整神經膠質母細胞瘤5302。處理器4102可自攝影機300之一視角顯示視覺化或將視覺化顯示為一剖面圖,如圖61中所展示。FIG. 59 shows a diagram of a cross-sectional view of the surgical access site 5404. In this example, a portion 5302' of the glioblastoma 5302 is visible to the stereoscopic visualization camera 300. FIG. 60 shows a diagram of a cross-sectional view orthogonal to the plane AA of FIG. 57. The diagram may illustrate a cross-sectional view generated by the processor 4102 based on fusing the MRI data with the live view of the surgical access site 5404. The processor 4102 uses the depth map to enable rendering of the surgical access site 5404 at various desired section planes and combinations of section planes, as shown in FIG. 61. The rendering enables the processor 4102 to display, from the MRI data, the complete glioblastoma 5302 including the visible portion 5302' and the remaining portion. The processor 4102 may display the visualization from a perspective of the camera 300 or display the visualization as a cross-sectional view, as shown in FIG. 61.
5.具有聯合視覺化之機器人運動實施例 5. Robot motion embodiment with joint visualization
在某些實施例中,實例性處理器4102結合機器人臂控制器4106、立體視覺化攝影機300、機器人臂506及/或耦合板3304來操作以聯合視覺化與機器人運動。在某些實例中,處理器4102及/或機器人臂控制器4106在一閉合環路中操作以基於機器人運動而提供聯合視覺化。在此等實例中,處理器4102及/或機器人臂控制器4106經組態以基於一特定影像及其內容(例如,物件、可識別特徵等)而定位機器人臂506、耦合板3304及/或攝影機300以用於一外科手術部位之視覺化。如上文所論述,處理器4102及/或機器人臂控制器4106已知機器人臂506及攝影機300位置。另外,由攝影機記錄之影像資料係立體的,此提供深度資料。因此,處理器4102及/或機器人臂控制器4106可判定一患者上或每個經視覺化點之機器人三維空間中之一位置。因此,當機器人臂506使攝影機300在一所要方向上自具有一初始影像之一初始位置移動時,預期在一第二移動後影像中看到所要影像改變。In some embodiments, the example processor 4102 operates in conjunction with the robot arm controller 4106, the stereoscopic visualization camera 300, the robotic arm 506, and/or the coupling plate 3304 to link visualization with robot motion. In some examples, the processor 4102 and/or the robot arm controller 4106 operate in a closed loop to provide linked visualization based on the robot motion. In these examples, the processor 4102 and/or the robot arm controller 4106 are configured to position the robotic arm 506, the coupling plate 3304, and/or the camera 300 for visualization of a surgical site based on a specific image and its content (e.g., objects, recognizable features, etc.). As discussed above, the positions of the robotic arm 506 and the camera 300 are known to the processor 4102 and/or the robot arm controller 4106. In addition, the image data recorded by the camera is stereoscopic, which provides depth data. Accordingly, the processor 4102 and/or the robot arm controller 4106 can determine a position in the robot's three-dimensional space for each visualized point on a patient. Therefore, when the robotic arm 506 moves the camera 300 in a desired direction from an initial position having an initial image, the desired image change is expected to be seen in a second, post-movement image.
另一選擇係,可藉由處理器4102及/或機器人臂控制器4106經組態以應用表示至初始影像資料之所要移動之方程式而計算預期移動後影像,此產生一所計算第二影像。處理器4102及/或機器人臂控制器4106使用一匹配模板常式或函數比較移動後實際影像與該所計算影像,如上文所闡述。若偵測到誤差,則處理器4102及/或機器人臂控制器4106可藉由使機器人臂506及/或攝影機300相應地移動而校正該等誤差。舉例而言,給定自一操作者接收之一初始影像及一所要移動「向右100個像素」,處理器4102及/或機器人臂控制器4106可將理論經移動影像之影像資料計算為向右100個像素之一移位。然後,藉由對如所揭示之各種協調機器人關節執行命令而進行實體移動,以將機器人臂506及/或攝影機300重定位至理論所要位置。攝影機300記錄一第二影像,處理器4102及/或機器人臂控制器4106使用(舉例而言)一匹配模板函數或其等效形式比較該第二影像與所計算影像資料。若該移動係準確的,則資料將指示在攝影機300之一尖端處之一100%關係,其中兩個影像完全對準。然而,若實際影像資料展示另一位置(舉例而言向右101個像素及向上5個像素)處之最佳相關,則處理器4102及/或機器人臂控制器4106可修改該移動以藉由經由機器人臂506使攝影機300實體上移動向左1像素及向下5個像素而校正誤差。Alternatively, the processor 4102 and/or the robot arm controller 4106 can be configured to calculate the expected post-movement image by applying an equation representing the desired movement to the initial image data, which produces a calculated second image. The processor 4102 and/or the robot arm controller 4106 use a matching template routine or function to compare the actual post-movement image with the calculated image, as described above. If errors are detected, the processor 4102 and/or the robot arm controller 4106 can correct those errors by moving the robotic arm 506 and/or the camera 300 accordingly. For example, given an initial image and a desired movement of "100 pixels to the right" received from an operator, the processor 4102 and/or the robot arm controller 4106 can calculate the image data of the theoretical moved image as a shift of 100 pixels to the right. The physical movement is then performed by executing commands on the various coordinated robot joints as disclosed, to reposition the robotic arm 506 and/or the camera 300 to the theoretically desired position. The camera 300 records a second image, and the processor 4102 and/or the robot arm controller 4106 compare the second image with the calculated image data using, for example, a matching template function or its equivalent. If the movement was accurate, the data will indicate a 100% correlation at a tip of the camera 300, where the two images are perfectly aligned. However, if the actual image data shows the best correlation at another position (for example, 101 pixels to the right and 5 pixels up), the processor 4102 and/or the robot arm controller 4106 can modify the movement by physically moving the camera 300, via the robotic arm 506, 1 pixel to the left and 5 pixels down to correct the error.
6.垂度補償實施例 6. Sag compensation embodiment
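The closed-loop check above can be sketched as follows. This is an illustrative sketch, not the disclosed routine: it predicts the post-move image as a pure pixel shift, measures the actual shift with a brute-force correlation search (a stand-in for the matching template function), and emits the residual as the correction.

```python
# Illustrative closed-loop move verification: commanded shift vs. the
# shift actually observed between the before/after images.
import numpy as np

def measure_shift(before, after, search=4):
    """Return the (dy, dx) integer shift maximizing overlap correlation."""
    best, best_score = (0, 0), -np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            shifted = np.roll(np.roll(before, dy, axis=0), dx, axis=1)
            score = np.sum(shifted * after)
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best

rng = np.random.default_rng(0)
img = rng.random((32, 32))
desired = (0, 3)                                  # commanded: 3 px right
actual_img = np.roll(img, (1, 2), axis=(0, 1))    # robot really moved (1, 2)
dy, dx = measure_shift(img, actual_img)
correction = (desired[0] - dy, desired[1] - dx)
print(correction)  # (-1, 1): the residual move needed to reach the target
```

In a real system the residual would be converted through the calibrated pixels-per-millimeter scale into a small corrective joint motion.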
在某些實施例中,機器人臂506及/或耦合板3304之關節R1至R9中之至少某些關節可經歷某些垂度。處理器4102及/或機器人臂控制器4106可經組態以提供對機器人臂垂度之校正。在某些例項中,處理器4102及/或機器人臂控制器4106經組態以對一系列小移動執行垂度補償,使得在機器人臂506之一運動範圍內保持運動準確性。舉例而言,為表徵且消除垂度,在鍛煉一特定機器人關節之運動方向上執行垂度補償以隔離隨實際機器人關節旋轉位置而變之誤差。藉由比較誤差與轉矩(藉由使攝影機300負載重量乘以力矩臂(或連桿)長度而計算),可判定彼關節之順從性。另一選擇係,可使用分析技術(舉例而言,有限元素分析(「FEA」))來計算關節順從性。In some embodiments, at least some of the joints R1 to R9 of the robot arm 506 and/or the coupling plate 3304 may experience some sag. The processor 4102 and/or the robot arm controller 4106 can be configured to provide correction for the sag of the robot arm. In some examples, the processor 4102 and/or the robot arm controller 4106 are configured to perform sag compensation for a series of small movements so that the accuracy of the motion is maintained within one of the ranges of motion of the robot arm 506. For example, in order to characterize and eliminate sag, sag compensation is performed in the direction of movement of a specific robot joint to isolate the error that varies with the actual robot joint rotation position. By comparing the error and torque (calculated by multiplying the load weight of the camera 300 by the length of the moment arm (or link)), the compliance of the joint can be determined. Alternatively, analytical techniques (for example, finite element analysis ("FEA")) can be used to calculate joint compliance.
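The torque-times-compliance model in the paragraph above can be sketched as a back-of-envelope calculation. The compliance value and loads here are made-up placeholders; a real system would characterize compliance per joint and per rotation angle (or compute it by FEA) and store the resulting corrections in a LUT.

```python
# Sketch of the sag model: joint torque = camera weight x moment arm,
# and angular sag = joint compliance x torque.
def joint_sag_rad(camera_weight_n, moment_arm_m, compliance_rad_per_nm):
    torque_nm = camera_weight_n * moment_arm_m   # static torque at the joint
    return compliance_rad_per_nm * torque_nm     # resulting angular droop

# 30 N camera at the end of a 0.5 m link, 1e-5 rad/(N*m) compliance:
sag = joint_sag_rad(30.0, 0.5, 1e-5)
print(sag)  # 0.00015 rad; command the joint this much further to cancel it
```

Summing such per-joint corrections over the kinematic chain gives the overall sag correction factor applied to a movement command.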
使用且儲存所有旋轉位置中之所有關節之以上順從性表徵,處理器4102及/或機器人臂控制器4106可計算一特定攝影機位置之總體垂度。處理器4102及/或機器人臂控制器4106可針對每一攝影機位置判定至一LUT及/或校準暫存器之一垂度校正因子。此外,處理器4102及/或機器人臂控制器4106可將垂度校正因子應用於機器人臂移動命令或一移動順序(在應用比例因子之前或之後),使得垂度補償併入至移動命令/信號中。可在一持續運動程序中計算校正因子,因而達成攝影機300之準確追蹤及跟隨。此校正因子進一步消除對用於校準/定位立體視覺化平台516之一第二攝影機之一需要,且消除在攝影機300上具有基準目標之需要,且因此消除手術用鋪巾干擾之一問題。Using and storing the above compliance characterizations of all joints at all rotational positions, the processor 4102 and/or the robot arm controller 4106 can calculate the overall sag for a specific camera position. The processor 4102 and/or the robot arm controller 4106 can determine a sag correction factor for each camera position in a LUT and/or calibration register. In addition, the processor 4102 and/or the robot arm controller 4106 can apply the sag correction factor to a robot arm movement command or a movement sequence (before or after applying a scale factor), so that sag compensation is incorporated into the movement commands/signals. The correction factor can be calculated in a continuous motion routine, thereby achieving accurate tracking and following by the camera 300. This correction factor further eliminates the need for a second camera for calibrating/positioning the stereoscopic visualization platform 516, eliminates the need for fiducial targets on the camera 300, and therefore eliminates a problem of surgical drape interference.
7.視覺化位置 / 定向儲存實施例 7. Visualization position/orientation storage embodiment
在某些實施例中,實例性處理器4102經組態以保存用以返回至立體視覺化攝影機300之一特定定向及/或位置之視覺化參數。該等視覺化參數可包含立體視覺化攝影機300、機器人臂506及/或耦合板3304之一觀看向量、位置、放大率、工作距離、焦點、位置及/或定向。In some embodiments, the example processor 4102 is configured to save the visualization parameters used to return to a specific orientation and/or position of the stereo visualization camera 300. The visualization parameters may include a viewing vector, position, magnification, working distance, focus, position, and/or orientation of the stereo visualization camera 300, the robot arm 506, and/or the coupling board 3304.
在一實例中,一外科醫師可希望在一血管之一部分之一吻合期間在可見光照射下具有一小縫線之一高度放大視覺化。該外科醫師然後可在紅外線照射下縮小至整個血管之一較寬視圖以檢查通暢性。該外科醫師然後可返回至經放大視覺化以完成該縫合。在此實例中,處理器4102經組態以保存位置中之每一者處之視覺化參數。處理器4102可儲存與已被連續地觀看一時間週期(諸如兩秒、五秒、三十秒等)之位置對應之位置。處理器4102亦可在接收到經由輸入裝置1410來自外科醫師之一指令之後儲存一位置。In one example, a surgeon may wish to have a highly magnified visualization of a small suture during an anastomosis of a portion of a blood vessel under visible-light illumination. The surgeon may then zoom out to a wider view of the entire blood vessel under infrared illumination to check patency. The surgeon may then return to the magnified visualization to complete the suture. In this example, the processor 4102 is configured to save the visualization parameters at each of the positions. The processor 4102 may store positions corresponding to views that have been continuously observed for a period of time (such as two seconds, five seconds, thirty seconds, etc.). The processor 4102 may also store a position after receiving an instruction from the surgeon via the input device 1410.
處理器4102可顯示所儲存位置及/或路徑點之一清單。一所儲存位置之選擇致使處理器4102及/或機器人臂控制器4106使機器人臂及/或耦合板3304移動至先前位置且如先前設定而調整光學參數,包含光照射及濾波。此一組態使得一外科醫師能夠在不自程序之一所顯示影像移開其眼睛或自部位移開其手及其儀器之情況下無縫地觀看序列中之所有所儲存位置。The processor 4102 can display a list of stored locations and/or waypoints. The selection of a stored position causes the processor 4102 and/or the robot arm controller 4106 to move the robot arm and/or the coupling plate 3304 to the previous position and adjust the optical parameters as previously set, including light irradiation and filtering. This configuration allows a surgeon to seamlessly view all stored positions in the sequence without removing his eyes or removing his hands and instruments from the image displayed in one of the programs.
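The stored-position workflow above can be sketched as a small waypoint store. All field and class names here are illustrative assumptions; in the real system a recall would drive the arm and optics rather than merely return the parameters.

```python
# Sketch: bundle the visualization parameters listed in the text into
# waypoints that can be listed and recalled by name.
from dataclasses import dataclass

@dataclass
class Waypoint:
    name: str
    magnification: float
    working_distance_mm: float
    focus: float
    joint_angles_deg: tuple   # robot arm / coupling plate pose
    lighting: str             # e.g. "visible" or "infrared"

class WaypointStore:
    def __init__(self):
        self._points = []

    def save(self, wp):
        self._points.append(wp)

    def names(self):
        return [wp.name for wp in self._points]

    def recall(self, name):
        # The real system would move the arm and set the optics here;
        # this sketch just returns the stored parameters.
        return next(wp for wp in self._points if wp.name == name)

store = WaypointStore()
store.save(Waypoint("anastomosis close-up", 12.0, 250.0, 0.8,
                    (0,) * 9, "visible"))
store.save(Waypoint("vessel overview", 3.0, 350.0, 0.5,
                    (0,) * 9, "infrared"))
print(store.names())
print(store.recall("vessel overview").lighting)  # infrared
```

Keeping the waypoints in an ordered list also supports the pre-planned sequence described below, where the operator steps through waypoints during the procedure.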
在某些實施例中,處理器4102可經組態以使得一操作者能夠在一外科手術程序之前形成路徑點或位置/定向。該等路徑點可提供於一序列中,此使得處理器4102能夠在接收來自一操作者之進展之一輸入之後在程序期間進展經過規定路徑點。處理器4102可經由觸控螢幕輸入裝置1410a提供機器人臂506、耦合板3304及/或攝影機300之一個三維表示以使得一操作者能夠實際上定位立體視覺化平台516。此可包含相對於一虛擬化患者及/或基於患者之替代模態視覺化而提供一放大率、工作距離及/或焦點。處理器4102經組態以針對每一路徑點將視覺化參數儲存至(舉例而言)記憶體1570及/或記憶體4120。In certain embodiments, the processor 4102 may be configured to enable an operator to form waypoints or positions/orientations before a surgical procedure. The waypoints may be provided in a sequence, which enables the processor 4102 to progress through prescribed waypoints during the procedure after receiving an input of progress from an operator. The processor 4102 can provide a three-dimensional representation of the robot arm 506, the coupling board 3304, and/or the camera 300 via the touch screen input device 1410a so that an operator can actually position the stereo visualization platform 516. This may include providing a magnification, working distance, and/or focus relative to a virtualized patient and/or patient-based alternative modality visualization. The processor 4102 is configured to store the visualization parameters to, for example, the memory 1570 and/or the memory 4120 for each waypoint.
在某些實施例中,處理器4102經組態以執行特定程序所特有之特定視覺化。舉例而言,處理器4102中之影像辨識功能性用於自動對準攝影機300與一所關注物件。處理器4102比較外科手術部位之影像與目標物件之一先前影像或影像以提供一所要物件及其在一立體影像內之位置及定向之辨識。處理器4102及/或機器人臂控制器4106經組態以(舉例而言)使機器人臂506移動朝向物件且使攝影機300朝向物件變焦且設定特定物件及程序之所要影像視圖屬性。例如,在眼科學中,可比較一現場視網膜影像與一經保存影像使得(舉例而言)患者之視網膜之視神經頭部可依據影像辨識來定位。處理器4102及/或機器人臂控制器4106然後使機器人臂506及/或耦合板3304及焦點自動移動及/或改變攝影機300之一放大率,使得攝影機300之尖端指向神經頭部以用於診斷。處理器4102然後可設定攝影機300及/或監視器512以在不具有紅色著色之情況下進行影像顯示以使得視網膜之特徵能夠與周圍組織更容易地區分。In some embodiments, the processor 4102 is configured to perform specific visualizations unique to specific programs. For example, the image recognition functionality in the processor 4102 is used to automatically align the camera 300 with an object of interest. The processor 4102 compares the image of the surgical site with a previous image or image of the target object to provide identification of a desired object and its position and orientation in a three-dimensional image. The processor 4102 and/or the robot arm controller 4106 are configured to, for example, move the robot arm 506 toward the object and zoom the camera 300 toward the object and set desired image view attributes for specific objects and programs. For example, in ophthalmology, a live retinal image can be compared with a saved image so that, for example, the optic nerve head of the patient's retina can be located based on image recognition. The processor 4102 and/or the robot arm controller 4106 then make the robot arm 506 and/or the coupling plate 3304 and the focus automatically move and/or change the magnification of the camera 300 so that the tip of the camera 300 points to the nerve head for diagnosis . The processor 4102 can then set the camera 300 and/or the monitor 512 to perform image display without the red coloration so that the characteristics of the retina can be more easily distinguished from the surrounding tissues.
除保存且返回至所儲存視覺化之外,實例性處理器亦可保存自一個視圖至另一視圖之運動路徑。在上文所論述之吻合實例中,處理器4102及/或機器人臂控制器4106可致使機器人臂506及/或攝影機300在高放大率下跟隨一血管之一整個長度以檢查動脈瘤或其他條件。處理器4102可經組態以視需要辨識且跟隨連續血管。處理器4102可對一有限像素組執行一匹配模板常式以主動地判定機器人臂506及/或攝影機300之運動方向。In addition to saving and returning to the stored visualization, the example processor can also save the movement path from one view to another. In the anastomosis example discussed above, the processor 4102 and/or the robotic arm controller 4106 can cause the robotic arm 506 and/or the camera 300 to follow a full length of a blood vessel at high magnification to check for aneurysms or other conditions . The processor 4102 can be configured to recognize and follow continuous blood vessels as needed. The processor 4102 can execute a matching template routine on a finite pixel group to actively determine the movement direction of the robot arm 506 and/or the camera 300.
實例性處理器4102亦可程式化且儲存在自不同觀看角度進行的一物件之一視覺化內之一運動路徑。舉例而言,可藉由程式化處理器4102及/或機器人臂控制器4106以圍繞眼睛內側之一點樞轉而執行一患者之眼睛之一眼科前房角鏡檢查。在此實例中,機器人臂506使攝影機300以一大體圓錐形運動掃視使得自大量觀看角度觀看患者之眼睛。外科手術部位視覺化之此運動可用於選擇最佳角度以自照射排除虛假反射或在替代觀看角度中環視阻礙物。The example processor 4102 can also be programmed and stored in a movement path within a visualization of an object from different viewing angles. For example, the programmable processor 4102 and/or the robot arm controller 4106 can be used to pivot around a point inside the eye to perform an ophthalmic gonioscopy of the eye of a patient. In this example, the robotic arm 506 causes the camera 300 to scan in a generally conical motion so that the patient's eyes can be viewed from a large number of viewing angles. This movement of visualizing the surgical site can be used to select the best angle for self-illumination to eliminate false reflections or to look around obstructions in alternative viewing angles.
在某些實施例中,處理器4102經組態以減少深度圖計算中之遮擋。遮擋由於一立體影像之兩個視圖之視差而係深度圖計算中固有的,其中一第一視圖看到不同於其他視圖的一部位之某一部分。因此,每一視圖未看到其他視圖之某一部分。藉由使機器人臂506在各種位置當中移動且在使用影像像素之三維位置之知識之同時重新計算深度圖,減少遮擋。可藉由在執行已知運動步驟、計算預期圖改變、藉由差判定誤差且建構一平均圖之後反覆地計算深度圖而使該圖更準確。In some embodiments, the processor 4102 is configured to reduce occlusions in the depth map calculation. Occlusion is inherent in depth map calculation due to the parallax between the two views of a stereoscopic image, in which a first view sees some portion of a site differently from the other view. Thus, each view fails to see some portion seen by the other view. Occlusions are reduced by moving the robotic arm 506 among various positions and recalculating the depth map while using knowledge of the three-dimensional positions of the image pixels. The depth map can be made more accurate by iteratively recalculating it after performing known motion steps, calculating the expected map changes, determining errors by differencing, and constructing an average map.
E.輔助驅動實施例 E. Auxiliary drive embodiment
在某些實施例中,處理器4102及/或機器人臂控制器4106經組態以執行由儲存於記憶體1570及/或4120中之指令定義之一或多個演算法、常式等以使得機器人臂506及/或耦合板3304能夠基於由一操作者施加以用於使立體視覺化攝影機300移動之所偵測到之力而提供動力關節移動。在此等實施例中,輔助驅動特徵使得機器人臂506能夠藉由使立體視覺化攝影機300移動至一所要位置及/或定向而操作為一外科醫師之一延伸。如下文所闡述,處理器4102及/或機器人臂控制器4106經組態以監測由一操作者施予之力/轉矩/移動及臂關節之位置以推斷一操作者之意圖且因此使機器人臂506及/或耦合板3304移動。In some embodiments, the processor 4102 and/or the robot arm controller 4106 are configured to execute one or more algorithms, routines, etc. defined by instructions stored in the memory 1570 and/or 4120 to make The robot arm 506 and/or the coupling plate 3304 can provide powered joint movement based on a detected force applied by an operator to move the stereo visualization camera 300. In these embodiments, the auxiliary driving feature enables the robotic arm 506 to operate as an extension of a surgeon by moving the stereo visualization camera 300 to a desired position and/or orientation. As explained below, the processor 4102 and/or the robot arm controller 4106 are configured to monitor the force/torque/movement applied by an operator and the position of the arm joints to infer the intention of an operator and thus enable the robot The arm 506 and/or the coupling plate 3304 move.
圖62展示圖解說明根據本發明之一實例性實施例之用於提供立體視覺化攝影機300之輔助驅動之一演算法、常式或程序6200之一圖式。儘管參考圖62中所圖解說明之流程圖闡述程序6200,但應瞭解,可使用執行與程序6200相關聯之步驟之諸多其他方法。舉例而言,可改變方塊中之諸多方塊之次序,可組合特定方塊與其他方塊,且所闡述之方塊中之諸多方塊係選用的。此外,可在多個裝置當中執行在程序6200中所闡述之動作,該多個裝置包含(舉例而言)圖14之實例性立體視覺化攝影機300之資訊處理器模組1408及/或圖41之關節R1至R9及機器人臂控制器4106。在某些實例中,可由儲存於機器人臂控制器4106之記憶體4120中之一程式執行程序6200。在力施加至攝影機300時可週期性地執行實例性程序6200。舉例而言,程序6200可每個更新循環(其可係1毫秒(「ms」)、5 ms、8 ms、20 ms、50 ms等)對力/轉矩資料進行取樣。FIG. 62 shows a diagram illustrating an algorithm, routine, or procedure 6200 for providing auxiliary drive of the stereoscopic visualization camera 300, according to an example embodiment of the present disclosure. Although the procedure 6200 is described with reference to the flowchart illustrated in FIG. 62, it should be understood that many other methods of performing the steps associated with the procedure 6200 may be used. For example, the order of many of the blocks may be changed, certain blocks may be combined with other blocks, and many of the blocks described are optional. In addition, the actions described in the procedure 6200 may be performed among multiple devices including, for example, the information processor module 1408 of the example stereoscopic visualization camera 300 of FIG. 14 and/or the joints R1 to R9 and the robot arm controller 4106 of FIG. 41. In some examples, the procedure 6200 may be performed by a program stored in the memory 4120 of the robot arm controller 4106. The example procedure 6200 may be executed periodically while a force is applied to the camera 300. For example, the procedure 6200 may sample the force/torque data every update cycle (which may be 1 millisecond ("ms"), 5 ms, 8 ms, 20 ms, 50 ms, etc.).
在所圖解說明實施例中,處理器4102及/或機器人臂控制器4106自感測器3306接收與由一操作者對攝影機300施予之力有關之力/轉矩輸出資料6201。處理器4102及/或機器人臂控制器4106經組態以對所接收輸出資料6201進行濾波(方塊6202)。該輸出資料可包含一力及/或轉矩向量。該濾波可包含應用以搬運車振動為目標之一第一低通濾波器、一第二低通濾波器及/或一陷波濾波器。在其他實例中,處理器4102及/或機器人臂控制器4106可使用一單個低通濾波器及一陷波濾波器。In the illustrated embodiment, the processor 4102 and/or the robot arm controller 4106 receive, from the sensor 3306, force/torque output data 6201 related to a force applied to the camera 300 by an operator. The processor 4102 and/or the robot arm controller 4106 are configured to filter the received output data 6201 (block 6202). The output data may include a force and/or torque vector. The filtering may include applying a first low-pass filter, a second low-pass filter, and/or a notch filter that targets cart vibration. In other examples, the processor 4102 and/or the robot arm controller 4106 may use a single low-pass filter and a notch filter.
實例性處理器4102及/或機器人臂控制器4106亦自機器人臂506及/或耦合板3304中之一或多個關節感測器接收關節位置資料6203。處理器4102及/或機器人臂控制器4106使用關節位置資料6203來提供對經濾波力/轉矩輸出資料之補償(方塊6204)。該補償可包含重力補償及/或施力點補償。對於重力補償,自經濾波資料移除地球重力之效應。對於施力點補償,處理器4102及/或機器人臂控制器4106基於其中力施加至攝影機300 (例如,控制臂304)之一點而提供對經濾波資料之補償(及/或經重力補償資料)。如上文結合圖35所論述,感測器3306以與控制臂304之一角度位於某一偏移距離遠處。偏移距離及角度致使在控制臂304處施加之力在於感測器3306中偵測到時在方向及角度方面稍微移位。施力補償調整力值,好像力直接施加至感測器3306而非控制臂304。施力補償可基於感測器3306與控制臂304之間的一已知角度及/或距離而預定。總之,重力補償及施力點補償修改經濾波力/轉矩資料以形成與一操作者在攝影機之控制臂304處所提供之力/轉矩成比例之一力/轉矩向量。The example processor 4102 and/or robot arm controller 4106 also receive joint position data 6203 from one or more joint sensors in the robot arm 506 and/or the coupling plate 3304. The processor 4102 and/or robot arm controller 4106 use the joint position data 6203 to provide compensation for the filtered force/torque output data (block 6204). The compensation may include gravity compensation and/or force-application-point compensation. For gravity compensation, the effect of Earth's gravity is removed from the filtered data. For force-application-point compensation, the processor 4102 and/or robot arm controller 4106 provide compensation for the filtered data (and/or gravity-compensated data) based on a point at which force is applied to the camera 300 (for example, the control arm 304). As discussed above in conjunction with FIG. 35, the sensor 3306 is located at an offset distance and at an angle from the control arm 304. The offset distance and angle cause a force applied at the control arm 304 to be slightly shifted in direction and angle when detected in the sensor 3306. The force-application compensation adjusts the force values as if the force were applied directly to the sensor 3306 rather than the control arm 304. The force-application compensation may be predetermined based on a known angle and/or distance between the sensor 3306 and the control arm 304. In summary, gravity compensation and force-application-point compensation modify the filtered force/torque data to form a force/torque vector proportional to the force/torque provided by an operator at the control arm 304 of the camera.
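The two compensation steps can be illustrated with a minimal sketch. The masses, offsets, and frame conventions below are hypothetical stand-ins for the calibrated values a real system would use; the wrench-offset rule (subtract the cross product of the offset and the force from the moment) is the standard way to re-express a measured force/torque at a different point.

```python
def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def sub(a, b):
    return [x - y for x, y in zip(a, b)]

def compensate(force, torque, gravity_force, gravity_torque, r_handle):
    """Remove the camera's own weight, then re-express the wrench at the
    handle (control-arm) point offset by r_handle from the sensor origin."""
    f = sub(force, gravity_force)          # gravity compensation
    t = sub(torque, gravity_torque)
    t = sub(t, cross(r_handle, f))         # force-application-point shift
    return f, t

# Hypothetical numbers: 20 N of camera weight along -z as seen by the sensor,
# operator pushes 5 N along x at a handle 0.1 m below the sensor.
measured_f = [5.0, 0.0, -20.0]
measured_t = [0.0, -0.5, 0.0]             # moment from the 5 N at the offset
g_f, g_t = [0.0, 0.0, -20.0], [0.0, 0.0, 0.0]
f, t = compensate(measured_f, measured_t, g_f, g_t, [0.0, 0.0, -0.1])
print(f, [round(v, 3) for v in t])
```

After both corrections, only the operator's pure 5 N push remains, with no residual moment, which is the proportional force/torque vector described above.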
實例性處理器4102及/或機器人臂控制器4106亦連同經補償經濾波力/轉矩輸出資料使用關節位置資料6203以執行力/轉矩座標系與一全域座標系或機器人空間之間的一座標變換(方塊6206)。該變換可包含基於已知機器人空間及感測器3306之定向之一或多個預定義方程式或關係。實例性處理器4102及/或機器人臂控制器4106亦使用關節位置資料6203來執行立體視覺化攝影機300之一攝影機座標系與全域座標系或機器人空間之間的一座標變換(方塊6208)。該攝影機座標系之該座標變換可基於映射至機器人臂506之機器人空間之光學校準參數,如上文所闡述。The example processor 4102 and/or robot arm controller 4106 also use the joint position data 6203, along with the compensated, filtered force/torque output data, to perform a coordinate transformation between the force/torque coordinate system and a global coordinate system or robot space (block 6206). The transformation may include one or more predefined equations or relationships based on the known robot space and the orientation of the sensor 3306. The example processor 4102 and/or robot arm controller 4106 also use the joint position data 6203 to perform a coordinate transformation between a camera coordinate system of the stereoscopic visualization camera 300 and the global coordinate system or robot space (block 6208). The coordinate transformation of the camera coordinate system may be based on optical calibration parameters mapped to the robot space of the robot arm 506, as described above.
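A minimal sketch of the sensor-frame-to-global-frame transformation follows. The two-joint chain of pure z-axis rotations is an illustrative stand-in for the arm's actual kinematic chain; a real implementation would chain the full 4×4 link transforms derived from the joint position data 6203.

```python
import math

def rot_z(theta):
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def mat_vec(m, v):
    return [sum(m[i][k] * v[k] for k in range(3)) for i in range(3)]

def sensor_to_global(joint_angles, f_sensor):
    """Rotate a sensor-frame force into the global frame by chaining the
    rotations reported by the joint-position sensors.  A real arm chains
    full 4x4 link transforms; z-axis rotations stand in here."""
    r = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
    for th in joint_angles:
        r = mat_mul(r, rot_z(th))
    return mat_vec(r, f_sensor)

# Two joints at 45 degrees each: a +x push in the sensor frame becomes +y
# in the global frame.
f = sensor_to_global([math.pi / 4, math.pi / 4], [1.0, 0.0, 0.0])
print([round(v, 6) for v in f])
```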
在執行該等座標變換之後,實例性處理器4102及/或機器人臂控制器4106經組態以使用至少一個S型函數將力/轉矩向量轉換為一或多個平移/旋轉向量(方塊6210)。該(等)平移/旋轉向量之形成產生操作者之一既定方向之一推斷。平移與旋轉資訊用於判定將如何使機器人臂506之關節旋轉以反映、匹配及/或約計操作者之既定移動。After performing the coordinate transformations, the example processor 4102 and/or robot arm controller 4106 are configured to use at least one sigmoid function to convert the force/torque vector into one or more translation/rotation vectors (block 6210). The formation of the translation/rotation vector(s) produces an inference of the operator's intended direction. The translation and rotation information is used to determine how the joints of the robot arm 506 are to be rotated to reflect, match, and/or approximate the operator's intended movement.
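One plausible form of the sigmoid conversion is sketched below. The deadband, gain, and saturation values are illustrative assumptions, not values from the specification; the shape (no motion for small forces, smooth growth, saturation for large forces) is the point of the example.

```python
import math

def force_to_velocity(f, deadband=2.0, gain=1.5, v_max=0.05):
    """Map a compensated force component (N) to a translation rate (m/s)
    through a sigmoid: a deadband swallows sensor noise and hand tremor,
    and the curve saturates at v_max for large inputs."""
    if abs(f) < deadband:
        return 0.0
    # Shifted logistic, scaled so the output is 0 at the deadband edge.
    x = (abs(f) - deadband) * gain
    v = v_max * (2.0 / (1.0 + math.exp(-x)) - 1.0)
    return math.copysign(v, f)

print(force_to_velocity(1.0))                 # inside the deadband: no motion
print(round(force_to_velocity(4.0), 4))       # mid-range response
print(round(force_to_velocity(40.0), 4))      # saturates near v_max
```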
在某些實例中,實例性處理器4102及/或機器人臂控制器4106經組態以將機器人速度比例縮放應用於平移/旋轉向量(方塊6212)。該速度比例縮放可基於(舉例而言)機器人臂506之操作條件。舉例而言,一旦已開始一外科手術程序從而以一相對高速度比率阻止臂意外地撞擊手術室職員、儀器及/或患者,舉例而言,便可應用速度比例縮放。當尚未開始一程序時,實例性處理器4102及/或機器人臂控制器4106可在不存在一患者時針對機器人臂506之校準或設定應用較少速度比例縮放。In some instances, the example processor 4102 and/or robot arm controller 4106 are configured to apply robot speed scaling to the translation/rotation vectors (block 6212). The speed scaling may be based on, for example, operating conditions of the robot arm 506. For example, once a surgical procedure has begun, speed scaling may be applied at a relatively high ratio to prevent the arm from accidentally striking operating room staff, instruments, and/or the patient. When a procedure has not yet begun and no patient is present, the example processor 4102 and/or robot arm controller 4106 may apply less speed scaling for calibration or setup of the robot arm 506.
實例性處理器4102及/或機器人臂控制器4106基於經比例縮放平移/旋轉向量而判定機器人臂506之關節之可能移動順序。在評估可能序列時,處理器4102及/或機器人臂控制器4106識別用於避免之關節奇異點,因而排除機器人臂506之對應移動操作(方塊6214)。如上文所論述,奇異點可包含可易於發生遲滯及背隙之肘鎖或其他位置。處理器4102及/或機器人臂控制器4106經組態以在使用(舉例而言)亞可比運動學(例如,一亞可比矩陣之一反演)消除移動奇異點之後選擇一移動順序(方塊6216)。亞可比運動學方程式定義將如何基於經比例縮放平移/旋轉向量而使機器人臂506及/或耦合板3304之特定關節移動。亞可比運動學提供速度控制,而下文所論述之逆運動學提供位置控制。在某些實施例中,處理器4102及/或機器人臂控制器4106可替代地使用逆運動學或其他機器人臂控制常式。處理器4102及/或機器人臂控制器4106判定規定機器人臂及/或耦合板3304之特定關節將如何以一協調方式移動且規定(舉例而言)關節旋轉速度、關節旋轉方向及/或關節旋轉持續時間之一移動順序。該移動順序亦可規定其中將使機器人臂506及/或耦合板3304之關節旋轉之一序列。機器人臂及/或耦合板3304之關節R1至R9中之任一者可取決於移動順序而個別地旋轉或具有重疊移動。The example processor 4102 and/or robot arm controller 4106 determine possible movement sequences for the joints of the robot arm 506 based on the scaled translation/rotation vectors. When evaluating possible sequences, the processor 4102 and/or robot arm controller 4106 identify joint singularities to be avoided and accordingly exclude the corresponding movement operations of the robot arm 506 (block 6214). As discussed above, singularities may include toggle-lock or other positions that are prone to hysteresis and backlash. The processor 4102 and/or robot arm controller 4106 are configured to select a movement sequence after eliminating movement singularities using, for example, Jacobian kinematics (e.g., an inversion of a Jacobian matrix) (block 6216). The Jacobian kinematics equations define how particular joints of the robot arm 506 and/or coupling plate 3304 will be moved based on the scaled translation/rotation vectors. Jacobian kinematics provides velocity control, while the inverse kinematics discussed below provides position control. In some embodiments, the processor 4102 and/or robot arm controller 4106 may alternatively use inverse kinematics or other robot arm control routines. The processor 4102 and/or robot arm controller 4106 determine a movement sequence that specifies how particular joints of the robot arm and/or coupling plate 3304 will move in a coordinated manner and that specifies, for example, joint rotation speed, joint rotation direction, and/or joint rotation duration. The movement sequence may also specify a sequence in which the joints of the robot arm 506 and/or coupling plate 3304 are to be rotated. Any of the joints R1 to R9 of the robot arm and/or coupling plate 3304 may rotate individually or have overlapping movements depending on the movement sequence.
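The velocity-control step can be illustrated with a damped least-squares use of the Jacobian on a planar two-link stand-in for the arm. The link lengths and damping constant are illustrative assumptions; the nine-joint arm of the specification would use its full Jacobian, but the mapping from a desired tool velocity to joint rates is the same idea.

```python
import math

def jacobian_2link(q1, q2, l1=0.4, l2=0.3):
    """Planar 2-link Jacobian: end-effector (x, y) velocity per joint rate."""
    s1, c1 = math.sin(q1), math.cos(q1)
    s12, c12 = math.sin(q1 + q2), math.cos(q1 + q2)
    return [[-l1 * s1 - l2 * s12, -l2 * s12],
            [ l1 * c1 + l2 * c12,  l2 * c12]]

def damped_joint_rates(jac, v, lam=0.01):
    """Damped least squares: qdot = J^T (J J^T + lam^2 I)^-1 v.
    The damping term keeps the solve well-behaved near singularities."""
    a = [[sum(jac[i][k] * jac[j][k] for k in range(2)) for j in range(2)]
         for i in range(2)]
    a[0][0] += lam * lam
    a[1][1] += lam * lam
    det = a[0][0] * a[1][1] - a[0][1] * a[1][0]
    inv = [[ a[1][1] / det, -a[0][1] / det],
           [-a[1][0] / det,  a[0][0] / det]]
    w = [sum(inv[i][k] * v[k] for k in range(2)) for i in range(2)]
    return [sum(jac[k][i] * w[k] for k in range(2)) for i in range(2)]

# Ask for a 1 cm/s downward tool motion with the elbow bent 90 degrees.
jac = jacobian_2link(0.0, math.pi / 2)
qdot = damped_joint_rates(jac, [0.0, -0.01])
print([round(r, 4) for r in qdot])
```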
在判定一移動順序之後,處理器4102及/或機器人臂控制器4106經組態以使用關節速度比例縮放及/或邊界來執行碰撞避免。舉例而言,處理器4102及/或機器人臂控制器4106經組態以判定移動順序是否將致使機器人臂506及/或耦合板3304之一或多個關節及/或連桿接近一邊界或其他經定義笛卡爾限制,諸如在一患者或儀器周圍之空間。如上文結合圖49所論述,處理器4102及/或機器人臂控制器4106可比較依據移動順序對機器人空間中之連桿及/或關節之位置之估計與一或多個經定義邊界及/或角度限制。基於自一邊界之一距離,處理器4102及/或機器人臂控制器4106經由一比例值應用一或多個關節速度限制(方塊6218)。處理器4102及/或機器人臂控制器4106亦可應用阻止(舉例而言)機器人臂506之連桿撞擊彼此及/或阻止機器人臂506、耦合板3304及/或攝影機300延伸越過一邊界之一或多個關節位置限制(方塊6220)。恰好在位置限制前面之位置(例如,在一位置限制前面1公分(「cm」)、2 cm、10 cm等)及/或處於位置限制之位置可與笛卡爾機器人空間中之位置對應,其中比例因子之一值係「0」。After determining a movement sequence, the processor 4102 and/or robot arm controller 4106 are configured to perform collision avoidance using joint speed scaling and/or boundaries. For example, the processor 4102 and/or robot arm controller 4106 are configured to determine whether the movement sequence will cause one or more joints and/or links of the robot arm 506 and/or coupling plate 3304 to approach a boundary or other defined Cartesian limit, such as the space around a patient or instrument. As discussed above in conjunction with FIG. 49, the processor 4102 and/or robot arm controller 4106 may compare estimates of the positions of the links and/or joints in the robot space under the movement sequence against one or more defined boundaries and/or angle limits. Based on a distance from a boundary, the processor 4102 and/or robot arm controller 4106 apply one or more joint speed limits via a scale value (block 6218). The processor 4102 and/or robot arm controller 4106 may also apply one or more joint position limits that prevent, for example, the links of the robot arm 506 from striking each other and/or prevent the robot arm 506, the coupling plate 3304, and/or the camera 300 from extending past a boundary (block 6220). Positions just short of a position limit (for example, 1 centimeter ("cm"), 2 cm, 10 cm, etc. in front of a position limit) and/or positions at the position limit may correspond to positions in the Cartesian robot space where the value of the scale factor is "0".
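The distance-based scale factor can be sketched as a simple taper toward zero at the boundary. The 10 cm slow-zone width is an illustrative assumption; the specification only requires that the factor reach "0" at (and past) the position limit.

```python
def speed_scale(distance_m, slow_zone=0.10):
    """Scale factor applied to a commanded joint speed as the arm nears a
    Cartesian boundary: 1.0 outside the slow zone, tapering linearly to
    0.0 at (and beyond) the boundary itself."""
    if distance_m <= 0.0:
        return 0.0
    if distance_m >= slow_zone:
        return 1.0
    return distance_m / slow_zone

for d in (0.25, 0.05, 0.0, -0.02):
    print(speed_scale(d))
```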
在某些實例中,處理器4102及/或機器人臂控制器4106可在邊界提供為至方程式之一輸入之情況下執行亞可比運動學,其中給穿過靠近於一邊界之區之移動提供一較高成本因子。當判定一移動順序時,使用邊界成本因子致使處理器4102及/或機器人臂控制器4106避免靠近於邊界之位置(若可能)。成本因子可包含與相關聯於機器人空間中之一特定位置之一比例因子之一減小成反比。該比例因子可應用於每一關節/連桿,或針對機器人空間中之相同位置之每一關節可存在單獨比例因子。In some instances, the processor 4102 and/or robot arm controller 4106 may perform the Jacobian kinematics with the boundaries provided as an input to the equations, where movement through a region close to a boundary is given a higher cost factor. When determining a movement sequence, the use of boundary cost factors causes the processor 4102 and/or robot arm controller 4106 to avoid positions close to a boundary, if possible. The cost factor may be inversely proportional to a decrease in the scale factor associated with a particular position in the robot space. The scale factor may be applied to every joint/link, or there may be a separate scale factor for each joint at the same position in the robot space.
在提供碰撞避免之後,實例性處理器4102及/或機器人臂控制器4106經組態以提供對機器人臂506之相對快速反轉之校正(方塊6222)。處理器4102及/或機器人臂控制器4106可實施一零相位延遲演算法以拒斥迅速地致使一或多個關節改變旋轉方向之方向脈衝。零相位延遲演算法可由在一操作者太迅速地反轉方向之情況下阻止(舉例而言)機器人臂突然移動或搖擺之一濾波器實施。After providing collision avoidance, the example processor 4102 and/or the robotic arm controller 4106 are configured to provide corrections for the relatively fast reversal of the robotic arm 506 (block 6222). The processor 4102 and/or the robot arm controller 4106 may implement a zero phase delay algorithm to reject direction pulses that quickly cause one or more joints to change the direction of rotation. The zero phase delay algorithm can be implemented by a filter that prevents (for example) the robot arm from suddenly moving or swinging if the operator reverses the direction too quickly.
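The specification does not give the zero-phase-delay algorithm itself; the sketch below is one plausible stand-in, a persistence filter that passes a commanded direction reversal only after it lasts several update cycles, which achieves the stated goal of rejecting short direction pulses. The three-cycle hold is an assumed value.

```python
class ReversalGuard:
    """Reject short direction pulses: a sign flip in the commanded rate is
    passed through only after it persists for `hold` consecutive cycles,
    approximating the reversal smoothing described for the arm."""
    def __init__(self, hold=3):
        self.hold = hold
        self.sign = 0
        self.count = 0
    def step(self, v):
        s = (v > 0) - (v < 0)
        if s == 0:
            return 0.0
        if self.sign == 0:
            self.sign = s           # first nonzero command sets direction
        if s == self.sign:
            self.count = 0
            return v
        self.count += 1             # candidate reversal
        if self.count >= self.hold:
            self.sign, self.count = s, 0
            return v
        return 0.0

g = ReversalGuard(hold=3)
out = [g.step(v) for v in [1.0, 1.0, -1.0, 1.0, -1.0, -1.0, -1.0]]
print(out)  # the lone -1.0 pulse is suppressed; the sustained reversal passes
```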
如圖62中所圖解說明,實例性處理器4102及/或機器人臂控制器4106經組態以證實移動順序之命令(方塊6224)。處理器4102及/或機器人臂控制器4106可證實一命令以確保一命令(或指示一命令之信號)在一關節馬達之操作參數(例如,持續時間、旋轉速度等)內。處理器4102及/或機器人臂控制器4106亦可藉由比較一命令與當前臨限值而證實該命令以確保機器人臂506在移動順序之任一階段期間將不汲取過多電流。As illustrated in Figure 62, the example processor 4102 and/or the robotic arm controller 4106 are configured to validate the order of movement (block 6224). The processor 4102 and/or the robot arm controller 4106 may verify a command to ensure that a command (or a signal indicating a command) is within the operating parameters (eg, duration, rotation speed, etc.) of a joint motor. The processor 4102 and/or the robot arm controller 4106 can also verify a command by comparing the command with the current threshold to ensure that the robot arm 506 will not draw too much current during any stage of the movement sequence.
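The validation step can be sketched as a set of limit checks on each outgoing joint command. The field names and limit values below are hypothetical; the checks mirror the two validations described: operating parameters (speed, duration) and a current threshold.

```python
def validate_command(cmd, limits):
    """Check a joint command against motor operating limits and a current
    threshold before it is sent; all field names here are illustrative."""
    checks = [
        abs(cmd["speed_rad_s"]) <= limits["max_speed_rad_s"],
        0.0 < cmd["duration_s"] <= limits["max_duration_s"],
        cmd["est_current_a"] <= limits["max_current_a"],
    ]
    return all(checks)

limits = {"max_speed_rad_s": 0.8, "max_duration_s": 2.0, "max_current_a": 3.5}
ok = validate_command(
    {"speed_rad_s": 0.3, "duration_s": 0.5, "est_current_a": 1.2}, limits)
too_fast = validate_command(
    {"speed_rad_s": 1.1, "duration_s": 0.5, "est_current_a": 1.2}, limits)
print(ok, too_fast)
```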
實例性處理器4102及/或機器人臂控制器4106亦可將一或多個抗雜訊濾波器應用於移動命令或指示移動命令之信號(方塊6226)。濾波器可包含移除高頻率雜訊分量(其可在一關節馬達中引發暫態信號)之一高頻率低通濾波器。在任何濾波之後,處理器4102及/或機器人臂控制器4106根據移動順序經由一或多個信號或訊息將一或多個命令傳輸至機器人臂506及/或耦合板3304之適當關節馬達(方塊6228)。所傳輸命令致使各別關節處之馬達使機器人臂506及/或耦合板3304移動,因而致使攝影機300按照操作者所預期而移動。可重複實例性程序6200,只要一操作者將力施加至攝影機300。The example processor 4102 and/or the robot arm controller 4106 may also apply one or more anti-noise filters to the movement command or the signal indicating the movement command (block 6226). The filter may include a high-frequency low-pass filter that removes high-frequency noise components (which can induce transient signals in an articulated motor). After any filtering, the processor 4102 and/or the robot arm controller 4106 transmits one or more commands to the robot arm 506 and/or the appropriate joint motor of the coupling board 3304 via one or more signals or messages according to the movement sequence (block 6228). The transmitted commands cause the motors at the respective joints to move the robot arm 506 and/or the coupling plate 3304, thereby causing the camera 300 to move as expected by the operator. The example procedure 6200 can be repeated as long as an operator applies force to the camera 300.
圖63展示根據本發明之一實例性實施例之用於使用一輸入裝置1410使實例性視覺化攝影機300移動之一實例性程序6300之一圖式。實例性程序6300幾乎完全相同於圖62之程序6200,惟移除與感測器3306有關之方塊6202至6206除外。在所圖解說明實例中,自一輸入裝置1410 (諸如控制臂304上之按鈕、一腳踏板、操縱桿、觸控螢幕介面等)接收一控制輸入6301。控制輸入6301指示攝影機在機器人臂506之笛卡爾機器人空間中之方向性移動。FIG. 63 shows a diagram of an example program 6300 for using an input device 1410 to move the example visualization camera 300 according to an example embodiment of the present invention. The example procedure 6300 is almost identical to the procedure 6200 of FIG. 62, except that the blocks 6202 to 6206 related to the sensor 3306 are removed. In the illustrated example, a control input 6301 is received from an input device 1410 (such as a button on the control arm 304, a foot pedal, a joystick, a touch screen interface, etc.). The control input 6301 instructs the directional movement of the camera in the Cartesian robot space of the robot arm 506.
如圖63中圖解說明,控制輸入6301與來自機器人臂506及/或耦合板3304中之一或多個關節感測器之關節位置資料6203組合以用於執行自一攝影機座標系至一全域座標系及/或機器人空間之一座標變換(方塊6208)。實例性程序6300然後以與針對程序6200所論述的相同之方式繼續。處理器4102及/或機器人臂控制器4106因此基於自輸入裝置1410接收之控制輸入6301而致使機器人臂506、耦合板3304及/或攝影機300移動至一所要位置及/或定向。As illustrated in FIG. 63, the control input 6301 is combined with the joint position data 6203 from one or more joint sensors in the robot arm 506 and/or the coupling plate 3304 to perform a coordinate transformation from a camera coordinate system to a global coordinate system and/or robot space (block 6208). The example procedure 6300 then continues in the same manner as discussed for the procedure 6200. The processor 4102 and/or robot arm controller 4106 thus cause the robot arm 506, the coupling plate 3304, and/or the camera 300 to move to a desired position and/or orientation based on the control input 6301 received from the input device 1410.
F.鎖定至目標實施例F. Lock-to-Target Embodiments
在某些實施例中,處理器4102及/或機器人臂控制器4106經組態以執行由儲存於記憶體1570及/或4120中之指令定義之一或多個演算法、常式等以使得機器人臂506及/或耦合板3304能夠提供一鎖定至目標特徵。在此等實施例中,該鎖定至目標特徵藉由使得立體視覺化攝影機300能夠重定向同時鎖定至一目標外科手術部位上而使得機器人臂506能夠操作為一外科醫師之一延伸。如下文所闡述,處理器4102及/或機器人臂控制器4106經組態以監測由一操作者施予之力/轉矩/移動以及臂關節之位置以推斷一操作者之意圖且因此重定向機器人臂506及/或耦合板3304,使得攝影機300之焦點保持鎖定或固定。In some embodiments, the processor 4102 and/or robot arm controller 4106 are configured to execute one or more algorithms, routines, etc., defined by instructions stored in the memory 1570 and/or 4120, to enable the robot arm 506 and/or coupling plate 3304 to provide a lock-to-target feature. In these embodiments, the lock-to-target feature enables the robot arm 506 to operate as an extension of a surgeon by enabling the stereoscopic visualization camera 300 to be reoriented while remaining locked onto a target surgical site. As explained below, the processor 4102 and/or robot arm controller 4106 are configured to monitor the force/torque/movement applied by an operator, as well as the positions of the arm joints, to infer an operator's intent and accordingly reorient the robot arm 506 and/or coupling plate 3304 so that the focal point of the camera 300 remains locked or fixed.
鎖定至目標特徵使得攝影機300能夠藉由致使所有運動約束至一虛擬球體之表面而經重定向。攝影機300之尖端位於虛擬球體之一外表面(例如,虛擬球體之一頂部半球)處且攝影機300或目標外科手術部位之一焦點構成虛擬球體之一中心。實例性處理器4102及/或機器人臂控制器4106使得一操作者能夠使攝影機300在虛擬球體之一外表面上方移動同時使攝影機300指向球體之中心,因而在移動期間使目標外科手術部位保持對焦。鎖定至目標特徵使得一操作者能夠容易地且迅速地獲得相同目標部位之顯著不同視圖。Locking to the target feature allows the camera 300 to be redirected by causing all motion to be constrained to the surface of a virtual sphere. The tip of the camera 300 is located at an outer surface of the virtual sphere (for example, a top hemisphere of the virtual sphere) and a focal point of the camera 300 or the target surgical site constitutes a center of the virtual sphere. The example processor 4102 and/or robotic arm controller 4106 enables an operator to move the camera 300 over an outer surface of the virtual sphere while pointing the camera 300 at the center of the sphere, thereby keeping the target surgical site in focus during the movement . Locking to the target feature allows an operator to easily and quickly obtain significantly different views of the same target site.
圖64展示圖解說明根據本發明之一實例性實施例之用於提供立體視覺化攝影機300之一鎖定至目標之一演算法、常式或程序6400之一圖式。儘管參考圖64中所圖解說明之流程圖闡述程序6400,但應瞭解,可使用執行與程序6400相關聯之步驟之諸多其他方法。舉例而言,可改變方塊中之諸多方塊之次序,可組合特定方塊與其他方塊,且所闡述之方塊中之諸多方塊係選用的。此外,可在多個裝置當中執行程序6400中所闡述之動作,該多個裝置包含(舉例而言)圖14之實例性立體視覺化攝影機300之資訊處理器模組1408及/或圖41之關節R1至R9及機器人臂控制器4106。在某些實例中,可藉由儲存於機器人臂控制器4106之記憶體4120中之一程式執行程序6400。FIG. 64 shows a diagram illustrating an algorithm, routine, or program 6400 for providing one of the stereoscopic visualization cameras 300 to be locked to the target according to an exemplary embodiment of the present invention. Although the procedure 6400 is described with reference to the flowchart illustrated in FIG. 64, it should be understood that many other methods of performing the steps associated with the procedure 6400 can be used. For example, the order of the blocks in the block can be changed, a specific block can be combined with other blocks, and many of the blocks described are optional. In addition, the actions described in the procedure 6400 can be executed in a plurality of devices including, for example, the information processor module 1408 of the exemplary stereoscopic visualization camera 300 in FIG. 14 and/or the information processor module 1408 in FIG. 41 Joints R1 to R9 and robot arm controller 4106. In some instances, the program 6400 can be executed by a program stored in the memory 4120 of the robot arm controller 4106.
實例性程序6400類似於輔助驅動程序6200。然而,程序6400準備命令關節位置保持攝影機300之一焦點,而實例性程序6200準備關節速度計算。實例性程序6400判定由一操作者輸入之一所要力/移動向量且計算一旋轉變換,使得攝影機300之焦點保持固定同時使機器人臂506及/或耦合板3304之一或多個關節移動以重定向攝影機300。攝影機300之重定向使得一目標外科手術部位能夠自不同角度成像。當(舉例而言)一儀器阻擋一第一觀看路徑且外科醫師期望維持當前焦點時可需要重定向。The example procedure 6400 is similar to the assisted-drive procedure 6200. However, the procedure 6400 prepares commanded joint positions that maintain a focal point of the camera 300, whereas the example procedure 6200 prepares joint velocity calculations. The example procedure 6400 determines a desired force/movement vector input by an operator and calculates a rotational transformation such that the focal point of the camera 300 remains fixed while one or more joints of the robot arm 506 and/or coupling plate 3304 are moved to reorient the camera 300. The reorientation of the camera 300 enables a target surgical site to be imaged from different angles. Reorientation may be needed when, for example, an instrument blocks a first viewing path and the surgeon wishes to maintain the current focal point.
實例性程序6400在一操作者選擇輸入裝置1410上之鎖定至目標按鈕(此致使一指令訊息或信號傳輸至處理器4102及/或機器人臂控制器4106)時開始。在接收到該訊息之後,處理器4102及/或機器人臂控制器4106在其中工作距離及/或焦點保持固定同時使得一操作者能夠改變攝影機300之一定向(此致使機器人臂及/或耦合板3304之一或多個關節提供輔助移動)之一鎖定至目標模式中操作。當接收到一指令時,實例性處理器4102及/或機器人臂控制器4106可記錄攝影機300之當前工作距離、放大率、焦點及/或其他光學參數。處理器4102及/或機器人臂控制器4106亦可記錄FOV之一當前影像。The example procedure 6400 begins when an operator selects a lock-to-target button on the input device 1410, which causes a command message or signal to be transmitted to the processor 4102 and/or robot arm controller 4106. After receiving the message, the processor 4102 and/or robot arm controller 4106 operate in a lock-to-target mode in which the working distance and/or focal point remain fixed while an operator is enabled to change an orientation of the camera 300, causing one or more joints of the robot arm and/or coupling plate 3304 to provide assisted movement. When a command is received, the example processor 4102 and/or robot arm controller 4106 may record the current working distance, magnification, focal point, and/or other optical parameters of the camera 300. The processor 4102 and/or robot arm controller 4106 may also record a current image of the FOV.
在程序6400開始之後,處理器4102及/或機器人臂控制器4106自感測器3306接收與由一操作者施予攝影機300之力有關之力/轉矩輸出資料6201。如結合圖62所論述,處理器4102及/或機器人臂控制器4106對資料6201進行濾波且提供對資料6201之重力/施力補償(方塊6202及6204)。亦類似於圖62,處理器4102及/或機器人臂控制器4106連同經補償經濾波力/轉矩輸出資料使用關節位置資料6203以執行力/轉矩座標系與一全域座標系或機器人空間之間的一座標變換(方塊6206)。實例性處理器4102及/或機器人臂控制器4106亦使用關節位置資料6203來執行立體視覺化攝影機300之一攝影機座標系與全域座標系或機器人空間之間的一座標變換(方塊6208)。實例性處理器4102及/或機器人臂控制器4106亦執行自全域座標系或機器人空間至與一虛擬球體對應之球面座標之一變換(方塊6410)。After the procedure 6400 begins, the processor 4102 and/or robot arm controller 4106 receive, from the sensor 3306, force/torque output data 6201 related to force applied to the camera 300 by an operator. As discussed in conjunction with FIG. 62, the processor 4102 and/or robot arm controller 4106 filter the data 6201 and provide gravity/force-application compensation for the data 6201 (blocks 6202 and 6204). Also similar to FIG. 62, the processor 4102 and/or robot arm controller 4106 use the joint position data 6203, along with the compensated, filtered force/torque output data, to perform a coordinate transformation between the force/torque coordinate system and a global coordinate system or robot space (block 6206). The example processor 4102 and/or robot arm controller 4106 also use the joint position data 6203 to perform a coordinate transformation between a camera coordinate system of the stereoscopic visualization camera 300 and the global coordinate system or robot space (block 6208). The example processor 4102 and/or robot arm controller 4106 also perform a transformation from the global coordinate system or robot space to spherical coordinates corresponding to a virtual sphere (block 6410).
在座標變換之後,實例性處理器4102及/或機器人臂控制器4106經組態以基於(舉例而言)攝影機300之一操作模式而對軌跡速度進行比例縮放(方塊6412)。該比例縮放可類似於在圖62之方塊6212處執行之比例縮放。圖64之實例性程序6400藉由處理器4102及/或機器人臂控制器4106計算一球體端點而繼續(方塊6414)。該球體端點之計算提供關於操作者之所要移動方向之一推斷且判定攝影機300將在虛擬球體上方移動多遠而不使該球體旋轉。After the coordinate transformation, the example processor 4102 and/or the robot arm controller 4106 are configured to scale the trajectory speed based on, for example, one of the operating modes of the camera 300 (block 6412). The scaling may be similar to the scaling performed at block 6212 in FIG. 62. The example program 6400 of FIG. 64 continues by the processor 4102 and/or the robot arm controller 4106 calculating the end points of a sphere (block 6414). The calculation of the end of the sphere provides an inference about one of the operator's desired moving directions and determines how far the camera 300 will move over the virtual sphere without rotating the sphere.
圖65展示圖解說明根據本發明之一實例性實施例之用於鎖定至目標特徵之一虛擬球體6500之一圖式。如圖65中所展示,立體視覺化攝影機300基於如依據關節位置資料6203所判定之一當前位置而虛擬地放置於球體6500上。攝影機300之一觀看向量指向位於球體6500之一中心中之一尖端(指定為xyz目標)。處理器4102及/或機器人臂控制器4106經組態以使用所變換力/轉矩資料來判定球體上之攝影機300將如何沿著球體6500之一表面移動同時使觀看向量維持指向xyz目標,其中由係旋轉球面角「v」及「u」之一函數之一方程式給出球體上之任一給定點。當使用力/轉矩資料時,處理器4102及/或機器人臂控制器4106使用與平移力對應之一「x」及「y」分量來直接判定攝影機300將如何在虛擬球體6500上移動以判定球體端點。FIG. 65 shows a diagram illustrating a virtual sphere 6500 for the lock-to-target feature according to an exemplary embodiment of the present invention. As shown in FIG. 65, the stereoscopic visualization camera 300 is virtually placed on the sphere 6500 based on a current position as determined from the joint position data 6203. A viewing vector of the camera 300 points to a tip (designated as the xyz target) located at a center of the sphere 6500. The processor 4102 and/or robot arm controller 4106 are configured to use the transformed force/torque data to determine how the camera 300 will move along a surface of the sphere 6500 while keeping the viewing vector pointed at the xyz target, where any given point on the sphere is given by an equation that is a function of the spherical rotation angles "v" and "u". When using force/torque data, the processor 4102 and/or robot arm controller 4106 use "x" and "y" components corresponding to the translational forces to directly determine how the camera 300 will move on the virtual sphere 6500 to determine the sphere endpoint.
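The spherical parameterization can be sketched directly. The names u and v follow the angle naming above; the azimuth/elevation convention chosen here is an assumption, since the specification does not fix one.

```python
import math

def sphere_point(center, radius, u, v):
    """Point on the virtual sphere for spherical angles u (azimuth) and
    v (elevation), with the camera's view vector aimed back at the center."""
    cx, cy, cz = center
    x = cx + radius * math.cos(v) * math.cos(u)
    y = cy + radius * math.cos(v) * math.sin(u)
    z = cz + radius * math.sin(v)
    view = [(cx - x) / radius, (cy - y) / radius, (cz - z) / radius]
    return (x, y, z), view

# Camera at the top of a 0.3 m sphere centred on the surgical target:
# the view vector points straight down at the center.
pos, view = sphere_point((0.0, 0.0, 0.0), 0.3, 0.0, math.pi / 2)
print([round(c, 6) for c in pos], [round(c, 6) for c in view])
```

Sliding u and v moves the camera over the sphere's surface while the recomputed view vector stays aimed at the xyz target, which is the constraint the lock-to-target feature maintains.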
處理器4102及/或機器人臂控制器4106可針對不同輸入以不同方式判定球體端點。舉例而言,若經由輸入裝置1410接收一輸入,如圖63中所展示,則處理器4102及/或機器人臂控制器4106將「向上」、「向下」、「左」及「右」(其提供為x向量、y向量)自攝影機座標轉換至機器人空間座標。類似於力/轉矩資料,處理器4102及/或機器人臂控制器4106使用x向量、y向量來直接判定攝影機300將如何在虛擬球體6500上移動以判定球體端點。應瞭解,在其中經由輸入裝置接收輸入之例項中,可省略連同方塊6202至6206所論述之操作,如圖63中所展示。The processor 4102 and/or the robot arm controller 4106 can determine the end of the sphere in different ways for different inputs. For example, if an input is received via the input device 1410, as shown in FIG. 63, the processor 4102 and/or the robot arm controller 4106 will "up", "down", "left" and "right" ( It is provided as x vector, y vector) from the camera coordinates to the robot space coordinates. Similar to the force/torque data, the processor 4102 and/or the robot arm controller 4106 use the x vector and the y vector to directly determine how the camera 300 will move on the virtual sphere 6500 to determine the end of the sphere. It should be understood that in the example in which input is received via the input device, the operations discussed in connection with blocks 6202 to 6206 may be omitted, as shown in FIG. 63.
在某些實例中,處理器4102及/或機器人臂控制器4106經組態以接收軌道輸入資料。在此等實例中,處理器4102及/或機器人臂控制器4106使球面角「v」保持恆定同時使移動沿著虛擬球體6500之球面角「u」反覆。沿著球面角「u」之反覆移動使得球體端點能夠針對軌道輸入經判定。應瞭解,雖然輸入應用於虛擬球體6500,但在其他實例中,輸入可應用於其他形狀。舉例而言,虛擬球體6500替代地可定義為一虛擬圓柱體、一橢圓體、一蛋型、一角錐體/截頭錐體等。In some instances, the processor 4102 and/or robot arm controller 4106 are configured to receive orbit input data. In these instances, the processor 4102 and/or robot arm controller 4106 hold the spherical angle "v" constant while iterating the movement along the spherical angle "u" of the virtual sphere 6500. The iterative movement along the spherical angle "u" enables the sphere endpoint to be determined for the orbit input. It should be understood that although the input is applied to the virtual sphere 6500, in other instances the input may be applied to other shapes. For example, the virtual sphere 6500 may alternatively be defined as a virtual cylinder, an ellipsoid, an egg shape, a pyramid/frustum, etc.
在其他實例中,處理器4102及/或機器人臂控制器4106經組態以接收位準範疇輸入資料。在此等實例中,處理器4102及/或機器人臂控制器4106使球面角「u」保持恆定同時使移動沿著虛擬球體6500之球面角「v」反覆。沿著球面角「v」之反覆移動致使攝影機300移動至虛擬球體6500之一頂部。In other instances, the processor 4102 and/or robot arm controller 4106 are configured to receive level category input data. In these instances, the processor 4102 and/or robot arm controller 4106 hold the spherical angle "u" constant while iterating the movement along the spherical angle "v" of the virtual sphere 6500. The iterative movement along the spherical angle "v" causes the camera 300 to move to a top of the virtual sphere 6500.
返回至圖64,在判定球體端點之後,處理器4102及/或機器人臂控制器4106經組態以計算在已使攝影機300沿著虛擬球體移動至所判定端點之後使攝影機300維持鎖定在x,y,z目標處所需要之一旋轉量(方塊6416)。處理器4102及/或機器人臂控制器4106亦可在此計算期間提供抗側傾校正(方塊6418)。換言之,處理器4102及/或機器人臂控制器4106經組態以判定攝影機300將如何定向(給定其在虛擬球體6500上之新位置)使得攝影機300之觀看向量或尖端提供於指向虛擬球體6500之一中心(其對應於一目標外科手術部位或焦點)之相同x,y,z目標處。Returning to Figure 64, after determining the end of the sphere, the processor 4102 and/or the robot arm controller 4106 are configured to calculate that the camera 300 remains locked after the camera 300 has been moved along the virtual sphere to the determined end point. The amount of rotation required for the x, y, and z target (block 6416). The processor 4102 and/or the robot arm controller 4106 may also provide anti-roll correction during this calculation (block 6418). In other words, the processor 4102 and/or the robot arm controller 4106 are configured to determine how the camera 300 will be oriented (given its new position on the virtual sphere 6500) so that the viewing vector or tip of the camera 300 is provided to point to the virtual sphere 6500 A center (which corresponds to a target surgical site or focus) at the same x, y, z target.
在此步驟期間,處理器4102及/或機器人臂控制器4106判定達成所要定向所需要之機器人臂506及/或耦合板3304之關節角度。在於方塊6414中計算x,y,z球體端點之後,處理器4102及/或機器人臂控制器4106判定攝影機300之側滾及縱傾量。在某些實施例中,計算係一個兩步過程。首先,處理器4102及/或機器人臂控制器4106計算在不具有旋轉之情況下提供攝影機300之移動(給定x,y,z球體端點)之一初始4×4變換矩陣T。然後,處理器4102及/或機器人臂控制器4106計算區域側滾及縱傾量使得攝影機300保持鎖定在位於x,y,z處(及/或定位於x,y,z球體端點處)之一目標處以用於後續關節旋轉循環。處理器4102及/或機器人臂控制器4106可在下文使用方程式(4)及(5)來計算側滾及縱傾量,其中Tnext對應於一4×4變換矩陣。可以每一更新循環(例如,8 ms)執行計算。During this step, the processor 4102 and/or robot arm controller 4106 determine the joint angles of the robot arm 506 and/or coupling plate 3304 needed to achieve the desired orientation. After calculating the x, y, z sphere endpoint in block 6414, the processor 4102 and/or robot arm controller 4106 determine roll and pitch amounts for the camera 300. In some embodiments, the calculation is a two-step process. First, the processor 4102 and/or robot arm controller 4106 calculate an initial 4×4 transformation matrix T that provides the movement of the camera 300 (given the x, y, z sphere endpoint) without rotation. Then, the processor 4102 and/or robot arm controller 4106 calculate local roll and pitch amounts such that the camera 300 remains locked on a target located at x, y, z (and/or positioned at the x, y, z sphere endpoint) for subsequent joint rotation cycles. The processor 4102 and/or robot arm controller 4106 may use equations (4) and (5) below to calculate the roll and pitch amounts, where Tnext corresponds to a 4×4 transformation matrix. The calculation may be performed every update cycle (for example, every 8 ms).
在以上方程式(4)中,xtarget_next、ytarget_next及ztarget_next係對Tnext矩陣之約束。以上約束規定側滾角度及縱傾角度經選擇使得以上x,y,z方程式係有效的。換言之,一目標在關節旋轉之一接下來更新循環處之x,y,z位置必須等於目標在當前循環中之x,y,z位置。該等約束使得攝影機300能夠經由側滾角度及縱傾角度旋轉但相對於x,y,z位置保持鎖定。In equation (4) above, xtarget_next, ytarget_next, and ztarget_next are constraints on the Tnext matrix. The above constraints specify that the roll and pitch angles are selected such that the above x, y, z equations hold. In other words, the x, y, z position of a target at the next update cycle of a joint rotation must equal the x, y, z position of the target in the current cycle. These constraints enable the camera 300 to rotate via the roll and pitch angles while remaining locked with respect to the x, y, z position.
此外,方程式(5)之第一矩陣之底部列上之-sinθ對應於一縱傾角度,而第二矩陣之底部列上之sinθ對應於一側滾角度。給定函數cos(roll),可存在縱傾之一閉型表達。處理器4102及/或機器人臂控制器4106可使用一反覆方法來估計作為函數cos(roll)來計算之側滾,其中縱傾等於fn(cos(roll))以針對以上方程式產生一正確側滾/縱傾解對。In addition, the -sinθ in the bottom row of the first matrix of equation (5) corresponds to a pitch angle, while the sinθ in the bottom row of the second matrix corresponds to a roll angle. Given the function cos(roll), a closed-form expression for the pitch can exist. The processor 4102 and/or robot arm controller 4106 may use an iterative method to estimate the roll, computed as a function of cos(roll), with the pitch equal to fn(cos(roll)), to produce a correct roll/pitch solution pair for the above equations.
在依據結合方塊6416及6418所闡述之操作計算側滾及縱傾量之後,實例性處理器4102及/或機器人臂控制器4106經組態以提供奇異點避免且計算逆運動學以判定關節旋轉從而除攝影機300沿著虛擬球體6500之新x,y,z位置之外亦達成側傾及縱傾量(方塊6214及6420)。逆運動學之計算使得處理器4102及/或機器人臂控制器4106能夠判定機器人臂506及/或耦合板3304之關節之一移動順序。After calculating the amount of roll and pitch according to the operations described in conjunction with blocks 6416 and 6418, the example processor 4102 and/or the robot arm controller 4106 are configured to provide singularity avoidance and calculate inverse kinematics to determine joint rotation Thus, in addition to the new x, y, and z positions of the camera 300 along the virtual sphere 6500, the amount of roll and pitch is also achieved (blocks 6214 and 6420). The calculation of inverse kinematics enables the processor 4102 and/or the robot arm controller 4106 to determine the movement sequence of one of the joints of the robot arm 506 and/or the coupling plate 3304.
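The position-control step can be illustrated with the closed-form inverse kinematics of a planar two-link stand-in. The real arm's nine joints require a numerical IK solve with singularity avoidance as described above; the link lengths here are illustrative, and only the elbow-down branch is computed.

```python
import math

def ik_2link(x, y, l1=0.4, l2=0.3):
    """Closed-form inverse kinematics for a planar 2-link arm: joint angles
    that place the tip at (x, y).  Illustrates the position-control step."""
    d2 = x * x + y * y
    c2 = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if abs(c2) > 1.0:
        raise ValueError("target out of reach")
    q2 = math.acos(c2)                      # elbow-down solution
    q1 = math.atan2(y, x) - math.atan2(l2 * math.sin(q2),
                                       l1 + l2 * math.cos(q2))
    return q1, q2

q1, q2 = ik_2link(0.4, 0.3)
# Forward check: the angles must reproduce the requested tip position.
fx = 0.4 * math.cos(q1) + 0.3 * math.cos(q1 + q2)
fy = 0.4 * math.sin(q1) + 0.3 * math.sin(q1 + q2)
print(round(fx, 6), round(fy, 6))
```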
除關節速度限制及/或位置限制之外,實例性處理器4102及/或機器人臂控制器4106可針對移動順序應用誤差校正(方塊6418、6218、6220)。如上文結合圖62所論述,該等限制及誤差校正可阻止機器人臂506、攝影機300及/或耦合板3304撞到自身、超過一或多個邊界及/或在可接受關節位置內。處理器4102及/或機器人臂控制器4106亦可證實用於移動順序之關節之命令提供抗雜訊濾波,然後基於移動順序將該等命令(或指示該等命令之信號)發送至機器人臂506及/或耦合板3304之一或多個關節R1至R9 (方塊6224、6226、6228)。若未偵測到其他移動,則實例性程序6400然後可結束。否則,在接收到操作者輸入時以週期性間隔(例如,10 ms、20 ms等)重複程序6400。In addition to the joint speed limits and/or position limits, the example processor 4102 and/or robot arm controller 4106 may apply error correction to the movement sequence (blocks 6418, 6218, 6220). As discussed above in connection with FIG. 62, these limits and error corrections can prevent the robot arm 506, the camera 300, and/or the coupling plate 3304 from striking themselves, exceeding one or more boundaries, and/or leaving acceptable joint positions. The processor 4102 and/or the robot arm controller 4106 may also validate the joint commands of the movement sequence and provide anti-noise filtering before sending the commands (or signals indicative of the commands), based on the movement sequence, to one or more of the joints R1 to R9 of the robot arm 506 and/or the coupling plate 3304 (blocks 6224, 6226, 6228). If no further movement is detected, the example procedure 6400 may then end. Otherwise, the procedure 6400 is repeated at periodic intervals (e.g., 10 ms, 20 ms, etc.) while operator input is received.
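The per-cycle speed- and position-limit checks applied before commands are sent to joints R1 to R9 can be sketched as a simple clamping step. The limit values and update period below are illustrative assumptions, not values from the specification:

```python
def limit_joint_command(q_cur, q_cmd, q_min, q_max, v_max, dt):
    """Clamp one joint's commanded position for a single update cycle.
    First limit the step to v_max * dt (speed limit), then clamp the
    resulting position into [q_min, q_max] (position limit)."""
    step = q_cmd - q_cur
    max_step = v_max * dt
    step = max(-max_step, min(max_step, step))  # speed (velocity) limit
    q_new = q_cur + step
    return max(q_min, min(q_max, q_new))        # position limit
```

For example, with a 2 rad/s speed limit and a 10 ms cycle, a request to jump from 0 to 1 rad is limited to a 0.02 rad step; a request that would overshoot the joint's upper position limit is clamped to that limit.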
在某些實施例中,處理器4102及/或機器人臂控制器4106可為儀器提供鎖定至目標追蹤。在此等實例中,用與一移動目標對應之一動態軌跡替換虛擬球體6500之一中心之xyz目標。舉例而言,此一特徵可達成脊柱工具之一追蹤。在此等實施例中,一儀器可包含一或多個基準及/或其他標記。實例性立體視覺化攝影機300記錄包含該等基準之影像。處理器4102及/或機器人臂控制器4106可執行自攝影機座標系空間至機器人空間之一座標變換以判定如何使儀器沿著x軸、y軸、z軸移動。實例性處理器4102及/或機器人臂控制器4106追蹤基準如何在影像中移動且判定對應x、y、z移動向量。在某些例項中,x、y、z向量可輸入至圖64之方塊6414之球體端點計算中以改變虛擬球體6500之一中心之位置。回應於球體6500之一移動,處理器4102及/或機器人臂控制器4106判定機器人臂506將如何定位以維持與新目標位置相同之工作距離及/或定向。處理器4102及/或機器人臂控制器4106然後可應用逆運動學以判定機器人臂506及/或耦合板之關節旋轉以追蹤目標之移動。類似於程序6200及6400,處理器4102及/或機器人臂控制器4106可在將如一所判定移動順序中所規定之命令發送至關節之前應用誤差校正、關節限制、濾波器及/或證實。In certain embodiments, the processor 4102 and/or the robot arm controller 4106 may provide lock-to-target tracking for an instrument. In these examples, the x, y, z target at the center of the virtual sphere 6500 is replaced with a dynamic trajectory corresponding to a moving target. For example, this feature may enable tracking of a spine tool. In these embodiments, an instrument may include one or more fiducials and/or other markers. The example stereoscopic visualization camera 300 records images that include the fiducials. The processor 4102 and/or the robot arm controller 4106 may perform a coordinate transformation from the camera coordinate system space to the robot space to determine how the instrument moves along the x-axis, y-axis, and z-axis. The example processor 4102 and/or robot arm controller 4106 tracks how the fiducials move in the images and determines corresponding x, y, z movement vectors. In some instances, the x, y, z vectors may be input into the sphere endpoint calculation of block 6414 of FIG. 64 to change the position of the center of the virtual sphere 6500. In response to a movement of the sphere 6500, the processor 4102 and/or the robot arm controller 4106 determines how the robot arm 506 is to be positioned to maintain the same working distance and/or orientation relative to the new target position. The processor 4102 and/or the robot arm controller 4106 may then apply inverse kinematics to determine the joint rotations of the robot arm 506 and/or the coupling plate to track the target's movement. Similar to procedures 6200 and 6400, the processor 4102 and/or the robot arm controller 4106 may apply error correction, joint limits, filters, and/or validation before sending the commands specified in a determined movement sequence to the joints.
總結Summary
將瞭解,可使用一或多個電腦程式或組件來實施本文中所闡述之系統、結構、方法及程序中之每一者。此等程式及組件可作為一系列電腦指令提供於任何習用電腦可讀媒體(包含隨機存取記憶體(「RAM」)、唯讀記憶體(「ROM」)、快閃記憶體、磁碟或光碟、光學記憶體或其他儲存媒體以及其組合及衍生物)上。該等指令可經組態以由一處理器執行,該處理器在執行該系列電腦指令時執行或促進所揭示方法及程序之全部或一部分之效能。It will be appreciated that each of the systems, structures, methods, and procedures described herein may be implemented using one or more computer programs or components. These programs and components may be provided as a series of computer instructions on any conventional computer-readable medium, including random access memory ("RAM"), read-only memory ("ROM"), flash memory, magnetic or optical disks, optical memory, or other storage media, and combinations and derivatives thereof. The instructions may be configured to be executed by a processor, which, when executing the series of computer instructions, performs or facilitates the performance of all or part of the disclosed methods and procedures.
應理解,熟習此項技術者將明瞭對本文中所闡述之實例性實施例之各種改變及修改。可在不背離本發明標的物之精神及範疇之情況下且在不削弱其既定優勢之情況下做出此等改變及修改。因此,此等改變及修改意欲由隨附申請專利範圍涵蓋。此外,與當前美國法律一致,應瞭解,不意欲調用35 U.S.C. 112(f)或前AIA 35 U.S.C. 112 (段落6),除非術語「構件」或「步驟」明確地陳述於申請專利範圍中。因此,申請專利範圍不意欲限於說明書或其等效內容中所闡述之對應結構、材料或動作。It should be understood that various changes and modifications to the example embodiments described herein will be apparent to those skilled in the art. Such changes and modifications can be made without departing from the spirit and scope of the present subject matter and without diminishing its intended advantages. It is therefore intended that such changes and modifications be covered by the appended claims. Moreover, consistent with current U.S. law, it should be appreciated that 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, paragraph 6, is not intended to be invoked unless the terms "means" or "step" are explicitly recited in the claims. Accordingly, the claims are not intended to be limited to the corresponding structures, materials, or acts described in the specification or their equivalents.