`Cvetko et al.
`
(10) Patent No.: US 11,004,271 B2
(45) Date of Patent: *May 11, 2021
`
`(54) AUGMENTING REAL-TIME VIEWS OF A
`PATIENT WITH THREE-DIMENSIONAL
`DATA
`
`(71) Applicant: Novarad Corporation, American Fork,
`UT (US)
`
`(72)
`
`Inventors: Steven Cvetko, Draper, UT (US);
`Wendell Arlen Gibby, Mapleton, UT
`(US)
`
`(73) Assignee: NOVARAD CORPORATION,
`American Fork, UT (US)
`
`(56)
`
`References Cited
`
`U.S. PATENT DOCUMENTS
`
`8,657,809 B2
`8,830,263 B2
`
`2/2014 Schoepp
`9/2014 Kohara et al.
`(Continued)
`
`FOREIGN PATENT DOCUMENTS
`
DE  102012025374 A1    7/2014
JP  2005-500096 A2     1/2001
(Continued)
`
(*) Notice: Subject to any disclaimer, the term of this
patent is extended or adjusted under 35
U.S.C. 154(b) by 0 days.

This patent is subject to a terminal dis-
claimer.
`
`OTHER PUBLICATIONS
`
`United States Patent and Trademark Office; International Search
`Report and Written Opinion issued in PCT Application No. PCT/
`US2018/022921, dated Jul. 5, 2018; 11 pages.
`(Continued)
`
(21) Appl. No.: 16/574,524

(22) Filed: Sep. 18, 2019

(65) Prior Publication Data

US 2020/0013224 A1    Jan. 9, 2020
`
`Related U.S. Application Data
`
`(63) Continuation of application No. 15/894,595, filed on
`Feb. 12, 2018, now Pat. No. 10,475,244, which is a
`(Continued)
`
(51) Int. Cl.
G06T 19/00    (2011.01)
A61B 5/00     (2006.01)
(Continued)
`
(52) U.S. Cl.
CPC ....... G06T 19/006 (2013.01); A61B 5/0015 (2013.01); A61B 5/0071 (2013.01);
(Continued)
`(58) Field of Classification Search
`None
`See application file for complete search history.
`
`Primary Examiner —Ryan M Gray
`(74) Attorney, Agent, or Firm — Maschoff Brennan
`
`(57)
`
`ABSTRACT
`
`Augmenting real-time views of a patient with three-dimen-
`sional (3D) data. In one embodiment, a method may include
`identifying 3D data for a patient with the 3D data including
`an outer layer and multiple inner layers, determining virtual
`morphometric measurements of the outer layer from the 3D
`data, registering a real-time position of the outer layer of the
`patient in a 3D space, determining real-time morphometric
`measurements of the outer layer of the patient, automatically
`registering the position of the outer layer from the 3D data
`to align with the registered real-time position of the outer
`layer of the patient in the 3D space using the virtual
`morphometric measurements and using the real-time mor-
`phometric measurements, and displaying, in an augmented
`reality (AR) headset, one of the inner layers from the 3D
`data projected onto real-time views of the outer layer of the
`patient.
`
`20 Claims, 11 Drawing Sheets
`
`102-\
`
`5-122
`
`02
`
`106
`
`116-\ 107.)_118,\
`
`( -1066
`
`foe
`
`418
`
`Network
`170
`
`Server
`112
`
`04
`
`Medivis Exhibit 1001
`
`1
`
`
`
`
`Related U.S. Application Data
`
`continuation of application No. 15/474,702, filed on
`Mar. 30, 2017, now Pat. No. 9,892,564.
`
`(2006.01)
`(2006.01)
`(2011.01)
`(2011.01)
`(2017.01)
`(2017.01)
`(2006.01)
`(2018.01)
`(2018.01)
`(2018.01)
`
`(51) Int. Cl.
`A61B 7/00
`A61B 5/107
`GO6T 19/20
`GO6T 15/04
`GO6T 7/246
`GO6T 7/73
`HO4N 7/14
`G16H 20/40
`G16H 30/40
`G16H 40/63
(52) U.S. Cl.
CPC ....... A61B 5/0077 (2013.01); A61B 5/107 (2013.01); A61B 5/742 (2013.01); A61B 7/00
(2013.01); G06T 7/248 (2017.01); G06T 7/73 (2017.01); G06T 15/04 (2013.01); G06T 19/20
(2013.01); G16H 20/40 (2018.01); G16H 30/40 (2018.01); G16H 40/63 (2018.01);
H04N 7/147 (2013.01); A61B 2562/0219 (2013.01); A61B 2562/0223 (2013.01); G06T
2200/04 (2013.01); G06T 2207/10016 (2013.01); G06T 2207/10024 (2013.01); G06T
2207/30024 (2013.01); G06T 2207/30088 (2013.01); G06T 2207/30204 (2013.01); G06T
2210/41 (2013.01); G06T 2215/16 (2013.01); G06T 2219/2004 (2013.01); G06T 2219/2012
(2013.01)
`
`(56)
`
`References Cited
`
`Chuanggui
`9/2005
`2005/0215879 Al
`Lavallee et al.
`8/2006
`2006/0173290 Al
`Agusanto et al.
`10/2007
`2007/0236514 Al
`Tuma et al.
`4/2010
`2010/0100081 Al
`Wendler et al.
`10/2010
`2010/0266171 Al
`Fuchs et al.
`2/2011
`2011/0046483 Al
`Takahashi
`5/2011
`2011/0102549 Al
`5/2012 Kohara et al.
`2012/0127200 Al
`3/2013 Yang et al.
`2013/0060146 Al
`2013/0177229 Al * 7/2013 Inoue
`
`2013/0245461 Al *
`
`9/2013 Maier-Hein
`
`2014/0132605 Al *
`
`5/2014
`
`Tsukagoshi
`
`2014/0142426 Al
`2014/0222462 Al
`2014/0243614 Al
`2014/0275760 Al
`2014/0276001 Al
`2014/0300632 Al
`2015/0049083 Al
`2016/0078669 Al
`2016/0148052 Al
`2016/0154620 Al
`2016/0225192 Al
`2016/0235402 Al
`2016/0302747 Al
`2017/0231714 Al
`2017/0281297 Al
`2018/0020992 Al *
`
`5/2014 Razzaque et al.
`8/2014 Shakil et al.
`8/2014 Rothberg et al.
`9/2014 Lee et al.
`9/2014 Ungi et al.
`10/2014 Laor
`2/2015 Bidne et al.
`3/2016 Lin
`5/2016
`Tsuda et al.
`6/2016
`Tsuda et al.
`8/2016 Jones et al.
`8/2016 Chowaniec et al.
`10/2016 Averbuch
`8/2017 Kosmecki et al.
`10/2017
`Tuma et al.
`1/2018
`Guo
`
` A61B 8/483
`382/131
` A61B 5/742
`600/476
` G06T 19/20
`345/424
`
` A61B 6/032
`600/424
`
`2018/0137690 Al
`2018/0286132 Al
`2018/0289344 Al *
`2018/0303558 Al *
`2018/0338814 Al*
`2019/0246088 Al
`2021/0022808 Al
`2021/0037224 Al
`
`5/2018 Coffey et al.
`10/2018 Cvetko et al.
`10/2018
`Green
`10/2018
`Thomas
`11/2018
`Saget
`8/2019 Casas
`1/2021 Lang
`2/2021 Casas
`
` A61B 18/1815
` A61B 34/20
` G06T 19/006
`
`U.S. PATENT DOCUMENTS
`
`FOREIGN PATENT DOCUMENTS
`
`9,248,000 B2
`9,436,993 B1
`9,538,962 B1
`9,675,319 B1
`9,861,446 B2
`9,892,564 B1
`9,980,780 B2
`10,010,379 Bl
`10,028,727 B2 *
`10,052,170 B2 *
`10,154,239 B2
`10,159,530 B2
`10,194,131 B2
`10,278,777 B1
`10,292,768 B2
`10,326,975 B2
`10,368,947 B2
`10,405,927 B1
`10,511,822 B2
`10,531,852 B2 *
`10,594,998 B1
`10,602,114 B2
`10,603,113 B2
`10,742,949 B2
`10,743,939 B1
`10,799,296 B2
`10,841,556 B2
`10,849,693 B2
`2004/0070611 Al
`2004/0254456 Al
`2005/0203367 Al
`
`2/2016 Sarvestani et al.
`9/2016 Stolka et al.
`1/2017 Hannaford et al.
`6/2017 Razzaque et al.
`1/2018 Lang
`2/2018 Cvetko et al.
`5/2018 Lang
`7/2018 Gibby et al.
`7/2018 Inoue
`8/2018 Saget
`12/2018 Casas
`12/2018 Lang
`1/2019 Casas
`5/2019 Lang
`5/2019 Lang
`6/2019 Casas
`8/2019 Lang
`9/2019 Lang
`12/2019 Casas
`1/2020 Kwon
`3/2020 Casas
`3/2020 Casas
`3/2020 Lang
`8/2020 Casas
`8/2020
`Lang
`10/2020
`Lang
`11/2020 Casas
`12/2020 Lang
`4/2004 Tanaka et al.
`12/2004 Ritter
`9/2005 Ahmed et al.
`
`A61B 8/469
`G02B 27/0172
`
`G16H 30/40
`
JP   2004-178554 A     6/2004
JP   2015-019678       2/2015
WO   2002/100284 A    12/2002
WO   2009/116663       9/2009
WO   2011/010644       1/2011
WO   2015/008470 A2    1/2015
WO   2017/160651       9/2017
WO   2018183001 A1    10/2018
`
`OTHER PUBLICATIONS
`
`U.S. Appl. No. 62/097,771, filed Dec. 20, 2014, titled "Intraopera-
`tive Image-guided Surgery with Surface Reconstruction and Aug-
`mented Reality Visualization".
`U.S. Appl. No. 62/307,476, filed Mar. 12, 2016, titled "Devices and
`Methods for Surgery".
`U.S. Appl. No. 17/111,643, filed Dec. 4, 2020.
`Justin Barad "Controlling Augmented Reality in the Operating
`Room, a Surgeon's Perspective", medgadget, Oct. 30, 2015;
`XP055754822; Webpage; located at: https://www.medgadget.com/
`2015/10/controlling-augmented-reality-operating-room-surgeons-
`perspective.html.
`European Patent Office; Extended European Search Report issued in
`Application No. 18775013.8 dated Mar. 17, 2021, 12 pages.
`Japanese Office Action issued in Application No. 2020-503249
`dated Jan. 5, 2021, 7 pages.
`
`* cited by examiner
`
`
`
`
[Drawing Sheet 1 of 11: FIG. 1]
`
`
`
[Drawing Sheet 2 of 11: FIGS. 2A-2B]
`
`
`
[Drawing Sheet 3 of 11: FIGS. 2C-2D]
`
`
`
[Drawing Sheet 4 of 11: FIGS. 2E-2F]
`
`
`
`
[Drawing Sheet 5 of 11: FIG. 3]
`
`
`
[Drawing Sheet 6 of 11: FIGS. 4A-4B]
`
`
`
`
[Drawing Sheet 7 of 11: FIG. 5 — Computer System 500, including Processor 502, Memory 504, File System 506, Communication Unit 508, Operating System 510, User Interface 512, and AR Module 514]
`
`
`
[Drawing Sheet 8 of 11: FIG. 6A — flowchart of an example method 600]
602: Identify 3D data for a patient, the 3D data including an outer layer of the patient and multiple inner layers of the patient
604: Determine virtual morphometric measurements of the outer layer of the patient from the 3D data
606: Register a real-time position of the outer layer of the patient in a 3D space
608: Determine real-time morphometric measurements of the outer layer of the patient
610: Automatically register the position of the outer layer of the patient from the 3D data to align with the registered real-time position of the outer layer of the patient in the 3D space
612: Display one of the inner layers of the patient from the 3D data projected onto real-time views of the outer layer of the patient
`
`
`
[Drawing Sheet 9 of 11: FIGS. 6B-6C — method 600, continued]
FIG. 6B:
614: Generate a confidence score that the automatic registration is correct
616: Present the confidence score to a user
FIG. 6C:
626: Display a virtual spatial difference box projected onto real-time views of the patient
`
`
`
[Drawing Sheet 10 of 11: FIG. 6D — method 600, continued]
618: Determine real-time morphometric measurements of an object prior to insertion of the object into the patient through the outer layer of the patient
622: Automatically track the real-time position of the object in the 3D space with respect to the registered positions of the outer layer of the patient in the 3D space and with respect to the registered position of the outer layer of the patient from the 3D data
624: While a portion of the object is inserted into the patient through the outer layer of the patient, display a virtual portion of the object projected into the projected inner layer of the patient from the 3D data
`
`
`
[Drawing Sheet 11 of 11: FIG. 6E — method 600, continued]
628: Generate a virtual user interface that includes options for altering the display of the projected inner layer of the patient from the 3D data
630: Display the virtual user interface projected onto real-time views while a focal orientation of the AR headset is not focused on the patient
632: Hide the virtual user interface while the focal orientation of the AR headset is focused on the patient
634: Determine a real-time distance of the patient from the AR headset
636: Update, in real-time, the display of the virtual user interface to cause the virtual user interface to be continually positioned at a focal distance from the AR headset that is about equal to the real-time distance of the patient from the AR headset
638: Update, in real-time, the display of the virtual user interface to cause the virtual user interface to continually be oriented perpendicularly to the focal orientation of the AR headset
640: Display a virtual cursor projected onto real-time views and/or onto the virtual user interface while a focal orientation of the AR headset is not focused on the patient
642: Hide the virtual cursor while a focal orientation of the AR headset is focused on the patient
`
`
`
`AUGMENTING REAL-TIME VIEWS OF A
`PATIENT WITH THREE-DIMENSIONAL
`DATA
`
`CROSS-REFERENCE TO A RELATED
`APPLICATION
`
`5
`
`This application is a continuation of U.S. patent applica-
`tion Ser. No. 15/894,595, filed Feb. 12, 2018, which is a
continuation of U.S. patent application Ser. No. 15/474,702,
`filed Mar. 30, 2017, now U.S. Pat. No. 9,892,564, each of
`which is incorporated herein by reference in its entirety for
`all that it discloses.
`
`BACKGROUND
`
`15
`
`Augmented reality (AR) systems generally take a user's
`live view of a real-world environment and augment that
`view with computer-generated virtual elements such as
video, sound, or graphics. As a result, AR systems function
`to enhance a user's current perception of reality.
`One common problem faced by AR systems is accurately
`aligning the position of a virtual element with a live view of
`a real-world environment. This alignment process is often
done manually or is done automatically only after manual
`placement of non-anatomical fiducials. In either case, the
`manual process can be time consuming, cumbersome, and
`inaccurate.
`Another common problem faced by AR systems is proper
`placement of virtual controls for managing virtual elements.
Virtual controls, while intended to aid a user in interacting
`with virtual elements, are often placed in positions in the live
`view that render them more of a hindrance than a help to the
`user.
`The subject matter claimed herein is not limited to
`embodiments that solve any disadvantages or that operate
`only in environments such as those described above. Rather,
`this background is only provided to illustrate one example
`technology area where some embodiments described herein
`may be practiced.
`
`30
`
`35
`
`40
`
`SUMMARY
`
In one embodiment, a method for augmenting real-time
views of a patient with three-dimensional (3D) data may
include various acts. For example, the method may include
identifying 3D data for a patient with the 3D data including
an outer layer of the patient and multiple inner layers of the
patient. The method may also include determining virtual
morphometric measurements of the outer layer of the patient
from the 3D data. The method may further include regis-
tering a real-time position of the outer layer of the patient in
a 3D space. The method may also include determining
real-time morphometric measurements of the outer layer of
the patient. The method may further include automatically
registering the position of the outer layer of the patient from
the 3D data to align with the registered real-time position of
the outer layer of the patient in the 3D space using the virtual
morphometric measurements and using the real-time mor-
phometric measurements. The method may also include
displaying, in an augmented reality headset, one of the inner
layers of the patient from the 3D data projected onto
real-time views of the outer layer of the patient.
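The method above does not tie the automatic registration act to any particular algorithm. Purely as an illustrative sketch, assuming the outer layer from the 3D data and the registered real-time outer layer of the patient are both available as 3D point clouds (the function and variable names below are hypothetical, not part of the disclosure), the alignment could be approximated with an iterative-closest-point loop built on the Kabsch method:

# Hypothetical sketch only; not the claimed implementation.
import numpy as np
from scipy.spatial import cKDTree

def rigid_transform(src, dst):
    """Best-fit rotation R and translation t mapping the points in src onto dst."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, dst_c - R @ src_c

def register_outer_layer(virtual_points, realtime_points, iterations=30):
    """Align outer-layer points from the 3D data to the real-time scan in the 3D space."""
    tree = cKDTree(realtime_points)
    moved = np.asarray(virtual_points, dtype=float).copy()
    for _ in range(iterations):
        _, nearest = tree.query(moved)  # closest real-time point for each virtual point
        R, t = rigid_transform(moved, realtime_points[nearest])
        moved = moved @ R.T + t
    return moved

In such a sketch, the virtual and real-time morphometric measurements could seed the initial pose so that the loop starts close to the correct alignment.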
In another embodiment, a method for augmenting real-
time views of a patient with 3D data may include various
acts. For example, the method may include identifying 3D
data for a patient with the 3D data including an outer layer
of the patient and multiple inner layers of the patient. The
`method may also include displaying, in an augmented reality
`headset, one of the inner layers of the patient from the 3D
`data projected onto real-time views of the outer layer of the
`patient. The method may further include generating, in the
`augmented reality headset, a virtual user interface that
`includes options for altering the display of the projected
`inner layer of the patient from the 3D data. The method may
`also include displaying, in the augmented reality headset, the
`virtual user interface projected onto real-time views due to
`a focal orientation of the augmented reality headset not
`being focused on the patient. The method may further
`include hiding, in the augmented reality headset, the virtual
`user interface due to the focal orientation of the augmented
`reality headset being focused on the patient.
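The show-and-hide behavior described above depends only on whether the focal orientation of the augmented reality headset is directed at the patient. As a hypothetical sketch only (the data structures and names below are assumptions, not part of the disclosure), one simple test is to cast the headset's gaze ray against a bounding box around the registered patient:

# Hypothetical sketch only; not the claimed implementation.
import numpy as np

def gaze_hits_box(origin, direction, box_min, box_max):
    """Slab test: does the gaze ray from origin along direction hit the box?"""
    direction = direction / np.linalg.norm(direction)
    inv = 1.0 / np.where(direction == 0.0, 1e-12, direction)
    t1, t2 = (box_min - origin) * inv, (box_max - origin) * inv
    t_near = np.minimum(t1, t2).max()
    t_far = np.maximum(t1, t2).min()
    return t_far >= max(t_near, 0.0)

def update_virtual_ui_visibility(ui, headset_position, gaze_direction, box_min, box_max):
    """Show the virtual user interface only while the headset is not focused on the patient."""
    focused_on_patient = gaze_hits_box(headset_position, gaze_direction, box_min, box_max)
    ui["visible"] = not focused_on_patient
    return ui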
`It is to be understood that both the foregoing summary and
`the following detailed description are explanatory and are
`not restrictive of the invention as claimed.
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
`Embodiments will be described and explained with addi-
`tional specificity and detail through the use of the accom-
`panying drawings in which:
`FIG. 1 illustrates an example augmented reality (AR)
`environment in which real-time views of a patient may be
`augmented with three-dimensional (3D) data;
`FIGS. 2A-2F are photographs of the AR environment of
`FIG. 1 with a first patient;
`FIG. 3 is a photograph of the AR environment of FIG. 1
`with a second patient;
`FIGS. 4A-4B are photographs of the AR environment of
`FIG. 1 with a third patient;
`FIG. 5 illustrates an example computer system that may
`be employed in augmenting real-time views of a patient with
`3D data; and
`FIGS. 6A-6E are a flowchart of an example method of
`augmenting real-time views of a patient with 3D data.
`
`DETAILED DESCRIPTION
`
`Medical imaging may be employed to create visual rep-
`resentations of the interior of a patient. More particularly,
`medical imaging may be employed to reveal internal struc-
`tures hidden by an outer layer of a patient, such as the skin,
`for various purposes such as training, research, diagnosis,
`and treatment.
`Conventional medical imaging systems may create three-
`dimensional (3D) data for a patient and then display that 3D
`data as an image or images on a computer display. While
`viewing images of a patient on a computer display, detached
`from the actual patient, may be useful in training, research,
diagnosis, and treatment, such detached viewing may also
result in some problems.
`For example, where a surgeon needs to remove a tumor
`from a patient's brain, the surgeon may view an image of the
`patient's brain on a computer display. After viewing the
`location of the tumor on the computer display, the surgeon
`may then shift his view from the computer display to the
`actual patient on an operating table and attempt to identify
`the approximate location on the actual patient of the tumor
`inside the patient's brain. This method of identifying the
`approximate location of the tumor can be difficult and
`error-prone. For example, the surgeon may accidentally
`identify the left side of the brain in the image as having the
tumor when in reality the tumor is in the right side of the
`brain. This error may lead to the surgeon erroneously
`making an unnecessary incision on the left side of the
`patient's skull.
`In another example, where a doctor needs to perform knee
`surgery on a patient, the doctor may view an image of the
`patient's knee on a computer display. After viewing the
`problematic area of the knee on the computer display, the
`doctor may then shift his view from the computer display to
`the actual patient on an operating table and attempt to
`identify the problematic area of the knee on the actual
`patient for the surgery. This method of identifying the
`problematic area of the knee can be difficult and error-prone.
`For example, the doctor may accidentally pull up images of
`the wrong patient on the computer display, without realizing
`that the patient on the operating table does not match the
`images on the computer display. This error may lead to the
`surgeon erroneously making an incision in the wrong loca-
`tion due to natural variation of problematic areas of the knee
`from one patient to the next.
`The embodiments disclosed herein may provide various
`benefits over a conventional medical imaging system. In
`particular, the embodiments disclosed herein may, for
`example, augment real-time views of a patient with 3D data.
`In some embodiments, the 3D data of a patient may be
`automatically aligned, or registered, with a real-time view of
`the actual patient and then images derived from the 3D data
`may be projected onto the real-time view of the patient.
`Thus, these embodiments may enable a medical professional
`to view a virtual interior of the patient while looking at the
`actual patient without any time consuming, cumbersome,
`and inaccurate manual alignment and/or without any time
`consuming, cumbersome, and inaccurate manual placement
of non-anatomical fiducials. When used in training, research,
`diagnosis, or treatment, these embodiments may enable a
`medical professional to more easily and more accurately
`locate a target location within a patient.
`For example, when employed in the brain surgery
`example discussed above, these embodiments may avoid the
`surgeon getting confused on the location of the tumor
`between the right and left sides of the brain, and may thereby
`avoid the surgeon making an unnecessary incision on the
`wrong side of the skull during the surgery to remove the
`tumor. Similarly, when employed in the knee surgery
`example discussed above, these embodiments may avoid the
`doctor using 3D data for the wrong patient because the
`automatic alignment may fail or may indicate a low confi-
`dence that the automatic alignment was correct, thus alerting
`the doctor that the patient data may not be for the patient
`currently on the operating table.
`Further, in some embodiments, the augmenting of real-
`time views of a patient with 3D data may include the display
`of a virtual user interface and other virtual controls for
`altering the images projected onto the real-time view of the
`patient. This virtual user interface and these other virtual
`controls may be projected to avoid obstructing the medical
`professional's field of view when viewing the patient, to
`maintain a relatively constant focal length for the medical
`professional, and/or to maintain the orientation of the virtual
`user interface facing the medical professional. In this way,
`these embodiments may allow the medical professional to
`quickly and easily alter the images projected onto the
`real-time view of the patient.
`Turning to the figures, FIG. 1 illustrates an example
`augmented reality (AR) environment 100. In some embodi-
`ments, the environment 100 may include a 3D space 102, a
`user 104, a patient 106, and an AR headset 108 which may
`be in communication with a server 112 over a network 110.
`
`5
`
`4
`In some embodiments, the environment 100 may also
`include a virtual user interface 114, a virtual spatial differ-
`ence box 116, a virtual inserted portion 118a of an object
`118, and a virtual cursor 122, all shown in dashed lines to
`indicate that these virtual elements are generated by the AR
`headset 108 and only viewable by the user 104 through the
`AR headset 108.
`In some embodiments, the 3D space 102 may be any 3D
`space including, but not limited to, an operating room with
an operating table 103 (as illustrated in FIG. 1), an office, a
`classroom, or a laboratory. In some embodiments, the 3D
`space 102 may be a space where the user 104 may view the
`patient 106 while wearing the AR headset 108.
`In some embodiments, the user 104 may be any user of the
AR headset 108 including, but not limited to, a medical
`professional (as illustrated in FIG. 1), an instructor, a
`researcher, a patient, or a caregiver of a patient. For example,
`a medical professional may use the AR headset 108 in order
`to perform a medical procedure on the patient 106. Similarly,
a researcher or an instructor may use the AR headset 108
`while performing medical research or instructing medical
`students. Further, a caregiver of the patient 106, or the
`patient 106 himself, may use the AR headset 108 when a
`medical professional is attempting to explain a suggested
medical procedure for the patient 106.
`In some embodiments, the patient 106 may be any animal,
`either conscious or unconscious, either living or dead, either
`whole or missing one or more body parts. For example, the
`patient 106 may be a living human adult (as illustrated in
FIG. 1) who has been rendered unconscious in order to
`undergo a medical procedure by the user 104. In another
`example, the patient 106 may be a cadaver of a human adult
`that will undergo a dissection for research or training pur-
`poses. In another example, the patient 106 may be a con-
scious animal that is being evaluated by a veterinarian in
`order to diagnose a medical condition. In another example,
`the patient 106 may be a single limb or organ of a deceased
`human.
`In some embodiments, the AR headset 108 may be any
computer system in the form of an AR headset that is
`capable of augmenting real-time views of the patient 106
`with 3D data. For example, the AR headset 108 may be
`employed by the user 104 in order to augment a real-time
`view of the patient 106 with one or more inner layers of the
patient 106 including, but not limited to, bones 106b (as
`illustrated in FIG. 1), muscles, organs, or fluids. In some
`embodiments, the AR headset 108 may perform this aug-
`menting of a real-time view of the patient 106 regardless of
`the current position of the user 104 in the 3D space 102. For
example, the user 104 may walk around the operating table
`103 and view the patient 106 from any angle within the 3D
`space 102, and all the while the AR headset 108 may
`continually augment the real-time view of the patient 106
`with one or more inner layers of the patient 106, so that both
the patient 106 and the 3D data of the patient 106 may be
`viewed by the user 104 from any angle within the 3D space
`102. The AR headset 108 may perform this augmenting of
`a real-time view of the patient 106 with 3D data according
`to the method 600 disclosed herein in connection with FIGS.
6A-6E. In some embodiments, the AR headset 108 may be
`a modified version of the Microsoft HoloLens.
`In some embodiments, the network 110 may be config-
`ured to communicatively couple the AR headset 108 and the
`server 112 or other computer system(s). In some embodi-
ments, the network 110 may be any wired or wireless
`network, or combination of multiple networks, configured to
send and receive communications between systems and
`devices. In some embodiments, the network 110 may
`include a Personal Area Network (PAN) such as a Bluetooth
`network, a Local Area Network (LAN) such as a WiFi
`network, a Metropolitan Area Network (MAN), a Wide Area
Network (WAN), or a Storage Area Network (SAN). In
`some embodiments, the network 110 may also be coupled to,
`or may include, portions of a telecommunications network
`for sending data in a variety of different communication
`protocols, such as a cellular network.
In some embodiments, the server 112 may be any com-
`puter system capable of functioning in connection with the
`AR headset 108. In some embodiments, the server 112 may
`be configured to communicate in real-time with the AR
`headset 108 in order to convey 3D data to, or receive data
from, the AR headset 108. In addition, the server 112 may
`be employed to offload some or all of the data storage or
`processing desired by the AR headset 108.
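The patent leaves the communication between the AR headset 108 and the server 112 open-ended. As a hypothetical sketch only (the endpoint URL, payload layout, and function name are assumptions, not part of the disclosure), the headset side of such an exchange might pull a patient's stored 3D data over HTTP:

# Hypothetical sketch only; the protocol and endpoint are assumptions.
import requests

def fetch_patient_3d_data(server_url, patient_id):
    """Request the stored 3D data (outer layer and inner layers) for one patient."""
    response = requests.get(f"{server_url}/patients/{patient_id}/3d-data", timeout=10)
    response.raise_for_status()
    return response.json()  # e.g. {"outer_layer": [...], "inner_layers": [...]}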
`In some embodiments, the virtual user interface 114 may
`be any virtual user interface generated by the AR headset
108 that includes options for altering the display of the
`projected inner layer(s) of the patient 106 from the 3D data
`of the patient 106. For example, the options included in the
`virtual user interface 114 may include, but are not limited to,
`options that cause the AR headset 108 to:
(1) quit viewing the augmented view of the patient 106,
`(2) display a demo of the capabilities of the AR headset
`108,
`(3) adjust the characteristics of the 3D data that is pro-
`jected onto the patient 106, such as the brightness and
`color of the projected 3D data,
`(4) adjust the alignment of the 3D data with the patient
`106,
`(5) display the virtual spatial difference box 116,
`(6) display a slice of the 3D data instead of a volume of
`the 3D data,
`(7) drag the 3D data in a direction of the user 104, such
`as in the repositioning of a slice of the 3D data,
`(8) display different slices of the 3D data including, but
`not limited to, axial slices, coronal slices, sagittal slices,
`and oblique slices, and
`(9) perform other advanced features of the AR headset
`108.
`The virtual user interface 114 may further include other
`information that may be useful to the user 104. For example,
the virtual user interface 114 may include real-time vital
`signs for the patient 106 such as heart-rate, blood-pressure,
`and respiration-rate. In another example, the virtual user
`interface 114 may include a stopwatch showing the amount
`of time the patient 106 has been unconscious.
In some embodiments, the AR headset 108 may be
`configured to display the virtual user interface 114 at a
`comfortable distance from the user 104 and/or in a comfort-
`able orientation for the user 104. For example, the AR
`headset 108 may be configured to display the virtual user
interface 114 at a focal distance D2 from the AR headset 108
that is about equal to a real-time distance D1 of the patient
`106 from the AR headset 108. This distance may be com-
`fortable for the user because it may avoid the user 104 from
`having to refocus his eyes when shifting his focus between
the patient 106 and the virtual user interface 114, even as the
`user moves around the 3D space 102 and even as the user
`moves closer to and further away from the patient 106. In
`another example, the AR headset 108 may be configured to
`display the virtual user interface 114 at a focal orientation
that is oriented perpendicularly to a focal orientation 120 of
`the AR headset 108. This orientation may be comfortable for
the user 104 because it may cause the virtual user interface
`114 to constantly face the user 104 head-on regardless of the
`current focal orientation 120 of the AR headset 108, even as
`the user moves around the 3D space 102 and even as the user
`generally faces toward or faces away from the patient 106.
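As a hypothetical sketch of the placement rule described above (the function and parameter names are assumptions, not part of the disclosure), the panel position and facing direction could be derived each frame from the headset position, its focal orientation 120, and the real-time distance D1 to the patient 106:

# Hypothetical sketch only; not the claimed implementation.
import numpy as np

def place_virtual_ui(headset_position, gaze_direction, patient_position):
    """Return a position and facing normal for the virtual user interface panel."""
    gaze = gaze_direction / np.linalg.norm(gaze_direction)
    d1 = np.linalg.norm(patient_position - headset_position)  # real-time distance D1
    ui_position = headset_position + gaze * d1                # focal distance D2 about equal to D1
    ui_normal = -gaze                                         # panel faces the headset head-on
    return ui_position, ui_normal

Recomputing such a placement every frame would keep the panel at roughly the patient's focal distance and oriented perpendicularly to the focal orientation as the user 104 moves around the 3D space 102.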
`In some embodiments, the virtual spatial difference box
`116 may be generated by the AR headset 108 to confine
`within a volume of the virtual spatial difference box 116 the
`projected inner layer of the patient 106 from the 3D data. For
`example, the projected bones 106b of the patient 106 may be
`confined within the virtual spatial difference box 116 in FIG.
`1. In some embodiments, the virtual spatial difference box
`116 may also assist the user when navigating the projected
`3D data by providing a frame of reference for the user 104.
`For example, this frame of reference may assist the user
`when moving axial slices, coronal slices, sagittal slices, or
`oblique slices of the 3D data within the virtual spatial
`difference box 116. Slices may be two-dimensional (2D)
`slices and/or 3D slices. 3D slices may include curved slices,
`such as curved slices that follow the natural curve of an
`anatomical feature, or slices that have a depth as well as a
`height and width. The user 104 may move these slices using
`hand gestures that require the user 104 to generally move his
`hand in the directions of the lines of the virtual spatial
`difference box 116, so the display of the virtual spatial
`difference box 116 may make these hand movements easier
`for the user 104.
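As a hypothetical sketch (the array layout and function name are assumptions, not part of the disclosure), axial, coronal, and sagittal slices of the kind moved within the virtual spatial difference box 116 can be read directly out of a 3D image volume stored as a NumPy array, while oblique or curved slices would require resampling:

# Hypothetical sketch only; assumes a volume indexed (axial, coronal, sagittal).
import numpy as np

def extract_slice(volume, plane, index):
    """Return one 2D slice of the volume along the named anatomical plane."""
    if plane == "axial":
        return volume[index, :, :]
    if plane == "coronal":
        return volume[:, index, :]
    if plane == "sagittal":
        return volume[:, :, index]
    raise ValueError(f"unknown plane: {plane}")

# Example: the middle axial slice of a synthetic 256-cube volume.
volume = np.zeros((256, 256, 256), dtype=np.int16)
middle_axial = extract_slice(volume, "axial", 128)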
`In some embodiments, the virtual inserted portion 118a of
`the object 118 may correspond to any portion of the object
`118 that the user 104 wishes to insert into the patient 106
through an outer layer of the patient 106. For example, the
`object 118 may include, but is not limited to, a scalpel (as
`illustrated in FIG. 1), a scope, a drill, a probe, another
`medical instrument, or even the hand of the user 104. Similar
`to the registration of the real-time position of the outer layer
`of the patient 106, the position of the outer layer of the object
`118 may also be registered. However, unlike the patient 106,
`which may remain relatively still in the environment 100,
`the object 118 may be frequently moved in the environment
`100, such that the real-time position of the object 118 may
`be automatically tracked in the 3D space 102 with respect to
`the registered positions of the outer layer of the patient 106.
`Then, in the event that the user 104 inserts some portion of
`the object 118 into the outer layer of the patient 106, the AR
`headset 108 may display a virtual inserted portion 118a of
`the object 118 projected into the projected inner layer of the
`patient 106 from the 3D data. In this manner, the virtual
`inserted po