(12) Reissued Patent
Ojelund et al.

(10) Patent Number: US RE48,221 E
(45) Date of Reissued Patent: Sep. 22, 2020
`
`(54) SYSTEM WITH 3D USER INTERFACE
`INTEGRATION
`
`(71) Applicant: 3Shape A/S, Copenhagen K (DK)
`
`(72)
`
`Inventors: Henrik Ojelund, Lyngby (DK); David
`Fischer, Stenlose (DK); Karl-Josef
`Hollenbeck, Kobenhavn 0 (DK)
`
`(73) Assignee: 3SHAPE A/S, Copenhagen K (DK)
`
`(21) Appl. No.: 16/526,281
`
`(22) Filed:
`
`Jul. 30, 2019
Related U.S. Patent Documents
Reissue of:
(64) Patent No.: 9,329,675
     Issued: May 3, 2016
     Appl. No.: 13/991,513
     PCT Filed: Dec. 5, 2011
     PCT No.: PCT/DK2011/050461
     § 371 (c)(1), (2) Date: Jun. 4, 2013
     PCT Pub. No.: WO 2012/076013
     PCT Pub. Date: Jun. 14, 2012
U.S. Applications:
(60) Provisional application No. 61/420,138, filed on Dec. 6, 2010.
`
`(30)
`
`Foreign Application Priority Data
`
`Dec. 6, 2010
`
`(DK) ................................. 2010 01104
`
`(51)
`
`Int. Cl.
`G06F 3101
`A61B 5100
`
`(2006.01)
`(2006.01)
`(Continued)
`
`(52)
`
`U.S. Cl.
`CPC .............. G06F 3101 (2013.01); A61B 510088
`(2013.01); A61C 9/004 (2013.01); G0JB 11124
`(2013.01);
`
(58) Field of Classification Search
     CPC ....... A61B 5/0088; A61C 9/004; G01B 11/24; G06F 3/002; G06F 3/01; G06F 3/011; G06F 3/017; G06F 3/0346; G06F 3/04815
     See application file for complete search history.
`
`(56)
`
`References Cited
`
`U.S. PATENT DOCUMENTS
`
`5,131,844 A
`5,181,181 A
`
`7 / 1992 Marinaccio et al.
`1/1993 Glynn
`(Continued)
`
FOREIGN PATENT DOCUMENTS

CN    101513350      8/2009
CN    101513350 A    8/2009
     (Continued)
`
`OTHER PUBLICATIONS
`
`Petition for Inter Partes Review of U.S. Pat. No. 9,329,675, filed
`Nov. 22, 2017 in IPR2018-00197.
`(Continued)
`
`Primary Examiner - Peng Ke
`(74) Attorney, Agent, or Firm - Buchanan Ingersoll &
`Rooney P.C.
`
(57) ABSTRACT

Disclosed is a system comprising a handheld device and at least one display. The handheld device is adapted for performing at least one action in a physical 3D environment; wherein the at least one display is adapted for visually representing the physical 3D environment; and where the handheld device is adapted for remotely controlling the view with which the 3D environment is represented on the display.
`
`(Continued)
`
`43 Claims, 5 Drawing Sheets
`
`0001
`
`Exhibit 1001 page 1 of 18
`DENTAL IMAGING
`
`
`
`US RE48,221 E
`Page 2
`
`(51)
`
`Int. Cl.
`A61C 9/00
`G0JB 11124
`G06F 3/00
`G06F 3/0346
`G06F 3/0481
`(52) U.S. Cl.
`CPC
`
`(2006.01)
`(2006.01)
`(2006.01)
`(2013.01)
`(2013.01)
`
`G06F 3/002 (2013.01); G06F 31011
`(2013.01); G06F 3/017 (2013.01); G06F
`3/0346 (2013.01); G06F 3/04815 (2013.01)
`
`(56)
`
`References Cited
`
`U.S. PATENT DOCUMENTS
`
`5,377,011 A
`5,722,412 A
`6,135,961 A
`6,227,850 Bl
`6,361,489 Bl
`6,485,413 Bl
`6,592,371 B2
`6,645,148 B2
`6,967,644 Bl*
`7,141,020 B2
`7,213,214 B2
`7,221,332 B2
`7,551,353 B2
`7,813,591 B2 *
`7,831,292 B2 *
`8,035,637 B2 *
`8,384,665 Bl *
`8,903,746 B2
`9,329,675 B2
`2003/0158482 Al
`2003/0164952 Al
`2004/0204787 Al *
`2005/0057745 Al
`2005/0237581 Al
`2006/0020204 Al
`2006/0025684 Al
`2006/0092133 Al*
`2006/0146009 Al
`2006/0212260 Al
`2007 /0031774 Al *
`
`12/1994 Koch
`3/ 1998 Pflugrath et al.
`10/2000 Pflugrath et al.
`5/2001 Chishti et al.
`3/2002 Tsai
`11/2002 Boppart et al.
`7/2003 Durbin et al.
`11/2003 Nguyen-Dinh et al.
`11/2005 Kobayashi .................... 345/158
`11/2006 Poland et al.
`5/2007 Baar et al
`5/2007 Miller et al.
`6/2009 Kim et al.
`10/2010 Paley et al. ................... 382/285
`11/2010 Quaid et al ................... 600/424
`10/2011 Kriveshko .................... 345/419
`2/2013 Powers et al. ................ 345/156
`12/2014 Brennan et al.
`5/2016 Ojelund et al.
`8/2003 Poland et al.
`9/2003 Deichmann et al.
`10/2004 Kopelman et al. ........... 700/182
`3/2005 Bontje
`10/2005 Knighton et al.
`1/2006 Serra et al.
`2/2006 Quistgaard et al.
`5/2006 Touma et al. ................. 345/158
`7 /2006 Syrbe et al.
`9/2006 Kopelman et al.
`2/2007 Cinader, Jr.
`
`A61C 9/0053
`433/24
`
`2007 /0078340 Al
`2007/0171220 Al
`2007/0172112 Al
`2008/0063998 Al
`2009/0040175 Al
`2009/0061381 Al
`2009/0217207 Al
`2009/0322676 Al
`2010/0009308 Al
`2010/0231509 Al
`2012/0062557 Al*
`
`4/2007 Wilcox et al.
`7/2007 Kriveshko
`7 /2007 Paley et al.
`3/2008 Liang et al.
`2/2009 Xu et al.
`3/2009 Durbin et al.
`8/2009 Kagermeier et al.
`12/2009 Kerr et al.
`1/2010 Wen et al.
`9/2010 Ballotetal.
`3/2012 Dillon .
`
`2012/0179035 Al
`2013/0110469 Al*
`
`2014/0022352 Al
`
`7/2012 Boudier
`5/2013 Kopelman .............. G06F 30/00
`703/1
`
`1/2014 Fisker et al.
`
`A61C 7/002
`345/419
`
FOREIGN PATENT DOCUMENTS

EP      2200332 A1             6/2010
EP      2664272 A1            11/2013
WO      WO 00/08415 A1         2/2000
WO      WO 2004/066615 A1      8/2004
WO      WO 2007/084727 A1      7/2007
WO      WO 2009/089126 A1      7/2009
WO      WO 2010/064156 A1      6/2010
WO      WO 2010/145669 A1     12/2010
WO      WO 2011/011193 A1      1/2011
WO      WO 2011/120526 A1     10/2011
WO      WO 2012/075013 A1      6/2012
WO      WO 2013/010910 A1      1/2013
`
`OTHER PUBLICATIONS
`
Patent Owner's Preliminary Response to the Petition for Inter Partes Review of U.S. Pat. No. 9,329,675, filed Mar. 3, 2018 in IPR2018-00197.
Institution Decision entered May 30, 2018 in IPR2018-00197.
Patent Owner's Response to the Petition for Inter Partes Review of U.S. Pat. No. 9,329,675, filed Aug. 20, 2018 in IPR2018-00197.
Petitioner's Reply to Patent Owner's Response, filed Nov. 14, 2018, in IPR2018-00197.
Petitioner's Demonstratives filed Jan. 31, 2019, in IPR2018-00197.
Patent Owner's Submission of Demonstratives for Oral Argument filed Jan. 31, 2019, in IPR2018-00197.
Petition for Inter Partes Review of U.S. Pat. No. 9,329,675, filed Nov. 22, 2017 in IPR2018-00198.
Patent Owner's Preliminary Response to the Petition for Inter Partes Review of U.S. Pat. No. 9,329,675, filed Mar. 3, 2018 in IPR2018-00198.
Decision Denying Institution entered May 30, 2018 in IPR2018-00198.
Petitioner's Request for Rehearing of Institution Decision, filed Jun. 29, 2018 in IPR2018-00198.
Decision Denying Petitioner's Request for Rehearing, entered Dec. 4, 2018 in IPR2018-00198.
U.S. Pat. No. 9,329,675 File History (IPR2018-00197, Ex. 1002) (IPR2018-00198, Ex. 1002).
Declaration of Dr. Chandrajit L. Bajaj (IPR2018-00197, Ex. 1003).
Declaration of Dr. Chandrajit L. Bajaj (IPR2018-00198, Ex. 1003).
Dr. Chandrajit L. Bajaj Curriculum Vitae (IPR2018-00197, Ex. 1004) (IPR2018-00198, Ex. 1004).
Karatas et al., "Three-dimensional imaging techniques: A literature review," European Journal of Dentistry, vol. 8, Issue 1, 2014; pp. 132-140. (IPR2018-00197, Ex. 1016) (IPR2018-00198, Ex. 1016).
Broadbent, H.B., "A New X-Ray Technique and Its Application to Orthodontia," The Angle Orthodontist, vol. 1, No. 2, Feb. 4, 1931; pp. 45-66. (IPR2018-00197, Ex. 1017) (IPR2018-00198, Ex. 1017).
Birnbaum et al., "Dental Impressions Using 3D Digital Scanners: Virtual Becomes Reality." (IPR2018-00197, Ex. 1018) (IPR2018-00198, Ex. 1018).
Ireland et al., "3D surface imaging in dentistry - what we are looking at," British Dental Journal, vol. 205, No. 7, Oct. 11, 2008; pp. 387-392. (IPR2018-00197, Ex. 1022) (IPR2018-00198, Ex. 1022).
Hajeer et al., "Current Products and Practices Applications of 3D imaging in orthodontics: Part II," Journal of Orthodontics, vol. 31, 2004; pp. 154-162. (IPR2018-00197, Ex. 1023) (IPR2018-00198, Ex. 1023).
Bornik et al., "A Hybrid User Interface for Manipulation of Volumetric Medical Data," 3D User Interfaces, 2006; 8 pages. (IPR2018-00197, Ex. 1029) (IPR2018-00198, Ex. 1029).
Giammanco et al., "Using 3D Laser Scanning Technology to Create Digital Models of Hailstones," American Meteorological Society, Jul. 2017; pp. 1341-1347. (IPR2018-00197, Ex. 1036) (IPR2018-00198, Ex. 1036).
D. A. Bowman et al., "Theory and Practice," 3D User Interfaces, 4:96-101, Jul. 2004. (IPR2018-00197, Ex. 1038).
EPO Prosecution History of European Patent Application No. 11847582.1, filed Jun. 19, 2013. (IPR2018-00198, Ex. 1038).
Yoshida, Hiroshi et al., "Intraoral Ultrasonic Scanning as a Diagnostic Aid," J. Cranio-Max.-Fac. Surg. 15 (1987), pp. 306-311. (IPR2018-00197, Ex. 2002) (IPR2018-00198, Ex. 2004).
Moran, Carmel M. et al., "A Comparison of the Imaging Performance of High Resolution Ultrasound Scanners for Preclinical Imaging," Ultrasound in Med. & Biol., vol. 37, No. 3 (2011), pp. 493-501. (IPR2018-00197, Ex. 2003) (IPR2018-00198, Ex. 2005).
`
`0002
`
`Exhibit 1001 page 2 of 18
`DENTAL IMAGING
`
`
`
`US RE48,221 E
`Page 3
`
`(56)
`
`References Cited
`
`OTHER PUBLICATIONS
`
Ahn, Jae Sung, et al., "Development of Three-Dimensional Dental Scanning Apparatus Using Structured Illumination," Sensors, 17, 1634 (2017), 9 pages. (IPR2018-00197, Ex. 2004) (IPR2018-00198, Ex. 2002).
U.S. Appl. No. 10/744,869. (IPR2018-00197, Ex. 2005).
B.C. Chua et al., "SonoDEX: 3D space management and visualization of ultrasound data," International Congress Series 1281:143-148 (2005). (IPR2018-00197, Ex. 2006).
Deposition Transcript of Chandrajit Bajaj, Ph.D. on Jul. 25, 2018 with Errata Sheet. (IPR2018-00197, Ex. 2008).
J. Mackinlay et al., "A Semantic Analysis of the Design Space of Input Devices," Human-Computer Interaction 5:145-190 (1990). (IPR2018-00197, Ex. 2009).
"Taxonomies of Input" in Developing a Taxonomy of Input 4.1-4.16 (Jan. 4, 2009), available at https://www.billbuxton.com/input04.Taxonomies.pdf. (IPR2018-00197, Ex. 2010).
Declaration of Ravin Balakrishnan, Ph.D. (IPR2018-00197, Ex. 2011).
Curriculum Vitae of Ravin Balakrishnan, Ph.D. (IPR2018-00197, Ex. 2012).
D. Bowman et al., 3D User Interfaces: Theory and Practice, § 4.1.1 "Input Device Characteristics" pp. 88-89; § 4.2.2 "2D Mice and Trackballs" pp. 91-92; § 4.8.2 "Input Device Taxonomies" pp. 128-132 (2005). (IPR2018-00197, Ex. 2013).
J. Jerald, The VR Book: Human-Centered Design for Virtual Reality § 27.1.3 (2016). (IPR2018-00197, Ex. 2014).
S. Vogt et al., "An AR System With Intuitive User Interface for Manipulation and Visualization of 3D Medical Data," Stud. Health Technol. Inform., Medicine Meets Virtual Reality, 12(98):397-403, 2004.
Xia et al., "Three-Dimensional Virtual Reality," IEEE Transactions on Information Technology in Biomedicine, 5(2):97-107, Jun. 2001.
First Office Action dated Apr. 3, 2015 in corresponding Chinese Patent Application No. 201180066956.6 (13 pages).
Second Office Action issued in corresponding Chinese Patent Application No. 201180066956.6 dated Nov. 18, 2015, with English translation (27 pages).
Deposition Transcript of Dr. Ravin Balakrishnan.
Record of Oral Hearing held Feb. 4, 2019 from IPR2018-00197.
Final Written Decision, entered May 29, 2019; Termination Decision Document from IPR2018-00197 [Paper 22].
Xia et al., "Three-Dimensional Virtual Reality," Jun. 2001.*
International Search Report (PCT/ISA/210) issued on Feb. 22, 2012, by the Danish Patent Office as the International Searching Authority for International Application No. PCT/DK2011/050461.
C. Graetzel et al., "A Non-Contact Mouse for Surgeon-Computer Interaction," Technology and Health Care, 12(3), 2004, pp. 1-19.
Sebastian Vogt et al., "An AR System With Intuitive User Interface for Manipulation and Visualization of 3D Medical Data," Stud. Health Technol. Inform., Medicine Meets Virtual Reality 12, 2004; vol. 98, pp. 397-403.
First Office Action issued in corresponding Chinese Patent Application No. 201180066956.6, issued Apr. 3, 2015 (13 pages).
Second Office Action issued in corresponding Chinese Patent Application No. 201180066956.6, dated Nov. 18, 2015, with English translation (27 pages).
`
`* cited by examiner
`
`0003
`
`Exhibit 1001 page 3 of 18
`DENTAL IMAGING
`
`
`
[Drawing sheets 1-5, US RE48,221 E, Sep. 22, 2020:
Fig. 1 (Sheet 1 of 5): reference numeral 102.
Fig. 2a) (Sheet 2 of 5): reference numerals 100, 105, 102.
Fig. 2b) (Sheet 3 of 5): reference numerals 105, 100.
Fig. 3 (Sheet 4 of 5): reference numerals 107, 106.
Fig. 4 (Sheet 5 of 5): reference numerals 101, 103.]
`
`
`
SYSTEM WITH 3D USER INTERFACE INTEGRATION
`
Matter enclosed in heavy brackets [ ] appears in the original patent but forms no part of this reissue specification; matter printed in italics indicates the additions made by reissue; a claim printed with strikethrough indicates that the claim was canceled, disclaimed, or held invalid by a prior post-patent action or proceeding.
`
`FIELD OF THE INVENTION
`
`This invention generally relates to a method and a system
`comprising a handheld device and at least one display.
`
BACKGROUND OF THE INVENTION

3D visualization is important in many fields of industry and medicine, where 3D information is becoming more and more predominant.

Displaying and inspecting 3D information is inherently difficult. To fully understand a 3D object or entire environment on a screen, the user should generally be able to rotate the object or scene, such that many or preferentially all surfaces are displayed. This is true even for 3D displays, e.g. stereoscopic or holographic, where from a given viewing position and with a given viewing angle, the user will only see some surfaces of an arbitrary 3D environment. Often, the user will also want to zoom into details or zoom out for an overview.

Various user interaction devices are in use for software that displays 3D data; these devices are: 3D mice, space balls, and touch screens. The operation of these current interaction devices requires physically touching them.

Physically touching a user-interaction device can be a disadvantage in medical applications due to risks of cross-contamination between patients or between patient and operator, or in industrial applications in dirty environments.

Several non-touch user interfaces for 3D data viewing in medical applications have been described in the literature. Vogt et al (2004) describe a touchless interactive system for in-situ visualization of 3D medical imaging data. The user interface is based on tracking of reflective markers, where a camera is mounted on the physician's head. Graetzel et al (2004) describe a touchless system that interprets hand gestures as mouse actions. It is based on stereo vision and intended for use in minimally invasive surgery.

It remains a problem to improve systems that require user interfaces for view control, which for example can be used for clinical purposes.

SUMMARY

Disclosed is a system comprising a handheld device and at least one display, where the handheld device is adapted for performing at least one action in a physical 3D environment, where the at least one display is adapted for visually representing the physical 3D environment, and where the handheld device is adapted for remotely controlling the view with which said 3D environment is represented on the display.

The system may be adapted for switching between performing the at least one action in the physical 3D environment, and remotely controlling the view with which the 3D environment is represented on the display.
The system disclosed here performs the integration of 3D user interface functionality with any other handheld device with other operating functionality, such that the operator ideally only touches this latter device that is intended to be touched. A particular example of such a handheld device is one that records some 3D geometry, for example a handheld 3D scanner.

The handheld device is a multi-purpose device, such as a dual-purpose or two-purpose device, i.e. a device both for performing actions in the physical 3D environment, such as measuring and manipulating, and for remotely controlling the view of the 3D environment on the display.
Geometrically, a view is determined by the virtual observer's/camera's position and orientation relative to the 3D environment or its visual representation. If the display is two-dimensional, the view is also determined by the type of projection. A view may also be determined by a magnification factor.

The virtual observer's and the 3D environment's position and orientation are always relative to each other. In terms of user experience in software systems with 3D input devices, the user may feel that, for example, he/she is moving the 3D environment while remaining stationary himself/herself, but there is always an equivalent movement of the virtual observer/camera that gives the same results on the display. Often, descriptions of 3D software systems use the expression "pan" to indicate an apparent translational movement of the 3D environment, "rotate" to indicate a rotational movement of the 3D environment, and "zoom" to indicate a change in magnification factor.
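As an illustrative aside, not part of the original disclosure, this view model can be captured in a short Python sketch: a view is the virtual camera's position and orientation plus a projection type and a magnification factor, and "pan", "rotate" and "zoom" are implemented as the equivalent camera motions. All names are hypothetical.

    # Minimal sketch of the view model described above; names are illustrative.
    import numpy as np

    class View:
        def __init__(self):
            self.position = np.zeros(3)      # virtual camera position
            self.rotation = np.eye(3)        # virtual camera orientation
            self.projection = "perspective"  # projection type, for 2D displays
            self.magnification = 1.0

        def pan(self, dx, dy):
            # An apparent translation of the 3D environment is the opposite
            # camera move, expressed along the camera's right/up axes.
            right, up = self.rotation[:, 0], self.rotation[:, 1]
            self.position -= dx * right + dy * up

        def rotate(self, axis, angle_rad):
            # Rodrigues' formula: rotate the camera about a unit axis.
            a = np.asarray(axis, dtype=float)
            a /= np.linalg.norm(a)
            K = np.array([[0, -a[2], a[1]],
                          [a[2], 0, -a[0]],
                          [-a[1], a[0], 0]])
            R = (np.eye(3) + np.sin(angle_rad) * K
                 + (1 - np.cos(angle_rad)) * (K @ K))
            self.rotation = R @ self.rotation

        def zoom(self, factor):
            # Zoom changes the magnification only; the pose is unchanged.
            self.magnification *= factor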
Graphically, a view can represent a 3D environment by means of photographs or as some kind of virtual representation such as a computer graphic, or similar. A computer graphic can be rendered for example with texture and/or shading and/or virtual light sources and/or light models for surface properties. A computer graphic can also be a simplified representation of the 3D environment, for example a mesh, an outline, or an otherwise simplified representation. All or parts of the 3D environment can also be rendered with some degree of transparency. A view may represent the 3D environment in total or only parts thereof.
All of the touch-less prior art systems are 3D user interface devices only. In many prior art applications, the operator using such a user interface device will also hold and work with another device that really is the central device in the overall application, e.g. a medical instrument.

It is thus an advantage of the present system that the 3D user-interface functionality is integrated in the central device, which is used for performing some kind of action.

In some embodiments the handheld device is adapted for remotely controlling the magnification with which the 3D environment is represented on the display.

In some embodiments the handheld device is adapted for changing the rendering of the 3D environment on the display.

In some embodiments the view is defined as viewing angle and/or viewing position.

In some embodiments the at least one action comprises one or more of the actions of:
measuring,
recording,
scanning,
manipulating,
modifying.

In some embodiments the 3D environment comprises one or more 3D objects.
`
`0009
`
`Exhibit 1001 page 9 of 18
`DENTAL IMAGING
`
`
`
`US RE48,221 E
`
`10
`
`35
`
`3
In some embodiments the handheld device is adapted to be held in one hand by an operator.

In some embodiments the display is adapted to represent the 3D environment from multiple views.

In some embodiments the display is adapted to represent the 3D environment from different viewing angles and/or viewing positions.

In some embodiments the view of the 3D environment in the at least one display is at least partly determined by the motion of the operator's hand holding said device.

In some embodiments the magnification represented in the at least one display is at least partly determined by the motion of the operator's hand holding said device.

In some embodiments the handheld device is adapted to record the 3D geometry of the 3D environment.

Thus the handheld device may be an intraoral dental scanner, which records the 3D geometry of a patient's teeth. The operator may move the scanner along the teeth of the patient for capturing the 3D geometry of the relevant teeth, e.g. all teeth. The scanner may comprise motion sensors for taking the movement of the scanner into account while creating the 3D model of the scanned teeth.

The 3D model of the teeth may be shown on a display, and the display may for example be a PC screen and/or the like.

The user interface functionality may comprise incorporating motion sensors in the scanner to provide that the user can determine the view on the screen by moving the scanner. Pointing the scanner down can provide that the scanned teeth are shown given a downward viewing angle. Holding the scanner in a horizontal position can provide that the viewing angle is likewise horizontal.
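One way to realize this orientation-to-view mapping is sketched below, assuming a 3-axis accelerometer whose reading approximates the gravity vector in the scanner's frame while the device is held still. The function name and axis conventions are assumptions for illustration only.

    # Hedged sketch: estimate the scanner's pitch from gravity and mirror it
    # as the viewing angle on the display.
    import math

    def viewing_pitch_deg(accel_xyz):
        """Pitch of the scanner's forward (x) axis relative to horizontal:
        about 0 when held horizontally, about -90 when pointing down."""
        ax, ay, az = accel_xyz
        return math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))

    # Held horizontally, gravity lies along -z of the device: ~0 degrees.
    print(viewing_pitch_deg((0.0, 0.0, -9.81)))
    # Pointing down, gravity lies along +x of the device: ~-90 degrees.
    print(viewing_pitch_deg((9.81, 0.0, 0.0)))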
In some embodiments the handheld device comprises at least one user-interface element. A user-interface element is an element which the user may manipulate in order to activate a function on the user interface of the software. Typically the user interface is graphically presented on the display of the system.

The handheld device may furthermore be provided with an actuator, which switches the handheld device between performing the at least one action and remotely controlling the view. By providing such a manual switching function that enables the operator to switch between performing the at least one action and remotely controlling the view, the operator may easily control what is performed.

Such an actuator can for example be in the form of a button, switch or contact. In other embodiments it could be a touch sensitive surface or element.
In another embodiment the actuator could be a motion sensor provided in the handheld device that functions as the actuator when it registers a specific type of movement, for example if the operator shakes the handheld device. Examples of such motion sensors will be described herein with respect to the user-interface element; however, the person skilled in the art will, based on the disclosure herein, understand that such motion sensors may also be used as actuators as discussed.
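By way of illustration only, such a shake could be detected from accelerometer samples roughly as follows; the thresholds and names are assumptions, not taken from the disclosure.

    # Sketch of a motion sensor acting as the actuator: several rapid sign
    # reversals of strong acceleration are read as a "shake" that toggles
    # between action mode and view-control mode.
    from collections import deque

    class ShakeActuator:
        def __init__(self, threshold=15.0, reversals_needed=3, window=20):
            self.threshold = threshold           # m/s^2, well above gravity
            self.reversals_needed = reversals_needed
            self.samples = deque(maxlen=window)  # recent acceleration samples

        def update(self, accel_x):
            """Feed one sample; return True when a shake is recognized."""
            self.samples.append(accel_x)
            strong = [a for a in self.samples if abs(a) > self.threshold]
            reversals = sum(1 for a, b in zip(strong, strong[1:]) if a * b < 0)
            if reversals >= self.reversals_needed:
                self.samples.clear()             # re-arm after triggering
                return True
            return False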
For example, the handheld device can in one embodiment be an intra-oral 3D scanner used by a dentist. The scanner is set to perform the action of scanning a dental area when the actuator is in one position. When the actuator is switched into a second position, the handheld device is set to control the view with which the 3D environment is represented on the display. This could for example be that when the dentist has scanned a part of or the complete desired area of a dental arch, he can activate the actuator, which then allows the dentist to remotely control the view of the 3D representation of the scanned area on the display by using the handheld device.

For example, the actuator could be a button. When the button is pressed quickly, the handheld device is prepared for scanning, i.e. it is set for performing at least one action, the scanning procedure, in the physical 3D environment. The scanning is stopped when the button is pressed quickly a second time.
While the scanning is performed, a virtual 3D representation is visually built on the display.

The user can now press and hold the button. This will put the handheld device in a controller mode, where the handheld device is adapted for remotely controlling the view with which the 3D environment, such as scanned teeth, is represented on the display. While holding the button pressed, the system will use signals from a motion sensor in the handheld device to determine how to present the view of the virtual 3D environment. Thus, if the user turns or otherwise moves the hand that holds the handheld device, the view of the virtual 3D environment on the display will change accordingly.
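A minimal sketch of this quick-press/press-and-hold behaviour is given below; the 0.5 s hold threshold and all names are assumptions made for illustration.

    # Quick press toggles scanning; press-and-hold enters view-control
    # (controller) mode until the button is released.
    HOLD_THRESHOLD_S = 0.5

    class HandheldMode:
        def __init__(self):
            self.scanning = False
            self.controlling_view = False
            self._pressed_at = None

        def button_down(self, t):
            self._pressed_at = t

        def tick(self, t):
            # Once held past the threshold, motion input steers the view.
            if (self._pressed_at is not None
                    and t - self._pressed_at >= HOLD_THRESHOLD_S):
                self.controlling_view = True

        def button_up(self, t):
            if self.controlling_view:
                self.controlling_view = False      # release exits controller mode
            elif (self._pressed_at is not None
                    and t - self._pressed_at < HOLD_THRESHOLD_S):
                self.scanning = not self.scanning  # quick press: start/stop scan
            self._pressed_at = None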
Thus, the dentist may use the same handheld device for both scanning an area and subsequently verifying that the scan has been executed correctly, without having to move away from the patient or touch any other equipment than that already present in his hands.

In one embodiment the user-interface element is the same as the actuator, or where several user-interface elements are present, at least one also functions as an actuator.

The system may be equipped with a button as an additional element providing the user-interface functionality.

In an example the handheld device is a handheld intraoral scanner, and the display is a computer screen. The operator or user may be a dentist, an assistant and/or the like. The operation functionality of the device may be to record some intraoral 3D geometry, and the user interface functionality may be to rotate, pan, and zoom the scanned data on the computer screen.
In some embodiments the at least one user-interface element is at least one motion sensor.

Thus the integration of the user interface functionality in the device may be provided by motion sensors, which can be accelerometers inside the scanner, whose readings determine the orientation, as displayed on the screen, of the 3D model of the teeth acquired by the scanner. Additional functionality, e.g. to start/stop scanning, may be provided by a button. The button may be located where the operator's or user's index finger can reach it conveniently.
Prior art intraoral scanners use a touch screen, a trackball, or a mouse to determine the view in the display. These prior art user interface devices can be inconvenient, awkward and difficult to use, and they can be labor-intensive, and thus costly, to sterilize or disinfect. An intraoral scanner should always be disinfected between scanning different patients, because the scanner is in, and may come in contact with, the mouth or other parts of the patient being scanned.

The operator or user, e.g. dentist, may use one hand or both hands to hold the intraoral scanner while scanning, and the scanner may be light enough and comfortable to be held with just one hand for a longer time while scanning.

The device can also be held with one or two hands while using the device as remote control, for e.g. changing the view in the display. It is an advantage of the touchless user interface functionality that in clinical situations, the operator can maintain both hands clean, disinfected, or even sterile.
`
`0010
`
`Exhibit 1001 page 10 of 18
`DENTAL IMAGING
`
`
`
`US RE48,221 E
`
`5
An advantage of the system is that it allows an iterative process of working in a 3D environment without releasing the handheld device during said process. For the above intraoral scanning system example, the operator, e.g. dentist, can record some teeth surface geometry with a handheld device that is an intraoral scanner, inspect coverage of the surface recording by using that same handheld device to move, e.g. rotate, the recorded surface on the display, e.g. a computer screen, detect possible gaps or holes in the coverage of the scanned teeth, and then for example arrange the scanner in the region where the gaps were located and continue recording teeth surface geometry there. Over this entire iterative cycle, which can be repeated more than once, for example as many times as required for obtaining a desired scan coverage of the teeth, the dentist does not have to lay the handheld intraoral scanner out of his or her hands.

In some embodiments, the 3D user interface functionality is exploited in a separate location than the operation functionality. For the above intraoral scanning system example, the scanning operation is performed in the oral cavity of the patient, while the user interface functionality is more flexibly exploited when the scanner is outside the patient's mouth. The key characteristic and advantage of the system, again, is that the dentist can exploit the dual and integrated functionality, that is operation and user interface, of the scanner without laying it out of his or her hands.
The above intraoral scanning system is an example of an embodiment. Other examples for operation functionality or performing actions could be drilling, welding, grinding, cutting, soldering, photographing, filming, measuring, executing some surgical procedure etc.

The display of the system can be a 2D computer screen, a 3D display that projects stereoscopic image pairs, a volumetric display creating a 3D effect, such as a swept-volume display, a static volume display, a parallax barrier display, a holographic display etc. Even with a 3D display, the operator has only one viewing position and viewing angle relative to the 3D environment at a time. The operator can move his/her head to assume another viewing position and/or viewing angle physically, but generally, it may be more convenient to use the handheld device with its built-in user interface functionality, e.g. the remote controlling, to change the viewing position and/or viewing angle represented in the display.
In some embodiments the system comprises multiple displays, or one or more displays that are divided into regions. For example, several sub-windows on a PC screen can represent different views of the 3D environment. The handheld device can be used to change the view in all of them, or only some of them.

In some embodiments the user interface functionality comprises the use of gestures.

Gestures made by e.g. the operator can be used to change, shift or toggle between sub-windows, and the user-interface functionality can be limited to an active sub-window or one of several displays.

In some embodiments the gestures are adapted to be detected by the at least one motion sensor. Gestures can alternatively and/or additionally be detected by range sensors or other sensors that record body motion.
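This sub-window behaviour can be sketched as follows; the window names and gesture labels are assumptions for illustration, and a single yaw angle stands in for a full view.

    # Gestures shift which sub-window is active; device motion changes the
    # view only in the active sub-window.
    class SubWindow:
        def __init__(self, name):
            self.name = name
            self.yaw_deg = 0.0   # one view parameter standing in for a view

    class MultiViewDisplay:
        def __init__(self, names):
            self.windows = [SubWindow(n) for n in names]
            self.active = 0

        def on_gesture(self, gesture):
            # A recognized gesture toggles the active sub-window.
            if gesture == "next_window":
                self.active = (self.active + 1) % len(self.windows)

        def on_device_rotation(self, delta_yaw_deg):
            # Motion input is confined to the active sub-window's view.
            self.windows[self.active].yaw_deg += delta_yaw_deg

    display = MultiViewDisplay(["occlusal", "buccal", "lingual"])
    display.on_device_rotation(15.0)   # changes only the first view
    display.on_gesture("next_window")  # a gesture shifts the focus
    display.on_device_rotation(-10.0)  # now changes only the second view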
The operator does not have to constantly watch the at least one display of the system. In many applications, the operator will shift between viewing and possibly manipulating the display and performing another operation with the handheld device. Thus it is an advantage that the operator does not have to touch other user interface devices. However, in some cases it may not be possible for the operator to fully avoid touching other devices, and in these cases it is an advantage that fewer touches are required compared to a system where a handheld device does not provide any user interface functionality at all.
`In some embodiments the at least one display is arranged
`separate from the handheld device.
`In some embodiments the at least one display is defined
`as a first display, and where the system further comprises a
`second display.
`In some embodiments the second display is arranged on
`the handheld device.
`In some embodiments the second display is arranged on
`the handheld device in a position such that the display is
`adapted to be viewed by the operator, while the operator is
`operating the handheld device.
In some embodiments the second display indicates where the handheld device is positioned relative to the 3D environment.
`In some embodiments the first display and/or the second
`display provides instructions for the operator.
The display(s) can be arranged in multiple ways. For example, they can be mounted on a wall, placed on some sort of stand or a cart, placed on a rack or desk, or other.

In some embodiments at least one display is mounted on the device itself. It can be advantageous to have a display on the device itself because with such an arrangement, the operator's eyes need not focus alternately between different distances. In some cases, the operating functionality may require a close look at the device and the vicinity of the 3D environment it operates in, and this may be at a distance at most as far away as the operator's hand. Especially in crowded environments such as dentists' clinics, surgical operation theatres, or industrial workplaces, it may be difficult to place an external display close to the device.
In some embodiments visual information is provided to the operator on one or more means other than the first display.

In some embodiments audible information is provided to the operator.
`Thus in some embodiments, the system provides addi-
`ti