(12) United States Patent
     Ojelund et al.

(10) Patent No.:     US 9,329,675 B2
(45) Date of Patent: May 3, 2016

(54) SYSTEM WITH 3D USER INTERFACE INTEGRATION

(75) Inventors: Henrik Ojelund, Lyngby (DK); David Fischer, Stenløse (DK); Karl-Josef Hollenbeck, København Ø (DK)

(73) Assignee: 3SHAPE A/S, Copenhagen K (DK)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 249 days.

(21) Appl. No.: 13/991,513

(22) PCT Filed: Dec. 5, 2011

(86) PCT No.: PCT/DK2011/050461
     § 371 (c)(1), (2), (4) Date: Jun. 4, 2013

(87) PCT Pub. No.: WO 2012/076013
     PCT Pub. Date: Jun. 14, 2012

(65) Prior Publication Data
     US 2013/0257718 A1    Oct. 3, 2013

     Related U.S. Application Data
(60) Provisional application No. 61/420,138, filed on Dec. 6, 2010.

(30) Foreign Application Priority Data
     Dec. 6, 2010 (DK) ................. 2010 01104

(51) Int. Cl.
     G06F 3/01   (2006.01)
     A61C 9/00   (2006.01)
     G01B 11/24  (2006.01)

(52) U.S. Cl.
     CPC: G06F 3/01 (2013.01); A61C 9/004 (2013.01); G01B 11/24 (2013.01)

(58) Field of Classification Search
     CPC combination set(s) only.
     See application file for complete search history.

(56) References Cited

     U.S. PATENT DOCUMENTS

     6,135,961 A      10/2000  Pflugrath et al.
     6,967,644 B1 *   11/2005  Kobayashi ............. 345/158
     7,141,020 B2     11/2006  Poland et al.
     7,813,591 B2 *   10/2010  Paley et al. .......... 382/285
     7,831,292 B2 *   11/2010  Quaid et al. .......... 600/424
     8,035,637 B2 *   10/2011  Kriveshko ............. 345/419
     (Continued)

     FOREIGN PATENT DOCUMENTS

     CN   101513350           8/2009
     WO   WO 2004/066615 A1   8/2004
     (Continued)

     OTHER PUBLICATIONS

     Three-Dimensional Virtual Reality, Xia et al., Jun. 2001.*
     (Continued)

Primary Examiner — Van Chow
(74) Attorney, Agent, or Firm — Buchanan Ingersoll & Rooney PC

(57) ABSTRACT

Disclosed is a system comprising a handheld device and at least one display. The handheld device is adapted for performing at least one action in a physical 3D environment; wherein the at least one display is adapted for visually representing the physical 3D environment; and where the handheld device is adapted for remotely controlling the view with which the 3D environment is represented on the display.

19 Claims, 5 Drawing Sheets
`
`
`
`
`
`
US 9,329,675 B2
Page 2

(56) References Cited

     U.S. PATENT DOCUMENTS

     8,384,665 B1 *       2/2013   Powers et al. ......... 345/156
     2003/0158482 A1      8/2003   Poland et al.
     2004/0204787 A1 *   10/2004   Kopelman et al. ....... 700/182
     2005/0057745 A1      3/2005   Bontje
     2006/0025684 A1      2/2006   Quistgaard et al.
     2006/0092133 A1 *    5/2006   Touma et al. .......... 345/158
     2006/0146009 A1      7/2006   Syrbe et al.
     2009/0061381 A1      3/2009   Durbin et al.
     2009/0217207 A1      8/2009   Kagermeier et al.

     FOREIGN PATENT DOCUMENTS

     WO   2007084727   7/2007
     WO   2009089126   7/2009
     WO   2011011193   1/2011

     OTHER PUBLICATIONS

International Search Report (PCT/ISA/210) issued on Feb. 22, 2012, by the Danish Patent Office as the International Searching Authority for International Application No. PCT/DK2011/050461.
C. Graetzel et al., "A Non-Contact Mouse for Surgeon-Computer Interaction", Technology and Health Care, 12(3), 2004, pp. 1-19.
Sebastian Vogt et al., "An AR System With Intuitive User Interface for Manipulation and Visualization of 3D Medical Data", Stud. Health Technol. Inform., Medicine Meets Virtual Reality 12, 2004, vol. 98, pp. 397-403.
First Office Action issued in corresponding Chinese Patent Application No. 201180066956.6, issued Apr. 3, 2015 (13 pages).
Second Office Action issued in corresponding Chinese Patent Application No. 201180066956.6, dated Nov. 18, 2015, with English translation (27 pages).

* cited by examiner
`
`
`
[Drawing sheets: the figure artwork is not reproducible in this text rendering; only the sheet headers, figure captions, and legible reference numerals are retained below.]

U.S. Patent   May 3, 2016   Sheet 1 of 5   US 9,329,675 B2
[Fig. 1 — reference numerals 100, 101]

U.S. Patent   May 3, 2016   Sheet 2 of 5   US 9,329,675 B2
[Fig. 2a) — reference numerals 100, 101, 102, 105]

U.S. Patent   May 3, 2016   Sheet 3 of 5   US 9,329,675 B2
[Fig. 2b) — reference numerals 100, 102, 105]

U.S. Patent   May 3, 2016   Sheet 4 of 5   US 9,329,675 B2
[Fig. 3 — reference numerals 106, 107, 108]

U.S. Patent   May 3, 2016   Sheet 5 of 5   US 9,329,675 B2
[Fig. 4 — reference numerals 101, 102, 103]
`
`
`
`SYSTEM WITH 3D USER INTERFACE
`INTEGRATION
`
`FIELD OF THE INVENTION
`
This invention generally relates to a method and a system comprising a handheld device and at least one display.
`
`BACKGROUND OF THE INVENTION
`
3D visualization is important in many fields of industry and medicine, where 3D information is becoming more and more predominant.
Displaying and inspecting 3D information is inherently difficult. To fully understand a 3D object or entire environment on a screen, the user should generally be able to rotate the object or scene, such that many or preferentially all surfaces are displayed. This is true even for 3D displays, e.g. stereoscopic or holographic, where from a given viewing position and with a given viewing angle, the user will only see some surfaces of an arbitrary 3D environment. Often, the user will also want to zoom into details or zoom out for an overview.
Various user interaction devices are in use for software that displays 3D data; among these devices are 3D mice, space balls, and touch screens. The operation of these current interaction devices requires physically touching them.
Physically touching a user-interaction device can be a disadvantage in medical applications due to risks of cross-contamination between patients or between patient and operator, or in industrial applications in dirty environments.
Several non-touch user interfaces for 3D data viewing in medical applications have been described in the literature. Vogt et al. (2004) describe a touchless interactive system for in-situ visualization of 3D medical imaging data. The user interface is based on tracking of reflective markers, where a camera is mounted on the physician's head. Graetzel et al. (2004) describe a touchless system that interprets hand gestures as mouse actions. It is based on stereo vision and intended for use in minimally invasive surgery.
It remains a problem to improve systems that require user interfaces for view control, which for example can be used for clinical purposes.
`
`SUMMARY
`
Disclosed is a system comprising a handheld device and at least one display, where the handheld device is adapted for performing at least one action in a physical 3D environment, where the at least one display is adapted for visually representing the physical 3D environment, and where the handheld device is adapted for remotely controlling the view with which said 3D environment is represented on the display.
The system may be adapted for switching between performing the at least one action in the physical 3D environment, and remotely controlling the view with which the 3D environment is represented on the display.
The system disclosed here performs the integration of 3D user interface functionality with any other handheld device with other operating functionality, such that the operator ideally only touches this latter device that is intended to be touched. A particular example of such a handheld device is one that records some 3D geometry, for example a handheld 3D scanner.
The handheld device is a multi-purpose device, such as a dual-purpose or two-purpose device, i.e. a device both for performing actions in the physical 3D environment, such as
`
measuring and manipulating, and for remotely controlling the view of the 3D environment on the display.
Geometrically, a view is determined by the virtual observer's/camera's position and orientation relative to the 3D environment or its visual representation. If the display is two-dimensional, the view is also determined by the type of projection. A view may also be determined by a magnification factor.
`
The virtual observer's and the 3D environment's position and orientation are always relative to each other. In terms of user experience in software systems with 3D input devices, the user may feel that, for example, he/she is moving the 3D environment while remaining stationary himself/herself, but there is always an equivalent movement of the virtual observer/camera that gives the same results on the display. Often, descriptions of 3D software systems use the expression "pan" to indicate an apparent translational movement of the 3D environment, "rotate" to indicate a rotational movement of the 3D environment, and "zoom" to indicate a change in magnification factor.
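By way of illustration only, the camera model just described can be captured in a minimal sketch, not part of the disclosure: a view as a position, an orientation, and a magnification factor, with pan, rotate, and zoom as the corresponding operations. All names are hypothetical, Python with numpy is assumed, and the rotation uses the standard Rodrigues formula:

    import numpy as np

    class View:
        """Virtual observer/camera: position, orientation, magnification."""
        def __init__(self):
            self.position = np.zeros(3)   # camera position in world coordinates
            self.rotation = np.eye(3)     # camera orientation (rotation matrix)
            self.magnification = 1.0      # zoom factor

        def pan(self, dx, dy):
            # Apparent translation of the 3D environment is an equivalent
            # translation of the virtual observer in its own image plane.
            self.position += self.rotation @ np.array([dx, dy, 0.0])

        def rotate(self, axis, angle_rad):
            # Rodrigues' formula for a rotation about a unit axis.
            k = np.asarray(axis, dtype=float)
            k /= np.linalg.norm(k)
            K = np.array([[0.0, -k[2], k[1]],
                          [k[2], 0.0, -k[0]],
                          [-k[1], k[0], 0.0]])
            R = np.eye(3) + np.sin(angle_rad) * K + (1 - np.cos(angle_rad)) * (K @ K)
            self.rotation = R @ self.rotation

        def zoom(self, factor):
            # Change in magnification factor only; position is unchanged.
            self.magnification *= factor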
Graphically, a view can represent a 3D environment by means of photographs or as some kind of virtual representation such as a computer graphic, or similar. A computer graphic can be rendered for example with texture and/or shading and/or virtual light sources and/or light models for surface properties. A computer graphic can also be a simplified representation of the 3D environment, for example a mesh, an outline, or an otherwise simplified representation. All or parts of the 3D environment can also be rendered with some degree of transparency. A view may represent the 3D environment in total or only parts thereof.
All of the touch-less prior art systems are 3D user interface devices only. In many prior art applications, the operator using such a user interface device will also hold and work with another device that really is the central device in the overall application, e.g. a medical instrument.
It is thus an advantage of the present system that the 3D user-interface functionality is integrated in the central device, which is used for performing some kind of action.
In some embodiments the handheld device is adapted for remotely controlling the magnification with which the 3D environment is represented on the display.
In some embodiments the handheld device is adapted for changing the rendering of the 3D environment on the display.
In some embodiments the view is defined as viewing angle and/or viewing position.
In some embodiments the at least one action comprises one or more of the actions of:
measuring,
recording,
scanning,
manipulating,
modifying.
In some embodiments the 3D environment comprises one or more 3D objects.
In some embodiments the handheld device is adapted to be held in one hand by an operator.
In some embodiments the display is adapted to represent the 3D environment from multiple views.
In some embodiments the display is adapted to represent the 3D environment from different viewing angles and/or viewing positions.
In some embodiments the view of the 3D environment in the at least one display is at least partly determined by the motion of the operator's hand holding said device.
`
`
`
In some embodiments the magnification represented in the at least one display is at least partly determined by the motion of the operator's hand holding said device.
In some embodiments the handheld device is adapted to record the 3D geometry of the 3D environment.
Thus the handheld device may be an intraoral dental scanner, which records the 3D geometry of a patient's teeth. The operator may move the scanner along the teeth of the patient for capturing the 3D geometry of the relevant teeth, e.g. all teeth. The scanner may comprise motion sensors for taking the movement of the scanner into account while creating the 3D model of the scanned teeth.
`
The 3D model of the teeth may be shown on a display, and the display may for example be a PC screen and/or the like.
The user interface functionality may comprise incorporating motion sensors in the scanner to provide that the user can determine the view on the screen by moving the scanner. Pointing the scanner down can provide that the scanned teeth are shown given a downward viewing angle. Holding the scanner in a horizontal position can provide that the viewing angle is likewise horizontal.
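One way such behavior could be realized — a sketch only, not the patented implementation — is to estimate the direction of gravity from a 3-axis accelerometer in the device and mirror the resulting pitch/roll onto the on-screen view. The function names and the view API below are assumptions introduced for illustration:

    import math

    def device_attitude(ax, ay, az):
        """Estimate pitch and roll (radians) from a 3-axis accelerometer.

        When the scanner is held still, (ax, ay, az) measures gravity in the
        device frame, so pointing the device down yields a pitched attitude
        that the display can mirror as a downward viewing angle.
        """
        pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
        roll = math.atan2(ay, az)
        return pitch, roll

    def update_view_from_scanner(view, ax, ay, az):
        # Hypothetical glue: copy the device attitude onto the displayed view,
        # so the teeth model on screen follows the scanner's orientation.
        pitch, roll = device_attitude(ax, ay, az)
        view.set_viewing_angles(pitch=pitch, roll=roll)  # assumed view API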
In some embodiments the handheld device comprises at least one user-interface element. A user-interface element is an element which the user may manipulate in order to activate a function on the user interface of the software. Typically the user interface is graphically presented on the display of the system.
The handheld device may furthermore be provided with an actuator, which switches the handheld device between performing the at least one action and remotely controlling the view. By providing such a manual switching function that enables the operator to switch between performing the at least one action and remotely controlling the view, the operator may easily control what is performed.
Such an actuator can for example be in the form of a button, switch or contact. In other embodiments it could be a touch-sensitive surface or element.
In another embodiment the actuator could be a motion sensor provided in the handheld device that functions as the actuator when it registers a specific type of movement, for example if the operator shakes the handheld device. Examples of such motion sensors will be described herein with respect to the user-interface element; however, the person skilled in the art will, based on the disclosure herein, understand that such motion sensors may also be used as actuators as discussed.
For example, the handheld device can in one embodiment be an intra-oral 3D scanner used by a dentist. The scanner is set to be performing the action of scanning a dental area when the actuator is in one position. When the actuator is switched into a second position the handheld is set to control the view with which the 3D environment is represented on the display. This could for example be that when the dentist has scanned a part of or the complete desired area of a dental arch, he can activate the actuator, which then allows the dentist to remotely control the view of the 3D representation of the scanned area on the display by using the handheld device.
For example, the actuator could be a button. When the button is pressed quickly the handheld device is prepared for scanning, i.e. it is set for performing at least one action, the scanning procedure, in the physical 3D environment. The scanning is stopped when the button is pressed quickly a second time.
While the scanning is performed a virtual 3D representation is visually built on the display.
The user can now press and hold the button. This will put the handheld device in a controller mode, where the handheld device is adapted for remotely controlling the view with which the 3D environment, such as scanned teeth, is represented on the display. While holding the button pressed the system will use signals from a motion sensor in the handheld device to determine how to present the view of the virtual 3D environment. Thus, if the user turns or otherwise moves the hand that holds the handheld device, the view of the virtual 3D environment on the display will change accordingly.
Thus, the dentist may use the same handheld device for both scanning an area and subsequently verifying that the scan has been executed correctly without having to move away from the patient or touching any other equipment than already present in his hands.
`
`
In one embodiment the user-interface element is the same as the actuator, or where several user-interface elements are present at least one also functions as an actuator.
The system may be equipped with a button as an additional element providing the user-interface functionality.
In an example the handheld device is a handheld intraoral scanner, and the display is a computer screen. The operator or user may be a dentist, an assistant and/or the like. The operation functionality of the device may be to record some intraoral 3D geometry, and the user interface functionality may be to rotate, pan, and zoom the scanned data on the computer screen.
In some embodiments the at least one user-interface element is at least one motion sensor.
Thus the integration of the user interface functionality in the device may be provided by motion sensors, which can be accelerometers inside the scanner, whose readings determine the orientation, as displayed on the screen, of the 3D model of the teeth acquired by the scanner. Additional functionality, e.g. to start/stop scanning, may be provided by a button. The button may be located where the operator's or user's index finger can reach it conveniently.
Prior art intraoral scanners use a touch screen, a trackball, or a mouse to determine the view in the display. These prior art user interface devices can be inconvenient, awkward and difficult to use, and they can be labor-intensive, and thus costly to sterilize or disinfect. An intraoral scanner should always be disinfected between scanning different patients, because the scanner is in and may come in contact with the mouth or other parts of the patient being scanned.
The operator or user, e.g. dentist, may use one hand or both hands to hold the intraoral scanner while scanning, and the scanner may be light enough and comfortable to be held with just one hand for a longer time while scanning.
The device can also be held with one or two hands, while using the device as remote control for e.g. changing the view in the display. It is an advantage of the touchless user interface functionality that in clinical situations, the operator can maintain both hands clean, disinfected, or even sterile.
An advantage of the system is that it allows an iterative process of working in a 3D environment without releasing the handheld device during said process. For the above intraoral scanning system example, the operator, e.g. dentist, can record some teeth surface geometry with a handheld device that is an intraoral scanner, inspect coverage of the surface recording by using that same handheld device to move, e.g. rotate, the recorded surface on the display, e.g. a computer screen, detect possible gaps or holes in the coverage of the scanned teeth, and then for example arrange the scanner in the region where the gaps were located and continue recording teeth surface geometry there. Over this entire iterative cycle, which can be repeated more than once, such as as many times as required for obtaining a desired scan coverage of the teeth, the dentist does not have to lay the handheld intraoral scanner out of his or her hands.
`
In some embodiments, the 3D user interface functionality is exploited in a separate location than the operation functionality. For the above intraoral scanning system example, the scanning operation is performed in the oral cavity of the patient, while the user interface functionality is more flexibly exploited when the scanner is outside the patient's mouth. The key characteristic and advantage of the system, again, is that the dentist can exploit the dual and integrated functionality, that is operation and user interface, of the scanner without laying it out of his or her hands.
The above intraoral scanning system is an example of an embodiment. Other examples for operation functionality or performing actions could be drilling, welding, grinding, cutting, soldering, photographing, filming, measuring, executing some surgical procedure etc.
The display of the system can be a 2D computer screen, a 3D display that projects stereoscopic image pairs, a volumetric display creating a 3D effect, such as a swept-volume display, a static volume display, a parallax barrier display, a holographic display etc. Even with a 3D display, the operator has only one viewing position and viewing angle relative to the 3D environment at a time. The operator can move his/her head to assume another viewing position and/or viewing angle physically, but generally, it may be more convenient to use the handheld device with its built-in user interface functionality, e.g. the remote controlling, to change the viewing position and/or viewing angle represented in the display.
In some embodiments the system comprises multiple displays, or one or more displays that are divided into regions. For example, several sub-windows on a PC screen can represent different views of the 3D environment. The handheld device can be used to change the view in all of them, or only some of them.
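A minimal sketch of such sub-windows — a hypothetical structure, since the disclosure does not prescribe any particular implementation — keeps one View (reusing the View sketch above) per region and lets the handheld device update all regions or only the active one:

    class SubWindow:
        def __init__(self, name, view):
            self.name = name
            self.view = view          # independent View instance per region

    class MultiViewDisplay:
        def __init__(self, windows):
            self.windows = windows
            self.active = 0           # index of the sub-window inputs act on

        def apply_to_all(self, change):
            for w in self.windows:
                change(w.view)

        def apply_to_active(self, change):
            change(self.windows[self.active].view)

    # Usage: rotate every view, but zoom only the active sub-window.
    # display.apply_to_all(lambda v: v.rotate([0, 1, 0], 0.1))
    # display.apply_to_active(lambda v: v.zoom(1.2))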
`
In some embodiments the user interface functionality comprises the use of gestures.
Gestures made by e.g. the operator can be used to change, shift or toggle between sub-windows, and the user-interface functionality can be limited to an active sub-window or one of several displays.
In some embodiments the gestures are adapted to be detected by the at least one motion sensor. Gestures can alternatively and/or additionally be detected by range sensors or other sensors that record body motion.
The operator does not have to constantly watch the at least one display of the system. In many applications, the operator will shift between viewing and possibly manipulating the display and performing another operation with the handheld device. Thus it is an advantage that the operator does not have to touch other user interface devices. However, in some cases it may not be possible for the operator to fully avoid touching other devices, and in these cases it is an advantage that fewer touches are required compared to a system where a handheld device does not provide any user interface functionality at all.
In some embodiments the at least one display is arranged separate from the handheld device.
In some embodiments the at least one display is defined as a first display, and where the system further comprises a second display.
In some embodiments the second display is arranged on the handheld device.
In some embodiments the second display is arranged on the handheld device in a position such that the display is adapted to be viewed by the operator, while the operator is operating the handheld device.
`
In some embodiments the second display indicates where the handheld device is positioned relative to the 3D environment.
`
In some embodiments the first display and/or the second display provides instructions for the operator.
The display(s) can be arranged in multiple ways. For example, they can be mounted on a wall, placed on some sort of stand or a cart, placed on a rack or desk, or other.
In some embodiments at least one display is mounted on the device itself. It can be advantageous to have a display on the device itself because with such an arrangement, the operator's eyes need not focus alternatingly between different distances. In some cases, the operating functionality may require a close look at the device and the vicinity of the 3D environment it operates in, and this may be at a distance at most as far away as the operator's hand. Especially in crowded environments such as dentist's clinics, surgical operation theatres, or industrial workplaces, it may be difficult to place an external display closely to the device.
In some embodiments visual information is provided to the operator on one or more means other than the first display.
In some embodiments audible information is provided to the operator.
Thus in some embodiments, the system provides additional information to the operator. In some embodiments, the system includes other visual clues shown on means other than the display(s), such as LEDs on the device. In some embodiments, the system provides audible information to the operator, for example by different sounds and/or by speech.
Said information provided to the operator can comprise instructions for use, warnings, and the like.
The information can aid with improving the action performance or operation functionality of the device, for example by indicating how well an action or operation is being performed, and/or instructions to the operator aimed at improving the ease of the action or operation and/or the quality of the action or operation's results. For example, a LED can change in color and/or flashing frequency. In a scanner, the information can relate to how well the scanned 3D environment is in focus and/or to scan quality and/or to scan coverage. The information can comprise instructions on how best to position the scanner such as to attain good scan quality and/or scan coverage. The instructions can be used for planning and/or performing bracket placement. The instructions can be in the form of a messenger system to the operator.
In some embodiments, some 3D user interface functionality is provided by at least one motion sensor built into the device. Examples of motion sensors are accelerometers, gyros, and magnetometers and/or the like. These sensors can sense rotations, lateral motion, and/or combinations thereof. Other motion sensors use infrared sensing. For example, at least one infrared sensor can be mounted on the device and at least one infrared emitter can be mounted in the surroundings of the device. Conversely, the at least one emitter can be mounted on the device, and the at least one sensor in the surroundings. Yet another possibility is to use infrared reflector(s) on the device and both sensor(s) and emitter(s) in the surroundings, or again conversely. Thus motion can be sensed by a variety of principles.
Through proper signal processing, some sensors can recognize additional operator actions; for example gestures such as taps, waving, or shaking of the handheld device. Thus, these gestures can also be exploited in the 3D user interface functionality.
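Recognizing a shake gesture from accelerometer samples can be done, for example, by counting sign reversals of strong accelerations within a short window. This is only one possible signal-processing approach; the thresholds below are illustrative assumptions:

    from collections import deque

    class ShakeDetector:
        """Detects a shake as several strong back-and-forth accelerations."""
        def __init__(self, threshold=15.0, reversals=4, window=20):
            self.threshold = threshold    # m/s^2: minimum "strong" acceleration
            self.reversals = reversals    # direction changes needed for a shake
            self.samples = deque(maxlen=window)

        def add_sample(self, a):
            # a: acceleration along one axis, gravity-compensated.
            self.samples.append(a)
            strong = [s for s in self.samples if abs(s) > self.threshold]
            flips = sum(1 for p, n in zip(strong, strong[1:]) if p * n < 0)
            return flips >= self.reversals   # True when a shake is recognized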
In some embodiments the handheld device comprises at least two motion sensors providing sensor fusion. Sensor fusion can be used to achieve a better motion signal from for example raw gyro, accelerometer, and/or magnetometer data. Sensor fusion can be implemented in ICs such as the InvenSense MPU 3000.
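As an illustration of the idea of sensor fusion, the sketch below shows a generic complementary filter — a textbook technique, not the algorithm of the MPU 3000 or of the patented system — that combines a fast-but-drifting gyro rate with a slow-but-absolute accelerometer tilt estimate:

    import math

    class ComplementaryFilter:
        """Fuses gyro rate (fast, drifting) with accelerometer tilt (noisy, absolute)."""
        def __init__(self, alpha=0.98):
            self.alpha = alpha   # weight of the integrated gyro estimate
            self.pitch = 0.0     # fused pitch angle in radians

        def update(self, gyro_rate, ax, ay, az, dt):
            # Gyro: integrate angular rate over the timestep dt.
            gyro_pitch = self.pitch + gyro_rate * dt
            # Accelerometer: absolute tilt reference from the gravity vector.
            accel_pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
            # Blend: trust the gyro short-term, the accelerometer long-term.
            self.pitch = self.alpha * gyro_pitch + (1 - self.alpha) * accel_pitch
            return self.pitch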
`
In some embodiments the handheld device comprises at least one user-interface element other than the at least one motion sensor.
In some embodiments the at least one other user-interface element is a touch-sensitive element.
In some embodiments the at least one other user-interface element is a button.
In some embodiments the at least one other user-interface element is a scroll-wheel.
In some embodiments, user interface functionality is provided through additional elements on the device. Thus these additional elements can for example be buttons, scroll wheels, touch-sensitive fields, proximity sensors and/or the like.
`
The additional user interface elements can be exploited or utilized in a workflow suitable for the field of application of the device. The workflow may be implemented in some user software application that may also control the display and thus the view represented thereon. A given interface element can supply multiple user inputs to the software. For example, a button can provide both a single click and a double click. For example, a double click can mean to advance to a subsequent step in a workflow. For the example of intraoral scanning, three steps within the workflow can be to scan the lower mouth, the upper mouth, and the bite. A touch-sensitive field can provide strokes in multiple directions each with a different effect, etc. Providing multiple user inputs from a user interface element is advantageous because the number of user interface elements on the device can be reduced relative to a situation where each user interface element only provides one user input.
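Distinguishing a single click from a double click — so that one button can supply two workflow inputs, e.g. advancing from lower mouth to upper mouth to bite on a double click — could be sketched as follows. The timeout is an assumed tuning value and the callbacks are hypothetical:

    import time

    DOUBLE_CLICK_WINDOW_S = 0.3   # assumed: max gap between the two clicks

    class ClickClassifier:
        """Routes one button to two inputs: single click and double click."""
        def __init__(self, on_single, on_double):
            self.on_single = on_single
            self.on_double = on_double
            self._last_click = None

        def on_click(self):
            now = time.monotonic()
            if (self._last_click is not None
                    and now - self._last_click <= DOUBLE_CLICK_WINDOW_S):
                self._last_click = None
                self.on_double()      # e.g. advance to the next workflow step
            else:
                self._last_click = now

        def poll(self):
            # Fire the pending single click once the double-click window expires.
            if (self._last_click is not None
                    and time.monotonic() - self._last_click > DOUBLE_CLICK_WINDOW_S):
                self._last_click = None
                self.on_single()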
The motion sensors can also be exploited in a workflow. For example, lifting the device, which can be sensed by an accelerometer, can represent some type of user input, for example to start some action. In a device that is a scanner, it may start scanning. Conversely, placing the device back in some sort of holder, which can be sensed by an accelerometer as no acceleration occurring over some period of time, can stop said action.
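Sensing "lifted" versus "at rest in the holder" from accelerometer data can be sketched by watching the variance of recent acceleration magnitudes; the threshold and window size below are illustrative assumptions:

    from collections import deque
    import statistics

    class RestDetector:
        """Start an action when the device is picked up; stop it when put down."""
        def __init__(self, still_threshold=0.05, window=50):
            self.still_threshold = still_threshold  # variance below this = at rest
            self.magnitudes = deque(maxlen=window)

        def add_sample(self, ax, ay, az):
            self.magnitudes.append((ax**2 + ay**2 + az**2) ** 0.5)

        def is_at_rest(self):
            # Require a full window of samples before declaring the device still.
            if len(self.magnitudes) < self.magnitudes.maxlen:
                return False
            return statistics.pvariance(self.magnitudes) < self.still_threshold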
`
If the action performed by the device is some kind of recording, for example scanning, for example 3D scanning, the results of the recording can also be exploited as user inputs, possibly along with user inputs from other user interface elements. For example, with a 3D scanner with a limited depth of field, it may be possible to detect whether any objects within the 3D environment are present in the volume corresponding to this depth of field by detecting whether any 3D points are recorded. User inputs can depend on such detected presence. For example, a button click on an intraoral scanner can provide a different user input depending on whether the scanner is in the mouth, where teeth are detectable, or significantly away from and outside the mouth. Also the effect of motion sensor signals can be interpreted differently for either situation. For example, the scanner may only change the view represented on the display when it is outside the mouth.
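Such context-dependent interpretation of the same physical inputs could be sketched as follows; all names are hypothetical and the point threshold is an assumed value, since the disclosure leaves the implementation open:

    def in_scanning_volume(points_per_frame, min_points=100):
        # The scanner "sees" teeth when enough 3D points fall inside its
        # limited depth of field; min_points is an assumed threshold.
        return points_per_frame >= min_points

    def handle_button_click(points_per_frame, scanner):
        if in_scanning_volume(points_per_frame):
            scanner.toggle_scanning()      # in the mouth: button controls scanning
        else:
            scanner.enter_view_control()   # outside: button controls the view

    def handle_motion(points_per_frame, view, rotation_delta):
        # Motion sensor signals may only change the view outside the mouth.
        if not in_scanning_volume(points_per_frame):
            view.rotate([0.0, 1.0, 0.0], rotation_delta)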
In some embodiments the handheld device is adapted to change a viewing angle with which the 3D environment is represented on the at least one display.
In some embodiments the handheld device is adapted to change a magnification factor with which the 3D environment is represented on the at least one display.
In some embodiments the view of the 3D environment comprises a viewing angle, a magnification factor, and/or a viewing position.
In some embodiments the view of the 3D environment comprises rendering of texture and/or shading.
In some embodiments the at least one display is divided into multiple regions, each showing the 3D environment with a different view.
Thus in some embodiments the user interface functionality comprises changing the view with which the 3D environment is displayed. Changes in view can comprise changes in viewing angle, viewing position, magnification and/or the like. A change in viewing angle can naturally be effected by rotating the device. Rotation is naturally sensed by the aid of gyros and/or relative to gravity sensed by an accelerometer. Zooming, i.e. a change in magnification, can for example be achieved by pushing the handheld device forward and backward, respectively. A translational change of the viewing position, i.e. panning, can for example be achieved by pushing the handheld device up/down and/or sideways.
In some embodiments the user interface functionality comprises selecting or choosing items on a display or any other functionality provided by graphical user interfaces in computers known in the art. The operator may perform the selection. The Lava C.O.S. scanner marketed by 3M ESPE has additional buttons on the handheld device, but it is not possible to manipulate the view by these. Their only purpose is to allow navigation through a menu system, and to start/stop scanning.
In some embodiments the user interface functionality comprises manipulating the 3D environment displayed on the screen. For example, the operator may effect deformations or change the position or orientation of objects in the 3D environment. Thus, in some embodiments the user interface functionality comprises virtual user interface functionality, which can be that the 3D data are manipulated, but the physical 3D environment in which the device operates may not be manipulated.
In some embodiments the handheld device is an intraoral scanner and/or an in-the-ear scanner. If the scanner comprises a tip, this tip may be exchanged whereby the scanner can become suitable for scanning in the mouth or in the ear. Since the ear is a smaller cavity than the mouth, the tip for fitting into an ear may be smaller than a tip for fitting in the mouth.
In some embodiments the handheld device is a surgical instrument. In some embodiments, the surgical instrument comprises at least one motion sensor, which is built in in the instrument.
In some embodiments the handheld device is a mechanical tool. In some embodiments, the tool has at least one motion sensor built in. In other embodiments, other user-interface elements are built in as well, for example buttons, scroll wheels, touch-sensitive fields, or proximity sensors.
In some embodiments the 3D geometry of the 3D environment is known a priori or a 3D representation of the environment is known a priori, i.e. before the action(s) are performed. For example in surgery, a CT scan may have been taken before the surgical procedure. The handheld device in this example could be a surgical instrument that a physician needs to apply in the proper 3D position. To make sure this proper position is reached, it could be beneficial to view the 3D environment from multiple perspectives interactively, i.e. without having to release the surgical instrument.
An advantage of the system, also