`
WIPO
WORLD INTELLECTUAL PROPERTY ORGANIZATION
`
DOCUMENT MADE AVAILABLE UNDER THE
PATENT COOPERATION TREATY (PCT)

International application number: PCT/DK2011/050461

International filing date: 05 December 2011 (05.12.2011)
`
Document type: Certified copy of priority document

Document details:
Country/Office: US
Number: 61/420,138
Filing date: 06 December 2010 (06.12.2010)
`
Date of receipt at the International Bureau: 19 December 2011 (19.12.2011)
`
Remark: Priority document submitted or transmitted to the International Bureau in compliance with Rule 17.1(a), (b) or (b-bis)
`
34, chemin des Colombettes
1211 Geneva 20, Switzerland
www.wipo.int
`
`001
`
`Align EX1009
`Align v. 3Shape
`IPR2022-00145
`
`
`
UNITED STATES DEPARTMENT OF COMMERCE
`
`December 19, 2011
`
THIS IS TO CERTIFY THAT ANNEXED HERETO IS A TRUE COPY FROM
THE RECORDS OF THE UNITED STATES PATENT AND TRADEMARK
OFFICE OF THOSE PAPERS OF THE BELOW IDENTIFIED PATENT
APPLICATION THAT MET THE REQUIREMENTS TO BE GRANTED A
FILING DATE UNDER 35 USC 111.
`
`APPLICATION NUMBER: 61/420,138
`FILING DATE: December 06, 2010
`
`THE COUNTRY CODE AND NUMBER OF YOUR PRIORITY
`APPLICATION, TO BE USED FOR FILING ABROAD UNDER THE PARIS
`CONVENTION, IS US61/420,138
`
Certified by

Under Secretary of Commerce
for Intellectual Property
and Director of the United States
Patent and Trademark Office
`
`002
`
`
`
`32
`
`System with 3D user interface integration
`
`5
`
`Abstract
`
`Disclosed is a system comprising a handheld device and at least one display,
`where the handheld device is adapted for performing at least one action in a
`physical 3D environment, where the at least one display is adapted for
visually representing the physical 3D environment, and where the handheld
`device is adapted for remotely controlling the view with which the 3D
`environment is represented on the display.
`
`
(Fig. 2a should be published)
`
`003
`
`
`
`25
`
`Claims:
`
`5
`
`10
`
`15
`
`1. A system comprising a handheld device and at least one display, where
the handheld device is adapted for performing at least one action in a
`physical 3D environment, where the at least one display is adapted for
`visually representing the physical 3D environment, and where the handheld
`device is adapted for remotely controlling the view with which the 3D
`environment is represented on the display.
`
2. The system according to any one or more of the preceding claims, wherein
`the view is defined as viewing angle and/or viewing position.
`
`3. The system according to any one or more of the preceding claims, wherein
`the handheld device is adapted for remotely controlling the magnification with
`which the 3D environment is represented on the display.
`
4. The system according to any one or more of the preceding claims, wherein
`the handheld device is adapted for changing the rendering of the 3D
`environment on the display.
`
`20
`
5. The system according to any one or more of the preceding claims, wherein
`the at least one action comprises one or more of:
`
`- measuring,
`- recording,
`
`25
`
`- scanning,
`- manipulating, and/or
`- modifying.
`
`30
`
`6. The system according to any one or more of the preceding claims, wherein
`the 3D environment comprises one or more 3D objects.
`
`004
`
`
`
`26
`
`7. The system according to any one or more of the preceding claims, wherein
`the handheld device is adapted to be held in one hand by an operator.
`
`5
`
`8. The system according to any one or more of the preceding claims, wherein
the display is adapted to represent the 3D environment from multiple views.
`
`1 0
`
`15
`
`20
`
`9. The system according to any one or more of the preceding claims, wherein
`the view of the 3D environment represented in the at least one display is at
`least partly determined by the motion of the operator's hand holding said
`device.
`
`10. The system according to any one or more of the preceding claims,
`wherein the magnification represented in the at least one display is at least
`partly determined by the motion of the operator's hand holding said device.
`
`11 . The system according to any one or more of the preceding claims,
`wherein the handheld device is adapted to record the 3D geometry of the 3D
`environment.
`
`12. The system according to any one or more of the preceding claims,
wherein the 3D geometry of the 3D environment is known a priori.
`
`13. The system according to any one or more of the preceding claims,
`wherein the handheld device comprises at least one user-interface element.
`
`25
`
`14. The system according to any one or more of the preceding claims,
`wherein the at least one user-interface element is at least one motion sensor.
`
`005
`
`
`
`27
`
15. The system according to any one or more of the preceding claims,
wherein the handheld device comprises at least two motion sensors providing
sensor fusion.
`
`5
`
16. The system according to any one or more of the preceding claims,
wherein the user interface functionality comprises the use of gestures.
`
17. The system according to any one or more of the preceding claims,
wherein the gestures are detected by the at least one motion sensor.
`
`10
`
18. The system according to any one or more of the preceding claims,
wherein the handheld device comprises at least one user-interface element
other than the at least one motion sensor.
`
`15
`
19. The system according to any one or more of the preceding claims,
wherein the at least one other user-interface element is a touch-sensitive
element.
`
20. The system according to any one or more of the preceding claims,
wherein the at least one other user-interface element is a button.
`
`21 . The system according to any one or more of the preceding claims,
`
`wherein the at least one other user-interface element is a scroll wheel.
`
`25
`
22. The system according to any one or more of the preceding claims,
wherein the handheld device is adapted to change a viewing angle with
which the 3D environment is represented on the at least one display.
`
23. The system according to any of the preceding claims, wherein the
handheld device is adapted to change a magnification factor with which the
3D environment is represented on the at least one display.
`
`006
`
`
`
`28
`
`5
`
`1 0
`
`15
`
`20
`
`24. The system according to any one or more of the preceding claims,
`wherein the handheld device is adapted to change a viewing position with
`which the 3D environment is represented on the at least one display.
`
`25. The system according to any one or more of the preceding claims,
`wherein the view of the 3D environment comprises a viewing angle, a
`magnification factor, and/or a viewing position.
`
`26. The system according to any one or more of the preceding claims,
`wherein the view of the 3D environment comprises rendering of texture
`and/or shading.
`
`27. The system according to any one or more of the preceding claims,
`wherein the at least one display is divided into multiple regions, each
`showing the 3D environment with a different view.
`
`28. The system according to any one or more of the preceding claims,
`wherein the 3D geometry comprises a 3D surface of the environment.
`
`29. The system according to any one or more of the preceding claims,
`wherein the 3D geometry comprises a 3D volumetric representation of the
`environment.
`
`25
`
30. The system according to any one or more of the preceding claims,
wherein the handheld device is an intra-oral 3D scanner.
`
31. The system according to any one or more of the preceding claims,
wherein the handheld device is a surgical instrument.
`
`30
`
`007
`
`
`
`29
`
`32. The system according to any one or more of the preceding claims,
`wherein the handheld device is a mechanical tool.
`
`33. The system according to any one or more of the preceding claims,
`wherein the handheld device is an in-ear 3D scanner.
`
`5
`
`10
`
`15
`
`20
`
`25
`
`34. The system according to any one or more of the preceding claims,
`wherein the at least one display is arranged separate from the handheld
`device.
`
`35. The system according to any one or more of the preceding claims,
`wherein the at least one display is arranged on a cart.
`
`36. The system according to any one or more of the preceding claims,
`wherein the at least one display is defined as a first display, and where the
`system further comprises a second display.
`
`37. The system according to any one or more of the preceding claims,
`wherein the second display is arranged on the handheld device.
`
`38. The system according to any one or more of the preceding claims,
`wherein the second display is arranged on the handheld device in a position
such that the display is adapted to be viewed by the operator, while the
operator is operating the handheld device.
`
39. The system according to any one or more of the preceding claims,
wherein the second display indicates where the handheld device is
positioned relative to the 3D environment.
`
`008
`
`
`
`30
`
`40. The system according to any one or more of the preceding claims,
`wherein the first display and/or the second display provides instructions for
`the operator.
`
`5
`
`41 . The system according to any one or more of the preceding claims,
`wherein visual information is provided to the operator on one or more means
`other than the first display.
`
42. The system according to any one or more of the preceding claims,
wherein audible information is provided to the operator.
`
`15
`
`20
`
43. The system according to any one or more of the preceding claims,
wherein the scanning is performed by means of LED scanning, laser light
scanning, white light scanning, X-ray scanning, and/or CT scanning.
`
44. A method of interaction between a handheld device and at least one
display, where the method comprises the steps of:
- performing at least one action in a physical 3D environment by means of
the handheld device;
- visually representing the physical 3D environment by the at least one
display; and
- remotely controlling the view of the represented 3D environment on the
display by means of the handheld device.
`
`25
`
45. A computer program product comprising program code means for
causing a data processing system to perform the method of any one or more
of the preceding claims, when said program code means are executed on the
data processing system.
`
`009
`
`
`
`31
`
46. A computer program product according to the previous claim, comprising
a computer-readable medium having stored thereon the program code
means.
`
`5
`
`010
`
`
`
`System with 3D user interface integration
`
Field of the invention
`
`5
`
`10
`
`15
`
`20
`
`This invention generally relates to a method and a system comprising a
handheld device and at least one display.
`
`Background of the invention
`
`3D visualization is important in many fields of industry and medicine, where
`3D information is becoming more and more predominant.
`
Displaying and inspecting 3D information is inherently difficult. To fully
understand a 3D object or an entire environment on a screen, the user should
generally be able to rotate the object or scene, such that many or
preferentially all surfaces are displayed. This is true even for 3D displays,
e.g. stereoscopic or holographic, where from a given viewing position and
with a given viewing angle, the user will only see some surfaces of an
arbitrary 3D environment. Often, the user will also want to zoom into details
or zoom out for an overview.
`
Various user interaction devices are in use for software that displays 3D data;
these devices include 3D mice, space balls, and touch screens. The operation
of these current interaction devices requires physically touching them.
`
`25
`
Physically touching a user-interaction device can be a disadvantage in
medical applications due to risks of cross-contamination between patients or
between patient and operator, or in industrial applications in dirty
environments.
`
`011
`
`
`
`2
`
Several non-touch user interfaces for 3D data viewing in medical applications
have been described in the literature. Vogt et al. (2004) describe a touchless
interactive system for in-situ visualization of 3D medical imaging data. The
user interface is based on tracking of reflective markers, where a camera is
mounted on the physician's head. Graetzel et al. (2004) describe a touchless
system that interprets hand gestures as mouse actions. It is based on stereo
vision and intended for use in minimally invasive surgery.
`
`It remains a problem to improve systems that require user interfaces for view
`control, which for example can be used for clinical purposes.
`
`1 0
`
`Summary
`
Disclosed is a system comprising a handheld device and at least one display,
where the handheld device is adapted for performing at least one action in a
physical 3D environment, where the at least one display is adapted for
visually representing the physical 3D environment, and where the handheld
device is adapted for remotely controlling the view with which said 3D
environment is represented on the display.
`
`15
`
`20
`
The system disclosed here performs the integration of 3D user interface
functionality with any other handheld device with other operating functionality,
such that the operator ideally only touches this latter device that is intended
to be touched. A particular example of such a handheld device is one that
records some 3D geometry, for example a handheld 3D scanner.
`
`25
`
`The handheld device is a multi-purpose device, such as a dual-purpose or
`two-purpose device, i.e. a device both for performing actions in the physical
`
`3D environment, such as measuring and manipulating, and for remotely
`controlling the view of the 3D environment on the display.
`
`30
`
`012
`
`
`
`3
`
Geometrically, a view is determined by the virtual observer's/camera's
position and orientation relative to the 3D environment or its visual
representation. If the display is two-dimensional, the view is also determined
by the type of projection. A view may also be determined by a magnification
factor.
`
`5
`
The virtual observer's and the 3D environment's position and orientation are
always relative to each other. In terms of user experience in software
systems with 3D input devices, the user may feel that, for example, he/she is
moving the 3D environment while remaining stationary himself/herself, but
there is always an equivalent movement of the virtual observer/camera that
gives the same results on the display. Often, descriptions of 3D software
systems use the expression "pan" to indicate an apparent translational
movement of the 3D environment, "rotate" to indicate a rotational movement
of the 3D environment, and "zoom" to indicate a change in magnification
factor.
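
A minimal Python sketch of these pan/rotate/zoom semantics, assuming a
rotation-matrix camera state and the NumPy library (both assumptions made
for illustration, not taken from the disclosure):

    import numpy as np

    class View:
        # Virtual observer/camera state: position, orientation, magnification.
        def __init__(self):
            self.position = np.zeros(3)   # camera position in world coordinates
            self.rotation = np.eye(3)     # camera orientation as a rotation matrix
            self.magnification = 1.0      # "zoom" factor

        def pan(self, dx, dy):
            # Apparent translation of the 3D environment: move the camera
            # in its own image plane by (dx, dy).
            self.position += self.rotation @ np.array([dx, dy, 0.0])

        def rotate(self, axis, angle):
            # Rotate the camera about a unit axis by `angle` radians,
            # using Rodrigues' rotation formula.
            a = np.asarray(axis, dtype=float)
            a /= np.linalg.norm(a)
            K = np.array([[0.0, -a[2], a[1]],
                          [a[2], 0.0, -a[0]],
                          [-a[1], a[0], 0.0]])
            R = np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)
            self.rotation = R @ self.rotation

        def zoom(self, factor):
            # Change the magnification factor with which the environment
            # is represented on the display.
            self.magnification *= factor

Because observer and environment are only relative to each other, the same
class could equally be read as moving the environment while the camera
stays fixed.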
`
`15
`
Graphically, a view can represent a 3D environment by means of
photographs or as some kind of virtual representation such as a computer
graphic, or similar. A computer graphic can be rendered for example with
texture and/or shading and/or virtual light sources and/or light models for
surface properties. A computer graphic can also be a simplified
representation of the 3D environment, for example a mesh, an outline, or an
otherwise simplified representation. All or parts of the 3D environment can
also be rendered with some degree of transparency. A view may represent
the 3D environment in total or only parts thereof.
`
All of the touch-less prior art systems are 3D user interface devices only. In
many prior art applications, the operator using such a user interface device
will also hold and work with another device that really is the central device in
the overall application, e.g. a medical instrument.
`
`20
`
`25
`
`30
`
`013
`
`
`
`4
`
`It is thus an advantage of the present system that the 3D user-interface
`functionality is integrated in the central device, which is used for performing
`some kind of action.
`
`5
`
In some embodiments the handheld device is adapted for remotely
controlling the magnification with which the 3D environment is represented
on the display.
`
`1 0
`
In some embodiments the handheld device is adapted for changing the
rendering of the 3D environment on the display.
`
`15
`
`20
`
`In some embodiments the view is defined as viewing angle and/or viewing
`position.
`
In some embodiments the at least one action comprises one or more of the
actions of:
- measuring,
- recording,
- scanning,
- manipulating,
- modifying.
`
`In some embodiments the 3D environment comprises one or more 3D
`objects.
`
`25
`
`In some embodiments the handheld device is adapted to be held in one hand
`by an operator.
`
`30
`
In some embodiments the display is adapted to represent the 3D
environment from multiple views.
`
`014
`
`
`
`5
`
In some embodiments the display is adapted to represent the 3D
environment from different viewing angles and/or viewing positions.
`
`5
`
`In some embodiments the view of the 3D environment in the at least one
`display is at least partly determined by the motion of the operator's hand
`holding said device.
`
In some embodiments the magnification represented in the at least one
display is at least partly determined by the motion of the operator's hand
holding said device.
`
`15
`
`20
`
In some embodiments the handheld device is adapted to record the 3D
geometry of the 3D environment.
Thus the handheld device may be an intraoral dental scanner, which records
the 3D geometry of a patient's teeth. The operator may move the scanner
along the teeth of the patient for capturing the 3D geometry of the relevant
teeth, e.g. all teeth. The scanner may comprise motion sensors for taking the
movement of the scanner into account while creating the 3D model of the
scanned teeth.
`
`The 3D model of the teeth may be shown on a display, and the display may
`for example be a PC screen and/or the like.
`
The user interface functionality may comprise incorporating motion sensors
in the scanner to provide that the user can determine the view on the screen
by moving the scanner. Pointing the scanner down can provide that the
scanned teeth are shown given a downward viewing angle. Holding the
scanner in a horizontal position can provide that the viewing angle is likewise
horizontal.
`
`30
`
`015
`
`
`
`6
`
In some embodiments the handheld device comprises at least one
user-interface element.
`The system may be equipped with a button as an additional element
`providing the user-interface functionality.
`
`5
`
`In an example the handheld device is a handheld intraoral scanner, and the
`display is a computer screen. The operator or user may be a dentist, an
`assistant and/or the like. The operation functionality of the device may be to
record some intraoral 3D geometry, and the user interface functionality may
be to rotate, pan, and zoom the scanned data on the computer screen.
`
In some embodiments the at least one user-interface element is at least one
motion sensor.
Thus the integration of the user interface functionality in the device may be
provided by motion sensors, which can be accelerometers inside the
scanner, whose readings determine the orientation, on the screen, of the
displayed 3D model of the teeth acquired by the scanner. Additional
functionality, e.g. to start/stop scanning, may be provided by a button. The
button may be located where the operator's or user's index finger can reach it
conveniently.
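
As an illustration of how static accelerometer readings can determine such
an orientation, the following Python sketch derives pitch and roll from
gravity; the axis convention and the mapping to the display are assumptions
made for this example only:

    import math

    def view_angles_from_accel(ax, ay, az):
        # Estimate pitch and roll (radians) of the scanner from a static
        # accelerometer reading (in g), using gravity as the reference.
        # Assumed axis convention: x forward, y left, z up.
        pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
        roll = math.atan2(ay, az)
        return pitch, roll

    # Pointing the scanner down yields a downward pitch, which the display
    # software could map to a downward viewing angle on the 3D model;
    # holding it horizontally maps to a horizontal viewing angle.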
`
`15
`
`20
`
`Prior art intraoral scanners use a touch screen, a trackball, or a mouse to
`determine the view in the display. These prior art user interface devices can
`
`be inconvenient, awkward and difficult to use, and they can be labor-
`intensive, and thus costly to sterilize or disinfect. An intraoral scanner should
`
`25
`
always be disinfected between scanning different patients, because the
scanner is in, and may come into contact with, the mouth or other parts of the
patient being scanned.
`
`30
`
The operator or user, e.g. dentist, may use one hand or both hands to hold
the intraoral scanner while scanning, and the scanner may be light enough
`
`016
`
`
`
`7
`
and comfortable to be held with just one hand for a longer time while
scanning.
`
The device can also be held with one or two hands, while using the device as
remote control for e.g. changing the view in the display. It is an advantage of
the touchless user interface functionality that in clinical situations, the
operator can maintain both hands clean, disinfected, or even sterile.
`
An advantage of the system is that it allows an iterative process of working in
a 3D environment without releasing the handheld device during said process.
For the above intraoral scanning system example, the operator, e.g. dentist,
can record some teeth surface geometry with a handheld device that is an
intraoral scanner, inspect coverage of the surface recording by using that
same handheld device to move, e.g. rotate, the recorded surface on the
display, e.g. a computer screen, detect possible gaps or holes in the
coverage of the scanned teeth, and then for example arrange the scanner in
the region where the gaps were located and continue recording teeth surface
geometry there. Over this entire iterative cycle, which can be repeated more
than once, for example as many times as required for obtaining a desired
scan coverage of the teeth, the dentist does not have to lay the handheld
intraoral scanner out of his or her hands.
`
In some embodiments, the 3D user interface functionality is exploited in a
separate location than the operation functionality. For the above intraoral
scanning system example, the scanning operation is performed in the oral
cavity of the patient, while the user interface functionality is more flexibly
exploited when the scanner is outside the patient's mouth. The key
characteristic and advantage of the system, again, is that the dentist can
exploit the dual and integrated functionality, that is operation and user
interface, of the scanner without laying it out of his or her hands.
`
`017
`
`
`
`8
`
The above intraoral scanning system is an example of an embodiment. Other
examples for operation functionality or performing actions could be drilling,
welding, grinding, cutting, soldering, photographing, filming, measuring,
executing some surgical procedure, etc.
`
`5
`
The display of the system can be a 2D computer screen, a 3D display that
projects stereoscopic image pairs, a volumetric display creating a 3D effect,
such as a swept-volume display, a static volume display, a parallax barrier
display, a holographic display, etc. Even with a 3D display, the operator has
only one viewing position and viewing angle relative to the 3D environment at
a time. The operator can move his/her head to assume another viewing
position and/or viewing angle physically, but generally, it may be more
convenient to use the handheld device with its built-in user interface
functionality, e.g. the remote controlling, to change the viewing position
and/or viewing angle represented in the display.
`
`15
`
In some embodiments the system comprises multiple displays, or one or
more displays that are divided into regions. For example, several
sub-windows on a PC screen can represent different views of the 3D
environment. The handheld device can be used to change the view in all of
them, or only some of them.
`
In some embodiments the user interface functionality comprises the use of
gestures.
Gestures made by e.g. the operator can be used to change, shift or toggle
between sub-windows, and the user-interface functionality can be limited to
an active sub-window or one of several displays.
`
In some embodiments the gestures are adapted to be detected by the at
least one motion sensor. Gestures can alternatively and/or additionally be
detected by range sensors or other sensors that record body motion.
`
`018
`
`
`
`9
`
The operator does not have to constantly watch the at least one display of
the system. In many applications, the operator will shift between viewing and
possibly manipulating the display and performing another operation with the
handheld device. Thus it is an advantage that the operator does not have to
touch other user interface devices. However, in some cases it may not be
possible for the operator to fully avoid touching other devices, and in these
cases it is an advantage that fewer touches are required compared to a
system where a handheld device does not provide any user interface
functionality at all.
`
`5
`
`1 0
`
`In some embodiments the at least one display is arranged separate from the
`
`handheld device.
`
`15
`
`In some embodiments the at least one display is defined as a first display,
`and where the system further comprises a second display.
`
`In some embodiments the second display is arranged on the handheld
`device.
`
`20
`
`In some embodiments the second display is arranged on the handheld
`device in a position such that the display is adapted to be viewed by the
`operator, while the operator is operating the handheld device.
`
`25
`
In some embodiments the second display indicates where the handheld
device is positioned relative to the 3D environment.
`
In some embodiments the first display and/or the second display provides
instructions for the operator.
`
`30
`
`019
`
`
`
`10
`
The display(s) can be arranged in multiple ways. For example, they can be
mounted on a wall, placed on some sort of stand or a cart, placed on a rack
or desk, or the like.
`
`5
`
In some embodiments at least one display is mounted on the device itself. It
can be advantageous to have a display on the device itself because with
such an arrangement, the operator's eyes need not alternate focus between
different distances. In some cases, the operating functionality may require a
close look at the device and the vicinity of the 3D environment it operates in,
and this may be at a distance at most as far away as the operator's hand.
Especially in crowded environments such as dentists' clinics, surgical
operation theatres, or industrial workplaces, it may be difficult to place an
external display close to the device.
`
`15
`
`In some embodiments visual information is provided to the operator on one
`or more means other than the first display.
`
In some embodiments audible information is provided to the operator.
`
`20
`
Thus in some embodiments, the system provides additional information to
the operator. In some embodiments, the system includes other visual cues
shown on means other than the display(s), such as LEDs on the device. In
some embodiments, the system provides audible information to the operator,
for example by different sounds and/or by speech.
`
`25
`
Said information provided to the operator can comprise instructions for use,
warnings, and the like.
`
The information can aid with improving the action performance or operation
functionality of the device, for example by indicating how well an action or
operation is being performed, and/or by instructions to the operator aimed at
improving the ease of the action or operation and/or the quality of the action
or operation's results. For example, an LED can change in color and/or
flashing frequency. In a scanner, the information can relate to how well the
scanned 3D environment is in focus and/or to scan quality and/or to scan
coverage. The information can comprise instructions on how best to position
the scanner such as to attain good scan quality and/or scan coverage. The
instructions can be used for planning and/or performing bracket placement.
The instructions can be in the form of a messenger system to the operator.
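
One conceivable realization of such an LED cue, sketched in Python; the
scan-quality metric in [0, 1] and the color/frequency mapping are illustrative
assumptions, not from the disclosure:

    def led_feedback(scan_quality):
        # Map a hypothetical scan-quality metric in [0, 1] to an LED
        # color and a flashing frequency in Hz (0.0 means steady light).
        if scan_quality > 0.8:
            return "green", 0.0    # good focus/coverage: steady green
        if scan_quality > 0.5:
            return "yellow", 1.0   # usable, but the scanner could be repositioned
        return "red", 4.0          # poor: fast red flashing prompts action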
`
In some embodiments, some 3D user interface functionality is provided by at
least one motion sensor built into the device. Examples of motion sensors
are accelerometers, gyros, and magnetometers and/or the like. These
sensors can sense rotations, lateral motion, and/or combinations thereof.
Other motion sensors use infrared sensing. For example, at least one
infrared sensor can be mounted on the device and at least one infrared
emitter can be mounted in the surroundings of the device. Conversely, the at
least one emitter can be mounted on the device, and the at least one sensor
in the surroundings. Yet another possibility is to use infrared reflector(s) on
the device and both sensor(s) and emitter(s) in the surroundings, or again
conversely. Thus motion can be sensed by a variety of principles.
`
`5
`
`1 0
`
`15
`
`20
`
Through proper signal processing, some sensors can recognize additional
operator actions, for example gestures such as taps, waving, or shaking of
the handheld device. Thus, these gestures can also be exploited in the 3D
user interface functionality.
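
A simple Python sketch of such signal processing for tap gestures; the
threshold and refractory period are illustrative values, not taken from the
disclosure:

    def detect_taps(accel_magnitudes, sample_rate_hz,
                    threshold_g=2.5, refractory_s=0.2):
        # Classify short spikes in accelerometer magnitude (in g) as tap
        # gestures. A sample above the threshold registers a tap; the
        # refractory period suppresses the ringing that follows a tap.
        taps, last_tap = [], -refractory_s
        for i, magnitude in enumerate(accel_magnitudes):
            t = i / sample_rate_hz
            if magnitude > threshold_g and t - last_tap >= refractory_s:
                taps.append(t)
                last_tap = t
        return taps  # timestamps (seconds) of detected taps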
`
`25
`
In some embodiments the handheld device comprises at least two motion
sensors providing sensor fusion. Sensor fusion can be used to achieve a
better motion signal from, for example, raw gyro, accelerometer, and/or
magnetometer data. Sensor fusion can be implemented in ICs such as the
InvenSense MPU-3000.
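
As one common form of such fusion (a complementary filter is an
illustrative choice here, not a technique named in the disclosure), a gyro
and an accelerometer can be blended per axis, sketched in Python:

    def fuse_pitch(prev_pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
        # One update step of a complementary filter: integrate the gyro
        # rate (smooth but drifting) and blend in the accelerometer's
        # gravity-derived pitch (noisy but drift-free). `alpha` weights
        # the gyro path; 0.98 is an illustrative value.
        return alpha * (prev_pitch + gyro_rate * dt) + (1.0 - alpha) * accel_pitch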
`
`021
`
`
`
`12
`
In some embodiments the handheld device comprises at least one
user-interface element other than the at least one motion sensor.
`
`5
`
`In some embodiments the at least one other user-interface element is a
`touch-sensitive element.
`
`10
`
`15
`
`In some embodiments the at least one other user-interface element is a
`button.
`
`In some embodiments the at least one other user-interface element is a
`scroll-wheel.
`
In some embodiments, user interface functionality is provided through
additional elements on the device. Thus these additional elements can for
example be buttons, scroll wheels, touch-sensitive fields, proximity sensors
and/or the like.
`
The additional user interface elements can be exploited or utilized in a
workflow suitable for the field of application of the device. The workflow may
be implemented in some user software application that may also control the
display and thus the view represented thereon. A given interface element can
supply multiple user inputs to the software. For example, a button can
provide both a single click and a double click. For example, a double click
can mean to advance to a subsequent step in a workflow. For the example of
intraoral scanning, three steps within the workflow can be to scan the lower
mouth, the upper mouth, and the bite. A touch-sensitive field can provide
strokes in multiple directions each with a different effect, etc. Providing
multiple user inputs from a user interface element is advantageous because
the number of user interface elements on the device can be reduced relative
to a situation where each user interface element only provides one user
input.
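
A minimal Python sketch of deriving two user inputs from one button; the
time window and event names are illustrative assumptions:

    import time

    class ButtonInput:
        # Derive two user inputs (single and double click) from one
        # physical button. A press arriving within the window after the
        # previous press is reported as a double click; otherwise the
        # press is reported as a single click immediately.
        DOUBLE_CLICK_WINDOW_S = 0.4  # illustrative value

        def __init__(self):
            self._last_press = None

        def on_press(self, now=None):
            now = time.monotonic() if now is None else now
            if (self._last_press is not None
                    and now - self._last_press <= self.DOUBLE_CLICK_WINDOW_S):
                self._last_press = None
                return "double_click"  # e.g. advance to the next workflow step
            self._last_press = now
            return "single_click"

    # A double click could, for example, step through a workflow such as
    # ["scan lower mouth", "scan upper mouth", "scan bite"].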
`
`5
`
`10
`
`15
`
The motion sensors can also be exploited in a workflow. For example, lifting
the device, which can be sensed by an accelerometer, can represent some
type of user input, for example to start some action. In a device that is a
scanner, it may start scanning. Conversely, placing the device back in some
sort of holder, which can be sensed by an accelerometer as no acceleration
occurring over some period of time, can stop said action.
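
A Python sketch of one such rest/lift heuristic; the tolerance and window
semantics are assumptions made for illustration:

    import numpy as np

    def device_is_resting(accel_window_g, tolerance_g=0.05):
        # Heuristic: the device is at rest (e.g. back in its holder) if the
        # accelerometer magnitude stays within `tolerance_g` of 1 g, i.e.
        # gravity only, over a whole window of (x, y, z) samples in g.
        magnitudes = np.linalg.norm(np.asarray(accel_window_g), axis=1)
        return bool(np.all(np.abs(magnitudes - 1.0) < tolerance_g))

    # Lifting the device breaks this condition and could start scanning;
    # a sufficiently long resting window could stop it again.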
`
If the action performed by the device is some kind of recording, for example
scanning, for example 3D scanning, the results of the recording can also be
exploited as user inputs, possibly along with user inputs from other user
interface elements. For example, with a 3D scanner with a limited depth of
field, it may be possible to detect whether any objects within the 3D
environment are present in the volume corresponding to this depth of field
by detecting whether any 3D points are recorded. User inputs can depend on
such detected presence. For example, a button click on an intraoral scanner
can provide a different user input depending on whether the scanner is in the
mouth, where teeth are detectable, or significantly away from and outside the
mouth. Also the effect of motion sensor signals can be interpreted differently
for either situation. For example, the scanner may only change the view
represented on the display when it is outside the mouth.
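
A Python sketch of such context-dependent input handling; the event names
and the presence test are illustrative assumptions:

    def interpret_button_click(points_in_depth_of_field):
        # Context-dependent input: if the scanner's limited depth of field
        # currently yields recorded 3D points (teeth in view, i.e. the
        # scanner is in the mouth), a click acts on the scan; otherwise it
        # toggles remote view control of the display.
        if len(points_in_depth_of_field) > 0:
            return "scan_action"        # in mouth: e.g. act on the recording
        return "view_control_toggle"    # outside mouth: control the view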
`
`25
`
In some embodiments the handheld device is adapted to change a viewing
angle with which the 3D environment is represented on the at least one
display.
`
In some embodiments the handheld device is adapted to change a
magnification factor with which the 3D environment is represented on the at
least one display.
`
`023
`
`
`
`14
`
`In