WIPO
WORLD INTELLECTUAL PROPERTY ORGANIZATION

DOCUMENT MADE AVAILABLE UNDER THE
PATENT COOPERATION TREATY (PCT)

International application number: PCT/DK2011/050461
International filing date: 05 December 2011 (05.12.2011)

Document type: Certified copy of priority document

Document details:
Country/Office: US
Number: 61/420,138
Filing date: 06 December 2010 (06.12.2010)

Date of receipt at the International Bureau: 19 December 2011 (19.12.2011)

Remark: Priority document submitted or transmitted to the International Bureau in compliance with Rule 17.1(a), (b) or (b-bis)

34, chemin des Colombettes
1211 Geneva 20, Switzerland
www.wipo.int
`
`0001
`
`Exhibit 1005 page 1 of 45
`DENTAL IMAGING
`
`
`
[USPTO seal]

UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office

December 19, 2011
`
THIS IS TO CERTIFY THAT ANNEXED HERETO IS A TRUE COPY FROM THE RECORDS OF THE UNITED STATES PATENT AND TRADEMARK OFFICE OF THOSE PAPERS OF THE BELOW IDENTIFIED PATENT APPLICATION THAT MET THE REQUIREMENTS TO BE GRANTED A FILING DATE UNDER 35 USC 111.

APPLICATION NUMBER: 61/420,138
FILING DATE: December 06, 2010

THE COUNTRY CODE AND NUMBER OF YOUR PRIORITY APPLICATION, TO BE USED FOR FILING ABROAD UNDER THE PARIS CONVENTION, IS US61/420,138
`
`
`
`
`Certified by
`
`S Caypes
`
Under Secretary of Commerce
for Intellectual Property
and Director of the United States Patent and Trademark Office
`
`
`System with 3D user interface integration
`
`
`Abstract
`
Disclosed is a system comprising a handheld device and at least one display, where the handheld device is adapted for performing at least one action in a physical 3D environment, where the at least one display is adapted for visually representing the physical 3D environment, and where the handheld device is adapted for remotely controlling the view with which the 3D environment is represented on the display.
`
(Fig. 2a should be published)
`
`
`Claims:
`
1. A system comprising a handheld device and at least one display, where the handheld device is adapted for performing at least one action in a physical 3D environment, where the at least one display is adapted for visually representing the physical 3D environment, and where the handheld device is adapted for remotely controlling the view with which the 3D environment is represented on the display.
`
`
`2. The system according to any one or more of the preceding claims, wherein
`
`the view is defined as viewing angle and/or viewing position.
`
3. The system according to any one or more of the preceding claims, wherein the handheld device is adapted for remotely controlling the magnification with which the 3D environment is represented on the display.
`
4. The system according to any one or more of the preceding claims, wherein the handheld device is adapted for changing the rendering of the 3D environment on the display.
`
5. The system according to any one or more of the preceding claims, wherein the at least one action comprises one or more of:
- measuring,
- recording,
- scanning,
- manipulating, and/or
- modifying.
`
`
`6. The system according to any one or more of the preceding claims, wherein
`
`the 3D environment comprises one or more 3D objects.
`
`
`7. The system according to any one or more of the preceding claims, wherein
`
`the handheld device is adapted to be held in one hand by an operator.
`
`
8. The system according to any one or more of the preceding claims, wherein the display is adapted to represent the 3D environment from multiple views.
`
9. The system according to any one or more of the preceding claims, wherein the view of the 3D environment represented in the at least one display is at least partly determined by the motion of the operator's hand holding said device.
`
`10. The system according to any one or more of the preceding claims,
`
`wherein the magnification represented in the at least one display is at least
`partly determined by the motion of the operator's hand holding said device.
`
`
`11. The system according to any one or more of the preceding claims,
`
`wherein the handheld device is adapted to record the 3D geometry of the 3D
`
`environment.
`
`
`12. The system according to any one or more of the preceding claims,
`
`wherein the 3D geometry of the 3D environment is known a-priori.
`
13. The system according to any one or more of the preceding claims, wherein the handheld device comprises at least one user-interface element.
`
`14. The system according to any one or more of the preceding claims,
`wherein the at least one user-interface element is at least one motion sensor.
`
`
15. The system according to any one or more of the preceding claims, wherein the handheld device comprises at least two motion sensors providing sensor fusion.
`
`
`16. The system according to any one or more of the preceding claims,
`
`wherein the user interface functionality comprises the use of gestures.
`
`17. The system according to any one or more of the preceding claims,
`
`wherein the gestures are detected by the at least one motion sensor.
`
`
`18. The system according to any one or more of the preceding claims,
`
`wherein the handheld device comprises at least one user-interface element
`
`other than the at least one motion sensor.
`
`
`19. The system according to any one or more of the preceding claims,
`
`wherein the at least one other user-interface element is a touch-sensitive
`
`element.
`
20. The system according to any one or more of the preceding claims, wherein the at least one other user-interface element is a button.
`
`21. The system according to any one or more of the preceding claims,
`
`wherein the at least one other user-interface element is a scroll wheel.
`
`
`22. The system according to any one or more of the preceding claims,
`
`wherein the handheld device is adapted to change a viewing angle with
`
`which the 3D environment is represented on the at least one display.
`
23. The system according to any of the preceding claims, wherein the handheld device is adapted to change a magnification factor with which the 3D environment is represented on the at least one display.
`
`
`24. The system according to any one or more of the preceding claims,
`wherein the handheld device is adapted to change a viewing position with
`which the 3D environment is represented on the at least one display.
`
`
`25. The system according to any one or more of the preceding claims,
`wherein the view of the 3D environment comprises a viewing angle, a
`
`magnification factor, and/or a viewing position.
`
`
`26. The system according to any one or more of the preceding claims,
`
`wherein the view of the 3D environment comprises rendering of texture
`and/or shading.
`
27. The system according to any one or more of the preceding claims, wherein the at least one display is divided into multiple regions, each showing the 3D environment with a different view.
`
`28. The system according to any one or more of the preceding claims,
`
`wherein the 3D geometry comprises a 3D surface of the environment.
`
`
`29. The system according to any one or more of the preceding claims,
`
`wherein the 3D geometry comprises a 3D volumetric representation of the
`environment.
`
`
`30. The system according to any one or more of the preceding claims,
`
`wherein the handheld device is an intra-oral 3D scanner.
`
`31. The system according to any one or more of the preceding claims,
`
`wherein the handheld device is a surgical instrument.
`
`
`
`32. The system according to any one or more of the preceding claims,
`
`wherein the handheld device is a mechanical tool.
`
33. The system according to any one or more of the preceding claims, wherein the handheld device is an in-ear 3D scanner.
`
`34. The system according to any one or more of the preceding claims,
`
`wherein the at least one display is arranged separate from the handheld
`
`device.
`
`
`35. The system according to any one or more of the preceding claims,
`
`wherein the at least one display is arranged on a cart.
`
36. The system according to any one or more of the preceding claims, wherein the at least one display is defined as a first display, and where the system further comprises a second display.
`
`
`37. The system according to any one or more of the preceding claims,
`
`wherein the second display is arranged on the handheld device.
`
`38. The system according to any one or more of the preceding claims,
`
`wherein the second display is arranged on the handheld device in a position
`
`such that the display is adapted to be viewed by the operator, while the
`
`operator is operating the handheld device.
`
39. The system according to any one or more of the preceding claims, wherein the second display indicates where the handheld device is positioned relative to the 3D environment.
`
`
`40. The system according to any one or more of the preceding claims,
`
`wherein the first display and/or the second display provides instructions for
`
`the operator.
`
`
`41. The system according to any one or more of the preceding claims,
`
`wherein visual information is provided to the operator on one or more means
`
`other than the first display.
`
42. The system according to any one or more of the preceding claims, wherein audible information is provided to the operator.
`
`43. The system according to any one or more of the preceding claims,
`
`wherein the scanning is performed by means of LED scanning, laser light
`
`scanning, white light scanning, X-ray scanning, and/or CT scanning.
`
`
44. A method of interaction between a handheld device and at least one display, where the method comprises the steps of:
- performing at least one action in a physical 3D environment by means of the handheld device;
- visually representing the physical 3D environment by the at least one display; and
- remotely controlling the view of the represented 3D environment on the display by means of the handheld device.
`
`
45. A computer program product comprising program code means for causing a data processing system to perform the method of any one or more of the preceding claims, when said program code means are executed on the data processing system.
`
`
46. A computer program product according to the previous claim, comprising a computer-readable medium having stored thereon the program code means.
`
`
`
`
`
`System with 3D user interface integration
`
`Field of the invention
`
`
`This invention generally relates to a method and a system comprising a
`
`handheld device and at least one display.
`
`Background of the invention
`
`3D visualization is important in many fields of industry and medicine, where
`
`3D information is becoming more and more predominant.
`
Displaying and inspecting 3D information is inherently difficult. To fully understand a 3D object or entire environment on a screen, the user should generally be able to rotate the object or scene, such that many or preferentially all surfaces are displayed. This is true even for 3D displays, e.g. stereoscopic or holographic, where from a given viewing position and with a given viewing angle, the user will only see some surfaces of an arbitrary 3D environment. Often, the user will also want to zoom into details or zoom out for an overview.
`
Various user interaction devices are in use for software that displays 3D data; these devices are 3D mice, space balls, and touch screens. The operation of these current interaction devices requires physically touching them.

Physically touching a user-interaction device can be a disadvantage in medical applications due to risks of cross-contamination between patients or between patient and operator, or in industrial applications in dirty environments.
`
`
Several non-touch user interfaces for 3D data viewing in medical applications have been described in the literature. Vogt et al. (2004) describe a touchless interactive system for in-situ visualization of 3D medical imaging data. The user interface is based on tracking of reflective markers, where a camera is mounted on the physician's head. Graetzel et al. (2004) describe a touchless system that interprets hand gestures as mouse actions. It is based on stereo vision and intended for use in minimally invasive surgery.

It remains a problem to improve systems that require user interfaces for view control, which for example can be used for clinical purposes.
`
`Summary
`
Disclosed is a system comprising a handheld device and at least one display, where the handheld device is adapted for performing at least one action in a physical 3D environment, where the at least one display is adapted for visually representing the physical 3D environment, and where the handheld device is adapted for remotely controlling the view with which said 3D environment is represented on the display.
`
`
The system disclosed here performs the integration of 3D user interface functionality with any other handheld device with other operating functionality, such that the operator ideally only touches this latter device that is intended to be touched. A particular example of such a handheld device is one that records some 3D geometry, for example a handheld 3D scanner.
`
The handheld device is a multi-purpose device, such as a dual-purpose or two-purpose device, i.e. a device both for performing actions in the physical 3D environment, such as measuring and manipulating, and for remotely controlling the view of the 3D environment on the display.
`
`
Geometrically, a view is determined by the virtual observer's/camera's position and orientation relative to the 3D environment or its visual representation. If the display is two-dimensional, the view is also determined by the type of projection. A view may also be determined by a magnification factor.
`
The virtual observer's and the 3D environment's position and orientation are always relative to each other. In terms of user experience in software systems with 3D input devices, the user may feel that, for example, he/she is moving the 3D environment while remaining stationary himself/herself, but there is always an equivalent movement of the virtual observer/camera that gives the same results on the display. Often, descriptions of 3D software systems use the expression "pan" to indicate an apparent translational movement of the 3D environment, "rotate" to indicate a rotational movement of the 3D environment, and "zoom" to indicate a change in magnification factor.
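The pan/rotate/zoom terminology can be made concrete with a small sketch. The `View` class below is a hypothetical illustration, not part of the disclosed system; it only shows that an apparent movement of the 3D environment is equivalent to the opposite movement of the virtual observer/camera.

```python
import math

# Hypothetical view model (illustration only): the virtual camera's
# position, orientation (yaw only, for brevity) and magnification
# factor together define the view.
class View:
    def __init__(self):
        self.position = [0.0, 0.0, -10.0]  # camera position relative to the 3D environment
        self.yaw = 0.0                     # camera orientation in radians
        self.zoom = 1.0                    # magnification factor

    def pan(self, dx, dy):
        # An apparent translation of the 3D environment is the
        # equivalent opposite translation of the virtual camera.
        self.position[0] -= dx
        self.position[1] -= dy

    def rotate(self, dyaw):
        self.yaw = (self.yaw + dyaw) % (2.0 * math.pi)

    def zoom_by(self, factor):
        self.zoom *= factor

view = View()
view.pan(2.0, 0.0)        # "pan": environment appears to move; camera moves oppositely
view.rotate(math.pi / 4)  # "rotate": change of orientation
view.zoom_by(2.0)         # "zoom": change of magnification factor
```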
`
Graphically, a view can represent a 3D environment by means of photographs or as some kind of virtual representation such as a computer graphic, or similar. A computer graphic can be rendered for example with texture and/or shading and/or virtual light sources and/or light models for surface properties. A computer graphic can also be a simplified representation of the 3D environment, for example a mesh, an outline, or an otherwise simplified representation. All or parts of the 3D environment can also be rendered with some degree of transparency. A view may represent the 3D environment in total or only parts thereof.
`
All of the touch-less prior art systems are 3D user interface devices only. In many prior art applications, the operator using such a user interface device will also hold and work with another device that really is the central device in the overall application, e.g. a medical instrument.
`
`
`It is thus an advantage of the present system that the 3D user-interface
`
`functionality is integrated in the central device, which is used for performing
`some kind of action.
`
`
In some embodiments the handheld device is adapted for remotely controlling the magnification with which the 3D environment is represented on the display.
`
`
`In some embodiments the handheld device is adapted for changing the
`
`rendering of the 3D environment on the display.
`
`
`In some embodiments the view is defined as viewing angle and/or viewing
`
`position.
`
`In some embodiments the at least one action comprises one or more of the
`
`actions of:
`
`- measuring,
`
`- recording,
`- scanning,
`
`- manipulating,
`
`- modifying.
`
In some embodiments the 3D environment comprises one or more 3D objects.
`
`In some embodiments the handheld device is adapted to be held in one hand
`by an operator.
`
`
In some embodiments the display is adapted to represent the 3D environment from multiple views.
`
`
In some embodiments the display is adapted to represent the 3D environment from different viewing angles and/or viewing positions.
`
`
`In some embodiments the view of the 3D environment in the at least one
`
`display is at least partly determined by the motion of the operator's hand
`holding said device.
`
In some embodiments the magnification represented in the at least one display is at least partly determined by the motion of the operator's hand holding said device.
`
In some embodiments the handheld device is adapted to record the 3D geometry of the 3D environment. Thus the handheld device may be an intraoral dental scanner, which records the 3D geometry of a patient's teeth. The operator may move the scanner along the teeth of the patient for capturing the 3D geometry of the relevant teeth, e.g. all teeth. The scanner may comprise motion sensors for taking the movement of the scanner into account while creating the 3D model of the scanned teeth.
`
`
`The 3D model of the teeth may be shown on a display, and the display may
`
`for example be a PC screen and/or the like.
`
The user interface functionality may comprise incorporating motion sensors in the scanner to provide that the user can determine the view on the screen by moving the scanner. Pointing the scanner down can provide that the scanned teeth are shown given a downward viewing angle. Holding the scanner in a horizontal position can provide that the viewing angle is likewise horizontal.
`
`
`
In some embodiments the handheld device comprises at least one user-interface element.

The system may be equipped with a button as an additional element providing the user-interface functionality.
`
`
In an example the handheld device is a handheld intraoral scanner, and the display is a computer screen. The operator or user may be a dentist, an assistant and/or the like. The operation functionality of the device may be to record some intraoral 3D geometry, and the user interface functionality may be to rotate, pan, and zoom the scanned data on the computer screen.
`
`In some embodiments the at least one user-interface element is at least one
`
`motion sensor.
`
Thus the integration of the user interface functionality in the device may be provided by motion sensors, which can be accelerometers inside the scanner, whose readings determine the orientation, on the screen, of the 3D model of the teeth acquired by the scanner. Additional functionality, e.g. to start/stop scanning, may be provided by a button. The button may be located where the operator's or user's index finger can reach it conveniently.
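As a rough illustration of how accelerometer readings can determine the displayed orientation, the sketch below derives pitch and roll from the gravity vector. The function name, axis convention, and g-values are assumptions for illustration only, not the scanner's actual implementation.

```python
import math

def orientation_from_accel(ax, ay, az):
    """Estimate pitch and roll (radians) of a static device from its
    accelerometer reading, i.e. from the direction of gravity."""
    pitch = math.atan2(-ax, math.hypot(ay, az))
    roll = math.atan2(ay, az)
    return pitch, roll

# Device held level (gravity along +z): zero pitch and roll, so the
# 3D model could be shown from a horizontal viewing angle.
level = orientation_from_accel(0.0, 0.0, 9.81)

# Device pointed straight down (gravity along +x): pitch of -90 degrees,
# so the model could be shown from a downward viewing angle.
down = orientation_from_accel(9.81, 0.0, 0.0)
```

In a real scanner these angles would be filtered over time before being mapped to the viewing angle on the screen.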
`
Prior art intraoral scanners use a touch screen, a trackball, or a mouse to determine the view in the display. These prior art user interface devices can be inconvenient, awkward and difficult to use, and they can be labor-intensive, and thus costly to sterilize or disinfect. An intraoral scanner should always be disinfected between scanning different patients, because the scanner is in, and may come in contact with, the mouth or other parts of the patient being scanned.
`
`
`The operator or user, e.g. dentist, may use one hand or both hands to hold
`
`the intraoral scanner while scanning, and the scanner may be light enough
`
`
`and comfortable to be held with just one hand for a longer time while
`
`scanning.
`
The device can also be held with one or two hands, while using the device as a remote control for e.g. changing the view in the display. It is an advantage of the touchless user interface functionality that in clinical situations, the operator can maintain both hands clean, disinfected, or even sterile.
`
An advantage of the system is that it allows an iterative process of working in a 3D environment without releasing the handheld device during said process. For the above intraoral scanning system example, the operator, e.g. dentist, can record some teeth surface geometry with a handheld device that is an intraoral scanner, inspect coverage of the surface recording by using that same handheld device to move, e.g. rotate, the recorded surface on the display, e.g. a computer screen, detect possible gaps or holes in the coverage of the scanned teeth, and then for example arrange the scanner in the region where the gaps were located and continue recording teeth surface geometry there. Over this entire iterative cycle, which can be repeated more than once, such as as many times as required for obtaining a desired scan coverage of the teeth, the dentist does not have to lay the handheld intraoral scanner out of his or her hands.
`
In some embodiments, the 3D user interface functionality is exploited in a separate location than the operation functionality. For the above intraoral scanning system example, the scanning operation is performed in the oral cavity of the patient, while the user interface functionality is more flexibly exploited when the scanner is outside the patient's mouth. The key characteristic and advantage of the system, again, is that the dentist can exploit the dual and integrated functionality, that is, operation and user interface, of the scanner without laying it out of his or her hands.
`
`
The above intraoral scanning system is an example of an embodiment. Other examples for operation functionality or performing actions could be drilling, welding, grinding, cutting, soldering, photographing, filming, measuring, executing some surgical procedure, etc.
`
`
The display of the system can be a 2D computer screen, a 3D display that projects stereoscopic image pairs, a volumetric display creating a 3D effect, such as a swept-volume display, a static volume display, a parallax barrier display, a holographic display, etc. Even with a 3D display, the operator has only one viewing position and viewing angle relative to the 3D environment at a time. The operator can move his/her head to assume another viewing position and/or viewing angle physically, but generally, it may be more convenient to use the handheld device with its built-in user interface functionality, e.g. the remote controlling, to change the viewing position and/or viewing angle represented in the display.
`
In some embodiments the system comprises multiple displays, or one or more displays that are divided into regions. For example, several sub-windows on a PC screen can represent different views of the 3D environment. The handheld device can be used to change the view in all of them, or only some of them.
`
In some embodiments the user interface functionality comprises the use of gestures.
`
`Gestures made by e.g. the operator can be used to change, shift or toggle
`
`between sub-windows, and the user-interface functionality can be limited to
`
`an active sub-window or one of several displays.
`
In some embodiments the gestures are adapted to be detected by the at least one motion sensor. Gestures can alternatively and/or additionally be detected by range sensors or other sensors that record body motion.
`
`
The operator does not have to constantly watch the at least one display of the system. In many applications, the operator will shift between viewing and possibly manipulating the display and performing another operation with the handheld device. Thus it is an advantage that the operator does not have to touch other user interface devices. However, in some cases it may not be possible for the operator to fully avoid touching other devices, and in these cases it is an advantage that fewer touches are required compared to a system where a handheld device does not provide any user interface functionality at all.
`
`In some embodiments the at least one display is arranged separate from the
`
`handheld device.
`
`
`In some embodiments the at least one display is defined as a first display,
`
`and where the system further comprises a second display.
`
`In some embodiments the second display is arranged on the handheld
`
`device.
`
`
`In some embodiments the second display is arranged on the handheld
`
`device in a position such that the display is adapted to be viewed by the
`
`operator, while the operator is operating the handheld device.
`
`
`In some embodiments the second display indicates where the handheld
`
`device is positioned relative to the 3D environment.
`
`In some embodiments the first display and/or the second display provides
`
`instructions for the operator.
`
`
`
The display(s) can be arranged in multiple ways. For example, they can be mounted on a wall, placed on some sort of stand or a cart, placed on a rack or desk, or otherwise.
`
`
In some embodiments at least one display is mounted on the device itself. It can be advantageous to have a display on the device itself because with such an arrangement, the operator's eyes need not focus alternatingly between different distances. In some cases, the operating functionality may require a close look at the device and the vicinity of the 3D environment it operates in, and this may be at a distance at most as far away as the operator's hand. Especially in crowded environments such as dentists' clinics, surgical operation theatres, or industrial workplaces, it may be difficult to place an external display close to the device.
`
`
`In some embodiments visual information is provided to the operator on one
`
`or more means other than the first display.
`
In some embodiments audible information is provided to the operator.
`
`
Thus in some embodiments, the system provides additional information to the operator. In some embodiments, the system includes other visual clues shown on means other than the display(s), such as LEDs on the device. In some embodiments, the system provides audible information to the operator, for example by different sounds and/or by speech.
`
`Said information provided to the operator can comprise instructions for use,
`
`warnings, and the like.
`
The information can aid with improving the action performance or operation functionality of the device, for example by indicating how well an action or operation is being performed, and/or by giving instructions to the operator aimed at
`
`
`
`
improving the ease of the action or operation and/or the quality of the action or operation's results. For example, an LED can change in color and/or flashing frequency. In a scanner, the information can relate to how well the scanned 3D environment is in focus and/or to scan quality and/or to scan coverage. The information can comprise instructions on how best to position the scanner such as to attain good scan quality and/or scan coverage. The instructions can be used for planning and/or performing bracket placement. The instructions can be in the form of a messenger system to the operator.
`
`
In some embodiments, some 3D user interface functionality is provided by at least one motion sensor built into the device. Examples of motion sensors are accelerometers, gyros, and magnetometers and/or the like. These sensors can sense rotations, lateral motion, and/or combinations thereof. Other motion sensors use infrared sensing. For example, at least one infrared sensor can be mounted on the device and at least one infrared emitter can be mounted in the surroundings of the device. Conversely, the at least one emitter can be mounted on the device, and the at least one sensor in the surroundings. Yet another possibility is to use infrared reflector(s) on the device and both sensor(s) and emitter(s) in the surroundings, or again conversely. Thus motion can be sensed by a variety of principles.
`
`
Through proper signal processing, some sensors can recognize additional operator actions; for example gestures such as taps, waving, or shaking of the handheld device. Thus, these gestures can also be exploited in the 3D user interface functionality.
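One way such gesture recognition could work is sketched below: a hypothetical shake detector that counts threshold crossings of the acceleration magnitude. The threshold and peak-count values are illustrative assumptions, not parameters from this disclosure.

```python
def detect_shake(accel_magnitudes, threshold=15.0, min_peaks=3):
    """Hypothetical shake detector: a shake is reported when the
    acceleration magnitude (m/s^2) exceeds the threshold at least
    min_peaks separate times within the sampled window."""
    peaks = 0
    above = False
    for a in accel_magnitudes:
        if a > threshold and not above:
            peaks += 1   # rising edge: count one new peak
            above = True
        elif a <= threshold:
            above = False
    return peaks >= min_peaks

# Three bursts above the threshold: interpreted as a shake gesture.
shaken = detect_shake([9.8, 20.0, 9.8, 21.0, 9.8, 22.0, 9.8])
# A device at rest never crosses the threshold: no gesture.
still = detect_shake([9.8, 9.8, 9.8, 9.8])
```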
`
In some embodiments the handheld device comprises at least two motion sensors providing sensor fusion. Sensor fusion can be used to achieve a better motion signal from for example raw gyro, accelerometer, and/or magnetometer data. Sensor fusion can be implemented in ICs such as the InvenSense MPU-3000.
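Sensor fusion of this kind can be as simple as a complementary filter. The sketch below is a generic textbook formulation under assumed sample timing and blend factor; it is not the algorithm of any particular IC.

```python
def fuse(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One complementary-filter step: integrate the gyro for fast but
    drift-prone detail, and blend in the accelerometer's slow but
    drift-free angle estimate."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# A stationary device whose accelerometer reports a constant 0.1 rad tilt:
# the fused estimate converges toward that reference despite starting at 0.
angle = 0.0
for _ in range(200):
    angle = fuse(angle, gyro_rate=0.0, accel_angle=0.1, dt=0.01)
```

The blend factor alpha trades gyro responsiveness against accelerometer stability; 0.98 is a common illustrative choice.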
`
`
In some embodiments the handheld device comprises at least one user-interface element other than the at least one motion sensor.
`
`
`In some embodiments the at least one other user-interface element is a
`
`touch-sensitive element.
`
`In some embodiments the at least one other user-interface element is a
`
`button.
`
`
`In some embodiments the at least one other user-interface element is a
`
`scroll-wheel.
`
In some embodiments, user interface functionality is provided through additional elements on the device. Thus these additional elements can for example be buttons, scroll wheels, touch-sensitive fields, proximity sensors and/or the like.
`
The additional user interface elements can be exploited or utilized in a workflow suitable for the field of application of the device. The workflow may be implemented in some user software application that may also control the display and thus the view represented thereon. A given interface element can supply multiple user inputs to the software. For example, a button can provide both a single click and a double click. For example, a double click can mean to advance to a subsequent step in a workflow. For the example of intraoral scanning, three steps within the workflow can be to scan the lower mouth, the upper mouth, and the bite. A touch-sensitive field can provide strokes in multiple directions each with a different effect, etc. Providing multiple user inputs from a user interface element is advantageous because the number of user interface elements on the device can be reduced relative
`
`
`to a situation where each user interface element only provides one user
`
`input.
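The single-click/double-click distinction described above can be sketched as a small event classifier. The 0.3 s window and the function name are illustrative assumptions, not values from the disclosure.

```python
DOUBLE_CLICK_WINDOW = 0.3  # seconds; hypothetical threshold

def classify_clicks(timestamps, window=DOUBLE_CLICK_WINDOW):
    """Fold a stream of button-press timestamps into 'single' and
    'double' click events, so one physical button yields two distinct
    user inputs for the workflow software."""
    events = []
    i = 0
    while i < len(timestamps):
        if i + 1 < len(timestamps) and timestamps[i + 1] - timestamps[i] <= window:
            events.append("double")  # two presses within the window
            i += 2
        else:
            events.append("single")
            i += 1
    return events
```

For example, presses at 0.0 s and 0.1 s would be folded into one "double" event, while isolated presses each yield a "single" event.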
`
The motion sensors can also be exploited in a workflow. For example, lifting the device, which can be sensed by an accelerometer, can represent some type of user input, for example to start some action. In a device that is a scanner, it may start scanning. Conversely, placing the device