(12) INTERNATIONAL APPLICATION PUBLISHED UNDER THE PATENT COOPERATION TREATY (PCT)

(19) World Intellectual Property Organization
International Bureau

(10) International Publication Number: WO 2012/076013 A1
(43) International Publication Date: 14 June 2012 (14.06.2012)

(51) International Patent Classification:
G01B 11/24 (2006.01)    A61C 13/00 (2006.01)

(21) International Application Number: PCT/DK2011/050461

(22) International Filing Date: 5 December 2011 (05.12.2011)

(25) Filing Language: English

(26) Publication Language: English

(30) Priority Data:
PA 2010 01104    6 December 2010 (06.12.2010)    DK
61/420,138       6 December 2010 (06.12.2010)    US

(71) Applicant (for all designated States except US): 3SHAPE A/S [DK/DK]; Holmens Kanal 7, 4, DK-1060 Copenhagen K (DK).

(72) Inventors; and
(75) Inventors/Applicants (for US only): HOLLENBECK, Karl Josef [DE/DK]; Ribegade 12, 3.th, DK-2100 Copenhagen Ø (DK). OJELUND, Henrik [SE/DK]; Kulsvierparken 55, DK-2800 Kgs. Lyngby (DK). FISCHER, David [DK/DK]; Rådyrleddet 16, DK-3660 Stenløse (DK).

(74) Agent: MUNZER, Marc; Guardian IP Consulting I/S, Diplomvej, Building 381, DK-2800 Kgs. Lyngby (DK).

(81) Designated States (unless otherwise indicated, for every kind of national protection available): AE, AG, AL, AM, AO, AT, AU, AZ, BA, BB, BG, BH, BR, BW, BY, BZ, CA, CH, CL, CN, CO, CR, CU, CZ, DE, DK, DM, DO, DZ, EC, EE, EG, ES, FI, GB, GD, GE, GH, GM, GT, HN, HR, HU, ID, IL, IN, IS, JP, KE, KG, KM, KN, KP, KR, KZ, LA, LC, LK, LR, LS, LT, LU, LY, MA, MD, ME, MG, MK, MN, MW, MX, MY, MZ, NA, NG, NI, NO, NZ, OM, PE, PG, PH, PL, PT, QA, RO, RS, RU, RW, SC, SD, SE, SG, SK, SL, SM, ST, SV, SY, TH, TJ, TM, TN, TR, TT, TZ, UA, UG, US, UZ, VC, VN, ZA, ZM, ZW.

(84) Designated States (unless otherwise indicated, for every kind of regional protection available): ARIPO (BW, GH, GM, KE, LR, LS, MW, MZ, NA, RW, SD, SL, SZ, TZ, UG, ZM, ZW), Eurasian (AM, AZ, BY, KG, KZ, MD, RU, TJ, TM), European (AL, AT, BE, BG, CH, CY, CZ, DE, DK, EE, ES, FI, FR, GB, GR, HR, HU, IE, IS, IT, LT, LU, LV, MC, MK, MT, NL, NO, PL, PT, RO, RS, SE, SI, SK, SM, TR), OAPI (BF, BJ, CF, CG, CI, CM, GA, GN, GQ, GW, ML, MR, NE, SN, TD, TG).

Declarations under Rule 4.17:
— as to applicant's entitlement to apply for and be granted a patent (Rule 4.17(ii))
— of inventorship (Rule 4.17(iv))

Published:
— with international search report (Art. 21(3))

(54) Title: SYSTEM WITH 3D USER INTERFACE INTEGRATION

(57) Abstract: Disclosed is a system comprising a handheld device (100) and at least one display (101), where the handheld device (100) is adapted for performing at least one action in a physical 3D environment. The actions include measuring, modifying, manipulating, recording, touching, sensing, scanning, moving, transforming, cutting, welding, chemically treating, and cleaning. The display (101) is adapted for visually representing the physical 3D environment, and the handheld device (100) is adapted for remotely controlling the view with which the 3D environment is represented on the display (101).

[Figure: Fig. 2a)]

System with 3D user interface integration

Field of the invention

This invention generally relates to a method and a system comprising a handheld device and at least one display.

Background of the invention

3D visualization is important in many fields of industry and medicine, where 3D information is becoming more and more predominant.

Displaying and inspecting 3D information is inherently difficult. To fully understand a 3D object or an entire environment on a screen, the user should generally be able to rotate the object or scene, such that many or preferably all surfaces are displayed. This is true even for 3D displays, e.g. stereoscopic or holographic, where from a given viewing position and with a given viewing angle the user will only see some surfaces of an arbitrary 3D environment. Often, the user will also want to zoom into details or zoom out for an overview.

Various user-interaction devices are in use for software that displays 3D data; these devices include 3D mice, space balls, and touch screens. Operating these current interaction devices requires physically touching them.

Physically touching a user-interaction device can be a disadvantage in medical applications, due to risks of cross-contamination between patients or between patient and operator, or in industrial applications in dirty environments.

Several non-touch user interfaces for 3D data viewing in medical applications have been described in the literature. Vogt et al. (2004) describe a touchless interactive system for in-situ visualization of 3D medical imaging data. The user interface is based on tracking of reflective markers, where a camera is mounted on the physician's head. Graetzel et al. (2004) describe a touchless system that interprets hand gestures as mouse actions. It is based on stereo vision and intended for use in minimally invasive surgery.

It remains a problem to improve systems that require user interfaces for view control, which for example can be used for clinical purposes.

Summary

Disclosed is a system comprising a handheld device and at least one display, where the handheld device is adapted for performing at least one action in a physical 3D environment, where the at least one display is adapted for visually representing the physical 3D environment, and where the handheld device is adapted for remotely controlling the view with which said 3D environment is represented on the display.

The system may be adapted for switching between performing the at least one action in the physical 3D environment and remotely controlling the view with which the 3D environment is represented on the display.

The system disclosed here integrates 3D user interface functionality with a handheld device that has another operating functionality, such that the operator ideally only touches this latter device, which is the one intended to be touched. A particular example of such a handheld device is one that records some 3D geometry, for example a handheld 3D scanner.

The handheld device is a multi-purpose device, such as a dual-purpose or two-purpose device, i.e. a device both for performing actions in the physical 3D environment, such as measuring and manipulating, and for remotely controlling the view of the 3D environment on the display.

Geometrically, a view is determined by the virtual observer's/camera's position and orientation relative to the 3D environment or its visual representation. If the display is two-dimensional, the view is also determined by the type of projection. A view may also be determined by a magnification factor.

The virtual observer's and the 3D environment's position and orientation are always relative to each other. In terms of user experience in software systems with 3D input devices, the user may feel that, for example, he/she is moving the 3D environment while remaining stationary himself/herself, but there is always an equivalent movement of the virtual observer/camera that gives the same results on the display. Often, descriptions of 3D software systems use the expression “pan” to indicate an apparent translational movement of the 3D environment, “rotate” to indicate a rotational movement of the 3D environment, and “zoom” to indicate a change in magnification factor.
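
This equivalence between moving the 3D environment and moving the virtual observer can be illustrated directly. The sketch below is illustrative only and not part of the application as filed; it shows that panning the scene and counter-moving the camera yield identical camera-relative coordinates, and hence identical displays.

```python
import numpy as np

# A point of the 3D environment and a virtual camera, both in world coordinates.
point = np.array([0.0, 0.0, 5.0])
camera = np.array([0.0, 0.0, 0.0])

# "Pan the scene" one unit to the left ...
seen_a = (point + np.array([-1.0, 0.0, 0.0])) - camera
# ... or move the camera one unit to the right: the camera-relative
# coordinates, and hence the rendered image, are identical.
seen_b = point - (camera + np.array([1.0, 0.0, 0.0]))

assert np.allclose(seen_a, seen_b)
```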

Graphically, a view can represent a 3D environment by means of photographs or as some kind of virtual representation, such as a computer graphic or similar. A computer graphic can be rendered for example with texture and/or shading and/or virtual light sources and/or light models for surface properties. A computer graphic can also be a simplified representation of the 3D environment, for example a mesh, an outline, or an otherwise simplified representation. All or parts of the 3D environment can also be rendered with some degree of transparency. A view may represent the 3D environment in total or only parts thereof.

All of the touch-less prior art systems are 3D user interface devices only. In many prior art applications, the operator using such a user interface device will also hold and work with another device that really is the central device in the overall application, e.g. a medical instrument.

It is thus an advantage of the present system that the 3D user-interface functionality is integrated in the central device, which is used for performing some kind of action.

In some embodiments the handheld device is adapted for remotely controlling the magnification with which the 3D environment is represented on the display.

In some embodiments the handheld device is adapted for changing the rendering of the 3D environment on the display.

In some embodiments the view is defined as a viewing angle and/or viewing position.

In some embodiments the at least one action comprises one or more of the actions of:
- measuring,
- recording,
- scanning,
- manipulating,
- modifying.

In some embodiments the 3D environment comprises one or more 3D objects.

In some embodiments the handheld device is adapted to be held in one hand by an operator.

In some embodiments the display is adapted to represent the 3D environment from multiple views.

In some embodiments the display is adapted to represent the 3D environment from different viewing angles and/or viewing positions.

In some embodiments the view of the 3D environment in the at least one display is at least partly determined by the motion of the operator's hand holding said device.

In some embodiments the magnification represented in the at least one display is at least partly determined by the motion of the operator's hand holding said device.

In some embodiments the handheld device is adapted to record the 3D geometry of the 3D environment.

Thus the handheld device may be an intraoral dental scanner, which records the 3D geometry of a patient's teeth. The operator may move the scanner along the teeth of the patient for capturing the 3D geometry of the relevant teeth, e.g. all teeth. The scanner may comprise motion sensors for taking the movement of the scanner into account while creating the 3D model of the scanned teeth.

The 3D model of the teeth may be shown on a display, and the display may for example be a PC screen and/or the like.

The user interface functionality may comprise incorporating motion sensors in the scanner to provide that the user can determine the view on the screen by moving the scanner. Pointing the scanner down can provide that the scanned teeth are shown from a downward viewing angle. Holding the scanner in a horizontal position can provide that the viewing angle is likewise horizontal.
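
By way of illustration only (not part of the application as filed), one way such a mapping from scanner attitude to viewing angle could be computed from an accelerometer's gravity reading is sketched below; the axis conventions and all names are assumptions.

```python
import numpy as np

def viewing_angles_from_gravity(accel):
    """Derive pitch and roll viewing angles from an accelerometer reading.

    accel is the measured gravity vector (ax, ay, az) in the scanner's own
    frame while it is held still; the axis conventions are assumptions.
    """
    ax, ay, az = accel
    pitch = np.arctan2(-ax, np.hypot(ay, az))   # tilt of the scanner's long axis
    roll = np.arctan2(ay, az)                   # rotation about the long axis
    return pitch, roll

# Pointing the scanner straight down yields a -90 degree (downward) pitch:
pitch, roll = viewing_angles_from_gravity((9.81, 0.0, 0.0))
```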

In some embodiments the handheld device comprises at least one user-interface element. A user-interface element is an element which the user may manipulate in order to activate a function on the user interface of the software. Typically the user interface is graphically presented on the display of the system.

The handheld device may furthermore be provided with an actuator, which switches the handheld device between performing the at least one action and remotely controlling the view. By providing such a manual switching function that enables the operator to switch between performing the at least one action and remotely controlling the view, the operator may easily control what is performed.

Such an actuator can for example be in the form of a button, switch, or contact. In other embodiments it could be a touch-sensitive surface or element.

In another embodiment the actuator could be a motion sensor provided in the handheld device that functions as the actuator when it registers a specific type of movement, for example if the operator shakes the handheld device. Examples of such motion sensors will be described herein with respect to the user-interface element; however, the person skilled in the art will, based on the disclosure herein, understand that such motion sensors may also be used as actuators as discussed.

For example, the handheld device can in one embodiment be an intra-oral 3D scanner used by a dentist. The scanner is set to perform the action of scanning a dental area when the actuator is in one position. When the actuator is switched into a second position, the handheld device is set to control the view with which the 3D environment is represented on the display. This could for example mean that when the dentist has scanned a part of, or the complete, desired area of a dental arch, he can activate the actuator, which then allows the dentist to remotely control the view of the 3D representation of the scanned area on the display by using the handheld device.

For example, the actuator could be a button. When the button is pressed quickly, the handheld device is prepared for scanning, i.e. it is set for performing at least one action, the scanning procedure, in the physical 3D environment. The scanning is stopped when the button is pressed quickly a second time.

While the scanning is performed, a virtual 3D representation is visually built on the display.

The user can now press and hold the button. This will put the handheld device in a controller mode, where the handheld device is adapted for remotely controlling the view with which the 3D environment, such as scanned teeth, is represented on the display. While holding the button pressed, the system will use signals from a motion sensor in the handheld device to determine how to present the view of the virtual 3D environment. Thus, if the user turns or otherwise moves the hand that holds the handheld device, the view of the virtual 3D environment on the display will change accordingly.
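
The quick-click/press-and-hold behaviour described above amounts to a small state machine. The sketch below is illustrative only; the hold threshold and all names are assumptions, not values from the application.

```python
from enum import Enum, auto

class Mode(Enum):
    IDLE = auto()
    SCANNING = auto()
    VIEW_CONTROL = auto()

HOLD_THRESHOLD_S = 0.5   # assumed: presses longer than this count as press-and-hold

class ModeController:
    """Maps button events to the scanning / view-control modes described above."""

    def __init__(self):
        self.mode = Mode.IDLE
        self._mode_before_hold = Mode.IDLE

    def on_button(self, press_duration_s, released):
        if not released:
            # Button still held past the threshold: enter remote view control.
            if press_duration_s >= HOLD_THRESHOLD_S and self.mode is not Mode.VIEW_CONTROL:
                self._mode_before_hold = self.mode
                self.mode = Mode.VIEW_CONTROL
        elif self.mode is Mode.VIEW_CONTROL:
            # Releasing the held button leaves view control.
            self.mode = self._mode_before_hold
        elif press_duration_s < HOLD_THRESHOLD_S:
            # A quick click toggles scanning on and off.
            self.mode = Mode.SCANNING if self.mode is Mode.IDLE else Mode.IDLE
```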

Thus, the dentist may use the same handheld device for both scanning an area and subsequently verifying that the scan has been executed correctly, without having to move away from the patient or touch any other equipment than that already present in his hands.

In one embodiment the user-interface element is the same as the actuator, or, where several user-interface elements are present, at least one also functions as an actuator.

The system may be equipped with a button as an additional element providing the user-interface functionality.

In an example the handheld device is a handheld intraoral scanner, and the display is a computer screen. The operator or user may be a dentist, an assistant, and/or the like. The operation functionality of the device may be to record some intraoral 3D geometry, and the user interface functionality may be to rotate, pan, and zoom the scanned data on the computer screen.

`15
`
`In some embodiments the at least one user-interface element is at least one
`
`motion sensor.
`
`Thus the integration of the user interface functionality in the device may be
`
`20
`
`provided by motion sensors, which can be accelerometers inside the
`
`scanner, whose readings determine the orientation of the display on the
`
`screen of the 3D model of the teeth acquired by the scanner. Additional
`
`functionality, e.g. to start/stop scanning, may be provided by a button. The
`
`button may be located where the operator’s or user’s index finger can reachit
`
`25
`
`conveniently.
`
Prior art intraoral scanners use a touch screen, a trackball, or a mouse to determine the view in the display. These prior art user interface devices can be inconvenient, awkward, and difficult to use, and they can be labor-intensive, and thus costly, to sterilize or disinfect. An intraoral scanner should always be disinfected between scanning different patients, because the scanner is in, and may come in contact with, the mouth or other parts of the patient being scanned.

The operator or user, e.g. dentist, may use one hand or both hands to hold the intraoral scanner while scanning, and the scanner may be light enough and comfortable to be held with just one hand for a longer time while scanning.

The device can also be held with one or two hands while using the device as a remote control, e.g. for changing the view in the display. It is an advantage of the touchless user interface functionality that in clinical situations the operator can maintain both hands clean, disinfected, or even sterile.

An advantage of the system is that it allows an iterative process of working in a 3D environment without releasing the handheld device during said process.

For the above intraoral scanning system example, the operator, e.g. dentist, can record some teeth surface geometry with a handheld device that is an intraoral scanner, inspect coverage of the surface recording by using that same handheld device to move, e.g. rotate, the recorded surface on the display, e.g. a computer screen, detect possible gaps or holes in the coverage of the scanned teeth, and then for example arrange the scanner in the region where the gaps were located and continue recording teeth surface geometry there. Over this entire iterative cycle, which can be repeated more than once, such as as many times as required for obtaining a desired scan coverage of the teeth, the dentist does not have to lay the handheld intraoral scanner out of his or her hands.

In some embodiments, the 3D user interface functionality is exploited in a separate location than the operation functionality. For the above intraoral scanning system example, the scanning operation is performed in the oral cavity of the patient, while the user interface functionality is more flexibly exploited when the scanner is outside the patient's mouth. The key characteristic and advantage of the system, again, is that the dentist can exploit the dual and integrated functionality, that is operation and user interface, of the scanner without laying it out of his or her hands.

The above intraoral scanning system is an example of an embodiment. Other examples of operation functionality or performing actions could be drilling, welding, grinding, cutting, soldering, photographing, filming, measuring, executing some surgical procedure, etc.

The display of the system can be a 2D computer screen, a 3D display that projects stereoscopic image pairs, a volumetric display creating a 3D effect, such as a swept-volume display, a static volume display, a parallax barrier display, a holographic display, etc. Even with a 3D display, the operator has only one viewing position and viewing angle relative to the 3D environment at a time. The operator can move his/her head to assume another viewing position and/or viewing angle physically, but generally it may be more convenient to use the handheld device with its built-in user interface functionality, e.g. the remote controlling, to change the viewing position and/or viewing angle represented in the display.

In some embodiments the system comprises multiple displays, or one or more displays that are divided into regions. For example, several sub-windows on a PC screen can represent different views of the 3D environment. The handheld device can be used to change the view in all of them, or only some of them.
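
A minimal illustrative sketch (not part of the application) of how one view change from the handheld device could be fanned out to all, or only some, sub-windows; all names are assumptions.

```python
class SubWindow:
    """One display region showing the 3D environment with its own view."""

    def __init__(self, name, active=True):
        self.name = name
        self.active = active        # whether handheld input drives this window
        self.magnification = 1.0    # stand-in for a full view description

def apply_view_change(windows, change):
    # Fan a single view change from the handheld device out to the
    # active sub-windows only.
    for window in windows:
        if window.active:
            change(window)

windows = [SubWindow("upper jaw"), SubWindow("lower jaw", active=False)]
apply_view_change(windows, lambda w: setattr(w, "magnification", w.magnification * 1.2))
```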

In some embodiments the user interface functionality comprises the use of gestures.

Gestures made by, e.g., the operator can be used to change, shift, or toggle between sub-windows, and the user-interface functionality can be limited to an active sub-window or one of several displays.

In some embodiments the gestures are adapted to be detected by the at least one motion sensor. Gestures can alternatively and/or additionally be detected by range sensors or other sensors that record body motion.

The operator does not have to constantly watch the at least one display of the system. In many applications, the operator will shift between viewing and possibly manipulating the display and performing another operation with the handheld device. Thus it is an advantage that the operator does not have to touch other user interface devices. However, in some cases it may not be possible for the operator to fully avoid touching other devices, and in these cases it is an advantage that fewer touches are required compared to a system where a handheld device does not provide any user interface functionality at all.

In some embodiments the at least one display is arranged separate from the handheld device.

In some embodiments the at least one display is defined as a first display, and the system further comprises a second display.

In some embodiments the second display is arranged on the handheld device.

In some embodiments the second display is arranged on the handheld device in a position such that the display is adapted to be viewed by the operator while the operator is operating the handheld device.

In some embodiments the second display indicates where the handheld device is positioned relative to the 3D environment.

In some embodiments the first display and/or the second display provides instructions for the operator.

The display(s) can be arranged in multiple ways. For example, they can be mounted on a wall, placed on some sort of stand or a cart, placed on a rack or desk, or other.

In some embodiments at least one display is mounted on the device itself. It can be advantageous to have a display on the device itself because, with such an arrangement, the operator's eyes need not focus alternatingly between different distances. In some cases, the operating functionality may require a close look at the device and the vicinity of the 3D environment it operates in, and this may be at a distance at most as far away as the operator's hand. Especially in crowded environments such as dentists' clinics, surgical operation theatres, or industrial workplaces, it may be difficult to place an external display close to the device.

`20
`
`In some embodiments visual information is provided to the operator on one
`
`or more means other than the first display.
`
`In some embodiments audible information to the operator is provided to the
`
`25
`
`operator.
`
`Thus in some embodiments,
`
`the system provides additional
`
`information to
`
`the operator.
`
`In some embodiments, the system includes other visual clues
`
`shown on means other than the display(s), such as LEDs on the device.
`
`In
`
`30
`
`some embodiments, the system provides audible information to the operator,
`
`for example by different sounds and/or by speech.
`
Said information provided to the operator can comprise instructions for use, warnings, and the like.

The information can aid with improving the action performance or operation functionality of the device, for example by indicating how well an action or operation is being performed, and/or instructions to the operator aimed at improving the ease of the action or operation and/or the quality of the action or operation's results. For example, an LED can change in color and/or flashing frequency. In a scanner, the information can relate to how well the scanned 3D environment is in focus and/or to scan quality and/or to scan coverage. The information can comprise instructions on how best to position the scanner so as to attain good scan quality and/or scan coverage. The instructions can be used for planning and/or performing bracket placement. The instructions can be in the form of a messenger system to the operator.

In some embodiments, some 3D user interface functionality is provided by at least one motion sensor built into the device. Examples of motion sensors are accelerometers, gyros, and magnetometers and/or the like. These sensors can sense rotations, lateral motion, and/or combinations thereof.

Other motion sensors use infrared sensing. For example, at least one infrared sensor can be mounted on the device and at least one infrared emitter can be mounted in the surroundings of the device. Conversely, the at least one emitter can be mounted on the device, and the at least one sensor in the surroundings. Yet another possibility is to use infrared reflector(s) on the device and both sensor(s) and emitter(s) in the surroundings, or again conversely. Thus motion can be sensed by a variety of principles.

Through proper signal processing, some sensors can recognize additional operator actions, for example gestures such as taps, waving, or shaking of the handheld device. Thus, these gestures can also be exploited in the 3D user interface functionality.
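
As one illustration of such signal processing (an assumption-laden sketch, not part of the application), a shake gesture can be flagged when the acceleration magnitude exceeds a threshold several times within a short window; the threshold and window values below are assumptions.

```python
from collections import deque
import math
import time

class ShakeDetector:
    """Flag a shake when acceleration repeatedly exceeds a threshold."""

    def __init__(self, threshold=15.0, window_s=0.7, min_peaks=3):
        self.threshold = threshold    # m/s^2, above typical handling motion (assumed)
        self.window_s = window_s      # time window in which peaks must cluster
        self.min_peaks = min_peaks    # peaks required to call it a shake
        self.peaks = deque()

    def update(self, ax, ay, az, t=None):
        t = time.monotonic() if t is None else t
        if math.sqrt(ax * ax + ay * ay + az * az) > self.threshold:
            self.peaks.append(t)
        # Discard peaks that have fallen out of the time window.
        while self.peaks and t - self.peaks[0] > self.window_s:
            self.peaks.popleft()
        return len(self.peaks) >= self.min_peaks
```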

In some embodiments the handheld device comprises at least two motion sensors providing sensor fusion. Sensor fusion can be used to achieve a better motion signal from, for example, raw gyro, accelerometer, and/or magnetometer data. Sensor fusion can be implemented in ICs such as the InvenSense MPU-3000.
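
Such fusion can be done in hardware, as in the IC named above, or in software. Below is a minimal illustrative software analogue (not the IC's actual algorithm): a complementary filter that blends an integrated gyro rate with an accelerometer tilt estimate into one pitch angle. The blend factor is an assumption.

```python
import math

class ComplementaryFilter:
    """Fuse gyro rate and accelerometer tilt into a single pitch estimate."""

    def __init__(self, alpha=0.98):
        self.alpha = alpha   # trust in the integrated gyro vs. the accelerometer
        self.pitch = 0.0     # radians

    def update(self, gyro_rate, ax, ay, az, dt):
        # Gyro: accurate short-term, but drifts when integrated.
        gyro_pitch = self.pitch + gyro_rate * dt
        # Accelerometer: drift-free reference to gravity, but noisy.
        accel_pitch = math.atan2(-ax, math.hypot(ay, az))
        # Blend: effectively high-pass the gyro, low-pass the accelerometer.
        self.pitch = self.alpha * gyro_pitch + (1 - self.alpha) * accel_pitch
        return self.pitch
```

The design choice is the usual one: the gyro term tracks fast motion but drifts, while the accelerometer term is drift-free but noisy, so the filter combines the strengths of both.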

In some embodiments the handheld device comprises at least one user-interface element other than the at least one motion sensor.

In some embodiments the at least one other user-interface element is a touch-sensitive element.

In some embodiments the at least one other user-interface element is a button.

In some embodiments the at least one other user-interface element is a scroll-wheel.

In some embodiments, user interface functionality is provided through additional elements on the device. Thus these additional elements can for example be buttons, scroll wheels, touch-sensitive fields, proximity sensors, and/or the like.

The additional user interface elements can be exploited or utilized in a workflow suitable for the field of application of the device. The workflow may be implemented in some user software application that may also control the display and thus the view represented thereon. A given interface element can supply multiple user inputs to the software. For example, a button can provide both a single click and a double click. For example, a double click can mean to advance to a subsequent step in a workflow. For the example of intraoral scanning, three steps within the workflow can be to scan the lower mouth, the upper mouth, and the bite. A touch-sensitive field can provide strokes in multiple directions, each with a different effect, etc. Providing multiple user inputs from a user interface element is advantageous because the number of user interface elements on the device can be reduced relative to a situation where each user interface element only provides one user input.
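
For the single/double click example, one common realization is to test whether a second click arrives within a short gap. The sketch below is illustrative only, with an assumed timeout; it is not from the application.

```python
class ClickClassifier:
    """Classify button presses into 'single' or 'double' clicks."""

    DOUBLE_CLICK_GAP_S = 0.3   # assumed maximum gap between paired clicks

    def __init__(self):
        self.last_click_t = None

    def on_click(self, t):
        """Return 'double' if this click pairs with the previous one, else 'single'."""
        if self.last_click_t is not None and t - self.last_click_t <= self.DOUBLE_CLICK_GAP_S:
            self.last_click_t = None
            return "double"
        self.last_click_t = t
        return "single"
```

A production version would defer reporting "single" until the gap has elapsed; the sketch classifies eagerly for brevity.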

The motion sensors can also be exploited in a workflow. For example, lifting the device, which can be sensed by an accelerometer, can represent some type of user input, for example to start some action. In a device that is a scanner, it may start scanning. Conversely, placing the device back in some sort of holder, which can be sensed by an accelerometer as no acceleration occurring over some period of time, can stop said action.
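
An illustrative sketch (not from the application) of the at-rest test implied here: if the accelerometer reports essentially gravity alone for some period, the device is treated as placed in its holder and the action can be stopped. All thresholds are assumptions.

```python
import math

GRAVITY = 9.81          # m/s^2
REST_TOLERANCE = 0.3    # deviation from gravity still counted as "no acceleration"
REST_PERIOD_S = 2.0     # how long the device must be still before stopping

class RestDetector:
    def __init__(self):
        self.still_since = None

    def is_at_rest(self, ax, ay, az, t):
        # At rest the accelerometer measures gravity only.
        if abs(math.sqrt(ax * ax + ay * ay + az * az) - GRAVITY) < REST_TOLERANCE:
            if self.still_since is None:
                self.still_since = t
            return t - self.still_since >= REST_PERIOD_S
        self.still_since = None
        return False
```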

If the action performed by the device is some kind of recording, for example scanning, for example 3D scanning, the results of the recording can also be exploited as user inputs, possibly along with user inputs from other user interface elements. For example, with a 3D scanner with a limited depth of field, it may be possible to detect whether any objects within the 3D environment are present in the volume corresponding to this depth of field by detecting whether any 3D points are recorded. User inputs can depend on such detected presence. For example, a button click on an intraoral scanner can provide a different user input depending on whether the scanner is in the mouth, where teeth are detectable, or significantly away from and outside the mouth. Also the effect of motion sensor signals can be interpreted differently in either situation. For example, the scanner may only change the view represented on the display when it is outside the mouth.
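
An illustrative sketch of this context-dependent interpretation (all names and the point-count threshold are assumptions, not from the application): recorded 3D points stand in for "an object is within the depth of field", and the same button click or motion signal is then routed to different user inputs.

```python
def in_mouth(points_in_last_frame, min_points=50):
    # With a limited depth of field, recorded 3D points imply that an object
    # (e.g. teeth) is inside the scan volume; the count threshold is assumed.
    return points_in_last_frame >= min_points

def interpret_button_click(points_in_last_frame):
    if in_mouth(points_in_last_frame):
        return "in_mouth_click_action"      # assumed in-mouth meaning
    return "out_of_mouth_click_action"      # assumed out-of-mouth meaning

def interpret_motion(points_in_last_frame, rotation_delta):
    # Motion only changes the displayed view when the scanner is outside the mouth.
    if in_mouth(points_in_last_frame):
        return None
    return ("rotate_view", rotation_delta)
```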

In some embodiments the handheld device is adapted to change a viewing angle with which the 3D environment is represented on the at least one display.

In some embodiments the handheld device is adapted to change a magnification factor with which the 3D environment is represented on the at least one display.

In some embodiments the handheld device is adapted to change a viewing position with which the 3D environment is represented on the at least one display.

In some embodiments the view of the 3D environment comprises a viewing angle, a magnification factor, and/or a viewing position.

In some embodiments the view of the 3D environment comprises rendering of texture and/or shading.

In some embodiments the at least one display is divided into multiple regions, each showing the 3D environment with a different view.

Thus in some embodiments the user interface functionality comprises changing the view with which the 3D environment is displayed. Changes in view can comprise changes in viewing angle, viewing position, magnification, and/or the like. A change in viewing angle can naturally be effected by rotating the device. Rotation is naturally sensed by the aid of gyros and/or relative to gravity sensed by an accelerometer. Zooming, i.e. a change in magnification, can for example be achieved by pushing the handheld device forward and backward, respectively. A translational change of the viewing position, i.e. panning, can for example be achieved by pushing the handheld device up/down and/or sideways.
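
The mapping just described can be summarized in one dispatch routine: rotation drives the viewing angle, forward/backward motion drives zoom, and up/down or sideways motion drives panning. The sketch below is illustrative only; the axes, gains, and view representation are assumptions.

```python
import math

def update_view(view, rotation_delta, translation_delta, zoom_gain=0.5, pan_gain=1.0):
    """Map sensed device motion to view changes.

    view             : dict with 'angles' (rx, ry, rz), 'pan' (px, py), 'zoom'
    rotation_delta   : sensed device rotation, radians about each axis
    translation_delta: sensed device translation (dx, dy, dz); dz = forward axis
    """
    view["angles"] = tuple(a + d for a, d in zip(view["angles"], rotation_delta))
    dx, dy, dz = translation_delta
    view["zoom"] *= math.exp(zoom_gain * dz)          # push forward to zoom in
    view["pan"] = (view["pan"][0] + pan_gain * dx,    # sideways / up-down to pan
                   view["pan"][1] + pan_gain * dy)

view = {"angles": (0.0, 0.0, 0.0), "pan": (0.0, 0.0), "zoom": 1.0}
update_view(view, rotation_delta=(0.0, 0.1, 0.0), translation_delta=(0.0, 0.0, 0.05))
```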

In some embodiments the user interface functionality comprises selecting or choosing items on a display or any other functionality provided by graphical user interfaces in computers known in the art. The operator may perform the selection. The Lava C.O.S. scanner marketed by 3M ESPE has additional buttons on the handheld device, but it is not possible to manipulate the view by these. Their only purpose is to allow navigation through a menu system, and to start/stop scanning.

In some embodiments the user interface functionality comprises manipulating the 3D environment displayed on the screen. For example, the operator may effect deformations or change the position or orientation of objects in the 3D environment. Thus, in some embodiments the user interface functionality comprises virtual user interface functionality, which can be that the 3D data are manipulated, but the physical 3D environment in which the device operates may not be manipulated.

In some embodiments the handheld device is an intraoral scanner and/or an in-the-ear scanner. If the scanner comprises a tip, this tip may be exchanged, whereby the scanner can become suitable for scanning in the mouth or in the ear. Since the ear is a smaller cavity than the mouth, the tip for fitting into an ear may be smaller than a tip for fitting in the mouth.

In some embodiments the handheld device is a surgical instrument. In some embodiments, the surgical instrument comprises at least one motion sensor, which is built into the instrument.

In some embodiments the handheld device is a mechanical tool. In some embodiments, the tool has at least one motion sensor built in. In other embodiments, other user-interface elements are built in as well, for example buttons, scroll wheels, touch-sensitive fields, or proximity sensors.

In some embodiments the 3D geometry of the 3D environment is known a priori, or a 3D representation of the environment is known a priori, i.e. before the action(s) are performed. For example in surgery, a CT scan may have been taken before the surgical procedure. The handheld device in this example could be a surgi
