`
(43) International Publication Date: 30 October 2008 (30.10.2008)

PCT

(10) International Publication Number: WO 2008/128700 A1
`
(51) International Patent Classification:
A61C 13/00 (2006.01), G06T 7/00 (2006.01), A61C 19/00 (2006.01), G06T 7/60 (2006.01), A61B 5/00 (2006.01), G06T 17/40 (2006.01), A61B 19/00 (2006.01)
`
(21) International Application Number: PCT/EP2008/003072

(22) International Filing Date: 10 April 2008 (10.04.2008)
`
(25) Filing Language: English

(26) Publication Language: English
`
(30) Priority Data: 0707454.5, 18 April 2007 (18.04.2007), GB
`
(71) Applicant (for all designated States except US): MATERIALISE DENTAL N.V. [BE/BE]; Technologielaan 15, B-3001 Heverlee (BE).
`
(72) Inventors; and
(75) Inventors/Applicants (for US only): MALFLIET, Katja [BE/BE]; Oude Geldenaaksebaan 25, B-3360 Bierbeek (BE). PATTIJN, Veerle [BE/BE]; Miskom-Dorp 39, B-3472 Kersbeek-Miskom (BE). VAN LIERDE, Carl [BE/BE]; Brusselsesteenweg 560, B-9402 Meerbeke (BE). VANCRAEN, Wilfried [BE/BE]; Jan Vander Vorstlaan 19, B-3040 Huldenberg (BE).
`
(74) Agents: BIRD, William, E. et al.; Bird Goen & Co, Klein Dalenstraat 42A, B-3020 Winksele (BE).
`
(81) Designated States (unless otherwise indicated, for every kind of national protection available): AE, AG, AL, AM, AO, AT, AU, AZ, BA, BB, BG, BH, BR, BW, BY, BZ, CA, CH, CN, CO, CR, CU, CZ, DE, DK, DM, DO, DZ, EC, EE, EG, ES, FI, GB, GD, GE, GH, GM, GT, HN, HR, HU, ID, IL, IN, IS, JP, KE, KG, KM, KN, KP, KR, KZ, LA, LC, LK, LR, LS, LT, LU, LY, MA, MD, ME, MG, MK, MN, MW, MX, MY, MZ, NA, NG, NI, NO, NZ, OM, PG, PH, PL, PT, RO, RS, RU, SC, SD, SE, SG, SK, SL, SM, SV, SY, TJ, TM, TN, TR, TT, TZ, UA, UG, US, UZ, VC, VN, ZA, ZM, ZW.
`
(84) Designated States (unless otherwise indicated, for every kind of regional protection available): ARIPO (BW, GH, GM, KE, LS, MW, MZ, NA, SD, SL, SZ, TZ, UG, ZM, ZW), Eurasian (AM, AZ, BY, KG, KZ, MD, RU, TJ, TM), European (AT, BE, BG, CH, CY, CZ, DE, DK, EE, ES, FI, FR, GB, GR, HR, HU, IE, IS, IT, LT, LU, LV, MC, MT, NL, NO, PL, PT, RO, SE, SI, SK, TR), OAPI (BF, BJ, CF, CG, CI, CM, GA, GN, GQ, GW, ML, MR, NE, SN, TD, TG).
`
Published:
with international search report
`
(54) Title: COMPUTER-ASSISTED CREATION OF A CUSTOM TOOTH SET-UP USING FACIAL ANALYSIS
`
[Figure 1 (front-page drawing): workstation receiving input from a scanner, X-ray machine 32, digital camera 33 and optical scanner 34, with storage 35, patient data 37, aesthetic rules, functional rules 52 and a library]

(57) Abstract: A method for automatic, or semi-automatic, planning of dental treatment for a patient comprises: (a) obtaining data about an area which is to be treated and data about a face of a patient; (b) performing a computer-assisted analysis of the data to determine properties of at least the face of the patient; (c) creating a modified tooth set-up using a set of stored rules which make use of the determined facial properties. A three-dimensional representation simulates the appearance of the modified tooth set-up and the patient's face surrounding the treatment area. The method also determines properties of existing teeth and creates a modified tooth set-up which is also based on the existing teeth of the patient. The method can be implemented as software running on a workstation.

Fig. 1

exocad GmbH, et al.
Exhibit 1006
`
`
`
`
WO 2008/128700
PCT/EP2008/003072
`
COMPUTER-ASSISTED CREATION OF A CUSTOM TOOTH SET-UP USING FACIAL ANALYSIS
`
FIELD OF THE INVENTION

This invention relates generally to the field of computer technology used for the planning of dental treatments and to computer software tools for planning an optimised tooth (and soft tissue) set-up for a patient, as well as to systems and methods for planning an optimised tooth (and soft tissue) set-up for a patient.
`
BACKGROUND TO THE INVENTION

For dental or orthodontic treatment one or more imaging modalities such as orthopantomograms (dental X-rays), computerized tomography (CT) scans or digital photographs are commonly used to analyze, diagnose and document a patient's condition. Recently, digital patient information has also found its way into the planning stage of treatment. Several software solutions exist for simulating dental implant placement in medical (CT) images (SimPlant™, Materialise, Belgium); orthodontic treatment can be simulated using digitized information of the patient's dentition (OrthoCAD, Cadent, U.S.; Invisalign, Align Technologies, U.S.); and maxillofacial reconstructions can be planned in a virtual environment (SimPlant CMF, Materialise, Belgium). While these solutions provide powerful tools that let the clinician try out different alternatives at a functional level, the implications of these alternatives at an aesthetical level are generally far from clear, or are in some cases disregarded altogether when choosing the clinical approach.
`
WO 2004/098378 and WO 2004/098379 describe a workstation for creating a virtual three-dimensional model of a patient using several imaging sources, such as a CT scan, an X-ray and photographs. Software tools allow a trained user to manipulate the model to simulate changes in the position of teeth, such as through orthodontic treatment. The tools described in these documents can be used to plan treatment, and can present a simulation of the outcome of the treatment to a patient. However, as these tools give the user a considerable degree of freedom in the treatment planning, with many decisions to be made by the user, they still require an experienced user to plan the treatment.

Accordingly, the present invention seeks to provide an improved way of planning dental treatments for a patient.
`
`
SUMMARY OF THE INVENTION

An object of the present invention is to provide computer based methods and systems for the planning of dental treatments and computer software tools for planning an optimised tooth (and soft tissue) set-up for a patient.

A first aspect of the present invention provides a method for automatic, or semi-automatic, planning of dental treatment for a patient comprising:
(a) obtaining data about an area which is to be treated and data about a face of a patient;
(b) performing a computer-assisted analysis of the data to determine properties of at least the face of the patient; and
(c) creating a modified tooth set-up using a set of stored rules which make use of the determined facial properties.
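Steps (a)-(c) can be sketched as a minimal, illustrative pipeline. The data structures, function names and the single example rule below are assumptions made for illustration only, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class FacialProperties:
    face_shape: str              # 'square', 'tapered' or 'oval'
    nose_base_width_mm: float
    interpupillary_line: tuple   # two pupil centres, (x, y, z) each

def analyse_face(face_data):
    # step (b): computer-assisted analysis; stubbed here with fixed values
    return FacialProperties('oval', 34.0, ((30, 60, 0), (92, 60, 0)))

def create_tooth_setup(treatment_data, props, rules):
    # step (c): apply each stored rule in turn to refine the set-up
    setup = dict(treatment_data)
    for rule in rules:
        setup = rule(setup, props)
    return setup

# one illustrative stored rule: the total width of the four maxillar
# incisors should approximate the width of the nose base
def incisor_width_rule(setup, props):
    setup['incisor_total_width_mm'] = props.nose_base_width_mm
    return setup

plan = create_tooth_setup({'area': 'maxilla'},          # step (a) data
                          analyse_face(face_data=None),
                          [incisor_width_rule])
```

Representing each stored rule as a function from (set-up, facial properties) to an updated set-up keeps the rule set open-ended, which matches the non-exhaustive rule table given later in the description.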
`
For the purpose of this application the term 'dental treatment' includes, but is not limited to, prosthetic reconstructions on natural teeth (crown and bridgework, veneers), loose prostheses, prosthetic reconstructions supported by implants, corrections of the soft tissue (i.e. the gums of the patient, mucosa and gingiva) and orthodontic treatments, i.e. treatments to correct the position of teeth.
`
The invention recognises that dental treatment needs to be planned in the context of a patient's face, to provide a result which is aesthetically pleasing as well as being clinically correct. The invention also provides a tool for achieving this, by performing a computer-assisted analysis of facial characteristics, and the use of stored rules to create an optimum tooth and soft tissue set-up. This greatly simplifies the process of creating the modified tooth and soft tissue set-up.
`
Preferably, the method further comprises generating a three-dimensional representation which simulates the appearance of at least the treatment area with the modified tooth set-up. The three-dimensional representation preferably also simulates the appearance of the patient's face surrounding the treatment area. This allows a patient to view, in advance of the treatment, the post-treatment effects of the modified tooth and soft tissue set-up. Preferably, the three-dimensional representation is made as life-like as possible by the use of colour and texture on the prosthetic teeth used in the modified set-up. The effect of the modified tooth set-up on surrounding facial features
`
`
(e.g. lips) can also be shown using the three-dimensional representation. This will allow a patient to assess the aesthetical outcome of dental treatment either subsequent to or, more ideally, prior to the selection of the type of clinical treatment. For example, a patient may be offered the choice of a treatment with dental implants, a treatment using crown and bridgework and a treatment using a loose prosthesis, and each of these treatment options can be visualised. Such an approach is highly advantageous for the patient, who in an early stage is more involved in the decision making process and is better informed about the aesthetical implications of the different alternatives (e.g. grinding down of teeth vs. implant placement to allow anchoring of a bridge; stripping of the teeth vs. tooth extraction to solve crowding along the dental arch, etc.).
`
The functionality of this invention can be implemented in software, hardware or a combination of these. The invention can be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed processor. Accordingly, another aspect of the invention provides software comprising instructions (code) which, when executed by a computer or processor, implements the method. The software may be tangibly embodied on an electronic memory device, hard disk, optical disk or any other machine-readable storage medium, or it may be downloaded to the computer or processor via a network connection.

A further aspect of the invention provides apparatus for performing the method.
`
BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of the invention will be described, by way of example only, with reference to the accompanying drawings in which:
Figure 1 schematically shows a workstation for implementing the present invention;
Figure 2 shows a flow chart of a method according to an embodiment of the present invention;
Figure 3 shows one way of registering a 3D photograph and digitised plaster casts using a face bow;
Figure 4 shows an example of an aesthetical rule in which the width of the maxillary incisors should be equal to the width of the nose base;
Figure 5 shows an example of an aesthetical rule in which the distance between eyebrow and nose base should be equal to the distance between nose base and top of chin
`
during occlusion;
Figure 6 shows an example of an aesthetical rule in which the occlusal plane or the line connecting the cusps of the maxillar canines should be parallel to the interpupillary line;
Figure 7 shows buccal corridors during smiling;
Figure 8 shows an example of a class I molar relationship;
Figures 9A-9C show an example of modifying the functional properties of a prosthetic tooth;
Figure 10 shows the reconstruction of missing teeth by means of library teeth;
Figure 11 shows the application of texture to library teeth to give a life-like representation of reconstructed teeth;
Figure 12 shows an alternative view of reconstructed teeth.
`
DESCRIPTION OF PREFERRED EMBODIMENTS

The present invention will be described with respect to particular embodiments and with reference to certain drawings, but the invention is not limited thereto, only by the claims. The drawings described are only schematic and are non-limiting. In the drawings, the size of some of the elements may be exaggerated and not drawn to scale, for illustrative purposes. Where the term "comprising" is used in the present description and claims, it does not exclude other elements or steps. Furthermore, the terms first, second, third and the like in the description and in the claims are used for distinguishing between similar elements and not necessarily for describing a sequential or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances and that the embodiments of the invention described herein are capable of operation in other sequences than described or illustrated herein.
`
Figure 1 schematically shows a system for implementing an embodiment of the present invention. The system can take the form of a computer workstation 20, such as a general purpose PC, which has a processor 22, memory/storage 24 and a display 10. Software 25 to implement the invention is stored in memory 24 and executed by the processor 22. A user can interact with the workstation using a keyboard 21, mouse 23 or another input device such as a graphics tablet or an electronic stylus. Workstation 20 receives inputs from a variety of imaging sources, such as a computerized tomography (CT) scanner 31, a dental X-ray machine 32, a digital camera 33 and an
`
`
optical scanner 34. Each of the imaging sources 31-34 can be manipulated by a user to acquire the image data, and then send this data to the workstation. Alternatively, one or more of the imaging sources 31-34 can be under the control of the workstation 20, with the workstation 20 automatically controlling operation of those imaging sources to acquire the image data. As an example, the workstation 20 can control the digital camera 33 to acquire a picture from each of three predetermined views with respect to the patient. The acquired image data 30 from each imaging source can be stored in the raw form in which it is acquired, or can be processed to convert it into a form in which it can be more readily combined with image data from other sources. This data (in raw or processed format) can be stored 35 within the workstation 20, or externally of the workstation, such as on an external storage device or server which is networked to the workstation 20. Other data 37 about a patient, such as their medical history, can also be stored 35.
The image data 30 that has been acquired from the imaging sources 31-34 is used to generate a virtual, three-dimensional model 56 which is a life-like representation of at least the area of the human body to be treated. Typically, this area will be the patient's jaw, teeth (if any are remaining) and soft tissue surrounding these parts, such as the gums, lips and skin on the outer surface of the face. The extent of the 3D model can be restricted just to the area to be treated and the soft tissue immediately surrounding this area, or it can extend to the entire face and head of the user.
`
Figure 2 shows a flow chart which outlines the main steps of a method of planning treatment in accordance with an embodiment of the invention. Each of the steps will be described in detail.
`
Acquiring image data (steps 60, 61, Figure 2)

According to one embodiment of the present invention, the 3D model is created by making 3D measurements of the area to be treated and by converting the measurement data into a digital solid or surface model (for instance, in standard triangulated language [.stl] format). Images from digital 2D or 3D photographs, or from scanned printed photographs, of the same area are then mapped onto this model. A 3D photograph is taken by an optical device that allows capturing the 3D geometry/shape of the object as well as its texture (and optionally colour). In general the device comprises a laser scanner to measure the 3D geometry/shape and a camera
`
`
for imaging the texture. Both the 3D geometry description and the texture are then combined in one 3D image. A 3D photograph can be taken by a fixed camera or by a moving camera. In the latter case a 3D photograph showing all sides (front, left, back, and right side) of the object is created.
`
The 3D measurement can be performed directly or indirectly on the area to be treated. A direct measurement can take the form of a CT scan of the patient, or an optical scan of the head of a patient. A CT scan gives detail about both soft tissue and bone in a 3D co-ordinate system, by providing a stack of 2D images. Based on these 2D images, a 3D model of the bone or face can be reconstructed. An optical scan of the patient's head can give information about the outer shape and surface features of the face and head. In addition, a small optical scanner can be used to scan the intra-oral region.
`
An indirect measurement can take the form of an optical scan of a physical replica of the area to be treated, such as a plaster cast manufactured from an impression which has been taken of the area to be treated. Measuring techniques can include, but are not limited to, non-contact scanning using laser, white light or the like; tactile scanning using a measurement probe; and volumetric scanning such as CT, MRI, µCT, etc. The term 'CT' as used here refers to medical CT scanners, where the object remains fixed and the source and detector turn around the object, and which result in images with a pixel size of about 0.25 mm or more. The term 'µCT' refers to non-medical CT scanners, where typically the object turns and the source and detector are fixed, and which result in images with a typical pixel size 10 to 20 times smaller than that achieved with a CT scan. µCT generally results in more accurate images and can also accurately visualize much smaller details.
`
Converting the measurement data into a digital model will, depending on the applied measurement technique, involve a series of commonly known data processing techniques such as image segmentation and point cloud meshing. Data derived from different imaging sources (e.g. CT, optical scan ... ) needs to be combined into a single model. Initially, a separate model is constructed from each image data source (e.g. a model for CT scan data, a model for optical scan data) and the set of individual models is then combined into a single model. One of several known techniques may be used to combine the models:
`
the 3D models can be registered onto each other by manually translating and/or rotating one of the 3D models with respect to the other. The models are displayed on display 10 of the workstation 20 and an operator manipulates the models.

the 3D models are registered onto each other by indicating corresponding points on both 3D models and applying an N-points registration algorithm. Afterwards an automatic optimization of the registration is possible using a registration optimisation program such as a least-squares registration algorithm.

the 3D models are registered onto each other using a fully automatic registration algorithm based on feature recognition. For example, the registration may be done by a cloud-of-points technique or it may be done by automatically identifying common features in the images.
`
Such techniques are described, for example, in: P.J. Besl and N.D. McKay, "A method for registration of 3-d shapes", IEEE Trans. Pat. Anal. and Mach. Intel. 14(2), pp 239-256, Feb 1992; R. San-Jose, A. Brun and C.-F. Westin, "Robust generalized total least squares iterative closest point registration", in C. Barillot, D.R. Raynor, and P. Hellier (Eds.): MICCAI 2004, LNCS 3216, pp 234-241, 2004; A. Gruen and D. Akca, "Least squares 3D surface and curve matching", ISPRS Journal of Photogrammetry and Remote Sensing 59(3), pp 151-174, May 2005.
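As an illustration of the least-squares and ICP registration referred to above, a minimal sketch follows. It implements the standard SVD-based rigid alignment step and a brute-force nearest-neighbour ICP loop; it is a generic textbook illustration, not the patent's own implementation or that of the cited papers:

```python
import numpy as np

def best_rigid_transform(P, Q):
    """Least-squares rotation R and translation t mapping point set P onto Q
    (rows of P and Q are corresponding 3D points)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                    # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

def icp(P, Q, iterations=20):
    """Naive ICP: repeatedly match each point of P to its nearest
    neighbour in Q, then apply the best rigid transform."""
    P = P.copy()
    for _ in range(iterations):
        d = ((P[:, None, :] - Q[None, :, :]) ** 2).sum(-1)  # all pairwise
        matched = Q[d.argmin(axis=1)]                       # nearest in Q
        R, t = best_rigid_transform(P, matched)
        P = P @ R.T + t
    return P
```

When the user has indicated corresponding points on both models, `best_rigid_transform` alone is the N-points registration; `icp` is the subsequent automatic optimisation that no longer needs known correspondences, provided the initial alignment is close.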
`
Photographs (2D or 3D) can be scaled to a required dimension using one of several techniques:

a calibration piece, i.e. a piece with exactly known geometric dimensions, can be added in the field of view of the camera while taking photographic images of the patient. This allows exact scaling of the photographs afterwards.

measurements can be performed on photographs and 3D models by using anatomical reference distances (e.g. interpupillary distance ... ) to determine the scale factor for the photographs.

The scaling can be done automatically by automatically detecting reference points or features in the images and scaling these to match each other.
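The anatomical-reference scaling described above reduces to a single ratio between a known real-world distance and its measured counterpart in the image. A minimal sketch follows; the 63 mm interpupillary value is an illustrative assumption, not a figure from the patent:

```python
def scale_factor(measured_image_units, reference_mm):
    """Factor converting image units to mm from one known reference distance."""
    return reference_mm / measured_image_units

# interpupillary distance measured as 180 units in the photograph;
# assume a true distance of 63 mm (illustrative value, not from the patent)
f = scale_factor(180.0, 63.0)

# any other measurement in the same photograph can now be scaled
nose_base_mm = 100.0 * f   # 100 image units -> 35.0 mm
```

The same factor applies to every distance in the photograph, which is why a single calibration piece or a single reference distance suffices per image.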
`
For mapping of the 2D or 3D photographs onto the digital model one of several techniques may be used when photographs and digital models contain identical surfaces (e.g. teeth visible in photograph, facial skin ... ):

Manual registration: The photograph is aligned with the digitized treatment
`
`
area. The photograph can be scaled and translated. The 3D representation of the treatment area can be rotated. The user rotates the representation to adapt its orientation to match the angle under which the photograph was made. The size of the photograph is adjusted and the image is translated until it is aligned with the view on the 3D representation. The steps are repeated to tune the registration.

Semi-automatic registration: The user rotates the representation to adapt its orientation to match the angle under which the photograph was taken. Photograph and 3D representation are shown side-by-side. Reference points are indicated on both to mark corresponding features. A final mapping is performed either by: a least-squares algorithm/n-point registration/ICP (Iterative Closest Point) registration, which will find the optimal transformation necessary to align both sets of points; or by an exact matching at the location of the reference points and minimal deformations in between, using an RBF (radial basis functions) optimization approach.
`
Automatic registration: Registration applying feature recognition.
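The "exact matching at the reference points with minimal deformations in between" option can be illustrated with a small RBF interpolation sketch. The linear kernel φ(r) = r is chosen only for simplicity (thin-plate splines are a common alternative), and nothing here is the patent's own implementation:

```python
import numpy as np

def rbf_warp(src, dst):
    """Warp mapping each src reference point exactly onto its dst
    counterpart, deforming smoothly in between (linear RBF, phi(r) = r)."""
    # pairwise distances between the reference points
    G = np.linalg.norm(src[:, None, :] - src[None, :, :], axis=-1)
    # one displacement weight vector per reference point
    W = np.linalg.solve(G, dst - src)
    def warp(p):
        g = np.linalg.norm(p - src, axis=-1)
        return p + g @ W
    return warp
```

Because the interpolation is exact at the reference points, the marked features coincide after warping, while points in between are displaced by a smooth blend of the reference displacements.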
`
In a case where no identical surfaces are available (e.g. mapping of a 2D or 3D photograph of an edentulous patient onto digitized 3D models of the maxillar and mandibular plaster casts) the above-mentioned registration techniques cannot be used. In these cases a preferential approach makes use of face bow measurements to map the different data sets. Referring to Figure 3, a face bow is a mechanical device used in dentistry to record the positional relations of the maxillary arch to the temporomandibular joints, and to orient dental casts in this same relationship to the opening axis of the mechanical articulator. A face bow consists of two metal parts attached together. The first part 3, called the bite fork, is shaped like a horseshoe and is inserted in the mouth of the patient and clamped between upper and lower jaw. The second part comprises two curved elements 1, 9. The ends 8 of the first curved element 1 are positioned in the ear channels of the patient. The second curved element 9 forms a nasal guide that is put in contact with the nose of the patient. The bite fork 3 is fixed to the second curved element 9. The current position of all parts of the face bow is maintained and then used to transfer the plaster cast into the corresponding mechanical articulator. This implies that the face bow used for transfer of the occlusion from the patient's mouth to the mechanical articulator is now virtually created and
`
`
positioned onto the 3D photograph of the patient (Figure 3). The bite registration 3 is also digitized and used to register the digital 3D models of the patient's jaws in the same coordinate system as the 3D photograph. In the case of 2D photographs, a virtual face bow cannot be used and a preferential method in this case is using the default values (as used in a mechanical articulator) to position the 3D models of the patient's jaws in correct relation to the intercondylar axis, which can be defined on the 2D photograph of the patient's face.

As an alternative to the above described method, a three-dimensional model of the area to be treated can be built directly from a 2D video sequence, such as by matching objects and features appearing in images which have been acquired from different viewpoints. Since the video data inherently holds information relating not only to the spatial coordinates of the captured points but also to their colour, texture, etc., the calculated reconstruction can be made to reflect each of these qualities, thereby achieving a life-like model.
`
The composite 3D model created at step 61 should preferably include the face of the patient to allow facial analysis to be based on the model. The 3D model used to plan a modified tooth set-up does not have to be life-like, but this information is useful to visualize to the user and patient the effects of the treatment, and can be rendered in the final stage 66 of the method when a virtual representation of the tooth set-up following treatment is displayed to a user and a patient.
`
Facial analysis (steps 62, 63, Figure 2)

According to one embodiment of the invention the 3D model of the patient, which has been created in one of the ways described above, is analysed to determine information about the aesthetical appearance of the face and/or of the area to be treated. This analysis can be fully automatic, or semi-automatic. In a semi-automatic analysis, the computer program prompts the user to indicate certain anatomical points and/or lines on the face of the patient, which are needed for the facial analysis. The user marks these points on the graphical representation of the face by using an input tool such as a mouse 23, keyboard 21, graphics tablet, electronic stylus, etc. The program then performs facial analysis based on measurements between these marked points and automatically creates or modifies the tooth set-up as described below. The following table, and Figures 4-6, show some example anatomical points which the program can prompt a user to mark. Even in the semi-automatic embodiment, the program can be arranged to automatically determine some of the facial features without any user prompting and input such as, for example, the overall shape of a patient's face (rule A) and the interpupillary line (rule D).
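The marked-landmark measurements that feed the rules can be sketched as follows; the landmark names and coordinate values are purely illustrative assumptions, not data from the patent:

```python
import math

# anatomical points marked by the user on the 3D model (illustrative mm values)
landmarks = {
    'pupil_left':      (30.0, 60.0, 0.0),
    'pupil_right':     (92.0, 60.0, 0.0),
    'nose_base_left':  (48.0, 30.0, 5.0),
    'nose_base_right': (74.0, 30.0, 5.0),
}

def distance(a, b):
    """Euclidean distance between two marked landmarks."""
    return math.dist(landmarks[a], landmarks[b])

# input to rule B: nose-base width, the target total width
# of the four maxillar incisors
incisor_target_width = distance('nose_base_left', 'nose_base_right')

# input to rule D: direction of the interpupillary line, which the
# occlusal plane (or the canine cusp line) should parallel
ipl = tuple(r - l for l, r in zip(landmarks['pupil_left'],
                                  landmarks['pupil_right']))
```

Each entry in the rule table below then consumes one or more of these measurements, which is what makes the semi-automatic flow possible: the user supplies only the points, the program derives the distances, lines and angles.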
`
A set of general aesthetical rules use the results of the facial analysis to create an aesthetically optimal dental configuration or tooth set-up, based on the particular characteristics of the patient's face. The following table gives a non-exhaustive list of fourteen possible facial analyses and corresponding rules:
`
A. Aesthetical analysis: Determine the shape of the patient's face and, if available, the patient's teeth. Three main facial shapes exist: (i) rectangular or square shaped. A rectangular or square shaped face has substantially the same width at the forehead and just below the cheekbones; (ii) tapered. A tapered face is wide at the forehead and narrows to a small delicate chin; (iii) oval. An oval face is slightly wider at the cheekbones than at the forehead or jaw-line. Teeth are classified in three different shapes: tapered, ovoid, and square-shaped. If a patient has any remaining teeth, the shape of the teeth can be determined based on the digitized information of the patient's remaining dentition.
Aesthetical rule: The optimal tooth shape is selected according to the following rules: (1) In partially edentulous cases (i.e. the patient has some teeth remaining) the tooth shape is determined based on the shape of the remaining natural teeth and/or the shape of the patient's face. (2) In edentulous cases the tooth shape is chosen based solely on the analysis of the shape of the patient's face. A rectangular or square shaped face corresponds with square-shaped teeth. A tapered face corresponds with tapered-shaped teeth. An oval face corresponds with ovoid-shaped teeth.
`
`
B. Aesthetical analysis: Determine the width of the nose base (see 4, Figure 4).
Aesthetical rule: Design or reshape the four maxillar incisors so that their total width (5, Figure 4) is approximately equal to the width of the nose base (Gerber).
`
C. Aesthetical analysis: Determine the distance between eyebrow and nose base (see Figure 5).
Aesthetical rule: Position the occlusal plane relative to the patient's face so that the distance between the nose base and the top of the chin during occlusion is equal to said distance between eyebrow and nose base.
`
D. Aesthetical analysis: Determine the interpupillary line, i.e. the line connecting the centre of the eyes (6, Figure 6).
Aesthetical rule: Reconstruct or correct the teeth so that the occlusal plane or the line connecting the cusps of the maxillar canines (7, Figure 6) is parallel to said interpupillary line.
`
E. Aesthetical analysis: Determine the symmetry line of the face, i.e. the line from the centre of the forehead along the subnasal point to the centre point of the chin.
Aesthetical rule: Angulate or reorient the frontal maxillar incisors so that their facial axis is parallel to said symmetry line, and position the central incisors so that their contact point lies on said symmetry line.
`
F. Aesthetical analysis: Determine the nasio-labial angle, i.e. the angle between the columella of the nose and the anterior surface of the upper lip, measured in a sagittal (lateral) view of the patient's face.
Aesthetical rule: Reconstruct or correct the maxillar incisors so that the nasio-labial angle is approximately 90°. Therefore a soft tissue simulation is needed to predict the tooth position for the upper lip position, more particularly with a nasio-labial angle of 90°.
`
`
G. Aesthetical analysis: Determine, in a sagittal (lateral) view of the patient's face, the distance of the upper and lower lip to the line through the tip of the nose and the chin.
Aesthetical rule: Reconstruct or correct the teeth so that the distance of the upper lip to said line is 4 mm and the distance of the lower lip to said line is 2 mm.
`
H. Aesthetical analysis: Determine the position of the upper lip while smiling.
Aesthetical rule: Position or correct the frontal maxillar teeth so that only one quarter of their height is covered by the upper lip while smiling. For some patients the smile-line, i.e. the borderline of the upper lip during normal smiling, is much higher than is ideal, and the upper gum is exposed. In these cases a gingival correction is needed to allow implant placement in the frontal maxilla. Without gingival correction pink porcelain will be needed in the prosthetic reconstruction, and this is not compatible with the necessary interdental spaces for cleaning purposes of the implants.
`
I. Aesthetical analysis: Determine the curve formed by the lower lip while smiling.
Aesthetical rule: Position or correct the frontal maxillar teeth so that their incisal edge is parallel to said curve and just touching the lower lip or showing a slight gap.
`
`
J. Aesthetical analysis: Determine the buccal corridor, i.e. the small space visible between the angles of the mouth and the teeth, during smiling (12, Figure 7).
Aesthetical rule: Determine or adapt the maxillar dental arch shape, as well as the orientation of the maxillar premolars and molars, to obtain a normal size of said buccal corridor. A too wide dental arch will result in no buccal corridor, while a too small dental arch will result in a buccal corridor that is too prominent.
`
K. Aesthetical analysis: Determine the width to height ratio of the maxillar central incisors.
Aesthetical rule: Adapt the maxillar central incisors if needed to approximate the ideal value of 80% for the width to height ratio.
`
L. Aesthetical analysis: Determine the proportion of maxillar central incisor width to lateral incisor width to canine width.
Aesthetical rule: Adapt the maxillar incisors and canines if needed to obtain the ideal width proportion of 1.6, 1, and 0.6 respectively.
`
M. Aesthetical analysis: Determine the position of the upper lip during talking.
Aesthetical rule: Adapt the position or size of the maxillar inc