Exhibit 1005
Mako Surgical Corp. Ex. 1005

[USPTO file wrapper face page]

Inventors: Gabriel Brisson; Takeo Kanade; Anthony DiGioia, III; Branislav Jaramaz
GAU / EXAMINER

CONTINUING DATA VERIFIED: This appln claims benefit of 60/377,695 05/03/2002

WARNING: The information disclosed herein may be restricted. Unauthorized disclosure may be prohibited by the United States Code Title 35, Sections 122, 181 and 368. Possession outside the U.S. Patent & Trademark Office is restricted to authorized employees and contractors only.

FILED WITH: [ ] DISK (CRF)  [ ] CD-ROM  (Attached in pocket on right inside flap)
[ ] TERMINAL DISCLAIMER  [ ] FEE IN FILE

Page 1
`
FOREIGN APPLICATIONS VERIFIED

PUB / DO NOT PUBLISH: no
Foreign priority claimed: no
35 USC 119 conditions met; verified and acknowledged (Examiner's initials)

TITLE: Methods and systems to control a shaping tool
Application No. 10/427,093

CONTENTS / Date Received (Incl. C. of M.) / INITIALS

Page 2
`
`
`
`
SEARCH NOTES (List databases searched. Attach search strategy inside.)
INTERFERENCE SEARCHED: Class / Sub. / Date / Exmr.

Page 3

ISSUE SLIP STAPLE AREA - ISSUING CLASSIFICATION
CROSS REFERENCE(S): CLASS / SUBCLASS (ONE SUBCLASS PER BLOCK)
INTERNATIONAL CLASSIFICATION
INDEX OF CLAIMS
(If more than 150 claims or 9 actions, staple additional sheet here)

Page 4
`
`
`
`
`
`
`
`
`
`
`
(12) United States Patent
Brisson et al.

US006757582B2

(10) Patent No.:     US 6,757,582 B2
(45) Date of Patent: Jun. 29, 2004

(54) METHODS AND SYSTEMS TO CONTROL A
     SHAPING TOOL
`(75)
`
`Inventors: Gabriel Brissim, Pittsburgh, PA (US);
`Takeo Kanade, Pittsburgh, PA (US);
`Anthony DIGiola, HI1, Pittsburgh, PA
`(US); Branislav Jaraniz, Pittsburgh,
`PA (US)
`
`(73) Assignee: Carnegie Mellon University,
`Pittsburgh, PA (US)
`
`(lNotice:
`
`Subject to any disclaimer, the term of this
`patent is extended Or adjusted under 35
`U.S.C. 154(b) by 0 days.
`
`Appl. No.: 10/427,093
`
`Filed:
`
`Apr. 30, 2003
`
(65) Prior Publication Data
     US 2003/0208296 A1, Nov. 6, 2003

Related U.S. Application Data
(60) Provisional application No. 60/377,695, filed on May 3,
     2002.

(51) Int. Cl.7 .................................. G06F 19/00
(52) U.S. Cl. .................. 700/186; 83/76.8; 606/128
(58) Field of Search ............... 700/186, 163, 159, 245;
     606/128; 318/568.11; 144/3.1;
     83/76.8, 367, 370; 451/5
`
References Cited
U.S. PATENT DOCUMENTS

4,660,513 A  *  4/1987  Brumbach ................. 606/128
5,449,363 A  *  9/1995  Brust et al. ............. 606/128
6,091,168 A     8/2000  Katoh et al. ......... 318/568.11
6,501,997 B1 * 12/2002  Kakino ................... 700/159
6,520,228 B1    2/2003  Kennedy et al. ........... 83/76.8

* cited by examiner

Primary Examiner-Albert W. Paladini
(74) Attorney, Agent, or Firm-Kevin A. Oliver; Foley
Hoag LLP
`(57)
`
`ABSTRACT
`
`A method and system for providing control that include
`providing a workpicce that includes a target shape provid-
`ing a cutting tool, providing a 3-D image associated with the
`workpiece, identifying the target shape within the workpiece
`image, providing a 3-D image associated with the cutting
`tol registering the workpiece with the workpiece image,
`the cutting tool with the cutting tool image,
`registering
`tracking at least one of the workpicce and the cutting tool,
`transforming the tracking data based on image coordinates
`to determine a relationship between the workpiecee and the
`cutting tool, arnd, based on the relationship, providing a
`control to the cutting tool. In one embodiment, the work-
`piece image can be represented as volume pixels (voxels)
`that can be classified and/or reclassified based on target
`shape, waste, and/or workpicce.
`
`65 Claims, 13 Drawing Sheets
`
Page 5
`
`
`
U.S. Patent    Jun. 29, 2004    Sheet 1 of 13    US 6,757,582 B2

FIGURE 1 (flow chart):
GENERATE WORKPIECE IMAGE
INTEGRATE TARGET SHAPE INTO WORKPIECE IMAGE
"VOXELLATE" INTEGRATED WORKPIECE IMAGE
GENERATE CUTTING TOOL IMAGE
CALIBRATE POINT PROBE
ASSOCIATE MARKERS WITH CUTTING TOOL AND WORKPIECE
ITERATIVELY:
  VOXELLATE INTEGRATED WORKPIECE IMAGE
  TRACK CUTTING TOOL AND WORKPIECE
  PROVIDE CONTROL TO CUTTING TOOL

Page 6
`
`
`
U.S. Patent    Jun. 29, 2004    Sheet 2 of 13    US 6,757,582 B2

FIGURE 2A (flow chart):
TRACK MARKERS ON WORKPIECE AND/OR CUTTING TOOL
(RECEIVE TRACKING DATA)
TRANSFORM TRACKING DATA TO IMAGE COORDINATES
INTERSECTION DETECTION/COMPUTE CONTROL
TRANSMIT CONTROL TO CUTTING TOOL AND UPDATE
IMAGE/VOXELS BASED ON INTERSECTION DETECTION

Page 7
`
`
`
U.S. Patent    Jun. 29, 2004    Sheet 3 of 13    US 6,757,582 B2

FIGURE 2B (flow chart, reference numerals 112-126):
ESTABLISH GRID OF VOXELS
INCORPORATE WORKPIECE IMAGE DATA TO GRID
CLASSIFY VOXELS
REGISTER WORKPIECE/CUTTING TOOL TO IMAGES
UPDATE WORKPIECE AND/OR CUTTING TOOL IMAGES
WITH TRACKING DATA
PERFORM COLLISION/INTERSECTION DETECTION/
COMPUTE CONTROL
UPDATE IMAGE/VOXELS (CLASSIFICATION, DIVISION,
COMBINATION, ...)/TRANSMIT CONTROL

Page 8
`
`
`
U.S. Patent    Jun. 29, 2004    Sheets 4-13 of 13    US 6,757,582 B2

FIGURE 3A (Sheet 4 of 13): drawing; reference numerals 1100, 1102, 1104, 1108. [Page 9]
FIGURE 3B (Sheet 5 of 13): drawing; reference numerals 1100, 1101. [Page 10]
FIGURE 4A (Sheet 6 of 13): voxel-grid drawing; reference numerals 1104, 1110, 1112, 1120, 1122, 1132, 1134. [Page 11]
FIGURE 4B (Sheet 7 of 13): voxel-grid drawing; reference numerals 1102', 1104, 1120, 1120', 1122, 1132, 1134. [Page 12]
FIGURE 4C (Sheet 8 of 13): voxel-grid drawing; reference numerals 1104, 1110', 1122, 1130, 1132, 1134. [Page 13]
FIGURE 5 (Sheet 9 of 13): drawing. [Page 14]
FIGS. 6, 7, and 8 (Sheet 10 of 13): drawings. [Page 15]
FIGS. 9A and 9B (Sheet 11 of 13): drawings. [Page 16]
FIGS. 10A, 11A, and 11B (Sheet 12 of 13): drawings; reference numeral 502. [Page 17]
FIGS. 12 and 13 (Sheet 13 of 13): drawings; reference numeral 404. [Page 18]
`
`
`
US 6,757,582 B2

1

METHODS AND SYSTEMS TO CONTROL A
SHAPING TOOL

CROSS-REFERENCE TO RELATED
APPLICATION

This application claims benefit of priority to U.S. Provisional Patent Application Serial No. 60/377,695, filed May 3, 2002, the contents of which are herein incorporated by reference in their entirety.
`
FIELD

This disclosure relates generally to controls, and more specifically to controlling a tool for shaping a workpiece.

BACKGROUND

A wide variety of applications call for objects to be formed from general workpieces, where the object formation can include cutting the object from the workpiece. The precision required of the cut may depend on the particular application.

One application that can call for high precision is the shaping of bone during surgery. For example, in a surgery such as a total knee replacement (TKR), the bones to which a prosthetic knee may be attached, typically the femur and the tibia, can be shaped to facilitate stable and effective implantation of the prosthesis.

Some cutting systems achieve increased accuracy by fixating the workpiece, such as bone. Bone fixation can be accomplished using a screw(s) and/or a clamp(s) to secure the bone to a secure positioning brace. Fixation can be used for many robotic orthopedic surgery systems because such systems depend on a fixed target and cannot or do not track the target. Fixation may be used in robotic systems despite the risks to the patient that can include pain, infection, and increased recovery and rehabilitation periods caused by the invasive nature of fixation.

Robotic and other surgical systems may be susceptible to failure that can occur suddenly and can cause damage or injury to the subject. Controllable or detectible failure may be detected before harm is done. Undetectable failure may cause damage when the system appears to be functioning normally. If a robot provides power drive or power assistance to an operator, failure may result when the robot malfunctions because the operator may be unable to react in time or with sufficient force to prevent the robot components from injuring the subject. Robotic systems may also be susceptible to undetectable failures, since the operator may not be in full or even partial physical control of the robot.
`
SUMMARY

The disclosed methods and systems include a control method that includes providing a workpiece that includes a target shape, providing a cutting tool, providing a 3-D image associated with the workpiece, identifying the target shape within the workpiece image, providing a 3-D image associated with the cutting tool, registering the workpiece with the workpiece image, registering the cutting tool with the cutting tool image, tracking the workpiece and/or the cutting tool, transforming the tracking data based on image coordinates to determine a relationship between the workpiece and the cutting tool, and, based on the relationship, providing a control to the cutting tool. The control can include an analog signal, a digital signal, a control to at least partially retract a cutting element associated with the cutting tool, a control to reduce the speed of a cutting element associated with the cutting tool, a control to stop a cutting element associated with a cutting tool, or another control.
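The control sequence summarized above (receive tracking data, transform it to image coordinates, evaluate the relationship, and issue a control) can be sketched in Python. This is an illustrative reconstruction only; the class, the function names, and the retract/run decision rule are hypothetical stand-ins, not the patent's implementation.

```python
# Illustrative sketch of the tracked-cutting-tool control loop:
# track markers, transform tracking data to image coordinates,
# perform intersection detection, and transmit a control to the
# cutting tool. All names here are hypothetical.

class CuttingTool:
    def __init__(self):
        self.state = "run"

    def apply(self, control):
        # Transmit control to the cutting tool (e.g. retract the element).
        self.state = control

def step(tracker, to_image_coords, intersects, tool):
    workpiece_pose, tool_pose = tracker()           # receive tracking data
    tool_in_image = to_image_coords(workpiece_pose, tool_pose)
    hit = intersects(tool_in_image)                 # intersection detection
    control = "retract" if hit else "run"           # compute control
    tool.apply(control)                             # transmit control
    return control

# Usage with stand-in tracker/transform/detector:
tool = CuttingTool()
step(lambda: ((0, 0, 0), (1, 2, 3)),
     lambda wp, tp: tuple(t - w for w, t in zip(wp, tp)),
     lambda p: p[2] > 2.5,        # pretend the target lies below z = 2.5
     tool)
print(tool.state)   # prints "retract"
```

In a real system the stand-in lambdas would be replaced by marker-based six-degree-of-freedom tracking, a registration-derived coordinate transform, and voxel-based intersection detection.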
`The methods and system can include representing the
`workpicoc image using volume pixels (voxels), and classi-
`5fying the workpiece image voxels based on the target shape.
`Accordingly, based on the relationship between the cutting
`tool and the workpiece, the methods and systems can include
`re-classifying the voxels based on the relationship.
`The methods and systems can include providing an image
`10 based on CTscan data, X-ray data, MRI data, fluoroscopy
`data, and/or ultrasound data, The methods and systems can
`also include classifying such image data, represented as
`three dimensional volume pixels or "voxels,' where classi-
`fying the image voxels based on the target shape includes
`is distinguishing target shape voxcls and workpiee voxels. In
`an embodiment, distinguishing target and workpiece voxels;
`includes associating
`target shape voxels with the target
`shape and associating non-target shape voxels as waste.
`Color-coding voxels, such as target shape voxels associated
`with the target shape, can also be performed to distinguish
`20 voxels. The imageis and/or voxels can be displayed to a user
`to enable a user to view relative positions of the cutting tool
`and workpiece and/or target shape. In one embodiment, the
`methods and systems can include re-classifying the voxels
`based on the relationship.
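As an illustration of the voxel classification described above, the following Python sketch labels each workpiece-image voxel as either target shape or waste; the `in_target` predicate is a hypothetical stand-in for the target-shape model.

```python
# Sketch of classifying workpiece-image voxels against a target shape:
# voxels inside the target region are labeled "target"; the remaining
# workpiece voxels are labeled "waste".

def classify_voxels(workpiece_voxels, in_target):
    """Map each voxel coordinate to a classification string."""
    return {v: ("target" if in_target(v) else "waste")
            for v in workpiece_voxels}

# Usage: a 3x3x3 block of workpiece voxels with a one-voxel target core.
voxels = [(x, y, z) for x in range(3) for y in range(3) for z in range(3)]
labels = classify_voxels(voxels, lambda v: v == (1, 1, 1))
print(labels[(1, 1, 1)], labels[(0, 0, 0)])   # prints "target waste"
```

Color-coding for display would simply map these labels to colors instead of strings.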
Classifying and/or re-classifying voxels can include identifying mixture voxels that include part workpiece and part target shape, subdividing the mixture voxels, and iteratively identifying and subdividing mixture voxels to a predetermined voxel resolution. In one embodiment, mixture voxels can be understood to be voxels that can be associated with more than one classification, where exemplary voxel classifications can include target, workpiece, waste, empty, cutting tool, cutting element, or other classifications. Subdividing the mixture voxels can be performed based on an octree data structure. Further, the methods and systems can include recombining voxels having the same classification, where such recombining can generally be performed based on neighboring voxels of the same classification.
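The octree-based subdivision of mixture voxels described above can be sketched as follows; the `classify` oracle and the size threshold are illustrative assumptions, not the patent's data structures.

```python
# Sketch of octree-style subdivision: a "mixture" voxel (one overlapping
# more than one classification) is split into eight children, and mixture
# children are subdivided recursively until a predetermined resolution
# (min_size) is reached.

def subdivide(origin, size, classify, min_size):
    """Return a list of (origin, size, label) leaf cells."""
    label = classify(origin, size)
    if label != "mixture" or size <= min_size:
        return [(origin, size, label)]
    half = size / 2.0
    leaves = []
    for dx in (0, half):
        for dy in (0, half):
            for dz in (0, half):          # the eight octants
                child = (origin[0] + dx, origin[1] + dy, origin[2] + dz)
                leaves.extend(subdivide(child, half, classify, min_size))
    return leaves

# Usage: a unit cube whose target/waste boundary is the plane x = 0.5.
def classify(origin, size):
    if origin[0] + size <= 0.5:
        return "target"
    if origin[0] >= 0.5:
        return "waste"
    return "mixture"

leaves = subdivide((0.0, 0.0, 0.0), 1.0, classify, 0.25)
print(len(leaves))   # prints 8: the mixture root splits once at the boundary
```

Recombination of neighboring leaves with the same classification would be the inverse pass over this leaf list.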
The methods and systems can also include a probe that can be calibrated and employed to register the workpiece and/or the cutting tool to the workpiece image and the cutting tool image, respectively. The disclosed tracker can include a tracking method and system based on providing one or more markers on or otherwise associated with the workpiece and/or the cutting tool. The tracker can measure and/or determine at least one position and at least one angle associated with the workpiece and/or the cutting tool, where in one embodiment, the tracker can track in three positions and three angles to provide six degrees of freedom. The tracked data can thus be transformed to an image coordinate system to allow an updating of the respective image positions, angles, etc.
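The transformation of tracked data into an image coordinate system can be illustrated with a minimal Python sketch; a real registration would produce a full six-degree-of-freedom transform, while this example applies only a rotation about one axis plus a translation, with made-up values.

```python
# Sketch of mapping a tracked point into image coordinates: rotate about
# the z axis by a tracked angle, then translate by a registration offset.
# The yaw/translation values below are illustrative only.

import math

def to_image_coords(point, yaw, translation):
    """Rotate `point` about z by `yaw` radians, then translate."""
    x, y, z = point
    c, s = math.cos(yaw), math.sin(yaw)
    return (c * x - s * y + translation[0],
            s * x + c * y + translation[1],
            z + translation[2])

# A tracked point at (1, 0, 0), rotated 90 degrees and shifted by (0, 0, 5):
p = to_image_coords((1.0, 0.0, 0.0), math.pi / 2, (0.0, 0.0, 5.0))
print(tuple(round(v, 6) for v in p))   # prints (0.0, 1.0, 5.0)
```

A full six-degree-of-freedom version would use a 4x4 homogeneous transform combining three rotations and three translations.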
The image updating can also include (re)classifying voxels associated with the workpiece, where the reclassification can be based on the tracking data associated with the workpiece and/or the cutting tool. Such classifying and/or reclassifying can include identifying voxels associated with the workpiece that are eliminated by the cutting tool. The classifying and/or reclassifying can also include identifying mixture voxels, subdividing the mixture voxels, and, iteratively identifying and subdividing mixture voxels until reaching a predetermined voxel resolution. As provided previously, identifying mixture voxels includes identifying voxels having more than one classification. The subdividing can be based on an octree data structure. Voxel recombination of voxels having the same classification can also be performed.
`
Page 19
`
`
`
`US 6,757,582 B2
`
`3
Accordingly, the methods and systems include providing a control based on determining a distance between the cutting tool image and the voxels classified based on the target shape. In one embodiment, the control can be based on increasing the size of the cutting tool image to determine whether the increased-size cutting tool intersects with the target image. The cutting tool image can be increased by a fixed amount and/or based on tracking data associated with the cutting tool. The control provided to the cutting tool can thus be based on the relationship between a cutting element associated with the cutting tool image, and voxels classified based on the target shape.

In an embodiment, the control provided to the cutting tool can be based on the relationship between the cutting tool image and the voxels classified and/or associated with the target shape, where the relationship can be based on collision detection and/or intersection detection between at least part of the cutting tool and voxels associated with the target shape.
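The enlarged-cutting-tool intersection test described above can be sketched as follows, assuming (purely for illustration) a spherical cutting element grown by a fixed margin and point-like target voxels; the margin value and control names are not the patent's.

```python
# Sketch of the enlarged-tool-image check: the cutting element is modeled
# as a sphere whose radius is grown by a safety margin; if the grown
# sphere reaches any target-shape voxel, a control is issued.

def control_for(tool_center, tool_radius, margin, target_voxels):
    grown = tool_radius + margin            # enlarge the cutting tool image
    for v in target_voxels:
        d2 = sum((a - b) ** 2 for a, b in zip(tool_center, v))
        if d2 <= grown ** 2:                # intersection with a target voxel
            return "retract"
    return "run"

# Usage: one target voxel at x = 5; the tool is safe far away, not up close.
targets = [(5.0, 0.0, 0.0)]
print(control_for((0.0, 0.0, 0.0), 1.0, 0.5, targets))   # prints "run"
print(control_for((4.0, 0.0, 0.0), 1.0, 0.5, targets))   # prints "retract"
```

Growing the radius by tracked velocity instead of a fixed amount would correspond to the tracking-data-based enlargement mentioned above.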
In one embodiment, the workpiece image can be understood to be associated with voxels that can be further associated with a three-dimensional grid of voxels, where an image associated with the workpiece can be incorporated into the grid, and grid voxels can be identified as being associated with the workpiece. Some of the workpiece voxels can thus further be associated with the target shape.

The methods and systems include providing a control to the cutting tool by performing at least one of collision detection and intersection detection. Such control can include performing at least one of collision detection and intersection detection between at least part of the cutting tool and the target shape of the workpiece image.

In the disclosed methods and systems, identifying the target shape includes classifying voxels associated with the workpiece image as at least one of workpiece and target shape. Accordingly, providing control to the cutting tool can include performing at least one of collision detection and intersection detection between at least part of the cutting tool and the target shape voxels. Providing control can also include providing a control based on a threshold distance between the workpiece image and the cutting tool image.
Also disclosed is a system that includes a cutting tool, a workpiece that includes a target shape, a tracker to provide tracking data associated with the cutting tool and the workpiece, and a controller to control the cutting tool based on the tracking data associated with the cutting tool and the tracking data associated with the workpiece. The cutting tool can include one or more cutting elements that can include one or more blade(s), one or more rotatable blade(s), one or more retractable blade(s), one or more water jet(s), one or more particulate jet(s), one or more lithotriptor(s) and/or one or more ultrasonic lithotriptor(s). The controller can control the cutting tool by providing a control to at least partially retract the cutting element(s), and/or at least partially reduce a rotation rate and/or change a cutting rate of the cutting element(s). The controller can transmit a control signal to the cutting tool, where the control signal includes an analog signal, a digital signal, or no signal.

The systems can include a tracker that includes or otherwise provides tracking data based on at least three positions and at least three angles. The tracker can include one or more first markers associated with the workpiece, and one or more second markers associated with the cutting tool. The markers or some of the markers can be one or more infrared sources, Radio Frequency (RF) sources, ultrasound sources, and/or transmitters. The tracker can thus be an infrared tracking system, an optical tracking system, an ultrasound tracking system, an inertial tracking system, a wired system, and/or a RF tracking system.
The systems also include one or more images associated with the workpiece and at least one image associated with the cutting tool. The workpiece image(s) can be registered to the workpiece, and the cutting tool image(s) can be registered to the cutting tool. Accordingly, the systems include a means to register the workpiece to the image(s) associated with the workpiece, and a means to register the cutting tool to the image(s) associated with the cutting tool. The registration means can include a probe that can be calibrated prior to registration. Registration can be performed by contacting locations on the workpiece and/or cutting tool with the calibrated probe.

The systems thus also include means to provide at least one image associated with the workpiece, and means to provide at least one image associated with the cutting tool. Such means can include Computer Aided Design (CAD), CT scan, MRI data, X-ray, fluoroscopy, and/or ultrasound, although other means can be used. The systems can update the images with tracking data using means to transform the tracking data between different coordinate systems. Such transformations can be mathematically effectuated.

The systems and methods can be applied to a workpiece that includes bone, cartilage, tendon, ligament, muscle, connective tissue, fat, neuron, hair, skin, a tumor, and an organ. The cutting tool can include an endoscopic instrument.
`ment.
The controller can also include a collision detection module and/or an intersection detection module that can determine a relationship between the cutting tool and at least part of the workpiece.

Disclosed is a system that includes a workpiece having a target shape included therein, a tracker to track at least one of a cutting tool and the workpiece, and, a control system, the control system including instructions to cause a processor to track the cutting tool and the workpiece, to determine a relationship between the cutting tool and at least one of the workpiece and the target shape, and to provide a control to the cutting tool based on at least one of the relationship of the cutting tool and the workpiece, and the relationship of the cutting tool and the target shape. The control system can also include an image associated with the workpiece and an image associated with the cutting tool. The image associated with the workpiece can include an image associated with the target shape, and/or at least part of the workpiece image can be designated and/or otherwise classified as being associated with the target shape.

The system also includes an image registration means, where the image registration means registers the workpiece to an image associated with the workpiece, and the image registration means registers the cutting tool to an image associated with the cutting tool, and wherein the control system includes instructions to update at least positions of the workpiece image and the cutting tool image based on data from the tracker; and, where at least one of the relationship of the cutting tool and the workpiece, and the relationship of the cutting tool and the target shape, are based on the updated image positions.

In the disclosed systems, the relationship between the cutting tool and the workpiece can be based on position data and/or angle data associated with the cutting tool(s) and/or the workpiece, where the position data and angle data can be based on the tracker. The relationship between the cutting tool and the target shape can thus be based on position data
`
`Page 20
`
`
`
`
US 6,757,582 B2

and/or angle data associated with the cutting tool and/or the target shape, where the position data and angle data are based on the tracker. The instructions to determine a relationship between the cutting tool and the target shape and/or workpiece can also include instructions to represent the workpiece as a group of volume pixels (voxels), classify voxels corresponding to the target shape, represent the cutting tool as a group of voxels, a surface model, and/or using constructive solid geometry or other geometric modeling, and, based on the tracker data, classify and/or update the voxels. The instructions to classify voxels corresponding to the target shape can include classifying voxels as target shape and classifying voxels as waste, and/or instructions to color-code voxels corresponding to the target shape. In an embodiment, the workpiece can be represented as a surface model.

The disclosed methods and systems can include a control for a shaping tool that can be referred to herein as a cutting tool, and in one embodiment, is a freehand shape cutter, but can be understood to be a tool that can cut, shave, and/or grind. References herein to a shaping tool or cutting tool can accordingly be understood to represent a tool that can cut, shave, and/or grind.

The disclosed methods and systems include a freehand shape cutter that includes a handheld cutting tool having a cutting element and a first marker. A second marker can be affixable to a workpiece that includes a target shape. A tracker can track a position of the cutting tool based on a position of the first marker, and also track a position of the workpiece based on a position of the second marker. A controller can control the cutting element based on the position of the cutting tool and the position of the workpiece to prevent the cutting element from invading the target shape.

In one exemplary embodiment, the methods and systems include a method of shaping a bone by determining a target shape of the bone, aligning the target shape with the bone, providing a handheld cutting tool having a cutting element, tracking the bone and the cutting tool, cutting the bone with the cutting tool, and controlling the cutting element to prevent invasion of the cutting tool on the target shape. In such an embodiment, the target shape of the bone can be determined by creating a bone model based on geometrical data of the bone, and establishing the target shape based on the bone model.

In one embodiment, the cutting tool can have six degrees of freedom, and the tracker can track with six degrees of freedom.

The cutting element can include at least one of a blade, a rotatable blade, a retractable blade, a water jet, a particulate jet, a lithotriptor, and an ultrasonic lithotriptor. The controller can control the cutting element by at least one of stopping the cutting element, retracting the cutting element, progressively retracting the cutting element, switching off the cutting element, and interrupting power to the cutting element. The tracked and/or determined positions can be three-dimensional positions that can be tracked substantially simultaneously. Additionally and optionally, the positions can be tracked continuously.

The target shape may be represented in the controller as a virtual template. The workpiece can include, for example, at least one of bone, cartilage, tendon, ligament, muscle, connective tissue, fat, neuron, hair, skin, tumor, and an organ that can include skin, brain, meninges, palate, tongue, esophagus, stomach, duodenum, jejunum, ileum, colon, liver, kidney, spleen, pancreas, ganglion, heart, artery, vein, arteriole, venule, capillary, lung, trachea, bronchus, bronchiole, alveolus, blood, extremity, and a reproductive organ. A tumor can include a neoplasm, a benign tumor, a hyperplasia, a hypertrophy, a dysplasia, an anaplasia, a metaplasia, a metastasis, and a malignant tumor.
BRIEF DESCRIPTION OF THE FIGURES

FIG. 1 is a block diagram of one embodiment;

FIGS. 2A and 2B illustrate embodiments of the disclosed methods and systems;

FIGS. 3A-3B provide schematic diagrams of embodiments of voxellation;

FIGS. 4A-4C provide schematic diagrams of embodiments of voxellation;

FIG. 5 is an architectural block diagram of one embodiment;

FIGS. 6 and 7 depict embodiments of cutting systems;

FIG. 8 depicts an embodiment of a handheld cutting tool;

FIGS. 9A and 9B depict an embodiment of a cutting tool with a retractable head;

FIGS. 10A and 10B illustrate an exemplary cutting head;

FIGS. 11A and 11B also illustrate an exemplary cutting head;

FIG. 12 provides a cutting tool for endoscopic use; and

FIG. 13 shows a cross-section of the FIG. 12 tool.
DETAILED DESCRIPTION

To provide an overall understanding, certain illustrative embodiments will now be described; however, it will be understood by one of ordinary skill in the art that the systems and methods described herein can be adapted and modified to provide systems and methods for other suitable applications and that other additions and modifications can be made without departing from the scope of the systems and methods described herein.

Unless otherwise specified, the illustrated embodiments can be understood as providing exemplary features of varying detail of certain embodiments, and therefore, unless otherwise specified, features, components, modules, and/or aspects of the illustrations can be otherwise combined, separated, interchanged, and/or rearranged without departing from the disclosed systems or methods.

The disclosed systems and methods include methods and systems for controlling a cutting tool. In one embodiment, the cutting tool can be controlled relative to a workpiece and/or an object (target) that can be derived from the workpiece. Although the illustrated embodiments and other examples provided herein relate to surgical applications where the workpiece can be, for example, a bone, those of ordinary skill in the art will recognize that the disclosed methods and systems relate to a system and method where a target shape can be developed from other workpieces, where the workpiece can include a material including, for example, wood, plastic, living tissue, ceramic, plaster, or other non-living materials. For surgical applications, workpieces can include, for example, living tissue including bone, cadaveric grafts, or engineered tissue grafts.

The disclosed methods and systems thus include and/or can be associated with a workpiece from which a target shape can be formed using a cutting tool to cut, grind away, or otherwise eliminate pieces or portions of the workpiece. When appropriate cuts are made and pieces or portions appropriately eliminated, the remaining workpiece can be
`40
`
`Page 21
`
`Mako Surgical Corp. Ex. 1005
`
`
`
`US 6,757,582 B2
`
`25
`
`7
`the desired target shape.
`to
`substantially similar
`Accordingly, the workpicce can be understood to include a
`target shape and waste, wherein the cutting tool can be used
`to eliminate the waste from the workpiece to leave the target
`shape. The disclosed methods and systems can thus include5
`generating one or more computer models and/or a workpiece
`image that includes the target shape and a cutting tool image.
`registering the workpiece and the cutting tool to the respec-
`tive computer models and/or images, and facilitating target
`shape formation by tracking
`the cutting tool (and/or the 1
`cutting element) and the workpice relative to the computer
`the cutting tool (and/or
`models and/or images to enable
`cutting element) to cut and/or eliminate those portions of the
`workpiece that are not part of the target shape. A 2D
`representation of the 3D images can he provided on a 15
`display.
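The 2D display of the 3D voxel data mentioned above can be illustrated by rendering one z-slice of a classification grid as text; the grid layout and the symbols are illustrative assumptions, not the patent's display.

```python
# Sketch of a 2D representation of 3D voxel data: one z-slice of a voxel
# classification grid is rendered as text, one character per voxel.

def render_slice(labels, nx, ny, z, symbols={"target": "T", "waste": "."}):
    """Return a list of strings, one per y-row of the chosen z-slice."""
    return ["".join(symbols[labels[(x, y, z)]] for x in range(nx))
            for y in range(ny)]

# Usage: a 3x3x3 grid whose center voxel is the target shape.
labels = {(x, y, z): ("target" if x == y == z == 1 else "waste")
          for x in range(3) for y in range(3) for z in range(3)}
for row in render_slice(labels, 3, 3, 1):
    print(row)
# the middle slice shows the single target voxel:
# ...
# .T.
# ...
```

A graphical display would map the same classifications to colored voxels rather than characters.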
A block diagram describing the features of a method and system as disclosed herein can be as provided in FIG. 1. As previously provided, the features of FIG. 1 are not provided in particular order, include varying levels of detail for certain embodiments, and such features are presented for illustrative purposes. Accordingly, the illustrated features of FIG. 1 can be rearranged in terms of order, and as described herein, some features can be further detailed and/or eliminated in some embodiments.

As FIG. 1 indicates, a workpiece image can be provided or otherwise generated 100 for input to a processor-controlled device that can include, and can be referred to herein as, a computer. In some embodiments, the workpiece image can be a three-dimensional (3-D) image and/or can be translated to a 3-D image for presentation. Some embodiments allow the workpiece image to be manipulated on the display screen by a computer user using keyboard, joystick, mouse, audio, and/or other commands. For example, in medical applications, the 3-D image can be provided by a Computer Aided Design (CAD), Computed Tomography (CT) Scan, Magnetic Resonance Imaging (MRI), and/or combinations of such data and/or other data that can include X-Ray, digital image, or other data and/or data formats. In an embodiment, a 3-D