UNITED STATES PATENT AND TRADEMARK OFFICE
______________________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
______________________

Mako Surgical Corp.
Petitioner

v.

Blue Belt Technologies, Inc.
Patent Owner

Patent No. 6,757,582
Issue Date: June 29, 2004
Title: METHODS AND SYSTEMS TO CONTROL A SHAPING TOOL
______________________

Case IPR: Unassigned
____________________________________________________________

DECLARATION OF ROBERT D. HOWE
I. INTRODUCTION

1. I have been retained by Morrison & Foerster LLP in this case as an expert in the relevant art.

2. I have been asked to provide my opinions and views on the materials I have reviewed in this case related to U.S. Patent No. 6,757,582 (“the ’582 patent” (Ex. 1001)), and the scientific and technical knowledge regarding the same subject matter as the ’582 patent before and at the earliest effective filing date of May 3, 2002. The ’582 patent issued from U.S. Application No. 10/427,093 (the ’093 application), which was filed on April 30, 2003 and claims priority to Provisional Application No. 60/377,695, filed on May 3, 2002.

3. My opinions and underlying reasoning for the opinions are set forth below.
II. PROFESSIONAL BACKGROUND

4. I am currently the Abbott and James Lawrence Professor of Engineering at Harvard University. I also serve as Area Dean (equivalent to Department Chair) of Bioengineering. I am the Director of the BioRobotics Laboratory at Harvard University, which is home to over a dozen doctoral students, postdoctoral fellows, and visiting scholars. Our research focuses on robotics, particularly robotic manipulation and robot-assisted surgery. Among other projects, we have developed image-guided and minimally invasive surgical
robot systems. Our work has been funded by government grants, private foundations, and commercial partners.

5. I earned a bachelor’s degree in physics from Reed College in 1979 and Master of Science and Doctor of Philosophy degrees in Mechanical Engineering from Stanford University in 1987 and 1990, respectively.

6. My work has resulted in over four issued patents, six patent applications, and approximately 200 peer-reviewed publications.

7. A copy of my curriculum vitae that summarizes my education, work history, and publications is in Appendix A.

8. I am being compensated at the rate of $395/hour for taking part in this case but have no other relationship to Mako Surgical Corp. My compensation is not dependent on the outcome of this case.

III. BASIS FOR OPINION

9. My opinions and views set forth in this report are based on my education, training, and experience in the relevant field, as well as the materials I reviewed in this case, and the scientific knowledge regarding the same subject matter that existed prior to the earliest effective filing date of the ’582 patent.
IV. PATENT LAW STANDARD

10. It is my understanding that a patent claim is invalid for anticipation if it can be shown that each and every limitation of the claim is disclosed either expressly or inherently in a single prior art reference.

11. It is my understanding that a patent claim is invalid for obviousness if the claimed invention as a whole would have been obvious to one of ordinary skill in the art at the time the invention was made, in view of a single prior art reference or a combination of prior art references. Specifically, I understand that a determination of whether a claimed invention would have been obvious requires taking into consideration factors which include: (a) the scope and content of the prior art; (b) the differences between the claimed invention and the prior art; and (c) the level of ordinary skill in the art.

12. It is my understanding that when combining two or more references, or when modifying an item disclosed in one reference, so as to arrive at a claimed invention, one should consider whether there is a reason for the proposed combination or modification. For example, when a technology or product is available in one field of endeavor, design incentives and other market forces can prompt variations of it, either in the same field or a different one. For the same reason, if a technique has been used to improve one device and a person of ordinary skill in the art would recognize that it would improve similar devices in
the same way, using the technique is obvious unless its actual application is beyond his or her skill.

13. It is my understanding that the claims of a patent are analyzed from the perspective of “a person of ordinary skill in the art” and that the claims of the ’582 patent are interpreted as a person of ordinary skill in the art would have understood them at the time the ’093 application, which issued as the ’582 patent, was filed. It is further my understanding that a claim is given the “broadest reasonable construction in light of the specification” in inter partes review. See 37 C.F.R. § 42.100(b).

14. It is my understanding that “prior art” includes patents, publications in the relevant literature, and other information that predate the effective priority date of the ’582 patent. It is also my understanding that priority is determined on a claim-by-claim basis.

15. It is my understanding that a patent application can disclose prior technologies as prior art in its specification, and that such admissions can be used as “prior art” against its claims.

V. A PERSON OF ORDINARY SKILL IN THE ART

16. A person of ordinary skill in the art relevant to the ’582 patent would have had at least a bachelor’s degree in mechanical, electrical, or biomedical
engineering or computer science and at least five years of experience developing or researching image-guided medical devices and procedures or surgical robotics.

VI. OVERVIEW OF THE APPLICABLE TECHNOLOGIES

17. The ’582 patent generally relates to systems and methods of controlling cutting tools to obtain a target shape, particularly as applicable to surgical devices and implantation procedures. (Ex. 1001 at 1:17-27.) It describes beginning with an image of a workpiece (id. at 7:26-48); identifying a target shape within the workpiece (id. at 7:49-59); registering images of a cutting tool and the workpiece to the physical objects (id. at 9:5-20); tracking the cutting tool and workpiece (id. at 9:36-45); and controlling the cutting tool based on the tracking information. (Id.)

18. The use of imaging, tracking, and a controller—such as a computer system—to increase the precision and safety of surgery is not new. As acknowledged in the Background portion of the ’582 patent, a wide variety of applications call for objects to be cut from general workpieces with high precision. (Ex. 1001 at 1:17-23.) Surgery has long been a field with an especially strong need for such precision. (See id. at 1:22-27.) However, the suggestions in the ’582 patent that such precision was previously met in the art through the use of “[b]one fixation” with screws and clamps (id. at 1:28-32) or complete reliance on robotic systems (id. at 1:39-41) are inaccurate. The use of a control system that forgoes
physical clamps and screws in favor of surgeon-assisting controls as described in the ’582 specification was known well before 2002.

19. For example, in the mid-1990s, several surgeons and engineers researched, wrote about, and used “interactive” systems consisting of a robotic arm that would be moved by a surgeon but with simultaneous control mechanisms based on imaging and tracking to increase precision. One such system is described in Russell H. Taylor et al., An Image-Directed Robotic System for Precise Orthopaedic Surgery, IEEE Transactions on Robotics and Automation, Vol. 10, No. 3, June 1994 (“Taylor”) (Ex. 1008). I was aware of the Taylor article around the time of its publication, well before my involvement in this case. Persons of ordinary skill in the art would have been well aware of the type of system Taylor described before 2002.

20. Other examples of similar systems discussed in the mid-1990s include ACROBOT, PADyC, and HipNav. ACROBOT (short for Active Constraint Robot) consisted of a robotic system that utilized a virtual constraint surface defined by a preoperative plan. (See Ex. 1013 at 734.) Motors would actuate to gradually increase resistance until preventing further motion at the edge of the permitted region. (Id.) PADyC (short for Passive Arm with Dynamic Constraints) consisted of a “two degrees of freedom” system to constrain a surgeon’s movement. (Id. at 733.) The operator moved a surgical tool, for example a rotary
cutter. (Id.) As the joint being moved approached a defined constraint surface, the system would progressively narrow the permitted angular velocities until, ultimately, the only velocities available would be those moving the device away from or parallel to the constraint surfaces. (Id.; see also id. at 734 (figures illustrating PADyC).) HipNav described a system to determine optimal implant placement during hip replacement surgery using a range of motion simulator and intra-operative navigational tracking and guidance. (Ex. 1014 at 1.) A surgeon would specify a component position, after which the range of motion simulator would estimate femoral range of motion based on parameters provided by a pre-operative planner, and the feedback from the simulator would allow the surgeon to determine patient-specific optimal implant placement. (Id. at 2.)
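For illustration only, the following sketch captures the general “active constraint” behavior described above for ACROBOT and PADyC: a commanded tool velocity directed toward a planned boundary is progressively attenuated as the tool approaches the boundary and is eliminated at it. The sketch is not taken from either system; the planar boundary, the linear attenuation rule, and the slow_zone parameter are assumptions made solely for this example.

```python
import numpy as np

def constrain_velocity(tool_pos, desired_vel, boundary_point, boundary_normal,
                       slow_zone=5.0):
    """Illustrative active constraint: attenuate the component of the commanded
    velocity that is directed toward a planar boundary as the tool approaches it.

    tool_pos, desired_vel : 3-vectors (mm, mm/s) from the tracker/operator
    boundary_point, boundary_normal : define the constraint plane; the unit normal
        points into the permitted region
    slow_zone : distance (mm) over which resistance ramps up (assumed value)
    """
    # Signed distance from the tool to the plane (positive = inside the permitted region).
    dist = np.dot(tool_pos - boundary_point, boundary_normal)

    # Split the commanded velocity into components normal and tangent to the plane.
    normal_speed = np.dot(desired_vel, boundary_normal)
    v_normal = normal_speed * boundary_normal
    v_tangent = desired_vel - v_normal

    if normal_speed >= 0.0:
        return desired_vel              # moving away from the boundary: no restriction
    if dist <= 0.0:
        return v_tangent                # at or past the boundary: only parallel/inward motion
    scale = min(dist / slow_zone, 1.0)  # resistance ramps up inside the slow zone
    return v_tangent + scale * v_normal

# Example: tool is 2 mm from the boundary and pushed straight toward it at 10 mm/s;
# the component toward the boundary is reduced to 40% of the commanded speed.
print(constrain_velocity(np.array([0.0, 0.0, 2.0]), np.array([0.0, 0.0, -10.0]),
                         np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0])))
```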
VII. THE ’582 PATENT

21. The ’582 patent includes four independent claims. I have been asked to evaluate three independent claims: claims 1, 17, and 24. These claims recite:

    1. A system comprising:

    a cutting tool;

    a workpiece that includes a target shape;

    a tracker to provide tracking data associated with the cutting tool and the workpiece,

    where the tracker includes at least one of: at least one first marker associated with the workpiece, and at least one second marker associated with the cutting tool; and
    a controller to control the cutting tool based on the tracking data associated with the cutting tool and the tracking data associated with the workpiece.

    17. A system, comprising:

    a workpiece having a target shape included therein,

    a tracker to track at least one of: a cutting tool and the workpiece, and,

    a control system, the control system including instructions to cause a processor to track the cutting tool and the workpiece,

    to associate the tracked data to an image associated with the cutting tool and an image associated with the workpiece, where the workpiece includes an image associated with the target shape,

    to determine a relationship between the cutting tool and at least one of the workpiece and the target shape, and to provide a control to the cutting tool based on at least one of the relationship of the cutting tool and the workpiece, and the relationship of the cutting tool and the target shape.

    24. A method, the method comprising:

    providing a workpiece that includes a target shape,

    providing a cutting tool,

    providing a 4-D image associated with the workpiece,

    identifying the target shape within the workpiece image,

    providing a 4-D image associated with the cutting tool,
    registering the workpiece with the workpiece image,

    registering the cutting tool with the tuning tool image,

    tracking at least one of the workpiece and the cutting tool,

    transforming the tracking data based on image coordinates to determine a relationship between the workpiece and the cutting tool, and,

    based on the relationship, providing a control to the cutting tool.
22. Dependent claim 3 adds a list of potential cutting tools. Dependent claim 5 adds that the control signal must be analog, digital, or no signal. Dependent claims 6, 19, 20, and 38 add that the tracking data must be based on at least three positions and at least three angles, and that relationships are determined based on position and angle data. Dependent claim 7 adds a list of potential markers. Dependent claims 8 and 11 add images associated with the workpiece and cutting tool and means to provide those images. Dependent claims 9, 10, and 18 add registration of and means to register those images to the workpiece and cutting tool. Dependent claim 12 adds means to transform tracking data to the workpiece image or cutting tool image. Dependent claim 13 adds a list of potential workpieces. Dependent claim 14 adds a list of potential tracking systems. Dependent claims 16, 39, 54, 55, and 57 add collision detection or intersection detection. Dependent claims 21, 22, 23, 25, 26, 27, 28, 29, 30, 40, 41, 42, and 56
add volume pixels (voxels), as well as classification, re-classification, color-coding, and similar actions on voxels based on tracker data. Dependent claim 34 adds a list of potential imaging types. Dependent claims 35 and 36 add calibrating a probe and using the calibrated probe to identify locations. Dependent claim 37 adds providing a marker on the workpiece or cutting tool for tracking. Dependent claim 47 adds determining a distance between the cutting tool image and target shape. Dependent claims 48 and 49 add providing a control by increasing the size of the cutting tool image to determine whether the larger image intersects with the target shape. Dependent claim 50 adds providing a control based on the relationship between the cutting element and voxels. Dependent claim 51 adds the use of a three-dimensional grid of voxels, incorporating the workpiece into the grid, and identifying the voxels that are associated with the workpiece. Dependent claim 52 adds associating grid voxels with the workpiece or target shape. Dependent claim 53 adds a list of potential control types. Dependent claim 58 adds providing a control based on threshold distance between workpiece image and cutting tool image.
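For illustration only, the following sketch shows one way the voxel-based concepts recited in these dependent claims could be expressed: a three-dimensional grid of labeled voxels, with intersection detection performed against a cutting-tool model that has been enlarged by a safety margin. It is not the ’582 patent’s algorithm; the grid size, voxel pitch, labels, and margin value are assumptions made solely for this example.

```python
import numpy as np

PITCH = 1.0                                      # mm per voxel (assumed)
labels = np.zeros((64, 64, 64), dtype=np.uint8)  # 0 = empty, 1 = workpiece, 2 = target shape
labels[10:50, 10:50, 10:50] = 1                  # block of "workpiece" (e.g., bone) voxels
labels[20:40, 20:40, 20:40] = 2                  # region designated as the target shape

def tool_intersects(labels, tool_center, tool_radius, margin=1.0, forbidden=2):
    """Return True if a sphere of radius tool_radius + margin around the tracked
    tool tip overlaps any voxel carrying the forbidden label."""
    r = tool_radius + margin                     # "enlarging" the tool image adds a safety margin
    lo = np.maximum(np.floor((tool_center - r) / PITCH).astype(int), 0)
    hi = np.minimum(np.ceil((tool_center + r) / PITCH).astype(int) + 1,
                    np.array(labels.shape))
    # Check only voxels inside the enlarged tool's bounding box.
    for offset in np.ndindex(*(hi - lo)):
        voxel = lo + np.array(offset)
        center = (voxel + 0.5) * PITCH           # voxel center in mm
        if labels[tuple(voxel)] == forbidden and np.linalg.norm(center - tool_center) <= r:
            return True
    return False

print(tool_intersects(labels, np.array([25.0, 25.0, 25.0]), tool_radius=2.0))  # True
print(tool_intersects(labels, np.array([55.0, 55.0, 55.0]), tool_radius=2.0))  # False
```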
23. Although the claims are broad enough to cover a variety of applications, the ’582 specification describes surgical devices and implantation procedures. (Ex. 1001 at 1:17-27.) For example, the ’582 specification describes high precision surgeries such as total knee replacement, in which the bones to
which a prosthetic is to be attached “can be shaped to facilitate stable and effective implantation of the prosthesis.” (Id. at 1:22-27.)

24. The ’582 specification distinguishes its purported invention from the prior art by emphasizing comfort and safety considerations specific to surgical applications. The specification explains, for example, that some prior art cutting systems “fixat[ed] the workpiece, such as bone” through clamps and screws. (Ex. 1001 at 1:28-32.) A drawback to that method of fixation was “pain, infection, and increased recovery and rehabilitation periods caused by the invasive nature of fixation.” (Id. at 1:34-37.) The ’582 specification omits, however, that many prior art systems did not use limb fixation—for example, the HipNav system discussed above or the Burghart system described below in more detail.

25. The specification also notes that “[r]obotic and other surgical systems may be susceptible to failure that can occur suddenly and can cause damage or injury to the subject. Controllable or detectible failure may be detected before harm is done.” (Ex. 1001 at 1:38-41.) The specification expresses a preference for a robotic system that is at least partially controlled by a human operator, explaining that fully robotic systems may have “undetectable failures” where “the operator may not be in full or even partial physical control of the robot.” (Id. at 1:48-49.)

26. The ’582 applicants propose to address the above problems by providing a control that would forgo physical clamps and screws—though as noted
above, the concept of a control without physical clamps and screws was already known. The ’582 applicants describe a controller—for example a computer system—that can send instructions to retract, slow down, or stop a cutting tool if the tool is moved in an incorrect way or too far from a target area. The specification describes beginning with an image of a workpiece—for example, a leg or bone—from common imaging methods such as CT, X-ray, MRI, fluoroscopy, or ultrasound. (Ex. 1001 at 2:9-11.) A target shape is identified within the workpiece, either as a separate 3D image or simply as a portion of the workpiece that is designated as the target shape. (Id. at 7:49-59.) Markers are placed on the cutting tool and workpiece, and images of the tool and workpiece can then be registered to the physical objects by using a calibrated probe to measure discrete positions on the cutting tool and workpiece to confirm anatomical correlation. (Id. at 9:5-20.) During the surgery or other cutting process, the physical cutting tool and workpiece are tracked, with their coordinates transformed to image coordinates as the process continues. (Id. at 9:36-45.) Finally, a control signal can be sent to stop, retract, continue, or reduce speed based on the position of the cutting tool image relative to the workpiece and target shape images. (Id.)
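For illustration only, the following sketch traces the control flow just described: a tracked tool-tip position is mapped from tracker coordinates into image coordinates through a registration transform, and a control signal (stop, slow, or continue) is chosen from the tool’s position relative to the target shape. It is not code from the ’582 patent; the 4x4 homogeneous transform, the spherical stand-in for the target-shape boundary, the distance thresholds, and the signal names are assumptions made solely for this example.

```python
import numpy as np

def tool_tip_in_image(T_image_from_tracker, tip_in_tracker):
    """Map a tracked tool-tip position (tracker frame) into image coordinates
    using a 4x4 homogeneous registration transform."""
    tip_h = np.append(tip_in_tracker, 1.0)
    return (T_image_from_tracker @ tip_h)[:3]

def control_signal(tip_image, target_center, target_radius,
                   slow_margin=3.0, stop_margin=0.5):
    """Pick a control action from the tool's position relative to a spherical
    stand-in for the target-shape boundary (all distances in mm)."""
    overshoot = np.linalg.norm(tip_image - target_center) - target_radius
    if overshoot > slow_margin:
        return "stop"      # well outside the permitted region: halt or retract the tool
    if overshoot > stop_margin:
        return "slow"      # near the boundary: reduce cutting speed
    return "continue"      # within the target shape: keep cutting

# Registration transform obtained beforehand (e.g., from probed points); identity here.
T = np.eye(4)
tip = tool_tip_in_image(T, np.array([10.0, 0.0, 0.0]))
print(control_signal(tip, target_center=np.zeros(3), target_radius=8.0))  # "slow"
```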
VIII. THE PRIOR ART

27. Following are brief summaries of the prior art references applied against the claims of the ’582 patent in this declaration.
Russell H. Taylor et al., An Image-Directed Robotic System for Precise Orthopaedic Surgery, IEEE Transactions on Robotics and Automation, Vol. 10, No. 3, June 1994 (“Taylor”) (Ex. 1008).

28. Taylor is an article published in June 1994. Taylor describes an “image-directed robotic system to augment the performance of human surgeons” and explains that “[o]rthopaedic applications represent a particularly promising domain for the integration of image and model-based presurgical planning, CAD/CAM technology, and precise robotic execution.” (Ex. 1008 at 261-62.) The Taylor system uses 3-D imaging, registration, and tracking of the cutting tool and a workpiece, for example a leg or bone, with control based on that data.

29. Taylor discloses a system with (1) a cutting tool (Ex. 1008 at 263 (“ball probe ‘cutter bit’ is inserted into the collet of the cutting tool”)); (2) a workpiece that includes a target shape (id. at 267 (discussing “3D CAD model of the desired prosthesis shape” for the patient, who is the workpiece)); (3) trackers to track the cutting tool and the workpiece (id. at 265 (“specialized IO hardware . . . to track the position and orientation of the robot end effector during the cutting phase of the surgery”); 270 (disclosing that the system verifies “the bone does not move relative to the fixator,” demonstrating that it is tracking the workpiece)); (4) markers associated with the workpiece and markers associated with the cutting tool (id. at 262-63 (discussing pins implanted into
patient, used as markers); 270 (discussing beacons attached to robot, which include the cutting tool)); (5) a control system with instructions to track the cutting tool and workpiece and ability to control the cutting tool (id. at 264-65 (discussing monitoring cutting tool position so tool can be stopped if it strays out of desired volume, including with “‘freeze motion’ signal”); 270 (“Independent Motion Monitoring Checks”)); (6) ability to associate tracked data with images associated with the cutting tool and workpiece (id. at 268 (discussing computer and formula to transform tracking data from kinematic model to determine tool position); 269-70 (describing checking subsystem that verifies cutter stays within defined “safe” volume)); (7) an image associated with the target shape (id. at 267 (3D CAD model of desired prosthesis shape)); and (8) ability to determine a relationship between the cutting tool and the workpiece or the target shape (id. at 270 (checking subsystem verifies the cutting tool “stays within a defined ‘safe’ volume relative to the bone [workpiece], essentially corresponding to the implant shape [target shape]”)).
Catherina Burghart et al., Robot Controlled Osteotomy in Craniofacial Surgery, 1st International Workshop on Haptic Devices in Medical Applications Proceedings, Paris – France, pp. 12-22, June 23, 1999 (“Burghart”) (Ex. 1012).

30. Burghart is an article published in association with a workshop on June 23, 1999.

31. Burghart discusses a system and methods for improving the precision and safety of bone surgery through the use of imaging, tracking, and control in a partially robotic system that is controlled by the surgeon. Burghart specifically describes a system for craniofacial surgeries with six degrees of freedom and a sensor that constrains the motions of the surgeon controlling a surgical saw attached to a robot’s flange. (Ex. 1012 at 12.) In Burghart, the positions of the tool and the patient workpiece are detected by an IR system for registration. (Id. at 13.) Titanium miniscrews are preoperatively implanted into the workpiece skull to generate a patient-specific coordinate frame, and the robot tool is detected through a cylinder fitted with infrared diodes. (Id. at 14.) Two CCD cameras track the positions of the workpiece and tool, sending images to a navigation workstation to serve as a visualizing device. (Id.)

32. Burghart discloses a system with (1) a cutting tool (Ex. 1012 at 13 (“surgical saw”)); (2) a workpiece that includes a target shape (id. at Figs. 2-3
(depicting skull workpiece and model of security zone constraining movement of saw that functions as target shape)); (3) trackers to track the cutting tool and the workpiece (id. at 13 (“[t]he position of both robot tool and patient can be detected by an integrated infrared navigation system for automatical [sic] registration”)); (4) markers associated with the workpiece and markers associated with the cutting tool (id. at 14 (titanium miniscrews preoperatively implanted into skull and later mounted with infrared diodes; robot tool detected through cylinder fitted with infrared diodes)); (5) a control system with instructions to track the cutting tool and workpiece and ability to control the cutting tool (id. at 15-16, Fig. 2 (“robot is controlled by evaluating the position of the tip of the robot’s tool within a defined safety zone”); Fig. 5 (illustrating tracking of relationship between tool and workpiece)); (6) ability to associate tracked data with images associated with the cutting tool and workpiece (id. at 14 (infrared diodes on workpiece used to generate “patient specific coordinate frame” allowing surgeon to choose various positions during surgery); 21 (dynamic bars show position of surgical tool with respect to trajectory)); (7) an image associated with the target shape (id. at Figs. 3-4 (cylinder and point images associated with workpiece and target cutting paths on workpiece)); and (8) ability to determine a relationship between the cutting tool and the workpiece or the target shape (id. at Fig. 5 (illustrating resistance to
movement of cutting tool that depends on relationship between tool and workpiece)).

U.S. Patent No. 6,205,411 (DiGioia) (Ex. 1010).

33. DiGioia issued on March 20, 2001. I understand that DiGioia was invented by several of the ’582 inventors affiliated with Carnegie Mellon University (“CMU”). DiGioia relates to an apparatus for facilitating the surgical implantation of an artificial joint component. (Ex. 1010 at 1:18-23.) DiGioia covers very similar subject matter to both Taylor and the ’582 patent, and a person of ordinary skill would therefore be motivated to look to DiGioia for solutions and combine DiGioia with Taylor.

34. DiGioia discloses a control algorithm performed by a computer system, utilizing tracking data associated with a bone model created from skeletal geometric data, which is the workpiece, and the optimal position of the implant, which is the target shape. (Ex. 1010 at 7:1-18, 7:46-53.) Figure 3 illustrates the system, including a computer system to display objects being tracked with a tracking device (which is depicted in Figure 3a); a controller connected to the computer system; and a camera able to detect LEDs that can be attached to bones and tools. (Id. at Fig. 3, 6:24-48.) Like both the ’582 patent and Taylor, DiGioia specifically proposes a commercially available Optotrak device as its tracker. (Id. at 6:43-46.)
35. DiGioia discloses a system with (1) a cutting tool (Ex. 1010 at 9:39-45 (“[S]urgical cuts can be made freely but with precise spatial constraints.”)); (2) a workpiece that includes a target shape (id. at Figs. 2-3 (depicting patient and describing creation of bone model, which is the workpiece)); (3) trackers to track the cutting tool and the workpiece (id. at Fig. 3, 6:35-48 (depicting and describing optical tracking camera used to track targets attached to bones, tools, and other objects in the operating room)); (4) markers associated with the workpiece and markers associated with the cutting tool (id. at Fig. 3 (illustrating markers attached to both patient and tool)); (5) a control system with instructions to track the cutting tool and workpiece and ability to control the cutting tool (id. at 9:20-45 (describing computer system and control algorithm that “define[s] the space within which surgical tools can be moved safely” through tracking and comparison in “near real time” of joint and implant positions)); (6) ability to associate tracked data with images associated with the cutting tool and workpiece (id. (describing tracking of surgical tool in order to control robotic arm to stay within defined space based on joint and implant positions, which necessarily requires associating tracked data with cutting tool and workpiece images)); (7) an image associated with the target shape (id. at 7:1-18, 46-53 (describing simulation to determine optimal position of implant, i.e. target shape, relative to anatomy)); and (8) ability to determine a relationship between the
cutting tool and the workpiece or the target shape (id. at 9:20-22, 39-45 (describing control based on relationship between cutting tool, workpiece (joint), and target shape (implant positions))).

Scott L. Delp et al., An Interactive Graphics-Based Model of the Lower Extremity to Study Orthopaedic Surgical Procedures, IEEE Transactions on Biomedical Engineering, Vol. 37, No. 8, Aug. 1990 (“Delp”) (Ex. 1011).

36. Delp is an article published in August 1990 addressing an interactive graphics-based model to study orthopaedic surgical procedures. It is focused on the use of graphical tools to enhance design and analysis of surgical procedures, and a person of ordinary skill would therefore have reason to combine its solutions with the teachings of the other references discussed herein. Delp discloses an element common to display systems: a scaling function that allows the user to “rotate, scale, and translate the model into any viewing perspective.” (Ex. 1011 at 761.)

U.S. Patent No. 5,408,409 (Glassman) (Ex. 1009).

37. Glassman issued on April 18, 1995. Glassman names as inventors Taylor and several authors of the Taylor article discussed above, and essentially discloses the system discussed in Taylor, or one highly similar to it. It discloses a cutting tool with a rotatable blade—specifically, a drill. In light of the commonality between inventors and authors and
the disclosure of essentially the same system, one of skill in the art would be motivated to combine Glassman and Taylor.

IX. CLAIM CONSTRUCTION

38. I have been asked to provide my opinion on the appropriate construction of a number of claim terms. I understand that a claim is given the “broadest reasonable construction in light of the specification” in inter partes review. I also understand that claim terms are given their ordinary and customary meaning, as would be understood by one of ordinary skill in the art in the context of the entire disclosure. I understand that an inventor may rebut that meaning by providing a definition of the term in the specification with reasonable clarity, deliberateness, and precision.

39. I also understand that when a claim uses the word “means” and there is no definite structure corresponding to the function of the claim limitation, then the claim is presumed to be “means-plus-function” language under 35 U.S.C. § 112, ¶ 6. I understand that the first step in construing a means-plus-function limitation is to identify the function explicitly recited in the claim, which includes construing any terms in the recited function if necessary. The next step is to identify the corresponding structure set forth in the written description that is clearly linked to and necessary to perform the particular function set forth in the claim because the means-plus-function term will cover only the corresponding
structure, material, or act in the specification and equivalents thereof. For corresponding structure involving computer algorithms, the specification must at least disclose some algorithms to perform the recited function (not just a discussion of the end result), and it is insufficient to rely solely on the knowledge of one of ordinary skill in the art to provide such an algorithm.

40. Claims 10 and 18 require “means to register” a workpiece to at least one image associated with the workpiece, or “image registration means.” The ’582 specification only explicitly recites two structures for registration: a calibration probe (see, e.g., Ex. 1001 at 2:39-42, 4:11-13, 9:28-35) or fiducial markers (id. at 9:35). While the specification notes a number of imaging methods (e.g., ultrasound) that may be used in registration, it does not explicitly recite corresponding structures for those methodologies. “Means to register” should therefore be construed to encompass “a calibration probe, fiducial markers, or equivalent structures.”
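For illustration only, the following sketch shows the kind of point-based rigid registration that a calibration probe or fiducial markers makes possible: given paired points probed in physical (tracker) coordinates and identified in image coordinates, solve for the rotation and translation that best align them (a standard least-squares solution computed via the singular value decomposition). The specification does not recite this particular algorithm; it is offered only as a hypothetical illustration, and the example points are invented.

```python
import numpy as np

def register_points(physical_pts, image_pts):
    """Return (R, t) minimizing ||R @ p + t - q|| over paired points p, q."""
    P = np.asarray(physical_pts, dtype=float)
    Q = np.asarray(image_pts, dtype=float)
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)      # center both point sets
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)                  # SVD of the cross-covariance
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = Q.mean(axis=0) - R @ P.mean(axis=0)
    return R, t

# Example: image landmarks equal the probed points rotated 90 degrees about z and shifted.
physical = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]], dtype=float)
true_R = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
image = physical @ true_R.T + np.array([5.0, -2.0, 1.0])

R, t = register_points(physical, image)
print(np.allclose(R, true_R), np.round(t, 3))            # True [ 5. -2.  1.]
```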
41. Claim 11 requires “means to provide at least one image.” The ’582 specification discloses that the means for providing an image can include CAD, CT, MRI, X-Ray, fluoroscopy, and ultrasound. (Ex. 1001 at 2:9-11, 4:16-21, 15:52-55.) “Means to provide at least one image” should therefore be construed to encompass “CAD, CT, MRI, X-Ray, fluoroscopy, ultrasound, and equivalent structures.”
42. Claim 12 requires “means to transform tracking data.” The ’582 specification discusses only the use of computers and processors to execute instructions of the system. (Ex. 1001 at 4:21-24, 19:35-65, 20:20-27.) The ’582 specification does not disclose any algorithm that achieves the claimed “transform” function, instead stating only generally that “transformations can be mathematically effectuated” and even emphasizing that the described methods and systems “are not limited to a particular hardware or software configuration, and may find applicability in many computing or processing environments.” (Id. at 19:42-4