VISUAL CONTROL OF ROBOTS:

High-Performance Visual Servoing
`
`Peter I. Corke
`CSIRO Division of Manufacturing Technology, Australia.
`
`ABB Inc. Exhibit 1004, Page 1 of 378
`ABB Inc. v. Roboticvisiontech, Inc.
` IPR2023-01426
`
`
`
`To my family, Phillipa, Lucy and Madeline.
`
`
`
`
`
`Editorial foreword
`
`It is no longer necessary to explain the word 'mechatronics'. The world has become
`accustomed to the blending of mechanics, electronics and computer control. That does
`not mean that mechatronics has lost its 'art'.
`The addition of vision sensing to assist in the solution of a variety of problems is
`still very much a 'cutting edge' topic of research. Peter Corke has written a very clear
`exposition which embraces both the theory and the practical problems encountered in
`adding vision sensing to a robot arm.
`There is great value in this book, both for advanced undergraduate reading and for
`the researcher or designer in industry who wishes to add vision-based control.
`We will one day come to expect vision sensing and control to be a regular feature
`of mechatronic devices from machine tools to domestic appliances. It is research such
`as this which will bring that day about.
`
`John Billingsley
`University of Southern Queensland,
Toowoomba, QLD 4350
`August 1996
`
`
`
`
`Author's Preface
`
`Outline
`
`This book is about the application of high-speed machine vision for closed-loop po-
`sition control, or visual servoing, of a robot manipulator. The book aims to provide
`a comprehensive coverage of all aspects of the visual servoing problem: robotics, vi-
`sion, control, technology and implementation issues. While much of the discussion
is quite general, the experimental work described is based on the use of a high-speed
binary vision system with a monocular 'eye-in-hand' camera.
`The particular focus is on accurate high-speed motion, where in this context 'high
`speed' is taken to mean approaching, or exceeding, the performance limits stated by
the robot manufacturer. In order to achieve such high performance I argue that it is
necessary to have accurate dynamical models of the system to be controlled (the robot)
and of the sensor (the camera and vision system). Despite the long history of research in
the constituent topics of robotics and computer vision, the system dynamics of closed-
loop visually guided robot systems have not been well addressed in the literature to
date.
`I am a confirmed experimentalist and therefore this book has a strong theme of
`experimentation. Experiments are used to build and verify models of the physical
`system components such as robots, cameras and vision systems. These models are
`then used for controller synthesis, and the controllers are verified experimentally and
`compared with results obtained by simulation.
`Finally, the book has a World Wide Web home page which serves as a virtual
`appendix. It contains links to the software and models discussed within the book as
`well as pointers to other useful sources of information. A video tape, showing many
`of the experiments, can be ordered via the home page.
`
`Background
My interest in the area of visual servoing dates back to 1984 when I was involved in
two research projects: video-rate feature extraction1 and sensor-based robot control.
At that time it became apparent that machine vision could be used for closed-loop
control of robot position, since the video-field rate of 50 Hz exceeded the position
setpoint rate of the Puma robot, which is only 36 Hz. Around the same period Weiss
and Sanderson published a number of papers on this topic [224–226, 273], in particular
concentrating on control strategies and the direct use of image features, but only
in simulation. I was interested in actually building a system based on the feature
extractor and robot controller, but for a number of reasons this was not possible at that
time.
`1This work resulted in a commercial unit — the APA-512 [261], and its successor the APA-512+ [25].
`Both devices are manufactured by Atlantek Microsystems Ltd. of Adelaide, Australia.
`
`
`
`
`In the period 1988–89 I was fortunate in being able to spend 11 months at the
`GRASP Laboratory, University of Pennsylvania on a CSIRO Overseas Fellowship.
`There I was able to demonstrate a 60 Hz visual feedback system [65]. Whilst the
`sample rate was high, the actual closed-loop bandwidth was quite low. Clearly there
`was a need to more closely model the system dynamics so as to be able to achieve
`better control performance. On return to Australia this became the subject of my PhD
`research [52].
`
`Nomenclature
`
The most commonly used symbols in this book, and their units, are listed below.
Note that some symbols are overloaded, in which case their context must be used to
disambiguate them.

v            a vector
vx           a component of a vector
A            a matrix
x̂            an estimate of x
x̃            error in x
xd           demanded value of x
A^T          transpose of A
αx, αy       pixel pitch (pixels/mm)
B            viscous friction coefficient (N·m·s/rad)
C            camera calibration matrix (3 × 4)
C(q, q̇)      manipulator centripetal and Coriolis term (kg·m²/s)
ceil(x)      returns n, the smallest integer such that n ≥ x
E            illuminance (lx)
f            force (N)
f            focal length (m)
F            f-number
F(q̇)         friction torque (N·m)
floor(x)     returns n, the largest integer such that n ≤ x
G            gear ratio
φ            luminous flux (lm)
φ            magnetic flux (Wb)
G            gear ratio matrix
G(q)         manipulator gravity loading term (N·m)
i            current (A)
In           n × n identity matrix
j            √-1
J            scalar inertia (kg·m²)
J            inertia tensor, 3 × 3 matrix (kg·m²)
^A J_B       Jacobian transforming velocities in frame A to frame B
k, K         constant
Ki           amplifier gain, transconductance (A/V)
Km           motor torque constant (N·m/A)
K            forward kinematics
K^-1         inverse kinematics
L            inductance (H)
L            luminance (nt)
mi           mass of link i (kg)
M(q)         manipulator inertia matrix (kg·m²)
Ord          order of polynomial
q            generalized joint coordinates
Q            generalized joint torque/force
R            resistance (Ω)
θ            angle (rad)
θ            vector of angles, generally robot joint angles (rad)
s            Laplace transform operator
si           COM of link i with respect to the link i coordinate frame (m)
Si           first moment of link i, Si = mi si (kg·m)
σ            standard deviation
t            time (s)
T            sample interval (s)
T            lens transmission constant
Te           camera exposure interval (s)
T            homogeneous transformation
^A T_B       homogeneous transform of point B with respect to frame A;
             if A is not given it is assumed relative to world coordinate
             frame 0. Note that ^A T_B = (^B T_A)^-1.
τ            torque (N·m)
τC           Coulomb friction torque (N·m)
v            voltage (V)
ω            frequency (rad/s)
x            3-D pose, x = [x y z rx ry rz]^T, comprising translation along,
             and rotation about, the X, Y and Z axes
x, y, z      Cartesian coordinates
X0, Y0       coordinates of the principal point (pixels)
ix, iy       camera image plane coordinates (m)
iX, iY       camera image plane coordinates (pixels)
iX           camera image plane coordinates, iX = [iX iY]^T (pixels)
iX̃           image plane error
z            z-transform operator
Z            Z-transform
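As a worked illustration of the homogeneous transform notation above (the frames 0, 1 and 2 here are hypothetical, chosen only for the example), transforms compose by chaining frames, and the stated inverse relation swaps the frames:

```latex
% Composition: pose of frame 2 in frame 0 via intermediate frame 1
{}^{0}T_{2} = {}^{0}T_{1}\,{}^{1}T_{2}
% Inverse, as noted in the nomenclature:
{}^{A}T_{B} = \left({}^{B}T_{A}\right)^{-1}
```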
`
`The following conventions have also been adopted:
`
`Time domain variables are in lower case, frequency domain in upper case.
`
Transfer functions will frequently be written using the shorthand notation

    K (a) [ζ, ωn] ≡ K (s/a + 1) (s²/ωn² + (2ζ/ωn)s + 1)

A free integrator is an exception: (0) is used to represent s.
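As a concrete instance of this shorthand (the numeric values here are invented purely for illustration), a gain of 12 with a real pole at a = 20 rad/s and a complex pole pair with ζ = 0.6 and ωn = 50 rad/s would appear as 12 (20) [0.6, 50], standing for:

```latex
12\,(20)\,[0.6, 50] \;\equiv\;
12\left(\frac{s}{20}+1\right)
\left(\frac{s^{2}}{50^{2}}+\frac{2(0.6)}{50}\,s+1\right)
```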
`
When specifying motor motion, inertia and friction parameters, it is important
that a consistent reference is used, usually either the motor or the load, denoted
by the subscripts m or l respectively.
For numeric quantities the units radm and radl are used to indicate the reference
frame.
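For example (an illustrative identity under this convention, not a result from the text): with gear ratio G between motor and load, so that θm = G θl, a motor-referenced inertia or friction coefficient is referred to the load side by scaling with G²:

```latex
\theta_l = \theta_m / G, \qquad
J_l = G^{2} J_m, \qquad
B_l = G^{2} B_m
```

with the angles measured in radl and radm respectively.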
`
`In order to clearly distinguish results that were experimentally determined from
`simulated or derived results, the former will always be designated as 'measured'
`in the caption and index entry.
`
`A comprehensive glossary of terms and abbreviations is provided in Appendix
`A.
`
`
`
`
`Acknowledgements
`
`The work described in this book is largely based on my PhD research [52] which was
`carried out, part time, at the University of Melbourne over the period 1991–94. My
`supervisors Professor Malcolm Good at the University of Melbourne, and Dr. Paul
`Dunn at CSIRO provided much valuable discussion and guidance over the course of
`the research, and critical comments on the draft text.
`That work could not have occurred without the generosity and support of my em-
`ployer, CSIRO. I am indebted to Dr. Bob Brown and Dr. S. Ramakrishnan for sup-
`porting me in the Overseas Fellowship and PhD study, and making available the nec-
`essary time and laboratory facilities. I would like to thank my CSIRO colleagues for
`their support of this work, in particular: Dr. Paul Dunn, Dr. Patrick Kearney, Robin
`Kirkham, Dennis Mills, and Vaughan Roberts for technical advice and much valuable
`discussion; Murray Jensen and Geoff Lamb for keeping the computer systems run-
`ning; Jannis Young and Karyn Gee, the librarians, for tracking down all manner of
references; Les Ewbank for mechanical design and drafting; Ian Brittle's Research
`Support Group for mechanical construction; and Terry Harvey and Steve Hogan for
`electronic construction. The PhD work was partially supported by a University of
`Melbourne/ARC small grant. Writing this book was partially supported by the Co-
`operative Research Centre for Mining Technology and Equipment (CMTE), a joint
`venture between AMIRA, CSIRO, and the University of Queensland.
`Many others helped as well. Professor Richard (Lou) Paul, University of Penn-
`sylvania, was there at the beginning and made facilities at the GRASP laboratory
`available to me. Dr. Kim Ng of Monash University and Dr. Rick Alexander helped in
`discussions on camera calibration and lens distortion, and also loaned me the SHAPE
`system calibration target used in Chapter 4. Vision Systems Ltd. of Adelaide, through
`their then US distributor Tom Seitzler of Vision International, loaned me an APA-512
`video-rate feature extractor unit for use while I was at the GRASP Laboratory. David
Hoadley proofread the original thesis, and my next-door neighbour, Jack Davies, fixed
lots of things around my house that I didn't get around to doing.
`
`
`
`
`Contents
`
1 Introduction  1
  1.1 Visual servoing  1
    1.1.1 Related disciplines  5
  1.2 Structure of the book  5

2 Modelling the robot  7
  2.1 Manipulator kinematics  7
    2.1.1 Forward and inverse kinematics  10
    2.1.2 Accuracy and repeatability  11
    2.1.3 Manipulator kinematic parameters  12
  2.2 Manipulator rigid-body dynamics  14
    2.2.1 Recursive Newton-Euler formulation  16
    2.2.2 Symbolic manipulation  19
    2.2.3 Forward dynamics  21
    2.2.4 Rigid-body inertial parameters  21
    2.2.5 Transmission and gearing  27
    2.2.6 Quantifying rigid body effects  28
    2.2.7 Robot payload  30
  2.3 Electro-mechanical dynamics  31
    2.3.1 Friction  32
    2.3.2 Motor  35
    2.3.3 Current loop  42
    2.3.4 Combined motor and current-loop dynamics  45
    2.3.5 Velocity loop  49
    2.3.6 Position loop  52
    2.3.7 Fundamental performance limits  56
  2.4 Significance of dynamic effects  58
  2.5 Manipulator control  60
    2.5.1 Rigid-body dynamics compensation  60
    2.5.2 Electro-mechanical dynamics compensation  64
  2.6 Computational issues  64
    2.6.1 Parallel computation  65
    2.6.2 Symbolic simplification of run-time equations  66
    2.6.3 Significance-based simplification  67
    2.6.4 Comparison  68

3 Fundamentals of image capture  73
  3.1 Light  73
    3.1.1 Illumination  73
    3.1.2 Surface reflectance  75
    3.1.3 Spectral characteristics and color temperature  76
  3.2 Image formation  79
    3.2.1 Light gathering and metering  81
    3.2.2 Focus and depth of field  82
    3.2.3 Image quality  84
    3.2.4 Perspective transform  86
  3.3 Camera and sensor technologies  87
    3.3.1 Sensors  88
    3.3.2 Spatial sampling  91
    3.3.3 CCD exposure control and motion blur  94
    3.3.4 Linearity  95
    3.3.5 Sensitivity  96
    3.3.6 Dark current  100
    3.3.7 Noise  100
    3.3.8 Dynamic range  102
  3.4 Video standards  102
    3.4.1 Interlacing and machine vision  105
  3.5 Image digitization  106
    3.5.1 Offset and DC restoration  107
    3.5.2 Signal conditioning  107
    3.5.3 Sampling and aspect ratio  107
    3.5.4 Quantization  112
    3.5.5 Overall MTF  113
    3.5.6 Visual temporal sampling  115
  3.6 Camera and lighting constraints  116
    3.6.1 Illumination  118
  3.7 The human eye  121

4 Machine vision  123
  4.1 Image feature extraction  123
    4.1.1 Whole scene segmentation  124
    4.1.2 Moment features  127
    4.1.3 Binary region features  130
    4.1.4 Feature tracking  136
  4.2 Perspective and photogrammetry  137
    4.2.1 Close-range photogrammetry  138
    4.2.2 Camera calibration techniques  139
    4.2.3 Eye-hand calibration  147

5 Visual servoing  151
  5.1 Fundamentals  152
  5.2 Prior work  154
  5.3 Position-based visual servoing  159
    5.3.1 Photogrammetric techniques  159
    5.3.2 Stereo vision  160
    5.3.3 Depth from motion  160
  5.4 Image based servoing  161
    5.4.1 Approaches to image-based visual servoing  163
  5.5 Implementation issues  166
    5.5.1 Cameras  166
    5.5.2 Image processing  167
    5.5.3 Feature extraction  167
    5.5.4 Visual task specification  169

6 Modelling an experimental visual servo system  171
  6.1 Architectures and dynamic performance  172
  6.2 Experimental hardware and software  175
    6.2.1 Processor and operating system  176
    6.2.2 Robot control hardware  177
    6.2.3 ARCL  178
    6.2.4 Vision system  179
    6.2.5 Visual servo support software — RTVL  182
  6.3 Kinematics of camera mount and lens  184
    6.3.1 Camera mount kinematics  184
    6.3.2 Modelling the lens  188
  6.4 Visual feedback control  191
    6.4.1 Control structure  194
    6.4.2 "Black box" experiments  195
    6.4.3 Modelling system dynamics  197
    6.4.4 The effect of multi-rate sampling  201
    6.4.5 A single-rate model  203
    6.4.6 The effect of camera shutter interval  204
    6.4.7 The effect of target range  206
    6.4.8 Comparison with joint control schemes  208
    6.4.9 Summary  209

7 Control design and performance  211
  7.1 Control formulation  212
  7.2 Performance metrics  214
  7.3 Compensator design and evaluation  215
    7.3.1 Addition of an extra integrator  215
    7.3.2 PID controller  216
    7.3.3 Smith's method  218
    7.3.4 State feedback controller with integral action  219
    7.3.5 Summary  225
  7.4 Axis control modes for visual servoing  227
    7.4.1 Torque control  228
    7.4.2 Velocity control  229
    7.4.3 Position control  231
    7.4.4 Discussion  231
    7.4.5 Non-linear simulation and model error  233
    7.4.6 Summary  234
  7.5 Visual feedforward control  235
    7.5.1 High-performance axis velocity control  237
    7.5.2 Target state estimation  242
    7.5.3 Feedforward control implementation  251
    7.5.4 Experimental results  255
  7.6 Biological parallels  257
  7.7 Summary  260

8 Further experiments in visual servoing  263
  8.1 Visual control of a major axis  263
    8.1.1 The experimental setup  264
    8.1.2 Trajectory generation  265
    8.1.3 Puma 'native' position control  266
    8.1.4 Understanding joint 1 dynamics  269
    8.1.5 Single-axis computed torque control  274
    8.1.6 Vision based control  279
    8.1.7 Discussion  281
  8.2 High-performance 3D translational visual servoing  282
    8.2.1 Visual control strategy  283
    8.2.2 Axis velocity control  286
    8.2.3 Implementation details  287
    8.2.4 Results and discussion  290
  8.3 Conclusion  294

9 Discussion and future directions  297
  9.1 Discussion  297
  9.2 Visual servoing: some questions (and answers)  299
  9.3 Future work  302

Bibliography  303

A Glossary  321

B This book on the Web  325

C APA-512  327

D RTVL: a software system for robot visual servoing  333
  D.1 Image processing control  334
  D.2 Image features  334
  D.3 Time stamps and synchronized interrupts  335
  D.4 Real-time graphics  337
  D.5 Variable watch  337
  D.6 Parameters  338
  D.7 Interactive control facility  338
  D.8 Data logging and debugging  338
  D.9 Robot control  340
  D.10 Application program facilities  340
  D.11 An example — planar positioning  340
  D.12 Conclusion  341

E LED strobe  343

Index  347
`
`
`
`
`List of Figures
`
`1.1 General structure of hierarchical model-based robot and vision system.
`
`4
`
`8
`. . . . .
`2.1 Different forms of Denavit-Hartenberg notation. . . . . . . .
`13
`. . . . .
`2.2 Details of coordinate frames used for Puma 560 . . . . . . .
`17
`. . . . .
`2.3 Notation for inverse dynamics
`.
`. . . . . . .
`. . . . . . . .
`25
`. . . . .
`2.4 Measured and estimated gravity load on joint 2.
`. . . . . . .
`29
`. . . . .
`2.5 Configuration dependent inertia for joint 1 . .
`. . . . . . . .
`29
`. . . . .
`2.6 Configuration dependent inertia for joint 2 . .
`. . . . . . . .
`30
`. . . . .
`2.7 Gravity load on joint 2 . . . . .
`. . . . . . .
`. . . . . . . .
`32
`. . . . .
`2.8 Typical friction versus speed characteristic.
`.
`. . . . . . . .
`34
`. . . . .
`2.9 Measured motor current versus joint velocity for joint 2 . . .
`36
`. . . . .
`2.10 Block diagram of motor mechanical dynamics . . . . . . . .
`36
`. . . . .
`2.11 Schematic of motor electrical model. . . . . .
`. . . . . . . .
`2.12 Measured joint angle and voltage data from open-circuit test on joint 2. 39
`2.13 Block diagram of motor current loop . . . . .
`. . . . . . . .
`. . . . .
`43
`2.14 Measured joint 6 current-loop frequency response . . . . . .
`. . . . .
`44
`2.15 Measured joint 6 motor and current-loop transfer function.
`.
`. . . . .
`45
`2.16 Measured maximum current step response for joint 6. . . . .
`. . . . .
`48
`2.17 SIMULINK model MOTOR . .
`. . . . . . .
`. . . . . . . .
`. . . . .
`50
`2.18 SIMULINK model LMOTOR .
`. . . . . . .
`. . . . . . . .
`. . . . .
`50
`2.19 Velocity loop block diagram.
`. .
`. . . . . . .
`. . . . . . . .
`. . . . .
`51
`2.20 Measured joint 6 velocity-loop transfer function . . . . . . .
`. . . . .
`52
`2.21 SIMULINK model VLOOP . .
`. . . . . . .
`. . . . . . . .
`. . . . .
`52
`2.22 Unimation servo position control mode.
`. . .
`. . . . . . . .
`. . . . .
`53
`2.23 Block diagram of Unimation position control loop.
`. . . . .
`. . . . .
`54
`2.24 Root-locus diagram of position loop with no integral action.
`. . . . .
`56
`2.25 Root-locus diagram of position loop with integral action enabled. . . .
`56
`2.26 SIMULINK model POSLOOP .
`. . . . . . .
`. . . . . . . .
`. . . . .
`57
`2.27 Unimation servo current control mode. . . . .
`. . . . . . . .
`. . . . .
`57
`2.28 Standard trajectory torque components.
`. . .
`. . . . . . . .
`. . . . .
`59
`
`xxi
`
`ABB Inc. Exhibit 1004, Page 18 of 378
`ABB Inc. v. Roboticvisiontech, Inc.
` IPR2023-01426
`
`
`
`xxii
`
`LIST OF FIGURES
`
`. . . . . . . .
`2.29 Computed torque control structure. . . . . . .
`. . . . . . . .
`2.30 Feedforward control structure.
`.
`. . . . . . .
`2.31 Histogram of torque expression coefficient magnitudes . . .
`
`. . . . .
`. . . . .
`. . . . .
`
`61
`61
`68
`
`74
`. . . . .
`. . . . . . . .
`. . . . .
`3.1 Steps involved in image processing.
`75
`. . . . .
`. . . . . . . .
`3.2 Luminosity curve for standard observer.
`. . .
`76
`. . . . .
`. . . . . . . .
`3.3 Specular and diffuse surface reflectance . . .
`77
`. . . . .
`3.4 Blackbody emissions for solar and tungsten illumination.
`. .
`79
`. . . . .
`3.5 Elementary image formation . .
`. . . . . . .
`. . . . . . . .
`83
`. . . . .
`3.6 Depth of field bounds . . . . . .
`. . . . . . .
`. . . . . . . .
`87
`. . . . .
`3.7 Central perspective geometry. . .
`. . . . . . .
`. . . . . . . .
`88
`. . . . .
`3.8 CCD photosite charge wells and incident photons. . . . . . .
`89
`. . . . .
`3.9 CCD sensor architectures . . . .
`. . . . . . .
`. . . . . . . .
`91
`. . . . .
`3.10 Pixel exposure intervals . . . . .
`. . . . . . .
`. . . . . . . .
`92
`. . . . .
`3.11 Camera spatial sampling . . . .
`. . . . . . .
`. . . . . . . .
`93
`. . . . .
`3.12 Some photosite capture profiles.
`. . . . . . .
`. . . . . . . .
`93
`. . . . .
`3.13 MTF for various capture profiles. . . . . . . .
`. . . . . . . .
`95
`. . . . .
`3.14 Exposure interval of the Pulnix camera.
`. . .
`. . . . . . . .
`97
`. . . . .
`3.15 Experimental setup to determine camera sensitivity. . . . . .
`98
`. . . . .
`3.16 Measured response of AGC circuit to changing illumination.
`99
`3.17 Measured response of AGC circuit to step illumination change. . . . .
`3.18 Measured spatial variance of illuminance as a function of illuminance. 102
`3.19 CCIR standard video waveform . . . . . . .
`. . . . . . . .
`. . . . . 103
`3.20 Interlaced video fields.
`. . . . .
`. . . . . . .
`. . . . . . . .
`. . . . . 104
`3.21 The effects of field-shuttering on a moving object. . . . . . .
`. . . . . 106
`3.22 Phase delay for digitizer filter . .
`. . . . . . .
`. . . . . . . .
`. . . . . 108
`3.23 Step response of digitizer filter .
`. . . . . . .
`. . . . . . . .
`. . . . . 108
`3.24 Measured camera and digitizer horizontal timing.
`. . . . . .
`. . . . . 109
`3.25 Camera and image plane coordinate systems.
`. . . . . . . .
`. . . . . 112
`3.26 Measured camera response to horizontal step illumination change.
`. . 113
`3.27 Measured camera response to vertical step illumination change. . . . . 114
`3.28 Typical arrangement of anti-aliasing (low-pass) filter, sampler and zero-
`order hold. . . . .
`. . . . . . . .
`. . . . . . .
`. . . . . . . .
`. . . . . 115
`3.29 Magnitude response of camera output versus target motion .
`. . . . . 117
`3.30 Magnitude response of camera output to changing threshold . . . . . 117
`3.31 Spreadsheet program for camera and lighting setup . . . . .
`. . . . . 119
`3.32 Comparison of illuminance due to a conventional floodlamp and cam-
`era mounted LEDs
`. . . . . . .
`. . . . . . .
`. . . . . . . .
`. . . . . 120
`
`. . . . . . . .
`4.1 Steps involved in scene interpretation . . . .
`4.2 Boundary representation as either crack codes or chain code.
`
`. . . . . 125
`. . . . . 126
`
`ABB Inc. Exhibit 1004, Page 19 of 378
`ABB Inc. v. Roboticvisiontech, Inc.
` IPR2023-01426
`
`
`
`LIST OF FIGURES
`
`xxiii
`
`. 128
`4.3 Exaggerated view showing circle centroid offset in the image plane.
`4.4 Equivalent ellipse for an arbitrary region.
`. .
`. . . . . . . .
`. . . . . 129
`4.5 The ideal sensor array showing rectangular image and notation. . . . . 131
`4.6 The effect of edge gradients on binarized width. . . . . . . .
`. . . . . 134
`4.7 Relevant coordinate frames. . . .
`. . . . . . .
`. . . . . . . .
`. . . . . 139
`4.8 The calibration target used for intrinsic parameter determination.
`. . . 140
`4.9 The two-plane camera model. . .
`. . . . . . .
`. . . . . . . .
`. . . . . 142
`4.10 Contour plot of intensity profile around calibration marker.
`.
`. . . . . 146
`4.11 Details of camera mounting . . .
`. . . . . . .
`. . . . . . . .
`. . . . . 148
`4.12 Details of camera, lens and sensor placement.
`. . . . . . . .
`. . . . . 149

5.1 Relevant coordinate frames . . . . . 152
5.2 Dynamic position-based look-and-move structure . . . . . 154
5.3 Dynamic image-based look-and-move structure . . . . . 154
5.4 Position-based visual servo (PBVS) structure as per Weiss . . . . . 155
5.5 Image-based visual servo (IBVS) structure as per Weiss . . . . . 155
5.6 Example of initial and desired view of a cube . . . . . 162

6.1 Photograph of VME rack . . . . . 175
6.2 Overall view of the experimental system . . . . . 176
6.3 Robot controller hardware architecture . . . . . 178
6.4 ARCL setpoint and servo communication timing . . . . . 179
6.5 MAXBUS and VMEbus datapaths . . . . . 180
6.6 Schematic of image processing data flow . . . . . 181
6.7 Comparison of latency for frame and field-rate processing . . . . . 182
6.8 Typical RTVL display . . . . . 183
6.9 A simple camera mount . . . . . 186
6.10 Camera mount used in this work . . . . . 186
6.11 Photograph of camera mounting arrangement . . . . . 187
6.12 Target location in terms of bearing angle . . . . . 189
6.13 Lens center of rotation . . . . . 190
6.14 Coordinate and sign conventions . . . . . 193
6.15 Block diagram of 1-DOF visual feedback system . . . . . 194
6.16 Photograph of square wave response test configuration . . . . . 195
6.17 Experimental setup for step response tests . . . . . 195
6.18 Measured response to 'visual step' . . . . . 196
6.19 Measured closed-loop frequency response of single-axis visual servo . . . . . 197
6.20 Measured phase response and delay estimate . . . . . 198
6.21 Temporal relationships in image processing and robot control . . . . . 199
6.22 SIMULINK model of the 1-DOF visual feedback controller . . . . . 200
6.23 Multi-rate sampling example . . . . . 201
6.24 Analysis of sampling time
