Displaying 39-53 of 55 results

2009 Exhibit: EX2009 Breaking The Toolhead Barrier With MedUSA Fabbaloo

Document IPR2023-01426, No. 2009 Exhibit - EX2009 Breaking The Toolhead Barrier With MedUSA Fabbaloo (P.T.A.B. Jan. 4, 2024)
Breaking The Toolhead Barrier With MedUSA « Fabbaloo
Other companies attempt to speed up 3D printing by doing operations in parallel.
3D printers with independent dual extruders (IDEX) are often able to print two objects simultaneously in “copy mode”.
Dr. Andrzej Nycz is the Wire-Arc Metal AM Technical Lead in the Manufacturing Demonstration Facility at the Oak Ridge National Laboratory.
In a recent LinkedIn post, Nycz posted this incredible video: MedUSA is a multi-robot, large-scale system. In the video you can see three independent WAAM toolheads simultaneously working on a metal 3D print.
Somehow the ORNL researchers were able to develop a system to collaborate between the toolheads to avoid conflicts and produce the object at high speed.

2006 Exhibit: EX2006 Rizzi 1991

Document IPR2023-01426, No. 2006 Exhibit - EX2006 Rizzi 1991 (P.T.A.B. Jan. 4, 2024)
In this second generation machine, a three degree of freedom direct drive arm (Figure 1) relies on a field rate stereo vision system to bat an artificially illuminated ping-pong ball into a specified periodic vertical motion.
Thresholding, of course, necessitates a visually structured environment, and we presently illuminate white ping-pong balls with halogen lamps while putting black matte cloth cowling on the robot and floor, and curtaining off any background scene. 2.3.1 Triangulation. In order to simplify the construction of a triangulator for this vision system, we have employed a simple projective camera model.
After introducing the “environmental control system,” an abstract dynamical system formed by composing the free flight and impact models, it becomes possible to encode an elementary dexterous task, the “vertical one juggle,” as an equilibrium state: a fixed point.
Its implementation in a network of XP/DCS nodes is depicted in Figure 4. The juggling algorithm these diagrams realize is a straightforward application of contemporary robot tracking techniques to the mirror law presented in Section 3, as driven by the output of the vision system.
At calibration time, one supposes that some point on the robot’s gripper (that we will take to be the origin of the “tool” frame) is marked with a light-reflecting material in such a fashion as to produce an unmistakable camera observation: a four-vector c ∈ R⁴ comprised of the two image-plane measurements.
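The stereo triangulation step the excerpt describes (two image-plane measurements forming a four-vector, interpreted through a simple projective camera model) can be illustrated with a generic linear (DLT) triangulator. This is a minimal numpy sketch of the general technique, not Rizzi's actual code; the function name and example matrices below are illustrative assumptions.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of a single point from two views.

    P1, P2 : 3x4 projection matrices for the two cameras.
    x1, x2 : (u, v) image-plane measurements of the same point.
    Returns the estimated 3D point in world coordinates.
    """
    # Each view contributes two homogeneous linear constraints on X.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector of A with the smallest
    # singular value (the exact null space for noise-free data).
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

For ideal cameras separated by a known baseline, projecting a point and triangulating it recovers the point exactly; with noisy measurements the SVD gives the least-squares solution.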

2005 Exhibit: EX2005 Kurfess CV

Document IPR2023-01426, No. 2005 Exhibit - EX2005 Kurfess CV (P.T.A.B. Jan. 4, 2024)
“Bearing Fault Detection via High Frequency Resonance Technique with Adaptive Line Enhancer,” Li, Y., Shiroishi, J.W., Kurfess, T.R., Liang, S., and Danyluk, S., accepted by the 12th Biennial Conference on Reliability, Stress Analysis, & Failure Prevention, Virginia Beach, VA, April 15-17, 1997.
Y., Shiroishi, J., Danyluk, S., Kurfess, T., and Liang, S. Y., “Diagnostics of Roller Bearing Defects Based on Vibration and Acoustic Emission,” Proceedings of the 10th International Congress and Exhibition on Condition Monitoring and Diagnostic Engineering Management, pp. 256-267, Espoo, Finland, June 1997.
Kirkland, E., Kurfess, T. R., Liang, S. Y., “A Nano Coordinate Machine for Optical Dimensional Metrology,” First Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment and Management (HNICEM) International Conference, Manila, Philippines, March 2003.
Cui, Y., Kurfess, T.R., and Messman, M. “Testing and Modeling of Nonlinear Properties of Shock Absorbers for Vehicle Dynamics Studies,” Proceedings of the World Congress on Engineering and Computer Science 2010, Vol II WCECS 2010, San Francisco, CA, USA, October 20-22, 2010.
Whitney, D. E., Kurfess, T. R., Todtenkopf, A. B., Edsall, A. C., Brown, M. L. and Roxas, P. S., “Development and Control of an Automated Robotic Weld Bead Grinding System,” Proceedings of the 15th National Science Foundation Grantees Conference on Production Research and Technology, Berkeley, CA, January 1989.

1003 Exhibit: EX1003 Hutchinson Decl

Document IPR2023-01426, No. 1003 Exhibit - EX1003 Hutchinson Decl (P.T.A.B. Sep. 22, 2023)
I further understand that objective indicia of nonobviousness include failure of others, copying, unexpected results, information that “teaches away” from the claimed subject matter, perception in the industry, commercial success, and long-felt but unmet need.
I understand that the USPTO will look to the specification and prosecution history to see if there is a definition for a given claim term, and if not, will apply the ordinary and customary meaning from the perspective of a POSITA at the time in which the alleged invention was made.
I understand that the Patent Owner sought to distinguish Wei-II by arguing that: “Importantly…Wei[-II] …is directed to the use of a stereo pair of cameras” but the claimed invention is “directed to methods and apparatus that employ single camera three-dimensional (3-D) vision for robotic guidance.” EX1002, p.129.
Instead of allowing the application to issue, I understand that the Patent Owner filed a request for continued examination, submitted additional prior art, and further amended the claims.
Corke explains that four images of the SHAPE calibration target were used to compute the “calibration matrix”—a mathematical object that “contains information about the position of the camera with respect to the world coordinate frame”—i.e., the extrinsic parameters.
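The "calibration matrix" described above, a matrix that encodes the camera's pose with respect to the world coordinate frame, can in general be estimated from known world-to-image point correspondences, and its null space yields the camera's position (the extrinsic translation). The sketch below is a generic numpy DLT estimator illustrating that concept only; it is not Corke's or the declarant's implementation, and all names in it are mine.

```python
import numpy as np

def estimate_camera_matrix(world_pts, image_pts):
    """Estimate a 3x4 camera matrix P from >= 6 world-to-image point
    correspondences via the Direct Linear Transformation (DLT)."""
    rows = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    # P (up to scale) is the null vector of the stacked constraints.
    _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return Vt[-1].reshape(3, 4)

def camera_centre(P):
    """Camera position in world coordinates: P has a one-dimensional
    null space spanned by the homogeneous camera centre."""
    _, _, Vt = np.linalg.svd(P)
    C = Vt[-1]
    return C[:3] / C[3]
```

Because the camera matrix is only defined up to scale, the recovered centre is unaffected by the arbitrary sign and scale of the SVD null vector.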

1009 Exhibit: EX1009 US5959425

Document IPR2023-01426, No. 1009 Exhibit - EX1009 US5959425 (P.T.A.B. Sep. 22, 2023)
The various features and advantages of this invention will become apparent to those skilled in the art from the following detailed description of the currently preferred embodiment.
Conventional stereo techniques are used to convert the two-dimensional image data from two different perspectives to determine the three-dimensional location of the path on the workpiece 32.
The flow chart 60 includes several basic steps that preferably are sequentially completed to program the robot to follow a desired path.
Conventional stereo techniques, as understood by those skilled in the art, preferably are used to convert the information from the images into a three-dimensional set of data representing the location of the desired path relative to a robot reference frame, which typically is associated with the base 24.
This invention provides a number of significant advantages including eliminating the need for manually teaching a robot to follow a desired path.

1005 Exhibit: EX1005 Wei I

Document IPR2023-01426, No. 1005 Exhibit - EX1005 Wei I (P.T.A.B. Sep. 22, 2023)
ABB Inc. Exhibit 1005, Page 1 of 11, ABB Inc. v. Roboticvisiontech, Inc., IPR2023-01426

1006 Exhibit: EX1006 Wei II

Document IPR2023-01426, No. 1006 Exhibit - EX1006 Wei II (P.T.A.B. Sep. 22, 2023)
In the case of multiple sensory visual servoing, an extra difficulty arises in determining the optimal relative weights (importance) of each sensor in the motion estimation procedure (e.g., in the minimization of an objective function [15]).
Kuperstein [7] first proposed to map stereo disparities of stationary cameras directly to the robot joint angles used to reach a single point in the three-dimensional (3-D) space by a nonlinear network.
Hashimoto et al. [4] simplified Miller’s method by considering the relative positioning with respect to a static object, without having to involve the current joint angle configuration in the input space.
We denote the small portion by ΔM1 = αM1, where the constant α is used to scale the motion and is made dependent on the magnitude of M1. After the robot has moved by ΔM1, we measure the sensory data to obtain, say, S2 at the new position.
After 280 cycles of iterations, which took about 5 min on an Indigo2 Silicon Graphics workstation, the training is stopped with an rms error of 4% of the maximum motion component.
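The measure-move-remeasure loop in the excerpt (apply a scaled motion ΔM, observe the resulting sensory change, refine the estimated sensor-motion mapping) resembles a secant-style Jacobian estimation scheme. The following is a minimal numpy sketch under that assumption; the Broyden-style rank-one update and the damping constant alpha are my illustrative choices, not necessarily Wei's trained-network method.

```python
import numpy as np

def broyden_update(J, dM, dS):
    """Rank-one (Broyden) update of an estimated sensor Jacobian J so
    that the updated J maps the applied motion dM to the observed
    sensory change dS (the secant condition)."""
    dM = np.asarray(dM, dtype=float).ravel()
    dS = np.asarray(dS, dtype=float).ravel()
    d = dM @ dM
    if d < 1e-12:  # no motion was applied; nothing to learn from
        return J
    return J + np.outer(dS - J @ dM, dM) / d

def servo_step(J, S, S_target, alpha=0.2):
    """One damped correction step: solve J dM = (S_target - S) in the
    least-squares sense, then apply only the fraction alpha of it."""
    dM, *_ = np.linalg.lstsq(J, S_target - S, rcond=None)
    return alpha * dM
```

After each applied step, the newly measured sensory reading feeds back into `broyden_update`, so the estimated mapping improves as the robot moves.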

1002 Exhibit: EX1002 File History Part 2 of 2

Document IPR2023-01426, No. 1002-2 Exhibit - EX1002 File History Part 2 of 2 (P.T.A.B. Sep. 22, 2023)
Specifically, while the follow-up control means makes said gripping mechanism follow a subject at the prescribed distance, the coordinates of each characteristic quantity of said subject extracted by said feature-amount extracting means are newly stored; it is then sufficient to store the position of said gripping mechanism when said follow-up control means makes said gripping mechanism follow the subject from said second end-point position, using the coordinates of each newly stored characteristic quantity as the new second end-point position.
[0061] The robot controller 5 computes Pa according to the following formulas, noting that it takes time T0 to carry out the relative displacement of the part Po.
Cameras (2) are fastened directly or indirectly to the robot and moveable with it, to monitor any deviation of the components held by a grab (3) from a standard position.
Patent and Trademark Office; U.S. DEPARTMENT OF COMMERCE. Under the Paperwork Reduction Act of 1995 no persons are required to respond to a collection of information unless it displays a valid OMB control number.
Any reply received by the Office later than three months after the mailing date of this communication, even if timely filed, may reduce any earned patent term adjustment.

1002 Exhibit: EX1002 File History Part 1 of 2

Document IPR2023-01426, No. 1002 Exhibit - EX1002 File History Part 1 of 2 (P.T.A.B. Sep. 22, 2023)
Prior single camera systems however have used laser triangulation which involves expensive specialized sensors, must be rigidly packaged to maintain geometric relationships, require sophisticated inter-tool calibration methods and tend to be susceptible to damage or misalignment when operating in industrial environments.
conditions as demonstrated in "Uniqueness of 3D Pose Under Weak Perspective: A Geometric Proof" by Thomas Huang, Alfred Bruckstein, Robert Holt, Arun Netravali; approximate object location using Direct Linear Transformation
Yuncai Liu, Thomas S. Huang and Olivier D. Faugeras, "Determination of Camera Location from 2-D to 3-D Line and Point Correspondences", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol.
Thomas Huang, Alfred M. Bruckstein, Robert J. Holt, and Arun N. Netravali, "Uniqueness of 3D Pose Under Weak Perspective: A Geometrical Proof", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol.
Figure 2 illustrates a calibration system according to the present invention in which the basic features thereof have been reversed, i.e. in this case, the robot arm carries the cameras while the dotted targets are fixed with respect to the work station, i.e.

1001 Exhibit: EX1001 US8095237

Document IPR2023-01426, No. 1001 Exhibit - EX1001 US8095237 (P.T.A.B. Sep. 22, 2023)
Yuncai Liu, Thomas S. Huang and Olivier D. Faugeras, "Determination of Camera Location from 2-D to 3-D Line and Point Correspondences", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.
Thomas Huang, Alfred M. Bruckstein, Robert J. Holt, and Arun N. Netravali, "Uniqueness of 3D Pose Under Weak Perspective: A Geometrical Proof", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.
Prior single camera systems however have used laser triangulation which involves expensive specialized sensors, must be rigidly packaged to maintain geometric relationships, require sophisticated inter-tool calibration methods and tend to be susceptible to damage or misalignment when operating in industrial environments.

1013 Exhibit: EX1013 Scheduling Order

Document IPR2023-01426, No. 1013 Exhibit - EX1013 Scheduling Order (P.T.A.B. Sep. 22, 2023)
Unless otherwise ordered by the Court or agreed to by parties, the limitations on discovery set forth in the Federal Rules of Civil Procedure shall be strictly observed.
Subsequent to exchanging that list, the parties will meet and confer to prepare a Joint Claim Construction Chart to be filed no later than January 12, 2024.
On June 21, 2024, counsel shall submit a joint letter to the Court with an interim report of the matters in issue and the progress of discovery to date.
Any motion for summary judgment shall be accompanied by a separate concise statement, not to exceed 6 pages, which details each material fact which the moving party contends is essential for the Court's resolution of the summary judgment motion (not the entire case) and as to which the moving party contends there is no genuine issue to be tried.
Defendant produces core technical documents, including operation manuals, product literature, schematics, specifications, and source code.

1011 Exhibit: EX1011 Docket Sheet

Document IPR2023-01426, No. 1011 Exhibit - EX1011 Docket Sheet (P.T.A.B. Sep. 22, 2023)
U.S. District Court, District of Delaware (Wilmington). CIVIL DOCKET FOR CASE #: 1:22-cv-01257-GBW

1008 Exhibit: EX1008 US4146924

Document IPR2023-01426, No. 1008 Exhibit - EX1008 US4146924 (P.T.A.B. Sep. 22, 2023)
combined in a single portable unit or visual programming device (VPD) that is marked 20. To quantify the magnitude of the accuracy issue, one can assume a 100 cm line in a reference plane perpendicular to the optical axis.
unique orientation between a pair of robot fingertips which are structurally different, a distinction between the two lights must be made. Specification of trajectories is useful to avoid obstacles, contouring and for special applications.
sources 10A and 10B, as sensed by the TV camera 2A and extracted by the computer 3, is measured relative to the fiducial array position and orientation. Next the robot removes the workpiece 13 from the fixture 15 (location 30B) and places it at the open position.
apparatus, the vision system can also be used to communicate information to a computer-controlled robot without direct wire connections. This procedure minimizes computational time and memory requirements at the moment a keyboard entry is made.
(3) Keyboard status is monitored for codes to initiate the various forms of processing. Economic justification of the present technique must also take into account the frequency with which the robot will be reprogrammed.

1004 Exhibit: EX1004 Corke

Document IPR2023-01426, No. 1004 Exhibit - EX1004 Corke (P.T.A.B. Sep. 22, 2023)
I would like to thank my CSIRO colleagues for their support of this work, in particular: Dr. Paul Dunn, Dr. Patrick Kearney, Robin Kirkham, Dennis Mills, and Vaughan Roberts for technical advice and much valuable discussion; Murray Jensen and Geoff Lamb for keeping the computer systems running; Jannis Young and Karyn Gee, the librarians, for tracking down all manner of references; Les Ewbank for mechanical design and drafting; Ian Brittle's Research Support Group for mechanical construction; and Terry Harvey and Steve Hogan for electronic construction.
In a manufacturing environment visual servoing could thus eliminate robot teaching and allow tasks that were not strictly repetitive, such as assembly without precise fixturing and with incoming components that were unoriented or perhaps swinging on overhead transfer lines.
The term visual servoing appears to have been first introduced by Hill and Park [116] in 1979 to distinguish their approach from earlier 'blocks world' experiments where the robot system alternated between picture taking and moving.
Nonetheless there is a rapidly growing body of literature dealing with visual servoing, though dynamic performance or bandwidth reported to date is substantially less than could be expected given the video sample rate.
Figure 1.1: General structure of hierarchical model-based robot and vision system.

1007 Exhibit: EX1007 Summons

Document IPR2023-01426, No. 1007 Exhibit - EX1007 Summons (P.T.A.B. Sep. 22, 2023)
Case 1:22-cv-01257-GBW, Document 7, Filed 09/23/22, Page 1 of 2, PageID #: 428. ABB Inc. Exhibit 1007, ABB Inc. v. Roboticvisiontech, Inc., IPR2023-01426