Displaying 24-38 of 55 results

No. 1-1 COMPLAINT for PATENT INFRINGEMENT filed with Jury Demand against ABB Inc. ( Filing fee $ 402, ...

Document RoboticVISIONTech, Inc. v. ABB Inc., 1:22-cv-01257, No. 1-1 (D.Del. Sep. 22, 2022)
Yuncai Liu, Thomas S. Huang and Olivier D. Faugeras, "Determination of Camera Location from 2-D to 3-D Line and Point Correspondences", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.
Chien-Ping Lu, Gregory D. Hager and Eric Mjolsness, "Fast and Globally Convergent Pose Estimation from Video Images", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.
Thomas Huang, Alfred M. Bruckstein, Robert J. Holt, and Arun N. Netravali, "Uniqueness of 3D Pose Under Weak Perspective: A Geometrical Proof", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.
Prior single camera systems however have used laser triangulation which involves specialized sensors, must be rigidly packaged to maintain geometric relationships, require sophisticated inter-tool calibration methods and tend to be susceptible to damage or misalignment when operating in industrial environments.
As will be apparent to those skilled in the art in the light of the foregoing disclosure, many alterations and modifications are possible in the practice of this invention without departing from the spirit or scope thereof.

No. 1-8 COMPLAINT for PATENT INFRINGEMENT filed with Jury Demand against ABB Inc. ( Filing fee $ 402, ...

Document RoboticVISIONTech, Inc. v. ABB Inc., 1:22-cv-01257, No. 1-8 (D.Del. Sep. 22, 2022)
TrueView enables ABB robots to precisely locate the grip points of a disoriented object within a 3D space.
The eVF software platform includes unique technologies such as AutoCal for easy calibration, and AccuTest and AccuTrain for quick and reliable integration.
TrueView™ Vision Guided Robotics. Printed in USA. ABB reserves the right to change specifications without notice.
• Uses structured light (e.g. laser) stripes to scan part surfaces to provide added feature visibility
• Provides the 3D position of rigid parts with smooth, featureless surfaces, in full six degrees of freedom

TECHNICAL DATA, TrueView Vision Systems
Supported Robot Types
• Robot Controller / Robot Type: All IRB Arms
Robot Controller Configuration Requirement
• Hardware: Analog/Digital Combi Board
• Baseware Version: 3.2 or higher
• PC Interface
Performance
• Vision Accuracy: +/- 0.5 mm
• Vision Processing Time: 0.5 – 1.5 seconds
• Typical Part Movement: +/- 15 degrees, +/- 300 mm
Capability
• Camera: Analog High resolution, Analog Standard resolution, Analog High Speed
• Lens: Any size
• Lights: LED – Multiple sizes
• Structured Light: Yes
www.abb.com/robotics

No. 1-3 COMPLAINT for PATENT INFRINGEMENT filed with Jury Demand against ABB Inc. ( Filing fee $ 402, ...

Document RoboticVISIONTech, Inc. v. ABB Inc., 1:22-cv-01257, No. 1-3 (D.Del. Sep. 22, 2022)
Yuncai Liu, Thomas S. Huang and Olivier D. Faugeras, "Determination of Camera Location from 2-D to 3-D Line and Point Correspondences", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.
Thomas Huang, Alfred M. Bruckstein, Robert J. Holt, and Arun N. Netravali, "Uniqueness of 3D Pose Under Weak Perspective: A Geometrical Proof", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.
Prior single camera systems however have used laser triangulation which involves expensive specialized sensors, must be rigidly packaged to maintain geometric relationships, require sophisticated inter-tool calibration methods and tend to be susceptible to damage or misalignment when operating in industrial environments.

No. 1-9 COMPLAINT for PATENT INFRINGEMENT filed with Jury Demand against ABB Inc. ( Filing fee $ 402, ...

Document RoboticVISIONTech, Inc. v. ABB Inc., 1:22-cv-01257, No. 1-9 (D.Del. Sep. 22, 2022)
The calibration produces three types of data - intrinsic, extrinsic and hand-eye calibration data.” ABB User Manual, p. 74.
US 6,816,755 – Claim features:
8.i.a: a) the camera intrinsic parameters;
8.i.b: b) the position of the camera relative to the tool of the robot (“hand-eye” calibration);
8.ii: ii) teaching the object features by
8.ii.a: a) putting the object in the field of view of the camera and capturing an image of the object;
8.ii.b: b) selecting at least 6 visible features from the image;
A figure on page 171 of the ABB User Manual (reproduced below) shows a software image in which 9 features are visible on an object.
The robot can then properly initiate the next sequence in the tasks given to it and accurately determine where the part being used is.” ABB User Manual, p. 172.
All subsequent feature tools will then be aligned to the part by being positioned in the fixture coordinate system.” ABB User Manual, p. 91.
US 6,816,755 – Claim features:
transformation between the “Object Space” and “Camera Space”;
8.iii.d: d) using the said transformation to calculate the movement of the robot to position the camera so that it appears orthogonal to the object;
Features of Accused Product: “…the camera TCP is located at the origin of the calibration target, at the working distance and perpendicular to the calibration target.” ABB User Manual, p. 109.
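The calibration data types discussed in these claim chart excerpts (robot pose, hand-eye calibration, camera-to-object pose) chain together as homogeneous transforms. A minimal sketch of that composition, with illustrative names and values of my own rather than code from ABB or the patent:

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Illustrative transforms (identity rotations, simple offsets):
base_T_tool = make_transform(np.eye(3), [0.5, 0.0, 1.0])  # from robot kinematics
tool_T_cam = make_transform(np.eye(3), [0.0, 0.0, 0.1])   # hand-eye calibration result
cam_T_obj = make_transform(np.eye(3), [0.0, 0.2, 0.6])    # pose estimate from the image

# Chaining gives the object pose in the robot base frame:
base_T_obj = base_T_tool @ tool_T_cam @ cam_T_obj
```

Once the hand-eye transform is fixed by calibration, the object pose in the base frame follows from the current robot pose and the camera's pose estimate alone.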

1014 Exhibit: EX1014 Petitioners Stipulation

Document IPR2023-01426, No. 1014 Exhibit - EX1014 Petitioners Stipulation (P.T.A.B. Jan. 19, 2024)
As in Sand Revolution, Petitioner stipulates that if the above-captioned inter partes review is instituted, the Petitioner will not pursue the same invalidity grounds presented in the Petition in the pending district court litigation.
Dated: January 19, 2024 Respectfully submitted,

2013 Exhibit: EX2013 Roth 2003

Document IPR2023-01426, No. 2013 Exhibit - EX2013 Roth 2003 (P.T.A.B. Jan. 4, 2024)
When merging the ball positions, it is important to limit the standard deviation, σ_own,ball, to no less than the default minimum confidence value, SMALL-ERROR, to prevent it from becoming vanishingly small.
We compare the behavior of a robot team using our world model, constructed with both sensor and shared information.
Therefore, we consider the SEARCHSPIN behavior to provide an accurate estimate of how frequently the individual world model …
Our experimental results clearly show that sharing infor- mation about the state of the world with teammates helps robots to overcome the problem of partial observability when locating relevant objects in their environment.
Our approach, detailed in this paper, contributes a step towards meeting this challenge by providing a method that is applicable to any multi-robot team that operates un- der equivalent real-time and high latency communication conditions.

2014 Exhibit: EX2014 Zhao 2009

Document IPR2023-01426, No. 2014 Exhibit - EX2014 Zhao 2009 (P.T.A.B. Jan. 4, 2024)
Finally, experiments are presented to show the efficiency of the system, in which sets of automotive parts, such as axletrees, bearings, pistons and pins are grasped from the conveyor and assembled together precisely.
Here, we take into account the relationship between the time variation of the moments and the relative kinematic screw (v, ω), where v = [v_x, v_y, v_z] and ω = [ω_x, ω_y, ω_z] represent the translational and rotational velocity, respectively [10]
In addition, the accurate geometric parameters of industrial parts are known and the camera lens is calibrated ahead of time so the distance between the image plane and the conveyer is known too.
[12] Kwon Oh-Kyu, Sim Dong-Gyu, Park Rae-Hong, “Robust Hausdorff distance matching algorithms using Pyramidal structures”, Pattern Recognition, 34(7): 2005-2013, 2001
[13] Guangtao Zhao, Hong Qiao, Zhicai Ou, “A method for calibrating camera lens distortion with cross-ratio invariability in welding seam system”, IEEE Conference of Intelligent Robotics and Applications
[14] Shiu Y.C.
Further experiments show that (1) The success rate is 75 percent without visual feedback information; (2) …

2008 Exhibit: EX2008 Understanding Robot Coordinate Frames and Points SolisPLC

Document IPR2023-01426, No. 2008 Exhibit - EX2008 Understanding Robot Coordinate Frames and Points SolisPLC (P.T.A.B. Jan. 4, 2024)
We would intuitively know how to position ourselves and the tool using our hands to perform the required operation on the correct part of the workpiece.
World Frame
Further details around these conventions won't get covered in this tutorial, but the main idea to understand is that by using the six indicators we learned about (X, Y, Z, A, B, C), you can now define the location and orientation of an object in space.
An example would be when using a ceiling-mounted robot or if the robot is mounted slightly offset in its position but when the direction of operation needs to stay aligned with a specific coordinate axis.
If we know at the concept stage that the workpiece might get offset in this application, we will design the robot cell with sensing capabilities to locate this corner.
Once found, the robot could continue the welding application as usual, but the offset now has been dealt with without having to modify all points individually.
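The six indicators described in this excerpt (X, Y, Z plus rotation angles A, B, C) can be packed into a single 4x4 homogeneous transform. A sketch assuming a ZYX Euler convention (A about Z, B about Y, C about X); the actual convention varies by robot vendor, so this is illustrative only:

```python
import numpy as np

def rot_x(ang):
    c, s = np.cos(ang), np.sin(ang)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(ang):
    c, s = np.cos(ang), np.sin(ang)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(ang):
    c, s = np.cos(ang), np.sin(ang)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def pose_to_transform(x, y, z, A, B, C):
    """Assemble a 4x4 homogeneous transform from a position (x, y, z)
    and orientation angles (A, B, C) in radians, applied as
    Rz(A) @ Ry(B) @ Rx(C). The ZYX convention is an assumption here:
    real controllers differ in both order and units."""
    T = np.eye(4)
    T[:3, :3] = rot_z(A) @ rot_y(B) @ rot_x(C)
    T[:3, 3] = [x, y, z]
    return T
```

Applying such a transform to points expressed in the object frame yields their world-frame coordinates, which is exactly how a taught point survives a workpiece offset once the frame is re-located.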

2003 Exhibit: EX2003 20231215 Invalidity Contentions

Document IPR2023-01426, No. 2003 Exhibit - EX2003 20231215 Invalidity Contentions (P.T.A.B. Jan. 4, 2024)
ABB further relies on and incorporates by reference, as if originally set forth herein, all invalidity positions, and all associated prior art and claim charts, disclosed to RVT in the course of any lawsuits or legal proceedings involving any of the Patents-in-Suit, or by potential or actual licensees to any of the Patents-in-Suit.
To the extent that such an issue arises, ABB reserves the right to identify other references that, inter alia, would have made the addition of the allegedly missing limitation to the disclosed device or method obvious.
Furthermore, an unknown scale factor in image sampling may also need to be recovered, because scan lines are typically resampled in the frame grabber, and so picture cells do not correspond to discrete sensing elements.”); see also id. at 2–12 (describing steps for calibration); see also Wang at 1 (“Vision sensors capable of finding the position and orientation of an object are mounted on a robot manipulator to enhance tracking accuracy, versatile robot-operator communication, and intelligent task control.
The test for whether a specification adequately describes an invention is “whether the disclosure of the application relied upon reasonably conveys to those skilled in the art that the inventor had possession of the claimed subject matter as of the filing date.
There is no support in the specification for determining a camera space-to-tool space transformation using a single image and one feature, and based on this lack of disclosure, one of ordinary skill in the art would not be able to carry out the claimed method without undue experimentation.

2007 Exhibit: EX2007 Choi 1999

Document IPR2023-01426, No. 2007 Exhibit - EX2007 Choi 1999 (P.T.A.B. Jan. 4, 2024)
Incorporation of various types of sensors increases the degrees of autonomy and intelligence of mobile robots (mobots) in perceiving surroundings, which at the same time imposes a large computational burden on data processing.
This paper proposes digital image processing schemes for map- building and localization of a mobot using a monocular vision system and a single ultrasonic sensor in indoor environments.
The proposed algorithms were implemented, and the mobot was able to localize itself in an allowed position error range and to locate dynamic obstacles moving reasonably fast inside a building.
The localization includes a camera calibration process which enables the mobot to measure the range in a single image frame, and utilizes linear structural features in indoor environment such as doors and corridor lines.
Then, the inverse-mapping can be written as [X, Y, 1]ᵀ ∝ A⁻¹ [x, y, 1]ᵀ with A⁻¹ = [[a, b, c], [d, e, f], [g, h, i]]. In order to find the elements of A⁻¹, the inverse-mapping relation can also be expressed in the form

gxX + hyX + iX − ax − by − c = 0
gxY + hyY + iY − dx − ey − f = 0

Since there are two equations with nine unknowns, at least five pairs of (x, y) and (X, Y) can determine a unique A⁻¹.
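The two linear constraints per point pair quoted above can be stacked and solved for the elements of A⁻¹ as a homogeneous least-squares problem. A sketch using the SVD null-space solution; the function and variable names are illustrative, not from the paper:

```python
import numpy as np

def fit_inverse_mapping(img_pts, world_pts):
    """Estimate the 3x3 projective mapping (standing in for A^-1) from
    point pairs (x, y) <-> (X, Y) by stacking two linear constraints per
    pair and taking the singular vector of the smallest singular value."""
    rows = []
    for (x, y), (X, Y) in zip(img_pts, world_pts):
        # g x X + h y X + i X - a x - b y - c = 0
        rows.append([-x, -y, -1, 0, 0, 0, x * X, y * X, X])
        # g x Y + h y Y + i Y - d x - e y - f = 0
        rows.append([0, 0, 0, -x, -y, -1, x * Y, y * Y, Y])
    M = np.asarray(rows, dtype=float)
    _, _, Vt = np.linalg.svd(M)
    H = Vt[-1].reshape(3, 3)  # null-space vector, up to scale
    return H / H[2, 2]        # fix the free scale factor

def apply_mapping(H, x, y):
    """Map an image point (x, y) through H to world coordinates (X, Y)."""
    X, Y, w = H @ np.array([x, y, 1.0])
    return X / w, Y / w
```

With noisy correspondences the same routine still applies: extra point pairs simply overdetermine the system and the SVD returns the least-squares solution.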

2004 Exhibit: EX2004 Habibi US6816755

Document IPR2023-01426, No. 2004 Exhibit - EX2004 Habibi US6816755 (P.T.A.B. Jan. 4, 2024)
Yuncai Liu, Thomas S. Huang and Olivier D. Faugeras, "Determination of Camera Location from 2-D to 3-D Line and Point Correspondences”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.
Chien-Ping Lu, Gregory D. Hager and Eric Mjolsness, “Fast and Globally Convergent Pose Estimation from Video Images”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.
Prior single camera systems however have used laser triangulation which involves specialized sensors, must be rigidly packaged to maintain geometric relationships, require sophisticated inter-tool calibration methods and tend to be susceptible to damage or misalignment when operating in industrial environments.
3D pose estimation from lines correspondence (in which case selected features will be edges) as described in “Determination of Camera Location from 2D to 3D Line and Point Correspondences” by Yuncai Liu, Thomas S. Huang, Olivier D. Faugeras,
c. pose estimation using “orthogonal iteration” described in “Fast and Globally Convergent Pose Estimation from Video Images” by Chien-Ping Lu, Gregory D. Hager, Eric Mjolsness,
d. approximate object location under weak perspective conditions as demonstrated in “Uniqueness of 3D Pose Under Weak Perspective: A Geometric Proof” by Thomas Huang, Alfred Bruckstein, Robert Holt, Arun Netravali;
e. approximate object location using Direct Linear Transformation (DLT) as described in “An investigation on the accuracy of three-dimensional space reconstruction using Direct Linear Transformation techniques” by Chen, Armstrong, Raftopoulos.
As will be apparent to those skilled in the art in the light of the foregoing disclosure, many alterations and modifications are possible in the practice of this invention without departing from the spirit or scope thereof.

2001 Exhibit: EX2001 Kurfess Declaration

Document IPR2023-01426, No. 2001 Exhibit - EX2001 Kurfess Declaration (P.T.A.B. Jan. 4, 2024)
In this position, I had responsibility for engaging the Federal sector and the greater scientific community to identify possible areas for policy actions related to manufacturing.
Case IPR2023-01426 U.S. Patent No. 8,095,237 Exhibit No. 2013 Description Roth, M., et al., “A Real-time World Model for Multi-Robot Teams with High-Latency Communication,” Proceedings of the 2003 IEEE/RSJ International Conference on Intelligent Robots and Systems (2003), pp. 2494-99.
While both techniques might be used to accomplish similar tasks, the inventors of the ’237 patent recognized that use of an object space can help to “provide the accuracy and repeatability required by industrial applications.”
As one example, Corke provides an application of “visual servoing” in which “[t]hree robot DOF [(degrees of freedom)] are controlled by image features so as to keep the camera at a constant height vertically above a target rotating on a turntable,” as illustrated in Corke’s Figure 8.15.
The use of a world coordinate system is also beneficial for numerous manufacturing applications where objects remain stationary and are manipulated Case IPR2023-01426 U.S. Patent No. 8,095,237 within a defined space.

2012 Exhibit: EX2012 Xi 2013

Document IPR2023-01426, No. 2012 Exhibit - EX2012 Xi 2013 (P.T.A.B. Jan. 4, 2024)
As shown in Fig. 1b, this method uses a rivet gun the size of a regular hand-held power tool: very compact and light, operating at much lower pressure (less than 100 psi), and very safe and energy efficient.
Figure 9 shows a test result of the vibration experiment conducted to establish an empirical relation between the triggering frequency and the supply air pressure.
The first part is to carry out a continuous measurement of tHj based on the afore-mentioned position sensing system; the goal is to keep track of the tool pose in the course of insertion.
Inman J, Carbrey B, Calawa R et al (1996) Flexible development system for automated aircraft assembly, SAE aerospace auto- mated fastening conference and exposition, October 1996.
Monsarrat B, Lavoie E, Cote G et al (2007) High performance robotized assembly system for challenger 300 business jet nose fuse panels, AeroTech 2007.

2011 Exhibit: EX2011 Liu 2016

Document IPR2023-01426, No. 2011 Exhibit - EX2011 Liu 2016 (P.T.A.B. Jan. 4, 2024)
A minor error introduced by an imprecise coordinate transformation could cause problems such as the failure of image matching and track breaking [1].
This method rotates three single axes of the robot to calculate the normal vectors in three directions, combined with the data of the calibration sensor.
To evaluate the accuracy distribution of the robot for different areas of the working range, a new set of testing data are utilized in a demonstration experiment.
Conclusions This paper proposes a simple method of coordinate transformation in a multi-sensor combination measurement system for use in the field of industrial robot calibration.
Acknowledgments: This research was supported by the Natural Science Foundation of China (NSFC) No. 51275350 and the Tianjin Program for Strengthening Marine Technology, No. KJXH201408.

2002 Exhibit: EX2002 Comparison of Petition to Hutchinson Declaration

Document IPR2023-01426, No. 2002 Exhibit - EX2002 Comparison of Petition to Hutchinson Declaration (P.T.A.B. Jan. 4, 2024)
I further understand that objective indicia of nonobviousness include failure of others, copying, unexpected results, information that “teaches away” from the claimed subject matter, perception in the industry, commercial success, and long-felt but unmet need.
I understand that the USPTO will look to the specification and prosecution history to see if there is a definition for a given claim term, and if not, will apply the ordinary and customary meaning from the perspective of a POSITA at the time in which the alleged invention was made.
Many machine vision systems that existed as of the priority date had the ability to determine the 3D pose of an object using a single camera mounted on a moveable part of the robot.
I understand that Patent Owner distinguished Parker because it, like Wei-II, disclosed a two-camera system while “Applicants’ claims are directed to methods and apparatus that employ single camera three- dimensional (3-D) vision for robotic guidance.” EX1002, p.588.
EX1004, p.146; EX1003, ¶¶124-127. Corke explains that four images of the SHAPE calibration target were used to compute the “calibration matrix”—a mathematical object that “contains information about the position of the camera with respect to the world coordinate frame”—i.e., the extrinsic parameters.