UNITED STATES PATENT AND TRADEMARK OFFICE

––––––––––––––

BEFORE THE PATENT TRIAL AND APPEAL BOARD

––––––––––––––

Bank of America, N.A.,
Petitioner

v.

Nant Holdings IP, LLC,
Patent Owner

––––––––––––––

Case No. IPR2021-01080
U.S. Patent No. 8,463,030

––––––––––––––

DECLARATION OF PROFESSOR CHANDRAJIT BAJAJ, PH.D. IN
SUPPORT OF PATENT OWNER’S PRELIMINARY RESPONSE TO
PETITION FOR INTER PARTES REVIEW OF U.S. PATENT NO. 8,463,030

Mail Stop: Patent Board
Patent Trial and Appeal Board
United States Patent and Trademark Office
P.O. Box 1450
Alexandria, VA 22313-1450

I, Chandrajit Bajaj, declare as follows:

I. QUALIFICATIONS AND BACKGROUND

1. I am currently employed as a Professor of Computer Science at the University of Texas at Austin (“UT Austin”). I hold the Computational Applied Mathematics endowed Chair in Visualization. I am also the Director of the Computational Visualization Center at UT Austin, which has been funded by the National Institutes of Health, the National Science Foundation, the Department of Energy, and the Department of Defense. The center’s personnel include twelve researchers, scientists, post-graduate students, and staff.

2. My curriculum vitae (“CV”), a copy of which is included as Exhibit 2003 hereto, provides details on my education, experience, publications, and other qualifications. It includes a list of all publications I have authored in the previous 35 years.

3. I have a Bachelor of Technology degree in Electrical Engineering, which I obtained from the Indian Institute of Technology in Delhi (IITD) in 1980. I also have a Master of Science degree and a doctorate in Computer Science from Cornell University, received in 1983 and 1984, respectively.

4. Prior to my employment at the University of Texas (UT), I was an assistant professor, associate professor, and finally professor of Computer Sciences at Purdue University (Purdue) from 1984 until I resigned in 1997 and transferred to UT. During this time, I was also the Director of the Image Analysis and Visualization Center at Purdue University. I was a visiting associate professor of Computer Science at Cornell University from 1990-1991. I have also been invited for collaborative visits by several academic institutions and have presented numerous keynote presentations worldwide. I have been an editorial member of the SIAM Journal on Imaging Sciences and the ACM Transactions on Graphics, and I continue my editorial role for ACM Computing Surveys and the International Journal of Computational Geometry and Applications.

5. I have spent the better part of my career, both at Purdue and UT Austin, researching, designing, teaching, and using computer systems to model, simulate, search, and visualize natural and synthetic objects, combining computational image and geometry processing. I am knowledgeable about and have much experience in both the hardware and software, including algorithms, used for capturing, analyzing, and displaying imagery in real time while permitting user interaction.

6. In the 1970s, I majored in Electrical Engineering at the Indian Institute of Technology, with a minor in Computer Sciences. There, I was intimately involved in the design and fabrication of microprocessor-controlled circuits, including the development of microprocessor controller software. In the 1980s, while at Cornell University, these past experiences from my time at the Indian Institute of Technology led to research in computational geometry, processing, and optimization. In the early 1990s, I created 3D collaborative multimedia software environments which were fully searchable and navigable for multi-person computer gaming and simulation. In 1994, I co-authored a technical paper entitled “Shastra: Multimedia Collaborative Design Environment,” Vinod Anupam and Chandrajit L. Bajaj, IEEE Multimedia 1.2 (1994) 39, 39–49.

7. The increasing need for real-time computer graphics display realism without sacrificing interactivity also led me to explore fast and efficient image processing techniques such as feature selection, segmentation, detection, and texture mapping with data compression, as described in my publications “Image Segmentation Using Gradient Vector Diffusion and Region Merging,” “Detecting Circular and Rectangular Particles Based on Geometric Feature Detection in Electron Micrographs,” “Compression-Based 3D Texture Mapping for Real-Time Rendering,” and “3D RGB Image Compression for Interactive Applications.” During this time, I was also intimately involved with the development of a new synthetic-natural hybrid data compression MPEG (Motion Pictures Expert Group) standard. Additionally, I applied for and received a joint patent, “Encoding Images of 3-D Objects with Improved Rendering Time and Transmission Process,” August 2002, US Patent 6,438,266.

8. My work with encoding, transmitting, and reconstructing 3-D objects led me to explore image processing and geometric modeling techniques such as surface reconstruction from CT scans, point clouds, segmentation, and texture mapping with data compression, such as those described in my publications “Multi-Component Heart Reconstruction from Volumetric Imaging” and “Automatic Reconstruction of Surfaces and Scalar Fields from 3D Scans.”

9. In the mid-2000s, I began to create spatially-realistic 3D graphical environments of natural molecules and cells with a combination of different types of acquired and reconstructed imagery within which a user may explore, query, and learn. My publication titled “From Voxel Maps to Models,” which appeared in an Oxford University Press book titled Imaging Life: Biological Systems from Atoms to Tissues, c. 15 (Gary C. Howard, William E. Brown & Manfred Auer eds.) (2014), is an example of my research in computational imaging.

10. Over the course of my career, I have participated in the design and use of several computer systems, spanning handhelds, laptops, and graphics workstations to PC/Linux clusters, as well as very large memory supercomputers, for capturing, modeling, processing, and displaying acquired and simulated data of diverse scientific phenomena. My experience with computer modeling, imaging, computer graphics, and scientific visualization encompasses the use of interactive technologies, such as 3D pointers and mice, smart IR touch-screens, motion detection and gesture-based user interfaces, and multi-display walls, in many different fields and industrial settings, such as interactive games, medicine (e.g., molecular, biomedical, and industrial diagnostics), oil and gas exploration, geology, cosmology, and military industries.

11. During this time at UT Austin, I also designed and implemented scalable solutions for inverse problems in microscopy, spectroscopy, and biomedical imaging; for constructing spatially realistic and hierarchical 3D models; for developing search/scoring engines for predicting energetically favorable multi-molecular and cellular complexes; and for statistical analysis and interrogative visualization of neuronal form-function.

12. I have courtesy appointments and supervise M.S. and Ph.D. students from several UT Austin departments, including biomedical and electrical engineering, neurobiology, and mathematics. I currently serve on the editorial boards for the International Journal of Computational Geometry and Applications and the ACM Computing Surveys. Much of my work involves issues relating to interactive image processing, feature extraction, 3D modeling, bio-informatics, computer graphics, and computational visualization. Examples of my publications, including peer-reviewed publications, are listed in my CV.

13. I currently serve on the editorial boards for the International Journal of Computational Geometry and Applications and the ACM Computing Surveys. Much of my work involves issues relating to interactive image processing, 3D modeling, bio-informatics, computer graphics, and computational visualization. Examples of my publications, including peer-reviewed publications, are listed in my curriculum vitae.

14. As set forth in my CV, I have authored approximately 167 peer-reviewed journal articles, 34 book chapters (which were also peer reviewed), and 154 peer-reviewed conference publications.

15. I have written and edited four books on topics ranging from image processing, geometric modeling, and visualization techniques to algebraic geometry and its applications. I have given 165 invited keynote presentations. I am a Fellow of the American Association for the Advancement of Science, a Fellow of the Institute of Electrical and Electronics Engineers (IEEE), a Fellow of the Society for Industrial and Applied Mathematics (SIAM), and also a Fellow of the Association for Computing Machinery (ACM), which is the world’s largest educational and scientific computing society. ACM Fellow is ACM’s most prestigious member grade and recognizes the top 1% of ACM members for their outstanding accomplishments in computing and information technology and/or outstanding service to ACM and the larger computing community.

II. SCOPE OF WORK

16. I was asked by counsel for Patent Owner Nant Holdings IP, LLC (“Nant”) to review U.S. Patent No. 8,463,030 (the “’030 patent”) (Ex. 2001). I receive $700 per hour for my services. No part of my compensation is dependent on my opinions or on the outcome of this proceeding.

17. I also reviewed the present Petition for Inter Partes Review of U.S. Patent No. 8,463,030 (IPR2021-01080) and the exhibits and declarations associated therewith, including U.S. Patent No. 6,512,919 to Nobuo Ogasawara (“Ogasawara”) (Ex. 2004).

18. I was asked to provide my understanding of whether Ogasawara discloses the limitations of claim 1 of the ’030 patent, and in particular, the requirement of “an object identification platform configured to obtain the acquired data, recognize the object as a target object based on the acquired data, and determine object information associated with the target object.” In my opinion, it does not.

19. Ogasawara contains a passing reference to “[a]dvanced pattern recognition software” which may purportedly be used to “enhance the performance” of a “wireless videophone” and provide “the capability to capture merchandise information from items that are not identified by either a bar code or an alpha-numeric label.” Ogasawara at col. 23:12-31.

20. In these few lines in the last column before the claims, Ogasawara describes using such software to “allow[] a consumer to capture a videographic image of an apple, for example, and to have the apple be recognized,” and also notes this “capability is useful for any merchandise item having a distinct or identifiable shape or other visually identifiable characteristic.” Id. 23:16-22. Ogasawara does not provide any explanation whatsoever as to what this “advanced pattern recognition software” is or how it would operate, the algorithms it would use to recognize patterns, or any other details regarding how it could recognize a pattern.

III. LEGAL UNDERSTANDING OF ANTICIPATION

21. I understand that in order to anticipate a claimed invention, a prior art reference must disclose all elements of the claim within the four corners of the document and must disclose those elements in the same arrangement as the purportedly anticipated claim. Microsoft Corp. v. Biscotti Inc., 878 F.3d 1052, 1068 (Fed. Cir. 2017). I also understand that, under Federal Circuit precedent, a prior art reference cannot anticipate a claimed invention unless the allegedly anticipatory disclosure cited as prior art is enabled by the reference. In re Antor Media Corp., 689 F.3d 1282, 1288 (Fed. Cir. 2012).

IV. OVERVIEW OF THE ’030 PATENT

22. The ’030 patent discloses “technology and processes that can accommodate linking objects and images to information via a network such as the Internet” based on imaging performed by a mobile device and in a manner that “requires no modification to the linked object.” Ex. 2001 (’030 patent) 3:24-27. The image recognition algorithms disclosed in the ’030 patent allow for “fast and reliable detection and recognition of images and/or objects based on their visual appearance in an image, no matter whether shadows, reflections, partial obscuration, and variations in viewing geometry are present.” Id. 3:42-50. The specification of the ’030 patent identifies specific algorithms capable of performing the object image recognition steps of the invention using, among other things, steps involving the processes of segmentation, decomposition, and comparison. See, e.g., id. 6:3-12:23.

23. The ’030 patent explains that the invention provides for “a ‘decomposition’, in the Input Image Decomposition 34, of a high-resolution input image into several different types of quantifiable salient parameters” which “allows for multiple independent convergent search processes of the database to occur in parallel” and leads to improved “match robustness.” Id. 6:3-14. The ’030 patent provides that this “Input Image Decomposition process” may consist of the following individual steps:

[Image from the ’030 patent listing the Input Image Decomposition steps, reproduced in the original declaration.]

Id. 6:19-28. Each of these steps is described in detail in the body of the ’030 patent’s specification. See id. 6:30-41 (Radiometric Correction), 6:42-49 (Segmentation), 6:50-64 (Segment Group Generation), 6:65-7:4 (Bounding Box Generation), 7:5-12 (Geometric Normalization), 7:13-30 (Wavelet Decomposition), 7:31-55 (Color Cube Decomposition), 7:56-62 (Shape Decomposition), 7:63-8:5 (Low-Resolution Grayscale Image Generation).

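For illustration only: the ’030 patent describes these decomposition steps in prose and does not publish source code, so the following Python/NumPy sketch is my own minimal approximation of the kind of per-parameter decomposition the specification outlines. Every function name, threshold, and output size below is a hypothetical choice of mine, not the patent’s implementation.

```python
# Illustrative sketch only -- not the '030 patent's implementation. The patent describes
# the decomposition steps in prose (Ex. 2001, 6:30-8:5); all names, thresholds, and sizes
# below are hypothetical choices made purely to show the shape of such a pipeline.
import numpy as np

def radiometric_correction(img):
    """Normalize pixel intensities to [0, 1] (cf. 6:30-41)."""
    img = img.astype(float)
    return (img - img.min()) / (img.max() - img.min() + 1e-9)

def segment(img, threshold=0.5):
    """Crude foreground/background split by a global threshold (cf. 6:42-49)."""
    return img > threshold

def bounding_box(mask):
    """Smallest box enclosing the segmented foreground (cf. 6:65-7:4)."""
    rows, cols = np.any(mask, axis=1), np.any(mask, axis=0)
    r0, r1 = np.where(rows)[0][[0, -1]]
    c0, c1 = np.where(cols)[0][[0, -1]]
    return r0, r1, c0, c1

def geometric_normalization(img, box, size=64):
    """Crop to the bounding box and resample to a standard size (cf. 7:5-12)."""
    r0, r1, c0, c1 = box
    crop = img[r0:r1 + 1, c0:c1 + 1]
    ri = np.linspace(0, crop.shape[0] - 1, size).astype(int)
    ci = np.linspace(0, crop.shape[1] - 1, size).astype(int)
    return crop[np.ix_(ri, ci)]

def haar_wavelet_decomposition(img):
    """One level of a Haar-style split into approximation/detail bands (cf. 7:13-30)."""
    a, b = img[0::2, 0::2], img[0::2, 1::2]
    c, d = img[1::2, 0::2], img[1::2, 1::2]
    return ((a + b + c + d) / 4, (a - b + c - d) / 4,
            (a + b - c - d) / 4, (a - b - c + d) / 4)

def low_res_grayscale(img, factor=8):
    """Block-averaged low-resolution grayscale image (cf. 7:63-8:5)."""
    h, w = (img.shape[0] // factor) * factor, (img.shape[1] // factor) * factor
    blocks = img[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    image = rng.random((128, 128))          # stand-in for an acquired image
    corrected = radiometric_correction(image)
    normalized = geometric_normalization(corrected, bounding_box(segment(corrected)))
    bands = haar_wavelet_decomposition(normalized)
    thumbnail = low_res_grayscale(normalized)
    print(normalized.shape, bands[0].shape, thumbnail.shape)   # (64, 64) (32, 32) (8, 8)
```

Each decomposed output (wavelet bands, low-resolution grayscale, and so on) would then serve as one of the independent salient parameters that the patent compares against the database in parallel.
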
24. The ’030 patent then describes a process for comparing the outputs of each of these segmentation and decomposition steps to a database to produce “a match score” for that value, culminating in the calculation of a “combined match score” for a given object:

[Image from the ’030 patent depicting this comparison process, reproduced in the original declaration.]

25. The ’030 patent describes how each of these parameter comparisons may be performed. See id. 8:28-53 (comparison of Each Input Image Segment Group), (comparison for Each Database Object), 8:57-61 (comparison for Each View of this Object), 8:62-67 (comparison for Each Segment Group in this View of this Database Object), 9:1-45 (Shape Comparison), 9:46-10:7 (Grayscale Comparison), 10:8-38 (Wavelet Comparison), 10:39-64 (Color Cube Comparison). Ultimately, this comparison process results in “a normalized matching score” for each object image comparison, representing “independent assessments of the match of salient features of the input image to database images.” Id. 10:65-11:3. The specification also explains that, “[t]o minimize the effect of uncertainties in any single comparison process,” each of these independent assessments may be processed through a “root sum of squares relationship” (depicted below) in order to produce “a combined match score for an image.” Id. 11:3-14.

[Image from the ’030 patent depicting the root sum of squares relationship, reproduced in the original declaration.]

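Because the root sum of squares expression itself appears in the patent only as an image that is not reproduced above, the short sketch below assumes an unweighted combination of the four normalized comparison scores. It is offered purely to illustrate the general form of such a relationship, not as the patent’s exact formula.

```python
# Hypothetical illustration of a root-sum-of-squares combination; the '030 patent's exact
# expression is depicted in a figure not reproduced here, so an unweighted form is assumed.
import math

def combined_match_score(shape, grayscale, wavelet, color_cube):
    """Combine four normalized (0-1) comparison scores into one match score."""
    return math.sqrt(shape**2 + grayscale**2 + wavelet**2 + color_cube**2)

print(round(combined_match_score(0.9, 0.8, 0.85, 0.7), 3))   # 1.632
```
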
26. The ’030 patent also makes clear that the results of this image analysis process need not result in perfect one-to-one matches with image information already contained in the comparison database. Id. 11:35-55. Indeed, the specification explains that as an initial step, a feature parameter “carrying greatest weight from the input image” may be “compared first to find statistical matches and near-matches in all database records.” Id. 11:41-43. This results in a “normalized interim score (e.g., scaled value from zero to one, where one is perfect match and zero is not match).” Id. 11:43-45. These variable match scores are then compiled and assessed as part of a final “Combined Match Score evaluation,” which can be used to recognize an object in the image. Id. 11:64-67.

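To make the preceding description concrete, the sketch below walks a small in-memory stand-in for the database and ranks candidate objects by a combined evaluation of their normalized interim scores. The record names, score values, and selection rule are all invented for illustration; the patent describes the evaluation in prose (11:35-67) without prescribing these specifics.

```python
# Hypothetical sketch of a "Combined Match Score evaluation" over database records.
# The records and scores are invented for illustration only.
import math

# Normalized interim scores (0 = no match, 1 = perfect match) for each salient parameter,
# as might result from comparing one input image against each database record.
database_scores = {
    "coffee_mug": {"shape": 0.91, "grayscale": 0.84, "wavelet": 0.88, "color": 0.79},
    "apple":      {"shape": 0.42, "grayscale": 0.55, "wavelet": 0.47, "color": 0.61},
    "book_cover": {"shape": 0.67, "grayscale": 0.58, "wavelet": 0.62, "color": 0.54},
}

def combined(scores):
    """Root-sum-of-squares combination of the per-parameter interim scores."""
    return math.sqrt(sum(s * s for s in scores.values()))

best = max(database_scores, key=lambda name: combined(database_scores[name]))
print(best, round(combined(database_scores[best]), 3))   # coffee_mug 1.712
```
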
27. Once objects in the image have been identified using these comprehensive search, detection and recognition algorithmic approaches, the ’030 patent further discloses facilitating a transaction related to the object through a mobile device, by retrieving and delivering information related to the object to a user via a network connection. Id. 3:60-4:5. For example, the patent explains that once the “image is analyzed and the object or image of interest is detected and recognized,” the “network address of information corresponding to that object is transmitted” back to the mobile device, “allowing the mobile device to access information using the network address.” Id. 3:67-4:5.

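As a purely hypothetical illustration of this linking step (the patent describes it functionally at 3:60-4:5 without specifying a data structure or protocol), a server-side lookup that returns a network address for a recognized target object might look like the following; the mapping and URLs are invented.

```python
# Hypothetical sketch: once an object is recognized, return the network address of the
# information associated with it so the mobile device can retrieve that information.
# The table contents and URLs are invented; the '030 patent does not specify this structure.
OBJECT_INFO_ADDRESSES = {
    "coffee_mug": "https://example.com/products/coffee-mug",
    "book_cover": "https://example.com/catalog/book-12345",
}

def address_for(target_object):
    """Return the network address for a recognized target object, if one is known."""
    return OBJECT_INFO_ADDRESSES.get(target_object)

print(address_for("coffee_mug"))   # https://example.com/products/coffee-mug
```
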
V. OVERVIEW OF OGASAWARA

28. Ogasawara discloses “an electronic shopping system” which “facilitates purchase transaction via a wireless videophone.” Ex. 2004 at Abstract. In one embodiment, Ogasawara teaches a “store server 10 in communication with a commercial telephone network 14,” a “wireless telephone 18,” an “external” or “built-in” bar code scanner, and a “catalog 21 of the items which can be purchased” that “contains a bar code 22 for each such item.” Id. 4:66-5:48. The patent describes how a user can dial a store’s telephone number upon arrival to automatically download a “personal shopping application” that will allow for the scanning of product bar codes and permit the telephone to “facilitate[] purchase transactions.” Id. 6:5-57.

[Image reproduced from Ogasawara (Ex. 2004) in the original declaration.]

29. In a separate embodiment, Ogasawara replaces this wireless telephone with a “wireless videophone” that has “a digital camera 236 in place of a bar code scanner.” Id. 18:11-22. In the case of this “videophone” embodiment, Ogasawara explains that “the tailored purchase transaction program might additionally include character recognition and/or pattern recognition, as well as bar code decode, software.” Id. 18:15-19. In the absence of a discrete bar code reader, Ogasawara explains this software “would allow the wireless videophone to function in a manner similar to the wireless telephone and bar code scanner embodiment” described elsewhere. Id. 18:19-22. Specifically, it could be used “to either decode the bar code videographic data or to perform pattern recognition functions on a [sic] icon-like pattern captured by the digital video camera.” Id. 18:34-37.

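Ogasawara does not disclose any implementation of this behavior, so the following sketch is entirely hypothetical; it only illustrates the high-level dispatch that the quoted passage describes (decode a bar code from the captured videographic data, or else attempt pattern recognition on an icon-like pattern), with stub functions standing in for capabilities Ogasawara never explains.

```python
# Entirely hypothetical sketch of the dispatch Ogasawara describes at 18:34-37; the
# reference never explains how bar code decoding or pattern recognition would be
# implemented, so both are represented here only as stubs.
def try_decode_bar_code(frame):
    """Stub: attempt to decode a bar code from videographic data; None if not found."""
    return None  # Ogasawara provides no decoding algorithm.

def try_recognize_icon_pattern(frame):
    """Stub: attempt pattern recognition on an icon-like pattern; None if not recognized."""
    return None  # Ogasawara provides no recognition algorithm.

def identify_item(frame):
    """Decode the bar code if present, otherwise fall back to pattern recognition."""
    item = try_decode_bar_code(frame)
    if item is None:
        item = try_recognize_icon_pattern(frame)
    return item

print(identify_item(frame=b"raw videographic data"))   # None: both stubs are unimplemented
```
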
30. Finally, on the last page of its specification, Ogasawara provides a single paragraph suggesting that its “wireless videophone” embodiment could be enhanced through the use of “[a]dvanced pattern recognition software.” Id. 23:12-31. Ogasawara does not provide any detail as to what this software is, how it operates, or what its capabilities and limitations may or may not be. In fact, Ogasawara’s only description of this purported software comes by way of an “example” included in this paragraph:

    Advanced pattern recognition software allows a consumer to capture a
    videographic image of an apple, for example, and to have the apple be
    recognized as such by the pattern recognition software. This capability
    is useful for any merchandise item having a distinct or identifiable
    shape or other visually identifiable characteristic.

Id. 23:16-22.

VI. OGASAWARA DOES NOT DISCLOSE OR ENABLE THE ADVANCED OBJECT RECOGNITION CLAIMED BY THE ’030 PATENT

31. I understand that Petitioner contends the singular paragraph in Ogasawara referring to “[a]dvanced pattern recognition software” discloses claim 1 of the ’030 patent’s requirement of “an object identification platform configured to obtain the acquired data, recognize the object as a target object based on the acquired data, and determine object information associated with the target object.” Petition at 16-18, 32-36. I also understand that Petitioner has submitted a declaration from its expert, Dr. Jeffrey J. Rodriguez, in support of this claim. See Ex. 1003 at ¶¶ 70, 84-85, 127. In my opinion, both Petitioner and Dr. Rodriguez are incorrect.

32. A POSITA at the time of Ogasawara (in or around March 1999) would not have understood the cursory and undefined reference to “[a]dvanced pattern recognition software” in Ogasawara’s specification to disclose the kinds of advanced object image recognition disclosed and claimed by the ’030 patent. During that time period, a POSITA would have understood that image processing technology was not yet sophisticated enough to engage in true object recognition regardless of irregularities in the object, lighting, field of view, or viewing geometry, as described and claimed by the ’030 patent. As a result, in my opinion a POSITA would not understand Ogasawara to disclose this limitation of the ’030 patent within its four corners, and Ogasawara thus cannot be considered anticipatory of claim 1 of the ’030 patent.

33. The scant description of this purported “[a]dvanced pattern recognition software” in Ogasawara only confirms this point. Ogasawara’s only substantive description of this purported software is couched in terms of its limitations, noting that this software only functions when an object has a highly “distinct or identifiable shape or other visually identifiable characteristic.” Ex. 2004 at 23:16-22.

34. Even to the extent Ogasawara could be considered to disclose the object recognition techniques of the ’030 patent (and, to be clear, I disagree that it could be), I also understand that Ogasawara cannot be considered anticipatory because it fails to actually enable those advanced techniques. As discussed above, Ogasawara provides no explanation or guidance as to how any “[a]dvanced pattern recognition software” would work, how it could be implemented in the embodiments described in its specification, or what its capabilities or limitations would be beyond the need for a specific “distinct or identifiable shape or other visually identifiable characteristic.” In my opinion, a POSITA reading these disclosures would not understand them to enable the advanced object recognition techniques of claim 1 of the ’030 patent.

VII. CONCLUSION

35. I declare that the information contained in this declaration is true and accurate to the best of my knowledge. If called upon to testify, I would do so consistent with the statements and opinions contained in this declaration.

Dated: September 21, 2021

Chandrajit Bajaj, Ph.D.