`
`
`
`
`Case IPR2021-00742
`Patent 10,594,854
`
`
`UNITED STATES PATENT AND TRADEMARK OFFICE
`____________
`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`____________
`
`GOOGLE LLC,
`Petitioner,
`
`v.
`
MIRA ADVANCED TECHNOLOGY SYSTEMS INC.,
`Patent Owner.
`____________
`
`Case IPR2021-00742
`Patent 10,594,854
`____________
`
`
`EXHIBIT 1003
`
`DECLARATION OF CHRISTOPHER SCHMANDT
`
`
`
`
`
`
`
`
`
`Mail Stop “PATENT BOARD”
`Patent Trial and Appeal Board
`U.S. Patent & Trademark Office
`P.O. Box 1450
`Alexandria, VA 22313-1450
`
`
`
`
`
`
`TABLE OF CONTENTS
`
`Page
`
I. INTRODUCTION ......................................................................................... 1
`
`II. QUALIFICATIONS ...................................................................................... 2
`
`III. BASES OF OPINIONS ................................................................................. 9
`
`IV. APPLICABLE LEGAL STANDARDS .....................................................11
`
`A. Ordinary Skill In The Art ................................................................ 11
`
B. Claim Construction ........................................................................... 12
`
`C. Anticipation (35 U.S.C. § 102) .......................................................... 13
`
`D. Obviousness (35 U.S.C. § 103) .......................................................... 13
`
V. LEVEL OF ORDINARY SKILL IN THE ART ......................................17
`
`VI. THE ’854 PATENT .....................................................................................18
`
`A. Overview ............................................................................................. 18
`
`VII. THE STATE OF THE ART OF LOCATION-BASED REMINDERS
`AT THE TIME OF THE INVENTION ....................................................24
`
`A. Claims Of The ’854 Patent ............................................................... 38
`
`1.
`
`2.
`
`3.
`
`4.
`
`5.
`
`6.
`
`Claim 1 of the ’854 Patent ...................................................... 38
`
`Claim 2 of the ’854 Patent ...................................................... 43
`
`Claim 3 of the ’854 Patent ...................................................... 44
`
`Claim 4 of the ’854 Patent ...................................................... 44
`
`Claim 5 of the ’854 Patent ...................................................... 44
`
`Claim 6 of the ’854 Patent ...................................................... 45
`
`
`
`
`
`
`7.
`
`Claim 7 of the ’854 Patent ...................................................... 45
`
`VIII. UNPATENTABILITY ANALYSIS ...........................................................45
`
`A. Overview Of The Prior Art .............................................................. 45
`
`1.
`
`2.
`
`3.
`
`Dunton ...................................................................................... 45
`
`Barchi ....................................................................................... 49
`
`Bedingfield ............................................................................... 50
`
`B. Ground 1: Claims 1-7 Are Obvious Over Dunton, Barchi,
`And Bedingfield ................................................................................. 51
`
`1. Motivation to Combine ........................................................... 51
`
`2.
`
`Claim 1 ..................................................................................... 56
`
a. 1.pre.a: “A method for providing location-based notifications using (i) a mobile communication device of a user equipped with an on-board GPS device and (ii) a remote geo-code database accessible through a remote server” ............................................................................56

b. 1.pre.b: “the remote geo-code database storing contact information linked to geographical locations such that each stored set of GPS coordinates corresponding to a respective geographical location is mapped to a respective set of contact information of an entity located at the respective geographical location” .........................60

c. 1.pre.c: “the mobile device configured to be communicable to and from the remote server” .........62

d. 1.pre.d: “the mobile device configured to store and display a first collection of one or more viewable entries” ...........................................................62
`
`
`
`
`
`
e. 1.pre.e: “each said viewable entry configured to be linked with a respective geographical location” .........................................................................64

f. 1.pre.f: “each said viewable entry configured to store a location-denoting text denoting the respective geographical location and a respective set of GPS coordinates identifying the respective geographical location” .........................65

g. 1.pre.g: “each said viewable entry configured to store a respective reminder text denoting a respective task linked with the respective geographical location” ..................................................66

h. 1.pre.h: “each said viewable entry configured to have a respective set of user interfaces, which, when selectively displayed on the mobile device, enable the user, for the respective viewable entry, to at least (a) search for, through use of the remote geo-code database, the respective location-denoting text and the respective set of GPS coordinates and (b) view at least both the respective location-denoting text and the respective reminder text” .............................................67

i. 1.a: “the mobile device displaying a first set of one or more user interfaces enabling the user to input a first input text for the respective reminder text of a first viewable entry of the first collection so that the mobile device receives and stores the first input text as the respective reminder text of the first viewable entry subsequently viewable through the respective set of user interfaces thereof” ......................................71

j. 1.b: “the mobile device displaying a second set of one or more user interfaces included in the respective set of user interfaces of the first viewable entry, the second set of one or more user interfaces enabling the user to input text on contact information of an entity located at the respective geographical location of the first viewable entry in using the user-inputted contact information to acquire both the respective location-denoting text and the respective set of GPS coordinates of the first viewable entry through use of the remote geo-code database, the second set of one or more user interfaces including at least a first user interface element enabling the user to input a second input text for searching against a first set of one or more data fields of contact information of an entity located at the respective geographical location of the first viewable entry” .............................................................73
`
k. 1.c: “the mobile device sending to the remote server a search request including the second input text and indicating a search criterion of the second input text being used to search against the first set of one or more data fields of contact information of an entity, the search request requesting for searching for, based on the search criterion, at least one result entity meeting the search criterion” ......................................78

l. 1.d: “the mobile device receiving from the remote server a set of result data of a first result entity including a respective set of contact information of the first result entity and a respective set of GPS coordinates of the first result entity identifying a respective geographical location where the first result entity is located, as a result of the remote server, upon receiving from the mobile device the search request, performing a search operation against the remote geo-code database based on the search criterion and retrieving from the remote geo-code database, as a result of the search operation, the set of result data of the first result entity” ...................................................80

m. 1.e: “the mobile device setting and storing a first subset of the received respective set of contact information of the first result entity and the received respective set of GPS coordinates of the first result entity, as the respective location-denoting text of the first viewable entry and the respective set of GPS coordinates of the first viewable entry, respectively” ...............................81

n. 1.f: “the mobile device displaying an indication indicating a presence of the respective reminder text of the first viewable entry to remind the user of performing of the respective task denoted by the respective reminder text when a set of contemporaneous GPS coordinates of the mobile device corresponding to a contemporaneous geographical location of the mobile device, as captured by the on-board GPS device of the mobile device, corresponds with the stored respective set of GPS coordinates of the first viewable entry” ...............................................83

o. 1.g: “wherein the second set of user interfaces include at least a first user interface enabling the user to input a set of one or more identifier values for a respective set of one or more data fields of contact information of an entity located at the respective geographical location of the first viewable entry, in uniquely identifying an entity located at the respective geographical location of the first viewable entry through use of the remote geo-database” ........................................86

p. 1.h: “wherein the set of one or more identifier values for the respective set of one or more data fields of contact information of an entity located at the respective geographical location of the first viewable entry, is calculated to be used as unique identifier information to uniquely identify an entity located at the respective geographical location of the first viewable entry through use of the remote geo-code database” ..........88
`
`3.
`
`4.
`
`5.
`
`6.
`
`7.
`
`8.
`
`Claim 2 ..................................................................................... 89
`
`Claim 3 ..................................................................................... 92
`
`Claim 4 ..................................................................................... 94
`
`Claim 5 ..................................................................................... 95
`
`Claim 6 ..................................................................................... 96
`
`Claim 7 ..................................................................................... 98
`
C. Secondary Considerations Of Non-obviousness ........................... 100
`
`IX. CONCLUSION ..........................................................................................100
`
`
`
`
`
`
I. INTRODUCTION
`
`1. My name is Christopher Schmandt. I have been retained as an expert
`
`witness to provide my independent opinion regarding matters at issue in the above-
`
`captioned inter partes review of U.S. Pat. No. 10,594,854 (the “’854 Patent”),
`
`entitled “Location Based Personal Organizer.” I have been retained by Google
`
`LLC (“Google”), the Petitioner, in the above proceeding.
`
`2.
`
`Unless otherwise noted, the statements made herein are based on my
`
`personal knowledge, and if called to testify about this declaration, I could and
`
`would do so competently and truthfully.
`
`3.
`
`A detailed record of my professional qualifications, including cases in
`
`which I have served as an expert, is being submitted herewith as Exhibit 1004 and
`
`is summarized in Section II, infra. My work on this case is being billed at my
`
`normal hourly rate, with reimbursement for actual expenses.
`
`4.
`
`I am not a legal expert and offer no opinions on the law. However, I
`
`have been informed by counsel of the various legal standards that apply, and I have
`
`applied those standards in arriving at my conclusions.
`
`5.
`
`I have reviewed and am familiar with the specification of the ’854
`
`Patent, issued on March 17, 2020. I understand that the ’854 Patent has been
`
`provided as Ex. 1001.
`
`1
`
`Google, Exhibit 1003
`IPR2022-00742
`Page 8 of 108
`
`
`
`
`
`6.
`
`formats:
`
`7.
`
`For clarity to the reader, my citations will generally adhere to these
`
`Patents will be cited by exhibit number and specific column and line
`
`numbers. Example: Ex. 1008 [Barchi] 1:10-12 refers to the Barchi patent at
`
`column 1, lines 10-12.
`
`8.
`
`Published patent applications will be cited by exhibit number and
`
`paragraph numbers. Example: Ex. 1006 [Dunton] ¶ [0001] refers to the Dunton
`
`published patent application at paragraph [0001].
`
`9.
`
`I will generally cite to the specification of the challenged patent in the
`
`following format: Ex. 1001 [’854 Patent] 1:1-10. These exemplary citations
`
`reference the ’854 Patent specification at column 1, lines 1-10.
`
`10. Articles and other publications will be cited by their exhibit number
`
`and their original page number when feasible. Example: Ex. 1007 [Ludford] p. 889
`
`refers to the Ludford paper at original page 889 of the paper.
`
`II. QUALIFICATIONS
`
`11.
`
` I have recently retired from my position as a Principal Research
`
`Scientist at the Media Laboratory at the Massachusetts Institute of Technology
`
`(“M.I.T.”), after 40 years of employment by M.I.T. In that role, I also served as
`
`faculty for the M.I.T. Media Arts and Sciences academic program. I have more
`
`2
`
`Google, Exhibit 1003
`IPR2022-00742
`Page 9 of 108
`
`
`
`
`
`than 40 years of experience in the field of Media Technology and was a founder of
`
`the M.I.T. Media Laboratory.
`
`12.
`
`I received my Bachelor of Science degree in Electrical Engineering
`
`and Computer Science from M.I.T. in 1978, and my Master of Science degree in
`
`Visual Studies (Computer Graphics), also from M.I.T., in January 1980. I was
`
`employed at M.I.T. starting in 1980, initially at the Architecture Machine Group,
`
`which was an early computer graphics research lab. In 1985, I helped found the
`
`Media Laboratory and continued to work there until my retirement. I ran a research
`
`group most recently titled “Living Mobile.” My research spanned distributed
`
`communication and collaborative systems, with an emphasis on multi-media and
`
`mobile user interfaces. I have more than 70 published conference and journal
`
`papers and one book in these fields. I have received “best paper” awards from both
`
`ACM and IEEE conferences and am a member of the ACM CHI (Computer
`
`Human Interaction) Academy, a peer-awarded honor given to a handful of active
`
`researchers annually.
`
`13.
`
`In my faculty position, I taught courses and directly supervised
`
`student research and theses at the Bachelors, Masters, and Ph.D. level. I oversaw
`
`the masters and Ph.D. thesis programs for the entire Media Arts and Sciences
`
`academic program. I was the official Departmental coordinator for MIT’s
`
`Undergraduate Research Opportunities Program (UROP), through which the Lab
`
`3
`
`Google, Exhibit 1003
`IPR2022-00742
`Page 10 of 108
`
`
`
`
`
typically employed 150 undergraduate research assistants, for 35 years. I also served
`
`on, and at times chaired, the Departmental Intellectual Property committee. Based
`
`on my experience and qualifications, I have a solid understanding of the
`
`knowledge and perspective of a person of ordinary skill in this technical field
`
`(location-aware computing) since the 1980s.
`
`14. From the earliest days of my involvement, both at the Architecture
`
`Machine Group and subsequently at the Media Laboratory, at MIT, my work has
`
`been centered around multimedia computer mediated communication and
`
`corresponding user interfaces. In 1979 I co-authored “Put That There,” a
`
`pioneering multi-modal conversational system employing speech, graphics, and
`
`gesture. Several years later, in 1981, my Intelligent Ear project was one of the first
`
`graphical user interfaces to allow editing of digital audio voice recordings by
`
`means of a touch screen user interface. I worked with early audio and visual real
`
`time conferencing systems, doing acoustic and facial feature detection to enable
`
`minimal transmission bandwidth by creating surrogates at the remote sides of
`
`conference links.
`
`15.
`
`In part because of my work with speech understanding, many of my
`
`research projects included telephone components, initially wired and, in time,
`
`wireless. I built what was perhaps the first unified messaging system, allowing
`
`mixes of voice, text, and image in electronic messages, in 1984. Later my
`
`4
`
`Google, Exhibit 1003
`IPR2022-00742
`Page 11 of 108
`
`
`
`
`
`Phoneshell system allowed telephone access to many ordinary desktop utilities,
`
`including voice and text messages, calendar, contact list, and news and weather
`
information. Information could be spoken, or sent as a facsimile if it was an image. In the
`
`process of building these systems I also engineered protocol stacks for the ISDN
`
`digital telephony standards.
`
`16. As reliable digital wireless communication started to become
`
`available around the mid-1990s, many of the voice features were also implemented
`
`as text, including for example two-way alphanumeric pagers. Later audio pagers
`
`were used, and by early in the millennium many of these projects had transitioned
`
to mobile phones. Computer-mediated communications on mobile phones became

a dominant thrust of my work, leading me to rename my research lab to
`
`“Speech and Mobility” and later “Living Mobile.”
`
`17. By the mid-1990s, I was working with early mobile phones and
`
`building handheld multimedia devices, such as Voice Notes (1993), a voice note
`
`taker, NewComm (1995), a portable audio player for subscribed podcasts, with
`
`intelligent speech segment processing for smart “fast forward”, and the Audio
`
`Notebook (1997), wherein a user both recorded and took notes while at a lecture or
`
`meeting and could later touch the note page to access the associated audio. We
`
`added wearable communication devices in 1999 with Nomadic Radio, a mobile
`
`wireless audio communicator.
`
`5
`
`Google, Exhibit 1003
`IPR2022-00742
`Page 12 of 108
`
`
`
`
`
`18. The key concept behind the name change for my research group was a
`
`growing focus on computing and interaction beyond traditional desktop interfaces
`
and into the real world of daily life. Working with wireless and mobile devices
`
`requires sensitivity to the quantity and style of alerts to incoming communication.
`
`Even with wired telephony, speech (e.g., speech synthesis of electronic mail) is
`
`much slower than reading, and this led to a number of projects dealing with
`
`filtering of incoming messages. My CLUES system from 1996 built dynamic
`
`profiles (updated hourly) of email and voicemail priority by examining
`
`communication history, personal contact lists, and a user’s calendar. The Knothole
`
`system from 1999 applied the same substrate to variable overall alerting profiles
`
`depending on how a user was accessing the messages, ranging from desktop in the
`
`office or from a laptop, mobile or landline phone, text pager, audio pager, or
`
`satellite paging system; alerting changed, based on user preferences, for each
`
`connection type. The prioritization scheme was also built into the auditory
`
`messaging of Nomadic Radio.
`
`19. Working with mobile devices led to several decades of work in
`
`location-aware computing. We started in 1987 with a program, Direction
`
`Assistance (a bit of punning on Directory Assistance) wherein a user at a pay
`
`phone called into the system, identified their starting location by the phone number
`
`of the pay phone, and a destination address using the touch tone keypad; in
`
`6
`
`Google, Exhibit 1003
`IPR2022-00742
`Page 13 of 108
`
`
`
`
`
`response Direction Assistance used speech synthesis to recite a driving route. In
`
`this project we relied on reverse telephone number lookup with geo-coding to
`
`determine the user’s initial location, and our own map of Cambridge and Boston to
`
`determine the best route.
`
`20.
`
`In 1988, my students and I extended some of the concepts in Direction
`
`Assistance and built the first real time spoken driving directions system, installed
`
`in a car and including our own street database for Cambridge, MA. This system,
`
`Back Seat Driver, was rooted in concern for driver safety, and used velocity in
`
`addition to position to determine when to recite driving directions, when to
`
`collapse multiple driving steps into a single instruction if the car’s velocity so
`
`required, and enabled “coaching” which warned when to slow down due to an
`
`upcoming navigation event or change in road conditions.
`
`21. Later comMotion (1999) tracked location and preferred points of
`
`interest to combine mobile messaging with shopping on the run, thereby becoming
`
`a pioneering location-based reminder system. comMotion would learn locations
`
`frequented by the user, who carried a GPS and mobile computing equipment, and
`
`allow the user to add these to a personal contact list of points of interest, including
`
`a name (e.g., Star Market) and a class (grocery store). The user could add
`
`reminders, either by text or voice, for any location, or any class of locations (e.g., a
`
`grocery list for any food store). The geo-coded databases were stored in a server,
`
`7
`
`Google, Exhibit 1003
`IPR2022-00742
`Page 14 of 108
`
`
`
`
`
`but the mobile system tracked position in real time and would give an audio alert
`
`as the user was approaching a location with an attached reminder. In addition to the
`
`user, other authorized users (e.g., roommates) could add items to lists, and lists
`
`could be shared (in part because much of the system was server based).
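
For illustration only, the core runtime check in a reminder system of this kind can be sketched in a few lines of Python; the class, the function names, and the coordinates below are hypothetical examples of my choosing and are not the actual comMotion code:

    import math
    from dataclasses import dataclass

    @dataclass
    class Reminder:
        """A stored point of interest with an attached reminder (illustrative only)."""
        name: str        # e.g., "Star Market"
        category: str    # e.g., "grocery store"
        lat: float       # latitude of the location, in degrees
        lon: float       # longitude of the location, in degrees
        text: str        # the reminder itself, e.g., "pick up milk"

    def distance_m(lat1, lon1, lat2, lon2):
        """Great-circle (haversine) distance between two GPS fixes, in meters."""
        r = 6371000.0  # mean Earth radius in meters
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def due_reminders(cur_lat, cur_lon, reminders, radius_m=150.0):
        """Return the reminders whose locations lie within radius_m of the current fix."""
        return [r for r in reminders
                if distance_m(cur_lat, cur_lon, r.lat, r.lon) <= radius_m]

    # Example: the device approaches a store that has a reminder attached.
    store = Reminder("Star Market", "grocery store", 42.3736, -71.1190, "pick up milk")
    for hit in due_reminders(42.3741, -71.1185, [store]):
        print(f"Reminder near {hit.name}: {hit.text}")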
`
`22.
`
`In WatchMe (2004) we also tracked user location, but learned
`
`preferred routes and travel times. WatchMe was built into a wristwatch case in
`
`communication with a mobile phone, and it selected communication mode and
`
`alerting style based on inferring a user’s route and travel mode. It knew, for
`
`example, that the user was walking to work, which gave an ETA, but also would
`
`allow phone calls with contacts. On the other hand, if the user was about to enter
`
`the subway (which at the time had no mobile phone relay base stations)
`
`communication would be limited to text and transmitted when the user surfaced,
`
`and if the user were driving, calls would be diverted to voice mail.
`
`23. The Ringing In the Rain (2007) project used similar logic to alert
`
`mobile users on foot or bicycle to impending rain during summer thunderstorms,
`
`and Will You Help Me (2006) focused on advising walkers of personal safety en
`
`route based on police crime logs. These are both examples of personalized
`
`location-based services which relied on servers with geo-coded databases (weather
`
`radar in one case, and publicly available crime log data in the other). Another
`
`8
`
`Google, Exhibit 1003
`IPR2022-00742
`Page 15 of 108
`
`
`
`
`
personal safety project, Safe and Sound (2002), alerted a parent when a child
`
`exceeded a distance or left a “geo-fenced” bounding box.
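
As a simple illustration of the bounding-box test underlying such geo-fencing, the following short Python sketch checks whether a reported position has left a rectangular fence; the names and coordinates are hypothetical and are not drawn from the Safe and Sound system itself:

    from dataclasses import dataclass

    @dataclass
    class GeoFence:
        """A rectangular latitude/longitude bounding box (illustrative only)."""
        min_lat: float
        max_lat: float
        min_lon: float
        max_lon: float

        def contains(self, lat: float, lon: float) -> bool:
            """True if the given GPS fix falls inside the bounding box."""
            return (self.min_lat <= lat <= self.max_lat and
                    self.min_lon <= lon <= self.max_lon)

    def should_alert(fence: GeoFence, lat: float, lon: float) -> bool:
        """Alert when the reported position has left the fence."""
        return not fence.contains(lat, lon)

    # Example: a fence around a neighborhood; a fix outside it triggers an alert.
    fence = GeoFence(min_lat=42.360, max_lat=42.380, min_lon=-71.130, max_lon=-71.100)
    print(should_alert(fence, 42.395, -71.115))  # True -> outside the box, alert the parent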
`
`24.
`
`In addition to my own work, part of my responsibility as a principal
`
`investigator was to track and maintain contacts with industrial researchers and
`
`product developers. In addition to government funding (e.g., DARPA, ONR, and
`
`NSF) I relied on industrial funding through the Media Lab’s two research
`
`consortia, Digital Life and Things that Think. Many established international
`
`communication companies and mobile phone developers participated in frequent
`
`visits and brainstorming sessions with my research group. As a result, during this
`
`time, I was made aware of industrial trends and emerging standards as well as
`
forward-looking research agendas.
`
`25.
`
`In addition to MIT-centered research, I have also participated in and
`
`reviewed for a number of conferences for both the ACM and IEEE. I have had
`
`program committee, conference management, and conference chairing positions in
`
`the ACM CSCW, CHI, UIST, Mobicom, and Ubiquitous Computing conferences. I
`
have served on the editorial boards of several professional journals as well.
`
`III. BASES OF OPINIONS
`
`26. While conducting my analysis and forming my opinions, I have
`
`reviewed at least the materials listed below:
`
`9
`
`Google, Exhibit 1003
`IPR2022-00742
`Page 16 of 108
`
`
`
`
`
27. Ex. 1001 – U.S. Patent No. 10,594,854 (“’854 Patent”) and its
`
`file history (Ex. 1002);
`
`28. Ex. 1006 – U.S. Pub. No. 2006/0061488 (“Dunton”);
`
`29. Ex. 1007 – “Because I Carry My Cell Phone Anyway: Functional
`
`Location-Based Reminder Applications,” Pamela J. Ludford, Dan Frankowski,
`
`Ken Reily, Kurt Wilms, and Loren Terveen, CHI ’06: Proceedings of the SIGCHI
`
`Conference on Human Factors in Computing Systems, ACM, April 22, 2006,
`
`pages 889–898, (“Ludford”);
`
`30. Ex. 1008 – U.S. Pat. No. 7,187,932 to Barchi (“Barchi”);
`
31. Ex. 1009 – U.S. Pub. No. 2004/0260604 (“Bedingfield”);
`
`32. Ex. 1010 – U.S. Pub. No. 2001/0005171 (“Farringdon”);
`
33. Ex. 1011 – U.S. Pub. No. 2004/0108375 (“Maillard”);
`
`34. Ex. 1012 – U.S. Pub. No. 2004/0081120 (“Chaskar”);
`
`35. Ex. 1013 – U.S. Pub. No. 2006/0225076 (“Longobardi”);
`
`36. Ex. 1014 – “Shopper’s Eye: Using Location-based Filtering for
`
`Shopping Agent in the Physical World,” Andrew E. Fano, AGENTS ’98:
`
`Proceedings of the second international conference on Autonomous agents, ACM,
`
`May 1998, pages 416–421, (“Fano”);
`
`37. Ex. 1015 – “Exploiting Online Sources to Accurately Geocode
`
`Addresses,” Rahul Bakshi, Craig A. Knoblock, and Snehal Thakkar, GIS ’04:
`
`10
`
`Google, Exhibit 1003
`IPR2022-00742
`Page 17 of 108
`
`
`
`
`
`Proceedings of the 12th annual ACM international workshop on Geographic
`
`information systems, ACM, November 2004, pages 194–203, (“Bakshi”);
`
`38. Ex. 1016 – “A Probabilistic Geocoding System based on a National
`
`Address File,” Peter Christen, Tim Churches, and Alan Willmore, Data Mining:
`
`Theory, Methodology, Techniques, and Applications, ACM, January 2006, pages
`
`130–145, (“Christen”);
`
`39. Ex. 1017 – Complaint in Mira Advanced Technology Systems, Inc. v.
`
`Google LLC, No. 1:21-cv-00371, Dkt. No. 1 (E.D. Va. March 25, 2021), (“Mira
`
`Complaint”); and
`
`40. Ex. 1018 – “Baby Bells Ring In Online Yellow Pages”, Kaitlin
`
`Quistgaard, Wired Magazine, June 25, 1997 (available at
`
`https://web.archive.org/web/20080607061643/http:/www.wired.com/techbiz/media
`
`/news/1997/06/4714).
`
`IV. APPLICABLE LEGAL STANDARDS
`
`A. Ordinary Skill In The Art
`
`41. My opinions in this declaration are based on the understandings of a
`
`person of ordinary skill in the art, which I understand is sometimes referred to as
`
`an “ordinary artisan” or by the acronym “POSITA” (person of ordinary skill in the
`
`art) as of the time of the invention. I understand that the person of ordinary skill in
`
`the art is a hypothetical person who is presumed to have known the relevant art at
`
`11
`
`Google, Exhibit 1003
`IPR2022-00742
`Page 18 of 108
`
`
`
`
`
`the time of the invention. By “relevant,” I mean relevant to the challenged claims
`
`of the ’854 Patent.
`
`42.
`
`I understand that, in assessing the level of skill of a person of ordinary
`
`skill in the field, one should consider the type of problems encountered in the art,
`
`the prior solutions to those problems found in prior art references, the rapidity with
`
`which innovations are made, the sophistication of the technology, the levels of
`
`education and experience of persons working in the field, and my own experience
`
`working with those of skill in the art at the time of the invention.
`
`B. Claim Construction
`
`43.
`
`I understand that claims of a patent are generally interpreted according
`
`to their ordinary and customary meaning taking into consideration the so-called
`
`“intrinsic evidence” of the patent consisting of (1) the claim language; (2) the
`
`specification; and (3) the prosecution history.
`
`44.
`
`I understand that claim terms may sometimes be defined by the
`
`specification. For example, a claim term may be explicitly defined in the patent
`
`specification, or it may be implicitly defined through consistent usage in the
`
`specification. I also understand that the scope of claim terms may be limited by
`
`statements in the specification or prosecution history where the applicant clearly
`
`disavows or disclaims subject matter.
`
`
`
`12
`
`Google, Exhibit 1003
`IPR2022-00742
`Page 19 of 108
`
`
`
`
`
`C. Anticipation (35 U.S.C. § 102)
`
`45.
`
`I have been informed that to obtain a patent the claimed invention
`
`must have, as of the date of the invention, been novel over the prior art. I
`
`understand that an invention is not novel if it is anticipated by the prior art. I
`
`understand that an invention is anticipated if a single prior art reference discloses
`
`all the limitations or elements as set forth in the claim. I understand that in some
`
cases an element or limitation, while not explicitly or expressly disclosed in a prior
`
`art reference, may still be disclosed by the prior art reference if that element or
`
`limitation is inherent in the reference. I understand that an element or limitation is
`
`inherent if the nature of the reference is such that the element or limitation is
`
`necessarily present, or put another way, that it would have to follow from the
`
`disclosed system or method that the element or limitation must be present.
`
`D. Obviousness (35 U.S.C. § 103)
`
`46.
`
`I have been informed that to obtain a patent the claimed invention
`
`must have, as of the date of the invention, been nonobvious in view of the prior art
`
`in the field. I understand that an invention is obvious if the differences between the
`
`subject matter sought to be patented and the prior art are such that the subject
`
`matter as a whole would have been obvious at the time of the invention to a person
`
`having ordinary skill in the art.
`
`13
`
`Google, Exhibit 1003
`IPR2022-00742
`Page 20 of 108
`
`
`
`
`
`47.
`
`I have been informed that the following factors are to be evaluated to
`
`determine whether a Petitioner has met its burden of proof that a claimed invention
`
`is obvious:
`
`48. The scope and content of the prior art relied upon;
`
`49. The difference or differences, if any, between each claim of the patent
`
`and the prior art; and
`
`50. The level of ordinary skill in the art at the time the invention of the
`
`patent was made.
`
`51. Based on those factual inquiries, it is determined whether the claimed
`
`subject matter would have been obvious to one of ordinary skill in the art at the
`
`time the alleged invention was made.
`
`52.
`
`I understand that various objective indicia can be evidence of
`
`nonobviousness, including, for example: (1) commercial success of the invention;
`
`(2) a long-felt need for the invention; (3) copying by others of the subject matter of
`
the claimed invention; (4) failure by others to find the solution provided by the
`
`invention; (5) skepticism or expressions of surprise from experts and those skilled
`
`in the art; (6) unexpected results of the claimed invention; (7) acceptance of others
`
`and industry praise; and (8) licensing of the patents.
`
`53.
`
`I also understand that “obviousness” is a legal conclusion based on the
`
`underlying factual issues of the scope and content of the prior art, the differences
`
`14
`
`Google, Exhibit 1003
`IPR2022-00742
`Page 21 of 108
`
`
`
`
`
`between the claimed invention and the prior art, the level of ordinary skill in the
`
art, and any objective indicia of non-obviousness. For that reason, I am not
`
`rendering a legal opinion on the ultimate legal question of obviousness. Rather, my
`
`testimony addresses the underlying facts and factual analysis that would support a
`
`legal conclusion of obviousness or non-obviousness, and when I use the term
`
`obvious, I am referring to the perspective of one of ordinary skill at the time of
`
`invention.
`
`54.
`
`I understand that to prove that prior art or a combination of prior art
`
`renders a patent obvious, it is necessary to: (1) identify the prior art references, that
`
`singly or in combination, render the patent obvious; (2) specifically identify which
`
`li