`____________________
`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`____________________
`
`PAYPAL INC.,
`Petitioner,
`
`v.
`
`IOENGINE, LLC,
`Patent Owner.
`____________________
`
`IPR2019-00906
`
`Patent 9,059,969
`____________________
`
`DECLARATION OF AVIEL D. RUBIN, PH.D.
`
`
`Exhibit 2003
`
`
`
`
`
TABLE OF CONTENTS

I.    Introduction .......................................................................................................... 3

II.   Qualifications ........................................................................................................ 3

      A.  Education & Career .................................................................................... 4

      B.  Publications ................................................................................................ 7

III.  Person of Ordinary Skill in the Art ...................................................................... 7

IV.   Legal Framework ................................................................................................. 8

      A.  Standard of Proof ........................................................................................ 9

      B.  Anticipation ................................................................................................ 9

      C.  Obviousness .............................................................................................. 10

V.    The ’969 Patent ................................................................................................. 12

VI.   Asserted Grounds and References Relied On In Petitions ............................... 13

      A.  Abbott ....................................................................................................... 14

      B.  Shmueli ..................................................................................................... 16

VII.  Claim Construction ............................................................................................ 17

      A.  IUI ............................................................................................................. 17

      B.  “communications…to a communications network node through the terminal
          network communication interface”; “communication…through the terminal
          network interface to a communications network node”; and
          “communications through the communication node on the terminal and the
          terminal network interface to a communications network node” .............. 23

      C.  “[first/second/third/fourth/fifth] program code” ...................................... 25

VIII. Detailed Analysis ............................................................................................... 26

      A.  No Motivation to Combine Abbott and Shmueli ...................................... 26

      B.  Petitioner’s References Do Not Teach Numerous Elements of the
          Independent Challenged Claims ................................................................ 31

      C.  The Dependent Challenged Claims are Not Obvious ............................... 43
`
`
`
`2
`
`Exhibit 2003
`
`
`
`
`
`1.
`
`I, Aviel D. Rubin, make this declaration in connection with the
`
`proceedings identified above.
`
I.   INTRODUCTION
`
`2.
`
`I have been retained as an expert on behalf of IOENGINE, LLC
`
`(“IOENGINE”) in connection with the proceedings identified above. I submit this
`
`Declaration on behalf of IOENGINE in support of its Preliminary Response responding
`
`to the Petition for Inter Partes Review in Case IPR2019-00906 (the “Petition”) filed by
`
`PayPal Inc. (“PayPal” or “Petitioner”) challenging claims 1−22 and 24−29 (the
`
`“Challenged Claims”) of U.S. Patent No. 9,059,969 to McNulty (“the ’969 Patent”)
`
`(Ex. 1002).
`
`3.
`
`The opinions in this Declaration are based on my professional training and
`
`experience and my review of the exhibits discussed herein.
`
`4.
`
`I am being compensated for my services in connection with the
`
`proceedings identified above at my regular rate of $775 per hour (plus reimbursement
`
`for expenses). My compensation is in no way contingent on the outcome of the
`
`proceedings identified above or on any of the opinions I provide below.
`
`II. QUALIFICATIONS
`
`5.
`
`I have summarized in this section my educational background, career
`
`history, publications, and other relevant qualifications. My curriculum vitae is
`
submitted as Ex. 2062, and can also be found at http://avirubin.com/Avi_Rubins_home_page/Vita.html.
`
`3
`
`Exhibit 2003
`
`
`
`
`
`A. Education & Career
`
`6.
`
`I received my Ph.D. in Computer Science and Engineering from the
`
`University of Michigan, Ann Arbor, in 1994, with a specialty in computer security and
`
`cryptographic protocols. My thesis was titled “Nonmonotonic Cryptographic
`
`Protocols” and concerned authentication in long-running networking operations.
`
`7.
`
`I am currently employed as Professor of Computer Science at Johns
`
`Hopkins University, where I perform research, teach graduate courses in computer
`
`science and related subjects, and supervise the research of Ph.D. candidates and other
`
`students. Courses I have taught include Computer Network Fundamentals, Security
`
`and Privacy in Computing, Cryptography, and Advanced Topics in Computer Security.
`
`I am also the Technical Director of the Johns Hopkins University Information Security
`
`Institute, the University’s focal point for research and education in information
`
`security, assurance, and privacy. The University, through the Information Security
`
`Institute’s leadership, has been designated as a Center of Academic Excellence in
`
`Information Assurance by the National Security Agency and leading experts in the
`
`field. The focus of my work over my career has been computer security, and my current
`
`research concentrates on systems and networking security, with special attention to
`
`software and network security.
`
`8.
`
`After receiving my Ph.D., I began working at Bellcore in its Cryptography
`
`and Network Security Research Group from 1994 to 1996. During this period, I
`
`focused my work on internet and computer security. While at Bellcore, I published an
`
`4
`
`Exhibit 2003
`
`
`
`
`
`article titled “Blocking Java Applets at the Firewall” about the security challenges of
`
`dealing with Java applets and firewalls, and a system that we built to overcome those
`
`challenges.
`
`9.
`
`In 1997, I moved to AT&T Labs, Secure Systems Research Department,
`
`where I continued to focus on internet and computer security. From 1995 through
`
`1999, in addition to my work in industry, I served as Adjunct Professor at New York
`
`University, where I taught undergraduate classes on computer, network, and internet
`
`security issues.
`
`10.
`
`I stayed at AT&T until 2003, when I left to accept a full-time academic
`
`position at Johns Hopkins University. I was promoted to full professor with tenure in
`
April 2004.
`
`11.
`
`I serve, or have served, on a number of technical and editorial advisory
`
`boards. For example, I served on the Editorial and Advisory Board for the International
`
`Journal of Information and Computer Security. I also served on the Editorial Board
`
`for the Journal of Privacy Technology. I have been Associate Editor of IEEE Security
`
`and Privacy Magazine and have served as Associate Editor of ACM Transactions on
`
`Internet Technology.
`
I am currently an Associate Editor of the journal Communications of the ACM. I was an Advisory Board Member of Springer’s
`
`Information Security and Cryptography Book Series. I have served in the past as a
`
`member of the DARPA Information Science and Technology Study Group, a member
`
`of the Government Infosec Science and Technology Study Group of Malicious Code,
`
`5
`
`Exhibit 2003
`
`
`
`
`
`a member of the AT&T Intellectual Property Review Team, Associate Editor of
`
`Electronic Commerce Research Journal, Co-editor of the Electronic Newsletter of the
`
`IEEE Technical Committee on Security and Privacy, a member of the board of directors
`
`of the USENIX Association (the leading academic computing systems society), and a
`
`member of the editorial board of the Bellcore Security Update Newsletter.
`
`12.
`
`I have spoken on information security and electronic privacy issues at
`
`more than 50 seminars and symposia. For example, I presented keynote addresses on
`
`the topics “Security of Electronic Voting” at Computer Security 2004 Mexico in
`
`Mexico City in May 2004; “Electronic Voting” to the Secure Trusted Systems
`
`Consortium 5th Annual Symposium in Washington DC in December 2003; “Security
`
`Problems on the Web” to the AT&T EUA Customer conference in March 2000; and
`
`“Security on the Internet” to the AT&T Security Workshop in June 1997. I also
`
`presented a talk about hacking devices at the TEDx conference in October 2011 and
`
`another TEDx talk on the same topic in September 2015.
`
`13.
`
`I was founder and President of Independent Security Evaluators (ISE), a
`
`computer security consulting firm, from 2005–2011. In that capacity, I guided ISE
`
through qualification as an independent testing lab for Consumers Union, which produces Consumer Reports magazine. As an independent testing lab for Consumers Union, I managed an annual project in which we tested all of the popular anti-virus products. Our results were published in Consumer Reports for three consecutive years.
`
`6
`
`Exhibit 2003
`
`
`
`
`
`14.
`
`I am currently the founder and managing partner of Harbor Labs, a
`
`software and networking consulting firm.
`
B.   Publications
`
`15.
`
`I am a named inventor on ten U.S. patents in the information security area.
`
`16.
`
`I have also testified before Congress regarding the security issues with
`
`electronic voting machines and in the U.S. Senate on the issue of censorship. I also
`
`testified in Congress on November 19, 2013 about security issues related to the
`
`government’s Healthcare.gov website.
`
`17.
`
`I am author or co-author of five books regarding information security
`
`issues: Brave New Ballot, Random House, 2006; Firewalls and Internet Security
`
`(second edition), Addison Wesley, 2003; White-Hat Security Arsenal, Addison
`
`Wesley, 2001; Peer-to-Peer, O’Reilly, 2001; and Web Security Sourcebook, John
`
`Wiley & Sons, 1997. I am also the author of numerous journal and conference
`
`publications, which are reflected in my CV.
`
`III. PERSON OF ORDINARY SKILL IN THE ART
`
`18.
`
`In my opinion, a person of ordinary skill in the art (“POSITA”) would be
`
`a person with a Bachelor of Science degree in Computer Science, or related discipline,
`
`and two to three years of experience in developing, implementing, or deploying
`
`systems for the encryption of data on a portable device.
`
`19. Such a level of skill would encompass a knowledge of computer hardware,
`
`software development, operating systems, networking, portable or embedded devices,
`
`7
`
`Exhibit 2003
`
`
`
`
`
`data encryption, data storage, and peripherals, which would be sufficient for
`
`understanding the technology of the ’969 Patent. Because the focus of the ’969 Patent
`
`is not merely on a portable device, but instead a portable device that tunnels data
`
`through a terminal to a remote networked device so as to provide a “solution to securely
`
`access, execute, and process data….,” ’969 Patent 2:25–28, a POSITA would have had
`
`experience with encryption. Moreover, secure tunneling on the Internet is typically
`
done via IPSec and sometimes SSL. IPSec and SSL are based on encryption algorithms.
`
`A POSITA would know how these tunneling protocols utilize encryption to protect
`
information as it transits the Internet.
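By way of illustration only (and not as a description of any reference at issue or of the ’969 Patent), the following short sketch shows the general idea that a tunneling protocol such as SSL/TLS encrypts application data before it leaves the sending machine, so that intermediate nodes see only ciphertext; the host name and port below are hypothetical placeholders:

    import socket
    import ssl

    # Hypothetical endpoint, used purely for illustration.
    REMOTE_HOST = "server.example.com"
    REMOTE_PORT = 443

    # A TLS context that verifies the server's certificate.
    context = ssl.create_default_context()

    # Everything written to the wrapped socket is encrypted before it leaves
    # this machine, so devices along the path see only ciphertext.
    with socket.create_connection((REMOTE_HOST, REMOTE_PORT)) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=REMOTE_HOST) as tls_sock:
            tls_sock.sendall(b"sensitive payload")
            response = tls_sock.recv(4096)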
`
`20. This definition of a POSITA characterizes the ordinary skill of persons
`
`involved with software research and development at AT&T Labs, where I worked in
`
`the Secure Systems Research Department from 1997 to 2003. It also characterizes the
`
`ordinary skill of persons within the Information Security Institute at Johns Hopkins
`
`University involved in research relating to portable device security. Furthermore, in
`
my role as Professor of Computer Science, I train students to become persons of ordinary skill in the area of computer security; developing, implementing, or deploying systems for encrypting data on a portable device requires building the knowledge and skills outlined above.
`
`IV. LEGAL FRAMEWORK
`
`21.
`
`I am not a legal expert and I offer no opinions on the law. However, I
`
`have been advised by counsel about some of the legal principles relevant to an analysis
`
`8
`
`Exhibit 2003
`
`
`
`
`
`of patentability of claims in a United States patent, as summarized below. I have
`
`conducted my analysis in accordance with these principles.
`
`22.
`
`I understand that, for an invention claimed in a patent to be found
`
`patentable, or to be valid, it must be, among other things, new and not obvious in light
`
of what came before it, referred to as the “prior art.”
`
A.   Standard of Proof
`
`23.
`
`I understand that, in these proceedings, the burden is on the party asserting
`
`unpatentability to prove it by a preponderance of the evidence. I understand that “a
`
`preponderance of the evidence” is evidence sufficient to show that something is more
`
`likely than not.
`
`B. Anticipation
`
`24.
`
`I have been informed that a patent claim is invalid as anticipated only if
`
each and every element of that claim is publicly disclosed, either explicitly
`
`or inherently, in a single prior art reference, already in practice in the industry in a
`
`single process or method, already in a single marketed product, or otherwise previously
`
`invented. I also understand that anticipation requires the presence in a single prior art
`
`disclosure of all elements of the claim arranged as in the claim. For a step or element
`
`to be inherent in a reference, I understand that the step or element must necessarily and
`
`inevitably occur or be present when one follows the teachings of the reference. I am
`
`informed that for a reference to be considered as anticipating, it must disclose the
`
`relevant technology in a manner such that a person of ordinary skill in the art would be
`
`9
`
`Exhibit 2003
`
`
`
`
`
`able to carry out or utilize the technology that the reference describes without having
`
`to undertake considerable experimentation.
`
`C. Obviousness
`
`25.
`
`I have been informed that a patent claim is invalid as obvious only if it
`
`would have been obvious to a person of ordinary skill in the art at the time the patent
`
`was filed, in light of the prior art. The prior art may include one or more references
`
`that a person of ordinary skill in the art trying to solve the problem that the inventor
`
`addressed would likely have considered, as well as the body of knowledge that such a
`
`person would possess. As in the case of a single reference that anticipates a patent
`
`claim, when one or more references make a patent claim obvious, the reference or
`
`combined references relied upon must disclose each and every element of the
`
challenged claim. If the reference or combined references relied upon fail to disclose any element of the challenged claim, then the claim is not obvious.
`
`26.
`
`I have been informed that, although a challenger may rely on a
`
`combination of separate prior art references to support an obviousness challenge, a
`
`person of ordinary skill in the art at the time of filing must have had some motivation
`
`to combine the references. That motivation can come from the problem the invention
`
`purports to solve, from other references, from the teachings of those references
`
showing the invention is obvious, or from the common-sense
`
`knowledge of a person of ordinary skill in the art. However, obviousness findings
`
`10
`
`Exhibit 2003
`
`
`
`
`
`grounded in common sense must contain explicit and clear reasoning providing some
`
`rational underpinning why common sense compels a finding of obviousness.
`
`27.
`
`I have been informed that assessing which prior art references to combine
`
`and how they may be combined to match the asserted claim may not be based on
`
`hindsight reconstruction or ex-post reasoning. That is, one must view the prior art
`
`forward from the perspective of the person of ordinary skill in the art at the time of
`
`invention, not backwards from the current time through the lens of hindsight. It is my
`
`understanding that it is improper to use a patent claim as a template and pick and choose
`
`among the prior art to meet the elements of that claim to show obviousness.
`
`28.
`
`I have been informed that a reference qualifies as prior art for determining
`
`obviousness when it is analogous to the claimed invention. I understand that prior art
`
`may be considered analogous if it is from the same field of endeavor, regardless of the
`
`problem addressed, as the claimed invention. I also understand that prior art may be
`
`considered analogous if it is “reasonably pertinent” to the particular problem with
`
`which the inventor is involved. I have been informed that a reference is “reasonably
`
`pertinent” if it is one which, because of the matter with which it deals, logically would
`
`have commended itself to an inventor’s attention in considering the inventor’s problem.
`
`29.
`
`I have been informed that the factual elements of the obviousness
`
`inquiry—the scope and content of the prior art, the differences between the prior art
`
`and the claims, and the level of ordinary skill in the art at the time of the invention—
`
`11
`
`Exhibit 2003
`
`
`
`
`
are central to performing a proper obviousness analysis and drawing the appropriate
`
`conclusions from that analysis.
`
`V. THE ’969 PATENT
`
`30. The ’969 Patent describes a tunneling client access point (“TCAP”) that
`
`communicates with an access terminal (e.g., a cellular telephone or computer), and also
`
`communicates with a remote network device (e.g., a server) by “tunneling” data
`
`through the terminal’s network interface. ’969 Patent Title, Abstract, 1:11−14,
`
`1:19−25, 2:39−51, 3:41−4:30, 18:12−14, Figs. 1, 9−10. Specifically, the access
`
`terminal serves as a “conduit” or “bridge” through which the TCAP communicates with
`
`the server or other network devices. Id. at 4:60−64, 28:54−57.
`
`31. To prevent the terminal from accessing information sent between the
`
`TCAP and the server, the TCAP secures the data such that “if data moving out of the
`
`TCAP and across the [terminal] were captured at the [terminal], such data would not
`
be readable.” Id. at 12:67−13:4; see also id. Abstract (“The TCAP ‘tunnels’ data through an access terminal’s (AT) input/output facilities…. This enables the user to
`
`observe data stored on the TCAP without it being resident on the AT, which can be
`
`useful to maintain higher levels of data security.”), 1:11−14, 27:28−28:15, Fig. 10. For
`
`example, the TCAP preferably includes a Cryptographic Server Module, which can
`
`“encrypt all data sent through the access terminal based on the TCAP’s unique ID and
`
`user’s authorization information.” Id. at 28:9−12. Thus, when the ’969 Patent claims
`
`refer to the portable device facilitating communications “through” the terminal’s
`
`12
`
`Exhibit 2003
`
`
`
`
`
`network interface to a network device (as opposed to communicating “with” or “to”
`
`the terminal), it means that information is “tunneled” through the terminal in a secure
`
`manner, such that the terminal cannot access it.
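Purely as an illustrative sketch of this kind of end-to-end protection (and not the ’969 Patent’s specific implementation, which the specification describes as based on the TCAP’s unique ID and the user’s authorization information), data can be encrypted on the portable device before it is handed to the terminal for relay, so that the terminal holds only ciphertext; the key handling and payload below are hypothetical:

    from cryptography.fernet import Fernet  # symmetric, authenticated encryption

    # Hypothetical key shared out of band between the portable device and the server.
    shared_key = Fernet.generate_key()

    # On the portable device: encrypt before handing the data to the terminal.
    plaintext = b"account=1234; action=purchase"
    ciphertext = Fernet(shared_key).encrypt(plaintext)

    # The terminal relays `ciphertext` to the server but cannot read it.
    # On the server: decrypt with the same shared key.
    recovered = Fernet(shared_key).decrypt(ciphertext)
    assert recovered == plaintext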
`
`32. Users interact with an interactive user interface (“IUI”)—e.g., an
`
`interactive graphical user interface—presented by the terminal on the “output
`
`component.” ’969 Patent Abstract, 2:39−46, 3:57−60, 17:51−18:3, 30:60−66,
`
`33:8−13, 34:2−7. An “input component” on the terminal allows the user to interact
`
`with the IUI. See, e.g., ’969 Patent 30:62, 33:8, 34:2. The IUI on the terminal thus
`
`accepts user input (from the “first input component”) as interaction with the presented
`
interface elements, interprets the inputs, and generates corresponding communications to
`
`the TCAP.
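As a minimal illustration of this general pattern (the widget toolkit, function names, and command below are hypothetical and are not drawn from the ’969 Patent or any reference), an interface element presented on the terminal can translate a user action into a corresponding message directed to an attached portable device:

    import tkinter as tk

    def send_to_portable_device(command: bytes) -> None:
        """Hypothetical stand-in for writing a message to an attached device."""
        print(f"would send to device: {command!r}")

    def on_open_files_clicked() -> None:
        # Interpret the user's interaction with the interface element and
        # generate a corresponding communication to the portable device.
        send_to_portable_device(b"LIST_FILES")

    root = tk.Tk()
    root.title("Illustrative interactive user interface")

    # An interface element presented on the terminal's output component; the user
    # interacts with it via the terminal's input component (e.g., mouse or keyboard).
    tk.Button(root, text="Open files", command=on_open_files_clicked).pack(padx=20, pady=20)

    root.mainloop()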
`
`33. Applications of the TCAP include, e.g., securely accessing remote data,
`
`encrypted communication, and secure purchasing, payment and billing. ’969 Patent
`
`2:49−51, 8:63−9:1, 12:55−13:27, fig. 2.
`
VI. ASSERTED GROUNDS AND REFERENCES RELIED ON IN PETITIONS
`
`34.
`
`I understand that the Petition asserts two grounds for invalidity as
`
`summarized on page 10 of the Petition. The art primarily relied on in the Petition,
`
`Abbott and Shmueli, is described in the following sections.
`
`13
`
`Exhibit 2003
`
`
`
`
`
`A. Abbott
`
`35. Abbott (Ex. 1008) describes a “USB-compliant personal key with integral
`
`input and output devices.” Abbott title. Abbott uses the term “token” to refer to its
`
`personal key. Abbott 4:65, 6:21−23, 23:3−8, 23:39. The integral user input and output
`
`devices on Abbott’s token “communicate with the [token] processor by communication
`
`paths which are independent from the USB-compliant interface, and thus allow the user
`
`to communicate with the processor without manifesting any private information
`
`external to the personal key.” Abbott Abstract. Figure 2 depicts this architecture,
`
`including input and output devices 218 and 222 on the token, as well as data transceiver
`
`252, all of which communicate with processor 212 via communication paths that are
`
`separate from the USB communication path 202 between the token and computer 102.
`
`Abbott Fig. 2, 6:65−7:4, 7:15−19, 9:43−49; see also id. Figs. 7A−8C, 18:8−19:30
`
`(describing integral input and output devices). Abbott’s integral user input and output
`
`devices are central to achieving Abbott’s stated goal of providing “a personal key that
`
`allows the user to store and retrieve passwords and digital certificates without
`
`requiring the use of vulnerable external interfaces.” Abbott 3:59−62 (emphasis
`
`added).
`
`36. Petitioner quotes Abbott’s reference to “point to point tunneling.” Petition
`
`15. But Abbott actually references “point to point tunneling protocol (PPTP),” a
`
`protocol that was typically built into Microsoft Windows operating systems, within a
`
`14
`
`Exhibit 2003
`
`
`
`
`
`long list of other protocols and with no discussion of how the token would be used in
`
`such an environment. Ex. 2080.
`
`37. Abbott expresses a security concern that the computer or a server may be
`
`compromised and running malicious software. Abbott 19:42−50, 20:51−59. Abbott
`
`explains that its integral user input and output devices guard against this security
`
`concern by allowing the user to communicate directly with the token, without making
`
`use of the computer or server. Id. at 19:35−50, 20:59−65, 22:6−18. Abbott discloses
`
`two embodiments that use the integral user input and output devices to provide added
`
`security. One is a “squeeze-to-sign” authorization procedure, depicted in Figure 9, in
`
`which the user is prompted to authorize a transaction by “direct user input via” the
`
`token’s integral user input. Id. Fig. 9, 19:57−21:51. A second, depicted in Figure 10,
`
`is a PIN authentication procedure in which the user’s PIN is entered directly on the
`
`token’s integral user input. Id. Fig. 10, 21:64−22:55. These two are separate
`
`embodiments, depicted in separate figures and described in separate portions of the
`
`specification. It is apparent from Abbott’s title, abstract, and written description that
`
`the integral user input and output devices are an important feature of Abbott’s approach
`
`to security.
`
`38. Abbott’s token is accessed through a series of specialized function calls
`
`that are part of API 260. Abbott Fig. 2, 8:6−15, 12:45−14:36. This arrangement is
`
`important to the security features of Abbott, because this API-based mechanism is how
`
`the system restricts access to functions on the token. Abbott 12:45−13:15 (“The read
`
`15
`
`Exhibit 2003
`
`
`
`
`
`and write access type controls govern the transfer of files in the personal key 200 to
`
`and from the application 110.”), 19:61−20:2 (token “requires direct user input…before
`
`honoring any request from the host computer”). Moreover, Abbott contemplates that
`
`the computer software, including at least API 260, application 110, and browser 262,
`
`is stored and executed on the computer, not the token. Abbott Figs. 1, 2, 6:13−21,
`
`8:6−15. Abbott discloses installation media for the computer software, but not that the
`
`computer software could be stored on or originate from the token. Abbott 6:26−32.
`
B.   Shmueli
`
`39. Shmueli (Ex. 1009) describes a “portable memory device” that stores
`
`software that can execute on a host computer. Shmueli [0006]−[0007], [0022], [0025].
`
`40. Shmueli’s memory device plugs into a host computer’s USB port, and
`
“will emulate a file system on a solid state mass storage device.” Id. [0026], [0031].
`
`The software on Shmueli’s memory device “is configured to readily execute on the
`
`host 12 upon interface.” Id. [0028]. Shmueli’s applications that are “stored on the key
`
`10 and capable of executing on the host 12 are referred to in general as keylets.”
`
`Shmueli [0031]; see also id. [0022], [0028], [0092]. Shmueli thus describes a typical
`
`USB mass storage device having particular applications and data stored on it.
`
`41. Shmueli’s storage device has no on-board processor, and thus cannot
`
`execute any program code. For example, Shmueli’s Figure 1 depicts CPUs in the host
`
`and server, but not in the memory device. Shmueli Fig. 1, [0024], [0027], [0030].
`
`Shmueli’s disclosure of “control circuitry” that “assist[s] in interaction with the host
`
`16
`
`Exhibit 2003
`
`
`
`
`
`computing devices as well as organizing the data stored thereon” is not a reference to
`
`an onboard device processor, but instead refers to device functionality that would be
`
`found on any USB mass storage device. Shmueli [0006]. Shmueli’s “control circuitry”
`
`is responsible for implementing the device side of a USB mass storage device. This is
`
`distinct from the invention of the ’969 Patent, in which the portable device comprises
`
`a processor that executes program code.
`
`42. One distinction between Shmueli and the ’969 Patent is that, following
`
`Shmueli’s authentication routine, which “run[s] on the host,” the Shmueli device
`
`becomes incidental to the execution of program code on the host computer. Shmueli
`
`Figs. 3A−3B, [0035]−[0037]. Shmueli’s summary states that the device is “primarily
`
`a memory device” that may not include control circuitry, and, if it does include control
`
`circuitry, the role of the control circuitry is simply to “assist in interaction with the host
`
`computing device.” Shmueli [0006]. The ’969 Patent, on the other hand, describes
`
`and claims execution of program code on the portable device, and further that such
`
`execution is in response to a communication resulting from user interaction with the
`
`interactive user interface on the terminal. E.g., ’969 Patent 31:21−25.
`
`VII. CLAIM CONSTRUCTION
`
A.   IUI
`
`43. As used in the claims, “an interactive user interface” means “a
`
`presentation containing interface elements with which a user may interact to result in
`
`the terminal taking action responsively by modifying what is presented.”
`
`17
`
`Exhibit 2003
`
`
`
`
`
`44. An example of the commonly used definition of an interactive user
`
`interface comes from an influential paper published at the 1992 ACM Conference on
`
`Human Factors in Computing Systems. Dennis J. M. J. de Baar et al., Coupling
`
`Application Design and User Interface Design, Published in CHI ’92 Proceedings of
`
`the SIGCHI Conference on Human Factors in Computing Systems, ACM, New York,
`
`NY, USA (1992), available at https://dl.acm.org/citation.cfm?id=142806, Ex. 2004. In
`
`this paper, de Baar et al. write that “[b]uilding an interactive application involves the
`
`design of both a data model and a graphical user interface (GUI) to present that model
`
`to the user.” Id. at p. 1 (Abstract). Furthermore, “[e]xternal attributes and methods are
`
`represented in the user interface as standard interaction objects such as buttons,
`
`settings, or sliders (hereafter referred to as “controls”) or as data manipulated directly
`
`by the user.” Id. at p. 1 (Introduction). Therefore, it would have been understood that
`
`an interactive user interface for such an interactive application would have user
`
`interface elements represented within the user interface.
`
`45.
`
`de Baar et al. further explain that “[i]nteraction objects are controls such
`
`as buttons, sliders, or menus that can be manipulated directly by the user.” By reference
`
`to Figures 1 and 2, which appear below, de Baar et al. further explain that “[e]very
`
`interaction object in the user interface is associated with an action or attribute in the
`
`application data model (Figure 1). In Figure 2, for example, the attribute ‘volume
`
`Input’ in the application data model is linked to a slider that can be used to change its
`
`18
`
`Exhibit 2003
`
`
`
`
`
`value.” Id. at p. 2; Figs. 1-2. These interaction objects are presented within the user
`
`interface and are responsive to user interaction.
`
[de Baar et al. Figures 1 and 2: interaction objects in the user interface (e.g., a slider) linked to attributes in the application data model, such as “volume Input”]
`
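As a minimal illustration of the coupling de Baar et al. describe (the toolkit and names below are hypothetical and are not drawn from de Baar’s system), a slider control presented in the user interface can be linked to a “volume” attribute in the application data model, so that manipulating the control changes that attribute’s value:

    import tkinter as tk

    root = tk.Tk()

    # Attribute in the application data model.
    volume_input = tk.IntVar(value=50)

    # Interaction object (a slider) linked to the attribute: manipulating the
    # slider directly updates the value of `volume_input`.
    tk.Scale(root, variable=volume_input, from_=0, to=100,
             orient="horizontal", label="volume input").pack(padx=20, pady=20)

    root.mainloop()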
`46. The types of interactive user interfaces described in de Baar serve as
`
`baseline points of reference when considering the interactive user interface (“IUI”)
`
described in the ’969 Patent. To begin with, although the IUI described in the ’969 Patent may be used to cause the TCAP to take actions,
`
`’969 Patent 6:64-7:3, 7:13-15, 7:46-48, 8:36-43, 9:6-10:45, 12:31-64, the IUI is
`
`19
`
`Exhibit 2003
`
`
`
`
`
`presented on the access terminal, id. Abstract, Figs. 5−8, 4:39−64, 30:65−66 (“present
`
`an interactive user interface on the terminal output component”). Furthermore, the IUI
`
`is “interactive” in the sense that the user may manipulate the IUI such that the device
`
`on which the IUI resides—the terminal—acts responsively to the user’s input by
`
`modifying what is presented. Id. 9:37−48 (when user clicks button, interface “further
`
`unfurl[s]” to present options to access facilities and services, including a login button
`
`which takes user to a login screen), 10:19−23 (engaging the interface by dragging and
`
dropping files), 10:35−38, 10:64−11:4 (drag and drop), 11:13−31, 11:61−66
`
`(unfurling interface by graphically opening can of soda), Figs. 5−8.
`
`47. The specification explains that user interaction takes place via “computer
`
`interaction interface elements such as check boxes, cursors, menus, scrollers, and
`
`windows….” ’969 Patent 1:52−61; see also id. Figs. 5−8 (showing various interaction
`
`interface elements of an IUI). The interaction interface elements are not simply
`
`examples or suggestions, but are what “allow for the display, execution, interaction,
`
`manipulation, and/or operation of program modules and/or system facilities through
`
`textual and/or graphical facilities,” and “provide[] a facility through which users may
`
`affect, interact, and/or operate a computer system.” Id. at 17:58−63; see also id. at
`
`10:32−46 (engaging an interface element to manipulate data), 11:13−17 (accessing
`
`help facilities by “engaging a help facility user interface element”).
`
`48.
`
Indeed, a central purpose of the invention of the ’969 Patent is to allow
`
`users to interact with the TCAP by employing “traditional large user interfaces” that
`
`20
`
`Exhibit 2003
`
`
`
`
`
`users “are already comfortable with,” as opposed to then-existing portable computing
`
`devices, which had “uncomfortably small user interfaces.” ’969 Patent at 2:23−38.
`
`This makes the disclosed TCAP easy to use, as “at most it requires the user to simply
`
`plug the device into any existing and available desktop or laptop computer, through
`
`which the TCAP can make use of a traditional user interface and input/output (I/O)
`
`peripherals….” Id. at 2:39−46. Users interact with the IUI, which is presented on the
`
`terminal’s output component and is able to accept input via the terminal’s “input
`
`component”—allowing replication of the “traditional large user interfaces” that users
`
`“are already comfortable with.” ’969 Patent Abstract, 2:35–46, 2:55–3:60, 17:51–18:3.
`
`The IUI on the terminal thus takes user input from interaction with the interface
`
`elements of the IUI (via the “input component”) and the terminal can then send a
`
`communication to the TCAP resulting from such user interaction. Id. at 26:63−27:7.
`
`49. Petitioner recognizes that “a user may interact” with an IUI. Petition 33
`
`n.4. Petitioner’s construction would limit IUI to “display[s]” and to interaction via
`
`“keyboard or mouse.” But it is clear from the ’969 Patent that the IUI can be or
`
`incorporate other interfaces or