`Exhibit O1 - Omnibus
`
`
`To the extent that Plaintiff argues that any reference charted in these Invalidity Contentions does not disclose a given patent claim
`element, it would have been obvious to combine such reference with the knowledge of a person of ordinary skill in the art, and/or any
`of the references identified herein as disclosing the element, and/or Applicant’s Admitted Prior Art (“AAPA”). One of ordinary skill
`in the art, as of the alleged priority date of the ’413 patent, would have known to combine the prior art elements disclosed by the
`foregoing references using known methods, and to use these elements according to their established functions in order to achieve a
`known and predictable result.
`
`Except where specifically noted otherwise, this chart may apply the apparent interpretations of claim language as used by Plaintiff in
`its infringement contentions. Such use, however, does not imply that Defendants adopt or agree with Plaintiff’s interpretations in any
`way. Additionally, by providing contentions for claim preamble elements, Defendants do not take a position on whether the preamble
`is a claim limitation.
`
`To the extent 35 U.S.C. § 112, ¶6 applies to any of the claim limitations of the Asserted Claims, the art cited herein also discloses the
`corresponding structure(s) and function(s) claimed or their equivalents, as shown below, or renders them obvious in view of the
`knowledge of one skilled in the art.
`
`
`’413
`Claim
`1.pre
`
`Claim Element
`
`A controlling method of
`an operation screen for
`operations of a remote
`control device,
`comprising the steps of:
`
`Prior Art
`
To the extent that the preamble is construed as a limitation, this element was well known as of the priority date of the ’413 patent.
`See, e.g., elements 1.a – 1.d.
`U.S. Patent No. 7,116,310 (“Evans”)
`Abstract (“A system for using computer input devices with software applications is disclosed. The
`system includes an input device mapper API, which uses a language of semantics as an interface
between input devices and software applications.”)
`2:24-39 (“The system of the present invention includes a Mapper Application Program Interface
`(API), which links controls on input devices with actions that a software application performs. The
`Mapper API uses vocabularies of semantics, called “genres,” where the semantics in each genre are
`appropriate for a particular category of applications, such as driving games or flight simulation games.
`For each input device, a correlation is made between the device's controls and semantics selected from
a genre. Also, for each software application, a correlation is provided between the application's actions
`and semantics selected from a genre. The Mapper API creates a mapping between device controls and
`software actions by identifying an input device that supports the software's genre and by connecting,
`as closely as possible, each control on the device with a software action that is correlated with the
`same semantic.”)
`3:5-13 (“The mapping created may be used by an input device manager, which translates notification
`of device events (such as the pressing of a button on a joystick) into the application's input dialect
`while the application executes. Alternatively, the Mapper API may provide the mapping directly to the
`application, which then receives event notifications directly from the various input devices and uses
`the mapping to perform a particular action upon receiving notification of a corresponding device
`event, as specified in the mapping.”)
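
By way of illustration only (hypothetical names; not code drawn from Evans, which is described only in the passages quoted above), the mapping Evans describes amounts to connecting each device control to the application action that shares the same genre semantic:

    # Illustrative sketch only; every identifier here is hypothetical.
    # A "genre" is a vocabulary of semantics for one category of application.
    GENRE_DRIVING = {"steer", "accelerate", "brake"}

    # Control-semantic (C-S) correlation supplied for an input device.
    device_cs = {"steering_wheel": "steer", "right_pedal": "accelerate", "left_pedal": "brake"}

    # Action-semantic (A-S) correlation supplied by the application.
    app_as = {"turn_car": "steer", "speed_up": "accelerate", "slow_down": "brake"}

    def build_mapping(device_cs, app_as, genre):
        """Connect each device control to the application action that shares its semantic."""
        semantic_to_action = {semantic: action for action, semantic in app_as.items()}
        return {control: semantic_to_action[semantic]
                for control, semantic in device_cs.items()
                if semantic in genre and semantic in semantic_to_action}

    print(build_mapping(device_cs, app_as, GENRE_DRIVING))
    # {'steering_wheel': 'turn_car', 'right_pedal': 'speed_up', 'left_pedal': 'slow_down'}
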
`U.S. Patent No. 5,561,708 (“Remillard”)
`Abstract (“The electronic device displays a menu including several user selectable facilities on the
`display for a user. The user chooses one of the options from the menu by use of a remote keypad
`control, similar to a conventional television remote control. The options available include printing,
`electronic mail and other news and information services. Interfacing the electronic device with a
`stylus-type pointing device permits sketching and drawing on the television, including superposition
`of images on captured television images. Captured images of graphics or text are optionally stored or
`forwarded to a user through a mail facility accessed through operation of the system.”)
`2:23-40 (“According to one aspect of the present invention, it includes a television set, a
`communications device connected to a communications network, such as a telephone system, a remote
`keypad and a controller. The controller displays menu items on a portion of a screen of the television
`and controls operation of the communications device. The menu items correspond to various services
`provided to a user. Numbers, 0-9, identify the individual menu items that are selectable from the
`remote keypad. Some services are better implemented with alphabetical characters. In those instances,
`the remote keypad is provided with alpha-numeric characters. The controller includes a tuner coupled
to the television and to an LED readout identifying a selected channel viewed by the user, and a
“genlock” apparatus. The preferred embodiment employs the well-known genlock principle in a novel
`way to facilitate interactiveness with television programming as well as to provide economic
`telestration capabilities to the home user.”)
`5:3-15 (“The electronic device 20 interfaces these facilities for access and display on a conventional
television display 50. The user selects and controls access to the facilities displayed on the television
`50 by use of a remote keypad control 52. The remote keypad control 52 of the preferred embodiment
`is similar to conventional television remote controls for selection of channel and volume, for instance.
`The remote keypad control 52 provides menu selection signals to the electronic device for selection of
`a particular facility of the host computer 30.”)
`5:16- 22 (“The electronic device 20 of the preferred embodiment includes a stylus-type or pen-type
`pointing device 54 for creation of bitmap images on the television 50. The bitmap images include
`graphical and textual information drawn by the user. The user is able to direct the electronic device 20
`to capture images from the television 50 screen. The images include any bitmap images created by the
`user.”)
`5:26-35 (“Applications of this embodiment of the present invention include use of the pointing device
`54 to superimpose user created images over captured television station transmissions or educational
`and recreational sketching and drawing. Additionally, the pointing device 54 is able to function as a
`text input device by interaction with an image of a conventional typewriter keyboard. Through display
of the keyboard image on the television, and monitoring cursor positions associated with “clicks” of
the pointing device 54, a user may “typewrite” information on the screen.”)
`U.S. Application Publication No. 2002/0072912 (“Yen”)
`¶4 (“It is an object of the invention to provide an improved system and method of the type defined in
`the opening paragraph. To that end, the invention provides a system wherein the system also
`comprises a further microphone for enabling further users of the system to input speech commands.
`The system according to the invention thus provides (at least) two microphones for controlling the
`apparatus.”)
`¶10 (“Additionally, a user of the remote control 102 can enter speech commands via the microphone
`104, which are then transmitted to the IR receiver 106 of the television receiver 101 and converted to
`corresponding control commands by a speech processor, described hereinafter. The further
`microphone 107 is an omnidirectional microphone, which picks up speech signals from any direction,
`thus enabling other users which are not currently holding the remote control 102 to control the
`television receiver 101 by means of voice commands.”)
`U.S. Patent No. 6,437,836 (“Huang”)
`4:23-27 (“To accomplish the preservation of specialized functionality and features, the universal
remote control according to its preferred embodiment can dynamically construct the user's remote
`control buttons on a graphical touch screen, from information contained within the downloaded data
`file.”)
`Sony Digital Handycam Digital 8 (DCR-TRV530) (“Handycam”)1
Upon information and belief, the Handycam discloses a controlling method of an operation screen for
`operations of a remote control device.
`Upon information and belief, the Sony LANC Remote Control Tripod (VCT-870RM) (“LANC
`Remote”) is an accessory that works with the Handycam. Remote operation of the Handycam using
`the LANC Remote was possible.
`’413 Patent, Applicant Admitted Prior Art (“AAPA”)
`’413 at 1:18-24 (“In case a plurality of remote control devices for controlling a television receiver are
`used, as disclosed in JP-A-2001-61110, there has been proposed a television receiver, which is
`enabled to use a plurality of remote control devices (as abbreviated into the “remo-con”) by giving
`priority to the individual remote control devices to improve the operability of the television receiver.”)
`(the disclosures of JP-A-2001-61110 herein incorporated by reference). The prosecution history
`further confirms that the physical components in claim 1 were conventional and well known. See,
`e.g., ’413 File History, 12/24/09 Office Action at 2-5; U.S. 7,250,988 File History, 03/22/2007 Office
`Action at 2-3.
`
`
`1 Handycam refers to the Sony Handycam series of digital video recorders, including but not limited to the DCR-TRV230, DCR-TRV330, and DCR-TRV530
`and their accessories including at least the Remote Commander and the LANC Remote Control Tripod, along with any other accessories and related software.
`The Sony Digital Handycam Digital 8 (DCR-TRV530) product referred to in this Exhibit is exemplary of all Handycam series products. Citations are to the
`“Sony Digital Video Camera Recorder, Digital 8, DCR-TRV2630/TRV330/TRV530” user manual unless otherwise indicated. On information and belief, the
`other Handycam series models, such as the DCR-TRV230 and DCR-TRV330 (with the Remote Commander, the LANC Remote Control Tripod, and other
related accessories and software) have identical or substantially similar functionality and are also prior art to the ’413 patent.
`4
`
`
`
`Page 4 of 18
`
`
`
`’413
`Claim
`
`Claim Element
`
`Prior Art
`
`
`
`
`The Handycam has a viewfinder (pp. 7, 19-22, 130, 150), an LCD display (pp. 7, 19-22, 130, 146),
and a display window (pp. 130, 147, 153). Video inputs and outputs include an A/V (composite
`video) input/output (pp. 31, 60, 75, 76, 104), an S-video input/output (pp. 31, 75-76, 104),
an iLink DV input/output (pp. 61, 75, 76, 104, 136), and USB (p. 11). The Handycam includes a
`Remote Commander (pp. 10, 25-29, 33, 34, 52, 56-60, 77, 87, 94, 97, 103, 129, 152).
`
`Evans
`2:50-56 (“The Mapper API maps each device control into the game action associated with the same
semantic. The Mapper API uses these correlations to map device controls into software actions; for
`example, the steering wheel maps to the action of turning the car, and the right and left pedals map to
the actions of speeding up and slowing down the car.”)
`6:15-19 (“Input devices 65, 66, 67, and 42 provide input device mapper 39 with correlations between
`their controls and the semantics of genres 211-213, called ‘control-semantic’ correlations 221-225
`(abbreviated ‘C-S correlation’).”)
`See also FIGS. 3 (elements 301), 14 (elements 1402, 1404).
`S. Ponnekanti et al., “ICrafter: A Service Framework for Ubiquitous Computing
`Environments,” Proceedings Ubicomp 2001, pp. 56-75 (Oct. 2, 2001) (“ICrafter”)
`59 (“When the IM receives a request for UI for one or more services, it first selects one or more
`generators based on the requesting appliance and the service(s) for which the UI was requested. (A
`generator is a software entity that can generate a UI for one or more services for a particular
`appliance). Next, the IM “executes” the generators and returns the generated UI to the requesting
`appliance. To generate the UIs, generators need access to information about the services, appliance,
`and the workspace context.”)
`59 (“When an appliance requests a UI from the IM, it supplies an appliance description that provides
`information about the appliance (such as number of pixels).”); see also Fig. 1.
`Yen
`¶18 (“The input designation means may be controlled by means of a control element on the remote
`control or on the controlled apparatus. The control element may be a single-state toggle button as
`described above, or any other appropriate control element, such as a ‘radio button’ for each state, or a
`multi-position switch, each position of which corresponds to a particular state.”)
`Handycam
`Upon information and belief, the Handycam discloses acquiring an attribute of a remote control
`device. Different buttons were available on the Remote Commander and the LANC Remote. For
example, the Remote Commander includes a “PHOTO,” “DISPLAY,” “SEARCH MODE,” “ZERO
`SET MEMORY,” “START/STOP,” “DATA CODE,” and search, tape transport, and power zoom
`buttons. p. 152. The LANC Remote includes a “recording standby switch,” a “START/STOP
`button,” a “zoom lever,” and a “PHOTO” button. LANC Remote manual, p. 2.
`AAPA
` ’413 at 1:18-24 (“In case a plurality of remote control devices for controlling a television receiver are
used, as disclosed in JP-A-2001-61110, there has been proposed a television receiver, which is
`enabled to use a plurality of remote control devices (as abbreviated into the “remo-con”) by giving
`priority to the individual remote control devices to improve the operability of the television receiver.”)
`(the disclosures of JP-A-2001-61110 herein incorporated by reference). See also ’413 File History,
`12/24/09 Office Action at 2-5; U.S. 7,250,988 File History, 03/22/2007 Office Action at 2-3.

’413 Claim: 1.b
Claim Element: determining an operation form corresponding to the remote control device from among a plurality of operation forms previously stored based on the acquired attribute of the remote control device; and
Prior Art:
Evans
`3:8-13 (“Alternatively, the Mapper API may provide the mapping directly to the application, which
`then receives event notifications directly from the various input devices and uses the mapping to
`perform a particular action upon receiving notification of a corresponding device event, as specified in
`the mapping.”)
`8:63-67 (“It is also possible for a user to affect a mapping created by input device mapper 39, either
`by providing a set of preferences for input device mapper 39 to take into account in creating the
`mapping, or by modifying a mapping after it has been created.”); see also 9:1-9:39
`9:8-15 (“FIG. 6 depicts such a display, as might appear for joystick 67. The manufacturer of joystick
`67 may provide a bitmap image or 3D model of the device, with blank text fields that are filled in with
`data from the application. The data is provided by the application as part of the A-S correlation in the
`form of text strings; the application may provide a text string label for each action, and the labels may
`be displayed with an image of the device.”)
`ICrafter
`56 (“The main objective of ICrafter is to allow users of interactive workspaces to flexibly interact with
`the services in the workspace. By service, we refer to a device (such as a light, projector, or a scanner)
`or an application (such as a web browser or Microsoft PowerPoint running on a large display) that
`provides useful functions to end-users. Users interact with the services using a variety of access/input
devices (such as laptops, handhelds, etc). We use the term appliance to refer to such an access/input
`device. (In other words, service UIs run on appliances.) ICrafter is a framework that allows developers
`to deploy services and to create user interfaces to these services for various user appliances.”)
`60 (“Appliances request UIs from the InterfaceManager while supplying an appliance description. The
`Interface Manager first selects appropriate UI generators based on the requesting appliance and the
`services for which the UI was requested.”)
`63 (“As shown in figure 2, when a user requests a UI for one or more services (we explain how this
process is bootstrapped later), the user appliance sends a request to the IM (step 1). The IM responds
`with the appropriate UI (step 2), which is rendered on the appliance by a renderer (step 3). The
`renderer itself is not part of ICrafter, and can be any native renderer, such as a web browser. User
actions on the UI (step 4) result in remote invocations on the target services (step 5).”)
`68 (“The IM automatically picks a suitable UI based on the requesting appliance.”)
`Huang
`8:45-55 (“FIG. 6 presents an example of the graphical representation of the dynamically generated
`remote control buttons displayed as step 403 of the application 212 data flow operational procedure. A
`generic remote control with standard features is displayed; however, the dynamically generated nature
`of the remote control features allows for alternate display representation to fully encapsulate features
`specific to a user’s preferences or a customized home-theater setup. A feature common to this user
`interface representation is a “Keypad” indicator 601, which echoes the user input on the keypad
`pushbuttons 205.”)
`Handycam
`Upon information and belief, Handycam discloses determining an operation form corresponding to the
`remote control device from among a plurality of operation forms previously stored based on the
`acquired attribute of the remote control device.
`AAPA
`’413 at 1:18-24 (“In case a plurality of remote control devices for controlling a television receiver are
`used, as disclosed in JP-A-2001-61110, there has been proposed a television receiver, which is
`enabled to use a plurality of remote control devices (as abbreviated into the “remo-con”) by giving
`priority to the individual remote control devices to improve the operability of the television receiver.”)
`(the disclosures of JP-A-2001-61110 herein incorporated by reference). See also ’413 File History,
`12/24/09 Office Action at 2-5; U.S. 7,250,988 File History, 03/22/2007 Office Action at 2-3.

’413 Claim: 1.c
Claim Element: displaying an operation screen related to the determined operation form displayed,
Prior Art:
Evans
`See element 1.b.
`ICrafter
`See, e.g., element 1.b.
In addition, ICrafter discloses:
`63 (“The IM responds with the appropriate UI (step 2), which is rendered on the appliance by a
`renderer (step 3).”)
`Huang
`See element 1.b.
`Handycam
`Upon information and belief, Handycam discloses displaying an operation screen related to the
`determined operation form displayed.
`AAPA
`’413 at 1:18-24 (“In case a plurality of remote control devices for controlling a television receiver are
`used, as disclosed in JP-A-2001-61110, there has been proposed a television receiver, which is
`enabled to use a plurality of remote control devices (as abbreviated into the “remo-con”) by giving
`priority to the individual remote control devices to improve the operability of the television receiver.”)
`(the disclosures of JP-A-2001-61110 herein incorporated by reference). See also ’413 File History,
`12/24/09 Office Action at 2-5; U.S. 7,250,988 File History, 03/22/2007 Office Action at 2-3.

’413 Claim: 1.d
Claim Element: wherein, in the step of determining the operation form, the operation form corresponding to the remote control device is determined by evaluating a degree of suitability between the remote control device and each of the plurality of operation forms based on the acquired attribute of the remote control device.
Prior Art:
Evans
`6:15-19 (“Input devices 65, 66, 67, and 42 provide input device mapper 39 with correlations between
`their controls and the semantics of genres 211-213, called “control-semantic” correlations 221–225
`(abbreviated “C-S correlation”)”)
`6:40-43 (“Applications 36a and 36b provide input device mapper 39 with correlations between actions
`that they perform and genres 211-213, called “action-semantic” correlations 231-233 (abbreviated “A-
`S correlation”).”)
`7:15-18 (“Input device mapper may create a second mapping (not shown) for a different phase of an
`application that requires controls to be used in a different context, such as the role-playing phase of
`driving simulation game 36a.”)
`8:38-45 (“For example, in the genres provided below in the Examples section, controls are divided
into the categories “priority 1” and “priority 2.” A priority 1 control is a control that must be mapped to
`the primary input device and may not be implemented by an auxiliary input. A priority 2 control is a
control that may be implemented on the primary input device, if a control is available.”)
`9:42-49 (“FIG. 7 is a flowchart showing an example use of an input device mapper in accordance with
`the present invention, and the steps to initiate its use. As shown in FIG. 6 and described in detail
`below, a device and an application both undergo a setup phase, in which they pass their respective C-S
`and A-S correlations to an input device mapper; the application program then receives and processes
`input in accordance with the mapping.”)
`11:8-15 (“Alternatively, when an input device manager is used, as depicted in FIG. 8 and discussed
`below, the input device manager translates each device event notification into an instruction to
`application program 36a to perform a particular action. In this case, application program 36a does not
`perform any lookup into the mapping in processing step 707; it simply follows instructions received
`from the input device manager.”)
`27:1-9 (“In process box 1306, the API provides the input devices to the application based on how
`suitable the input devices are to the application. Thus the API analyzes how many of the semantics of
`the C-S correlations 221 (FIG. 3) match the semantics of the A-S correlations 231 (FIG. 4). The input
`devices can be provided in a list, table, array, etc. Alternatively, the API can invoke an application-
defined callback function that returns the ranking through repeated calls to the application.”); see also
`FIG. 13 (element 1304); see also FIG. 7.
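
As an illustration only (hypothetical names; not code from Evans), the degree-of-suitability ranking described in the Evans passage at 27:1-9 can be read as counting how many of a device's control semantics match the application's action semantics:

    # Illustrative sketch only; every identifier here is hypothetical.
    def suitability(device_cs, app_as):
        """Count semantics shared by a device's C-S correlation and the application's A-S correlation."""
        return len(set(device_cs.values()) & set(app_as.values()))

    def rank_devices(devices, app_as):
        """Order devices from most to least suitable for the application."""
        return sorted(devices, key=lambda name: suitability(devices[name], app_as), reverse=True)

    app_as = {"turn_car": "steer", "speed_up": "accelerate", "slow_down": "brake"}
    devices = {
        "wheel": {"steering_wheel": "steer", "right_pedal": "accelerate", "left_pedal": "brake"},
        "joystick": {"x_axis": "steer", "trigger": "accelerate"},
        "keyboard": {"space_bar": "jump"},
    }
    print(rank_devices(devices, app_as))  # ['wheel', 'joystick', 'keyboard']
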
`Handycam
`Upon information and belief, Handycam discloses in the step of determining the operation form, the
`operation form corresponding to the remote control device is determined by evaluating a degree of
`suitability between the remote control device and each of the plurality of operation forms based on the
`acquired attribute of the remote control device.
`Defendants incorporate by reference their contentions relating to claim 1, as if fully set forth herein.
`

’413 Claim: 2.pre, 2.a
Claim Element: A controlling method according to claim 1, wherein the plurality of operation forms are different from each other in a combination of operation devices selected for use therein from among a plurality of operation devices.
Prior Art:
ICrafter
`See, e.g., elements 1.b, 1.c.
`In addition, ICrafter discloses:
`68 (“Workspace adaptation. Figure 6 shows the light control SUIML UIs for two different
`workspaces. Note that the UIs are very different but are generated by the same template accessing
different context memories.”)
`Handycam
`Upon information and belief, Handycam discloses the plurality of operation forms are different from
`each other in a combination of operation devices selected for use therein from among a plurality of
`operation devices.
`Defendants incorporate by reference their contentions relating to claim 1, as if fully set forth herein.
`

’413 Claim: 4.pre, 4.a
Claim Element: A controlling method according to claim 1, wherein the plurality of operation forms are different from each other in a layout of a display element constructing the operation screen.
Prior Art:
ICrafter
`See, e.g., elements 1.b, 1.c, 2.a.
`In addition, ICrafter discloses:
`57 (“Appliance adaptation. The framework should not only support several modalities (e.g. a
`gesture-based UI or a voice-based UI), but also different appliances with the same modality (e.g. a
`handheld computer vs. a pen-and-tablet form factor computer vs. the screen of a user’s laptop). Also,
`appliances can vary widely in resources.”)
`60 (“We apply ideas from previous research in related domains [7] to generalize this approach by
`allowing “intelligence” to exist in the IM (i.e., a third party other than the service or the appliance) to
`handle UI selection, generation, or adaptation. This lets the resource-rich, infrastructure-based IM
`select, adapt, or generate a suitable UI based on the requesting appliance.”)
`65 (“While the UI shown here presents a list of services, a different generator could result in (for
`example) a spatial map of all available services.”)
`Handycam
`Upon information and belief, Handycam discloses the plurality of operation forms are different from
`each other in a layout of a display element constructing the operation screen.
`Defendants incorporate by reference their contentions relating to claim 1, as if fully set forth herein.
`

’413 Claim: 5.pre, 5.a
Claim Element: A controlling method according to claim 1, further comprising a step of, in case that the acquired attribute of the remote control device cannot be specified, acquiring an attribute of the remote control device from outside, and updating a database in which attributes of remote control devices are previously stored.
Prior Art:
Evans
`9:33-39 (“The user's preferences may be stored in a file or database for future use by the user.
Additionally, storing the preferences in a file or database permits the preferences to be easily ported
`from computer 20 to any other machine on which input device mapper 39 has been implemented, thus
`permitting consistent mappings across several machines.”)
`U.S. Patent No. 7,046,161 (“Hayes”)
`4:31-41 (“If the data in the squawk signal is not recognized by the remote control 10, i.e.,
`communications with the device are not supported by the remote control 10, the remote control 10
`may simply remain unchanged and continue to use its previous setup configuration. Alternatively, if
`the remote control 10 does not support communications with the device, the remote control 10 may
`access, as described hereinafter, a remote data repository to attempt to download configuration data
`that will allow the remote control 10 to be used to communicate with the device.”)
6:14-32 (“If no squawk is detected, or if a squawk is detected but specifies an unknown setup number
`(i.e., device type or manufacturer not supported by the built-in database of the remote control 10), the
`remote control 10 may simply continue to process events using its current configuration or,
`alternatively, the remote control 10 may initiate access to a remote data repository and attempt to
`download the remote control user interface and signaling information that corresponds to the setup
`number provided by the device. The remote control 10 may also store the data in the squawk signal for
`uploading to an intermediate, client device. The client device may then use this data to download the
appropriate user interface and signaling information for subsequent, off-line downloading to the
`remote control 10. If, however, a valid squawk signal is detected, the remote control 10 may respond
`by sending a command to the device in the requested format, commencing during the 150ms inter-
`frame interval, to suspend the squawk procedure at the device.”)
`10:22-37 (“As an illustrative example, an LCD based remote control 10, shown in FIG. 18, can
download configuration information from multiple consumer devices which are interconnected via a
`digital network as described above. Such a remote control 10, which includes a graphic LCD display
`and touch screen input capability, would be capable of supporting both types of command structure.
`The remote control 10 would, therefore, represent an extremely powerful user interface device,
essentially becoming an extension of the controlled device in the user's hand. Also, since the standard
`being used may allow an ongoing two-way dialog between the controlled and controlling devices, the
`remote control display and configuration may be updated dynamically during use of the system; not
just at setup time as is the case with the basic “extended DAS” transaction described earlier.”)
`11:11-41 (“As discussed above, device and function identity information, whether included in a DAS
transmission, read from a barcode label (as described in U.S. Pat. No. 6,225,938), entered by the
consumer as a UPC or other code, etc. may, in turn, be used to directly access information stored in a
`centralized device database that contains definitions necessary to configure the remote control 10 to
`communicate with and/or control the identified device generally and/or specific functions of the
`identified device. To this end, the centralized device database may include control codes for devices of
`different types and manufacturers (and sometime model number) as well as elements of graphical user
`interface layouts to be displayed by the remote control 10 as an interface to communicate with/control
`various devices. As illustrated in FIG. 13, the remote control 10 can access the centralized device
`database server, provide the centralized device database server with the device and/or function identity
`information, and request that the centralized device database server download to the remote control 10
`information from the centralized device database needed by the remote control 10 to configure itself to
`communicate with and/or control the device corresponding to the device identity and/or function
`identity information. As will be described in greater detail hereinafter, the centralized device database
`may also store information relevant to the operation of devices such as user manuals, TV-guide
`listings, etc. Additionally, the identity information provided to the centralized device database server
`can be used to provide services such as automatic warranty registration, capturing of demographics
`(e.g., identifying devices a user owns/has previously setup), etc.”)
`12:61-13:2 (“In addition, the centralized device database server 300 may also use the device and/or
`function identity information to retrieve from the centralized device database graphical user interface
`elements, such as command key representations and layouts, that are appropriate for the identified
`device and/or function. The graphical user interface elements may then be downloaded as described
`above to the remote control 10 for use in providing a display by which the user can command the
`operation of the device.”)
`13:17-33 (“The centralized database server 300 may also be used to provide other information
`relevant to the operation of devices to the benefit of the consumer and/or device manufacturer. For
example, device specific reference documentation such as user manuals, hook-up instructions, FAQs,
`and the like may be stored at the centralized database server and downloaded to the client device or
`remote control 10 according to the device identity information provided to the centralized database
`server 300. This additional information may be provided either as part of an initial setup procedure or
`at some later point by explicit user request. Alternatively, in cases where the remote control 10 is
`capable of wireless communication with the client device or directly to the server 300 (as shown, for
`example in FIGS. 15–17) reference information can be offered interactively using, for example, the
techniques described in co-pending U.S. application Ser. No. 09/905,423.”)
`14:21-37 (“Still further,