In re Patent of: James M. Barton, et al.
U.S. Patent No.: 6,233,389                Attorney Docket No.: 39843-0037IP2
Issue Date: May 15, 2001
Appl. Serial No.: 09/126,071
Filing Date: July 30, 1998
Title: Multimedia Time Warping System

DECLARATION OF JOHN MICHAEL STRAWN, PhD

SAMSUNG 1003

TABLE OF CONTENTS

I.    QUALIFICATIONS AND BACKGROUND INFORMATION ........................ 6

II.   LEGAL PRINCIPLES ................................................ 11
      A. Anticipation ................................................. 11
      B. Obviousness .................................................. 12

III.  OVERVIEW OF CONCLUSIONS FORMED .................................. 13

IV.   BACKGROUND KNOWLEDGE ONE OF SKILL IN THE ART WOULD HAVE HAD
      PRIOR TO THE PRIORITY DATE OF THE ’389 PATENT ................... 13
      A. Overview of the ’389 Patent .................................. 14
      B. Background Prior Art – Platform SDK .......................... 18
      C. Other Background Prior Art ................................... 27
      D. Person of Ordinary Skill in the Art .......................... 28

V.    INTERPRETATIONS OF THE ’389 PATENT CLAIMS AT ISSUE .............. 29

VI.   ANALYSIS OF PLATFORM SDK (CLAIMS 31 AND 61) – ANTICIPATION ...... 34
      A. Preambles of Claims 31 and 61 ................................ 35
      B. Physical Data Source Features of Claims 31 and 61 ............ 35
      C. Source Object Features of Claims 31 and 61 ................... 45
         i.   First Function - Extracts Video and Audio Data from a
              Physical Data Source .................................... 47
         ii.  Second Function - Obtains a Buffer from a Transform
              Object .................................................. 50
         iii. Third Function - Converts Video Data into Data Streams .. 53
         iv.  Fourth Function - Fills the Buffer with the Streams ..... 54
         v.   Source Object - Conclusion .............................. 54
      D. Transform Object Features of Claims 31 and 61 ................ 55
      E. Sink Object Features of Claims 31 and 61 ..................... 62
         i.   First Function - Obtains Data Stream Buffers from a
              Transform Object ........................................ 64
         ii.  Second Function - Outputs the Streams to a Video and
              Audio Decoder ........................................... 67
         iii. Sink Object - Conclusion ................................ 70
      F. Automatic Flow Control Features of Claims 31 and 61 .......... 71
         i.   Automatic Flow Control - Construction ................... 71
         ii.  Source Object - Automatic Flow Control .................. 72
         iii. Sink Object - Automatic Flow Control .................... 81
         iv.  Automatic Flow Control Features - Conclusion ............ 88
      G. Decoder Features of Claims 31 and 61 ......................... 89
      H. Control Object Features of Claims 31 and 61 .................. 91
         i.   Control Object – Receives Commands that Control the Flow
              of Broadcast Data ....................................... 91
         ii.  Control Object – Sends Flow Command Events .............. 95
      I. Anticipation Analysis - Conclusion .......................... 101

VII.  ANALYSIS OF PLATFORM SDK (CLAIMS 31 AND 61) – OBVIOUSNESS ...... 101

VIII. SECONDARY CONSIDERATIONS ....................................... 108

IX.   ADDITIONAL REMARKS ............................................. 112

TABLE OF FIGURES

Figure 1. [SE1001, ’389 Patent FIG. 1.] ........................... 14
Figure 2. [SE1001, ’389 Patent FIG. 8.] ........................... 16
Figure 3. [SE1001, ’389 Patent FIG. 9 (annotated).] ............... 17
Figure 4. Platform SDK documentation user interface. .............. 20
Figure 5. [SE1004, 2266-2267 (annotated).] ........................ 22
Figure 6. [SE1004, 143 (annotated).] .............................. 23
Figure 7. [SE1001, ’389 Patent FIG. 9, SE1004 143 (both annotated).] 25
Figure 8. [SE1018 (Bescos), 4 (annotated).] ....................... 28
Figure 9. [SE1004, 2267 (annotated).] ............................. 37
Figure 10. [SE1004, 1172 (annotated).] ............................ 39
Figure 11. [SE1004, 2267 (annotated).] ............................ 41
Figure 12. [SE1004, 2270-2271.] ................................... 42
Figure 13. [SE1004, 2256 (annotated).] ............................ 49
Figure 14. [SE1004, 2260 (annotated).] ............................ 50
Figure 16. [SE1004, 146 (annotated).] ............................. 52
Figure 17. [SE1004, 146 (annotated).] ............................. 54
Figure 18. [SE1004, 143 (annotated).] ............................. 56
Figure 19. [SE1004, 143 (annotated).] ............................. 59
Figure 20. [SE1004, 143 (annotated).] ............................. 61
Figure 21. [SE1004, 143 (annotated).] ............................. 63
Figure 22. [SE1004, 146 (annotated).] ............................. 66
Figure 23. [SE1004, 143 (annotated).] ............................. 66
Figure 24. [SE1004, 143 (annotated).] ............................. 68
Figure 25. [SE1001, ’389 Patent FIG. 9, SE1004 146 (both annotated).] 77
Figure 26. [SE1004, 143 (annotated).] ............................. 79
Figure 27. [SE1004, 143 (annotated).] ............................. 82
Figure 25. [SE1001, ’389 Patent FIG. 9, SE1004 146 (both annotated).] 84
Figure 28. [SE1004, 143 (annotated).] ............................. 86
Figure 29. [SE1004, 2267 (annotated).] ............................ 90
Figure 30. [SE1004, 143 (annotated).] ............................. 92
Figure 31. [SE1004, 34 (annotated).] .............................. 94
Figure 32. [SE1004, 143 (annotated).] ............................. 97
Figure 33. [SE1004, 143 (annotated).] ............................ 100
Figure 34. [SE1004, 192.] ........................................ 104
Figure 35. [SE1004, 143 (annotated).] ............................ 106

I, John Michael Strawn, Ph.D., of Larkspur, California, declare that:

I. QUALIFICATIONS AND BACKGROUND INFORMATION

1. I am currently an independent consultant working under the aegis of my corporation, S Systems Inc. A copy of my curriculum vitae, which describes in further detail my qualifications, employment history, honors, patents, awards, professional associations, presentations, and publications, is attached hereto.

2. My formal education includes a Bachelor's degree from Oberlin College in 1973. As a Fulbright scholar in Berlin, I attended lectures and seminars in German at the Free University and Technical University Berlin from 1973-1975. I earned a Ph.D. degree from Stanford in 1985, with my doctoral dissertation focusing on signal processing for analyzing digital audio. As part of that work, I implemented streaming audio recording and playback in real time on a mainframe computer using, for example, specially formatted hard disks that operated in a drive the size of a washing machine, long before the compact disc was invented.

3. With regard to the subject matter of this proceeding, I have extensive experience in streaming and related technology. I have studied analog and digital circuitry, analog and digital hardware, computer architecture, processor architecture, high-level language programming including object-oriented programming in languages such as C++ and Java, assembly language programming, digital signal processing, cybernetics, information theory, compression (especially audio, but also data, image, and video), television transmission formats, networking, user interface design, user interface implementation, and client/server interactions.

4. In addition, I have over 45 years of involvement in software, digital media, digital signal processing, networking, and processor architecture. Working in those areas, I have been an employee, a manager of a team of other Ph.D.s, and an independent software consultant in signal processing specializing in high-level languages and assembly language. My specialties have included compression and decompression of media, streaming media, the Fourier transform, and the discrete cosine transform used in audio compression, JPEG, and MPEG video. Implementing buffering for streaming media has been the backbone of many of my consulting projects, such as for DTS or Verance.

5. Throughout my career, I have received a variety of awards, including the Fulbright scholarship mentioned above and a grant from the IBM Thomas Watson Foundation to work in Europe and Japan. I was named a Fellow of the Audio Engineering Society.

6. I have made extensive contributions to the practice of assembly language programming for real-time processing and digital signal processing. My work on the NeXT machine served as a tutorial for other programmers. I have held seminars at industry gatherings teaching my programming methodology, as well as compression, to others practicing in the field.

7. In writing this Declaration, I have considered the following: my own knowledge and experience, including my work experience in the fields of audio and video streaming; my experience in teaching those subjects; and my experience in working with others involved in those fields. In addition, I have analyzed the following publications and materials, in addition to other materials I cite in my Declaration:

• U.S. Patent No. 6,233,389 (Exhibit SE1001), and its accompanying prosecution history (Exhibit SE1002);

• Microsoft Platform Software Development Kit (January 1998) (Excerpts including DirectShow SDK and Broadcast Architecture Programmer’s Reference) (“Platform SDK”, Exhibit SE1004);

• Disc Image of Microsoft Developer Network Platform SDK (DISC 6) (January 1998) (Exhibit SE1005);

• U.S. Patent No. 6,061,692 to Thomas et al. (“Thomas”, Exhibit SE1006);

• Giant Stakes in Cable, CNET (11/5/1997) (“CNET”, Exhibit SE1007);

• Claim Construction Order, TiVo Inc. v. Echostar Communications Corp., et al., 2:04-cv-00001 (8/18/2005) (Exhibit SE1011);

• Claim Construction Order, TiVo, Inc. v. AT&T Inc., et al., 2:09-cv-00259 (10/13/2011) (Exhibit SE1012);

• Claim Construction Order, TiVo, Inc. v. Verizon Comm’n, Inc. et al., 2:09-cv-00257 (3/12/2012) (Exhibit SE1013);

• Memorandum Opinion and Order, Motorola Mobility, Inc. et al. v. TiVo, Inc., 5:11-cv-00053 (12/06/2012) (Exhibit SE1014);

• Exhibit B, Preliminary Infringement Claim Chart for U.S. Pat. No. 6,233,389, Samsung Mobile Devices (“Infringement Contentions”, Exhibit SE1015);

• Prosecution History of Ex Parte Reexamination of claims 1, 3-5, 15-18, 20-25, 32, 34-36, 46-49, and 51-55 of the ’389 patent (Serial No. 90/007750) (“First Reexam”, Exhibit SE1016);

• Prosecution History of Ex Parte Reexamination of claims 31 and 61 of the ’389 patent (Serial No. 90/009329) (“Second Reexam”, Exhibit SE1017);

• Bescos, Jesus et al., From Multimedia Stream Models to GUI Generation (1997) (“Bescos”, Exhibit SE1018);

• Exhibit A, Supplemental Preliminary Infringement Claim Chart for U.S. Pat. No. 6,233,389, Samsung Mobile Devices (“Supplemental Infringement Contentions”, Exhibit SE1019).

8. Each of the foregoing references (not including the legal documents or patents) was published in publications or libraries with which I am familiar, and would have been available to and disseminated to members of the general technical community prior to July of 1998.

9. Although this Declaration refers to selected portions of some of the cited references for the sake of brevity, it should be understood that these are examples, and that one of ordinary skill in the art would have viewed the references cited herein in their entirety and in combination with other references cited herein or cited within the references themselves. The references used in this Declaration, therefore, should be viewed as being incorporated herein in their entirety.

10. I am not, and never was, an employee of the Petitioner in this proceeding, Samsung Electronics Co., Ltd. and Samsung Electronics America, Inc. I have been engaged in the present matter to provide my independent analysis of the issues raised in the petition for inter partes review of the ’389 patent. I received no compensation for this Declaration beyond my normal hourly compensation based on my time actually spent studying the matter, and I will not receive any added compensation based on the outcome of this inter partes review of the ’389 patent.

II. LEGAL PRINCIPLES

A. Anticipation

11. I have been informed that a patent claim is invalid as anticipated under 35 U.S.C. § 102 if each and every element of a claim, as properly construed, is found either explicitly or inherently in a single prior art reference. Under the principles of inherency, if the prior art necessarily functions in accordance with, or includes the claimed limitations, it anticipates.

12. I have been informed that a claim is invalid under 35 U.S.C. § 102(a) if the claimed invention was known or used by others in the U.S., or was patented or published anywhere, before the applicant’s invention. I further have been informed that a claim is invalid under 35 U.S.C. § 102(b) if the invention was patented or published anywhere, or was in public use, on sale, or offered for sale in this country, more than one year prior to the filing date of the patent application. And a claim is invalid, as I have been informed, under 35 U.S.C. § 102(e), if an invention described by that claim was described in a U.S. patent granted on an application for a patent by another that was filed in the U.S. before the date of invention for such a claim.

B. Obviousness

13. I have been informed that a patent claim is invalid as “obvious” under 35 U.S.C. § 103 in light of one or more prior art references if it would have been obvious to a person of ordinary skill in the art at the time of the invention of the ’389 patent (“POSITA”), taking into account (1) the scope and content of the prior art, (2) the differences between the prior art and the claims, (3) the level of ordinary skill in the art, and (4) any so-called “secondary considerations” of non-obviousness, which include: (i) “long felt need” for the claimed invention, (ii) commercial success attributable to the claimed invention, (iii) unexpected results of the claimed invention, and (iv) “copying” of the claimed invention by others. For purposes of my analysis, and because I know of no indication from the Patent Owner or others to the contrary, I have applied a date of July 30, 1998, as the date of invention in my analyses, although in many cases the same analysis would hold true even at a time earlier than July 30, 1998.

14. I have been informed that a claim can be obvious in light of a single prior art reference or multiple prior art references. To be obvious in light of a single prior art reference or multiple prior art references, there must be a reason to modify the single prior art reference, or combine two or more references, in order to achieve the claimed invention. This reason may come from a teaching, suggestion, or motivation to combine, or may come from the reference or references themselves, the knowledge or “common sense” of one skilled in the art, or from the nature of the problem to be solved, and may be explicit or implicit from the prior art as a whole. I have been informed that the combination of familiar elements according to known methods is likely to be obvious when it does no more than yield predictable results. I also understand that it is improper to rely on hindsight in making the obviousness determination.

III. OVERVIEW OF CONCLUSIONS FORMED

15. This expert Declaration explains the conclusions that I have formed based on my analysis. To summarize those conclusions:

• Based upon my knowledge and experience and my review of the prior art publications listed above, I believe that claims 31 and 61 of the ’389 patent are anticipated by Platform SDK.

• Based upon my knowledge and experience and my review of the prior art publications listed above, I believe that claims 31 and 61 of the ’389 patent are rendered obvious by Platform SDK.

IV. BACKGROUND KNOWLEDGE ONE OF SKILL IN THE ART WOULD HAVE HAD PRIOR TO THE PRIORITY DATE OF THE ’389 PATENT

16. The technology at issue in the ’389 patent generally relates to streaming of audio and video data. Prior to the filing date of the ’389 patent, there existed products, publications, and patents that implemented or described functionality claimed in the ’389 patent. Thus, the methodology of the ’389 patent was known in the prior art. Further, to the extent there was any problem to be solved in the ’389 patent, it had already been solved in prior art systems before the filing date of the ’389 patent.

A. Overview of the ’389 Patent

17. The ’389 patent’s disclosure “relates to the real time capture, storage, and display of television broadcast signals.” [SE1001 (the ’389 patent), 1:6-9.] Figure 1 of the ’389 patent provides a “high level view” of the ’389 patent’s system. [SE1001, 2:44-45.] I have reproduced Figure 1 below for clarity.

Figure 1. [SE1001, ’389 Patent FIG. 1.]

18. As shown in Figure 1, Input Module 101 receives an input stream (such as an analog television signal), converts the input stream into a digital MPEG format, and outputs a digital MPEG stream to Media Switch 102. [SE1001, 2:10-14, 3:30-65.] Downstream of Input Module 101 is Media Switch 102.

19. Media Switch 102 “parses the stream looking for MPEG distinguished events including the start of video, audio or private data segments.” [SE1001, 5:3-6.] When video or audio segments are found, Media Switch 102 indexes the segments in memory 104 and stores the segments in storage device 105. [SE1001, 5:6 to 6:7.]

20. Downstream of Media Switch 102 is Output Module 103. Output Module 103 reads the stored digital segments from storage device 105, decodes the segments into an analog signal, and outputs the analog signal. [SE1001, 4:5-9.]

21. Within the high level framework discussed above, claims 31 and 61 are directed to operations that control movement of data through the ’389 patent’s system. The operations are performed by three conceptual components, illustrated in Figure 8 below as “Sources,” “Transforms,” and “Sinks.”

Figure 2. [SE1001, ’389 Patent FIG. 8.]

22. Sources 801 accept digital data from an encoder and package the data in buffers acquired from transforms 802. Sources 801 then push the buffers to transforms 802. [SE1001, 7:58-61.] Transforms 802 write the buffers to a file on the storage medium or hard disk 804. At a later time, transforms 802 pull the buffers from hard disk 804 and sequence them with the stream - i.e., an operation the ’389 patent describes as performing a temporal transform. [SE1001, 8:3-8.] Sinks 803 then take the buffers from transforms 802 and send, to a decoder, digital video/audio data from the buffers.

23. The ’389 patent describes the use of an object-oriented programming language (such as the C++ programming language) to implement the program logic illustrated in ’389 patent Figure 8 above. As shown in ’389 patent Figure 9 below, the ’389 patent describes the use of a “source object” 901, a “transform object” 902, and a “sink object” 903, which correspond to sources 801, transforms 802, and sinks 803. [SE1001, 8:9-18, FIG. 9.] A “control object” 917 accepts user commands. [SE1001, 9:25-32.] I have reproduced Figure 9, below, and annotated it with colors and labels to show the source, sink, transform, and control objects, as well as the physical data source, storage device, and decoder.

Figure 3. [SE1001, ’389 Patent FIG. 9 (annotated).]

24. The source, transform, and sink objects operate in conjunction with the components described above in ’389 patent Figure 1. For example, the source object “takes data out of a physical data source, such as the Media Switch.” [SE1001, 8:43-45.] The ’389 patent explains that the source object calls the transform object for a buffer to fill. [SE1001, 8:45-48.] The transform object provides the empty buffer to the source object and then takes the full buffer from the source object and stores it on the hard disk or storage device 105 in Figure 1. [SE1001, 9:2-9.] The sink object calls the transform object for a full buffer and then sends the digital data to a decoder in Output Module 103 of Figure 1. [SE1001, 9:10-16.] It then releases the empty buffer to the transform object for use again by the source object. [SE1001, 8:55-59.]

25. Under this system, the source object waits for the transform object to provide an empty buffer. Similarly, the sink object waits for the transform object to provide a full buffer. According to the ’389 patent, “[t]his means that the pipeline is self-regulating; it has automatic flow control.” [SE1001, 8:48-49.]

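
The buffer hand-offs described in the preceding two paragraphs follow a classic producer/consumer pattern. The sketch below is my own illustration of that pattern, not code from the ’389 patent or Platform SDK (the patent describes a C++ implementation; Python is used here for brevity, and all names such as `Transform` and `run_pipeline` are hypothetical). The source blocks until the transform supplies an empty buffer, and the sink blocks until a full buffer is available, which is the “automatic flow control” behavior quoted above.

```python
# Illustrative sketch of the self-regulating pipeline described in the '389
# patent (my own code, not from the patent or Platform SDK): the source blocks
# until the transform supplies an empty buffer, and the sink blocks until a
# full buffer is available, so the pipeline throttles itself automatically.
import queue
import threading

class Transform:
    """Hands empty buffers to the source and full buffers to the sink."""
    def __init__(self, pool_size=2):
        self.empty = queue.Queue()  # buffers available for the source to fill
        self.full = queue.Queue()   # buffers ready for the sink to drain
        for _ in range(pool_size):  # small fixed pool -> built-in flow control
            self.empty.put([])

def source(transform, segments):
    """Fills empty buffers with 'captured' data and pushes them downstream."""
    for seg in range(segments):
        buf = transform.empty.get()   # waits if no empty buffer is available
        buf[:] = [seg] * 4            # stand-in for parsed MPEG segment data
        transform.full.put(buf)

def sink(transform, segments, decoded):
    """Drains full buffers ('decoding' them) and recycles them as empty."""
    for _ in range(segments):
        buf = transform.full.get()    # waits if no full buffer is available
        decoded.extend(buf)           # stand-in for sending data to a decoder
        buf.clear()
        transform.empty.put(buf)      # released buffer is reused by the source

def run_pipeline(segments=5):
    transform = Transform()
    decoded = []
    threads = [
        threading.Thread(target=source, args=(transform, segments)),
        threading.Thread(target=sink, args=(transform, segments, decoded)),
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return decoded

print(len(run_pipeline()))  # 5 segments x 4 elements = 20
```

Because the buffer pool is fixed (two buffers in this sketch), the source can never run more than two buffers ahead of the sink; the blocking `get()` calls are what make the pipeline self-regulating.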
B. Background Prior Art – Platform SDK

26. A review of other relevant literature available at the time shows that the idea of a pipeline being “self-regulating” or exhibiting “automatic flow control” was well known in the technical community by 1998. For example, Platform SDK is a software development kit (also known as an “SDK”) by Microsoft Corporation that contains the compilers, tools, documentation, header files, libraries, and samples needed for software development. Following industry-standard practices, the documentation for Platform SDK explained to developers how they could create applications using Platform SDK. The documentation was divided into different sections, and Exhibit SE1004 contains selected portions of Platform SDK. [SE1004.] The following image is a screenshot that I took of the Platform SDK documentation open in the standard Microsoft documentation viewer of the time. This image shows how a POSITA would have viewed Platform SDK in January of 1998. Exhibit SE1004 is a PDF of the Platform SDK documentation, which can be viewed in a PDF viewer today.

Figure 4. Platform SDK documentation user interface.

27. Platform SDK is a single publication on a single disc (see SE1005), with multiple sections. These multiple sections are akin to a book having multiple chapters. As shown in the figure above, the table of contents for Platform SDK lists the multiple sections contained in Platform SDK, such as “Broadcast Architecture” and “DirectShow.” One portion of Platform SDK included in Exhibit SE1004 is titled “Broadcast Architecture Programmer’s Guide” or “Broadcast Architecture” (seen in the figure above) and is directed to developing applications allowing computers to be used as broadcast clients and broadcast servers. [See SE1004, 2230-2919.] Another portion of Platform SDK included in Exhibit SE1004 is titled “Microsoft DirectShow SDK” or “DirectShow” (highlighted in the figure above) and is directed to developing video streaming services on computers using Microsoft DirectShow. [SE1004, 1-2229.] In these sections, Platform SDK describes systems for storage and playback of multimedia data, with the Broadcast Architecture section describing details of the relevant hardware and the DirectShow section describing details of the relevant software. [SE1004, 2260 (“The DirectShow filter graph is a Broadcast Architecture component that tells direct show what filters to use and how they are connected to each other.”), 2233.]

28. Platform SDK explains that the “Broadcast Architecture” section of Platform SDK “enables personal computers to serve as broadcast clients for digital data networks and for analog broadcast networks.” [SE1004, 2232.] The Broadcast Architecture section teaches “how to develop software for the broadcast client” including “Streaming Video Services, documenting the streaming video services of Broadcast Architecture in the sections Video Control, Enhancement Video Control, and DirectShow Filter Reference.” [SE1004, 2233.] Platform SDK includes an illustration that “shows how the data flows through the various components of the broadcast client hardware.”

Figure 5. [SE1004, 2266-2267 (annotated).]

This figure is annotated to show the hardware in Platform SDK that corresponds to certain hardware and software elements in claims 31 and 61 of the ’389 patent. Notably, I have identified the video card, which is labeled “MPEG decoder SVGA adapter,” as both the decoder and the physical data source as claimed. This is because the video card is used for both capture and playback in Platform SDK. [SE1004, 2269, 2271-2273 (describing video card requirements).]

29. The Broadcast Architecture section of Platform SDK references the DirectShow section of Platform SDK for streaming video. [SE1004, 2233, 2260.] “DirectShow services [] provide playback multimedia streams from local files or Internet servers, and capture of multimedia streams from devices,” which “enables playback of video and audio content compressed in various formats.” [SE1004, 1.]

DirectShow processes streamed data using objects that are called “filters.” [SE1004, 142.] Platform SDK explains filters and filter graphs as follows:

    The DirectShow architecture defines how to control and process
    streams of multimedia data using modular components called filters.
    The filters have input or output pins, or both, and are connected to
    each other in a configuration called a filter graph. Applications use an
    object called the filter graph manager to assemble the filter graph and
    move data through it.

[SE1004, 142 (emphasis added).] Platform SDK shows filters in a filter graph for playing media using the following illustration.

Figure 6. [SE1004, 143 (annotated).]

30. The above figure has been annotated to show the objects disclosed in Platform SDK that correspond to the source, transform, sink, and control objects claimed in claims 31 and 61 of the ’389 patent. The control object can be used to play media by accessing DirectShow COM interfaces directly, or through, for example, the ActiveMovie Control. [SE1004, 142-143.]

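
The filter-graph model quoted above (modular filters with pins, assembled and driven by a graph manager) can be sketched in simplified form. This is my own illustration, not the DirectShow API: actual DirectShow applications use COM interfaces such as IGraphBuilder and IMediaControl, and every class and method name below is a hypothetical stand-in chosen for clarity.

```python
# Simplified illustration of the filter-graph model quoted from Platform SDK.
# This is NOT DirectShow code; all names here are hypothetical stand-ins.
class Filter:
    """A modular processing component with one output pin in this sketch."""
    def __init__(self, name, process):
        self.name = name
        self.process = process      # transformation applied to each sample
        self.output_pin = None      # downstream filter, set by connect()

class FilterGraphManager:
    """Assembles the filter graph and moves data through it."""
    def __init__(self):
        self.filters = []

    def add_filter(self, f):
        self.filters.append(f)
        return f

    def connect(self, upstream, downstream):
        upstream.output_pin = downstream

    def run(self, samples):
        out = []
        for sample in samples:
            x = sample
            f = self.filters[0]     # first filter added acts as the source
            while f is not None:    # push the sample down the chain of pins
                x = f.process(x)
                f = f.output_pin
            out.append(x)
        return out

# Assemble a source -> transform -> renderer chain, mirroring Figure 6.
graph = FilterGraphManager()
src = graph.add_filter(Filter("source", lambda x: x))        # reads the media
dec = graph.add_filter(Filter("transform", lambda x: x * 2)) # e.g., decompress
ren = graph.add_filter(Filter("renderer", lambda x: x))      # displays output
graph.connect(src, dec)
graph.connect(dec, ren)

print(graph.run([1, 2, 3]))  # [2, 4, 6]
```

The point of the sketch is the separation of roles quoted from Platform SDK: the filters only process data, while a distinct graph-manager object assembles the graph and moves data through it.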
31. In addition to the annotated figure shown above, I am also providing a color-coded version of that same figure for comparison with the ’389 patent’s FIG. 9. In the following figure, common features having similar functions are colored the same.

Figure 7. [SE1001, ’389 Patent FIG. 9, SE1004 143 (both annotated).]

32. Platform SDK’s transform filter (which functions as the transform object) performs automatic flow control, including automatic flow control of both the source filter (source object) and the renderer filter (sink object). [SE1004, 142-143, 160-165.] In addition to automatic flow control, Platform SDK also describes the filter objects of DirectShow performing the various other functions claimed in claims 31 and 61, as I further explain below. Accordingly, I find that a POSITA would have understood Platform SDK as literally teaching each and every claim element of claims 31 and 61 of the ’389 patent.

33. Even if Platform SDK is considered not to expressly teach one or more of the claimed features being connected and functioning exactly as claimed, a POSITA would still have found that each of the claimed features and functionality was at least described by Platform SDK. A POSITA would also have considered the combination of such features to be predictable and obvious in light of the teachings of Platform SDK. For example, Platform SDK repeatedly emphasizes the “flexibility” of the “modular software components” described in Platform SDK, and also explicitly cross-references other sections to show the reader where to find the information teaching to combine features. [See SE1004, 2250-2252; see also 2235-2236, 2252, 2260, 2814-2815.] Therefore, even if Platform SDK is found not to anticipate claims 31 and 61, a POSITA would have found any such difference to be predictable and obvious over the teachings of Platform SDK.

C. Other Background Prior Art