CONTAINS PROTECTIVE ORDER MATERIAL
I, Atif Hashmi, hereby declare and state as follows:

I. INTRODUCTION

1. I have been retained as an expert witness for Inter Partes Review (“IPR”) proceedings, IPR2018-00766 and IPR2018-00767, related to two patents, U.S. Patent Nos. 7,391,791 (“the ’791 Patent”) and 8,942,252 (“the ’252 Patent”) (collectively, “the Patents”).
2. More specifically, I have been retained to review source code provided by Implicit (“the Implicit Source Code”) and to provide an opinion regarding whether that source code practices the claims of the ’791 Patent and ’252 Patent that are involved in these proceedings, namely claims 1–3, 6–9, 12, 16, 19, and 23–25 of the ’791 Patent (“the Challenged ’791 Patent claims”) and claims 1–3, 8, 11, and 17 of the ’252 Patent (“the Challenged ’252 Patent claims”).
3. For the reasons detailed in this declaration and the attachments thereto, it is my opinion that the Implicit Source Code practices the Challenged ’791 Patent claims and the Challenged ’252 Patent claims.
II. BACKGROUND AND QUALIFICATIONS

4. A copy of my Curriculum Vitae (“CV”) is attached to this declaration as Appendix 1, which contains a detailed record of my professional qualifications, some of which I summarize below.
Page 1 of 29

Implicit Exhibit 2080
Sonos v. Implicit, IPR2018-0766, -0767
5. I am an engineer and co-founder of Thalchemy Corporation, which develops complex neural network systems for efficient processing of digital data produced by electronic sensors available in modern smartphones and wearable devices. I also provide technical consulting services for intellectual property litigation. A complete list of my publications, professional activities, and honors that I have received is fully set forth in my CV.

6. I received a B.S. in Computer Engineering from Lahore University of Management Sciences in Pakistan and subsequently received an M.S. and a Ph.D. in Electrical Engineering from the University of Wisconsin – Madison. My educational training and research have focused on processing of digital signals such as audio and video, computer networks, operating systems, and design and development of hardware circuits and software for computing systems. I have studied and developed algorithms and systems for data capture and processing, synchronization of data obtained from different sources, and computer network communication and synchronization, and I have also designed systems using digital circuits.
7. Some of my relevant professional experience includes working at Intel Corporation, where I worked on both software and hardware projects related to the design and implementation of Intel’s next-generation graphics processing cores and data communication between these cores. I also collaborated with research and development teams at IBM developing complex software and hardware systems that emulate the functionality of the mammalian brain to capture and synchronously process sensory data obtained from multiple sensory modalities. This project was funded by the United States Department of Defense under the DARPA SyNAPSE program.1 Afterwards, I co-founded a company, Thalchemy Corporation, to commercialize software and hardware technology for smartphones and wearable devices. At Thalchemy, I designed and implemented software and hardware to obtain and synchronize data generated by sensors present in modern smartphones and wearable devices. I have also developed software and hardware to process, filter, and transmit the sensory data to remote servers for additional processing and storage, as well as software for servers to receive, store, and further process the transmitted data. Over the years, Thalchemy has received several grants and awards from the National Science Foundation to support research and development activities. I have developed software using many programming languages, including C, C++, Java, Python, JavaScript, Embedded C, and assembly language.
8. I have authored several research papers and articles related to computer hardware and software in peer-reviewed Computer Science and Electrical Engineering conferences and workshops; several of these publications have received best paper awards. I have also been an invited speaker at venues including academic conferences and technology companies. A complete list of my publications and invited talks is included in my CV.

1 See http://www.darpa.mil/program/systems-of-neuromorphic-adaptive-plastic-scalable-electronics, retrieved October 2018.
9. I am a named inventor on multiple patents and patent applications for neural network software and hardware for processing sensory data. A complete list of patents and patent applications on which I am a named inventor is included in my CV.
10. I have also provided technical consulting for cases involving hardware and software patent infringement, software copyright infringement, trade secret theft, and source code quality. I have given testimony as an expert and submitted reports and declarations in which I offered opinions based on my technical analysis. As a technical consultant, I have analyzed hardware circuits and systems for audio and video streaming and decoding, computer networks, operating systems, neural networks, graphics display systems, and wireless and subsea communication systems. Additional details about specific cases can be found in my CV.
III. COMPENSATION

11. Quandary Peak Research is being compensated at the rate of $325 an hour for my time consulting on this matter and $487.50 an hour for my time spent consulting on nights and weekends on this matter. My compensation does not depend on the outcome of these IPR proceedings.
IV. MATERIALS CONSIDERED

12. In developing my opinions set forth in this declaration, I have reviewed, among other materials, the ’791 and ’252 Patents, Provisional Application No. 60/341,574, Dr. Roman Chertov’s declaration for Inter Partes Review of U.S. Patent No. 7,391,791 and of U.S. Patent No. 8,942,252 (Sonos Exhibit 1009 in the IPR of the ’791 Patent; Sonos Exhibit 1009 in the IPR of the ’252 Patent), and the Implicit Source Code. I have also reviewed the materials cited in this declaration.
V. LEGAL STANDARDS APPLIED

13. I am not an attorney and will not offer any opinions on the law.

14. I understand that for a product to practice a claim, it must meet the language of each claim limitation, as properly construed, viewed through the perspective of a person having ordinary skill in the art.

15. I also understand that terms are to be read not only in the context of the particular claim in which the term appears, but in the context of the entire patent, including the specification.
16. For the purposes of my analysis in this declaration, I applied the claim constructions that Dr. Chertov applied (Sonos Exhibit 1009 in the IPR of the ’791 Patent, at 25–28; Sonos Exhibit 1009 in the IPR of the ’252 Patent, at 25–26) and the definition of a person having ordinary skill in the art that Dr. Chertov applied (Sonos Exhibit 1009 in the IPR of the ’791 Patent, at 28; Sonos Exhibit 1009 in the IPR of the ’252 Patent, at 12). To the extent Dr. Chertov did not provide a construction, I used the ordinary and customary meaning of the word(s), as would be understood by a person having ordinary skill in the art in the context of the field of the invention, at the time of the invention, to construe such a claim term or phrase. I do not offer my own opinion in this declaration and its accompanying exhibits on whether the constructions in Dr. Chertov’s declaration are correct, or on whether Dr. Chertov’s definition of a person having ordinary skill in the art is correct.
VI. OPINIONS REGARDING THE ’791 AND ’252 PATENTS

17. In my opinion, the Implicit Source Code practices the Challenged ’791 Patent claims and the Challenged ’252 Patent claims. The claim charts detailing my opinions are attached as Exhibits 2080 and 2081.
18. I reviewed the source code files and software specification/technical documents exported from a Concurrent Versions System (“CVS”) archive of the Implicit Source Code. The exhibits referenced below include some of the source code and technical documents I reviewed and referenced in the attached claim charts. However, I had access to and was able to review the full code exported from CVS as it existed as of November 1, 2001, as well as the full code exported from CVS as it existed as of November 15, 2001, as needed to form my opinions reflected in this declaration.
19. I have experience with source code repository systems like CVS that provide the time and date stamp for source code files. In my experience, skilled artisans in the field rely on the time and date stamps, version numbers, and other metadata on source code files like those exported from CVS. I also have experience with the metadata that is provided by a computer file system, such as the date created and date modified information for a file. In my experience, skilled artisans in the field rely on that information to determine when a file was created or modified. Thus, the Implicit Source Code discussed in this declaration existed by at least the “checkout” date of the code from CVS, here November 1, 2001 for certain source code files (those discussed relating to the test/audiosync/ and the test/demo/ folders) and November 15, 2001 for other source code files (those related to the test/demo2/ folder), as detailed in my declaration. Besides other metadata, when available, I used the SOS_VERSION string to verify the date of the source code and the file creation timestamp.
A. The General Structure of the Implicit Source Code

20. The Implicit Source Code operates by executing “rules” files that invoke services provided by “beads” to perform specific tasks. A rule is invoked when certain conditions are met, and a bead implements the software that the rule invokes to perform a task. The rules files I reviewed were written in extensible markup language (“XML”) format, a well-known metalanguage format that is similar to hypertext markup language (“HTML”) format.
21. In the Implicit Source Code, a rule typically comprises at least one route which, in turn, comprises at least one step. The following is a typical rule structure in the Implicit Source Code:

<RULE>
    <DESCRIPTION> . . . </DESCRIPTION>
    <PREDICATE value=". . ." />
    <ROUTE>
        <STEP>
            <BEAD name=". . ." />
            <EDGE name=". . ." />
            <SEED value=". . ." />
        </STEP>
    </ROUTE>
</RULE>
22. The rule executes if the rule’s conditions (the <PREDICATE .../>) are met. A rule can include four types of steps. The first is a bead step (<BEAD name=". . ."/>), which invokes and passes data to an identified bead. The edge step (<EDGE>) is connected to the bead step to specify the “edge” of the bead that is invoked, usually either the “encode” edge or the “decode” edge of the bead. The rules also contain a seed step (<SEED value="namespace:. . ."/>) that populates a namespace with seed values. The namespace is akin to the operating environment for a set of rules that make up an application. Lastly, the rules may contain a loopback step (<LOOPBACK . . . />). That step specifies what information should be sent in the opposite direction of the original data flow.
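The dispatch of these four step types can be sketched in a few lines. The following Python sketch is purely illustrative: the Implicit beads themselves are implemented in C, and every name here (run_rule, predicate_holds, the dictionary shapes) is my own assumption rather than anything taken from the Implicit Source Code.

```python
# Illustrative sketch of how a rules engine might dispatch the four step
# types described above (bead, edge, seed, loopback). All names and data
# shapes are hypothetical, not from the Implicit Source Code.

def run_rule(rule, namespace, beads, predicate_holds):
    """Execute a rule's route if its predicate is met; return the data."""
    if not predicate_holds(rule["predicate"], namespace):
        return None
    for step in rule["route"]:
        kind = step["kind"]
        if kind == "bead":
            # Invoke the named bead on the data flowing along the route.
            namespace["data"] = beads[step["name"]](namespace["data"], namespace)
        elif kind == "edge":
            # Record which edge (e.g. "encode" or "decode") the bead uses.
            namespace["edge"] = step["name"]
        elif kind == "seed":
            # Populate the namespace with a "key:value" seed.
            key, value = step["value"].split(":", 1)
            namespace[key] = value
        elif kind == "loopback":
            # Mark information to send back against the original data flow.
            namespace.setdefault("loopback", []).append(step["value"])
    return namespace["data"]
```

Under this sketch, a seed step whose value is "Content-Type:audio/pcm" would leave Content-Type set to audio/pcm in the namespace, mirroring the seed step described above.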
23. Besides reviewing the Implicit Source Code, I also reviewed two documents to confirm my understanding of how the rules and beads operate within the Implicit Source Code. The first was a document entitled “Using Strings to Compose Applications from Reusable Components.” Exhibit 2021. The second was a document within the source code files. Exhibit 2022. Experts in the field of computer science and software engineering often rely on software specification/technical documents in conjunction with the source code to determine how various software elements are implemented and how they would operate.
24. The following rule from the Implicit Source Code provides an example to illustrate how the rules and beads are connected. This rule is from the syncaudio.rule file, and it functions to play synchronized pulse-code modulated (“PCM”) audio at a slave device.

Exhibit 2066.
25. This rule, as described, plays synchronized PCM audio on the slave device listening on Port 9002. The <PREDICATE> statement indicates that the rule is invoked if the device has received and decoded a User Datagram Protocol (“UDP”) or Transmission Control Protocol (“TCP”) transmission at Port 9002. If the predicate is met, the rule executes four steps along the rule’s route.
26. In the first step, the rule decodes the received PCM audio frame using the framer bead. This functionality is described in more detail below and in the attached claim charts.
27. In the second step, the rule executes the clocksync bead to decode the master rendering time sent by a master device. In this case, invoking the clocksync bead updates the master rendering time maintained at the slave device based on the time domain offset information that the clocksync bead receives from the timesync bead running in the background. The clocksync bead expresses the updated master rendering time in the time domain of the slave device. This functionality is described in more detail below and in the attached claim charts.
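The conversion described in this step reduces to shifting the received master rendering time by the timesync offset so that it is expressed in the slave’s own time domain. A minimal illustrative sketch follows; the Python form, the names, and the millisecond units are my assumptions, not Implicit’s code.

```python
# Hypothetical sketch: re-express a received master rendering time in the
# slave's time domain using the offset maintained by a timesync service,
# and record it in the namespace's MasterClock entry. Names and units
# (milliseconds) are illustrative assumptions.

def update_master_clock(namespace, master_render_time_ms, slave_minus_master_offset_ms):
    """Store the master rendering time, shifted into the slave's time
    domain, as the slave's view of the MasterClock."""
    namespace["MasterClock"] = master_render_time_ms + slave_minus_master_offset_ms
    return namespace["MasterClock"]
```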
28. In this second step, the rule also sets three elements in the namespace. It sets the Content-Type to PCM audio. It also assigns the MasterClock and the RenderClock to instances of the sampleclock class.
29. In the third step, the rule invokes the audiosync bead to adjust the audio stream at the slave device to synchronize playback with the master device based on the difference between the MasterClock and RenderClock values, described in more detail below and in the attached claim charts.

30. In the fourth and final step, the rule invokes the speaker bead to play the audio.
B. Overview of Implicit Source Code Files

31. I reviewed the Implicit Source Code in connection with this declaration. Based on my review, the Implicit Source Code implements its intended purpose of synchronizing audio and video content in the manner I describe in this declaration and the accompanying claim charts. I reviewed three sets of rules in connection with this declaration. The first were in the directory test/audiosync/package/package. The second set of rules were in the directory test/demo/rules. The third set of rules were in the directory test/demo2/rules. I also reviewed the source code for a number of beads that were in the directory beads. This section contains a high-level summary of those files.
i. Rules

32. In connection with my analysis, I reviewed the files in the test folder that relate to synchronizing the rendering of audio and/or video on a master device and a slave device. Those include the rules files in the test/audiosync directory, the test/demo directory, and the test/demo2 directory.
33. I focused my analysis on the rule files in the test/demo directory. This was primarily because those files described a complete distributed system for audio and video synchronization and rendering implemented within the Implicit Source Code with an export date of November 1, 2001. The rules files in the test/demo directory also were the most recent from that exported codebase in terms of their date modified, all on or before October 29, 2001. However, my review of the files from the other folders listed above indicated that those files also implemented the synchronization functionality I describe in this declaration, including via the rules in the audio.rule, video.rule, video2.rule, and timesync.rule files in the test/audiosync folder and, for audio, the audioplayerapp.rule, clocksync.rule, timesync.rule, and syncaudio.rule files in the test/demo2/rules directory. I reserve my right to perform a comprehensive analysis on those and other files if permitted.
34. The files in the test/demo/rules directory operate to synchronize audio and video playback over a network from at least one master device to at least one slave device.
35. The rules in the pcmserveraudio.rule and syncaudio.rule files, among other functionality, designate a master device, play PCM audio on the master device, send the PCM audio to at least one slave device, and then adjust the audio stream at the slave device(s) so it renders synchronized audio with the master device. The timesync.rule file designates a timesync bead that is used as part of synchronizing the audio at the slave device. In addition, the folder test/demo/rules contains additional rules, including mp3serveraudio.rule and syncmp3.rule, to synchronize the playback of mp3 audio over three devices using a similar synchronization method.
36. The rules in the videomulti.rule and videoclient.rule files, in conjunction with the rules files discussed above, among other functionality, divide a combined audio and video stream into an audio part and a video part, designate a master device, render the separated audio and video content on the master device, and then synchronize the playback of the audio and the playback of the video on slave devices.
ii. Beads

37. The rules in the rules files that I identified above use a number of beads or services for various processing steps. Below is a summary of each of those beads. The source code for each bead is the “.c” file generally located within a directory named after the bead within the /beads directory. However, the clocksync bead source code is located in the directory for the timesync bead. The source code for the UDP and TCP beads is located in the /beads/socket directory. The source code for the IP bead is located in the /beads/ipv4 directory.
38. The core beads for the synchronization functionality that I describe in this declaration and the attached claim charts are clocksync, timesync, and audiosync. The system may utilize other beads depending on whether the system is synchronizing audio or video (or both), the number of devices on the network rendering the content, and other considerations. I will first discuss the aforementioned three beads and then discuss the remaining beads.
39. The audiosync bead – This bead determines the rendering time differential between the master and slave devices and adjusts the audio stream playback at the slave device by either dropping data, padding data, or resampling data in an effort to make the rendering time on the slave device match the rendering time of the master device. The bead achieves this adjustment by computing a rendering time difference using the MasterClock and RenderClock variables accessed after invoking the decode edge of the clocksync bead (which itself calls the timesync bead functions to determine the time domain differential). The audiosync bead then smooths the adjustment by using a moving average of the last eight rendering time difference values. If the master rendering time is greater than the slave rendering time by a prespecified threshold (200 ms), signifying that the audio playback by the slave device is too far ahead of the master device, the audiosync bead duplicates the audio samples as necessary to delay the content playback by the slave device. If the audio playback by the slave device is behind the master device by a prespecified threshold (200 ms), the audiosync bead discards some or all of the samples to speed up the playback by the slave device. If the audio is a little early or a little late (greater than 4 ms and less than or equal to 200 ms), the bead resamples the packet to stretch it out or shrink it down. The version of the audiosync bead I reviewed was version 1.12, which was last modified on October 23, 2001.
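The adjustment policy described in this paragraph can be summarized in a short sketch. The eight-sample moving average and the 4 ms / 200 ms thresholds come from the paragraph above; everything else, including the use of Python rather than the bead’s C, the class and method names, and the sign convention, is my illustrative assumption.

```python
from collections import deque

# Illustrative reconstruction of the adjustment policy described above:
# smooth the master/slave rendering-time difference with a moving average
# over the last eight samples, then pad, drop, or resample. The 4 ms and
# 200 ms thresholds are from the text; names and sign convention are
# hypothetical, not from the Implicit Source Code.

class AudioSync:
    def __init__(self):
        self.diffs = deque(maxlen=8)  # last eight difference samples

    def decide(self, master_clock_ms, render_clock_ms):
        """Return the adjustment to apply to the next audio packet."""
        self.diffs.append(master_clock_ms - render_clock_ms)
        avg = sum(self.diffs) / len(self.diffs)
        if avg > 200:
            return "pad"       # slave too far ahead: duplicate samples
        if avg < -200:
            return "drop"      # slave too far behind: discard samples
        if abs(avg) > 4:
            return "resample"  # slightly off: stretch or shrink the packet
        return "none"
```

The deque with maxlen=8 gives the eight-sample window for free: appending a ninth sample silently evicts the oldest one, so the average never looks further back than eight packets.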
40. The timesync and clocksync beads – These beads are used in conjunction with each other to determine and adjust for the time domain differential between each device on a network. The clocksync bead is used to propagate and receive rendering times over a network link. Corresponding to the decode edge of the clocksync bead at the slave device, the clocksync bead, upon receiving a remote master rendering time, decodes it and calculates the time domain differential between the master and the slave devices based on the time domain offset between the slave device and the master device provided by the timesync bead. The timesync bead, which is constantly updating in the background as a service per the timesync.rule file, determines the time domain offsets of each device on the network with respect to every other device on the network. The timesync bead determines the time domain difference using the formula ((t1 − t0) + (t2 − t3)) / 2, where t0 is the device time at which a sending device sends a packet to the recipient device, t1 is the device time at which the recipient device receives the sent packet, t2 is the device time at which the recipient device sends a response to the sending device, and t3 is the device time at which the sending device receives the response from the recipient device. The bead first finds the highest minimum and the lowest maximum value from among the last eight difference samples to ensure that the offset does not radically jump from sample to sample. Eventually, the time domain difference computed by the timesync bead is used by the clocksync bead to adjust for the time domain differential between the sending and receiving device. The version of the timesync bead that I reviewed was version 1.14, which was last modified on October 23, 2001. The version of the clocksync bead I reviewed was version 1.11, which was last modified on October 23, 2001.
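The formula above is the classic symmetric round-trip estimate (the same form used by NTP), in which the outbound and return legs cancel the transmission delay and leave the clock offset. The following Python sketch illustrates it. The windowed min/max filtering shown is only my reading of the “highest minimum and lowest maximum” description in this paragraph, and all names are mine, not Implicit’s.

```python
from collections import deque

# Sketch of the offset formula ((t1 - t0) + (t2 - t3)) / 2 plus a
# jump-rejection filter over the last eight samples. The exact filtering
# rule in the bead is described only loosely above, so the "drop the
# extreme low and extreme high, then clamp" reading here is an assumption.

def clock_offset(t0, t1, t2, t3):
    """Estimated recipient-minus-sender clock offset for one exchange."""
    return ((t1 - t0) + (t2 - t3)) / 2

class TimeSync:
    def __init__(self):
        self.samples = deque(maxlen=8)  # last eight offset estimates

    def update(self, t0, t1, t2, t3):
        """Add one exchange and return a filtered offset estimate."""
        self.samples.append(clock_offset(t0, t1, t2, t3))
        s = sorted(self.samples)
        lo = s[1] if len(s) > 2 else s[0]    # "highest minimum": drop extreme low
        hi = s[-2] if len(s) > 2 else s[-1]  # "lowest maximum": drop extreme high
        return min(max(self.samples[-1], lo), hi)
```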
41. The framer bead – In the context of the Implicit Source Code, this bead encodes video and audio frames for transmission between devices and, upon receipt, decodes the video and audio frames to enable their playback. The version of the framer bead that I reviewed was 1.3, which was last modified on October 11, 2001.
42. The UDP bead – In the context of the Implicit Source Code, this bead formats the data (e.g., content frame and timing information) for transmission or receipt using UDP. The version of the UDP bead for Win32 that I reviewed was 1.16, which was last modified on September 2, 2001. The version of the UDP bead for POSIX that I reviewed was 1.10, which was last modified on September 2, 2001.

43. The TCP bead – In the context of the Implicit Source Code, this bead formats the data (e.g., content frame and timing information) for transmission or receipt using TCP. The version of the TCP bead for Win32 that I reviewed was 1.17, which was last modified on August 10, 2001. The version of the TCP bead for POSIX that I reviewed was 1.10, which was last modified on August 10, 2001.
44. The IP bead – In the context of the Implicit Source Code, this bead formats the audio or video frame for transmission or receipt over an Internet Protocol (“IP”) network. The version of the IP bead that I reviewed was 1.1, which was last modified on August 31, 2001.
45. The fanout bead – This bead copies a single input to multiple outputs. In the context of the Implicit Source Code, this bead copies the audio and video streams to distribute them to other devices on the network. The version of the fanout bead that I reviewed was 1.2, which was last modified on October 23, 2001.
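The fanout behavior, one input buffer copied to each downstream path, can be illustrated in a few lines. The actual bead is C code within the Implicit Source Code; the Python form and the names below are mine.

```python
# Illustrative sketch of fanout: deliver a copy of one input buffer to
# each of several downstream paths (e.g., a local render path and a
# network send path). Names are hypothetical, not from the source code.

def fanout(data, outputs):
    """Send an independent copy of `data` to every downstream path."""
    for out in outputs:
        out(bytes(data))  # each path receives its own copy of the buffer
```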
46. The avidemux bead – This bead receives incoming data in Audio-Video Interleave (“avi”) format and separates the combined audio and video stream into a video stream of encoded bitmaps and an audio stream of PCM audio. The video stream is sent along the main path and the audio is sent along a child path. The bead, upon instantiation, creates a master rendering clock (IAudioClock) associated with the audio content stream, to which the MasterClock data structures for both the audio stream and video stream are linked. The version of the avidemux bead I reviewed was version 1.36, which was last modified on October 2, 2001.
47. The bmptorgb bead – This bead renders an encoded bitmap image into a 24-bit red-green-blue (“RGB”) formatted bitmap image. The version of the bmptorgb bead I reviewed was version 1.5, which was last modified on October 2, 2001. The bmpdecoder bead is part of the bmptorgb bead. The bmpdecoder bead decodes a video stream such that the decoded video stream can eventually be displayed on a device. The version of the bmpdecoder bead that I reviewed was 1.1, which was last modified on September 6, 2001.
48. The rgbvideo bead – This bead displays RGB video on the default RGB device, such as a computer monitor. The version of the rgbvideo bead that I reviewed was 1.9, which was last modified on October 2, 2001.
49. The speaker bead – This bead sends audio to an output device. The version of the speaker bead that I reviewed was 1.25, which was last modified on October 23, 2001.
C. Overview of How the Source Code Implements Synchronization of Audio and Video Content

50. The claim charts in Exhibits 2080 and 2081 show in detail how the Implicit Source Code practices each element of each of the Challenged ’791 Patent claims and each of the Challenged ’252 Patent claims. I describe a few example embodiments below.
51. The most straightforward case of the synchronization functionality of the Challenged ’791 Patent claims and the Challenged ’252 Patent claims is the audio-only case for PCM audio in which the master device first receives the content from a source. That functionality is generally reflected in the pcmserveraudio.rule and syncaudio.rule files found in the test/demo/rules folder. These rules also use the timesync service from the rules in the timesync.rule file.
52. In this scenario, a device receives PCM audio from a source device. The device decodes the audio using the framer bead and then creates two parallel paths for the audio using the fanout bead, one path to render the audio on the receiving device and one path to send the audio to a remote device. The rule designates the device receiving the PCM audio from the source device as the master device by setting the MasterClock variable in the namespace as master. This designation results in the other devices on the network being designated as slave devices.
53. Through the first path, the master device renders the audio and sets the RenderClock variable in the namespace to the rendering clock at the master device.
54. Through the second path, the master device invokes the clocksync bead to encode the master rendering time and updates the MasterClock variable to the rendering time at the master device. The master device then encodes the PCM audio frame using the framer bead and sends the frame and the encoded data from the clocksync bead to the slave device through the UDP and IP beads.
55. The slave device then receives the transmission and performs the four steps outlined above: (1) decoding the framed PCM audio using the framer bead; (2) invoking the clocksync bead to decode the received clock information and obtain the master rendering time adjusted by the time domain differential between the master device and the slave device and updating the MasterClock data structure accordingly; (3) invoking the audiosync bead to adjust the audio stream based on the rendering difference between the MasterClock and RenderClock values; and (4) rendering the adjusted audio using the speaker bead.2

2 In addition to this more straightforward case, the rules in the Implicit Source Code also contain an embodiment for synchronizing the playback of mp3 audio over three devices and in which a remote device receives the initial mp3 content and is designated the master device, reflected in the mp3serveraudio.rule and syncmp3.rule files. In that scenario, the invocations of the clocksync, timesync, and audiosync beads are similar, except that the remote master device sets itself as the master device after rendering the content and the local slave device invokes the audiosync bead to synchronize its audio playback to the remote master device.
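The four slave-side steps enumerated above can be tied together in a single sketch. This is an illustrative Python pipeline mirroring the framer, clocksync, audiosync, speaker sequence; every name, unit, and data shape is my assumption rather than anything from the Implicit Source Code.

```python
# Hypothetical end-to-end sketch of the four slave-side steps described
# above: (1) deframe, (2) shift the master time into the slave's time
# domain, (3) choose an audio adjustment, (4) hand samples to the
# speaker. All names, units (ms), and data shapes are illustrative.

def slave_pipeline(packet, offset_ms, render_clock_ms, speaker):
    # Step 1: "framer" decode - split the timing header from the PCM payload.
    master_time_ms, pcm = packet
    # Step 2: "clocksync" decode - express master time in the slave's domain.
    master_clock_ms = master_time_ms + offset_ms
    # Step 3: "audiosync" - compare the clocks and pick an adjustment
    # (thresholds follow the 4 ms / 200 ms values described earlier).
    diff = master_clock_ms - render_clock_ms
    if diff > 200:
        action = "pad"
    elif diff < -200:
        action = "drop"
    elif abs(diff) > 4:
        action = "resample"
    else:
        action = "none"
    # Step 4: "speaker" - render the (possibly adjusted) audio.
    speaker(pcm, action)
    return action
```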
56. The case of synchronizing audio and video content has additional steps, some of which I discuss below. In addition to the rules files identified in the audio-only case above, this functionality is generally reflected in the videomulti.rule and videoclient.rule files.
57. In this case, a device first receives an audio-video frame and, after decoding the transmission using the framer bead, demultiplexes the audio-video frame into its video and audio parts using the avidemux bead. That bead sets the path clock for the audio and designates the device that receives the audio-video stream from the source as the master device, which renders the audio and video after converting the video into RGB format via the bmptorgb bead. The other devices in the network are designated as slave devices.
58. Through the fanout bead, the master creates two paths: one to a remote slave device that will display the video and one to a separate remote slave device that will render the audio. In each p