I, Edward Balassanian, hereby testify as follows:

1. I have personal knowledge of the facts stated herein.

2. I am the founder, member, and manager of Implicit, LLC (“Implicit”), the Patent Owner in these proceedings, IPR2018-00766 and IPR2018-00767.

3. Implicit owns the two patents at issue in these proceedings, U.S. Patent Nos. 7,391,791 (“the ’791 Patent”) and 8,942,252 (“the ’252 Patent”) (collectively, “the Patents”). My understanding is that these proceedings involve claims 1–3, 6–9, 12, 16, 19, and 23–25 of the ’791 Patent and claims 1–3, 8, 11, and 17 of the ’252 Patent (collectively, “the Claims”).

`4.
`
`I am also the founder and owner of the predecessors-in-interest to
`
`Implicit, specifically BeComm Corp. (“BeComm”), Implicit Networks, Inc.
`
`(“Implicit Networks”), and Digbee Media Corporation (“Digbee”). I served as the
`
`President and Chief Executive Officer of BeComm, Implicit Networks, and
`
`Digbee.
`
5. The inventions of the Patents were initially owned by BeComm. BeComm changed its name to Implicit Networks in 2003. Implicit Networks changed its name to Digbee in 2006. Digbee then changed its name back to Implicit Networks in 2007. Implicit Networks then assigned its assets, including the Patents, to Implicit in 2013.

6. I am the lead inventor on both of the Patents. Scott Bradley, a former BeComm Development Manager, is listed as a co-inventor on both of the Patents.

Page 1 of 53
Implicit Exhibit 2001
Sonos v. Implicit, IPR2018-0766, -0767

CONTAINS PROTECTIVE ORDER MATERIAL

Prior to December 11, 2001, Mr. Bradley and I completed our inventions set forth in the Claims of the Patents. We originally conceived of the inventions set forth in the Claims of the Patents, and they were actually reduced to practice before December 11, 2001. The evidence in support of these statements includes my testimony in this sworn declaration and the Exhibits referenced herein.

7. The conception, design, development, building, and reduction to practice of the inventions claimed in the Patents occurred in the United States. BeComm, Implicit Networks, and Digbee were headquartered in the Seattle, Washington area. Implicit has its headquarters in Texas. The acts of conceiving and actually reducing the Claims of the Patents to practice were performed in the United States. Mr. Bradley and I were located in the Seattle, Washington area when the inventions of the Patents were conceived and reduced to practice. BeComm’s engineering staff were also located in the United States, mainly in the Seattle, Washington area, at the time the inventions set forth in the Claims of the Patents were conceived and reduced to practice.

8. I graduated from the University of Washington with a Bachelor’s degree in Computer Science in 1989, when I was 19 years old. After graduation, I worked as an engineer at Microsoft Corp. (“Microsoft”) for approximately five years. In 1995, I left Microsoft to start my own technology company. In 1996, I founded BeComm (which has since become Implicit).

I. Overview of BeComm and Strings

9. My vision when I founded BeComm in 1996 was to build a new type of operating system that could allow any application or device to interact with any other application or device. I sought to re-define how computing devices operated, which required architecting a new operating system from the ground up.

10. I am the inventor on over 25 issued U.S. Patents that stem from my work at BeComm. Microsoft, Apple, Google, Intel, AMD, Cisco, HP, Palo Alto Networks, and a number of other companies have licensed that patent portfolio for significant value.

11. In the traditional systems of the 1990s with which I was familiar, the operating system interacted with the drivers for the hardware and then, using fixed, defined pathways, relayed the data for an application to use. For example, a traditional video player would need to know the content source, video destination, and audio destination and would be bound to that hardware and that processing path.

12. A BeComm document described that type of traditional system in this way using a video player example:

Exhibit 2002, at 5.

13. This approach was the conventional way systems operated at the time. Many of the limitations were due to the design of the operating system software and architecture on those devices.

14. The BeComm operating system (initially named Portal and then released under the name Strings) was a fundamentally different architecture. It could, on-the-fly, provide a data flow from any source to any destination and provide the data in a format that the destination could consume or use.

15. The operating system accomplished that result by utilizing “beads” to process information instead of the fixed processing pathways. A “bead” was a routine or set of routines that could manipulate computer information, such as transcoding data for different media formats or transmission formats. The system worked by “stringing” together these “beads” to process data from a source to a destination.

16. This flexibility enabled communication and processing between different types of devices. Using Strings, a computing device could, for example, play audio and video that originated from a camera, as also described in the same internal BeComm document referenced above:

Exhibit 2002, at 9.

17. In this Distributed Media Player example, Strings would employ a sequence of beads (Framer, MPEG Encoder, RTP, and UDP) to process audio-video content originating from a camera into a format for transmission over an IP network, such as the Internet. On the receiving side, Strings would string together the beads (UDP, RTP, and MPEG Parser) to receive the IP packets, convert them into an MPEG stream, and parse the MPEG stream into video and audio parts. Strings then would process the video and ultimately output it to the video card for display. To do so, Strings would use a sequence of beads (MPEG Video Decoder, MPEG Player, UI, and GL). At the same time, Strings would process the audio and ultimately output it through a speaker using another sequence of beads (Dolby Digital (AC3) and Audio Player). In this example, the playback also could be synchronized depending on the configuration and use of beads.

18. A prototype of Portal containing some functionality was operating internally at BeComm by the fall of 1998. I demonstrated a number of Strings applications at the Consumer Electronics Show (“CES”) in January 2000 in Las Vegas, Nevada. I also demonstrated a number of Strings applications at DEMO 2000 on February 7, 2000. Exhibit 2003. Our demonstration won Best in Show at DEMO 2000.

19. The photograph below is from DEMO 2000 and depicts a handheld device, using Strings, playing a video feed from a VHS tape playing in a VCR.

Exhibit 2004, at 10.

20. I also demonstrated a number of other Strings functionalities at DEMO 2000, including: enabling a handheld device to browse a website and route media; and rendering content from a website and audio from a television feed onto a landline telephone. Exhibit 2004.

21. By the start of 2001, BeComm had dedicated over 50 person-years to building Strings. I provided the seed funding for BeComm, and, with cash flow generated from BeComm’s customers, the combined investment totaled over $10 million. A photo taken around the November 30, 2000 timeframe shows about twenty members of the BeComm team (which had grown to over thirty by that time). Exhibit 2005.

22. One of BeComm’s early major relationships was with Intel Corporation (“Intel”). In the 2000 to 2001 time period, BeComm ported Strings to the Intel Web Tablet, or iPAD, which was an internal project at Intel.

23. One key application of Strings was that it could render on the Intel Web Tablet content stored on the user’s personal computer (“PC”) or content streamed from over the Internet. The Intel Web Tablet connected to a user’s PC. Running on the user’s PC and on the Tablet, Strings would retrieve the content from the PC, process the content and convert it into a format for playback on the Tablet, transmit the content to the Tablet, and then render the media on the Tablet. Strings also could, through the PC, retrieve content from the Internet and render that content onto the Tablet, such as streaming Internet radio.

24. Through Strings, as BeComm explained on a webpage in April 2001, users could “enjoy digital audio on the tablet by playing MP3s stored on their PC or by listening to Internet radio from anywhere in the home.” Exhibit 2006.

25. In addition to the Intel Web Tablet project, BeComm worked on a number of other projects, some of which I describe in more detail below.

II. The Technology to Synchronize Content on Multiple Devices

26. Starting in late 2000, BeComm began working on streaming audio and video content from a source to multiple output devices. The Intel Web Tablet was an example of streaming audio through a PC to one destination device, the Tablet, but the underlying Strings software could distribute to multiple devices, not just one tablet. A BeComm webpage from 2001 discusses the potential of the “BeComm Audio Solution” for original equipment manufacturers in building connected devices for the home. Exhibit 2007.

27. In late 2000, Intel executives visited BeComm’s facilities and were impressed that Strings, at that time, could render content on multiple devices at the same time. BeComm and Intel began exploring a project for a potential Strings-enabled digital relay, called “Juno,” around this time. Exhibit 2008.

28. A problem BeComm encountered as it developed the Audio Solution was that, for users who wanted to play content on different devices at the same time, the playback was out of sync. This was because each device can have different time delays to receive, process, and render the content, and there are delays in transmitting the content between devices. In addition, formats such as video and audio require different processing, which can introduce complexity when trying to synchronize video and audio across devices. For example, if a host computer is playing an audio stream and also transmits that same audio (or associated video) stream to a remote computer for playback, by the time the remote computer receives, processes, and plays the audio stream, the host computer will already be further ahead in rendering the audio. On playback, this difference would often sound like an echo if the content was noticeably out of sync.

29. On December 4, 2000, BeComm completed a confidential Phase 0 document and presented it to Intel (codenamed “Jupiter” by BeComm internally). Exhibit 2009. In the document, “both Jupiter and BeComm recognize[d] that true synchronization is an unsolved computer science problem,” but stated that BeComm would make “a best effort to keep the playback at the Adapters synchronized.” Id. at 15.

30. In January 2001, BeComm entered into a contract with Intel for the Juno project. BeComm received $850,000, plus a 5% revenue share going forward of all the Intel Consumer Products Division related revenue. In the Phase 1 document dated January 26, 2001, BeComm stated that “[w]e have not yet finalized how Juno will implement the requirement that a Media Server session be able to simultaneously serve multiple concurrent Adapters and keep their playback synchronized” and that, in all cases of the multicasting implementations BeComm was considering, “the Adapter synchronization . . . will be difficult at best.” Exhibit 2011, at 37–38. The Juno project for Intel went on hold by the middle of February 2001. Exhibit 2012. It was never completed for Intel, due to market conditions and other reasons. We continued to develop the technology of the Juno project and ultimately got the software to operate.

31. The Juno project team consisted primarily of myself, Scott Bradley, Guy Carpenter (an Engineering Master), and two other engineers, Adam Greene and Neil Mintz.

32. The “Document Contributors” portion of the Juno Phase 1 document reflected our core engineering team on the project.

Exhibit 2011, at 8.

33. Around the time of the Juno project (and after the project for Intel went on hold), I contemplated how to achieve the best-possible synchronization of content across multiple devices as we continued our work. Mr. Bradley and I solved the synchronization problem and conceived the inventions set forth in the Claims of the Patents. We then began working on the implementation of the inventions, as detailed below. We communicated those inventions to BeComm’s internal engineering and development staff to reduce them to practice. We worked primarily with Guy Carpenter, an Engineering Master at BeComm, to implement the inventions, as I describe below.

34. BeComm’s embodiment of the synchronization functionality of the inventions is primarily housed within three beads: audiosync, timesync, and clocksync. Additional beads may be used with these beads, depending on the implementation.

35. BeComm utilized a Concurrent Versions System (“CVS”) repository for its source code and related files that logs the dates, changes, and versions of the files. On August 28, 2001, the first version of a test file avsync.c was checked into CVS, described as a “[n]ew test [that] checks audio and video synchronization code.” Exhibit 2013, at 2. The first version of the timesync bead was checked in on August 31, 2001, which described a “[n]ew protocol for inter-host time synchronization (incomplete).” Exhibit 2014, at 6. On September 10, 2001, major additions were made to split the clocksync bead from the timesync bead at version 1.9. Id. at 5. Later that day, on September 10, 2001, the first version of remotesync, a “[n]ew package for testing remote synchronization of audio/video over the network,” was checked into CVS. Exhibit 2015, at 3.

36. The first version of the audiosync bead was checked into CVS on September 28, 2001. Exhibit 2016, at 7. That version adjusted the audio stream on a slave device by adding data (which added silence) or dropping data (to speed up the rendering of content). Id. By version 1.12 of the audiosync bead, completed by at least October 23, 2001, the bead adjusted the audio stream by either dropping data, adding data, or resampling data (stretching or shrinking the audio packet). Exhibit 2017, at 1.

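The kinds of adjustments described in this paragraph (dropping data, adding data as silence, and resampling) can be sketched as follows. This is an illustrative reconstruction, not the audiosync source code; the function names, the sample-rate parameter, and the crude nearest-neighbor resampling are my own assumptions for illustration only.

```python
def adjust_audio(samples, drift_ms, rate_hz=44100):
    """Illustrative sketch of slave-side audio adjustment.

    drift_ms > 0: slave is ahead of the master -> insert silence to slow down.
    drift_ms < 0: slave is behind the master  -> drop samples to catch up.
    """
    n = abs(int(rate_hz * drift_ms / 1000))
    if drift_ms > 0:
        return [0] * n + samples   # add data: leading silence
    elif drift_ms < 0:
        return samples[n:]         # drop data: skip ahead
    return samples

def resample(samples, ratio):
    """Stretch (ratio > 1) or shrink (ratio < 1) an audio packet.

    Nearest-neighbor selection stands in for whatever interpolation
    the real bead used.
    """
    out_len = int(len(samples) * ratio)
    return [samples[int(i / ratio)] for i in range(out_len)]
```

A hypothetical usage: a slave that is 1 ms behind the master at a 2 kHz rate would drop 2 samples from the next packet; a packet shrunk with ratio 0.5 plays in half the time.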
37. Used in conjunction, the timesync and clocksync beads were used as part of determining an offset for rendering content at a master device and a slave device. The timesync bead determined the clock offsets of all listening peers on a network. The clocksync bead was “a filter bead that uses the information gathered by the timesync bead to propagate a master clock and render clock pair across a network boundary.” Exhibit 2018, at 1. Exhibit 2018 is an internal developer document from this period that generally describes the clocksync bead. Id. The clocksync and timesync beads were fully operational to synchronize content over multiple devices by at least October 23, 2001, if not earlier, with version 1.11 of clocksync and version 1.14 of timesync. Exhibit 2019; Exhibit 2020.

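The general principle behind determining a peer’s clock offset can be sketched with a round-trip timestamp exchange, in the style of NTP. The sketch below is illustrative only; it is not the timesync implementation, and the function name and the symmetric-delay assumption are mine.

```python
def estimate_offset(t1, t2, t3, t4):
    """Estimate how far a peer's clock leads the local clock.

    t1: request sent     (local clock)
    t2: request received (peer clock)
    t3: reply sent       (peer clock)
    t4: reply received   (local clock)

    Assuming the network delay is symmetric in both directions,
    the peer's clock leads the local clock by
    ((t2 - t1) + (t3 - t4)) / 2.
    """
    return ((t2 - t1) + (t3 - t4)) / 2.0
```

Once an offset is known for each peer, a master clock and render clock pair can be expressed in every device’s local time, so each device can compare its render position against the shared clock and correct drift.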
38. By at least this same time frame in October 2001, the beads that could be used with the synchronization beads to implement the synchronization of audio and video rendering were also completed and fully operating with regard to processing, transmitting, and receiving audio and video content and related data structures between devices. These included the avidemux, fanout, framer, ipv4, rgbvideo, UDP, TCP, and speaker beads, among others.

39. Throughout Strings’s existence (including prior to and during 2001), the use of “rules” was the primary mechanism to configure Strings to perform different functionalities because the rules defined when the system invoked various beads to perform a task or achieve an application-specific goal. The rules within the Strings engine were written in extensible markup language (“XML”) format.

40. A rule (<RULE ... />) was a sequence of one or more steps that executed when a specific set of constraints was met. A rule had within it at least one route (<ROUTE ... />), and each route contained at least one step (<STEP ... />). If a rule’s conditions were met (the <PREDICATE ... />), then Strings would execute the specific steps in the rule, which included executing the beads called in each step and setting variables in the namespace. The Strings engine would process these rules for a given application context and construct paths that strung together the beads to perform the processing for that application. Two documents, Exhibit 2021 and Exhibit 2022, describe the Strings rules-based system.

41. A rule used to synchronize the rendering of audio at a slave device, for example, would decode the transmission using one bead step (framer), determine how much the slave device and master device were out of sync using a second bead step (clocksync, which utilizes timing offset data from timesync), adjust the rendering of audio content to have the slave device render the content in sync with the master device using a third bead step (clocksync), and then send the audio to an output device for playback using a fourth bead step (speaker).

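A rule of the kind described in this paragraph might look, in rough outline, like the sketch below. The element names follow the <RULE>, <PREDICATE>, <ROUTE>, and <STEP> conventions described above, but the attribute names, predicate expression, and bead parameters are hypothetical; this is not a reproduction of an actual BeComm rules file.

```xml
<RULE name="slave-audio-sync">
  <!-- Hypothetical predicate: apply this rule to audio arriving at a slave device -->
  <PREDICATE expr="type == 'audio' and role == 'slave'" />
  <ROUTE>
    <STEP bead="framer" />     <!-- decode the incoming transmission -->
    <STEP bead="clocksync" />  <!-- compare master and render clocks using timesync offsets -->
    <STEP bead="speaker" />    <!-- play the adjusted audio on the output device -->
  </ROUTE>
</RULE>
```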
42. Prior to December 11, 2001, and at least starting in September, 2001, we completed the source code and related files, including the rules files, to implement the audio-video synchronization inventions that Mr. Bradley and I conceived of earlier that year. Prior to December 11, 2001, and at least starting in September, 2001, I participated in demonstrations and tests of the audio-video synchronization inventions using Strings. Running these tests, I confirmed that Strings was able to work for its intended purpose. These tests and demonstrations synchronized content according to the invention Mr. Bradley and I conceived of earlier in 2001. I will discuss example tests and demonstrations of the synchronization functionality that we completed and performed prior to December 11, 2001, starting at least in September, 2001.

43. In general, the demonstration computer would execute software, such as a source.pl script, to set up the demonstration and begin transmitting content, such as audio or video, from the demonstration computer to a selected remote host computer. Exhibit 2023 is the source.pl script present in the source code repository in the test/demo directory dated October 29, 2001, though I witnessed and participated in testing and demonstrations prior to then, in addition to around the time of this file. The software would send content from the source computer to a destination computer, which caused the content to be synchronized across multiple devices. We tested and demonstrated this functionality for multiple audio and video files before December 11, 2001 to synchronize the rendering of audio and video, starting at least in September, 2001, as I describe in the paragraphs below.

44. This testing began in connection with the development of the audiosync, clocksync, and timesync beads described above, some of which are described in the CVS log files for the source code for each bead. Exhibit 2014; Exhibit 2016; Exhibit 2030.

45. I participated in successful demonstrations and tests of the synchronization of PCM and MP3 audio across multiple devices. Typically for audio testing and demonstration, we initially used two devices—one master device and one slave device—that rendered the audio in synchronous fashion according to the inventions set forth in the Claims of the Patents.

46. We also created a number of test packages to test the synchronization functionality of the inventions Mr. Bradley and I conceived of using the audiosync, clocksync, and timesync beads. At BeComm, we created a “test” package to test a certain functionality or application that we wanted to add to Strings or for the purpose of demonstrating certain functionality. Some of the test packages that were created to test the synchronization functionality included the test/audiosync, test/demo, and test/demo2 test packages.

47. The description of those test packages matches my memory of participating in tests and demonstrations of Strings that operated for its intended purpose to synchronize the rendering of audio and video in accordance with the inventions set forth in the Claims of the Patents.

48. The audiosync test was a new package for “demonstrating synchronized audio” that was first checked in on October 10, 2001, with changes being made for a “demo configuration.” Exhibit 2031, at 2. By November 17, 2001, the test was moved to the “packages” file. Id. I generally instructed our developers at BeComm to move functionality from the “test” directory to the “packages” directory when the testing had been completed and we were prepared to provide the functionality as a standalone package.

49. The demo test began in October, 2001 as an audio-video synchronization demo that then added a Compaq iPAQ handheld device as a third potential host, though not the sole host. Exhibit 2032, at 2. This was one demo package that BeComm often used to demonstrate the audio-video synchronization functionality of Strings. The functionality was copied into a subsequent test folder called demo2 in November 2001. The demo test package itself was ultimately duplicated in a package called “fulldemo” in 2003. Id.

50. Part of the demo and demo2 packages was to test the synchronization of audio and video rendering on a Compaq iPAQ, which was a handheld device that Compaq sold during this time period. By February 7, 2002, BeComm had created documentation related to the iPAQ that was entered into the CVS repository. While the iPAQ had significant challenges running aspects of Strings during our testing, we were able to synchronize content in some circumstances. Exhibit 2033, at 2.

51. The document states that, when it came to synchronizing video, “[w]e achieved peak video performance by transmitting successive frames of 100 x 55 RGB bitmaps over a raw UDP socket. The video looked pretty good (~12 fps) and was definitely synchronized.” Exhibit 2033, at 2. Regarding the synchronization of audio, the document states that “[w]hen synchronizing between the iPAQ and another machine, the audio breaks up considerably in the first five seconds, has a few chops for the next minute, and plays fine after that.” Id.

52. These statements match my memory and describe how Strings worked to synchronize content on the iPAQ prior to December 11, 2001, and during the October, 2001 time period when the tests of the iPAQ began with the iPAQ-specific rules created in the demo test package, such as the ipaqvideo.rule, Exhibit 2060. Unlike the iPAQ, however, Strings did not have nearly as much difficulty streaming audio and video content for synchronization on PCs, because PCs had significantly more memory and processing power than the iPAQ had at that time. I witnessed the operation of that synchronization functionality for PCs at or around that time.

53. One example demonstration of the synchronization functionality set forth in the Claims of the Patents was on the file “fightclubrgb.avi.” At this time, BeComm had a laptop that we used as part of demonstrating and testing various applications of Strings and other projects we were working on, including the audio-video synchronization functionality of the inventions of the Claims of the Patents. In March 2018, I received a back-up of the files that were on the hard drive of this computer.

54. The BeComm laptop included copies of the “fightclubrgb.avi” file in a few locations, including the bdk/test/demo directory, the scratch/avi directory, and the scratch/demoavi directory. These directories were typically used to hold media on which we would test various Strings applications using that laptop.

55. The “fightclubrgb.avi” file was an approximately 2:31-long trailer for the movie Fight Club and contained audio and video content.

Exhibit 2024.

56. The file system metadata information for the “fightclubrgb.avi” in that folder indicates that the movie was last modified on September 7, 2001, Exhibit 2077, at 21, which is consistent with my memory of about when we began participating in tests and demonstrations of the synchronization using the Fight Club trailer. The file contains the Fight Club content whose rendering we synchronized prior to December 11, 2001, starting in the September, 2001 time period. During this time, we tested and demonstrated synchronized rendering of content in the “fightclubrgb.avi” file across multiple devices. The testing and demonstration showed that the system worked for its intended purpose to synchronize the rendering of the audio and video across a master device and one or more slave devices.

57. The following is a general explanation of the functionality of the tests and demonstrations I participated in for the “fightclubrgb.avi” prior to December 11, 2001, including during October, 2001, though the process was generally the same for other audio-video files as well during this time.

58. The source computer transmitted the “fightclubrgb.avi” file identified in the script to the master device, usually a PC, which separated the audio-video stream into an audio stream and a video stream. The file system indicates that the “fightclubrgb.avi” contained uncompressed PCM audio, Exhibit 2077, at 22, and we were able to synchronize the audio playback of PCM audio as well as MP3 audio prior to December 11, 2001.

59. The master device then rendered the video and audio and played the PCM audio. The master device also transmitted the video to one slave device, typically a second PC, which displayed the video in sync. The master device also transmitted the audio to another slave device, typically a third PC, which rendered and played the audio in sync with the master device. Prior to December 11, 2001, I witnessed or participated in a test or demonstration of the Fight Club trailer that synchronized the rendering of audio-video content multiple times, including multiple times in the October, 2001 time period.

60. By at least January 28, 2002, use of the Fight Club trailer demonstration had become so common within BeComm that part of the video synchronization functionality within Strings was transitioned from the test/demo folder into two standalone packages just for the “fight-club syncronized video demo.” Exhibit 2035, at 3; Exhibit 2036, at 3.

61. BeComm documentation also discussed the synchronization functionality that I witnessed during the mid-to-late 2001 time period, including the Fight Club trailer. The materials show a Strings-based distributed media player application that, due to its bead-based design, provided the “best possible synchronization” of the video and audio data flow and overcame the limitations of traditional application design approaches of conventional media player applications. Exhibit 2021. This document is dated October 4, 2001. That date is consistent with my memory that we had the audio-video synchronization functionality operational by at least that time.

62. Strings’s approach provided the best possible synchronization because the design placed the clock synchronization “as late as possible in the data flow of video and audio on each client,” as shown in the late-stream placement of the “Clock Sync.” beads in the diagram below. Exhibit 2021, at 9.

63. Prior to December 11, 2001, BeComm also began incorporating the audio-video synchronization functionality set forth in the Claims of the Patents into a Strings Audio Player, which was an example built-in application in the RADkit. The RADkit was a developer kit that BeComm could provide to potential third parties, such as Intel, who wanted to have various Strings-based applications running on their devices.

64. By November 2001, we created documents in the source code repository to instruct potential customers on how to set up a demonstration of the Strings Audio Player to play synchronized audio on multiple devices, based on the RADkit. Exhibit 2025, at 1–2. The documentation instructed how to set up the Strings Audio Player to play mp3 files from a directory using the rules from the demo2 package (a successor to the demo package) located in the /test/demo2 directory. Id. Additional documents from the repository that detail the Strings Audio Player at this time include Exhibit 2026 and Exhibit 2027.

65. In addition to instructing on how to install the software, the documentation instructed how to edit the audioplayerapp.rule file to set the IP address of the local computer and the IP address of the remote computer. Exhibit 2025, at 1–2.

66. The rules in the audioplayerapp.rule file, working with other rules, provided synchronized rendering of mp3 audio between a master device and a slave device. They did so by configuring a remote master device and a local slave device, converting the mp3 aud