`
`IPR2018-00767
`
`UNITED STATES PATENT AND TRADEMARK OFFICE
`
`____________
`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`
`____________
`
`SONOS, INC.,
`Petitioner,
`
`v.
`
`IMPLICIT, LLC,
`Patent Owner.
`
`_____________
`
`Case IPR2018-00767
`Patent 8,942,252
`
`____________
`
`
`PATENT OWNER IMPLICIT, LLC’S SUR-REPLY TO PETITIONER’S REPLY
`
`
`
`
`
`
`
`
`
`
`TABLE OF CONTENTS
`
I.   INTRODUCTION ........................................................................................... 1

II.  JANEVSKI IS NOT PRIOR ART .................................................................. 1

     A.   Preponderant Evidence—Most of Which Sonos Fails to
          Address—Establishes Prior Invention Under a Rule of
          Reason Analysis .................................................................................... 2

     B.   Implicit’s Source Code Practices the Challenged Claims ..................... 7

          1.   The Source Code Meets the “Rendering Time”
               Limitations .................................................................................. 7

          2.   Sonos’s “Synchronization” Arguments Lack Legal
               and Factual Foundation ............................................................. 16

III. THE CLAIMS ARE PATENTABLE OVER JANEVSKI ........................... 19

     A.   Implicit Does Not Rely on Bare Attorney Argument ......................... 19

     B.   Janevski Does Not Provide a Reason to Combine the
          Secondary References’ Averaging to the Claimed
          “Smoothing a Rendering Time” .......................................................... 21

     C.   Janevski Does Not Disclose Claim 2’s “Master Device
          Time” Limitation ................................................................................. 23

IV.  CONCLUSION .............................................................................................. 24
`
`
`
`
`
`
`
`
`
`TABLE OF AUTHORITIES
`
`CASES
`
Apator Miitors ApS v. Kamstrup A/S, 887 F.3d 1293 (Fed. Cir. 2018) .................. 4-5
`
`Corning Inc. v. DSM IP Assets B.V., No. IPR2013-00050, 2014 WL
`1783280 (P.T.A.B. May 1, 2014) .................................................................. 20
`
`Elbit Sys. of Am., LLC v. Thales Visionix, Inc., 881 F.3d 1354 (Fed.
`Cir. 2018) ................................................................................................. 19-20
`
`Hybritech Inc. v. Monoclonal Antibodies, Inc., 802 F.2d 1367 (Fed.
`Cir. 1986) ......................................................................................................... 5
`
`NFC Techs., LLC v. Matal, 871 F.3d 1367 (Fed. Cir. 2017) ..................................... 3
`
`Price v. Symsek, 988 F.2d 1187 (Fed. Cir. 1993) ...................................................... 3
`
`
`
`
`
`
`
`
`I.
`
`INTRODUCTION
`
The Board should find that the claims are patentable. First, Janevski is not
`
`prior art because Implicit can swear behind it, despite Sonos’s procedural challenges.
`
`Second, the Source Code practices the Challenged Claims—further indicating that
`
`Implicit pre-dates Janevski. It is undisputed that not a single secondary reference
`
`teaches “smoothing a rendering time” differential as required by the challenged
`
`claims. Further, Janevski does not disclose the “master device” limitation.
`
`II.
`
`JANEVSKI IS NOT PRIOR ART
`
`The question in this proceeding is whether Janevski—filed six days before the
`
`December 17, 2001, provisional application that led to the Patent—is actually prior
`
`art. To swear behind the reference, Implicit provided significant evidence to
`
`corroborate Mr. Balassanian’s testimony of prior invention, including documentary
`
`evidence and source code.
`
`Sonos ignores the bulk of this evidence. It instead tries to raise procedural
`
`roadblocks and asserts theories that are improbable on their face: (1) that the source
`
`code functionality for which Implicit sought patent protection does not practice the
`
`Challenged Claims and (2) that the Board should ignore the voluminous source code
`
`
`
`
`
`
`and documentary evidence because that evidence is not sufficiently “independent”
`
`of Mr. Balassanian.1
`
`These arguments fail. Implicit only needs to show it is more likely than not
`
`that it is entitled to priority over Janevski. When considered in total, under a rule of
`
reason analysis, the evidence shows that Implicit has cleared that hurdle.
`
`A.
`
`Preponderant Evidence—Most of Which Sonos Fails to
`Address—Establishes Prior Invention Under a Rule of Reason
`Analysis
`
`Mr. Balassanian’s testimony lays out the invention story in significant detail,
`
`with supporting documentation. Exhibit 2001. The testimony spans the year up to
`
`the December 17, 2001, filing of the application that led to the Patent: the Intel Juno
`
`Project in late 2000 to early 2001; conception of the inventions after the Juno project
`
`was suspended in February 2001; the audio-video synchronization project (and
`
`source code) development during the summer of 2001; the tests and demonstrations,
`
`including the Fight Club demonstration, of the invention through the fall of 2001;
`
`and the drafting of the provisional patent application, completed in substance on
`
`
`1 Implicit also produced to Sonos the entire source code repository for Strings
`(“cvs_strings”), the entire hard drive of the BeComm demo laptop discussed in
`Implicit’s Patent Owner Response, and its entire website root from the 2000-2001
`time period. Implicit also allowed Sonos’s forensic expert to obtain a forensic image
`of a backup CD of the source code repository and the original hard drive of the
`BeComm demo laptop.
`
`
`
`
`
`
`December 9, 2001—two days before Janevski’s filing date and about a week before
`
the provisional application was filed on December 17, 2001.
`
`Dozens of documents and source code files, timestamped by a computer
`
`operating system or the Coordinated Universal Time (“UTC”) stamp utilized by the
`
`Concurrent Version System (“CVS”) repository, corroborate Mr. Balassanian’s
`
`testimony and show priority of invention. The dates and documents line up.
`
`Considering that this evidence is almost twenty years old, the amount of supporting
`
evidence is more than sufficient for Implicit to carry its burden of proving an
invention date more than six days before it filed the provisional patent application.
`
`Sonos casts aside virtually all of that evidence and asks that the Board do the
`
`same. But, under the “rule of reason” analysis, “[a]n evaluation of all pertinent
`
`evidence must be made so that a sound determination of the credibility of the
`
inventor’s story may be reached.” Price v. Symsek, 988 F.2d 1187, 1195 (Fed. Cir.
`
`1993) (emphasis in original). Because the evidence must be considered as a whole,
`
`not individually, an inventor’s conception can be corroborated even though “no one
`
`piece of evidence in and of itself” establishes that fact and even though
`
`circumstantial evidence provides the corroboration. NFC Techs., LLC v. Matal, 871
`
`F.3d 1367, 1372 (Fed. Cir. 2017) (collecting cases, quotations and citations omitted).
`
`Indeed, to reject prior invention, the Board would need to decline to credit Mr.
`
`Balassanian’s testimony, internal BeComm documentation, the draft provisional
`
`
`
`
`
`
`patent application, the source code and related documentation detailing the
`
`synchronization of content, and the computer system and CVS timestamps. That is
`
`unreasonable.
`
`The Board would also need to decline to credit the October 4, 2001, email
`
`from Mr. Fuiczynski to Dr. Peterson (the “Peterson Email”) that is the subject of
`
`Implicit’s Motion to Submit Supplemental Information (Paper No. 20). The date
`
`(October 2001) and content of that email (“We can synchronize video . . . We can
`
also synchronize audio”) further corroborate Mr. Balassanian’s testimony and
`
provides direct evidence of reduction to practice in October 2001—prior to Janevski:
`
`We can synchronize video playing on one computer very accurately
`while the audio plays back on another. We can also synchronize audio
`across two computers to the point that you can plug a headphone into
`each separate computer, and listen to it in stereo. It sounds really
`good . . .
`
`You had mentioned that this type of synchronization was rather difficult
`to achieve. Do you know of prior art that attempts to address this
`particular problem . . . ? We are thinking of writing up a provisional
`patent application on our system.
`
`Rather than address the totality of the evidence, Sonos picks at Implicit’s
`
evidence and erects procedural hurdles—challenging whether objective evidence
can corroborate inventor testimony and objecting to Implicit’s charting of the
supporting evidence.
`
`Mr. Balassanian’s testimony is adequately corroborated. Sonos relies on
`
Apator Miitors ApS v. Kamstrup A/S, 887 F.3d 1293 (Fed. Cir. 2018), to attack the
`
`sufficiency of Mr. Balassanian’s corroborating evidence. But this case is unlike
`
`
`
`
`
`
`Apator because there is no “catch-22” or circular evidence. In Apator, the only
`
`evidence of dates originated from the inventor’s testimony—the documents
`
`themselves were either undated or were merely the inventor’s characterization of his
`
`own invention to others. Id. at 1296-97.2 The difference here is that explaining the
`
`documents or proving their dates does not depend on Mr. Balassanian’s substantive
`
`testimony because the evidence is self-corroborating. The documents and source
`
`code are timestamped by the computer file system or CVS, which on their face
`
corroborate conception and reduction to practice before Janevski. Moreover, there
`
`is no requirement for every fact to be corroborated—the question is the entire story
`
`told by all of the facts. See, e.g., Hybritech Inc. v. Monoclonal Antibodies, Inc., 802
`
`F.2d 1367, 1378 (Fed. Cir. 1986) (inventor testimony of conception was sufficiently
`
`corroborated by inventor's laboratory notebook in which some entries were
`
`witnessed before the critical date but others were not).
`
`Sonos mischaracterizes Mr. Balassanian’s testimony. See Paper 17, at 2.
`
`Sonos asked Mr. Balassanian to construe claim terms, a legal exercise, not whether
`
`he invented the subject matter of the claims. E.g., id. at 2-3 (“Do you have an
`
`
`2 In its analysis, the Federal Circuit even recognized that timestamps on the
`inventor’s computer files were relevant to corroboration. Id. at 1297. But, unlike
`here, the Federal Circuit in Apator noted the files’ timestamps post-dated the priority
`date. Id. at 1297 (“The drawings proffered by [the inventor] indicate they were
`‘[m]odified’ on January 30, 2012, but the [inventor’s declaration] states, based on
`[his] file naming convention, that these drawings were actually created earlier, prior
`to Nielsen’s effective filing date.”).
`
`
`
`
`
`
`understanding of how the Claims of the Patents are construed? . . . And did you
`
`understand the claims when you read them?”) (all emphases added). Those
`
`questions turn on the scope of the claims—not whether or when Mr. Balassanian
`
`conceived of the subject matter captured by the claims. Further, Sonos omitted the
`
`objections that Implicit interjected after each question that called for a legal
`
`conclusion of the scope of the claims. E.g., Ex. 1019 at 46:7-8, 47:8. Indeed, Sonos
`
`avoided even asking Mr. Balassanian about the invention story. And when Sonos
`
`asked about how Mr. Balassanian determined that he conceived of the inventions
`
`claimed, he responded. Ex. 1019 (50:11-51:13) (discussing source code,
`
`conversations, and documents pre-dating December 11, 2001).
`
`Next, Sonos argues that Implicit improperly incorporated Dr. Hashmi’s
`
`declaration. Paper 17 at 10-11. Implicit cited to the evidence in its claim chart—
`
`showing record evidence of conception for every element of claim 1. Paper 9 at 29-
`
`31. There is no requirement for Implicit to repeat, word-for-word, Dr. Hashmi’s
`
`testimony in its Patent Owner Response for the Board to consider it. Nor must
`
`Implicit copy-and-paste source code files into its response—ballooning the size.
`
`Sonos seeks a remedy—striking the testimony to then invalidate the patent—that
`
`exceeds any alleged prejudice. Indeed, Sonos has submitted nearly 150 pages of
`
expert testimony per proceeding. See, e.g., Ex. 1009 (92 pages); Ex. 1022 (45 pages).
`
`Dr. Hashmi’s testimony is less than a third that length. Sonos’s claim of prejudice
`
`
`
`
`
`
`rings hollow when it was able to address—in a new forty-plus page declaration from
`
`Dr. Chertov—the Implicit Source Code, its operation, and why Dr. Chertov believes
`
`the Source Code does not practice the invention.
`
`B.
`
`Implicit’s Source Code Practices the Challenged Claims
`
`In Reply, Sonos takes the position that the source code functionality for which
`
`Implicit sought patent protection does not actually practice Implicit’s patents. Sonos
`
`alleges that the source code does not meet the “rendering time” limitations and does
`
`not synchronize the rendering of content. The remaining limitations are not
`
`disputed. Sonos exclusively relies on the direct testimony of Dr. Roman Chertov.
`
But that testimony conflicts with Dr. Chertov’s testimony regarding Janevski, Dr.
`
`Chertov’s testimony on cross-examination, contemporaneous documentation, and
`
`the source code itself. Implicit thus respectfully requests that the Board decline to
`
`credit Dr. Chertov’s direct testimony and reach the natural conclusion: that the
`
`source code functionality for which Implicit sought patent protection practices the
`
`Challenged Claims.
`
`1.
`
`The Source Code Meets the “Rendering Time” Limitations
`
`The Source Code meets the “rendering time” limitations of the Challenged
`
`Claims. As Dr. Hashmi explained, IAudioClock and RenderClock for the
`audio-video example he described in detail generate the rendering times for “master”
`
`device and “slave” devices, respectively. E.g., Exhibit 2081, at 2-3, 5-6; Exhibit
`
`
`
`
`
`
`2082, at 3-4. For the audio synchronization example, MasterClock generates the
`rendering times for the master device, as Dr. Hashmi explained. E.g., Exhibit 2080,
`
at ¶¶ 51–59; see also Exhibit 2018 (describing MasterClock and
RenderClock). The speaker bead, which is the bead to send audio to an output
`device (invoked by both the master device and the slave devices), also utilizes
`
`ISampleClock to update the relevant clock data structure for that device (whether
`the master device or one of the slave devices) to indicate the actual playout position
`
`of the content. Exhibit 2051, at 1 (l. 11); id. at 11–12 (ll. 446-512).
`
`There is no dispute that these structures are utilized in the manner detailed in
`
`the Challenged Claims. Sonos and Dr. Chertov assert that these structures do not
`
`evidence a “rendering time” based on a new, overly narrow application of the
`
`“rendering time” terms and a misunderstanding of the source code. Those assertions
`
`are incorrect for the reasons detailed below.
`
Each of the structures identified by Dr. Hashmi above is an instance of the
`
`sampleclock data structure, as Dr. Chertov confirmed on cross-examination.
`Exhibit 2094, at 74:23-75:25. The source code defines the sampleclock data
`structure as follows:
`
`
`
`
`
`
`
`Exhibit 2086, at 2 (ll. 64–72); Exhibit 2088, at 2 (ll. 64–72).
`
`
`The structure contains the frequency and divisor for the stream (e.g., 44,100
`
`Hz frequency for audio and a divisor of 1,000 when time is expressed in
`
`milliseconds). Exhibit 1025, at 1. The structure also includes “an instantaneous
`
`position mark” of the stream, which consists of the “wall-clock time” (the “Time”
`
`entry above) and the number of samples that have been played, which are audio
`
`samples for audio or video frames for video (the “Sample” entry above). Id.; Exhibit
`
`2094, at 72:14–74:9. During playout at each device, the speaker bead updates the
`“Time” and “Sample” entries of the relevant sampleclock associated with that
`device and content. Exhibit 2051, at 1 (l. 11); id. at 11–12 (ll. 446–512).
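For illustration only—this is a sketch, not a reproduction of the Source Code—the structure described above can be expressed in C as follows. The field names (Frequency, Divisor, Time, Sample) come from the record; the types and layout are assumptions:

```c
/* Hypothetical sketch of the sampleclock data structure described above.
   Field names follow the exhibits; the integer types are assumptions. */
typedef struct sampleclock {
    unsigned long Frequency; /* stream rate, e.g., 44,100 Hz for audio     */
    unsigned long Divisor;   /* time units per second, e.g., 1,000 for ms  */
    unsigned long Time;      /* wall-clock time at the position mark       */
    unsigned long Sample;    /* samples (audio) or frames (video) played   */
} sampleclock;
```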
`
`The “Sample” variable within the sampleclock data structures identified
`by Dr. Hashmi is one form of a “rendering time,” as Dr. Chertov and Sonos applied
`
`that term in the Petition. In seeking review, Sonos and Dr. Chertov contended that
`
the “time or frame into the program,” when applying the claims to Janevski to assert
`
`invalidity, “amounts to the claimed ‘rendering time.’” Paper No. 1, at 28 (emphasis
`
`added); Exhibit 1009, ¶¶ 103, 105 (“In my opinion, this ‘time or frame into the
`
`
`
`
`
`
`program’ maintained by a PVR amounts to the claimed ‘rendering time.’”)
`
(emphasis added). They further asserted that Janevski’s use of a “frame
`
`misregistration” between devices constituted the claimed determining a “time
`
`differential” between the amount of content that has been rendered at a master device
`
`and the amount of content that has already been rendered by a slave device. E.g.,
`
`id., ¶ 116 (emphasis added). The Board credited these assertions when it instituted
`
`these proceedings, but Sonos talks out of the other side of its mouth when it comes
`
`to the source code.
`
`It is undisputed that the “Sample” variable is the number of audio samples or
`
`video frames that have already been played. See, e.g., Exhibit 1025, at 1, 4; Exhibit
`
`2094, at 73:18–25. It is also beyond dispute that those values would meet the
`
`“rendering time” terms as Sonos applied them in the Petition. If the issue is what
`
`units of time to express the “rendering time” in (e.g., number of samples or frames
`
`versus number of seconds), dividing the number of samples by the frequency—both
`
`of which are in the sampleclock data structure—yields a time value expressed in
`seconds, versus one expressed in samples (e.g., dividing an audio sample value of
`
`44,100 by a frequency of 44.1 KHz would result in one second of audio). See Exhibit
`
`2094, at 80:22–81:3. Moreover, that operation actually occurs in the source code,
`
`which Dr. Chertov ignored on direct.
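The arithmetic described above can be sketched in C for illustration; the function name is hypothetical and does not appear in the Source Code:

```c
/* Illustration of the conversion discussed above: dividing the number of
   samples played by the stream frequency yields elapsed time in seconds.
   The function name is hypothetical, for illustration only. */
double samples_to_seconds(unsigned long samples, unsigned long frequency_hz) {
    return (double)samples / (double)frequency_hz;
}
```

For example, 44,100 samples at a 44,100 Hz frequency corresponds to one second of audio.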
`
`
`
`
`
`
`Sonos (and Dr. Chertov) changed positions in Reply. Now, Dr. Chertov
`
`asserts that use of the video frame into a program does not meet the “rendering time”
`
`limitations, a new position in conflict with his prior testimony:
`
Q.   Is the video frame that is maintained by a device a claimed
     rendering time?

A.   No, it is not.

Q.   Is the frame that tells you how far you are into a program a
     claimed rendering time?

A.   No.

Q.   And the only thing that is a claimed rendering time under your
     application in your second declaration is the number of seconds
     that have transpired in the content, is that right?

A.   Correct.
`
`Exhibit 2094, at 154:17–155:1; see also id. at 56:18–57:18 (testifying that video
`
frame numbers are not a “rendering time”); id. at 71:1–17 (testifying that audio
`
`samples or video frames into the stream is not a “rendering time”); id. at 78:1–14
`
`(same).
`
`Implicit directly confronted Dr. Chertov with his inconsistent application of
`
`the “rendering time” terms on cross-examination. Exhibit 2094, at 143:19–25.
`
`Sonos blocked Dr. Chertov from testifying on that issue, improperly instructing him
`
`not to answer those questions based on “scope.” Id. at 144:1–145:13. But Dr.
`
`Chertov’s conflicting application of the “rendering time” terms weighs heavily on
`
`
`
`
`
`
`Dr. Chertov’s credibility regarding the Implicit Source Code. The conflicting
`
`testimony (and Sonos’s corresponding discovery tactics) cast significant doubt on
`
`all of Dr. Chertov’s testimony in favor of Sonos. Implicit thus respectfully requests
`
`that the Board decline to credit any of Dr. Chertov’s testimony proffered in support
`
`of Sonos, whether in support of invalidity or in support of their positions on
`
`Implicit’s Source Code.
`
Should the Board look further, the Implicit Source Code also discloses a
`
`“rendering time” even under Dr. Chertov’s new and overly narrow application of
`
`that term. The structures identified by Dr. Hashmi are used to return an “epoch”
`
`value that incorporates the amount of time that has elapsed in the content at a
`
`particular device. That incorporated value, referred to as “delta” in the Source Code,
`
is the number of samples that have been rendered in the content stream
(sampleclock->Sample) divided by the frequency of the stream
(sampleclock->Frequency), scaled to the desired time units (e.g., seconds or
milliseconds, depending on the sampleclock->Divisor variable). Exhibit
2086, at 12 (ll. 502–508); Exhibit 2088, at 11 (ll. 442–447); see also Exhibit 2094,
at 78:19–90:11.
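As a sketch of that computation—hypothetical names and arithmetic order; the actual code appears in the cited exhibits—the “delta” value scales the sample count to the units set by the divisor:

```c
/* Sketch of the "delta" computation described above: samples rendered,
   divided by the stream frequency, scaled to the units set by the divisor.
   Variable names follow the record; the exact code is an assumption. */
unsigned long clock_delta(unsigned long sample, unsigned long frequency,
                          unsigned long divisor) {
    return (sample * divisor) / frequency; /* e.g., ms when divisor == 1000 */
}
```

For example, 44,100 samples at 44,100 Hz with a divisor of 1,000 yields a delta of 1,000 milliseconds.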
`
`
`
`
`
`
`
`
`
`Dr. Chertov ignored the “delta” variable in his direct testimony, except to
`
`obliquely refer to it as an “adjustment factor.” Exhibit 1022, ¶¶ 39–55. But, on
`
`cross-examination, he did not dispute that “delta” is the amount of time that has
`
`elapsed in the content, i.e., a “rendering time” even under his narrow application of
`
`the term. Exhibit 2094, at 78:19–90:11.
`
`The Source Code further details how a slave device utilizes the “delta”
`
`variables to adjust when the slave device renders content in order to match the master
`
`device, using the audiosync bead for audio and the rgbvideo bead for video.
`Both scenarios use the difference in “epochs” between the slave device and the
`
`master device to determine how much to adjust the slave device:
`
• SOS_INT32 early = (SOS_INT32)(MasterEpoch - RenderEpoch)
  (Exhibit 2017 (audiosync), at 11 (l. 473)).

• delay = (SOS_INT32)(masterEpoch - renderEpoch)
  (Exhibit 2050 (rgbvideo), at 8 (l. 312)).
`Each “epoch” is obtained the same way, through a function call to the
`
`
`
`EpochGet function for the “master” sampleclock instance (the structure that
`tracks the timing of when the master device has rendered content) and the “render”
`
`sampleclock instance (the structure that tracks the timing of when the device that
`is adjusting the content has rendered content, e.g., the slave device). Exhibit 2017,
`
`at 17 (ll. 721–29); Exhibit 2050, at 7–8 (ll. 302–310).
`
`
`
`
`
`
`The EpochGet function calculates the “epoch” as follows:
`epoch = sampleclock->Time - delta;
Exhibit 2086, at 12 (l. 500); Exhibit 2088, at 11 (l. 448).
`The sampleclock->Time variable is the wall-clock time. Exhibit 1025,
`at 1. The delta variable is the amount of content that has been rendered at the
`device in question, expressed in the same units as the sampleclock->Time
`variable (e.g., milliseconds). Exhibit 2086, at 12 (ll. 502–508); Exhibit 2088, at 11
`
(ll. 442–447); Exhibit 2094, at 83:19–90:11. The “epoch” is the difference
`
`between the time value (sampleclock->Time) and the amount of content
`rendered at the device (“delta”).
The “epoch” is thus “a time measure of the amount of content that has
`
`already been rendered by a given rendering device,” viz. a “rendering time.” It is
`
`undisputedly a “time measure,” as it is in seconds or milliseconds. It also is “of the
`
`amount of content that has already been rendered by a given rendering device”
`
`because it incorporates the “delta” value, which even Dr. Chertov does not dispute
`
`is, on its own, a “rendering time.” Exhibit 2094, at 78:19–90:11.
`
`Moreover, the difference between the “master epoch” and the “render epoch,”
`
`as used in the “early” and “delay” equations above by the audiosync and
`rgbvideo beads, is, in substance, the difference between how much content has
`been rendered at the slave device and how much content has been rendered at the
`
`
`
`
`
`
`master device. That is because the sampleclock->Time value in each “epoch”
`essentially cancels out, as Dr. Chertov confirmed on cross-examination:
`
Q.   . . . MasterEpoch is system time minus delta at the master, right?

A.   Okay.

Q.   And then RenderEpoch is system time minus delta at the slave,
     right?

A.   Correct.

Q.   And so if you subtract those two isn’t that the equivalent
     mathematically of delta at the slave minus delta at the master?

A.   It would appear so.

Exhibit 2094, at 89:20–90:4.
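The algebra Dr. Chertov confirmed can be sketched as follows. The helper functions are hypothetical illustrations, not the Source Code; each epoch is the wall-clock time minus the device’s delta, so the shared time term cancels on subtraction:

```c
/* Sketch of the epoch subtraction discussed above. Because each epoch is
   (Time - delta), subtracting the render epoch from the master epoch
   leaves delta_render - delta_master; the wall-clock term cancels.
   Function names are hypothetical, for illustration only. */
long epoch(long wall_time, long delta) { return wall_time - delta; }

long early(long wall_time, long master_delta, long render_delta) {
    return epoch(wall_time, master_delta) - epoch(wall_time, render_delta);
}
```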
`
`The instances of sampleclock that Dr. Hashmi identified (e.g.,
`IAudioClock, RenderClock, MasterClock) meet the “rendering time”
`limitations for the reasons described above. Sonos’s (and Dr. Chertov’s) application
`
`of the “rendering time” term to exclude these structures is a mid-proceeding change
`
`in position based on a new application of the term. Implicit did not contest Sonos’s
`
`proposed construction of “rendering time” in its Patent Owner’s Response because,
`
`as Sonos had applied the term, it should have been beyond dispute that the Source
`
`Code met the limitation. Given Sonos’s switch, in these unique circumstances,
`
`should the Board disagree with the analysis above, Implicit requests that the Board
`
`enter the construction of “rendering time” that Implicit proposed in the litigation—
`
`
`
`
`
`
`“a content position,” Exhibit 2010, at 4. The above-identified sampleclock
`instances undisputedly meet that construction.
`
`2.
`
`Sonos’s “Synchronization” Arguments Lack Legal and
`Factual Foundation
`
`Sonos’s second position—that the source code does not provide actual
`
`“synchronization,” which all the claims allegedly require—simply lacks credibility.
`
`Substantial contemporaneous documents and source code show “synchronization,”
`
`examples including:
`
`• “Strings Synchronization Model.” Exhibit 2037, at 1. “This
`document describes a collection of beads and conventions developed
`to support multi-host synchronization in Strings.” Id.
`
`• “[T]he Strings framework only requires the addition of the clock
`synchronization modules encapsulated within a bead, which is then
`placed as late as possible in the data flow of video and audio on each
`client in order to achieve the best possible synchronization.” Exhibit
`2021, at 9.
`
`• “A ‘Sample Clock’ provides a mechanism for synchronizing two
`streams of multimedia . . . By comparing two epochs, we can
`determine the time shift required to bring them into synchronization.”
`Exhibit 1025, at 1.
`
`• “[S]ynchronized PCM to Speaker.” Exhibit 2056, at 1; Exhibit 2066,
`at 1; Exhibit 2075, at 1;
`
`• “New package for demonstrating synchronized audio.” Exhibit 2031,
`at 2.
`
`• “The RADapi also makes it possible to synchronize multiple DataFlow
`objects with each other regardless of content type. This makes it
`possible to synchronize audio playout on multiple endpoints or to
`
`
`
`
`
`
`synchronize audio with other content such as video or text.” Exhibit
`2029, at 7.
`
`• “StringsAudioPlayer: Slave Fanout Branch (sync)” Exhibit 2028, at 2.
`
`• “Added audio/audio synchronization . . . New package for testing
`remote synchronization of audio/video over the network.” Exhibit
`2015, at 2.
`
`• “We can synchronize video . . . We can also synchronize audio”
`Motion to Submit Supplemental Information (Paper No. 20).
`
`Against this backdrop of contemporaneous evidence, Dr. Chertov’s testimony
`
(and Sonos’s position) is not credible. Dr. Chertov did not state on direct what
`
`construction of “synchronize” he used to form his opinions. Exhibit 2094, at 16:20–
`
`24. On cross-examination, Dr. Chertov revealed that he viewed synchronization in
`
an incredibly narrow way: even if the devices were off by only one millisecond—
`
`which is not humanly perceptible—they would not be “synchronized” in his opinion.
`
Id. at 17:8–18:20. He went even further on re-direct: if the devices were off by
`
`one-tenth of a millisecond, they would not be synchronized in his opinion.
`
Q.   So if something was off by one-tenth of a millisecond, would that
     still be in sync?

A.   It would not.
`
`Exhibit 2094, at 159:2–4. There is no support for this narrow definition in the Patent,
`
`and Dr. Chertov’s reading of “synchronization” would exclude virtually every real-
`
`world system for playing synchronized content. E.g., U.S. 10,216,473, at 36:58–62
`
`(Sonos Patent that explains that a delay of “on the order of fifty milliseconds”
`
`
`
`
`
`
between audio and video content would “barely, if at all, be perceptible to
`
`someone viewing the video”).
`
`The evidence thus shows that the master and the slave would render content
`
`in sync, under an ordinary application of that term. In addition, Dr. Chertov’s
`
`opinion also does not match the facts from the source code. Dr. Chertov opines that
`
`the “master” and “slave” will not play in “sync” because the source code always has
`
`the “master” play the content first (on Fanout 0) and the “slave” plays second (on
`
`Fanout 1), and that a scenario in which the slave plays first is “impossible.” Exhibit
`
`2094, at 26:14–217. But the source code contains embodiments in which the slave
`
`is the local device on Fanout 0 (and thus “plays first”) and the master is a remote
`
`device on Fanout 1 (and thus plays second). Exhibit 2063; Exhibit 2067. In those
`
`scenarios, the local slave would call the audiosync bead to adjust the playback at
`the local slave device to match the master device, as detailed above. Exhibit 2063;
`
`Exhibit 2067. And, even in the embodiment in which the master supposedly “plays
`
`first,” the source code anticipates buffering at the master and the slave, e.g., Exhibit
`
`2051, at 11 (ll. 457–478) (reciting samplesQueued variable), and the fact that the
`Fanout 0 path is created first in the source code (which may run in parallel when
`
`assembled) for a frame of content does not mean that the devices are unable to play
`
`in sync.
`
`
`
`
`
`
`Moreover, nothing in the Patents requires the level of synchronization Sonos
`
`asserts. The ’791 Patent claims only state: “A method for synchronizing a rendering
`
`of a content” in the preamble and “to render content that should be rendered at the
`
`same time” in the body (claim 1). Other claims are similar. E.g., claim 23
`
`(“synchronizing” in the preamble; “rendering content . . . to account for the
`
`calculated time domain difference” in the body). No claim requires absolute
`
`synchronization (something only theoretically possible). Nor does the specification.
`
`Sonos’s reading of the claims would exclude essentially all real-world devices.
`
`III. THE CLAIMS ARE PATENTABLE OVER JANEVSKI
`
`Sonos argues two main points: (1) Implicit presents no evidence that rebuts
`
`Sonos’s “evidence,” and (2) Implicit’s attacks on Sonos’s prior combinations fail.
`
`Paper 17 at 20. First, the prior art reference itself, Janevski, is evidence. Sonos’s
`
`“evidence” comprises expert witness testimony that parrots Sonos’s petition without
`
`substantive analysis. Second, it is undisputed that not a single secondary reference
`
`teaches “smoothing a rendering time” differential as required by the challenged
`
`claims. Further, Janevski does not disclose the “master device time” limitation.
`
`A.
`
`Implicit Does Not Rely on Bare Attorney Argument
`
`Sonos repeatedly casts Implicit’s discussion of Janevski—for what it does and
`
`does not teach—as “attorney argument.” See e.g., Paper 17 at 21 (citing Elbit Sys.
`
`of Am., LLC v. Thales Visionix, Inc., 881 F.3d 1354, 1359 (Fed. Cir. 2018)). For
`
`
`
`
`
`
`starters, Elbit is inapposite. There, the unsupported attorney argument concerned
`
`integral calculus applied to navigational equations—which the opposing side’s
`
`expert testified was “mathematically inappropriate.” Id.
`
`Here, the Janevski reference itself is the evidence. And, unlike Sonos,
`
`Implicit did not use an expert as an attorney mouthpiece to regurgitate Janevski’s
`
`prose. Indeed, Dr. Chertov’s declaration simply parrots the attorney analysis found
`
`in Sonos’s petition—cloaking the argument as Dr. Chertov’s “opinion.” Compare,
`
`e.g., Paper 1 at 48 (claim 2) with Ex. 1009, ¶ 129 (analysis for claim 2 is attorney
`
`argument recast as expert opinion). And that is far from the only instance. Much of
`
`Dr. Chertov’s declaration is simply the petition reworded to account for his
`
“opinion.” Compare also, e.g., Paper 1 at 45-46 with Ex. 1009, ¶¶ 125-126 (almost
`
`identical discussion in both petition and expert declaration, save for omitted caselaw
`
`citations in declaration and pronoun usage).
`
`Consequently, Dr. Chertov’s declaration in describing Janevski should be
`
`given little, if any, weight. See, e.g., Corning Inc. v. DSM IP Assets B.V., No.
`
`IPR2013-00050, 2014 WL 1783280, at *14 (P.T.A.B. May 1, 2014) (“[The expert’s]
`
`statement is a word-for-word reproduction of DSM’s argument in the Response . . .
`
`Dr. Bowman does not disclose underlying facts or data on which his opinion is
`
`based; we give it, therefore, little