`8,942,252
`Claim 1
[1 Preamble] A method, comprising:
`
`CONTAINS PROTECTIVE ORDER INFORMATION
`
`Application of Claim Language to Source Code
`
`Source Code Folders:
`SOFTWARE_STRUCTURE = \test\demo\rules\
`SOFTWARE_CODE = \beads\
`
[1a] a master rendering device rendering a first content stream; and
`
`The Implicit Source Code specifies a distributed system consisting of a plurality of
`devices that are nodes of a network (“devices”). These devices execute a method for
`synchronizing rendering of content provided by a source, where each device maintains
`a rendering time corresponding to the device time when the device renders content.
`
`Implicit source code specifies a distributed system consisting of a plurality of devices.
`These devices include a master rendering device that is set up to render an audio and
`video content stream. This audio and video content stream rendered by the master
`rendering device corresponds to a first content stream.
`
`Implicit source code implements a distributed system consisting of a plurality of
`devices. An architecture of one such distributed system comprising a plurality of
`devices is defined in files videomulti.rule, videoclient.rule,
`ipaqvideo.rule, pcmaudioserver.rule, syncaudio.rule, and
`timesync.rule.1,2 Implicit source code also implements beads or services that run
`
`1 These files are contained in folder SOFTWARE_STRUCTURE\
`2 Another similar distributed system that renders synchronized content streams is described by the rules files in
`folder \2001.11.01\test\audiosync\
`
`1
`
`Page 1 of 15
`
`Implicit Exhibit 2082
`Sonos v. Implicit, IPR2018-0766, -0767
`
`
`
`
`on devices that are nodes of a network. Among other services, these beads can
`perform tasks including decoding and encoding content streams, receiving and
`transmitting content streams, synchronizing content streams, and rendering content
streams.3 Each device is configured using one or more rule files and executes the services implemented by the beads specified within those rule files.
`
`Among other devices, the distributed system described in the rules files specifies a
`master rendering device that renders an audio and video content stream.4 This audio
`and video content stream rendered by the master rendering device corresponds to the
`first content stream. Specifically, the master rendering device receives a combined
`audio and video content stream at its local network port 8013 from a content source
`device.5,6 As part of the rendering process, this master rendering device uses the
`avidemux bead to split the combined audio and video content stream into separate
`audio and video streams.7,8 Specifically, functions
`AviDemux_EncodeMessageHandler9 and ChunkProcess10 of bead
`avidemux separate the combined audio and video content stream. Once the audio
`and video streams have been separated, the master rendering device runs the
`bmptorgb bead to decode the received video content stream to generate decoded
`
`3 These files are contained in folder SOFTWARE_CODE\
`4 Defined at lines 6 to 134 in file SOFTWARE_STRUCTURE\videomulti.rule
`5 See line 9 in file SOFTWARE_STRUCTURE\videomulti.rule
`6 See lines 7 to 79 in file \test\demo\source.pl
`7 See lines 11 to 15 in file SOFTWARE_STRUCTURE\videomulti.rule
`8 See file SOFTWARE_CODE\avidemux\main\avidemux.c
`9 Implemented at lines 883 to 964 in file SOFTWARE_CODE\avidemux\main\avidemux.c
`10 Implemented at lines 597 to 849 in file SOFTWARE_CODE\avidemux\main\avidemux.c
`
`2
`
`Page 2 of 15
`
`Implicit Exhibit 2082
`Sonos v. Implicit, IPR2018-0766, -0767
`
`
`
`RGB video frames.11,12 Specifically, functions BmpToRgb_MessageHandler,13
`DecodeFrame,14 and BmpDecoder_FrameDecode15 of bead bmptorgb decode
`the video content stream to generate the decoded RGB video frames.
`
`As discussed earlier, the master rendering device uses the avidemux bead to split the
`combined audio and video content stream into separate audio and video streams. Once
`the audio and video content stream has been separated, the master rendering device
`uses an instance of the fanout bead to distribute the audio content stream to the two
processing pathways.16 The first processing pathway, corresponding to index FanoutIndex 0, uses the speaker bead to output the audio content stream to the audio output device of the master rendering device.17,18
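For illustration only, the fanout behavior described above can be modeled with a short sketch. This is a hypothetical model written for this discussion, not the actual Implicit source code: the FanOut structure, its fields, and the function below are assumptions standing in for the role the rules files assign to the fanout bead.

```c
#include <stddef.h>

/* Hypothetical model of a fanout-style bead: every incoming frame is
 * delivered once to each of two processing pathways. All names and
 * fields here are illustrative assumptions, not the actual Implicit
 * source code. */
#define FANOUT_PATHS 2

typedef struct {
    size_t delivered[FANOUT_PATHS]; /* frames seen by each pathway   */
    size_t bytes[FANOUT_PATHS];     /* payload bytes seen per pathway */
} FanOut;

/* Hand one frame of frame_len bytes to every pathway; returns the
 * number of deliveries made. */
int FanOut_Message(FanOut *f, size_t frame_len)
{
    for (int i = 0; i < FANOUT_PATHS; i++) {
        f->delivered[i]++;
        f->bytes[i] += frame_len;
    }
    return FANOUT_PATHS;
}
```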
`
[1b] sending, from the master rendering device to a first one of a plurality of slave devices, a plurality of master rendering times

As discussed earlier (see Claim 1, Limitation 1a), Implicit source code specifies a distributed system that includes a master rendering device. This master rendering device receives a combined audio and video content stream sent by a source device. Upon receiving the combined audio and video content stream, the master rendering device uses the avidemux bead to separate the combined audio and video stream into
`11 See lines 16 to 20 in file SOFTWARE_STRUCTURE\videomulti.rule
`12 See file SOFTWARE_CODE\bmp2rgb\main\bmp2rgb.c
`13 Implemented at lines 306 to 335 in file SOFTWARE_CODE\bmp2rgb\main\bmp2rgb.c
`14 Implemented at lines 180 to 283 in file SOFTWARE_CODE\bmp2rgb\main\bmp2rgb.c
`15 Implemented at lines 366 to 407 in file SOFTWARE_CODE\bmp2rgb\main\bmpdecoder.c
`16 See lines 91 to 97 in file SOFTWARE_STRUCTURE\videomulti.rule
`17 See lines 104 to 107 in file SOFTWARE_STRUCTURE\videomulti.rule
`18 See file SOFTWARE_CODE\speaker\main\speaker.c
`
`3
`
`Page 3 of 15
`
`Implicit Exhibit 2082
`Sonos v. Implicit, IPR2018-0766, -0767
`
`
`
indicative of statuses of the rendering the first content stream at the master rendering device at different times;
`
`
`
`
separate audio and video streams. The avidemux bead instantiates a master rendering clock IAudioClock associated with the audio content stream.19 This master rendering clock IAudioClock generates a plurality of master rendering times that are indicative of the statuses of the rendering of the combined audio and video content stream by the master rendering device at different times. Furthermore, Implicit source code specifies a distributed system that also includes a plurality of slave rendering devices. One of these slave rendering devices renders a PCM audio content stream in synchronization with the master rendering device.20 Another of these slave rendering devices renders the video content stream in synchronization with the master rendering device.21 The master rendering device sends a plurality of encoded PCM audio frames and the corresponding master rendering times indicative of the statuses of the rendering of the first content stream by the master rendering device to the slave rendering device rendering the audio content stream at port 9002.22,23 Similarly, the master rendering device sends a plurality of encoded RGB video frames and the corresponding master rendering times indicative of the statuses of the rendering of the first content stream by the master rendering device to the slave rendering device rendering the video content stream at port 8002.24,25
`
`As discussed earlier (see Claim 1, Limitation 1a), upon receiving the combined audio
`
`
`19 See lines 447 to 451 in file SOFTWARE_CODE\avidemux\main\avidemux.c
`20 Defined at lines 3 to 25 in file SOFTWARE_STRUCTURE\syncaudio.rule
`21 Defined at lines 7 to 35 in file SOFTWARE_STRUCTURE\videoclient.rule
`22 See lines 123 to 127 in file SOFTWARE_STRUCTURE\videomulti.rule
`23 See line 4 in file SOFTWARE_STRUCTURE\syncaudio.rule
`24 See lines 123 to 127 in file SOFTWARE_STRUCTURE\videomulti.rule
`25 See line 4 in file SOFTWARE_STRUCTURE\syncaudio.rule
`
`4
`
`Page 4 of 15
`
`Implicit Exhibit 2082
`Sonos v. Implicit, IPR2018-0766, -0767
`
`
`
`and video content stream, the master rendering device uses the avidemux bead to
`separate the combined audio and video stream into separate audio and video streams.
`While processing the combined audio and video content streams, the avidemux bead
`calls function AudioPrepare.26 Function AudioPrepare initializes a master
`rendering clock IAudioClock associated with the audio content stream.27 This
`master rendering clock IAudioClock generates a plurality of master rendering times
that are indicative of the statuses of the rendering of the combined audio and video
`content stream by the master rendering device at different times.
`
As discussed earlier (see Claim 1, Limitation 1a), the master rendering device processes the video content stream using the bmptorgb bead to render decoded RGB
`video frames. Once the video frame has been decoded into an RGB video frame, the
`master rendering device uses a fanout bead to distribute the decoded video frame to
`two processing pathways or fanouts.28,29 Specifically, function
`FanOut_MessageHandler30 of bead fanout distributes decoded RGB video
`frames to the two processing pathways. Within one of these processing pathways,31
`the master rendering device uses the clocksync bead to encode the master rendering
`
`
`26 Implemented at lines 389 to 477 in file SOFTWARE_CODE\avidemux\main\avidemux.c
`27 See lines 447 to 451 in file SOFTWARE_CODE\avidemux\main\avidemux.c
`28 See lines 21 to 25 in file SOFTWARE_STRUCTURE\videomulti.rule
`29 See file SOFTWARE_CODE\fanout\main\fanout.c
`30 Implemented at lines 180 to 199 in file SOFTWARE_CODE\fanout\main\fanout.c
`31 See lines 30 to 49 in file SOFTWARE_STRUCTURE\videomulti.rule
`
`5
`
`Page 5 of 15
`
`Implicit Exhibit 2082
`Sonos v. Implicit, IPR2018-0766, -0767
`
`
`
`times.32,33 Specifically, functions ClockSync_EncodeHandler34 and
`ClockSync_Encode35 encode the master rendering times.36 After encoding the
`master rendering time, the master rendering device encodes the decoded RGB video
`frames using the framer bead.37,38 Specifically, function
`Framer_EncodeHandler39 of bead framer encodes the decoded RGB video
frame. Afterwards, the master rendering device transfers the master rendering time and the encoded RGB video frame over the IP network, sending the encoded master rendering times and the encoded RGB video frame to remote network port 8002 using the User Datagram Protocol (“UDP”).40
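The transfer step described above, an encoded rendering time accompanying an encoded frame in one UDP datagram, can be sketched as follows. The wire layout (an 8-byte time prefix) and the function name are assumptions made for illustration; the actual Implicit encode functions are not reproduced here.

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Hypothetical sketch: prepend the master rendering time to a frame
 * payload, producing one buffer suitable for a single UDP send. The
 * layout (8-byte time, then frame bytes) is an assumption, not the
 * actual Implicit wire format. */
size_t EncodeFrame(uint64_t render_time,
                   const unsigned char *frame, size_t frame_len,
                   unsigned char *out, size_t out_cap)
{
    if (out_cap < sizeof render_time + frame_len)
        return 0;                           /* output buffer too small */
    memcpy(out, &render_time, sizeof render_time);
    memcpy(out + sizeof render_time, frame, frame_len);
    return sizeof render_time + frame_len;  /* bytes to hand to UDP */
}
```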
`
`As discussed earlier (see Claim 1, Limitation 1a), once the audio and video content
`stream has been separated, the master rendering device uses the fanout bead to
`distribute the PCM audio frames over two processing pathways. Within one of these
`processing pathways,41 the master rendering device uses the clocksync bead to
`encode the master rendering times. After encoding the master rendering time, the
`
`
`32 See lines 32 to 35 in file SOFTWARE_STRUCTURE\videomulti.rule
`33 See file SOFTWARE_CODE\timesync\main\clocksync.c
`34 Implemented at lines 570 to 602 in file SOFTWARE_CODE\timesync\main\clocksync.c
`35 Implemented at lines 244 to 348 in file SOFTWARE_CODE\timesync\main\clocksync.c
`36 See lines 280 to 294 in file SOFTWARE_CODE\timesync\main\clocksync.c
`37 See lines 36 to 39 in file SOFTWARE_STRUCTURE\videomulti.rule
`38 See file SOFTWARE_CODE\framer\main\framer.c
`39 Implemented at lines 230 to 269 in file SOFTWARE_CODE\framer\main\framer.c
`40 See line 43 in file SOFTWARE_STRUCTURE\videomulti.rule
`41 See lines 113 to 132 in file SOFTWARE_STRUCTURE\videomulti.rule
`
`6
`
`Page 6 of 15
`
`Implicit Exhibit 2082
`Sonos v. Implicit, IPR2018-0766, -0767
`
`
`
`master rendering device encodes the PCM audio frames using the framer bead.
Afterwards, the master rendering device transfers the master rendering time and the encoded PCM audio frame over the IP network, sending the encoded master rendering times and the encoded PCM audio frame to remote network port 9002 using UDP.42
`
Among other devices, the distributed system described in the rules files specifies a slave device that displays RGB video frames in synchronization with the master rendering device.43 This slave device receives the encoded master rendering times and the encoded RGB video frame at its port 8002.44 Afterwards, the slave device decodes the encoded RGB frames sent by the master rendering device using the framer bead.45 Specifically, function Framer_DecodeHandler46 of bead framer decodes the encoded RGB frame. Afterwards, the slave device displays the decoded RGB frames using the rgbvideo bead.47,48 Specifically, functions RgbVideo_MessageHandler49 and DisplayFrame50 of bead rgbvideo display the decoded RGB frame.
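The receive side can be sketched symmetrically: the slave splits each received datagram back into the master rendering time and the frame payload before decoding. The function below and its assumed layout (an 8-byte time prefix) are illustrative, not the actual Implicit code.

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Hypothetical sketch of the slave's receive step: recover the master
 * rendering time and locate the frame payload within one datagram,
 * assuming the time was sent as an 8-byte prefix. Returns the frame
 * length, or 0 for a truncated packet. */
size_t DecodeFrame(const unsigned char *pkt, size_t pkt_len,
                   uint64_t *render_time, const unsigned char **frame)
{
    if (pkt_len < sizeof *render_time)
        return 0;                              /* truncated packet */
    memcpy(render_time, pkt, sizeof *render_time);
    *frame = pkt + sizeof *render_time;
    return pkt_len - sizeof *render_time;
}
```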
`
`
`42 See line 43 in file SOFTWARE_STRUCTURE\videomulti.rule
`43 Defined at lines 6 to 37 in file SOFTWARE_STRUCTURE\videoclient.rule
`44 See line 8 in file SOFTWARE_STRUCTURE\videoclient.rule
`45 See lines 10 to 14 in file SOFTWARE_STRUCTURE\videoclient.rule
`46 Implemented at lines 271 to 349 in file SOFTWARE_CODE\framer\main\framer.c
`47 See lines 30 to 33 in file SOFTWARE_STRUCTURE\videoclient.rule
`48 See file SOFTWARE_CODE\rgbvideo\main\rgbvideo.c
`49 Implemented at lines 464 to 524 in file SOFTWARE_CODE\rgbvideo\main\rgbvideo.c
`50 Implemented at lines 347 to 395 in file SOFTWARE_CODE\rgbvideo\main\rgbvideo.c
`
`7
`
`Page 7 of 15
`
`Implicit Exhibit 2082
`Sonos v. Implicit, IPR2018-0766, -0767
`
`
`
`
[1c] wherein the first slave device is configured to smooth a rendering time differential that exists between the master rendering device and the first slave device in order to render a second content stream at the first slave device synchronously with the rendering of the first content stream at the master rendering device, wherein
`
`
Furthermore, the distributed system described in the rules files specifies another slave device that renders PCM audio frames in synchronization with the master rendering device.51 This slave rendering device receives the encoded master rendering times and the encoded PCM audio frame at its port 9002.52
`
`
As discussed earlier (see Claim 1, Limitation 1b), Implicit source code specifies a distributed system that includes a slave device that receives the encoded master rendering times and the encoded PCM audio frames at port 9002. This slave device is set up to render PCM audio in synchronization with rendering of the combined audio and video content stream by the master rendering device. This slave device uses the audiosync bead to determine and smooth a rendering time differential that exists between the master rendering device and the slave rendering device such that the PCM audio frames (second content stream) are rendered at the slave device in synchronization with the rendering of the combined audio and video content stream (first content stream) at the master rendering device.53,54 The smoothing of the rendering time differential includes calculations using the plurality of master rendering times.55
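One simple way to picture the adjustment that keeps the slave in synchronization is a per-block sample correction: drop a sample when the slave runs behind, duplicate one when it runs ahead. The sketch below is a deliberate simplification; the function name, signature, and single-sample policy are assumptions, not the actual audiosync implementation.

```c
#include <stddef.h>

/* Hypothetical, simplified adjustment step: given a block of PCM
 * samples and the sign of the rendering time differential, drop one
 * sample (slave behind) or duplicate one (slave ahead). out must hold
 * at least n + 1 samples. Returns the number of samples written. */
size_t AudioSync_AdjustBlock(const short *in, size_t n,
                             short *out, long differential)
{
    size_t m = 0;
    for (size_t i = 0; i < n; i++) {
        if (differential > 0 && i == 0)
            continue;              /* behind: drop the first sample */
        out[m++] = in[i];
        if (differential < 0 && i == 0)
            out[m++] = in[i];      /* ahead: duplicate the first sample */
    }
    return m;
}
```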
`
`As discussed earlier (see Claim 1, Limitation 1b), Implicit source code specifies a
`slave device that renders PCM audio frames in synchronization with the master
`
`
`51 Defined at lines 3 to 25 in file SOFTWARE_STRUCTURE\syncaudio.rule
`52 See line 5 in file SOFTWARE_STRUCTURE\syncaudio.rule
`53 See lines 16 to 19 in file SOFTWARE_STRUCTURE\syncaudio.rule
`54 See file SOFTWARE_CODE\audiosync\main\audiosync.c
`55 See lines 160 to 192 in file SOFTWARE_CODE\audiosync\main\audiosync.c
`
`8
`
`Page 8 of 15
`
`Implicit Exhibit 2082
`Sonos v. Implicit, IPR2018-0766, -0767
`
`
`
smoothing the rendering time differential includes calculations using the plurality of master rendering times.
`
`
`
`rendering device. This slave device receives the encoded master rendering times and
`the encoded PCM audio frames at port 9002. The slave device uses the clocksync
`bead to decode the received master rendering times.56 Furthermore, the clocksync
`service determines if a time domain differential exists between the master time domain
`and the slave time domain by calling function TimeSync_OffsetGet57,58 and
`adjusts the master rendering time in proportion to the time domain differential that
`exists between the master and the slave time domain.59 The clocksync bead
`implements a function ClockSync_DecodeHandler60 that calls function
`ClockSync_Decode61 to decode the master rendering times sent by the master
rendering device. Function ClockSync_Decode also determines if a time domain differential exists between the master and the slave device time domains and, if such a differential exists, adjusts the master rendering time in proportion to the time domain differential between the master and the slave device time domains. Afterwards,
`function ClockSync_Decode updates the master rendering time at the slave
`rendering device.62 Each slave device runs a clocksync bead that maintains a
`rendering clock RenderClock that tracks the rendering time for the slave device.63
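The adjustment the clocksync service performs can be pictured with a small sketch: once time synchronization has measured an offset between the master and slave time domains, each received master rendering time is translated into the slave's domain. The types, field names, and the simple additive model below are assumptions for illustration, not the actual ClockSync_Decode implementation.

```c
/* Hypothetical sketch: translate a master rendering time into the
 * slave's time domain using a measured clock offset, and record it as
 * the slave's view of the master rendering time. A purely additive
 * offset is an illustrative simplification. */
typedef long long Ticks;

typedef struct {
    Ticks offset;       /* slave clock minus master clock, from time sync */
    Ticks render_time;  /* last master rendering time, in slave time */
} RenderClock;

Ticks ClockSync_Adjust(RenderClock *rc, Ticks master_time)
{
    rc->render_time = master_time + rc->offset;
    return rc->render_time;
}
```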
`
`
`
`56 See lines 11 to 15 in file SOFTWARE_STRUCTURE\audiosync\package\package\audio.rule
`57 See line 409 in file SOFTWARE_CODE\timesync\main\clocksync.c
`58 Implemented at lines 361 to 388 in file SOFTWARE_CODE\timesync\main\timesync.c
`59 See lines 462 to 469 in file SOFTWARE_CODE\timesync\main\clocksync.c
`60 Implemented at lines 604 to 635 in file SOFTWARE_CODE\timesync\main\clocksync.c
`61 Implemented at lines 618 to 635 in file SOFTWARE_CODE\timesync\main\clocksync.c
`62 See lines 476 to 480 in file SOFTWARE_CODE\timesync\main\clocksync.c
`63 See lines 11 to 15 in file SOFTWARE_STRUCTURE\syncaudio.rule
`
`9
`
`Page 9 of 15
`
`Implicit Exhibit 2082
`Sonos v. Implicit, IPR2018-0766, -0767
`
`
`
`The slave device also uses the framer bead to decode the encoded PCM audio
`frames sent by the master rendering device.64 Specifically, the framer bead
`implements a function Framer_DecodeHandler65 to decode the received encoded
`PCM audio frames sent by the master rendering device.66 Afterwards, the slave
rendering device uses the audiosync bead to determine and smooth a rendering time differential between the master rendering device and the slave rendering device.67,68
`
`The audiosync bead implements a function AudioSync_MessageHandler.69
`Function AudioSync_MessageHandler obtains the master rendering time70 and
`the slave rendering time71 and calls function AudioSync_Adjust.72,73 Function
`AudioSync_Adjust determines the rendering time differential between the master
`and the slave rendering devices and stores the rendering time differential in data
`structure early.74 Afterwards, function AudioSync_Adjust calls function
`
`
`64 See lines 7 to 10 in file SOFTWARE_STRUCTURE\syncaudio.rule
65 Implemented at lines 271 to 349 in file SOFTWARE_CODE\framer\main\framer.c
`66 See lines 457 to 486 in file SOFTWARE_CODE\timesync\main\clocksync.c
`67 See lines 16 to 19 in file SOFTWARE_STRUCTURE\syncaudio.rule
`68 See file SOFTWARE_CODE\audiosync\main\audiosync.c
`69 Implemented at lines 692 to 777 in file SOFTWARE_CODE\audiosync\main\audiosync.c
`70 See lines 721 to 723 in file SOFTWARE_CODE\audiosync\main\audiosync.c
`71 See lines 727 to 729 in file SOFTWARE_CODE\audiosync\main\audiosync.c
`72 See lines 732 to 738 in file SOFTWARE_CODE\audiosync\main\audiosync.c
`73 Implemented at lines 462 to 685 in file SOFTWARE_CODE\audiosync\main\audiosync.c
`74 See line 473 in file SOFTWARE_CODE\audiosync\main\audiosync.c
`
`10
`
`Page 10 of 15
`
`Implicit Exhibit 2082
`Sonos v. Implicit, IPR2018-0766, -0767
`
`
`
SlidingAvg_Add75 and provides it with the rendering time differentials. Function SlidingAvg_Add uses a moving average process that smooths the rendering time differential by using a plurality of master and slave rendering times.76 Back in function AudioSync_Adjust, the smoothed rendering time differential is stored in data structure avgEarly.77 Afterwards, function AudioSync_Adjust adjusts the decoded audio media content (second content stream) by duplicating the audio samples or resampling the audio samples such that the slave rendering device renders the audio media content (second content stream) in synchronization with the first content stream rendered by the master device.78 Afterwards, the adjusted audio media stream is sent to the speakers for playback.
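The moving average step can be made concrete with a short sketch. The window size, structure, and arithmetic below are assumptions for illustration; only the idea, folding each new differential into a sliding window whose mean is the smoothed differential, is taken from the description of SlidingAvg_Add above.

```c
/* Hypothetical sketch of sliding-average smoothing: keep the last
 * AVG_WINDOW rendering time differentials and report their mean. The
 * window size and layout are illustrative assumptions. */
#define AVG_WINDOW 8

typedef struct {
    long sample[AVG_WINDOW];
    int  count;   /* samples stored so far, up to AVG_WINDOW */
    int  next;    /* slot to overwrite next */
    long sum;     /* running sum of stored samples */
} SlidingAvg;

void SlidingAvg_Add(SlidingAvg *a, long differential)
{
    if (a->count == AVG_WINDOW)
        a->sum -= a->sample[a->next];   /* evict the oldest sample */
    else
        a->count++;
    a->sample[a->next] = differential;
    a->sum += differential;
    a->next = (a->next + 1) % AVG_WINDOW;
}

long SlidingAvg_Get(const SlidingAvg *a)
{
    return a->count ? a->sum / a->count : 0;
}
```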
`
Finally, each slave rendering device implements a speaker service that renders the received PCM audio frames in synchronization with the rendering of the first content stream by the master rendering device.79,80
`
`
`
Claim 2
The method of claim 1, wherein one of the plurality of master rendering times

See analysis for Claim 1, Limitation 1b.
`
`75 Implemented at lines 160 to 192 in file SOFTWARE_CODE\audiosync\main\audiosync.c
`76 See lines 172 to 191 in file SOFTWARE_CODE\audiosync\main\audiosync.c
`77 See line 474 in file SOFTWARE_CODE\audiosync\main\audiosync.c
`78 See lines 559 to 652 in file SOFTWARE_CODE\audiosync\main\audiosync.c
`79 See lines 20 to 23 in file SOFTWARE_STRUCTURE\syncaudio.rule
`80 See file SOFTWARE_CODE\speaker\main\speaker.c
`
`11
`
`Page 11 of 15
`
`Implicit Exhibit 2082
`Sonos v. Implicit, IPR2018-0766, -0767
`
`
`
includes a master device time at which the master rendering device renders content.
`
`
`
`As discussed above (see Claim 1, Limitation 1b), as part of the rendering process, the
`master rendering device uses the avidemux bead to split the combined audio and
`video content stream into separate audio and video streams. While processing the
`combined audio and video content streams, the avidemux bead calls function
`AudioPrepare.81 Function AudioPrepare initializes a master rendering clock
`IAudioClock associated with the audio content stream.82 Furthermore, master
`rendering clock IAudioClock is created in function
`AviDemux_ContextCreate83 using the sampleclock class.84 The
`sampleclock class implements a function SampleClock_FrequencySet85
`that is used to initialize the instance of sampleclock. Function
`SampleClock_FrequencySet initializes the sampleclock->Time data
`structure by calling function SOS_CLOCK_TickGet, which provides the current
`device time. Therefore, the master rendering time is initialized using the current
`master device time and is updated afterwards based on the amount of content that
`master has rendered. Consequently, the indication sent from the master device to the
`at least one slave device includes the master device time at which the master device
`renders content corresponding to the master rendering time.
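The initialization described above can be sketched as follows: the sample clock starts from the current device time and then advances by the amount of content rendered. The structure, units, and function bodies below are assumptions for illustration; only the name SampleClock_FrequencySet and the described behavior come from this section, and the current device time is passed in as a parameter rather than reproducing the SOS_CLOCK_TickGet call.

```c
/* Hypothetical sketch of a sample clock: initialized from the current
 * device time, then advanced by the number of samples rendered. Units
 * (microseconds) and fields are illustrative assumptions. */
typedef struct {
    long long time_us;   /* current rendering time, in microseconds */
    long      frequency; /* samples per second */
} SampleClock;

/* Set the frequency and start the clock from the current device time. */
void SampleClock_FrequencySet(SampleClock *c, long hz, long long now_us)
{
    c->frequency = hz;
    c->time_us = now_us;
}

/* Each rendered sample advances the clock by 1e6 / frequency microseconds. */
void SampleClock_Advance(SampleClock *c, long samples)
{
    c->time_us += (long long)samples * 1000000 / c->frequency;
}
```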
`
81 Implemented at lines 389 to 477 in file SOFTWARE_CODE\avidemux\main\avidemux.c
82 See lines 447 to 451 in file SOFTWARE_CODE\avidemux\main\avidemux.c
83 Implemented at lines 995 to 1022 in file SOFTWARE_CODE\avidemux\main\avidemux.c
84 See lines 1009 to 1013 in file SOFTWARE_CODE\avidemux\main\avidemux.c
85 See lines 275 to 339 in file SOFTWARE_CODE\sampleclock\main\sampleclock.c

Claim 3
The method of claim 1, wherein sending the plurality of master rendering times comprises sending a series of transmissions to the first slave device, each one of the series of transmissions being at a different time.

See analysis for Claim 1, Limitation 1b.

This master rendering clock IAudioClock at the master rendering device generates a plurality of rendering times that include master rendering times indicative of how much rendering has occurred at the master rendering device.

Claim 8
The method of claim 1, wherein the first content stream is sent from a first source device to the master rendering device and the second content stream is sent to the first slave device from a different source device.

See analysis for Claim 1, Limitations 1a and 1b.

As discussed earlier (see analysis for Claim 1, Limitation 1a), a first source device transmits a combined audio and video content stream to the master rendering device at port 8013.86 The master rendering device decodes and renders the combined audio and video content stream. Afterwards, the master rendering device re-encodes the decoded PCM audio frames and sends the re-encoded PCM audio frames to the slave device at port 9002. Therefore, the master rendering device (a different source device) transmits the second content stream to the plurality of slave devices (see analysis for Claim 1, Limitation 1b).

86 See lines 7 to 79 in file \test\demo\source.pl

Claim 11
[11 Preamble] A method, comprising:
`
`
`
[11a] receiving, at a slave device, a particular content stream;

See analysis for Claim 1, Limitation 1c.

[11b] receiving, at the slave device from a master rendering device, a plurality of master rendering times indicative of status of rendering a different content stream at the master rendering device;

See analysis for Claim 1, Limitations 1a, 1b, and 1c.

[11c] the slave device determining a smoothed rendering time differential that exists between the master rendering device and the slave device, wherein the determining is based on calculations using the plurality of master rendering times and a plurality of slave rendering times corresponding to rendering the particular content stream at the slave device; and

See analysis for Claim 1, Limitation 1c.

[11d] based on the smoothed rendering time differential, the slave device rendering the particular content stream synchronously with the master rendering device rendering the different content stream.

See analysis for Claim 1, Limitation 1c.

Claim 17
The method of claim 11 wherein the master rendering device and the slave rendering device are part of a same system.

See analysis for Claim 1, Limitation 1a.

As discussed earlier (see summary for Claim 1, Limitation 1a), the same distributed system contains the master and the slave rendering devices.
`
`
`
`15
`
`Page 15 of 15
`
`Implicit Exhibit 2082
`Sonos v. Implicit, IPR2018-0766, -0767
`
`