`
`(12) United States Patent
`Balassanian et al.
`
(10) Patent No.:     US 8,942,252 B2
(45) Date of Patent: Jan. 27, 2015
`
`(54) METHOD AND SYSTEM
`SYNCHRONIZATION OF CONTENT
`RENDERING
`
`(71) Applicant: Implicit Networks, Inc., Bellevue, WA
`(US)
`
`(72) Inventors: Edward Balassanian, Bellevue, WA
`(US); Scott W. Bradley, Kirkland, WA
`(US)
`
`(73) Assignee: Implicit, LLC, Seattle, WA (US)
`
(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days.
`
`(21) Appl. No.: 13/850,260
(22) Filed: Mar. 25, 2013
`
USPC .......................................................... 370/431
(58) Field of Classification Search
CPC ................................................... H04L 2007/04
USPC ......... 370/431, 432, 464, 498, 503, 507–521; 709/219, 231–237, 248
See application file for complete search history.
(56)                References Cited

U.S. PATENT DOCUMENTS

4,569,042 A    2/1986   Larson
5,333,299 A    7/1994   Koval et al.
(Continued)
`OTHER PUBLICATIONS
`Mills, RFC 778 DCNET Internet Clock Service, RFC, Apr. 1981,
`pp. 1-5.
`
`(Continued)
`
`(65)
`
`Prior Publication Data
`US 2013/O290461 A1
`Oct. 31, 2013
`Related U.S. Application Data
(63) Continuation of application No. 12/710,146, filed on Feb. 22, 2010, now Pat. No. 8,406,257, which is a continuation of application No. 11/933,194, filed on Oct. 31, 2007, now abandoned, which is a continuation
`(Continued)
`
(51) Int. Cl.
H04L 12/28    (2006.01)
H04L 12/24    (2006.01)
G06F 17/30    (2006.01)
H04N 5/765    (2006.01)
H04N 5/775    (2006.01)
(52) U.S. Cl.
CPC .......... H04L 41/04 (2013.01); G06F 17/30056 (2013.01); H04N 5/765 (2013.01); H04N 5/775 (2013.01)
`
`Primary Examiner — Dmitry H Levitan
`(57)
`ABSTRACT
A method and system for synchronizing the rendering of content at various rendering devices. Each rendering device has a device time and a rendering time. The synchronization system designates one of the rendering devices as a master rendering device and designates all other rendering devices as slave rendering devices. Each slave rendering device adjusts the rendering of its content to keep it in synchronization with the rendering of the content at the master rendering device. The master rendering device sends a message with its rendering time and corresponding device time to the slave rendering devices. Each slave rendering device, upon receiving the message from the master rendering device, determines whether it is synchronized with the master rendering time. If not, the slave rendering device adjusts the rendering of its content to compensate for the difference between the master rendering time and the slave rendering time.
`
`17 Claims, 10 Drawing Sheets
`
[Front-page figure: time domain table (300) with columns 301-306, as in FIG. 3]
`PAGE 1 OF 17
`
`SONOS EXHIBIT 1001
`IPR of U.S. Pat. No. 8,942,252
`
`
`
`
`Related U.S. Application Data
`of application No. 10/322,335, filed on Dec. 17, 2002,
`now Pat. No. 7,391,791.
(60) Provisional application No. 60/341,574, filed on Dec. 17, 2001.
`
`(56)
`
`References Cited
`
`U.S. PATENT DOCUMENTS
`
5,452,435 A      9/1995   Malouf et al.
5,487,167 A      1/1996   Dinallo et al.
5,530,859 A      6/1996   Tobias, II et al.
5,553,222 A      9/1996   Milne et al.
5,602,992 A      2/1997   Danneels
5,623,483 A      4/1997   Agrawal et al.
5,815,689 A      9/1998   Shaw et al.
5,909,431 A      6/1999   Kuthyar et al.
6,009,457 A     12/1999   Moller
6,622,171 B2     9/2003   Gupta et al.
6,643,612 B1    11/2003   Lahat et al.
6,763,374 B1     7/2004   Levi et al.
6,766,407 B1 *   7/2004   Lisitsa et al. .................. 710/316
6,859,460 B1     2/2005   Chen
6,934,759 B2     8/2005   Hejna, Jr.
6,985,966 B1     1/2006   Gupta et al.
7,096,271 B1     8/2006   Omoigui et al.
7,391,791 B2     6/2008   Balassanian et al.
7,756,032 B2     7/2010   Feick et al.
2002/0031196 A1  3/2002   Muller et al.
`
`OTHER PUBLICATIONS
`
`Postel, RFC 792 Internet Control Message Protocol, RFC, Sep.
`1981, pp. 1-16.
`
`* cited by examiner
`
`
`
`
`
U.S. Patent    Jan. 27, 2015    Sheet 1 of 10    US 8,942,252 B2

[FIG. 1: Source (101) distributing content to video (102), audio (103), and text (104) rendering devices, each exchanging "Rendering time" with the source]
`
`
`
U.S. Patent    Jan. 27, 2015    Sheet 2 of 10    US 8,942,252 B2

[FIG. 2: exchange of time domain messages 301, 302, and 303 between Device 1 and Device 2, each message carrying the sender's send time and the echoed send/receive times of prior messages; Device 1 computes Diff1 = ((RT1 - ST1) + (ST2 - RT2)) / 2 and Device 2 computes Diff2 = ((RT2 - ST2) + (ST3 - RT3)) / 2]
`
`
`
U.S. Patent    Jan. 27, 2015    Sheet 3 of 10    US 8,942,252 B2

[FIG. 3: time domain table (300) with columns NodeID (301), ST1 (302), RT1 (303), ST2 (304), RT2 (305), and Diff (306)]
`
`
`
U.S. Patent    Jan. 27, 2015    Sheet 4 of 10    US 8,942,252 B2

[FIG. 4: source master (400) maintaining node time, rendering time, and difference for slave video (401), audio (402), and text (403) rendering devices]
`
`
`
U.S. Patent    Jan. 27, 2015    Sheet 5 of 10    US 8,942,252 B2

[FIG. 5: content rendering device (500) comprising receive content (501), render content (502), send TD msg (503), receive TD msg (504), time domain table (505), send rendering time msg (506), and receive rendering time msg (507)]
`
`
`
U.S. Patent    Jan. 27, 2015    Sheet 6 of 10    US 8,942,252 B2

[FIG. 6: send TD msg flow: add id to msg (601); add send time to msg (602); select next device (603); if all already selected (604), send msg (608) and done; else add id of selected device to msg (605), add send time of last msg of selected device to msg (606), and add receive time of last msg of selected device to msg (607)]
`
`
`
U.S. Patent    Jan. 27, 2015    Sheet 7 of 10    US 8,942,252 B2

[FIG. 7: receive TD msg flow: if this device is in the msg list (701), save current send time (702), save last send time (703), save last receive time (704), save current receive time (705), calculate differential (706), and smooth differential (707)]
`
`
`
U.S. Patent    Jan. 27, 2015    Sheet 8 of 10    US 8,942,252 B2

[FIG. 8: render content flow: select next block of content (801); if all already selected (802), done; else if timediff is not zero (803), adjust selected block (804) and adjust timediff (805); output selected block (806)]
`
`
`
U.S. Patent    Jan. 27, 2015    Sheet 9 of 10    US 8,942,252 B2

[FIG. 9: send render time msg flow: add render time (901), retrieve device time (902), add device time (903)]
`
`
`
U.S. Patent    Jan. 27, 2015    Sheet 10 of 10    US 8,942,252 B2

[FIG. 10: receive rendering time msg flow: extract master device time; extract master render time; convert master device time to TD; Master start time = Master device time - Master rendering time; Slave start time = Slave device time - Slave rendering time; Timediff = Master start time - Slave start time]
`
`
`
`METHOD AND SYSTEM
`SYNCHRONIZATION OF CONTENT
`RENDERING
`
`PRIORITY CLAIM
`
This application is a continuation of U.S. application Ser. No. 12/710,146, filed Feb. 22, 2010, which is a continuation of U.S. application Ser. No. 11/933,194, filed Oct. 31, 2007, now abandoned, which is a continuation of U.S. application Ser. No. 10/322,335, filed Dec. 17, 2002, now U.S. Pat. No. 7,391,791, which claims the benefit of U.S. Provisional Application No. 60/341,574, filed Dec. 17, 2001.
`
`TECHNICAL FIELD
`
`10
`
`15
`
`The described technology relates to rendering of content at
`multiple rendering devices in a synchronized manner.
`
`BACKGROUND
`
`2
`FIG. 5 is a block diagram illustrating components of a
`content rendering device in one embodiment.
`FIG. 6 is a flow diagram illustrating the processing of the
`send time domain message component in one embodiment.
`FIG. 7 is a flow diagram of the receive time domain mes
`sage component in one embodiment.
`FIG. 8 is a flow diagram illustrating the render content
`component in one embodiment.
`FIG.9 is a flow diagram illustrating the process of the send
`rendering time message component in one embodiment.
`FIG.10 is a block diagram illustrating the processing of the
`receive rendering time message component in one embodi
`ment.
`
`DETAILED DESCRIPTION
`
`A method and system for synchronizing the rendering of
`content at various rendering devices is provided. In one
`embodiment, each rendering device has a device time and a
`rendering time. The device time is the time as indicated by a
`designated clock (e.g., system clock) of the rendering device.
`The rendering time is the time represented by the amount of
`content that has been rendered by that rendering device. For
`example, if a rendering device is displaying 30 frames of
`video per second, then the rendering time will be 15 seconds
`after 450 frames are displayed. The rendering time of content
`at a rendering device has a “corresponding device time,
`which is the device time at which the rendering time occurred.
`For example, the rendering time of 15 seconds may have a
`corresponding device time of 30 minutes and 15 seconds
`when the rendering device initialized 30 minutes before the
`start of rendering the video. To help ensure synchronization of
`rendering devices, the synchronization system designates one
`of the rendering devices as a master rendering device and
`designates all other rendering devices as slave rendering
`devices. Each slave rendering device adjusts the rendering of
`its content to keep it in Synchronization with the rendering of
`the content at the master rendering device. The master ren
`dering device sends a message with its rendering time and
`corresponding device time to the slave rendering devices.
`Each slave rendering device, upon receiving the message
`from the master rendering device, determines whether it is
`synchronized with the master rendering time. If not, the slave
`rendering device adjusts the rendering of its content to com
`pensate for the difference between the master rendering time
`and the slave rendering time. A slave rendering device can
`determine the amount it is out of synchronization by compar
`ing its slave rendering time at a certain slave device time to the
`master rendering time at that same device time. Alternatively,
`the amount can be determined by comparing its slave device
`time at a certain rendering time to the master device time at
`that same rendering time. In another embodiment, the Syn
`chronization system can define a default rendering time for
`the synchronization. In such a case, the master rendering
`device need only include its effective device time that corre
`sponds to the default rendering time in the message that is sent
`to the slave rendering devices. For example, the default ren
`dering time might be the rendering time of Zero. In Such a
`case, the master rendering device can Subtract its current
`rendering time from its current device time to give its effec
`tive device time at rendering time Zero. A slave rendering
`device, knowing the default rendering time, can determine
`whether it is synchronized and the variation in rendering time
`between the master rendering device and the slave rendering
`device.
`In one embodiment, the synchronization system allows for
`two sources of content to be synchronized even though the
`
`Multimedia presentations that are presented on different
`rendering devices (e.g., video display and stereo system)
`typically require that the different content of the presentation
`25
`be rendered in a synchronized manner. For example, a mul
`timedia presentation may include video, audio, and text con
`tent that should be rendered in a synchronized manner. The
`audio and text content may correspond to the dialogue of the
`video. Thus, the audio and text contents need to be rendered
`in a synchronized manner with the video content. Typically,
`the content of a multimedia presentation is stored at a single
`location, Such as on a disk drive of a source device. To render
`the presentation, the source device retrieves each different
`type of content and sends it to the appropriate rendering
`device to effect the multimedia presentation. The source
`device then sends the content to the rendering devices in
`Sufficient time so that the rendering devices can receive and
`render the content in a timely manner.
`Various rendering devices, however, may have different
`time domains that make the rendering of the multimedia
`presentation in a synchronized manner difficult. For example,
`Video and audio rendering devices may have system clocks
`that operate at slightly different frequencies. As a result, the
`Video and audio content will gradually appear to the person
`viewing the presentation to be out of synchronization. The
`rendering of content in a synchronized manner is made even
`more difficult because Some rendering devices may have mul
`tiple time domains. For example, an audio rendering device
`may have a system clock and a clock on a digital signal
`50
`processing (“DSP) interface card. In such a case, the com
`bination of clocks may result in the presentation becoming
`even more quickly out of synchronization.
`It would be desirable to have the technique that would
`facilitate the rendering of the multimedia presentation in a
`synchronized manner.
`
`30
`
`35
`
`40
`
`45
`
`55
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
`FIG. 1 is a block diagram illustrating synchronization of
`rendering devices in one embodiment.
`FIG. 2 is a diagram illustrating the calculation of the time
`domain differential between two devices.
`FIG.3 illustrates a time domaintable for a rendering device
`in one embodiment.
`FIG. 4 illustrates a block diagram of another embodiment
`of the synchronization system.
`
`60
`
`65
`
`
rendering times of the sources are not themselves synchronized. For example, two separate sources may be video transmitted via satellite and audio transmitted via land telephone lines. If audio is being transmitted and then a few seconds later the corresponding video starts to be transmitted, then the rendering times of zero for the audio and video will not correspond to a synchronized state. For example, the video at the video rendering time of zero should be rendered at the same time as the audio with the audio rendering time of five. This difference in rendering times is referred to as source offset. In addition, the difference in the propagation delay resulting from the different transmission paths of the video and audio may be variable and thus contribute to a variation in synchronization that is variable and is not known in advance.

To account for this lack of synchronization, the synchronization system allows a user (e.g., the person viewing the content) to manually account for the variation. For example, if the video and audio are rendered via a personal computer, the synchronization system may display a dial or a slider on a user interface that the user can adjust to indicate the difference in the rendering times. If the video is rendered five seconds after the corresponding audio, then the user can indicate via the user interface that the offset is five seconds. In such a case, the synchronization system may use the offset to adjust the rendering time of the audio so that the audio associated with the adjusted audio rendering time should be rendered at the same time as the video content with the same video rendering time. The synchronization system could buffer the audio to account for the offset.
The synchronization system in one embodiment factors in the differences in the time domains of the various rendering devices when evaluating synchronization. The rendering devices exchange device time information so that the rendering devices can account for the differences in the time domains of the other rendering devices. Each rendering device may send to the other rendering devices a time domain message that includes its current device time (i.e., send time) along with the time it received the last time domain message (i.e., receive time) from each of the other rendering devices and the send time of that last time domain message. When a rendering device receives such a time domain message, it calculates the time differential between its time domain and the time domain of the sending rendering device. In one embodiment, the synchronization system calculates the time domain differential by combining the difference in send and receive times for the last messages sent to and received from another device in a way that helps factor out the transmission time of the messages. A slave rendering device can then use this time domain differential to convert the master device time to the time domain of the slave when synchronizing the rendering of content. In one embodiment, each rendering device broadcasts at various times its time domain message. The time domain message includes a receive time for a message received from each of the other rendering devices. Each rendering device receives the broadcast time domain message. The receiving rendering device can then calculate its time domain differential with the broadcasting rendering device. In this way, time domain differentials can be determined on a peer-to-peer basis without the need for a master device to keep a master time, and by broadcasting the time domain messages, rather than sending separate time domain messages for each pair of devices.
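The time domain message described above can be sketched as a simple record. The field names below are illustrative assumptions for exposition, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class Echo:
    """Times echoed back for one peer: the send time of the last
    message received from that peer (stamped with the peer's clock)
    and the local device time at which it was received."""
    peer_id: str
    last_send_time: int
    last_receive_time: int

@dataclass
class TimeDomainMessage:
    sender_id: str
    send_time: int        # sender's current device time
    echoes: list[Echo]    # one entry per other known device

# Example: the audio device broadcasts its device time together with
# the echoed timestamps of the last message it received from "video".
msg = TimeDomainMessage("audio", 5000, [Echo("video", 4000, 3510)])
```

Because each broadcast echoes the timestamps of previously received messages, any receiver has both directions of the exchange available and can compute its differential with the sender without a master clock.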
FIG. 1 is a block diagram illustrating synchronization of rendering devices in one embodiment. The source device 101 distributes the content of a presentation to the video rendering device 102, the audio rendering device 103, and the text rendering device 104 via communications link 105. The source
device may have the multimedia presentation stored locally, for example, on a disk drive, may dynamically generate the multimedia presentation, may receive content of the multimedia presentation from other sources, and so on. A multimedia presentation is any presentation that includes different content that is to be rendered in a synchronized manner. For example, the content could be video and audio content for a virtual ride in a theme park along with motion content to control the ride. As another example, the presentation may include light content that controls the display of a laser to be synchronized with audio content. Also, the "synchronization" of content may be different for different rendering devices. For example, audio content may be sent to multiple audio rendering devices with the expectation that some of the audio rendering devices may delay rendering for a certain period (e.g., 10 milliseconds) to achieve a desired audio effect. In such a case, the rendering is considered synchronized when the delay equals that period. The synchronization system designates one of the rendering devices as the master rendering device. In this example, the audio rendering device 103 is designated as the master rendering device, and the video rendering device 102 and text rendering device 104 are designated as slave rendering devices. After the source starts sending the content to the rendering devices, the audio rendering device broadcasts a master rendering time message with its master device time and master rendering time to the slave rendering devices on a periodic basis. In this example, since the communications link is point-to-point from the source device, the audio rendering device sends the message to the source device, which in turn forwards the message to the slave rendering devices. Upon receiving the master rendering time message, the slave rendering devices convert the master device time to their own time domains and then calculate the difference between their slave rendering time and the master rendering time at a certain point in time. In one embodiment, the synchronization system uses a device time at a calculated start of sending as the point in time. The slave rendering devices then adjust their rendering as appropriate to compensate for the difference. The rendering devices adjust the rendering of their content in ways that are appropriate for their content. For example, if the video rendering device was one second behind the master audio rendering device, then it might skip the display of every other frame for the next two seconds to "speed up" to the master audio rendering device. Alternately, if the video rendering device was one second ahead of the master audio rendering device, then the video rendering device might display each of the next 30 frames twice to "slow down" to the master audio rendering device.
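The comparison a slave performs (detailed in FIG. 10) can be sketched as follows. The function name, argument names, and sign conventions here are illustrative assumptions, not the patent's code:

```python
def rendering_time_offset(master_device_time, master_rendering_time,
                          slave_device_time, slave_rendering_time,
                          domain_differential):
    """Compare effective start-of-rendering times, per FIG. 10.

    domain_differential is assumed to be how far the slave's clock is
    ahead of the master's, so adding it converts a master device time
    into the slave's time domain.
    """
    master_in_slave_domain = master_device_time + domain_differential
    master_start = master_in_slave_domain - master_rendering_time
    slave_start = slave_device_time - slave_rendering_time
    # Equal start times mean the two devices are rendering in sync.
    return master_start - slave_start

# Clocks in the same domain, slave has rendered one second less content:
# the offset is negative, signalling the slave to speed up (for video,
# e.g. by skipping frames, as in the example above).
offset = rendering_time_offset(100, 50, 100, 49, 0)
```

Under these conventions a negative offset means the slave is behind and must speed up; a positive offset means it is ahead and must slow down (e.g., by repeating frames).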
FIG. 2 is a diagram illustrating the calculation of the time domain differential between two devices. Device 1 initially sends to device 2 a time domain message 301 that includes its current device time, referred to as "sendtime1." When device 2 receives the time domain message, it stores sendtime1 along with the time it received the time domain message, referred to as "receivetime1." Device 2 then sends to device 1 a time domain message 302 that includes its device time, referred to as "sendtime2," along with sendtime1 and receivetime1. When device 1 receives the time domain message, it stores sendtime1, receivetime1, and sendtime2 along with its device time, referred to as "receivetime2." Device 1 now has enough information to calculate the time domain differential according to the following formula:

Diff1 = ((RT1 - ST1) + (ST2 - RT2)) / 2

where Diff is the time domain differential, RT is receive time, and ST is send time. Device 1 then sends a time domain
`
message 303 to device 2 that includes its device time, referred to as "sendtime3," along with sendtime2 and receivetime2. When device 2 receives the time domain message, it stores sendtime2, receivetime2, and sendtime3 along with its device time, referred to as "receivetime3." Device 2 now has enough information to calculate the time differential according to a similar formula.
This formula calculates the difference between the send time and the receive time for time domain messages between the two devices. If there was no variation in the time domains between the devices, then the send and receive times would reflect the communications link latency between sending and receiving the time domain messages. In one embodiment, the synchronization system assumes that the latency in transmitting a message from one device to another device is approximately the same as the latency in transmitting the message from the other device to the device. Thus, the synchronization system calculates the time domain difference by taking the average of the differences in the send and receive times of the messages. The receive times of the messages are represented by the following equations:

RT1 = ST1 + Diff + L
RT2 = ST2 - Diff + L

where Diff represents the time domain differential and L represents the latency of the communications link. These equations are equivalent to the following equations:

Diff = RT1 - ST1 - L
Diff = ST2 - RT2 + L

The average of these two equations is

Diff = ((RT1 - ST1 - L) + (ST2 - RT2 + L)) / 2

The latency factors out of the equation to give the following equation:

Diff = ((RT1 - ST1) + (ST2 - RT2)) / 2

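Under the symmetric-latency assumption above, the computation reduces to a few lines. This sketch uses illustrative names; in it, ST1/RT2 are stamped with the peer's clock and RT1/ST2 with the local clock, so the result is the local clock's lead over the peer's:

```python
def time_domain_differential(st1, rt1, st2, rt2):
    """Average the two one-way clock-offset estimates.

    st1: send time of the last message received from the peer (peer's clock)
    rt1: local time at which that message was received (our clock)
    st2: send time of our reply (our clock)
    rt2: time at which the peer received our reply (peer's clock)
    Returns how far our clock is ahead of the peer's; the (assumed
    symmetric) link latency L cancels in the average.
    """
    return ((rt1 - st1) + (st2 - rt2)) / 2
```

For example, with our clock 100 units ahead and a one-way latency of 5, the peer sends at 1000 (its clock), we receive at 1105 (ours), we reply at 1200 (ours), and the peer receives at 1105 (its clock): the two one-way estimates are 105 and 95, and their average recovers the true differential of 100 regardless of the latency.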
FIG. 3 illustrates a time domain table for a rendering device in one embodiment. The time domain table of a device includes a row for each other device to which the device is connected. For example, the audio rendering device 103 of FIG. 1 would have a row for the source device 101, the video rendering device 102, and the text rendering device 104. In this example, the time domain table includes a node identifier column 301, a sendtime1 column 302, a receivetime1 column 303, a sendtime2 column 304, a receivetime2 column 305, and a time domain differential column 306. A positive time domain differential indicates the number of time units that this device is ahead of the other device, and a negative time domain differential indicates the number of time units that this device is behind the other device. Thus, in this example, the device time of the audio rendering device 103 is ahead of the device time of the source device 101 by 1000 time units. In contrast, the device time of the audio rendering device 103 is behind the device time of the video rendering device 102 by 495 time units. One skilled in the art will appreciate that the time units can be any units appropriate to the desired synchronization accuracy, such as milliseconds, microseconds, and so on. One skilled in the art will appreciate that the time domain messages need not include the times set by the receiving device. For example, the time domain message 302 need not include sendtime1 since device 1 could have stored that time locally.
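A time domain table like FIG. 3's can be sketched as a mapping keyed by node identifier. The timestamp values below are invented for illustration but chosen to be consistent with the differentials in the example above (audio 1000 units ahead of the source, 495 behind video, with a one-way latency of 5):

```python
# Illustrative layout for the time domain table of FIG. 3 (column
# names follow the figure). A positive "diff" means this device's
# clock is ahead of that node's clock.
time_domain_table = {
    "source": {"st1": 1000, "rt1": 2005, "st2": 2100, "rt2": 1105, "diff": 1000},
    "video":  {"st1": 4000, "rt1": 3510, "st2": 3600, "rt2": 4100, "diff": -495},
}

def to_local_time(table, node_id, remote_device_time):
    """Convert a device time stamped with node_id's clock into this
    device's time domain, as a slave does with a master device time."""
    return remote_device_time + table[node_id]["diff"]
```

Note that each row's diff equals ((rt1 - st1) + (st2 - rt2)) / 2, so the stored timestamps and the stored differential stay consistent.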
FIG. 4 illustrates a block diagram of another embodiment of the synchronization system. In this example, the source device 400 performs the function of the master, and the video rendering device 401, the audio rendering device 402, and the text rendering device 403 are slaves. In particular, the source device, even though it does no rendering itself, may keep track of an idealized rendering time that may not correspond to the actual rendering time of any of the rendering devices. In such a situation, the master source device periodically sends a rendering time message that includes its device time along with the corresponding idealized rendering time to each of the rendering devices. The rendering devices can then adjust their rendering in the same manner as if the rendering time message were sent from a master rendering device. Alternatively, each rendering device can provide its device time and corresponding rendering time to the source device. The source device can then calculate the rendering time differential for each rendering device and provide that differential to the rendering devices to speed up or slow down their rendering as appropriate.
FIG. 5 is a block diagram illustrating components of a content rendering device in one embodiment. The content rendering device 500 includes a receive content component 501, a render content component 502, a send time domain message component 503, a receive time domain message component 504, a time domain table 505, a send rendering time message component 506, and a receive rendering time message component 507. The receive content component receives content from the source device and may store the content in a buffer for subsequent rendering. The render content component retrieves the buffered content and effects the rendering of the content. The send time domain message component sends time domain messages to the other devices. The send time domain message component may send the message upon occurrence of an event, such as when a timer expires, when a message is received, and so on. One skilled in the art will appreciate that the frequency of sending time domain messages can be adjusted to account for the anticipated drift between clocks of the rendering devices. The receive time domain message component receives the time domain messages sent by other devices and updates the time domain table as appropriate. The send rendering time message component is used when this content rendering device is a master rendering device to send a rendering time message to the other rendering devices. The receive rendering time message component receives the rendering time messages sent by the master device and calculates a rendering time differential that is used to adjust the rendering of the content. The devices may include a central processing unit, memory, input devices (e.g., keyboard and pointing devices), output devices (e.g., display devices), and storage devices (e.g., disk drives). The memory and storage devices are computer-readable media that may contain instructions that implement the synchronization system. In addition, data structures and message structures may be stored or transmitted via a data transmission medium, such as a signal on a communications link. Various communications links may be used, such as the Internet, a local area network, a wide area network, or a point-to-point dial-up connection.
FIG. 6 is a flow diagram illustrating the processing of the send time domain message component in one embodiment. In block 601, the component adds the identifier of this device to the time domain message. In block 602, the component adds the send time to the message. The send time is the current device time. In blocks 603-607, the component loops, selecting each other device in turn and adding that device's times to the time domain message. In block 608, the component sends the time domain message to the other devices and then completes.
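The message assembly of blocks 601-608 can be sketched as follows; the dictionary layout and field names are illustrative assumptions, not the patent's format:

```python
def build_time_domain_message(self_id, device_time, peers):
    """Assemble a time domain message per FIG. 6.

    peers maps each known device id to a (last_send, last_receive)
    pair: the send time of the last message received from that device
    and the local device time at which it was received.
    """
    msg = {"sender": self_id,        # block 601: add id to msg
           "send_time": device_time, # block 602: add send time to msg
           "entries": []}
    for peer_id, (last_send, last_receive) in peers.items():
        msg["entries"].append({
            "device": peer_id,                  # block 605
            "last_send_time": last_send,        # block 606
            "last_receive_time": last_receive,  # block 607
        })
    return msg  # block 608: ready to send to the other devices
```

Echoing each peer's last send time together with the local receive time is what gives every receiver both halves of the exchange needed for the differential formula above.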
`
`
FIG. 7 is a flow diagram of the receive time domain message component in one embodiment. In decision block 701, if the identifier of this device is in the list of device identifiers in the message, then the component continues at block 702, else the component completes. In block 702, the component retrieves the current send time from the message and saves it in the time domain table. In block 703, the component retrieves the last send time from the message and saves it in the time domain table. In block 704, the component retrieves the last receive time from the message and saves it in the time domain table. In block 705, the component retrieves the device time as the current receive time and saves it in the time domain table. The time values may be saved by storing them in the time domain table in the row associated with the device that sent the message. In block 706, the component calculates the time domain differential. In block 707, the component smooths the time domain differential. The time domain differential can be smoothed using various techniques, such as averaging the last several time domain differentials using a decaying function to limit the impact of the oldest time domain differentials. In one embodiment, the synchronization system saves the values of the last eight pairs of time domain differentials (i.e., ST2-RT2 and RT1-ST1) and uses the average of the minimum value of the set of eight larger differentials and the maximum value of the set of eight smaller differentials as the time domain differential. The component then completes.
FIG. 8 is a flow diagram illustrating the render content component in one embodiment. In blocks 801-806, the component loops processing each block of content that is received from the source device. In block 801, the component selects the next block of content provided by the source device. The content may be buffered at this rendering device. In decision block 802, if all the blocks of content have already been selected, then the component completes, else the component continues at block 803. In decision block 803, if the rendering time differential is 0, then the component continues at block 806, else the component continues at block 804. The rendering time differential is calculated by the receive rendering time message component and adjusted by this component as the rendering of the content is adjusted. In block 804, the component adjusts the selected block to account for the rendering time differential. For example, if the content corresponds to video