July 23, 1968          R. WERNIKOFF ET AL          3,394,352
METHOD OF AND APPARATUS FOR CODE COMMUNICATION
Filed July 22, 1965                    8 Sheets—Sheet 1

[FIG. 1A: block diagram of the information source S emitting the sequence x1, x2 ... xm ... to the bank of coders C0 ... Ck. FIG. 1B: graph of the number of output symbols from the coders C0, C1 ... Ck against the incoming source sequence x1, x2 ... xm ....]

Inventors:
Robert Wernikoff
Paul Epstein
William F. Schreiber

Commvault Ex. 1009
Commvault v. Realtime
US Patent No. 9,054,728
Page 1
July 23, 1968          R. WERNIKOFF ET AL          3,394,352
METHOD OF AND APPARATUS FOR CODE COMMUNICATION
Filed July 22, 1965                    8 Sheets—Sheet 2

[FIG. 2: number of output symbols from coders C0 (the null coder) and Ck plotted against source symbols x1, x2, x3, x4 ..., with slopes unchanged and curves translated. FIG. 3: the corresponding plot after the population is switched.]
July 23, 1968          R. WERNIKOFF ET AL          3,394,352
METHOD OF AND APPARATUS FOR CODE COMMUNICATION
Filed July 22, 1965                    8 Sheets—Sheet 3

[FIGURE 4: block diagram of the generic encoding and transmitting system, including a "stop — buffer full" control branch.]
July 23, 1968          R. WERNIKOFF ET AL          3,394,352
METHOD OF AND APPARATUS FOR CODE COMMUNICATION
Filed July 22, 1965                    8 Sheets—Sheet 4

[FIGURE 4A: block diagram of the output formatting system, showing the select-output-position logic, standard and alternate output positions, the output position switch, and the transmitter.]
July 23, 1968          R. WERNIKOFF ET AL          3,394,352
METHOD OF AND APPARATUS FOR CODE COMMUNICATION
Filed July 22, 1965                    8 Sheets—Sheet 5

[FIGURE 4B: block diagram of the generic decoding or receiving system, showing the mode-word detector, input buffer, register, decoder, symbol counter, and decoded symbol output.]
July 23, 1968          R. WERNIKOFF ET AL          3,394,352
METHOD OF AND APPARATUS FOR CODE COMMUNICATION
Filed July 22, 1965                    8 Sheets—Sheet 6

[FIG. 5: block diagram of the transmitting section of the facsimile scanning system; the block labels are rotated in the original drawing.]
July 23, 1968          R. WERNIKOFF ET AL          3,394,352
METHOD OF AND APPARATUS FOR CODE COMMUNICATION
Filed July 22, 1965                    8 Sheets—Sheet 7

[FIG. 5A: block diagram of the receiving section of the facsimile scanning system; the block labels are rotated in the original drawing.]
July 23, 1968          R. WERNIKOFF ET AL          3,394,352
METHOD OF AND APPARATUS FOR CODE COMMUNICATION
Filed July 22, 1965                    8 Sheets—Sheet 8

[FIG. 5B: circuit modification employable with the system of FIG. 5, including an input from edge gate 63.]
United States Patent Office

3,394,352
Patented July 23, 1968

3,394,352
METHOD OF AND APPARATUS FOR CODE COMMUNICATION
Robert Wernikoff, Cambridge, Paul Epstein, Brookline, and William F. Schreiber, Lexington, Mass., assignors to Electronic Image Systems Corporation, Boston, Mass., a corporation of Massachusetts
Filed July 22, 1965, Ser. No. 474,189
35 Claims. (Cl. 340—172.5)
`
`
ABSTRACT OF THE DISCLOSURE

The methods and apparatus disclosed minimize the number of symbols to be transmitted, and hence the transmission time, in a communication system, by determining the most efficient code for encoding sequences of message symbols and transmitting the symbols in that code. Determination of the most efficient code can be accomplished by applying the message symbols to a plurality of encoders, counting the numbers of symbols of coded representations of successive message symbols produced by each encoder, and comparing the numbers to determine the particular code that introduces the smallest number of symbols of coded representation consistent with distortionless message transmission. Tagging symbols are transmitted to permit reconstruction of the original message symbols from the transmitted coded symbols. In an application of the invention to facsimile transmission, run-length coding is employed.
`
The present invention relates to methods of and apparatus for code communication, being more particularly, though not in all cases exclusively, directed to efficient coding for inhomogeneous sources of information.

The problem of processing information-to-be-communicated so as to enable a minimum number of symbols to be transmitted with reliable recovery of the information has long challenged the art. In devising his telegraphic code more than a century ago, Samuel F. B. Morse generally assigned the shortest code symbols to the most frequently used English alphabet letters in order to economize on the transmission time of messages. Had Morse optimized the assigning of the shorter code symbols to the more frequent letters, this would have produced about a fifteen-percent saving in transmission time over assigning symbols of equal length to all letters. Far greater saving, however, could have been effected by assigning code symbols, not for individual letters, but for pairs, triplets or other groupings of letters that in normal text occur much more frequently than others. Within recent years, indeed, Claude Shannon has estimated that the maximum saving in transmission capacity that can be obtained through optimum encoding of English text is about seventy-five percent when statistical dependences between successive letters of text, carried out to one hundred letters, are considered.
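Morse's economy can be made concrete with a toy four-letter source (the frequencies and codeword lengths below are hypothetical illustrations, not figures from the patent): matching codeword length to letter frequency shortens the average transmission whenever the frequencies are unequal.

```python
# Hypothetical letter frequencies for a four-letter alphabet.
freqs = {"A": 0.5, "B": 0.25, "C": 0.125, "D": 0.125}

equal_lengths = {s: 2 for s in freqs}               # 2 binary symbols each
matched_lengths = {"A": 1, "B": 2, "C": 3, "D": 3}  # shortest to most frequent

def avg_length(lengths):
    """Expected number of code symbols per source letter."""
    return sum(freqs[s] * lengths[s] for s in freqs)

print(avg_length(equal_lengths))    # 2.0
print(avg_length(matched_lengths))  # 1.75, a 12.5 percent saving
```

The matched lengths form a valid prefix code for this distribution, so the saving is realizable, not merely arithmetic.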
When the symbols issuing from the information source are statistically dependent upon one another, so that certain groupings of symbols occur more frequently than others, efficient coding requires that the symbols be considered not singly but in groups. The number of elements of which the group must consist is dictated by the range of substantial statistical dependence. It might amount to ten or twenty letters in English text, and to several thousand bits in pictorial transmission. The size of the code book in which groups of source symbols are assigned code symbols increases fantastically as the group is made larger (the number of entries in the code book is 27^10 = 205,900,000,000,000 for groups of ten English letters, and it is the square of that number for groups of twenty letters) and this growth soon becomes an insurmountable practical limitation to optimum coding. There are no non-singular codes that circumvent this problem.
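The codebook growth described above is easy to verify numerically (a minimal check; the 27-character alphabet counts the 26 letters plus the space, a convention consistent with the patent's figure):

```python
def codebook_entries(group_size: int, alphabet: int = 27) -> int:
    """Entries needed when every possible letter group gets its own codeword."""
    return alphabet ** group_size

print(codebook_entries(10))  # 205891132094649, the patent's ~205,900,000,000,000
print(codebook_entries(20) == codebook_entries(10) ** 2)  # True
```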
An optimum code is matched as closely as possible to the statistical peculiarities of the population or set of sequences to be encoded. A code designed to match the letter frequencies of English text, for example, becomes quite inefficient when it is applied to a different population or set of sequences, such as a foreign language. Whenever dissimilar populations are mixed (for example, for the purpose of obtaining a single set of letter frequencies applicable to English and to a foreign language simultaneously) the frequencies of the letters tend to equalize; and any operation on sequences that equalizes probabilities reduces the possible compression. In the limit, indeed, if an information source is as likely to produce one symbol as another, and if successive symbols do not depend on previous ones, then no compression whatsoever is possible.

The reduction of transmission time by means of coding is thus possible only to the extent that the statistics of the strings of symbols produced by information sources differ from independence and equiprobability. The recipe for the optimum code is fixed in a deterministic, mathematical way by the precise probability distributions that describe the information source.
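The limiting case can be checked with Shannon's entropy formula (a sketch for illustration, not apparatus from the patent): an equiprobable, memoryless source attains the full log2 N bits per symbol, leaving no redundancy for any code to remove, while a skewed distribution leaves room for compression.

```python
import math

def entropy_bits(probs):
    """Shannon entropy, in bits per symbol, of a memoryless source."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

uncoded_cost = math.log2(4)                    # 2 bits/symbol for 4 symbols
print(entropy_bits([0.25] * 4))                # 2.0 -> no compression possible
print(entropy_bits([0.7, 0.1, 0.1, 0.1]) < uncoded_cost)  # True -> redundancy
```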
While it is difficult to reconcile language text, television, moving pictures, facsimile, telemetered data, etc., with the mathematical idea of a homogeneous population with a fixed, immutable and known frequency distribution, yet this abstract concept underlies all prior coding theory. In accordance with Shannon's theorem, an information source is contemplated that produces sequences of symbols (but not just any sequence) that may be regarded as random samples from a homogeneous population with known and immutable frequency distributions of all orders (i.e., for pairs of symbols, triplets, ..., n-tuples). The term "homogeneous" (the technical word is "ergodic") implies that any sufficiently long sequence from the source is very likely to be representative of any other source sequence.
Adherence to this framework has, however, resulted in prior-art coding and compression systems that fall far below the ultimate transmission efficiency. The reason is that the mathematical framework on which they are based does not sufficiently resemble reality. In accordance with the present invention, on the other hand, as pragmatically determined from actual experiment, it is not assumed that the population of symbols (letters of text, or elements of television or facsimile pictures, for example) is a homogeneous source in the sense of information theory; but, rather, that inhomogeneous sources are involved, requiring a novel coding logic that has been found in practice to more than double the average compression attained by prior-art systems implementing conventional coding theory, and with less equipment. In addition, the coder for inhomogeneous sources, underlying the invention, results in a transmission time that never exceeds the uncoded time and sometimes is fifty times smaller.

An object of the invention, thus, is to provide a new and improved method of and apparatus for code communication that is not subject to the above-described disadvantages of prior-art coding and compression systems, predicated upon conventional coding theorems; but that, to the contrary, provides for greater transmission-time efficiency and with less complicated and costly apparatus.

A further object is to provide a new and improved scanning transmission and reception system for such purposes as facsimile or other communication, all generically referred to as "picture" scanning and the like.

Still another object is to provide novel code transmission and reception apparatus.

Other and further objects will be made apparent hereinafter and are more particularly pointed out in the appended claims.
The invention will now be described in connection with the accompanying drawings, FIG. 1A of which is a schematic block diagram of a portion of a transmitter useful with the invention;
FIG. 1B is a graph illustrative of the performance of the system of FIG. 1A;
FIGS. 2 and 3 are similar graphs of modified performance;
FIG. 4 is a block diagram of a preferred generic encoding and transmitting system embodying the invention;
FIG. 4A is a block diagram of the output formatting system of FIG. 4;
FIG. 4B is a similar diagram of a preferred generic decoding or receiving system for use with the transmitting system of FIG. 4, FIGS. 4, 4A and 4B, as a unit, together constituting a complete system employing the invention;
FIGS. 5 and 5A are block diagrams of the transmitting and receiving sections, respectively, of a preferred facsimile or similar scanning system using the novel techniques of the invention as a species of the generic concept of FIGS. 4, 4A and 4B; and
FIG. 5B is a circuit modification that may be employed with the system of FIG. 5.

Considering for purposes of illustration an information source that produces sequences drawn not necessarily from one homogeneous population, but possibly from several different populations, some known and others perhaps not yet identified, and considering that the source may switch from one population to another at random and without stopping; that the different populations are used in unknown, and maybe even in variable proportions; and that switching from one population to the next occurs at random times, an example of the method underlying the present invention may be constructed as follows. For purposes of illustration only, it may be considered that this operation occurs when different-language messages are successively transmitted from United Nations Headquarters.

Since it is known how to gain a useful compression in any one language (or, more generically, population), it is proposed to construct an apparatus that effectively has a collection of encoders, one matched to each language (or population). Then, as the source switches around from one language (or population) to the next, the apparatus automatically follows and, at each instant, uses the code best suited to the output of the source. Each time the coder switches codes, it sends a special signal to the receiver indicative of the code of the following message, so that the receiver can apply the corresponding decoding procedure and recover the original message. The problem is, however, to enable the coder to determine when the source switches populations. The coder must determine this as soon as possible, so that it may switch codes promptly before any great loss in transmission efficiency has been incurred.

What makes the problem difficult is that the conditions of this oversimplified illustration, in which it is possible to think of the source as switching around among a finite and definite number of discrete populations, are not often realized in practice. It is more usual to find real sources changing their behavior in a continuous way, and to find the statistics of the corresponding source sequences assuming a continuum of values; thus there does not exist, even in principle, a unique, well-defined time at which codes should be switched. The present invention provides the solution both for the simple case of switching among discrete sources and for the more important and realistic case of gradually changing source statistics; but the method and apparatus of the invention are more clearly understandable if initially described in terms of the simpler case.

Since it is not known which population the source is currently drawing its sequence from, according to the present invention, the message sequence from the information source is fed simultaneously to all encoders. Thus, in FIG. 1A, the sequence of information source symbols x1, x2 ... xm ... is shown fed from the source S to each of the different encoders or coders C0, C1, C2 ... Ck. In view of the fact that all possible source populations are not known, it is useful to include one encoder C0 that performs pure, uncoded transmission; its output is identical with its input. We shall call this the null encoder. As each successive source symbol enters a given encoder, the encoder output (which may differ from all the rest) will be of several different types. The output may be "no" symbols (if that particular code works on blocks of source symbols, and the end of the block hasn't been reached yet); or it may be one symbol (as in the null encoder C0 and others); or it may be many symbols (as in a coder that has reached the end of a block or of a run of source symbols). In FIG. 1B, accordingly, there are plotted the numbers of symbols issuing from each coder as successive source symbols enter it from the source S.

If the population statistics are truly stationary, each coder will be characterized by an average straight-line behavior whose slope w determines the compression (or expansion) produced by that code. The null encoder C0 has unit slope if the input and output alphabets are the same, because it puts out one symbol for each successive input symbol. If the input and output alphabets are different, the one at the input having AI letters and the one in the output AO letters, then the previous statement is still correct after all ordinate values have been divided by log AI/log AO. In the sequel, it is tacitly assumed that the alphabets are equal or that the indicated normalization has been done, particularly since a change in alphabets does not change the channel capacity required for message transmission and hence does not affect efficiency.

Coder C1 in FIG. 1A may, as an illustration, produce on the average fewer symbols than enter it; thus, it produces a characteristic shown in FIG. 1B of slope w1 < 1. Coder Ck, on the other hand, may produce on the average more than one symbol per input symbol, producing a slope wk > 1. These statements apply to the average behavior of the coders, shown as the dash lines in FIG. 1B. The actual number of output symbols put out by each coder for each input or arriving source symbol is, of course, a random variable whose value depends on the specific source symbol that entered the coder (these symbols are themselves random variables) and on the source symbols that entered the coder earlier. The actual number of symbols issuing from each coder, as represented by the solid lines in FIG. 1B, thus, is a random walk or step-curve about which the most that can be said, in general, is that it is monotonic nondecreasing.

When the source S abruptly switches populations, so that the source sequence has entirely different statistics, the dash-line curves of FIG. 1B that describe the average behavior of the coders will abruptly change slope, since a coder which formerly was quite efficient (w < 1) may now be very inefficient (w > 1). Automatic code-switching can be based on detecting changes in slope in the average behavior of the coders (i.e., the ratio of output to input symbols or the number of output symbols per unit of real time) and such operation in the system of the invention is for some purposes feasible; but it is preferred, because the average behavior is not known with any acceptable accuracy unless the source persists in its new behavior for a very long time, not to rely upon mere slope detection.

What is really available, as opposed to merely assumed, is the actual output from each coder; i.e., the solid-line step curves of FIG. 1B. The randomness of these curves makes the measurement of slopes unreliable, and masks the precise point at which the switch in populations occurred; and there is, in fact, no such point with real sources, whose population parameters change gradually, rather than abruptly. Thus any code-switching systems based on slopes or other population averages, such as entropy, would at best merely be crude approximations.

In accordance with the preferred form of the present invention, therefore, the code-switching logic is based, not on abstract constructs like average slopes or on unsupported assumptions about physical sources, but on the final goal of the coding system, which is to transmit as few symbols as possible without distorting the information produced by the source. Attention is thus focussed on the symbol counts at the outputs of the encoders; and at any given time, that encoder is chosen which leads to the fewest transmitted symbols. This operation can be implemented in a manner now to be described.
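The symbol counting just described can be sketched with a toy coder bank (an illustrative model; the run-length coder and its two-symbols-per-run cost are assumptions for the sketch, not details from the patent). The cumulative counts play the role of the step curves of FIG. 1B.

```python
def null_coder_counts(seq):
    """C0, the null coder: one output symbol per input symbol (cumulative)."""
    return list(range(1, len(seq) + 1))

def run_length_counts(seq, symbols_per_run=2):
    """Cumulative output counts of a toy run-length coder that emits a
    (value, length) pair, costed here at 2 symbols, each time a run ends."""
    counts, out = [], 0
    for i, s in enumerate(seq):
        if i == len(seq) - 1 or seq[i + 1] != s:  # run ends at position i
            out += symbols_per_run
        counts.append(out)
    return counts

msg = "aaaabbbbbbcc"  # a run-structured source favours the run-length coder
print(null_coder_counts(msg)[-1])  # 12 symbols uncoded
print(run_length_counts(msg)[-1])  # 6 symbols: three runs at 2 symbols each
```

Feeding the same sequence to every coder and comparing the final counts is exactly the comparison the switching logic performs.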
Each time the transmitter S switches codes, it must convey this fact to the receiver; that is, the transmitter must tell the receiver which code it will be using next. The accomplishment of this function requires transmitting a few symbols, representative of the name of the new code. If the transmissions are in binary alphabet, then it takes one symbol to identify one out of two possible codes; two symbols to identify one out of four possible codes; three symbols for eight possible codes; and so on. Each additional symbol doubles the number of codes that can be identified. With a quaternary alphabet, one symbol identifies one of four codes; two symbols, one of sixteen codes; and so on. Each additional symbol quadruples the number of codes that can be so identified. Since, in most applications, it is unlikely that a practical system would have more than a dozen or two different codes, it never takes more than three or four symbols to identify the new code. The receiver, however, is confronted with a continuing stream of symbols and it will not be able to distinguish between ordinary message symbols and code-switching instructions unless special provision is made to provide such instructions in an unambiguous manner. This may be done by inserting in the message stream a uniquely decipherable tagging word which is the same in all codes and which informs the receiver that the immediately following symbols define the new code. To effect this requires a few symbols, say ten or so. The switching signal, therefore, may consist altogether of fifteen symbols or so, this being short compared with most intervals of source homogeneity. Since some transmission time is consumed by sending instruction or tagging words, however, the number of transmitted instruction or tagging symbols must be noted.
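The code-naming cost described above is simple exponent arithmetic, easily checked:

```python
def identifiable_codes(symbols: int, alphabet_size: int) -> int:
    """Number of distinct codes that `symbols` tagging symbols can name."""
    return alphabet_size ** symbols

# Binary alphabet: each extra symbol doubles the count.
print(identifiable_codes(1, 2), identifiable_codes(2, 2), identifiable_codes(3, 2))

# Quaternary alphabet: each extra symbol quadruples the count.
print(identifiable_codes(1, 4), identifiable_codes(2, 4))

# Sixteen codes are named by four binary symbols.
import math
print(math.ceil(math.log2(16)))  # 4
```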
In fact, in accordance with the invention, it is the transmission cost of the instruction or tagging symbols that supplies the discriminant for the code-switching decision; that is, codes are switched precisely then, when what is saved in transmission cost by replacing the old code with a new one just exceeds the cost of sending the instruction or tagging word that announces the change.

Assume that the information source S is allowed to feed source symbols to the encoders, as in FIG. 1A, and that there are buffers, later described, inserted between the encoders and the transmitter output so that time is afforded before a decision is made as to which coder output is to be transmitted. As successive source symbols x1, x2 ... etc., enter the coders, the number of symbols coming out of each coder is counted, presenting an output that, if plotted, is of the type shown in the graphs of FIG. 1B. If, for purposes of explanation, the graphs are temporarily thought of as smooth lines, as shown in FIG. 2, rather than as random walks or steps, then, as the source symbols x1, x2 ... etc., enter the coders C0, C1 ... etc., the graphs representative of output symbol counts will begin to diverge from each other. If compression is at all possible, one or more coder lines (for example, Ck in FIG. 2) will be below the null-coder line C0, so-labelled.

Assume that, as a convention, the receiver always presupposes that transmissions commence uncoded (i.e., using the output from C0) unless the very first received word is a code-instruction or tagging word. The transmitter, before sending anything, first examines the outputs from the various coders as they enter the later-described buffers that precede the output register, with the following objective. In the source sequence x1, x2 ... etc., the smallest source-subscript α is sought such that

(1)  Ck(xλ, xα) < C0(xλ, xα) - 2H

for some k, where H is the number of tagging symbols that must be transmitted to change codes, λ (initially zero) is the number of source symbols that already have been transmitted, and Ck(xλ, xα) is the number of symbols that must be transmitted by coder Ck to represent all source symbols starting with xλ+1 and ending with xα. It costs H symbols to start transmitting in Ck instead of C0; but, if the first α source symbols are transmitted with the Ck code, then a minimum of 2H - H = H symbols are saved. Even if, at xα, the line for Ck veers up sharply and Ck begins to cost more than C0 per unit input symbol, therefore, the transmitter can switch back to C0 (an operation that costs another H symbols) and still not have lost anything with respect to uncoded transmission.
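Under a simplified model (cumulative output counts per coder, 0-based positions, λ = 0), condition (1) becomes a direct search for the first position at which some coder beats uncoded transmission by more than the 2H switching margin. The counts below are illustrative, not taken from the patent.

```python
def find_switch_point(counts, lam=0, H=1):
    """Smallest alpha with Ck(x_lam, x_alpha) < C0(x_lam, x_alpha) - 2H.

    counts maps a coder name to its cumulative output-symbol counts;
    'C0' must be the null coder.  Returns (alpha, coder_name) or None.
    """
    def segment(name, a):  # symbols emitted for inputs lam+1 .. a (0-based)
        prior = counts[name][lam - 1] if lam > 0 else 0
        return counts[name][a] - prior

    for alpha in range(lam, len(counts["C0"])):
        for name in counts:
            if name == "C0":
                continue
            if segment(name, alpha) < segment("C0", alpha) - 2 * H:
                return alpha, name
    return None

counts = {
    "C0": [1, 2, 3, 4, 5, 6, 7, 8],  # null coder: one symbol per input
    "Ck": [2, 2, 2, 2, 2, 2, 2, 2],  # pays 2 symbols up front, then none
}
print(find_switch_point(counts, lam=0, H=1))  # (4, 'Ck'): first win over 2H
```

Once a switch point is found, the same search restarts with the new coder as the baseline, which is the role of condition (2).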
Thus, the code-switching logic of the invention issues a command that initiates transmission of a code-instruction word identifying Ck, and of those symbols in the Ck output buffer which correspond to the first α source symbols. While transmission is taking place, the code-switching logic allows enough new source symbols to enter the coders so that the logic can decide what code to use next before the transmitter has finished sending the symbols of the previous block. The object here is to keep the transmission line fully loaded at all times. This requirement defines the minimum computation rate for the switching logic, and it also defines the rate at which source symbols must be supplied to the coders.

To decide what code to use in the next step, it is noted that the cumulative number of symbols that the coders would have sent for the first α source symbols is now quite immaterial. What matters is how many symbols per source symbol they produce from xα on. There is subtracted from each line of FIG. 2, therefore, its value at xα, resulting in the graph of FIG. 3.

Now the system is initially in the Ck code, λ has assumed its new value, and the system will stay with this code unless a new α is reached such that

(2)  Cj(xλ, xα) < Ck(xλ, xα) - 2H, for some j

If this happens, the system transmits the symbols between xλ+1 and xα in code Cj. If this does not happen before the Ck output buffer fills up, then transmission continues in code Ck. The Ck buffer is emptied, all the other buffers are cleared, and the above procedure is repeated with all the coder output lines now originating from that source symbol which would have caused the Ck buffer to overflow if the system had not just then stopped the search procedure.

The method of code selection outlined above is simple and effective, but it occasionally results in a transmission cost higher than what is incurred with uncoded transmission. A preferred method of code selection that never results in transmitting more symbols than are required for uncoded transmission might be tabulated as follows:

Coder logic when discriminant equals 2H

Let M represent the maximum capacity of coder buffers.
Step 1.—The value of k is initially zero, as is the value of i in Steps 2, 5 and 7. Proceed to Step 2.
Step 2.—Search for the smallest source sequence subscript α such that conditions (3) and (4) are both satisfied for some values of k:

(3)  α - λ ≤ M
(4)  Ck(xλ, xα) ≤ Ci(xλ, xα) - 2H

If no such value of α is found, proceed to Step 7; otherwise proceed to Step 3.
Step 3.—Using the α value and k value found in Step 2,
`

`

`3,394,352
`
`8
`script :1 such that conditions (8) and (9) are both satis-
`fied for one or more valucs of :‘i:
`
`7
`determine whether or not condition (5) is satisfied:
`
`Curt, x. ) scam. x. ) —H
`
`(5)
`
`If condition (5) is satisfied, proceed to Step 4; otherwise
`proceed to Step 5.
`Step 4.——Using the a value and k value found in Step 2,
`transmit a code~instruction word,
`transmit
`the next
`or.
`source symbols in code Ck, set the value of i in Steps 2,
`5 and 7 equal to the value of k and proceed to Step 9.
`Step 5.—-—Using the a value found in Step 2, determine
`whether or not condition (6) is satisfied:
`
`CI
`
`10
`
`CAI“ xa ) ECOLVM X4)
`
`(6)
`
`If condition (6) is satisfied, transmit the next a source
`symbols in code C, and proceed to Step 9; otherwise pro-
`ceed to Step 6.
`Step 6,—Transmit a code-instruction word for code
`Co, transmit the next a source symbols in code Co, set the
`value of i in Steps 2, 5 and 7 equal to zero and proceed
`to Step 9.
`Step 7.—Determine whether or not condition (7) is
`satisfied:
`
`20
`
`CIU‘A» XA+M) Scull}, XMM)
`
`(7)
`
`a—AEM
`
`(8)
`
`(9)
`Ch("‘>n -‘Au)<CD(I.t: \ui —2I"
`If no such value of a is found, proceed to Step 3; other-
`wise proceed to Step 5.
`Step 3.—Transmit M source symbols in C0. Proceed to
`Step 4.
`the value of .\ equal to the total number
`Step 4.~—Set
`of source sequence symbols that have now been trans-
`mitted. Proceed to Step 2.
`Step 5.—Using the value of or found in Step 2 and using
`the preferred value, relative to .\ and a of the set of values
`of 11 found in Step 2, transmit a code-instruction word,
`transmit
`the next a source symbols in Ch. and set
`the
`value of i
`in Steps 7, 8, 9, 12 and 13 equal to the pre-
`ferred value of h. Proceed to Step 6.
`Step 6.—Set the value of ,\ equal to the total number
`of source symbols that now have been transmitted. Pro—
`ceed to Step 7.
`Step 7.-—Search for the smallest source sequence sub—
`script s: such that conditions (10) and (11) are both satis-
`fied for one or more values of It:
`u—REZrM
`
`C'k( A}, X, ) <C,(.r,, .r,,) —1[
`
`(10)
`(11)
`
`30
`
`If no such value of at
`wise proceed to Step 9.
`Step 8.-—Determir.e whether or not condition (12) is
`satisfied:
`
`is found, proceed to Step 8; other-
`
`(12)
`(lift, l‘rlmlicuf-Yi» IMM)
`If condition (12) is satisfied, transmit the next M source
`symbols in C1 and proceed to Step 6; otherwise transmit
`a code-instruction word for CO transmit the next M source
`symbols in CD and proceed to Step 4.
Step 9.—Using the value of α found in Step 7, search for the smallest source sequence subscript β such that conditions (13) and (14) are both satisfied for one or more values of k:

(13) α − λ < β − λ ≤ M

(14) Ck(xλ, xβ) ≤ Ci(xλ, xβ) − H

If no such value of β is found, proceed to Step 11; otherwise proceed to Step 10.

Step 10.—Using the value of β found in Step 9 and using the preferred value, relative to λ and β, of the set of values of k found in Step 9, determine whether or not condition (15) is satisfied:

(15) Ck(xλ, xβ) ≤ C0(xλ, xβ) − H

If condition (15) is satisfied, transmit a code-instruction word for Ck, transmit the next β source symbols in Ck, set the value of i in Steps 7, 8, 9, 12, and 13 equal to the preferred value of k and proceed to Step 11.
Step 11.—Using the value of α found in Step 7 and using the preferred value, relative to λ and α, of the set of values of k found in Step 7, search for the smallest source sequence subscript δ such that conditions (16) and (17) are both satisfied for one or more values of j:

(16) α − λ < δ − λ ≤ M

(17) Cj(xλ, xδ) ≤ Ck(xλ, xδ) − H

If no such value of δ is found, proceed to Step 13; otherwise proceed to Step 12.

Step 12.—If the value of i is among the values of j found in Step 11, proceed to Step 13; otherwise proceed to Step 14.

Step 13.—Using the value of α found in Step 7, determine whether or not condition (18) is satisfied:

(18) Ck(xλ, xα) ≤ C0(xλ, xα) − H

If condition (18) is satisfied, transmit the next α source
`
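The searches of Steps 7 through 13 share one pattern: scan forward from the last transmitted symbol for the first subscript at which some candidate code undercuts the incumbent by at least one instruction-word length. The following Python fragment sketches only the Step 7 search; the helper coded_length (standing in for Ck(xλ, xα)), the toy code lengths, and all numeric values are assumptions for illustration, not the patented apparatus.

```python
# Illustrative sketch (not the patented apparatus) of the Step 7
# search: find the smallest subscript alpha, with alpha - lam <= M
# (condition (10)), at which at least one code k beats the current
# code i by the instruction-word length H (condition (11)).
# coded_length(k, lam, alpha) stands in for Ck(x_lam, x_alpha); it
# and the toy lengths below are assumed for illustration only.

def step7_search(coded_length, codes, i, lam, M, H):
    """Return (alpha, winners): the smallest qualifying subscript and
    the set of codes k satisfying (10) and (11) there, or
    (None, set()) if no subscript qualifies (i.e., go to Step 8)."""
    for alpha in range(lam + 1, lam + M + 1):        # condition (10)
        winners = {k for k in codes if k != i and
                   coded_length(k, lam, alpha)
                   <= coded_length(i, lam, alpha) - H}
        if winners:                                   # condition (11)
            return alpha, winners
    return None, set()

# Toy lengths: code 0 is the null coder (one output symbol per
# source symbol); code 1 halves the symbol count, rounding up.
def toy_length(k, lam, alpha):
    n = alpha - lam
    return n if k == 0 else (n + 1) // 2

alpha, winners = step7_search(toy_length, codes=[0, 1],
                              i=0, lam=0, M=16, H=3)
# With these toy numbers the search first succeeds at alpha = 6,
# where code 1 needs 3 output symbols against 6 - 3 = 3 allowed.
```

A corresponding loop, with the inequality retargeted to the pairs of codes named in conditions (13) through (18), would serve for the analogous searches of Steps 9 and 11.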
If condition (7) is satisfied, transmit the next M source symbols in code Ci and proceed to Step 9; otherwise proceed to Step 8.

Step 8.—Transmit a code-instruction word for code C0, transmit the next M source symbols in code C0, set the value of i in Steps 2, 5 and 7 equal to zero and proceed to Step 9.

Step 9.—Set the value of λ equal to the total number of source sequence symbols that have now been transmitted. Proceed to Step 2.
Using 2H, rather than H, as the discrimination level is ultra-conservative; its virtue is that it allows lossless return from any code to C0 in case it turns out, in the interval following the decision interval, that the chosen code and all the other codes have suddenly become more inefficient than uncoded transmission. Another way to obviate this danger is to monitor the interval following the decision interval before making a firm decision. When this is done, the discrimination level can be lowered from 2H to H without compromising the property that the system of the invention never transmits more inefficiently than uncoded transmission.
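The lossless-return property of the 2H level can be checked with a line of arithmetic. In this sketch every number is assumed for illustration: switching codes costs one code-instruction word of length H, so demanding a savings of at least 2H over the decision interval leaves H in reserve to pay for the return trip to C0.

```python
# Hedged arithmetic sketch of the 2H safety margin; all values are
# assumed for illustration and are not taken from the patent.
H = 3                     # code-instruction word length (assumed)
uncoded = 40              # C0 cost over the decision interval (assumed)
coded = uncoded - 2 * H   # worst case admitted by the 2H discriminant

# Switch in (H), transmit the interval in the chosen code, then
# switch back out to C0 (H): the round trip never exceeds the
# uncoded cost, even in this boundary case where they are equal.
round_trip = H + coded + H
assert round_trip <= uncoded
```

With a discriminant of only H, the same round trip would cost uncoded + H, which is why the single-H variant needs the extra monitoring interval described above.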
`
The requisite steps for this operation may be tabulated as follows:

Coder logic when discriminant equals H

Let M represent the capacity of the coder buffers, and, as before, let Ck(xλ, xα) be the number of symbols that

This document is available on Docket Alarm but you must sign up to view it.


Or .

Accessing this document will incur an additional charge of $.

After purchase, you can access this document again without charge.

Accept $ Charge
throbber

Still Working On It

This document is taking longer than usual to download. This can happen if we need to contact the court directly to obtain the document and their servers are running slowly.

Give it another minute or two to complete, and then try the refresh button.

throbber

A few More Minutes ... Still Working

It can take up to 5 minutes for us to download a document if the court servers are running slowly.

Thank you for your continued patience.

This document could not be displayed.

We could not find this document within its docket. Please go back to the docket page and check the link. If that does not work, go back to the docket and refresh it to pull the newest information.

Your account does not support viewing this document.

You need a Paid Account to view this document. Click here to change your account type.

Your account does not support viewing this document.

Set your membership status to view this document.

With a Docket Alarm membership, you'll get a whole lot more, including:

  • Up-to-date information for this case.
  • Email alerts whenever there is an update.
  • Full text search for other cases.
  • Get email alerts whenever a new case matches your search.

Become a Member

One Moment Please

The filing “” is large (MB) and is being downloaded.

Please refresh this page in a few minutes to see if the filing has been downloaded. The filing will also be emailed to you when the download completes.

Your document is on its way!

If you do not receive the document in five minutes, contact support at support@docketalarm.com.

Sealed Document

We are unable to display this document, it may be under a court ordered seal.

If you have proper credentials to access the file, you may proceed directly to the court's system using your government issued username and password.


Access Government Site

We are redirecting you
to a mobile optimized page.





Document Unreadable or Corrupt

Refresh this Document
Go to the Docket

We are unable to display this document.

Refresh this Document
Go to the Docket