July 23, 1968    R. WERNIKOFF ET AL    3,394,352
METHOD OF AND APPARATUS FOR CODE COMMUNICATION
Filed July 22, 1965    8 Sheets-Sheet 1

[FIG. 1A: schematic of an information source S feeding a bank of coders; FIG. 1B: graph of coder output symbol counts against the incoming source sequence x_1, x_2 ... x_n]

INVENTORS
ROBERT WERNIKOFF
PAUL EPSTEIN
WILLIAM F. SCHREIBER
ATTORNEYS

Commvault Ex. 1009
Commvault v. Realtime
US Patent No. 9,054,728
8 Sheets-Sheet 2

[FIG. 2: number of output symbols from coders C_0 (null coder), C_k and C_j plotted against source symbols x_1, x_2 ...; FIG. 3: the same curves after the population is switched, slopes unchanged, curves translated]
8 Sheets-Sheet 3

[FIGURE 4: block diagram of the encoding and transmitting system, with coders, buffers, comparator control, and a "stop buffer, full" signal]
8 Sheets-Sheet 4

[FIGURE 4A: block diagram of the output formatting system: a select-output-position signal drives output position switch 21 between standard output position 20 and alternate output position 24', feeding the transmitter]
8 Sheets-Sheet 5

[FIGURE 4B: block diagram of the decoding or receiving system: input buffer register 200 with symbol detector, word detector and mode counter feeding a mode decoder 216 and decoded symbol output 250]
8 Sheets-Sheet 6

[FIG. 5: block diagram of the transmitting section of the facsimile scanning system, with horizontal and vertical sweeps, intensity channel, counters, and clock-pulse circuits]
8 Sheets-Sheet 7

[FIG. 5A: block diagram of the receiving section of the facsimile system, with demodulator, shift register, length counter, horizontal-advance pulses, and horizontal and vertical sweeps]
8 Sheets-Sheet 8

[FIG. 5B: circuit modification for the system of FIG. 5, fed from edge gate 63]
United States Patent Office

3,394,352
Patented July 23, 1968

METHOD OF AND APPARATUS FOR CODE COMMUNICATION
Robert Wernikoff, Cambridge, Paul Epstein, Brookline, and William F. Schreiber, Lexington, Mass., assignors to Electronic Image Systems Corporation, Boston, Mass., a corporation of Massachusetts
Filed July 22, 1965, Ser. No. 474,189
35 Claims. (Cl. 340-172.5)

ABSTRACT OF THE DISCLOSURE

The methods and apparatus disclosed minimize the number of symbols to be transmitted, and hence the transmission time, in a communication system, by determining the most efficient code for encoding sequences of message symbols and transmitting the symbols in that code. Determination of the most efficient code can be accomplished by applying the message symbols to a plurality of encoders, counting the numbers of symbols of coded representations of successive message symbols produced by each encoder, and comparing the numbers to determine the particular code that introduces the smallest number of symbols of coded representation consistent with distortionless message transmission. Tagging symbols are transmitted to permit reconstruction of the original message symbols from the transmitted coded symbols. In an application of the invention to facsimile transmission, run-length coding is employed.
The present invention relates to methods of and apparatus for code communication, being more particularly, though not in all cases exclusively, directed to efficient coding for inhomogeneous sources of information.

The problem of processing information-to-be-communicated so as to enable a minimum number of symbols to be transmitted with reliable recovery of the information has long challenged the art. In devising his telegraphic code more than a century ago, Samuel F. B. Morse generally assigned the shortest code symbols to the most frequently used English alphabet letters in order to economize on the transmission time of messages. Had Morse optimized the assigning of the shorter code symbols to the more frequent letters, this would have produced about a fifteen percent saving in transmission time over assigning symbols of equal length to all letters. Far greater saving, however, could have been effected by assigning code symbols, not for individual letters, but for pairs, triplets or other groupings of letters that in normal text occur much more frequently than others. Within recent years, indeed, Claude Shannon has estimated that the maximum saving in transmission capacity that can be obtained through optimum coding of English text is about seventy-five percent when statistical dependences between successive letters of text, carried out to one hundred letters, are considered.
When the symbols issuing from the information source are statistically dependent upon one another, so that certain groupings of symbols occur more frequently than others, efficient coding requires that the symbols be considered not singly but in groups. The number of elements of which the group must consist is dictated by the range of substantial statistical dependence. It might amount to ten or twenty letters in English text, and to several thousand bits in pictorial transmission. The size of the code book in which groups of source symbols are assigned code symbols increases fantastically as the group is made larger (the number of entries in the code book is 27^10, or about 205,900,000,000,000, for groups of ten English letters, and it is the square of that number for groups of twenty letters), and this growth soon becomes an insurmountable practical limitation to optimum coding. There are no non-singular codes that circumvent this problem.
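The code-book growth cited above is easy to make concrete. A minimal sketch, assuming (as the figures in the text imply) a 27-symbol alphabet of 26 letters plus space:

```python
# Number of code-book entries needed when every n-letter group of a
# 27-symbol alphabet (26 letters plus space) gets its own entry.
def codebook_entries(alphabet_size, group_length):
    return alphabet_size ** group_length

print(codebook_entries(27, 10))   # 205891132094649, about 2.059e14
# Groups of twenty letters need the square of the ten-letter figure.
print(codebook_entries(27, 20) == codebook_entries(27, 10) ** 2)  # True
```

The exponential in `group_length` is the whole story: each extra letter in the group multiplies the table size by the alphabet size.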
An optimum code is matched as closely as possible to the statistical peculiarities of the population or set of sequences to be encoded. A code designed to match the letter frequencies of English text, for example, becomes quite inefficient when it is applied to a different population or set of sequences, such as a foreign language. Whenever dissimilar populations are mixed (for example, for the purpose of obtaining a single set of letter frequencies applicable to English and to a foreign language simultaneously) the frequencies of the letters tend to equalize; and any operation on sequences that equalizes probabilities reduces the possible compression. In the limit, indeed, if an information source is as likely to produce one symbol as another, and if successive symbols do not depend on previous ones, then no compression whatsoever is possible. The reduction of transmission time by means of coding is thus possible only to the extent that the statistics of the strings of symbols produced by information sources differ from independence and equiprobability. The recipe for the optimum code is fixed in a deterministic, mathematical way by the precise probability distributions that describe the information source.
While it is difficult to reconcile language text, television, moving pictures, facsimile, telemetered data, etc., with the mathematical idea of a homogeneous population with a fixed, immutable and known frequency distribution, yet this abstract concept underlies all prior coding theory. In accordance with Shannon's theorem, an information source is contemplated that produces sequences of symbols (but not just any sequence) that may be regarded as random samples from a homogeneous population with known and immutable frequency distributions of all orders (i.e., for pairs of symbols, triplets, ..., n-tuples). The term "homogeneous" (the technical word is "ergodic") implies that any sufficiently long sequence from the source is very likely to be representative of any other source sequence.
Adherence to this framework has, however, resulted in prior-art coding and compression systems that fall far below the ultimate transmission efficiency. The reason is that the mathematical framework on which they are based does not sufficiently resemble reality. In accordance with the present invention, on the other hand, as pragmatically determined from actual experiment, it is not assumed that the population of symbols (letters of text, or elements of television or facsimile pictures, for example) is a homogeneous source in the sense of information theory; but, rather, that inhomogeneous sources are involved, requiring a novel coding logic that has been found in practice to more than double the average compression attained by prior-art systems implementing conventional coding theory, and with less equipment. In addition, the coder for inhomogeneous sources, underlying the invention, results in a transmission time that never exceeds the uncoded time and sometimes is fifty times smaller.
An object of the invention, thus, is to provide a new and improved method of and apparatus for code communication that is not subject to the above-described disadvantages of prior-art coding and compression systems predicated upon conventional coding theorems; but that, to the contrary, provides for greater transmission-time efficiency and with less complicated and costly apparatus.

A further object is to provide a new and improved scanning transmission and reception system for such purposes as facsimile or other communication, all generically referred to as "picture" scanning and the like.

Still another object is to provide each of a novel code transmission and reception apparatus.

Other and further objects will be made apparent hereinafter and are more particularly pointed out in the appended claims.
The invention will now be described in connection with the accompanying drawings, FIG. 1A of which is a schematic block diagram of a portion of a transmitter useful with the invention; FIG. 1B is a graph illustrative of the performance of the system of FIG. 1A; FIGS. 2 and 3 are similar graphs of modified performance; FIG. 4 is a block diagram of a preferred generic encoding and transmitting system embodying the invention; FIG. 4A is a block diagram of the output formatting system of FIG. 4; FIG. 4B is a similar diagram of a preferred generic decoding or receiving system for use with the transmitting system of FIG. 4, FIGS. 4, 4A and 4B, as a unit, together constituting a complete system employing the invention; FIGS. 5 and 5A are block diagrams of the transmitting and receiving sections, respectively, of a preferred facsimile or similar scanning system using the novel techniques of the invention as a species of the generic concept of FIGS. 4, 4A and 4B; and FIG. 5B is a circuit modification that may be employed with the system of FIG. 5.
Considering for purposes of illustration an information source that produces sequences drawn not necessarily from one homogeneous population, but possibly from several different populations, some known and others perhaps not yet identified, and considering that the source may switch from one population to another at random and without stopping; that the different populations are used in unknown, and maybe even in variable, proportions; and that switching from one population to the next occurs at random times, an example of the method underlying the present invention may be constructed as follows. For purposes of illustration only, it may be considered that this operation occurs when different-language messages are successively transmitted from United Nations Headquarters.
Since it is known how to gain a useful compression in any one language (or, more generically, population), it is proposed to construct an apparatus that effectively has a collection of encoders, one matched to each language (or population). Then, as the source switches around from one language (or population) to the next, the apparatus automatically follows and, at each instant, uses the code best suited to the output of the source. Each time the coder switches codes, it sends a special signal to the receiver indicative of the code of the following message, so that the receiver can apply the corresponding decoding procedure and recover the original message. The problem is, however, to enable the coder to determine when the source switches populations. The coder must determine this as soon as possible, so that it may switch codes promptly before any great loss in transmission efficiency has been incurred.
What makes the problem difficult is that the conditions of this oversimplified illustration, in which it is possible to think of the source as switching around among a finite and definite number of discrete populations, are not often realized in practice. It is more usual to find real sources changing their behavior in a continuous way, and to find the statistics of the corresponding source sequences assuming a continuum of values; thus there does not exist, even in principle, a unique, well-defined time at which codes should be switched. The present invention provides the solution both for the simple case of switching among discrete sources and for the more important and realistic case of gradually changing source statistics; but the method and apparatus of the invention are more clearly understandable if initially described in terms of the simpler case.
Since it is not known which population the source is currently drawing its sequence from, according to the present invention, the message sequence from the information source is fed simultaneously to all encoders. Thus, in FIG. 1A, the sequence of information source symbols x_1, x_2, ... x_m ... is shown fed from the source S to each of the different encoders or coders C_0, C_1, C_2 ... C_n. In view of the fact that all possible source populations are not known, it is useful to include one encoder C_0 that performs pure, uncoded transmission; its output is identical with its input. We shall call this the null encoder. As each successive source symbol enters a given encoder, the encoder output (which may differ from all the rest) will be of several different types. The output may be "no" symbols (if that particular code works on blocks of source symbols, and the end of the block hasn't been reached yet); or it may be one symbol (as in the null encoder C_0 and others); or it may be many symbols (as in a coder that has reached the end of a block or of a run of source symbols). In FIG. 1B, accordingly, there are plotted the numbers of symbols issuing from each coder as successive source symbols enter it from the source S.
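The counting arrangement behind FIG. 1B can be sketched in present-day terms. The per-symbol cost functions below are illustrative stand-ins, not the patent's codes:

```python
# Feed one source sequence to several coders and record each coder's
# cumulative output-symbol count after every input symbol (the
# step-curves of FIG. 1B).

def cumulative_counts(source, coders):
    """Return one step-curve (list of cumulative counts) per coder."""
    curves = []
    for cost in coders:
        total, curve = 0, []
        for sym in source:
            total += cost(sym)   # output symbols emitted for this input
            curve.append(total)
        curves.append(curve)
    return curves

def null_coder(sym):
    # C_0: uncoded transmission, one output symbol per input symbol.
    return 1

def toy_coder(sym):
    # Cheap on '0' (think runs of white picture elements), costly otherwise.
    return 0 if sym == '0' else 3

c0, c1 = cumulative_counts("0000100001", [null_coder, toy_coder])
print(c0[-1], c1[-1])   # 10 6
```

Each curve is monotonic nondecreasing, exactly as the text observes; whether `toy_coder` beats the null coder depends entirely on the sequence it is fed.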
If the population statistics are truly stationary, each coder will be characterized by an average straight-line behavior whose slope w determines the compression (or expansion) produced by that code. The null encoder C_0 has unit slope if the input and output alphabets are the same, because it puts out one symbol for each successive input symbol. If the input and output alphabets are different, the one at the input having A_i letters and the one at the output A_o letters, then the previous statement is still correct after all ordinate values have been divided by log A_i/log A_o. In the sequel, it is tacitly assumed that the alphabets are equal or that the indicated normalization has been done, particularly since a change in alphabets does not change the channel capacity required for message transmission and hence does not affect efficiency.

Coder C_1 in FIG. 1A may, as an illustration, produce on the average fewer symbols than enter it; thus, it produces a characteristic shown in FIG. 1B of slope w_1 < 1. Coder C_2, on the other hand, may produce on the average more than one symbol per input symbol, producing a slope w_2 > 1. These statements apply to the average behavior of the coders, shown as the dash lines in FIG. 1B. The actual number of output symbols put out by each coder for each input or arriving source symbol is, of course, a random variable whose value depends on the specific source symbol that entered the coder (these symbols are themselves random variables) and on the source symbols that entered the coder earlier. The actual number of symbols issuing from each coder, as represented by the solid lines in FIG. 1B, thus, is a random walk or step-curve about which the most that can be said, in general, is that it is monotonic nondecreasing.
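The alphabet normalization mentioned above (dividing ordinate values by log A_i/log A_o so that coders with different output alphabets are compared in like units) might look as follows; the function name is mine, not the patent's:

```python
import math

def normalized_count(output_symbols, a_in, a_out):
    """Express a coder's output-symbol count in units of the input
    alphabet: divide the ordinate by log(a_in)/log(a_out)."""
    return output_symbols / (math.log(a_in) / math.log(a_out))

# A binary input (a_in = 2) re-coded into a quaternary alphabet
# (a_out = 4): each quaternary symbol is worth two binary ones, so
# 5 output symbols correspond to 10 input-alphabet symbols.
print(normalized_count(5, 2, 4))   # 10.0
```

With equal alphabets the divisor is 1 and the counts pass through unchanged, which is why the text can tacitly assume equal alphabets thereafter.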
When the source S abruptly switches populations, so that the source sequence has entirely different statistics, the dash-line curves of FIG. 1B that describe the average behavior of the coders will abruptly change slope, since a coder which formerly was quite efficient (w < 1) may now be very inefficient (w > 1). Automatic code-switching can be based on detecting changes in slope in the average behavior of the coders (i.e., the ratio of output to input symbols or the number of output symbols per unit of real time) and such operation in the system of the invention is for some purposes feasible; but it is preferred, because the average behavior is not known with any acceptable accuracy unless the source persists in its new behavior for a very long time, not to rely upon mere slope detection.

What is really available, as opposed to merely assumed, is the actual output from each coder; i.e., the solid-line step curves of FIG. 1B. The randomness of these curves makes the measurement of slopes unreliable, and masks the precise point at which the switch in populations occurred; and there is, in fact, no such point with real sources, whose population parameters change gradually, rather than abruptly. Thus any code-switching systems based on slopes or other population averages, such as entropy, would at best merely be crude approximations.
In accordance with the preferred form of the present invention, therefore, the code-switching logic is based, not on abstract constructs like average slopes or on unsupported assumptions about physical sources, but on the final goal of the coding system, which is to transmit as few symbols as possible without distorting the information produced by the source. Attention is thus focussed on the symbol counts at the outputs of the encoders; and at any given time, that encoder is chosen which leads to the fewest transmitted symbols. This operation can be implemented in a manner now to be described.

Each time the transmitter S switches codes, it must convey this fact to the receiver; that is, the transmitter must tell the receiver which code it will be using next. The accomplishment of this function requires transmitting a few symbols, representative of the name of the new code. If the transmissions are in binary alphabet, then it takes one symbol to identify one out of two possible codes; two symbols to identify one out of four possible codes; three symbols for eight possible codes; and so on. Each additional symbol doubles the number of codes that can be identified. With a quaternary alphabet, one symbol identifies one of four codes; two symbols, one of sixteen codes; and so on. Each additional symbol quadruples the number of codes that can be so identified. Since, in most applications, it is unlikely that a practical system would have more than a dozen or two different codes, it never takes more than three or four symbols to identify the new code. The receiver, however, is confronted with a continuing stream of symbols and it will not be able to distinguish between ordinary message symbols and code-switching instructions unless special provision is made to provide such instructions in an unambiguous manner. This may be done by inserting in the message stream a uniquely-decipherable tagging word which is the same in all codes and which informs the receiver that the immediately following symbols define the new code. To effect this requires a few symbols, say ten or so. The switching signal, therefore, may consist altogether of fifteen symbols or so, this being short compared with most intervals of source homogeneity. Since some transmission time is consumed by sending instruction or tagging words, however, the number of transmitted instruction or tagging symbols must be noted.

In fact, in accordance with the invention, it is the transmission cost of the instruction or tagging symbols that supplies the discriminant for the code-switching decision; that is, codes are switched precisely then, when what is saved in transmission cost by replacing the old code with a new one just exceeds the cost of sending the instruction or tagging word that announces the change.

Assume that the information source S is allowed to feed source symbols to the encoders, as in FIG. 1A, and that there are buffers, later described, inserted between the encoders and the transmitter output so that time is afforded before a decision is made as to which coder output is to be transmitted. As successive source symbols x_1, x_2 ... etc., enter the coders, the number of symbols coming out of each coder is counted, presenting an output that, if plotted, is of the type shown in the graphs of FIG. 1B. If, for purposes of explanation, the graphs are temporarily thought of as smooth lines, as shown in FIG. 2, rather than as random walks or steps, then, as the source symbols x_1, x_2 ... etc., enter the coders C_0, C_1 ... etc., the graphs representative of output symbol counts will begin to diverge from each other. If compression is at all possible, one or more coder lines (for example, C_k in FIG. 2) will be below the null-coder line C_0, so-labelled.

Assume that, as a convention, the receiver always presupposes that transmissions commence uncoded (i.e., using the output from C_0) unless the very first received word is a code-instruction or tagging word. The transmitter, before sending anything, first examines the outputs from the various coders as they enter the later-described buffers that precede the output register, with the following objective.

In the source sequence x_1, x_2 ... etc., the smallest source-subscript α is sought such that

(1)  C_k(x_λ, x_α) ≤ C_0(x_λ, x_α) − 2H

for some k, where H is the number of tagging symbols that must be transmitted to change codes, λ (initially zero) is the number of source symbols that already have been transmitted, and C_k(x_λ, x_α) is the number of symbols that must be transmitted by coder C_k to represent all source symbols starting with x_{λ+1} and ending with x_α. It costs H symbols to start transmitting in C_k instead of C_0; but, if the first α source symbols are transmitted with the C_k code, then a minimum of 2H − H = H symbols are saved. If, even at x_{α+1}, the line for C_k veers up sharply and C_k begins to cost more than C_0 per unit input symbol, therefore, the transmitter can switch back to C_0 (an operation that costs another H symbols) and still not have lost anything with respect to uncoded transmission.

Thus, the code-switching logic of the invention issues a command that initiates transmission of a code-instruction word identifying C_k, and of those symbols in the C_k output buffer which correspond to the first α source symbols. While transmission is taking place, the code-switching logic allows enough new source symbols to enter the coders so that the logic can decide what code to use next before the transmitter has finished sending the symbols of the previous block. The object here is to keep the transmission line fully loaded at all times. This requirement defines the minimum computation rate for the switching logic, and it also defines the rate at which source symbols must be supplied to the coders.

To decide what code to use in the next step, it is noted that the cumulative number of symbols that the coders would have sent for the first α source symbols is now quite immaterial. What matters is how many symbols per source symbol they produce from x_α on. There is subtracted from each line of FIG. 2, therefore, its value at x_α, resulting in the graph of FIG. 3.

Now the system is initially in the C_k code, λ has assumed its new value, and the system will stay with this code unless a new α is reached such that

(2)  C_j(x_λ, x_α) ≤ C_k(x_λ, x_α) − 2H, for some j

If this happens, the system transmits the symbols between x_{λ+1} and x_α in code C_j. If this does not happen before the C_k output buffer fills up, then transmission continues in code C_k. The C_k buffer is emptied, all the other buffers are cleared, and the above procedure is repeated with all the coder output lines now originating from that source symbol which would have caused the C_k buffer to overflow if the system had not just then stopped the search procedure.

The method of code selection outlined above is simple and effective, but it occasionally results in a transmission cost higher than what is incurred with uncoded transmission. A preferred method of code selection that never results in transmitting more symbols than are required for uncoded transmission might be tabulated as follows:

Coder logic when discriminant equals 2H

Let M represent the maximum capacity of the coder buffers.

Step 1.—The value of λ is initially zero, as is the value of i in Steps 2, 5 and 7. Proceed to Step 2.

Step 2.—Search for the smallest source sequence subscript α such that conditions (3) and (4) are both satisfied for some values of k:

(3)  α − λ ≤ M

(4)  C_k(x_λ, x_α) ≤ C_i(x_λ, x_α) − 2H

If no such value of α is found, proceed to Step 7; otherwise proceed to Step 3.

Step 3.—Using the α value and k value found in Step 2,
determine whether or not condition (5) is satisfied:

(5)  C_k(x_λ, x_α) ≤ C_0(x_λ, x_α) − H

If condition (5) is satisfied, proceed to Step 4; otherwise proceed to Step 5.

Step 4.—Using the α value and k value found in Step 2, transmit a code-instruction word, transmit the next α source symbols in code C_k, set the value of i in Steps 2, 5 and 7 equal to the value of k and proceed to Step 9.

Step 5.—Using the α value found in Step 2, determine whether or not condition (6) is satisfied:

(6)  C_i(x_λ, x_α) ≤ C_0(x_λ, x_α)

If condition (6) is satisfied, transmit the next α source symbols in code C_i and proceed to Step 9; otherwise proceed to Step 6.

Step 6.—Transmit a code-instruction word for code C_0, transmit the next α source symbols in code C_0, set the value of i in Steps 2, 5 and 7 equal to zero and proceed to Step 9.

Step 7.—Determine whether or not condition (7) is satisfied:

(7)  C_i(x_λ, x_{λ+M}) ≤ C_0(x_λ, x_{λ+M})

If condition (7) is satisfied, transmit the next M source symbols in code C_i and proceed to Step 9; otherwise proceed to Step 8.

Step 8.—Transmit a code-instruction word for code C_0, transmit the next M source symbols in code C_0, set the value of i in Steps 2, 5 and 7 equal to zero and proceed to Step 9.

Step 9.—Set the value of λ equal to the total number of source sequence symbols that have now been transmitted. Proceed to Step 2.
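Steps 1 through 9 above can be condensed into a short sketch. This is one reading of the tabulated logic, not the patent's apparatus: cumulative coder counts are supplied up front, and "transmit" is reduced to recording which coder covers which segment of the source sequence.

```python
# Sketch of the discriminant-2H coder logic (Steps 1-9).
def coder_logic_2h(counts, M, H):
    """counts[k][a] = cumulative output symbols of coder k for source
    symbols x_1 .. x_a (counts[k][0] == 0); coder 0 is the null coder.
    Returns a list of (coder, first_symbol, last_symbol) segments."""
    n = len(counts[0]) - 1
    seg = lambda k, lo, hi: counts[k][hi] - counts[k][lo]
    lam, i, plan = 0, 0, []                                # Step 1
    while lam < n:
        found = None
        for alpha in range(lam + 1, min(lam + M, n) + 1):  # condition (3)
            ks = [k for k in range(len(counts)) if k != i
                  and seg(k, lam, alpha) <= seg(i, lam, alpha) - 2 * H]  # (4)
            if ks:
                found = (alpha, min(ks))
                break
        if found is None:                                  # Steps 7 and 8
            alpha = min(lam + M, n)
            if seg(i, lam, alpha) <= seg(0, lam, alpha):   # condition (7)
                plan.append((i, lam + 1, alpha))
            else:
                plan.append((0, lam + 1, alpha)); i = 0
        else:                                              # Steps 3 to 6
            alpha, k = found
            if seg(k, lam, alpha) <= seg(0, lam, alpha) - H:   # (5)
                plan.append((k, lam + 1, alpha)); i = k
            elif seg(i, lam, alpha) <= seg(0, lam, alpha):     # (6)
                plan.append((i, lam + 1, alpha))
            else:
                plan.append((0, lam + 1, alpha)); i = 0
        lam = alpha                                        # Step 9
    return plan

# Null coder C_0 emits one symbol per input; C_1 emits one symbol per
# four inputs (a block code, say).  With H = 1 the logic switches to
# C_1 as soon as that saves 2H symbols, then stays with it.
c0 = list(range(17))
c1 = [(a + 3) // 4 for a in range(17)]
print(coder_logic_2h([c0, c1], 8, 1))   # [(1, 1, 3), (1, 4, 11), (1, 12, 16)]
```

Because the fallback branches always compare the current coder against the null coder, the plan can never cost more than uncoded transmission plus the instruction words already budgeted for by the 2H margin.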
Using 2H, rather than H, as the discrimination level is ultra-conservative; its virtue being that it allows lossless return from any code to C_0 in case it turns out, in the interval following the decision interval, that the chosen code and all the other codes suddenly become more inefficient than uncoded transmission. Another way to obviate this danger is to monitor the interval following the decision interval before making a firm decision. When this is done, the discrimination level can be lowered from 2H to H without compromising the property that the system of the invention never transmits more inefficiently than uncoded.

The requisite steps for this operation may be tabulated as follows:
Coder logic when discriminant equals H

Let M represent the capacity of the coder buffers, and, as before, let C_k(x_λ, x_α) be the number of symbols that must be transmitted by the kth coder to represent all the source symbols starting with x_{λ+1} and ending with x_α. The object of the following logic is to identify that coder whose use results in the fewest transmitted symbols. Occasionally it will happen that two or more coders result in equally few transmitted symbols for a particular segment of source symbol sequence; the logic then selects a specific coder by applying some arbitrary rule to resolve the ambiguity, an example of such a rule being, "choose that coder whose subscript k is least." Any number of other rules are admissible. In the coding logic presented here, occasional use is made of the term, "the preferred value, relative to λ and α, of a set of values of k," which term is defined by the following rule: from among the set of coder subscripts k find the subset for which C_k(x_λ, x_α) is least; from among the subset of values k thus found, find the least value of k.

Step 1.—The value of λ, the total number of source sequence symbols that have been transmitted, is initially zero. Proceed to Step 2.

Step 2.—Search for the smallest source sequence subscript α such that conditions (8) and (9) are both satisfied for one or more values of k:
`
`a—h=M
`
`(8)
`
`(9)
`Cul, Xe) <Co(ry, X} —2H
`If no such value of aw is found, proceed to Step 3; other-
`wise proceed to Step 5,
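Step 2 amounts to a scan over the buffer for the earliest point at which some coder clears the 2H discrimination level. A sketch, assuming conditions (8) and (9) read α − λ ≤ M and Ck(xλ, xα) ≤ C0(xλ, xα) − 2H; here `cost(k, lam, alpha)` is a hypothetical stand-in for Ck(xλ, xα), with coder 0 the null coder so that `cost(0, lam, alpha)` equals α − λ:

```python
def step2_search(cost, coders, lam, M, H):
    """Find the smallest subscript alpha with alpha - lam <= M
    (condition 8) such that at least one coder k satisfies
    cost(k, lam, alpha) <= cost(0, lam, alpha) - 2*H (condition 9).

    Returns (alpha, set of satisfying coders), or None, meaning
    Step 3 applies: transmit M source symbols uncoded.
    """
    for alpha in range(lam + 1, lam + M + 1):      # condition (8)
        ks = {k for k in coders
              if cost(k, lam, alpha) <= cost(0, lam, alpha) - 2 * H}
        if ks:                                     # condition (9)
            return alpha, ks
    return None
```

With a toy coder that halves the symbol count, the earliest qualifying α is the first point where the saving reaches 2H.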
Step 3.—Transmit M source symbols in C0. Proceed to Step 4.
Step 4.—Set the value of λ equal to the total number of source sequence symbols that have now been transmitted. Proceed to Step 2.
Step 5.—Using the value of α found in Step 2 and using the preferred value, relative to λ and α, of the set of values of k found in Step 2, transmit a code-instruction word, transmit the next α source symbols in Ck, and set the value of j in Steps 7, 8, 9, 12 and 13 equal to the preferred value of k. Proceed to Step 6.
Step 6.—Set the value of λ equal to the total number of source symbols that now have been transmitted. Proceed to Step 7.
Step 7.—Search for the smallest source sequence subscript α such that conditions (10) and (11) are both satisfied for one or more values of k:

(10) α − λ ≤ M

(11) Ck(xλ, xα) < Cj(xλ, xα)

If no such value of α is found, proceed to Step 8; otherwise proceed to Step 9.
Step 8.—Determine whether or not condition (12) is satisfied:

(12) Cj(xλ, xλ+M) ≤ C0(xλ, xλ+M)

If condition (12) is satisfied, transmit the next M source symbols in Cj and proceed to Step 6; otherwise transmit a code-instruction word for C0, transmit the next M source symbols in C0, and proceed to Step 4.
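Step 8's choice between staying with the current coder and reverting to uncoded transmission can be sketched as follows. Reading condition (12) as Cj(xλ, xλ+M) ≤ C0(xλ, xλ+M) is an assumed interpretation, not confirmed by the specification, and `cost(k, lam, alpha)` is a hypothetical stand-in for Ck(xλ, xα):

```python
def step8_decide(cost, j, lam, M):
    """Decide whether to keep transmitting in the current coder C_j
    or revert to uncoded transmission.  Condition (12) is taken here
    as cost(j, lam, lam + M) <= cost(0, lam, lam + M).

    Returns "stay" (next M symbols in C_j, then Step 6) or "revert"
    (instruction word for C_0, next M symbols in C_0, then Step 4).
    """
    if cost(j, lam, lam + M) <= cost(0, lam, lam + M):  # condition (12)
        return "stay"
    return "revert"
```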
Step 9.—Using the value of α found in Step 7, search for the smallest source sequence subscript β such that conditions (13) and (14) are both satisfied for one or more values of k:

(13) α − λ < β − λ ≤ M

(14) Ck(xλ, xβ) ≤ Cj(xλ, xβ) − 2H

If no such value of β is found, proceed to Step 11; otherwise proceed to Step 10.
Step 10.—Using the value of β found in Step 9 and using the preferred value, relative to λ and β, of the set of values of k found in Step 9, determine whether or not condition (15) is satisfied:

(15) Ck(xλ, xβ) ≤ C0(xλ, xβ)

If condition (15) is satisfied, transmit a code-instruction word for Ck, transmit the next β source symbols in Ck, se
