[Drawing sheets 1 through 8: R. Wernikoff et al., "Method of and Apparatus for Code Communication," 3,394,352, filed July 22, 1965, patented July 23, 1968. Only the figure identifications and principal labels of the drawings are recoverable here:

Sheet 1 - FIG. 1A (information source feeding the bank of encoders) and FIG. 1B (number of output signals from each encoder plotted against the incoming source sequence).
Sheet 2 - FIG. 2 (number of output symbols from the coders versus source symbols, with the null coder C0 line) and FIG. 3 (number of output symbols after resetting; population switched, slopes unchanged).
Sheet 3 - FIGURE 4 (output formatting system; fixed voltage; stop-buffer-full signal).
Sheet 4 - FIGURE 4A (output formatting detail; select output position; switch to alternate output position).
Sheet 5 - FIGURE 4B (decoding system; decoded symbol output).
Sheet 6 - FIG. 5 (facsimile transmitting section: vertical sweep counter, word-length counter and buffer, bit pulses, end-of-picture and synchronizing-pulse logic, reset at full count).
Sheet 7 - FIG. 5A (facsimile receiving section: clock pulses, horizontal counter, comparator, intensity pulses, input detector, end-of-picture, start and reset logic).
Sheet 8 - FIG. 5B (circuit modification).

Inventors: Robert Wernikoff, Paul Epstein, William F. Schreiber.]
`
`
3,394,352
United States Patent Office
Patented July 23, 1968

3,394,352
METHOD OF AND APPARATUS FOR CODE COMMUNICATION
Robert Wernikoff, Cambridge, Paul Epstein, Brookline, and William F. Schreiber, Lexington, Mass., assignors to Electronic Image Systems Corporation, Boston, Mass., a corporation of Massachusetts
Filed July 22, 1965, Ser. No. 474,139
35 Claims. (Cl. 340-172.5)
`
`ABSTRACT OF THE DISCLOSURE
`
The methods and apparatus disclosed minimize the number of symbols to be transmitted, and hence the transmission time, in a communication system, by determining the most efficient code for encoding sequences of message symbols and transmitting the symbols in that code. Determination of the most efficient code can be accomplished by applying the message symbols to a plurality of encoders, counting the numbers of symbols of coded representations of successive message symbols produced by each encoder, and comparing the numbers to determine the particular code that introduces the smallest number of symbols of coded representation consistent with distortionless message transmission. Tagging symbols are transmitted to permit reconstruction of the original message symbols from the transmitted coded symbols. In an application of the invention to facsimile transmission, run-length coding is employed.
`
The present invention relates to methods of and apparatus for code communication, being more particularly, though not in all cases exclusively, directed to efficient coding for inhomogeneous sources of information.
The problem of processing information-to-be-communicated so as to enable a minimum number of symbols to be transmitted with reliable recovery of the information has long challenged the art. In devising his telegraphic code more than a century ago, Samuel F. B. Morse generally assigned the shortest code symbols to the most frequently used English alphabet letters in order to economize on the transmission time of messages. Had Morse optimized the assigning of the shorter code symbols to the more frequent letters, this would have produced about a fifteen percent saving in transmission time over assigning symbols of equal length to all letters. Far greater saving, however, could have been effected by assigning code symbols, not for individual letters, but for pairs, triplets or other groupings of letters that in normal text occur much more frequently than others. Within recent years, indeed, Claude Shannon has estimated that the maximum saving in transmission capacity that can be obtained through optimum encoding of English text is about seventy-five percent when statistical dependences between successive letters of text, carried out to one hundred letters, are considered.
When the symbols issuing from the information source are statistically dependent upon one another, so that certain groupings of symbols occur more frequently than others, efficient coding requires that the symbols be considered not singly but in groups. The number of elements of which the group must consist is dictated by the range of substantial statistical dependence. It might amount to ten or twenty letters in English text, and to several thousand bits in pictorial transmission. The size of the code book in which groups of source symbols are assigned code symbols increases fantastically as the group is made larger (the number of entries in the code book is 26¹⁰≅150,000,000,000,000 for groups of ten English letters, and it is the square of that number for groups of twenty letters) and this growth soon becomes an insurmountable practical limitation to optimum coding. There are no non-singular codes that circumvent this problem.
An optimum code is matched as closely as possible to the statistical peculiarities of the population or set of sequences to be encoded. A code designed to match the letter frequencies of English text, for example, becomes quite inefficient when it is applied to a different population or set of sequences, such as a foreign language. Whenever dissimilar populations are mixed (for example, for the purpose of obtaining a single set of letter frequencies applicable to English and to a foreign language simultaneously) the frequencies of the letters tend to equalize; and any operation on sequences that equalizes probabilities reduces the possible compression. In the limit, indeed, if an information source is as likely to produce one symbol as another, and if successive symbols do not depend on previous ones, then no compression whatsoever is possible. The reduction of transmission time by means of coding is thus possible only to the extent that the statistics of the strings of symbols produced by information sources differ from independence and equiprobability. The recipe for the optimum code is fixed in a deterministic, mathematical way by the precise probability distributions that describe the information source.
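
The limiting statement above can be made quantitative with the standard entropy formula. The short sketch below is an illustration only, with assumed symbol probabilities that do not appear in the specification; it shows that an equiprobable, independent source admits no compression, while a skewed distribution does:

from math import log2

def entropy_bits(probs):
    """Shannon entropy, in bits per symbol, of an independent source."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Two assumed distributions over a four-letter alphabet.
uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.70, 0.15, 0.10, 0.05]

for name, dist in (("uniform", uniform), ("skewed", skewed)):
    h = entropy_bits(dist)
    # Two bits per symbol suffice for four letters without any coding.
    print(f"{name}: {h:.2f} bits/symbol, best possible saving {1 - h / 2:.0%}")
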
While it is difficult to reconcile language text, television, moving pictures, facsimile, telemetered data, etc., with the mathematical idea of a homogeneous population with a fixed, immutable and known frequency distribution, yet this abstract concept underlies all prior coding theory. In accordance with Shannon's theorem, an information source is contemplated that produces sequences of symbols (but not just any sequence) that may be regarded as random samples from a homogeneous population with known and immutable frequency distributions of all orders (i.e., for pairs of symbols, triplets, . . . , n-tuples). The term "homogeneous" (the technical word is "ergodic") implies that any sufficiently long sequence from the source is very likely to be representative of any other source sequence.
Adherence to this framework has, however, resulted in prior-art coding and compression systems that fall far below the ultimate transmission efficiency. The reason is that the mathematical framework on which they are based does not sufficiently resemble reality. In accordance with the present invention, on the other hand, as pragmatically determined from actual experiment, it is not assumed that the population of symbols (letters of text, or elements of television or facsimile pictures, for example) is a homogeneous source in the sense of information theory; but, rather, that inhomogeneous sources are involved, requiring a novel coding logic that has been found in practice to more than double the average compression attained by prior-art systems implementing conventional coding theory, and with less equipment. In addition, the coder for inhomogeneous sources, underlying the invention, results in a transmission time that never exceeds the uncoded time and sometimes is fifty times smaller.
An object of the invention, thus, is to provide a new and improved method of and apparatus for code communication that is not subject to the above-described disadvantages of prior-art coding and compression systems, predicated upon conventional coding theorems; but that, to the contrary, provides for greater transmission-time efficiency and with less complicated and costly apparatus. A further object is to provide a new and improved scanning transmission and reception system for such purposes as facsimile or other communication, all generically referred to as "picture" scanning and the like.

Still another object is to provide each of a novel code transmission and reception apparatus.

Other and further objects will be made apparent hereinafter and are more particularly pointed out in the appended claims.
`
`
`
The invention will now be described in connection with the accompanying drawings, FIG. 1A of which is a schematic block diagram of a portion of a transmitter useful with the invention; FIG. 1B is a graph illustrative of the performance of the system of FIG. 1A; FIGS. 2 and 3 are similar graphs of modified performance; FIG. 4 is a block diagram of a preferred generic encoding and transmitting system embodying the invention; FIG. 4A is a block diagram of the output formatting system of FIG. 4; FIG. 4B is a similar diagram of a preferred generic decoding or receiving system for use with the transmitting system of FIG. 4, FIGS. 4, 4A and 4B, as a unit, together constituting a complete system employing the invention; FIGS. 5 and 5A are block diagrams of the transmitting and receiving sections, respectively, of a preferred facsimile or similar scanning system using the novel techniques of the invention as a species of the generic concept of FIGS. 4, 4A and 4B; and FIG. 5B is a circuit modification that may be employed with the system of FIG. 5.

Considering for purposes of illustration an information source that produces sequences drawn not necessarily from one homogeneous population, but possibly from several different populations, some known and others perhaps not yet identified, and considering that the source may switch from one population to another at random and without stopping; that the different populations are used in unknown, and maybe even in variable, proportions; and that switching from one population to the next occurs at random times, an example of the method underlying the present invention may be constructed as follows. For purposes of illustration only, it may be considered that this operation occurs when different-language messages are successively transmitted from United Nations Headquarters.

Since it is known how to gain a useful compression in any one language (or, more generically, population), it is proposed to construct an apparatus that effectively has a collection of encoders, one matched to each language (or population). Then, as the source switches around from one language (or population) to the next, the apparatus automatically follows and, at each instant, uses the code best suited to the output of the source. Each time the coder switches codes, it sends a special signal to the receiver indicative of the code of the following message, so that the receiver can apply the corresponding decoding procedure and recover the original message. The problem is, however, to enable the coder to determine when the source switches populations. The coder must determine this as soon as possible, so that it may switch codes promptly before any great loss in transmission efficiency has been incurred.

What makes the problem difficult is that the conditions of this oversimplified illustration, in which it is possible to think of the source as switching around among a finite and definite number of discrete populations, are not often realized in practice. It is more usual to find real sources changing their behavior in a continuous way, and to find the statistics of the corresponding source sequences assuming a continuum of values; thus there does not exist, even in principle, a unique, well-defined time at which codes should be switched. The present invention provides the solution both for the simple case of switching among discrete sources and for the more important and realistic case of gradually changing source statistics; but the method and apparatus of the invention are more clearly understandable if initially described in terms of the simpler case.

Since it is not known which population the source is currently drawing its sequence from, according to the present invention, the message sequence from the information source is fed simultaneously to all encoders. Thus, in FIG. 1A, the sequence of information source symbols x1, x2, . . . xn . . . is shown fed from the source S to each of the different encoders or coders C0, C1, C2 . . . Cn. In view of the fact that all possible source populations are not known, it is useful to include one encoder C0 that performs pure, uncoded transmission; its output is identical with its input. We shall call this the null encoder. As each successive source symbol enters a given encoder, the encoder output (which may differ from all the rest) will be of several different types. The output may be "no" symbols (if that particular code works on blocks of source symbols, and the end of the block hasn't been reached yet); or it may be one symbol (as in the null encoder C0 and others); or it may be many symbols (as in a coder that has reached the end of a block or of a run of source symbols). In FIG. 1B, accordingly, there are plotted the numbers of symbols issuing from each coder as successive source symbols enter it from the source S.

If the population statistics are truly stationary, each coder will be characterized by an average straight-line behavior whose slope w determines the compression (or expansion) produced by that code. The null encoder C0 has unit slope if the input and output alphabets are the same, because it puts out one symbol for each successive input symbol. If the input and output alphabets are different, the one at the input having Ai letters and the one at the output Ao letters, then the previous statement is still correct after all ordinate values have been divided by log Ai/log Ao. In the sequel, it is tacitly assumed that the alphabets are equal or that the indicated normalization has been done, particularly since a change in alphabets does not change the channel capacity required for message transmission and hence does not affect efficiency. Coder C1 in FIG. 1A may, as an illustration, produce on the average fewer symbols than enter it; thus, it produces a characteristic shown in FIG. 1B of slope w1<1. Coder Cn, on the other hand, may produce on the average more than one symbol per input symbol, producing a slope wn>1. These statements apply to the average behavior of the coders, shown as the dash lines in FIG. 1B. The actual number of output symbols put out by each coder for each input or arriving source symbol is, of course, a random variable whose value depends on the specific source symbol that entered the coder (these symbols are themselves random variables) and on the source symbols that entered the coder earlier. The actual number of symbols issuing from each coder, as represented by the solid lines in FIG. 1B, thus, is a random walk or step-curve about which the most that can be said, in general, is that it is monotonic nondecreasing.

When the source S abruptly switches populations, so that the source sequence has entirely different statistics, the dash-line curves of FIG. 1B that describe the average behavior of the coders will abruptly change slope, since a coder which formerly was quite efficient (w<1) may now be very inefficient (w>1). Automatic code-switching can be based on detecting changes in slope in the average behavior of the coders (i.e., the ratio of output to input symbols or the number of output symbols per unit of real time) and such operation in the system of the invention is for some purposes feasible; but it is preferred, because the average behavior is not known with any acceptable accuracy unless the source persists in its new behavior for a very long time, not to rely upon mere slope detection. What is really available, as opposed to merely assumed, is the actual output from each coder; i.e., the solid-line step curves of FIG. 1B. The randomness of these curves makes the measurement of slopes unreliable, and masks the precise point at which the switch in populations occurred; and there is, in fact, no such point with real sources, whose population parameters change gradually, rather than abruptly. Thus any code-switching systems based on slopes or other population averages, such as entropy, would at best merely be crude approximations.

In accordance with the preferred form of the present invention, therefore, the code-switching logic is based, not on abstract constructs like average slopes or on unsupported assumptions about physical sources, but on the final goal of the coding system, which is to transmit as few symbols as possible without distorting the information produced by the source. Attention is thus focussed on the symbol counts at the outputs of the encoders; and at any given time, that encoder is chosen which leads to the fewest transmitted symbols. This operation can be implemented in a manner now to be described.
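
The selection principle just stated (feed every source symbol to every coder at once, count each coder's output symbols, and prefer the coder that has emitted the fewest) can be sketched in a few lines. The sketch below is only an illustrative reading of FIGS. 1A and 1B, not the patented circuitry; the two coder classes and the sample input are assumptions chosen for the example, a run-length coder being used because run-length coding is the species employed in the facsimile application described later:

class NullEncoder:
    """C0, the null encoder: uncoded transmission, one output symbol per input symbol."""
    def encode(self, symbol):
        return [symbol]

class RunLengthEncoder:
    """An illustrative coder: emits nothing while a run continues, then
    emits the run value and the run length as two output symbols."""
    def __init__(self):
        self.value, self.length = None, 0
    def encode(self, symbol):
        if symbol == self.value:
            self.length += 1
            return []                                  # "no" symbols yet
        out = [] if self.value is None else [self.value, self.length]
        self.value, self.length = symbol, 1
        return out

def output_counts(source_symbols, coders):
    """Feed the source sequence to every coder simultaneously and return the
    cumulative output-symbol count of each coder (the step curves of FIG. 1B)."""
    counts = [0] * len(coders)
    for s in source_symbols:
        for i, coder in enumerate(coders):
            counts[i] += len(coder.encode(s))
    return counts

coders = [NullEncoder(), RunLengthEncoder()]
counts = output_counts("aaaabbbcaa", coders)
best = min(range(len(coders)), key=lambda k: counts[k])
print(counts, "-> coder", best, "has emitted the fewest symbols so far")
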
Each time the transmitter S switches codes, it must convey this fact to the receiver; that is, the transmitter must tell the receiver which code it will be using next. The accomplishment of this function requires transmitting a few symbols, representative of the name of the new code. If the transmissions are in binary alphabet, then it takes one symbol to identify one out of two possible codes; two symbols to identify one out of four possible codes; three symbols for eight possible codes; and so on. Each additional symbol doubles the number of codes that can be identified. With a quaternary alphabet, one symbol identifies one of four codes; two symbols, one of sixteen codes; and so on. Each additional symbol quadruples the number of codes that can be so identified. Since, in most applications, it is unlikely that a practical system would have more than a dozen or two different codes, it never takes more than three or four symbols to identify the new code. The receiver, however, is confronted with a continuing stream of symbols and it will not be able to distinguish between ordinary message symbols and code-switching instructions unless special provision is made to provide such instructions in an unambiguous manner. This may be done by inserting in the message stream a uniquely decipherable tagging word which is the same in all codes and which informs the receiver that the immediately following symbols define the new code. To effect this requires a few symbols, say ten or so. The switching signal, therefore, may consist altogether of fifteen symbols or so, this being short compared with most intervals of source homogeneity. Since some transmission time is consumed by sending instruction or tagging words, however, the number of transmitted instruction or tagging symbols must be noted.
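
The relation described here is simply that each additional identification symbol multiplies the number of distinguishable codes by the alphabet size; equivalently, naming one of N codes takes the base-A logarithm of N, rounded up. A minimal sketch (the particular code counts tried below are arbitrary choices for the illustration):

def id_symbols(num_codes, alphabet_size):
    """Symbols needed to name one of num_codes codes in an alphabet of the given size."""
    symbols, capacity = 1, alphabet_size
    while capacity < num_codes:
        symbols += 1
        capacity *= alphabet_size
    return symbols

# Binary alphabet: 2 codes need 1 symbol, 4 need 2, 8 need 3, 16 need 4.
# Quaternary alphabet: 4 codes need 1 symbol, 16 need 2.
for base in (2, 4):
    print(base, [id_symbols(n, base) for n in (2, 4, 8, 16)])
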
In fact, in accordance with the invention, it is the transmission cost of the instruction or tagging symbols that supplies the discriminant for the code-switching decision; that is, codes are switched precisely then, when what is saved in transmission cost by replacing the old code with a new one just exceeds the cost of sending the instruction or tagging word that announces the change.
Assume that the information source S is allowed to feed source symbols to the encoders, as in FIG. 1A, and that there are buffers, later described, inserted between the encoders and the transmitter output so that time is afforded before a decision is made as to which coder output is to be transmitted. As successive source symbols x1, x2, . . . etc., enter the coders, the number of symbols coming out of each coder is counted, presenting an output that, if plotted, is of the type shown in the graphs of FIG. 1B. If, for purposes of explanation, the graphs are temporarily thought of as smooth lines, as shown in FIG. 2, rather than as random walks or steps, then, as the source symbols x1, x2, . . . etc., enter the coders C0, C1, . . . etc., the graphs representative of output symbol counts will begin to diverge from each other. If compression is at all possible, one or more coder lines (for example, Ck in FIG. 2) will be below the null-coder line C0, so labelled.
Assume that, as a convention, the receiver always presupposes that transmissions commence uncoded (i.e., using the output from C0) unless the very first received word is a code-instruction or tagging word. The transmitter, before sending anything, first examines the outputs from the various coders as they enter the later-described buffers that precede the output register, with the following objective.
`
Relative to the source sequence x1, x2, . . . etc., the smallest source-subscript n is sought such that

(1) Ck(xλ, xn) < C0(xλ, xn) − 2H

for some k, where H is the number of tagging symbols that must be transmitted to change codes, λ (initially zero) is the number of source symbols that already have been transmitted, and Ck(xλ, xn) is the number of symbols that must be transmitted by coder Ck to represent all source symbols starting with xλ+1 and ending with xn. It costs H symbols to start transmitting in Ck instead of C0; but, if the first n source symbols are transmitted with the Ck code, then a minimum of 2H−H=H symbols are saved. Even if at xn+1 the line for Ck veers up sharply and Ck begins to cost more than C0 per unit input symbol, therefore, the transmitter can switch back to C0 (an operation that costs another H symbols) and still not have lost anything with respect to uncoded transmission.
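
As a numerical illustration of criterion (1) (the figures here are assumed for the illustration and do not appear in the specification): let H = 15 tagging symbols and λ = 0, and suppose that after n = 200 source symbols the null coder has produced C0(x0, x200) = 200 output symbols while some coder Ck has produced Ck(x0, x200) = 160. Then

160 < 200 − 2H = 170,

so criterion (1) is met and transmission switches to Ck. The instruction word costs H = 15 symbols, leaving a saving of at least 40 − 15 = 25 symbols; and even if Ck immediately turns inefficient and a return to C0 is ordered at a further cost of H = 15 symbols, at least 25 − 15 = 10 symbols are still saved relative to uncoded transmission.
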
Thus, the code-switching logic of the invention issues a command that initiates transmission of a code-instruction word identifying Ck, and of those symbols in the Ck output buffer which correspond to the first n source symbols. While transmission is taking place, the code-switching logic allows enough new source symbols to enter the coders so that the logic can decide what code to use next before the transmitter has finished sending the symbols of the previous block. The object here is to keep the transmission line fully loaded at all times. This requirement defines the minimum computation rate for the switching logic, and it also defines the rate at which source symbols must be supplied to the coders.
To decide what code to use in the next step, it is noted that the cumulative number of symbols that the coders would have sent for the first n source symbols is now quite immaterial. What matters is how many symbols per source symbol they produce from xn on. There is subtracted from each line of FIG. 2, therefore, its value at xn, resulting in the graph of FIG. 3.

Now the system is initially in the Ck code. λ has assumed its new value, and the system will stay with this code unless a new Ci is reached such that

(2) Ci(xλ, xn) < Ck(xλ, xn) − 2H, for some i

If this happens, the system transmits the symbols between xλ+1 and xn in code Ci. If this does not happen before the Ck output buffer fills up, then transmission continues in code Ck. The Ck buffer is emptied, all the other buffers are cleared, and the above procedure is repeated with all the coder output lines now originating from that source symbol which would have caused the Ck buffer to overflow if the system had not just then stopped the search procedure.
The method of code selection outlined above is simple and effective, but it occasionally results in a transmission cost higher than what is incurred with uncoded transmission. A preferred method of code selection that never results in transmitting more symbols than are required for uncoded transmission might be tabulated as follows:
Coder logic when discriminant equals 2H

Let M represent the maximum capacity of the coder buffers.

Step 1.—The value of λ is initially zero, as is the value of i in Steps 2, 5 and 7. Proceed to Step 2.

Step 2.—Search for the smallest source sequence subscript n such that conditions (3) and (4) are both satisfied for some values of k:

(3) n − λ ≤ M

(4) Ck(xλ, xn) < Ci(xλ, xn) − 2H

If no such value of n is found, proceed to Step 7; otherwise proceed to Step 3.

Step 3.—Using the n value and k value found in Step 2, determine whether or not condition (5) is satisfied:

(5) Ck(xλ, xn) < C0(xλ, xn) − 2H

If condition (5) is satisfied, proceed to Step 4; otherwise proceed to Step 5.

Step 4.—Using the n value and k value found in Step 2, transmit a code-instruction word, transmit the next n − λ source symbols in code Ck, set the value of i in Steps 2, 5 and 7 equal to the value of k and proceed to Step 9.

Step 5.—Using the n value found in Step 2, determine whether or not condition (6) is satisfied:

(6) Ci(xλ, xn) ≤ C0(xλ, xn)

If condition (6) is satisfied, transmit the next n − λ source symbols in code Ci and proceed to Step 9; otherwise proceed to Step 6.

Step 6.—Transmit a code-instruction word for code C0, transmit the next n − λ source symbols in code C0, set the value of i in Steps 2, 5 and 7 equal to zero and proceed to Step 9.
Step 7.—Determine whether or not condition (7) is satisfied:

(7) Ci(xλ, xλ+M) ≤ C0(xλ, xλ+M)

If condition (7) is satisfied, transmit the next M source symbols in code Ci and proceed to Step 9; otherwise proceed to Step 8.

Step 8.—Transmit a code-instruction word for code C0, transmit the next M source symbols in code C0, set the value of i in Steps 2, 5 and 7 equal to zero and proceed to Step 9.

Step 9.—Set the value of λ equal to the total number of source sequence symbols that have now been transmitted. Proceed to Step 2.
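
One way to read Steps 1 through 9 as a single control loop is sketched below. This is an interpretation offered for illustration only, not the patented apparatus: the cost function, the sample source string and the values of H and M are assumptions made for the example, the actual emission of code-instruction words and the buffer bookkeeping are not modeled, and the final block is simply truncated at the end of the source.

def coder_logic_2h(source, cost, num_coders, H, M):
    """Illustrative rendering of the tabulated 2H-discriminant logic.

    cost(k, lam, n) must return Ck(x_lam, x_n), the number of symbols coder Ck
    needs for the source symbols lam+1 .. n (Python slice source[lam:n]);
    coder 0 is the null coder.  Each returned block (k, start, end) means
    "transmit source[start:end] in the code of coder k".
    """
    blocks, lam, i = [], 0, 0                           # Step 1: lambda = 0, current code C0
    total = len(source)
    while lam < total:
        # Step 2: smallest n with n - lambda <= M and Ck < Ci - 2H for some k.
        found = None
        for n in range(lam + 1, min(lam + M, total) + 1):
            ks = [k for k in range(num_coders)
                  if cost(k, lam, n) < cost(i, lam, n) - 2 * H]
            if ks:
                found = (n, min(ks))                    # tie-break: least subscript k
                break
        if found is None:
            n = min(lam + M, total)                     # next M symbols (truncated at the end)
            if cost(i, lam, n) <= cost(0, lam, n):      # Step 7, condition (7)
                blocks.append((i, lam, n))
            else:                                       # Step 8: return to uncoded transmission
                blocks.append((0, lam, n))
                i = 0
        else:
            n, k = found
            if cost(k, lam, n) < cost(0, lam, n) - 2 * H:   # Step 3, condition (5)
                blocks.append((k, lam, n))                  # Step 4: switch to Ck
                i = k
            elif cost(i, lam, n) <= cost(0, lam, n):        # Step 5, condition (6)
                blocks.append((i, lam, n))                  # stay with the current code
            else:                                           # Step 6: return to uncoded
                blocks.append((0, lam, n))
                i = 0
        lam = blocks[-1][2]                             # Step 9: advance lambda, repeat Step 2
    return blocks

source = "aaaaaaaaaaabcabcabc"                          # assumed sample sequence

def demo_cost(k, lam, n):
    """Assumed costs: C0 sends the segment uncoded; C1 charges two symbols per run."""
    seg = source[lam:n]
    if k == 0:
        return len(seg)
    return 2 * sum(1 for j in range(len(seg)) if j == 0 or seg[j] != seg[j - 1])

print(coder_logic_2h(source, demo_cost, num_coders=2, H=1, M=8))
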
Using 2H, rather than H, as the discrimination level is ultra-conservative; its virtue being that it allows lossless return from any code to C0 in case it turns out, in the interval following the decision interval, that the chosen code and all the other codes suddenly become more inefficient than uncoded transmission. Another way to obviate this danger is to monitor the interval following the decision interval before making a firm decision. When this is done, the discrimination level can be lowered from 2H to H without compromising the property that the system of the invention never transmits more inefficiently than uncoded.

The requisite steps for this operation may be tabulated as follows:

Coder logic when discriminant equals H

Let M represent the capacity of the coder buffers, and, as before, let Ck(xλ, xn) be the number of symbols that must be transmitted by the kth coder to represent all the source symbols starting with xλ+1 and ending with xn.

The object of the following logic is to identify that coder whose use results in the fewest transmitted symbols. Occasionally it will happen that two or more coders result in equally few transmitted symbols for a particular segment of source symbol sequence; the logic then selects a specific coder by applying some arbitrary rule to resolve the ambiguity, an example of such a rule being, "choose that coder whose subscript k is least." Any number of other rules are admissible. In the coding logic presented here, occasional use is made of the term, "the preferred value, relative to λ and n, of a set of values of k," which term is defined by the following rule: from among the set of coder subscripts k find the subset for which Ck(xλ, xn) is least; from among the subset of values of k thus found, find the least value of k.
`
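The "preferred value" rule just defined is a two-stage minimum, which can be stated compactly. A minimal sketch, assuming a cost function behaving as Ck(xλ, xn) does in the earlier example:

def preferred_k(cost, ks, lam, n):
    """Among the coder subscripts in ks, keep those whose cost Ck(x_lam, x_n)
    is least, and return the least subscript in that subset."""
    least = min(cost(k, lam, n) for k in ks)
    return min(k for k in ks if cost(k, lam, n) == least)
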
Step 1.—The value of λ, the total number of source sequence symbols that have been transmitted, is initially zero. Proceed to Step 2.

Step 2.—Search for the smallest source sequence subscript n such that conditions (8) and (9) are both satisfied for one or more values of k:

(8) n − λ ≤ M

(9) Ck(xλ, xn) < C0(xλ, xn) − 2H

If no such value of n is found, proceed to Step 3; otherwise proceed to Step 5.

Step 3.—Transmit M source symbols in C0. Proceed to Step 4.

Step 4.—Set the value of λ equal to the total number of source sequence symbols that have now been transmitted. Proceed to Step 2.

Step 5.—Using the value of n found in Step 2 and using the preferred value, relative to λ and n, of the set of values of k found in Step 2, transmit a code-instruction word, transmit the next n − λ source symbols in Ck, and set the value of i in Steps 7, 8, 9, 12 and 13 equal to the preferred value of k. Proceed to Step 6.

Step 6.—Set the value of λ equal to the total number of source symbols that now have been transmitted. Proceed to Step 7.

Step 7.—Search for the smallest source sequence subscript n such that conditions (10) and (11) are both satisfied for one or more values of k:

(10) n − λ ≤ M

(11) Ck(xλ, xn) < Ci(xλ, xn) − H

If no such value of n is found, proceed to Step 8; otherwise proceed to Step 9.

Step 8.—Determine whether or not condition (12) is satisfied:

(12) Ci(xλ, xλ+M) ≤ C0(xλ, xλ+M)

If condition (12) is satisfied, transmit the next M source symbols in Ci and proceed to Step 6; otherwise transmit a code-instruction word for C0, transmit the next M source symbols in C0, and proceed to Step 4.
Step 9.—Using the value of n found in Step 7, search for the smallest source sequence subscript m such that conditions (13) and (14) are both satisfied for one or more values of k:

(13) n − λ ≤ m − λ ≤ M

(14) Ck(xλ, xm) < Ci(xλ, xm) − H