Design of Provably Good Low-Density Parity Check Codes

Tom Richardson, Amin Shokrollahi and Rüdiger Urbanke

Bell Labs, Lucent Technologies

Murray Hill, NJ 07974

April 5, 1999
Abstract

We design sequences of low-density parity check codes that provably perform at rates extremely close to the Shannon capacity. The codes are built from highly irregular bipartite graphs with carefully chosen degree patterns on both sides. Our theoretical analysis of the codes is based on [1]. Additionally, based on the assumption that the underlying communication channel is symmetric, we prove that the probability densities at the message nodes of the graph satisfy a certain symmetry. This enables us to derive a succinct description of the density evolution for the case of a belief propagation decoder. Furthermore, we prove a stability condition which implies an upper bound on the fraction of errors that a belief propagation decoder can correct when applied to a code induced from a bipartite graph with a given degree distribution.

Our codes are found by optimizing the degree structure of the underlying graphs. We develop several strategies to perform this optimization. We also present some simulation results for the codes found which show that the performance of the codes is very close to the asymptotic theoretical bounds.

Index Terms -- Low-density parity check codes, belief propagation, turbo codes, irregular codes
1 Introduction
In this paper we present irregular low-density parity check codes (LDPCCs) which exhibit performance extremely close to the best possible as determined by the Shannon capacity formula. For the additive white Gaussian noise channel (AWGNC) the best code of rate one-half presented in this paper has a threshold within 0.06dB from capacity, and simulation results show that our best LDPCC of length 10^6 achieves a bit error probability of 10^-6 less than 0.13dB away from capacity, handily beating the best (turbo) codes known so far.

LDPCCs possess several other distinct advantages over turbo-codes. First, the complexity of (belief-propagation) decoding is somewhat less than that of turbo-codes and, being fully parallelizable, can potentially be performed at significantly greater speeds. Second, as indicated in a previous paper [1], very low complexity decoders that closely approximate belief-propagation in performance may be (and have been) designed for these codes. Third, low-density parity check decoding is verifiable in the sense that decoding to a correct codeword is a detectable event. One practical objection to low-density parity-check codes has been that their encoding complexity is high. One way to get around this problem is by slightly modifying the construction of codes from bipartite graphs to a cascade of such graphs, see [2, 3]. A different solution for practical purposes, which does not require cascades, will be presented elsewhere [4].

Let us recall some basic notation. As is well known, low-density parity-check codes are well represented by bipartite graphs in which one set of nodes, the variable nodes, corresponds to elements of the codeword and the other set of nodes, the check nodes, corresponds to the set of parity-check constraints satisfied by codewords of the code. Regular low-density parity-check codes are those for which all nodes of the same type have the same degree. Thus, a (3, 6)-regular low-density parity-check code has a graphical representation in which all variable nodes have degree 3 and all check nodes have degree 6. The bipartite graph determining such a code is shown in Fig. 1.
Figure 1: A (3, 6)-regular code of length 10. There are 10 variable nodes and 5 check nodes. For each check node c_i the sum (over GF(2)) of all adjacent variable nodes is equal to zero.
`For an irregular low-density parity-check code the degrees of each set of nodes are chosen accord-
`
`ing to some distribution. Thus; an irregular lmv-density parity-rherk rode might have a graphiral
`
`Hughes, Exh. 1029, p. 2
`
`

`

`represent a lion in which half the variable nodes have degree 5 and half have degree :3, while half
`
`the constraint nodes have degree 6 and half have degree X. For a given length and a given de-
`
`gree sequenre (finite distrib11tion) we define an f.'nsnnblt: of rodes by choosing the edges, i.e., the
`
`ron nedions between variable and rherk nodes, randomly. More precisely, we en u mera.te the edges
`
`l'manaling from the variable nodes in some' arbitrary order a.nd proceed in the sanw way with the
`
`edges emanating from the check nodes. Assume that the number of edges is E. Then a code (a par(cid:173)
`
`ticular instance of this ensemble) can be identified with a permutation on E letters. Note that all
`
`elements in this ensemble are equiprobable. In pra.dire the edges are not rhosen entirely randomly
`
`since certain potentially unfortunate events in the graph construction ca.n be easily avoided.
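The sampling procedure just described is easy to make concrete. The following is a minimal sketch (the helper and its argument names are our own, not part of the paper): it enumerates the edge "sockets" on both sides and identifies a code instance with a random permutation; it makes no attempt to avoid the unfortunate events mentioned above.

```python
import random

def sample_code(var_degrees, chk_degrees, seed=0):
    """Sample one bipartite graph (as an edge list) from the ensemble.

    var_degrees[i] is the degree of variable node i, chk_degrees[j] the
    degree of check node j; the two degree sums must both equal E.
    """
    assert sum(var_degrees) == sum(chk_degrees), "edge counts must match"
    # Enumerate the E edge sockets on each side in some arbitrary order.
    var_sockets = [i for i, d in enumerate(var_degrees) for _ in range(d)]
    chk_sockets = [j for j, d in enumerate(chk_degrees) for _ in range(d)]
    # A code instance is identified with a permutation on E letters.
    rng = random.Random(seed)
    rng.shuffle(chk_sockets)
    return list(zip(var_sockets, chk_sockets))

# Example: one instance of the (3, 6)-regular ensemble of length 10 from Fig. 1.
edges = sample_code([3] * 10, [6] * 5)
```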
In a recent paper [1] we presented an asymptotic analysis of LDPCCs under message passing decoding. Assume we have the following setup:

1. We are given an ordered family of binary-input discrete memoryless channels parameterized by a real parameter δ such that if δ_1 < δ_2 then the channel with parameter δ_2 is a physically degraded version of the channel with parameter δ_1. Further, each channel in this family fulfills the channel symmetry condition

p(y|x = 1) = p(-y|x = -1).    (1)
2. A sequence of code ensembles C_n(λ, ρ) is specified, where n is the length of the code and λ(x) := Σ_{i≥1} λ_i x^{i-1} (ρ(x) := Σ_{i≥1} ρ_i x^{i-1}) is the variable (check) node degree sequence. More precisely, λ_i (ρ_i) is the fraction of edges emanating from variable (check) nodes of degree i. Note that the rate of the code is given in terms of λ(x) and ρ(x) as 1 - (∫_0^1 ρ(x) dx)/(∫_0^1 λ(x) dx). (A short numerical sketch of this rate computation is given right after this list.)
3. A message passing decoder is selected. By definition, messages only contain extrinsic information, i.e., the message emitted along an edge e does not depend on the incoming message along the same edge. Further, the decoder fulfills the following symmetry conditions. Flipping the sign of all incoming messages at a variable node results in a flip of the sign of all outgoing messages. The symmetry condition at a check node is slightly more involved. Let e be an edge emanating from a check node c. Then flipping the sign of i incoming messages arriving at node c, excluding the message along edge e, results in a change of the sign of the outgoing message by (-1)^i. In all these cases, only the sign is changed, but the reliability remains unchanged.
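Returning to the rate formula quoted in item 2 above, the design rate is a simple function of the edge-perspective coefficients. A minimal numerical sketch (function name and coefficient layout are our own choices):

```python
def design_rate(lam, rho):
    """Rate 1 - (int_0^1 rho)/(int_0^1 lam); lam[k], rho[k] are the
    coefficients of x^(k+1), i.e., the degree-(k+2) edge fractions."""
    int_lam = sum(l / (k + 2) for k, l in enumerate(lam))
    int_rho = sum(r / (k + 2) for k, r in enumerate(rho))
    return 1.0 - int_rho / int_lam

# (3,6)-regular: lambda(x) = x^2, rho(x) = x^5, giving rate 1/2.
print(design_rate([0.0, 1.0], [0.0, 0.0, 0.0, 0.0, 1.0]))
```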
Under the above assumptions there exists a threshold δ* with the following properties: For any ε > 0 and δ < δ* there exists an n(ε, δ) such that almost every code^1 in C_n(λ, ρ), n > n(ε, δ), has probability of error smaller than ε assuming that transmission takes place over a channel with parameter δ. Conversely, if the transmission takes place over a channel with parameter δ > δ*, then almost any code in C_n(λ, ρ) has probability of error uniformly bounded away from zero.^2

Further, for the important case of belief propagation decoders a procedure was introduced that enables the efficient computation of δ* to any desired degree of accuracy. In [1] threshold values and simulation results were given for a variety of noise models, but the class of low-density parity-check codes considered was largely restricted to regular codes. In the present paper we present results indicating the remarkable performance that can be achieved by properly chosen irregular codes.
The idea underlying this paper is quite straightforward. Assume we are interested in transmission over a particular memoryless channel using a particular message passing decoder. Since for every given pair of degree sequences (λ, ρ) we can determine the resulting threshold value δ*, it is natural to search for those pairs which maximize this threshold.^3 This was done, with great success, in the case of erasure channels [5, 6, 7]. In the case of most other channels of interest the situation is much more complicated and new methods must be brought to bear on the optimization problem.

Fig. 2 compares the performance of the (3, 6)-regular LDPCC (which is the best regular such code) with the performance of the best irregular LDPCC we found in our search and the performance of the standard parallel turbo code introduced by Berrou, Glavieux, and Thitimajshima [8]. All three codes have rate one-half and their performance under belief propagation decoding over the AWGNC is shown for a codeword length of 10^6. Also shown is the Shannon limit and the threshold value of our best LDPCC (σ* = 0.9718). From this figure it is clear how much benefit can be derived from optimizing the degree sequences. For n = 10^6 and a bit error probability of 10^-6, our best LDPCC is only 0.13dB away from capacity. This handily beats the performance of turbo-codes. Even more impressive, the threshold, which indicates the performance for infinite lengths, is a mere 0.06dB away from capacity.
^1 More precisely, the fraction of codes for which the above statement is true converges exponentially (in n) fast to 1.
^2 We conjecture that this is actually true for every code in C_n(λ, ρ).
^3 We may also optimize degree sequences under various constraints. For example, the larger the degrees used the larger the code size needs to be in order to approach the predicted asymptote. Therefore, it is highly desirable to look for the best degree sequences with some a priori bound on the size of the degrees.
Figure 2: Bit error probability P_b versus E_b/N_0 [dB] for the (3, 6)-regular LDPCC, the turbo code, and the optimized irregular LDPCC, together with the corresponding thresholds and the Shannon limit.
r > 1 - h(δ)/h(p*), where h(·) is the binary entropy function and p* = (1 + (1 - 2δ)^{d_r})/2, then the block error probability is bounded away from 0.

Proof. Note that the capacity of the BSC is equal to 1 - h(δ). Since p* > 1/2 for any finite d_r, we have h(p*) < 1 and, hence, 1 - h(δ)/h(p*) < 1 - h(δ). Gallager stated the above Theorem originally for (j, k)-regular codes. A closer look, however, shows that the proof remains valid for the case of irregular codes if the original expression p* = (1 + (1 - 2δ)^j)/2 is replaced by the expression p* = (1 + (1 - 2δ)^{d_r})/2. Details of the proof can be found in [9, p. 39]. We note that a slightly sharper bound can be given if we replace d_r with the average degree of the (1 - r)-fraction of highest degree nodes. However, since the improvement is usually only slight and since the exact size of the gap is not significant in practice, we leave the details to the reader. □
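The bound above is straightforward to evaluate. A small sketch (the helper names and the example numbers are our own):

```python
import math

def h2(p):
    """Binary entropy function (in bits)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def gallager_rate_bound(delta, d_r):
    """Upper bound 1 - h(delta)/h(p*) with p* = (1 + (1 - 2*delta)**d_r)/2."""
    p_star = (1.0 + (1.0 - 2.0 * delta) ** d_r) / 2.0
    return 1.0 - h2(delta) / h2(p_star)

# Example: crossover probability 0.08 and maximum check degree 6.
print(gallager_rate_bound(0.08, 6))
```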
The outline of this paper is as follows. In Section 2 we describe and study density evolution. Under the assumption that the input distribution arises from a symmetric channel, we show that the message distributions satisfy a certain consistency condition which is invariant under density evolution. Many simplifications arise in this case which afford us considerable information on the nature of density evolution. In particular, we show that density evolution always converges to a fixed point. We also address the stability of the fixed point that corresponds to the correct solution. In Section 3 we give tables of some very good degree sequences. Although we focus mostly on the AWGNC and rates one-half, we also give a few examples for different channels and rates. In Section 3 we also present some simulation results that show that the promised performance can be closely achieved for reasonably long codes. Finally, in Section 4 we describe our numerical optimization techniques which were used to generate the tables in Section 3.

2 Density Evolution: Consistency, Stability, and Fixed Points
Density evolution, as described in [1], is general enough to enable the computation of message densities regardless of the initial density. In the case of most interest, the message distribution arises from the observation of the input bit after passing it through a known symmetric channel. Since the channel is known, the log-likelihood for the input bit given the observation is determined and this log-likelihood constitutes the received message. Message distributions (under the ubiquitous all-one word assumption) which arise this way satisfy certain consistency conditions that are preserved under density evolution. In the following, we will examine these conditions. They will lead to simplifications of density evolution. Furthermore, in this case, one can prove a monotonicity property of density evolution which implies that density evolution converges to a fixed point. Finally, in this section, we study the asymptotics of correct decoding to arrive at a stability condition for the fixed point corresponding to zero probability of error.
Let us first briefly recall the form and derivation of density evolution for belief propagation. In general, each message represents an estimate of a particular transmitted bit. Let x denote the value of this bit and recall that x is an element of {±1}. At a variable node, we will represent messages as log-likelihood ratios,

log [p(y|x = 1) / p(y|x = -1)],

where y represents all the observations conveyed to the variable node at that time. In the sequel we will often refer to message distributions without further qualifications. Unless stated otherwise, this will always refer to message distributions at variable nodes under the all-one word assumption where the messages are represented as log-likelihood ratios. At a check node it is more convenient to represent the message as a tuple (s, r), where s ∈ W_2 := {±1} indicates the hard decision and where r ∈ ℝ+ is equal to

|log |p(x = 1|y) - p(x = -1|y)||.

Here W_2 denotes the two-element group with elements {±1} and group operation equal to real multiplication. Since we wish to admit the possibility of perfect knowledge of a bit (erasure channel), we must admit the possibility that message distributions may have point masses at the endpoints. E.g., in the log-likelihood representation this means that we may have 'point masses at infinity' in our message distributions. Thus, in general, message distributions are probability distributions on the two-point compactification of ℝ, which we denote by ℝ̄. To simplify the presentation we shall generally assume that distributions are absolutely continuous with respect to Lebesgue measure and can therefore be represented by their densities.^5
Assume now that the received message and the incoming messages at a variable node are given in log-likelihood form. The outgoing message along an edge e is then simply the sum of the received message plus all incoming messages, excluding the message incoming along edge e. Therefore, assuming independence (tree assumption), the density of the outgoing message is the convolution of the densities of the messages participating in this sum.

^5 A couple of exceptions will be made for certain discrete measures. In particular, we shall require the distribution Δ_∞, the 'delta function at infinity.' We shall also require the distribution Δ_0, the 'delta function at zero,' corresponding to an erasure.

An equivalent statement is true at the check nodes. Assume that the incoming messages to a check node are all in the form (s, r). The outgoing message (again in (s, r) representation) along an edge e is then simply the sum of all incoming messages, excluding the message incoming along edge e (here "addition" is performed componentwise, i.e., (s_1, r_1) + (s_2, r_2) = (s_1 * s_2, r_1 + r_2), where the first component is in W_2 = {±1} and "addition" corresponds to real multiplication, and the second component is an element of ℝ̄+ with regular addition). Thus, as in the previous case, the density of the outgoing message, over W_2 × ℝ̄+, can be written as the convolution of the corresponding densities of the incoming messages. Here, ℝ̄+ refers to [0, ∞], the compactification of ℝ+. To complete the description we need only specify the change of variables necessary to convert a density from one representation to the other and to incorporate the degree sequences.
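Before turning to the change of variables, it may help to see the two node operations in code. The sketch below (our own helper functions, assuming finite nonzero incoming log-likelihoods) computes the outgoing message of a variable node and of a check node, the latter both in (s, r) form and converted back to a log-likelihood:

```python
import math

def variable_node_out(channel_llr, incoming_llrs):
    """Outgoing LLR: received LLR plus all incoming LLRs (edge e excluded by the caller)."""
    return channel_llr + sum(incoming_llrs)

def check_node_out(incoming_llrs):
    """Outgoing check-node message, as (s, r) and as an LLR (edge e excluded by the caller)."""
    s = 1
    r = 0.0
    for m in incoming_llrs:                            # "addition" in W_2 x R+:
        s *= 1 if m >= 0 else -1                       # signs multiply,
        r += math.log(1.0 / math.tanh(abs(m) / 2.0))   # reliabilities r = ln coth(|m|/2) add.
    # Convert back to a log-likelihood: magnitude ln coth(r/2), sign s,
    # which is the usual tanh rule 2*atanh(prod tanh(m_i/2)).
    return (s, r), s * math.log(1.0 / math.tanh(r / 2.0))
```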
Suppose that we have a graph with left and right edge degree distributions given by λ(x) = Σ_{i≥1} λ_i x^{i-1} and ρ(x) = Σ_{i≥1} ρ_i x^{i-1}, respectively. We follow [3] and use the following convention: for a polynomial q(x) = Σ_i q_i x^{i-1} with real coefficients q_i we denote by q(f) the distribution Σ_i q_i f^{⊗(i-1)}, where here and in the sequel ⊗ denotes convolution.
In the following, we assume that the channel satisfies the symmetry condition (1). Further, we assume that the all-one codeword was transmitted. Let P_0 be the distribution of the received log-likelihoods and P_ℓ denote the distribution of log-likelihoods sent from the variable nodes to the check nodes in the ℓ-th iteration. Let R_ℓ denote the distribution of log-likelihoods sent from the check nodes to the variable nodes in the ℓ-th iteration. We may initialize with R_0 = Δ_0.

The distribution of messages passed from a message node to a check node in the ℓ-th iteration is given by the convolution P_ℓ := P_0 ⊗ λ(R_{ℓ-1}).
To obtain the distribution of messages from check nodes back to message nodes, we must perform a change of measure. To this end we define the operator γ which takes the space of distributions on ℝ̄+ into itself. For densities on ℝ̄+ we define γ as

γ(f)(y) := f(ln coth(y/2)) csch(y).    (2)

The operator γ represents a change of variables y → ln coth(y/2), i.e., f(y) dy = γ(f)(y) dy, since (d/dy) ln coth(y/2) = -csch(y). (We can extend the definition of γ to distributions on ℝ̄+ by taking limits.) This shows part (a) of the following lemma, while part (b) follows easily from the fact that

ln coth((ln coth(y/2))/2) = y.
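For completeness, here is the short computation behind this fact (a worked check of our own, assuming y > 0 and writing t := coth(y/2)):

```latex
\coth\!\left(\tfrac{\ln t}{2}\right)
  = \frac{t^{1/2}+t^{-1/2}}{t^{1/2}-t^{-1/2}}
  = \frac{t+1}{t-1}
  = \frac{\coth(y/2)+1}{\coth(y/2)-1}
  = \frac{2e^{y/2}}{2e^{-y/2}}
  = e^{y},
\qquad\text{hence}\qquad
\ln\coth\!\left(\tfrac{1}{2}\ln\coth\tfrac{y}{2}\right) = y .
```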
Lemma 1. The following facts hold:

(a) ∫_0^∞ γ(f)(y) dy = ∫_0^∞ f(y) dy,

(b) γ(γ(f))(y) = f(y).

For any distribution f on ℝ̄ let f^- denote 1_{[-∞,0]} f and let f^+ denote 1_{[0,∞]} f, where 1_A denotes the characteristic function of A, so that f = f^+ + f^-.^6

^6 In the general case, where a point mass at 0 is allowed, the mass at 0 should be split equally between f^- and f^+.
Given a distribution f over ℝ̄ of log-likelihoods l, we can represent it as a distribution over W_2 × ℝ̄+ by representing l as the pair (s, r). We therefore define Γ, the change of variable operator, by

Γ(f)(s, r) := 1_{{1}}(s) γ(f^+)(r) + 1_{{-1}}(s) γ(Rf^-)(r),

where Rf^- denotes the reflection of f^- about 0, i.e., Rf^-(x) = f^-(-x).

Now, let g be a distribution over W_2 × ℝ̄+. Let g^+ and g^- be defined by g^+(r) = 1_{{1}}(s) g(s, r) and g^-(r) = 1_{{-1}}(s) g(s, r). Then the inverse of Γ is given by

Γ^{-1}(g)(x) := 1_{[0,∞]}(x) γ(g^+)(x) + 1_{[-∞,0]}(x) γ(g^-)(-x).

Using this notation, we obtain the following.
Theorem 2 (Density Evolution). Let P_0 denote the initial message distribution, under the assumption that the all-one codeword was transmitted, of a low-density parity check code specified by edge degree distributions Σ_i λ_i x^{i-1} and Σ_i ρ_i x^{i-1}. If R_ℓ denotes the density of the messages passed from the check nodes to the message nodes at round ℓ of the belief propagation, with R_0 := Δ_0, then we have

R_ℓ = Γ^{-1}(ρ(Γ(P_0 ⊗ λ(R_{ℓ-1})))).
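One way to evaluate this recursion numerically, without implementing the change of measure explicitly, is to sample the message-passing updates directly ("Monte Carlo density evolution"). The sketch below does this for the binary-input AWGN channel with noise standard deviation sigma; the routine, its parameters, and the sample size are our own choices and only illustrate the idea:

```python
import numpy as np

def mc_density_evolution(lam, rho, sigma, iters=30, n=100_000, seed=0):
    """Monte Carlo estimate of Pr_err(P_ell) for the BIAWGN channel.

    lam, rho: dicts mapping degree d to the edge-perspective fractions
    lambda_d / rho_d.  The all-one codeword (BPSK +1) is assumed.
    """
    rng = np.random.default_rng(seed)
    var_deg = rng.choice(list(lam), p=list(lam.values()), size=n)
    chk_deg = rng.choice(list(rho), p=list(rho.values()), size=n)
    # Channel log-likelihoods: N(2/sigma^2, 4/sigma^2) under the all-one word.
    llr0 = 2.0 / sigma**2 + (2.0 / sigma) * rng.standard_normal(n)
    v2c = llr0.copy()                      # variable-to-check message samples
    for _ in range(iters):
        c2v = np.empty(n)                  # check-to-variable message samples
        for i in range(n):
            inc = v2c[rng.integers(0, n, size=chk_deg[i] - 1)]
            t = np.prod(np.tanh(inc / 2.0))
            # tanh rule; clip guards against overflow to +/- infinity.
            c2v[i] = 2.0 * np.arctanh(np.clip(t, -1.0 + 1e-12, 1.0 - 1e-12))
        for i in range(n):
            inc = c2v[rng.integers(0, n, size=var_deg[i] - 1)]
            v2c[i] = llr0[i] + inc.sum()
    return float(np.mean(v2c < 0.0))

# (3,6)-regular ensemble, sigma = 0.8 (well below its threshold of about 0.88):
# print(mc_density_evolution({3: 1.0}, {6: 1.0}, 0.8))
```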
2.1 Consistency

In the following, we call a distribution f on ℝ̄ consistent if it satisfies f(x) = f(-x)e^x for all x ∈ ℝ+. More formally, a probability measure ν is consistent if the following hold. First, ν can be written as ν^- + ν^+ where ν^- is supported on [-∞, 0] and ν^+ is supported on [0, +∞]. Letting Rν^- denote as before the reflection of ν^- about 0, we require that Rν^- be absolutely continuous with respect to ν^+ and that the Radon-Nikodym derivative [10, p. 150] of Rν^- with respect to ν^+ be given by e^{-x}. If ν has a density f then consistency of ν is equivalent to consistency of f as defined above. Note that Δ_∞ and Δ_0 are consistent measures.
We will first establish consistency of the initial message distribution of density evolution under general conditions.

Proposition 1. Suppose that a channel has the symmetry property p(y | x = 1) = p(-y | x = -1), and let P_0 be the initial message distribution in log-likelihood ratio form under the all-one word assumption. Then P_0 is consistent.
Proof. Let ν denote the measure corresponding to P_0, i.e., for any measurable subset A, ν(A) := ∫_A P_0(x) dx. Note that P_0 is equal to the distribution of the random variable

z(y) := log [p(y|x = 1) / p(y|x = -1)]    (3)

under the assumption that the all-one codeword was transmitted. For any measurable set A we have ν(A) = ∫_{z^{-1}(A)} p(y|x = 1) dy. Invoking channel symmetry and equation (3) we have

ν(-A) = ∫_{z^{-1}(-A)} p(y|x = 1) dy = ∫_{z^{-1}(A)} p(-y|x = 1) dy = ∫_{z^{-1}(A)} p(y|x = -1) dy = ∫_{z^{-1}(A)} e^{-z(y)} p(y|x = 1) dy.

Thus, we obtain that ν is a consistent measure, i.e., setting A = (z, z + dz) we obtain

ν(-z - dz, -z) / ν(z, z + dz) = e^{-z}. □
Example 1 (Erasure Channel). For the erasure channel the initial message distribution is εΔ_0 + (1 - ε)Δ_∞. Then P_0^+ = (ε/2)Δ_0 + (1 - ε)Δ_∞ and P_0^- = (ε/2)Δ_0, which shows that the initial distribution is consistent. □
Example 2 (BSC). For the BSC the initial message distribution is P_0(x) := δΔ_{-log((1-δ)/δ)} + (1 - δ)Δ_{log((1-δ)/δ)}. We have P_0^+ = (1 - δ)Δ_{log((1-δ)/δ)} and P_0^- = δΔ_{-log((1-δ)/δ)}, so consistency is apparent. □

Example 3 (AWGNC). Here the initial message distribution is

P_0(x) := √(σ²/(8π)) e^{-(x - 2/σ²)² σ²/8}.

The consistency condition is then P_0(x) = P_0(-x) e^x, which is readily verified since the variance 4/σ² of P_0 is equal to twice its mean 2/σ². □
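The consistency of the AWGNC initial distribution is also easy to confirm numerically; the following sketch (our helper, with an arbitrary choice of σ) evaluates the residual P_0(x) - P_0(-x)e^x on a grid:

```python
import numpy as np

def awgn_llr_density(x, sigma):
    """Initial LLR density for the BIAWGN channel: N(2/sigma^2, 4/sigma^2)."""
    m = 2.0 / sigma**2
    return np.sqrt(sigma**2 / (8.0 * np.pi)) * np.exp(-(x - m) ** 2 * sigma**2 / 8.0)

sigma = 0.9
x = np.linspace(-10.0, 10.0, 2001)
f = awgn_llr_density(x, sigma)
# Consistency: f(x) = f(-x) * exp(x); the residual should vanish up to round-off.
print(np.max(np.abs(f - awgn_llr_density(-x, sigma) * np.exp(x))))
```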
Example 4 (Laplace Channel). As a final example, the initial message distribution for the Laplace channel can also be written down in closed form. Again, it is easy to check that the consistency condition is fulfilled. □
We say that two channels are equivalent if they have the same initial message distribution. It is often convenient to pick one representative for each equivalence class. How this can be done is shown in

Lemma 2. Let P_0(y) be a consistent distribution. Then the symmetric channel p(·|·) with p(y|x = 1) = P_0(y) (and hence by symmetry p(y|x = -1) = P_0(-y)) has initial message distribution P_0(y).

Proof.

log [p(y|x = 1) / p(y|x = -1)] = log [P_0(y) / P_0(-y)] = log [P_0(y) / (P_0(y) e^{-y})] = y. □
Our next goal is to show that consistency is preserved under density evolution. This leads to certain simplifications of its representation.

Lemma 3. A distribution f over ℝ̄ is consistent if and only if γ(Rf^-)(y) = tanh(y/2) γ(f^+)(y).
Proof. We have

γ(Rf^-)(y) = f^-(-ln coth(y/2)) csch(y)

and

γ(f^+)(y) = f^+(ln coth(y/2)) csch(y).

Note that from f(-x) = f(x) e^{-x} it follows that

γ(Rf^-)(y) = exp(-ln coth(y/2)) f^+(ln coth(y/2)) csch(y) = tanh(y/2) γ(f^+)(y),

and the lemma follows directly. □
Thus, we may extend the definition of consistency to densities over W_2 × ℝ̄+.

Lemma 4. Consistency is invariant under convolution in ℝ̄ and convolution in W_2 × ℝ̄+.
Proof. Let f and g be consistent densities over ℝ̄. Then

∫_{-∞}^{∞} f(x - y) g(y) dy = ∫_{-∞}^{∞} e^{x-y} f(y - x) e^{y} g(-y) dy = e^{x} ∫_{-∞}^{∞} f(y - x) g(-y) dy = e^{x} ∫_{-∞}^{∞} f(-x - y) g(y) dy.

To show that consistency is invariant under convolution over W_2 × ℝ̄+ is somewhat more elaborate. Let f and g be consistent densities over W_2 × ℝ̄+ and let us express them as

f(s, r) = 1_{{1}}(s) f^+(r) + 1_{{-1}}(s) f^-(r),
g(s, r) = 1_{{1}}(s) g^+(r) + 1_{{-1}}(s) g^-(r),

(where f^+(r) = 1_{{1}}(s) f(s, r), etc.) and let us similarly express f ⊗ g. Then the convolution of f and g is specified by

(f ⊗ g)^+ + (f ⊗ g)^- = (f^+ + f^-) ⊗ (g^+ + g^-),
(f ⊗ g)^+ - (f ⊗ g)^- = (f^+ - f^-) ⊗ (g^+ - g^-).

Now, the consistency condition for a distribution f over W_2 × ℝ̄+ can be expressed as

f^-(y) = tanh(y/2) f^+(y).

Thus, it is easily verified that f ⊗ g is consistent. □
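Lemma 4 can also be checked numerically, at least for the convolution over ℝ̄: convolving two consistent densities on a grid should again yield a density satisfying f(x) = f(-x)e^x up to discretization error. A small sketch using consistent Gaussians (our choice of parameters):

```python
import numpy as np

def consistent_gaussian(x, m):
    """Gaussian with mean m and variance 2m; it satisfies f(x) = f(-x) e^x."""
    return np.exp(-(x - m) ** 2 / (4.0 * m)) / np.sqrt(4.0 * np.pi * m)

x = np.linspace(-40.0, 40.0, 8001)          # symmetric grid, spacing 0.01
dx = x[1] - x[0]
f = consistent_gaussian(x, 2.0)
g = consistent_gaussian(x, 3.0)
conv = np.convolve(f, g, mode="same") * dx  # (f * g) evaluated on the same grid
mask = np.abs(x) < 20.0                     # stay away from the truncated tails
residual = conv[mask] - conv[mask][::-1] * np.exp(x[mask])
print(np.max(np.abs(residual)))             # small, up to discretization error
```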
In view of the above it is convenient to introduce the "fold" operator ℱ which takes distributions on ℝ̄ to distributions on ℝ̄+. We define it by

ℱf(x) = f^+(x) + Rf^-(x) = f^+(x) + f^-(-x).

Altogether, we obtain the following.

Theorem 3 (Density Evolution and Consistency). Let P_0 denote the initial message distribution of a low-density parity check code specified by edge degree distributions Σ_i λ_i x^{i-1} and Σ_i ρ_i x^{i-1} and assume that P_0 is consistent. If R_ℓ denotes the density of the messages passed from the check nodes to the message nodes at round ℓ of the belief propagation then we have

R_ℓ = Γ^{-1}(ρ(Γ(P_0 ⊗ λ(R_{ℓ-1})))),

and P_ℓ is consistent for all ℓ ≥ 0.
Example 5 (Erasure Channel). Let f be a density distribution of the form εΔ_0 + (1 - ε)Δ_∞. Then ℱ(f) = f and γ(f) = (1 - ε)Δ_0 + εΔ_∞. Further, λ(f) = λ(ε)Δ_0 + (1 - λ(ε))Δ_∞. Hence, starting with the initial message distribution εΔ_0 + (1 - ε)Δ_∞ corresponding to an erasure channel with erasure probability ε, the iteration is given by P_ℓ = p_ℓΔ_0 + (1 - p_ℓ)Δ_∞, where

p_ℓ = ε λ(1 - ρ(1 - p_{ℓ-1})).

This is the same formula as derived in [5, 6]. □
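Since this recursion is one-dimensional, the threshold of a degree pair (λ, ρ) over the erasure channel can be estimated in a few lines. The sketch below (our own helpers; the tolerance and iteration limits are arbitrary) bisects on ε:

```python
def bec_converges(eps, lam, rho, iters=2000, tol=1e-10):
    """Iterate p_l = eps * lam(1 - rho(1 - p_{l-1})), starting from p_0 = eps."""
    lam_eval = lambda x: sum(c * x ** (d - 1) for d, c in lam.items())
    rho_eval = lambda x: sum(c * x ** (d - 1) for d, c in rho.items())
    p = eps
    for _ in range(iters):
        p = eps * lam_eval(1.0 - rho_eval(1.0 - p))
        if p < tol:
            return True
    return False

def bec_threshold(lam, rho, lo=0.0, hi=1.0, steps=40):
    """Largest erasure probability for which the iteration drives p to zero."""
    for _ in range(steps):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if bec_converges(mid, lam, rho) else (lo, mid)
    return lo

# (3,6)-regular ensemble: the threshold is approximately 0.4294.
print(bec_threshold({3: 1.0}, {6: 1.0}))
```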
Let f(x) be a consistent distribution. We define the error probability operator as^7

Pr_err(f) := ∫_{-∞}^{0} f(x) dx.
An easy but helpful consequence of the consistency condition is noted in the following

Corollary 1. Let f(x) be a consistent distribution. If Pr_err(f) = 0 then f = Δ_∞.

Hence, requiring that the probability of error converges to zero is equivalent to requiring that the message density converges to a delta function at infinity.

^7 If f has a point mass at 0 then exactly half of that mass is included in Pr_err(f).
2.2 Fixed Points of Density Evolution

In the case of the erasure channel and a belief propagation decoder the state of the system at any iteration is described by the remaining number of erasures. Denote the remaining number of erasures by x and let h(x) be the remaining number of erasures after a further iteration. Then the threshold is given by the maximum number x_0 such that

∀ 0 < x < x_0 : h(x) < x.    (4)

A similar statement is true for the BSC together with Gallager's decoding algorithm A. In this case x signifies the remaining number of errors.

An alternative description of the threshold can be given in terms of fixed points. The threshold is given by the maximum number x_0 such that for no x ∈ (0, x_0)

h(x) = x.    (5)
Assume now we are given a general discrete memoryless channel which fulfills the channel symmetry conditions and we employ a belief propagation decoder. The state of the system is now described by the message density. Let f denote such a density and let h(f) denote the corresponding density after a further iteration. Clearly, there is no characterization of the threshold which corresponds to (4) since there is, a priori, no given linear ordering of densities. The alternative formulation in (5) involving fixed points looks more promising. Unfortunately, the non-existence of fixed points for iterated function systems in more than one dimension is in general not enough to guarantee convergence. So it is quite surprising that for message distributions of belief propagation decoders fixed points sufficiently characterize the convergence of the sequences, as we will show in this section.

Let P_ℓ(x) denote the distribution of the messages at the ℓ-th iteration assuming that the all-one word was transmitted. Assume that we were to decide on the transmitted bit according to the sign of the message. In this case the conditional error probability, conditioned on the transmission of the all-one codeword, is equal to Pr_err(P_ℓ). But, because of the symmetry conditions, this is equal to the unconditional error probability. Since the sign of the message is equal to the MAP estimate, this error probability is clearly a non-increasing function of ℓ.

This monotonicity property will play a key role in the sequel. It is not hard to see that there is actually a whole family of such monotonicity conditions.
Theorem 4. Let P_ℓ be the message distribution at the ℓ-th decoding step and let g be a consistent distribution on ℝ̄. Then

Pr_err(P_ℓ ⊗ g)

is a non-increasing function of ℓ.

Proof. The message of which P_ℓ is the distribution represents a conditional probability of a particular bit value. Assume that an independent observation of the same bit value is available to the decoder and assume that this independent observation is obtained by passing the bit through a channel p(·|·) which fulfills the symmetry condition and has p(y|x = 1) = g(y). By Lemma 2 and under the assumption that the all-one codeword was transmitted, the conditional distribution of the bit value log-likelihood, conditioned on all information incorporated in P_ℓ and the independent observation, has distribution P_ℓ ⊗ g. Since the new distribution corresponds again to a MAP estimate conditioned on information which is non-decreasing in ℓ, the stated monotonicity condition follows. □
It is convenient to use the following "basis" for all these monotonicity conditions. Let

g_z(x) := (1/(1 + e^{-z})) Δ_z(x) + (1/(1 + e^{z})) Δ_{-z}(x).    (6)

Clearly, g_z(x) is a consistent distribution. For any consistent distribution f let P_z f be defined as

P_z f := Pr_err(f ⊗ g_z).

Written in an alternative way we have

P_z f = (1/(1 + e^{z})) (1 - ∫_z^∞ (1 - e^{z-x}) f(x) dx).    (7)
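For the reader's convenience, here is one way to obtain (7) from the definition of P_z (our own intermediate steps, using the consistency of f):

```latex
\mathcal{P}_z f
  = \mathrm{Pr}_{\mathrm{err}}(f \otimes g_z)
  = \frac{1}{1+e^{-z}}\int_{-\infty}^{-z} f(x)\,dx
    + \frac{1}{1+e^{z}}\int_{-\infty}^{z} f(x)\,dx
  \overset{(\ast)}{=} \frac{1}{1+e^{z}}\Big(\int_{z}^{\infty} e^{z-x} f(x)\,dx
    + 1 - \int_{z}^{\infty} f(x)\,dx\Big)
  = \frac{1}{1+e^{z}}\Big(1-\int_{z}^{\infty}\bigl(1-e^{z-x}\bigr) f(x)\,dx\Big),
```

where the step marked (∗) uses the consistency relation ∫_{-∞}^{-z} f(x) dx = ∫_z^∞ e^{-x} f(x) dx together with ∫_{-∞}^{z} f(x) dx = 1 - ∫_z^∞ f(x) dx.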
Corollary 2. Let P_ℓ be the message distribution at the ℓ-th decoding step. Then P_z P_ℓ is a non-increasing function of ℓ for every 0 ≤ z ≤ ∞.

Lemma 5 (Basis Lemma). Let f and g be consistent distributions such that P_z g = P_z f ∀z ≥ 0. Then f = g.
Proof. Write the conditions P_z(f(x) - g(x)) = 0, ∀z ≥ 0, in the equivalent form

e^{z} ∫_{-∞}^{-z} (f(x) - g(x)) dx + ∫_{-∞}^{z} (f(x) - g(x)) dx = 0.

Differentiating both sides with respect to z we get

e^{z} ∫_{-∞}^{-z} (f(x) - g(x)) dx - e^{z} (f(-z) - g(-z)) + (f(z) - g(z)) = 0,

or, equivalently,

∀z ≥ 0 : ∫_{-∞}^{-z} (f(x) - g(x)) dx = 0.

Clearly, the last condition implies f(x) = g(x), for all x ≤ 0. Since f and g are both distributions, so that ∫_{-∞}^{+∞} (f(x) - g(x)) dx = 0, and since f and g are both consistent it follows that f = g. □
Theorem 5. Let f be a consistent distribution and assume that f has support over the whole real axis. Let P_{ℓ_1}(x) and P_{ℓ_2}(x) denote the message distributions at the ℓ_1-th and ℓ_2-th iteration respectively, with ℓ_1 < ℓ_2. If

Pr_err(P_{ℓ_1} ⊗ f) = Pr_err(P_{ℓ_2} ⊗ f),    (8)

then P_{ℓ_1}(x) = P_ℓ(x) for ℓ ≥ ℓ_1, i.e., P_{ℓ_1}(x) is a fixed point of density evolution.
Proof. We first establish the following identity for any consistent distribution f:

f(z) = ∫_0^{+∞} ℱf(x) g_x(z) dx.    (9)

The identity is proved as follows:

f(z) = ∫_{-∞}^{+∞} f(x) Δ_0(z - x) dx
     = ∫_0^{+∞} ℱf(x) ((1/(1 + e^{-x})) Δ_0(z - x) + (1/(1 + e^{x})) Δ_0(z + x)) dx
     = ∫_0^{+∞} ℱf(x) g_x(z) dx.

Now, let ℓ satisfy ℓ_1 ≤ ℓ ≤ ℓ_2. Applying the identity above we have

P_ℓ ⊗ f = ∫_0^{+∞} ℱf(x) (g_x ⊗ P_ℓ)(z) dx,

from which it follows that

Pr_err(P_ℓ ⊗ f) = ∫_0^{+∞} ℱf(x) P_x P_ℓ dx.

Note that (8) together with Theorem 4 implies that

Pr_err(P_ℓ ⊗ f) = Pr_err(P_{ℓ_1} ⊗ f),   ℓ_1 ≤ ℓ ≤ ℓ_2.
Thus, we have

∫_0^{+∞} ℱf(x) (P_x P_{ℓ_1} - P_x P_ℓ) dx = 0.

It is clear from (7) that P_x g is a continuous function of x. Further, f has support on ℝ by assumption and since, by Corollary 2, P_x P_ℓ is non-increasing in ℓ and, hence, the whole integrand is a non-negative function, it follows that P_x P_{ℓ_1} = P_x P_ℓ for all 0 ≤ x < ∞. By the Basis Lemma we conclude that P_{ℓ_1} = P_ℓ, i.e., P_{ℓ_1} is a fixed point of density evolution. □

We note that the above theorem has some very strong implications
