`PETITIONERS - EXHIBIT 1010
`
`IPR2022-00217
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`Data Structures
`
`and Algorithms
`
`ALFRED V. AHO
`
Bell Laboratories
`Murray Hill, New Jersey
`
`JOHN E. HOPCROFT
`
`Cornell University
`Ithaca, New York
`
`
`
`
`
`
`
`JEFFREY D. ULLMAN
`
`Stanford University
`Stanford, California
`
`
`
`
`
`
`This book is in the
`ADDISON-WESLEY SERIES IN
`COMPUTER SCIENCE AND INFORMATION PROCESSING
`
`Michael A. Harrison
`Consulting Editor
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
Reproduced by Addison-Wesley from camera-ready copy supplied by the authors.
Reprinted with corrections April, 1987
Copyright © 1983 by Bell Telephone Laboratories, Incorporated.
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior written permission of the publisher. Printed in the United States of America. Published simultaneously in Canada.
`
Library of Congress Cataloging in Publication Data

Aho, Alfred V.
    Data structures and algorithms.
    1. Data structures (Computer science)  2. Algorithms.
    I. Hopcroft, John E., 1939-  .  II. Ullman, Jeffrey D., 1942-  .  III. Title.
QA76.9.D35A38    1982    001.64    82-11596
ISBN 0-201-00023-7
`
`
`
`
Contents

Chapter 1  Design and Analysis of Algorithms
    1.1  From Problems to Programs ... 1
    1.2  Abstract Data Types ... 10
    1.3  Data Types, Data Structures, and Abstract Data Types ... 13
    1.4  The Running Time of a Program ... 16
    1.5  Calculating the Running Time of a Program ... 21
    1.6  Good Programming Practice ... 27
    1.7  Super Pascal ... 29

Chapter 2  Basic Data Types
    2.1  The Data Type "List" ... 37
    2.2  Implementation of Lists ... 40
    2.3  Stacks ... 53
    2.4  Queues ... 56
    2.5  Mappings ... 61
    2.6  Stacks and Recursive Procedures ... 64

Chapter 3  Trees
    3.1  Basic Terminology ... 75
    3.2  The ADT TREE ... 82
    3.3  Implementations of Trees ... 84
    3.4  Binary Trees ... 93

Chapter 4  Basic Operations on Sets
    4.1  Introduction to Sets ... 107
    4.2  An ADT with Union, Intersection, and Difference ... 109
    4.3  A Bit-Vector Implementation of Sets ... 112
    4.4  A Linked-List Implementation of Sets ... 115
    4.5  The Dictionary ... 117
    4.6  Simple Dictionary Implementations ... 119
    4.7  The Hash Table Data Structure ... 122
    4.8  Estimating the Efficiency of Hash Functions ... 129
    4.9  Implementation of the Mapping ADT ... 135
    4.10 Priority Queues ... 135
    4.11 Implementations of Priority Queues ... 138
    4.12 Some Complex Set Structures ... 145

Chapter 5  Advanced Set Representation Methods
    5.1  Binary Search Trees ... 155
    5.2  Time Analysis of Binary Search Tree Operations ... 160
    5.3  Tries ... 163
    5.4  Balanced Tree Implementations of Sets ... 169
    5.5  Sets with the MERGE and FIND Operations ... 180
    5.6  An ADT with MERGE and SPLIT ... 189

Chapter 6  Directed Graphs
    6.1  Basic Definitions ... 198
    6.2  Representations for Directed Graphs ... 199
    6.3  The Single-Source Shortest Paths Problem ... 203
    6.4  The All-Pairs Shortest Path Problem ... 208
    6.5  Traversals of Directed Graphs ... 215
    6.6  Directed Acyclic Graphs ... 219
    6.7  Strong Components ... 222

Chapter 7  Undirected Graphs
    7.1  Definitions ... 230
    7.2  Minimum-Cost Spanning Trees ... 233
    7.3  Traversals ... 239
    7.4  Articulation Points and Biconnected Components ... 244
    7.5  Graph Matching ... 246

Chapter 8  Sorting
    8.1  The Internal Sorting Model ... 253
    8.2  Some Simple Sorting Schemes ... 254
    8.3  Quicksort ... 260
    8.4  Heapsort ... 271
    8.5  Bin Sorting ... 274
    8.6  A Lower Bound for Sorting by Comparisons ... 282
    8.7  Order Statistics ... 286

Chapter 9  Algorithm Analysis Techniques
    9.1  Efficiency of Algorithms ... 293
    9.2  Analysis of Recursive Programs ... 294
    9.3  Solving Recurrence Equations ... 296
    9.4  A General Solution for a Large Class of Recurrences ... 298

Chapter 10  Algorithm Design Techniques
    10.1 Divide-and-Conquer Algorithms ... 306
    10.2 Dynamic Programming ... 311
    10.3 Greedy Algorithms ... 321
    10.4 Backtracking ... 324
    10.5 Local Search Algorithms ... 336

Chapter 11  Data Structures and Algorithms for External Storage
    11.1 A Model of External Computation ... 347
    11.2 External Sorting ... 349
    11.3 Storing Information in Files ... 361
    11.4 External Search Trees ... 368

Chapter 12  Memory Management
    12.1 The Issues in Memory Management ... 378
    12.2 Managing Equal-Sized Blocks ... 382
    12.3 Garbage Collection Algorithms for Equal-Sized Blocks ... 384
    12.4 Storage Allocation for Objects with Mixed Sizes ... 392
    12.5 Buddy Systems ... 400
    12.6 Storage Compaction ... 404

Bibliography ... 411
Index ... 419
`
`
`
`
`
CHAPTER 3
`
`
`
`Trees
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
A tree imposes a hierarchical structure on a collection of items. Familiar examples of trees are genealogies and organization charts. Trees are used to help analyze electrical circuits and to represent the structure of mathematical formulas. Trees also arise naturally in many different areas of computer science. For example, trees are used to organize information in database systems and to represent the syntactic structure of source programs in compilers. Chapter 5 describes applications of trees in the representation of data. Throughout this book, we shall use many different variants of trees. In this chapter we introduce the basic definitions and present some of the more common tree operations. We then describe some of the more frequently used data structures for trees that can be used to support these operations efficiently.
`
`3.1 Basic Terminology
`
A tree is a collection of elements called nodes, one of which is distinguished as a root, along with a relation ("parenthood") that places a hierarchical structure on the nodes. A node, like an element of a list, can be of whatever type we wish. We often depict a node as a letter, a string, or a number with a circle around it. Formally, a tree can be defined recursively in the following manner.
`
1. A single node by itself is a tree. This node is also the root of the tree.
2. Suppose n is a node and T1, T2, . . . , Tk are trees with roots n1, n2, . . . , nk, respectively. We can construct a new tree by making n be the parent of nodes n1, n2, . . . , nk. In this tree n is the root and T1, T2, . . . , Tk are the subtrees of the root. Nodes n1, n2, . . . , nk are called the children of node n.
`
Sometimes, it is convenient to include among trees the null tree, a "tree" with no nodes, which we shall represent by Λ.
Example 3.1. Consider the table of contents of a book, as suggested by Fig. 3.1(a). This table of contents is a tree. We can redraw it in the manner shown in Fig. 3.1(b). The parent-child relationship is depicted by a line. Trees are normally drawn top-down as in Fig. 3.1(b), with the parent above the child.
The root, the node called "Book," has three subtrees with roots corresponding to the chapters C1, C2, and C3. This relationship is represented by the lines downward from Book to C1, C2, and C3. Book is the parent of C1, C2, and C3, and these three nodes are the children of Book.
`
`
`
`
`
`
`
`
`
`
`
`
`
Fig. 3.1. A table of contents and its tree representation.
`
`
The third subtree, with root C3, is a tree of a single node, while the other two subtrees have a nontrivial structure. For example, the subtree with root C2 has three subtrees, corresponding to the sections s2.1, s2.2, and s2.3; the last two are one-node trees, while the first has two subtrees corresponding to the subsections s2.1.1 and s2.1.2. □
`
Example 3.1 is typical of one kind of data that is best represented as a tree. In this example, the parenthood relationship stands for containment; a parent node is comprised of its children, as Book is comprised of C1, C2, and C3. Throughout this book we shall encounter a variety of other relationships that can be represented by parenthood in trees.
If n1, n2, . . . , nk is a sequence of nodes in a tree such that ni is the parent of ni+1 for 1 ≤ i < k, then this sequence is called a path from node n1 to node nk. The length of a path is one less than the number of nodes in the path. Thus there is a path of length zero from every node to itself. For example, in Fig. 3.1 there is a path of length two, namely (C2, s2.1, s2.1.2), from C2 to s2.1.2.
If there is a path from node a to node b, then a is an ancestor of b, and b is a descendant of a. For example, in Fig. 3.1, the ancestors of s2.1 are itself, C2, and Book, while its descendants are itself, s2.1.1, and s2.1.2. Notice that any node is both an ancestor and a descendant of itself.
An ancestor or descendant of a node, other than the node itself, is called a proper ancestor or proper descendant, respectively.
`In a tree, the root is the
`only node with no proper ancestors. A node with no proper descendants is
`called a leaf. A subtree of a tree is a node, together with all its descendants.
`The height of a node in a tree is the length of a longest path from the
`node to a leaf.
`In Fig. 3.1 node C1 has height 1, node C2 height 2, and node
`C3 height 0. The height of a tree-is the height of the root, The depth of a
`node is the length of the unique path from the root to that node.
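The height and depth definitions above translate directly into code. The following is an illustrative Python sketch (the book's own programs are written in Pascal); the Node class and function names are ours, and the two sections given to C1 are an assumption consistent with its stated height of 1.

```python
# Illustrative sketch, not from the book: a node with an ordered list
# of children, plus height and depth exactly as defined in the text.
class Node:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []   # ordered left to right

def height(n):
    # Length of a longest path from n down to a leaf.
    if not n.children:
        return 0
    return 1 + max(height(c) for c in n.children)

def depth(root, target, d=0):
    # Length of the unique path from the root to target (None if absent).
    if root is target:
        return d
    for c in root.children:
        r = depth(c, target, d + 1)
        if r is not None:
            return r
    return None

# The tree of Fig. 3.1; C1's sections are assumed for illustration.
s211, s212 = Node("s2.1.1"), Node("s2.1.2")
s21 = Node("s2.1", [s211, s212])
c1 = Node("C1", [Node("s1.1"), Node("s1.2")])
c2 = Node("C2", [s21, Node("s2.2"), Node("s2.3")])
c3 = Node("C3")
book = Node("Book", [c1, c2, c3])
```

As the text states, height(c1) is 1, height(c2) is 2, height(c3) is 0, and the height of the tree, height(book), is 3.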
`
`
`
`
`
`
The Order of Nodes

The children of a node are usually ordered from left-to-right. Thus the two trees of Fig. 3.2 are different because the two children of node a appear in a different order in the two trees. If we wish explicitly to ignore the order of children, we shall refer to a tree as an unordered tree.

Fig. 3.2. Two distinct (ordered) trees.

The "left-to-right" ordering of siblings (children of the same node) can be extended to compare any two nodes that are not related by the ancestor-descendant relationship. The relevant rule is that if a and b are siblings, and a is to the left of b, then all the descendants of a are to the left of all the descendants of b.

Example 3.2. Consider the tree in Fig. 3.3. Node 8 is to the right of node 2, to the left of nodes 9, 6, 10, 4, and 7, and neither left nor right of its ancestors 1, 3, and 5.

Fig. 3.3. A tree.

A simple rule, given a node n, for finding those nodes to its left and those to its right, is to draw the path from the root to n. All nodes branching off to the left of this path, and all descendants of such nodes, are to the left of n. All nodes and descendants of nodes branching off to the right are to the right of n. □
`
`
`
`
Preorder, Postorder, and Inorder
`
There are several useful ways in which we can systematically order all nodes of a tree. The three most important orderings are called preorder, inorder, and postorder; these orderings are defined recursively as follows.

• If a tree T is null, then the empty list is the preorder, inorder, and postorder listing of T.
• If T consists of a single node, then that node by itself is the preorder, inorder, and postorder listing of T.
• Otherwise, let T be a tree with root n and subtrees T1, T2, . . . , Tk, as suggested in Fig. 3.4.

Fig. 3.4. Tree T.
`
1. The preorder listing (or preorder traversal) of the nodes of T is the root n of T followed by the nodes of T1 in preorder, then the nodes of T2 in preorder, and so on, up to the nodes of Tk in preorder.
2. The inorder listing of the nodes of T is the nodes of T1 in inorder, followed by node n, followed by the nodes of T2, . . . , Tk, each group of nodes in inorder.
3. The postorder listing of the nodes of T is the nodes of T1 in postorder, then the nodes of T2 in postorder, and so on, up to Tk, all followed by node n.
`
`Figure 3.5(a) shows a sketch of a procedure to list the nodes of a tree in
`preorder. To make it a postorder procedure, we simply reverse the order of
`steps (1) and (2). Figure 3.5(b) is a sketch of an inorder procedure.
`In each
`case, we produce the desired ordering of the tree by calling the appropriate
`procedure on the root of the tree.
`
Example 3.3. Let us list the tree of Fig. 3.3 in preorder. We first list 1 and then call PREORDER on the first subtree of 1, the subtree with root 2. This subtree is a single node, so we simply list it. Then we proceed to the second subtree of 1, the tree rooted at 3. We list 3, and then call PREORDER on the first subtree of 3. That call results in listing 5, 8, and 9, in that order.
`
`
procedure PREORDER ( n: node );
    begin
(1)     list n;
(2)     for each child c of n, if any, in order from the left do
            PREORDER(c)
    end; { PREORDER }

(a) PREORDER procedure.

procedure INORDER ( n: node );
    begin
        if n is a leaf then
            list n
        else begin
            INORDER(leftmost child of n);
            list n;
            for each child c of n, except for the leftmost,
                    in order from the left do
                INORDER(c)
        end
    end; { INORDER }

(b) INORDER procedure.

Fig. 3.5. Recursive ordering procedures.
`
Continuing in this manner, we obtain the complete preorder traversal of Fig. 3.3: 1, 2, 3, 5, 8, 9, 6, 10, 4, 7.
Similarly, by simulating Fig. 3.5(a) with the steps reversed, we can discover that the postorder of Fig. 3.3 is 2, 8, 9, 5, 10, 6, 3, 7, 4, 1. By simulating Fig. 3.5(b), we find that the inorder listing of Fig. 3.3 is 2, 1, 8, 5, 9, 3, 10, 6, 7, 4. □
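The three recursive listings can be checked mechanically. Below is an illustrative Python sketch (ours; the book works in Pascal) that encodes the tree of Fig. 3.3 as nested (node, subtrees) pairs and reproduces the three orderings of Example 3.3.

```python
# Sketch (not the book's code): the three orderings defined in the text,
# for a tree represented as (node, [subtrees]).
def preorder(t):
    n, subtrees = t
    out = [n]
    for s in subtrees:          # root first, then each subtree in preorder
        out += preorder(s)
    return out

def postorder(t):
    n, subtrees = t
    out = []
    for s in subtrees:          # each subtree in postorder, root last
        out += postorder(s)
    return out + [n]

def inorder(t):
    n, subtrees = t
    if not subtrees:
        return [n]
    out = inorder(subtrees[0]) + [n]   # first subtree, then the root
    for s in subtrees[1:]:             # then the remaining subtrees
        out += inorder(s)
    return out

# The tree of Fig. 3.3: 1 has children 2, 3, 4; 3 has children 5, 6;
# 5 has children 8, 9; 6 has child 10; 4 has child 7.
fig33 = (1, [(2, []),
             (3, [(5, [(8, []), (9, [])]),
                  (6, [(10, [])])]),
             (4, [(7, [])])])
```

Running the three functions on fig33 yields exactly the listings derived in Example 3.3.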
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
A useful trick for producing the three node orderings is the following. Imagine we walk around the outside of the tree, starting at the root, moving counterclockwise, and staying as close to the tree as possible; the path we have in mind for Fig. 3.3 is shown in Fig. 3.6.

Fig. 3.6. Traversal of a tree.

For preorder, we list a node the first time we pass it. For postorder, we list a node the last time we pass it, as we move up to its parent. For inorder, we list a leaf the first time we pass it, but list an interior node the second time we pass it. For example, node 1 in Fig. 3.6 is passed the first time at the beginning, and the second time while passing through the "bay" between nodes 2 and 3. Note that the order of the leaves in the three orderings is always the same left-to-right ordering of the leaves; it is only the ordering of the interior nodes and their relationship to the leaves that vary among the three. □
`
`Labeled Trees and Expression Trees
`
`
Often it is useful to associate a label, or value, with each node of a tree, in the same spirit with which we associated a value with a list element in the previous chapter. That is, the label of a node is not the name of the node, but a value that is "stored" at the node. In some applications we shall even change the label of a node, while the name of a node remains the same. A useful analogy is tree:list = label:element = node:position.
`
Example 3.4. Figure 3.7 shows a labeled tree representing the arithmetic expression (a+b) * (a+c), where n1, . . . , n7 are the names of the nodes, and the labels, by convention, are shown next to the nodes. The rules whereby a labeled tree represents an expression are as follows:
1. Every leaf is labeled by an operand and consists of that operand alone. For example, node n4 represents the expression a.
2. Every interior node n is labeled by an operator. Suppose n is labeled by a binary operator θ, such as + or *, and that the left child represents expression E1 and the right child E2. Then n represents expression (E1) θ (E2). We may remove the parentheses if they are not necessary. For example, node n2 has operator +, and its left and right children represent the expressions a and b, respectively. Therefore, n2 represents (a)+(b), or just a+b. Node n1 represents (a+b)*(a+c), since * is the label
`
`
at n1, and a+b and a+c are the expressions represented by n2 and n3, respectively. □
`
`
`
Fig. 3.7. Expression tree with labels.
`
Often, when we produce the preorder, inorder, or postorder listing of a tree, we prefer to list not the node names, but rather the labels. In the case of an expression tree, the preorder listing of the labels gives us what is known as the prefix form of an expression, where the operator precedes its left operand and its right operand. To be precise, the prefix expression for a single operand a is a itself. The prefix expression for (E1) θ (E2), with θ a binary operator, is θP1P2, where P1 and P2 are the prefix expressions for E1 and E2. Note that no parentheses are necessary in the prefix expression, since we can scan the prefix expression θP1P2 and uniquely identify P1 as the shortest (and only) prefix of P1P2 that is a legal prefix expression.
For example, the preorder listing of the labels of Fig. 3.7 is *+ab+ac. The prefix expression for n2, which is +ab, is the shortest legal prefix of +ab+ac.
Similarly, a postorder listing of the labels of an expression tree gives us what is known as the postfix (or Polish) representation of an expression. The expression (E1) θ (E2) is represented by the postfix expression P1P2θ, where P1 and P2 are the postfix representations of E1 and E2, respectively. Again, no parentheses are necessary in the postfix representation, as we can deduce what P2 is by looking for the shortest suffix of P1P2 that is a legal postfix expression. For example, the postfix expression for Fig. 3.7 is ab+ac+*. If we write this expression as P1P2*, then P2 is ac+, the shortest suffix of ab+ac+ that is a legal postfix expression.
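A small sketch makes the prefix and postfix constructions concrete. This Python fragment is ours (not the book's Pascal); it lists the labels of an expression tree in preorder and postorder, and the tree below encodes Fig. 3.7, (a+b) * (a+c).

```python
# Sketch (ours): prefix and postfix listings of a labeled expression
# tree, represented as (label, [subtrees]).
def prefix(t):
    label, kids = t
    # operator first, then the prefix forms of its operands
    return label + "".join(prefix(k) for k in kids)

def postfix(t):
    label, kids = t
    # postfix forms of the operands first, then the operator
    return "".join(postfix(k) for k in kids) + label

# Fig. 3.7: (a+b) * (a+c)
expr37 = ("*", [("+", [("a", []), ("b", [])]),
                ("+", [("a", []), ("c", [])])])
```

On expr37, prefix produces *+ab+ac and postfix produces ab+ac+*, matching the two listings given in the text.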
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
The inorder traversal of an expression tree gives the infix expression itself, but with no parentheses. For example, the inorder listing of the labels of Fig. 3.7 is a+b * a+c. The reader is invited to provide an algorithm for traversing an expression tree and producing an infix expression with all needed pairs of parentheses.
`
`Computing Ancestral Information
`
The preorder and postorder traversals of a tree are useful in obtaining ancestral information. Suppose postorder(n) is the position of node n in a postorder listing of the nodes of a tree. Suppose desc(n) is the number of proper descendants of node n. For example, in the tree of Fig. 3.7 the postorder numbers of nodes n2, n4, and n5 are 3, 1, and 2, respectively.
The postorder numbers assigned to the nodes have the useful property that the nodes in the subtree with root n are numbered consecutively from postorder(n) − desc(n) to postorder(n). To test if a vertex x is a descendant of vertex y, all we need do is determine whether

postorder(y) − desc(y) ≤ postorder(x) ≤ postorder(y).
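The ancestor test can be sketched as follows. This Python rendering is ours (the book gives no code for it): annotate numbers a tree in postorder and counts proper descendants, and the tree below encodes Fig. 3.7 using the node names n1, . . . , n7.

```python
# Sketch (ours) of the ancestor test from the text.
def annotate(t):
    """Return postorder numbers and proper-descendant counts for
    every node of t = (name, [subtrees])."""
    post, desc = {}, {}
    counter = [0]
    def walk(u):
        name, kids = u
        size = 0                     # nodes in the subtrees of u
        for k in kids:
            size += walk(k)
        counter[0] += 1              # number u after all its descendants
        post[name] = counter[0]
        desc[name] = size            # proper descendants of u
        return size + 1              # total nodes in u's subtree
    walk(t)
    return post, desc

def is_descendant(x, y, post, desc):
    # x is a descendant of y iff x's postorder number lies in y's range.
    return post[y] - desc[y] <= post[x] <= post[y]

# Fig. 3.7 by node name: n1 has children n2, n3; n2 has n4, n5; n3 has n6, n7.
fig37 = ("n1", [("n2", [("n4", []), ("n5", [])]),
                ("n3", [("n6", []), ("n7", [])])])
post, desc = annotate(fig37)
```

As the text notes, the postorder numbers of n2, n4, and n5 come out as 3, 1, and 2, and every node is a descendant of itself.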
`
`A similar property holds for preorder.
`
`3.2 The ADT TREE
`
`
`
`
`
In Chapter 2, lists, stacks, queues, and mappings were treated as abstract data types (ADT's). In this chapter trees will be treated both as ADT's and as data structures. One of our most important uses of trees occurs in the design of implementations for the various ADT's we study. For example, in Section 5.1, we shall see how a "binary search tree" can be used to implement abstract data types based on the mathematical model of a set, together with operations such as INSERT, DELETE, and MEMBER (to test whether an element is in a set). The next two chapters present a number of other tree implementations of various ADT's.
In this section, we shall present several useful operations on trees and show how tree algorithms can be designed in terms of these operations. As with lists, there are a great variety of operations that can be performed on trees. Here, we shall consider the following operations:
1. PARENT(n, T). This function returns the parent of node n in tree T. If n is the root, which has no parent, Λ is returned. In this context, Λ is a "null node," which is used as a signal that we have navigated off the tree.
2. LEFTMOST_CHILD(n, T) returns the leftmost child of node n in tree T, and it returns Λ if n is a leaf, which therefore has no children.
3. RIGHT_SIBLING(n, T) returns the right sibling of node n in tree T, defined to be that node m with the same parent p as n such that m lies immediately to the right of n in the ordering of the children of p. For example, for the tree in Fig. 3.7, LEFTMOST_CHILD(n1) = n2, RIGHT_SIBLING(n4) = n5, and RIGHT_SIBLING(n5) = Λ.
`
`
4. LABEL(n, T) returns the label of node n in tree T. We do not, however, require labels to be defined for every tree.
5. CREATEi(v, T1, T2, . . . , Ti) is one of an infinite family of functions, one for each value of i = 0, 1, 2, . . . . CREATEi makes a new node r with label v and gives it i children, which are the roots of trees T1, T2, . . . , Ti, in order from the left. The tree with root r is returned. Note that if i = 0, then r is both a leaf and the root.
6. ROOT(T) returns the node that is the root of tree T, or Λ if T is the null tree.
7. MAKENULL(T) makes T be the null tree.
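The query operations above can be sketched as a small class. This Python sketch is ours (the book works in Pascal, and we omit CREATEi and MAKENULL for brevity); 0 plays the role of the null node Λ, and the tree built below is that of Fig. 3.7.

```python
# Sketch (ours) of the TREE query operations over a dictionary of
# ordered child lists; 0 stands for the null node Λ.
class Tree:
    def __init__(self, root, children, labels=None):
        self.root = root
        self.children = children        # node -> ordered list of children
        self.labels = labels or {}
        # derive the parent map from the child lists
        self.parent = {c: p for p, ks in children.items() for c in ks}

    def PARENT(self, n):
        return self.parent.get(n, 0)    # 0 (Λ) for the root

    def LEFTMOST_CHILD(self, n):
        ks = self.children.get(n, [])
        return ks[0] if ks else 0       # 0 (Λ) for a leaf

    def RIGHT_SIBLING(self, n):
        ks = self.children.get(self.PARENT(n), [])
        if n not in ks:                 # n is the root
            return 0
        i = ks.index(n)
        return ks[i + 1] if i + 1 < len(ks) else 0

    def LABEL(self, n):
        return self.labels.get(n)

    def ROOT(self):
        return self.root

# Fig. 3.7: n1 (*) has children n2 (+) and n3 (+); n2 has n4 (a), n5 (b);
# n3 has n6 (a), n7 (c).
fig37 = Tree("n1",
             {"n1": ["n2", "n3"], "n2": ["n4", "n5"], "n3": ["n6", "n7"]},
             {"n1": "*", "n2": "+", "n3": "+",
              "n4": "a", "n5": "b", "n6": "a", "n7": "c"})
```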
`
Example 3.5. Let us write both recursive and nonrecursive procedures to take a tree and list the labels of its nodes in preorder. We assume that there are data types node and TREE already defined for us, and that the data type TREE is for trees with labels of the type labeltype. Figure 3.8 shows a recursive procedure that, given node n, lists the labels of the subtree rooted at n in preorder. We call PREORDER(ROOT(T)) to get a preorder listing of tree T.
`
procedure PREORDER ( n: node );
    { list the labels of the descendants of n in preorder }
    var
        c: node;
    begin
        print(LABEL(n, T));
        c := LEFTMOST_CHILD(n, T);
        while c <> Λ do begin
            PREORDER(c);
            c := RIGHT_SIBLING(c, T)
        end
    end; { PREORDER }

Fig. 3.8. A recursive preorder listing procedure.
`
We shall also develop a nonrecursive procedure to print a tree in preorder. To find our way around the tree, we shall use a stack S, whose type STACK is really "stack of nodes." The basic idea underlying our algorithm is that when we are at a node n, the stack will hold the path from the root to n, with the root at the bottom of the stack and node n at the top.†
`
† Recall our discussion of recursion in Section 2.6 in which we illustrated how the implementation of a recursive procedure involves a stack of activation records. If we examine Fig. 3.8, we can observe that when PREORDER(n) is called, the active procedure calls, and therefore the stack of activation records, correspond to the calls of PREORDER for all the ancestors of n. Thus our nonrecursive preorder procedure, like the example in Section 2.6, models closely the way the recursive procedure is implemented.
`
`
`
`
One way to perform a nonrecursive preorder traversal of a tree is given by the program NPREORDER shown in Fig. 3.9. This program has two modes of operation. In the first mode it descends down the leftmost unexplored path in the tree, printing and stacking the nodes along the path, until it reaches a leaf.
The program then enters the second mode of operation in which it retreats back up the stacked path, popping the nodes of the path off the stack, until it encounters a node on the path with a right sibling. The program then reverts back to the first mode of operation, starting the descent from that unexplored right sibling.
The program begins in mode one at the root and terminates when the stack becomes empty. The complete program is shown in Fig. 3.9.
`
`3.3 Implementations of Trees
`
In this section we shall present several basic implementations for trees and discuss their capabilities for supporting the various tree operations introduced in Section 3.2.
`
`An Array Representation of Trees
`
`
`
Let T be a tree in which the nodes are named 1, 2, . . . , n. Perhaps the simplest representation of T that supports the PARENT operation is a linear array A in which entry A[i] is a pointer or a cursor to the parent of node i. The root of T can be distinguished by giving it a null pointer or a pointer to itself as parent. In Pascal, pointers to array elements are not feasible, so we shall have to use a cursor scheme where A[i] = j if node j is the parent of node i, and A[i] = 0 if node i is the root.
This representation uses the property of trees that each node has a unique parent. With this representation the parent of a node can be found in constant time. A path going up the tree, that is, from node to parent to parent, and so on, can be traversed in time proportional to the number of nodes on the path. We can also support the LABEL operator by adding another array L, such that L[i] is the label of node i, or by making the elements of array A be records consisting of an integer (cursor) and a label.
`
Example 3.6. The tree of Fig. 3.10(a) has the parent representation given by the array A shown in Fig. 3.10(b). □
`
The parent pointer representation does not facilitate operations that require child-of information. Given a node n, it is expensive to determine the children of n, or the height of n. In addition, the parent pointer representation does not specify the order of the children of a node. Thus, operations like LEFTMOST_CHILD and RIGHT_SIBLING are not well defined. We could impose an artificial order, for example, by numbering the children of each node after numbering the parent, and numbering the children in
`
`
procedure NPREORDER ( T: TREE );
    { nonrecursive preorder traversal of tree T }
    var
        m: node;    { a temporary }
        S: STACK;   { stack of nodes holding path from the root
                      to the parent TOP(S) of the "current" node m }
    begin
        { initialize }
        MAKENULL(S);
        m := ROOT(T);
        while true do
            if m <> Λ then begin
                print(LABEL(m, T));
                PUSH(m, S);
                { explore leftmost child of m }
                m := LEFTMOST_CHILD(m, T)
            end
            else begin
                { exploration of path on stack is now complete }
                if EMPTY(S) then
                    return;
                { explore right sibling of node on top of stack }
                m := RIGHT_SIBLING(TOP(S), T);
                POP(S)
            end
    end; { NPREORDER }

Fig. 3.9. A nonrecursive preorder procedure.
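The two-mode behavior of NPREORDER can be mirrored in Python. This is an illustrative sketch of ours: Λ becomes None, and the tree is a dictionary of ordered child lists rather than the ADT of Section 3.2.

```python
# Sketch (ours) of a nonrecursive preorder traversal with an explicit
# stack, in the style of NPREORDER.
def npreorder(root, children):
    """children: node -> ordered list of children. Returns the preorder list."""
    def leftmost_child(n):
        ks = children.get(n, [])
        return ks[0] if ks else None          # None plays the role of Λ

    def right_sibling(n, parent):
        ks = children.get(parent, [])
        i = ks.index(n)
        return ks[i + 1] if i + 1 < len(ks) else None

    out, stack, m = [], [], root
    while True:
        if m is not None:
            # mode 1: descend the leftmost unexplored path,
            # listing and stacking nodes as we go
            out.append(m)
            stack.append(m)
            m = leftmost_child(m)
        else:
            # mode 2: retreat up the stacked path
            if not stack:
                return out
            top = stack.pop()
            parent = stack[-1] if stack else None
            m = right_sibling(top, parent) if parent is not None else None

# The tree of Fig. 3.3 as child lists.
fig33 = {1: [2, 3, 4], 3: [5, 6], 5: [8, 9], 6: [10], 4: [7]}
```

On the tree of Fig. 3.3 this produces the preorder listing 1, 2, 3, 5, 8, 9, 6, 10, 4, 7 of Example 3.3.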
`
increasing order from left to right. On that assumption, we have written the function RIGHT_SIBLING in Fig. 3.11, for types node and TREE that are defined as follows:
`
`
`
`
`
type
    node = integer;
    TREE = array [1..maxnodes] of node;

For this implementation we assume the null node Λ is represented by 0.
`
`
`
`
`
`
`
`
`
(a) a tree [drawing not reproduced]

node: 1 2 3 4 5 6 7 8 9 10
A:    0 1 1 2 2 5 5 5 3 3

(b) parent representation.

Fig. 3.10. A tree and its parent pointer representation.
`
function RIGHT_SIBLING ( n: node; T: TREE ): node;
    { return the right sibling of node n in tree T }
    var
        i, parent: node;
    begin
        parent := T[n];
        for i := n + 1 to maxnodes do
            { search for node after n with same parent }
            if T[i] = parent then
                return (i);
        return (0)  { null node will be returned
                      if no right sibling is ever found }
    end; { RIGHT_SIBLING }

Fig. 3.11. Right sibling operation using array representation.
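For comparison, here is the same operation in Python over the parent array of Fig. 3.10. This is a sketch of ours mirroring Fig. 3.11; index 0 of the list is unused, and 0 is the null node.

```python
# Sketch (ours) of RIGHT_SIBLING over a parent array: A[i] is the
# parent of node i, A[i] = 0 for the root; siblings are ordered by
# node number, as assumed in the text.
def right_sibling(n, A):
    parent = A[n]
    for i in range(n + 1, len(A)):
        if A[i] == parent:      # first later node with the same parent
            return i
    return 0                    # null node: no right sibling

# The tree of Fig. 3.10: A[1..10] = 0 1 1 2 2 5 5 5 3 3 (index 0 unused).
A = [None, 0, 1, 1, 2, 2, 5, 5, 5, 3, 3]
```

For example, right_sibling(6, A) is 7, since nodes 6, 7, and 8 are all children of node 5, while right_sibling(8, A) is 0.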
`
`
`
`Representation of Trees by Lists of Children
`
An important and useful way of representing trees is to form for each node a list of its children. The lists can be represented by any of the methods suggested in Chapter 2, but because the number of children each node may have can be variable, the linked-list representations are often more appropriate.
Figure 3.12 suggests how the tree of Fig. 3.10(a) might be represented. There is an array of header cells, indexed by nodes, which we assume to be numbered 1, 2, . . . , 10. Each header points to a linked list of "elements," which are nodes. The elements on the list headed by header[i] are the children of node i; for example, 9 and 10 are the children of 3.
`
Fig. 3.12. A linked-list representation of a tree.
`
`
`Let us first develop the data structures we need in terms of an abstract
`
`data type LIST (of nodes), and then give a particular implementation of lists
`
`and see how the abstractions fit together. Later, we shall see some of the
`simplifications we can make. We begin with the following type declarations:
`
`
`
`
`
`
type
    node = integer;
    LIST = { appropriate definition for list of nodes };
    position = { appropriate definition for positions in lists };
    TREE = record
        header: array [1..maxnodes] of LIST;
        labels: array [1..maxnodes] of labeltype;
        root: node
    end;
`
`
`
`
`
`
`
`
`
`
We assume that the root of each tree is stored explicitly in the root field. Also, 0 is used to represent the null node.
Figure 3.13 shows the code for the LEFTMOST_CHILD operation. The reader should write the code for the other operations as exercises.
`
`
function LEFTMOST_CHILD ( n: node; T: TREE ): node;
    { returns the leftmost child of node n of tree T }
    var
        L: LIST;    { shorthand for the list of n's children }
    begin
        L := T.header[n];
        if EMPTY(L) then { n is a leaf }
            return (0)
        else
            return (RETRIEVE(FIRST(L), L))
    end; { LEFTMOST_CHILD }

Fig. 3.13. Function to find leftmost child.
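In Python, the lists-of-children scheme and LEFTMOST_CHILD reduce to a few lines. This is an illustrative sketch of ours, using a dictionary of ordered child lists in place of the LIST ADT; 0 is the null node, and the child lists below are those implied by the parent array of Fig. 3.10.

```python
# Sketch (ours) of lists-of-children: header[n] holds the children of
# node n, in left-to-right order.
def leftmost_child(n, header):
    kids = header.get(n, [])
    return kids[0] if kids else 0    # 0 is the null node

# The tree of Fig. 3.10(a)/3.12: 1 -> 2 3, 2 -> 4 5, 5 -> 6 7 8, 3 -> 9 10.
header = {1: [2, 3], 2: [4, 5], 5: [6, 7, 8], 3: [9, 10]}
```

For example, leftmost_child(3, header) is 9, and leftmost_child(4, header) is 0 because node 4 is a leaf.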
`
`
Now let us choose a particular implementation of lists, in which both LIST and position are integers, used as cursors into an array cellspace of records:

var
    cellspace: array [1..maxnodes] of record
        node: integer;
        next: integer
    end;
`
To simplify, we shall not insist that lists of children have header cells. Rather, we shall let T.header[n] point directly to the first cell of the list, as is suggested by Fig. 3.12. Figure 3.14(a) shows the function LEFTMOST_CHILD of Fig. 3.13 rewritten for this specific implementation. Figure 3.14(b) shows the operator PARENT, which is more difficult to write using this representation of lists, since a search of all lists is required to determine on