`Data Structures
`
`and Algorithms
`
`ALFRED V. AHO
`
`Bell Laboratories
`Murray Hill, New Jersey
`
`JOHN E. HOPCROFT
`
`Cornell University
Ithaca, New York
`
`JEFFREY D. ULLMAN
`
`Stanford University
`Stanford, California
`
`
`
`
`
`
`
`
ADDISON-WESLEY PUBLISHING COMPANY
Reading, Massachusetts • Menlo Park, California • Don Mills, Ontario • Sydney
`
`
`
`This book is in the
`ADDISON-WESLEY SERIES IN
`COMPUTER SCIENCE AND INFORMATION PROCESSING
`
`Michael A. Harrison
`Consulting Editor
`
`
`
`
`
`
`
`
`
Library of Congress Cataloging in Publication Data

Aho, Alfred V.
Data structures and algorithms.
1. Data structures (Computer science)  2. Algorithms.
I. Hopcroft, John E., 1939-  .  II. Ullman, Jeffrey D., 1942-  .  III. Title.
QA76.9.D35A38  1982  001.64  82-11596
ISBN 0-201-00023-7
`
`
`
`
`
`Reproduced by Addison-Wesley from camera-ready copy supplied by the authors.
`
`Reprinted with corrections April, 1987
`Copyright © 1983 by Bell Telephone Laboratories, Incorporated.
`
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior written permission of the publisher. Printed in the United States of America. Published simultaneously in Canada.
`
`
`
Contents

Chapter 1  Design and Analysis of Algorithms
1.1  From Problems to Programs .......... 1
1.2  Abstract Data Types .......... 10
1.3  Data Types, Data Structures, and Abstract Data Types .......... 13
1.4  The Running Time of a Program .......... 16
1.5  Calculating the Running Time of a Program .......... 21
1.6  Good Programming Practice .......... 27
1.7  Super Pascal .......... 29

Chapter 2  Basic Data Types
2.1  The Data Type "List" .......... 37
2.2  Implementation of Lists .......... 40
2.3  Stacks .......... 53
2.4  Queues .......... 56
2.5  Mappings .......... 61
2.6  Stacks and Recursive Procedures .......... 64

Chapter 3  Trees
3.1  Basic Terminology .......... 75
3.2  The ADT TREE .......... 82
3.3  Implementations of Trees .......... 84
3.4  Binary Trees .......... 93

Chapter 4  Basic Operations on Sets
4.1  Introduction to Sets .......... 107
4.2  An ADT with Union, Intersection, and Difference .......... 109
4.3  A Bit-Vector Implementation of Sets .......... 112
4.4  A Linked-List Implementation of Sets .......... 115
4.5  The Dictionary .......... 117
4.6  Simple Dictionary Implementations .......... 119
4.7  The Hash Table Data Structure .......... 122
4.8  Estimating the Efficiency of Hash Functions .......... 129
4.9  Implementation of the Mapping ADT .......... 135
4.10  Priority Queues .......... 135
4.11  Implementations of Priority Queues .......... 138
4.12  Some Complex Set Structures .......... 145

Chapter 5  Advanced Set Representation Methods
5.1  Binary Search Trees .......... 155
5.2  Time Analysis of Binary Search Tree Operations .......... 160
5.3  Tries .......... 163
5.4  Balanced Tree Implementations of Sets .......... 169
5.5  Sets with the MERGE and FIND Operations .......... 180
5.6  An ADT with MERGE and SPLIT .......... 189

Chapter 6  Directed Graphs
6.1  Basic Definitions .......... 198
6.2  Representations for Directed Graphs .......... 199
6.3  The Single-Source Shortest Paths Problem .......... 203
6.4  The All-Pairs Shortest Path Problem .......... 208
6.5  Traversals of Directed Graphs .......... 215
6.6  Directed Acyclic Graphs .......... 219
6.7  Strong Components .......... 222

Chapter 7  Undirected Graphs
7.1  Definitions .......... 230
7.2  Minimum-Cost Spanning Trees .......... 233
7.3  Traversals .......... 239
7.4  Articulation Points and Biconnected Components .......... 244
7.5  Graph Matching .......... 246

Chapter 8  Sorting
8.1  The Internal Sorting Model .......... 253
8.2  Some Simple Sorting Schemes .......... 254
8.3  Quicksort .......... 260
8.4  Heapsort .......... 271
8.5  Bin Sorting .......... 274
8.6  A Lower Bound for Sorting by Comparisons .......... 282
8.7  Order Statistics .......... 286

Chapter 9  Algorithm Analysis Techniques
9.1  Efficiency of Algorithms .......... 293
9.2  Analysis of Recursive Programs .......... 294
9.3  Solving Recurrence Equations .......... 296
9.4  A General Solution for a Large Class of Recurrences .......... 298

Chapter 10  Algorithm Design Techniques
10.1  Divide-and-Conquer Algorithms .......... 306
10.2  Dynamic Programming .......... 311
10.3  Greedy Algorithms .......... 321
10.4  Backtracking .......... 324
10.5  Local Search Algorithms .......... 336

Chapter 11  Data Structures and Algorithms for External Storage
11.1  A Model of External Computation .......... 347
11.2  External Sorting .......... 349
11.3  Storing Information in Files .......... 361
11.4  External Search Trees .......... 368

Chapter 12  Memory Management
12.1  The Issues in Memory Management .......... 378
12.2  Managing Equal-Sized Blocks .......... 382
12.3  Garbage Collection Algorithms for Equal-Sized Blocks .......... 384
12.4  Storage Allocation for Objects with Mixed Sizes .......... 392
12.5  Buddy Systems .......... 400
12.6  Storage Compaction .......... 404

Bibliography .......... 411
Index .......... 419
`
`
`
`
`
`
`
`
`
`
`
`
`
`
CHAPTER 3

Trees

A tree imposes a hierarchical structure on a collection of items. Familiar examples of trees are genealogies and organization charts. Trees are used to help analyze electrical circuits and to represent the structure of mathematical formulas. Trees also arise naturally in many different areas of computer science. For example, trees are used to organize information in database systems and to represent the syntactic structure of source programs in compilers. Chapter 5 describes applications of trees in the representation of data. Throughout this book, we shall use many different variants of trees. In this chapter we introduce the basic definitions and present some of the more common tree operations. We then describe some of the more frequently used data structures for trees that can be used to support these operations efficiently.
`
`3.1 Basic Terminology
`
A tree is a collection of elements called nodes, one of which is distinguished as a root, along with a relation ("parenthood") that places a hierarchical structure on the nodes. A node, like an element of a list, can be of whatever type we wish. We often depict a node as a letter, a string, or a number with a circle around it. Formally, a tree can be defined recursively in the following manner.

1. A single node by itself is a tree. This node is also the root of the tree.
2. Suppose n is a node and T1, T2, ..., Tk are trees with roots n1, n2, ..., nk, respectively. We can construct a new tree by making n be the parent of nodes n1, n2, ..., nk. In this tree n is the root and T1, T2, ..., Tk are the subtrees of the root. Nodes n1, n2, ..., nk are called the children of node n.
`
`
`
`
`
`
`
`
`
`
Sometimes, it is convenient to include among trees the null tree, a "tree" with no nodes, which we shall represent by Λ.
`
Example 3.1. Consider the table of contents of a book, as suggested by Fig. 3.1(a). This table of contents is a tree. We can redraw it in the manner shown in Fig. 3.1(b). The parent-child relationship is depicted by a line. Trees are normally drawn top-down as in Fig. 3.1(b), with the parent above the child.

The root, the node called "Book," has three subtrees with roots corresponding to the chapters C1, C2, and C3. This relationship is represented by the lines downward from Book to C1, C2, and C3. Book is the parent of C1, C2, and C3, and these three nodes are the children of Book.
`
`
`
[Fig. 3.1(a) shows a table of contents for a book with chapters C1, C2, and C3 and sections s1.1, s1.2, s2.1, s2.1.1, s2.1.2, s2.2, and s2.3; Fig. 3.1(b) shows the same structure drawn as a tree rooted at "Book."]

Fig. 3.1. A table of contents and its tree representation.
`
`
The third subtree, with root C3, is a tree of a single node, while the other two subtrees have a nontrivial structure. For example, the subtree with root C2 has three subtrees, corresponding to the sections s2.1, s2.2, and s2.3; the last two are one-node trees, while the first has two subtrees corresponding to the subsections s2.1.1 and s2.1.2. □

Example 3.1 is typical of one kind of data that is best represented as a tree. In this example, the parenthood relationship stands for containment; a parent node is comprised of its children, as Book is comprised of C1, C2, and C3. Throughout this book we shall encounter a variety of other relationships that can be represented by parenthood in trees.
If n1, n2, ..., nk is a sequence of nodes in a tree such that ni is the parent of ni+1 for 1 ≤ i < k, then this sequence is called a path from node n1 to node nk. The length of a path is one less than the number of nodes in the path. Thus there is a path of length zero from every node to itself. For example, in Fig. 3.1 there is a path of length two, namely (C2, s2.1, s2.1.2), from C2 to s2.1.2.

If there is a path from node a to node b, then a is an ancestor of b, and b is a descendant of a. For example, in Fig. 3.1, the ancestors of s2.1 are itself, C2, and Book, while its descendants are itself, s2.1.1, and s2.1.2. Notice that any node is both an ancestor and a descendant of itself.

An ancestor or descendant of a node, other than the node itself, is called a proper ancestor or proper descendant, respectively. In a tree, the root is the only node with no proper ancestors. A node with no proper descendants is called a leaf. A subtree of a tree is a node, together with all its descendants.

The height of a node in a tree is the length of a longest path from the node to a leaf. In Fig. 3.1 node C1 has height 1, node C2 height 2, and node C3 height 0. The height of a tree is the height of the root. The depth of a node is the length of the unique path from the root to that node.
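These definitions translate directly into code. The following sketch is in modern Python rather than the book's Super Pascal; the children table encodes the table-of-contents tree of Fig. 3.1 as described in Example 3.1.

```python
# Height and depth per the definitions above, computed for the
# table-of-contents tree of Fig. 3.1 (Example 3.1).
CHILDREN = {
    'Book': ['C1', 'C2', 'C3'],
    'C1': ['s1.1', 's1.2'],
    'C2': ['s2.1', 's2.2', 's2.3'],
    'C3': [],
    's1.1': [], 's1.2': [],
    's2.1': ['s2.1.1', 's2.1.2'],
    's2.2': [], 's2.3': [],
    's2.1.1': [], 's2.1.2': [],
}

def height(n):
    """Length of a longest path from n down to a leaf."""
    kids = CHILDREN[n]
    return 0 if not kids else 1 + max(height(c) for c in kids)

def depth(n, root='Book'):
    """Length of the unique path from the root to n."""
    if n == root:
        return 0
    for p, kids in CHILDREN.items():   # find n's unique parent
        if n in kids:
            return 1 + depth(p)
    raise ValueError('node not in tree')

print(height('C1'), height('C2'), height('C3'))  # 1 2 0, as in the text
print(depth('s2.1.2'))                           # 3
```

Note that the heights of C1, C2, and C3 agree with the values worked out above, and that the depth of s2.1.2 is one more than the length-two path from C2.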
`
`
`
The Order of Nodes

The children of a node are usually ordered from left-to-right. Thus the two trees of Fig. 3.2 are different because the two children of node a appear in a different order in the two trees. If we wish explicitly to ignore the order of children, we shall refer to a tree as an unordered tree.

Fig. 3.2. Two distinct (ordered) trees.

The "left-to-right" ordering of siblings (children of the same node) can be extended to compare any two nodes that are not related by the ancestor-descendant relationship. The relevant rule is that if a and b are siblings, and a is to the left of b, then all the descendants of a are to the left of all the descendants of b.

Example 3.2. Consider the tree in Fig. 3.3. Node 8 is to the right of node 2, to the left of nodes 9, 6, 10, 4, and 7, and neither left nor right of its ancestors 1, 3, and 5.

Fig. 3.3. A tree.

A simple rule, given a node n, for finding those nodes to its left and those to its right, is to draw the path from the root to n. All nodes branching off to the left of this path, and all descendants of such nodes, are to the left of n. All nodes and descendants of nodes branching off to the right are to the right of n. □
`
Preorder, Postorder, and Inorder

There are several useful ways in which we can systematically order all nodes of a tree. The three most important orderings are called preorder, inorder and postorder; these orderings are defined recursively as follows.

• If a tree T is null, then the empty list is the preorder, inorder and postorder listing of T.
• If T consists of a single node, then that node by itself is the preorder, inorder, and postorder listing of T.

Otherwise, let T be a tree with root n and subtrees T1, T2, ..., Tk, as suggested in Fig. 3.4.

Fig. 3.4. Tree T with root n and subtrees T1, T2, ..., Tk.

1. The preorder listing (or preorder traversal) of the nodes of T is the root n of T followed by the nodes of T1 in preorder, then the nodes of T2 in preorder, and so on, up to the nodes of Tk in preorder.
2. The inorder listing of the nodes of T is the nodes of T1 in inorder, followed by node n, followed by the nodes of T2, ..., Tk, each group of nodes in inorder.
3. The postorder listing of the nodes of T is the nodes of T1 in postorder, then the nodes of T2 in postorder, and so on, up to Tk, all followed by node n.

Figure 3.5(a) shows a sketch of a procedure to list the nodes of a tree in preorder. To make it a postorder procedure, we simply reverse the order of steps (1) and (2). Figure 3.5(b) is a sketch of an inorder procedure. In each case, we produce the desired ordering of the tree by calling the appropriate procedure on the root of the tree.
`
Example 3.3. Let us list the tree of Fig. 3.3 in preorder. We first list 1 and then call PREORDER on the first subtree of 1, the subtree with root 2. This subtree is a single node, so we simply list it. Then we proceed to the second subtree of 1, the tree rooted at 3. We list 3, and then call PREORDER on the first subtree of 3. That call results in listing 5, 8, and 9, in that order.
`
`
A useful trick for producing the three node orderings is the following. Imagine we walk around the outside of the tree, starting at the root, moving counterclockwise, and staying as close to the tree as possible; the path we have in mind for Fig. 3.3 is shown in Fig. 3.6.

For preorder, we list a node the first time we pass it. For postorder, we list a node the last time we pass it, as we move up to its parent. For inorder, we list a leaf the first time we pass it, but list an interior node the second time we pass it. For example, node 1 in Fig. 3.6 is passed the first time at the beginning, and the second time while passing through the "bay" between nodes 2 and 3. Note that the order of the leaves in the three orderings is
`
procedure PREORDER ( n: node );
    begin
(1)     list n;
(2)     for each child c of n, if any, in order from the left do
            PREORDER(c)
    end; { PREORDER }

(a) PREORDER procedure.

procedure INORDER ( n: node );
    begin
        if n is a leaf then
            list n
        else begin
            INORDER(leftmost child of n);
            list n;
            for each child c of n, except for the leftmost,
                    in order from the left do
                INORDER(c)
        end
    end; { INORDER }

(b) INORDER procedure.

Fig. 3.5. Recursive ordering procedures.
`
Continuing in this manner, we obtain the complete preorder traversal of Fig. 3.3: 1, 2, 3, 5, 8, 9, 6, 10, 4, 7.

Similarly, by simulating Fig. 3.5(a) with the steps reversed, we can discover that the postorder of Fig. 3.3 is 2, 8, 9, 5, 10, 6, 3, 7, 4, 1. By simulating Fig. 3.5(b), we find that the inorder listing of Fig. 3.3 is 2, 1, 8, 5, 9, 3, 10, 6, 7, 4. □
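The three orderings can also be checked mechanically. Below is a sketch in modern Python (not the book's Pascal); the children table encodes the tree of Fig. 3.3 as it can be reconstructed from this example.

```python
# Tree of Fig. 3.3 as a table of children lists (reconstructed from
# Example 3.3): root 1 has children 2, 3, 4; 3 has 5, 6; 5 has 8, 9;
# 6 has 10; 4 has 7.
CHILDREN = {1: [2, 3, 4], 2: [], 3: [5, 6], 4: [7],
            5: [8, 9], 6: [10], 7: [], 8: [], 9: [], 10: []}

def preorder(n):
    """Root first, then each subtree in order from the left."""
    out = [n]
    for c in CHILDREN[n]:
        out += preorder(c)
    return out

def postorder(n):
    """Each subtree in order from the left, then the root."""
    out = []
    for c in CHILDREN[n]:
        out += postorder(c)
    return out + [n]

def inorder(n):
    """Leftmost subtree, then the root, then the remaining subtrees."""
    kids = CHILDREN[n]
    if not kids:
        return [n]
    out = inorder(kids[0]) + [n]
    for c in kids[1:]:
        out += inorder(c)
    return out

print(preorder(1))   # [1, 2, 3, 5, 8, 9, 6, 10, 4, 7]
print(postorder(1))  # [2, 8, 9, 5, 10, 6, 3, 7, 4, 1]
print(inorder(1))    # [2, 1, 8, 5, 9, 3, 10, 6, 7, 4]
```

The three printed lists agree with the orderings derived by hand in Example 3.3.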
`
Fig. 3.6. Traversal of a tree.
`
always the same left-to-right ordering of the leaves. It is only the ordering of the interior nodes and their relationship to the leaves that vary among the three. □
`
Labeled Trees and Expression Trees

Often it is useful to associate a label, or value, with each node of a tree, in the same spirit with which we associated a value with a list element in the previous chapter. That is, the label of a node is not the name of the node, but a value that is "stored" at the node. In some applications we shall even change the label of a node, while the name of a node remains the same. A useful analogy is tree:list = label:element = node:position.
`
Example 3.4. Figure 3.7 shows a labeled tree representing the arithmetic expression (a+b) * (a+c), where n1, ..., n7 are the names of the nodes, and the labels, by convention, are shown next to the nodes. The rules whereby a labeled tree represents an expression are as follows:

1. Every leaf is labeled by an operand and consists of that operand alone. For example, node n4 represents the expression a.
2. Every interior node n is labeled by an operator. Suppose n is labeled by a binary operator θ, such as + or *, and that the left child represents expression E1 and the right child E2. Then n represents expression (E1) θ (E2). We may remove the parentheses if they are not necessary. For example, node n2 has operator +, and its left and right children represent the expressions a and b, respectively. Therefore, n2 represents (a)+(b), or just a+b. Node n1 represents (a+b)*(a+c), since * is the label
`
at n1, and a+b and a+c are the expressions represented by n2 and n3, respectively. □
`
`
`
Fig. 3.7. Expression tree with labels.
`
Often, when we produce the preorder, inorder, or postorder listing of a tree, we prefer to list not the node names, but rather the labels. In the case of an expression tree, the preorder listing of the labels gives us what is known as the prefix form of an expression, where the operator precedes its left operand and its right operand. To be precise, the prefix expression for a single operand a is a itself. The prefix expression for (E1) θ (E2), with θ a binary operator, is θP1P2, where P1 and P2 are the prefix expressions for E1 and E2. Note that no parentheses are necessary in the prefix expression, since we can scan the prefix expression θP1P2 and uniquely identify P1 as the shortest (and only) prefix of P1P2 that is a legal prefix expression.

For example, the preorder listing of the labels of Fig. 3.7 is *+ab+ac. The prefix expression for n2, which is +ab, is the shortest legal prefix of +ab+ac.

Similarly, a postorder listing of the labels of an expression tree gives us what is known as the postfix (or Polish) representation of an expression. The expression (E1) θ (E2) is represented by the postfix expression P1P2θ, where P1 and P2 are the postfix representations of E1 and E2, respectively. Again, no parentheses are necessary in the postfix representation, as we can deduce what P2 is by looking for the shortest suffix of P1P2 that is a legal postfix expression.

For example, the postfix expression for Fig. 3.7 is ab+ac+*. If we write this expression as P1P2*, then P2 is ac+, the shortest suffix of ab+ac+ that is a legal postfix expression.
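Both forms fall out of the corresponding traversals of the labels. The following sketch is in modern Python, not the book's Pascal; the tuple encoding of the expression tree of Fig. 3.7 is an illustrative choice of ours.

```python
# Expression tree for (a+b)*(a+c), per Fig. 3.7, encoded as
# (label, children) tuples; leaves have empty child lists.
def tree(label, *children):
    return (label, list(children))

expr = tree('*', tree('+', tree('a'), tree('b')),
                 tree('+', tree('a'), tree('c')))

def prefix(t):
    """Preorder listing of the labels: operator before its operands."""
    label, children = t
    return label + ''.join(prefix(c) for c in children)

def postfix(t):
    """Postorder listing of the labels: operator after its operands."""
    label, children = t
    return ''.join(postfix(c) for c in children) + label

print(prefix(expr))   # *+ab+ac
print(postfix(expr))  # ab+ac+*
```

The two printed strings are exactly the prefix and postfix expressions derived in the text.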
`
The inorder traversal of an expression tree gives the infix expression itself, but with no parentheses. For example, the inorder listing of the labels of Fig. 3.7 is a+b * a+c. The reader is invited to provide an algorithm for traversing an expression tree and producing an infix expression with all needed pairs of parentheses.
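One possible starting point for this exercise, sketched here in Python with an illustrative tuple encoding of the tree, is simply to parenthesize every interior node during the inorder walk; this is safe but emits some redundant parentheses, and dropping only the unneeded ones requires comparing operator precedences.

```python
def infix(t):
    """Inorder walk that parenthesizes every interior (operator) node.
    Fully parenthesized output is always correct, though some pairs
    of parentheses may be unnecessary."""
    label, children = t
    if not children:            # leaf: an operand
        return label
    left, right = children      # binary operator node
    return '(' + infix(left) + label + infix(right) + ')'

# Expression tree for (a+b)*(a+c), as in Fig. 3.7.
expr = ('*', [('+', [('a', []), ('b', [])]),
              ('+', [('a', []), ('c', [])])])
print(infix(expr))  # ((a+b)*(a+c))
```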
`
Computing Ancestral Information

The preorder and postorder traversals of a tree are useful in obtaining ancestral information. Suppose postorder(n) is the position of node n in a postorder listing of the nodes of a tree. Suppose desc(n) is the number of proper descendants of node n. For example, in the tree of Fig. 3.7 the postorder numbers of nodes n2, n4, and n5 are 3, 1, and 2, respectively.

The postorder numbers assigned to the nodes have the useful property that the nodes in the subtree with root n are numbered consecutively from postorder(n) − desc(n) to postorder(n). To test if a vertex x is a descendant of vertex y, all we need do is determine whether

    postorder(y) − desc(y) ≤ postorder(x) ≤ postorder(y).

A similar property holds for preorder.
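The numbering and the constant-time descendant test can be sketched as follows, in Python rather than the book's Pascal; the children table is the tree of Fig. 3.3 as reconstructed from Example 3.3.

```python
# Postorder numbering and the descendant test, on the tree of Fig. 3.3.
CHILDREN = {1: [2, 3, 4], 2: [], 3: [5, 6], 4: [7],
            5: [8, 9], 6: [10], 7: [], 8: [], 9: [], 10: []}

postorder_num = {}   # postorder(n): position of n in the postorder listing
desc = {}            # desc(n): number of proper descendants of n

def number(n):
    """Number the subtree rooted at n in postorder; return its node count."""
    size = 0
    for c in CHILDREN[n]:
        size += number(c)
    postorder_num[n] = len(postorder_num) + 1   # next postorder position
    desc[n] = size
    return size + 1

def is_descendant(x, y):
    """True iff x is in the subtree rooted at y (every node is its own
    descendant), using only the two precomputed tables."""
    return (postorder_num[y] - desc[y]
            <= postorder_num[x]
            <= postorder_num[y])

number(1)
print(is_descendant(8, 3))   # True: 8 lies in the subtree rooted at 3
print(is_descendant(4, 3))   # False: 4 is a sibling of 3
```

After one linear-time pass to fill the tables, each descendant query is answered by the single inequality from the text.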
`
3.2 The ADT TREE

In Chapter 2, lists, stacks, queues, and mappings were treated as abstract data types (ADT's). In this chapter trees will be treated both as ADT's and as data structures. One of our most important uses of trees occurs in the design of implementations for the various ADT's we study. For example, in Section 5.1, we shall see how a "binary search tree" can be used to implement abstract data types based on the mathematical model of a set, together with operations such as INSERT, DELETE, and MEMBER (to test whether an element is in a set). The next two chapters present a number of other tree implementations of various ADT's.

In this section, we shall present several useful operations on trees and show how tree algorithms can be designed in terms of these operations. As with lists, there are a great variety of operations that can be performed on trees. Here, we shall consider the following operations:

1. PARENT(n, T). This function returns the parent of node n in tree T. If n is the root, which has no parent, Λ is returned. In this context, Λ is a "null node," which is used as a signal that we have navigated off the tree.
2. LEFTMOST_CHILD(n, T) returns the leftmost child of node n in tree T, and it returns Λ if n is a leaf, which therefore has no children.
3. RIGHT_SIBLING(n, T) returns the right sibling of node n in tree T, defined to be that node m with the same parent p as n such that m lies immediately to the right of n in the ordering of the children of p. For example, for the tree in Fig. 3.7, LEFTMOST_CHILD(n1) = n2; RIGHT_SIBLING(n4) = n5, and RIGHT_SIBLING(n5) = Λ.
`
4. LABEL(n, T) returns the label of node n in tree T. We do not, however, require labels to be defined for every tree.
5. CREATEi(v, T1, T2, ..., Ti) is one of an infinite family of functions, one for each value of i = 0, 1, 2, .... CREATEi makes a new node r with label v and gives it i children, which are the roots of trees T1, T2, ..., Ti, in order from the left. The tree with root r is returned. Note that if i = 0, then r is both a leaf and the root.
6. ROOT(T) returns the node that is the root of tree T, or Λ if T is the null tree.
7. MAKENULL(T) makes T be the null tree.
`
Example 3.5. Let us write both recursive and nonrecursive procedures to take a tree and list the labels of its nodes in preorder. We assume that there are data types node and TREE already defined for us, and that the data type TREE is for trees with labels of the type labeltype. Figure 3.8 shows a recursive procedure that, given node n, lists the labels of the subtree rooted at n in preorder. We call PREORDER(ROOT(T)) to get a preorder listing of tree T.
`
procedure PREORDER ( n: node );
    { list the labels of the descendants of n in preorder }
    var
        c: node;
    begin
        print(LABEL(n, T));
        c := LEFTMOST_CHILD(n, T);
        while c <> Λ do begin
            PREORDER(c);
            c := RIGHT_SIBLING(c, T)
        end
    end; { PREORDER }

Fig. 3.8. A recursive preorder listing procedure.
`
We shall also develop a nonrecursive procedure to print a tree in preorder. To find our way around the tree, we shall use a stack S, whose type STACK is really "stack of nodes." The basic idea underlying our algorithm is that when we are at a node n, the stack will hold the path from the root to n, with the root at the bottom of the stack and node n at the top.†

† Recall our discussion of recursion in Section 2.6 in which we illustrated how the implementation of a recursive procedure involves a stack of activation records. If we examine Fig. 3.8, we can observe that when PREORDER(n) is called, the active procedure calls, and therefore the stack of activation records, correspond to the calls of PREORDER for all the ancestors of n. Thus our nonrecursive preorder procedure, like the example in Section 2.6, models closely the way the recursive procedure is implemented.
`
`
One way to perform a nonrecursive preorder traversal of a tree is given by the program NPREORDER shown in Fig. 3.9. This program has two modes of operation. In the first mode it descends down the leftmost unexplored path in the tree, printing and stacking the nodes along the path, until it reaches a leaf.

The program then enters the second mode of operation in which it retreats back up the stacked path, popping the nodes of the path off the stack, until it encounters a node on the path with a right sibling. The program then reverts back to the first mode of operation, starting the descent from that unexplored right sibling.

The program begins in mode one at the root and terminates when the stack becomes empty. The complete program is shown in Fig. 3.9.
`
3.3 Implementations of Trees

In this section we shall present several basic implementations for trees and discuss their capabilities for supporting the various tree operations introduced in Section 3.2.
`
An Array Representation of Trees

Let T be a tree in which the nodes are named 1, 2, ..., n. Perhaps the simplest representation of T that supports the PARENT operation is a linear array A in which entry A[i] is a pointer or a cursor to the parent of node i. The root of T can be distinguished by giving it a null pointer or a pointer to itself as parent. In Pascal, pointers to array elements are not feasible, so we shall have to use a cursor scheme where A[i] = j if node j is the parent of node i, and A[i] = 0 if node i is the root.

This representation uses the property of trees that each node has a unique parent. With this representation the parent of a node can be found in constant time. A path going up the tree, that is, from node to parent to parent, and so on, can be traversed in time proportional to the number of nodes on the path. We can also support the LABEL operator by adding another array L, such that L[i] is the label of node i, or by making the elements of array A be records consisting of an integer (cursor) and a label.
`
Example 3.6. The tree of Fig. 3.10(a) has the parent representation given by the array A shown in Fig. 3.10(b). □
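The cursor scheme can be sketched as follows, in Python rather than Pascal. The array below describes a hypothetical ten-node tree of our own choosing, not necessarily the tree of Fig. 3.10; the right-sibling search assumes children are numbered in increasing order, as the text goes on to discuss.

```python
# Parent-array (cursor) representation of a hypothetical ten-node tree.
# A[i-1] is the parent of node i; 0 marks the root, as in the text.
A = [0, 1, 1, 2, 2, 5, 5, 5, 3, 3]   # node 1 is the root

def parent(n):
    """PARENT in constant time; 0 plays the role of the null node."""
    return A[n - 1]

def depth(n):
    """Length of the path from the root to n: follow parent cursors."""
    d = 0
    while A[n - 1] != 0:
        n = A[n - 1]
        d += 1
    return d

def right_sibling(n):
    """Under an artificial increasing-number ordering of children:
    scan upward for the next node with the same parent."""
    for i in range(n + 1, len(A) + 1):
        if A[i - 1] == A[n - 1]:
            return i
    return 0   # no right sibling

print(depth(8))          # 3: the path is 8 -> 5 -> 2 -> 1
print(right_sibling(4))  # 5
```

Note how parent is a single array access, while right_sibling must scan: exactly the trade-off the surrounding text describes.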
`
The parent pointer representation does not facilitate operations that require child-of information. Given a node n, it is expensive to determine the children of n, or the height of n. In addition, the parent pointer representation does not specify the order of the children of a node. Thus, operations like LEFTMOST_CHILD and RIGHT_SIBLING are not well defined. We could impose an artificial order, for example, by numbering the children of each node after numbering the parent, and numbering the children in
`
procedure NPREORDER ( T: TREE );
    { nonrecursive preorder traversal of tree T }
    var
        m: node;    { a temporary }
        S: STACK;   { stack of nodes holding path from the root
                      to the parent TOP(S) of the "current" node m }
    begin
        { initialize }
        MAKENULL(S);
        m := ROOT(T);
        while true do
            if m <> Λ then begin
                print(LABEL(m, T));
                PUSH(m, S);
                { explore leftmost child of m }
                m := LEFTMOST_CHILD(m, T)
            end
            else begin
                { exploration of path on stack is now complete }
                if EMPTY(S) then
                    return;
                { explore right sibling of node on top of stack }
                m := RIGHT_SIBLING(TOP(S), T);
                POP(S)
            end
    end; { NPREORDER }

Fig. 3.9. A nonrecursive preorder procedure.
`
increasing order from left to right. On that assumption, we have written the function RIGHT_SIBLING in Fig. 3.11, for types node and TREE that are defined as follows:

    type
        node = integer;
        TREE = array [1..maxnodes] of node;

For this implementation we assume the null node Λ is represented by 0.
`
[Fig. 3.10(a) shows a tree on nodes 1 through 10; Fig. 3.10(b) shows its parent array A.]

Fig. 3.10. A tree and its parent pointer representation.
`
function RIGHT_SIBLING ( n: node; T: TREE ) : node;
    { return the right sibling of node n in tree T }
    var
        i, parent: node;
    begin
        parent := T[n];
        for i := n + 1 to maxnodes do
            { search for node after n with same parent }
            if T[i] = parent then
                return (i);
        return (0)
        { null node will be returned
          if no right sibling is ever found }
    end; { RIGHT_SIBLING }

Fig. 3.11. Right sibling operation using array representation.
`
Representation of Trees by Lists of Children

An important and useful way of representing trees is to form for each node a list of its children. The lists can be represented by any of the methods suggested in Chapter 2, but because the number of children each node may have can be variable, the linked-list representations are often more appropriate.

Figure 3.12 suggests how the tree of Fig. 3.10(a) might be represented. There is an array of header cells, indexed by nodes, which we assume to be numbered 1, 2, ..., 10. Each header points to a linked list of "elements," which are nodes. The elements on the list headed by header[i] are the children of node i; for example, 9 and 10 are the children of 3.
`
[Fig. 3.12 shows an array header[1..10] of cells, each pointing to a linked list of the corresponding node's children.]

Fig. 3.12. A linked-list representation of a tree.
`
Let us first develop the data structures we need in terms of an abstract data type LIST (of nodes), and then give a particular implementation of lists and see how the abstractions fit together. Later, we shall see some of the simplifications we can make. We begin with the following type declarations:

    type
        node = integer;
        LIST = { appropriate definition for list of nodes };
        position = { appropriate definition for positions in lists };
        TREE = record
            header: array [1..maxnodes] of LIST;
            labels: array [1..maxnodes] of labeltype;
            root: node
        end;
`
We assume that the root of each tree is stored explicitly in the root field. Also, 0 is used to represent the null node.

Figure 3.13 shows the code for the LEFTMOST_CHILD operation. The reader should write the code for the other operations as exercises.
`
function LEFTMOST_CHILD ( n: node; T: TREE ) : node;
    { returns the leftmost child of node n of tree T }
    var
        L: LIST;    { shorthand for the list of n's children }
    begin
        L := T.header[n];
        if EMPTY(L) then { n is a leaf }
            return (0)
        else
            return (RETRIEVE(FIRST(L), L))
    end; { LEFTMOST_CHILD }

Fig. 3.13. Function to find leftmost child.
`
Now let us choose a particular implementation of lists, in which both LIST and position are integers, used as cursors into an array cellspace of records:

    var
        cellspace: array [1..maxnodes] of record
            node: integer;
            next: integer
        end;

To simplify, we shall not insist that lists of children have header cells. Rather, we shall let T.header[n] point directly to the first cell of the list, as is suggested by Fig. 3.12. Figure 3.14(a) shows the function LEFTMOST_CHILD of Fig. 3.13 rewritten for this specific implementation. Figure 3.14(b) shows the operator PARENT, which is more difficult to write using this representation of lists, since a search of all lists is required to determine on which list a given node appears.
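The cursor scheme can be sketched in Python as follows; the three-node tree and the particular cell layout are illustrative choices of ours, not taken from the book's figures.

```python
# Children lists via cursors into a cell array, sketched in Python.
# Each cell is (node, next); 0 is the null cursor, as in the text.
MAXNODES = 10
cellspace = [(0, 0)] * (MAXNODES + 1)   # 1-based; slot 0 unused

# Hypothetical small tree: node 1 is the root with children 2 and 3.
cellspace[1] = (2, 2)          # cell 1 holds child 2, next is cell 2
cellspace[2] = (3, 0)          # cell 2 holds child 3, end of list
header = {1: 1, 2: 0, 3: 0}    # leaves have empty child lists

def leftmost_child(n):
    """Follow the header cursor to the first cell, if any."""
    cell = header[n]
    return 0 if cell == 0 else cellspace[cell][0]

def parent(n):
    """Expensive under this representation: scan every child list
    for an occurrence of n."""
    for p, cell in header.items():
        while cell != 0:
            child, nxt = cellspace[cell]
            if child == n:
                return p
            cell = nxt
    return 0   # n is the root (or not in the tree)

print(leftmost_child(1))  # 2
print(parent(3))          # 1
print(parent(1))          # 0: the root has no parent
```

The sketch makes the asymmetry concrete: leftmost_child follows one cursor, while parent must search all the lists.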
`
`The Leftmost-Ch