`
`
`
`
`
`
`
`
`
`
`
`
Data Structures and Algorithms

ALFRED V. AHO
Bell Laboratories
Murray Hill, New Jersey

JOHN E. HOPCROFT
Cornell University
Ithaca, New York

JEFFREY D. ULLMAN
Stanford University
Stanford, California
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
ADDISON-WESLEY PUBLISHING COMPANY
Reading, Massachusetts • Menlo Park, California
London • Amsterdam • Don Mills, Ontario • Sydney
`
`
`
`
This book is in the
ADDISON-WESLEY SERIES IN
COMPUTER SCIENCE AND INFORMATION PROCESSING

Michael A. Harrison
Consulting Editor
`
`
`
`
`
Library of Congress Cataloging in Publication Data

Aho, Alfred V.
    Data structures and algorithms.

    1. Data structures (Computer science)  2. Algorithms.
    I. Hopcroft, John E., 1939-  .  II. Ullman, Jeffrey D., 1942-  .  III. Title.
    QA76.9.D35A38  1982  001.64  82-11596
    ISBN 0-201-00023-7
`
`
`
`
`
`
`
`
`
`
Reproduced by Addison-Wesley from camera-ready copy supplied by the authors.

Reprinted with corrections April, 1987

Copyright © 1983 by Bell Telephone Laboratories, Incorporated.

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior written permission of the publisher. Printed in the United States of America. Published simultaneously in Canada.

ISBN: 0-201-00023-7
`
`
`
`
Contents

Chapter 1  Design and Analysis of Algorithms
1.1  From Problems to Programs .......................................... 1
1.2  Abstract Data Types ................................................ 10
1.3  Data Types, Data Structures, and Abstract Data Types ............ 13
1.4  The Running Time of a Program .................................... 16
1.5  Calculating the Running Time of a Program ........................ 21
1.6  Good Programming Practice ........................................ 27
1.7  Super Pascal ...................................................... 29

Chapter 2  Basic Data Types
2.1  The Data Type "List" .............................................. 37
2.2  Implementation of Lists ........................................... 40
2.3  Stacks ............................................................ 53
2.4  Queues ............................................................ 56
2.5  Mappings .......................................................... 61
2.6  Stacks and Recursive Procedures ................................... 64

Chapter 3  Trees
3.1  Basic Terminology ................................................. 75
3.2  The ADT TREE ...................................................... 82
3.3  Implementations of Trees .......................................... 84
3.4  Binary Trees ...................................................... 93

Chapter 4  Basic Operations on Sets
4.1  Introduction to Sets ............................................. 107
4.2  An ADT with Union, Intersection, and Difference ................. 109
4.3  A Bit-Vector Implementation of Sets ............................. 112
4.4  A Linked-List Implementation of Sets ............................ 115
4.5  The Dictionary ................................................... 117
4.6  Simple Dictionary Implementations ............................... 119
4.7  The Hash Table Data Structure ................................... 122
4.8  Estimating the Efficiency of Hash Functions ..................... 129
4.9  Implementation of the Mapping ADT ............................... 135
4.10 Priority Queues .................................................. 135
4.11 Implementations of Priority Queues .............................. 138
4.12 Some Complex Set Structures ..................................... 145

Chapter 5  Advanced Set Representation Methods
5.1  Binary Search Trees .............................................. 155
5.2  Time Analysis of Binary Search Tree Operations .................. 160
5.3  Tries ............................................................ 163
5.4  Balanced Tree Implementations of Sets ........................... 169
5.5  Sets with the MERGE and FIND Operations ......................... 180
5.6  An ADT with MERGE and SPLIT ..................................... 189

Chapter 6  Directed Graphs
6.1  Basic Definitions ................................................ 198
6.2  Representations for Directed Graphs ............................. 199
6.3  The Single-Source Shortest Paths Problem ........................ 203
6.4  The All-Pairs Shortest Path Problem ............................. 208
6.5  Traversals of Directed Graphs ................................... 215
6.6  Directed Acyclic Graphs ......................................... 219
6.7  Strong Components ................................................ 222

Chapter 7  Undirected Graphs
7.1  Definitions ...................................................... 230
7.2  Minimum-Cost Spanning Trees ..................................... 233
7.3  Traversals ....................................................... 239
7.4  Articulation Points and Biconnected Components .................. 244
7.5  Graph Matching ................................................... 246

Chapter 8  Sorting
8.1  The Internal Sorting Model ...................................... 253
8.2  Some Simple Sorting Schemes ..................................... 254
8.3  Quicksort ........................................................ 260
8.4  Heapsort ......................................................... 271
8.5  Bin Sorting ...................................................... 274
8.6  A Lower Bound for Sorting by Comparisons ........................ 282
8.7  Order Statistics ................................................. 286

Chapter 9  Algorithm Analysis Techniques
9.1  Efficiency of Algorithms ........................................ 293
9.2  Analysis of Recursive Programs .................................. 294
9.3  Solving Recurrence Equations .................................... 296
9.4  A General Solution for a Large Class of Recurrences ............. 298

Chapter 10  Algorithm Design Techniques
10.1 Divide-and-Conquer Algorithms ................................... 306
10.2 Dynamic Programming ............................................. 311
10.3 Greedy Algorithms ............................................... 321
10.4 Backtracking ..................................................... 324
10.5 Local Search Algorithms ......................................... 336

Chapter 11  Data Structures and Algorithms for External Storage
11.1 A Model of External Computation ................................. 347
11.2 External Sorting ................................................. 349
11.3 Storing Information in Files .................................... 361
11.4 External Search Trees ........................................... 368

Chapter 12  Memory Management
12.1 The Issues in Memory Management ................................. 378
12.2 Managing Equal-Sized Blocks ..................................... 382
12.3 Garbage Collection Algorithms for Equal-Sized Blocks ............ 384
12.4 Storage Allocation for Objects with Mixed Sizes ................. 392
12.5 Buddy Systems .................................................... 400
12.6 Storage Compaction ............................................... 404

Bibliography .......................................................... 411

Index ................................................................. 419
`
`
`
`
`
`
`CHAPTER 3
`
`Trees
`
A tree imposes a hierarchical structure on a collection of items. Familiar
examples of trees are genealogies and organization charts. Trees are used to
help analyze electrical circuits and to represent the structure of mathematical
formulas. Trees also arise naturally in many different areas of computer science.
For example, trees are used to organize information in database systems
and to represent the syntactic structure of source programs in compilers.
Chapter 5 describes applications of trees in the representation of data.
Throughout this book, we shall use many different variants of trees. In this
chapter we introduce the basic definitions and present some of the more common
tree operations. We then describe some of the more frequently used data
structures for trees that can be used to support these operations efficiently.
`
`3.1 Basic Terminology
`
A tree is a collection of elements called nodes, one of which is distinguished as
a root, along with a relation ("parenthood") that places a hierarchical structure
on the nodes. A node, like an element of a list, can be of whatever type
we wish. We often depict a node as a letter, a string, or a number with a circle
around it. Formally, a tree can be defined recursively in the following
manner.

1. A single node by itself is a tree. This node is also the root of the tree.

2. Suppose n is a node and T1, T2, . . . , Tk are trees with roots
   n1, n2, . . . , nk, respectively. We can construct a new tree by making n
   be the parent of nodes n1, n2, . . . , nk. In this tree n is the root and
   T1, T2, . . . , Tk are the subtrees of the root. Nodes n1, n2, . . . , nk are
   called the children of node n.

Sometimes, it is convenient to include among trees the null tree, a "tree" with
no nodes, which we shall represent by Λ.

Example 3.1. Consider the table of contents of a book, as suggested by Fig.
3.1(a). This table of contents is a tree. We can redraw it in the manner
shown in Fig. 3.1(b). The parent-child relationship is depicted by a line.
Trees are normally drawn top-down as in Fig. 3.1(b), with the parent above
the child.

The root, the node called "Book," has three subtrees with roots
corresponding to the chapters C1, C2, and C3. This relationship is
represented by the lines downward from Book to C1, C2, and C3. Book is
the parent of C1, C2, and C3, and these three nodes are the children of Book.
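The recursive definition translates almost verbatim into code. A minimal sketch (in Python rather than the book's Pascal; the `Node` class and the sample labels are ours):

```python
class Node:
    """A tree node: a label plus an ordered list of subtrees.

    A node with no children is by itself a tree (rule 1); a node made
    the parent of the roots of subtrees T1, ..., Tk is the root of a
    new tree (rule 2).
    """
    def __init__(self, label, children=None):
        self.label = label
        self.children = children or []   # ordered left to right

# Rule 1: a single node is a tree (and its own root).
leaf = Node("C3")

# Rule 2: make "Book" the parent of the roots of three subtrees.
book = Node("Book", [Node("C1"), Node("C2"), leaf])

print([c.label for c in book.children])   # ['C1', 'C2', 'C3']
```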
`
`
`
`(cid:26)
`
`
`
`
`
Fig. 3.1. A table of contents and its tree representation.
`
The third subtree, with root C3, is a tree of a single node, while the other
two subtrees have a nontrivial structure. For example, the subtree with root
C2 has three subtrees, corresponding to the sections s2.1, s2.2, and s2.3; the
last two are one-node trees, while the first has two subtrees corresponding to
the subsections s2.1.1 and s2.1.2. □
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
Example 3.1 is typical of one kind of data that is best represented as a
tree. In this example, the parenthood relationship stands for containment; a
parent node is comprised of its children, as Book is comprised of C1, C2, and
C3. Throughout this book we shall encounter a variety of other relationships
that can be represented by parenthood in trees.

If n1, n2, . . . , nk is a sequence of nodes in a tree such that ni is the
parent of ni+1 for 1 ≤ i < k, then this sequence is called a path from node n1
to node nk. The length of a path is one less than the number of nodes in the
path. Thus there is a path of length zero from every node to itself. For
example, in Fig. 3.1 there is a path of length two, namely (C2, s2.1, s2.1.2),
from C2 to s2.1.2.

If there is a path from node a to node b, then a is an ancestor of b, and b
is a descendant of a. For example, in Fig. 3.1, the ancestors of s2.1 are
itself, C2, and Book, while its descendants are itself, s2.1.1, and s2.1.2.
Notice that any node is both an ancestor and a descendant of itself.

An ancestor or descendant of a node, other than the node itself, is called
a proper ancestor or proper descendant, respectively. In a tree, the root is the
only node with no proper ancestors. A node with no proper descendants is
called a leaf. A subtree of a tree is a node, together with all its descendants.

The height of a node in a tree is the length of a longest path from the
node to a leaf. In Fig. 3.1 node C1 has height 1, node C2 height 2, and node
C3 height 0. The height of a tree is the height of the root. The depth of a
node is the length of the unique path from the root to that node.
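The height and depth definitions can be computed directly from a child-list table. A sketch, in Python for illustration; the dictionary below is our own encoding of the table-of-contents tree of Fig. 3.1:

```python
def height(children, n):
    """Length of a longest path from node n down to a leaf."""
    kids = children.get(n, [])
    if not kids:
        return 0                       # a leaf has height 0
    return 1 + max(height(children, c) for c in kids)

def depth(children, root, n):
    """Length of the unique path from the root to node n."""
    if n == root:
        return 0
    for parent, kids in children.items():
        if n in kids:
            return 1 + depth(children, root, parent)
    raise ValueError("node not in tree")

# Our encoding of Fig. 3.1: Book -> C1, C2, C3; C2 -> s2.1, s2.2, s2.3; ...
children = {
    "Book": ["C1", "C2", "C3"],
    "C1": ["s1.1", "s1.2"],
    "C2": ["s2.1", "s2.2", "s2.3"],
    "s2.1": ["s2.1.1", "s2.1.2"],
}
print(height(children, "C2"))              # 2, as in the text
print(depth(children, "Book", "s2.1.2"))   # 3
```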
`
`
`
`
`The Order of Nodes
`
The children of a node are usually ordered from left-to-right. Thus the two
trees of Fig. 3.2 are different because the two children of node n appear in a
different order in the two trees. If we wish explicitly to ignore the order of
children, we shall refer to a tree as an unordered tree.
`
`Fig. 3.2. Two distinct (ordered) trees.
`
The "left-to-right" ordering of siblings (children of the same node) can be
extended to compare any two nodes that are not related by the ancestor-descendant
relationship. The relevant rule is that if a and b are siblings, and
a is to the left of b, then all the descendants of a are to the left of all the
descendants of b.

Example 3.2. Consider the tree in Fig. 3.3. Node 8 is to the right of node 2,
to the left of nodes 9, 6, 10, 4, and 7, and neither left nor right of its
ancestors 1, 3, and 5.
`
`Fig. 3.3. A tree.
`
A simple rule, given a node n, for finding those nodes to its left and those
to its right, is to draw the path from the root to n. All nodes branching off to
the left of this path, and all descendants of such nodes, are to the left of n.
All nodes and descendants of nodes branching off to the right are to the right
of n. □
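This root-path rule can be mechanized. The following Python sketch encodes the tree of Fig. 3.3 as a child-list dictionary (our own representation) and compares two unrelated nodes by the children at the point where their root paths diverge:

```python
# Our encoding of Fig. 3.3: node 1 is the root.
children = {1: [2, 3, 4], 3: [5, 6], 5: [8, 9], 6: [10], 4: [7]}

def path_from_root(n, root=1):
    """Return the path root, ..., n as a list of nodes."""
    if n == root:
        return [root]
    for parent, kids in children.items():
        if n in kids:
            return path_from_root(parent, root) + [n]
    raise ValueError("node not in tree")

def is_left_of(a, b):
    """True if a is to the left of b (False if one is an ancestor of the other)."""
    pa, pb = path_from_root(a), path_from_root(b)
    for x, y in zip(pa, pb):
        if x != y:                                # first divergence point
            siblings = children[pa[pa.index(x) - 1]]
            return siblings.index(x) < siblings.index(y)
    return False   # ancestor-descendant pairs are neither left nor right

print(is_left_of(2, 8))   # True: so node 8 is to the right of node 2
print(is_left_of(8, 9))   # True: 8 is to the left of 9
print(is_left_of(8, 3))   # False: 3 is an ancestor of 8
```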
`
`
Preorder, Postorder, and Inorder

There are several useful ways in which we can systematically order all nodes
of a tree. The three most important orderings are called preorder, inorder,
and postorder; these orderings are defined recursively as follows.
`
`0
`
`o
`
`If a tree T is null. then the empty list is the preorder. inorder and post-
`order listing of T.
`then that node by itself is the preorder.
`If T consists a Single node.
`inorder. and postorder listing of T.
`
`Otherwise, let T be a tree with root it and subtrces Tl, T2,
`gested in Fig. 3.4.
`
`.
`
`.
`
`. {I}, as sug-
`
`0
`
`A F
`
`ig. 3.4. Tree T.
`
`l. The preorder listing (or preorder rraversoi) of the nodes of T is the rent it
`of T followed by the nodes of T]
`in preorder.
`then the nodes of T2 in
`preorder. and so on. up to the nodes of Ti. in preorder.
`in inorder, fol-
`2. The inorder listing of the nodes of T is the nodes of T.
`lowed by node it, followed by the nodes of T3.
`.
`.
`. .1}. each group of
`nodes in inorder.
`.
`3. The postnrder listing of the nodes of T is the nodes of T.
`in postorder,
`then the nodes of T2 in postorder, and so on, up to Th all followed by
`node [1.
`
Figure 3.5(a) shows a sketch of a procedure to list the nodes of a tree in
preorder. To make it a postorder procedure, we simply reverse the order of
steps (1) and (2). Figure 3.5(b) is a sketch of an inorder procedure. In each
case, we produce the desired ordering of the tree by calling the appropriate
procedure on the root of the tree.
`
Example 3.3. Let us list the tree of Fig. 3.3 in preorder. We first list 1 and
then call PREORDER on the first subtree of 1, the subtree with root 2. This
subtree is a single node, so we simply list it. Then we proceed to the second
subtree of 1, the tree rooted at 3. We list 3, and then call PREORDER on
the first subtree of 3. That call results in listing 5, 8, and 9, in that order.
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`(1}
`(2)
`
`procedure PREORDER ( n: node );
`begin
`list it;
`[or each child c of a, if any. in order from the left do
`PREORDER(£}
`{ PREORDER }
`
`end:
`
`(a) PREORDER procedure.
`
`procedure INORDER { in: node );
`begin
`if n is a leaf then
`list :1
`else begin
`_INORDERGeftmost child of a);
`list a:
`[or each child r: of it, except for the leftmost.
`in order from the left do
`[NOR 13151102)
`
`and
`
`end;
`
`{ INORDER }
`
`(b) [NORDER procedure.
`
`Fig. 3.5. Recursive ordering procedures.
`
Continuing in this manner, we obtain the complete preorder traversal of Fig.
3.3: 1, 2, 3, 5, 8, 9, 6, 10, 4, 7.

Similarly, by simulating Fig. 3.5(a) with the steps reversed, we can discover
that the postorder of Fig. 3.3 is 2, 8, 9, 5, 10, 6, 3, 7, 4, 1. By simulating
Fig. 3.5(b), we find that the inorder listing of Fig. 3.3 is 2, 1, 8, 5, 9, 3,
10, 6, 7, 4. □
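The three listings of Example 3.3 can be reproduced by transcribing the recursive definitions directly. A Python sketch (the child-list dictionary is our encoding of Fig. 3.3):

```python
children = {1: [2, 3, 4], 3: [5, 6], 5: [8, 9], 6: [10], 4: [7]}  # Fig. 3.3

def preorder(n):
    out = [n]                            # the root first ...
    for c in children.get(n, []):
        out += preorder(c)               # ... then each subtree in preorder
    return out

def postorder(n):
    out = []
    for c in children.get(n, []):
        out += postorder(c)              # each subtree in postorder ...
    return out + [n]                     # ... then the root

def inorder(n):
    kids = children.get(n, [])
    if not kids:
        return [n]                       # a leaf is listed by itself
    out = inorder(kids[0]) + [n]         # leftmost subtree, then the node
    for c in kids[1:]:
        out += inorder(c)                # then the remaining subtrees
    return out

print(preorder(1))   # [1, 2, 3, 5, 8, 9, 6, 10, 4, 7]
print(postorder(1))  # [2, 8, 9, 5, 10, 6, 3, 7, 4, 1]
print(inorder(1))    # [2, 1, 8, 5, 9, 3, 10, 6, 7, 4]
```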
`
`
`
`
`
`
`
`
A useful trick for producing the three node orderings is the following.
Imagine we walk around the outside of the tree, starting at the root, moving
counterclockwise, and staying as close to the tree as possible; the path we have
in mind for Fig. 3.3 is shown in Fig. 3.6.

For preorder, we list a node the first time we pass it. For postorder, we
list a node the last time we pass it, as we move up to its parent. For inorder,
we list a leaf the first time we pass it, but list an interior node the second time
we pass it. For example, node 1 in Fig. 3.6 is passed the first time at the
beginning, and the second time while passing through the "bay" between
nodes 2 and 3. Note that the order of the leaves in the three orderings is
`
`
`
Fig. 3.6. Traversal of a tree.
`
always the same left-to-right ordering of the leaves. It is only the ordering of
the interior nodes and their relationship to the leaves that vary among the
three. □
`
`Labeled Trees and Expression Trees
`
Often it is useful to associate a label, or value, with each node of a tree, in
the same spirit with which we associated a value with a list element in the
previous chapter. That is, the label of a node is not the name of the node, but a
value that is "stored" at the node. In some applications we shall even change
the label of a node, while the name of the node remains the same. A useful
analogy is tree:list = label:element = node:position.
`
Example 3.4. Figure 3.7 shows a labeled tree representing the arithmetic
expression (a+b) * (a+c), where n1, . . . , n7 are the names of the nodes,
and the labels, by convention, are shown next to the nodes. The rules
whereby a labeled tree represents an expression are as follows:

1. Every leaf is labeled by an operand and consists of that operand alone.
   For example, node n4 represents the expression a.

2. Every interior node n is labeled by an operator. Suppose n is labeled by a
   binary operator θ, such as + or *, and that the left child represents
   expression E1 and the right child E2. Then n represents expression
   (E1) θ (E2). We may remove the parentheses if they are not necessary.

For example, node n2 has operator +, and its left and right children
represent the expressions a and b, respectively. Therefore, n2 represents
(a)+(b), or just a+b. Node n1 represents (a+b)*(a+c), since * is the label
`
`
`
at n1, and a+b and a+c are the expressions represented by n2 and n3,
respectively. □
`
`
`
`Fig. 3.7. Expression tree with labels.
`
Often, when we produce the preorder, inorder, or postorder listing of a
tree, we prefer to list not the node names, but rather the labels. In the case
of an expression tree, the preorder listing of the labels gives us what is known
as the prefix form of an expression, where the operator precedes its left
operand and its right operand. To be precise, the prefix expression for a
single operand a is a itself. The prefix expression for (E1) θ (E2), with θ a
binary operator, is θP1P2, where P1 and P2 are the prefix expressions for E1
and E2. Note that no parentheses are necessary in the prefix expression, since
we can scan the prefix expression θP1P2 and uniquely identify P1 as the
shortest (and only) prefix of P1P2 that is a legal prefix expression.

For example, the preorder listing of the labels of Fig. 3.7 is *+ab+ac.
The prefix expression for n2, which is +ab, is the shortest legal prefix of
+ab+ac.
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
Similarly, a postorder listing of the labels of an expression tree gives us
what is known as the postfix (or Polish) representation of an expression. The
expression (E1) θ (E2) is represented by the postfix expression P1P2θ, where
P1 and P2 are the postfix representations of E1 and E2, respectively. Again,
no parentheses are necessary in the postfix representation, as we can deduce
what P2 is by looking for the shortest suffix of P1P2 that is a legal postfix
expression. For example, the postfix expression for Fig. 3.7 is ab+ac+*. If
we write this expression as P1P2*, then P2 is ac+, the shortest suffix of
ab+ac+ that is a legal postfix expression.
`
`
The inorder traversal of an expression tree gives the infix expression
itself, but with no parentheses. For example, the inorder listing of the labels
of Fig. 3.7 is a+b * a+c. The reader is invited to provide an algorithm for
traversing an expression tree and producing an infix expression with all
needed pairs of parentheses.
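All three label listings for Fig. 3.7 follow from one small sketch (Python; the nested-tuple representation of the tree is ours). The `infix` function is one possible answer to the exercise just posed: it simply parenthesizes every interior node.

```python
# Fig. 3.7: the expression tree for (a+b) * (a+c).
# Each node is (label, left, right); a leaf is (label, None, None).
tree = ('*', ('+', ('a', None, None), ('b', None, None)),
             ('+', ('a', None, None), ('c', None, None)))

def prefix(t):
    label, left, right = t
    if left is None:
        return label                     # prefix of an operand is itself
    return label + prefix(left) + prefix(right)

def postfix(t):
    label, left, right = t
    if left is None:
        return label
    return postfix(left) + postfix(right) + label

def infix(t):
    """Fully parenthesized infix: parenthesize every interior node."""
    label, left, right = t
    if left is None:
        return label
    return '(' + infix(left) + label + infix(right) + ')'

print(prefix(tree))   # *+ab+ac
print(postfix(tree))  # ab+ac+*
print(infix(tree))    # ((a+b)*(a+c))
```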
`
Computing Ancestral Information

The preorder and postorder traversals of a tree are useful in obtaining
ancestral information. Suppose postorder(n) is the position of node n in a
postorder listing of the nodes of a tree. Suppose desc(n) is the number of proper
descendants of node n. For example, in the tree of Fig. 3.7 the postorder
numbers of nodes n2, n4, and n5 are 3, 1, and 2, respectively.

The postorder numbers assigned to the nodes have the useful property
that the nodes in the subtree with root n are numbered consecutively from
postorder(n) − desc(n) to postorder(n). To test whether a vertex x is a descendant
of vertex y, all we need do is determine whether

    postorder(y) − desc(y) ≤ postorder(x) ≤ postorder(y).

A similar property holds for preorder.
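The numbering scheme and the descendant test can be sketched as follows (Python; the child-list dictionary is our encoding of the shape of Fig. 3.7):

```python
# Fig. 3.7's shape: n1 is the root with children n2, n3;
# n2 has children n4, n5; n3 has children n6, n7.
children = {'n1': ['n2', 'n3'], 'n2': ['n4', 'n5'], 'n3': ['n6', 'n7']}

postorder_num = {}
desc = {}

def number(n):
    """Assign postorder numbers and count proper descendants of n."""
    count = 0
    for c in children.get(n, []):
        count += 1 + number(c)           # child plus its proper descendants
    postorder_num[n] = len(postorder_num) + 1   # numbered after all children
    desc[n] = count
    return count

number('n1')

def is_descendant(x, y):
    """True iff x is a descendant of y (every node is one of itself)."""
    return (postorder_num[y] - desc[y]
            <= postorder_num[x]
            <= postorder_num[y])

print(postorder_num['n2'], postorder_num['n4'], postorder_num['n5'])  # 3 1 2
print(is_descendant('n4', 'n2'), is_descendant('n6', 'n2'))  # True False
```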
`
`3.2 The ADT TREE
`
In Chapter 2, lists, stacks, queues, and mappings were treated as abstract data
types (ADT's). In this chapter trees will be treated both as ADT's and as
data structures. One of our most important uses of trees occurs in the design
of implementations for the various ADT's we study. For example, in Section
5.1, we shall see how a "binary search tree" can be used to implement
abstract data types based on the mathematical model of a set, together with
operations such as INSERT, DELETE, and MEMBER (to test whether an
element is in a set). The next two chapters present a number of other tree
implementations of various ADT's.

In this section, we shall present several useful operations on trees and
show how tree algorithms can be designed in terms of these operations. As
with lists, there are a great variety of operations that can be performed on
trees. Here, we shall consider the following operations:
`
1. PARENT(n, T). This function returns the parent of node n in tree T. If
   n is the root, which has no parent, Λ is returned. In this context, Λ is a
   "null node," which is used as a signal that we have navigated off the tree.

2. LEFTMOST_CHILD(n, T) returns the leftmost child of node n in tree T,
   and it returns Λ if n is a leaf, which therefore has no children.

3. RIGHT_SIBLING(n, T) returns the right sibling of node n in tree T,
   defined to be that node m with the same parent p as n such that m lies
   immediately to the right of n in the ordering of the children of p. For
   example, for the tree in Fig. 3.7, LEFTMOST_CHILD(n2) = n4;
   RIGHT_SIBLING(n4) = n5, and RIGHT_SIBLING(n5) = Λ.
`
`
`
4. LABEL(n, T) returns the label of node n in tree T. We do not, however,
   require labels to be defined for every tree.

5. CREATEi(v, T1, T2, . . . , Ti) is one of an infinite family of functions,
   one for each value of i = 0, 1, 2, . . . . CREATEi makes a new node r
   with label v and gives it i children, which are the roots of trees
   T1, T2, . . . , Ti, in order from the left. The tree with root r is returned.
   Note that if i = 0, then r is both a leaf and the root.

6. ROOT(T) returns the node that is the root of tree T, or Λ if T is the null
   tree.

7. MAKENULL(T) makes T be the null tree.
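As an illustration of the CREATEi family, the expression tree of Fig. 3.7 can be assembled bottom-up. A Python sketch (the nested-tuple representation and these stand-in functions are ours, not the book's):

```python
def CREATE0(v):
    """A leaf: a labeled node with no subtrees (i = 0)."""
    return (v,)

def CREATE2(v, t1, t2):
    """A new root labeled v whose children are the roots of t1 and t2."""
    return (v, t1, t2)

def LABEL(t):
    """The label stored at the root of tree t."""
    return t[0]

# Build (a+b) * (a+c) bottom-up, as in Fig. 3.7.
t = CREATE2('*',
            CREATE2('+', CREATE0('a'), CREATE0('b')),
            CREATE2('+', CREATE0('a'), CREATE0('c')))

print(LABEL(t))       # '*'
print(LABEL(t[1]))    # '+', the root of the left subtree
```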
`
Example 3.5. Let us write both recursive and nonrecursive procedures to take
a tree and list the labels of its nodes in preorder. We assume that there are
data types node and TREE already defined for us, and that the data type
TREE is for trees with labels of the type labeltype. Figure 3.8 shows a
recursive procedure that, given node n, lists the labels of the subtree rooted at n in
preorder. We call PREORDER(ROOT(T)) to get a preorder listing of tree T.
`
    procedure PREORDER ( n: node );
    { list the labels of the descendants of n in preorder }
    var
        c: node;
    begin
        print(LABEL(n, T));
        c := LEFTMOST_CHILD(n, T);
        while c <> Λ do begin
            PREORDER(c);
            c := RIGHT_SIBLING(c, T)
        end
    end; { PREORDER }

Fig. 3.8. A recursive preorder listing procedure.
`
We shall also develop a nonrecursive procedure to print a tree in
preorder. To find our way around the tree, we shall use a stack S, whose
type STACK is really "stack of nodes." The basic idea underlying our
algorithm is that when we are at a node n, the stack will hold the path from the
root to n, with the root at the bottom of the stack and node n at the top.†

† Recall our discussion of recursion in Section 2.6 in which we illustrated how the implementation
of a recursive procedure involves a stack of activation records. If we examine Fig. 3.8, we can
observe that when PREORDER(n) is called, the active procedure calls, and therefore the stack of
activation records, correspond to the calls of PREORDER for all the ancestors of n. Thus our
nonrecursive preorder procedure, like the example in Section 2.6, models closely the way the
recursive procedure is implemented.
`
`
`
`
One way to perform a nonrecursive preorder traversal of a tree is given
by the program NPREORDER shown in Fig. 3.9. This program has two
modes of operation. In the first mode it descends down the leftmost
unexplored path in the tree, printing and stacking the nodes along the path, until it
reaches a leaf.

The program then enters the second mode of operation in which it retreats
back up the stacked path, popping the nodes of the path off the stack, until it
encounters a node on the path with a right sibling. The program then reverts
back to the first mode of operation, starting the descent from that unexplored
right sibling.

The program begins in mode one at the root and terminates when the
stack becomes empty. The complete program is shown in Fig. 3.9.
`
`3.3 Implementations of Trees
`
`In this section we shall present several basic implementations for trees and dis-
`cuss their capabilities for supporting the various tree operations introduced in
`Section 3.2.
`
`An Array Representation of Trees
`
Let T be a tree in which the nodes are named 1, 2, . . . , n. Perhaps the
simplest representation of T that supports the PARENT operation is a linear
array A in which entry A[i] is a pointer or a cursor to the parent of node i.
The root of T can be distinguished by giving it a null pointer or a pointer to
itself as parent. In Pascal, pointers to array elements are not feasible, so we
shall have to use a cursor scheme where A[i] = j if node j is the parent of
node i, and A[i] = 0 if node i is the root.

This representation uses the property of trees that each node has a unique
parent. With this representation the parent of a node can be found in
constant time. A path going up the tree, that is, from node to parent to parent,
and so on, can be traversed in time proportional to the number of nodes on
the path. We can also support the LABEL operator by adding another array
L, such that L[i] is the label of node i, or by making the elements of array A
be records consisting of an integer (cursor) and a label.
`
`.
`
`Example 3.6. The tree of Fig. 3.10(a) has the parent representation given by
`the array A shown in Fig. 3.10(b). [J
`
The parent pointer representation does not facilitate operations that
require child-of information. Given a node n, it is expensive to determine the
children of n, or the height of n. In addition, the parent pointer representation
does not specify the order of the children of a node. Thus, operations
like LEFTMOST_CHILD and RIGHT_SIBLING are not well defined. We
could impose an artificial order, for example, by numbering the children of
each node after numbering the parent, and numbering the children in
`
    procedure NPREORDER ( T: TREE );
    { nonrecursive preorder traversal of tree T }
    var
        m: node;  { a temporary }
        S: STACK; { stack of nodes holding path from the root
            to the parent TOP(S) of the "current" node m }
    begin
        { initialize }
        MAKENULL(S);
        m := ROOT(T);
        while true do
            if m <> Λ then begin
                print(LABEL(m, T));
                PUSH(m, S);
                { explore leftmost child of m }
                m := LEFTMOST_CHILD(m, T)
            end
            else begin
                { exploration of path on stack
                    is now complete }
                if EMPTY(S) then
                    return;
                { explore right sibling of node
                    on top of stack }
                m := RIGHT_SIBLING(TOP(S), T);
                POP(S)
            end
    end; { NPREORDER }

Fig. 3.9. A nonrecursive preorder procedure.
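The same two-mode algorithm carries over to any language with a stack. A Python sketch (the child-list dictionary is our encoding of Fig. 3.3's tree, and `right_sibling` here consults the parent left on the stack):

```python
children = {1: [2, 3, 4], 3: [5, 6], 5: [8, 9], 6: [10], 4: [7]}  # Fig. 3.3

def right_sibling(n, parent):
    """The child of parent immediately to the right of n, or None."""
    kids = children[parent]
    i = kids.index(n)
    return kids[i + 1] if i + 1 < len(kids) else None

def npreorder(root):
    """Nonrecursive preorder: descend leftmost, then retreat via the stack."""
    out, stack, m = [], [], root
    while True:
        if m is not None:
            out.append(m)                      # "print" the node
            stack.append(m)
            kids = children.get(m, [])
            m = kids[0] if kids else None      # explore leftmost child
        else:
            if not stack:
                return out                     # path exploration complete
            top = stack.pop()
            # explore the right sibling of the node on top of the stack
            m = right_sibling(top, stack[-1]) if stack else None

print(npreorder(1))   # [1, 2, 3, 5, 8, 9, 6, 10, 4, 7]
```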
`
increasing order from left to right. On that assumption, we have written the
function RIGHT_SIBLING in Fig. 3.11, for types node and TREE that are
defined as follows:

    type
        node = integer;
        TREE = array [1..maxnodes] of node;

For this implementation we assume the null node Λ is represented by 0.
`
`
`
`
`
`
`
`/\3
`/\ 9/ \0
`
`(a) a tree
`
`12345678910
`A “III-III“.-
`
`(b) parent representation
`
`Fig. 3.10. A tree and its parent pointer representation.
`
    function RIGHT_SIBLING ( n: node; T: TREE ) : node;
    { return the right sibling of node n in tree T }
    var
        i, parent: node;
    begin
        parent := T[n];
        for i := n + 1 to maxnodes do
            { search for node after n with same parent }
            if T[i] = parent then
                return (i);
        return (0)
        { null node will be returned
            if no right sibling is ever found }
    end; { RIGHT_SIBLING }

Fig. 3.11. Right sibling operation using array representation.
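A direct translation of Fig. 3.11 can be checked against a concrete parent array. In this Python sketch the array encodes the tree of Fig. 3.3 (the values in Fig. 3.10(b) are not legible in this copy, so this array is our own example); children happen to be numbered in increasing order, as the text assumes:

```python
maxnodes = 10
# T[i] is the parent of node i; T[i] == 0 marks the root.
# This array encodes the tree of Fig. 3.3 (1 -> 2, 3, 4; 3 -> 5, 6; ...).
T = [None, 0, 1, 1, 1, 3, 3, 4, 5, 5, 6]   # index 0 unused

def right_sibling(n):
    """Return the right sibling of node n, or 0 (the null node)."""
    parent = T[n]
    for i in range(n + 1, maxnodes + 1):
        if T[i] == parent:         # first later node with the same parent
            return i
    return 0

print(right_sibling(2))   # 3
print(right_sibling(5))   # 6
print(right_sibling(4))   # 0: node 4 has no right sibling
```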
`
`
Representation of Trees by Lists of Children

An important and useful way of representing trees is to form for each node a
list of its children. The lists can be represented by any of the methods
suggested in Chapter 2, but because the number of children each node may have
can be variable, the linked-list representations are often more appropriate.
Figure 3.12 suggests how the tree of Fig. 3.10(a) might be represented.
There is an array of header cells, indexed by nodes, which we assume to be
numbered 1, 2, . . . , 10. Each header points to a linked list of "elements,"
which are nodes. The elements on the list headed by header[i] are the
children of node i; for example, 9 and 10 are the children of 3.
`
`
`
Fig. 3.12. A linked-list representation of a tree.
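The header-array scheme is easy to mock up. A Python sketch (ordinary Python lists stand in for the linked lists; apart from the fact from the text that 9 and 10 are the children of node 3, the entries here are invented for illustration):

```python
maxnodes = 10
# header[i] holds the list of children of node i (empty list = leaf).
header = [[] for _ in range(maxnodes + 1)]   # index 0 unused
header[1] = [2, 3, 4]     # made-up entry for illustration
header[3] = [9, 10]       # from the text: 9 and 10 are children of 3
root = 1

def leftmost_child(n):
    """Return the leftmost child of node n, or 0 if n is a leaf."""
    return header[n][0] if header[n] else 0

print(leftmost_child(3))   # 9
print(leftmost_child(9))   # 0: node 9 is a leaf
```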
`
`
`
`
`
`
Let us first develop the data structures we need in terms of an abstract
data type LIST (of nodes), and then give a particular implementation of lists
and see how the abstractions fit together. Later, we shall see some of the
simplifications we can make. We begin with the following type declarations:

    type
        node = integer;
        LIST = { appropriate definition for list of nodes };
        position = { appropriate definition for positions in lists };
        TREE = record
            header: array [1..maxnodes] of LIST;
            labels: array [1..maxnodes] of labeltype;
            root: node
        end;
`
`
We assume that the root of each tree is stored explicitly in the root field.
Also, 0 is used to represent the null node.

Figure 3.13 shows the code for the LEFTMOST_CHILD operation. The
reader should write the code for the other operations as exercises.
`
    function LEFTMOST_CHILD ( n: node; T: TREE ) : node;
    { returns the leftmost child of node n of tree T }
    var
        L: LIST; { shorthand for the list of n's children }
    begin
        L := T.header[n];
        if EMPTY(L) then { n is a leaf }