Data Structures
and Algorithms

ALFRED V. AHO
Bell Laboratories
Murray Hill, New Jersey

JOHN E. HOPCROFT
Cornell University
Ithaca, New York

JEFFREY D. ULLMAN
Stanford University
Stanford, California

ADDISON-WESLEY PUBLISHING COMPANY
Reading, Massachusetts • Menlo Park, California
London • Amsterdam • Don Mills, Ontario • Sydney
This book is in the
ADDISON-WESLEY SERIES IN
COMPUTER SCIENCE AND INFORMATION PROCESSING

Michael A. Harrison
Consulting Editor
Library of Congress Cataloging in Publication Data

Aho, Alfred V.
    Data structures and algorithms.

    1. Data structures (Computer science)   2. Algorithms.
    I. Hopcroft, John E., 1939-   .   II. Ullman, Jeffrey D., 1942-   .   III. Title.
    QA76.9.D35A38   1982   001.64   82-11596
    ISBN 0-201-00023-7

Reproduced by Addison-Wesley from camera-ready copy supplied by the authors.

Reprinted with corrections April, 1987

Copyright © 1983 by Bell Telephone Laboratories, Incorporated.

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior written permission of the publisher. Printed in the United States of America. Published simultaneously in Canada.

ISBN: 0-201-00023-7
Contents

Chapter 1   Design and Analysis of Algorithms
    1.1   From Problems to Programs ... 1
    1.2   Abstract Data Types ... 10
    1.3   Data Types, Data Structures, and Abstract Data Types ... 13
    1.4   The Running Time of a Program ... 16
    1.5   Calculating the Running Time of a Program ... 21
    1.6   Good Programming Practice ... 27
    1.7   Super Pascal ... 29

Chapter 2   Basic Data Types
    2.1   The Data Type "List" ... 37
    2.2   Implementation of Lists ... 40
    2.3   Stacks ... 53
    2.4   Queues ... 56
    2.5   Mappings ... 61
    2.6   Stacks and Recursive Procedures ... 64

Chapter 3   Trees
    3.1   Basic Terminology ... 75
    3.2   The ADT TREE ... 82
    3.3   Implementations of Trees ... 84
    3.4   Binary Trees ... 93

Chapter 4   Basic Operations on Sets
    4.1   Introduction to Sets ... 107
    4.2   An ADT with Union, Intersection, and Difference ... 109
    4.3   A Bit-Vector Implementation of Sets ... 112
    4.4   A Linked-List Implementation of Sets ... 115
    4.5   The Dictionary ... 117
    4.6   Simple Dictionary Implementations ... 119
    4.7   The Hash Table Data Structure ... 122
    4.8   Estimating the Efficiency of Hash Functions ... 129
    4.9   Implementation of the Mapping ADT ... 135
    4.10  Priority Queues ... 135
    4.11  Implementations of Priority Queues ... 138
    4.12  Some Complex Set Structures ... 145

Chapter 5   Advanced Set Representation Methods
    5.1   Binary Search Trees ... 155
    5.2   Time Analysis of Binary Search Tree Operations ... 160
    5.3   Tries ... 163
    5.4   Balanced Tree Implementations of Sets ... 169
    5.5   Sets with the MERGE and FIND Operations ... 180
    5.6   An ADT with MERGE and SPLIT ... 189

Chapter 6   Directed Graphs
    6.1   Basic Definitions ... 198
    6.2   Representations for Directed Graphs ... 199
    6.3   The Single-Source Shortest Paths Problem ... 203
    6.4   The All-Pairs Shortest Path Problem ... 208
    6.5   Traversals of Directed Graphs ... 215
    6.6   Directed Acyclic Graphs ... 219
    6.7   Strong Components ... 222

Chapter 7   Undirected Graphs
    7.1   Definitions ... 230
    7.2   Minimum-Cost Spanning Trees ... 233
    7.3   Traversals ... 239
    7.4   Articulation Points and Biconnected Components ... 244
    7.5   Graph Matching ... 246

Chapter 8   Sorting
    8.1   The Internal Sorting Model ... 253
    8.2   Some Simple Sorting Schemes ... 254
    8.3   Quicksort ... 260
    8.4   Heapsort ... 271
    8.5   Bin Sorting ... 274
    8.6   A Lower Bound for Sorting by Comparisons ... 282
    8.7   Order Statistics ... 286

Chapter 9   Algorithm Analysis Techniques
    9.1   Efficiency of Algorithms ... 293
    9.2   Analysis of Recursive Programs ... 294
    9.3   Solving Recurrence Equations ... 296
    9.4   A General Solution for a Large Class of Recurrences ... 298

Chapter 10  Algorithm Design Techniques
    10.1  Divide-and-Conquer Algorithms ... 306
    10.2  Dynamic Programming ... 311
    10.3  Greedy Algorithms ... 321
    10.4  Backtracking ... 324
    10.5  Local Search Algorithms ... 336

Chapter 11  Data Structures and Algorithms for External Storage
    11.1  A Model of External Computation ... 347
    11.2  External Sorting ... 349
    11.3  Storing Information in Files ... 361
    11.4  External Search Trees ... 368

Chapter 12  Memory Management
    12.1  The Issues in Memory Management ... 378
    12.2  Managing Equal-Sized Blocks ... 382
    12.3  Garbage Collection Algorithms for Equal-Sized Blocks ... 384
    12.4  Storage Allocation for Objects with Mixed Sizes ... 392
    12.5  Buddy Systems ... 400
    12.6  Storage Compaction ... 404

Bibliography ... 411
Index ... 419

CHAPTER 3

Trees

A tree imposes a hierarchical structure on a collection of items. Familiar examples of trees are genealogies and organization charts. Trees are used to help analyze electrical circuits and to represent the structure of mathematical formulas. Trees also arise naturally in many different areas of computer science. For example, trees are used to organize information in database systems and to represent the syntactic structure of source programs in compilers. Chapter 5 describes applications of trees in the representation of data.
    Throughout this book, we shall use many different variants of trees. In this chapter we introduce the basic definitions and present some of the more common tree operations. We then describe some of the more frequently used data structures for trees that can be used to support these operations efficiently.
3.1 Basic Terminology

A tree is a collection of elements called nodes, one of which is distinguished as a root, along with a relation ("parenthood") that places a hierarchical structure on the nodes. A node, like an element of a list, can be of whatever type we wish. We often depict a node as a letter, a string, or a number with a circle around it. Formally, a tree can be defined recursively in the following manner.

1. A single node by itself is a tree. This node is also the root of the tree.
2. Suppose n is a node and T1, T2, ..., Tk are trees with roots n1, n2, ..., nk, respectively. We can construct a new tree by making n be the parent of nodes n1, n2, ..., nk. In this tree n is the root and T1, T2, ..., Tk are the subtrees of the root. Nodes n1, n2, ..., nk are called the children of node n.

Sometimes, it is convenient to include among trees the null tree, a "tree" with no nodes, which we shall represent by Λ.
Example 3.1. Consider the table of contents of a book, as suggested by Fig. 3.1(a). This table of contents is a tree. We can redraw it in the manner shown in Fig. 3.1(b). The parent-child relationship is depicted by a line. Trees are normally drawn top-down as in Fig. 3.1(b), with the parent above the child.
    The root, the node called "Book," has three subtrees with roots corresponding to the chapters C1, C2, and C3. This relationship is represented by the lines downward from Book to C1, C2, and C3. Book is the parent of C1, C2, and C3, and these three nodes are the children of Book.

Fig. 3.1. A table of contents and its tree representation.
The third subtree, with root C3, is a tree of a single node, while the other two subtrees have a nontrivial structure. For example, the subtree with root C2 has three subtrees, corresponding to the sections s2.1, s2.2, and s2.3; the last two are one-node trees, while the first has two subtrees corresponding to the subsections s2.1.1 and s2.1.2. □
    Example 3.1 is typical of one kind of data that is best represented as a tree. In this example, the parenthood relationship stands for containment; a parent node is comprised of its children, as Book is comprised of C1, C2, and C3. Throughout this book we shall encounter a variety of other relationships that can be represented by parenthood in trees.
    If n1, n2, ..., nk is a sequence of nodes in a tree such that ni is the parent of ni+1 for 1 ≤ i < k, then this sequence is called a path from node n1 to node nk. The length of a path is one less than the number of nodes in the path. Thus there is a path of length zero from every node to itself. For example, in Fig. 3.1 there is a path of length two, namely (C2, s2.1, s2.1.2), from C2 to s2.1.2.
    If there is a path from node a to node b, then a is an ancestor of b, and b is a descendant of a. For example, in Fig. 3.1, the ancestors of s2.1 are itself, C2, and Book, while its descendants are itself, s2.1.1, and s2.1.2. Notice that any node is both an ancestor and a descendant of itself. An ancestor or descendant of a node, other than the node itself, is called a proper ancestor or proper descendant, respectively. In a tree, the root is the only node with no proper ancestors. A node with no proper descendants is called a leaf. A subtree of a tree is a node, together with all its descendants.
    The height of a node in a tree is the length of a longest path from the node to a leaf. In Fig. 3.1 node C1 has height 1, node C2 height 2, and node C3 height 0. The height of a tree is the height of the root. The depth of a node is the length of the unique path from the root to that node.
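    For instance, the height of a node can be computed directly from this definition. The sketch below is ours, not the book's; it is written in the book's Super Pascal style and uses the tree operations LEFTMOST_CHILD and RIGHT_SIBLING that Section 3.2 introduces, with Λ denoting the null node and the tree T assumed to be accessible as in the later figures.

function HEIGHT ( n: node ): integer;
    { returns the length of a longest path from node n to a leaf;
      a leaf has height 0 }
    var
        c: node;
        h, hc: integer;
    begin
        h := -1;                        { no children seen yet }
        c := LEFTMOST_CHILD(n, T);
        while c <> Λ do begin
            hc := HEIGHT(c);            { height of one subtree }
            if hc > h then
                h := hc;
            c := RIGHT_SIBLING(c, T)
        end;
        return (h + 1)                  { one more than the tallest child }
    end; { HEIGHT }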
The Order of Nodes

The children of a node are usually ordered from left-to-right. Thus the two trees of Fig. 3.2 are different because the two children of node a appear in a different order in the two trees. If we wish explicitly to ignore the order of children, we shall refer to a tree as an unordered tree.

Fig. 3.2. Two distinct (ordered) trees.

The "left-to-right" ordering of siblings (children of the same node) can be extended to compare any two nodes that are not related by the ancestor-descendant relationship. The relevant rule is that if a and b are siblings, and a is to the left of b, then all the descendants of a are to the left of all the descendants of b.
Example 3.2. Consider the tree in Fig. 3.3. Node 8 is to the right of node 2, to the left of nodes 9, 6, 10, 4, and 7, and neither left nor right of its ancestors 1, 3, and 5.
    A simple rule, given a node n, for finding those nodes to its left and those to its right, is to draw the path from the root to n. All nodes branching off to the left of this path, and all descendants of such nodes, are to the left of n. All nodes and descendants of nodes branching off to the right are to the right of n. □

Fig. 3.3. A tree.
Preorder, Postorder, and Inorder
There are several useful ways in which we can systematically order all nodes of a tree. The three most important orderings are called preorder, inorder, and postorder; these orderings are defined recursively as follows.

    If a tree T is null, then the empty list is the preorder, inorder, and postorder listing of T.
    If T consists of a single node, then that node by itself is the preorder, inorder, and postorder listing of T.
    Otherwise, let T be a tree with root n and subtrees T1, T2, ..., Tk, as suggested in Fig. 3.4.

Fig. 3.4. Tree T.
1. The preorder listing (or preorder traversal) of the nodes of T is the root n of T followed by the nodes of T1 in preorder, then the nodes of T2 in preorder, and so on, up to the nodes of Tk in preorder.
2. The inorder listing of the nodes of T is the nodes of T1 in inorder, followed by node n, followed by the nodes of T2, ..., Tk, each group of nodes in inorder.
3. The postorder listing of the nodes of T is the nodes of T1 in postorder, then the nodes of T2 in postorder, and so on, up to Tk, all followed by node n.
Figure 3.5(a) shows a sketch of a procedure to list the nodes of a tree in preorder. To make it a postorder procedure, we simply reverse the order of steps (1) and (2). Figure 3.5(b) is a sketch of an inorder procedure. In each case, we produce the desired ordering of the tree by calling the appropriate procedure on the root of the tree.
Example 3.3. Let us list the tree of Fig. 3.3 in preorder. We first list 1 and then call PREORDER on the first subtree of 1, the subtree with root 2. This subtree is a single node, so we simply list it. Then we proceed to the second subtree of 1, the tree rooted at 3. We list 3, and then call PREORDER on the first subtree of 3. That call results in listing 5, 8, and 9, in that order.
procedure PREORDER ( n: node );
    begin
(1)     list n;
(2)     for each child c of n, if any, in order from the left do
            PREORDER(c)
    end; { PREORDER }

        (a) PREORDER procedure.

procedure INORDER ( n: node );
    begin
        if n is a leaf then
            list n
        else begin
            INORDER(leftmost child of n);
            list n;
            for each child c of n, except for the leftmost,
                in order from the left do
                INORDER(c)
        end
    end; { INORDER }

        (b) INORDER procedure.

Fig. 3.5. Recursive ordering procedures.
Continuing in this manner, we obtain the complete preorder traversal of Fig. 3.3: 1, 2, 3, 5, 8, 9, 6, 10, 4, 7.
    Similarly, by simulating Fig. 3.5(a) with the steps reversed, we can discover that the postorder of Fig. 3.3 is 2, 8, 9, 5, 10, 6, 3, 7, 4, 1. By simulating Fig. 3.5(b), we find that the inorder listing of Fig. 3.3 is 2, 1, 8, 5, 9, 3, 10, 6, 7, 4. □
A useful trick for producing the three node orderings is the following. Imagine we walk around the outside of the tree, starting at the root, moving counterclockwise, and staying as close to the tree as possible; the path we have in mind for Fig. 3.3 is shown in Fig. 3.6.
    For preorder, we list a node the first time we pass it. For postorder, we list a node the last time we pass it, as we move up to its parent. For inorder, we list a leaf the first time we pass it, but list an interior node the second time we pass it. For example, node 1 in Fig. 3.6 is passed the first time at the beginning, and the second time while passing through the "bay" between nodes 2 and 3. Note that the order of the leaves in the three orderings is always the same left-to-right ordering of the leaves. It is only the ordering of the interior nodes and their relationship to the leaves that vary among the three.

Fig. 3.6. Traversal of a tree.
Labeled Trees and Expression Trees
Often it is useful to associate a label, or value, with each node of a tree, in the same spirit with which we associated a value with a list element in the previous chapter. That is, the label of a node is not the name of the node, but a value that is "stored" at the node. In some applications we shall even change the label of a node, while the name of a node remains the same. A useful analogy is tree:list = label:element = node:position.
Example 3.4. Figure 3.7 shows a labeled tree representing the arithmetic expression (a+b) * (a+c), where n1, ..., n7 are the names of the nodes, and the labels, by convention, are shown next to the nodes. The rules whereby a labeled tree represents an expression are as follows:

1. Every leaf is labeled by an operand and consists of that operand alone. For example, node n4 represents the expression a.
2. Every interior node n is labeled by an operator. Suppose n is labeled by a binary operator θ, such as + or *, and that the left child represents expression E1 and the right child E2. Then n represents expression (E1) θ (E2). We may remove the parentheses if they are not necessary. For example, node n2 has operator +, and its left and right children represent the expressions a and b, respectively. Therefore, n2 represents (a)+(b), or just a+b. Node n1 represents (a+b)*(a+c), since * is the label at n1, and a+b and a+c are the expressions represented by n2 and n3, respectively. □

Fig. 3.7. Expression tree with labels.
Often, when we produce the preorder, inorder, or postorder listing of a tree, we prefer to list not the node names, but rather the labels. In the case of an expression tree, the preorder listing of the labels gives us what is known as the prefix form of an expression, where the operator precedes its left operand and its right operand. To be precise, the prefix expression for a single operand a is a itself. The prefix expression for (E1) θ (E2), with θ a binary operator, is θP1P2, where P1 and P2 are the prefix expressions for E1 and E2. Note that no parentheses are necessary in the prefix expression, since we can scan the prefix expression θP1P2 and uniquely identify P1 as the shortest (and only) prefix of P1P2 that is a legal prefix expression.
    For example, the preorder listing of the labels of Fig. 3.7 is *+ab+ac. The prefix expression for n2, which is +ab, is the shortest legal prefix of +ab+ac.
    Similarly, a postorder listing of the labels of an expression tree gives us what is known as the postfix (or Polish) representation of an expression. The expression (E1) θ (E2) is represented by the postfix expression P1P2θ, where P1 and P2 are the postfix representations of E1 and E2, respectively. Again, no parentheses are necessary in the postfix representation, as we can deduce what P2 is by looking for the shortest suffix of P1P2 that is a legal postfix expression. For example, the postfix expression for Fig. 3.7 is ab+ac+*. If we write this expression as P1P2*, then P2 is ac+, the shortest suffix of ab+ac+ that is a legal postfix expression.
The inorder traversal of an expression tree gives the infix expression itself, but with no parentheses. For example, the inorder listing of the labels of Fig. 3.7 is a + b * a + c. The reader is invited to provide an algorithm for traversing an expression tree and producing an infix expression with all needed pairs of parentheses.
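    One possible solution is sketched below. This is our sketch, not the book's; it is written in the style of Fig. 3.8 using the operations of Section 3.2, and it assumes that every interior node is labeled by a binary operator and therefore has exactly two children.

procedure INFIX ( n: node );
    { list a fully parenthesized infix expression for the
      subtree of T rooted at n }
    var
        c: node;
    begin
        c := LEFTMOST_CHILD(n, T);
        if c = Λ then                     { n is a leaf: an operand }
            print(LABEL(n, T))
        else begin                        { n is a binary operator }
            print('(');
            INFIX(c);                     { left operand }
            print(LABEL(n, T));           { the operator }
            INFIX(RIGHT_SIBLING(c, T));   { right operand }
            print(')')
        end
    end; { INFIX }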
`
Computing Ancestral Information

The preorder and postorder traversals of a tree are useful in obtaining ancestral information. Suppose postorder(n) is the position of node n in a postorder listing of the nodes of a tree. Suppose desc(n) is the number of proper descendants of node n. For example, in the tree of Fig. 3.7 the postorder numbers of nodes n2, n4, and n5 are 3, 1, and 2, respectively.
    The postorder numbers assigned to the nodes have the useful property that the nodes in the subtree with root n are numbered consecutively from postorder(n) - desc(n) to postorder(n). To test if a vertex x is a descendant of vertex y, all we need do is determine whether

        postorder(y) - desc(y) ≤ postorder(x) ≤ postorder(y).

A similar property holds for preorder.
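    In code the test is just a pair of comparisons. The sketch below is ours, not the book's; the arrays postorder and desc (our names) are assumed to have been filled in by a postorder traversal beforehand, and node is the integer node type used later in this chapter.

var
    postorder: array [1..maxnodes] of integer;   { postorder number of each node }
    desc: array [1..maxnodes] of integer;        { number of proper descendants }

function IS_DESCENDANT ( x, y: node ): boolean;
    { true if and only if x is a descendant of y }
    begin
        return ((postorder[y] - desc[y] <= postorder[x]) and
                (postorder[x] <= postorder[y]))
    end; { IS_DESCENDANT }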
`
3.2 The ADT TREE

In Chapter 2, lists, stacks, queues, and mappings were treated as abstract data types (ADT's). In this chapter trees will be treated both as ADT's and as data structures. One of our most important uses of trees occurs in the design of implementations for the various ADT's we study. For example, in Section 5.1, we shall see how a "binary search tree" can be used to implement abstract data types based on the mathematical model of a set, together with operations such as INSERT, DELETE, and MEMBER (to test whether an element is in a set). The next two chapters present a number of other tree implementations of various ADT's.
    In this section, we shall present several useful operations on trees and show how tree algorithms can be designed in terms of these operations. As with lists, there are a great variety of operations that can be performed on trees. Here, we shall consider the following operations:

1. PARENT(n, T). This function returns the parent of node n in tree T. If n is the root, which has no parent, Λ is returned. In this context, Λ is a "null node," which is used as a signal that we have navigated off the tree.
2. LEFTMOST_CHILD(n, T) returns the leftmost child of node n in tree T, and it returns Λ if n is a leaf, which therefore has no children.
3. RIGHT_SIBLING(n, T) returns the right sibling of node n in tree T, defined to be that node m with the same parent p as n such that m lies immediately to the right of n in the ordering of the children of p. For example, for the tree in Fig. 3.7, LEFTMOST_CHILD(n2) = n4; RIGHT_SIBLING(n4) = n5, and RIGHT_SIBLING(n5) = Λ.
4. LABEL(n, T) returns the label of node n in tree T. We do not, however, require labels to be defined for every tree.
5. CREATEi(v, T1, T2, ..., Ti) is one of an infinite family of functions, one for each value of i = 0, 1, 2, .... CREATEi makes a new node r with label v and gives it i children, which are the roots of trees T1, T2, ..., Ti, in order from the left. The tree with root r is returned. Note that if i = 0, then r is both a leaf and the root.
6. ROOT(T) returns the node that is the root of tree T, or Λ if T is the null tree.
7. MAKENULL(T) makes T be the null tree.
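    To illustrate how these operations combine, the labeled tree of Fig. 3.7 for (a+b)*(a+c) could be built bottom-up with three calls of CREATE2 and four calls of CREATE0. The fragment below is our sketch, not the book's, and it assumes that labeltype is a character type; the variable names are ours.

var
    T1, T2, T: TREE;
begin
    T1 := CREATE2('+', CREATE0('a'), CREATE0('b'));   { the subtree for a+b }
    T2 := CREATE2('+', CREATE0('a'), CREATE0('c'));   { the subtree for a+c }
    T  := CREATE2('*', T1, T2)                        { the whole expression tree }
end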
`
Example 3.5. Let us write both recursive and nonrecursive procedures to take a tree and list the labels of its nodes in preorder. We assume that there are data types node and TREE already defined for us, and that the data type TREE is for trees with labels of the type labeltype. Figure 3.8 shows a recursive procedure that, given node n, lists the labels of the subtree rooted at n in preorder. We call PREORDER(ROOT(T)) to get a preorder listing of tree T.

procedure PREORDER ( n: node );
    { list the labels of the descendants of n in preorder }
    var
        c: node;
    begin
        print(LABEL(n, T));
        c := LEFTMOST_CHILD(n, T);
        while c <> Λ do begin
            PREORDER(c);
            c := RIGHT_SIBLING(c, T)
        end
    end; { PREORDER }

Fig. 3.8. A recursive preorder listing procedure.
We shall also develop a nonrecursive procedure to print a tree in preorder. To find our way around the tree, we shall use a stack S, whose type STACK is really "stack of nodes." The basic idea underlying our algorithm is that when we are at a node n, the stack will hold the path from the root to n, with the root at the bottom of the stack and node n at the top.†

† Recall our discussion of recursion in Section 2.6 in which we illustrated how the implementation of a recursive procedure involves a stack of activation records. If we examine Fig. 3.8, we can observe that when PREORDER(n) is called, the active procedure calls, and therefore the stack of activation records, correspond to the calls of PREORDER for all the ancestors of n. Thus our nonrecursive preorder procedure, like the example in Section 2.6, models closely the way the recursive procedure is implemented.
One way to perform a nonrecursive preorder traversal of a tree is given by the program NPREORDER shown in Fig. 3.9. This program has two modes of operation. In the first mode it descends down the leftmost unexplored path in the tree, printing and stacking the nodes along the path, until it reaches a leaf.
    The program then enters the second mode of operation in which it retreats back up the stacked path, popping the nodes of the path off the stack, until it encounters a node on the path with a right sibling. The program then reverts back to the first mode of operation, starting the descent from that unexplored right sibling.
    The program begins in mode one at the root and terminates when the stack becomes empty. The complete program is shown in Fig. 3.9.
3.3 Implementations of Trees

In this section we shall present several basic implementations for trees and discuss their capabilities for supporting the various tree operations introduced in Section 3.2.
An Array Representation of Trees

Let T be a tree in which the nodes are named 1, 2, ..., n. Perhaps the simplest representation of T that supports the PARENT operation is a linear array A in which entry A[i] is a pointer or a cursor to the parent of node i. The root of T can be distinguished by giving it a null pointer or a pointer to itself as parent. In Pascal, pointers to array elements are not feasible, so we shall have to use a cursor scheme where A[i] = j if node j is the parent of node i, and A[i] = 0 if node i is the root.
This representation uses the property of trees that each node has a unique parent. With this representation the parent of a node can be found in constant time. A path going up the tree, that is, from node to parent to parent, and so on, can be traversed in time proportional to the number of nodes on the path. We can also support the LABEL operator by adding another array L, such that L[i] is the label of node i, or by making the elements of array A be records consisting of an integer (cursor) and a label.
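    For example, the depth of a node can be found by following parent cursors until the root's 0 entry is reached. The function below is our sketch, not the book's; it uses the node and TREE declarations for this representation that are given just before Fig. 3.10.

function DEPTH ( n: node; var A: TREE ): integer;
    { returns the number of nodes on the path from the root to n, less one,
      where A[i] is the parent of node i and A[root] = 0 }
    var
        d: integer;
    begin
        d := 0;
        while A[n] <> 0 do begin    { climb toward the root }
            d := d + 1;
            n := A[n]
        end;
        return (d)
    end; { DEPTH }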
`
Example 3.6. The tree of Fig. 3.10(a) has the parent representation given by the array A shown in Fig. 3.10(b). □
The parent pointer representation does not facilitate operations that require child-of information. Given a node n, it is expensive to determine the children of n, or the height of n. In addition, the parent pointer representation does not specify the order of the children of a node. Thus, operations like LEFTMOST_CHILD and RIGHT_SIBLING are not well defined. We could impose an artificial order, for example, by numbering the children of each node after numbering the parent, and numbering the children in increasing order from left to right.
procedure NPREORDER ( T: TREE );
    { nonrecursive preorder traversal of tree T }
    var
        m: node;     { a temporary }
        S: STACK;    { stack of nodes holding path from the root
                       to the parent TOP(S) of the "current" node m }
    begin
        { initialize }
        MAKENULL(S);
        m := ROOT(T);
        while true do
            if m <> Λ then begin
                print(LABEL(m, T));
                PUSH(m, S);
                { explore leftmost child of m }
                m := LEFTMOST_CHILD(m, T)
            end
            else begin
                { exploration of path on stack is now complete }
                if EMPTY(S) then
                    return;
                { explore right sibling of node on top of stack }
                m := RIGHT_SIBLING(TOP(S), T);
                POP(S)
            end
    end; { NPREORDER }

Fig. 3.9. A nonrecursive preorder procedure.
On that assumption, we have written the function RIGHT_SIBLING in Fig. 3.11, for types node and TREE that are defined as follows:

type
    node = integer;
    TREE = array [1..maxnodes] of node;

For this implementation we assume the null node Λ is represented by 0.

Fig. 3.10. A tree and its parent pointer representation.
`
function RIGHT_SIBLING ( n: node; T: TREE ): node;
    { return the right sibling of node n in tree T }
    var
        i, parent: node;
    begin
        parent := T[n];
        for i := n + 1 to maxnodes do
            { search for node after n with same parent }
            if T[i] = parent then
                return (i);
        return (0)
        { null node will be returned
          if no right sibling is ever found }
    end; { RIGHT_SIBLING }

Fig. 3.11. Right sibling operation using array representation.
Representation of Trees by Lists of Children

An important and useful way of representing trees is to form for each node a list of its children. The lists can be represented by any of the methods suggested in Chapter 2, but because the number of children each node may have can be variable, the linked-list representations are often more appropriate.
    Figure 3.12 suggests how the tree of Fig. 3.10(a) might be represented. There is an array of header cells, indexed by nodes, which we assume to be numbered 1, 2, ..., 10. Each header points to a linked list of "elements," which are nodes. The elements on the list headed by header[i] are the children of node i; for example, 9 and 10 are the children of 3.

Fig. 3.12. A linked-list representation of a tree.
Let us first develop the data structures we need in terms of an abstract data type LIST (of nodes), and then give a particular implementation of lists and see how the abstractions fit together. Later, we shall see some of the simplifications we can make. We begin with the following type declarations:

type
    node = integer;
    LIST = { appropriate definition for list of nodes };
    position = { appropriate definition for positions in lists };
    TREE = record
        header: array [1..maxnodes] of LIST;
        labels: array [1..maxnodes] of labeltype;
        root: node
    end;
We assume that the root of each tree is stored explicitly in the root field. Also, 0 is used to represent the null node. Figure 3.13 shows the code for the LEFTMOST_CHILD operation. The reader should write the code for the other operations as exercises; sketches of two of the simplest, LABEL and ROOT, are given after Fig. 3.13.
function LEFTMOST_CHILD ( n: node; T: TREE ): node;
    { returns the leftmost child of node n of tree T }
    var
        L: LIST;    { shorthand for the list of n's children }
    begin
        L := T.header[n];
        if EMPTY(L) then    { n is a leaf }
            return (0)
        else
            return (RETRIEVE(FIRST(L), L))
    end; { LEFTMOST_CHILD }

Fig. 3.13. Function to find leftmost child.
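    The two simplest of those exercises are sketched here. These are our sketches, not the book's, and the ROOT sketch assumes that the null tree is represented by a root field equal to 0, the null node.

function LABEL ( n: node; T: TREE ): labeltype;
    { returns the label of node n in tree T }
    begin
        return (T.labels[n])
    end; { LABEL }

function ROOT ( T: TREE ): node;
    { returns the root of T, or 0 (the null node) if T is the null tree }
    begin
        return (T.root)
    end; { ROOT }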
Now let us choose a particular implementation of lists, in which both LIST and position are integers, used as cursors into an array cellspace of records:

var
    cellspace: array [1..maxnodes] of record
        node: integer;
        next: integer
    end;
To simplify, we shall not insist that lists of children have header cells. Rather, we shall let T.header[n] point directly to the first cell of the list, as is suggested by Fig. 3.12. Figure 3.14(a) shows the function LEFTMOST_CHILD of Fig. 3.13 rewritten for this specific implementation. Figure 3.14(b) shows the operator PARENT, which is more difficult to write using this representation of lists, since a search of all lists is required to determine on which list a given node appears.
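    Figure 3.14 itself is not reproduced in this excerpt. As a rough indication of what such code might look like, here are our own sketches of the two functions, written against the cellspace declaration above, with 0 serving both as the null cursor and as the null node; they are not copied from the book's figure.

function LEFTMOST_CHILD ( n: node; T: TREE ): node;
    { cursor version: T.header[n] is the index in cellspace of the first
      cell on n's list of children, or 0 if n has no children }
    begin
        if T.header[n] = 0 then        { n is a leaf }
            return (0)
        else
            return (cellspace[T.header[n]].node)
    end; { LEFTMOST_CHILD }

function PARENT ( n: node; T: TREE ): node;
    { cursor version: search every list of children for node n }
    var
        p: node;        { candidate parent }
        i: integer;     { cursor running down p's list of children }
    begin
        for p := 1 to maxnodes do begin
            i := T.header[p];
            while i <> 0 do
                if cellspace[i].node = n then
                    return (p)
                else
                    i := cellspace[i].next
        end;
        return (0)      { no parent found; n is the root (or not in T) }
    end; { PARENT }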
`
The Leftmost-Child, Right-Sibling Representation

The data structure described above has, among other shortcomings, the inability to create large trees from smaller ones, using the CREATEi op
