DATA STRUCTURES AND ALGORITHMS

ALFRED V. AHO
JOHN E. HOPCROFT
JEFFREY D. ULLMAN

Data Structures
and Algorithms

ALFRED V. AHO
Bell Laboratories
Murray Hill, New Jersey

JOHN E. HOPCROFT
Cornell University
Ithaca, New York

JEFFREY D. ULLMAN
Stanford University
Stanford, California

ADDISON-WESLEY PUBLISHING COMPANY
Reading, Massachusetts · Menlo Park, California · London
Amsterdam · Don Mills, Ontario · Sydney


This book is in the
ADDISON-WESLEY SERIES IN
COMPUTER SCIENCE AND INFORMATION PROCESSING

Michael A. Harrison
Consulting Editor

Library of Congress Cataloging in Publication Data

Aho, Alfred V.
    Data structures and algorithms.

    1. Data structures (Computer science)  2. Algorithms.
    I. Hopcroft, John E., 1939-  .  II. Ullman, Jeffrey D., 1942-  .
    III. Title.
    QA76.9.D35A38   1982   001.64   82-11596
    ISBN 0-201-00023-7

Reproduced by Addison-Wesley from camera-ready copy supplied by the authors.

Reprinted with corrections April, 1987

Copyright © 1983 by Bell Telephone Laboratories, Incorporated.

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior written permission of the publisher. Printed in the United States of America. Published simultaneously in Canada.

ISBN: 0-201-00023-7

Contents

Chapter 1  Design and Analysis of Algorithms
  1.1  From Problems to Programs ............................................. 1
  1.2  Abstract Data Types .................................................. 10
  1.3  Data Types, Data Structures, and Abstract Data Types ................ 13
  1.4  The Running Time of a Program ....................................... 16
  1.5  Calculating the Running Time of a Program ........................... 21
  1.6  Good Programming Practice ........................................... 27
  1.7  Super Pascal ......................................................... 29

Chapter 2  Basic Data Types
  2.1  The Data Type "List" ................................................ 37
  2.2  Implementation of Lists ............................................. 40
  2.3  Stacks ............................................................... 53
  2.4  Queues ............................................................... 56
  2.5  Mappings ............................................................. 61
  2.6  Stacks and Recursive Procedures ..................................... 64

Chapter 3  Trees
  3.1  Basic Terminology ................................................... 75
  3.2  The ADT TREE ........................................................ 82
  3.3  Implementations of Trees ............................................ 84
  3.4  Binary Trees ........................................................ 93

Chapter 4  Basic Operations on Sets
  4.1  Introduction to Sets ............................................... 107
  4.2  An ADT with Union, Intersection, and Difference ................... 109
  4.3  A Bit-Vector Implementation of Sets ............................... 112
  4.4  A Linked-List Implementation of Sets .............................. 115
  4.5  The Dictionary ..................................................... 117
  4.6  Simple Dictionary Implementations ................................. 119
  4.7  The Hash Table Data Structure ..................................... 122
  4.8  Estimating the Efficiency of Hash Functions ....................... 129
  4.9  Implementation of the Mapping ADT ................................. 135
  4.10 Priority Queues .................................................... 135
  4.11 Implementations of Priority Queues ................................ 138
  4.12 Some Complex Set Structures ....................................... 145

Chapter 5  Advanced Set Representation Methods
  5.1  Binary Search Trees ................................................ 155
  5.2  Time Analysis of Binary Search Tree Operations .................... 160
  5.3  Tries .............................................................. 163
  5.4  Balanced Tree Implementations of Sets ............................. 169
  5.5  Sets with the MERGE and FIND Operations ........................... 180
  5.6  An ADT with MERGE and SPLIT ....................................... 189

Chapter 6  Directed Graphs
  6.1  Basic Definitions .................................................. 198
  6.2  Representations for Directed Graphs ............................... 199
  6.3  The Single-Source Shortest Paths Problem .......................... 203
  6.4  The All-Pairs Shortest Path Problem ............................... 208
  6.5  Traversals of Directed Graphs ..................................... 215
  6.6  Directed Acyclic Graphs ........................................... 219
  6.7  Strong Components .................................................. 222

Chapter 7  Undirected Graphs
  7.1  Definitions ........................................................ 230
  7.2  Minimum-Cost Spanning Trees ....................................... 233
  7.3  Traversals ......................................................... 239
  7.4  Articulation Points and Biconnected Components .................... 244
  7.5  Graph Matching ..................................................... 246

Chapter 8  Sorting
  8.1  The Internal Sorting Model ........................................ 253
  8.2  Some Simple Sorting Schemes ....................................... 254
  8.3  Quicksort .......................................................... 260
  8.4  Heapsort ........................................................... 271
  8.5  Bin Sorting ........................................................ 274
  8.6  A Lower Bound for Sorting by Comparisons .......................... 282
  8.7  Order Statistics ................................................... 286

Chapter 9  Algorithm Analysis Techniques
  9.1  Efficiency of Algorithms .......................................... 293
  9.2  Analysis of Recursive Programs .................................... 294
  9.3  Solving Recurrence Equations ...................................... 296
  9.4  A General Solution for a Large Class of Recurrences ............... 298

Chapter 10  Algorithm Design Techniques
  10.1  Divide-and-Conquer Algorithms .................................... 306
  10.2  Dynamic Programming ............................................... 311
  10.3  Greedy Algorithms ................................................. 321
  10.4  Backtracking ...................................................... 324
  10.5  Local Search Algorithms ........................................... 336

Chapter 11  Data Structures and Algorithms for External Storage
  11.1  A Model of External Computation ................................... 347
  11.2  External Sorting ................................................... 349
  11.3  Storing Information in Files ...................................... 361
  11.4  External Search Trees ............................................. 368

Chapter 12  Memory Management
  12.1  The Issues in Memory Management ................................... 378
  12.2  Managing Equal-Sized Blocks ....................................... 382
  12.3  Garbage Collection Algorithms for Equal-Sized Blocks ............. 384
  12.4  Storage Allocation for Objects with Mixed Sizes .................. 392
  12.5  Buddy Systems ..................................................... 400
  12.6  Storage Compaction ................................................ 404

Bibliography ............................................................. 411

Index .................................................................... 419

CHAPTER 3

Trees

A tree imposes a hierarchical structure on a collection of items. Familiar examples of trees are genealogies and organization charts. Trees are used to help analyze electrical circuits and to represent the structure of mathematical formulas. Trees also arise naturally in many different areas of computer science. For example, trees are used to organize information in database systems and to represent the syntactic structure of source programs in compilers. Chapter 5 describes applications of trees in the representation of data. Throughout this book, we shall use many different variants of trees. In this chapter we introduce the basic definitions and present some of the more common tree operations. We then describe some of the more frequently used data structures for trees that can be used to support these operations efficiently.

3.1 Basic Terminology

A tree is a collection of elements called nodes, one of which is distinguished as a root, along with a relation ("parenthood") that places a hierarchical structure on the nodes. A node, like an element of a list, can be of whatever type we wish. We often depict a node as a letter, a string, or a number with a circle around it. Formally, a tree can be defined recursively in the following manner.

1. A single node by itself is a tree. This node is also the root of the tree.
2. Suppose n is a node and T1, T2, ..., Tk are trees with roots n1, n2, ..., nk, respectively. We can construct a new tree by making n be the parent of nodes n1, n2, ..., nk. In this tree n is the root and T1, T2, ..., Tk are the subtrees of the root. Nodes n1, n2, ..., nk are called the children of node n.

Sometimes, it is convenient to include among trees the null tree, a "tree" with no nodes, which we shall represent by Λ.

Example 3.1. Consider the table of contents of a book, as suggested by Fig. 3.1(a). This table of contents is a tree. We can redraw it in the manner shown in Fig. 3.1(b). The parent-child relationship is depicted by a line. Trees are normally drawn top-down as in Fig. 3.1(b), with the parent above the child. The root, the node called "Book," has three subtrees with roots corresponding to the chapters C1, C2, and C3. This relationship is represented by the lines downward from Book to C1, C2, and C3. Book is the parent of C1, C2, and C3, and these three nodes are the children of Book.

[Fig. 3.1(a): a table of contents — Book; C1 with sections s1.1 and s1.2; C2 with sections s2.1 (subsections s2.1.1 and s2.1.2), s2.2, and s2.3; C3. Fig. 3.1(b): the same structure redrawn as a tree, parent above child.]

Fig. 3.1. A table of contents and its tree representation.

The third subtree, with root C3, is a tree of a single node, while the other two subtrees have a nontrivial structure. For example, the subtree with root C2 has three subtrees, corresponding to the sections s2.1, s2.2, and s2.3; the last two are one-node trees, while the first has two subtrees corresponding to the subsections s2.1.1 and s2.1.2. □

Example 3.1 is typical of one kind of data that is best represented as a tree. In this example, the parenthood relationship stands for containment; a parent node is comprised of its children, as Book is comprised of C1, C2, and C3. Throughout this book we shall encounter a variety of other relationships that can be represented by parenthood in trees.

If n1, n2, ..., nk is a sequence of nodes in a tree such that ni is the parent of ni+1 for 1 ≤ i < k, then this sequence is called a path from node n1 to node nk. The length of a path is one less than the number of nodes in the path. Thus there is a path of length zero from every node to itself. For example, in Fig. 3.1 there is a path of length two, namely (C2, s2.1, s2.1.2), from C2 to s2.1.2.

If there is a path from node a to node b, then a is an ancestor of b, and b is a descendant of a. For example, in Fig. 3.1, the ancestors of s2.1 are itself, C2, and Book, while its descendants are itself, s2.1.1, and s2.1.2. Notice that any node is both an ancestor and a descendant of itself.

An ancestor or descendant of a node, other than the node itself, is called a proper ancestor or proper descendant, respectively. In a tree, the root is the only node with no proper ancestors. A node with no proper descendants is called a leaf. A subtree of a tree is a node, together with all its descendants.

The height of a node in a tree is the length of a longest path from the node to a leaf. In Fig. 3.1 node C1 has height 1, node C2 height 2, and node C3 height 0. The height of a tree is the height of the root. The depth of a node is the length of the unique path from the root to that node.
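The chapter's programs are in Pascal; purely as an illustrative check of the height and depth definitions, here is a short Python sketch. The dictionary of children lists encodes the tree of Fig. 3.1, with node names taken from the figure.

```python
# Children lists for the tree of Fig. 3.1; nodes absent from the
# dictionary are leaves.
children = {
    "Book": ["C1", "C2", "C3"],
    "C1": ["s1.1", "s1.2"],
    "C2": ["s2.1", "s2.2", "s2.3"],
    "s2.1": ["s2.1.1", "s2.1.2"],
}

def height(n):
    # Length of a longest path from n down to a leaf.
    kids = children.get(n, [])
    if not kids:
        return 0
    return 1 + max(height(c) for c in kids)

def depth(n, root="Book", d=0):
    # Length of the unique path from the root to n (None if n is absent).
    if n == root:
        return d
    for c in children.get(root, []):
        r = depth(n, c, d + 1)
        if r is not None:
            return r
    return None
```

With this encoding, C1 has height 1, C2 height 2, and C3 height 0, exactly as stated above.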
The Order of Nodes

The children of a node are usually ordered from left to right. Thus the two trees of Fig. 3.2 are different because the two children of node a appear in a different order in the two trees. If we wish explicitly to ignore the order of children, we shall refer to a tree as an unordered tree.

Fig. 3.2. Two distinct (ordered) trees.

The "left-to-right" ordering of siblings (children of the same node) can be extended to compare any two nodes that are not related by the ancestor-descendant relationship. The relevant rule is that if a and b are siblings, and a is to the left of b, then all the descendants of a are to the left of all the descendants of b.

Example 3.2. Consider the tree in Fig. 3.3. Node 8 is to the right of node 2, to the left of nodes 9, 6, 10, 4, and 7, and neither left nor right of its ancestors 1, 3, and 5.

[Fig. 3.3 (reconstructed from Examples 3.2 and 3.3): a tree with root 1, whose children are 2, 3, and 4; node 3 has children 5 and 6; node 5 has children 8 and 9; node 6 has child 10; node 4 has child 7.]

A simple rule, given a node n, for finding those nodes to its left and those to its right, is to draw the path from the root to n. All nodes branching off to the left of this path, and all descendants of such nodes, are to the left of n. All nodes and descendants of nodes branching off to the right are to the right of n. □

Preorder, Postorder, and Inorder

There are several useful ways in which we can systematically order all nodes of a tree. The three most important orderings are called preorder, inorder, and postorder; these orderings are defined recursively as follows.

- If a tree T is null, then the empty list is the preorder, inorder, and postorder listing of T.
- If T consists of a single node, then that node by itself is the preorder, inorder, and postorder listing of T.

Otherwise, let T be a tree with root n and subtrees T1, T2, ..., Tk, as suggested in Fig. 3.4.

[Fig. 3.4: a root n with subtrees T1, T2, ..., Tk drawn beneath it, in order from the left.]

Fig. 3.4. Tree T.

1. The preorder listing (or preorder traversal) of the nodes of T is the root n of T followed by the nodes of T1 in preorder, then the nodes of T2 in preorder, and so on, up to the nodes of Tk in preorder.
2. The inorder listing of the nodes of T is the nodes of T1 in inorder, followed by node n, followed by the nodes of T2, ..., Tk, each group of nodes in inorder.
3. The postorder listing of the nodes of T is the nodes of T1 in postorder, then the nodes of T2 in postorder, and so on, up to Tk, all followed by node n.

Figure 3.5(a) shows a sketch of a procedure to list the nodes of a tree in preorder. To make it a postorder procedure, we simply reverse the order of steps (1) and (2). Figure 3.5(b) is a sketch of an inorder procedure. In each case, we produce the desired ordering of the tree by calling the appropriate procedure on the root of the tree.

Example 3.3. Let us list the tree of Fig. 3.3 in preorder. We first list 1 and then call PREORDER on the first subtree of 1, the subtree with root 2. This subtree is a single node, so we simply list it. Then we proceed to the second subtree of 1, the tree rooted at 3. We list 3, and then call PREORDER on the first subtree of 3. That call results in listing 5, 8, and 9, in that order.

     procedure PREORDER ( n: node );
     begin
(1)      list n;
(2)      for each child c of n, if any, in order from the left do
             PREORDER(c)
     end; { PREORDER }

         (a) PREORDER procedure.

     procedure INORDER ( n: node );
     begin
         if n is a leaf then
             list n
         else begin
             INORDER(leftmost child of n);
             list n;
             for each child c of n, except for the leftmost,
                     in order from the left do
                 INORDER(c)
         end
     end; { INORDER }

         (b) INORDER procedure.

Fig. 3.5. Recursive ordering procedures.

Continuing in this manner, we obtain the complete preorder traversal of Fig. 3.3: 1, 2, 3, 5, 8, 9, 6, 10, 4, 7.

Similarly, by simulating Fig. 3.5(a) with the steps reversed, we can discover that the postorder of Fig. 3.3 is 2, 8, 9, 5, 10, 6, 3, 7, 4, 1. By simulating Fig. 3.5(b), we find that the inorder listing of Fig. 3.3 is 2, 1, 8, 5, 9, 3, 10, 6, 7, 4. □
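These three listings can be checked mechanically. Below is a sketch in Python rather than the book's Pascal; the dictionary of children lists encodes the tree of Fig. 3.3 as reconstructed from Example 3.3.

```python
# Tree of Fig. 3.3: children lists, leftmost child first.
children = {1: [2, 3, 4], 3: [5, 6], 5: [8, 9], 6: [10], 4: [7]}

def preorder(n):
    # Root first, then each subtree in preorder.
    out = [n]
    for c in children.get(n, []):
        out += preorder(c)
    return out

def postorder(n):
    # Each subtree in postorder, then the root.
    out = []
    for c in children.get(n, []):
        out += postorder(c)
    return out + [n]

def inorder(n):
    # First subtree, then the root, then the remaining subtrees.
    kids = children.get(n, [])
    if not kids:
        return [n]
    rest = [x for c in kids[1:] for x in inorder(c)]
    return inorder(kids[0]) + [n] + rest
```

Running these on the root reproduces the three listings given above.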

A useful trick for producing the three node orderings is the following. Imagine we walk around the outside of the tree, starting at the root, moving counterclockwise, and staying as close to the tree as possible; the path we have in mind for Fig. 3.3 is shown in Fig. 3.6.

[Fig. 3.6: the tree of Fig. 3.3 with a dashed path walking counterclockwise around its outside.]

Fig. 3.6. Traversal of a tree.

For preorder, we list a node the first time we pass it. For postorder, we list a node the last time we pass it, as we move up to its parent. For inorder, we list a leaf the first time we pass it, but list an interior node the second time we pass it. For example, node 1 in Fig. 3.6 is passed the first time at the beginning, and the second time while passing through the "bay" between nodes 2 and 3. Note that the order of the leaves in the three orderings is always the same left-to-right ordering. It is only the ordering of the interior nodes and their relationship to the leaves that vary among the three. □

Labeled Trees and Expression Trees

Often it is useful to associate a label, or value, with each node of a tree, in the same spirit with which we associated a value with a list element in the previous chapter. That is, the label of a node is not the name of the node, but a value that is "stored" at the node. In some applications we shall even change the label of a node, while the name of a node remains the same. A useful analogy is tree:list = label:element = node:position.

Example 3.4. Figure 3.7 shows a labeled tree representing the arithmetic expression (a+b)*(a+c), where n1, ..., n7 are the names of the nodes, and the labels, by convention, are shown next to the nodes. The rules whereby a labeled tree represents an expression are as follows:

1. Every leaf is labeled by an operand and consists of that operand alone. For example, node n4 represents the expression a.
2. Every interior node n is labeled by an operator. Suppose n is labeled by a binary operator θ, such as + or *, and that the left child represents expression E1 and the right child E2. Then n represents expression (E1) θ (E2). We may remove the parentheses if they are not necessary.

For example, node n2 has operator +, and its left and right children represent the expressions a and b, respectively. Therefore, n2 represents (a)+(b), or just a+b. Node n1 represents (a+b)*(a+c), since * is the label at n1, and a+b and a+c are the expressions represented by n2 and n3, respectively. □

[Fig. 3.7 (reconstructed): root n1 labeled *; its children n2 and n3, each labeled +; the children of n2 are n4 (labeled a) and n5 (labeled b); the children of n3 are n6 (labeled a) and n7 (labeled c).]

Fig. 3.7. Expression tree with labels.

Often, when we produce the preorder, inorder, or postorder listing of a tree, we prefer to list not the node names, but rather the labels. In the case of an expression tree, the preorder listing of the labels gives us what is known as the prefix form of an expression, where the operator precedes its left operand and its right operand. To be precise, the prefix expression for a single operand a is a itself. The prefix expression for (E1) θ (E2), with θ a binary operator, is θP1P2, where P1 and P2 are the prefix expressions for E1 and E2. Note that no parentheses are necessary in the prefix expression, since we can scan the prefix expression θP1P2 and uniquely identify P1 as the shortest (and only) prefix of P1P2 that is a legal prefix expression.

For example, the preorder listing of the labels of Fig. 3.7 is *+ab+ac. The prefix expression for n2, which is +ab, is the shortest legal prefix of +ab+ac.
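The "shortest legal prefix" property suggests a one-pass scan: a prefix expression is complete exactly when the count of operands still needed drops to zero. Here is an illustrative sketch in Python (not from the book), assuming single-letter operands and the binary operators + and * of Fig. 3.7.

```python
def shortest_legal_prefix(s):
    # A legal prefix expression initially needs one operand; each binary
    # operator adds one more needed operand, each operand supplies one.
    # The expression is complete when the count reaches zero.
    need = 1
    for i, ch in enumerate(s):
        need += 1 if ch in "+*" else -1
        if need == 0:
            return s[: i + 1]
    return None  # no legal prefix expression starts here
```

On +ab+ac this yields +ab, matching the example above; on the whole listing *+ab+ac the shortest legal prefix is the entire string, so a full prefix expression is never a proper prefix of another.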

Similarly, a postorder listing of the labels of an expression tree gives us what is known as the postfix (or Polish) representation of an expression. The expression (E1) θ (E2) is represented by the postfix expression P1P2θ, where P1 and P2 are the postfix representations of E1 and E2, respectively. Again, no parentheses are necessary in the postfix representation, as we can deduce what P2 is by looking for the shortest suffix of P1P2 that is a legal postfix expression. For example, the postfix expression for Fig. 3.7 is ab+ac+*. If we write this expression as P1P2*, then P2 is ac+, the shortest suffix of ab+ac+ that is a legal postfix expression.
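The absence of parentheses is also what makes postfix directly executable with a stack: push each operand, and apply each operator to the top two entries. A sketch in Python (the numeric bindings for a, b, and c below are illustrative, not from the text):

```python
def eval_postfix(expr, env):
    # Evaluate a postfix string over single-letter operands and the
    # binary operators + and *, with operand values supplied in env.
    stack = []
    for ch in expr:
        if ch in "+*":
            b = stack.pop()   # right operand is on top
            a = stack.pop()
            stack.append(a + b if ch == "+" else a * b)
        else:
            stack.append(env[ch])
    return stack.pop()
```

With a = 2, b = 3, c = 4, the postfix string ab+ac+* of Fig. 3.7 evaluates to (2+3)*(2+4) = 30.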

The inorder traversal of an expression tree gives the infix expression itself, but with no parentheses. For example, the inorder listing of the labels of Fig. 3.7 is a+b*a+c. The reader is invited to provide an algorithm for traversing an expression tree and producing an infix expression with all needed pairs of parentheses.

Computing Ancestral Information

The preorder and postorder traversals of a tree are useful in obtaining ancestral information. Suppose postorder(n) is the position of node n in a postorder listing of the nodes of a tree. Suppose desc(n) is the number of proper descendants of node n. For example, in the tree of Fig. 3.7 the postorder numbers of nodes n2, n4, and n5 are 3, 1, and 2, respectively.

The postorder numbers assigned to the nodes have the useful property that the nodes in the subtree with root n are numbered consecutively from postorder(n) − desc(n) to postorder(n). To test if a vertex x is a descendant of vertex y, all we need do is determine whether

    postorder(y) − desc(y) ≤ postorder(x) ≤ postorder(y).

A similar property holds for preorder.
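The book gives no code for this test; as an illustration, here is a Python sketch in which a single postorder walk over the tree of Fig. 3.7 (encoded as a children-list dictionary) fills in both postorder(n) and desc(n).

```python
# Interior structure of the expression tree of Fig. 3.7.
children = {"n1": ["n2", "n3"], "n2": ["n4", "n5"], "n3": ["n6", "n7"]}

postorder = {}  # postorder(n): position of n in the postorder listing
desc = {}       # desc(n): number of proper descendants of n

def number(root):
    counter = 0
    def visit(n):
        nonlocal counter
        d = 0
        for c in children.get(n, []):
            visit(c)
            d += desc[c] + 1      # a child's subtree has desc[c] + 1 nodes
        counter += 1
        postorder[n] = counter
        desc[n] = d
    visit(root)

number("n1")

def is_descendant(x, y):
    # x lies in the subtree rooted at y iff its postorder number falls
    # in the interval [postorder(y) - desc(y), postorder(y)].
    return postorder[y] - desc[y] <= postorder[x] <= postorder[y]
```

The walk assigns n2, n4, and n5 the postorder numbers 3, 1, and 2, as stated above, and the interval test then answers descendant queries in constant time.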

3.2 The ADT TREE

In Chapter 2, lists, stacks, queues, and mappings were treated as abstract data types (ADT's). In this chapter trees will be treated both as ADT's and as data structures. One of our most important uses of trees occurs in the design of implementations for the various ADT's we study. For example, in Section 5.1, we shall see how a "binary search tree" can be used to implement abstract data types based on the mathematical model of a set, together with operations such as INSERT, DELETE, and MEMBER (to test whether an element is in a set). The next two chapters present a number of other tree implementations of various ADT's.

In this section, we shall present several useful operations on trees and show how tree algorithms can be designed in terms of these operations. As with lists, there are a great variety of operations that can be performed on trees. Here, we shall consider the following operations:

1. PARENT(n, T). This function returns the parent of node n in tree T. If n is the root, which has no parent, Λ is returned. In this context, Λ is a "null node," which is used as a signal that we have navigated off the tree.
2. LEFTMOST_CHILD(n, T) returns the leftmost child of node n in tree T, and it returns Λ if n is a leaf, which therefore has no children.
3. RIGHT_SIBLING(n, T) returns the right sibling of node n in tree T, defined to be that node m with the same parent p as n such that m lies immediately to the right of n in the ordering of the children of p. For example, for the tree in Fig. 3.7, LEFTMOST_CHILD(n2) = n4; RIGHT_SIBLING(n4) = n5, and RIGHT_SIBLING(n5) = Λ.
4. LABEL(n, T) returns the label of node n in tree T. We do not, however, require labels to be defined for every tree.
5. CREATEi(v, T1, T2, ..., Ti) is one of an infinite family of functions, one for each value of i = 0, 1, 2, .... CREATEi makes a new node r with label v and gives it i children, which are the roots of trees T1, T2, ..., Ti, in order from the left. The tree with root r is returned. Note that if i = 0, then r is both a leaf and the root.
6. ROOT(T) returns the node that is the root of tree T, or Λ if T is the null tree.
7. MAKENULL(T) makes T be the null tree.
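The chapter implements these operations in Pascal below; purely as an illustration of the interface, here is a minimal Python sketch using node records with a label, an ordered children list, and a parent pointer. None stands in for Λ, and the CREATEi family is collapsed into one variadic CREATE.

```python
class Node:
    """A tree node: label, ordered children, and a parent pointer."""
    def __init__(self, label):
        self.label = label
        self.children = []
        self.parent = None          # None plays the role of the null node Λ

def CREATE(v, *subtrees):
    # CREATEi(v, T1, ..., Ti): new node r labeled v whose children are
    # the roots of the given subtrees, in order from the left.
    r = Node(v)
    for t in subtrees:
        t.parent = r
        r.children.append(t)
    return r                        # if i = 0, r is both a leaf and the root

def PARENT(n):
    return n.parent                 # None for the root

def LEFTMOST_CHILD(n):
    return n.children[0] if n.children else None

def RIGHT_SIBLING(n):
    p = n.parent
    if p is None:
        return None
    i = p.children.index(n)
    return p.children[i + 1] if i + 1 < len(p.children) else None

def LABEL(n):
    return n.label

# The expression tree of Fig. 3.7: (a+b)*(a+c).
n4, n5, n6, n7 = CREATE("a"), CREATE("b"), CREATE("a"), CREATE("c")
n2 = CREATE("+", n4, n5)
n3 = CREATE("+", n6, n7)
n1 = CREATE("*", n2, n3)
```

On this tree, LEFTMOST_CHILD(n2) is n4, RIGHT_SIBLING(n4) is n5, and RIGHT_SIBLING(n5) is the null node, matching the example above.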

Example 3.5. Let us write both recursive and nonrecursive procedures to take a tree and list the labels of its nodes in preorder. We assume that there are data types node and TREE already defined for us, and that the data type TREE is for trees with labels of the type labeltype. Figure 3.8 shows a recursive procedure that, given node n, lists the labels of the subtree rooted at n in preorder. We call PREORDER(ROOT(T)) to get a preorder listing of tree T.

procedure PREORDER ( n: node );
    { list the labels of the descendants of n in preorder }
    var
        c: node;
    begin
        print(LABEL(n, T));
        c := LEFTMOST_CHILD(n, T);
        while c <> Λ do begin
            PREORDER(c);
            c := RIGHT_SIBLING(c, T)
        end
    end; { PREORDER }

Fig. 3.8. A recursive preorder listing procedure.

We shall also develop a nonrecursive procedure to print a tree in preorder. To find our way around the tree, we shall use a stack S, whose type STACK is really "stack of nodes." The basic idea underlying our algorithm is that when we are at a node n, the stack will hold the path from the root to n, with the root at the bottom of the stack and node n at the top.†

† Recall our discussion of recursion in Section 2.6 in which we illustrated how the implementation of a recursive procedure involves a stack of activation records. If we examine Fig. 3.8, we can observe that when PREORDER(n) is called, the active procedure calls, and therefore the stack of activation records, correspond to the calls of PREORDER for all the ancestors of n. Thus our nonrecursive preorder procedure, like the example in Section 2.6, models closely the way the recursive procedure is implemented.
One way to perform a nonrecursive preorder traversal of a tree is given by the program NPREORDER shown in Fig. 3.9. This program has two modes of operation. In the first mode it descends down the leftmost unexplored path in the tree, printing and stacking the nodes along the path, until it reaches a leaf.

The program then enters the second mode of operation in which it retreats back up the stacked path, popping the nodes of the path off the stack, until it encounters a node on the path with a right sibling. The program then reverts back to the first mode of operation, starting the descent from that unexplored right sibling.

The program begins in mode one at the root and terminates when the stack becomes empty. The complete program is shown in Fig. 3.9.

3.3 Implementations of Trees

In this section we shall present several basic implementations for trees and discuss their capabilities for supporting the various tree operations introduced in Section 3.2.

An Array Representation of Trees

Let T be a tree in which the nodes are named 1, 2, ..., n. Perhaps the simplest representation of T that supports the PARENT operation is a linear array A in which entry A[i] is a pointer or a cursor to the parent of node i. The root of T can be distinguished by giving it a null pointer or a pointer to itself as parent. In Pascal, pointers to array elements are not feasible, so we shall have to use a cursor scheme where A[i] = j if node j is the parent of node i, and A[i] = 0 if node i is the root.

This representation uses the property of trees that each node has a unique parent. With this representation the parent of a node can be found in constant time. A path going up the tree, that is, from node to parent to parent, and so on, can be traversed in time proportional to the number of nodes on the path. We can also support the LABEL operator by adding another array L, such that L[i] is the label of node i, or by making the elements of array A be records consisting of an integer (cursor) and a label.

Example 3.6. The tree of Fig. 3.10(a) has the parent representation given by the array A shown in Fig. 3.10(b). □
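Fig. 3.10's own array is not reproduced here; as an illustration, the following Python sketch applies the same cursor scheme to the tree of Fig. 3.3 (structure reconstructed from Example 3.3), with A[i] holding the parent of node i and A[i] = 0 marking the root.

```python
# A[i] = parent of node i in the tree of Fig. 3.3; index 0 is unused.
A = [0, 0, 1, 1, 1, 3, 3, 4, 5, 5, 6]

def parent(i):
    return A[i]                  # constant time

def path_to_root(i):
    # Walk from node to parent to parent; time proportional to the
    # number of nodes on the path.
    path = [i]
    while A[i] != 0:
        i = A[i]
        path.append(i)
    return path
```

For instance, starting from node 8 the upward path is 8, 5, 3, 1.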

The parent pointer representation does not facilitate operations that require child-of information. Given a node n, it is expensive to determine the children of n, or the height of n. In addition, the parent pointer representation does not specify the order of the children of a node. Thus, operations like LEFTMOST_CHILD and RIGHT_SIBLING are not well defined. We could impose an artificial order, for example, by numbering the children of each node after numbering the parent, and numbering the children in
procedure NPREORDER ( T: TREE );
    { nonrecursive preorder traversal of tree T }
    var
        m: node;    { a temporary }
        S: STACK;   { stack of nodes holding path from the root
                      to the parent TOP(S) of the "current" node m }
    begin
        { initialize }
        MAKENULL(S);
        m := ROOT(T);
        while true do
            if m <> Λ then begin
                print(LABEL(m, T));
                PUSH(m, S);
                { explore leftmost child of m }
                m := LEFTMOST_CHILD(m, T)
            end
            else begin
                { exploration of path on stack is now complete }
                if EMPTY(S) then
                    return;
                { explore right sibling of node on top of stack }
                m := RIGHT_SIBLING(TOP(S), T);
                POP(S)
            end
    end; { NPREORDER }

Fig. 3.9. A nonrecursive preorder procedure.
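As a check on NPREORDER's two modes, here is an illustrative Python sketch (not the book's Pascal) run on the tree of Fig. 3.3 as reconstructed from Example 3.3; a list serves as the stack.

```python
# Tree of Fig. 3.3: children lists, leftmost child first.
children = {1: [2, 3, 4], 3: [5, 6], 5: [8, 9], 6: [10], 4: [7]}

def npreorder(root):
    out, stack, m = [], [], root
    while True:
        if m is not None:
            out.append(m)                     # "print" m on first visit
            stack.append(m)
            kids = children.get(m, [])
            m = kids[0] if kids else None     # descend the leftmost path
        else:
            if not stack:
                return out                    # stack empty: traversal done
            n = stack.pop()                   # retreat up the stacked path
            if stack:                         # look for n's right sibling
                sibs = children[stack[-1]]
                i = sibs.index(n)
                m = sibs[i + 1] if i + 1 < len(sibs) else None
            else:
                m = None                      # n was the root
```

The output is the preorder listing 1, 2, 3, 5, 8, 9, 6, 10, 4, 7 computed in Example 3.3.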

increasing order from left to right. On that assumption, we have written the function RIGHT_SIBLING in Fig. 3.11, for types node and TREE that are defined as follows:

type
    node = integer;
    TREE = array [1..maxnodes] of node;

For this implementation we assume the null node Λ is represented by 0.

[Fig. 3.10(a): a tree on nodes 1 through 10; (b): its parent representation, the array A with A[i] holding the parent of node i.]

Fig. 3.10. A tree and its parent pointer representation.

function RIGHT_SIBLING ( n: node; T: TREE ) : node;
    { return the right sibling of node n in tree T }
    var
        i, parent: node;
    begin
        parent := T[n];
        for i := n + 1 to maxnodes do
            { search for node after n with same parent }
            if T[i] = parent then
                return (i);
        return (0)
            { null node will be returned
              if no right sibling is ever found }
    end; { RIGHT_SIBLING }

Fig. 3.11. Right sibling operation using array representation.
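Fig. 3.11's scan carries over almost line for line; here is a sketch in Python, with 0 again standing for the null node and the parent array encoding the tree of Fig. 3.3, whose node numbering happens to satisfy the ordering assumption above.

```python
# T[i] = parent of node i in the tree of Fig. 3.3; T[root] = 0.
T = [0, 0, 1, 1, 1, 3, 3, 4, 5, 5, 6]
maxnodes = 10

def right_sibling(n):
    parent = T[n]
    for i in range(n + 1, maxnodes + 1):
        # search for a node after n with the same parent
        if T[i] == parent:
            return i
    return 0    # null node: no right sibling was found
```

On this tree, the right sibling of 2 is 3, of 5 is 6, and node 4 (the rightmost child of the root) has none.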

Representation of Trees by Lists of Children

An important and useful way of representing trees is to form for each node a list of its children. The lists can be represented by any of the methods suggested in Chapter 2, but because the number of children each node may have can be variable, the linked-list representations are often more appropriate.

Figure 3.12 suggests how the tree of Fig. 3.10(a) might be represented. There is an array of header cells, indexed by nodes, which we assume to be numbered 1, 2, ..., 10. Each header points to a linked list of "elements," which are nodes. The elements on the list headed by header[i] are the children of node i; for example, 9 and 10 are the children of 3.

[Fig. 3.12: an array header[1..10] of cells, each pointing to a linked list of that node's children.]

Fig. 3.12. A linked-list representation of a tree.
`
Let us first develop the data structures we need in terms of an abstract data type LIST (of nodes), and then give a particular implementation of lists and see how the abstractions fit together. Later, we shall see some of the simplifications we can make. We begin with the following type declarations:
type
   node = integer;
   LIST = { appropriate definition for list of nodes };
   position = { appropriate definition for positions in lists };
   TREE = record
      header: array [1..maxnodes] of LIST;
      labels: array [1..maxnodes] of labeltype;
      root: node
   end;
We assume that the root of each tree is stored explicitly in the root field. Also, 0 is used to represent the null node. Figure 3.13 shows the code for the LEFTMOST_CHILD operation. The reader should write the code for the other operations as exercises.
function LEFTMOST_CHILD ( n: node; T: TREE ) : node;
{ returns the leftmost child of node n of tree T }
var
   L: LIST; { shorthand for the list of n's children }
begin
   L := T.header[n];
   if EMPTY(L) then { n is a leaf }
      return (0)
   else
      return (RETRIEVE(FIRST(L), L))
end; { LEFTMOST_CHILD }

Fig. 3.13. Function to find leftmost child.
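In Python the same idea might be sketched as follows (our illustration, not the book's code). Plain Python lists stand in for the LIST ADT, so EMPTY, FIRST, and RETRIEVE collapse into ordinary emptiness tests and indexing; the Tree class below is a hypothetical stand-in for the TREE record:

```python
MAXNODES = 10

class Tree:
    """Lists-of-children representation, after the type declarations above:
    header[n] is the list of children of node n, the root is stored
    explicitly, and 0 represents the null node."""
    def __init__(self, root):
        self.header = {n: [] for n in range(1, MAXNODES + 1)}
        self.labels = {}
        self.root = root

def leftmost_child(n, T):
    """Return the leftmost child of node n in tree T, or 0 if n is a leaf."""
    L = T.header[n]      # shorthand for the list of n's children
    if not L:            # EMPTY(L): n is a leaf
        return 0
    return L[0]          # RETRIEVE(FIRST(L), L)

# As far as the text describes Fig. 3.10(a):
# 9 and 10 are the children of node 3.
T = Tree(root=1)
T.header[3] = [9, 10]
print(leftmost_child(3, T))   # 9
print(leftmost_child(9, T))   # 0
```

Because each children list is ordered left to right, LEFTMOST_CHILD costs constant time here, while RIGHT_SIBLING would require locating n on its parent's list.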
Now let us choose a particular implementation of lists, in which both LIST and position are integers, used as cursors into an array cellspace of records:

var
   cellspace: array [1..maxnodes] of record
      node: integer;
      next: integer
   end;

lists of children have heade
