`Security in Computing, Fourth Edition
`By Charles P. Pfleeger - Pfleeger Consulting Group, Shari Lawrence Pfleeger - RAND Corporation
`Publisher: Prentice Hall
`Pub Date: October 13, 2006
`Print ISBN-10: 0-13-239077-9
`Print ISBN-13: 978-0-13-239077-4
`Pages: 880
`
`Table of Contents | Index
`
`The New State-of-the-Art in Information Security: Now Covers the Economics of Cyber Security and the Intersection of Privacy and
`Information Security
`
`For years, IT and security professionals and students have turned to Security in Computing as the definitive guide to information about
`computer security attacks and countermeasures. In their new fourth edition, Charles P. Pfleeger and Shari Lawrence Pfleeger have
`thoroughly updated their classic guide to reflect today's newest technologies, standards, and trends.
`
`The authors first introduce the core concepts and vocabulary of computer security, including attacks and controls. Next, the authors
`systematically identify and assess threats now facing programs, operating systems, database systems, and networks. For each threat, they
`offer best-practice responses.
`
`Security in Computing, Fourth Edition, goes beyond technology, covering crucial management issues faced in protecting infrastructure
`and information. This edition contains an all-new chapter on the economics of cybersecurity, explaining ways to make a business case for
`security investments. Another new chapter addresses privacy--from data mining and identity theft, to RFID and e-voting.
`
`New coverage also includes
`
- Programming mistakes that compromise security: man-in-the-middle, timing, and privilege escalation attacks

- Web application threats and vulnerabilities

- Networks of compromised systems: bots, botnets, and drones

- Rootkits, including the notorious Sony XCP

- Wi-Fi network security challenges, standards, and techniques

- New malicious code attacks, including false interfaces and keystroke loggers

- Improving code quality: software engineering, testing, and liability approaches

- Biometric authentication: capabilities and limitations

- Using the Advanced Encryption Standard (AES) more effectively

- Balancing dissemination with piracy control in music and other digital content

- Countering new cryptanalytic attacks against RSA, DES, and SHA

- Responding to the emergence of organized attacker groups pursuing profit
`
`file://D:\Documents and Settings\S\Local Settings\Temp\~hhA7CF.htm
`
`8/13/2006
`
`SEVEN Networks LLC, Exhibit 2002
`Page 2002 - 1
`IPR2020-00280, Apple Inc. v. SEVEN Networks LLC
`
`
`
`Table of Contents
`
`Copyright
`Foreword
`Preface
` Chapter 1. Is There a Security Problem in Computing?
`Section 1.1. What Does "Secure" Mean?
`Section 1.2. Attacks
`Section 1.3. The Meaning of Computer Security
`Section 1.4. Computer Criminals
`Section 1.5. Methods of Defense
`Section 1.6. What's Next
`Section 1.7. Summary
`Section 1.8. Terms and Concepts
`Section 1.9. Where the Field Is Headed
`Section 1.10. To Learn More
`Section 1.11. Exercises
` Chapter 2. Elementary Cryptography
`Section 2.1. Terminology and Background
`Section 2.2. Substitution Ciphers
`Section 2.3. Transpositions (Permutations)
`Section 2.4. Making "Good" Encryption Algorithms
`Section 2.5. The Data Encryption Standard
`Section 2.6. The AES Encryption Algorithm
`Section 2.7. Public Key Encryption
`Section 2.8. The Uses of Encryption
`Section 2.9. Summary of Encryption
`Section 2.10. Terms and Concepts
`Section 2.11. Where the Field Is Headed
`Section 2.12. To Learn More
`Section 2.13. Exercises
` Chapter 3. Program Security
`Section 3.1. Secure Programs
`Section 3.2. Nonmalicious Program Errors
`Section 3.3. Viruses and Other Malicious Code
`Section 3.4. Targeted Malicious Code
`Section 3.5. Controls Against Program Threats
`Section 3.6. Summary of Program Threats and Controls
`Section 3.7. Terms and Concepts
`Section 3.8. Where the Field Is Headed
`Section 3.9. To Learn More
`Section 3.10. Exercises
` Chapter 4. Protection in General-Purpose Operating Systems
`Section 4.1. Protected Objects and Methods of Protection
`Section 4.2. Memory and Address Protection
`Section 4.3. Control of Access to General Objects
`Section 4.4. File Protection Mechanisms
`Section 4.5. User Authentication
`Section 4.6. Summary of Security for Users
`Section 4.7. Terms and Concepts
`Section 4.8. Where the Field Is Headed
`Section 4.9. To Learn More
`Section 4.10. Exercises
` Chapter 5. Designing Trusted Operating Systems
`Section 5.1. What Is a Trusted System?
`Section 5.2. Security Policies
`Section 5.3. Models of Security
`Section 5.4. Trusted Operating System Design
`Section 5.5. Assurance in Trusted Operating Systems
`Section 5.6. Summary of Security in Operating Systems
`Section 5.7. Terms and Concepts
`Section 5.8. Where the Field Is Headed
`Section 5.9. To Learn More
`Section 5.10. Exercises
` Chapter 6. Database and Data Mining Security
`Section 6.1. Introduction to Databases
`Section 6.2. Security Requirements
`Section 6.3. Reliability and Integrity
`Section 6.4. Sensitive Data
`Section 6.5. Inference
`Section 6.6. Multilevel Databases
`Section 6.7. Proposals for Multilevel Security
`Section 6.8. Data Mining
`Section 6.9. Summary of Database Security
`Section 6.10. Terms and Concepts
`Section 6.11. Where the Field Is Headed
`Section 6.12. To Learn More
`Section 6.13. Exercises
` Chapter 7. Security in Networks
`Section 7.1. Network Concepts
`Section 7.2. Threats in Networks
`Section 7.3. Network Security Controls
`Section 7.4. Firewalls
`Section 7.5. Intrusion Detection Systems
`Section 7.6. Secure E-Mail
`Section 7.7. Summary of Network Security
`Section 7.8. Terms and Concepts
`Section 7.9. Where the Field Is Headed
`Section 7.10. To Learn More
`Section 7.11. Exercises
` Chapter 8. Administering Security
`Section 8.1. Security Planning
`Section 8.2. Risk Analysis
`Section 8.3. Organizational Security Policies
`Section 8.4. Physical Security
`Section 8.5. Summary
`Section 8.6. Terms and Concepts
`Section 8.7. To Learn More
`Section 8.8. Exercises
` Chapter 9. The Economics of Cybersecurity
`Section 9.1. Making a Business Case
`Section 9.2. Quantifying Security
`Section 9.3. Modeling Cybersecurity
`Section 9.4. Current Research and Future Directions
`Section 9.5. Summary
`Section 9.6. Terms and Concepts
`Section 9.7. To Learn More
`Section 9.8. Exercises
` Chapter 10. Privacy in Computing
`Section 10.1. Privacy Concepts
`Section 10.2. Privacy Principles and Policies
`Section 10.3. Authentication and Privacy
`Section 10.4. Data Mining
`Section 10.5. Privacy on the Web
`Section 10.6. E-Mail Security
`Section 10.7. Impacts on Emerging Technologies
`Section 10.8. Summary
`Section 10.9. Terms and Concepts
`Section 10.10. Where the Field Is Headed
`Section 10.11. To Learn More
`Section 10.12. Exercises
` Chapter 11. Legal and Ethical Issues in Computer Security
`Section 11.1. Protecting Programs and Data
`Section 11.2. Information and the Law
`Section 11.3. Rights of Employees and Employers
`Section 11.4. Redress for Software Failures
`Section 11.5. Computer Crime
`Section 11.6. Ethical Issues in Computer Security
`Section 11.7. Case Studies of Ethics
`Section 11.8. Terms and Concepts
`Section 11.9. To Learn More
`Section 11.10. Exercises
` Chapter 12. Cryptography Explained
`Section 12.1. Mathematics for Cryptography
`Section 12.2. Symmetric Encryption
`Section 12.3. Public Key Encryption Systems
`Section 12.4. Quantum Cryptography
`Section 12.5. Summary of Encryption
`Section 12.6. Terms and Concepts
`Section 12.7. Where the Field Is Headed
`Section 12.8. To Learn More
`Section 12.9. Exercises
` Bibliography
`
`Index
`
`
`
`
`
`
`Copyright
`
`
`
`
`
`Many of the designations used by manufacturers and sellers to distinguish their products are claimed as trademarks. Where those
`designations appear in this book, and the publisher was aware of a trademark claim, the designations have been printed with initial capital
`letters or in all capitals.
`
`The authors and publisher have taken care in the preparation of this book, but make no expressed or implied warranty of any kind and
`assume no responsibility for errors or omissions. No liability is assumed for incidental or consequential damages in connection with or
`arising out of the use of the information or programs contained herein.
`
`The publisher offers excellent discounts on this book when ordered in quantity for bulk purchases or special sales, which may include
`electronic versions and/or custom covers and content particular to your business, training goals, marketing focus, and branding interests.
`For more information, please contact:
`
`
` U.S. Corporate and Government Sales
` (800) 382-3419
` corpsales@pearsontechgroup.com
`
`For sales outside the United States, please contact:
`
`
` International Sales
` international@pearsoned.com
`
`Visit us on the Web: www.prenhallprofessional.com
`
`
Library of Congress Cataloging-in-Publication Data
Pfleeger, Charles P., 1948-
  Security in computing / Charles P. Pfleeger, Shari Lawrence Pfleeger. -- 4th ed.
  p. cm.
  Includes bibliographical references and index.
  ISBN 0-13-239077-9 (hardback : alk. paper)
  1. Computer security. 2. Data protection. 3. Privacy, Right of. I. Pfleeger, Shari Lawrence. II. Title.
  QA76.9.A25P45 2006
  005.8--dc22     2006026798
`
`Copyright © 2007 Pearson Education, Inc.
`
`All rights reserved. Printed in the United States of America. This publication is protected by copyright, and permission must be obtained
`from the publisher prior to any prohibited reproduction, storage in a retrieval system, or transmission in any form or by any means,
`electronic, mechanical, photocopying, recording, or likewise. For information regarding permissions, write to:
`
`
` Pearson Education, Inc.
` Rights and Contracts Department
` One Lake Street
` Upper Saddle River, NJ 07458
` Fax: (201) 236-3290
`
`
`Text printed in the United States on recycled paper at Courier in Westford, Massachusetts.
`First printing, October 2006
`
`Chapter 4. Protection in General-Purpose Operating Systems
`
`
The primary limitation of these protection schemes is the difficulty of creating meaningful groups of related users who should have similar
access to related objects. The access control lists or access control matrices described earlier provide very flexible protection. Their
disadvantage appears when a user wants to allow access to many users and to many different data sets; such a user must still specify each
data set to be accessed by each user. As a new user is added, that user's special access rights must be specified by all appropriate users.
`
`
`
`
`
`
`4.5. User Authentication
`
`An operating system bases much of its protection on knowing who a user of the system is. In real-life situations, people commonly ask for
`identification from people they do not know: A bank employee may ask for a driver's license before cashing a check, library employees
`may require some identification before charging out books, and immigration officials ask for passports as proof of identity. In-person
`identification is usually easier than remote identification. For instance, some universities do not report grades over the telephone because
`the office workers do not necessarily know the students calling. However, a professor who recognizes the voice of a certain student can
`release that student's grades. Over time, organizations and systems have developed means of authentication, using documents, voice
`recognition, fingerprint and retina matching, and other trusted means of identification.
`
`In computing, the choices are more limited and the possibilities less secure. Anyone can attempt to log in to a computing system. Unlike
`the professor who recognizes a student's voice, the computer cannot recognize electrical signals from one person as being any different
`from those of anyone else. Thus, most computing authentication systems must be based on some knowledge shared only by the computing
`system and the user.
`
`Authentication mechanisms use any of three qualities to confirm a user's identity.
`
1. Something the user knows. Passwords, PINs, passphrases, a secret handshake, and mother's maiden name are examples of
what a user may know.
`
`2. Something the user has. Identity badges, physical keys, a driver's license, or a uniform are common examples of things people have
`that make them recognizable.
`
`3. Something the user is. These authenticators, called biometrics, are based on a physical characteristic of the user, such as a
`fingerprint, the pattern of a person's voice, or a face (picture). These authentication methods are old (we recognize friends in person
`by their faces or on a telephone by their voices) but are just starting to be used in computer authentications. See Sidebar 4-3 for a
`glimpse at some of the promising approaches.
`
`Two or more forms can be combined for more solid authentication; for example, a bank card and a PIN combine something the user has
`with something the user knows.
`
`Sidebar 4-3: Biometrics: Ready for Prime Time?
`
`Biometric authentication is a strong technology, certainly far superior to the password approach that is by far the most
`common form of authentication. The technology is mature, products exist, standards define products' interfaces, reliability
rates are acceptable, and costs are reasonable. Why, then, is biometrics so little used?
`
`The reason seems to be user acceptance. Few rigorous scientific studies have been done of users' reactions to biometrics,
`but there is plenty of anecdotal evidence.
`
In perhaps the biggest commercial use of biometrics, Piggly-Wiggly supermarkets tried to encourage their customers to use a
fingerprint technology to pay for groceries. The primary advantage for Piggly-Wiggly was cost: By speeding its customers
through the checkout process, it could serve more customers in a fixed amount of time with no additional staff, thereby
reducing cost. Bonuses were stronger authentication, which reduced the likelihood of credit card or check-writing fraud (saving
more money), and the ability to track customers' buying habits. The stores did not anticipate the negative customer reaction
they got [SCH06a]. Even though the reactions were to psychological perceptions and not technological deficiencies, they
help explain why biometric authentication has not caught on in voluntary settings.
`
`Some customers did not like the idea of registering and using their fingerprints because of the association of fingerprints
`with law enforcement and criminals. Others feared that criminals would harm them to obtain their authenticators (for
`example, cutting off a finger). And still others cited Biblical concerns about the "mark of the devil" being imprinted on the
`hand as a precondition to purchasing.
`
`In other settings, people question the hygiene of pressing a finger onto a plate others have used. And others resist having
`their biometric data entered into a database, for example, by having a picture taken, citing fears of losing privacy, either to
`the government or to commercial data banks.
`
`Prabhakar et al. [PRA03] list three categories of privacy concerns:
`
- Unintended functional scope. The authentication does more than authenticate, for example, finding a tumor in the eye from a scan or detecting arthritis from a hand reading.

- Unintended application scope. The authentication routine identifies the subject, for example, if a subject enrolls under a false name but is identified by a match with an existing biometric record in another database.

- Covert identification. The subject is identified without seeking identification or authentication, for example, if the subject is identified as a face in a crowd.
`
`All these concerns arise from a subject's having lost control of private biometric information through an authentication
`application. People may misunderstand or overestimate the capability of biometric technology, but there is no denying the
`depth of feeling. Even when Piggly-Wiggly offered free turkeys to people who enrolled in their biometric program, the
`turnout was meager.
`
`Thus, for a wide range of reasons, people prefer not to use biometrics. Unless and until human perception is changed,
`biometrics will achieve wide acceptance only in situations in which its use is mandatory.
`
`
`Passwords as Authenticators
`
`The most common authentication mechanism for user to operating system is a password, a "word" known to computer and user. Although
`password protection seems to offer a relatively secure system, human practice sometimes degrades its quality. In this section we consider
`passwords, criteria for selecting them, and ways of using them for authentication. We conclude by noting other authentication techniques
`and by studying problems in the authentication process, notably Trojan horses masquerading as the computer authentication process.
`
`Use of Passwords
`
`Passwords are mutually agreed-upon code words, assumed to be known only to the user and the system. In some cases a user chooses
`passwords; in other cases the system assigns them. The length and format of the password also vary from one system to another.
`
`Even though they are widely used, passwords suffer from some difficulties of use:
`
- Loss. Depending on how the passwords are implemented, it is possible that no one will be able to replace a lost or forgotten password. The operators or system administrators can certainly intervene and unprotect or assign a particular password, but often they cannot determine what password a user has chosen; if the user loses the password, a new one must be assigned.

- Use. Supplying a password for each access to a file can be inconvenient and time consuming.

- Disclosure. If a password is disclosed to an unauthorized individual, the file becomes immediately accessible. If the user then changes the password to reprotect the file, all the other legitimate users must be informed of the new password because their old password will fail.

- Revocation. To revoke one user's access right to a file, someone must change the password, thereby causing the same problems as disclosure.
`
`The use of passwords is fairly straightforward. A user enters some piece of identification, such as a name or an assigned user ID; this
`identification can be available to the public or easy to guess because it does not provide the real security of the system. The system then
`requests a password from the user. If the password matches that on file for the user, the user is authenticated and allowed access to the
`system. If the password match fails, the system requests the password again, in case the user mistyped.
`
`Additional Authentication Information
`
`In addition to the name and password, we can use other information available to authenticate users. Suppose Adams works in the
`
`file://D:\Documents and Settings\S\Local Settings\Temp\~hhE410.htm
`
`8/13/2006
`
`SEVEN Networks LLC, Exhibit 2002
`Page 2002 - 8
`IPR2020-00280, Apple Inc. v. SEVEN Networks LLC
`
`
`
`Chapter 4. Protection in General-Purpose Operating Systems
`
`Page 25 of 40
`
`accounting department during the shift between 8:00 a.m. and 5:00 p.m., Monday through Friday. Any legitimate access attempt by Adams
`should be made during those times, through a workstation in the accounting department offices. By limiting Adams to logging in under
`those conditions, the system protects against two problems:
`
- Someone from outside might try to impersonate Adams. This attempt would be thwarted by either the time of access or the port through which the access was attempted.

- Adams might attempt to access the system from home or on a weekend, planning to use resources not allowed or to do something that would be too risky with other people around.
`
`Limiting users to certain workstations or certain times of access can cause complications (as when a user legitimately needs to work
`overtime, a person has to access the system while out of town on a business trip, or a particular workstation fails). However, some
`companies use these authentication techniques because the added security they provide outweighs inconveniences.
`
`Using additional authentication information is called multifactor authentication. Two forms of authentication (which is, not surprisingly,
`known as two-factor authentication) are better than one, assuming of course that the two forms are strong. But as the number of forms
`increases, so also does the inconvenience. (For example, think about passing through a security checkpoint at an airport.) Each
`authentication factor requires the system and its administrators to manage more security information.
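Such contextual restrictions can be sketched as a small policy check. The Python fragment below is illustrative only; the policy table, workstation names, and hours are hypothetical stand-ins for Adams's registered profile, not anything specified in the text:

```python
from datetime import datetime

# Hypothetical policy table: user -> allowed weekdays, hours, and workstations.
POLICIES = {
    "adams": {
        "weekdays": {0, 1, 2, 3, 4},     # Monday=0 through Friday=4
        "hours": range(8, 17),           # 8:00 a.m. up to 5:00 p.m.
        "workstations": {"acct-01", "acct-02"},
    }
}

def access_permitted(user, workstation, when):
    """Accept a login only if user, workstation, and time all match the profile."""
    policy = POLICIES.get(user)
    if policy is None:
        return False
    return (when.weekday() in policy["weekdays"]
            and when.hour in policy["hours"]
            and workstation in policy["workstations"])
```

A weekday-morning login from an accounting workstation passes; the same credentials presented on a Saturday, after hours, or from an unregistered machine are refused, which is exactly the protection described above.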
`
`Attacks on Passwords
`
`How secure are passwords themselves? Passwords are somewhat limited as protection devices because of the relatively small number of
`bits of information they contain.
`
`Here are some ways you might be able to determine a user's password, in decreasing order of difficulty.
`
- Try all possible passwords.

- Try frequently used passwords.

- Try passwords likely for the user.

- Search for the system list of passwords.

- Ask the user.
`
`Loose-Lipped Systems
`
So far the process seems secure, but in fact it has some vulnerabilities. To see why, consider the actions of a would-be intruder.
Authentication is based on knowing the <name, password> pair. A complete outsider is presumed to know nothing of the system. Suppose
the intruder attempts to access a system in the following manner. (In the following examples, the system messages are in uppercase, and the
user's responses are in lowercase.)
`
`WELCOME TO THE XYZ COMPUTING SYSTEMS
`ENTER USER NAME: adams
INVALID USER NAME - UNKNOWN USER
`ENTER USER NAME:
`
`
`We assumed that the intruder knew nothing of the system, but without having to do much, the intruder found out that adams is not the name
`of an authorized user. The intruder could try other common names, first names, and likely generic names such as system or operator to
`build a list of authorized users.
`
`An alternative arrangement of the login sequence is shown below.
`
`WELCOME TO THE XYZ COMPUTING SYSTEMS
`ENTER USER NAME: adams
`ENTER PASSWORD: john
`INVALID ACCESS
`ENTER USER NAME:
`
`
`This system notifies a user of a failure only after accepting both the user name and the password. The failure message should not indicate
`whether it is the user name or password that is unacceptable. In this way, the intruder does not know which failed.
`
`These examples also gave a clue as to which computing system is being accessed. The true outsider has no right to know that, and
`legitimate insiders already know what system they have accessed. In the example below, the user is given no information until the system is
`assured of the identity of the user.
`
`ENTER USER NAME: adams
`ENTER PASSWORD: john
`INVALID ACCESS
`ENTER USER NAME: adams
`ENTER PASSWORD: johnq
`WELCOME TO THE XYZ COMPUTING SYSTEMS
`
`
`Exhaustive Attack
`
In an exhaustive or brute force attack, the attacker tries all possible passwords, usually in some automated fashion. Of course, the number
of possible passwords depends on the implementation of the particular computing system. For example, if passwords are words consisting
of the 26 characters A-Z and can be of any length from 1 to 8 characters, there are 26^1 passwords of 1 character, 26^2 passwords of 2
characters, and 26^8 passwords of 8 characters. Therefore, the system as a whole has 26^1 + 26^2 + ... + 26^8 = 26^9 - 1, roughly 5 * 10^12 or five
million million possible passwords. That number seems intractable enough. If we were to use a computer to create and try each password at
a rate of checking one password per millisecond, it would take on the order of 150 years to test all passwords. But if we can speed up the
search to one password per microsecond, the work factor drops to about two months. This amount of time is reasonable if the reward is
large. For instance, an intruder may try to break the password on a file of credit card numbers or bank account information.
`
`But the break-in time can be made more tractable in a number of ways. Searching for a single particular password does not necessarily
`require all passwords to be tried; an intruder needs to try only until the correct password is identified. If the set of all possible passwords
`were evenly distributed, an intruder would likely need to try only half of the password space: the expected number of searches to find any
`particular password. However, an intruder can also use to advantage the fact that passwords are not evenly distributed. Because a password
`has to be remembered, people tend to pick simple passwords. This feature reduces the size of the password space.
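The arithmetic above is easy to reproduce. The snippet below uses the chapter's rounded estimate of about 26^9 - 1 (roughly five million million) passwords; the two guessing rates are the chapter's assumptions, not measurements:

```python
# The chapter's rounded estimate for the A-Z password space, lengths 1 to 8.
space = 26 ** 9 - 1            # about 5 * 10**12, "five million million"

SECONDS_PER_YEAR = 365 * 24 * 3600
SECONDS_PER_MONTH = 30 * 24 * 3600

# At one password per millisecond: on the order of 150 years.
years = (space / 1_000) / SECONDS_PER_YEAR

# At one password per microsecond: about two months.
months = (space / 1_000_000) / SECONDS_PER_MONTH

# On average a uniformly chosen password falls after searching half the space.
expected_tries = space // 2
```

Running the numbers confirms the chapter's two figures and shows why a thousandfold speedup turns an intractable search into a feasible one.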
`
`Probable Passwords
`
`Think of a word.
`
`Is the word you thought of long? Is it uncommon? Is it hard to spell or to pronounce? The answer to all three of these questions is probably
`no.
`
`Penetrators searching for passwords realize these very human characteristics and use them to their advantage. Therefore, penetrators try
`techniques that are likely to lead to rapid success. If people prefer short passwords to long ones, the penetrator will plan to try all passwords
`but to try them in order by length. There are only 261 + 262 + 263=18,278 passwords of length 3 or less. At the assumed rate of one
`password per millisecond, all of these passwords can be checked in 18.278 seconds, hardly a challenge with a computer. Even expanding
`the tries to 4 or 5 characters raises the count only to 475 seconds (about 8 minutes) or 12,356 seconds (about 3.5 hours), respectively.
`
`This analysis assumes that people choose passwords such as vxlag and msms as often as they pick enter and beer. However, people tend to
`choose names or words they can remember. Many computing systems have spelling checkers that can be used to check for spelling errors
`and typographic mistakes in documents. These spelling checkers sometimes carry online dictionaries of the most common English words.
`One contains a dictionary of 80,000 words. Trying all of these words as passwords takes only 80 seconds.
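A shortest-first search of this kind takes only a few lines of code. The sketch below is illustrative: the four-word list stands in for the 80,000-word dictionary mentioned in the text, and `check` stands in for whatever password test the system exposes. The count of strings of length 3 or less, however, matches the chapter's 18,278:

```python
from itertools import product
from string import ascii_lowercase

def short_strings(max_len=3):
    """Generate every lowercase string of length 1..max_len, shortest first."""
    for n in range(1, max_len + 1):
        for letters in product(ascii_lowercase, repeat=n):
            yield "".join(letters)

def guess(check, wordlist=("enter", "beer", "password", "secret")):
    """Try probable dictionary words first, then exhaust short strings.

    `check` is a callable returning True on a match; it plays the role of
    the system's password test in this sketch.
    """
    for candidate in wordlist:
        if check(candidate):
            return candidate
    for candidate in short_strings():
        if check(candidate):
            return candidate
    return None
```

The two phases together cover the word list plus 26^1 + 26^2 + 26^3 = 18,278 short strings, which is why the attacks described in the text finish in seconds, while a random five-letter password such as vxlag survives this particular search.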
`
`Passwords Likely for a User
`
`If Sandy is selecting a password, she is probably not choosing a word completely at random. Most likely Sandy's password is something
`meaningful to her. People typically choose personal passwords, such as the name of a spouse, a child, a brother or sister, a pet, a street
`name, or something memorable or familiar. If we restrict our password attempts to just names of people (first names), streets, projects, and
`so forth, we generate a list of only a few hundred possibilities at most. Trying this number of passwords takes under a second! Even a
`person working by hand could try ten likely candidates in a minute or two.
`
`Thus, what seemed formidable in theory is in fact quite vulnerable in practice, and the likelihood of successful penetration is frightening.
Morris and Thompson [MOR79] confirmed our fears in their report on the results of having gathered passwords from many users, shown in
Table 4-2. Figure 4-15 (based on data from that study) shows the characteristics of the 3,289 passwords gathered. The results from that
study are distressing, and the situation today is likely to be the same. Of those passwords, 86 percent could be uncovered in about one
week's worth of 24-hour-a-day testing, using the very generous estimate of 1 millisecond per password check.
`
Table 4-2. Distribution of Actual Passwords.

  Count   Percentage   Characteristic
     15      0.5%      were a single(!) ASCII character
     72      2%        were two ASCII characters
    464     14%        were three ASCII characters
    477     14%        were four alphabetic letters
    706     21%        were five alphabetic letters, all the same case
    605     18%        were six lowercase alphabetic letters
    492     15%        were words in dictionaries or lists of names
   2831     86%        total of all above categories
`
`
`
[Figure 4-15. Users' Password Choices.]
`Lest you dismiss these results as dated (they were reported in 1979), Klein repeated the experiment in 1990 [KLE90] and Spafford in 1992
`[SPA92]. Each collected approximately 15,000 passwords. Klein reported that 2.7 percent of the passwords were guessed in only 15
`minutes of machine time and 21 percent were guessed within a week! Spafford found the average password length was 6.8 characters, and
`28.9 percent consisted of only lowercase alphabetic characters. Notice that both these studies were done after the Internet worm (described
`in Chapter 3) succeeded, in part by breaking weak passwords.
`
`Even in 2002, the British online bank Egg found users still choosing weak passwords [BUX02]. A full 50 percent of passwords for their
`online banking service were family members' names: 23 percent children's names, 19 percent a spouse or partner, and 9 percent their own.
`Alas, pets came in at only 8 percent, while celebrities and football (soccer) stars tied at 9 percent each. And in 1998, Knight and Hartley
`[KNI98] reported that approximately 35 percent of passwords are deduced from syllables and initials of the account owner's name.
`