Automated Personal Identification Methods

133

where S is the sample feature vector of components Si (i = 1 to f) and Ti is
the true (reference) feature mean value of variance Vi.
This distance measure requires only five to ten true signatures to
establish initial values for user behaviour statistics at enrolment.
The 'best' feature set selected as standard by the above method contained
features 1, 2, 3, 6, 11, 12, 13, 14, 16, 20, 22, 25, 26, 27, 28, 29, 30, 32,
33, 40, 41, 42, 43 and 44: 24 features in all.

Page 151 of 201

UNITED SERVICES AUTOMOBILE ASSOCIATION
Exhibit 1003

134

Integrated Circuit Cards

Chapter 8
Cryptography and the Smart Card

D. W. DAVIES
(Data Security Consultant)

Cryptology is the key technology for secure systems.

8.1 Introduction

The close relationship between smart card and cryptographic techniques
can be looked at from two directions. The smart card can be used as a
component of a cryptographic system to improve its convenience or level
of security. From the other viewpoint, the smart card itself is the main
component of the system and cryptography is called upon to help it with
its task. In this chapter we shall mainly adopt the second viewpoint,
which is centred on smart card applications, but first let us look at the
smart card as an adjunct to cryptography.
The confidentiality of data on a communication line can be protected
by enciphering it. Encipherment is a transformation which makes the
transformed data seem meaningless to an outsider, yet which allows an
inverse transformation, for those authorised to receive the information,
which turns it back into its clear text. To separate the authorised readers
from others, the authorised readers hold a secret value called a
cryptographic key, without which decipherment is impossible. In the usual form
of cryptography, this secret key is used as a parameter for both the
encipherment and the decipherment functions.
When cryptography is used to protect data travelling some distance,
before it can go into operation a secret key must be established at both
the sending and the receiving end. Conveying the key from one place to
the other entails a risk of losing it to an opponent. A smart card can be
used to store a key for secure transport. The use of this key can be
authorised by means of a password, known only to authorised users, and
the smart card itself can take part in the complex process of key
management. Some of the techniques are described later in the chapter.
Sometimes, cryptography is used to encipher information not for
communications purposes but to protect it while it is stored locally. It might
be difficult to protect the local store from illicit access, or information
stored on a removable medium might be stolen or copied. When
cryptography is used for stored data, the keys are not transported but their
security is very important because they can unlock all the protection
provided. Most computers are physically insecure, so a smart card can be
used to hold the keys and the card taken away by its owner and stored in
a safe place. Here also, a password can be used to unlock the secret key
from the card.
A related problem of cryptography is the protection of the cryptographic
mechanism itself. Not only must the key be protected but also the place
where the cryptographic transformations take place. Smart cards can help
in this problem by becoming, themselves, the 'cryptographic engine' of
the system. If they have enough processing power for the purpose, they
can hold all the protective mechanism of a secure system, particularly at
the terminal end where the processing demands are less severe. The
computer itself, which might be an intelligent terminal, is physically
insecure and any part of its store or process is open to tapping or
'bugging'. To counter this, cryptographic methods are used and the keys,
together with the cryptographic transformations, are contained entirely in
smart cards. When these are removed, the system is locked up and the
information it contains is safe against illegal access.
These are examples of the close relationship between smart cards and
cryptography, seen from the side of the cryptographer who regards the
smart card as an additional tool. Our viewpoint in the rest of this chapter
is to think of the smart card as a main component of the system and see
how cryptographic techniques are used for its purpose.

8.2 PROTECTION FROM PASSIVE AND ACTIVE ATTACKS

Cryptographic techniques can be used in a large number of ways and for
many different purposes. The basic purpose is to protect a system against
misuse by impostors or unauthorised people. The first stage in protecting
a system is to analyse the threats to the system and the risks they entail.
We shall consider only those threats that are amenable to cryptographic
protection and, as a first step, we divide these into passive and active
attacks.
A passive attack attempts to read information without changing it.
Examples are the tapping of a telephone line, stealing or copying a
diskette, observing a password by looking at the keyboard while it is
entered, or picking up stray electromagnetic radiation from which a
meaningful signal can be reconstructed. Generally speaking, these attacks are
not difficult to carry out and in a widespread communication network it is
impossible to prevent them. The tapping or bugging of voice conversations
is a highly developed art which can be applied (with a few
changes) to the collecting of digital data from communication lines, I/O
channels, processors, stores, keyboards or any other part of the
information system. In these circumstances, to preserve the confidentiality of
information it must be transformed by encipherment and then transformed
back into clear form for processing, printing or display. Wherever the
information is in clear form there must be other ways to protect it, such
as not allowing unauthorised persons into offices where information is
displayed or printed.
On the other hand, an active attack is one which seeks to alter the
information, perhaps to falsify a transaction, prevent a debit to a bank
account from reaching it or even destroy an entire file. Generally speaking,
these active attacks are much more difficult to carry out and they require
more skill and sometimes more luck to achieve their purpose. When they
do succeed, the consequences are often more serious than those of a
passive attack.
It might seem that encipherment was all that is needed to prevent an
active attack on data, but this is not so. Some of the more extreme active
attacks, such as placing a bomb in the computer room, are not amenable to
cryptographic protection but, these aside, there is a much wider range of
possibilities against which protection is needed. Suppose, for example,
that an important file is stored in enciphered form and updated from time
to time, while still being enciphered. If we are not careful, an attacker
simply replaces today's file by an older one which will pass as genuine
because it is properly enciphered. Whole transactions might be knocked
out on a communications line without detection. Some encipherment
methods allow an attacker to change individual bits of a message or file
without knowing the cryptographic key. Thus there is more to protection
against an active attack than encipherment alone, and the methods used
have in the past been known as 'authentication' because they ensure that
the information remains authentic. This term is discouraged by the recent
usage of the International Standards Organisation, and we speak therefore
of data integrity, meaning that the data takes the values it was intended to
take, not those altered or substituted by an impostor.
In an extended communication system it may not be possible to prevent
the changing of information, but we can at least hope to detect when the
information loses its integrity. Thus we shall normally be speaking about
integrity verification rather than the prevention of an active attack.
One form of active attack is the masquerade, which means an unauthorised
user pretending to the system that he is an authorised one. It goes further
than this because any part of the system can be subject to a masquerade.
A well known example is the program which asks the user to present his
password. If the program is an impostor, the passwords can be given
away to an enemy. Protection against masquerading should therefore be
included in every interaction within the system where it is physically
possible for an intruder to slip in. In the work of the International
Standards Organisation this is known as peer entity authentication, since
the two communicating parts of the system are entities at the same level,
i.e. peer entities. The particular case which we meet most frequently is the
authentication of a user to the system. Masquerading as an authorised
user is the commonest type of active attack on teleprocessing systems,
typified by the 'hacker', who mainly has to contend with rather simple
password systems. Authentication of users can take much more secure
forms, using smart cards, which will be described later in section 8.5. As
in the present use of 'automatic teller machines' (cash dispensers), a
token which the user holds, together with a password which he remembers,
adds a lot to the security. A smart card, which is very difficult to forge,
makes an ideal token for this purpose.
Masquerading forms a bridge between passive and active attacks. By
tapping the line to find the password, guessing the password or trying a
lot of passwords, the attacker obtains the means to enter the system in the
guise of a genuine user. This gives him the possibility of both a passive
and an active attack, the active attack requiring no more skill than normal
use of the terminal, if a public network is used for access.
In the next three sections of this chapter we shall describe the use of
cryptography against passive attacks, preservation of data integrity and
user authentication.


8.3 CRYPTOGRAPHY

Cryptography is an ancient art which, like many others, is changing
rapidly with the development of information processing technology. We
are concerned only with ciphers, which use a cryptographic algorithm, and
not with codes, which are based on large, arbitrary code books. Figure 8.1
shows the nature of a cipher which employs two algorithms E and D to
encipher and decipher information respectively. Encipherment operates
on the plaintext using a cryptographic key (k) as parameter. Decipherment
uses the same key to operate on the ciphertext y and restore the plaintext
x. The symbols x and y can either represent single blocks of data on
which the transformation takes place, or streams of data, for example a
stream of ASCII characters passing over a communication link. This
distinction corresponds to two different types of cipher algorithm, a block
cipher and a stream cipher. Later we shall give examples of both.

Fig. 8.1 Notation for encipherment and decipherment: y = Ek(x), x = Dk(y).
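The notation of Figure 8.1 can be made concrete with a deliberately weak stand-in cipher. The sketch below (an illustration only; XOR with a repeating key stands in for a real algorithm such as the DES) shows just the essential property: decipherment with the same key k inverts encipherment.

```python
def E(k: bytes, x: bytes) -> bytes:
    # Toy symmetric cipher: XOR the data with a repeating key.
    # A real cipher is far stronger; this only illustrates the E/D notation.
    return bytes(b ^ k[i % len(k)] for i, b in enumerate(x))

D = E  # XOR is its own inverse, so decipherment reuses the same key k

k = b"secret"
y = E(k, b"plain text x")        # y = Ek(x)
assert D(k, y) == b"plain text x"  # Dk(Ek(x)) = x
```

The same key acts as the parameter of both transformations, exactly as in the usual form of cryptography described above.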

8.3.1 Attacks on a cipher

The cipher derives its strength, in part, from the key. If the secrecy
depended on the algorithm alone, when this algorithm became known to
an enemy, either by chance or by hard cryptographic work, it would be
necessary to go back to the drawing board and design a new algorithm.
All modern ciphers make use of a key so that, even when the algorithm is
known, decipherment is not possible unless the particular key employed
is known.
The possible number of different keys is important, like the number of
'differs' of the keys you use in a lock. If this is too small, a processor can
be set to try all the keys and see which one generates a plausible plaintext.
The total number of different key values is known as the size of the key
space.
If nothing were known about the plaintext and it could therefore be
any random set of digits, the most elaborate and fastest machine for
searching through the keys would still tell us nothing. We must either
have some knowledge of redundancy in the plaintext or, better still, some
examples of actual plaintext with their ciphertext equivalents. In many
cases, probable words or phrases from the plaintext can be guessed even
when their place in the text is unknown. With any of these clues about the
plaintext an attack can, in principle, be mounted by searching through all
the keys. A very large keyspace prevents the breaking of a cipher by
simple key searching, but it does not ensure that other, cleverer methods
will not succeed. Only long experience with the practical breaking of
ciphers can enable the strength of a cipher to be evaluated. Ultimately,
there can be no guarantee, because the whole range of possible methods
of attack can never be enumerated.
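A key search of the kind described can be sketched with a toy cipher whose keyspace is small enough to enumerate in full. The cipher, the key and the known plaintext/ciphertext pair below are all invented for illustration; the structure of the attack is what matters.

```python
def E(k: int, block: int) -> int:
    # Toy 16-bit block cipher: XOR with the key, then a fixed 3-bit rotation.
    v = (block ^ k) & 0xFFFF
    return ((v << 3) | (v >> 13)) & 0xFFFF

secret = 0xBEEF                        # the key the attacker wants to find
known_plain = 0x1234                   # one known plaintext block...
known_cipher = E(secret, known_plain)  # ...with its ciphertext equivalent

# Exhaustive search: try every key and keep those consistent with the pair.
candidates = [k for k in range(2 ** 16)
              if E(k, known_plain) == known_cipher]
```

With a 16-bit keyspace this search completes in a fraction of a second on any processor; the whole point of a 56-bit or larger keyspace is to push the identical procedure beyond practical reach.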

8.3.2 The Data Encryption Standard

The US Government employs 'Federal Information Processing Standards'
for its own use. In 1973, the US National Bureau of Standards announced
that it was contemplating a standard for data encryption and asked for
proposals. The response was disappointing, but after a second call in 1974
a proposal from IBM was seen to have promise and, after some changes,
it was published in 1975 as a draft. Eventually this was adopted in 1976 as
the Data Encryption Standard defined by FIPS Publication 46.
The Data Encryption Standard or DES had a much wider influence
than was originally intended. The US standards body ANSI made it a US
standard and it was widely adopted, particularly in banking and financial
services. Because of restrictions placed on the export of chips for the
DES algorithm from the USA, its use outside the USA has largely been
confined to banking and financial services. It became a de facto standard
within this community.
The DES is a block cipher with plaintext and ciphertext blocks of 64
bits and it employs a key of 64 bits, but 8 of these are parity bits, so that its
keyspace depends only on 56 bits.
The structure of the algorithm is described in detail in reference [1]. It
uses a combination of bit permutations and substitutions. By substitution
we mean that an input field is used as an address to look up a table and
produce an output field. The DES employs eight substitutions, each with
a 6 bit input and 4 bit output. These are known as the 'S boxes'.
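The relation between the 64-bit key and its 56 effective bits can be shown directly. Under the FIPS 46 convention the low-order bit of each of the eight key bytes is adjusted so that every byte has odd parity; the sketch below applies that adjustment to an example key.

```python
def set_des_parity(key: bytes) -> bytes:
    # Set the low-order bit of each of the 8 key bytes so the byte has odd
    # parity; only the remaining 56 bits carry key material.
    out = bytearray()
    for b in key:
        top7 = b & 0xFE                            # the 7 key-carrying bits
        ones = bin(top7).count("1")
        out.append(top7 | (0 if ones % 2 else 1))  # make the bit count odd
    return bytes(out)

key = set_des_parity(bytes.fromhex("0123456789abcdef"))
assert all(bin(b).count("1") % 2 == 1 for b in key)   # every byte odd parity

keyspace = 2 ** 56   # 72 057 594 037 927 936 distinct keys
```

Because the parity bits are fixed by the other 56, two 64-bit keys differing only in parity bits select the same cipher key.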
From the outset, the strength of the DES has been controversial. The
size of its keyspace was criticised on the basis that a complete search
through its keyspace could be carried out in less than one day by a
machine containing one million devices, each able to test one million keys
per second. The cost of such a machine was argued about, but believed at
the time to be in the region of $10M. The structure of the S boxes was
clearly not random, but the criteria for choosing these tables have never
been published and the work done to develop the DES is classified. Many
felt that this was unsatisfactory for a standard.
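The one-day figure quoted above follows directly from the stated rates:

```python
keyspace = 2 ** 56
machines = 10 ** 6   # one million devices
rate = 10 ** 6       # each testing one million keys per second

seconds = keyspace / (machines * rate)
days = seconds / 86400
# A complete sweep takes about 0.83 days; on average the key turns up
# halfway through the search, in roughly ten hours.
```

The same arithmetic underlies the later observation that allowing the search to run for a month brings the machine's cost within reach of a large organisation.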
The whole idea of a published algorithm was a new one. Though the
secrecy of the algorithm is not assumed to be essential to the security of a
cipher, no cipher had been published, studied and discussed so widely
and in such detail.
Studies in many places revealed interesting properties of the DES,
including the existence of four weak keys and twelve semi-weak keys
which give the cipher special properties. Interesting facets of the design of
the DES continue to be discovered. However, none of these has shown
any significant weakness of the algorithm beyond that which is implied by
the size of its keyspace.
As information technology improved, the cost of a key searching
machine decreased until, by about 1985, it became obvious that a searching
machine would be within the bounds of expenditure by a large corporation
or a large criminal organisation, assuming that the searching time was
allowed to extend to about one month. This is a real threat when the
same key is used for longer than this time. Multiple encryption using the
DES with at least two different keys has been adopted by the financial
community to overcome this problem. This is particularly relevant in key
management for a large system where a master key remains in place for
some time.
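Two-key multiple encryption is commonly arranged as encipher-decipher-encipher (E-D-E). The sketch below uses an invented 8-bit toy cipher as a stand-in for the DES to show the structure; one virtue of the E-D-E arrangement is that setting the two keys equal collapses it to single encipherment, which eases migration from existing single-key equipment.

```python
MASK = 0xFF

def E(k: int, b: int) -> int:
    # Toy 8-bit block cipher standing in for the DES: XOR, then nibble swap.
    v = (b ^ k) & MASK
    return ((v << 4) | (v >> 4)) & MASK

def D(k: int, b: int) -> int:
    v = ((b << 4) | (b >> 4)) & MASK   # undo the nibble swap
    return v ^ k

def ede_encipher(k1: int, k2: int, p: int) -> int:
    return E(k1, D(k2, E(k1, p)))      # C = Ek1(Dk2(Ek1(P)))

def ede_decipher(k1: int, k2: int, c: int) -> int:
    return D(k1, E(k2, D(k1, c)))
```

With k1 = k2 the inner D cancels the first E, so the result equals a single encipherment under k1, while with distinct keys an exhaustive search must contend with the doubled key length.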
By 1988 the US had decided not to renew their endorsement of the
DES for federal purposes. When this was announced in 1986 it caused
consternation among the financial users, but the US Government stated
that it would not discourage use of the DES for financial transaction
purposes.
At the present time, adoption of the DES, or any other cipher, as an
international standard has been stopped by a resolution of the ISO
Council, so there is no role for the international standards body in the
development of a replacement. This leaves the financial community in an
unsatisfactory position, particularly where the use of the DES for
safeguarding large payments is concerned. For small payments, with a good
key management system, the DES is usually considered adequate.
The algorithm was designed for implementation solely in hardware on a
special chip, according to the US Federal standard. The ANSI standard
removed this condition and many systems now employ the DES
implemented in software on a microprocessor. Because of its origins and the
original intention to use hardware, the DES is not a convenient algorithm
for software implementation and was outside the ability of the early smart
cards. Recent developments in smart cards have made the DES more
feasible, so that it is now possible to implement the DES in software
within the smart cards themselves. Where this is the algorithm required by
financial transactions, incorporation of the cipher in a smart card is a
useful contribution to security, enabling the card to store its own secret
key and use it internally.
A replacement for the DES will probably be a block cipher with the
same size of plaintext and ciphertext blocks in order to ease the change
from DES to the new cipher. The key size would have to be larger, at
least 64 bits. A replacement for DES should be designed to be easy to
implement in hardware and software.

8.3.3 Methods of using a block cipher

There is a limited number of applications for a cipher which can only
handle 64 bits. A longer message could be broken into blocks of 64 bits,
for example blocks of eight ASCII characters, and each block in turn could
be enciphered. This simple method of enciphering a longer stream is not
recommended for two reasons. Firstly, an enemy could extract pieces of
the message made of a number of blocks and reuse them or rearrange
them in messages of his own construction. Secondly, though eight characters
may have a large number of potential values, in some contexts there

Fig. 8.2 Cipher block chaining.

would be blocks which repeated quite frequently and could be recognised
in ciphertext, so that methods used for breaking codebooks could be
applied. For example, in English text the words 'of the', with the three
spaces, when it occurs on the eight character boundary will probably be
the most common block and can thus be recognised. In computer output,
sequences such as all zeros or all spaces will easily be detected.
To overcome these problems a method of using the DES was introduced
known as cipher block chaining (CBC), and this can be applied to any
block cipher. It is illustrated in Figure 8.2, which shows the ciphertext
produced by the sender added, modulo 2, to the next plaintext block.
This chains each block to its neighbour, preventing the code book analysis
method and the separation and reassembly of blocks of text. When applying
this method, the block which is modulo 2 added to the first plaintext
block is called the initialising variable or IV. In many systems, a new IV is
sent when each new key is distributed and the IV is kept as secret as the
key.
Decipherment presents no problems, since each ciphertext block can be
stored and modulo 2 added to the deciphered output of the following block.
There is no synchronising problem in the sense that, whatever happens to
modify or momentarily interrupt the ciphertext, provided the boundaries
of the 64 bit block can be recognised at the receiver, the system will
recover. For example, a single error in ciphertext results in completely
random output for the plaintext block into which it is transformed. The
same error goes into the 64 bit store and emerges in the following block,
so two blocks of text are affected. After this, the error does not propagate
any more. Even this amount of error extension can be troublesome. It can
interact badly with error correction schemes and it greatly increases the
average error rate on the line.
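Both the chaining and the two-block error extension can be seen in a short sketch. A toy XOR 'cipher' stands in for the DES here (with a real cipher the first affected block would come out completely random rather than with a single flipped bit), but the chaining structure is identical.

```python
def E(k: bytes, block: bytes) -> bytes:
    # Toy block cipher stand-in: XOR with the key (self-inverse).
    return bytes(a ^ b for a, b in zip(block, k))

D = E

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def cbc_encipher(k, iv, blocks):
    out, prev = [], iv
    for p in blocks:
        prev = E(k, xor(p, prev))   # chain previous ciphertext into each block
        out.append(prev)
    return out

def cbc_decipher(k, iv, blocks):
    out, prev = [], iv
    for c in blocks:
        out.append(xor(D(k, c), prev))   # stored ciphertext re-enters here
        prev = c
    return out

k, iv = b"\x5a" * 8, b"\x11" * 8
plain = [b"AAAAAAAA", b"BBBBBBBB", b"CCCCCCCC"]
cipher = cbc_encipher(k, iv, plain)

# A single ciphertext error corrupts its own block and the next, then dies out.
damaged = [bytes([cipher[0][0] ^ 0x01]) + cipher[0][1:]] + cipher[1:]
result = cbc_decipher(k, iv, damaged)
```

Decipherment of the undamaged stream restores the plaintext exactly; with the damaged stream, only the first two blocks of the result differ from the original.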
The CBC mode of operation is most useful where the information
already has some imposed block structure, as in formatted messages.
When these messages have been assembled in a store, the CBC
encipherment can be applied to the whole stored message and then the result
transmitted or stored, as required.

Fig. 8.3 One-bit cipher feedback.

This mode of operation is not convenient when the natural unit of
communication is smaller than a block. For example, much communication
takes place in the form of 8 bit units or less, for example 8 bit ASCII
characters or 5 bit telegraph codes. For these, the use of a 64 bit block
cipher would mean that characters were held up, waiting for transmission,
until the block was filled. Interactive communication between two operators
would require that, after a short pause, the residual characters were
padded out and transmitted. To satisfy these requirements, another method
of using a block cipher was introduced, which is cipher feedback (CFB).
This is illustrated in Figure 8.3. At the top of the figure is a 64 bit shift
register into which the ciphertext is introduced, bit by bit, from the right
hand end. Thus it contains, at each end of the communication line, a
record of the last 64 bits of the ciphertext. At each end of the line, this
block is enciphered and from the result only the left hand bit is extracted.
This bit stream is added into the plaintext to produce the ciphertext.
Effectively, it is a random bit stream generated by feeding back the
ciphertext itself. Note that the cipher algorithm is in this case used only in
the encryption mode; in fact a reversible algorithm is not really needed.
Because this works on each bit as it arrives, it is completely transparent
to the procedure used by the communicating parties, whether they are
people or machines. It is normally used within the OSI architecture at the
physical layer because it treats only the bit stream, without reference to
its structure, and passes on each bit as it arrives.
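One-bit cipher feedback as in Figure 8.3 can be sketched directly. The 64-bit encipherment E below is an invented keyed mixing function standing in for a real block cipher; as the text notes, any keyed function will do, since CFB uses the algorithm in the encryption direction only.

```python
MASK64 = 2 ** 64 - 1

def E(k: int, block: int) -> int:
    # Invented keyed mixing function standing in for a real block cipher.
    x = ((block ^ k) * 0x9E3779B97F4A7C15) & MASK64
    return x ^ (x >> 29)

def cfb1(k: int, iv: int, bits, decrypt=False):
    reg, out = iv, []
    for b in bits:
        key_bit = E(k, reg) >> 63                # select the left hand bit
        o = b ^ key_bit                          # modulo-2 add to the data bit
        feedback = b if decrypt else o           # always feed back the CIPHER bit
        reg = ((reg << 1) | feedback) & MASK64   # shift it in from the right
        out.append(o)
    return out

k, iv = 0x0123456789ABCDEF, 0x1111111111111111
plain = [1, 0, 1, 1, 0, 0, 1, 0]
cipher = cfb1(k, iv, plain)
```

Both ends stay in step because each feeds the same ciphertext stream into its shift register; note also that one full encipherment is computed per bit, which is why the text warns that the method is slow on a limited processor.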
`
`
`The entire DES algorithm calculation must be made again for each bit,
`
`this method will be slow:
`so if there is a limited speed of processing,
`
`As with the CBC mode, the shift register must be loaded before the
`
`
`
`process begins. In this case the initialising variable (IV) can be sent in
`
`
`
`clear over the line in a preamble. It can be in clear because the block is
`
`Page 162 of 201
`
`


Fig. 8.4 Output feedback.

enciphered before it is ever used. Another possibility is to send no IV but
to transmit random data for the first 64 bits, which will ensure that the
two registers are synchronised.
For this method, there is no synchronising problem because there is no
64 bit or other boundary to be observed. If an error occurs on the line or,
for example, a bit is missed or an extra bit is inserted, this causes an
immediate error in the delivered plaintext and also affects the shift register
during the next 64 bits. Consequently the error extension property of this
