`
`
`
`
`
`Attorney Docket No.: 30146-0019IP1
`
`In re Patent of: Osborn
`U.S. Pat. No.: 6,026,293
Issue Date: Feb. 15, 2000
`Appl. Serial No.: 08/706,574
Filing Date: Sep. 5, 1996
`Title: SYSTEM FOR PREVENTING ELECTRONIC MEMORY TAMPERING
`
`Mail Stop Patent Board
`Patent Trial and Appeal Board
`U.S. Patent and Trademark Office
`P.O. Box 1450
`Alexandria, VA 22313-1450
`
`
`
`
`
`DECLARATION OF RICHARD A. KEMMERER IN SUPPORT OF INTER
`PARTES REVIEW OF UNITED STATES PATENT NO. 6,026,293
`
`
`
`I, Dr. Richard Kemmerer, of Santa Barbara, CA, declare that:
`
`
`INTRODUCTION
1. I have been retained on behalf of the Petitioner Apple Inc. to provide
`
`this Declaration concerning technical subject matter relevant to the inter partes
`
`review of U.S. Patent No. 6,026,293 (the “’293 patent”).
`
2. I am over 18 years of age. I have personal knowledge of the facts
`
`stated in this Declaration and could testify competently to them if asked to do so.
`
`BACKGROUND
`3. My name is Richard A. Kemmerer. I am an expert in the general
`
`fields of computer science and computer technology and more specifically in the
`
`
`APPLE 1003
`
`
`
`fields of computer and network security. I have been an expert in these fields since
`
`prior to 1996. In formulating my opinions, I have relied upon my training,
`
`knowledge, and experience in the relevant art. A copy of my resume is provided as
`
`an appendix to this Declaration (Appendix A) and provides a comprehensive
`
`description of my relevant experience, including academic and employment
`
`history, publications, conference participation, and issued and pending U.S.
`
`patents.
`
4. I received a B.S. degree in Mathematics from Pennsylvania State
`
University in 1966, followed by an M.S. (1976) and a Ph.D. (1979) in Computer
`
`Science from the University of California, Los Angeles.
`
`5. My academic career began in 1979 when I served as an Assistant
`
`Professor in the Department of Computer Science, University of California, Santa
`
`Barbara (1979-1985). I then served as a Visiting Professor at the Laboratory of
`
`Computer Science at MIT from 1985-1986, and an Associate Professor in the
`
`Department of Computer Science at the University of California, Santa Barbara
`
`from 1985 to 1989. Since 1989, I have served and continue to serve as a Professor
`
`of Computer Science at the University of California, Santa Barbara, where I have
`
`held the Leadership Professor Endowed Chair since 2006.
`
`6. My experience with computer and network security goes back more
`
`than 35 years. I have been teaching courses in computer security at both the
`
`
`
`
`undergraduate and graduate level since around 1983. Since 1981 I have organized
`
`an annual workshop on cryptographic techniques in Santa Barbara.
`
7. In 2009, while working with a team from the University of California
`
`at Santa Barbara, I commandeered the Torpig botnet from a team of Russian
`
`computer hackers who had been using the botnet to steal e-mail credentials, credit
`
card numbers, and other personal information from thousands of people. The
`
`botnet included hundreds of thousands of hosts that were volunteering gigabytes of
`
`sensitive information.
`
8. Since 1979 I have worked as a computer system security consultant.
`
`In 2007, while working with the California Secretary of State, I compromised
`
`several electronic voting systems to expose their security vulnerabilities. In 2000,
`
`while working with a major international bank, I compromised the bank’s
`
`computer network to demonstrate existing vulnerabilities.
`
9. Over my career, I have received numerous research grants from the
`
`U.S. Army, National Security Agency, and DARPA related to network and
`
`computer security. From 1977-1979, I worked as a research assistant at UCLA
`
`sponsored by DARPA to verify the UCLA Secure Unix Operating system. That
`
`work formed the basis of my dissertation. Most recently I have been working on a
`
`grant from the Army Research Office regarding cyber-attack analysis and
`
`prediction.
`
`
`
`
10. In 1979, I presented one of my first papers on computer security,
`
`entitled “Specification and Verification of the UCLA Security Kernel.” The paper
`
`was presented at the 7th Symposium on Operating Systems Principles and was
`
`later published in the Communications of the ACM.
`
11. I have published numerous technical articles and papers in major
`
`conference proceedings and journals, and have served on numerous national and
`
`international committees.
`
12. I have testified as an expert witness and served as a consultant in
`
`patent and intellectual property litigation and in inter partes review proceedings.
`
13. I am not, and never was, an employee of Apple Inc. I have been
`
`engaged in the present matter to provide my independent analysis of the issues
`
`raised in the petition for inter partes review of the ’293 patent. I received no
`
`compensation for this Declaration beyond my normal hourly compensation based
`
`on my time actually spent studying the matter, and I will not receive any added
`
`compensation based on the outcome of this inter partes review of the ’293 patent.
`
`14. My analysis and conclusions expressed herein are based on review
`
`and analysis of certain information obtained in connection with my work in this
`
`matter, together with my training, education, and experience. The analysis and
`
`conclusions expressed herein are my own.
`
`
`
`
`15. As part of my independent analysis for this Declaration, I have
`
`considered the following: my own knowledge and experience, including my work
`
`experience in the fields of computer science and electrical engineering; my
`
`experience in teaching and advising students in those subjects; and my experience
`
`in working with others involved in those fields. In addition, I have analyzed the
`
`following publications and materials:
`
` The disclosure and claims of U.S. Patent No. 6,026,293 to Osborn
`
`(“the ’293 patent”; Ex. 1001)
`
` U.S. Pat. No. 5,421,006 to Jablon et al. (“Jablon”; Ex. 1004)
`
` U.S. Pat. No. 5,349,697 to Pelkonen (“Pelkonen”; Ex. 1005)
`
` U.S. Pat. No. 4,727,544 to Brunner et al. (“Brunner”; Ex. 1006)
`
` U.S. Pat. No. 4,590,552 to Guttag et al. (“Guttag”; Ex. 1007)
`
` National Institute of Standards and Technology—Proposed Federal
`
`Information Processing Standard for Secure Hash Standard, 57
`
`Fed. Reg. 3,747 (Jan. 31, 1992) (“Secure Hash Standard”; Ex.
`
`1008)
`
` U.S. Pat. No. 5,802,592 to Chess et al. (“Chess”; Ex. 1009)
`
` Joint Claim Construction and PreHearing Statement in Ericsson
`
`Inc. et al. v. Apple Inc., case no. 2:15-cv-289-JRG (E.D. Tex.,
`
`complaint filed Feb. 26, 2015) (Ex. 1010)
`
`
`
`
`16. Although this Declaration refers to selected portions of the cited
`
`references for the sake of brevity, it should be understood that these are examples,
`
`and that one of ordinary skill in the art would have viewed the references cited
`
`herein in their entirety and in combination with other references cited herein or
`
`cited within the references themselves. The references used in this Declaration,
`
`therefore, should be viewed as being incorporated herein in their entirety.
`
`OVERVIEW OF CONCLUSIONS FORMED
`17. This Declaration explains the conclusions that I have formed based on
`
`my independent analysis. To summarize those conclusions:
`
` Based upon my knowledge and experience and my review of the prior art
`
`publications listed above, I believe that claims 1, 10, 11, 14, and 16 of the
`
`’293 patent are obvious in light of U.S. Patent No. 5,421,006 to Jablon et
`
`al. (“Jablon”) in view of U.S. Patent No. 5,349,697 to Pelkonen (“Pelko-
`
`nen”)
`
` Based upon my knowledge and experience and my review of the prior art
`
`publications listed above, I believe that claim 2 of the ’293 patent is ob-
`
`vious in light of Jablon in view of Pelkonen and U.S. Patent No.
`
`4,727,544 to Brunner et al. (“Brunner”)
`
` Based upon my knowledge and experience and my review of the prior art
`
`publications listed above, I believe that claims 8 and 9 of the ’293 patent
`
`
`
`
`are obvious in light of Jablon in view of Pelkonen and U.S. Patent No.
`
`4,590,552 to Guttag et al. (“Guttag”)
`
` Based upon my knowledge and experience and my review of the prior art
`
`publications listed above, I believe that claim 13 of the ’293 patent is ob-
`
`vious in light of Jablon in view of Pelkonen and National Institute of
`
`Standards and Technology—Proposed Federal Information Processing
`
`Standard for Secure Hash Standard, 57 Fed. Reg. 3,747 (Jan. 31, 1992)
`
`(“Secure Hash Standard”)
`
` Based upon my knowledge and experience and my review of the prior art
`
`publications listed above, I believe that claims 1, 2, 10, 11, 13, 14, and 16
`
of the ’293 patent are obvious in light of U.S. Patent No. 5,802,592 to
`
`Chess et al. (“Chess”) in view of Pelkonen
`
` Based upon my knowledge and experience and my review of the prior art
`
`publications listed above, I believe that claims 8 and 9 of the ’293 patent
`
`are obvious in light of Chess in view of Pelkonen and Guttag
`
`BACKGROUND KNOWLEDGE ONE OF SKILL IN THE ART WOULD
`HAVE HAD PRIOR TO THE FILING OF THE ’293 PATENT
18. The technology at issue in the ’293 patent generally relates to computer

memory security, including “preventing electronic memory manipulation . . .
`
`[and] preventing unauthorized manipulation of desirably secure memory contents
`
`in an electronic device.” See Ex. 1001 at 1:5-8. The ’293 patent describes per-
`
`
`
`
`forming an “audit” of electronic memory that includes performing a hash calcula-
`
`tion for contents of an electronic memory and comparing this calculated hash value
`
`to a valid hash value previously determined for the memory contents. Id. at 6:30-
`
`45. If this comparison reveals a disparity between the newly calculated “audit hash
`
`value” and the stored hash value for the memory, this can indicate memory tamper-
`
`ing. Id. The ’293 patent also discloses ensuring memory integrity through use of
`
`public/private key authentication for stored hash values. Id. at 8:19-46.
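The audit scheme described above can be sketched in a few lines of code. This is an illustrative sketch of my own, not the patent’s implementation; the function and variable names are hypothetical, and SHA-1 stands in for whichever hash function a particular device would use:

```python
import hashlib

def compute_audit_hash(memory_contents: bytes) -> bytes:
    # Hash the selected memory contents to derive an "audit hash value."
    # SHA-1 stands in for whatever hash the device would actually use.
    return hashlib.sha1(memory_contents).digest()

def audit_memory(memory_contents: bytes, valid_hash: bytes) -> bool:
    # Compare the fresh audit hash against the previously stored valid
    # hash; any disparity can indicate memory tampering.
    return compute_audit_hash(memory_contents) == valid_hash

# Provisioning: derive the valid hash from authentic memory contents.
authentic = b"authentic firmware image"
valid_hash = compute_audit_hash(authentic)

assert audit_memory(authentic, valid_hash)              # audit passes
assert not audit_memory(b"tampered image", valid_hash)  # tampering detected
```

In an actual device the stored valid hash would itself be encrypted or signed, as the passage above notes, so that an attacker cannot simply substitute a hash matching the tampered contents.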
`
`19. Prior to the filing date of the ’293 patent (September 5, 1996) there
`
`existed numerous products, publications, and patents that implemented or
`
`described the functionality claimed in claims 1, 2, 8-11, 13-14, and 16 of the ’293
`
`patent. Based upon my knowledge and experience and my review of the prior art
`
`publications listed above, I believe that a person of ordinary skill in the art at the
`
`time would have recognized that the subject matter described in claims 1, 2, 8-11,
`
`13-14, and 16 of the ’293 patent was well known in the prior art. Further, to the
`
`extent there was any problem to be solved in subject matter recited in claims 1, 2,
`
`8-11, 13-14, and 16 of the ’293 patent, a person of ordinary skill in the art at the
`
`time would have known that such a problem had already been solved in the prior
`
`art systems before the filing date of the ’293 patent.
`
`20. Based upon my experience in this area, a person of ordinary skill in
`
`the art in this field at the relevant time frame (“POSITA”) would have had a
`
`
`
`
`combination of experience and education in electrical or computer engineering or
`
computer science and the subfields of computer and network security and

cryptography. This typically would consist of at least a bachelor’s degree in

electrical or computer engineering, computer science, or a related engineering field with
`
`at least two years of experience in embedded systems design and knowledge in the
`
`field of cryptography.
`
`21. Based on my experiences and as described in ¶¶ 3-12, I have a good
`
`understanding of the capabilities of a POSITA. Indeed, I have taught, participated
`
`in organizations, and worked closely with many such persons over the course of
`
`my career, including during the mid-1990s and continuing through the present
`
`time.
`
`INTERPRETATIONS OF THE ’293 PATENT CLAIMS AT ISSUE
`
22. I have been informed that the ’293 patent will expire on September 5,
`
`2016. I have been informed that when a patent is set to expire during the pendency
`
`of an inter partes review the standard for claim construction shifts from the
`
`broadest reasonable interpretation standard to the district court’s Markman
`
`standard. Under a Markman standard, I have been informed that the claims of a
`
`patent are to be given their meaning as understood by a POSITA at the time of
`
`invention in light of the patent’s intrinsic evidence (e.g., specification and
`
`prosecution history) and, when appropriate, extrinsic evidence (e.g., technical
`
`
`
`
`dictionaries). The construction proposed herein is consistent with the specification
`
`and claims (i.e., the intrinsic evidence) and therefore should not be affected by the
`
`standard applied. I also understand that the words of the claims should be
`
`interpreted as they would have been interpreted by a POSITA at the time the
`
`invention was made (not today). Because I do not know at what date the invention
`
`as claimed was made, I have used the filing date of the ’293 patent, which is Sep-
`
`tember 5, 1996. Without exception, however, the interpretation that I provide
`
`below would have also been correct if the date of invention was anywhere within
`
`the early to mid-1990s. I have been asked to provide my interpretation of the
`
`following term of the ’293 patent set forth below.
`
`23. Claims 1 and 14 recite “secure from tampering.” Based on my
`
`knowledge and experience in this field and my review of the ’293 patent and other
`
`evidence, I believe a POSITA would have understood this term to mean “able to
`
`provide assurance of the integrity of the memory contents.” In reviewing the ’293
`
`patent, I observed that all references to preventing tampering relate to preventing
`
`or detecting tampering of the contents of electronic memory. For example, the
`
`’293 patent recites:
`
`drawbacks and limitations of conventional methods and proposed
`solutions for preventing cellular telephone memory tampering, and
`electronic memory tampering generally, are overcome by the present
`
`
`
`
`invention, exemplary embodiments of which protect electronic
`memory contents from unauthorized access and manipulation.
`
`In accordance with one aspect of the invention, security is achieved by
`periodically auditing electronic memory contents in an electronic
`device to ensure that the contents have not been tampered with. The
`audit involves performing a hash calculation over selected contents of
`the electronic memory to derive an audit hash value, or audit
`signature, of such contents. The audit hash value is compared with a
`valid hash value previously derived from authentic memory contents.
`The valid hash value is preferably stored in an encrypted form within
`an electronic memory and decrypted only for purposes of comparison.
`A disparity between the audit hash value and the valid hash value
`can indicate memory tampering, wherefore an electronic device
`containing the electronic memory can be rendered inoperative, or a
`warning indication can be made.
`
Ex. 1001 at 6:24-45 (emphasis added). The ’293 patent contains no explicit

discussion of preventing or detecting tampering with the specific steps of storing

an original hash value, calculating an audit hash value, or comparing hash values;

it speaks only of preventing “memory tampering” in general. Id. Additionally, the ’293 patent does
`
`not explicitly indicate which, if any, disclosed processes and/or systems are re-
`
`sponsible for ensuring that any hash calculation and comparison processes are “se-
`
`cure from tampering.” Finding no clear discussion in the ’293 patent of processes
`
`and/or systems that are explicitly implemented for the purpose of ensuring that
`
`
`
`
`hash calculation and comparison processes are secure from tampering, I have
`
`turned to other evidence, including the claim constructions provided by the Patent
`
`Owner in the corresponding litigation, which provide a definition of “able to pro-
`
vide assurance of the integrity of the memory contents” for this claim term, and I
`
`have applied this claim interpretation in analyzing the prior art references cited
`
`herein vis-à-vis the claims of the ’293 patent. See Ex. 1010 at pp. 11-12.
`
`
`
`ANALYSIS OF JABLON IN VIEW OF PELKONEN (CLAIMS 1, 10, 11, 14,
`and 16)
`24. Based upon my knowledge and experience and my review of the prior
`
`art publications listed above, I believe that claims 1, 10, 11, 14, and 16 of the ’293
`
`patent are obvious in light of Jablon in view of Pelkonen. Jablon is one of many
`
`prior art examples of a system for preventing and detecting unauthorized access to
`
`contents of electronic memory that employ secure hashing to verify the integrity of
`
`the memory contents. Ex. 1004 at 5:44-56; 6:2-5; 11:21-32; 8:60-68; 17:15-21;
`
`14:52-67; 9:18-20; 16:66-17:4. FIG. 9 of Jablon shows a flow chart of a process
`
`for computing a hash value (referred to by Jablon as a “modification detection
`
`code” or “MDC”) for the contents of memory, and comparing the newly calculated
`
`hash value to a verification hash value for original/authentic memory contents to
`
`determine if memory tampering has occurred:
`
`
`
`
`
`
`Ex. 1004 at FIG. 9; 5:44-56; 17:42-47; 11:21-32; 8:60-68. Jablon further discloses
`
`using “a one-way latch, which protects data stored in non-volatile memory” to en-
`
`sure that the hash value calculation and comparison processes are secure from
`
`tampering. Id. at 10:25-45. The one-way latch protects data stored in memory that
`
`is used during the hash calculation and comparison process from tampering. Id. In
`
`this way, this “memory protection latch . . . make[s] security mechanisms immune
`
`to software attack” and therefore makes Jablon’s hash value calculation based “in-
`
`tegrity assessment method . . . immune to the kind of violations it is intended to de-
`
`
`
`
`tect.” Id. at 6:20-25. Jablon discloses other additional security assurance features,
`
`including use of public/private key “signature verification” in conjunction with the
`
`above described memory integrity verification processes. Id. at 9:1-20. For exam-
`
`ple, Jablon discloses “using the private-key of a trusted authority” to sign the
`
`“modification detection code” for an authorized program (e.g., authenticated, valid
`
`memory contents) and storing the signature. Id. at 19:28-62. Then, during the
`
`modification detection code verification process, the signed authenticated modifi-
`
`cation detection code is retrieved, the digital signature is verified using the stored
`
`public-key, and the integrity of the program (e.g., the memory contents) is
`
`checked. Id.
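Jablon’s signed modification detection code arrangement, as described above, can be illustrated schematically. The sketch below is my own and uses a deliberately toy RSA key (the textbook primes 61 and 53) purely for exposition; it offers no real security, and a real system would use full-size keys with proper padding. SHA-1 stands in for the modification detection code computation:

```python
import hashlib

# Toy RSA key (textbook primes p=61, q=53): illustration only, no real security.
n, e, d = 3233, 17, 2753

def mdc(program: bytes) -> int:
    # "Modification detection code": a cryptographic hash of the program,
    # reduced mod n so the toy key can sign it.
    return int.from_bytes(hashlib.sha1(program).digest(), "big") % n

def sign(code: int) -> int:
    # The trusted authority signs the MDC with its private key d.
    return pow(code, d, n)

def verify(program: bytes, signature: int) -> bool:
    # The device recovers the signed MDC with the public key e and compares
    # it to a freshly computed MDC for the program as found in memory.
    return pow(signature, e, n) == mdc(program)

authorized = b"authorized boot program"
signature = sign(mdc(authorized))   # done once, by the trusted authority

assert verify(authorized, signature)   # intact program: signature checks out

patched = b"patched boot program"
# With a 12-bit toy modulus an MDC collision is conceivable, so guard the
# negative check on the MDCs actually differing (they virtually always do).
if mdc(patched) != mdc(authorized):
    assert not verify(patched, signature)
```

The design point this illustrates is that only the public key need reside on the device: an attacker who modifies the program cannot forge a matching signature without the trusted authority’s private key.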
`
25. Jablon does not explicitly disclose performing the various memory integrity

assurance and verification processes on a cellular telephone, but does ex-
`
`plicitly indicate that the described techniques can be used “in many computer sys-
`
`tems” and that “the same benefits” achieved when these techniques are applied to
`
`an “IBM-compatible personal computer” “can be realized in other computer oper-
`
`ating systems and hardware, and appropriately modified implementations will be
`
`apparent to those skilled in the art” which includes “non-PC systems.” Ex. 1004 at
`
`1:34-42; 10:23-24; 22:28-29. Indeed, based on my knowledge and experience in
`
`the relevant field, I agree with Jablon’s disclosure that a POSITA would have rec-
`
`ognized the memory integrity assurance and verification processes disclosed by
`
`
`
`
`Jablon could be readily and easily applied to a variety of computing environments,
`
`and such application would have led to a number of predictable and beneficial out-
`
`comes, especially when applied to computing devices for which memory integrity
`
`was recognized as important, such as cellular telephones and other mobile, network
`
`accessible devices. One such example of a computing device on which a POSITA
`
`would have been able to readily and beneficially apply Jablon’s processes is the
`
`radiotelephone disclosed by Pelkonen. See Ex. 1005 at 1:35-46; 1:66-2:3; 2:25-47.
`
Pelkonen’s radiotelephone includes “logic means, non-volatile memory and a
`
`RAM coupled to the logic means” and Pelkonen further discloses that “programme
`
`code is recorded in the RAM.” Id. at 1:37-39. Pelkonen also recognizes the im-
`
`portance of verifying “the integrity of the programme code stored in the RAM 3”
`
using known methods. Id. at 2:29-30. Indeed, Jablon discloses precisely such a
`
`known method, which a POSITA would have readily applied to Pelkonen’s radio-
`
`telephone to achieve predictable and beneficial results, especially by as late of a
`
`date as the filing of the ’293 patent.
`
`26. As set forth in more detail below, all of which is based upon my
`
`knowledge and experience in this art and my review of Jablon and Pelkonen, I
`
`believe that the teachings of Jablon in view of Pelkonen provide every element
`
`recited in claims 1, 10, 11, 14, and 16 of the ’293 patent, and that there were a
`
`
`
`
`number of reasons (articulated in detail in ¶¶ 29-33 below) that would have
`
`prompted a POSITA to provide this resulting combination of Jablon and Pelkonen.
`
`[1.P] A cellular telephone comprising:
`27. The teachings of Jablon in view of Pelkonen disclose a cellular tele-
`
`phone. For example, as described in ¶ 24 above, Jablon discloses processes and
`
`methods for protecting and verifying memory integrity that can be performed in
`
`numerous computing environments, including “non-PC systems,” to achieve vari-
`
`ous “benefits” disclosed by Jablon. Ex. 1004 at 1:34-42; 10:23-24; 22:28-29. In-
`
`deed, a POSITA would have recognized that Jablon’s processes would have been
`
`beneficially implemented in any computing environment involving memory, and
`
`would have been particularly useful in computing environments in which assuring
`
`the integrity of the contents of memory was useful, including cellular telephones.
`
28. To the extent that Jablon does not explicitly disclose a cellular tele-
`
`phone, the use of memory integrity assurance in cellular telephones was commonly
`
`known in similar prior art systems. For example, like Jablon, Pelkonen discloses a
`
`computing environment that includes processes for “check[ing] the integrity” of
`
`memory contents (including program code), and more specifically discloses
`
`memory integrity verification in the context of a cellular telephone. Ex. 1005 at
`
`2:29-31; 1:37-39. Pelkonen’s “radiotelephone” includes “the logic means and
`
`wherein the programme code is recorded in the RAM.” Id. at 1:37-39. Pelkonen
`
`
`
`
`further discloses a “checking programme recorded in the ROM 4 [that] controls the
`
`CPU 1 to read and check the integrity of the programme code stored in the RAM 3
`
`by a known method, for example by calculating the check sum.” Id. at 2:26-31. As
`
`discussed in ¶ 24, above, one such “known method” for verifying the integrity of
`
`the contents of the radiotelephone’s memory is the “modification detection code”
`
`method disclosed by Jablon, which Jablon explicitly describes as an improvement
`
`over prior techniques “such as checksums.” See Ex. 1004 at 5:19-35; 11:21-32;
`
`8:60-68. Furthermore, Pelkonen discloses a “check sum” as an example “known
`
`method” for “check[ing] the integrity of the programme code stored in the RAM.”
`
Ex. 1005 at 2:29-31.
`
`29. There are several reasons that would have prompted a POSITA to
`
`implement Jablon’s processes for protecting and assuring the integrity of the con-
`
`tents of an electronic memory (using hashing based “modification detection
`
`codes”) in the computing environment disclosed by Pelkonen (a cellular telephone
`
`having a “checking programme” for checking the integrity of the contents of the
`
`radiotelephone’s memory) to achieve a cellular telephone having improved
`
`memory integrity verification and protection. First, based upon my knowledge and
`
`experience and my review of the ’293 patent, Jablon, and Pelkonen, I believe that a
`
`POSITA would have understood that Pelkonen is concerned with ensuring “integ-
`
`rity of the programme code stored in the RAM 3 by a known method” and that Ja-
`
`
`
`
`blon discloses precisely such a known “integrity assessment” method for verifying
`
`and protecting the integrity of the contents of memory using hashing based “modi-
`
`fication detection codes.” Ex. 1005 at 2:29-30; Ex. 1004 at 8:60-68; 11:21-32. In-
`
`deed, Jablon indicates that the use of “modification detection codes” “address[es]
`
`th[e] problem[s]” associated with other known methods of verifying and protecting
`
`the integrity of memory contents. Ex. 1004 at 5:32-35.
`
`30. Second, based upon my knowledge and experience and my review of
`
`the ’293 patent and Jablon and Pelkonen references, I believe that a POSITA
`
`would have been prompted to implement Jablon’s processes in Pelkonen’s cellular
`
`telephone computing environment because Pelkonen contemplates using “known
`
`method[s]” for verifying memory integrity and gives an express example of using a
`
`“check sum” calculation, while Jablon’s use of hashing based “modification
`
`detection codes” was implemented to “address . . . problem[s]” associated with
`
`checksum based integrity verification techniques. Ex. 1005 at 2:29-31; Ex. 1004 at
`
`5:26-35. Indeed, based upon a review of the teachings in Jablon and Pelkonen, a
`
`POSITA would have recognized that—because Jablon’s approach to verifying
`
`memory integrity was intended as an improvement over check sum based meth-
`
`ods—applying these processes to Pelkonen would have been an improvement over
`
`the check sum example given by Pelkonen.
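The distinction drawn in this paragraph can be made concrete. A simple additive checksum of the kind Pelkonen gives as an example is blind to many modifications (for instance, any reordering of the same bytes), while a cryptographic hash of the kind Jablon’s modification detection codes employ is not. The example data below is hypothetical and SHA-1 stands in for the modification detection code:

```python
import hashlib

def check_sum(data: bytes) -> int:
    # Pelkonen-style integrity check: a simple additive checksum.
    return sum(data) % 65536

original = b"CALL 0x1000; CALL 0x2000"
# Transposed code: the same bytes in a different order.
tampered = b"CALL 0x2000; CALL 0x1000"

# The additive checksum cannot see the difference...
assert check_sum(original) == check_sum(tampered)
# ...but a cryptographic hash (a modification detection code) can.
assert hashlib.sha1(original).digest() != hashlib.sha1(tampered).digest()
```

This is the kind of “problem” with checksum-based verification that, as discussed above, Jablon’s modification detection codes were expressly intended to address.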
`
`
`
`
`31. Third, based upon my knowledge and experience and my review of
`
`the ’293 patent and Jablon and Pelkonen references, I believe that a POSITA
`
`would have understood that Jablon recognizes that “appropriately modified
`
`implementations will be apparent to those skilled in the art” and plainly suggests
`
`using Jablon’s processes “in many other computer systems” including “non-PC
`
`systems.” Ex. 1004 at 1:34-42; 10:23-24; 22:28-29. In light of such suggestions, a
`
`POSITA would have been prompted to implement Jablon’s processes in Pelko-
`
`nen’s cellular telephone computing environment because Pelkonen discloses just
`
`such a computer system in the form of Pelkonen’s “radiotelephone comprising
`
`logic means, non-volatile memory and a RAM coupled to the logic means and
`
`wherein the programme code is recorded in the RAM.” Ex. 1005 at 1:37-39.
`
`32. Fourth, based upon my knowledge and experience and my review of
`
`the ’293 patent and Jablon and Pelkonen references, I believe that a POSITA
`
`would have understood that Jablon and Pelkonen share similar motivations and
`
`therefore Jablon’s improved memory integrity verification processes could be ad-
`
`vantageously implemented on Pelkonen’s radiotelephone. Namely, both Jablon
`
`and Pelkonen are concerned with allowing code stored on memory of a computing
`
`device to be updated without requiring replacement of physical hardware compo-
`
`nents (such as memory). For example, Jablon discloses as “allow[ing] the system
`
`to be upgraded with a new boot record, as part of a general installation of new
`
`
`
`
`software” without requiring replacement of the system memory, while Pelkonen
`
`discloses the “advantage” of a user being “able to update the programme code
`
`without need to replace the memory.” Ex. 1004 at 14:41-42; Ex. 1005 at 1:40-42.
`
`33. Fifth, based upon my knowledge and experience and my review of the
`
`’293 patent and Jablon and Pelkonen references, it is plain that Jablon and
`
`Pelkonen both describe computing systems having programs and processes for pro-
`
`tecting and verifying the integrity of the contents of memory. Here, a POSITA
`
`would have been prompted to implement Jablon’s “modification detection code”
`
`memory integrity verification techniques on Pelkonen’s cellular telephone because
`
`doing so would be merely the application of a known technique (e.g., hashing
`
`based memory protection and verification) to a known device (e.g., a cellular
`
`phone having electronic memory) ready for improvement to yield predictable
`
results. Indeed, by the mid-1990s it was a well-known and accepted practice in
`
`the design of computing devices to take features (including security features) from
`
`traditional computers such as personal computers and other computing devices and
`
implement those features in cellular phones. Additionally, a POSITA would have
`
`readily understood at the time that implementing Jablon’s processes on Pelkonen’s
`
`cellular phone would not significantly alter or hinder the functions performed by
`
`the processes of Jablon, yet it would have also provided the additional benefits
`
`described above.
`
`
`
`
`[1.1] a microprocessor
`34. The teachings of Jablon in view of Pelkonen disclose a microproces-
`
sor. For example, Jablon includes numerous references to a processor or “central

processing unit” (CPU), including disclosing that the “computer system compris[es]
`
`a processor [and] random access memory,” that the “CPU 36 is [located] on the
`
`PC’s motherboard” and that the CPU is in electronic communication with “RAM
`
`13 and ROM 15” via “the address bus 12.” Ex. 1004 at claim 1; 13:45-46; 11:57-
`
61; 13:47-13:66. FIG. 1 shows Jablon’s CPU:
`
`Ex. 1004 at FIG. 1. Pelkonen also discloses that the “radiotelephone comprises a
`
`central processing unit (CPU) 1 coupled to an integrated internal bus INT BUS 2.
`
`
`
`
`
`
`The CPU 1 can communicate, via the bus 2, with a random-access-memory (RAM)
`
`3 of reading and writing type.” Ex. 1005 at 1:66-2:2; FIG. 1.
`
`[1.2] a memory
`35. The teachings of Jablon in view of Pelkonen disclose a memory. For
`
`example, Jablon includes numerous references to memory, including disclosing
`
`that the “computer system compris[es] a processor [and] random access memory,”
`
`that the “CPU 36 is [located] on the PC’s motherboard” and that the CPU is in
`
`electronic communication with “RAM 13 and ROM 15” via “the address bus 12.”
`
`Ex. 1004 at claim 1; 13:45-46; 11:57-61; 13:47-13:66. Jablon discloses that data
`
`can be stored “in a protectable memory region during a software configuration
`
`process” and that this data can include “data to verify system initialization pro-
`
`grams before they are run.” Id. at 8:39-42. Jablon makes numerous references to
`
`“non-volatile memory,” further describing this non-volatile memory as “CMOS
`
`RAM.” Id. at 7:6-8; 8:48-56; 10:25-27. Jablon also discloses loading programs in-
`
`to memory. See, e.g., id. at 12:29-30; 9:17-20; FIG. 10. Pelkonen also discloses
`
`that the “radiotelephone comprises a central processing unit (CPU) 1 coupled to an
`
`integrated internal bus INT BUS 2. The CPU 1 can communicate, via the bus 2,
`
`with a random-access-memory (RAM) 3 of reading and writing type.” Ex. 1005 at
`
`1:66-2:2; FIG. 1.
`
`[1.3] wherein the microprocessor performs a hash calculation on contents of
`the memory to derive an audit hash value
`
`
`
`
36. Jablon discloses wherein the microprocessor performs a hash
`
`calculation on contents of the memory to derive an audit hash value. For example,
`
`as discussed above in ¶ 24, Jablon discloses using known hashing techniques such
`
`as a “cryptographic hash,” or the “secure hash algorithm” taught by “National
`
`Institute of Standards and Technology--Proposed FIPS for Secure Hash Standard”
`
`to compute a hash value which Jablon refers to as a “modification detection code.”
`
`Ex. 1004 at 5:44-56; see also Ex. 1008 at pp. 1-2 (giving details on uses for the
`
`“Secure Hash Algorithm (SHA)” referenced by Jablon). Jablon uses these calcu-
`
`lated hash values or “modification detection codes” to conduct various memory in-
`
`tegrity verification processes. For example, Jablon discloses “comput[ing]
`
`modification detection codes on programs [to] verify them prior to program
`
execution.” Ex. 1004 at 6:2-5. As explained by Jablon, the system’s CPU executes
`
`program instructions that “compute[] a modification detection code on the boot
`
`record when the system starts up, and compares the code to a previously stored
`
`value in non-volatile memory. If the value does not match, the boot record program
`
`is not run, and the user is warned of the error.” Id. at 11:21-32. Jablon explains
`
`that “modification detection codes provide a suitably strong means of integrity
`
`verification” by allowing the system to “compute[], using known techniques, a
`
`modification detection code for [a] program, and compare[] it to a pre-computed
`
`stored code in protectable non-volatile memory” prior to executing the program.
`
`
`
`
`Id. at 8:60-66; see also 10:59-60; 16:66-