———————

BEFORE THE PATENT TRIAL AND APPEAL BOARD

———————

TESLA INC.,
Petitioner,

v.

INTELLECTUAL VENTURES II LLC,
Patent Owner.
———————

IPR2025-00222
U.S. Patent No. 9,232,158
_____________________

DECLARATION OF R. MICHAEL GUIDASH
UNDER 37 C.F.R. § 1.68 IN SUPPORT OF PETITION FOR
INTER PARTES REVIEW

TABLE OF CONTENTS

I.    Introduction ............................................................................................ 4
II.   Qualifications and Professional Experience .......................................... 8
III.  Level of Ordinary Skill in the Art ......................................................... 11
IV.   Relevant Legal Standards ..................................................................... 12
V.    Technical Background .......................................................................... 14
      A.  Image Sensors ................................................................................. 14
      B.  Dynamic Range ............................................................................... 20
          1.  True Dynamic Range vs. Intra-Scene Dynamic Range .............. 20
          2.  Expanding Intra-Scene Dynamic Range Using Multiple
              Integration Times ....................................................................... 21
      C.  Exposure .......................................................................................... 23
VI.   The ’158 Patent .................................................................................... 30
      A.  Overview ......................................................................................... 30
      B.  Prosecution History ......................................................................... 34
VII.  Claim Construction .............................................................................. 34
      A.  “during a frame” (claim 2) .............................................................. 36
VIII. Identification of How the Claims are Unpatentable ............................ 38
      A.  Ground 1: Claims 1-2, 5, 8-9, 11, 13-16, and 19 Would Have
          Been Obvious Over Matsushima .................................................... 39
          1.  Overview of Matsushima ........................................................... 39
          2.  Analysis ...................................................................................... 40
      B.  Ground 2: Claims 1-2, 4-5, 8-11, 13, and 14 Would Have Been
          Obvious Over Yu and Miyazaki ..................................................... 71
          1.  Overview of Yu .......................................................................... 72
          2.  Overview of Miyazaki ................................................................ 74
          3.  Analysis ...................................................................................... 75
IX.   Conclusion .......................................................................................... 111

I, R. Michael Guidash, do hereby declare as follows:

I. INTRODUCTION

1. I am making this declaration at the request of Tesla Inc. in the matter of the Inter Partes Review of U.S. Patent No. 9,232,158 (“the ’158 patent”) to Olsen et al.

2. I am also being reimbursed for reasonable and customary expenses associated with my work and testimony in this investigation. My compensation is not contingent on the outcome of this matter or the specifics of my testimony, and I have no other interest in this case or the parties thereto.

3. I have been asked to provide my opinions regarding whether Claims 1-2, 4-5, 8-11, 13-16, and 19 (“Challenged Claims”) of the ’158 patent are unpatentable because they would have been obvious to a person having ordinary skill in the art (“POSITA”) at the time of the alleged invention, in light of the prior art. It is my opinion that all of the limitations of the Challenged Claims would have been obvious to a POSITA.

4. In forming the opinions set forth in this declaration, I have reviewed the documents in the following table:

Ex.1001   U.S. Pat. No. 9,232,158 to Olsen et al. (“the ’158 patent”)

Ex.1002   Prosecution History of U.S. Application No. 14/063,236 (issued as the ’158 patent)

Ex.1005   U.S. Pat. No. 6,611,289 to Yu et al. (“Yu”)

Ex.1006   U.S. Pat. No. 7,365,780 to Miyazaki (“Miyazaki”)

Ex.1007   JP Application Publication No. JP2003-319231 to Matsushima (Certified English Translation + Declaration + Japanese) (“Matsushima”)

Ex.1008   Eastman Kodak Company, Shutter Operations for CCD and CMOS Image Sensors, Application Note, MTD/PS-0259, Revision 1 (October 23, 2001) (retrieved from https://web.archive.org/web/20030419002619/http://www.kodak.com/global/plugins/acrobat/en/digital/ccd/applicationNotes/ShutterOperations.pdf)

Ex.1009   Eastman Kodak Company, Kodak CMOS Image Sensors White Paper (November 10, 2000) (retrieved from https://web.archive.org/web/20010611235410/http://www.kodak.com/US/plugins/acrobat/en/digital/ccd/cmos.pdf)

Ex.1010   Eastman Kodak Company, Kodak Digital Science KAC-0311 Image Sensor (August 5, 2002) (retrieved from https://web.archive.org/web/20030411152500/http://www.kodak.com/global/plugins/acrobat/en/digital/ccd/products/cmos/KAC-0311LongSpec.pdf)

Ex.1011   U.S. Pat. No. 7,733,414 to Kobayashi

Ex.1012   Dave Litwiller, “CCD vs. CMOS: Facts and Fiction,” Photonics Spectra (January 2001)

Ex.1013   U.S. Application Pub. No. 2004/0239771 to Habe

Ex.1014   EP Application Pub. No. EP0858208 to Weldy et al.

Ex.1015   Memorandum in Support of Claim Construction Order, Intellectual Ventures I LLC et al. v. General Motors Company et al., 6:21-CV-1088 (WDTX), December 1, 2022

Ex.1016   Texas Instruments, TC255P 336- × 244-PIXEL CCD IMAGE SENSOR (March 2003)

Ex.1021   Eastman Kodak Company, Solid State Image Sensors Terminology, Application Note, DS 00-001, Revision 0 (December 8, 1994) (retrieved from https://web.archive.org/web/20030413064955/http://www.kodak.com/global/plugins/acrobat/en/digital/ccd/applicationNotes/terminology.pdf)

Ex.1022   U.S. Pat. No. 7,830,435 to Guidash

Ex.1023   Eastman Kodak Company, Charge Coupled Device (CCD) Image Sensors, CCD Primer, MTD/PS-0218, Revision No. 1 (May 29, 2001) (retrieved from https://web.archive.org/web/20030422183725/http://www.kodak.com/global/plugins/acrobat/en/digital/ccd/applicationNotes/chargeCoupledDevice.pdf)

Ex.1024   Szeliski, “Image Mosaicing for Tele-Reality Applications” (May 1994)

Ex.1025   Mann et al., “On Being ‘Undigital’ with Digital Cameras: Extending Dynamic Range By Combining Differently Exposed Pictures” (May 1995)

Ex.1026   Debevec et al., “Recovering High Dynamic Range Radiance Maps from Photographs,” SIGGRAPH ’97: Proceedings of the 24th Annual Conference on Computer Graphics and Interactive Techniques, 369-378 (August 1997)

Ex.1027   PCT Pub. No. WO2001/010110 to Stark

Ex.1028   Yang et al., “A 640 × 512 CMOS Image Sensor with Ultrawide Dynamic Range Floating-Point Pixel-Level ADC,” IEEE Journal of Solid-State Circuits, Vol. 34, No. 12 (December 1999)

Ex.1029   CCD Electronic Shutters, Interactive Java Tutorial, October 22, 2002 (retrieved from https://web.archive.org/web/20021028042044/https://micro.magnet.fsu.edu/primer/java/digitalimaging/ccd/shutter/index.html)

Ex.1030   Prosecution History of U.S. Application No. 11/212,803 (abandoned)

Ex.1031   Prosecution History of U.S. Application No. 11/788,122 (issued as U.S. Pat. No. 7,564,019)

Ex.1032   Prosecution History of U.S. Application No. 12/496,854 (issued as U.S. Pat. No. 8,198,574)

Ex.1033   Prosecution History of U.S. Application No. 13/465,229 (issued as U.S. Pat. No. 8,334,494)

Ex.1034   Prosecution History of U.S. Application No. 13/681,603 (issued as U.S. Pat. No. 8,598,504)

Ex.1035   Holst, CCD Arrays, Cameras, and Displays, 2nd Edition (1998)

Ex.1036   U.S. Pat. No. 4,831,403 to Ishida et al.

Ex.1037   U.S. Pat. No. 7,443,427 to Takayanagi

Ex.1038   U.S. Pat. No. 4,642,679 to Nagano

Ex.1039   e2v Technologies, CCD Sensors Technical Note: Glossary of Terms (2003)

Ex.1040   Ralph E. Jacobson et al., The Manual of Photography: Photographic and Digital Imaging, 9th Edition (2000)

Ex.1041   U.S. Pat. No. 6,215,597 to Duncan et al.

Ex.1042   O’Donnell, The Practical Use of the Exposure Triangle (retrieved from https://creativeraw.com/practical-use-exposure-triangle-explained/)

Ex.1043   U.S. Pat. No. 2,953,983 to Larson

5. Additionally, I have utilized my own experience and expertise, including my understanding of the knowledge and capabilities of a person of ordinary skill in the relevant art in the timeframe of the claimed priority date of the ’158 patent.

6. Unless otherwise noted, all emphasis in any quoted material has been added.

II. QUALIFICATIONS AND PROFESSIONAL EXPERIENCE

7. My complete qualifications and professional experience are described in my Curriculum Vitae, a copy of which can be found in Exhibit 1004. The following is a brief summary of my relevant qualifications and professional experience.

8. After graduating with a Bachelor of Science in Electrical Engineering from the University of Delaware in 1981, I started my career with Eastman Kodak as a product engineer for application-specific integrated circuits (ASICs) used to perform photometer functions (e.g., auto-exposure, auto-focus, and flash control) for Kodak film cameras. In this role, I was responsible for all aspects of IC test, packaging, delivery, customer interface, failure analysis, yield, and cost of the photometer ASICs. This required detailed knowledge of camera designs, functions, and operation. This role continued until 1987 and included responsibility for copier output driver ASICs and photometer ASICs for Kodak DISC cameras and Instant Film cameras.

9. Between 1986 and 1988, I transitioned to the Kodak Research Laboratories and CCD wafer fabrication facility. During my time there, I developed 2 µm and 1 µm CMOS processes and a 30 V, 4 µm bipolar complementary metal-oxide-semiconductor (BiCMOS) process. These processes were used for gate arrays in many Kodak products and for output driver ASICs for all of Kodak's copiers. During this time, I was also awarded entrance into the Special Opportunity Graduate Program, which allowed me to obtain a Master of Science in Electrical Engineering from the Rochester Institute of Technology. My Master's thesis involved design and process integration of bipolar and CMOS transistors into an existing CCD (charge-coupled device) image sensor process.

10. In 1989, I initiated and began leading a Smart Sensor Group, which developed BiCMOS-CCD processes to provide fully integrated CCD systems on a chip. I also served as product engineer and yield enhancement engineer for Kodak's high-volume CCDs. During this time, I led product delivery and technology development of all ASIC and smart sensor products used in Kodak copiers and other electronic products, as well as CCD sensor yield improvement for sensors used in Nikon digital SLR cameras. I developed a number of products during this time, including an interline CCD integrated with 2 µm CMOS on the same chip and a linear CCD image sensor with on-chip timing and control. I also led and directed the CCD dark current and point defect reduction team.

11. In 1996, I formally started the CMOS Image Sensor group at Kodak. From 1996 to 2009, I managed the R&D and product development of CMOS Image Sensor (CIS) programs. As is further shown in my curriculum vitae, I was the inventor or co-inventor of many key patents in the field of CMOS image sensors, including pinned photodiode pixels, camera-on-a-chip architecture, and shared amplifier pixel architectures. During this time, I led the development into mass production of the world's first pinned photodiode CIS product, a shared amplifier CIS pixel product, and the world's first ½-inch optical format 1.3-megapixel CMOS image sensor, providing successful delivery to several compact digital camera products for Kodak and other camera manufacturers. I also led the development of one of the earliest 3D hybrid stacked backside illuminated (BSI) CIS architectures. This led to the demonstration of very high performance, low dark current BSI prototype sensors with pixel sizes ranging from 0.9 µm to 5.0 µm. During development of these CIS devices, I worked closely with Kodak's and external companies' digital camera developers, and these CIS devices were commercialized in Kodak consumer digital cameras, as well as other companies' low-cost compact digital cameras.

12. In 2001, product development was transferred to other individuals and my role was focused on CIS R&D. During this time, I initiated and led a cross-functional camera R&D team directed at CIS and other Kodak technologies for novel co-optimized cameras and camera sub-systems, including cell phone camera modules with small form factors, improved low-light response, and fast auto-focus. During this time, I interfaced with cell phone camera manufacturers regarding their requirements and presented our technological improvements and camera module and CIS roadmaps.

13. In 2009, when Kodak closed its CMOS Image Sensor Business, I transitioned into a role as an intellectual property technologist and coordinator. This role included managing remaining image sensor patent applications and office actions, managing broad electronic components patent applications and office actions, and providing technical support for intellectual property sales and licensing teams.

14. In 2012, I left Kodak and started my consulting company, providing technical consulting services for CMOS image sensors. In this role, I provide consulting services on processes, pixels, circuits, sensor architectures, systems, and intellectual property to companies and clients.

15. I am a named inventor on over 100 issued U.S. patents. The majority of these patents are directed toward technologies relevant to cameras and image sensors.

III. LEVEL OF ORDINARY SKILL IN THE ART

16. I understand there are multiple factors relevant to determining the level of ordinary skill in the pertinent art, including (1) the levels of education and experience of persons working in the field at the time of the invention; (2) the sophistication of the technology; (3) the types of problems encountered in the field; and (4) the prior art solutions to those problems.

17. It is my understanding that the earliest possible priority date for the ’158 patent is August 25, 2004. A person of ordinary skill in the art (“POSITA”) in the field of the ’158 patent, as of August 25, 2004, would have been someone knowledgeable and familiar with the digital imaging systems arts that are pertinent to the ’158 patent. That person would have a bachelor’s degree from an accredited program in electrical engineering, computer engineering, computer science, or a related field and 2-3 years of experience in digital imaging systems. Lack of work experience can be remedied by additional education, and vice versa.

18. For purposes of this Declaration, in general, and unless otherwise noted, my statements and opinions, such as those regarding my experience and the understanding of a POSITA generally (and specifically related to the references I consulted herein), reflect the knowledge that existed in the field as of the earliest priority date of the ’158 patent (i.e., August 25, 2004). Unless otherwise stated, when I provide my understanding and analysis below, it is consistent with the level of a POSITA as of the alleged priority date of the ’158 patent.

IV. RELEVANT LEGAL STANDARDS

19. I am not an attorney. In preparing and expressing my opinions and considering the subject matter of the ’158 patent, I am relying on certain basic legal principles that counsel have explained to me. These principles are discussed below.

20. I understand that prior art to the ’158 patent includes patents and printed publications in the relevant art that predate the priority date of the alleged invention recited in the ’158 patent.

21. I have been informed that a claimed invention is unpatentable under 35 U.S.C. § 103 if the differences between the invention and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made to a person having ordinary skill in the art to which the subject matter pertains. I have also been informed by counsel that the obviousness analysis takes into account factual inquiries including the level of ordinary skill in the art, the scope and content of the prior art, and the differences between the prior art and the claimed subject matter.

22. I have been informed by counsel that the Supreme Court has recognized several rationales for combining references or modifying a reference to show obviousness of claimed subject matter. Some of these rationales include the following: (a) combining prior art elements according to known methods to yield predictable results; (b) simple substitution of one known element for another to obtain predictable results; (c) use of a known technique to improve a similar device (method, or product) in the same way; (d) applying a known technique to a known device (method, or product) ready for improvement to yield predictable results; (e) choosing from a finite number of identified, predictable solutions, with a reasonable expectation of success; and (f) some teaching, suggestion, or motivation in the prior art that would have led one of ordinary skill to modify the prior art reference or to combine prior art reference teachings to arrive at the claimed invention.

V. TECHNICAL BACKGROUND

23. In its background, the ’158 patent discusses various well-known components and techniques in digital imaging, including image sensors, controlling exposure times for image sensors, and expanding dynamic range for image sensors. Ex.1001, 1:32-2:42. All of these techniques were not only well-known, but also widely available in commercial products, such as those offered by Eastman Kodak Company (“Kodak”). See, e.g., Ex.1009 (Kodak Image Sensors White Paper describing Kodak’s leading technology in image sensors); Ex.1008 (Kodak Shutter Operations Application Note describing shutters in digital cameras to control photodiode integration times).

A. Image Sensors

24. Image sensors are devices “capable of converting an incident optical pattern (i.e. image) into an electronic signal which contains all spatial and intensity relationships of the original pattern,” and the term is “usually used to refer to solid state semiconductor image sensors.” Ex.1021, 24. There were two well-known types of image sensors: “CCD (Charge Coupled Device) image sensors” and “CMOS (Complementary Metal Oxide Semiconductor) image sensors.” Ex.1009, 2.

25. Both CCD image sensors and CMOS image sensors (also referred to as CIS) include an array of photosensitive cells, also referred to as pixels, on the image sensor chip that respond to light striking the chip. Each pixel includes a photodetector, usually a photodiode (PD) including a p-n junction. I have created a simplified cross-sectional diagram of a photodiode for illustration below.

26. While the CCD and CMOS image sensor pixel structures are different, the basic imaging process using the pixels is the same, which includes: (1) converting incident light into an electrical charge, (2) collecting and spatially confining the charge in the pixel, and (3) transferring the collected charge from the photodetector to be converted into a voltage and read out. Ex.1023, 5.

27. First, the photodiode converts light (photons) into charge (electrons), which is integrated at pixel sites of image sensors. Ex.1009, 4 (“The photodiode converts light (photons) into charge (electrons).”). FIG. 7 of Ex.1023 below illustrates the photon interaction with silicon, “converting light (photons) to electronic charge.” Ex.1023, FIG. 7, 5 (“An image is acquired when incident light, in the form of photons, falls on the array of pixels. The energy associated with each photon is absorbed by the silicon and causes a reaction to take place. This reaction yields the creation of an electron-hole charge pair (or simply an electron)”). “The number of electrons collected at each pixel is linearly dependent on light level and exposure time and non-linearly dependent on wavelength.” Ex.1023, 5.

Ex.1023, FIG. 7.

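To illustrate the linear dependence noted in Ex.1023, the relationship can be summarized in the following simplified, first-order form. This is my own illustrative notation, not an equation taken from the cited exhibits; the quantum efficiency term is simply where the wavelength dependence enters.

```latex
% Simplified, illustrative first-order model of charge collection in a pixel:
%   N_e    electrons collected during the integration period
%   \eta   quantum efficiency at the incident wavelength
%   \Phi   incident photon flux per unit area
%   A      photosensitive area of the pixel
%   t_int  integration time
N_e \;\approx\; \eta \,\Phi\, A\, t_{\mathrm{int}}
```
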
28. Second, the photoelectrons are collected (integrated) and spatially confined within each photodiode. FIG. 8 of Ex.1023 below illustrates an example photodiode structure, which “consists of a vertically stacked conductive material (doped polysilicon) overlying a semiconductor (silicon) separated by a highly insulating material (silicon dioxide).” Ex.1023, 6.

Ex.1023, FIG. 8.

29. As shown in FIG. 8 of Ex.1023 above, by applying a voltage potential to the polysilicon or “gate” electrode, a potential “well” can be formed “which has the capability of collecting the localized electrons that were created by the incident light.” Ex.1023, 6.

30. The electrical charge (number of electrons) collected in each PD is proportional to (1) the amount of light incident on that PD and (2) the time over which the resulting photoelectrons are collected, which is also called the “integration time.” The resulting amount of charge in the photodiode is a measure of brightness for the particular part of the scene from which the incident light has arrived. I have created a simplified diagram below showing the collected charge in two adjacent pixels, one with a lower light level and one with a brighter light level. The collection and spatial confinement of charge (photoelectrons shown as blue dots) is a critical step in the image capture process.

31. Third, after “charge has been integrated and held locally by the bounds of the pixel architecture,” charge transfer techniques are applied to “move the packets of charge within the silicon substrate.” Ex.1023, 5-6. Readout techniques are then performed, “where the electrons (which represent a charge) are converted to a voltage.” Ex.1023, 10.

32. Specifically, some transistors associated with the pixel transfer the charge out of the PD and convert the collected charge to an electrical signal, typically a voltage. These electrical signals are used to produce a digital image. CCD and CIS devices are shown in the figures that I created below, CCD on the left, CIS on the right. There is noise associated with reading out the pixel and converting its charge to a voltage. This noise is determined by measuring the temporal and spatial variation of the image output signals when the image sensor is not exposed to light, i.e., in the dark. This “read noise” will limit the signal-to-noise ratio (SNR) in the dark regions of a scene or in a low-light imaging situation. Low-light image quality is dependent on the SNR of the readout. For example, if the read noise is 4 electrons, in order to get an SNR of 10, the pixel will need to collect 40 electrons. In very low light this will require a longer integration time to get a desired SNR.

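Restating the example above in a simple form (my own notation, valid in the read-noise-limited regime and ignoring photon shot noise and dark current):

```latex
% Read-noise-limited signal-to-noise ratio (illustrative):
%   N_sig        electrons collected from the scene
%   \sigma_read  read noise in electrons
\mathrm{SNR} \;=\; \frac{N_{\mathrm{sig}}}{\sigma_{\mathrm{read}}}
% With \sigma_read = 4 e^- (the example in paragraph 32), reaching SNR = 10
% requires N_sig = 10 \times 4 e^- = 40 collected electrons.
```
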
33. The amount of charge (number of electrons) that the PD can hold is limited. This maximum level, called the Full Well Capacity (FWC), is determined by the photodetector’s size and electrical capacitance. If a region in a scene is very bright, the resulting number of photoelectrons can exceed the FWC (become saturated), and the image information in that part of the scene is lost (clipped to the FWC level).

34. As a result of this FWC limitation, one cannot simply increase the integration time of the sensor in order to collect more charge in the dark regions of the scene in an attempt to increase the SNR of the dark regions, because this would cause the photodetectors capturing the bright regions of the scene to saturate. The dynamic range of the image sensor, discussed below, is related to this trade-off between the dark regions and the bright regions.

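Stated compactly, and in my own illustrative notation, the clipping behavior described above is:

```latex
% Saturation at the full well capacity (illustrative notation):
%   N_generated  photoelectrons generated during the integration time
%   N_FWC        full well capacity of the photodetector
N_{\mathrm{collected}} \;=\; \min\!\left(N_{\mathrm{generated}},\; N_{\mathrm{FWC}}\right)
% Any photo-charge generated beyond N_FWC is lost, so detail in very bright
% regions clips to the FWC level.
```
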
B. Dynamic Range

1. True Dynamic Range vs. Intra-Scene Dynamic Range

35. Two well-known types of dynamic range, true dynamic range and intra-scene dynamic range, are described in the ’158 patent.

36. Specifically, the ’158 patent states, “[d]ynamic range is measured as the ratio of the maximum photo-charge that can be meaningfully integrated in a pixel of the imager to the pixel noise level.” Ex.1001, 1:40-42. Each pixel has a “maximum photo-charge,” because the “finite charge storage capacitance within each pixel limits the amount of integrated photo-charge.” Ex.1001, 1:38-39. This type of dynamic range is referred to as “true dynamic range” or “DR” in the art. Ex.1009, 9 (“True dynamic range is a measure of the sensor’s maximum number of signal electrons compared to its total dark temporal RMS noise level.”).

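The true dynamic range described above can be expressed as a ratio. The notation below is my own, the decibel form reflects a common convention in the art, and the example numbers are hypothetical rather than taken from the exhibits.

```latex
% True dynamic range as the ratio of full well capacity to dark temporal
% read noise (illustrative; the decibel form is a common convention):
\mathrm{DR} \;=\; \frac{N_{\mathrm{FWC}}}{\sigma_{\mathrm{read}}},\qquad
\mathrm{DR}_{\mathrm{dB}} \;=\; 20\log_{10}\!\left(\frac{N_{\mathrm{FWC}}}{\sigma_{\mathrm{read}}}\right)
% Hypothetical example: N_FWC = 40,000 e^- and \sigma_read = 4 e^- give
% DR = 10,000:1, i.e. approximately 80 dB.
```
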
37. Another type of dynamic range is referred to in the art as “intra-scene dynamic range” or “IDR,” which refers to the range of illumination levels within a single scene. Ex.1009, 9 (“Intra-scene dynamic range (IDR) refers to the range of illumination levels within a single scene”). See also Ex.1001, 1:43-45 (“[i]ntrascene dynamic range refers to the range of incident light that can be accommodated by an image sensor in a single frame of pixel data,” providing examples with a high intra-scene dynamic range including “an indoor room with a window view of the outdoors, an outdoor scene with mixed shadows and bright sunshine, and evening or night scenes combining artificial lighting and shadows.”).

2. Expanding Intra-Scene Dynamic Range Using Multiple Integration Times

38. Techniques for capturing pictures with different integration times and combining the pictures to achieve expanded intra-scene dynamic range were well-known. See, e.g., Ex.1009, 9; Ex.1024, 3 (Abstract discussing the “fundamental technique” of “image mosaicing, i.e., the automatic alignment of multiple images into larger aggregates”); Ex.1025, 7, FIG. 8 (illustrating combining images of different exposures including “underexposed,” “properly exposed,” and “overexposed”); Ex.1026, 1 (providing high dynamic range by fusing multiple photographs with different amounts of exposure into a single high dynamic range radiance map); Ex.1014, FIGS. 1a-1c, 3, 5:14-20 (describing producing a combined digital image having improved characteristics using two electronic images, where “scene exposure conditions” are selected “in a way that provides for improving various characteristics,” and where “all the greater than two lens/sensor systems are used to simultaneously capture light information from the same scene”); see also Ex.1001, 1:57-58 (citing U.S. Pat. Nos. 4,647,975, 5,168,532, and 5,671,013, the ’158 patent acknowledging that it was known that “[intra-scene] dynamic range of an image sensor can be increased by using multiple exposure times and/or integration times.”).1

1 The ’158 patent also uses the terms “dynamic exposure range” and “effective single-frame dynamic exposure range.” See, e.g., Ex.1001, 1:26-87 (“expanding the dynamic exposure range”), 2:40-42 (“the effective single-frame dynamic exposure range is expanded”). A POSITA would have understood that these terms refer to the intra-scene dynamic range, directed at an expanded dynamic range of a scene. Ex.1009, 9.

39. Specifically, the Kodak Image Sensors White Paper explains in detail the technique for expanding intra-scene dynamic range. Ex.1009, 9. As an example, a sensor can take two pictures with “a long integration time” and “a short integration time,” respectively. Ex.1009, 9. “The picture with the long integration time will provide good details in the dark regions of the scene but the bright regions may not show good details since too many electrons may have overpowered, or saturated the pixel.” Ex.1009, 9. On the other hand, “the picture with the short integration time will provide good details in the bright areas (because the pixels did not saturate), but now, there are not enough photons captured to provide good details in the darker regions of the scene.” Id.

40. As such, an image with a high intra-scene dynamic range can be achieved by combining two images captured with different integration times, where “signal processing electronics can be used to blend these two images to provide both the dark and bright regions of the scene.” Id. In other words, a high IDR is achieved to capture details of both the bright and dark areas, without the requirement of a high true dynamic range, since the details of the darker parts of the scene are captured by the image that has the longer integration time, and the details of the brighter parts of the scene are captured by the other image that has the shorter integration time.

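To make the blending step concrete, the following is a minimal sketch, in Python, of one simple way such “signal processing electronics” could combine a long-integration and a short-integration capture of the same scene. It is illustrative only; it is not taken from the ’158 patent or the cited exhibits, and the function name, the saturation threshold, and the normalization are my own assumptions.

```python
# Illustrative sketch only: blending a long- and a short-integration capture
# to extend intra-scene dynamic range, along the lines described in Ex.1009
# at 9. Names and thresholds are hypothetical.
import numpy as np

def blend_exposures(long_img, short_img, t_long, t_short, sat_level=0.95):
    """Combine two linear (unprocessed) captures into one extended-range image.

    long_img, short_img: arrays of pixel values normalized to [0, 1].
    t_long, t_short: the two integration times.
    sat_level: fraction of full well above which a pixel in the long-
               integration capture is treated as saturated (assumed value).
    """
    ratio = t_long / t_short              # exposure ratio between the captures
    saturated = long_img >= sat_level     # pixels clipped in the long capture
    blended = long_img.astype(float)      # start from the long capture
    # Where the long capture clipped, substitute the short capture rescaled by
    # the exposure ratio so both captures share a common brightness scale.
    blended[saturated] = short_img[saturated] * ratio
    return blended
```

The rescaling by the exposure ratio is the key step in this sketch: it places the substituted short-integration values on the same brightness scale as the long-integration capture before the two are merged.
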
C. Exposure

41. Exposure “is important for any photographic situation in terms of the resulting quality of the photographic image.” Ex.1040, 310 (describing that generally, “the photographic result of exposure (H) is the product of image illuminance (E) and exposure duration (t), so that H = Et.”).

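As a simple worked instance of the H = Et relationship quoted above (the numerical values are hypothetical and chosen only for illustration):

```latex
% H = E t (Ex.1040, 310), with hypothetical values:
H = E\,t,\qquad E = 100\ \mathrm{lux},\quad t = \tfrac{1}{50}\ \mathrm{s}
\;\Rightarrow\; H = 2\ \mathrm{lux\cdot s}
% Halving the duration at the same illuminance halves the exposure:
% t = 1/100 s gives H = 1 lux\cdot s.
```
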
42. “The four principal variables determining camera exposure are subject luminance, fil