Case 5:19-cv-00036-RWS Document 161-4 Filed 12/09/19 Page 1 of 129 PageID #: 6476
`
`
`
`
`
`
`
EXHIBIT C
`
`
`
`
`
`
`
`IN THE UNITED STATES DISTRICT COURT
`FOR THE EASTERN DISTRICT OF TEXAS
`TEXARKANA DIVISION
`
`MAXELL, LTD.,
`
`Plaintiff,
`
`
`
`
`
`
`
`vs.
`
`APPLE INC.,
`
` Civil Action No. 5:19-cv-00036-RWS
`
`
`
`
`
`Defendant.
`
`DECLARATION OF DR. ALAN C. BOVIK IN SUPPORT OF
`APPLE INC.’S PROPOSED CLAIM CONSTRUCTIONS
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`I, Alan C. Bovik, declare and state as follows:
`
`I.
`
`INTRODUCTION
`1.
My name is Dr. Alan C. Bovik. I am the Ernest J. Cockrell Endowed Chair in
`
`Engineering at The University of Texas at Austin, Professor in the Department of Electrical and
`
`Computer Engineering and The Institute for Neurosciences, and Director of the Laboratory for
`
`Image and Video Engineering (LIVE). I am over the age of eighteen, and I am a citizen of the
`
`United States.
`
`2.
`
`I have been retained by defendant Apple Inc. (“Apple” or “Defendant”) in
`
`connection with civil action Maxell, Ltd. v. Apple Inc., Case No. 5:19-cv-00036-RWS (E.D.
`
`Texas), to provide my opinions regarding technical background, level of ordinary skill in the art,
`
`and other subject-matter relevant to interpretation of certain disputed claim terms in the asserted
`
`claims of U.S. Patent No. 8,339,493 (the “’493 patent”).
`
`3.
`
`I have been asked to provide my opinions on the following topics: (1) the
`
`technology relevant to the ’493 patent; (2) the state of the art at the time the relevant patent
`
`application was filed; (3) the level of ordinary skill in that field as of the filing date of the
`
`application that issued as the ’493 patent; (4) how those of ordinary skill in the art at the time of
`
`the invention would have understood statements made by the patentee during prosecution of the
`
`’493 patent; and (5) how those of ordinary skill in the art at the time of the invention would
`
`understand certain terms used in the claims of the ’493 patent.
`
`4.
`
`My opinions expressed in this declaration rely on my own personal knowledge
`
`and experience. However, where I also considered specific documents or other information in
`
`formulating the opinions expressed in this declaration, such items are referred to in this
`
`declaration. This includes, but is not limited to, the ’493 patent, its prosecution history
`
`(including, if applicable, inter partes review proceedings before the Patent Trial and Appeal
`
`
`
`Board), prior art references cited during prosecution, and certain dictionaries and other extrinsic
`
`evidence cited by Apple and/or Maxell as part of their claim construction disclosures.
`
`II.
`
`QUALIFICATIONS
`5.
`I received my B.S. degree in Computer Engineering in 1980 and the M.S. and
`
`Ph.D. degrees in Electrical and Computer Engineering in 1982 and 1984, all from the University
`
`of Illinois, Urbana-Champaign.
`
`6.
`
`I am a tenured full Professor and I hold the Cockrell Family Regents Endowed
`
`Chair at The University of Texas at Austin. My appointments are in the Department of Electrical
`
`and Computer Engineering, the Department of Computer Sciences, and the Department of
`
`Biomedical Engineering. I am also the Director of the Laboratory for Image and Video
`
`Engineering (“LIVE”).
`
`7.
`
`My research is in the general area of digital television, digital cameras, image and
`
`video processing, computational neuroscience, and modeling of biological visual perception. I
`
`have published over 800 technical articles in these areas and hold seven U.S. patents. I am also
`
`the author of The Handbook of Image and Video Processing, Second Edition (Elsevier Academic
`
`Press, 2005); Modern Image Quality Assessment (Morgan & Claypool, 2006); The Essential
`
Guide to Image Processing (Elsevier Academic Press, 2009); The Essential Guide to Video Processing (Elsevier Academic Press, 2009); and numerous other publications.
`
`8.
`
`I received the 2017 Edwin H. Land Medal from the Optical Society of America in
`
September 2017 with the citation: “For substantially shaping the direction and advancement of modern perceptual picture quality computation, and for energetically engaging industry to transform his ideas into global practice.” I also received a Primetime Emmy Award for Outstanding Achievement in Engineering Development from the Academy of Television Arts and Sciences, in October 2015, for the widespread use of my video quality prediction and monitoring
`
`
`
`models and algorithms that are widely used throughout the global broadcast, cable, satellite and
`
`internet Television industries.
`
`9.
`
`I will also receive the Progress Medal from the Royal Photographic Society (RPS)
`
in November 2019, “in recognition of any invention, research, publication or other contribution
`
`which has resulted in an important advance in the scientific or technological development of
`
`photography or imaging in the widest sense.” The Progress Medal is the oldest and most
`
`prestigious honor in the field of photography, having been given annually since 1878. I was also
`
`named Honorable Fellow of RPS (HonFRPS). Previous winners include George Eastman
`
`(Founder of Kodak), Edwin Land (inventor of the instant camera and Founder of Polaroid),
`
`George Smith (inventor of the CCD image sensor) and Steve Sasson (inventor of the first digital
`
`camera).
`
`10.
`
`Among other awards and honors, I have received the 2013 IEEE Signal
`
`Processing Society’s “Society Award,” which is the highest honor accorded by that technical
`
`society (“for fundamental contributions to digital image processing theory, technology,
`
`leadership and education”). In 2005, I received the Technical Achievement Award of the IEEE
`
`Signal Processing Society, which is the highest technical honor given by the Society, for “broad
`
`and lasting contributions to the field of digital image processing”; and in 2008 I received the
`
`Education Award of the IEEE Signal Processing Society, which is the highest education honor
`
`given by the Society, for “broad and lasting contributions to image processing, including popular
`
`and important image processing books, innovative on-line courseware, and for the creation of the
`
`leading research and educational journal and conference in the image processing field.”
`
`11. My technical articles have been widely recognized as well, including the 2009
`
`IEEE Signal Processing Society Best Journal Paper Award for the paper “Image quality
`
`assessment: From error visibility to structural similarity,” published in IEEE Transactions on
`
`
`
`Image Processing, volume 13, number 4, April 2004; this same paper received the 2017 IEEE
`
`Signal Processing Society Sustained Impact Paper Award as the most impactful paper published
`
`over a period of at least ten years; the 2013 Best Magazine Paper Award for the paper “Mean
`
squared error: Love it or leave it? A new look at signal fidelity measures,” published in IEEE Signal Processing Magazine, volume 26, number 1, January 2009; and the IEEE Circuits and Systems Society Best Journal Paper Prize for the paper “Video quality assessment by reduced
`
`reference spatio-temporal entropic differencing,” published in the IEEE Transactions on Circuits
`
`and Systems for Video Technology, vol. 23, no. 4, pp. 684-694, April 2013.
`
`12.
`
`I received the Google Scholar Classic Paper Award twice in 2017, for the paper
`
`“Image information and visual quality,” published in the IEEE Transactions on Image
`
`Processing, vol. 15, no. 2, pp. 430-444, February 2006 (the main algorithm developed in the
`
`paper, called the Visual Information Fidelity (VIF) Index, is a core picture quality prediction
`
`engine used to quality-assess all encodes streamed globally by Netflix), and for “An evaluation
`
`of recent full reference image quality assessment algorithms,” published in the IEEE
`
Transactions on Image Processing, vol. 15, no. 11, pp. 3440-3451, November 2006 (the picture
`
`quality database and human study described in the paper, the LIVE Image Quality Database, has
`
`been the standard development tool for picture quality research since its first introduction in
`
`2003). Google Scholar Classic Papers are very highly-cited papers that have stood the test of
`
`time, and are among the ten most-cited articles in their area of research over the ten years since
`
`their publication.
`
`13.
`
I have also been honored by other technical organizations, including the Society of Photo-Optical Instrumentation Engineers (SPIE), from which I received the Technology
`
`Achievement Award (2013) “For Broad and Lasting Contributions to the Field of Perception-
`
`Based Image Processing,” and the Society for Imaging Science and Technology, which accorded
`
`
`
`me Honorary Membership, which is the highest recognition by that Society given to a single
`
`individual, “for his impact in shaping the direction and advancement of the field of perceptual
`
`image processing.” I was also elected as a Fellow of the Institute of Electrical and Electronics
`
`Engineers (IEEE) “for contributions to nonlinear image processing” in 1995, a Fellow of the
`
`Optical Society of America (OSA) for “fundamental research contributions to and technical
`
`leadership in digital image and video processing” in 2006, and as a Fellow of SPIE for
`
`“pioneering technical, leadership, and educational contributions to the field of image processing”
`
`in 2007.
`
`14.
`
`Among other relevant research, I have worked with the National Aeronautics and
`
`Space Administration (“NASA”) to develop high compression image sequence coding and
`
animated vision technology, and on various military projects for the Air Force Office of Scientific
`
`Research, Phillips Air Force Base, the Army Research Office, and the Department of Defense.
`
`These projects have focused on developing local spatio-temporal analysis in vision systems,
`
`scalable processing of multi-sensor and multi-spectral imagery, image processing and data
`
`compression tools for satellite imaging, AM-FM analysis of images and video, the scientific
`
`foundations of image representation and analysis, computer vision systems for automatic target
`
`recognition and automatic recognition of human activities, vehicle structure recovery from a
`
moving air platform, passive optical modeling, and detection of spiculated masses and
`
`architectural distortions in digitized mammograms. My research has also recently been funded
`
`by Netflix, Qualcomm, Texas Instruments, Intel, Cisco, and the National Institute of Standards
`
`and Technology (NIST) for research on image and video quality assessment. I have also received
`
`numerous grants from the National Science Foundation for research on image and video
`
`processing and on computational vision.
`
`
`
`15.
`
`Additional details about my employment history, fields of expertise, and
`
`publications are further described in my curriculum vitae, which is attached as Exhibit A to this
`
`declaration. The list of litigation matters in which I have been engaged can be found in my CV.
`
`16.
`
`I am being compensated at my usual rate of $500 per hour, plus reimbursement
`
`for expenses, for my analysis. My compensation does not depend on the content of my opinions
`
`or the outcome of this proceeding.
`
`III. LEGAL STANDARDS
`17.
`I have been informed that the words of a claim are generally given the ordinary
`
`and customary meaning that the term would have to a person of ordinary skill in the art at the
`
`time of the invention. I understand that there are exceptions to this general rule if: (1) the
`
`patentee, in the specification or prosecution history, defined a claim term to have a meaning
`
`different from its ordinary meaning, or (2) the patentee disclaimed or disavowed patent scope in
`
`the specification or prosecution history.
`
`18.
`
`I have been informed that to determine how a person of ordinary skill would
`
`understand a claim term, courts may consider both “intrinsic” and “extrinsic” evidence. I
`
`understand that courts look first to the intrinsic evidence of record, which includes the patent
`
`itself (including the claims, specification, and drawings) and its prosecution history. I also
`
`understand that courts may consider extrinsic evidence, such as expert and inventor testimony,
`
`dictionaries, and learned treatises.
`
`19.
`
`I have been informed that a person of ordinary skill in the art is deemed to read
`
`the claim term not only in the context of the particular claim in which it appears but also in the
`
`context of the entire patent, including the specification, drawings, and prosecution history. I
`
`have been informed that where the specification clearly indicates that the described embodiment
`
`is the only embodiment the claims cover, the claims are properly limited to that embodiment
`
`
`
`even if the claim language, removed from the context of the specification, could be read more
`
`broadly. I have also been informed that, to properly construe claims in view of the specification,
`
one must also consider whether the specification read as a whole suggests that the very character
`
`of the invention requires the limitation be a part of every embodiment.
`
`20.
`
`I have been informed that a term must be interpreted with a full understanding of
`
`what the inventors actually invented and intended to include within the scope of the claim as set
`
`forth in the patent itself. Thus, claim terms should not be broadly construed to encompass
`
`subject matter that is technically within the broadest reading of the term but is not supported
`
`when the claims are viewed in light of the invention described in the specification. I have also
`
`been informed that when a patent specification repeatedly and consistently characterizes the
`
`claimed invention in a particular way, it is proper to construe the relevant claim terms in
`
`accordance with that characterization.
`
`21.
`
`I have been informed that a patent claim may be declared invalid for a lack of
`
`written description. I have been informed that the subject matter of the claim need not be
`
`described using language identical to that in the claim in order for the disclosure to satisfy the
`
`description requirement. I have also been informed that, where possible, a claim term should
`
`generally be construed to preserve the claim’s validity.
`
`IV.
`
`LEVEL OF ORDINARY SKILL IN THE ART
`22.
`Based on my review of the ’493 patent and its prosecution history, and based on
`
`my years of experience in digital image processing, my opinion is that a person of ordinary skill
`
`in the art around the filing of the ’493 patent would have had a Bachelor’s degree in Electrical
`
`Engineering, Computer Engineering, Computer Science, or an equivalent degree with at least 2
`
`years of experience in digital image processing technologies. Additional education may
`
substitute for less work experience, and vice versa.
`
`
`
`23.
`
`I am informed that the earliest possible priority date of the ’493 Patent is January
`
`2000. At around January 2000, I would have qualified as a person of ordinary skill in the art
`
under the definition I provided above. By January 2000, I had held my B.S. degree for 20 years and my M.S. and Ph.D. degrees for more than 15 years, and I had over 15 years of experience in digital image processing. The opinions expressed in
`
`this declaration would not change if the level of experience varied by a few years.
`
`V.
`
`THE ’493 PATENT
`24.
The ’493 Patent relates to electric cameras and, more specifically, to an electric camera with an image sensing device having a sufficient number of pixels to be capable of taking highly detailed still images, as well as moving video of reduced image quality, without increasing circuitry. ’493 Patent 3:8–13. The Abstract of the ’493 Patent recites:
`
`An electric camera includes an image sensing device with a light
`receiving surface having N vertically arranged pixels and an
`arbitrary number of pixels arranged horizontally, N being equal to
`or more than three times the number of effective scanning lines M
`of a display screen of a television system, a driver to drive the image
`sensing device to vertically mix or cull signal charges accumulated
`in individual pixels of K pixels to produce, during a vertical
`effective scanning period of the television system, a number of lines
`of output signals which correspond to 1/K the number of vertically
`arranged pixels N of the image sensing device, K being an integer
`equal to or less than an integral part of a quotient of N divided by
`M, and a signal processing unit having a function of generating
`image signals by using the output signals of the image sensing
`device.
`
`’493 Patent Abstract.
`
`25.
`
`In particular, the ’493 Patent describes an electric camera including an image
`
`sensing device (e.g., a CCD) having a number (e.g., 1200) of vertically arranged pixels and a
`
`number (e.g., 1600) of horizontally arranged pixels. ’493 Patent at 4:34–48. When recording a
`
`static image, all of the effective pixels on the image sensing device are used to produce signals
`
`with as high a resolution as possible. ’493 Patent 7:31–39. When recording a moving video, a
`
`number (e.g., 1200/240 = 5) of the vertically arranged pixels are mixed or culled, so that the
`
`
`
number of lines of output signals from the image sensing device matches or conforms to the number
`
`of effective scanning lines (e.g., 240) of a television monitor. See Maxell Ltd. v. Huawei Device
`
`USA Inc. et al., Case No. 5:16-cv-00178-RWS, Dkt. No. 175, Claim Construction Memorandum
`
`and Order (January 31, 2018) (the “Huawei Claim Construction Order”); ’493 Patent 4:64–5:6,
`
`7:40–59, 9:30–36. Finally, when monitoring in a static image mode, the vertically arranged
`
`pixels are also mixed or culled, so that the number of output signals from the image sensing
`
device matches or conforms to the number of effective scanning lines of a display screen used to
`
`monitor the static image.
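The mixing/culling arithmetic described in the preceding paragraphs can be illustrated with a short sketch. This is not code from the patent; it simply restates, under the specification's example values (N = 1200 vertical pixels, M = 240 effective scanning lines for NTSC), the claimed relationship that K is an integer no greater than the integral part of N divided by M, and that mixing or culling K pixels yields N/K lines of output signals.

```python
# Illustrative sketch of the '493 Patent's claimed relationship between sensor
# lines (N), effective scanning lines (M), and the mixing/culling factor (K).
# Values are the specification's examples; the function names are hypothetical.

def max_mixing_factor(n_pixels: int, m_lines: int) -> int:
    """K must be an integer equal to or less than the integral part of N / M."""
    return n_pixels // m_lines

def output_lines(n_pixels: int, k: int) -> int:
    """Vertically mixing or culling K pixels produces N / K lines of output."""
    return n_pixels // k

N, M = 1200, 240                  # example sensor height; NTSC effective lines
K = max_mixing_factor(N, M)       # 1200 // 240 = 5
print(K, output_lines(N, K))      # 5 pixels mixed -> 240 output lines, matching M
```

With these example values, K = 5 and the 1200 sensor lines reduce to 240 output lines, matching the effective scanning line count of the television monitor as the specification describes.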
`
`26.
`
`The ’493 patent states that “[i]n a video camera to photograph moving images, it
`
`is generally assumed that the video is viewed on a display such as [a] television monitor” and
`
`thus a prior art video camera “is designed to produce output signals conforming to a television
`
`system such as NTSC.” Id., 1:30-34, 1:37-43 (the NTSC system “has an effective scanning line
`
`number of about 240 lines,” which is “the number of scanning lines actually displayed on the
`
`monitor”).
`
`VI.
`
`THE DISPUTED CLAIM TERMS
`A.
` “effective scanning lines … of a display screen” (Claim 1)
`
`27.
`
`I am informed that the parties propose the following constructions:
`
Claim Term: “effective scanning lines … of a display screen”

Apple’s Proposal: “the lines displayed in a single field of an interlaced scanning display”

Maxell’s Proposal: “lines on a display screen corresponding to an actually displayed image”
`
`
`
`28.
`
`I agree with Apple’s proposed construction, because it accurately reflects the
`
`common usage of the term as understood by a person of ordinary skill in the art at the time of the
`
`’493 Patent’s earliest possible priority date, and is supported by the ’493 Patent’s specification.
`
`
`
`29.
`
`At the time of the ’493 Patent, a person of ordinary skill in the art was familiar
`
with the concept of “scanning” in the context of a display screen. See, e.g., U.S. Patent No. 4,054,915 at 4:49-56 (discussing “scanning line of an odd numbered field” and “scanning line of an even-numbered field”); U.S. Patent Nos. 4,620,134; 6,002,203. “Scanning” in the context of a display screen in early 2000 referred to a type of display also called a “raster” display. The most common example at the time was a Cathode Ray Tube (CRT) television or monitor.
`
`See e.g., U.S. Patent No. 4,541,010. In a CRT monitor, an electron gun emits electrons toward a
`
`screen that is covered with tiny red, green, and blue phosphor dots that glow when struck by an
`
electron beam. U.S. Patent Nos. 4,620,134; 6,002,203. The electron gun emits an electron beam toward each of the dots, one dot at a time, to make them glow sequentially. This process is called “scanning.” For each dot on the screen (called a “pixel”), the intensity of the electron beam emitted by the electron gun changes, making the colored dots glow to differing degrees and resulting in different colors. The electron gun scans through a row of pixels from left to right, then displays the next row from left to right, repeating this process until the bottom-right pixel
`
`is displayed. This process of “scanning” is illustrated below:
`
[Illustration: screen display system using scan lines]
`
`
`
`30.
`
Each row of a scanning display is commonly referred to as a scan line, or a
`
`scanning line. The same term was used not only in the context of CRT screens, but also in the
`
`
`
`context of other video formats and display standards, such as for cable TV transmissions and
`
`later, in HDTV (High-definition television) broadcast formats. Interlaced displays remain in use
`
`in broadcast and live streaming television even today.
`
`31.
`
`Once the scanning process reaches the lower right corner of the screen, the
`
`display mechanism (for example, the electron gun that emits the electron beam to the screen of a
`
`CRT (cathode ray tube) display) must move back to the top left corner to start the scanning
`
`process for the next field of display all over again. There is a short time gap between the display
`
`of the bottom-rightmost pixel and the display of the top-leftmost pixel of the next field, during
`
`which the scanning mechanism may move from the bottom right corner of the screen back to the
`
top left corner of the screen. This period of time is referred to as a “vertical blanking period” or vertical blanking interval (“VBI”). The vertical blanking period is unique to a raster scanning display, and it is mentioned
`
`over 20 times in the specification of the ’493 Patent. See, e.g., ’493 Patent at 1:42, 1:56, 5:19-62,
`
`7:52-56, 9:2, 9:31, 10:51-61, 10:64-11:5, 13:28, 13:43-55. The specification also refers to the
`
`“horizontal blanking period” (e.g., ’493 Patent at 5:58-62, 7:47, 8:60, 11:2, 13:57). The
`
`“horizontal blanking period” refers to the time gap between the completion of scanning of one
`
`line to the beginning of scanning for the next line. The concept of “horizontal blanking period”
`
`is also used in the context of a raster scanning display.
`
`32.
`
`A person of ordinary skill in the art at the time of the ’493 Patent’s priority date
`
`would have been familiar with scanning displays, and the various terms used in connection with
`
`scanning display technology, including the term “scanning line.” See, e.g., ’493 Patent at 1:37-
`
`43 (“The NTSC system, for example, performs interlaced scanning on two fields, each of which
`
`has an effective scanning line number of about 240 lines (the number of scanning lines actually
`
`displayed on the monitor which is equal to the number of scanning lines in the vertical blanking
`
`period subtracted from the total number of scanning lines in each field).”); see also, id. at 1:30-
`
`
`
`34. In particular, a person of ordinary skill would have understood that the term “scanning line”
`
`refers to the individual scan lines on a raster scanning display, such as the NTSC television
`
`system discussed in the specification of the ’493 Patent.
`
`33.
`
`At the time of the invention of the ’493 Patent, television systems followed
`
`display standards that used scanning displays. At the time of the invention, NTSC, PAL, and
`
`SECAM were three common television systems used around the world. Each of these television
`
`systems used interlaced display. The ’493 Patent explains:
`
`In a video camera to photograph moving images, it is generally
`assumed that the video is viewed on a display such as television
`monitor and thus the camera is designed to produce output signals
`conforming to a television system such as NTSC and PAL.
`
`’493 Patent at 1:30-34.
`
`34.
`
`A person of ordinary skill in the field of ’493 Patent at the time of the alleged
`
`invention would have been familiar with the NTSC or PAL display standard, and would have
`
`known that both systems use interlaced scanning display, i.e., a scanning display in which there
`
`are two fields that are interlaced to create one frame of image. See, e.g., ’493 Patent at 1:30-34
`
`(“In a video camera to photograph moving images, it is generally assumed that the video is
`
`viewed on a display such as television monitor and thus the camera is designed to produce output
`
`signals conforming to a television system such as NTSC and PAL.”); 1:37-43. In an interlaced
`
`display (shown in the illustration below right), one frame of image is divided into two fields, one
`
`containing even-numbered lines, and one containing odd-numbered lines. See e.g., ’493 Patent
`
at Figs. 4, 6, 8, 8:8-13 (“FIG. 6 shows combinations of pixels to be cyclically mixed on the A field and the B field . . . . In the interlaced scanning, scanning lines of the A field and the B field are located at the centers of adjoining scanning lines on the other field.”). This is in contrast to what
`
`is called “progressive scan” display (shown in the illustration below left), which displays the full
`
`frame one line at a time from top to bottom. Although both “interlaced scan” and “progressive
`
`
`
`scan” were known at the time of the ’493 Patent, common television systems of the time, such as
`
`NTSC and PAL, used interlaced scan method.
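The field structure described above can be sketched in a few lines: one frame of an image is divided into a field of odd-numbered lines and a field of even-numbered lines. This is an illustrative sketch only; the 1-based line numbering and the 480-line NTSC frame are assumptions drawn from the declaration's own figures.

```python
# Sketch of interlaced scanning: one frame is split into two fields, one
# carrying the odd-numbered lines and one the even-numbered lines.

def interlace(frame_lines: list) -> tuple:
    """Split a frame (list of line numbers) into its two interlaced fields."""
    odd_field = frame_lines[0::2]   # lines 1, 3, 5, ... of the frame
    even_field = frame_lines[1::2]  # lines 2, 4, 6, ... of the frame
    return odd_field, even_field

frame = list(range(1, 481))         # 480 visible lines per NTSC frame
odd, even = interlace(frame)
print(len(odd), len(even))          # each field carries 240 lines
```

Each field thus contains 240 of the frame's 480 lines, consistent with the NTSC effective scanning line count of about 240 lines per field quoted from the specification.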
`
`35.
`
`The concept of the “effective scanning lines” is explained in the specification.
`
`
`
`For example, the ’493 Patent states:
`
`The NTSC system, for example, performs interlaced scanning on
`two fields, each of which has an effective scanning line number of
`about 240 lines (the number of scanning lines actually displayed
`on the monitor which is equal to the number of scanning lines in
`the vertical blanking period subtracted from the total number of
`scanning lines in each field).
`
`’493 Patent at 1:37-43 (emphasis added). The ’493 Patent thus equates the “effective scanning
`
`line” to the “number of scanning lines actually displayed on the monitor which is equal to the
`
`number of scanning lines in the vertical blanking period subtracted from the total number of
`
`scanning lines in each field.” Id. By incorporating the concepts of “vertical blanking period”
`
`and “field” in its definition of “effective scanning lines,” the specification makes clear that
`
`“effective scanning lines” is a term used in the context of an interlaced scanning display. The
`
`terms “vertical blanking period” and “field” would be meaningless in the context of an “always
`
`on” type of non-scanning display, or in the context of a non-interlaced display.
`
`36.
`
`The specification’s explanation of “effective scanning lines” differentiates this
`
`term from the “total” number of scanning lines, which includes the number of scanning lines that
`
`
`
`are displayed in a field (i.e., the effective scanning lines) plus additional scanning lines that are
`
`not displayed, but received during the “vertical blanking period.” The ’493 Patent explains:
`
`Some image sensing devices to take moving images according to the
`NTSC system have an area of pixels for image stabilization added
`to the area of effective pixel area, thus bringing the effective number
`of vertically arranged pixels to about 480 or more. In this case, an
`area beyond 480th pixels is read out at high speed during the vertical
`blanking period and therefore the signals thus read out are not used
`as effective signals.
`
`’493 Patent at 1:51-58.
`
`37.
`
`In other words, the “effective scanning line” refers to the scanning lines of a
`
`display screen that are displayed within one “field,” as opposed to the lines of data that are
`
`received from the sensor but not displayed because they are received during the vertical blanking
`
period. See, e.g., ’493 Patent at 1:51-67. A person of ordinary skill would have understood that
`
`this concept of “effective scanning line” only makes sense in the context of a scanning display,
`
`because non-scanning displays do not have “scanning lines” nor any vertical blanking period. A
`
`person of ordinary skill would have also understood that this concept of “effective scanning line”
`
`only makes sense in the context of an interlaced display, because non-interlaced displays do not
`
`have “fields.”
`
`38.
`
`In my opinion, a person of ordinary skill would have understood the “effective
`
`scanning lines . . . of a display screen” to refer to the number of effective scanning lines per field
`
`(and not per frame) in the context of the ’493 Patent. This is because, throughout the ’493
`
`Patent, the phrase “effective scanning lines” is used only to refer to the lines per field, and never
`
`to lines per frame. See, e.g., ’493 Patent at 1:37-43, 4:64–5:6, 7:40-48, 9:57–10:2, 10:3-6 (“The
`
`number of vertically arranged pixels for static image photographing needs only to be three or
`
`more times the number of effective scanning lines on each field of the television system.”);
`
`10:22-32 (“effective scanning lines M of each field of the television system”), 12:37-48
`
`
`
`(“effective scanning lines M of each field”), 15:11-21 (“using an image sensing device that has
`
`an arbitrary number of vertically arranged pixels N three or more times the number of effective
`
`scanning lines M of each field”).
`
`39.
`
`A person of ordinary skill in the art would have also recognized that, if the
`
`“effective scanning lines . . . of a display screen” were interpreted to include the total number of
`
`scanning lines per frame, no embodiment in the specification would disclose an image sensing
`
`device with a light receiving sensor having scanning lines “greater than three times [the] number
`
of … scanning line” per frame (which is described in the specification as 480 lines for the NTSC
`
`system). See, e.g., ’493 Patent at 4:64-5:6, 6:49-59, Fig. 1, Fig. 5. Instead, the largest image
`
`sensor described in the ’493 Patent has 1200 vertically arranged pixels, which is less than three
`
`times the number of effective scanning lines per frame in the NTSC television system (480 lines
`
`per frame). See, id. A person of ordinary skill in the art would have also understood that the
`
`PAL system typically has more scanning lines than NTSC, such as 576 lines. See, e.g., ’493
`
Patent at 10:18-21. Thus, a person of ordinary skill in the art would have realized that, for
`
`the ’493 Patent’s specification to support the claim that recites a light receiving sensor that has
`
`“N number of vertically arranged pixel lines, wherein N is equal or greater than three times a
`
`number of effective scanning lines M of a display screen,” the effective scanning lines must refer
`
`to the number of scanning lines displayed in a field of the interlaced scanning display, such as
`
`the NTSC television screen described by the specification.
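The numerical point in the paragraph above can be restated arithmetically: the largest sensor described in the ’493 Patent (1200 vertically arranged pixels) satisfies the claim requirement that N be equal to or greater than three times M only when M is the per-field line count (240 for NTSC), not the per-frame count (480). The values below are those cited in the declaration; the sketch is illustrative.

```python
# Checking "N >= 3 * M" for the largest sensor in the '493 Patent under the
# two competing readings of M (per field vs. per frame), per paragraph 39.

N = 1200                            # vertical pixels of the largest described sensor
M_per_field, M_per_frame = 240, 480 # NTSC effective scanning lines per field / frame

print(N >= 3 * M_per_field)         # True:  1200 >= 720  (per-field reading)
print(N >= 3 * M_per_frame)         # False: 1200 <  1440 (per-frame reading)
```

Only the per-field reading leaves the specification's embodiments within the claim, which is the point the declaration draws from this comparison.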
`
`40.
`
`Therefore, in my opinion, a person of ordinary skill in the field of the ’493 Patent
`
`would have understood that the “effective scanning line . . . of a disp