UNITED STATES PATENT AND TRADEMARK OFFICE

BEFORE THE PATENT TRIAL AND APPEAL BOARD

UNIFIED PATENTS, LLC,
Petitioner,

v.

ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE,
KWANGWOON UNIVERSITY RESEARCH INSTITUTE FOR INDUSTRY COOPERATION,
INDUSTRY-ACADEMIA COOPERATION GROUP OF SEJONG UNIVERSITY,
Patent Owners.

Case: IPR2021-00368

U.S. Patent No. 9,736,484

DECLARATION OF JOSEPH P. HAVLICEK, PH.D. SUBMITTED IN SUPPORT OF PETITION FOR INTER PARTES REVIEW OF U.S. PATENT NO. 9,736,484

Unified Patents, LLC v. Elects. & Telecomm. Res. Inst., et al.
Ex. 1002
TABLE OF CONTENTS

I.   BACKGROUND AND QUALIFICATIONS
     A. Educational Background
     B. Professional Experience
     C. Publications
     D. Compensation
II.  MATERIALS CONSIDERED
III. LEVEL OF ORDINARY SKILL IN THE ART
IV.  TECHNICAL TUTORIAL
     A. Still Images and Image Capture
     B. Color Spaces
     C. Moving Pictures and the Need for Compression
     D. Video Compression: A 10,000-Foot View
        1. Reducing Spatial and Temporal Redundancy
           a. Spatial Prediction / Intra Prediction
           b. Temporal Prediction / Inter Prediction
        2. Overview of a Typical Encoder / Decoder
        3. Discrete Cosine Transform
        4. Quantization and Scanning
        5. Entropy Coding
        6. Data Structures: Pixels, Blocks, Macroblocks, Slices, and Frames
V.   OVERVIEW OF THE ’484 PATENT
VI.  BRIEF SUMMARY OF THE PROSECUTION HISTORY OF THE ’484 PATENT AND RELATED APPLICATIONS
     A. U.S. Patent Application No. 12/377,617 (Ex. 1004)
     B. U.S. Patent Application No. 13/975,251 (Ex. 1005)
     C. U.S. Patent Application No. 14/823,273 (Ex. 1006)
VII. CLAIM 4 OF THE ’484 PATENT
VIII. CLAIM CONSTRUCTION
IX.  LEGAL STANDARDS
     A. Anticipation
     B. Obviousness
X.   THE PRIOR ART
     A. Nishi (Ex. 1014)
     B. Do (Ex. 1009, Ex. 1010)
     C. Kobayashi (Ex. 1023)
     D. Kalevo (Ex. 1011)
XI.  THE PRIOR ART IS ANALOGOUS TO THE ’484 PATENT
XII. CLAIM 4 IS UNPATENTABLE AS ANTICIPATED AND OBVIOUS
     A. Claim 4 Is Anticipated and Obvious Over Nishi
        1. “A non-transitory computer-readable storage medium storing instructions that, when executed by a processor, cause the processor to perform a method of decoding, the method comprising:”
        2. “performing entropy decoding of encoded video information in a bitstream to obtain transform coefficients for a current block;”
        3. “selecting a scanning mode for the transform coefficients”
        4. “wherein selecting a scanning mode comprises: selecting a horizontal scanning mode in response to the intra prediction mode being a vertical intra prediction mode; and selecting a vertical scanning mode in response to the intra prediction mode being a horizontal intra prediction mode.”
        5. “scanning the transform coefficients based on the selected scanning mode”
     B. Claim 4 Would Have Been Obvious Over Do In View of Kobayashi and Over Do In View of Kalevo
        1. “A non-transitory computer-readable storage medium storing instructions that, when executed by a processor, cause the processor to perform a method of decoding, the method comprising:”
        2. “performing entropy decoding of encoded video information in a bitstream to obtain transform coefficients for a current block;”
        3. “selecting a scanning mode for the transform coefficients”
        4. “wherein selecting a scanning mode comprises: selecting a horizontal scanning mode in response to the intra prediction mode being a vertical intra prediction mode; and selecting a vertical scanning mode in response to the intra prediction mode being a horizontal intra prediction mode.”
        5. “scanning the transform coefficients based on the selected scanning mode”
        6. Claim 4 Would Have Been Obvious Over Do in View of Kobayashi and Do in View of Kalevo
XIII. CONCLUSION
XIV. DECLARATION IN LIEU OF OATH
EXHIBITS CONSIDERED

Exhibit No.  Description
1001         U.S. Patent No. 9,736,484 to Jeong, et al.
1004         File history of U.S. Patent Application No. 12/377,617 obtained from PAIR
1005         File history of U.S. Patent Application No. 13/975,251 obtained from PAIR
1006         File history of U.S. Patent Application No. 14/823,273 obtained from PAIR
1007         U.S. Patent Application Publication No. 2006/0002466 to Park (“Park”)
1008         U.S. Patent No. 7,995,654 to Boon, et al. (“Boon”)
1009         Korean Patent KR 0135364 B1 to Do, et al.
1010         Declaration of Corey Colling and English Translation of Korean Patent KR 0135364 to Do, et al. (“Do”)
1011         International Publication No. WO 01/54416 A1 to Kalevo, et al. (“Kalevo”)
1012         Korean Patent KR 10-0180173 B1 to Chung, et al.
1013         Declaration of Corey Colling and English Translation of Korean Patent KR 10-0180173 B1 to Chung, et al. (“Chung”)
1014         U.S. Patent No. 6,426,975 to Nishi, et al. (“Nishi”)
1015         Puri, et al., Improvements in DCT-based video coding, Proc. of SPIE 3024, Visual Communications and Image Processing ’97 (Jan. 10, 1997)
1016         International Publication No. WO 94/15312 to Chu, et al.
1017         Lee, et al., Adaptive Scanning for H.264/AVC Intra Coding, ETRI Journal, Vol. 28, No. 5, pp. 668-671 (Oct. 2006)
1018         Fan, et al., A novel coefficient scanning scheme for directional spatial prediction-based image compression, 2003 International Conference on Multimedia and Expo, ICME ’03 (Aug. 2003)
1019         ITU-T Recommendation H.264, Series H: Audiovisual and Multimedia Systems, Infrastructure of audiovisual services – Coding of moving video; Advanced video coding for generic audiovisual services (03/2005)
1020         U.S. Patent No. 7,010,044 to Dattani, et al.
1021         U.S. Patent No. 6,856,701 to Karczewicz et al.
1022         U.S. Patent No. 5,285,402 to Keith
1023         U.S. Patent Application Publication No. 2005/0281337 to Kobayashi et al. (“Kobayashi”)
1024         U.S. Patent No. 6,425,054 to Nguyen
1025         U.S. Patent No. 6,188,381 to van der Wal
1026         U.S. Patent No. 7,903,735 to Cha et al.
1027         U.S. Patent No. 7,298,782 to Kuriakin et al.
1028         U.S. Patent No. 5,815,206 to Malladi et al.
1029         U.S. Patent Application Publication No. 2007/0009047 to Shim et al.
1030         U.S. Patent No. 3,971,065 to Bayer
1031         U.S. Patent No. 6,809,765 to Tao
1032         EP 0 680 223 A1 to Winbond
1033         M. Armstrong, et al., BBC Research White Paper, WHP 169 (Sept. 2008): High Frame-Rate Television
1034         U.S. Patent Application Publication No. 2005/0265447 A1 to Park
1035         Iain E. G. Richardson, H.264 and MPEG-4 Video Compression: Video Coding for Next-generation Multimedia (John Wiley & Sons Ltd. 2003)
1036         U.S. Patent Application Publication No. 2010/0086025 A1 to Chen et al.
1037         U.S. Patent Application Publication No. 2007/0274385 to He
1038         U.S. Patent Application Publication No. 2003/0081850 to Karczewicz et al.
1039         U.S. Patent No. 8,135,064 to Tasaka et al.
I, Joseph P. Havlicek, Ph.D., hereby declare under penalty of perjury:

I. BACKGROUND AND QUALIFICATIONS

I have been retained on behalf of Unified Patents Inc. to provide my opinions regarding the validity of claim 4 of U.S. Patent No. 9,736,484. I refer to this patent as the ’484 patent in this declaration.

Exhibit 1003 is a true and correct copy of my Curriculum Vitae. This document provides further details about my background and experience.
A. Educational Background

I received a Bachelor of Science degree in electrical engineering with minors in mathematics and computer science from Virginia Tech in 1986. I received a Master of Science degree in electrical engineering, also from Virginia Tech, in 1988. I received the Ph.D. degree in Electrical and Computer Engineering from the University of Texas at Austin in 1996. My Ph.D. research was in the field of image processing.
B. Professional Experience

From December 1984 to May 1987, I was a software engineer at Management Systems Laboratories, Blacksburg, VA. My job responsibilities included developing software for nuclear materials management under contract with the United States Department of Energy.

From June 1987 to January 1997, I was an electrical engineer at the United States Naval Research Laboratory. For the period of June 1987 through August 1989, I was an on-site contractor affiliated with SFA, Inc., Landover, Maryland. From August 1989 through January 1997, I was a federal government employee. I was on leave without pay from August 1987 through July 1988 while completing my Master of Science degree. I was also on leave without pay for much of the period from August 1990 through January 1997 while I completed my Ph.D. degree. My main job responsibilities at the United States Naval Research Laboratory included designing digital and analog circuits to process real-time video signals and designing and implementing target detection, tracking, and identification algorithms for real-time video signals. I was a recipient of the 1990 Department of the Navy Award of Merit for Group Achievement for this work.

From January 1993 through December 1993, I was an on-site contractor at International Business Machines (IBM) Corporation, Austin, TX. My main job responsibilities included designing and implementing image compression and decompression algorithms (CODECs) for IBM products.
Since January 1997, I have been a regular faculty member in the School of Electrical and Computer Engineering at the University of Oklahoma, Norman, OK. I was an Assistant Professor from January 1997 through June 2002. I was promoted to the rank of Associate Professor and granted tenure in July 2002. I was promoted to the rank of Professor in July 2007. I was appointed to the Williams Companies Foundation Presidential Professorship from April 2009 through April 2017. In April 2017 I was appointed to the Gerald Tuma Presidential Professorship. My main job responsibilities at the University of Oklahoma include conducting academic research in electrical and computer engineering, teaching graduate and undergraduate courses in electrical and computer engineering, and performing professional and institutional service.
I am a member of several professional societies and organizations, including the Institute of Electrical and Electronics Engineers (IEEE), the IEEE Signal Processing Society, the IEEE Computer Society, and the IEEE Intelligent Transportation Society. I am a Senior Member of the IEEE. From November 2015 through February 2018 I served as a Senior Area Editor for the IEEE Transactions on Image Processing. I was formerly an Associate Editor for the IEEE Transactions on Image Processing from December 2010 through October 2015. I have served as a Technical Area Chair for the IEEE International Conference on Image Processing in the area of Image & Video Analysis, Synthesis, and Retrieval (2012, 2013) and have served on the organizing committee of that conference (2007). I have also served as a Technical Area Chair for the IEEE International Conference on Acoustics, Speech, and Signal Processing in the area of Image, Video, and Multidimensional Signal Processing (2012-2014).
For over 30 years, I have conducted research and taught classes in the field of image and video processing and analysis. My main scholarly contributions have been in the areas of modulation domain image models and image processing (AM-FM image models), video target tracking, and distributed control of video networks for intelligent transportation systems.
I have served as a supervisor or committee member for numerous Ph.D. dissertations and Master’s theses. I have supervised 11 Ph.D. students to completion and am currently supervising two Ph.D. students. I have been a member of 57 additional Ph.D. dissertation committees. I have supervised 25 Master’s students to completion. I am currently supervising two additional Master’s students. I have been a member of 65 additional Master’s thesis committees. A listing of my Ph.D. and Master’s supervisions and committee memberships is found in my curriculum vitae in Ex. 1003.
I am co-founder and director of the University of Oklahoma Center for Intelligent Transportation Systems (CITS). Under my supervision, the Center has collaborated with the Oklahoma Department of Transportation since 1998 to design and implement the Oklahoma Statewide Intelligent Transportation System, including a geographically distributed video network that is currently deployed on major highways and interstates across the entire State of Oklahoma.
I teach a variety of courses at the University of Oklahoma, including the required junior-level Signals and Systems course ECE 3793 (taught 21 times), the graduate-level Digital Image Processing course ECE 5273 (taught 21 times), and the graduate-level Digital Signal Processing course ECE 5213 (taught 15 times).
Since joining the University of Oklahoma in January 1997, I have been Principal Investigator or Co-Principal Investigator on over 100 externally funded grants and contracts with a total value of over $20.4M. My main research contributions have been in the areas of signal, image, and video processing, video target tracking, and intelligent transportation systems. I have been author or coauthor on over 120 scholarly publications in these areas. I was a recipient of the 1990 Department of the Navy Award of Merit for Group Achievement for my work in video target tracking. My research group at the University of Oklahoma originated the Virtual Traffic Management Center concept featured in a December 2014 FHWA technical report (Guidelines for Virtual Transportation Management Center Development) and a November 2014 FHWA national webinar with the same title. I have received a number of teaching awards, including the University of Oklahoma College of Engineering Outstanding Faculty Advisor Award (2005-2006) and the University of Texas Engineering Foundation Award for Exemplary Engineering Teaching while Pursuing a Graduate Degree (1992).
Since joining the faculty of the University of Oklahoma in 1997, I have taught numerous classes at both the graduate and undergraduate levels. At the graduate level, I have taught the following courses: Digital Signal Processing (ECE 5213), Digital Image Processing (ECE 5273 and CS 5273), Multimedia Communications (ECE 5973), Kalman Filtering (ECE 6973), and Advanced Image Processing (ECE 6283). At the undergraduate level, I have taught the following courses: Digital Signals and Filtering (ECE 2713), Microcomputer System Design (ECE 3223), Signals and Systems (ECE 3793), Digital Signal Processing (ECE 4213), Digital Image Processing (ECE 4973), and Multimedia Communications (ECE 4793).
C. Publications

A complete listing of my publications is found in my curriculum vitae (Ex. 1003). I highlight some of the publications relevant to the subject matter of the ’484 patent below.
I have published numerous peer reviewed book chapters, journal articles, and conference papers, including 9 book chapters, 24 journal articles, and 99 conference papers; the following are representative:
J.P. Havlicek, T.N. Arian, H. Soltani, T. Przebinda, and M. Özaydın, “A preliminary case for Hirschman transform video coding,” in Proc. IEEE Southwest Symp. Image Anal. & Interp., Santa Fe, NM, Mar. 29-31, 2020, pp. 104-107.

E. Vorakitolan, J.P. Havlicek, R.D. Barnes, and A.R. Stevenson, “Simple, Effective Rate Control for Video Distribution in Heterogeneous Intelligent Transportation System Networks,” in Proc. IEEE Southwest Symp. Image Anal. & Interp., San Diego, CA, Apr. 6-8, 2014, pp. 37-40.

V. DeBrunner, J.P. Havlicek, T. Przebinda, and M. Özaydın, “Entropy-Based Uncertainty Measures for L2(Rn), l2(Z), and l2(Z/NZ) with a Hirschman Optimal Transform for l2(Z/NZ),” IEEE Trans. Signal Process., vol. 53, no. 8, pp. 2690-2699, Aug. 2005.

A.C. Bovik, J.P. Havlicek, M.D. Desai, and D.S. Harding, “Limits on Discrete Modulated Signals,” IEEE Trans. Signal Process., vol. 45, no. 4, pp. 867-879, Apr. 1997.

J.P. Havlicek and A.C. Bovik, “Image Modulation Models,” in Handbook of Image and Video Processing, A.C. Bovik, ed., Communications, Networking, and Multimedia Series by Academic Press, San Diego, 2000, pp. 305-316.

P.C. Tay, J.P. Havlicek, S.T. Acton, and J.A. Hossack, “Properties of the magnitude terms of orthogonal scaling functions,” Digital Signal Process., vol. 20, no. 5, pp. 1330-1340, Sep. 2010.

O. Alkhouli, V. DeBrunner, and J. Havlicek, “Hirschman Optimal Transform (HOT) DFT Block LMS Algorithm,” in Adaptive Filtering, L. Garcia, ed., ISBN: 978-953-307-158-9, InTech, Sep. 2011, pp. 135-152.

V. DeBrunner, M. Özaydın, T. Przebinda, and J. Havlicek, “The optimal solutions to the continuous- and discrete-time versions of the Hirschman uncertainty principle,” in Proc. IEEE Int’l. Conf. Acoust., Speech, Signal Process., Istanbul, Turkey, Jun. 5-9, 2000, vol. 1, pp. 81-84.
D. Compensation

I am being compensated for my services at my standard rate and reimbursed for reasonable expenses that I may incur while working on this matter. I have no financial interest in the outcome of this matter or any other matter that may exist between Unified Patents, on the one hand, and Electronics and Telecommunications Research Institute, Kwangwoon University Research Institute for Industry Cooperation, or Industry-Academia Cooperation Group of Sejong University, on the other. My compensation does not depend in any way on the conclusions I reach.
II. MATERIALS CONSIDERED

In developing my opinions relating to the ’484 patent, I have considered the materials cited herein, including those itemized in the “Exhibits Considered” list preceding this declaration.
III. LEVEL OF ORDINARY SKILL IN THE ART

I have been asked to provide an opinion about what the qualifications of a person of ordinary skill in the art (“POSA”) in the field of the ’484 patent would have been. The field of the ’484 patent “relates to an encoding/decoding apparatus and method using an adaptive Discrete Cosine Transform (DCT) coefficient scanning based on pixel similarity.” Ex. 1001, 1:26-29. The ’484 patent further explains that it “relates to an encoding/decoding apparatus and method which performs intra prediction onto input video, predicts pixel similarity information of coefficients to be encoded that is acquired from adjacent pixels in the intra-predicted video, and performs a most effective scanning, e.g., Discrete Cosine Transform (DCT) coefficient scanning, according to the pixel similarity.” Ex. 1001, 1:29-36.
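For illustration only, the following short sketch of my own (not code from the ’484 patent or from any exhibit; the mode labels and the function name are my own assumptions) shows the general idea of pairing a coefficient scan order with an intra prediction direction, consistent with the pairing recited in claim 4 as quoted in the table of contents above: a horizontal scanning mode when the intra prediction mode is vertical, and a vertical scanning mode when the intra prediction mode is horizontal.

```python
# Illustrative sketch only: my own simplified example, not code from the '484 patent
# or from any exhibit. It pairs a coefficient scan order with an intra prediction
# direction in the manner recited in claim 4. The mode labels and the function name
# are my own choices for this example.

def choose_scan_order(intra_mode: str) -> str:
    """Select a scan order for the quantized transform coefficients."""
    if intra_mode == "vertical":
        return "horizontal_scan"   # vertical intra prediction -> horizontal scanning mode
    if intra_mode == "horizontal":
        return "vertical_scan"     # horizontal intra prediction -> vertical scanning mode
    return "zigzag_scan"           # other modes fall back to a default order

print(choose_scan_order("vertical"))    # horizontal_scan
print(choose_scan_order("horizontal"))  # vertical_scan
print(choose_scan_order("DC"))          # zigzag_scan
```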
For the purposes of my opinion regarding the qualifications of a POSA, I have been asked to assume that the relevant date for determining the knowledge and qualifications of such a person for purposes of the ’484 patent is August 17, 2006, which I understand is the date that the earliest of the two Korean applications listed on the face of the ’484 patent was filed.
I understand that the factors considered in determining the ordinary level of skill in the art include: (i) the levels of education and experience of persons working in the field; (ii) the types of problems encountered in the field; and (iii) the sophistication of the technology. I understand that a person of ordinary skill in the art is not a specific real individual, but rather a hypothetical individual having the qualities reflected by the factors above. This hypothetical person has knowledge of all prior art in the relevant field as if it were arranged on a workshop wall and takes from each reference what it would teach to a person having the skills of a person of ordinary skill in the art.
A POSA would have had at least a bachelor’s degree in electrical engineering or a similar field, such as physics, computer science, or computer engineering. A person without such an educational background may also qualify as a POSA if they had a post-graduate education in one of these areas with a focus on data compression, and more specifically image and video compression. In addition to formal education, a POSA would have had several years of hands-on experience with video processing systems either in industry or academia. The ’484 patent and certain prior art materials show that by August 17, 2006, the POSA would have been familiar with standardized video coding techniques, such as those defined in MPEG, MPEG-2, H.264/AVC, and MPEG-4. See, e.g., Ex. 1001, 1:56 (“H.264/Advanced Video Coding (AVC) standard technology can compress video about twice as high as Moving Pictures Expert Group 2 (MPEG-2) and about one and a half times as high as MPEG-4 by using such technique as intra prediction encoding, ¼-pixel based variable block motion prediction and compensation, Context-based Adaptive Variable Length Coding (CAVLC), and Context-based Adaptive Binary Arithmetic Coding (CABAC).”); Ex. 1007, ¶[0005] (“New standards called MPEG-4 part 10 AVC (advanced video coding) or ITU-T H.264 emerged in 2003 in the field of video compression.”); Ex. 1008, 2:10-12 (“The coding of image data has been widely used in many international standards such as JPEG, MPEG1, H.261, MPEG2, and H.263.”); Ex. 1011, 1:35-2:1 (“To reduce the amount of information to be transmitted, a number of different compression methods have been developed, such as the JPEG, MPEG and H.263 standards.”); Ex. 1014, 1:17-23 (referring to MPEG as a “representative image coding method”); Ex. 1015, p.676 (referring to “the MPEG-4 video standard” and how the “ISO Moving Picture Experts Group (MPEG) is currently developing this standard after having completed the MPEG-1 and MPEG-2 standards,” as well as “recent experience in the development of ITU-T H.263”). Some techniques used in these standards, including the discrete cosine transform, quantization, intra frame prediction, scanning of coefficients, and variable length encoding, were familiar to the POSA, as I explain in more detail below and as reflected in the materials cited above. Additionally, a POSA would have been familiar with methods to reverse these processes. Of course, these are not rigid qualifications, and a person with less education but more relevant hands-on experience may have gained sufficient exposure to the technical subject matter of the ’484 patent to develop ordinary skill in the art through some equivalent combination of education and hands-on experience.
Based on my knowledge and experience in the industry, including teaching data compression techniques and working at IBM developing data compression tools, these qualifications would have led to familiarity with data compression systems in general and, more specifically, data compression and decompression techniques that can be associated with video encoding.
My opinions are further supported by my education, including studies related to video data compression techniques and applications, the relevant prior art identified in the exhibit list at the front of this declaration, and my reading of the ’484 patent.
Because a person of ordinary skill in the art is presumed to have knowledge of all relevant prior art, a person of ordinary skill in the art would have been familiar with each of the references cited herein and the full range of teachings they contain. A person of ordinary skill in the art would have reviewed the various publications and patents I discuss herein at least because these prior art references address solutions to problems in data compression and particularly data compression for image data, and more specifically, video data.
As of August 2006, I had more than ordinary skill in the field of video coding. But I am familiar with the skills and knowledge possessed by those I would have considered to be of ordinary skill in the art in the August 2006 time frame. When I refer to the understanding of a person of ordinary skill in the art, I am referring to that person’s understanding as of this effective filing date.
IV. TECHNICAL TUTORIAL

In this section, I summarize some of the key technical concepts needed for an understanding of intra prediction and scanning in the video coding process. I start with an overview of how scenes—things that you or I may see visually—are captured by image capture systems and converted into arrays of digital data. Next, I discuss the concept of moving pictures and show why video compression is important when dealing with video data. Then I introduce some concepts associated with video compression including (1) reducing spatial and temporal redundancy, (2) encoders and decoders in some typical video encoding systems including the general process of taking raw image data and compressing and decompressing it, and (3) an introduction to certain concepts including data structures, performing DCT on pixel values, quantization, scanning, and entropy encoding (and reversing this process at the decoder).
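To give a concrete sense of the scanning step mentioned above, the following is a simplified sketch of my own, offered for illustration only. It is not code from the ’484 patent or from any cited reference, and the 4x4 block size, the particular coefficient values, and the function name are assumptions I have chosen for this example. It reads a small block of quantized transform coefficients in a fixed zig-zag order to produce a one-dimensional sequence.

```python
# Illustrative sketch only (my own simplified example, not code from the '484 patent
# or any cited reference): converting a small block of quantized transform
# coefficients into a one-dimensional list using a fixed zig-zag scan order.

ZIGZAG_4x4 = [          # (row, col) visiting order for a 4x4 block
    (0, 0), (0, 1), (1, 0), (2, 0),
    (1, 1), (0, 2), (0, 3), (1, 2),
    (2, 1), (3, 0), (3, 1), (2, 2),
    (1, 3), (2, 3), (3, 2), (3, 3),
]

def zigzag_scan(block):
    """Read a 4x4 block of quantized coefficients in zig-zag order."""
    return [block[r][c] for r, c in ZIGZAG_4x4]

# After the DCT and quantization steps, most nonzero values tend to cluster near
# the top-left (low-frequency) corner, so the scanned sequence tends to end in a
# run of zeros that entropy coding can represent compactly.
quantized = [
    [12, 5, 0, 0],
    [ 4, 1, 0, 0],
    [ 0, 0, 0, 0],
    [ 0, 0, 0, 0],
]
print(zigzag_scan(quantized))  # [12, 5, 4, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
```

Because the significant coefficients appear early in the scanned sequence in this example, the long run of trailing zeros can be represented very compactly by the entropy coding step discussed later in this tutorial.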
A. Still Images and Image Capture

Capturing still images has been commonplace for a long time. Over at least the last half-century it has been possible to image scenes using arrays of photosensors. Those sensors collect light that reflects off of surfaces in the scene and the resulting data can be used to re-create an image of the scene recorded by the sensors. But, as a general matter, most charge coupled device (CCD) arrays—photosensors arranged in an array—were in essence “color blind.” They were able to detect the amount of light, but not the wavelength of the light. One early solution to this problem was the Bayer mosaic filter, named after the inventor of U.S. Patent No. 3,971,065. See generally Ex. 1030. The idea described in Bayer’s patent was to create colored filters—red, green, and blue, for example—that overlaid the CCD array so that only light filtered with a particular wavelength would be detected. Ex. 1030, 2:63-3:13. Because the human eye is better able to detect variance in light intensity, and changes in green light are most perceptible by the human eye as a change in intensity of light, one embodiment described by Bayer was to use twice as many green filters as red and blue filters. Ex. 1030, 2:63-3:13, Abstract. One embodiment shown in Bayer’s patent is below, with the red, blue, and green filters colored accordingly.

[Figure reproduced in original: Ex. 1030, FIGs. 1A and 1B (Bayer color filter array)]

Ex. 1030, FIGs. 1A, 1B; see also Ex. 1014, 1:25-32 (describing a “1-CCD . . . camera[]” that includes “solid state photosensors . . . arranged in an array in a matrix fashion with each CCD defining an element in the matrix called a ‘pixel,’” and the placement of a “mosaiced color filter over the photosensor array”).
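The filter arrangement described above can be illustrated with a short sketch of my own (not code from Ex. 1030 or any other exhibit; the function name and the particular orientation of the repeating tile are assumptions chosen for illustration). Each photosensor site is covered by one colored filter, and green sites occur twice as often as red or blue sites.

```python
# Illustrative sketch only (my own simplified example): reporting the color of the
# filter that covers each photosensor site in a Bayer-style mosaic, in which green
# samples occur twice as often as red or blue samples, as described above.

def bayer_filter_color(row, col):
    """Return 'R', 'G', or 'B' for the filter over the sensor at (row, col),
    using a repeating 2x2 tile with green on one diagonal."""
    if (row + col) % 2 == 0:
        return "G"                       # green on the checkerboard diagonal
    return "R" if row % 2 == 0 else "B"  # remaining sites alternate red / blue by row

for r in range(4):
    print(" ".join(bayer_filter_color(r, c) for c in range(4)))
# G R G R
# B G B G
# G R G R
# B G B G
```

In each repeating 2x2 tile of this sketch there are two green sites, one red site, and one blue site, which reflects the two-to-one green sampling described above.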
When light hit each of the sensors in the CCD array, analog signals would be generated reflecting the intensity of the light incident on that sensor; those analog signals could then be converted into digital signals representing the intensity and color of the light incident on the sensor. Ex. 1031, 3:25-32 (“The CCDs 26 photo-electrically convert the light falling on them into proportional analog electrical signals in photosensor conversion 71. The analog signals are converted to digital signals by an analog to digital (A/D) conversion 72. The digital signals are then output for demosaicing.”).
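The analog-to-digital conversion step described in this passage can be illustrated with a simple sketch of my own; the full-scale voltage, the 8-bit resolution, and the function name are assumptions chosen only for this example and are not taken from Ex. 1031. An analog level proportional to the light intensity is mapped to an integer code.

```python
# Illustrative sketch only (my own simplified numbers, not taken from Ex. 1031):
# mapping a sensor's analog output level to an 8-bit digital code, the kind of
# analog-to-digital (A/D) conversion step described above.

FULL_SCALE_VOLTS = 1.0   # assumed sensor output at maximum light intensity
BITS = 8                 # assumed converter resolution

def analog_to_digital(volts):
    """Quantize an analog level in [0, FULL_SCALE_VOLTS] to an integer code 0..255."""
    max_code = (1 << BITS) - 1                     # 255 is the largest 8-bit code value
    clipped = min(max(volts, 0.0), FULL_SCALE_VOLTS)
    return round(clipped / FULL_SCALE_VOLTS * max_code)

print(analog_to_digital(0.0))   # 0   (no light)
print(analog_to_digital(0.5))   # 128 (mid-scale intensity, rounded)
print(analog_to_digital(1.0))   # 255 (full-scale intensity)
```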
Each data element captured by a sensor in the array is called a “pixel” or “pel.” At this point in the process of capturing a digital image, each pixel would generally have one color value. For example, each pixel might be a red, green, or blue pixel and have a certain digital value associated with its intensity. A mosaic image from a Bayer s