`
`Exhibit G
`
`
`
`Case 2:21-cv-00040-JRG Document 147-3 Filed 12/03/21 Page 2 of 7 PageID #: 5947
`
UNITED STATES DISTRICT COURT

NORTHERN DISTRICT OF CALIFORNIA

SAN JOSE DIVISION

APPLE INC., a California corporation,

Plaintiff,

v.

SAMSUNG ELECTRONICS CO., LTD., a Korean corporation; SAMSUNG ELECTRONICS AMERICA, INC., a New York corporation; SAMSUNG TELECOMMUNICATIONS AMERICA, LLC, a Delaware limited liability company,

Defendants.

Case No. 12-cv-00630

EXPERT REPORT OF JOHN R. HAUSER
August 11, 2013

**CONFIDENTIAL – CONTAINS MATERIAL DESIGNATED AS CONFIDENTIAL – PURSUANT TO A PROTECTIVE ORDER**
`
`
Table of Contents

I.    Introduction and Qualifications ............................................................................................ 3
II.   Assignment ........................................................................................................................... 5
III.  Summary of Opinions .......................................................................................................... 6
IV.   Overview of Methodology ................................................................................................... 7
      A. Choosing Among Different Products .......................................................................... 10
      B. Choosing Whether or Not to Buy ............................................................................... 17
      C. Hierarchical Bayes for Choice-Based Conjoint .......................................................... 19
V.    Survey Design and Administration .................................................................................... 21
      A. Pretesting ..................................................................................................................... 23
      B. Identifying the Sample ................................................................................................ 25
      C. Survey Methodology ................................................................................................... 27
      D. Survey Implementation ............................................................................................... 34
VI.   Estimation of Partworths .................................................................................................... 41
      A. Testing the Fit and Predictive Ability of the Model ................................................... 45
      B. Consumers Positively Value the Patent-Related Features .......................................... 50
VII.  The Effect of the Patent-Related Features on Consumers' Willingness to Buy................. 54
VIII. The Price Premium for the Patent-Related Features .......................................................... 62
`
Confidential – Contains Material Designated as Confidential – Pursuant to a Protective Order
`
`Page 2 of 67
`
`
`13.
`
`After careful review of the results of these surveys, including sensitivity analyses and
`
`robustness checks described in detail below, it is my opinion that the model of consumer
`
behavior I use provides reliable and substantial information explaining consumers'
`
`choices in regard to the features tested for both smartphones and tablets.
`
`14.
`
`The choice simulation results that I have obtained and the robustness checks that I have
`
`performed demonstrate that, for both smartphones and tablets, the patent-related features
`
`at issue in this case, whether individually or collectively, have substantial effects on
`
Samsung consumers' willingness to buy the relevant Samsung products. My analysis of
`
`the data also finds that, on average, Samsung consumers place substantial positive value
`
`on the patent-related features at issue in this case.
`
`15.
`
`Finally, I understand that the exhibits to my report and underlying data have been
`
`provided to Christopher Vellturo, Ph.D. I further understand from Dr. Vellturo that he is
`
`utilizing the data and findings from my conjoint studies in order to estimate how demand
`
`for certain smartphones and tablets is impacted when product features I considered in my
`
`conjoint studies are removed from those devices. I also understand that those devices
`
`have prices and screen sizes within or close to the ranges in my conjoint studies. In my
`
`opinion, it is appropriate to use the data and findings from my studies for such an
`
`application.
`
`IV.
`
`Overview of Methodology
`
`16.
`
`The basic survey methodology that I selected is known as web-based conjoint analysis.
`
`Conjoint analysis is a tool that enjoys wide use in the field of marketing research. It was
`
`introduced to the field of marketing research in 1971 and is generally recognized by
`
`marketing science academics and industry practitioners to be the most widely studied and
`
`applied form of quantitative consumer preference measurement. It has been shown to
`
`provide valid and reliable measures of consumer preferences, and these preferences have
`
`
`been shown to provide valid and reliable forecasts of what consumers will do (or would
`have done) under scenarios related to those measured.7
`
`17.
`
For example, under the auspices of MIT's Virtual Consumer Initiative, my colleagues
`
`and I undertook large-scale tests of the validity of web-based conjoint analysis.
`
`Predictions were highly accurate, predicting future choices consumers would make in a
`
`subsequent test with real money at stake and predicting what would happen in the
`
marketplace. One of the scientific papers discussing the validity test received two highly

prestigious awards: best paper in the marketing sciences literature for 2003

(awarded in 2004) and best paper based on a dissertation (awarded in June 2005).8

Another of my scientific papers was a finalist for the best paper of 2002 in the

Journal of Product Innovation Management, and a third was a finalist for the

best contribution to the practice of marketing research in 2003.9 Many successful

products, including automobiles, hotel chains, the EZ Pass system, HMOs, and cameras,

have been developed using conjoint analysis.10
`
`18.
`
The general idea behind conjoint analysis is that consumers' preferences for a particular
`
`product are driven by the features or descriptions of the features embodied in that
`
`product. The features included in the survey are organized into feature categories. This is
`
`typical practice in conjoint analysis: consumers are presented with different hypothetical
`
`products as options (also known as profiles), which are constructed based on the
`
`
7 Hauser, John R. and Vithala Rao (2004), "Conjoint Analysis, Related Modeling, and Applications," Advances
in Marketing Research: Progress and Prospects, Jerry Wind and Paul Green, Eds., (Boston, MA: Kluwer
Academic Publishers), pp. 141–168.
8 Toubia, Olivier, Duncan I. Simester, John R. Hauser, and Ely Dahan (2003), "Fast Polyhedral Adaptive
Conjoint Estimation," Marketing Science, 22, 3, (Summer), pp. 273–303.
9 Dahan, Ely and John R. Hauser (2002), "The Virtual Customer," Journal of Product Innovation Management,
19, 5, (September), pp. 332–354; Toubia, Olivier, John R. Hauser, and Duncan Simester (2004), "Polyhedral
Methods for Adaptive Choice-based Conjoint Analysis," Journal of Marketing Research, 41, 1, (February), pp.
116–131.
10 Green, Paul E., Abba M. Krieger, and Yoram Wind (2001), "Thirty Years of Conjoint Analysis: Reflections and
Prospects," Interfaces, 31, 3, (May–June), pp. S53–S76. Hauser, John R. and Vithala Rao (2004), "Conjoint
Analysis, Related Modeling, and Applications," Market Research and Modeling: Progress and Prospects, Jerry
Wind and Paul Green, Eds., (Boston, MA: Kluwer Academic Publishers), pp. 141–168.
`
`
feature categories (also known as attributes).11 For example, in one of the classic
`articles in the conjoint area, Lenk, DeSarbo, Green, and Young (1996) studied computer
`
`purchases of MBA students using the feature categories of telephone service hot line,
`
`RAM, screen size, CPU speed, hard disk size, CD ROM/multimedia, cache, color of unit,
`availability, warranty, bundled productivity software, money back guarantee, and price.12
`In another study, storage format, LCD screen size, optical zoom, camera resolution, low-
`light sensitivity, and price were used as attributes for camcorders.13
`
`
`
19.

In the smartphone survey I conducted here, I included the following feature categories: (i)

data accessibility; (ii) call initiation and screening; (iii) input assistance; (iv) screen size;

(v) camera; and (vi) price. In the tablet survey I conducted here, I included the following

feature categories: (i) data accessibility; (ii) lock screen interface; (iii) input assistance;

(iv) screen size; (v) connectivity; and (vi) price. Each of the feature categories included in

the smartphone and tablet surveys takes on one of four different levels. For example, in

the smartphone survey, the screen size category takes one of the following four levels: (1)

4.0", (2) 4.8", (3) 5.0", or (4) 5.5". In some cases, levels consist of various features. For

example, the camera feature category takes on one of the following four levels: (1)

Panorama, (2) Panorama and Best Photo, (3) Panorama, Best Photo, and Smile Shot, or

(4) Panorama, Best Photo, Smile Shot, and Buddy Photo Share.
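For illustration only (this sketch is not part of the report's methodology), the profile space implied by six four-level feature categories can be enumerated directly. The screen-size and camera levels below are taken from paragraph 19; the remaining levels, including the specific price points, are invented placeholders:

```python
from itertools import product

# Feature categories for the smartphone survey described in paragraph 19.
# Screen-size and camera levels come from the report; the other levels
# (including the price points) are hypothetical placeholders.
categories = {
    "screen_size": ['4.0"', '4.8"', '5.0"', '5.5"'],
    "camera": [
        "Panorama",
        "Panorama + Best Photo",
        "Panorama + Best Photo + Smile Shot",
        "Panorama + Best Photo + Smile Shot + Buddy Photo Share",
    ],
    "data_accessibility": ["level 1", "level 2", "level 3", "level 4"],
    "call_initiation_and_screening": ["level 1", "level 2", "level 3", "level 4"],
    "input_assistance": ["level 1", "level 2", "level 3", "level 4"],
    "price": ["$149", "$199", "$249", "$299"],  # hypothetical price levels
}

# Each hypothetical product ("profile") is one combination of levels, so six
# categories with four levels each yield 4**6 = 4,096 possible profiles.
profiles = list(product(*categories.values()))
print(len(profiles))  # 4096
```

A conjoint survey shows each respondent only a small, carefully chosen subset of this space, which is why the estimation methods discussed below are needed.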
`
`20.
`
`There are different forms of conjoint analysis.14 For this assignment, I used a form of
`conjoint analysis known as Choice-Based Conjoint (CBC) analysis. In CBC,
`
`consumers are shown products generated as different combinations of levels of the
`
`
11 See, for example, Vithala Rao (2008), "Developments in Conjoint Analysis," Handbook of Marketing Decision
Models, B. Wierenga, Ed., (Boston, MA: Springer Science and Business Media), pp. 23–49.
12 Peter J. Lenk, Wayne S. DeSarbo, Paul E. Green, and Martin R. Young (1996), "Hierarchical Bayes Conjoint
Analysis: Recovery of Partworth Heterogeneity from Reduced Experimental Designs," Marketing Science, 15,
2, pp. 173–191.
13 Min Ding, Young-Hoon Park, and Eric T. Bradlow (2009), "Barter Markets for Conjoint Analysis," Management
Science, 55, 6, pp. 1003–1017. For other examples of feature categories for different products, see Dahan, Ely
and John R. Hauser (2002), "The Virtual Customer," Journal of Product Innovation Management, 19, 5,
(September), pp. 332–354.
14 Examples of conjoint approaches include Ratings Based (or Full Profile) Conjoint Analysis, Choice-Based
Conjoint Analysis, Adaptive Conjoint Analysis, and Self-Explicated Conjoint Analysis. See Vithala Rao (2008),
"Developments in Conjoint Analysis," Handbook of Marketing Decision Models, B. Wierenga, Ed., (Boston,
MA: Springer Science and Business Media), pp. 23–49.
`
`
`feature categories. Respondents are shown these products in sets (called the choice sets
`
`or choice tasks) and are asked to choose the product that they most prefer among the
`
`alternatives presented. I chose to show respondents four products in each choice set. I
`
`have used four-product choice sets in other applications, including award-winning
`academic articles and in litigation.15 I have found the data to be both reliable and valid.
`
`21.
`
`In the smartphone and tablet surveys I conducted here, respondents are asked whether
`
`they would buy the product that they chose at the indicated price. In making that choice,
`
`respondents are asked to take into account other options in the market (other smartphone
`
`options in the smartphone survey and other tablet options in the tablet survey) that might
`influence their purchase decision.16
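The mechanics described in paragraphs 20 and 21 can be sketched as follows. This is an illustration only: the partworths, feature levels, prices, and no-buy utility are invented, not estimates from the report, and the multinomial- and binary-logit steps are one common way to model such choices, not necessarily the specification used here:

```python
import math

# Hypothetical partworths (utility contributions of feature levels).
# These numbers are invented for illustration.
partworths = {
    "screen_size": {'4.0"': 0.0, '4.8"': 0.3, '5.0"': 0.4, '5.5"': 0.5},
    "camera": {"Panorama": 0.0, "Panorama + Best Photo": 0.2},
    "price": {"$199": 0.0, "$249": -0.4},
}

def utility(profile):
    """Sum the partworths of a profile's feature levels."""
    return sum(partworths[cat][level] for cat, level in profile.items())

# One choice task: four hypothetical products shown together.
choice_set = [
    {"screen_size": '4.0"', "camera": "Panorama", "price": "$199"},
    {"screen_size": '4.8"', "camera": "Panorama + Best Photo", "price": "$249"},
    {"screen_size": '5.0"', "camera": "Panorama", "price": "$249"},
    {"screen_size": '5.5"', "camera": "Panorama + Best Photo", "price": "$199"},
]

# Multinomial logit: probability that each product is chosen from the set.
exp_u = [math.exp(utility(p)) for p in choice_set]
total = sum(exp_u)
probs = [e / total for e in exp_u]
chosen = max(range(len(choice_set)), key=lambda i: probs[i])
print(f"Most likely choice: product {chosen + 1} (p = {probs[chosen]:.2f})")

# Dual response: whether the respondent would buy the chosen product at its
# price, modeled here as a binary logit against an invented "no-buy" utility.
u_none = 0.2
p_buy = math.exp(utility(choice_set[chosen])) / (
    math.exp(utility(choice_set[chosen])) + math.exp(u_none)
)
print(f"P(buy | chosen) = {p_buy:.2f}")
```

In practice the partworths run in the opposite direction: they are estimated from many respondents' observed choices across many such tasks, and the fitted model is then used to simulate choices like the ones above.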
`
`22.
`
`I now discuss how the choices respondents make in response to each question are used to
`
understand consumer demand for the features included in the survey, as well as how
`
`consumer demand for the features drives consumer demand for the product.
`
`A.
`
`Choosing Among Different Products
`
`23.
`
`Conjoint analysis provides respondents with realistic choices among hypothetical
`
`products that vary simultaneously on multiple feature categories. This realism enhances
`
the predictive ability and, hence, the reliability and validity of the conjoint analysis task.
`
`Furthermore, because multiple feature categories are varying simultaneously, the task
`
`does not cause the respondent to focus artificially on a single feature category. By
`
`avoiding such a focus, conjoint analysis minimizes any demand artifacts that might be
`induced. A demand artifact is akin to a leading question.17 A multi-feature-category task
`
`
15 See, for example, Toubia, Olivier, John R. Hauser, and Duncan Simester (2004), "Polyhedral Methods for
Adaptive Choice-based Conjoint Analysis," Journal of Marketing Research, 41, 1, (February), pp. 116–131;
Toubia, Olivier, John Hauser and Rosanna Garcia (2007), "Probabilistic Polyhedral Methods for Adaptive
Choice-Based Conjoint Analysis: Theory and Application," Marketing Science, 26, 5, (September–October),
pp. 596–610. TiVo Inc. v. Echostar Communications Corp. et al., Case No. 2:04-CV-1-DF, United States
District Court for the Eastern District of Texas, Marshall Division.
16 Jeff D. Brazell, Christopher G. Diener, Ekaterina Karniouchina, William L. Moore, Válerie Séverin, and Pierre-
Francois Uldry (2006), "The no-choice option and dual response choice designs," Marketing Letters, 17, 4,
(December), p. 256; Christopher G. Diener, Bryan Orme, Dan Yardley, "Dual Response 'None' Approaches:
Theory and Practice," 2006 Sawtooth Software Conference Proceedings, pp. 157–167.
17 Demand artifacts are aspects of the study that influence research results based on the chosen procedure rather
than based on the phenomenon under study. For a discussion of demand artifacts, see, e.g., Sawyer, Alan G.
`
`
`
`
`
`
`