Case 2:21-cv-00040-JRG Document 147-3 Filed 12/03/21 Page 1 of 7 PageID #: 5946
Exhibit G
UNITED STATES DISTRICT COURT
NORTHERN DISTRICT OF CALIFORNIA
SAN JOSE DIVISION

APPLE INC., a California corporation,

Plaintiff,

v.

SAMSUNG ELECTRONICS CO., LTD., a Korean corporation; SAMSUNG ELECTRONICS AMERICA, INC., a New York corporation; SAMSUNG TELECOMMUNICATIONS AMERICA, LLC, a Delaware limited liability company,

Defendants.

Case No. 12-cv-00630

EXPERT REPORT OF JOHN R. HAUSER
August 11, 2013

**CONFIDENTIAL – CONTAINS MATERIAL DESIGNATED AS CONFIDENTIAL PURSUANT TO A PROTECTIVE ORDER**

Table of Contents

I. Introduction and Qualifications ........................................ 3
II. Assignment ............................................................ 5
III. Summary of Opinions ................................................... 6
IV. Overview of Methodology ................................................ 7
    A. Choosing Among Different Products ................................. 10
    B. Choosing Whether or Not to Buy .................................... 17
    C. Hierarchical Bayes for Choice-Based Conjoint ...................... 19
V. Survey Design and Administration ...................................... 21
    A. Pretesting ........................................................ 23
    B. Identifying the Sample ............................................ 25
    C. Survey Methodology ................................................ 27
    D. Survey Implementation ............................................. 34
VI. Estimation of Partworths ............................................. 41
    A. Testing the Fit and Predictive Ability of the Model .............. 45
    B. Consumers Positively Value the Patent-Related Features ........... 50
VII. The Effect of the Patent-Related Features on Consumers’ Willingness to Buy ... 54
VIII. The Price Premium for the Patent-Related Features .................. 62
13. After careful review of the results of these surveys, including the sensitivity analyses and robustness checks described in detail below, it is my opinion that the model of consumer behavior I use provides reliable and substantial information explaining consumers’ choices in regard to the features tested for both smartphones and tablets.

14. The choice simulation results that I have obtained and the robustness checks that I have performed demonstrate that, for both smartphones and tablets, the patent-related features at issue in this case, whether individually or collectively, have substantial effects on Samsung consumers’ willingness to buy the relevant Samsung products. My analysis of the data also finds that, on average, Samsung consumers place substantial positive value on the patent-related features at issue in this case.

15. Finally, I understand that the exhibits to my report and underlying data have been provided to Christopher Vellturo, Ph.D. I further understand from Dr. Vellturo that he is utilizing the data and findings from my conjoint studies in order to estimate how demand for certain smartphones and tablets is impacted when product features I considered in my conjoint studies are removed from those devices. I also understand that those devices have prices and screen sizes within or close to the ranges in my conjoint studies. In my opinion, it is appropriate to use the data and findings from my studies for such an application.
IV. Overview of Methodology
16. The basic survey methodology that I selected is known as web-based conjoint analysis. Conjoint analysis is a tool that enjoys wide use in the field of marketing research. It was introduced to the field of marketing research in 1971 and is generally recognized by marketing science academics and industry practitioners to be the most widely studied and applied form of quantitative consumer preference measurement. It has been shown to provide valid and reliable measures of consumer preferences, and these preferences have
been shown to provide valid and reliable forecasts of what consumers will do (or would have done) under scenarios related to those measured.7
17. For example, under the auspices of MIT’s Virtual Consumer Initiative, my colleagues and I undertook large-scale tests of the validity of web-based conjoint analysis. Predictions were highly accurate, predicting future choices consumers would make in a subsequent test with real money at stake and predicting what would happen in the marketplace. One of the scientific papers discussing the validity test received two highly prestigious awards: as the best paper in the marketing sciences literature for 2003 (awarded in 2004) and as the best paper based on a dissertation (awarded in June 2005).8 Another one of my scientific papers was a finalist for the best paper in 2002 in the Journal of Product Innovation Management, and still a third paper was a finalist for the best contribution to the practice of marketing research in 2003.9 Many successful products, including automobiles, hotel chains, the EZ Pass system, HMOs, and cameras, have been developed using conjoint analysis.10
18. The general idea behind conjoint analysis is that consumers’ preferences for a particular product are driven by the features or descriptions of the features embodied in that product. The features included in the survey are organized into feature categories. This is typical practice in conjoint analysis: consumers are presented with different hypothetical products as options (also known as “profiles”), which are constructed based on the
7 Hauser, John R. and Vithala Rao (2004), “Conjoint Analysis, Related Modeling, and Applications,” Market Research and Modeling: Progress and Prospects, Jerry Wind and Paul Green, Eds., (Boston, MA: Kluwer Academic Publishers), pp. 141–168.
8 Toubia, Olivier, Duncan I. Simester, John R. Hauser, and Ely Dahan (2003), “Fast Polyhedral Adaptive Conjoint Estimation,” Marketing Science, 22, 3, (Summer), pp. 273–303.
9 Dahan, Ely and John R. Hauser (2002), “The Virtual Customer,” Journal of Product Innovation Management, 19, 5, (September), pp. 332–354; Toubia, Olivier, John R. Hauser, and Duncan Simester (2004), “Polyhedral Methods for Adaptive Choice-based Conjoint Analysis,” Journal of Marketing Research, 41, 1, (February), pp. 116–131.
10 Green, Paul E., Abba M. Krieger, and Yoram Wind (2001), “Thirty Years of Conjoint Analysis: Reflections and Prospects,” Interfaces, 31, 3, (May–June), pp. S53–S76; Hauser, John R. and Vithala Rao (2004), “Conjoint Analysis, Related Modeling, and Applications,” Market Research and Modeling: Progress and Prospects, Jerry Wind and Paul Green, Eds., (Boston, MA: Kluwer Academic Publishers), pp. 141–168.
feature categories (also known as “attributes”).11 For example, in one of the classic articles in the conjoint area, Lenk, DeSarbo, Green, and Young (1996) studied computer purchases of MBA students using the feature categories of telephone service hot line, RAM, screen size, CPU speed, hard disk size, CD-ROM/multimedia, cache, color of unit, availability, warranty, bundled productivity software, money-back guarantee, and price.12 In another study, storage format, LCD screen size, optical zoom, camera resolution, low-light sensitivity, and price were used as attributes for camcorders.13
19. In the smartphone survey I conducted here, I included the following feature categories: (i) data accessibility; (ii) call initiation and screening; (iii) input assistance; (iv) screen size; (v) camera; and (vi) price. In the tablet survey I conducted here, I included the following feature categories: (i) data accessibility; (ii) lock screen interface; (iii) input assistance; (iv) screen size; (v) connectivity; and (vi) price. Each of the feature categories included in the smartphone and tablet surveys takes on one of four different levels. For example, in the smartphone survey, the screen size category takes one of the following four levels: (1) 4.0”, (2) 4.8”, (3) 5.0”, or (4) 5.5”. In some cases, levels consist of various features. For example, the camera feature category takes on one of the following four levels: (1) Panorama, (2) Panorama and Best Photo, (3) Panorama, Best Photo, and Smile Shot, or (4) Panorama, Best Photo, Smile Shot, and Buddy Photo Share.
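As an illustrative sketch only (not part of the survey instrument), the category-and-level structure just described can be represented as a simple data structure, with each hypothetical product "profile" drawing one level from each category. The screen-size and camera levels below are quoted from the report; the level labels for the other categories are placeholders.

```python
import itertools

# Feature categories and four levels each for the smartphone survey.
# Screen-size and camera levels follow the report; other levels are
# hypothetical placeholders for illustration.
smartphone_categories = {
    "data accessibility": ["level 1", "level 2", "level 3", "level 4"],
    "call initiation and screening": ["level 1", "level 2", "level 3", "level 4"],
    "input assistance": ["level 1", "level 2", "level 3", "level 4"],
    "screen size": ['4.0"', '4.8"', '5.0"', '5.5"'],
    "camera": [
        "Panorama",
        "Panorama and Best Photo",
        "Panorama, Best Photo, and Smile Shot",
        "Panorama, Best Photo, Smile Shot, and Buddy Photo Share",
    ],
    "price": ["level 1", "level 2", "level 3", "level 4"],
}

# A profile is one level chosen from each category; the full factorial
# design contains 4^6 = 4096 possible profiles.
all_profiles = [
    dict(zip(smartphone_categories, combo))
    for combo in itertools.product(*smartphone_categories.values())
]
print(len(all_profiles))  # 4096
```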
20. There are different forms of conjoint analysis.14 For this assignment, I used a form of conjoint analysis known as Choice-Based Conjoint (“CBC”) analysis. In CBC, consumers are shown products generated as different combinations of levels of the
11 See, for example, Vithala Rao (2008), “Developments in Conjoint Analysis,” Handbook of Marketing Decision Models, B. Wierenga, Ed., (Boston, MA: Springer Science and Business Media), pp. 23–49.
12 Peter J. Lenk, Wayne S. DeSarbo, Paul E. Green, and Martin R. Young (1996), “Hierarchical Bayes Conjoint Analysis: Recovery of Partworth Heterogeneity from Reduced Experimental Designs,” Marketing Science, 15, 2, pp. 173–191.
13 Min Ding, Young-Hoon Park, and Eric T. Bradlow (2009), “Barter Markets for Conjoint Analysis,” Management Science, 55, 6, pp. 1003–1017. For other examples of feature categories for different products, see Dahan, Ely and John R. Hauser (2002), “The Virtual Customer,” Journal of Product Innovation Management, 19, 5, (September), pp. 332–354.
14 Examples of conjoint approaches include Ratings-Based (or Full-Profile) Conjoint Analysis, Choice-Based Conjoint Analysis, Adaptive Conjoint Analysis, and Self-Explicated Conjoint Analysis. See Vithala Rao (2008), “Developments in Conjoint Analysis,” Handbook of Marketing Decision Models, B. Wierenga, Ed., (Boston, MA: Springer Science and Business Media), pp. 23–49.
feature categories. Respondents are shown these products in sets (called “choice sets” or “choice tasks”) and are asked to choose the product that they most prefer among the alternatives presented. I chose to show respondents four products in each choice set. I have used four-product choice sets in other applications, including award-winning academic articles and in litigation.15 I have found the data to be both reliable and valid.
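The choice-task construction described above can be sketched as follows. This is a deliberate simplification under stated assumptions: real CBC studies use efficient experimental designs rather than pure random draws, and all level labels here are hypothetical except the screen sizes quoted earlier.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# Four-level feature categories (placeholder labels, except screen size).
categories = {
    "data accessibility": ["A1", "A2", "A3", "A4"],
    "call initiation and screening": ["B1", "B2", "B3", "B4"],
    "input assistance": ["C1", "C2", "C3", "C4"],
    "screen size": ['4.0"', '4.8"', '5.0"', '5.5"'],
    "camera": ["D1", "D2", "D3", "D4"],
    "price": ["$149", "$199", "$249", "$299"],
}

def random_profile():
    """One hypothetical product: one level drawn from each category."""
    return {cat: random.choice(levels) for cat, levels in categories.items()}

def choice_task(n_products=4):
    """A choice set: the respondent picks the product they most prefer."""
    return [random_profile() for _ in range(n_products)]

task = choice_task()  # one four-product choice set
```

In an actual survey, many such tasks are shown to each respondent, and the pattern of choices across tasks is what identifies the value of each feature level.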
21. In the smartphone and tablet surveys I conducted here, respondents are asked whether they would buy the product that they chose at the indicated price. In making that choice, respondents are asked to take into account other options in the market (other smartphone options in the smartphone survey and other tablet options in the tablet survey) that might influence their purchase decision.16
22. I now discuss how the choices respondents make in response to each question are used to understand consumer demand for the features included in the survey, as well as how consumer demand for the features drives consumer demand for the product.
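The link from observed choices to feature-level demand can be sketched with the standard multinomial-logit model that underlies choice-based conjoint: each level of each feature category carries a "partworth" utility, a product's utility is the sum of its levels' partworths, and predicted choice probabilities follow the logit rule. The partworth values below are invented purely for illustration; real partworths are estimated from respondents' choices (here, via hierarchical Bayes, as discussed later in the report).

```python
import math

# Hypothetical partworths (utility contributions) for two feature
# categories; the numbers are illustrative, not estimates from the report.
partworths = {
    ("screen size", '4.0"'): 0.0,
    ("screen size", '5.5"'): 0.8,
    ("camera", "Panorama"): 0.0,
    ("camera", "Panorama and Best Photo"): 0.3,
}

def utility(profile):
    """Total utility of a product = sum of its levels' partworths."""
    return sum(partworths[(cat, lvl)] for cat, lvl in profile.items())

def logit_shares(profiles):
    """Multinomial-logit choice probabilities over one choice set."""
    exps = [math.exp(utility(p)) for p in profiles]
    total = sum(exps)
    return [e / total for e in exps]

product_a = {"screen size": '5.5"', "camera": "Panorama and Best Photo"}
product_b = {"screen size": '4.0"', "camera": "Panorama"}
shares = logit_shares([product_a, product_b])
```

Removing a valued feature lowers a product's utility and therefore its predicted choice share, which is the mechanism by which the choice simulations described in this report measure the effect of the patent-related features on willingness to buy.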
A. Choosing Among Different Products
23. Conjoint analysis provides respondents with realistic choices among hypothetical products that vary simultaneously on multiple feature categories. This realism enhances the predictive ability, and hence the reliability and validity, of the conjoint analysis task. Furthermore, because multiple feature categories are varying simultaneously, the task does not cause the respondent to focus artificially on a single feature category. By avoiding such a focus, conjoint analysis minimizes any demand artifacts that might be induced. A demand artifact is akin to a leading question.17 A multi-feature-category task
15 See, for example, Toubia, Olivier, John R. Hauser, and Duncan Simester (2004), “Polyhedral Methods for Adaptive Choice-based Conjoint Analysis,” Journal of Marketing Research, 41, 1, (February), pp. 116–131; Toubia, Olivier, John Hauser, and Rosanna Garcia (2007), “Probabilistic Polyhedral Methods for Adaptive Choice-Based Conjoint Analysis: Theory and Application,” Marketing Science, 26, 5, (September–October), pp. 596–610; TiVo Inc. v. EchoStar Communications Corp. et al., Case No. 2:04-CV-1-DF, United States District Court for the Eastern District of Texas, Marshall Division.
16 Jeff D. Brazell, Christopher G. Diener, Ekaterina Karniouchina, William L. Moore, Válerie Séverin, and Pierre-Francois Uldry (2006), “The no-choice option and dual response choice designs,” Marketing Letters, 17, 4, (December), p. 256; Christopher G. Diener, Bryan Orme, and Dan Yardley, “Dual Response ‘None’ Approaches: Theory and Practice,” 2006 Sawtooth Software Conference Proceedings, pp. 157–167.
17 Demand artifacts are aspects of the study that influence research results based on the chosen procedure rather than based on the phenomenon under study. For a discussion of demand artifacts, see, e.g., Sawyer, Alan G.