DOI:10.1145/2834114

Viewpoint

Privacy Is Dead, Long Live Privacy
Protecting social norms as confidentiality wanes.

Jean-Pierre Hubaux and Ari Juels
THE PAST FEW years have been especially turbulent for privacy advocates. On the one hand, the global dragnet of surveillance agencies has demonstrated the sweeping surveillance achievable by massively resourced government organizations. On the other, the European Union has issued a mandate that Google definitively "forget" information in order to protect users.
Privacy has deep historical roots, as illustrated by the pledge in the Hippocratic oath (5th century B.C.): "Whatever I see or hear in the lives of my patients ... which ought not to be spoken of outside, I will keep secret, as considering all such things to be private."11 Privacy also has a number of definitions. A now common one among scholars views it as the flow of information in accordance with social norms, as governed by context.10 An intricate set of such norms is enshrined in laws, policies, and ordinary conduct in almost every culture and social setting. Privacy in this sense includes two key notions: confidentiality and fair use. We argue that confidentiality, in the sense of individuals' ability to preserve secrets from governments, corporations, and one another, could well continue to erode. We call instead for more attention and research devoted to fair use.
To preserve existing forms of privacy against an onslaught of online threats, the technical community is working hard to develop privacy-enhancing technologies (PETs). PETs enable users to encrypt email, conceal their IP addresses, avoid tracking by Web servers, hide their geographic location when using mobile devices, use anonymous credentials, make untraceable database queries, and publish documents anonymously. Nearly all major PETs aim at protecting confidentiality; we call these confidentiality-oriented PETs (C-PETs). C-PETs can be good and helpful. But there is a significant chance that in many or most places, C-PETs will not save privacy. It is time to consider adding a new research objective to the community's portfolio: preparedness for a post-confidentiality world in which many of today's social norms regarding the flow of information are regularly and systematically violated.

Global warming offers a useful analogy, as another slow and seemingly unstoppable human-induced disaster and a worldwide tragedy of the commons. Scientists and technologists are developing a portfolio of mitigating innovations in renewable energy, energy efficiency, and carbon sequestration. But they are also studying ways of coping with likely effects, including rising sea levels and displacement of populations. There is a scientific consensus that the threat justifies not just mitigation, but preparation (for example, elevating Holland's dikes).

The same, we believe, could be true of privacy. Confidentiality may be melting away, perhaps inexorably: soon, a few companies and surveillance agencies could have access to most of the personal data of the world's population. Data provides information, and information is power. An information asymmetry of this degree and global scale is an absolute historical novelty. There is no reason, therefore, to think of privacy as we conceive of it today as an enduring feature of life.
Example: RFID
Radio-Frequency IDentification (RFID) location privacy concretely illustrates how technological evolution can undermine C-PETs. RFID tags are wireless microchips that often emit static identifiers to nearby readers. Numbering in the billions, they in principle permit secret local tracking of ordinary people. Hundreds of papers proposed C-PETs that rotate identifiers to prevent RFID-based tracking.6
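To make the rotation idea concrete, the following is a minimal sketch, in Python, of one classic flavor of such schemes: a tag evolves its internal state along a one-way hash chain, so successive responses look unrelated to an eavesdropper, while a back-end that knows the seed can still resolve them. All names here are illustrative, and the design is a simplification of proposals surveyed in Juels.6

```python
import hashlib
import os

def h(data):
    """One-way hash used for both emitted pseudonyms and state updates."""
    return hashlib.sha256(data).digest()

class RotatingTag:
    """Toy RFID tag that never emits the same identifier twice.

    On each query the tag outputs h('id' || state) and then overwrites
    its state with h('step' || state), so an eavesdropper sees a stream
    of unlinkable pseudonyms (under standard hash assumptions).
    """
    def __init__(self, seed):
        self.state = seed

    def respond(self):
        pseudonym = h(b"id" + self.state)
        self.state = h(b"step" + self.state)  # forward-secure update
        return pseudonym

class Reader:
    """Authorized back-end that knows each tag's seed."""
    def __init__(self, seeds, max_steps=1000):
        # Precompute pseudonym -> tag name over a bounded window of steps.
        self.table = {}
        for name, state in seeds.items():
            for _ in range(max_steps):
                self.table[h(b"id" + state)] = name
                state = h(b"step" + state)

    def identify(self, pseudonym):
        return self.table.get(pseudonym)

# The reader links rotating pseudonyms; an eavesdropper cannot.
seed = os.urandom(32)
tag = RotatingTag(seed)
reader = Reader({"tag-42": seed})
p1, p2 = tag.respond(), tag.respond()
assert p1 != p2                          # identifiers rotate every read
assert reader.identify(p1) == "tag-42"   # yet remain resolvable
assert reader.identify(p2) == "tag-42"
```

The precomputed lookup table trades back-end storage for fast identification; the actual literature explores this and many other trade-offs among tag cost, reader workload, and privacy.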
Today, this threat seems quaint. Mobile phones with multiple RF interfaces (including Bluetooth, Wi-Fi, and NFC), improvements in face recognition, and a raft of new wireless devices (fitness trackers, smartwatches, and others) offer far more effective ways to track people than RFID ever did. They render RFID C-PETs obsolete.

This story of multiplying threat vectors undermining the power of C-PETs, and of privacy more generally, is becoming common.
The Assault on Privacy
We posit four major trends providing the means, motive, and opportunity for the assault on privacy in its broadest sense. The adversaries include surveillance agencies and companies in markets such as targeted advertising, as well as smaller, nefarious players.

Pervasive data collection. As the number of online services and always-on devices grows, potential adversaries can access a universe of personal data quickly expanding beyond browsing history to location, financial transactions, video and audio feeds, genetic data,4 real-time physiological data, and perhaps eventually even brainwaves.8 These adversaries are developing better and better ways to correlate and extract new value from these data sources, especially as advances in applied machine learning make it possible to fill in gaps in users' data via inference (a toy illustration follows these four trends). Sensitive data might be collected by a benevolent party for a purpose that is acceptable to a user, but later fall into dangerous hands due to political pressure, a breach, or other reasons. "Secondhand" data leakage is also growing in prevalence, meaning that one person's actions affect another's private data (for example, if a friend declares a co-location with us, or if a blood relative unveils her genome). The emerging Internet of Things will make things even trickier, soon surrounding us with objects that can report on what we touch, eat, and do.16

Monetization (greed). Political philosophers are observing a drift from what they term having a market economy to being a market society,13 in which market values eclipse non-market social norms. On the Internet, the ability to monetize nearly every piece of information is clearly fueling this process, which is itself facilitated by the existence of quasi-monopolies. A marketplace could someday arise that would seem both impossible and abhorrent today. (For example, for $10: "I know that Alice and Bob met several times. Give me the locations and transcripts of their conversations.") Paradoxically, tools such as anonymous routing and anonymous cash could facilitate such a service by allowing operation from loosely regulated territories or from no fixed jurisdiction at all.

Adaptation and apathy. Users' data curation habits are a complex research topic, but there is a clear generational shift toward more information sharing, particularly on social networks. (Facebook has more than one billion users regularly sharing information in ways that would have been infeasible or unthinkable a generation ago.) Rather than fighting information sharing, users and norms have rapidly changed, and convenience has trumped privacy to create large pockets of data-sharing apathy. Foursquare and various other microblogging services that encourage disclosure of physical location, for example, have led many users to cooperate in their own physical tracking. Information overload has in any event degraded users' ability to curate their data, given the complex and growing challenges of "secondhand" data-protection weakening and inference, as noted previously.

Secret judgment. Traceability and accountability are essential to protecting privacy. Facebook privacy settings are a good example of visible privacy practice: stark deviation from expected norms often prompts consumer and/or regulatory pushback. Increasingly often, though, sensitive-data exploitation can happen away from vigilant eyes, as the recent surveillance scandals have revealed. (National security legitimately demands surveillance, but its scope and oversight are critical issues.) Decisions made by corporations, such as hiring, setting insurance premiums, and computing credit ratings, are becoming increasingly algorithmic, as we discuss later. Predictive consumer scores are one example; privacy scholars have argued they constitute a regime of secret, arbitrary, and potentially discriminatory and abusive judgment of consumers.2
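The gap-filling claim under Pervasive data collection is easy to make concrete. The toy sketch below is our own illustration, not a method from the cited literature: it assumes a simple homophily model (friends tend to share attributes) and infers, via naive Bayes, an attribute the user never disclosed from what the user's friends disclose.

```python
def infer_hidden_attribute(friends_values, prior, homophily=0.7):
    """Toy naive-Bayes 'gap filling': infer a user's undisclosed
    attribute from the values their friends disclose. Model: each
    friend independently shares the user's value with probability
    `homophily`, else draws uniformly from the remaining values.
    """
    values = list(prior)
    posterior = {}
    for v in values:
        p = prior[v]
        for fv in friends_values:
            p *= homophily if fv == v else (1 - homophily) / (len(values) - 1)
        posterior[v] = p
    norm = sum(posterior.values())
    return {v: p / norm for v, p in posterior.items()}

# A user discloses nothing, but seven of eight friends disclose "A":
friends = ["A"] * 7 + ["B"]
print(infer_hidden_attribute(friends, prior={"A": 0.5, "B": 0.5}))
# -> roughly {'A': 0.99, 'B': 0.01}: the user's silence protects little
```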
A Post-Confidentiality Research Agenda
We should prepare for the possibility of a post-confidentiality world, one in which confidentiality has greatly eroded and in which data flows in such complicated ways that social norms are jeopardized. The main research challenge in such a world is to preserve social norms, as we now explain.

Privacy is important for many reasons. A key reason, however, often cited in discussions of medical privacy, is concern about abuse of leaked personal information. It is the potentially resulting unfairness of decision making, for example, hiring decisions made on the basis of medical history, that is particularly worrisome. A critical, defensible bastion of privacy in a post-confidentiality world therefore lies in the fair use of disclosed information.

Fair use is increasingly important as algorithms dictate the fates of workers and consumers. For example, for several years, some Silicon Valley companies have required job candidates to fill out questionnaires ("Have you ever set a regional-, state-, country-, or world-record?"). These companies apply classification algorithms to the answers to filter applications.5 This trend will surely continue, given the many domains in which statistical predictions demonstrably outperform human experts.7 Algorithms, though, enable deep, murky, and extensive use of information that can exacerbate the unfairness resulting from disclosure of private data.
On the other hand, there is hope that algorithmic decision making can lend itself nicely to protocols for enforcing accountability and fair use. If decision making is algorithmic, it is possible to require decision makers to prove they are not making use of information in contravention of social norms expressed as laws, policies, or regulations. For example, an insurance company might prove it has set a premium without taking genetic data into account, even if this data is published online or otherwise widely available. If input data carries authenticated labels, then cryptographic techniques permit the construction of such proofs without revealing the underlying algorithms, which may themselves be company secrets (for example, see Ben-Sasson et al.1). Use of information flow control,12 preferably enforced by software attested to by a hardware root of trust (for example, see McKeen et al.9), can accomplish much the same end.
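As a rough illustration of the label-based approach, consider the sketch below. It is our own simplification, a dynamic per-run audit rather than the static language-based analysis of Sabelfeld and Myers12 or an attested-execution design,9 and every name in it is hypothetical: field reads are logged against policy labels, and a decision that touched a forbidden label is rejected.

```python
class LabeledRecord:
    """Applicant data in which every field carries a policy label.
    Reads are logged so an auditor can check which labels a decision
    actually consumed.
    """
    def __init__(self, fields):  # fields: name -> (value, label)
        self._fields = fields
        self.accessed_labels = set()

    def get(self, name):
        value, label = self._fields[name]
        self.accessed_labels.add(label)
        return value

def audited_decision(decide, record, forbidden_labels):
    """Run `decide`, then verify no forbidden label was read. A real
    system would enforce this inside attested software or via a
    cryptographic proof; here the audit is a simple runtime check.
    """
    result = decide(record)
    used = record.accessed_labels & set(forbidden_labels)
    if used:
        raise PermissionError(f"decision consumed forbidden labels: {used}")
    return result

def set_premium(record):
    # Uses age and smoking status only; never touches genetic fields.
    base = 100.0
    base *= 1.5 if record.get("smoker") else 1.0
    base *= 1.0 + max(0, record.get("age") - 40) * 0.01
    return base

applicant = LabeledRecord({
    "age":    (52, "demographic"),
    "smoker": (False, "medical"),
    "brca1":  ("variant", "genetic"),  # available, but must not be used
})
print(audited_decision(set_premium, applicant, {"genetic"}))  # 112.0
```

One caveat worth noting: the audit is per execution, so a rule that consults genetic data only on some inputs would pass on runs where it does not. That limitation is one reason the literature favors static information-flow analysis12 and hardware-attested enforcement.9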
Statistical testing is an essential, complementary approach to verifying fair use, one that can help identify cases in which data labeling is inadequate, rendered ineffective by correlations among data, or disregarded in a system. (A variety of frameworks exist; for example, see Dwork et al.3)
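As the simplest possible sketch of such testing (ours, far more rudimentary than the cited frameworks3,15), a permutation test can flag outcome disparities between groups that a fair decision pipeline should treat identically, even when no forbidden field is read directly:

```python
import random
from statistics import fmean

def permutation_test(outcomes_a, outcomes_b, trials=10_000, seed=0):
    """Permutation test for a difference in mean outcome between two
    groups, for example, premiums quoted to carriers vs. non-carriers
    of a genetic variant. A small p-value suggests the pipeline
    depends on the attribute, perhaps through correlated proxies.
    """
    rng = random.Random(seed)
    observed = abs(fmean(outcomes_a) - fmean(outcomes_b))
    pooled = list(outcomes_a) + list(outcomes_b)
    extreme = 0
    for _ in range(trials):
        rng.shuffle(pooled)
        a, b = pooled[:len(outcomes_a)], pooled[len(outcomes_a):]
        if abs(fmean(a) - fmean(b)) >= observed:
            extreme += 1
    return extreme / trials

# Hypothetical premiums for two groups that should be indistinguishable:
carriers     = [118, 125, 122, 130, 127, 124]
non_carriers = [101, 99, 104, 98, 102, 100]
print(permutation_test(carriers, non_carriers))  # ~0.002: investigate
```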
A complementary research goal relates to privacy quantification. To substantiate claims about the decline of confidentiality, we must measure it. Direct, global measurements are difficult, but research might look to indirect monetary ones: the profits of the online advertising industry per pair of eyeballs, and the "precision" of advertising, perhaps as measured by click-through rates. At the local scale, research is already quantifying privacy (loss) in such settings as location-based services.14
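For a flavor of what such quantification looks like, here is a minimal sketch of the expected-estimation-error metric used in that line of work:14 location privacy is measured as the average distance between the user's true position and the adversary's guesses, weighted by the adversary's posterior belief. The grid and numbers are invented for illustration.

```python
import math

def location_privacy(posterior, true_location):
    """Location privacy as the adversary's expected estimation error:
    the posterior-weighted distance between the user's true position
    and each location the adversary considers possible.
    """
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    return sum(prob * dist(loc, true_location)
               for loc, prob in posterior.items())

# Adversary's belief after observing an obfuscated check-in on a
# 1-km grid; the user is actually at cell (0, 0).
posterior = {(0, 0): 0.4, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.1}
print(location_privacy(posterior, (0, 0)))  # about 0.64 km of error
```

Higher values mean more privacy; a perfectly informed adversary drives the metric to zero.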
Conclusion
There remains a vital and enduring place for confidentiality. Particularly in certain niches, such as protecting political dissent and circumventing censorship in repressive regimes, it can play a societally transformative role. It is the responsibility of policymakers and society as a whole to recognize and meet the threat of confidentiality's loss, even as market forces propel it and political leaders give it little attention. But it is also incumbent upon the research community to contemplate alternatives to C-PETs, as confidentiality is broadly menaced by technological and social evolution. If we cannot win the privacy game definitively, we need to defend paths to an equitable society. We believe the protection of social norms, especially through fair use of data, is the place to start. While C-PETs will keep being developed and will partially mitigate the erosion of confidentiality, we hope to see many "fair-use PETs" (F-PETs) proposed and deployed in the near future.15
References
1. Ben-Sasson, E. et al. SNARKs for C: Verifying program executions succinctly and in zero knowledge. In Advances in Cryptology–CRYPTO (Springer, 2013), 90–108.
2. Dixon, P. and Gellman, R. The scoring of America: How secret consumer scores threaten your privacy and your future. Technical report, World Privacy Forum (Apr. 2, 2014).
3. Dwork, C. et al. Fairness through awareness. In Proceedings of the 3rd Innovations in Theoretical Computer Science Conference (ACM, 2012), 214–226.
4. Erlich, Y. and Narayanan, A. Routes for breaching and protecting genetic privacy. Nature Reviews Genetics 15, 6 (2014), 409–421.
5. Hansell, S. Google answer to filling jobs is an algorithm. New York Times (Jan. 3, 2007).
6. Juels, A. RFID security and privacy: A research survey. IEEE Journal on Selected Areas in Communications 24, 2 (Feb. 2006).
7. Kahneman, D. Thinking, Fast and Slow. Farrar, Straus and Giroux, 2012, 223–224.
8. Martinovic, I. et al. On the feasibility of side-channel attacks with brain-computer interfaces. In Proceedings of the USENIX Security Symposium (2012), 143–158.
9. McKeen, F. et al. Innovative instructions and software model for isolated execution. In Proceedings of the 2nd International Workshop on Hardware and Architectural Support for Security and Privacy, Article no. 10 (2013).
10. Nissenbaum, H. Privacy in Context: Technology, Policy, and the Integrity of Social Life. Stanford University Press, 2009.
11. North, M.J. Hippocratic oath translation. U.S. National Library of Medicine, 2002.
12. Sabelfeld, A. and Myers, A.C. Language-based information-flow security. IEEE Journal on Selected Areas in Communications 21, 1 (2003), 5–19.
13. Sandel, M.J. What Money Can't Buy: The Moral Limits of Markets. Macmillan, 2012.
14. Shokri, R. et al. Quantifying location privacy. In Proceedings of the IEEE Symposium on Security and Privacy (2011), 247–262.
15. Tramèr, F. et al. Discovering unwarranted associations in data-driven applications with the FairTest testing toolkit, 2016; arXiv:1510.02377.
16. Weber, R.H. Internet of Things: New security and privacy challenges. Computer Law and Security Review 26, 1 (2010), 23–30.
Jean-Pierre Hubaux (jean-pierre.hubaux@epfl.ch) is a professor in the Computer Communications and Applications Laboratory at the Ecole Polytechnique Fédérale de Lausanne in Switzerland.

Ari Juels (juels@cornell.edu) is a professor at Cornell Tech (Jacobs Institute) in New York.
We would like to thank George Danezis, Virgil Gligor, Kévin Huguenin, Markus Jakobsson, Huang Lin, Tom Ristenpart, Paul Syverson, Gene Tsudik, and the reviewers of this Viewpoint for their many generously provided, helpful comments, as well as the many colleagues with whom we have shared discussions on the topic of privacy. The views presented in this Viewpoint remain solely our own.
Copyright held by authors.