Privacy is not Access Control (But then what is it?)

February 13, 2010 at 3:03 am 10 comments

In my previous article on the Google Buzz fiasco, I pointed out that the privacy problems were exacerbated by the fact that the user interface was created by programmers. In this post I will elaborate on that theme and provide some constructive advice on privacy-conscious design, especially for social networking.

The problem I’m addressing is that as far as computer scientists and computer programmers are concerned, privacy is a question of access control, i.e., who is allowed to look at what. Unfortunately, in the real world, that is only a tiny part of what privacy is about. Here are three examples to make my point:

1. Dummy cameras. Consider a thought experiment: suppose the government installed a bunch of cameras all over a public park along with prominent signs announcing 24×7 surveillance. The catch, however, is that the cameras have not been turned on. Has anyone’s privacy been violated?

From the computer science perspective, the answer is no, because no one is actually being observed, nothing is being recorded and no data is being generated. But common sense tells us that something is wrong with that answer. The cameras cause people considerable discomfort. The surveillance, real or imaginary, changes their behavior.

This hypothetical scenario is adapted from Ryan Calo’s paper, which analyzes in detail the “sensation of being observed.”

2. Aggregation changes the equation. Remember the uproar when Facebook released News Feed? No new information was revealed to your friends that wasn’t accessible to them before; it was just that the News Feed made it dramatically easier to observe all your activities on the site.

Of course, it goes both ways: the technology in turn changed people’s expectations; it is now hard to imagine not having a feed-like system, whether on Facebook or another social network. Nevertheless, I often see people putting something into their profile, deciding a few moments later that they didn’t want to share it after all, and realizing that it was too late because the information has already been broadcast to their friends.

3. Everyone-but-X access control, which I described in an earlier article, shows in a direct way how access control fails to capture privacy requirements. From the traditional CS security perspective, the ability for a user to make something visible to “everyone but X” is meaningless: X can always create a fake account to get around it.

But a use case should immediately convince you that everyone-but-X is a good idea: your sibling is on your friends list and you want to post about your sex life. It’s not that you want to prevent your sibling from ever gaining access to the post; rather, both of you would prefer that they didn’t see it.
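To make the contrast concrete, everyone-but-X is trivial to express in code. Here is a minimal sketch (the data model and names are mine, purely illustrative): the point is that the check enforces a shared social preference, not a security boundary.

```python
def visible_to(viewer: str, post: dict) -> bool:
    """Everyone-but-X visibility: show the post to everyone except
    the people explicitly listed. This is deliberately NOT
    adversarially robust -- X could always register a second
    account. The goal is to respect a mutual social preference,
    not to defeat a determined adversary."""
    return viewer not in post["hidden_from"]

# A post hidden from one specific person on the friends list.
post = {"text": "TMI about my weekend", "hidden_from": {"my_sibling"}}

visible_to("college_friend", post)  # True
visible_to("my_sibling", post)      # False
```

Traditional security analysis dismisses this check precisely because it is circumventable; the argument here is that its value lies in shaping behavior among honest participants.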

Access control is not the goal of privacy design; it is at best one of many tools. Human behavior is the key consideration. The dummy cameras were bad because they affected people’s behavior in a detrimental way. News Feed was bad because it introduced major new privacy consequences for the behaviors that people were accustomed to on the site. (However, I would argue that the dramatic increase in usefulness trumped the privacy drawbacks.) Everyone-but-X privacy is good because it allows people to carry over to the online setting behaviors that they are used to in the real world.

It is impossible to fully analyze the privacy consequences of a design decision without studying its impact on actual user behavior. There is no theoretical framework to ensure that a design decision is safe — user testing is essential. Going back to Google Buzz, a beta period or a more gradually phased roll-out would have undoubtedly been better.

To stay on top of future posts, subscribe to the RSS feed or follow me on Twitter.



10 Comments

  • 1. Google Buzz, Social Norms and Privacy « 33 Bits of Entropy  |  February 14, 2010 at 4:49 am

    […] have a follow-up article with advice on privacy-conscious […]

  • 2. Monica  |  February 14, 2010 at 8:14 pm

    Yes, especially to the sibling example. The threat model is not always about malicious adversaries.

  • 3. Vincent  |  February 16, 2010 at 4:14 am

    In my opinion, the first example is still a question of access control. People provide access to their data, even if that access is not used.

  • 4. cowherd  |  February 18, 2010 at 11:51 pm

    Please provide an example of data that you consider confidential but not private, and another example of data that you consider private but not confidential.

  • 5. WH  |  February 19, 2010 at 2:09 pm

    It’s not just privacy; there seems to be a disconnect between how developers and non-technical people see “security”, too.

    At a panel session I went to, Google’s Privacy Engineering Lead Alma Whitten recently said that when security people told her that people don’t care about security, she told them one security concern was stalking – and the security researchers said, what’s that got to do with security?!

  • 6. How Google Docs Leaks Your Identity « 33 Bits of Entropy  |  February 22, 2010 at 5:39 pm

    […] lesson from this bug is that access control in social networking can be tricky. I’ve written before that privacy in social networking is about a lot more than access control, and that theory […]

  • 7. Álvaro Del Hoyo  |  March 19, 2010 at 8:49 am

    Would love a post regarding Everyone-but-X access control

  • […] To avoid nasty surprises, developers building websites need to think carefully about privacy and user behavior when implementing any of these […]

  • […] that leads to the formulation of privacy as an access-control problem, something that I’ve criticized; the Geni blog post explicitly mentions this as their formulation of privacy. […]

  • 10. brian  |  February 15, 2011 at 9:03 pm

    become a virtual recluse
    – or net ninja

    avoid online social networking

    launder and hide your identity

    avoid traceable transactions
    use cash to secure / top up pre-payment debit cards and SIM cards

    use proxy servers and avatars – never upload photos of yourself – ask friends not to publish pictures of you

    if necessary, obscure your face in public and private gatherings – wear dark shades

    block snooping sites and tracking cookies.
    resist requests for confirmation of id
    never provide biometric data,

    represent yourself as someone else by using pseudonyms or surrogates for communication (other people’s accounts),
    change this id at every communication to maintain anonymity


    just accept that in the 21st century we are now all celebrities without the perks and safeguards of wealth

    – make the most of this limelight
    while distracting unwanted attention from self by other means – at opportunistic moments.




I’m an associate professor of computer science at Princeton. I research (and teach) information privacy and security, and moonlight in technology policy.

This is a blog about my research on breaking data anonymization, and more broadly about information privacy, law and policy.

For an explanation of the blog title and more info, see the About page.
