The Problem with “Fair Information Practices” on the Web


One of the reasons I think EFF made a mistake in endorsing new privacy regulations is that there’s a huge gap between the sensible-sounding rhetoric of privacy legislation proposals and the details of what’s actually being proposed. The report EFF endorsed offers no fewer than 16 “points” for the design of privacy regulations (although note that “these recommendations are not exhaustive”). Here’s one of them:

“Sensitive information should not be collected or used for behavioral tracking or targeting. Sensitive information should be defined by the FTC and should include data about health, finances, ethnicity, race, sexual orientation, personal relationships and political activity.”

I think it’s instructive to try to puzzle out how a regulatory scheme like this would affect sites like Facebook and Twitter. Facebook, by its nature, involves collecting sensitive information about such subjects as when a user logs on, who the user communicates with, what political organizations and causes a user supports, when a user begins or ends a romantic relationship, and so forth. The document advocates that “data collected on users who consent must not be retained beyond a period of three months.” This would apparently require that this kind of information be deleted and re-entered by the user every three months.

The report also demands that “with any change of purpose of the data the individual must be alerted and given an option to refuse collection or use.” Facebook introduces new features that use personal data pretty regularly, so we’re talking about users being repeatedly presented with legalistic descriptions of each new feature. The primary result is likely to be irritated users, the vast majority of whom will reflexively click the “OK” button without reading the disclosures.

The report would outlaw any “contest that seeks the collection of consumer information in exchange for the chance to win a prize,” apparently even with user consent.

It would also prohibit “behavioral targeting” of anyone under 18. So the Democratic Party, the Catholic Church, and the Human Rights Campaign couldn’t buy ads targeted toward 17-year-olds who had identified themselves as, respectively, liberal, Catholic, and gay. This seems unduly restrictive.

“Personal data” includes information about “personal relationships,” and the report advocates mandatory advance disclosure of all the ways such data will be used. That would seem to outlaw Twitter’s open social graph APIs, which allow arbitrary third parties to access information about your “personal relationships” without oversight from Twitter.
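To make the point concrete, here is a minimal sketch of the kind of call an arbitrary third party could make against Twitter’s then-public REST API to walk a user’s social graph. The exact endpoint, parameters, and response format are assumptions for illustration only; the old v1 API has since been retired, and current Twitter APIs require authenticated access.

```python
# Illustrative sketch only: at the time this was written, Twitter's v1 REST API
# exposed follower lists for public accounts without any authentication.
# The endpoint and response shape below are assumptions for illustration;
# the v1 API has since been retired and today's APIs require authentication.
import json
import urllib.request

def fetch_follower_ids(screen_name):
    """Fetch the numeric IDs of a public account's followers (hypothetical v1-era call)."""
    url = f"https://api.twitter.com/1/followers/ids.json?screen_name={screen_name}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)  # assumed to be a JSON array of user IDs

if __name__ == "__main__":
    # Any third party could run this against any public account,
    # without the user (or Twitter) reviewing that particular use of the data.
    ids = fetch_follower_ids("example_user")
    print(f"{len(ids)} follower IDs retrieved")
```

The point of the sketch is simply that the “personal relationships” data flows to whoever asks for it, which is hard to square with a rule requiring advance disclosure of every use.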

Maybe I’m misunderstanding some of these proposals; it’s a 13-page document that still manages to be vague about many key details. But I think that’s the point: drawing up a set of rules for the use of personal information by every site on the web today, and every site that might be created in the future, is a tall order. Websites use personal information in a huge number of different ways; one-size-fits-all rules will inevitably be inadequate. Even in the best case, people crafting a policy like this are going to make some mistakes and create headaches for the world’s web developers. And of course, the sausage-making process on Capitol Hill is not the best case. Maybe the brainiacs at EFF could craft legislation that protects consumers’ privacy without unduly burdening firms like Facebook. But the people with ultimate authority over the legislation will be members of Congress, most of whom have never used a social networking site in their lives.

The proposal EFF endorsed is based on the “fair information practices” that are beloved by privacy zealots. These rules were designed in the 1970s, when the technologies for large-scale data collection were affordable only to relatively large companies, and the technology severely limited the types of information that could be collected and the ways it could be used. Given these limitations, it wasn’t crazy to think policymakers could catalog and regulate every conceivable use of personal information.

That top-down world is gone. It has been replaced by a bottom-up world in which the technologies for data collection are cheap and ubiquitous. There’s a lot more data being collected by a lot more organizations in a lot more different ways. It’s not unreasonable to be concerned about the potential for abuses. But it’s delusional to think that we can put the genie back in the bottle. We’re not going to get back to a world where government bureaucrats can prospectively regulate every use of personal data. And legislating as if it’s still the 1970s is a recipe for creating laws that are completely out of touch with the real world.


4 Responses to The Problem with “Fair Information Practices” on the Web

  1. To be fair to the EFF, the report advocates prohibition of gathering data *for the purposes of behavioral targeting*, which it defines at the end to mean collecting/compiling data for advertising, and specifically excludes contextual advertising from the definition.

    To me, this implies that Facebook is largely unaffected by these rules. The focus is on stopping them from selling advertising that is tied to an analysis of the aggregate of the data on a particular person.

    The difficulty with this issue is that there are clear benefits to allowing targeting, but also serious drawbacks.

    I think the measures in this report need a lot more deliberation and analysis on how they will work practically, but the concerns they raise about people being denied services based on analysis of their activities (without their consent) certainly need addressing.

  2. Clinton,

    Does it? I don’t think it’s clear. I’d be less critical (although probably still opposed) if the report clearly advocated a safe harbor for firms that don’t engage in targeted advertising. But there are lots of statements that aren’t clearly limited to behavioral advertising. For example, on page 8 it says “Data collected on users who consent must not be retained beyond a period of three months.” I don’t see any caveats limiting this to firms engaging in behavioral advertising. Maybe that’s just sloppy draftsmanship on their part, but I suspect that at least some of the coalition partners actually want to see regulations that go well beyond behavioral advertising.

  3. The report could use a good editor, but given that the point on page 8 comes under the subheading “Implementation ideas”, my impression is that they have their views on what *shouldn’t* happen, but haven’t worked out what should.

    My hope is that browsers will integrate anonymous browsing (Tor-style) to defeat tracking, and social peer-to-peer technology so social networking can be decentralized. This wouldn’t solve all the problems highlighted in the report, but would mitigate them significantly.

  4. “my impression is that they have their views on what *shouldn’t* happen, but haven’t worked out what should.”

    Well OK, but I think that proves my point. If they can’t decide among themselves what should happen, it seems like a bad idea to demand that Congress enact legislation that will apply to hundreds of millions of people.
