Lost among the noise of Facebook’s rumoured plans to launch a kids’ network, a quieter pressure is developing as consumers start to realise the value of their data – the pressure to regulate social media.
In his article Rethinking Privacy in an Era of Big Data, New York Times writer Quentin Hardy raises some of the issues around the data being collected about us.
One of the big areas is triangulation – building a picture of somebody from seemingly unrelated sets of data. Hardy explains it with the example of somebody who might be looking for a job:
There are other ways in which we can lose control of our privacy now. By triangulating different sets of data (you are suddenly asking lots of people on LinkedIn for endorsements on you as a worker, and on Foursquare you seem to be checking in at midday near a competitor’s location), people can now conclude things about you (you’re probably interviewing for a job there) that are radically different from either set of public information.
The key word of course is “conclude” – we base an assumption on what we think we know. It could turn out that those LinkedIn endorsements are part of a performance review and the competitor’s location is right next door to a hot new lunch spot.
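The triangulation Hardy describes can be sketched as a simple join over two activity feeds. Everything here – the users, the data, the thresholds and the inference rule – is invented for illustration, and the sketch deliberately shows how the same signals that suggest job hunting could just as easily be the performance review and lunch spot above:

```python
# Illustrative sketch of triangulation: combining two unrelated,
# individually harmless data sets to draw a (possibly wrong) conclusion.
# All names, data, and thresholds are hypothetical.

# Hypothetical public signals, keyed by user.
linkedin_endorsement_requests = {"alice": 12, "bob": 1}  # requests this month
foursquare_checkins = {
    "alice": ["competitor_hq_cafe"],  # midday check-in near a competitor
    "bob": ["gym"],
}

def infer_job_hunting(user):
    """Guess whether a user is interviewing, based on two weak signals.

    Each signal alone says little; together they suggest a conclusion
    that may still be wrong (a performance review plus a hot lunch spot
    next to the competitor would trigger the same inference).
    """
    many_endorsements = linkedin_endorsement_requests.get(user, 0) >= 10
    near_competitor = any(
        "competitor" in place for place in foursquare_checkins.get(user, [])
    )
    return many_endorsements and near_competitor

print(infer_job_hunting("alice"))  # True: the triangulated guess fires
print(infer_job_hunting("bob"))    # False: neither signal is present
```

The point of the sketch is that the inference is cheap to compute and impossible for the subject to contest – the data never says *why* alice asked for endorsements.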
We should also keep in mind that the value of this data is asymmetric: to a third party it is worth little, if anything, but to the individual a wrong conclusion could mean losing a job, among other major consequences.
A good example of this is the story of how a UK hospital trust lost highly sensitive health records of thousands of patients, including those being treated for HIV.
The trust ended up being fined £325,000, but that fine is trivial compared to the cost to an individual of just one of those records being released.
Fines are a lousy way of enforcing privacy anyway, as the financial penalties are simply passed on to shareholders or taxpayers.
The only meaningful sanction for failures like the Brighton General Hospital breach is to hold individuals, particularly managers, personally responsible.
As we saw in the successive Sony security breaches last year, most organisations aren’t interested in holding their senior managers responsible for even the most egregious data failures.
This failure of the corporate sector to protect consumer data will almost certainly drive calls for government regulation and sanctions.
Microsoft researcher Danah Boyd flags this regulation issue in Quentin Hardy’s New York Times piece: “Regulation is coming,” she says. “You may not like it, you may close your eyes and hold your nose, but it is coming.”
Danah also makes the important point that users – particularly kids – have developed tactics to obscure their ‘digital footprints’.
For Danah, and others trying to understand what is happening online, this causes a problem: “When I started doing my fieldwork, I could tell you what people were talking about. Now I can’t.”
These tactics of creating dummy social media profiles and using euphemisms are a huge threat to the business plans of social media services and the “identity services” desired by Google’s Eric Schmidt.
As data becomes less reliable, or more difficult to triangulate, its value to advertisers falls.
It may well be that regulation of social media and web services ends up not being necessary as users become more net savvy. For medical and other personal data though, it’s clear we have to rethink the way we use and store it.