Consumers typically don’t want to pay for services that the internet has taught them should be “free”. Social networking, email, calendars, search, messaging…these are all “free” on a cash basis, but have a major cost to your privacy.
The best analogy I have heard to describe how these services work was in an episode of Sam Harris’ podcast with Jaron Lanier.
To paraphrase: imagine if, when you viewed an article on Wikipedia, it customised the content based on thousands of variables about you: where you are, who you are friends with, what websites you visit and how often, how old you are, your political views, what you read recently, what your recent purchases are, your credit rating, where you travel, what your job is and many other things you have no idea about. You wouldn’t know the page had changed, or how it differed from anyone else’s, or even whether any of the inferred characteristics were true.
That’s how Google and Facebook work, all in the name of showing ads.
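To make the analogy concrete, here is a minimal sketch of that kind of personalisation: content variants scored against a profile of inferred attributes the user never sees. All the feature names, weights and profile values below are invented purely for illustration; real ad-targeting systems are vastly more complex and entirely opaque.

```python
def rank_variants(variants, profile):
    """Rank content variants against a user's inferred profile.

    variants: list of (variant_id, {feature: weight}) pairs
    profile:  {feature: inferred_value} built from tracking data
    """
    def score(features):
        # Dot product of the variant's feature weights with the
        # user's inferred attribute values; unknown features score 0.
        return sum(w * profile.get(f, 0.0) for f, w in features.items())

    return sorted(variants, key=lambda v: score(v[1]), reverse=True)


# A profile of inferred characteristics the user never sees
# (entirely made up for this example):
profile = {"age_25_34": 1.0, "political_left": 0.7, "recent_travel": 0.2}

variants = [
    ("neutral_copy",   {"age_25_34": 0.1}),
    ("activist_angle", {"political_left": 0.9, "age_25_34": 0.3}),
    ("travel_hook",    {"recent_travel": 0.8}),
]

best = rank_variants(variants, profile)[0][0]
# This user is shown the "activist_angle" variant; a different
# profile would silently get a different page.
```

The point of the sketch is the asymmetry: the scoring inputs and weights live entirely on the platform's side, so the reader has no way to know which variant they received or why.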
I don’t have a problem with trading privacy for free services per se. The problem is the lack of transparency about how these systems work, and the resulting lack of understanding by those making the trade-off (ToS;DR). For the market mechanism to work, you have to be well informed.
We’re starting to see this with how governments are trying to force the big platforms to police the content they host, while leaving the details to the platforms themselves. Naturally, they are applying algorithms and technology to the problem, but how the rules are applied is completely opaque. There’s no way to appeal. By design, the heuristics constantly change and there’s no way to understand how they have been applied.
Policing content is a problem that has been solved in the past through the development of Western legal systems and the rule of law. The separate powers of the state – government, judiciary and legislature – counter-balance each other with various checks and stages to allow for change. It’s not perfect, but it has had hundreds of years of production deployment and new version releases!
What has changed is the scale. And the fact that governments are delegating the responsibility of the implementation to a small number of massive, private firms.
It’s certainly not that the government could do a better job of solving this. Indeed, they would likely make even more of a mess of it (e.g. the EU cookie notice laws). But private companies can’t be allowed to do it by themselves.
The solution requires open debate, evidence based review, a robust appeals system, transparency into decision making and the ability for the design to be changed over time. But it also needs to be mostly automated and done at internet-scale. Unfortunately, right now I’m not sure such a solution exists.
Regulation always favours the large incumbents, stifling innovation and freedom of expression. There tends to be a preference for huge, wide-ranging regulatory schemes which try to do everything in one go. Instead, perhaps it is time for the legislative process to adopt a more lightweight, agile approach with a specific, long-term goal that successive governments can work towards: making small changes, focusing on maximum transparency and taking the time to measure and iterate. The tech companies need to apply good engineering processes to how they develop their social policy, in public.
But without any incentive to do so, we risk ending up with a Kafkaesque system that might achieve the goal at a macro level, but will have many unintended consequences.