Surprise! Congress failed to address Facebook’s fundamental flaw
Of all the things Mark Zuckerberg said this week before Congress, I found this the hardest to believe:
“I would hope that what we do with data is not surprising to people.”
Everyone is surprised when they find out what Facebook does with data. I’ve been writing about data hacks and privacy for more than a decade, and I’m here to tell you that surprise is the very thing, and often the only thing, that really makes consumers mad. People weren’t really angry that Equifax was hacked; they were surprised, and angry, to learn that Equifax had all their personal information and there was nothing they could do about it. They have been surprised, and angered, to learn that companies with names like ChoicePoint or Acxiom exist, and that they hoard consumers’ most sensitive details.
Americans weren’t mad that politicians used online advertisements to support or defeat a political candidate. They were mad that somehow, a company like Cambridge Analytica had even heard of them, let alone had them in buckets like “suburban housewife.” And they were really mad that Facebook put them in these buckets, which most of them never contemplated as they were sharing cute puppy pictures.
Facebook, like most Internet firms, doesn’t have a trust problem. It has a surprise problem. The element of surprise is the entire foundation of Facebook’s business, and indeed, it’s the foundation of all targeted advertising. Programmers are natural lurkers. They prefer to watch what you do, and use that information to predict what you might buy — ideally, a few moments before you realize you might buy it — and then connect you with someone willing to sell you that thing. This odd triangular arrangement means you are not Facebook’s customer. Advertisers are Facebook’s customers. You are merely the raw material.
I was surprised that the Cambridge Analytica story created the stir that it did. There’s very little in the saga that hasn’t happened before: app makers tricking users; Facebook urging users to overshare, then oversharing with its paying customers; Facebook’s cavalier attitude towards difficult people who care about privacy.
Mark Zuckerberg wanted to make the hearings about topics like consumer “control” and the “sale” of data. All along, I wanted someone to confront him with the element of surprise.
Consumer lawyer and privacy expert Joel Winston is blunt about the surprise in a column for NBC: “On the basis of ten ‘Likes,’ researchers from Cambridge have demonstrated that Facebook knows you better than your work colleagues. After 70 ‘Likes,’ Facebook knows you better than your friends. Accumulate 150 ‘Likes,’ researchers showed, and Facebook knows you better than your parents. Complete 300 ‘Likes’ and Facebook knows you better than your spouse or partner. Record more than 500 honest ‘Likes’ and Facebook can even know you better than you know yourself.”
Your innermost thoughts and urges are a natural resource. Right now, that resource is being exploited by firms like Facebook, just as once upon a time, corporate giants used and abused water and air with little or no consequence.
It doesn’t have to be that way. Surprise doesn’t have to be the business model. Last week, I talked to Ryan Sandler, co-founder of a just-launched start-up called Truework.com. It’s taking on Equifax’s shockingly lucrative but very surprising service called The Work Number, which collects and shares paystub information on tens of millions of Americans right from their human resources departments. Sandler, who worked at LinkedIn before founding the company, says part of his inspiration came from an exposé I wrote about The Work Number several years ago. He says it helped him design the business as an anti-surprise.
Consumers who have a good reason to allow a business to verify their employment, or even their salary — a potential landlord, for example — get a notice that’s very specific: It says “John Smith from Bank of America is requesting access to your verified employment and salary info.” Then, it shows the consumer the exact same report that Bank of America would see. And the consumer gets to say yes or no, or critically, gets a chance to fix any errors.
“We decided to turn (the) model on its head,” Sandler told me. “This is a business that’s been stuck in the past and very abusive of millions (of people) and we wanted to find a better solution.”
This model is different. Very different. While consumers have the right to see their “credit report” every year for free, most don’t know (SURPRISE!) that the credit report they see is NOT the same report that lenders or other businesses see. This is true for almost every data-driven interaction consumers have. You see one packaged version of your data; companies see something very different.
Sure, you can untag yourself from photos. Yes, you can like or unlike things. You can even visit your activity log and see all those 500 likes. But you cannot see the digital person Facebook has turned you into, and yes, sold to the highest bidder. Facebook doesn’t sell its raw data, as Zuckerberg repeated over and over. It sells something far more obscure, more surprising, and more frightening.
I write a lot about consumer scams and gotchas, and I’ve interviewed a lot of gotcha-wielding companies who always explain away their practices with some variation of this sentiment: “Hey, those fees are disclosed on this (obscure) webpage.” In return, I always say that when you take someone’s money, and they are surprised, that’s usually called theft. And if you are afraid that by being more clear with consumers, you’ll earn less money, that’s a you problem.
This same principle applies here. Until Facebook and other companies get out of the business of surprising people, we will continue to have these issues. No token “control” or privacy tune-up tool can change this simple reality: When you don’t pay for the product, you are the product. And when you are the product, you’re probably going to be unpleasantly surprised at how you are used.
originally posted at BobSullivan.net