Mark Zuckerberg did everything in his power to avoid Facebook becoming the next MySpace – but forgot one crucial detail…
No one likes a lying asshole
Comment Let's get one thing straight right off the bat: Facebook, its CEO Mark Zuckerberg, and its COO Sheryl Sandberg, and its public relations people, and its engineers have lied. They have lied repeatedly. They have lied exhaustively. They have lied so much they've lost track of their lies, and then lied about them.
For some reason, in an era where the defining characteristic of the President of the United States is that he lies with impunity, it feels as though everyone has started policing the use of the word "lie" with uncommon zeal. But it is not some holy relic, it is a word, and it has a definition.
1 : to make an untrue statement with intent to deceive
2 : to create a false or misleading impression
By any measure, Facebook as an organization has knowingly, willingly, purposefully, and repeatedly lied. And two reports this week demonstrate that the depth of its lying was even worse than we previously imagined.
Before we dig into the lies, though, it's worth asking the question: why? Why has the corporation got itself into this position, and why does it have to be dragged kicking and screaming, time and again, to confront what it already knows to be true?
And the answer to that is at the very heart of Facebook, it goes to the core of Mark Zuckerberg's personality, and it defines the company's corporate culture: it is insecure. And it has good reason to be.
The truth is that Facebook is nothing special. It is a website. A very big and clever website but a website that is completely reliant on its users to post their own content. Those users don't need Facebook and they could, in a matter of seconds, decide to tap on a different app and post their thoughts and updates there, instead. If enough people make that decision, the company collapses. All 340 billion dollars of it.
Mark Zuckerberg knows that all too well, and as internal emails handed over to the British Parliament and then published make clear, the top tier of Facebook was highly focused on that question of existential dread: how do we avoid becoming the next MySpace, Geocities, Google Plus, or Friendster?
With thousands of people working underneath them, the world's largest companies knocking at their door with blank checks for advertising, and the globe's political leaders inviting them to meetings, Facebook tasted greatness, but couldn't shake a huge question underneath it all: how does Facebook survive once the novelty wears off?
And the answer was the smart one: make yourself a part of the digital ecosystem. Yes, Facebook was completely reliant on its users, but everyone else wanted those users, too, and while it had them, the corporation needed to make sure it became enmeshed in as many other systems as possible.
Facebook became the savvy businessman making sure all his money and resources weren't tied up in one market: diversify, Mark! And that became the driving force behind every subsequent strategic decision, while the rest of the company focused on making Facebook a really good product – making it easy to do more, post more, interact more.
And so, we had music service Spotify granted access to Facebook users' private messages, once users had agreed to link their Spotify and Facebook accounts. Why on Earth would Spotify want to read people's private messages?
Easy: it is a huge, tasty dataset. You could find out what bands people are excited about, and send them notices of new albums or gigs. You could see what they think of rival services, or the cost of your service. People were encouraged to message their pals on Facebook through Spotify, letting them know what they were listening to. All in all, it was access to private thoughts: companies spend small fortunes paying specialist survey companies for these sorts of insights.
Likewise Netflix. It had access to the same data under a special program Facebook ran with other monster internet companies and banks, in which they were granted extraordinary access to millions of people's personal data.
Facebook cut data-exchange deals with all sorts of companies based on this premise: give them what they want, and in return they would be hauled onto Zuckerberg's internet reservation.
For example, Yahoo! got real-time feeds of posts by users' friends – reminding us of Cambridge Analytica gathering information on millions of voters via a quiz app, and using it to target them in contentious political campaigns in the US and Europe.
Microsoft's Bing was able to access the names of nearly all Facebook users' friends without permission, and Amazon was able to get at friends' names and contact details. Russian search engine Yandex had Facebook account IDs at its fingertips, though it claims it didn't even know this information was available. Facebook at first told the New York Times that Yandex wasn't a partner, and then told the US Congress it was.
Crossing the line
Allowing large companies to vacuum up users' profiles, their contacts lists, and their friends' profiles, became a running theme, and for the antisocial network, it all worked: the data flowed. In some cases, it kept flowing even when the sharing agreements were supposed to have ended.
But then things took a darker turn. The users and privacy groups started asking questions. Facebook's entire strategy started looking shaky as people decided they should have control over what is done with their private data. In Europe, a long debate led to solid legislation: everyone in the EU would soon have a legal right to control their information and, much worse, organizations that didn't respect that could face massive fines.
Facebook started cutting shadier and shadier deals to protect its bottom line. Its policy people started developing language that carefully skirted around reality; and its lawyers began working on semantic workarounds so that the Silicon Valley titan could make what looked like firm and unequivocal statements on privacy and data control, but in fact allowed things to continue on exactly as they had. What was being shared was not always completely clear.
The line was crossed when Facebook got in bed with smartphone manufacturers, such as Apple, Amazon, BlackBerry, Microsoft, and Samsung. Facebook secretly gave the device makers access to each phone user's Facebook friends' profiles, when the handheld was linked to its owner's account, bypassing protections.
And you know how you can turn off "location history" in the Facebook app, and you can go into your iPhone's settings and select "never" for the Facebook app when it comes to knowing your location? And you can refuse to use Facebook's built-in workaround where you "check in" to places – at which point it will re-grant itself access to your location with a single tap?
Well, you can do all that, and still Facebook will know where you are and sell that information to others.
To which the natural question is: how? Well, we have what we believe to be the technical answer. But the real answer is: because it lies. Because that information is valuable to it. Because that information forms the basis of mutually reinforcing data-sharing agreements with all the companies that could one day kill Facebook by simply shrugging their shoulders.
That is how Sandberg and Zuckerberg are able to rationalize their lies: because they believe the future of the entire company is dependent on maintaining the careful fiction that users have control over their data when they don't.
Here's a personal example of how these lies have played out. Until recently, your humble Reg vulture lived next door to a man called Stan. Stan had spent his whole life in Oakland, California. He was a proud black man in his 70s who lived alone. This reporter moved next door to him having spent his entire life up until that moment not in Oakland; a white man in his 30s. To say we had no social connections in common would be an understatement. The only crossover in friends, family, culture, and hangouts was the occasional conversation we had in the street with our neighbors.
He had good taste in music. And I know that in the same way I knew he had an expensive and powerful stereo system. But we didn't even go to the same gigs, because most of the music he played was from artists long since dead.
Despite all this, Facebook would persistently suggest that I knew Stan and should add him as a friend on Facebook. The same happened to my wife. I took this as a sign I needed to tighten up my privacy settings but even after making changes cutting Facebook off from my daily habits, it still recommended him as a friend. The only thing that finally stopped it? Deleting the Facebook app from my phone.
Sensing a story, and in my capacity as a tech reporter, I started asking Facebook questions about this extraordinary ability to know who I lived next to when it didn't have access to my location. And the company responded, repeatedly, that it doesn't. You have control over your data. You can choose what Facebook can see and do with that data. Facebook does not gather or sell data unless its users agree to it.
Except, of course, the opposite was true. It was a lie. And Facebook knew it. It had in fact gone to some lengths to make sure it knew where all its users were.
Precisely how it manages to say one thing and do the opposite is not yet clear, but we are willing to bet it is a combination of two factors. One: its app stores and sends several data points that can be used to infer location, such as your broadband IP address and/or Wi-Fi and Bluetooth network identifiers. Your cable broadband IP address, for example, can often be narrowed down to a relatively precise location, such as a street or neighborhood, especially if you have a fixed IP address at home.
At this point, using Stan's location from his IP address or from his phone's app, Facebook could work out that we lived next to each other, or at least were near each other a lot, and thus might be friends.
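We obviously don't know Facebook's actual pipeline, but this kind of inference is trivial to sketch. Here is a minimal, entirely hypothetical Python illustration (the prefix-to-neighborhood table, the user records, and the function names are all invented for this example) of how network metadata alone could surface "people you may know" pairs:

```python
# Hypothetical sketch: inferring "people you may know" from network
# metadata alone, with no GPS access. The prefix table and user records
# below are invented; real services license far finer-grained databases.
from ipaddress import ip_network, ip_address
from itertools import combinations

# Assumed geolocation table mapping ISP address blocks to rough areas.
PREFIX_TO_AREA = {
    ip_network("203.0.113.0/28"): "Oakland, 40th St block",
    ip_network("198.51.100.0/24"): "San Francisco, Mission",
}

def area_for(ip: str):
    """Return the rough area an IP address resolves to, if known."""
    addr = ip_address(ip)
    for net, area in PREFIX_TO_AREA.items():
        if addr in net:
            return area
    return None

def suggest_pairs(users: dict):
    """Yield pairs of users whose home IPs land in the same area."""
    located = {name: area_for(ip) for name, ip in users.items()}
    for a, b in combinations(users, 2):
        if located[a] and located[a] == located[b]:
            yield (a, b, located[a])

users = {
    "reporter": "203.0.113.5",   # same residential address block...
    "stan": "203.0.113.9",       # ...as the neighbor next door
    "stranger": "198.51.100.77", # different city entirely
}
print(list(suggest_pairs(users)))
```

Two home connections on the same ISP block is all it takes: no location permission ever needs to be granted for the correlation to work.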
Control is an illusion
Which brings us to the second factor: with the news that Facebook signed dozens of data-sharing agreements with large tech companies, it seems increasingly likely that Facebook was not gathering my location data directly to figure out where I was, but was pulling in data from others, perhaps mixing in my home broadband IP address's geolocation, and correlating it all to work out relationships and whereabouts.
We don't yet know what precise methods Facebook uses to undercut its promises, but one thing is true: the company has made untrue statements, with intent to deceive, to this reporter and to many other reporters, users, lawmakers, federal agencies, and academics. It has created false or misleading impressions. It has lied. And it has done so deliberately. Over and over again.
And it is still lying today. Faced with evidence of its data-sharing agreements in which – let's not forget this – Facebook provided third parties with access to people's personal messages and, more importantly, to their contacts lists and friends' feeds, the company claims it broke no promises, because it defined the outfits it signed agreements with as "service providers." And so, according to Facebook, it didn't break its pact with the US government's trade watchdog, the FTC, not to share private data without permission, nor the agreements it has with its users.
Facebook also argues it had clearly communicated that it was granting apps access to people's private messages, and that users had to link their Spotify, Netflix, Royal Bank of Canada, et al, accounts with their Facebook accounts to activate it. And while Facebook's tie-ups with, say, Spotify and Netflix were well publicized, given this week's outcry, arguably not every user was aware or made aware of what they were getting into. In any case, the "experimental" access to folks' private conversations was discontinued nearly three years ago.
The social network claims it only ever shared with companies what people had agreed to share or chosen to make public, sidestepping a key issue: that people unwittingly had their profiles viewed, slurped, harvested, and exploited by their friends' connected apps and websites.
As for the question of potential abuse of personal data handed to third parties, Facebook amazingly used the same line that it rolled out when it attempted to deflect the Cambridge Analytica scandal: that third parties were beholden to Facebook's rules about using data. But, of course, Facebook doesn't check or audit whether that is the case.
And what is its self-reflective apology this time for granting such broad access to personal data to so many companies? It says that it is guilty of not keeping on top of old agreements, and the channels of private data to third parties stayed open much longer than they should have done after it had made privacy-enhancing changes.
We can't prove it yet, and may never be able to unless more internal emails find their way out, but let's be honest, we all know that this is another lie. Facebook didn't touch those agreements because it didn't want anyone to look at them. It chose to be willfully ignorant of the details of its most significant agreements with some of the world's largest companies.
And it did so because it still believes it can ride this out, and that those agreements are going to be what keeps Facebook going as a corporation.
What Zuckerberg didn't factor into his strategic masterstroke, however, was one critical detail: no one likes a liar. And when you lie repeatedly, to people's faces, you go from liar to lying asshole. And lying asshole is enough to make people delete your app.
And when that app is deleted, the whole sorry house of cards will come tumbling down. And Facebook will become Friendster. ®