Facebook and Cambridge Analytica

What's happening?

You've probably seen the headlines: "How Trump Consultants Exploited the Facebook Data of Millions." "Facebook Ignites Debate Over Third-Party Access to User Data." "Mark Zuckerberg Under Fire Over Data Controversy."


Here's what's happening: Facebook's lax data-sharing policies in 2015 exposed information about more than 50 million users to a researcher, who in turn shared it with the political consulting firm Cambridge Analytica. Parts of the story broke back in 2015, but it blew open this month when a former Cambridge Analytica employee, Christopher Wylie, shared internal company documents showing what happened.


This wasn't a hack. The researcher, Aleksandr Kogan, built a quiz app that you could log into using Facebook. You've probably done that before. So you know that when you use Facebook Login, you share information from your account: your name, your email, and so on. But in 2015, Facebook's policies also allowed third-party apps to collect information about your friends, even though those friends had never opted in. So although only 270,000 people actually used the app, Kogan (and, through him, Cambridge Analytica) got information on more than 50 million.
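
If you're curious how that worked mechanically, here's a rough sketch in Python of the kind of request flow a quiz app could use in the Graph API v1.0 era. Treat it as illustrative only: the token is a placeholder, the permission and field names are examples of what v1.0 offered, and the v1.0 endpoints were retired in 2015, so none of this runs against today's Facebook.

    # Illustrative sketch only: Graph API v1.0 was retired in 2015,
    # so this does not work against Facebook's current API.
    import requests

    # Placeholder for the token a user grants a quiz app via Facebook Login.
    ACCESS_TOKEN = "user-token-from-facebook-login"
    BASE = "https://graph.facebook.com/v1.0"

    # What the user knowingly shared: their own profile.
    me = requests.get(f"{BASE}/me",
                      params={"access_token": ACCESS_TOKEN}).json()

    # What v1.0 also allowed: with friend permissions (e.g. friends_likes,
    # friends_birthday), the same token exposed fields on every friend,
    # none of whom had installed the app or opted in themselves.
    friends = requests.get(
        f"{BASE}/me/friends",
        params={"access_token": ACCESS_TOKEN, "fields": "name,birthday,likes"},
    ).json()

    # One quiz taker could thus yield hundreds of profiles, which is how
    # ~270,000 app users fanned out into data on more than 50 million people.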


Facebook says Kogan broke their terms of service by sharing the data he gathered, and Cambridge broke the terms by keeping it. But up until that point, both had been following the rules.

Why is it important?

There's a lot of hype around the "data machine" Cambridge Analytica built. The firm's goal was to develop psychographic profiles: models of each person's personality, used to serve them the ads they'd respond to most. It sold itself by touting this method, first to Ted Cruz's campaign and then to Donald Trump's. But you should know that, despite the hype, it's not clear this kind of targeting actually works. You can build a psychological profile from how a person behaves online, but the research is mixed on whether that profile predicts anything better than simple signals like a person's parents' political orientation. And even when you know someone's personality type, it's easy to guess wrong about which ads that specific person will respond to.


So the heart of this debate isn't whether Cambridge Analytica did anything wrong (they did) or whether their psychographic targeting really influenced the election (we don't know). It's how a company like Facebook should be required to handle data sharing.

Debate it!

Should you be able to give away your friends' data?

Yes: 

Your ability to transport data that's yours—including information your friends have shared with you about themselves—is a vital part of your ability to leave a platform. Clamping down on your power to take that information elsewhere further cements the social networks that exist now. It doesn't give power to you. It gives power to them.


Imagine if you were unable to take your contacts from Google. If you use Gmail, you probably have hundreds of valuable names, email addresses, and profile pictures. These are bits of information other people gave you so you could communicate better. They're clearly yours to transport, even though they're other people's information. They shouldn't be Google's to own. You can, and should be able to, download them and take them with you if, for example, you wanted to use another email service.


On Facebook, you deliberately share pictures, birthdays, locations, and preferences to communicate better with your friends. And they do the same with you. Once you've mutually decided to share that information, you should all be able to take it elsewhere if, for example, you wanted to use another social network.


It's a mistake to create and enforce regulation that prevents you from doing this. When Facebook changed their policy in 2015 to stop third-party apps from collecting information about people's friends, they weren't helping you out. They were keeping you from going elsewhere, and keeping would-be competitors from building platforms you might enjoy more.


In short, we benefit from a person-centric system, not a company-centric one. And we shouldn't forget it.

No: 

Facebook says, “Kogan gained access to this information in a legitimate way and through the proper channels that governed all developers on Facebook at that time.” That's the problem. As we share more and more information about ourselves online, we need more and more control over where it goes. And we need to enforce that on platforms, on our friends, and on ourselves.


When whistleblower Christopher Wylie shared what happened, he called Cambridge Analytica's behavior a "grossly unethical experiment." It was an experiment Facebook allowed, even facilitated. The third-party app used to gather this data, "thisisyourdigitallife," wasn't some anomaly. Facebook is investigating whether other apps similarly "misused" data, and it seems likely they did. When third-party apps are allowed to take data from a person about their friends, they will. And because Facebook doesn't know for sure what that data is being used for, some of it will be misused.


The solution is strong platform policies against this kind of collection. Or, failing that, regulation.


It's in the best interest of Facebook and other social networks to hold themselves to strict privacy policies in order to preserve public trust—and avoid the regulation that comes with losing it. The Federal Trade Commission (FTC) is now investigating whether Facebook violated a settlement it reached with the government in 2011, again over data gathering by third-party apps. Facebook's stock dropped nearly 7% immediately after the news broke, its largest decline in over five years. But more importantly, stock in other social networks dropped too. These platforms are deeply entrenched, but trust in them is fragile. And their business models depend on it.

Learn more...

  1. How Facebook allowed Cambridge Analytica to get data for 50 million users
    • "The problem here is that Facebook gives a lot of trust to the developers who use its software features. The company’s terms of service are an agreement in the same way any user agrees to use Facebook: The rules represent a contract that Facebook can use to punish someone, but not until after that someone has already broken the rules."
  2. How Facebook is taking action and why they're only responding now
    • "What happened to Facebook with Cambridge Analytica is a microcosm of an increasingly obvious problem that’s increasingly affecting all social media platforms—with effects that potentially impact every internet user. Whether these effects end up yielding an algorithmically botched election or just more creepy fake celebrity porn, it seems clear that we’ve entered an unprecedented era of massive online data manipulation."
  3. The entire scandal, explained in a few simple diagrams
    • "There is a complicated web of relationships that explains how the Trump campaign, via the help of a political consulting firm, was able to harvest raw data from 50 million Facebook profiles to direct its messaging."