This file photo taken on Nov. 20, 2017 shows Facebook’s logos. (Loic Venance/AFP/Getty Images)

What would you look like as a movie star? How “bitchy” are you? What’s your St. Patrick’s Day nickname?

You’ve probably found the answers to these fascinating questions by taking a quiz on Facebook, and shared the results with your friends.

But what did you give up in exchange for the deep dives into your soul? Each developer asks for different amounts of information, but it’s possible you shared your Facebook likes, education and work history, religious and political affiliations and more.

A Cambridge University psychology professor who developed a personality-prediction app reportedly passed along that kind of personal data on 50 million Facebook users — 270,000 users of the app plus their networks of friends — to Cambridge Analytica, an advertising data firm once used by Donald Trump’s presidential campaign.

The New York Times and the U.K.-based Observer/Guardian reported over the weekend that Facebook has suspended the account of Cambridge Analytica, the firm that Steve Bannon founded with funding from prominent Republican donor Robert Mercer to build “tools that could identify the personalities of American voters and influence their behavior,” according to the Times report.

Trump’s campaign reportedly used the firm’s data during the primaries but not during the general election. Federal Election Commission numbers show the firm collected $5.9 million in 2016 from Trump’s campaign. In the United Kingdom, lawmakers are exploring whether Cambridge Analytica’s work played a role in Brexit.

On Friday, ahead of the publication of the newspaper reports, Facebook announced Cambridge Analytica’s suspension, along with that of the professor, Aleksandr Kogan, and of a former Cambridge Analytica worker who spoke out about the firm’s practices.

That former worker, Christopher Wylie, talked to the Guardian, the Observer’s sister newspaper.

“Facebook could see it was happening,” Wylie told the Guardian. “Their security protocols were triggered because Kogan’s apps were pulling this enormous amount of data, but apparently Kogan told them it was for academic use. So they were like, ‘Fine’.”

Wylie also said in an interview Monday on NBC’s “Today” show that Cambridge Analytica sought to “explore mental vulnerabilities of people” by “creating a web of disinformation online so people start going down the rabbit hole of clicking on blogs, websites etc., that make them think things are happening that may not be.”

That is also what Russian trolls are now known to have done ahead of the 2016 election.

Facebook’s deputy general counsel, Paul Grewal, said in a blog post Friday that the professor who shared his app’s information with Cambridge Analytica violated Facebook’s policies, and that the parties involved swore to Facebook that the data they had collected had been destroyed. (The Times reports, though, that copies of the data remain, and that it viewed some of the data.)

Facebook users who can’t resist a quiz might be comforted to know that Grewal also said the social network’s third-party developers are now subject to a stricter app-review process.

The process “requires developers to justify the data they’re looking to collect and how they’re going to use it – before they’re allowed to even ask people for it,” he wrote.

When reached for comment Monday, a Facebook spokeswoman sent along the following statement from Facebook VP of Global Operations Justin Osofsky: “It’s important to note that Kogan’s app would not have access to detailed friends’ data today.”

Still, even if a third-party app says it collects minimal information about users, it could violate Facebook’s rules and pass that information along to others anyway, which is what seems to have happened in the Cambridge Analytica case.

“It’s not like this is a one-in-a-trillion outcome,” said Eric Goldman, professor at the Santa Clara University School of Law, on Monday. “Given how obvious and foreseeable it was [that someone would break the rules], Facebook could have come up with a system that would’ve reduced the risk of a contract breach.”

Goldman noted that the “stakes were so high in this particular data leakage” because it could have affected the outcome of the U.S. presidential election.

Facebook’s shares fell sharply Monday. Among the fallout: more calls for regulation, concern that Facebook may have violated a privacy settlement with the Federal Trade Commission, and lawmakers such as Sens. Amy Klobuchar and Ron Wyden demanding answers from CEO Mark Zuckerberg. But will users care?

“It’s so hard to trust Facebook now,” Goldman said. “Facebook’s not going to die in a cataclysmic fire, it’s going to die through apathy. Each of us as individuals will choose to use it less.”

Andrew Bosworth, a Facebook vice president, said in a Facebook post Monday afternoon that “if people aren’t having a positive experience connecting with businesses and apps then it all breaks down. This is specifically what I mean when we say our interests are aligned with users when it comes to protecting data.”

Meanwhile, British TV station Channel 4 on Monday revealed that its undercover investigation found that Cambridge Analytica had offered to entrap politicians using sex and bribes, a charge the firm denies.


IF YOU’RE CONCERNED

If you’re a Facebook user who takes quizzes and uses third-party apps, you can check what kind of information the apps are collecting about you by going to Settings, then Apps.