Towards the end of the film There Will Be Blood, Daniel Day-Lewis, playing a ruthless oil baron we have watched rise from lone prospector, explains slant drilling technology to a rival. By angling his drill horizontally, he can tap the petroleum reserves underneath his rival’s property. He compares it to having a long straw to reach across a table and dip into the other’s beverage. “I drink your milkshake!” he crows triumphantly in the film’s signature line.
Did Cambridge Analytica pull off the digital-age equivalent against Facebook? Although the exact details are still being sorted out, in 2015 the U.K.-based data firm, which was briefly hired by the Trump campaign, ended up with personal data from some 50 million Facebook accounts through a personality-quiz app. Although only 270,000 users downloaded the app, Cambridge Analytica designed it to scrape data from the profiles and feeds of all the Facebook friends of those original 270,000, according to Christopher Wylie, the whistleblower in the case. As a result, Cambridge Analytica harvested a motherlode of information proprietary to the social media giant. It drank Facebook's milkshake.
But while in There Will Be Blood it was oil being hijacked, in this case it was individuals' personal information: information that Facebook was not only obliged to protect but had significant business interests in protecting. That's why the company's stock is being pounded, and why Washington and Wall Street are openly wondering what's going on. Massachusetts Attorney General Maura Healey has promised an investigation. The Federal Trade Commission reportedly believes Facebook violated the terms of a 2011 consent decree on user privacy, which could expose the company to a fine of $40,000 per violation.
Much of the outrage stems from the case being framed within the controversy over "fake news" and whether it improperly affected the outcome of the 2016 election to any degree. As of now, it's not clear whether Cambridge Analytica actually used the data it collected and, if so, whether it got any measurable results.
But before we get carried away, let's recall that audience manipulation is not a crime. On the contrary, it's the whole point of advertising, politics not excepted. To this day we argue over whether images of a nuclear bomb detonation juxtaposed with a little girl picking flowers, or of an African-American man in a revolving prison door, were outside the rules of political engagement. They had an undeniable influence on the elections of their years, but no ad agencies or TV networks were investigated or fined.
But Facebook's not off the hook. If the company wants a hand in shaping policy, it first needs to take responsibility for what it has become. It seems pathologically reluctant to admit that it is now a major media company, one that brings to market unique algorithms advertisers can use to target, profile, and manipulate individuals with remarkable accuracy. Consequently, third parties like Cambridge Analytica are becoming more adept at understanding and exploiting the value of Facebook's resources than Facebook itself. It doesn't help that CEO Mark Zuckerberg, who let five days pass before publicly addressing the reports, gives the impression of being pushed along by events rather than taking direct control of them. If he doesn't step up with a clearer vision of Facebook's direction, in the end it won't be Cambridge Analytica but the government itself that drinks Facebook's milkshake. And that will be to everyone's detriment.
This article is an updated version of an op-ed originally published in The Hill March 23, 2018 under the title “Facebook Has a Problem with Transparency, but It’s Not Yet Time for Regulators to Act.”