Misleading investors about its misuse of user data costs Facebook $100 million
Because not all bad apples fall from the financial services tree, let’s take a small detour from our usual posts to discuss a failure to disclose and lack of oversight of monumental proportions:
Facebook’s compliance failure
Facebook just got smacked with a $100 million fine for misleading investors and the public about the potential risk of misuse of its user data. According to the Securities and Exchange Commission’s (SEC) complaint, for more than two years, from 2016 through mid-March 2018, Facebook made misleading statements in its required public filings that “hid” the fact that its users’ data had been misused.
Rather, the company presented the risk of misuse as merely hypothetical – not actual. A what could happen. Not a what did happen. But, as early as December 2015, Facebook was already aware that a researcher had improperly sold information related to tens of millions of Facebook users to data analytics firm Cambridge Analytica.
A warning issued in 2012
Since its initial public offering in 2012, Facebook has warned investors that one of the material risks to its business is that independent developers who create applications for its platform might misuse personal data obtained from Facebook users. And that is what happened.
In June 2014, an academic researcher and Cambridge Analytica entered into an agreement through affiliated companies. Cambridge Analytica agreed to pay the researcher to collect data on Facebook users. At Cambridge Analytica’s expense, the researcher developed a personality survey that obtained data from U.S. Facebook users, including their names, birthdates, gender, location, and their affinities, or “page likes.” From the summer of 2014 through the spring of 2015, the researcher transferred data relating to approximately 30 million Facebook users in the United States to Cambridge Analytica, which used this information in connection with its political advertising activities.
Facebook read about the exposure in the news
Facebook learned about the collaboration when it investigated a report published in the British press in December 2015. Within days both the researcher and Cambridge Analytica privately confirmed to Facebook that the researcher had transferred personality profiles based on Facebook user data to Cambridge Analytica. Facebook determined that the transfer violated its policy that prohibits developers, like the researcher, from selling or transferring its users’ data, and told the researcher and Cambridge Analytica to delete the data.
But it got worse. In June 2016, the researcher told Facebook that, in addition to transferring personality profiles for approximately 30 million of its users to Cambridge Analytica, he had also sold Cambridge Analytica a substantial quantity of the underlying Facebook data from the same users.
Robbery is one thing. Failure to disclose is another
Between January 28, 2016 and March 16, 2018, Facebook did not disclose the misuse in its quarterly and annual reports. It did not report that a researcher had, in violation of the company’s policies, transferred data relating to approximately 30 million Facebook users to Cambridge Analytica. Instead, the company misled investors. It presented the misuse of user data as merely a hypothetical investment risk. And when questioned by reporters in 2017, the company falsely claimed it had found no evidence of wrongdoing. In fact, Facebook did not publicly acknowledge on its website what it had learned and when it had learned it. And, as expected, when the truth emerged, the price of Facebook shares declined substantially.
The SEC alleges that during the two-year period of misrepresentations, Facebook had no specific policies or procedures in place to assess the results of its investigation for the purposes of making accurate disclosures in its public filings.
“Public companies,” says SEC Enforcement Division Co-Director Stephanie Avakian, “must accurately describe the material risks to their business. As alleged in our complaint, Facebook presented the risk of misuse of user data as hypothetical when they knew user data had in fact been misused. Public companies must have procedures in place to make accurate disclosures about material business risks.”
What does Facebook’s failure to have policies and procedures mean for you?
That, like Facebook, you are not immune from oversight and regulatory compliance. One cannot stop bad actors from acting badly, but a well-run compliance system can spot irregularities and give an attentive compliance team a chance to nip exposures in the bud before they get out of hand. That’s where Patrina can help. We’ve built our business on helping organizations keep track of “bad apples” and stay on the “straight and narrow” efficiently and cost-effectively. Be smart. Be covered. Let’s talk.