Updated 2:47 pm ET Tuesday, November 29
Facebook has settled a lengthy privacy complaint with the U.S. Federal Trade Commission after the agency’s two-year investigation found wrongdoing on the part of the company, according to a release from the FTC on Tuesday morning.
The settlement tentatively requires Facebook to stop sharing user information with advertisers without consent and to submit to independent privacy audits every two years for the next 20 years, among other drastic measures designed to rein in the world’s largest social network’s handling of user information and its communication with users.
And the FTC didn’t let Facebook off easy, either, warning that the company could be fined up to $16,000 per violation if it doesn’t comply with the terms of the settlement.
The 8-count complaint posted by the FTC bluntly calls Facebook’s privacy policy and practices “unfair and deceptive,” saying that Facebook’s user privacy controls (or privacy settings) gave the impression that a user could restrict their information to “Friends” or “Friends of Friends,” when, in fact, third-party applications could still access it.
In addition, the complaint document blasts the world’s largest social network for sharing identifying user information with advertisers even when Facebook claimed it wasn’t, and for allowing access to deactivated user photos and videos.
A summary of the full 8 counts follows:
1. Facebook misrepresented who could see user information.
2. Facebook “failed to disclose” that its controversial December 2009 update (which sparked the FTC complaint in the first place) would override existing user privacy settings.
3. Facebook “retroactively” applied changes to user information, making previously private information public.
4. Facebook has provided apps and app companies/developers “unrestricted access to user profile information that such Applications have not needed to operate.”
5. “Facebook promised users that it would not share their personal information with advertisers. It did.”
6. Facebook charged certain companies $175 or $375 for a “Verified Application” certification even though, in some cases, it didn’t actually verify the apps’ safety for users.
7. Facebook provided third-party access to user account information even after the user deleted his or her account.
8. “Facebook has not adhered to the U.S. Safe Harbor Privacy Principles of Notice and Choice,” importing data from the European Union into the U.S. without user consent.
In response, the FTC has ordered Facebook to undergo independent privacy audits every two years for the next 20 years, similar to an arrangement the agency reached with Google in October, which revolved around complaints over Google’s failed Google Buzz social network.
The FTC also notes that its settlement with Facebook requires the social network to obtain users’ “affirmative express consent” before making any privacy changes. That would seem to indicate a switch from the current “opt-out” system, which requires a user to take proactive steps to disable public information sharing, to an “opt-in” system, in which Facebook would have to invite users to share their information publicly rather than switching it on automatically without a user’s choice.
Facebook is additionally required to make a user’s information unavailable within 30 days after his or her account is deleted. That’s because the FTC found that such information was still accessible to anyone with the URL (web address) for that specific content.
The issue of data retention is not only contentious in the U.S.; it has also boiled over in Europe, where an Austrian student found Facebook had stored over 1,000 pages of his information, including deleted wall posts, chat conversations and other content, and subsequently filed a complaint with the Irish Data Protection Commissioner, which is currently in the middle of auditing Facebook.
Facebook is also called upon to establish a “comprehensive privacy program” within the company to “ensure that the privacy of consumers’ information is protected.”
Facebook founder Mark Zuckerberg wasted no time in posting his lengthy response to the settlement on (where else?) Facebook, claiming that although Facebook has made “a bunch of mistakes,” he’s “committed to making Facebook the leader in transparency and control around privacy.”
Still, there can be no denying Zuckerberg’s efforts to spin the FTC settlement as business as usual for big tech companies like Facebook. He points to Google’s and Twitter’s own agreements with the FTC, saying, “These agreements create a framework for how companies should approach privacy in the United States and around the world.”
Zuckerberg defends Facebook’s record, too, saying “Facebook has always been committed to being transparent about the information you have stored with us – and we have led the internet in building tools to give people the ability to see and control what they share.”
As such, Zuckerberg also announced the creation of two new positions: Chief Privacy Officer of Products, filled by Michael Richter, and Chief Privacy Officer of Policy, filled by Erin Egan.
Still, Facebook’s FTC privacy complaint saga isn’t quite over yet: The FTC notes that the settlement agreement has yet to be made final and has posted the document for public comment, which it will accept over the next 30 days. Comments can be submitted online here or via snail mail to Federal Trade Commission, Office of the Secretary, Room H-113 (Annex D), 600 Pennsylvania Avenue, N.W., Washington, DC 20580.
Late update: The American Civil Liberties Union, which was one of many groups that called for the FTC investigation in the first place, has issued a statement in response to the settlement.
“Today’s Facebook settlement is an important step, making it clear that companies can’t simply change the rules without asking users’ permission,” said Chris Conley, policy attorney of the ACLU of Northern California. “But to keep pace with new technology, we also need new laws and tools like Do Not Track and comprehensive privacy legislation to help us safeguard our own personal information. We shouldn’t have to struggle with complicated and constantly shifting privacy settings just to keep control of our own personal information.”
“Do Not Track” is a proposal developed by Stanford researchers and a coalition of advocacy groups, and supported by the FTC, that adds a header to the HTTP requests sent by a user’s browser, notifying websites and advertising services that the user doesn’t want to be tracked. No law currently enforces “Do Not Track” standards, but members of the House and Senate introduced draft legislation earlier this year that would do so.
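For the technically curious, here’s a minimal sketch (in Python, using only the standard library; the target URL is just a placeholder) of what the Do Not Track signal amounts to on the wire: a single “DNT: 1” header attached to each outgoing request, which the receiving site is free to honor or ignore.

    import urllib.request

    # Build a request carrying the Do Not Track header ("DNT: 1"),
    # the same signal a DNT-enabled browser would attach to every request.
    req = urllib.request.Request(
        "http://example.com/",      # placeholder URL for illustration
        headers={"DNT": "1"},       # 1 = the user does not want to be tracked
    )

    # The site receiving this request sees the header but, absent any law,
    # decides on its own whether to honor it.
    with urllib.request.urlopen(req) as response:
        print(response.status)

The simplicity is the point: the header itself doesn’t block anything, which is why advocates are pushing for legislation to back it up.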