In 1942, the U.S. Office of War Information created a marketing slogan that has since become a favored colloquialism for the importance of secrecy:
Loose Lips Sink Ships.
Rhyme aside, the government wasn’t actually trying to be clever with that campaign: idle talk to the wrong person could literally have sunk a ship in 1942.
But in 2017, a year in which the mainstream news has so far been defined and dominated by a series of leaks, anonymous sources and unattributed tips, the phrase has enjoyed something of a comeback.
And while there are any number of examples in Washington, D.C., we could pick on, this week’s Loose Lips Sink Ships award goes to Facebook. Over the last several weeks, the social network has endured back-to-back leaks of internal documents relating to its content screening processes and ad targeting practices, leaks that have put the firm back on its heels, seesawing between explaining itself and apologizing.
The two things Facebook most certainly does not want to be doing.
So what got out and how did Facebook find itself with some ‘splainin’ to do?
Well …
Content, Controversy and the Free Speech Challenge
The dark side of Facebook can be a particularly dark place. This year has seen its livestreaming service used to broadcast murder, violence, terrorism and animal cruelty; and, according to some reports, the site is becoming one of the internet’s premier destinations for angry exes looking to ruin a former beloved’s life with “revenge porn” (nude or near-nude photos shared in order to “shame or embarrass” someone). According to reports out earlier this week, Facebook saw 54,000 cases of potential revenge porn in a single month.
That creates proverbial quicksand for advertisers. A platform once known mainly as a magnet for bad jokes and snooping high school classmates risks becoming one where bad things happen often enough to attract more of the same, and to drive away everyone else, traditional advertisers chief among them. Or, as Karen Webster described it in a recent commentary: Facebook’s MySpace problem.
Reports this week also gave the outside world a glimpse into the social network’s field operating guide for what can and cannot go on the site, via a set of specific, and in some cases controversial, guidelines.
For example, photos of animal abuse, bullying and even some physical abuse of children can reportedly be shared. Calls for an assassination and images of implied child sexual abuse cannot.
Facebook’s Global Policy Head Monika Bickert published a blog post defending the social network’s guidelines, while noting that it is hard to get moderation exactly right on a site of nearly two billion users. Bickert wrote that Facebook has to be “as objective as possible” in order to apply consistent guidelines across every area it serves.
“It’s hard to judge the intent behind one post or the risk implied in another. Someone posts a graphic video of a terrorist attack. Will it inspire people to emulate the violence, or speak out against it? Someone posts a joke about suicide. Are they just being themselves, or is it a cry for help?”
Facebook’s critics — like Jennifer King, a Ph.D. candidate at the UC Berkeley School of Information — are not fully persuaded by Facebook’s answer.
King worked at Yahoo from 2002 to 2004 as a moderator — a job she likened to working “at the front lines of what I call the toxic waste of the internet.”
Ouch.
Though sympathetic to the difficulty of Facebook’s burden in sorting through millions of posts, she noted that Facebook could easily have foreseen many of this year’s issues with inappropriate content before it launched services like live video streaming.
“I don’t think there’s any excuse in 2017 to say ‘Let’s just throw it open and see what happens,’” King said. “We know what will happen. It will be bad.”
To remedy the issue, Facebook will add 3,000 content moderators by the end of the year, on top of the 4,500 it already has on the job around the world.
And while one might think the news that Facebook will have more eyes on content would be a comfort — there is the small matter of that other leak…
Marketing to Teens Feeling “Worthless,” “Insecure” and “Defeated”
As May opened, The Australian published the text of a 23-page internal document that gave the world a moment to wonder just how exactly Facebook is using — and monetizing — the petabytes of data it routinely collects on everyone.
The document was reportedly prepared by company executives David Fernandez and Andy Sinn, as part of a presentation for potential advertisers about Facebook’s extraordinary accuracy in tracking the status updates, public interactions and internet activity of its users.
Even the rather young ones.
The confidential document offered specific details on how, by monitoring posts, comments and interactions on the site, Facebook can figure out when people as young as 14 feel “defeated,” “overwhelmed,” “stressed,” “anxious,” “nervous,” “stupid,” “silly,” “useless” and like a “failure.” It also notes that Facebook can offer real-time insight into “moments when young people need a confidence boost.”
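The leaked document does not explain how such inferences are made. Purely as a rough illustration of the general idea, the short Python sketch below flags posts that contain the emotion-bearing terms quoted in the report; the lexicon, scoring and function names are hypothetical and are not drawn from any actual Facebook system.

# Hypothetical illustration only; nothing here reflects Facebook's actual methods.
# Toy lexicon built from the emotion terms quoted in the leaked document.
NEGATIVE_TERMS = {
    "defeated", "overwhelmed", "stressed", "anxious",
    "nervous", "stupid", "silly", "useless", "failure",
}

def flag_post(text: str) -> bool:
    """Return True if the post contains any term from the toy lexicon."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return not words.isdisjoint(NEGATIVE_TERMS)

if __name__ == "__main__":
    for post in ["Feeling totally useless after that exam.",
                 "Great day at the beach!"]:
        print(flag_post(post), "-", post)

A real system would be far more sophisticated than keyword matching, but even this crude sketch hints at how readily emotional language in public posts can be turned into a targeting signal.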
Notably, Facebook does already provide ad buyers with a lot of information about its users — including relationship status, location, age and how often and in what manner people use the social media website. Emotional state is new — and could be a lucrative addition to this data.
Unless it turns out to be an illegal one, particularly as it relates to minors. There is currently some debate as to whether Facebook has violated the Australian Code for Advertising & Marketing Communications to Children, which requires organizations to obtain express consent from a minor’s guardians before extracting any personally identifiable data.
Facebook was quick to issue an apology and noted that it would be conducting an investigation into the matter, admitting it was inappropriate to target young children in such a way.
In a subsequent public statement, Facebook added:
The premise of the article is misleading. Facebook does not offer tools to target people based on their emotional state. The analysis done by an Australian researcher was intended to help marketers understand how people express themselves on Facebook. It was never used to target ads and was based on data that was anonymous and aggregated. Facebook has an established process to review the research we perform. This research did not follow that process, and we are reviewing the details to correct the oversight.
Some have noted that the answer is less than fully illuminating, and consumer privacy groups have formally requested that Facebook release the entire document publicly so that people can determine just what use Facebook was suggesting for the data it had collected on the emotional state of minors.
Over two dozen organizations, including the Center for Digital Democracy, Consumer Watchdog and the Center for Science in the Public Interest, have signed an open letter addressed to Facebook CEO Mark Zuckerberg demanding that the company release the entirety of the research for public viewing and explain its “sentiment mining” program.
So far, Facebook has declined to do either, though members of its executive team will reportedly meet with concerned privacy watchdogs.
So is Facebook watching us all too closely, or is it not watching any of us closely enough? And is there a reasonable standard for a social network of two billion people worldwide, especially since said social network amasses its considerable cash hoard by selling advertisers access to that aggregated user data?
There is no obvious answer, but Facebook might want to think about pulling its top execs off their meet-and-greet-the-world tours to develop some non-obvious ones.
Because, as its team should know better than any other in corporate America, there are no secrets in the digital age. And advertisers won’t keep their disdain for what’s going on a secret either; when the going gets tough, they tend to let their checkbooks do the talking.