(Photo: Maurizio Pesce)
Earlier this week, in a post he wrote on Facebook to celebrate his company's 15th birthday, Mark Zuckerberg raised the ire of Facebook's critics yet again.
Authors, journalists, researchers, and academics from across the internet pilloried Zuckerberg for his tone, his use of jargon, and his attempts to “deflect” attention from the societal problems Facebook has caused over the last decade and a half.
The truth is that when appealing to a group of people through writing, the intentions or context of your words do not always matter as much as how they are perceived.
In the past few years, Zuckerberg has made a lot of apologies to the Facebook community. He's written a lot of explanations in Facebook posts. He's even produced written testimony for Congress.
Rather than restoring trust, Zuckerberg's statements have often eroded it even further, leaving the relationship between Facebook and its user base worse off than before.
A look at the anatomy of these apologies shows that Zuckerberg repeatedly makes the same errors in his writing, errors that make him come across as disingenuous and untrustworthy.
On the surface, Zuckerberg's apologies seem to have the right elements. He is quick to issue an apology whenever necessary, and in those apologies, he promises to improve something about himself or his company to avoid the same mistake again.
However, he is also quick to point out that external actors are the real culprits of the mistake he is apologizing for. From the perspective of someone expecting a sincere apology, this deflection of blame is a clear defense mechanism to hide Facebook's own fault in the issue.
In the spring of 2018, after some internal Facebook emails were released that further implicated the company in the data-misuse charges at the center of the Cambridge Analytica scandal, Zuckerberg published a 630-word statement to get ahead of the situation and repair his reputation. In the statement, he repeatedly refers to “sketchy apps,” “shady apps,” and their unnamed developers as the chief culprits behind the data misuse.
In this explanation, the problem isn't framed as something that Facebook and its employees did wrong. Instead, the problem was caused by outside actors (developers and their apps). Furthermore, the use of the words “shady” and “sketchy” implies that these outside actors were motivated by character flaws rather than by situational opportunity.
By attributing the causation of the problem to outside actors' character flaws (i.e., they were “sketchy” or “shady”) rather than to the situational factors motivating their actions (i.e., Facebook had limited restrictions or guidelines in place to prevent or discourage such actions), Zuckerberg makes the error of failing to acknowledge his own role in the problem and the steps he could have taken to prevent it.
By doing so, Zuckerberg tiptoed around the most important component of earning an audience's trust back through an apology—acknowledgement of responsibility.
Zuckerberg didn't just study computer science at Harvard—he was also an avid classicist, and he brought a strong grasp of persuasive and manipulative rhetoric to his public statements on behalf of Facebook.
Across different posts written in defense of policy positions or seeking to explain decisions he's made, Zuckerberg weaponizes the informal logical fallacy of the false dilemma: he presents the reader with a choice between two options, carefully chosen to nudge them in the direction he wants.
This kind of technique can be highly persuasive when it goes unnoticed. When the reader picks up on it, however, it doesn't just fail to persuade; it causes the reader to lose trust in the writer.
In the post below, he presents the reader with two false options when defending (implicitly) Facebook's role in spreading fake news during the 2016 elections: either we can keep on giving people a voice, or we can allow traditional gatekeepers to control all the information we see and read on the internet.
In the larger context of the letter, of course, the choice he's presenting is between allowing traditional gatekeepers to control information—or allowing Facebook to continue doing what it has been doing.
The two options presented are framed as collectively exhaustive, but there are various middle states that could exist as well.
Technology could, for example, give a voice to roughly the same number of people who have one today, or it could give more people a voice without any concept of “traditional gatekeeper control.”
Alternatively, “traditional gatekeepers” (like newspapers) could continue to have a role even while technologies like Facebook grow and expand.
None of these options are presented, suggesting to the reader that the only real way to avoid total information censorship is through the work that Facebook and Zuckerberg are doing.
The conclusion of this post makes it clear that this false choice was given to the reader in bad faith.
In the excerpt above, Zuckerberg promises to do more real thinking about the different risks and trade-offs inherent to the kind of work Facebook does. He insists that he will do more work “engaging more in some of these debates.” But just paragraphs before, he professed that there is no middle ground. On the question of whether Facebook should be checked, or whether “traditional gatekeepers” have a role to play in the way information moves in the modern era, there is apparently no debate to be had.
The panoply of options between the two Zuckerberg presents in the original false choice makes it a weak manipulative technique to begin with, but the end of this post delivers the final coup de grâce. For any thoughtful reader, the bad faith demonstrated here makes it impossible to take Zuckerberg at his word, or to trust virtually anything he says.
Throughout 2018, Zuckerberg started off a good number of his public posts by stating his top priorities or biggest areas of focus for the year. Filling someone in on your thinking and direction is a step towards transparency and, by extension, trust, but in the case of these 2018 posts, the seeming reshuffling and inconsistency of some of Zuckerberg's stated priorities raise a few red flags.
At the beginning of the year, Zuckerberg's main focus seemed to be on changing what people devoted their time to on Facebook.
As the year progressed, however, his stated focus and priorities for the year changed to fair elections.
By the end of the year, no trace of his initial focus area for 2018 remained in his summary of what he had prioritized that year.
As a reader, it is hard to fully buy into Zuckerberg's commitment to any of these priorities because his focus changes so frequently. How, for example, are we supposed to believe that 2018 was devoted to preventing election interference and protecting data if at the beginning of the year the main focus was on something completely different?
Without an explanation from Zuckerberg himself, a reader might also begin to wonder why his main priorities changed throughout the year. The shift in stated priorities from “making sure the time we spend on Facebook is time well spent” to elections and data security coincides with the unfolding of the Cambridge Analytica scandal. Once the story broke, Zuckerberg's stated focus quickly shifted to the very issues the story had exposed.
Given this context, the stated focus areas and priorities seem to be more of an attempt by Zuckerberg to pander to what he thinks the reader wants to hear rather than an honest and transparent statement of priorities.
Over the last decade and a half, Zuckerberg's explanations of his worldview and of Facebook's values and mission have evolved to be significantly nobler-sounding, higher-minded, and full of weasel words.
Weasel words are words that have been used so many times, in so many different contexts, and are in nature so ambiguous that any meaning attached to them is drained away. They're a mainstay of corporate crisis communications, and for PR agencies, they are both a tool and a kind of mistake—evidence that they have resorted to obscuring the truth rather than merely spinning it.
(Photo: David Berkowitz)
2018 was not a great year for Facebook. Following the Cambridge Analytica scandal, Facebook users were increasingly fearful of the amount of power the company had and wary of how safe their personal data was. In an attempt to restore the public's trust in Facebook, Zuckerberg wrote an end-of-year post about Facebook's mission and goals for the upcoming year, and the progress they had made in those areas in 2018.
The overall message was that “Facebook has your best interests in mind and will continue to do what's best for you.” However, Zuckerberg's overuse of words like “community,” “well-being,” “together,” and “good” obscures that message and gives it a tone of insincerity.
The letter starts off by saying that Facebook is working to “improve people's well-being.” “Well-being” is in itself an ambiguous term. It could mean physical health, security, mental health, or comfort. Coupled with an equally ambiguous measure of scale, “improvement,” the statement loses its significance for the reader.
Zuckerberg closes out the letter with an equally empty and ambiguous statement: “Building community and bringing people together leads to a lot of good, and I'm committed to continuing our progress in these areas as well.” The phrases “building community” and “bringing people together” certainly sound positive, but they are vague at best and act more as buzzwords than as actionable goals and measures. Jumping from abstract idea to abstract idea when simply describing Facebook's priorities doesn't give off a strong sense of a mission-driven or ethical company.
As a reader, it is too easy to interpret the words here as meaningless PR filler used solely for the purpose of manipulating people into trusting Facebook.
This past week wasn't the first time Zuckerberg's writing created controversy, and it probably won't be the last, but it is a moment that's symbolic of where his relationship (and his company's relationship) with the wider world has gone over the last several years: even a post just celebrating Facebook's birthday can inspire the internet's outrage.
While it's clear today that Zuckerberg's wide-reaching statements, bereft of real meaning, have functioned to keep shareholders from panicking about Facebook's public image, they have also fallen far short of establishing trust between the company and its community of users. Earning trust requires true honesty and transparency, and it may be that true honesty is a luxury Facebook can't afford.