In banning the far right from its platform, Facebook acted like a responsible media provider.
In April, Facebook announced its decision to “ban” 12 UK-based individuals and organizations linked to the radical right. Facebook explained that the decision rested on its policy that “individuals and organisations who spread hate, or attack or call for the exclusion of others on the basis of who they are, have no place on Facebook.” The measure was, however, far from a “ban.” Less sensationally, the social media provider was merely stating that those who spread hate were no longer welcome to do so via its platforms — simple.
In spite of what seemed a clear and unequivocal justification, much has been made of the decision since. Debate has largely focused on four issues: first, whether the ban went against the right to free speech; second, whether banning these individuals and organizations had the potential to make them victims; third, whether the ban would deter others from empathizing with or supporting those affected; and, finally, whether the decision was ultimately right or wrong.
The 12 affected by Facebook’s decision include the British National Party and its ex-leader Nick Griffin; Britain First, its leader Paul Golding and former deputy leader Jayda Fransen; the English Defence League (EDL) and its founding member Paul Ray; Knights Templar International and its promoter Jim Dowson; National Front and its leader Tony Martin; and Jack Renshaw, a neo-Nazi who plotted to murder a Labour Party MP. While some of these have — like others — claimed they have been banned, the reality is quite different.
Unlike National Action, which was proscribed — to the extent that it is now a criminal offence to be a member of the group — all of those affected remain free to find other online platforms from which to espouse the exact same messages they can no longer spread via Facebook. In this way, Facebook is merely cutting off its oxygen supply to those who wish to espouse hate — nothing to do with free speech.
Even so, the contradiction that is former EDL leader Tommy Robinson will likely provide the template each will seek to follow. By contradiction I mean that despite Robinson’s various protests about being gagged and having his free speech curtailed, he continues to command a quite disproportionate public voice that shows no tangible evidence whatsoever of curtailment. Nonetheless, like Robinson, those affected by Facebook’s decision will claim to be “victims.”
That said, notions of victimhood are ever-present tropes in the public discourses of all those affected: from being victims of the mainstream media and the political elites to victims of the liberal left and minority communities, there will be little new on offer in this respect. Just because Robinson, Fransen, Griffin et al. say they are victims does not mean they are right. That we will no doubt continue to hear them making such claims will itself disprove the suggestion that they have been gagged, so let’s call them out on this.
The claim that Facebook’s decision will achieve little in deterring others from empathizing with or supporting those affected misses the point, in my opinion at least, because Facebook was attempting to achieve neither of these outcomes. Despite claims to the contrary, Facebook has historically been used by many among the radical right to recruit and organize. But as Facebook unequivocally stated, its action is about acknowledging that hateful messages do not align with mainstream societal thinking and should therefore not be afforded legitimacy or validation through mainstream (social) media providers. As with the issue of free speech, Facebook is far from denying anyone the right to hold, empathize with or support any individual, organization or ideology. It is merely saying that such views and ideas cannot be attributed, directly or indirectly, to it.
Finally, whether Facebook’s decision is right or wrong is a wholly subjective matter. Even so, it is not without precedent among mainstream media outlets. This occurred to me while doing a live interview on this very issue with BBC Radio 5 Live last week. Had I begun to espouse overtly racist or anti-Semitic views during the interview, the BBC would — rightly, I hasten to add — have cut me off. The same would have happened had I begun swearing or using abusive or insulting language about certain minority groups. And I would then have been highly unlikely to be invited to participate in any future live broadcasts.
Just as there is an expectation that I conduct myself in a certain way while appearing on the BBC — as indeed there would be on any other mainstream media outlet — the same can be said of social media providers. Why should social media provide an arena where hate can be routinely shared and disseminated when other media arenas do not? Having abused the platform afforded them by Facebook and others, Britain’s Dirty Dozen now face the consequences they would face in any comparable setting. Facebook’s decision, therefore, is far from exceptional and clearly not without precedent.
Having previously aired my concerns about how the mainstream media has in recent years afforded legitimacy and normalcy to those on the radical right, I believe Facebook’s decision — and in particular the message it conveys — has to be welcomed. Stating that hateful and bigoted messages have no place and will not be afforded a mainstream platform is not a bad thing, and it matters all the more at a time when divisions and antagonisms between us are on the rise. Far from being about the right to free speech — I reiterate, no one is being prevented from saying exactly what they want — the decision is instead about what is the right action for a responsible media provider.
[The Centre for Analysis of the Radical Right is a partner institution of Fair Observer.]
The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.