Facebook to ban all praise, support and representation of white nationalism and separatism… but wait…

Earlier this week we shared our thoughts on Facebook and its involvement in the Christchurch terror attack.

Today CNN Business reported that Facebook announced on Wednesday it would ban all “praise, support and representation of white nationalism and separatism” on Facebook and Instagram.

A welcome move, following the live stream of the attack and a manifesto, allegedly written by the suspect, that reveals white nationalist views.
Facebook (FB) said that while it had long prohibited hateful treatment of people based on race, it hadn’t applied the same rationale to white nationalism, “because we were thinking about broader concepts of nationalism and separatism — things like American pride and Basque separatism, which are an important part of people’s identity.” It said it had reconsidered that after “conversations with members of civil society and academics who are experts in race relations around the world” who said, according to Facebook, “that white nationalism and separatism cannot be meaningfully separated from white supremacy and organized hate groups.”
Over the past three months, Facebook said, it had more than 20 conversations with civil rights groups and experts in race relations across the US, Europe, and Africa.
Among the groups Facebook consulted was the Lawyers’ Committee for Civil Rights Under Law.
“It took a lot of hard work to get Facebook to where they are today. But the hard work lies ahead; we will be watching closely how they implement the policy,” Kristen Clarke, the group’s president and executive director, told CNN Business after Facebook’s announcement on Wednesday.
Clarke said that Facebook’s previous policy allowed white supremacists to abuse the platform and that her group had been talking to the company about a policy change for several months.

A message Facebook says it will show users who search for terms it says are associated with white supremacy

The attack in New Zealand, Clarke said, “underscores the urgency here. It’s exhibit A in how violent white supremacists abuse the Facebook platform to promote their dangerous, fatal activities.”
Facebook said it would start directing people who search for terms associated with white supremacy to organisations that help people leave hate groups.
Facebook is currently undergoing a “civil rights audit,” a project Facebook COO Sheryl Sandberg says is one of her top priorities for 2019.
This is a welcome approach from Facebook; however, many feel it is a slow and all too familiar PR line, as Paul Brislen wrote on Radio NZ today.

Paul writes “If all this sounds familiar, it’s because sadly it is. This is not the first time Facebook has declared it will take action in this way.

In April 2017, Robert Godwin, a 74-year-old grandfather, was shot and killed in Cleveland, Ohio, having been chosen at random by a killer who broadcast it live on Facebook.

At the time, CEO Mark Zuckerberg told Facebook’s annual developer conference, “We have a lot of work [to do], and we will keep doing all we can to prevent tragedies like this from happening”. In May of 2017, the company announced plans to add 3,000 more staff to review user content to help battle violent videos.

Not much has changed. Today’s announcement sounds awfully hollow in light of the lack of action since 2017 and does little to assuage anyone’s concerns about the company and its ability to self-regulate.

Facebook Live will continue to be unmonitored and unmanaged. Having a room full of content moderators is a good step forward, but given the vast amount of video footage that is uploaded every second of the day, Facebook must invest more in machine learning and automated systems that can identify and root out such content before any human sees it.”

Paul draws a similar conclusion to ours: it’s not in the financial interest of Facebook, so not much will change… I hold out hope that the media giant that was built around its end user actually remembers it needs an audience to be in business…

There is also an underlying issue we are not addressing: can we blame Facebook? Should we not be concerned that human nature allows people to feel it’s ok to share hate-based content? Facebook is just the platform, but it’s us that hits that ‘share’ button… Much like the Momo Challenge, if we stop sharing this rubbish, the narcissist hate mongers lose their platform of speech. Do we need Facebook to parent us all? Or should we take responsibility for what is right? When we see something that is wrong, block it!

Let’s be proactive; sure, Facebook will make changes, slowly, but they are unlikely to catch everything. The best thing we can do is be vigilant: look after our eyeballs and those of our children, block the hate, stop watching the hurt, and use the tools we have to focus on the love and the good things that networks like Facebook allow us to share.


Editor in Chief here at SMNZ, I have a passion for social and digital media. When not writing and managing SMNZ, I am the Head of Innovation at TAG The Agency, a digital ad agency, and the Head of Sales and Marketing for End-Game, a software development agency. I'm also involved with a number of startups, and I am always keen to support those that are bold enough to give things a go. Start something — better to try than to live wondering what if...
