Despite blocking QAnon and Co. – Zuckerberg sticks to Facebook’s ideal of free speech

No change of heart at Facebook: despite the current bans on harmful content, the platform maintains that controversial content should be allowed to remain in the interest of free speech.

Since the beginning of the month, Facebook has introduced several far-reaching changes aimed at curbing the spread of hate speech, conspiracy theories, and misinformation on the platform. All posts from the QAnon conspiracy movement have been removed, for example. The long-requested ban on content denying the Holocaust has finally been implemented. In addition, Facebook now blocks ads that discourage people from getting vaccinated.

Many users and experts who have long worried about the real-world effects that content on Facebook can have should be pleased. But the hope that Facebook will now fundamentally change how it deals with harmful content remains unfulfilled. As Social Media Today reports, Facebook intends to stick to its original approach of preserving free speech on the platform as far as possible.

The reason for the temporary change: the US elections

At a company meeting, employees were informed that the current changes are being made only because of the upcoming US elections, since the potential for harm and violence arising from such posts is considered especially high in this period. Mark Zuckerberg explained:

This does not reflect a shift in our underlying philosophy or strong support of free expression. What it reflects is, in our view, an increased risk of violence and unrest, especially around the elections, and an increased risk of physical harm, especially around the time when we expect COVID vaccines to be approved over the coming months.

According to Zuckerberg, the change has nothing to do with Facebook expecting Joe Biden to win the election and adjusting its policies accordingly. Once the current events have passed, Facebook does not plan any further restrictions on content on the platform. Conspiracy theories and disinformation will therefore be able to spread more widely again, as long as they do not violate the network's basic rules.

Facebook is not supposed to be an “arbiter of truth”

Whether Zuckerberg’s, and thus Facebook’s, fundamental stance on harmful content on the platform is morally justifiable is a matter of opinion. The huge social network can be a real breeding ground for misinformation, which can also have consequences outside the platform. This can be dangerous, especially during the coronavirus pandemic. A report by the nonprofit Avaaz, for example, found an enormous amount of health-related disinformation being spread on Facebook. Nevertheless, Zuckerberg’s view that Facebook should not be an “arbiter of truth” is not unfounded. Facebook users should be able to decide freely what they want to discuss on the platform. As long as a post does not pose a risk of imminent harm, free speech should be preserved.

There are certainly business reasons for this as well: the more posts Facebook allows, the more engagement it records. Moreover, given the size of the platform, it is impossible to keep track of all content. It nevertheless remains questionable whether Facebook’s approach of blocking only content that could cause imminent harm makes sense. Consider, for example, content that denies climate change, which poses a serious risk in the long term. Facebook’s ban on QAnon and on vaccine misinformation also raises questions: the platform might have prevented a great deal of damage had it banned that content earlier.
