It is perhaps no surprise that social media companies are being forced to rethink their content moderation policies after the permanent suspension of Donald Trump over allegations that he incited his followers to riot at the United States Capitol Building earlier this month in a show of support for his claims of election fraud.
Twitter was the first to impose the ban. Facebook was quick to follow. Both social media giants had been under pressure for some time to moderate or quieten the voice of the often-outrageously outspoken former US President. Both had refused to take action, saying that allowing him to remain active on their platforms was in the public interest and important for both free speech and democracy.
It's a valid point. In banning Trump, however, they have both shattered the longstanding notion that they are simply neutral platforms open for all to express their views. The fact is that social media platforms wield considerable power to shape public opinion, in politics and beyond.
Shortly after Donald Trump was removed from Twitter, CEO Jack Dorsey noted that the communication via social media that incited the riots, in which five people died, represented "a failure of ours (Twitter) ultimately to promote healthy conversation".
Furthermore, he said, it "sets a precedent I feel is dangerous: the power an individual or corporation has over a part of the global public conversation." For some, this second statement is a bit like passing the buck, absolving the platform itself of responsibility for the views of an active user.
The danger of algorithms
The issue is twofold: it is as much a question of censorship as of technology. There's also a big question around where the responsibility for content actually lies.
A significant danger of social media platforms lies in their ability to track and trace everything that individual users show an interest in, and then push out more 'similar' content. This strategy was originally intended to catch users' interest and keep people on the platform for as long as possible. There is no doubting its success.
But this use of algorithms also presents significant problems. The greatest is that they sort content in a user's feed based on 'relevancy'. This essentially creates a 'bubble' of content which can be narrowly focused and highly targeted.
This means that if, for example, you're a Trump supporter, or a cat lover, then simply through your frequent reactions to posts, and by pausing or stopping to read them, you can inadvertently personalise your own newsfeed to such a degree that eventually the majority of posts you see will relate to Trump, or to cats in cute poses.
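The feedback loop described above can be illustrated with a toy simulation. The sketch below is purely illustrative, and assumes a deliberately simplified model: topics stand in for posts, a single 'interest' score per topic stands in for the engagement signals a real platform would track, and the user reacts only to one preferred topic. Real ranking systems are vastly more complex, but the narrowing effect is the same in principle.

```python
from collections import Counter


def rank_feed(candidates, interest, k=5):
    """Stand-in for a 'relevancy' sort: posts on topics the user has
    engaged with most rank first; ties keep the original candidate
    order (Python's sort is stable)."""
    return sorted(candidates, key=lambda topic: interest[topic], reverse=True)[:k]


def simulate_bubble(preferred="cats", rounds=20, k=5):
    """Toy feedback loop: repeated engagement with one topic gradually
    narrows the entire feed to that topic."""
    topics = ["politics", "cats", "sport", "science", "travel"]
    interest = Counter({t: 1.0 for t in topics})  # start with no preference
    feed = []
    for _ in range(rounds):
        candidates = topics * 6            # fresh posts on every topic
        feed = rank_feed(candidates, interest, k)
        for topic in feed:
            if topic == preferred:         # the user only reacts to one topic...
                interest[topic] += 1.0     # ...which the ranker rewards next round
    return feed


print(simulate_bubble())  # ['cats', 'cats', 'cats', 'cats', 'cats']
```

Even though every round offers posts on all five topics, after a few iterations the ranked feed contains nothing but the preferred one: the other topics never earn engagement because they are never shown, which is the 'bubble' in miniature.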
The algorithms have the potential to destroy the type of balanced information that's required for people to be able to consider multiple and varied viewpoints and maintain a healthy perspective.
Balance, fairness, accuracy, independence and impartiality are ethical standards that traditional media are bound by and in which journalists are trained. And while there are occasional biases and pumped-up PR pieces, these are not as common as they are on social media.
Blurred lines between marketing, advertising and news
The lines between marketing, advertising and journalism have been blurring for a long time, since the introduction of 'advertorials.' Social media has made these pieces much more commonplace, and over time we've also seen the rise of fake news - which has no basis in fact whatsoever, but which can be so cleverly written that it is difficult to detect.
Another factor which has contributed to the rise in social media's power is that many trusted media outlets now have paywalls, restricting access to content to 'subscribers only'. This has the potential to turn people away from traditional media, leaving them to rely solely on what they see on social media to stay informed.
The other issue, of course, is that social media relies on user content - chiefly opinion, which is not subjected to scrutiny, fact-checking or censorship, and nor, necessarily, should it be. Such a practice would defeat the purpose of building a platform that enables people to express their views. Conversely, allowing users 'free rein' is not ideal either.
Where does responsibility for content lie?
Last year a decision by the Supreme Court of New South Wales, upheld by the NSW Court of Appeal, determined that administrators of Facebook pages may be considered the 'publishers' of posts by others, and may therefore be liable for defamatory comments. The repercussion of this decision is that anyone who has an account is responsible for the comments that others make on their page.
This essentially places the responsibility for defamatory comments onto ordinary people who may know little about defamation law. It also opens the door to a cascade of lawsuits. Social media defamation suits have been on the rise in the past several years and the message is clear: beware of what you post online. Express yourself, but be careful what you say.
The simple fact of the matter is that most people don't read terms and conditions such as those set out by the platforms, nor do they fully realise the implications of some things they post online. This was evidenced by the case of a pregnant Victorian woman who posted information about a protest rally during last year's lockdowns and was arrested and charged by police with incitement.
Increasing reliance on social media as a news source
It's clear that social media companies need to recognise the litany of issues that are beginning to emerge as users around the world continue to gravitate to the platforms for information-sharing, entertainment and contact with friends and family. These are issues that will only get worse if not addressed properly because we now have a generation of young people who have grown up with the internet and social media, rely heavily upon them, and consider both highly trustworthy.
During 2020, with the world in lockdown, social media use surged globally, and numerous studies have confirmed that social media became a 'go to' source for Covid-19 information, primarily because of its immediacy and accessibility.
There's no doubt that becoming a news hub means that social media companies bear ever more responsibility for preventing the spread of misinformation. They must also put in place appropriate safeguards to ensure that users retain balance in their news feeds, in particular by returning full power to the user to 'follow' or 'unfollow' a page, rather than allowing an algorithm to determine what they see.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.