In an increasingly digital world, social media platforms have become an integral part of our daily lives, connecting us with friends, family, and the world. However, with the convenience and benefits of social media come risks to our online safety. In Australia, the eSafety Commissioner plays a pivotal role in safeguarding internet users, and the Online Safety Act 2021 (Cth) (the Act) is a significant legislative milestone aimed at addressing the evolving challenges of online safety. This article explores why understanding online safety matters when using social media in Australia, the key role played by the eSafety Commissioner, and the legislation that regulates online safety.

The prevalence of social media in Australia

Social media platforms have witnessed exponential growth in Australia. As of October 2022, around 82% of Australians were active on social media, with Facebook, Instagram, TikTok and YouTube among the most popular platforms. While these platforms provide a space for creativity, social interaction, and information sharing, they also expose users to various online threats, including cyberbullying, harassment, hate speech, and misinformation.

What is the role of the eSafety Commissioner?

The eSafety Commissioner, established in 2015, is an independent statutory office tasked with promoting online safety for all Australians, particularly children and vulnerable users. eSafety's role includes:

  1. Education and awareness – eSafety provides valuable resources and information to help Australians of all ages understand the risks associated with social media and how to stay safe online. Their website offers tips, guides, and educational materials.
  2. Online complaints system – eSafety operates an online complaints system that allows users to report harmful content, cyberbullying, or other online safety concerns. The Commissioner works to have this content removed or blocked.
  3. Safeguarding children – protecting children online is a top priority. eSafety collaborates with schools, parents, and community organisations to educate children about online safety and works to remove illegal or harmful content targeting minors.
  4. Regulatory powers – eSafety has a wide range of regulatory powers that facilitate the rapid removal of harmful online content, and its investigative division operates a range of regulatory schemes.

What legislation regulates online platforms?

The Online Safety Act 2021 (Cth)

In July 2021, the Australian Parliament passed the Online Safety Act 2021 (Cth), landmark legislation designed to enhance online safety in Australia. Key provisions of the Act include:

  1. Stronger penalties – the Act increases penalties for social media platforms and individuals who fail to remove harmful content promptly.
  2. Basic online safety expectations – the Act establishes a set of basic online safety expectations for social media services, including transparency reports on content removal and proactive measures to protect end users accessing content from Australia, regardless of whether the provider has an Australian presence.
  3. Cyberbullying provisions – the Act expands cyberbullying provisions to cover adults and introduces a new process for adults to request the removal of material that targets them with serious harm.
  4. Enhanced powers – eSafety's powers have been strengthened, allowing for a more proactive approach to combating online harms.

The Online Safety (Basic Online Safety Expectations) Determination 2022 (the Expectations) is a key element of the Act and helps protect you from online harm. The Expectations are set out in a Ministerial Determination and aim to increase the transparency and accountability of online service providers by ensuring they have adequate terms of service, moderation practices, and reporting and risk processes in place to minimise harm online.

Further to this are the Industry Codes, which are based on Part 9, Division 7 of the Act and were requested by the eSafety Commissioner. Five of the eight codes have now been published, creating mandatory compliance requirements for online service providers. The finalised codes come into force in December 2023 and impose requirements relating to moderation, risk assessments and reporting, although the specific obligations will depend on the nature of the service provided.

Promoting a safer online experience

Online safety is a shared responsibility, and social media users must be proactive in safeguarding their online presence. This involves using privacy settings, being careful about sharing personal information, and exercising caution when engaging with unknown individuals or suspicious content. By staying informed about online safety and leveraging the resources provided by the eSafety Commissioner, you can enjoy the benefits of social media while minimising the associated risks.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.