Ofcom has recently issued guidance about the Online Safety Act in the video gaming context. It has emphasised that the Act makes online businesses, including gaming companies, legally responsible for keeping UK users, especially children, safe online, even if they are based outside the UK.
The guidance includes:
- how the rules in the Act apply to online video game services;
- the risks of harm for users, especially children; and
- what video game services need to do to comply.
How the Online Safety Act applies
Online games can allow people of all ages to play, create, explore and express themselves. However, whilst games companies work hard to create safe environments for players and design measures to protect players in the game, some features of online video game services can be exploited or misused, and games can become an unsafe environment, especially for children. The Online Safety Act applies to certain types of online services, including those where users can interact with each other, or create, share or upload content. Ofcom has a useful tool which providers can use to check whether the rules apply to them. Broadly, the Act applies to services that:
- have links to the UK; and
- are a 'user-to-user' service – an internet service that enables users to generate, share or upload content (such as messages, images, videos, comments, audio) on the service that may be encountered by other users of the service. This includes services that enable user interactions.
On online video game services, this could take many different forms. Users may interact by creating or manipulating player profiles, avatars, objects and the environments themselves, or by using voice and text chat (including, for example, team-based channels or open-world chats). The online safety rules also apply where games use matchmaking systems to connect users with each other, including strangers, through mechanisms such as populating lobbies and/or assigning players to teams, and where services enable livestreaming.
On a user-to-user service, while the Act covers user-generated content, it does not cover content published by the provider of the service (except for online pornography). On online video game services, this could include, for example, offline gameplay, original or additional game content developed and published by a studio, or the enforcement of PEGI age ratings.
The online safety risks in gaming
Ofcom explains that there is evidence about the risks of online harm in games. Its Online Experiences Tracker has shown that many 13 to 17-year-olds report being highly concerned about trolling (47%), one-off abusive behaviour or threats (45%), and intentional harassment (37%) – known as 'griefing' – while playing games online.
NSPCC research on online grooming highlights "voice or text chat services built into online multiplayer games" as methods used by grooming perpetrators to approach children.
Charities such as Catch-22 and the Children's Society have supported child victims/survivors of criminal exploitation who have been recruited into trafficking via online video games. Meanwhile, the NSPCC helpline has supported children who have encountered suicide and self-harm content in a game environment.
Ofcom media literacy research reports that online games were the third most likely place for 'nasty or hurtful' behaviour to occur among children aged 8-17 (12%, after social media at 16% and messaging apps at 15%).
The Online Safety Act introduces new legal duties for providers of regulated services to assess the risk of harm from illegal content and certain kinds of content harmful to children.
Providers must then put in place measures to keep people safe online.
Illegal content
Ofcom's research and its Register of Risks indicate that online video game services can be at increased risk of harm. The Register identifies 17 kinds of illegal harm, including:
- terrorism;
- child sexual exploitation and abuse, in particular grooming;
- hate offences; and
- offences relating to harassment, stalking, threats and abuse.
Content harmful to children
For children, Ofcom's research and its Children's Register of Risks have identified 12 types of harmful content, including:
- abuse and hate content;
- bullying content; and
- violent content.
How to comply
Ofcom has produced guidance and resources to help gaming platforms comply with the law. In particular, providers need to:
- check if the Online Safety Act applies to them;
- carry out their illegal content assessment and put in place protections under the safety and related duties;
- complete their children's access assessment and, if applicable, carry out their children's risk assessment and put in place protections under the children's safety and related duties; and
- comply with the record-keeping and review duties for the above activities.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.