In August 2020, a group of game firms and online companies came together to form the Oasis Consortium (the "Consortium"), a non-profit dedicated to advancing common ethical standards and technologies for the emerging Metaverse. In less than two years, the Consortium has become a standard-setter for what may be the most exciting new industry in the world.

Today, the Consortium includes executives from Roblox, Fandom, Riot Games, Pandora, News Corp, and Grindr. Between them, these companies reach hundreds of millions of users, many of whom have already started engaging with Metaverse technologies. By December 2021, the group had already released proposed "User Safety Standards," a set of guidelines for Metaverse companies to follow going forward, which include hiring trust and safety officers, employing content moderation, and integrating the latest research on fighting toxicity. The Consortium is also working on a grading system so that the public knows where a given company stands on trust and safety, much as restaurants are graded on cleanliness.

Given recent events, the Metaverse appears to need the Oasis Consortium now more than ever. In December 2021, the MIT Technology Review reported that at least one beta tester of Meta's virtual social platform, Horizon Worlds, had already complained of being sexually assaulted by a fellow user. Before the Metaverse can achieve substantial growth, users will need assurance that it is a safe and ethical digital platform.

