The Swiss Federal Council has recently announced that it is considering enacting new laws to regulate large platforms, including search engines (e.g. Google), social networking platforms (e.g. Facebook), multimedia platforms (e.g. YouTube) and microblogging services (e.g. Twitter), which could have far-reaching implications for the tech industry and digital communication. The move comes in response to growing concerns about the potential misuse of these platforms, particularly with regard to hate speech, fake news and other harmful content.

Such platforms have become essential to modern life, enabling people to stay connected, conduct business and share information, but they also pose risks for public communication in the country. While these platforms have their own terms of service and community guidelines, there is currently no specific Swiss legal framework governing their operation or holding them accountable for harmful content shared on them.

What problems have been identified, and what could these regulations look like?

Identified problem areas include hate speech, false information, quasi-censorship and a lack of transparency. To tackle these issues, the Federal Council wants to strengthen users' rights and enable greater transparency. To that end, measures considered by the Swiss government include the following:

  • Large platforms would be required to comply with certain privacy and security standards (including the encryption of user data and the implementation of measures to prevent the spread of harmful content), and to be more transparent about their moderation policies and processes.
  • Large platforms would have to appoint a legal representative and a point of contact in Switzerland.
  • Users whose content or access has been suspended, limited or removed should be able to ask the platform to review the measure directly.
  • Users should be able to report harmful and illegal content quickly and easily, and platforms should review such complaints, take action and inform users about the decision in a timely and accessible manner.
  • A Swiss arbitration board would need to be set up, and large platforms would be required to fund it.
  • Large platforms would need to be more transparent about their advertising practices, including informing users about the target audience and the main parameters used to determine to whom an advertisement is displayed.

Swiss authorities are also considering holding these large platforms liable for harmful content shared on them, which could have significant financial and legal implications. The move is part of a broader effort by the Swiss government to improve online safety and protect user privacy. Last year, Switzerland introduced new data protection laws imposing strict limits on companies' collection, storage and use of personal data.

The government has instructed the communications ministry to prepare a draft law on the regulation of large platforms for consultation by the end of March 2024. Where appropriate, the new provisions will be based on the European Union's Digital Services Act.

Ultimately, any discussion or legislative action on the topic will depend on a number of factors, including the views of lawmakers, industry stakeholders and members of the public. However, given the growing concerns about online security and privacy, it seems likely that more countries and regulatory bodies around the world will take similar steps in the coming years to hold technology companies more accountable.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.