ARTICLE
16 February 2024

Liability Of Social Media Companies For Injuries And Losses Suffered As A Result Of Posts Hosted On Their Platforms

Paul Usoro & Co

An Examination Of The Live Case Of Gonzalez v Google At The United States Supreme Court.

Outline

  1. Overview/Definition of Concepts
  2. Review of Nigerian laws on the liability of Social Media Companies
  3. Foreign Jurisdiction laws on liability of Social Media companies
  4. Writer's opinion on the appropriateness or otherwise of Social Media liability
  5. Examination of the Ongoing case of Gonzalez v Google in the U.S.

Overview

The question of whether Social Media companies should be liable for injuries, including but not limited to death, and other losses suffered by users and non-users alike as a result of posts hosted on their platforms is gradually becoming a burning issue. In the past, it was not envisaged that a time would come when social media platforms would be employed by faceless, non-state actors to unleash terror on unsuspecting victims. The issue feels more remote in developing countries like Nigeria, where it is not yet so relatable. Yet social media companies have grown exponentially and spread their tentacles so far that these non-state actors, especially terrorist organizations, are increasingly employing their platforms to advance and enhance their nefarious activities.

Social media platforms have spread to the most remote parts of the globe, and so too have their positive and adverse effects. The norm has been to shield social media companies from liability for injuries and losses occasioned by the use of their platforms, and extant laws in most advanced democracies favour such protection in order to encourage the companies' growth. Over time, however, these platforms have increasingly become arenas in which non-state players and organizations perpetrate heinous crimes.

Definition of Concepts

  1. Social Media: Social media are websites and computer programmes that allow people to communicate and share information on the internet using a computer or a cell phone.1
  2. Social Media Companies: These are companies that set up, own and run social media platforms and/or apps. They are the entities recognised by law and clothed with legal personality, and they are responsible for maintaining the various social media platforms. For the purpose of this discourse, these companies include:
    • Meta Inc., which owns Facebook and Instagram and acquired WhatsApp in 2014.
    • Alphabet Inc., the parent company of Google, which owns YouTube.
    • Twitter Inc., which owns and operates the Twitter app.
    • Microsoft Inc., which owns LinkedIn.
    • Snap Inc. (formerly Snapchat Inc.), which owns and operates the Snapchat app.
    • ByteDance, a Chinese company, which owns and operates the TikTok app.
    • Tencent, another Chinese company, which owns the WeChat and QQ apps, etc.
  3. Liability: Liability, in its simplest form, means the state of being legally responsible for something. Civil liability refers to the right of an injured party to hold someone responsible for injuries or damages resulting from that party's wrongful actions. To hold a person or entity civilly liable, the wronged party must have suffered some type of quantifiable loss or damage. This may take the form of personal injury, property damage, loss of income, loss of contract, or a host of other losses. In a civil liability lawsuit, the injured party's losses must have occurred due to the defendant's violation of a law, breach of contract, or other wrongful act, referred to as a "tort". Examples of civil liability cases include injuries and property damage sustained in automobile accidents, and defamation of character claims. To succeed in a civil liability lawsuit, the Plaintiff must prove to the Court, or to a Jury, that it is more likely than not that the defendant's actions caused his injuries or loss. This level of proof is referred to as a "preponderance of the evidence".
  4. Foreign Jurisdiction: In layman's terms, a foreign jurisdiction is any jurisdiction outside the borders of Nigeria. For the purposes of this discourse, foreign jurisdiction refers to the laws of other countries, particularly the laws of the United States, on the liability or otherwise of social media companies for injuries occasioned by posts hosted on their apps.

Review of Nigerian laws on the liability of Social Media Companies

Regrettably, Nigeria does not have any statutory law that shields social media companies from civil liability for third-party content. In the wake of its suspension of Twitter in June 2021, the Nigerian Government, realizing that it had no law enacted for the regulation of social media, directed the National Broadcasting Commission ("NBC") to come up with regulations for social media. Although that development is not directly related to this discourse, which focuses on the liability of social media companies for injuries suffered as a result of the use of their apps, it is important to observe that the absence of laws on this subject also exposes social media companies to uncertainty.

The law as it stands in Nigeria does not provide these social media companies with the protections found in other jurisdictions. Indeed, the law appears to be anti-social media, as the provisions of the NBC Broadcasting Code appear to conflict with some of the policies of the social media companies. Of particular note are the provisions of the NBC Code that prohibit the broadcast of sexual content involving adultery, incest, bestiality, same-sex relations, etc. Of greater interest is Section 3.8.1(b) of the Code, which prohibits broadcasters in Nigeria from broadcasting "any language or scene likely to encourage crime, lead to disorder or any content which amounts to subversion of constituted authority or compromises the unity or corporate existence of Nigeria as a sovereign state". This provision could suffice to hold social media companies liable for users' posts hosted on their platforms, since such posts could be interpreted by the Nigerian judiciary as publications/broadcasts. The Same Sex Marriage (Prohibition) Act, an extant law in Nigeria, is another statute that could operate to make social media companies liable for posts on their platforms that offend its provisions. It is remarkable that no suit seeking to hold any social media company liable for breach of these laws has yet been filed.

In the face of all these laws, however, there is no statute on the liability or otherwise of social media companies for injuries and losses suffered by users and non-users alike as a result of users' posts, or of the employment of social media apps by non-state actors to advance their nefarious activities.

Foreign Jurisdiction laws on liability of Social Media companies

The most popular social media apps in worldwide use were developed and nurtured in the United States, although China has in recent times made tremendous inroads in building and developing social media platforms. This discourse will therefore principally examine United States law on the liability of social media companies.

There is no principal legislation protecting social media companies per se. However, the most outstanding law in the U.S. that protects social media companies from liability arising from users' posts on their apps is the Communications Decency Act 1996 ("CDA"). The CDA was enacted by the United States Congress in its first notable attempt to regulate pornographic material on the internet. Section 230(c) of the CDA is the provision relevant to our discourse, and it provides thus:

(c) Protection for "Good Samaritan" blocking and screening of offensive material

(1) Treatment of publisher or speaker:

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

(2) Civil liability:

No provider or user of an interactive computer service shall be held liable on account of –

(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or

(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).

This section was introduced to promote the development of the Internet (and, by extension, social media) in its infancy by protecting web services from being drowned in lawsuits, and to address concerns about children's access to inappropriate content. It has been interpreted to mean that websites and social media platforms are not liable for third-party content produced and shared by users through those mediums. If a user makes a post defaming another person, it is the user, rather than the platform, that is responsible. Section 230 of the CDA also permits "good faith" action to restrict egregious content posted by users. This allows platform moderators to engage in selective removal/deletion of posts without the platforms losing their immunity from legal responsibility for the content or incurring liability for such removals. Section 230 therefore provides two-fold protection to social media companies, and it has largely prevented any successful liability suit against them in the U.S.

Section 230 of the CDA was applied by the U.S. judiciary in the recent case of Force v Facebook.2 Taylor Force was a business student and an American veteran of the wars in Iraq and Afghanistan. In 2016, during the "knife intifada",3 he was stabbed and killed in Tel Aviv by a Hamas terrorist, and his relatives sought justice by bringing an action in the U.S. Social media (Facebook) was seen as a vital tool for terrorists, and legal action was instituted at a federal court in New York on behalf of five families. It was argued that Facebook gave material assistance to a known terrorist organization, Hamas, by providing it with a communications platform and recommending Hamas' content to users.

The court did not agree with the terrorism victims and dismissed the case; the Court of Appeals affirmed the dismissal, and the Supreme Court declined to intervene. Facebook stood behind an impenetrable statutory wall that made it irreproachable.

Robert Tolchin, Counsel for the petitioners, who has worked on anti-terrorism litigation for almost 20 years, said that "the problem that we have come up against in all these cases was Section 230 of the CDA".

The UK, which currently has no substantive law on the subject, is in the process of making laws to limit the protection that social media platforms enjoy. The proposed law, which is at an advanced stage, would hold social media platforms and their companies responsible and liable for harmful, violent and dangerous posts on their platforms. The move is reportedly aimed at making the UK the safest place for social media users.

Writer's opinion on the appropriateness or otherwise of Social Media liability

Social media has come to stay, and its benefits, in the opinion of the author, far outweigh its demerits. The intention of the U.S. lawmakers in enacting Section 230 of the CDA was, and remains, valid, as it was made principally to protect internet companies and developers from crippling lawsuits and attendant liabilities during the internet's early years. Of interest, however, is the fact that present-day social media platforms were not immediately in contemplation when that section was enacted. It is indeed praiseworthy that Section 230 of the CDA has worked greatly to protect the social media companies.

Over time, the social media platforms have grown geometrically and prospered greatly, courtesy of Section 230 of the CDA. That progress, however, has seen the platforms become increasingly ready and appealing tools for the advancement and propagation of terrorism and harmful content. The platforms have also become so uncensored that disinformation and misinformation have taken steady and obvious abode on them. These posts are largely, if not completely, the handiwork of users, and as such social media companies have escaped, and continue to escape, liability for harms and injuries occasioned by this content, hiding under the canopy of Section 230 of the CDA. Of even graver concern is the fact that some known terrorist organizations have active accounts on social media and at times recruit via these platforms. In all of this, social media companies have evaded liability because of the statutory umbrella of Section 230 of the CDA.

The prevailing circumstances mean that the protection provided to social media companies is due for review. In jurisdictions that have no proper legislation on this subject, it is time to enact laws to cover the gap, and the responsibility rests on their legislators to do so. Such legislation would put social media companies on their toes, prompting them to act promptly and vigilantly to checkmate harmful content put on their platforms. It would also serve as a wake-up call to social media companies to disable all accounts with verifiable links to terrorism. Nigeria is not left out among the jurisdictions that urgently need to legislate for the liability of social media companies in this regard. Being bedevilled by terrorism and other societal ills whose perpetrators frequently employ social media platforms, Nigeria has an even more compelling reason to enact legislation holding social media companies liable where their negligence and indolence result in injury or loss.

The need to sanitize the system, without necessarily interfering with the free use of the platforms, has ripened for legislative action. The companies should not be protected from liabilities which they reasonably ought to have averted with a modicum of care and vigilance, and there ought to be an objective standard for determining such liability. The argument by social media companies that it is not practicable to review all posts on their platforms should no longer suffice to shield them from civil liability. The companies ought to devise a means of tracking such posts and taking them down, or face proportionate civil liability for the outcome. This can be achieved through the deployment of automated tools that track harmful posts, and by monitoring and deleting accounts/posts affiliated with any terrorist group, as sketched below. This liability, however, need not take the form of strict liability: every case should be heard and decided on its merits, applying the letter of the law strictly. The second leg of the protection in Section 230 of the CDA, which empowers social media companies to take down posts that they consider harmful, may be retained with the proviso that same is done in "good faith".
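By way of illustration only, the sketch below shows in Python one very crude form such automated screening could take: posts matching a blocklist of phrases are held back for human review instead of being published. This is a minimal sketch under invented assumptions; the blocklist, threshold and function names are hypothetical, and real platforms rely on far more sophisticated machine-learning classifiers.

```python
# Minimal, hypothetical sketch of automated post screening; NOT any platform's
# actual system. Blocklist, threshold and names are invented for illustration.

FLAGGED_TERMS = {"attack plan", "join our fighters", "bomb-making"}  # hypothetical
REVIEW_THRESHOLD = 1  # matches needed before a post is held for human review

def score_post(text: str) -> int:
    """Count how many blocklisted phrases appear in a post (case-insensitive)."""
    lowered = text.lower()
    return sum(1 for term in FLAGGED_TERMS if term in lowered)

def screen_posts(posts: list[str]) -> tuple[list[str], list[str]]:
    """Split posts into those published normally and those held for review."""
    published, held_for_review = [], []
    for post in posts:
        if score_post(post) >= REVIEW_THRESHOLD:
            held_for_review.append(post)  # a human moderator makes the final call
        else:
            published.append(post)
    return published, held_for_review

if __name__ == "__main__":
    ok, held = screen_posts(["Lovely weather today", "Join our fighters abroad"])
    print("published:", ok)
    print("held for review:", held)
```

A routing step of this kind, coupled with human review, is one objective and auditable standard against which a company's vigilance, or negligence, could be measured.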

Examination of the Ongoing case of Gonzalez v Google in the U.S.

Nohemi Gonzalez was in Paris, France for her studies when she was murdered by Islamic State ("ISIS") terrorists on 15 November 2015. It is over seven years since her death, but the pain and memory persist. Shortly after her death, the decision was made to sue Google (Alphabet Inc.), which owns YouTube. The argument for YouTube's liability for the rise of ISIS and the subsequent death of Nohemi rests on the platform's recommendation system, which algorithmically suggests content similar to that liked or regularly watched by users. In its brief, the Counter Extremism Project detailed that these algorithms are built on the idea that "edgy" content is more attention-grabbing, which leads to inundation and the radicalization of users. The petitioners contend that Google monetized this process through its ad programs while failing to take the necessary action to remove the wave of jihadist content its algorithms were suggesting. Recommending content should make YouTube more than a publisher of third-party content, argued Tolchin. "This is more than a billboard; you are guiding people down the rabbit hole with your algorithms. This isn't someone going to the library and selecting a book; this is a librarian following you around and suggesting books."
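To make the mechanism the petitioners describe concrete, below is a minimal, purely illustrative Python sketch of a similarity-based recommender: each video carries a topic profile, and the system ranks unwatched videos by their similarity to the viewer's accumulated watch history, thereby serving up more of whatever the viewer already consumes. The catalogue, topic weights and function names are all invented for illustration; YouTube's actual algorithms are far more complex and are not public.

```python
# Illustrative sketch of a similarity-based recommender of the kind described
# above. All data and names are hypothetical; this is NOT YouTube's algorithm.
import math

# Hypothetical catalogue: video id -> topic weights.
CATALOGUE = {
    "cooking_101":  {"cooking": 1.0},
    "knife_skills": {"cooking": 0.7, "knives": 0.3},
    "news_digest":  {"news": 1.0},
}

def cosine(a: dict[str, float], b: dict[str, float]) -> float:
    """Cosine similarity between two sparse topic vectors."""
    dot = sum(a.get(k, 0.0) * b.get(k, 0.0) for k in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def recommend(watched: list[str], top_n: int = 1) -> list[str]:
    """Rank unwatched videos by similarity to the user's watch history."""
    profile: dict[str, float] = {}  # aggregate the history into one topic profile
    for vid in watched:
        for topic, weight in CATALOGUE[vid].items():
            profile[topic] = profile.get(topic, 0.0) + weight
    candidates = [v for v in CATALOGUE if v not in watched]
    candidates.sort(key=lambda v: cosine(profile, CATALOGUE[v]), reverse=True)
    return candidates[:top_n]

if __name__ == "__main__":
    print(recommend(["cooking_101"]))  # -> ['knife_skills']
```

The sketch captures the feedback dynamic complained of: whatever dominates a user's history is precisely what the system ranks highest and serves next, which is the "rabbit hole" effect Tolchin describes.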

A companion case was filed by the relatives of Nawras Alassaf, who was murdered by ISIS terrorists in the 2017 Reina nightclub shooting in Istanbul, Turkey. In that case, the complaints against Twitter, Facebook and YouTube allege that the companies knew Islamic State terrorists were using their platforms and took no action, thereby allowing the use of their platforms for communications and recruitment, which constitutes material support for terrorists. "U.S. Congress made it clear that if one helps a terrorist organization and that group commits a terrorist act, such a person can be held responsible," said Keith Altman, a member of the legal team. "The Alassaf case seeks to hold social media companies liable for their knowing assistance to ISIS." While these cases were rejected by the U.S. lower courts on account of Section 230 of the CDA, their appeals were accepted by the U.S. Supreme Court.

The cases were heard by the U.S. Supreme Court on 21 and 22 February 2023. The petitioners argued that the social media companies should be held liable for their knowing assistance to the terrorist groups. The social media giants argued that terrorist content is already forbidden by their terms of service, and that although they lack the capacity to review all content, they have deployed automatic flagging and personnel to remove as much of it as possible. On their account, their routine services were merely being abused: their platforms were not linked directly to the attacks, and they had not knowingly aided or encouraged them. In the Gonzalez case, Google argued that YouTube's videos could not be proven to have radicalized the attackers, and that the complaint amounted to a general grievance about YouTube's role in ISIS's rise to prominence. In Alassaf, the companies argued that they could not be seen as abetting a criminal act of which they had no knowledge and which they did not actively assist. As for the algorithms, Google argued that they were neutral automated tools and that it remained protected from liability under Section 230 of the CDA, noting that every lower court had reiterated these protections. Prominent internet companies and interest groups also filed briefs in support of Google, warning of the impact that would follow if Section 230 of the CDA were to be abolished or restricted.

On 08 March 2023, the U.S. Senate Committee on the Judiciary convened a snap hearing in the aftermath of the Supreme Court hearing, at which Senators from both parties made it clear that they intend to amend the internet law. The snap hearing followed the hearing of Gonzalez v. Google, in which it was alleged that YouTube and other social media platforms aided and abetted ISIS attacks by providing the outlawed group with video streaming services, and in which it was contended that Section 230 of the Communications Decency Act should not be interpreted as a blanket immunity for the social media giants.

The Senators were largely united in their contention that Section 230 of the Communications Decency Act must be amended. The Committee members and experts railed against the policy of immunity, which they claimed is outdated for the modern internet and unjustly prevents terror victims and others from seeking restitution. Senator Josh Hawley, who has been active on the issue over the years, claimed that Section 230 of the Communications Decency Act had been "systematically rewritten" by the courts over the past twenty years, often at the behest of Big Tech companies, to the point where it is now "completely unrecognizable" from what Congress intended.

Conclusion

The wait for the judgment of the U.S. Supreme Court on this issue has begun. While it is unethical to comment on matters that are sub judice, and without pre-empting the Court, the likelihood of the U.S. Supreme Court reading down the protections of Section 230 of the CDA is somewhat remote but not impossible, especially in the light of the escalating employment of social media, and indeed other internet platforms, to achieve unwholesome ends. The U.S. Supreme Court has been afforded a golden opportunity to re-interpret Section 230 of the CDA in its judgment. If it does so, it would then fall to the U.S. Congress to develop a new law. The real outcome of a successful case would be that "some of the ugly things on the Internet would likely be limited, and that is critical".

Footnotes

1. Cambridge Advanced Learner's Dictionary & Thesaurus.

2. Force v. Facebook, Inc., No. 18-397 (2d Cir. 2019).

3. The era of rampant knife stabbings in Jerusalem.

Originally published 06 April 2023

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
