A governance and risk issue for SGBs and executive teams
Deepfakes and AI-generated manipulation are no longer fringe tech. In a school environment, a learner or staff member’s face can be pulled from a sports photo, a class video, or a social media post and inserted into content designed to humiliate, sexualise, threaten or incite ridicule. The speed of sharing is what catches schools off guard. So is the pressure on leadership to respond decisively before speculation becomes the story.
For school governing bodies and executive teams, an abusive deepfake is not simply a discipline matter. It is a governance and risk event. It impacts duty of care, wellbeing, privacy and personal information handling, workplace safety and community trust. It also tests whether your policies and response processes are fit for modern threats.
This article sets out a practical, South Africa-relevant framework for school leadership: what to do immediately, how to handle sensitive material responsibly, when to escalate, and what preventative steps reduce risk.
The South African Legal Context Summarised
You don’t need a single “deepfake law” for deepfake harm to create legal exposure. Depending on the facts, schools may be dealing with a mix of:
- Privacy and personal information obligations, especially where images, videos, or school-held information are used, mishandled or shared inappropriately (including obligations to apply appropriate safeguards).
- Cyber-related criminal conduct, particularly where content involves threats, harassment, harmful distribution or manipulation shared electronically.
- Children and learner protection considerations, especially where a minor is involved in sexualised or exploitative content.
- Governance and discipline duties, including maintaining a disciplined and purposeful school environment, with clear codes of conduct and fair processes.
The practical point is simple. Schools should respond with a disciplined process first, then apply the right legal pathway based on the nature of the content and how it was created and distributed.
The First Response Window: 24–72 Hours
1) Contain the spread, don’t feed it
Your first objective is harm reduction.
- Identify where the content is circulating (platforms, groups, accounts, devices).
- Instruct staff immediately: No forwarding, no informal sharing, no “just so you can see” circulation.
- Trigger platform reporting and takedown steps as soon as possible.
A frequent error is circulating the content internally to “confirm” what it is. That can multiply the harm, worsen reputational fallout and complicate later processes.
2) Sensitive content warning: Handle with extreme care
If the deepfake is sexual, explicit, or involves a minor, staff can unintentionally make things worse by storing, copying, or sharing the material “for evidence”.
Schools should:
- Limit exposure to a small, authorised custodian team only.
- Avoid storing the content broadly on shared drives, group chats or personal devices.
- Record what is necessary for an incident log (who reported, when, where it appeared, how it was distributed) while keeping any captured material to an absolute minimum.
Where the content is particularly serious, it is often best to get legal advice early on how to preserve information without creating further legal and ethical risk.
3) Preserve information properly, with controlled access
Schools often need evidence for internal processes, but evidence handling must be minimal, secure and controlled.
- Appoint a single custodian (or small custodian team).
- Capture only what is necessary (links, timestamps, limited screenshots where needed).
- Store material in a restricted location with a clear record of who accessed it and when.
- Start an incident log: what was reported, by whom, when actions were taken and what those actions were.
This supports fair discipline processes and responsible information management.
4) Protect the affected person and stabilise the environment
Your response will be judged on how you reduce ongoing harm.
- Put support in place for the affected learner or staff member (including counselling structures).
- Reduce opportunities for pile-ons, retaliation and “spectator sharing”.
- Brief staff on how to manage classroom/campus talk without naming, shaming or repeating details.
A calm, caring response builds trust and reduces the likelihood of escalating conflict.
5) Who created the deepfake matters
It is not always a current learner. Deepfakes may be created by:
- A current learner,
- A former learner,
- A parent or someone linked to the school community, or
- A third party outside the school.
This affects the available response options. Where the source is inside the school, discipline processes may be primary. Where it is outside the school, the school may need to lean more heavily on legal escalation, takedown actions and external reporting pathways, while still stabilising the school environment internally.
6) Staff can be targets too. Treat those incidents as high priority
Where staff are targeted, the situation may raise additional workplace considerations. Schools should treat these incidents as urgent, including:
- Workplace safety and wellbeing,
- Internal reporting lines and support,
- Fair process, and
- Disciplined communications to prevent harassment and reputational damage.
Even a “learner discipline” incident can quickly become a staff safety and employment issue if not handled early and properly.
A Simple Escalation Guide for Leadership
“Escalate wisely” is only helpful if it is made practical. As a starting point:
Manage internally (with tight controls) where:
- The content is limited in spread and is not sexual or threatening,
- It appears to be a first-time incident,
- The source is identified as a learner and the school can run a fair discipline process quickly, and
- There is no indication of hacked systems or wider compromise.
Seek legal support urgently where:
- The content is sexualised, explicit, or involves a minor,
- Threats, extortion, coercion, or ongoing harassment are present,
- The content targets staff or suggests broader risk to workplace safety,
- There is uncertainty about who created it, or
- Community panic is rising and communications must be tightly managed.
Consider external reporting where:
- Serious harm is evident (especially sexualised content involving minors),
- There are threats, stalking-type patterns, or ongoing harassment,
- The perpetrator is outside the school and the school lacks internal leverage, or
- There are signs of a broader cyber incident.
This is not about “handing everything to the police”. It is about choosing the right pathway while continuing to contain harm and support those affected.
Communications: Calm, factual and controlled
Deepfake incidents create fear, and fear spreads quickly. Schools need a communications approach that is reassuring, disciplined, and legally careful. Good practice includes:
- Appointing a single spokesperson (and sticking with that choice),
- Communicating that the school is responding and supporting those affected,
- Making it clear that creating, distributing, and participating in harmful content is treated seriously,
- Avoiding unnecessary detail that could identify a victim or intensify harm,
- Controlling staff communication channels and expectations.
Parent groups and social media can become accelerants. The school’s message should not become a debate about the content.
Preventing The Next Crisis
The strongest schools are not the ones that never face incidents. They are the ones that respond with clarity because rules and procedures already exist.
1) Update policies so AI misconduct is explicitly covered
Many acceptable use and social media policies predate generative AI. Schools should clearly prohibit:
- Manipulating images, video, or audio of real people without consent,
- Creating or distributing humiliating or sexualised content,
- Operating fake accounts to target learners or staff,
- Possessing or forwarding harmful content (not only creating it).
Clear rules reduce “grey areas” and make discipline enforceable.
2) Align codes of conduct with modern online misconduct
Schools should be clear about:
- How online conduct can fall within the school’s oversight when it impacts the school community,
- What triggers formal discipline,
- The sanctions and corrective steps available,
- How incidents are investigated fairly and quickly.
3) Review image and information governance
Deepfakes thrive on accessible images. Schools should assess:
- What learner photos are publicly available online,
- How consent is obtained, recorded, and withdrawn,
- Who can access high-quality photos and video and how they are shared,
- How staff use parent groups and third-party platforms,
- Whether security safeguards match the risk profile.
If school-held material is used or systems are compromised, the school should treat it as a potential data governance issue and get appropriate advice on obligations and reporting.
4) Train staff on “first hour” steps
Most harm happens early. Training should cover:
- What not to do (especially forwarding and informal sharing),
- Who to alert internally,
- What to preserve and how to preserve it,
- How to support learners without spreading detail,
- How to communicate calmly with parents and colleagues.
A short, practical protocol is more valuable than a long policy document.
5) Build a one-page incident response protocol
Schools don’t need a perfect system. They need a usable system. A one-page protocol should set out:
- Containment steps,
- Sensitive content handling and evidence controls,
- Support steps,
- Discipline steps,
- Communications steps,
- Escalation triggers and who makes decisions.
A minimum viable response (so leadership isn’t overwhelmed)
Even well-resourced schools benefit from a simple baseline:
- One incident custodian,
- One incident log,
- One staff instruction: “do not share; report to X”,
- One spokesperson,
- One clear containment and support plan.
That level of discipline prevents chaos.
How Barnard can support schools
This is a problem that rewards preparation. Schools that treat it as a governance risk are far more likely to contain harm and prevent repeat incidents.
Barnard can assist SGBs and executive teams with:
- Reviewing and updating acceptable use, social media, discipline and privacy policies,
- Building incident response protocols tailored to the school environment,
- Delivering practical training for executives and staff,
- Advising during active incidents, including evidence handling, escalation choices and disciplined communications.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.