Anti-Bullying Week digital citizenship response kit

Rehearse, report, record — and embed

A teacher guiding pupils through a digital citizenship scenario on a classroom screen

What 2025 needs

Anti-Bullying Week can still drift into posters, slogans and a single assembly that fades by Friday. In 2025, pupils’ social lives are threaded through group chats, gaming voice channels, shared documents and anonymous reporting features that change monthly. Schools therefore need response readiness: a simple route for reporting, staff who know what to say in the first two minutes, and pupils who have practised what to do as bystanders. If you are also reviewing your AI approach this year, it helps to align this kit with your wider safeguarding approach and AI boundaries, including how staff use AI tools and where human judgement must remain central. The thinking overlaps strongly with World Mental Health Day: AI wellbeing copilot boundaries, because pupils’ online harm and wellbeing are inseparable in day-to-day pastoral work.

This “incident response kit” is designed to link PSHE and Computing in a sustained way: the same reporting route appears in tutor time, one lesson, and staff huddles, then becomes routine for the next month. It is not a new policy. It is the usable, one-page version of what you already expect.

Define the behaviours

Before you can respond consistently, you need shared language. Bullying is repeated, intentional harm where there is a power imbalance. Banter is mutual and welcome; the moment it is unwelcome or coercive, it is no longer banter. Conflict is a disagreement between people of relatively equal power; it can be serious, but the response is different from bullying.

Cyberbullying is not a separate category so much as a context that can amplify harm. Online spaces increase audience size, persistence (screenshots and forwards), and ambiguity (tone is hard to read). A single post can be “one incident” but still have repeated impact, because it is viewed and reshared repeatedly. Agree as a staff team what you will call it in your setting, then be consistent with pupils and parents/carers.

A practical classroom example: a Year 8 pupil says, “It was just a joke,” after posting an edited photo in a group chat. Your definition work lets you respond calmly: “Intent matters, but impact matters too. The photo was shared without consent and it’s spreading. That’s harmful behaviour, and we need to deal with it.”

Your reporting route

Your route should be teachable in under a minute, and it should clearly separate “tell us quickly” from “we will investigate and decide”. Role clarity reduces the most common failure point: pupils telling three adults and assuming “someone else will log it”.

In one page, define what happens for pupils, staff, and parents/carers. Pupils need two options: a trusted adult route and a digital route (a monitored form or email). Staff need an immediate safeguarding threshold reminder and a single place to record. Parents/carers need a clear promise about acknowledgement times and what information you can and cannot share.

If you are refreshing staff expectations around AI and digital tools this year, make sure your reporting route sits alongside your wider guidance, rather than in a separate document nobody reads. A useful companion is the annual AI acceptable use policy refresh checklist, because it helps you keep “where to report concerns” visible in staff routines.

Recording and evidence

Recording is where good intentions often go wrong. You want enough detail to support action, patterns, and follow-through, without collecting excessive personal data or creating a new harm by circulating content.

Capture what you need: the who/what/when/where, the platform or context, the exact words used (copy them accurately), and the impact as reported. Note any immediate safety concerns and who was informed. Where possible, store a single screenshot in a secure system, with restricted access, rather than forwarding it around staff email chains.
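If your single recording location is a digital form or spreadsheet, the fields above can be sketched as a minimal record structure. This is an illustrative sketch only, not a prescribed system: the field names and the `IncidentRecord` class are assumptions for the example, and your own safeguarding software or log template takes precedence.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

# Minimal incident record: "record the minimum needed to act, store it once"
# (field names are illustrative assumptions, not a required schema)
@dataclass
class IncidentRecord:
    reported_at: datetime          # when the report was made
    reporter_role: str             # e.g. "pupil", "staff", "parent/carer"
    who: str                       # pupils involved (initials or IDs, not whole chat histories)
    what: str                      # the exact words used, copied accurately
    where: str                     # platform or context, e.g. "class group chat"
    impact: str                    # impact as reported, in the reporter's words
    safety_concern: bool = False   # immediate safeguarding threshold met?
    informed: list = field(default_factory=list)   # who was told, e.g. the DSL
    evidence_ref: Optional[str] = None  # pointer to one secure copy, not an attachment
```

The design choice mirrors the rule in the text: `evidence_ref` points to a single screenshot stored once in a secure system, rather than the content itself circulating in the log or in email chains.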

Avoid what you do not need: do not ask pupils to keep re-opening harmful content “to prove it”. Do not encourage pupils to screenshot and share widely. Do not store whole chat histories unless there is a clear safeguarding rationale. If the content is illegal or meets a serious safeguarding threshold, follow your established safeguarding procedures immediately and avoid amateur “evidence gathering” that risks contaminating an investigation.

A simple rule for staff helps: “Record the minimum needed to act, store it once, and keep it secure.”

AI scenario rehearsals

AI can help you rehearse situations without using real pupil stories. You can generate age-appropriate vignettes, vary details, and practise language as a team. The key is to keep humans in the loop: you are not asking AI to decide sanctions; you are using it to help staff and pupils practise responses. If you want a safe way to introduce this idea with younger pupils, the routines in KS1–KS2 teacher-in-the-loop AI playbook translate well to cyberbullying rehearsals: short, structured, and teacher-guided.

Here are six copy-and-adapt vignettes (each can be run as a five-minute “stop and think” in tutor time, or a longer role-play in PSHE/Computing). When you adapt them, change names, platforms, and context to match your setting without mimicking a current real incident.

  1. KS2 (Year 5/6): A pupil is repeatedly excluded from a class gaming group. Others post “we forgot” and use laughing emojis when the pupil asks to join. A bystander feels uneasy but worries about losing friends.

  2. KS3 (Year 7/8): A group chat shares a “rate people in our year” poll. One pupil is repeatedly scored low with comments about appearance. A screenshot is posted to a wider chat.

  3. KS3/KS4: Someone creates an account using another pupil’s name and posts rude comments “as them”. The targeted pupil is blamed and starts skipping lessons.

  4. KS4 (GCSE years): A private photo is shared without consent after a relationship ends. The sender says it was “only to one person”, but it spreads. Friends argue about whether to report it.

  5. KS5: A pupil receives repeated anonymous messages implying they do not belong in a subject because of identity. The messages arrive late at night, affecting sleep and attendance.

  6. Staff-facing scenario: A staff member is shown a screenshot by a pupil at the classroom door. The screenshot includes other pupils’ names and a slur. The pupil asks, “Please don’t tell anyone.”

For each vignette, rehearse three moves: what the bystander can do in the moment, what the targeted pupil can do next, and what the first adult response sounds like. If you use AI to generate variations, set clear constraints: no real names, no real platforms if that risks glamorising, and no graphic content.

Ready-to-say scripts

Scripts are not about sounding robotic; they are about reducing panic and inconsistency. In cyberbullying, the first adult response often determines whether pupils keep reporting.

For a first response to a disclosure, aim for calm, belief in the experience, and a clear next step: “Thank you for telling me. You’ve done the right thing. I’m going to record what you’ve shared and pass it to the right person today. You’re not in trouble for reporting.” If a pupil shows you content, add: “You don’t need to keep looking at it. Let’s note what we need once, and then we’ll put it away.”

For bystander support, you want to validate and give a doable action: “It takes courage to speak up when it’s your friends. You can save a single screenshot if it’s safe, then report it. You don’t have to argue in the chat to help.”

For an alleged perpetrator, keep it factual and avoid a courtroom tone: “We’re looking into a report about messages sent from your account. I’m going to ask you some questions, and we will decide next steps once we’ve gathered information.” This reduces the chance of escalation and protects fairness.

For a parent/carer call, clarity matters: “I’m calling to let you know we’re responding to an online incident involving your child. I can’t share details about other pupils, but I can explain our process and what support we’re putting in place.” If you need to discuss device checks, consent, or platform reporting, stick to your policy and safeguarding guidance.

If you want to build staff confidence quickly, a short INSET micro-routine works well: read a vignette, practise the first two sentences, then agree the recording step. The approach aligns with INSET day AI workshop micro-routines, even if your focus here is incident response rather than classroom AI use.

PSHE and Computing links

To avoid the “one-off assembly” trap, make PSHE do the human work: identity, empathy, consent, power, and help-seeking online. When pupils explore identity, include how online spaces can intensify belonging and exclusion. When you teach empathy, practise interpreting impact when tone is unclear. When you teach consent, include forwarding, screenshotting, tagging, and “adding someone to a group” without asking. When you teach power, include audience size, anonymity, and influencer dynamics within year groups. When you teach help-seeking, rehearse the exact reporting route and what happens after a report.

A simple example: after discussing consent, ask pupils to rewrite a message that pressures someone to share a photo into a message that respects boundaries. Then link it directly to what they should do if they receive an image they did not ask for.

Computing makes the response practical: privacy, security, screenshots, platform reporting, and digital footprints. Pupils should learn how to take a screenshot safely (capturing the key evidence without sharing it), how to block and report within common platforms, and why “deleting it” does not remove it from circulation. Digital footprints should be taught as both risk and protection: “Your choices leave traces, and traces can support investigations.”

If you are already running student-led digital projects, you can integrate this as a short audit task: pupils design a “reporting route” poster that is accurate, accessible, and non-judgemental. The enquiry approach fits well with KS3–KS4 AI exploration week projects, even if you keep the focus on safety rather than AI.

Platform reporting routes

Pupils often know how to report in theory, but freeze in the moment. A practical checklist should be short enough to remember and specific enough to follow. Teach it as a routine: pause, protect, prove, report, recover. Pause before replying. Protect by blocking or leaving if safe. Prove by capturing one screenshot or link. Report in-app and to school. Recover by checking in with a trusted adult and using support strategies.

Build in one crucial nuance: pupils should not “pile on” by confronting the perpetrator publicly. Encourage private support to the targeted pupil and formal reporting instead. For older pupils, include a reminder that sharing harmful content “to show someone” can become re-sharing.

A 5-day week plan

Anti-Bullying Week (10–14 Nov) works best when it is small, repeated, and consistent. Use tutor time for rehearsal, one curriculum lesson for skills, and one short staff huddle for alignment.

On Monday, introduce the definitions and the one-page reporting route, then practise a 60-second “how to report” routine. On Tuesday, run one vignette in tutor time focused on bystanders, and in the staff huddle agree the first-response script and recording location. On Wednesday, teach the Computing lesson on screenshots, privacy and platform reporting, using a neutral demo environment. On Thursday, teach the PSHE lesson on consent, power and empathy online, linking back to the same reporting route. On Friday, revisit the route, share anonymised “what we learned” themes, and remind pupils what follow-through looks like.

If you have a new intake or a cohort that needs extra scaffolding, you can adapt this into induction routines too, drawing on the structure in Year 7 induction safe AI charter routines to keep it light, repeated, and memorable.

After the week

Embedding is where impact happens. Over the next 30 days, keep one small routine alive: a weekly two-minute reminder in tutor time of the reporting route, plus one staff check-in on recording quality and follow-through. Add a simple confidence measure for pupils (“I know how to report”, “I know what will happen next”) and for staff (“I can respond in the moment”, “I know where to record”). Track reporting clarity by sampling a handful of logs: are key fields completed, is evidence stored once, and are actions recorded?

Most importantly, close the loop. When pupils report, they need to see that adults act. You may not be able to share details, but you can say, “Thank you. We’ve followed this up, and we will keep you safe.” That sentence, repeated consistently, is what turns Anti-Bullying Week into a culture rather than a campaign.

May your reporting routes be clear, your responses calm, and your follow-through consistent.

The Automated Education Team
