Black History Month: an AI representation audit
October 24, 2025
AI can speed up Black History Month planning, but it can also reproduce stereotypes, omit key figures, and quietly centre “default whiteness” in images and text. This article offers a practical representation audit you can run on AI-generated classroom materials: images, short biographies, and display language. You’ll find a bias-checking workflow, quick critical media literacy activities for pupils, and a printable checklist to improve the final version. The goal is simple: safer, more accurate, more inclusive materials—made transparently.
World Mental Health Day: AI wellbeing copilot
October 10, 2025
World Mental Health Day is a timely moment to strengthen school wellbeing systems, but AI must be used carefully. This guide sets out a “wellbeing copilot” approach: low-stakes chatbot support for check-ins, signposting and staff workload relief, while safeguarding stays firmly human-led. You’ll find non-negotiable boundaries, escalation routes, and record-keeping expectations, alongside UK GDPR-friendly data minimisation. Practical scripts are included, plus a clear do-not-do list for pupil-facing mental health use and a simple week-one rollout plan.
Teaching AI Ethics: 2025/26 Classroom Kit
September 26, 2025
AI ethics lessons can’t rely on one-off trolley problems any more. In 2025/26, pupils are encountering agentic AI, deepfake voice, AI companions, and AI used in school admin—often before adults realise. This classroom kit offers 12 updated, phase-banded case studies with teacher notes, a repeatable 10–20 minute discussion protocol that builds reasoning rather than ‘hot takes’, and low-marking assessment ideas that capture evidence of process. You’ll also find safeguarding, privacy and inclusion checks, plus a printable prompt pack and one-page run sheet.
EU AI Act: One Year On
September 17, 2025
One year on from the EU AI Act, UK schools are not directly regulated by it, but many of the products schools buy are built, marketed, or supported by suppliers who are. This article translates “high-risk” thinking into a practical procurement and governance playbook: vendor questions, a simple risk register, and audit-ready documentation. It also shows how to align this work with UK expectations around data protection, safeguarding, and assessment integrity—without over-claiming legal obligations.
Year 7 Induction: Safe AI Charter in Tutor Time
September 10, 2025
This ready-to-run Year 7 tutor-time programme covers the first fortnight with low-stakes ice-breakers that build belonging while setting clear AI boundaries. Pupils co-create a one-page ‘Safe AI Charter’ anchored in a simple rule: no pupil data, no accounts, no screenshots of personal information. Across ten short sessions, tutors teach prompt hygiene, verification habits, and calm daily routines such as check-ins, device norms, and help-seeking scripts—reducing September anxiety and creating consistent expectations across subjects.
Refreshing Your AI Acceptable Use Policy
August 18, 2025
An AI Acceptable Use Policy written once and filed away won’t keep pace with tools, assessments, and expectations in 2025–26. This guide reframes your AUP as a living “AI Use & Integrity Agreement” with an annual July/August refresh you can actually run. You’ll find a 12-point checklist, practical assessment boundaries, and data protection defaults that reduce risk by design. It also covers stakeholder sign-off, clear pupil and parent/carer communications, and a simple monitoring loop that supports learning without turning school into a surveillance project.
Summer AI Challenge Ladder
July 9, 2025
The Summer AI Challenge Ladder is a simple, four-week set of missions that helps students use AI thoughtfully across subjects, even with mixed device access at home. Each week offers a choice board with clear time boxes, plus low-device alternatives so nobody is excluded. A paper-first evidence pack keeps learning visible through prompt logs, verification checks, and reflection. The programme ends with a family-friendly showcase using a rubric that rewards habits and thinking over polished outputs.
Student AI Project Showcase Ideas
June 24, 2025
An end-of-year AI showcase can easily reward the glossiest output rather than the strongest learning. This playbook helps you run a ‘proof-of-learning’ celebration where every project includes a short evidence pack: decision log, prompt trail, verification checks and reflection. You’ll find practical format options, a moderation-friendly judging approach, and routines for safeguarding and media consent. The goal is simple: celebrate thinking, integrity and impact—so students can be proud of both what they made and how they made it.
Year 7 Transition Day AI Literacy Carousel
June 17, 2025
Transition Days are about belonging, confidence, and routines that reduce September anxiety. This timetable-ready “AI Literacy Carousel” adds a safe, low-stakes layer: pupils learn how to use AI with minimal data, recognise hallucinations and bias, and practise prompt hygiene without needing lots of devices. Six short stations (10–15 minutes each) are mostly paper-based, supported by clear staff scripts and safeguarding boundaries. The day ends with a pupil-friendly Safe AI Charter linked to your school values, signed and taken home—then revisited in tutor time to embed habits early.
KS3/KS4 AI Exploration Week
May 23, 2025
AI Exploration Week is a five-day, student-led project sprint that treats AI as a research and design tool, not a writing shortcut. This timetable-ready scheme builds curiosity while keeping boundaries tight: daily enquiry questions, 10–15 minute mini-lessons on bias, hallucinations and citations, and structured studio time with clear checkpoints. Assessment is evidence-first, focusing on process, source trails and decision-making, so mixed device access is workable. The week ends with a simple showcase that celebrates thinking, not ‘AI magic’.
Exam-Season AI Traffic Lights for Schools
May 2, 2025
Exam season is when AI rules most often unravel: different teachers say different things, students guess what’s allowed, and well-meaning support can tip into malpractice. This one-page “AI traffic-light” boundary system gives a shared language for revision, homework, coursework/NEA, controlled assessment and exams. You’ll get clear permitted/restricted/prohibited uses, quick ways to introduce the system in five minutes, ready-to-say scripts for staff, students and families, and integrity checks that work even when you can’t reliably “detect AI”.
Phase-banded AI ethics dilemmas toolkit
April 24, 2025
AI ethics can feel abstract, yet pupils meet its effects daily: recommendations, image filters, chatbots, and “too-good-to-be-true” videos. This phase-banded toolkit offers short, story-led dilemmas for Primary, KS3 and KS4, designed for tutor time, PSHE and computing without needing technical detail or real pupil data. Each scenario uses a consistent, safe discussion protocol that helps learners reason about fairness, privacy, consent, deepfakes and ownership.
One year of Sora: a classroom reality check
April 21, 2025
A year on from Sora-style video generation entering mainstream conversation, teachers are asking a practical question: what actually works in a classroom, and what still causes problems? This reality check focuses on the shifts you’ll notice most—better coherence, improved text handling, and more usable editing controls—alongside predictable failure modes like continuity glitches, broken physics, biased portrayals, and unsafe outputs. You’ll find low-stakes use cases, a media literacy sequence, safeguarding boundaries, workload-aware workflows, and a 30-day pilot plan with clear “keep/kill” criteria.
Student Perspectives on AI in Class
February 27, 2025
“Student voice on AI” should do more than collect opinions. Done well, it protects trust, surfaces equity issues, and produces practical classroom norms students understand and will follow. This post sets out a 2–3 week “student AI listening cycle” using a safe survey, small focus groups, and quick classroom trials. The goal is a one-page, student-authored AI classroom agreement plus a short set of policy-ready insights on assessment, privacy, trust, and access—without turning decision-making into a popularity contest.
Digital Citizenship and AI
October 24, 2024
As AI tools move into everyday schoolwork, they must become part of digital citizenship, not an optional extra. This article gives teachers age-banded, ready-to-teach mini-units that weave online safety, ethics and academic integrity into practical AI activities. With examples from primary through to upper secondary, and options for low- or no-device classrooms, you can help pupils actually practise responsible AI use rather than simply memorising rules. It includes ideas for classroom routines, pupil agreements and ways to link lessons with school policy and home.