
Why it’s worth doing
Halloween STEM works when the ‘spooky’ is a wrapper for real scientific thinking. The hook lowers the barrier to asking questions, spotting patterns, and arguing from evidence. A ‘Spooky Science Studio’ also gives you a natural reason to teach AI literacy: pupils can see, quickly, that AI is useful for generating ideas, but unreliable if treated as a truth machine. If you’ve been building routines around safe, teacher-mediated AI use, this is a satisfying, seasonal application of those habits (especially if you already use simple Primary guardrails like those in teacher-in-the-loop AI micro-routines).
What to avoid is equally important. Don’t let pupils prompt for ‘real ghost evidence’, medical advice, or dangerous experiments. Don’t let AI invent sources, fabricate datasets, or output polished graphs that no one can explain. And don’t allow pupils to upload identifiable photos ‘for analysis’ when a description or a teacher-provided image would do. The aim is a studio mindset: question, test, record, explain—then critique.
The studio model
The Spooky Science Studio has three roles for AI, each with clear boundaries. First, AI is a hypothesis partner. Pupils describe a phenomenon and ask the AI to propose several testable hypotheses with measurable variables. Second, AI is a simulator. Where hands-on testing is slow or impractical, pupils use simple, transparent simulations (often paper-based, with AI helping to generate trials) to explore relationships and uncertainty. Third, AI is an editor. Pupils turn results into a data story, with AI helping to improve clarity, not rewrite thinking. If you want a wider framework for moving from ‘autocomplete’ to genuine co-authoring with evidence, the approach aligns well with evidence-first writing routines.
A practical way to run it is to set up three ‘stations’ (literal tables, or just three phases). At Station 1, pupils produce a one-sentence question and a hypothesis with variables. At Station 2, they gather data via a quick test or simulation. At Station 3, they craft a short data story: claim, evidence, and what they’d do next. The studio language matters. You’re not asking, ‘What’s the answer?’ You’re asking, ‘What can we test, and what would count as evidence?’
Safety first
Halloween themes can drift into frightening content, personal fears, or unsafe experimentation. Start with boundaries in plain language: ‘We’re doing spooky science, not scary stories.’ Keep prompts minimum-data by design: no names, no photos of pupils, no personal medical details, and no location identifiers. If you’re reviewing policies for the year, it’s worth cross-checking your classroom routines with an acceptable use policy refresh checklist, then translating it into pupil-friendly studio rules.
Image and media rules need to be explicit. If pupils are generating images, restrict them to non-photorealistic styles and avoid prompts involving real people. If pupils are analysing images, use teacher-provided, non-identifiable images (or, better, have pupils describe what they observe in words). For any media tool, require a ‘source label’ on outputs: ‘AI-generated’ or ‘teacher-provided’. If you’ve seen how convincing media can become, the safety and literacy workflows in classroom reality checks for generative video and images offer useful routines you can simplify for Halloween week.
Finally, keep experiments safe and boring in the best way. No flames, no aerosols, no ingestion, and no ‘mixing chemicals at home’. Aim for household-safe materials (paper, water, salt, sugar, balloons, torches, rulers) and teacher-controlled demonstrations where needed.
The misinformation routine
A spooky theme invites extraordinary claims, so build in a short routine that pupils can run in under two minutes. Teach it as a studio habit, not a lecture.
The first question is: ‘Is this claim testable?’ Pupils must identify at least one measurable variable and a method that could, in principle, be repeated. If the claim is ‘Ghosts drain batteries’, the testable version becomes ‘Does battery voltage drop faster in a “spooky” location than in a control location, under the same conditions?’ That may still be impractical, but it forces clarity.
The second question is: ‘What would change my mind?’ Pupils state what result would make them reject their hypothesis. This stops them treating AI outputs as confirmation. It also gives you a neat way to spot overconfident reasoning: if nothing could change their mind, it isn’t science yet.
If you’re teaching wider AI ethics and reliability, you can connect this routine to discussion protocols from an AI ethics classroom kit, but keep the Halloween version lightweight: testability, falsifiability, and evidence.
Primary activities
For Primary, the win is ‘teacher-in-the-loop’ investigations where the data are simple and the reasoning is visible. Keep devices shared or teacher-operated, with prompts displayed on the board. A class might explore ‘Which material makes the best “monster-proof” curtain?’ using paper, foil, tissue, and fabric scraps to block torchlight. Pupils predict, test, and record light levels with a simple three-point scale (bright, medium, dim) or by counting how many squares on a grid they can still see through the material.
AI’s role is to suggest fair-test language and help pupils turn results into a short data story. You might type a pupil’s spoken explanation into the AI and ask it to produce two versions: one ‘too confident’ and one ‘careful scientist’. Pupils then choose which matches their evidence and explain why. The outcome is a poster with a bar chart drawn by hand and a caption beginning, ‘We think… because… Next time we would…’
KS3 activities
At KS3, lean into variables, uncertainty, and quick simulations. A classic ‘spooky’ context is a ‘haunted corridor’, where pupils test whether sound seems louder in different conditions. They can model sound absorption using different materials around a phone speaker (teacher-controlled volume), then rate perceived loudness at a fixed distance. If devices are limited, you can run it as a demo and have groups focus on experimental design and data interpretation.
For simulation, choose scenarios where randomness matters. For example, ‘Werewolf infection’ can be reframed as a disease-spread model with clear assumptions. Pupils define a simple rule set (probability of transmission per contact, number of contacts per round), then run 20 trials with AI generating random numbers and logging outcomes. The key learning is not the fantasy; it’s seeing how changing one parameter shifts the distribution of outcomes, and why a single run proves nothing.
Two prompts keep AI in the right lane: ‘Generate 20 random numbers between 1 and 100’ and ‘Create a table to record our trial results’. Pupils then interpret the table themselves.
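If you want to sanity-check the rule set before the lesson, a minimal Python sketch of the same model can help; the function name and parameter values here are illustrative, not prescribed.

```python
import random

def run_trial(population=30, initial_infected=1, contacts_per_round=3,
              p_transmit=0.2, rounds=10):
    """One trial of the 'werewolf infection' model: each infected pupil
    meets a fixed number of random contacts per round, and each contact
    carries a fixed probability of transmission."""
    infected = set(range(initial_infected))
    for _werewolf in infected.copy():
        pass  # (loop below does the real work; this line is not needed)
    for _ in range(rounds):
        newly_infected = set()
        for _source in infected:
            for _ in range(contacts_per_round):
                contact = random.randrange(population)
                if contact not in infected and random.random() < p_transmit:
                    newly_infected.add(contact)
        infected |= newly_infected
    return len(infected)

# 20 trials, as in the classroom version: one run proves nothing,
# but the spread of outcomes across trials is the real lesson.
print(sorted(run_trial() for _ in range(20)))
```

Changing contacts_per_round or p_transmit and re-running makes the ‘one parameter shifts the distribution’ point vivid, and sets up the sensitivity work at KS4/KS5.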
KS4 and KS5 activities
For KS4/KS5, the studio becomes a modelling and critique exercise. Pupils should state assumptions explicitly, test sensitivity, and challenge whether the model matches reality. A ‘vampire repellent’ project can be reframed as optimisation: which combination of reflectivity, scent, and light intensity best reduces ‘approach rate’ in a simulated environment? The science is in defining proxies, acknowledging limitations, and evaluating uncertainty.
Ask pupils to run sensitivity tests: change one parameter by 10% and observe the effect on outcomes. If the output swings wildly, the model may be unstable or poorly grounded. If it barely changes, the parameter may not matter, or the model may be too blunt. Pupils then write a critique paragraph: what the model captures, what it ignores, and what real-world data would be needed.
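To see what that looks like in practice, here is a short continuation of the hypothetical run_trial() sketch above: nudge one parameter by 10% and compare average outcomes over repeated trials.

```python
# Sensitivity check: nudge one parameter by 10% and compare outcomes.
# Assumes the illustrative run_trial() function sketched earlier.
import statistics

baseline = [run_trial(p_transmit=0.20) for _ in range(50)]
nudged = [run_trial(p_transmit=0.22) for _ in range(50)]  # +10%

print('baseline mean:', statistics.mean(baseline))
print('nudged mean:', statistics.mean(nudged))
# A big shift suggests the outcome is sensitive to this parameter (or the
# model is unstable); almost no shift suggests the parameter barely
# matters, or the model is too blunt to register it.
```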
Three project briefs
Brief 1: Monster Materials Lab
- Hypothesis: ‘Foil blocks more light than tissue because it reflects and is less porous.’
- Simulation/test: torch behind materials at a fixed distance; record light rating or grid visibility.
- Data story: a hand-drawn chart plus a cautious conclusion and one improvement.
Brief 2: Potion Cooling Curve
- Hypothesis: ‘Stirring cools a hot drink faster because it increases heat transfer.’
- Simulation/test: teacher-prepared warm water in identical cups; compare stirred vs unstirred; measure temperature every minute (or use ‘warm/less warm’ if no thermometers).
- Data story: line graph and a note about control variables.
Brief 3: Haunted Random Walk (see the sketch after this list)
- Hypothesis: ‘If choices are random, most paths cluster near the start.’
- Simulation: define a corridor as steps left/right; run 30 trials of 10 steps using AI for random directions; tally final positions.
- Data story: histogram, mean/median, and a paragraph on why randomness still creates patterns.
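For Brief 3, a minimal teacher-facing Python sketch (with illustrative values matching the brief) shows why final positions cluster near the start; pupils can generate the same data with AI-supplied random directions and tally by hand.

```python
import random
from collections import Counter

def random_walk(steps=10):
    """Final position after random left/right steps along the corridor."""
    return sum(random.choice([-1, 1]) for _ in range(steps))

# 30 trials of 10 steps, as in the brief; tally the final positions.
positions = [random_walk() for _ in range(30)]
tally = Counter(positions)
for position in sorted(tally):
    print(f'{position:+d}: {"#" * tally[position]}')
# Most final positions land near 0 (the start), even though every
# individual step is random: randomness still creates patterns.
```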
Assessment and evidence
Assessment is easiest when you mark the process, not the polish. A quick rubric can focus on three strands: quality of hypothesis (testable, variables named), quality of method (fair test or clear simulation rules), and quality of reasoning (claim matches data, uncertainty acknowledged). Keep evidence lightweight: a photo of the results table, a pupil’s annotated graph, and a short reflection.
Prompt logs matter too. Ask pupils to copy or paste their prompts into the margin of their work, then add a ‘verification note’ stating what they checked. If you want a fuller structure for showcasing pupil projects with evidence, adapt the formats in the proof-of-learning showcase playbook, but keep Halloween outputs short and explainable.
Mini showcase lesson
A one-lesson showcase works well at the end: set up a gallery walk with each group’s data story on a desk. Pupils circulate with two peer-review questions: ‘What is the claim, in one sentence?’ and ‘What evidence supports it?’ Add a safeguarding check at the door: no personal data, no scary imagery, and any AI media clearly labelled. Finish with a quick whole-class reflection on the misinformation routine: pick one project and ask, ‘What would change your mind?’
If you want a ready-made rhythm for low-stakes showcases, the structure in the summer challenge ladder translates neatly into a Halloween studio without adding workload.
Troubleshooting
Hallucinations will happen, especially if pupils ask broad questions like ‘Explain ghosts scientifically’. Redirect to narrow, measurable prompts and insist on ‘show me the steps’. Overconfident graphs are another common issue: pupils may paste numbers and receive a slick chart with wrong labels or invented units. Make ‘graph checks’ a routine: units, axes, source of data, and whether the chart type matches the question. Finally, if outputs become ‘too spooky to be true’ (claims of toxins, curses, or medical effects), treat it as a teachable moment: run the ‘testable’ and ‘change my mind’ questions, and require a verification note before anything is shared.
Printable checklist and scripts
Use these as a one-page printout or slide.
Teacher checklist:
- I have set theme boundaries (spooky science, not scary content).
- I have chosen safe materials and a no-home-experiments rule.
- I have the minimum-data prompt rules displayed (no names, photos, or locations).
- I have a method for prompt logs and verification notes.
- I have a gallery-walk safeguarding check.
Pupil scripts:
- ‘Our question is… We will measure… and keep the same…’
- ‘AI, suggest three testable hypotheses using variables we can measure in class.’
- ‘AI, generate random numbers for 20 trials and format a results table.’
- ‘Our claim is… Our evidence shows… This might be wrong if… Next time we would…’
May your Spooky Science Studio produce more evidence than eerie rumours.
The Automated Education Team