Human-Like AI Tutor for GCSE Maths

Customer: AI | Published: 17.10.2025

Goal

Build a human-like AI tutor prototype that delivers one high-quality, interactive 10-minute lesson for GCSE Maths (expandable to English/Science). It should feel warm, supportive, and clear, like a great human tutor, while staying transparent, safe, and privacy-respecting. The lesson must follow an evidence-based arc with checks for understanding and adaptive help when students struggle. (Full instructions in the attached PDF, "Human-like AI Tutor — Roadmap".)

Scope (MVP)

Lesson format (10 mins): 1) emotional check-in → 2) quick prior review → 3) objectives/relevance → 4) worked example → 5) guided practice → 6) independent practice → 7) wrap-up/next steps. (Rosenshine-inspired arc; mastery gates and error-specific hints.)

Key capabilities
- Voice out (TTS): natural, expressive neural TTS (e.g., Azure/ElevenLabs).
- Voice in (ASR): real-time speech recognition with barge-in, so the student can answer or interject (see the barge-in sketch below).
- Adaptivity: mastery-based branching; targeted hints; retry/reteach on errors.
- Affect cues (no camera): adapt to hesitation, latency, and phrases like "I don't get this" (slow down, add encouragement, offer a hint).
- Safety & transparency: a clear "You're talking to an AI tutor" disclosure, mic opt-in, content moderation, age-appropriate defaults.
- Authoring (v0): lesson script/branches in JSON/YAML + a simple review/edit path (even a basic admin page or config file); a possible schema is sketched below.
- Platforms: responsive web (desktop & mobile; Chromebook-friendly). PWA acceptable.

Out of scope (for MVP): peer chat, video avatars, AR, full catalog, certification.

Deliverables
- Working web prototype of the 10-minute GCSE lesson (voice in/out, adaptive flow, safety/consent UI).
- Lesson script & branching encoded in JSON/YAML + a simple way to tweak copy/hints.
- Deployment: hosted demo or reproducible setup (Docker/README/env keys).
- Docs: README (run/deploy), how to edit lesson content, API keys/config, known limits.

Acceptance criteria
- Full 10-minute run-through with clear, friendly narration and low-friction turn-taking.
- At least one adaptive branch on incorrect/uncertain answers (reteach + hint, then retry).
- Barge-in works: the student can interrupt to answer; the tutor stops and listens.
- Safety: AI identity disclosure, mic permission flow, basic moderation path.
- Performance: snappy UX; no blocking waits between turns (streaming/fast responses prioritized).
- Mobile & desktop: responsive, touch-friendly UI.

Nice-to-haves (if time allows)
- "Repeat last explanation" button; light avatar/animation to signal speaking/listening.
- Small analytics panel (local) for attempts and time-to-answer.
- Basic educator page listing lesson steps for quick edits.

Suggested tech (open to alternatives)
- Frontend: React/Next (or Vue) + Web Audio/getUserMedia; PWA; WebSockets/WebRTC where helpful.
- Speech: Azure Speech / Google Cloud / ElevenLabs (expressive TTS); browser Web Speech API if suitable.
- Logic: lightweight state machine for lesson phases & mastery gates (sketched below).
- Backend: minimal Node/Express or Python FastAPI proxy for speech APIs + moderation; keep content in JSON/YAML (sketched below).
- Safety: simple moderation (keyword list or API), consent copy, logs. (Regulatory posture: UK GDPR / Children's Code principles.)
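Illustrative sketches (non-binding)

To make the authoring path concrete, one possible shape for the JSON/YAML lesson script is sketched below in TypeScript. This is an assumption to be agreed at M1, not a fixed schema: the field names (`say`, `expect`, `hints`, `reteach`, `onCorrect`, `maxAttempts`) and the sample step are illustrative only.

```typescript
// Possible shape for the lesson script; serialises 1:1 to JSON or YAML.
// All field names are placeholders, not an agreed schema.
type Phase =
  | "check-in" | "prior-review" | "objectives" | "worked-example"
  | "guided-practice" | "independent-practice" | "wrap-up";

interface LessonStep {
  id: string;
  phase: Phase;
  say: string;           // what the tutor speaks (plain text or SSML)
  expect?: string[];     // accepted answers, if the step asks a question
  hints?: string[];      // error-specific hints, offered in order
  reteach?: string;      // short reteach script after repeated errors
  onCorrect?: string;    // id of the next step on a correct answer
  onIncorrect?: string;  // id of the fallback step once the mastery gate trips
  maxAttempts?: number;  // attempts allowed before reteaching
}

// Hypothetical guided-practice step from the algebra lesson:
const sampleStep: LessonStep = {
  id: "gp-1",
  phase: "guided-practice",
  say: "Let's solve 2x + 3 = 11 together. What should we do first?",
  expect: ["subtract 3", "take away 3", "minus 3"],
  hints: ["What could we do to both sides to leave the x term on its own?"],
  reteach: "Remember: undo the + 3 first by subtracting 3 from both sides.",
  onCorrect: "gp-2",
  maxAttempts: 2,
};
```

Keeping the script in this form means the review/edit path can start as a plain config file that an educator edits directly.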
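The "lightweight state machine" for lesson phases and mastery gates can be a single pure function mapping the current step plus the answer outcome to the next action. A minimal sketch, mirroring the branching fields from the script sketch above; the two-attempt gate and the action names are placeholders.

```typescript
// Branching rule: correct → advance; incorrect/hesitant → hint and retry;
// repeated misses past the mastery gate → reteach, then try again.
type Outcome = "correct" | "incorrect" | "hesitant";
type Action = "advance" | "hint" | "reteach";

interface BranchingStep {   // just the fields the state machine needs
  id: string;
  onCorrect?: string;
  onIncorrect?: string;
  maxAttempts?: number;
}

interface TutorState {
  stepId: string;
  attempts: number;
}

function nextTurn(
  state: TutorState,
  outcome: Outcome,
  step: BranchingStep
): { state: TutorState; action: Action } {
  if (outcome === "correct") {
    // Mastery gate passed: move on and reset the attempt counter.
    return { state: { stepId: step.onCorrect ?? step.id, attempts: 0 }, action: "advance" };
  }
  const attempts = state.attempts + 1;
  if (attempts >= (step.maxAttempts ?? 2)) {
    // Gate tripped: reteach (optionally jumping to a dedicated reteach step).
    return { state: { stepId: step.onIncorrect ?? step.id, attempts: 0 }, action: "reteach" };
  }
  // Within the gate: stay on the step, give an error-specific hint, retry.
  return { state: { stepId: step.id, attempts }, action: "hint" };
}
```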
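Barge-in boils down to "the instant the student starts speaking, stop the tutor's audio and listen". A minimal browser-side sketch using the Web Speech API follows; a production build would more likely stream vendor TTS into an `<audio>` element and pause that instead, but the interruption logic is the same. The `handleStudentAnswer` hook mentioned in the comment is hypothetical.

```typescript
// Browser-side barge-in sketch (Web Speech API, where supported).
const SR = (window as any).SpeechRecognition ?? (window as any).webkitSpeechRecognition;
const recognition = new SR();
recognition.continuous = true;
recognition.interimResults = true;

function speak(text: string): void {
  speechSynthesis.cancel();                         // drop anything still playing
  speechSynthesis.speak(new SpeechSynthesisUtterance(text));
}

// Barge-in: the moment speech is detected, cut the tutor off and keep listening.
recognition.onspeechstart = () => {
  if (speechSynthesis.speaking) speechSynthesis.cancel();
};

recognition.onresult = (event: any) => {
  const last = event.results[event.results.length - 1];
  if (last.isFinal) {
    const transcript: string = last[0].transcript.trim();
    // handleStudentAnswer(transcript) would feed the state machine above.
    console.log("Student said:", transcript);
  }
};

recognition.start();
speak("Hi! Ready for a quick bit of algebra?");
```

In practice the recogniser can pick up the tutor's own audio, so echo handling or a brief mic mute at speech onset would need tuning during the test week.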
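On the backend, the proxy can stay very small: its jobs are to keep speech API keys on the server and to run a basic moderation pass before text reaches TTS or a transcript reaches the lesson logic. A hedged Node/Express sketch; the route name, blocklist, and response shape are assumptions, and the vendor call itself is left as a placeholder.

```typescript
// Minimal speech/moderation proxy sketch (Express).
import express from "express";

const app = express();
app.use(express.json());

const BLOCKLIST = ["example-banned-word"];  // swap for a maintained list or a moderation API

function flagged(text: string): boolean {
  const t = text.toLowerCase();
  return BLOCKLIST.some((word) => t.includes(word));
}

app.post("/api/tts", async (req, res) => {
  const { text } = req.body as { text?: string };
  if (!text) return res.status(400).json({ error: "text required" });
  if (flagged(text)) return res.status(422).json({ error: "content blocked" });

  // Forward to the chosen TTS vendor here (Azure / ElevenLabs), reading the key
  // from process.env so it never ships to the browser. Vendor call omitted.
  return res.json({ ok: true });
});

app.listen(3001, () => console.log("speech proxy listening on :3001"));
```

The same shape works in FastAPI if Python is preferred; the point is that the browser never sees a vendor key and every utterance passes one moderation gate.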
Implementation plan (6 weeks max; can deliver faster)

Week 1 – Plan & setup
- Finalize topic (e.g., GCSE Maths: algebra step-by-step). Lock lesson script, misconceptions, hints.
- Scaffold app; pick TTS/ASR; set keys; draft consent copy.

Week 2 – Voice I/O
- Integrate expressive TTS; implement ASR capture; prototype barge-in (stop playback on user speech).
- "Say–listen–respond" loop working.

Week 3 – Adaptivity
- Encode script + branching (correct → advance; incorrect/hesitant → hint/reteach → retry).
- Affect cues from latency/lexical phrases → slower pace/encouragement.

Week 4 – UX polish & safety
- Responsive UI (subtitle captions, mic state, progress steps); repeat button; mobile testing.
- Safety: AI identity banner, mic opt-in, basic moderation, fallback to text input.

Week 5 – Test & refine
- Edge cases: silence, mis-recognition, multi-attempt loops. Tune thresholds; improve copy/SSML.
- Documentation: run/deploy, content editing.

Week 6 – Deploy & handover
- Hosted demo or Docker; final fixes; walkthrough & code handover.

(Reference roadmap pillars: pedagogy, empathy/affect, low-latency streaming, safety, educator control.)

Budget & commercials
- Budget: £750–£1,500 all-in (contractor to propose API usage assumptions).
- Engagement: fixed price with milestone payments (see below).
- Ownership: full source code + rights to extend; the contractor may reference the work generically in their portfolio.

Milestones (suggested)
- M1 (20%) – Architecture & lesson script signed off; TTS/ASR "hello world".
- M2 (30%) – End-to-end voice loop & basic branching working.
- M3 (30%) – Feature-complete MVP (UX, safety, adaptivity), mobile/desktop tested.
- M4 (20%) – Deployed demo + docs + handover.

What to include in your bid
- Relevant voice/ASR and real-time web examples (links).
- Proposed tech choices (speech APIs, framework) and reasons.
- Timeline per milestone + risks/mitigations (latency, ASR accuracy).
- Notes on API costs during prototyping and how you'll keep latency low.
- Any UX ideas to make the tutor feel more human (prosody, pacing, empathy cues); one example is sketched below.
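As one concrete example of the prosody/pacing/empathy cues asked for above: a minimal sketch of camera-free affect detection, where long response latency or confusion phrases slow the pace and add encouragement before the next hint. The latency threshold, phrase list, and SSML wrapper are placeholders to be tuned during the Week 5 testing pass.

```typescript
// Camera-free affect cues: response latency + simple phrase matching.
const CONFUSION_PHRASES = ["i don't get", "i don't understand", "no idea", "too fast", "what do you mean"];

type AffectSignal = "ok" | "hesitant" | "confused";

function detectAffect(transcript: string, responseLatencyMs: number): AffectSignal {
  const text = transcript.toLowerCase();
  if (CONFUSION_PHRASES.some((phrase) => text.includes(phrase))) return "confused";
  if (responseLatencyMs > 6000 || text.length === 0) return "hesitant";  // 6 s is an arbitrary starting threshold
  return "ok";
}

// Example reaction: slow the delivery slightly and lead with encouragement.
function adjustDelivery(signal: AffectSignal, ssmlFragment: string): string {
  if (signal === "ok") return ssmlFragment;
  const preamble = signal === "confused"
    ? "No worries, this one trips a lot of people up. Let's take it slowly. "
    : "Take your time, you're doing fine. ";
  return `<prosody rate="90%">${preamble}${ssmlFragment}</prosody>`;
}
```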