JAWS Reader - Accessibility Issue Detector for Storyline Courses

Client: AI | Published: 11.02.2026

I already have a partly finished, AI-assisted utility that scans Articulate Storyline courses, simulates a JAWS screen-reader pass, flags every accessibility defect it finds, and suggests code-level fixes. The original vendor could not complete the build, so I need someone who can step in, assess the existing codebase, and carry the project to a reliable, production-ready release.

Current status
• The core crawler and DOM parser are in place and successfully open Storyline-published packages.
• The tool crawls the published course ZIP and detects issues such as missing alt text and missing or broken tab order and tab focus — specifically the defects that surface when the course is run with the JAWS reader.
• A preliminary rule set identifies several common JAWS issues, but coverage is far from complete.
• The AI suggestion engine returns generic fixes; it still needs fine-tuning to Storyline's HTML structure and ARIA patterns.
• Reporting works, but the UI needs polish and export options (HTML/PDF/JSON).

Key expectations
• Deep familiarity with eLearning course architecture, especially the way Storyline emits HTML, CSS, and JavaScript.
• Solid grasp of accessibility standards (WCAG 2 Level AAA) and, most importantly, JAWS behaviours.
• Ability to expand and optimise detection rules, refine the AI prompt/response cycle, and validate results against real JAWS sessions.
• Results should match the manual QA report captured during JAWS testing.
• No direct Storyline plug-in is required; the tool should remain a standalone analyser that ingests a published ZIP or hosted URL.
• Clean, well-documented code and a concise user guide so our in-house QA team can maintain and extend it.

Deliverables
1. Completed detection engine covering the full range of JAWS-related accessibility issues found in Storyline output.
2. AI-generated fix recommendations that reference the exact file, element, and attribute to change.
3. Responsive web interface with a sortable issue list plus export options (HTML/PDF/JSON).
4. Automated test suite with a set of sample courses and known defects for regression checks.
5. Setup script or container image for easy deployment on our internal server.

Acceptance criteria
• When run against the sample courses we provide, the tool must detect at least 95% of the manually confirmed issues and produce actionable fixes.
• Reports open cleanly in JAWS and NVDA, demonstrating the fixes.
• Code passes a peer review for readability and maintainability.

If you have hands-on experience marrying accessibility auditing, screen-reader quirks, and Storyline's unique front-end, I'm ready to share the repo and a detailed roadmap right away.
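To make the kind of detection rule described above concrete, here is a minimal, hypothetical sketch of a ZIP-level scanner. It is not the project's actual code: the function name, rule names, and regex-based checks are illustrative assumptions. It flags two easy-to-detect JAWS-relevant defects — `<img>` tags with no `alt` attribute and positive `tabindex` values, which disrupt the natural tab order.

```python
import io
import re
import zipfile

# Illustrative patterns; a production rule set would use a real HTML parser
# tuned to Storyline's published output rather than regexes.
IMG_TAG = re.compile(r"<img\b[^>]*>", re.IGNORECASE)
ALT_ATTR = re.compile(r"\balt\s*=", re.IGNORECASE)
TABINDEX = re.compile(r'tabindex\s*=\s*["\']?(-?\d+)', re.IGNORECASE)

def scan_package(zip_bytes: bytes) -> list[dict]:
    """Return one issue record (file, rule, snippet) per suspected defect."""
    issues = []
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        for name in zf.namelist():
            if not name.endswith((".html", ".htm")):
                continue
            text = zf.read(name).decode("utf-8", errors="replace")
            # Rule 1: <img> tags missing an alt attribute.
            for m in IMG_TAG.finditer(text):
                if not ALT_ATTR.search(m.group(0)):
                    issues.append({"file": name, "rule": "missing-alt",
                                   "snippet": m.group(0)[:80]})
            # Rule 2: positive tabindex values override natural tab order.
            for m in TABINDEX.finditer(text):
                if int(m.group(1)) > 0:
                    issues.append({"file": name, "rule": "positive-tabindex",
                                   "snippet": m.group(0)})
    return issues
```

Each record names the exact file and offending markup, which is the shape the AI fix-suggestion step would need as input to produce file/element/attribute-level recommendations.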