I want to launch a web platform whose core function is data analysis, specifically focused on how visitors interact with the site itself. Every click, scroll, hover, and dwell time should be captured, stored, and fed into an AI layer that surfaces patterns, segment-level insights, and actionable recommendations. The emphasis is on website interaction data; while purchase history or social media feeds could be interesting later, they are out of scope for this first release.

The build needs three tightly integrated pieces: a tracking script that records session events in real time, a scalable backend (Python, Node, or similar) that cleans and aggregates those events, and a lightweight ML pipeline (TensorFlow, PyTorch, or a comparable framework) to run clustering and predictive models. Results must be displayed on a clean dashboard with filters for date range, traffic source, and device type so non-technical stakeholders can explore the findings.

Deliverables
• Fully responsive website with built-in event tracking
• Backend data pipeline and secure database schema
• Trained models that classify user segments and predict drop-off points
• Interactive analytics dashboard with export capability (CSV/JSON)
• Setup documentation and a brief hand-off video walkthrough

Acceptance criteria
• Data collection accuracy above 95%
• Dashboard loads in under two seconds at 100k daily events
• Model outputs match the supplied test cases

With these elements in place, the site will provide real-time behavior analytics that drive UX and marketing decisions from day one.
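To make the backend requirement concrete, here is a minimal Python sketch of the kind of cleaning and per-session aggregation expected. The event fields (`session_id`, `type`, `ts`) are illustrative assumptions, not a required schema:

```python
from collections import defaultdict

# Hypothetical raw events as the tracking script might emit them;
# field names and timestamp units (ms) are illustrative only.
RAW_EVENTS = [
    {"session_id": "s1", "type": "click", "ts": 1000},
    {"session_id": "s1", "type": "scroll", "ts": 1500},
    {"session_id": "s1", "type": "click", "ts": 4000},
    {"session_id": "s2", "type": "hover", "ts": 2000},
]

def aggregate_sessions(events):
    """Drop malformed events, group by session, and derive simple metrics."""
    sessions = defaultdict(list)
    for e in events:
        # Basic cleaning: keep only events with the fields we need.
        if e.get("session_id") and e.get("type") and isinstance(e.get("ts"), int):
            sessions[e["session_id"]].append(e)
    summary = {}
    for sid, evs in sessions.items():
        evs.sort(key=lambda e: e["ts"])
        summary[sid] = {
            "event_count": len(evs),
            "clicks": sum(1 for e in evs if e["type"] == "click"),
            "dwell_ms": evs[-1]["ts"] - evs[0]["ts"],  # span of the session
        }
    return summary
```

A production pipeline would stream events into a queue and aggregate in windows, but the shape of the output above is what the dashboard and models should consume.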
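For the segmentation models, the deliverable may use TensorFlow, PyTorch, or a comparable framework; the dependency-free Python sketch below only illustrates the expected clustering behaviour on per-session feature vectors such as (clicks, dwell seconds). It is a reference for the spirit of the acceptance tests, not the required implementation:

```python
import random

def _dist2(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def _mean(points):
    """Component-wise mean of a non-empty list of vectors."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def kmeans(points, k, iters=20, seed=0):
    """Cluster session feature vectors into k segments.

    Returns (centroids, labels) where labels[i] is the segment of points[i].
    Plain Lloyd's algorithm with fixed iterations; fine for a smoke test.
    """
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        for idx, p in enumerate(points):
            labels[idx] = min(range(k), key=lambda c: _dist2(p, centroids[c]))
        for c in range(k):
            members = [p for p, lab in zip(points, labels) if lab == c]
            if members:  # leave an empty cluster's centroid in place
                centroids[c] = _mean(members)
    return centroids, labels
```

Drop-off prediction would sit alongside this as a supervised model (e.g. classifying sessions that end before a target action), trained on the same aggregated features.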
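The dashboard's export requirement (CSV/JSON) can be checked against a small, framework-agnostic serializer. This Python sketch assumes export rows are dictionaries sharing the same keys; that uniform-key assumption is mine, not mandated by the brief:

```python
import csv
import io
import json

def export_rows(rows, fmt="csv"):
    """Serialize aggregated dashboard rows to CSV or JSON text.

    `rows` must be a non-empty list of dicts with identical keys;
    the first row's keys define the CSV header order.
    """
    if fmt == "json":
        return json.dumps(rows, indent=2)
    if fmt != "csv":
        raise ValueError(f"unsupported format: {fmt}")
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

Wiring this to a download endpoint (with the active date-range, traffic-source, and device filters applied before export) satisfies the export deliverable.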