I need a mobile application built from the ground up that can continuously track a user's location, analyse that data with predictive analytics, and display context-aware information through an augmented-reality overlay.

Core workflow
• The GPS module keeps an accurate, low-latency lock on the device position and movement history.
• That live data is fed into an AI layer (preferably a lightweight on-device model, or a cloud endpoint) that looks for patterns, forecasts the next likely locations, and suggests actions or insights.
• An AR view (ARKit, ARCore, or any comparable framework you prefer) then anchors those insights to the real world, so the user sees relevant, real-time information hovering over their surroundings.

What I will call "done"
• Smooth background location tracking with minimal battery drain.
• Predictive-analytics results delivered within two seconds of a position update.
• A real-time information overlay that remains stable, readable, and correctly aligned while the user moves the camera.
• Source code, build instructions, and a short video demo proving all three components working together on at least one platform.

Please let me know which tech stack you'd like to use, any past work that demonstrates similar GPS, AI, or AR integration, and an estimated timeline for a first test build.
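To make the prediction step and the two-second latency target concrete, here is a minimal sketch of the simplest possible forecaster: dead reckoning by linear extrapolation of the two most recent GPS fixes. Everything here is an assumption for illustration only (Python, the hypothetical `predict_next_position` helper, and the coordinates); the developer is free to substitute any on-device or cloud model that meets the latency bar.

```python
def predict_next_position(history, horizon_s=2.0):
    """Forecast the device position `horizon_s` seconds ahead by linear
    extrapolation of the two most recent GPS fixes (dead reckoning).

    `history` is a list of (timestamp_s, lat, lon) tuples, oldest first.
    This is a placeholder for whatever predictive model is actually used.
    """
    if len(history) < 2:
        raise ValueError("need at least two fixes to extrapolate")
    (t0, lat0, lon0), (t1, lat1, lon1) = history[-2], history[-1]
    dt = t1 - t0
    if dt <= 0:
        return lat1, lon1  # duplicate or out-of-order fix: hold position
    # Per-second velocity in degrees, projected `horizon_s` seconds ahead.
    vlat = (lat1 - lat0) / dt
    vlon = (lon1 - lon0) / dt
    return lat1 + vlat * horizon_s, lon1 + vlon * horizon_s

# Usage: a device moving steadily north-east, one fix per second.
fixes = [(0.0, 51.5000, -0.1000), (1.0, 51.5001, -0.0999)]
lat, lon = predict_next_position(fixes, horizon_s=2.0)
```

A model this cheap trivially satisfies the sub-two-second requirement on-device; the point of the sketch is only to pin down the expected interface (position history in, forecast position out) so that GPS, AI, and AR layers can be developed against it independently.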