I’m building a native watchOS app, paired with an iOS companion, that uses the Apple Watch’s accelerometer to create passive motion-detection timer splits. Everything must be written in SwiftUI (no storyboards) and designed to work offline-first, syncing seamlessly with the iPhone as soon as a connection is available.

Core experience
• Passive motion detection drives the auto-split logic; the user never taps to mark laps. Movement alone does that.
• Subtle haptic “success” taps confirm each detected split.
• At a glance, users can view current split progress through rich complications on all supported watch faces.

Companion iOS app
The iPhone app mainly handles data review, settings, and background sync. It shares a single SwiftUI codebase where practical and uses Core Data + CloudKit to keep workouts consistent whether the devices are connected or not.

End-to-end delivery
1. Fully functional watchOS app with passive split detection, haptic feedback, and complications.
2. iOS companion with history, settings, and offline-first sync.
3. App Store assets, provisioning, TestFlight builds, and final submissions for both targets.

Acceptance criteria
– Splits fire accurately under typical running motion in airplane mode.
– The success haptic triggers within 0.2 s of each split event.
– Complications refresh within the system-allowed window and never exceed power-budget guidelines.
– Sync resolves conflicts automatically when connectivity resumes.
– App Store review passes without rejection.

If you’re comfortable owning architecture, SwiftUI animations, Core Motion, Core Haptics, the complications API, and the full submission workflow, let’s talk.
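
To make the core experience concrete, here is a minimal sketch of how the passive split detection and haptic confirmation could work on the watch, using Core Motion’s accelerometer stream and WatchKit haptics. The threshold, sample rate, smoothing factor, and cooldown values are illustrative assumptions, not tuned requirements, and the `PassiveSplitDetector` type name is hypothetical.

```swift
import CoreMotion
import WatchKit

/// Fires a "split" when the smoothed acceleration magnitude crosses a
/// threshold, with a cooldown so one burst of motion counts only once.
/// All numeric values below are placeholder assumptions for illustration.
final class PassiveSplitDetector {
    private let motion = CMMotionManager()
    private var smoothed = 0.0                 // low-pass filtered magnitude, in g
    private var lastSplit = Date.distantPast
    private let threshold = 1.4                // g; assumed trigger level
    private let cooldown: TimeInterval = 5     // ignore re-triggers for 5 s

    /// Called with the timestamp of each detected split.
    var onSplit: ((Date) -> Void)?

    func start() {
        guard motion.isAccelerometerAvailable else { return }
        motion.accelerometerUpdateInterval = 1.0 / 50.0   // 50 Hz sampling
        motion.startAccelerometerUpdates(to: .main) { [weak self] data, _ in
            guard let self, let a = data?.acceleration else { return }
            let magnitude = (a.x * a.x + a.y * a.y + a.z * a.z).squareRoot()
            // Exponential moving average smooths out single-sample spikes.
            self.smoothed = 0.9 * self.smoothed + 0.1 * magnitude
            let now = Date()
            if self.smoothed > self.threshold,
               now.timeIntervalSince(self.lastSplit) > self.cooldown {
                self.lastSplit = now
                // Subtle confirmation tap, played immediately on detection.
                WKInterfaceDevice.current().play(.success)
                self.onSplit?(now)
            }
        }
    }

    func stop() { motion.stopAccelerometerUpdates() }
}
```

Because the haptic plays in the same handler that detects the crossing, the 0.2 s feedback budget is mostly a matter of sample rate and smoothing lag; a production version would tune those against real running data.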
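
For the offline-first requirement, one plausible shape for the shared persistence stack is `NSPersistentCloudKitContainer`, which queues local writes and mirrors them to CloudKit when connectivity returns. This is a sketch under assumptions: the model name `Splits`, the singleton shape, and the last-writer-wins merge policy are illustrative choices, not specified by the brief.

```swift
import CoreData

/// Shared persistence stack for both targets. NSPersistentCloudKitContainer
/// mirrors the local SQLite store to CloudKit; changes made in airplane mode
/// are persisted locally and pushed once a connection is available.
final class PersistenceController {
    static let shared = PersistenceController()
    let container: NSPersistentCloudKitContainer

    private init() {
        // "Splits" is a hypothetical data-model name for this sketch.
        container = NSPersistentCloudKitContainer(name: "Splits")
        // Pick up remote changes pushed from the paired device automatically.
        container.viewContext.automaticallyMergesChangesFromParent = true
        // Property-level merge favoring in-memory changes: one automatic
        // conflict-resolution choice among several Core Data offers.
        container.viewContext.mergePolicy = NSMergeByPropertyObjectTrumpMergePolicy
        container.loadPersistentStores { _, error in
            if let error { fatalError("Store failed to load: \(error)") }
        }
    }
}
```

The merge policy is what satisfies “sync resolves conflicts automatically”; whether property-level last-writer-wins is acceptable, or a custom policy is needed, is a design conversation for kickoff.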