Enhance iOS ARKit LiDAR App

Client: AI | Published: 21.12.2025

I already have an iOS app in the store that relies on ARKit and the LiDAR sensor, but several of its core features need a technical tune-up. Right now the 3D scanning, object detection, and environmental mapping work, yet they struggle with accuracy and speed once models become dense. I’m looking for a Swift developer who lives and breathes Apple’s ARKit mesh APIs and has hands-on experience with LiDAR; bonus points if you have played with Space Capture, Polycam, or similar apps and understand the tricks they use to keep meshes clean and frame rates high.

Here’s what I need from you:

• Jump straight into the codebase and focus purely on development; UI and QA are already covered.
• Refactor my existing pipeline so the raw LiDAR data produces tighter, gap-free meshes, faster object classification, and more stable environmental mapping.
• Implement incremental improvements rather than sweeping rewrites; the goal is to enhance what’s there, not reinvent it.
• Hand back well-documented commits plus a TestFlight build so I can validate changes on multiple devices (iPad Pro M1, iPhone 14 Pro).

Acceptance is simple: the updated build must generate a visibly finer mesh (≤5 mm average triangle edge on a standard room scan), maintain 30 fps or better during capture, and keep the scene’s world mapping status at least “extending” for 90% of a two-minute session.

If that sounds like your wheelhouse, let’s talk and get the repository in your hands.
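For reference, the ≤5 mm mesh-density criterion can be checked mechanically rather than by eye. Below is a minimal sketch of a hypothetical validation helper: it assumes the mesh has already been exported as a vertex array plus triangle index triples (the layout `ARMeshGeometry` exposes), and the `Vertex` type and `averageEdgeLength` function are illustrative names, not part of the existing codebase.

```swift
import Foundation

// Illustrative vertex type; positions are in metres, matching ARKit's world units.
struct Vertex { let x: Float; let y: Float; let z: Float }

// Euclidean distance between two vertices.
func distance(_ a: Vertex, _ b: Vertex) -> Float {
    let dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z
    return (dx * dx + dy * dy + dz * dz).squareRoot()
}

// Hypothetical acceptance check: mean length over every edge of every triangle.
// A scan passes the posting's bar when this returns ≤ 0.005 (5 mm).
func averageEdgeLength(vertices: [Vertex], triangles: [(Int, Int, Int)]) -> Float {
    guard !triangles.isEmpty else { return 0 }
    var total: Float = 0
    for (i, j, k) in triangles {
        total += distance(vertices[i], vertices[j])
        total += distance(vertices[j], vertices[k])
        total += distance(vertices[k], vertices[i])
    }
    return total / Float(triangles.count * 3)
}

// Example: one right triangle with 3 mm and 4 mm legs (edges 3, 4, 5 mm).
let verts = [Vertex(x: 0, y: 0, z: 0),
             Vertex(x: 0.003, y: 0, z: 0),
             Vertex(x: 0, y: 0.004, z: 0)]
let mean = averageEdgeLength(vertices: verts, triangles: [(0, 1, 2)])
print(mean <= 0.005 ? "PASS" : "FAIL")  // mean is 4 mm, so this prints PASS
```

In the real app the vertex and index buffers would be read from each `ARMeshAnchor`'s geometry on-device; the helper above only shows the metric I'll use to sign off on a TestFlight build.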