Interactive AR Experience in Unity

Client: AI | Published: 11.02.2026

I’m building a real-time AR application in Unity that puts user interaction at the center of the experience. The concept is solid; what I need now is the scene brought to life with responsive touch and gesture input through AR Foundation, plus polished 3D assets. Unity 3D (URP) will drive the core app, while MediaPipe or OpenCV will handle hand tracking and gesture recognition. On the hardware side, the code must remain portable to Raspberry Pi OS and Arduino-based prototypes, so a clean Python or Node.js bridge is essential.

Here’s what I’d like you to deliver:

• A Unity scene that compiles on Android and iOS, wired for AR Foundation with robust user-interaction logic (tap, pinch, swipe, custom gestures).
• Integration of MediaPipe/OpenCV scripts so gestures trigger in-app events with minimal latency.
• A small library of optimized 3D models (Blender or Maya) that match the style guide I’ll share.
• Clear setup instructions so I can rebuild and deploy on mobile devices and Raspberry Pi boards.

I’m concentrating on AR/XR development right now; object tracking and full environment recognition may be phased in later, so a modular code architecture is appreciated.

When you reply, include links or builds that show comparable past work, especially Unity projects where you implemented custom user interactions. A brief note on how you’d structure the gesture pipeline will also help me gauge fit; to make that question concrete, I’ve sketched below what I currently have in mind. The sooner we can iterate on a working prototype, the faster we move to the next milestone.
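On the tracking side, I’m imagining something like the minimal Python sketch below: MediaPipe’s Hands solution supplies per-frame landmarks, and a pinch is inferred from the thumb/index fingertip distance, edge-triggered so a held pinch fires a single event. The `PINCH_THRESHOLD` value and the `on_gesture` hook are placeholders of mine, not references to any existing code.

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

PINCH_THRESHOLD = 0.05  # normalized-coordinate distance; tune per camera/setup

def on_gesture(name: str) -> None:
    """Placeholder event hook; the real app would forward this to Unity."""
    print(f"gesture: {name}")

def run() -> None:
    cap = cv2.VideoCapture(0)
    with mp_hands.Hands(max_num_hands=1,
                        min_detection_confidence=0.6,
                        min_tracking_confidence=0.6) as hands:
        pinching = False
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB; OpenCV delivers BGR.
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_hand_landmarks:
                lm = results.multi_hand_landmarks[0].landmark
                thumb = lm[mp_hands.HandLandmark.THUMB_TIP]
                index = lm[mp_hands.HandLandmark.INDEX_FINGER_TIP]
                dist = ((thumb.x - index.x) ** 2 + (thumb.y - index.y) ** 2) ** 0.5
                # Edge-trigger: a held pinch should fire exactly once.
                if dist < PINCH_THRESHOLD and not pinching:
                    pinching = True
                    on_gesture("pinch")
                elif dist >= PINCH_THRESHOLD:
                    pinching = False
    cap.release()

if __name__ == "__main__":
    run()
```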
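Because object tracking may be phased in later, I’d also want detectors to be pluggable rather than hard-wired. Something in the spirit of this sketch, where each detector implements one small interface and registers with a dispatcher, is what I mean by modular; every name here is hypothetical.

```python
from abc import ABC, abstractmethod
from typing import Callable, List, Optional

class GestureDetector(ABC):
    """One detector per gesture, so milestones can add or drop them freely."""

    @abstractmethod
    def update(self, landmarks) -> Optional[str]:
        """Inspect one frame of hand landmarks; return a gesture name or None."""

class GesturePipeline:
    """Fans each frame out to registered detectors and reports matches."""

    def __init__(self, on_event: Callable[[str], None]) -> None:
        self._detectors: List[GestureDetector] = []
        self._on_event = on_event

    def register(self, detector: GestureDetector) -> None:
        self._detectors.append(detector)

    def process(self, landmarks) -> None:
        # First match wins for now; priorities or simultaneous gestures
        # could replace this policy without touching the detectors.
        for detector in self._detectors:
            name = detector.update(landmarks)
            if name is not None:
                self._on_event(name)
                break
```

A swipe detector, for example, would keep a short internal history of landmark positions and register through exactly the same interface.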
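Finally, the Python bridge toward the Unity app (or a Raspberry Pi peer) could be as simple as a one-way UDP stream of JSON events, with Unity listening via a small C# UdpClient receiver. The endpoint 127.0.0.1:5065 and the message schema below are assumptions to show the shape, not fixed requirements.

```python
import json
import socket
import time

# Assumed endpoint for the Unity-side listener; adjust per device.
UNITY_ADDR = ("127.0.0.1", 5065)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_gesture(name: str, confidence: float = 1.0) -> None:
    """Serialize one gesture event and send it to the Unity endpoint.

    UDP keeps latency down and tolerates drops, which suits gestures:
    a late event is worthless, so retransmission buys nothing.
    """
    event = {
        "type": "gesture",
        "name": name,
        "confidence": confidence,
        "ts": time.time(),
    }
    sock.sendto(json.dumps(event).encode("utf-8"), UNITY_ADDR)

if __name__ == "__main__":
    send_gesture("pinch", 0.92)  # smoke test against a local listener
```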