iOS Real-Time Video Enhancer

Client: AI | Published: 23.12.2025

I need a compact iOS application that takes a live video feed from the device camera, lets the user fine-tune the picture in real time, and then pushes the processed stream to an external display (AirPlay) for use with Rokid-style AR glasses.

Core workflow
• Capture video from whichever camera is active.
• Allow on-screen, low-latency controls for contrast, brightness, zoom (scale) and frame position.
• Run contour / edge detection on every frame and draw a crisp green outline over detected obstacles.
• Mirror the enhanced feed horizontally and cast it via AirPlay; latency must stay low enough that the image is comfortable in head-mounted glasses.

Technical notes
– Target platform is strictly iOS; choose whichever native stack you prefer (Swift, Objective-C, or a blend).
– I expect the usual Apple toolchain: AVFoundation for capture, Core Image / Vision / Metal (if helpful) for processing, and standard AirPlay APIs for casting.
– The video input must remain abstracted so the active camera can be swapped later without rewriting the core pipeline (the sketches at the end of this brief show one possible shape).

Deliverables
1. Xcode project with well-commented source and a buildable target that runs on the latest public iOS release.
2. A simple UI with live sliders or steppers for the four adjustments plus a toggle for the green-outline overlay.
3. Demonstrable AirPlay output; a short screen recording showing the stream on a secondary display is enough for acceptance.
4. README covering build steps, third-party libraries (if any), and pointers to where the image-processing code lives.

If you have prior experience with real-time video manipulation or AR headset integrations, please mention it; otherwise, a clear plan for hitting 30 fps (or better) with minimal delay will be enough for me to get started with you.
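To make the pipeline expectations concrete, here are a few rough, untested Swift sketches. All names in them (VideoSource, CameraSource, Adjustments, enhance, and so on) are placeholders of mine, not a required design.

First, one way the capture layer could stay abstracted: the rest of the app only ever sees frames, never the camera itself.

```swift
import AVFoundation
import CoreImage

// Placeholder protocol: downstream code only receives CIImage frames,
// so the concrete camera can be swapped without touching the pipeline.
protocol VideoSource: AnyObject {
    var onFrame: ((CIImage) -> Void)? { get set }
    func start()
    func stop()
}

// One possible AVFoundation-backed implementation.
final class CameraSource: NSObject, VideoSource, AVCaptureVideoDataOutputSampleBufferDelegate {
    var onFrame: ((CIImage) -> Void)?
    private let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "camera.frames")

    override init() {
        super.init()
        session.sessionPreset = .high
        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { return }
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()
        output.alwaysDiscardsLateVideoFrames = true   // drop stale frames to keep latency low
        output.setSampleBufferDelegate(self, queue: queue)
        if session.canAddOutput(output) { session.addOutput(output) }
    }

    func start() { queue.async { self.session.startRunning() } }
    func stop()  { queue.async { self.session.stopRunning() } }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let buffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        onFrame?(CIImage(cvPixelBuffer: buffer))
    }
}
```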
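The four adjustments plus the outline could then collapse into a single Core Image chain. This sketch assumes the built-in CIEdges filter is an acceptable starting point for the contour pass; Vision or a custom Metal kernel can replace it later if quality or speed demands it.

```swift
import CoreGraphics
import CoreImage

// Hypothetical parameter bundle driven by the UI sliders.
struct Adjustments {
    var brightness: CGFloat = 0      // CIColorControls expects roughly -1...1
    var contrast: CGFloat = 1        // 1 = unchanged
    var zoom: CGFloat = 1
    var offset = CGPoint.zero
    var outlineEnabled = true
}

func enhance(_ frame: CIImage, with a: Adjustments) -> CIImage {
    // 1. Zoom and reposition the frame.
    var image = frame.transformed(by: CGAffineTransform(scaleX: a.zoom, y: a.zoom)
        .translatedBy(x: a.offset.x, y: a.offset.y))

    // 2. Brightness / contrast in one built-in filter.
    image = image.applyingFilter("CIColorControls", parameters: [
        kCIInputBrightnessKey: a.brightness,
        kCIInputContrastKey: a.contrast,
    ])

    guard a.outlineEnabled else { return image }

    // 3. Edge map, used as a mask to blend solid green over the frame.
    let edges = image.applyingFilter("CIEdges", parameters: [kCIInputIntensityKey: 4.0])
    let green = CIImage(color: CIColor(red: 0, green: 1, blue: 0)).cropped(to: image.extent)
    return green.applyingFilter("CIBlendWithMask", parameters: [
        kCIInputBackgroundImageKey: image,
        kCIInputMaskImageKey: edges,
    ])
}
```

Because the chain is lazy, the whole thing resolves in a single render through a shared, Metal-backed CIContext, which is what should make 30 fps realistic.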
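For the AirPlay side, my understanding is that the receiver appears as a second UIScreen that can host its own UIWindow. The sketch below uses the long-standing notification-based API (newer iOS versions also offer a scene-based external-display role, which would be fine too); the horizontal mirroring for the glasses is a plain transform.

```swift
import UIKit

final class ExternalDisplayController {
    private var externalWindow: UIWindow?

    init() {
        NotificationCenter.default.addObserver(
            forName: UIScreen.didConnectNotification,
            object: nil, queue: .main) { [weak self] note in
            guard let screen = note.object as? UIScreen else { return }
            self?.attach(to: screen)
        }
        NotificationCenter.default.addObserver(
            forName: UIScreen.didDisconnectNotification,
            object: nil, queue: .main) { [weak self] _ in
            self?.externalWindow = nil   // tear down when the glasses disconnect
        }
    }

    private func attach(to screen: UIScreen) {
        let window = UIWindow(frame: screen.bounds)
        window.screen = screen
        // Placeholder for the real renderer view controller; the transform
        // gives the horizontal mirror required by the head-mounted optics.
        let renderer = UIViewController()
        renderer.view.transform = CGAffineTransform(scaleX: -1, y: 1)
        window.rootViewController = renderer
        window.isHidden = false
        externalWindow = window
    }
}
```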
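If the pieces are shaped this way, the glue between them stays tiny, which is really the point of the abstraction:

```swift
// Hypothetical wiring: camera frames flow through the enhancer once,
// then get rendered to both the phone preview and the external window.
let source: VideoSource = CameraSource()
var settings = Adjustments()   // updated from the sliders (thread-safety omitted here)

source.onFrame = { frame in
    let output = enhance(frame, with: settings)
    // Render `output` to both screens with a shared CIContext here.
    _ = output
}
source.start()
```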