I need a production-ready Python service that keeps Personio as the single source of truth for employee data while mirroring the information into Microbizz. Each employee must exist twice in Microbizz: once as the actual planning user (team_id 111) and once as the pre-planning user (team_id 116). Personio will fire webhooks on every new employee creation, capability update, or attribute change, and your code should react immediately, creating or updating both Microbizz user records in a fully idempotent way.

Scope of the sync
• Capabilities to transfer: Skills and Certifications only; other fields may flow through untouched but must never override Personio's values.
• Triggers: new employee creation, employee capability updates, and any attribute changes that Personio pushes.
• Error handling: structured, custom logging with clear notifications, plus automatic retries that use exponential back-off. Every action must be recoverable on a rerun without duplicate side-effects.

Technical expectations
The service should be containerised for a Linux host (Dockerfile and docker-compose.yml provided) and follow best practices for retries, back-pressure, reconciliation, and exception safety. Please lean on the official Personio and Microbizz APIs, and feel free to use libraries such as requests/HTTPX, asyncio, or Celery if they help you reach production robustness. A clear separation between the initial full import and ongoing webhook processing is essential.

Deliverables
1. Clean, well-commented Python source code.
2. Dockerfile and docker-compose setup ready for immediate deployment.
3. A README covering environment variables, webhook configuration, and a runbook for monitoring and recovery.
4. A short test plan or script demonstrating idempotency and successful retries after simulated failures.

If you have battle-tested experience building API integrations with robust retry logic and containerised deployment, I'd love to see how you would approach this.
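To make the dual-record requirement concrete, here is a minimal sketch of the idempotent upsert I have in mind. The client interface (`find_by_external_key`, `create_user`, `update_user`) and field names are placeholders for illustration, not the real Microbizz API; the in-memory client exists only to demonstrate the rerun-safety property.

```python
PLANNING_TEAM = 111
PRE_PLANNING_TEAM = 116

def external_key(personio_id, team_id):
    """Deterministic key: a rerun finds the same record instead of duplicating it."""
    return f"personio-{personio_id}-team-{team_id}"

def upsert_employee(client, employee):
    """Mirror one Personio employee into both Microbizz teams, idempotently."""
    results = []
    for team_id in (PLANNING_TEAM, PRE_PLANNING_TEAM):
        payload = {
            "external_key": external_key(employee["id"], team_id),
            "team_id": team_id,
            "name": employee["name"],
            # Only Skills and Certifications are in scope for the sync.
            "skills": employee.get("skills", []),
            "certifications": employee.get("certifications", []),
        }
        existing = client.find_by_external_key(payload["external_key"])
        if existing is None:
            results.append(client.create_user(payload))
        else:
            results.append(client.update_user(existing["id"], payload))
    return results

class InMemoryClient:
    """Stand-in for a real Microbizz client, used here only to show idempotency."""
    def __init__(self):
        self.users = {}
        self._next_id = 1

    def find_by_external_key(self, key):
        return next((u for u in self.users.values()
                     if u["external_key"] == key), None)

    def create_user(self, payload):
        user = dict(payload, id=self._next_id)
        self.users[self._next_id] = user
        self._next_id += 1
        return user

    def update_user(self, user_id, payload):
        self.users[user_id].update(payload)
        return self.users[user_id]
```

Running `upsert_employee` twice for the same employee must leave exactly two Microbizz records (one per team), never four; that is the acceptance criterion for "fully idempotent" above.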
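The retry expectation can be sketched as a small helper (shown hand-rolled here, though a library such as tenacity would be equally acceptable); the function name, defaults, and the choice of retryable exception types are illustrative assumptions, not requirements.

```python
import random
import time

def retry_with_backoff(fn, *, attempts=5, base_delay=0.5, max_delay=30.0,
                       retryable=(ConnectionError, TimeoutError),
                       sleep=time.sleep):
    """Call fn(), retrying transient failures with exponential back-off and jitter.

    The final failure is re-raised so the caller can log it and alert.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except retryable:
            if attempt == attempts - 1:
                raise  # out of retries: surface the error for logging/notification
            # Exponential back-off (0.5s, 1s, 2s, ...) capped at max_delay,
            # plus jitter to avoid thundering-herd retries against the APIs.
            delay = min(max_delay, base_delay * (2 ** attempt))
            sleep(delay + random.uniform(0, delay / 2))
```

The injectable `sleep` parameter is a deliberate design choice: it lets the test plan simulate failures and verify retry counts instantly, without real waiting.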