I need a clear, reproducible workflow that lets GitHub Copilot sit seamlessly inside Databricks notebooks and VS Code so we can automate code generation and analysis across our data platform. The focus is on day-to-day tasks: writing and debugging Python/PySpark, generating robust SQL, transforming data with PySpark, and generating test scripts from requirements, which is exactly where Copilot will save us the most time. Our team is comfortable with Python, PySpark, SQL, and Git and already works in Databricks at an intermediate level, but we've never wired Copilot into this environment.

I'm looking for someone who can:

• Configure Copilot for both VS Code and the Databricks notebook experience while keeping Git history clean.
• Demonstrate automated code suggestions, inline debugging help, and SQL generation inside an active Databricks cluster.
• Provide a lightweight CI/CD hook (GitHub Actions is fine) to validate Copilot-generated code before merge; a sketch of the kind of check I have in mind follows at the end of this brief.
• Deliver concise documentation plus a live walk-through so the team can replicate the setup on their own workspaces.

Acceptance is straightforward: I will spin up a fresh workspace, follow your guide, and expect Copilot to suggest, debug, and transform code in the notebook exactly as shown in your demo. If it works end-to-end, the job is done.
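
To make the CI/CD expectation concrete, below is a minimal sketch of the kind of pre-merge check the GitHub Actions workflow could run on each pull request. It assumes pyspark and pytest are installed in the CI image; the transform name (add_row_count) and test layout are hypothetical stand-ins for whatever Copilot-generated code a given PR touches.

# Minimal sketch of a pre-merge validation test, assuming pyspark and pytest.
# add_row_count is a hypothetical Copilot-generated transform used for illustration.
import pytest
from pyspark.sql import SparkSession, functions as F


def add_row_count(df):
    # Hypothetical Copilot-generated transform: tag each row with the total row count.
    total = df.count()
    return df.withColumn("total_rows", F.lit(total))


@pytest.fixture(scope="session")
def spark():
    # Local Spark session so the check runs in CI without needing a Databricks cluster.
    session = SparkSession.builder.master("local[2]").appName("ci-check").getOrCreate()
    yield session
    session.stop()


def test_add_row_count(spark):
    df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "label"])
    result = add_row_count(df)
    assert "total_rows" in result.columns
    assert result.select("total_rows").distinct().collect()[0][0] == 2

If tests like this pass in the workflow, the branch is safe to merge; that is the level of validation I mean by "lightweight".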