I need a quick, working proof-of-concept ETL pipeline built by someone who has real-world experience moving data end-to-end. The job is intentionally lean:

• Connect to at least one common source (a simple SQL table, NoSQL collection, or a lightweight API; whatever you can access easily for demonstration).
• Extract a small sample dataset, transform it with a couple of clear, documented steps, and load it into a destination of the same family or a flat file.
• Package your work in a git-friendly structure with a short README explaining prerequisites, how to run the flow locally, and how to swap in other sources later.

Tool choice is open. If you prefer Apache NiFi, Talend, or Apache Airflow, go for it; if another open-source option will get the job done faster, that's fine too. The key is to keep everything simple, reproducible, and well commented so I can extend it myself after delivery.

Please outline:

1. The tool you'll use.
2. The sample data source you plan to connect.
3. A rough timeline to hand over the working pipeline plus brief documentation.

I'm looking for clean, minimal code that clearly shows you know the ETL process inside out.
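
For scale, the sketch below shows roughly the size and shape of flow I have in mind: plain Python using only the standard library's sqlite3 and csv modules, pulling a sample from a SQLite table, applying two documented transforms, and writing a flat CSV. The database file, table name, and columns are hypothetical placeholders, not a prescription; swap in whatever source and tool you propose.

    """Minimal ETL sketch: SQLite table -> transform -> CSV flat file."""
    import csv
    import sqlite3

    # --- Extract: pull a small sample from a source table ---
    # 'source.db' and the 'users' table/columns are hypothetical placeholders.
    conn = sqlite3.connect("source.db")
    rows = conn.execute(
        "SELECT id, name, signup_date FROM users LIMIT 100"
    ).fetchall()
    conn.close()

    # --- Transform: two clear, documented steps ---
    # 1) normalize names to title case; 2) drop rows missing a signup date
    cleaned = [
        (row_id, name.strip().title(), signup_date)
        for row_id, name, signup_date in rows
        if signup_date is not None
    ]

    # --- Load: write the result to a flat file ---
    with open("output.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["id", "name", "signup_date"])
        writer.writerows(cleaned)

Something this compact, split into extract/transform/load modules with a README, is the deliverable; anything bigger is out of scope for the proof of concept.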