API User Data Scraper

Client: AI | Published: 26.11.2025
Budget: $750

A public REST endpoint supplies user-level data that must be ingested and stored in a local relational database. The job is to write a reliable, repeatable script or small service that pulls the full user dataset (including any paginated or nested records), normalises it, and writes it directly to either SQLite or MySQL (whichever you prefer, so long as the schema is well documented).

Key points to keep in mind
• Authentication: the API requires a standard bearer token passed in the request header.
• Volume control: some calls return thousands of records; the solution must respect rate limits and recover gracefully if the connection drops.
• Idempotency: running the scraper twice should never create duplicates.
• Extensibility: new fields may appear in the response; designing with SQLAlchemy-style models, pandas, or similar libraries is fine as long as adding columns later is straightforward.

Deliverables
1. Clean, well-commented code (Python, Node.js, Go, or another mainstream language) that fetches the user data.
2. A SQL file or migration script that creates the target tables in SQLite/MySQL.
3. A README with step-by-step setup instructions, including how to configure the API token and run the job on a schedule (cron, Windows Task Scheduler, etc.).
4. A one-time dump of the populated database so I can quickly verify the output.

Acceptance criteria
• Running the script from a fresh environment completes without manual tweaks.
• All user records present in the API appear in the database with accurate field mapping.
• No duplicate rows after two consecutive executions.

Please attach a detailed project proposal that outlines your chosen tech stack, timeline, and any prior work that demonstrates similar API extraction or database-centred automation.
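To illustrate the authentication, pagination, and retry requirements above, here is a minimal Python sketch. The endpoint URL, the `page` query parameter, and the token value are placeholders (the real API's pagination scheme may differ); the retry/backoff loop shows one simple way to "recover gracefully if the connection drops" while respecting rate limits.

```python
import json
import time
import urllib.error
import urllib.request

API_URL = "https://api.example.com/users"  # placeholder: the real endpoint URL
TOKEN = "YOUR_API_TOKEN"                   # placeholder: the bearer token

def fetch_page(page, retries=3, backoff=2.0):
    """Fetch one page of users, retrying with exponential backoff on failure."""
    req = urllib.request.Request(
        f"{API_URL}?page={page}",
        headers={"Authorization": f"Bearer {TOKEN}"},
    )
    for attempt in range(retries):
        try:
            with urllib.request.urlopen(req, timeout=30) as resp:
                return json.load(resp)
        except urllib.error.URLError:
            if attempt == retries - 1:
                raise  # give up after the final attempt
            time.sleep(backoff * 2 ** attempt)  # back off before retrying

def fetch_all_users(fetch=fetch_page):
    """Walk pages until an empty page signals the end of the dataset."""
    page = 1
    while True:
        batch = fetch(page)
        if not batch:
            break
        yield from batch
        page += 1
```

Injecting the page-fetching function into `fetch_all_users` keeps the pagination logic testable without network access, which also helps meet the "completes from a fresh environment" acceptance criterion.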
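The idempotency requirement (no duplicate rows after two consecutive runs) can be met with an upsert keyed on the API's user id. A minimal SQLite sketch, assuming hypothetical `name`/`email`/`updated_at` fields (the real field mapping would follow the actual API response):

```python
import sqlite3

SCHEMA = """
CREATE TABLE IF NOT EXISTS users (
    id         INTEGER PRIMARY KEY,  -- natural key from the API
    name       TEXT,
    email      TEXT,
    updated_at TEXT
);
"""

def upsert_users(conn, users):
    """Insert or update rows keyed on id; reruns never create duplicates."""
    conn.executescript(SCHEMA)
    conn.executemany(
        """
        INSERT INTO users (id, name, email, updated_at)
        VALUES (:id, :name, :email, :updated_at)
        ON CONFLICT(id) DO UPDATE SET
            name = excluded.name,
            email = excluded.email,
            updated_at = excluded.updated_at
        """,
        users,
    )
    conn.commit()
```

MySQL offers the equivalent `INSERT ... ON DUPLICATE KEY UPDATE`, so the same approach carries over to either target database.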
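For the scheduling part of the README deliverable, a crontab entry is one common option. The paths, script name, and environment variable below are placeholders for illustration only:

```shell
# Run the scraper nightly at 02:00; the token is supplied via the environment
# and output is appended to a log file for troubleshooting.
# (all paths and names here are placeholders)
0 2 * * * API_TOKEN=YOUR_API_TOKEN /usr/bin/python3 /opt/scraper/scrape_users.py >> /var/log/scraper.log 2>&1
```

On Windows the same job would be registered with Task Scheduler instead, as the posting notes.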