I need a straightforward internal tool that automatically gathers company-level data each month and makes it easy for our small team to work with the results. The scraper must pull three data types (emails, contact information, and general company details) from a set of websites we will specify together.

Once the crawl completes, the data should flow into a lightweight backend where I can:

• Visualise key metrics at a glance (simple charts or summary widgets are fine)
• Quickly search and filter records to find the right lead or company profile
• Export any filtered view to CSV or Excel for further analysis

A background job should handle the monthly refresh, overwriting or versioning records so we always see the latest snapshot without manual effort. I’m open to your preferred stack (Python with Scrapy, Node.js with Puppeteer, Playwright, or any comparable framework) as long as it is well documented and can run on an ordinary cloud instance.

Deliverables I’m expecting:

1. Scraper code with clear configuration for target sites and selectors (a rough sketch of what I have in mind follows at the end of this brief)
2. Backend application (web or CLI) that surfaces the visualisation, search/filter, and export features
3. Setup guide plus a brief hand-off call or screen-share showing deployment and scheduling

Clean, maintainable code and concise documentation matter more to me than fancy UI work. If this sounds like a fit, please outline your proposed approach and any relevant past projects when you reply.
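For illustration only, here is a minimal Python sketch of the kind of site/selector configuration and monthly scheduling I mean. Every name in it (SITE_CONFIG, run_monthly_refresh, the example URL and selectors) is a placeholder I made up for this brief, not a requirement, and APScheduler is just one way to schedule the job; plain cron on the host would be equally acceptable.

```python
# Illustrative sketch only: names and selectors below are placeholders.
# Shows (a) a config-driven list of target sites with CSS selectors for
# the three data types, and (b) a monthly refresh via APScheduler.

from apscheduler.schedulers.blocking import BlockingScheduler

# Hypothetical per-site configuration: one entry per target website,
# so adding a new site means adding a dict here, not writing new code.
SITE_CONFIG = [
    {
        "name": "example-directory",            # placeholder site name
        "start_url": "https://example.com/companies",
        "selectors": {
            "email": "a[href^='mailto:']",      # emails
            "phone": ".contact .phone",          # contact information
            "company_name": "h1.company-name",   # general company details
        },
    },
    # ...one dict per additional target site...
]

def run_monthly_refresh():
    """Stand-in for the real crawl: iterate over SITE_CONFIG, scrape each
    site, and upsert (or version) the records so the backend always shows
    the latest snapshot."""
    for site in SITE_CONFIG:
        print(f"Would crawl {site['start_url']} using {site['selectors']}")

if __name__ == "__main__":
    # Run at 03:00 on the 1st of every month; a host cron entry such as
    # `0 3 1 * *` pointing at this script would work just as well.
    scheduler = BlockingScheduler()
    scheduler.add_job(run_monthly_refresh, "cron", day=1, hour=3)
    scheduler.start()
```

The point of the sketch is the shape, not the specifics: target sites and selectors live in configuration so my team can extend the crawl without touching scraper internals.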