I have a ready-made list of product names. For each one I want an automated script that:

• Launches a Google search (Google only, no other engines),
• Captures the first eight organic results and saves their URLs,
• Visits each of those pages and extracts price, brand, currency, product description and stated delivery time,
• Outputs the compiled results to a single CSV file and a single JSON file.

The job is not a one-off; the process must run on a schedule. Daily or weekly is fine, as long as the interval can be changed easily from a config file or cron entry. Robust error handling matters: I expect captchas, rate limits and occasional missing fields. Feel free to rely on Python with Selenium, Scrapy or BeautifulSoup, headless Chrome, rotating proxies, or any other stack you trust.

Acceptance criteria

1. A command or cron entry that triggers the scrape without manual steps.
2. A CSV delivered after each run with columns: SearchTerm, ResultRank(1-8), URL, Price, Currency, Brand, Description, DeliveryTime, Timestamp.
3. A clear README explaining setup, environment variables, and how to adjust the schedule.
4. Source code documented well enough for me to tweak selectors when sites change.

Once everything is verified against a sample list of 20 products, we will scale up to the full list. Accuracy is critical and will be evaluated.
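The "first eight organic results" step could be sketched as a small parsing helper using BeautifulSoup (one of the libraries named above). The CSS selector is an assumption — Google's markup changes often — so it is kept in a single constant that can be tweaked without touching the rest of the code, as acceptance criterion 4 asks.

```python
# Sketch: pull the first N organic result URLs out of a fetched Google
# results page. RESULT_SELECTOR is an assumed selector and WILL need
# updating when Google changes its HTML; that is why it lives in one place.
from bs4 import BeautifulSoup

RESULT_SELECTOR = "div.g a[href^='http']"  # assumption: organic results sit in div.g blocks

def organic_urls(html: str, limit: int = 8) -> list[str]:
    """Return up to `limit` unique organic result URLs, in page order."""
    soup = BeautifulSoup(html, "html.parser")
    urls: list[str] = []
    for a in soup.select(RESULT_SELECTOR):
        href = a["href"]
        if href not in urls:          # a result block can contain several anchors
            urls.append(href)
        if len(urls) == limit:
            break
    return urls
```

The function takes raw HTML rather than fetching it, so the same code works whether the page comes from Selenium, headless Chrome, or a saved fixture used in tests.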
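Since missing fields are expected, the per-page extractors should return None rather than raise. A minimal sketch for the price/currency fields, assuming a simple symbol-to-code mapping (the regex and the mapping are illustrative, not a definitive parser):

```python
# Sketch: tolerant price extraction. Any field may be absent, so the
# helper returns (None, None) instead of raising when nothing matches.
import re

CURRENCY_SYMBOLS = {"$": "USD", "€": "EUR", "£": "GBP"}  # assumption: extend per target sites

PRICE_RE = re.compile(r"([$€£])\s?(\d{1,3}(?:[.,]\d{3})*(?:[.,]\d{2})?)")

def extract_price(text: str):
    """Return (price, currency_code) or (None, None) if no price is found."""
    m = PRICE_RE.search(text)
    if not m:
        return None, None
    symbol, raw = m.groups()
    price = raw.replace(",", "")  # naive normalisation; EU decimal commas need more care
    return price, CURRENCY_SYMBOLS.get(symbol)
```

Brand, description and delivery time would follow the same pattern: one small function per field, each failing soft so one broken selector never loses the whole row.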
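The output step in acceptance criterion 2 maps directly onto the standard library's csv and json modules. A sketch with the agreed column order (the function name and file stem are illustrative):

```python
# Sketch: write one run's rows to <stem>.csv and <stem>.json.
# COLUMNS mirrors the acceptance criteria; missing keys become empty cells.
import csv
import json
from pathlib import Path

COLUMNS = ["SearchTerm", "ResultRank", "URL", "Price", "Currency",
           "Brand", "Description", "DeliveryTime", "Timestamp"]

def write_outputs(rows: list[dict], stem: str = "results") -> None:
    with open(f"{stem}.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS, extrasaction="ignore")
        writer.writeheader()
        writer.writerows(rows)  # absent keys are written as "" by default
    Path(f"{stem}.json").write_text(json.dumps(rows, indent=2, ensure_ascii=False),
                                    encoding="utf-8")
```

Writing both formats from the same list of dicts keeps the CSV and JSON guaranteed-consistent after every run.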
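For the schedule requirement, illustrative crontab entries (all paths and the --config flag are placeholders for whatever the delivered script actually uses); switching between daily and weekly is a one-line change:

```shell
# Daily at 03:00 (server time). Log output is appended for post-run review.
0 3 * * * /usr/bin/python3 /opt/scraper/run.py --config /opt/scraper/config.yml >> /var/log/scraper.log 2>&1

# Weekly alternative: Mondays at 03:00. Comment one entry in, the other out.
# 0 3 * * 1 /usr/bin/python3 /opt/scraper/run.py --config /opt/scraper/config.yml >> /var/log/scraper.log 2>&1
```

The same interval could equally live in the config file with a long-running scheduler process; cron is simply the lowest-friction option named in the brief.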