I need a small, reliable program that can scan Google Search results, spot which ads are running, and export the findings to a CSV. The workflow is simple: I type a niche and a city, hit run, and the tool crawls Google for professional-service businesses in major cities only.

For every advertiser it detects, the CSV must list the business name, country, city, street address, email, phone, primary contact, and an estimated PPC budget. On top of that, the scrape should extract key campaign insights:
• Budget
• Keywords used
• Ad placements

A straightforward command-line script in Python is fine. Selenium, Scrapy, BeautifulSoup, SerpAPI, or the Google Ads API can all be used, so long as the solution stays within Google's terms and reliably handles CAPTCHAs, rotating proxies, and rate limits. Batch processing (multiple niches or cities fed from a text file) and clear logging are a plus.

Deliverables
• Fully commented source code
• Setup/usage guide that works on Windows and macOS
• Sample CSV confirming the required columns and populated fields

Once I can run the script locally and reproduce your sample output, the project is complete.

----------------------------------------------------------------------------
Timeline: 5 days

Additional notes: The program should take the niche and city as inputs and crawl Google for matching businesses. Results without emails should be included by default, with an option to drop them if needed. The CSV output should also include the website for each business.
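To make the expected interface concrete, here is a minimal sketch of the batch-input and CSV-export pieces of the brief. It is not a working scraper: `fetch_advertisers` is a hypothetical placeholder where the actual Selenium/SerpAPI/Google Ads API lookup would go, and the column names are one plausible reading of the required fields (including the website column from the notes above).

```python
import csv
from pathlib import Path

# Column order taken from the brief; "website" added per the additional notes.
CSV_COLUMNS = [
    "business_name", "country", "city", "street_address", "email",
    "phone", "primary_contact", "estimated_ppc_budget",
    "keywords_used", "ad_placements", "website",
]

def parse_batch_file(path):
    """Read 'niche,city' pairs, one per line; blanks and # comments are skipped."""
    pairs = []
    for raw in Path(path).read_text(encoding="utf-8").splitlines():
        line = raw.strip()
        if not line or line.startswith("#"):
            continue
        niche, _, city = line.partition(",")
        pairs.append((niche.strip(), city.strip()))
    return pairs

def fetch_advertisers(niche, city):
    """Hypothetical stub: replace with the real search/ads lookup.

    Must return a list of dicts keyed by CSV_COLUMNS entries.
    """
    return []

def write_results(rows, out_path, keep_missing_email=True):
    """Write rows to CSV; optionally drop rows with no email, per the notes."""
    if not keep_missing_email:
        rows = [r for r in rows if r.get("email")]
    with open(out_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=CSV_COLUMNS, extrasaction="ignore")
        writer.writeheader()
        writer.writerows(rows)
    return len(rows)
```

A batch run would then loop `parse_batch_file("niches.txt")`, call `fetch_advertisers` per pair, and pass the combined rows to `write_results`. The `keep_missing_email` flag reflects the note that email-less results stay in by default but can be deleted on request.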