Weekly Caselaw Scraper & NotebookLM Sync

Client: AI | Published: 18.11.2025

I want an end-to-end Python workflow that, every Sunday, goes to the Advanced Search page on https://www.caselaw.nsw.gov.au, runs a query combining Keywords or phrases, Date range, and Case type, then saves every returned result URL to a plain .txt file. For the keywords I am targeting judgment types, and the date range should always be "Past week".

Once the list is generated, the same script (or a second, chained script) must log in to NotebookLM, upload the file into a specific notebook, and trigger the built-in summary tools so I end up with concise summary documents ready for review.

I am open to the scheduling method (cron on a small VPS, GitHub Actions, or another approach you can recommend), so long as it fires reliably each Sunday.

What I want to receive

• Clean, well-commented Python code that handles the scrape, file creation, NotebookLM upload, and summary trigger
• Setup notes so I can supply credentials, change the search criteria, or move the schedule if needed
• A quick screen-share or recorded demo showing the job running successfully on your side
• Links or short write-ups of comparable automations you have built where data was scraped and then pushed to another platform or database

Acceptance criteria

1. Running the job on a fresh machine with the supplied instructions produces a .txt file listing only live case URLs from the past week.
2. The file appears in my chosen NotebookLM notebook within five minutes of the job finishing.
3. NotebookLM automatically generates a summary document that references those URLs.

If you have examples of similar web-to-web or web-to-database integrations, please include them in your bid.

To make the scope concrete, the three rough sketches below show what I picture for the scrape step, the NotebookLM upload, and the weekly glue script. Treat them as illustrations of the requirements, not as working code.
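Sketch 1: the scrape step, using requests and BeautifulSoup. The query parameter names (q, caseType, startDate, endDate), the date format, the /decision/ link pattern, and the output filename are all my guesses; the real values need to be read off the live Advanced Search form, and pagination is not handled here.

    # weekly_scrape.py -- untested sketch of the scrape step.
    # The query parameter names below are placeholders; confirm them against
    # the Advanced Search form at https://www.caselaw.nsw.gov.au before use.
    from datetime import date, timedelta

    import requests
    from bs4 import BeautifulSoup

    SEARCH_URL = "https://www.caselaw.nsw.gov.au/search/advanced"  # assumed endpoint

    def past_week() -> tuple[str, str]:
        """Return (start, end) covering the last seven days, as dd/mm/yyyy strings
        (the date format the form expects is an assumption)."""
        end = date.today()
        start = end - timedelta(days=7)
        return start.strftime("%d/%m/%Y"), end.strftime("%d/%m/%Y")

    def scrape_case_urls(keywords: str, case_type: str) -> list[str]:
        start, end = past_week()
        params = {
            "q": keywords,          # placeholder parameter name
            "caseType": case_type,  # placeholder parameter name
            "startDate": start,     # placeholder parameter name
            "endDate": end,         # placeholder parameter name
        }
        resp = requests.get(SEARCH_URL, params=params, timeout=30)
        resp.raise_for_status()
        soup = BeautifulSoup(resp.text, "html.parser")
        # Decision pages appear to live under /decision/<id>; verify on the site.
        urls: list[str] = []
        for a in soup.select("a[href*='/decision/']"):
            href = a["href"]
            url = "https://www.caselaw.nsw.gov.au" + href if href.startswith("/") else href
            if url not in urls:
                urls.append(url)
        return urls

    def write_url_file(urls: list[str], path: str = "caselaw_past_week.txt") -> None:
        with open(path, "w", encoding="utf-8") as f:
            f.write("\n".join(urls) + "\n")

    if __name__ == "__main__":
        # Example values only; the delivered script should take these from config.
        write_url_file(scrape_case_urls("judgment", "Judgment"))

Note that this only reads the first results page; the delivered version should follow pagination and should verify each URL is live, per acceptance criterion 1.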
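Sketch 2: the NotebookLM step. As far as I can tell NotebookLM has no public upload API, so I assume browser automation (Playwright or similar). Everything NotebookLM-specific below is a placeholder: the notebook URL, the "Add source" label, the file-input selector, and the auth_state.json storage-state file (which I assume would be captured once from a manual Google login).

    # notebooklm_upload.py -- illustrative sketch of the upload step using
    # Playwright, since NotebookLM exposes no public API that I know of.
    # Every selector and UI label here is a guess to be replaced with the
    # real ones from the NotebookLM page.
    from playwright.sync_api import sync_playwright

    NOTEBOOK_URL = "https://notebooklm.google.com/notebook/MY-NOTEBOOK-ID"  # placeholder

    def upload_to_notebooklm(txt_path: str) -> None:
        with sync_playwright() as p:
            browser = p.chromium.launch(headless=True)
            # auth_state.json: Playwright storage state saved after a one-off
            # manual login, so the script never handles my Google password.
            context = browser.new_context(storage_state="auth_state.json")
            page = context.new_page()
            page.goto(NOTEBOOK_URL)
            page.get_by_text("Add source").click()        # guessed UI label
            page.set_input_files("input[type=file]", txt_path)
            page.wait_for_timeout(10_000)  # crude wait for the upload to settle
            # Triggering the built-in summary tool would be another click here;
            # the exact control is something the bidder should identify.
            browser.close()

    if __name__ == "__main__":
        upload_to_notebooklm("caselaw_past_week.txt")

If you know a cleaner route than UI automation for getting the file into the notebook and the summary generated, I am happy to hear it in your bid.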
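Sketch 3: the glue script I imagine cron calling each Sunday. The crontab line in the comment assumes the script lives at /opt/caselaw/weekly_job.py; the imported module names match the two sketches above, and the keyword and case-type values are examples only.

    # weekly_job.py -- chains the two steps; suitable as a cron target.
    # Example crontab entry firing at 06:00 every Sunday (day-of-week 0):
    #   0 6 * * 0 /usr/bin/python3 /opt/caselaw/weekly_job.py >> /var/log/caselaw.log 2>&1
    from weekly_scrape import scrape_case_urls, write_url_file
    from notebooklm_upload import upload_to_notebooklm

    def main() -> None:
        urls = scrape_case_urls(keywords="judgment", case_type="Judgment")
        if not urls:
            print("No decisions returned for the past week; skipping upload.")
            return
        out = "caselaw_past_week.txt"
        write_url_file(urls, out)
        upload_to_notebooklm(out)
        print(f"Uploaded {len(urls)} URLs to NotebookLM.")

    if __name__ == "__main__":
        main()

A GitHub Actions schedule would work equally well if you prefer it to a VPS; the setup notes should just make the chosen trigger easy to move.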