I need a robust Python function that can log in to a password-protected site, navigate to a given page, locate the primary table, and convert it into a clean pandas DataFrame before writing the result to CSV. The same function must work on every URL I provide, and ideally on any future page built on the same template, so please keep the approach modular and scalable. Because the pages sit behind authentication, the username and password can be hard-coded directly in the script; no interactive prompts or external files are necessary this time. Anti-blocking tactics (session persistence, realistic headers, controlled request pacing, etc.) are mandatory: I want to be able to run the notebook repeatedly without getting shut out.

Deliverables
• A Jupyter notebook (.ipynb) containing the login routine, table-parsing function, and an example call that exports a CSV
• Any helper modules, plus a requirements.txt listing dependencies so I can recreate the environment quickly

When you bid, share a concise example of past scraping work that proves you can handle authentication-gated content and large tables. That’s the only application material I need.
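To make the expected shape concrete, here is a minimal sketch of the kind of structure I have in mind, using requests and pandas. Everything specific in it is a placeholder: the login URL, credentials, form-field names, and the assumption that the login is a simple form POST and that the page's primary table is its first HTML table. Treat it as an outline under those assumptions, not a finished implementation.

```python
import io
import time

import pandas as pd
import requests

# Placeholders: swap in the real site details.
LOGIN_URL = "https://example.com/login"
USERNAME = "your_username"  # hard-coded per the brief above
PASSWORD = "your_password"

# Realistic browser-like headers, reused across all requests.
HEADERS = {
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "Accept-Language": "en-US,en;q=0.9",
}


def make_session() -> requests.Session:
    """Log in once and return a persistent session that keeps cookies."""
    session = requests.Session()
    session.headers.update(HEADERS)
    # Assumes a plain form POST with "username"/"password" fields;
    # adjust to the site's actual login mechanism.
    resp = session.post(
        LOGIN_URL, data={"username": USERNAME, "password": PASSWORD}
    )
    resp.raise_for_status()
    return session


def parse_primary_table(html: str) -> pd.DataFrame:
    """Extract the first table on the page into a DataFrame."""
    # read_html returns one DataFrame per <table>; we take the first,
    # assuming the template puts the primary table first.
    return pd.read_html(io.StringIO(html))[0]


def scrape_to_csv(
    session: requests.Session, url: str, csv_path: str, delay: float = 2.0
) -> pd.DataFrame:
    """Fetch one page, parse its primary table, and write it to CSV."""
    time.sleep(delay)  # controlled pacing between requests
    resp = session.get(url)
    resp.raise_for_status()
    df = parse_primary_table(resp.text)
    df.to_csv(csv_path, index=False)
    return df
```

The three helpers map onto the deliverables: `make_session` is the login routine, `parse_primary_table` is the table-parsing function, and `scrape_to_csv` is the example call, reusable for any URL on the same template (e.g. `scrape_to_csv(make_session(), some_url, "out.csv")`).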