I need a clean, well-documented Python solution that automatically pulls macro- and micro-level market indicators from several public websites and presents them in a clearly structured format. Think of economic releases (GDP, CPI, PMI) alongside company-specific metrics I will specify, all consolidated into one up-to-date data set.

Core workflow
• Collect the indicators from multiple sites — some offer APIs, others will require HTML parsing or CSV downloads.
• Normalise naming, units and timestamps so every row lines up chronologically.
• Store the results in a tidy DataFrame and export to both CSV and JSON.
• Produce a quick summary view (CLI table or simple Flask/Dash page is fine) so I can eyeball the latest figures at a glance.

Technical preferences
Python 3.x, requests/BeautifulSoup or Scrapy for scraping, pandas for wrangling, and either matplotlib or Plotly if you add a mini visual. Feel free to leverage official APIs where available; just keep the code modular so sources can be swapped in or out.

Acceptance criteria
1. Running `python main.py` fetches the data without manual intervention.
2. Output files are saved with today's date in an `output/` folder.
3. All sources and parsing logic are clearly commented, and a README explains how to add new indicators or sites.
4. Any dependency is listed in `requirements.txt` and installs smoothly with `pip install -r requirements.txt`.

Hand over the finished scripts, README, and a short demo capture or screenshots showing the export and display.
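To make the "modular so sources can be swapped in or out" requirement concrete, here is one possible shape for the source abstraction. This is only a sketch; the names `IndicatorSource` and `StaticCSVSource` are illustrative assumptions, not part of the brief, and a real source would use requests/BeautifulSoup or an official API instead of an in-memory CSV string.

```python
from abc import ABC, abstractmethod
from io import StringIO

import pandas as pd


class IndicatorSource(ABC):
    """Base class every data source implements, so sources can be swapped in or out."""

    @abstractmethod
    def fetch(self) -> pd.DataFrame:
        """Return a DataFrame with columns: date, indicator, value, unit, source."""


class StaticCSVSource(IndicatorSource):
    """Illustrative source that parses an already-downloaded CSV string (no network)."""

    def __init__(self, name: str, csv_text: str):
        self.name = name
        self.csv_text = csv_text

    def fetch(self) -> pd.DataFrame:
        df = pd.read_csv(StringIO(self.csv_text))
        df["source"] = self.name  # tag each row with where it came from
        return df
```

Under this design, `main.py` would hold a list of `IndicatorSource` instances and loop over their `fetch()` results; adding a new site means adding one subclass, which also gives the README a natural "how to add new indicators" recipe.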
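The normalise-then-export steps of the workflow, including the date-stamped files in `output/` from the acceptance criteria, could be sketched roughly as follows. Column names (`date`, `indicator`, `value`) and the filename pattern are assumptions chosen for illustration.

```python
from datetime import date
from pathlib import Path

import pandas as pd


def normalise(frames: list[pd.DataFrame]) -> pd.DataFrame:
    """Concatenate per-source frames, coerce timestamps, and sort chronologically."""
    df = pd.concat(frames, ignore_index=True)
    df["date"] = pd.to_datetime(df["date"])
    return df.sort_values("date").reset_index(drop=True)


def export(df: pd.DataFrame, out_dir: str = "output") -> tuple[Path, Path]:
    """Write CSV and JSON files named with today's date; return both paths."""
    stamp = date.today().isoformat()
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    csv_path = out / f"indicators_{stamp}.csv"
    json_path = out / f"indicators_{stamp}.json"
    df.to_csv(csv_path, index=False)
    df.to_json(json_path, orient="records", date_format="iso")
    return csv_path, json_path
```

Unit normalisation (e.g. mapping "percent" vs "%") would slot into `normalise` once the actual sources are known; the sketch only covers the timestamp and ordering part.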
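For the quick summary view, the simplest CLI option is a plain fixed-width table of the most recent value per indicator. A minimal sketch, assuming the same `date`/`indicator`/`value` columns as above (`latest_summary` and `render_table` are hypothetical helper names):

```python
import pandas as pd


def latest_summary(df: pd.DataFrame) -> pd.DataFrame:
    """Return the most recent row for each indicator, for a quick eyeball check."""
    return df.sort_values("date").groupby("indicator", as_index=False).tail(1)


def render_table(df: pd.DataFrame) -> str:
    """Format the summary as a fixed-width text table for the terminal."""
    lines = [f"{'indicator':<12}{'date':<12}{'value':>10}"]
    for _, row in df.iterrows():
        lines.append(
            f"{row['indicator']:<12}{row['date'].date().isoformat():<12}{row['value']:>10}"
        )
    return "\n".join(lines)
```

A Flask/Dash page would reuse `latest_summary` unchanged and only swap the rendering layer, which keeps the display choice orthogonal to the data pipeline.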