I need a small, reliable system that polls the Upstox API once a minute for roughly 500 equities and captures only one metric: Market Cap. Each pull should be written straight into PostgreSQL with proper time-stamping so the data set is tidy and easy to query later. Once the trading day closes, the service must roll up everything it has gathered into a concise daily summary report. I am happy for this summary to be generated as a CSV, a simple HTML page, or even a lightweight dashboard, whichever is fastest for you to implement and easiest for me to consume.

You may pick the language and framework you're most productive in (Python + Pandas, Node.js + Knex, Go + pgx, etc.), but the core requirements stay the same:

• Poll Upstox every 60 seconds for the Market Cap of 500 symbols
• Persist each minute's payload in PostgreSQL with minimal latency
• Handle network hiccups, API rate limits, and retries gracefully
• Auto-generate a daily summary report from the stored data
• Provide clean, well-commented code plus a README explaining setup and cron/scheduler configuration

Acceptance criteria:

• A single command spins up the database schema and scheduler
• At least eight trading hours' worth of data loads without a missed or duplicate row
• The daily report matches manual spot checks of the raw figures
• Code passes a quick review for clarity, error handling, and PostgreSQL best practices

If you have previous experience with market data feeds or the Upstox SDK that's a bonus, but solid ETL skills and a knack for building resilient jobs are what really matter here. I would like to see the first working pull-and-store cycle within a few days so we can fine-tune before tackling the reporting layer.
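To make the "no miss or duplicate row" and retry requirements concrete, here is a minimal Python sketch of the core I have in mind. It is illustrative only: the `market_caps` table, column names, and helper functions are assumptions (not Upstox SDK calls), and the actual API fetch and database connection are left out. The idea is that a composite primary key on (symbol, minute) plus `ON CONFLICT DO NOTHING` makes a retried pull idempotent.

```python
import time
from datetime import datetime, timezone

# Hypothetical schema (names are illustrative, not prescribed):
#   CREATE TABLE market_caps (
#       symbol      text        NOT NULL,
#       captured_at timestamptz NOT NULL,
#       market_cap  numeric     NOT NULL,
#       PRIMARY KEY (symbol, captured_at)
#   );
# The composite primary key plus ON CONFLICT DO NOTHING below is what
# guarantees no duplicate rows even when a minute's pull is retried.
UPSERT_SQL = """
    INSERT INTO market_caps (symbol, captured_at, market_cap)
    VALUES (%s, %s, %s)
    ON CONFLICT (symbol, captured_at) DO NOTHING
"""

def minute_bucket(ts: datetime) -> datetime:
    """Floor a timestamp to the minute so retried pulls share one dedup key."""
    return ts.replace(second=0, microsecond=0)

def with_retries(fn, attempts=4, base_delay=1.0, sleep=time.sleep):
    """Call fn(); on failure, back off exponentially (1s, 2s, 4s, ...)."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # give up after the final attempt
            sleep(base_delay * (2 ** attempt))

def build_rows(caps: dict, now: datetime) -> list:
    """Turn a {symbol: market_cap} payload into parameter tuples for UPSERT_SQL."""
    bucket = minute_bucket(now)
    return [(symbol, bucket, cap) for symbol, cap in caps.items()]
```

In use, the scheduler (cron or a simple loop) would wrap the Upstox fetch for all 500 symbols in `with_retries` once a minute, then execute `UPSERT_SQL` over the rows from `build_rows`; because the dedup key is minute-floored, a retry that lands a few seconds late still maps to the same row.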