Advanced Crypto Analytics Platform

Customer: AI | Published: 27.02.2026
Budget: $10,000

We require a robust, from-scratch backend system that continuously ingests live exchange data streams, including price ticks, trade volumes, and full order-book depth, from multiple major cryptocurrency exchanges. The system must normalize heterogeneous exchange formats into a unified schema and store them in a scalable, time-series-optimized architecture that can grow seamlessly with increasing traffic and trading-pair coverage. On top of this storage layer, we require a clean, well-structured internal API that powers real-time dashboards and analytics without exposing raw databases to the front-end layer.

Architecture Expectations

The backend must include:
• Low-latency WebSocket-based ingestion pipelines
• Lossless, fault-tolerant streaming architecture
• Schema normalization across exchanges
• Time-series-optimized storage with rollups and retention policies
• Horizontal scalability by design

Analytics Requirements

Analytics are central to the build. The system must support:
• Trend analysis (technical indicators, momentum, moving averages)
• Liquidity and order-flow analytics (spread, imbalance, depth metrics)
• Volatility and risk calculations (realized volatility, drawdowns, regime detection)
• Sentiment overlays derived from derivatives data (funding rates, open interest)
• Predictive modelling modules (probabilistic signal generation, not guaranteed forecasting)

Analytics must compute in near-real time and be accessible exclusively through the internal API. The frontend must never query raw databases directly.

Technology Stack

We are open to the stack, including but not limited to:
• Python, Go, Node.js, Rust
• Kafka or an equivalent streaming broker
• WebSockets
• Redis
• TimescaleDB, InfluxDB, or ClickHouse
• REST or gRPC
• Docker and Kubernetes

You may recommend the stack you can confidently deliver and support.
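To make the schema-normalization requirement concrete, here is a minimal Python sketch of per-exchange adapters mapping two heterogeneous trade-message shapes into one unified record. The payload field names ("s"/"p"/"q"/"T", "product_id"/"price"/"size"/"time_ms") and the exchange names are hypothetical illustrations, not tied to any real exchange API:

```python
from dataclasses import dataclass


@dataclass
class Tick:
    """Unified trade-tick schema shared by all exchange adapters."""
    exchange: str
    symbol: str
    price: float
    qty: float
    ts_ms: int  # event time, Unix milliseconds


def normalize_alpha(msg: dict) -> Tick:
    # Hypothetical "exchange Alpha" payload:
    # {"s": "BTCUSDT", "p": "67000.5", "q": "0.01", "T": 1700000000000}
    return Tick("alpha", msg["s"], float(msg["p"]), float(msg["q"]), msg["T"])


def normalize_beta(msg: dict) -> Tick:
    # Hypothetical "exchange Beta" payload:
    # {"product_id": "BTC-USD", "price": "67001.0", "size": "0.02", "time_ms": 1700000000123}
    return Tick("beta", msg["product_id"].replace("-", ""),
                float(msg["price"]), float(msg["size"]), msg["time_ms"])
```

Each WebSocket ingestion pipeline would apply its exchange's adapter before publishing to the streaming broker, so everything downstream (storage, analytics, API) sees only the unified `Tick` shape.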
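As a flavor of the volatility, drawdown, and order-flow metrics requested above, a stdlib-only Python sketch (the minute-bar annualization factor and the (price, qty) book representation are assumptions for illustration):

```python
import math
from statistics import pstdev


def realized_volatility(prices, periods_per_year=365 * 24 * 60):
    """Annualized realized volatility from log returns (minute bars assumed)."""
    rets = [math.log(b / a) for a, b in zip(prices, prices[1:])]
    return pstdev(rets) * math.sqrt(periods_per_year)


def max_drawdown(prices):
    """Largest peak-to-trough decline, as a fraction of the running peak."""
    peak, worst = prices[0], 0.0
    for p in prices:
        peak = max(peak, p)
        worst = max(worst, (peak - p) / peak)
    return worst


def spread_and_imbalance(bids, asks):
    """bids/asks: lists of (price, qty).

    Returns (absolute bid-ask spread, bid-side share of resting size).
    """
    best_bid = max(p for p, _ in bids)
    best_ask = min(p for p, _ in asks)
    bid_qty = sum(q for _, q in bids)
    ask_qty = sum(q for _, q in asks)
    return best_ask - best_bid, bid_qty / (bid_qty + ask_qty)
```

In the target system these would run incrementally over the rolled-up time series rather than over full price lists, and be served only through the internal API.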
Primary Priorities

• Low-latency ingestion
• Fault tolerance
• Horizontal scalability
• Clean, modular code structure
• Maintainability and clear documentation

Deliverables

• Streaming ingestion engine connected to at least three major exchanges
• Normalized historical time-series database with retention and roll-up policies
• Analytics modules implementing the metrics described above
• Authenticated internal API with documented endpoints
• Containerized deployment (Docker) and basic orchestration scripts
• Monitoring hooks and logging setup
• Brief operational run-book for DevOps handover

Acceptance Criteria

• Tick-level parity between exchange data and stored records over a continuous 24-hour validation run.
• Internal API endpoints returning computed analytics in under 200 ms for queries covering the most recent 60 minutes of data.
• Clean, well-documented code pushed to our private repository.
• Architecture walkthrough session covering system design, CI/CD, scaling strategy, and monitoring.

Closing

This is an end-to-end systems engineering challenge. We are looking for a developer or team who can take ownership of architecture, implementation, and handover. If you're confident building high-performance streaming data systems at scale, let's discuss timelines, milestones, and execution strategy.