
Building a Real-Time Analytics Dashboard

Real-time dashboards are a common requirement for operations teams, trading floors, IoT monitoring, and SaaS metrics. The challenge isn't just displaying data — it's building a system that handles high-throughput data ingestion, efficient aggregation, and low-latency delivery to potentially thousands of concurrent dashboard viewers. This case study walks through the architecture of a production dashboard serving live analytics.

System Architecture Overview

The architecture follows a pipeline model: data producers emit events to a message broker (Kafka or Redis Streams), a stream processing layer aggregates raw events into time-windowed metrics, aggregated results are written to a time-series database (TimescaleDB or ClickHouse), and a WebSocket gateway pushes updates to connected dashboard clients. Each layer is independently scalable and can be replaced without affecting the others.

  • Event ingestion: Kafka topics partitioned by metric type
  • Stream processing: Flink or custom consumers computing 1s/5s/1m aggregations
  • Storage: TimescaleDB with continuous aggregates for historical queries
  • Delivery: WebSocket server with room-based subscriptions per dashboard (sketched below)
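
To make the delivery layer concrete, here is a minimal sketch of a room-based gateway built on the ws package for Node.js. The file name, port, and the publish() hook are illustrative assumptions, not the production code; the subscribe message shape matches what the client class shown later sends.

server/gateway.ts
import { WebSocketServer, WebSocket } from "ws"

// Room registry: metric channel -> sockets subscribed to it.
const rooms = new Map<string, Set<WebSocket>>()

const wss = new WebSocketServer({ port: 8080 })

wss.on("connection", (socket) => {
  socket.on("message", (data) => {
    const msg = JSON.parse(data.toString())
    if (msg.action === "subscribe") {
      const room = rooms.get(msg.metric) ?? new Set<WebSocket>()
      room.add(socket)
      rooms.set(msg.metric, room)
    }
  })

  // Drop the socket from every room when the client disconnects.
  socket.on("close", () => {
    for (const room of rooms.values()) room.delete(socket)
  })
})

// Assumed hook for the stream-processing consumer: fan out a fresh
// aggregate to every dashboard subscribed to that metric.
export function publish(metric: string, value: number, timestamp: number) {
  const payload = JSON.stringify({ metric, value, timestamp })
  for (const socket of rooms.get(metric) ?? []) {
    if (socket.readyState === WebSocket.OPEN) socket.send(payload)
  }
}

Because each metric maps to its own room, the cost of a publish is proportional to the number of interested viewers, not the total connection count.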

WebSocket Connection Design

Dashboard clients establish a WebSocket connection and subscribe to specific metric channels. The server pushes updates only for subscribed metrics, avoiding unnecessary data transfer. Connection management includes heartbeat pings every 30 seconds, automatic reconnection with exponential backoff on the client, and graceful degradation to HTTP polling if WebSocket connections are blocked by corporate proxies.

client/dashboard-socket.ts
class DashboardSocket {
  private ws: WebSocket | null = null
  private subscriptions = new Set<string>()
  private reconnectDelay = 1000
  private heartbeat: ReturnType<typeof setInterval> | null = null

  // Widgets assign their update handler here; called once per pushed metric.
  onMetricUpdate: (metric: string, value: number, timestamp: number) => void = () => {}

  connect(url: string) {
    this.ws = new WebSocket(url)

    this.ws.onmessage = (event) => {
      const { metric, value, timestamp } = JSON.parse(event.data)
      this.onMetricUpdate(metric, value, timestamp)
    }

    this.ws.onclose = () => {
      // Stop the heartbeat, then retry with exponential backoff capped at 30s.
      if (this.heartbeat) clearInterval(this.heartbeat)
      setTimeout(() => this.connect(url), this.reconnectDelay)
      this.reconnectDelay = Math.min(this.reconnectDelay * 2, 30000)
    }

    this.ws.onopen = () => {
      // Reset the backoff and replay subscriptions after a reconnect.
      this.reconnectDelay = 1000
      this.subscriptions.forEach((metric) => this.send({ action: "subscribe", metric }))
      // Application-level heartbeat every 30s keeps idle connections alive
      // through proxies and load balancers.
      this.heartbeat = setInterval(() => this.send({ action: "ping" }), 30000)
    }
  }

  subscribe(metric: string) {
    // Track subscriptions locally so they survive reconnects.
    this.subscriptions.add(metric)
    this.send({ action: "subscribe", metric })
  }

  private send(payload: object) {
    // send() throws on a CONNECTING socket, so guard on readyState.
    if (this.ws?.readyState === WebSocket.OPEN) {
      this.ws.send(JSON.stringify(payload))
    }
  }
}

Efficient Data Aggregation

Raw event streams can produce millions of data points per minute. Displaying every point is neither performant nor useful. The aggregation layer computes time-windowed summaries: counts, averages, percentiles, and rates over 1-second, 5-second, and 1-minute windows. Pre-computed aggregations dramatically reduce the data volume sent to clients and the query load on the database.

tip

Use a two-tier aggregation strategy: compute fine-grained (1s) aggregations in the stream processor for live display, and store coarser (1m, 1h) aggregations in the database for historical queries and trend analysis.
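
To make the windowing concrete, here is a minimal sketch of a tumbling-window aggregator of the kind a custom consumer might run. The class name, the emit callback, and the flushBefore timing policy are illustrative assumptions, not the exact production code.

class WindowAggregator {
  // metric -> window start (ms) -> running totals for that window
  private windows = new Map<string, Map<number, { sum: number; count: number }>>()

  constructor(
    private windowMs: number,
    private emit: (metric: string, windowStart: number, count: number, avg: number) => void,
  ) {}

  add(metric: string, value: number, timestamp: number) {
    // Assign the event to its tumbling window by truncating the timestamp.
    const start = Math.floor(timestamp / this.windowMs) * this.windowMs
    const perMetric = this.windows.get(metric) ?? new Map<number, { sum: number; count: number }>()
    const w = perMetric.get(start) ?? { sum: 0, count: 0 }
    w.sum += value
    w.count += 1
    perMetric.set(start, w)
    this.windows.set(metric, perMetric)
  }

  // Run on a timer with `cutoff` slightly behind wall-clock time so
  // late-arriving events still land in their window before it closes.
  flushBefore(cutoff: number) {
    for (const [metric, perMetric] of this.windows) {
      for (const [start, w] of perMetric) {
        if (start + this.windowMs <= cutoff) {
          this.emit(metric, start, w.count, w.sum / w.count)
          perMetric.delete(start)
        }
      }
    }
  }
}

A 1-second instance can feed the WebSocket gateway directly, while coarser instances (or the database's continuous aggregates) serve historical queries, matching the two-tier strategy above. Note that percentiles cannot be derived from a running sum; a production version would keep a histogram or digest per window.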

Frontend Rendering Strategies

For smooth 60fps chart updates, use Canvas or WebGL-based charting libraries rather than SVG for high-frequency data. Batch incoming WebSocket messages into animation frames using requestAnimationFrame to avoid layout thrashing. Implement virtual scrolling for data tables with thousands of rows. Keep the React component tree shallow — dashboard widgets should manage their own state locally rather than funneling all updates through a global store.
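
A sketch of the frame-batching approach, assuming the DashboardSocket class from earlier and a hypothetical chart.appendPoints API (any Canvas or WebGL chart with a bulk-append method works the same way):

interface MetricUpdate { metric: string; value: number; timestamp: number }

// Wired up elsewhere in the application.
declare const socket: DashboardSocket
declare const chart: { appendPoints(batch: MetricUpdate[]): void }

const pending: MetricUpdate[] = []
let frameScheduled = false

socket.onMetricUpdate = (metric, value, timestamp) => {
  // Buffer updates as they arrive; the socket can deliver hundreds per second.
  pending.push({ metric, value, timestamp })
  if (!frameScheduled) {
    frameScheduled = true
    // Apply the whole batch once per frame instead of once per message.
    requestAnimationFrame(() => {
      frameScheduled = false
      chart.appendPoints(pending.splice(0))
    })
  }
}

However fast messages arrive, the chart is touched at most once per display refresh, which keeps layout and paint work bounded.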

warning

Avoid re-rendering the entire dashboard on every data update. Each widget should subscribe to its specific metrics and update independently. React.memo and useMemo are essential here.
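
For example, a widget can be memoized so it re-renders only when its own metric's value changes; the MetricCard name and props here are illustrative:

import { memo } from "react"

// Re-renders only when its props actually change; updates to other
// metrics elsewhere in the dashboard leave this widget untouched.
const MetricCard = memo(function MetricCard(props: { metric: string; value: number }) {
  return (
    <div className="metric-card">
      <span>{props.metric}</span>
      <strong>{props.value.toFixed(2)}</strong>
    </div>
  )
})

Combined with per-widget metric subscriptions, this keeps each data update scoped to the one component that displays it.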