
Impact Analysis

Impact analysis compares metrics before and after an event to help you understand whether a deployment, rollback, or other change made things better or worse.

How It Works

  1. Event occurs — A deployment, rollback, or other change is recorded on the timeline
  2. Metrics are collected — Connected analytics providers continuously report metrics (error rates, response times, page views, etc.)
  3. Before/after comparison — OpsTrails compares metric values in a configurable window before and after the event
  4. Impact assessment — Significant changes are flagged, helping you correlate deployments with metric movements
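Steps 3 and 4 can be sketched as follows. This is a minimal illustration, not the OpsTrails implementation: the metric samples, the averaging, and the 50% significance threshold are all assumptions made for the example.

```python
from datetime import datetime, timedelta

# Hypothetical metric samples: (timestamp, value) pairs for error_rate (%)
samples = [
    (datetime(2024, 1, 15, 13, 0), 0.11),
    (datetime(2024, 1, 15, 14, 0), 0.13),
    (datetime(2024, 1, 15, 15, 0), 2.38),
    (datetime(2024, 1, 15, 16, 0), 2.44),
]

event_time = datetime(2024, 1, 15, 14, 45)  # change recorded on the timeline
window = timedelta(hours=2)                 # configurable comparison window

# Step 3: average the metric in the windows before and after the event
before = [v for t, v in samples if event_time - window <= t < event_time]
after = [v for t, v in samples if event_time <= t < event_time + window]
before_avg = sum(before) / len(before)
after_avg = sum(after) / len(after)

# Step 4: flag the change if it exceeds a relative threshold (assumed 50%)
change = after_avg - before_avg
significant = abs(change) > 0.5 * before_avg

print(f"before={before_avg:.2f}% after={after_avg:.2f}% "
      f"change={change:+.2f} pts significant={significant}")
```

With these sample values the error rate jumps from an average of 0.12% to 2.41%, so the change is flagged as significant.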

Comparison Windows

When querying impact, you can specify the size of the comparison window used on each side of the event.
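As an illustration of how a window size maps to the two ranges being compared (the 2-hour value matches the worked example below; the variable names here are ours, not an OpsTrails API):

```python
from datetime import datetime, timedelta

event_time = datetime(2024, 1, 15, 14, 45)  # event at 2:45 PM
window = timedelta(hours=2)                 # chosen comparison window size

# Metrics are compared across the window on each side of the event
before_range = (event_time - window, event_time)  # 12:45 PM - 2:45 PM
after_range = (event_time, event_time + window)   # 2:45 PM - 4:45 PM
```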

Worked Example: Before/After a Deployment

Here's what impact analysis looks like in practice. Suppose api-service v2.5.1 was deployed to production at 2:45 PM. OpsTrails compares metrics in a 2-hour window before and after the event:

Metric         Before (12:45–2:45 PM)   After (2:45–4:45 PM)   Change
error_rate     0.12%                    2.41%                  +2.29 pts (20x increase)
p95_latency    180ms                    420ms                  +240ms (2.3x slower)
throughput     1,200 req/s              1,180 req/s            -1.7% (stable)

Interpretation: The error rate and latency both jumped significantly after the deployment, while throughput remained stable. This pattern suggests the deployment introduced a bug that causes errors and slower responses, but hasn't affected overall traffic. This deployment is a strong candidate for rollback.
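The change column follows directly from the before/after values. As a quick check (the numbers come from the table above; the computation is ours):

```python
before_err, after_err = 0.12, 2.41  # error_rate, %
before_p95, after_p95 = 180, 420    # p95_latency, ms
before_tp, after_tp = 1200, 1180    # throughput, req/s

# Absolute and relative changes, matching the table's Change column
print(f"error_rate:  {after_err - before_err:+.2f} pts ({after_err / before_err:.0f}x)")
print(f"p95_latency: {after_p95 - before_p95:+d} ms ({after_p95 / before_p95:.1f}x)")
print(f"throughput:  {(after_tp - before_tp) / before_tp * 100:+.1f}%")
```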

Which Metrics Matter Most

The most useful metrics depend on what you're investigating: error rates and latency percentiles for reliability regressions, throughput and page views for traffic impact.

Using Impact Analysis via MCP

When an AI assistant is connected to OpsTrails via MCP, it uses the get_metrics_around_event tool to perform impact analysis automatically. This is the same tool that powers the before/after comparisons shown above. See the MCP Tools Reference for full tool documentation.

An example AI query that triggers impact analysis: "Did the last deployment to production make things better or worse?"

The AI will call query_events to find the relevant event, then get_metrics_around_event with the event's timestamp and subject to retrieve the before/after metric comparison.
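The two-step tool sequence can be sketched as below. The tool names (query_events, get_metrics_around_event) are from this page, but the argument names and return shapes are assumptions; call_tool here is a stub standing in for a real MCP client session.

```python
# Stubbed MCP tool dispatch — in practice this goes through an MCP client
# session. Argument names and return shapes are illustrative assumptions.
def call_tool(name: str, arguments: dict) -> dict:
    """Stand-in for an MCP client's tool call; returns canned data."""
    if name == "query_events":
        return {"events": [{"timestamp": "2024-01-15T14:45:00Z",
                            "subject": "production",
                            "summary": "Deployed api-service v2.5.1"}]}
    if name == "get_metrics_around_event":
        return {"before": {"error_rate": 0.12}, "after": {"error_rate": 2.41}}
    raise ValueError(f"unknown tool: {name}")

# Step 1: find the relevant event
event = call_tool("query_events", {"query": "latest production deployment"})["events"][0]

# Step 2: fetch the before/after comparison around that event
impact = call_tool("get_metrics_around_event", {
    "timestamp": event["timestamp"],
    "subject": event["subject"],
})

delta = impact["after"]["error_rate"] - impact["before"]["error_rate"]
print(f"{event['summary']}: error_rate {delta:+.2f} pts")
```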

Using with AI

When connected via MCP, AI assistants automatically use the get_metrics_around_event tool to assess impact. You can ask, for example, whether a recent deployment made your key metrics better or worse.

Tip

For the most accurate impact analysis, make sure your analytics provider metrics are mapped to the same subjects (environments) as your events.

Example Metrics

Common metrics used for impact analysis include error rates, latency percentiles (such as p95), throughput, and page views.