
Data Pipeline Monitoring

Track ETL jobs, data loads, and database migrations on the timeline. Correlate data pipeline events with downstream issues.

The Pattern

  1. Pipeline records events — Airflow, cron, or your ETL framework posts a data-load event when a job completes
  2. Query data operations — Ask “what data jobs ran today?” to see all data pipeline activity
  3. Correlate with issues — When a downstream dashboard shows stale data, check whether the data load completed successfully
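The first step of the pattern can be sketched as a small wrapper around the events API. The endpoint and payload shape are taken from the examples below; the `record_event` helper, the `//cron/nightly-etl` source, and the job names are hypothetical, and the actual `curl` call is left as a comment so the sketch prints the payload instead of sending it.

```shell
#!/bin/sh
# Hypothetical wrapper: build a data-load event payload after a job
# finishes. The payload shape mirrors the curl examples in this doc;
# "time" uses an explicit RFC 3339 timestamp.
record_event() {
  # $1 = source, $2 = subject, $3 = description
  payload=$(printf '{"specversion":"1.0","type":"data-load","source":"%s","time":"%s","subject":"%s","data":{"description":"%s"}}' \
    "$1" "$(date -u +%Y-%m-%dT%H:%M:%SZ)" "$2" "$3")
  # In a real job you would POST it, e.g.:
  #   curl -X POST https://api.opstrails.dev/api/v1/events \
  #     -H "Authorization: Bearer YOUR_API_KEY" \
  #     -H "Content-Type: application/json" -d "$payload"
  echo "$payload"
}

# Example: record a successful nightly ETL run
record_event "//cron/nightly-etl" "warehouse" "Nightly ETL finished"
```

A cron entry can then chain the real job and the event, e.g. `run_etl.sh && record_event …`, so events are only recorded on success.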

Example: Track a Data Load

```bash
curl -X POST https://api.opstrails.dev/api/v1/events \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "specversion": "1.0",
    "type": "data-load",
    "source": "//airflow/etl-pipeline",
    "time": "NOW",
    "subject": "warehouse",
    "data": {
      "description": "Loaded 2.3M rows into analytics warehouse"
    }
  }'
```

Example: Track a Migration

```bash
curl -X POST https://api.opstrails.dev/api/v1/events \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "specversion": "1.0",
    "type": "data-load",
    "source": "//scripts/db-migration",
    "time": "NOW",
    "subject": "production",
    "severity": "MAJOR",
    "version": "migration-042",
    "data": {
      "description": "Added index on users.email column"
    }
  }'
```
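Both examples pass "NOW" as the time field. If you would rather send an explicit timestamp (CloudEvents 1.0, which the `specversion` field suggests, uses RFC 3339 timestamps for `time`), a portable way to generate one in shell, assuming the API also accepts explicit timestamps:

```shell
# Produce an RFC 3339 / ISO 8601 UTC timestamp, e.g. 2024-05-01T12:00:00Z,
# suitable for the "time" field instead of the "NOW" shorthand above.
event_time=$(date -u +%Y-%m-%dT%H:%M:%SZ)
echo "$event_time"
```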

Querying with AI

Best Practices