As a data engineer, you build and maintain the pipelines that keep data flowing. AnomalyArmor helps you catch breaking changes before they impact downstream systems.
Data Engineer Journey: Connect (~5 min) → Discover (~3 min) → Monitor (~5 min) → Alert (~5 min)

Your Key Workflows

Detect Breaking Schema Changes

Schema drift is your biggest enemy. A column rename or type change can silently break pipelines that ran fine yesterday.
1. Connect Your Database

Start with your most critical production database. Connect now
2. Run Discovery

AnomalyArmor catalogs all tables, views, and columns. Run discovery
3. Configure Schema Alerts

Get notified of column additions, removals, type changes, and renames. Set up alerts
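Not every schema change is equally dangerous: additions are usually safe, while removals, renames, and type changes break consumers. As a mental model for the alert rule, here is a minimal sketch in plain Python (independent of the AnomalyArmor SDK; the `type` field and the exact buckets are illustrative, adjust them to your pipelines):

```python
# Change types that typically break downstream consumers.
# Column additions are usually backward-compatible.
BREAKING = {"column_removed", "type_changed", "column_renamed"}

def triage(changes):
    """Split schema-change events into breaking and safe buckets."""
    breaking = [c for c in changes if c["type"] in BREAKING]
    safe = [c for c in changes if c["type"] not in BREAKING]
    return breaking, safe
```

Routing only the breaking bucket to a paging channel keeps alert noise down while still logging additive changes.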

Integrate with Your CI/CD

Gate deployments on data quality using the CLI:
```shell
# Install
pip install anomalyarmor-cli

# Check freshness before running dbt
armor freshness check snowflake.prod.warehouse.orders

# Exit code 1 if stale, blocking the pipeline
```
Full CLI reference
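In CI you typically wrap the check in a small script so the exit code fails the build step. A hedged sketch (`run_gate` is an illustrative helper, not part of the CLI; the `armor` invocation is taken from the example above):

```python
import subprocess

def run_gate(command: list[str]) -> int:
    """Run a check command and surface its exit code to the CI runner."""
    return subprocess.run(command).returncode

# In a CI step (exit code 1 blocks the job before dbt runs):
#   sys.exit(run_gate(["armor", "freshness", "check",
#                      "snowflake.prod.warehouse.orders"]))
```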

Automate with Webhooks

Trigger actions when schema changes are detected:
```python
from anomalyarmor import Client

client = Client()

# Get schema changes from last 24 hours
changes = client.schema.changes(
    since="24h",
    change_types=["column_removed", "type_changed"]
)

for change in changes:
    print(f"Breaking change: {change.asset_name} - {change.description}")
```
Python SDK guide
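On the receiving end, a webhook endpoint only needs to parse the event and react to breaking changes. A minimal sketch using Python's standard library; the payload fields (`asset_name`, `change_type`) are assumptions for illustration, so check the webhook documentation for the actual schema:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class WebhookHandler(BaseHTTPRequestHandler):
    """Accept webhook POSTs and log breaking schema changes."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        event = json.loads(self.rfile.read(length))
        # Assumed payload fields; verify against the webhook docs.
        if event.get("change_type") in {"column_removed", "type_changed"}:
            print(f"Breaking change: {event.get('asset_name')}")
        self.send_response(204)
        self.end_headers()

    def log_message(self, *args):
        pass  # silence default per-request logging

# HTTPServer(("", 8080), WebhookHandler).serve_forever()
```

In production you would also verify the request signature (if the webhook sender provides one) before trusting the payload.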
| Feature | Why You Need It |
| --- | --- |
| Schema Drift Detection | Catch column changes before they break pipelines |
| Freshness Monitoring | Know when upstream data is stale |
| Webhook Alerts | Integrate with your existing monitoring |
| CLI | Automate checks in CI/CD |

Common Tasks

Set Up dbt Integration

Run AnomalyArmor checks as part of dbt runs
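A common shape for this integration is check-then-build: run the freshness check first and only invoke `dbt run` if it passes. A hedged sketch of that pattern (`check_then_build` is a hypothetical helper, not part of any SDK; the `armor` and `dbt` commands come from the examples above):

```python
import subprocess
import sys

def check_then_build(check_cmd: list[str], build_cmd: list[str]) -> int:
    """Run the data-quality check; only run the build if it passes."""
    rc = subprocess.run(check_cmd).returncode
    if rc != 0:
        print("Freshness check failed; skipping build", file=sys.stderr)
        return rc
    return subprocess.run(build_cmd).returncode

# check_then_build(
#     ["armor", "freshness", "check", "snowflake.prod.warehouse.orders"],
#     ["dbt", "run"],
# )
```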

Airflow Pre-flight Checks

Gate DAG tasks on data freshness
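In Airflow, a task fails when its callable raises, which is all a pre-flight gate needs. A hedged sketch (`preflight_freshness` is a hypothetical callable; the Airflow wiring in the comment is an assumption, and the Airflow import is omitted to keep the snippet self-contained):

```python
import subprocess

def preflight_freshness(check_cmd: list[str]) -> None:
    """Raise if the check fails, failing the task and gating downstream tasks."""
    rc = subprocess.run(check_cmd).returncode
    if rc != 0:
        raise RuntimeError(f"freshness check failed with exit code {rc}")

# In a DAG (assumed wiring):
#   PythonOperator(
#       task_id="preflight",
#       python_callable=preflight_freshness,
#       op_args=[["armor", "freshness", "check",
#                 "snowflake.prod.warehouse.orders"]],
#   )
```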

GitHub Actions Integration

Add data quality checks to your CI pipeline

CLI Reference

Full command documentation