
Airflow

Trigger an Airflow DAG from Erathos when a registered job completes successfully.

Prerequisites

Ensure the following variables are configured in the Airflow environment:

  • airflow_url: Base URL of the Airflow server.

  • airflow_dag_id: Identifier of the DAG to be triggered.

Note: While you can directly use the target URL, we strongly recommend leveraging variables and secrets for enhanced security and maintainability.

Authentication

For Airflow authentication, a Base64-encoded token is required:

  • Generate the token using the string username:password.

  • Example command to generate the token:

    echo -n "username:password" | base64

  • Store this token securely as a secret named airflow_base64_token.
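
The same token can also be generated in Python; a minimal sketch with placeholder credentials:

```python
import base64

# Placeholder credentials -- replace with your Airflow user and password.
username = "username"
password = "password"

# Equivalent to: echo -n "username:password" | base64
token = base64.b64encode(f"{username}:{password}".encode()).decode()
print(token)  # store this value as the secret airflow_base64_token
```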

Triggering a DAG

When a registered job in Erathos successfully completes, the specified DAG will be triggered with the following parameters:

  • schema_name: Name of the related schema.

  • table_name: Name of the related table.
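
On the Airflow side, these values arrive in dag_run.conf. A standalone sketch of a task callable that consumes them (the function name and example conf are hypothetical; in a real DAG you would read the values from dag_run.conf via the task context or op_kwargs):

```python
def on_erathos_trigger(schema, table, **_):
    """Handle the parameters Erathos passes in the dagRuns "conf" field."""
    # In a PythonOperator, forward dag_run.conf as op_kwargs or read it
    # from the context; this sketch only shows the payload's shape.
    return f"syncing {schema}.{table}"

# Example conf, mirroring what Erathos sends on job success:
conf = {"schema": "public", "table": "orders"}
result = on_erathos_trigger(**conf)
```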

POST Payload:

{
  "description": "Trigger Airflow DAG run on success job execution",
  "is_active": true,
  "method": "POST",
  "url": "https://${{variables.airflow_url}}/api/v1/dags/${{variables.airflow_dag_id}}/dagRuns",
  "header": {
    "Authorization": "Basic ${{secrets.airflow_base64_token}}"
  },
  "body": {
    "conf": {
      "schema": "${{erathos.schema_name}}",
      "table": "${{erathos.table_name}}"
    },
    "note": "triggered on Erathos platform by ${{erathos.triggered_by}}"
  },
  "rules": [
    {
      "variable_name": "STATUS",
      "operation": "EQUAL",
      "value": "FINISHED"
    }
  ],
  "jobs": [
    "<ERATHOS_JOB1_ID>",
    "<ERATHOS_JOB2_ID>",
    ...
    "<ERATHOS_JOBN_ID>"
  ]
}
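
After variable, secret, and metadata substitution, this webhook resolves to a call to Airflow's stable REST API (POST /api/v1/dags/{dag_id}/dagRuns). A rough Python sketch of the request being built, where the host, DAG id, token, and job metadata are all illustrative placeholders:

```python
import json

def build_dag_run_request(airflow_url, dag_id, token,
                          schema_name, table_name, triggered_by):
    """Assemble the URL, headers, and JSON body of the DAG-trigger call.

    All parameter names are illustrative; in Erathos the substitution is
    performed by the platform using the ${{...}} placeholders.
    """
    url = f"https://{airflow_url}/api/v1/dags/{dag_id}/dagRuns"
    headers = {
        "Authorization": f"Basic {token}",
        "Content-Type": "application/json",
    }
    body = {
        "conf": {"schema": schema_name, "table": table_name},
        "note": f"triggered on Erathos platform by {triggered_by}",
    }
    return url, headers, json.dumps(body)

url, headers, body = build_dag_run_request(
    "airflow.example.com", "my_dag", "dXNlcm5hbWU6cGFzc3dvcmQ=",
    "public", "orders", "user@example.com",
)
```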