Introduction
Integrate Erathos with your modern data stack.
This feature is in Beta.
The Erathos platform simplifies the management of your data workflows by offering two orchestration options: Job Execution and Webhook. These features let you automate and streamline your data processes.
You can trigger job executions programmatically, automating tasks such as updating a Data Warehouse or chaining multiple jobs. Meanwhile, Webhooks enable you to receive notifications when specific events occur, providing real-time insights into job statuses and metadata. The platform's capabilities are designed to integrate seamlessly with your existing data pipelines, making it easier to manage, monitor, and scale your data operations.
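For example, starting a job run programmatically takes a single HTTP request. The sketch below is a minimal illustration, not the documented API shape: the base URL, the run path, and the bearer-token authentication are assumptions to verify against the API reference.

```python
import requests

API_BASE = "https://api.erathos.com"  # assumed base URL
JOB_ID = "6f1c2a9e-0d5b-4a1e-9c3f-2b7e8d4a1f00"  # a Job ID from the Jobs endpoint

# Hypothetical Run endpoint path and auth scheme; check the API reference.
response = requests.post(
    f"{API_BASE}/jobs/{JOB_ID}/run",
    headers={"Authorization": "Bearer <your-api-token>"},
    timeout=30,
)
response.raise_for_status()  # raises if the run could not be started
```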
Webhook Registration
To register a webhook, you need to provide the following information:
description: A text description to help identify the purpose of the webhook.
jobs: A list of Job IDs (UUIDs) that should trigger the webhook. These IDs can be found through the Jobs endpoint in the workspace.
method: The HTTP method of the request (GET, POST, PATCH, or PUT).
url: The destination URL the webhook request will be sent to.
header: Additional fields to include in the request, such as Content-Type and Authorization.
body: The request body, which can include static or dynamic data using execution metadata.
rules: A list of conditions that must be met for the webhook to be triggered. All rules must be satisfied for the webhook to be sent. Each rule consists of:
  variable_name: The name of the metadata variable to evaluate (see the Erathos Metadata section).
  operation: One of the accepted operations for the variable's type (see the Rules section).
  value: A string or list of strings corresponding to the desired values.
The url, header, and body fields support dynamic values in the format ${{variables.my_variable}}, ${{secrets.my_secrets}}, and ${{erathos.rows}}. These references are replaced with real values before the webhook is sent, using variables and secrets previously registered by the user, as well as metadata values from the execution that triggered the webhook.
Example
Below is an example of a webhook that triggers an Airflow DAG run after the successful completion of a Job, only for executions between 00:00 and 08:00 on weekdays, and only if at least one record was transferred.
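A minimal sketch of such a definition is shown below as a request payload. The metadata variable names (status, finished_time, weekday, rows) and the operation names are assumptions, since the real names are listed in the Erathos Metadata and Rules sections; the Airflow host, DAG ID, and secret name are placeholders as well.

```python
# Sketch of a webhook definition; the field names follow the registration
# fields above, but variable and operation names are assumptions.
webhook = {
    "description": "Trigger the daily_refresh DAG in Airflow after a successful load",
    "jobs": ["6f1c2a9e-0d5b-4a1e-9c3f-2b7e8d4a1f00"],  # Job IDs from the Jobs endpoint
    "method": "POST",
    # Airflow 2 stable REST API route for creating a DAG run
    "url": "https://airflow.example.com/api/v1/dags/daily_refresh/dagRuns",
    "header": {
        "Content-Type": "application/json",
        # A secret registered beforehand, substituted before the request is sent
        "Authorization": "Bearer ${{secrets.airflow_token}}",
    },
    "body": {
        # Pass execution metadata through to the DAG run
        "conf": {"rows_transferred": "${{erathos.rows}}"}
    },
    "rules": [
        # All four rules must hold for the webhook to fire
        {"variable_name": "status", "operation": "equals", "value": "succeeded"},
        {"variable_name": "finished_time", "operation": "between", "value": ["00:00", "08:00"]},
        {"variable_name": "weekday", "operation": "in", "value": ["mon", "tue", "wed", "thu", "fri"]},
        {"variable_name": "rows", "operation": "greater_than_or_equal", "value": "1"},
    ],
}
```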
More examples are available in the Templates section.
After building your webhook, send a request to the Create Webhook endpoint to register it.
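Continuing the sketch above, and assuming a hypothetical Create Webhook route with bearer-token authentication (check the API reference for the real ones), registration might look like this:

```python
import requests

API_BASE = "https://api.erathos.com"  # assumed base URL

# Hypothetical Create Webhook route; consult the API reference.
response = requests.post(
    f"{API_BASE}/webhooks",
    json=webhook,  # the definition built in the example above
    headers={"Authorization": "Bearer <your-api-token>"},
    timeout=30,
)
response.raise_for_status()
print(response.json())  # the registered webhook, including its ID
```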
Limitations
This section outlines the current constraints of the Orchestration feature. Being aware of these limitations helps you avoid unexpected behavior or errors in your workflows.
Job Dependencies
For API connectors, dependent jobs may exist, meaning certain endpoints require data from the primary endpoint. For instance, a primary endpoint that lists users might have a dependent endpoint that provides detailed information about each user based on their identifier.
Dependent jobs cannot run independently. To execute a dependent job, run its primary job; all of its dependent jobs are executed automatically after the primary job completes.
Block Window
Programmatic use of the Run endpoint is treated as a manual execution, equivalent to starting a new manual execution directly through the platform. Consequently, if a Block Window covers the time of the Run request and the Allow manual executions option is disabled, the execution is blocked and the API returns a 400 response.
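A sketch of handling that case follows, reusing the assumed base URL and Run endpoint path from the earlier example; treat the route and the exact shape of the error payload as assumptions.

```python
import requests

API_BASE = "https://api.erathos.com"  # assumed base URL
JOB_ID = "6f1c2a9e-0d5b-4a1e-9c3f-2b7e8d4a1f00"

resp = requests.post(
    f"{API_BASE}/jobs/{JOB_ID}/run",  # hypothetical Run endpoint path
    headers={"Authorization": "Bearer <your-api-token>"},
    timeout=30,
)
if resp.status_code == 400:
    # Likely a Block Window with "Allow manual executions" disabled;
    # retry after the blocked window ends instead of retrying immediately.
    print("Execution blocked:", resp.text)
else:
    resp.raise_for_status()
```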