How we move data

The data synchronization process in Erathos follows the flow described below (a code sketch of the end-to-end sequence appears after the steps):

  • Data Source Environment: The process begins with data extraction from your source system, such as a database (e.g., PostgreSQL) or a SaaS platform (e.g., HubSpot), configured in Connections.

  • Temporary Storage: During each job run, the extracted data is staged in a cloud bucket (e.g., S3, GCS, or Azure Blob Storage) within Erathos' infrastructure. Staging the data as temporary files keeps large transfers efficient and scalable.

  • Loading to the Data Warehouse: Once the data is staged, Erathos issues a command such as COPY (or the destination-specific equivalent) to load it from temporary storage into your data warehouse.

  • Data Cleanup: After the load completes, all temporary files are deleted, so no extracted data lingers in staging and storage is used only for the duration of the run.
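
For concreteness, here is a minimal sketch of those four steps in Python. Everything in it is hypothetical: the bucket name, object key, connection strings, IAM role, and table names are placeholders, and the Redshift-style COPY stands in for whatever loading command the destination supports. Erathos runs the real sequence inside its own infrastructure, so this only illustrates the shape of the flow, not the product's actual implementation.

```python
# A hypothetical sketch of the four-step flow above. Bucket, keys,
# connection strings, the IAM role, and table names are all placeholders.
import csv
import io

import boto3      # AWS SDK, used here for the temporary staging bucket
import psycopg2   # client for the PostgreSQL source and the warehouse

STAGING_BUCKET = "example-staging-bucket"   # placeholder bucket
STAGING_KEY = "jobs/run-123/orders.csv"     # placeholder per-run object key

# 1. Extract: read rows from the source system configured in Connections.
source = psycopg2.connect("postgresql://user:pass@source-host/app_db")
with source, source.cursor() as cur:
    cur.execute("SELECT id, customer_id, total, updated_at FROM orders")
    rows = cur.fetchall()
    columns = [desc[0] for desc in cur.description]

# 2. Stage: write the extracted rows to a temporary object in cloud storage.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(columns)
writer.writerows(rows)
s3 = boto3.client("s3")
s3.put_object(
    Bucket=STAGING_BUCKET,
    Key=STAGING_KEY,
    Body=buf.getvalue().encode("utf-8"),
)

# 3. Load: have the warehouse bulk-load the staged file. This is a
# Redshift-style COPY; other destinations use their own equivalents.
warehouse = psycopg2.connect("postgresql://user:pass@warehouse-host/analytics")
with warehouse, warehouse.cursor() as cur:
    cur.execute(
        f"""
        COPY analytics.orders
        FROM 's3://{STAGING_BUCKET}/{STAGING_KEY}'
        IAM_ROLE 'arn:aws:iam::123456789012:role/example-load-role'
        FORMAT AS CSV IGNOREHEADER 1
        """
    )

# 4. Clean up: delete the temporary file once the load has committed.
s3.delete_object(Bucket=STAGING_BUCKET, Key=STAGING_KEY)
```

In practice the staged object would typically be compressed and split into chunks, and the load wrapped in a transaction, but the four phases (extract, stage, load, clean up) stay the same.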