# How we move data

The data synchronization process in Erathos follows the flow illustrated below:

<figure><img src="https://2158418640-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FovVtDtJhcLYgikyBBku0%2Fuploads%2FPBj9SualR5k3s2EaJkmM%2FUntitled%20drawing%20(2).png?alt=media&#x26;token=18597ab8-d76e-4b92-a9a8-7572ae55207c" alt=""><figcaption></figcaption></figure>

* **Data Source Environment**: The process begins with data extraction from your source system, such as a database (e.g., PostgreSQL) or a SaaS platform (e.g., HubSpot), as configured in **Connections**.
* **Temporary Storage**: During each job execution, the extracted data is staged as temporary files in a cloud bucket (e.g., S3, GCS, or Azure Blob Storage) within Erathos' infrastructure. Staging the data this way allows large volumes to be handled efficiently and scalably.
* **Loading to Data Warehouse**: Erathos then loads the staged data into your Data Warehouse using bulk-load commands such as `COPY` (or the equivalent method for your destination).
* **Data Cleanup**: Once loading completes, all temporary files are deleted, which keeps your data secure and avoids unnecessary storage use.
