Cloud Data Engineer

Exactius Poland
Remote

Job Description


About Exactius

Exactius works with companies to solve complex digital marketing challenges and achieve ambitious growth goals. We bring executive leadership and cross-functional execution teams that not only shape the strategy but also execute and deliver results. These teams always include a proven performance marketing CMO, along with campaign and CRM managers, conversion and product managers, data scientists, front- and back-end developers, and creative teams as needed. They are supported by proprietary technology that enables advanced analysis and real-time optimization. Exactius is an extension of your internal teams with one goal in mind: grow faster and be more profitable.


Description

As a Data Engineer at Exactius, you will contribute to the design, development, and operation of a cloud-native data platform that supports large-scale data ingestion, transformation, and analytics workloads. You will build and operate reliable, production-grade data pipelines and shared data services, with a strong focus on scalability, automation, and system robustness.

You will work extensively with cloud infrastructure, orchestration frameworks, and CI/CD pipelines to standardize data ingestion patterns, enforce data quality, and improve observability across the platform. The role emphasizes platform thinking: building reusable components, enforcing engineering standards, and enabling multiple teams to consume data safely and efficiently.
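To make "reusable components" concrete, here is a minimal sketch of a standardized ingestion pattern with a built-in data-quality gate. It is purely illustrative: the class names, fields, and the AdSpendIngestor example are hypothetical, not part of Exactius's actual platform.

```python
# Hypothetical sketch: a shared ingestion base class that every source-specific
# loader inherits, so validation and logging are enforced in one place.
from abc import ABC, abstractmethod
from typing import Iterable, Mapping
import logging

logger = logging.getLogger("ingestion")


class Ingestor(ABC):
    """Common contract for all ingestion jobs on the platform."""

    required_fields: tuple[str, ...] = ()

    @abstractmethod
    def extract(self) -> Iterable[Mapping]:
        """Pull raw records from the source system."""

    def validate(self, record: Mapping) -> bool:
        # Minimal data-quality gate: reject records missing required fields.
        return all(record.get(f) is not None for f in self.required_fields)

    def load(self, record: Mapping) -> None:
        # In a real platform this would write to the warehouse; here it logs.
        logger.info("loaded: %r", record)

    def run(self) -> int:
        loaded = 0
        for record in self.extract():
            if not self.validate(record):
                logger.warning("dropped invalid record: %r", record)
                continue
            self.load(record)
            loaded += 1
        return loaded


class AdSpendIngestor(Ingestor):
    """Example source-specific loader built on the shared contract."""

    required_fields = ("campaign_id", "spend", "date")

    def extract(self) -> Iterable[Mapping]:
        # Stand-in for a third-party API call.
        yield {"campaign_id": "c-1", "spend": 120.5, "date": "2024-01-01"}
```

The design point is that teams add new sources by subclassing, while quality gates and observability stay centralized.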

You will collaborate with data engineers, analytics engineers, and infrastructure teams to evolve the data platform architecture, reduce operational overhead, and ensure high availability, performance, and security of data systems.


Responsibilities

  • Design, build, and maintain scalable ETL pipelines to process and manage large datasets efficiently.
  • Ensure data pipelines are optimized for performance and reliability.
  • Develop and maintain data models using dbt, adhering to best practices in data warehousing.
  • Integrate with third-party APIs and develop services to enable seamless data access.
  • Deploy and scale data infrastructure using cloud platforms (GCP, AWS, or Azure).
  • Implement and manage workflows using Airflow or similar orchestration tools (see the sketch after this list).
  • Establish and maintain CI/CD pipelines for seamless deployment, testing, and automation.
  • Apply software engineering principles to enhance data solution maintainability and scalability.
  • Collaborate with stakeholders to understand data needs and deliver reliable, actionable solutions.
  • Work closely with marketing and advertising teams to enable data-driven decision-making (preferred).
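
As a sketch of how the orchestration and dbt responsibilities above typically fit together (assuming Airflow 2.4+ and a hypothetical dbt project path; none of this is Exactius's actual code):

```python
# Illustrative daily pipeline: ingest, then transform and test with dbt.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_DIR = "/opt/dbt"  # hypothetical project location

with DAG(
    dag_id="daily_marketing_etl",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Extract/load step; a real pipeline would call an ingestion service here.
    ingest = BashOperator(
        task_id="ingest_raw_data",
        bash_command="echo 'load raw data into the warehouse'",
    )

    # Transform step: build the dbt models.
    transform = BashOperator(
        task_id="run_dbt_models",
        bash_command=f"cd {DBT_DIR} && dbt run",
    )

    # Quality gate: fail the pipeline if dbt tests fail.
    test = BashOperator(
        task_id="run_dbt_tests",
        bash_command=f"cd {DBT_DIR} && dbt test",
    )

    ingest >> transform >> test
```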

Requirements

  • At least 2 years of experience as a Data Engineer, with a strong focus on SQL, data warehouse modeling, and dbt.
  • Proficiency in Python for data processing and scripting.
  • Experience integrating with third-party APIs or designing APIs for data access.
  • Experience with Git for version control and Docker for containerization.
  • Hands-on expertise with cloud platforms (GCP, AWS, or Azure).
  • Familiarity with Airflow for workflow orchestration.
  • Knowledge of CI/CD pipelines and deployment automation.
  • Knowledge of digital marketing or advertising (a plus).
  • Enthusiasm for learning and adapting to the exciting world of AI; a commitment to exploring this field is a fundamental part of our culture.


Benefits & Perks

  • Opportunity to have a real impact in a high-growth international brand
  • Work remotely from anywhere in the world
  • Control your time with flexible hours
  • Responsibility from day one, with professional and personal growth
  • Great company culture, working alongside rockstar professionals from various industries and backgrounds
  • We care about you and your career path

