🔹 Senior Data Engineer (Azure, Databricks, Airflow) 🔹
🌍 We welcome international candidates based in Europe who are open to relocating to Barcelona
📌 About the role
We are looking for a Senior Data Engineer to join a modern, cloud-based data platform team within a leading international tech company.
In this role, you will design, build, and evolve scalable data pipelines and data products within a Lakehouse architecture on Azure, leveraging tools such as Databricks, Data Factory, and Airflow.
You will work at the intersection of data engineering and platform evolution, contributing not only to building data products but also to improving the underlying platform, including metadata-driven frameworks, data quality, observability, and governance.
This is a highly hands-on role with strong ownership, where you will collaborate with cross-functional teams and play a key part in shaping how data is consumed across the business.
If you enjoy working in modern data environments, solving complex data challenges, and building reliable, scalable solutions, this could be a great fit.
💻 What you'll do
🔹 Design and build scalable, reliable, and reusable data pipelines using Azure Data Factory and Apache Airflow
🔹 Develop data transformations in Azure Databricks (PySpark / SQL) following the Medallion Architecture (Bronze, Silver, Gold layers; see the sketch after this list)
🔹 Optimize performance, cost, and reliability of data workloads
🔹 Contribute to the evolution of the data platform (metadata-driven orchestration, observability, data quality)
🔹 Support the migration of legacy data solutions to the modern Lakehouse architecture
🔹 Implement and improve data quality frameworks (e.g. Soda, Great Expectations)
🔹 Ensure pipelines are observable, testable, and production-ready
🔹 Collaborate with Run/Operations teams to troubleshoot incidents and ensure platform stability
🔹 Participate in incident management, root cause analysis, and continuous improvement initiatives
🔹 Contribute to data cataloging, lineage, and governance (e.g. Unity Catalog)
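To give a flavour of the day-to-day work, here is a minimal sketch of a Bronze-to-Silver transformation on Databricks. The table and column names are hypothetical and only illustrate the pattern, not this team's actual pipelines:

```python
# Minimal sketch of a Bronze -> Silver transformation on Databricks.
# Table and column names are hypothetical; assumes a SparkSession is
# available in the Databricks runtime and the bronze table exists.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: raw events, ingested as-is.
bronze = spark.read.table("lakehouse.bronze.events")

# Silver: deduplicated, typed, and cleaned.
silver = (
    bronze
    .dropDuplicates(["event_id"])
    .filter(F.col("event_id").isNotNull())
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .withColumn("event_date", F.to_date("event_ts"))
)

# Persist as a Delta table, partitioned for downstream consumers.
(
    silver.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("event_date")
    .saveAsTable("lakehouse.silver.events")
)
```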
💡 Must Have
🔹 4+ years of experience in Data Engineering / Analytics Engineering / BI
🔹 Strong expertise in Python (PySpark) and advanced SQL
🔹 Hands-on experience with Azure (Data Factory, cloud data platforms), Databricks & Delta Lake, and Apache Airflow
🔹 Solid understanding of Data Lake / Data Warehouse / Lakehouse architectures, the Medallion architecture, ETL/ELT design patterns, and metadata-driven approaches (see the orchestration sketch after this list)
🔹 Experience with performance optimization, partitioning, and scalable data pipelines
🔹 Understanding of batch and streaming pipelines
🔹 Familiarity with DevOps / DataOps practices
🔹 Strong problem-solving skills and an ownership mindset
🔹 Ability to work with both technical and business stakeholders
🔹 Fluent English
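As an illustration of what "metadata-driven" means in practice, here is a minimal Airflow sketch in which tasks are generated from a config list instead of being hard-coded. The pipeline entries and the run_pipeline callable are hypothetical:

```python
# Minimal sketch of metadata-driven orchestration in Apache Airflow (2.4+):
# tasks are generated from metadata rather than hard-coded one by one.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# In a real platform this metadata would come from a table or YAML file.
PIPELINES = [
    {"name": "orders", "source": "bronze.orders", "target": "silver.orders"},
    {"name": "users", "source": "bronze.users", "target": "silver.users"},
]

def run_pipeline(source: str, target: str) -> None:
    # Placeholder: trigger a Databricks job or ADF pipeline run here.
    print(f"Transforming {source} -> {target}")

with DAG(
    dag_id="metadata_driven_silver_loads",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    for p in PIPELINES:
        PythonOperator(
            task_id=f"load_{p['name']}",
            python_callable=run_pipeline,
            op_kwargs={"source": p["source"], "target": p["target"]},
        )
```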
✨ Nice to Have
🔹 Experience with Infrastructure as Code (Terraform, ARM, Bicep)
🔹 Experience with CI/CD pipelines (Azure DevOps, GitHub)
🔹 Exposure to Data Quality tools (Great Expectations, Soda; see the sketch after this list)
🔹 Experience with Data Catalogs (Unity Catalog, DataHub, Atlan, etc.)
🔹 Experience with monitoring/logging tools (Grafana, Azure Log Analytics)
🔹 Background in AdTech or digital environments
🔹 Experience with BI tools (Power BI, Tableau)
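For context, the kind of assertion that tools like Soda or Great Expectations let you declare can be sketched in plain PySpark as follows; the table and column names are hypothetical:

```python
# Minimal sketch of the checks data quality frameworks express
# declaratively, written here as plain PySpark assertions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.read.table("lakehouse.silver.events")

# Row-level checks: nulls in the business key and duplicate keys.
null_keys = df.filter(F.col("event_id").isNull()).count()
duplicates = df.count() - df.dropDuplicates(["event_id"]).count()

# Fail the pipeline run if any check is violated, so bad data
# never propagates to the Gold layer.
assert null_keys == 0, f"{null_keys} rows with null event_id"
assert duplicates == 0, f"{duplicates} duplicate event_id rows"
```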
📍 This role is based in Barcelona. Candidates must be willing to relocate.
🌟 Why join this project?
🤝 People first: a diverse and inclusive culture in an international environment.
🌐 Modern cloud platforms and large-scale, global projects.
📈 High team stability and a collaborative culture.
📚 €1,200 per year training budget and continuous learning opportunities.
💰 Flexible compensation model.
🩺 Private health insurance and benefits package.
⚡ Flexible working hours and a hybrid model.
🏋️ Wellhub: fitness, wellness, and mental health support.
⚽ Football and paddle tennis teams sponsored by Capitole.
🥳 Team buildings, global events, and strong tech communities.
✨ Want to know more about us? Click here and discover all the details.
⭐ Curious about our culture? Check out what people are saying about us on Glassdoor.
💬 We know that not every candidate will meet 100% of the requirements. If your profile doesn't match perfectly but you believe you can add value, we'd still love to hear from you!
🚀 Ready for the challenge? Apply now and be part of a global team driving cloud innovation and security.
Empowering People, Unlocking Innovation.
Information Security Notice
- The employee will have access to confidential information related to Capitole and the assigned project.
- Compliance with internal security and information protection policies is mandatory.
- NDA signature required.