Senior Data Engineer

BrainRocket Spain

Job Description


❗️Please note that this role is office-based in Valencia, Spain (Carrer de Catarroja, 13, 46940 Manises).

❗️We can provide relocation assistance if you're outside the city or country.


We are seeking a highly skilled Senior Data Engineer with expertise in designing, managing, and optimizing data pipelines using Apache Airflow, Snowflake, and Apache Kafka.

This individual will play a pivotal role in architecting robust, scalable, and efficient data solutions, ensuring the integrity, reliability, and accessibility of our data infrastructure.


Responsibilities:

  • Develop and implement data models to support business requirements, optimizing for performance and scalability;
  • Design, build, and maintain scalable data pipelines using Apache Airflow;
  • Implement and maintain Kafka-based streaming data pipelines for real-time data processing and integration with various systems;
  • Integrate with third-party databases and APIs;
  • Establish monitoring, alerting, and maintenance procedures to ensure the health and reliability of data pipelines;
  • Collaborate with cross-functional teams including data scientists, analysts, and stakeholders to understand data requirements.


Requirements:

  • Proficiency in Python and SQL, with experience in data manipulation and transformation;
  • Solid knowledge of data warehousing and data modelling techniques;
  • Experience in designing, building, and maintaining complex data pipelines using Airflow;
  • Proven track record in data engineering roles, with a focus on designing and implementing scalable data solutions using Snowflake or Redshift;
  • In-depth understanding and practical experience in implementing Kafka-based streaming architectures for real-time data processing.


We offer excellent benefits, including but not limited to:

🏥 Six additional days of undocumented sick leave;

🏥 Medical Insurance;

🥳 Celebrations of birthdays, milestones, and employee anniversaries;

🏢 Modern offices with snacks and all the essentials;

🎉 Social Club and more than 50 events per year;

🍳 Partial coverage of breakfasts and lunches;

💻 Learning and development opportunities, plus interesting, challenging tasks;

✈️ Relocation package (tickets, a hotel stay of up to two weeks, and visa support for our employees and their family members);

📚 Opportunity to develop language skills, with partial compensation for the cost of English lessons;

📈 Competitive remuneration level with annual review;

🤝 Teambuilding activities.


Bold moves start here. Make yours. Apply today!

