GCP Data Engineer

Remote
AI Summary

Design, build, and optimize scalable data solutions on Google Cloud Platform. Collaborate with data scientists and stakeholders to deliver high-quality datasets. Implement data modeling and performance tuning strategies.

Key Highlights
Design scalable data pipelines and ETL/ELT processes on GCP
Collaborate with data scientists and stakeholders
Implement data modeling and performance tuning strategies
Key Responsibilities
Design, develop, and maintain scalable data pipelines and ETL/ELT processes on Google Cloud Platform
Build and optimize data architectures using GCP services
Collaborate with data scientists, analysts, and business stakeholders
Technical Skills Required
Google Cloud Platform, BigQuery, Dataflow, Pub/Sub, Cloud Storage, Dataproc, SQL, Python, Java, Scala, Terraform
Benefits & Perks
100% remote work in Brazil
Long-term contract
Nice to Have
Experience with real-time streaming architectures
Knowledge of data governance and security best practices
Familiarity with Terraform or other Infrastructure as Code tools

Job Description


Job Profile: GCP Data Engineer

Job Type: Long-term contract

Location: 100% Remote in Brazil



Job Summary

We are seeking a skilled Data Engineer with strong experience in Google Cloud Platform (GCP) to design, build, and optimize scalable data solutions. The ideal candidate will have hands-on experience with GCP data services, hold a relevant Google Cloud Professional certification, and possess a Data Architect certification. This role requires 2–3+ years of practical GCP experience in production environments.

Key Responsibilities

  • Design, develop, and maintain scalable data pipelines and ETL/ELT processes on Google Cloud Platform.
  • Build and optimize data architectures using GCP services such as BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Dataproc.
  • Collaborate with data scientists, analysts, and business stakeholders to deliver high-quality datasets.
  • Implement data modeling, partitioning, clustering, and performance tuning strategies.
  • Ensure data quality, integrity, governance, and security across platforms.
  • Develop automation for data workflows using Infrastructure as Code and CI/CD practices.
  • Monitor, troubleshoot, and optimize data processing jobs and cloud resources.
  • Support real-time and batch data processing solutions.
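By way of illustration only (this sketch is not part of the posting), the data-quality and pipeline work described above often reduces to small, testable transform steps. All names here (`clean_events`, `event_id`, `event_ts`) are hypothetical:

```python
from datetime import datetime, timezone

def clean_events(raw_rows):
    """Deduplicate rows by event_id, drop rows missing required fields,
    and normalize epoch timestamps to UTC ISO-8601 strings."""
    seen = set()
    cleaned = []
    for row in raw_rows:
        event_id = row.get("event_id")
        ts = row.get("event_ts")
        if event_id is None or ts is None:
            continue  # data-quality rule: required fields must be present
        if event_id in seen:
            continue  # deduplicate on the natural key
        seen.add(event_id)
        cleaned.append({
            "event_id": event_id,
            "event_ts": datetime.fromtimestamp(ts, tz=timezone.utc).isoformat(),
            "payload": row.get("payload", {}),
        })
    return cleaned
```

A step like this would typically run inside a Dataflow or Composer-orchestrated job before loading into BigQuery; keeping it a pure function makes it easy to unit-test outside the cloud.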

Required Qualifications

  • 2–3+ years of hands-on experience with Google Cloud Platform.
  • Relevant Google Cloud Professional Certification (e.g., Professional Data Engineer).
  • Data Architect certification (required).
  • Strong experience with BigQuery, Dataflow, Pub/Sub, Cloud Composer, and Cloud Storage.
  • Proficiency in SQL and at least one programming language (Python, Java, or Scala).
  • Experience with data modeling, warehousing, and distributed systems.
  • Knowledge of CI/CD pipelines and DevOps practices.

Preferred Skills

  • Experience with real-time streaming architectures.
  • Knowledge of data governance and security best practices.
  • Familiarity with Terraform or other Infrastructure as Code tools.
  • Experience working in Agile environments.
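As a minimal illustration of the Infrastructure-as-Code work mentioned above (a sketch only; the resource name and dataset id are hypothetical, not taken from this posting), a BigQuery dataset might be provisioned with Terraform's Google provider like so:

```hcl
# Hypothetical example: provision a BigQuery dataset via the Google provider.
resource "google_bigquery_dataset" "analytics" {
  dataset_id = "analytics"
  location   = "US"
}
```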

Education

  • Bachelor’s degree in Computer Science, Engineering, Information Technology, or a related field.

