Job Profile: GCP Data Engineer
Job Type: Long-term contract opportunity
Location: 100% Remote in Brazil
Job Description:
Job Summary
We are seeking a skilled Data Engineer with strong experience in Google Cloud Platform (GCP) to design, build, and optimize scalable data solutions. The ideal candidate will have hands-on experience with GCP data services and hold both a relevant Google Cloud Professional certification and a Data Architect certification. This role requires at least 2–3 years of practical GCP experience in production environments.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines and ETL/ELT processes on Google Cloud Platform.
- Build and optimize data architectures using GCP services such as BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Dataproc.
- Collaborate with data scientists, analysts, and business stakeholders to deliver high-quality datasets.
- Implement data modeling, partitioning, clustering, and performance tuning strategies.
- Ensure data quality, integrity, governance, and security across platforms.
- Develop automation for data workflows using Infrastructure as Code and CI/CD practices.
- Monitor, troubleshoot, and optimize data processing jobs and cloud resources.
- Support real-time and batch data processing solutions.
Required Qualifications
- 2–3+ years of hands-on experience with Google Cloud Platform.
- Relevant Google Cloud Professional Certification (e.g., Professional Data Engineer).
- Data Architect certification (required).
- Strong experience with BigQuery, Dataflow, Pub/Sub, Cloud Composer, and Cloud Storage.
- Proficiency in SQL and at least one programming language (Python, Java, or Scala).
- Experience with data modeling, warehousing, and distributed systems.
- Knowledge of CI/CD pipelines and DevOps practices.
Preferred Skills
- Experience with real-time streaming architectures.
- Knowledge of data governance and security best practices.
- Familiarity with Terraform or other Infrastructure as Code tools.
- Experience working in Agile environments.
Education
- Bachelor’s degree in Computer Science, Engineering, Information Technology, or a related field.