I’m looking for an experienced GCP Data Engineer to join a high-impact data transformation programme on a 12-month contract.
Hands-on GCP experience required
💰 $100–$120/hour
🌍 Fully Remote (US timezone)
📅 12 months (strong likelihood of extension)
🔍 The Role
You’ll be hands-on building and optimising a modern data platform on Google Cloud Platform (GCP), working closely with architects and stakeholders to deliver scalable data solutions.
This role is ideal for someone who enjoys building robust pipelines, working with large datasets, and solving real-world data challenges in GCP.
🧠 Key Responsibilities
- Build and maintain scalable data pipelines (batch & real-time)
- Work with large datasets in BigQuery, optimising performance and cost
- Develop data processing solutions using Dataflow (Apache Beam)
- Integrate data using Pub/Sub and Cloud Storage
- Support data modelling and transformation workflows
- Collaborate with engineers, analysts, and business stakeholders
- Follow best practices in data quality, testing, and deployment
☁️ Tech Stack (GCP-focused)
BigQuery · Dataflow (Apache Beam) · Pub/Sub · Cloud Storage · Cloud Composer / Dataform
✅ Requirements
- Strong preference for candidates with hands-on GCP experience (not just general cloud exposure)
- Proven experience as a Data Engineer working on GCP
- Strong experience building data pipelines at scale
- Experience with both batch and real-time processing
- Solid SQL skills and experience with data warehousing (BigQuery)
- Comfortable working in a fast-paced, collaborative environment
⭐ Nice to Have
- Experience migrating data platforms to GCP
- BigQuery cost optimisation experience
- Exposure to CI/CD or infrastructure as code (e.g., Terraform)
📩 Interested?
Drop me a message or comment below, and I’ll reach out with more details.
#Hiring #DataEngineer #GCP #GoogleCloud #BigData #RemoteJobs #ContractJobs