Job Description
Job Title: GCP Data Engineer
Location: Fully Remote
Job Type: W2
About The Role
We are seeking a highly skilled GCP Data Engineer to support a large-scale Snowflake to Google Cloud Platform (GCP) migration initiative. This role will play a critical part in transforming and validating data models as part of the migration process, ensuring accuracy, scalability, and performance across systems.
The ideal candidate is technically strong, proactive, and comfortable working at the intersection of data engineering, validation frameworks, and AI-driven cloud services.
Project Overview
Project Goal: Migrate enterprise data from Snowflake to GCP.
Key Impact Areas
- Transforming data models (e.g., snapshot to incremental models)
- Combining and restructuring datasets
- Validating data integrity and business impact
- Building automated validation pipelines
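To illustrate the kind of validation work the role involves, here is a minimal, hypothetical sketch of an automated check that compares a legacy table extract against its migrated counterpart. The function name `validate_migration` and the checksum approach are illustrative only, not part of the project's actual tooling:

```python
import hashlib


def validate_migration(legacy_rows, migrated_rows, key):
    """Compare a legacy table extract against its migrated counterpart.

    Returns two check results: whether row counts match, and whether a
    content checksum matches. Rows are sorted by `key` before hashing,
    so differences in row ordering do not cause false mismatches.
    """
    def checksum(rows):
        digest = hashlib.sha256()
        for row in sorted(rows, key=lambda r: r[key]):
            # Sort column/value pairs so column order is also irrelevant.
            digest.update(repr(sorted(row.items())).encode())
        return digest.hexdigest()

    return {
        "row_count_match": len(legacy_rows) == len(migrated_rows),
        "checksum_match": checksum(legacy_rows) == checksum(migrated_rows),
    }


# Example: row counts agree, but one value changed during migration.
legacy = [{"id": 1, "amt": 10.0}, {"id": 2, "amt": 20.0}]
migrated = [{"id": 2, "amt": 20.0}, {"id": 1, "amt": 10.5}]
result = validate_migration(legacy, migrated, key="id")
```

In practice a check like this would run inside a scheduled pipeline against query results from both platforms, with mismatches routed to the team for investigation.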
Key Responsibilities
- Lead and support Snowflake-to-GCP migration activities.
- Perform comprehensive data validations, comparing legacy vs. transformed models.
- Design and implement automated validation pipelines.
- Build scalable and reliable data pipelines using GCP services.
- Collaborate with cross-functional teams to translate complex data findings into actionable insights.
- Integrate GCP and Palantir Foundry systems using REST APIs and secure data transfer mechanisms.
- Work with modern AI and cloud-native tools to enhance data workflows.
Technical Skills Required
- 3+ years of programming experience in Python, PySpark, and SQL.
- Strong hands-on experience with Google Cloud Platform (GCP) services, including BigQuery, Vertex AI, Cloud Functions, Cloud Storage, Looker, and the GCP Agent Development Kit (ADK).
- Experience implementing scalable data pipelines and system integrations.
- Strong understanding of data modeling concepts (incremental models, transformations, dataset consolidation).
- Experience building automated validation frameworks.
Nice to Have
- Experience with Palantir Foundry (especially Foundry AIP).
- Exposure to modern AI frameworks (e.g., Vertex AI, Gemini).
- Experience integrating GCP with external platforms via REST APIs.
- Knowledge of cloud networking concepts (egress/ingress policies).