Google Cloud Data Architect - IAM Data Modernization
Design and implement data lake architectures on Google Cloud, migrating on-premises SQL data warehouses to target-state Data Lakes. Develop scalable batch and streaming pipelines using Dataflow and Spark. Ensure data governance, quality, and metadata management.
Job Description
H1B Workable
Fine with Relocation
Role: Google Cloud Data Architect – IAM Data Modernization
Location: Dallas, TX / Charlotte, NC / Columbus, OH / New Jersey (4 days onsite)
Implementation partner: *******
End Client (Domain): Banking / Finance
Mode of Interview - Video / Virtual
Experience: 12+ years
Project/Program
Identity & Access Management (IAM) Data Modernization – migration of an on-premises SQL data warehouse to a target-state Data Lake on Google Cloud (GCP), enabling metrics & reporting, advanced analytics, and GenAI use cases (natural language querying, accelerated summarization, cross-domain trend analysis).
About Program/Project
The IAM Data Modernization project involves migrating an on-premises SQL data warehouse to a target-state Data Lake in a GCP cloud environment. Key highlights include:
- Integration Scope: 30+ source system data ingestions and multiple downstream integrations
- Capabilities: Metrics, reporting, and Gen AI use cases with natural language querying, advanced pattern/trend analysis, faster summarizations, and cross-domain metric monitoring
- Benefits:
- Scalability and access to advanced cloud functionality
- Highly available and performant semantic layer with historical data support
- Unified data strategy for executive reporting, analytics, and Gen AI across cyber domains
This modernization establishes a single source of truth for enterprise-wide data-driven decision-making.
Required Skills
Data Lake Architecture & Storage
- Proven experience designing and implementing data lake architectures (e.g., Bronze/Silver/Gold or layered models).
- Strong knowledge of Cloud Storage (GCS) design, including bucket layout, naming conventions, lifecycle policies, and access controls
- Experience with Hadoop/HDFS architecture, distributed file systems, and data locality principles
- Hands-on experience with columnar data formats (Parquet, Avro, ORC) and compression techniques
- Expertise in partitioning strategies, backfills, and large-scale data organization
- Ability to design data models optimized for analytics and BI consumption
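As one hedged illustration of the layered-lake and partitioning skills listed above, a minimal path-builder for a Bronze/Silver/Gold layout on GCS might look like the sketch below. The bucket name, layer names, and `ingest_date` partition key are assumptions for illustration, not part of this role's actual environment.

```python
from datetime import date

# Illustrative only: bucket and layer names are assumptions, not the client's real layout.
BUCKET = "gs://example-iam-datalake"
LAYERS = ("bronze", "silver", "gold")

def object_prefix(layer: str, source: str, dt: date) -> str:
    """Build a hive-style partitioned prefix for a layered data lake.

    Partitioning by ingestion date keeps backfills and lifecycle policies
    scoped to a single layer/source/day prefix.
    """
    if layer not in LAYERS:
        raise ValueError(f"unknown layer: {layer}")
    return f"{BUCKET}/{layer}/{source}/ingest_date={dt.isoformat()}/"

# Example: daily Parquet drop for one of the 30+ source systems
print(object_prefix("bronze", "iam_events", date(2024, 1, 15)))
# gs://example-iam-datalake/bronze/iam_events/ingest_date=2024-01-15/
```

Keeping the date inside the object path (rather than only in metadata) is what lets GCS lifecycle rules and backfill jobs target one day of one source without scanning the whole bucket.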
Qualifications
- Experience: 10–14+ years in data engineering/architecture, 5+ years designing on GCP at scale; prior on-prem-to-cloud migration a must.
- Education: Bachelor's/Master's in Computer Science, Information Systems, or equivalent experience.
- Certifications: Google Cloud Professional Cloud Architect (required, or to be obtained within 3 months). Plus: Professional Data Engineer, Security Engineer.
Data Ingestion & Orchestration
- Experience building batch and streaming ingestion pipelines using GCP-native services
- Knowledge of Pub/Sub-based streaming architectures, event schema design, and versioning
- Strong understanding of incremental ingestion and CDC patterns, including idempotency and deduplication
- Hands-on experience with workflow orchestration tools (Cloud Composer / Airflow)
- Ability to design robust error handling, replay, and backfill mechanisms
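The CDC idempotency and replay requirements above can be sketched in a few lines. This is a hedged illustration only: the `pk`/`seq`/`op` field names are assumptions, and a production pipeline would do this in Dataflow or BigQuery MERGE rather than in-memory Python.

```python
# Sketch of an idempotent CDC merge step (field names are illustrative assumptions).
# Replaying the same change feed yields the same result: for each primary key we
# keep only the change with the highest sequence number, then drop tombstones.
def apply_cdc(changes):
    """Reduce a (possibly replayed, out-of-order) CDC feed to latest state per key."""
    latest = {}
    for change in changes:
        key = change["pk"]
        if key not in latest or change["seq"] > latest[key]["seq"]:
            latest[key] = change
    # Delete operations win: exclude them from the serving view
    return {k: c["row"] for k, c in latest.items() if c["op"] != "delete"}

feed = [
    {"pk": 1, "seq": 1, "op": "insert", "row": {"user": "alice"}},
    {"pk": 1, "seq": 2, "op": "update", "row": {"user": "alice2"}},
    {"pk": 2, "seq": 1, "op": "insert", "row": {"user": "bob"}},
    {"pk": 2, "seq": 2, "op": "delete", "row": None},
]
state = apply_cdc(feed)  # replay-safe: apply_cdc(feed + feed) == state
print(state)  # {1: {'user': 'alice2'}}
```

Because the merge is keyed on `(pk, seq)` rather than arrival order, replaying a Pub/Sub backlog or re-running a backfill cannot corrupt the result, which is the idempotency property the requirement calls for.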
Data Processing & Transformation
- Experience developing scalable batch and streaming pipelines using Dataflow (Apache Beam) and/or Spark (Dataproc)
- Strong proficiency in BigQuery SQL, including query optimization, partitioning, clustering, and cost control
- Hands-on experience with Hadoop MapReduce and ecosystem tools (Hive, Pig, Sqoop)
- Advanced Python programming skills for data engineering, including testing and maintainable code design
- Experience managing schema evolution while minimizing downstream impact
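"Schema evolution while minimizing downstream impact" usually means allowing only additive changes. A minimal sketch of that policy, assuming a BigQuery-style list-of-fields schema representation (the field dicts and `NULLABLE` mode here are illustrative assumptions):

```python
# Illustrative additive schema-evolution check: new fields are appended as
# nullable so existing downstream readers keep working; dropping or retyping
# an existing field is rejected as a breaking change.
def evolve_schema(current, incoming):
    """Merge an incoming schema into the current one, allowing only additive changes."""
    merged = list(current)
    known = {f["name"]: f["type"] for f in current}
    for field in incoming:
        if field["name"] not in known:
            merged.append({**field, "mode": "NULLABLE"})  # additive: safe for readers
        elif known[field["name"]] != field["type"]:
            raise ValueError(f"breaking type change on {field['name']!r}")
    return merged

current = [{"name": "user_id", "type": "STRING"}]
incoming = [{"name": "user_id", "type": "STRING"},
            {"name": "last_login", "type": "TIMESTAMP"}]
print(evolve_schema(current, incoming))
```

Gating every source-system schema change through a check like this is one way a platform with 30+ ingestions keeps downstream reports and semantic-layer views from breaking silently.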
Analytics & Data Serving
- Expertise in BigQuery performance optimization and data serving patterns
- Experience building semantic layers and governed metrics for consistent analytics
- Familiarity with BI integration, access controls, and dashboard standards
- Understanding of data exposure patterns via views, APIs, or curated datasets
Data Governance, Quality & Metadata
- Experience implementing data catalogs, metadata management, and ownership models
- Understanding of data lineage for auditability and troubleshooting
- Strong focus on data quality frameworks, including validation, freshness checks, and alerting
- Experience defining and enforcing data contracts, schemas, and SLAs
- Familiarity with audit logging and compliance readiness
Cloud Platform Management
- Strong hands-on experience with Google Cloud Platform (GCP), including project setup, environment separation, billing, quotas, and cost controls
- Expertise in IAM and security best practices, including least-privilege access, service accounts, and role-based access
- Solid understanding of VPC networking, private access patterns, and secure service connectivity
- Experience with encryption and key management (KMS, CMEK) and security auditing
DevOps, Platform & Reliability
- Proven ability to build CI/CD pipelines for data and infrastructure workloads
- Experience managing secrets securely using GCP Secret Manager
- Ownership of observability, SLOs, dashboards, alerts, and runbooks
- Proficiency in logging, monitoring, and alerting for data pipelines and platform reliability
Good to have
Security, Privacy & Compliance
- Hands-on experience implementing fine-grained access controls for BigQuery and GCS
- Experience with VPC Service Controls and data exfiltration prevention
- Knowledge of PII handling, data masking, tokenization, and audit requirements