Senior Data Architect (Canonical Data Models and Event-Driven Architecture)
Job Description
Important: This role is W2 only. We will not accept submissions from vendor companies or third-party agencies. All visa statuses are accepted, including H1B transfers.
Job Title: Data Architect
Location: 100% Remote (EST hours)
Duration: Long Term
Engagement: W2 Only
Visa Status: Any (H1B transfers accepted)
Contact: Ram@brightmindsol.com
Job Overview
We are looking for a hands-on Data Architect to design and own a canonical, event-driven data layer that standardizes data from multiple source systems into a unified enterprise model. This role focuses on schema design, governance, data modeling, and validation in an AWS-based, microservices-driven environment.
Key Responsibilities
- Design and manage canonical event schemas across multiple systems
- Define schema registry, governance, and versioning strategies
- Build normalized data models and an enterprise data dictionary
- Design event envelopes and abstraction layers for downstream consumers
- Implement metadata management using AWS Glue Catalog
- Map source systems to standardized models (FDX JSON)
- Write and validate SQL, transformation logic, and data quality rules
- Collaborate with application, platform, and data engineering teams
- Support legacy-to-AWS data migration and pipelines
- Mentor Data Engineers and define architectural best practices
Required Skills
- 10+ years in data management; 5+ years in Data Architecture
- Strong experience with canonical data models and event-driven architecture
- AWS expertise: Glue, S3, Lambda, SQS, RDS, DynamoDB
- Microservices and containerized environments
- Strong SQL and data validation experience
- Knowledge of Data Mesh and Data Fabric concepts
Preferred Skills
- Financial services or regulated environments
- FDX JSON standards
- Kafka, EventBridge, RabbitMQ
- Python; familiarity with Java or Node.js
- Kubernetes / EKS
- AWS Certifications
Role Split:
60% Architecture & Design | 40% Hands-on Development