Senior Data Engineer

The Brixton Group, United States
Remote
AI Summary

Design and implement scalable data pipelines using AWS services. Build end-to-end AI models and agents from data preprocessing through deployment and monitoring. Lead technical delivery with client stakeholders and mentor engineering teams.


Job Description

Duration: 6+ month contract
Compensation: $75–95/hr
Location: 100% remote (EST hours)
 
Key Responsibilities
  • Design scalable data pipelines using AWS services including S3, Glue, Lambda, Kinesis, Redshift, Step Functions
  • Build end-to-end AI models and agents from data preprocessing through deployment and monitoring
  • Implement large-scale data processing solutions using platforms such as Databricks and Snowflake
  • Apply Generative AI architectures including large language model integration and retrieval-augmented generation
  • Establish CI/CD pipelines for data systems within DevOps environments
  • Lead technical delivery with client stakeholders and mentor engineering teams
 
Requirements
  • 7+ years data engineering experience
  • Experience with Amazon Web Services (AWS) data architecture
  • Experience building AI models and GenAI architectures
  • Skilled in Python programming
 
Preferred Skills
  • AWS Certified Solutions Architect or equivalent certification
  • Databricks certification
  • AWS SageMaker experience
Core Tech Environment
Airflow; AWS (Glue, Kinesis, Lambda, Redshift, S3, Step Functions); CI/CD pipelines; Databricks; DevOps practices; Generative AI (LLMs, RAG); Python; Snowflake; SQL; Spark

Job ID: 26-00412
