Lead MLOps Engineer required for a contract position to migrate production ML workloads from Databricks to AWS SageMaker. The role involves technical leadership, migration, and defining the future MLOps operating model on SageMaker. Strong experience in Databricks, AWS SageMaker, and Python-based ML workloads is essential.
Job Description
A Digital Transformation & Software Engineering consultancy immediately requires a Lead MLOps Engineer on a contract basis with expertise in Databricks and AWS SageMaker.
Contract Details:
- Job Title: Lead MLOps Engineer
- Day Rate: £625
- Location: Fully Remote
- IR35 Determination: Outside IR35
- Duration: 3 months initially (extension highly likely)
- Start Date: ASAP
Project Scope:
This is a time-bound delivery programme focused on migrating production ML workloads from Databricks to AWS SageMaker for a client operating in a regulated environment. The immediate priority is a container-first migration of existing Databricks-hosted ML workloads to AWS, with SageMaker as the default execution platform and a hard commercial deadline. In parallel, you will help define the future MLOps operating model on SageMaker, which will become business-as-usual once the migration completes.
Responsibilities:
- This is a hands-on technical leadership role where you will set patterns, review work, unblock delivery, and personally handle the most complex migrations.
- Act as the Lead MLOps Engineer delivering the migration from Databricks to AWS SageMaker.
- Own the technical direction, delivery integrity, and coordination across all technical workstreams.
- Lead and coordinate work across multiple streams (standardised migrations, complex/edge-case workloads, platform foundations), working closely with Data Engineers, Cloud Engineers, Delivery Management, and Data Science SMEs.
What you’ll be doing:
You will lead and contribute across the following areas:
- AWS SageMaker-based ML execution - Designing and operating batch processing, training, and (where appropriate) inference workloads on SageMaker.
- Databricks to SageMaker migration - Migrating Databricks notebooks, jobs, and ML workloads into containerised execution on AWS, ensuring behavioural parity and production stability.
- Python-based ML workloads - Working directly with Python-based ML codebases (e.g. scikit-learn, XGBoost, and similar libraries), refactoring only where required to support containerised execution.
- Containerised ML runtimes - Using containers to replicate Databricks runtimes, manage Python dependencies, and stabilise legacy workloads.
- ML pipelines & automation - Orchestrating end-to-end ML workflows on AWS, including batch execution, retraining, and validation.
- Monitoring, validation & governance - Implementing monitoring, logging, and validation patterns suitable for regulated production ML environments.
Essential skills & experience (must-haves):
- Proven, hands-on experience migrating ML workloads from Databricks to AWS SageMaker (this is non-negotiable).
- Strong experience building and operating Python-based ML workloads in production environments.
- Solid understanding of container-based ML execution and Python dependency management.
- Experience leading or owning technical delivery across multiple engineers and workstreams.
- Comfort working in regulated or high-governance environments where validation, auditability, and controlled change are required.