Data AI/ML Engineer

ai talent • Australia
Visa Sponsorship
AI Summary

Design and build scalable data pipelines, deploy machine learning models, and collaborate with data scientists to ensure low-latency model prediction serving. Requires expertise in cloud data architecture, production machine learning, and MLOps. 4+ years of experience in Data Engineering or ML Engineering.

Key Highlights
Design and build scalable data pipelines
Deploy machine learning models
Collaborate with data scientists
Key Responsibilities
Design and build scalable, automated data pipelines
Lead the deployment and operationalisation of machine learning models
Develop and maintain feature stores and real-time data services
Technical Skills Required
Python, TensorFlow, PyTorch, AWS Glue, EMR, SageMaker, SQL, Docker, Kubernetes

Job Description


We're partnering with a rapidly scaling, innovative leader in the Digital Media and Entertainment sector, dedicated to optimising user experience and content recommendation through cutting-edge Machine Learning. For the right candidate with the necessary skills and experience, we are pleased to offer subclass 482 visa sponsorship.

This client requires a Data AI/ML Engineer to bridge the gap between data science and production engineering. You will be instrumental in designing the MLOps platform, building robust feature pipelines, and deploying high-performance ML models (such as recommendation engines and user prediction systems) into a live, high-traffic environment. This role demands expertise in both cloud data architecture and production machine learning best practices.


What You'll Do
  • Design and build scalable, automated data pipelines (ETL/ELT) for feature engineering, training, and model serving using cloud services like AWS Glue and EMR.
  • Lead the deployment and operationalisation of machine learning models (MLOps) into production environments, utilising platforms like AWS SageMaker for continuous integration and continuous delivery (CI/CD).
  • Develop and maintain feature stores and real-time data services to ensure low-latency model prediction serving.
  • Collaborate closely with data scientists to transition experimental models into resilient, production-ready code, focusing on performance, scalability, and cost optimisation.
  • Implement monitoring and alerting for model performance, data drift, and data quality in production.
  • Champion MLOps and DevSecOps practices for the ML platform, ensuring code quality, security, and reproducibility across the entire model lifecycle.
  • Contribute to architectural decisions for the overall data and ML infrastructure.


What You'll Bring
  • 4+ years of professional experience in Data Engineering or ML Engineering, with a proven track record of deploying models into production.
  • Expert proficiency in Python and deep experience with ML frameworks such as TensorFlow or PyTorch.
  • Mandatory hands-on experience with AWS cloud services for data and ML (e.g., SageMaker, EMR, S3, Lambda).
  • Strong experience with the MLOps lifecycle and tools for model management, versioning, and monitoring.
  • Expert-level SQL proficiency and solid understanding of data warehousing and data lake architectures.
  • Familiarity with containerisation (Docker) and orchestration (Kubernetes) for model deployment.
  • Excellent communication skills, with the ability to articulate complex technical requirements to data scientists and software engineers.


