AI Summary
Join a complex integration program as a Senior Data Engineer. Build and maintain ETL pipelines using Azure Data Factory and related Azure services. Collaborate with cross-functional teams to ensure smooth execution under a Waterfall framework.
Key Highlights
Build and maintain ETL pipelines using Azure Data Factory and related Azure services
Support integration across four club environments, ensuring data consistency, quality, and reliability
Collaborate with cross-functional teams to ensure smooth execution under a Waterfall framework
Technical Skills Required
Benefits & Perks
Casual Loading (25%)
Remote work (fully remote, based in Adelaide preferred)
Job Description
- 3-Month Contract Assignment
- Meaningful piece of work
- Based in Adelaide preferred, can be fully remote
You’ll be joining a complex and crucial integration program. The work is challenging, occasionally ambiguous, and requires someone who brings emotional intelligence, resilience, and the ability to collaborate deeply with cross‑functional teams.
You’ll be supporting a highly capable Lead Data Engineer who needs a strong “doer” to help manage the volume of work and keep delivery moving.
This contract assignment is for three months.
Key Responsibilities
- Build and maintain ETL pipelines using Azure Data Factory and related Azure services.
- Support integration across four club environments, ensuring data consistency, quality, and reliability.
- Conduct discovery and analysis activities to understand data flows, dependencies, and constraints.
- Work closely with the Lead Data Engineer and technical delivery lead to ensure smooth execution under a Waterfall framework.
- Implement security and governance controls across data pipelines and environments.
- Monitor, troubleshoot, and optimise pipelines using Azure-native tools.
- Collaborate with cross‑functional stakeholders with empathy, clarity, and strong communication.
Azure Data Services
- Azure Data Factory (ADF)
- SQL Server Managed Instance
- Azure Integration Runtime (IR) and Self-Hosted Integration Runtime (SHIR)
Data Engineering
- Data modelling (relational and non-relational)
- Building ETL pipelines in ADF
- Experience transferring data to platforms such as Snowflake or SQL Server
- SQL (essential)
- Stored procedures and triggers
- Python (widely used)
Security & Governance
- Private Link
- RBAC
- Data masking
- Encryption
- Compliance policies
Monitoring & Optimisation
- Azure Monitor
- Log Analytics
- Query tuning
DevOps & Automation
- Azure DevOps (CI/CD, pipeline automation)
- Git/GitHub
- Terraform or Bicep (nice to have)
Certifications
- Microsoft Certified: Azure Data Engineer Associate
What We're Looking For
- Someone who thrives in a lean, hands‑on environment.
- A strong communicator with high EQ who can navigate complexity and ambiguity.
- A resilient problem‑solver who enjoys discovery, analysis, and cross‑team collaboration.
- A genuinely good human who contributes to a positive team culture.
(Tip for a successful application: before applying, please carefully review the requirements of this role to ensure you are eligible. To support a meaningful assessment of your fit, your resume should provide clear, detailed examples of your contributions, measurable impact, and relevant commercial experience that align with the role's criteria. Please be aware that we rely solely on the information presented in your application; if specific experience or achievements are not included, we are unable to infer or assume them.)
Lastly, we will only be able to consider candidates who have full work rights in Australia.
Diversity, Equity & Inclusion at Hudson
Hudson is committed to helping you find a workplace where you feel respected, supported, and free to thrive. We welcome applications from all backgrounds, identities, and lived experiences—because when different voices come together, amazing things happen.
Casual Loading
Please note that for all Australian-based contract and temporary roles, the pay rate is inclusive of the mandatory 25% casual loading. This excludes permanent and fixed-term roles.
Profession
- IT, Technology & Digital, Data Analysts