Senior Cloud Data Engineer

ai talent • Australia
Visa Sponsorship
AI Summary

Design, build, and maintain robust data pipelines within a Cloud environment.

Key Responsibilities
Designing, building, and maintaining robust, scalable data pipelines within a Cloud environment
Bridging the gap between raw data collection and high-level business intelligence
Ensuring that data is "AI-ready" for advanced analytics teams
Technical Skills Required
Cloud Platform (Microsoft Azure or AWS), Azure Data Factory, Synapse, Databricks, Terraform, Bicep, Python, PySpark, SQL, Git, Azure DevOps, Apache Spark, Snowflake, Power BI, Data Lake Storage (ADLS Gen2), dbt (data build tool), GitHub Actions
Benefits & Perks
482 Visa sponsorship
Competitive salary range
Perks not explicitly mentioned

Job Description


As a Senior Cloud Data Engineer, you will be responsible for designing, building, and maintaining robust, scalable data pipelines within a Cloud environment. 482 Visa sponsorship is available for the right candidate. You will bridge the gap between raw data collection and high-level business intelligence, ensuring that data is "AI-ready" for advanced analytics teams. This role requires a hands-on developer who understands the nuances of Perth's large-scale industrial data sets.


📋 Core Responsibilities
  • Pipeline Development: Build and automate complex ETL/ELT pipelines using Azure Data Factory, Synapse, or Databricks.
  • Data Architecture: Implement "Lakehouse" architectures (Medallion: Bronze/Silver/Gold) to manage structured and unstructured data.
  • Infrastructure as Code (IaC): Deploy and manage cloud data infrastructure using Terraform or Bicep.
  • Code Quality: Write production-grade Python (PySpark) and SQL code, ensuring all pipelines are version-controlled via Git/Azure DevOps.
  • Data Modelling: Design scalable data models (Star Schema/Data Vault) optimized for Snowflake or Power BI consumption.
  • Security & Compliance: Ensure all data solutions adhere to Australian data sovereignty and security standards (AES/IRAP).
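
For illustration, the Medallion (Bronze/Silver/Gold) layering named above can be sketched in plain Python. The sensor records and cleaning rules here are hypothetical, and in a real pipeline each step would run as a PySpark job on Databricks or Synapse rather than over Python lists:

```python
# Minimal Medallion-architecture sketch (hypothetical data; plain Python
# stands in for PySpark/Databricks DataFrames).

# Bronze: raw records land as-is, including duplicates and bad rows.
bronze = [
    {"sensor_id": "P-101", "reading": "42.5", "ts": "2024-01-01T00:00:00"},
    {"sensor_id": "P-101", "reading": "42.5", "ts": "2024-01-01T00:00:00"},  # duplicate
    {"sensor_id": "P-102", "reading": "n/a",  "ts": "2024-01-01T00:05:00"},  # unparseable
]

def to_silver(rows):
    """Silver: deduplicate and enforce types, dropping rows that fail parsing."""
    seen, out = set(), []
    for r in rows:
        key = (r["sensor_id"], r["ts"])
        try:
            reading = float(r["reading"])
        except ValueError:
            continue  # a real pipeline would quarantine these; dropped here for brevity
        if key not in seen:
            seen.add(key)
            out.append({"sensor_id": r["sensor_id"], "reading": reading, "ts": r["ts"]})
    return out

def to_gold(rows):
    """Gold: business-level aggregate (mean reading per sensor)."""
    totals = {}
    for r in rows:
        s, n = totals.get(r["sensor_id"], (0.0, 0))
        totals[r["sensor_id"]] = (s + r["reading"], n + 1)
    return {k: s / n for k, (s, n) in totals.items()}

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'P-101': 42.5}
```

Each layer only ever reads from the layer below it, which is what keeps the raw (Bronze) data replayable when cleaning or aggregation logic changes.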

✅ Technical Requirements
  • Cloud Platform: 3+ years of expert-level experience in Microsoft Azure (highly preferred in Perth) or AWS.
  • Processing: Deep experience with Apache Spark and Databricks.
  • Languages: Mastery of SQL and Python.
  • Transformation: Proficiency with dbt (data build tool) for modular SQL modelling.
  • Storage: Extensive experience with Data Lake Storage (ADLS Gen2) and Snowflake.
  • CI/CD: Practical experience with Azure DevOps or GitHub Actions for automated deployments.


