Manager - Databricks Solutions Architect

Deloitte • United States
AI Summary

Deliver end-to-end data engineering solutions using Databricks on major cloud platforms. Lead innovation in big data architecture and analytics. Drive measurable results.

Key Highlights
Lead the development, implementation, and scaling of advanced data engineering solutions using Databricks
Establish best-in-class approaches for data architecture, integration, and modeling
Oversee the design, development, and maintenance of robust data pipelines and data architectures
Key Responsibilities
Architect and deliver solutions using Databricks
Champion best practices for data architecture, integration, and modeling
Own data pipelines and drive excellence in data quality, operational efficiency, and scalability
Provide technology leadership and strategic data governance
Lead and mentor teams, engage stakeholders, and oversee DevOps and automation
Technical Skills Required
Databricks, AWS, Azure, GCP, cloud-native databases, storage solutions, distributed compute platforms, Lakehouse architecture, Apache Spark, Delta Lake, Delta Live Tables, Autoloader, Structured Streaming, Databricks Workflows, Apache Airflow, PySpark, Databricks Unity Catalog, CI/CD pipelines, Azure DevOps, AWS CodePipeline, Jenkins, TFS, PowerShell
Benefits & Perks
Annual salary: $130,800-$241,000
Discretionary annual incentive program
Limited immigration sponsorship available
Nice to Have
Comprehensive knowledge of AWS, Azure, and GCP cloud ecosystems
Demonstrated skill in performance tuning and optimization within Databricks/Apache Spark environments
Experience with Databricks Lakeflow
Experience in AI/ML

Job Description


Job Title: Manager - Databricks

Job Summary

As a Manager, you will oversee the end-to-end design, deployment, and optimization of enterprise-scale data engineering solutions using Databricks on any major cloud platform (AWS, Azure, or GCP). This highly strategic role focuses on leading innovation in big data architecture and analytics, shaping best practices, advising senior stakeholders, and ensuring that data solutions align with business objectives and drive measurable results.

You will lead engagement delivery, bringing the breadth and depth of Deloitte's capabilities and talent to deliver technical solutions that allow clients to achieve their business strategy. You will lead engagement planning and budgeting; mobilize and manage engagement teams; define deliverable structure and content; facilitate buy-in of proposed solutions from top management levels at the client; direct on-time, quality delivery of work products; and manage engagement economics and risk.

Recruiting for this role ends on 2/22/2026.

Key Responsibilities

  • Architect and Deliver Solutions: Lead the development, implementation, and scaling of advanced data engineering solutions using Databricks across AWS, Azure, or GCP environments.
  • Champion Best Practices: Establish, document, and promote best-in-class approaches for data architecture, integration, and modeling.
  • Pipeline Ownership: Oversee the design, development, and maintenance of robust data pipelines and data architectures that support large-scale, enterprise data needs.
  • Drive Excellence: Initiate and manage efforts to improve data quality, operational efficiency, and process scalability.
  • Technology Leadership: Evaluate, pilot, and integrate new big data and analytics technologies, ensuring the organization remains at the cutting edge.
  • Strategic Data Governance: Consult on, design, and implement governance, security, and compliance strategies tailored to modern cloud data ecosystems.
  • Team Leadership and Mentoring: Lead, coach, and develop teams of data engineers and architects, fostering technical growth and effective project delivery.
  • Stakeholder Engagement: Communicate technical concepts and business value to diverse stakeholders, including executives, business leads, and technology teams.
  • DevOps and Automation: Oversee the implementation of CI/CD practices with tools such as Azure DevOps, AWS CodePipeline, Jenkins, TFS, or PowerShell for streamlined deployments and operations.

Qualifications:

Required:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field (Master's preferred).
  • 6+ years of hands-on experience in data engineering with a strong focus on Databricks, deployed on any major cloud (AWS, Azure, GCP).
  • Technical proficiency (minimum of 6 years) in the following:
    • Expertise with cloud-native databases, storage solutions, and distributed compute platforms.
    • Deep understanding of Lakehouse architecture, Apache Spark, Delta Lake, and related big data technologies.
    • Advanced skills in data warehousing, 3NF, dimensional modeling, and enterprise-level data lakes.
    • Experience with Databricks components including Delta Live Tables, Autoloader, Structured Streaming, Databricks Workflows, and orchestration tools (e.g., Apache Airflow).
    • Expertise in designing and supporting incremental data loads and building metadata-driven ingestion/data quality frameworks using PySpark.
    • Hands-on experience with Databricks Unity Catalog and implementing fine-grained security and access control.
    • Proven track record in deploying code and solutions via automated CI/CD pipelines.
  • A minimum of 4 years of leadership experience managing complex, cross-functional data projects and technical teams.
  • Experience with performance optimization of data engineering pipelines, code, and compute resources.
  • Ability to travel up to 50%, on average, based on the work you do and the clients and industries/sectors you serve.
  • Limited immigration sponsorship may be available.
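The metadata-driven ingestion frameworks named in the qualifications above can be illustrated with a minimal sketch. This is not any particular firm's implementation: the `IngestionSpec` fields and `build_stream_options` helper are assumptions for illustration, though the `cloudFiles.*` option names are real Databricks Autoloader options. In a live pipeline the assembled options would be passed to `spark.readStream.format("cloudFiles")`.

```python
from dataclasses import dataclass


@dataclass
class IngestionSpec:
    """One metadata record describing a single source-to-table load."""
    source_path: str   # cloud storage location to ingest from
    file_format: str   # e.g. "json", "csv", "parquet"
    target_table: str  # Delta table to write into
    checkpoint: str    # checkpoint location for exactly-once streaming


def build_stream_options(spec: IngestionSpec) -> dict:
    """Translate one metadata record into Autoloader stream options.

    In a real pipeline these options would configure a
    spark.readStream.format("cloudFiles") call; here we only assemble them.
    """
    return {
        "cloudFiles.format": spec.file_format,
        "cloudFiles.schemaLocation": spec.checkpoint + "/schema",
        "path": spec.source_path,
    }


# The "metadata table": onboarding a new source is a new row, not new code.
specs = [
    IngestionSpec("s3://raw/orders", "json", "bronze.orders", "/chk/orders"),
    IngestionSpec("s3://raw/events", "csv", "bronze.events", "/chk/events"),
]

options = [build_stream_options(s) for s in specs]
```

The point of the pattern is that ingestion logic is written once and parameterized by metadata, which is what makes such frameworks scale across hundreds of tables.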

Preferred:

  • Comprehensive knowledge of the AWS, Azure, and GCP cloud ecosystems and their associated big data stacks.
  • Demonstrated skill in performance tuning and optimization within Databricks/Apache Spark environments.
  • Stays current with the latest Databricks feature releases and platform enhancements.
  • Exceptional communication and stakeholder management abilities, including comfort interfacing with executive leadership.
  • Experience with Databricks Lakeflow is a plus.
  • Experience in AI/ML is a plus.
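The Unity Catalog fine-grained access control named in the required qualifications comes down to issuing GRANT statements per principal and privilege. A minimal sketch, assuming a hypothetical `grant_statements` helper and example catalog/schema names; in Databricks the generated strings would be executed via `spark.sql()`.

```python
def grant_statements(table: str, policy: dict) -> list:
    """Compose Unity Catalog GRANT statements from an access-policy spec.

    `policy` maps a group name to a list of privileges,
    e.g. {"analysts": ["SELECT"]}.
    """
    stmts = []
    for group, privileges in policy.items():
        for priv in privileges:
            # Unity Catalog syntax: GRANT <privilege> ON TABLE <table> TO <principal>
            stmts.append(f"GRANT {priv} ON TABLE {table} TO `{group}`")
    return stmts


# Example policy for a hypothetical three-level namespace table.
stmts = grant_statements(
    "main.sales.orders",
    {"analysts": ["SELECT"], "engineers": ["SELECT", "MODIFY"]},
)
```

Driving grants from a declarative policy spec keeps access control auditable and repeatable across environments, which is the governance posture the role describes.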

The wage range for this role takes into account the wide range of factors that are considered in making compensation decisions including but not limited to skill sets; experience and training; licensure and certifications; and other business and organizational needs. The disclosed range estimate has not been adjusted for the applicable geographic differential associated with the location at which the position may be filled. At Deloitte, it is not typical for an individual to be hired at or near the top of the range for their role and compensation decisions are dependent on the facts and circumstances of each case. A reasonable estimate of the current range is $130,800-$241,000.

You may also be eligible to participate in a discretionary annual incentive program, subject to the rules governing the program, whereby an award, if any, depends on various factors, including, without limitation, individual and organizational performance.

Information for applicants with a need for accommodation: https://www2.deloitte.com/us/en/pages/careers/articles/join-deloitte-assistance-for-disabled-applicants.html
