DataOps Engineer

IDEA Public Schools, United States
Relocation

Job Description



Mission

The DataOps Engineer designs and maintains the automation, testing, and deployment infrastructure that allows IDEA’s data platform to operate reliably at scale. This role applies DevOps principles to data systems: eliminating manual processes, improving deployment safety, and enabling the Platform and Analytics Engineering teams to ship faster without sacrificing reliability.

Reporting to the Managing Director of Data Platform & Engineering, this engineer serves as the organization’s automation and reliability expert, partnering closely with Platform Engineering (infrastructure and ingestion) and Analytics Engineering (dbt transformations and data quality).

This role is ideal for someone who enjoys building systems, tooling, and automation that make other engineers dramatically more effective.

Supervisory Responsibilities

Individual contributor role with no direct reports. Acts as a peer technical expert across Platform and Analytics Engineering teams.

Location

This is a full-time position based in Texas, with preference given to candidates who live in Austin, El Paso, Houston, the Permian Basin (Midland/Odessa), the Rio Grande Valley, San Antonio, or Tarrant County (Fort Worth), or who are willing to relocate.

Travel Expectations

Minimal travel (5–10% annually) for collaboration, training, or critical implementation milestones.

Essential Duties

What You’ll Do – Accountabilities

  • Design, build, and maintain CI/CD pipelines for data workflows, enabling automated testing, validation, and safe deployment to production.
  • Implement deployment automation for dbt projects, Snowflake infrastructure, and data ingestion tools with proper environment promotion and approval gates.
  • Develop safe deployment strategies (rollback, canary, blue/green) that reduce risk and downtime.
  • Maintain CI/CD tooling, documentation, and runbooks to ensure reliability and team adoption.
  • Own infrastructure-as-code for the data platform, using Terraform to provision and manage Snowflake environments and related resources.
  • Automate operational tasks such as environment setup, access provisioning, configuration management, and resource monitoring.
  • Build reusable automation tools, scripts, and templates that enable self-service provisioning and reduce manual toil.
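The deployment-safety duties above often reduce to small, testable automation helpers. Below is a minimal sketch, in Python, of an approval-gate check a CI/CD pipeline might run before promoting a release to production; the function, check names, and gate semantics are illustrative assumptions, not IDEA's actual tooling:

```python
from dataclasses import dataclass


@dataclass
class DeployCheck:
    """Outcome of one pre-promotion gate (e.g. dbt tests, schema validation)."""
    name: str
    passed: bool


def promotion_decision(checks: list[DeployCheck]) -> str:
    """Return "promote" only when every gate passed, otherwise "rollback".

    Mirrors the approval-gate and rollback-first strategies described
    above: a single failing check blocks promotion to production.
    """
    failed = [c.name for c in checks if not c.passed]
    return "rollback" if failed else "promote"


if __name__ == "__main__":
    checks = [
        DeployCheck("dbt_tests", passed=True),
        DeployCheck("schema_validation", passed=False),
    ]
    print(promotion_decision(checks))  # prints "rollback"
```

In practice a gate like this would sit between environment promotion steps, so a canary or blue/green cutover only proceeds once all checks report green.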

Additional Duties And Responsibilities

  • Partner with Platform Engineering to align on IaC standards, patterns, and shared modules.
  • Design and implement monitoring and observability for data pipelines, dbt models, and platform health.
  • Build dashboards and alerts tracking pipeline success, data freshness, test results, and system performance.
  • Implement intelligent alerting with clear escalation paths and minimal noise.
  • Establish incident response practices, runbooks, and post-incident learning loops.
  • Implement automated testing frameworks for data pipelines, including schema validation, regression testing, and data contract checks.
  • Enable automated execution and reporting of data quality tests in CI/CD workflows.
  • Partner with Analytics Engineering to standardize dbt test patterns and quality enforcement strategies.
  • Maintain test environments, fixtures, and safe testing workflows for production-bound changes.
  • Serve as a DataOps advisor to Platform and Analytics Engineering teams, identifying automation opportunities and reducing friction.
  • Review pull requests with a focus on deployment safety, testing coverage, and operational risk.
  • Document DataOps standards and best practices, enabling scalable self-service adoption.
  • Partner with the Managing Director on DataOps strategy, tooling decisions, and technical roadmap.

Knowledge and Skills – Competencies

  • Make Strategic Decisions: This team member fosters an inclusive decision-making environment by encouraging diverse perspectives, managing disagreements constructively, and creating space for all voices to be heard. They clearly communicate final decisions, providing context and anticipated impact—even when choices are difficult or unpopular.
  • Manage Work and Teams: This team member builds and maintains systems to track progress toward team goals, ensuring clarity through defined roles and responsibilities. They implement structured processes that support smooth team operations and strategically allocate time and resources to drive goal achievement.
  • Grow Self and Others: This team member uses data to assess development needs and designs learning opportunities that align with team goals and individual career growth. They model a growth mindset by being open about their own development and ensure that both personal and team-led learning initiatives are impactful and well-aligned with organizational priorities.
  • Build a Culture of Trust: This team member fosters a team culture where individuals genuinely care for one another both personally and professionally. They lead with transparency, encourage open communication (including healthy conflict), and promote reliability and consistency, while regularly seeking and responding to team feedback to enhance the collective experience.
  • Communicate Deliberately: This team member leads inclusive discussions that surface obstacles and drive actionable solutions, ensuring all voices are heard. They communicate key information clearly across multiple channels and establish feedback loops that promote open dialogue, collaboration, and continuous improvement.

Required Skills

  • Hands-on experience building and maintaining CI/CD pipelines in production environments.
  • Strong proficiency with infrastructure-as-code tools (Terraform or equivalent).
  • Solid programming and scripting skills (Python, shell) for automation and tooling.
  • Experience operating production systems with monitoring, alerting, and incident response.
  • Familiarity with modern data platforms and analytics engineering workflows (e.g., Snowflake, dbt).

Preferred Skills

  • Experience supporting data pipelines, analytics platforms, or dbt deployments.
  • Knowledge of Snowflake administration and performance considerations.
  • Familiarity with data ingestion tools and APIs (e.g., Fivetran, Airbyte).
  • Exposure to containerization or SRE practices.
  • Experience working in regulated or privacy-sensitive environments (education preferred).

Required Education And Experience

  • Bachelor’s degree in Computer Science, Engineering, Information Systems, or related field, or equivalent practical experience.
  • 5+ years of experience in DevOps, platform engineering, SRE, or similar technical roles.
  • Demonstrated experience automating deployments, infrastructure, and operational workflows with measurable impact.

Preferred Education And Experience

  • Experience supporting data platforms, analytics engineering workflows, or large-scale data pipelines in production environments.
  • Hands-on experience with Snowflake administration, performance tuning, or environment management.
  • Experience deploying or supporting dbt projects, including CI/CD integration and testing automation.
  • Familiarity with data ingestion tools and APIs (e.g., Fivetran, Airbyte) and automating their configuration or deployment.
  • Exposure to containerization or site reliability engineering (SRE) practices.
  • Experience working in regulated or privacy-sensitive environments (education, healthcare, or public sector).
  • Relevant certifications (e.g., Terraform Associate, cloud DevOps certifications) or demonstrated equivalent expertise.
  • Contributions to open-source tooling, internal developer platforms, or shared automation frameworks.

Physical Requirements

  • Prolonged periods working on a computer
  • Ability to participate in virtual meetings and team collaborations across Platform and Analytics Engineering teams
  • Flexibility for occasional evening/weekend work during critical deployments or automation maintenance windows

What We Offer

Compensation & Benefits:

Salaries for people entering this role typically fall between $89,600 and $105,300, commensurate with relevant experience and qualifications and in alignment with internal equity. This role is also eligible for performance pay based on organizational performance and goal attainment.

Additionally, we offer medical, dental, and vision plans, disability, life insurance, parenting benefits, flexible spending account options, generous vacation time, referral bonuses, professional development, and a 403(b) plan. You can find more information about our benefits at https://ideapublicschools.org/careers/benefits/.

  • IDEA may offer a relocation stipend to defray the cost of moving for this role, if applicable.

Application Process

Submit your application online through Jobvite. Please note that applications will be reviewed on an ongoing basis until the position is filled. Applicants are encouraged to apply as early as possible.

Learn more about IDEA

At IDEA, the Staff Experience Team uses our Core Values to promote human connection and a culture of integrity, respect, and belonging for all Team and Family members. Learn more about our Commitment to Core Values here: https://ideapublicschools.org/our-story/#core-values

IDEA Public Schools does not discriminate on the basis of race, color, national origin, age, sex, or disability in admission or access to, or treatment or employment in, its programs and activities. Any person having inquiries concerning the organization's compliance with the regulations implementing Title VI of the Civil Rights Act of 1964 (Title VI), Section 504 of the Rehabilitation Act of 1973 (Section 504), or Title II of the Americans with Disabilities Act of 1990 (ADA) may contact IDEA Human Resources at (956) 377-8000.
