Data Platform Engineer
This is a vacancy for the 2026–27 school year, with a target start date of July 1, 2026.
Mission
The Data Platform Engineer builds and operates IDEA’s data infrastructure on Snowflake, enabling reliable, scalable access to data that supports analytics, reporting, and research across multiple states. This role designs automated ingestion pipelines, optimizes platform performance and cost, and ensures the data platform functions as a production-grade system for downstream teams.
Reporting to the Manager of Data Platform Engineering, this engineer works hands-on with ELT pipelines, infrastructure-as-code, and Snowflake administration while contributing to IDEA’s transition from legacy ETL systems to a modern lakehouse architecture.
Supervisory Responsibilities
Individual contributor role with no direct reports. Senior engineers may mentor peers and lead technical initiatives.
Location:
This is a full-time remote position based in Texas, with preference given to candidates who live in Austin, El Paso, Houston, the Permian Basin (Midland/Odessa), the Rio Grande Valley, San Antonio, or Tarrant County (Fort Worth), or who are willing to relocate.
Travel Expectations
Minimal travel (5–10% annually) for collaboration, training, or critical implementation milestones.
Essential Duties
What You’ll Do – Accountabilities
- Design, build, and maintain automated ELT pipelines ingesting data from diverse source systems into Snowflake.
- Configure and manage cloud-native ingestion tools and custom Python-based pipelines when needed.
- Build and maintain Bronze-layer tables with schema evolution handling, audit metadata, and lineage.
- Implement ingestion-level validation and monitoring to catch issues early.
- Document source configurations, refresh schedules, and troubleshooting procedures.
- Partner with Analytics Engineering to ensure ingestion patterns support downstream transformation needs.
- Administer Snowflake environments, including databases, schemas, warehouses, access controls, and security settings.
- Optimize performance and cost through warehouse sizing, clustering, query analysis, and resource monitoring.
- Manage Snowflake objects using infrastructure-as-code patterns.
- Implement security best practices including RBAC, encryption, auditing, and network policies.
- Evaluate and adopt new Snowflake capabilities as appropriate.
- Own Terraform-based infrastructure definitions for Snowflake and related platform components.
- Automate recurring operational tasks such as provisioning, access grants, and environment setup.
- Build CI/CD pipelines for infrastructure changes with testing and safe deployment practices.
- Develop reusable templates and modules to accelerate onboarding of new sources and domains.
- Maintain clear documentation and runbooks for platform operations.
- Implement monitoring and alerting for pipelines, platform health, and performance.
- Troubleshoot pipeline failures and platform issues using systematic root-cause analysis.
- Embed observability (logging, metrics, alerts) into all production pipelines.
- Collaborate closely with Analytics Engineering, DataOps, and Data Governance partners.
- Participate in code reviews and design discussions.
- Share platform knowledge through documentation, mentoring, and team forums.
- Contribute to retrospectives and continuous improvement efforts.
Leadership Competencies
- Make Strategic Decisions: This team member uses data, feedback, and insights to inform thoughtful decision-making, while considering the impact on their direct reports and team. They communicate decisions with clear rationale and begin to connect their choices to broader team objectives.
- Manage Work and Teams: This team member sets clear, measurable goals and regularly reflects on progress, adjusting actions as needed. They prioritize work aligned with their goals using a task management system and consistently meet deadlines through effective time management.
- Grow Self and Others: This team member regularly offers affirming and adjusting feedback, maintaining a positive balance that reinforces growth and motivation. They provide transparent, candid performance insights and offer consistent coaching and development aligned with individual goals, supporting both direct reports and cross-functional partners.
- Build a Culture of Trust: This team member proactively builds strong personal and professional relationships with individual stakeholders and regularly seeks feedback to improve their work experience. They create a supportive environment where others feel safe to take risks and learn from mistakes without fear of retribution.
- Communicate Deliberately: This team member communicates thoughtfully by anticipating potential misunderstandings and providing necessary context to ensure clarity. They leverage structured communication channels to address challenges, ask meaningful questions, and guide conversations toward solutions, while actively listening to the concerns of others.
Additional Skills:
- Hands-on experience administering Snowflake or similar cloud data platforms.
- Strong SQL skills for data extraction, validation, and performance tuning.
- Experience building and operating automated data pipelines.
- Proficiency with Python and scripting for automation and operational tooling.
- Experience operating production systems with monitoring and incident response.
- Familiarity with infrastructure-as-code and CI/CD concepts.
- Experience with cloud-native ingestion tools (e.g., Fivetran, Airbyte).
- Experience with Terraform or similar IaC tooling.
- Familiarity with dbt and analytics engineering workflows.
- Exposure to orchestration, data quality, or observability tools.
- Experience with education, public sector, or regulated data environments.
- Bachelor’s degree in a technical field or equivalent practical experience.
- 3+ years of experience in data engineering, platform engineering, or related roles.
- Demonstrated experience building and operating production data systems.
- Hands-on experience with Snowflake or comparable cloud data warehouses.
- Snowflake or cloud platform certifications.
- Experience supporting multi-team data platforms at scale.
- Strong Python proficiency beyond basic scripting.
- Prolonged periods working on a computer and in virtual meetings.
- Ability to travel domestically via car and air to campuses and the state office.
- Flexibility for occasional evening meetings with distributed stakeholders across time zones.
Compensation & Benefits:
Salaries for people entering this role typically fall between $89,600 and $105,300, commensurate with relevant experience and qualifications and in alignment with internal equity. This role is also eligible for performance pay based on organizational performance and goal attainment.
Additionally, we offer medical, dental, and vision plans, disability, life insurance, parenting benefits, flexible spending account options, generous vacation time, referral bonuses, professional development, and a 403(b) plan. You can find more information about our benefits at https://ideapublicschools.org/careers/benefits/.
- IDEA may offer a relocation stipend to defray the cost of moving for this role, if applicable.
Submit your application online through Jobvite. Please note that applications will be reviewed on an ongoing basis until the position is filled. Applicants are encouraged to apply as early as possible.
Learn more about IDEA
At IDEA the Staff Experience Team uses our Core Values to promote human connection and a culture of integrity, respect, and belonging for all Team and Family members. Learn more about our Commitment to Core Values here: https://ideapublicschools.org/our-story/#core-values