Senior Data Engineer
Fully Remote (U.S.-based)
We’re looking for a Senior Data Engineer to join a fast-moving data team building scalable, modern data infrastructure. This is a full-time remote position open to candidates based anywhere in the U.S.
What You’ll Do
- Design, build, and optimize data pipelines and models using dbt, Snowflake, and AWS.
- Develop robust ETL/ELT workflows to ensure data reliability and performance at scale.
- Collaborate with software and analytics teams to implement best practices for data architecture, CI/CD pipelines, and version control.
- Contribute to data infrastructure strategy—focusing on scalability, automation, and maintainability.
- Apply strong software engineering fundamentals toward data transformation and pipeline design.
What We’re Looking For
- 5+ years of experience as a Data Engineer or Software Engineer building production-grade data systems.
- Deep hands-on experience with dbt, Snowflake, AWS (Glue, S3, Lambda, etc.), and Python.
- Solid understanding of data modeling, performance optimization, and distributed data processing.
- Strong software engineering foundation — experience with code reviews, testing, CI/CD, and infrastructure-as-code.
- Proven ability to work autonomously in a fully remote environment and communicate clearly across teams.
Nice to Have
- Experience with orchestration tools (Airflow, Dagster, Prefect).
- Familiarity with modern DevOps practices and containerization (Docker, ECS, Kubernetes).
- Interest in building scalable pipelines and data platforms, not just reporting environments.