Job Description
Job Title: Senior Data Engineer
Location: Fully Remote (must work Pacific Time hours)
Compensation: $195,000 base salary
Work Authorization: U.S. work authorization required — no sponsorship available
Overview
We are seeking a highly skilled Senior Data Engineer to design, build, and optimize scalable data infrastructure that powers analytics, reporting, and data-driven decision-making across the organization. This role is ideal for an engineer who thrives in modern data environments, understands large-scale data architecture, and has deep experience building robust ETL/ELT pipelines within a Lakehouse ecosystem.
You will play a key role in developing reliable data platforms, implementing governance standards, and enabling high-performance analytics through strong data modeling and transformation practices.
This is a fully remote role that requires working Pacific Time business hours.
Key Responsibilities
- Design, build, and maintain scalable ETL/ELT data pipelines supporting analytics and operational workloads.
- Develop and optimize complex SQL transformations and large-scale data processing workflows.
- Implement and manage modern data architectures supporting analytics engineering and self-service reporting.
- Design and maintain robust data models using established methodologies (e.g., Kimball, Data Vault, semantic modeling).
- Optimize query performance and ensure efficient processing of large data volumes.
- Build and maintain Lakehouse architecture components using modern open table formats and distributed query engines.
- Implement and enforce data governance standards, including data quality, lineage, and documentation.
- Collaborate with analytics, product, and data science teams to deliver trusted and scalable data solutions.
- Contribute to platform reliability, monitoring, and performance tuning across the data ecosystem.
Required Qualifications
- Expert-level SQL skills and strong Python development experience.
- Proven experience building robust, production-grade ETL/ELT workflows.
- Advanced experience with Snowflake and dbt for data transformation and analytics engineering.
- Deep expertise in modern data modeling techniques, including Kimball, Data Vault, and semantic layer design.
- Demonstrated experience tuning and optimizing performance of large-scale analytical queries.
- Hands-on experience working within Lakehouse architectures using open table formats and distributed query engines (e.g., Iceberg, Trino, or similar technologies).
- Strong experience implementing data governance practices, including data quality frameworks, metadata management, and lineage tracking.