Snowflake Data Engineer

IntePros • United States
Remote
Technical Skills Required
Snowflake, SQL, Python, ELT, DAG-based orchestration, AWS S3, PostgreSQL

Job Description


Snowflake Data Engineer

Location: 100% Remote (U.S. only)

Work Hours: EST

IntePros is seeking a highly skilled Snowflake Data Engineer to support and enhance a Snowflake-based data warehouse environment within the mortgage and real estate domain. This role is heavily focused on Snowflake, SQL, and Python, with an emphasis on orchestrating ELT workflows directly within the Snowflake platform.

The ideal candidate will have hands-on experience building and optimizing Snowflake pipelines using streams, dynamic tables, stored procedures, and DAG-based orchestration, while working with complex, imperfect datasets in a fast-moving, production environment.

Key Responsibilities

  • Support and enhance a Snowflake data warehouse, with a strong focus on performance, scalability, and reliability
  • Design and maintain ELT pipelines that move data across multiple layers (raw, processed, curated) within Snowflake
  • Develop and optimize heavy SQL workloads, including complex transformations and query tuning
  • Implement and manage Snowflake streams, dynamic tables, tasks, and stored procedures
  • Use Python to orchestrate workflows, build DAGs, and automate data processes within and around Snowflake
  • Process inbound data files, converting formats (e.g., flat files → Parquet) and loading data into S3 and Snowflake
  • Support application-side data needs, including building data pipelines for a pricing engine integrated with third-party suppliers
  • Handle imperfect, incomplete, or incorrect data, improving data quality, validation, and processing logic
  • Partner closely with data engineers, application engineers, and other technical teams to deliver scalable solutions
  • Contribute to DevOps and deployment practices related to data pipelines and Snowflake environments
  • Document data flows, transformations, and Snowflake use cases
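To give candidates a concrete sense of the orchestration work above, the layered ELT flow (raw → processed → curated, executed in dependency order) can be sketched in plain Python. The layer and task names here are hypothetical examples, not this team's actual pipeline; in production the equivalent would typically be expressed as Snowflake tasks or a dedicated orchestrator.

```python
from graphlib import TopologicalSorter  # stdlib DAG helper, Python 3.9+

# Hypothetical ELT DAG: each task maps to the set of tasks it depends on.
dag = {
    "load_raw": set(),                          # ingest inbound files into the raw layer
    "transform_processed": {"load_raw"},        # raw -> processed
    "build_curated": {"transform_processed"},   # processed -> curated
    "refresh_pricing_feed": {"build_curated"},  # application-facing output
}

def run_pipeline(dag, runner):
    """Execute tasks in topological (dependency) order.

    `runner` is called once per task name; here it would wrap the
    actual Snowflake SQL or Python step for that layer.
    """
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        runner(task)
    return order

if __name__ == "__main__":
    print(run_pipeline(dag, lambda task: None))
```

The same dependency structure is what Snowflake's `AFTER` clause on tasks, or a dynamic table's refresh graph, encodes declaratively inside the platform.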

Required Qualifications

  • 2+ years of hands-on experience working with Snowflake in a production environment
  • Strong expertise in SQL, including query optimization and complex transformations within Snowflake
  • Proficiency in Python for data processing, orchestration, and workflow automation
  • Experience building and managing DAGs for data orchestration
  • Deep understanding of Snowflake architecture and features, including streams, dynamic tables, tasks, and stored procedures
  • Experience implementing ELT patterns within cloud-based data platforms
  • Familiarity with AWS, particularly S3 and file-based ingestion workflows
  • Exposure to DevOps practices in data engineering environments
  • Strong communication skills and the ability to collaborate effectively with technical teams

Nice to Have

  • Experience working with mortgage, lending, or real estate data
  • Familiarity with financial services data models, compliance, or regulatory considerations
  • Experience supporting application-facing data pipelines
