Senior Data Engineer (Remote)

Knowledge Services • United States
AI Summary

Knowledge Services is seeking a Senior Data Engineer for a 6-month contract to lead the design, development, and optimization of data pipelines across diverse sources. The ideal candidate will have a strong foundation in modern data engineering practices, hands-on experience with Snowflake and tools like Fivetran, and a collaborative mindset.

Key Highlights
Lead data pipeline design and development
Optimize data extraction and loading processes
Collaborate with cross-functional teams
Technical Skills Required
Snowflake, Fivetran, Python, SQL, ETL/ELT frameworks, data modeling, relational database design
Benefits & Perks
Remote work
6-month contract with potential for extension

Job Description


Knowledge Services is seeking a remote Senior Data Engineer for a 6-month contract (potential for extension). This role can be performed 100% remotely.


  • Please note that we CANNOT CONSIDER ANYONE REQUIRING C2C or Sponsorship for a work visa


Senior Data Engineer Overview:

The Sr. Data Engineer will lead the design, development, and optimization of data pipelines across diverse sources. This role focuses on efficient data extraction, staging, and loading into our Snowflake-based data warehouse, ensuring high availability, accuracy, and performance. The ideal candidate will bring a strong technical foundation in modern data engineering practices, hands-on experience with Snowflake and tools like Fivetran, and a collaborative mindset.


Duties and Responsibilities:

• Develop efficient and scalable data extraction methodologies to retrieve data from diverse sources, such as databases, APIs, web scraping, flat files, and streaming platforms.

• Design and implement robust data loading processes to efficiently ingest and integrate data into the latest data warehousing technology, ensuring data quality and consistency.

• Develop and maintain staging processes to facilitate the organization and transformation of raw data into structured formats, preparing it for downstream analysis and reporting.

• Implement data quality checks and validation processes to identify and address data anomalies, inconsistencies, and integrity issues.

• Identify and resolve performance bottlenecks in data extraction and loading processes, optimizing overall system performance and data availability.

• Ensure adherence to data security and privacy standards throughout the data extraction and warehousing processes, implementing appropriate access controls and encryption mechanisms.

• Create and maintain comprehensive documentation of data extraction and warehousing processes, including data flow diagrams, data dictionaries, and process workflows.

• Mentor and support junior data engineers, providing guidance on best practices, technical design, and professional development to elevate overall team capability and performance.

• Collaborate with cross-functional teams, including data scientists, data analysts, software engineers, and business stakeholders, to understand their data requirements and provide efficient data engineering solutions.

• Stay updated with the latest advancements in data engineering, data warehousing, and cloud technologies, and proactively propose innovative solutions to enhance data extraction and warehousing capabilities.


Senior Data Engineer Requirements:

• Minimum of 5 years’ experience in data engineering, with a strong focus on data extraction and cloud-based warehousing; a combination of years of experience and relevant advanced technology proficiency will also be considered.

• Proficiency with Snowflake and data integration tools like Fivetran.

• Advanced SQL skills and experience with ETL/ELT frameworks.

• Experience with scripting languages such as Python for data processing and automation.

• Solid understanding of data modeling and relational database design.

• Strong communication skills and the ability to collaborate with technical and non-technical stakeholders.

• Strong analytical and problem-solving skills, with the ability to identify and resolve complex data engineering challenges.


Preferred Credentials and Experience:

• Bachelor's or Master’s degree in Computer Science, Information Systems, or a related field.

• Snowflake Architect, Administrator, or Data Engineering certification.

• Experience with dbt (data build tool) for managing data transformations, modeling, and maintaining version-controlled, modular SQL pipelines.

• Familiarity with cloud platforms such as AWS and Azure, including services like S3, Lambda, Redshift, Glue, Azure Data Lake, and Synapse.

