Senior Data Engineer (AWS, Python, SQL)

KPG99 INC, United States
Relocation
This job is no longer active and is no longer accepting applications.
AI Summary

Design, build, and maintain scalable data pipelines and cloud data platforms using AWS, Python, and SQL, collaborating with global cross-functional teams.

Key Highlights
Design, build, and maintain scalable data pipelines and cloud data platforms
Collaborate with global cross-functional teams
Support impactful, life-changing work in a collaborative and balanced culture
Technical Skills Required
AWS, Python, SQL, AWS Glue, AWS Lambda, AWS Athena, AWS Step Functions, AWS Lake Formation, PostgreSQL, MySQL, MongoDB, Cassandra, Snowflake, Airflow, dbt, Spark, Hadoop, Flink
Benefits & Perks
Full-time employment (W2 only)
Hybrid work arrangement (onsite 2-3 days/week)
Relocation considered (with valid reasons)
Opportunity to support impactful, life-changing work

Job Description


Title: Data Engineer

Location: Hybrid – Ridgefield, CT (Onsite 2–3 days/week)

Type: Full-Time Employment (W2 Only)

Relocation: Considered (must have a valid reason, such as family)

Overview

We’re seeking a hands-on Data Engineer to design, build, and maintain scalable data pipelines and cloud data platforms. The ideal candidate will be proficient in AWS, Python, and SQL, with excellent communication skills and the ability to collaborate effectively with global cross-functional teams.

This position offers the opportunity to directly support impactful, life-changing work in a collaborative and balanced culture.

Required Qualifications

  • Bachelor’s Degree + 4 years (or Master’s + 2 years) in Data Engineering or related field.
  • Strong understanding of data integration, data modeling, and SDLC.
  • Hands-on experience with AWS data services (Glue, Lambda, Athena, Step Functions, Lake Formation).
  • Expert-level proficiency in Python and SQL.
  • Advanced knowledge of data warehousing and data modeling (Kimball/star schema).
  • Experience with relational and NoSQL databases (PostgreSQL, MySQL, MongoDB, Cassandra).
  • Familiarity with CI/CD pipelines and DevOps principles.
  • Excellent communication and collaboration skills.


Preferred Skills

  • Experience with ETL/ELT tools such as Airflow, dbt, AWS Glue, ADF.
  • Experience with Snowflake.
  • Knowledge of data governance and metadata management.
  • AWS certification (preferred).
  • Familiarity with big data processing tools (Spark, Hadoop, Flink).

Interview Process

  1. Initial screening with recruiter.
  2. Technical interview.
  3. Final onsite interview with 4–5 team members (individual sessions).

