Senior Data Engineer (AWS | Snowflake | Databricks | Python)
No Visa Sponsorship
Job Description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, OnwardPath Technology Solutions LLC, is seeking the following. Apply via Dice today!
About the Role:
We are seeking a Senior AWS Lead Data Engineer with strong hands-on development experience in Python, Pandas, PySpark, Terraform, and AWS services (Glue, Lambda, S3, Redshift, EMR).
This is a lead-level individual contributor position, not a pure architect role: it is ideal for a technically strong engineer who enjoys building and optimizing data pipelines.
Key Responsibilities:
- Design, develop, and optimize data pipelines using AWS Glue, Redshift, S3, Lambda, EMR, and Athena.
- Build and maintain ELT processes to integrate and transform data from multiple sources.
- Collaborate with cross-functional teams to understand data needs and deliver high-quality solutions.
- Write robust, efficient code using Python, Pandas, and PySpark (a brief illustrative sketch follows this list).
- Implement infrastructure as code (IaC) using Terraform.
- Apply data quality, governance, and security best practices.
- Monitor, troubleshoot, and resolve pipeline performance issues.
- Stay current with AWS data technologies and continuously improve pipeline efficiency.
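To give applicants a concrete feel for the hands-on work described above, here is a minimal, illustrative PySpark sketch of the kind of S3-to-S3 transform this role involves. All bucket names, paths, and column names are hypothetical placeholders, and reading from S3 assumes a Spark runtime (such as EMR or Glue) with an S3 connector configured.

```python
# Illustrative only: a minimal PySpark transform of the kind this role involves.
# All bucket names, paths, and columns below are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-aggregate").getOrCreate()

# Read raw JSON landed in S3 (assumes an EMR/Glue runtime with the
# hadoop-aws S3 connector available).
orders = spark.read.json("s3://example-raw-bucket/orders/")

# Basic data-quality gate: drop records missing a key or an amount.
clean = orders.dropna(subset=["order_id", "customer_id", "amount"])

# Aggregate daily revenue per customer.
daily = (
    clean.withColumn("order_date", F.to_date("order_ts"))
         .groupBy("customer_id", "order_date")
         .agg(F.sum("amount").alias("daily_revenue"))
)

# Write partitioned Parquet back to S3 for downstream Redshift/Athena access.
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated-bucket/orders_daily/"
)

spark.stop()
```

In practice, a job like this would typically run as an AWS Glue job or an EMR step, be provisioned through Terraform, and be monitored and tuned as described in the responsibilities above.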
Required Qualifications:
- Bachelor's or Master's degree in Computer Science, IT, or a related field.
- 5–8 years of professional experience as a Data Engineer, with a strong focus on AWS-based data solutions.
- Expertise in:
  - AWS Glue, Redshift, S3, Lambda, EMR, Athena
  - Python, Pandas, PySpark, SQL
  - Terraform (IaC)
- Hands-on experience with AWS RDS, PostgreSQL, and SAP HANA.
- Solid understanding of ETL / ELT processes, data modeling, and data warehousing.
- Familiarity with CI/CD and version control (Git).
- Strong analytical, debugging, and problem-solving skills.
Preferred Qualifications:
- AWS Certifications:
  - AWS Certified Data Analytics
  - AWS Certified Developer
  - AWS Certified Solutions Architect
- Experience with SageMaker, Textract, Rekognition, Bedrock, or other GenAI/LLM tools.
- Familiarity with Apache Spark, Hadoop, or Kafka.
- Experience with data visualization tools (Tableau, Power BI, or AWS QuickSight).
- Knowledge of Azure DevOps / Pipelines.
- Familiarity with data governance and catalog tools (AWS DQ, Collibra, DataBrew).
- Remote opportunity; preference for candidates based in Southern California or elsewhere in the West Coast (Pacific) time zone.
- Candidates must be authorized to work in the United States without current or future visa sponsorship.
- An immediate start is available for the right candidate.