Senior GenAI Data Engineer

AuxoAI • India
Remote
AI Summary

Design and develop production-grade pipelines, leverage GenAI tools, and define engineering best practices. Collaborate with cross-functional teams and mentor junior engineers. Implement robust data quality, testing, and governance frameworks.

Key Highlights
Design and develop end-to-end data pipelines
Leverage GenAI tools (Copilot, Claude, Gemini) for productivity
Implement robust data quality, testing, and governance frameworks
Mentor junior engineers and contribute to capability building
Collaborate with cross-functional teams
Technical Skills Required
Apache Spark, SQL (Advanced), Python (Advanced), BigQuery, Dataflow, Cloud Composer, Pub/Sub, Dataproc, Vertex AI, Databricks, Delta Lake, Apache Iceberg, GitHub Copilot, Claude, Gemini
Benefits & Perks
Competitive compensation
Flexible work arrangements
Professional development support
Cloud certification sponsorship

Job Description



AuxoAI is seeking a Senior GenAI Data Engineer with strong fundamentals in data engineering and end-to-end solution design. In this role, you will design and develop production-grade pipelines, leverage GenAI tools (Copilot, Claude, Gemini) to boost development productivity, and define engineering best practices across complex data environments. This is a highly collaborative, cross-functional role ideal for someone who thrives at the intersection of data engineering excellence and GenAI-powered development. Responsibilities include:

  • Architect and develop end-to-end data pipelines from ingestion to transformation to consumption
  • Lead solutioning and integration for complex data workflows (batch and streaming)
  • Use AI-assisted coding tools (e.g., GitHub Copilot, Claude, Gemini) to accelerate code development, refactoring, and debugging
  • Implement robust data quality, testing, lineage, and governance frameworks
  • Drive best practices across pipeline performance, reusability, and scalability
  • Mentor junior engineers and contribute to capability building within the data team

Requirements

6+ years of experience in data engineering, with expertise in:

  • End-to-end pipeline development (batch and streaming)
  • Data modeling (dimensional, Data Vault, OBT)
  • ETL/ELT design patterns, performance tuning, and optimization
  • SQL (Advanced) and Python (Advanced)
  • Apache Spark for large-scale data processing
  • AI coding tools (e.g., Copilot, Claude, Gemini) to enhance productivity and code quality
  • Data quality frameworks, unit testing, and CI/CD for data workflows

Preferred Qualifications

  • Experience with Google Cloud Platform services: BigQuery, Dataflow, Cloud Composer, Pub/Sub, Dataproc, Vertex AI
  • Exposure to finance or sales data domains
  • Familiarity with Databricks, Delta Lake, or Apache Iceberg
  • GCP Professional Data Engineer certification is a plus

What We Offer

  • Opportunity to work on modern data platforms with GenAI integration
  • Access to professional development support and cloud certification sponsorship
  • Competitive compensation and flexible work arrangements
  • A fast-paced, high-impact environment where innovation is valued

(ref:hirist.tech)
