We're looking for a senior data engineer to add to our dynamic and rapidly scaling team. The role involves developing data pipelines and models to support advanced AI/ML analytics projects. The ideal candidate will have experience with modern data integration frameworks, big data, and cloud technologies.
Job Description
Job Title: Data Engineer
Location: Fully Remote - India
Hours: 12pm to 9pm IST
Duration: 3 months, with potential for extension
Data Engineer
- The role emphasizes Informatica Cloud (IICS), Snowflake, AWS, SQL, and Python as essential skills.
- The role involves developing code in Informatica, transforming data into Snowflake, and potentially working on streaming pipelines and API gateways.
Must‑Have Skills
- Informatica IICS (primary ETL tool)
- Snowflake (used as the target data platform)
- AWS
- SQL & Python
Nice‑to‑Have Skills
- Airflow (potential future work in a few months)
- Streaming pipelines (Kafka / AWS streaming / API Gateways) — required for 1 of the 3 roles
Technical Clarifications
- IICS + Snowflake:
  - Logic is typically built in Informatica IICS, with Snowflake as the target.
  - Pushdown optimization is applied selectively to heavier pipelines, decided case by case.
- Python usage:
  - Primarily for scripting, REST API calls, AWS Lambda functions, and light transformations (see the sketch after this list).
  - Not currently used for Airflow orchestration.
- Cloud / DevOps:
  - AWS is the primary cloud.
  - A dedicated DevOps / platform team handles CI/CD and infrastructure.
  - Terraform is used, but only light exposure is expected from engineers.
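As context for the Python scope above, here is a minimal sketch of the kind of scripting described: an AWS Lambda handler that calls a REST API and applies a light transformation before handing the data off. The endpoint URL, payload shape, and field handling are hypothetical placeholders, not details from the posting.

```python
import json

import requests  # assumes the requests library is bundled with the Lambda deployment

# Hypothetical endpoint; the actual source API is not named in the posting.
SOURCE_URL = "https://api.example.com/v1/orders"


def lambda_handler(event, context):
    """Fetch records from a REST API and apply a light transformation."""
    response = requests.get(SOURCE_URL, timeout=30)
    response.raise_for_status()
    records = response.json()  # assumed to be a list of JSON objects

    # Light transformation: normalize keys and drop null values.
    cleaned = [
        {key.lower(): value for key, value in record.items() if value is not None}
        for record in records
    ]
    return {"statusCode": 200, "body": json.dumps({"count": len(cleaned)})}
```

In a real pipeline, a function like this would typically stage the cleaned records to S3 or load them into Snowflake rather than only returning a count.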
Team & Work Structure
- Initial assignments may support KFC or Taco Bell, with flexibility over time.
- Day‑to‑day collaboration is with project managers and technical leads.
FORMAL CLIENT JD:
We’re looking for a data engineer to add to our dynamic and rapidly scaling team. We’re making this investment to help us optimize our digital channels and technology innovations with the end goal of creating competitive advantages for our restaurants around the globe. We’re looking for a solid engineer who brings fresh ideas from past experiences and is eager to tackle new challenges in our company.
We’re in search of a candidate who knows about and loves working with modern data integration frameworks, big data, and cloud technologies. Candidates must also be proficient with data programming languages (e.g., Python and SQL). The data engineer will build a variety of data pipelines and models to support advanced AI/ML analytics projects, with the intent of elevating the customer experience and driving revenue and profit growth in our restaurants globally.
Responsibilities:
- Partner with KFC, Pizza Hut, Taco Bell & Habit Burger to identify opportunities to leverage company data to drive business outcomes.
- Play a key role on our advanced analytics team, developing data-driven solutions and driving Yum! growth.
- Design and develop scalable streaming data integration frameworks to move and transform a variety of data sets (a minimal example follows this list).
- Develop on the Yum! data platform by building applications using a mix of open-source frameworks (PySpark, Kubernetes, Airflow, etc.) and best-in-breed SaaS tools (Informatica Cloud, Snowflake, Domo, etc.).
- Implement and manage production support processes around data lifecycle, data quality, coding utilities, storage, reporting and other data integration points.
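To make the streaming responsibility above concrete, here is a minimal Python sketch of a consumer built with the kafka-python library. Kafka is only one of the technologies the JD names, and the topic, broker address, and event fields here are hypothetical.

```python
import json

from kafka import KafkaConsumer  # kafka-python

# Topic and broker are illustrative placeholders.
consumer = KafkaConsumer(
    "store-transactions",
    bootstrap_servers="localhost:9092",
    group_id="data-eng-demo",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # Minimal in-flight transform before the event would be landed
    # downstream (e.g., staged to S3 or loaded into Snowflake).
    event["store_id"] = str(event.get("store_id", "")).strip()
    print(event)
```

An equivalent pipeline could be written with Spark Structured Streaming or a managed AWS service; the consumer loop above just shows the basic move-and-transform pattern.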
Requirements:
- At least 2 years of hands-on experience with Snowflake and cloud platforms.
- 2+ years of data engineering experience.
- 2+ years of experience building cloud data solutions (e.g. on Azure, AWS, GCP) and using services such as storage, virtual machines, serverless technologies and parallel processing technologies.
- Cloud certifications are a plus.
- Experience building solutions utilizing streaming or event processing (Kafka, Spark Streaming, Pulsar, etc.) is a plus.
- Proficiency in Python and its supporting libraries for data processing.
- Experience processing structured and semi-structured data.
- Experience building serverless APIs (AWS preferred, but any cloud environment works); see the sketch after these requirements.
- Working knowledge of agile development, including DevOps concepts (IAC, CI/CD, etc.).
- Experience with cloud SDKs and programmatic access services.
- Proficiency in SQL.
- Experience with ETL tools such as Informatica, Data Factory, or SSIS is a plus.
- Bachelor’s degree from an accredited institution is required.
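As a reference point for the serverless API requirement above, here is a minimal sketch of an AWS Lambda handler written for an API Gateway proxy integration; the route, parameters, and response payload are illustrative only.

```python
import json


def lambda_handler(event, context):
    """Handle an API Gateway proxy request and return a JSON response."""
    # Query-string parameters arrive in the proxy event; default if absent.
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```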