About the Company
We are a Healthcare IT Data & Analytics company on the cutting edge of the industry, with a compelling product offering. We have a fun, creative, collaborative, and fast-paced work environment, and a fantastic team that values work/life balance.
About the Role
Senior Data Engineer - Data Engineering Product Group
*This is a 100% remote, full-time opportunity*
We are building a specialized Data Engineering Product Group that focuses on designing, implementing, and evolving the processes, frameworks, and tools that power our healthcare data acquisition, curation, and delivery workflows. Unlike operational data engineers, this group does not run day-to-day ingestion projects. Instead, it defines the architecture, automation, and standards that our Operations Engineering team will use, and provides the training, documentation, and ongoing guidance to ensure those teams can execute efficiently and consistently. This role blends deep technical expertise with a product mindset — designing reusable solutions, balancing flexibility and standardization, and enabling large-scale, repeatable success across varied client environments.
SUCCESS METRICS:
In 1 month:
- Develop a deep understanding of current data acquisition, curation, and delivery workflows, including operational pain points and tooling gaps.
- Contribute to the design and enhancement of reusable frameworks, templates, and automation scripts for ingestion, transformation, and validation.
- Partner with Operations Engineering to validate and refine proposed process changes in test or pilot settings.
In 3 months:
- Deliver at least one major tool or framework upgrade that reduces operational time-to-delivery.
- Establish clear metrics and reporting to measure process efficiency, data quality, and operational consistency.
- Deliver training sessions, reference architectures, and “how-to” guides to Operations Engineering.
In 6 months:
- Maintain a living library of best practices, design patterns, and reusable code components for data acquisition and processing.
- Drive adoption of at least two significant new capabilities or frameworks across multiple operational teams.
- Identify and pilot emerging technologies that could streamline ingestion, processing, or delivery.
KEY RESPONSIBILITIES:
Process & Tool Design
- Architect and maintain standardized ingestion, transformation, and validation workflows that can be applied across multiple projects.
- Select, configure, and optimize tools for data acquisition, cleansing, transformation, and delivery (e.g., Airflow, Apache NiFi, Spark, Python frameworks, Elasticsearch, AI tooling).
- Design and maintain monitoring, alerting, and performance tracking systems for operational processes.
Enablement & Training
- Create clear documentation, training modules, and internal certification paths for Operations Engineering teams.
- Serve as a subject matter expert to troubleshoot complex data engineering challenges alongside Operations Engineering without directly executing project work.
- Gather feedback from operational users to continuously improve processes and tools.
Cross-Functional Collaboration
- Partner with Product Management to align tool and process roadmaps with company goals.
- Collaborate with Security, Compliance, and Infrastructure teams to ensure designed processes meet performance, cost, and regulatory requirements.
- Work with Operations Engineering leadership to implement change management for new tools and workflows.
QUALIFICATIONS:
- 5+ years in data engineering or a related technical field, with significant experience in data process and tooling design.
- Expert-level SQL skills and strong knowledge of database platforms (Postgres, MySQL, MS SQL Server, etc.).
- Proficiency in at least one modern programming language used for data engineering (Python, Scala, Java). Python preferred.
- AWS expertise (Glue, S3, Lambda, RDS, etc.).
- Experience designing workflows for large-scale, complex data sets — ideally in healthcare data environments.
- Strong background in automation, pipeline orchestration, and CI/CD practices.
- Excellent communication skills with the ability to train, mentor, and influence technical teams.
Preferred Skills
- Experience with Apache NiFi, Spark, Kafka, or similar distributed data technologies.
- Exposure to infrastructure as code (Terraform) and DevOps workflows.
- Background in designing systems that meet HITRUST or HIPAA compliance standards.
- Familiarity with HL7, CCD, or FHIR data standards.
Why Join Us
You will have the opportunity to shape how healthcare data moves through our platform at scale. Your work will directly influence efficiency, quality, and scalability, enabling the Operations Engineering team to deliver faster, more consistently, and with higher confidence. This is an impact role with the freedom to innovate, experiment, and set the standard for the future of data engineering in our company.