Senior Data Engineer

Remote
This position is no longer accepting applications.

Company Description

Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most.

Forbes Advisor is looking to hire a Senior Data Engineer to build core data products, promote a data-driven culture, democratize insights through self-service, and establish a single source of truth for business and customer metrics.

If you're looking for challenges and opportunities similar to those of a startup, with the benefits of a seasoned and successful company, then read on:

Job Description

Responsibilities:

  • Assemble large, complex data sets that meet functional and non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing transformation for greater scalability, etc.
  • Use the infrastructure/services required for optimal extraction, transformation, and loading of data from a wide variety of data sources using GCP services.
  • Work with stakeholders including the Product, Data and Design teams to assist with data-related technical issues and support their data requirement needs.

Requirements:

  • Bachelor’s degree and a minimum of 1.5 years of experience working successfully in globally distributed teams
  • Experience with Python and data-handling frameworks (Spark, Apache Beam, etc.)
  • Experience with cloud storage and compute for data pipelines in GCP (GCS, BigQuery, Cloud Composer, etc.)
  • Experience writing Airflow pipelines to orchestrate data workflows
  • Experience handling data from third-party providers (Google Analytics, Google Ads, etc.) is a strong plus
  • Experience manipulating, processing, and extracting value from large, disconnected datasets
  • Experience with software engineering practices in data engineering (e.g., release management, testing) and the corresponding tooling (dbt, Great Expectations, …)
  • Basic knowledge of dbt is good to have
  • Knowledge of data privacy and security
  • Excellent verbal and written communication skills

Perks:

  • Monthly long weekends — every third Friday off
  • Wellness reimbursement to support your health and balance
  • Paid parental leave
  • Remote-first with flexibility and trust
  • Work with a world-class data and marketing team inside a globally recognized brand
