Data Engineer (Freelance)

YunoJuno • United Kingdom
Remote

Job Description


Data Engineer / 6-month freelance contract / Fully remote (any location)


YunoJuno has partnered with a media company looking to hire a freelance Data Engineer for an upcoming 6-month contract.


We're looking for an experienced freelance Data Engineer to join our team on a 6-month contract. You'll be working directly on enhancing and maintaining the cloud-based data warehouse, helping us scale our data infrastructure, improve pipeline reliability, and expand our data collection capabilities.


Responsibilities

  • Enhance and maintain the existing GCP-based data warehouse, including schema design, performance tuning, and cost optimization
  • Build and manage data pipelines using Apache Airflow and Python
  • Integrate new data sources, including social media APIs and web crawling pipelines
  • Collaborate with engineering and product teams to support data needs across the organization
  • Ensure data quality, reliability, and documentation across all pipelines
  • Support event-level data collection and tracking infrastructure


Requirements

  • 5+ years of data engineering experience
  • Strong hands-on experience with Google Cloud Platform (GCP), specifically BigQuery
  • Proficiency with Apache Airflow for pipeline orchestration
  • Strong Python development skills for ETL/ELT pipeline development
  • Experience with data modeling, warehousing best practices, and query optimization


Nice to Have

  • Experience with web crawling and scraping at scale
  • Experience integrating social media APIs (e.g., Twitter/X, LinkedIn, Meta) for data pipeline creation
  • Familiarity with event-level data collection platforms such as Rudderstack or Segment
  • JavaScript, Node.js/Express, and React development experience


Start date: ASAP

Duration: 6-month freelance contract

Rate: £250 per day

Location: Fully remote

