
Data Engineer (Contract)

Harnham - Data & Analytics Recruitment
Posted 3 days ago, valid for a month
Location

London, Greater London EC1R 0WX

Contract type

Full Time

Sonic Summary

  • The client is looking for a hands-on Data Engineer with strong AWS experience to support their data infrastructure and CRM transformation initiatives.
  • The role involves finalizing a Braze CRM integration and migrating legacy R-based data pipelines to a modern cloud-native stack over a three-month period.
  • Candidates should have proven commercial experience as a data engineer, with a focus on Python, SQL, and familiarity with AWS services and tools such as Airflow and Terraform.
  • The position requires strong communication skills and a pragmatic approach, with the ability to deliver quickly and with minimal supervision, particularly when dealing with legacy systems.
  • The salary for this role is competitive, and candidates should ideally have at least 3-5 years of relevant experience.

We are partnering with a client undergoing a critical transformation of their data infrastructure and CRM capabilities. They are seeking a hands-on Data Engineer with strong AWS experience to support two key initiatives: finalising a Braze CRM integration and migrating legacy R-based data pipelines to a modern cloud-native stack.

Phase 1: CRM Data Engineering (Month 1)

  • Support the CRM team with data engineering requests.

  • QA, deploy, and monitor data pipelines that push third-party formatted data into Braze.

  • Manage ad hoc CRM data tasks including journey updates and API integrations.

  • Work extensively within AWS using Lambda, API Gateway, and Python to maintain and enhance integrations.
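
By way of illustration only, the sketch below shows what one such Braze integration could look like: a minimal AWS Lambda handler, invoked via API Gateway, that forwards third-party records to Braze's /users/track REST endpoint. The environment variables, field names, and record mapping are assumptions made for the sketch rather than details of the role.

    # Hypothetical Lambda handler forwarding third-party user records to Braze.
    # BRAZE_REST_URL and BRAZE_API_KEY are assumed environment variables; the
    # /users/track endpoint and Authorization header follow Braze's public REST
    # API, but the field mapping below is purely illustrative.
    import json
    import os
    import urllib.request

    BRAZE_REST_URL = os.environ["BRAZE_REST_URL"]   # e.g. the instance's REST endpoint
    BRAZE_API_KEY = os.environ["BRAZE_API_KEY"]

    def handler(event, context):
        # API Gateway delivers the third-party payload as a JSON string in event["body"].
        records = json.loads(event.get("body") or "[]")

        # Map incoming records onto Braze user-attribute objects (illustrative mapping).
        attributes = [
            {"external_id": record["customer_id"], "email": record.get("email")}
            for record in records
        ]

        payload = json.dumps({"attributes": attributes}).encode("utf-8")
        request = urllib.request.Request(
            f"{BRAZE_REST_URL}/users/track",
            data=payload,
            headers={
                "Content-Type": "application/json",
                "Authorization": f"Bearer {BRAZE_API_KEY}",
            },
            method="POST",
        )
        with urllib.request.urlopen(request) as response:
            braze_response = response.read().decode("utf-8")

        return {"statusCode": 200, "body": braze_response}

In practice, QA, deployment, and monitoring of such functions would sit alongside CloudWatch logging and the team's existing tooling.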

Phase 2: Legacy Pipeline Migration (Months 2-3)

  • Analyze and understand existing R-based data pipelines created by data scientists.

  • Migrate these pipelines into Airflow, dbt, and Terraform workflows (a brief sketch follows this list).

  • Modernize and scale legacy infrastructure running on AWS.

  • Collaborate with engineering teams to ensure a smooth transition and system stability.
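
As a rough illustration of the target stack (Airflow, dbt, and Terraform are named above; the DAG id, task names, and dbt project path here are assumptions), a migrated pipeline might be orchestrated along these lines, with logic ported from R into a Python task and the transformations moved into dbt models:

    # Hypothetical Airflow DAG sketching a migrated legacy R pipeline:
    # a Python extract task followed by a dbt transformation run.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator
    from airflow.operators.python import PythonOperator

    def extract_source_data(**context):
        # Placeholder for logic ported from the legacy R script, e.g. landing
        # raw source files in S3 and loading them into the warehouse.
        print("Extracting source data for", context["ds"])

    with DAG(
        dag_id="legacy_r_pipeline_migration",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(
            task_id="extract_source_data",
            python_callable=extract_source_data,
        )

        # Transformations previously embedded in R move into dbt models,
        # invoked here via the dbt CLI (the project path is an assumption).
        transform = BashOperator(
            task_id="run_dbt_models",
            bash_command="dbt run --project-dir /opt/dbt/analytics",
        )

        extract >> transform

The underlying AWS infrastructure (S3 buckets, IAM roles, and so on) would be defined in Terraform rather than built by hand.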

Languages & Scripting:

  • Python (primary scripting language for Lambda functions)

  • SQL (BigQuery, Redshift)

  • R (not essential but beneficial for interpreting existing scripts)

Cloud & Infrastructure:

  • AWS services including Lambda, API Gateway, S3, CloudWatch, Kinesis Firehose

  • Terraform for infrastructure as code

Orchestration & Transformation:

  • Apache Airflow

  • dbt

CRM & Marketing Tools:

  • Braze (preferred)

  • Familiarity with other CRM/marketing automation tools such as Iterable or Salesforce Marketing Cloud is a plus

Candidate Profile

  • Proven commercial experience as a data engineer; industry background is not critical.

  • Hands-on, pragmatic, and able to deliver quickly with minimal supervision.

  • Strong communicator, able to clearly explain technical decisions and project status.

  • Willing to take on essential but sometimes "tedious" tasks without hesitation.

  • Practical attitude, especially when working with legacy systems or imperfect code.

  • Ideally, experience migrating legacy scripting environments (e.g., R to Python) to modern pipelines.
