We are partnering with a client undergoing a critical transformation of their data infrastructure and CRM capabilities. They are seeking a hands-on Data Engineer with strong AWS experience to support two key initiatives: finalising a Braze CRM integration and migrating legacy R-based data pipelines to a modern cloud-native stack.
Phase 1: CRM Data Engineering (Month 1)
- Support the CRM team with data engineering requests.
- QA, deploy, and monitor data pipelines that push third-party formatted data into Braze.
- Manage ad hoc CRM data tasks, including journey updates and API integrations.
- Work extensively within AWS, using Lambda, API Gateway, and Python to maintain and enhance integrations.
Phase 2: Legacy Pipeline Migration (Months 2-3)
- Analyse and understand the existing R-based data pipelines created by data scientists.
- Migrate these pipelines into Airflow, dbt, and Terraform workflows.
- Modernise and scale the legacy infrastructure running on AWS.
- Collaborate with engineering teams to ensure a smooth transition and system stability.
Languages & Scripting:
- Python (primary scripting language for Lambda functions)
- SQL (BigQuery, Redshift)
- R (not essential, but beneficial for interpreting existing scripts)
Cloud & Infrastructure:
- AWS services, including Lambda, API Gateway, S3, CloudWatch, and Kinesis Firehose
- Terraform for infrastructure as code
Orchestration & Transformation:
- Apache Airflow
- dbt
CRM & Marketing Tools:
- Braze (preferred)
- Familiarity with other CRM/marketing automation tools, such as Iterable or Salesforce Marketing Cloud, is a plus
Candidate Profile:
- Proven commercial experience as a data engineer; industry background is not critical.
- Hands-on, pragmatic, and able to deliver quickly with minimal supervision.
- A strong communicator, able to clearly explain technical decisions and project status.
- Willing to take on essential but sometimes "tedious" tasks without hesitation.
- A practical attitude, especially when working with legacy systems or imperfect code.
- Ideally, experience migrating legacy scripting environments (e.g., R to Python) into modern pipelines.