
Lead Data Engineer

F5
Posted 5 days ago, valid for 2 days
Location

London, Greater London EC1R 0WX

Contract type

Full Time


Sonic Summary

  • The Lead AWS Data Engineer position is based in London with a hybrid work model, requiring around 2-3 office days per week.
  • The salary for this role is up to £120,000, and candidates must have at least 7 years of hands-on experience in data engineering and ETL architecture.
  • The role involves leading a team of Data Engineers, designing scalable data solutions in AWS, and utilizing technologies such as Glue, Athena, and PySpark.
  • Candidates should possess strong programming skills, extensive SQL experience, and familiarity with Infrastructure-as-Code tools like Terraform.
  • Additional benefits include 25 days of holiday, private healthcare, gym membership contributions, and a personal development allowance.

Lead AWS Data Engineer

City of London (Hybrid: around 2-3 office days per week)

Salary: Up to £120,000

Industry: FinTech / RegTech

A 200-employee data and technology company specialising in Regulatory Technology (RegTech) is looking for a Lead AWS Data Engineer. It develops products that enhance data quality and help financial institutions navigate regulatory compliance.

The Role

The Lead Data Engineer will provide technical leadership to a small team of Data Engineers (onshore & offshore), driving the design, implementation, and optimisation of scalable data solutions in AWS.

Key Responsibilities

  • Lead and mentor a team of onshore and offshore Data Engineers, driving technical growth and ensuring effective collaboration.
  • Design, develop, and optimise scalable data pipelines and infrastructure using AWS (Glue, Athena, Redshift, Kinesis, Step Functions, Lake Formation).
  • Utilise PySpark for distributed data processing, ETL, SQL querying, and real-time data streaming (a brief illustrative sketch follows this list).
  • Architect and implement robust data solutions for analytics, reporting, machine learning, and data science initiatives.
  • Establish and enforce best practices in data engineering, coding standards, and architecture guidelines.
  • Manage team tasks and delivery using Agile methodologies, collaborating closely with product owners and scrum masters.
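
For orientation, the sketch below shows the shape of the batch ETL work these responsibilities describe: a minimal PySpark job that reads raw JSON from S3, cleans it, and writes partitioned Parquet that Athena can query. It is illustrative only; the bucket names, paths, and columns are hypothetical stand-ins, not details from the employer.

    # Minimal illustrative PySpark batch ETL job. All S3 paths, column
    # names, and the schema are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("example-etl").getOrCreate()

    # Read raw events from S3 (hypothetical location).
    raw = spark.read.json("s3://example-raw-bucket/trades/")

    # Basic cleansing: de-duplicate, derive a partition column, drop bad rows.
    clean = (
        raw.dropDuplicates(["trade_id"])
           .withColumn("trade_date", F.to_date("executed_at"))
           .filter(F.col("notional") > 0)
    )

    # Partitioned Parquet keeps downstream Athena queries cheap to scan.
    (clean.write
          .mode("overwrite")
          .partitionBy("trade_date")
          .parquet("s3://example-curated-bucket/trades/"))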

Skills and Experience

  • 2+ years of leadership experience in data engineering, including mentoring and team management.
  • 7+ years of hands-on experience in data engineering and ETL architecture.
  • Deep expertise in AWS Data Services, including Glue, Athena, Redshift, Kinesis, Step Functions, and Lake Formation.
  • Strong programming skills in Python and PySpark for data processing and automation.
  • Extensive SQL experience (Spark-SQL, MySQL, Presto SQL) and familiarity with NoSQL databases (DynamoDB, MongoDB, etc.); a short Spark-SQL sketch follows this list.
  • Proficiency in Infrastructure-as-Code (Terraform, CloudFormation) for automating AWS data workflows.
  • Experience designing, implementing, and optimising ETL pipelines, including real-time data processing and streaming architectures.
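
As a small illustration of the Spark-SQL requirement above: any DataFrame can be registered as a temporary view and queried with plain SQL. The table and figures here are invented for the example.

    # Minimal illustrative Spark-SQL usage; the data is made up.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("example-sql").getOrCreate()

    trades = spark.createDataFrame(
        [("T1", "ACME", 1000.0), ("T2", "ACME", -50.0), ("T3", "GLOBEX", 700.0)],
        ["trade_id", "counterparty", "notional"],
    )
    trades.createOrReplaceTempView("trades")

    # Aggregate with plain SQL over the registered view.
    summary = spark.sql("""
        SELECT counterparty,
               COUNT(*)      AS trade_count,
               SUM(notional) AS total_notional
        FROM trades
        WHERE notional > 0
        GROUP BY counterparty
    """)
    summary.show()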

Benefits

  • 25 days holiday + bank holidays + birthday off.
  • Private healthcare & medical cashback plan.
  • Monthly gym membership contribution.
  • Annual personal development allowance.
  • Regular team socials in a collaborative environment.
