
Databricks Data Engineer Contract

Harnham - Data & Analytics Recruitment
Posted 7 hours ago, valid for 17 days
Location

London, Greater London EC1R 0WX

Contract type

Full Time

In order to submit this application, a Reed account will be created for you. As such, in addition to applying for this job, you will be signed up to all Reed’s services as part of the process. By submitting this application, you agree to Reed’s Terms and Conditions and acknowledge that your personal data will be transferred to Reed and processed by them in accordance with their Privacy Policy.

Sonic Summary

  • The job is for a Databricks Data Engineer on a 6-month contract based in London, working a hybrid arrangement of 2 days per week on site.
  • The contract rate is between £550 and £600 per day, classified as Inside IR35.
  • Candidates should have extensive experience with Databricks in production environments and strong analytical skills with large datasets.
  • Key responsibilities include building scalable data lakehouse architectures, designing ETL/ELT pipelines, and managing both structured and unstructured datasets.
  • The role requires advanced SQL and Python programming skills, as well as experience with big data tools like Hadoop, Spark, and Kafka.

Databricks Data Engineer - Contract
Location: Hybrid in London, 2 days per week
Duration: 6-month contract
Rate: £550 to £600 per day, Inside IR35

We are working with a leading organization that plays a key role in the UK's energy sector, driving forward innovative data solutions to support the transition to renewable energy. As part of their strategic initiatives, they are preparing for an upcoming regulatory change that will require detailed reporting on complex data products.

This organization manages a vast amount of critical data, primarily from smart meters and power generators, which is essential for building impactful dashboards and driving operational efficiency. With these developments on the horizon, the team is looking for an experienced contractor to assist with a growing backlog, enhance data pipelines, and improve the overall quality and output of their data operations.

Key Responsibilities

  • Build and maintain scalable data lakehouse architectures using Databricks.

  • Design and develop ETL/ELT pipelines for large-scale data processing.

  • Manage both structured and unstructured datasets, optimizing performance and reliability.

  • Set up structured streaming pipelines (using Kafka).

  • Support Power BI reporting by preparing and transforming datasets.

  • Deploy infrastructure and manage environments using Terraform and CI/CD practices.

  • Collaborate with the engineering team to resolve backlogs and support data product delivery.
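To make the ETL/ELT responsibility concrete, here is a minimal, self-contained sketch of the extract-transform-load pattern over hypothetical smart-meter records (plain Python, no Databricks dependency; the field names, unit conversion, and validation rule are illustrative assumptions, not taken from the job description):

```python
# Minimal ETL-style transform over hypothetical smart-meter records.
# Field names ("meter_id", "reading_wh") and the validation rule are
# assumptions for illustration only.

def transform_readings(raw_records):
    """Extract valid readings, normalise units, and aggregate per meter."""
    totals = {}
    for rec in raw_records:
        # "Extract": skip malformed or negative rows instead of failing the batch.
        if rec.get("reading_wh") is None or rec["reading_wh"] < 0:
            continue
        # "Transform": convert watt-hours to kilowatt-hours.
        kwh = rec["reading_wh"] / 1000.0
        # "Load" (here: aggregate in memory): sum consumption per meter.
        totals[rec["meter_id"]] = totals.get(rec["meter_id"], 0.0) + kwh
    return totals

readings = [
    {"meter_id": "m1", "reading_wh": 1500},
    {"meter_id": "m1", "reading_wh": 500},
    {"meter_id": "m2", "reading_wh": -10},   # invalid, dropped
    {"meter_id": "m2", "reading_wh": 2000},
]
print(transform_readings(readings))  # {'m1': 2.0, 'm2': 2.0}
```

In a production Databricks setting the same shape would typically be expressed as Spark DataFrame transformations writing to Delta tables, with streaming sources such as Kafka feeding the pipeline.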

Required Experience & Skills

  • Extensive experience with Databricks in production environments.

  • Strong analytical skills with large, complex datasets (structured and unstructured).

  • Proven experience in building ETL/ELT pipelines.

  • In-depth understanding of Azure services (Data Factory, Azure Functions, Synapse, etc.).

  • Advanced SQL skills, including performance tuning and query optimization.

  • Strong Python programming skills.

  • Experience with big data tools such as Hadoop, Spark, and Kafka.

  • Proficiency in CI/CD processes and version control.

  • Solid experience with Terraform and Infrastructure as Code (IaC).

  • Experience with cloud-based networking and distributed systems.

Apply now in a few quick clicks