Senior Data Engineer

Primus Connect
Posted 11 days ago, valid for 14 days
Location

Edinburgh, City of Edinburgh, EH1 3EG, Scotland

Salary

£550 - £615/day Outside IR35

Contract type

Full Time

Sonic Summary

  • The position is for a Senior Data Engineer based in Edinburgh, requiring on-site work three days a week for a six-month contract with a likely extension.
  • The role offers a day rate of £550 to £615, classified as outside IR35.
  • Candidates should have experience in designing and implementing data solutions in Databricks, along with strong skills in PySpark, SparkSQL, and relational database modeling.
  • Essential experience includes working with the Azure platform, ADF or Synapse pipelines, and Python development, while familiarity with CI/CD and DevOps principles is also necessary.
  • Desirable skills include knowledge of Data Vault 2.0 and experience in the Financial Services sector.

Senior Data Engineer

Edinburgh, 3 days per week on-site

6 months (likely extension)

£550 - £615 per day, outside IR35

Primus is partnering with a leading Financial Services client who are embarking on a greenfield data transformation programme. Their current processes offer limited digital customer interaction, and the vision is to modernise these processes by:

  • Building a modern data platform in Databricks.
  • Creating a single customer view across the organisation.
  • Enabling new client-facing digital services through real-time and batch data pipelines.

You will join a growing team of engineers and architects, with strong autonomy and ownership. This is a high-value greenfield initiative for the business, directly impacting customer experience and long-term data strategy.

Key Responsibilities:

  • Design and build scalable data pipelines and transformation logic in Databricks.
  • Implement and maintain Delta Lake physical models and relational data models.
  • Contribute to design and coding standards, working closely with architects.
  • Develop and maintain Python packages and libraries to support engineering work.
  • Build and run automated testing frameworks (e.g. pytest).
  • Support CI/CD pipelines and DevOps best practices.
  • Collaborate with BAs on source-to-target mapping and build new data model components.
  • Participate in Agile ceremonies (stand-ups, backlog refinement, etc.).

Essential Skills:

  • PySpark and SparkSQL.
  • Strong knowledge of relational database modelling.
  • Experience designing and implementing in Databricks (DBX notebooks, Delta Lake).
  • Azure platform experience.
  • ADF or Synapse pipelines for orchestration.
  • Python development.
  • Familiarity with CI/CD and DevOps principles.

Desirable Skills:

  • Data Vault 2.0.
  • Data Governance & Quality tools (e.g. Great Expectations, Collibra).
  • Terraform and Infrastructure as Code.
  • Event Hubs, Azure Functions.
  • Experience with DLT / Lakeflow Declarative Pipelines.
  • Financial Services background.
