
PySpark Developer

DCV Technologies
Posted 4 days ago, valid for 7 days
Location: Coventry, West Midlands CV1 4FS, England

Salary: £35 - £55 per hour

Contract type: Full Time

Sonic Summary

  • We are seeking an experienced PySpark Developer with strong Microsoft Fabric and Azure engineering skills for a major transformation programme in the financial-markets domain.
  • This hands-on role involves building and optimising large-scale data pipelines, dataflows, semantic models, and lakehouse components.
  • Key responsibilities include designing Spark-based data pipelines, developing Fabric dataflows, and implementing data validation and cleansing.
  • Candidates should have strong hands-on experience with PySpark, Microsoft Fabric, Azure, and Delta Lake, along with solid troubleshooting ability.
  • The position is office-based in London, runs as a 6-month contract, and pays market rates; relevant hands-on experience is required.

We are looking for an experienced PySpark Developer with strong Microsoft Fabric and Azure engineering skills to join a major transformation programme within the financial-markets domain. This role is fully hands-on, focused on building and optimising large-scale data pipelines, dataflows, semantic models, and lakehouse components.

Key Responsibilities

  • Design, build and optimise Spark-based data pipelines for batch and streaming workloads (see the sketch after this list)
  • Develop Fabric dataflows, pipelines, and semantic models
  • Implement complex transformations, joins, aggregations and performance tuning
  • Build and optimise Delta Lake / Delta tables
  • Develop secure data solutions including role-based access, data masking and compliance controls
  • Implement data validation, cleansing, profiling and documentation
  • Work closely with analysts and stakeholders to translate requirements into scalable technical solutions
  • Troubleshoot and improve reliability, latency and workload performance
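
For illustration only, a minimal sketch of the kind of Spark-based batch pipeline described above follows. It assumes a PySpark runtime with Delta Lake support (as in Fabric Spark); the storage paths, column names and aggregation logic are hypothetical placeholders, not details of the actual programme.

from pyspark.sql import SparkSession, functions as F

# Hypothetical batch pipeline: read raw trade data, cleanse it, aggregate,
# and write the result to a Delta table for downstream consumers.
spark = SparkSession.builder.appName("trades-batch-pipeline").getOrCreate()

# Read raw trade records from the lake (placeholder ADLS path)
raw = spark.read.parquet("abfss://raw@example.dfs.core.windows.net/trades/")

# Validation and cleansing: drop malformed rows, deduplicate on trade id
cleaned = (
    raw.dropna(subset=["trade_id", "trade_ts", "notional"])
       .dropDuplicates(["trade_id"])
)

# Transformation: daily aggregation per instrument
daily = (
    cleaned.withColumn("trade_date", F.to_date("trade_ts"))
           .groupBy("trade_date", "instrument")
           .agg(
               F.sum("notional").alias("total_notional"),
               F.count("*").alias("trade_count"),
           )
)

# Write to a partitioned Delta table (placeholder curated path)
(
    daily.write.format("delta")
         .mode("overwrite")
         .partitionBy("trade_date")
         .save("abfss://curated@example.dfs.core.windows.net/daily_trades/")
)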

Essential Skills

  • Strong hands-on experience with PySpark, Spark SQL, Spark Streaming, DataFrames
  • Microsoft Fabric (Fabric Spark jobs, dataflows, pipelines, semantic models)
  • Azure: ADLS, cloud data engineering, notebooks
  • Python programming; Java exposure beneficial
  • Delta Lake / Delta table optimisation experience (a maintenance sketch follows this list)
  • Git / GitLab, CI/CD pipelines, DevOps practices
  • Strong troubleshooting and problem-solving ability
  • Experience with lakehouse architectures, ETL workflows, and distributed computing
  • Familiarity with time-series, market data, transactional data or risk metrics
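
As a rough illustration of the Delta table optimisation skill listed above, the sketch below shows routine table maintenance via the delta-spark Python API. It assumes delta-spark 2.0 or later is installed; the table path and Z-order column are hypothetical.

from delta.tables import DeltaTable
from pyspark.sql import SparkSession

# Spark session configured for Delta Lake (not needed on runtimes such as
# Fabric Spark, where Delta support is built in)
spark = (
    SparkSession.builder.appName("delta-maintenance")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

table = DeltaTable.forPath(
    spark, "abfss://curated@example.dfs.core.windows.net/daily_trades/"
)

# Compact small files and co-locate rows that are frequently filtered together
table.optimize().executeZOrderBy("instrument")

# Remove data files no longer referenced by the table (default 7-day retention)
table.vacuum()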

Nice to Have

  • Power BI dataset preparation
  • OneLake, Azure Data Lake, Kubernetes, Docker
  • Knowledge of regulatory and compliance frameworks relevant to financial services (GDPR, SOX)

Details

  • Location: London (office-based)
  • Type: Contract
  • Duration: 6 months
  • Start: ASAP
  • Rate: Market rates

If you are a PySpark / Fabric / Azure Data Engineer looking for a high-impact contract role, apply now for immediate consideration.
