
AWS Data Engineer

Tenth Revolution Group
Posted a day ago, valid for 15 days
Location: London, Greater London EC3V 3LA, England

Salary: £300 - £350 per day

Contract type: Full Time


Sonic Summary

  • The position is for a Data Engineer on a 14-week contract, with a daily rate of £350, starting on January 12th.
  • This role is remote and outside IR35, providing flexibility and autonomy.
  • Candidates should have experience in designing and implementing ETL/ELT pipelines and advanced knowledge of Python and SQL.
  • Familiarity with AWS services and document processing workflows is essential, along with strong data governance skills.
  • Immediate interviews are available, making it a great opportunity to secure a role before Christmas.


Data Engineer - 14-Week Contract (Outside IR35), Likely to Extend

Start Date: 12th January
Rate: £350 per day
Location: Remote (UK-based)
Interview: Immediate - Offer before Christmas

We are seeking an experienced Data Engineer to join a 14-week project focused on building robust data pipelines and integrating complex data sources. This is an outside IR35 engagement, offering flexibility and autonomy.



Key Responsibilities

  • Design and implement ETL/ELT pipelines with strong error handling and retry logic (a rough sketch follows this list).
  • Develop incremental data processing patterns for large-scale datasets.
  • Work with AWS services including Glue, Step Functions, S3, DynamoDB, Redshift, Lambda, and EventBridge.
  • Build and optimise vector database solutions and embedding generation pipelines for semantic search.
  • Implement document processing workflows (PDF parsing, OCR, metadata extraction).
  • Integrate data from REST APIs, PIM systems, and potentially SAP.
  • Ensure data quality, governance, and lineage tracking throughout the project.
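
As a rough illustration of the pipeline style described in this list, the minimal Python sketch below shows an incremental load step with retry logic, writing to S3 via boto3. The bucket name, key layout, and function names are hypothetical placeholders, not details taken from the role.

    import json
    import time
    from datetime import date

    import boto3
    from botocore.exceptions import ClientError

    s3 = boto3.client("s3")

    def put_with_retry(bucket, key, body, max_attempts=3):
        # Retry transient S3 failures with simple exponential backoff.
        for attempt in range(1, max_attempts + 1):
            try:
                s3.put_object(Bucket=bucket, Key=key, Body=body)
                return
            except ClientError:
                if attempt == max_attempts:
                    raise
                time.sleep(2 ** attempt)

    def load_increment(records, bucket="example-curated-bucket"):
        # Land only the new batch, partitioned by ingestion date (hypothetical layout).
        key = f"sales/ingest_date={date.today().isoformat()}/batch.json"
        put_with_retry(bucket, key, json.dumps(records).encode("utf-8"))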


Required Skills

  • ETL/ELT pipeline design and data validation frameworks.
  • Advanced Python (pandas, numpy, boto3) and SQL (complex queries, optimisation).
  • Experience with AWS Glue, Step Functions, and event-driven architectures.
  • Knowledge of vector databases, embeddings, and semantic search strategies.
  • Familiarity with document parsing libraries (PyPDF2, pdfplumber, Textract) and OCR tools (see the sketch after this list).
  • Understanding of data governance, schema validation, and master data management.
  • Strong grasp of real-time vs batch processing trade-offs.
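
For the document-processing requirement above, a minimal text-and-metadata extraction with pdfplumber might look like the sketch below; the file name and output fields are placeholders, assuming pdfplumber is installed.

    import pdfplumber

    def extract_pdf(path="example-datasheet.pdf"):
        # Pull page text plus basic document metadata from a PDF.
        with pdfplumber.open(path) as pdf:
            pages = [page.extract_text() or "" for page in pdf.pages]
            return {
                "metadata": pdf.metadata,      # title, author, creation date, etc.
                "page_count": len(pdf.pages),
                "text": "\n".join(pages),
            }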


Beneficial Experience

  • CockroachDB deployment and management.
  • PySpark or similar for large-scale processing.
  • SAP data structures and PIM systems.
  • E-commerce and B2B data integration patterns.

Why Apply?

  • Fully remote contract
  • Outside IR35
  • Competitive day rate
  • Immediate interviews - secure your next role before Christmas

Apply now in a few quick clicks

By applying, a CV-Library account will be created for you. CV-Library's Terms & Conditions and Privacy Policy will apply.

SonicJobs' Terms & Conditions and Privacy Policy also apply.