Data Engineer – Birmingham – AWS

Hybrid working – 1 day every 3 weeks out of their Birmingham City Centre offices.

Salary £55,000–£65,000 plus 10% bonus, 28 days' holiday, childcare vouchers, life assurance and pension.

We are looking for an experienced Data Engineer to play a pivotal role in constructing and maintaining data pipelines within a dynamic, Agile environment. This opportunity is based in Birmingham, West Midlands, with a hybrid work model. As a key member of their Data team, you'll be instrumental in translating technical requirements into actionable pipelines, developing solutions to drive data-centric decision-making, and upholding data quality and organisation standards. Strong skills in Python, SQL, AWS Glue, ETL and DataOps principles are essential.

Core skills:
- Strong coding skills in Python and SQL
- Extensive experience with the AWS Cloud platform, including AWS Glue, Amazon DynamoDB for NoSQL database management, Amazon Redshift for data warehousing, and AWS Lambda
- Designing and developing ETL pipelines
- Familiarity with big data technologies and frameworks
- Understanding of data governance and compliance frameworks
Data Engineer
Erin Associates
Posted 6 days ago, valid for 3 days
Birmingham, West Midlands B27 6QS, England
£40,000 - £70,000 per annum
Full Time