- Design and develop a modern data warehouse (Azure or Snowflake), capable of ingesting data from multiple sources and of storing and organising large volumes of data.
- Develop and implement an automated, repeatable data migration process suitable for use over multiple project phases.
- Actively review data quality assessments, addressing any inconsistencies and applying data cleansing and validation techniques.
- Build data pipelines that clean, transform, and aggregate data from disparate sources.
- Stay up to date with emerging trends and technologies in data engineering.
- Proven experience (5+ years) in a data engineering or similar role
- Strong proficiency in SQL and database technologies (e.g. MS SQL, Snowflake)
- Hands-on experience with ETL/ELT tools (Azure Data Factory, dbt, AWS Glue, etc.)
- Strong proficiency in Power BI and advanced analytics.
- Good proficiency in Python for data processing, scripting and automation.
- Any experience with dbt, Airbyte or similar transformation and replication products is advantageous.
- Experience with data migration and mapping complex relational data between business systems.
- Strong analytical skills with the ability to translate business requirements into data engineering solutions.
- Excellent problem-solving abilities, attention to detail, and the ability to work independently or in a team.
- Effective communication and interpersonal skills to foster relationships with stakeholders at all levels.
- Degree in Computer Science, Information Systems, Data Science or a related field.