- Provide first-line support for data engineering tasks, such as managing and monitoring data pipelines, resolving issues, and ensuring data integrity.
- Work with both structured and unstructured datasets to design and implement data models and to perform data cleansing, transformation, and validation.
- Maintain accurate documentation of data workflows, pipelines, and issue resolutions.
- Manage system administration tasks, including user access to data resources and troubleshooting data-related errors.
- Collaborate with business stakeholders to identify data requirements and deliver sustainable solutions.
- Monitor and troubleshoot data pipeline issues to maintain data integrity and accuracy.
- Assist in the development, maintenance, and optimization of ETL (Extract, Transform, Load) processes for efficiency and reliability.
- Assist in gathering, documenting, and managing data engineering requirements and workflows.
- Contribute to the development of guidelines and documentation for data engineering best practices.
- Assist in designing, testing, and implementing data pipelines and workflows using established software development lifecycle techniques.
- Help define and optimize scalable data processes that drive operational improvements.
- Collaborate with cross-functional teams to ensure data-related initiatives are properly planned, scheduled, and managed.
- Participate in risk management and change management processes related to data infrastructure.
- Participate in quality reviews of designs, prototypes, and other work products to ensure requirements are met.
- Basic understanding of data engineering concepts, such as ETL processes, data pipelines, and data quality management.
- Hands-on experience with SQL (e.g., writing queries, basic database management).
- Familiarity with data tools and platforms (e.g., Python, and visualization tools such as Power BI or Tableau).
- Attention to detail when working across large datasets and data fields spanning multiple business units.
- Experience with Snowflake.
- Familiarity with cloud data platforms (e.g., AWS, Azure, or Google Cloud).
- Basic knowledge of version control tools like Git.
- Awareness of data warehousing concepts and architectures.
- £30-35k
- Excellent company benefits
- Option for this role to be hybrid or remote