- Work with Product Owners to establish product requirements.
- Design and develop observable and secure data pipelines.
- Source, integrate and clean data using the medallion architecture.
- Ensure data completeness, integrity and validity.
- Create Lakehouses and Warehouses.
- Optimise data pipelines, processing workflows and Fabric warehouses for performance.
- Maintain the Data Catalogue and prevent data duplication.
- Ensure data security is implemented and enforced.
- Set and own standards in the Data Engineering space.
- Work with the DevOps team to establish CI/CD processes.
- Work with other members of the central platform team to monitor the Microsoft Fabric feature roadmap and integrate new features into the established ecosystem.
- Define and maintain the BCP/DR strategy.
- Work with other members of the central platform team to define an efficient project process to deliver new data products.
- Minimum 5 years' experience working in a cloud environment, using data engineering tools on a variety of complex products.
- Python
- Microsoft Fabric
- Power BI
- Semantic Models
- Azure Data Factory
- Azure Synapse
- SQL Server
- ETL/ELT
- Database Design
- Data Warehousing
- Data Lake / Lakehouse