- Azure
- ETL
- Data Lake/Warehouse
- Develop and maintain end-to-end data pipelines and the data warehouse
- Implement systems and standards to ensure service quality
- Maintain code repositories
- A strong, independent problem solver
- Able to work through complex problems
- Able to work independently on own projects, but flexible enough to step in and help with critical issues
- A wide understanding of data infrastructure and environments, and the ability to implement best practice
- A team player but also able to use own initiative
- A problem solver with a good eye for detail and real-world, hands-on experience
- Self-motivated to learn continuously, with the goal of becoming an expert
- Good knowledge of the tools, techniques, and processes for cleaning and shaping data from a variety of sources
- The ability to create robust data models that drive business insights
- Collect and translate business requirements into analytics solutions
- Operate and implement a variety of data structures
- Able to anticipate and troubleshoot issues
- Design and develop the data engineering steps needed to provide the required data
- Good working knowledge of MS SQL, Python, Snowflake, Azure services, ETL, Elasticsearch, and GitLab (or similar)
- Machine learning model deployment experience would be a plus
- Experienced in building and maintaining a live cloud data lake and data warehouse
- Design and develop ETL processes
- Design, build and maintain the data pipelines and data models
- Pipeline automation
- Support the business in the implementation of Data Engineering applications and tools
- Take ownership of the warehouse platform
- Take responsibility for the quality of the code base to ensure the data is reliable
- Be a good communicator, as you will liaise with various business departments and stakeholders