Data Engineer
Location: Hybrid, 3 days per week onsite in London
Rate: £400.00 per day, inside IR35
We are seeking a Senior Data Engineer to join our team on a critical system and data migration project. This position will play a key role in an ongoing initiative that has been underway throughout the calendar year, supporting the migration and consolidation of data across our organisation. The ideal candidate will possess extensive experience across the full data lifecycle, with particular proficiency in ETL processes, data collection from diverse sources, high-speed data cleansing, and rigorous testing and validation to ensure data integrity and quality.
This role requires a deep understanding of data movement tools and advanced programming skills in Python, enabling efficient data transfer and transformation to meet organisational needs. Additionally, the successful candidate will collaborate with business users, providing guidance and technical insights to data providers and validation teams on effective data handling techniques.
As we are in the early stages of building a Snowflake-based data warehouse, experience in architecting and implementing data warehouses from the ground up is essential, alongside a strong understanding of how to optimise for scalable and resilient reporting frameworks. Command of ETL best practices and the ability to construct a data infrastructure that supports sophisticated, future-proof reporting are also key.
**Experience**
Prior experience in the financial lending sector would be advantageous, as much of the data you will be working with sits within this area. An affinity with financial products and lending will enable you to hit the ground running on this fast-paced project. Furthermore, this individual will work closely with team members of varying levels of data engineering experience, sharing best practices, mentoring on data warehousing and ETL techniques, and fostering a culture of technical excellence.
**Key Responsibilities**
- Lead and execute data migration tasks, working closely with cross-functional teams to ensure seamless integration and data quality.
- Design, implement, and optimise ETL processes to facilitate data collection, transformation, and loading from diverse sources.
- Build, maintain, and improve the Snowflake data warehouse, ensuring scalability and resilience for future reporting needs.
- Collaborate with business users, providing guidance on data management practices and troubleshooting data quality issues.
- Develop and maintain Python scripts and data pipelines for automation and streamlined data processes.
- Conduct rigorous data cleansing, validation, and testing to uphold high standards of data integrity across all systems.
- Support and mentor team members in best practices for data engineering, ETL, and warehousing, fostering a collaborative environment.
- Identify and address issues within the financial lending data set, applying industry knowledge to improve data reliability.
**Knowledge and Experience**
- Extensive experience with the full data lifecycle, including data collection, ETL processes, cleansing, testing, and validation.
- Good knowledge of sharing data back to business users, either via BI tools or through direct data access for self-service analysis in Excel.
- Strong programming skills in Python for data engineering tasks, automation, and scripting.
- Proficient with data movement tools and techniques, ensuring efficient data integration and transformation across systems.
- Hands-on experience with Snowflake, including data warehousing architecture, setup, and optimisation for reporting.
- Familiarity with data governance principles, data quality standards, and best practices in ETL and data warehousing.
- Industry experience within the financial lending sector is highly desirable, especially in addressing domain-specific data challenges.
- Proven ability to mentor and develop less experienced data engineers, sharing best practices in data management and ETL.
- Strong analytical skills with attention to detail, ensuring data accuracy and reliability.
**Qualities and Competencies**
- Strong attention to detail and ability to work with large and complex datasets.
- Collaborative, problem-solving approach.
- Strong interpersonal skills, able to work across teams.
- Ability to work independently and in a team environment.
- Excellent communication and presentation skills.
- Strong organisational and project management skills.
- Proactive and results-driven attitude.
- Inquisitive and curious personality.
Data Engineer
BrightBox Group
Posted 8 hours ago, valid for 23 days
London, Greater London, SW1A 2DX, England
£400 per day
Full Time