Job Title: Junior Data Engineer
Location: London, UK
Salary: £60k - £70k plus bonuses/benefits
Ways of Working: 3x per week in London office
Engagement: Full Time
Project Overview:
We are looking for a skilled Junior Data Engineer to join the Surveillance Technology department in London. The role focuses on developing and enhancing data ingestion pipelines and frameworks across the application portfolio, supporting Trade Surveillance analysts with strategy and decision-making. The team has recently implemented a new data lakehouse and ingestion architecture on the Azure Synapse stack and is now entering a phase of optimization to increase the speed, ease, and quality of data ingestion across the portfolio. The role also involves extending pipelines towards a more real-time (Kappa-like) architecture to support future initiatives.
Key Responsibilities:
- Develop and manage data ingestion pipelines across the portfolio.
- Support the roll-out of trade surveillance solutions and manage impacts from new business initiatives.
- Enhance the quality and speed of data ingestion within the surveillance platform.
- Collaborate closely with cross-functional teams to implement changes and drive the global strategy for trade surveillance.
- Work with new and emerging technologies in a cloud-based environment (Azure).
Requirements:
- Bachelor's degree in Information Systems, Computer Science, Engineering, or a related field, or equivalent experience.
- Relevant experience in a similar Data Engineering role or a related field.
- Strong communicator with the ability to convey technical issues to non-technical audiences.
- Experience with Agile methodologies (SCRUM, KANBAN, or similar).
- A fast learner with a strong ability to quickly pick up new skills and concepts.
- A high standard of work with great attention to detail and the ability to handle change and shifting priorities.
- A team player who can also work independently, with strong problem-solving skills and the ability to think quickly under pressure.
- Proficiency in Python, PySpark, Synapse, Databricks, or similar.
- Strong knowledge of SQL and working with data pipelines.
- Experience with Big Data and data architectures.
- Familiarity with CI/CD tools and processes.
- Experience with test automation and pytest.
- Experience with Azure/cloud services, including App Services, Azure Functions, Cosmos DB, and Synapse.
- Understanding of operations or trading in commodities, particularly in Oil, Gas, Shipping, or related sectors.
- Experience with Jenkins, Octopus, Git, and DevOps practices, including Docker.
- Familiarity with TDD, BDD, test automation, and SpecFlow.
This role offers a fantastic opportunity to join an innovative and fast-paced team working on cutting-edge technologies within the financial services industry. If you're passionate about data engineering and looking to take your career to the next level, apply now!