
Technical Lead - Data Engineering - London/Hybrid - Energy/Trading

Coltech Recruitment
Posted 13 hours ago, valid for 4 days
Location

London, Greater London SW1A2DX, England

Salary

£48,000 - £57,600 per annum

Contract type

Full Time

By applying, a CV-Library account will be created for you. CV-Library's Terms & Conditions and Privacy Policy will apply.

Sonic Summary

  • The position of Technical Lead - Data Engineering is available in London with a hybrid work model requiring onsite presence 2-3 days per week.
  • The role offers a salary of up to £580 per day inside IR35 and is initially a 12-month contract with potential extensions.
  • Candidates should have extensive experience in Databricks, Apache Spark, and data lakes, along with proficiency in Python and SQL.
  • The successful applicant will lead the design and implementation of data engineering solutions, collaborating with cross-functional teams and mentoring junior staff.
  • Immediate applications are encouraged for this opportunity with a prestigious consultancy client.

Job Title: Technical Lead - Data Engineering
Location: London/Hybrid - onsite 2-3 days per week, with flexibility to work remotely once up and running
Salary/Rate: Up to £580 per day INSIDE IR35
Start Date: As soon as possible
Job Type: 12-month contract initially, with potential extensions

Company Introduction

Coltech is partnered with a prestigious consultancy client who is looking for an experienced Technical Lead in Data Engineering to lead the development of their data Lakehouse platform.


Job Responsibilities

  • Lead the design, development, and implementation of data engineering solutions for our data Lakehouse platform.
  • Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
  • Provide technical leadership, mentoring, and coaching to the data engineering team.
  • Ensure best practices in data engineering, including data modelling, ETL processes, and data quality.
  • Optimize big data workloads in Spark and other big data technologies.
  • Manage CI/CD pipelines using Azure DevOps or Git.
  • Develop and maintain event-driven pipelines using .NET and other relevant technologies.
  • Implement and manage Databricks Unity Catalog and other data governance tools.
  • Deliver projects in the streaming data world using Kafka, KSQL DB, and similar technologies.
  • Utilize reporting tools such as Power BI and Qlik for data visualization and reporting.


Required Skills/Experience:

  • Extensive experience with Databricks, Apache Spark, and data lakes.
  • Proficiency in Python and SQL.
  • Experience with CI/CD processes using Azure DevOps or Git.
  • Knowledge of streaming data technologies such as Kafka and KSQL DB.
  • Familiarity with reporting tools like Power BI and Qlik.
  • Software engineering experience with .NET for event-driven pipelines and automation testing.
  • Experience with Databricks Unity Catalog.
  • Strong skills in data modelling, including snowflake schema modelling, star schema modelling, and other techniques.
  • Ability to optimize big data workloads in Spark.


Desired Skills/Experience:

  • Experience in delivering projects in the streaming data world.
  • Understanding of data governance and data quality best practices.
  • Familiarity with automation testing frameworks.
  • Experience working on energy trading-related projects.

Apply now for immediate consideration.
