Senior Data Engineer

Retelligence
Posted 6 hours ago, valid for 2 days
Location

London, Greater London EC2M 3TL, England

Salary

£75,000 - £90,000 per annum

Contract type

Full Time

Sonic Summary

  • Retelligence is looking for a Senior Data Engineer to join a high-growth organization focused on digital innovation and marketing.
  • The role involves designing and delivering robust, real-time data pipelines within a Google Cloud Platform (GCP) environment, with a strong emphasis on Python expertise.
  • Candidates should have significant hands-on experience in data engineering, particularly in building and managing real-time data pipelines, and familiarity with tools like Kafka.
  • The position offers exceptional career progression and emphasizes the importance of data quality, integrity, and security across systems.
  • The salary for this role is £75,000 - £90,000 per annum, and candidates are expected to have a proven track record with at least 5 years of relevant experience.

Senior Data Engineer

Retelligence is partnering with a high-growth, forward-thinking organization specializing in digital innovation and marketing across international markets. This company is on an exciting journey, rapidly scaling its capabilities and leveraging advanced technology to deliver cutting-edge solutions.

This is a fantastic opportunity to join a dynamic team within a business that values innovation, supports professional development, and offers exceptional career progression.

The Role

We are seeking a Senior Data Engineer to help design and deliver robust, real-time data pipelines and infrastructure. The company operates in a Google Cloud Platform (GCP) environment and is particularly interested in candidates with strong expertise in Python.

Key Responsibilities

  • Design, develop, and maintain scalable, real-time data pipelines and infrastructure in a GCP environment.
  • Integrate multiple data sources to ensure seamless real-time data flow across the organization.
  • Build and optimize data models for querying and analytics use cases.
  • Develop fault-tolerant, highly available data ingestion and processing pipelines.
  • Continuously monitor and improve pipeline performance for low-latency and high-throughput operations.
  • Ensure data quality, integrity, and security across all systems.
  • Implement effective monitoring, logging, and alerting mechanisms.

About You

  • Strong hands-on experience in data engineering with expertise in Python.
  • Proven track record of building and managing real-time data pipelines.
  • In-depth experience working with Google Cloud Platform (GCP) and its associated tools for data ingestion and processing.
  • Familiarity with distributed streaming platforms such as Kafka or similar technologies.
  • Advanced knowledge of SQL.
  • Experience with data orchestration tools.
  • Ability to optimize and refactor data pipelines for improved performance and scalability.
  • Strong problem-solving skills and the ability to thrive in a collaborative, fast-paced environment.

Apply now in a few quick clicks

By applying, a CV-Library account will be created for you. CV-Library's Terms & Conditions and Privacy Policy will apply.