
AWS Kafka Engineer

Advanced Resource Managers Limited
Posted 6 hours ago, valid for 2 days
Location

London, Greater London NW11 9NN, England

Contract type

Full Time

In order to submit this application, a Reed account will be created for you. As such, in addition to applying for this job, you will be signed up to all Reed’s services as part of the process. By submitting this application, you agree to Reed’s Terms and Conditions and acknowledge that your personal data will be transferred to Reed and processed by them in accordance with their Privacy Policy.

Sonic Summary

  • The position is an AWS Kafka Engineer role on a 3-month contract paying up to £580 per day, inside IR35.
  • Candidates are expected to have strong proficiency in Apache Kafka, AWS services, and Python programming skills.
  • The role involves designing and maintaining real-time data streaming pipelines, as well as writing unit tests for Kafka components.
  • Collaboration with cross-functional teams and managing code using Git is essential, along with familiarity with Infrastructure as Code principles and CI/CD practices.
  • Previous experience in similar roles is required; the advert does not specify a number of years.

AWS Kafka Engineer

3-month contract, up to £580 per day, Inside IR35

Heathrow, hybrid (2-3 days onsite)

Role Information:

Design and Development: Build and maintain real-time data streaming pipelines using Apache Kafka, including producer and consumer applications.

Testing Lifecycle: Write Python unit tests for Kafka topics, consumers, producers, and data-processing Lambda functions.

AWS Services: Utilise strong knowledge of AWS services, particularly those relevant to stream processing and serverless components like Lambda Functions.

Performance Monitoring: Monitor and troubleshoot Kafka cluster performance issues to ensure optimal functionality.

Collaboration: Work closely with cross-functional teams to integrate Kafka into various platform components and support existing implementations.

Version Control: Manage code using Git, ensuring best practices in version control are followed.

Infrastructure as Code (IaC): Apply knowledge of Terraform for infrastructure management where applicable.

CI/CD Practices: Implement CI/CD processes using GitHub Actions or similar tools to automate deployment workflows.
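As a rough illustration of the testing responsibility above, a Python unit test for Kafka producer code can mock the client rather than connect to a broker. The wrapper function, topic name, and event shape below are hypothetical examples, not part of this role; the mocked `produce`/`flush` interface mirrors the confluent-kafka Python client.

```python
import json
import unittest
from unittest.mock import Mock

# Hypothetical wrapper: serialises an event as JSON and publishes it to a
# Kafka topic. `producer` is any object exposing produce(topic, key=..., value=...)
# and flush(), as the confluent-kafka Producer does.
def publish_event(producer, topic, event):
    payload = json.dumps(event).encode("utf-8")
    producer.produce(topic, key=event["id"].encode("utf-8"), value=payload)
    producer.flush()

class PublishEventTest(unittest.TestCase):
    def test_event_is_serialised_and_sent(self):
        producer = Mock()  # stands in for a real Kafka producer
        event = {"id": "42", "status": "boarding"}
        publish_event(producer, "flight-events", event)
        # Verify exactly one publish with the JSON-encoded payload.
        producer.produce.assert_called_once_with(
            "flight-events",
            key=b"42",
            value=json.dumps(event).encode("utf-8"),
        )
        producer.flush.assert_called_once()

if __name__ == "__main__":
    unittest.main()
```

The same mocking approach extends to consumer loops and Lambda handlers, keeping unit tests fast and independent of a running cluster.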

Technical Skills:

Strong proficiency in Apache Kafka, AWS MSK, and Confluent Kafka.

Excellent programming skills in Python with the ability to write efficient scripts for testing and data processing.

Solid understanding of AWS services used in stream processing environments, especially serverless architectures.

Familiarity with Git concepts for version control.

Knowledge of Infrastructure as Code principles, particularly using Terraform.

Experience with CI/CD tools like GitHub Actions is a plus.

Disclaimer:

This vacancy is being advertised by either Advanced Resource Managers Limited, Advanced Resource Managers IT Limited or Advanced Resource Managers Engineering Limited ("ARM"). ARM is a specialist talent acquisition and management consultancy. We provide technical contingency recruitment and a portfolio of more complex resource solutions. Our specialist recruitment divisions cover the entire technical arena, including some of the most economically and strategically important industries in the UK and the world today. We will never send your CV without your permission. Where the role is marked as Outside IR35 in the advertisement this is subject to receipt of a final Status Determination Statement from the end Client and may be subject to change.
