Aker Systems was founded in 2017 by a team of experienced technology professionals who recognised an opportunity to provide highly secure enterprise data platforms to large organisations. We build and operate ground-breaking, ultra-secure, high performance, cloud-based data infrastructure for the enterprise. Our proprietary technology solutions drive performance and reduce costs while helping our clients to improve the management and sharing of data across their organisations.
In 2024, Aker Systems won the Breakthrough Culture Awards, which highlight growth companies putting culture first. In 2020, Aker Systems was recognised as a One to Watch on the Sunday Times Tech Track. The Company was also recognised at the Thames Valley Tech Awards 2020, winning the Thames Valley Tech Company of the Year, Emerging Tech Company and High Growth Tech Business categories. We encourage people of all different backgrounds and identities to apply. We are committed to maintaining an inclusive and supportive place for you to do your very best work.
SC Clearance is mandatory
We are seeking a highly skilled AWS MSK and Kubernetes expert to architect, deploy, and manage streaming data pipelines and containerized applications on AWS. The ideal candidate will have deep knowledge of AWS MSK for Kafka-based data streaming and expertise in Kubernetes for container orchestration. You will be responsible for creating scalable, secure, and fault-tolerant systems to handle real-time data processing and microservices in cloud-native environments.
Duties and Responsibilities
- Design, deploy, and manage AWS Managed Streaming for Apache Kafka (MSK) clusters. Optimize MSK configurations for low-latency, high-throughput streaming, ensuring high availability
- Implement Kafka security using SSL encryption, IAM roles, VPC networking, and authentication mechanisms
- Automate the creation, monitoring, scaling, and maintenance of Kafka topics, producers, and consumers
- Integrate MSK with other AWS services (e.g., Lambda, S3, Elasticsearch). Implement best practices for data replication and fault tolerance in MSK environments
- Troubleshoot deployment issues
- Deploy, monitor, and scale Kubernetes-based applications across AWS. Ensure service discovery, load balancing, auto-scaling, and failover for applications running in Kubernetes
- Manage Kubernetes networking, including VPC integration and ingress configurations. Automate the deployment of containerized Kafka clients, stream processing applications, and monitoring tools on Kubernetes
- Implement CI/CD pipelines for microservices and Kafka-related applications using tools like Drone
- Automate infrastructure provisioning using Terraform or other Infrastructure-as-Code (IaC) tools
- Build and maintain monitoring and alerting systems using Prometheus, Grafana, or AWS native monitoring tools like CloudWatch
- Collaborate with development and DevOps teams to design MSK and Kubernetes-based solutions
- Troubleshoot complex issues related to Kafka and container orchestration
- Document infrastructure setups, architectures, and operational procedures for MSK and Kubernetes environments
- Stay current with evolving AWS, Kafka, and Kubernetes ecosystems and apply best practices accordingly
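To give a flavour of the MSK work described above, the sketch below assembles a producer configuration for connecting to an MSK cluster, using either IAM authentication or mutual TLS. This is an illustrative example, not Aker-specific code: the configuration keys follow the kafka-python naming convention, and the broker endpoint and certificate paths are hypothetical placeholders.

```python
def build_msk_producer_config(bootstrap_servers, use_iam=True):
    """Assemble a Kafka producer configuration for an Amazon MSK cluster.

    Keys follow the kafka-python naming convention; broker addresses
    and certificate paths are placeholders for illustration only.
    """
    config = {
        "bootstrap_servers": bootstrap_servers,
        # Durability: wait for all in-sync replicas to acknowledge each write.
        "acks": "all",
        "retries": 5,
        # Compression cuts network cost on high-throughput topics.
        "compression_type": "gzip",
    }
    if use_iam:
        # MSK IAM authentication runs over SASL_SSL with the AWS_MSK_IAM
        # mechanism (provided in practice by an MSK IAM SASL signer library).
        config["security_protocol"] = "SASL_SSL"
        config["sasl_mechanism"] = "AWS_MSK_IAM"
    else:
        # Fall back to mutual TLS with client certificates.
        config["security_protocol"] = "SSL"
        config["ssl_certfile"] = "client.pem"  # placeholder path
        config["ssl_keyfile"] = "client.key"   # placeholder path
    return config


if __name__ == "__main__":
    # Hypothetical MSK bootstrap broker (port 9098 is the IAM listener).
    brokers = ["b-1.example.kafka.eu-west-2.amazonaws.com:9098"]
    cfg = build_msk_producer_config(brokers)
    print(cfg["security_protocol"])  # SASL_SSL
```

In a real deployment this dictionary would be passed to a producer client; the point here is simply that security settings (listener, mechanism, certificates) are decided once, centrally, rather than scattered across applications.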
Essential Experience and Competencies
- Strong experience with AWS MSK for Apache Kafka and understanding of Kafka internals (brokers, topics, partitions, producers/consumers)
- Proficiency with AWS services such as S3, IAM, VPC, and CloudWatch
- Hands-on experience with Kubernetes in production environments
- Strong knowledge of Docker and managing containerized applications
- Proficiency in configuring and managing Kubernetes clusters, including monitoring, scaling, and automated deployments
- Familiarity with Helm charts
- Experience with Infrastructure as Code (IaC) tools like Terraform
- Knowledge of container build and deployment automation using CI/CD pipelines
- Experience with observability tools for both MSK and Kubernetes, including Prometheus, Grafana, and AWS CloudWatch for metrics and logs
- Deep understanding of Kafka and Kubernetes security practices, including network policies and IAM roles
- Experience with Vault
- Strong analytical and troubleshooting skills
- Ability to work independently and collaboratively in a team
- Strong communication and documentation skills
- Ability to prioritize and handle multiple tasks in a fast-paced environment
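One concrete observability task implied by the list above is tracking consumer lag, the gap between a partition's log-end offset and the consumer group's committed offset, which is the standard signal that consumers are falling behind producers. A minimal, library-free sketch using hypothetical offset values:

```python
def consumer_lag(end_offsets, committed_offsets):
    """Per-partition consumer lag: log-end offset minus committed offset.

    Partitions with no committed offset yet are treated as fully
    lagging (lag equals the log-end offset).
    """
    lag = {}
    for partition, end in end_offsets.items():
        committed = committed_offsets.get(partition, 0)
        # Clamp at zero in case a stale end-offset snapshot trails the commit.
        lag[partition] = max(end - committed, 0)
    return lag


if __name__ == "__main__":
    # Hypothetical offsets, as would be fetched from the cluster and the
    # consumer group's committed positions.
    end = {0: 1500, 1: 980, 2: 2040}
    committed = {0: 1500, 1: 950}  # partition 2 has no commits yet
    print(consumer_lag(end, committed))  # {0: 0, 1: 30, 2: 2040}
```

In practice these per-partition numbers would be exported as metrics (for example to Prometheus or CloudWatch) and alerted on when lag grows beyond a threshold.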
Preferred Experience and Competencies
- Experience with Kafka Connect (including S3 and Elasticsearch connectors) and Kafka Streams
- Familiarity with AWS Glue and Redshift for streaming data integration
- AWS Certifications (AWS Certified DevOps Engineer, etc.)
- Knowledge of multi-region Kafka deployments and cross-cluster replication
Aker Systems Attributes
At Aker we work as a team: we are collaborative, hardworking, open, and delivery obsessed. There is no blame culture here: try things, and take responsibility for the outcomes. You are always part of the wider Aker. We help out our colleagues and take pride in successfully achieving difficult tasks. We run towards problems and help solve them. We communicate constantly, accurately, and in a timely fashion.
In return, we offer a competitive salary, 25 days holiday (excluding bank holidays), Company Paid Medical Insurance, Life Assurance (4x basic salary), Pension scheme, Perks at Work, Cycle Scheme, Tech Scheme and Season Ticket Loan. Plus, a list of voluntary benefits including Dental Insurance, Critical Illness Cover and Virtual GP.
Equal Opportunities
Aker Systems fosters a diverse environment that encourages openness in its communications and is committed to providing equal employment opportunity for all people regardless of race, religion, gender, sexual orientation, age, marital status, national origin, citizenship status, disability, veteran status or other personal characteristics. We embrace differences of opinion and diversity because they challenge us and help us find new, groundbreaking technical solutions.