- 15+ years of industry experience, with at least 5 years architecting cloud solutions
- At least 5 years of architecture and design experience designing and implementing data and analytics platforms and data integration solutions on AWS cloud infrastructure.
- Experience working with and designing solutions using AWS services such as Redshift, DynamoDB, RDS, Aurora, Route 53, EC2, EMR, Elastic Beanstalk, API Gateway, and Lambda, as well as microservices architectures.
- Good understanding of and experience working with data integration tools such as Talend Big Data.
- Experience implementing scalable, distributed architectures and process optimization techniques on the AWS cloud platform.
- Experience with Amazon Web Services provisioning, design and configuration, capacity planning, and administration.
- Extensive experience in data analysis and the ability to convert business requirements into technical specifications.
- Thought leadership and technical advisory experience.
- Excellent communication skills and a can-do attitude.
- Experience working in a collaborative environment to deliver outcomes.
- Excellent stakeholder management skills at all levels, both internally and externally.
- Experience with formal architectural methodologies and frameworks, including TOGAF
Primary Skills
- Python, Spark (Scala), Big Data
- Apache Spark and the Hadoop ecosystem; strong SQL knowledge
- Impala / HiveQL
- Analytical Querying
- AWS IAM, S3 (most AWS storage and compute services)
- Good to have: an ETL tool (StreamSets, Stitch, Matillion, Talend, Informatica)
- Good to have: a reporting tool (Tableau, Looker, Spotfire)
- Python, Scala, Java (expertise in at least one)
- Talend Big Data
- Redshift, DynamoDB, RDS, Aurora, Route 53, EC2, EMR, Elastic Beanstalk, API Gateway, Lambda, microservices
- Strong technical experience implementing Big Data solutions using the Hadoop ecosystem on a cloud platform (AWS) and associated technologies.
2 days per week on site in London required