Data Engineer - in-depth Python, PostgreSQL, AWS, Snowflake
Hybrid - one day per week in central London
£50-75k + benefits
The Data Engineer will be central to designing and building, from scratch, both a new operational data warehouse and a new analytical data warehouse.
The current tech stack is Python, PostgreSQL on RDS, Airflow and Snowflake. This is NOT a role where you'll just be configuring data tooling… it's hands-on, hardcore Python and SQL coding!
The company is a digital surgery scale-up (launched in 2016) with a rapidly growing customer base across the UK, Europe, the US and Australia, working with prestigious healthcare clients.
Sustained growth and a partnership with a billion-dollar US healthcare company underpin this opening for a talented Data Engineer to help take their data infrastructure and pipelines to the next level.
The role:
- Helping to rebuild reporting interfaces across the product and the analytics warehouse, and playing a central part in building the infrastructure that supports this effort
- Working with the backend team on data optimisation, real-time data queries and other infrastructure projects
- Proactively analysing and improving the quality of the systems, including performance, scalability, maintainability, test coverage and documentation
You will need:
- 3+ years of hands-on experience with Python. Python is the core competency, so it's crucial that you are intimately familiar with collaborative Python development
- Strong SQL skills, preferably with PostgreSQL
- A growth mindset: a love of learning and resilience in the face of setbacks
- A team mindset
- Experience with CI/CD and modern software testing practices
- Familiarity with foundational AWS services such as RDS, S3 and EC2
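To give a flavour of the day-to-day coding, here is a minimal, hypothetical sketch of the kind of extract-and-load step you might write against this stack. The table names, columns and connection settings are purely illustrative assumptions, not the company's actual schema or pipeline.

```python
# Illustrative only: extract yesterday's rows from the operational PostgreSQL
# database on RDS and load them into Snowflake. All names and credentials are
# hypothetical and read from environment variables.
import os

import psycopg2
import snowflake.connector


def extract_events():
    """Pull yesterday's events from the operational PostgreSQL database."""
    with psycopg2.connect(
        host=os.environ["PG_HOST"],
        dbname=os.environ["PG_DB"],
        user=os.environ["PG_USER"],
        password=os.environ["PG_PASSWORD"],
    ) as conn, conn.cursor() as cur:
        cur.execute(
            "SELECT id, occurred_at, payload FROM events "
            "WHERE occurred_at >= current_date - interval '1 day'"
        )
        return cur.fetchall()


def load_events(rows):
    """Insert the extracted rows into the analytical warehouse in Snowflake."""
    conn = snowflake.connector.connect(
        account=os.environ["SF_ACCOUNT"],
        user=os.environ["SF_USER"],
        password=os.environ["SF_PASSWORD"],
        warehouse="ANALYTICS_WH",   # assumed warehouse/database/schema names
        database="ANALYTICS",
        schema="RAW",
    )
    try:
        conn.cursor().executemany(
            "INSERT INTO events (id, occurred_at, payload) VALUES (%s, %s, %s)",
            rows,
        )
    finally:
        conn.close()


if __name__ == "__main__":
    load_events(extract_events())
```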