Data Engineer

Verisk
Posted 8 days ago, valid for 19 days
Location

Bath, Somerset BA2 2QH, England

Salary

£56,000 - £84,000 per annum

Contract type

Full Time

Sonic Summary

  • Verisk Maplecroft is seeking a Data Engineer to build and maintain high-impact data pipelines for their Global Risk Data.
  • Candidates should have a proven track record of developing in Python and at least 3 years of experience in building ETL frameworks.
  • The role involves optimizing data workflows using tools like Metaflow, Prefect, and AWS, while collaborating with cross-functional teams.
  • A salary of £56,000 to £84,000 per annum is offered, depending on experience and qualifications.
  • Familiarity with geospatial data and cloud technologies is advantageous but not mandatory.

Job Description

Verisk Maplecroft are looking for a Data Engineer who thrives on building efficient, bespoke and high-impact data pipelines. As a Data Engineer at Maplecroft, you will be tasked with building and maintaining the automated pipelines responsible for the continuous delivery of the risk index data that forms Maplecroft’s Global Risk Data.

About the Day to Day Responsibilities of the Role

  • Design, build, and optimize data pipelines with Python
  • Streamline and scale data workflows using Metaflow, Prefect, AWS and proprietary data services (a brief sketch follows this list)
  • Collaborate with data scientists, analysts, and developers to ensure seamless data flow and implementation of bespoke company methodologies
  • Improve data quality, reliability, and accessibility across teams
  • Deliver high-quality, maintainable, well-tested code that meets business requirements
  • Enable a consistent approach to our data production pipeline
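
To make the tooling above concrete, the following is a minimal sketch of what one of these pipelines can look like in Metaflow. It is an illustration only: the flow name, file names, column names and scoring logic are invented for the example and do not describe Maplecroft's actual methodology.

  # Hypothetical Metaflow flow: load raw indicator data, derive a simple
  # 0-10 index, and write the result for downstream consumers.
  from metaflow import FlowSpec, step

  import pandas as pd


  class RiskIndexFlow(FlowSpec):

      @step
      def start(self):
          # Placeholder input file; a real pipeline would typically read
          # from S3 or a proprietary data service.
          self.raw = pd.read_csv("raw_indicators.csv")
          self.next(self.score)

      @step
      def score(self):
          # Placeholder transformation: rescale one column to a 0-10 index.
          col = self.raw["indicator_value"]
          self.raw["risk_index"] = 10 * (col - col.min()) / (col.max() - col.min())
          self.next(self.end)

      @step
      def end(self):
          self.raw.to_csv("risk_index.csv", index=False)


  if __name__ == "__main__":
      RiskIndexFlow()

Such a flow runs locally with "python risk_index_flow.py run"; the same steps could equally be expressed as Prefect tasks, with AWS supplying the storage and compute behind them.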

Qualifications

About You and How You Can Excel in This Role

Required

  • A proven track record of developing in Python
  • An ability to meet pipeline requirements through an applied understanding of good data acquisition, transformation and manipulation techniques
  • Established experience of building ETL frameworks and tooling (a short sketch follows this list)
  • Knowledge of common Python data analysis libraries (numpy, pandas)
  • Familiarity with Agile software development practices
  • Good understanding of git and working collaboratively on a team-level code base
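
As a rough illustration of the ETL and pandas points above, here is a minimal extract-transform-load function. The file names and column names are placeholders invented for the example.

  # Minimal ETL sketch with pandas (placeholder file and column names).
  import pandas as pd


  def run_etl(source_path: str, output_path: str) -> None:
      # Extract: read the raw records.
      raw = pd.read_csv(source_path)

      # Transform: drop incomplete rows and normalise the score column to 0-1.
      clean = raw.dropna(subset=["country", "score"])
      score = clean["score"]
      clean = clean.assign(
          score_normalised=(score - score.min()) / (score.max() - score.min())
      )

      # Load: write the cleaned data for downstream use.
      clean.to_csv(output_path, index=False)


  if __name__ == "__main__":
      run_etl("raw_scores.csv", "clean_scores.csv")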

Useful to have

  • Familiarity with geospatial data in Python (GDAL, rasterio, shapely); an illustrative sketch follows this list
  • Knowledge of cloud technologies and platforms such as AWS
  • Experience using Docker or other container technologies
  • Experience of the Linux command line and basic Linux server administration skills
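
For the geospatial point in particular, a short rasterio/shapely sketch is given below. The raster file, coordinates and buffer distance are placeholders.

  # Sample a raster value at a point and buffer the point with shapely.
  import rasterio
  from shapely.geometry import Point

  # A point of interest (longitude, latitude) and a ~0.1 degree buffer around it.
  site = Point(-2.36, 51.38)  # roughly Bath, for illustration
  area_of_interest = site.buffer(0.1)

  with rasterio.open("hazard_index.tif") as src:
      # Sample the first band of the raster at the point's coordinates.
      value = next(src.sample([(site.x, site.y)]))[0]
      print(f"Hazard value at site: {value}")
      print(f"Buffered area (square degrees): {area_of_interest.area:.4f}")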

Apply now in a few quick clicks

By applying, a Reed account will be created for you. Reed's Terms & Conditions and Privacy policy will apply.