
AWS Data Engineer

Harnham - Data & Analytics Recruitment
Posted 15 hours ago, valid for 11 days
Location

London, Greater London EC1R 0WX

Salary

£500 - £550 per day

Contract type

Full Time

In order to submit this application, a Reed account will be created for you. As such, in addition to applying for this job, you will be signed up to all Reed’s services as part of the process. By submitting this application, you agree to Reed’s Terms and Conditions and acknowledge that your personal data will be transferred to Reed and processed by them in accordance with their Privacy Policy.

Sonic Summary

  • The AWS Data Engineer position is a 6-month contract role based in London, offering a daily rate of £500-£550.
  • This role involves supporting a large-scale AWS Databricks project by integrating third-party data into the data warehouse for reporting purposes.
  • Candidates should have commercial experience with AWS, ETL development in Python, and familiarity with Airflow for data transformation.
  • The ideal applicant will also have experience mentoring Junior Engineers and working with APIs to gather data from various sources.
  • Interested candidates are encouraged to apply by sending their CV to Matt Collett.

AWS DATA ENGINEER

6-MONTH CONTRACT

LONDON

£500-£550 PER DAY

This position as an AWS Engineer allows you to work within a dynamic Broadcasting company located in the heart of London. The role will support a larger team on a large-scale AWS Databricks project, bringing large sets of third-party data into the warehouse and preparing it for reporting.

THE COMPANY

This Broadcasting client has been investing heavily in its data function over the past few years and is now looking to create a layered engineering function by bringing a number of Senior Engineers into the business.

THE ROLE

As an AWS Engineer, you will be involved in bringing in third-party data through APIs from a variety of sources, building dataflows in PySpark and developing data stores within the AWS Databricks environment. This will involve:

  • Gathering requirements from the product and engineering teams.
  • Developing and implementing ETL extraction pipelines and API connectors to bring data from various sources into the AWS Databricks environment.
  • Using PySpark to build connectors.
  • Using Airflow to transform the data within the warehouse.

KEY SKILLS AND REQUIREMENTS:

As an AWS Engineer, you will require the following background and skills:

  • Commercial experience across AWS, and ideally some GCP experience.
  • Experience developing ETL pipelines (Python) and API connectors to pull data into the warehouse.
  • Experience working with Airflow as the data transformation tool.
  • Experience upskilling and mentoring Junior Engineers.

HOW TO APPLY

Please register your interest by sending your CV to Matt Collett via the apply link on this page.
