
Snowflake Data Engineer Contract

Harnham - Data & Analytics Recruitment
Posted 7 days ago, valid for 19 days
Location

London, Greater London EC1R 0WX

Contract type

Full Time

In order to submit this application, a Reed account will be created for you. As such, in addition to applying for this job, you will be signed up to all Reed's services as part of the process. By submitting this application, you agree to Reed's Terms and Conditions and acknowledge that your personal data will be transferred to Reed and processed by them in accordance with their Privacy Policy.

Sonic Summary

  • A client is seeking a Data Engineer to manage data for their HR team, with a strong emphasis on Snowflake expertise.
  • The role involves designing and optimizing ETL/ELT workflows, developing applications for data transformation, and maintaining cloud-based data stores.
  • Candidates should have strong experience with Python, SQL databases, and AWS services, along with proficiency in CI/CD and Infrastructure as Code tools.
  • The position requires excellent communication skills and the ability to work collaboratively across teams.
  • The salary for this role is competitive, and candidates should have a minimum of 3 years of relevant experience.

I am working with a client who is looking for a Data Engineer to take ownership of all things data for their HR team. This role is essential in building and maintaining data stores, automation, and stream consumers, enabling Data Scientists and Analysts to develop effective algorithms, processes, and reports. As a bridge between software engineering and data science, you'll work within the tech team to develop scalable solutions that meet business needs.

Please apply if the responsibilities and skills below match your experience and interest you, but only if you are very strong in Snowflake.

Key Responsibilities
  • Design, build, and optimize ETL/ELT workflows in Snowflake for HR data.
  • Develop applications to consume and transform HR production data streams (Kafka) for analytical and ML use.
  • Architect and maintain cloud-based data stores (AWS Redshift, Snowflake).
  • Automate model training, evaluation, and deployment pipelines.
  • Work closely with cross-functional teams to gather requirements and deliver data-driven solutions.
Your Experience & Skills
  • Strong experience with Python or similar languages (e.g., R).
  • Hands-on experience with SQL databases (PostgreSQL preferred).
  • Experience with Snowflake & AWS services (S3, SageMaker, RDS, EC2).
  • Proficiency with CI/CD tools (AWS CodePipeline, GitHub Actions).
  • Experience with Infrastructure as Code tools (Terraform, CloudFormation).
  • Familiarity with Kafka and distributed data processing tools (Spark, Dask).
  • Excellent communication skills and strategic thinking to work across teams.

If this sounds like you, apply now using the link below!

