
Data Engineer

Huxley Associates
Posted 7 hours ago, valid for 15 days
Location

London, Greater London SW1A 2DX, England

Salary

£500 - £600 per day

Contract type

Full Time

By applying, a CV-Library account will be created for you. CV-Library's Terms & Conditions and Privacy Policy will apply.

Sonic Summary

  • We are looking for a GCP Data Engineer with 3-5 years' experience, including 1-2 years of confident Python programming applied specifically to data processing tasks.
  • The role involves designing, building, and optimizing data pipelines on Google Cloud Platform (GCP) using tools like BigQuery and Airflow.
  • Candidates should have a solid understanding of data engineering concepts, including ETL processes and data warehousing, along with familiarity in SQL.
  • The position requires excellent problem-solving skills and the ability to work in a dynamic environment, collaborating with data scientists and analysts.
  • The role pays £500 - £600 per day and is offered through Huxley, an Employment Business.

Data Engineer - GCP, BigQuery, Airflow

Job Description: We are seeking a GCP Data Engineer with 3-5 years' experience who is confident and proficient in Python programming, including 1-2 years of experience specifically in data processing tasks. The successful candidate will play a pivotal role in designing, building, and optimizing our data pipelines on GCP, ensuring that our data infrastructure is robust, efficient, and scalable.

Key Responsibilities:

  • Develop, maintain, and optimize data pipelines on Google Cloud Platform (GCP) using BigQuery and Airflow.
  • Utilize Python for data processing tasks, ensuring high performance and reliability.
  • Collaborate with data scientists, analysts, and other engineers to gather requirements and deliver high-quality data solutions.
  • Implement best practices for data engineering, including data governance, security, and compliance.
  • Troubleshoot and resolve issues related to data processing and pipeline performance.
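To give a flavour of the Python data-processing work the responsibilities describe, here is a minimal, hypothetical sketch of a transform step. The schema and field names are illustrative only (not from the posting); in practice such logic would typically run inside an Airflow task feeding BigQuery.

```python
import csv
import io

def transform_rows(raw_csv: str) -> list[dict]:
    """Toy extract-transform step: parse CSV, clean fields, drop bad rows.

    A stand-in for the kind of Python data-processing task the role
    mentions; columns 'customer' and 'amount' are hypothetical.
    """
    rows = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        amount = row.get("amount", "").strip()
        if not amount:
            continue  # skip incomplete records rather than failing the batch
        rows.append({
            "customer": row["customer"].strip().lower(),
            "amount": float(amount),
        })
    return rows

sample = "customer,amount\n Alice ,10.5\nBob,\nCarol,7"
print(transform_rows(sample))
```

Real pipeline code would add logging, dead-letter handling for malformed rows, and idempotent loads, but the shape of the task is the same.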

Qualifications:

  • Primary Skill: Confident Python programming.
  • Experience: 1-2 years of hands-on experience using Python specifically for data processing tasks.
  • Proven experience with Google Cloud Platform (GCP) and its data services (e.g., BigQuery, Dataflow, Pub/Sub).
  • Strong understanding of data engineering concepts, including ETL processes, data warehousing, and data modeling.
  • Familiarity with SQL and experience in writing complex queries.
  • Excellent problem-solving skills and attention to detail.
  • Ability to work independently and as part of a team in a fast-paced, dynamic environment.
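The "complex queries" the qualifications refer to often mean analytic SQL such as window functions. A hypothetical illustration, run against SQLite purely so it is self-contained (BigQuery Standard SQL supports the same `RANK() OVER (...)` construct; table and column names are made up):

```python
import sqlite3

# In-memory database standing in for a warehouse table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('alice', 10.5), ('alice', 3.0), ('carol', 7.0);
""")

# Rank each customer's orders by amount, largest first.
query = """
    SELECT customer,
           amount,
           RANK() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk
    FROM orders
    ORDER BY customer, rnk
"""
rows = conn.execute(query).fetchall()
print(rows)
```

Note that window functions require SQLite 3.25+; the equivalent query in BigQuery differs only in dialect details.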

Preferred Qualifications:

  • Experience with additional programming languages such as Java or Scala.
  • Knowledge of containerization and orchestration tools like Docker and Kubernetes.
  • Familiarity with CI/CD pipelines and version control systems (e.g., Git).

Please click here to find out more about our Key Information Documents. Please note that the documents provided contain generic information. If we are successful in finding you an assignment, you will receive a Key Information Document which will be specific to the vendor set-up you have chosen and your placement.

To find out more about Huxley, please visit (url removed)

Huxley, a trading division of SThree Partnership LLP is acting as an Employment Business in relation to this vacancy | Registered office | 8 Bishopsgate, London, EC2N 4BQ, United Kingdom | Partnership Number | OC(phone number removed) England and Wales
