
Senior Data Engineer

ARM
Posted 5 hours ago, valid for 7 days
Location

Basildon, Essex SS14 3RH, England

Salary

£320 per day

Contract type

Full Time

By applying, a CV-Library account will be created for you. CV-Library's Terms & Conditions and Privacy Policy will apply.

Sonic Summary

  • The position is for a Senior Data Engineer on a 12-month contract with a salary of £320 per day, inside IR35.
  • The role requires collaboration with GDIA product lines and business partners to understand data requirements and build data products.
  • Candidates must have experience with GCP native tools, SQL, Python, and PySpark, along with knowledge of data security and governance best practices.
  • A minimum of 5 years of experience in data engineering, including hands-on GCP Cloud experience, is preferred.
  • The position is hybrid, requiring on-site presence every Tuesday in Basildon.
Senior Data Engineer
12 months
£320 p/d INSIDE IR35
Hybrid: on-site every Tuesday in Basildon

My client, a well-known vehicle manufacturer, is looking for a Senior Data Engineer to join their fast-paced team on an initial 12-month contract.

Responsibilities:
Collaborate with GDIA product lines and business partners to understand data requirements and opportunities.
Build & maintain data products in accordance with GDI&A Data Factory standards, ensuring adherence to data quality, governance, and control guidelines.
Develop and automate scalable cloud solutions using GCP native tools (e.g., Data Prep, Data Proc, Data Fusion, Data Flow, DataForm, DBT, Big Query) and Apache Airflow.
Operationalize and automate data best practices: quality, auditability, timeliness and completeness.
Monitor and enhance the performance and scalability of data processing systems to meet organizational needs.
Participate in design reviews to accelerate the business and ensure scalability.
Advise and direct team members and business partners on company standards and processes.

Skills Required:
Develop custom cloud solutions and pipelines with GCP native tools, e.g. Data Prep, Data Proc, Data Fusion, Data Flow, DataForm, DBT, and BigQuery.
Proficiency in SQL, Python, and PySpark.
Expertise in GCP Cloud and open-source tools like Terraform.
Experience with CI/CD practices and tools such as Tekton.
Knowledge of workflow management platforms like Apache Airflow and Astronomer.
Proficiency in using GitHub for version control and collaboration.
Ability to design and maintain efficient data pipelines.
Familiarity with data security, governance, and compliance best practices.
Strong problem-solving, communication, and collaboration skills.
Ability to work autonomously and in a collaborative environment.
Ability to design pipelines and architectures for data processing.
Experience with data security, governance, and compliance best practices in the cloud.
An understanding of current architecture standards and digital platform services strategy.
Excellent problem-solving skills, with the ability to design and optimize complex data pipelines.
Meticulous approach to data accuracy and quality.

Experience Required:
Programming and scripting experience with SQL, Python, and PySpark.
Ability to work effectively across organizations, product teams and business partners.
Knowledge of Agile methodology and experience in writing user stories.
Demonstrated ability to lead data engineering projects, design sessions and deliverables to successful completion.
GCP Cloud experience with solutions designed and implemented at production scale.
Knowledge of Data Warehouse concepts.
Experience with Data Warehouse/ETL processes.
Strong process discipline and thorough understanding of IT processes (ISP, Data Security).
Critical thinking skills to propose data solutions, test, and make them a reality.

Experience Preferred:
Excellent communication, collaboration and influence skills; ability to energize a team.
Hands on experience in Python using libraries like NumPy, Pandas, etc.
Extensive knowledge and understanding of GCP offerings and bundled services, especially those associated with data operations: Cloud Console, BigQuery, DataFlow, DataFusion, Pub/Sub / Kafka, Looker Studio, and Vertex AI.
Experience with recoding, re-developing and optimizing data operations, data science and analytical workflows and products.

Disclaimer:

This vacancy is being advertised by either Advanced Resource Managers Limited, Advanced Resource Managers IT Limited or Advanced Resource Managers Engineering Limited ("ARM"). ARM is a specialist talent acquisition and management consultancy. We provide technical contingency recruitment and a portfolio of more complex resource solutions. Our specialist recruitment divisions cover the entire technical arena, including some of the most economically and strategically important industries in the UK and the world today. We will never send your CV without your permission. Where the role is marked as Outside IR35 in the advertisement this is subject to receipt of a final Status Determination Statement from the end Client and may be subject to change.
