
Data Engineer

Hays Specialist Recruitment Limited
Posted 7 days ago, valid for 15 days
Location

Manchester, Greater Manchester M17 1DJ, England

Contract type

Full Time

In order to submit this application, a Reed account will be created for you. As such, in addition to applying for this job, you will be signed up to all Reed’s services as part of the process. By submitting this application, you agree to Reed’s Terms and Conditions and acknowledge that your personal data will be transferred to Reed and processed by them in accordance with their Privacy Policy.

Sonic Summary

  • The role is for a Data Engineer with a contract length of 3 months, offering a daily rate of £450 - £500 (inside IR35).
  • Candidates should have strong experience with Databricks, PySpark, Delta Lake, and API ingestion, along with advanced SQL skills.
  • The position requires the ability to develop and optimize data pipelines, work with various data formats, and ensure data quality and integrity.
  • Familiarity with cloud platforms like Azure, AWS, or GCP, as well as version control systems, is essential for this role.
  • The successful applicant will need to operate through an umbrella company due to the IR35 compliance requirement.

Job Specification: Data Engineer (Databricks, PySpark, Delta, Parquet, API Ingestion)

Role: Data Engineer
Contract Length: 3 Months
Rate: £450 - £500 per day (inside IR35)
Location: Remote (with occasional travel to site)
Start Date: ASAP

Key Responsibilities:

  • Develop and Optimise Data Pipelines: Build, maintain, and optimise data pipelines using Databricks, PySpark, and Delta Lake.
  • Data Transformation & Integration: Work with structured and unstructured data using Parquet and other data formats.
  • API Ingestion: Design and implement robust solutions for API-based data ingestion.
  • Collaboration: Work closely with other teams (e.g., Data Science, Product, and Backend Engineers) to ensure smooth data flow and integration.
  • Troubleshooting & Performance Tuning: Identify and resolve data-related issues and ensure performance optimisations across the data pipeline.
  • Data Quality & Integrity: Ensure the quality and integrity of data, handling errors and resolving data discrepancies.
  • Documentation: Maintain proper documentation of data pipelines, integrations, and processes for reference and troubleshooting.

Key Skills & Experience:

  • Databricks: Strong experience using Databricks for data engineering tasks such as data processing, ETL pipelines, and batch/streaming workflows.
  • PySpark: Extensive hands-on experience with PySpark to process large-scale datasets and implement scalable data solutions.
  • Delta Lake: Proficiency in Delta Lake for handling transactional data, managing schema evolution, and building data lakes.
  • Parquet: Experience working with Parquet files for efficient data storage and processing.
  • API Ingestion: Strong knowledge of integrating and ingesting data through APIs, ensuring reliable and scalable data pipelines.
  • Cloud Platforms: Familiarity with cloud platforms like Azure, AWS, or GCP.
  • SQL: Advanced skills in SQL for data querying, data manipulation, and troubleshooting.
  • Version Control: Experience with Git or similar version control systems.
  • IR35 Compliance: Familiarity with IR35 regulations and working inside IR35 contracts.
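For candidates gauging the level expected here, the API-ingestion skill above typically means building fetch logic that tolerates transient failures. The sketch below is purely illustrative and not part of the job specification; the function names (`backoff_delays`, `fetch_json`) and the retry strategy are assumptions, shown using only the Python standard library:

```python
import json
import time
import urllib.request
from urllib.error import HTTPError, URLError


def backoff_delays(retries, base=1.0):
    """Exponential backoff schedule: base, 2*base, 4*base, ... (one delay per retry)."""
    return [base * (2 ** i) for i in range(retries)]


def fetch_json(url, retries=3, base_delay=1.0):
    """Fetch a JSON payload from an API endpoint, retrying transient failures.

    Illustrative only: real pipelines would also distinguish retryable status
    codes (429/5xx) from permanent errors and log each attempt.
    """
    last_error = None
    for delay in backoff_delays(retries, base_delay):
        try:
            with urllib.request.urlopen(url, timeout=30) as resp:
                return json.load(resp)
        except (HTTPError, URLError) as exc:
            last_error = exc
            time.sleep(delay)  # back off before the next attempt
    raise RuntimeError(f"ingestion failed after {retries} attempts") from last_error
```

In a Databricks context the resulting records would then land in a Delta table via PySpark rather than local files, but the retry-with-backoff pattern is the same.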

Desired Skills:

  • Experience with other big data tools (e.g., Hadoop, Kafka) is a plus.
  • Knowledge of orchestration tools like Airflow, Apache NiFi, or similar.
  • Familiarity with data warehousing concepts and solutions (e.g., Snowflake, Redshift).
  • Strong communication and collaboration skills.

Additional Information:

  • Work Environment: Fully remote with occasional travel to site for team collaboration, meetings, or project delivery (travel expenses will be covered).
  • IR35 Status: The role is inside IR35, meaning the successful contractor will need to operate through an umbrella company.

If you're a talented Data Engineer with a strong background in Databricks, PySpark, Delta Lake, Parquet, and API ingestion, and you're looking for an exciting contract opportunity, apply now! If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now. If this job isn't quite right for you, but you are looking for a new position, please contact us for a confidential discussion about your career.

Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers, which can be found at hays.co.uk
