
data analyst

Lorien
Posted 2 days ago, valid for a month
Location

Edinburgh, City of Edinburgh EH10 5BP, Scotland

Contract type

Full Time

In order to submit this application, a Reed account will be created for you. As such, in addition to applying for this job, you will be signed up to all Reed’s services as part of the process. By submitting this application, you agree to Reed’s Terms and Conditions and acknowledge that your personal data will be transferred to Reed and processed by them in accordance with their Privacy Policy.

Sonic Summary

  • We are seeking a Data Engineer with expertise in SQL Server, Teradata, and Snowflake to join our team.
  • The role requires designing and maintaining ETL processes, optimizing database performance, and creating scalable data pipelines.
  • Candidates should have strong experience in SQL Server and Snowflake architectures, along with knowledge of cloud platforms like AWS.
  • A degree in Computer Science or related fields is preferred, along with strong communication and problem-solving skills.
  • The position offers a competitive salary, and candidates should have a minimum of 5 years of relevant experience.

We are looking for a skilled Data Engineer with expertise in SQL Server, Teradata, and Snowflake to join our team. The role involves designing, developing, and maintaining ETL processes to transfer data from source systems into Snowflake, Teradata, and SQL Server. You will optimize database performance and create scalable data pipelines for large datasets. Collaboration with data scientists, analysts, and business teams is essential for defining and implementing data strategies. The role also includes troubleshooting and improving existing data systems for efficiency and performance, and handling data migration and integration from legacy systems into Snowflake.

Strong experience in SQL Server (T-SQL, stored procedures, optimization) and in Teradata and Snowflake architectures is required, as is knowledge of cloud platforms (AWS, Azure, Google Cloud) and data warehousing best practices. Proficiency in data modelling and ETL tools is expected. A degree in Computer Science or a related field, along with strong communication and problem-solving skills, is preferred.
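To give a flavour of the extract-transform-load work described above, here is a minimal illustrative sketch in Python. It uses the standard-library sqlite3 module purely as a stand-in for the SQL Server source and Snowflake target named in the role (a real pipeline would use those platforms' own connectors); the table and column names are invented for the example.

```python
import sqlite3

# Stand-in databases: in a real pipeline these would be SQL Server
# (source) and Snowflake (target) connections, not in-memory SQLite.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")

# Extract: a small source table of raw transactions (hypothetical schema).
source.execute("CREATE TABLE raw_txn (id INTEGER, amount_pence INTEGER, status TEXT)")
source.executemany(
    "INSERT INTO raw_txn VALUES (?, ?, ?)",
    [(1, 1250, "settled"), (2, 990, "settled"), (3, 400, "void")],
)

# Transform: drop voided rows and convert pence to pounds.
rows = source.execute(
    "SELECT id, amount_pence / 100.0 FROM raw_txn WHERE status = 'settled'"
).fetchall()

# Load: write the cleaned rows into the target warehouse table.
target.execute("CREATE TABLE fact_txn (id INTEGER, amount_gbp REAL)")
target.executemany("INSERT INTO fact_txn VALUES (?, ?)", rows)
target.commit()

loaded = target.execute("SELECT COUNT(*), SUM(amount_gbp) FROM fact_txn").fetchone()
print(loaded)
```

The shape is the same at scale: an extract query against the source, a transformation step (here inline SQL; in practice an ETL tool such as those listed below), and a bulk load into the target.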

Responsibilities:

  • Design, develop, and maintain ETL processes to move data from source systems to Snowflake, Teradata and SQL Server environments.
  • Build scalable data pipelines for processing and storing large datasets.
  • Collaborate with analysts and business stakeholders to define and implement data strategies.
  • Optimize database performance, including query optimization, indexing, and partitioning across all enterprise data platforms.
  • Work with cross-functional teams to gather requirements and develop effective data solutions.
  • Ensure data quality, consistency, and integrity throughout the data lifecycle.
  • Maintain and improve existing data architectures and workflows to meet business requirements.
  • Create and maintain documentation for data systems, pipelines, and processes.
  • Monitor and troubleshoot data pipeline performance issues and resolve them promptly.
  • Assist in the migration and integration of data from legacy systems into Snowflake and other strategic data stores.
  • Perform data transformation tasks using SQL, Snowflake SQL, and relevant data processing tools.
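The query-optimization and indexing responsibility above can be illustrated in miniature. This sketch again uses stdlib sqlite3 as a stand-in (SQL Server and Snowflake expose analogous tooling, e.g. execution plans and clustering information); the table and index names are invented for the example.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
con.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(1000)],
)

query = "SELECT COUNT(*) FROM orders WHERE customer_id = 42"

# Without an index the planner must scan the whole table.
plan_before = con.execute("EXPLAIN QUERY PLAN " + query).fetchone()[-1]

# Adding an index on the filtered column lets the planner seek instead.
con.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = con.execute("EXPLAIN QUERY PLAN " + query).fetchone()[-1]

print(plan_before)  # a full-table SCAN
print(plan_after)   # a SEARCH using idx_orders_customer
```

Checking the plan before and after is the core loop of this kind of tuning work, whatever the platform.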

Required Skills and Qualifications:

  • Proven experience working with SQL Server (e.g., T-SQL, Stored Procedures, Indexing, Query Optimization, System Catalog Views).
  • Strong experience in Snowflake architecture, including data loading, transformation, and performance tuning.
  • Proficient in ETL processes using tools such as Informatica PowerCenter and BDM, AutoSys, Airflow, and SQL Server Agent.
  • Experience with cloud platforms, preferably AWS, including strong knowledge of services such as EMR, RDS (Postgres), Athena, S3, and IAM.
  • Solid understanding of data warehousing principles and best practices.
  • Strong proficiency in SQL for data manipulation, reporting, and optimization.
  • Knowledge of data modelling and schema design.
  • Experience working with large, complex datasets and implementing scalable data pipelines.
  • Familiarity with version control tools such as GitLab.
  • Experience with data integration, data governance, and security best practices.
  • TSYS experience and knowledge of credit card data.

Carbon60, Lorien & SRG - The Impellam Group STEM Portfolio are acting as an Employment Business in relation to this vacancy.
