DBA (GCP, SQL, Python, Java)
12 Months
Hybrid - Every Tuesday on-site in Basildon
£321.38 per day (Inside IR35)
Overview
Join the Purpose Built Data Products team and be part of an innovative journey to transform how data is managed and utilized across our organization. We are dedicated to pioneering an adaptive and collaborative data ecosystem that optimizes every aspect of the data lifecycle. Our team focuses on comprehensive data ingestion, regulatory compliance, and democratized access to enhanced insights. By fostering a culture of continuous improvement and innovation, we empower every team with actionable, enriched insights. Our goal is to drive transformative outcomes and set a new standard for data-powered success.

The successful candidate will be responsible for building scalable data products in a cloud-native environment. You will lead both inbound and outbound data integrations, support global data and analytics initiatives, and develop always-on solutions. Your work will be pivotal in ensuring our data infrastructure is robust, efficient, and adaptable to evolving business requirements.

Responsibilities:
- Collaborate with GDIA product lines and business partners to understand data requirements and opportunities.
- Develop custom cloud solutions and pipelines with GCP-native tools: Dataprep, Dataproc, Data Fusion, Dataflow, Dataform, DBT, and BigQuery.
Skills Required:
- Proficiency in SQL, Python, and PySpark.
- Expertise in GCP and open-source tools such as Terraform.
- Experience with CI/CD practices and tools such as Tekton.
- Knowledge of workflow management platforms such as Apache Airflow and Astronomer.
- Proficiency with GitHub for version control and collaboration.
- Ability to design, optimize, and maintain efficient data pipelines and architectures for data processing.
- Experience with data security, governance, and compliance best practices in the cloud.
- An understanding of current architecture standards and digital platform services strategy.
- A meticulous approach to data accuracy and quality.
- Strong problem-solving, communication, and collaboration skills, capable of working effectively with both technical and non-technical stakeholders as part of a large, global, and diverse team.
- Ability to work both autonomously and collaboratively.
Skills Preferred:
- Experience with Java and MDM (master data management).
- Front-end experience, e.g. Angular or React.
- Experience with data visualization tools (e.g. Tableau, Power BI).
- Experience with software quality and performance tools (e.g. SonarQube, Checkmarx, FOSSA, Dynatrace).
Experience Required:
- Strong programming and scripting experience with SQL, Python, and PySpark.
- Ability to work effectively across organizations, product teams, and business partners.
- Knowledge of Agile methodology and experience writing user stories.
- Demonstrated ability to lead data engineering projects, design sessions, and deliverables to successful completion.
- GCP experience, with solutions designed and implemented at production scale.
- Knowledge of data warehouse concepts and experience with data warehouse/ETL processes.
- Strong process discipline and a thorough understanding of IT processes (ISP, data security).
- Critical thinking skills to propose data solutions, test them, and make them a reality.
- Deep understanding of data service ecosystems, including data warehousing, lakes, metadata, meshes, fabrics, and AI/ML use cases.
- User experience advocacy through empathetic stakeholder relationships.
- Effective communication both internally (with team members) and externally (with stakeholders).
- Ability to take customer requirements, conceptualize solutions, and build scalable, extensible systems that can easily be expanded or enhanced in the future.
Experience Preferred:
- Excellent communication, collaboration, and influencing skills; ability to energize a team.
- Knowledge of data, software, and architecture operations, and of data engineering and data management standards, governance, and quality.
- Hands-on experience in Python using libraries such as NumPy and Pandas.
- Extensive knowledge and understanding of GCP offerings and bundled services, especially those associated with data operations: Cloud Console, BigQuery, Dataflow, Data Fusion, Pub/Sub or Kafka, Looker Studio, Vertex AI.
- Experience with recoding, redeveloping, and optimizing data operations, data science, and analytical workflows and products.
- Data governance concepts, including GDPR (General Data Protection Regulation), and how these can impact technical architecture.
Disclaimer:
This vacancy is being advertised by either Advanced Resource Managers Limited, Advanced Resource Managers IT Limited or Advanced Resource Managers Engineering Limited ("ARM"). ARM is a specialist talent acquisition and management consultancy. We provide technical contingency recruitment and a portfolio of more complex resource solutions. Our specialist recruitment divisions cover the entire technical arena, including some of the most economically and strategically important industries in the UK and the world today. We will never send your CV without your permission. Where the role is marked as Outside IR35 in the advertisement, this is subject to receipt of a final Status Determination Statement from the end Client and may be subject to change.