
Spark Scala Architect/SME

VALLUM ASSOCIATES LIMITED
Posted 3 days ago, valid for 12 days
Location

Sheffield, South Yorkshire S5 8DP

Salary

£60,000 - £72,000 per annum

Contract type

Full Time

In order to submit this application, a Reed account will be created for you. As such, in addition to applying for this job, you will be signed up to all Reed's services as part of the process. By submitting this application, you agree to Reed's Terms and Conditions and acknowledge that your personal data will be transferred to Reed and processed by them in accordance with their Privacy Policy.

Sonic Summary

  • The job requires a minimum of 12 years of IT experience with a strong focus on Spark Data Integration, including PySpark and Spark SQL.
  • Candidates must have expertise in analyzing Spark code failures and making performance improvement recommendations.
  • A deep understanding of DataFrames, Resilient Distributed Datasets (RDDs), and memory-related issues is essential.
  • The role involves monitoring Spark jobs using tools like Grafana and requires familiarity with Cloudera (CDP) Spark and Prophecy for low-code solutions.
  • The position is based in Sheffield, UK, and offers a competitive salary, with the expectation to work at least three days in the office.

Mandatory Skills

You need to have the skills below.

· At least 12 years of IT experience, with a deep understanding of the components involved in Spark data integration (PySpark, scripting, variable setting, etc.), Spark SQL, and Spark explain plans.

· Spark SME - Be able to analyse Spark code failures through Spark plans and make corrective recommendations.

· Be able to walk through and explain the architectures you have been a part of, and why any particular tool/technology was used.

· Spark SME - Be able to review PySpark and Spark SQL jobs and make performance improvement recommendations.

· Spark SME - Be able to understand DataFrames and Resilient Distributed Datasets (RDDs), diagnose any memory-related problems, and make corrective recommendations.

· Monitoring - Monitor Spark jobs using wider tools such as Grafana to determine whether there are cluster-level failures.

· Cloudera (CDP) Spark - Understand how the runtime libraries are used by PySpark code.

· Prophecy - High-level understanding of the Prophecy low-code/no-code setup and its use to generate PySpark code.

· Willingness to work at least three days a week from the Sheffield (UK) office and to accept changes in line with customer policies.

Good-to-have skills

Ideally, you should be familiar with:

· Collaboration with multiple customer stakeholders

· Working with cloud databases

· Excellent communication and solution presentation skills

Apply now in a few quick clicks