
Date Added: Thu 10/12/2020

Lead Linux & Hadoop Big Data Engineer

Tyger Valley, South Africa

Job Type: Permanent

Salary: / monthly


The expertise of a highly technical, customer-centric Linux & Hadoop Lead Support Engineer is sought by a leading global UK cloud solutions specialist to join its Cape Town team. In this multi-faceted role, you will support Data Engineering specialists based in the UK, serving as the technical authority to the customer for support services while resolving complex technical issues. You will require a Bachelor of Arts/Science degree, an ITIL V3/4 Foundation or other relevant industry certification, at least 5 years' experience supporting Hadoop clusters (ideally Hortonworks), 5+ years' experience with Linux, SQL, Kafka, Spark, Python & Shell scripting and Core Java, and you must be a Big Data specialist.


  • Successfully resolve technical issues (hardware and software) arising from incoming contacts from internal or external businesses and end users, as well as from proactive notification systems.
  • Be the technical authority to the customer for supported services.
  • Proactively assist internal or external businesses and end users to avoid or reduce problem occurrence.
  • Act as a mentor and guide to other employees. Provide direction and guidance on process improvements.
  • Clearly articulate, recommend, and explain resolutions to clients.
  • Manage the deployment, monitoring, maintenance, development, upgrade, and support of supported systems.
  • Work with stakeholders to define systems requirements for new technology implementations.
  • Make metric-driven recommendations on hardware and software assessments and upgrades.
  • Contribute to the drive of automation and continuous improvement of services and support.
  • Ensure operational documentation is up to date, tested and distributed where needed.
  • Undertake change submission role including submitting new change requests and attending the weekly CAB meeting as required.
  • Fulfil responsibilities in the event of an emergency or disaster in adherence with BC/DR policies.


Qualifications -

  • First-level university degree: a) technical, b) non-technical (i.e. Bachelor of Arts/Science); typically 3-4 years of study beyond high-school level, BA/BS, or equivalent experience.
  • ITIL V3 or 4 Foundation Certification.
  • Relevant industry certification to support technical experience.

Experience/Skills -

  • 5-7 Years' experience in supporting Hadoop clusters, ideally Hortonworks (HDP) 2.6.5 and 3.1.
  • 5+ years' experience with:
      • Linux administration (RHEL certified or similar).
      • SQL and data streaming (Kafka, Spark).
      • Python scripting, Shell scripting, Core Java.
      • Managing and supporting Hortonworks DataFlow (HDF) 3.x.
      • Data visualisation and reporting.
  • Big Data specialist.
  • Advanced troubleshooting skills in a technical environment.
  • Phone and remote support experience.
  • E-support experience, knowledge and resolution ability.
  • Ability to lead technical action plans.

Advantageous -

  • HDFS, YARN, Tez, Hive, HBase, Kafka, Spark, Spark2, ZooKeeper


  • Excellent verbal and written communication skills in the language to be supported.
  • Excellent analytical and problem-solving skills.
  • Superior customer service skills.
  • Excellent team collaboration and working skills.
  • Partner with Account Management and Sales teams on new leads and business opportunities.
  • Able to solve and document solutions for use by other technicians.