The expertise of a highly technical & customer-centric Lead Linux & Hadoop Big Data Engineer is sought by a leading global UK cloud solutions specialist to join its Cape Town team. In this multi-faceted role, you will support Data Engineering specialists based in the UK, serving as the technical authority to the customer for supported services while resolving complex technical issues. You will require a Bachelor of Arts/Science degree, an ITIL V3/4 Foundation or other relevant industry certification, at least 5 years’ experience supporting Hadoop clusters (ideally Hortonworks), 5+ years’ experience with Linux, SQL, Kafka, Spark, Python & shell scripting and Core Java, and must be a Big Data specialist.
Successfully resolve technical issues (hardware and software) raised by internal or external business and end-user contacts, as well as by proactive notification systems.
Be the technical authority to the customer for supported services.
Proactively assist internal or external businesses and end users to avoid or reduce problem occurrence.
Act as a mentor and guide to other employees. Provide direction and guidance on process improvements.
Clearly articulate, recommend and explain resolutions to clients.
Manage the deployment, monitoring, maintenance, development, upgrade, and support of supported systems.
Work with stakeholders to define systems requirements for new technology implementations.
Make metric-driven recommendations on hardware and software assessments and upgrades.
Contribute to driving automation and the continuous improvement of services and support.
Ensure operational documentation is up to date, tested and distributed where needed.
Undertake change submission role including submitting new change requests and attending the weekly CAB meeting as required.
Fulfil responsibilities in the event of an emergency or disaster in adherence with BC/DR policies.
First-level university degree, technical or non-technical (i.e., Bachelor of Arts/Science): typically a 3-4-year qualification beyond high school level, BA/BS or equivalent experience.
ITIL V3 or 4 Foundation Certification.
Relevant industry certification to support technical experience.
5-7 years’ experience supporting Hadoop clusters, ideally Hortonworks (HDP) 2.6.5 and 3.1, is ESSENTIAL.
5+ years’ experience in:
Linux administration skills (RHEL certified or similar).
SQL and data streaming experience (Kafka, Spark).
Python Scripting, Shell Scripting, Core Java programming language.
Managing and supporting Hortonworks Dataflow HDF 3.X.
Data visualisation and reporting.
Big Data Specialist.
Advanced troubleshooting skills in a technical environment.
Phone and remote support experience.
E-support experience, knowledge and resolution ability.
Ability to lead technical action plans.
HDFS, YARN, Tez, Hive, HBase, Kafka, Spark, Spark2, ZooKeeper
Excellent verbal and written communication skills in language to be supported.
Excellent analytical and problem-solving skills.
Superior customer service skills.
Excellent team collaboration and working skills.
Partner with Account Management and Sales teams on new leads and business opportunities.
Able to solve issues and document solutions for use by other technicians and customers.
Can train peers on solutions.
Takes full ownership for resolution with escalated customers.
Lead or provide expertise to teams or projects.
Highly developed knowledge of more complex solutions.
Ability to balance attention to detail with expeditious execution in a fast-paced environment.
Passion for driving exceptional customer experience.
While we would really like to respond to every application, should you not be contacted for this position within 10 working days please consider your application unsuccessful.
When applying for jobs, ensure that you meet the minimum job requirements. Only SA citizens will be considered for this role. If you are not located in the advertised location, please note your relocation plans in all job applications and correspondence.