
Data Warehouse Engineer

IT – Analyst, Data Management ~ IT – Software Development
Cape Town – Western Cape

ENVIRONMENT:
DESIGN, optimise and deliver tech solutions as the next Data Warehouse Engineer sought by a dynamic independent Asset Management firm seeking your passion for building tools that ensure great client engagement. Your role will entail driving the Data Engineering solution and exploring ways to improve technology processes within Distribution Technology, while developing and documenting the provisioning and management of data pipelines. The ideal candidate will require a suitable tertiary qualification, experience with Python for Data Engineering, proficiency with Azure Data Factory and/or Airflow, SQL, experience building data pipelines/ETL/ELT scripts, familiarity with Agile delivery methodologies, and Financial Services experience (preferably Asset Management). You’ll need to be both creative and analytical, with the ability to understand and articulate business challenges and/or ideas, explore options, design the solution, and see it through to implementation.
 
DUTIES:
  • Work closely with business teams, Technical Leads, and Data Analysts to develop working solutions which deliver the desired business outcome.
  • Be responsible for driving the Data Engineering solution within Distribution Technology.
  • Develop and document the provisioning and management of data pipelines.
  • Explore ways to improve the technology processes to progress the Distribution Technology function across a broad range of areas such as cost savings, simplification, security, efficiency, and reliability.
  • Play a key role in upskilling the extended team and familiarising them with Data Engineering tools and best practices.
  • Maintain an understanding of industry trends and how new technologies can be leveraged to meet business objectives.
  • Discover and play back innovative functionality and potential new ways to realise additional business benefits.
  • Engage actively in Agile planning, including story refinement, demos, and retrospectives.
 
REQUIREMENTS:
  • Relevant qualification.
  • Knowledge and experience of Python for Data Engineering.
  • Experience with Data Management services and ETL tooling, such as Azure Data Factory and/or Airflow.
  • Experience exposing and consuming data from multiple systems using APIs.
  • Proficiency in building data pipelines/ETL/ELT scripts.
  • Knowledge of SQL or a similar database query language.
  • Understanding of industry-recognised Data Modelling patterns and standards.
  • Understanding of a range of coding tools and languages, security, accessibility, and version control.
  • High level of organisational skills to sustain momentum in multiple work streams.
  • Ability to plan, design, manage, execute, and report on tests using appropriate tools and techniques.
  • Knowledge and experience of contributing to the development of technology solutions, both in-house developed bespoke applications and commercially available off-the-shelf solutions.
  • Familiarity with Agile delivery methodologies.
  • Financial Services experience (preferably Asset Management).
 
System Skills –
  • Python
  • Airflow
  • Azure Data Factory
  • SQL
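
For context only, the snippet below is a minimal sketch of how the stack listed above (Python, Airflow, SQL) typically fits together in a data pipeline. It is illustrative, not part of the role description: the DAG ID, query, table names, and schedule are hypothetical placeholders, and Airflow 2.4+ is assumed for the "schedule" argument.

  # Illustrative sketch only: a minimal Airflow 2.x DAG combining Python,
  # Airflow scheduling, and a SQL extract. All names here are hypothetical.
  from datetime import datetime

  from airflow import DAG
  from airflow.operators.python import PythonOperator


  def extract_transform_load():
      # Placeholder ETL step: a real pipeline would run the SQL through a
      # configured Airflow connection/hook and load the results into a
      # warehouse staging table instead of printing.
      query = "SELECT client_id, holding_value, as_of_date FROM source.client_holdings"
      print(f"Would execute: {query}")


  with DAG(
      dag_id="client_holdings_daily",   # hypothetical pipeline name
      start_date=datetime(2024, 1, 1),
      schedule="@daily",                # assumes Airflow 2.4+
      catchup=False,
  ) as dag:
      PythonOperator(
          task_id="extract_transform_load",
          python_callable=extract_transform_load,
      )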
 
ATTRIBUTES:
  • Self-motivated and enthusiastic.
  • Finds satisfaction in solving complex technical and theoretical challenges.
  • Flexible, highly curious, and willing to learn.
  • A critical thinker who is able to look at things from different angles.
  • Great at unpicking a problem from start to end, and able to own and deliver the solution.
  • Able to work under pressure, on several priorities at once, and with tight deadlines.
  • Great at working with other people, sharing, and communicating decisions and ideas.