Senior Data Engineer (Remote)

IT – Analyst, Data Management
Remote

ENVIRONMENT:
A dynamic Tech Company specialising in building FinTech platforms seeks the technical and analytical expertise of a Senior Data Engineer for a hands-on role, helping to build out, set up and manage its data infrastructure to ensure high-level automation and optimal performance. You will need experience with Kafka, Flink, Spark and Apache Airflow; GCP is preferred, though strong AWS or Azure experience will be accepted; cloud data warehouses such as BigQuery, Redshift or Snowflake; and Java, Python and Kubernetes. You should be able to produce transparent, easily navigable data pipelines and be comfortable writing detailed design documents. Please note this is a fully remote role.
 
DUTIES:
  • Set up and manage the data infrastructure, building new systems where required.
  • Build and optimize key ETL pipelines on both batch and streaming data.
  • Work with the Product, Engineering, BI/Analytics and Data Science teams.
  • Take ownership of data model design and data quality.
  • Play an active role in ensuring data governance tooling is implemented and its policies adhered to.
  • Consistently produce high-quality metadata to support discoverability and consistency of calculation and interpretation.
 
REQUIREMENTS:
  • GCP preferred, but strong experience with another platform such as AWS or Azure will suffice.
  • Able to produce transparent and easily navigable data pipelines.
  • Event streaming platforms such as Kafka.
  • Stream analytics frameworks such as Flink, Spark, GCP Dataflow, etc.
  • Workflow scheduler such as Apache Airflow.
  • Cloud data warehouses such as BigQuery, Redshift or Snowflake.
  • Fluent with Kubernetes.
  • Java and Python.
  • Comfortable writing detailed design documents.
  • A solid understanding of the retail banking domain is highly desirable, but not required.
  • Able to work with technical leadership to make well informed architectural choices when required.
  • A high degree of empathy for the needs of the downstream consumers of the data artefacts produced by the Data Engineering team, i.e. the Software Engineers, Data Scientists, Business Intelligence Analysts, etc.