Location: Johannesburg, ZA

Date: 14 May 2026

Title: Data Engineer

Requisition ID: 145848

Job Purpose

Architects, engineers, and optimises state-of-the-art, petabyte-scale enterprise data pipelines and infrastructure foundational to the bank's data-driven strategy. This role is crucial in translating high-volume, heterogeneous raw data into governed, low-latency, actionable data products for advanced analytics, machine learning, and real-time insights, enabling the delivery of accurate, high-quality, and timely data. Owns the end-to-end data lifecycle, from source ingestion through batch and streaming processing across various modalities, ensuring data quality, semantic consistency, and complete metadata. Partners with cross-functional teams to align technical solutions with business needs, strengthen the enterprise data architecture, and advance the Bank's strategic goal of becoming a data-driven organisation.

Accountabilities

Data Pipeline Development and Engineering

  • Architect, engineer, and maintain robust, scalable, and fault-tolerant data pipelines using technologies such as Ab Initio, Python/Scala, Apache Spark, Microsoft Fabric, and Databricks for the ingestion, transformation, and delivery of structured, semi-structured, and unstructured data.
  • Ensure efficient, performance-tuned extraction and loading of data into the enterprise Data Warehouse and Data Lake, focusing on high availability, cost efficiency, and reliability.
  • Implement modern DataOps practices, including automation, testing, and orchestration, for both batch and low-latency streaming data workflows.
  • Collaborate with Software Engineers to design and implement consumption APIs and services that enable secure, real-time data access for business applications.

Data Infrastructure and Platform Management

  • Architect and manage a robust, hybrid-cloud data platform using Azure services (Microsoft Fabric, Ab Initio, Synapse, Databricks, SAS) and, potentially, on-premises technologies (Db2 Warehouse, Netezza, Denodo, Ab Initio). Demonstrate proficiency in infrastructure as code.
  • Establish and manage comprehensive observability for all data infrastructure components, such as databases, data lakes, and data warehouses, to meet strict service-level objectives in alignment with architectural standards.
  • Collaborate with Information Security, the CISO, and Data Governance to enforce data security policies and implement masking/anonymisation techniques, ensuring compliance with data privacy regulations (POPIA).
  • Lead the technical evaluation, proof of concept, and successful integration of cutting-edge data engineering technologies and tools to enhance platform maturity and competitive advantage.

Data Quality, Validation, and Governance

  • Embed data quality as code by implementing automated unit and end-to-end data validation, reconciliation, and auditing frameworks within CI/CD data pipelines.
  • Design and maintain the automated capture and upkeep of technical and operational metadata, ensuring complete, automated data lineage within the enterprise metadata hub (Ab Initio).
  • Act as a technical guardian of data integrity, partnering with Data Governance to define and enforce master data management and data quality standards across all data products.
  • Proactively manage data quality through regular data profiling, advanced issue analysis, and swift, documented root-cause resolution.

Collaboration, Delivery, and Support

  • Serve as a trusted technical partner, collaborating with business stakeholders, Data Scientists, Analysts, and Architects to translate requirements into scalable data solutions.
  • Drive the full lifecycle of data products and take end-to-end ownership of data engineering initiatives, ensuring timely delivery and alignment with business objectives.
  • Provide Level 2/3 support for complex data issues, coordinating with cross-functional teams to resolve incidents, minimise mean time to resolution, and maintain clear communication.
  • Contribute to sprint planning, backlog refinement, and agile delivery processes.

Continuous Improvement and Professional Development

  • Drive continuous optimisation of data engineering practices, standards, and processes to improve efficiency and performance.
  • Mentor and guide junior engineers, providing coaching and technical support to build capability.
  • Stay abreast of emerging technologies and industry trends to enhance data engineering maturity.
  • Contribute to innovation initiatives that align with the bank’s data-driven strategy.

Essential Requirements

  • Undergraduate Degree
  • General Experience: Experience that enables the job holder to deal with the majority of situations and to advise others (2 to 5 years in an IT or BI environment)
  • Managerial Experience: Basic experience of coordinating the work of others (4 to 6 months)

Technical Expertise

  • Data Collection and Analysis: Works with full competence to determine and analyze trends from collected data to assist in compiling reports that support business decisions. Typically works without supervision and may provide technical guidance.
  • Engineering: Deep expertise in the Azure cloud platform, Azure Data Factory, Microsoft Fabric, Databricks, Python, and PySpark; infrastructure as code.
  • Database User Interfaces and Queries: Works with full competence to create and run queries and interact with various database interfaces and query languages. Typically works without supervision and may provide technical guidance. Experience with low-latency data ingestion and processing using message queues, Kafka, Azure Event Hub, and Real-Time Intelligence.
  • Data Conversion: Works with full competence to use data conversion tools and techniques to encode data in various formats. Typically works without supervision and may provide technical guidance.
  • Database Reporting: Works with full competence to use database reporting tools and techniques. Typically works without supervision and may provide technical guidance.
  • Orchestration and DataOps: Proven experience with workflow orchestration and implementing CI/CD pipelines for data solutions.
  • Application Development: Works with full competence to develop software through use of programming languages. Typically works without supervision and may provide technical guidance.
  • Categorizing and Classifying Information: Works with full competence to utilize systems and tools to support categorizing and classifying data and information. Typically works without supervision and may provide technical guidance.

Behavioural competencies

  • Tech Savvy
  • Ensures Accountability
  • Collaborates
  • Manages Complexity
  • Communicates Effectively

---------------------------------------------------------------------------------------

Please contact the Nedbank Recruiting Team at +27 860 555 566.

Company:  Nedbank
