Apply now »
Position

Data Engineer

Details

Location: Johannesburg, ZA

Date:  19 Mar 2026
Reference:  144304

Shape the Future of Data at Nedbank

We are expanding our Data & Analytics capability and seeking talented Data Engineers who want to build enterprise‑level data platforms, pipelines, and solutions that power strategic decisions across the bank. This role focuses on delivering modern, scalable, cloud‑enabled data solutions that support Nedbank's goal of becoming a fully data‑driven organisation.

What You’ll Do

  • Build and maintain enterprise‑scale data pipelines for ingestion, provisioning, streaming, and API delivery.
  • Enhance Nedbank’s data infrastructure to support analytics, machine learning, and AI.
  • Develop optimised data integration across Golden Sources, Trusted Sources, and Writebacks.
  • Load and maintain Nedbank Data Warehouse layers (Data Reservoir, ADW, Enterprise Data Marts).
  • Engineer big‑data and streaming solutions using Hadoop, Kafka, and IBM Infosphere Data Replication.
  • Drive cloud‑based data engineering using Azure Data Factory, Azure Databricks, and related tools.
  • Build APIs for data consumption and enable self‑service access for business users.
  • Implement data quality, governance controls, and monitoring across all data pipelines.

Why Nedbank?

We’re one of Africa’s largest financial services groups, serving millions across banking, insurance, asset management, and wealth. With a strong Pan-African footprint, we’re committed to sustainable finance and driving the continent’s ambitions toward a net-zero economy.

Why Join Our Data & Analytics Team?

  • Work with modern cloud, data engineering, and machine learning technologies.
  • Build trusted data products used across the enterprise.
  • Collaborate with strong, supportive teams across Data Engineering, Data Science, and Technology.
  • Deliver meaningful solutions that directly influence business outcomes.

What We’re Looking For

  • 3+ years’ experience in Data Engineering.
  • Credit Risk System experience (non‑negotiable).
  • Experience designing and maintaining data warehouses and lakes.
  • Strong programming in SAS, Java, Python.
  • Strong SQL skills across Oracle SQL, SQL Server, relational and NoSQL databases.
  • CI/CD & DevOps: GitLab CI, GitOps, Docker, Kubernetes (EKS), Microservices.
  • Infrastructure as Code: AWS CDK, Terraform.
  • Observability: Prometheus, Grafana, Loki, Tempo, OpenTelemetry.
  • Cloud: AWS, Azure (preferred), with exposure to GCP.
  • API development and testing using REST, Postman, Swagger.
  • Familiarity with AI/ML tooling: PyTorch, TensorFlow, MLFlow, computer vision, NLP.

 

Required Qualifications:

  • Bachelor’s Degree or Advanced Diploma in IT, Engineering, Mathematics, Statistics, Econometrics, or related field.
  • Cloud certifications in Azure or AWS.
  • DevOps, Data Engineering, or Data Science certifications (Coursera, Udemy, SAS, Microsoft).

Preferred Certifications

  • Cloud (Azure, AWS), DevOps, or Data Engineering certification.
  • Data Science certification (Coursera, Udemy, SAS Data Scientist, or Microsoft Data Scientist).

Rewards & Benefits

  • Competitive remuneration and incentives
  • Comprehensive medical aid and provident fund
  • Hybrid working environment
  • Access to learning and growth opportunities
  • A culture built on trust, support, and shared ownership

Ready to make an impact?

Click “Apply” and our Talent Acquisition team will connect with you.

---------------------------------------------------------------------------------------

Please contact the Nedbank Recruiting Team at +27 860 555 566.

 

