Data Platform Engineer

  • McCabe & Barton
  • Dec 10, 2025
  • Full time
  • Telecommunications

Job Description

Data Platform Engineer - Permanent
Hybrid (3 days in the office, 2 days WFH)
London

McCabe & Barton are partnering with a leading financial services client to recruit an experienced Data Platform Engineer. This is an excellent opportunity to join a forward-thinking team building modern, cloud-based data platforms on Azure and Databricks.

Role Overview

As a Data Platform Engineer, you will design, build, and maintain scalable cloud-based data infrastructure using Azure and Databricks. You'll play a key role in ensuring that data pipelines, architecture, and analytics environments are reliable, performant, and secure.

Key Responsibilities

Platform Development & Maintenance

  • Design and implement data pipelines using Azure Data Factory, Databricks, and related Azure services.
  • Build ETL/ELT processes to transform raw data into structured, analytics-ready formats (see the sketch after this list).
  • Optimise pipeline performance and ensure high availability of data services.
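
By way of illustration only (not part of the role specification): a minimal PySpark sketch of the kind of ETL step described above. The storage paths and column names are hypothetical placeholders, assuming a raw CSV landing zone and a curated Delta output in ADLS.

    # Minimal ETL sketch: raw CSV -> cleaned, analytics-ready Delta table.
    # Paths and column names are hypothetical placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("raw-to-curated").getOrCreate()

    raw = (spark.read
           .option("header", "true")
           .csv("abfss://landing@examplelake.dfs.core.windows.net/trades/"))

    curated = (raw
               .withColumn("trade_date", F.to_date("trade_date", "yyyy-MM-dd"))
               .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
               .dropDuplicates(["trade_id"]))

    (curated.write
     .format("delta")
     .mode("overwrite")
     .save("abfss://curated@examplelake.dfs.core.windows.net/trades/"))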

Infrastructure & Architecture

  • Architect and deploy scalable data lake solutions using Azure Data Lake Storage.
  • Implement governance and security measures across the platform.
  • Leverage Infrastructure-as-Code (IaC) tools such as Terraform for controlled and reproducible deployments (an illustrative sketch follows this list).
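
To give a flavour of the IaC work (illustrative only; the team may equally use Terraform itself, which uses HCL rather than Python): a short sketch with Pulumi, a comparable IaC tool with a Python SDK. The resource group and storage account names are hypothetical.

    # IaC sketch with Pulumi (a Terraform-like tool with a Python SDK).
    # Resource group and storage account names are hypothetical.
    import pulumi
    from pulumi_azure_native import resources, storage

    rg = resources.ResourceGroup("data-platform-rg")

    lake = storage.StorageAccount(
        "examplelake",
        resource_group_name=rg.name,
        sku=storage.SkuArgs(name="Standard_LRS"),
        kind="StorageV2",
        is_hns_enabled=True,  # hierarchical namespace enables ADLS Gen2
    )

    pulumi.export("lake_name", lake.name)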

Databricks Development

  • Develop and optimise data jobs using PySpark or Scala within Databricks.
  • Implement the medallion architecture (bronze, silver, gold layers) and use Delta Lake for reliable data transactions (see the sketch after this list).
  • Manage cluster configurations and CI/CD pipelines for Databricks deployments.
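
For context on the medallion pattern mentioned above, an illustrative sketch (not a prescribed design; table names are hypothetical) promoting a bronze table to silver:

    # Medallion sketch: promote a bronze (raw) Delta table to silver (cleaned).
    # Table names are hypothetical; assumes a Databricks/Delta runtime.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    bronze = spark.read.table("bronze.payments")

    silver = (bronze
              .filter(F.col("amount").isNotNull())
              .withColumn("ingested_at", F.current_timestamp()))

    (silver.write
     .format("delta")
     .mode("append")
     .saveAsTable("silver.payments"))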

Monitoring & Operations

  • Implement monitoring solutions using Azure Monitor, Log Analytics, and Databricks tools (a brief example follows this list).
  • Optimise performance, ensure SLAs are met, and establish disaster recovery and backup strategies.
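
As a rough illustration of the monitoring side (the workspace ID and KQL query below are hypothetical placeholders), using the azure-monitor-query SDK to pull recent Data Factory failures from Log Analytics:

    # Monitoring sketch: query recent pipeline failures from Log Analytics.
    # The workspace ID and KQL query are hypothetical placeholders.
    from datetime import timedelta
    from azure.identity import DefaultAzureCredential
    from azure.monitor.query import LogsQueryClient

    client = LogsQueryClient(DefaultAzureCredential())

    response = client.query_workspace(
        workspace_id="<workspace-id>",
        query="ADFActivityRun | where Status == 'Failed' | take 20",
        timespan=timedelta(hours=24),
    )

    for table in response.tables:
        for row in table.rows:
            print(row)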

Collaboration & Documentation

  • Partner with data scientists, analysts, and business stakeholders to deliver effective solutions.
  • Document technical designs, data flows, and operational procedures for knowledge sharing.

Essential Skills & Experience

  • 5+ years of experience with Azure services (Azure Data Factory, ADLS, Azure SQL Database, Synapse Analytics).
  • Strong hands-on expertise in Databricks, Delta Lake, and cluster management.
  • Proficiency in SQL and Python for pipeline development.
  • Familiarity with Git/GitHub and CI/CD practices.
  • Understanding of data modelling, data governance, and security principles.

Desirable Skills

  • Experience with Terraform or other Infrastructure-as-Code tools.
  • Familiarity with Azure DevOps or similar CI/CD platforms.
  • Experience with data quality frameworks and testing.
  • Azure Data Engineer or Databricks certifications.