Lead Data Platform Engineer - Databricks - IaC - Terraform - Azure Data Factory - Data Lakehouse

The Data Platform Engineer designs, develops, automates, and maintains secure, scalable, and compliant data platforms that enable the firm to efficiently manage, analyse, and utilise data. The role ensures that data solutions are robust and reliable while meeting regulatory obligations and safeguarding client confidentiality.

Key Responsibilities
- Design and architect scalable, secure, and compliant data platforms and solutions, producing technical documentation and securing approvals through governance bodies such as Architecture Review Boards.
- Build and deliver robust data solutions using Databricks, PySpark, Spark SQL, Azure Data Factory, and Azure services.
- Develop APIs and write efficient Python, PySpark, and SQL code to support data integration, processing, and automation.
- Implement and manage CI/CD pipelines and automated deployments using Azure DevOps to enable reliable releases across environments.
- Develop and maintain infrastructure as code (e.g., Terraform, ARM) to provision and manage cloud resources, including ADF pipelines, Databricks assets, and Unity Catalog components.
- Monitor, troubleshoot, and optimise data platform performance, reliability, and costs, identifying bottlenecks and recommending improvements.
- Create dashboards and observability tools to report on platform performance, usage, incidents, and operational KPIs.

Knowledge, Skills & Experience
- Degree in Computer Science, Data Engineering, or a related field.
- Proven experience designing and building cloud-based data platforms, ideally within Azure.
- Strong hands-on expertise with Databricks, PySpark, Spark SQL, and Azure Data Factory.
- Solid understanding of Data Lakehouse architecture and modern data platform design.
- Proficiency in Python for data engineering, automation, and data processing.
- Experience developing and integrating REST APIs for data services.
- Strong DevOps experience, including CI/CD, automated testing, and release management for data platforms.
- Experience with Infrastructure as Code tools such as Terraform or ARM templates.
- Knowledge of data modelling, ETL/ELT pipelines, and data warehousing concepts.
- Familiarity with monitoring, logging, and alerting tools (e.g., Azure Monitor).

Desirable
- Experience with additional Azure services (e.g., Fabric, Azure Functions, Logic Apps).
- Knowledge of cloud cost optimisation for data platforms.
- Understanding of data governance and regulatory compliance (e.g., GDPR).
- Experience working in regulated or professional services environments.
Apr 21, 2026
Full time
Description

We are seeking an experienced Technical Architect to support the design and evolution of large-scale, cloud-based data platforms across our portfolio of clients. The Technical Architect will play a key role in shaping solution design patterns, ensuring alignment with established standards, and supporting strategic transitions and migrations between AWS and Azure.

Key Responsibilities
- Define and evolve technical architecture patterns for data ingestion, processing, and access.
- Design scalable, resilient, and cost-efficient data solutions within a hub-and-spoke model.
- Support the design of new data ingestion pipelines (batch and real time).
- Ensure alignment with organisational architectural standards and governance frameworks.
- Contribute to target architecture roadmaps.
- Provide architectural guidance across:
  - Data ingestion (Kafka, APIs, SFTP)
  - Data processing (PySpark, EMR, Glue)
  - Storage (S3 and data lake patterns)
- Collaborate with DevOps, Data Engineers, and Testers to ensure cohesive delivery.
- Promote engineering best practices, including CI/CD, infrastructure as code, and observability.
- Ensure robust handling of schema evolution and upstream data changes.
- Support onboarding of new data sources and services into the platform.
- Ensure solutions meet requirements for:
  - Data quality and consistency
  - Performance and scalability
  - Security and compliance
- Work within defined data modelling ownership boundaries where applicable.
- Support cloud strategy evolution, avoiding platform lock-in and ensuring portable, future-proof designs.
- Contribute to technical decision-making for future platform direction.
- Work in blended, cross-functional teams, providing technical leadership and mentoring to delivery teams.
- Ensure effective knowledge transfer and capability uplift.

Required Skills & Experience
- Strong experience designing modern cloud-based data platforms.
- Hands-on architectural experience with:
  - AWS (essential): S3, EMR, Glue
  - Kafka/event-streaming architectures
  - Python- and PySpark-based data processing
- Experience designing data ingestion pipelines (batch and real time).
- Proficiency in Infrastructure as Code (Terraform).
- Experience with GitHub-based workflows and CI/CD pipelines.
- Experience with data lake and lakehouse architectures.
- Strong understanding of data ingestion patterns, data transformation and curation layers, and data access and productisation.
- Ability to design for large-scale datasets.
- Experience supporting cloud migrations.
- Knowledge of and experience with Azure, Microsoft Fabric, and Databricks would be beneficial.
- Familiarity with event-driven and streaming-first architectures at scale.
- Strong stakeholder engagement and cross-team collaboration skills.
- Ability to operate effectively within existing governance and standards.
- Pragmatic decision-making, balancing delivery pace and technical quality.
- Clear communicator, able to translate complex architecture into actionable guidance.
- Experience working in large, complex enterprise environments.

This role will require the ability to obtain and hold UK SC clearance.
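The responsibilities above call for robust handling of schema evolution and upstream data changes. As an illustrative sketch only (the field names, types, and tolerance rules are hypothetical, not from this posting), an additive-only reconciliation step for incoming records might look like:

```python
# Hypothetical expected schema for an incoming feed (illustrative only).
EXPECTED_SCHEMA = {"id": int, "amount": float, "currency": str}


def reconcile(record: dict, expected: dict = EXPECTED_SCHEMA) -> dict:
    """Coerce an upstream record to the expected schema, tolerating additive change.

    - Missing expected fields become None and are flagged for quarantine.
    - Unknown upstream fields are preserved under 'extras' rather than dropped,
      so additive schema evolution does not break downstream consumers.
    - Type mismatches are coerced where safe, otherwise flagged.
    """
    out, extras, errors = {}, {}, []
    for field, typ in expected.items():
        if field not in record:
            out[field] = None
            errors.append(f"missing:{field}")
            continue
        value = record[field]
        try:
            out[field] = value if isinstance(value, typ) else typ(value)
        except (TypeError, ValueError):
            out[field] = None
            errors.append(f"type:{field}")
    for field, value in record.items():
        if field not in expected:
            extras[field] = value  # new upstream column: keep it, don't fail
    out["extras"], out["errors"] = extras, errors
    return out
```

In a real platform this contract would typically live in a schema registry and the equivalent logic would run inside the ingestion pipeline (e.g. as a Spark transformation), with flagged records routed to a quarantine path.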
Apr 20, 2026
Full time
Data Science & Measurement Lead

Because your new ideas are our new ways of working. Evolve, your way.

We are seeking a Data Science & Measurement Lead to manage and grow a team of data scientists responsible for building advanced analytics, predictive models, and measurement solutions across Primark. This is a hands-on role requiring strong technical depth in Databricks, Apache Spark, and SQL.

What You'll Get
People are at the heart of what we do here, so it's essential we provide you with the right environment to perform at your very best. Let's talk lifestyle:
- Healthcare, pension, and potential bonus.
- 27 days of leave, plus bank holidays, and if you want, you can buy 5 more.
- Because Primark is all about tailoring to you, we offer Tax Saver Tickets, a fitness centre, and a subsidised cafeteria.
- This role is a hybrid opportunity, offering 1-2 days working from home.

What You'll Do as a Data Science & Measurement Lead
We want you to feel challenged and inspired. Here, you'll develop your skills across a range of responsibilities:
- Lead a data science team to deliver machine learning models, experimentation frameworks, and measurement solutions that drive measurable business impact.
- Design, build, and deploy end-to-end ML pipelines and workflows using Databricks, Spark, Python, SQL, and PySpark.
- Ensure robust operationalisation of models through scalable, reliable data pipelines and production-ready ML systems.
- Partner closely with engineering teams to optimise distributed compute workloads and uphold data quality, monitoring, and governance standards.
- Establish and drive best practices in model reproducibility, experiment tracking, and end-to-end ML lifecycle management.
- Act as a trusted advisor by sharing deep technical expertise, developing team capability, and managing complex delivery plans.
- Leverage strong retail domain experience, ideally within apparel or grocery, to translate business needs into effective data-driven solutions.

What You'll Bring
Here at Primark, we want everyone to feel valued - so please bring your authentic self to work, of course with some other key experience and abilities for this role in particular:
- Extensive hands-on experience with Databricks, Apache Spark, advanced SQL, and cloud-based lakehouse architectures (Azure, AWS, or GCP), with a strong foundation in statistical modelling and machine learning techniques.
- Proven ability to deliver measurable commercial value through retail-focused data science use cases such as demand forecasting, pricing and promotion effectiveness, allocation, stock optimisation, and waste or shrink reduction.
- Strong experience in experimental design and causal inference (e.g., A/B testing, quasi-experiments), with a clear focus on quantifying incremental value and ensuring insights translate into action.
- Demonstrated experience taking models from prototype to production, establishing clear success metrics, monitoring, and governance, and driving adoption across commercial and operational teams.
- Ability to shape and prioritise the data science roadmap by balancing business value, data readiness, and delivery risk; applies sound commercial judgement informed by market and industry trends.
- Proven people leader with experience mentoring and developing high-performing data science teams; communicates complex technical concepts clearly to non-technical stakeholders and acts as a trusted advisor to the business.

Does this sound like you? Great, because we can't wait to see what you'll bring. You'll be supported within a team of equally capable people, celebrating who you are and helping you reach your potential. At Primark, we're excited about our future - and we're excited to develop yours.

About Primark
At Primark, people matter. They're the beating heart of our business and the reason we've grown from our first store in Dublin in 1969 to a £9bn+ turnover business with over 80,000 colleagues and over 440 stores in 17 countries today.

Our values run through everything we do. In essence, we're Caring and always strive to put people first. We're also Dynamic, bravely pushing the boundaries to stay ahead. And finally, we succeed Together.

If you need any reasonable adjustments or have an accessibility request during your recruitment journey, such as extended time or breaks between online assessments, a sign language interpreter, mobility access, or assistive technology, please contact your talent acquisition specialist.

All offers of employment are subject to background checks, including right to work, reference, education and, for some roles, criminal and financial checks. If you have any concerns, please reach out to our talent acquisition team to discuss.

Our fashion isn't one size fits all and neither is our culture. Primark promotes equal employment opportunity; we strive to create an inclusive workplace where people can be themselves, access opportunities, and thrive together.

REQ ID: JR-7582
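The role above asks for experimental design and causal inference skills (e.g., A/B testing) with a focus on quantifying incremental value. As a minimal sketch of that kind of measurement, and using entirely made-up conversion counts rather than any real data, a two-proportion z-test for the lift of a variant over a control might look like:

```python
from math import sqrt
from statistics import NormalDist


def ab_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-proportion z-test: does variant B's conversion rate differ from control A's?

    Returns the absolute lift (B minus A) and a two-sided p-value
    under the pooled-variance normal approximation.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_b - p_a, p_value
```

A significant lift is only the starting point; translating it into incremental value means multiplying the lift by the eligible population and the per-conversion margin, which is where the commercial judgement the posting describes comes in.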
Apr 19, 2026
Full time
We seek an experienced Associate Director, Gen AI Architect, specialising in the Energy, Resources & Industrials (ER&I) sector, to join our AI & Data team. This role is pivotal in driving the adoption and implementation of Gen AI solutions within ER&I.

Generative AI is transforming ER&I, offering unprecedented opportunities for optimization, enhanced decision-making, and new revenue streams. A robust Gen AI strategy is crucial for realizing this potential and gaining a competitive advantage. Our team delivers cutting-edge Gen AI solutions enabling ER&I clients to thrive. ER&I organizations are adopting innovative approaches to model building, customization, and data management, including transfer learning and robust data governance. A well-designed Gen AI platform is at the heart of our clients' GenAI CoE strategy, and the Associate Director, Gen AI Architect role is crucial to shaping and executing this vision.

Our AI & Data team specializes in implementing Gen AI solutions that drive tangible value for ER&I clients by:
- Identifying Gen AI opportunities aligned with client strategy.
- Gathering detailed requirements.
- Designing scalable Gen AI platforms and architectures.

Join Deloitte for exceptional training, growth, and a dynamic team environment. We encourage flexible working arrangements. If this opportunity interests you, please discuss it with us.

Responsibilities
- Designing Gen AI Architectures: Define end-to-end Gen AI architectures aligned with client business objectives and technology strategies.
- Advising on Gen AI Applications: Guide ER&I clients on leveraging Gen AI to address their challenges and objectives.
- Establishing Common AI Language: Foster executive-level discussions to establish a common understanding of AI/Gen AI terminology.
- Creating Gen AI Roadmaps: Develop strategic roadmaps for Gen AI capabilities to generate value from data and AI.
- Assessing Systems & Proposing Solutions: Evaluate existing systems and recommend target Gen AI architectures using AI technologies and cloud platforms.
- Leading & Mentoring Teams: Lead diverse global teams, fostering an inclusive and valued team culture.
- Managing Stakeholders & Change: Support change management processes to ensure successful Gen AI adoption.
- Developing Market Offerings: Assist in developing market-leading Gen AI solutions and proposals.
- Contributing to AI Community: Contribute to the development and growth of our AI and Data Architecture community.
- Driving Project Delivery: Drive client project delivery by owning workstreams and ensuring successful engagements.
- Developing Team Members: Develop junior team members through on-the-job training.

Qualifications
- Consulting or ER&I Experience: Client-facing project experience in consulting or direct ER&I industry roles. Proven contribution to proposals, presentations, pre-sales, and opportunity development.
- ER&I Industry Domain Knowledge: In-depth expertise in ER&I functional areas (Engineering, Operations, Sustainability, Regulatory Compliance, etc.).
- Deep GenAI Architecture Expertise: Extensive technical architecture experience in GenAI, AI, or Enterprise Architecture, ideally within consulting or industry.
- Strong Problem-Solving & Analytical Skills: Excellent problem-solving and analytical skills applied to complex GenAI challenges.
- Executive Stakeholder Management: Strong executive-level stakeholder management and communication skills; ability to build robust client relationships.
- Leadership & Team Development: Proven leadership in building and developing high-performing, diverse GenAI architecture teams, nurturing junior talent.
- Designing & Implementing Complex GenAI Solutions: Excellent understanding and experience designing and implementing complex GenAI solutions, including several of the following areas:
  - GenAI model integration and deployment.
  - Prompt engineering and model customization.
  - AI/GenAI governance and ethics (bias detection, explainability).
  - GenAI platform and infrastructure architecture (cloud, lakehouse).
  - GenAI ModelOps and performance monitoring.
  - AI-driven business intelligence and reporting.
  - Observability and FinOps for AI/GenAI.
  - Cloud infrastructure, networking, and security for AI.
- Aligning GenAI Architectures Across Organizations: Experience aligning GenAI architecture blueprints across business units and geographies with peers and senior architects.
- Presenting GenAI Architectural Designs: Experience presenting GenAI architectural designs to diverse stakeholders, including technical authorities and architecture boards.
- Architectural Evaluation of GenAI Systems: Experience evaluating, designing, and analysing enterprise-wide systems incorporating GenAI, both on-premise and cloud-based.
- Defining Business Outcomes for GenAI Programs: Experience engaging with business and IT stakeholders to document business outcomes and objectives for large-scale GenAI solutions and programs.
- Technology & Platform Recommendations for GenAI: Ability to identify requirements, analyse technology alternatives, and recommend build vs. buy for GenAI platforms and solutions.
- Facilitating GenAI Discovery & Design Workshops: Proven ability to conduct effective discovery and design workshops focused on GenAI solutions.
- Rapid Learning & Application of GenAI: Demonstrated ability to quickly learn and apply new GenAI techniques and knowledge to achieve business outcomes.
- Leading Resilient GenAI Project Teams: Experience leading multi-disciplinary teams in fast-paced GenAI projects; demonstrates personal resilience.
- Go-to-Market & Proposal Development for GenAI: Ability to lead go-to-market activities, including RFI/RFP responses and developing high-quality GenAI-focused proposals.
- GenAI Design Leadership: Has led technical design authorities for strategic GenAI adoption.
- Strategic GenAI Platform Selection: Strategic GenAI platform and tool evaluation and selection skills.
- Leading GenAI Trends: Up to date on emerging GenAI technologies and standards.
- AI Regulatory Landscape (ER&I): Understands AI regulations; ensures project compliance.
- Cloud & Advanced LLM Architectures: Cloud expertise (AWS/Azure/GCP); familiarity with emerging LLM architectures.
- GenAI Frameworks & Platforms: Proficient with Data & AI platforms (Azure AI, Vertex AI, Databricks, Hugging Face), advanced GenAI frameworks (LangChain, HF Transformers, LlamaIndex), and agentic architectures (LangGraph, SmolAgents, PydanticAI).
- Vector DBs & RAG: Has designed solutions using vector databases and Retrieval-Augmented Generation (RAG) for knowledge applications.
- GenAI ModelOps/MLOps & Governance: GenAI ModelOps/MLOps knowledge with an ethical AI governance focus.
- ER&I GenAI Applications: Has applied GenAI to ER&I use cases to create business value.
- Enterprise Software Integration: Has designed GenAI integrations with SaaS/ERP systems for business process automation.
- GenAI Impact Reporting: Has designed advanced reporting for measuring GenAI impact and generating actionable insights.
- Strategic Project Sizing: Proven strategic project sizing and shaping for large Gen AI programs in ER&I.
- Global Team Leadership: Has managed global/offshore teams effectively for Gen AI projects.
- Agile Delivery & Client Engagement: Agile project management expertise for rapid GenAI solution delivery; has led client workshops.
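The qualifications above mention vector databases and Retrieval-Augmented Generation (RAG). As a toy illustration of the retrieval step only, with bag-of-words term counts and cosine similarity standing in for a real embedding model and vector database, and with entirely made-up passages:

```python
from collections import Counter
from math import sqrt


def embed(text: str) -> Counter:
    """Toy 'embedding': a bag-of-words term-frequency vector (stand-in for a real model)."""
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


def retrieve(query: str, passages: list[str], k: int = 1) -> list[str]:
    """Return the k passages most similar to the query. In a real RAG system these
    would come from a vector DB and be prepended to the model prompt as grounding."""
    q = embed(query)
    ranked = sorted(passages, key=lambda p: cosine(q, embed(p)), reverse=True)
    return ranked[:k]
```

Production systems replace the toy pieces with learned embeddings, approximate nearest-neighbour search in a vector store, and a generation step that conditions the LLM on the retrieved context; the retrieve-then-ground shape stays the same.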
Apr 17, 2026
Full time
Job Description
Role: Data Delivery Manager/Scrum Master (Databricks + Salesforce)
Location: UK (Flexible/Hybrid)
Salary: £65,000 - £70,000

Overview
We are seeking an experienced Data Delivery Manager/Scrum Master to lead the successful delivery of data and CRM initiatives. This role focuses on managing Databricks-based data platforms and Salesforce-integrated solutions, ensuring seamless coordination between technical teams and business stakeholders. You will play a key role in driving delivery excellence, maintaining Agile practices, and ensuring high-quality outcomes across multiple parallel workstreams.

Key Responsibilities
- Lead end-to-end delivery of data platform projects, with a focus on Databricks
- Act as Scrum Master for Agile teams (stand-ups, sprint planning, retrospectives)
- Coordinate across data engineering, analytics, and Salesforce teams
- Manage project plans, timelines, risks, and dependencies
- Drive stakeholder communication across business and technical functions
- Ensure alignment between data platform outputs and CRM (Salesforce) use cases
- Monitor and report on KPIs, delivery metrics, and progress
- Facilitate continuous improvement in Agile delivery processes

Required Skills & Experience
- Proven experience as a Delivery Manager/Scrum Master
- Strong background delivering data platform or data engineering projects (Databricks preferred)
- Experience working with Salesforce or CRM environments
- Solid understanding of Agile/Scrum methodologies
- Strong stakeholder management and communication skills
- Ability to manage multiple concurrent workstreams

Nice to Have
- Experience with Azure or AWS data platforms
- Understanding of data engineering concepts (ETL, pipelines, lakehouse architecture)
- Scrum certifications such as CSM or PSM
Apr 17, 2026
Full time
Hiring: Lead Analytics Engineer (Databricks/Fabric)
We're working with a forward-thinking organisation looking to bring in a Lead Analytics Engineer to shape and drive their data strategy, building a scalable, business-critical analytics layer.

Role Details:
Salary: £80,000 - £85,000
Location: London (3 days onsite)
Reporting to: Principal for Data & AI

What You'll Be Doing:
- Lead the design and orchestration of data pipelines, semantic models, and analytics layers
- Build Fabric workflows and move business logic into the Gold layer (Medallion Architecture)
- Develop DBT models to create a scalable and trusted source of truth
- Design and implement Fact & Dimension (F&D) data models for analytics and reporting
- Own and optimise Databricks workflows, ensuring performance and scalability
- Drive decision-making frameworks through high-quality, well-modelled data
- Conduct root cause analysis, optimise ETL pipelines, and improve data reliability

Key Skills & Experience:
- Strong experience with Databricks (Spark, Delta, Workflows, Unity Catalog)
- Hands-on expertise in SQL & Python (Databricks notebooks, Pandas)
- Proven experience building data warehouses/lakehouses for BI & analytics
- Deep understanding of data modelling principles (Star Schema, SCD, CDC, Fact & Dimension design)
- Experience with DBT, CI/CD pipelines, and Azure DevOps
- Exposure to Microsoft Fabric & Semantic Models
- Strong analytical mindset with the ability to translate data into business insights

Why Join?
- Be the technical authority shaping the organisation's data & analytics layer
- Work with modern platforms including Databricks & Microsoft Fabric
- Collaborate with a high-performing Data & AI team in a strategic role
Apr 16, 2026
Full time
Data Engineer - £45,000 - Remote

About the Role
As a Data Engineer, you will be a key member of our agile delivery team, working closely with clients to unlock the full value of their data. This is a hands-on role where you'll be designing and implementing data solutions using cutting-edge tools in the Azure ecosystem. You'll have the opportunity to develop your technical expertise while contributing to high-impact projects across a variety of industries. We operate with a Winning from Anywhere® philosophy, offering flexibility in where you work while maintaining a strong team culture through regular client site visits, company events, and collaboration opportunities.

Key Responsibilities
- Deliver end-to-end data solutions, including acquisition, engineering, modelling, analysis, and visualisation.
- Lead and participate in workshops to gather requirements and engage with clients on both technical and business levels.
- Design and implement scalable, robust ETL/ELT pipelines using Microsoft/Azure technologies such as Azure Data Factory, Synapse, Databricks, or Fabric.
- Build and optimise data lake solutions using medallion architecture.
- Support cloud migration of on-premises SQL Server-based data platforms (SQL, SSIS, SSAS, SSRS).
- Develop reports, dashboards, and analytics solutions using Power BI.
- Provide ongoing support and enhancements to solutions post-deployment.

Skills and Experience
Essential:
- Proven experience in a Data Engineering or Data Warehouse Development role.
- Strong hands-on expertise with Azure/Microsoft and/or SQL Server technology stacks.
- Proficiency in ETL/ELT development using tools like Azure Synapse, Data Factory, Databricks, or Fabric.
- Advanced SQL and Python skills (DDL, DML, Stored Procedures, Notebooks).
- Understanding of lakehouse architecture and medallion design principles.
- Ability to work with large, complex datasets from multiple sources.
- Strong knowledge of BI and data warehousing concepts.
- Experience with Power BI for reporting and data visualisation.
- Excellent communication and client engagement skills.

What We Offer
- Remote-first working model (Winning from Anywhere®)
- 25 days annual holiday
- Monthly home working allowance
- Set-up allowance for home office
- 24/7 virtual GP access
- Employee Assistance Programme (available 24/7)
- Company sick pay scheme
- Life assurance (4x base salary)
- Private health insurance after 1 year of service
- Enhanced parental leave and pay
- Cyclescheme and electric car scheme
- Opportunity to work with a 3-Star World Class Best Company*

To apply for this role please submit your CV or contact Dillon Blackburn (see below). Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
Oct 07, 2025
Full time
Your new company
A Principal Data Engineer/Architect is required on a permanent basis for a forward-thinking organisation at the heart of Leeds. The Data Services team are on a mission to unlock the value of data by delivering high-quality, secure, and accessible data services. With a focus on modern cloud-based technologies and strong partnerships, they help colleagues navigate the complexities of a data-driven world.

Your new role
As a Principal Data Engineer, you will be instrumental in shaping the organisation's strategic cloud data platform. You'll lead the design and implementation of scalable data pipelines, drive innovation in data-centric products, and champion automation and predictive analytics. This is a senior technical leadership role where you'll establish best practices, ensure compliance, and deliver smart, customer-focused solutions.

What you'll need to succeed
You'll bring extensive experience in data engineering within Azure environments, with a strong track record in modernisation and large-scale migration projects. You'll be confident designing metadata-driven frameworks and managing Databricks environments, with hands-on expertise in Python, T-SQL, and PySpark. Your leadership and mentoring skills will be key, alongside your ability to collaborate across teams and drive strategic decisions.

Essential Skills Include:
- Proven leadership and mentoring experience in senior data engineering roles
- Expertise in Azure Data Factory, Azure Databricks, and lakehouse architecture
- Strong programming skills (Python, T-SQL, PySpark) and test-driven development
- Deep understanding of data security, compliance, and tools like Microsoft Purview
- Excellent communication and stakeholder management skills
- Experience with containerisation and orchestration (eg, Kubernetes, Azure Container Instances) would be desirable
- AI/ML integration within data platforms would be advantageous

What you'll get in return
You'll be part of a dynamic and inclusive team, working on cutting-edge data solutions that make a real impact. The organisation offers a competitive salary up to £81K, excellent benefits including an 8% cash payment on top of the salary, a bonus scheme, and opportunities for professional development. You'll also enjoy flexible working arrangements, generous annual leave, a public sector pension, and a supportive environment that values innovation and collaboration.

What you need to do now
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now. Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers which can be found on our website.
Oct 06, 2025
Full time