• Home
  • Find Jobs
  • Register CV
  • Advertise jobs
  • Employer Pricing
  • IT Jobs
  • Sign in
  • Sign up
Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

65 jobs found

Current search: databricks data engineer
Big Red Recruitment
Data Analyst FTC
Big Red Recruitment Coventry, Warwickshire
Turn data into decisions that directly impact revenue, pricing, and performance. We are hiring a Data Analyst on a 6-month Fixed Term Contract to play a key role in transforming how a major UK contracts division uses data across sales, margin, and commercial performance.

About the client: You will be joining a large, multi-brand UK organisation operating within a complex distribution environment. With multiple business units and evolving systems, the organisation is investing heavily in improving how data is accessed, trusted, and used. This is a business in transition, moving towards a more modern, insight-driven approach to decision making.

Project overview: This role sits at the centre of a transformation focused on improving visibility of margin, pricing, and sales performance. You will help redefine how analytics supports the business, working closely with stakeholders to shape reporting priorities and deliver insights that drive action. The work will also contribute to building a scalable analytics capability that can be rolled out more widely across the organisation.

What you will be doing: You will act as the link between business stakeholders and data teams, translating complex commercial requirements into clear data and reporting solutions. A core part of the role will involve analysing sales, pricing, and margin performance, identifying key drivers, and turning these into actionable insights. You will develop a strong understanding of rebate-driven margin structures and ensure that changes to pricing and margin logic are accurately reflected in reporting. You will also support the transition from legacy tools into modern data models, ensuring historical and current reporting remains consistent and reliable. Alongside analysis, you will design and build scalable dashboards and reports, enabling business users to access clear, consistent insights without relying on manual processes. You will work closely with Data Engineers to define reusable datasets and ensure data models support flexible, self-serve analytics.

Tech environment: You will work with tools such as Excel, Power BI, and modern data platforms, with exposure to enterprise data environments and evolving cloud-based solutions.

What we are looking for: We are looking for someone with strong experience in data analysis, business intelligence, or commercial analytics. You should have a proven ability to work with pricing, margin, or financial data, and be confident using Excel at an advanced level alongside BI tools such as Power BI. You will need to be comfortable translating business needs into data solutions and working closely with stakeholders across the organisation. Strong communication and the ability to influence decision making through insight are key.

Nice to have: Experience working in large-scale data environments or transformation programmes would be beneficial, as would familiarity with tools such as Phocas, Azure Synapse, or Databricks. Experience within retail, distribution, or similar sectors is also advantageous.

Why join: This is an opportunity to play a central role in shaping how data is used within a key part of the business. Your work will directly influence commercial decisions and help set the standard for future analytics across multiple brands.

Role: Data Analyst
Duration: 6-month FTC, with the chance to renew or go permanent
Salary: Up to £50,000
Location: Warwickshire / Hybrid
Apr 02, 2026
Full time
Government Digital & Data
Lead Technical Architect (Solutions) - Infected Blood Compensation Authority - G7
Government Digital & Data
Location: Glasgow, Newcastle-upon-Tyne

About the job

Job summary: The Infected Blood Compensation Authority (IBCA) is responsible for delivering a long-awaited compensation scheme that provides financial compensation to victims of infected blood on a UK-wide basis. We are looking for an experienced Lead Technical Architect who can blend knowledge of data platforms with data security, governance and strategic thinking. You will be confident working in a complex and pressured data delivery environment across products, supporting IBCA's strategic objective of paying compensation seamlessly to those impacted by the infected blood scandal. You will collaborate across the architecture function in data and digital to support the delivery of solutions, creating options and recommendations and providing expert advice to drive technology choices. Working at IBCA gives you a huge opportunity to make an impact on those who deserve compensation, and this role will shape and support the development of safe and secure data solutions that provide a single source of truth for those going through their compensation journey. Successful applicants will join the Civil Service Pension Scheme. Please note that IBCA's mission means it is likely to be operational for approximately 5 to 7 years; when IBCA's work begins to wind down, employees will receive support and practical guidance to find a new role, whether in the Civil Service, another Arm's Length Body (ALB), or with an external employer.

Job description: As a Lead Technical Architect (Solutions) you will work on multiple projects across the data delivery programme, on problems that require broad architectural thinking. This post sits within a data delivery setting focused on AWS, Quantexa and Databricks.

You will:
  • Lead the technical solution design of data systems and services, justifying and communicating design decisions
  • Assure data services and system quality, ensuring the technical work fits into the data platform and wider IBCA technology strategy
  • Drive continuous improvement in system reliability, performance, and security
  • Regularly collaborate and find agreement with senior stakeholders, providing direction and challenge
  • Work proactively with Product Managers and Engineers to identify problems and translate these into scalable technical solutions
  • Participate in architecture reviews and technical workshops

Responsibilities:
  • Secure Cloud & Platform Architecture (AWS focus): lead and optimise the design of resilient, high-availability AWS environments, ensuring that every infrastructure component is built with a "security-by-design" approach to protect against evolving threats
  • Architecture Patterns & Standards: establish and uphold the technical standards for the organisation, creating reusable patterns and governance frameworks that ensure consistency and quality across all engineering squads
  • Comprehensive Data Management Design: architect the end-to-end lifecycle of data systems and services, ensuring the underlying infrastructure supports high-performance analytics and robust data sovereignty
  • Digital Service Integration & API Management: oversee the connectivity of modern digital services by designing scalable API layers and integration strategies that allow internal and external systems to communicate securely and efficiently
  • Agile Delivery & Stakeholder Translation: bridge the gap between complex technical roadmaps and strategic business goals, communicating technical trade-offs to non-technical stakeholders to ensure smooth, iterative delivery

Person specification

Essential:
  • Architecture design of highly scalable and secure data platforms hosted on AWS, including expertise in configuring and integrating data management systems such as Databricks and Quantexa
  • Architecture design informed by principles, patterns, technical radars, practices and standards
  • Data Storage Design & Management: expertise across diverse storage technologies including SQL, NoSQL and data lakehouse
  • Data Architecture Design: data modelling (conceptual to physical), Master Data Management (MDM), metadata management, data lineage and data governance
  • API Management: designing and implementing robust integration patterns (including end-to-end secure API management and gateways) that reliably connect data platforms to wider digital services and consuming systems
  • Service Architecture Design: monitoring, logging and observability, including patching, business continuity and disaster recovery
  • Experience of working within and across product teams in Agile environments throughout the full delivery lifecycle, from inception to go-live
  • Translating business requirements into tangible, compliant technical solutions with traceability to user value that can be demonstrated to stakeholders

Desirable:
  • Experience of architecture design for the Quantexa Decision Intelligence Platform
  • Security Architecture Design: data security, identity & access management, cloud security, network security and DevSecOps
  • Strong knowledge of DevOps practices and Infrastructure-as-Code
Apr 02, 2026
Full time
Involved Solutions
Lead Data Analyst - up to £70,000 + Bonus + Benefits - Hybrid
Involved Solutions Esher, Surrey
Lead Data Analyst
Salary: Up to £70,000 + Benefits
Location: Esher - Hybrid Working
Hours: Full time - Permanent

A large, well-established firm has recently implemented Microsoft Fabric and is now seeking a Lead Data Analyst to take ownership of the organisation's data and analytics capability. This role will lead the development of the company's data platform, ensuring data is transformed into meaningful insights that support decision-making across the business. The Lead Data Analyst will work across the full data lifecycle, from ingestion and modelling through to reporting and visualisation, while also managing and mentoring another Data Analyst/Engineer. The position is ideal for someone who enjoys combining hands-on technical delivery with leadership responsibility, advising on data strategy while building scalable BI solutions.

Responsibilities:
  • Own the organisation's data and BI capability following the implementation of Microsoft Fabric
  • Design and develop high-quality Power BI dashboards and reporting solutions
  • Develop and maintain data pipelines, integrations and data flows within Microsoft Fabric and Azure
  • Integrate data from third-party systems and internal platforms into the data lake environment
  • Build scalable data models and semantic layers for business reporting
  • Build and optimise SQL queries, data models and dimensional schemas for reporting
  • Support the continued growth of the organisation's data lake and analytics platform
  • Analyse and interpret data to identify trends and insights that support business decision-making
  • Work with business stakeholders to understand data needs and deliver actionable insights
  • Manage and prioritise the analytics backlog to ensure work aligns with business value
  • Lead and mentor a Data Analyst/Engineer while driving best practices across the data function

Essential skills:
  • Strong Power BI expertise, including DAX
  • Experience working with Microsoft Fabric
  • Knowledge of Azure Synapse, Databricks and Spark
  • Strong SQL capability for querying, shaping and modelling data
  • Experience building ETL/ELT pipelines and integrating data from APIs, files and databases
  • Experience with cloud data services within Azure environments
  • Strong stakeholder engagement skills and the ability to translate data insights for business audiences

Desirable skills:
  • Experience with Python or another analytics-focused programming language
  • Experience working with Azure Data Lake, Azure Functions or Service Bus
  • Experience managing or mentoring analysts or engineers
  • Knowledge of data governance, security and BI deployment best practices

If you are a data professional looking to take ownership of a modern analytics platform and shape how data drives decision-making across a business, please apply for the Lead Data Analyst position immediately.
Apr 02, 2026
Full time
Client Server
Data Engineer Python SQL Spark
Client Server
Data Engineer (Python, SQL, Spark, Azure Databricks) - London, to £160k

Are you a tech-savvy Data Engineer with a first-class education? You could be progressing your career working on complex and challenging systems at a hedge fund with over $17 billion under management.

What's in it for you:
  • Salary to £160k
  • Significant bonus earning potential
  • Fund performance share
  • Personal training budget and mentoring
  • Family-friendly benefits, including unlimited emergency backup childcare as well as care for elderly relatives
  • Various social groups, including sports teams
  • Private healthcare and wellness activities

Your role: As a Data Engineer you will join a small team responsible for understanding, managing and transforming raw data content from various third parties for the trading team, investment quants and investment desk. Typical responsibilities will include combining and transforming raw data into useful insights, analysis and visualisations; interrogating various vendor data endpoints to source and analyse data; and ensuring data consistency, completeness and accuracy across all platforms. You'll develop data dictionaries and other documentation and collaborate with technology teams to implement and enhance data systems and processes, keeping up to date with industry trends and emerging technology in data content and tooling.

Location / WFH: You'll join the team in fantastic London (Soho) offices that offer a wide range of facilities, including a nutritionally balanced breakfast, lunch and all-day snacks. Please note this role is full-time office based (Monday to Friday), with some flexibility on an ad hoc basis if needed.

About you:
  • You have an outstanding record of academic achievement: a first-class degree in a STEM discipline from a top-tier university (e.g. Russell Group or a top-100 global university), backed by A grades at A-level
  • You have experience in a similar Data Engineer role at a hedge fund or investment bank, and a good understanding of financial markets and investment management
  • You have strong technical skills with Python or C# and SQL, with experience of version control and contributing to a shared codebase
  • You have experience with modern data tools and technologies; Apache Spark and Azure Databricks are preferred
  • You have strong knowledge of data management principles and best practices
  • You have experience with data analysis and visualisation tools and techniques
  • You're able to convey complex data and technical information to front-office traders
  • Ideally, you will have had exposure to BBG, Markit, Refinitiv and macro research

Apply now to find out more about this Data Engineer (Python, SQL, Spark, Azure Databricks) opportunity.
Apr 01, 2026
Full time
Robert Walters
Data Modeller
Robert Walters Manchester, Lancashire
Data Modeller
Location: Manchester
Contract: Consultant
Work Setup: Hybrid - 2 days onsite (moving to 3 days in September)

Who We Are: We are a consultancy operating within Robert Walters, the world's most trusted talent solutions business. Across the globe, we deliver recruitment, outsourcing, and talent advisory services for businesses of all sizes, opening doors for people with diverse skills, ambitions, and backgrounds.

The Role: We have an exciting new opportunity for a Data Modeller to join Robert Walters as a Consultant. As a consultant you will benefit from permanent employment with Robert Walters and will be deployed on assignment within our clients' organisations; in return, we will provide you with the opportunity to develop your skills with ongoing training and professional support. This role offers an exciting opportunity to join a global business, providing top-tier service to our blue-chip clients.

What you'll do:
  • Design, build and maintain scalable data pipelines and models in Databricks using Python to deliver reliable datasets for reporting and key business metrics
  • Develop efficient, well-structured code while adhering to technical standards, reconciliation checks, and version control practices using Git and DevOps tools
  • Partner with visualisation analysts to ensure data models are structured effectively for dashboards, reporting and insight generation
  • Work within Agile delivery teams to scope work, contribute to sprint planning and deliver outputs within agreed timelines
  • Engage with stakeholders to clarify requirements, provide progress updates and communicate technical concepts clearly to non-technical audiences
  • Continuously develop knowledge of insurance data and emerging analytics technologies to improve data solutions and support business decision-making

What you bring:
  • Strong hands-on experience with Databricks, Python and Power BI, with the ability to contribute quickly in an established environment
  • A background as a Data Modeller, Analytics Engineer, or Data Analyst with strong modelling experience
  • Experience designing scalable data models and pipelines within cloud-based data platforms
  • Proficiency with Git version control and development best practices
  • A strong analytical mindset with the ability to interpret complex datasets and produce actionable insight
  • Insurance or financial services experience preferred, with an understanding of business reporting and operational data

What's Next? If you are ready to take the next step, apply now. Successful applicants will be contacted directly by a recruiter to discuss the role further. We are committed to creating an inclusive recruitment experience; if you require support or adjustments to the recruitment process, our Adjustment Concierge Service is here to help. Please feel free to contact us at (see below) to discuss how we can support you. This position is being recruited on behalf of our client through our Outsourcing service line. Resource Solutions Limited, trading as Robert Walters, acts as an employment business and agency, partnering with top organisations to help them find the best talent. We welcome applications from all candidates and are committed to providing equal opportunities.
Apr 01, 2026
Full time
Sanderson Recruitment Plc
Staff Data Engineer
Sanderson Recruitment Plc
Role: Staff Data Engineer
Location: London Area, United Kingdom (Hybrid)
Employment Type: Full-time
Seniority: Senior/Staff Level
Salary: £95,000 + Bonus

Are you a highly experienced Data Engineer ready to lead from the front and shape the future of a modern data platform? We're looking for a Staff Data Engineer to play a pivotal role in a large-scale data transformation, driving the evolution of a cloud-native data platform. You'll work at the intersection of engineering, product, and delivery, leading complex initiatives, mentoring engineers, and making high-impact technical decisions. This is an opportunity to be a true technical leader: setting standards, influencing strategy, and building scalable, high-quality data solutions in a collaborative, forward-thinking environment.

What You'll Do
Design, build, and own robust, scalable cloud-based data solutions with a strong focus on automation.
Lead a squad of data engineers, providing technical direction, mentoring, and coaching.
Drive complex engineering decisions and take ownership of high-impact backlog items.
Develop and enforce data engineering standards, frameworks, and best practices.
Design and optimise scalable data pipelines aligned to ETL principles and business objectives.
Collaborate closely with Product and Delivery teams to deliver high-value solutions.
Champion the adoption and evolution of a modern cloud data platform.

Required Experience
Proven experience operating at Senior or Staff Data Engineer level.
Strong leadership capability with experience mentoring and developing engineers.
Deep expertise in cloud-based data platforms (preferably Azure).
Strong background in end-to-end data solution design and data warehousing principles.
Hands-on experience with ETL tools (e.g. Databricks) and data processing frameworks.
Advanced proficiency in Python, SQL, and Spark.
Ability to navigate and deliver complex engineering challenges with confidence.

What Sets You Apart
A proactive, ownership-driven mindset with strong business awareness.
Experience influencing engineering strategy and driving best practices.
Passion for building high-performing engineering teams and communities.
Ability to communicate complex technical decisions to both technical and non-technical stakeholders.

Why Join
You'll be part of an ambitious transformation, helping to build a best-in-class data engineering function within a business that is redefining its digital future. You'll work with cutting-edge technologies, solve meaningful problems at scale, and play a key role in democratising data across the organisation, all while doing some of the most impactful work of your career. If you're a senior technical leader who thrives on ownership, mentorship, and solving complex data challenges, get in touch for a confidential conversation.

Reasonable Adjustments:
Respect and equality are core values to us. We are proud of the diverse and inclusive community we have built, and we welcome applications from people of all backgrounds and perspectives. Our success is driven by our people, united by the spirit of partnership to deliver the best resourcing solutions for our clients. If you need any help or adjustments during the recruitment process for any reason, please let us know when you apply or talk to the recruiters directly so we can support you.
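The ETL principles this advert refers to can be pictured as three distinct, testable stages: extract raw records, transform them into a clean conformed shape, then load them into a target structure. A deliberately tiny sketch in plain Python follows; the field names and cleaning rules are illustrative assumptions only, in a real role these stages would run over Spark DataFrames:

```python
def extract(raw_records):
    # Extract: in practice this would read from cloud storage or a database.
    return list(raw_records)

def transform(records):
    # Transform: standardise casing, coerce types, drop rows missing a key field.
    return [
        {"customer": r["customer"].strip().title(), "value": float(r["value"])}
        for r in records
        if r.get("customer") and r.get("value") is not None
    ]

def load(records, warehouse):
    # Load: accumulate into a target structure keyed by customer.
    for r in records:
        warehouse[r["customer"]] = warehouse.get(r["customer"], 0.0) + r["value"]
    return warehouse

warehouse = {}
raw = [
    {"customer": "  alice smith ", "value": "10.5"},
    {"customer": "bob jones", "value": "4"},
    {"customer": "", "value": "99"},  # dropped by the transform step
]
load(transform(extract(raw)), warehouse)
```

Keeping the stages separate is what makes pipelines like this straightforward to unit-test and reason about.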
Apr 01, 2026
Full time
Sanderson Recruitment Plc
Data Engineer
Sanderson Recruitment Plc Bristol, Somerset
Data Engineer
Industry: Not-for-profit (Family Services)
Location: Bristol/South Gloucestershire (Free parking available)
Salary: £45,000 - £48,000 + benefits
Permanent role: hybrid working (3 days on site)

Data Engineer - Role Purpose
We are embarking on a significant transformation of our data and analytics capabilities and are seeking a skilled Data Engineer to help build and shape our modern Data & AI Platform. Working alongside the Head of Data & Analytics, you will design, develop and maintain secure, high-quality data pipelines that enable trusted reporting, analytics, and future AI/ML development. This is a rare opportunity to influence architecture, engineering standards, automation and governance within a co-managed delivery model. You will work across structured and semi-structured data from key internal systems, including HR, care delivery, finance, estates, medication and incident management, to build reusable data pipelines, semantic models and certified datasets. Your work will directly support operational teams, strategic planning and improved outcomes for the people we support. This role is ideal for someone who enjoys solving complex data challenges, building scalable solutions and embedding best-practice engineering within a collaborative, mission-driven environment.

Data Engineer: Technical Skills & Experience
We are looking for candidates with experience in:
Cloud or data platform technologies (e.g. Azure, Fabric, Databricks)
Operating and managing modern cloud-based data platforms
Integrating third-party data feeds
Exposure to DevOps or platform engineering
Strong SQL for transformation, modelling and optimisation
At least one data engineering programming language (e.g. Python)
Data modelling (dimensional, star schema, analytics-optimised models)
Building and maintaining production-grade ETL/ELT pipelines
Orchestration and scheduling tools
Version control (e.g. Git)
CI/CD principles for data workloads
Environment separation (Dev/Test/Prod)
Writing maintainable, testable and well-documented code
Applying GDPR and data protection principles, including privacy-by-design, retention, anonymisation and pseudonymisation

Data Engineer: Pay & Benefits
We recognise the importance of investing in our people and offer a competitive employment package, including:
34 days annual leave (including public holidays)
Access to earned pay before payday
Company pension scheme
Generous occupational maternity/paternity pay
Ongoing learning and development opportunities
Health Cash Plan after probation (covering dental, optical, therapies, maternity/paternity, prescriptions and more)
Opportunities for career progression

How to Apply
This role is being recruited by Sanderson Recruitment. Please apply with your CV.

Reasonable Adjustments:
Respect and equality are core values to us. We are proud of the diverse and inclusive community we have built, and we welcome applications from people of all backgrounds and perspectives. Our success is driven by our people, united by the spirit of partnership to deliver the best resourcing solutions for our clients. If you need any help or adjustments during the recruitment process for any reason, please let us know when you apply or talk to the recruiters directly so we can support you.
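The pseudonymisation requirement in the skills list above is commonly implemented with keyed hashing: a direct identifier is replaced by a stable token derived from a secret key, so records stay joinable across datasets while the identifier itself is not exposed. A stdlib sketch follows; the key value and field names are illustrative assumptions, and in production the key would live in a secrets manager, not in code:

```python
import hashlib
import hmac

# Illustrative only: a real key would be held outside the dataset and codebase.
SECRET_KEY = b"example-key-held-in-a-secrets-manager"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a stable keyed hash (HMAC-SHA256).

    The same input always yields the same token, so datasets remain joinable,
    but the token cannot be reversed without access to the key.
    """
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"person_id": "9434765919", "visits": 3}
safe_record = {"person_token": pseudonymise(record["person_id"]), "visits": record["visits"]}
```

Under GDPR this remains personal data (the key re-links it), which is exactly why pseudonymisation is paired with retention and access controls rather than treated as anonymisation.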
Apr 01, 2026
Full time
Avanti
DV Cleared Data Architects Wanted
Avanti
DV Cleared Data Architects Wanted
Multi-Cloud / Data Platforms
UK (Hybrid)
£90,000 - £100,000 + bonus + benefits
London / Bristol / Manchester (hybrid)
Active DV Clearance Required

I'm working with a successful multinational tech consultancy that has secured a number of long-term client programmes and is now looking to bring in several DV Cleared Data Architects to meet that demand. If you're DV cleared and tired of being locked into one stack or one client environment, this is worth a look. You'll be working across a range of secure, high-impact programmes with real ownership over how the data platforms are designed and delivered, not just implementing someone else's blueprint.

What you'll be doing
It's a genuine end-to-end architecture role. You won't just be focused on one part of the data lifecycle; you'll own the full picture, from ingestion through to consumption, across different client environments and industries.
Designing full data platforms end-to-end, making sure everything flows properly from ingestion through to consumption
Working with modern approaches like lakehouse and medallion architectures (Bronze, Silver, Gold)
Defining both batch and streaming pipelines depending on the use case
Making real decisions around storage, orchestration, governance and security, not just executing a predefined design
Working closely with clients and stakeholders to translate business problems into practical technical solutions, across both technical and non-technical audiences
Operating in secure, regulated environments where attention to detail and sound judgement matter

Tech environment
It's a multi-cloud setup, so you won't be pigeonholed, but they are happy to consider applications from DV Cleared candidates with experience of one or more of the major cloud platforms.
AWS, Azure and GCP, depending on the project
Databricks, Snowflake and Synapse
Spark and distributed processing frameworks for large-scale data
Streaming technologies such as Kafka or Kinesis
You're expected to pick the right tools for the problem, not just default to whatever you know best.

What they're looking for
Active DV Clearance - must already be held
A strong background in Data Architecture with hands-on experience
Solid understanding of modern data patterns - lakehouse, medallion, streaming
Experience designing full platforms, not just individual components
Comfortable working with clients and stakeholders, not just internal teams
A practical mindset: someone who makes sensible trade-offs rather than over-engineering

Why it's worth considering
What tends to appeal to people is the combination of variety and genuine ownership. You're working across different projects and industries, with the freedom to steer your own career based on where your strengths lie. There's a good balance of deep technical work and client-facing delivery, and you'll get real exposure to a wide range of tools, platforms and environments rather than repeating the same patterns on the same stack.

Interested?
If you hold active DV Clearance and want a role where you can design, influence and deliver real data platforms across secure programmes, it's worth a conversation. Get in touch and I'll give you the full picture.
Salary: £90,000 - £100,000 + bonus + benefits
Location: London / Bristol / Manchester / Belfast (hybrid) - 2 days a week in any of their offices.
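The medallion architecture named in this advert layers data as Bronze (raw, as ingested), Silver (cleansed and conformed) and Gold (business-level aggregates ready for reporting). A toy sketch of that flow in plain Python follows; in reality each layer would be a set of Delta tables on the chosen platform, and the field names here are illustrative assumptions:

```python
def to_silver(bronze_rows):
    # Silver: cleanse and conform raw (Bronze) records.
    silver = []
    for row in bronze_rows:
        if row.get("amount") is None:
            continue  # drop unusable records at the Silver boundary
        silver.append({
            "region": row["region"].strip().upper(),
            "amount": float(row["amount"]),
        })
    return silver

def to_gold(silver_rows):
    # Gold: business-level aggregate, ready for dashboards.
    totals = {}
    for row in silver_rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

bronze = [
    {"region": " north ", "amount": "120"},  # raw, untrimmed, string-typed
    {"region": "north", "amount": "30"},
    {"region": "south", "amount": None},     # filtered out in Silver
]
gold = to_gold(to_silver(bronze))
```

The design point is that each hop is a separately owned, separately testable contract, which is what makes the pattern attractive for the governance and security decisions the role describes.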
Apr 01, 2026
Full time
Specsavers
Senior Data Ops Engineer
Specsavers St. Andrews, Fife
Are you passionate about building the engine that powers data at scale? At Specsavers, we're looking for a Senior DataOps Engineer to play a critical role at the heart of our Data Engineering team, helping us build, run, and continuously improve the platforms that enable data-driven decision making across the business. This is an exciting opportunity to own the "assembly line" behind our data products. You'll be focused on making our data platform faster, more reliable, and easier to work with, using automation, observability, and modern engineering practices to improve quality, resilience, and speed to value. If you enjoy solving complex problems, reducing friction for engineering teams, and making systems run better every day, this role will really resonate.

In this role, you'll be hands-on building and evolving our DataOps capability: automating data pipelines, testing data quality, and ensuring our production and development environments are monitored, observable, and highly performant. You'll work closely with Data Engineers, QA, DevOps, and platform teams to replace manual processes with smart orchestration, enable self-service environments, and deploy with confidence through CI/CD and infrastructure-as-code. Your work will directly impact how quickly and safely data products can be delivered to the business.

You'll thrive here if you bring strong, real-world experience in DataOps and cloud-based data platforms. You'll be comfortable working end-to-end across Azure-based technologies such as Databricks, Data Factory, Data Lake and Azure SQL, and confident using Python, SQL, Git, and automation tooling to improve reliability and scalability. Experience with containerisation, Kubernetes, Terraform, and modern DevOps practices will allow you to lead by example and champion best practice across the data engineering community.

What really sets you apart is your mindset. You care deeply about quality, observability, and operational excellence. You enjoy collaborating across teams, explaining complex technical concepts in simple terms, and helping others learn and improve. You're curious, proactive, and always looking for smarter, more efficient ways to do things while keeping security, performance, and cost firmly in mind. If you're excited by the idea of enabling data at scale, building robust platforms, and having real influence over how data engineering is done across a global organisation, this is a role where you can truly make your mark. Join us as a Senior DataOps Engineer and help power the data foundations that support Specsavers' mission to change lives through better sight and hearing.
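The data-quality testing this role describes is often implemented as small automated checks that run inside a CI/CD pipeline and block a deployment when a dataset fails them. A minimal sketch follows; the check names, columns and thresholds are illustrative assumptions, and real implementations typically use a framework rather than hand-rolled functions:

```python
def check_dataset(rows, required_columns, max_null_fraction=0.0):
    """Basic automated data-quality checks of the kind run in a CI pipeline.

    Returns a list of failure messages; an empty list means the dataset passes.
    Thresholds and column names are illustrative assumptions.
    """
    failures = []
    if not rows:
        failures.append("dataset is empty")
        return failures
    for col in required_columns:
        missing = sum(1 for r in rows if r.get(col) is None)
        if missing / len(rows) > max_null_fraction:
            failures.append(f"column '{col}' has {missing} null(s)")
    return failures

good = [{"id": 1, "store": "Fife"}, {"id": 2, "store": "Nottingham"}]
bad = [{"id": 1, "store": None}]
good_failures = check_dataset(good, ["id", "store"])
bad_failures = check_dataset(bad, ["id", "store"])
```

Wiring checks like these into the deployment path is what turns data quality from a manual review step into an automated gate.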
Apr 01, 2026
Full time
Specsavers
Senior Data Ops Engineer
Specsavers Nottingham, Nottinghamshire
Are you passionate about building the engine that powers data at scale? At Specsavers, we're looking for a Senior DataOps Engineer to play a critical role at the heart of our Data Engineering team, helping us build, run, and continuously improve the platforms that enable data-driven decision making across the business. This is an exciting opportunity to own the "assembly line" behind our data products. You'll be focused on making our data platform faster, more reliable, and easier to work with, using automation, observability, and modern engineering practices to improve quality, resilience, and speed to value. If you enjoy solving complex problems, reducing friction for engineering teams, and making systems run better every day, this role will really resonate.

In this role, you'll be hands-on building and evolving our DataOps capability: automating data pipelines, testing data quality, and ensuring our production and development environments are monitored, observable, and highly performant. You'll work closely with Data Engineers, QA, DevOps, and platform teams to replace manual processes with smart orchestration, enable self-service environments, and deploy with confidence through CI/CD and infrastructure-as-code. Your work will directly impact how quickly and safely data products can be delivered to the business.

You'll thrive here if you bring strong, real-world experience in DataOps and cloud-based data platforms. You'll be comfortable working end-to-end across Azure-based technologies such as Databricks, Data Factory, Data Lake and Azure SQL, and confident using Python, SQL, Git, and automation tooling to improve reliability and scalability. Experience with containerisation, Kubernetes, Terraform, and modern DevOps practices will allow you to lead by example and champion best practice across the data engineering community.

What really sets you apart is your mindset. You care deeply about quality, observability, and operational excellence. You enjoy collaborating across teams, explaining complex technical concepts in simple terms, and helping others learn and improve. You're curious, proactive, and always looking for smarter, more efficient ways to do things while keeping security, performance, and cost firmly in mind. If you're excited by the idea of enabling data at scale, building robust platforms, and having real influence over how data engineering is done across a global organisation, this is a role where you can truly make your mark. Join us as a Senior DataOps Engineer and help power the data foundations that support Specsavers' mission to change lives through better sight and hearing.
Apr 01, 2026
Full time
Are you passionate about building the engine that powers data at scale? At Specsavers, we're looking for a Senior DataOps Engineer to play a critical role at the heart of our Data Engineering team helping us build, run, and continuously improve the platforms that enable data-driven decision making across the business. This is an exciting opportunity to own the "assembly line" behind our data products. You'll be focused on making our data platform faster, more reliable, and easier to work with using automation, observability, and modern engineering practices to improve quality, resilience, and speed to value. If you enjoy solving complex problems, reducing friction for engineering teams, and making systems run better every day, this role will really resonate. In this role, you'll be hands-on building and evolving our DataOps capability automating data pipelines, testing data quality, and ensuring our production and development environments are monitored, observable, and highly performant. You'll work closely with Data Engineers, QA, DevOps, and platform teams to replace manual processes with smart orchestration, enable self-service environments, and deploy with confidence through CI/CD and infrastructure-as-code. Your work will directly impact how quickly and safely data products can be delivered to the business. You'll thrive here if you bring strong, real-world experience in DataOps and cloud-based data platforms. You'll be comfortable working end-to-end across Azure based technologies such as Databricks, Data Factory, Data Lake and Azure SQL, and confident using Python, SQL, Git, and automation tooling to improve reliability and scalability. Experience with containerisation, Kubernetes, Terraform, and modern DevOps practices will allow you to lead by example and champion best practice across the data engineering community. What really sets you apart is your mindset. You care deeply about quality, observability, and operational excellence. 
You enjoy collaborating across teams, explaining complex technical concepts in simple terms, and helping others learn and improve. You're curious, proactive, and always looking for smarter, more efficient ways to do things while keeping security, performance, and cost firmly in mind. If you're excited by the idea of enabling data at scale, building robust platforms, and having real influence over how data engineering is done across a global organisation, this is a role where you can truly make your mark. Join us as a Senior DataOps Engineer and help power the data foundations that support Specsavers' mission to change lives through better sight and hearing.
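To give a flavour of the data-quality testing this kind of DataOps role involves, here is a minimal sketch of an automated quality gate. The field names and thresholds are invented for illustration and are not Specsavers' actual checks:

```python
# Hypothetical data-quality gate of the kind a DataOps pipeline might run
# before publishing a dataset. All rules and field names are illustrative
# assumptions, not a real employer's checks.

def check_rows(rows, required_fields, max_null_rate=0.05):
    """Return a list of human-readable failures for a batch of records."""
    failures = []
    if not rows:
        return ["dataset is empty"]
    for field in required_fields:
        nulls = sum(1 for r in rows if r.get(field) in (None, ""))
        rate = nulls / len(rows)
        if rate > max_null_rate:
            failures.append(f"{field}: null rate {rate:.1%} exceeds {max_null_rate:.0%}")
    return failures

batch = [
    {"order_id": "A1", "amount": 12.5},
    {"order_id": "A2", "amount": None},
    {"order_id": None, "amount": 3.0},
]
print(check_rows(batch, ["order_id", "amount"]))
```

In practice a check like this would run as a step in a CI/CD or orchestration workflow, failing the deployment rather than printing, but the shape of the logic is the same.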
Michael Page Technology
People Analytics Engineer
Michael Page Technology
This is an exciting opportunity for a Data/Analytics Engineer to join a temporary position in London. The role involves working within the Analytics department of the Technology & Telecoms industry, focusing on data-driven solutions.

Client Details
Our client is a fast-growing, product-led technology company with a global team and a strong culture of innovation. They're investing in data to support smarter decision-making as they continue to scale.

Description
Databricks experience essential.
Design, build, and maintain ETL/ELT data pipelines supporting People Analytics use cases.
Develop scalable models and schemas to integrate data from HRIS, ATS, learning platforms, engagement tools, and other people-related systems.
Ensure data quality, governance, and security across sensitive HR datasets.
Implement monitoring, testing, and automation to maintain robust pipelines.
Partner with the People team to translate business questions into data requirements and technical solutions.
Build self-service dashboards and metrics in tools such as Looker, Tableau, Power BI, or similar.
Create standardised reporting frameworks for KPIs such as hiring funnel metrics, headcount, retention, performance, compensation, and DEI insights.
Support modelling for workforce planning and scenario forecasting.
Work cross-functionally with Data Engineering, Finance, Legal, and other stakeholders on shared datasets and processes.
Provide technical thought-leadership on People data infrastructure and best practices.
Communicate complex concepts clearly and concisely to non-technical HR stakeholders.

Profile
Databricks experience essential.
Strong experience as a Data Engineer, Analytics Engineer, or BI Engineer, ideally with exposure to People Analytics or similar domains.
Deep expertise with SQL and modern data warehousing technologies (e.g., Snowflake, BigQuery, Redshift, Databricks).
Proven experience building production-ready pipelines using tools such as dbt, Airflow, Dagster, Prefect, or similar.
Experience integrating data from HRIS and talent systems (e.g., Workday, BambooHR, Greenhouse, Lever, HiBob, Personio).
Ability to work with highly sensitive data and ensure compliance with GDPR and internal security standards.
Strong stakeholder skills and the ability to collaborate with HR partners and business leaders.
Experience building People Analytics data models or dashboards.
Exposure to Python for automation, testing, or pipeline orchestration.
Understanding of workforce planning, organisational design, talent analytics, or People Operations data.

Job Offer
£400-600 per day - IR35 status TBC
London based - hybrid working
6 month contract with potential for extension
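As an illustration of the retention KPI mentioned above, a simple calculation might look like the following. The data shape and sample records are invented, not the client's schema:

```python
from datetime import date

# Illustrative sketch of one People Analytics KPI: retention rate over a
# period, i.e. the share of employees present at the period start who were
# still employed at the period end. The records below are made-up samples.

def retention_rate(employees, start, end):
    """employees: list of dicts with a 'hired' date and an optional 'left' date."""
    at_start = [e for e in employees
                if e["hired"] <= start and (e.get("left") is None or e["left"] > start)]
    if not at_start:
        return 0.0
    retained = [e for e in at_start if e.get("left") is None or e["left"] > end]
    return len(retained) / len(at_start)

staff = [
    {"hired": date(2022, 1, 10)},                           # still employed
    {"hired": date(2021, 6, 1), "left": date(2024, 3, 1)},  # left mid-period
    {"hired": date(2024, 5, 1)},                            # hired after start
]
print(retention_rate(staff, date(2024, 1, 1), date(2024, 12, 31)))  # → 0.5
```

In a real pipeline this logic would typically live in a warehouse model (SQL/dbt) rather than application code, but the definition of the metric is the part that matters.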
Apr 01, 2026
Contractor
Big Red Recruitment
Data Engineer
Big Red Recruitment Huddersfield, Yorkshire
Turn messy, fragmented data into something the business can actually trust.

About the client: You will be working with a major UK-based organisation that operates across a portfolio of well-known brands, delivering at scale within a complex distribution environment. With deeply integrated ERP and finance systems, the business is now undergoing a significant shift in how it uses data. Following recent system changes, there is a clear focus on modernising reporting and building a more transparent, reliable data landscape that can support future growth.

Project overview: This programme is focused on rebuilding critical financial reporting that has been disrupted, including complex margin and rebate calculations that sit at the heart of commercial performance. Alongside this, you will help design and deliver scalable data pipelines, transformation logic, and reporting outputs within a modern Azure environment. The work will establish a repeatable blueprint that can be rolled out across multiple business units, playing a key role in the transition away from legacy systems to a clean, future-ready data platform.

What you will be doing: You will take ownership of building and maintaining robust data pipelines within Azure, working hands-on to deliver reliable, high-performance data solutions. This includes developing workflows in Databricks and/or Synapse, and transforming data from a range of sources such as CSV and Excel into clean, usable datasets. A key part of the role will involve translating complex pricing and margin logic into scalable data models, ensuring accuracy and consistency across reporting. You will also focus on improving data quality, validation processes, and overall performance, while supporting the move away from legacy reporting tools. Alongside this, you will contribute to the design of a scalable data architecture that can be leveraged across multiple business units.
Tech environment: You will be working across a modern Azure stack including Data Lake, Synapse, Databricks, Data Factory, and SQL, with exposure to complex enterprise data environments.

What we are looking for: We are looking for someone with strong experience across Azure data engineering tools and a proven track record of building ETL or ELT pipelines in production environments. You should have solid SQL and data transformation skills, along with experience handling large and complex datasets. Equally important is your ability to work closely with analysts and business stakeholders, translating requirements into effective data solutions.

Nice to have: Experience working with ERP or finance data would be highly beneficial, as would any background in transformation or migration programmes where legacy systems are being modernised.

Contract details: 6 month fixed term contract with potential to extend or move into a permanent role. Salary: Up to £60,000. Location: West Yorkshire. Hybrid.
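To give a sense of the "pricing and margin logic" the role describes, here is a minimal, hypothetical sketch of a rebate-adjusted margin calculation. The tier structure, thresholds, and figures are invented for illustration and are not the client's actual model:

```python
# Hypothetical rebate-adjusted margin model. A tiered supplier rebate
# reduces the effective cost of goods, which changes the reported margin.
# Tiers and figures are invented sample data.

REBATE_TIERS = [  # (minimum annual spend, rebate as a fraction of cost)
    (100_000, 0.05),
    (50_000, 0.03),
    (0, 0.00),
]

def rebate_fraction(annual_spend):
    """Return the rebate rate for the highest tier the spend qualifies for."""
    for threshold, rate in REBATE_TIERS:
        if annual_spend >= threshold:
            return rate
    return 0.0

def margin_pct(revenue, cost, annual_spend):
    """Gross margin after applying the supplier rebate to cost."""
    effective_cost = cost * (1 - rebate_fraction(annual_spend))
    return (revenue - effective_cost) / revenue

# At £60k annual spend the 3% tier applies, lifting the headline margin.
print(round(margin_pct(revenue=1_000.0, cost=800.0, annual_spend=60_000), 4))  # → 0.224
```

In the role itself this kind of logic would be encoded in Databricks or Synapse transformation models over ERP data rather than a standalone function, but it shows why small changes to rebate rules ripple through margin reporting.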
Apr 01, 2026
Full time
Akkodis
Data Engineer
Akkodis
Data Engineer
Full Time / Permanent
£55,000 - £60,000 plus up to 20% bonus, private medical and other extensive benefits
Hybrid - 1-2 days a week in the North Oxfordshire head office

The Company: My client is an industry-leading and award-winning financial services organisation who operate on a global scale. They are headquartered in North Oxfordshire, UK. This would be a hybrid role requiring 1-2 days a week in the North Oxfordshire head office.

The Role: I am looking for a driven and experienced Data Engineer to help design, build and maintain a data lakehouse in Databricks, pulling data from core platforms and external sources and refining this into well-curated, analysis-ready datasets. As a Data Engineer you will operate within an Agile delivery environment, working closely with other Data Engineers, Data Analysts and a Data Architect to deliver against the backlog, providing vital insight from a wide-ranging dataset to support executive and operational decision making that will underpin sustained growth of business units domestically and internationally.

The Person: The ideal candidate will possess a strong background in Data Engineering with a proven ability to design, build, and maintain scalable data pipelines and solutions. From a technical standpoint you will ideally possess:
Proven experience with Databricks
Proficiency in programming languages such as Python, Spark, SQL
Strong experience with SQL databases
Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF)
Experience with cloud platforms (Azure preferred) and related data services
Knowledge of big data technologies (e.g., Hadoop, Spark, Kafka)
Experience of Waterfall and Agile delivery methodologies

Contact: Please apply via the link or contact (url removed) for more information. Modis International Ltd acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers in the UK.
Modis Europe Ltd provide a variety of international solutions that connect clients to the best talent in the world. For all positions based in Switzerland, Modis Europe Ltd works with its licensed Swiss partner Accurity GmbH to ensure that candidate applications are handled in accordance with Swiss law. Both Modis International Ltd and Modis Europe Ltd are Equal Opportunities Employers. By applying for this role your details will be submitted to Modis International Ltd and/ or Modis Europe Ltd. Our Candidate Privacy Information Statement which explains how we will use your information is available on the Modis website.
Mar 31, 2026
Full time
William Alexander Recruitment Ltd
Senior Data Engineer - Azure/Databricks
William Alexander Recruitment Ltd Manchester, Lancashire
Our client, a leading name in the insurance sector, is seeking a Senior Data Engineer to drive the delivery of key data initiatives across their Azure data platform. This role will play a pivotal part in advancing Databricks development, supporting AI-focused projects, and maintaining legacy SQL Server data warehouse environments.

Key Responsibilities
Lead and contribute to data engineering initiatives across the Azure ecosystem
Build, optimise, and maintain Databricks workflows and Python-based data solutions
Enhance existing SQL Server data warehouses and support strategic migrations
Collaborate with cross-functional teams to enable AI and analytics-driven outcomes

What We're Looking For
Proven experience as a Data Engineer
Strong hands-on expertise in Azure, Databricks, and Python
Advanced SQL capabilities
Insurance industry experience is highly desirable, though strong candidates from wider financial services will also be considered

This is a permanent opportunity, offering 2-3 days per week in the Manchester office, with a salary of up to £80,000 per annum. If you feel you have the right skill set, please apply. Successful candidates will be contacted within 2 working days. The processing and use by us of your personal data is in accordance with our Privacy Notice, which can be found on our website. The William Alexander Diversity & Inclusion Policy actively promotes the principles of equality, diversity and inclusion in all its dealings with employees, workers, job applicants, clients, customers, suppliers, contractors and the public. We firmly believe that an inclusive work culture where people of different backgrounds are valued equally will ensure better outcomes for us all, and we bring this approach to recruitment for our clients.
Mar 31, 2026
Full time
Lynx Recruitment Ltd
Data Engineer
Lynx Recruitment Ltd
Data Engineer (Permanent)
Salary: Up to £70,000
Location: London Victoria (1 day per week onsite, rest remote)
*This role does not offer sponsorship*

The Role
We are seeking a skilled Data Engineer to join a growing team delivering modern, scalable data solutions. This is a permanent opportunity offering a hybrid model, with one day per week in London Victoria and the remainder working remotely. You will play a key role in designing, building, and optimising data pipelines and platforms, supporting advanced analytics and Business Intelligence initiatives.

Key Responsibilities
Design and develop scalable data pipelines and data processing solutions
Work with modern data platforms such as Microsoft Fabric, Databricks, or similar technologies
Build and optimise data solutions using Python, Spark, PySpark, and/or SQL
Collaborate with stakeholders to ensure high-quality, reliable data architecture

Essential Skills & Experience
Hands-on experience with Microsoft Fabric, Databricks, or equivalent modern data platforms
Strong coding experience in Python and/or Spark/PySpark
Solid SQL skills and experience working with structured datasets

Desirable Experience
Knowledge of relational database design, normalisation, and data modelling
Mar 31, 2026
Full time
Joseph Harry Ltd
Data Engineering Manager Azure AI Finance Brighton
Joseph Harry Ltd Brighton, Sussex
Data Engineering Manager (Architect Architecture Data Development Engineer Engineering Management Head of Agile Microsoft Azure ML AI Automation Finance Financial Services Fabric Synapse DataBricks Snowflake SQL) required by our financial client in Brighton.

You MUST have the following:
Good experience as a Data Engineering Manager/Lead Data Architect
Strong management experience - inheriting teams, raising standards and performance
Strategy to align with the needs of the business
Excellent design and architecture ability
MS SQL Server
Azure
AI - even if outside work
Agile
Experience in a financial environment

The following are DESIRABLE, not essential:
Microsoft Fabric, Synapse, Databricks or Snowflake

Role: You will inherit a team of 3, comprising two permanent staff and one contractor. The contractor is senior, the two permanent are more junior, making this a very hands-on role. It will be all-encompassing, involving data architecture, engineering for technical delivery, and management to cover line-management of the team and alignment of the company's strategy with the roadmap for the data environment. In addition to this are data governance and regulatory compliance requirements that you will also have ownership of. On the engineering and architecture side, you will have good experience of leading companies from on-premise virtual machines to Azure. You will be seasoned in taking data projects from inception to design, architecture and technical delivery, contributing to the engineering yourself. In addition to taking environments to the cloud, you will also have some exposure to AI and ML and be comfortable assessing what tools and products are most appropriate for the business' goals and evolution.
On the managerial side, you will have led teams and have experience with line-management. If you have inherited teams previously, that would also be ideal. You will have worked in an FCA-regulated environment and be familiar with the necessary requirements to be compliant from a data perspective. The journey you will take with this team will be to implement better monitoring, automation, and migration to the cloud, and then the adoption of AI and ML. As the business is c.200 people and the management team is strong and Agile, this could happen very quickly. The technology department has a hybrid working setup. You will be given the flexibility to come into the office as you wish, although in the initial months it will probably be appropriate to go to the office 2-3 times/week.

Salary: £100k - 125k + Bonus + Pension
Mar 30, 2026
Full time
Joseph Harry Ltd
Data Engineering Manager Azure AI Finance Brighton
Joseph Harry Ltd Brighton, Sussex
Data Engineering Manager (Architect Architecture Data Development Engineer Engineering Management Head of Agile Microsoft Azure ML AI Automation Finance Financial Services Fabric Synapse DataBricks Snowflake SQL) required by our financial client in Brighton.

You MUST have the following:
Good experience as a Data Engineering Manager/Lead Data Architect
Strong management experience - inheriting teams, raising standards and performance
Strategy to align with the needs of the business
Excellent design and architecture ability
MS SQL Server
Azure
AI - even if outside work
Agile
Experience in a financial environment

The following are DESIRABLE, not essential:
Microsoft Fabric, Synapse, Databricks or Snowflake

Role: You will inherit a team of 3, comprising two permanent staff and one contractor. The contractor is senior, the two permanent are more junior, making this a very hands-on role. It will be all-encompassing, involving data architecture, engineering for technical delivery, and management to cover line-management of the team and alignment of the company's strategy with the roadmap for the data environment. In addition to this are data governance and regulatory compliance requirements that you will also have ownership of. On the engineering and architecture side, you will have good experience of leading companies from on-premise virtual machines to Azure. You will be seasoned in taking data projects from inception to design, architecture and technical delivery, contributing to the engineering yourself. In addition to taking environments to the cloud, you will also have some exposure to AI and ML and be comfortable assessing what tools and products are most appropriate for the business' goals and evolution.
On the managerial side, you will have led teams and have experience with line-management. If you have inherited teams previously, that would also be ideal. You will have worked in an FCA-regulated environment and be familiar with the necessary requirements to be compliant from a data perspective. The journey you will take with this team will be to implement better monitoring, automation, and migration to the cloud, and then the adoption of AI and ML. As the business is c.200 people and the management team is strong and Agile, this could happen very quickly. The technology department has a hybrid working setup. You will be given the flexibility to come into the office as you wish, although in the initial months it will probably be appropriate to go to the office 2-3 times/week.

Salary: £125k - 150k + Bonus + Pension
Mar 30, 2026
Full time
Joseph Harry Ltd
Data Engineering Manager Azure AI Finance Brighton
Joseph Harry Ltd Brighton, Sussex
Data Engineering Manager (Architect Architecture Data Development Engineer Engineering Management Head of Agile Microsoft Azure ML AI Automation Finance Financial Services Fabric Synapse DataBricks Snowflake SQL) required by our financial client in Brighton.
You MUST have the following:
  • Good experience as a Data Engineering Manager or Lead Data Architect
  • Strong management experience - inheriting teams, raising standards and performance
  • Strategy to align with the needs of the business
  • Excellent design and architecture ability
  • MS SQL Server
  • Azure
  • AI - even if outside work
  • Agile
  • Experience in a financial environment
The following are DESIRABLE, not essential:
  • Microsoft Fabric, Synapse, Databricks or Snowflake
Role:
You will inherit a team of 3, comprising two permanent staff and one contractor. The contractor is senior and the two permanent staff are more junior, making this a very hands-on role. It will be all-encompassing, covering data architecture, engineering for technical delivery, and management - both line-management of the team and alignment of the company's strategy with the roadmap for the data environment. You will also have ownership of data governance and regulatory compliance requirements.
On the engineering and architecture side, you will have good experience of leading companies from on-premise virtual machines to Azure. You will be seasoned in taking data projects from inception through design, architecture and technical delivery, contributing to the engineering yourself. In addition to taking environments to the cloud, you will have some exposure to AI and ML and be comfortable assessing which tools and products are most appropriate for the business's goals and evolution.
On the managerial side, you will have led teams and have line-management experience; having inherited teams previously would also be ideal. You will have worked in an FCA-regulated environment and be familiar with the requirements for compliance from a data perspective. The journey you will take with this team will be to implement better monitoring and automation, migrate to the cloud, and then adopt AI and ML. As the business is c.200 people and the management team is strong and Agile, this could happen very quickly.
The technology department has a hybrid working setup. You will have the flexibility to come into the office as you wish although, in the initial months, it will probably be appropriate to go in 2-3 times per week.
Salary: £80k - 100k + Bonus + Pension
Mar 30, 2026
Full time
Joseph Harry Ltd
Data Engineering Manager Azure AI Finance Tunbridge Wells Kent
Joseph Harry Ltd Tunbridge Wells, Kent
Data Engineering Manager (Architect Architecture Data Development Engineer Engineering Management Head of Agile Microsoft Azure ML AI Automation Finance Financial Services Fabric Synapse DataBricks Snowflake SQL) required by our financial client in Tunbridge Wells, Kent.
You MUST have the following:
  • Good experience as a Data Engineering Manager or Lead Data Architect
  • Strong management experience - inheriting teams, raising standards and performance
  • Strategy to align with the needs of the business
  • Excellent design and architecture ability
  • MS SQL Server
  • Azure
  • AI - even if outside work
  • Agile
  • Experience in a financial environment
The following are DESIRABLE, not essential:
  • Microsoft Fabric, Synapse, Databricks or Snowflake
Role:
You will inherit a team of 3, comprising two permanent staff and one contractor. The contractor is senior and the two permanent staff are more junior, making this a very hands-on role. It will be all-encompassing, covering data architecture, engineering for technical delivery, and management - both line-management of the team and alignment of the company's strategy with the roadmap for the data environment. You will also have ownership of data governance and regulatory compliance requirements.
On the engineering and architecture side, you will have good experience of leading companies from on-premise virtual machines to Azure. You will be seasoned in taking data projects from inception through design, architecture and technical delivery, contributing to the engineering yourself. In addition to taking environments to the cloud, you will have some exposure to AI and ML and be comfortable assessing which tools and products are most appropriate for the business's goals and evolution.
On the managerial side, you will have led teams and have line-management experience; having inherited teams previously would also be ideal. You will have worked in an FCA-regulated environment and be familiar with the requirements for compliance from a data perspective. The journey you will take with this team will be to implement better monitoring and automation, migrate to the cloud, and then adopt AI and ML. As the business is c.200 people and the management team is strong and Agile, this could happen very quickly.
The technology department has a hybrid working setup. You will have the flexibility to come into the office as you wish although, in the initial months, it will probably be appropriate to go in 2-3 times per week.
Salary: £80k - 100k + Bonus + Pension
Mar 30, 2026
Full time
Joseph Harry Ltd
Data Engineering Manager Azure AI Finance Tunbridge Wells Kent
Joseph Harry Ltd Tunbridge Wells, Kent
Data Engineering Manager (Architect Architecture Data Development Engineer Engineering Management Head of Agile Microsoft Azure ML AI Automation Finance Financial Services Fabric Synapse DataBricks Snowflake SQL) required by our financial client in Tunbridge Wells, Kent.
You MUST have the following:
  • Good experience as a Data Engineering Manager or Lead Data Architect
  • Strong management experience - inheriting teams, raising standards and performance
  • Strategy to align with the needs of the business
  • Excellent design and architecture ability
  • MS SQL Server
  • Azure
  • AI - even if outside work
  • Agile
  • Experience in a financial environment
The following are DESIRABLE, not essential:
  • Microsoft Fabric, Synapse, Databricks or Snowflake
Role:
You will inherit a team of 3, comprising two permanent staff and one contractor. The contractor is senior and the two permanent staff are more junior, making this a very hands-on role. It will be all-encompassing, covering data architecture, engineering for technical delivery, and management - both line-management of the team and alignment of the company's strategy with the roadmap for the data environment. You will also have ownership of data governance and regulatory compliance requirements.
On the engineering and architecture side, you will have good experience of leading companies from on-premise virtual machines to Azure. You will be seasoned in taking data projects from inception through design, architecture and technical delivery, contributing to the engineering yourself. In addition to taking environments to the cloud, you will have some exposure to AI and ML and be comfortable assessing which tools and products are most appropriate for the business's goals and evolution.
On the managerial side, you will have led teams and have line-management experience; having inherited teams previously would also be ideal. You will have worked in an FCA-regulated environment and be familiar with the requirements for compliance from a data perspective. The journey you will take with this team will be to implement better monitoring and automation, migrate to the cloud, and then adopt AI and ML. As the business is c.200 people and the management team is strong and Agile, this could happen very quickly.
The technology department has a hybrid working setup. You will have the flexibility to come into the office as you wish although, in the initial months, it will probably be appropriate to go in 2-3 times per week.
Salary: £100k - 125k + Bonus + Pension
Mar 30, 2026
Full time
