Data Scrum Lead

Duration: 6 Months (possibility of extension)
Location: London/Hybrid (3 days per week on site)
Rate: A highly competitive Umbrella Day Rate is available for suitable candidates

Role Overview
This role will manage a scrum team that will ingest required reference data into the EMEA Data Platform for any future consumers within EMEA. The successful candidate will, amongst other tasks:
- Facilitate efficient and effective software delivery from inception to go-live for the EMEA Data Platform; the initial focus will be running the Reference Data Scrum, ingesting reference data into the EMEA Data Platform.
- Work with the Product Owners, business stakeholders, data modellers, data engineers, testers and Data Office to deliver a well-planned pipeline of reference data sets.
- Use Scrum techniques to deliver reference data requirements within a Scrum of Scrums structure and drive effective outcomes each sprint.

Key Responsibilities:
- Responsible for the effective and efficient running of software development scrums consisting of developers, testers, physical data modellers, Data Governance and other stakeholders.
- Ensure that everyone on the team understands the goals, scope, estimates and burndown charts of each iteration.
- Maintain the product backlog and agree prioritisation with senior stakeholders.
- Identify and remove obstacles so that team members can focus on their immediate tasks.
- Keep stakeholders up to date via direct communications, tailored reports and JIRA dashboards.
- Escalate any serious challenges to scope, timelines and budget to senior management.

Key Skills & Requirements:
- Broad and deep understanding of reference data utilised in a corporate investment bank.
- Extensive experience of core Agile methodology (Scrum, Scrum of Scrums), including:
  - Running Scrum ceremonies: Sprint Planning, Daily Stand-ups, Sprint Reviews, and Retrospectives, ensuring meetings are productive and time-boxed.
  - Educating the team and stakeholders on Scrum principles and Agile practices.
  - Helping the team improve continuously and adopt Agile mindsets.
  - Supporting the team by removing obstacles and enabling progress.
- Experience with Atlassian JIRA.
- Understanding of Data Governance practices and the resulting requirements on IT delivery.
- Ability to handle communications at all levels across the whole business.
- Excellent analytical and problem-solving skills.
- Ability to work collaboratively in a team environment, share ideas, and contribute to group discussions.
- Attention to detail and a commitment to producing high-quality work with a focus on accuracy and precision.
- Proven ability to work independently on large-scale projects while contributing effectively in a collaborative team environment.
- Exposure to concepts like Data Lakehouse and Medallion Architecture, especially with Databricks as the underlying technology.

Candidates will need to show evidence of the above in their CV in order to be considered. If you feel you have the skills and experience and want to hear more about this role, 'apply now' to declare your interest in this opportunity with our client. Your application will be reviewed by our dedicated team. We will respond to all successful applicants as soon as possible; however, please be advised that we may contact you again should we need further applicants or if other opportunities arise relevant to your skillset. Pontoon is an employment consultancy. We put expertise, energy, and enthusiasm into improving everyone's chance of being part of the workplace.
We respect and appreciate people of all ethnicities, generations, religious beliefs, sexual orientations, gender identities, and more. We do this by showcasing their talents, skills, and unique experience in an inclusive environment that helps them thrive. As part of our standard hiring process to manage risk, please note background screening checks will be conducted on all hires before commencing employment. We use generative AI tools to support our candidate screening process. This helps us ensure a fair, consistent, and efficient experience for all applicants. Rest assured, all final decisions are made by our hiring team, and your application will be reviewed with care and attention.
Dec 18, 2025
Contractor
Data & AI Senior Consultants

Location: We are flexible: onsite, hybrid or fully remote, depending on what works for you and the client; UK or Netherlands based.

What you will actually be doing
This is not a role where you build clever models that never get used. Your focus is on creating measurable value for clients using data science, machine learning and GenAI, in a consulting and advisory context. You will own work from the very beginning, asking questions like "What value are we trying to create here?" and "Is this the right problem to solve?" through to "It is live, stakeholders are using it and we can see the impact in the numbers." You will work fairly independently, and you will also be someone that more junior team members look to for help and direction. A big part of the job is taking messy, ambiguous business and technical problems and turning them into clear, valuable solutions that make sense to the client. You will do this in a client-facing role. That means you will be in the room for key conversations, providing honest advice, managing expectations and helping clients make good decisions about where and how to use AI.

What your day to day might look like

Getting to the heart of the problem
- Meeting with stakeholders who may not be clear on what they really need
- Using discovery sessions, workshops and structured questioning to uncover the real business problem
- Framing success in terms of value, for example higher revenue, lower cost, reduced risk, increased efficiency or better customer experience
- Translating business goals into a clear roadmap of data and AI work that everyone can understand
- Advising clients when AI is not the right solution and suggesting simpler or more cost-effective alternatives

Consulting and advisory work
- Acting as a trusted advisor to product owners, heads of department and executives
- Helping clients prioritise use cases based on value, feasibility and risk
- Communicating trade-offs in a simple way, for example accuracy versus speed, innovation versus compliance, cost versus impact
- Preparing and delivering client presentations, proposals and updates that tell a clear story
- Supporting pre-sales activities where needed, such as scoping work, estimating effort and defining outcomes
- Managing client expectations, risks and dependencies so there are no surprises

Building things that actually work
Once the problem and value are clear, you will design and deliver production-ready ML and GenAI solutions.
That includes:
- Designing and building data pipelines, batch or streaming, that support the desired outcomes
- Working with engineers and architects so your work fits cleanly into existing systems
- Making sure what you build is reliable in production and moves the needle on agreed metrics, not just offline benchmarks
- Explaining design decisions to both technical and non-technical stakeholders

GenAI work
You will work with GenAI in ways that are grounded in real use cases and business value:
- Building RAG systems that improve search, content discovery or productivity rather than existing for their own sake
- Implementing guardrails so models do not leak PII or generate harmful or off-brand content
- Defining and tracking the right metrics so you and the client can see whether a GenAI solution is useful and cost-effective
- Fine-tuning and optimising models so they perform well for the use case and budget
- Designing agentic workflows where they genuinely improve outcomes rather than add complexity
- Helping clients understand what GenAI can and cannot do in practice

Keeping it running
You will set up the foundations that protect value over time:
- Experiment tracking and model versioning so you know what works and can roll back safely
- CI/CD pipelines for ML so improvements reach users quickly and reliably
- Monitoring and alerting for models and data so you can catch issues before they damage trust or results
- Communicating operational risks and mitigations to non-technical stakeholders in plain language

Security, quality and compliance
You will help make sure:
- Data is accurate, traceable and well managed so decisions are sound
- Sensitive data is handled correctly, protecting users and the business
- Regulatory and compliance requirements are met, avoiding costly mistakes
- Clients understand the risk profile of AI solutions and the controls in place

Working with people
You will be a bridge between technical and non-technical teams, inside our organisation and on the client side. That means:
- Explaining complex ML and GenAI ideas in plain language, always tied to business outcomes
- Working closely with product managers, engineers and business stakeholders to prioritise work that matters
- Facilitating workshops, playback sessions and show-and-tells that build buy-in and understanding
- Coaching and supporting junior colleagues so the whole team can deliver more value
- Representing the company professionally in client meetings and at industry events

What we are looking for

Experience
- Around 3 to 6 years of experience shipping ML or GenAI solutions into production
- A track record of seeing projects through from discovery to delivery, with clear impact
- Experience working directly with stakeholders or clients in a consulting, advisory or product-facing role

Education
- A Bachelor's or Master's degree in a quantitative field such as Computer Science, Data Science, Statistics, Mathematics or Engineering, or equivalent experience that shows you can deliver results

Technical skills

Core skills
- Strong Python and SQL, with clean, maintainable code
- Solid understanding of ML fundamentals, for example feature engineering, model selection, handling imbalanced data, choosing and interpreting metrics
- Experience with PyTorch or TensorFlow

GenAI specific
- Hands-on experience with LLM APIs or open-source models such as Llama or Mistral
- Experience building RAG systems with vector databases such as FAISS, Pinecone or Weaviate
- Ability to evaluate and improve prompts and retrieval quality using clear metrics
- Understanding of safety practices such as PII redaction and content filtering
- Exposure to agentic frameworks

Cloud and infrastructure
- Comfortable working in at least one major cloud provider: AWS, GCP or Azure
- Familiar with Docker and CI/CD pipelines
- Experience with managed ML platforms such as SageMaker, Vertex AI or Azure ML

Data engineering and MLOps
- Experience with data warehouses such as Snowflake, BigQuery or Redshift
- Workflow orchestration using tools like Airflow or Dagster
- Experience with MLOps tools such as MLflow, Weights & Biases or similar
- Awareness of data and model drift, and how to monitor and respond to it before it erodes value

Soft skills, the things that really matter
- You are comfortable in client-facing settings and can build trust quickly
- You can talk with anyone from a CEO to a new data analyst, and always bring the conversation back to business value
- You can take a vague, messy business problem and turn it into a clear technical plan that links to outcomes and metrics
- You are happy to push back and challenge assumptions respectfully when it is in the client's best interest
- You like helping other people grow and are happy to mentor junior colleagues
- You communicate clearly in writing and in person

Nice to have, not required
Do not rule yourself out if you do not have these. They are a bonus, not a checklist.
- Experience with Delta Lake, Iceberg, Spark, Databricks or Palantir
- Experience optimising LLM serving with tools such as vLLM, TGI or TensorRT-LLM
- Search and ranking experience, for example Elasticsearch or rerankers
- Background in time series forecasting, causal inference, recommender systems or optimisation
- Experience managing cloud costs and IAM so value is not lost to waste
- Ability to work in other languages where needed, for example Java, Scala, Go or bash
- Experience with BI tools such as Looker or Tableau
- Prior consulting experience or leading client projects end to end
- Contributions to open source, conference talks or published papers that show your ability to share ideas and influence the wider community

Got a background that fits and you're up for a new challenge? Send over your latest CV, expectations and availability. Staffworx Limited is a UK based recruitment consultancy partnering with leading global brands across digital, AI, software, and business consulting. Let's talk about what you could add to the mix.
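For context on the RAG requirement above, the sketch below is a minimal, illustrative example of the retrieval step of a RAG system using FAISS, one of the vector-database options the listing names, together with a sentence-transformers embedding model. The corpus, model name and query are assumptions for illustration, not anything specified by the employer.

```python
# A minimal RAG retrieval sketch: embed documents, index them in FAISS,
# and fetch the top-k chunks to ground an LLM prompt.
# The corpus, embedding model and query are illustrative placeholders.
import faiss
from sentence_transformers import SentenceTransformer

documents = [
    "Policy renewals are processed on the first working day of each month.",
    "Claims above 10,000 require a second approver.",
    "Customer emails are retained for seven years.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")            # illustrative embedding model
doc_vectors = model.encode(documents).astype("float32")
faiss.normalize_L2(doc_vectors)                            # normalise so inner product = cosine similarity

index = faiss.IndexFlatIP(doc_vectors.shape[1])            # exact inner-product index
index.add(doc_vectors)

query = "How long do we keep customer emails?"
query_vec = model.encode([query]).astype("float32")
faiss.normalize_L2(query_vec)

scores, ids = index.search(query_vec, 2)                   # top-2 most similar chunks
context = "\n".join(documents[i] for i in ids[0])

# The retrieved context would then be placed into the LLM prompt, e.g.
# f"Answer using only this context:\n{context}\n\nQuestion: {query}"
print(context)
```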
Dec 18, 2025
Full time
Tenth Revolution Group
Northampton, Northamptonshire
Senior Databricks Engineer - Northampton (Hybrid) - Up to £80K + Benefits

Ready to lead, innovate, and make an impact? We're looking for a Senior Databricks Engineer to join a forward-thinking team and take ownership of cutting-edge data solutions. This is your chance to shape the future of data strategy in a business that has been empowering companies for decades, supporting thousands of SMEs worldwide. We value relationships, trust, and innovation, and offer a flexible, inclusive environment where you can grow, make an impact, and be part of something special.

About the Role
You'll play a key role in designing and delivering scalable data pipelines, collaborating on architecture, and mentoring a small team of Data Engineers. This is a hands-on technical leadership position where your expertise will drive innovation and performance across our data ecosystem.

What You'll Do
- Build and optimise data pipelines using Databricks
- Collaborate on data architecture and strategy
- Deliver large-scale workflows for ingestion, transformation, and validation
- Implement best practices for data quality and governance
- Lead and coach a team of Data Engineers

What We're Looking For
- Significant experience with Databricks (including Unity Catalog)
- Strong skills in Python, Spark and SQL
- Cloud expertise
- Knowledge of pipeline tools (Airflow, ADF)
- Leadership and problem-solving ability

Ready to take the next step? Apply now and join us as our Senior Databricks Engineer!
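As an illustration of the "ingestion, transformation, and validation" workflows mentioned above, here is a minimal PySpark sketch of an ingest-and-validate step on Databricks. The paths, table names and quality rules are hypothetical, not taken from the employer's platform.

```python
# Minimal sketch of an ingest-and-validate step on Databricks.
# Paths, table names and the validation rule are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # already available as `spark` in a Databricks notebook

# Ingest raw CSV files landed by an upstream process.
raw = (
    spark.read.option("header", "true")
    .csv("/mnt/landing/orders/")                      # hypothetical landing path
    .withColumn("ingested_at", F.current_timestamp())
)

# Simple validation rule; rows that fail (or cannot be evaluated) go to quarantine.
is_valid = F.col("order_id").isNotNull() & (F.col("amount").cast("double") >= 0)

valid = raw.filter(is_valid)
rejected = raw.filter(~F.coalesce(is_valid, F.lit(False)))

valid.write.format("delta").mode("append").saveAsTable("bronze.orders")              # hypothetical tables
rejected.write.format("delta").mode("append").saveAsTable("bronze.orders_quarantine")
```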
Dec 18, 2025
Full time
Senior Data Engineer
London, 2-3 days on-site
£65,000 - £72,000 + 20% Bonus + Excellent Benefits

Our client is a leading global hospitality brand undergoing an exciting period of rapid growth and transformation. With significant investment in data and technology, they are building a world-class data platform to power decision-making across every area of the business - from supply chain and logistics to marketing, customer sales and in-store operations.

We are seeking an experienced Senior Data Engineer with deep expertise in Databricks to design, build, and optimise the client's data platform. This role will be pivotal in developing scalable data pipelines, enabling advanced analytics, and driving data quality and governance across the organisation. You'll work closely with data scientists, analysts, and business stakeholders to transform raw data into trusted, actionable insights that power critical business decisions.

Required Qualifications
- 6+ years of experience in data engineering
- 3+ years of hands-on experience with Databricks
- Strong working knowledge of Azure
- Strong knowledge of data modelling, ETL/ELT design, and data lakehouse concepts

To apply for this role please email across your CV ASAP.
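To illustrate the ETL/ELT and lakehouse concepts the qualifications refer to, below is a minimal sketch of an incremental upsert into a Delta table using the Delta Lake Python API on Databricks. Table and column names are hypothetical.

```python
# Minimal sketch of an ELT upsert into a lakehouse (Delta) table.
# Table and column names are hypothetical.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

# Latest batch of changed records staged by an upstream extract.
updates = spark.table("staging.customer_updates")          # hypothetical staging table

target = DeltaTable.forName(spark, "silver.customers")     # hypothetical curated table

# Upsert: update existing customers, insert new ones.
(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```

Because the merge is keyed on customer_id, re-running the same batch would not create duplicate rows, which keeps this style of load idempotent.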
Dec 18, 2025
Full time
Role: BI Manager
Rate: £500.00 Per Day - Inside IR35
Location: Central Birmingham, West Midlands (Hybrid Working - 2 days per week onsite)
Duration: Initial 3 - 6 Months with potential to go Permanent

We are currently working with a leading services provider who require a technically strong, Midlands-based Senior BI Manager with a good understanding of Azure Data and Data Engineering tools. Working as a key member of a newly formed Data Engineering team, the successful candidate will lead the design, development, and ongoing enhancement of the client's data and reporting infrastructure. You will be the strategic owner of the Azure Data Platform, overseeing services such as Azure Data Lake, Data Warehouse, Data Factory, Databricks, and Power BI. The technical focus is all Microsoft, primarily Azure, so any Fabric experience would be very beneficial.

Our client is looking for someone who is going to lead the function and has previous experience doing so: someone who really understands data and what it can be used for, who can challenge the business on what they need from the data, and who can challenge the teams to produce the most effective data outputs for the business need, so that the function can improve and become first-class. You will need to be able to drive the direction of how data works for the organisation and the overall Data/BI strategy, design solutions that fit, and demonstrate what value data can bring to the company if it is used effectively. A technical background is essential to be able to understand and bridge the gap between the Data Team and the business environment so that the two collaborate effectively and are challenged both ways. This calls for someone who can understand and appreciate both the technical side and the business strategy side.

Skills & experience required:
- Experience leading a BI function
- Expertise in Azure BI architecture and Cloud services
- Hands-on experience with Azure Fabric, SQL warehousing, Data Lakes, Databricks
- Track record in MI/BI product development using Agile and Waterfall methods
- Experience managing cross-functional teams and sprint activities
- Experience in leading a BI team and a business through the development of and transition to a Data Lake / Factory / Warehouse
- Technical BI development/architect background

If your profile demonstrates strong and recent experience in the above areas, please submit your application ASAP to Jackie Dean at TXP for consideration. TXP takes great pride in representing socially responsible clients who not only prioritise diversity and inclusion but also actively combat social inequality. Together, we have the power to make a profound impact on fostering a more equitable and inclusive society. By working with us, you become part of a movement dedicated to promoting a diverse and inclusive workforce.
Dec 18, 2025
Contractor
Willmott Dixon Group
Letchworth Garden City, Hertfordshire
Willmott Dixon are looking for a Senior Data Engineer to join our Data & Analytics team, who are driving the next wave of our data platform in Microsoft Fabric. As a Senior Data Engineer, you'll be at the heart of designing and delivering scalable, reliable data products and pipelines that power decision-making and drive business performance. You'll work closely with analysts, developers, and stakeholders to align data engineering efforts with business goals, so you need to be a strong communicator with evidence of previous stakeholder collaboration and engagement to deliver results. Our IT team are based at our head office in Letchworth Garden City, Hertfordshire, but this role can be done remotely, although occasional travel to our head office and other sites will be required to support business needs.

What you'll do:
You'll play a key role in shaping how we deliver a trusted, single-source-of-truth platform for the organisation, building the foundations for self-service analytics and smarter, faster insights. But this is more than just a technical role. You'll be a mentor and thought partner, bringing your experience to evolve our engineering capability while fostering a culture of innovation, experimentation, and continuous improvement. You'll promote inclusive collaboration, encourage new ideas, and play a key part in pushing our data capabilities to the next level. We're looking for someone who is seeking a long-term career in our inclusive team.

What you'll bring:
- Technical Excellence: Advanced Python and SQL skills, with hands-on experience in relational and dimensional data modelling.
- Modern Data Engineering: Proven ability to design and deliver scalable solutions using Microsoft Fabric (strongly preferred) or Databricks.
- Supporting Know-How: Solid grasp of data architecture, governance and security.
- DevOps & Cloud Fluency: Practical experience with CI/CD pipelines, APIs, and cloud tooling (e.g. Azure DevOps).
- Engineering Craftsmanship: Commitment to clean, maintainable code, robust testing, graceful failure handling, and managing technical debt.
- Problem Solver: Strong analytical mindset and a skill for root-cause resolution.
- Growth Mindset: Comfortable navigating ambiguity, balancing exploration with simplification, and thriving in evolving environments.
- Impact-Driven: Passionate about turning data into business value, with a collaborative and customer-focused approach.
- Clear Communicator: Able to translate complex technical concepts for diverse audiences and engage stakeholders effectively.
- Self-Starter: Skilled at prioritising, managing time, and delivering high-quality work that drives outcomes.
- Team Player: Supportive, curious, and constructive; always ready to mentor, challenge the status quo, and build together.

Why join us?
You will be part of a new and evolving team and have the rare opportunity to shape something from the ground up. We're committed to adopting the latest technologies and methodologies, and you'll be right at the heart of that journey. This is your chance to make a meaningful impact and grow your career in a supportive, forward-thinking team environment.
You'll be joining an IT team that prides itself on being:
- Flexible
- Fun
- Supportive of people development
- Genuine, friendly and inclusive
- Innovative and keen to improve
- Responsive to customer needs

About Us
With over 170 years of rich history, Willmott Dixon's purpose is beyond profit: delivering brilliant buildings, transforming lives, strengthening communities and enhancing the environment so our world is fit for future generations. Ensuring that we add lasting value to the neighbourhoods we work in, our values, people, innovation, partnerships and focus on sustainability have allowed us to build a successful and solid privately owned business where our people can thrive. Willmott Dixon was recognised by The Sunday Times as one of the Top 10 "Big" Companies to Work For in 2025, named among the Times Top 50 Employers for Gender Equality in 2024, and ranked in the Top Five of Europe's 1,000 best workplaces by the Financial Times in 2025. Willmott Dixon is also the first major contractor and developer to win a King's Award for Enterprise in the category of sustainable development.
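As a small illustration of the "robust testing" point under Engineering Craftsmanship above, here is a minimal sketch of how a pipeline transformation might be unit-tested with pytest. The transformation, column names and rules are hypothetical examples, not Willmott Dixon code.

```python
# Minimal sketch of unit-testing a small pipeline transformation with pytest.
# The transformation, column names and rules are hypothetical.
import pandas as pd
import pytest


def clean_orders(df: pd.DataFrame) -> pd.DataFrame:
    """Drop rows without an order id and convert the amount column to numeric."""
    out = df.dropna(subset=["order_id"]).copy()
    out["amount"] = out["amount"].astype(float)
    return out


def test_rows_without_order_id_are_dropped():
    df = pd.DataFrame({"order_id": ["A1", None], "amount": ["10.00", "3.00"]})
    result = clean_orders(df)
    assert list(result["order_id"]) == ["A1"]


def test_amount_is_converted_to_float():
    df = pd.DataFrame({"order_id": ["A1"], "amount": ["3.50"]})
    result = clean_orders(df)
    assert result["amount"].iloc[0] == pytest.approx(3.5)
```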
Dec 18, 2025
Full time
Senior Data Engineer
Location: Manchester
Salary: Up to £100,000 plus bonus

We are seeking an experienced Data Engineer with expertise in Databricks to join a global consultancy on a major transformation project. This is a fantastic opportunity to work on cutting-edge data solutions in a collaborative, forward-thinking environment.

About the role:
- Work with a global leader in analytics and digital transformation.
- Be part of a high-impact project driving innovation in the insurance domain.
- Enjoy a senior-level role with clear progression opportunities and exposure to strategic decision-making.
- Competitive package: up to £100K base + bonus, plus other benefits.

What We're Looking For
- Proven experience as a Data Engineer.
- Strong hands-on expertise with Databricks.
- Insurance domain experience.
- Solid background in data management.
Dec 17, 2025
Full time
Databricks Data Engineer
London | Senior Manager | Up to £100K + Bonus

Ready to take your data engineering career to the next level? Join a global consultancy on a major transformation project within the insurance domain. This is your chance to work with cutting-edge technologies, influence strategic decisions, and make a real impact in a collaborative, forward-thinking environment.

Why This Role?
- Be part of a high-profile project driving innovation in data and analytics.
- Work with a global leader in digital transformation.
- Enjoy senior-level responsibilities, clear progression, and exposure to decision-makers.
- Competitive package: up to £100K base + 12% bonus + benefits.
- Hybrid role based in London.

What You'll Do
- Design and develop data pipelines and transformation workflows using Azure Databricks.
- Collaborate with cross-functional teams to deliver data-driven solutions.
- Work on cloud-based data storage and processing platforms.
- Contribute to strategic decision-making and innovation in the insurance domain.

What We're Looking For
- Proven Data Engineer with 5+ years of hands-on Databricks experience.
- Insurance domain expertise - essential.
- Strong background in data management, ETL, and SQL.
- Familiarity with Azure and Microsoft BI tools.
- Immediate start. No visa sponsorship.

This is more than a job - it's a chance to shape the future of data engineering. Apply today and join a team where your ideas matter!
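As an illustration of the "data pipelines and transformation workflows using Azure Databricks" responsibility above, here is a minimal PySpark sketch that deduplicates policy records to the latest event before writing a curated table. The table and column names are hypothetical, chosen only to fit the insurance flavour of the listing.

```python
# Minimal PySpark sketch of a transformation workflow on Azure Databricks:
# keep only the latest record per policy, then write a curated Delta table.
# Table and column names are hypothetical.
from pyspark.sql import SparkSession, Window, functions as F

spark = SparkSession.builder.getOrCreate()

policies = spark.table("raw.policy_events")                     # hypothetical source table

latest_per_policy = Window.partitionBy("policy_id").orderBy(F.col("event_ts").desc())

curated = (
    policies
    .withColumn("rn", F.row_number().over(latest_per_policy))
    .filter(F.col("rn") == 1)                                   # most recent event per policy
    .drop("rn")
    .withColumn("processed_at", F.current_timestamp())
)

curated.write.format("delta").mode("overwrite").saveAsTable("curated.policies_latest")
```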
Dec 17, 2025
Full time
Locations: Stockholm, Copenhagen V, Berlin, München, London

Who We Are
Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures, and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive.

We Are BCG X
We're a diverse team of more than 3,000 tech experts united by a drive to make a difference. Working across industries and disciplines, we combine our experience and expertise to tackle the biggest challenges faced by society today. We go beyond what was once thought possible, creating new and innovative solutions to the world's most complex problems. Leveraging BCG's global network and partnerships with leading organizations, BCG X provides a stable ecosystem for talent to build game-changing businesses, products, and services from the ground up, all while growing their career. Together, we strive to create solutions that will positively impact the lives of millions.

What You'll Do
Our BCG X teams own the full analytics value chain end to end: framing new business challenges, designing innovative algorithms, implementing and deploying scalable solutions, and enabling colleagues and clients to fully embrace AI. Our product offerings span from fully custom builds to industry-specific, leading-edge AI software solutions. As a (Senior) AI Software Engineer you'll be part of our rapidly growing engineering team and help to build the next generation of AI solutions. You'll have the chance to partner with clients in a variety of BCG regions and industries, and on key topics like climate change, enabling them to design, build, and deploy new and innovative solutions. Additional responsibilities will include developing and delivering thought leadership in scientific communities and papers as well as leading conferences on behalf of BCG X. We are looking for talented individuals with a passion for software development, large-scale data analytics and transforming organizations into AI-led innovative companies.
Successful candidates possess the following:
- 4+ years of experience in a technology consulting environment
- Apply software development practices and standards to develop robust and maintainable software
- Actively involved in every part of the software development life cycle
- Experienced at guiding non-technical teams and consultants in best practices for robust software development
- Optimize and enhance the computational efficiency of algorithms and software design
- Motivated by a fast-paced, service-oriented environment and interacting directly with clients on new features for future product releases
- Enjoy collaborating in teams to share software design and solution ideas
- A natural problem-solver and intellectually curious across a breadth of industries and topics
- Master's degree or PhD in a relevant field of study - please provide all academic certificates showing the final grades (A-level, Bachelor, Master, PhD)

Additional tasks:
Designing and building data & AI platforms for our clients. Such platforms provide data and (Gen)AI capabilities to a wide variety of consumers and use cases across the client organization, often as part of the large (AI) transformational journeys BCG does for its clients.

This often involves the following engineering disciplines:
- Cloud Engineering
- Data Engineering (not building pipelines, but designing and building the framework)
- DevOps
- MLOps/LLMOps

You will often work with the following technologies:
- Azure, AWS, GCP
- Airflow, dbt, Databricks, Snowflake, etc.
- GitHub, Azure DevOps and related developer tooling and CI/CD platforms; Terraform or other Infra-as-Code
- MLflow, AzureML or similar for MLOps; LangSmith, Langfuse and similar for LLMOps

The difference from our "AI Engineer" role: do you "use/consume" these technologies, or are you the one who "provides" them to the rest of the organization?

What You'll Bring
TECHNOLOGIES:
- Programming Languages: Python
- Experience with additional programming languages is a plus

Additional info
BCG offers a comprehensive benefits program, including medical, dental and vision coverage, telemedicine services, life, accident and disability insurance, parental leave and family planning benefits, caregiving resources, mental health offerings, a generous retirement program, financial guidance, paid time off, and more.

Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity/expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer.
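Since the listing names MLflow among its MLOps technologies, here is a minimal, illustrative sketch of MLflow experiment tracking: logging parameters, a metric and a model so runs can be compared and rolled back later. The experiment name, model choice and data are assumptions for illustration only.

```python
# Minimal MLflow tracking sketch: log parameters, a metric and a model artifact.
# Experiment name, hyperparameters and data are illustrative.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("churn-baseline")               # illustrative experiment name

with mlflow.start_run():
    params = {"n_estimators": 100, "max_depth": 5}
    model = RandomForestClassifier(**params, random_state=42).fit(X_train, y_train)

    mlflow.log_params(params)
    mlflow.log_metric("accuracy", accuracy_score(y_test, model.predict(X_test)))
    mlflow.sklearn.log_model(model, "model")          # versioned artifact for later comparison or rollback
```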
Dec 17, 2025
Full time
Locations : Stockholm Copenhagen V Berlin München London Who We Are Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation-inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures-and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive. We Are BCG X We're a diverse team of more than 3,000 tech experts united by a drive to make a difference. Working across industries and disciplines, we combine our experience and expertise to tackle the biggest challenges faced by society today. We go beyond what was once thought possible, creating new and innovative solutions to the world's most complex problems. Leveraging BCG's global network and partnerships with leading organizations, BCG X provides a stable ecosystem for talent to build game-changing businesses, products, and services from the ground up, all while growing their career. Together, we strive to create solutions that will positively impact the lives of millions. What You'll Do Our BCG X teams own the full analytics value-chain end to end: framing new business challenges, designing innovative algorithms, implementing, and deploying scalable solutions, and enabling colleagues and clients to fully embrace AI. Our product offerings span from fully custom-builds to industry specific leading edge AI software solutions. As a (Senior) AI Software Engineer you'll be part of our rapidly growing engineering team and help to build the next generation of AI solutions. You'll have the chance to partner with clients in a variety of BCG regions and industries, and on key topics like climate change, enabling them to design, build, and deploy new and innovative solutions. Additional responsibilities will include developing and delivering thought leadership in scientific communities and papers as well as leading conferences on behalf of BCG X. We are looking for talented individuals with a passion for software development, large-scale data analytics and transforming organizations into AI led innovative companies. 
Successful candidates possess the following: 4+ years of experience in a technology consulting environment Apply software development practices and standards to develop robust and maintainable software Actively involved in every part of the software development life cycle Experienced at guiding non-technical teams and consultants in best practices for robust software development Optimize and enhance computational efficiency of algorithms and software design Motivated by a fast-paced, service-oriented environment and interacting directly with clients on new features for future product releases Enjoy collaborating in teams to share software design and solution ideas A natural problem-solver and intellectually curious across a breadth of industries and topics Master's degree or PhD in a relevant field of study - please provide all academic certificates showing the final grades (A-level, Bachelor, Master, PhD) Additional tasks: Designing and building data & AI platforms for our clients. Such platforms provide data and (Gen)AI capabilities to a wide variety of consumers and use cases across the client organization. Often part of large (AI) transformational journeys BCG delivers for its clients. Often involves the following engineering disciplines: Cloud Engineering Data Engineering (not building pipelines but designing and building the framework) DevOps MLOps/LLMOps Often works with the following technologies: Azure, AWS, GCP Airflow, dbt, Databricks, Snowflake, etc. GitHub, Azure DevOps and related developer tooling and CI/CD platforms, Terraform or other Infra-as-Code MLflow, AzureML or similar for MLOps; LangSmith, Langfuse and similar for LLMOps The difference from our "AI Engineer" role is whether you "use/consume" these technologies or are the one who "provides" them to the rest of the organization. What You'll Bring TECHNOLOGIES: Programming Languages: Python. Experience with additional programming languages is a plus. Additional info BCG offers a comprehensive benefits program, including medical, dental and vision coverage, telemedicine services, life, accident and disability insurance, parental leave and family planning benefits, caregiving resources, mental health offerings, a generous retirement program, financial guidance, paid time off, and more. Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity/expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer.
Dec 16, 2025
Full time
Senior Data Engineer Salary: Up to £70,000 I am working with a forward-thinking organisation that is modernising its data platform to support scalable analytics and business intelligence across the Group. With a strong focus on Microsoft technologies and cloud-first architecture, they are looking to bring on a Data Engineer to help design and deliver impactful data solutions using Azure. This is a hands-on role where you will work across the full data stack, collaborating with architects, analysts, and stakeholders to build a future-ready platform that drives insight and decision-making. In this role, you will be responsible for: Building and managing data pipelines using Azure Data Factory and related services. Building and maintaining data lakes, data warehouses, and ETL/ELT processes. Designing scalable data solutions and models for reporting in Power BI. Supporting data migration from legacy systems into the new platform. Ensuring data models are optimised for performance and reusability. To be successful in this role, you will have: Hands-on experience creating data pipelines using Azure services such as Synapse and Data Factory. Reporting experience with Power BI. Strong understanding of SQL, Python, or PySpark. Knowledge of the Azure data platform including Azure Data Lake Storage, Azure SQL Data Warehouse, or Azure Databricks. Some of the package/role details include: Salary up to £70,000 Hybrid working model with two days per week in Portsmouth Pension scheme and private healthcare options Opportunities for training and development This is just a brief overview of the role. For the full details, simply apply with your CV and I'll be in touch to discuss it further.
Dec 16, 2025
Full time
Senior Data Engineer Salary: Up to £75,000 I am working with a well-established financial services organisation that is undergoing a major transformation of its data and analytics capabilities. The data team plays a critical role in building scalable, cloud-first data solutions that provide actionable insights to support executive and operational decision-making. These insights underpin the organisation's growth strategy across both domestic and international markets. As a Senior Data Engineer, you will take an active role in shaping solution delivery against business requirements while contributing to the wider technical architecture and strategy. This is a hands-on position where you will spend most of your time developing robust data solutions while also mentoring a small team of Data Engineers to ensure adherence to best practices and governance standards. In this role, you will be responsible for: Designing end-to-end data architecture aligned with modern best practices. Building and managing ingestion pipelines using Databricks and related tools. Developing PySpark/Spark SQL notebooks for complex transformations and cleansing. Applying governance, security, and CI/CD best practices across cloud environments. Leading technical discussions and producing professional documentation. To be successful in this role, you will have: Hands-on experience with Databricks including Unity Catalog. Strong PySpark/Spark SQL skills for large-scale transformations. Experience integrating with diverse data sources such as APIs, cloud storage and databases. Experience with the Azure cloud data platform. Some of the package/role details include: Salary up to £75,000 Hybrid working model - one day per week in Oxfordshire 25 days holiday plus bank holidays Company pension scheme Private healthcare Exposure to cutting-edge Databricks projects and enterprise-scale data platforms This is just a brief overview of the role. For the full details, simply apply with your CV and we'll be in touch to discuss it further.
Dec 14, 2025
Full time
Senior Data Engineering Consultant - £60,000 - Hybrid Key Responsibilities Lead, mentor, and develop a team of Technical Consultants. Manage resource planning, scheduling, and overall delivery workflows. Collaborate with Pre-sales, Commercial, and Project Management teams to scope and deliver projects. Contribute to technical delivery, designing scalable data solutions in Azure/Microsoft environments. Support cloud migrations, data lake builds, and ETL/ELT pipeline development. Ensure delivery follows best practices and internal standards. Skills & Experience Strong leadership and relationship-building skills. Experience guiding or managing technical teams. Deep hands-on experience in Data Engineering using Microsoft Fabric, Azure Databricks, Synapse, Data Factory, and/or SQL Server. Expertise in SQL and Python for ETL/ELT development. Knowledge of data lakes, medallion lakehouse architecture, and large-scale dataset management. Solid understanding of BI, data warehousing, and database optimisation. To apply for this role, please submit your CV or contact Dillon Blackburn on (phone number removed) or at (url removed). Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
Dec 13, 2025
Full time
Data Engineer Manager Hybrid - London with 2/3 days WFH Circa £85,000 - £95,000 + Attractive Bonus & Benefits This newly created Data Engineer Manager position is an excellent opportunity for someone who enjoys being hands-on technically as well as managing a small team of Data Engineers. It would suit those with formal management experience, or potentially a Lead or Senior Engineer used to leading teams and now looking to take on more managerial responsibility. Our client is a well-established and rapidly growing global business with its headquarters based in London. The Data Engineer Manager will play a pivotal role at the heart of our client's data & analytics operation. Having implemented a new MS Fabric based Data platform, the need is now to scale up and meet the demand to deliver data-driven insights and strategies right across the business globally. There'll be a hands-on element to the role as you'll be troubleshooting, doing code reviews, steering the team through deployments and acting as the escalation point for data engineering. This is a hybrid role based in Central / West London with the flexibility to work from home 2 or 3 days per week. Our client can offer an excellent career development opportunity and a work environment that's vibrant, friendly, and collaborative. Key Responsibilities include: Define and take ownership of the roadmap for the ongoing development and enhancement of the Data Platform. Design, implement, and oversee scalable data pipelines and ETL/ELT processes within MS Fabric, leveraging expertise in Azure Data Factory, Databricks, and other Azure services. Advocate for engineering best practices and ensure long-term sustainability of systems. Integrate principles of data quality, observability, and governance throughout all processes. Participate in recruiting, mentoring, and developing a high-performing data organization. Demonstrate pragmatic leadership by aligning multiple product workstreams to achieve a unified, robust, and trustworthy data platform that supports production services such as dashboards, new product launches, analytics, and data science initiatives. Develop and maintain comprehensive data models, data lakes, and data warehouses (e.g., utilizing Azure Synapse). Collaborate with data analysts, Analytics Engineers, and various stakeholders to fulfil business requirements. Key Experience, Skills and Knowledge: Experience leading data or platform teams in a production environment as a Senior Data Engineer, Tech Lead, Data Engineering Manager etc. Proven success with modern data infrastructure: distributed systems, batch and streaming pipelines. Hands-on knowledge of tools such as Apache Spark, Kafka, Databricks, dbt or similar. Experience building, defining, and owning data models, data lakes, and data warehouses. Programming proficiency in the likes of Python, PySpark, SQL, Scala or Java. Experience operating in a cloud-native environment such as Azure, AWS, GCP etc. (Fabric experience would be beneficial but is not essential). Excellent stakeholder management and communication skills. A strategic mindset, with a practical approach to delivery and prioritisation. Exposure to data science concepts and techniques is highly desirable. Strong problem-solving skills and attention to detail.
Salary is dependent on experience and expected to be in the region of £85,000 - £95,000 + an attractive bonus scheme and benefits package. For further information, please send your CV to Wayne Young at Young's Employment Services Ltd. YES operates as both a Recruitment Agency and a Recruitment Business.
Dec 13, 2025
Full time
Role: BI Manager Salary: £70,000 - £80,000 PA Plus Bonus and Benefits Location: Central Birmingham, West Midlands (Hybrid Working - 2 days per week onsite) We are currently working with a leading Midlands-based services provider who require a technically strong Senior BI Manager with a good understanding of Azure Data and Data Engineering tools. Working as a key member of a newly formed Data Engineering team, the successful candidate will lead the design, development, and ongoing enhancement of the client's data and reporting infrastructure. You will be the strategic owner of the Azure Data Platform, overseeing services such as Azure Data Lake, Data Warehouse, Data Factory, Databricks, and Power BI. The technical focus is all Microsoft, primarily Azure, so any Fabric experience would be very beneficial. Our client is looking for someone who is going to lead the function and has previous experience doing this - someone who really understands data and what it can be used for, who can challenge the business on what they need from the data, and who can challenge the teams to produce the most effective data outputs for the business so that the function can improve and become first-class. You will need to be able to drive the direction of how data works for the organisation and the overall Data/BI strategy, design solutions that fit, and demonstrate what value data can bring to the company if it is used effectively. A technical background is essential to be able to understand and bridge the gap between the Data Team and the business environment so that the two collaborate effectively and are challenged both ways - someone who can understand and appreciate both the technical side and the business strategy side. Our client offers a good, supportive environment which is going through a major transformation driven by technology. Skills & experience required: Experience leading a BI function Expertise in Azure BI architecture and Cloud services Hands-on experience with Microsoft Fabric, SQL warehousing, Data Lakes, Databricks Track record in MI/BI product development using Agile and Waterfall methods Experience managing cross-functional teams and sprint activities Experience in leading a BI team and a business through the development and transition to a Data Lake / Factory / Warehouse Technical BI development/architect background Benefits: Achievable bonus scheme 4% Pension Life Insurance 3 x salary 25 days annual leave plus statutory - 1 x extra day every year for the first 3 years Blue Light Card Medicash - includes discounted gym memberships etc. If your profile demonstrates strong and recent experience in the above areas, please submit your application ASAP to Jackie Dean at TXP for consideration. TXP takes great pride in representing socially responsible clients who not only prioritise diversity and inclusion but also actively combat social inequality. Together, we have the power to make a profound impact on fostering a more equitable and inclusive society. By working with us, you become part of a movement dedicated to promoting a diverse and inclusive workforce.
Dec 13, 2025
Full time
Senior Data Engineer - Make an Impact About Us We're driving a major transformation in data and analytics, and we need a Senior Data Engineer who can do more than just build pipelines - someone who can demonstrate real impact, influence stakeholders, and help shape the future of our data platform. Why This Role Exists This is an opportunity for an experienced data engineer to grow into a Principal-level role within 2-3 years. You'll join a small, ambitious team with high visibility across the business, working on modernisation projects that will redefine how we use data. What We're Looking For Impact-Driven Mindset: We want someone who can clearly articulate the difference they've made in previous roles - not just list tasks. Show us how you improved processes, accelerated insights, or drove strategic decisions. Technical Expertise: Essential: Strong experience with Microsoft Fabric or Databricks. Python Proficiency: Advanced coding skills for building robust, scalable solutions. Strong SQL and data modelling (relational and dimensional). Modern Data Engineering: Proven ability to design and deliver scalable solutions using modern architectures (lakehouse, medallion, warehouse-first). Stakeholder Engagement: Ability to influence and collaborate with business leaders, translating technical solutions into measurable business outcomes. Growth Potential: Comfortable mentoring junior engineers and keen to develop into a leadership role. Mindset: Curious, proactive, and passionate about turning data into tangible business value. What You'll Do Drive the evolution of our data platform using Microsoft Fabric and modern engineering practices. Build and optimise data pipelines for ingestion, transformation, and modelling. Support migration from legacy systems (e.g., Synapse) to modern architectures. Collaborate with stakeholders to ensure solutions deliver real business impact. Contribute to innovation projects, including AI integration and advanced analytics. Why Join Us A small, supportive team with big ambitions. High visibility and the chance to make a real difference. Opportunity to shape modern data capabilities from the ground up. Flexible working (remote with quarterly meet-ups). What Success Looks Like You can evidence impact: cost savings, efficiency gains, improved decision-making, or accelerated delivery timelines. You're trusted by stakeholders and seen as a partner who drives change. You bring clarity and simplicity to complex data challenges. Please note you MUST have Python and Microsoft Fabric experience. Please get in touch with Kamilla (details removed). Modis International Ltd acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers in the UK. Modis Europe Ltd provide a variety of international solutions that connect clients to the best talent in the world. For all positions based in Switzerland, Modis Europe Ltd works with its licensed Swiss partner Accurity GmbH to ensure that candidate applications are handled in accordance with Swiss law. Both Modis International Ltd and Modis Europe Ltd are Equal Opportunities Employers. By applying for this role your details will be submitted to Modis International Ltd and/or Modis Europe Ltd. Our Candidate Privacy Information Statement which explains how we will use your information is available on the Modis website.
Dec 12, 2025
Full time
Databricks Engineer Location: Oxfordshire (Hybrid) Salary: Competitive + Benefits Are you an experienced Databricks Engineer looking for your next challenge? The Role This is a hands-on technical role with leadership responsibilities. You'll design and deliver scalable data solutions, work closely with data leaders on architecture and strategy, and mentor a small team of Data Engineers to ensure best practices. Key Responsibilities Build and maintain scalable data pipelines and ETL processes using Databricks Collaborate on data architecture and translate designs into build plans Deliver large-scale data workflows and optimise for performance Implement data quality and validation processes What We're Looking For Strong experience with Databricks Proficiency in Python, Spark, and SQL Experience with cloud platforms Knowledge of pipeline tools Excellent problem-solving and leadership skills If you're passionate about data engineering and want to make an impact, apply today!
Dec 12, 2025
Full time
Senior Azure Data Engineer Hybrid - Work From Home and West London Circa £70,000 - £80,000 + Range of benefits A well-known and prestigious business is looking to add a Senior Azure Data Engineer to their data team. This is an exciting opportunity for a Data Engineer who's not just technical, but also enjoys directly engaging and collaborating with stakeholders from across business functions such as finance, operations, planning, manufacturing, retail, e-commerce etc. Having nearly completed the process of migrating data from their existing on-prem databases to an Azure Cloud based platform, the Senior Data Engineer will play a key role in helping make best use of the data by gathering and agreeing requirements with the business to build data solutions that align accordingly. Working with diverse data sets from multiple systems and overseeing their integration and optimisation will require the development, management and optimisation of data pipelines using tools in the Azure Cloud. Our client has expanded rapidly and been transformed in recent years; they're an iconic business with a special work environment that's fostered a strong and positive culture amongst the whole workforce. This is a hybrid role where the postholder can work from home 2 or 3 days per week; the other days will be based onsite in West London just a few minutes' walk from a Central Line tube station. The key responsibilities for the post include: Develop, construct, test and maintain data architectures within large scale data processing systems. Develop and manage data pipelines using Azure Data Factory, Delta Lake and Spark. Utilise Azure Cloud architecture knowledge to design and implement scalable data solutions. Utilise Spark, SQL, Python, R, and other data frameworks to manipulate data and gain a thorough understanding of the dataset's characteristics. Interact with API systems to query and retrieve data for analysis. Collaborate with business users / stakeholders to gather and agree requirements. To be considered for the post you'll need at least 5 years' experience, ideally with 1 or 2 years at a senior / lead level. You'll need to be goal driven and able to take ownership of work tasks without the need for constant supervision. You'll be engaging with multiple business areas, so the ability to communicate effectively to understand requirements and build trusted relationships is a must. It's likely you'll have most, if not all, of the following: Experience as a Senior Data Engineer or similar. Strong knowledge of Azure Cloud architecture and Azure Databricks, DevOps and CI/CD. Experience with PySpark, Python, SQL and other data engineering development tools. Experience with metadata driven pipelines and SQL serverless data warehouses. Knowledge of querying API systems. Experience building and optimising ETL pipelines using Databricks. Strong problem-solving skills and attention to detail. Understanding of data governance and data quality principles. A degree in computer science, engineering, or equivalent experience. Salary will be dependent on experience and likely to be in the region of £70,000 - £80,000, although the client may consider higher for an outstanding candidate. Our client can also provide a vibrant, rewarding, and diverse work environment that supports career development. Candidates must be authorised to work in the UK and not require sponsorship either now or in the future. For further information, please send your CV to Wayne Young at Young's Employment Services Ltd.
Young's Employment Services acts in the capacity of both an Employment Agency and an Employment Business.
Dec 12, 2025
Full time
Senior Data Engineer Salary: Up to £70,000 I am working with a forward-thinking organisation that is modernising its data platform to support scalable analytics and business intelligence across the Group. With a strong focus on Microsoft technologies and cloud-first architecture, they are looking to bring on a Data Engineer to help design and deliver impactful data solutions using Azure. This is a hands-on role where you will work across the full data stack, collaborating with architects, analysts, and stakeholders to build a future-ready platform that drives insight and decision-making. In this role, you will be responsible for: Building and managing data pipelines using Azure Data Factory and related services. Building and maintaining data lakes, data warehouses, and ETL/ELT processes. Designing scalable data solutions and models for reporting in Power BI. Supporting data migration from legacy systems into the new platform. Ensuring data models are optimised for performance and reusability. To be successful in this role, you will have: Hands-on experience creating data pipelines using Azure services such as Synapse and Data Factory. Reporting experience with Power BI. Strong understanding of SQL, Python, or PySpark. Knowledge of the Azure data platform including Azure Data Lake Storage, Azure SQL Data Warehouse, or Azure Databricks. Some of the package/role details include: Salary up to £70,000 Hybrid working model with two days per week in Portsmouth Pension scheme and private healthcare options Opportunities for training and development This is just a brief overview of the role. For the full details, simply apply with your CV and I'll be in touch to discuss it further.
Dec 12, 2025
Full time
About Us Makutu designs, builds and supports Microsoft Azure cloud data platforms. We are a Microsoft Solutions Partner (Azure Data & AI) and are busy building a talented team with relevant skills to deliver industry-leading data platforms for our customers. The Role The Data Engineer role is key to building and growing the in-house technical team at Makutu. The role will provide the successful applicants with the opportunity for significant career development while working with a range of large businesses for whom data is critical to their success. Working as part of the team and with the customer, you'll require excellent written and verbal English language and communication skills. Big growth plans are in place to build a broader and deeper technical capability with a focus on the Microsoft Azure technology stack. The position of Data Engineer is a key role in the wider capability of our team. Occasional visits to our Head Office and customer sites will be required. Key responsibilities: Identify, design, and implement working practices across data pipelines, data architectures, testing and deployment Understand complex business requirements and provide solutions to business problems Understand modern data architecture approaches and associated cloud-focused solutions Define data engineering best practice and share it across the organisation Collaborate with the wider team on data strategy Skills and experience: A relevant Bachelor's degree in Computing, Mathematics, Data Science or similar (ideal but not essential) A Master's degree in Data Science (ideal but not essential) Experience building data pipelines with modern practices including the use of cloud native technologies, DevOps practices, CI/CD pipelines and agile delivery Experience with data modelling, data warehousing, data lake solutions Able to communicate effectively with senior stakeholders. Successful candidates will likely possess Azure certifications such as DP-600 and/or DP-700. Also, applicants will have experience working with some of the following technologies: Power BI Power Apps Blob storage Synapse Azure Data Factory (ADF) IoT Hub SQL Server Azure Data Lake Storage Azure Databricks Purview Power Platform Python
Dec 11, 2025
Full time
Senior Data Engineer Location: Manchester Salary: Up to £105,000 plus bonus We are seeking an experienced Data Engineer with expertise in Databricks to join a global consultancy on a major transformation project. This is a fantastic opportunity to work on cutting-edge data solutions in a collaborative, forward-thinking environment. About the role: Work with a global leader in analytics and digital transformation. Be part of a high-impact project driving innovation in the insurance domain. Enjoy a senior-level role with clear progression opportunities and exposure to strategic decision-making. Competitive package: Up to £105K base + bonus, plus other benefits. What We're Looking For Proven experience as a Data Engineer. Strong hands-on expertise with Databricks. Insurance domain experience. Solid background in data management.