Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

29 jobs found

Current Search
senior data engineer in databricks
Tenth Revolution Group
Databricks Data Engineer
Tenth Revolution Group City, Manchester
Senior Data Engineer
Location: Manchester
Salary: Up to £105,000 + bonus

We are seeking an experienced Data Engineer with expertise in Databricks to join a global consultancy on a major transformation project. This is a fantastic opportunity to work on cutting-edge data solutions in a collaborative, forward-thinking environment.

About the role:
  • Work with a global leader in analytics and digital transformation.
  • Be part of a high-impact project driving innovation in the insurance domain.
  • Enjoy a senior-level role with clear progression opportunities and exposure to strategic decision-making.
  • Competitive package: up to £105K base + bonus, plus other benefits.

What We're Looking For:
  • Proven experience as a Data Engineer.
  • Strong hands-on expertise with Databricks.
  • Insurance domain experience.
  • Solid background in data management.
Dec 11, 2025
Full time
Deerfoot Recruitment Solutions Limited
Data Technical Lead
Deerfoot Recruitment Solutions Limited City, London
Data Engineering Technical Lead
Global Investment Bank
London - Hybrid
Permanent - Excellent Package + Benefits

We are working with one of the world's leading banking groups, who we have partnered with for 15 years. We are seeking an experienced Data Architect / EDM Developer / Data Engineering Lead to join their International Technology team in London. You will be a key part of the Architecture, Middleware, Data & Enterprise Services (AMD) division, driving data engineering, integration and automation initiatives across our client's EMEA banking and securities entities. This is a hands-on leadership role, combining technical expertise with mentoring and team leadership.

Key Responsibilities:
  • Architect, design and deliver enterprise-wide EDM and data solutions.
  • Lead and mentor EDM developers, ensuring high-quality, cost-effective delivery.
  • Drive data innovation, automation and best practices across EMEA.
  • Translate business requirements into functional and technical designs.
  • Ensure compliance with SDLC, governance, and risk policies.

Skills & Experience - Essential:
  • Strong SQL Server or Snowflake skills.
  • Advanced knowledge of low-code/no-code data engineering / ETL tools - ideally Markit EDM (v19.2+) or similar (e.g. Informatica).
  • Proven delivery experience in the Financial Services / Banking sector.
  • Deep understanding of SDLC, systems integration, and data warehousing.
  • Ability to gather requirements and liaise effectively with business stakeholders.

Desirable Skills:
  • Cloud (AWS / Azure), Python, PowerShell, APIs.
  • Data pipelines, lineage, automation.
  • BI tools (Power BI, Tableau, SSRS).
  • Modern data architectures (lakehouse, data mesh).
  • CI/CD, GitHub, Control-M, dbt/Databricks.

This is an opportunity to join a global top-5 bank with long-term stability, world-class resources, and clear career progression routes.

Enterprise Data Architect, EDM Developer, Data Engineering Lead, Data Architect, ETL Developer, Data Solutions Architect, Senior Data Engineer (Financial Services).

Apply today for full details. Deerfoot Recruitment Solutions Ltd is a leading independent tech recruitment consultancy in the UK. For every CV sent to clients, we donate £1 to The Born Free Foundation. We are a Climate Action Workforce in partnership with Ecologi. If this role isn't right for you, explore our referral reward programme, with payouts at interview and placement milestones. Visit our website for details. Deerfoot Recruitment Solutions Ltd is acting as an Employment Agency in relation to this vacancy.
Dec 11, 2025
Full time
Tenth Revolution Group
Senior Databricks Engineer - £70,000 - Hybrid
Tenth Revolution Group Hook Norton, Oxfordshire
Senior Databricks Engineer - £70,000 - Hybrid

We're looking for a hands-on Senior Databricks Engineer to lead the delivery of scalable data solutions within an Agile environment. Working closely with the Data Product Manager and Data Architect, you will shape and develop our data platform, delivering high-quality pipelines and insights that support strategic decision-making. You will also manage and coach a small team of Data Engineers, driving best practice, consistency, and governance.

Key Responsibilities:
  • Translate business strategy into data solutions and ensure alignment with product goals.
  • Provide technical leadership, breaking initiatives into Features, Epics, and Stories and setting engineering standards.
  • Collaborate with the Data Architect to design and implement data architecture and build plans.
  • Build and maintain scalable data pipelines, ETL/ELT processes, and large-scale data workflows.
  • Optimise data systems for performance, reliability, and scalability.
  • Implement data quality processes and maintain data models, schemas, and documentation.
  • Operate CI/CD practices in Azure DevOps and contribute to Agile sprint cycles.
  • Troubleshoot and resolve pipeline issues promptly.
  • Stay current with industry trends and recommend improvements.
  • Ensure adherence to governance standards.
  • Line-manage and mentor a small team of Data Engineers.

What We're Looking For:
  • Extensive Databricks experience, including Unity Catalog.
  • Strong skills in Python, Spark and SQL, and experience with SQL databases.
  • Terraform experience for cloud infrastructure as code.
  • Experience with Azure and workflow tools (Airflow, ADF).
  • Excellent problem-solving ability, communication skills, and attention to detail.
  • Experience across Waterfall and Agile methodologies.
  • Curious, inclusive, and committed to continuous learning.

To apply for this role, please submit your CV or contact Dillon Blackburn on (phone number removed) or at (url removed).

Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
Dec 11, 2025
Full time
Intuition IT Solutions Ltd
Head of Data & AI - (Banking & Financial Services)
Intuition IT Solutions Ltd
We are seeking a Data & AI Lead for our Platinum account. Our ideal candidate should be passionate about Data & AI, possess deep technical knowledge, and focus on delivering measurable business impact. This role offers leadership opportunities and client exposure in North America, the UK, and Europe.

You must possess:
  • 15-20 years' experience in a reputable Data & AI services firm, working in the Banking & Financial Services vertical.
  • Proven Revenue Generation Track Record: a consistent history of delivering and exceeding revenue targets on a YoY basis within the Banking & FS sector, including winning new logos, expanding existing accounts, and converting pipeline into closed business with measurable commercial impact.
  • Exceptional Client Engagement & Relationship Management: the ability to engage C-suite executives and senior decision-makers in Banks, Building Societies, and Financial Market Infrastructures with confidence and credibility.
  • Outstanding Communication & Compelling Storytelling: you are an articulate, persuasive communicator who can distil complex Data & AI concepts into compelling narratives that resonate with diverse audiences, from technical architects to CDOs.
  • Deep Technical Expertise Across Data & AI: comprehensive technical knowledge spanning the entire Data & AI landscape, including cloud platforms (Azure, Databricks, Snowflake, AWS & GCP).
  • Knowledge of AI/ML, Gen AI and Agentic AI capabilities: you understand not just the "what" but the "how" and "why" behind these technologies, enabling you to architect enterprise-scale solutions that address real-world Banking & FS challenges.
  • Problem-solving: translate business requirements into scalable, secure, and compliant technical solutions that align with enterprise standards and regulatory frameworks.
  • Matrix Organisation Leadership Across Geographies: ability to work with delivery, pre-sales, and sales teams throughout deal pursuits.

Good to have:
  • Bachelor's/Master's degree in IT, Computer Science, Engineering, Business, or Decision Sciences.
  • Deep Banking & Financial Services Domain Expertise: at least 10 years of progressive experience within the UK&I banking & FS sector, with demonstrable knowledge of retail banking, commercial banking, investment banking, wealth management, or insurance operations.
  • Regulatory & Compliance Acumen.
  • Active Participation in Banking & FS Industry Forums & Thought Leadership.
  • Practice Building & Team Leadership: experience building, mentoring, and scaling high-performing consulting teams.
  • Deep Understanding of the UK Banking Regulatory Landscape.
  • Willingness to travel 10-20% of the time.

NOTE: 4 days/week onsite.
Dec 11, 2025
Full time
Staffworx Limited
Data & AI Senior Consultants - Dynamic AI Consulting firm
Staffworx Limited
Data & AI Senior Consultants
Location: We are flexible - onsite, hybrid or fully remote, depending on what works for you and the client; UK or Netherlands based.

What you will actually be doing

This is not a role where you build clever models that never get used. Your focus is on creating measurable value for clients using data science, machine learning and GenAI, in a consulting and advisory context. You will own work from the very beginning, asking questions like "What value are we trying to create here?" and "Is this the right problem to solve?" through to "It is live, stakeholders are using it and we can see the impact in the numbers." You will work fairly independently, and you will also be someone that more junior team members look to for help and direction. A big part of the job is taking messy, ambiguous business and technical problems and turning them into clear, valuable solutions that make sense to the client.

You will do this in a client-facing role. That means you will be in the room for key conversations, providing honest advice, managing expectations and helping clients make good decisions about where and how to use AI.

What your day to day might look like

Getting to the heart of the problem:
  • Meeting with stakeholders who may not be clear on what they really need
  • Using discovery sessions, workshops and structured questioning to uncover the real business problem
  • Framing success in terms of value, for example higher revenue, lower cost, reduced risk, increased efficiency or better customer experience
  • Translating business goals into a clear roadmap of data and AI work that everyone can understand
  • Advising clients when AI is not the right solution and suggesting simpler or more cost-effective alternatives

Consulting and advisory work:
  • Acting as a trusted advisor to product owners, heads of department and executives
  • Helping clients prioritise use cases based on value, feasibility and risk
  • Communicating trade-offs in a simple way, for example accuracy versus speed, innovation versus compliance, cost versus impact
  • Preparing and delivering client presentations, proposals and updates that tell a clear story
  • Supporting pre-sales activities where needed, such as scoping work, estimating effort and defining outcomes
  • Managing client expectations, risks and dependencies so there are no surprises

Building things that actually work

Once the problem and value are clear, you will design and deliver production-ready ML and GenAI solutions. That includes:
  • Designing and building data pipelines, batch or streaming, that support the desired outcomes
  • Working with engineers and architects so your work fits cleanly into existing systems
  • Making sure what you build is reliable in production and moves the needle on agreed metrics, not just offline benchmarks
  • Explaining design decisions to both technical and non-technical stakeholders

GenAI work

You will work with GenAI in ways that are grounded in real use cases and business value:
  • Building RAG systems that improve search, content discovery or productivity rather than existing for their own sake
  • Implementing guardrails so models do not leak PII or generate harmful or off-brand content
  • Defining and tracking the right metrics so you and the client can see whether a GenAI solution is useful and cost-effective
  • Fine-tuning and optimising models so they perform well for the use case and budget
  • Designing agentic workflows where they genuinely improve outcomes rather than add complexity
  • Helping clients understand what GenAI can and cannot do in practice

Keeping it running

You will set up the foundations that protect value over time:
  • Experiment tracking and model versioning so you know what works and can roll back safely
  • CI/CD pipelines for ML so improvements reach users quickly and reliably
  • Monitoring and alerting for models and data so you can catch issues before they damage trust or results
  • Communicating operational risks and mitigations to non-technical stakeholders in plain language

Security, quality and compliance

You will help make sure:
  • Data is accurate, traceable and well managed so decisions are sound
  • Sensitive data is handled correctly, protecting users and the business
  • Regulatory and compliance requirements are met, avoiding costly mistakes
  • Clients understand the risk profile of AI solutions and the controls in place

Working with people

You will be a bridge between technical and non-technical teams, inside our organisation and on the client side. That means:
  • Explaining complex ML and GenAI ideas in plain language, always tied to business outcomes
  • Working closely with product managers, engineers and business stakeholders to prioritise work that matters
  • Facilitating workshops, playback sessions and show-and-tells that build buy-in and understanding
  • Coaching and supporting junior colleagues so the whole team can deliver more value
  • Representing the company professionally in client meetings and at industry events

What we are looking for

Experience:
  • Around 3 to 6 years of experience shipping ML or GenAI solutions into production
  • A track record of seeing projects through from discovery to delivery, with clear impact
  • Experience working directly with stakeholders or clients in a consulting, advisory or product-facing role

Education:
  • A Bachelor's or Master's degree in a quantitative field such as Computer Science, Data Science, Statistics, Mathematics or Engineering, or equivalent experience that shows you can deliver results

Technical skills

Core skills:
  • Strong Python and SQL, with clean, maintainable code
  • Solid understanding of ML fundamentals, for example feature engineering, model selection, handling imbalanced data, choosing and interpreting metrics
  • Experience with PyTorch or TensorFlow

GenAI specific:
  • Hands-on experience with LLM APIs or open-source models such as Llama or Mistral
  • Experience building RAG systems with vector databases such as FAISS, Pinecone or Weaviate
  • Ability to evaluate and improve prompts and retrieval quality using clear metrics
  • Understanding of safety practices such as PII redaction and content filtering
  • Exposure to agentic frameworks

Cloud and infrastructure:
  • Comfortable working in at least one major cloud provider: AWS, GCP or Azure
  • Familiar with Docker and CI/CD pipelines
  • Experience with managed ML platforms such as SageMaker, Vertex AI or Azure ML

Data engineering and MLOps:
  • Experience with data warehouses such as Snowflake, BigQuery or Redshift
  • Workflow orchestration using tools like Airflow or Dagster
  • Experience with MLOps tools such as MLflow, Weights and Biases or similar
  • Awareness of data and model drift, and how to monitor and respond to it before it erodes value

Soft skills, the things that really matter:
  • You are comfortable in client-facing settings and can build trust quickly
  • You can talk with anyone from a CEO to a new data analyst, and always bring the conversation back to business value
  • You can take a vague, messy business problem and turn it into a clear technical plan that links to outcomes and metrics
  • You are happy to push back and challenge assumptions respectfully when it is in the client's best interest
  • You like helping other people grow and are happy to mentor junior colleagues
  • You communicate clearly in writing and in person

Nice to have, not required

Do not rule yourself out if you do not have these. They are a bonus, not a checklist.
  • Experience with Delta Lake, Iceberg, Spark, Databricks or Palantir
  • Experience optimising LLM serving with tools such as vLLM, TGI or TensorRT-LLM
  • Search and ranking experience, for example Elasticsearch or rerankers
  • Background in time series forecasting, causal inference, recommender systems or optimisation
  • Experience managing cloud costs and IAM so value is not lost to waste
  • Ability to work in other languages where needed, for example Java, Scala, Go or bash
  • Experience with BI tools such as Looker or Tableau
  • Prior consulting experience or leading client projects end to end
  • Contributions to open source, conference talks or published papers that show your ability to share ideas and influence the wider community

Got a background that fits and you're up for a new challenge? Send over your latest CV, expectations and availability. Staffworx Limited is a UK-based recruitment consultancy partnering with leading global brands across digital, AI, software, and business consulting. Let's talk about what you could add to the mix.
Dec 11, 2025
Full time
Data & AI Senior Consultants Location - We are flexible: onsite, hybrid or fully remote, depending on what works for you and the client, UK or Netherlands based. What you will actually be doing This is not a role where you build clever models that never get used. Your focus is on creating measurable value for clients using data science, machine learning and GenAI, in a consulting and advisory context. You will own work from the very beginning, asking questions like "What value are we trying to create here?" and "Is this the right problem to solve?" through to "It is live, stakeholders are using it and we can see the impact in the numbers." You will work fairly independently and you will also be someone that more junior team members look to for help and direction. A big part of the job is taking messy, ambiguous business and technical problems and turning them into clear, valuable solutions that make sense to the client. You will do this in a client facing role. That means you will be in the room for key conversations, providing honest advice, managing expectations and helping clients make good decisions about where and how to use AI. What your day to day might look like Getting to the heart of the problem Meeting with stakeholders who may not be clear on what they really need Using discovery sessions, workshops and structured questioning to uncover the real business problem Framing success in terms of value. For example higher revenue, lower cost, reduced risk, increased efficiency or better customer experience Translating business goals into a clear roadmap of data and AI work that everyone can understand Advising clients when AI is not the right solution and suggesting simpler or more cost effective alternatives Consulting and advisory work Acting as a trusted advisor to product owners, heads of department and executives Helping clients prioritise use cases based on value, feasibility and risk Communicating trade offs in a simple way. 
For example accuracy versus speed, innovation versus compliance, cost versus impact Preparing and delivering client presentations, proposals and updates that tell a clear story Supporting pre sales activities where needed, such as scoping work, estimating effort and defining outcomes Managing client expectations, risks and dependencies so there are no surprises Building things that actually work Once the problem and value are clear, you will design and deliver production ready ML and GenAI solutions. That includes: Designing and building data pipelines, batch or streaming, that support the desired outcomes Working with engineers and architects so your work fits cleanly into existing systems Making sure what you build is reliable in production and moves the needle on agreed metrics, not just offline benchmarks Explaining design decisions to both technical and non technical stakeholders GenAI work You will work with GenAI in ways that are grounded in real use cases and business value: Building RAG systems that improve search, content discovery or productivity rather than existing for their own sake Implementing guardrails so models do not leak PII or generate harmful or off brand content Defining and tracking the right metrics so you and the client can see whether a GenAI solution is useful and cost effective Fine tuning and optimising models so they perform well for the use case and budget Designing agentic workflows where they genuinely improve outcomes rather than add complexity Helping clients understand what GenAI can and cannot do in practice Keeping it running You will set up the foundations that protect value over time: Experiment tracking and model versioning so you know what works and can roll back safely CI/CD pipelines for ML so improvements reach users quickly and reliably Monitoring and alerting for models and data so you can catch issues before they damage trust or results Communicating operational risks and mitigations to non technical stakeholders in 
plain language

Security, quality and compliance
You will help make sure:
Data is accurate, traceable and well managed so decisions are sound
Sensitive data is handled correctly, protecting users and the business
Regulatory and compliance requirements are met, avoiding costly mistakes
Clients understand the risk profile of AI solutions and the controls in place

Working with people
You will be a bridge between technical and non-technical teams, inside our organisation and on the client side. That means:
Explaining complex ML and GenAI ideas in plain language, always tied to business outcomes
Working closely with product managers, engineers and business stakeholders to prioritise work that matters
Facilitating workshops, playback sessions and show-and-tells that build buy-in and understanding
Coaching and supporting junior colleagues so the whole team can deliver more value
Representing the company professionally in client meetings and at industry events

What we are looking for

Experience
Around 3 to 6 years of experience shipping ML or GenAI solutions into production
A track record of seeing projects through from discovery to delivery, with clear impact
Experience working directly with stakeholders or clients in a consulting, advisory or product-facing role

Education
A Bachelor's or Master's degree in a quantitative field such as Computer Science, Data Science, Statistics, Mathematics or Engineering, or equivalent experience that shows you can deliver results

Technical skills

Core skills
Strong Python and SQL, with clean, maintainable code
Solid understanding of ML fundamentals, for example feature engineering, model selection, handling imbalanced data, choosing and interpreting metrics
Experience with PyTorch or TensorFlow

GenAI specific
Hands-on experience with LLM APIs or open-source models such as Llama or Mistral
Experience building RAG systems with vector databases such as FAISS, Pinecone or Weaviate
Ability to evaluate and improve prompts and retrieval quality using clear metrics
Understanding of safety practices such as PII redaction and content filtering
Exposure to agentic frameworks

Cloud and infrastructure
Comfortable working in at least one major cloud provider: AWS, GCP or Azure
Familiar with Docker and CI/CD pipelines
Experience with managed ML platforms such as SageMaker, Vertex AI or Azure ML

Data engineering and MLOps
Experience with data warehouses such as Snowflake, BigQuery or Redshift
Workflow orchestration using tools like Airflow or Dagster
Experience with MLOps tools such as MLflow, Weights & Biases or similar
Awareness of data and model drift, and how to monitor and respond to it before it erodes value

Soft skills, the things that really matter
You are comfortable in client-facing settings and can build trust quickly
You can talk with anyone from a CEO to a new data analyst, and always bring the conversation back to business value
You can take a vague, messy business problem and turn it into a clear technical plan that links to outcomes and metrics
You are happy to push back and challenge assumptions respectfully when it is in the client's best interest
You like helping other people grow and are happy to mentor junior colleagues
You communicate clearly in writing and in person

Nice to have, not required
Do not rule yourself out if you do not have these. They are a bonus, not a checklist.
Experience with Delta Lake, Iceberg, Spark, Databricks or Palantir
Experience optimising LLM serving with tools such as vLLM, TGI or TensorRT-LLM
Search and ranking experience, for example Elasticsearch or rerankers
Background in time-series forecasting, causal inference, recommender systems or optimisation
Experience managing cloud costs and IAM so value is not lost to waste
Ability to work in other languages where needed, for example Java, Scala, Go or bash
Experience with BI tools such as Looker or Tableau
Prior consulting experience or leading client projects end to end
Contributions to open source, conference talks or published papers that show your ability to share ideas and influence the wider community

Got a background that fits and you're up for a new challenge? Send over your latest CV, expectations and availability. Staffworx Limited is a UK-based recruitment consultancy partnering with leading global brands across digital, AI, software, and business consulting. Let's talk about what you could add to the mix.
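The RAG experience this posting asks for can be pictured with a minimal sketch. This is a toy illustration, not any employer's actual stack: the hand-rolled bag-of-words "embeddings" stand in for a real embedding model, and the in-memory index stands in for a vector database such as FAISS, Pinecone or Weaviate. All documents and names here are invented.

```python
from collections import Counter
from math import sqrt

# Toy corpus standing in for a document store.
DOCS = [
    "Databricks runs Spark jobs on a managed lakehouse platform",
    "FAISS is a library for efficient vector similarity search",
    "PII redaction removes names and emails before indexing",
]

def embed(text: str) -> Counter:
    # Stand-in for a real embedding model: bag-of-words token counts.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# "Index" the corpus up front; in a real system this is the vector database.
INDEX = [(doc, embed(doc)) for doc in DOCS]

def retrieve(query: str, k: int = 1) -> list:
    # Rank documents by similarity to the query and keep the top k.
    qv = embed(query)
    ranked = sorted(INDEX, key=lambda d: cosine(qv, d[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]

def build_prompt(query: str) -> str:
    # Retrieved context is spliced into the prompt sent to the LLM.
    context = "\n".join(retrieve(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

print(build_prompt("what does vector similarity search use"))
```

Evaluating "retrieval quality using clear metrics", as the posting puts it, typically means scoring `retrieve` against labelled query-document pairs (e.g. recall@k) rather than eyeballing outputs.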
Adecco
Data Scrum Lead
Data Scrum Lead
Duration: 6 Months (Possibility of extension)
Location: London/Hybrid (3 days per week on site)
Rate: A highly competitive Umbrella Day Rate is available for suitable candidates

Role Overview
This role will manage a scrum team that will ingest required reference data into the EMEA Data Platform for any future consumers within EMEA. The successful candidate will, amongst other tasks:
Facilitate efficient and effective software delivery from inception to go-live for the EMEA Data Platform; the initial focus will be running the Reference Data Scrum, ingesting reference data into the EMEA Data Platform
Work with the Product Owners, business stakeholders, data modellers, data engineers, testers and Data Office to deliver a well-planned pipeline of reference data sets
Use scrum techniques to deliver reference data requirements within a scrum of scrums structure and drive effective outcomes each sprint

Key Responsibilities:
Responsible for the effective and efficient running of Software Development Scrums consisting of Developers, Testers, Physical Data Modellers, Data Governance and other stakeholders
Ensure that everyone on the team understands the goals, scope, estimates and burndown charts of each iteration
Maintain the product backlog and agree prioritisation with senior stakeholders
Identify and remove obstacles so that team members can focus on their immediate tasks
Keep stakeholders up to date via direct communications, tailored reports and JIRA dashboards
Escalate any serious challenges to scope, timelines and budget to senior management

Key Skills & Requirements:
Broad and deep understanding of reference data utilised in a corporate investment bank
Extensive Core Agile methodology experience (Scrum, Scrum of Scrums), including:
Running Scrum ceremonies: Sprint Planning, Daily Stand-ups, Sprint Reviews, and Retrospectives; ensuring meetings are productive and time-boxed
Educating the team and stakeholders on Scrum principles and Agile practices
Helping the team improve continuously and adopt Agile mindsets
Supporting the team by removing obstacles and enabling progress
Experience with Atlassian JIRA
Understanding of Data Governance practices and the requirements they place on IT delivery
Ability to handle communications at all levels across the whole business
Excellent analytical and problem-solving skills
Ability to work collaboratively in a team environment, share ideas, and contribute to group discussions
Attention to detail and a commitment to producing high-quality work with a focus on accuracy and precision
Proven ability to work independently on large-scale projects while contributing effectively in a collaborative team environment
Exposure to concepts like Data Lakehouse and Medallion Architecture, especially with Databricks as the underlying technology

Candidates will need to show evidence of the above in their CV in order to be considered. If you feel you have the skills and experience and want to hear more about this role, 'apply now' to declare your interest in this opportunity with our client. Your application will be reviewed by our dedicated team. We will respond to all successful applicants as soon as possible; however, please be advised that we may also contact you should we need further applicants or if other opportunities arise relevant to your skillset.

Pontoon is an employment consultancy. We put expertise, energy, and enthusiasm into improving everyone's chance of being part of the workplace. We respect and appreciate people of all ethnicities, generations, religious beliefs, sexual orientations, gender identities, and more. We do this by showcasing their talents, skills, and unique experience in an inclusive environment that helps them thrive.

As part of our standard hiring process to manage risk, please note background screening checks will be conducted on all hires before commencing employment. We use generative AI tools to support our candidate screening process. This helps us ensure a fair, consistent, and efficient experience for all applicants. Rest assured, all final decisions are made by our hiring team, and your application will be reviewed with care and attention.
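The "Medallion Architecture" named in the requirements above is a layering convention: bronze holds raw data as ingested, silver holds cleaned and conformed records, and gold holds business-ready views. A library-free sketch of the idea follows, with plain Python lists standing in for Delta tables on Databricks; the reference-data records and field names are invented for illustration.

```python
# Toy medallion pipeline: each layer is a function over the previous one.
# In a real Databricks lakehouse these would be Delta tables, not lists.

bronze = [  # raw ingest, kept as-delivered (including bad rows)
    {"isin": "GB0002634946", "name": "BAE Systems", "price": "11.40"},
    {"isin": None, "name": "Unknown", "price": "n/a"},
    {"isin": "GB0002634946", "name": "BAE Systems", "price": "11.55"},
]

def to_silver(rows):
    """Clean and conform: drop unparseable rows, cast types."""
    out = []
    for r in rows:
        try:
            out.append({"isin": r["isin"], "name": r["name"],
                        "price": float(r["price"])})
        except (TypeError, ValueError):
            continue  # a real pipeline would quarantine, not silently drop
    return [r for r in out if r["isin"]]  # reference data needs a valid key

def to_gold(rows):
    """Business-ready view: latest-wins record per instrument."""
    latest = {}
    for r in rows:
        latest[r["isin"]] = r  # later rows overwrite earlier ones
    return sorted(latest.values(), key=lambda r: r["isin"])

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # one clean row per ISIN
```

The sprint-by-sprint work the role describes is essentially adding one reference data set at a time to such a flow, with the Data Governance requirements applied at the silver layer.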
Dec 11, 2025
Contractor
Tenth Revolution Group
Senior Databricks Engineer - Northampton (Hybrid) - Up to £80K
Tenth Revolution Group Northampton, Northamptonshire
Senior Databricks Engineer - Northampton (Hybrid) - Up to £80K + Benefits

Ready to lead, innovate, and make an impact? We're looking for a Senior Databricks Engineer to join a forward-thinking team and take ownership of cutting-edge data solutions. This is your chance to shape the future of data strategy in a business that has been empowering companies for decades, supporting thousands of SMEs worldwide. We value relationships, trust, and innovation, and offer a flexible, inclusive environment where you can grow, make an impact, and be part of something special.

About the Role
You'll play a key role in designing and delivering scalable data pipelines, collaborating on architecture, and mentoring a small team of Data Engineers. This is a hands-on technical leadership position where your expertise will drive innovation and performance across our data ecosystem.

What You'll Do
Build and optimise data pipelines using Databricks
Collaborate on data architecture and strategy
Deliver large-scale workflows for ingestion, transformation, and validation
Implement best practices for data quality and governance
Lead and coach a team of Data Engineers

What We're Looking For
Significant experience with Databricks (including Unity Catalog)
Strong skills in Python, Spark, SQL
Cloud expertise
Knowledge of pipeline tools (Airflow, ADF)
Leadership and problem-solving ability

Ready to take the next step? Apply now and join us as our Senior Databricks Engineer!
Dec 11, 2025
Full time
TRIA
Senior Data Engineer
Senior Data Engineer
London, 2-3 days on-site
65,000 - 72,000 + 20% Bonus + Excellent Benefits

Our client is a leading global hospitality brand undergoing an exciting period of rapid growth and transformation. With significant investment in data and technology, they are building a world-class data platform to power decision-making across every area of the business - from supply chain and logistics to marketing, customer sales and in-store operations.

We are seeking an experienced Senior Data Engineer with deep expertise in Databricks to design, build, and optimise the client's data platform. This role will be pivotal in developing scalable data pipelines, enabling advanced analytics, and driving data quality and governance across the organisation. You'll work closely with data scientists, analysts, and business stakeholders to transform raw data into trusted, actionable insights that power critical business decisions.

Required Qualifications
6+ years of experience in data engineering
3+ years of hands-on experience with Databricks
Strong working knowledge of Azure
Strong knowledge of data modelling, ETL/ELT design, and data lakehouse concepts

To apply for this role please email across your CV ASAP.
Dec 11, 2025
Full time
TXP
Head of Business Intelligence (Azure Data Lake, Fabric)
TXP City, Birmingham
Role: BI Manager
Rate: 500.00 Per Day - Inside IR35
Location: Central Birmingham, West Midlands (Hybrid Working - 2 days per week onsite)
Duration: Initial 3 - 6 Months with potential to go Permanent

We are currently working with a leading services provider who require a technically strong, Midlands-based Senior BI Manager with a good understanding of Azure Data and Data Engineering tools. Working as a key member of a newly formed Data Engineering team, the successful candidate will lead the design, development, and ongoing enhancement of the client's data and reporting infrastructure. You will be the strategic owner of the Azure Data Platform, overseeing services such as Azure Data Lake, Data Warehouse, Data Factory, Databricks, and Power BI. The technical focus is all Microsoft, primarily Azure, so any Fabric experience would be very beneficial.

Our client is looking for someone who is going to lead the function and has previous experience doing this: someone who really understands data and what it can be used for, who can challenge the business on what it needs from the data, and who can challenge the teams to produce the most effective data outputs for the business need, so that the function can improve and become first class. You will need to be able to drive the direction of how data works for the organisation and the overall Data/BI strategy, design solutions that fit, and demonstrate what value data can bring to the company if it is used effectively. A technical background is essential to be able to understand and bridge the gap between the Data Team and the business environment so that the two collaborate effectively and are challenged both ways: someone who can understand and appreciate both the technical side and the business strategy side.

Skills & experience required:
Experience leading a BI function
Expertise in Azure BI architecture and Cloud services
Hands-on experience with Microsoft Fabric, SQL warehousing, Data Lakes, Databricks
Track record in MI/BI product development using Agile and Waterfall methods
Experience managing cross-functional teams and sprint activities
Experience in leading a BI team and a business through the development and transition to a Data Lake / Factory / Warehouse
Technical BI development/architect background

If your profile demonstrates strong and recent experience in the above areas, please submit your application ASAP to Jackie Dean at TXP for consideration. TXP takes great pride in representing socially responsible clients who not only prioritise diversity and inclusion but also actively combat social inequality. Together, we have the power to make a profound impact on fostering a more equitable and inclusive society. By working with us, you become part of a movement dedicated to promoting a diverse and inclusive workforce.
Dec 10, 2025
Contractor
Boston Consulting Group
AI Software Engineer/Platform Architect - BCG X
Locations: Stockholm, Copenhagen V, Berlin, München, London

Who We Are
Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation - inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures - and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive.

We Are BCG X
We're a diverse team of more than 3,000 tech experts united by a drive to make a difference. Working across industries and disciplines, we combine our experience and expertise to tackle the biggest challenges faced by society today. We go beyond what was once thought possible, creating new and innovative solutions to the world's most complex problems. Leveraging BCG's global network and partnerships with leading organizations, BCG X provides a stable ecosystem for talent to build game-changing businesses, products, and services from the ground up, all while growing their career. Together, we strive to create solutions that will positively impact the lives of millions.

What You'll Do
Our BCG X teams own the full analytics value chain end to end: framing new business challenges, designing innovative algorithms, implementing and deploying scalable solutions, and enabling colleagues and clients to fully embrace AI. Our product offerings span from fully custom builds to industry-specific, leading-edge AI software solutions. As a (Senior) AI Software Engineer you'll be part of our rapidly growing engineering team and help to build the next generation of AI solutions. You'll have the chance to partner with clients in a variety of BCG regions and industries, and on key topics like climate change, enabling them to design, build, and deploy new and innovative solutions. Additional responsibilities will include developing and delivering thought leadership in scientific communities and papers as well as leading conferences on behalf of BCG X. We are looking for talented individuals with a passion for software development, large-scale data analytics and transforming organizations into AI-led innovative companies.

Successful candidates possess the following:
4+ years of experience in a technology consulting environment
Apply software development practices and standards to develop robust and maintainable software
Actively involved in every part of the software development life cycle
Experienced at guiding non-technical teams and consultants in best practices for robust software development
Optimize and enhance the computational efficiency of algorithms and software design
Motivated by a fast-paced, service-oriented environment and interacting directly with clients on new features for future product releases
Enjoy collaborating in teams to share software design and solution ideas
A natural problem-solver, intellectually curious across a breadth of industries and topics
Master's degree or PhD in a relevant field of study - please provide all academic certificates showing the final grades (A-level, Bachelor, Master, PhD)

Additional tasks:
Designing and building data & AI platforms for our clients. Such platforms provide data and (Gen)AI capabilities to a wide variety of consumers and use cases across the client organization, and are often part of large (AI) transformational journeys BCG does for its clients.
This work often involves the following engineering disciplines: Cloud Engineering; Data Engineering (not building pipelines but designing and building the framework); DevOps; MLOps/LLMOps
It often uses the following technologies: Azure, AWS, GCP; Airflow, dbt, Databricks, Snowflake, etc.; GitHub, Azure DevOps and related developer tooling and CI/CD platforms; Terraform or other Infra-as-Code; MLflow, AzureML or similar for MLOps; LangSmith, Langfuse and similar for LLMOps
The difference from our "AI Engineer" role: do you "use/consume" these technologies, or are you the one who "provides" them to the rest of the organization?

What You'll Bring
TECHNOLOGIES: Programming Languages: Python. Experience with additional programming languages is a plus.

Additional info
BCG offers a comprehensive benefits program, including medical, dental and vision coverage, telemedicine services, life, accident and disability insurance, parental leave and family planning benefits, caregiving resources, mental health offerings, a generous retirement program, financial guidance, paid time off, and more. Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity / expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer. Click here for more information on E-Verify.
Dec 09, 2025
Full time
Tenth Revolution Group
Senior Data Engineer
Tenth Revolution Group Havant, Hampshire
Senior Data Engineer Salary: Up to 70,000 I am working with a forward-thinking organisation that is modernising its data platform to support scalable analytics and business intelligence across the Group. With a strong focus on Microsoft technologies and cloud-first architecture, they are looking to bring on a Data Engineer to help design and deliver impactful data solutions using Azure. This is a hands-on role where you will work across the full data stack, collaborating with architects, analysts, and stakeholders to build a future-ready platform that drives insight and decision-making. In this role, you will be responsible for: Building and managing data pipelines using Azure Data Factory and related services. Building and maintaining data lakes, data warehouses, and ETL/ELT processes. Designing scalable data solutions and models for reporting in Power BI. Supporting data migration from legacy systems into the new platform. Ensuring data models are optimised for performance and reusability. To be successful in this role, you will have: Hands-on experience creating data pipelines using Azure services such as Synapse and Data Factory. Reporting experience with Power BI. Strong understanding of SQL, Python, or PySpark. Knowledge of the Azure data platform including Azure Data Lake Storage, Azure SQL Data Warehouse, or Azure Databricks. Some of the package/role details include: Salary up to 70,000 Hybrid working model twice per week in Portsmouth Pension scheme and private healthcare options Opportunities for training and development This is just a brief overview of the role. For the full details, simply apply with your CV and I'll be in touch to discuss it further.
Dec 09, 2025
Full time
Head Resourcing
Senior Data Engineer/ Scientist
Head Resourcing
Senior Data Engineer - Azure & Databricks Lakehouse Glasgow (3/4 days onsite) Exclusive Role with a Leading UK Consumer Business A rapidly scaling UK consumer brand is undertaking a major data modernisation programme, moving away from legacy systems, manual Excel reporting and fragmented data sources into a fully automated Azure Enterprise Landing Zone + Databricks Lakehouse. They are building a modern data platform from the ground up using Lakeflow Declarative Pipelines, Unity Catalog, and Azure Data Factory, and this role sits right at the heart of that transformation. This is a rare opportunity to join early, influence architecture, and help define engineering standards, pipelines, curated layers and best practices that will support Operations, Finance, Sales, Logistics and Customer Care. If you want to build a best-in-class Lakehouse from scratch, this is the one. What You'll Be Doing Lakehouse Engineering (Azure + Databricks) Engineer scalable ELT pipelines using Lakeflow Declarative Pipelines, PySpark, and Spark SQL across a full Medallion Architecture (Bronze → Silver → Gold). Implement ingestion patterns for files, APIs, SaaS platforms (e.g. subscription billing), SQL sources, SharePoint and SFTP using ADF + metadata-driven frameworks. Apply Lakeflow expectations for data quality, schema validation and operational reliability. Curated Data Layers & Modelling Build clean, conformed Silver/Gold models aligned to enterprise business domains (customers, subscriptions, deliveries, finance, credit, logistics, operations). Deliver star schemas, harmonisation logic, SCDs and business marts to power high-performance Power BI datasets. Apply governance, lineage and fine-grained permissions via Unity Catalog. Orchestration & Observability Design and optimise orchestration using Lakeflow Workflows and Azure Data Factory. Implement monitoring, alerting, SLAs/SLIs, runbooks and cost-optimisation across the platform.
DevOps & Platform Engineering Build CI/CD pipelines in Azure DevOps for notebooks, Lakeflow pipelines, SQL models and ADF artefacts. Ensure secure, enterprise-grade platform operation across Dev → Prod, using private endpoints, managed identities and Key Vault. Contribute to platform standards, design patterns, code reviews and future roadmap. Collaboration & Delivery Work closely with BI/Analytics teams to deliver curated datasets powering dashboards across the organisation. Influence architecture decisions and uplift engineering maturity within a growing data function. Tech Stack You'll Work With Databricks: Lakeflow Declarative Pipelines, Workflows, Unity Catalog, SQL Warehouses Azure: ADLS Gen2, Data Factory, Key Vault, vNets & Private Endpoints Languages: PySpark, Spark SQL, Python, Git DevOps: Azure DevOps Repos, Pipelines, CI/CD Analytics: Power BI, Fabric What We're Looking For Experience 5-8+ years of Data Engineering with 2-3+ years delivering production workloads on Azure + Databricks. Strong PySpark/Spark SQL and distributed data processing expertise. Proven Medallion/Lakehouse delivery experience using Delta Lake. Solid dimensional modelling (Kimball) including surrogate keys, SCD types 1/2, and merge strategies. Operational experience: SLAs, observability, idempotent pipelines, reprocessing, backfills. Mindset Strong grounding in secure Azure Landing Zone patterns. Comfort with Git, CI/CD, automated deployments and modern engineering standards. Clear communicator who can translate technical decisions into business outcomes. Nice to Have Databricks Certified Data Engineer Associate Streaming ingestion experience (Auto Loader, structured streaming, watermarking) Subscription/entitlement modelling experience Advanced Unity Catalog security (RLS, ABAC, PII governance) Terraform/Bicep for IaC Fabric Semantic Model / Direct Lake optimisation
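The dimensional-modelling skills listed above (SCD types 1/2 and merge strategies) can be illustrated with a toy Slowly Changing Dimension Type 2 merge. This is a plain-Python sketch for illustration only, not Databricks code: on the platform itself this logic would typically be a Delta Lake MERGE, and the function name and the is_current/start_date/end_date columns here are invented.

```python
# Toy SCD Type 2 merge: expire changed current rows and append
# new current versions, preserving history. All names are illustrative.

def scd2_merge(dim_rows, incoming, key, tracked, today):
    """Close out changed current rows and append new current versions."""
    existing_keys = {r[key] for r in dim_rows}
    incoming_by_key = {r[key]: r for r in incoming}
    out = []
    for row in dim_rows:
        new = incoming_by_key.get(row[key])
        if row["is_current"] and new and any(row[c] != new[c] for c in tracked):
            # Expire the old version, then emit the new current version.
            out.append(dict(row, is_current=False, end_date=today))
            out.append(dict(new, is_current=True, start_date=today, end_date=None))
        else:
            out.append(row)  # history rows and unchanged rows pass through
    for r in incoming:  # brand-new keys arrive as fresh current rows
        if r[key] not in existing_keys:
            out.append(dict(r, is_current=True, start_date=today, end_date=None))
    return out

dim = [{"id": 1, "plan": "basic", "is_current": True,
        "start_date": "2024-01-01", "end_date": None}]
updates = [{"id": 1, "plan": "premium"}, {"id": 2, "plan": "basic"}]
merged = scd2_merge(dim, updates, key="id", tracked=["plan"], today="2025-01-01")
# merged holds 3 rows: the expired id=1 row plus two current rows
```

The same compare-expire-append shape underpins a production MERGE INTO statement; the toy just makes the row lifecycle visible.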
Dec 07, 2025
Full time
TXP
BI Manager (Azure Data Lake, Fabric, DW)
TXP City, Birmingham
Role: BI Manager Salary: 70,000 - 80,000 PA Plus Bonus and Benefits Location: Central Birmingham, West Midlands (Hybrid Working - 2 days per week onsite) We are currently working with a leading Midlands-based services provider who require a technically strong Senior BI Manager with a good understanding of Azure Data and Data Engineering tools. Working as a key member of a newly formed Data Engineering team, the successful candidate will lead the design, development, and ongoing enhancement of the client's data and reporting infrastructure. You will be the strategic owner of the Azure Data Platform, overseeing services such as Azure Data Lake, Data Warehouse, Data Factory, Databricks, and Power BI. The technical focus is all Microsoft, primarily Azure, so any Fabric experience would be very beneficial. Our client is looking for someone who is going to lead the function and has previous experience doing so. Someone who really understands data and what it can be used for, and who can challenge the business on what they need from the data and challenge the teams to produce the most effective data outputs for the business need, so that the function can improve and become first class. You will need to be able to drive the direction of how data works for the organisation and the overall Data/BI strategy, design solutions that fit, and demonstrate what value data can bring to the company if it is used effectively. A technical background is essential to be able to understand and bridge the gap between the Data Team and the Business environment so that the two collaborate effectively and are challenged both ways. Someone who can understand and appreciate both the technical side and the business strategy side. Our client offers a good, supportive environment which is going through a major transformation driven by technology.
Skills & experience required: Experience leading a BI function Expertise in Azure BI architecture and Cloud services Hands-on experience with Azure Fabric, SQL warehousing, Data Lakes, Databricks Track record in MI/BI product development using Agile and Waterfall methods Experience managing cross-functional teams and sprint activities Experience in leading a BI team and a business through the development and transition to a Data Lake / Factory / Warehouse Technical BI development/architect background Benefits: Achievable bonus scheme 4% Pension Life Insurance 3 x salary 25 days annual leave plus statutory - 1 x extra day every year for the first 3 years Blue Light Card Medicash - includes discounted gym memberships etc. If your profile demonstrates strong and recent experience in the above areas, please submit your application ASAP to Jackie Dean at TXP for consideration. TXP takes great pride in representing socially responsible clients who not only prioritise diversity and inclusion but also actively combat social inequality. Together, we have the power to make a profound impact on fostering a more equitable and inclusive society. By working with us, you become part of a movement dedicated to promoting a diverse and inclusive workforce.
Dec 05, 2025
Full time
Tenth Revolution Group
Senior Databricks Engineer - Oxfordshire Hybrid - £Competitive
Tenth Revolution Group Hook Norton, Oxfordshire
Databricks Engineer Location: Oxfordshire (Hybrid) Salary: Competitive + Benefits Are you an experienced Databricks Engineer looking for your next challenge? The Role This is a hands-on technical role with leadership responsibilities. You'll design and deliver scalable data solutions, work closely with data leaders on architecture and strategy, and mentor a small team of Data Engineers to ensure best practices. Key Responsibilities Build and maintain scalable data pipelines and ETL processes using Databricks Collaborate on data architecture and translate designs into build plans Deliver large-scale data workflows and optimise for performance Implement data quality and validation processes What We're Looking For Strong experience with Databricks Proficiency in Python, Spark, and SQL Experience with cloud platforms Knowledge of pipeline tools Excellent problem-solving and leadership skills If you're passionate about data engineering and want to make an impact, apply today!
Dec 05, 2025
Full time
Senior Azure Data Engineer
Youngs Employment Services
Senior Azure Data Engineer Hybrid - Work From Home and West London Circa £70,000 - £80,000 + Range of benefits A well-known and prestigious business is looking to add a Senior Azure Data Engineer to their data team. This is an exciting opportunity for a Data Engineer who's not just technical, but also enjoys directly engaging and collaborating with stakeholders from across business functions such as finance, operations, planning, manufacturing, retail, e-commerce etc. Having nearly completed the process of migrating data from their existing on-prem databases to an Azure Cloud based platform, the Senior Data Engineer will play a key role in helping make best use of the data by gathering and agreeing requirements with the business to build data solutions that align accordingly. Working with diverse data sets from multiple systems and overseeing their integration and optimisation will require the development, management and optimisation of data pipelines using tools in the Azure Cloud. Our client has expanded rapidly and been transformed in recent years; they're an iconic business with a special work environment that has fostered a strong and positive culture amongst the whole workforce. This is a hybrid role where the postholder can work from home 2 or 3 days per week; the other days will be based onsite in West London, just a few minutes' walk from a Central Line tube station. The key responsibilities for the post include: Develop, construct, test and maintain data architectures within large scale data processing systems. Develop and manage data pipelines using Azure Data Factory, Delta Lake and Spark. Utilise Azure Cloud architecture knowledge to design and implement scalable data solutions. Utilise Spark, SQL, Python, R, and other data frameworks to manipulate data and gain a thorough understanding of the dataset's characteristics. Interact with API systems to query and retrieve data for analysis.
Collaborate with business users / stakeholders to gather and agree requirements. To be considered for the post you'll need at least 5 years' experience, ideally with 1 or 2 years at a senior / lead level. You'll need to be goal driven and able to take ownership of work tasks without the need for constant supervision. You'll be engaging with multiple business areas, so the ability to communicate effectively to understand requirements and build trusted relationships is a must. It's likely you'll have most, if not all, of the following: Experience as a Senior Data Engineer or similar Strong knowledge of Azure Cloud architecture and Azure Databricks, DevOps and CI/CD. Experience with PySpark, Python, SQL and other data engineering development tools. Experience with metadata-driven pipelines and SQL serverless data warehouses. Knowledge of querying API systems. Experience building and optimising ETL pipelines using Databricks. Strong problem-solving skills and attention to detail. Understanding of data governance and data quality principles. A degree in computer science, engineering, or equivalent experience. Salary will be dependent on experience and is likely to be in the region of £70,000 - £80,000, although the client may consider higher for an outstanding candidate. Our client can also provide a vibrant, rewarding, and diverse work environment that supports career development. Candidates must be authorised to work in the UK and must not require sponsorship either now or in the future. For further information, please send your CV to Wayne Young at Young's Employment Services Ltd. Young's Employment Services acts in the capacity of both an Employment Agent and Employment Business.
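The "metadata-driven pipelines" requirement above refers to a common pattern: pipeline definitions live in a metadata/config table, and one generic engine dispatches each entry to the right loader, so adding a source means adding a row of metadata rather than writing a new pipeline. A minimal plain-Python sketch, with all names, keys, and loader functions invented for illustration (in Azure this metadata would usually drive Data Factory or Databricks jobs):

```python
# Minimal metadata-driven ingestion loop. The loaders are toy stand-ins
# for real connectors; the engine itself never changes when sources grow.

def run_ingestion(metadata, loaders):
    results = {}
    for entry in metadata:
        loader = loaders.get(entry["source_type"])
        if loader is None:
            raise ValueError(f"no loader registered for {entry['source_type']!r}")
        results[entry["target_table"]] = loader(entry)
    return results

loaders = {
    "file": lambda e: f"copied {e['location']}",
    "api":  lambda e: f"fetched {e['location']}",
}

metadata = [
    {"source_type": "file", "location": "/landing/sales.csv",
     "target_table": "bronze.sales"},
    {"source_type": "api", "location": "/v1/customers",
     "target_table": "bronze.customers"},
]

runs = run_ingestion(metadata, loaders)
```

Onboarding a new SFTP or SaaS source in this pattern is a metadata change plus, at most, one new loader registration.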
Dec 05, 2025
Full time
Tenth Revolution Group
Senior Data Engineering Consultant - £60,000 - Hybrid
Tenth Revolution Group City, London
Senior Data Engineering Consultant - 60,000 - Hybrid Key Responsibilities Lead, mentor, and develop a team of Technical Consultants. Manage resource planning, scheduling, and overall delivery workflows. Collaborate with Pre-sales, Commercial, and Project Management teams to scope and deliver projects. Contribute to technical delivery, designing scalable data solutions in Azure/Microsoft environments. Support cloud migrations, data lake builds, and ETL/ELT pipeline development. Ensure delivery follows best practices and internal standards. Skills & Experience Strong leadership and relationship-building skills. Experience guiding or managing technical teams. Deep hands-on experience in Data Engineering using Microsoft Fabric, Azure Databricks, Synapse, Data Factory, and/or SQL Server. Expertise in SQL and Python for ETL/ELT development. Knowledge of data lakes, medallion lakehouse architecture, and large-scale dataset management. Solid understanding of BI, data warehousing, and database optimisation. To apply for this role please submit your CV or contact Dillon Blackburn on (phone number removed) or at (url removed). Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
Dec 05, 2025
Full time
Tenth Revolution Group
Senior Data Engineer
Tenth Revolution Group Portsmouth, Hampshire
Senior Data Engineer Salary: Up to 70,000 I am working with a forward-thinking organisation that is modernising its data platform to support scalable analytics and business intelligence across the Group. With a strong focus on Microsoft technologies and cloud-first architecture, they are looking to bring on a Data Engineer to help design and deliver impactful data solutions using Azure. This is a hands-on role where you will work across the full data stack, collaborating with architects, analysts, and stakeholders to build a future-ready platform that drives insight and decision-making. In this role, you will be responsible for: Building and managing data pipelines using Azure Data Factory and related services. Building and maintaining data lakes, data warehouses, and ETL/ELT processes. Designing scalable data solutions and models for reporting in Power BI. Supporting data migration from legacy systems into the new platform. Ensuring data models are optimised for performance and reusability. To be successful in this role, you will have: Hands-on experience creating data pipelines using Azure services such as Synapse and Data Factory. Reporting experience with Power BI. Strong understanding of SQL, Python, or PySpark. Knowledge of the Azure data platform including Azure Data Lake Storage, Azure SQL Data Warehouse, or Azure Databricks. Some of the package/role details include: Salary up to 70,000 Hybrid working model twice per week in Portsmouth Pension scheme and private healthcare options Opportunities for training and development This is just a brief overview of the role. For the full details, simply apply with your CV and I'll be in touch to discuss it further.
Dec 05, 2025
Full time
Guidant Global
IT Data and Analytics Senior Development Operations Engineer
Guidant Global Reading, Oxfordshire
Base Location: Reading / Havant / Perth
Salary: £600 per day
Working Pattern: 40 hours per week / Full time
Embark on a transformative career journey with SSE, an energy company where innovation meets impact in the heart of the IT sector. As a pivotal player in our forward-thinking team, you'll harness cutting-edge technology to drive change and propel the UK towards its ambitious net-zero targets. Your expertise will not only shape the future of energy but also help build a sustainable world for generations to come. Join us and be at the forefront of the green revolution, where every line of code contributes to a cleaner, brighter future.
Key Responsibilities:
  • Provide technical leadership and oversight to the group Data & Analytics platform team.
  • Ensure the reliability, security and scalability of analytics platform services.
  • Deliver full automation of the deployment of Data & Analytics platform services via infrastructure as code.
  • Help to set development standards, configure operational support processes and provide technical assurance.
  • Provide support to Data & Analytics platform users and internal development teams interacting with the Data & Analytics platform services.
What do you need?
  • Extensive experience of deploying Azure (and ideally AWS) cloud resources, and full familiarity with agile and DevOps development methodology.
  • Extensive experience using Terraform to deploy cloud resources as infrastructure as code.
  • An excellent understanding of CI/CD principles and experience with related tools (e.g. Azure DevOps, GitHub Actions).
  • Strong knowledge of scripting languages such as PowerShell, Python and Azure CLI, and proven experience with automation runbooks, VM maintenance scripts and SQL.
  • A strong understanding of cloud access control and governance, such as RBAC and IAM.
  • Strong knowledge of cloud networking (Azure), such as private endpoints, firewalls, NSGs, NAT gateways and route tables.
  • Good knowledge of Microsoft Entra ID, including managing app registrations, enterprise apps, AD groups, managed identities and Privileged Identity Management.
  • Proven experience with IaaS, such as virtual machines (both Windows and Linux), and familiarity with server patching and maintenance.
  • A strong understanding of security best practices within Azure and ideally AWS.
  • Experience configuring cloud data services (preferably Databricks) in Azure and ideally AWS.
  • Excellent communication and collaboration skills, with the ability to work across multiple technical and non-technical teams.
What happens now?
After submitting your application for the Data and Analytics Senior Development Operations Engineer role, we understand you're eager to hear back. We value your time and interest, and if your application is successful, you will be contacted directly by the team within 2 working days. We appreciate your patience and look forward to the possibility of welcoming you aboard.
Oct 08, 2025
Contractor
ARC IT Recruitment
Senior Data Engineer, Insurance
ARC IT Recruitment City, London
Senior Data Engineer, Insurance
London/Hybrid
To £85K + Bonus and Benefits
SQL, ETL, Azure
A Senior Data Engineer is required to join a forward-thinking data team within a thriving City-based insurance group. In this role you will play a critical part in delivering reliable, scalable and business-focused data solutions. With a strong focus on Microsoft technologies and cloud-based tools, you'll work directly with key business stakeholders, MI teams and technical teams to drive performance and decision-making through data. The ideal candidate will have a strong background in insurance MI or reporting; experience within an MGA or insurance carrier is essential.
Key Responsibilities:
  • Deliver data solutions and changes that support evolving business requirements.
  • Build and maintain robust, scalable data pipelines using SQL and ETL best practices.
  • Collaborate with stakeholders to analyse, define and implement solutions to complex data challenges.
  • Proactively assess the impact of changes on the broader data model and ensure integrity is maintained.
  • Work alongside the MI/reporting team to ensure data is accurately reflected in dashboards and reporting tools.
  • Consult with business analysts, system owners and architects to align technical delivery with strategic objectives.
  • Build deep knowledge of internal systems and promote collaboration across teams.
Key Skills & Experience:
  • Significant experience with SQL and ETL development.
  • Strong experience with MS SQL Server, T-SQL, Azure Data Factory, Azure Databricks, Python, and Data Lake.
  • A strong background in insurance MI or reporting; experience within an MGA or insurance carrier is essential.
  • A sharp analytical mind with the ability to work quickly, efficiently and methodically.
  • Strong communication skills with excellent stakeholder management and influencing skills.
  • A solid understanding of Insurance Operations, Credit Control, and Finance functions.
  • A team player who thrives in an agile, fast-moving, and highly collaborative environment.
For a full consultation on this pivotal role, send your CV to ARC IT Recruitment today.
Oct 08, 2025
Full time
Tenth Revolution Group
Senior AWS Data Engineer - London - £125,000
Tenth Revolution Group City, London
Senior AWS Data Engineer - London - £125,000
Please note: this role will require you to work from the London-based office. You must have the unrestricted right to work in the UK to be eligible for this role. This organisation is not able to offer sponsorship.
An exciting opportunity to join a greenfield initiative focused on transforming how market data is accessed and utilised. As a Senior AWS Data Engineer, you'll play a key role in designing and building a cutting-edge data platform using technologies like Databricks, Snowflake, and AWS Glue.
Key Responsibilities:
  • Build and maintain scalable data pipelines, warehouses, and lakes.
  • Design secure, high-performance data architectures.
  • Develop processing and analysis algorithms for complex data sets.
  • Collaborate with data scientists to deploy machine learning models.
  • Contribute to strategy, planning, and continuous improvement.
Required Experience:
  • Hands-on experience with AWS data tools: Glue, PySpark, Athena, Iceberg, Lake Formation.
  • Strong Python and SQL skills for data processing and analysis.
  • A deep understanding of data governance, quality, and security.
  • Knowledge of market data and its business applications.
Desirable Experience:
  • Experience with Databricks and Snowflake.
  • Familiarity with machine learning and data science concepts.
  • Strategic thinking and the ability to influence cross-functional teams.
This role offers the chance to work across multiple business areas, solve complex data challenges, and contribute to long-term strategic goals. You'll be empowered to lead, collaborate, and innovate in a dynamic environment.
To apply for this role, please submit your CV or contact David Airey on (phone number removed) or at (url removed).
Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
Oct 06, 2025
Full time
