Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

41 jobs found

Current Search
databricks architect
Data Science & Measurement Lead
Primark Stores Limited Reading, Berkshire
Data Science & Measurement Lead
Because your new ideas are our new ways of working. Evolve, your way. We are seeking a Data Science & Measurement Lead to manage and grow a team of data scientists responsible for building advanced analytics, predictive models, and measurement solutions across Primark. This is a hands-on role requiring strong technical depth in Databricks, Apache Spark, and SQL.

What You'll Get
People are at the heart of what we do here, so it's essential we provide you with the right environment to perform at your very best. Let's talk lifestyle:
  • Healthcare, pension, and potential bonus.
  • 27 days of leave, plus bank holidays, and if you want, you can buy 5 more.
  • Because Primark is all about tailoring to you, we offer Tax Saver Tickets, a fitness centre, and a subsidised cafeteria.
  • This role is a hybrid opportunity, offering 1-2 days working from home.

What You'll Do as a Data Science & Measurement Lead
We want you to feel challenged and inspired. Here, you'll develop your skills across a range of responsibilities:
  • Lead a data science team to deliver machine learning models, experimentation frameworks, and measurement solutions that drive measurable business impact.
  • Design, build, and deploy end-to-end ML pipelines and workflows using Databricks, Spark, Python, SQL, and PySpark.
  • Ensure robust operationalisation of models through scalable, reliable data pipelines and production-ready ML systems.
  • Partner closely with engineering teams to optimise distributed compute workloads and uphold data quality, monitoring, and governance standards.
  • Establish and drive best practices in model reproducibility, experiment tracking, and end-to-end ML lifecycle management.
  • Act as a trusted advisor by sharing deep technical expertise, developing team capability, and managing complex delivery plans.
  • Leverage strong retail domain experience, ideally within apparel or grocery, to translate business needs into effective data-driven solutions.

What You'll Bring
Here at Primark, we want everyone to feel valued, so please bring your authentic self to work, of course with some other key experience and abilities for this role in particular:
  • Extensive hands-on experience with Databricks, Apache Spark, advanced SQL, and cloud-based lakehouse architectures (Azure, AWS, or GCP), with a strong foundation in statistical modelling and machine learning techniques.
  • Proven ability to deliver measurable commercial value through retail-focused data science use cases such as demand forecasting, pricing and promotion effectiveness, allocation, stock optimisation, and waste or shrink reduction.
  • Strong experience in experimental design and causal inference (e.g., A/B testing, quasi-experiments), with a clear focus on quantifying incremental value and ensuring insights translate into action.
  • Demonstrated experience taking models from prototype to production, establishing clear success metrics, monitoring, and governance, and driving adoption across commercial and operational teams.
  • Ability to shape and prioritise the data science roadmap by balancing business value, data readiness, and delivery risk; applies sound commercial judgement informed by market and industry trends.
  • Proven people leader with experience mentoring and developing high-performing data science teams; communicates complex technical concepts clearly to non-technical stakeholders and acts as a trusted advisor to the business.

Does this sound like you? Great, because we can't wait to see what you'll bring. You'll be supported within a team of equally capable people, celebrating who you are and helping you reach your potential. At Primark, we're excited about our future, and we're excited to develop yours.

About Primark
At Primark, people matter. They're the beating heart of our business and the reason we've grown from our first store in Dublin in 1969 to a £9bn+ turnover business with over 80,000 colleagues and over 440 stores in 17 countries today. Our values run through everything we do. In essence, we're Caring and always strive to put people first. We're also Dynamic, bravely pushing the boundaries to stay ahead. And finally, we succeed Together.

If you need any reasonable adjustments or have an accessibility request during your recruitment journey, such as extended time or breaks between online assessments, a sign language interpreter, mobility access, or assistive technology, please contact your talent acquisition specialist. All offers of employment are subject to background checks, including right to work, references, education and, for some roles, criminal and financial checks. If you have any concerns, please reach out to our talent acquisition team to discuss. Our fashion isn't one size fits all, and neither is our culture. Primark promotes equal employment opportunity; we strive to create an inclusive workplace where people can be themselves, access opportunities and thrive together.

REQ ID: JR-7582
Apr 26, 2026
Full time
Broking Insights - Senior Data Analyst
Stryker Corporation
Broking Insights - Senior Data Analyst
Would you like to be part of a growing analytics team looking to drive forward advanced data-driven decision making at the heart of the Lloyd's insurance market? Do you have a passion for analytics, visualisation and generating real business impact by communicating valuable insights? We are seeking a highly skilled Senior Data Analyst to join our forward-thinking broking analytics team. This role combines commercial acumen, deep insurance expertise, and analytical capability to deliver actionable insights and optimise decision making across the organisation. The successful candidate will deliver data solutions, standardise reporting practices, scale our Power BI capabilities and enhance analytical capability across departments. Based from our flagship London office, this role comes with hybrid working.

What the day will look like
  • Power BI development: build scalable data models, optimise model performance, and standardise reporting frameworks.
  • Deliver engineering pipelines in Databricks to create structured, scalable assets that meet evolving reporting requirements.
  • Assess AI-for-BI opportunities, leveraging LLMs.
  • Develop reporting frameworks from critical source systems to enable deeper insights into operational and broking performance.
  • Collaborate with brokers, operations, and carriers to assess analytics requirements and translate business needs into actionable insights.
  • Insurance market expertise: leverage Lloyd's market knowledge to inform analytics solutions and reporting. Deep knowledge of data modelling and architecture is essential.
  • Data governance: embed standards and practices to improve data quality.
  • Liaise with analytics capabilities across departments and coordinate to deliver business-centric solutions.
  • Engage in strategic data transformation projects to provide technical guidance.
  • Data engineering & automation: develop ETL processes using SQL and Python; implement Power Automate and Power Apps for workflow automation.
  • AI & emerging technologies: understand how to leverage emerging capabilities.

Skills and experience that will lead to success
Mandatory:
  • SQL, Power BI, Python scripting - advanced modelling, optimisation, and governance.
  • Expertise in standardising reporting practices and scalable BI solutions.
  • Excellent communication and data storytelling.
  • Passion for AI and experience leveraging it in a commercial context.
  • Strong stakeholder management and influencing skills.
  • High commercial awareness and strategic thinking.
  • Collaborative and adaptable in a fast-paced environment.
  • Proven experience in insurance analytics within the Lloyd's or London Market.
  • Experience with Microsoft Copilot Studio or similar agentic AI tooling.
  • Familiarity with regulatory compliance and data governance practices.
  • Understanding of underwriting/broking principles and performance analytics (loss ratios).
Apr 26, 2026
Full time
Stott and May
Senior Salesforce Developer
Stott and May
Senior Salesforce Developer - Founding Hire at a Fintech Disruptor
Location: Central London (high-frequency office presence)
Salary: Up to £80k + bonus + equity/share options

The Mission
We are representing a rapidly scaling fintech disruptor in the insurance brokerage space, led by a founding team with a proven track record of multi-million-pound exits. As the first dedicated technology hire, you will partner directly with an Engineering Director (a seasoned CTO) who has scaled and exited Salesforce teams before. You aren't just maintaining a system; you are the architectural owner of a platform facilitating substantial global revenue, positioned for a major liquidity event.

The Role
  • Scale the foundation: move a "founder-built" architecture toward a scalable, enterprise-grade platform.
  • Hybrid development: manage a 90% custom Sales Cloud environment (Apex/LWC) while championing a "low-to-no code" (Flow-first) philosophy.
  • Strategic advisory: challenge the status quo, advising stakeholders on integrations and design elements that save capital and time.
  • Data & AI: lead the data structuring strategy to prepare the business for advanced AI utility and prompt engineering.

Technical Profile
  • The expert: deep Sales Cloud proficiency and a "Flow champion" mindset.
  • The developer: the ability to work within Apex and LWC frameworks is essential, even if your goal is to minimise code for future agility.
  • The strategist: strong SOQL skills; SQL (Databricks) is a major advantage.
  • The visionary: an interest in AI utility and how it drives business efficiency.

The Environment
This is a sales-driven, high-octane culture. Collaborative: the team values the energy of the office. Action-oriented: you are a "doer" who handles everything from high-level architecture to general user support.

The Reward
Beyond a competitive base and bonus, this role may offer equity/share options. Given the founders' history of successful exits, this represents a genuine opportunity for a life-changing financial event.
Apr 25, 2026
Full time
Taylor Hopkinson Limited
Data Engineer
Taylor Hopkinson Limited City, London
Data Engineer for a major offshore wind project in the United Kingdom

Responsibilities
  • Design and implement scalable ingestion pipelines from multiple source systems, including operational and internal business data sources.
  • Ensure reliable, automated, and monitored data flows into the Bronze layer of the Medallion architecture.
  • Work within the client's existing security framework to establish compliant connectivity to operational data sources.
  • Build and maintain Silver and Gold layer transformations in Databricks using Python and SQL.
  • Onboard datasets into Unity Catalog, ensuring proper governance, lineage, and discoverability.

Platform Collaboration & Delivery
  • Support the ML/Data Scientist in preparing clean, structured datasets for anomaly detection and asset performance modelling.
  • Contribute to technical documentation and ensure pipelines are maintainable and transferable.
  • Stay current on Databricks and Azure platform developments relevant to the stack.
  • Support the Digital & AI Strategy Manager in assessing the feasibility of new data source integrations as the roadmap evolves.

Experience
  • Master's degree in Computer Science, Data Engineering, Software Engineering, or a related technical field.
  • Professional certifications in Azure or Databricks preferred.
  • Training or background in energy systems, renewable energy, offshore wind or BESS technologies is a strong plus.
  • 4-7 years of hands-on data engineering experience in a cloud environment.
  • Demonstrated experience delivering production pipelines on Databricks and Azure (ADLS Gen2, ADF or equivalent).
  • Proven ability to implement Medallion architecture or equivalent layered data modelling patterns.
  • Experience with REST API ingestion and integration of business systems (ERP, finance tools).
  • Experience in a contractor or project-based delivery model preferred.
  • Exposure to OT/SCADA environments or energy sector data.
  • Exposure to MLOps workflows or collaboration with data science teams.
Apr 25, 2026
Contractor
Ageas Insurance Limited
Senior Data Quality Analyst
Ageas Insurance Limited Eastleigh, Hampshire
Job Title: Senior Data Quality Analyst
Target Start Date: Q2 2026
Contract Type: Permanent, Full Time
Salary Range: £65,000 - £70,000
Location: Eastleigh, Hybrid (1x week)
Closing Date for applications: 7th May

Senior Data Quality Analyst
We are currently looking for a Senior Data Quality Analyst. You will work alongside Data Scientists, Engineers, Architects and Analysts to support the design, build and maintenance of cutting-edge data and AI services, ensuring strong data quality practices are embedded and monitored from the outset. Working closely with our governance leads and collaborating with risk, compliance and privacy teams, you'll help establish enterprise standards and drive trusted, high-quality data that powers analytics and AI innovation.

Main Responsibilities as Senior Data Quality Analyst
  • Provide data quality advice and guidance across the business, promoting best practice and pragmatic solutions
  • Design and implement data quality processes, controls and monitoring across our data platforms and enterprise systems
  • Develop data profiling, reporting and monitoring solutions using SQL and Python
  • Collaborate with data owners, stewards and the wider data community to improve trust and quality in critical datasets
  • Curate and maintain key data artefacts such as data catalogues, dictionaries, lineage and asset registers
  • Champion the value of data quality through governance forums, stakeholder engagement and guidance materials
  • Support delivery of the strategic data quality roadmap and key governance outcomes
  • Work with architects and AI teams to ensure high-quality, well-governed data supports scalable data products and GenAI services

Skills and experience you need as Senior Data Quality Analyst
  • Strong experience implementing data quality processes and governance frameworks within complex data environments
  • Hands-on coding capability in SQL, with experience using Python for data manipulation, profiling or automation
  • Experience working with modern cloud data platforms, particularly Databricks
  • Experience profiling datasets and defining data quality rules, controls and monitoring approaches
  • Experience working with data governance frameworks and collaborating with data owners, stewards and governance teams
  • Familiarity with data governance and data management tooling such as Unity Catalog, Collibra or similar
  • Strong stakeholder engagement skills with the ability to influence across technical and non-technical teams
  • Interest in AI and emerging technologies, and an understanding of how strong data management enables advanced analytics and GenAI

Qualifications
  • DAMA CDMP (Certified Data Management Professional) or equivalent.
  • Recognised Data Quality Specialist certification or training.
Desirable:
  • Experience in the insurance or financial services sector.
  • Exposure to data migration or transformation programmes.

At Ageas we offer a wide range of benefits to support you and your family inside and outside of work, which helped us achieve Top Employer status in the UK. Here are some of the benefits you can enjoy at Ageas:
  • Flexible Working - Smart gives employees flexibility around location (as long as it's within the UK) and, for many of our roles, flexibility within the working day to manage other commitments, such as school drop-offs. We also offer all our vacancies part-time/job-share, plus a minimum of 35 days' holiday (inc. bank holidays), and you can buy and sell days.
  • Supporting your Health - Dental Insurance, Health Cash Plan, Health Screening, Will Writing, Voluntary Critical Illness, Mental Health First Aiders, Well-Being Activities including Mindfulness.
  • Supporting your Wealth - Annual Bonus Schemes, Annual Salary Reviews, Competitive Pension, Employee Savings, Employee Loans.
  • Supporting you at Work - Well-being activities, mindfulness sessions, Sports and Social Club events and more.
  • Supporting you and your Family - Maternity/pregnant parent/primary adopter entitlement of 16 weeks at full pay and paternity/non-pregnant parent/co-adopter at 8 weeks' full pay.
  • Benefits for Them - Partner Life Assurance and Critical Illness cover.
  • Get some Tech - Deals on various gadgets including wearables, tablets and laptops.
  • Getting around - Car Salary Exchange, Cycle Scheme, Vehicle Breakdown Cover.
  • Supporting you back to work - Return to work programme after maternity leave.

About Ageas
We are one of the largest car and home insurers in the UK. Our people help Ageas to be a thriving, creative and innovative place to work. We show this in the service we provide to over four million customers. As an inclusive employer, we encourage anyone to apply. We're a signatory of the Race at Work Charter and Women in Finance Charter, and a member of iCAN and GAIN. As a Disability Confident Leader, we are committed to ensuring our recruitment processes are fully inclusive. That means if you are applying for a job with us, you will have fair access to support and adjustments throughout your recruitment journey.
Apr 24, 2026
Full time
Job Title : Senior Data Quality Analyst Target Start Date: Q2 2026 Contract Type: Permanent, Full Time Salary Range: £65,000 - £70,000 Location: Eastleigh, Hybrid (1x week) Closing Date for applications: 7th May Senior Data Quality Analyst: We are currently looking for a Senior Data Quality Analyst. You will work alongside Data Scientists, Engineers, Architects and Analysts to support the design, build and maintenance of cutting-edge data and AI services, ensuring strong data quality practices are embedded and monitored from the outset. Working closely with our governance leads and collaborating with risk, compliance and privacy teams, you'll help establish enterprise standards and drive trusted, high-quality data that powers analytics and AI innovation. Main Responsibilities as Senior Data Quality Analyst: Provide data quality advice and guidance across the business, promoting best practice and pragmatic solutions Design and implement data quality processes, controls and monitoring across our data platforms and enterprise systems Develop data profiling, reporting and monitoring solutions using SQL and Python Collaborate with data owners, stewards and the wider data community to improve trust and quality in critical datasets Curate and maintain key data artefacts such as data catalogues, dictionaries, lineage and asset registers Champion the value of data quality through governance forums, stakeholder engagement and guidance materials Support delivery of the strategic data quality roadmap and key governance outcomes Work with architects and AI teams to ensure high-quality, well-governed data supports scalable data products and GenAI services Skills and experience you need as Senior Data Quality Analyst: Strong experience implementing data quality processes and governance frameworks within complex data environments Hands-on coding capability in SQL, with experience using Python for data manipulation, profiling or automation Experience working with modern cloud data 
platforms, particularly Databricks Experience profiling datasets and defining data quality rules, controls and monitoring approaches Experience working with data governance frameworks and collaborating with data owners, stewards and governance teams Familiarity with data governance and data management tooling such as Unity Catalog, Collibra or similar Strong stakeholder engagement skills with the ability to influence across technical and non-technical teams Interest in AI and emerging technologies, and an understanding of how strong data management enables advanced analytics and GenAI Qualifications : DAMA CDMP (Certified Data Management Professional) or equivalent. Recognised Data Quality Specialist certification or training. Desirable: Experience in the insurance or financial services sector. Exposure to data migration or transformation programmes. At Ageas we offer a wide range of benefits to support you and your family inside and outside of work, which helped us achieve, Top Employer status in the UK. Here are some of the benefits you can enjoy at Ageas: Flexible Working- Smart gives employees flexibility around location (as long as it's within the UK) and, for many of our roles, flexibility within the working day to manage other commitments, such as school drop offs etc. We also offer all our vacancies part-time/job-shares. We also offer a minimum of 35 days holiday (inc. bank holidays) and you can buy and sell days. Supporting your Health- Dental Insurance Health Cash Plan, Health Screening, Will Writing, Voluntary Critical Illness, Mental Health First Aiders, Well Being Activities - Mindfulness. Supporting your Wealth- Annual Bonus Schemes, Annual Salary Reviews, Competitive Pension, Employee Savings, Employee Loans. Supporting you at Work- Well-being activities, mindfulness sessions, Sports and Social Club events and more. 
Supporting you and your Family- Maternity/pregnant parent/primary adopter entitlement of 16 weeks at full pay and paternity/non-pregnant parent/co-adopter at 8 weeks' full pay. Benefits for Them- Partner Life Assurance and Critical Illness cover. Get some Tech- Deals on various gadgets including Wearables, Tablets and Laptops. Getting around- Car Salary Exchange, Cycle Scheme, Vehicle Breakdown Cover. Supporting you back to work- Return to work programme after maternity leave. About Ageas: We are one of the largest car and home insurers in the UK. Our People help Ageas to be a thriving, creative and innovative place to work. We show this in the service we provide to over four million customers. As an inclusive employer, we encourage anyone to apply. We're a signatory of the Race at Work Charter and Women in Finance Charter, and a member of iCAN and GAIN. As a Disability Confident Leader, we are committed to ensuring our recruitment processes are fully inclusive. That means if you are applying for a job with us, you will have fair access to support and adjustments throughout your recruitment process.
Amplius
Head of Enterprise Data Delivery
Amplius Milton Keynes, Buckinghamshire
Head of Enterprise Data Delivery Salary £80,000 (plus car allowance of £5,900) Location Hybrid - Milton Keynes or Boston Permanent, Full Time Data, transformation and governance - this role has it all! As Head of Enterprise Data Delivery at Amplius, you'll shape how data is built, managed and delivered across the organisation. You'll set clear direction and standards, making sure data is consistent, reliable and easy to use. By bringing structure to complex, fragmented data, you'll help improve reporting and support better, more confident decision-making. Salary: £80,000 (plus car allowance of £5,900) per year Contract: Permanent, full time Your week: 36.25 hours Monday - Friday 9am - 5.15pm Location: Hybrid with a weekly presence in our Milton Keynes or Boston office Snapshot of your role Take full end-to-end ownership of the enterprise data estate, including Data Platform, Data Products, and Data Governance, ensuring reliable delivery from source systems through to reporting and insight. Lead the modernisation of legacy data assets into a scalable cloud-based platform, defining the approach and ensuring successful migration, adoption, and long-term stability. Shape, structure, and maintain clear delivery plans with defined priorities, dependencies, ownership, and measurable outcomes that keep teams aligned and focused. Drive pace, momentum, and accountability across data teams, actively identifying risks, resolving blockers, and intervening where delivery is off track. Oversee the design and delivery of certified, reusable data products that support BI, KPI reporting, analytics, and operational decision-making. Embed modern data engineering practices including CI/CD, automated testing, monitoring, environment separation, and operational readiness across all delivery workstreams. Ensure governance is embedded into day-to-day delivery, including data definitions, ownership, lineage, quality standards, and access controls. 
Act as a senior technical partner to engineering, architecture, BI, and transformation teams, challenging design decisions and ensuring scalable, production-ready solutions. Provide clear visibility to senior stakeholders on progress, risks, dependencies, and delivery timelines across all data initiatives. What we're looking for Proven experience leading data platform, data engineering, or enterprise data delivery teams Strong background delivering cloud data platforms (e.g. Azure, Databricks, Snowflake or similar) Hands-on experience with data pipelines, transformation, and data modelling at scale Experience modernising legacy data estates into cloud-based platforms Strong technical credibility with the ability to engage engineers, architects, and technical teams Understanding of how data products, semantic layers, and reporting structures are designed and delivered Experience embedding data governance, quality, and MDM into delivery practices Ability to operate in complex, fast-paced, ambiguous environments with competing priorities Strong leadership skills with a focus on accountability, delivery, and outcomes Clear communicator able to translate technical detail into business impact Practical, delivery-focused mindset with a bias for action and problem-solving Please read the attached Job Description before applying so you get the full scope of the role. You can read about our colleague benefits here - Amplius colleague benefits This vacancy will close on 6 May. Following this, we will be in touch to arrange interviews. We reserve the right to close the vacancy early in response to an overwhelming number of applications or a change in business priorities. We do not provide visa sponsorship; you must be eligible to work in the UK. You must reside in the UK for the duration of your employment and provide Right to Work evidence. If you have any questions, please contact the Amplius Talent Team and we'll be happy to assist you. 
The Company Amplius is one of the largest housing providers across the Midlands, East and Southeast of England. We own and manage more than 37,000 homes and deliver a range of quality services, including care and support, specialist housing and home ownership options. We're a team of over 1,300 colleagues driven to have a positive impact on people's lives and provide affordable homes that make a difference.
Apr 24, 2026
Full time
Fruition Group
Solution Architect - AI & Automation
Fruition Group
Job Title: Solutions Architect (AI and Automation) Location: Remote (occasional travel to London required) Salary: £75,000 - £85,000 (Depending on location) Would you like to work for an organisation that plays a pivotal role in safeguarding and improving the quality of health and social care services across England? My client plays a vital role in improving outcomes across health and social care by using data, technology, and insight to drive meaningful change. With a strong focus on innovation and digital transformation, they are investing in modern cloud platforms, artificial intelligence, and advanced analytics to become a truly intelligence-led regulator. Solutions Architect Responsibilities Design scalable, secure, and ethical AI and data solutions using Microsoft Azure technologies. Translate business requirements into end-to-end solution architecture across AI, machine learning, and data platforms. Maintain architectural blueprints ensuring interoperability, resilience, governance, and regulatory compliance. Lead architecture design reviews and provide technical leadership to delivery teams using Agile, DevOps, and CI/CD practices. Mentor technical teams and promote responsible AI practices, including transparency, fairness, and security. Solutions Architect Requirements Extensive experience delivering solutions on the Microsoft Azure Data Platform (Synapse Analytics, Data Lake, Databricks, Azure ML, Power BI). Expertise in modern data engineering approaches such as ETL/ELT pipelines, streaming, APIs, and event-driven architecture. Experience with cloud-native architectures, microservices, and SaaS integrations. Knowledge of data governance, information security, and AI ethics within regulated industries. Excellent stakeholder engagement and leadership skills. We are an equal opportunities employer and welcome applications from all suitably qualified persons regardless of their race, sex, disability, religion/belief, sexual orientation or age.
Apr 24, 2026
Full time
Oscar Technology
Senior Data Engineer / Data Architect
Oscar Technology Warrington, Cheshire
Role: Senior Data Engineer / Data Architect Salary: Up to £85,000, plus £5,000 annual bonus Technology - Azure Platform, DataBricks, PowerBI Location: Warrington Working Pattern: Hybrid - 2 days a week in the office. The Role: This is a great new role for either a Senior Data Engineer wanting to make a step towards architecture, or a hands-on data architect who is still very much a doer. You will be the Data Engineer / Architect for the business, so this is a design-and-execute role; you will need to be happy to roll your sleeves up and do some technical work alongside ownership of the architecture piece. The system is large, the company is a global business, and their systems are already in good shape. This is not like some roles I have seen where "everything is in a mess and we need someone to fix it" - everything here is in place for the successful person to deliver. In addition to the MS technologies, they also have Salesforce and SAP in the business, and a large amount of data is moved in and out of those systems, so experience in this area would definitely be an advantage. This role is high profile; we are looking for someone to work with SLT, board members and international stakeholders. You will need to be confident and comfortable engaging in conversations and high-level decision-making. Please note - this is not a remote position; it is hybrid in the office, 2 days a week, but there is quite a lot of flexibility to that. Responsibilities: Own and evolve the enterprise data architecture - defining data models, integration patterns, and standards that scale with the business. Design secure, resilient data solutions across cloud and on-premises environments, ensuring they are fit for purpose today and adaptable for tomorrow. Act as the bridge between business stakeholders, analytics teams, and engineering - translating commercial requirements into robust, well-reasoned architectural designs. 
Set the standard for data governance, data quality, metadata management, and master data management - and hold the organisation to it. Ensure all data practices meet security, privacy, and regulatory obligations, proactively identifying and mitigating compliance risk. Provide architectural leadership and assurance across data programmes, guiding teams to make the right design decisions at every stage. Assess, recommend, and champion the right data technologies, tools, and platforms - balancing innovation with pragmatism. Lead and support data migration, modernisation, and transformation initiatives, bringing structure and clarity to complex change. Produce clear, consistent documentation of data architectures, models, and design decisions that serve as a lasting reference for the organisation. Identify opportunities to commercialise data insights through automation and process efficiency - turning data into measurable business value. Align data architecture with global systems requirements and regulatory evolution, ensuring enabling technology delivers maximum business impact. Requirements: Azure environment Strong, well-rounded data engineering skillset. Apply Now! If you have a range of experience in Data Engineering and you are looking to progress with an organisation that has a fantastic approach to work in a thriving and ambitious environment, then look no further - this is the role for you! Please note: this role does not offer sponsorship. Referrals: If this role isn't right for you, do you know someone who might be interested? You could earn £500 of retail vouchers if you refer a successful candidate to Oscar. Email: to recommend someone for this role. Interviews for this role will be held imminently. To be considered, please send your CV to me now to avoid disappointment. 
Role: Senior Data Engineer / Data Architect Salary: Up to £85,000, plus £5,000 annual bonus Technology - Azure Platform, DataBricks, PowerBI Location: Warrington Working Pattern: Hybrid - 2 days a week in the office. Oscar Associates (UK) Limited is acting as an Employment Agency in relation to this vacancy. To understand more about what we do with your data please review our privacy policy in the privacy section of the Oscar website.
Apr 24, 2026
Full time
IntaPeople
Senior Data Engineer
IntaPeople
IntaPeople are hiring for a mid-senior level Data Engineer to join a growing digital engineering team working on modern technology platforms. You'll work alongside an established BI & Data team to play a role in an active phase of platform modernisation. The successful candidate will join a small, collaborative team of data engineers and analysts delivering work across the full data lifecycle, from extraction and transformation through to data modelling and reporting. This role sits at the heart of a growing data engineering capability in Cardiff. You'll be actively involved in delivering high quality data solutions, while acting as a trusted reference point for best practice across the team. From shaping and delivering ETL workflows to collaborating directly with stakeholders, your work will help ensure data platforms evolve in line with the organisation's expanding requirements. You will be working primarily within the Microsoft Azure ecosystem, including Azure SQL Server, Azure Data Factory, and Azure DevOps. Required Skills Strong Python experience for data engineering Strong experience working with SQL Server Strong experience working with Azure Data Factory and Azure DevOps Hands-on experience with data lake platforms, Azure Synapse, Databricks, or equivalent experience/skills Experience with tooling such as CI/CD pipelines and version control. 
The ability to lead feature specifications and work closely with key business stakeholders Adopting an AI-first approach to development and being inquisitive about its benefits and features Key Responsibilities Reporting into the Director of Data Engineering and working closely with other Data Engineers and Analysts within the business You will be responsible for the design, build, and maintenance of Python-based ETL pipelines Owning and leading the SQL development Data lake development within Azure Synapse, working to the organisation's architecture standards Meeting with business stakeholders to define requirements and translate them into solution designs Keeping stakeholders informed on the status of data initiatives Producing technical documentation: solution designs, data dictionaries, and engineering runbooks Reviewing and guiding the work of less experienced members of the team Contributing to solution design discussions and architecture decisions Role overview Senior Data Engineer Starting salary of £55,000 - £60,000 Annual bonus scheme between 10%-20% 25 days holiday allowance (which increases with service) Central Cardiff office location True flexible working hours Hybrid working setup - expectations are typically 2-3 days per week Private medical care Company-wide trips Group Life Assurance, Income Protection & Critical Illness cover Matched pension contribution Cycle to work scheme If you're an experienced Data Engineer looking to make an impact in a modern, forward-thinking team, this is a great opportunity. Please note we do not have the ability to provide sponsorship; candidates must only apply if they have the ability to work without restriction within the UK. Interested? Click apply now with your CV or call (phone number removed) for a chat!
Apr 24, 2026
Full time
Alexander Mann Solutions
Financial Data Architecture Lead
Alexander Mann Solutions
AMS is a global workforce solutions partner committed to creating inclusive, dynamic, and future-ready workplaces. We help organisations adapt, grow, and thrive in an ever-evolving world by building, shaping, and optimising diverse talent strategies. We partner with PwC to support their contingent recruitment processes. Acting as an extension of their recruitment teams, we connect them with skilled interim and temporary professionals, fostering workplaces where everyone can contribute and succeed. PwC is a hugely diverse business, bound by our global purpose - to build trust in society and solve important problems. Our greatest opportunity to deliver our purpose is through meaningful work that makes a difference to our clients and society. We have a culture of high performance built on exceptional quality, ethical and professional standards. We lead by example. Our standards for quality - and care - are high. And together we surpass them. We believe we can make the biggest impact when leading with our purpose, values and PwC Professional behaviours in every interaction. On behalf of PwC, we are looking for a Financial Data Architecture Lead for a 6 month contract based in London. Join us as a Financial Data Architecture Lead: We are seeking a senior Financial Data Architecture Lead to design and lead the development of the Finance data foundation across Insurance Finance, Risk, Treasury, Tax and Actuarial domains. This role is not focused on building core platforms or infrastructure; instead, it centres on designing data models, structures, and flows grounded in real business and actuarial use cases, with a deep understanding of how data resolution, granularity and quality impact financial and actuarial models. You will work closely with a centrally provided technology platform (Azure, Databricks) and act as the design authority for Finance data, analytics, and reconciliation frameworks. 
Python capability is essential, including building and deploying financial and analytical models. What you'll do: Define and lead the end-to-end Finance Data architecture across Finance, Risk, Treasury, Tax, and Actuarial. Design robust finance and actuarial data models that align with business use cases, regulatory needs, and analytical requirements. Establish data foundations for new and evolving data sources, considering data resolution, lineage, controls, and downstream model impact. Define data flows supporting statutory, regulatory, and internal reporting. Partner closely with Finance and Actuarial stakeholders to understand how data feeds financial and actuarial models. Assess and resolve data issues impacting model accuracy, reconciliation, and comparability across reporting bases. Provide thought leadership on data structures that enable scalable analytics and advanced modelling. Lead and oversee the deployment of Python-based data and financial models on central data platforms. Develop and review Python pipelines supporting analytics, modelling, and reporting use cases. Support reporting and analytics capabilities, including BI and advanced analytical layers. Key Accountabilities, Skills & Experience: Proven experience as a Finance Data Architect, Senior Data Architect, or Finance Data Modeller in insurance or financial services. Deep understanding of insurance finance and actuarial data, including large-scale transformation programmes. Experience working with central/platform technology teams rather than owning infrastructure delivery. Strong Python development skills for data analysis, financial modelling, and pipeline development. Experience deploying Python models within modern data platforms (e.g. Databricks). Strong understanding of data modelling concepts (conceptual, logical, physical) within Finance and Risk domains. Experience supporting reporting, analytics, and BI use cases built on Finance data foundations. 
Understanding of regulatory and statutory reporting data requirements. Next Steps: At PwC we want every individual to feel valued, respected and empowered to contribute fully. Creating an environment where everyone belongs and thrives unlocks greater innovation, productivity and deeper engagement. If you are interested in applying for this position and meet the criteria outlined above, please click the link to apply and we will contact you with an update in due course. AMS, a Recruitment Process Outsourcing Company, may in the delivery of some of its services be deemed to operate as an Employment Agency or an Employment Business.
Apr 24, 2026
Contractor
Ageas Insurance Limited
Senior Data Engineer
Ageas Insurance Limited Reigate, Surrey
Job Title: Senior Data Engineer
Target Start Date: 20th June 2026
Contract Type: Permanent, Part Time, Full Time, Job Share option available
Salary Range: £70,000 - £85,000
Location: Eastleigh and Reigate

Senior Data Engineer: We are currently recruiting for a Senior Data Engineer to join our innovative Data team. You will join a collaborative team of data and AI engineers, scientists, developers, analysts, and architects. Together, you will design and build modern machine learning and AI services that support analytics and improve products across the business.

Main Responsibilities as the Senior Data Engineer:
• Build and support data products within our modern data platform
• Design and deliver solutions with engineers, scientists and product teams
• Develop and optimise data pipelines across the analytics platform
• Integrate data from varied sources with strong quality standards
• Maintain orchestration, monitoring and performance of data components
• Improve engineering processes across the wider data community
• Promote high coding and data practice standards
• Experiment with emerging data, ML and AI technologies
• Partner with architects on data product designs
• Work collaboratively in multi-functional agile squads
• Support ML and GenAI infrastructure and workflows

Skills and experience you need as the Senior Data Engineer:
• Passion for building scalable, resilient cloud data platforms
• Strong experience with Databricks or Snowflake on AWS
• Proven Python skills, including Spark and Airflow expertise
• Advanced SQL skills and end-to-end data modelling experience
• Experience building batch and real-time data integrations
• Hands-on CI/CD skills with Git, Jenkins or similar
• Ability to ingest, cleanse and structure large, diverse datasets
• Knowledge of Terraform or similar IaC tools
• Experience with Docker and Kubernetes is beneficial
• Exposure to production Generative AI is an advantage
• Strong collaboration skills and a proactive attitude

At Ageas we offer a wide range of benefits to
support you and your family inside and outside of work, which helped us achieve Top Employer status in the UK. Here are some of the benefits you can enjoy at Ageas:
• Flexible Working - Smart gives employees flexibility around location (as long as it's within the UK) and, for many of our roles, flexibility within the working day to manage other commitments, such as school drop-offs. We also offer all our vacancies as part-time/job-shares, plus a minimum of 35 days holiday (inc. bank holidays), and you can buy and sell days.
• Supporting your Health - Dental Insurance, Health Cash Plan, Health Screening, Will Writing, Voluntary Critical Illness, Mental Health First Aiders, Well-Being Activities - Mindfulness.
• Supporting your Wealth - 50% off esure and Sheilas' Wheels motor and home insurance, Annual Bonus Schemes, Annual Salary Reviews, Competitive Pension, Employee Savings, Employee Loans.
• Supporting you at Work - Well-being activities, mindfulness sessions, Sports and Social Club events and more.
• Supporting you and your Family - Maternity/pregnant parent/primary adopter entitlement of 16 weeks at full pay and paternity/non-pregnant parent/co-adopter at 8 weeks' full pay.
• Benefits for Them - Partner Life Assurance and Critical Illness cover.
• Get some Tech - Deals on various gadgets including Wearables, Tablets and Laptops.
• Getting around - Car Salary Exchange, Cycle Scheme, Vehicle Breakdown Cover.
• Supporting you back to work - Return to work programme after maternity leave.

About Ageas: We are one of the largest car and home insurers in the UK. Our People help Ageas to be a thriving, creative and innovative place to work. We show this in the service we provide to over four million customers. As an inclusive employer, we encourage anyone to apply. We're a signatory of the Race at Work Charter and Women in Finance Charter, and a member of iCAN and GAIN. As a Disability Confident Leader, we are committed to ensuring our recruitment processes are fully inclusive.
That means if you are applying for a job with us, you will have fair access to support and adjustments throughout your recruitment experience. If the list does not cover the support you need, please contact our Recruitment Team to discuss how they can help. We also guarantee an interview for applicants with a disability who meet the minimum criteria for the role. For more information, please see Ageas Everyone. We have a zero-tolerance approach towards any form of harassment during the recruitment process, ensuring that everyone is treated with respect and professionalism. Our aim is to have great people everywhere in our business and we're always looking for outstanding people to join us. Most roles across Ageas allow a proportion of your time to be spent working from home and we're open to discussing flexible working, including full-time, part-time or job share arrangements. To find out more about Ageas, see About Us. Want to be part of a winning team? Come and join Ageas. Click on the 'Apply' button to be considered. Important Notice -
Apr 23, 2026
Full time
Data Science Manager
Huron Consulting Group Inc. City, Belfast
Data Science Manager
Remote type: Hybrid | Location: Belfast - 20 Adelaide Street | Job requisition id: JR-

Huron is a global consultancy that collaborates with clients to drive strategic growth, ignite innovation and navigate constant change. Through a combination of strategy, expertise and creativity, we help clients accelerate operational, digital and cultural transformation, enabling the change they need to own their future. Join our team as the expert you are now and create your future.

Data Science Manager: We're seeking a Data Science Manager to join the Data Science & Machine Learning team in our Commercial Digital practice, where you'll lead advanced analytics initiatives that transform how Fortune 500 companies make decisions across Financial Services, Manufacturing, Energy & Utilities, and other commercial industries. Managers play a vibrant, integral role at Huron. Their invaluable knowledge reflects in the projects they manage and the teams they lead. Known for building long-standing partnerships with clients, they collaborate with colleagues to solve their most important challenges. Our Managers also spend significant time mentoring junior staff on the engagement team - sharing expertise, feedback, and encouragement. This promotes a culture of respect, unity, collaboration, and personal achievement. This isn't a reporting role or a dashboard factory - you'll own the full analytics lifecycle from hypothesis formulation through insight delivery, while leading and developing a team of data scientists and analysts. You'll work on problems that matter: experimental designs that validate multi-million-dollar strategies, predictive models that surface hidden patterns in complex data, and deep learning pipelines that extract signal from unstructured text, images, and time-series.
Our clients are Fortune 500 companies looking for partners who can find the signal in the noise and tell the story that drives action. The variety is real. In your first year, you might lead a customer segmentation and lifetime value analysis for a financial services firm, design and analyze a pricing experiment for a global manufacturer, and build an agentic anomaly detection system for a utility company's operational data - all while developing the next generation of data science talent at Huron. If you thrive on rigorous analysis, clear communication of complex findings, and building high-performing teams, this role is for you.

What You'll Do:
• Lead and mentor junior data scientists and analysts - provide technical guidance, review analytical approaches and code, and support professional development. Foster a culture of intellectual curiosity, rigorous methodology, and clear communication within the team.
• Manage complex multi-workstream analytics projects - oversee project planning, resource allocation, and delivery timelines. Ensure analyses meet quality standards and client expectations while maintaining methodological rigor.
• Design and execute end-to-end data science workflows - from problem framing and hypothesis development through exploratory analysis, modeling, validation, and insight delivery. Own the analytical approach and ensure conclusions are defensible.
• Lead development of both traditional statistical and modern AI-powered analyses - including regression, classification, clustering, causal inference, A/B testing, and modern deep learning approaches using embeddings, transformer architectures, and foundation models for text, time-series, and multimodal analysis.
• Build predictive and prescriptive models that drive business decisions - customer segmentation, churn prediction, demand forecasting, pricing optimization, risk scoring, and operational efficiency analysis for commercial enterprises.
• Translate complex analytical findings into actionable insights - create compelling data narratives, develop executive-ready presentations, and communicate technical results to non-technical stakeholders in ways that drive decisions.
• Serve as a trusted advisor to clients - build long-standing partnerships, deeply understand business problems, formulate the right analytical questions, and deliver insights that create measurable value.
• Contribute to practice development - participate in business development activities, develop reusable analytical frameworks and methodologies, and help shape the technical direction of Huron's DSML capabilities.

Required Qualifications:
• 5+ years of hands-on experience conducting data science and advanced analytics - not just ad-hoc analysis, but structured analytical projects that drove business decisions. You've framed problems, developed hypotheses, analyzed data, and delivered insights that created measurable impact.
• Experience leading and developing technical teams - including coaching, mentorship, methodology review, and performance management. Demonstrated ability to build high-performing teams and develop junior talent.
• Strong Python and SQL programming skills with deep experience in the data science ecosystem (Pandas, NumPy, Scikit-learn, statsmodels, visualization libraries). Comfortable writing production-quality code, not just notebooks.
• Solid foundation in statistics and machine learning: hypothesis testing, regression analysis, classification, clustering, experimental design, causal inference, and understanding of when different approaches are appropriate for different questions.
• Experience with deep learning and modern neural architectures - understanding of transformer models, embeddings, transfer learning, and how to leverage foundation models for analytical tasks. You know when ML approaches add value over classical methods, and how to integrate them into rigorous analytical workflows.
• Proficiency with data platforms: Microsoft Fabric, Snowflake, Databricks, or similar cloud analytics environments. You're comfortable working with large datasets and can optimize queries for performance.
• Exceptional communication and data storytelling skills - ability to distill complex analyses into clear narratives, create compelling visualizations, lead client meetings, and build trusted relationships with executive audiences. This is non-negotiable.
• Bachelor's degree in Statistics, Mathematics, Economics, Computer Science, or related quantitative field (or equivalent practical experience).
• Flexibility to work in a hybrid model with periodic travel to client sites as needed.

Preferred Qualifications:
• Experience in Financial Services, Manufacturing, or Energy & Utilities industries.
• Background in experimental design, A/B testing, and causal inference methodologies - including propensity score matching, difference-in-differences, or instrumental variables.
• Hands-on experience with deep learning frameworks (PyTorch, TensorFlow) and neural architectures - including transformers, attention mechanisms, and fine-tuning pretrained models for NLP, time-series, or tabular data applications.
• Experience building AI-assisted analytical workflows - leveraging foundation model APIs, vector databases, and retrieval systems to accelerate insight extraction from unstructured data.
• Experience with Bayesian methods, probabilistic programming (PyMC, NumPyro, etc.), or uncertainty quantification in business contexts.
• Strong visualization and data interface design and development skills using programmatic visualization libraries (Plotly, Altair, D3).
• Proficiency with AI-assisted rapid data application development using Cursor, Lovable, v0, etc.
• Experience with time-series analysis, forecasting methods (ARIMA, Prophet, neural forecasting), and demand planning applications.
• Cloud certifications (Azure Data Scientist, Databricks ML Associate, AWS ML Specialty).
• Consulting experience or demonstrated ability to work across multiple domains and adapt quickly to new
Apr 23, 2026
Full time
Solution Director; Analytics, AI/ML
Pace Industries, LLC
Solution Director; Analytics, AI/ML
Location: UK-Hayes-Hyde Park Hayes | Time type: Full time | Posted 26 days ago | Job requisition id: R-23123

We are seeking a highly accomplished Solution Director (Analytics & AI/ML) to lead the design and sales of two critical solution portfolios: Generative AI/LLM solutions and data modernization/Lakehouse architectures on AWS. This pivotal role requires mastery of both domains - leveraging generative AI capabilities (Amazon Q, Amazon Bedrock, QuickSight) to drive executive conversations and opportunity creation, while delivering enterprise data modernization through Lakehouse architectures using AWS native services (Glue, SageMaker Unified Studio) and leading platforms (Databricks on AWS, Snowflake on AWS). This is a presales role that demands cross-functional experience with a proven ability to engage C-level stakeholders, drive top-of-funnel opportunity creation, and maintain comprehensive account ownership across the entire customer lifecycle. The ideal candidate will excel at both selling the vision of generative AI transformation and delivering the reality of enterprise data modernization, combining deep technical expertise with exceptional business acumen and executive presence.
Responsibilities
Strategic Leadership & Opportunity Development
• Drive top-of-funnel opportunity creation through two parallel tracks: engaging C-level stakeholders with generative AI demonstrations (Amazon Q, Amazon Bedrock) and identifying data modernization needs for Lakehouse transformations.
• Lead the design and architecture of dual solution portfolios: 1) Generative AI Solutions: Amazon Bedrock implementations, Amazon Q deployments, QuickSight with Q capabilities, RAG architectures, and custom LLM solutions; and 2) Data Modernization: enterprise Lakehouse architectures using AWS Glue, SageMaker Unified Studio, Databricks on AWS, and Snowflake on AWS.
• Act as the trusted advisor, positioning generative AI as the transformational vision while grounding delivery in robust data platform modernization.
• Develop compelling business cases that connect AI aspirations with practical data foundation requirements, demonstrating ROI across both portfolios.
• Stay current with advancements in generative AI (foundation models, LLMs) and modern data architectures (Lakehouse patterns, data mesh, unified analytics).
• Contribute to Rackspace's intellectual property through reference architectures covering both generative AI implementations and Lakehouse design patterns.
• Mentor and provide leadership to Solution Architects by guiding technical development and fostering skill growth across both generative AI and data modernization solution areas.
Customer Engagement & Solution Delivery
• Serve as the primary technical lead orchestrating both generative AI discussions and data modernization programs for strategic accounts.
• Build strategic relationships using two engagement models: 1) Executive Level: Amazon Q demonstrations, QuickSight analytics with generative BI, art-of-the-possible sessions; and 2) Technical Level: Lakehouse architecture workshops, platform assessments (Databricks vs Snowflake vs AWS-native), migration planning.
• Lead comprehensive consultative engagements that begin with the generative AI vision (Amazon Q, Bedrock) and translate into concrete data modernization roadmaps.
• Develop proposals that balance innovative AI capabilities with foundational data platform requirements.
• Guide customers through parallel journeys: generative AI adoption (POCs to production) and data platform modernization (legacy to Lakehouse).
• Collaborate with sales teams to position both solution portfolios strategically based on customer maturity and needs.
Technical Excellence & Market Awareness
• Maintain deep expertise across both solution domains: 1) Generative AI: Amazon Bedrock, Amazon Q, QuickSight Q, SageMaker JumpStart, prompt engineering, RAG architectures, vector databases; and 2) Data Platforms: AWS Glue, SageMaker Unified Studio, Databricks on AWS, Snowflake on AWS, Redshift, EMR, Apache Iceberg, Delta Lake.
• Position AWS solutions effectively against other cloud platforms' offerings in both generative AI (Azure OpenAI, Vertex AI) and data platforms (Azure Synapse, BigQuery).
• Guide architectural decisions on build vs. buy for both AI capabilities and data platform components.
Experience
Deep experience with generative AI technologies: Amazon Bedrock, Amazon Q, LLM architectures, RAG implementations.
Proven track record delivering data modernization: Lakehouse architectures, Databricks and/or Snowflake implementations, AWS Glue/EMR deployments. A bachelor's degree in Computer Science, Data Science, Engineering, Mathematics, or a related technical field is required. At the manager's discretion, additional relevant experience may substitute for the degree requirement. A minimum of 15 years of enterprise solution architecture experience. A minimum of 8 years of public cloud experience. A minimum of 5 years as a senior-level architect or solutions leader with hands-on experience in both AI/ML and data platform modernization. Proven presales/sales engineering experience. Demonstrated success in engaging C-level executives using generative AI demonstrations while delivering complex data platform transformations. Strong understanding across the full spectrum: AI/ML: generative AI, foundation models, LLMs, traditional ML, prompt engineering, fine-tuning. Data Platforms: Lakehouse architectures, data mesh, ETL/ELT, streaming, data governance, data quality. Proficiency in Python, SQL, and Spark with hands-on experience in: Generative AI: LangChain, vector databases, embedding models. Data Engineering: PySpark, Apache Iceberg/Delta Lake, orchestration tools. A proven ability to articulate both visionary AI possibilities and practical data platform requirements to diverse audiences.
About Rackspace Technology
We are the multicloud solutions experts. We combine our expertise with the world's leading technologies - across applications, data and security - to deliver end-to-end solutions. We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work, year after year, according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent.
Join us on our mission to embrace technology, empower customers and deliver the future.
More on Rackspace Technology
Though we're all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know.
Apr 23, 2026
Full time
Gleeson Recruitment Group
Data Engineer
Gleeson Recruitment Group Leicester, Leicestershire
Data Engineer (SQL / Python)
Onsite 3 times per week (Leicester or Nottingham office)
£35K - £40K DOE
Our client is looking to appoint a Data Engineer to join their expanding data team. This is an excellent opportunity for someone with solid foundational experience who is eager to develop their skills and grow within a modern, cloud-based data environment. Working alongside senior engineers and analysts, the successful candidate will contribute to the design and development of scalable data solutions while gaining exposure to cutting-edge technologies.
The role will involve:
• Supporting the development and maintenance of cloud-based data pipelines
• Assisting in the design and optimisation of data models and architectures
• Working with analytics teams to ensure high-quality, reliable data outputs
• Contributing to best practices in data governance and engineering standards
The successful candidate will ideally have:
• Experience with at least one cloud platform (Azure, AWS, or Snowflake)
• Exposure to Databricks or similar modern data processing tools
• Working knowledge of SQL and some experience with Python
• An understanding of data warehousing concepts
• A strong desire to learn, develop, and progress within a data engineering career
Importantly, given the level of this role, our client is open to candidates who may not tick every box. They are keen to speak with individuals who demonstrate strong potential, a solid grasp of core concepts, and a genuine enthusiasm to build their technical capability within a supportive team environment. This role offers genuine progression, hands-on learning from experienced engineers, and the chance to be part of a collaborative and forward-thinking data team. Please apply asap if interested - GleeIT - Data Engineer. At Gleeson Recruitment Group, we embrace inclusivity and welcome applicants of all backgrounds, experiences, and abilities.
We are proud to be a disability confident employer. By applying you will be registered as a candidate with Gleeson Recruitment Limited. Our Privacy Policy is available on our website and explains how we will use your data.
Apr 22, 2026
Full time
Logix 2
SAP BDC Consultant (SAP Business Data Cloud Consultant)
Logix 2
SAP BDC Consultant (SAP Business Data Cloud Consultant)
Role
You will be a Data & AI solution expert for SAP Business Data Cloud, with expertise in SAP Datasphere, SAP Analytics Cloud, SAP Databricks, and SAP HANA Cloud.
• Deliver standard or customized demos and workshops.
• Lead discovery phases, data maturity assessments, solution scoping, and architecture reviews.
• Design end-to-end modern data architectures including data warehousing, data fabric, analytics, and AI readiness.
• Lead proofs-of-concept and feasibility assessments for complex data and AI scenarios.
• Help to shape strategy and Data & AI positioning.
Commercial experience:
• 7+ years of experience in consulting within data, analytics, or AI domains.
• Good experience of SAP Business Data Cloud, including Datasphere, SAP Analytics Cloud, HANA Cloud, and SAP application data models.
• Experience with hyperscaler data platforms and ecosystems (AWS, Azure, GCP; Databricks, Snowflake, Microsoft Fabric).
• Strong understanding of modern data architecture, data warehousing/lakehouse, analytics, and AI/ML concepts.
Skills you'll use: SAP Business Data Cloud (BDC), SAP Datasphere & HANA Cloud, Analytics, Data Warehousing & Lakehouse, Artificial Intelligence & AI Readiness, Databricks, Cloud & SaaS/PaaS Platforms, SAP Cloud Suite Portfolio, RISE with SAP.
Any presales experience would be of great benefit.
Apr 22, 2026
Full time
Harnham - Data & Analytics Recruitment
Senior Analytics Engineer (12 month FTC)
Harnham - Data & Analytics Recruitment Manchester, Lancashire
Senior Analytics Engineer - 12 month fixed term contract
UK Remote
£78,000 plus benefits
This Senior Analytics Engineer role stands out as a chance to play a key part in a large scale Lakehouse programme, sitting at the intersection of engineering, analytics and the wider business. You will take real ownership of curated data models, shape how data is structured and served across the organisation, and influence best practice as the analytics engineering capability continues to grow.
The Company
They are a large, well established UK organisation with a strong reputation for combining technical excellence with a people first culture. Data and analytics are a strategic priority, with ongoing investment into a modern cloud based data platform. Engineering teams are expanding as part of a broader transformation, creating genuine opportunities to have impact and influence.
The Role
You will join a growing analytics engineering team and play a critical role in the Lakehouse environment. Your responsibilities will include:
• Leading the design and delivery of curated, analytics ready data models within the Lakehouse
• Owning the transformation from enriched to curated datasets, enabling trusted reporting and insight
• Developing and maintaining robust SQL and PySpark transformation pipelines in Databricks
• Embedding data quality, testing, reliability and performance into the curated layer
• Working closely with data engineers, BI teams and business stakeholders to translate complex requirements
• Providing technical leadership, mentoring and setting modelling and engineering standards
• Contributing to CI/CD processes and wider engineering best practice across the data platform
Your Skills and Experience
• Strong commercial experience as an Analytics Engineer within a modern data platform
• Excellent data modelling capability, including dimensional and semantic modelling
• Advanced SQL skills and strong hands on experience with PySpark
• Experience working with Databricks and Lakehouse architectures
• A solid grounding in engineering best practices, testing and data quality
• Confidence mentoring others and taking ownership of technical decisions
• An engineering mindset applied to analytics, rather than an analyst focused role
What They Offer
• Flexible by choice working, supporting different schedules and work life balance
• A 35 hour working week within a supportive, inclusive engineering culture
• The opportunity to shape a critical data programme with real visibility across the business
How to Apply
If you are a Senior Analytics Engineer looking to make an impact in a growing Lakehouse environment, apply now to find out more.
Apr 21, 2026
Full time
Harnham - Data & Analytics Recruitment
Analytics Engineer
Harnham - Data & Analytics Recruitment
Analytics Engineer - 12 month fixed term contract
UK Remote
£67,000 plus benefits
This Analytics Engineer role stands out as a chance to play a key part in a large scale Lakehouse programme, sitting at the intersection of engineering, analytics and the wider business. You will take real ownership of curated data models, shape how data is structured and served across the organisation, and influence best practice as the analytics engineering capability continues to grow.
The Company
They are a large, well established UK organisation with a strong reputation for combining technical excellence with a people first culture. Data and analytics are a strategic priority, with ongoing investment into a modern cloud based data platform. Engineering teams are expanding as part of a broader transformation, creating genuine opportunities to have impact and influence.
The Role
You will join a growing analytics engineering team and play a critical role in the Lakehouse environment. Your responsibilities will include:
• Leading the design and delivery of curated, analytics ready data models within the Lakehouse
• Owning the transformation from enriched to curated datasets, enabling trusted reporting and insight
• Developing and maintaining robust SQL and PySpark transformation pipelines in Databricks
• Embedding data quality, testing, reliability and performance into the curated layer
• Working closely with data engineers, BI teams and business stakeholders to translate complex requirements
• Providing technical leadership, mentoring and setting modelling and engineering standards
• Contributing to CI/CD processes and wider engineering best practice across the data platform
Your Skills and Experience
• Strong commercial experience as an Analytics Engineer within a modern data platform
• Excellent data modelling capability, including dimensional and semantic modelling
• Advanced SQL skills and strong hands on experience with PySpark
• Experience working with Databricks and Lakehouse architectures
• A solid grounding in engineering best practices, testing and data quality
• Confidence mentoring others and taking ownership of technical decisions
• An engineering mindset applied to analytics, rather than an analyst focused role
What They Offer
• Flexible by choice working, supporting different schedules and work life balance
• A 35 hour working week within a supportive, inclusive engineering culture
• The opportunity to shape a critical data programme with real visibility across the business
How to Apply
If you are an Analytics Engineer looking to make an impact in a growing Lakehouse environment, apply now to find out more.
Apr 21, 2026
Full time
Harnham - Data & Analytics Recruitment
Senior Analytics Engineer (12 month FTC)
Harnham - Data & Analytics Recruitment Edinburgh, Midlothian
Senior Analytics Engineer - 12 month fixed term contract
UK Remote
£78,000 plus benefits
This Senior Analytics Engineer role stands out as a chance to play a key part in a large scale Lakehouse programme, sitting at the intersection of engineering, analytics and the wider business. You will take real ownership of curated data models, shape how data is structured and served across the organisation, and influence best practice as the analytics engineering capability continues to grow.
The Company
They are a large, well established UK organisation with a strong reputation for combining technical excellence with a people first culture. Data and analytics are a strategic priority, with ongoing investment into a modern cloud based data platform. Engineering teams are expanding as part of a broader transformation, creating genuine opportunities to have impact and influence.
The Role
You will join a growing analytics engineering team and play a critical role in the Lakehouse environment. Your responsibilities will include:
• Leading the design and delivery of curated, analytics ready data models within the Lakehouse
• Owning the transformation from enriched to curated datasets, enabling trusted reporting and insight
• Developing and maintaining robust SQL and PySpark transformation pipelines in Databricks
• Embedding data quality, testing, reliability and performance into the curated layer
• Working closely with data engineers, BI teams and business stakeholders to translate complex requirements
• Providing technical leadership, mentoring and setting modelling and engineering standards
• Contributing to CI/CD processes and wider engineering best practice across the data platform
Your Skills and Experience
• Strong commercial experience as an Analytics Engineer within a modern data platform
• Excellent data modelling capability, including dimensional and semantic modelling
• Advanced SQL skills and strong hands on experience with PySpark
• Experience working with Databricks and Lakehouse architectures
• A solid grounding in engineering best practices, testing and data quality
• Confidence mentoring others and taking ownership of technical decisions
• An engineering mindset applied to analytics, rather than an analyst focused role
What They Offer
• Flexible by choice working, supporting different schedules and work life balance
• A 35 hour working week within a supportive, inclusive engineering culture
• The opportunity to shape a critical data programme with real visibility across the business
How to Apply
If you are a Senior Analytics Engineer looking to make an impact in a growing Lakehouse environment, apply now to find out more.
Apr 21, 2026
Full time
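The enriched-to-curated responsibility described in this listing is typical medallion-style Lakehouse work: gate on data quality, deduplicate on a business key, and project an analytics-ready shape. As a minimal illustrative sketch in plain Python (standing in for PySpark; the column names, keys, and quality rules are hypothetical, not from the advert):

```python
# Sketch of an enriched -> curated transformation:
# validate, deduplicate on the business key, then project only
# the columns the curated layer serves. Names are illustrative.

def curate_orders(enriched_rows):
    """Turn enriched order records into a curated, analytics-ready set."""
    latest = {}
    for row in enriched_rows:
        # Data-quality gate: drop rows missing the key or with bad amounts.
        if row.get("order_id") is None or row.get("amount", -1) < 0:
            continue
        # Deduplicate on the business key, keeping the newest version.
        prev = latest.get(row["order_id"])
        if prev is None or row["updated_at"] > prev["updated_at"]:
            latest[row["order_id"]] = row
    # Project to the curated schema (no raw payloads downstream).
    return [
        {"order_id": r["order_id"], "amount": r["amount"], "updated_at": r["updated_at"]}
        for r in latest.values()
    ]

rows = [
    {"order_id": 1, "amount": 10.0, "updated_at": 1, "raw_payload": "..."},
    {"order_id": 1, "amount": 12.5, "updated_at": 2, "raw_payload": "..."},
    {"order_id": None, "amount": 5.0, "updated_at": 1},  # fails quality gate
]
print(curate_orders(rows))
```

In Databricks the same shape would typically be a PySpark job writing a Delta table, with the quality rules expressed as expectations or tests rather than inline conditionals.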
Enterprise Customer Success Manager
Cerebras
Overview
The Role: The Enterprise CSM is a unique blend of technical expert and strategic relationship builder. You will drive adoption within a portfolio of Fortune 5000 accounts, moving beyond traditional BI to help customers deploy AI Agents and Embedded Analytics. You'll be as comfortable discussing API authentication with a developer as you are discussing ROI with a CDO.

What You'll Do
  • Account Strategy: Be part of a focused team managing multiple Fortune 5000 accounts, responsible for driving adoption, tying usage to business problems, and building expansion opportunities through passive selling.
  • Champion Agentic AI: Partner with customers to move from dashboards to agents, helping them leverage ThoughtSpot Agents and LLM-based workflows to automate data insights.
  • Architect & Advise: Guide technical stakeholders through the development lifecycle of building high-performance data apps using our APIs and SDKs, while ensuring their data stack (Snowflake/Databricks/BigQuery) is optimized for AI-driven search.
  • Voice of the Customer: Act as the primary technical point of contact, communicating requirements and use cases in a way that is actionable for ThoughtSpot's Product, Engineering, and Marketing teams.
  • Relationship Management: Foster robust relationships through proactive champion building, acting as the bridge between human business needs and complex data technicalities.
  • Bridge Business and Technology: Translate customer business goals into technical requirements and, conversely, explain the business value of technical features to stakeholders.
  • Technical Enablement: Run advanced workshops and live demos that showcase the "Art of the Possible" with AI, embedding, and agentic analytics.

What Sets You Apart
  • A Consultative Problem Solver: You have a knack for understanding complex business challenges and prescribing elegant technical solutions.
  • Technically Curious: You have a deep passion for the data space and are constantly learning about new technologies like LLMs and Generative AI.
  • Incredible Communicator: You can command a room of data architects and then pivot to explain a technical concept in simple terms to a business leader.
  • Proactive & Eager to Help: You are a natural problem solver who enjoys diving in to help customers with their initial technical hurdles.
  • Language Skills: Fluent in English and capable of speaking European languages (e.g., French, Spanish, German) in a business environment.

What You Bring
  • 5+ years in a customer-facing technical role (Technical CSM, Solutions Architect, or Sales Engineer) within the Data/SaaS space.
  • Analytics & AI Depth: Strong knowledge of the modern data stack (Snowflake/BigQuery/Databricks) and an understanding of LLM-based applications or AI Agents.
  • Developer Literacy: Proficiency in SQL and familiarity with JavaScript/TypeScript frameworks (React, Angular, or Vue) for supporting embedded use cases.
  • Integration Knowledge: Comfort discussing REST APIs, webhooks, and security protocols such as SAML/OIDC.
  • The "Consultative Edge": Ability to translate complex technical "how-to" into strategic "why-it-matters" for executive stakeholders.
  • Education: Master's/Bachelor's degree preferred but not required.

Mandatory and Required Skills for All ThoughtSpot Roles
Spotters are expected to demonstrate AI literacy and workflow integration, including the ability to:
  • Comfortably and confidently integrate artificial intelligence into daily workflows to increase productivity and quality.
  • Leverage AI tools (industry-leading LLMs) hands-on to increase productivity, automate routine tasks, and improve work quality.
  • Use AI for research, content creation, and document summarization while maintaining ownership of judgment and final decisions.
  • Write effective prompts to obtain accurate and creative results from AI tools.

Spotters are also expected to exemplify these key traits and AI mindset:
  • Curiosity in exploring new AI tools
  • Adaptability to quickly learn and implement new, emerging AI technologies
  • Critical thinking to identify when AI should be used versus when human judgement is necessary
This combination of curiosity, adaptability, and discernment defines the AI mindset and is required for every role at ThoughtSpot.

AI Mindset for All Spotters
At ThoughtSpot, we believe AI is a necessary and essential part of how we work. Every role, across every team, is expected to be fluent and comfortable with using AI to do their best work. All Spotters are expected to experiment with ThoughtSpot's AI tools and leading industry LLMs to streamline workflows, enhance output, and uncover new insights. Whether drafting content, analyzing data, or summarizing documents, AI is a daily partner. We value curiosity, openness to learning, and thoughtful application of AI to create real value. Training and resources are provided so every Spotter can confidently create with AI.

Hybrid Work at ThoughtSpot
This office-assigned role is available as a hybrid position, reporting to the office in the UK - London. Spotters assigned to an office are encouraged to experience the energy of their local office with an in-office expectation of 2-3 days per week. This approach balances the benefits of in-person collaboration and peer learning with the flexibility needed by individuals and teams.

ThoughtSpot for All
At ThoughtSpot, diverse teams build better products. Complex data problems need many perspectives, not just one. We welcome different backgrounds, identities, and experiences, and we work to create a place where everyone can be themselves and do their best work. If this role excites you and you believe you're a strong match, we encourage you to apply.

What Makes ThoughtSpot a Great Place to Work?
ThoughtSpot is the Agentic Analytics Platform that empowers every enterprise to transform insights into action, on a mission to make the world more fact driven. We hire people with unique identities, backgrounds, and perspectives; this balance-for-the-better philosophy is key to our success. When paired with our culture of Trust, Customer Obsession, Innovation and Intensity, ThoughtSpot cultivates a respectful culture that pushes norms to create world-class products. If you're excited by the opportunity to work with some of the brightest minds in the business and make your mark on a truly innovative company, we invite you to read more about our mission, and apply to the role that's right for you.

About ThoughtSpot
The world's most innovative companies turn to ThoughtSpot's AI-Powered Analytics to put data in the hands of everyone, from the C-suite to the frontline. With simple, natural language search and AI, anyone can ask questions, discover insights, and act with confidence. Unlike legacy tools that sacrifice performance for complexity, ThoughtSpot is intuitively designed for every business user while being built to handle the most complex, large-scale data, wherever it resides. This unique combination of speed and simplicity is why enterprise leaders trust ThoughtSpot to transform decision-making into a truly data-driven culture. At ThoughtSpot, we're a curious, data-driven bunch. We believe the world works better when everyone has access to facts. That's why we build products that make asking and answering data questions as natural as having a conversation.
Apr 21, 2026
Full time
Nexere Consulting Limited
Lead Data Platform Engineer - Databricks - IaC - Terraform - Azure Data Factory - Data Lakehouse
Lead Data Platform Engineer - Databricks - IaC - Terraform - Azure Data Factory - Data Lakehouse

The Data Platform Engineer designs, develops, automates, and maintains secure, scalable, and compliant data platforms that enable the firm to efficiently manage, analyse, and utilise data. The role ensures that data solutions are robust and reliable while meeting regulatory obligations and safeguarding client confidentiality.

Key Responsibilities
  • Design and architect scalable, secure, and compliant data platforms and solutions, producing technical documentation and securing approvals through governance bodies such as Architecture Review Boards.
  • Build and deliver robust data solutions using Databricks, PySpark, Spark SQL, Azure Data Factory, and Azure services.
  • Develop APIs and write efficient Python, PySpark, and SQL code to support data integration, processing, and automation.
  • Implement and manage CI/CD pipelines and automated deployments using Azure DevOps to enable reliable releases across environments.
  • Develop and maintain infrastructure as code (e.g. Terraform, ARM) to provision and manage cloud resources, including ADF pipelines, Databricks assets, and Unity Catalog components.
  • Monitor, troubleshoot, and optimise data platform performance, reliability, and costs, identifying bottlenecks and recommending improvements.
  • Create dashboards and observability tools to report on platform performance, usage, incidents, and operational KPIs.

Knowledge, Skills & Experience
  • Degree in Computer Science, Data Engineering, or a related field.
  • Proven experience designing and building cloud-based data platforms, ideally within Azure.
  • Strong hands-on expertise with Databricks, PySpark, Spark SQL, and Azure Data Factory.
  • Solid understanding of Data Lakehouse architecture and modern data platform design.
  • Proficiency in Python for data engineering, automation, and data processing.
  • Experience developing and integrating REST APIs for data services.
  • Strong DevOps experience, including CI/CD, automated testing, and release management for data platforms.
  • Experience with Infrastructure as Code tools such as Terraform or ARM templates.
  • Knowledge of data modelling, ETL/ELT pipelines, and data warehousing concepts.
  • Familiarity with monitoring, logging, and alerting tools (e.g. Azure Monitor).

Desirable
  • Experience with additional Azure services (e.g. Fabric, Azure Functions, Logic Apps).
  • Knowledge of cloud cost optimisation for data platforms.
  • Understanding of data governance and regulatory compliance (e.g. GDPR).
  • Experience working in regulated or professional services environments.
Apr 21, 2026
Full time
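Several of the skills this role lists (ETL/ELT pipelines, reliable releases, data warehousing concepts) revolve around loads being idempotent: re-running the same batch must not duplicate or corrupt data, which is usually achieved with merge/upsert semantics. A minimal sketch using Python's built-in sqlite3 (a Databricks implementation would use Delta Lake's MERGE INTO instead; the table and column names here are hypothetical):

```python
import sqlite3

# Sketch of an idempotent ELT load: re-running the same batch leaves
# the warehouse table unchanged (upsert, not blind insert), and only
# newer versions of a record overwrite existing rows.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE dim_customer ("
    "customer_id INTEGER PRIMARY KEY, name TEXT, updated_at INTEGER)"
)

def load_batch(rows):
    conn.executemany(
        """
        INSERT INTO dim_customer (customer_id, name, updated_at)
        VALUES (?, ?, ?)
        ON CONFLICT(customer_id) DO UPDATE SET
            name = excluded.name,
            updated_at = excluded.updated_at
        WHERE excluded.updated_at > dim_customer.updated_at
        """,
        rows,
    )
    conn.commit()

batch = [(1, "Acme Ltd", 100), (2, "Globex", 100)]
load_batch(batch)
load_batch(batch)                        # re-run: no duplicates, same state
load_batch([(1, "Acme Limited", 200)])   # newer version wins
print(conn.execute(
    "SELECT customer_id, name FROM dim_customer ORDER BY customer_id"
).fetchall())
```

The `WHERE excluded.updated_at > ...` clause is what makes late or replayed batches safe: an older version of a row can never clobber a newer one.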
