Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

8 jobs found

Current search: azure data architect adf
Head Resourcing
Data Engineer
Mid-Level Data Engineer (Azure / Databricks)
NO VISA REQUIREMENTS
Location: Glasgow (3+ days)
Reports to: Head of IT

My client is undergoing a major transformation of their entire data landscape, migrating from legacy systems and manual reporting to a modern Azure + Databricks Lakehouse. They are building a secure, automated, enterprise-grade platform powered by Lakeflow Declarative Pipelines, Unity Catalog and Azure Data Factory. They are looking for a Mid-Level Data Engineer to help deliver high-quality pipelines and curated datasets used across Finance, Operations, Sales, Customer Care and Logistics.

What You'll Do

Lakehouse Engineering (Azure + Databricks)
  • Build and maintain scalable ELT pipelines using Lakeflow Declarative Pipelines, PySpark and Spark SQL.
  • Work within a Medallion architecture (Bronze → Silver → Gold) to deliver reliable, high-quality datasets.
  • Ingest data from multiple sources including ChargeBee, legacy operational files, SharePoint, SFTP, SQL, REST and GraphQL APIs using Azure Data Factory and metadata-driven patterns.
  • Apply data quality and validation rules using Lakeflow Declarative Pipelines expectations.

Curated Layers & Data Modelling
  • Develop clean, conforming Silver & Gold layers aligned to enterprise subject areas.
  • Contribute to dimensional modelling (star schemas), harmonisation logic, SCDs and business marts powering Power BI datasets.
  • Apply governance, lineage and permissioning through Unity Catalog.

Orchestration & Observability
  • Use Lakeflow Workflows and ADF to orchestrate and optimise ingestion, transformation and scheduled jobs.
  • Help implement monitoring, alerting, SLAs/SLIs and runbooks to support production reliability.
  • Assist in performance tuning and cost optimisation.

DevOps & Platform Engineering
  • Contribute to CI/CD pipelines in Azure DevOps to automate deployment of notebooks, Lakeflow Declarative Pipelines, SQL models and ADF assets.
  • Support secure deployment patterns using private endpoints, managed identities and Key Vault.
  • Participate in code reviews and help improve engineering practices.

Collaboration & Delivery
  • Work with BI and Analytics teams to deliver curated datasets that power dashboards across the business.
  • Contribute to architectural discussions and the ongoing data platform roadmap.

Tech You'll Use
  • Databricks: Lakeflow Declarative Pipelines, Lakeflow Workflows, Unity Catalog, Delta Lake
  • Azure: ADLS Gen2, Data Factory, Event Hubs (optional), Key Vault, private endpoints
  • Languages: PySpark, Spark SQL, Python, Git
  • DevOps: Azure DevOps Repos & Pipelines, CI/CD
  • Analytics: Power BI, Fabric

What We're Looking For

Experience
  • Commercial, proven data engineering experience.
  • Hands-on experience delivering solutions on Azure + Databricks.
  • Strong PySpark and Spark SQL skills within distributed compute environments.
  • Experience working in a Lakehouse/Medallion architecture with Delta Lake.
  • Understanding of dimensional modelling (Kimball), including SCD Type 1/2.
  • Exposure to operational concepts such as monitoring, retries, idempotency and backfills.

Mindset
  • Keen to grow within a modern Azure Data Platform environment.
  • Comfortable with Git, CI/CD and modern engineering workflows.
  • Able to communicate technical concepts clearly to non-technical stakeholders.
  • Quality-driven, collaborative and proactive.

Nice to Have
  • Databricks Certified Data Engineer Associate.
  • Experience with streaming ingestion (Auto Loader, event streams, watermarking).
  • Subscription/entitlement modelling (e.g., ChargeBee).
  • Unity Catalog advanced security (RLS, PII governance).
  • Terraform or Bicep for IaC.
  • Fabric Semantic Models or Direct Lake optimisation experience.

Why Join?
  • Opportunity to shape and build a modern enterprise Lakehouse platform.
  • Hands-on work with Azure, Databricks and leading-edge engineering practices.
  • Real progression opportunities within a growing data function.
  • Direct impact across multiple business domains.
Dec 17, 2025
Full time
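The listing above leans on two ideas worth unpacking: a Medallion flow (Bronze → Silver → Gold) and declarative data-quality "expectations". The sketch below illustrates both concepts in plain Python; it is not the client's pipeline, and real Lakeflow Declarative Pipelines run on Databricks with their own API. All names and sample data here are hypothetical.

```python
# Illustrative sketch of a Medallion-style promotion with declarative
# data-quality expectations, in plain Python. Hypothetical names and data.

bronze = [  # raw ingested rows, warts and all
    {"order_id": "1001", "amount": "49.99", "region": "UK"},
    {"order_id": None,   "amount": "12.50", "region": "UK"},   # missing key
    {"order_id": "1003", "amount": "-5.00", "region": "DE"},   # bad amount
]

# Declarative expectations: rule name -> predicate. Rows failing any rule
# are dropped, loosely analogous to pipeline "expectations".
EXPECTATIONS = {
    "valid_order_id": lambda r: r["order_id"] is not None,
    "positive_amount": lambda r: float(r["amount"]) > 0,
}

def to_silver(rows):
    """Bronze -> Silver: apply expectations and type conversions."""
    silver = []
    for r in rows:
        if all(rule(r) for rule in EXPECTATIONS.values()):
            silver.append({**r, "amount": float(r["amount"])})
    return silver

def to_gold(rows):
    """Silver -> Gold: aggregate a business-ready mart (revenue by region)."""
    gold = {}
    for r in rows:
        gold[r["region"]] = gold.get(r["region"], 0.0) + r["amount"]
    return gold

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # -> {'UK': 49.99}
```

Only the row passing every expectation is promoted; the Gold layer then serves a small, trusted aggregate, which is the shape of dataset the role's Power BI consumers would sit on.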
Randstad Technologies Recruitment
Data Architect (Insurance Domain)
Job Title: Data Architect (Insurance Domain)
Location: London, UK (hybrid role)
Experience level: 15 years, with at least 5 years in the Azure ecosystem

Role: We are seeking a seasoned Data Architect to lead the design and implementation of scalable data solutions for a strategic insurance project. The ideal candidate will have deep expertise in Azure cloud services, Azure Data Factory and Databricks, with a strong understanding of data modeling, data integration and analytics in the insurance domain.

Key Responsibilities
  • Architect and design end-to-end data solutions on Azure for insurance-related data workflows
  • Lead data ingestion, transformation and orchestration using ADF and Databricks
  • Collaborate with business stakeholders to understand data requirements and translate them into technical solutions
  • Ensure data quality, governance and security compliance across the data lifecycle
  • Optimize performance and cost efficiency of data pipelines and storage
  • Provide technical leadership and mentoring to data engineers and developers

Mandatory Skillset
  • Azure Cloud Services: strong experience with Azure Data Lake, Azure Synapse, Azure SQL and Azure Storage
  • Azure Data Factory (ADF): expertise in building and managing complex data pipelines
  • Databricks: hands-on experience with Spark-based data processing, notebooks and ML workflows
  • Data Modeling: proficiency in conceptual, logical and physical data modeling
  • SQL/Python: advanced skills for data manipulation and transformation
  • Insurance Domain Knowledge: understanding of insurance data structures (claims, policy, underwriting) and regulatory requirements

Preferred Skillset
  • Power BI: experience building dashboards and visualizations
  • Data Governance Tools: familiarity with tools like Purview or Collibra
  • Machine Learning: exposure to ML model deployment and monitoring in Databricks
  • CI/CD: knowledge of DevOps practices for data pipelines
  • Certifications: Azure Data Engineer or Azure Solutions Architect

Mandatory Skills: Python for Data, Java, Python, Scala, Snowflake, Azure Blob, Azure Data Factory, Azure Functions, Azure SQL, Azure Synapse Analytics, Azure Data Lake, ANSI-SQL, Databricks, HDInsight

If you're excited about this role then we would like to hear from you! Please apply with a copy of your CV or send it to Prasanna com and let's start the conversation!

Randstad Technologies Ltd is a leading specialist recruitment business for the IT & Engineering industries. Please note that due to a high level of applications, we can only respond to applicants whose skills & qualifications are suitable for this position. No terminology in this advert is intended to discriminate against any of the protected characteristics that fall under the Equality Act 2010. For the purposes of the Conduct Regulations 2003, when advertising permanent vacancies we are acting as an Employment Agency, and when advertising temporary/contract vacancies we are acting as an Employment Business.
Dec 14, 2025
Full time
Data Engineer
Makutu City, Derby
About Us
Makutu designs, builds and supports Microsoft Azure cloud data platforms. We are a Microsoft Solutions Partner (Azure Data & AI) and are busy building a talented team with the skills to deliver industry-leading data platforms for our customers.

The Role
The Data Engineer role is key to building and growing the in-house technical team at Makutu. The role will provide the successful applicant with the opportunity for significant career development while working with a range of large businesses to whom data is critical to their success. Working as part of the team and with the customer, you'll require excellent written and verbal English language and communication skills. Big growth plans are in place to build a broader and deeper technical capability with a focus on the Microsoft Azure technology stack. The position of Data Engineer is a key role in the wider capability of our team. Occasional visits to our Head Office and customer sites will be required.

Key responsibilities:
  • Identify, design and implement working practices across data pipelines, data architectures, testing and deployment
  • Understand complex business requirements and provide solutions to business problems
  • Understand modern data architecture approaches and associated cloud-focused solutions
  • Define data engineering best practice and share it across the organisation
  • Collaborate with the wider team on data strategy

Skills and experience:
  • A relevant Bachelor's degree in Computing, Mathematics, Data Science or similar (ideal but not essential)
  • A Master's degree in Data Science (ideal but not essential)
  • Experience building data pipelines with modern practices, including cloud-native technologies, DevOps practices, CI/CD pipelines and agile delivery
  • Experience with data modelling, data warehousing and data lake solutions
  • Able to communicate effectively with senior stakeholders

Successful candidates will likely possess Azure certifications such as DP-600 and/or DP-700. Applicants will also have experience working with some of the following technologies: Power BI, Power Apps, Blob Storage, Synapse, Azure Data Factory (ADF), IoT Hub, SQL Server, Azure Data Lake Storage, Azure Databricks, Purview, Power Platform, Python.
Dec 12, 2025
Full time
Integration Developer - Azure, Logic/Function Apps, API, SQL, C#
Informed Recruitment City, Manchester
Are you an experienced contract Integrations Developer looking to be part of an expanding development function? Do you have Azure Integration, Logic Apps, Function Apps, C#, and API development skills? Let Informed Recruitment help you to achieve your potential with an exciting opportunity for a Systems Developer to influence the development of high-quality and robust systems. As a specialist provider of resource to the Property & Associated Technology markets, we are delighted to be partnering with a social enterprise offering you the opportunity to make a difference and take responsibility as part of a modern environment championing continual improvement. This role is initially offered on a 3-8 month basis, inside IR35, with scope to run on.

The purpose of the role will be to design, develop, and update the business systems required to support business-as-usual services as well as the change and transformation team. Your day-to-day responsibilities will include the analysis of business requirements; development of functional specifications; configuration and development of code; unit testing; ensuring all system developments follow the overarching design; quality assurance and code reviews; and documentation.

Must Have
  • Strong and successful track record as an integration developer
  • Azure Integration
  • Azure Logic Apps
  • Azure Function Apps
  • Azure DevOps
  • C# development
  • SQL or PL/SQL scripting
  • APIs
  • Experience of one or more software/technical delivery approaches such as Waterfall, Agile, Scrum, DevOps, etc.
  • Experience in analysing requirements, system design documentation, developing objects/code, unit testing, and deployment

Nice to Have
  • ADF/Azure Data Factory
  • Power Platform
  • ERP solutions such as MS Dynamics or Oracle Cloud
  • SQL Server or Oracle RDBMS
  • Knowledge of architecture principles, design patterns, coding standards and testing
  • Relevant certification

As an individual you will have excellent problem-solving skills and attention to detail, whilst also being a self-starter comfortable with taking responsibility for delivery. You will have excellent interpersonal skills, the ability to think on your feet and be ultimately goal-orientated. The role will be predominantly home based, with one or two days in the office required each week in Manchester. On offer is a 3-8 month contract, inside IR35, with scope to run on. Interview slots are available on a case-by-case basis, so please apply without delay.

Informed Recruitment Limited acts as an Employment Business in respect to this vacancy as defined by the Employment Agencies Act. We are an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, colour, religion, gender, national origin, disability status, or any other basis protected by appropriate law. All hiring decisions are made based on merit, competence, and business need. As defined under the General Data Protection Regulation (GDPR), Informed Recruitment is a Data Controller and a Data Processor, and our legal basis for processing your personal data is 'Legitimate Interests'. You have the right to object to us processing your data in this way. For more information about this, your rights, and our approach to Data Protection and Privacy, please visit our website.
Dec 12, 2025
Contractor
Senior Software Engineer (Contract)
Fruition Group Bradford, Yorkshire
Role: Contract Senior Software Engineer - Backend
Rate: c. £550 per day, Outside IR35
Location: Bradford - hybrid (2 days on site)
Length: 3-6 months

About Us: We are working with a growing, forward-thinking technology company dedicated to building scalable, reliable, and high-performing solutions for its customers.

Role Overview: As a Senior Software Engineer, you will be responsible for designing, developing, and maintaining server-side applications and services. You'll work closely with cross-functional teams to ensure our systems are performant, secure, and deliver a great user experience.

Key Responsibilities:
  • Design, implement, and maintain back-end services, APIs, and databases
  • Collaborate with product managers, front-end engineers, and other stakeholders to deliver features end-to-end
  • Ensure systems are scalable, reliable, and maintainable
  • Write clean, testable, and efficient code following best practices

Requirements:
  • Strong experience with React (open to other front-end experience)
  • Strong programming skills in Node.js
  • Solid understanding of RESTful APIs, microservices, and event-driven architecture
  • Hands-on experience with relational and/or NoSQL databases
  • Familiarity with cloud platforms (AWS, Azure, GCP) and containerisation (Docker, Kubernetes)
  • Knowledge of software design principles, testing methodologies, and version control (Git)

We are an equal opportunities employer and welcome applications from all suitably qualified persons regardless of their race, sex, disability, religion/belief, sexual orientation or age.
Oct 07, 2025
Contractor
Hays
Data Architect - Data Vault, Snowflake, ADF
Data Architect - Data Vault, Snowflake, ADF
£Market rate (Inside IR35) | London / Hybrid | 6 months

My client is an instantly recognisable Insurance brand who urgently require a Data Architect with expertise in Data Vault, Data Modelling, Data Engineering Design, a strong Azure stack (including Data Factory (ADF), Data Lake Storage, SQL, Azure ML and Power BI), Snowflake and enterprise-scale data sources. Click apply for full job details.
Oct 06, 2025
Contractor
Hays Specialist Recruitment
Data Architect - Data Vault, Snowflake, ADF
Data Architect - Data Vault, Snowflake, ADF
£Market rate (Inside IR35) | London / Hybrid | 6 months

My client is an instantly recognisable Insurance brand who urgently require a Data Architect with expertise in Data Vault, Data Modelling, Data Engineering Design, a strong Azure stack (including Data Factory (ADF), Data Lake Storage, SQL, Azure ML and Power BI), Snowflake and enterprise-scale data sources to join a business-critical programme ASAP.

Key Requirements:
  • Proven expertise in Data Architecture on large, complex data programmes
  • Strong data modelling experience (especially conceptual data models)
  • Demonstrable background in data engineering design processes (ETL, ELT), with good knowledge of Data Vault, Inmon and Kimball
  • Strong knowledge of the MS Azure stack (including Data Factory (ADF), Data Lake Storage, SQL, Azure ML and Power BI)
  • Proven experience with Snowflake architecture, including DevOps integration and CI/CD practices
  • Previous experience with large, enterprise-scale data sets and sources
  • Strong understanding of data governance principles (data stewardship, compliance guardrails, data quality, lineage, life-cycle management and security)
  • Excellent communication and stakeholder management skills
  • Flexible approach to hybrid working (attending workshops, possibly across different locations if needed)

Nice to have:
  • Insurance industry experience
  • Previous work on data strategies
  • Exposure to collaborative modelling tools such as Ellie.ai
  • Immediate availability

If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now. Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers which can be found on our website.
Oct 03, 2025
Contractor
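Several of the roles above ask for Data Vault knowledge alongside Kimball and Inmon. For readers unfamiliar with the pattern, here is a minimal sketch of the hub/satellite idea in plain Python: a Hub stores one row per business key (identified by a deterministic hash), while a Satellite stores descriptive attributes over time. All names and data are hypothetical illustrations, not any client's model; production Data Vaults live in the warehouse (e.g. Snowflake), not in application code.

```python
# Illustrative Data Vault sketch: Hub + Satellite with hashed business keys.
# Hypothetical names and data.
import hashlib
from datetime import datetime, timezone

def hash_key(business_key: str) -> str:
    """Deterministic surrogate key: normalise, then hash the business key."""
    return hashlib.sha256(business_key.upper().strip().encode()).hexdigest()

hub_policy = {}   # Hub: hash key -> business key, one row per key, ever
sat_policy = []   # Satellite: descriptive attributes, one row per change

def load_policy(policy_number: str, attrs: dict) -> None:
    """Insert-only load: hubs deduplicate, satellites append history."""
    hk = hash_key(policy_number)
    if hk not in hub_policy:
        hub_policy[hk] = policy_number
    sat_policy.append({"hk": hk, "load_ts": datetime.now(timezone.utc), **attrs})

load_policy("POL-001", {"status": "active", "premium": 320.0})
load_policy("POL-001", {"status": "lapsed", "premium": 320.0})  # same hub, new satellite row
```

Loading the same policy twice yields one hub row and two satellite rows, which is the insert-only, full-history property that makes Data Vault attractive for regulated domains such as insurance.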
Hays Specialist Recruitment Limited
Data Solution Architect
Hays Specialist Recruitment Limited
Data Solution Architect - Azure, ADF, Snowflake, Insurance
£Market rate (Inside IR35) | London/Hybrid | 6 months

My client is an instantly recognisable insurance brand that urgently requires a Data Solution Architect with expertise in data modelling, data engineering design, the Azure stack (including Data Factory (ADF), Data Lake Storage, SQL, Azure ML and Power BI), Snowflake and enterprise-scale data sources to join a business-critical programme ASAP.

Key Requirements:
  • Proven expertise in data solution architecture on large, complex data programmes
  • Strong data modelling experience (especially conceptual data models)
  • Demonstrable background in data engineering design processes (ETL, ELT), with good knowledge of Data Vault, Inmon and Kimball
  • Strong knowledge of the MS Azure stack (including Data Factory (ADF), Data Lake Storage, SQL, Azure ML and Power BI)
  • Proven experience with Snowflake architecture, including DevOps integration and CI/CD practices
  • Previous experience with large, enterprise-scale data sets and sources
  • Strong understanding of data governance principles (data stewardship, compliance guardrails, data quality, lineage, lifecycle management and security)
  • Excellent communication and stakeholder management skills
  • Flexible approach to hybrid working (attending workshops, possibly across different locations if needed)

Nice to have:
  • Insurance industry experience
  • Previous work on data strategies
  • Exposure to collaborative modelling tools such as Ellie.ai
  • Immediate availability

If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now. If this job isn't quite right for you but you are looking for a new position, please contact us for a confidential discussion about your career. Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers.
By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers, which can be found at hays.co.uk
Sep 22, 2025
Full time


© 2008-2025 Jobs Hiring Near Me