Power BI Report Engineer (Azure / Databricks)
Glasgow based only. 4 days onsite. No visa restrictions please.

Are you a Power BI specialist who loves clean, governed data and high-performance semantic models? Do you want to work with a business that's rebuilding its entire BI estate the right way: proper Lakehouse architecture, curated Gold tables, PBIP, Git, and end-to-end governance? If so, this is one of the most modern, forward-thinking Power BI engineering roles in Scotland.

Our Glasgow-based client is transforming its reporting platform using Azure + Databricks, with Power BI sitting on top of a fully curated Gold Layer. They develop everything using PBIP + Git + Tabular Editor 3, and semantic modelling is treated as a first-class engineering discipline. This is your chance to own the creation of high-quality datasets and dashboards used across Operations, Finance, Sales, Logistics and Customer Care, turning trusted Lakehouse data into insights the business relies on every day.

Why This Role Exists
To turn clean, curated Gold Lakehouse data into trusted, enterprise-grade Power BI insights. You'll own semantic modelling, dataset optimisation, governance and best-practice delivery across a modern BI ecosystem.

What You'll Do

Semantic Modelling with PBIP + Git
- Build and maintain enterprise PBIP datasets fully version-controlled in Git.
- Use Tabular Editor 3 for DAX, metadata modelling, calc groups and object governance.
- Manage branching, pull requests and releases via Azure DevOps.

Lakehouse-Aligned Reporting (Gold Layer Only)
- Develop semantic models exclusively on top of curated Gold Databricks tables.
- Work closely with Data Engineering on schema design and contract-first modelling.
- Maintain consistent dimensional modelling aligned to the enterprise Bus Matrix.

High-Performance Power BI Engineering
- Optimise performance: aggregations, composite models, incremental refresh, DirectQuery/Import strategy.
- Tune Databricks SQL Warehouse queries for speed and cost efficiency.
- Monitor PPU capacity performance, refresh reliability and dataset health (a monitoring sketch follows this posting).

Governance, Security & Standards
- Implement RLS/OLS, naming conventions, KPI definitions and calc groups.
- Apply dataset certification, endorsements and governance metadata.
- Align semantic models with lineage and security policies across the Azure/Databricks estate.

Lifecycle, Release & Best-Practice Delivery
- Use Power BI Deployment Pipelines for Dev → UAT → Prod releases.
- Enforce semantic CI/CD patterns with PBIP + Git + Tabular Editor.
- Build reusable, certified datasets and dataflows enabling scalable self-service BI.

Adoption, UX & Collaboration
- Design intuitive dashboards with consistent UX across multiple business functions.
- Support BI adoption through training, documentation and best-practice guidance.
- Use telemetry to track usage and performance and to improve user experience.

What We're Looking For

Required Certifications
To meet BI engineering standards, candidates must hold:
- PL-300: Power BI Data Analyst Associate
- DP-600: Fabric Analytics Engineer Associate

Skills & Experience
- Commercial experience building enterprise Power BI datasets and dashboards.
- Strong DAX and semantic modelling expertise (calc groups, conformed dimensions, role-playing dimensions).
- Strong SQL skills; comfortable working with Databricks Gold-layer tables.
- Proven ability to optimise dataset performance (aggregations, incremental refresh, DirectQuery/Import).
- Experience working with Git-based modelling workflows and PR reviews via Tabular Editor.
- Excellent design intuition: clean layouts, drill paths, and KPI logic.

Nice to Have
- Python for automation or ad-hoc prep; PySpark familiarity.
- Understanding of Lakehouse patterns, Delta Lake, metadata-driven pipelines.
- Unity Catalog / Purview experience for lineage and governance.
- RLS/OLS implementation experience.
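For context on the refresh-reliability monitoring mentioned above, here is a minimal, hypothetical Python sketch that polls a dataset's refresh history through the Power BI REST API's documented refreshes endpoint. The workspace and dataset IDs and the get_token() helper are placeholders, not details from this role.

```python
# Hypothetical sketch: poll refresh history for a Power BI semantic model.
# WORKSPACE_ID, DATASET_ID and get_token() are placeholders.
import requests

WORKSPACE_ID = "<workspace-guid>"  # assumption: target workspace (group) id
DATASET_ID = "<dataset-guid>"      # assumption: target dataset id

def get_token() -> str:
    """Placeholder: acquire an Azure AD access token (e.g. via MSAL)
    with Dataset.Read.All permission. Setup is tenant-specific."""
    raise NotImplementedError

def recent_refreshes(top: int = 10) -> list[dict]:
    # GET .../refreshes is a documented Power BI REST endpoint; it returns
    # refresh attempts with status (Completed/Failed) and start/end times.
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
           f"/datasets/{DATASET_ID}/refreshes?$top={top}")
    resp = requests.get(url, headers={"Authorization": f"Bearer {get_token()}"})
    resp.raise_for_status()
    return resp.json()["value"]

if __name__ == "__main__":
    for r in recent_refreshes():
        print(r["refreshType"], r["status"], r.get("startTime"), r.get("endTime"))
```

A real implementation would wire get_token() to a service principal and feed failures into whatever alerting the telemetry stack provides.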
Jan 13, 2026
Full time
Mid-Level Data Engineer (Azure / Databricks)
NO VISA REQUIREMENTS
Location: Glasgow (3+ days)
Reports to: Head of IT

My client is undergoing a major transformation of their entire data landscape: migrating from legacy systems and manual reporting into a modern Azure + Databricks Lakehouse. They are building a secure, automated, enterprise-grade platform powered by Lakeflow Declarative Pipelines, Unity Catalog and Azure Data Factory. They are looking for a Mid-Level Data Engineer to help deliver high-quality pipelines and curated datasets used across Finance, Operations, Sales, Customer Care and Logistics.

What You'll Do

Lakehouse Engineering (Azure + Databricks)
- Build and maintain scalable ELT pipelines using Lakeflow Declarative Pipelines, PySpark and Spark SQL.
- Work within a Medallion architecture (Bronze → Silver → Gold) to deliver reliable, high-quality datasets.
- Ingest data from multiple sources including ChargeBee, legacy operational files, SharePoint, SFTP, SQL, REST and GraphQL APIs using Azure Data Factory and metadata-driven patterns.
- Apply data quality and validation rules using Lakeflow Declarative Pipelines expectations (a short sketch follows this posting).

Curated Layers & Data Modelling
- Develop clean, conformed Silver & Gold layers aligned to enterprise subject areas.
- Contribute to dimensional modelling (star schemas), harmonisation logic, SCDs and business marts powering Power BI datasets.
- Apply governance, lineage and permissioning through Unity Catalog.

Orchestration & Observability
- Use Lakeflow Workflows and ADF to orchestrate and optimise ingestion, transformation and scheduled jobs.
- Help implement monitoring, alerting, SLAs/SLIs and runbooks to support production reliability.
- Assist in performance tuning and cost optimisation.

DevOps & Platform Engineering
- Contribute to CI/CD pipelines in Azure DevOps to automate deployment of notebooks, Lakeflow Declarative Pipelines, SQL models and ADF assets.
- Support secure deployment patterns using private endpoints, managed identities and Key Vault.
- Participate in code reviews and help improve engineering practices.

Collaboration & Delivery
- Work with BI and Analytics teams to deliver curated datasets that power dashboards across the business.
- Contribute to architectural discussions and the ongoing data platform roadmap.

Tech You'll Use
- Databricks: Lakeflow Declarative Pipelines, Lakeflow Workflows, Unity Catalog, Delta Lake
- Azure: ADLS Gen2, Data Factory, Event Hubs (optional), Key Vault, private endpoints
- Languages: PySpark, Spark SQL, Python, Git
- DevOps: Azure DevOps Repos & Pipelines, CI/CD
- Analytics: Power BI, Fabric

What We're Looking For

Experience
- Proven commercial data engineering experience.
- Hands-on experience delivering solutions on Azure + Databricks.
- Strong PySpark and Spark SQL skills within distributed compute environments.
- Experience working in a Lakehouse/Medallion architecture with Delta Lake.
- Understanding of dimensional modelling (Kimball), including SCD Type 1/2.
- Exposure to operational concepts such as monitoring, retries, idempotency and backfills.

Mindset
- Keen to grow within a modern Azure Data Platform environment.
- Comfortable with Git, CI/CD and modern engineering workflows.
- Able to communicate technical concepts clearly to non-technical stakeholders.
- Quality-driven, collaborative and proactive.

Nice to Have
- Databricks Certified Data Engineer Associate.
- Experience with streaming ingestion (Auto Loader, event streams, watermarking).
- Subscription/entitlement modelling (e.g., ChargeBee).
- Unity Catalog advanced security (RLS, PII governance).
- Terraform or Bicep for IaC.
- Fabric Semantic Models or Direct Lake optimisation experience.

Why Join?
- Opportunity to shape and build a modern enterprise Lakehouse platform.
- Hands-on work with Azure, Databricks and leading-edge engineering practices.
- Real progression opportunities within a growing data function.
- Direct impact across multiple business domains.
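To illustrate the Lakeflow Declarative Pipelines expectations referenced above, here is a minimal sketch of a declarative dataset with data-quality rules, using the dlt Python decorators (formerly Delta Live Tables). Table and column names (bronze_orders, order_id, amount) are invented for illustration, and the dlt module only resolves when the code runs inside a Databricks pipeline.

```python
# Minimal sketch of a declarative pipeline table with expectations.
# Assumes an upstream bronze table named "bronze_orders" exists.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Silver orders: validated and deduplicated")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")  # drop rows failing the rule
@dlt.expect("non_negative_amount", "amount >= 0")              # record violations, keep the row
def silver_orders():
    return (
        dlt.read("bronze_orders")                 # hypothetical bronze source
           .withColumn("processed_at", F.current_timestamp())
           .dropDuplicates(["order_id"])
    )
```

Expectation results surface in the pipeline's event log, which is where the monitoring and SLA reporting described above would typically hook in.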
Jan 07, 2026
Full time
BI Data Architect
Edinburgh - office based

Head Resourcing are pleased to be working with a global manufacturer headquartered in Scotland as they look to hire a talented BI Data Architect. Our client is a long-established, family-owned business with global operations producing a wide range of high-quality products.

The BI Data Architect is a new position within our client's IT structure and will be responsible for designing, implementing, and maintaining scalable data architectures which support business intelligence, analytics, and reporting across the organisation. The successful candidate will be able to bridge the gap between data engineering and strategic decision-making.

Required skills:
- Experienced in Data Engineering with strong knowledge of Data Architecture
- Advanced SQL for data manipulation and querying
- Experience with ETL tools in Azure
- Knowledge of BI tools such as Power BI, Tableau, or Fabric
- Strong communication skills and the ability to explain technical concepts to non-technical users

If this sounds of interest and you'd like a confidential chat to find out more, please apply today!
Jan 06, 2026
Full time
Role: Business Analyst
Location: Glasgow
Salary: £50,000 - £52,000

Head Resourcing have partnered with a successful family-owned Glasgow business with an international presence to support their search for a Business Analyst to join their growing IT department. Technology is at the heart of this entrepreneurial business, which operates in the tech-for-good space and has a massive impact on the local and wider community. The successful candidate will get the opportunity to work closely with software developers building cloud platforms, mobile and desktop applications.

This role would suit an agile, product-focussed Business Analyst who enjoys building internal relationships. You will be naturally curious and inquisitive, with strong communication skills and a passion for business analysis.

What we are looking for:
- Experienced Business Analyst within a technology or product environment.
- Demonstrated experience collaborating with UX teams and converting wireframes, mock-ups, and other design artifacts into detailed user stories.
- Strong analytical and problem-solving skills with the ability to understand complex business processes.
- Excellent communication skills, both written and verbal, to interact effectively with technical teams, stakeholders, and end users.

This company is based just on the outskirts of central Glasgow and operates a five-day onsite working pattern.
Oct 07, 2025
Full time
Head Resourcing is delighted to be partnering with one of the UK's leading retail banks to recruit an experienced Senior DevOps Engineer for their Leeds-based team.

The Opportunity
This is a fantastic chance to join a forward-thinking digital engineering team where you'll focus on improving and maintaining the tools and processes that support continuous integration, automated testing, and software delivery pipelines. The role is highly collaborative and hands-on, with a strong emphasis on automation, resilience, and enabling faster, more reliable software releases. As well as technical delivery, you'll play an important part in championing DevOps culture, helping teams adopt agile practices, embedding continuous improvement, and sharing expertise across the organisation.

Key Responsibilities
- Design, develop, and enhance CI/CD pipelines and release frameworks.
- Provide operational support and optimisation for applications running in cloud environments (public and private).
- Build and manage solutions with containerisation and orchestration (Docker, Kubernetes, service mesh).
- Drive automation and scalability through infrastructure as code (Terraform, CloudFormation).
- Contribute across the full software lifecycle, from planning through to production.
- Mentor colleagues and act as a point of leadership in modern engineering practices.
- Work closely with cross-functional teams to solve complex technical challenges.

What We're Looking For
- Strong background in cloud platforms such as GCP, Azure, AWS, or OCP.
- Hands-on experience with DevOps toolchains (Jenkins, Nexus, SonarQube, Git, Maven).
- Solid programming ability, ideally in Java or JavaScript, though Python, Golang, or Rust are also valuable.
- Experience with containers and orchestration frameworks.
- Demonstrated ability to mentor or lead within technical teams.
- A collaborative approach, with an interest in driving cultural and process improvements.

Salary: Depending on experience
Pension
Discretionary bonus

This role is hybrid, with 2 days per week onsite in Leeds.

If this sounds like you, we would love to hear from you.
Oct 01, 2025
Full time