Blog

Insight Discovery (part 6) – How to define business requirements for a successful cloud data & analytics project
Many data projects fail to deliver the impact they should for a simple reason – they focus on the data. This series of posts explains a different way of thinking that will set up your data & analytics projects for success. Using an iterative, action-oriented, insight discovery process, it demonstrates tools and techniques that will help you to identify, define and prioritize requirements in your own projects so that they deliver maximum value. It also explores the synergy with modern cloud analytics platforms like Azure Synapse, explaining how the process and the architecture actively support each other for fast, impactful delivery.

Putting total cost of ownership (TCO) into action
Total cost of ownership can be used at various stages in the lifecycle of a digital asset to support data-driven decisions.

What are Synapse Analytics Database Templates and why should you use them?
Explore Azure Synapse Analytics Database Templates and learn to leverage them in modern data pipelines.

5 lessons learnt from using Power Automate
In this post, we look at 5 lessons that we learnt from a recently completed Power Automate project that helped us to get the best out of the platform.

Bye bye Azure Functions, Hello Azure Container Apps: Build and deployment pipelines, and our first big problem
The third in a series of posts talking about how and why we migrated an application from Azure Functions to Azure Container Apps

Insight Discovery (part 5) – Deliver insights incrementally with data pipelines
Discover a unique, action-oriented approach to data projects for maximum impact. Learn how to prioritize requirements and leverage cloud analytics platforms.

How to calculate the total cost of ownership (TCO)
There is a broad spectrum of costs that a digital asset will accrue over its lifetime, which will typically span many years.

Insight Discovery (part 4) – Data projects should have a backlog
This series focuses on maximizing data projects' impact via an iterative, insight discovery process, and synergy with cloud platforms like Azure Synapse.

What is the total cost of ownership (TCO) and why is it important?
Understanding the total cost of ownership is key to making informed decisions about technology investments.

Continuous Integration with GitHub Actions
This post gives an overview of Continuous Integration and shows how you can implement it with GitHub Actions, with an accompanying example Python project.

Publishing Scripts to the PowerShell Gallery
Explore how to share a function from a PowerShell module as a standalone script, without maintaining two code versions.

Bye bye Azure Functions, Hello Azure Container Apps: Migrating from Azure Functions to ASP.NET Core
The second in a series of posts talking about how and why we migrated an application from Azure Functions to Azure Container Apps

Insight Discovery (part 3) – Defining Actionable Insights
Discover a unique, action-oriented approach to data projects for maximum impact. Learn how to synergize with platforms like Azure Synapse for fast delivery.

How to enable data teams with the design assets required for impactful data storytelling in Power BI
In this post we talk through how to expand a data team's creative skillset without access to specialist photo-editing software such as Photoshop or Illustrator.

5 tips to pass the PL-300 exam: Microsoft Power BI Data Analyst
I recently passed the PL-300 - Power BI Data Analyst exam. Here are some tips to prepare for it that I found useful!

Insight Discovery (part 2) – successful data projects start by forgetting about the data
Many data projects fail to deliver the impact they should for a simple reason – they focus on the data. This post introduces an iterative, action-oriented insight discovery process that sets your data & analytics projects up for success.

A simple toolkit for IT budgeting and planning
We describe how to create a high-level view of your digital assets, where everything is measured equally and actionable insights can be generated, allowing you to optimise your budget and build a roadmap focused on business value.

Performance Optimisation Tools for Power BI
Optimise Power BI report performance with analyzer tools. Discover essential techniques for efficient report development in this blog post.

C# Lambda Discards
C# has gradually been adding support for discards. This article explores how this evolution has led to some surprises.

Insight Discovery (part 1) – why do data projects often fail?
Discover a unique, action-oriented approach to data projects for maximum impact. Learn how to synergize with platforms like Azure Synapse for fast delivery.

Bye bye Azure Functions, Hello Azure Container Apps: Introduction
The first in a series of posts talking about how and why we migrated an application from Azure Functions to Azure Container Apps

Automating Excel in the Cloud with Office Scripts and Power Automate
Automate Excel tasks with Office Scripts & Power Automate. Get an overview and explore a practical example in this post.

Service Lifetimes in ASP.NET Core
Explore Microsoft Dependency Injection container's 3 lifetimes: transient, singleton, scoped. Learn their behaviors & importance in app dependencies.

Using Azure CLI Authentication within Local Containers
Fix a broken dev loop with containerized apps using Azure CLI authentication on Windows. Learn a workaround that avoids relying on outdated Azure CLI versions.

How to apply behaviour driven development to data and analytics projects
In this blog we demonstrate how the Gherkin specification can be adapted to enable BDD to be applied to data engineering use cases.