Developing a Data Mesh Inspired Vision Using Microsoft Fabric

By Barry Smart, Director of Data & AI

TLDR; Microsoft Fabric has clearly been influenced by Data Mesh. If you are seeking a Data Mesh inspired vision for your organisation, Microsoft Fabric is a solid choice to help you drive that strategy forward. However, there are a number of tensions between the Data Mesh principles that you will need to identify and address in order to maximise the value.

In May 2023, Microsoft announced Microsoft Fabric. It extends the promise of Azure Synapse Analytics integration to all analytics workloads, from the data engineer to the business knowledge worker. It brings together reporting, analytics, data science and data engineering on a new generation of lakehouse infrastructure. Delivered as a unified SaaS offering, it aims to reduce cost and time to value, while enabling new "citizen data science" capabilities. See Ed Freeman's Introduction To Microsoft Fabric for more background.

Microsoft Fabric has been heavily influenced by Data Mesh. Data Mesh is Zhamak Dehghani's thought leadership about how to "deliver data-driven value at scale". The fundamental thing to note about Data Mesh is that it is not just about technology; it also describes important cultural and organisational principles that need to be applied in order to drive value from data in a safe, scalable and secure manner. In other words, data is a socio-technical endeavour. Due to Conway's Law, we know that these cultural and organisational concerns will tend to override the technology concerns.

In this blog, we provide a framework to help you identify the socio-technical tensions that may be present in your organisation, with the objective of using Microsoft Fabric to alleviate those tensions and so maximise the value you can extract from a Data Mesh inspired vision.

Tensions between Data Mesh principles

For some background about Data Mesh, please read my previous blog in the series How Does Microsoft Fabric Measure Up To Data Mesh?. Data Mesh is founded on four principles:

  • Domain-orientated ownership
  • Data as a product
  • Federated computational governance
  • Self-serve data platform

Assuming that Microsoft Fabric largely fulfils the principle of a "Self-serve data platform", it is useful to consider the remaining three principles and what they mean for your organisation:

Data as a product
Description: The objective here is fundamentally about discoverability: enabling users to easily find and consume trusted "data products" in a secure manner, with a vision of re-use and interoperability.
Challenges: This principle creates challenges around governance, in particular when it comes to managing sensitive data and preventing inappropriate use. It can also be hampered by de-centralisation, because the individual domains will need to build an understanding of concepts that will be alien to them - for example, what a data product actually is and the responsibilities of ownership that will be placed on them. It also requires a "data product marketplace" to enable discovery, and tools to enable easy consumption of these data products, which isn't trivial.

Domain-orientated ownership
Description: The objective here is to enable de-centralised, domain-driven ownership of data products through a self-service platform such as Microsoft Fabric. This is often referred to as "democratisation of data & analytics" or establishing "citizen analysts".
Challenges: This will often be a daunting prospect for existing centralised teams that are currently involved in delivering data solutions to the organisation - they will see it as a loss of control, and will naturally be concerned about the sustainability of this operating model and what it means for their future roles. The key to overcoming these concerns is for these teams to begin shifting from operating as a "centre of excellence", where all expertise is concentrated in one central team, to a "centre of enablement", where the central team are responsible for upskilling and supporting the wider organisation to successfully discover, build and own data products. Part of this enablement role is putting in place the standards to ensure compatibility of these data products across different domains.

Federated computational governance
Description: The objective is to manage risk by enabling policies concerning standardisation, security and compliance to be applied in a manner that does not inhibit the other principles.
Challenges: The objectives of data governance cannot be disputed: to manage risk and to promote trust in the "data products". However, traditional data governance is often applied through centralised processes that can feel like "barriers to getting things done". This is in direct conflict with the de-centralised, domain-orientated ownership dimension above. To remove this conflict, Data Mesh seeks to shift governance from a centralised to a federated approach, where the domains are empowered to apply the policies that are relevant to them, on top of a small set of policies that are defined and set centrally. The idea is that technology can help by applying "policies as code" in an automated manner (a minimal sketch of this idea follows below).
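
To make "policies as code" concrete, the sketch below shows one way a central team might express a small set of mesh-wide policies that every data product's metadata must satisfy before it is published, leaving domains free to add their own checks on top. It is a minimal sketch only - the DataProductMetadata fields, the policy rules and the check_policies function are all hypothetical assumptions for illustration, not features of Microsoft Fabric or part of any Data Mesh specification.

```python
from dataclasses import dataclass, field

@dataclass
class DataProductMetadata:
    """Hypothetical metadata a domain publishes alongside its data product."""
    name: str
    owning_domain: str
    owner_contact: str
    sensitivity: str              # e.g. "public", "internal", "confidential"
    contains_personal_data: bool
    retention_days: int
    tags: list = field(default_factory=list)

# A small set of centrally defined policies, expressed as code.
# Each policy is a (description, predicate) pair; domains can append their own.
CENTRAL_POLICIES = [
    ("every data product must name an owning domain",
     lambda m: bool(m.owning_domain)),
    ("every data product must have an owner contact",
     lambda m: bool(m.owner_contact)),
    ("personal data must be classified as 'confidential'",
     lambda m: not m.contains_personal_data or m.sensitivity == "confidential"),
    ("retention must be defined and no longer than 7 years",
     lambda m: 0 < m.retention_days <= 365 * 7),
]

def check_policies(metadata: DataProductMetadata, policies=CENTRAL_POLICIES):
    """Return the descriptions of any policies the data product violates."""
    return [description for description, rule in policies if not rule(metadata)]

if __name__ == "__main__":
    product = DataProductMetadata(
        name="customer-360",
        owning_domain="customer-services",
        owner_contact="data-products@contoso.example",   # illustrative contact
        sensitivity="internal",
        contains_personal_data=True,
        retention_days=3650,
    )
    for violation in check_policies(product):
        print(f"POLICY VIOLATION: {violation}")
```

Checks like this could run automatically in a deployment pipeline, so that a data product cannot be published to the mesh until the centrally defined policies pass.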

The Data Strategy Triangle

The project management triangle describes how the quality of a project is governed by the natural constraints of time, cost and scope. There are similar constraints and tensions between the three Data Mesh principles described above (domain-orientated ownership, federated computational governance, data as a product):

[Diagram: a triangle with domain-orientated ownership, federated computational governance and data as a product as the edges, and labels on the vertices describing the natural tension between them]

Fundamentally, if you constrain one of these dimensions you will constrain the other two, and therefore limit the value that you can deliver. This is a common challenge in any Data Mesh implementation: finding the right balance between the three objectives of domain-orientated ownership (de-centralisation, empowering "citizen analysts" to act autonomously), data as a product (discoverability, standardisation) and federated computational governance (managing risk and promoting trust).

The infrastructure provided by Microsoft Fabric enables some of these tensions to be alleviated. For example, by providing a common platform that is easy to use, it empowers domain teams to deliver data products whilst enabling wider capabilities such as standardisation and discoverability. By its very nature, a single platform will also allow some control to be exerted over technical implementation and some key aspects of governance to be managed centrally.

Start with the most constrained dimension

When you are developing your strategy, it is useful to consider these dimensions, and the tensions that exist between them today, to find the new balance (enabled by Microsoft Fabric) that you believe is right for your organisation. Start by identifying the dimension that is currently most constrained in your organisation; this will help you to shape your strategy by identifying the actions you should take first to unlock value:

Key constraint is "Federated computational governance"

In tightly regulated industries such as Financial Services, the governance dimension often dominates. If this is the case, consider how you could leverage features in Microsoft Fabric (or augment Fabric) to adopt a new approach to governance that will open up the opportunities around de-centralisation and discoverability without compromising immutable laws, regulations and policies that apply to your organisation.

In many cases it is helpful to run "table top exercises" to test boundaries and explore how new technology could open up new ways of working. This gives you an opportunity to involve key stakeholders before committing to change. For example, explore how new tools could:

  • Empower the domains to certify their datasets (rather than relying on a central authority to do so) and, in doing so, improve GDPR compliance by flagging personal data with more certainty (a sketch of this idea follows the list below).
  • Delegate responsibility for monitoring usage of data products to the domain teams by opening up access to some of the admin tools in Fabric.
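
As an illustration of the first idea, the sketch below shows how a domain team might run a simple rule-based scan over the column names of one of its tables to flag likely personal data before certifying the dataset. Everything here is a hypothetical assumption - the column-name heuristics, the certification record and the dataset names are purely illustrative, and a real implementation would lean on the classification, sensitivity labelling and endorsement features of Fabric and Microsoft Purview rather than a hand-rolled check.

```python
import json
from datetime import datetime, timezone

# Hypothetical heuristics: column name fragments that often indicate personal data.
PERSONAL_DATA_HINTS = ["name", "email", "phone", "address", "dob", "birth", "passport"]

def flag_personal_data(columns):
    """Return the subset of column names that look like they hold personal data."""
    return [c for c in columns if any(hint in c.lower() for hint in PERSONAL_DATA_HINTS)]

def build_certification(dataset_name, columns, certified_by):
    """Assemble a simple certification record a domain could publish with its data product."""
    flagged = flag_personal_data(columns)
    return {
        "dataset": dataset_name,
        "certified_by": certified_by,
        "certified_at": datetime.now(timezone.utc).isoformat(),
        "contains_personal_data": bool(flagged),
        "personal_data_columns": flagged,
    }

if __name__ == "__main__":
    columns = ["customer_id", "full_name", "email_address", "segment", "last_order_date"]
    record = build_certification("customer-360", columns, "customer-services domain")
    print(json.dumps(record, indent=2))
```

The point is not the code itself, but that certification becomes something the domain can perform and evidence for itself, rather than waiting on a central authority.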

Key constraint is "Domain-orientated ownership"

Perhaps your organisation has low "data literacy", and the prospect of up-skilling individual domains to discover, build and own data products is therefore daunting. Or perhaps the culture in the organisation is one of "decision by gut" rather than "decision through data"?

In these cases, it is often easier to find a team who are willing to act as early adopters and create an internal success story before pushing ahead with an organisation-wide initiative.

Key constraint is "Data as a product"

This is typically the challenge in organisations that are inherently siloed in nature, where there has been a lack of trust around sharing data.

Under these circumstances you could choose to start with master data as a data product. By its very nature, master data tends to be high value information that is used across the whole organisation. Solving this problem will require individual departments to collaborate around data, and will help them see the benefit of sharing data products. Look to tackle a single master data entity that will give you a quick win and build from there.
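
One lightweight way to start that collaboration is to agree a simple "contract" for the first master data entity, so every consuming department knows exactly what it will get and who owns it. The sketch below shows what such a contract might look like for a hypothetical customer master entity; the fields and values are purely illustrative assumptions rather than a Fabric or Data Mesh standard.

```python
import json

# A purely illustrative "data product contract" for a single master data entity.
# In practice this would live alongside the data product (for example in source
# control) and could be validated automatically when the product is published.
CUSTOMER_MASTER_CONTRACT = {
    "data_product": "customer-master",
    "owning_domain": "customer-services",
    "description": "Single trusted record per customer, shared across the organisation.",
    "schema": {
        "customer_id": {"type": "string", "nullable": False, "unique": True},
        "legal_name": {"type": "string", "nullable": False},
        "country": {"type": "string", "nullable": False},
        "status": {"type": "string", "allowed": ["active", "dormant", "closed"]},
    },
    "quality": {
        "freshness": "refreshed daily by 06:00 UTC",
        "completeness": "customer_id and legal_name populated for 100% of rows",
    },
    "access": {
        "classification": "internal",
        "request_process": "self-service via the data product marketplace",
    },
    "support": {
        "contact": "customer-master-owners@contoso.example",  # illustrative contact
        "breaking_change_notice": "30 days",
    },
}

if __name__ == "__main__":
    print(json.dumps(CUSTOMER_MASTER_CONTRACT, indent=2))
```

Even a contract this small gives consuming departments something concrete to rely on, gives the owning domain a clear statement of the responsibilities that come with ownership, and can be validated automatically as part of the federated governance checks described earlier.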

Final thoughts

To successfully deliver a Data Mesh inspired vision enabled by Fabric, important cultural and organisational principles will need to be applied in order to drive value from data in a safe, scalable and secure manner.

Organisations should use the time available before Microsoft Fabric becomes generally available (GA) as an opportunity to assess their organisational readiness and take the actions necessary to maximise the value that Fabric can deliver.

We strongly advise an "evolution not revolution" approach to building organisational readiness. You may have a strong vision for a Data Mesh inspired architecture, but avoid making big bets to deliver it! Adopt an experimental mindset to drive small incremental steps to evaluate options, with feedback loops that ensure that future steps are informed by what you have learned. Use your existing data and analytics capability as a benchmark throughout. By investing in a small amount of experimentation every month, you will be amazed by what you will achieve between now and Fabric going GA.

FAQs

What is Microsoft Fabric and how is it related to Data Mesh? Microsoft Fabric is a unified SaaS offering that extends Azure Synapse Analytics integration to all analytics workloads, enabling reporting, analytics, data science, and data engineering on a new generation of lakehouse infrastructure. It is heavily influenced by Data Mesh, a concept introduced by Zhamak Dehghani to deliver data-driven value at scale, which involves not only technology and architecture but also cultural and organisational principles.
What are the main challenges when implementing Data Mesh principles in an organisation using Microsoft Fabric? The main challenges when implementing Data Mesh principles using Microsoft Fabric include finding the right balance between domain-orientated ownership, data as a product, and federated computational governance. Addressing issues such as data governance, decentralisation, and discoverability while considering the specific constraints of the organisation is crucial.
What are the significant technology-related gaps to consider when adopting Microsoft Fabric for a Data Mesh-inspired vision? The significant technology-related gaps to consider when adopting Microsoft Fabric for a Data Mesh-inspired vision include the data product marketplace, developing standards and patterns, master data management, tooling to support federated computational governance, and data product "contract" specification. These gaps can be addressed by creating a research and development backlog, prioritising items based on your organisation's specific requirements, and adopting an experimental mindset.

Barry Smart

Director of Data & AI


Barry has spent over 25 years in the tech industry; from developer to solution architect, business transformation manager to IT Director, and CTO of a £100m FinTech company. In 2020 Barry's passion for data and analytics led him to gain an MSc in Artificial Intelligence and Applications.