By James Broome, Director of Engineering
Why Power BI developers should care about the read/write XMLA endpoint

Although not discussed or documented widely, it's no secret that Power BI is built on top of the same underlying technology that powers (Azure) Analysis Services.

The same terminology prevails throughout data models (e.g. calculated columns and measures) and the modelling itself is performed using the same language (DAX). You might not have realised that opening a Power BI report (.pbix file) in Power BI Desktop actually starts a local instance of SQL Server Analysis Services, which is what powers the data modelling when you design the datasets in the reports.

But, until recently, the Analysis Services side of Power BI was a bit of a black box.

It was there, but hidden away, encapsulated by the Power BI functionality running on top. Last year this started to change, when the Power BI team exposed an XMLA endpoint for the underlying tabular model - which could be connected to from a local Power BI Desktop instance, or in the Power BI Service once a report had been published to a workspace (provided you have a dedicated capacity running).

This endpoint was read-only - but it enabled integrations through 3rd party tooling so that you could, for example, query the data in the model, or use different visualisation tools to present the data outside of Power BI.
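As a concrete example, even the read-only endpoint meant that a few lines of ADOMD.NET were enough to run a DAX query against a published dataset and consume the results outside of Power BI. The sketch below assumes a dedicated-capacity workspace called "My Workspace", a dataset called "My Dataset", and a hypothetical Date table and Total Sales measure - and authentication will depend on your Azure AD set-up.

```csharp
// A minimal sketch of querying a published dataset over the XMLA endpoint using
// ADOMD.NET (the Microsoft.AnalysisServices.AdomdClient NuGet package).
// Workspace, dataset, table and measure names are all placeholders.
using System;
using Microsoft.AnalysisServices.AdomdClient;

class QueryDataset
{
    static void Main()
    {
        // The XMLA endpoint for a workspace on dedicated capacity
        var connectionString =
            "Data Source=powerbi://api.powerbi.com/v1.0/myorg/My Workspace;" +
            "Initial Catalog=My Dataset";

        using var connection = new AdomdConnection(connectionString);
        connection.Open();

        // Any DAX query can be executed against the underlying tabular model
        using var command = new AdomdCommand(
            "EVALUATE SUMMARIZECOLUMNS('Date'[Year], \"Sales\", [Total Sales])",
            connection);

        using var reader = command.ExecuteReader();
        while (reader.Read())
        {
            Console.WriteLine($"{reader.GetValue(0)}: {reader.GetValue(1)}");
        }
    }
}
```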

Fast forward to March 2020, and a read/write version of the XMLA endpoint was announced, which takes things to the next level in terms of integration scenarios. This is a significant step towards the vision of merging Power BI and Analysis Services into a single unified platform. As Christian Wade explained in the announcement:

With read/write capabilities for the XMLA endpoint in Power BI Premium, Power BI is growing to become a superset of Azure Analysis Services. The convergence of IT-managed workloads and self-service BI on the Power BI platform taps into Microsoft's deep heritage in enterprise BI spanning multiple decades. It paves the way for customers to co-locate BI artifacts, modernize BI solutions, and leverage the full power of Analysis Services in Power BI.

The rest of this post highlights some of the new scenarios enabled by this functionality, and why they make a difference to Power BI developers.

1. Better 3rd party tooling support

The initial read-only XMLA endpoint enabled 3rd party tooling integrations, but with read/write support, data modelling activities can now be performed in tools like DAX Studio, ALM Toolkit and Tabular Editor. What was only possible inside Power BI Desktop is now also possible (as and when the tooling progresses) in other tools too.

And in some cases, the 3rd party tools are already ahead - at the time of the announcement, Tabular Editor could provide additional modelling capabilities supported by the Analysis Services engine, but not yet supported in Power BI Desktop. With this increased "openness" of the inner workings of Power BI, the developer community will benefit from a wider choice and faster pace of tooling support.

2. More SDK and API integration options

The 3rd party tools all rely on lower-level SDKs and APIs like ADOMD.NET and AMO, which connect to the XMLA endpoint. With the availability of the read/write endpoint, once a connection is made, what's possible through these SDKs automatically becomes more advanced, with more scenarios supported out of the box.

Updating model metadata and triggering model refreshes can be performed using .NET or PowerShell, opening up lots of opportunities for custom integrations. I've described in a recent post how to write executable specifications over the underlying tabular model in order to test Power BI reports using this approach.
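As a rough illustration, the sketch below uses the Tabular Object Model (from the Microsoft.AnalysisServices.Tabular client library) to connect to a workspace's XMLA endpoint and trigger a full refresh of a dataset. The workspace and dataset names are placeholders, and authentication will depend on how your Azure AD environment is configured.

```csharp
// A minimal sketch of triggering a dataset refresh over the read/write XMLA
// endpoint using the Tabular Object Model. Workspace and dataset names are
// placeholders - substitute your own.
using Microsoft.AnalysisServices.Tabular;

class RefreshDataset
{
    static void Main()
    {
        var server = new Server();
        server.Connect("Data Source=powerbi://api.powerbi.com/v1.0/myorg/My Workspace");

        // A published dataset appears as a database on the server
        Database database = server.Databases.GetByName("My Dataset");

        // Queue a full refresh and commit it back to the service
        database.Model.RequestRefresh(RefreshType.Full);
        database.Model.SaveChanges();

        server.Disconnect();
    }
}
```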

3. New DevOps and developer workflows

The read/write XMLA endpoint is made possible by a change to how the tabular model metadata is stored in Power BI. If you've ever opened up and interrogated a .pbix file (which is really just a .zip file with a proprietary structure), you'll have realised that the content is hard to explore, as most of it is stored in a binary format.


However, recent versions of Power BI Desktop have included a preview feature to "Store datasets using enhanced metadata format", which moves away from the binary format and stores the model metadata in a JSON-based .bim file, just like an Analysis Services model definition.

This change is quite important - with the model metadata stored as JSON, it can now be version controlled and diffed/merged using the myriad of developer tools available. For example, the model definition could be stored in an Azure DevOps Git repo and, with the XMLA read/write endpoint, deployed programmatically from a pipeline.

This opens up loads of options around developer workflows and DevOps scenarios that up to now have been difficult, and in some cases, impossible with Power BI solutions.
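As a sketch of one of those scenarios, the example below takes a source-controlled model.bim file, wraps it in a TMSL createOrReplace command and executes it against the workspace's XMLA endpoint using AMO. The file path, workspace and dataset names are placeholders, and a real pipeline step would also need to handle authentication (for example with a service principal) and inspect the results for errors.

```csharp
// A minimal sketch of deploying a version-controlled model definition to a
// dedicated-capacity workspace from a pipeline step, using the AMO client
// library (the Microsoft.AnalysisServices NuGet package). Names are placeholders.
using System;
using System.IO;
using Microsoft.AnalysisServices;

class DeployModel
{
    static void Main()
    {
        // The JSON model definition held in source control
        string bim = File.ReadAllText("model.bim");

        // createOrReplace deploys the definition, creating the dataset if it
        // doesn't exist, or overwriting its metadata if it does
        string tmsl = @"{
            ""createOrReplace"": {
                ""object"": { ""database"": ""My Dataset"" },
                ""database"": " + bim + @"
            }
        }";

        var server = new Server();
        server.Connect("Data Source=powerbi://api.powerbi.com/v1.0/myorg/My Workspace");

        XmlaResultCollection results = server.Execute(tmsl);
        foreach (XmlaResult result in results)
        {
            foreach (XmlaMessage message in result.Messages)
            {
                Console.WriteLine(message.Description);
            }
        }

        server.Disconnect();
    }
}
```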

One caveat to note, though, is that the model definition contains only the model! There's no notion of the UI layer in there, so metadata around the visualisations in the reports that consume the model is still part of the Power BI black box.

4. Custom application integrations

The combination of the XMLA read/write endpoint and the enhanced model metadata format is what makes all the 3rd party tooling integrations described above possible. However, the opportunity doesn't end there - I've written extensively about the reasons you might want to integrate Azure Analysis Services into your custom applications and, with these changes in the Power BI Service, all of those points apply equally here as well.

You've always been able to integrate Power BI reports into your custom applications through Power BI Embedded, but it's limited to presenting the report layer, with a report being the logical integration unit. This works well for a lot of scenarios, but a whole host of new options are now available, specifically around the "write" side of the XMLA endpoint.

"Write-backs", or data-update scenarios have always been problematic with Power BI Embedded - for custom applications that include more advanced functionality like what-if modelling, or user-defined dynamic expressions, the ability to programmatically update and re-process the Power BI model is fantastic.

5. The roadmap is much bigger

This recent announcement is part of a much bigger roadmap of investment in Power BI over the coming months. It's easy to miss announcements when a platform moves this quickly, and whilst the XMLA read/write endpoint is a significant update, there's plenty more coming that may provide different benefits to you and your products, depending on your scenarios.


But, the important thing is that the platform is continually improving, and features like this show that the team behind Power BI aren't standing still. For us at endjin, investments in the developer/DevOps/automation space are always valuable and have a big impact on how we deliver efficiently and effectively for our customers. But, for you it might be something else.

If you want to stay on top of all the news and articles from the Power BI ecosystem, then I'd definitely recommend signing up to endjin's free Power BI Weekly newsletter.

James Broome

Director of Engineering

James has spent 20+ years delivering high quality software solutions addressing global business problems, with teams and clients across 3 continents. As Director of Engineering at endjin, he leads the team in providing technology strategy, data insights and engineering support to organisations of all sizes - from disruptive B2C start-ups, to global financial institutions. He's responsible for the success of our customer-facing project delivery, as well as the capability and growth of our delivery team.