By James Broome, Director of Engineering
Azure Analysis Services: 8 reasons why you might want to integrate into a custom application

Azure Analysis Services offers a cloud-hosted PaaS data analytics platform, allowing massive amounts of data to be queried for ad-hoc analysis. It's really easy to get started, ingest data from a variety of sources and build semantic tabular models. And with out-of-the-box connectors for both Excel and Power BI, developing highly interactive visualisations to gain deep insights into your data is quick and simple.

However, with support available through client SDKs, PowerShell cmdlets and REST APIs, the opportunities for integrating Azure Analysis Services into your own custom applications are endless. This post explains some of the common (and not-so-common) reasons why you might want to do this, and the subsequent series of technical how-tos goes into more detail around some of the core aspects if you're looking to get started.

Reason 1. Ultimate UI flexibility

Integrating an Azure Analysis Services model into an Excel worksheet or a Power BI report is very simple, as both offer out-of-the-box connector options. This means building interactive charts and visualisations is fast, with drag-and-drop report building in a familiar Office tool set.


However, if you're building an end-user-facing product, rather than an internal tool, you may have very specific or demanding requirements around data visualisations, design/branding or accessibility that can only be achieved in a custom-built web (or native) application. Being able to execute queries against Azure Analysis Services programmatically makes this possible, meaning you can build any user interface, in any technology, over your underlying data model.
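
To give a flavour of what this looks like in practice, here's a minimal sketch of executing a DAX query over the model's XMLA endpoint from Python, using the open-source pyadomd wrapper around the ADOMD.NET client. The server, model and service principal details are all placeholders, and the ADOMD.NET install path will vary from machine to machine.

```python
# A minimal sketch of querying Azure Analysis Services from Python via pyadomd.
# Assumes the ADOMD.NET client libraries are installed locally and that a
# service principal has read access to the model - all names are placeholders.
from sys import path

# pyadomd needs to locate the ADOMD.NET assemblies before it is imported.
path.append(r"C:\Program Files\Microsoft.NET\ADOMD.NET\160")

from pyadomd import Pyadomd

CONN_STR = (
    "Provider=MSOLAP;"
    "Data Source=asazure://westeurope.asazure.windows.net/myserver;"
    "Initial Catalog=MyModel;"
    "User ID=app:<client-id>@<tenant-id>;"  # service principal auth
    "Password=<client-secret>;"
)

# Any DAX query works here - this one returns total sales by product category.
DAX_QUERY = """
EVALUATE
SUMMARIZECOLUMNS(
    'Product'[Category],
    "Total Sales", SUM('Sales'[Amount])
)
"""

with Pyadomd(CONN_STR) as conn:
    with conn.cursor().execute(DAX_QUERY) as cursor:
        for row in cursor.fetchall():
            print(row)
```

From there, the results can be shaped into whatever JSON or view models your custom front end needs.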

Reason 2. Providing external access

Your data insights and visualisations may be a product in their own right that you want to put in the hands of your end users (outside your own organisation). Equally, they may form just a sub-set of features in a much bigger product. Either way, whilst Power BI Embedded offers a quick way to expose your Azure Analysis Services data models, its pricing model may not be suitable for your application.

Interacting with Azure Analysis Services directly using SDKs and APIs from your custom applications means you can expose your data directly to your audience, wherever they are.
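
As an illustration, a thin API layer over the model might look something like the sketch below, using FastAPI. The run_dax helper is a hypothetical wrapper around a query execution like the one sketched above - this is an illustrative shape, not a prescribed design.

```python
# Illustrative sketch: a thin HTTP API over an Analysis Services model.
# run_dax is a hypothetical helper that executes a DAX query (e.g. via
# pyadomd, as sketched earlier) and returns rows as a list of tuples.
from fastapi import FastAPI, HTTPException

from analysis import run_dax  # hypothetical module wrapping the earlier sketch

app = FastAPI()

# A fixed, server-side query: callers get the insight, not the model itself.
SALES_BY_CATEGORY = """
EVALUATE
SUMMARIZECOLUMNS('Product'[Category], "Total Sales", SUM('Sales'[Amount]))
"""

@app.get("/sales/by-category")
def sales_by_category():
    try:
        rows = run_dax(SALES_BY_CATEGORY)
    except Exception as exc:
        raise HTTPException(status_code=502, detail=str(exc))
    return [{"category": r[0], "totalSales": r[1]} for r in rows]
```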

Reason 3. Monetising your data insights

One of the reasons to expose your data externally is, of course, to monetise access to it. If your data, or the insights you provide on top of it, are themselves an asset, then custom applications and APIs offering the same levels of analysis allow new business models to be created based on the Data Economy.


Of course, this isn't just a case of exposing Azure Analysis Services directly to the outside world. Building custom API layers over your tabular data models means you can provide access to, and charge for, your analysis capabilities and data without exposing your internal IP.

Reason 4. Supporting a developer community

Equally, you may be looking to foster innovation by opening up access to your data models and analysis to developers (internal or external). This might enable new insights to be discovered in your data, or even lead to disruptive business models using your analysis in new and unexpected ways.

Providing them with modern, RESTful APIs that expose the metadata around your semantic model allows them to query data in the way they want, using just the pieces they care about, with the tools and processes they're familiar with.
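
One way to surface that metadata is through the model's Dynamic Management Views (DMVs), which can be queried over the same XMLA connection as any DAX query. A minimal sketch, reusing the placeholder connection details from the earlier example:

```python
# Sketch: expose model metadata by querying a Dynamic Management View (DMV).
# DMVs are queried over the same XMLA connection as DAX - placeholders as before.
from pyadomd import Pyadomd  # as in the earlier query sketch

CONN_STR = "..."  # same placeholder connection string as the earlier sketch

# TMSCHEMA_MEASURES lists every measure in the model, including its DAX expression.
METADATA_QUERY = """
SELECT [Name], [Expression]
FROM $SYSTEM.TMSCHEMA_MEASURES
"""

with Pyadomd(CONN_STR) as conn:
    with conn.cursor().execute(METADATA_QUERY) as cursor:
        measures = [{"name": r[0], "expression": r[1]} for r in cursor.fetchall()]

# 'measures' could now be served from a /metadata/measures endpoint in your API.
print(measures)
```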

Reason 5. Updating the model schema

Whilst at first glance Azure Analysis Services might seem like a read-only data model, there are reasons why you might need to update parts of the model at run-time. As you'd expect, triggering the data ingestion process to refresh the data is supported through a variety of mechanisms, but it's also possible to programmatically update the underlying model schema.
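
One of those mechanisms is the asynchronous refresh REST API. Here's a minimal sketch, assuming a service principal that's an administrator on the server (region, server and model names are placeholders):

```python
# Sketch: trigger a full model refresh via the asynchronous refresh REST API.
# Assumes a service principal that is an administrator on the server.
import requests
from azure.identity import ClientSecretCredential

TENANT_ID, CLIENT_ID, CLIENT_SECRET = "<tenant>", "<client>", "<secret>"  # placeholders

credential = ClientSecretCredential(TENANT_ID, CLIENT_ID, CLIENT_SECRET)
# Tokens for the data-plane REST API use the Analysis Services resource scope.
token = credential.get_token("https://*.asazure.windows.net/.default").token

# Region, server and model names are placeholders.
url = "https://westeurope.asazure.windows.net/servers/myserver/models/MyModel/refreshes"

response = requests.post(
    url,
    headers={"Authorization": f"Bearer {token}"},
    json={"Type": "Full", "CommitMode": "transactional", "MaxParallelism": 2},
)
response.raise_for_status()
# The API is asynchronous - the Location header points at the refresh operation,
# whose status can be polled until it completes.
print(response.headers.get("Location"))
```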

Being able to perform dynamic calculations based on user input (for example, "What If?" scenarios) is a commonly requested use case. Whilst there are ways to achieve this using data parameter tables, another (better?) way to solve it is to re-define the algorithm behind a calculated column when the values (or even the entire formula) change.
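
As a sketch of what that might look like, the snippet below builds a TMSL (Tabular Model Scripting Language) createOrReplace script that redefines a calculated column's expression based on user input. Note that createOrReplace needs the full definition of the object being replaced - the table definition is trimmed here for brevity - and the resulting script would be executed over the XMLA endpoint (for example via Invoke-ASCmd in PowerShell, or the connection used in the earlier sketches).

```python
# Sketch: build a TMSL script that redefines a calculated column's expression.
# createOrReplace requires the full definition of the object being replaced -
# the table definition here is trimmed to the relevant column for brevity.
import json

# The new margin threshold comes from user input in a "What If?" scenario.
margin_threshold = 0.25

tmsl = {
    "createOrReplace": {
        "object": {"database": "MyModel", "table": "Sales"},
        "table": {
            "name": "Sales",
            # ...other columns, partitions and measures elided...
            "columns": [
                {
                    "name": "High Margin",
                    "dataType": "boolean",
                    "type": "calculated",
                    "expression": f"[Margin] >= {margin_threshold}",
                }
            ],
        },
    }
}

# The resulting JSON is sent to the server over the XMLA endpoint.
print(json.dumps(tmsl, indent=2))
```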

Reason 6. Creating data mash-ups

Azure Analysis Services' sweet spot is when all the data from all the sources is ingested into its model, creating a semantic layer for analysis. However, it might not always be possible to ingest all the data you want to analyse into Azure Analysis Services, despite its flexibility in data connectors. In these scenarios, you may have to build a custom integration from Azure Analysis Services into an additional data analysis application, using SDKs and APIs to expose query logic or data exports.
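
As a simple sketch, query results can be pulled into a pandas DataFrame and joined with data that was never ingested into the model - the column names and the external CSV below are placeholder assumptions:

```python
# Sketch: mash up Analysis Services query results with an external data set.
import pandas as pd
from pyadomd import Pyadomd  # as in the earlier query sketch

CONN_STR = "..."  # same placeholder connection string as the earlier sketch

DAX_QUERY = """
EVALUATE
SUMMARIZECOLUMNS('Product'[Category], "Total Sales", SUM('Sales'[Amount]))
"""

with Pyadomd(CONN_STR) as conn:
    with conn.cursor().execute(DAX_QUERY) as cursor:
        sales = pd.DataFrame(
            cursor.fetchall(), columns=["Category", "TotalSales"]  # assumed shape
        )

# Join with data that lives outside the model - a placeholder CSV here.
external = pd.read_csv("market_share.csv")  # e.g. columns: Category, MarketShare
mashup = sales.merge(external, on="Category")
print(mashup)
```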

Reason 7. Automating repeatable processes

The above reasons all assume that you need to move away from the out-of-the-box Azure Analysis Services integrations and connectors. Even if the likes of Power BI, Excel and developer tooling are sufficient for you to interrogate your data and perform your analysis, you might still need to automate repetitive processes - for example, scheduling the ingestion process to automate the data refresh, or automatically pausing and resuming your Azure Analysis Services instances around office hours to reduce running costs.
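
Pausing and resuming, for example, are just POST requests to the Azure Resource Manager REST API. A sketch, assuming a service principal with an appropriate role on the resource (subscription, resource group and server names are placeholders):

```python
# Sketch: pause (suspend) or resume an Azure Analysis Services server via the
# Azure Resource Manager REST API - e.g. run on a schedule around office hours.
import requests
from azure.identity import ClientSecretCredential

TENANT_ID, CLIENT_ID, CLIENT_SECRET = "<tenant>", "<client>", "<secret>"  # placeholders
SUBSCRIPTION, RESOURCE_GROUP, SERVER = "<sub-id>", "my-rg", "myserver"    # placeholders

credential = ClientSecretCredential(TENANT_ID, CLIENT_ID, CLIENT_SECRET)
token = credential.get_token("https://management.azure.com/.default").token

def set_server_state(action: str) -> None:
    """action is 'suspend' or 'resume'."""
    url = (
        f"https://management.azure.com/subscriptions/{SUBSCRIPTION}"
        f"/resourceGroups/{RESOURCE_GROUP}"
        f"/providers/Microsoft.AnalysisServices/servers/{SERVER}"
        f"/{action}?api-version=2017-08-01"
    )
    response = requests.post(url, headers={"Authorization": f"Bearer {token}"})
    response.raise_for_status()

set_server_state("suspend")  # e.g. at 19:00; set_server_state("resume") at 08:00
```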

On a wider scale, automating the entire DevOps pipeline around Azure Analysis Services is absolutely possible too - from provisioning the services inside Azure, to configuring and deploying the data models and ingesting the data. If Azure Analysis Services forms part of your line-of-business tool set, then being able to manage it with modern development processes, like any other infrastructure or application, is critical.
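
Provisioning can be scripted in the same way - the sketch below creates (or updates) a server with an ARM PUT request, reusing the token acquisition and placeholder names from the previous sketch; the SKU, location and administrator are assumptions:

```python
# Sketch: provision an Azure Analysis Services server via an ARM PUT request.
# Reuses SUBSCRIPTION, RESOURCE_GROUP, SERVER and token from the previous sketch.
import requests

url = (
    f"https://management.azure.com/subscriptions/{SUBSCRIPTION}"
    f"/resourceGroups/{RESOURCE_GROUP}"
    f"/providers/Microsoft.AnalysisServices/servers/{SERVER}"
    "?api-version=2017-08-01"
)

body = {
    "location": "westeurope",           # placeholder region
    "sku": {"name": "S0", "tier": "Standard"},  # placeholder SKU
    "properties": {
        "asAdministrators": {"members": ["admin@contoso.com"]}  # placeholder admin
    },
}

response = requests.put(url, headers={"Authorization": f"Bearer {token}"}, json=body)
response.raise_for_status()
print(response.json()["properties"]["state"])  # e.g. "Provisioning"
```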

Reason 8. Testing your algorithms

Finally, writing DAX expressions to create calculated columns and measures in your model is no different from developing any other type of code, and the same quality gates and development processes should be applied. Unit testing models in Azure Analysis Services isn't something that most people think about, but it's absolutely possible using the client SDKs and your favourite testing framework to execute queries.

Couple this with DevOps automation and the automatic ingestion of known/sample data models, and you can easily create a continuous-integration-style workflow around developing new measures or DAX calculations over your data.
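
Here's a sketch of what such a test might look like with pytest, where run_dax is the same hypothetical query helper as in the earlier sketches, and the expected values assume a known sample data set has been ingested by the pipeline:

```python
# Sketch: unit testing DAX measures with pytest against known sample data.
# run_dax is a hypothetical helper that executes a DAX query and returns rows.
from analysis import run_dax  # hypothetical module from the earlier sketches

def test_total_sales_measure_matches_sample_data():
    # The sample data set ingested by the CI pipeline sums to a known value.
    rows = run_dax('EVALUATE ROW("Total", [Total Sales])')
    assert rows[0][0] == 12345.67  # expected total for the sample data

def test_sales_row_count_matches_sample_data():
    rows = run_dax("EVALUATE ROW(\"Rows\", COUNTROWS('Sales'))")
    assert rows[0][0] == 1000  # the sample data set has a known row count
```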

Conclusion

There are numerous reasons why you might want, or need, to integrate Azure Analysis Services into a custom application. This integration could take a variety of forms - from entire applications built over the tabular data model, to background automated processes that trigger specific administrative or management functions. With the client SDKs, PowerShell cmdlets and REST APIs that Azure Analysis Services offers, any integration is possible - from data query operations, to management tasks - in a variety of ways, providing incredible flexibility for your applications and processes.


The subsequent series of technical posts explores some examples of how to achieve the results described here.

[Azure Analysis Services - Technical How-To Series]

If you're looking to get started with programmatic integration of Azure Analysis Services into your custom applications and APIs, this series of posts will cover some of the core aspects of interacting with the SDKs and APIs.

James Broome, Director of Engineering

James has spent nearly 20 years delivering high quality software solutions addressing global business problems, with teams and clients across 3 continents. As Director of Engineering at endjin, he leads the team in providing technology strategy, data insights and engineering support to organisations of all sizes - from disruptive B2C start-ups, to global financial institutions. He's responsible for the success of our customer-facing project delivery, as well as the capability and growth of our delivery team.