By James Broome, Director of Engineering
Azure Analysis Services - How to process an asynchronous model refresh from .NET

In previous posts, I've explained how to connect to Azure Analysis Services using .NET and PowerShell, execute DAX queries to return data, and retrieve metadata about the underlying tabular model. Being able to do all of this enables integration into custom applications and processes, meaning you can unleash the raw analytical processing power of Azure Analysis Services in a whole variety of new ways.

However, up to now, everything I've described has been based on a static, read-only model. There's no reason why you can't take things a step further and perform model updates through these same integrations - for example, updating the expression behind a calculated column based on a user-defined value in your custom application. If you do, though, the important thing to realise is that those changes won't take effect until the model and its data have been re-processed.

There are a number of ways to trigger the processing of a model, using methods such as TOM (Tabular Object Model), PowerShell cmdlets, or TMSL (Tabular Model Scripting Language), as well as client tooling like SQL Server Management Studio. However, these methods rely on long-running HTTP connections that are often unreliable - a problem because data refresh operations can take some time, depending on factors such as data volume and the level of optimisation (partitioning, for example).

REST API for Azure Analysis Services

Alongside the general purpose Azure management API, there is a specific REST API for Azure Analysis Services, which deals solely with asynchronous processing of tabular models (but uses the terminology "refresh" rather than "process").

This API is part of Azure Analysis Services itself - i.e. it's cloud-only, so isn't available for your on-premises SQL Server Analysis Services instances.

The key point is that it's asynchronous - in fact, it's the *only* way to trigger an asynchronous refresh. All the other methods (the previously mentioned SDKs, PowerShell, and so on) rely on a long-running HTTP connection, which is error-prone if data volumes are large or models contain complex calculations and data manipulations.

It's not clear if this API will evolve over time to include other useful Azure Analysis Services-specific features, but whilst it currently only serves a single purpose, it fills a gap that the other integrations don't support.

Triggering a refresh from .NET

Using any programming language that supports REST calls (for example, C#), you can use this API to perform asynchronous data refresh operations on your Azure Analysis Services tabular models.


As you'd expect, in .NET this means using an HttpClient pointed at the API to issue requests, as shown in the first code sample below. Authentication with the API is performed using a Bearer token, which I explained in a previous post (along with the necessary code sample).

public async Task<HttpClient> AuthenticateAndUseHttpClient()
{
    var client = new HttpClient
    {
        BaseAddress = new Uri($"https://{aasRegion}.asazure.windows.net/servers/{aasServerName}/models/{aasDatabaseName}/")
    };

    client.DefaultRequestHeaders.Accept.Clear();
    client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
    client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", await this.GetTokenAsync());

    // Do something with the http client here
    // i.e. issue GET or POST request...
    
    return client;
}
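The GetTokenAsync method called above isn't shown here - it's covered in that earlier authentication post. Purely as an illustrative sketch (not taken from that post), acquiring a token for a service principal using MSAL (the Microsoft.Identity.Client NuGet package) might look something like this - the client ID, tenant ID and secret are placeholders:

// A minimal sketch, assuming authentication as a service principal (app registration)
// that has rights to process the model on the Azure Analysis Services server.
// Requires the Microsoft.Identity.Client (MSAL) NuGet package.
private async Task<string> GetTokenAsync()
{
    var app = ConfidentialClientApplicationBuilder
        .Create("<client-id>")                               // placeholder app (client) ID
        .WithClientSecret("<client-secret>")                 // placeholder client secret
        .WithAuthority("https://login.microsoftonline.com/<tenant-id>")
        .Build();

    // The scope identifies the Azure Analysis Services resource
    var result = await app
        .AcquireTokenForClient(new[] { "https://*.asazure.windows.net/.default" })
        .ExecuteAsync();

    return result.AccessToken;
}

Interactive (user-based) authentication is equally valid - the only requirement is that the token presented to the API is for an identity with sufficient permissions on the model.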

As the API is asynchronous, it follows the standard "long-running operations" pattern:

  1. The client issues a POST request to start the operation.
  2. The API returns an HTTP 202 Accepted response, along with a Location header that contains the ID of the operation.
  3. The client issues GET requests to check the status of the operation using that ID.

The refresh operation can be customised via the JSON payload - for example, to target specific tables in the model, to force a "full" refresh rather than just a delta of changes, or to tweak the parallelism of processing to influence performance. The full API specification is documented here.
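As an illustration (and not lifted verbatim from the API documentation), a payload that forces a full, transactional refresh of a couple of specific tables might look something like the following - the table names are hypothetical, and the exact property names and allowed values are defined in the API specification:

// A sketch of a more targeted payload - "SalesFacts" and "DimDate" are hypothetical
// table names; check the API specification for the full set of options.
var jsonBody = @"{
    ""Type"": ""Full"",
    ""CommitMode"": ""transactional"",
    ""MaxParallelism"": 2,
    ""Objects"": [
        { ""table"": ""SalesFacts"" },
        { ""table"": ""DimDate"" }
    ]
}";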

The next code sample issues the request and returns the ID of the operation for subsequent status polling by the calling client.

public async Task<string> RefreshModelAsync()
{
    var client = await this.AuthenticateAndUseHttpClient();

    // Hard coded to sensible defaults, but could be parameterised 
    var jsonBody = @"{""Type"": ""Automatic"",""CommitMode"": ""default"",""MaxParallelism"": 10}";

    var content = new StringContent(jsonBody, Encoding.UTF8, "application/json");

    var response = await client.PostAsync("refreshes", content);

    response.EnsureSuccessStatusCode();
    Uri location = response.Headers.Location;

    return location.Segments.Last();
}

Polling the refresh status from .NET

Finally, once the refresh has started and we have the ID, we can issue subsequent requests to check the status. Depending on the complexity and scale of the model, this could take seconds, minutes, or hours! So the polling mechanism and frequency depend entirely on the specific use case.


The last code sample calls the API again to retrieve the status of the operation, using the ID previously returned.

public async Task<string> GetRefreshStatus(string id)
{
    var client = await this.AuthenticateAndUseHttpClient();

    var response = await client.GetAsync($"refreshes/{id}");

    response.EnsureSuccessStatusCode();

    var content = await response.Content.ReadAsStringAsync();

    return content;
}
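Putting the two methods together, a simple polling loop might look something like the sketch below. The 30-second delay and the string check on the response body are illustrative choices, not anything mandated by the API - the status response reports a state such as "inProgress" or "succeeded", so in anything beyond a sketch you'd deserialise the JSON properly and probably add a timeout too.

// A minimal sketch of a polling loop - the interval and exit condition are
// illustrative assumptions, not requirements of the API.
public async Task<string> RefreshAndWaitAsync()
{
    var refreshId = await this.RefreshModelAsync();

    while (true)
    {
        // Poll every 30 seconds - pick an interval that suits the scale of your model
        await Task.Delay(TimeSpan.FromSeconds(30));

        var statusJson = await this.GetRefreshStatus(refreshId);

        // Crude check on the raw JSON - deserialise and inspect the status
        // property properly in real code
        if (!statusJson.Contains("inProgress"))
        {
            return statusJson;
        }
    }
}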

Conclusion

With the REST API for Azure Analysis Services, it's possible to trigger asynchronous refreshes of your data models - essential for keeping custom applications responsive and reliable when changes are made to the underlying model.

This post shows how easy it is to call the REST API from a .NET application, giving you everything you need to get started with more complex Azure Analysis Services integration scenarios.

James Broome

Director of Engineering

James has spent nearly 20 years delivering high quality software solutions addressing global business problems, with teams and clients across 3 continents. As Director of Engineering at endjin, he leads the team in providing technology strategy, data insights and engineering support to organisations of all sizes - from disruptive B2C start-ups, to global financial institutions. He's responsible for the success of our customer-facing project delivery, as well as the capability and growth of our delivery team.