Microsoft Fabric: Capacity Cost Management Part 2, Automate Pause/Resume Capacity with Azure Logic Apps


In the previous blog post, I explained Microsoft Fabric capacities, shedding light on the diverse capacity options and how they influence data projects. We delved into Capacity Units (CUs), pricing nuances, and practical cost-control methods, including manually scaling and pausing Fabric capacity. Now, we’re taking the next step in our Microsoft Fabric journey by exploring how to automate the pause and resume process. In this blog post, we’ll unlock the secrets to seamlessly managing our Fabric capacity with automation that saves time and resources while optimising our data and analytics workloads.

Right off the bat, this is a rather long blog post, so I have added a bonus section at the end for those who read it from beginning to end. With that, let’s dive in!

The Problem

As we learned in the previous blog post, one way to manage our Fabric capacity costs is to pause the capacity while it is not in use and resume it when needed. While this helps with cost management, it is a manual process, so it is prone to human error, which makes it impractical in the long run.

The Solution

A more practical solution is to automate a daily process that pauses and resumes our Fabric capacity. This can be done by calling the Azure Management APIs. Depending on our expertise, there are several ways to achieve the goal, such as running the APIs via PowerShell (scheduling the runs separately), running them via Cloud Shell, creating a flow in Power Automate, or creating a workflow in Azure Logic Apps. I prefer the latter, so this blog post explains that method.
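For a sense of what those calls look like, below is a minimal PowerShell sketch using the Az module’s Invoke-AzRestMethod. The resource group name, capacity name, and api-version are illustrative placeholders, so verify them against your own Azure environment before running anything:

```powershell
# A minimal sketch; "my-rg", "myfabriccapacity" and the api-version are
# placeholders you would replace with your own values.
Connect-AzAccount

$sub  = (Get-AzContext).Subscription.Id
$base = "/subscriptions/$sub/resourceGroups/my-rg" +
        "/providers/Microsoft.Fabric/capacities/myfabriccapacity"

# Pause (suspend) the capacity
Invoke-AzRestMethod -Path "$base/suspend?api-version=2022-07-01-preview" -Method POST

# Resume the capacity
Invoke-AzRestMethod -Path "$base/resume?api-version=2022-07-01-preview" -Method POST
```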

I also explain the same scenario on my YouTube channel. Here is the video:

Automating Pause and Resume Fabric Capacity with Azure Logic Apps

Here is the scenario: we are going to create an Azure Logic Apps workflow that automatically does the following (a PowerShell sketch of the same logic follows the list):

  • Check the time of the day
  • If it is between 8 am and 4 pm:
    • Check the status of the Fabric capacity
    • If the capacity is paused, resume it; otherwise, do nothing
  • If it is after 4 pm or before 8 am:
    • Check the status of the Fabric capacity
    • If the capacity is running, pause it; otherwise, do nothing
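Before building the workflow in the Logic Apps designer, it may help to see the same decision logic as plain code. Here is a rough PowerShell equivalent; the properties.state values (“Active”, “Paused”) are assumptions about the Fabric capacity resource, so check them against an actual API response in your environment:

```powershell
# Rough sketch of the workflow's branching; $base is the capacity resource path
# from the earlier snippet, and the state values are assumed, not guaranteed.
$hour  = (Get-Date).Hour
$resp  = Invoke-AzRestMethod -Path "${base}?api-version=2022-07-01-preview" -Method GET
$state = ($resp.Content | ConvertFrom-Json).properties.state

if ($hour -ge 8 -and $hour -lt 16) {
    # Working hours: resume the capacity if it is paused
    if ($state -eq 'Paused') {
        Invoke-AzRestMethod -Path "$base/resume?api-version=2022-07-01-preview" -Method POST
    }
}
else {
    # Outside working hours: pause the capacity if it is running
    if ($state -eq 'Active') {
        Invoke-AzRestMethod -Path "$base/suspend?api-version=2022-07-01-preview" -Method POST
    }
}
```

In the Logic Apps workflow, the same shape would appear as a Recurrence trigger, an HTTP action that reads the capacity state, and Condition actions that call the suspend or resume endpoint.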

Follow these steps to implement the scenario in Azure Logic Apps:

  1. Log in to the Azure Portal and search for “Logic App”
  2. Click the Logic App service
Finding Logic Apps on Azure Portal

This navigates us to the Logic App service. If you have any existing Logic Apps workflows, they will appear here.

Continue reading “Microsoft Fabric: Capacity Cost Management Part 2, Automate Pause/Resume Capacity with Azure Logic Apps”

Microsoft Fabric: Capacity Options and Cost Management, Part 1; The Basics


Microsoft Fabric is a SaaS platform that allows users to get, create, share, and visualise data using a wide set of tools. It provides a unified solution for all our data and analytics workloads, from data ingestion and transformation to data engineering, data science, data warehousing, real-time analytics, and data visualisation. In a previous blog post, I explained the basics of the Microsoft Fabric data platform. In a separate blog post, I covered some Microsoft Fabric terminologies and personas, including what tenants and capacities are.

In this blog post, we will explore the different types of Fabric capacities, how they affect the performance and cost of our Fabric projects, and how you can control the capacity costs by pausing the capacity in Azure when it is not in use.

Fabric capacity types

Fabric capacities are the compute resources that power all the experiences in Fabric. They are available in different sizes and prices, depending on our needs and budget. We can currently obtain Fabric capacities in more than one way; in this post, we focus on capacities purchased on Azure.

If we want to purchase Microsoft Fabric capacities on Azure, they come in SKUs (Stock Keeping Units) sized from F2 to F2048, representing 2 to 2048 Capacity Units (CUs). A CU is a unit of measure for the compute power available to a Fabric capacity. The higher the CU, the more resources our Fabric projects get. For example, an F8 capacity has 8 CUs, which means it is four times more powerful than an F2 capacity, which has 2 CUs.

When purchasing Azure SKUs with a pay-as-you-go subscription, we are billed for compute power (the size of the capacity we choose) and for OneLake storage, which is charged per gigabyte of data stored per month (approximately NZD 0.043 per GB). OneLake is the unified storage layer for all the Fabric workloads. It allows us to store and access our data in a secure, scalable, and cost-effective way.

Azure Fabric capacity prices vary by region. The pay-as-you-go pricing for a Fabric capacity in the Australia East region is $0.3605 (NZD) per CU per hour, which translates to a monthly price of $526.33 (NZD) for an F2 ($0.3605 × 2 CUs × 730 hours).
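As a quick sanity check on that arithmetic, the snippet below extends the same formula to a few SKU sizes. The NZD 0.3605 rate is the Australia East figure quoted above; rates vary by region and change over time, so treat the outputs as illustrative:

```powershell
# Back-of-the-envelope PAYG monthly cost: rate per CU-hour x CUs x 730 hours.
$ratePerCUHour = 0.3605   # NZD, Australia East rate quoted above
$hoursPerMonth = 730

foreach ($cu in 2, 8, 64) {
    $monthly = $ratePerCUHour * $cu * $hoursPerMonth
    "F{0}: {1:N2} NZD/month" -f $cu, $monthly   # F2: 526.33, F8: 2,105.32, F64: 16,842.56
}
```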

Microsoft Fabric pricing overview

It is important to note that billing is per second with a one-minute minimum, so unless we pause the capacity, we are billed even when it is not in use. The full list of prices is available on the Azure portal by selecting our Fabric capacity’s region.

Now that we have an indication of the costs of owning Microsoft Fabric capacities, let’s explore the methods to control them.

Nuances of Fabric’s Cost of Ownership

It is important to note that all the math we have gone through in the previous section covers only the capacity itself. But are there any other costs that may apply? The answer is: it depends. If we obtain any SKU lower than an F64, we must buy Power BI Pro licenses per user on top of the capacity costs. From F64 upwards, we get unlimited free consumers, but, and this is an important but, we still have to purchase Power BI Pro licenses for all developers on top of the cost of the capacity itself.

Another gotcha is that the Fabric experiences are unavailable to both Power BI Premium Per User (PPU) users and Power BI Embedded capacities. Just be mindful of that.

The good news for organisations owning Power BI Premium capacities is that they do not need to do anything to leverage Fabric capabilities. As a matter of fact, if you own a Premium capacity, you already own a Fabric capacity; you just need to enable it on your tenant.

Continue reading “Microsoft Fabric: Capacity Options and Cost Management, Part 1; The Basics”

Integrating Power BI with Azure DevOps (Git), part 1: Cloud Integration


Power BI is a powerful tool for creating and sharing interactive data visualisations. But how can you collaborate with other developers on your Power BI projects and ensure quality and consistency across your reports? In this series of blog posts, I will show you how to integrate Power BI with Azure DevOps, a cloud-based software development and delivery platform. We can integrate Azure DevOps with both the Power BI Service (Fabric) and Power BI Desktop.
The current post explains how to set up Azure DevOps and connect a Power BI Workspace.
The next blog post will explain how to use it on your local machine to integrate your Power BI Desktop projects with Azure DevOps.

A brief history of source control systems

Before we dive into the details of Power BI and Azure DevOps integration, let’s take a moment to understand what source control systems are and why they are essential for any software project.

Source control systems, also known as version control systems or revision control systems, are tools that help developers manage the changes made to their code over time. They allow developers to track, compare, and roll back changes when necessary and collaborate with other developers on the same project.

There are two main types of source control systems: centralised and distributed. Centralised source control systems use a client-server approach to store all the code and its history on a single server, and developers need to connect to that server to access or modify the code. Examples of centralised source control systems are Microsoft’s Team Foundation Server (TFS), which was rebranded as Azure DevOps Server in 2018, IBM’s ClearCase, and Apache’s Subversion.

On the other hand, distributed source control systems use a peer-to-peer approach, allowing each developer to have a local copy of the entire code repository, including its history. Developers can work offline and sync their changes with other developers through a remote server. Examples of distributed source control systems are Git and Mercurial, which takes us to the next section. Let’s see what Git is.

What is Git, and why use it?

Git is one of the world’s most popular and widely used distributed source control systems. It was created by Linus Torvalds, the creator of Linux, in 2005. Git has many advantages over centralised source control systems, such as:

  • Speed: Git is fast and efficient, performing most operations locally without network access (see the short offline demo after this list).
  • Scalability: Git can easily handle large and complex projects, as it does not depend on a single server.
  • Flexibility: Git supports various workflows and branching strategies, allowing developers to choose how they want to organise their code and collaborate with others.
  • Security: Git uses cryptographic hashes to ensure the integrity and authenticity of the code.
  • Open-source: Git is free and open-source, meaning anyone can use it, modify it, or contribute to it.
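To make the “local-first” point concrete, here is a tiny offline demo; demo-repo and readme.md are throwaway names used purely for illustration:

```powershell
# Everything below runs locally; no server or network connection is involved.
git init demo-repo
Set-Location demo-repo

"# Hello Git" | Out-File -FilePath readme.md

git add readme.md
git commit -m "Initial commit"

git log --oneline    # the full history lives in the local .git folder
git rev-parse HEAD   # the commit's cryptographic hash, guarding integrity
```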

While Git is pretty good, it has some disadvantages compared with centralised source control systems. Here are some:

  • Complexity: Git has a steep learning curve, especially for users who are new to distributed version control systems. Understanding concepts such as branching, merging, rebasing, and resolving conflicts can be challenging for beginners and sometimes even seasoned Git users.
  • Collaboration challenges: While distributed version control systems like Git enable easy collaboration, they can also lead to collaboration issues. Multiple developers working on the same branch simultaneously may encounter conflicts that need to be resolved, which can introduce complexities and require extra effort.
  • Performance with large repositories: While Git performs well for most operations, it can become sluggish when working with large repositories containing many files or a long history of commits. Operations such as cloning or checking out large repositories can be time-consuming.

What is Azure DevOps, and how does it relate to Git?

Azure DevOps is Microsoft’s cloud-based platform providing a set of tools and services for software development. It encompasses a range of capabilities for managing, planning, developing, testing, and delivering software applications. Azure DevOps offers:

  • Azure Boards: A tool for planning, tracking, and managing work items, such as user stories, tasks, bugs, etc.
  • Azure Repos: A tool for hosting Git repositories online, which is the main focus of this blog post.
  • Azure Pipelines: A tool for automating builds, tests, and deployments.
  • Azure Test Plans: A tool for creating and running manual and automated tests.
  • Azure Artifacts: A tool for managing packages and dependencies.

Azure DevOps also integrates with other tools and platforms, such as GitHub, Visual Studio Code, and now, Power BI. This takes us to the next section of this blog post, Integrating Power BI with Azure DevOps.

How to integrate Power BI with Azure DevOps

Now that we understand what Git and Azure DevOps are, let’s see how we can integrate Power BI with Azure DevOps.

Power BI integrates with Azure DevOps in two different ways: cloud integration and local machine integration. They have the following requirements.

Prerequisites

To follow along with this tutorial, you will need:

  • In the cloud:
    • An Azure DevOps Service
    • A Power BI account with one of the following licenses to enable Power BI Workspace integration with Azure DevOps:
      • Power BI PPU (Premium Per User)
      • Premium Capacity
      • Embedded Capacity (EM/A)
      • Fabric Capacity
  • On your local machine:
    • The latest version of Power BI Desktop (June 2023 or later)
    • Either Visual Studio or VS Code

As stated earlier, this post explains the cloud integration part. Therefore, we require an Azure DevOps Service and a Power BI account with one of the licensing plans mentioned above to integrate Power BI with Azure DevOps.

In the following few sections, we look into more details and go through them together step-by-step.

Continue reading “Integrating Power BI with Azure DevOps (Git), part 1: Cloud Integration”

What Do XMLA Endpoints Mean for Power BI and How to Test Them for Free?

Test Environment from Power BI XMLA Endpoint

XMLA endpoint connectivity was announced for public preview in late March 2019. As of today, it is only available to Power BI Premium capacity users. This sounds like a massive restriction to the many people who don’t have a Premium capacity but would love to see how it works. In this article, I show you an easy way to get your hands on the Power BI XMLA endpoint as quickly as possible. Before I start, I’d like to briefly explain what the XMLA endpoint is and what it really means for Power BI users.

Power BI is Like an Onion! It Has Layers!

Generally speaking, Power BI has two different layers: the presentation layer and the data model layer. The presentation layer is the visual layer, the one in which you make all those compelling reports and visualisations. The data model layer, as the name suggests, is the layer in which you build your data model. This is the layer you can access via XMLA connectivity.

In a Power BI Desktop file, you can see both layers:

Different layers of Power BI

How Does XMLA Relate to the Different Layers in Power BI?

As you may have already guessed, XMLA relates only to the data model layer and has nothing to do with the presentation layer. So you may connect to a data model, browse it, import data from it to other platforms like Excel, and so forth.

XMLA Is Not New!

Seriously? Yes, seriously. It is not new. It has been around for many years, and perhaps you’ve already used it zillions of times. Whenever you connect to an instance of SQL Server Analysis Services, either Multidimensional or Tabular, from tools like SQL Server Management Studio (SSMS), Power BI Report Builder, Excel, Tableau, etc., you are indeed using XMLA connectivity.

Power BI is an Instance of SSAS Tabular

It is true. Power BI runs a local instance of an SSAS Tabular model. So, whenever you open a Power BI Desktop file (PBIX), Power BI creates a local instance of SSAS Tabular with a random port number that can be accessed on your local machine only. When you close the file, the local instance of SSAS Tabular is shut down and its port number is released.
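If you want to find that local port yourself, the sketch below is one way to do it with PowerShell. It assumes the downloaded (non-Microsoft Store) installation of Power BI Desktop, which writes the port number to an msmdsrv.port.txt file in its workspace folder; Store installations keep this folder under a different LocalAppData path:

```powershell
# List the local port(s) of SSAS Tabular instances hosted by currently open
# Power BI Desktop files. The path is typical for the non-Store installation.
$workspaces = "$env:LOCALAPPDATA\Microsoft\Power BI Desktop\AnalysisServicesWorkspaces"

Get-ChildItem -Path $workspaces -Recurse -Filter "msmdsrv.port.txt" -ErrorAction SilentlyContinue |
    ForEach-Object {
        # The file is UTF-16 encoded and contains only the port number
        $port = (Get-Content -Path $_.FullName -Encoding Unicode).Trim()
        "Connect to: localhost:$port"
    }
```

Point SSMS (server type: Analysis Services) or Excel at the resulting localhost:<port> address, and you are browsing the PBIX’s data model over XMLA.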

I first revealed the fact that you can connect to the underlying data model in Power BI Desktop from a whole range of tools like SSMS, SQL Server Profiler, Excel, etc. back in June 2016. So we have indeed been using XMLA to connect to Power BI data models for a long time. We can even take it a step further and import our Power BI data models into an instance of SSAS Tabular. In that sense, we are literally generating XMLA scripts from Power BI to create the same data model in SSAS Tabular. How cool is that?

Sooo… What is new then?

Continue reading “What Do XMLA Endpoints Mean for Power BI and How to Test Them for Free?”