Microsoft Fabric: Unlocking the Secrets to Mastering Shared Semantic Models – Part 2 – Implementation

This blog series complements a YouTube tutorial I published earlier this month, where I quickly covered the scenario and implementation of shared semantic models in Microsoft Fabric. However, I realised this topic demands a more detailed explanation for those who need a deeper understanding of the processes and considerations involved in one of the most common enterprise-grade BI scenarios.

In organisations with strong security and governance requirements, implementing shared semantic models is vital to ensure seamless and secure access to data. These organisations often split roles across various teams responsible for productionising analytics solutions. Typically, they have strict Row-Level Security (RLS) and Object-Level Security (OLS) implemented in their semantic models. The goal is to enable two key groups within the organisation:

  • Report Writers: They must access the semantic models securely. This means having sufficient permissions to create reports while ensuring access is restricted to only the relevant objects and data.
  • End-Users: They need access to trustworthy and relevant information without dealing with underlying complexities. All the heavy lifting should be managed behind the scenes.
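To make the RLS and OLS requirement above a little more concrete, here is a minimal, purely illustrative sketch of a row filter defined through a Tabular Editor advanced script (C#). The role, table, and column names are hypothetical, and a real model would of course define its security roles to match its own requirements:

// Illustrative Tabular Editor advanced script (C#): define a simple RLS role.
// "DimRegion" and "RegionManagerEmail" are hypothetical names for this example.
var role = Model.AddRole("Regional Report Readers");

// Row filter: each user only sees the rows for the region they manage.
role.RowLevelSecurity[Model.Tables["DimRegion"]] =
    "[RegionManagerEmail] = USERPRINCIPALNAME()";

Report writers connecting to the shared semantic model then see only the objects and rows these roles allow, while end-users consume reports without ever dealing with the security plumbing.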

The first blog laid the groundwork by covering all the essential core concepts necessary for successfully implementing this scenario. It also provided a clear explanation of the roles involved in the process.

Blog Series Overview

Initially, I planned to cover everything in one post. However, the scope turned out to be too large, so I split it into two parts to ensure clarity and avoid overwhelming readers. Here’s what the series includes:

By the end of this blog, you will have applied the concepts from the previous post to a real-world scenario, managing secure access to shared semantic models in Microsoft Fabric, and implemented the solution step by step.

If you prefer a video format, check out the tutorial on YouTube:

For those who enjoy diving into the details, let’s get started!

Continue reading “Microsoft Fabric: Unlocking the Secrets to Mastering Shared Semantic Models – Part 2 – Implementation”

Microsoft Fabric: Unlocking the Secrets to Mastering Shared Semantic Models – Part 1 – Core Concepts


Managing and optimising shared semantic models in Microsoft Fabric, with a focus on securing access, is essential in today’s data-driven world. These models are the backbone of an organisation’s analytics, providing consistent and scalable insights across teams. Whether you’re an experienced professional or just starting with Microsoft Fabric, understanding how to manage access to shared semantic models is key to delivering impactful insights.

This blog focuses on the core concepts that are vital for building a strong foundation. These concepts are pivotal for a correct and successful implementation of shared semantic models. Without a solid grasp of these basics, it can be challenging to navigate the complexities of advanced configurations or ensure secure and efficient use of semantic models within Microsoft Fabric.

I originally planned to cover this topic in one blog, but it turned out to be too much for a single post. Splitting it into two parts allows me to explain everything clearly without making it overwhelming. Here’s what the series covers:

By the end of this blog, you’ll understand the basics of managing and optimising secured access to shared semantic models in Microsoft Fabric.

If you prefer a video format, check out the tutorial on YouTube:

For those who enjoy reading the details, keep scrolling!

Requirements

Before diving into the implementation of shared semantic models in Microsoft Fabric, it’s important to understand the prerequisites. This process has specific licensing and role requirements, which are outlined below:

  • At least a Power BI Pro license: This is the minimum requirement, as workspace functionality is available only with a Pro or higher license. For large semantic models, you will require Power BI Premium Per User (PPU) or a Fabric Capacity.
  • Microsoft Fabric Administrator role: Necessary for configuring semantic model discoverability in the Admin Portal.
  • At least Workspace Member role: Required to set permissions on the semantic models.
  • At least Workspace Contributor role: Needed to assign users and security groups to RLS (Row-Level Security) and/or OLS (Object-Level Security) roles.

Ensure that you have the proper licenses and roles assigned before starting the implementation to avoid any disruptions or limitations in managing shared semantic models.

Continue reading “Microsoft Fabric: Unlocking the Secrets to Mastering Shared Semantic Models – Part 1 – Core Concepts”

Microsoft Fabric: Capacity Cost Management Part 3, Pause Capacity During Christmas with Azure Logic Apps


In the first blog post of this series, I explained that we can Pause and Resume a Microsoft Fabric capacity from the Azure Portal. In the second blog and its accompanying YouTube video, I showed you how to automate the Pause and Resume actions in Azure Logic Apps so that the capacity starts at 8:00 AM and stops at 4:00 PM. As I mentioned in those posts, it is worth repeating that these methods only make sense for PAYG (Pay-As-You-Go) capacities and NOT for Reservation capacities. While the method works fine, it may need more fine-tuning.

Managing operational costs becomes crucial for businesses leveraging Microsoft Fabric capacities as the holiday season approaches. The challenge is maintaining efficiency while reducing unnecessary expenses, especially during Christmas, when business operations might slow down or pause entirely.

In this post and video, I will extend the discussion from my previous blog and demonstrate how to optimise your Azure Logic Apps workflow to manage Microsoft Fabric capacity during the Christmas holidays.

Extending the Logic Apps Workflow

Existing Setup Recap

In earlier discussions, we explored using Azure Logic Apps to manage Fabric capacity effectively: running it from 8:00 AM to 4:00 PM on regular business days and pausing it afterwards. This setup ensures that we are not incurring costs when the capacity is not needed, particularly from 4:00 PM to 8:00 AM the next morning and throughout the weekends. I encourage you to check out my previous post for more information. This is what the existing solution looks like in Azure Logic Apps:

Automating Microsoft Fabric Capacity with Azure Logic Apps
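
Behind the scenes, pausing a capacity is simply an Azure Resource Manager call against the capacity's suspend endpoint, which is what a Logic Apps workflow can invoke on a schedule. Purely as an illustration, here is a minimal C# sketch of that same suspend call; the subscription, resource group, capacity name, and api-version are placeholders and assumptions, so check the current ARM reference before relying on anything like this:

// Minimal C# sketch of suspending a Fabric capacity via Azure Resource Manager.
// Assumes the Azure.Identity NuGet package and an identity with sufficient rights
// on the capacity. All resource identifiers below are placeholders.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;
using Azure.Core;
using Azure.Identity;

class PauseFabricCapacity
{
    static async Task Main()
    {
        // Acquire an ARM token for the signed-in identity.
        var credential = new DefaultAzureCredential();
        var token = await credential.GetTokenAsync(
            new TokenRequestContext(new[] { "https://management.azure.com/.default" }));

        // Placeholder values; the api-version is an assumption, use the current one.
        var uri = "https://management.azure.com/subscriptions/<subscription-id>" +
                  "/resourceGroups/<resource-group>/providers/Microsoft.Fabric/capacities/<capacity-name>" +
                  "/suspend?api-version=2023-11-01";

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", token.Token);

        // The suspend action takes no body; resume works the same way against /resume.
        var response = await client.PostAsync(uri, content: null);
        Console.WriteLine($"Suspend request returned {(int)response.StatusCode}");
    }
}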

Incorporating Holiday Schedules

The key to extending this setup for the Christmas period lies in integrating specific holiday schedules into your existing workflow using the Workflow Definition Language, which is used in Azure Logic Apps and Microsoft Flow. The following expression determines whether the current date (in New Zealand Standard Time) falls within the period from December 25th of the current year to January 2nd of the next year:

and(
    greaterOrEquals(
        int(
            formatDateTime(
                convertFromUtc(utcNow(), 'New Zealand Standard Time'),
                'yyyyMMdd'
            )
        ),
        int(
            concat(
                formatDateTime(utcNow(), 'yyyy'),
                '1225'
            )
        )
    ),
    lessOrEquals(
        int(
            formatDateTime(
                convertFromUtc(utcNow(), 'New Zealand Standard Time'),
                'yyyyMMdd'
            )
        ),
        int(
            concat(
                add(
                    int(formatDateTime(utcNow(), 'yyyy')),
                    1
                ),
                '0102'
            )
        )
    )
)
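
As a quick sanity check, evaluated at 10:00 AM on 27 December 2024 (NZST) the first argument becomes 20241227, the lower bound int(concat('2024', '1225')) evaluates to 20241225, and the upper bound int(concat(add(2024, 1), '0102')) evaluates to 20250102; both comparisons succeed, so the expression returns true.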

The following section explains how the expression works.

Continue reading “Microsoft Fabric: Capacity Cost Management Part 3, Pause Capacity During Christmas with Azure Logic Apps”

Use Copilot in Power BI Desktop to Create Measures from Numeric Columns

I have been thinking about a mechanism to generate measures from numeric columns in Power BI data models. Of course, we can use Tabular Editor, but that requires some scripting, which is fine; however, the more advanced our requirements get, the more complex the C# script becomes. In real-world development scenarios, it does not make sense to blindly create measures for every numeric column; key columns used to define relationships between tables, for example, should be skipped, which makes the C# script a bit more complex.
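
To illustrate, here is a minimal sketch of the kind of Tabular Editor advanced script (C#) described above: it creates a SUM measure for each numeric column while skipping columns that participate in relationships. The naming pattern, format string, and data-type checks are illustrative choices rather than a definitive implementation:

// Sketch of a Tabular Editor advanced script (C#): create SUM measures for
// numeric columns, skipping relationship key columns and existing measures.
foreach (var table in Model.Tables)
{
    foreach (var column in table.Columns)
    {
        // Only consider numeric columns.
        if (column.DataType != DataType.Int64 &&
            column.DataType != DataType.Decimal &&
            column.DataType != DataType.Double)
            continue;

        // Skip key columns used to define relationships between tables.
        if (column.UsedInRelationships.Any())
            continue;

        // Measure names must be unique across the model; skip duplicates.
        var measureName = "Total " + column.Name;
        if (Model.AllMeasures.Any(m => m.Name == measureName))
            continue;

        var newMeasure = table.AddMeasure(
            measureName,
            string.Format("SUM('{0}'[{1}])", table.Name, column.Name));
        newMeasure.FormatString = "#,0.00";
    }
}

Even this small sketch shows how quickly the script grows once real-world exclusions come into play, which is exactly where Copilot can help.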

In this blog and accompanying YouTube video, I explain using Copilot within Power BI Desktop to create measures from numeric columns. This feature represents a significant advancement in Power BI’s capabilities as of April 2024, enabling data analysts and BI professionals to streamline parts of their data analysis tasks.

Prerequisites

As explained in a previous post here, we first need to enable Copilot on the Fabric Portal. Please note that Copilot in Power BI Desktop requires either Power BI Premium Capacity or AT LEAST an F64 Fabric Capacity. Unfortunately, Copilot is NOT available on PPU, Embedded capacities, Fabric capacities smaller than F64, or Fabric Trial (FT) capacities.

We also need to have the latest version of Power BI Desktop installed on our machine. With that, let’s begin.

YouTube Video

Here is the video on YouTube where I explain the same thing in less than 5 min. But if you are after more details, continue reading.

Introduction to Power BI and Copilot

As Power BI evolves, it incorporates more sophisticated AI-driven capabilities that simplify various aspects of data analytics. The integration of Copilot in Power BI Desktop enhances user interaction with data in many ways. Our focus in this blog is specifically on using Copilot to create simple yet crucial measures from numeric columns, a task that previously required manual effort.

Use Copilot for Measure Creation

Using Copilot is straightforward, and it demonstrates impressive intelligence in how it operates. The following steps explain the process:

Continue reading “Use Copilot in Power BI Desktop to Create Measures from Numeric Columns”