Power BI Governance, Good Practices, Part 1: Multiple Environments, Data Source Certification and Documentation

Power BI is taking off, and it’s fast becoming the most popular business intelligence platform on the market. It’s easy to engage with and get professional results quickly, making it the perfect tool for organisations looking to beef up their BI prowess and make data-driven decisions throughout the organisation.

Gartner 2020 Magic Quadrant for Analytics and Business Intelligence Platforms
Did you know that Gartner named Microsoft a Leader in its 2020 Magic Quadrant for Analytics and Business Intelligence Platforms?

In this post we’re going to look at three good practices for implementation and give you the tips you need to avoid common pitfalls, so you are on the fast track to success with Power BI in your organisation.

1. Setup multiple environments

When working on a Power BI implementation project, it’s wise to have multiple environments to manage the lifecycle of your BI assets. Below we’ve listed several environments that should be considered depending on the complexity of the project and your organisation’s needs.

Development (aka Dev)

Being able to keep on top of the many reports you’re testing, and having the ability to track changes that occur, is essential as you get set up. Without a dedicated Dev environment, your production environment will quickly become overwhelmed with assets, making it hard to maintain and manage.

When working in the dev environment, make sure that you have data sources specifically for development. We’ve seen production data used in dev on many occasions, which can lead to serious privacy and data sovereignty issues. Your dev data sources shouldn’t contain sensitive data.

These development environments can be on your local network or in cloud storage (like OneDrive for Business or GitHub). It is recommended to have separate Workspaces in Power BI Service for each environment.

Tip: The data sources of any reports published to the development environment in Power BI Service should be suitable for development use only and should not include confidential data.

User Acceptance Testing (aka UAT) 

The people who will be using the reports daily are the ones who should be testing them – they know the business best and will be able to identify opportunities and gaps that the development team may not spot themselves. Bringing users into the process early on maximises the value added to the business.

User acceptance testing is the last phase of testing. The UAT environment should only be created once the solution has been fully tested in Dev and approved by senior Power BI developers.

Continue reading “Power BI Governance, Good Practices, Part 1: Multiple Environments, Data Source Certification and Documentation”

Highlighting Below Avg Sales per Hierarchy Level with SWITCH() and ISINSCOPE() DAX Functions in Power BI

I was working on a project a wee bit ago in which the customer had a conditional formatting requirement on a Column Chart.
They wanted to format the columns in the chart conditionally, based on the average value at the hierarchy level they are currently looking at.
Here is the scenario: I have a Calendar hierarchy as below:

  • Calendar Hierarchy:
    • Year
    • Semester
    • Quarter
    • Month
    • Day

I use the “Adventure Works DW2017, Internet Sales” Excel file as my source in Power BI Desktop. If I want to visualise “Total Sales” over the above “Calendar Hierarchy”, I get something like this:

Line Chart in Power BI, Total Sales by Year
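
By the way, “Total Sales” here is nothing fancy. A minimal sketch of such a measure, assuming the Internet Sales table exposes a SalesAmount column:

Total Sales = SUM ( 'Internet Sales'[SalesAmount] )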

Now I activate the “Average Line” from the “Analytics” tab of the line chart.

Adding Average Line to Line Chart in Power BI

When I drill down in the line chart, the Average Line shows the average of the particular hierarchy level I am in. It is quite cool that I get the average based on the level I’m in, code-free.

Power BI, Drilling Down in Line Chart

Easy, right?

Now, the requirement is to show the above behaviour in a “Column Chart” (yes! visualising a time series with a column chart; that’s what the customer wants) and highlight the columns with values below the average amount in orange, leaving the rest in the default theme colour.

So, I need to create measures to conditionally format the column chart. I also need to add a bit of intelligence to the measures to do the following (a rough sketch of where this is heading comes right after the list):

  • Detect which hierarchy level I am in
  • Calculate the average of sales for that particular hierarchy level
  • Change the colour of the columns that are below the average amount
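
To give you an idea of the end result, the final colour measure ends up looking roughly like the sketch below. This is only a sketch under assumed names, a 'Date' table with Year, Semester, Quarter, Month and Day columns and a [Total Sales] measure; the real measures are built step by step in the rest of this post.

Below Avg Colour =
VAR AvgAtLevel =
    SWITCH (
        TRUE (),
        // check the most granular level first: at Day level, coarser
        // columns such as Year are also in scope, so the order matters
        ISINSCOPE ( 'Date'[Day] ),      AVERAGEX ( ALLSELECTED ( 'Date'[Day] ), [Total Sales] ),
        ISINSCOPE ( 'Date'[Month] ),    AVERAGEX ( ALLSELECTED ( 'Date'[Month] ), [Total Sales] ),
        ISINSCOPE ( 'Date'[Quarter] ),  AVERAGEX ( ALLSELECTED ( 'Date'[Quarter] ), [Total Sales] ),
        ISINSCOPE ( 'Date'[Semester] ), AVERAGEX ( ALLSELECTED ( 'Date'[Semester] ), [Total Sales] ),
        ISINSCOPE ( 'Date'[Year] ),     AVERAGEX ( ALLSELECTED ( 'Date'[Year] ), [Total Sales] )
    )
RETURN
    IF ( [Total Sales] < AvgAtLevel, "Orange" )

The measure returns “Orange” only for columns below the average of the level in scope, so pointing the column chart’s data colour conditional formatting (“Format by: Field value”) at it leaves the remaining columns in the default theme colour.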

Let’s get it done!

Detecting Hierarchy Level with ISINSCOPE() DAX Function

Microsoft introduced the ISINSCOPE() DAX function in the November 2018 release of Power BI Desktop. Soon after the announcement, Kasper de Jonge wrote a concise blog post about it.

So I’ll try to keep it as simple as possible. Here is how it works: the ISINSCOPE() function returns “True” when the specified column is the level in a hierarchy of levels. As stated earlier, we have a “Calendar Hierarchy” made up of the following 5 levels:

  • Year
  • Semester
  • Quarter
  • Month
  • Day

So, to determine which of the above hierarchy levels we are in, we just need to create DAX measures like the ones below:

ISINSCOPE Year		=	ISINSCOPE('Date'[Year])
ISINSCOPE Semester	=	ISINSCOPE('Date'[Semester])
ISINSCOPE Quarter	=	ISINSCOPE('Date'[Quarter])
ISINSCOPE Month		=	ISINSCOPE('Date'[Month])
ISINSCOPE Day		=	ISINSCOPE('Date'[Day])

Now let’s do an easy experiment.

  • Put a Matrix on the canvas
  • Put the “Calendar Hierarchy” to “Rows”
  • Put the above measures in “Values”
Detecting Year, Semester, Quarter, Month and Day hierarchy levels with ISINSCOPE in Power BI Desktop

As you see, the “ISINSCOPE Year” measure shows “True” for the “Year” level. Let’s expand to the next level and see how the other measures work:

Continue reading “Highlighting Below Avg Sales per Hierarchy Level with SWITCH() and ISINSCOPE() DAX Functions in Power BI”

Quick Tips: Export Power BI Desktop and Power BI Service Model Data In One-Shot with DAX Studio

Exporting Model Data to CSV or SQL Server in One Shot

Update, March 2021:

You can now export the data directly from Power BI Desktop using my tool, Power BI Exporter. Read more here.

In some of my older posts, which happen to be among the most popular ones, I explained how to export Power BI Desktop or Power BI Service data to different destinations like CSV, Excel and SQL Server. In this quick tip, I explain a very easy way to export the model data as a whole to either CSV or SQL Server with DAX Studio.

Daniil from XXL BI has explained this method well, but I’d rather quickly explain how it works and add some more information.

With release 2.8 of DAX Studio, you can now quickly export the whole model to CSV or SQL Server in one shot.
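
If you only need a single table rather than the whole model, you can get a similar result without the new button: set DAX Studio’s Output to File and run a plain EVALUATE query against that table. A minimal sketch, with the table and column names assumed from the Adventure Works example:

EVALUATE
    'Internet Sales'

-- or export only a slice of the table
EVALUATE
    FILTER ( 'Internet Sales', 'Internet Sales'[OrderDate] >= DATE ( 2013, 1, 1 ) )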

Enabling Export All Data in DAX Studio

  • Open DAX Studio
  • Click “File”
  • Click “Options”
  • Click “Advanced”
  • Tick “Show Export All Data button”
DAX Studio Export Power BI Model Data Settings

Export Power BI Model Data to CSV

DAX Studio Export Power BI Model Data to CSV
Continue reading “Quick Tips: Export Power BI Desktop and Power BI Service Model Data In One-Shot with DAX Studio”

What Does XMLA Endpoints Mean for Power BI and How to Test it for Free?

Test Environment from Power BI XMLA Endpoint

XMLA endpoint connectivity was announced for public preview in late March 2019. As of today, it is only available to Power BI Premium capacity users. This sounds like a massive restriction to the many people who don’t have a Premium capacity but would love to see how it works. In this article I show you an easy way to get your hands on the Power BI XMLA endpoint as quickly as possible. Before I start, I’d like to briefly explain what the XMLA endpoint is and what it really means for Power BI users.

Power BI is Like an Onion! It Has Layers!

Generally speaking, Power BI has two different layers: the presentation layer and the data model layer. The presentation layer is the visual layer, the one in which you build all those compelling reports and visualisations. The data model layer, as the name suggests, is the layer in which you build your data model. It is this layer that you can access via XMLA connectivity.

In a Power BI Desktop file, you can see both layers:

Different layers of Power BI

How Does XMLA Relate to the Different Layers in Power BI?

As you may have already guessed, XMLA relates only to the data model layer and has nothing to do with the presentation layer. So you can connect to a data model, browse it, import data from the model into other platforms like Excel, and so forth.

XMLA Is Not New!

Seriously? Yes, seriously. It is not new. It’s been around for many years, and perhaps you’ve already used it zillions of times. Whenever you connect to an instance of SQL Server Analysis Services, either Multidimensional or Tabular, from tools like SQL Server Management Studio (SSMS), Power BI Report Builder, Excel, Tableau and so on, you are indeed using XMLA connectivity.

Power BI is an Instance of SSAS Tabular

It is true. Power BI runs a local instance of SSAS Tabular. So, whenever you open a Power BI Desktop file (PBIX), Power BI creates a local instance of SSAS Tabular with a random local port number that can be accessed on your local machine only. When you close the file, the local instance of SSAS Tabular is shut down and its port number is released.
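
For example, once you know that port number, you can point SSMS or DAX Studio at localhost:<port> and query the model directly with DAX. A minimal sketch of such a query, assuming the model contains a 'Date' table and a [Total Sales] measure:

EVALUATE
    SUMMARIZECOLUMNS (
        'Date'[Year],
        "Total Sales", [Total Sales]
    )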

I first revealed the fact that you can connect to the underlying data model in Power BI Desktop from a whole range of different tools, like SSMS, SQL Server Profiler, Excel and so on, back in June 2016. So we have indeed been using XMLA to connect to Power BI data models for a long time. We can even take it a step further and import our Power BI data models into an instance of SSAS Tabular. In that sense, we are literally generating XMLA scripts from Power BI to create the same data model in SSAS Tabular. How cool is that?

Sooo… What is new then?

Continue reading “What Does XMLA Endpoints Mean for Power BI and How to Test it for Free?”