I have been using SQL Server Profiler to diagnose my data models in Power BI Desktop for a long time. I wrote a blog post back in June 2016 about connecting to the underlying Power BI Desktop model from different tools, including SQL Server Management Studio (SSMS), Excel and SQL Server Profiler. In this quick post, I share a pbitool.json file that you can use to register SQL Server Profiler as an external tool. Read more about how to register an external tool here. This is quite handy, as it lets you use SQL Server Profiler to diagnose Power BI Desktop without having to find the diagnostic port yourself; as an external tool, SQL Server Profiler automatically connects to the data model via the diagnostic port. You can download the sqlserverprofiler.pbitool.json file from here. After you download the file, you can open it in a text editor to see or modify the JSON code. If you are using SSMS 18, you do not even need to modify the file. If you use a different version, the only thing you have to change is the “path”.
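If you are curious about what the file contains, a [tool].pbitool.json file is a small JSON document with a handful of fields. The sketch below only illustrates the general shape: the path shown is a typical SSMS 18 location, and the arguments and iconData values are placeholders for what already ships in the downloaded file, so do not copy them verbatim.

{
    "version": "1.0.0",
    "name": "SQL Server Profiler",
    "description": "Open SQL Server Profiler connected to the current Power BI Desktop model",
    "path": "C:\\Program Files (x86)\\Microsoft SQL Server Management Studio 18\\Common7\\Profiler.exe",
    "arguments": "<keep the arguments value from the downloaded file>",
    "iconData": "<keep the base64 icon value from the downloaded file>"
}

In practice, path is the only value you normally need to touch, pointing it at Profiler.exe for your SSMS version.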
Continue reading “Quick Tips: Registering SQL Server Profiler as an External Tool in Power BI Desktop”
Microsoft recently announced some very exciting news: Azure Purview now supports Power BI. This is massive news from a data governance point of view. Azure Purview is the next generation of Azure Data Catalog, with more metadata discovery power and the ability to use sensitivity labels. After reading the news, I immediately decided to set up my test environment and give it a go. I followed the steps mentioned in this article on the Microsoft documentation website, but I faced some difficulties getting it to work. And here we are, another blog post to help you set up Azure Purview for Power BI.
Note: In this blog post I am not intending to explain what Azure Purview is. You can find heaps of useful information here.
Creating an Azure Purview Resource
We first need an Azure subscription. If you don’t have one, don’t worry; you can start your Azure free trial subscription here. The following steps explain how to set up Azure Purview for Power BI. Log in to the Azure Portal, then:
1. Click the Create a resource button
2. Type in Purview in the search box
3. Click Azure Purview
4. Click the Create button
5. Select your Subscription
6. Select a Resource group or create a new one if you don’t have any
7. Type a name in the Purview account name text box
8. Select the Location
9. Click Review + Create (if you need to configure more options, click the Next: Configuration > button)
At this point, Azure validates the configuration and requirements. You may get an error message like the one below:
Continue reading “Power BI Governance, Good Practices: Setting up Azure Purview for Power BI”
I have been involved in a Power BI development over the past few days. I got some data exported from various systems in different formats, including Excel, CSV and OData. The CSV files are data export dumps from an ERP system. Working with ERP systems can be very time consuming, especially when you don’t have access to the data model and you get the data in raw format in CSV files. It is challenging, as in ERP systems the table names and column names are not user friendly at all, which makes sense: ERP systems are used in various environments for many different customers with different requirements, so if we could get our hands on the underlying data model, we would see configuration tables keeping the column names. Some of the columns are custom built to cover specific needs. The tables may have many columns that are not necessarily useful for analytical purposes, so it is quite critical to have a good understanding of the underlying entity model. Anyhow, I don’t want to go off-topic.
The Problem
So, here is my scenario. I received about 10 files containing 15 tables. Some tables are quite small, so I didn’t bother with them. But some of them are really wide, with between 150 and 208 columns. Nice!
The column names could hardly be more difficult to read, and I have multiple tables like that. So I have to rename those columns to something more readable; more on this side of the story later.
Background
I emailed my customer back, asking for their help. Luckily they have a very good data expert who understands their ERP system as well as the underlying entity model. I emailed him all the current column names and asked if he could provide more user-friendly names. He replied with a mapping table in Excel. Here is an example of the Column Names Mapping table:
I was quite happy with the mapping table. Now the next step is to rename all the columns based on the mapping table. Ouch! I have almost 800 columns to rename. That is literally a pain in the neck, and it doesn’t sound quite right to burn project time renaming 800 columns.
But wait, what about automating the rename process? Like writing a custom function to rename all columns at once? I recall reading an excellent blog post about renaming multiple columns in Power Query that Gilbert Quevauvilliers wrote in 2018; I definitely recommend looking at it. So I must do something similar to what Gilbert did: create a custom function that takes the original column names and brings back the new names, then use the custom function in each table to rename the columns. Easy!
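To give you a feel for the idea, here is a minimal sketch of such a function; it is not the final implementation from the full post, and the mapping table column names “Original Column Name” and “New Column Name” are my assumptions, so adjust them to match your own mapping table.

// fnRenameAllColumns: a minimal sketch of the idea, not the final implementation
(SourceTable as table, ColumnMapping as table) as table =>
let
    // Turn the mapping table into a list of {old name, new name} pairs;
    // the column names "Original Column Name" and "New Column Name" are assumptions
    RenamePairs = List.Transform(
        Table.ToRecords(ColumnMapping),
        each {[#"Original Column Name"], [#"New Column Name"]}
    ),
    // Rename every matched column in one go; ignore mapped names missing from this table
    Renamed = Table.RenameColumns(SourceTable, RenamePairs, MissingField.Ignore)
in
    Renamed

Invoking a function like this as a step in each query renames every matched column in a single step instead of 800 manual renames.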
Continue reading “A Power Query Custom Function to Rename all Columns at Once in a Table”
I have been working with OData data sources in Power BI for a while. One challenge is that I almost always do not have a good understanding of the underlying data model. It can be really hard and time consuming if there is no one in the business who understands the underlying data model. I know we can use $metadata to get the metadata schema from the OData feed, but let’s not go there. I am not an OData expert, but here is the thing for someone like me: I work with various data sources that I am not necessarily an expert in, yet I need to understand what the entities are, how they are connected and so on. What if I do not have access to any SMEs (Subject Matter Experts) who can help me with that?
So I dug into more OData options in Power Query; let’s get into it.
The custom function below accepts an OData URL, then discovers all tables, their column count, their row count (more on this later), the number and list of related tables, and the number and list of columns of type text, type number and Decimal.Type.
// fnODataFeedAnalyser
(ODataFeed as text) =>
let
Source = OData.Feed(ODataFeed),
SourceToTable = Table.RenameColumns(
Table.DemoteHeaders(Table.FromValue(Source)),
{{"Column1", "Name"}, {"Column2", "Data"}}
),
FilterTables = Table.SelectRows(
SourceToTable,
each Type.Is(Value.Type([Data]), Table.Type) = true
),
SchemaAdded = Table.AddColumn(FilterTables, "Schema", each Table.Schema([Data])),
TableColumnCountAdded = Table.AddColumn(
SchemaAdded,
"Table Column Count",
each Table.ColumnCount([Data]),
Int64.Type
),
TableCountRowsAdded = Table.AddColumn(
TableColumnCountAdded,
"Table Row Count",
each Table.RowCount([Data]),
Int64.Type
),
NumberOfRelatedTablesAdded = Table.AddColumn(
TableCountRowsAdded,
"Number of Related Tables",
each List.Count(Table.ColumnsOfType([Data], {Table.Type}))
),
ListOfRelatedTables = Table.AddColumn(
NumberOfRelatedTablesAdded,
"List of Related Tables",
each
if [Number of Related Tables] = 0 then
null
else
Table.ColumnsOfType([Data], {Table.Type}),
List.Type
),
NumberOfTextColumnsAdded = Table.AddColumn(
ListOfRelatedTables,
"Number of Text Columns",
each List.Count(Table.SelectRows([Schema], each Text.Contains([Kind], "text"))[Name]),
Int64.Type
),
    ListOfTextColumnsAdded = Table.AddColumn(
NumberOfTextColumnsAdded,
"List of Text Columns",
each
if [Number of Text Columns] = 0 then
null
else
Table.SelectRows([Schema], each Text.Contains([Kind], "text"))[Name]
),
NumberOfNumericColumnsAdded = Table.AddColumn(
        ListOfTextColumnsAdded,
"Number of Numeric Columns",
each List.Count(Table.SelectRows([Schema], each Text.Contains([Kind], "number"))[Name]),
Int64.Type
),
    ListOfNumericColumnsAdded = Table.AddColumn(
NumberOfNumericColumnsAdded,
"List of Numeric Columns",
each
if [Number of Numeric Columns] = 0 then
null
else
Table.SelectRows([Schema], each Text.Contains([Kind], "number"))[Name]
),
NumberOfDecimalColumnsAdded = Table.AddColumn(
        ListOfNumericColumnsAdded,
"Number of Decimal Columns",
each List.Count(
Table.SelectRows([Schema], each Text.Contains([TypeName], "Decimal.Type"))[Name]
),
Int64.Type
),
    ListOfDecimalColumnsAdded = Table.AddColumn(
NumberOfDecimalColumnsAdded,
"List of Decimal Columns",
each
if [Number of Decimal Columns] = 0 then
null
else
Table.SelectRows([Schema], each Text.Contains([TypeName], "Decimal.Type"))[Name]
),
#"Removed Other Columns" = Table.SelectColumns(
        ListOfDecimalColumnsAdded,
{
"Name",
"Table Column Count",
"Table Row Count",
"Number of Related Tables",
"List of Related Tables",
"Number of Text Columns",
"List of Text Columns",
"Number of Numeric Columns",
"List of Numeric Columns",
"Number of Decimal Columns",
"List of Decimal Columns"
}
)
in
#"Removed Other Columns"
Continue reading “Quick Tips: OData Feed Analyser Custom Function in Power Query”