Integrating Power BI with Azure DevOps (Git), part 2: Local Machine Integration

This is the second part of the series of blog posts showing how to integrate Power BI with Azure DevOps, a cloud platform for software development. The previous post gave a brief history of source control systems, which help developers manage code changes. It also explained what Git is, a fast and flexible distributed source control system, and why it is useful. It introduced the initial configurations required in Azure DevOps and explained how to integrate Power BI (Fabric) Service with Azure DevOps.

This blog post explains how to synchronise an Azure DevOps repository with your local machine to integrate your Power BI Projects with Azure DevOps. Before we start, we need to know what a Power BI Project is and how we can create it.

What is a Power BI Project (Developer Mode)?

Power BI Project (*.PBIP) is a new file format for Power BI Desktop that was announced in May 2023 and made available for public preview in June 2023. It allows us to save our work as a project, which consists of a folder structure containing individual text files that define the report and dataset artefacts. This enables us to use source control systems, such as Git, to track changes, compare revisions, resolve conflicts, and review changes. It also enables us to use text editors, such as Visual Studio Code, to edit the artefact definitions more productively and programmatically. Additionally, it supports CI/CD (continuous integration and continuous delivery), where we submit changes to a series of quality gates before applying them to the production system.

PBIP files differ from the regular Power BI Desktop files (PBIX), which store the report and dataset artefacts as a single binary file. That single binary format makes integrating with source control systems, text editors, and CI/CD systems difficult. PBIP aims to overcome these limitations and provide a more developer-friendly experience for Power BI Desktop users.
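
For illustration, saving a report called Sales as a Power BI Project produces a folder layout along the following lines. This is a sketch rather than a definitive listing; the exact files vary between Power BI Desktop releases:

    Sales.pbip                  <- pointer file that opens the project in Power BI Desktop
    Sales.Report\
        report.json             <- the report definition (pages, visuals) stored as text
    Sales.Dataset\
        model.bim               <- the dataset (data model) definition stored as text

Because every artefact is plain text, Git can track, diff, and merge these files like any other source code.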

Since this feature is still in public preview at the time of writing this blog post, we have to enable it from the Power BI Desktop Options and settings.

Enable Power BI Project (Developer Mode) (Currently in Preview)

As mentioned, we first need to enable the Power BI Project (Developer Mode) feature, introduced for public preview in the June 2023 release of Power BI Desktop. Power BI Project files allow us to save our work as *.PBIP files, deconstructing the legacy Power BI report files (*.PBIX) into well-organised folders and files.
With this feature, we can:

  • Edit individual components of our Power BI file, such as data sources, queries, data model, visuals, etc.
  • Use any text editor or IDE to edit our Power BI file
  • Compare and merge changes (see the diff sketch after this list)
  • Collaborate with other developers on the same Power BI file
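
To make the comparison point concrete, here is a hypothetical Git diff of a one-line change to a page name inside report.json. The file path and property are illustrative, not an exact reference to the Power BI report schema:

    $ git diff Sales.Report/report.json
    -      "displayName": "Sales by Region",
    +      "displayName": "Sales by Country",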

To enable Power BI Project (Developer Mode), follow these steps in Power BI Desktop:

Continue reading “Integrating Power BI with Azure DevOps (Git), part 2: Local Machine Integration”

Integrating Power BI with Azure DevOps (Git), part 1: Cloud Integration


Power BI is a powerful tool for creating and sharing interactive data visualizations. But how can you collaborate with other developers on your Power BI projects and ensure quality and consistency across your reports? In this series of blog posts, I will show you how to integrate Power BI with Azure DevOps, a cloud-based software development and delivery platform. We can integrate Azure DevOps with Power BI Service (Fabric) as well as Power BI Desktop.
The current post explains how to set up Azure DevOps and connect a Power BI Workspace.
The next blog post will explain how to use it on your local machine to integrate your Power BI Desktop projects with Azure DevOps.

A brief history of source control systems

Before we dive into the details of Power BI and Azure DevOps integration, let’s take a moment to understand what source control systems are and why they are essential for any software project.

Source control systems, also known as version control systems or revision control systems, are tools that help developers manage the changes made to their code over time. They allow developers to track, compare, and roll back changes when necessary and collaborate with other developers on the same project.

There are two main types of source control systems: centralised and distributed. Centralised source control systems use a client-server approach, storing all the code and its history on a single server; developers need to connect to that server to access or modify the code. Examples of centralised source control systems are Microsoft’s Team Foundation Server (TFS), which was rebranded as Azure DevOps Server in 2018, IBM’s ClearCase, and Apache’s Subversion.

On the other hand, distributed source control systems use a peer-to-peer approach, allowing each developer to have a local copy of the entire code repository, including its history. Developers can work offline and sync their changes with other developers through a remote server. Examples of distributed source control systems are Git and Mercurial, which takes us to the next section. Let’s see what Git is.
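
As a minimal sketch of this distributed workflow, the commands below clone a repository, record a change locally with no network access, and then push it to the remote server. The repository URL, branch, and file names are placeholders:

    git clone https://example.com/my-repo.git   # one-time: copies the full repository, history included
    git checkout -b my-feature                  # create a local branch; works entirely offline
    git add report.json                         # stage a changed file
    git commit -m "Update report"               # record the change in the local history
    git push origin my-feature                  # sync with the remote once back online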

What is Git, and why use it?

Git is one of the world’s most popular and widely used distributed source control systems. It was created by Linus Torvalds, the creator of Linux, in 2005. Git has many advantages over centralised source control systems, such as:

  • Speed: Git is fast and efficient, performing most operations locally without network access.
  • Scalability: Git can easily handle large and complex projects, as it does not depend on a single server.
  • Flexibility: Git supports various workflows and branching strategies, allowing developers to choose how they want to organise their code and collaborate with others.
  • Security: Git uses cryptographic hashes to ensure the integrity and authenticity of the code.
  • Open-source: Git is free and open-source, meaning anyone can use it, modify it, or contribute to it.

While Git is pretty good, it has some disadvantages compared with a centralised source control system. Here are some:

  • Complexity: Git has a steep learning curve, especially for users who are new to distributed version control systems. Understanding concepts such as branching, merging, rebasing, and resolving conflicts can be challenging for beginners and sometimes even seasoned Git users.
  • Collaboration challenges: While distributed version control systems like Git enable easy collaboration, they can also lead to collaboration issues. Multiple developers working on the same branch simultaneously may encounter conflicts that need to be resolved, which can introduce complexities and require extra effort (see the example after this list).
  • Performance with large repositories: While Git performs well for most operations, its performance can degrade when working with large repositories containing many files or a long history of commits. Operations such as cloning or checking out large repositories can be time-consuming.
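
For example, when two developers change the same line, Git marks the file with conflict markers like the ones below (the branch name and values are made up), and someone must pick one side, or combine them, before committing:

    <<<<<<< HEAD
        "title": "Sales Overview"
    =======
        "title": "Sales Summary"
    >>>>>>> feature/rename-report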

What is Azure DevOps, and how does it relate to Git?

Azure DevOps is Microsoft’s cloud-based platform providing a set of tools and services for software development. It encompasses a range of capabilities for managing, planning, developing, testing, and delivering software applications. Azure DevOps offers:

  • Azure Boards: A tool for planning, tracking, and managing work items, such as user stories, tasks, bugs, etc.
  • Azure Repos: A tool for hosting Git repositories online, which is the main focus of this blog post (see the clone example after this list).
  • Azure Pipelines: A tool for automating builds, tests, and deployments.
  • Azure Test Plans: A tool for creating and running manual and automated tests.
  • Azure Artifacts: A tool for managing packages and dependencies.
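
For instance, once a repository exists in Azure Repos, it can be cloned to a local machine with the standard Azure DevOps URL pattern. The organisation, project, and repository names below are placeholders:

    git clone https://dev.azure.com/<organisation>/<project>/_git/<repository>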

Azure DevOps also integrates with other tools and platforms, such as GitHub, Visual Studio Code, and now, Power BI. This takes us to the next section of this blog post, Integrating Power BI with Azure DevOps.

How to integrate Power BI with Azure DevOps

Now that we understand what Git and Azure DevOps are, let’s see how we can integrate Power BI with Azure DevOps.

Integrating Power BI with Azure DevOps involves two different integrations: cloud integration and local machine integration. Each has the following requirements.

Prerequisites

To follow along with this tutorial, you will need:

  • In the cloud:
    • An Azure DevOps Service
    • A Power BI account with one of the following licenses to enable Power BI Workspace integration with Azure DevOps:
      • Power BI PPU (Premium Per User)
      • Premium Capacity
      • Embedded Capacity (EM/A)
      • Fabric Capacity
  • On your local machine:
    • The latest version of Power BI Desktop (June 2023 or later)
    • Either Visual Studio or VS Code

As stated earlier, this post explains the cloud integration part. Therefore, we need an Azure DevOps Service and a Power BI account with a Premium licensing plan to integrate Power BI with Azure DevOps.

In the following few sections, we look into the details and go through them together, step by step.

Continue reading “Integrating Power BI with Azure DevOps (Git), part 1: Cloud Integration”

Microsoft Fabric: Terminologies and Personas Explained

In this blog post, I will explain some of the key concepts, personas, and terminologies related to Microsoft Fabric, a SaaS analytics platform for the era of AI. If you are not familiar with the basic concepts of SaaS analytics platforms and how Microsoft Fabric fits in, I recommend you read my previous blog post, where I explain them in detail.

Microsoft Fabric is an experience-based platform, meaning users can interact with it depending on their roles and personas. For example, a data engineer can use the Data Engineering experience to perform large-scale data transformation through the lakehouse. A data scientist can use the Data Science experience to develop AI models on a single foundation without data movement. A business analyst can use the Power BI experience to create and consume interactive reports and dashboards. And a data steward can use the Data Activator experience to govern and secure data across the organisation.
The Data Activator experience is in private preview and is not available for public use yet!

Microsoft Fabric Terminologies

To understand how Microsoft Fabric works, it is crucial to know some of the terminologies that are used in the platform. Some of them are existing terms that are also used in Power BI or Azure services, while some of them are new and specific to Microsoft Fabric. Here are some of the key terms that you should know:

  • Tenant: A tenant is a dedicated instance of Microsoft Fabric that is provisioned for an organisation or a department within an organisation. A tenant has its own set of users, groups, permissions, capacities, workspaces, items, and experiences. A Fabric tenant is associated with an Azure Active Directory (AAD) tenant, which is a directory service that organisations get when they sign up for a Microsoft cloud service such as Azure, Microsoft 365, Power BI, etc. AAD provides identity and access management for cloud applications. A tenant in Microsoft Fabric can only be accessed by users who belong to the same AAD tenant.
  • Capacity: Capacity is a term that refers to the amount of resources available to support a computing service. In the context of SaaS applications, capacity refers to the ability of the system to handle a certain amount of load or demand based on the required resources and infrastructure such as compute power (CPU, RAM, etc.), storage, network bandwidth and whatnot. As explained in my previous post, Microsoft Fabric is a SaaS platform. So, from a Microsoft Fabric perspective, capacities are sets of resources that are allocated to a tenant to run analytics workloads. The capacities sit in a tenant, and the available resources can be shared by multiple workspaces or dedicated to a single workspace for better performance and isolation. Microsoft Fabric capacities are available in various F SKUs that offer different levels of resources and features. For more information about capacities and SKUs, see Microsoft Fabric Capacity and SKUs.
  • Workspace: A workspace is a logical container that holds a collection of items and artefacts. A workspace can have one or more owners who can manage its settings and permissions and one or more members who can access its items. A workspace can also be assigned to a capacity to run its analytics workloads. In Microsoft Fabric, workspaces are based on Power BI workspaces.

The above terms also apply to Power BI, so they have been used within the community for a long time. The hierarchy starts with an organisation’s tenant; purchased capacities are then made available within that tenant, and workspaces are assigned to those capacities, as sketched below.
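
A simple way to picture that hierarchy is the following illustrative sketch (not an exhaustive diagram):

    Azure Active Directory (AAD) tenant
      └─ Microsoft Fabric tenant
           └─ Capacity (e.g. an F SKU)
                └─ Workspace(s) assigned to the capacity
                     └─ Items (reports, datasets, lakehouses, ...)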

Continue reading “Microsoft Fabric: Terminologies and Personas Explained”

Microsoft Fabric: A SaaS Analytics Platform for the Era of AI


Microsoft Fabric is a new and unified analytics platform in the cloud that integrates various data and analytics services, such as Azure Data Factory, Azure Synapse Analytics, and Power BI, into a single product that covers everything from data movement to data science, real-time analytics, and business intelligence. Microsoft Fabric is built upon the well-known Power BI platform, which provides industry-leading visualization and AI-driven analytics that enable business analysts and users to gain insights from data.

Basic concepts

On May 23rd 2023, Microsoft announced a new product called Microsoft Fabric at the Microsoft Build conference. Microsoft Fabric is a SaaS Analytics Platform that covers end-to-end business requirements. As mentioned earlier, it is built upon the Power BI platform and extends the capabilities of Azure Synapse Analytics to all analytics workloads. This means that Microsoft Fabric is an enterprise-grade analytics platform. But wait, let’s first see what a SaaS analytics platform means.

What is an analytics platform?

An analytics platform is a comprehensive software solution designed to facilitate data analysis to enable organisations to derive meaningful insights from their data. It typically combines various tools, technologies, and frameworks to streamline the entire analytics lifecycle, from data ingestion and processing to visualisation and reporting. Here are some key characteristics you would expect to find in an analytics platform:

  1. Data Integration: The platform should support integrating data from multiple sources, such as databases, data warehouses, APIs, and streaming platforms. It should provide capabilities for data ingestion, extraction, transformation, and loading (ETL) to ensure a smooth flow of data into the analytics ecosystem.
  2. Data Storage and Management: An analytics platform needs to have a robust and scalable data storage infrastructure. This could include data lakes, data warehouses, or a combination of both. It should also support data governance practices, including data quality management, metadata management, and data security.
  3. Data Processing and Transformation: The platform should offer tools and frameworks for processing and transforming raw data into a usable format. This may involve data cleaning, denormalisation, enrichment, aggregation, or advanced analytics on large data volumes, including streaming IoT (Internet of Things) data. Handling large volumes of data efficiently is crucial for performance and scalability.
  4. Analytics and Visualisation: A core aspect of an analytics platform is its ability to perform advanced analytics on the data. This includes providing a wide range of analytical capabilities, such as descriptive, diagnostic, predictive, and prescriptive analytics with ML (Machine Learning) and AI (Artificial Intelligence) algorithms. Additionally, the platform should offer interactive visualisation tools to present insights in a clear and intuitive manner, enabling users to explore data and generate reports easily.
  5. Scalability and Performance: Analytics platforms need to be scalable to handle increasing volumes of data and user demands. They should have the ability to scale horizontally or vertically. High-performance processing engines and optimised algorithms are essential to ensure efficient data processing and analysis.
  6. Collaboration and Sharing: An analytics platform should facilitate collaboration among data analysts, data scientists, and business users. It should provide features for sharing data assets, analytics models, and insights across teams. Collaboration features may include data annotations, commenting, sharing dashboards, and collaborative workflows.
  7. Data Security and Governance: As data privacy and compliance become increasingly important, an analytics platform must have robust security measures in place. This includes access controls, encryption, auditing, and compliance with relevant regulations such as GDPR or HIPAA. Data governance features, such as data lineage, data cataloging, and policy enforcement, are also crucial for maintaining data integrity and compliance.
  8. Flexibility and Extensibility: An ideal analytics platform should be flexible and extensible to accommodate evolving business needs and technological advancements. It should support integration with third-party tools, frameworks, and libraries to leverage additional functionality.
  9. Ease of Use: Usability plays a significant role in an analytics platform’s adoption and effectiveness. It should have an intuitive user interface and provide user-friendly tools for data exploration, analysis, and visualisation. Self-service capabilities empower business users to access and analyse data without heavy reliance on IT or data specialists.

These characteristics collectively enable organisations to harness the power of data and make data-driven decisions. An effective analytics platform helps unlock insights, identify patterns, discover trends, and drive innovation across various domains and industries.

Continue reading “Microsoft Fabric: A SaaS Analytics Platform for the Era of AI”