Version control systems, also known as source control systems, let users track and manage changes made to files over time. In my view, version control for Power BI has two parts: a change log that records changes and updates to the reports, and a saved copy of the PBIX file after every change so that you can compare versions or restore a previous one. This article shows how to set up a Power Automate flow that checks all Power BI reports in your organization every minute, finds the reports that have been modified, records those reports' properties in the change log, and exports/saves them as PBIX files. The following items are required to set up this flow:
In this flow, some Power BI APIs need to be invoked by Power Automate, so please check this blog post (Call Power BI Rest APIs with Power Automate) and make sure you set up all prerequisites.
To run the Admin APIs, the user must have administrator rights (such as Office 365 Global Administrator or Power BI Service Administrator) or authenticate using a service principal.
For a better understanding of this flow on Power Automate, here is a process diagram that shows the main items of the process in 3 sections:
Section 1 is the predecessor of sections 2 & 3. If you only want a Change Log, you can create a flow with sections 1 & 2 without section 3, or you can have a flow without the yellow box to just export PBIX files.
Note: There are some limitations to section 3, and it must be set up based on your organization's situation (I explained that at the beginning of the section).
Section 1:
The goal of this flow is to check the modification date/time of all reports every minute, find the reports that have been modified, update the change log, and then save the PBIX files. To initiate this flow, a "Recurrence" trigger needs to be used with a frequency interval of 1 minute.
As a next step, to find all changes between the previous and current runs of this flow, add the "Get past time" action with the following setup; it returns the current time minus one minute (one minute before the current time).
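For illustration only, the logic of the "Get past time" action can be sketched in Python (the function name is hypothetical; the flow itself uses the built-in action):

```python
from datetime import datetime, timedelta, timezone

def get_past_time(minutes: int = 1) -> str:
    """Return the UTC time `minutes` ago as an ISO 8601 string,
    the same shape Power Automate's "Get past time" action emits."""
    past = datetime.now(timezone.utc) - timedelta(minutes=minutes)
    return past.strftime("%Y-%m-%dT%H:%M:%SZ")
```

Each run of the flow compares report modification times against this cutoff, so only changes from the last minute are picked up.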
Check the below image to see why we need the current time and one minute before it:
Now it's time to add the HTTP action and invoke the "GetReportsAsAdmin" API to get the list of all reports and their properties across all workspaces (including My Workspace).
Note:
If you face any issues or don't know how to add Active Directory OAuth to the HTTP action, please check my previous blog: "Call Power BI Rest APIs with Power Automate".
Suppose you don't want to include My Workspace's reports. In that case, use the "GetReportsInGroupAsAdmin" API instead and specify the workspaces you want to include in the version control.
There is a limit of 200 requests per hour for these APIs.
The HTTP action has three outputs: the Body (JSON), the Headers, and the Status Code. The "modifiedDateTime" property is needed from the Body so it can be compared with the output of the Get past time action (one minute before the current time).
So the next action in this flow should be Parse JSON, which extracts the items from the Body of the HTTP action. Parse JSON has two fields: Content, which should be set to the Body (the result of the HTTP action - the GetReportsAsAdmin API), and Schema, which can be filled in using Generate from sample. To use it, run the flow once, copy the output of the HTTP action, paste it into the window that opens after clicking the Generate from sample button, and click Done. The output of this action is all the items from the HTTP action, such as Report ID, Report Name, Dataset ID, Modified Date Time, and Workspace ID.
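A minimal Python sketch of what the Parse JSON step extracts, using an illustrative sample body (the real GetReportsAsAdmin response carries more properties per report, and the IDs here are made up):

```python
import json

# Illustrative sample of the GetReportsAsAdmin response Body.
sample_body = json.dumps({
    "value": [
        {
            "id": "11111111-1111-1111-1111-111111111111",
            "name": "Sales Report",
            "datasetId": "22222222-2222-2222-2222-222222222222",
            "modifiedDateTime": "2023-05-01T10:30:00Z",
            "workspaceId": "33333333-3333-3333-3333-333333333333",
        }
    ]
})

def parse_reports(body: str) -> list:
    """Mimic the Parse JSON action: pull out the properties the
    flow needs from the HTTP action's Body."""
    return [
        {
            "reportId": r["id"],
            "reportName": r["name"],
            "datasetId": r.get("datasetId"),
            "modifiedDateTime": r["modifiedDateTime"],
            "workspaceId": r.get("workspaceId"),
        }
        for r in json.loads(body)["value"]
    ]
```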
Section 2:
You now have all the required properties for all reports in your organization, so it's time to add a condition action to compare each report's Modified Date Time with the output of the Get past time action (one minute before the current time). Add the Condition action and in the first field select the "modifiedDateTime" like the following image:
The "Apply to each" action will be added automatically to your flow when you select "modifiedDateTime", as this condition must be checked for each report.
In the condition action select the "is greater than" as a comparison condition and select the Past Time (the output of the Get past time action) in the next field.
There are two possible outcomes from this condition: Yes and No. If the answer is No, no action needs to be taken: the report was last updated more than one minute before this run of the flow, so the change was already captured in a previous run. If the answer is Yes, action is needed.
The reports that have been updated between this run of the flow and the previous run will pass the condition (previous step), so their details should be added to the change log, and they should be exported as PBIX files (Section 3).
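The condition inside "Apply to each" can be sketched as a simple filter (a hypothetical helper, assuming second-precision ISO 8601 timestamps like the sample above):

```python
from datetime import datetime, timezone

FMT = "%Y-%m-%dT%H:%M:%SZ"

def modified_since(reports: list, past_time: str) -> list:
    """Mimic the Condition action: keep only the reports whose
    modifiedDateTime is greater than the past-time cutoff."""
    cutoff = datetime.strptime(past_time, FMT).replace(tzinfo=timezone.utc)
    return [
        r for r in reports
        if datetime.strptime(r["modifiedDateTime"], FMT)
               .replace(tzinfo=timezone.utc) > cutoff
    ]
```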
Update the Change Log:
First, the change log (Excel file) needs to be created on SharePoint or OneDrive, and then you can use the "Add a row into a table" action to update that file with the report's details.
In SharePoint, I created an Excel file with these columns, formatted it as a table, and renamed the table to HistoryTable: Report ID, Report Name, Dataset ID, Modified Date Time, Modified By, Workspace ID, Comment
The file can be structured differently with different columns aligned with the report properties returned by the HTTP action (API).
Add "Add a row into a table" to the "if yes" section of the condition, and set up that like below: (Map the Excel table columns to the report's properties from the "Parse JSON - Reports Properties" action)
Note:
The Comment column can be added, and users can update it with details of file changes
You can add a step to the flow that sends an email asking the person who modified the report to add details of the change to the change log
Section 3:
As I mentioned at the beginning of this blog, this section has some limitations, listed below, that will affect your Power Automate flow; adapt the solution to your organization's situation:
Since the HTTP action limit is 100 MB (104,857,600 bytes) and the Export Report API does not support chunking, this section is not suitable for report files larger than 100 MB (for more details, click here)
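The size constraint is easy to state in code (a hypothetical pre-check; in practice a too-large export simply fails in the HTTP action):

```python
# 100 MB: the content limit of the Power Automate HTTP action.
HTTP_ACTION_LIMIT = 104857600

def within_http_limit(content_length: int) -> bool:
    """Return True when an exported PBIX fits within the HTTP
    action's 100 MB limit; larger reports cannot use this flow."""
    return content_length <= HTTP_ACTION_LIMIT
```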
In order to find the average flow duration, you need to run and test it in different situations. Then, you need to change the frequency intervals for the trigger (Recurrence) and the "Get Past Time" action from 1 minute to a time longer than the average duration.
If you want to export only specific reports or reports from specific workspaces, you can add another condition before this section in your flow.
Export Report to PBIX:
After finding the modified reports and adding their details to the change log file, it's time to export those reports and save them into a SharePoint folder. Why a SharePoint folder? Because SharePoint lets us overwrite a file while keeping its version history. If you need to save your files on a platform without that update capability, include a timestamp in the file name so you can sort and manage the versions.
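For the no-overwrite case, a timestamped file name could be built like this (a hypothetical naming scheme, not part of the flow as described):

```python
from datetime import datetime, timezone

def timestamped_name(report_name: str, when=None) -> str:
    """Build a name like 'Sales Report_20230501T103000.pbix' so
    saved versions sort chronologically by file name."""
    when = when or datetime.now(timezone.utc)
    return f"{report_name}_{when.strftime('%Y%m%dT%H%M%S')}.pbix"
```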
To export the report to a Power BI .pbix file, the HTTP action with the "Export Report In Group" API needs to be used.
GET https://api.powerbi.com/v1.0/myorg/groups/{groupId}/reports/{reportId}/Export
The URL of this Power BI API is completed by filling in {groupId} (WorkspaceId) and {reportId} (id), which come from the Parse JSON action.
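Assembling the endpoint can be sketched as follows (a hypothetical helper; the flow does this substitution directly in the HTTP action's URI field):

```python
def export_url(group_id: str, report_id: str) -> str:
    """Build the Export Report In Group endpoint from the workspace
    (group) id and report id returned by the Parse JSON action."""
    return (
        "https://api.powerbi.com/v1.0/myorg/"
        f"groups/{group_id}/reports/{report_id}/Export"
    )
```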
In this step, the HTTP action needs to be configured as shown in this image:
Get File Properties from the SharePoint Folder:
The output of this HTTP action is the report's content (application/zip), which can be saved as a .pbix file. Depending on whether a file with the same name already exists in the folder, that file needs to be updated or a new file needs to be created.
To determine whether the .pbix file already exists in the folder, add the "Get files (properties only)" action as the next step. Point it at the SharePoint folder you created for the report files by selecting the right values in the Site Address, Library Name, and Limit Entries to Folder fields, and add FileLeafRef eq '{name}.pbix' to the Filter Query section. The {name} should be selected from the Parse JSON action output (Report Name).
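The filter string itself is plain OData; building it can be sketched as (a hypothetical helper, for illustration):

```python
def file_filter(report_name: str) -> str:
    """Build the OData Filter Query for "Get files (properties only)":
    match the SharePoint file whose leaf name is '<report name>.pbix'."""
    return f"FileLeafRef eq '{report_name}.pbix'"
```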
Condition - Check the file existence:
Add a condition after the "Get files (properties only)" step and configure it as follows:
Click on the first field, "Choose a value", and in the Expression section copy and paste this: length(outputs('Get_files_(properties_only)')?['body/value'])
Select "is equal to"
In the third field, type 1
If it returns 1, the report's file is in the folder, otherwise, it is not.
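The length expression reduces to a one-line check (sketched here against a hypothetical action output):

```python
def file_exists(get_files_body: dict) -> bool:
    """Mirror length(outputs(...)?['body/value']) eq 1: the file
    exists when the query returned exactly one matching item."""
    return len(get_files_body.get("value", [])) == 1
```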
Save the report file:
The final section of this flow saves or updates the file. Based on the result of the previous step (the condition), the file needs to be updated in the If yes branch, and a new file needs to be created in the If no branch.
Add the SharePoint Update file action to the If yes section and the SharePoint Create file action to the If no section, then set them up as follows:
The Body in the File Content section of both actions is the output of the HTTP action for "Export Report to PBIX".
Flow image in its entirety: