
From API to Insight: Real-Time Crypto Analysis Using Power Automate and Microsoft Fabric

Financial markets require real-time data so investors can make informed decisions and capitalize on opportunities as soon as they arise. This blog post guides you through using Power Automate and Microsoft Fabric Real-Time Analytics items like Eventstream to fetch cryptocurrency data from Binance, stream it directly into a KQL database, and analyze it in Power BI. This setup not only simplifies data handling but also opens up a multitude of possibilities for analyzing trends as they happen.

In this post, I used the Binance API to analyze crypto prices, but you can use any API you like and analyze different types of data using the same structure.


Video



Let's first have a look at the overall structure of this solution:

Overall Structure for this solution

The main part of the Power Automate section of this solution is explained in detail in my previous blog post (How to Use Power Automate as an Eventstream Source Custom App): https://www.vahiddm.com/post/fabric-real-time-analytics-use-power-automate-eventstream-custom-app - To avoid feeling lost, read that blog or watch the related video before reading this one.


1- API - Binance

Binance Holdings Ltd., branded Binance, is a global company that operates the largest cryptocurrency exchange in the world by daily trading volume. It is one of the most popular platforms for crypto traders and provides access to a range of valuable data via free APIs. If you need more info about this platform's API, read this:


In this blog post I used the /api/v3/ticker/price endpoint, which returns the latest price for a symbol or symbols. If you open "https://api.binance.com/api/v3/ticker/price" in your web browser, you will get the price for more than 2,600 different cryptocurrencies (Click here for more info). Besides being free, another good point about this API is that the result is well-structured for use in Fabric Real-Time Analytics, like this:

{
    "symbol": "BTCUSDT",
    "price": "59578.15000000"
}

If you are looking for specific cryptos, you can customize the API call. For instance, if you want to see only the Bitcoin and Ethereum prices in the result, you can call the endpoint with the symbols parameter, for example:

https://api.binance.com/api/v3/ticker/price?symbols=["BTCUSDT","ETHUSDT"]

and the result will be as below:

[
  {
    "symbol": "BTCUSDT",
    "price": "59585.99000000"
  },
  {
    "symbol": "ETHUSDT",
    "price": "3002.20000000"
  }
]

In this way, you can customize the API to get the results you need.

Binance API Limitations:


2- Set up the Eventstream and Flow

I have explained in detail how to create an Eventstream and use Power Automate as the Source Custom App in my previous blog:

Please follow the instructions in that blog/video and create an Eventstream and a Power Automate Flow (don't run the flow, just set it up - the flow needs to be changed, and all the details are in the next section).

The expected output will be as follows:

Eventstream in Microsoft Fabric structure

3- Update the Power Automate Flow

Following the previous step (and my previous blog), you have an Eventstream in Microsoft Fabric and a Power Automate Flow with Event Hub's Send Event as the only action in the flow:

Power Automate Flow with Event Hub Send Event Action

For this scenario, the flow needs a few changes, starting with adding an HTTP action with the following setup to execute the API call.

Method: GET

URI: https://api.binance.com/api/v3/ticker/price (or your customised URI)

Power Automate API HTTP

Then add the result of the HTTP action as the content for the "Event Hub" Send Event action as below:

Power Automate API HTTP and send event

*Please ignore the Event Hub names; I've added random characters in different images.


The flow is nearly complete; the only remaining steps are:

  • Test the flow and make sure it works without any issue

  • Make sure Eventstream has received data successfully

  • In the Power Automate Flow, change the trigger from "Manually trigger a flow" to Recurrence with a 1-minute interval

Power Automate Trigger
  • Put the HTTP and Send Event actions in a loop to fetch cryptocurrency prices at least once every second: To create this loop, add a Do Until action to the flow and move the HTTP and Send Event actions into it. The "Do Until" action runs all included actions until a specified condition is met. You can also set a limit on the number of executions; for example, you could configure it to perform the actions up to 10 times and then exit the loop regardless of whether the condition has been met. For this scenario, I need the flow to run every minute, so it's crucial to set the loop's limit so that the entire process completes within one minute, ensuring there are no overlaps between successive runs. In this case, you can set the condition as follows, which is always true 😎:

Do Until in Power Automate to create Loop

Set the count limit for the "Do Until" loop to 100 to ensure the process completes in under a minute; 100 iterations spread over roughly 60 seconds works out to about one fetch every 0.6 seconds, which meets the once-per-second goal as long as each HTTP call and Send Event completes quickly. You may manually run the flow a few times to determine the exact number of iterations that fit within the one-minute mark, so the Do Until will be as follows:

Do Until in Power Automate to create Loop

Our flow is now ready.

Test the flow and ensure that data is correctly streaming into the Eventstream. Please run the flow at least once before the next step, then turn it off for now.


4- Create a KQL database and add it to Eventstream

Our Eventstream source (Power Automate) is ready, and now we need to add a destination. There are four options for destinations: KQL Database, Lakehouse, Custom App, and Reflex (Data Activator). I want to use a KQL database, so I need to create it first.

Open the workspace, change the view to Real-Time Analytics, then click on KQL Database:

Real-Time Analytics in Fabric - Create KQL database

Add a name for the database, select the type, then click on Create and wait for it to become available:

Real-Time Analytics in Fabric - Create KQL database

You don't need to add any tables or change any details in the KQL database.


Note:


Open the Eventstream again, click on New destination, and select KQL Database:

Eventstream in Microsoft Fabric structure

In the new window, select the Data ingestion mode:

  1. Direct ingestion: This option directly sends data from Eventstream to Kusto Database without any intermediate processing. Ideal for scenarios where data doesn't need transformation before analysis.

  2. Event processing before ingestion: This option allows you to process the data stream using the Event Processor before sending it to the Kusto Database. Suitable for scenarios that require real-time data processing or enrichment before analysis.

Select the second option, Event processing before ingestion, add a Destination name, then choose the workspace where your KQL database is saved and select the KQL database from the list.

If you faced this issue in the Destination Table section:

"Unable to validate schema or create new table. No data found in this eventstream or it's loading. Please wait for more data loaded or try to add new data source. Refresh"

That is because you did not run the flow at the end of the previous step; close the destination window and run the flow first.

In the Destination table section, click on Create new and add a name for the table:

Set up KQL as destination in Eventstream - Microsoft Fabric

Keep the Input data format as JSON and click on Open Event processor.

In the processing editor you can transform your data from source to destination. Select the Eventstream icon on the left and change the price column data type to Double:

Process Data in Eventstream - KQL Database

Then select the destination, make sure the price column data type has been changed there as well, and click on Done to close the Event processing editor page. In the video I used more items to transform the data in a better way, so I highly recommend watching this blog's video as well.

Click on the Add button to finalize the destination and wait for the status to become Ingesting:

Eventstream structure using Power Automate and KQL database

Now your Eventstream is ready ✨
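Once the destination is set up, you can also verify on the KQL side that the price column's type change took effect. Below is a minimal check, assuming the destination table is named CryptoPrice (the name used in the query later in this post):

// List the table's columns and their data types;
// price should appear as real (KQL's double type) rather than string
CryptoPrice
| getschema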



5- Run the flow and test the Eventstream

Open Power Automate and turn the flow on again, then open the KQL database to check whether the data is streaming correctly:

  • Click on the Table

KQL Database in Microsoft Fabric
  • Click on Explore your data

  • You can write SQL or KQL code; run this:

SELECT *
FROM CryptoPrice
  • Check the result table; if the output is as below, it means your solution works without any issue:

Run SQL or KQL in Microsoft Fabric KQL database
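The simple query above confirms data is arriving, but KQL makes it easy to go further. The queries below are a small sketch, assuming the table is named CryptoPrice as above and that the default ingestion-time policy is enabled on it:

// Latest recorded Bitcoin price
CryptoPrice
| where symbol == "BTCUSDT"
| top 1 by ingestion_time() desc

// Latest price per symbol across the whole table
CryptoPrice
| summarize arg_max(ingestion_time(), price) by symbol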

6- Create Power BI Report

On the KQL database page, you can click on Build Power BI report and start creating your report directly in the web browser, or you can connect Power BI Desktop to the KQL database using Import or DirectQuery mode.
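If you build the report in Power BI, a time-bucketed query usually charts better than raw ticks. The query below is only a sketch: it assumes the CryptoPrice table from earlier and the EventProcessedUtcTime column that Eventstream adds during event processing, so adjust the names to match your own table:

// Average price per symbol in one-minute buckets, ready for a line chart
// (todouble() is harmless if price is already stored as a double)
CryptoPrice
| where symbol in ("BTCUSDT", "ETHUSDT")
| summarize avg_price = avg(todouble(price)) by symbol, minute = bin(EventProcessedUtcTime, 1m)
| order by minute asc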



Conclusion

In this blog post, I showed how powerful Microsoft Fabric Real-Time Analytics is and how you can use APIs with Power Automate to feed data into Eventstream. This approach can help solve tough business problems. You can use different APIs to handle various types of data using this same method.
