
Dealing with PowerBI Resource Consumption

PowerBI memory usage can be quite difficult to manage, especially when you’re running on an embedded capacity where you have hard limits on RAM. In this article, we’ll look at how to measure the impact of our queries & estimate how much RAM they will require. This information has been gathered from trial and error and from trawling the web – there is not much information on this from Microsoft themselves.

The below are my observations, based on my own experience and the aforementioned research. The best way to find your own memory requirements is of course through trial and error. If you find your memory usage to be high, the points below may help you understand why that is.

Incredible compression is confusing

Let’s say you have a PowerBI (pbix) file with a size of 1GB. That is a highly compressed file which does not reflect the true volume of data you are processing. In my experience, the uncompressed data can be around 5 times the size of your pbix file.

So your 1GB pbix file is actually reading 5GB of data.

Refreshing existing datasets makes it more confusing

When you refresh a dataset that is already in use, PowerBI will take a copy of the dataset prior to the refresh & place it into memory, so the dashboard remains operable during the refresh. Once that copy has been taken, it loads the new data into memory for the refresh. Towards the end of the refresh you therefore have two copies of the same dataset in memory, doubling your overhead.

Now your 1GB pbix file is reading 5GB of data, and it needs to hold two copies of the dataset during the refresh, so we’re looking at a total requirement of 10GB of RAM.

Complex DAX makes it even more confusing

That, though, is not the end of the story. You also need to provision a multiple of your dataset size to account for complex DAX formulas. These can increase the memory overhead by anywhere between 2x and 10x your dataset size, as PowerBI pulls data into memory to process those functions.

Now, your 1GB pbix file is reading 5GB of data and the dataset needs to be doubled for the refresh, giving us a tentative requirement of 10GB. On top of that, we need to accommodate DAX functions at between 2x and 10x of our dataset size. Let’s be fair and settle on 3x for our example. We now need 5GB + (3 × 5GB) + (5GB refresh copy) = 25GB of RAM.
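As a quick sanity check, here is that arithmetic as a minimal Python sketch. The 5x compression ratio and 3x DAX multiplier are the assumed figures from this example, not fixed constants – calibrate them against your own workload:

```python
pbix_gb = 1.0                    # size of the pbix file on disk
compression_ratio = 5            # assumption: real data is ~5x the pbix size
dax_multiplier = 3               # assumption: DAX working memory at ~3x dataset

dataset_gb = pbix_gb * compression_ratio  # 5GB of actual data
refresh_copy_gb = dataset_gb              # second copy held during a refresh
dax_gb = dax_multiplier * dataset_gb      # 15GB while DAX is evaluated

print(dataset_gb + refresh_copy_gb + dax_gb)  # 25.0
```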

Continued use makes it EVEN MORE confusing

When you load a report, DAX functions will likely be firing off with each visualization you render. Those DAX functions consume memory & the wisdom online is that roughly 35% of your dataset size should be set aside for these report operations.

Now, your 1GB pbix file is reading 5GB of data; the dataset needs to be doubled for the refresh, giving us a tentative requirement of 10GB; we need to accommodate DAX functions at 3x our dataset size, taking us to 25GB; and we need to account for report operations at 35% of the dataset size. So our total memory requirement is 5GB + (3 × 5GB) + (5GB refresh copy) + (0.35 × 5GB) = 26.75GB of RAM.
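Rolling all of those multipliers into one place, a rough estimator might look like the sketch below. Every default here is an illustrative assumption from this article – the compression ratio, DAX multiplier, and 35% report overhead are starting points to tune, not figures Microsoft publishes:

```python
def estimate_ram_gb(pbix_gb: float,
                    compression_ratio: float = 5.0,
                    dax_multiplier: float = 3.0,
                    report_overhead: float = 0.35) -> float:
    """Rough RAM estimate for refreshing and serving a PowerBI dataset.

    All defaults are assumptions from this article: pbix files
    compress ~5x, DAX working memory runs ~3x the dataset size, and
    report operations add ~35% of the dataset size on top.
    """
    dataset = pbix_gb * compression_ratio
    refresh_copy = dataset                 # second copy held during refresh
    dax = dax_multiplier * dataset         # working memory for DAX
    report = report_overhead * dataset     # report/visualization queries
    return dataset + refresh_copy + dax + report

print(estimate_ram_gb(1.0))  # 26.75
```

Once you have measured your own ratios using the steps below, swap them in for the defaults.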

So how much memory do I need?

This is the million-dollar question, and the honest answer is ‘I don’t really know’. But here are some steps you can take to find out.

  • Instead of using the pbix file size as the indicator of your dataset size, run all of the queries & download the data as CSVs. Use the sum of the CSV sizes to determine how much data PowerBI is actually ingesting prior to compression – the sketch after this list shows one way to total them up.
  • During a data refresh on Desktop, open Task Manager and monitor your memory usage. It won’t be an exact science, but it will give you a good idea of how much memory the refresh operation actually consumes – the same sketch below polls this for you.
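Both of those steps can be scripted if you prefer. Below is a rough sketch: it sums the CSV exports in a folder, then polls PowerBI Desktop’s memory while a refresh runs. The `C:\exports` path is a hypothetical location for your CSV downloads, `PBIDesktop.exe` is the Desktop process name on a default Windows install, and the third-party psutil package (`pip install psutil`) does the process inspection:

```python
import glob
import os
import time

import psutil  # third-party: pip install psutil

# 1. Sum the exported CSVs to approximate the uncompressed data volume.
csv_dir = r"C:\exports"  # hypothetical folder holding your CSV downloads
csv_files = glob.glob(os.path.join(csv_dir, "*.csv"))
total_gb = sum(os.path.getsize(f) for f in csv_files) / 1024**3
print(f"Uncompressed data volume: {total_gb:.2f} GB across {len(csv_files)} files")

# 2. Poll PowerBI Desktop's memory while a refresh is running.
def powerbi_rss_gb() -> float:
    """Total resident memory of all PBIDesktop.exe processes, in GB."""
    rss = sum(p.info["memory_info"].rss
              for p in psutil.process_iter(["name", "memory_info"])
              if p.info["name"] == "PBIDesktop.exe")
    return rss / 1024**3

peak = 0.0
while True:  # Ctrl+C to stop once the refresh completes
    current = powerbi_rss_gb()
    peak = max(peak, current)
    print(f"current: {current:.2f} GB, peak: {peak:.2f} GB", end="\r")
    time.sleep(1)
```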