DBInsight’s Blogs

Use Power BI to Forecast Server Capacity

Posted by Rob Risetto on February 22, 2016

Microsoft recently announced the Power View forecasting feature, which is available when running Power View in HTML 5 mode through the Power BI for Office 365 cloud offering. For those that don’t know, the official description of Power BI for Office 365 is …

Power BI is a cloud-based service that works with Excel to provide a complete self-service business intelligence (BI) solution. With both Excel and Power BI for Office 365, you can give everyone in your organization a powerful new way to work with data.

The Power BI – Getting Started Guide can be found (here).

For more information on the Power View predictive forecasting feature go (here).

Anyway, I got to thinking: wouldn’t it be good if I could use the forecasting feature to predict server capacity? For example, to answer questions like: what will my server’s CPU utilisation be in the future, based on historical Perfmon data?

So I decided to put the Power View forecasting feature to the test and build a CPU % Analysis dashboard with KPI and forecasting elements. The pre-requisites to build the solution are:

  1. Power BI for Office 365 subscription
  2. Excel 2013 with the Power Pivot and Power View add-ons enabled
  3. Perfmon binary data files

This post assumes you have some working knowledge of Power Pivot and Power View.

First of all I created a SQL Server database called SQLPerf; this will house the uploaded Perfmon data sets.

The PowerShell code to load multiple Perfmon binary files is shown below. Note that I’m loading files created by SQLDiag, so they have a file extension of .BLG.

$ImportDir = "C:\Perfmon\data"
$CounterFile = "C:\Perfmon\cpuonly.txt"
$SQLServer = "MySQLServer"

# Collect all Perfmon binary log files in the import directory
$fileList = Get-ChildItem "$ImportDir" | Where-Object { $_.Extension -eq ".blg" }

foreach ($file in $fileList)
{
    $PMFile = $file.FullName
    # Relog filters to the counters listed in $CounterFile and loads the
    # data points into the database behind the ODBC DSN "SQLPerf"
    & relog $PMFile -cf $CounterFile -f SQL -o SQL:SQLPerf!SQLPerf
}

The cpuonly.txt file describes the counters to be uploaded; in this case only % Processor Time is required. The counter name format entered in the file looks like this:

\Processor(_Total)\% Processor Time

For the script to work, an ODBC data source called SQLPerf needs to be created; it points to the SQL Server that houses the SQLPerf database.
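If you would rather script the DSN than create it in the ODBC administrator, the Add-OdbcDsn cmdlet (available on Windows 8 / Server 2012 and later) can do it. This is a sketch only — the driver name and server name below are assumptions, so adjust them for your environment:

```powershell
# Create a system ODBC DSN named SQLPerf pointing at the server that
# hosts the SQLPerf database (the driver name may differ on your machine)
Add-OdbcDsn -Name "SQLPerf" -DsnType "System" `
    -DriverName "SQL Server Native Client 11.0" `
    -SetPropertyValue @("Server=MySQLServer", "Database=SQLPerf", "Trusted_Connection=Yes")
```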

This PowerShell code picks up all the .BLG files in $ImportDir and uses the Relog.exe utility to upload them into tables called CounterDetails (Perfmon counter names) and CounterData (the actual Perfmon data points). Note that the Relog utility creates the CounterData and CounterDetails tables for you.

The solution also needs some time intelligence tables configured and populated in the SQLPerf database: a Date dimension table with a date key column that contains just the date part (e.g. dd-mm-yyyy), and a Time dimension table that will match against the time-only component of each Perfmon data point.

If you don’t already have a Date dimension data set, you can use the Create Time Dimension Excel App to generate the Date dimension records. The App can be downloaded from the Microsoft App Store; information about the Create Time Dimension App can be found (here). Basically you download the App, run it within an Excel 2013 worksheet, save the file as an Office 97-2003 workbook, and then use SSIS import to load the data into a DateDim table in SQLPerf.

The actual Time dimension table was generated using the code from Mark Hudson’s site (here). I just added an INTO TimeDim clause to the final SELECT of the CTE query to create and populate the table.

Next I opened a new Excel 2013 file, clicked on Power Pivot, clicked Manage, and then used the From Database function to import the CounterData, CounterDetails, DateDim and TimeDim tables into Power Pivot.


Once imported, I created some extra columns in Power Pivot on the CounterData table so that I could link the CounterData to the date/time tables DateDim and TimeDim.

The CounterDateTime column was loaded as a text field, and I couldn’t set it to the Date data type because it includes a milliseconds component. So I created a new column in Power Pivot called DateKey (calculation below) and set it to the Date data type; it holds just the date portion, with no time component.
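The original calculation was shown as a screenshot which is no longer available; a plausible DAX reconstruction, assuming Relog writes CounterDateTime as text in a form like 2016-02-22 14:35:01.123, is:

```
// DateKey: take the first 10 characters (the date part) of the
// CounterDateTime text value and convert them to a date
=DATEVALUE(LEFT([CounterDateTime], 10))
```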


Next I created a “DateTimeNoMS” column, which includes the time component minus the milliseconds part; the calculation is below.
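Again the screenshot is lost, but a plausible DAX sketch (assuming the same CounterDateTime text layout, with the time starting at character 12) is:

```
// DateTimeNoMS: the date plus the hh:mm:ss part, dropping the milliseconds
=DATEVALUE(LEFT([CounterDateTime], 10)) + TIMEVALUE(MID([CounterDateTime], 12, 8))
```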


From this field I created a TimeKey column that contains only the time portion; the calculation is below.
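The screenshot of this calculation is also missing; one way to express it in DAX, assuming the DateTimeNoMS column sketched above, is to keep only the fractional (time-of-day) part of the datetime value:

```
// TimeKey: subtract the whole-day part, leaving just the time portion
=[DateTimeNoMS] - INT([DateTimeNoMS])
```

Whichever formula you use, make sure it produces values that match the TimeKey column in your TimeDim table exactly, or the relationship lookup will fail.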


Note that Power Pivot adds in the default date portion of 30/12/1899 even though I only specified time.

Note also that all of the new date columns were set to a Date data type and I marked the DateDim as a Date table (you can do this via the Design tab in Power Pivot or using the diagram view).

As a final touch I sorted the Month Name and Month Abbrev columns in the DateDim table by the MonthNo column; this ensures that the month names sort chronologically in reports rather than alphabetically.

I then switched to the Diagram view in Power Pivot to create the relationships between the tables: CounterData DateKey to the DateDim Date column, CounterData TimeKey to the TimeDim TimeKey column, and CounterData CounterID to the CounterDetails CounterID column.


I then switched back to the data view and created a calculated field (measure) called CPU%; see the calculation below.

CPU%:=CALCULATE(AVERAGE(CounterData[CounterValue]), CounterDetails[ObjectName]="Processor", CounterDetails[CounterName]="% Processor Time")

Since I only imported one counter, I probably didn’t need a calculation specific to it (I could have just used AVERAGE([CounterValue])). However, if you import multiple counters and their values, you will need a calculation per counter so that you can present specific counter values on your reports and create specific KPIs for each counter.

Next I created a KPI based on the following ranges: if CPU% is above 80% the KPI is red (critical), from 65% to 80% it’s yellow (warning), and below 65% it’s green (all is OK).
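The KPI itself is configured through the Power Pivot KPI dialog (base measure, target value and thresholds) rather than in DAX, but if you prefer an explicit calculation, an equivalent status measure might look like this (the measure name is my own invention):

```
// -1 = red (>= 80%), 0 = yellow (65% to 80%), 1 = green (< 65%)
CPU% Status:=IF([CPU%] >= 80, -1, IF([CPU%] >= 65, 0, 1))
```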


The data modelling work is now complete, and we are ready to create a Power View dashboard.

Just create a new worksheet in the Excel file, click Insert in the menu bar, and choose Power View Report. I won’t go through the mechanics of creating the report components, since there is plenty of information on the net about creating Power View charts and reports.

The end result of the Power View dashboard in Excel 2013 looks like below.


The top left chart shows the average CPU usage for the time segments of Afternoon, Evening, Morning, Over Night over Day No in Month for a specific month. These time segments are defined in the TimeDim table as PeriodName.

The top right table shows the average CPU % and the CPU % Status KPI symbol for each period segment in each month.

The bottom chart shows the average CPU % over Day No in Month. It’s this chart that I will use for forecasting once the file is uploaded to the Power BI site.

Now it’s time to upload the Excel file, which contains the compressed counter data and the model (in Power Pivot), to the Power BI site. The process: log into Power BI, navigate to the Power BI team site, and click the Add button to upload the Excel file.


Once uploaded, you can click on the thumbnail image of the report to invoke the Power View report.

To get to the HTML 5 version, which supports forecasting, click the HTML 5 button in the bottom right-hand corner.


The HTML 5 version of the Power View report looks a little different but it has similar functionality with extra features.


If you click on the bottom chart you will notice a blue dot at the end of the line; drag this dot to the right to start the forecasting process. The lighter coloured line is the forecasted CPU %.


You can also indicate the level of confidence in your forecast by selecting the number of standard deviations, in this case two. The shaded area around the forecast line indicates the possible range of forecasted data points based on two standard deviations.


You can also verify your forecasting model by performing “hindcasting”. Basically, you drag the left-most blue dot a number of periods back and then compare how closely the modelled data points (light line) match the real data points (dark line).


Finally, if you have some outlier data points, i.e. values that are one-off peaks or troughs, you can reset the value in the chart so that the forecasting model is not skewed.

For example, you can click on a trough on the line chart and drag the blue dot up to a point that is more typical for the utilisation.


The gray line (below) shows the original data point (7.36%); the new data point for day 15 is 9.25%.


So you can see from the above examples that Power BI provides some powerful “what if” analytics and forecasting, and how relatively easy it is to use these features for server capacity forecasting.

This example focused on CPU utilisation but you can easily apply the same principles to other counters like Disk Reads/sec or Disk Writes/sec and create a KPI based on your disk system IO capacity limit, or disk latency counters like Avg Disk Sec/Read or Avg Disk Sec/Write and set KPIs for the typical latency guidelines.
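A measure for a disk counter would follow the same pattern as the CPU% measure above. The object and counter names in this sketch are the standard Perfmon ones, but check the strings in your own CounterDetails table before relying on them:

```
// Average disk read throughput, filtered to the PhysicalDisk object
Disk Reads/sec:=CALCULATE(AVERAGE(CounterData[CounterValue]),
    CounterDetails[ObjectName] = "PhysicalDisk",
    CounterDetails[CounterName] = "Disk Reads/sec")
```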

In addition, you could automate the refreshing of the data set in the Power Pivot model using Power BI’s data management gateway feature.

You have to create a data management gateway first in the Power BI Admin web page, then install and register a data management gateway agent on your server, and then create a data source in Power BI Admin that points to the on-premises SQL Server and the SQLPerf database. More information on creating a data source and enabling cloud access can be found (here).

The catch is that you need to specify the same provider on the Power BI data source as the provider used in the Excel Power Pivot connection; in my case it was SQL Server Native Client 11 (SQL Server 2012). Note that the SQL Server and database names also have to match.

You can then right click on the Power View report and click Schedule Data Refresh; once defined, you can also run the refresh manually. More information on how to schedule a data refresh can be found (here).


The sample Excel file can be downloaded from (here).
