With the massive amount of data collected by System Center Operations Manager (SCOM) from servers and other monitored equipment, IT Operations departments are sitting on a gold mine of data just begging to be used. One of the areas that can benefit from such internal data capital is forecasting.
By implementing forecasting processes you can predict the behavior of managed objects months into the future. This knowledge enables you to act in advance and prevent service failures and service-level breaches.
Most business areas use some kind of forecasting method when planning new investments, calculating yearly budgets, and so on. We believe IT organizations should be no different: operational data can be used to gain insights, learn from the past, and plan for the future.
Where does the data come from?
All data collected by SCOM is stored in the Operational Database. Administrators query this database via the SCOM Console when working with ongoing issues. Most values are kept here for no longer than 7 days, so the Operational Database is of no interest when we are looking for a larger (time-wise) data set to use as a base for forecasting.
The good news is that, at the same time as it is saved to the Operational Database, data is also transferred to the Data Warehouse. There it is aggregated into Hourly and Daily values and stored considerably longer (the default setting is 400 days). As you might guess, this is going to be a much better data source for us when working with forecasts!
Line charts are perfect for displaying continuous data sets. Since forecasts are closely related to time, we will use line charts to illustrate the data.
One of the most basic forecasting methods is linear forecasting. To put it simply, it is the same thing as drawing a straight line from the first historical data point to the last and then continuing into the future for as long as we see reasonable and useful.
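The idea can be sketched in a few lines of plain Python: fit a straight line to the historical points by least squares and extend it forward. This is only an illustration of the concept, not code from SCOM, and the sample values are made up:

```python
def linear_forecast(history, periods_ahead):
    """Fit a straight line to `history` by least squares and extrapolate."""
    n = len(history)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    # continue the line for `periods_ahead` future time slices
    return [slope * (n + i) + intercept for i in range(periods_ahead)]

# Made-up daily "Disk Free Space %" samples trending downward
free_space = [42.0, 41.5, 41.1, 40.4, 40.0, 39.6, 39.1]
print(linear_forecast(free_space, 3))
```

In practice the historical points would come from the Data Warehouse's daily aggregates rather than a hard-coded list.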
What we need to be aware of is that the linear method is not particularly accurate, and data seasonality is left out of the picture entirely. Linear forecasting should therefore only be regarded as a guideline showing the general trend of where a value is heading.
Keep on reading to find out why the linear forecast depicted below might be a little deceiving!
Since many of the IT services you support can have high or low peaks during different time periods, it is important to keep data seasonality in mind. These periods are not necessarily the same as the four natural seasons of the year. Some common data seasons are:
- low usage during summer vacation
- peak usage before Easter sale
- high load on government IT infrastructure during elections (every x years)
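To illustrate what respecting a season means in the simplest possible terms, here is a seasonal-naive forecast: each future period is predicted to equal the same period in the most recent completed season. This is a teaching sketch, not a SCOM feature, and the sample data is invented:

```python
def seasonal_naive(history, season_length, periods_ahead):
    """Predict each future period as the value one full season earlier."""
    start = len(history) - season_length  # first index of the last season
    return [history[start + (i % season_length)] for i in range(periods_ahead)]

# Made-up monthly request counts for one year (season length = 12);
# note the dip during the summer months
monthly = [100, 110, 90, 95, 120, 60, 55, 65, 115, 120, 130, 140]
print(seasonal_naive(monthly, 12, 3))  # → [100, 110, 90]
```

A forecast like this captures the repeating pattern but ignores any overall trend, which is why production-grade methods combine both.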
As mentioned previously, the default setting is to truncate data from the SCOM Data Warehouse once it is 400 days old. Is that a long enough period to accommodate your seasonality? Can your SCOM Data Warehouse handle more than that?
Preparing the data
The SCOM Data Warehouse is huge. It accumulates everything written to the database throughout those 400 days, so retaining data for an even longer period increases storage costs. Furthermore, additional data in this complex data warehouse makes extraction and reporting even slower than it already is, making the whole setup less and less usable as time goes by.
Why not just pick the subset of data we need for answering our specific questions? Why not store it in a different shape, making it more reporting-friendly?
No reason not to do so.
At Approved we call it our Data Mart: a relatively lightweight, compressed, and cleaned data set that can hold as much as 10 years of data (if that would be of any value).
We use this data set as our base when performing advanced forecasting.
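As a sketch of the kind of compression such a data mart performs (the tuple layout here is hypothetical, not our actual schema), hourly samples can be collapsed into one min/avg/max row per day:

```python
from collections import defaultdict
from statistics import mean

def to_daily_rows(hourly_samples):
    """Collapse hourly (day, value) samples into (day, min, avg, max) rows."""
    by_day = defaultdict(list)
    for day, value in hourly_samples:
        by_day[day].append(value)
    # one compact row per day, sorted chronologically
    return [(day, min(vals), mean(vals), max(vals))
            for day, vals in sorted(by_day.items())]
```

Storing only such summary rows keeps years of history small and makes reporting queries trivially fast compared with scanning the full warehouse.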
Our main goals when generating automated forecasts are to make sure that predicted values are accurate both short and long term, that data seasonality is considered and the results are consistent day after day.
Microsoft has developed a great set of tools to help out with forecasting (and other Data Mining tasks).
Their Time Series algorithm uses a combination of the ARTXP data mining model, optimized for short-term predictions, and the ARIMA model, optimized for long-term predictions. Two separate models are trained on the same data: one using the ARTXP algorithm and one using the ARIMA algorithm. The results of the two mining models are then blended together to yield the best prediction over a variable number of time slices.
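The blending idea can be illustrated with a toy weighting scheme: trust the short-term model early in the horizon and shift trust to the long-term model later. Note that this is purely an illustration of the concept; Microsoft's actual ARTXP/ARIMA blending logic is more sophisticated than a linear weight:

```python
def blend_forecasts(short_term, long_term):
    """Blend two equal-length forecasts: the weight shifts linearly from
    the short-term model at the first step to the long-term model at the
    last. Illustrative only -- not Microsoft's actual ARTXP/ARIMA blend."""
    n = len(short_term)
    if n == 1:
        return [short_term[0]]
    return [(1 - i / (n - 1)) * short_term[i] + (i / (n - 1)) * long_term[i]
            for i in range(n)]
```

The payoff of such a combination is accuracy at both ends of the horizon, which is exactly the goal stated above.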
You can read more about the Microsoft Time Series on MSDN.
Remember the linear forecast we calculated earlier? Here is the same data forecasted with Microsoft Time Series (the linear forecast is shown as a gray line for reference). The last part of the forecast is red because Disk Free Space % has already dropped below 10% – a clear indication of a potential problem. And we know about it half a year in advance.
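Once a forecast exists, turning it into an early warning is straightforward: find the first future period where the predicted value crosses the threshold. A small sketch with made-up numbers (the 10% threshold mirrors the disk-space example above):

```python
def periods_until_breach(forecast, threshold):
    """Index of the first forecast value below `threshold`, or None if safe."""
    for i, value in enumerate(forecast):
        if value < threshold:
            return i
    return None

# Made-up monthly "Disk Free Space %" forecast
forecast = [22.0, 18.5, 15.0, 11.8, 9.4, 7.1]
print(periods_until_breach(forecast, 10.0))  # → 4
```

Wiring a check like this into an alerting pipeline is what turns a forecast chart into months of advance notice.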
This is just one example illustrating the importance of choosing and configuring the right forecasting algorithm. While the linear version is much easier to implement and deliver, it is the advanced model that will actually help your IT organization reach availability and stability levels never measured before!
Great, we have a forecast. Now what?
Many great opportunities arise once the process of generating automated forecasts is established and we have clean, consistent, and trustworthy data.
As in the example above, we have a great chance to make sure we don’t run out of resources within specific entities.
We can use forecast data to plan capacity for upcoming changes and investments related to them.
Data is there. Now the sky is the limit. Keep an eye out for future posts where we are going to take a closer look at how we use operational data to make tech life easier.