
What unit (Bits, Bytes, %, etc.) is my Alarm measuring?

In short, it depends on the unit of the underlying metric and on the aggregation function used in the Alarm.

When you define an Alarm and select the relevant metric, the metric's unit is selected automatically. You then set the threshold value, the time period (e.g. 1 min or 5 min) and the aggregation function (min, max, avg or sum).
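For illustration only, those three choices could be written down as a plain data structure. The field names below are our own shorthand for this article, not the product's configuration format:

```python
# Illustrative sketch only: alarms are defined through the UI, and these field
# names are our own shorthand, not the product's configuration format.
cpu_alarm = {
    "metric": "CPU Utilization",  # the metric determines the unit (here: percent)
    "aggregation": "avg",         # one of: min, max, avg, sum
    "period": "1 min",            # evaluation time period
    "threshold": 80,              # expressed in the metric's unit
}

print(cpu_alarm)
```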

For example, if you are interested in generating an alert if average CPU utilization on your web servers goes above 80% for at least 1 minute, it would look similar to this screen:

[Screenshot: Alarm settings for average CPU utilization above 80%]

Note that we used the 'Avg' function, which computes the average of the 300 samples collected for each server selected by this Alarm. Each sample is a percentage, so the result is also a percentage.
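As a minimal sketch with made-up sample values, this is essentially what the 'Avg' aggregation does, and why the result keeps the metric's unit:

```python
# Made-up CPU readings for one server; each sample is a percentage.
samples_percent = [78.0] * 150 + [85.0] * 150    # 300 hypothetical samples

avg = sum(samples_percent) / len(samples_percent)
print(f"Avg CPU utilization: {avg:.1f} %")       # averaging percents yields a percent
print("Above 80% threshold:", avg > 80)
```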

Let us take another example. We want to make sure that incoming web traffic does not drop below a certain value. In this case, we would use the 'Inbound Network Traffic' metric and the 'Min' function, so the Alarm will look similar to this one:

[Screenshot: Alarm settings for minimum inbound network traffic]

In this case, the aggregation function will choose the minimum value from the 60 samples (for each monitored server). Since the original metric's unit is MBytes/sec, the result will also be in MBytes/sec.
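Again as a rough sketch with invented numbers: 'Min' simply picks one of the samples, so the unit cannot change:

```python
# Made-up per-second inbound traffic readings, each in MBytes/sec.
import random

random.seed(1)
samples_mbytes_per_sec = [round(random.uniform(2.0, 6.0), 2) for _ in range(60)]

minimum = min(samples_mbytes_per_sec)
print(f"Min inbound traffic: {minimum} MBytes/sec")   # still MBytes/sec
print("Below 3 MBytes/sec threshold:", minimum < 3)
```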

However, if we would like to define the alarm threshold as the total traffic transferred through the web server during 1 minute, we would use the 'Sum' function. It adds up the 60 samples, each measured in MBytes/sec; since each sample covers one second, the sum represents the total volume transferred, measured in MBytes.
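Here is a sketch of the same minute of (made-up) traffic aggregated with 'Sum'; because each sample covers exactly one second, summing the per-second rates gives a total volume:

```python
# Made-up traffic: a steady 4 MBytes/sec for one minute (60 one-second samples).
samples_mbytes_per_sec = [4.0] * 60

# rate (MBytes/sec) * 1 sec per sample => MBytes; summing gives the minute's total.
total_mbytes = sum(rate * 1 for rate in samples_mbytes_per_sec)
print(f"Traffic transferred in the last minute: {total_mbytes} MBytes")   # 240 MBytes
```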
