


Why we sometimes use mean and sometimes use standard deviation to measure volatility
Both average true range and historical volatility measure the volatility of a financial instrument's price. Average true range measures volatility on an intraday basis, while historical volatility measures it on an interday basis.

Why, then, do we compute a mean for average true range, but a standard deviation for historical volatility?




1 Comment



I went and looked up the definitions of these terms, and it turns out the difference is not that one uses a mean and the other does not, but rather that they apply a different number of means:

Either statistic can be used for both intraday and interday measurements.
Historical volatility is a deviation measurement (a standard deviation) that applies two separate mean computations: the first determines the mean of all the values, and the second determines the mean of the squared differences from that first mean (whose square root is the standard deviation).
Average true range is a deviation measurement that applies only one mean computation: each period's true range is computed without reference to any mean, and then the mean of those true ranges is taken.

So both involve taking means (a mean being the natural way to reduce a variable number of data points to a single statistic), but historical volatility also uses a mean to compute the individual data points in the first place.
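To make the counting concrete, here is a minimal Python sketch of both statistics. Assumptions: the OHLC numbers are made up, historical volatility is taken as the plain (population) standard deviation of log returns, and the ATR here is a simple mean of true ranges rather than the smoothed version often used in practice.

```python
import numpy as np

# Hypothetical daily OHLC data; the numbers are purely illustrative.
high  = np.array([102.0, 103.5, 101.8, 104.2, 105.0])
low   = np.array([ 99.5, 100.8,  99.0, 101.5, 102.7])
close = np.array([101.0, 102.9, 100.2, 103.8, 104.1])

# Average true range: each period's true range involves no mean at all;
# a single mean is then taken over them.
prev_close = close[:-1]
true_range = np.maximum.reduce([
    high[1:] - low[1:],             # today's high-low range
    np.abs(high[1:] - prev_close),  # gap up from yesterday's close
    np.abs(low[1:] - prev_close),   # gap down from yesterday's close
])
atr = true_range.mean()  # the one and only mean

# Historical volatility: one mean to center the returns, then a second
# mean over the squared deviations (followed by a square root).
log_returns = np.diff(np.log(close))
mu = log_returns.mean()                         # first mean
hv = np.sqrt(((log_returns - mu) ** 2).mean())  # second mean

print(f"ATR = {atr:.4f}, historical volatility = {hv:.4f}")
```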

(A fascinating side note about the standard deviation: it can be computed without first computing the overall mean, using the identity Var(X) = E[X^2] - (E[X])^2, so the whole thing can be done in a single pass over the data.)
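A small sketch of that single-pass identity (again with made-up numbers; be aware the one-pass form can be numerically less stable when the mean is large relative to the spread):

```python
import numpy as np

x = np.array([1.0, 2.0, 4.0, 7.0])  # illustrative values

# Two-pass form: compute the mean first, then average the squared deviations.
two_pass = np.sqrt(((x - x.mean()) ** 2).mean())

# One-pass form: accumulate sum(x) and sum(x**2) together, so the overall
# mean never has to be computed up front: Var(X) = E[X^2] - (E[X])^2.
n = len(x)
one_pass = np.sqrt((x ** 2).sum() / n - (x.sum() / n) ** 2)

assert np.isclose(two_pass, one_pass)  # both give the same standard deviation
```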

