Definition of Variance

If a random variable $X$ has expected value (mean) $\mu = E(X)$, then the variance $Var(X)$ of $X$ is given by:

$Var(X) = E[ ( X - \mu ) ^ 2 ]$.
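
To make the definition concrete, here is a small numerical sketch (my own illustration, not part of the original text, assuming NumPy is available): it approximates $E[(X - \mu)^2]$ by simulation for $X$ uniform on $(0,1)$, whose true variance is $1/12$. The variable names are arbitrary choices.

```python
import numpy as np

# Sketch (not from the text): approximate Var(X) = E[(X - mu)^2] by Monte Carlo
# for X ~ Uniform(0, 1), whose true variance is 1/12.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=1_000_000)

mu = 0.5                          # E(X) for Uniform(0, 1)
var_mc = np.mean((x - mu) ** 2)   # sample estimate of E[(X - mu)^2]

print(var_mc, 1 / 12)             # both should be close to 0.08333...
```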

In practice we collect data and estimate $Var(X)$ as either

$s_n^2 = \frac 1n \sum_{i=1}^n \left(x_i - \overline{x} \right)^ 2 = \left(\frac{1}{n} \sum_{i=1}^{n}x_i^2\right) - \overline{x}^2$

(which makes clear the nature of the variance as a mean), or as

$ s^2 = \frac{1}{n-1} \sum_{i=1}^n\left(x_i - \overline{x} \right)^ 2 = \frac{1}{n-1}\sum_{i=1}^n \left(x_i - \frac{1}{n} \sum_{j=1}^nx_j \right)^2 = \frac{1}{n-1}\sum_{i=1}^n \left(\frac{1}{n} \sum_{j=1}^n(x_i - x_j) \right)^2 $,
(which corrects the slight downward bias of $s_n^2$, a fix known as Bessel's correction).
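
As a quick sanity check of both estimators, here is a sketch of my own (assuming NumPy; the sample and the variable names are arbitrary choices): they agree with `np.var` using `ddof=0` and `ddof=1`, respectively.

```python
import numpy as np

# Sketch: compute the two estimators above on a sample and check them
# against np.var, which implements exactly these formulas.
rng = np.random.default_rng(1)
x = rng.normal(loc=2.0, scale=3.0, size=50)
n = len(x)
xbar = x.mean()

s_n2 = np.sum((x - xbar) ** 2) / n          # biased estimator, 1/n
s_n2_alt = np.mean(x ** 2) - xbar ** 2      # same thing via the second form
s2 = np.sum((x - xbar) ** 2) / (n - 1)      # Bessel-corrected estimator, 1/(n-1)

assert np.isclose(s_n2, s_n2_alt)
assert np.isclose(s_n2, np.var(x))          # np.var defaults to ddof=0
assert np.isclose(s2, np.var(x, ddof=1))    # ddof=1 gives the 1/(n-1) version
```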

Let's play around with that last formula a little bit, writing the square as a product over two independent summation indices:

$ s^2 = \frac{1}{n-1}\sum_{i=1}^n \left(\frac{1}{n} \sum_{j=1}^n(x_i - x_j) \right) \left(\frac{1}{n} \sum_{k=1}^n(x_i - x_k) \right). $
Pulling both factors of $\frac{1}{n}$ out in front,
$s^2 = {\frac{1}{n^2(n-1)}}\sum_{i=1}^n \sum_{j=1}^n \sum_{k=1}^n (x_i-x_j)(x_i-x_k).$
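
Before going further, here is a brute-force numerical check of that triple-sum form (again a sketch of my own, assuming NumPy; the sample is arbitrary and kept small, since the explicit loop is $O(n^3)$):

```python
import numpy as np

# Sketch: verify the triple-sum identity against the usual 1/(n-1) estimator.
rng = np.random.default_rng(2)
x = rng.normal(size=20)
n = len(x)

triple = sum((x[i] - x[j]) * (x[i] - x[k])
             for i in range(n) for j in range(n) for k in range(n))
s2_triple = triple / (n ** 2 * (n - 1))

assert np.isclose(s2_triple, np.var(x, ddof=1))
```
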
Adding an appropriate form of zero (a favorite trick!),
$ s^2 = {\frac{1}{n^2(n-1)}} \sum_{i=1}^n \sum_{j=1}^n \sum_{k=1}^n (x_i-x_j)(x_i-x_j+x_j-x_k), $
which is
$ s^2 = {\frac{1}{n(n-1)}} \sum_{i=1}^n \sum_{j=1}^n (x_i-x_j)^2 - {\frac{1}{n^2(n-1)}} \sum_{i=1}^n \sum_{j=1}^n \sum_{k=1}^n (x_j-x_i)(x_j-x_k). $
Notice that the second term in the previous expression is exactly $s^2$ again (relabel $i \leftrightarrow j$ and compare with the triple-sum form above), so
$ s^2 = {\frac{1}{2n(n-1)}} \sum_{i=1}^n \sum_{j=1}^n (x_i-x_j)^2, $
or, finally (since the $i = j$ terms vanish and each unordered pair is counted twice in the full double sum),
$ s^2 = {\frac{1}{n(n-1)}} \sum_{i=1}^n \sum_{j=i+1}^n (x_i-x_j)^2. $
This you can think of as
$ s^2 = \tfrac{1}{2} \times \left( \text{average squared difference over all distinct pairs} \right), $
since there are $\binom{n}{2} = \frac{n(n-1)}{2}$ distinct pairs; equivalently, $s^2$ is the average over all distinct pairs of the two-point sample variance $\frac{(x_i - x_j)^2}{2}$ (which we knew! ;)
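
And a final numerical check of the pairwise formulas (my own sketch, assuming NumPy; names like `s2_pairs` are arbitrary). Notice that none of these computations ever forms the sample mean explicitly.

```python
import numpy as np
from itertools import combinations

# Sketch: the mean-free, pairwise forms of s^2.
rng = np.random.default_rng(3)
x = rng.normal(size=100)
n = len(x)

# Full symmetric double sum: s^2 = (1 / (2n(n-1))) * sum_{i,j} (x_i - x_j)^2.
diffs = x[:, None] - x[None, :]
s2_full = np.sum(diffs ** 2) / (2 * n * (n - 1))

# Distinct pairs only: s^2 = (1 / (n(n-1))) * sum_{i<j} (x_i - x_j)^2,
# i.e. half the average squared difference over the n(n-1)/2 distinct pairs.
pair_sq = [(a - b) ** 2 for a, b in combinations(x, 2)]
s2_pairs = sum(pair_sq) / (n * (n - 1))

assert np.isclose(s2_full, np.var(x, ddof=1))
assert np.isclose(s2_pairs, np.var(x, ddof=1))
assert np.isclose(np.mean(pair_sq) / 2, np.var(x, ddof=1))
```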