Expectation and Variance
The expected value (or mean) of X, where X is a discrete random variable, is a weighted average of the possible values that X can take, each value being weighted according to the probability of that value occurring. The expected value of X is usually written as E(X) or μ.

E(X) = Σ x P(X = x)
So the expected value is the sum of: [(each of the possible outcomes) × (the probability of the outcome occurring)].
In more concrete terms, the expectation is what you would expect the outcome of an experiment to be on average.
Example
What is the expected value when we roll a fair die?
There are six possible outcomes: 1, 2, 3, 4, 5, 6. Each of these has a probability of 1/6 of occurring. Let X represent the outcome of the experiment.
Therefore P(X = 1) = 1/6 (this means that the probability that the outcome of the experiment is 1 is 1/6)
P(X = 2) = 1/6 (the probability that you throw a 2 is 1/6)
P(X = 3) = 1/6 (the probability that you throw a 3 is 1/6)
P(X = 4) = 1/6 (the probability that you throw a 4 is 1/6)
P(X = 5) = 1/6 (the probability that you throw a 5 is 1/6)
P(X = 6) = 1/6 (the probability that you throw a 6 is 1/6)
E(X) = 1×P(X = 1) + 2×P(X = 2) + 3×P(X = 3) + 4×P(X = 4) + 5×P(X = 5) + 6×P(X = 6)
Therefore E(X) = 1/6 + 2/6 + 3/6 + 4/6 + 5/6 + 6/6 = 21/6 = 7/2
So the expectation is 3.5. If you think about it, 3.5 is halfway between the smallest and largest values the die can take, so this is what you should have expected.
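The calculation above can be sketched in a few lines of Python (a minimal illustration, not part of the original notes), using exact fractions to avoid rounding:

```python
from fractions import Fraction

# Possible outcomes of a fair die, each with probability 1/6
outcomes = [1, 2, 3, 4, 5, 6]
p = Fraction(1, 6)

# E(X) = sum of x * P(X = x)
expectation = sum(x * p for x in outcomes)
print(expectation)  # 7/2
```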
Expected Value of a Function of X
To find E[ f(X) ], where f(X) is a function of X, use the following formula:

E[ f(X) ] = Σ f(x) P(X = x)
Example
For the above experiment (with the die), calculate E(X^{2}).
Using our notation above, f(x) = x^{2}
f(1) = 1, f(2) = 4, f(3) = 9, f(4) = 16, f(5) = 25, f(6) = 36
P(X = 1) = 1/6, P(X = 2) = 1/6, etc
So E(X^{2}) = 1/6 + 4/6 + 9/6 + 16/6 + 25/6 + 36/6 = 91/6 ≈ 15.167
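The same sum can be checked in Python (again a small sketch, not part of the original notes), applying f(x) = x² to each outcome before weighting:

```python
from fractions import Fraction

outcomes = [1, 2, 3, 4, 5, 6]
p = Fraction(1, 6)

# E[f(X)] = sum of f(x) * P(X = x), here with f(x) = x^2
e_x_squared = sum(x**2 * p for x in outcomes)
print(e_x_squared)         # 91/6
print(float(e_x_squared))  # ≈ 15.167
```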
The expected value of a constant is just the constant, so for example E(1) = 1. Multiplying a random variable by a constant multiplies the expected value by that constant, so E[2X] = 2E[X].
A useful formula, where a and b are constants, is:

E[aX + b] = aE[X] + b
[This says that expectation is a linear operator].
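Linearity is easy to verify numerically for the die distribution. The sketch below (an illustration with arbitrarily chosen constants a = 2 and b = 3, not taken from the original notes) compares E[aX + b] against aE[X] + b:

```python
from fractions import Fraction

outcomes = [1, 2, 3, 4, 5, 6]
p = Fraction(1, 6)

def expectation(f):
    """E[f(X)] for the fair-die distribution."""
    return sum(f(x) * p for x in outcomes)

a, b = 2, 3  # arbitrary constants chosen for illustration
lhs = expectation(lambda x: a * x + b)  # E[aX + b]
rhs = a * expectation(lambda x: x) + b  # aE[X] + b
print(lhs, rhs)  # 10 10
```

With E(X) = 7/2, both sides give 2 × 7/2 + 3 = 10, as the formula predicts.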
Variance
The variance of a random variable tells us something about the spread of the possible values of the variable. For a discrete random variable X, the variance of X is written as Var(X).

Var(X) = E[ (X − μ)^{2} ], where μ is the expected value E(X)
This can also be written as:

Var(X) = E(X^{2}) − μ^{2}
The standard deviation of X is the square root of Var(X).
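Both variance formulas can be checked against each other for the fair die (a small Python sketch, not part of the original notes). Using the values computed earlier, Var(X) = 91/6 − (7/2)² = 35/12:

```python
from fractions import Fraction

outcomes = [1, 2, 3, 4, 5, 6]
p = Fraction(1, 6)

mu = sum(x * p for x in outcomes)        # E(X) = 7/2
e_x2 = sum(x**2 * p for x in outcomes)   # E(X^2) = 91/6

var_def = sum((x - mu)**2 * p for x in outcomes)  # E[(X - mu)^2]
var_alt = e_x2 - mu**2                            # E(X^2) - mu^2
print(var_def, var_alt)  # 35/12 35/12
```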
Note that the variance does not behave in the same way as expectation when we multiply and add constants to random variables. In fact:

Var[aX + b] = a^{2}Var(X)
This is because:

Var[aX + b] = E[ (aX + b)^{2} ] − (E[aX + b])^{2}
= E[ a^{2}X^{2} + 2abX + b^{2} ] − (aE(X) + b)^{2}
= a^{2}E(X^{2}) + 2abE(X) + b^{2} − a^{2}E^{2}(X) − 2abE(X) − b^{2}
= a^{2}E(X^{2}) − a^{2}E^{2}(X) = a^{2}Var(X)
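This rule can also be verified numerically for the die (a sketch with the same arbitrary constants a = 2 and b = 3 used for illustration, not taken from the original notes):

```python
from fractions import Fraction

outcomes = [1, 2, 3, 4, 5, 6]
p = Fraction(1, 6)

def var(f):
    """Var[f(X)] = E[f(X)^2] - (E[f(X)])^2 for the fair die."""
    e = sum(f(x) * p for x in outcomes)
    e2 = sum(f(x)**2 * p for x in outcomes)
    return e2 - e**2

a, b = 2, 3  # arbitrary constants for illustration
lhs = var(lambda x: a * x + b)  # Var[aX + b]
rhs = a**2 * var(lambda x: x)   # a^2 Var(X)
print(lhs, rhs)  # 35/3 35/3
```

Note that b does not appear in the answer: shifting every value by a constant moves the mean but leaves the spread unchanged.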