The idea of expectation in statistics is different from the ordinary use of the word: it can take values that make no sense in everyday usage, such as 2.5 children. The expected value is the average value of a random variable.
It also differs from probability: probability is the chance of an event happening, while expectation is the long-run value of the probabilistic event. The expected value approximates the sample mean as the sample size approaches infinity. When we deal with samples from a population [which is theoretically of infinite size: a population of hypertensives would be considered infinite, because it refers not only to all hypertensives in the present, whom we cannot count for practical reasons, but in principle also to all hypertensives in the past and in the future], the expected value [which arithmetically is an average] is considered the same as the population mean.
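A minimal sketch (an illustration not in the original text, using a fair six-sided die as an assumed example) of how the sample mean approaches the expected value as the sample grows:

```python
# Simulate die rolls and watch the sample mean approach the expectation (3.5).
import random

random.seed(0)  # fixed seed so the run is reproducible

expected_value = 3.5  # theoretical expectation of a fair six-sided die

for n in (10, 1_000, 100_000):
    rolls = [random.randint(1, 6) for _ in range(n)]
    sample_mean = sum(rolls) / n
    print(f"n = {n:>7}: sample mean = {sample_mean:.3f} (expected {expected_value})")
```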
If one is dealing with outcomes of equal probability, then the expected value is merely a simple average of the values attached to those outcomes. If the probabilities differ, then the expected value is computed as the average of the values weighted by their probabilities.
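A sketch of the weighted case, using a hypothetical unequal-probability example (the payoffs and probabilities below are assumptions for illustration, not from the text):

```python
# A biased game: it pays 10 rupees with probability 0.2 and 1 rupee with probability 0.8.
payoffs = [10, 1]
probabilities = [0.2, 0.8]

# Expected value = sum of (payoff * probability) over all outcomes.
expected_value = sum(x * p for x, p in zip(payoffs, probabilities))
print(expected_value)  # 10*0.2 + 1*0.8 = 2.8
```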
Expectation is the value in the long run. It is a function of two things: the probability of the event and the payoff attached to the event. If one tosses a coin a reasonably large number of times, we know the proportion of heads (or tails) approaches 0.5. If we were paid one rupee for each occurrence of heads and nothing for tails, then in the long run the expectation would be 0.5 rupee per toss, i.e. 1 × P(heads) + 0 × P(tails) = 1 × 0.5 + 0 × 0.5 = 0.5.
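The coin example above, written out as a short sketch (the payoff scheme of 1 rupee for heads and nothing for tails is as stated above):

```python
# Expectation of a fair coin that pays 1 rupee on heads and nothing on tails.
p_heads, p_tails = 0.5, 0.5
payoff_heads, payoff_tails = 1, 0

expectation = payoff_heads * p_heads + payoff_tails * p_tails
print(expectation)  # 0.5 rupee per toss in the long run
```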
Let us use a simple example to calculate expectation:
The probability of each face of a fair six-sided die turning up is 1/6. Let us agree to attach a value equal to the face of the die: 1 [say, rupee] if the side with 1 turns up, 2 if the side with 2 turns up, and so on up to 6. The expectation is then 1×1/6 + 2×1/6 + 3×1/6 + 4×1/6 + 5×1/6 + 6×1/6 = 21/6 = 3.5.
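The same die calculation as a short sketch:

```python
# Expectation of a fair die that pays its face value: each face 1..6 has probability 1/6.
faces = range(1, 7)
expectation = sum(face * (1 / 6) for face in faces)
print(expectation)  # 3.5 (up to floating-point rounding)
```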
The only situation in which the probability of an event and the mathematical expectation are the same is when we consider the indicator random variable of the event. Indeed, probability can be defined as the expectation of an indicator random variable.
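A sketch of this last point, using an assumed example (the event "a fair die shows 6", which is not from the text): the indicator variable is 1 when the event occurs and 0 otherwise, so its long-run average equals the probability of the event, 1/6.

```python
# The average of the indicator over many trials estimates P(die shows 6) = 1/6.
import random

random.seed(1)
n = 100_000
indicator_values = [1 if random.randint(1, 6) == 6 else 0 for _ in range(n)]

print(sum(indicator_values) / n)  # close to 1/6, about 0.167
```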