4.2    Expectation, Variance, Moments, Moment Generating Function

Consider:  in the discrete case, the expected value of a random variable X is defined as E(X) = \sum_x x\, p(x), i.e., it gives the weighted average of all possible values of X, each weighted by its probability.
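For instance (an illustrative example, not one from the original notes), for a fair six-sided die

E(X) = \sum_{x=1}^{6} x \cdot \tfrac{1}{6} = \tfrac{1+2+3+4+5+6}{6} = 3.5.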

Try instead:  approximate the continuous distribution by a discrete one, as follows:
Suppose possible values for X lie within some interval [a,b]. Partition [a,b] into n subintervals of width \Delta x = (b-a)/n with representative points x_1, \dots, x_n; then X falls near x_i with probability approximately f(x_i)\,\Delta x, so the weighted average of this discrete approximation is \sum_{i=1}^{n} x_i\, f(x_i)\,\Delta x, a Riemann sum that converges to \int_a^b x\, f(x)\, dx as n \to \infty.

We use the above to motivate the definition of the expected value of a continuous random variable (extending it to the case where the possible values lie in the infinite range from -\infty to \infty):
 

Def:   If X is a continuous random variable with density function f, then its expected value is defined as

E(X) = \int_{-\infty}^{\infty} x\, f(x)\, dx.

ex: In general,  E(X) gives the “balance point” of the density function: if the region between the density curve and the x-axis were cut out of a piece of wood, the location of the mean would be the point on which the piece would balance.  
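For a concrete computation (an illustrative density chosen here, since the original example is not reproduced), take f(x) = 2x for 0 \le x \le 1 and f(x) = 0 otherwise:

E(X) = \int_0^1 x \cdot 2x\, dx = \left[\tfrac{2x^3}{3}\right]_0^1 = \tfrac{2}{3},

so this density, which puts more weight near 1, balances at x = 2/3.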
 
 

For any function H(X) of X, we define the expected value of H as

E[H(X)] = \int_{-\infty}^{\infty} H(x)\, f(x)\, dx.

 

Thus the moments are

E(X^k) = \int_{-\infty}^{\infty} x^k\, f(x)\, dx, \qquad k = 1, 2, 3, \dots

The variance is Var(X) = E[(X - \mu)^2] = \int_{-\infty}^{\infty} (x - \mu)^2\, f(x)\, dx, where \mu = E(X).

We have the same properties for expectation as before:
  1. E(cX) = c E(X)
  2. E(X+Y)  =  E (X) + E (Y)
As before, these give an alternate formula for computing the variance:

Var(X) = E(X^2) - [E(X)]^2 = E(X^2) - \mu^2.

ex:
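As an illustration (continuing the same hypothetical density f(x) = 2x on [0,1] used above, since the original worked example is not reproduced here):

E(X^2) = \int_0^1 x^2 \cdot 2x\, dx = \tfrac{1}{2}, \qquad Var(X) = \tfrac{1}{2} - \left(\tfrac{2}{3}\right)^2 = \tfrac{1}{18}, \qquad \sigma = \sqrt{1/18} \approx 0.24.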

 

Note:  the standard deviation \sigma = \sqrt{Var(X)} measures the typical deviation from the mean, as before; thus it measures the spread of the density function, i.e., how widely the values tend to be spread around the location of the mean.
 
 

The moment generating function is again defined as

M_X(t) = E(e^{tX}) = \int_{-\infty}^{\infty} e^{tx}\, f(x)\, dx,

and is used as before to find the values of the moments by differentiation:

E(X^k) = M_X^{(k)}(0), \qquad k = 1, 2, 3, \dots
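For example (an illustrative case, not one given in the original notes), for the exponential density f(x) = \lambda e^{-\lambda x}, x \ge 0:

M_X(t) = \int_0^{\infty} e^{tx}\, \lambda e^{-\lambda x}\, dx = \frac{\lambda}{\lambda - t} \quad (t < \lambda),

so M_X'(0) = 1/\lambda = E(X) and M_X''(0) = 2/\lambda^2 = E(X^2), giving Var(X) = 2/\lambda^2 - 1/\lambda^2 = 1/\lambda^2.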



 