Probability and Statistics Lecture Slides (in English)
Chapter 4  Mathematical Expectation
4.1 Mean of Random Variables
4.2 Variance and Covariance
4.3 Means and Variances of Linear Combinations of Random Variables
4.4 Chebyshev's Theorem

Definition 4.1
Let X be a random variable with probability distribution f(x). The mean or expected value of X is
\mu = E(X) = \sum_{x} x f(x)   if X is discrete, and
\mu = E(X) = \int_{-\infty}^{\infty} x f(x)\,dx   if X is continuous.
Remark: The mean of a random variable X can be thought of as a measure of its "center of location", in the sense that it indicates where the "center" of the density lies.

Example 4.1, page 89
The probability distribution of a random variable X is given by
f(x) = \binom{4}{x}\binom{3}{3-x} / \binom{7}{3},   x = 0, 1, 2, 3,
so that f(0) = 1/35, f(1) = 12/35, f(2) = 18/35, f(3) = 4/35. Then
E(X) = (0)(1/35) + (1)(12/35) + (2)(18/35) + (3)(4/35) = 12/7.

Example
The probability distribution of a random variable X is given by f(x) = e^{-x} for x > 0 and f(x) = 0 elsewhere. Then
E(X) = \int_{0}^{\infty} x e^{-x}\,dx = \left[-x e^{-x}\right]_{0}^{\infty} + \int_{0}^{\infty} e^{-x}\,dx = 0 + 1 = 1.

Example 4.2, page 90
In a gambling game a man is paid $5 if he gets all heads or all tails when three coins are tossed, and he pays out $3 if either one or two heads show. What is his expected gain?
Solution:
Let Y be the amount of gain per bet; its possible values are 5 and -3 dollars. Let X be the number of heads that occur when three coins are tossed; its possible values are 0, 1, 2, and 3.
P(Y = 5) = P(X = 0 or X = 3) = 1/8 + 1/8 = 1/4
P(Y = -3) = P(X = 1 or X = 2) = 6/8 = 3/4
E(Y) = (5)(1/4) + (-3)(3/4) = -1
Interpretation: Over the long run the gambler loses, on average, $1 per bet; the more he plays, the more he can expect to lose.

Notice that in the preceding example there are two random variables, X and Y, and Y is a function of X. If we let
Y = g(X) = 5 when X = 0 or 3, and g(X) = -3 when X = 1 or 2,
then
E(Y) = E[g(X)] = (5)P(Y = 5) + (-3)P(Y = -3)
     = (5)[P(X = 0) + P(X = 3)] + (-3)[P(X = 1) + P(X = 2)]
     = (5)P(X = 0) + (5)P(X = 3) + (-3)P(X = 1) + (-3)P(X = 2)
     = g(0)P(X = 0) + g(3)P(X = 3) + g(1)P(X = 1) + g(2)P(X = 2)
     = \sum_{x} g(x) f(x).

Theorem 4.1
Let X be a random variable with probability distribution f(x). The mean or expected value of the random variable g(X) is
\mu_{g(X)} = E[g(X)] = \sum_{x} g(x) f(x)   if X is discrete, and
\mu_{g(X)} = E[g(X)] = \int_{-\infty}^{\infty} g(x) f(x)\,dx   if X is continuous.

Example
Let X denote the length in minutes of a long-distance telephone conversation. Assume that the density of X is
f(x) = (1/10) e^{-x/10},   x > 0.
Find E(X) and E(2X + 3).
Solution:
E(X) = \int_{0}^{\infty} x \cdot (1/10) e^{-x/10}\,dx = 10
E(2X + 3) = \int_{0}^{\infty} (2x + 3) \cdot (1/10) e^{-x/10}\,dx = 2(10) + 3 = 23

Definition 4.2 (Extension)
Let X and Y be random variables with joint probability distribution f(x, y). The mean or expected value of the random variable g(X, Y) is
\mu_{g(X,Y)} = E[g(X, Y)] = \sum_{x}\sum_{y} g(x, y) f(x, y)   if X and Y are discrete, and
\mu_{g(X,Y)} = E[g(X, Y)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x, y) f(x, y)\,dy\,dx   if X and Y are continuous.

Example
Suppose two dice are rolled, one red and one white. Let X be the number on the top face of the red die and Y the number on the top face of the white die. Find E(X + Y).
E(X + Y) = \sum_{x}\sum_{y} (x + y) P(X = x, Y = y)
         = \sum_{x=1}^{6}\sum_{y=1}^{6} (x + y)(1/36)
         = \sum_{x=1}^{6}\sum_{y=1}^{6} x P(X = x, Y = y) + \sum_{x=1}^{6}\sum_{y=1}^{6} y P(X = x, Y = y)
         = 3.5 + 3.5 = 7

Example 4.7, page 93
Find E(Y/X) for the density
f(x, y) = x^3 y^3 / 16 for 0 < x < 2, 0 < y < 2, and f(x, y) = 0 elsewhere.
Solution:
E(Y/X) = \int_{0}^{2}\int_{0}^{2} \frac{y}{x} \cdot \frac{x^3 y^3}{16}\,dy\,dx = \int_{0}^{2}\int_{0}^{2} \frac{x^2 y^4}{16}\,dy\,dx = 16/15
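The integrals in Theorem 4.1 and Definition 4.2 are easy to sanity-check numerically. The following is a minimal Python sketch, not part of the original slides, assuming NumPy and SciPy are installed; the joint density x^3 y^3 / 16 is the form reconstructed above for Example 4.7, and all variable names are illustrative.

```python
# Numerical check of the expectations computed in the examples above.
import numpy as np
from scipy.integrate import quad, dblquad

# Theorem 4.1, continuous case: E[g(X)] = integral of g(x) f(x) dx.
f = lambda x: np.exp(-x / 10) / 10                 # density of the call length X

E_X = quad(lambda x: x * f(x), 0, np.inf)[0]
E_2X_plus_3 = quad(lambda x: (2 * x + 3) * f(x), 0, np.inf)[0]
print(E_X, E_2X_plus_3)                            # ~10.0 and ~23.0, as on the slide

# Definition 4.2, continuous case: E[g(X, Y)] = double integral of g(x, y) f(x, y).
fxy = lambda x, y: x**3 * y**3 / 16                # reconstructed joint density of Example 4.7
E_Y_over_X = dblquad(lambda y, x: (y / x) * fxy(x, y), 0, 2, 0, 2)[0]
print(E_Y_over_X)                                  # ~1.0667, i.e. 16/15
```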
In general
If X and Y are two random variables and f(x, y) is their joint density function, then
E(X) = \sum_{x}\sum_{y} x f(x, y) = \sum_{x} x g(x)   (discrete case)
E(X) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x f(x, y)\,dy\,dx = \int_{-\infty}^{\infty} x g(x)\,dx   (continuous case)
E(Y) = \sum_{x}\sum_{y} y f(x, y) = \sum_{y} y h(y)   (discrete case)
E(Y) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} y f(x, y)\,dx\,dy = \int_{-\infty}^{\infty} y h(y)\,dy   (continuous case)
where g(x) and h(y) are the marginal probability distributions of X and Y, respectively.
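These marginal formulas are easy to verify for the discrete two-dice example. Below is a small NumPy sketch, not part of the original slides; the array names are illustrative only.

```python
# Marginal-expectation check for the two-dice example: f(x, y) = 1/36 on {1,...,6}^2.
import numpy as np

vals = np.arange(1, 7)                      # faces 1..6
fxy = np.full((6, 6), 1 / 36)               # joint distribution f(x, y)

g = fxy.sum(axis=1)                         # marginal of X: g(x) = sum_y f(x, y)
h = fxy.sum(axis=0)                         # marginal of Y: h(y) = sum_x f(x, y)

# E(X) computed two ways: directly from the joint table and from the marginal.
E_X_joint = sum(vals[i] * fxy[i, j] for i in range(6) for j in range(6))
E_X_marginal = np.dot(vals, g)
E_Y_marginal = np.dot(vals, h)

print(E_X_joint, E_X_marginal)              # both 3.5
print(E_X_marginal + E_Y_marginal)          # E(X) + E(Y) = 7 = E(X + Y), as on the slide
```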