Part 6
Example 1.1
Two distinguishable dice (labeled $A$ and $B$) are thrown simultaneously. Here the sample space is $\Omega = \{(i,j) : i, j \in \{1,\dots,6\}\}$, where the outcome $(i,j) \in \Omega$ indicates that $i$ dots are observed on the uppermost face of die $A$ and $j$ dots are observed on the uppermost face of die $B$. For $(i,j) \in \Omega$, define
and
Throughout, $\mathbb{R}^n = \{x = (x_1,\dots,x_n) : -\infty < x_i < \infty,\ i = 1,\dots,n\}$ will denote the $n$-dimensional Euclidean space and, for a set $A \subseteq \mathbb{R}^n$ and a function $X = (X_1,\dots,X_n) : \Omega \to \mathbb{R}^n$,
Definition 1.1
A function $X = (X_1,\dots,X_n) : \Omega \to \mathbb{R}^n$ is a random vector if
$$\{\omega \in \Omega : X_1(\omega) \le a_1, \dots, X_n(\omega) \le a_n\} \in \mathcal{F}, \quad \forall\, a = (a_1,\dots,a_n) \in \mathbb{R}^n.$$
A 1-dimensional random vector will simply be referred to as a random variable.
For $a = (a_1,\dots,a_n), b = (b_1,\dots,b_n) \in \mathbb{R}^n$ with $a_i < b_i$, define
$$(a, b) = (a_1, b_1) \times \cdots \times (a_n, b_n) \equiv \prod_{i=1}^{n} (a_i, b_i),$$
$$(a, b] = (a_1, b_1] \times \cdots \times (a_n, b_n] \equiv \prod_{i=1}^{n} (a_i, b_i],$$
and similarly for $[a, b)$ and $[a, b]$.
Further define the classes
$$\mathcal{C}_1 = \{(-\infty, b] : b \in \mathbb{R}^n\},$$
$$\mathcal{C}_2 = \{(a, b) : a, b \in \mathbb{R}^n,\ a_i < b_i,\ i = 1,\dots,n\},$$
$$\mathcal{C}_3 = \{(a, b] : a, b \in \mathbb{R}^n,\ a_i < b_i,\ i = 1,\dots,n\},$$
$$\mathcal{C}_4 = \{[a, b) : a, b \in \mathbb{R}^n,\ a_i < b_i,\ i = 1,\dots,n\},$$
$$\mathcal{C}_5 = \{[a, b] : a, b \in \mathbb{R}^n,\ a_i < b_i,\ i = 1,\dots,n\},$$
$$\mathcal{C}_6 = \{(-\infty, b) : b \in \mathbb{R}^n\},$$
$$\mathcal{C}_7 = \{(a, \infty) : a \in \mathbb{R}^n\},$$
and
$$\mathcal{C}_8 = \{[a, \infty) : a \in \mathbb{R}^n\}.$$
$X$ is a random vector if, and only if, any one of the following equivalent conditions holds:
a) $X^{-1}(B) \in \mathcal{F},\ \forall B \in \mathcal{C}_2$;
b) $X^{-1}(B) \in \mathcal{F},\ \forall B \in \mathcal{C}_3$;
c) $X^{-1}(B) \in \mathcal{F},\ \forall B \in \mathcal{C}_4$;
d) $X^{-1}(B) \in \mathcal{F},\ \forall B \in \mathcal{C}_5$;
e) $X^{-1}(B) \in \mathcal{F},\ \forall B \in \mathcal{C}_6$;
f) $X^{-1}(B) \in \mathcal{F},\ \forall B \in \mathcal{C}_7$;
g) $X^{-1}(B) \in \mathcal{F},\ \forall B \in \mathcal{C}_8$;
h) $X^{-1}(B) \in \mathcal{F},\ \forall B \in \mathcal{B}_n$.
In particular,
$$X^{-1}(\{a\}) = \{\omega \in \Omega : X_1(\omega) = a_1, \dots, X_n(\omega) = a_n\} \in \mathcal{F},\quad \forall\, a = (a_1,\dots,a_n) \in \mathbb{R}^n.$$
(viii) The function $P_X : \mathcal{B}_n \to \mathbb{R}$ given by
$$P_X(B) = P\big(X^{-1}(B)\big),\quad B \in \mathcal{B}_n,$$
is a probability measure on $\mathcal{B}_n$, called the probability measure induced by $X$.
Example 1.2
Let $A, B \subseteq \Omega$. Define $X = (X_1, X_2) : \Omega \to \mathbb{R}^2$ by
$$X_1(\omega) = I_A(\omega) = \begin{cases} 1, & \text{if } \omega \in A \\ 0, & \text{if } \omega \notin A \end{cases}$$
and
$$X_2(\omega) = I_B(\omega) = \begin{cases} 1, & \text{if } \omega \in B \\ 0, & \text{if } \omega \notin B \end{cases}.$$
Then, for $a = (a_1, a_2) \in \mathbb{R}^2$,
$$X^{-1}\big((-\infty, a]\big) = \{\omega \in \Omega : X_1(\omega) \le a_1,\ X_2(\omega) \le a_2\} = \begin{cases} \emptyset, & \text{if } a_1 < 0 \text{ or } a_2 < 0 \\ A^c \cap B^c, & \text{if } 0 \le a_1 < 1,\ 0 \le a_2 < 1 \\ A^c, & \text{if } 0 \le a_1 < 1,\ a_2 \ge 1 \\ B^c, & \text{if } a_1 \ge 1,\ 0 \le a_2 < 1 \\ \Omega, & \text{if } a_1 \ge 1,\ a_2 \ge 1 \end{cases}.$$
Thus
$$X \text{ is a random vector} \iff A^c, B^c \in \mathcal{F} \iff A, B \in \mathcal{F}.$$
Theorem 1.1
Let $X = (X_1,\dots,X_n) : \Omega \to \mathbb{R}^n$ be a given function. Then $X$ is a random vector if, and only if, $X_i : \Omega \to \mathbb{R}$, $i = 1,\dots,n$, are random variables.
Proof. First suppose that $X = (X_1,\dots,X_n)$ is a random vector. Then, for $a \in \mathbb{R}$ and for fixed $j \in \{1,\dots,n\}$,
$$X_j^{-1}\big((-\infty, a]\big) = X^{-1}\big(\mathbb{R} \times \cdots \times \mathbb{R} \times (-\infty, a] \times \mathbb{R} \times \cdots \times \mathbb{R}\big) \in \mathcal{F},$$
so that each $X_j$ is a random variable. Conversely, suppose that $X_1,\dots,X_n$ are random variables. Then, for $a = (a_1,\dots,a_n) \in \mathbb{R}^n$,
$$X^{-1}\big((-\infty, a]\big) = \{\omega \in \Omega : X_i(\omega) \le a_i,\ i = 1,\dots,n\} = \bigcap_{i=1}^{n} \{\omega \in \Omega : X_i(\omega) \le a_i\} = \bigcap_{i=1}^{n} \underbrace{X_i^{-1}\big((-\infty, a_i]\big)}_{\in\, \mathcal{F}} \in \mathcal{F},$$
i.e., $X$ is a random vector. ▄
Remark 1.1
Definition 1.2
The joint distribution function of a random vector $X = (X_1,\dots,X_n)$ is the function $F_X : \mathbb{R}^n \to \mathbb{R}$ defined by
$$F_X(x_1,\dots,x_n) = P(\{\omega \in \Omega : X_1(\omega) \le x_1, \dots, X_n(\omega) \le x_n\}),\quad x = (x_1,\dots,x_n) \in \mathbb{R}^n.$$
The joint distribution function of any subset of the random variables $X_1,\dots,X_n$ is called a marginal distribution function of $F_X(\cdot)$. ▄
Remark 1.2
(i) If $F_X(\cdot)$ is the distribution function of an $n$-dimensional random vector $X = (X_1,\dots,X_n)$ then
$$F_X(x) = P(X_i \le x_i,\ i = 1,\dots,n) = P\Big(\bigcap_{i=1}^{n}\{X_i \le x_i\}\Big) = P\big(X^{-1}((-\infty, x])\big) = P_X\big((-\infty, x]\big).$$
(ii) Let $F_{X_1,\dots,X_n}(\cdot)$ be the distribution function of a random vector $X = (X_1,\dots,X_n)$ and let $(\alpha_1,\dots,\alpha_n)$ be a permutation of $(1,\dots,n)$. Then
$$F_{X_{\alpha_1},\dots,X_{\alpha_n}}(x_{\alpha_1},\dots,x_{\alpha_n}) = P\Big(\bigcap_{i=1}^{n}\{X_{\alpha_i} \le x_{\alpha_i}\}\Big) = F_{X_1,\dots,X_n}(x_1,\dots,x_n);$$
e.g., for $n = 3$,
$$F_{X_1,X_2,X_3}(x_1,x_2,x_3) = P(X_1 \le x_1,\ X_3 \le x_3,\ X_2 \le x_2) = F_{X_1,X_3,X_2}(x_1,x_3,x_2).$$
(iii) Note that a distribution function $F_{X_1,\dots,X_n}(x_1,\dots,x_n)$ is non-decreasing in each argument when the other arguments are kept fixed.
Lemma 1.1
Let $D \subseteq \mathbb{R}^n$ and let $g : D \to \mathbb{R}$ be a function such that:
(i) $g$ is bounded above, i.e., there exists a real constant $M$ such that $g(x) \le M,\ \forall x \in D$;
(ii) for every fixed $i \in \{1,\dots,n\}$ and fixed $(x_1,\dots,x_{i-1},x_{i+1},\dots,x_n) \in \mathbb{R}^{n-1}$, $g(x_1,\dots,x_{i-1},t,x_{i+1},\dots,x_n)$ is non-decreasing in $t \in D_i = \{t \in \mathbb{R} : (x_1,\dots,x_{i-1},t,x_{i+1},\dots,x_n) \in D\}$.
Then, for any permutation $(\alpha_1,\dots,\alpha_n) \in \mathcal{P}_n$, the iterated limits
$$\lim_{x_{\alpha_1}\to\infty} \cdots \lim_{x_{\alpha_n}\to\infty} g(x)$$
exist and are equal, where $\mathcal{P}_n$ denotes the set of all permutations of $(1,\dots,n)$. We denote the common value of all these iterated limits by
$$\lim_{\substack{x_i \to \infty \\ i=1,\dots,n}} g(x). \qquad▄$$
Now fix $k \in \{1,\dots,n-1\}$ and $(x_{k+1},\dots,x_n) \in \mathbb{R}^{n-k}$. The function
$$g(x_1,\dots,x_k) = F_X(x_1,\dots,x_k, x_{k+1},\dots,x_n)$$
satisfies properties (i) and (ii) stated in Lemma 1.1. Therefore, for fixed $(x_{k+1},\dots,x_n) \in \mathbb{R}^{n-k}$, the limit $\lim_{x_i\to\infty,\, i=1,\dots,k} F_X(x^*, x_{k+1},\dots,x_n)$ exists, where $x^* = (x_1,\dots,x_k)$.
Lemma 1.2
Let $F_X(\cdot)$ be the distribution function of an $n$-dimensional ($n \ge 2$) random vector $X = (X_1,\dots,X_n)$. For a fixed positive integer $k \in \{1,\dots,n-1\}$, let $U = (X_1,\dots,X_k)$ and let $V = (X_{k+1},\dots,X_n)$, so that $X = (U, V)$. Then the marginal distribution function of $U = (X_1,\dots,X_k)$ is given by
$$F_U(x_1,\dots,x_k) = \lim_{\substack{x_j \to \infty \\ j=k+1,\dots,n}} F_X(x_1,\dots,x_n).$$
Proof. For $m = 1, 2, \dots$, let
$$A_m = \Big(\bigcap_{i=1}^{k} X_i^{-1}\big((-\infty, x_i]\big)\Big) \cap \Big(\bigcap_{j=k+1}^{n} X_j^{-1}\big((-\infty, m]\big)\Big),$$
so that $A_m \uparrow \bigcap_{i=1}^{k} X_i^{-1}((-\infty, x_i])$. Then
$$\lim_{\substack{x_j \to \infty \\ j=k+1,\dots,n}} F_X(x_1,\dots,x_n) = \lim_{m\to\infty} P(A_m) = P\Big(\bigcup_{m=1}^{\infty} A_m\Big) = P\Big(\bigcap_{i=1}^{k} X_i^{-1}\big((-\infty, x_i]\big)\Big) = F_U(x_1,\dots,x_k). \qquad▄$$
Remark 1.3
Let $X = (X_1,\dots,X_n)$ be a random vector and let $(\alpha_1,\dots,\alpha_n) \in \mathcal{P}_n$, the set of all permutations of $(1,\dots,n)$. If $(\beta_1,\dots,\beta_n)$ is the inverse permutation of $(\alpha_1,\dots,\alpha_n)$ then, for a fixed $k \in \{1,\dots,n-1\}$, the marginal distribution function of $(X_{\alpha_1},\dots,X_{\alpha_k})$ is given by
$$F_{X_{\alpha_1},\dots,X_{\alpha_k}}(x_{\alpha_1},\dots,x_{\alpha_k}) = \lim_{\substack{x_{\alpha_j} \to \infty \\ j=k+1,\dots,n}} F_{X_{\alpha_1},\dots,X_{\alpha_n}}(x_{\alpha_1},\dots,x_{\alpha_n}). \qquad▄$$
$$P(a_1 < X_1 \le b_1,\ a_2 < X_2 \le b_2) = P(a_1 < X_1 \le b_1,\ X_2 \le b_2) - P(a_1 < X_1 \le b_1,\ X_2 \le a_2).$$
Note that the set $\Delta_{k,n}$ has $\binom{n}{k}$ elements.
From (1.2) and (1.3) we have
$$P(a < X \le b) = F_X(b) - F_X(a) \quad (1.5)$$
and
$$P(a_1 < X_1 \le b_1,\ a_2 < X_2 \le b_2) = F_X(b_1,b_2) - F_X(b_1,a_2) - F_X(a_1,b_2) + F_X(a_1,a_2). \quad (1.6)$$
Lemma 1.3
Let $X = (X_1,\dots,X_n) : \Omega \to \mathbb{R}^n$ be a random vector and let $a = (a_1,\dots,a_n), b = (b_1,\dots,b_n) \in \mathbb{R}^n$ with $a_i < b_i$, $i = 1,\dots,n$. Let $\Delta_{k,n} \equiv \Delta_{k,n}\big((a,b]\big)$, $k = 0, 1,\dots,n$, be as defined in (1.4) (so that $\Delta_{k,n}((a,b])$ consists of those vertices $c = (c_1,\dots,c_n)$ of $(a,b]$, $c_i \in \{a_i, b_i\}$, having exactly $k$ coordinates equal to the corresponding $a_i$). Then
$$P(a_i < X_i \le b_i,\ i = 1,\dots,n) = \sum_{k=0}^{n} (-1)^k \sum_{c \in \Delta_{k,n}((a,b])} F_X(c). \quad (1.7)$$
Proof. From (1.5) and (1.6) it is clear that the result is true for $n = 1$ and $n = 2$. Now suppose that (1.7) holds for general $n$-dimensional random vectors. For $(X_1,\dots,X_n,X_{n+1}) : \Omega \to \mathbb{R}^{n+1}$, $a = (a_1,\dots,a_{n+1}) \in \mathbb{R}^{n+1}$ and $b = (b_1,\dots,b_{n+1}) \in \mathbb{R}^{n+1}$, write $a^* = (a_1,\dots,a_n)$ and $b^* = (b_1,\dots,b_n)$. For simplicity assume that $P(a_{n+1} < X_{n+1} \le b_{n+1}) \ne 0$. Then, applying the induction hypothesis to the conditional probability measure $P(\cdot \mid a_{n+1} < X_{n+1} \le b_{n+1})$,
$$P(a_i < X_i \le b_i,\ i=1,\dots,n+1) = P(a_i < X_i \le b_i,\ i=1,\dots,n \mid a_{n+1} < X_{n+1} \le b_{n+1})\, P(a_{n+1} < X_{n+1} \le b_{n+1})$$
$$= \sum_{k=0}^{n} (-1)^k \sum_{c \in \Delta_{k,n}((a^*,b^*])} P(X_i \le c_i,\ i=1,\dots,n \mid a_{n+1} < X_{n+1} \le b_{n+1})\, P(a_{n+1} < X_{n+1} \le b_{n+1})$$
$$= \sum_{k=0}^{n} (-1)^k \sum_{c \in \Delta_{k,n}((a^*,b^*])} P(X_i \le c_i,\ i=1,\dots,n,\ a_{n+1} < X_{n+1} \le b_{n+1})$$
$$= \sum_{k=0}^{n} (-1)^k \sum_{c \in \Delta_{k,n}((a^*,b^*])} \big[P(X_1 \le c_1, \dots, X_n \le c_n,\ X_{n+1} \le b_{n+1}) - P(X_1 \le c_1, \dots, X_n \le c_n,\ X_{n+1} \le a_{n+1})\big]$$
$$= \sum_{k=0}^{n+1} (-1)^k \sum_{c \in \Delta_{k,n+1}((a,b])} F_X(c),$$
since a vertex of $(a,b] \subseteq \mathbb{R}^{n+1}$ with $k$ coordinates equal to the corresponding $a_i$ arises either from a vertex $c \in \Delta_{k,n}((a^*,b^*])$ with last coordinate $b_{n+1}$, or from a vertex $c \in \Delta_{k-1,n}((a^*,b^*])$ with last coordinate $a_{n+1}$ (the latter carrying the extra sign). ▄
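The vertex-sum formula of Lemma 1.3 can be checked numerically. The sketch below (an assumption for illustration, not part of the text) uses the joint distribution function of three independent Uniform(0,1) variables, for which the box probability has an obvious closed form:

```python
from itertools import combinations

def F(x):
    # Joint df of three independent Uniform(0,1) variables
    # (an assumed concrete example; any valid df would do).
    p = 1.0
    for xi in x:
        p *= min(max(xi, 0.0), 1.0)
    return p

def rectangle_probability(F, a, b):
    """P(a_i < X_i <= b_i, i=1..n) via the inclusion-exclusion formula
    of Lemma 1.3: signed sum of F over the vertex sets Delta_{k,n}."""
    n = len(a)
    total = 0.0
    for k in range(n + 1):
        for idx in combinations(range(n), k):  # coordinates set to a_i
            c = [a[i] if i in idx else b[i] for i in range(n)]
            total += (-1) ** k * F(c)
    return total

a, b = [0.1, -0.5, 0.2], [0.6, 0.4, 0.9]
p = rectangle_probability(F, a, b)
# For independent uniforms the box probability is the product of the
# lengths of the intersections of (a_i, b_i] with (0, 1).
expected = (0.6 - 0.1) * (0.4 - 0.0) * (0.9 - 0.2)
```

For any distribution function the returned value is non-negative, which is exactly property (iv) of Theorem 1.2 below.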
Theorem 1.2
Let $F_X(\cdot)$ be the distribution function of an $n$-dimensional random vector $X = (X_1,\dots,X_n)$. Then:
(i) $\displaystyle\lim_{\substack{x_i\to\infty\\ i=1,\dots,n}} F_X(x_1,\dots,x_n) = 1$;
(ii) for each fixed $i \in \{1,\dots,n\}$ and fixed $(x_1,\dots,x_{i-1},x_{i+1},\dots,x_n) \in \mathbb{R}^{n-1}$, $\displaystyle\lim_{x_i\to-\infty} F_X(x_1,\dots,x_n) = 0$;
(iii) $F_X(x_1,\dots,x_n)$ is right continuous in each argument (keeping the other arguments fixed);
(iv) for each rectangle $(a, b] \subseteq \mathbb{R}^n$,
$$\sum_{k=0}^{n} (-1)^k \sum_{c \in \Delta_{k,n}((a,b])} F_X(c) \ge 0.$$
Proof. Part (iv) is immediate from Lemma 1.3, since the left-hand side equals $P(X \in (a,b]) \ge 0$. For notational convenience we will provide the proofs of (i)–(iii) only for $n = 2$.
(i) Let $A_m = X_1^{-1}((-\infty, m])$ and $B_m = X_2^{-1}((-\infty, m])$, $m = 1, 2, \dots$, so that $A_m \cap B_m \uparrow \Omega$. Therefore,
$$\lim_{\substack{x_i\to\infty\\ i=1,2}} F_{X_1,X_2}(x_1,x_2) = \lim_{m\to\infty} P(A_m \cap B_m) = P\Big(\bigcup_{m=1}^{\infty} (A_m \cap B_m)\Big) = P(\Omega) = 1.$$
(ii) Fix $x_2 \in \mathbb{R}$ and let $B_m = X_1^{-1}((-\infty, -m]) \cap X_2^{-1}((-\infty, x_2])$, $m = 1, 2, \dots$, so that $B_m \downarrow \emptyset$. Therefore
$$\lim_{x_1\to-\infty} F_{X_1,X_2}(x_1,x_2) = \lim_{m\to\infty} P\big(X_1^{-1}((-\infty,-m]) \cap X_2^{-1}((-\infty,x_2])\big) = P\Big(\bigcap_{m=1}^{\infty} B_m\Big) = P(\emptyset) = 0.$$
A similar argument applies with the roles of the two arguments interchanged.
(iii) Fix $(x_1, x_2) \in \mathbb{R}^2$ and let $C_m = X_1^{-1}\big((-\infty, x_1 + \frac{1}{m}]\big) \cap X_2^{-1}\big((-\infty, x_2]\big)$, $m = 1, 2, \dots$, so that $C_m \downarrow X_1^{-1}((-\infty,x_1]) \cap X_2^{-1}((-\infty,x_2])$. Therefore
$$\lim_{h\downarrow 0} F_{X_1,X_2}(x_1 + h, x_2) = \lim_{m\to\infty} F_{X_1,X_2}\Big(x_1 + \frac{1}{m}, x_2\Big) = \lim_{m\to\infty} P(C_m) = P\Big(\bigcap_{m=1}^{\infty} C_m\Big) = F_{X_1,X_2}(x_1, x_2),$$
i.e., for every fixed $x_2 \in \mathbb{R}$, $F_{X_1,X_2}(x_1, x_2)$ is right continuous in $x_1$. Similarly, for every fixed $x_1 \in \mathbb{R}$, $F_{X_1,X_2}(x_1, x_2)$ is right continuous in $x_2$.
Remark 1.4
(i) Let $\Delta_n = \bigcup_{k=0}^{n} \Delta_{k,n}$. Then $\Delta_n$ is the set of $2^n$ vertices of the rectangle $(a, b] \subseteq \mathbb{R}^n$.
Figure 1.1
Figure 1.2
(ii) Note that, for $n = 1$, assertion (iv) of Theorem 1.2 reduces to $F_X(b) \ge F_X(a)$, $\forall\, -\infty < a \le b < \infty$, i.e., $F_X$ is non-decreasing.
Now we state the following theorem without providing its proof. This theorem states that
properties (i) - (iv) described in Theorem 1.2 characterize distribution functions.
Theorem 1.3
Let $G : \mathbb{R}^n \to \mathbb{R}$ be a function such that
(i) $\displaystyle\lim_{\substack{x_i\to\infty\\ i=1,\dots,n}} G(x_1,\dots,x_n) = 1$;
(ii) for each fixed $i \in \{1,\dots,n\}$ and each fixed $(x_1,\dots,x_{i-1},x_{i+1},\dots,x_n) \in \mathbb{R}^{n-1}$,
$$\lim_{t\to-\infty} G(x_1,\dots,x_{i-1}, t, x_{i+1},\dots,x_n) = 0;$$
(iii) $G(x_1,\dots,x_n)$ is right continuous in each argument when the other arguments are kept fixed;
(iv) for each rectangle $(a, b] \subseteq \mathbb{R}^n$,
$$\sum_{k=0}^{n} (-1)^k \sum_{c \in \Delta_{k,n}((a,b])} G(c) \ge 0.$$
Then there exists a probability space $(\Omega, \mathcal{F}, P)$ and a random vector $X = (X_1,\dots,X_n)$ defined on $(\Omega, \mathcal{F}, P)$ such that $G$ is the distribution function of $X$, i.e., $F_X(x) = G(x),\ \forall x \in \mathbb{R}^n$. ▄
Remark 1.5
(i) As in the one-dimensional case, it can be shown that the probability measure $P_X(\cdot)$, induced by a random vector $X$, is completely determined by its distribution function $F_X(\cdot)$. Thus, to study the induced probability measure $P_X(\cdot)$, it is enough to study the distribution function $F_X(\cdot)$.
(ii) If $G : \mathbb{R}^n \to \mathbb{R}$ is a function which satisfies properties (ii) and (iv) of Theorem 1.3, then
$$\lim_{\substack{a_i\to-\infty\\ i=1,\dots,n}} \sum_{k=0}^{n} (-1)^k \sum_{c \in \Delta_{k,n}((a,b])} G(c) \ge 0;$$
since, under (ii), every term of the inner sums involving a coordinate $a_i$ vanishes in the limit, this reduces to $G(b) \ge 0$, $\forall b \in \mathbb{R}^n$.
Example 1.3
Consider the function $G : \mathbb{R}^2 \to \mathbb{R}$ defined by
$$G(x, y) = \begin{cases} xy, & \text{if } 0 \le x < 1,\ 0 \le y < 1 \\ x, & \text{if } 0 \le x < 1,\ y \ge 1 \\ y, & \text{if } x \ge 1,\ 0 \le y < 1 \\ 1, & \text{if } x \ge 1,\ y \ge 1 \\ 0, & \text{otherwise} \end{cases}.$$
(i) Show that $G$ is a distribution function of some two-dimensional random vector, say $(X, Y)$.
(ii) Find the marginal distribution functions of $X$ and $Y$.
Solution. (i) Note that, for $x \ge 1,\ y \ge 1$, $G(x, y) = 1$. Therefore $\lim_{x\to\infty,\, y\to\infty} G(x, y) = 1$. Also, for $x < 0$ or $y < 0$, $G(x, y) = 0$. Therefore, for each fixed $x \in \mathbb{R}$, $\lim_{y\to-\infty} G(x, y) = 0$ and, for each fixed $y \in \mathbb{R}$, $\lim_{x\to-\infty} G(x, y) = 0$.
Moreover,
$$G(x, y) = 0,\ \forall x \in \mathbb{R}, \quad \text{if } y < 0, \quad (1.8)$$
$$G(x, y) = \begin{cases} 0, & \text{if } x < 0 \\ xy, & \text{if } 0 \le x < 1 \\ y, & \text{if } x \ge 1 \end{cases}, \quad \text{if } y \in [0, 1), \quad (1.9)$$
and
$$G(x, y) = \begin{cases} 0, & \text{if } x < 0 \\ x, & \text{if } 0 \le x < 1 \\ 1, & \text{if } x \ge 1 \end{cases}, \quad \text{if } y \in [1, \infty). \quad (1.10)$$
From (1.8)–(1.10) it is evident that, for each fixed value of $y \in \mathbb{R}$, $G(x, y)$ is a continuous (and hence right continuous) function of $x$. Similarly, for each fixed value of $x \in \mathbb{R}$, $G(x, y)$ is a continuous function of $y$.
From (1.8)–(1.10) it is also clear that, for each fixed value of $y \in \mathbb{R}$, $G(x, y)$ is a non-decreasing function of $x \in \mathbb{R}$. Similarly, for each fixed value of $x \in \mathbb{R}$, $G(x, y)$ is a non-decreasing function of $y \in \mathbb{R}$.
Now let $-\infty < a_1 < b_1 < \infty$, $-\infty < a_2 < b_2 < \infty$, $a = (a_1, a_2)$, $b = (b_1, b_2)$ and $(a, b] = (a_1, b_1] \times (a_2, b_2]$. Then
$$\Delta = \sum_{k=0}^{2} (-1)^k \sum_{c \in \Delta_{k,2}((a,b])} G(c) = G(b_1, b_2) - G(a_1, b_2) - G(b_1, a_2) + G(a_1, a_2).$$
The following cases arise:
Case I. $a_1 < 0$ (so that $G(a_1, \cdot) = 0$): $\Delta = G(b_1, b_2) - G(b_1, a_2) \ge 0$, since $G$ is non-decreasing in each argument;
Case II. $a_1 \ge 0$, $a_2 < 0$: similarly, $\Delta = G(b_1, b_2) - G(a_1, b_2) \ge 0$;
Case III. $0 \le a_1 < b_1 < 1$, $0 \le a_2 < b_2 < 1$: $\Delta = b_1 b_2 - a_1 b_2 - b_1 a_2 + a_1 a_2 = (b_1 - a_1)(b_2 - a_2) \ge 0$;
Case IV. $0 \le a_1 < b_1 < 1$, $0 \le a_2 < 1 \le b_2$:
$$\Delta = b_1 - b_1 a_2 - a_1 + a_1 a_2 = (b_1 - a_1)(1 - a_2) \ge 0;$$
Case V. $0 \le a_1 < 1 \le b_1$, $0 \le a_2 < b_2 < 1$:
$$\Delta = b_2 - a_1 b_2 - a_2 + a_1 a_2 = (1 - a_1)(b_2 - a_2) \ge 0;$$
Case VI. $0 \le a_1 < 1 \le b_1$, $0 \le a_2 < 1 \le b_2$:
$$\Delta = 1 - a_2 - a_1 + a_1 a_2 = (1 - a_1)(1 - a_2) \ge 0;$$
Case VII. $0 \le a_1 < b_1 < 1$, $a_2 \ge 1$: $\Delta = b_1 - b_1 - a_1 + a_1 = 0$;
Case VIII. $0 \le a_1 < 1$, $a_2 \ge 1$, $b_1 \ge 1$, $b_2 \ge 1$: $\Delta = 1 - 1 - a_1 + a_1 = 0$;
Case IX. $a_1 \ge 1$, $0 \le a_2 < b_2 < 1$: $\Delta = b_2 - a_2 - b_2 + a_2 = 0$;
Case X. $a_1 \ge 1$, $0 \le a_2 < 1$, $b_1 \ge 1$, $b_2 \ge 1$: $\Delta = 1 - a_2 - 1 + a_2 = 0$;
Case XI. $a_1 \ge 1$, $a_2 \ge 1$, $b_1 \ge 1$, $b_2 \ge 1$: $\Delta = 1 - 1 - 1 + 1 = 0$.
Combining Case I–Case XI it follows that $\Delta \ge 0$ for every rectangle $(a, b]$.
Now using Theorem 1.3 it follows that $G(x_1, x_2)$ is a distribution function of some two-dimensional random vector $(X, Y) \in \mathbb{R}^2$.
(ii) The marginal distribution functions of $X$ and $Y$ are
$$F_X(x) = \lim_{y\to\infty} G(x, y) = \begin{cases} 0, & \text{if } x < 0 \\ x, & \text{if } 0 \le x < 1 \\ 1, & \text{if } x \ge 1 \end{cases}$$
and
$$F_Y(y) = \lim_{x\to\infty} G(x, y) = \begin{cases} 0, & \text{if } y < 0 \\ y, & \text{if } 0 \le y < 1 \\ 1, & \text{if } y \ge 1 \end{cases}. \qquad▄$$
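The case analysis above can be cross-checked by scanning a grid of rectangles and verifying that the signed vertex sum $\Delta$ is never negative. A minimal sketch (the grid values are an arbitrary assumption):

```python
def G(x, y):
    # The function of Example 1.3.
    if x < 0 or y < 0:
        return 0.0
    return min(x, 1.0) * min(y, 1.0)

def delta(a1, a2, b1, b2):
    # Two-dimensional rectangle sum: G(b1,b2) - G(a1,b2) - G(b1,a2) + G(a1,a2)
    return G(b1, b2) - G(a1, b2) - G(b1, a2) + G(a1, a2)

# Scan a grid of rectangles; every Delta must be non-negative.
grid = [-0.5, 0.0, 0.3, 0.7, 1.0, 1.5]
deltas = [delta(a1, a2, b1, b2)
          for a1 in grid for b1 in grid if a1 < b1
          for a2 in grid for b2 in grid if a2 < b2]
min_delta = min(deltas)
```

The marginal limits are also easy to spot-check: `G(0.4, 10)` returns `0.4` and `G(10, 0.25)` returns `0.25`, matching the uniform marginals derived above.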
Example 1.4
Let $G : \mathbb{R}^2 \to \mathbb{R}$ be defined by
$$G(x, y) = \begin{cases} x, & \text{if } 0 \le x < 1,\ y \ge 1 \\ y, & \text{if } x \ge 1,\ 0 \le y < 1 \\ 1, & \text{if } x \ge 1,\ y \ge 1 \\ 0, & \text{otherwise} \end{cases}.$$
Show that $G$ is not a distribution function of any random vector $(X, Y)$.
Solution. Note that $G(x, y)$ is non-decreasing in each argument when the other argument is kept fixed. Let $a_1 \in [0, 1)$, $a_2 \in [0, 1)$, $b_1 \in [1, \infty)$, $b_2 \in [1, \infty)$ with $a_1 + a_2 > 1$, $a = (a_1, a_2)$, $b = (b_1, b_2)$ and $(a, b] = (a_1, b_1] \times (a_2, b_2]$. Then
$$\sum_{k=0}^{2}(-1)^k \sum_{c \in \Delta_{k,2}((a,b])} G(c) = G(b_1, b_2) - G(a_1, b_2) - G(b_1, a_2) + G(a_1, a_2) = 1 - a_1 - a_2 < 0,$$
so that $G$ fails property (iv) of Theorem 1.3. ▄
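The violation is concrete enough to compute directly. The sketch below picks one such rectangle (the numbers $0.7, 0.6, 1.5, 2.0$ are an assumed choice satisfying $a_1 + a_2 > 1$):

```python
def G(x, y):
    # The function of Example 1.4 (no x*y term on [0,1) x [0,1)).
    if x >= 1 and y >= 1:
        return 1.0
    if 0 <= x < 1 and y >= 1:
        return x
    if x >= 1 and 0 <= y < 1:
        return y
    return 0.0

# Rectangle (a, b] with a1, a2 in [0,1), a1 + a2 > 1 and b1, b2 >= 1.
a1, a2, b1, b2 = 0.7, 0.6, 1.5, 2.0
delta = G(b1, b2) - G(a1, b2) - G(b1, a2) + G(a1, a2)
# delta = 1 - 0.7 - 0.6 + 0 < 0, so G violates property (iv).
```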
Let $(\Omega, \mathcal{F}, P)$ be a probability space and let $X = (X_1,\dots,X_n) : \Omega \to \mathbb{R}^n$ be a random vector with distribution function $F_X(x_1,\dots,x_n)$.
Definition 2.1
(i) $X$ is said to be a random vector of discrete type if there exists a non-empty countable set $S_X \subseteq \mathbb{R}^n$ such that $P(X = x) > 0,\ \forall x \in S_X$, and $P(X \in S_X) = \sum_{x \in S_X} P(X = x) = 1$. The set $S_X$ is called the support of the discrete type random vector $X$ (or simply the support of the probability distribution of $X$), and the function
$$f_X(x) = P(X = x),\quad x \in \mathbb{R}^n,$$
which is such that $f_X(x) > 0,\ \forall x \in S_X$, $f_X(x) = 0,\ \forall x \in S_X^c$ (see Remark 2.1 (i) later) and $\sum_{x \in S_X} f_X(x) = 1$, is called the joint probability mass function (p.m.f.) of $X$.
(ii) $X$ is said to be a random vector of absolutely continuous type if there exists a non-negative function $f_X : \mathbb{R}^n \to \mathbb{R}$ such that
$$F_X(x_1,\dots,x_n) = \int_{-\infty}^{x_1} \cdots \int_{-\infty}^{x_n} f_X(t_1,\dots,t_n)\, dt_n \cdots dt_1,\quad \forall x \in \mathbb{R}^n.$$
The function $f_X$ is called a joint probability density function (p.d.f.) of $X$; it satisfies
$$\int_{\mathbb{R}^n} f_X(x_1,\dots,x_n)\, dx = \lim_{\substack{x_i\to\infty\\ i=1,\dots,n}} F_X(x_1,\dots,x_n) = 1. \qquad▄$$
Remark 2.1
(i) Let $X$ be a random vector of discrete type with support $S_X$ and p.m.f. $f_X(\cdot)$. Then $S_X$ is countable, $f_X(x) \ge 0,\ \forall x \in \mathbb{R}^n$, $f_X(x) > 0,\ \forall x \in S_X$, and $\sum_{x \in S_X} f_X(x) = 1$.
(ii) Let $X$ be a random vector of absolutely continuous type with joint p.d.f. $f_X(\cdot)$. Then $f_X(x) \ge 0,\ \forall x \in \mathbb{R}^n$, and
$$\int_{\mathbb{R}^n} f_X(x)\, dx = 1.$$
(iii) Conversely, if a function $h : \mathbb{R}^n \to \mathbb{R}$ satisfies $h(x) \ge 0,\ \forall x \in \mathbb{R}^n$, and
$$\int_{\mathbb{R}^n} h(x)\, dx = 1,$$
then it can be shown that $h(\cdot)$ is a joint p.d.f. of some random vector of absolutely continuous type.
(iv) Whenever
$$\int_{D} \Psi(x)\, dx \quad \Big(\text{or } \sum_{x \in D} \Psi(x)\Big)$$
is finite, we know that the integral (or sum) can be evaluated iteratively (section wise), e.g.,
$$\int_D h(x)\, dx = \int \cdots \int h(x_1,\dots,x_n)\, dx_n \cdots dx_1 \quad \Big(\text{or } \sum_{x \in D} h(x) = \sum \cdots \sum h(x_1,\dots,x_n)\Big).$$
(v) For $a \in \mathbb{R}^n$, let $a_m = \big(a_1 - \frac{1}{m}, \dots, a_n - \frac{1}{m}\big)$, $m = 1, 2, \dots$. Then
$$X^{-1}(\{a\}) = X^{-1}\Big(\bigcap_{m=1}^{\infty} (a_m, a]\Big) = \bigcap_{m=1}^{\infty} X^{-1}\big((a_m, a]\big),$$
with $X^{-1}((a_m, a]) \downarrow X^{-1}(\{a\})$, and therefore
$$P(X = a) = P\Big(\bigcap_{m=1}^{\infty} X^{-1}\big((a_m, a]\big)\Big) = \lim_{m\to\infty} P\big(X \in (a_m, a]\big).$$
(vi) Let $X$ be an $n$-dimensional random vector with distribution function $F_X$ that is continuous at a point $a \in \mathbb{R}^n$. Then, using (v) and Lemma 1.3,
$$P(X = a) = \lim_{m\to\infty} \sum_{k=0}^{n} (-1)^k \sum_{c \in \Delta_{k,n}((a_m, a])} F_X(c) = \sum_{k=0}^{n} (-1)^k \binom{n}{k} F_X(a) = (1 - 1)^n\, F_X(a) = 0,$$
since each of the $\binom{n}{k}$ vertices of $\Delta_{k,n}((a_m, a])$ converges to $a$ as $m \to \infty$ and $F_X$ is continuous at $a$. Thus, if $F_X$ is continuous at $a$, then
$$P(X = a) = 0.$$
(vii) Let $X$ be an $n$-dimensional random vector of continuous type, so that its distribution function $F_X(\cdot)$ is continuous at every $x \in \mathbb{R}^n$. Then, by (vi), $P(X = a) = 0,\ \forall a \in \mathbb{R}^n$, and, for any countable set $C \subseteq \mathbb{R}^n$,
$$P(X \in C) = P\Big(\bigcup_{a \in C} \{X = a\}\Big) = \sum_{a \in C} P(X = a) = 0.$$
(viii) Suppose that $X$ is an $n$-dimensional random vector of absolutely continuous type with p.d.f. $f_X(\cdot)$. Then it can be shown that its distribution function $F_X(\cdot)$ is continuous everywhere, so that, by (vii),
$$P(X = a) = 0,\ \forall a \in \mathbb{R}^n, \quad\text{and}\quad P(X \in C) = 0,$$
for any countable set $C \subseteq \mathbb{R}^n$.
(ix) Let $X$ be an $n$-dimensional random vector of discrete type with joint p.m.f. $f_X(\cdot)$ and support $S_X$. Then, for any $A \in \mathcal{B}_n$,
$$P(X \in A) = P(X \in A \cap S_X) \quad (\text{since } P(X \in S_X) = 1)$$
$$= P\Big(\bigcup_{x \in A \cap S_X} \{X = x\}\Big) = \sum_{x \in A \cap S_X} P(X = x) \quad (A \cap S_X \subseteq S_X \text{ is countable})$$
$$= \sum_{x \in A \cap S_X} f_X(x) = \sum_{x \in S_X} f_X(x)\, I_A(x).$$
(x) Let $X$ be an $n$-dimensional random vector of absolutely continuous type with joint p.d.f. $f_X(\cdot)$ and let $a, b \in \mathbb{R}^n$, $a_i < b_i$, $i = 1,\dots,n$. Then, using the idea of the proof of Lemma 1.3,
$$P(X \in (a, b]) = P(\{a_i < X_i \le b_i,\ i = 1,\dots,n\}) = \sum_{k=0}^{n}(-1)^k \sum_{c \in \Delta_{k,n}((a,b])} F_X(c) = \int_{a_1}^{b_1}\cdots\int_{a_n}^{b_n} f_X(x)\, dx_n \cdots dx_1.$$
More generally, for any $A \in \mathcal{B}_n$,
$$P(X \in A) = \int_{\mathbb{R}^n} f_X(x)\, I_A(x)\, dx.$$
In particular, if $A$ is a set of $n$-dimensional (Lebesgue) measure zero (e.g., a countable set or a hyperplane), then
$$P(X \in A) = \int_{\mathbb{R}^n} f_X(x)\, I_A(x)\, dx = 0.$$
In particular $P(X_i = X_j) = 0,\ \forall i \ne j$.
(xi) Let $X$ be an $n$-dimensional random vector of discrete type with joint distribution function $F_X(\cdot)$, joint p.m.f. $f_X(\cdot)$ and support $S_X$. Then, using (ix),
$$F_X(x) = P(X \in (-\infty, x]) = \sum_{y \in (-\infty, x] \cap S_X} f_X(y),\quad x \in \mathbb{R}^n, \quad (2.1)$$
and, using (v) and Lemma 1.3,
$$f_X(x) = P(X = x) = \lim_{m\to\infty} \sum_{k=0}^{n} (-1)^k \sum_{c \in \Delta_{k,n}((x_m, x])} F_X(c), \quad (2.2)$$
where $x_m = \big(x_1 - \frac{1}{m}, \dots, x_n - \frac{1}{m}\big)$, $m = 1, 2, \dots$.
Using (2.1) and (2.2) we conclude that the joint distribution function of a discrete type random vector determines its joint p.m.f., and vice versa.
(xii) If $X$ is a random vector of absolutely continuous type then its joint p.d.f. is not unique, and there are different versions of the joint p.d.f. In fact, if the values of the joint p.d.f. $f_X(\cdot)$ of a random vector of absolutely continuous type are changed on a set of $n$-dimensional measure zero (e.g., on a countable number of curves), with other non-negative values, then the resulting function is again a p.d.f. of $X$.
(xiii) Suppose that the partial derivative
$$\frac{\partial^n}{\partial x_1 \cdots \partial x_n} F_X(x_1,\dots,x_n)$$
exists everywhere except possibly on a set $E$, and that
$$\int_{\mathbb{R}^n} \frac{\partial^n}{\partial x_1 \cdots \partial x_n} F_X(x_1,\dots,x_n)\, I_{E^c}(x)\, dx = 1.$$
Then $X$ is of absolutely continuous type with a joint p.d.f. given by
$$f_X(x) = \begin{cases} \dfrac{\partial^n}{\partial x_1 \cdots \partial x_n} F_X(x_1,\dots,x_n), & \text{if } x \notin E \\ a(x), & \text{if } x \in E \end{cases},$$
where $a(x)$ is any non-negative quantity.
(xiv) Let $X$ be an $n$-dimensional random vector of absolutely continuous type with joint distribution function $F_X(\cdot)$ and joint p.d.f. $f_X(\cdot)$. Then
$$F_X(x_1,\dots,x_n) = \int_{-\infty}^{x_1} \cdots \int_{-\infty}^{x_n} f_X(t_1,\dots,t_n)\, dt_n \cdots dt_1,\quad x \in \mathbb{R}^n.$$
Clearly the joint distribution function of an absolutely continuous type random vector $X$ is determined by its joint p.d.f. $f_X(\cdot)$. Thus, to study the probability measure $P_X(\cdot)$ induced by an absolutely continuous type random vector $X$, it is enough to study its joint p.d.f. $f_X(\cdot)$.
(xv) Using Remark 1.2 (ii) and (v) above, it follows that if $f_X(x)$, $x \in \mathbb{R}^n$, is the p.m.f. (a p.d.f.) of an $n$-dimensional random vector $X = (X_1,\dots,X_n)$ then, for any permutation $(\alpha_1,\dots,\alpha_n)$ of $(1,\dots,n)$ with inverse permutation $(\beta_1,\dots,\beta_n)$, the joint p.m.f. (joint p.d.f.) of $(X_{\alpha_1},\dots,X_{\alpha_n})$ is
$$f_{X_{\alpha_1},\dots,X_{\alpha_n}}(x_1,\dots,x_n) = f_{X_1,\dots,X_n}(x_{\beta_1},\dots,x_{\beta_n}),\quad x \in \mathbb{R}^n.$$
Theorem 2.1
Let $X = (X_1,\dots,X_n)$ be an $n$-dimensional random vector, fix $k \in \{1,\dots,n-1\}$, and let $U = (X_1,\dots,X_k)$.
(i) Suppose that $X$ is of discrete type with support $S_X$ and p.m.f. $f_X(\cdot)$. For $u \in \mathbb{R}^k$, define $T_u = \{z \in \mathbb{R}^{n-k} : (u, z) \in S_X\}$ (note that, for each $u \in \mathbb{R}^k$, $T_u$ is a countable set). Then the random vector $U = (X_1,\dots,X_k)$ is of discrete type with support $S_U = \{u \in \mathbb{R}^k : (u, z) \in S_X$, for some $z \in \mathbb{R}^{n-k}\}$ and joint p.m.f. (called the marginal p.m.f. of $U$)
$$f_U(u) = \begin{cases} \displaystyle\sum_{z \in T_u} f_X(u, z), & \text{if } u \in S_U \\ 0, & \text{otherwise} \end{cases}.$$
(ii) Suppose that $X$ is of absolutely continuous type with joint p.d.f. $f_X(\cdot)$. Then the random vector $U = (X_1,\dots,X_k)$ is of absolutely continuous type with p.d.f. (called the marginal p.d.f. of $U$)
$$f_U(u) = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} f_X(u, z)\, dz_{n-k} \cdots dz_1,\quad u \in \mathbb{R}^k.$$
Proof. (i) Since $S_X$ is countable, so is its coordinate projection $S_U$, and
$$P(U \in S_U) \ge P(X \in S_X) = 1,$$
i.e., $P(U \in S_U) = 1$. Moreover, writing $V = (X_{k+1},\dots,X_n)$, for $u \in S_U$,
$$P(\{U = u\}) = P\big(\{U = u\} \cap \{X \in S_X\}\big) \quad (\text{since } P(X \in S_X) = 1)$$
$$= P\big(\{U = u\} \cap \{(u, V) \in S_X\}\big) = \sum_{z \in T_u} f_X(u, z).$$
Note that, for $u \in S_U$, $T_u \ne \emptyset$, and, for $z \in T_u$, $(u, z) \in S_X$. Therefore we have $f_X(u, z) > 0$, $\forall u \in S_U$ and $z \in T_u$. It follows that $P(U \in S_U) = 1$ and $P(\{U = u\}) > 0,\ \forall u \in S_U$. Hence the assertion follows.
(ii) Using Lemma 1.2, the distribution function of $U$ is
$$F_U(u) = \lim_{\substack{z_i\to\infty\\ i=1,\dots,n-k}} F_X(u, z)$$
$$= \lim_{\substack{z_i\to\infty\\ i=1,\dots,n-k}} \int_{-\infty}^{u_1}\cdots\int_{-\infty}^{u_k}\int_{-\infty}^{z_1}\cdots\int_{-\infty}^{z_{n-k}} f_X(t, w)\, dw\, dt$$
$$= \int_{-\infty}^{u_1}\cdots\int_{-\infty}^{u_k}\Big[\int_{-\infty}^{\infty}\cdots\int_{-\infty}^{\infty} f_X(t, w)\, dw\Big]\, dt$$
$$= \int_{-\infty}^{u_1}\cdots\int_{-\infty}^{u_k} h(t)\, dt, \quad (2.3)$$
where
$$h(t) = \int_{-\infty}^{\infty}\cdots\int_{-\infty}^{\infty} f_X(t, w)\, dw,\quad t \in \mathbb{R}^k,$$
is non-negative. Now using (2.3) and the above properties of $h(\cdot)$ it follows that $U$ is of absolutely continuous type with p.d.f.
$$f_U(u) = h(u) = \int_{-\infty}^{\infty}\cdots\int_{-\infty}^{\infty} f_X(u, w)\, dw,\quad u \in \mathbb{R}^k. \qquad▄$$
Example 2.1
Let $Z = (X, Y)$ be a bivariate random vector with p.m.f.
$$f_Z(x, y) = P(\{X = x, Y = y\}) = \begin{cases} c\,y, & \text{if } (x, y) \in D \\ 0, & \text{otherwise} \end{cases},$$
where $D = \{(u, v) \in \mathbb{R}^2 : u, v \in \{1,\dots,m\},\ u \le v\}$, $m\ (\ge 2)$ is a fixed positive integer and $c$ is a fixed real constant.
(i) Find the value of $c$; (ii) find the marginal p.m.f.s of $X$ and $Y$; (iii) find $P(\{X > Y\})$, $P(\{X = Y\})$ and $P(\{X < Y\})$.
Solution. (i) Clearly we must have $c > 0$. The support of $Z$ is $S_Z = D = \{(u, v) \in \mathbb{R}^2 : u, v \in \{1,\dots,m\},\ u \le v\}$ and therefore
$$\sum_{(x,y) \in S_Z} f_Z(x, y) = 1 \;\Rightarrow\; c \sum_{y=1}^{m}\sum_{x=1}^{y} y = 1 \;\Rightarrow\; c \sum_{y=1}^{m} y^2 = 1 \;\Rightarrow\; c = \frac{6}{m(m+1)(2m+1)}.$$
(ii) The marginal p.m.f. of $X$ is
$$f_X(x) = \begin{cases} \displaystyle\sum_{y \in T_x} f_Z(x, y), & \text{if } x \in S_X \\ 0, & \text{otherwise} \end{cases},$$
where $S_X = \{1,\dots,m\}$ and $T_x = \{x, x+1,\dots,m\}$. For $x \in S_X$, $\sum_{y=x}^{m} c\,y = c\,\dfrac{m(m+1) - (x-1)x}{2}$, so that
$$f_X(x) = \begin{cases} \dfrac{3[m(m+1) - (x-1)x]}{m(m+1)(2m+1)}, & \text{if } x \in S_X \\ 0, & \text{otherwise} \end{cases},$$
where $S_X = \{1,\dots,m\}$. Similarly, the marginal p.m.f. of $Y$ is
$$f_Y(y) = \begin{cases} \displaystyle\sum_{x \in T^*_y} f_Z(x, y), & \text{if } y \in S_Y \\ 0, & \text{otherwise} \end{cases},$$
where $S_Y = \{1,\dots,m\}$ and $T^*_y = \{1,\dots,y\}$. For $y \in S_Y$, $\sum_{x=1}^{y} c\,y = c\,y^2$, so that
$$f_Y(y) = \begin{cases} \dfrac{6y^2}{m(m+1)(2m+1)}, & \text{if } y \in S_Y \\ 0, & \text{otherwise} \end{cases}.$$
(iii) Let $A = \{(u, v) : u > v\}$ and $B = \{(u, v) : u = v\}$. Then, by Remark 2.1 (ix),
$$P(\{X > Y\}) = P(Z \in A) = \sum_{(x,y)\in S_Z \cap A} f_Z(x, y) = 0 \quad (\text{since } S_Z \cap A = \emptyset),$$
$$P(\{X = Y\}) = P(Z \in B) = \sum_{(x,y)\in S_Z \cap B} f_Z(x, y) = c \sum_{y=1}^{m} y = \frac{c\, m(m+1)}{2} = \frac{3}{2m+1}.$$
Therefore
$$P(\{X < Y\}) = 1 - P(\{X > Y\}) - P(\{X = Y\}) = 1 - \frac{3}{2m+1} = \frac{2(m-1)}{2m+1}. \qquad▄$$
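All of the closed forms above can be verified exactly with rational arithmetic for a specific $m$ (the value $m = 4$ below is an assumed choice):

```python
from fractions import Fraction

m = 4  # any fixed integer m >= 2 (assumed value for the check)
support = [(x, y) for y in range(1, m + 1) for x in range(1, y + 1)]

# f(x, y) = c*y on the support; c makes the total mass 1.
c = Fraction(1, sum(y for (_, y) in support))
assert c == Fraction(6, m * (m + 1) * (2 * m + 1))

f = {(x, y): c * y for (x, y) in support}

# Marginal p.m.f.s by summing out the other coordinate.
f_X = {x: sum(p for (u, _), p in f.items() if u == x) for x in range(1, m + 1)}
f_Y = {y: sum(p for (_, v), p in f.items() if v == y) for y in range(1, m + 1)}

p_eq = sum(p for (x, y), p in f.items() if x == y)   # P(X = Y)
p_lt = sum(p for (x, y), p in f.items() if x < y)    # P(X < Y)
```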
Example 2.2
Let $X = (X_1, X_2, X_3)$ be a discrete type random vector with p.m.f.
$$f_X(x_1, x_2, x_3) = \begin{cases} c\,x_1x_2x_3, & \text{if } (x_1,x_2,x_3) \in \{1,2\}\times\{1,2,3\}\times\{1,3\} \\ 0, & \text{otherwise} \end{cases}.$$
Solution. (i) We have
$$\sum_{x \in S_X} f_X(x_1, x_2, x_3) = 1 \;\Rightarrow\; c \sum_{x_1\in\{1,2\}}\sum_{x_2\in\{1,2,3\}}\sum_{x_3\in\{1,3\}} x_1x_2x_3 = 1 \;\Rightarrow\; c = \frac{1}{72}.$$
(ii) The supports of $X_1$, $X_2$ and $X_3$ are
$$S_{X_1} = \{x_1 \in \mathbb{R} : (x_1,x_2,x_3) \in S_X \text{ for some } (x_2,x_3) \in \mathbb{R}^2\} = \{1, 2\},$$
$$S_{X_2} = \{x_2 \in \mathbb{R} : (x_1,x_2,x_3) \in S_X \text{ for some } (x_1,x_3) \in \mathbb{R}^2\} = \{1, 2, 3\}$$
and
$$S_{X_3} = \{x_3 \in \mathbb{R} : (x_1,x_2,x_3) \in S_X \text{ for some } (x_1,x_2) \in \mathbb{R}^2\} = \{1, 3\},$$
respectively.
(iii) For $x_1 \in S_{X_1}$,
$$f_{X_1}(x_1) = \sum_{(x_2,x_3)} f_X(x_1, x_2, x_3) = \frac{x_1}{72}\sum_{x_2\in\{1,2,3\}}\sum_{x_3\in\{1,3\}} x_2x_3 = \frac{24\,x_1}{72} = \frac{x_1}{3}.$$
Therefore the marginal p.m.f.s of $X_1$ and $X_3$ are
$$f_{X_1}(x_1) = \begin{cases} \dfrac{x_1}{3}, & \text{if } x_1 \in \{1,2\} \\ 0, & \text{otherwise} \end{cases}
\quad\text{and}\quad
f_{X_3}(x_3) = \begin{cases} \dfrac{x_3}{4}, & \text{if } x_3 \in \{1,3\} \\ 0, & \text{otherwise} \end{cases},$$
respectively. Similarly, the support of $(X_1, X_3)$ is
$$S_{(X_1,X_3)} = \{1, 2\} \times \{1, 3\}$$
and, for $(x_1, x_3) \in S_{(X_1,X_3)}$,
$$f_{X_1,X_3}(x_1, x_3) = \sum_{x_2\in\{1,2,3\}} c\,x_1x_2x_3 = \frac{6\,x_1x_3}{72} = \frac{x_1x_3}{12},$$
so that the marginal p.m.f. of $(X_1, X_3)$ is
$$f_{X_1,X_3}(x_1, x_3) = \begin{cases} \dfrac{x_1x_3}{12}, & \text{if } (x_1,x_3) \in \{(1,1), (1,3), (2,1), (2,3)\} \\ 0, & \text{otherwise} \end{cases}.$$
(iv) Let $A = \{(x_1,x_2,x_3) \in \mathbb{R}^3 : x_1 = x_2 = x_3\}$. Then $S_X \cap A = \{(1,1,1)\}$ and therefore
$$P(\{X_1 = X_2 = X_3\}) = \sum_{x \in S_X \cap A} f_X(x) = c = \frac{1}{72}. \qquad▄$$
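The marginal and event computations above are finite sums, so they can be replayed exactly:

```python
from fractions import Fraction
from itertools import product

S1, S2, S3 = (1, 2), (1, 2, 3), (1, 3)
f = {(x1, x2, x3): Fraction(x1 * x2 * x3, 72)
     for x1, x2, x3 in product(S1, S2, S3)}
total = sum(f.values())

# Marginals by summing out the other coordinates.
f_X1 = {x1: sum(p for (u, _, _), p in f.items() if u == x1) for x1 in S1}
f_X3 = {x3: sum(p for (_, _, w), p in f.items() if w == x3) for x3 in S3}
f_X1X3 = {(x1, x3): sum(p for (u, _, w), p in f.items() if (u, w) == (x1, x3))
          for x1, x3 in product(S1, S3)}

p_all_equal = sum(p for (x1, x2, x3), p in f.items() if x1 == x2 == x3)
```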
Example 2.3
Let $X = (X_1, X_2, X_3)$ be a random vector of absolutely continuous type with joint p.d.f.
$$f_X(x) = \begin{cases} \dfrac{c}{x_1x_2}, & \text{if } 0 < x_3 < x_2 < x_1 < 1 \\ 0, & \text{otherwise} \end{cases}.$$
Solution. (i) We have
$$\int_{\mathbb{R}^3} f_X(x)\, dx = 1 \;\Rightarrow\; \int_0^1\int_0^{x_1}\int_0^{x_2} \frac{c}{x_1x_2}\, dx_3\, dx_2\, dx_1 = 1 \;\Rightarrow\; \int_0^1\int_0^{x_1} \frac{c}{x_1}\, dx_2\, dx_1 = 1 \;\Rightarrow\; c = 1.$$
(ii) For $0 < x_3 < x_2 < 1$, integrating out $x_1$,
$$f_{X_2,X_3}(x_2, x_3) = \int_{x_2}^{1} \frac{1}{x_1x_2}\, dx_1 = \begin{cases} -\dfrac{\ln x_2}{x_2}, & \text{if } 0 < x_3 < x_2 < 1 \\ 0, & \text{otherwise} \end{cases}.$$
(iii) Integrating out $x_3$ as well,
$$f_{X_2}(x_2) = \int_0^{x_2}\Big(-\frac{\ln x_2}{x_2}\Big)\, dx_3 = \begin{cases} -\ln x_2, & \text{if } 0 < x_2 < 1 \\ 0, & \text{otherwise} \end{cases}.$$
(iv) Finally,
$$P(X \in A) = \int_{\mathbb{R}^3} \frac{1}{x_1x_2}\, I_A(x)\, dx = \int\!\!\!\int\!\!\!\int \frac{1}{x_1x_2}\, dx_3\, dx_2\, dx_1 = \frac{1}{2}. \qquad▄$$
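The closed form $f_{X_2}(x_2) = -\ln x_2$ can be sanity-checked by numerically integrating the joint p.d.f. over the other two coordinates (the midpoint rule and the test point $x_2 = 0.5$ are assumed choices):

```python
import math

def joint(x1, x2, x3):
    # Joint p.d.f. of Example 2.3 (with c = 1).
    return 1.0 / (x1 * x2) if 0 < x3 < x2 < x1 < 1 else 0.0

def marginal_x2(x2, n=400):
    """Midpoint-rule approximation of the marginal p.d.f. of X_2,
    integrating the joint p.d.f. over x1 in (x2, 1) and x3 in (0, x2)."""
    h1 = (1.0 - x2) / n
    h3 = x2 / n
    total = 0.0
    for i in range(n):
        x1 = x2 + (i + 0.5) * h1
        for j in range(n):
            x3 = (j + 0.5) * h3
            total += joint(x1, x2, x3) * h1 * h3
    return total

x2 = 0.5
approx = marginal_x2(x2)
exact = -math.log(x2)  # the closed form derived above
```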
Remark 2.2
(i) There are random vectors that are neither of discrete type nor of continuous type (and hence also not of absolutely continuous type). To see this, let $X = (X_1, X_2)$ have the joint distribution function
$$F_{X_1,X_2}(x_1, x_2) = \begin{cases} \dfrac{1}{2} + \dfrac{x_1x_2}{2}, & \text{if } 0 \le x_1 < 1,\ 0 \le x_2 < 1 \\[2pt] \dfrac{1}{2} + \dfrac{x_1}{2}, & \text{if } 0 \le x_1 < 1,\ x_2 \ge 1 \\[2pt] \dfrac{1}{2} + \dfrac{x_2}{2}, & \text{if } x_1 \ge 1,\ 0 \le x_2 < 1 \\[2pt] 1, & \text{if } x_1 \ge 1,\ x_2 \ge 1 \\ 0, & \text{otherwise} \end{cases}.$$
It is easy to verify that $F_{X_1,X_2}(\cdot)$ is a distribution function (i.e., it satisfies properties (i)–(iv) of Theorem 1.3). The marginal distribution functions of $X_1$ and $X_2$ are
$$F_{X_1}(x_1) = \lim_{x_2\to\infty} F_{X_1,X_2}(x_1, x_2) = \begin{cases} 0, & \text{if } x_1 < 0 \\ \dfrac{1}{2} + \dfrac{x_1}{2}, & \text{if } 0 \le x_1 < 1 \\ 1, & \text{if } x_1 \ge 1 \end{cases}$$
and
$$F_{X_2}(x_2) = \lim_{x_1\to\infty} F_{X_1,X_2}(x_1, x_2) = \begin{cases} 0, & \text{if } x_2 < 0 \\ \dfrac{1}{2} + \dfrac{x_2}{2}, & \text{if } 0 \le x_2 < 1 \\ 1, & \text{if } x_2 \ge 1 \end{cases}.$$
Clearly the set of discontinuity points of $F_{X_1}$ ($= F_{X_2}$) is $D = \{0\}$ and
$$\sum_{x \in D} \big[F_{X_1}(x) - F_{X_1}(x-)\big] = \sum_{x \in D} \big[F_{X_2}(x) - F_{X_2}(x-)\big] = \frac{1}{2} \ne 1.$$
It follows that $X_1$ and $X_2$ are not of discrete type and therefore, using Theorem 2.1 (i), it follows that $(X_1, X_2)$ is not of discrete type.
Note that
$$\big|F_{X_1,X_2}(x_1, x_2) - F_{X_1,X_2}(0, 0)\big| = \Big|F_{X_1,X_2}(x_1, x_2) - \frac{1}{2}\Big| = \begin{cases} \dfrac{1}{2}, & \text{if } x_1 < 0 \text{ or } x_2 < 0 \\[2pt] \dfrac{x_1x_2}{2}, & \text{if } 0 \le x_1 < 1,\ 0 \le x_2 < 1 \end{cases}$$
$$\not\to 0, \quad \text{as } (x_1, x_2) \to (0, 0),$$
i.e., $F_{X_1,X_2}(\cdot)$ is not continuous at $(0, 0)$. Therefore $(X_1, X_2)$ is also not of continuous type.
(ii) There are random vectors which are of continuous type but not of absolutely continuous type. These random vectors are normally difficult to study.
3. Conditional Distributions
Definition 3.1
Let $X$ be an $n$-dimensional random vector and let $C \in \mathcal{B}_n$ be such that $P(X \in C) > 0$. The conditional distribution function of $X$ given $X \in C$ is defined by
$$F_{X|C}(x) = P\big(X \in (-\infty, x] \mid X \in C\big) = \frac{P\big(X \in (-\infty, x] \cap C\big)}{P(X \in C)} = \frac{P(X_1 \le x_1,\dots,X_n \le x_n,\ X \in C)}{P(X \in C)},\quad x \in \mathbb{R}^n. \quad▄$$
For a given $C \in \mathcal{B}_n$ it can be verified that $F_{X|C}(\cdot)$ is a distribution function, i.e., it satisfies properties (i)–(iv) of Theorem 1.3. For a fixed $k \in \{1,\dots,n-1\}$, let $U = (X_1,\dots,X_k)$ ($= (U_1,\dots,U_k)$, say) and $V = (X_{k+1},\dots,X_n)$ ($= (V_1,\dots,V_{n-k})$, say), so that $X = (U, V)$. In many situations it is of interest to study the conditional distribution of $U$ given information on $V$; if, for example, $U$ and $V$ denote respectively the heights and weights of newly born babies in a community, then it may be of interest to study the distribution of heights among babies of a given weight.
To make the above discussion precise, first suppose that $X = (U, V)$ is of discrete type, so that $U$ and $V$ are also of discrete type (see Theorem 2.1 (i)). Let $S_X$, $S_U$ and $S_V$ denote the supports of $X$, $U$ and $V$, respectively. Further let $f_X(\cdot) \equiv f_{U,V}(\cdot)$ and $f_V(\cdot)$ denote the joint p.m.f.s of $X = (U, V)$ and $V$, respectively. Let $v \in S_V$ be fixed, so that $f_V(v) = P(V = v) > 0$. Define $S_{V=v} = \{u \in \mathbb{R}^k : (u, v) \in S_X\}$. Then $S_{V=v} \subseteq S_U = \{u \in \mathbb{R}^k : (u, w) \in S_X$, for some $w \in \mathbb{R}^{n-k}\}$ and, using Definition 3.1, the conditional distribution function of $U$ given $V = v$ is given by
$$F_{U|V}(u \mid v) = \frac{P(U_1 \le u_1,\dots,U_k \le u_k,\ V = v)}{P(V = v)},\quad u \in \mathbb{R}^k \quad (3.1)$$
$$= \frac{\displaystyle\sum_{w \in S_{V=v} \cap (-\infty, u]} f_X(w, v)}{f_V(v)} = \sum_{w \in S_{V=v} \cap (-\infty, u]} \frac{f_X(w, v)}{f_V(v)}. \quad (3.2)$$
Clearly the p.m.f. corresponding to the distribution function $F_{U|V}(\cdot \mid v)$ is (see Remark 2.1 (xi))
$$f_{U|V}(u \mid v) = \begin{cases} \dfrac{f_{U,V}(u, v)}{f_V(v)}, & \text{if } u \in S_{V=v} \\ 0, & \text{otherwise} \end{cases} \quad (3.3)$$
$$= \frac{f_{U,V}(u, v)}{f_V(v)},\quad u \in \mathbb{R}^k. \quad (3.4)$$
Definition 3.2
Let $X = (X_1,\dots,X_n)$ be a discrete type random vector. Then, under the above notation,
(i) the conditional p.m.f. of $U$ given $V = v$ (where $v \in S_V$ is fixed) is defined by (3.3) (or (3.4));
(ii) the conditional distribution function of $U$ given $V = v$ (where $v \in S_V$ is fixed) is defined by (3.1) (or (3.2)). ▄
Now suppose that $X = (U, V)$ is of absolutely continuous type, so that $U$ and $V$ are also of absolutely continuous type (see Theorem 2.1 (ii)). Let $f_X(\cdot) \equiv f_{U,V}(\cdot)$, $f_U(\cdot)$ and $f_V(\cdot)$ denote the p.d.f.s of $X$, $U$ and $V$, respectively. Then we have $P(V = v) = 0,\ \forall v \in \mathbb{R}^{n-k}$ (Remark 2.1 (viii)), and therefore the conditional distribution function of $U$ given $V = v$ cannot be defined by (3.1). For $v \in \mathbb{R}^{n-k}$, note that
$$\{V = v\} = \bigcap_{m_1=1}^{\infty}\cdots\bigcap_{m_{n-k}=1}^{\infty}\Big\{v_i - \frac{1}{m_i} < V_i \le v_i,\ i = 1,\dots,n-k\Big\},$$
which suggests defining
$$F_{U|V}(u \mid v) = \lim_{\substack{h_i\downarrow 0\\ i=1,\dots,n-k}} P\big(U_i \le u_i,\ i = 1,\dots,k \mid v_i - h_i < V_i \le v_i,\ i = 1,\dots,n-k\big)$$
$$= \lim_{\substack{h_i\downarrow 0\\ i=1,\dots,n-k}} \frac{P(\{U_i \le u_i,\ i=1,\dots,k,\ v_i - h_i < V_i \le v_i,\ i=1,\dots,n-k\})}{P(\{v_i - h_i < V_i \le v_i,\ i=1,\dots,n-k\})}$$
$$= \lim_{\substack{h_i\downarrow 0\\ i=1,\dots,n-k}} \frac{\int_{-\infty}^{u_1}\cdots\int_{-\infty}^{u_k}\int_{v_1-h_1}^{v_1}\cdots\int_{v_{n-k}-h_{n-k}}^{v_{n-k}} f_{U,V}(t, w)\, dw\, dt}{\int_{v_1-h_1}^{v_1}\cdots\int_{v_{n-k}-h_{n-k}}^{v_{n-k}} f_V(w)\, dw}$$
$$= \int_{-\infty}^{u_1}\cdots\int_{-\infty}^{u_k} \frac{f_{U,V}(t, v)}{f_V(v)}\, dt,\quad u \in \mathbb{R}^k, \quad (3.7)$$
provided $f_V(v) > 0$ and $v$ is such that (3.5) is satisfied. In that case the p.d.f. corresponding to the distribution function $F_{U|V}(\cdot \mid v)$ is given by
$$f_{U|V}(u \mid v) = \frac{f_{U,V}(u, v)}{f_V(v)},\quad u \in \mathbb{R}^k. \quad (3.8)$$
Definition 3.3
Remark 3.1
Example 3.1
Let $X = (X_1, X_2, X_3)$ be a discrete type random vector with p.m.f.
$$f_X(x_1, x_2, x_3) = \begin{cases} \dfrac{x_1x_2x_3}{72}, & \text{if } (x_1,x_2,x_3) \in \{1,2\}\times\{1,2,3\}\times\{1,3\} \\ 0, & \text{otherwise} \end{cases}.$$
(i) Find the conditional p.m.f. of $X_1$ given that $(X_2, X_3) = (2, 1)$;
(ii) find the conditional p.m.f. of $(X_1, X_3)$ given that $X_2 = 3$.
Solution. (i) We have
$$f_{X_1|(X_2,X_3)}(x_1 \mid (2, 1)) = \frac{P(\{X_1 = x_1, X_2 = 2, X_3 = 1\})}{P(\{(X_2, X_3) = (2, 1)\})} = \begin{cases} \dfrac{2x_1}{72\, P(\{X_2 = 2, X_3 = 1\})}, & \text{if } x_1 \in \{1,2\} \\ 0, & \text{otherwise} \end{cases},$$
where
$$P(\{X_2 = 2, X_3 = 1\}) = \sum_{x_1\in\{1,2\}} \frac{2x_1}{72} = \frac{2(1+2)}{72} = \frac{1}{12}.$$
Therefore
$$f_{X_1|(X_2,X_3)}(x_1 \mid (2, 1)) = \begin{cases} \dfrac{x_1}{3}, & \text{if } x_1 \in \{1,2\} \\ 0, & \text{otherwise} \end{cases}.$$
(ii) Using $P(\{X_1 = x_1, X_2 = 3, X_3 = x_3\})$ and $P(\{X_2 = 3\}) = \frac{1}{2}$, we have
$$f_{X_1,X_3|X_2}(x_1, x_3 \mid 3) = \begin{cases} \dfrac{x_1x_3}{12}, & \text{if } (x_1, x_3) \in \{1,2\}\times\{1,3\} \\ 0, & \text{otherwise} \end{cases}.$$
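Both conditional p.m.f.s are ratios of finite sums, so they can be verified exactly:

```python
from fractions import Fraction
from itertools import product

S = list(product((1, 2), (1, 2, 3), (1, 3)))
f = {x: Fraction(x[0] * x[1] * x[2], 72) for x in S}

# Conditioning event (X2, X3) = (2, 1).
pv = sum(p for (x1, x2, x3), p in f.items() if (x2, x3) == (2, 1))
cond = {x1: f[(x1, 2, 1)] / pv for x1 in (1, 2)}

# Conditioning event X2 = 3.
p3 = sum(p for (x1, x2, x3), p in f.items() if x2 == 3)
cond13 = {(x1, x3): f[(x1, 3, x3)] / p3
          for x1, x3 in product((1, 2), (1, 3))}
```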
Example 3.2
Let $X = (X_1, X_2, X_3)$ be a random vector of absolutely continuous type with joint p.d.f.
$$f_X(x) = \begin{cases} \dfrac{1}{x_1x_2}, & \text{if } 0 < x_3 < x_2 < x_1 < 1 \\ 0, & \text{otherwise} \end{cases}.$$
(i) For $0 < x_3 < x_2 < 1$, find the conditional p.d.f. of $X_1$ given $(X_2, X_3) = (x_2, x_3)$;
(ii) for $0 < x_2 < 1$, find the conditional p.d.f. of $(X_1, X_3)$ given $X_2 = x_2$.
Solution. (i) From Example 2.3, for $0 < x_3 < x_2 < 1$,
$$f_{X_2,X_3}(x_2, x_3) = -\frac{\ln x_2}{x_2}.$$
Therefore,
$$f_{X_1|(X_2,X_3)}(x_1 \mid (x_2, x_3)) = \frac{f_X(x_1,x_2,x_3)}{f_{X_2,X_3}(x_2,x_3)} = \begin{cases} -\dfrac{1}{x_1\ln x_2}, & \text{if } x_2 < x_1 < 1 \\ 0, & \text{otherwise} \end{cases}.$$
Alternatively, $f_{X_1|(X_2,X_3)}(x_1 \mid (x_2, x_3))$ can be found using Remark 3.1.
(ii) From Example 2.3, $f_{X_2}(x_2) = -\ln x_2$, $0 < x_2 < 1$. Therefore, for $0 < x_2 < 1$,
$$f_{X_1,X_3|X_2}(x_1, x_3 \mid x_2) = \frac{f_X(x_1,x_2,x_3)}{f_{X_2}(x_2)} = \begin{cases} -\dfrac{1}{x_1x_2\ln x_2}, & \text{if } x_2 < x_1 < 1,\ 0 < x_3 < x_2 \\ 0, & \text{otherwise} \end{cases}.$$
Alternatively, $f_{X_1,X_3|X_2}(x_1, x_3 \mid x_2)$ can be found using Remark 3.1. ▄
Definition 4.1
Let $\{X_\alpha : \alpha \in \Lambda\}$ be a collection of random variables defined on a probability space $(\Omega, \mathcal{F}, P)$. The random variables $\{X_\alpha : \alpha \in \Lambda\}$ are said to be (statistically) independent if for any finite sub-collection $\{\alpha_1,\dots,\alpha_n\} \subseteq \Lambda$ we have
$$F_{X_{\alpha_1},\dots,X_{\alpha_n}}(x_1,\dots,x_n) = \prod_{i=1}^{n} F_{X_{\alpha_i}}(x_i),\quad \forall (x_1,\dots,x_n) \in \mathbb{R}^n. \qquad▄$$
The observations made in the following remark are immediate from Definition 4.1.
Remark 4.1
(i) The random variables $\{X_\alpha : \alpha \in \Lambda\}$ are independent if, and only if, every finite sub-collection $\{X_{\alpha_1},\dots,X_{\alpha_n}\} \subseteq \{X_\alpha : \alpha \in \Lambda\}$ constitutes a collection of independent random variables;
(iii) it can be shown that (see Theorem 5.3 (ii)) $X_1,\dots,X_n$ are independent if, and only if, for any $A_i \in \mathcal{B}_1$, $i = 1,\dots,n$,
$$P(X_i \in A_i,\ i = 1,\dots,n) = \prod_{i=1}^{n} P(X_i \in A_i).$$
Theorem 4.1
Let $X = (X_1,\dots,X_n)$ be an $n$-dimensional ($n \ge 2$) random vector with joint distribution function $F_{X_1,\dots,X_n}(\cdot)$. Let $F_{X_i}(\cdot)$ denote the marginal distribution function of $X_i$, $i = 1,\dots,n$. Then the random variables $X_1,\dots,X_n$ are independent if, and only if,
$$F_{X_1,\dots,X_n}(x_1,\dots,x_n) = \prod_{i=1}^{n} F_{X_i}(x_i),\quad \forall (x_1,\dots,x_n) \in \mathbb{R}^n. \quad (4.1)$$
Proof. First suppose that $X_1,\dots,X_n$ are independent. Then, by definition, (4.1) obviously holds. Conversely, suppose that (4.1) holds. Then, for any $x \in \mathbb{R}^n$ and any permutation $(\alpha_1,\dots,\alpha_n)$ of $(1,\dots,n)$,
$$F_{X_{\alpha_1},\dots,X_{\alpha_n}}(x_{\alpha_1},\dots,x_{\alpha_n}) = P(X_{\alpha_i} \le x_{\alpha_i},\ i=1,\dots,n) = \prod_{i=1}^{n} F_{X_{\alpha_i}}(x_{\alpha_i}), \quad (4.2)$$
where $\mathcal{P}_n$ denotes the set of all permutations of $(1,\dots,n)$. It follows that, for any $\{\alpha_1,\dots,\alpha_k\} \subseteq \{1,\dots,n\}$ ($1 \le k \le n$) and any $(x_{\alpha_1},\dots,x_{\alpha_k}) \in \mathbb{R}^k$,
$$F_{X_{\alpha_1},\dots,X_{\alpha_k}}(x_{\alpha_1},\dots,x_{\alpha_k}) = \lim_{\substack{x_{\alpha_j}\to\infty\\ j=k+1,\dots,n}} \prod_{i=1}^{n} F_{X_{\alpha_i}}(x_{\alpha_i}) \quad (\text{using (4.2)})$$
$$= \prod_{j=1}^{k} F_{X_{\alpha_j}}(x_{\alpha_j}),\quad \forall (x_{\alpha_1},\dots,x_{\alpha_k}) \in \mathbb{R}^k,$$
i.e., every finite sub-collection of $\{X_1,\dots,X_n\}$ satisfies the defining product condition. Hence $X_1,\dots,X_n$ are independent. ▄
The following remark is immediate from the above theorem and Remark 1.2 (ii).
Remark 4.2
Random variables $X_1,\dots,X_n$ are independent if, and only if, for any $(\alpha_1,\dots,\alpha_n) \in \mathcal{P}_n$ the random variables $X_{\alpha_1},\dots,X_{\alpha_n}$ are independent. ▄
Theorem 4.2
Let $X = (X_1,\dots,X_n)$ be an $n$-dimensional ($n \ge 2$) random vector of either discrete type or of absolutely continuous type. Let $f_{X_1,\dots,X_n}(\cdot)$ denote the joint p.m.f. (or p.d.f.) of $X$ and let $f_{X_i}(\cdot)$ denote the marginal p.m.f. (or p.d.f.) of $X_i$, $i = 1,\dots,n$. Then:
(i) $X_1,\dots,X_n$ are independent if, and only if,
$$f_{X_1,\dots,X_n}(x_1,\dots,x_n) = \prod_{i=1}^{n} f_{X_i}(x_i),\quad \forall (x_1,\dots,x_n) \in \mathbb{R}^n; \quad (4.3)$$
(ii) $X_1,\dots,X_n$ are independent if, and only if,
$$f_{X_1,\dots,X_n}(x_1,\dots,x_n) = \prod_{i=1}^{n} g_i(x_i),\quad \forall (x_1,\dots,x_n) \in \mathbb{R}^n, \quad (4.4)$$
for some non-negative functions $g_1(\cdot),\dots,g_n(\cdot)$. In that case $f_{X_i}(x_i) = d_i\, g_i(x_i)$, $x_i \in \mathbb{R}$, $i = 1,\dots,n$, for some positive constants $d_1,\dots,d_n$;
(iii) if $X_1,\dots,X_n$ are independent then $S_X = S_{X_1}\times\cdots\times S_{X_n}$.
Proof. (i) For notational simplicity we will provide the proof for $n = 2$.
Case I. $X$ is of discrete type.
Let $S_X$ be the support of $X = (X_1, X_2)$ and let $S_{X_i}$ be the support of $X_i$, $i = 1, 2$. First suppose that (4.3) holds. Then clearly $S_X = S_{X_1}\times S_{X_2}$ (see (iii)). Therefore, for $x = (x_1, x_2) \in \mathbb{R}^2$,
$$F_{X_1,X_2}(x_1, x_2) = \sum_{(y_1,y_2)\in S_X \cap ((-\infty,x_1]\times(-\infty,x_2])} f_{X_1,X_2}(y_1, y_2) = \Big(\sum_{y_1} f_{X_1}(y_1)\Big)\Big(\sum_{y_2} f_{X_2}(y_2)\Big) = F_{X_1}(x_1)\,F_{X_2}(x_2),$$
so that, by Theorem 4.1, $X_1$ and $X_2$ are independent.
Conversely, suppose that $X_1$ and $X_2$ are independent. Then, by Theorem 4.1 and Remark 2.1 (xi), for $(x_1, x_2) \in \mathbb{R}^2$,
$$f_{X_1,X_2}(x_1, x_2) = \lim_{m\to\infty}\Big[F_{X_1,X_2}(x_1, x_2) - F_{X_1,X_2}\Big(x_1 - \frac{1}{m}, x_2\Big) - F_{X_1,X_2}\Big(x_1, x_2 - \frac{1}{m}\Big) + F_{X_1,X_2}\Big(x_1 - \frac{1}{m}, x_2 - \frac{1}{m}\Big)\Big]$$
$$= \lim_{m\to\infty}\Big[F_{X_1}(x_1)F_{X_2}(x_2) - F_{X_1}\Big(x_1 - \frac{1}{m}\Big)F_{X_2}(x_2) - F_{X_1}(x_1)F_{X_2}\Big(x_2 - \frac{1}{m}\Big) + F_{X_1}\Big(x_1 - \frac{1}{m}\Big)F_{X_2}\Big(x_2 - \frac{1}{m}\Big)\Big]$$
$$= F_{X_1}(x_1)F_{X_2}(x_2) - F_{X_1}(x_1-)F_{X_2}(x_2) - F_{X_1}(x_1)F_{X_2}(x_2-) + F_{X_1}(x_1-)F_{X_2}(x_2-)$$
$$= F_{X_1}(x_1)\big[F_{X_2}(x_2) - F_{X_2}(x_2-)\big] - F_{X_1}(x_1-)\big[F_{X_2}(x_2) - F_{X_2}(x_2-)\big]$$
$$= \big[F_{X_1}(x_1) - F_{X_1}(x_1-)\big]\big[F_{X_2}(x_2) - F_{X_2}(x_2-)\big] = P(\{X_1 = x_1\})\, P(\{X_2 = x_2\}) = f_{X_1}(x_1)\, f_{X_2}(x_2).$$
Case II. $X$ is of absolutely continuous type.
For simplicity assume that $f_{X_1,X_2}(x_1, x_2)$ is continuous everywhere. Then, by Remark 2.1 (xiii), $X_1$ and $X_2$ are independent
$$\iff F_{X_1,X_2}(x_1, x_2) = F_{X_1}(x_1)\,F_{X_2}(x_2),\ \forall (x_1, x_2) \in \mathbb{R}^2$$
$$\iff f_{X_1,X_2}(x_1, x_2) = \frac{\partial^2}{\partial x_1 \partial x_2}\big(F_{X_1}(x_1)\,F_{X_2}(x_2)\big) = \frac{\partial F_{X_1}(x_1)}{\partial x_1}\cdot\frac{\partial F_{X_2}(x_2)}{\partial x_2} = f_{X_1}(x_1)\,f_{X_2}(x_2),\ \forall (x_1, x_2) \in \mathbb{R}^2.$$
(ii) First suppose that $X_1$ and $X_2$ are independent. Then clearly (4.4) holds with the choice $g_i(x_i) = f_{X_i}(x_i)$, $x_i \in \mathbb{R}$, $i = 1, 2$. Conversely, suppose that (4.4) holds. Let
$$k_i = \int_{-\infty}^{\infty} g_i(x)\, dx,\quad i = 1, 2,$$
so that $k_1 \ge 0$, $k_2 \ge 0$ and
$$k_1 k_2 = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g_1(x_1)\,g_2(x_2)\, dx_1\, dx_2 = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{X_1,X_2}(x_1, x_2)\, dx_1\, dx_2 = 1.$$
Moreover,
$$f_{X_1}(x_1) = \int_{-\infty}^{\infty} f_{X_1,X_2}(x_1, x_2)\, dx_2 = g_1(x_1)\int_{-\infty}^{\infty} g_2(x_2)\, dx_2 = k_2\, g_1(x_1),\quad x_1 \in \mathbb{R}.$$
Similarly, $f_{X_2}(x_2) = k_1\, g_2(x_2)$, $x_2 \in \mathbb{R}$. Thus we have
$$f_{X_1}(x_1)\, f_{X_2}(x_2) = (k_2\, g_1(x_1))(k_1\, g_2(x_2)) = g_1(x_1)\, g_2(x_2) \quad (k_1 k_2 = 1) = f_{X_1,X_2}(x_1, x_2),$$
so that, by (i), $X_1$ and $X_2$ are independent.
(iii) Since $X_1$ and $X_2$ are independent, by (i), $f_{X_1,X_2}(x_1, x_2) = f_{X_1}(x_1)\,f_{X_2}(x_2),\ \forall x \in \mathbb{R}^2$. Therefore
$$S_X = \{(x_1, x_2) : f_{X_1,X_2}(x_1, x_2) > 0\} = \{(x_1, x_2) : f_{X_1}(x_1)\, f_{X_2}(x_2) > 0\} = \{x : f_{X_1}(x) > 0\} \times \{y : f_{X_2}(y) > 0\} = S_{X_1} \times S_{X_2}. \qquad▄$$
Remark 4.3
(i) Let $X = (X_1, X_2)$ be a bivariate random vector of either discrete type or of absolutely continuous type. Let $D_2 = \{x_2 \in \mathbb{R} : f_{X_1|X_2}(\cdot \mid x_2) \text{ is defined}\}$. Then, by Theorem 4.2 (i),
$$X_1 \text{ and } X_2 \text{ are independent} \iff f_{X_1,X_2}(x_1, x_2) = f_{X_1}(x_1)\, f_{X_2}(x_2),\ \forall x = (x_1, x_2) \in \mathbb{R}^2$$
$$\iff f_{X_1|X_2}(x_1 \mid x_2) = f_{X_1}(x_1),\ \forall x_1 \in \mathbb{R},\ x_2 \in D_2.$$
It follows that $X_1$ and $X_2$ are independent if, and only if, for every $x_2 \in D_2$ the conditional distribution of $X_1$ given $X_2 = x_2$ is the same as the unconditional distribution of $X_1$. Similarly, by symmetry, $X_1$ and $X_2$ are independent if, and only if, for every $x_1 \in D_1 = \{x_1 \in \mathbb{R} : f_{X_2|X_1}(\cdot \mid x_1) \text{ is defined}\}$ the conditional distribution of $X_2$ given $X_1 = x_1$ is the same as the unconditional distribution of $X_2$.
(ii) As in Theorem 4.1, random vectors $Y_1,\dots,Y_n$ (of possibly different dimensions) are independent if, and only if,
$$F_{Y_1,\dots,Y_n}(y_1,\dots,y_n) = \prod_{i=1}^{n} F_{Y_i}(y_i),\quad \forall\, y_1,\dots,y_n.$$
(iii) Let $X = (X_1,\dots,X_n)$ be a random vector of independent components and let $k_1,\dots,k_r$ be positive integers such that $\sum_{i=1}^{r} k_i = n$. Define $Y_1 = (X_1,\dots,X_{k_1})$, $Y_2 = (X_{k_1+1},\dots,X_{k_1+k_2})$, \dots, $Y_r = (X_{k_1+\cdots+k_{r-1}+1},\dots,X_n)$. Then, on using the analog of Theorem 4.1 for random vectors, it follows that $Y_1,\dots,Y_r$ are independent random vectors.
Theorem 4.3
Let $Y_1,\dots,Y_n$ be independent random vectors such that $Y_i$ is $p_i$-dimensional, $i = 1,\dots,n$. Let $\psi_i : \mathbb{R}^{p_i} \to \mathbb{R}^{q_i}$, $i = 1,\dots,n$, be Borel functions. Then $\psi_1(Y_1),\dots,\psi_n(Y_n)$ are independent.
Proof. Let $Y = (Y_1,\dots,Y_n)$ and let $Z_i = \psi_i(Y_i)$, $i = 1,\dots,n$. For fixed $z_i \in \mathbb{R}^{q_i}$ define
$$A_i = \{y \in \mathbb{R}^{p_i} : \psi_i(y) \le z_i\},\quad i = 1,\dots,n$$
(where, for $x, y \in \mathbb{R}^{q}$, $x \le y$ means $x_j \le y_j$, $j = 1,\dots,q$). Then, for $z_i \in \mathbb{R}^{q_i}$, $i = 1,\dots,n$, the joint distribution function of $Z = (\psi_1(Y_1),\dots,\psi_n(Y_n))$ is given by
$$F_Z(z_1,\dots,z_n) = P(Y_1 \in A_1,\dots,Y_n \in A_n) = \prod_{j=1}^{n} P(\{Y_j \in A_j\}) = \prod_{j=1}^{n} P(\{Z_j \le z_j\}) = \prod_{j=1}^{n} F_{Z_j}(z_j),$$
where $F_{Z_j}(\cdot)$ denotes the marginal distribution function of $Z_j$, $j = 1,\dots,n$. Now, using the version of Theorem 4.1 for random vectors, it follows that $Z_1,\dots,Z_n$ are independent. ▄
Example 4.1
Let $X = (X_1, X_2, X_3)$ be a discrete type random vector with joint p.m.f.
$$f_X(x_1, x_2, x_3) = \begin{cases} \dfrac{x_1x_2x_3}{72}, & \text{if } (x_1,x_2,x_3) \in \{1,2\}\times\{1,2,3\}\times\{1,3\} \\ 0, & \text{otherwise} \end{cases}.$$
(i) From Example 2.2, the marginal p.m.f.s of $X_1$, $X_2$ and $X_3$ are
$$f_{X_1}(x_1) = \begin{cases} \dfrac{x_1}{3}, & \text{if } x_1 \in \{1,2\} \\ 0, & \text{otherwise} \end{cases},\quad
f_{X_2}(x_2) = \begin{cases} \dfrac{x_2}{6}, & \text{if } x_2 \in \{1,2,3\} \\ 0, & \text{otherwise} \end{cases}$$
and
$$f_{X_3}(x_3) = \begin{cases} \dfrac{x_3}{4}, & \text{if } x_3 \in \{1,3\} \\ 0, & \text{otherwise} \end{cases}.$$
Clearly
$$f_{X_1,X_2,X_3}(x_1, x_2, x_3) = f_{X_1}(x_1)\, f_{X_2}(x_2)\, f_{X_3}(x_3),\quad \forall x = (x_1, x_2, x_3) \in \mathbb{R}^3.$$
Now using Theorem 4.2 (i) it follows that $X_1$, $X_2$ and $X_3$ are independent.
One can also directly infer the independence of $X_1$, $X_2$ and $X_3$ from Theorem 4.2 (ii) by noting that
$$f_{X_1,X_2,X_3}(x_1, x_2, x_3) = g_1(x_1)\, g_2(x_2)\, g_3(x_3),\quad \forall x = (x_1, x_2, x_3) \in \mathbb{R}^3,$$
where
$$g_1(x_1) = \begin{cases} \dfrac{x_1}{72}, & \text{if } x_1 \in \{1,2\} \\ 0, & \text{otherwise} \end{cases},\quad
g_2(x_2) = \begin{cases} x_2, & \text{if } x_2 \in \{1,2,3\} \\ 0, & \text{otherwise} \end{cases}$$
and
$$g_3(x_3) = \begin{cases} x_3, & \text{if } x_3 \in \{1,3\} \\ 0, & \text{otherwise} \end{cases}.$$
(ii) From Example 2.2 (iii) we have
$$f_{X_1,X_3}(x_1, x_3) = \begin{cases} \dfrac{x_1x_3}{12}, & \text{if } (x_1, x_3) \in \{1,2\}\times\{1,3\} \\ 0, & \text{otherwise} \end{cases}.$$
Clearly $f_{X_1,X_3}(x_1, x_3) = f_{X_1}(x_1)\, f_{X_3}(x_3),\ \forall (x_1, x_3) \in \mathbb{R}^2$, and therefore $X_1$ and $X_3$ are independent. ▄
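The factorization criterion of Theorem 4.2 (i) is mechanical to check on a finite support — compare the joint p.m.f. with the product of its marginals at every point:

```python
from fractions import Fraction
from itertools import product

S1, S2, S3 = (1, 2), (1, 2, 3), (1, 3)
f = {x: Fraction(x[0] * x[1] * x[2], 72) for x in product(S1, S2, S3)}

def marginal(axis, values):
    # Sum out all coordinates except `axis`.
    return {v: sum(p for x, p in f.items() if x[axis] == v) for v in values}

m1, m2, m3 = marginal(0, S1), marginal(1, S2), marginal(2, S3)

# Theorem 4.2 (i): independence <=> the joint p.m.f. factorizes into the
# product of the marginals at every support point.
independent = all(f[(a, b, c)] == m1[a] * m2[b] * m3[c]
                  for a, b, c in product(S1, S2, S3))
```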
Example 4.2
Let $X = (X_1, X_2, X_3)$ be a random vector of absolutely continuous type with p.d.f.
$$f_X(x_1, x_2, x_3) = \begin{cases} \dfrac{1}{x_1x_2}, & \text{if } 0 < x_3 < x_2 < x_1 < 1 \\ 0, & \text{otherwise} \end{cases}.$$
The marginal p.d.f.s are
$$f_{X_1}(x_1) = \begin{cases} 1, & \text{if } 0 < x_1 < 1 \\ 0, & \text{otherwise} \end{cases},\qquad
f_{X_2}(x_2) = \begin{cases} -\ln x_2, & \text{if } 0 < x_2 < 1 \\ 0, & \text{otherwise} \end{cases} \quad (\text{see Example 2.3 (iii)})$$
and
$$f_{X_3}(x_3) = \int_{x_3}^{1}\int_{x_2}^{1} \frac{1}{x_1x_2}\, dx_1\, dx_2 = \begin{cases} \dfrac{(\ln x_3)^2}{2}, & \text{if } 0 < x_3 < 1 \\ 0, & \text{otherwise} \end{cases}.$$
Clearly
$$f_{X_1,X_2,X_3}(x_1, x_2, x_3) \ne f_{X_1}(x_1)\, f_{X_2}(x_2)\, f_{X_3}(x_3)$$
on a set of positive measure, and therefore $X_1$, $X_2$ and $X_3$ are not independent. Since $S_X = \{(x_1,x_2,x_3) : f_X(x_1,x_2,x_3) > 0\} = \{(x_1,x_2,x_3) : 0 < x_3 < x_2 < x_1 < 1\}$ and $S_{X_1} = \{x_1 : f_{X_1}(x_1) > 0\} = (0, 1) = S_{X_2} = S_{X_3}$, so that $S_X \ne S_{X_1}\times S_{X_2}\times S_{X_3}$, one can also infer from Theorem 4.2 (iii) that $X_1$, $X_2$ and $X_3$ are not independent.
Note that, for fixed $x_2 \in (0, 1)$,
$$f_{X_1,X_3|X_2}(x_1, x_3 \mid x_2) = \begin{cases} -\dfrac{1}{x_1x_2\ln x_2}, & \text{if } x_2 < x_1 < 1,\ 0 < x_3 < x_2 \\ 0, & \text{otherwise} \end{cases},$$
$$f_{X_1|X_2}(x_1 \mid x_2) = \frac{f_{X_1,X_2}(x_1, x_2)}{f_{X_2}(x_2)} = \begin{cases} -\dfrac{1}{x_1\ln x_2}, & \text{if } x_2 < x_1 < 1 \\ 0, & \text{otherwise} \end{cases}$$
and
$$f_{X_3|X_2}(x_3 \mid x_2) = \frac{f_{X_2,X_3}(x_2, x_3)}{f_{X_2}(x_2)} = \begin{cases} \dfrac{1}{x_2}, & \text{if } 0 < x_3 < x_2 \\ 0, & \text{otherwise} \end{cases}.$$
Thus
$$f_{X_1,X_3|X_2}(x_1, x_3 \mid x_2) = f_{X_1|X_2}(x_1 \mid x_2)\, f_{X_3|X_2}(x_3 \mid x_2),\quad \forall (x_1, x_3) \in \mathbb{R}^2.$$
Now using Theorem 4.2 (i) on the conditional p.d.f. of $(X_1, X_3)$ given $X_2 = x_2$ it follows that, given $X_2 = x_2$, the random variables $X_1$ and $X_3$ are conditionally independent.
One can also infer the conditional independence of $X_1$ and $X_3$ given $X_2 = x_2$ directly from Theorem 4.2 (ii) by noting that, for a fixed $x_2 \in (0, 1)$,
$$f_{X_1,X_3|X_2}(x_1, x_3 \mid x_2) = g_1(x_1)\, g_2(x_3),$$
where
$$g_1(x_1) = \begin{cases} -\dfrac{1}{x_1x_2\ln x_2}, & \text{if } x_2 < x_1 < 1 \\ 0, & \text{otherwise} \end{cases}
\quad\text{and}\quad
g_2(x_3) = \begin{cases} 1, & \text{if } 0 < x_3 < x_2 \\ 0, & \text{otherwise} \end{cases}.$$
The proof of the following theorem, being similar to that of Theorem 3.2, Module 3, is omitted.
Theorem 5.1
▄
Definition 5.1
Some special kinds of expectations are defined below.
Note that
$$\mathrm{Cov}(X_i, X_j) = E\big[(X_i - E(X_i))(X_j - E(X_j))\big] = \mathrm{Cov}(X_j, X_i).$$
Theorem 5.2
Let $X = (X_1,\dots,X_n)$ and $Y = (Y_1,\dots,Y_n)$ be random vectors and let $a_1,\dots,a_n, b_1,\dots,b_n$ be real constants. Then, provided the involved expectations are finite:
(i) $E\Big(\displaystyle\sum_{i=1}^{n} a_i X_i\Big) = \displaystyle\sum_{i=1}^{n} a_i\, E(X_i)$;
(ii) $\mathrm{Cov}\Big(\displaystyle\sum_{i=1}^{n} a_i X_i,\ \sum_{j=1}^{n} b_j Y_j\Big) = \displaystyle\sum_{i=1}^{n}\sum_{j=1}^{n} a_i b_j\, \mathrm{Cov}(X_i, Y_j)$.
In particular,
$$\mathrm{Var}\Big(\sum_{i=1}^{n} a_i X_i\Big) = \sum_{i=1}^{n}\sum_{j=1}^{n} a_i a_j\, \mathrm{Cov}(X_i, X_j) = \sum_{i=1}^{n} a_i^2\, \mathrm{Var}(X_i) + 2\sum_{1\le i<j\le n} a_i a_j\, \mathrm{Cov}(X_i, X_j).$$
Proof. We will provide the proof for the absolutely continuous case. The proof for the discrete case follows similarly.
(i) Let $f_X(\cdot)$ denote the joint p.d.f. of $X = (X_1,\dots,X_n)$. Then
$$E\Big(\sum_{i=1}^{n} a_i X_i\Big) = \int_{\mathbb{R}^n}\Big(\sum_{i=1}^{n} a_i x_i\Big) f_X(x)\, dx = \sum_{i=1}^{n} a_i \int_{\mathbb{R}^n} x_i\, f_X(x)\, dx = \sum_{i=1}^{n} a_i\, E(X_i).$$
(ii) Using (i),
$$\mathrm{Cov}\Big(\sum_{i=1}^{n} a_i X_i, \sum_{j=1}^{n} b_j Y_j\Big) = E\bigg[\Big(\sum_{i=1}^{n} a_i X_i - E\Big(\sum_{i=1}^{n} a_i X_i\Big)\Big)\Big(\sum_{j=1}^{n} b_j Y_j - E\Big(\sum_{j=1}^{n} b_j Y_j\Big)\Big)\bigg]$$
$$= E\bigg[\Big(\sum_{i=1}^{n} a_i\big(X_i - E(X_i)\big)\Big)\Big(\sum_{j=1}^{n} b_j\big(Y_j - E(Y_j)\big)\Big)\bigg]$$
$$= \sum_{i=1}^{n}\sum_{j=1}^{n} a_i b_j\, E\big[(X_i - E(X_i))(Y_j - E(Y_j))\big] = \sum_{i=1}^{n}\sum_{j=1}^{n} a_i b_j\, \mathrm{Cov}(X_i, Y_j).$$
Also,
$$\mathrm{Var}\Big(\sum_{i=1}^{n} a_i X_i\Big) = \mathrm{Cov}\Big(\sum_{i=1}^{n} a_i X_i, \sum_{j=1}^{n} a_j X_j\Big) = \sum_{i=1}^{n}\sum_{j=1}^{n} a_i a_j\, \mathrm{Cov}(X_i, X_j)$$
$$= \sum_{i=1}^{n} a_i^2\, \mathrm{Var}(X_i) + 2\sum_{1\le i<j\le n} a_i a_j\, \mathrm{Cov}(X_i, X_j),$$
on using $\mathrm{Cov}(X_i, X_i) = \mathrm{Var}(X_i)$ and the symmetry $\mathrm{Cov}(X_i, X_j) = \mathrm{Cov}(X_j, X_i)$. ▄
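The bilinear expansion of the variance can be verified exactly on a small joint p.m.f. (the distribution on $\{0,1\}^2$ below is an assumed toy example, not taken from the text):

```python
from fractions import Fraction

# A small (assumed) joint p.m.f. on {0,1}^2, chosen only to exercise
# the identities of Theorem 5.2.
f = {(0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
     (1, 0): Fraction(3, 8), (1, 1): Fraction(1, 8)}

def E(g):
    return sum(p * g(x) for x, p in f.items())

a1, a2 = 2, -3
EX1, EX2 = E(lambda x: x[0]), E(lambda x: x[1])
cov = E(lambda x: (x[0] - EX1) * (x[1] - EX2))
var1 = E(lambda x: (x[0] - EX1) ** 2)
var2 = E(lambda x: (x[1] - EX2) ** 2)

# Direct computation of Var(a1*X1 + a2*X2) ...
s_mean = E(lambda x: a1 * x[0] + a2 * x[1])
var_direct = E(lambda x: (a1 * x[0] + a2 * x[1] - s_mean) ** 2)
# ... versus the expansion of Theorem 5.2.
var_formula = a1 ** 2 * var1 + a2 ** 2 * var2 + 2 * a1 * a2 * cov
```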
Theorem 5.3
Let $Y_1,\dots,Y_n$ be independent random vectors, where $Y_i$ is $p_i$-dimensional, $i = 1,\dots,n$.
(i) Let $\psi_i : \mathbb{R}^{p_i} \to \mathbb{R}$, $i = 1,\dots,n$, be Borel functions. Then, provided the involved expectations are finite,
$$E\Big(\prod_{i=1}^{n} \psi_i(Y_i)\Big) = \prod_{i=1}^{n} E\big(\psi_i(Y_i)\big).$$
(ii) For any $A_i \in \mathcal{B}_{p_i}$, $i = 1,\dots,n$,
$$P(Y_i \in A_i,\ i = 1,\dots,n) = \prod_{i=1}^{n} P(\{Y_i \in A_i\}).$$
Proof. We will provide the proof for the absolutely continuous case. The proof for the discrete case follows similarly and is left as an exercise.
(i) Since $Y_1,\dots,Y_n$ are independent, their joint p.d.f. factorizes as $\prod_{i=1}^{n} f_{Y_i}(y_i)$, and therefore
$$E\Big(\prod_{i=1}^{n}\psi_i(Y_i)\Big) = \int_{-\infty}^{\infty}\cdots\int_{-\infty}^{\infty}\prod_{i=1}^{n}\psi_i(y_i)\, f_{Y_i}(y_i)\, dy_1\cdots dy_n = \prod_{i=1}^{n}\int \psi_i(y_i)\, f_{Y_i}(y_i)\, dy_i = \prod_{i=1}^{n} E\big(\psi_i(Y_i)\big).$$
(ii) Let
$$\psi_i(y_i) = \begin{cases} 1, & \text{if } y_i \in A_i \\ 0, & \text{otherwise} \end{cases},\quad i = 1,\dots,n,$$
so that
$$\prod_{i=1}^{n}\psi_i(Y_i) = \begin{cases} 1, & \text{if } Y_i \in A_i,\ i = 1,\dots,n \\ 0, & \text{otherwise} \end{cases}.$$
Then, using (i),
$$P(Y_i \in A_i,\ i = 1,\dots,n) = E\Big(\prod_{i=1}^{n}\psi_i(Y_i)\Big) = \prod_{i=1}^{n} E\big(\psi_i(Y_i)\big) = \prod_{i=1}^{n} P(\{Y_i \in A_i\}). \qquad▄$$
Corollary 5.1
Let $X_1,\dots,X_n$ be independent random variables. Then, provided the involved expectations are finite,
$$\mathrm{Cov}(X_i, X_j) = 0,\quad \forall i \ne j,$$
i.e., independent random variables are uncorrelated.
Definition 5.2
(i) The correlation coefficient between random variables $X$ and $Y$ is defined by
$$\rho(X, Y) = \frac{\mathrm{Cov}(X, Y)}{\sqrt{\mathrm{Var}(X)\,\mathrm{Var}(Y)}},$$
provided $\mathrm{Var}(X)$ and $\mathrm{Var}(Y)$ are finite and positive.
Note that $\rho(X, Y) = \rho(Y, X)$. Also, from Corollary 5.1 it is clear that if $X$ and $Y$ are independent random variables then they are uncorrelated. However, as the following examples illustrate, the converse may not be true (i.e., uncorrelated random variables may not be independent).
Example 5.1
Let $(X, Y)$ be a bivariate random vector of discrete type with p.m.f. given by
Clearly $\operatorname{Cov}(X, Y) = 0 \implies \rho(X, Y) = 0$. However, $X$ and $Y$ are not independent, since the joint p.m.f. does not factorize into the product of the marginal p.m.f.s.

Example 5.2
Let $X = (X_1, X_2)$ be a bivariate random vector of absolutely continuous type with p.d.f. given by
\[ f_X(x_1, x_2) = \begin{cases} 1, & \text{if } 0 < |x_1| < x_2 < 1 \\ 0, & \text{otherwise} \end{cases}. \]
Then
\[ E(X_2) = \int_0^1 \int_{-x_2}^{x_2} x_2\, dx_1\, dx_2 = \int_0^1 2x_2^2\, dx_2 = \frac{2}{3}, \]
and, since the region of integration is symmetric in $x_1$ about $0$,
\[ E(X_1) = \int_0^1 \int_{-x_2}^{x_2} x_1\, dx_1\, dx_2 = 0 \quad\text{and}\quad E(X_1 X_2) = \int_0^1 \int_{-x_2}^{x_2} x_1 x_2\, dx_1\, dx_2 = 0. \]
Therefore,
\[ \rho(X_1, X_2) = 0, \]
and
\[ f_{X_1}(x_1) = \begin{cases} \displaystyle\int_{|x_1|}^{1} dx_2, & \text{if } -1 < x_1 < 1 \\ 0, & \text{otherwise} \end{cases} = \begin{cases} 1 - |x_1|, & \text{if } -1 < x_1 < 1 \\ 0, & \text{otherwise} \end{cases}. \]
Clearly $X_1$ and $X_2$ are not independent. One can also infer this by directly observing from the joint p.d.f. $f_X(\cdot)$ that
\[ S_X = \{(x_1, x_2)\in\mathbb{R}^2: f_X(x_1, x_2) > 0\} = \{(x_1, x_2): 0 < |x_1| < x_2 < 1\}, \]
\[ S_{X_1} = \{x_1\in\mathbb{R}: f_{X_1}(x_1) > 0\} = (-1, 1),\qquad S_{X_2} = \{x_2\in\mathbb{R}: f_{X_2}(x_2) > 0\} = (0, 1), \]
and that $S_X \neq S_{X_1}\times S_{X_2}$.
▄
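A quick simulation of Example 5.2 (a sketch; the sampler uses the factorization $f_{X_2}(x_2) = 2x_2$ on $(0,1)$ and $X_1\,|\,X_2 = x_2 \sim U(-x_2, x_2)$ implied by the joint p.d.f.) shows a sample correlation near $0$ even though the variables are dependent; e.g. $\operatorname{Cov}(X_1^2, X_2) = \frac{2}{15} - \frac{1}{6}\cdot\frac{2}{3} = \frac{1}{45} > 0$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# f_{X2}(x2) = 2*x2 on (0,1)  =>  X2 = sqrt(U) by inverse-CDF sampling;
# given X2 = x2, X1 is uniform on (-x2, x2).
x2 = np.sqrt(rng.uniform(size=n))
x1 = rng.uniform(-x2, x2)

corr = np.corrcoef(x1, x2)[0, 1]                            # near 0
dep = np.mean(x1**2 * x2) - np.mean(x1**2) * np.mean(x2)    # near 1/45 > 0
```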
Theorem 5.4 (Cauchy-Schwarz Inequality for Random Variables)
Let $(X, Y)$ be a bivariate random vector. Then, provided the involved expectations exist,
\[ \big(E(XY)\big)^2 \le E(X^2)\, E(Y^2). \tag{5.1} \]
The equality in (5.1) is attained if, and only if, $P(\{Y = cX\}) = 1$ or $P(\{X = cY\}) = 1$, for some real constant $c$.

Proof.
Case I. $E(X^2) = 0$.
In this case $P(\{X = 0\}) = 1$ (see Theorem 3.3 (iii), Module 3) and hence $P(\{XY = 0\}) = 1$. It follows that $E(XY) = 0$, $E(X^2)E(Y^2) = 0$, $P(\{X = cY\}) = 1$ (for $c = 0$), and the equality in (5.1) is attained.

Case II. $E(X^2) > 0$.
Then, for every $t\in\mathbb{R}$,
\[ 0 \le E\big((tX - Y)^2\big) = t^2 E(X^2) - 2t\,E(XY) + E(Y^2). \]
This implies that the discriminant of the quadratic equation $t^2E(X^2) - 2t\,E(XY) + E(Y^2) = 0$ is non-positive, i.e.,
\[ 4\big(E(XY)\big)^2 \le 4\,E(X^2)E(Y^2). \]
Moreover, equality holds in (5.1) if, and only if, the above quadratic has a real root $t_0$, i.e., $E\big((t_0X - Y)^2\big) = 0$ for some $t_0\in\mathbb{R}$, which is equivalent to $P(\{Y = t_0X\}) = 1$. ▄

Corollary 5.2
Let $(X_1, X_2)$ be a bivariate random vector with $E(X_i) = \mu_i \in (-\infty,\infty)$ and $\operatorname{Var}(X_i) = \sigma_i^2 \in (0, \infty)$, $i = 1, 2$. Then

(i) $\rho^2(X_1, X_2) \le 1$, i.e., $|\rho(X_1, X_2)| \le 1$;

(ii) $\rho(X_1, X_2) = \pm 1$ if, and only if,
\[ P\left(\left\{\frac{X_2 - \mu_2}{\sigma_2} = d\,\frac{X_1 - \mu_1}{\sigma_1}\right\}\right) = 1, \quad\text{for some } d\in\mathbb{R}. \]
Both assertions follow on applying Theorem 5.4 to the standardized random variables $\frac{X_1 - \mu_1}{\sigma_1}$ and $\frac{X_2 - \mu_2}{\sigma_2}$, for which the expectation of the product equals $\rho(X_1, X_2)$ and the second moments equal $1$.
For a given $z \in S_Z$ (or $z$ satisfying (3.5) and $f_Z(z) > 0$) the conditional p.m.f. (or p.d.f.) of $Y$ given $Z = z$ is given by
\[ f_{Y|Z}(y\,|\,z) = \frac{f_{Y,Z}(y, z)}{f_Z(z)},\qquad y \in \mathbb{R}^n. \]
Let $\psi: \mathbb{R}^n \to \mathbb{R}$ be a Borel function. Then the conditional expectation of $\psi(Y)$ given that $Z = z$ may be defined by
\[ E\big(\psi(Y)\,\big|\,Z = z\big) = \begin{cases} \displaystyle\sum_{y} \psi(y)\, f_{Y|Z}(y\,|\,z), & \text{in the discrete case} \\ \displaystyle\int_{\mathbb{R}^n} \psi(y)\, f_{Y|Z}(y\,|\,z)\, dy, & \text{in the absolutely continuous case} \end{cases}, \]
and $E(\psi(Y)\,|\,Z) = \psi^*(Z)$, where $\psi^*$ is defined by $\psi^*(z) = E(\psi(Y)\,|\,Z = z)$.

Theorem 5.5
Under the above notations, provided the involved expectations exist,

(i) $E\big[E(\psi(Y)\,|\,Z)\big] = E\big(\psi(Y)\big)$;

(ii) $\operatorname{Var}\big(\psi(Y)\big) = E\big[\operatorname{Var}(\psi(Y)\,|\,Z)\big] + \operatorname{Var}\big[E(\psi(Y)\,|\,Z)\big]$.

Proof. We will provide the proof for the absolutely continuous case. The proof for the discrete case follows in a similar fashion.

(i)
\[ E\big[E(\psi(Y)|Z)\big] = \int E(\psi(Y)|Z = z)\, f_Z(z)\, dz = \iint \psi(y)\, f_{Y|Z}(y|z) f_Z(z)\, dy\, dz = \iint \psi(y)\, f_{Y,Z}(y, z)\, dy\, dz = E(\psi(Y)). \]

(ii) Write
\[ \operatorname{Var}\big(\psi(Y)\big) = E\Big[\big(\psi(Y) - E(\psi(Y))\big)^2\Big] = E\Big[E\Big\{\big(\psi(Y) - E(\psi(Y))\big)^2 \,\Big|\, Z\Big\}\Big]. \tag{5.4} \]
Now, adding and subtracting $E(\psi(Y)|Z)$,
\[ E\Big[\big(\psi(Y) - E(\psi(Y))\big)^2\,\Big|\,Z\Big] = E\Big[\big(\psi(Y) - E(\psi(Y)|Z)\big)^2\,\Big|\,Z\Big] + \big(E(\psi(Y)|Z) - E(\psi(Y))\big)^2 = \operatorname{Var}\big(\psi(Y)\,\big|\,Z\big) + \big(E(\psi(Y)|Z) - E(\psi(Y))\big)^2, \tag{5.5} \]
since the cross term has zero conditional expectation. Taking expectations in (5.5) and using (5.4) together with (i), the assertion follows. ▄
Remark 5.1
If $Y$ and $Z$ are independent then $E(\psi(Y)\,|\,Z = z) = E(\psi(Y))$ and $\operatorname{Var}(\psi(Y)\,|\,Z = z) = \operatorname{Var}(\psi(Y))$, for every $z\in S_Z$. ▄
Example 5.1
Let $X = (X_1, X_2, X_3)$ be a discrete type random vector with p.m.f.
\[ f_X(x_1, x_2, x_3) = \begin{cases} \dfrac{x_1 x_2 x_3}{72}, & \text{if } (x_1, x_2, x_3) \in \{1, 2\}\times\{1, 2, 3\}\times\{1, 3\} \\ 0, & \text{otherwise} \end{cases}. \]

(i) Let $U = 2X_1 - X_2 + 3X_3$ and $V = X_1 - 2X_2 + X_3$. Find the correlation coefficient between $U$ and $V$;

(ii) For a fixed $x_2 \in \{1, 2, 3\}$, find $E(Z\,|\,X_2 = x_2)$ and $\operatorname{Var}(Z\,|\,X_2 = x_2)$, where $Z = X_1X_3$.

Solution. (i) From Example 4.1 (i) we know that $X_1, X_2$ and $X_3$ are independent. Therefore $\operatorname{Cov}(X_1, X_2) = \operatorname{Cov}(X_1, X_3) = \operatorname{Cov}(X_2, X_3) = 0$. Also $\operatorname{Cov}(X_i, X_i) = \operatorname{Var}(X_i)$, $i = 1, 2, 3$. Using Theorem 5.2 (ii) we have
\[ \operatorname{Cov}(U, V) = 2\operatorname{Var}(X_1) + 2\operatorname{Var}(X_2) + 3\operatorname{Var}(X_3) - 5\operatorname{Cov}(X_1, X_2) + 5\operatorname{Cov}(X_1, X_3) - 7\operatorname{Cov}(X_2, X_3) \]
\[ = 2\operatorname{Var}(X_1) + 2\operatorname{Var}(X_2) + 3\operatorname{Var}(X_3). \]
The marginal p.m.f.s are $f_{X_1}(x_1) = \frac{x_1}{3}$, $x_1\in\{1, 2\}$; $f_{X_2}(x_2) = \frac{x_2}{6}$, $x_2\in\{1, 2, 3\}$; and $f_{X_3}(x_3) = \frac{x_3}{4}$, $x_3\in\{1, 3\}$. Therefore
\[ E(X_1) = \sum_{x_1\in\{1,2\}} x_1\,\frac{x_1}{3} = \frac{1^2 + 2^2}{3} = \frac{5}{3},\qquad E(X_1^2) = \frac{1^3 + 2^3}{3} = 3, \]
\[ E(X_2) = \sum_{x_2\in\{1,2,3\}} x_2\,\frac{x_2}{6} = \frac{1^2 + 2^2 + 3^2}{6} = \frac{7}{3},\qquad E(X_2^2) = \frac{1^3 + 2^3 + 3^3}{6} = 6, \]
\[ E(X_3) = \sum_{x_3\in\{1,3\}} x_3\,\frac{x_3}{4} = \frac{1^2 + 3^2}{4} = \frac{5}{2},\qquad E(X_3^2) = \frac{1^3 + 3^3}{4} = 7, \]
\[ \operatorname{Var}(X_1) = E(X_1^2) - \big(E(X_1)\big)^2 = \frac{2}{9},\qquad \operatorname{Var}(X_2) = E(X_2^2) - \big(E(X_2)\big)^2 = \frac{5}{9}, \]
and
\[ \operatorname{Var}(X_3) = E(X_3^2) - \big(E(X_3)\big)^2 = \frac{3}{4}. \]
Therefore,
\[ \operatorname{Cov}(U, V) = \frac{4}{9} + \frac{10}{9} + \frac{9}{4} = \frac{137}{36}. \]
Also, by Corollary 5.1,
\[ \operatorname{Var}(U) = 4\operatorname{Var}(X_1) + \operatorname{Var}(X_2) + 9\operatorname{Var}(X_3) = \frac{8}{9} + \frac{5}{9} + \frac{27}{4} = \frac{295}{36} \]
and
\[ \operatorname{Var}(V) = \operatorname{Var}(X_1) + 4\operatorname{Var}(X_2) + \operatorname{Var}(X_3) = \frac{2}{9} + \frac{20}{9} + \frac{3}{4} = \frac{115}{36}. \]
Therefore
\[ \rho(U, V) = \frac{\operatorname{Cov}(U, V)}{\sqrt{\operatorname{Var}(U)\operatorname{Var}(V)}} = \frac{137}{\sqrt{295}\sqrt{115}} = 0.7438\cdots \]

(ii) Since $X_1, X_2$ and $X_3$ are independent it follows that $(X_1, X_3)$ and $X_2$ are independent. This in turn implies that $Z = X_1X_3$ and $X_2$ are independent. Therefore $E(Z\,|\,X_2 = x_2) = E(Z)$ and $\operatorname{Var}(Z\,|\,X_2 = x_2) = \operatorname{Var}(Z)$. Now
\[ E(Z) = E(X_1X_3) = E(X_1)E(X_3) = \frac{25}{6} \quad\text{(using Theorem 5.3)}, \]
and, by the variance decomposition of Theorem 5.5 (ii),
\[ \operatorname{Var}(Z) = \operatorname{Var}(X_1X_3) = \operatorname{Var}\big(E(X_1X_3\,|\,X_3)\big) + E\big(\operatorname{Var}(X_1X_3\,|\,X_3)\big) = \operatorname{Var}\Big(\frac{5}{3}X_3\Big) + E\Big(\frac{2}{9}X_3^2\Big) = \frac{25}{9}\operatorname{Var}(X_3) + \frac{2}{9}E(X_3^2) = \frac{75}{36} + \frac{14}{9} = \frac{131}{36}. \]
▄
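The arithmetic in Example 5.1 can be verified by exact enumeration of the p.m.f. (a sketch using exact rational arithmetic):

```python
from fractions import Fraction
from itertools import product
import math

# p.m.f. f(x1,x2,x3) = x1*x2*x3/72 on {1,2} x {1,2,3} x {1,3}
pmf = {(x1, x2, x3): Fraction(x1 * x2 * x3, 72)
       for x1, x2, x3 in product([1, 2], [1, 2, 3], [1, 3])}
assert sum(pmf.values()) == 1

def E(g):
    return sum(p * g(x) for x, p in pmf.items())

U = lambda x: 2 * x[0] - x[1] + 3 * x[2]
V = lambda x: x[0] - 2 * x[1] + x[2]
Z = lambda x: x[0] * x[2]

cov_uv = E(lambda x: U(x) * V(x)) - E(U) * E(V)
var_u = E(lambda x: U(x) ** 2) - E(U) ** 2
var_v = E(lambda x: V(x) ** 2) - E(V) ** 2
rho = float(cov_uv) / math.sqrt(float(var_u * var_v))

ez = E(Z)
vz = E(lambda x: Z(x) ** 2) - E(Z) ** 2
```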
Example 5.2
Let $X = (X_1, X_2, X_3)$ be an absolutely continuous type random vector with p.d.f.
\[ f_X(x_1, x_2, x_3) = \begin{cases} \dfrac{1}{x_1 x_2}, & \text{if } 0 < x_3 < x_2 < x_1 < 1 \\ 0, & \text{otherwise} \end{cases}. \]

(i) Let $U = 2X_1 - X_2 + 3X_3$ and $V = X_1 - 2X_2 + X_3$. Find the correlation coefficient between $U$ and $V$;

(ii) For a fixed $x_1\in(0, 1)$, find $E(Z\,|\,X_1 = x_1)$ and $\operatorname{Var}(Z\,|\,X_1 = x_1)$, where $Z = X_1X_2X_3$.

Solution. (i) Using Theorem 5.2 (ii),
\[ \operatorname{Cov}(U, V) = 2\operatorname{Var}(X_1) + 2\operatorname{Var}(X_2) + 3\operatorname{Var}(X_3) - 5\operatorname{Cov}(X_1, X_2) + 5\operatorname{Cov}(X_1, X_3) - 7\operatorname{Cov}(X_2, X_3). \]
We have
\[ E(X_1) = \int_0^1\!\!\int_0^{x_1}\!\!\int_0^{x_2} \frac{1}{x_2}\,dx_3\,dx_2\,dx_1 = \frac{1}{2},\qquad E(X_1^2) = \int_0^1\!\!\int_0^{x_1}\!\!\int_0^{x_2} \frac{x_1}{x_2}\,dx_3\,dx_2\,dx_1 = \frac{1}{3}, \]
\[ E(X_2) = \int_0^1\!\!\int_0^{x_1}\!\!\int_0^{x_2} \frac{1}{x_1}\,dx_3\,dx_2\,dx_1 = \frac{1}{4},\qquad E(X_2^2) = \int_0^1\!\!\int_0^{x_1}\!\!\int_0^{x_2} \frac{x_2}{x_1}\,dx_3\,dx_2\,dx_1 = \frac{1}{9}, \]
\[ E(X_3) = \int_0^1\!\!\int_0^{x_1}\!\!\int_0^{x_2} \frac{x_3}{x_1x_2}\,dx_3\,dx_2\,dx_1 = \frac{1}{8},\qquad E(X_3^2) = \int_0^1\!\!\int_0^{x_1}\!\!\int_0^{x_2} \frac{x_3^2}{x_1x_2}\,dx_3\,dx_2\,dx_1 = \frac{1}{27}, \]
\[ E(X_1X_2) = \int_0^1\!\!\int_0^{x_1}\!\!\int_0^{x_2} dx_3\,dx_2\,dx_1 = \frac{1}{6},\qquad E(X_1X_3) = \int_0^1\!\!\int_0^{x_1}\!\!\int_0^{x_2} \frac{x_3}{x_2}\,dx_3\,dx_2\,dx_1 = \frac{1}{12}, \]
\[ E(X_2X_3) = \int_0^1\!\!\int_0^{x_1}\!\!\int_0^{x_2} \frac{x_3}{x_1}\,dx_3\,dx_2\,dx_1 = \frac{1}{18}. \]
Therefore
\[ \operatorname{Var}(X_1) = \frac{1}{12},\qquad \operatorname{Var}(X_2) = \frac{7}{144},\qquad \operatorname{Var}(X_3) = \frac{37}{1728}, \]
\[ \operatorname{Cov}(X_1, X_2) = \frac{1}{24},\qquad \operatorname{Cov}(X_1, X_3) = \frac{1}{48},\qquad \operatorname{Cov}(X_2, X_3) = \frac{7}{288}. \]
Therefore,
\[ \operatorname{Cov}(U, V) = \frac{1}{6} + \frac{7}{72} + \frac{37}{576} - \frac{5}{24} + \frac{5}{48} - \frac{49}{288} = \frac{31}{576}. \]
Also,
\[ \operatorname{Var}(U) = 4\operatorname{Var}(X_1) + \operatorname{Var}(X_2) + 9\operatorname{Var}(X_3) - 4\operatorname{Cov}(X_1, X_2) + 12\operatorname{Cov}(X_1, X_3) - 6\operatorname{Cov}(X_2, X_3) \]
\[ = \frac{1}{3} + \frac{7}{144} + \frac{37}{192} - \frac{1}{6} + \frac{1}{4} - \frac{7}{48} = \frac{295}{576} \]
and
\[ \operatorname{Var}(V) = \operatorname{Var}(X_1) + 4\operatorname{Var}(X_2) + \operatorname{Var}(X_3) - 4\operatorname{Cov}(X_1, X_2) + 2\operatorname{Cov}(X_1, X_3) - 4\operatorname{Cov}(X_2, X_3) \]
\[ = \frac{1}{12} + \frac{7}{36} + \frac{37}{1728} - \frac{1}{6} + \frac{1}{24} - \frac{7}{72} = \frac{133}{1728}. \]
Therefore
\[ \rho(U, V) = \frac{\operatorname{Cov}(U, V)}{\sqrt{\operatorname{Var}(U)\operatorname{Var}(V)}} = \frac{31/576}{\sqrt{(295/576)(133/1728)}} = 0.2710\cdots \]

(ii) Since
\[ f_{X_1}(x_1) = \int_0^{x_1}\!\!\int_0^{x_2} \frac{1}{x_1x_2}\,dx_3\,dx_2 = 1,\qquad 0 < x_1 < 1, \]
the conditional p.d.f. of $(X_2, X_3)$ given $X_1 = x_1$ is of the form $c(x_1)/x_2$ on $\{0 < x_3 < x_2 < x_1\}$, where
\[ c(x_1)\int_0^{x_1}\!\!\int_0^{x_2} \frac{1}{x_2}\,dx_3\,dx_2 = 1,\qquad\text{i.e., } c(x_1) = \frac{1}{x_1}. \]
Also
\[ E(Z\,|\,X_1 = x_1) = x_1\,E(X_2X_3\,|\,X_1 = x_1) = x_1\int_0^{x_1}\!\!\int_0^{x_2} \frac{x_2x_3}{x_1x_2}\,dx_3\,dx_2 = \frac{x_1^3}{6}, \]
and
\[ E(Z^2\,|\,X_1 = x_1) = x_1^2\,E(X_2^2X_3^2\,|\,X_1 = x_1) = x_1^2\int_0^{x_1}\!\!\int_0^{x_2} \frac{x_2^2x_3^2}{x_1x_2}\,dx_3\,dx_2 = \frac{x_1^6}{15}. \]
Therefore
\[ \operatorname{Var}(Z\,|\,X_1 = x_1) = \frac{x_1^6}{15} - \frac{x_1^6}{36} = \frac{7}{180}\,x_1^6. \]
▄
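The moments in Example 5.2 are easy to spot-check by simulation, using the factorization $X_1 \sim U(0,1)$, $X_2\,|\,X_1 \sim U(0, X_1)$, $X_3\,|\,X_2 \sim U(0, X_2)$ implied by the joint p.d.f. (a sketch):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400_000

# Sequential sampling consistent with f(x1,x2,x3) = 1/(x1*x2) on 0<x3<x2<x1<1:
x1 = rng.uniform(0.0, 1.0, n)
x2 = rng.uniform(0.0, x1)      # X2 | X1=x1 ~ U(0, x1)
x3 = rng.uniform(0.0, x2)      # X3 | X2=x2 ~ U(0, x2)

m1, m2, m3 = x1.mean(), x2.mean(), x3.mean()   # approx 1/2, 1/4, 1/8
c12 = np.mean(x1 * x2) - m1 * m2               # approx 1/24
```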
For $t = (t_1,\dots,t_n)\in\mathbb{R}^n$, define the function $M_X: A \to \mathbb{R}$ by
\[ M_X(t) = E\big(e^{t'X}\big) = E\big(e^{\sum_{i=1}^n t_iX_i}\big),\qquad t\in A, \tag{6.1} \]
where $A = \{t\in\mathbb{R}^n: E(e^{t'X}) \text{ is finite}\}$.

Definition 6.1
(i) The function $M_X: A \to \mathbb{R}$, defined by (6.1), is called the joint moment generating function (m.g.f.) of the random vector $X$.
(ii) We say that the joint m.g.f. of $X$ exists if it is finite in a rectangle $(-a, a) \subseteq \mathbb{R}^n$, for some $a = (a_1, a_2,\dots,a_n)\in\mathbb{R}^n$ with $a_i > 0$, $i = 1,\dots,n$; here $-a = (-a_1, -a_2,\dots,-a_n)$ and $(-a, a) = \{t\in\mathbb{R}^n: -a_i < t_i < a_i,\ i = 1, 2,\dots,n\}$.
▄
As in the one-dimensional case, many properties of the probability distribution of $X$ can be studied through the joint m.g.f. of $X$. Some of the results which may be useful in this direction are provided below without their proofs. Note that $M_X(0) = 1$. Also, if $X_1,\dots,X_n$ are independent then
\[ M_X(t) = \prod_{i=1}^{n} M_{X_i}(t_i),\qquad t = (t_1,\dots,t_n)\in A. \]
Theorem 6.1
Suppose that $M_X$ exists in a rectangle $(-a, a)\subseteq\mathbb{R}^n$. Then $M_X$ possesses partial derivatives of all orders in $(-a, a)$. Furthermore, for positive integers $k_1,\dots,k_n$,
\[ E\big(X_1^{k_1}X_2^{k_2}\cdots X_n^{k_n}\big) = \left[\frac{\partial^{\,k_1+\cdots+k_n}}{\partial t_1^{k_1}\cdots\partial t_n^{k_n}}\,M_X(t)\right]_{t=0}. \]
▄
Under the assumptions of Theorem 6.1, note that, for $\Psi_X(t) = \ln M_X(t)$, $t\in A$,
\[ E(X_i) = \left[\frac{\partial}{\partial t_i}M_X(t)\right]_{t=0} = \left[\frac{\partial}{\partial t_i}\Psi_X(t)\right]_{t=0},\qquad i = 1,\dots,n, \]
\[ E(X_i^k) = \left[\frac{\partial^k}{\partial t_i^k}M_X(t)\right]_{t=0},\qquad i = 1,\dots,n,\ k = 1, 2, \dots, \]
\[ \operatorname{Var}(X_i) = \left[\frac{\partial^2}{\partial t_i^2}M_X(t)\right]_{t=0} - \left(\left[\frac{\partial}{\partial t_i}M_X(t)\right]_{t=0}\right)^2 = \left[\frac{\partial^2}{\partial t_i^2}\Psi_X(t)\right]_{t=0},\qquad i = 1,\dots,n, \]
\[ \operatorname{Cov}(X_i, X_j) = \left[\frac{\partial^2}{\partial t_i\,\partial t_j}M_X(t)\right]_{t=0} - \left[\frac{\partial}{\partial t_i}M_X(t)\right]_{t=0}\left[\frac{\partial}{\partial t_j}M_X(t)\right]_{t=0} = \left[\frac{\partial^2}{\partial t_i\,\partial t_j}\Psi_X(t)\right]_{t=0},\qquad i\neq j, \]
and
\[ M_X(0,\dots,0,t_i,0,\dots,0,t_j,0,\dots,0) = E\big(e^{t_iX_i + t_jX_j}\big) = M_{X_i,X_j}(t_i, t_j),\qquad i, j\in\{1,\dots,n\}. \]
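As a quick numerical illustration of reading moments off an m.g.f. (a sketch; it uses the Bin(n, p) m.g.f. $M(t) = (1 - p + pe^t)^n$ with made-up $n, p$, and central finite differences in place of exact derivatives):

```python
import math

n, p = 10, 0.3
M = lambda t: (1.0 - p + p * math.exp(t)) ** n   # m.g.f. of Bin(n, p)

h = 1e-5
m1 = (M(h) - M(-h)) / (2 * h)              # approximates M'(0)  = E(X)   = n*p = 3
m2 = (M(h) - 2 * M(0.0) + M(-h)) / h**2    # approximates M''(0) = E(X^2)
var = m2 - m1**2                           # approximates n*p*(1-p) = 2.1
```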
Definition 7.1
Let $X$ and $Y$ be two $n$-dimensional random vectors, defined on the same probability space $(\Omega, \mathcal{F}, P)$. Then $X$ and $Y$ are said to have the same distribution (written as $X \stackrel{d}{=} Y$) if $F_X(x) = F_Y(x)$, $\forall\, x\in\mathbb{R}^n$. ▄
The following results are multivariate analogs of theorems stated in Section 4 of Module 3. The proofs of these theorems, being similar to their univariate counterparts, are omitted.

Theorem 7.1
(i) Let $X$ and $Y$ be $n$-dimensional random vectors of discrete type with joint p.m.f.s $f_X(\cdot)$ and $f_Y(\cdot)$, respectively. Then $X \stackrel{d}{=} Y$ if, and only if, $f_X(x) = f_Y(x)$, $\forall\, x\in\mathbb{R}^n$.

(ii) Let $X$ and $Y$ be $n$-dimensional random vectors whose distribution functions are such that the partial derivatives
\[ \frac{\partial^n F_X(x)}{\partial x_1\cdots\partial x_n} \quad\text{and}\quad \frac{\partial^n F_Y(x)}{\partial x_1\cdots\partial x_n} \]
exist for all $x$ outside sets $E_X$ and $E_Y$, respectively, of Lebesgue measure zero, and
\[ \int_{\mathbb{R}^n} \frac{\partial^n F_X(x)}{\partial x_1\cdots\partial x_n}\,I_{E_X^c}(x)\,dx = \int_{\mathbb{R}^n} \frac{\partial^n F_Y(x)}{\partial x_1\cdots\partial x_n}\,I_{E_Y^c}(x)\,dx = 1. \]
Then both of them are of absolutely continuous type. Moreover, $X \stackrel{d}{=} Y$ if, and only if, there exist versions of p.d.f.s $f_X(\cdot)$ and $f_Y(\cdot)$ of $X$ and $Y$, respectively, such that $f_X(x) = f_Y(x)$, $\forall\, x\in\mathbb{R}^n$. ▄

Theorem 7.2
Let $X$ and $Y$ be $n$-dimensional random vectors of either discrete type or of absolutely continuous type with $X \stackrel{d}{=} Y$. Then

(i) for any Borel function $h: \mathbb{R}^n \to \mathbb{R}$, $E\big(h(X)\big) = E\big(h(Y)\big)$, provided the expectations are finite;

(ii) for any Borel function $h: \mathbb{R}^n \to \mathbb{R}$, $h(X) \stackrel{d}{=} h(Y)$.
▄
Theorem 7.3
Let $X$ and $Y$ be $n$-dimensional random vectors whose joint m.g.f.s $M_X$ and $M_Y$ exist and satisfy $M_X(t) = M_Y(t)$ for all $t$ in a rectangle $(-a, a)\subseteq\mathbb{R}^n$. Then $X \stackrel{d}{=} Y$.
▄
Remark 7.1
If $X_1,\dots,X_n$ are independent random variables whose m.g.f.s exist, then, for $S = \sum_{i=1}^{n} X_i$ and $\bar{X} = \frac{S}{n}$,
\[ M_S(t) = \prod_{i=1}^{n} M_{X_i}(t) \]
and
\[ M_{\bar{X}}(t) = \prod_{i=1}^{n} M_{X_i}\!\Big(\frac{t}{n}\Big), \]
for $t$ in a suitable neighbourhood of $0$. ▄
Example 7.1
Let $X_1, X_2,\dots,X_n$ be independent random variables such that $X_i \sim N(\mu_i, \sigma_i^2)$, $-\infty < \mu_i < \infty$, $\sigma_i > 0$, $i = 1,\dots,n$. If $a_1,\dots,a_n$ are real constants, such that not all of them are zero, then show that
\[ \sum_{i=1}^{n} a_iX_i \sim N\left(\sum_{i=1}^{n} a_i\mu_i,\ \sum_{i=1}^{n} a_i^2\sigma_i^2\right). \]
Solution. For $t\in\mathbb{R}$,
\[ M_{\sum_{i=1}^n a_iX_i}(t) = E\left(\prod_{i=1}^{n} e^{ta_iX_i}\right) = \prod_{i=1}^{n} M_{X_i}(a_it) \quad\text{(using Theorem 5.3)} \]
\[ = \prod_{i=1}^{n} e^{\mu_ia_it + \frac{\sigma_i^2a_i^2t^2}{2}} = e^{\,t\sum_{i=1}^n a_i\mu_i + \frac{t^2}{2}\sum_{i=1}^n a_i^2\sigma_i^2},\qquad t\in\mathbb{R}, \]
which is the m.g.f. of the $N\big(\sum_{i=1}^n a_i\mu_i, \sum_{i=1}^n a_i^2\sigma_i^2\big)$ distribution. Using Theorem 7.3 it follows that
\[ \sum_{i=1}^{n} a_iX_i \sim N\left(\sum_{i=1}^{n} a_i\mu_i,\ \sum_{i=1}^{n} a_i^2\sigma_i^2\right). \]
▄
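A simulation check of Example 7.1 (a sketch; the particular $\mu_i$, $\sigma_i$ and $a_i$ below are made-up illustration values):

```python
import numpy as np

rng = np.random.default_rng(2)
mu = np.array([1.0, -2.0, 0.5])
sigma = np.array([1.0, 2.0, 0.5])
a = np.array([2.0, -1.0, 3.0])

X = rng.normal(mu, sigma, size=(200_000, 3))   # independent N(mu_i, sigma_i^2) columns
Y = X @ a                                       # sum_i a_i X_i

mean_y, var_y = Y.mean(), Y.var()
mean_th = float(a @ mu)                         # sum a_i mu_i      = 5.5
var_th = float(a**2 @ sigma**2)                 # sum a_i^2 sigma_i^2 = 10.25
```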
Example 7.2
Let $X_1, X_2,\dots,X_n$ be independent random variables such that $X_i \sim \text{Bin}(n_i, p)$, $0 < p < 1$, $n_i\in\{1, 2,\dots\}$, $i = 1,\dots,n$. Show that
\[ \sum_{i=1}^{n} X_i \sim \text{Bin}\left(\sum_{i=1}^{n} n_i,\ p\right). \]
Solution. For $t\in\mathbb{R}$,
\[ M_{\sum_{i=1}^n X_i}(t) = E\left(\prod_{i=1}^{n} e^{tX_i}\right) = \prod_{i=1}^{n} E\big(e^{tX_i}\big) = \prod_{i=1}^{n} M_{X_i}(t) = \prod_{i=1}^{n}\big(1 - p + pe^t\big)^{n_i} = \big(1 - p + pe^t\big)^{\sum_{i=1}^n n_i}, \]
which is the m.g.f. of the $\text{Bin}\big(\sum_{i=1}^n n_i, p\big)$ distribution. Using Theorem 7.3 it follows that
\[ Y = \sum_{i=1}^{n} X_i \sim \text{Bin}\left(\sum_{i=1}^{n} n_i,\ p\right). \]
▄
Example 7.3
Let $X_1, X_2,\dots,X_n$ be independent random variables such that $X_i \sim \text{NB}(r_i, p)$, $0 < p < 1$, $r_i\in\{1, 2,\dots\}$, $i = 1,\dots,n$. Then show that
\[ Y = \sum_{i=1}^{n} X_i \sim \text{NB}\left(\sum_{i=1}^{n} r_i,\ p\right). \]
Solution. Similar to the solution of Example 7.2, on noting that if $X \sim \text{NB}(r, p)$ then
\[ M_X(t) = \left(\frac{p}{1 - (1 - p)e^t}\right)^{r},\qquad t < -\ln(1 - p). \]
▄
Example 7.4
Let $X_1, X_2,\dots,X_n$ be independent random variables such that $X_i \sim \text{P}(\lambda_i)$, $\lambda_i > 0$, $i = 1,\dots,n$. Then show that
\[ \sum_{i=1}^{n} X_i \sim \text{P}\left(\sum_{i=1}^{n} \lambda_i\right). \]
Solution. Similar to the solution of Example 7.2, on noting that if $X \sim \text{P}(\lambda)$, $\lambda > 0$, then
\[ M_X(t) = e^{\lambda(e^t - 1)},\qquad t\in\mathbb{R}. \]
▄
Example 7.5
Let $X_1, X_2,\dots,X_n$ be independent random variables such that $X_i \sim \text{G}(\alpha_i, p)$, $p > 0$, $\alpha_i > 0$, $i = 1,\dots,n$. Show that
\[ \sum_{i=1}^{n} X_i \sim \text{G}\left(\sum_{i=1}^{n} \alpha_i,\ p\right). \]
Solution. Similar to the solution of Example 7.2, on noting that if $X \sim \text{G}(\alpha, p)$, $\alpha > 0$, $p > 0$, then
\[ M_X(t) = (1 - pt)^{-\alpha},\qquad t < \frac{1}{p}. \]
▄
Example 7.6
Let $X_1, X_2,\dots,X_n$ be independent random variables such that $X_i \sim \chi^2_{m_i}$, $m_i\in\{1, 2,\dots\}$, $i = 1,\dots,n$. Then show that

(i)
\[ \sum_{i=1}^{n} X_i \sim \chi^2_{\sum_{i=1}^n m_i}; \]

(ii) if $Y_1,\dots,Y_n$ are independent and identically distributed $N(\mu, \sigma^2)$ random variables, then
\[ \sum_{i=1}^{n}\left(\frac{Y_i - \mu}{\sigma}\right)^2 \sim \chi^2_n. \]

Solution.
(i) Note that $\chi^2_{m_i} = \text{G}\big(\frac{m_i}{2}, 2\big)$, $i = 1,\dots,n$. Now the assertion follows from Example 7.5.
(ii) Follows on using Theorem 4.1 (ii) of Module 5 and (i) above.
▄
Theorem 7.4
▄
8. Multinomial Distribution
First let us introduce the notion of multinomial coefficients, which is a generalization of the notion of binomial coefficients. Suppose that $n$ items of $k$ distinguishable types are given, with $n_i$ items of type $i$, $i = 1,\dots,k$ (items of the same type being visually indistinguishable), where $n_i \ge 0$, $i = 1,\dots,k-1$, and $n_k = n - \sum_{i=1}^{k-1} n_i \ge 0$. The number of visually distinguishable ways in which these $n$ items can be arranged in a row is
\[ \binom{n}{n_1}\binom{n - n_1}{n_2}\binom{n - n_1 - n_2}{n_3}\cdots\binom{n - \sum_{i=1}^{k-2} n_i}{n_{k-1}} = \frac{n!}{n_1!\, n_2!\cdots n_{k-1}!\,\big(n - \sum_{i=1}^{k-1} n_i\big)!}. \]
The coefficients
\[ \binom{n}{n_1\ n_2\ \cdots\ n_{k-1}} = \frac{n!}{n_1!\, n_2!\cdots n_{k-1}!\,\big(n - \sum_{i=1}^{k-1} n_i\big)!},\qquad n_i \ge 0,\ i = 1,\dots,k-1,\ \sum_{i=1}^{k-1} n_i \le n, \tag{8.1} \]
are called multinomial coefficients. Note that, for $k = 2$ (so that $0 \le n_1 \le n$), the multinomial coefficients (8.1) reduce to the binomial coefficients
\[ \binom{n}{n_1} = \frac{n!}{n_1!\,(n - n_1)!},\qquad n_1\in\{0, 1,\dots,n\}. \]
On expanding the product $(x_1 + x_2 + \cdots + x_k)^n$, a typical term is of the form $x_1^{n_1}x_2^{n_2}\cdots x_k^{n_k}$ with $n_i \ge 0$ and $\sum_{i=1}^{k} n_i = n$. Each such term arises once for every visually distinguishable arrangement of $n_1$ copies of $x_1$, $n_2$ copies of $x_2$, $\dots$, $n_{k-1}$ copies of $x_{k-1}$ and $\big(n - \sum_{i=1}^{k-1} n_i\big)$ copies of $x_k$, and the total number of such arrangements is $\binom{n}{n_1\ n_2\ \cdots\ n_{k-1}}$. Thus, we have
\[ (x_1 + x_2 + \cdots + x_k)^n = \sum_{n_1=0}^{n}\cdots\sum_{n_{k-1}=0}^{\,n - n_1 - \cdots - n_{k-2}} \binom{n}{n_1\ n_2\ \cdots\ n_{k-1}}\, x_1^{n_1}x_2^{n_2}\cdots x_{k-1}^{n_{k-1}}\, x_k^{\,n - \sum_{i=1}^{k-1} n_i}. \]
Now consider a random experiment with $k + 1$ mutually exclusive and exhaustive outcomes $A_1,\dots,A_{k+1}$. Let $p_i = P(A_i)\in(0, 1)$, $i = 1,\dots,k$, and $\sum_{i=1}^{k} p_i < 1$, so that $P(A_{k+1}) = 1 - \sum_{i=1}^{k} p_i \in (0, 1)$. Suppose that the random experiment is repeated independently $n$ times. Define
\[ X_i = \text{number of trials in which the outcome } A_i \text{ is observed},\qquad i = 1,\dots,k. \]
Then one may be interested in the joint probability distribution of $X = (X_1, X_2,\dots,X_k)$. Note that the support of $X$ is $S_X = \{x = (x_1,\dots,x_k): x_i\in\{0, 1,\dots,n\},\ i = 1,\dots,k,\ \sum_{i=1}^k x_i \le n\}$ and that
\[ f_X(x_1,\dots,x_k) = P(\{X_1 = x_1,\dots,X_k = x_k\}) = \begin{cases} \dfrac{n!}{x_1!\cdots x_k!\,\big(n - \sum_{i=1}^k x_i\big)!}\, p_1^{x_1}\cdots p_k^{x_k}\left(1 - \displaystyle\sum_{i=1}^{k} p_i\right)^{n - \sum_{i=1}^k x_i}, & \text{if } x\in S_X \\ 0, & \text{otherwise} \end{cases}. \tag{8.2} \]

Definition 8.1
The probability distribution given by (8.2) is called a multinomial distribution with $n$ trials and cell probabilities $p_1,\dots,p_k$ (denoted by $\text{Mult}(n, p_1,\dots,p_k)$). ▄
Note that, for $k = 1$, the $\text{Mult}(n, p_1)$ distribution is nothing but the $\text{Bin}(n, p_1)$ distribution.

Theorem 8.1
Let $X = (X_1,\dots,X_k) \sim \text{Mult}(n, p_1,\dots,p_k)$. Then

(i) $X_i \sim \text{Bin}(n, p_i)$, $i = 1,\dots,k$;
(ii) $X_i + X_j \sim \text{Bin}(n, p_i + p_j)$, $i, j = 1,\dots,k$, $i \neq j$;
(iii) $E(X_i) = np_i$ and $\operatorname{Var}(X_i) = np_i(1 - p_i)$, $i = 1,\dots,k$;
(iv) $\operatorname{Cov}(X_i, X_j) = -np_ip_j$, $i, j = 1,\dots,k$, $i \neq j$.

Proof.
(i) Fix $i\in\{1,\dots,k\}$. In a given trial of the random experiment treat the occurrence of outcome $A_i$ as success and that of any other $A_j$, $j \neq i$ (i.e., non-occurrence of $A_i$), as failure. Then we have a sequence of $n$ independent Bernoulli trials with probability of success in each trial as $P(A_i) = p_i$. Therefore
\[ X_i = \#\text{ of successes in } n \text{ independent Bernoulli trials} \sim \text{Bin}(n, p_i). \]

(ii) Fix $i, j\in\{1,\dots,k\}$, $i \neq j$. In a given trial of the random experiment treat the occurrence of $A_i$ or $A_j$ (i.e., occurrence of $A_i\cup A_j$) as success and its non-occurrence as failure. Then we have a sequence of $n$ independent Bernoulli trials with probability of success in each trial as $P(A_i\cup A_j) = P(A_i) + P(A_j) = p_i + p_j$ and, therefore,
\[ X_i + X_j = \#\text{ of successes in } n \text{ independent Bernoulli trials} \sim \text{Bin}(n, p_i + p_j). \]
Moreover, (i) yields (iii), and
\[ X_i + X_j \sim \text{Bin}(n, p_i + p_j) \]
\[ \implies \operatorname{Var}(X_i) + \operatorname{Var}(X_j) + 2\operatorname{Cov}(X_i, X_j) = n(p_i + p_j)(1 - p_i - p_j) \]
\[ \implies np_i(1 - p_i) + np_j(1 - p_j) + 2\operatorname{Cov}(X_i, X_j) = n(p_i + p_j)(1 - p_i - p_j) \]
\[ \implies \operatorname{Cov}(X_i, X_j) = -np_ip_j,\qquad i \neq j, \]
which proves (iv).
Alternatively, the moments can be obtained from the joint m.g.f. For $t = (t_1,\dots,t_k)\in\mathbb{R}^k$, using the multinomial theorem,
\[ M_X(t) = \sum_{x\in S_X} \frac{n!}{x_1!\cdots x_k!\,\big(n - \sum_{i=1}^k x_i\big)!}\,\big(p_1e^{t_1}\big)^{x_1}\cdots\big(p_ke^{t_k}\big)^{x_k}\left(1 - \sum_{i=1}^{k} p_i\right)^{n - \sum_{i=1}^k x_i} \]
\[ = \left(p_1e^{t_1} + \cdots + p_ke^{t_k} + 1 - \sum_{i=1}^{k} p_i\right)^n,\qquad t\in\mathbb{R}^k. \]
Therefore,
\[ E(X_i) = \left[\frac{\partial}{\partial t_i}M_X(t)\right]_{t=0} = \left[np_ie^{t_i}\left(p_1e^{t_1} + \cdots + p_ke^{t_k} + 1 - \sum_{l=1}^{k} p_l\right)^{n-1}\right]_{t=0} = np_i,\qquad i = 1,\dots,k, \]
\[ E(X_iX_j) = \left[\frac{\partial^2}{\partial t_i\,\partial t_j}M_X(t)\right]_{t=0} = n(n-1)p_ip_j,\qquad i \neq j, \]
so that
\[ \operatorname{Cov}(X_i, X_j) = E(X_iX_j) - E(X_i)E(X_j) = -np_ip_j,\qquad i \neq j. \]
Also,
\[ E(X_i^2) = \left[\frac{\partial^2}{\partial t_i^2}M_X(t)\right]_{t=0} = \left[n(n-1)p_i^2e^{2t_i}\left(p_1e^{t_1} + \cdots + p_ke^{t_k} + 1 - \sum_{l=1}^{k} p_l\right)^{n-2} + np_ie^{t_i}\left(p_1e^{t_1} + \cdots + p_ke^{t_k} + 1 - \sum_{l=1}^{k} p_l\right)^{n-1}\right]_{t=0} \]
\[ = n(n-1)p_i^2 + np_i,\qquad i = 1,\dots,k, \]
so that $\operatorname{Var}(X_i) = np_i(1 - p_i)$, $i = 1,\dots,k$.
▄
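Theorem 8.1 (iii)-(iv) can be spot-checked by simulation (a sketch; $n$ and the cell probabilities below are arbitrary illustration values):

```python
import numpy as np

rng = np.random.default_rng(3)
n_trials = 20
p = [0.2, 0.3, 0.1, 0.4]                        # cell probabilities summing to 1
X = rng.multinomial(n_trials, p, size=200_000)  # one multinomial vector per row

mean0 = X[:, 0].mean()                          # approx n*p_0           = 4.0
var0 = X[:, 0].var()                            # approx n*p_0*(1 - p_0) = 3.2
cov01 = np.cov(X[:, 0], X[:, 1])[0, 1]          # approx -n*p_0*p_1      = -1.2
```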
Definition 9.1
A bivariate random vector $(X_1, X_2)$ is said to have a bivariate normal distribution $N_2(\mu_1, \mu_2, \sigma_1^2, \sigma_2^2, \rho)$ if, for some $-\infty < \mu_i < \infty$, $i = 1, 2$, $\sigma_i > 0$, $i = 1, 2$, and $-1 < \rho < 1$, the joint p.d.f. of $(X_1, X_2)$ is given by
\[ f_{X_1,X_2}(x_1, x_2) = \frac{1}{2\pi\sigma_1\sigma_2\sqrt{1-\rho^2}}\; e^{-\frac{1}{2(1-\rho^2)}\left[\left(\frac{x_1-\mu_1}{\sigma_1}\right)^2 - 2\rho\left(\frac{x_1-\mu_1}{\sigma_1}\right)\left(\frac{x_2-\mu_2}{\sigma_2}\right) + \left(\frac{x_2-\mu_2}{\sigma_2}\right)^2\right]},\qquad (x_1, x_2)\in\mathbb{R}^2. \]
▄
To verify that this is a genuine p.d.f., substitute $u = \frac{x_1-\mu_1}{\sigma_1}$, $v = \frac{x_2-\mu_2}{\sigma_2}$ and complete the square,
\[ u^2 - 2\rho uv + v^2 = (v - \rho u)^2 + (1 - \rho^2)u^2, \]
to get
\[ \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{X_1,X_2}(x_1, x_2)\,dx_1\,dx_2 = \frac{1}{2\pi\sqrt{1-\rho^2}}\int_{-\infty}^{\infty} e^{-\frac{u^2}{2}}\left[\int_{-\infty}^{\infty} e^{-\frac{(v-\rho u)^2}{2(1-\rho^2)}}\,dv\right]du \]
\[ = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\,e^{-\frac{u^2}{2}}\,du \qquad\text{(the inner integral equals } \sqrt{2\pi(1-\rho^2)}\text{)} \]
\[ = 1. \]
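A sketch showing one standard way to sample from this density and check $\rho$ empirically (the parameter values are made up): writing $X_1 = \mu_1 + \sigma_1 Z_1$ and $X_2 = \mu_2 + \sigma_2(\rho Z_1 + \sqrt{1-\rho^2}\,Z_2)$ with $Z_1, Z_2$ independent $N(0,1)$ yields exactly the $N_2(\mu_1,\mu_2,\sigma_1^2,\sigma_2^2,\rho)$ density above.

```python
import numpy as np

rng = np.random.default_rng(4)
mu1, mu2, s1, s2, rho = 1.0, -1.0, 2.0, 0.5, 0.6
n = 300_000

z1, z2 = rng.standard_normal(n), rng.standard_normal(n)
x1 = mu1 + s1 * z1
x2 = mu2 + s2 * (rho * z1 + np.sqrt(1 - rho**2) * z2)

corr = np.corrcoef(x1, x2)[0, 1]                       # approx rho = 0.6
slope = np.cov(x1, x2)[0, 1] / np.var(x1, ddof=1)      # approx rho*s2/s1 = 0.15
```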
Theorem 9.1
Suppose that $(X_1, X_2) \sim N_2(\mu_1, \mu_2, \sigma_1^2, \sigma_2^2, \rho)$, $-\infty < \mu_i < \infty$, $i = 1, 2$, $\sigma_i > 0$, $i = 1, 2$, and $-1 < \rho < 1$. Then,

(i) $X_1 \sim N(\mu_1, \sigma_1^2)$ and $X_2 \sim N(\mu_2, \sigma_2^2)$;

(ii) $X_2\,|\,X_1 = x_1 \sim N\big(\mu_2 + \rho\frac{\sigma_2}{\sigma_1}(x_1 - \mu_1),\ \sigma_2^2(1-\rho^2)\big)$, $x_1\in\mathbb{R}$;

(iii) $X_1\,|\,X_2 = x_2 \sim N\big(\mu_1 + \rho\frac{\sigma_1}{\sigma_2}(x_2 - \mu_2),\ \sigma_1^2(1-\rho^2)\big)$, $x_2\in\mathbb{R}$;

(iv) the joint m.g.f. of $(X_1, X_2)$ is
\[ M_{X_1,X_2}(t_1, t_2) = e^{\mu_1t_1 + \mu_2t_2 + \frac{1}{2}\left(\sigma_1^2t_1^2 + 2\rho\sigma_1\sigma_2t_1t_2 + \sigma_2^2t_2^2\right)},\qquad (t_1, t_2)\in\mathbb{R}^2; \]

(v) for real constants $c_1$ and $c_2$ such that $c_1^2 + c_2^2 > 0$,
\[ c_1X_1 + c_2X_2 \sim N\big(c_1\mu_1 + c_2\mu_2,\ c_1^2\sigma_1^2 + c_2^2\sigma_2^2 + 2\rho c_1c_2\sigma_1\sigma_2\big); \]

(vi) $\rho(X_1, X_2) = \rho$;

(vii) $X_1$ and $X_2$ are independent if, and only if, $\rho = 0$.

Proof.
(i) For $x_1\in\mathbb{R}$, on writing
\[ \left(\frac{x_2-\mu_2}{\sigma_2}\right)^2 - 2\rho\left(\frac{x_1-\mu_1}{\sigma_1}\right)\left(\frac{x_2-\mu_2}{\sigma_2}\right) + \left(\frac{x_1-\mu_1}{\sigma_1}\right)^2 = \left(\frac{x_2-\mu_2}{\sigma_2} - \rho\,\frac{x_1-\mu_1}{\sigma_1}\right)^2 + (1-\rho^2)\left(\frac{x_1-\mu_1}{\sigma_1}\right)^2, \]
we get
\[ f_{X_1}(x_1) = \int_{-\infty}^{\infty} f_{X_1,X_2}(x_1, x_2)\,dx_2 = \frac{e^{-\frac{1}{2}\left(\frac{x_1-\mu_1}{\sigma_1}\right)^2}}{2\pi\sigma_1\sigma_2\sqrt{1-\rho^2}}\int_{-\infty}^{\infty} e^{-\frac{1}{2(1-\rho^2)}\left(\frac{x_2-\mu_2}{\sigma_2} - \rho\frac{x_1-\mu_1}{\sigma_1}\right)^2}\,dx_2 = \frac{1}{\sigma_1\sqrt{2\pi}}\, e^{-\frac{(x_1-\mu_1)^2}{2\sigma_1^2}},\qquad x_1\in\mathbb{R}, \]
since the remaining integral equals $\sigma_2\sqrt{2\pi(1-\rho^2)}$. This is the p.d.f. of the $N(\mu_1, \sigma_1^2)$ distribution; thus $X_1 \sim N(\mu_1, \sigma_1^2)$. By symmetry, $X_2 \sim N(\mu_2, \sigma_2^2)$.

(ii) For fixed $x_1\in\mathbb{R}$, dividing the joint p.d.f. by $f_{X_1}(x_1)$ gives
\[ f_{X_2|X_1}(x_2\,|\,x_1) = \frac{1}{\sigma_2\sqrt{2\pi(1-\rho^2)}}\; e^{-\frac{1}{2\sigma_2^2(1-\rho^2)}\left(x_2 - \mu_2 - \rho\frac{\sigma_2}{\sigma_1}(x_1-\mu_1)\right)^2},\qquad x_2\in\mathbb{R}. \]
Clearly, for a fixed $x_1\in\mathbb{R}$, $f_{X_2|X_1}(\cdot\,|\,x_1)$ is the p.d.f. of the $N\big(\mu_2 + \rho\frac{\sigma_2}{\sigma_1}(x_1-\mu_1),\ \sigma_2^2(1-\rho^2)\big)$ distribution. Assertion (iii) follows by symmetry.

(iv) For a fixed $x_1\in\mathbb{R}$, since $X_2\,|\,X_1 = x_1 \sim N\big(\mu_2 + \rho\frac{\sigma_2}{\sigma_1}(x_1-\mu_1),\ \sigma_2^2(1-\rho^2)\big)$, on using Theorem 5.5 (i),
\[ M_{X_1,X_2}(t_1, t_2) = E\big(e^{t_1X_1+t_2X_2}\big) = E\Big[e^{t_1X_1}\,E\big(e^{t_2X_2}\,\big|\,X_1\big)\Big] = E\Big[e^{t_1X_1}\; e^{t_2\left(\mu_2 + \rho\frac{\sigma_2}{\sigma_1}(X_1-\mu_1)\right) + \frac{t_2^2\sigma_2^2(1-\rho^2)}{2}}\Big], \]
and evaluating the remaining expectation via the $N(\mu_1, \sigma_1^2)$ m.g.f. gives
\[ M_{X_1,X_2}(t_1, t_2) = e^{\mu_1t_1 + \mu_2t_2 + \frac{1}{2}\left(\sigma_1^2t_1^2 + 2\rho\sigma_1\sigma_2t_1t_2 + \sigma_2^2t_2^2\right)},\qquad (t_1, t_2)\in\mathbb{R}^2. \]

(v) Let $c_1$ and $c_2$ be real constants such that $c_1^2 + c_2^2 > 0$, and let $W = c_1X_1 + c_2X_2$. Then, for $t\in\mathbb{R}$,
\[ M_W(t) = E\big(e^{tW}\big) = E\big(e^{tc_1X_1 + tc_2X_2}\big) = M_{X_1,X_2}(c_1t, c_2t) = e^{(c_1\mu_1 + c_2\mu_2)t + \frac{t^2}{2}\left(c_1^2\sigma_1^2 + 2\rho c_1c_2\sigma_1\sigma_2 + c_2^2\sigma_2^2\right)}, \]
which is the m.g.f. of the $N\big(c_1\mu_1 + c_2\mu_2,\ c_1^2\sigma_1^2 + c_2^2\sigma_2^2 + 2\rho c_1c_2\sigma_1\sigma_2\big)$ distribution. Thus, by Theorem 7.3, the assertion follows.

(vi) By (i), $\operatorname{Var}(X_1) = \sigma_1^2$ and $\operatorname{Var}(X_2) = \sigma_2^2$. Also, for $\Psi_{X_1,X_2}(t_1, t_2) = \ln M_{X_1,X_2}(t_1, t_2)$, $(t_1, t_2)\in\mathbb{R}^2$,
\[ \operatorname{Cov}(X_1, X_2) = \left[\frac{\partial^2}{\partial t_1\,\partial t_2}\Psi_{X_1,X_2}(t_1, t_2)\right]_{t=0} = \rho\sigma_1\sigma_2 \]
\[ \implies \rho(X_1, X_2) = \frac{\operatorname{Cov}(X_1, X_2)}{\sqrt{\operatorname{Var}(X_1)\operatorname{Var}(X_2)}} = \rho. \]

(vii) Since independent random variables are uncorrelated, it follows from (vi) that if $X_1$ and $X_2$ are independent then $\rho = 0$. Conversely, suppose that $\rho = 0$. Then, for $(x_1, x_2)\in\mathbb{R}^2$,
\[ f_{X_1,X_2}(x_1, x_2) = \frac{1}{2\pi\sigma_1\sigma_2}\, e^{-\frac{1}{2}\left[\left(\frac{x_1-\mu_1}{\sigma_1}\right)^2 + \left(\frac{x_2-\mu_2}{\sigma_2}\right)^2\right]} = f_{X_1}(x_1)\, f_{X_2}(x_2), \]
so that $X_1$ and $X_2$ are independent. ▄
Theorem 9.2
Let $(X_1, X_2)$ be a bivariate random vector with $E(X_i) = \mu_i\in(-\infty, \infty)$, $\operatorname{Var}(X_i) = \sigma_i^2\in(0,\infty)$, $i = 1, 2$, and $\rho(X_1, X_2) = \rho\in(-1, 1)$. Then $(X_1, X_2) \sim N_2(\mu_1, \mu_2, \sigma_1^2, \sigma_2^2, \rho)$ if, and only if, for any real constants $c_1$ and $c_2$ such that $c_1^2 + c_2^2 > 0$,
\[ c_1X_1 + c_2X_2 \sim N\big(c_1\mu_1 + c_2\mu_2,\ c_1^2\sigma_1^2 + c_2^2\sigma_2^2 + 2\rho c_1c_2\sigma_1\sigma_2\big). \tag{9.11} \]
Proof. Clearly the necessity part of the assertion follows from Theorem 9.1 (v). Conversely, suppose that (9.11) holds for all real constants $c_1$ and $c_2$ with $c_1^2 + c_2^2 > 0$. Then, for $(t_1, t_2)\in\mathbb{R}^2$, taking $c_i = t_i$, $i = 1, 2$,
\[ M_{X_1,X_2}(t_1, t_2) = E\big(e^{t_1X_1 + t_2X_2}\big) = M_{t_1X_1 + t_2X_2}(1) = e^{t_1\mu_1 + t_2\mu_2 + \frac{1}{2}\left(t_1^2\sigma_1^2 + 2\rho t_1t_2\sigma_1\sigma_2 + t_2^2\sigma_2^2\right)} \quad\text{(using (9.11))}, \]
which is the m.g.f. of the $N_2(\mu_1, \mu_2, \sigma_1^2, \sigma_2^2, \rho)$ distribution. Now using Theorem 7.3 it follows that $(X_1, X_2) \sim N_2(\mu_1, \mu_2, \sigma_1^2, \sigma_2^2, \rho)$. ▄
Let $X = (X_1,\dots,X_n)$ be a random vector of either discrete type or of absolutely continuous type and let $f_X(\cdot)$ denote the p.m.f./p.d.f. of $X$. Let $h: \mathbb{R}^n \to \mathbb{R}$ be a Borel function. As the following example illustrates, in many situations it may be of interest to find the probability distribution of $h(X)$.

Example 10.1
Consider a company that manufactures electric bulbs. The lifetimes of electric bulbs manufactured by the company are random. Past experience with testing on electric bulbs manufactured by the company suggests that the lifetime of a randomly chosen electric bulb can be described by a random variable having the p.d.f.
\[ f(x\,|\,\theta) = \begin{cases} \dfrac{1}{\theta}\, e^{-\frac{x}{\theta}}, & \text{if } x > 0 \\ 0, & \text{otherwise} \end{cases},\qquad \theta > 0. \]
However the value of $\theta\,(> 0)$ is not evident from the past experience and therefore $\theta$ is unknown. One way to obtain information about the unknown $\theta$ is to do testing, independently and under identical conditions, on a number (say $n$) of electric bulbs manufactured by the company. Let $X_i$ denote the lifetime of the $i$-th bulb, $i = 1,\dots,n$. We call $X_1,\dots,X_n$ (which are independent and identically distributed random variables from the distribution $f(\cdot\,|\,\theta)$, $\theta > 0$) a random sample from the distribution $f(\cdot\,|\,\theta)$, $\theta > 0$. Clearly the joint p.d.f. of $X = (X_1,\dots,X_n)$ is given by
\[ f_X(x_1,\dots,x_n\,|\,\theta) = \begin{cases} \dfrac{1}{\theta^n}\, e^{-\frac{1}{\theta}\sum_{i=1}^n x_i}, & \text{if } x_i > 0,\ i = 1,\dots,n \\ 0, & \text{otherwise} \end{cases}. \]
Definition 10.1
(i) A function of one or more random variables that does not depend on any unknown parameter is called a statistic.
(ii) Let $X_1,\dots,X_n$ be a collection of independent random variables each having the same p.m.f./p.d.f. $f$ (or distribution function $F$). We then call $X_1,\dots,X_n$ a random sample (of size $n$) from a distribution having p.m.f./p.d.f. $f$ (or distribution function $F$). In other words, a random sample is a collection of independent and identically distributed random variables.
Remark 10.1
(i) Let $(X_1, X_2) \sim N_2(\mu_1, \mu_2, \sigma_1^2, \sigma_2^2, \rho)$, $-\infty < \mu_i < \infty$, $\sigma_i > 0$, $i = 1, 2$, $-1 < \rho < 1$. Then the random variable $T = X_1 + X_2$ is a statistic, but the random variable $W = \frac{X_1 - \mu_1}{\sigma_1}$ is not a statistic unless $\mu_1$ and $\sigma_1$ are known parameters.
(ii) Although a statistic does not depend upon any unknown parameter, the probability distribution of a statistic may depend upon unknown parameters.
(iii) If $X_1,\dots,X_n$ is a random sample from a distribution having p.m.f./p.d.f. $f(\cdot)$, then the joint p.m.f./p.d.f. of $X = (X_1,\dots,X_n)$ is
\[ f_X(x_1,\dots,x_n) = \prod_{i=1}^{n} f_{X_i}(x_i) = \prod_{i=1}^{n} f(x_i),\qquad x = (x_1,\dots,x_n)\in\mathbb{R}^n. \]
(iv) Let $X_1,\dots,X_n$ be a random sample from a distribution, and let $X_{k:n}$ denote the $k$-th smallest of $X_1,\dots,X_n$. Some of the commonly used statistics are:

(a) Sample mean: $\bar{X} = \dfrac{1}{n}\displaystyle\sum_{i=1}^{n} X_i$;

(b) Sample variance: $S^2 = \dfrac{1}{n-1}\displaystyle\sum_{i=1}^{n}\big(X_i - \bar{X}\big)^2 = \dfrac{1}{n-1}\left[\displaystyle\sum_{i=1}^{n} X_i^2 - n\bar{X}^2\right]$, $n \ge 2$;

(e) Sample median:
\[ \text{Median} = \begin{cases} X_{\frac{n+1}{2}:n}, & \text{if } n \text{ is odd} \\ \dfrac{X_{\frac{n}{2}:n} + X_{\frac{n}{2}+1:n}}{2}, & \text{if } n \text{ is even} \end{cases}. \]
Theorem 10.1
Let $X_1,\dots,X_n$ be a random sample from a distribution having p.m.f./p.d.f. $f(\cdot)$. Then, for any permutation $(i_1,\dots,i_n)$ of $(1,\dots,n)$,
\[ (X_{i_1},\dots,X_{i_n}) \stackrel{d}{=} (X_1,\dots,X_n). \]
Proof. Let $(i_1,\dots,i_n)$ be a permutation of $(1,\dots,n)$ and let $(j_1,\dots,j_n)$ be the inverse permutation of $(i_1,\dots,i_n)$. Then, for $x = (x_1,\dots,x_n)\in\mathbb{R}^n$,
\[ f_{(X_{i_1},\dots,X_{i_n})}(x_1,\dots,x_n) = f_{(X_1,\dots,X_n)}\big(x_{j_1},\dots,x_{j_n}\big) = \prod_{l=1}^{n} f\big(x_{j_l}\big) = \prod_{i=1}^{n} f(x_i) = f_{(X_1,\dots,X_n)}(x_1,\dots,x_n). \]
It follows that $(X_{i_1},\dots,X_{i_n}) \stackrel{d}{=} (X_1,\dots,X_n)$. ▄
Example 10.1
Let $X_1,\dots,X_n$ be a random sample from a given distribution.

(i) If $X = (X_1,\dots,X_n)$ is of absolutely continuous type then show that
\[ P(\{X_1 < X_2 < \cdots < X_n\}) = P\big(\{X_{i_1} < X_{i_2} < \cdots < X_{i_n}\}\big) = \frac{1}{n!}, \]
for any permutation $(i_1,\dots,i_n)$ of $(1,\dots,n)$;

(ii) If $X$ is of absolutely continuous type then show that $P(\{X_i = X_{k:n}\}) = \frac{1}{n}$, $i = 1,\dots,n$, where $X_{k:n}$ denotes the $k$-th smallest of $X_1,\dots,X_n$;

(iii) Show that
\[ E\left(\frac{X_i}{X_1 + X_2 + \cdots + X_n}\right) = \frac{1}{n},\qquad i = 1, 2,\dots,n, \]
provided the involved expectations exist;

(iv) Show that
\[ E\left(X_i\,\Big|\,\sum_{j=1}^{n} X_j = s\right) = \frac{s}{n},\qquad i = 1,\dots,n. \]

Solution. Let $\mathcal{P}_n$ denote the set of all permutations of $(1,\dots,n)$. Using Theorem 10.1 we have, for any Borel function $\Psi: \mathbb{R}^n\to\mathbb{R}$,
\[ E\big(\Psi(X_1,\dots,X_n)\big) = E\big(\Psi(X_{i_1},\dots,X_{i_n})\big),\qquad \forall\, (i_1,\dots,i_n)\in\mathcal{P}_n. \tag{10.1} \]

(i) On taking $\Psi$ in (10.1) to be the indicator of the set $\{(x_1,\dots,x_n): x_1 < x_2 < \cdots < x_n\}$, we conclude that
\[ P(\{X_1 < X_2 < \cdots < X_n\}) = P\big(\{X_{i_1} < X_{i_2} < \cdots < X_{i_n}\}\big),\qquad \forall\, (i_1,\dots,i_n)\in\mathcal{P}_n. \tag{10.2} \]
Since $P(\{X_i = X_j\}) = 0$ for $i \neq j$ (as $(X_i, X_j)$ is of absolutely continuous type; see Remark 2.1 (ix)), the $n!$ events $\{X_{i_1} < \cdots < X_{i_n}\}$, $(i_1,\dots,i_n)\in\mathcal{P}_n$, are pairwise disjoint with probabilities summing to one; by (10.2) they are equally probable, and hence each has probability $\frac{1}{n!}$.

(ii) On taking $\Psi$ in (10.1) to be the indicator of the set where the $i$-th coordinate is the $k$-th smallest, and noting that, for any permutation $(i_1,\dots,i_n)\in\mathcal{P}_n$, the $k$-th smallest of $\{X_{i_1},\dots,X_{i_n}\}$ equals the $k$-th smallest of $\{X_1,\dots,X_n\}$, i.e., $X_{k:n}$, we have
\[ P(\{X_i = X_{k:n}\}) = P(\{X_1 = X_{k:n}\}),\qquad i = 1,\dots,n. \]
But
\[ \sum_{i=1}^{n} P(\{X_i = X_{k:n}\}) = 1, \]
and therefore
\[ P(\{X_i = X_{k:n}\}) = P(\{X_1 = X_{k:n}\}) = \frac{1}{n}. \]

(iii) On taking
\[ \Psi(x_1,\dots,x_n) = \frac{x_i}{x_1 + \cdots + x_n},\qquad x\in\mathbb{R}^n, \]
in (10.1), it follows that
\[ E\left(\frac{X_i}{X_1 + \cdots + X_n}\right) = E\left(\frac{X_1}{X_1 + \cdots + X_n}\right),\qquad i = 1,\dots,n. \tag{10.3} \]
But
\[ \sum_{i=1}^{n} E\left(\frac{X_i}{X_1 + \cdots + X_n}\right) = E\left(\frac{X_1 + \cdots + X_n}{X_1 + \cdots + X_n}\right) = 1, \]
and therefore
\[ E\left(\frac{X_i}{X_1 + \cdots + X_n}\right) = E\left(\frac{X_1}{X_1 + \cdots + X_n}\right) = \frac{1}{n},\qquad i = 1,\dots,n. \]

(iv) Similarly, by exchangeability,
\[ E\left(X_i\,\Big|\,\sum_{j=1}^{n} X_j = s\right) = E\left(X_1\,\Big|\,\sum_{j=1}^{n} X_j = s\right),\qquad i = 1,\dots,n. \]
But
\[ \sum_{i=1}^{n} E\left(X_i\,\Big|\,\sum_{j=1}^{n} X_j = s\right) = E\left(\sum_{i=1}^{n} X_i\,\Big|\,\sum_{j=1}^{n} X_j = s\right) = s, \]
and therefore
\[ E\left(X_i\,\Big|\,\sum_{j=1}^{n} X_j = s\right) = \frac{s}{n},\qquad i = 1,\dots,n. \]
▄
In the following subsections we will discuss various techniques to find the distribution of functions of random variables.

Let $X = (X_1,\dots,X_n)$ be a random vector and let $h: \mathbb{R}^n\to\mathbb{R}$ be a Borel function. The distribution of $Y = h(X_1,\dots,X_n)$ can be determined by computing the distribution function
\[ F_Y(y) = P\big(\{h(X_1,\dots,X_n) \le y\}\big),\qquad -\infty < y < \infty. \]
Now let $X_1,\dots,X_n$ be a random sample from an absolutely continuous distribution having distribution function $F(\cdot)$ and p.d.f. $f(\cdot)$, so that
\[ F_X(x_1,\dots,x_n) = \prod_{i=1}^{n} F(x_i),\qquad x\in\mathbb{R}^n. \]
We have
\[ F_X(x) = \prod_{i=1}^{n} F(x_i) = \prod_{i=1}^{n}\int_{-\infty}^{x_i} f(s_i)\,ds_i = \int_{-\infty}^{x_1}\cdots\int_{-\infty}^{x_n} f_X(s)\,ds_n\cdots ds_1, \]
where $f_X(s) = \prod_{i=1}^n f(s_i)$. It follows that $X$ is of absolutely continuous type with joint p.d.f. $f_X(\cdot)$. Therefore, for $i \neq j$,
\[ P(\{X_i = X_j\}) = 0. \]
Define $X_{k:n}$ to be the $k$-th smallest of $X_1,\dots,X_n$ ($k = 1,\dots,n$), so that $X_{1:n} \le X_{2:n} \le \cdots \le X_{n:n}$. Therefore
\[ F_{X_{k:n}}(x) = P(\{X_{k:n} \le x\}) = P(\{\text{at least } k \text{ of } X_1,\dots,X_n \text{ are} \le x\}) = \sum_{i=k}^{n} P(\{\text{exactly } i \text{ of } X_1,\dots,X_n \text{ are} \le x\}),\qquad x\in\mathbb{R}. \]
Fix $x\in\mathbb{R}$, and consider a sequence of $n$ trials where at the $i$-th trial we observe $X_i$ and consider the trial as having resulted in success if $X_i \le x$ and in failure if $X_i > x$, $i = 1,\dots,n$. Since $X_1,\dots,X_n$ are independent and the probability of success in the $i$-th trial is $P(\{X_i \le x\}) = F(x)$ (the same for all the trials), the above sequence of trials may be considered as a sequence of independent Bernoulli trials with probability of success in each trial as $F(x)$. Therefore
\[ P(\{\text{exactly } i \text{ of } X_1,\dots,X_n \text{ are} \le x\}) = \binom{n}{i}\big(F(x)\big)^i\big(1 - F(x)\big)^{n-i}, \]
and consequently
\[ F_{X_{k:n}}(x) = \sum_{i=k}^{n}\binom{n}{i}\big(F(x)\big)^i\big(1 - F(x)\big)^{n-i},\qquad x\in\mathbb{R}. \]
Recall that for $k\in\{0, 1,\dots,n\}$ and $q\in(0, 1)$ (see Theorem 3.1, Module 5)
\[ \sum_{j=k}^{n}\binom{n}{j}\, q^j(1 - q)^{n-j} = \frac{1}{B(k, n-k+1)}\int_{0}^{q} t^{k-1}(1 - t)^{n-k}\,dt. \]
Therefore,
\[ F_{X_{k:n}}(x) = \frac{1}{B(k, n-k+1)}\int_{0}^{F(x)} t^{k-1}(1 - t)^{n-k}\,dt,\qquad x\in\mathbb{R}. \]
Let
\[ f_{X_{k:n}}(x) = \frac{1}{B(k, n-k+1)}\,[F(x)]^{k-1}[1 - F(x)]^{n-k}\, f(x),\qquad x\in\mathbb{R}, \tag{10.1.1} \]
so that
\[ \frac{d}{dx}F_{X_{k:n}}(x) = f_{X_{k:n}}(x),\qquad \forall\, x\notin D, \]
where $D$ (a set of Lebesgue measure zero) is the set of points at which $F$ is not differentiable, and
\[ \int_{-\infty}^{\infty} f_{X_{k:n}}(x)\,dx = \frac{1}{B(k, n-k+1)}\int_{-\infty}^{\infty}[F(x)]^{k-1}[1 - F(x)]^{n-k} f(x)\,dx = \frac{1}{B(k, n-k+1)}\int_{0}^{1} t^{k-1}(1 - t)^{n-k}\,dt = 1. \]
It follows that the random variable $X_{k:n}$ is of absolutely continuous type with p.d.f. given by (10.1.1). A simple heuristic argument for expression (10.1.1) is as follows. Interpret $f_{X_{k:n}}(x)\Delta x$ as the probability that $X_{k:n}$ lies in an infinitesimal interval $[x, x + \Delta x]$. Realizing that the probability of more than one of the $X_i$'s falling in the infinitesimal interval $[x, x + \Delta x]$ is negligible, $f_{X_{k:n}}(x)\Delta x$ may be interpreted as the probability that one of the $X_i$'s falls in the infinitesimal interval $[x, x + \Delta x]$, $(k - 1)$ of the $X_i$'s fall in the interval $(-\infty, x]$ and $(n - k)$ of the $X_i$'s fall in the interval $(x + \Delta x, \infty) \simeq (x, \infty)$. Since $X_1,\dots,X_n$ are independent and the probabilities of an observation falling in the intervals $[x, x + \Delta x]$, $(-\infty, x]$ and $(x, \infty)$ are $f(x)\Delta x$, $F(x)$ and $1 - F(x)$, respectively, $f_{X_{k:n}}(x)\Delta x$ is given by the multinomial probability
\[ f_{X_{k:n}}(x)\Delta x \equiv \frac{n!}{1!\,(k-1)!\,(n-k)!}\,\big(f(x)\Delta x\big)\,\big(F(x)\big)^{k-1}\big(1 - F(x)\big)^{n-k}, \]
i.e.,
\[ f_{X_{k:n}}(x) = \frac{n!}{(k-1)!\,(n-k)!}\,[F(x)]^{k-1}[1 - F(x)]^{n-k}\, f(x),\qquad -\infty < x < \infty. \]
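For a $U(0, 1)$ sample, (10.1.1) says that $X_{k:n} \sim \text{Beta}(k, n-k+1)$; a quick simulation check (a sketch, with arbitrary $n$ and $k$):

```python
import numpy as np

rng = np.random.default_rng(5)
n, k = 7, 3
samples = np.sort(rng.uniform(size=(200_000, n)), axis=1)
x_k = samples[:, k - 1]                  # k-th order statistic of each row

# Beta(k, n-k+1) = Beta(3, 5): mean = 3/8, variance = (3*5)/((8**2)*9) = 15/576
mean_k, var_k = x_k.mean(), x_k.var()
```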
Now we will derive the joint distribution of $(X_{r:n}, X_{s:n})$, where $r$ and $s$ are fixed positive integers satisfying $1 \le r < s \le n$. For $-\infty < x < y < \infty$,
\[ F_{X_{r:n},X_{s:n}}(x, y) = P(\{X_{r:n} \le x,\ X_{s:n} \le y\}) = P(\{\text{at least } r \text{ of the } X_i \text{ are} \le x \text{ and at least } s \text{ of the } X_i \text{ are} \le y\}) \]
\[ = \sum_{j=s}^{n}\,\sum_{i=r}^{j}\, \frac{n!}{i!\,(j-i)!\,(n-j)!}\,[F(x)]^{i}\,[F(y) - F(x)]^{j-i}\,[1 - F(y)]^{n-j}, \]
where the trinomial probability counts the configurations with exactly $i$ of the $X_l$'s in $(-\infty, x]$ and exactly $j - i$ of them in $(x, y]$. Manipulating this double sum with the help of the beta-integral identity of Theorem 3.1, Module 5 (exactly as in the derivation of (10.1.1)) and differentiating, one obtains, for $-\infty < x < y < \infty$,
\[ \frac{\partial^2}{\partial x\,\partial y}F_{X_{r:n},X_{s:n}}(x, y) = \frac{n!}{(r-1)!\,(s-r-1)!\,(n-s)!}\,[F(x)]^{r-1}\,[F(y) - F(x)]^{s-r-1}\,[1 - F(y)]^{n-s}\, f(x)\,f(y), \]
for all $(x, y)$ outside a set of two-dimensional Lebesgue measure zero. On the other hand, for $y \le x$ we have
\[ \{X_{s:n} \le y\} \subseteq \{X_{r:n} \le x\}, \]
so that $F_{X_{r:n},X_{s:n}}(x, y) = F_{X_{s:n}}(y)$ and therefore
\[ \frac{\partial^2}{\partial x\,\partial y}F_{X_{r:n},X_{s:n}}(x, y) = 0. \]
Let
\[ f_{r,s}(x, y) = \begin{cases} \dfrac{n!}{(r-1)!\,(s-r-1)!\,(n-s)!}\,[F(x)]^{r-1}\,[F(y) - F(x)]^{s-r-1}\,[1 - F(y)]^{n-s}\, f(x)\,f(y), & \text{if } -\infty < x < y < \infty \\ 0, & \text{otherwise} \end{cases} \tag{10.1.2} \]
so that
\[ \frac{\partial^2}{\partial x\,\partial y}F_{X_{r:n},X_{s:n}}(x, y) = f_{r,s}(x, y)\qquad \forall\, (x, y)\in\mathbb{R}^2 - (D\times D). \]
It follows that the random vector $(X_{r:n}, X_{s:n})$ is of absolutely continuous type with joint p.d.f. $f_{r,s}(\cdot,\cdot)$. As before, there is a simple heuristic argument for (10.1.2): for $x < y$, interpret $f_{r,s}(x, y)\,\Delta x\,\Delta y$ as the probability that $(r-1)$ of the $X_i$'s fall in $(-\infty, x]$, one $X_i$ falls in $(x, x + \Delta x]$, $(s-r-1)$ of the $X_i$'s fall in $(x + \Delta x, y]$ $(\approx (x, y])$, one $X_i$ falls in $(y, y + \Delta y]$ and $(n - s)$ of the $X_i$'s fall in $(y + \Delta y, \infty)$ $(\approx (y, \infty))$. Using the multinomial probability, we have
\[ f_{r,s}(x, y)\,\Delta x\,\Delta y \equiv \frac{n!}{(r-1)!\,1!\,(s-r-1)!\,1!\,(n-s)!}\,[F(x)]^{r-1}\big(f(x)\Delta x\big)[F(y) - F(x)]^{s-r-1}\big(f(y)\Delta y\big)[1 - F(y)]^{n-s}, \]
i.e.,
\[ f_{r,s}(x, y) = \frac{n!}{(r-1)!\,(s-r-1)!\,(n-s)!}\,[F(x)]^{r-1}\,[F(y) - F(x)]^{s-r-1}\,[1 - F(y)]^{n-s}\, f(x)\,f(y),\qquad x < y. \]
Example 10.1.2
Let $X_1,\dots,X_n$ be a random sample from a discrete distribution having support $S$, distribution function $F(\cdot)$ and p.m.f. $f(\cdot)$. Define $X_{1:n} = \min\{X_1,\dots,X_n\}$ and $X_{n:n} = \max\{X_1,\dots,X_n\}$. Find the p.m.f.s of $X_{1:n}$ and $X_{n:n}$.

Solution. For $x\in\mathbb{R}$, the distribution function of $X_{1:n}$ is
\[ F_{X_{1:n}}(x) = 1 - P(\{X_i > x,\ i = 1,\dots,n\}) = 1 - \prod_{i=1}^{n}\big[1 - F(x)\big] = 1 - [1 - F(x)]^n. \]
Note that
\[ \{x\in\mathbb{R}: F_{X_{1:n}}(\cdot) \text{ is discontinuous at } x\} = \{x\in\mathbb{R}: F(\cdot) \text{ is discontinuous at } x\} = S. \]
Thus $X_{1:n}$ is a discrete type random variable with support $S$ and p.m.f.
\[ f_{X_{1:n}}(x) = \big[1 - F(x-)\big]^n - \big[1 - F(x)\big]^n,\qquad x\in S. \]
Similarly,
\[ F_{X_{n:n}}(x) = P(\{X_i \le x,\ i = 1,\dots,n\}) = \prod_{i=1}^{n} P(\{X_i \le x\}) = \prod_{i=1}^{n} F(x) = [F(x)]^n,\qquad x\in\mathbb{R}. \]
Since $F_{X_{n:n}}(\cdot)$ is continuous at $x$ if, and only if, $F(\cdot)$ is continuous at $x$, the random variable $X_{n:n}$ is of discrete type with support $S$ and p.m.f.
\[ f_{X_{n:n}}(x) = [F(x)]^n - [F(x-)]^n,\qquad x\in S. \]
▄
Example 10.1.3
Let $X_1, X_2$ be a random sample from the $U(0, 1)$ distribution. Find the distribution function of $Y = X_1 + X_2$. Hence find the p.d.f. of $Y$.

Solution. The joint p.d.f. of $(X_1, X_2)$ equals $1$ on $(0, 1)^2$ and $0$ elsewhere, so
\[ F_Y(y) = P(\{X_1 + X_2 \le y\}) = \iint_{\{x_1 + x_2 \le y\}} f(x_1, x_2)\,dx_1\,dx_2 = \begin{cases} 0, & \text{if } y < 0 \\ \frac{1}{2}\times y\times y, & \text{if } 0 \le y < 1 \\ 1 - \frac{1}{2}(2 - y)\times(2 - y), & \text{if } 1 \le y < 2 \\ 1, & \text{if } y \ge 2 \end{cases} \]
\[ \implies F_Y(y) = \begin{cases} 0, & \text{if } y < 0 \\ \dfrac{y^2}{2}, & \text{if } 0 \le y < 1 \\ \dfrac{4y - y^2 - 2}{2}, & \text{if } 1 \le y < 2 \\ 1, & \text{if } y \ge 2 \end{cases}. \]
Let
\[ h(y) = \begin{cases} y, & \text{if } 0 < y < 1 \\ 2 - y, & \text{if } 1 \le y < 2 \\ 0, & \text{otherwise} \end{cases}, \]
so that
\[ \frac{d}{dy}F_Y(y) = h(y)\qquad \forall\, y\in\mathbb{R} - \{0, 1, 2\} \]
and
\[ \int_{-\infty}^{\infty} h(y)\,dy = 1. \]
It follows that $Y$ is of absolutely continuous type with p.d.f.
\[ f_Y(y) = \begin{cases} y, & \text{if } 0 < y < 1 \\ 2 - y, & \text{if } 1 \le y < 2 \\ 0, & \text{otherwise} \end{cases}. \]
▄
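The triangular distribution function of Example 10.1.3 is easy to confirm by simulation (a sketch):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 300_000
y = rng.uniform(size=n) + rng.uniform(size=n)   # sum of two independent U(0,1)

p_half = np.mean(y <= 0.5)        # F_Y(0.5) = 0.5**2 / 2   = 0.125
p_three_half = np.mean(y <= 1.5)  # F_Y(1.5) = 1 - 0.125    = 0.875
mean_y = y.mean()                 # E(Y) = 1
```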
Example 10.1.4
Let $X_1, X_2$ be a random sample from a distribution having p.d.f.
\[ f(x) = \begin{cases} 2x, & \text{if } 0 < x < 1 \\ 0, & \text{otherwise} \end{cases}. \]
Find the distribution function of $Y = X_1 + X_2$ and hence the p.d.f. of $Y$.

Solution. The joint p.d.f. of $(X_1, X_2)$ is
\[ f_{X_1,X_2}(x_1, x_2) = \begin{cases} 4x_1x_2, & \text{if } 0 < x_1 < 1,\ 0 < x_2 < 1 \\ 0, & \text{otherwise} \end{cases}. \]
For $0 \le y < 1$,
\[ F_Y(y) = \int_{0}^{y}\int_{0}^{y - x_1} 4x_1x_2\,dx_2\,dx_1 = 2\int_{0}^{y} x_1(y - x_1)^2\,dx_1 = \frac{y^4}{6}. \]
For $1 \le y < 2$, integrating the joint p.d.f. over $\{(x_1, x_2)\in(0,1)^2: x_1 + x_2 \le y\}$ gives
\[ F_Y(y) = 1 - \frac{y^4}{6} + 2y^2 - \frac{8y}{3}. \]
Therefore,
\[ F_Y(y) = \begin{cases} 0, & \text{if } y < 0 \\ \dfrac{y^4}{6}, & \text{if } 0 \le y < 1 \\ 1 - \dfrac{y^4}{6} + 2y^2 - \dfrac{8y}{3}, & \text{if } 1 \le y < 2 \\ 1, & \text{if } y \ge 2 \end{cases}. \]
Let
\[ h(y) = \begin{cases} \dfrac{2}{3}y^3, & \text{if } 0 < y < 1 \\ \dfrac{2}{3}\big(6y - 4 - y^3\big), & \text{if } 1 \le y < 2 \\ 0, & \text{otherwise} \end{cases}, \]
so that
\[ \frac{d}{dy}F_Y(y) = h(y)\qquad \forall\, y\in\mathbb{R} - \{0, 1, 2\} \]
and
\[ \int_{-\infty}^{\infty} h(y)\,dy = 1. \]
It follows that $Y$ is of absolutely continuous type with p.d.f.
\[ f_Y(y) = \begin{cases} \dfrac{2}{3}y^3, & \text{if } 0 < y < 1 \\ \dfrac{2}{3}\big(6y - 4 - y^3\big), & \text{if } 1 \le y < 2 \\ 0, & \text{otherwise} \end{cases}. \]
▄
Example 10.1.5
Let $X_1, X_2, X_3$ be a random sample from the $N(0, 1)$ distribution. Find the distribution function of $Y = X_1^2 + X_2^2 + X_3^2$. Hence find the p.d.f. of $Y$.

Solution. The joint p.d.f. of $X = (X_1, X_2, X_3)$ is
\[ f_X(x_1, x_2, x_3) = \prod_{i=1}^{3} f_{X_i}(x_i) = \prod_{i=1}^{3}\frac{1}{\sqrt{2\pi}}\,e^{-\frac{x_i^2}{2}} = \frac{1}{(2\pi)^{3/2}}\,e^{-\frac{x_1^2 + x_2^2 + x_3^2}{2}},\qquad -\infty < x_i < \infty,\ i = 1, 2, 3. \]
Clearly $F_Y(y) = 0$ for $y \le 0$. For $y > 0$, making the transformation to spherical coordinates
\[ x_1 = r\sin\theta_1\sin\theta_2,\qquad x_2 = r\sin\theta_1\cos\theta_2,\qquad x_3 = r\cos\theta_1, \]
so that $r > 0$, $0 < \theta_1 \le \pi$, $0 < \theta_2 \le 2\pi$ and the Jacobian of the transformation is $J = r^2\sin\theta_1$, we get, for $y > 0$,
\[ F_Y(y) = \frac{1}{(2\pi)^{3/2}}\int_{0}^{\sqrt{y}}\int_{0}^{\pi}\int_{0}^{2\pi} e^{-\frac{r^2}{2}}\, r^2\sin\theta_1\,d\theta_2\,d\theta_1\,dr = \sqrt{\frac{2}{\pi}}\int_{0}^{\sqrt{y}} r^2\, e^{-\frac{r^2}{2}}\,dr = \frac{1}{2^{3/2}\,\Gamma\big(\frac{3}{2}\big)}\int_{0}^{y} t^{\frac{1}{2}}\, e^{-\frac{t}{2}}\,dt, \]
on substituting $t = r^2$. Therefore
\[ F_Y(y) = \begin{cases} 0, & \text{if } y \le 0 \\ \dfrac{1}{2^{3/2}\,\Gamma\big(\frac{3}{2}\big)}\displaystyle\int_{0}^{y} t^{\frac{1}{2}}\, e^{-\frac{t}{2}}\,dt, & \text{if } y > 0 \end{cases}. \]
Clearly $F_Y(\cdot)$ is the distribution function of the $\chi^2_3$ distribution, having the p.d.f.
\[ f_Y(y) = \begin{cases} \dfrac{y^{\frac{1}{2}}\, e^{-\frac{y}{2}}}{2^{3/2}\,\Gamma\big(\frac{3}{2}\big)}, & \text{if } y > 0 \\ 0, & \text{otherwise} \end{cases}. \]
▄
In general, computing the distribution function
\[ F_Y(y) = P\big(\{h(X_1,\dots,X_n) \le y\}\big),\qquad -\infty < y < \infty, \]
of the random variable $Y = h(X_1,\dots,X_n)$ may be difficult or quite tedious. For example, consider a random sample $X_1,\dots,X_n$ ($n \ge 4$) from the $N(0, 1)$ distribution and suppose that the distribution function of $Y = \sum_{i=1}^{n} X_i^2$ is desired. Clearly, for $y > 0$,
\[ F_Y(y) = \frac{1}{(2\pi)^{n/2}}\int\cdots\int_{\{\sum_{i=1}^n x_i^2 \le y\}} e^{-\frac{1}{2}\sum_{i=1}^n x_i^2}\,dx_1\,dx_2\cdots dx_n. \]
Making the transformation to $n$-dimensional spherical coordinates, so that $\sum_{i=1}^{n} x_i^2 = r^2$, $r > 0$, $0 < \theta_i \le \pi$, $i = 1,\dots,n-2$, $0 < \theta_{n-1} \le 2\pi$, and the Jacobian of the transformation is $J = r^{n-1}\sin^{n-2}\theta_1\sin^{n-3}\theta_2\cdots\sin\theta_{n-2}$, we get, for $y > 0$,
\[ F_Y(y) = \frac{1}{(2\pi)^{n/2}}\int_{0}^{\sqrt{y}}\int_{0}^{\pi}\cdots\int_{0}^{\pi}\int_{0}^{2\pi} e^{-\frac{r^2}{2}}\, r^{n-1}\sin^{n-2}\theta_1\sin^{n-3}\theta_2\cdots\sin\theta_{n-2}\,d\theta_{n-1}\,d\theta_{n-2}\cdots d\theta_1\,dr. \]
Clearly evaluating the above integral may be tedious. This points towards the desirability, where possible, of other methods of determining the distributions of functions of random variables. We will see that other techniques are available and, in a given situation, often one technique is more elegant than the others.
Theorem 10.2.1
Let $X = (X_1,\dots,X_n)$ be a discrete type random vector with support $S_X$ and p.m.f. $f_X(\cdot)$. Let $h_i: \mathbb{R}^n\to\mathbb{R}$, $i = 1,\dots,k$, be $k$ Borel functions and let $Y_i = h_i(X)$, $i = 1,\dots,k$. Define, for $y = (y_1,\dots,y_k)\in\mathbb{R}^k$,
\[ A_y = \{x = (x_1,\dots,x_n)\in S_X: h_1(x) \le y_1,\dots,h_k(x) \le y_k\} \]
and
\[ B_y = \{x = (x_1,\dots,x_n)\in S_X: h_1(x) = y_1,\dots,h_k(x) = y_k\}. \]
Then $Y = (Y_1,\dots,Y_k)$ is a discrete type random vector with distribution function and p.m.f. given, respectively, by
\[ F_Y(y) = \sum_{x\in A_y} f_X(x),\qquad y\in\mathbb{R}^k, \]
and
\[ f_Y(y) = \sum_{x\in B_y} f_X(x),\qquad y\in\mathbb{R}^k. \]
▄
We will denote the Cartesian product of sets $A_1,\dots,A_n$ by $\prod_{i=1}^{n} A_i = \{(x_1,\dots,x_n): x_i\in A_i,\ i = 1,\dots,n\}$.
Example 10.2.1
Let $X_1,\dots,X_n$ be independent random variables with $X_i \sim \text{Bin}(n_i, p)$, where $n_i\in\mathbb{N}$, $i = 1,\dots,n$, and $p\in(0, 1)$. Without using the m.g.f. of $Y = \sum_{i=1}^{n} X_i$, find the p.m.f. of $Y$.

Solution. For a derivation of the probability distribution of $Y$ using the uniqueness property of m.g.f.s, see Example 7.2. The joint p.m.f. of $X = (X_1,\dots,X_n)$ is given by
\[ f_X(x) = \prod_{i=1}^{n} f_{X_i}(x_i) = \begin{cases} \left[\displaystyle\prod_{i=1}^{n}\binom{n_i}{x_i}\right] p^{\sum_{i=1}^n x_i}\,(1 - p)^{\sum_{i=1}^n n_i - \sum_{i=1}^n x_i}, & \text{if } x\in\displaystyle\prod_{i=1}^{n}\{0, 1,\dots,n_i\} \\ 0, & \text{otherwise} \end{cases}. \]
By Theorem 10.2.1,
\[ f_Y(y) = \sum_{x\in B_y} f_X(x),\qquad y\in\mathbb{R}, \]
where, for $y\in\mathbb{R}$, $B_y = \{x\in S_X: x_1 + \cdots + x_n = y\}$. Let $m = \sum_{i=1}^{n} n_i$. Clearly, for $y\notin\{0, 1,\dots,m\}$, $B_y = \emptyset$ and therefore $f_Y(y) = 0$. Also, for $y\in\{0, 1,\dots,m\}$,
\[ f_Y(y) = \sum_{\substack{x_i\ge 0\\ x_1+\cdots+x_n = y}}\left[\prod_{i=1}^{n}\binom{n_i}{x_i}\right] p^{y}(1 - p)^{m - y} = \binom{m}{y}\, p^{y}(1 - p)^{m - y}, \]
where the last step uses the Vandermonde identity $\sum_{x_1+\cdots+x_n = y}\prod_{i=1}^{n}\binom{n_i}{x_i} = \binom{m}{y}$. Therefore
\[ f_Y(y) = \begin{cases} \binom{m}{y}\, p^{y}(1 - p)^{m - y}, & \text{if } y\in\{0, 1,\dots,m\} \\ 0, & \text{otherwise} \end{cases}, \]
i.e., $Y \sim \text{Bin}\big(\sum_{i=1}^{n} n_i,\ p\big)$.
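The Vandermonde step can be checked directly by convolving binomial p.m.f.s (a sketch with arbitrary sizes $n_i$ and success probability $p$):

```python
from math import comb

def binom_pmf(n, p, x):
    return comb(n, x) * p**x * (1 - p) ** (n - x)

ns, p = [2, 3, 4], 0.35

# Convolve the individual Bin(n_i, p) p.m.f.s to get the p.m.f. of the sum.
pmf = {0: 1.0}
for n_i in ns:
    new = {}
    for s, q in pmf.items():
        for x in range(n_i + 1):
            new[s + x] = new.get(s + x, 0.0) + q * binom_pmf(n_i, p, x)
    pmf = new

m = sum(ns)   # the convolution should equal the Bin(m, p) p.m.f.
```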
Example 10.2.2
Let $X_1,\dots,X_n$ be independent random variables such that $X_i \sim \text{P}(\lambda_i)$, $\lambda_i > 0$, $i = 1,\dots,n$. Without using the m.g.f. of $Y = \sum_{i=1}^{n} X_i$, find the probability distribution of $Y$.

Solution. For a derivation of the probability distribution of $Y$ using the uniqueness property of m.g.f.s, see Example 7.4. The joint p.m.f. of $X = (X_1,\dots,X_n)$ is
\[ f_X(x) = \prod_{i=1}^{n} f_{X_i}(x_i) = \begin{cases} e^{-\sum_{i=1}^n \lambda_i}\left[\displaystyle\prod_{i=1}^{n}\frac{\lambda_i^{x_i}}{x_i!}\right], & \text{if } x = (x_1,\dots,x_n)\in\{0, 1, 2,\dots\}^n \\ 0, & \text{otherwise} \end{cases}. \]
By Theorem 10.2.1, $f_Y(y) = \sum_{x\in B_y} f_X(x)$, $y\in\mathbb{R}$, where $B_y = \{x\in S_X: x_1 + \cdots + x_n = y\}$. For $y\in\{0, 1, 2,\dots\}$,
\[ f_Y(y) = \sum_{\substack{x_i\ge 0\\ x_1+\cdots+x_n = y}} e^{-\sum_{i=1}^n \lambda_i}\prod_{i=1}^{n}\frac{\lambda_i^{x_i}}{x_i!} = \frac{e^{-\sum_{i=1}^n \lambda_i}}{y!}\sum_{\substack{x_i\ge 0\\ x_1+\cdots+x_n = y}}\frac{y!}{x_1!\cdots x_n!}\,\lambda_1^{x_1}\cdots\lambda_n^{x_n} = \frac{e^{-\sum_{i=1}^n \lambda_i}\big(\lambda_1 + \cdots + \lambda_n\big)^y}{y!}, \]
using the multinomial theorem. Therefore,
\[ f_Y(y) = \begin{cases} \dfrac{e^{-\sum_{i=1}^n \lambda_i}\big(\sum_{i=1}^n \lambda_i\big)^y}{y!}, & \text{if } y\in\{0, 1, 2,\dots\} \\ 0, & \text{otherwise} \end{cases}, \]
i.e., $Y \sim \text{P}\big(\sum_{i=1}^{n}\lambda_i\big)$.
Example 10.2.3
Let $X_1,\dots,X_n$ be independent random variables such that $X_i \sim \text{NB}(r_i, p)$, $p\in(0, 1)$, $r_i\in\{1, 2,\dots\}$, $i = 1,\dots,n$. Without using the m.g.f. of $Y = \sum_{i=1}^{n} X_i$, show that $Y \sim \text{NB}\big(\sum_{i=1}^{n} r_i, p\big)$.

Solution. For a solution utilizing the uniqueness property of m.g.f.s refer to Example 7.3. One can also provide a solution based on the methods used in solving Examples 10.2.1 and 10.2.2 by using the identity
\[ \sum_{\substack{k_i\ge 0\\ k_1+\cdots+k_n = y}}\binom{k_1 + r_1 - 1}{k_1}\cdots\binom{k_n + r_n - 1}{k_n} = \binom{y + \sum_{i=1}^{n} r_i - 1}{y},\qquad y\in\{0, 1, 2,\dots\}. \]
▄
Example 10.2.4
Let $X$ and $Z$ be independent and identically distributed random variables with $X \sim \text{NB}(1, q)$, where $q\in(0, 1)$. Find the distribution function of $Y = X + Z$. Hence find the p.m.f. of $Y$ (also see Examples 7.3 and 10.2.3).

Solution. $X$ and $Z$ have the common support $S = \{0, 1, 2,\dots\}$, with $P(\{X = j\}) = q(1-q)^j$, $j\in S$, so we have $F_Y(y) = 0$ if $y < 0$. Moreover, for $y\in[k, k+1)$, $k\in\{0, 1, 2,\dots\}$,
\[ F_Y(y) = P(\{X + Z \le k\}) = \sum_{j=0}^{\infty} P(\{X + Z \le k,\ Z = j\}) = \sum_{j=0}^{k} P(\{X \le k - j\})\, P(\{Z = j\}) \]
\[ = \sum_{j=0}^{k}\big[1 - (1-q)^{k-j+1}\big]\, q(1-q)^j = \big[1 - (1-q)^{k+1}\big] - (k+1)\,q(1-q)^{k+1}. \]
Consequently
\[ F_Y(y) = \begin{cases} 0, & \text{if } y < 0 \\ 1 - (1-q)^{k+1} - (k+1)\,q(1-q)^{k+1}, & \text{if } k \le y < k+1,\ k\in\{0, 1,\dots\} \end{cases}. \]
Clearly $Y$ is a discrete type random variable with support $S_Y = \{0, 1, 2,\dots\}$ and, for $k\in S_Y$,
\[ f_Y(k) = F_Y(k) - F_Y(k - 1) = (k+1)\, q^2(1-q)^k. \]
Thus
\[ f_Y(y) = \begin{cases} (y+1)\, q^2(1-q)^y, & \text{if } y\in\{0, 1, 2,\dots\} \\ 0, & \text{otherwise} \end{cases}, \]
i.e., $Y \sim \text{NB}(2, q)$.
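A direct numerical check of the convolution in Example 10.2.4 (a sketch; the value of $q$ is arbitrary):

```python
q = 0.3
pmf_x = lambda x: q * (1 - q) ** x          # NB(1, q) p.m.f. on {0, 1, 2, ...}

def pmf_y(y):
    """p.m.f. of Y = X + Z by discrete convolution."""
    return sum(pmf_x(j) * pmf_x(y - j) for j in range(y + 1))

closed_form = lambda y: (y + 1) * q**2 * (1 - q) ** y   # NB(2, q) p.m.f.
```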
Example 10.2.5
Let X₁ and X₂ be independent and identically distributed random variables with common p.m.f.

f(x) = { p (1 − p)^{x−1}, if x ∈ {1, 2, ...}
       { 0, otherwise,

where p ∈ (0, 1). Define U = min{X₁, X₂} and V = max{X₁, X₂} − min{X₁, X₂}.
(i) Find the marginal p.m.f. of U without finding the joint p.m.f. of W = (U, V);
(ii) Find the marginal p.m.f. of V without finding the joint p.m.f. of W = (U, V);
(iii) Find the joint p.m.f. of W = (U, V), and hence the marginal p.m.f.s of U and V.

Solution.
(i) For u ∈ {1, 2, ...},

A_u = {(x₁, x₂) ∈ S_X: min{x₁, x₂} = u}
    = {(u, u)} ∪ {(x₁, x₂) ∈ S_X: x₁ = u, x₂ ∈ {u + 1, u + 2, ...}}
      ∪ {(x₁, x₂) ∈ S_X: x₂ = u, x₁ ∈ {u + 1, u + 2, ...}}
    = A_{1,u} ∪ A_{2,u} ∪ A_{3,u}, say.

Clearly, for u ∈ {1, 2, ...}, A_{1,u}, A_{2,u} and A_{3,u} are pairwise disjoint sets. Therefore, for u ∈ ℕ,

f_U(u) = p²(1 − p)^{2u−2} + Σ_{j=u+1}^{∞} p(1 − p)^{u−1} p(1 − p)^{j−1} + Σ_{j=u+1}^{∞} p(1 − p)^{j−1} p(1 − p)^{u−1}
       = p²(1 − p)^{2u−2} + 2p(1 − p)^{u−1} Σ_{j=u+1}^{∞} p(1 − p)^{j−1}
       = p²(1 − p)^{2u−2} + 2p(1 − p)^{2u−1}
       = p(2 − p)(1 − p)^{2(u−1)}.

Therefore,

f_U(u) = { p(2 − p)(1 − p)^{2(u−1)}, if u ∈ {1, 2, ...}
         { 0, otherwise.

(ii) We have f_V(v) = Σ_{(x₁,x₂) ∈ B_v} f_X(x₁, x₂), where, for v ∈ ℝ,

B_v = {(x₁, x₂) ∈ S_X: max{x₁, x₂} − min{x₁, x₂} = v}.

For v = 0, B_0 = {(x₁, x₂) ∈ S_X: x₁ = x₂} = {(x, x): x ∈ {1, 2, ...}}, and therefore

f_V(0) = Σ_{x=1}^{∞} p²(1 − p)^{2x−2} = p²/[1 − (1 − p)²] = p/(2 − p).

For v ∈ {1, 2, ...},

B_v = {(x, x + v): x ∈ {1, 2, ...}} ∪ {(x + v, x): x ∈ {1, 2, ...}},

so that

f_V(v) = 2 Σ_{x=1}^{∞} p(1 − p)^{x−1} p(1 − p)^{x+v−1} = 2p(1 − p)^v/(2 − p).

Therefore

f_V(v) = { p/(2 − p), if v = 0
         { 2p(1 − p)^v/(2 − p), if v ∈ {1, 2, ...}
         { 0, otherwise.

(iii) For u ∈ {1, 2, ...} and v ∈ {0, 1, 2, ...},

f_{U,V}(u, v) = P({min{X₁, X₂} = u, max{X₁, X₂} = u + v}) = Σ_{(x₁,x₂) ∈ C_{u,v}} f_X(x₁, x₂),

where

C_{u,v} = {(x₁, x₂) ∈ S_X: min{x₁, x₂} = u, max{x₁, x₂} = u + v}.

For v = 0, C_{u,0} = {(u, u)} and

f_{U,V}(u, 0) = p²(1 − p)^{2u−2};

for v ∈ {1, 2, ...}, C_{u,v} = {(u, u + v), (u + v, u)} and

f_{U,V}(u, v) = 2p²(1 − p)^{2u+v−2}.

It follows that

S_U = {u ∈ ℝ: (u, v) ∈ S_W for some v ∈ ℝ} = ℕ
and
S_V = {v ∈ ℝ: (u, v) ∈ S_W for some u ∈ ℝ} = {0, 1, 2, ...}.

Also

f_U(u) = { Σ_{v ∈ S_V} f_{U,V}(u, v), if u ∈ S_U
         { 0, otherwise.

For u ∈ ℕ,

f_U(u) = Σ_{v=0}^{∞} f_{U,V}(u, v)
       = p²(1 − p)^{2u−2} + Σ_{v=1}^{∞} 2p²(1 − p)^{2u+v−2}
       = p(2 − p)(1 − p)^{2(u−1)}.

Therefore

f_U(u) = { p(2 − p)(1 − p)^{2(u−1)}, if u ∈ {1, 2, ...}
         { 0, otherwise.

Similarly,

f_V(v) = { Σ_{u ∈ S_U} f_{U,V}(u, v), if v ∈ S_V
         { 0, otherwise,

so that f_V(0) = Σ_{u=1}^{∞} p²(1 − p)^{2u−2} = p/(2 − p) and, for v ∈ {1, 2, ...},

f_V(v) = Σ_{u=1}^{∞} 2p²(1 − p)^{2u+v−2} = 2p(1 − p)^v/(2 − p).

It follows that

f_V(v) = { p/(2 − p), if v = 0
         { 2p(1 − p)^v/(2 − p), if v ∈ {1, 2, ...}
         { 0, otherwise.
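The three closed forms above can be checked by brute force over a truncated grid; the truncation point N below is an arbitrary choice large enough that the neglected tail mass is negligible.

```python
from math import isclose

p = 0.4
N = 300  # truncation; (1 - p)^N is negligible

def geo(x):
    # p.m.f. p (1 - p)^(x - 1) on {1, 2, ...}
    return p * (1 - p) ** (x - 1)

# brute-force p.m.f.s of U = min(X1, X2) and V = max - min
pmf_u, pmf_v = {}, {}
for x1 in range(1, N):
    for x2 in range(1, N):
        w = geo(x1) * geo(x2)
        pmf_u[min(x1, x2)] = pmf_u.get(min(x1, x2), 0.0) + w
        pmf_v[abs(x1 - x2)] = pmf_v.get(abs(x1 - x2), 0.0) + w

for u in range(1, 10):
    assert isclose(pmf_u[u], p * (2 - p) * (1 - p) ** (2 * (u - 1)), abs_tol=1e-9)
assert isclose(pmf_v[0], p / (2 - p), abs_tol=1e-9)
for v in range(1, 10):
    assert isclose(pmf_v[v], 2 * p * (1 - p) ** v / (2 - p), abs_tol=1e-9)
```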
Example 10.2.6
Let X = (X₁, X₂, X₃) be a discrete type random vector with p.m.f.

f_X(x₁, x₂, x₃) = { 2/9, if (x₁, x₂, x₃) ∈ {(1, 1, 0), (1, 0, 1), (0, 1, 1)}
                  { 1/3, if (x₁, x₂, x₃) = (1, 1, 1)
                  { 0, otherwise.

Define U = X₁ + X₂ and V = X₂ + X₃.
(i) Find the marginal p.m.f. of U without finding the joint p.m.f. of W = (U, V);
(ii) Find the marginal p.m.f. of V without finding the joint p.m.f. of W = (U, V);
(iii) Find the joint p.m.f. of W = (U, V);
(iv) Are U and V independent?
(v) Using (iii) find the marginal p.m.f.s of U and V.

Solution.
(i) Clearly U takes only the values 1 and 2, with

P({U = 1}) = P({X = (1, 0, 1)}) + P({X = (0, 1, 1)}) = 4/9
and
P({U = 2}) = P({X = (1, 1, 0)}) + P({X = (1, 1, 1)}) = 5/9.

Therefore,

f_U(u) = { 4/9, if u = 1
         { 5/9, if u = 2
         { 0, otherwise.

(ii) By symmetry

f_V(v) = { 4/9, if v = 1
         { 5/9, if v = 2
         { 0, otherwise.

(iii) We have

f_W(1, 1) = P({X₁ + X₂ = 1, X₂ + X₃ = 1}) = P({X = (1, 0, 1)}) = 2/9,
f_W(1, 2) = P({X₁ + X₂ = 1, X₂ + X₃ = 2}) = P({X = (0, 1, 1)}) = 2/9,
f_W(2, 1) = P({X₁ + X₂ = 2, X₂ + X₃ = 1}) = P({X = (1, 1, 0)}) = 2/9
and
f_W(2, 2) = P({X = (1, 1, 1)}) = 1/3.

Therefore

f_W(u, v) = { 2/9, if (u, v) ∈ {(1, 1), (1, 2), (2, 1)}
            { 1/3, if (u, v) = (2, 2)
            { 0, otherwise.

(iv) Since

P({U = 1, V = 1}) = 2/9 ≠ P({U = 1}) P({V = 1}) = 16/81,

U and V are not independent.

(v) Using (iii) we have S_W = {(1, 1), (1, 2), (2, 1), (2, 2)}. Therefore

S_U = {u ∈ ℝ: (u, v) ∈ S_W for some v ∈ ℝ} = {1, 2}
and
S_V = {v ∈ ℝ: (u, v) ∈ S_W for some u ∈ ℝ} = {1, 2}.

Also

f_U(u) = { Σ_{v ∈ S_V} f_W(u, v), if u ∈ S_U
         { 0, otherwise,

so that

f_U(1) = f_W(1, 1) + f_W(1, 2) = 4/9
and
f_U(2) = f_W(2, 1) + f_W(2, 2) = 5/9.

Then

f_U(u) = { 4/9, if u = 1
         { 5/9, if u = 2
         { 0, otherwise.

By symmetry, f_V is given by the same expression.
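Because the sample space here is tiny, the whole example can be verified exactly by enumeration with rational arithmetic:

```python
from fractions import Fraction as F

# joint p.m.f. of X = (X1, X2, X3) from Example 10.2.6
pmf_x = {(1, 1, 0): F(2, 9), (1, 0, 1): F(2, 9), (0, 1, 1): F(2, 9), (1, 1, 1): F(1, 3)}

pmf_uv = {}
for (x1, x2, x3), w in pmf_x.items():
    key = (x1 + x2, x2 + x3)          # (U, V)
    pmf_uv[key] = pmf_uv.get(key, 0) + w

assert pmf_uv == {(1, 1): F(2, 9), (1, 2): F(2, 9), (2, 1): F(2, 9), (2, 2): F(1, 3)}
pmf_u = {u: sum(w for (a, b), w in pmf_uv.items() if a == u) for u in (1, 2)}
assert pmf_u == {1: F(4, 9), 2: F(5, 9)}
# not independent: P(U=1, V=1) != P(U=1) P(V=1)
assert pmf_uv[(1, 1)] != F(4, 9) * F(4, 9)
```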
Example 10.2.7
Let X = (X₁, X₂) be a discrete type random vector with p.m.f. given by the table

  f_X(x₁, x₂)    x₁ = −1    x₁ = 1
  x₂ = 0           1/4        1/2
  x₂ = 2           1/16       3/16

Find the p.m.f. of Z = |X₁ − 2X₂|.

Solution. We have

  (x₁, x₂)     f_X(x₁, x₂)     z = |x₁ − 2x₂|
  (−1, 0)        1/4                1
  (1, 0)         1/2                1
  (1, 2)         3/16               3
  (−1, 2)        1/16               5

Therefore the p.m.f. of Z = |X₁ − 2X₂| is given by

f_Z(z) = { 3/4, if z = 1
         { 3/16, if z = 3
         { 1/16, if z = 5
         { 0, otherwise.
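The same enumeration idea applies here; the four table cells are mapped through z = |x₁ − 2x₂| and the masses collected:

```python
from fractions import Fraction as F

# p.m.f. table of X = (X1, X2) from Example 10.2.7
pmf_x = {(-1, 0): F(1, 4), (1, 0): F(1, 2), (-1, 2): F(1, 16), (1, 2): F(3, 16)}

pmf_z = {}
for (x1, x2), w in pmf_x.items():
    z = abs(x1 - 2 * x2)
    pmf_z[z] = pmf_z.get(z, 0) + w

assert pmf_z == {1: F(3, 4), 3: F(3, 16), 5: F(1, 16)}
```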
Theorem 10.2.2
Let X = (X₁, ..., X_p) be a random vector of absolutely continuous type with a joint p.d.f. f_X(·) and support S_X = {x ∈ ℝ^p: f_X(x) > 0}. Let S₁, ..., S_k be open subsets of ℝ^p such that S_i ∩ S_j = ∅, if i ≠ j, and ⋃_{i=1}^{k} S_i = S_X. Suppose that h_j: ℝ^p → ℝ, j = 1, ..., p, are p Borel functions such that, on each S_i, h = (h₁, ..., h_p): S_i → ℝ^p is one-to-one with inverse transformation h_i^{−1} = (h_{1,i}^{−1}, ..., h_{p,i}^{−1}) (say), i = 1, ..., k. Further suppose that h_{j,i}^{−1}, j = 1, ..., p, i = 1, ..., k, have continuous partial derivatives and the Jacobian determinants

J_i = | ∂h_{1,i}^{−1}/∂u₁  ⋯  ∂h_{1,i}^{−1}/∂u_p |
      | ∂h_{2,i}^{−1}/∂u₁  ⋯  ∂h_{2,i}^{−1}/∂u_p |  ≠ 0, i = 1, ..., k.
      |        ⋮                      ⋮           |
      | ∂h_{p,i}^{−1}/∂u₁  ⋯  ∂h_{p,i}^{−1}/∂u_p |

Define U_j = h_j(X₁, ..., X_p), j = 1, ..., p. Then the random vector U = (U₁, ..., U_p) is of absolutely continuous type with joint p.d.f.

f_U(u) = Σ_{i=1}^{k} f_X(h_{1,i}^{−1}(u), ..., h_{p,i}^{−1}(u)) |J_i| I_{h(S_i)}(u).

▄
We shall not provide the proof of the above theorem. The idea of the proof of the above theorem is similar to that of Theorem 2.2, Module 3. In the proof of the theorem, the joint distribution function of U is written in the form of multiple integrals which are simplified by appropriate changes of variables.
Corollary 10.2.1
Let X = (X₁, ..., X_p) be a random vector of absolutely continuous type with a joint p.d.f. f_X(·) and support S_X = {x ∈ ℝ^p: f_X(x) > 0}, an open set in ℝ^p. Suppose that h_j: ℝ^p → ℝ, j = 1, ..., p, are p Borel functions such that h = (h₁, ..., h_p): S_X → ℝ^p is one-to-one with inverse transformation h^{−1} = (h₁^{−1}, ..., h_p^{−1}) (say). Further suppose that h_j^{−1}, j = 1, ..., p, have continuous partial derivatives and the Jacobian determinant J ≠ 0. Define h(S_X) = {h(x) = (h₁(x), ..., h_p(x)) ∈ ℝ^p: x ∈ S_X} and U_j = h_j(X₁, ..., X_p), j = 1, ..., p. Then the random vector U = (U₁, ..., U_p) is of absolutely continuous type with joint p.d.f.

f_U(u) = f_X(h₁^{−1}(u), ..., h_p^{−1}(u)) |J| I_{h(S_X)}(u).

Remark 10.2.1
Let X = (X₁, ..., X_p) be a random vector of absolutely continuous type with joint p.d.f. f_X and let S_X = {x ∈ ℝ^p: f_X(x) > 0}. Suppose that we are interested in finding the joint probability distribution of the random vector U = (U₁, ..., U_k) = (h₁(X), ..., h_k(X)), where k ∈ {1, ..., p} and h_i: ℝ^p → ℝ, i = 1, ..., k, are some Borel functions. For this we shall define p − k additional auxiliary Borel functions h_i: ℝ^p → ℝ, i = k + 1, ..., p, such that the transformation h = (h₁, ..., h_p): S_X → ℝ^p satisfies the assumptions of Theorem 10.2.2/Corollary 10.2.1. Then an application of Theorem 10.2.2/Corollary 10.2.1 will provide the joint p.d.f. f_{U₁,...,U_p} of (U₁, ..., U_p), from which the marginal joint p.d.f. of U = (U₁, ..., U_k) is obtained by integrating out the unwanted variables u_{k+1}, ..., u_p in f_{U₁,...,U_p}(u₁, ..., u_k, u_{k+1}, ..., u_p).
Example 10.2.8
Let X₁ and X₂ be independent and identically distributed random variables with common p.d.f.

f(x) = { 1/2, if −2 < x < −1
       { 1/6, if 0 < x < 3
       { 0, otherwise.

Find the p.d.f. of Z = |X₁| + |X₂|.

Solution. The joint p.d.f. of X = (X₁, X₂) is

f_X(x₁, x₂) = { 1/4, if (x₁, x₂) ∈ (−2, −1) × (−2, −1)
              { 1/12, if (x₁, x₂) ∈ ((−2, −1) × (0, 3)) ∪ ((0, 3) × (−2, −1))
              { 1/36, if (x₁, x₂) ∈ (0, 3) × (0, 3)
              { 0, otherwise,

with support

S_X = {(x₁, x₂) ∈ ℝ²: f_X(x₁, x₂) > 0} = S₁ ∪ S₂ ∪ S₃ ∪ S₄,

where S₁ = (−2, −1)², S₂ = (−2, −1) × (0, 3), S₃ = (0, 3) × (−2, −1) and S₄ = (0, 3)².
Consider the transformation h = (h₁, h₂): ℝ² → ℝ², where h₁(x₁, x₂) = |x₁| + |x₂| and h₂(x₁, x₂) = |x₁|, so that Z = h₁(X₁, X₂) and W = h₂(X₁, X₂), with W an auxiliary variable. On each S_i, h is one-to-one. Writing (z, w) for the image point, the inverse transformations and Jacobian determinants are:

on S₁: h_{1,1}^{−1}(z, w) = −w, h_{2,1}^{−1}(z, w) = −(z − w), J₁ = |  0  −1 | = −1;
                                                                  | −1   1 |

on S₂: h_{1,2}^{−1}(z, w) = −w, h_{2,2}^{−1}(z, w) = z − w, J₂ = | 0  −1 | = 1;
                                                                 | 1  −1 |

on S₃: h_{1,3}^{−1}(z, w) = w, h_{2,3}^{−1}(z, w) = −(z − w), J₃ = |  0   1 | = 1;
                                                                   | −1   1 |

on S₄: h_{1,4}^{−1}(z, w) = w, h_{2,4}^{−1}(z, w) = z − w, J₄ = | 0   1 | = −1.
                                                                | 1  −1 |

Also, for example,

h(S₁) = {(z, w) ∈ ℝ²: −2 < −w < −1, −2 < w − z < −1} = {(z, w): 1 < w < 2, w + 1 < z < w + 2},

and similarly h(S₂) = {(z, w): 1 < w < 2, w < z < w + 3}, h(S₃) = {(z, w): 0 < w < 3, w + 1 < z < w + 2} and h(S₄) = {(z, w): 0 < w < 3, w < z < w + 3}. By Theorem 10.2.2,

f_{Z,W}(z, w) = Σ_{i=1}^{4} f_X(h_{1,i}^{−1}(z, w), h_{2,i}^{−1}(z, w)) |J_i| I_{h(S_i)}(z, w).
Collecting the contributions (1/4 from S₁, 1/12 from each of S₂ and S₃, 1/36 from S₄), we obtain

f_{Z,W}(z, w) = { 1/36, if w < z < w + 1, 0 < w < 1
                { 1/12 + 1/36, if w + 1 < z < w + 2, 0 < w < 1
                { 1/36, if w + 2 < z < w + 3, 0 < w < 1
                { 1/12 + 1/36, if w < z < w + 1, 1 < w < 2
                { 1/4 + 1/12 + 1/12 + 1/36, if w + 1 < z < w + 2, 1 < w < 2
                { 1/12 + 1/36, if w + 2 < z < w + 3, 1 < w < 2
                { 1/36, if w < z < w + 1, 2 < w < 3
                { 1/12 + 1/36, if w + 1 < z < w + 2, 2 < w < 3
                { 1/36, if w + 2 < z < w + 3, 2 < w < 3
                { 0, otherwise.

Equivalently, expressing the regions in terms of z,

f_{Z,W}(z, w) = { 1/36, if {0 < z < 2, max{0, z − 1} < w < min{1, z}}
                        or {2 < z < 4, max{0, z − 3} < w < min{1, z − 2}}
                        or {2 < z < 4, max{2, z − 1} < w < min{3, z}}
                        or {4 < z < 6, max{2, z − 3} < w < min{3, z − 2}}
                { 1/9, if {1 < z < 3, max{0, z − 2} < w < min{1, z − 1}}
                       or {1 < z < 3, max{1, z − 1} < w < min{2, z}}
                       or {3 < z < 5, max{1, z − 3} < w < min{2, z − 2}}
                       or {3 < z < 5, max{2, z − 2} < w < min{3, z − 1}}
                { 4/9, if 2 < z < 4, max{1, z − 2} < w < min{2, z − 1}
                { 0, otherwise.
The p.d.f. of Z is obtained by integrating out w. For z ∈ (0, 1),

f_Z(z) = (min{1, z} − max{0, z − 1})/36 = z/36;

for z ∈ (1, 2),

f_Z(z) = (min{1, z} − max{0, z − 1})/36 + (min{1, z − 1} − max{0, z − 2})/9 + (min{2, z} − max{1, z − 1})/9
       = (2 − z)/36 + 2(z − 1)/9 = (7z − 6)/36;

the intervals (2, 3), (3, 4) and (4, 5) are handled in the same way. For z ∈ (5, 6),

f_Z(z) = (min{3, z − 2} − max{2, z − 3})/36 = (6 − z)/36.

Therefore the p.d.f. of Z = |X₁| + |X₂| is given by

f_Z(z) = { z/36, if 0 < z < 1
         { (7z − 6)/36, if 1 < z < 2
         { (5z − 6)/18, if 2 < z < 3
         { (24 − 5z)/18, if 3 < z < 4
         { (36 − 7z)/36, if 4 < z < 5
         { (6 − z)/36, if 5 < z < 6
         { 0, otherwise.
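A useful consistency check on a piecewise answer like this is that the pieces join continuously and integrate to one; a simple midpoint rule suffices since each piece is linear:

```python
def f_z(z):
    # piecewise p.d.f. of Z = |X1| + |X2| from Example 10.2.8
    if 0 < z < 1:
        return z / 36
    if 1 < z < 2:
        return (7 * z - 6) / 36
    if 2 < z < 3:
        return (5 * z - 6) / 18
    if 3 < z < 4:
        return (24 - 5 * z) / 18
    if 4 < z < 5:
        return (36 - 7 * z) / 36
    if 5 < z < 6:
        return (6 - z) / 36
    return 0.0

n = 60000
h = 6.0 / n
total = sum(f_z((i + 0.5) * h) for i in range(n)) * h
assert abs(total - 1.0) < 1e-6          # integrates to 1
for b in (1, 2, 3, 4, 5):
    assert abs(f_z(b - 1e-9) - f_z(b + 1e-9)) < 1e-6   # continuous at breakpoints
```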
Example 10.2.9
Let X₁, ..., X_n be independent and identically distributed random variables of absolutely continuous type with common distribution function F(·) and common p.d.f. f(·), and let Y = (Y₁, ..., Y_n), where Y₁ ≤ Y₂ ≤ ⋯ ≤ Y_n are the order statistics of X₁, ..., X_n.
(i) Find an expression for the joint distribution function of Y = (Y₁, ..., Y_n). Hence find the joint p.d.f. of Y;
(ii) Find the joint p.d.f. of Y using Theorem 10.2.2, and hence obtain the marginal p.d.f. of each order statistic and the joint p.d.f. of a pair of order statistics.

Solution.
(i) Let P_n = {Π₁, ⋯, Π_{n!}} denote the set of all permutations of (1, ..., n); here, for i ∈ {1, ..., n!}, Π_i = (Π_i(1), ..., Π_i(n)) is a permutation of (1, ..., n). Splitting the event {Y₁ ≤ y₁, ..., Y_n ≤ y_n} according to the n! (mutually exclusive and almost surely exhaustive) orderings X_{Π_i(1)} < ⋯ < X_{Π_i(n)}, the joint distribution function of Y can be written as a sum of n! multiple integrals of ∏_{j=1}^{n} f over regions contained in

A = {x ∈ ℝ^n: −∞ < x₁ < ⋯ < x_n < ∞}.

Differentiating, each of the n! terms contributes ∏_{j=1}^{n} f(y_j) on A, so that the joint p.d.f. of Y is n! ∏_{j=1}^{n} f(y_j) for y ∈ A, and 0 otherwise.

(ii) Since X is of absolutely continuous type we may, without loss of generality, take S_X ⊆ {x ∈ ℝ^n: x_i ≠ x_j, ∀ i ≠ j, i, j ∈ {1, ..., n}}. Then S_X = ⋃_{i=1}^{n!} S_i, where S_i = {x ∈ S_X: x_{Π_i(1)} < x_{Π_i(2)} < ⋯ < x_{Π_i(n)}}, i = 1, ..., n!. On each S_i the transformation h(x) = (x_{Π_i(1)}, ..., x_{Π_i(n)}), which maps x to its ordered coordinates, is one-to-one, with inverse given by the inverse permutation (Π_i^{−1}(1), ..., Π_i^{−1}(n)) of Π_i. Under the notation of Theorem 10.2.2, each row and each column of the Jacobian determinant J_i contains one, and only one, non-zero element, which is 1. Therefore J_i = ±1, i = 1, ..., n!. Also h(S_i) = {y ∈ ℝ^n: −∞ < y₁ < ⋯ < y_n < ∞} = B, say, i = 1, ..., n!. Therefore the joint p.d.f. of Y is given by

f_Y(y) = Σ_{i=1}^{n!} ( ∏_{j=1}^{n} f(y_{Π_i^{−1}(j)}) ) I_B(y).

Since {Π_i^{−1}(1), ..., Π_i^{−1}(n)} = {1, ..., n}, we have

∏_{j=1}^{n} f(y_{Π_i^{−1}(j)}) = ∏_{j=1}^{n} f(y_j), ∀ y ∈ B.

Consequently

f_Y(y) = ( Σ_{i=1}^{n!} ∏_{j=1}^{n} f(y_j) ) I_B(y) = n! ( ∏_{j=1}^{n} f(y_j) ) I_B(y)
       = { n! ∏_{j=1}^{n} f(y_j), if −∞ < y₁ < ⋯ < y_n < ∞
         { 0, otherwise.

To obtain the marginal p.d.f. of the j-th order statistic Y_j, integrate out the remaining variables. The integral over {y₁ < ⋯ < y_{j−1} < y} equals [F(y)]^{j−1}/(j − 1)! and the integral over {y < y_{j+1} < ⋯ < y_n} equals [1 − F(y)]^{n−j}/(n − j)!, so that

f_{Y_j}(y) = (n!/((j − 1)! (n − j)!)) [F(y)]^{j−1} [1 − F(y)]^{n−j} f(y), −∞ < y < ∞.
Similarly, for 1 ≤ j < k ≤ n, integrating out the remaining n − 2 variables gives the joint p.d.f. of (Y_j, Y_k): for x < y,

f_{Y_j,Y_k}(x, y) = (n!/((j − 1)! (k − j − 1)! (n − k)!)) [F(x)]^{j−1} [F(y) − F(x)]^{k−j−1} [1 − F(y)]^{n−k} f(x) f(y).

Clearly f_{Y_j,Y_k}(x, y) = 0, if x ≥ y.
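For a U(0, 1) sample (F(x) = x, f(x) = 1 on (0, 1)), the marginal formula reduces to a Beta(j, n − j + 1) density; a numerical check that each marginal integrates to 1 and has mean j/(n + 1) (a known fact for uniform order statistics) is a sketch along these lines:

```python
from math import factorial, isclose

def uniform_order_stat_pdf(x, j, n):
    # n!/((j-1)!(n-j)!) * F(x)^(j-1) * (1-F(x))^(n-j) * f(x) with F(x)=x, f(x)=1
    c = factorial(n) // (factorial(j - 1) * factorial(n - j))
    return c * x ** (j - 1) * (1 - x) ** (n - j)

n, m = 7, 50000
h = 1.0 / m
for j in range(1, n + 1):
    total = sum(uniform_order_stat_pdf((i + 0.5) * h, j, n) for i in range(m)) * h
    assert isclose(total, 1.0, abs_tol=1e-6)
    mean = sum((i + 0.5) * h * uniform_order_stat_pdf((i + 0.5) * h, j, n) for i in range(m)) * h
    assert isclose(mean, j / (n + 1), abs_tol=1e-6)   # mean of j-th uniform order statistic
```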
Example 10.2.10
Let X₁, ..., X_n be independent and identically distributed random variables with common p.d.f.

f(x) = { (1/θ) e^{−x/θ}, if x > 0
       { 0, otherwise,

where θ > 0. For notational convenience let Y_i = X_{(i)}, i = 1, ..., n, denote the order statistics. Then, by Example 10.2.9, a joint p.d.f. of Y = (Y₁, ..., Y_n) is

f_Y(y) = { (n!/θ^n) e^{−Σ_{i=1}^{n} y_i/θ}, if 0 < y₁ < y₂ < ⋯ < y_n < ∞
         { 0, otherwise.

The support of f_Y(·) is S_Y = {y ∈ ℝ^n: 0 < y₁ < y₂ < ⋯ < y_n < ∞}. Consider the transformation h = (h₁, ..., h_n): ℝ^n → ℝ^n, where h₁(y) = n y₁ and h_i(y) = (n − i + 1)(y_i − y_{i−1}), i = 2, ..., n. Then Z₁ = h₁(Y) and Z_i = h_i(Y), i = 2, ..., n. Clearly the transformation h: S_Y → ℝ^n is one-to-one with inverse transformation h^{−1} = (h₁^{−1}, ..., h_n^{−1}), where, for z ∈ h(S_Y),

h₁^{−1}(z) = z₁/n,
h₂^{−1}(z) = z₁/n + z₂/(n − 1),
⋮
h_i^{−1}(z) = z₁/n + z₂/(n − 1) + ⋯ + z_i/(n − i + 1) = Σ_{j=1}^{i} z_j/(n − j + 1),
⋮
h_n^{−1}(z) = z₁/n + z₂/(n − 1) + ⋯ + z_{n−1}/2 + z_n/1 = Σ_{j=1}^{n} z_j/(n − j + 1).

The Jacobian determinant is lower triangular:

J = | 1/n      0         0        ⋯   0 |
    | 1/n    1/(n−1)     0        ⋯   0 |
    | 1/n    1/(n−1)   1/(n−2)    ⋯   0 |  =  1/n!.
    |  ⋮        ⋮          ⋮           ⋮ |
    | 1/n    1/(n−1)   1/(n−2)    ⋯   1 |

Also

z ∈ h(S_Y) ⇔ 0 < z₁/n < z₁/n + z₂/(n − 1) < ⋯ < z₁/n + z₂/(n − 1) + ⋯ + z_n < ∞
           ⇔ z_i > 0, i = 1, ..., n.

Therefore, by Corollary 10.2.1, the joint p.d.f. of Z = (Z₁, ..., Z_n) is

f_Z(z) = (n!/θ^n) e^{−Σ_{j=1}^{n} h_j^{−1}(z)/θ} × (1/n!) × I_{(0,∞)^n}(z).

We have, for z ∈ (0, ∞)^n,

Σ_{i=1}^{n} h_i^{−1}(z) = Σ_{i=1}^{n} Σ_{j=1}^{i} z_j/(n − j + 1) = Σ_{j=1}^{n} Σ_{i=j}^{n} z_j/(n − j + 1) = Σ_{j=1}^{n} z_j.

Consequently

f_Z(z) = ∏_{j=1}^{n} (1/θ) e^{−z_j/θ} I_{(0,∞)}(z_j).

It follows that Z₁, ..., Z_n are independent and identically distributed Exp(θ) random variables.
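The algebra of the spacings transformation (round trip, the sum identity, and the triangular Jacobian determinant 1/n!) can be verified mechanically; the particular z vector below is an arbitrary positive test point:

```python
from math import factorial, isclose

n = 6

def h_forward(y):
    # z_1 = n y_1, z_i = (n - i + 1)(y_i - y_{i-1}), i = 2, ..., n
    z = [n * y[0]]
    for i in range(1, n):
        z.append((n - i) * (y[i] - y[i - 1]))
    return z

def h_inverse(z):
    # y_i = sum_{j <= i} z_j / (n - j + 1)
    y, s = [], 0.0
    for j, zj in enumerate(z, start=1):
        s += zj / (n - j + 1)
        y.append(s)
    return y

z = [0.7, 1.3, 0.2, 2.5, 0.9, 1.1]
y = h_inverse(z)
assert all(isclose(a, b, rel_tol=1e-12) for a, b in zip(h_forward(y), z))
assert all(y[i] < y[i + 1] for i in range(n - 1))    # inverse lands in 0 < y_1 < ... < y_n
assert isclose(sum(y), sum(z), rel_tol=1e-12)        # sum identity used in the exponent
# lower-triangular Jacobian: determinant = product of diagonal entries 1/n, ..., 1/1
diag_prod = 1.0
for i in range(1, n + 1):
    diag_prod *= 1.0 / (n - i + 1)
assert isclose(diag_prod, 1.0 / factorial(n), rel_tol=1e-12)
```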
Example 10.2.11
Let X₁ and X₂ be independent random variables such that X_i ~ Gamma(α_i, θ), α_i > 0, θ > 0, i = 1, 2. Define U = X₁ + X₂ and V = X₁/(X₁ + X₂). Show that U and V are independently distributed, with U ~ Gamma(α₁ + α₂, θ) and V ~ Beta(α₁, α₂).

Solution. The p.d.f.s of X_i, i = 1, 2, and of X = (X₁, X₂) are

f_{X_i}(x_i) = (1/(Γ(α_i) θ^{α_i})) x_i^{α_i − 1} e^{−x_i/θ} I_{(0,∞)}(x_i), i = 1, 2,

and

f_X(x₁, x₂) = ∏_{i=1}^{2} f_{X_i}(x_i) = (1/(Γ(α₁) Γ(α₂) θ^{α₁+α₂})) x₁^{α₁−1} x₂^{α₂−1} e^{−(x₁+x₂)/θ} I_{(0,∞)²}(x₁, x₂),

respectively. Clearly S_X = {x ∈ ℝ²: f_X(x₁, x₂) > 0} = (0, ∞)². Consider the transformation h = (h₁, h₂): ℝ² → ℝ² defined by

h₁(x₁, x₂) = x₁ + x₂ and h₂(x₁, x₂) = { x₁/(x₁ + x₂), if x₁ + x₂ ≠ 0
                                       { 0, if x₁ + x₂ = 0.

Then P({(U, V) = (h₁(X₁, X₂), h₂(X₁, X₂))}) = 1 and therefore (U, V) = (h₁(X₁, X₂), h₂(X₁, X₂)). On S_X the transformation h is one-to-one with inverse h^{−1}(u, v) = (uv, u(1 − v)) and Jacobian determinant

J = | ∂h₁^{−1}/∂u  ∂h₁^{−1}/∂v |  =  |   v      u  |  = −uv − u(1 − v) = −u.
    | ∂h₂^{−1}/∂u  ∂h₂^{−1}/∂v |     | 1 − v   −u  |

Also

(u, v) ∈ h(S_X) ⇔ uv > 0 and u(1 − v) > 0 ⇔ u > 0 and 0 < v < 1.

Therefore, by Corollary 10.2.1, the joint p.d.f. of (U, V) is

f_{U,V}(u, v) = f_X(uv, u(1 − v)) |−u| I_{h(S_X)}(u, v)
             = { (u^{α₁+α₂−1} e^{−u/θ}/(Γ(α₁ + α₂) θ^{α₁+α₂})) I_{(0,∞)}(u) } × { (v^{α₁−1} (1 − v)^{α₂−1}/B(α₁, α₂)) I_{(0,1)}(v) }.

It follows that U and V are independent, with U ~ Gamma(α₁ + α₂, θ) and V ~ Beta(α₁, α₂).
Example 10.2.12
(i) Let X = (X₁, X₂) be a random vector of absolutely continuous type with joint p.d.f. of the form f_X(x₁, x₂) = g(√(x₁² + x₂²)), (x₁, x₂) ∈ ℝ². Let (R, Θ) be the polar coordinates of the point X = (X₁, X₂) in the Cartesian plane, so that X₁ = R cos Θ, X₂ = R sin Θ, R = √(X₁² + X₂²), and one may take

Θ = { 0, if X₁ = 0, X₂ = 0
    { π/2, if X₁ = 0, X₂ > 0
    { 3π/2, if X₁ = 0, X₂ < 0
    { tan^{−1}(X₂/X₁), if X₁ > 0, X₂ ≥ 0
    { π + tan^{−1}(X₂/X₁), if X₁ < 0
    { 2π + tan^{−1}(X₂/X₁), if X₁ > 0, X₂ < 0,

where tan^{−1}(a) ∈ (−π/2, π/2) denotes the principal value. Show that R and Θ are independently distributed with p.d.f.s

f_R(r) = 2π r g(r) I_{A₁}(r), where A₁ = {r ∈ [0, ∞): g(r) > 0},
and
f_Θ(θ) = (1/(2π)) I_{[0,2π)}(θ),

respectively.

(ii) Let X₁ and X₂ be independent and identically distributed N(0, 1) random variables. Show that the random variable Z = X₂/X₁ has the standard Cauchy distribution.

(iii) Let X = (X₁, X₂) be a random vector of absolutely continuous type with joint p.d.f.

f_X(x₁, x₂) = { 1/π, if 0 < x₁² + x₂² < 1
             { 0, otherwise,

i.e., X is uniformly distributed on the unit disk. Find E(R) and E(X₁ + X₂).
Solution.
(i) Let S_X = {x ∈ ℝ²: f_X(x) > 0} = {x ∈ ℝ²: g(√(x₁² + x₂²)) > 0}. Consider the transformation h = (h₁, h₂): ℝ² → ℝ², defined by h₁(x₁, x₂) = √(x₁² + x₂²) and

h₂(x₁, x₂) = { 0, if x₁ = 0, x₂ = 0
             { π/2, if x₁ = 0, x₂ > 0
             { 3π/2, if x₁ = 0, x₂ < 0
             { tan^{−1}(x₂/x₁), if x₁ > 0, x₂ ≥ 0
             { π + tan^{−1}(x₂/x₁), if x₁ < 0
             { 2π + tan^{−1}(x₂/x₁), if x₁ > 0, x₂ < 0.

On S_X the transformation h is one-to-one (outside an event of probability zero) with inverse h^{−1}(r, θ) = (r cos θ, r sin θ) and Jacobian determinant

J = | ∂h₁^{−1}/∂r  ∂h₁^{−1}/∂θ |  =  | cos θ   −r sin θ |  = r.
    | ∂h₂^{−1}/∂r  ∂h₂^{−1}/∂θ |     | sin θ    r cos θ |

Also h(S_X) = {(r, θ) ∈ ℝ²: r ∈ [0, ∞), θ ∈ [0, 2π) and g(r) > 0} = A₁ × A₂, where A₁ = {r ∈ [0, ∞): g(r) > 0} and A₂ = [0, 2π). The joint p.d.f. of (R, Θ) is given by

f_{R,Θ}(r, θ) = f_X(r cos θ, r sin θ) |J| I_{h(S_X)}(r, θ)
             = r g(r) I_{A₁}(r) I_{[0,2π)}(θ)
             = {2π r g(r) I_{A₁}(r)} × {(1/(2π)) I_{[0,2π)}(θ)}.

It follows that R and Θ are independent random variables with respective p.d.f.s

f_R(r) = 2π r g(r) I_{A₁}(r)
and
f_Θ(θ) = (1/(2π)) I_{[0,2π)}(θ).
(ii) Note that Z = X₂/X₁ is not defined if X₁ = 0. However P({X₁ = 0}) = 0 (i.e., P({X₁ ≠ 0}) = 1) and therefore Z = X₂/X₁ is well defined with probability one. In fact, since X = (X₁, X₂) is of absolutely continuous type, we may, without loss of generality, take S_X = ℝ² − {(x₁, x₂) ∈ ℝ²: x₁ = 0}. Here f_X(x₁, x₂) = (1/(2π)) e^{−(x₁²+x₂²)/2} = g(√(x₁² + x₂²)) with g(r) = (1/(2π)) e^{−r²/2}, so that, by (i), Θ has the p.d.f.

f_Θ(θ) = { 1/(2π), if 0 ≤ θ < 2π
         { 0, otherwise,

and X₂/X₁ = tan Θ. Define

U = { tan Θ, if Θ ∈ [0, 2π) − {0, π/2, 3π/2}
    { 0, otherwise.

Then P({Z = U}) = 1 and therefore Z and U have the same distribution. Thus we will find the distribution of the random variable U. Consider the transformation h: ℝ → ℝ defined by

h(θ) = { tan θ, if θ ∈ [0, 2π) − {0, π/2, 3π/2}
       { 0, otherwise.

We have

S_Θ = {θ ∈ ℝ: f_Θ(θ) > 0} = [0, 2π) ⊇ [0, 2π) − {0, π/2, 3π/2} = B₁ ∪ B₂ ∪ B₃, say,

where B₁ = (0, π/2), B₂ = (π/2, 3π/2) and B₃ = (3π/2, 2π). On each of the sets B₁, B₂ and B₃, h is strictly increasing with inverse transformations

h₁^{−1}(u) = tan^{−1} u, h₂^{−1}(u) = π + tan^{−1} u and h₃^{−1}(u) = 2π + tan^{−1} u,

and (d/du) h_j^{−1}(u) = 1/(1 + u²), j = 1, 2, 3. Also h(B₁) = (0, ∞), h(B₂) = (−∞, ∞) and h(B₃) = (−∞, 0). Therefore the p.d.f. of U is given by

f_U(u) = Σ_{j=1}^{3} f_Θ(h_j^{−1}(u)) |(d/du) h_j^{−1}(u)| I_{h(B_j)}(u)
       = (1/(2π)) (1/(1 + u²)) I_{(0,∞)}(u) + (1/(2π)) (1/(1 + u²)) I_{(−∞,∞)}(u) + (1/(2π)) (1/(1 + u²)) I_{(−∞,0)}(u)
       = { (1/π) (1/(1 + u²)), if u ∈ ℝ − {0}
         { 1/(2π), if u = 0.

Since the random variable U is of absolutely continuous type we may take the p.d.f. of U as

f_U(u) = (1/π) (1/(1 + u²)), −∞ < u < ∞.

It follows that the random variable U (and hence Z = X₂/X₁) has the standard Cauchy distribution (see Definition 3.2, Module 5).
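The conclusion that tan Θ with Θ ~ U[0, 2π) is standard Cauchy can be checked numerically: accumulate P(tan Θ ≤ u) on a fine midpoint grid of [0, 2π) and compare with the Cauchy c.d.f. 1/2 + tan^{−1}(u)/π. The grid size is an arbitrary accuracy/runtime trade-off.

```python
from math import atan, pi, tan

def cdf_tan_uniform(u, m=200_000):
    # P(tan(Theta) <= u) for Theta ~ U[0, 2*pi), via a midpoint grid
    hits = 0
    for i in range(m):
        th = (i + 0.5) * 2.0 * pi / m   # midpoints never hit 0, pi/2 or 3*pi/2
        if tan(th) <= u:
            hits += 1
    return hits / m

for u in (-2.0, -0.3, 0.0, 1.5):
    assert abs(cdf_tan_uniform(u) - (0.5 + atan(u) / pi)) < 1e-3
```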
(iii) We have f_X(x₁, x₂) = g(√(x₁² + x₂²)) with

g(t) = { 1/π, if 0 < t < 1
       { 0, otherwise.

By (i), R and Θ are independent with p.d.f.s

f_R(r) = 2π r g(r) I_{(0,1)}(r) = 2r I_{(0,1)}(r)
and
f_Θ(θ) = (1/(2π)) I_{[0,2π)}(θ).

Therefore

E(R) = ∫₀¹ 2r² dr = 2/3,

and

E(X₁ + X₂) = E(R(cos Θ + sin Θ)) = E(R) E(cos Θ + sin Θ) = ∫₀^{2π} (2/3)(cos θ + sin θ)/(2π) dθ = 0.
Let X = (X₁, ..., X_p) be a random vector with p.d.f./p.m.f. f_X(·) and let g: ℝ^p → ℝ^q be a Borel function. Suppose that we seek the probability distribution of Z = g(X). Under the m.g.f. technique we try to identify the m.g.f. M_Z(·) of the random vector Z with the m.g.f. of some known distribution. Then the uniqueness of m.g.f.s (Theorem 7.3) ascertains that the random vector Z has that known distribution. Various usages of this technique are illustrated in the examples below.

Theorem 10.3.1
Let X₁, ..., X_n be independent and identically distributed N(μ, σ²) random variables, and let X̄ = (1/n) Σ_{i=1}^{n} X_i and S² = (1/(n − 1)) Σ_{i=1}^{n} (X_i − X̄)² denote the sample mean and the sample variance, respectively. Then:
(i) X̄ ~ N(μ, σ²/n);
(ii) X̄ and S² are independent random variables;
(iii) (n − 1)S²/σ² ~ χ²_{n−1}.

Proof.
(ii) Note that (n − 1)S² = Σ_{i=1}^{n} (X_i − X̄)² = Σ_{i=1}^{n} Y_i², a function of Y = (Y₁, ..., Y_n) = (X₁ − X̄, ..., X_n − X̄). The joint m.g.f. of (Y, X̄) is M_{Y,X̄}(t, s) = E(e^{Σ_{i=1}^{n} t_i Y_i + s X̄}), (t, s) ∈ ℝ^n × ℝ. With t̄ = (1/n) Σ_{j=1}^{n} t_j we have

Σ_{i=1}^{n} t_i Y_i + s X̄ = Σ_{i=1}^{n} t_i X_i − (Σ_{i=1}^{n} t_i) X̄ + s X̄
                         = Σ_{j=1}^{n} t_j X_j + (s − Σ_{j=1}^{n} t_j) (1/n) Σ_{j=1}^{n} X_j
                         = Σ_{j=1}^{n} (t_j − t̄ + s/n) X_j
                         = Σ_{j=1}^{n} a_j X_j, say,

where a_j = t_j − t̄ + s/n, j = 1, ..., n, and

Σ_{j=1}^{n} a_j = Σ_{j=1}^{n} (t_j − t̄ + s/n) = s.

Consequently, using the independence of X₁, ..., X_n and the m.g.f. of the N(μ, σ²) distribution,

M_{Y,X̄}(t, s) = E( ∏_{j=1}^{n} e^{a_j X_j} )
             = ∏_{j=1}^{n} M_{X_j}(a_j)
             = ∏_{j=1}^{n} e^{μ a_j + σ² a_j²/2}
             = e^{μ Σ_{j=1}^{n} a_j + (σ²/2) Σ_{j=1}^{n} a_j²}
             = e^{μ s + σ² s²/(2n)} e^{(σ²/2) Σ_{j=1}^{n} (t_j − t̄)²}, t ∈ ℝ^n, s ∈ ℝ,

since Σ_{j=1}^{n} a_j² = Σ_{j=1}^{n} (t_j − t̄)² + s²/n. Clearly

M_{Y,X̄}(t, 0) = e^{(σ²/2) Σ_{j=1}^{n} (t_j − t̄)²}, t ∈ ℝ^n, and M_{Y,X̄}(0, s) = e^{μ s + σ² s²/(2n)}, s ∈ ℝ,

so that M_{Y,X̄}(t, s) = M_{Y,X̄}(t, 0) M_{Y,X̄}(0, s), ∀ (t, s). Now using Theorem 7.1 it follows that Y = (X₁ − X̄, ..., X_n − X̄) and X̄ are independent. This in turn implies that, for any Borel functions Ψ₁(∙) and Ψ₂(∙), Ψ₁(Y) and Ψ₂(X̄) are independent. In particular, it follows that S² (a function of Y) and X̄ are independent
random variables.

(iii) Furthermore, by (i) and Theorem 4.1 (ii), Module 5, √n(X̄ − μ)/σ ~ N(0, 1). Let

W = (n − 1)S²/σ² and Z = n(X̄ − μ)²/σ².

Then, by (ii), W and Z are independent random variables. Also, by Example 7.6 (ii), Z ~ χ²₁ and T = Σ_{i=1}^{n} (X_i − μ)²/σ² ~ χ²_n. Thus the m.g.f.s of Z and T are

M_Z(t) = (1 − 2t)^{−1/2}, t < 1/2,
and
M_T(t) = (1 − 2t)^{−n/2}, t < 1/2.

Also

T = Σ_{i=1}^{n} (X_i − μ)²/σ²
  = Σ_{i=1}^{n} ((X_i − X̄) + (X̄ − μ))²/σ²
  = Σ_{i=1}^{n} (X_i − X̄)²/σ² + n(X̄ − μ)²/σ²
  = W + Z.

Since W and Z are independent, M_T(t) = M_W(t) M_Z(t), so that

M_W(t) = M_T(t)/M_Z(t) = (1 − 2t)^{−n/2}/(1 − 2t)^{−1/2} = (1 − 2t)^{−(n−1)/2}, t < 1/2,

which is the m.g.f. of the χ²_{n−1} distribution. By the uniqueness of m.g.f.s (Theorem 7.3), W = (n − 1)S²/σ² ~ χ²_{n−1}.
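The division step M_W(t) = M_T(t)/M_Z(t) is pure algebra, but a numerical spot check over several (n, t) pairs guards against sign or exponent slips:

```python
from math import isclose

# M_T(t) = (1-2t)^(-n/2) and M_Z(t) = (1-2t)^(-1/2); their ratio must be the
# chi-square(n-1) m.g.f. (1-2t)^(-(n-1)/2) for every t < 1/2
for n in (2, 5, 12):
    for t in (-1.0, 0.1, 0.3, 0.49):
        m_t = (1.0 - 2.0 * t) ** (-n / 2.0)
        m_z = (1.0 - 2.0 * t) ** (-0.5)
        assert isclose(m_t / m_z, (1.0 - 2.0 * t) ** (-(n - 1) / 2.0), rel_tol=1e-12)
```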
Using (iii) we can obtain the moments of S. Since W = (n − 1)S²/σ² ~ χ²_{n−1}, for k > −(n − 1),

E(S^k) = E( (σ² W/(n − 1))^{k/2} ) = (σ²/(n − 1))^{k/2} E(W^{k/2}),

and

E(W^{k/2}) = ∫₀^∞ w^{k/2} (1/(2^{(n−1)/2} Γ((n − 1)/2))) w^{(n−1)/2 − 1} e^{−w/2} dw
           = (1/(2^{(n−1)/2} Γ((n − 1)/2))) ∫₀^∞ w^{(n−1+k)/2 − 1} e^{−w/2} dw
           = 2^{k/2} Γ((n − 1 + k)/2)/Γ((n − 1)/2),

which is finite if, and only if, k > −(n − 1). Therefore

E(S^k) = (2σ²/(n − 1))^{k/2} Γ((n − 1 + k)/2)/Γ((n − 1)/2), k > −(n − 1).

In particular,

E(S) = √(2/(n − 1)) (Γ(n/2)/Γ((n − 1)/2)) σ,

E(S²) = (2/(n − 1)) (Γ((n − 1)/2 + 1)/Γ((n − 1)/2)) σ² = σ²,

E(S⁴) = (2/(n − 1))² (Γ((n − 1)/2 + 2)/Γ((n − 1)/2)) σ⁴ = ((n + 1)/(n − 1)) σ⁴

and

Var(S²) = E(S⁴) − (E(S²))² = 2σ⁴/(n − 1).

▄
First we will introduce two new probability distributions, called the Student t-distribution and the Snedecor F-distribution, which arise as probability distributions of various statistics based on random samples from normal distributions.

Definition 11.1
(i) For a given positive integer n, a random variable X is said to have the Student t-distribution with n degrees of freedom (written as X ~ t_n) if the p.d.f. of X is given by

f_X(x) = (Γ((n + 1)/2)/(√(nπ) Γ(n/2))) · 1/(1 + x²/n)^{(n+1)/2}, −∞ < x < ∞.

(ii) The Student t-distribution with 1 degree of freedom is also called the standard Cauchy distribution.

(iii) For positive integers n₁ and n₂, a random variable X is said to have the Snedecor F-distribution with (n₁, n₂) degrees of freedom (written as X ~ F_{n₁,n₂}) if the p.d.f. of X is given by

f_X(x) = { ((n₁/n₂)^{n₁/2} x^{n₁/2 − 1})/(B(n₁/2, n₂/2) (1 + n₁x/n₂)^{(n₁+n₂)/2}), if x > 0
         { 0, otherwise.
Remark 11.1
The following observations are obvious:
(i) The p.d.f. of the Student t-distribution is symmetric about 0;
(ii) The p.d.f. of the standard Cauchy distribution is given by f(x) = 1/(π(1 + x²)), −∞ < x < ∞ (see Definition 3.2, Module 5); if X₁ and X₂ are independent and identically distributed N(0, 1) random variables, then X₂/X₁ follows the standard Cauchy distribution (Example 10.2.12 (ii)).

The following theorem provides representations of the Student t and the Snedecor F random variables in terms of normal and chi-squared random variables.
Theorem 11.1
(i) Let Z ~ N(0, 1) and Y ~ χ²_n (where n ∈ {1, 2, ...}) be independent random variables. Then

T = Z/√(Y/n) ~ t_n.

(ii) For positive integers n₁ and n₂, let Y₁ ~ χ²_{n₁} and Y₂ ~ χ²_{n₂} be independent random variables. Then

F = (Y₁/n₁)/(Y₂/n₂) ~ F_{n₁,n₂}.
(iii) If T ~ t_n then, for r ∈ {1, 2, ...} with r < n,

E(T^r) = { 0, if r is odd
         { (n^{r/2} r! Γ((n − r)/2))/(2^r (r/2)! Γ(n/2)), if r is even;

for r ≥ n, E(T^r) is not finite.

(iv) If T ~ t_n, then

μ = E(T) = 0, for n ∈ {2, 3, ...},
σ² = Var(T) = n/(n − 2), for n ∈ {3, 4, ...},
β₁ = coefficient of skewness = 0, for n ∈ {4, 5, ...}.

(v) Let n₁, n₂ and r be positive integers and let T ~ F_{n₁,n₂}. Then, for n₂ ∈ {1, 2, ..., 2r} and s ≥ r, E(T^s) is not finite. For n₂ ∈ {2r + 1, 2r + 2, ...},

E(T^r) = (n₂/n₁)^r ∏_{i=1}^{r} (n₁ + 2(i − 1))/(n₂ − 2i).

(vi) If T ~ F_{n₁,n₂}, then

μ = E(T) = n₂/(n₂ − 2), if n₂ ∈ {3, 4, ...},
σ² = Var(T) = (2n₂²(n₁ + n₂ − 2))/(n₁(n₂ − 2)²(n₂ − 4)), if n₂ ∈ {5, 6, ...},
β₁ = coefficient of skewness = ((2n₁ + n₂ − 2)/(n₂ − 6)) √(8(n₂ − 4)/(n₁(n₁ + n₂ − 2))), if n₂ ∈ {7, 8, ...}.
Proof.
(i) Since Z ~ N(0, 1) and Y ~ χ²_n are independent, the joint p.d.f. of (Y, Z) is

f_{Y,Z}(y, z) = f_Y(y) f_Z(z) = { (1/(√π 2^{(n+1)/2} Γ(n/2))) y^{n/2 − 1} e^{−(y + z²)/2}, if (y, z) ∈ (0, ∞) × ℝ
                                { 0, otherwise.

Clearly S_{Y,Z} = {(y, z) ∈ ℝ²: f_{Y,Z}(y, z) > 0} = (0, ∞) × ℝ. Consider the transformation (t, w) = (z/√(y/n), y), which is one-to-one on S_{Y,Z} with inverse transformation (y, z) = (w, t√(w/n)) and Jacobian determinant

J = | ∂y/∂t  ∂y/∂w |  =  |   0          1        |  = −√(w/n),
    | ∂z/∂t  ∂z/∂w |     | √(w/n)   t/(2√(nw))   |

so that |J| = √(w/n). Also h(S_{Y,Z}) = ℝ × (0, ∞). Therefore, by Corollary 10.2.1, for (t, w) ∈ ℝ × (0, ∞),

f_{T,W}(t, w) = f_{Y,Z}(w, t√(w/n)) √(w/n)
             = (1/(√(nπ) 2^{(n+1)/2} Γ(n/2))) w^{(n+1)/2 − 1} e^{−(1 + t²/n) w/2}.

Consequently, for t ∈ ℝ,

f_T(t) = ∫₀^∞ f_{T,W}(t, w) dw
       = (1/(√(nπ) 2^{(n+1)/2} Γ(n/2))) ∫₀^∞ w^{(n+1)/2 − 1} e^{−(1 + t²/n) w/2} dw
       = (1/(√(nπ) 2^{(n+1)/2} Γ(n/2))) Γ((n + 1)/2) (2/(1 + t²/n))^{(n+1)/2}
       = (Γ((n + 1)/2)/(√(nπ) Γ(n/2))) · 1/(1 + t²/n)^{(n+1)/2},

i.e., T = Z/√(Y/n) ~ t_n.
(ii) Since Y₁ ~ χ²_{n₁} and Y₂ ~ χ²_{n₂} are independent, the joint p.d.f. of (Y₁, Y₂) is

f_{Y₁,Y₂}(y₁, y₂) = (1/(2^{(n₁+n₂)/2} Γ(n₁/2) Γ(n₂/2))) y₁^{n₁/2 − 1} y₂^{n₂/2 − 1} e^{−(y₁+y₂)/2} I_{(0,∞)²}(y₁, y₂).

Consider the transformation (s, w) = ((y₁/n₁)/(y₂/n₂), y₂), which is one-to-one on (0, ∞)² with inverse (y₁, y₂) = ((n₁/n₂) s w, w) and Jacobian determinant

J = | ∂y₁/∂s  ∂y₁/∂w |  =  | (n₁/n₂)w   (n₁/n₂)s |  = (n₁/n₂) w.
    | ∂y₂/∂s  ∂y₂/∂w |     |    0          1     |

Also

h(S_{Y₁,Y₂}) = {(s, w) ∈ ℝ²: (n₁/n₂) s w > 0, w > 0} = (0, ∞)².

Therefore

f_{S,W}(s, w) = f_{Y₁,Y₂}((n₁/n₂) s w, w) (n₁/n₂) w I_{(0,∞)²}(s, w),

and, for 0 < s < ∞,

f_S(s) = ((n₁/n₂)^{n₁/2} s^{n₁/2 − 1}/(2^{(n₁+n₂)/2} Γ(n₁/2) Γ(n₂/2))) ∫₀^∞ w^{(n₁+n₂)/2 − 1} e^{−(1 + n₁s/n₂) w/2} dw
       = (Γ((n₁ + n₂)/2) (n₁/n₂)^{n₁/2} s^{n₁/2 − 1})/(Γ(n₁/2) Γ(n₂/2) (1 + n₁s/n₂)^{(n₁+n₂)/2}).

Therefore

F = (Y₁/n₁)/(Y₂/n₂) ~ F_{n₁,n₂}.
(iii) We have the representation

T = Z/√(Y/n) = √n Z Y^{−1/2},

where Z ~ N(0, 1) and Y ~ χ²_n are independent random variables. Fix r ∈ {1, 2, ...}. Then

E(T^r) = n^{r/2} E(Z^r Y^{−r/2}) = n^{r/2} E(Z^r) E(Y^{−r/2}) (since Z and Y are independent),

provided the expectations are finite. We have, from the proof of Theorem 4.3 (iii), Module 5,

E(Z^r) = { 0, if r is odd
         { r!/(2^{r/2} (r/2)!), if r is even.

Also

E(Y^{−r/2}) = ∫₀^∞ y^{−r/2} (1/(2^{n/2} Γ(n/2))) y^{n/2 − 1} e^{−y/2} dy = 2^{−r/2} Γ((n − r)/2)/Γ(n/2),

which is finite if, and only if, n > r (see Section 2, Module 5). Also, for n > r,

E(T^r) = { 0, if r is odd
         { (n^{r/2} r! Γ((n − r)/2))/(2^r (r/2)! Γ(n/2)), if r is even.

(iv) Using (iii),

μ = E(T) = 0, if n ∈ {2, 3, ...},
σ² = Var(T) = E(T²) = n/(n − 2), if n ∈ {3, 4, ...},
μ₃ = E(T³) = 0, if n ∈ {4, 5, ...}, so that β₁ = μ₃/σ³ = 0, if n ∈ {4, 5, ...},

and

μ₄ = E(T⁴) = 3n²/((n − 2)(n − 4)), if n ∈ {5, 6, ...}.

Consequently the coefficient of kurtosis is

β₂ = μ₄/σ⁴ = 3(n − 2)/(n − 4), if n ∈ {5, 6, ...}.
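These moment formulas can be verified by direct numerical integration of the t density; the truncation limit and grid size below are arbitrary accuracy choices:

```python
from math import gamma, pi, sqrt, isclose

def t_pdf(x, n):
    return gamma((n + 1) / 2) / (sqrt(n * pi) * gamma(n / 2)) * (1 + x * x / n) ** (-(n + 1) / 2)

def t_moment(r, n, lim=200.0, m=200_000):
    # midpoint-rule integral of x^r * f_T(x) over (-lim, lim)
    h = 2.0 * lim / m
    total = 0.0
    for i in range(m):
        x = -lim + (i + 0.5) * h
        total += (x ** r) * t_pdf(x, n)
    return total * h

n = 7
assert isclose(t_moment(0, n), 1.0, abs_tol=1e-4)                       # density integrates to 1
assert abs(t_moment(1, n)) < 1e-8                                      # E(T) = 0
assert isclose(t_moment(2, n), n / (n - 2), abs_tol=1e-3)              # Var(T) = n/(n-2)
assert isclose(t_moment(4, n) / (n / (n - 2)) ** 2,
               3 * (n - 2) / (n - 4), abs_tol=1e-2)                    # kurtosis 3(n-2)/(n-4)
```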
(v) We have the representation

T = (n₂/n₁) Y₁ Y₂^{−1},

where Y₁ ~ χ²_{n₁} and Y₂ ~ χ²_{n₂} are independent random variables. Fix r ∈ {1, 2, ...}. Then

E(T^r) = (n₂/n₁)^r E(Y₁^r Y₂^{−r}) = (n₂/n₁)^r E(Y₁^r) E(Y₂^{−r}) (Y₁ and Y₂ are independent),

provided the expectations are finite. Since Y₁ ~ χ²_{n₁}, E(Y₁^r) is finite for any r > 0 and

E(Y₁^r) = ∫₀^∞ y^r (1/(2^{n₁/2} Γ(n₁/2))) y^{n₁/2 − 1} e^{−y/2} dy
        = 2^r Γ(n₁/2 + r)/Γ(n₁/2)
        = 2^r (n₁/2 + r − 1)(n₁/2 + r − 2) ⋯ (n₁/2)
        = ∏_{i=1}^{r} (n₁ + 2(i − 1)).

Since Y₂ ~ χ²_{n₂}, E(Y₂^{−r}) is finite if, and only if, n₂ > 2r. For n₂ > 2r,

E(Y₂^{−r}) = 2^{−r} Γ(n₂/2 − r)/Γ(n₂/2) = 1/∏_{i=1}^{r} (n₂ − 2i).

It follows that, for n₂ ∈ {1, 2, ..., 2r} and s ≥ r, E(T^s) is not finite. For n₂ ∈ {2r + 1, 2r + 2, ...},

E(T^r) = (n₂/n₁)^r ∏_{i=1}^{r} (n₁ + 2(i − 1))/(n₂ − 2i).

(vi) Follows on using (v) after some tedious calculations.
▄
Corollary 11.1
Let X₁, ..., X_n (n ≥ 2) be a random sample from the N(μ, σ²) distribution, where μ ∈ (−∞, ∞) and σ > 0. Let X̄ = (1/n) Σ_{i=1}^{n} X_i and S² = (1/(n − 1)) Σ_{i=1}^{n} (X_i − X̄)² denote the sample mean and the sample variance respectively. Then

√n(X̄ − μ)/S ~ t_{n−1}.

Proof. By Theorem 10.3.1, √n(X̄ − μ)/σ ~ N(0, 1) and (n − 1)S²/σ² ~ χ²_{n−1} are independent random variables. Therefore, by Theorem 11.1 (i),

(√n(X̄ − μ)/σ)/√(((n − 1)S²/σ²)/(n − 1)) ~ t_{n−1},

i.e.,

√n(X̄ − μ)/S ~ t_{n−1}.

▄
Corollary 11.2
Let X₁, ..., X_m and Y₁, ..., Y_n (m ≥ 2, n ≥ 2) be independent random samples from the N(μ₁, σ₁²) and N(μ₂, σ₂²) distributions, respectively, and let S₁² and S₂² denote the corresponding sample variances. Then

(S₁² σ₂²)/(S₂² σ₁²) ~ F_{m−1,n−1}.

Proof. By Theorem 10.3.1,

(m − 1)S₁²/σ₁² ~ χ²_{m−1} and (n − 1)S₂²/σ₂² ~ χ²_{n−1}

are independent random variables. Therefore, by Theorem 11.1 (ii),

(((m − 1)S₁²/σ₁²)/(m − 1))/(((n − 1)S₂²/σ₂²)/(n − 1)) ~ F_{m−1,n−1},

i.e.,

(S₁²/σ₁²)/(S₂²/σ₂²) ~ F_{m−1,n−1}.

▄
Remark 11.2
(i) Suppose that T ~ t_n. Then, by Theorem 11.1 (i),

T = Z/√(Y/n),

where Z ~ N(0, 1) and Y ~ χ²_n are independent random variables, so that

T² = Z²/(Y/n) = (Z²/1)/(Y/n).

Since Z ~ N(0, 1), by Theorem 4.2 (v), Module 5, we have Z² ~ χ²₁. It follows that Z² ~ χ²₁ and Y ~ χ²_n are independent random variables. Consequently

T² = (Z²/1)/(Y/n) ~ F_{1,n}.

(ii) Suppose that T ~ F_{n₁,n₂}. Then, by Theorem 11.1 (ii),

T = (Y₁/n₁)/(Y₂/n₂) and 1/T = (Y₂/n₂)/(Y₁/n₁),

where Y₁ ~ χ²_{n₁} and Y₂ ~ χ²_{n₂} are independent random variables. Now using Theorem 11.1 (ii) it follows that

1/T = (Y₂/n₂)/(Y₁/n₁) ~ F_{n₂,n₁}.

(iii) Note that if T ~ t_n then, by Remark 11.1 (i), the distribution of T is symmetric about 0 and its kurtosis is

β₂ = 3(n − 2)/(n − 4) > 3, n > 4.

Thus a t-distribution with n (> 4) degrees of freedom is symmetric and leptokurtic (i.e., it has a sharper peak and longer, fatter tails than the normal). Note that the kurtosis decreases as n increases and β₂ → 3, as n → ∞. This suggests that, for large degrees of freedom, the Student t-distribution behaves like the N(0, 1) distribution. A rigorous proof of this observation will be provided in the next module.
Suppose that T ~ t_n and, for a fixed α ∈ (0, 1), let t_{n,α} be the (1 − α)-th quantile of T, i.e.,

F_T(t_{n,α}) = P({T ≤ t_{n,α}}) = 1 − α.

Then

F_T(−t_{n,α}) = 1 − F_T(t_{n,α}) = α (since T and −T have the same distribution),

i.e., t_{n,1−α} = −t_{n,α}.

Now suppose that T ~ F_{n₁,n₂} and, for a fixed α ∈ (0, 1), let F_{n₁,n₂,α} be the (1 − α)-th quantile of T, i.e.,

F_T(F_{n₁,n₂,α}) = P({T ≤ F_{n₁,n₂,α}}) = 1 − α.

Since 1/T ~ F_{n₂,n₁},

P({1/T ≥ 1/F_{n₁,n₂,α}}) = 1 − α
⇒ P({1/T ≤ 1/F_{n₁,n₂,α}}) = 1 − (1 − α) = α
⇒ F_{n₂,n₁,1−α} = 1/F_{n₁,n₂,α}.
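The quantile tables that follow can be spot-checked against a closed form: for n₁ = 2 the F(2, n₂) c.d.f. is 1 − (1 + 2x/n₂)^{−n₂/2} (a standard consequence of the chi-square with 2 degrees of freedom being exponential). A minimal sketch, using tabulated values from the α = 0.05 and α = 0.01 tables below:

```python
def cdf_f2(x, n2):
    # closed-form c.d.f. of the F(2, n2) distribution
    return 1.0 - (1.0 + 2.0 * x / n2) ** (-n2 / 2.0)

assert abs(cdf_f2(6.94, 4) - 0.95) < 1e-3    # tabulated F_{2,4,0.05} = 6.94
assert abs(cdf_f2(18.00, 4) - 0.99) < 1e-6   # tabulated F_{2,4,0.01} = 18.00
# reciprocal relation: P(F_{2,4} >= F_{2,4,0.05}) = alpha
assert abs((1.0 - cdf_f2(6.94, 4)) - 0.05) < 1e-3
```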
Figure 11.3: Plots of p.d.f.s of T ~ F_{n₁,n₂}
Example 11.1
Let X₁, ..., X_n be independent and identically distributed N(0, 1) random variables and let a₁, ..., a_n, b₁, ..., b_n be real numbers such that Σ_{i=1}^{n} a_i² > 0, Σ_{i=1}^{n} b_i² > 0 and Σ_{i=1}^{n} a_i b_i = 0. Show that:

(i) T₁ = (Σ_{i=1}^{n} a_i X_i/√(Σ_{i=1}^{n} a_i²)) · (√(Σ_{i=1}^{n} b_i²)/|Σ_{i=1}^{n} b_i X_i|) ~ t₁;

(ii) T₂ = ((Σ_{i=1}^{n} a_i X_i)²/Σ_{i=1}^{n} a_i²) · (Σ_{i=1}^{n} b_i²/(Σ_{i=1}^{n} b_i X_i)²) ~ F_{1,1};

(iii) T₃ = (Σ_{i=1}^{n} a_i X_i/√(Σ_{i=1}^{n} a_i²)) · (√(Σ_{i=1}^{n} b_i²)/Σ_{i=1}^{n} b_i X_i) follows the standard Cauchy distribution.

Solution. Let Y₁ = Σ_{i=1}^{n} a_i X_i and Y₂ = Σ_{i=1}^{n} b_i X_i. For (s₁, s₂) ∈ ℝ²,

s₁ Y₁ + s₂ Y₂ = Σ_{i=1}^{n} (s₁ a_i + s₂ b_i) X_i,

so that

s₁ Y₁ + s₂ Y₂ ~ N(0, Σ_{i=1}^{n} (s₁ a_i + s₂ b_i)²).

Now using Theorem 9.2 it follows that Y = (Y₁, Y₂) ~ N₂(0, 0, Σ_{i=1}^{n} a_i², Σ_{i=1}^{n} b_i², 0) (since E(Y₁) = 0 = E(Y₂), Var(Y₁) = Σ_{i=1}^{n} a_i², Var(Y₂) = Σ_{i=1}^{n} b_i² and Cov(Y₁, Y₂) = Σ_{i=1}^{n} a_i b_i = 0). Since the correlation between Y₁ and Y₂ is 0 and Y = (Y₁, Y₂) ~ N₂(0, 0, Σ_{i=1}^{n} a_i², Σ_{i=1}^{n} b_i², 0), it follows that Y₁ ~ N(0, Σ_{i=1}^{n} a_i²) and Y₂ ~ N(0, Σ_{i=1}^{n} b_i²) are independent (see Theorem 9.1). Thus

W₁ = Y₁/√(Σ_{i=1}^{n} a_i²) = Σ_{i=1}^{n} a_i X_i/√(Σ_{i=1}^{n} a_i²) and W₂ = Y₂/√(Σ_{i=1}^{n} b_i²) = Σ_{i=1}^{n} b_i X_i/√(Σ_{i=1}^{n} b_i²)

are independent and identically distributed N(0, 1) random variables. This implies that W₁ ~ N(0, 1) and W₂² ~ χ²₁ are independent random variables. Consequently

T₁ = W₁/√(W₂²/1) ~ t₁,

T₂ = (W₁²/1)/(W₂²/1) ~ F_{1,1}

and

T₃ = W₁/W₂ follows the standard Cauchy distribution (see Example 10.2.12 (ii)).
154
Table 11.1: (1 − α)-th quantiles t_{n,α} of T ~ t_n (P({T ≤ t_{n,α}}) = 1 − α)
                                   α
n      .25     .1      .05     .025    .01     .005    .001
1 1.000 3.078 6.314 12.71 31.82 63.66 318.3
2 0.816 1.886 2.920 4.303 6.965 9.925 22.33
3 0.765 1.638 2.353 3.182 4.541 5.841 10.21
4 0.741 1.533 2.132 2.776 3.747 4.604 7.173
5 0.727 1.476 2.015 2.571 3.365 4.032 5.893
6 0.718 1.440 1.943 2.447 3.143 3.707 5.208
7 0.711 1.415 1.895 2.365 2.998 3.499 4.785
8 0.706 1.397 1.860 2.306 2.896 3.355 4.501
9 0.703 1.383 1.833 2.262 2.821 3.250 4.297
10 0.700 1.372 1.812 2.228 2.764 3.169 4.144
11 0.697 1.363 1.796 2.201 2.718 3.106 4.025
12 0.695 1.356 1.782 2.179 2.681 3.055 3.930
13 0.694 1.350 1.771 2.160 2.650 3.012 3.852
14 0.692 1.345 1.761 2.145 2.624 2.977 3.787
15 0.691 1.341 1.753 2.131 2.602 2.947 3.733
16 0.690 1.337 1.746 2.120 2.583 2.921 3.686
17 0.689 1.333 1.740 2.110 2.567 2.898 3.646
18 0.688 1.330 1.734 2.101 2.552 2.878 3.610
19 0.688 1.328 1.729 2.093 2.539 2.861 3.579
20 0.687 1.325 1.725 2.086 2.528 2.845 3.552
21 0.686 1.323 1.721 2.080 2.518 2.831 3.527
22 0.686 1.321 1.717 2.074 2.508 2.819 3.505
23 0.685 1.319 1.714 2.069 2.500 2.807 3.485
24 0.685 1.318 1.711 2.064 2.492 2.797 3.467
25 0.684 1.316 1.708 2.060 2.485 2.787 3.450
26 0.684 1.315 1.706 2.056 2.479 2.779 3.435
27 0.684 1.314 1.703 2.052 2.473 2.771 3.421
28 0.683 1.313 1.701 2.048 2.467 2.763 3.408
29 0.683 1.311 1.699 2.045 2.462 2.756 3.396
30 0.683 1.310 1.697 2.042 2.457 2.750 3.385
35 0.682 1.306 1.690 2.030 2.438 2.724 3.340
40 0.681 1.303 1.684 2.021 2.423 2.704 3.307
50 0.679 1.299 1.676 2.009 2.403 2.678 3.261
100 0.677 1.290 1.660 1.984 2.364 2.626 3.174
∞ 0.674 1.282 1.645 1.960 2.326 2.576 3.090
Table 11.2: (1 − α)-th quantiles F_{n₁,n₂,α} of T ~ F_{n₁,n₂} (P({T ≤ F_{n₁,n₂,α}}) = 1 − α), α = 0.10
n₂\n₁   1      2      3      4      5      6      7      8      9
1 39.86 49.50 53.59 55.83 57.24 58.20 58.91 59.44 59.86
2 8.53 9.00 9.16 9.24 9.29 9.33 9.35 9.37 9.38
3 5.54 5.46 5.39 5.34 5.31 5.28 5.27 5.25 5.24
4 4.54 4.32 4.19 4.11 4.05 4.01 3.98 3.95 3.94
5 4.06 3.78 3.62 3.52 3.45 3.40 3.37 3.34 3.32
6 3.78 3.46 3.29 3.18 3.11 3.05 3.01 2.98 2.96
7 3.59 3.26 3.07 2.96 2.88 2.83 2.78 2.75 2.72
8 3.46 3.11 2.92 2.81 2.73 2.67 2.62 2.59 2.56
9 3.36 3.01 2.81 2.69 2.61 2.55 2.51 2.47 2.44
10 3.29 2.92 2.73 2.61 2.52 2.46 2.41 2.38 2.35
11 3.23 2.86 2.66 2.54 2.45 2.39 2.34 2.3 2.27
12 3.18 2.81 2.61 2.48 2.39 2.33 2.28 2.24 2.21
13 3.14 2.76 2.56 2.43 2.35 2.28 2.23 2.20 2.16
14 3.10 2.73 2.52 2.39 2.31 2.24 2.19 2.15 2.12
15 3.07 2.70 2.49 2.36 2.27 2.21 2.16 2.12 2.09
16 3.05 2.67 2.46 2.33 2.24 2.18 2.13 2.09 2.06
17 3.03 2.64 2.44 2.31 2.22 2.15 2.10 2.06 2.03
18 3.01 2.62 2.42 2.29 2.20 2.13 2.08 2.04 2.00
19 2.99 2.61 2.40 2.27 2.18 2.11 2.06 2.02 1.98
20 2.97 2.59 2.38 2.25 2.16 2.09 2.04 2.00 1.96
21 2.96 2.57 2.36 2.23 2.14 2.08 2.02 1.98 1.95
22 2.95 2.56 2.35 2.22 2.13 2.06 2.01 1.97 1.93
23 2.94 2.55 2.34 2.21 2.11 2.05 1.99 1.95 1.92
24 2.93 2.54 2.33 2.19 2.10 2.04 1.98 1.94 1.91
25 2.92 2.53 2.32 2.18 2.09 2.02 1.97 1.93 1.89
26 2.91 2.52 2.31 2.17 2.08 2.01 1.96 1.92 1.88
27 2.90 2.51 2.30 2.17 2.07 2.00 1.95 1.91 1.87
28 2.89 2.50 2.29 2.16 2.06 2.00 1.94 1.90 1.87
29 2.89 2.50 2.28 2.15 2.06 1.99 1.93 1.89 1.86
30 2.88 2.49 2.28 2.14 2.05 1.98 1.93 1.88 1.85
40 2.84 2.44 2.23 2.09 2.00 1.93 1.87 1.83 1.79
60 2.79 2.39 2.18 2.04 1.95 1.87 1.82 1.77 1.74
120 2.75 2.35 2.13 1.99 1.90 1.82 1.77 1.72 1.68
∞ 2.71 2.30 2.08 1.94 1.85 1.77 1.72 1.67 1.63
Table 11.2 (continued): (1 − α)-th quantiles of T ~ F_{n₁,n₂} (P({T ≤ F_{n₁,n₂,α}}) = 1 − α), α = 0.10
n₂\n₁   10     12     15     20     24     30     40     60     120    ∞
1 60.19 60.71 61.22 61.74 62.00 62.26 62.53 62.79 63.06 63.33
2 9.39 9.41 9.42 9.44 9.45 9.46 9.47 9.47 9.48 9.49
3 5.23 5.22 5.20 5.18 5.18 5.17 5.16 5.15 5.14 5.13
4 3.92 3.90 3.87 3.84 3.83 3.82 3.80 3.79 3.78 3.76
5 3.30 3.27 3.24 3.21 3.19 3.17 3.16 3.14 3.12 3.10
6 2.94 2.90 2.87 2.84 2.82 2.80 2.78 2.76 2.74 2.72
7 2.70 2.67 2.63 2.59 2.58 2.56 2.54 2.51 2.49 2.47
8 2.54 2.50 2.46 2.42 2.40 2.38 2.36 2.34 2.32 2.29
9 2.42 2.38 2.34 2.30 2.28 2.25 2.23 2.21 2.18 2.16
10 2.32 2.28 2.24 2.20 2.18 2.16 2.13 2.11 2.08 2.06
11 2.25 2.21 2.17 2.12 2.10 2.08 2.05 2.03 2.00 1.97
12 2.19 2.15 2.10 2.06 2.04 2.01 1.99 1.96 1.93 1.90
13 2.14 2.10 2.05 2.01 1.98 1.96 1.93 1.90 1.88 1.85
14 2.10 2.05 2.01 1.96 1.94 1.91 1.89 1.86 1.83 1.80
15 2.06 2.02 1.97 1.92 1.90 1.87 1.85 1.82 1.79 1.76
16 2.03 1.99 1.94 1.89 1.87 1.84 1.81 1.78 1.75 1.72
17 2.00 1.96 1.91 1.86 1.84 1.81 1.78 1.75 1.72 1.69
18 1.98 1.93 1.89 1.84 1.81 1.78 1.75 1.72 1.69 1.66
19 1.96 1.91 1.86 1.81 1.79 1.76 1.73 1.70 1.67 1.63
20 1.94 1.89 1.84 1.79 1.77 1.74 1.71 1.68 1.64 1.61
21 1.92 1.87 1.83 1.78 1.75 1.72 1.69 1.66 1.62 1.59
22 1.90 1.86 1.81 1.76 1.73 1.70 1.67 1.64 1.60 1.57
23 1.89 1.84 1.80 1.74 1.72 1.69 1.66 1.62 1.59 1.55
24 1.88 1.83 1.78 1.73 1.70 1.67 1.64 1.61 1.57 1.53
25 1.87 1.82 1.77 1.72 1.69 1.66 1.63 1.59 1.56 1.52
26 1.86 1.81 1.76 1.71 1.68 1.65 1.61 1.58 1.54 1.50
27 1.85 1.80 1.75 1.70 1.67 1.64 1.60 1.57 1.53 1.49
28 1.84 1.79 1.74 1.69 1.66 1.63 1.59 1.56 1.52 1.48
29 1.83 1.78 1.73 1.68 1.65 1.62 1.58 1.55 1.51 1.47
30 1.82 1.77 1.72 1.67 1.64 1.61 1.57 1.54 1.50 1.46
40 1.76 1.71 1.66 1.61 1.57 1.54 1.51 1.47 1.42 1.38
60 1.71 1.66 1.60 1.54 1.51 1.48 1.44 1.40 1.35 1.29
120 1.65 1.60 1.55 1.48 1.45 1.41 1.37 1.32 1.26 1.19
∞ 1.60 1.55 1.49 1.42 1.38 1.34 1.30 1.24 1.17 1.00
Table 11.2 (continued): (1 − α)-th quantiles of T ~ F_{n₁,n₂} (P({T ≤ F_{n₁,n₂,α}}) = 1 − α), α = 0.05
n₂\n₁   1      2      3      4      5      6      7      8      9
1 161.4 199.5 215.7 224.6 230.2 234.0 236.8 238.9 240.5
2 18.51 19.00 19.16 19.25 19.3 19.33 19.35 19.37 19.38
3 10.13 9.55 9.28 9.12 9.01 8.94 8.89 8.85 8.81
4 7.71 6.94 6.59 6.39 6.26 6.16 6.09 6.04 6.00
5 6.61 5.79 5.41 5.19 5.05 4.95 4.88 4.82 4.77
6 5.99 5.14 4.76 4.53 4.39 4.28 4.21 4.15 4.10
7 5.59 4.74 4.35 4.12 3.97 3.87 3.79 3.73 3.68
8 5.32 4.46 4.07 3.84 3.69 3.58 3.50 3.44 3.39
9 5.12 4.26 3.86 3.63 3.48 3.37 3.29 3.23 3.18
10 4.96 4.10 3.71 3.48 3.33 3.22 3.14 3.07 3.02
11 4.84 3.98 3.59 3.36 3.20 3.09 3.01 2.95 2.90
12 4.75 3.89 3.49 3.26 3.11 3.00 2.91 2.85 2.80
13 4.67 3.81 3.41 3.18 3.03 2.92 2.83 2.77 2.71
14 4.60 3.74 3.34 3.11 2.96 2.85 2.76 2.70 2.65
15 4.54 3.68 3.29 3.06 2.90 2.79 2.71 2.64 2.59
16 4.49 3.63 3.24 3.01 2.85 2.74 2.66 2.59 2.54
17 4.45 3.59 3.20 2.96 2.81 2.70 2.61 2.55 2.49
18 4.41 3.55 3.16 2.93 2.77 2.66 2.58 2.51 2.46
19 4.38 3.52 3.13 2.90 2.74 2.63 2.54 2.48 2.42
20 4.35 3.49 3.10 2.87 2.71 2.60 2.51 2.45 2.39
21 4.32 3.47 3.07 2.84 2.68 2.57 2.49 2.42 2.37
22 4.30 3.44 3.05 2.82 2.66 2.55 2.46 2.40 2.34
23 4.28 3.42 3.03 2.80 2.64 2.53 2.44 2.37 2.32
24 4.26 3.40 3.01 2.78 2.62 2.51 2.42 2.36 2.30
25 4.24 3.39 2.99 2.76 2.60 2.49 2.40 2.34 2.28
26 4.23 3.37 2.98 2.74 2.59 2.47 2.39 2.32 2.27
27 4.21 3.35 2.96 2.73 2.57 2.46 2.37 2.31 2.25
28 4.20 3.34 2.95 2.71 2.56 2.45 2.36 2.29 2.24
29 4.18 3.33 2.93 2.70 2.55 2.43 2.35 2.28 2.22
30 4.17 3.32 2.92 2.69 2.53 2.42 2.33 2.27 2.21
40 4.08 3.23 2.84 2.61 2.45 2.34 2.25 2.18 2.12
60 4.00 3.15 2.76 2.53 2.37 2.25 2.17 2.10 2.04
120 3.92 3.07 2.68 2.45 2.29 2.17 2.09 2.02 1.96
∞ 3.84 3.00 2.60 2.37 2.21 2.10 2.01 1.94 1.88
Table 11.2 (continued): (1 − α)-th quantiles of T ~ F_{n₁,n₂} (P({T ≤ F_{n₁,n₂,α}}) = 1 − α), α = 0.05
n₂\n₁   10     12     15     20     24     30     40     60     120    ∞
1 241.9 243.9 245.9 248.0 249.1 250.1 251.1 252.2 253.3 254.3
2 19.4 19.41 19.43 19.45 19.45 19.46 19.47 19.48 19.49 19.5
3 8.79 8.74 8.70 8.66 8.64 8.62 8.59 8.57 8.55 8.53
4 5.96 5.91 5.86 5.80 5.77 5.75 5.72 5.69 5.66 5.63
5 4.74 4.68 4.62 4.56 4.53 4.50 4.46 4.43 4.40 4.36
6 4.06 4.00 3.94 3.87 3.84 3.81 3.77 3.74 3.70 3.67
7 3.64 3.57 3.51 3.44 3.41 3.38 3.34 3.30 3.27 3.23
8 3.35 3.28 3.22 3.15 3.12 3.08 3.04 3.01 2.97 2.93
9 3.14 3.07 3.01 2.94 2.90 2.86 2.83 2.79 2.75 2.71
10 2.98 2.91 2.85 2.77 2.74 2.70 2.66 2.62 2.58 2.54
11 2.85 2.79 2.72 2.65 2.61 2.57 2.53 2.49 2.45 2.40
12 2.75 2.69 2.62 2.54 2.51 2.47 2.43 2.38 2.34 2.30
13 2.67 2.60 2.53 2.46 2.42 2.38 2.34 2.30 2.25 2.21
14 2.60 2.53 2.46 2.39 2.35 2.31 2.27 2.22 2.18 2.13
15 2.54 2.48 2.40 2.33 2.29 2.25 2.20 2.16 2.11 2.07
16 2.49 2.42 2.35 2.28 2.24 2.19 2.15 2.11 2.06 2.01
17 2.45 2.38 2.31 2.23 2.19 2.15 2.10 2.06 2.01 1.96
18 2.41 2.34 2.27 2.19 2.15 2.11 2.06 2.02 1.97 1.92
19 2.38 2.31 2.23 2.16 2.11 2.07 2.03 1.98 1.93 1.88
20 2.35 2.28 2.20 2.12 2.08 2.04 1.99 1.95 1.90 1.84
21 2.32 2.25 2.18 2.10 2.05 2.01 1.96 1.92 1.87 1.81
22 2.30 2.23 2.15 2.07 2.03 1.98 1.94 1.89 1.84 1.78
23 2.27 2.20 2.13 2.05 2.01 1.96 1.91 1.86 1.81 1.76
24 2.25 2.18 2.11 2.03 1.98 1.94 1.89 1.84 1.79 1.73
25 2.24 2.16 2.09 2.01 1.96 1.92 1.87 1.82 1.77 1.71
26 2.22 2.15 2.07 1.99 1.95 1.90 1.85 1.80 1.75 1.69
27 2.20 2.13 2.06 1.97 1.93 1.88 1.84 1.79 1.73 1.67
28 2.19 2.12 2.04 1.96 1.91 1.87 1.82 1.77 1.71 1.65
29 2.18 2.10 2.03 1.94 1.90 1.85 1.81 1.75 1.70 1.64
30 2.16 2.09 2.01 1.93 1.89 1.84 1.79 1.74 1.68 1.62
40 2.08 2.00 1.92 1.84 1.79 1.74 1.69 1.64 1.58 1.51
60 1.99 1.92 1.84 1.75 1.70 1.65 1.59 1.53 1.47 1.39
120 1.91 1.83 1.75 1.66 1.61 1.55 1.50 1.43 1.35 1.25
∞ 1.83 1.75 1.67 1.57 1.52 1.46 1.39 1.32 1.22 1.00
Table 11.2 (continued): (1 − α)-th quantiles of T ~ F_{n₁,n₂} (P({T ≤ F_{n₁,n₂,α}}) = 1 − α), α = 0.01
n₂\n₁   1      2      3      4      5      6      7      8      9
1 4052 4999.5 5403 5625 5764 5859 5928 5982 6022
2 98.50 99.00 99.17 99.25 99.30 99.33 99.36 99.37 99.39
3 34.12 30.82 29.46 28.71 28.24 27.91 27.67 27.49 27.35
4 21.20 18.00 16.69 15.98 15.52 15.21 14.98 14.80 14.66
5 16.26 13.27 12.06 11.39 10.97 10.67 10.46 10.29 10.16
6 13.75 10.92 9.78 9.15 8.75 8.47 8.26 8.10 7.98
7 12.25 9.55 8.45 7.85 7.46 7.19 6.99 6.84 6.72
8 11.26 8.65 7.59 7.01 6.63 6.37 6.18 6.03 5.91
9 10.56 8.02 6.99 6.42 6.06 5.80 5.61 5.47 5.35
10 10.04 7.56 6.55 5.99 5.64 5.39 5.2 5.06 4.94
11 9.65 7.21 6.22 5.67 5.32 5.07 4.89 4.74 4.63
12 9.33 6.93 5.95 5.41 5.06 4.82 4.64 4.50 4.39
13 9.07 6.70 5.74 5.21 4.86 4.62 4.44 4.30 4.19
14 8.86 6.51 5.56 5.04 4.69 4.46 4.28 4.14 4.03
15 8.68 6.36 5.42 4.89 4.56 4.32 4.14 4.00 3.89
16 8.53 6.23 5.29 4.77 4.44 4.20 4.03 3.89 3.78
17 8.40 6.11 5.18 4.67 4.34 4.10 3.93 3.79 3.68
18 8.29 6.01 5.09 4.58 4.25 4.01 3.84 3.71 3.60
19 8.18 5.93 5.01 4.50 4.17 3.94 3.77 3.63 3.52
20 8.10 5.85 4.94 4.43 4.10 3.87 3.70 3.56 3.46
21 8.02 5.78 4.87 4.37 4.04 3.81 3.64 3.51 3.40
22 7.95 5.72 4.82 4.31 3.99 3.76 3.59 3.45 3.35
23 7.88 5.66 4.76 4.26 3.94 3.71 3.54 3.41 3.30
24 7.82 5.61 4.72 4.22 3.90 3.67 3.50 3.36 3.26
25 7.77 5.57 4.68 4.18 3.85 3.63 3.46 3.32 3.22
26 7.72 5.53 4.64 4.14 3.82 3.59 3.42 3.29 3.18
27 7.68 5.49 4.60 4.11 3.78 3.56 3.39 3.26 3.15
28 7.64 5.45 4.57 4.07 3.75 3.53 3.36 3.23 3.12
29 7.60 5.42 4.54 4.04 3.73 3.50 3.33 3.20 3.09
30 7.56 5.39 4.51 4.02 3.70 3.47 3.30 3.17 3.07
40 7.31 5.18 4.31 3.83 3.51 3.29 3.12 2.99 2.89
60 7.08 4.98 4.13 3.65 3.34 3.12 2.95 2.82 2.72
120 6.85 4.79 3.95 3.48 3.17 2.96 2.79 2.66 2.56
∞ 6.63 4.61 3.78 3.32 3.02 2.80 2.64 2.51 2.41
Table 11.2 (continued): (1 − α)-th quantiles F_{m,n,α} of F_{m,n} (P(F_{m,n} ≤ F_{m,n,α}) = 1 − α), α = 0.01
n \ m 10 12 15 20 24 30 40 60 120 ∞
1 6056 6106 6157 6209 6235 6261 6287 6313 6339 6366
2 99.40 99.42 99.43 99.45 99.46 99.47 99.47 99.48 99.49 99.50
3 27.23 27.05 26.87 26.69 26.60 26.50 26.41 26.32 26.22 26.13
4 14.55 14.37 14.20 14.02 13.93 13.84 13.75 13.65 13.56 13.46
5 10.05 9.89 9.72 9.55 9.47 9.38 9.29 9.20 9.11 9.02
6 7.87 7.72 7.56 7.40 7.31 7.23 7.14 7.06 6.97 6.88
7 6.62 6.47 6.31 6.16 6.07 5.99 5.91 5.82 5.74 5.65
8 5.81 5.67 5.52 5.36 5.28 5.20 5.12 5.03 4.95 4.86
9 5.26 5.11 4.96 4.81 4.73 4.65 4.57 4.48 4.40 4.31
10 4.85 4.71 4.56 4.41 4.33 4.25 4.17 4.08 4.00 3.91
11 4.54 4.40 4.25 4.10 4.02 3.94 3.86 3.78 3.69 3.60
12 4.30 4.16 4.01 3.86 3.78 3.70 3.62 3.54 3.45 3.36
13 4.10 3.96 3.82 3.66 3.59 3.51 3.43 3.34 3.25 3.17
14 3.94 3.80 3.66 3.51 3.43 3.35 3.27 3.18 3.09 3.00
15 3.80 3.67 3.52 3.37 3.29 3.21 3.13 3.05 2.96 2.87
16 3.69 3.55 3.41 3.26 3.18 3.10 3.02 2.93 2.84 2.75
17 3.59 3.46 3.31 3.16 3.08 3.00 2.92 2.83 2.75 2.65
18 3.51 3.37 3.23 3.08 3.00 2.92 2.84 2.75 2.66 2.57
19 3.43 3.30 3.15 3.00 2.92 2.84 2.76 2.67 2.58 2.49
20 3.37 3.23 3.09 2.94 2.86 2.78 2.69 2.61 2.52 2.42
21 3.31 3.17 3.03 2.88 2.80 2.72 2.64 2.55 2.46 2.36
22 3.26 3.12 2.98 2.83 2.75 2.67 2.58 2.50 2.40 2.31
23 3.21 3.07 2.93 2.78 2.70 2.62 2.54 2.45 2.35 2.26
24 3.17 3.03 2.89 2.74 2.66 2.58 2.49 2.40 2.31 2.21
25 3.13 2.99 2.85 2.70 2.62 2.54 2.45 2.36 2.27 2.17
26 3.09 2.96 2.81 2.66 2.58 2.50 2.42 2.33 2.23 2.13
27 3.06 2.93 2.78 2.63 2.55 2.47 2.38 2.29 2.20 2.10
28 3.03 2.90 2.75 2.60 2.52 2.44 2.35 2.26 2.17 2.06
29 3.00 2.87 2.73 2.57 2.49 2.41 2.33 2.23 2.14 2.03
30 2.98 2.84 2.70 2.55 2.47 2.39 2.30 2.21 2.11 2.01
40 2.80 2.66 2.52 2.37 2.29 2.20 2.11 2.02 1.92 1.80
60 2.63 2.50 2.35 2.20 2.12 2.03 1.94 1.84 1.73 1.60
120 2.47 2.34 2.19 2.03 1.95 1.86 1.76 1.66 1.53 1.38
∞ 2.32 2.18 2.04 1.88 1.79 1.70 1.59 1.47 1.32 1.00
Problems
(iii) Let F_{X,Y}(⋅,⋅) be the distribution function of some two-dimensional random vector (X, Y), and let F_X(⋅) and F_Y(⋅), respectively, be the marginal distribution functions of X and Y. Define
G(x, y) = min{F_X(x), F_Y(y)}, (x, y) ∈ ℝ²,
and
H(x, y) = max{F_X(x) + F_Y(y) − 1, 0}, (x, y) ∈ ℝ².
Prove that:
(a) G(⋅,⋅) and H(⋅,⋅) are each distribution functions and their marginal distribution functions are the same as those of F_{X,Y}(⋅,⋅);
(Note: Let the random variable X have distribution function F_X(⋅) and let Y = g(X) have distribution function F_Y(⋅), where g(⋅) is some function. If g(⋅) is strictly increasing (decreasing), then F_{X,Y}(x, y) = G(x, y) (respectively, F_{X,Y}(x, y) = H(x, y)).)
2. Let the random vector X = (X₁, X₂) have the joint distribution function
F_{X₁,X₂}(x₁, x₂) =
 0, if x₁ < 0 or x₂ < 0;
 x₁x₂/8, if 0 ≤ x₁ < 1, 0 ≤ x₂ < 2, or 1 ≤ x₁ < 2, 0 ≤ x₂ < 1;
 x₁/4, if 0 ≤ x₁ < 1, x₂ ≥ 2;
 1/2 + x₁x₂/8, if 1 ≤ x₁ < 2, 1 ≤ x₂ < 2;
 1/2 + x₁/4, if 1 ≤ x₁ < 2, x₂ ≥ 2;
 x₂/4, if x₁ ≥ 2, 0 ≤ x₂ < 1;
 1/2 + x₂/4, if x₁ ≥ 2, 1 ≤ x₂ < 2;
 1, if x₁ ≥ 2, x₂ ≥ 2.
Find P({(X₁, X₂) = (0, 0)}) and P({(X₁, X₂) = (1, 1)}). Is X = (X₁, X₂) of absolutely continuous type?
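A point mass can be read off the distribution function numerically: P({(X₁, X₂) = (a, b)}) equals the two-dimensional increment F(a, b) − F(a⁻, b) − F(a, b⁻) + F(a⁻, b⁻). A minimal sketch (the function transcribes the piecewise d.f. stated above; left limits are approximated with a small ε):

```python
def F(x1, x2):
    # joint distribution function of (X1, X2), transcribed from the problem
    if x1 < 0 or x2 < 0:
        return 0.0
    if (0 <= x1 < 1 and 0 <= x2 < 2) or (1 <= x1 < 2 and 0 <= x2 < 1):
        return x1 * x2 / 8
    if x1 < 1 and x2 >= 2:
        return x1 / 4
    if 1 <= x1 < 2 and 1 <= x2 < 2:
        return 0.5 + x1 * x2 / 8
    if 1 <= x1 < 2 and x2 >= 2:
        return 0.5 + x1 / 4
    if x1 >= 2 and x2 < 1:
        return x2 / 4
    if x1 >= 2 and x2 < 2:
        return 0.5 + x2 / 4
    return 1.0

def point_mass(a, b, eps=1e-9):
    # two-dimensional jump of F at (a, b) approximates P((X1, X2) = (a, b))
    return F(a, b) - F(a - eps, b) - F(a, b - eps) + F(a - eps, b - eps)

print(point_mass(0, 0))  # ≈ 0
print(point_mass(1, 1))  # ≈ 0.5, so (X1, X2) is not of absolutely continuous type
```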
3. Let the random vector (X, Y) have the joint p.m.f.
f_{X,Y}(x, y) = [(x + y + k − 1)! / (x! y! (k − 1)!)] p₁^x p₂^y (1 − p₁ − p₂)^k, if (x, y) ∈ ℤ₊ × ℤ₊; 0, otherwise,
where k ≥ 1 is an integer, 0 < pᵢ < 1, i = 1, 2, p₁ + p₂ < 1 and ℤ₊ = {0, 1, 2, …}. Find the marginal p.m.f.s of X and Y and the conditional distributions. (Note: A distribution with the above p.m.f. is called a bivariate negative binomial distribution.)
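Numerically, the marginal of X obtained by summing the p.m.f. over y can be compared with the negative binomial closed form C(x + k − 1, x) (p₁/(1 − p₂))^x ((1 − p₁ − p₂)/(1 − p₂))^k; the parameter values below (k = 2, p₁ = 0.3, p₂ = 0.4) are arbitrary choices for the check:

```python
from math import comb

k, p1, p2 = 2, 0.3, 0.4
q = 1 - p1 - p2

def f(x, y):
    # (x+y+k-1)!/(x! y! (k-1)!) p1^x p2^y q^k, written with binomial coefficients
    return comb(x + y + k - 1, x) * comb(y + k - 1, y) * p1**x * p2**y * q**k

def marginal_X(x, ymax=400):
    # truncated sum over y; the tail decays geometrically in p2
    return sum(f(x, y) for y in range(ymax))

def nb(x):
    # candidate closed form: negative binomial with success probability p1/(1 - p2)
    r = p1 / (1 - p2)
    return comb(x + k - 1, x) * r**x * (1 - r)**k

for x in range(6):
    assert abs(marginal_X(x) - nb(x)) < 1e-10
print("marginal of X matches the negative binomial closed form")
```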
4. Three balls are randomly placed in three empty boxes B₁, B₂ and B₃. Let Y denote the total number of boxes which are occupied and let Xᵢ denote the number of balls in the box Bᵢ, i = 1, 2, 3.
(i) Find the joint p.m.f. of (Y, X₁);
(ii) Find the joint p.m.f. of (X₁, X₂);
(iii) Find the marginal p.m.f.s of Y and X₁;
(iv) Find the marginal p.m.f. of X₂ from the joint p.m.f. of (X₁, X₂).
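Since the 3³ = 27 placements are equally likely, every p.m.f. asked for here can be read off a brute-force enumeration; a minimal sketch:

```python
from itertools import product
from collections import Counter

joint_YX1 = Counter()   # (Y, X1): Y = number of occupied boxes, X1 = balls in B1
joint_X1X2 = Counter()  # (X1, X2)
for placement in product([1, 2, 3], repeat=3):   # box chosen by each of the 3 balls
    counts = [placement.count(b) for b in (1, 2, 3)]
    y = sum(1 for c in counts if c > 0)
    joint_YX1[(y, counts[0])] += 1
    joint_X1X2[(counts[0], counts[1])] += 1

print(joint_YX1[(1, 3)] / 27)   # P(Y = 1, X1 = 3) = 1/27
print(joint_X1X2[(1, 1)] / 27)  # P(X1 = 1, X2 = 1) = 6/27
```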
7. Suppose that X₁, …, Xₙ are i.i.d. random variables with P(X₁ = 0) = 1 − p = 1 − P(X₁ = 1), for some p ∈ (0, 1). Let Y denote the number of X₁, …, Xₙ that are as large as X₁. Find the p.m.f. of Y.
8. Suppose that the number, X, of eggs laid by a bird has the P(λ) distribution (the Poisson distribution with mean λ), and the probability that an egg finally develops is p ∈ (0, 1); here λ > 0. Further suppose that eggs develop independently of each other. Show that the number, Y, of eggs surviving has the P(λp) distribution. Also, find the conditional distribution of X given Y = y, where y ∈ {0, 1, 2, …}.
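The thinning claim can be checked numerically: summing P(X = x)·C(x, y)p^y(1 − p)^{x−y} over x should reproduce the P(λp) p.m.f. A sketch with the arbitrary values λ = 3, p = 0.4:

```python
from math import exp, comb, factorial

lam, p = 3.0, 0.4

def pois(mean, j):
    # Poisson p.m.f. with the given mean
    return exp(-mean) * mean**j / factorial(j)

def pY(y, xmax=100):
    # P(Y = y) = sum over x of P(X = x) * P(exactly y of the x eggs develop)
    return sum(pois(lam, x) * comb(x, y) * p**y * (1 - p) ** (x - y)
               for x in range(y, xmax))

for y in range(8):
    assert abs(pY(y) - pois(lam * p, y)) < 1e-12
print("Y ~ P(lambda * p) confirmed numerically")
```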
9. Let the random vector (X, Y) have the joint p.d.f.
f_{X,Y}(x, y) = [Γ(α₁ + α₂ + α₃) / (Γ(α₁)Γ(α₂)Γ(α₃))] x^{α₁−1} y^{α₂−1} (1 − x − y)^{α₃−1}, if x > 0, y > 0, x + y < 1; 0, otherwise,
where αᵢ > 0, i = 1, 2, 3. Find the marginal p.d.f.s of X and Y and the conditional p.d.f.s. (Note: A distribution with the above p.d.f. is called a bivariate beta distribution.)
10. Let the random vector X = (X₁, X₂) have the joint p.m.f.
f_X(x₁, x₂) = (x₁ + 2x₂)/18, if (x₁, x₂) ∈ {1, 2} × {1, 2}; 0, otherwise.
Determine the conditional mean and conditional variance of X₁ given X₂ = x₂, x₂ ∈ {1, 2}.
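Because the support is just {1, 2} × {1, 2}, the conditional mean and variance can be tabulated directly; a minimal sketch:

```python
f = {(x1, x2): (x1 + 2 * x2) / 18 for x1 in (1, 2) for x2 in (1, 2)}

results = {}
for x2 in (1, 2):
    px2 = f[(1, x2)] + f[(2, x2)]                    # marginal P(X2 = x2)
    cond = {x1: f[(x1, x2)] / px2 for x1 in (1, 2)}  # conditional p.m.f. of X1
    mean = sum(x1 * w for x1, w in cond.items())
    var = sum(x1 ** 2 * w for x1, w in cond.items()) - mean ** 2
    results[x2] = (mean, var)
    print(x2, mean, var)  # x2 = 1: mean 11/7, var 12/49; x2 = 2: mean 17/11, var 30/121
```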
11. Let the random vector (X₁, X₂, X₃) have the joint p.d.f.
f_{X₁,X₂,X₃}(x₁, x₂, x₃) = 1/4, if (x₁, x₂, x₃) ∈ A; 0, otherwise.
12. Let X and Y be two random variables such that P({X ∈ {0, 1}}) = P({Y ∈ {0, 1}}) = 1. If P({X = 1, Y = 1}) = P({X = 1})P({Y = 1}), show that X and Y are independent random variables.
13. Five cards are drawn at random without replacement from a deck of 52 cards. Let the random variables X₁, X₂ and X₃, respectively, denote the number of spades, the number of hearts and
14. Consider a sample of size 3 drawn with replacement from an urn containing 3 white, 2 black and 3 red balls. Let the random variables X and Y, respectively, denote the number of white balls and the number of black balls in the sample. Determine whether or not X and Y are independent.
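Since sampling is with replacement, (X, Y) follows a trinomial law with cell probabilities 3/8 (white) and 2/8 (black), so independence can be tested by comparing the joint p.m.f. with the product of the marginals; a quick sketch:

```python
from math import factorial

n, pw, pb = 3, 3 / 8, 2 / 8

def joint(x, y):
    # trinomial p.m.f.: n!/(x! y! (n-x-y)!) pw^x pb^y (1-pw-pb)^(n-x-y)
    if x + y > n:
        return 0.0
    c = factorial(n) // (factorial(x) * factorial(y) * factorial(n - x - y))
    return c * pw**x * pb**y * (1 - pw - pb) ** (n - x - y)

pX = [sum(joint(x, y) for y in range(n + 1)) for x in range(n + 1)]
pY = [sum(joint(x, y) for x in range(n + 1)) for y in range(n + 1)]

# joint mass at (3, 3) is 0, yet both marginals put positive mass on 3
print(joint(3, 3), pX[3] * pY[3])  # 0.0 vs a positive number: not independent
```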
17. Let f and g be two p.d.f.s with respective distribution functions F and G. Define h: ℝ² → [0, ∞) as
h(x, y) = [1 + α{2F(x) − 1}{2G(y) − 1}] f(x) g(y),
where α ∈ [−1, 1].
(ii) Show that the marginal p.d.f.s of X and Y are f and g, respectively;
(iii) Does there exist a value of α ∈ [−1, 1] such that X and Y are independent?
18. Let the random vector X = (X₁, X₂, X₃) have the joint p.d.f.
f_X(x₁, x₂, x₃) = (2π)^{−3/2} e^{−(x₁² + x₂² + x₃²)/2} [1 + x₁x₂x₃ e^{−(x₁² + x₂² + x₃²)/2}], xᵢ ∈ ℝ, i = 1, 2, 3.
19. A point X is chosen at random from the interval (0, 1) and then a point Y is chosen at random from the interval (0, X). Compute P({X + Y ≥ 1}) and find the conditional mean E(X | Y = y), y ∈ (0, 1).
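Conditioning on X gives P({X + Y ≥ 1}) = ∫_{1/2}^{1} (2 − 1/x) dx = 1 − ln 2 ≈ 0.307; a Monte Carlo sketch of the same quantity:

```python
import random
from math import log

random.seed(0)
n = 200_000
hits = 0
for _ in range(n):
    x = random.random()      # X ~ U(0, 1)
    y = x * random.random()  # given X = x, Y ~ U(0, x)
    if x + y >= 1:
        hits += 1

est = hits / n
print(est, 1 - log(2))  # estimate vs the exact value ≈ 0.3069
```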
20. With the help of a counterexample, show that if the random variables X and Y are uncorrelated, then this does not, in general, imply that X and Y are independent.
21. Let the random vector (X₁, X₂) be such that the marginal p.d.f. of X₁ is
f_{X₁}(x₁) = 4x₁(1 − x₁²), if 0 < x₁ < 1; 0, otherwise,
and, for fixed x₁ ∈ (0, 1), the conditional p.d.f. of X₂ given X₁ = x₁ is
f_{X₂|X₁}(x₂ | x₁) = 2x₂/(1 − x₁²), if x₁ < x₂ < 1; 0, otherwise.
(i) For x₂ ∈ (0, 1), find the conditional p.d.f. of X₁ given X₂ = x₂;
(ii) Find E(X₁ | X₂ = x₂) and Var(X₁ | X₂ = x₂);
(iii) Find P({0 < X₁ < 1/3}) and P({1/3 < X₂ < 2/3} | X₁ = 1/3).
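Multiplying the marginal of X₁ by the conditional of X₂ gives the joint p.d.f. f(x₁, x₂) = 8x₁x₂ on 0 < x₁ < x₂ < 1, and the conditional p.d.f. in (i) then works out to 2x₁/x₂² on 0 < x₁ < x₂; a numerical-integration sketch of those two facts:

```python
def joint(x1, x2):
    # f_{X1}(x1) * f_{X2|X1}(x2|x1) = 4*x1*(1 - x1**2) * 2*x2/(1 - x1**2)
    return 8 * x1 * x2 if 0 < x1 < x2 < 1 else 0.0

def marginal_X2(x2, m=20_000):
    # midpoint-rule integral over x1 in (0, x2); candidate closed form: 4*x2**3
    h = x2 / m
    return sum(joint((i + 0.5) * h, x2) for i in range(m)) * h

x2 = 0.7
print(marginal_X2(x2), 4 * x2**3)  # both ≈ 1.372

h = x2 / 20_000
mean = sum((i + 0.5) * h * joint((i + 0.5) * h, x2)
           for i in range(20_000)) * h / marginal_X2(x2)
print(mean, 2 * x2 / 3)  # E(X1 | X2 = 0.7) ≈ 0.4667 either way
```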
23. Let (X, Y) be a random vector with joint p.m.f. given by:
y↓ \ x→  1    2    3    4
4       .08  .11  .09  .03
5       .04  .12  .21  .05
6       .09  .06  .08  .04
24. Let X₁, …, Xₙ be n random variables with E(Xᵢ) = μᵢ, Var(Xᵢ) = σᵢ² and ρᵢⱼ = Corr(Xᵢ, Xⱼ), i, j = 1, …, n, i ≠ j. For real numbers aᵢ, bᵢ, i = 1, …, n, define U = Σᵢ₌₁ⁿ aᵢXᵢ and V = Σᵢ₌₁ⁿ bᵢXᵢ. Find Cov(U, V).
25. Let X₁, X₂ and X₃ be three independent random variables, each with variance σ². Define the random variables
√3 − 1 3 − √3
26. Let X and Y be jointly distributed random variables with E(X) = E(Y) = 0, E(X²) = E(Y²) = 2 and Corr(X, Y) = 1/3. Find Corr(X/3 + Y, X + Y/3).
and f_{X,Y}(x, y) = 0, elsewhere. Find ρ = Corr(X, Y).
28. Let X₁, X₂ and X₃ be three random variables with means, variances and correlation coefficients denoted by μ₁, μ₂, μ₃; σ₁², σ₂², σ₃² and ρ₁₂, ρ₁₃, ρ₂₃, respectively. If E(X₁ − μ₁ | X₂ = x₂, X₃ = x₃) = b₂(x₂ − μ₂) + b₃(x₃ − μ₃), for some constants b₂ and b₃, determine b₂ and b₃ in terms of the variances and correlation coefficients.
29. Let X₁, …, Xₙ denote a random sample, where X₁, …, Xₙ are positive with probability one. Show that
E[(X₁ + X₂ + ⋯ + X_k)/(X₁ + X₂ + ⋯ + Xₙ)] = k/n, k = 1, 2, …, n.
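The identity rests on symmetry: each E(Xᵢ/(X₁ + ⋯ + Xₙ)) is the same and the n of them sum to 1. A Monte Carlo illustration with exponential samples (any positive i.i.d. choice would do):

```python
import random

random.seed(1)
n, k, reps = 5, 2, 100_000
total = 0.0
for _ in range(reps):
    xs = [random.expovariate(1.0) for _ in range(n)]
    total += sum(xs[:k]) / sum(xs)

print(total / reps, k / n)  # estimate ≈ 0.4 = k/n
```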
28. Let X₁, …, Xₙ be a random sample of absolutely continuous type random variables. If the expectation of X₁ is finite and the distribution of X₁ is symmetric about μ ∈ (−∞, ∞), then show that
30. Let X₁ and X₂ be i.i.d. N(0, 1) random variables and let Y₁ = X₁ + X₂, Y₂ = X₁ − X₂.
31. Suppose that the lifetimes of electric bulbs manufactured by a manufacturer follow an exponential distribution with a mean of 50 hours. Eight such bulbs are chosen at random.
(i) Find the probability that, among the eight chosen bulbs, 2 will last less than 40 hours, 3 will last anywhere between 40 and 60 hours, 2 will last anywhere between 60 and 80 hours and 1 will last for more than 80 hours;
(ii) Find the expected number of bulbs in the lot of 8 chosen bulbs with lifetime between 60 and 80 hours;
(iii) Find the expected number of bulbs in the lot of 8 chosen bulbs with lifetime between 60 and 80 hours, given that the number of bulbs in the lot with lifetime anywhere between 40 and 60 hours is 2.
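All three parts reduce to multinomial band probabilities p₁ = P(T < 40), …, p₄ = P(T > 80) for an exponential lifetime T with mean 50; a computational sketch (part (iii) uses that, given 2 bulbs in the 40–60 band, the other 6 land in the remaining bands with renormalized probabilities):

```python
from math import exp, factorial

p1 = 1 - exp(-0.8)          # P(T < 40)
p2 = exp(-0.8) - exp(-1.2)  # P(40 < T < 60)
p3 = exp(-1.2) - exp(-1.6)  # P(60 < T < 80)
p4 = exp(-1.6)              # P(T > 80)

# (i) multinomial probability of counts (2, 3, 2, 1) among 8 bulbs
coeff = factorial(8) // (factorial(2) * factorial(3) * factorial(2) * factorial(1))
prob = coeff * p1**2 * p2**3 * p3**2 * p4
print(prob)

# (ii) expected number with lifetime in (60, 80)
print(8 * p3)  # ≈ 0.79

# (iii) conditional expectation given exactly 2 bulbs in the (40, 60) band
print(6 * p3 / (1 - p2))  # ≈ 0.70
```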
32. Suppose that X ~ Mult(30; p₁, p₂, p₃, p₄). Find the conditional probability mass function of (X₁, X₂, X₃, X₄) given that X₁ + X₂ + X₃ + X₄ = 28.
34. Let X = (X₁, X₂) ~ N₂(μ₁, μ₂, σ₁², σ₂², ρ) and, for real constants a₁, a₂, a₃ and a₄ (aᵢ ≠ 0, i = 1, 2, 3, 4, a₁a₄ ≠ a₂a₃), let U = a₁X₁ + a₂X₂ and V = a₃X₁ + a₄X₂.
(i) Find the joint p.d.f. of (U, V);
(ii) Find the marginal p.d.f.s of U and V.
Let the random vector (X₁, X₂) have the joint p.d.f.
f(x, y) = (1/π) e^{−(x² + y²)/2}, if xy > 0; 0, otherwise.
Show that Xᵢ ∼ N(0, 1), i = 1, 2, but X = (X₁, X₂) does not have a bivariate normal distribution.
38. For fixed ρ ∈ (−1, 1) and α ∈ (0, 1), let the random vector (X, Y) have the joint p.d.f.
f(x, y) = α φ_ρ(x, y) + (1 − α) φ_{−ρ}(x, y),
where φ_δ(⋅,⋅), −1 < δ < 1, denotes the p.d.f. of N₂(0, 0, 1, 1, δ). Show that X and Y are normally distributed but the distribution of (X, Y) is not bivariate normal.
41. (i) Let (X, Y) ~ N₂(5, 8, 16, 9, 0.6). Find P({5 < Y < 11} | X = 2), P({4 < X < 6}) and P({7 < Y < 9});
(ii) Let (X, Y) ~ N₂(5, 10, 1, 25, ρ), where ρ > 0. If P({4 < Y < 16} | X = 5) = 0.954, determine ρ.
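These probabilities follow from the usual conditional-normal formulas (Y | X = x is normal with mean μ₂ + ρ(σ₂/σ₁)(x − μ₁) and variance σ₂²(1 − ρ²)); a numerical sketch using the error function:

```python
from math import erf, sqrt

def Phi(z):
    # standard normal c.d.f.
    return 0.5 * (1 + erf(z / sqrt(2)))

# (i) (X, Y) ~ N2(5, 8, 16, 9, 0.6): Y | X = 2 is normal with
# mean 8 + 0.6*(3/4)*(2 - 5) = 6.65 and variance 9*(1 - 0.36) = 5.76
m, s = 6.65, sqrt(5.76)
print(Phi((11 - m) / s) - Phi((5 - m) / s))  # P({5 < Y < 11} | X = 2)
print(Phi(0.25) - Phi(-0.25))                # P({4 < X < 6}), X ~ N(5, 16)
print(Phi(1 / 3) - Phi(-1 / 3))              # P({7 < Y < 9}), Y ~ N(8, 9)

# (ii) Y | X = 5 ~ N(10, 25*(1 - rho^2)); P({4 < Y < 16} | X = 5) = 0.954
# forces 6/(5*sqrt(1 - rho^2)) = 2, i.e. rho = 0.8
rho = 0.8
print(Phi(2) - Phi(-2))  # ≈ 0.954, consistent with rho = 0.8
```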
42. (i) Let X ~ Bin(n₁, p) and Y ~ Bin(n₂, p) be independent random variables. For t ∈ {0, 1, …, min(n₁, n₂)}, find the conditional distribution and conditional mean of X given X + Y = t.
(ii) Let X ~ P(λ₁) and Y ~ P(λ₂) be independent random variables. For t ∈ {0, 1, …}, find the conditional distribution and conditional mean of X given X + Y = t.
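In part (i) the p-terms cancel, leaving a hypergeometric conditional law with mean t·n₁/(n₁ + n₂); a numerical check with the arbitrary values n₁ = 6, n₂ = 4, p = 0.3, t = 5:

```python
from math import comb

n1, n2, p, t = 6, 4, 0.3, 5

def cond(j):
    # P(X = j | X + Y = t) computed from the two independent binomial p.m.f.s
    num = (comb(n1, j) * p**j * (1 - p) ** (n1 - j)
           * comb(n2, t - j) * p ** (t - j) * (1 - p) ** (n2 - t + j))
    den = comb(n1 + n2, t) * p**t * (1 - p) ** (n1 + n2 - t)
    return num / den

support = range(max(0, t - n2), min(n1, t) + 1)
for j in support:
    # the p's cancel: hypergeometric probabilities, free of p
    assert abs(cond(j) - comb(n1, j) * comb(n2, t - j) / comb(n1 + n2, t)) < 1e-12

mean = sum(j * cond(j) for j in support)
print(mean, t * n1 / (n1 + n2))  # conditional mean = t*n1/(n1+n2) = 3.0
```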
and
f_Y(y) = e^{−(y−2)}, if y ≥ 2; 0, otherwise.
44. Let X and Y be i.i.d. U(0, 1) random variables. Find the marginal p.d.f.s of
(i) X + Y, X − Y, XY, X/Y;
(ii) min(X, Y), max(X, Y), min(X, Y)/max(X, Y);
(iii) X² + Y².
46. Let X and Y be i.i.d. random variables with common p.d.f. f_X(x), −∞ < x < ∞,
47. Let X and Y be i.i.d. N(0, 1) random variables. Define the random variables R and Θ by X = R cos Θ and Y = R sin Θ.
(i) Show that R and Θ are independent, with R²/2 ~ Exp(1) and Θ ~ U(0, 2π);
(ii) Show that X² + Y² and X/Y are independently distributed;
(iii) Show that sin Θ and sin 2Θ are identically distributed and hence find the p.d.f. of W = 2XY/√(X² + Y²).
48. Let U₁ and U₂ be i.i.d. U(0, 1) random variables. Show that X = √(−2 ln U₁) cos(2πU₂) and Y = √(−2 ln U₁) sin(2πU₂) are i.i.d. N(0, 1) random variables. (This is known as the Box–Muller transformation.)
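A quick empirical check of the transformation (first two moments and the cross moment of a large simulated sample):

```python
import random
from math import cos, sin, log, pi, sqrt

random.seed(2)

def box_muller():
    u1 = 1 - random.random()  # in (0, 1], so log(u1) is defined
    u2 = random.random()
    r = sqrt(-2 * log(u1))
    return r * cos(2 * pi * u2), r * sin(2 * pi * u2)

n = 100_000
pairs = [box_muller() for _ in range(n)]
mx = sum(x for x, _ in pairs) / n
vx = sum(x * x for x, _ in pairs) / n - mx * mx
cxy = sum(x * y for x, y in pairs) / n
print(round(mx, 3), round(vx, 3), round(cxy, 3))  # ≈ 0, 1, 0
```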
Yᵢ = Xᵢ / (X₁ + ⋯ + Xₙ), i = 1, …, n − 1, and Yₙ = X₁ + ⋯ + Xₙ;
(ii) Are Y₁, …, Yₙ independent?
(i) Show that X₁ and (X₂, X₃) are independent and find the marginal p.d.f.s of X₁, X₂ and X₃;
52. Let X₁ and X₂ be independent random variables with Xᵢ ~ Bin(nᵢ, 1/2), i = 1, 2. Using the m.g.f. technique, find the distribution of Y = X₁ − X₂ + n₂.
53. Let X_{1:n} ≤ X_{2:n} ≤ ⋯ ≤ X_{n:n} be the set of order statistics associated with a random sample of size n (≥ 2) from the Exp(1) distribution.
(i) Let Y₁ = nX_{1:n} and Yᵢ = (n − i + 1)(X_{i:n} − X_{i−1:n}), i = 2, …, n. Show that Y₁, …, Yₙ are i.i.d. Exp(1) random variables;
(ii) Using (i), or otherwise, find E(X_{r:n}), Var(X_{r:n}) and Cov(X_{r:n}, X_{s:n}), 1 ≤ r < s ≤ n;
(iii) Show that X_{r:n} and X_{s:n} − X_{r:n} are independent for any s > r;
(iv) Find the p.d.f. of X_{i:n} − X_{i−1:n}, i = 1, 2, …, n.
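Part (i) is the Rényi representation X_{r:n} = Σ_{i=1}^{r} Yᵢ/(n − i + 1), which gives E(X_{r:n}) = Σ_{i=1}^{r} 1/(n − i + 1); a Monte Carlo sketch for n = 6, r = 3:

```python
import random

random.seed(3)
n, r, reps = 6, 3, 100_000
tot = 0.0
for _ in range(reps):
    xs = sorted(random.expovariate(1.0) for _ in range(n))
    tot += xs[r - 1]  # the r-th order statistic

exact = sum(1 / (n - i + 1) for i in range(1, r + 1))  # 1/6 + 1/5 + 1/4
print(tot / reps, exact)  # both ≈ 0.617
```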
54. Let X₁, …, Xₙ be i.i.d. non-negative random variables (P({X₁ ≥ 0}) = 1) of the absolutely continuous type. If E(|X₁|) < ∞ and Zₙ = max(X₁, …, Xₙ), show that
55. Let X_{1:n} ≤ X_{2:n} ≤ ⋯ ≤ X_{n:n} be the order statistics associated with a random sample of size n (≥ 2) from the U(0, 1) distribution. Let Yᵢ = X_{i:n}/X_{i+1:n}, i = 1, …, n − 1, and Yₙ = X_{n:n}. Show that Y₁, …, Yₙ are independent and find the p.d.f. of Yᵢ, i = 1, …, n.
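One can check that the densities to be found here are consistent with f_{Yᵢ}(y) = i·y^{i−1} on (0, 1), which would give E(Y₁) = 1/2 and E(Y₂) = 2/3; a Monte Carlo sketch of those two moments:

```python
import random

random.seed(4)
n, reps = 4, 100_000
s1 = s2 = 0.0
for _ in range(reps):
    xs = sorted(random.random() for _ in range(n))
    s1 += xs[0] / xs[1]  # Y1 = X_{1:n} / X_{2:n}
    s2 += xs[1] / xs[2]  # Y2 = X_{2:n} / X_{3:n}

print(s1 / reps, s2 / reps)  # ≈ 0.5 and ≈ 0.667
```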