Part 6

MSO201

Module 6

Random Vector and Its Joint Distribution


1. Multivariate Distributions
A (univariate) random variable describes a single numerical characteristic of a typical outcome of a random experiment. In many situations we may be interested in simultaneously studying two or more numerical characteristics of outcomes of a random experiment. To make this discussion concrete, consider the following example.

Example 1.1
Two distinguishable dice (labeled $A$ and $B$) are thrown simultaneously. Here the sample space is $\Omega = \{(i, j): i, j \in \{1, \dots, 6\}\}$, where the outcome $(i, j) \in \Omega$ indicates that $i$ dots are observed on the uppermost face of die $A$ and $j$ dots are observed on the uppermost face of die $B$. For $(i, j) \in \Omega$, define

$$X_1(i, j) = i + j = \text{sum of the numbers of dots on the uppermost faces of the two dice}$$

and

$$X_2(i, j) = |i - j| = \text{absolute difference of the numbers of dots on the uppermost faces of the two dice}.$$

It may be of interest to study the numerical characteristics $X_1$ and $X_2$ simultaneously. This amounts to the study of the function $X = (X_1, X_2): \Omega \to \mathbb{R}^2$ defined on the sample space $\Omega$. ▄
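The joint distribution of $(X_1, X_2)$ in Example 1.1 can be obtained by enumerating the 36 equally likely outcomes. A minimal sketch in Python (the variable names are ours):

```python
from fractions import Fraction
from itertools import product

# Enumerate the 36 equally likely outcomes (i, j) of the two dice and
# tabulate the induced joint p.m.f. of X = (X1, X2) = (i + j, |i - j|).
pmf = {}
for i, j in product(range(1, 7), repeat=2):
    x = (i + j, abs(i - j))
    pmf[x] = pmf.get(x, Fraction(0)) + Fraction(1, 36)

print(pmf[(7, 1)])        # outcomes (3,4) and (4,3), so 2/36 = 1/18
print(sum(pmf.values()))  # total mass 1
```

Note that the induced probability mass sits on only 23 pairs $(x_1, x_2)$, not on all of $\{2,\dots,12\} \times \{0,\dots,5\}$.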

Throughout, $\mathbb{R}^n = \{x = (x_1, \dots, x_n): -\infty < x_i < \infty,\ i = 1, \dots, n\}$ will denote the $n$-dimensional Euclidean space and, for a set $B \subseteq \mathbb{R}^n$ and a function $X = (X_1, X_2, \dots, X_n): \Omega \to \mathbb{R}^n$,

$$X^{-1}(B) \stackrel{\text{def}}{=} \{\omega \in \Omega: X(\omega) = (X_1(\omega), X_2(\omega), \dots, X_n(\omega)) \in B\}.$$

Let $(\Omega, \mathcal{F}, P)$ be a given probability space.

Definition 1.1

A function $X = (X_1, \dots, X_n): \Omega \to \mathbb{R}^n$ is called an $n$-dimensional random vector (or simply a random vector) if $X^{-1}((-\infty, a]) \in \mathcal{F},\ \forall\, a = (a_1, \dots, a_n) \in \mathbb{R}^n$; here $(-\infty, a] = (-\infty, a_1] \times \cdots \times (-\infty, a_n]$. ▄

A $1$-dimensional random vector will simply be referred to as a random variable. Clearly, a function $X = (X_1, \dots, X_n): \Omega \to \mathbb{R}^n$ is a random vector if, and only if,

$$\{\omega \in \Omega: X_1(\omega) \le a_1, \dots, X_n(\omega) \le a_n\} \in \mathcal{F},\ \forall\, a = (a_1, \dots, a_n) \in \mathbb{R}^n.$$

For $a = (a_1, \dots, a_n) \in \mathbb{R}^n$, $b = (b_1, \dots, b_n) \in \mathbb{R}^n$ with $a_i < b_i,\ i = 1, \dots, n$, define

$$(a, b) = (a_1, b_1) \times \cdots \times (a_n, b_n) \equiv \prod_{i=1}^{n} (a_i, b_i),$$

$$(a, b] = (a_1, b_1] \times \cdots \times (a_n, b_n] \equiv \prod_{i=1}^{n} (a_i, b_i],$$

$$[a, b) = [a_1, b_1) \times \cdots \times [a_n, b_n) \equiv \prod_{i=1}^{n} [a_i, b_i),$$

$$[a, b] = [a_1, b_1] \times \cdots \times [a_n, b_n] \equiv \prod_{i=1}^{n} [a_i, b_i],$$

$$(-\infty, b) = (-\infty, b_1) \times \cdots \times (-\infty, b_n) \equiv \prod_{i=1}^{n} (-\infty, b_i),$$

$$(a, \infty) = (a_1, \infty) \times \cdots \times (a_n, \infty) \equiv \prod_{i=1}^{n} (a_i, \infty),$$

and

$$[a, \infty) = [a_1, \infty) \times \cdots \times [a_n, \infty) \equiv \prod_{i=1}^{n} [a_i, \infty).$$

Further define

$\mathcal{D}_0 = \{(-\infty, b]: b \in \mathbb{R}^n\}$,

$\mathcal{D}_1 = \{(a, b): a, b \in \mathbb{R}^n, a_i < b_i, i = 1, \dots, n\}$,

$\mathcal{D}_2 = \{(a, b]: a, b \in \mathbb{R}^n, a_i < b_i, i = 1, \dots, n\}$,

$\mathcal{D}_3 = \{[a, b): a, b \in \mathbb{R}^n, a_i < b_i, i = 1, \dots, n\}$,

$\mathcal{D}_4 = \{[a, b]: a, b \in \mathbb{R}^n, a_i < b_i, i = 1, \dots, n\}$,

$\mathcal{D}_5 = \{(-\infty, b): b \in \mathbb{R}^n\}$,

$\mathcal{D}_6 = \{(a, \infty): a \in \mathbb{R}^n\}$,

and

$\mathcal{D}_7 = \{[a, \infty): a \in \mathbb{R}^n\}$.

As in the case $n = 1$ it can be shown that:

(i) $\mathcal{B}_n$ = the Borel $\sigma$-field in $\mathbb{R}^n$ = $\sigma(\mathcal{D}_i)$, $i = 0, 1, \dots, 7$;

(ii) $\{a\} \in \mathcal{B}_n, \forall\, a \in \mathbb{R}^n$, i.e., $\mathcal{B}_n$ contains all singleton subsets of $\mathbb{R}^n$;

(iii) if $B \subseteq \mathbb{R}^n$ is countable then $B \in \mathcal{B}_n$;

(iv) there exists a set $A \subseteq \mathbb{R}^n$ such that $A \notin \mathcal{B}_n$;

(v) $X: \Omega \to \mathbb{R}^n$ is an $n$-dimensional random vector if, and only if, one of the following equivalent conditions holds:

a) $X^{-1}(B) \in \mathcal{F}, \forall B \in \mathcal{D}_1$;

b) $X^{-1}(B) \in \mathcal{F}, \forall B \in \mathcal{D}_2$;

c) $X^{-1}(B) \in \mathcal{F}, \forall B \in \mathcal{D}_3$;

d) $X^{-1}(B) \in \mathcal{F}, \forall B \in \mathcal{D}_4$;

e) $X^{-1}(B) \in \mathcal{F}, \forall B \in \mathcal{D}_5$;

f) $X^{-1}(B) \in \mathcal{F}, \forall B \in \mathcal{D}_6$;

g) $X^{-1}(B) \in \mathcal{F}, \forall B \in \mathcal{D}_7$;

h) $X^{-1}(B) \in \mathcal{F}, \forall B \in \mathcal{B}_n$;

(vi) if $X = (X_1, \dots, X_n)$ is an $n$-dimensional random vector and $g_i: \mathbb{R}^n \to \mathbb{R}$, $i = 1, \dots, k$, are Borel functions (i.e., $g_i^{-1}(B) \in \mathcal{B}_n, \forall B \in \mathcal{B}_1$, $i = 1, \dots, k$) then $(g_1(X), \dots, g_k(X))$ is a $k$-dimensional random vector;

(vii) if $X: \Omega \to \mathbb{R}^n$ is an $n$-dimensional random vector then

$$X^{-1}(\{a\}) = \{\omega \in \Omega: X_1(\omega) = a_1, \dots, X_n(\omega) = a_n\} \in \mathcal{F}, \forall\, a = (a_1, \dots, a_n) \in \mathbb{R}^n;$$

(viii) the function $P_X: \mathcal{B}_n \to \mathbb{R}$ given by

$$P_X(B) = P(X^{-1}(B)), \quad B \in \mathcal{B}_n,$$

is a probability measure on $\mathcal{B}_n$ (i.e., $(\mathbb{R}^n, \mathcal{B}_n, P_X)$ is a probability space), called the probability measure induced by $X$.

Example 1.2
Let $A, B \subseteq \Omega$. Define $X = (X_1, X_2): \Omega \to \mathbb{R}^2$ by

$$X_1(\omega) = I_A(\omega) = \begin{cases} 1, & \text{if } \omega \in A \\ 0, & \text{if } \omega \notin A \end{cases}$$

and

$$X_2(\omega) = I_B(\omega) = \begin{cases} 1, & \text{if } \omega \in B \\ 0, & \text{if } \omega \notin B \end{cases}.$$

Then, for $a = (a_1, a_2) \in \mathbb{R}^2$,

$$X^{-1}((-\infty, a]) = \{\omega \in \Omega: X_1(\omega) \le a_1, X_2(\omega) \le a_2\} = \begin{cases} \emptyset, & \text{if } a_1 < 0 \text{ or } a_2 < 0 \\ A^c \cap B^c, & \text{if } 0 \le a_1 < 1, 0 \le a_2 < 1 \\ A^c, & \text{if } 0 \le a_1 < 1, a_2 \ge 1 \\ B^c, & \text{if } a_1 \ge 1, 0 \le a_2 < 1 \\ \Omega, & \text{if } a_1 \ge 1, a_2 \ge 1 \end{cases}.$$

Thus

$$X \text{ is a random vector} \iff X^{-1}((-\infty, a]) \in \mathcal{F}, \forall\, a \in \mathbb{R}^2 \iff A^c, B^c \in \mathcal{F} \iff A, B \in \mathcal{F}.$$

Thus $X$ is a random vector if, and only if, $A, B \in \mathcal{F}$. ▄

Theorem 1.1

Let $X = (X_1, \dots, X_n): \Omega \to \mathbb{R}^n$ be a given function. Then $X$ is a random vector if, and only if, $X_i: \Omega \to \mathbb{R}$, $i = 1, \dots, n$, are random variables.

Proof. First suppose that $X = (X_1, \dots, X_n)$ is a random vector. Then, for $a \in \mathbb{R}$ and for fixed $i \in \{1, \dots, n\}$,

$$X_i^{-1}((-\infty, a]) = \bigcup_{m=1}^{\infty} \underbrace{X^{-1}\big((-\infty, m] \times \cdots \times (-\infty, m] \times (-\infty, a] \times (-\infty, m] \times \cdots \times (-\infty, m]\big)}_{\in\, \mathcal{F},\ \forall\, m = 1, 2, \dots} \in \mathcal{F}$$

(with $(-\infty, a]$ in the $i$-th place), i.e., $X_i$ is a random variable.

Conversely suppose that $X_1, \dots, X_n$ are random variables. Then, for $a = (a_1, \dots, a_n) \in \mathbb{R}^n$,

$$X^{-1}((-\infty, a]) = \{\omega \in \Omega: X_i(\omega) \le a_i, i = 1, \dots, n\} = \bigcap_{i=1}^{n} \{\omega \in \Omega: X_i(\omega) \le a_i\} = \bigcap_{i=1}^{n} \underbrace{X_i^{-1}((-\infty, a_i])}_{\in\, \mathcal{F}} \in \mathcal{F},$$

i.e., $X$ is a random vector. ▄

Remark 1.1

If $\Omega$ is countable then we take $\mathcal{F} = \mathcal{P}(\Omega)$ (the power set of $\Omega$) and, therefore, any function $X = (X_1, \dots, X_n): \Omega \to \mathbb{R}^n$ is a random vector.

Definition 1.2

(i) The joint distribution function of an $n$-dimensional random vector $X = (X_1, \dots, X_n): \Omega \to \mathbb{R}^n$ is defined by

$$F_X(x_1, \dots, x_n) = P(\{\omega \in \Omega: X_1(\omega) \le x_1, \dots, X_n(\omega) \le x_n\}), \quad x = (x_1, \dots, x_n) \in \mathbb{R}^n.$$

(ii) The joint distribution function of any subset of the random variables $X_1, \dots, X_n$ is called a marginal distribution function of $F_X(\cdot)$. ▄

Remark 1.2
(i) If $F_X(\cdot)$ is the distribution function of an $n$-dimensional random vector $X = (X_1, \dots, X_n)$ then

$$F_X(x) = P(\{X_i \le x_i, i = 1, \dots, n\}) = P\big(X^{-1}((-\infty, x])\big) = P_X((-\infty, x]) = P\Big(\bigcap_{i=1}^{n} \{X_i \le x_i\}\Big) = P\Big(\bigcap_{i=1}^{n} X_i^{-1}((-\infty, x_i])\Big), \quad x = (x_1, \dots, x_n) \in \mathbb{R}^n.$$

(ii) Let $F_{X_1, \dots, X_n}(\cdot)$ be the distribution function of a random vector $X = (X_1, \dots, X_n)$ and let $\beta = (\beta_1, \dots, \beta_n)$ be a permutation of $(1, \dots, n)$. Then

$$F_{X_1, \dots, X_n}(x_1, \dots, x_n) = P\Big(\bigcap_{i=1}^{n} \{X_i \le x_i\}\Big) = P\Big(\bigcap_{i=1}^{n} \{X_{\beta_i} \le x_{\beta_i}\}\Big) = F_{X_{\beta_1}, \dots, X_{\beta_n}}(x_{\beta_1}, \dots, x_{\beta_n}), \quad x = (x_1, \dots, x_n) \in \mathbb{R}^n.$$

It follows that the distribution function of $(X_{\beta_1}, \dots, X_{\beta_n})$ is given by

$$F_{X_{\beta_1}, \dots, X_{\beta_n}}(y_1, \dots, y_n) = F_{X_1, \dots, X_n}(y_{\gamma_1}, \dots, y_{\gamma_n}), \quad y = (y_1, \dots, y_n) \in \mathbb{R}^n,$$

where $\gamma = (\gamma_1, \dots, \gamma_n)$ is the inverse permutation of $\beta = (\beta_1, \dots, \beta_n)$. To illustrate this point, consider $n = 3$ and $\beta = (\beta_1, \beta_2, \beta_3) = (2, 3, 1)$. Then the inverse permutation of $\beta = (\beta_1, \beta_2, \beta_3)$ is $\gamma = (\gamma_1, \gamma_2, \gamma_3) = (3, 1, 2)$ and therefore, for $y = (y_1, y_2, y_3) \in \mathbb{R}^3$,

$$F_{X_{\beta_1}, X_{\beta_2}, X_{\beta_3}}(y_1, y_2, y_3) = F_{X_2, X_3, X_1}(y_1, y_2, y_3) = P(\{X_2 \le y_1, X_3 \le y_2, X_1 \le y_3\}) = P(\{X_1 \le y_3, X_2 \le y_1, X_3 \le y_2\}) = F_{X_1, X_2, X_3}(y_3, y_1, y_2) = F_{X_1, X_2, X_3}(y_{\gamma_1}, y_{\gamma_2}, y_{\gamma_3}).$$

(iii) Note that a distribution function $F_{X_1, \dots, X_n}(x_1, \dots, x_n)$ is non-decreasing in each argument when the other arguments are kept fixed.

We recall the following results from the theory of multivariable calculus.

Lemma 1.1
Let $D \subseteq \mathbb{R}^n$ and let $g: D \to \mathbb{R}$ be a function such that:

(i) $g$ is bounded above, i.e., there exists a real constant $c$ such that $g(x) \le c, \forall\, x \in D$;
(ii) for every fixed $i \in \{1, \dots, n\}$ and fixed $(x_1, \dots, x_{i-1}, x_{i+1}, \dots, x_n) \in \mathbb{R}^{n-1}$, $g(x_1, \dots, x_{i-1}, t, x_{i+1}, \dots, x_n)$ is non-decreasing in $t \in D_i = \{y \in \mathbb{R}: (x_1, \dots, x_{i-1}, y, x_{i+1}, \dots, x_n) \in D\}$.

Then $\lim_{x \to \infty} g(x)$ exists and, for any permutation $(\beta_1, \dots, \beta_n)$ of $(1, \dots, n)$,

$$\lim_{x_{\beta_n} \to \infty} \cdots \lim_{x_{\beta_1} \to \infty} g(x_1, \dots, x_n) = \lim_{x \to \infty} g(x).$$

In particular all iterated limits

$$\lim_{x_{\beta_n} \to \infty} \cdots \lim_{x_{\beta_1} \to \infty} g(x_1, \dots, x_n), \quad (\beta_1, \dots, \beta_n) \in \mathcal{S}_n,$$

exist and are equal, where $\mathcal{S}_n$ denotes the set of all permutations of $(1, \dots, n)$. We denote the common value of all iterated limits by

$$\lim_{\substack{x_i \to \infty \\ i = 1, \dots, n}} g(x). \ ▄$$

Note that if $F_X(\cdot)$ is a distribution function in $\mathbb{R}^n$ $(n \ge 2)$ then, for a fixed $k \in \{1, \dots, n-1\}$ and fixed $(x_{k+1}, \dots, x_n) \in \mathbb{R}^{n-k}$, the function $g: \mathbb{R}^k \to \mathbb{R}$, given by

$$g(x_1, \dots, x_k) = F_X(x_1, \dots, x_k, x_{k+1}, \dots, x_n),$$

satisfies properties (i) and (ii) stated in Lemma 1.1. Therefore, for fixed $(x_{k+1}, \dots, x_n) \in \mathbb{R}^{n-k}$,

$$\lim_{x^* \to \infty} F_X(x_1, \dots, x_k, x_{k+1}, \dots, x_n) = \lim_{\substack{x_i \to \infty \\ i = 1, \dots, k}} F_X(x_1, \dots, x_k, x_{k+1}, \dots, x_n),$$

where $x^* = (x_1, \dots, x_k)$.
Lemma 1.2
Let $F_X(\cdot)$ be the distribution function of an $n$-dimensional $(n \ge 2)$ random vector $X = (X_1, \dots, X_n)$. For a fixed positive integer $k \in \{1, \dots, n-1\}$, let $Y = (X_1, \dots, X_k)$ and let $Z = (X_{k+1}, \dots, X_n)$, so that $X = (Y, Z)$. Then the marginal distribution function of $Y = (X_1, \dots, X_k)$ is given by

$$F_Y(x_1, \dots, x_k) = \lim_{\substack{x_i \to \infty \\ i = k+1, \dots, n}} F_X(x_1, \dots, x_k, x_{k+1}, \dots, x_n), \quad (x_1, \dots, x_k) \in \mathbb{R}^k.$$

Proof. For fixed $(x_1, \dots, x_{n-1}) \in \mathbb{R}^{n-1}$,

$$\lim_{x_n \to \infty} F_X(x_1, \dots, x_n) = \lim_{x_n \to \infty} P\Big(\bigcap_{i=1}^{n} X_i^{-1}((-\infty, x_i])\Big) = \lim_{m \to \infty} P\Big(\underbrace{\Big(\bigcap_{i=1}^{n-1} X_i^{-1}((-\infty, x_i])\Big) \cap X_n^{-1}((-\infty, m])}_{=\, A_m\, \uparrow}\Big)$$

$$= P\Big(\bigcup_{m=1}^{\infty} A_m\Big) = P\Big(\bigcap_{i=1}^{n-1} X_i^{-1}((-\infty, x_i])\Big) \quad \Big(\text{since } \bigcup_{m=1}^{\infty} X_n^{-1}((-\infty, m]) = \Omega\Big)$$

$$= F_{X_1, \dots, X_{n-1}}(x_1, \dots, x_{n-1}). \tag{1.1}$$

Now the assertion follows on recursively using (1.1). ▄

Remark 1.3

Let $X = (X_1, \dots, X_n)$ be a random vector and let $\beta = (\beta_1, \dots, \beta_n) \in \mathcal{S}_n$, the set of all permutations of $(1, \dots, n)$. If $\gamma = (\gamma_1, \dots, \gamma_n)$ is the inverse permutation of $(\beta_1, \dots, \beta_n)$ then, for a fixed $k \in \{1, \dots, n-1\}$, the marginal distribution function of $(X_{\beta_1}, \dots, X_{\beta_k})$ is given by

$$F_{X_{\beta_1}, \dots, X_{\beta_k}}(x_1, \dots, x_k) = \lim_{\substack{x_j \to \infty \\ j = k+1, \dots, n}} F_{X_{\beta_1}, \dots, X_{\beta_n}}(x_1, \dots, x_n) = \lim_{\substack{x_j \to \infty \\ j = k+1, \dots, n}} F_{X_1, \dots, X_n}(x_{\gamma_1}, \dots, x_{\gamma_n}). \ ▄$$

Let $X = (X_1, \dots, X_n)$ be a random vector and let $a = (a_1, \dots, a_n), b = (b_1, \dots, b_n) \in \mathbb{R}^n$. Then

$$P(\{a_1 < X_1 \le b_1\}) = P(\{X_1 \le b_1\}) - P(\{X_1 \le a_1\}) = F_{X_1}(b_1) - F_{X_1}(a_1). \tag{1.2}$$

Also

$$P(\{a_1 < X_1 \le b_1, a_2 < X_2 \le b_2\}) = P(\{a_1 < X_1 \le b_1, X_2 \le b_2\}) - P(\{a_1 < X_1 \le b_1, X_2 \le a_2\})$$

$$= \big[P(\{X_1 \le b_1, X_2 \le b_2\}) - P(\{X_1 \le a_1, X_2 \le b_2\})\big] - \big[P(\{X_1 \le b_1, X_2 \le a_2\}) - P(\{X_1 \le a_1, X_2 \le a_2\})\big]$$

$$= F_{X_1, X_2}(b_1, b_2) - \big[F_{X_1, X_2}(a_1, b_2) + F_{X_1, X_2}(b_1, a_2)\big] + F_{X_1, X_2}(a_1, a_2). \tag{1.3}$$

To write the expression for $P(\{a_i < X_i \le b_i, i = 1, \dots, n\})$ in a closed form define, for $k \in \{0, 1, \dots, n\}$,

$$\Delta_{k,n} \equiv \Delta_{k,n}((a, b]) = \{u \in \mathbb{R}^n: u_i \in \{a_i, b_i\},\ i = 1, \dots, n, \text{ and exactly } k \text{ of } u_1, \dots, u_n \text{ are } a_i\text{'s}\}. \tag{1.4}$$

Note that the set $\Delta_{k,n}$ has $\binom{n}{k}$ elements. From (1.2) and (1.3) we have

$$P(\{a_1 < X_1 \le b_1\}) = F_{X_1}(b_1) - F_{X_1}(a_1) = \sum_{k=0}^{1} (-1)^k \sum_{u \in \Delta_{k,1}} F_{X_1}(u) \tag{1.5}$$

and

$$P(\{a_i < X_i \le b_i, i = 1, 2\}) = \sum_{k=0}^{2} (-1)^k \sum_{(u_1, u_2) \in \Delta_{k,2}} F_{X_1, X_2}(u_1, u_2). \tag{1.6}$$
Lemma 1.3

Let $X = (X_1, \dots, X_n): \Omega \to \mathbb{R}^n$ be a random vector and let $a = (a_1, \dots, a_n), b = (b_1, \dots, b_n) \in \mathbb{R}^n$. Let $\Delta_{k,n} \equiv \Delta_{k,n}((a, b])$, $k = 0, 1, \dots, n$, be as defined in (1.4). Then

$$P(\{a_i < X_i \le b_i, i = 1, \dots, n\}) = \sum_{k=0}^{n} (-1)^k \sum_{u \in \Delta_{k,n}((a, b])} F_X(u). \tag{1.7}$$

Proof. From (1.5) and (1.6) it is clear that the result is true for $n = 1$ and $n = 2$. Now suppose that (1.7) holds for all $n$-dimensional random vectors. For simplicity assume that $P(\{a_{n+1} < X_{n+1} \le b_{n+1}\}) > 0$. Then, for $(X_1, \dots, X_n, X_{n+1}): \Omega \to \mathbb{R}^{n+1}$, $a = (a_1, \dots, a_n) \in \mathbb{R}^n$, $b = (b_1, \dots, b_n) \in \mathbb{R}^n$, $a^* = (a_1, \dots, a_n, a_{n+1}) \in \mathbb{R}^{n+1}$ and $b^* = (b_1, \dots, b_n, b_{n+1}) \in \mathbb{R}^{n+1}$, applying the induction hypothesis under the conditional probability measure $P(\cdot \mid \{a_{n+1} < X_{n+1} \le b_{n+1}\})$,

$$P(\{a_i < X_i \le b_i, i = 1, \dots, n+1\}) = P(\{a_i < X_i \le b_i, i = 1, \dots, n\} \mid \{a_{n+1} < X_{n+1} \le b_{n+1}\})\, P(\{a_{n+1} < X_{n+1} \le b_{n+1}\})$$

$$= \sum_{k=0}^{n} (-1)^k \sum_{u \in \Delta_{k,n}((a, b])} P(\{X_i \le u_i, i = 1, \dots, n\} \mid \{a_{n+1} < X_{n+1} \le b_{n+1}\})\, P(\{a_{n+1} < X_{n+1} \le b_{n+1}\})$$

$$= \sum_{k=0}^{n} (-1)^k \sum_{u \in \Delta_{k,n}((a, b])} P(\{X_i \le u_i, i = 1, \dots, n,\ a_{n+1} < X_{n+1} \le b_{n+1}\})$$

$$= \sum_{k=0}^{n} (-1)^k \sum_{u \in \Delta_{k,n}((a, b])} \big[P(\{X_1 \le u_1, \dots, X_n \le u_n, X_{n+1} \le b_{n+1}\}) - P(\{X_1 \le u_1, \dots, X_n \le u_n, X_{n+1} \le a_{n+1}\})\big].$$

It is easy to verify that

$$\sum_{k=0}^{n} (-1)^k \sum_{u \in \Delta_{k,n}((a, b])} \big[P(\{X_1 \le u_1, \dots, X_n \le u_n, X_{n+1} \le b_{n+1}\}) - P(\{X_1 \le u_1, \dots, X_n \le u_n, X_{n+1} \le a_{n+1}\})\big]$$

$$= \sum_{k=0}^{n+1} (-1)^k \sum_{v \in \Delta_{k,n+1}((a^*, b^*])} F_{X_1, \dots, X_{n+1}}(v_1, \dots, v_{n+1}),$$

and therefore the assertion follows by the principle of mathematical induction. ▄
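The inclusion-exclusion identity of Lemma 1.3 can be checked numerically. For three independent Uniform(0, 1) coordinates the joint df factorizes, and the signed sum of $F_X$ over the $2^3$ vertices of $(a, b]$ must equal $\prod_i (b_i - a_i)$. A sketch under that assumption (all names are ours):

```python
from itertools import product

# Check identity (1.7) for n = 3 independent Uniform(0,1) coordinates,
# whose joint df is F_X(x) = prod_i min(max(x_i, 0), 1).
def F(x):
    p = 1.0
    for xi in x:
        p *= min(max(xi, 0.0), 1.0)
    return p

a, b = (0.1, 0.2, 0.3), (0.5, 0.7, 0.9)
signed = 0.0
for choice in product([0, 1], repeat=3):   # choice[i] = 1 means use a_i
    u = tuple(a[i] if c else b[i] for i, c in enumerate(choice))
    signed += (-1) ** sum(choice) * F(u)   # sign (-1)^k, k = #{a_i's}

direct = 1.0
for ai, bi in zip(a, b):
    direct *= bi - ai    # P(a_i < U_i <= b_i) for independent uniforms
print(abs(signed - direct) < 1e-12)
```

Here the signed vertex sum and the direct rectangle probability both equal $0.4 \cdot 0.5 \cdot 0.6 = 0.12$.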

Theorem 1.2

Let $F_X(\cdot)$ be the distribution function of an $n$-dimensional random vector $X = (X_1, \dots, X_n)$. Then:

(i) $\displaystyle\lim_{\substack{x_i \to \infty \\ i = 1, \dots, n}} F_X(x_1, \dots, x_n) = 1$;

(ii) for each fixed $i \in \{1, \dots, n\}$ and fixed $(x_1, \dots, x_{i-1}, x_{i+1}, \dots, x_n) \in \mathbb{R}^{n-1}$,

$$\lim_{t \to -\infty} F_X(x_1, \dots, x_{i-1}, t, x_{i+1}, \dots, x_n) = 0;$$

(iii) $F_X(x_1, \dots, x_n)$ is right continuous in each argument (keeping the other arguments fixed);

(iv) for each rectangle $(a, b] \subseteq \mathbb{R}^n$,

$$\sum_{k=0}^{n} (-1)^k \sum_{u \in \Delta_{k,n}((a, b])} F_X(u) \ge 0.$$

Proof. Note that, for $(a, b] \subseteq \mathbb{R}^n$,

$$\sum_{k=0}^{n} (-1)^k \sum_{u \in \Delta_{k,n}((a, b])} F_X(u) = P(\{X \in (a, b]\}) \ge 0 \quad \text{(using Lemma 1.3)}.$$

This proves (iv).

For notational convenience we will provide the proofs of (i)-(iii) only for $n = 2$.

(i) For fixed $x_1 \in \mathbb{R}$,

$$\lim_{x_2 \to \infty} F_{X_1, X_2}(x_1, x_2) = \lim_{m \to \infty} F_{X_1, X_2}(x_1, m) = \lim_{m \to \infty} P\big(\underbrace{X_1^{-1}((-\infty, x_1]) \cap X_2^{-1}((-\infty, m])}_{=\, A_m\, \uparrow}\big) = P\Big(\bigcup_{m=1}^{\infty} A_m\Big) = P\big(X_1^{-1}((-\infty, x_1])\big).$$

Therefore,

$$\lim_{x_1 \to \infty} \lim_{x_2 \to \infty} F_{X_1, X_2}(x_1, x_2) = \lim_{x_1 \to \infty} P\big(X_1^{-1}((-\infty, x_1])\big) = \lim_{m \to \infty} P\big(\underbrace{X_1^{-1}((-\infty, m])}_{=\, B_m\, \uparrow}\big) = P\Big(\bigcup_{m=1}^{\infty} B_m\Big) = P(\Omega) = 1.$$

(ii) Fix $x_2 \in \mathbb{R}$. Then

$$\lim_{x_1 \to -\infty} F_{X_1, X_2}(x_1, x_2) = \lim_{m \to \infty} P\big(\underbrace{X_1^{-1}((-\infty, -m]) \cap X_2^{-1}((-\infty, x_2])}_{=\, B_m\, \downarrow}\big) = P\Big(\bigcap_{m=1}^{\infty} B_m\Big) = P(\emptyset) = 0.$$

Similarly, for fixed $x_1 \in \mathbb{R}$,

$$\lim_{x_2 \to -\infty} F_{X_1, X_2}(x_1, x_2) = 0.$$

(iii) Let $x = (x_1, x_2) \in \mathbb{R}^2$. Then

$$\lim_{h \downarrow 0} F_{X_1, X_2}(x_1 + h, x_2) = \lim_{m \to \infty} F_{X_1, X_2}\Big(x_1 + \frac{1}{m}, x_2\Big) = \lim_{m \to \infty} P\Big(\underbrace{X_1^{-1}\Big(\Big(-\infty, x_1 + \frac{1}{m}\Big]\Big) \cap X_2^{-1}((-\infty, x_2])}_{=\, C_m\, \downarrow}\Big)$$

$$= P\Big(\bigcap_{m=1}^{\infty} C_m\Big) = P\big(X_1^{-1}((-\infty, x_1]) \cap X_2^{-1}((-\infty, x_2])\big) = F_{X_1, X_2}(x_1, x_2),$$

i.e., for every fixed $x_2 \in \mathbb{R}$, $F_{X_1, X_2}(x_1, x_2)$ is right continuous in $x_1$. Similarly, for every fixed $x_1 \in \mathbb{R}$, $F_{X_1, X_2}(x_1, x_2)$ is right continuous in $x_2$. ▄

Remark 1.4

(i) Let $\Delta_n = \bigcup_{k=0}^{n} \Delta_{k,n}$. Then $\Delta_n$ is the set of $2^n$ vertices of the rectangle $(a, b] \subseteq \mathbb{R}^n$. For example, for $n = 1$, $(a, b] = (a_1, b_1]$ and $\Delta_1 = \{a_1, b_1\}$; and, for $n = 2$, $(a, b] = (a_1, b_1] \times (a_2, b_2]$ and $\Delta_2 = \{(b_1, b_2), (b_1, a_2), (a_1, b_2), (a_1, a_2)\}$.

[Figure 1.1: the two vertices of $(a_1, b_1]$ for $n = 1$]

[Figure 1.2: the four vertices of $(a_1, b_1] \times (a_2, b_2]$ for $n = 2$]

(ii) Note that, for $n = 1$, the assertion (iv) of Theorem 1.2 reduces to $F_{X_1}(b_1) \ge F_{X_1}(a_1)$, $\forall\, -\infty < a_1 < b_1 < \infty$, i.e., $F_{X_1}$ is non-decreasing.

Now we state the following theorem without providing its proof. This theorem states that properties (i)-(iv) described in Theorem 1.2 characterize distribution functions.

Theorem 1.3
Let $G: \mathbb{R}^n \to \mathbb{R}$ be a function such that:

(i) $\displaystyle\lim_{\substack{x_i \to \infty \\ i = 1, \dots, n}} G(x_1, \dots, x_n) = 1$;
(ii) for each fixed $i \in \{1, \dots, n\}$ and each fixed $(x_1, \dots, x_{i-1}, x_{i+1}, \dots, x_n) \in \mathbb{R}^{n-1}$,
$\displaystyle\lim_{t \to -\infty} G(x_1, \dots, x_{i-1}, t, x_{i+1}, \dots, x_n) = 0$;
(iii) $G(x_1, \dots, x_n)$ is right continuous in each argument when the other arguments are kept fixed;
(iv) for each rectangle $(a, b] \subseteq \mathbb{R}^n$,

$$\sum_{k=0}^{n} (-1)^k \sum_{u \in \Delta_{k,n}((a, b])} G(u) \ge 0.$$

Then there exists a probability space $(\Omega, \mathcal{F}, P)$ and a random vector $X = (X_1, \dots, X_n)$ defined on $(\Omega, \mathcal{F}, P)$ such that $G$ is the distribution function of $X$ (i.e., $F_X(x) = G(x), \forall\, x \in \mathbb{R}^n$). ▄

Remark 1.5
(i) As in the one-dimensional case it can be shown that the probability measure $P_X(\cdot)$, induced by a random vector $X$, is completely determined by its distribution function $F_X(\cdot)$. Thus, to study the induced probability measure $P_X(\cdot)$, it is enough to study the distribution function $F_X(\cdot)$.

(ii) The properties (i)-(iv) given in Theorem 1.3 are the key properties of a distribution function. Let $a = (a_1, a_2, \dots, a_n)$ and $b = (a_1 + h, b_2, \dots, b_n)$, where $h > 0$. If $G: \mathbb{R}^n \to \mathbb{R}$ is any function which satisfies properties (ii) and (iv) of Theorem 1.3, then

$$\sum_{k=0}^{n} (-1)^k \sum_{u \in \Delta_{k,n}((a, b])} G(u) \ge 0 \quad \text{(using property (iv))}$$

$$\Rightarrow \lim_{\substack{a_i \to -\infty \\ i = 2, \dots, n}} \sum_{k=0}^{n} (-1)^k \sum_{u \in \Delta_{k,n}((a, b])} G(u) \ge 0$$

$$\Rightarrow G(a_1 + h, b_2, \dots, b_n) - G(a_1, b_2, \dots, b_n) \ge 0 \quad \text{(using property (ii))},$$

i.e., $G(\cdot)$ is non-decreasing in each argument when the other arguments are kept fixed. It follows that if $G: \mathbb{R}^n \to \mathbb{R}$ is a distribution function then the property that it is non-decreasing in each argument (when the other arguments are kept fixed) is not one of its key characteristics; it is a consequence of properties (ii)-(iv) given in Theorem 1.3.

Example 1.3

Consider the function $G: \mathbb{R}^2 \to \mathbb{R}$ defined by

$$G(x, y) = \begin{cases} x y^2, & \text{if } 0 \le x < 1, 0 \le y < 1 \\ x, & \text{if } 0 \le x < 1, y \ge 1 \\ y^2, & \text{if } x \ge 1, 0 \le y < 1 \\ 1, & \text{if } x \ge 1, y \ge 1 \\ 0, & \text{otherwise} \end{cases}.$$

(i) Show that $G$ is the distribution function of some two-dimensional random vector, say $(X, Y)$.
(ii) Find the marginal distribution functions of $X$ and $Y$.

Solution. (i) Note that, for $x \ge 1, y \ge 1$, $G(x, y) = 1$. Therefore $\lim_{\substack{x \to \infty \\ y \to \infty}} G(x, y) = 1$. Also, for $x < 0$ or $y < 0$, $G(x, y) = 0$. Therefore, for each fixed $x \in \mathbb{R}$, $\lim_{y \to -\infty} G(x, y) = 0$ and, for each fixed $y \in \mathbb{R}$, $\lim_{x \to -\infty} G(x, y) = 0$.

Note that

$$G(x, y) = 0, \ \forall\, x \in \mathbb{R}, \text{ if } y < 0, \tag{1.8}$$

$$G(x, y) = \begin{cases} 0, & \text{if } x < 0 \\ x y^2, & \text{if } 0 \le x < 1 \\ y^2, & \text{if } x \ge 1 \end{cases}, \quad \text{if } y \in [0, 1), \tag{1.9}$$

and

$$G(x, y) = \begin{cases} 0, & \text{if } x < 0 \\ x, & \text{if } 0 \le x < 1 \\ 1, & \text{if } x \ge 1 \end{cases}, \quad \text{if } y \in [1, \infty). \tag{1.10}$$

From (1.8)-(1.10) it is evident that, for each fixed value of $y \in \mathbb{R}$, $G(x, y)$ is a continuous (and hence right continuous) function of $x$. Similarly, for each fixed value of $x \in \mathbb{R}$, $G(x, y)$ is a continuous function of $y$.

From (1.8)-(1.10) it is also clear that, for each fixed value of $y \in \mathbb{R}$, $G(x, y)$ is a non-decreasing function of $x \in \mathbb{R}$. Similarly, for each fixed value of $x \in \mathbb{R}$, $G(x, y)$ is a non-decreasing function of $y \in \mathbb{R}$.

Now let $-\infty < a_1 < b_1 < \infty$, $-\infty < a_2 < b_2 < \infty$, $a = (a_1, a_2)$, $b = (b_1, b_2)$ and $(a, b] = (a_1, b_1] \times (a_2, b_2]$. Then

$$\Delta = \sum_{k=0}^{2} (-1)^k \sum_{u \in \Delta_{k,2}((a, b])} G(u_1, u_2) = G(b_1, b_2) - G(b_1, a_2) - G(a_1, b_2) + G(a_1, a_2).$$
The following cases arise:

Case I. $a_1 < 0$

In this case

$$\Delta = G(b_1, b_2) - G(b_1, a_2) \ge 0,$$

since, for fixed $b_1 \in \mathbb{R}$, $G(b_1, y)$ is a non-decreasing function of $y$;

Case II. $a_2 < 0$

$$\Delta = G(b_1, b_2) - G(a_1, b_2) \ge 0,$$

since, for fixed $b_2 \in \mathbb{R}$, $G(x, b_2)$ is a non-decreasing function of $x$;

Case III. $0 \le a_1 < 1$, $0 \le a_2 < 1$, $0 \le b_1 < 1$, $0 \le b_2 < 1$

$$\Delta = b_1 b_2^2 - b_1 a_2^2 - a_1 b_2^2 + a_1 a_2^2 = (b_1 - a_1)(b_2^2 - a_2^2) \ge 0;$$

Case IV. $0 \le a_1 < 1$, $0 \le a_2 < 1$, $0 \le b_1 < 1$, $b_2 \ge 1$

$$\Delta = b_1 - b_1 a_2^2 - a_1 + a_1 a_2^2 = (b_1 - a_1)(1 - a_2^2) \ge 0;$$

Case V. $0 \le a_1 < 1$, $0 \le a_2 < 1$, $b_1 \ge 1$, $0 \le b_2 < 1$

$$\Delta = b_2^2 - a_2^2 - a_1 b_2^2 + a_1 a_2^2 = (1 - a_1)(b_2^2 - a_2^2) \ge 0;$$

Case VI. $0 \le a_1 < 1$, $0 \le a_2 < 1$, $b_1 \ge 1$, $b_2 \ge 1$

$$\Delta = 1 - a_2^2 - a_1 + a_1 a_2^2 = (1 - a_1)(1 - a_2^2) \ge 0;$$

Case VII. $0 \le a_1 < 1$, $a_2 \ge 1$, $0 \le b_1 < 1$, $b_2 \ge 1$

$$\Delta = b_1 - b_1 - a_1 + a_1 = 0;$$

Case VIII. $0 \le a_1 < 1$, $a_2 \ge 1$, $b_1 \ge 1$, $b_2 \ge 1$

$$\Delta = 1 - 1 - a_1 + a_1 = 0;$$

Case IX. $a_1 \ge 1$, $0 \le a_2 < 1$, $b_1 \ge 1$, $0 \le b_2 < 1$

$$\Delta = b_2^2 - a_2^2 - b_2^2 + a_2^2 = 0;$$

Case X. $a_1 \ge 1$, $0 \le a_2 < 1$, $b_1 \ge 1$, $b_2 \ge 1$

$$\Delta = 1 - a_2^2 - 1 + a_2^2 = 0;$$

Case XI. $a_1 \ge 1$, $a_2 \ge 1$, $b_1 \ge 1$, $b_2 \ge 1$

$$\Delta = 1 - 1 - 1 + 1 = 0.$$

Combining Cases I-XI it follows that

$$\sum_{k=0}^{2} (-1)^k \sum_{u \in \Delta_{k,2}((a, b])} G(u_1, u_2) \ge 0, \quad \forall\, (a, b] \subseteq \mathbb{R}^2.$$

Now using Theorem 1.3 it follows that $G(x_1, x_2)$ is the distribution function of some two-dimensional random vector $(X, Y)$.

(ii) Using Lemma 1.2, we have

$$F_X(x) = \lim_{y \to \infty} G(x, y) = \begin{cases} 0, & \text{if } x < 0 \\ x, & \text{if } 0 \le x < 1 \\ 1, & \text{if } x \ge 1 \end{cases}.$$

Also, using Lemma 1.2 and Remark 1.3, we have

$$F_Y(y) = \lim_{x \to \infty} G(x, y) = \begin{cases} 0, & \text{if } y < 0 \\ y^2, & \text{if } 0 \le y < 1 \\ 1, & \text{if } y \ge 1 \end{cases}. \ ▄$$

Example 1.4
Let $G: \mathbb{R}^2 \to \mathbb{R}$ be defined by

$$G(x, y) = \begin{cases} x, & \text{if } 0 \le x < 1, y \ge 1 \\ y^2, & \text{if } x \ge 1, 0 \le y < 1 \\ 1, & \text{if } x \ge 1, y \ge 1 \\ 0, & \text{otherwise} \end{cases}.$$

Show that $G$ is not the distribution function of any random vector $(X, Y)$.

Solution. Note that $G(x, y)$ is non-decreasing in each argument when the other argument is kept fixed. Let $a_1 \in [0, 1)$, $a_2 \in [0, 1)$, $b_1 \in [1, \infty)$, $b_2 \in [1, \infty)$ with $a_1 + a_2^2 > 1$, and let $a = (a_1, a_2)$, $b = (b_1, b_2)$ and $(a, b] = (a_1, b_1] \times (a_2, b_2]$. Then

$$\sum_{k=0}^{2} (-1)^k \sum_{u \in \Delta_{k,2}((a, b])} G(u_1, u_2) = G(b_1, b_2) - G(b_1, a_2) - G(a_1, b_2) + G(a_1, a_2) = 1 - a_2^2 - a_1 + 0 = 1 - a_1 - a_2^2 < 0.$$

Thus $G$ violates property (iv) of Theorem 1.3, and therefore $G$ is not the distribution function of any random vector. ▄
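A concrete rectangle exhibiting the violation in Example 1.4 (the particular corner values are our choice):

```python
# Example 1.4: one rectangle on which the signed vertex sum is negative,
# so G cannot be a distribution function.
def G(x, y):
    if 0 <= x < 1 and y >= 1:
        return x
    if x >= 1 and 0 <= y < 1:
        return y ** 2
    if x >= 1 and y >= 1:
        return 1.0
    return 0.0

a1, a2, b1, b2 = 0.6, 0.8, 2.0, 2.0   # a1 + a2**2 = 1.24 > 1
delta = G(b1, b2) - G(b1, a2) - G(a1, b2) + G(a1, a2)
print(delta)  # 1 - 0.64 - 0.6 + 0, approximately -0.24 < 0
```

Any rectangle with $a_1 + a_2^2 > 1$, $b_1, b_2 \ge 1$ works equally well.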

2. Types of Random Vectors

Let $(\Omega, \mathcal{F}, P)$ be a probability space and let $X = (X_1, \dots, X_n): \Omega \to \mathbb{R}^n$ be a random vector with distribution function $F_X(x_1, \dots, x_n)$.

Definition 2.1
(i) $X$ is said to be a random vector of discrete type if there exists a non-empty countable set $S_X \subseteq \mathbb{R}^n$ such that $P(\{X = x\}) > 0, \forall\, x \in S_X$, and $P(\{X \in S_X\}) = \sum_{x \in S_X} P(\{X = x\}) = 1$. The set $S_X$ is called the support of the discrete type random vector $X$ (or simply the support of the probability distribution of $X$) and the function

$$f_X(x) = P(\{X = x\}), \quad x \in \mathbb{R}^n,$$

which is such that $f_X(x) > 0, \forall\, x \in S_X$, $f_X(x) = 0, \forall\, x \in S_X^c$ (see Remark 2.1 (i) later), and $\sum_{x \in S_X} f_X(x) = 1$, is called the joint probability mass function (p.m.f.) of $X$.

(ii) $X$ is said to be a random vector of continuous type if $F_X(x)$ is continuous at every $x \in \mathbb{R}^n$.

(iii) $X$ is said to be a random vector of absolutely continuous type if there exists a non-negative function $f_X: \mathbb{R}^n \to \mathbb{R}$ such that

$$F_X(x) = \int_{(-\infty, x]} f_X(y)\, dy, \quad x = (x_1, \dots, x_n) \in \mathbb{R}^n,$$

where $(-\infty, x] = (-\infty, x_1] \times \cdots \times (-\infty, x_n]$, $y = (y_1, \dots, y_n)$ and $dy = dy_n \cdots dy_1$.

The function $f_X(\cdot)$, which is non-negative and is such that

$$\int_{\mathbb{R}^n} f_X(x_1, \dots, x_n)\, dx = \lim_{\substack{y_i \to \infty \\ i = 1, \dots, n}} F_X(y_1, \dots, y_n) = 1,$$

is called the joint probability density function (p.d.f.) of $X$. The set $S_X = \{x \in \mathbb{R}^n: f_X(x) > 0\}$ is called a support of the p.d.f. $f_X$. ▄

Remark 2.1

(i) If $X$ is of discrete type with support $S_X$ then $P(\{X \in S_X\}) = 1$ and, therefore, $P(\{X \in S_X^c\}) = 0$. In particular $f_X(x) = P(\{X = x\}) = 0, \forall\, x \in S_X^c$.

(ii) Let $X$ be a random vector of discrete type with support $S_X$ and p.m.f. $f_X(\cdot)$. Then we know that $S_X$ is countable, $f_X(x) \ge 0, \forall\, x \in \mathbb{R}^n$, $f_X(x) > 0, \forall\, x \in S_X$, and $\sum_{x \in S_X} f_X(x) = 1$. As in the one-dimensional case $(n = 1)$ it can be shown that if $g: \mathbb{R}^n \to \mathbb{R}$ is any function such that $g(x) \ge 0, \forall\, x \in \mathbb{R}^n$, $g(x) > 0, \forall\, x \in D$, and $\sum_{x \in D} g(x) = 1$, for some non-empty countable set $D \subseteq \mathbb{R}^n$, then $g(\cdot)$ is the joint p.m.f. of some random vector of discrete type.

(iii) Let $X$ be a random vector of absolutely continuous type with joint p.d.f. $f_X(\cdot)$. Then $f_X(x) \ge 0, \forall\, x \in \mathbb{R}^n$, and

$$\int_{\mathbb{R}^n} f_X(x)\, dx = 1,$$

where $x = (x_1, \dots, x_n)$ and $dx = dx_n \cdots dx_1$. Conversely, if $h: \mathbb{R}^n \to \mathbb{R}$ is any function such that $h(x) \ge 0, \forall\, x \in \mathbb{R}^n$, and

$$\int_{\mathbb{R}^n} h(x)\, dx = 1,$$

then it can be shown that $h(\cdot)$ is the joint p.d.f. of some random vector of absolutely continuous type.

(iv) Let $(a, b] \subseteq \mathbb{R}^n$ and let $\Psi: (a, b] \to \mathbb{R}$ be a non-negative function. Let $D = D_1 \times \cdots \times D_n$, where each $D_i$, $i = 1, \dots, n$, is countable. Then, provided the integral (or sum)

$$\int_{(a, b]} \Psi(x)\, dx \quad \Big(\text{or } \sum_{x \in D} \Psi(x)\Big)$$

is finite, we know that the integral (or sum) can be evaluated iteratively (section-wise), and the order in which the integration (or summation) is carried out is immaterial. In particular, if $h: \mathbb{R}^n \to \mathbb{R}$ is a joint p.d.f. (or a joint p.m.f.) then, for any permutation $(\beta_1, \dots, \beta_n)$ of $(1, \dots, n)$,

$$\int_{(a, b]} h(x)\, dx_1 \cdots dx_n = \int_{a_{\beta_n}}^{b_{\beta_n}} \cdots \int_{a_{\beta_1}}^{b_{\beta_1}} h(x)\, dx_{\beta_1} \cdots dx_{\beta_n} \quad \Big(\text{or } \sum_{x \in D} h(x) = \sum_{x_{\beta_n} \in D_{\beta_n}} \cdots \sum_{x_{\beta_1} \in D_{\beta_1}} h(x)\Big).$$

(v) Let $X$ be an $n$-dimensional random vector with distribution function $F_X$. For $a = (a_1, \dots, a_n) \in \mathbb{R}^n$, define $a_m = \big(a_1 - \frac{1}{m}, \dots, a_n - \frac{1}{m}\big)$, $m = 1, 2, \dots$. Then

$$\{X = a\} = X^{-1}(\{a\}) = X^{-1}\Big(\bigcap_{m=1}^{\infty} (a_m, a]\Big) = \bigcap_{m=1}^{\infty} X^{-1}((a_m, a])$$

$$\Rightarrow P(\{X = a\}) = P\Big(\bigcap_{m=1}^{\infty} \underbrace{X^{-1}((a_m, a])}_{=\, A_m\, \downarrow}\Big) = \lim_{m \to \infty} P\big(X^{-1}((a_m, a])\big) = \lim_{m \to \infty} \sum_{k=0}^{n} (-1)^k \sum_{u_m \in \Delta_{k,n}((a_m, a])} F_X(u_m).$$

(vi) Let $X$ be an $n$-dimensional random vector with distribution function $F_X$ that is continuous at $a \in \mathbb{R}^n$. Let $a_m$, $m = 1, 2, \dots$, be as defined in (v) above. Then, for $u_m \in \Delta_{k,n}((a_m, a])$, $m = 1, 2, \dots$, we have $u_m \to a$ as $m \to \infty$, so that $F_X(u_m) \to F_X(a)$ as $m \to \infty$. Therefore

$$P(\{X = a\}) = \lim_{m \to \infty} \sum_{k=0}^{n} (-1)^k \sum_{u_m \in \Delta_{k,n}((a_m, a])} F_X(u_m) = \sum_{k=0}^{n} (-1)^k \binom{n}{k} F_X(a) = (1 - 1)^n F_X(a) = 0.$$

It follows that if the distribution function $F_X$ of an $n$-dimensional random vector $X$ is continuous at $a \in \mathbb{R}^n$ then

$$P(\{X = a\}) = 0.$$

(vii) Let $X$ be an $n$-dimensional random vector of continuous type, so that its distribution function $F_X(\cdot)$ is continuous at every $x \in \mathbb{R}^n$. Then, by (vi),

$$P(\{X = a\}) = 0, \forall\, a \in \mathbb{R}^n.$$

Consequently, for any countable set $S \subseteq \mathbb{R}^n$,

$$P(\{X \in S\}) = P\Big(\bigcup_{a \in S} \{X = a\}\Big) = \sum_{a \in S} P(\{X = a\}) = 0.$$
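Points (v)-(vi) can be illustrated numerically: for a continuous joint df the signed vertex sum over the shrinking boxes $(a - \frac{1}{m}\mathbf{1}, a]$ tends to $P(\{X = a\}) = 0$. A sketch for two independent Uniform(0, 1) coordinates (our choice of example):

```python
from itertools import product

# Remark 2.1 (v)-(vi): for the continuous df F(x, y) = F1(x)*F1(y), with
# F1 the Uniform(0,1) df, the vertex sum over (a - 1/m, a] shrinks to 0.
def F1(t):
    return min(max(t, 0.0), 1.0)

def F2(x, y):
    return F1(x) * F1(y)

a = (0.5, 0.5)
for m in (2, 10, 100, 1000):
    am = tuple(ai - 1.0 / m for ai in a)
    s = 0.0
    for choice in product([0, 1], repeat=2):  # choice[i] = 1 -> lower corner
        u = tuple(am[i] if c else a[i] for i, c in enumerate(choice))
        s += (-1) ** sum(choice) * F2(*u)
    print(m, s)  # here s is approximately (1/m)**2, decreasing to 0
```

For this particular df the vertex sum is exactly the probability of the small square, $(1/m)^2$, which vanishes as $m \to \infty$.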

(viii) Suppose that $X$ is an $n$-dimensional random vector of absolutely continuous type with p.d.f. $f_X(\cdot)$. Then it can be shown that its distribution function

$$F_X(x) = \int_{-\infty}^{x_1} \cdots \int_{-\infty}^{x_n} f_X(y)\, dy_n \cdots dy_1, \quad x \in \mathbb{R}^n,$$

is continuous at every $x \in \mathbb{R}^n$. Thus a random vector of absolutely continuous type is also of continuous type. Moreover, if $X$ is of absolutely continuous type then

$$P(\{X = a\}) = 0, \forall\, a \in \mathbb{R}^n, \quad \text{and} \quad P(\{X \in S\}) = 0,$$

for any countable set $S \subseteq \mathbb{R}^n$.

(ix) Let $X$ be an $n$-dimensional random vector of discrete type with joint p.m.f. $f_X(\cdot)$ and support $S_X$. Then, for any $A \in \mathcal{B}_n$,

$$P(\{X \in A\}) = P(\{X \in A \cap S_X\}) \quad (\text{since } P(\{X \in S_X\}) = 1)$$

$$= P\Big(\bigcup_{x \in A \cap S_X} \{X = x\}\Big)$$

$$= \sum_{x \in A \cap S_X} P(\{X = x\}) \quad (A \cap S_X \subseteq S_X \text{ is countable})$$

$$= \sum_{x \in A \cap S_X} f_X(x) = \sum_{x \in S_X} f_X(x)\, I_A(x).$$

(x) Let $X$ be an $n$-dimensional random vector of absolutely continuous type with joint p.d.f. $f_X(\cdot)$ and let $a, b \in \mathbb{R}^n$, $a_i < b_i$, $i = 1, \dots, n$. Then, using the idea of the proof of Lemma 1.3, it can be shown that

$$\int_{(a, b]} f_X(x)\, dx = \int_{a_1}^{b_1} \cdots \int_{a_n}^{b_n} f_X(x)\, dx_n \cdots dx_1$$

$$= \sum_{k=0}^{n} (-1)^k \sum_{u \in \Delta_{k,n}((a, b])} \int_{-\infty}^{u_1} \cdots \int_{-\infty}^{u_n} f_X(x)\, dx_n \cdots dx_1$$

$$= \sum_{k=0}^{n} (-1)^k \sum_{u \in \Delta_{k,n}((a, b])} F_X(u)$$

$$= P(\{a_i < X_i \le b_i, i = 1, \dots, n\}) = P(\{X \in (a, b]\}).$$

It follows that

$$P(\{X \in (a, b]\}) = P(\{a_i < X_i \le b_i, i = 1, \dots, n\}) = \int_{a_1}^{b_1} \cdots \int_{a_n}^{b_n} f_X(x)\, dx_n \cdots dx_1 = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} f_X(x)\, I_{(a, b]}(x)\, dx_n \cdots dx_1 = \int_{\mathbb{R}^n} f_X(x)\, I_{(a, b]}(x)\, dx.$$

In general, for any set $A \in \mathcal{B}_n$, it can be shown that

$$P(\{X \in A\}) = \int_{\mathbb{R}^n} f_X(x)\, I_A(x)\, dx.$$

Consequently, if $A$ comprises a countable number of curves then

$$P(\{X \in A\}) = \int_{\mathbb{R}^n} f_X(x)\, I_A(x)\, dx = 0.$$

In particular $P(\{X_i = X_j\}) = 0, \forall\, i \ne j$.

(xi) Let $X$ be an $n$-dimensional random vector of discrete type with joint distribution function $F_X(\cdot)$, joint p.m.f. $f_X(\cdot)$ and support $S_X$. Then, using (ix),

$$F_X(x) = P(\{X \in (-\infty, x]\}) = \sum_{u \in (-\infty, x] \cap S_X} f_X(u), \quad x \in \mathbb{R}^n. \tag{2.1}$$

Also, using (v),

$$f_X(x) = P(\{X = x\}) = \lim_{m \to \infty} \sum_{k=0}^{n} (-1)^k \sum_{u_m \in \Delta_{k,n}((x_m, x])} F_X(u_m), \tag{2.2}$$

where $x_m = \big(x_1 - \frac{1}{m}, \dots, x_n - \frac{1}{m}\big)$, $m = 1, 2, \dots$.

Using (2.1) and (2.2) we conclude that the joint distribution function of a discrete type random vector is determined by its joint p.m.f. and vice-versa. Thus, to study the probability measure $P_X(\cdot)$ induced by a discrete type random vector $X$, it is enough to study its p.m.f. (also see Remark 1.5 (i)).

(xii) If $X$ is a random vector of absolutely continuous type then its joint p.d.f. is not unique and there are different versions of the joint p.d.f. In fact, if the values of the joint p.d.f. $f_X(\cdot)$ of a random vector $X$ of absolutely continuous type are changed on a countable number of curves to other non-negative values, then the resulting function is again a p.d.f. of $X$.

(xiii) As in the one-dimensional case it can be shown that if $X$ is an $n$-dimensional random vector with distribution function $F_X(\cdot)$ such that

$$\frac{\partial^n}{\partial x_1 \cdots \partial x_n} F_X(x_1, \dots, x_n)$$

exists everywhere except (possibly) on a set $C$ comprising a countable number of curves, and

$$\int_{\mathbb{R}^n} \frac{\partial^n}{\partial x_1 \cdots \partial x_n} F_X(x_1, \dots, x_n)\, I_{C^c}(x)\, dx = 1,$$

then $X$ is of absolutely continuous type with a p.d.f.

$$f_X(x) = \begin{cases} \dfrac{\partial^n}{\partial x_1 \cdots \partial x_n} F_X(x_1, \dots, x_n), & \text{if } x \notin C \\ a_x, & \text{if } x \in C \end{cases},$$

where $a_x$, $x \in C$, are arbitrary non-negative constants.

(xiv) Let $X$ be an $n$-dimensional random vector of absolutely continuous type with joint distribution function $F_X(\cdot)$ and joint p.d.f. $f_X(\cdot)$. Then

$$F_X(x) = \int_{-\infty}^{x_1} \cdots \int_{-\infty}^{x_n} f_X(y)\, dy_n \cdots dy_1, \quad x \in \mathbb{R}^n.$$

Clearly the joint distribution function of an absolutely continuous type random vector $X$ is determined by its joint p.d.f. $f_X(\cdot)$. Thus, to study the probability measure $P_X(\cdot)$ induced by an absolutely continuous type random vector $X$, it is enough to study its joint p.d.f. $f_X(\cdot)$.

(xv) Using Remark 1.2 (ii) and (v) above it follows that if $f_X(x)$, $x \in \mathbb{R}^n$, is the p.m.f. (or a p.d.f.) of an $n$-dimensional random vector $X = (X_1, \dots, X_n)$ then, for any permutation $(\beta_1, \dots, \beta_n)$ of $(1, \dots, n)$ with inverse permutation $(\gamma_1, \dots, \gamma_n)$, the joint p.m.f. (joint p.d.f.) of $(X_{\beta_1}, \dots, X_{\beta_n})$ is

$$f_{X_{\beta_1}, \dots, X_{\beta_n}}(x_1, \dots, x_n) = f_{X_1, \dots, X_n}(x_{\gamma_1}, \dots, x_{\gamma_n}), \quad x \in \mathbb{R}^n.$$

Theorem 2.1

Let $X = (X_1, \dots, X_n)$ be an $n$-dimensional $(n \ge 2)$ random vector with distribution function $F_X(\cdot)$. For a fixed positive integer $k \in \{1, \dots, n-1\}$, let $Y = (X_1, \dots, X_k)$ and $Z = (X_{k+1}, \dots, X_n)$, so that $X = (Y, Z)$.

(i) Suppose that $X$ is of discrete type with support $S_X$ and p.m.f. $f_X(\cdot)$. For $y \in \mathbb{R}^k$, define $T_y = \{z \in \mathbb{R}^{n-k}: (y, z) \in S_X\}$ (note that, for each $y \in \mathbb{R}^k$, $T_y$ is a countable set). Then the random vector $Y = (X_1, \dots, X_k)$ is of discrete type with support $S_Y = \{y \in \mathbb{R}^k: (y, z) \in S_X, \text{ for some } z \in \mathbb{R}^{n-k}\}$ and joint p.m.f. (called the marginal p.m.f. of $Y$)

$$f_Y(y) = \begin{cases} \displaystyle\sum_{z \in T_y} f_X(y, z), & \text{if } y \in S_Y \\ 0, & \text{otherwise} \end{cases}.$$

(ii) Suppose that $X$ is of absolutely continuous type with joint p.d.f. $f_X(\cdot)$. Then the random vector $Y = (X_1, \dots, X_k)$ is of absolutely continuous type with p.d.f. (called the marginal p.d.f. of $Y$)

$$f_Y(y) = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} f_X(y, z)\, dz_{n-k} \cdots dz_1, \quad y \in \mathbb{R}^k,$$

where $z = (z_1, \dots, z_{n-k})$.

Proof. (i) Note that $\{X \in S_X\} = \{(Y, Z) \in S_X\} \subseteq \{Y \in S_Y\}$. Therefore

$$P(\{Y \in S_Y\}) \ge P(\{X \in S_X\}) = 1,$$

i.e., $P(\{Y \in S_Y\}) = 1$.

Also $S_Y$ is countable and, for $y \in S_Y$,

$$P(\{Y = y\}) = P(\{Y = y\} \cap \{X \in S_X\}) \quad (\text{since } P(\{X \in S_X\}) = 1)$$

$$= P(\{Y = y\} \cap \{(Y, Z) \in S_X\})$$

$$= P(\{Y = y\} \cap \{(y, Z) \in S_X\})$$

$$= P(\{Y = y\} \cap \{Z \in T_y\})$$

$$= P\Big(\bigcup_{z \in T_y} \{(Y, Z) = (y, z)\}\Big)$$

$$= \sum_{z \in T_y} P(\{(Y, Z) = (y, z)\}) = \sum_{z \in T_y} P(\{X = (y, z)\}) = \sum_{z \in T_y} f_X(y, z).$$

Note that, for $y \in S_Y$, $T_y \ne \emptyset$, and, for $z \in T_y$, $(y, z) \in S_X$. Therefore $f_X(y, z) > 0$ for all $y \in S_Y$ and $z \in T_y$. It follows that $P(\{Y \in S_Y\}) = 1$ and $P(\{Y = y\}) > 0, \forall\, y \in S_Y$. Hence the assertion follows.

(ii) Note that, for $y \in \mathbb{R}^k$,

$$F_Y(y) = \lim_{\substack{z_i \to \infty \\ i = 1, \dots, n-k}} F_X(y, z)$$

$$= \lim_{\substack{z_i \to \infty \\ i = 1, \dots, n-k}} \int_{-\infty}^{y_1} \cdots \int_{-\infty}^{y_k} \int_{-\infty}^{z_1} \cdots \int_{-\infty}^{z_{n-k}} f_X(s, t)\, dt\, ds$$

$$= \int_{-\infty}^{y_1} \cdots \int_{-\infty}^{y_k} \Big(\int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} f_X(s, t)\, dt\Big)\, ds = \int_{-\infty}^{y_1} \cdots \int_{-\infty}^{y_k} h(s)\, ds, \tag{2.3}$$

where $s = (s_1, \dots, s_k)$, $t = (t_1, \dots, t_{n-k})$, $dt = dt_{n-k} \cdots dt_1$, $ds = ds_k \cdots ds_1$ and

$$h(s) = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} f_X(s, t)\, dt, \quad s \in \mathbb{R}^k.$$

Clearly $h(s) \ge 0, \forall\, s \in \mathbb{R}^k$, and

$$\int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} h(s)\, ds_k \cdots ds_1 = \int_{\mathbb{R}^n} f_X(s, t)\, dt\, ds = 1.$$

Now, using (2.3) and the above properties of $h(\cdot)$, it follows that $Y$ is of absolutely continuous type with p.d.f.

$$f_Y(y) = h(y) = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} f_X(y, t)\, dt, \quad y \in \mathbb{R}^k. \ ▄$$

Example 2.1
Let $Z = (X, Y)$ be a bivariate random vector with p.m.f.

$$f_Z(x, y) = P(\{X = x, Y = y\}) = \begin{cases} c\, y, & \text{if } (x, y) \in T \\ 0, & \text{otherwise} \end{cases},$$

where $T = \{(s, t) \in \mathbb{R}^2: s, t \in \{1, \dots, m\}, s \le t\}$, $m\ (\ge 2)$ is a fixed positive integer and $c$ is a fixed real constant.

(i) Find the value of the constant $c$;
(ii) find the marginal p.m.f.s of $X$ and $Y$;
(iii) find $P(\{X > Y\})$, $P(\{X = Y\})$ and $P(\{X < Y\})$.

Solution. (i) Clearly we must have $c > 0$. The support of $Z$ is $S_Z = T = \{(s, t) \in \mathbb{R}^2: s, t \in \{1, \dots, m\}, s \le t\}$ and therefore

$$\sum_{(x, y) \in S_Z} f_Z(x, y) = 1 \ \Rightarrow\ c \sum_{y=1}^{m} \sum_{x=1}^{y} y = 1 \ \Rightarrow\ c \sum_{y=1}^{m} y^2 = 1 \ \Rightarrow\ c = \frac{6}{m(m+1)(2m+1)}.$$

(ii) By Theorem 2.1 (i) the support of $X$ is $S_X = \{x \in \mathbb{R}: (x, y) \in S_Z \text{ for some } y \in \mathbb{R}\} = \{1, 2, \dots, m\}$, and the support of $Y$ is $S_Y = \{y \in \mathbb{R}: (x, y) \in S_Z \text{ for some } x \in \mathbb{R}\} = \{1, 2, \dots, m\}$.

For $x \in S_X$, define $T_x = \{y \in \mathbb{R}: (x, y) \in S_Z\}$. Then, by Theorem 2.1, the marginal p.m.f. of $X$ is

$$f_X(x) = \begin{cases} \displaystyle\sum_{y \in T_x} f_Z(x, y), & \text{if } x \in S_X \\ 0, & \text{otherwise} \end{cases}.$$

For $x \in S_X$ we have $T_x = \{x, x+1, \dots, m\}$ and

$$\sum_{y \in T_x} f_Z(x, y) = c \sum_{y=x}^{m} y = c\Big[\frac{m(m+1)}{2} - \frac{(x-1)x}{2}\Big].$$

Therefore the marginal p.m.f. of $X$ is

$$f_X(x) = \begin{cases} \dfrac{3\,[m(m+1) - (x-1)x]}{m(m+1)(2m+1)}, & \text{if } x \in S_X \\ 0, & \text{otherwise} \end{cases},$$

where $S_X = \{1, \dots, m\}$.

For $y \in S_Y$, define $T_y^* = \{x \in \mathbb{R}: (x, y) \in S_Z\} = \{1, 2, \dots, y\}$. Then, by Theorem 2.1, the marginal p.m.f. of $Y$ is

$$f_Y(y) = \begin{cases} \displaystyle\sum_{x \in T_y^*} f_Z(x, y), & \text{if } y \in S_Y \\ 0, & \text{otherwise} \end{cases}.$$

For $y \in S_Y$ we have

$$\sum_{x \in T_y^*} f_Z(x, y) = c \sum_{x=1}^{y} y = c\, y^2.$$

Therefore the marginal p.m.f. of $Y$ is

$$f_Y(y) = \begin{cases} \dfrac{6\, y^2}{m(m+1)(2m+1)}, & \text{if } y \in S_Y \\ 0, & \text{otherwise} \end{cases},$$

where $S_Y = \{1, 2, \dots, m\}$.

(iii) Let $A = \{(s, t): s > t\}$ and $B = \{(s, t): s = t\}$. Then, by Remark 2.1 (ix),

$$P(\{X > Y\}) = P(\{Z \in A\}) = \sum_{(x, y) \in S_Z \cap A} f_Z(x, y) = 0 \quad (\text{since } S_Z \cap A = \emptyset),$$

$$P(\{X = Y\}) = P(\{Z \in B\}) = \sum_{(x, y) \in S_Z \cap B} f_Z(x, y) = c \sum_{y=1}^{m} y = \frac{3}{2m+1}.$$

Therefore

$$P(\{X < Y\}) = 1 - P(\{X = Y\}) - P(\{X > Y\}) = 1 - \frac{3}{2m+1} = \frac{2(m-1)}{2m+1}.$$

Example 2.2

29
Let  = ( ,  , Q ) be a discrete type random vector with p.m.f.

ë/ / / , if (/ , / , /Q ) ∈ {1, 2 × {1, 2, 3 × {1, 3


Ùa (/ , / , /Q ) = d   Q ,
0, otherwise

where ë is a real constant.

(i) Find the value of ë;


(ii) Find the marginal p.m.f.s. of  ; of  ; of Q ;
(iii) Find the marginal p.m.f. of – = ( , Q );
(iv) Find @({ =  = Q ).

Solution. (i) Clearly we must have ë > 0. Then the support of  is “a =


(/ , / , /Q ): / ∈
{1, 2, / ∈ {1, 2, 3, /Q ∈ {1, 3. Therefore

¯ Ùa / , /, /Q  = 1
‘∈ר

⇒ë ¯ ¯ ¯ / / /Q = 1
‘€ ∈{, ‘Œ ∈{,,Q ‘‹ ∈{,Q

1
⇒ë= .
72
(ii) The supports of  ,  and Q are

“a€ =
/ ∈ ℝ : (/ , / , /Q ) ∈ “a for some (/ , /Q ) ∈ ℝ  = {1, 2,

“aŒ =
/ ∈ ℝ : (/ , / , /Q ) ∈ “a for some (/ , /Q ) ∈ ℝ  = {1, 2, 3

and

“a‹ =
/Q ∈ ℝ : (/ , / , /Q ) ∈ “a for some (/ , / ) ∈ ℝ  = {1, 3,

respectively.

For / ∈ “a€ , 㑀 =


(/ , /Q ): (/ , / , /Q ) ∈ “ê  = {1, 2, 3 × {1, 3. Then, for / ∈ “a€

Ùa€ (/ ) = @({  = / )

= ¯ Ùa (/ , / , /Q )
(‘Œ ,‘‹ )∈åí€

30
= ¯ ¯ / / /Q
‘Œ ∈{,,Q ‘‹ ∈{,Q

/
= .
3
Therefore the marginal p.m.f. of  is
/
, if / ∈ {1, 2
Ùa€ (/ ) = Ô 3 .
0, otherwise

Similarly the p.m.f.s of  and Q are


/
, if / ∈ {1, 2, 3
ÙaŒ (/ ) = Ô 6
0, otherwise

and

, if /Q ∈ {1, 3
‹ ‘
Ùa‹ (/Q ) = ð R
0, otherwise
,

respectively.

(iii) The support of Y = (X₁, X₃) is

  S_Y = {(y₁, y₂): (y₁, t, y₂) ∈ S_X for some t ∈ ℝ}
      = {1, 2} × {1, 3}
      = {(1, 1), (1, 3), (2, 1), (2, 3)}.

For y = (y₁, y₂) ∈ S_Y, T_y = {t ∈ ℝ: (y₁, t, y₂) ∈ S_X} = {1, 2, 3}. Therefore, for y = (y₁, y₂) ∈ S_Y,

  f_Y(y) = P({Y = y})
         = Σ_{t∈{1,2,3}} c y₁ t y₂
         = y₁y₂/12,

and the marginal p.m.f. of Y = (Y₁, Y₂) is

  f_Y(y₁, y₂) = y₁y₂/12, if (y₁, y₂) ∈ {(1, 1), (1, 3), (2, 1), (2, 3)}, and 0 otherwise.

(iv) Let A = {(x₁, x₂, x₃) ∈ ℝ³: x₁ = x₂ = x₃}. Then S_X ∩ A = {(1, 1, 1)} and therefore

  P({X₁ = X₂ = X₃}) = Σ_{x ∈ S_X ∩ A} f_X(x) = c = 1/72.
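The marginals and the probability in (iv) can be recomputed by summing the joint p.m.f. directly; the following sketch uses the p.m.f. of this example and exact fractions:

```python
from fractions import Fraction
from itertools import product

# Joint p.m.f. of Example 2.2: f(x1,x2,x3) = x1*x2*x3/72 on {1,2}x{1,2,3}x{1,3}.
support = list(product([1, 2], [1, 2, 3], [1, 3]))
pmf = {x: Fraction(x[0] * x[1] * x[2], 72) for x in support}

def marginal(idx):
    """Sum the joint p.m.f. over all coordinates except those in idx."""
    out = {}
    for x, p in pmf.items():
        key = tuple(x[i] for i in idx)
        out[key] = out.get(key, 0) + p
    return out

f1 = marginal((0,))        # p.m.f. of X1
fY = marginal((0, 2))      # p.m.f. of Y = (X1, X3)
p_all_equal = sum(p for x, p in pmf.items() if x[0] == x[1] == x[2])
```

The same `marginal` helper recovers the p.m.f.s of X₂ and X₃ with `idx=(1,)` and `idx=(2,)`.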

Example 2.3

Let X = (X₁, X₂, X₃) be a random vector of absolutely continuous type with joint p.d.f.

  f_X(x) = c/(x₁x₂), if 0 < x₃ < x₂ < x₁ < 1, and 0 otherwise,

where c is a real constant.

(i) Find the value of the constant c;
(ii) Find the marginal p.d.f. of Y = (X₂, X₃);
(iii) Find the marginal p.d.f. of X₂;
(iv) Find P({X₂ > 2X₃}).

Solution. (i) Clearly we must have c > 0. Also

  ∫_{ℝ³} f_X(x) dx = 1

  ⇒ ∫₀¹ ∫₀^{x₁} ∫₀^{x₂} c/(x₁x₂) dx₃ dx₂ dx₁ = 1

  ⇒ c = 1.

(ii) The marginal p.d.f. of Y = (X₂, X₃) is

  f_Y(y₁, y₂) = ∫_{−∞}^{∞} f_X(x₁, y₁, y₂) dx₁
              = ∫_{y₁}^{1} 1/(x₁y₁) dx₁, if 0 < y₂ < y₁ < 1, and 0 otherwise
              = −(ln y₁)/y₁, if 0 < y₂ < y₁ < 1, and 0 otherwise.

(iii) The marginal p.d.f. of X₂ is

  f_{X₂}(x₂) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_X(x₁, x₂, x₃) dx₁ dx₃
             = ∫₀^{x₂} ∫_{x₂}^{1} 1/(x₁x₂) dx₁ dx₃, if 0 < x₂ < 1, and 0 otherwise
             = −ln x₂, if 0 < x₂ < 1, and 0 otherwise.

(iv) Let A = {(x₁, x₂, x₃) ∈ ℝ³: x₂ > 2x₃}. Then

  P({X₂ > 2X₃}) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_X(x) I_A(x) dx

    = ∫_{0<x₃<x₂<x₁<1, x₂>2x₃} 1/(x₁x₂) dx₃ dx₂ dx₁

    = ∫₀¹ ∫₀^{x₁} ∫₀^{x₂/2} 1/(x₁x₂) dx₃ dx₂ dx₁

    = 1/2.
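A Monte Carlo check of P(X₂ > 2X₃) = 1/2 is possible because the joint p.d.f. factors sequentially: the marginal of X₁ is uniform on (0, 1), X₂ given X₁ is uniform on (0, X₁), and X₃ given X₂ is uniform on (0, X₂) (this decomposition is derived from the p.d.f. of this example, not stated in the text, so treat it as a sketch):

```python
import random

random.seed(0)

# Sampling scheme implied by f(x) = 1/(x1*x2) on 0 < x3 < x2 < x1 < 1:
# X1 ~ Unif(0,1), then X2 | X1 ~ Unif(0, X1), then X3 | X2 ~ Unif(0, X2).
n = 100_000
hits = 0
for _ in range(n):
    x1 = random.random()
    x2 = random.random() * x1
    x3 = random.random() * x2
    if x2 > 2 * x3:
        hits += 1
estimate = hits / n  # should be close to the exact value 1/2
```

Under this scheme X₂ > 2X₃ is equivalent to the third uniform being below 1/2, which is why the estimate concentrates tightly around 0.5.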

We conclude this section with the following remark.

Remark 2.2

(i) There are random vectors that are neither of discrete type nor of continuous type (and hence also not of absolutely continuous type). To see this let X = (X₁, X₂) have the joint distribution function

  F_{X₁,X₂}(x₁, x₂) = (1 + x₁x₂)/2, if 0 ≤ x₁ < 1, 0 ≤ x₂ < 1,
                    = (1 + x₁)/2,  if 0 ≤ x₁ < 1, x₂ ≥ 1,
                    = (1 + x₂)/2,  if x₁ ≥ 1, 0 ≤ x₂ < 1,
                    = 1,           if x₁ ≥ 1, x₂ ≥ 1,
                    = 0,           otherwise.

It is easy to verify that F_{X₁,X₂}(∙) is a distribution function (i.e., it satisfies properties (i)-(iv) of Theorem 1.3). The marginal distribution functions of X₁ and X₂ are

  F_{X₁}(x₁) = lim_{x₂→∞} F_{X₁,X₂}(x₁, x₂) = 0, if x₁ < 0; (1 + x₁)/2, if 0 ≤ x₁ < 1; 1, if x₁ ≥ 1,

and

  F_{X₂}(x₂) = lim_{x₁→∞} F_{X₁,X₂}(x₁, x₂) = 0, if x₂ < 0; (1 + x₂)/2, if 0 ≤ x₂ < 1; 1, if x₂ ≥ 1.

Clearly the set of discontinuity points of F_{X₁} (= F_{X₂}) is D = {0} and

  Σ_{x∈D} [F_{X₁}(x) − F_{X₁}(x−)] = Σ_{x∈D} [F_{X₂}(x) − F_{X₂}(x−)] = 1/2 ≠ 1.

It follows that X₁ and X₂ are not of discrete type and therefore, using Theorem 2.1 (i), it follows that (X₁, X₂) is not of discrete type.

Note that

  |F_{X₁,X₂}(x₁, x₂) − F_{X₁,X₂}(0, 0)| = |F_{X₁,X₂}(x₁, x₂) − 1/2|
                                        = 1/2, if x₁ < 0 or x₂ < 0; x₁x₂/2, if 0 ≤ x₁ < 1, 0 ≤ x₂ < 1,

which does not converge to 0 as (x₁, x₂) → (0, 0), i.e., F_{X₁,X₂}(∙) is not continuous at (0, 0). Therefore (X₁, X₂) is also not of continuous type.

(ii) There are random vectors which are of continuous type but not of absolutely continuous type. These random vectors are normally difficult to study.

3. Conditional Distributions

Let (Ω, ℱ, P) be a probability space and let X = (X₁, …, X_p): Ω → ℝ^p be a p-dimensional (p ≥ 2) random vector with distribution function F_X(∙).

Definition 3.1

Let D ∈ ℬ_p be such that P({X ∈ D}) > 0. Then the conditional distribution function of X given that X ∈ D is defined by

  F_{X|D}(x) = P({X ∈ (−∞, x]} | {X ∈ D})
             = P({X ∈ (−∞, x]} ∩ {X ∈ D}) / P({X ∈ D})
             = P({X₁ ≤ x₁, …, X_p ≤ x_p} ∩ {X ∈ D}) / P({X ∈ D}), x ∈ ℝ^p. ▄

For a given D ∈ ℬ_p it can be verified that F_{X|D}(∙) is a distribution function, i.e., it satisfies properties (i)-(iv) of Theorem 1.3. For a fixed q ∈ {1, …, p−1}, let Y = (X₁, …, X_q) (= (Y₁, …, Y_q), say) and Z = (X_{q+1}, …, X_p) (= (Z₁, …, Z_{p−q}), say), so that X = (Y, Z). In many situations it may be of interest to study the conditional probability distribution of the numerical characteristic Y given a fixed value of the numerical characteristic Z. For example, if Y and Z denote respectively the heights and weights of newly born babies in a community, then it may be of interest to study the probability distribution of heights of babies having weight of 3 kg (i.e., the conditional distribution of Y given that {Z = 3}).

To make the above discussion precise, first suppose that X = (Y, Z) is of discrete type so that Y and Z are also of discrete type (see Theorem 2.1 (i)). Let S_X, S_Y and S_Z denote the supports of X, Y and Z respectively. Further let f_X(∙) ≡ f_{Y,Z}(∙) and f_Z(∙) denote the joint p.m.f.s of X = (Y, Z) and of Z, respectively. Let z ∈ S_Z be fixed so that f_Z(z) = P({Z = z}) > 0. Define S_{Y|Z=z} = {y ∈ ℝ^q: (y, z) ∈ S_X}. Then S_{Y|Z=z} ⊆ S_Y = {y ∈ ℝ^q: (y, w) ∈ S_X for some w ∈ ℝ^{p−q}} and, using Definition 3.1, the conditional distribution function of Y given {Z = z} (= {Z ∈ {z}}) is given by

  F_{Y|Z}(y|z) = P({Y₁ ≤ y₁, …, Y_q ≤ y_q, Z = z}) / P({Z = z}), y ∈ ℝ^q   (3.1)

               = Σ_{t ∈ S_{Y|Z=z} ∩ (−∞, y]} f_X(t, z) / f_Z(z).   (3.2)

Clearly the p.m.f. corresponding to the distribution function F_{Y|Z}(∙|z) is (see Remark 2.1 (xi))

  f_{Y|Z}(y|z) = f_{Y,Z}(y, z)/f_Z(z), if y ∈ S_{Y|Z=z}, and 0 otherwise   (3.3)

               = f_{Y,Z}(y, z)/f_Z(z), y ∈ ℝ^q   (3.4)

               = P({Y = y} | {Z = z}), y ∈ ℝ^q.

The above discussion leads to the following definition.

Definition 3.2

Let X = (X₁, …, X_p) be a discrete type random vector. Then, under the above notation,

(i) the conditional p.m.f. of Y given Z = z (where z ∈ S_Z is fixed) is defined by (3.3) (or (3.4));
(ii) the conditional distribution function of Y given Z = z (where z ∈ S_Z is fixed) is defined by (3.1) (or (3.2)). ▄

Now suppose that X = (Y, Z) is of absolutely continuous type so that Y and Z are also of absolutely continuous type (see Theorem 2.1 (iii)). Let f_X(∙) ≡ f_{Y,Z}(∙), f_Y(∙) and f_Z(∙) denote the p.d.f.s of X, Y and Z respectively. Then we have P({Z = z}) = 0, ∀ z ∈ ℝ^{p−q} (Remark 2.1 (viii)) and therefore the conditional distribution function of Y given {Z = z} cannot be defined by (3.1). For z ∈ ℝ^{p−q}, note that

  {Z = z} = ⋂_{n=1}^{∞} {z_i − 1/n < Z_i ≤ z_i, i = 1, …, p−q},

and therefore, using continuity of probability measures,

  P({Z = z}) = lim_{n→∞} P({z_i − 1/n < Z_i ≤ z_i, i = 1, …, p−q})
             = lim_{h_i↓0, i=1,…,p−q} P({z_i − h_i < Z_i ≤ z_i, i = 1, …, p−q}).

Thus if z ∈ ℝ^{p−q} is such that

  P({z_i − ε_i < Z_i ≤ z_i, i = 1, …, p−q}) > 0, ∀ ε = (ε₁, …, ε_{p−q}) ∈ (0, ∞)^{p−q},   (3.5)

then the conditional distribution function of Y given Z = z may be defined by

  F_{Y|Z}(y|z) = lim_{h_i↓0, i=1,…,p−q} P({Y_i ≤ y_i, i = 1, …, q} | {z_i − h_i < Z_i ≤ z_i, i = 1, …, p−q})   (3.6)

    = lim_{h_i↓0} P({Y_i ≤ y_i, i = 1, …, q, z_i − h_i < Z_i ≤ z_i, i = 1, …, p−q}) / P({z_i − h_i < Z_i ≤ z_i, i = 1, …, p−q})

    = lim_{h_i↓0} [∫_{−∞}^{y₁} ⋯ ∫_{−∞}^{y_q} ∫_{z₁−h₁}^{z₁} ⋯ ∫_{z_{p−q}−h_{p−q}}^{z_{p−q}} f_{Y,Z}(t, w) dw dt] / [∫_{z₁−h₁}^{z₁} ⋯ ∫_{z_{p−q}−h_{p−q}}^{z_{p−q}} f_Z(w) dw]

    = [∫_{−∞}^{y₁} ⋯ ∫_{−∞}^{y_q} lim_{h_i↓0} (1/(h₁⋯h_{p−q})) ∫_{z₁−h₁}^{z₁} ⋯ ∫_{z_{p−q}−h_{p−q}}^{z_{p−q}} f_{Y,Z}(t, w) dw dt] / [lim_{h_i↓0} (1/(h₁⋯h_{p−q})) ∫_{z₁−h₁}^{z₁} ⋯ ∫_{z_{p−q}−h_{p−q}}^{z_{p−q}} f_Z(w) dw]

    = ∫_{−∞}^{y₁} ⋯ ∫_{−∞}^{y_q} f_{Y,Z}(t, z)/f_Z(z) dt, y ∈ ℝ^q,   (3.7)

provided f_Z(z) > 0 and z is such that (3.5) is satisfied. In that case the p.d.f. corresponding to the distribution function F_{Y|Z}(∙|z) is given by

  f_{Y|Z}(y|z) = f_{Y,Z}(y, z)/f_Z(z), y ∈ ℝ^q.   (3.8)

The above discussion is summarized in the following definition.

Definition 3.3

Let X = (X₁, …, X_p) be a random vector of absolutely continuous type. Let z ∈ ℝ^{p−q} be such that f_Z(z) > 0 and z satisfies (3.5). Then

(i) the conditional p.d.f. of Y given Z = z is defined by (3.8);
(ii) the conditional distribution function of Y given Z = z is defined by (3.6) (or (3.7)). ▄

Remark 3.1

Using (3.4) and (3.8), for fixed z ∈ S* = {z ∈ ℝ^{p−q}: f_{Y|Z}(∙|z) is defined}, the conditional p.m.f./p.d.f. of Y given Z = z is of the form

  f_{Y|Z}(y|z) = c(z) f_{Y,Z}(y, z), y ∈ ℝ^q,

where c(z) is a normalizing constant that may depend on z but not on y. ▄

Example 3.1

Let X = (X₁, X₂, X₃) be a discrete type random vector with p.m.f.

  f_X(x₁, x₂, x₃) = x₁x₂x₃/72, if (x₁, x₂, x₃) ∈ {1, 2} × {1, 2, 3} × {1, 3}, and 0 otherwise.

(i) Find the conditional p.m.f. of X₁ given that (X₂, X₃) = (2, 1);
(ii) Find the conditional p.m.f. of (X₁, X₃) given that X₂ = 3.

Solution. (i) We have

  f_{X₁|(X₂,X₃)}(x₁|(2, 1)) = P({X₁ = x₁, X₂ = 2, X₃ = 1}) / P({(X₂, X₃) = (2, 1)})
                            = (2x₁/72) / P({X₂ = 2, X₃ = 1}), if x₁ ∈ {1, 2}, and 0 otherwise,

where

  P({X₂ = 2, X₃ = 1}) = Σ_{x₁∈{1,2}} P({X₁ = x₁, X₂ = 2, X₃ = 1}) = (2/72)(1 + 2) = 1/12.

Therefore

  f_{X₁|(X₂,X₃)}(x₁|(2, 1)) = x₁/3, if x₁ ∈ {1, 2}, and 0 otherwise.

(ii) We have

  f_{X₁,X₃|X₂}(x₁, x₃|3) = P({X₁ = x₁, X₂ = 3, X₃ = x₃}) / P({X₂ = 3}).

Using Example 2.2, P({X₂ = 3}) = 1/2 and therefore

  f_{X₁,X₃|X₂}(x₁, x₃|3) = x₁x₃/12, if (x₁, x₃) ∈ {1, 2} × {1, 3}, and 0 otherwise.
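Conditioning a discrete p.m.f. is just slicing and renormalizing, which makes part (i) easy to verify mechanically. A minimal sketch, using the p.m.f. of this example:

```python
from fractions import Fraction
from itertools import product

# Joint p.m.f. of Example 3.1: f(x1,x2,x3) = x1*x2*x3/72 on {1,2}x{1,2,3}x{1,3}.
pmf = {x: Fraction(x[0] * x[1] * x[2], 72)
       for x in product([1, 2], [1, 2, 3], [1, 3])}

# Conditional p.m.f. of X1 given (X2, X3) = (2, 1): normalize the slice x2=2, x3=1.
slice_ = {x[0]: p for x, p in pmf.items() if (x[1], x[2]) == (2, 1)}
total = sum(slice_.values())          # P(X2 = 2, X3 = 1)
cond = {x1: p / total for x1, p in slice_.items()}
```

The normalizing constant `total` is exactly the P({X₂ = 2, X₃ = 1}) = 1/12 computed above.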

Example 3.2

Let X = (X₁, X₂, X₃) be a random vector of absolutely continuous type with joint p.d.f.

  f_X(x) = 1/(x₁x₂), if 0 < x₃ < x₂ < x₁ < 1, and 0 otherwise.

(i) For 0 < x₃ < x₂ < 1, find the conditional p.d.f. of X₁ given (X₂, X₃) = (x₂, x₃);
(ii) For 0 < x₂ < 1, find the conditional p.d.f. of (X₁, X₃) given X₂ = x₂.

Solution. (i) For 0 < x₃ < x₂ < 1,

  f_{X₁|(X₂,X₃)}(x₁|(x₂, x₃)) = f_{X₁,X₂,X₃}(x₁, x₂, x₃) / f_{X₂,X₃}(x₂, x₃), x₁ ∈ ℝ.

Using Example 2.3 (ii), for 0 < x₃ < x₂ < 1, we have

  f_{X₂,X₃}(x₂, x₃) = −(ln x₂)/x₂.

Therefore,

  f_{X₁|(X₂,X₃)}(x₁|(x₂, x₃)) = −1/(x₁ ln x₂), if x₂ < x₁ < 1, and 0 otherwise.

Alternatively, f_{X₁|(X₂,X₃)}(x₁|(x₂, x₃)) can be found using Remark 3.1.

(ii) For 0 < x₂ < 1,

  f_{X₁,X₃|X₂}(x₁, x₃|x₂) = f_{X₁,X₂,X₃}(x₁, x₂, x₃) / f_{X₂}(x₂), (x₁, x₃) ∈ ℝ².

Using Example 2.3 (iii) we have, for 0 < x₂ < 1,

  f_{X₂}(x₂) = −ln x₂.

Therefore, for 0 < x₂ < 1,

  f_{X₁,X₃|X₂}(x₁, x₃|x₂) = −1/(x₁x₂ ln x₂), if x₂ < x₁ < 1 and 0 < x₃ < x₂, and 0 otherwise.

Alternatively, f_{X₁,X₃|X₂}(x₁, x₃|x₂) can be found using Remark 3.1. ▄
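A basic sanity check on any conditional p.d.f. is that it integrates to 1 over its support. The sketch below does this for part (i) with a midpoint Riemann sum (the choice x₂ = 0.3 is arbitrary; the conditional density does not depend on x₃):

```python
import math

# Conditional p.d.f. of X1 given (X2, X3) = (x2, x3) from Example 3.2 (i):
# f(x1 | x2, x3) = -1 / (x1 * ln x2) for x2 < x1 < 1.
x2 = 0.3

def f_cond(x1):
    return -1.0 / (x1 * math.log(x2)) if x2 < x1 < 1 else 0.0

# Midpoint Riemann sum over (x2, 1); a valid p.d.f. must integrate to 1.
n = 10_000
h = (1 - x2) / n
integral = sum(f_cond(x2 + (k + 0.5) * h) for k in range(n)) * h
```

The exact antiderivative is −ln(x₁)/ln(x₂), which equals 1 at x₁ = 1 minus 0 at x₁ = x₂, so the numeric value should match 1 closely.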

4. Independent Random Variables

Let (Ω, ℱ, P) be a probability space and let {X_λ: λ ∈ Λ} be a collection of random variables, where Λ ⊆ ℝ is a non-empty index set.

Definition 4.1

The random variables {X_λ: λ ∈ Λ} are said to be (statistically) independent if for any finite sub-collection {λ₁, …, λ_p} ⊆ Λ we have

  F_{X_{λ₁},…,X_{λ_p}}(x₁, …, x_p) = ∏_{i=1}^{p} F_{X_{λ_i}}(x_i), ∀ x = (x₁, …, x_p) ∈ ℝ^p. ▄

The observations made in the following remark are immediate from Definition 4.1.

Remark 4.1

(i) The random variables {X_λ: λ ∈ Λ} are independent if, and only if, every finite sub-collection {X_{λ₁}, …, X_{λ_p}} ⊆ {X_λ: λ ∈ Λ} constitutes a collection of independent random variables;

(ii) Suppose that Λ₁ ⊆ Λ₂ ⊆ ℝ and Λ₁ ≠ ∅. Then

  {X_λ: λ ∈ Λ₂} are independent ⇒ {X_λ: λ ∈ Λ₁} are independent;

(iii) It can be shown that (see Theorem 5.3 (ii)) X₁, …, X_p are independent if, and only if, for any A_i ∈ ℬ₁, i = 1, …, p,

  P({X_i ∈ A_i, i = 1, …, p}) = ∏_{i=1}^{p} P({X_i ∈ A_i}).

Theorem 4.1

Let X = (X₁, …, X_p) be a p-dimensional (p ≥ 2) random vector with joint distribution function F_{X₁,…,X_p}(∙). Let F_{X_i}(∙) denote the marginal distribution function of X_i, i = 1, …, p. Then the random variables X₁, …, X_p are independent if, and only if,

  F_{X₁,…,X_p}(x₁, …, x_p) = ∏_{i=1}^{p} F_{X_i}(x_i), ∀ x = (x₁, …, x_p) ∈ ℝ^p.   (4.1)

Proof. First suppose that X₁, …, X_p are independent. Then, by definition, (4.1) obviously holds. Conversely suppose that (4.1) holds. Then, for any y ∈ ℝ^p and any permutation (α₁, …, α_p) of (1, …, p),

  P({X_i ≤ y_i, i = 1, …, p}) = ∏_{i=1}^{p} P({X_i ≤ y_i})

  ⇒ P({X_{α_i} ≤ y_{α_i}, i = 1, …, p}) = ∏_{i=1}^{p} P({X_{α_i} ≤ y_{α_i}})

  ⇒ F_{X_{α₁},…,X_{α_p}}(y_{α₁}, …, y_{α_p}) = ∏_{i=1}^{p} F_{X_{α_i}}(y_{α_i}), ∀ y = (y₁, …, y_p) ∈ ℝ^p, α = (α₁, …, α_p) ∈ 𝒫_p,

where 𝒫_p denotes the set of all permutations of (1, …, p). It follows that, for any (α₁, …, α_p) ∈ 𝒫_p and any x ∈ ℝ^p,

  F_{X_{α₁},…,X_{α_p}}(x₁, …, x_p) = ∏_{i=1}^{p} F_{X_{α_i}}(x_i).   (4.2)

Let m ∈ {2, …, p} and let {β₁, …, β_m} ⊆ {1, …, p} = Λ, say. Let β_{m+1}, …, β_p be such that Λ − {β₁, …, β_m} = {β_{m+1}, …, β_p}. Then (β₁, …, β_m, β_{m+1}, …, β_p) ∈ 𝒫_p and, by Lemma 1.2,

  F_{X_{β₁},…,X_{β_m}}(x₁, …, x_m) = lim_{x_j→∞, j=m+1,…,p} F_{X_{β₁},…,X_{β_p}}(x₁, …, x_p)

    = lim_{x_j→∞, j=m+1,…,p} ∏_{j=1}^{p} F_{X_{β_j}}(x_j)   (using (4.2))

    = ∏_{j=1}^{m} F_{X_{β_j}}(x_j), ∀ x = (x₁, …, x_m) ∈ ℝ^m.

Hence the result follows. ▄

The following remark is immediate from the above theorem and Remark 1.2 (ii).

Remark 4.2

Random variables X₁, …, X_p are independent if, and only if, for any α = (α₁, …, α_p) ∈ 𝒫_p the random variables X_{α₁}, …, X_{α_p} are independent. ▄

Theorem 4.2

Let X = (X₁, …, X_p) be a p-dimensional (p ≥ 2) random vector of either discrete type or of absolutely continuous type. Let f_{X₁,…,X_p}(∙) denote the joint p.m.f. (or p.d.f.) of X and let f_{X_i}(∙) denote the marginal p.m.f. (or p.d.f.) of X_i, i = 1, …, p. Then

(i) X₁, …, X_p are independent if, and only if,

  f_{X₁,…,X_p}(x₁, …, x_p) = ∏_{i=1}^{p} f_{X_i}(x_i), ∀ x = (x₁, …, x_p) ∈ ℝ^p.   (4.3)

(ii) X₁, …, X_p are independent if, and only if,

  f_{X₁,…,X_p}(x₁, …, x_p) = ∏_{i=1}^{p} g_i(x_i), ∀ x = (x₁, …, x_p) ∈ ℝ^p,   (4.4)

for some non-negative functions g₁(∙), …, g_p(∙). In that case f_{X_i}(x) = d_i g_i(x), x ∈ ℝ, i = 1, …, p, for some positive constants d₁, …, d_p.

(iii) X₁, X₂, …, X_p are independent ⇒ S_X = ∏_{i=1}^{p} S_{X_i},

where, for a random variable Y, S_Y = {y: f_Y(y) > 0}.

Proof. (i) For notational simplicity we will provide the proof for p = 2.

Case I. X is of discrete type.

Let S_X be the support of X = (X₁, X₂) and let S_{X_i} be the support of X_i, i = 1, 2. First suppose that (4.3) holds. Then clearly S_X = S_{X₁} × S_{X₂} (see (iii)). Therefore, for x = (x₁, x₂) ∈ ℝ²,

  F_{X₁,X₂}(x₁, x₂) = Σ_{y ∈ S_X ∩ (−∞, x]} f_{X₁,X₂}(y₁, y₂)

    = Σ_{y₁ ∈ S_{X₁} ∩ (−∞, x₁]} Σ_{y₂ ∈ S_{X₂} ∩ (−∞, x₂]} f_{X₁}(y₁) f_{X₂}(y₂)   (S_X = S_{X₁} × S_{X₂})

    = [Σ_{y₁ ∈ S_{X₁} ∩ (−∞, x₁]} f_{X₁}(y₁)] [Σ_{y₂ ∈ S_{X₂} ∩ (−∞, x₂]} f_{X₂}(y₂)]

    = F_{X₁}(x₁) F_{X₂}(x₂).

Using Theorem 4.1 it follows that X₁ and X₂ are independent.

Conversely suppose that X₁ and X₂ are independent. Then, by Theorem 4.1,

  F_{X₁,X₂}(t₁, t₂) = F_{X₁}(t₁) F_{X₂}(t₂), ∀ t = (t₁, t₂) ∈ ℝ².

Let x = (x₁, x₂) ∈ ℝ². Then, by Remark 2.1 (v),

  f_{X₁,X₂}(x₁, x₂) = P({X₁ = x₁, X₂ = x₂})

    = lim_{n→∞} [F_{X₁,X₂}(x₁, x₂) − F_{X₁,X₂}(x₁ − 1/n, x₂) − F_{X₁,X₂}(x₁, x₂ − 1/n) + F_{X₁,X₂}(x₁ − 1/n, x₂ − 1/n)]

    = lim_{n→∞} [F_{X₁}(x₁)F_{X₂}(x₂) − F_{X₁}(x₁ − 1/n)F_{X₂}(x₂) − F_{X₁}(x₁)F_{X₂}(x₂ − 1/n) + F_{X₁}(x₁ − 1/n)F_{X₂}(x₂ − 1/n)]

    = F_{X₁}(x₁)F_{X₂}(x₂) − F_{X₁}(x₁−)F_{X₂}(x₂) − F_{X₁}(x₁)F_{X₂}(x₂−) + F_{X₁}(x₁−)F_{X₂}(x₂−)

    = F_{X₂}(x₂)[F_{X₁}(x₁) − F_{X₁}(x₁−)] − F_{X₂}(x₂−)[F_{X₁}(x₁) − F_{X₁}(x₁−)]

    = [F_{X₁}(x₁) − F_{X₁}(x₁−)][F_{X₂}(x₂) − F_{X₂}(x₂−)]

    = P({X₁ = x₁}) P({X₂ = x₂})

    = f_{X₁}(x₁) f_{X₂}(x₂),

i.e., (4.3) holds.

Case II. X is of absolutely continuous type.

First suppose that (4.3) holds. Then, for x = (x₁, x₂) ∈ ℝ²,

  F_{X₁,X₂}(x₁, x₂) = ∫_{−∞}^{x₁} ∫_{−∞}^{x₂} f_{X₁,X₂}(y₁, y₂) dy₂ dy₁

    = ∫_{−∞}^{x₁} ∫_{−∞}^{x₂} f_{X₁}(y₁) f_{X₂}(y₂) dy₂ dy₁

    = [∫_{−∞}^{x₁} f_{X₁}(y₁) dy₁] [∫_{−∞}^{x₂} f_{X₂}(y₂) dy₂]

    = F_{X₁}(x₁) F_{X₂}(x₂).

Using Theorem 4.1 it follows that X₁ and X₂ are independent.

Conversely suppose that X₁ and X₂ are independent. Then, by Theorem 4.1,

  F_{X₁,X₂}(x₁, x₂) = F_{X₁}(x₁) F_{X₂}(x₂), ∀ x = (x₁, x₂) ∈ ℝ².

For simplicity assume that f_{X₁,X₂}(x₁, x₂) is continuous everywhere. Then, by Remark 2.1 (xiii),

  f_{X₁,X₂}(x₁, x₂) = ∂²F_{X₁,X₂}(x₁, x₂)/∂x₁∂x₂

    = ∂²(F_{X₁}(x₁)F_{X₂}(x₂))/∂x₁∂x₂

    = (dF_{X₁}(x₁)/dx₁)(dF_{X₂}(x₂)/dx₂)

    = f_{X₁}(x₁) f_{X₂}(x₂), ∀ x = (x₁, x₂) ∈ ℝ².

(ii) First suppose that X₁ and X₂ are independent. Then clearly (4.4) holds with the choice g_i(x_i) = f_{X_i}(x_i), x_i ∈ ℝ, i = 1, 2. Conversely suppose that (4.4) holds. Let

  c_i = ∫_{−∞}^{∞} g_i(x) dx, i = 1, 2,

so that c₁ ≥ 0, c₂ ≥ 0 and

  c₁c₂ = [∫_{−∞}^{∞} g₁(x₁) dx₁] [∫_{−∞}^{∞} g₂(x₂) dx₂]

       = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g₁(x₁) g₂(x₂) dx₁ dx₂

       = ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_{X₁,X₂}(x₁, x₂) dx₁ dx₂

       = 1.

It follows that c₁ > 0, c₂ > 0 and c₁c₂ = 1. Also

  f_{X₁}(x₁) = ∫_{−∞}^{∞} f_{X₁,X₂}(x₁, x₂) dx₂

             = ∫_{−∞}^{∞} g₁(x₁) g₂(x₂) dx₂

             = c₂ g₁(x₁), x₁ ∈ ℝ.

Similarly

  f_{X₂}(x₂) = c₁ g₂(x₂), x₂ ∈ ℝ.

Thus we have

  f_{X₁,X₂}(x₁, x₂) = g₁(x₁) g₂(x₂)

                    = (c₂ g₁(x₁))(c₁ g₂(x₂))   (c₁c₂ = 1)

                    = f_{X₁}(x₁) f_{X₂}(x₂), ∀ x = (x₁, x₂) ∈ ℝ².

Using (i) it follows that X₁ and X₂ are independent.

(iii) Since X₁ and X₂ are independent, by (i), f_{X₁,X₂}(x₁, x₂) = f_{X₁}(x₁) f_{X₂}(x₂) ∀ x ∈ ℝ². Therefore

  S_X = {(x₁, x₂): f_{X₁,X₂}(x₁, x₂) > 0}
      = {(x₁, x₂): f_{X₁}(x₁) f_{X₂}(x₂) > 0}
      = {x: f_{X₁}(x) > 0} × {y: f_{X₂}(y) > 0}
      = S_{X₁} × S_{X₂}. ▄

Remark 4.3

(i) Let X = (X₁, X₂) be a bivariate random vector of either discrete type or of absolutely continuous type. Let E₂ = {x₂ ∈ ℝ: f_{X₁|X₂}(∙|x₂) is defined}. Then by Theorem 4.2 (i)

  X₁ and X₂ are independent ⟺ f_{X₁,X₂}(x₁, x₂) = f_{X₁}(x₁) f_{X₂}(x₂), ∀ x = (x₁, x₂) ∈ ℝ²

    ⟺ f_{X₁,X₂}(x₁, x₂)/f_{X₂}(x₂) = f_{X₁}(x₁), ∀ x₁ ∈ ℝ, x₂ ∈ E₂

    ⟺ f_{X₁|X₂}(x₁|x₂) = f_{X₁}(x₁), ∀ x₁ ∈ ℝ, x₂ ∈ E₂.

It follows that X₁ and X₂ are independent if, and only if, for every x₂ ∈ E₂ the conditional distribution of X₁ given X₂ = x₂ is the same as the unconditional distribution of X₁. Similarly, by symmetry, X₁ and X₂ are independent if, and only if, for every x₁ ∈ E₁ = {x₁ ∈ ℝ: f_{X₂|X₁}(∙|x₁) is defined} the conditional distribution of X₂ given X₁ = x₁ is the same as the unconditional distribution of X₂.

(ii) Let Λ ⊆ ℝ be an arbitrary non-empty index set, and let {X_λ: λ ∈ Λ} be a collection of random vectors defined on a probability space (Ω, ℱ, P), where the X_λ's may be of different dimensions. One can define the independence of the random vectors {X_λ: λ ∈ Λ} by extending Definition 4.1 in an obvious manner. We say that the random vectors {X_λ: λ ∈ Λ} are independent if for any finite sub-collection {λ₁, …, λ_p} ⊆ Λ we have

  F_{X_{λ₁},…,X_{λ_p}}(x₁, …, x_p) = P({X_{λ_i} ∈ (−∞, x_i], i = 1, …, p})

    = ∏_{i=1}^{p} P({X_{λ_i} ∈ (−∞, x_i]})

    = ∏_{i=1}^{p} F_{X_{λ_i}}(x_i), ∀ x₁, …, x_p.

With the above definition of independence of random vectors {X_λ: λ ∈ Λ}, the results stated in Theorems 4.1 and 4.2 hold with random variables X₁, …, X_p replaced by random vectors X₁, …, X_p. Moreover, Remarks 4.1, 4.2 and 4.3 (i) also hold with random variables replaced by random vectors.

(iii) Let X = (X₁, …, X_p) be a random vector and let q₁, …, q_m be positive integers such that Σ_{i=1}^{m} q_i = p. Define Y₁ = (X₁, …, X_{q₁}), Y₂ = (X_{q₁+1}, …, X_{q₁+q₂}) and, in general, Y_i = (X_{q₁+⋯+q_{i−1}+1}, …, X_{q₁+⋯+q_i}), i = 2, 3, …, m. Suppose that X₁, …, X_p are independent random variables. Then, on using the analog of Theorem 4.1 for random vectors, it follows that Y₁, …, Y_m are independent random vectors. ▄

Theorem 4.3

Let X₁, …, X_p be independent random vectors such that X_i is m_i-dimensional, i = 1, …, p. Let ψ_i: ℝ^{m_i} → ℝ^{n_i}, i = 1, …, p, be Borel functions. Then ψ₁(X₁), …, ψ_p(X_p) are independent.

Proof. Let X = (X₁, …, X_p) and let Y_i = ψ_i(X_i), i = 1, …, p. For fixed y_i ∈ ℝ^{n_i} define A_i = {x ∈ ℝ^{m_i}: ψ_i(x) ≤ y_i}, i = 1, …, p (where, for x, y ∈ ℝ^k, x ≤ y means x_i ≤ y_i, i = 1, …, k). Then, for y_i ∈ ℝ^{n_i}, i = 1, …, p, the joint distribution function of Y₁ = ψ₁(X₁), …, Y_p = ψ_p(X_p) is given by

  F_{Y₁,…,Y_p}(y₁, …, y_p) = P({Y₁ ∈ (−∞, y₁], …, Y_p ∈ (−∞, y_p]})

    = P({X₁ ∈ A₁, …, X_p ∈ A_p})

    = ∏_{j=1}^{p} P({X_j ∈ A_j})   (using Remark 4.1 (iii))

    = ∏_{j=1}^{p} P({Y_j ≤ y_j})

    = ∏_{j=1}^{p} F_{Y_j}(y_j),

where F_{Y_j}(∙) denotes the marginal distribution function of Y_j, j = 1, 2, …, p. Now, using the version of Theorem 4.1 for random vectors, it follows that Y₁, …, Y_p are independent. ▄

Example 4.1

Let X = (X₁, X₂, X₃) be a discrete type random vector with joint p.m.f.

  f_X(x₁, x₂, x₃) = x₁x₂x₃/72, if (x₁, x₂, x₃) ∈ {1, 2} × {1, 2, 3} × {1, 3}, and 0 otherwise.

(i) Are X₁, X₂ and X₃ independent random variables?
(ii) Are X₁ and X₃ independent random variables?

Solution. (i) From Example 2.2 (ii) we have

  f_{X₁}(x₁) = x₁/3, if x₁ ∈ {1, 2}, and 0 otherwise;
  f_{X₂}(x₂) = x₂/6, if x₂ ∈ {1, 2, 3}, and 0 otherwise;
  f_{X₃}(x₃) = x₃/4, if x₃ ∈ {1, 3}, and 0 otherwise.

Clearly

  f_{X₁,X₂,X₃}(x₁, x₂, x₃) = f_{X₁}(x₁) f_{X₂}(x₂) f_{X₃}(x₃), ∀ x = (x₁, x₂, x₃) ∈ ℝ³.

Now using Theorem 4.2 (i) it follows that X₁, X₂ and X₃ are independent.

One can also directly infer the independence of X₁, X₂ and X₃ from Theorem 4.2 (ii) by noting that

  f_{X₁,X₂,X₃}(x₁, x₂, x₃) = g₁(x₁) g₂(x₂) g₃(x₃), ∀ x = (x₁, x₂, x₃) ∈ ℝ³,

where

  g₁(x₁) = x₁/72, if x₁ ∈ {1, 2}, and 0 otherwise;
  g₂(x₂) = x₂, if x₂ ∈ {1, 2, 3}, and 0 otherwise;
  g₃(x₃) = x₃, if x₃ ∈ {1, 3}, and 0 otherwise.

(ii) From Example 2.2 (iii) we have

  f_{X₁,X₃}(x₁, x₃) = x₁x₃/12, if (x₁, x₃) ∈ {1, 2} × {1, 3}, and 0 otherwise.

Clearly

  f_{X₁,X₃}(x₁, x₃) = f_{X₁}(x₁) f_{X₃}(x₃), ∀ (x₁, x₃) ∈ ℝ².

Therefore X₁ and X₃ are independent. ▄
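The factorization criterion of Theorem 4.2 (i) can be checked pointwise on a grid that covers the support and some points off it. A sketch for the p.m.f. of this example:

```python
from fractions import Fraction
from itertools import product

# Example 4.1: check f(x1,x2,x3) = f1(x1)*f2(x2)*f3(x3) everywhere on a grid.
def f(x1, x2, x3):
    if x1 in (1, 2) and x2 in (1, 2, 3) and x3 in (1, 3):
        return Fraction(x1 * x2 * x3, 72)
    return Fraction(0)

def f1(x1): return Fraction(x1, 3) if x1 in (1, 2) else Fraction(0)
def f2(x2): return Fraction(x2, 6) if x2 in (1, 2, 3) else Fraction(0)
def f3(x3): return Fraction(x3, 4) if x3 in (1, 3) else Fraction(0)

grid = product(range(0, 4), repeat=3)   # includes points off the support
factorizes = all(f(*x) == f1(x[0]) * f2(x[1]) * f3(x[2]) for x in grid)
```

Note that the check includes points where the joint p.m.f. is 0; there the product is 0 as well, which is exactly the content of Theorem 4.2 (iii).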

Example 4.2

Let X = (X₁, X₂, X₃) be a random vector of absolutely continuous type with p.d.f.

  f_X(x₁, x₂, x₃) = 1/(x₁x₂), if 0 < x₃ < x₂ < x₁ < 1, and 0 otherwise.

(i) Are X₁, X₂ and X₃ independent random variables?
(ii) Let x₂ ∈ (0, 1) be fixed. Are X₁ and X₃ independent given X₂ = x₂?

Solution. (i) We have

  f_{X₁}(x₁) = ∫₀^{x₁} ∫₀^{x₂} 1/(x₁x₂) dx₃ dx₂, if 0 < x₁ < 1, and 0 otherwise
             = 1, if 0 < x₁ < 1, and 0 otherwise,

  f_{X₂}(x₂) = −ln x₂, if 0 < x₂ < 1, and 0 otherwise   (see Example 2.3 (iii)),

and

  f_{X₃}(x₃) = ∫_{x₃}^{1} ∫_{x₂}^{1} 1/(x₁x₂) dx₁ dx₂, if 0 < x₃ < 1, and 0 otherwise
             = (ln x₃)²/2, if 0 < x₃ < 1, and 0 otherwise.

Clearly

  f_{X₁,X₂,X₃}(x₁, x₂, x₃) ≠ f_{X₁}(x₁) f_{X₂}(x₂) f_{X₃}(x₃) for some (x₁, x₂, x₃) ∈ ℝ³,

and therefore X₁, X₂ and X₃ are not independent.

Note that S_X = {(x₁, x₂, x₃): f_X(x₁, x₂, x₃) > 0} = {(x₁, x₂, x₃): 0 < x₃ < x₂ < x₁ < 1} and S_{X₁} = {x₁: f_{X₁}(x₁) > 0} = (0, 1) = S_{X₂} = S_{X₃}. Since S_X ≠ S_{X₁} × S_{X₂} × S_{X₃}, one can also infer the non-independence of X₁, X₂ and X₃ from Theorem 4.2 (iii).

(ii) Fix x₂ ∈ (0, 1). From Example 3.2 (ii) we have

  f_{X₁,X₃|X₂}(x₁, x₃|x₂) = f_{X₁,X₂,X₃}(x₁, x₂, x₃)/f_{X₂}(x₂)
                          = −1/(x₁x₂ ln x₂), if x₂ < x₁ < 1 and 0 < x₃ < x₂, and 0 otherwise.

Also it is easy to see that

  f_{X₁|X₂}(x₁|x₂) = f_{X₁,X₂}(x₁, x₂)/f_{X₂}(x₂) = −1/(x₁ ln x₂), if x₂ < x₁ < 1, and 0 otherwise,

and

  f_{X₃|X₂}(x₃|x₂) = f_{X₂,X₃}(x₂, x₃)/f_{X₂}(x₂) = 1/x₂, if 0 < x₃ < x₂, and 0 otherwise.

Clearly, for fixed x₂ ∈ (0, 1),

  f_{X₁,X₃|X₂}(x₁, x₃|x₂) = f_{X₁|X₂}(x₁|x₂) f_{X₃|X₂}(x₃|x₂), ∀ (x₁, x₃) ∈ ℝ².

Now using Theorem 4.2 (i) on the conditional p.d.f. of (X₁, X₃) given X₂ = x₂ it follows that, given X₂ = x₂, the random variables X₁ and X₃ are conditionally independent.

One can also infer the conditional independence of X₁ and X₃ given X₂ = x₂ directly from Theorem 4.2 (ii) by noting that, for a fixed x₂ ∈ (0, 1),

  f_{X₁,X₃|X₂}(x₁, x₃|x₂) = f_{X₁,X₂,X₃}(x₁, x₂, x₃)/f_{X₂}(x₂)
                          = c(x₂) f_{X₁,X₂,X₃}(x₁, x₂, x₃)
                          = g₁^{(x₂)}(x₁) g₂^{(x₂)}(x₃), (x₁, x₃) ∈ ℝ²,

where, for a fixed x₂ ∈ (0, 1),

  g₁^{(x₂)}(x₁) = c(x₂)/(x₁x₂), if x₂ < x₁ < 1, and 0 otherwise,

and

  g₂^{(x₂)}(x₃) = 1, if 0 < x₃ < x₂, and 0 otherwise. ▄
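The conditional factorization in part (ii) can be confirmed numerically at a fixed value of x₂ (the choice x₂ = 0.4 below is arbitrary), comparing the conditional joint density with the product of the two conditional marginals on a grid:

```python
import math

# Example 4.2 (ii): with x2 fixed, the conditional joint density of (X1, X3)
# given X2 = x2 should equal the product of the conditional marginals.
x2 = 0.4

def f_joint(x1, x3):
    return -1.0 / (x1 * x2 * math.log(x2)) if (x2 < x1 < 1 and 0 < x3 < x2) else 0.0

def f1(x1):
    return -1.0 / (x1 * math.log(x2)) if x2 < x1 < 1 else 0.0

def f3(x3):
    return 1.0 / x2 if 0 < x3 < x2 else 0.0

pts = [(a / 10, b / 10) for a in range(1, 10) for b in range(1, 10)]
factorizes = all(abs(f_joint(x1, x3) - f1(x1) * f3(x3)) < 1e-12 for x1, x3 in pts)
```

The grid deliberately includes points outside the conditional support, where both sides are 0.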

5. Expectations and Moments

Let X = (X₁, …, X_p) be a p-dimensional random vector of either discrete type or of absolutely continuous type. Let f_X(∙) and S_X = {x ∈ ℝ^p: f_X(x) > 0} denote respectively the p.m.f. (or p.d.f.) and the support of X (or of f_X). Further let f_{X_i}(∙) and S_{X_i} = {x ∈ ℝ: f_{X_i}(x) > 0} denote respectively the p.m.f. (or p.d.f.) and support of X_i (or of f_{X_i}), i = 1, …, p.

The proof of the following theorem, being similar to that of Theorem 3.2, Module 3, is omitted.

Theorem 5.1

Let ψ: ℝ^p → ℝ be a Borel function such that E(ψ(X)) is finite.

(i) If X is of discrete type then

  E(ψ(X)) = Σ_{x ∈ S_X} ψ(x) f_X(x).

(ii) If X is of absolutely continuous type then

  E(ψ(X)) = ∫_{ℝ^p} ψ(x) f_X(x) dx. ▄

Definition 5.1

Some special kinds of expectations are defined below:

(i) For non-negative integers k₁, …, k_p, let ψ(x) = x₁^{k₁} ⋯ x_p^{k_p}. Then

  m_{k₁,…,k_p} = E(X₁^{k₁} ⋯ X_p^{k_p}),

provided it is finite, is called a joint moment of order k₁ + ⋯ + k_p of X;

(ii) For non-negative integers k₁, …, k_p, let ψ(x) = (x₁ − E(X₁))^{k₁} ⋯ (x_p − E(X_p))^{k_p}. Then

  μ_{k₁,…,k_p} = E((X₁ − E(X₁))^{k₁} ⋯ (X_p − E(X_p))^{k_p}),

provided it is finite, is called a joint central moment of order k₁ + ⋯ + k_p of X;

(iii) Let ψ(x) = (x_i − E(X_i))(x_j − E(X_j)), i, j = 1, …, p. Then

  Cov(X_i, X_j) = E((X_i − E(X_i))(X_j − E(X_j))),

provided it is finite, is called the covariance between X_i and X_j. ▄

Note that

  Cov(X_i, X_i) = E((X_i − E(X_i))²) = Var(X_i), i = 1, …, p,

and, for i, j ∈ {1, …, p}, i ≠ j,

  Cov(X_i, X_j) = E((X_i − E(X_i))(X_j − E(X_j)))
                = E((X_j − E(X_j))(X_i − E(X_i)))
                = Cov(X_j, X_i).

Also, for i, j ∈ {1, …, p},

  Cov(X_i, X_j) = E((X_i − E(X_i))(X_j − E(X_j))) = E(X_iX_j) − E(X_i)E(X_j).

Theorem 5.2

Let X = (X₁, …, X_{p₁}) and Y = (Y₁, …, Y_{p₂}) be random vectors and let a₁, …, a_{p₁}, b₁, …, b_{p₂} be real constants. Then, provided the involved expectations are finite,

(i) E(Σ_{i=1}^{p₁} a_iX_i) = Σ_{i=1}^{p₁} a_i E(X_i);

(ii) Cov(Σ_{i=1}^{p₁} a_iX_i, Σ_{j=1}^{p₂} b_jY_j) = Σ_{i=1}^{p₁} Σ_{j=1}^{p₂} a_ib_j Cov(X_i, Y_j).

In particular

  Var(Σ_{i=1}^{p₁} a_iX_i) = Σ_{i=1}^{p₁} a_i² Var(X_i) + Σ_{i=1}^{p₁} Σ_{j=1, j≠i}^{p₁} a_ia_j Cov(X_i, X_j)

                          = Σ_{i=1}^{p₁} a_i² Var(X_i) + 2 Σ_{1≤i<j≤p₁} a_ia_j Cov(X_i, X_j).

Proof. We will provide the proof for the absolutely continuous case. The proof for the discrete case follows similarly.

(i) Let f_X(∙) denote the joint p.d.f. of X = (X₁, …, X_{p₁}). Then

  E(Σ_{i=1}^{p₁} a_iX_i) = ∫_{ℝ^{p₁}} (Σ_{i=1}^{p₁} a_ix_i) f_X(x) dx

    = Σ_{i=1}^{p₁} a_i ∫_{ℝ^{p₁}} x_i f_X(x) dx

    = Σ_{i=1}^{p₁} a_i E(X_i).   (using Theorem 5.1)

(ii) We have

  Cov(Σ_{i=1}^{p₁} a_iX_i, Σ_{j=1}^{p₂} b_jY_j)

    = E[(Σ_{i=1}^{p₁} a_iX_i − E(Σ_{i=1}^{p₁} a_iX_i))(Σ_{j=1}^{p₂} b_jY_j − E(Σ_{j=1}^{p₂} b_jY_j))]

    = E[(Σ_{i=1}^{p₁} a_iX_i − Σ_{i=1}^{p₁} a_iE(X_i))(Σ_{j=1}^{p₂} b_jY_j − Σ_{j=1}^{p₂} b_jE(Y_j))]   (using (i))

    = E[(Σ_{i=1}^{p₁} a_i(X_i − E(X_i)))(Σ_{j=1}^{p₂} b_j(Y_j − E(Y_j)))]

    = E[Σ_{i=1}^{p₁} Σ_{j=1}^{p₂} a_ib_j (X_i − E(X_i))(Y_j − E(Y_j))]

    = Σ_{i=1}^{p₁} Σ_{j=1}^{p₂} a_ib_j E[(X_i − E(X_i))(Y_j − E(Y_j))]   (again using (i))

    = Σ_{i=1}^{p₁} Σ_{j=1}^{p₂} a_ib_j Cov(X_i, Y_j).

Also,

  Var(Σ_{i=1}^{p₁} a_iX_i) = Cov(Σ_{i=1}^{p₁} a_iX_i, Σ_{j=1}^{p₁} a_jX_j)

    = Σ_{i=1}^{p₁} Σ_{j=1}^{p₁} a_ia_j Cov(X_i, X_j)

    = Σ_{i=1}^{p₁} a_i² Cov(X_i, X_i) + Σ_{i=1}^{p₁} Σ_{j=1, j≠i}^{p₁} a_ia_j Cov(X_i, X_j)

    = Σ_{i=1}^{p₁} a_i² Var(X_i) + Σ_{i=1}^{p₁} Σ_{j=1, j≠i}^{p₁} a_ia_j Cov(X_i, X_j)

    = Σ_{i=1}^{p₁} a_i² Var(X_i) + 2 Σ_{1≤i<j≤p₁} a_ia_j Cov(X_i, X_j)

(since Cov(X_i, X_j) = Cov(X_j, X_i)). ▄
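The variance identity above can be checked exactly on a small discrete distribution. The sketch below reuses the p.m.f. of Example 2.2 and an arbitrary coefficient vector a = (2, −1, 3), computing Var(Σ aᵢXᵢ) both directly and via the covariance expansion:

```python
from fractions import Fraction
from itertools import product

# p.m.f. of Example 2.2: f(x1,x2,x3) = x1*x2*x3/72 on {1,2}x{1,2,3}x{1,3}.
pmf = {x: Fraction(x[0] * x[1] * x[2], 72)
       for x in product([1, 2], [1, 2, 3], [1, 3])}
a = (2, -1, 3)

def E(g):
    return sum(g(x) * p for x, p in pmf.items())

mean = [E(lambda x, i=i: x[i]) for i in range(3)]
var = [E(lambda x, i=i: (x[i] - mean[i]) ** 2) for i in range(3)]
cov = {(i, j): E(lambda x, i=i, j=j: (x[i] - mean[i]) * (x[j] - mean[j]))
       for i in range(3) for j in range(3)}

# Direct computation of Var(sum a_i X_i) ...
lhs = E(lambda x: (sum(a[i] * x[i] for i in range(3))
                   - sum(a[i] * mean[i] for i in range(3))) ** 2)
# ... versus the expansion of Theorem 5.2.
rhs = (sum(a[i] ** 2 * var[i] for i in range(3))
       + 2 * sum(a[i] * a[j] * cov[(i, j)]
                 for i in range(3) for j in range(3) if i < j))
```

For this particular p.m.f. the cross-covariances vanish (the coordinates are independent), but the identity `lhs == rhs` holds for any joint p.m.f.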


Theorem 5.3

Let X₁, …, X_p be independent random vectors where X_i is m_i-dimensional, i = 1, …, p.

(i) Let ψ_i: ℝ^{m_i} → ℝ, i = 1, 2, …, p, be Borel functions. Then

  E(∏_{i=1}^{p} ψ_i(X_i)) = ∏_{i=1}^{p} E(ψ_i(X_i)),

provided the involved expectations are finite.

(ii) For A_i ∈ ℬ_{m_i}, i = 1, …, p,

  P({X_i ∈ A_i, i = 1, …, p}) = ∏_{i=1}^{p} P({X_i ∈ A_i}).

Proof. We will provide the proof for the absolutely continuous case. The proof for the discrete case follows similarly and is left as an exercise.

(i) Let X = (X₁, …, X_p). Since X₁, …, X_p are independent, we have

  f_X(x₁, …, x_p) = ∏_{i=1}^{p} f_{X_i}(x_i), ∀ (x₁, …, x_p) ∈ ℝ^m,

where m = Σ_{i=1}^{p} m_i. Therefore,

  E(∏_{i=1}^{p} ψ_i(X_i)) = ∫_{−∞}^{∞} ⋯ ∫_{−∞}^{∞} (∏_{i=1}^{p} ψ_i(x_i)) (∏_{i=1}^{p} f_{X_i}(x_i)) dx₁ ⋯ dx_p

    = ∫_{−∞}^{∞} ⋯ ∫_{−∞}^{∞} ∏_{i=1}^{p} (ψ_i(x_i) f_{X_i}(x_i)) dx₁ ⋯ dx_p

    = [∫_{ℝ^{m₁}} ψ₁(x₁) f_{X₁}(x₁) dx₁] ⋯ [∫_{ℝ^{m_p}} ψ_p(x_p) f_{X_p}(x_p) dx_p]

    = E(ψ₁(X₁)) ⋯ E(ψ_p(X_p)).

(ii) Let

  ψ_i(X_i) = 1, if X_i ∈ A_i, and 0 otherwise, i = 1, …, p,

so that

  ∏_{i=1}^{p} ψ_i(X_i) = 1, if X_i ∈ A_i, i = 1, …, p, and 0 otherwise.

Now using (i) we get

  E(∏_{i=1}^{p} ψ_i(X_i)) = ∏_{i=1}^{p} E(ψ_i(X_i))

  ⟹ P({X_i ∈ A_i, i = 1, …, p}) = ∏_{i=1}^{p} P({X_i ∈ A_i}). ▄

Corollary 5.1

Let X₁, …, X_p be independent random variables. Then

  Cov(X_i, X_j) = 0, ∀ i ≠ j,

and, for real constants a₁, …, a_p,

  Var(Σ_{i=1}^{p} a_iX_i) = Σ_{i=1}^{p} a_i² Var(X_i),

provided the involved expectations exist.

Proof. Fix i, j ∈ {1, …, p}, i ≠ j. Using Theorem 5.3 (i), we have

  E(X_iX_j) = E(X_i)E(X_j)

  ⟹ Cov(X_i, X_j) = E(X_iX_j) − E(X_i)E(X_j) = 0.

By Theorem 5.2 we have

  Var(Σ_{i=1}^{p} a_iX_i) = Σ_{i=1}^{p} a_i² Var(X_i) + Σ_{i=1}^{p} Σ_{j=1, j≠i}^{p} a_ia_j Cov(X_i, X_j)

    = Σ_{i=1}^{p} a_i² Var(X_i)   (since Cov(X_i, X_j) = 0, i ≠ j). ▄

Definition 5.2

(i) The correlation coefficient between random variables X and Y is defined by

  ρ(X, Y) = Cov(X, Y)/√(Var(X)Var(Y)),

provided 0 < Var(X), Var(Y) < ∞.

(ii) Random variables X and Y are said to be uncorrelated if Cov(X, Y) = 0. ▄

Note that ρ(X, Y) = ρ(Y, X). Also from Corollary 5.1 it is clear that if X and Y are independent random variables then they are uncorrelated. However, as the following examples illustrate, the converse may not be true (i.e., uncorrelated random variables may not be independent).

Example 5.1

Let (X, Y) be a bivariate random vector of discrete type with p.m.f. given by

  (x, y):        (−1, 1)   (0, 0)   (1, 1)
  f_{X,Y}(x, y):   p₁        p₂       p₁

where p₁ ∈ (0, 1), p₂ ∈ (0, 1) and 2p₁ + p₂ = 1.

Clearly

  E(X) = (−1)p₁ + (0)p₂ + (1)p₁ = 0,

  E(XY) = (−1)(1)p₁ + (0)(0)p₂ + (1)(1)p₁ = 0,

  E(Y) = (1)p₁ + (0)p₂ + (1)p₁ = 2p₁

  ⟹ Cov(X, Y) = E(XY) − E(X)E(Y) = 0

  ⟹ ρ(X, Y) = 0.

However

  P({(X, Y) = (−1, 1)}) = p₁ ≠ 2p₁² = P({X = −1}) P({Y = 1}).
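The same "uncorrelated but dependent" conclusion can be verified by direct enumeration; the sketch below picks p₁ = 1/4, p₂ = 1/2 (any p₁, p₂ ∈ (0, 1) with 2p₁ + p₂ = 1 would do):

```python
from fractions import Fraction

# Example 5.1: (X, Y) takes (-1,1), (0,0), (1,1) with probabilities p1, p2, p1.
p1, p2 = Fraction(1, 4), Fraction(1, 2)
pmf = {(-1, 1): p1, (0, 0): p2, (1, 1): p1}

E = lambda g: sum(g(x, y) * p for (x, y), p in pmf.items())
cov = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)

# Dependence: the joint probability at (-1, 1) differs from the product of marginals.
pX_neg1 = sum(p for (x, y), p in pmf.items() if x == -1)
pY_1 = sum(p for (x, y), p in pmf.items() if y == 1)
p_joint = pmf[(-1, 1)]
```

Zero covariance with unequal joint and product probabilities is exactly the failure of the converse noted above.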

Example 5.2

Let X = (X₁, X₂) be a bivariate random vector of absolutely continuous type with p.d.f. given by

  f_X(x₁, x₂) = 1, if 0 < |x₂| ≤ x₁ < 1, and 0 otherwise.

Then

  E(X₁X₂) = ∫₀¹ ∫_{−x₁}^{x₁} x₁x₂ dx₂ dx₁ = 0,

  E(X₁) = ∫₀¹ ∫_{−x₁}^{x₁} x₁ dx₂ dx₁ = 2/3,

  E(X₂) = ∫₀¹ ∫_{−x₁}^{x₁} x₂ dx₂ dx₁ = 0,

and

  Cov(X₁, X₂) = E(X₁X₂) − E(X₁)E(X₂) = 0.

Therefore,

  ρ(X₁, X₂) = 0,

i.e., X₁ and X₂ are uncorrelated. Also

  f_{X₁}(x₁) = ∫_{−x₁}^{x₁} dx₂, if 0 < x₁ < 1, and 0 otherwise
             = 2x₁, if 0 < x₁ < 1, and 0 otherwise,

and

  f_{X₂}(x₂) = ∫_{|x₂|}^{1} dx₁, if −1 < x₂ < 1, and 0 otherwise
             = 1 − |x₂|, if −1 < x₂ < 1, and 0 otherwise.

Clearly

  f_{X₁,X₂}(x₁, x₂) ≠ f_{X₁}(x₁) f_{X₂}(x₂) for some x = (x₁, x₂) ∈ ℝ²,

and therefore X₁ and X₂ are not independent.

One can also infer that X₁ and X₂ are not independent by directly observing from the joint p.d.f. f_X(∙) that S_X = {x ∈ ℝ²: f_X(x) > 0} = {(x₁, x₂): 0 < |x₂| ≤ x₁ < 1}, S_{X₁} = {x ∈ ℝ: f_{X₁}(x) > 0} = (0, 1), S_{X₂} = {x ∈ ℝ: f_{X₂}(x) > 0} = (−1, 1) and that S_X ≠ S_{X₁} × S_{X₂}.
Theorem 5.4 (Cauchy-Schwarz Inequality for Random Variables)

Let (X, Y) be a bivariate random vector. Then, provided the involved expectations exist,

  (E(XY))² ≤ E(X²)E(Y²).   (5.1)

The equality in (5.1) is attained if, and only if, P({Y = cX}) = 1 or P({X = cY}) = 1, for some real constant c.

Proof. Consider the following two cases.

Case I. E(X²) = 0.

In this case P({X = 0}) = 1 (see Theorem 3.3 (iii), Module 3) and hence P({XY = 0}) = 1. It follows that E(XY) = 0 and E(X²)E(Y²) = 0, so that P({X = cY}) = 1 (for c = 0) and the equality in (5.1) is attained.

Case II. E(X²) > 0.

Then,

  0 ≤ E((λX − Y)²) = λ²E(X²) − 2λE(XY) + E(Y²), ∀ λ ∈ ℝ,

i.e., the quadratic λ²E(X²) − 2λE(XY) + E(Y²) in λ is non-negative for every λ ∈ ℝ. This implies that the discriminant of the quadratic equation λ²E(X²) − 2λE(XY) + E(Y²) = 0 is non-positive, i.e.,

  4(E(XY))² − 4E(X²)E(Y²) ≤ 0

  ⇒ (E(XY))² ≤ E(X²)E(Y²),

and the equality is attained if, and only if,

  E((λX − Y)²) = 0, for some λ ∈ ℝ

  ⟺ P({Y = λX}) = 1, for some λ ∈ ℝ. ▄

Corollary 5.2

Let (X₁, X₂) be a bivariate random vector with E(X_i) = μ_i ∈ (−∞, ∞) and Var(X_i) = σ_i² ∈ (0, ∞), i = 1, 2. Then

(i) |ρ(X₁, X₂)| ≤ 1;

(ii) ρ(X₁, X₂) = ±1 if, and only if, P({(X₁ − μ₁)/σ₁ = d(X₂ − μ₂)/σ₂}) = 1, for some real constant d.

Proof. Taking X = X₁ − μ₁ and Y = X₂ − μ₂ in Theorem 5.4, we get

  (E((X₁ − μ₁)(X₂ − μ₂)))² ≤ E((X₁ − μ₁)²) E((X₂ − μ₂)²)

  ⟺ ρ²(X₁, X₂) ≤ 1

  ⟺ |ρ(X₁, X₂)| ≤ 1,

and the equality is attained if, and only if,

  P({X₁ − μ₁ = c(X₂ − μ₂)}) = 1, for some c ∈ ℝ

  ⟺ P({(X₁ − μ₁)/σ₁ = d(X₂ − μ₂)/σ₂}) = 1, for some d ∈ ℝ. ▄

Let X = (Y, Z) be a p-dimensional random vector of either discrete type or of absolutely continuous type and let Y and Z respectively be p₁- and p₂-dimensional, so that p = p₁ + p₂. For a given z ∈ S_Z (or z satisfying (3.5) and f_Z(z) > 0) the conditional p.m.f. (or p.d.f.) of Y given Z = z is given by

  f_{Y|Z}(y|z) = f_{Y,Z}(y, z)/f_Z(z), y ∈ ℝ^{p₁}.

Let ψ: ℝ^{p₁} → ℝ be a Borel function. Then the conditional expectation of ψ(Y) given that Z = z may be defined by

  E(ψ(Y)|Z = z) = ∫_{ℝ^{p₁}} ψ(y) f_{Y|Z}(y|z) dy

(with the integral replaced by a sum in the discrete case), provided the expectation is finite.

Similarly the conditional variance of ψ(Y), given that Z = z, may be defined by

  Var(ψ(Y)|Z = z) = E((ψ(Y) − E(ψ(Y)|Z = z))² | Z = z).

Throughout we will use the following notation:

  E(ψ(Y)|Z) = ψ*(Z),   (5.2)

where ψ* is defined by

  ψ*(z) = E(ψ(Y)|Z = z),   (5.3)

for all z ∈ S_Z (or all z satisfying (3.5) and f_Z(z) > 0).

Theorem 5.5

Under the above notation,

(i) E(E(ψ(Y)|Z)) = E(ψ(Y));

(ii) Var(E(ψ(Y)|Z)) + E(Var(ψ(Y)|Z)) = Var(ψ(Y)).

Proof. We will provide the proof for the absolutely continuous case. The proof for the discrete case follows in a similar fashion.

(i) Note that

  E(E(ψ(Y)|Z)) = E(ψ*(Z)),

where ψ*(∙) is defined by (5.2) and (5.3). Therefore

  E(E(ψ(Y)|Z)) = ∫_{ℝ^{p₂}} ψ*(z) f_Z(z) dz

    = ∫_{ℝ^{p₂}} [∫_{ℝ^{p₁}} ψ(y) f_{Y|Z}(y|z) dy] f_Z(z) dz

    = ∫_{ℝ^{p₂}} ∫_{ℝ^{p₁}} ψ(y) f_{Y,Z}(y, z) dy dz

    = E(ψ(Y)).

(ii) Let ψ*(Z) = E(ψ(Y)|Z). Then, by (i),

  Var(ψ(Y)) = E((ψ(Y) − E(ψ(Y)))²)

            = E(E((ψ(Y) − E(ψ(Y)))² | Z)).   (5.4)

Now

  E((ψ(Y) − E(ψ(Y)))² | Z) = E((ψ(Y) − E(ψ(Y)|Z) + E(ψ(Y)|Z) − E(ψ(Y)))² | Z)

    = E((ψ(Y) − E(ψ(Y)|Z))² | Z) + (E(ψ(Y)|Z) − E(ψ(Y)))²
      + 2(E(ψ(Y)|Z) − E(ψ(Y))) E(ψ(Y) − E(ψ(Y)|Z) | Z)

    = Var(ψ(Y)|Z) + (E(ψ(Y)|Z) − E(E(ψ(Y)|Z)))²,   (5.5)

since E(ψ(Y) − E(ψ(Y)|Z) | Z) = 0 and, by (i), E(ψ(Y)) = E(E(ψ(Y)|Z)). Combining (5.4) and (5.5), we get

  Var(ψ(Y)) = E(Var(ψ(Y)|Z)) + E((E(ψ(Y)|Z) − E(E(ψ(Y)|Z)))²)

            = E(Var(ψ(Y)|Z)) + Var(E(ψ(Y)|Z)). ▄

Remark 5.1

If Y and Z are independent then

  E(ψ(Y)|Z) = E(ψ(Y)) and Var(ψ(Y)|Z) = Var(ψ(Y)). ▄
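Theorem 5.5 (ii), the variance decomposition, can be checked exactly on a discrete example. The sketch below reuses the p.m.f. of Example 2.2 and takes W = X₁X₃ conditioned on Z = X₃ (both choices are ours, for illustration):

```python
from fractions import Fraction
from itertools import product

# p.m.f. of Example 2.2: f(x1,x2,x3) = x1*x2*x3/72 on {1,2}x{1,2,3}x{1,3}.
pmf = {x: Fraction(x[0] * x[1] * x[2], 72)
       for x in product([1, 2], [1, 2, 3], [1, 3])}

def moments_given(z):
    """P(X3 = z), plus conditional mean and variance of W = X1*X3 given X3 = z."""
    slice_ = {x: p for x, p in pmf.items() if x[2] == z}
    tot = sum(slice_.values())
    m = sum(x[0] * x[2] * p for x, p in slice_.items()) / tot
    v = sum((x[0] * x[2] - m) ** 2 * p for x, p in slice_.items()) / tot
    return tot, m, v

# Unconditional mean and variance of W.
EW = sum(x[0] * x[2] * p for x, p in pmf.items())
VarW = sum((x[0] * x[2] - EW) ** 2 * p for x, p in pmf.items())

# E(Var(W|Z)) + Var(E(W|Z)), summing over the support {1, 3} of X3 = Z.
decomposed = sum(t * v for t, m, v in map(moments_given, (1, 3))) \
           + sum(t * (m - EW) ** 2 for t, m, v in map(moments_given, (1, 3)))
```

With exact fractions the two sides agree identically, not just to rounding error.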

Example 5.1
Let X = (X₁, X₂, X₃) be a discrete type random vector with p.m.f.

f_X(x₁, x₂, x₃) = x₁x₂x₃/72, if (x₁, x₂, x₃) ∈ {1, 2} × {1, 2, 3} × {1, 3};
                  0,          otherwise.

(i) Let Z₁ = 2X₁ − X₂ + 3X₃ and Z₂ = X₁ − 2X₂ + X₃. Find the correlation coefficient between Z₁ and Z₂;
(ii) For a fixed x₂ ∈ {1, 2, 3}, find E(Z|X₂ = x₂) and Var(Z|X₂ = x₂), where Z = X₁X₃.

Solution. (i) From Example 4.1 (i) we know that X₁, X₂ and X₃ are independent. Therefore Cov(X₁, X₂) = Cov(X₁, X₃) = Cov(X₂, X₃) = 0. Also Cov(Xᵢ, Xᵢ) = Var(Xᵢ), i = 1, 2, 3. Using Theorem 5.2 (ii) we have

Cov(Z₁, Z₂) = 2Var(X₁) − 5Cov(X₁, X₂) + 2Var(X₂) + 5Cov(X₁, X₃) + 3Var(X₃) − 7Cov(X₂, X₃)

 = 2Var(X₁) + 2Var(X₂) + 3Var(X₃).

From the solution of Example 4.1 (ii) we have

f_{X₁}(x₁) = x₁/3, if x₁ ∈ {1, 2};  0, otherwise,
f_{X₂}(x₂) = x₂/6, if x₂ ∈ {1, 2, 3};  0, otherwise,
f_{X₃}(x₃) = x₃/4, if x₃ ∈ {1, 3};  0, otherwise.

Therefore

E(X₁) = ∑_{x₁∈{1,2}} x₁ f_{X₁}(x₁) = (1² + 2²)/3 = 5/3,
E(X₁²) = (1³ + 2³)/3 = 3,
E(X₂) = (1² + 2² + 3²)/6 = 7/3,
E(X₂²) = (1³ + 2³ + 3³)/6 = 6,
E(X₃) = (1² + 3²)/4 = 5/2,
E(X₃²) = (1³ + 3³)/4 = 7,

Var(X₁) = E(X₁²) − (E(X₁))² = 2/9,
Var(X₂) = E(X₂²) − (E(X₂))² = 5/9,
Var(X₃) = E(X₃²) − (E(X₃))² = 3/4.

Therefore,

Cov(Z₁, Z₂) = 4/9 + 10/9 + 9/4 = 137/36.

Also, by Corollary 5.1,

Var(Z₁) = Var(2X₁ − X₂ + 3X₃) = 4Var(X₁) + Var(X₂) + 9Var(X₃) = 8/9 + 5/9 + 27/4 = 295/36

and

Var(Z₂) = Var(X₁ − 2X₂ + X₃) = Var(X₁) + 4Var(X₂) + Var(X₃) = 2/9 + 20/9 + 3/4 = 115/36.

Therefore

ρ(Z₁, Z₂) = Cov(Z₁, Z₂)/√(Var(Z₁)Var(Z₂)) = 137/(√295 √115) = 0.7438⋯
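Part (i) can be cross-checked by enumerating the p.m.f. of Example 5.1 directly; a short sketch of such a check:

```python
from itertools import product
from math import sqrt

# Enumerate the p.m.f. of Example 5.1: f(x1, x2, x3) = x1*x2*x3/72.
support = list(product([1, 2], [1, 2, 3], [1, 3]))
pmf = {x: x[0] * x[1] * x[2] / 72 for x in support}
assert abs(sum(pmf.values()) - 1) < 1e-12  # sanity: probabilities sum to 1

def E(g):
    # expectation of g(X1, X2, X3) under the p.m.f.
    return sum(g(*x) * p for x, p in pmf.items())

Z1 = lambda x1, x2, x3: 2 * x1 - x2 + 3 * x3
Z2 = lambda x1, x2, x3: x1 - 2 * x2 + x3

cov = E(lambda *x: Z1(*x) * Z2(*x)) - E(Z1) * E(Z2)
var1 = E(lambda *x: Z1(*x) ** 2) - E(Z1) ** 2
var2 = E(lambda *x: Z2(*x) ** 2) - E(Z2) ** 2
rho = cov / sqrt(var1 * var2)

assert abs(cov - 137 / 36) < 1e-9    # Cov(Z1, Z2) = 137/36
assert abs(var1 - 295 / 36) < 1e-9   # Var(Z1) = 295/36
assert abs(var2 - 115 / 36) < 1e-9   # Var(Z2) = 115/36
```

The enumeration reproduces ρ(Z₁, Z₂) = 137/√(295 × 115) ≈ 0.7438.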

(ii) Since X₁, X₂ and X₃ are independent it follows that (X₁, X₃) and X₂ are independent. This in turn implies that Z = X₁X₃ and X₂ are independent. Therefore E(Z|X₂ = x₂) = E(Z) and Var(Z|X₂ = x₂) = Var(Z). Now

E(Z) = E(X₁X₃) = E(X₁)E(X₃)   (using Theorem 5.3)
     = 25/6,

and, using Theorem 5.5,

Var(Z) = Var(X₁X₃)
       = Var(E(X₁X₃|X₃)) + E(Var(X₁X₃|X₃))
       = Var(X₃E(X₁|X₃)) + E(X₃²Var(X₁|X₃))
       = Var(X₃E(X₁)) + E(X₃²)Var(X₁)   (Remark 5.1)
       = Var((5/3)X₃) + (2/9)E(X₃²)
       = (25/9)Var(X₃) + (2/9)E(X₃²)
       = 75/36 + 14/9
       = 131/36.

Example 5.2
Let X = (X₁, X₂, X₃) be an absolutely continuous type random vector with p.d.f.

f_X(x₁, x₂, x₃) = 1/(x₁x₂), if 0 < x₃ < x₂ < x₁ < 1;  0, otherwise.

(i) Let Z₁ = 2X₁ − X₂ + 3X₃ and Z₂ = X₁ − 2X₂ + X₃. Find ρ(Z₁, Z₂);
(ii) For a fixed x₁ ∈ (0, 1), find E(Z|X₁ = x₁) and Var(Z|X₁ = x₁), where Z = X₁X₂X₃.

Solution.

(i) As in Example 5.1 (i),

Cov(Z₁, Z₂) = 2Var(X₁) + 2Var(X₂) + 3Var(X₃) − 5Cov(X₁, X₂) + 5Cov(X₁, X₃) − 7Cov(X₂, X₃).

E(X₁) = ∫₀¹ ∫₀^{x₁} ∫₀^{x₂} (1/x₂) dx₃ dx₂ dx₁ = 1/2,
E(X₁²) = ∫₀¹ ∫₀^{x₁} ∫₀^{x₂} (x₁/x₂) dx₃ dx₂ dx₁ = 1/3,
E(X₂) = ∫₀¹ ∫₀^{x₁} ∫₀^{x₂} (1/x₁) dx₃ dx₂ dx₁ = 1/4,
E(X₂²) = ∫₀¹ ∫₀^{x₁} ∫₀^{x₂} (x₂/x₁) dx₃ dx₂ dx₁ = 1/9,
E(X₃) = ∫₀¹ ∫₀^{x₁} ∫₀^{x₂} x₃/(x₁x₂) dx₃ dx₂ dx₁ = 1/8,
E(X₃²) = ∫₀¹ ∫₀^{x₁} ∫₀^{x₂} x₃²/(x₁x₂) dx₃ dx₂ dx₁ = 1/27,
E(X₁X₂) = ∫₀¹ ∫₀^{x₁} ∫₀^{x₂} 1 dx₃ dx₂ dx₁ = 1/6,
E(X₁X₃) = ∫₀¹ ∫₀^{x₁} ∫₀^{x₂} (x₃/x₂) dx₃ dx₂ dx₁ = 1/12,
E(X₂X₃) = ∫₀¹ ∫₀^{x₁} ∫₀^{x₂} (x₃/x₁) dx₃ dx₂ dx₁ = 1/18,

Var(X₁) = E(X₁²) − (E(X₁))² = 1/12,
Var(X₂) = E(X₂²) − (E(X₂))² = 7/144,
Var(X₃) = E(X₃²) − (E(X₃))² = 37/1728,
Cov(X₁, X₂) = E(X₁X₂) − E(X₁)E(X₂) = 1/24,
Cov(X₁, X₃) = E(X₁X₃) − E(X₁)E(X₃) = 1/48,
Cov(X₂, X₃) = E(X₂X₃) − E(X₂)E(X₃) = 7/288.

Therefore,

Cov(Z₁, Z₂) = 1/6 + 7/72 + 37/576 − 5/24 + 5/48 − 49/288 = 31/576.

Also,

Var(Z₁) = 4Var(X₁) + Var(X₂) + 9Var(X₃) − 4Cov(X₁, X₂) + 12Cov(X₁, X₃) − 6Cov(X₂, X₃)
        = 1/3 + 7/144 + 37/192 − 1/6 + 1/4 − 7/48
        = 295/576,

Var(Z₂) = Var(X₁) + 4Var(X₂) + Var(X₃) − 4Cov(X₁, X₂) + 2Cov(X₁, X₃) − 4Cov(X₂, X₃)
        = 1/12 + 7/36 + 37/1728 − 1/6 + 1/24 − 7/72
        = 133/1728.

Therefore

ρ(Z₁, Z₂) = Cov(Z₁, Z₂)/√(Var(Z₁)Var(Z₂)) = (31/576)/√((295/576)(133/1728)) = 31√3/√(295 × 133) = 0.2711⋯

(ii) Clearly, for a fixed x₁ ∈ (0, 1),

f_{X₂,X₃|X₁}(x₂, x₃|x₁) = c(x₁) f_{X₁,X₂,X₃}(x₁, x₂, x₃) = c(x₁)/x₂, if 0 < x₃ < x₂ < x₁;  0, otherwise.

Since

∫_{−∞}^{∞} ∫_{−∞}^{∞} f_{X₂,X₃|X₁}(x₂, x₃|x₁) dx₂ dx₃ = 1,

we have

c(x₁) ∫₀^{x₁} ∫₀^{x₂} (1/x₂) dx₃ dx₂ = c(x₁) x₁ = 1, i.e., c(x₁) = 1/x₁.

Also

E(Z|X₁ = x₁) = E(X₁X₂X₃|X₁ = x₁) = x₁ E(X₂X₃|X₁ = x₁)
             = x₁ ∫₀^{x₁} ∫₀^{x₂} x₂x₃ · (1/(x₁x₂)) dx₃ dx₂
             = x₁³/6,

E(Z²|X₁ = x₁) = E(X₁²X₂²X₃²|X₁ = x₁) = x₁² E(X₂²X₃²|X₁ = x₁)
              = x₁² ∫₀^{x₁} ∫₀^{x₂} x₂²x₃² · (1/(x₁x₂)) dx₃ dx₂
              = x₁⁶/15.

Therefore

Var(Z|X₁ = x₁) = E(Z²|X₁ = x₁) − (E(Z|X₁ = x₁))² = x₁⁶/15 − x₁⁶/36 = 7x₁⁶/180.
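Under the p.d.f. of Example 5.2 one can verify that X₁ ~ U(0, 1), X₂|X₁ = x₁ ~ U(0, x₁) and X₃|X₂ = x₂ ~ U(0, x₂), so the moments in part (i) can be cross-checked by Monte Carlo. A rough sketch (sample size, seed and tolerances are arbitrary choices of ours):

```python
import random
from math import sqrt

# Monte Carlo check of Example 5.2 (i), sampling via the conditional
# factorization X1 = U1, X2 = X1*U2, X3 = X2*U3 with independent U(0,1)'s,
# which reproduces the joint p.d.f. 1/(x1*x2) on 0 < x3 < x2 < x1 < 1.
random.seed(0)
n = 200_000
z1s, z2s = [], []
for _ in range(n):
    x1 = random.random()
    x2 = x1 * random.random()
    x3 = x2 * random.random()
    z1s.append(2 * x1 - x2 + 3 * x3)
    z2s.append(x1 - 2 * x2 + x3)

def mean(v):
    return sum(v) / len(v)

m1, m2 = mean(z1s), mean(z2s)
cov = mean([a * b for a, b in zip(z1s, z2s)]) - m1 * m2
v1 = mean([a * a for a in z1s]) - m1 * m1
v2 = mean([b * b for b in z2s]) - m2 * m2
rho = cov / sqrt(v1 * v2)

assert abs(cov - 31 / 576) < 0.02     # exact value: 31/576
assert abs(v1 - 295 / 576) < 0.02     # exact value: 295/576
assert abs(v2 - 133 / 1728) < 0.01    # exact value: 133/1728
```

The estimated correlation lands near the exact value 31√3/√(295 × 133) ≈ 0.2711.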

6. Joint Moment Generating Function

Let X = (X₁, …, X_p) be a p-dimensional random vector defined on a probability space (Ω, ℱ, P). Let A = {t = (t₁, t₂, …, t_p) ∈ ℝᵖ : E(e^{∑_{i=1}^{p} tᵢXᵢ}) is finite}. Define the function M_X : A → ℝ by

M_X(t) = E(e^{∑_{i=1}^{p} tᵢXᵢ}), t = (t₁, t₂, …, t_p) ∈ A.                 (6.1)

Definition 6.1
(i) The function M_X : A → ℝ, defined by (6.1), is called the joint moment generating function (m.g.f.) of the random vector X.
(ii) We say that the joint m.g.f. of X exists if it is finite in a rectangle (−a, a) ⊆ ℝᵖ, for some a = (a₁, a₂, …, a_p) ∈ ℝᵖ with aᵢ > 0, i = 1, …, p; here −a = (−a₁, −a₂, …, −a_p) and (−a, a) = {t ∈ ℝᵖ : −aᵢ < tᵢ < aᵢ, i = 1, 2, …, p}.

As in the one-dimensional case, many properties of the probability distribution of X can be studied through the joint m.g.f. of X. Some of the results which may be useful in this direction are provided below without their proofs. Note that M_X(0) = 1. Also, if X₁, …, X_p are independent, then

M_X(t) = E(e^{∑_{i=1}^{p} tᵢXᵢ}) = E(∏_{i=1}^{p} e^{tᵢXᵢ}) = ∏_{i=1}^{p} E(e^{tᵢXᵢ}) = ∏_{i=1}^{p} M_{Xᵢ}(tᵢ), t ∈ ℝᵖ.

Theorem 6.1
Suppose that M_X(t) exists in a rectangle (−a, a) ⊆ ℝᵖ. Then M_X(t) possesses partial derivatives of all orders in (−a, a). Furthermore, for positive integers k₁, …, k_p,

E(X₁^{k₁} X₂^{k₂} ⋯ X_p^{k_p}) = [∂^{k₁+k₂+⋯+k_p} M_X(t) / ∂t₁^{k₁} ⋯ ∂t_p^{k_p}]_{(t₁,t₂,…,t_p)=(0,…,0)}.

Under the assumptions of Theorem 6.1, note that, for Ψ_X(t) = ln M_X(t), t ∈ A,

E(Xᵢ) = [∂M_X(t)/∂tᵢ]_{t=0} = [∂Ψ_X(t)/∂tᵢ]_{t=0}, i = 1, …, p,

E(Xᵢ^m) = [∂^m M_X(t)/∂tᵢ^m]_{t=0}, i = 1, …, p,

Var(Xᵢ) = [∂²M_X(t)/∂tᵢ²]_{t=0} − ([∂M_X(t)/∂tᵢ]_{t=0})² = [∂²Ψ_X(t)/∂tᵢ²]_{t=0}, i = 1, …, p,

and, for i, j ∈ {1, …, p}, i ≠ j,

Cov(Xᵢ, X_j) = E(XᵢX_j) − E(Xᵢ)E(X_j)
             = [∂²M_X(t)/∂tᵢ∂t_j]_{t=0} − [∂M_X(t)/∂tᵢ]_{t=0} [∂M_X(t)/∂t_j]_{t=0}
             = [∂²Ψ_X(t)/∂tᵢ∂t_j]_{t=0}.

Also note that

M_X(0, …, 0, tᵢ, 0, …, 0) = E(e^{tᵢXᵢ}) = M_{Xᵢ}(tᵢ), i = 1, 2, …, p,

M_X(0, …, 0, tᵢ, 0, …, 0, t_j, 0, …, 0) = E(e^{tᵢXᵢ + t_jX_j}) = M_{Xᵢ,X_j}(tᵢ, t_j), i, j ∈ {1, …, p},

provided the involved expectations are finite.
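Theorem 6.1 and the moment formulas above can be illustrated numerically by differentiating a joint m.g.f. at t = 0 with central finite differences. A sketch, assuming (for illustration only) two independent Poisson components, which are not part of the text:

```python
from math import exp

# Assumed joint m.g.f.: X1, X2 independent Poisson(l1), Poisson(l2), so
# M(t1, t2) = exp(l1*(e^t1 - 1) + l2*(e^t2 - 1)).
l1, l2 = 2.0, 3.0

def M(t1, t2):
    return exp(l1 * (exp(t1) - 1) + l2 * (exp(t2) - 1))

h = 1e-5
# first partials at t = 0 recover the means E(X1), E(X2)
dM_dt1 = (M(h, 0) - M(-h, 0)) / (2 * h)
dM_dt2 = (M(0, h) - M(0, -h)) / (2 * h)
# mixed second partial at t = 0 recovers the product moment E(X1*X2)
d2M = (M(h, h) - M(h, -h) - M(-h, h) + M(-h, -h)) / (4 * h * h)

assert abs(dM_dt1 - l1) < 1e-6       # E(X1) = l1
assert abs(dM_dt2 - l2) < 1e-6       # E(X2) = l2
assert abs(d2M - l1 * l2) < 1e-4     # independence: E(X1*X2) = E(X1)E(X2)
```

Since E(X₁X₂) − E(X₁)E(X₂) ≈ 0 here, the example also illustrates the Cov formula via Ψ_X = ln M_X.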

7. Properties of Random Vectors Having the Same Distribution

Definition 7.1
Let X and Y be two p-dimensional random vectors, defined on the same probability space (Ω, ℱ, P). Then X and Y are said to have the same distribution (written as X =ᵈ Y) if F_X(x) = F_Y(x), ∀ x ∈ ℝᵖ (i.e., if they have the same distribution function). ▄

The following results are multivariate analogs of theorems stated in Section 4 of Module 3. The proofs of these theorems, being similar to their univariate counterparts, are omitted.

Theorem 7.1
(i) Let X and Y be p-dimensional random vectors of discrete type with joint p.m.f.s f_X(⋅) and f_Y(⋅), respectively. Then X =ᵈ Y if, and only if, f_X(x) = f_Y(x), ∀ x ∈ ℝᵖ.

(ii) Let X and Y be p-dimensional random vectors having distribution functions F_X(∙) and F_Y(∙), respectively. Suppose that

∂ᵖF_X(x)/∂x₁⋯∂x_p and ∂ᵖF_Y(x)/∂x₁⋯∂x_p

exist everywhere except, possibly, on a set D comprising a countable number of curves. Further suppose that

∫_{ℝᵖ} ∂ᵖF_X(x)/∂x₁⋯∂x_p dx = ∫_{ℝᵖ} ∂ᵖF_Y(x)/∂x₁⋯∂x_p dx = 1.

Then both of them are of absolutely continuous type. Moreover, X =ᵈ Y if, and only if, there exist versions of p.d.f.s f_X(∙) and f_Y(∙) of X and Y, respectively, such that f_X(x) = f_Y(x), ∀ x ∈ ℝᵖ. ▄

Theorem 7.2
Let X and Y be p-dimensional random vectors of either discrete type or of absolutely continuous type with X =ᵈ Y. Then

(i) for any Borel function h : ℝᵖ → ℝ, E(h(X)) = E(h(Y)), provided the expectations are finite;

(ii) for any Borel function ψ : ℝᵖ → ℝ, ψ(X) =ᵈ ψ(Y).

Theorem 7.3 (Uniqueness Theorem)
Let X and Y be two random vectors of either discrete type or of absolutely continuous type having m.g.f.s M_X(⋅) and M_Y(⋅) that are finite on a rectangle (−a, a) for some a = (a₁, a₂, …, a_p) ∈ ℝᵖ; here −a = (−a₁, −a₂, …, −a_p) and (−a, a) = {t ∈ ℝᵖ : −aᵢ < tᵢ < aᵢ, i = 1, …, p}. Suppose that

M_X(t) = M_Y(t), ∀ t ∈ (−a, a).

Then X =ᵈ Y.

Remark 7.1
If X₁, X₂, …, X_p are independent and identically distributed (i.e., Xᵢ =ᵈ X₁, i = 1, …, p), Z = ∑_{i=1}^{p} Xᵢ and X̄ = (1/p) ∑_{i=1}^{p} Xᵢ, then

M_X(t) = ∏_{i=1}^{p} M_{X₁}(tᵢ), t ∈ ℝᵖ,

M_Z(t) = (M_{X₁}(t))ᵖ, t ∈ ℝ,

and

M_{X̄}(t) = (M_{X₁}(t/p))ᵖ, t ∈ ℝ,

provided the expectations are finite.

Example 7.1
Let X₁, X₂, …, X_p be independent random variables such that Xᵢ ~ N(μᵢ, σᵢ²), −∞ < μᵢ < ∞, σᵢ > 0, i = 1, …, p. If a₁, …, a_p are real constants, such that not all of them are zero, then show that

∑_{i=1}^{p} aᵢXᵢ ~ N(∑_{i=1}^{p} aᵢμᵢ, ∑_{i=1}^{p} aᵢ²σᵢ²).

Solution. Let Z = ∑_{i=1}^{p} aᵢXᵢ. Then

M_Z(t) = E(e^{t ∑_{i=1}^{p} aᵢXᵢ})
       = E(∏_{i=1}^{p} e^{taᵢXᵢ})
       = ∏_{i=1}^{p} E(e^{taᵢXᵢ})   (X₁, X₂, ⋯, X_p are independent)
       = ∏_{i=1}^{p} M_{Xᵢ}(aᵢt)
       = ∏_{i=1}^{p} e^{taᵢμᵢ + aᵢ²σᵢ²t²/2}, t ∈ ℝ
       = e^{t ∑_{i=1}^{p} aᵢμᵢ + (∑_{i=1}^{p} aᵢ²σᵢ²)t²/2}, t ∈ ℝ,

which is the m.g.f. of the N(∑_{i=1}^{p} aᵢμᵢ, ∑_{i=1}^{p} aᵢ²σᵢ²) distribution. Using Theorem 7.3 it follows that

Z ~ N(∑_{i=1}^{p} aᵢμᵢ, ∑_{i=1}^{p} aᵢ²σᵢ²).

Example 7.2
Let X₁, X₂, …, X_p be independent random variables such that Xᵢ ~ Bin(nᵢ, p), 0 < p < 1, nᵢ ∈ {1, 2, …}, i = 1, …, p. Show that

∑_{i=1}^{p} Xᵢ ~ Bin(∑_{i=1}^{p} nᵢ, p).

Solution. Let Z = ∑_{i=1}^{p} Xᵢ. Then

M_Z(t) = E(e^{t ∑_{i=1}^{p} Xᵢ})
       = E(∏_{i=1}^{p} e^{tXᵢ})
       = ∏_{i=1}^{p} E(e^{tXᵢ})
       = ∏_{i=1}^{p} M_{Xᵢ}(t)
       = ∏_{i=1}^{p} (1 − p + pe^t)^{nᵢ}, t ∈ ℝ
       = (1 − p + pe^t)^{∑_{i=1}^{p} nᵢ}, t ∈ ℝ,

which is the m.g.f. of the Bin(∑_{i=1}^{p} nᵢ, p) distribution. Using Theorem 7.3 it follows that

Z = ∑_{i=1}^{p} Xᵢ ~ Bin(∑_{i=1}^{p} nᵢ, p).
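The conclusion of Example 7.2 can also be checked directly by convolving the two p.m.f.s, without m.g.f.s. A small sketch with assumed values n₁ = 3, n₂ = 5, p = 0.3 (our choice, not from the text):

```python
from math import comb

# Convolution check: X1 ~ Bin(3, p) and X2 ~ Bin(5, p) independent
# implies X1 + X2 ~ Bin(8, p).
p = 0.3
n1, n2 = 3, 5

def binom_pmf(n, k):
    # P(Bin(n, p) = k)
    return comb(n, k) * p**k * (1 - p) ** (n - k)

for s in range(n1 + n2 + 1):
    # P(X1 + X2 = s) by convolution over the feasible values of X1
    conv = sum(binom_pmf(n1, k) * binom_pmf(n2, s - k)
               for k in range(max(0, s - n2), min(n1, s) + 1))
    assert abs(conv - binom_pmf(n1 + n2, s)) < 1e-12
```

The identity behind the check is Vandermonde's convolution of binomial coefficients.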

Example 7.3
Let X₁, X₂, …, X_p be independent random variables such that Xᵢ ~ NB(rᵢ, p), 0 < p < 1, rᵢ ∈ {1, 2, …}, i = 1, 2, …, p. Then show that

Z = ∑_{i=1}^{p} Xᵢ ~ NB(∑_{i=1}^{p} rᵢ, p).

Solution. Similar to the solution of Example 7.2 on noting that if X ~ NB(r, p) then

M_X(t) = (p/(1 − (1 − p)e^t))^r, t < −ln(1 − p).

Example 7.4
Let X₁, X₂, …, X_p be independent random variables such that Xᵢ ~ P(λᵢ), λᵢ > 0, i = 1, …, p. Then show that

∑_{i=1}^{p} Xᵢ ~ P(∑_{i=1}^{p} λᵢ).

Solution. Similar to the solution of Example 7.2 on noting that if X ~ P(λ), λ > 0, then

M_X(t) = e^{λ(e^t − 1)}, t ∈ ℝ.

Example 7.5
Let X₁, X₂, …, X_p be independent random variables such that Xᵢ ~ Gamma(αᵢ, β), β > 0, αᵢ > 0, i = 1, …, p. Show that

∑_{i=1}^{p} Xᵢ ~ Gamma(∑_{i=1}^{p} αᵢ, β).

Solution. Similar to the solution of Example 7.2 on noting that if X ~ Gamma(α, β), α > 0, β > 0, then

M_X(t) = (1 − βt)^{−α}, t < 1/β.

Example 7.6
Let X₁, X₂, …, X_p be independent random variables such that Xᵢ ~ χ²_{nᵢ}, nᵢ ∈ {1, 2, …}, i = 1, …, p. Then show that

(i) ∑_{i=1}^{p} Xᵢ ~ χ²_{∑_{i=1}^{p} nᵢ};

(ii) if Y₁, Y₂, …, Y_p are independent random variables such that Yᵢ ~ N(μ, σ²), i = 1, …, p, −∞ < μ < ∞, σ > 0, then

∑_{i=1}^{p} ((Yᵢ − μ)/σ)² ~ χ²_p.

Solution.

(i) Note that Xᵢ ~ χ²_{nᵢ} = Gamma(nᵢ/2, 2), i = 1, …, p. Now the assertion follows from Example 7.5.

(ii) Follows on using Theorem 4.1 (ii) of Module 5 and (i) above.

We state the following theorem without providing its proof.

Theorem 7.4
Let X be a p-dimensional random vector and let X = (X₁, …, X_k), where Xᵢ is pᵢ-dimensional, i = 1, …, k, ∑_{i=1}^{k} pᵢ = p. Suppose that there exist aᵢ ∈ ℝ^{pᵢ}, aᵢ > 0, i = 1, …, k, such that M_X(∙) is finite on (−a, a) and M_{Xᵢ}(∙) is finite on (−aᵢ, aᵢ), i = 1, …, k, where a = (a₁, …, a_k) and −a = (−a₁, …, −a_k). Then X₁, …, X_k are independent if, and only if,

M_X(t₁, …, t_k) = ∏_{i=1}^{k} M_{Xᵢ}(tᵢ), ∀ tᵢ ∈ (−aᵢ, aᵢ), i = 1, …, k.

8. Multinomial Distribution

First let us introduce the notion of multinomial coefficients, which generalizes the notion of binomial coefficients.

Let k, n₁, …, n_{k−1} and n be non-negative integers such that k ≥ 2 and ∑_{i=1}^{k−1} nᵢ ≤ n. Consider a collection of n items comprising

n₁ identical items of type 1,
n₂ identical items of type 2,
⋮
n_{k−1} identical items of type k − 1,
n − ∑_{i=1}^{k−1} nᵢ identical items of type k.

The number of visually distinguishable ways in which these n items can be arranged in a row is

C(n, n₁) C(n − n₁, n₂) C(n − n₁ − n₂, n₃) ⋯ C(n − ∑_{i=1}^{k−2} nᵢ, n_{k−1}) = n! / (n₁! n₂! ⋯ n_{k−1}! (n − ∑_{i=1}^{k−1} nᵢ)!),

where C(m, r) = m!/(r!(m − r)!) denotes the binomial coefficient. The coefficients

C(n; n₁, n₂, …, n_{k−1}) = n! / (n₁! n₂! ⋯ n_{k−1}! (n − ∑_{i=1}^{k−1} nᵢ)!), nᵢ ≥ 0, i = 1, …, k − 1, ∑_{i=1}^{k−1} nᵢ ≤ n   (8.1)

are called multinomial coefficients.

Note that, for k = 2 (so that 0 ≤ n₁ ≤ n), the multinomial coefficients (8.1) reduce to the binomial coefficients

C(n, n₁) = n!/(n₁!(n − n₁)!), n₁ ∈ {0, 1, …, n}.

Also note that, for real numbers x₁, …, x_k,

(x₁ + x₂ + ⋯ + x_k)ⁿ = (x₁ + x₂ + ⋯ + x_k)(x₁ + x₂ + ⋯ + x_k) ⋯ (x₁ + x₂ + ⋯ + x_k)   (n factors).

A typical term in the expansion of the above product is an arrangement of n₁ x₁'s, n₂ x₂'s, …, n_{k−1} x_{k−1}'s and (n − ∑_{i=1}^{k−1} nᵢ) x_k's, with nᵢ ∈ {0, 1, …, n} and n₁ + n₂ + ⋯ + n_{k−1} ≤ n. Each such term equals x₁^{n₁} x₂^{n₂} ⋯ x_k^{n_k}, and the total number of visually distinguishable ways of arranging n₁ x₁'s, n₂ x₂'s, …, n_{k−1} x_{k−1}'s and (n − ∑_{i=1}^{k−1} nᵢ) x_k's is C(n; n₁, n₂, …, n_{k−1}). Thus, we have

(x₁ + x₂ + ⋯ + x_k)ⁿ = ∑_{n₁=0}^{n} ⋯ ∑_{n_{k−1}=0}^{n} C(n; n₁, n₂, …, n_{k−1}) x₁^{n₁} x₂^{n₂} ⋯ x_k^{n_k},
                        (n₁ + n₂ + ⋯ + n_{k−1} ≤ n).

Example 8.1 (Multinomial Distribution)
Consider a random experiment that can result in one of p + 1 (p ≥ 1) possible outcomes E₁, E₂, …, E_{p+1}, where Eᵢ ∩ E_j = ∅, i ≠ j, and ⋃_{i=1}^{p+1} Eᵢ = Ω. Let P(Eᵢ) = pᵢ ∈ (0, 1), i = 1, …, p, and ∑_{i=1}^{p} pᵢ < 1, so that P(E_{p+1}) = 1 − ∑_{i=1}^{p} pᵢ ∈ (0, 1). Suppose that the random experiment is repeated n times independently.

Define

Xᵢ = number of times event Eᵢ occurs in n trials, i = 1, …, p + 1.

Then one may be interested in the joint probability distribution of (X₁, X₂, …, X_{p+1}). Note that

X_{p+1} = n − ∑_{i=1}^{p} Xᵢ = number of times E_{p+1} occurs

is completely determined by X = (X₁, X₂, …, X_p), and therefore only the distribution of X = (X₁, …, X_p) may be of interest. Let S_X = {x = (x₁, …, x_p) : xᵢ ∈ {0, 1, …, n}, i = 1, …, p, ∑_{i=1}^{p} xᵢ ≤ n}. Then

f_X(x₁, …, x_p) = P({X₁ = x₁, …, X_p = x_p})

 = n!/(x₁! ⋯ x_p! (n − ∑_{i=1}^{p} xᵢ)!) p₁^{x₁} ⋯ p_p^{x_p} (1 − ∑_{i=1}^{p} pᵢ)^{n − ∑_{i=1}^{p} xᵢ}, if x ∈ S_X;
   0, otherwise.                                                         (8.2)

Definition 8.1
The probability distribution given by (8.2) is called a multinomial distribution with n trials and cell probabilities p₁, …, p_p (denoted by Mult(n, p₁, …, p_p)). ▄

Note that, for p = 1, the Mult(n, p₁) distribution is nothing but the Bin(n, p₁) distribution.

Theorem 8.1
Let X = (X₁, X₂, …, X_p) ~ Mult(n, p₁, …, p_p), where n ∈ {1, 2, …}, pᵢ ∈ (0, 1), i = 1, …, p, and ∑_{i=1}^{p} pᵢ < 1. Then

(i) Xᵢ ~ Bin(n, pᵢ), i = 1, …, p;
(ii) Xᵢ + X_j ~ Bin(n, pᵢ + p_j), i, j = 1, …, p, i ≠ j;
(iii) E(Xᵢ) = npᵢ and Var(Xᵢ) = npᵢ(1 − pᵢ), i = 1, …, p;
(iv) Cov(Xᵢ, X_j) = −npᵢp_j, i, j = 1, …, p, i ≠ j.

Proof.

(i) Fix i ∈ {1, …, p}. In a given trial of the random experiment treat the occurrence of outcome Eᵢ as success and that of any other E_j, j ≠ i (i.e., non-occurrence of Eᵢ), as failure. Then we have a sequence of n independent Bernoulli trials with probability of success in each trial P(Eᵢ) = pᵢ. Therefore

Xᵢ = number of successes in n independent Bernoulli trials ~ Bin(n, pᵢ).

(ii) Fix i, j ∈ {1, …, p}, i ≠ j. In a given trial of the random experiment treat the occurrence of Eᵢ or E_j (i.e., occurrence of Eᵢ ∪ E_j) as success and its non-occurrence as failure. Then we have a sequence of n independent Bernoulli trials with probability of success in each trial P(Eᵢ ∪ E_j) = P(Eᵢ) + P(E_j) = pᵢ + p_j and, therefore,

Xᵢ + X_j = number of successes in n independent Bernoulli trials ~ Bin(n, pᵢ + p_j).

(iii) Follows from (i) on using properties of the binomial distribution.

(iv) Fix i, j ∈ {1, …, p}, i ≠ j. Then

Xᵢ + X_j ~ Bin(n, pᵢ + p_j)

⟹ Var(Xᵢ + X_j) = n(pᵢ + p_j)(1 − pᵢ − p_j)

⟹ Var(Xᵢ) + Var(X_j) + 2Cov(Xᵢ, X_j) = n(pᵢ + p_j)(1 − pᵢ − p_j)

⟹ npᵢ(1 − pᵢ) + np_j(1 − p_j) + 2Cov(Xᵢ, X_j) = n(pᵢ + p_j)(1 − pᵢ − p_j)

⟹ Cov(Xᵢ, X_j) = −npᵢp_j, i ≠ j.
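Theorem 8.1 (iii) and (iv) can be confirmed by brute-force enumeration of the p.m.f. (8.2). A minimal sketch for p = 2 free cells, with assumed values n = 4, p₁ = 0.2, p₂ = 0.3 (our choice, not from the text):

```python
from math import factorial

# Enumerate Mult(n, p1, p2): two free cells plus the implied third cell.
n = 4
p1, p2 = 0.2, 0.3
p3 = 1 - p1 - p2

def pmf(x1, x2):
    # multinomial p.m.f. (8.2) for p = 2
    x3 = n - x1 - x2
    coef = factorial(n) // (factorial(x1) * factorial(x2) * factorial(x3))
    return coef * p1**x1 * p2**x2 * p3**x3

support = [(x1, x2) for x1 in range(n + 1) for x2 in range(n + 1 - x1)]
assert abs(sum(pmf(*x) for x in support) - 1) < 1e-12  # total mass is 1

E1 = sum(x1 * pmf(x1, x2) for x1, x2 in support)
V1 = sum(x1**2 * pmf(x1, x2) for x1, x2 in support) - E1**2
E2 = sum(x2 * pmf(x1, x2) for x1, x2 in support)
C12 = sum(x1 * x2 * pmf(x1, x2) for x1, x2 in support) - E1 * E2

assert abs(E1 - n * p1) < 1e-12               # (iii): E(X1) = n*p1
assert abs(V1 - n * p1 * (1 - p1)) < 1e-12    # (iii): Var(X1) = n*p1*(1-p1)
assert abs(C12 - (-n * p1 * p2)) < 1e-12      # (iv): Cov(X1, X2) = -n*p1*p2
```

The negative covariance reflects the constraint X₁ + X₂ + X₃ = n: a large count in one cell forces smaller counts elsewhere.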

The joint m.g.f. of X = (X₁, X₂, …, X_p) ~ Mult(n, p₁, …, p_p) is given by

M_X(t) = ∑_{x∈S_X} e^{t₁x₁ + ⋯ + t_px_p} n!/(x₁! ⋯ x_p! (n − ∑_{i=1}^{p} xᵢ)!) p₁^{x₁} ⋯ p_p^{x_p} (1 − ∑_{i=1}^{p} pᵢ)^{n − ∑_{i=1}^{p} xᵢ}

 = ∑_{x∈S_X} n!/(x₁! ⋯ x_p! (n − ∑_{i=1}^{p} xᵢ)!) (p₁e^{t₁})^{x₁} ⋯ (p_pe^{t_p})^{x_p} (1 − ∑_{i=1}^{p} pᵢ)^{n − ∑_{i=1}^{p} xᵢ}

 = (p₁e^{t₁} + ⋯ + p_pe^{t_p} + 1 − ∑_{i=1}^{p} pᵢ)ⁿ, t ∈ ℝᵖ,

on using the multinomial theorem. Therefore,

E(Xᵢ) = [∂M_X(t)/∂tᵢ]_{t=0}
      = [npᵢe^{tᵢ} (p₁e^{t₁} + ⋯ + p_pe^{t_p} + 1 − ∑_{j=1}^{p} p_j)^{n−1}]_{t=0}
      = npᵢ, i = 1, …, p,

E(XᵢX_j) = [∂²M_X(t)/∂tᵢ∂t_j]_{t=0}
         = [n(n−1)pᵢp_j e^{tᵢ+t_j} (p₁e^{t₁} + ⋯ + p_pe^{t_p} + 1 − ∑_{l=1}^{p} p_l)^{n−2}]_{t=0}
         = n(n−1)pᵢp_j, i, j ∈ {1, …, p}, i ≠ j,

Cov(Xᵢ, X_j) = E(XᵢX_j) − E(Xᵢ)E(X_j) = −npᵢp_j, i ≠ j,

and

E(Xᵢ²) = [∂²M_X(t)/∂tᵢ²]_{t=0}
       = [n(n−1)pᵢ²e^{2tᵢ}(p₁e^{t₁} + ⋯ + p_pe^{t_p} + 1 − ∑_{l=1}^{p} p_l)^{n−2} + npᵢe^{tᵢ}(p₁e^{t₁} + ⋯ + p_pe^{t_p} + 1 − ∑_{l=1}^{p} p_l)^{n−1}]_{t=0}
       = n(n−1)pᵢ² + npᵢ, i = 1, …, p.

9. Bivariate Normal Distribution

Definition 9.1
A bivariate random vector X = (X₁, X₂) is said to have a bivariate normal distribution N₂(μ₁, μ₂, σ₁², σ₂², ρ) if, for some −∞ < μᵢ < ∞, i = 1, 2, σᵢ > 0, i = 1, 2, and −1 < ρ < 1, the joint p.d.f. of X = (X₁, X₂) is given by

f_{X₁,X₂}(x₁, x₂) = 1/(2πσ₁σ₂√(1−ρ²)) exp{ −[((x₁−μ₁)/σ₁)² − 2ρ((x₁−μ₁)/σ₁)((x₂−μ₂)/σ₂) + ((x₂−μ₂)/σ₂)²] / (2(1−ρ²)) }, x = (x₁, x₂) ∈ ℝ².

Note that f_{X₁,X₂}(x) ≥ 0, ∀ x ∈ ℝ², and, on making the transformation u₁ = (x₁−μ₁)/σ₁ and u₂ = (x₂−μ₂)/σ₂ in the integral below, we have

I = ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_{X₁,X₂}(x₁, x₂) dx₂ dx₁

 = 1/(2π√(1−ρ²)) ∫_{−∞}^{∞} ∫_{−∞}^{∞} e^{−(u₁² − 2ρu₁u₂ + u₂²)/(2(1−ρ²))} du₁ du₂

 = 1/(2π√(1−ρ²)) ∫_{−∞}^{∞} e^{−u₂²/2} { ∫_{−∞}^{∞} e^{−(u₁ − ρu₂)²/(2(1−ρ²))} du₁ } du₂   (the inner integral equals √(1−ρ²)√(2π))

 = 1/√(2π) ∫_{−∞}^{∞} e^{−u₂²/2} du₂

 = 1.

Therefore f_{X₁,X₂}(x₁, x₂) is a p.d.f.

Theorem 9.1
Suppose that X = (X₁, X₂) ~ N₂(μ₁, μ₂, σ₁², σ₂², ρ), −∞ < μᵢ < ∞, σᵢ > 0, i = 1, 2, and −1 < ρ < 1. Then:

(i) X₁ ~ N(μ₁, σ₁²) and X₂ ~ N(μ₂, σ₂²);

(ii) for a fixed x₂ ∈ ℝ, the conditional distribution of X₁ given that X₂ = x₂ is N(μ₁ + ρ(σ₁/σ₂)(x₂ − μ₂), σ₁²(1 − ρ²)) (written as X₁|X₂ = x₂ ~ N(μ₁ + ρ(σ₁/σ₂)(x₂ − μ₂), σ₁²(1 − ρ²)));

(iii) for a fixed x₁ ∈ ℝ, the conditional distribution of X₂ given that X₁ = x₁ is N(μ₂ + ρ(σ₂/σ₁)(x₁ − μ₁), σ₂²(1 − ρ²)) (written as X₂|X₁ = x₁ ~ N(μ₂ + ρ(σ₂/σ₁)(x₁ − μ₁), σ₂²(1 − ρ²)));

(iv) the m.g.f. of X = (X₁, X₂) is

M_{X₁,X₂}(t₁, t₂) = e^{μ₁t₁ + μ₂t₂ + (σ₁²t₁² + σ₂²t₂² + 2ρσ₁σ₂t₁t₂)/2}, t = (t₁, t₂) ∈ ℝ²;

(v) for real constants c₁ and c₂ such that c₁² + c₂² > 0,

c₁X₁ + c₂X₂ ~ N(c₁μ₁ + c₂μ₂, c₁²σ₁² + c₂²σ₂² + 2ρc₁c₂σ₁σ₂);

(vi) ρ(X₁, X₂) = ρ;

(vii) X₁ and X₂ are independent if, and only if, ρ = 0.

Proof.

(i) For x₁ ∈ ℝ,

f_{X₁}(x₁) = ∫_{−∞}^{∞} f_{X₁,X₂}(x₁, x₂) dx₂

 = e^{−(x₁−μ₁)²/(2σ₁²)}/(2πσ₁σ₂√(1−ρ²)) ∫_{−∞}^{∞} exp{ −[(x₂−μ₂)/σ₂ − ρ(x₁−μ₁)/σ₁]² / (2(1−ρ²)) } dx₂

 = e^{−(x₁−μ₁)²/(2σ₁²)}/(2πσ₁σ₂√(1−ρ²)) ∫_{−∞}^{∞} exp{ −[x₂ − (μ₂ + ρ(σ₂/σ₁)(x₁−μ₁))]² / (2σ₂²(1−ρ²)) } dx₂

 = e^{−(x₁−μ₁)²/(2σ₁²)}/(2πσ₁σ₂√(1−ρ²)) × √(2π) σ₂√(1−ρ²)

 = 1/(σ₁√(2π)) e^{−(x₁−μ₁)²/(2σ₁²)}, x₁ ∈ ℝ,

which is the p.d.f. of the N(μ₁, σ₁²) distribution. Thus X₁ ~ N(μ₁, σ₁²). By symmetry, X₂ ~ N(μ₂, σ₂²).

(ii) Fix x₂ ∈ ℝ. Then

f_{X₁|X₂}(x₁|x₂) = c(x₂) f_{X₁,X₂}(x₁, x₂)

 = c(x₂) exp{ −[(x₁−μ₁)/σ₁ − ρ(x₂−μ₂)/σ₂]² / (2(1−ρ²)) }

 = c(x₂) exp{ −[x₁ − (μ₁ + ρ(σ₁/σ₂)(x₂−μ₂))]² / (2σ₁²(1−ρ²)) }, x₁ ∈ ℝ,

where c(x₂) is the normalizing constant, i.e., c(x₂) satisfies

∫_{−∞}^{∞} f_{X₁|X₂}(x₁|x₂) dx₁ = 1.

Clearly, for a fixed x₂ ∈ ℝ, f_{X₁|X₂}(⋅|x₂) is the p.d.f. of the N(μ₁ + ρ(σ₁/σ₂)(x₂−μ₂), σ₁²(1−ρ²)) distribution.

(iii) Follows from (ii) on using symmetry.

(iv) For t = (t₁, t₂) ∈ ℝ², using Theorem 5.5, we have

a€ ,aŒ ( ,  ) =  (X ½€ a€ ޽ŒaŒ )

=  (X ½€ a€ ޽Œ aŒ | )

=  ½ŒaŒ  (X ½€a€ | ).

82
For a fixed / ∈ ℝ , since  | = / ~ :  + (/ −  ), W (1 − "  ) , on using
\,€

Theorem 4.2 (i), Module 5, we get

<Œ €¦ZŒ =Œ


 (X | = / ) = X ,  ∈ ℝ.
Z<€
½€ a€ :+€ Ž (‘Œ 7+Œ )>½€ Ž € €
<Œ Œ

Therefore, for  = ( ,  ) ∈ ℝ ,

<Œ Œ Œ

a€ ,aŒ ( ,  ) =  X


€ €¦Z =€
X
Z<€
½Œ aŒ :+€ Ž <Œ (aŒ 7+Œ )>½€ Ž Œ 

<Œ €¦ZŒ =Œ


= X  |X }.
+€ ½€ Ž € € 7 Z<€ + ½ <½Œ Ž
Z<€
½ =a
Œ <Œ Œ € <Œ € Œ

Since  ~:( , W ), on using Theorem 4.2 (i), Module 5, we get

Z<€ Œ
<Œ €¦ZŒ =Œ <Œ
Œ |=Œ ¼ <Œ =€ }
a€ ,aŒ ( ,  ) = X X
€ Z<€ Z<€
+€ ½€ Ž € 7 + ½ <½Œ Ž ½ =+ Ž
Œ <Œ Œ € <Œ € Œ Œ

<Œ Œ Œ Œ

= X +€½€ Ž+Œ½€ Ž ,  = ( ,  ) ∈ ℝ .


€ =€ Ž <Œ =Œ Ž \, , ½ ½
Œ Œ € Œ € Œ

(v) Let c₁ and c₂ be real constants such that c₁² + c₂² > 0 and let Z = c₁X₁ + c₂X₂. Then, for t ∈ ℝ,

M_Z(t) = E(e^{tZ}) = E(e^{tc₁X₁ + tc₂X₂}) = M_{X₁,X₂}(c₁t, c₂t)

 = e^{(c₁μ₁ + c₂μ₂)t + (c₁²σ₁² + c₂²σ₂² + 2ρc₁c₂σ₁σ₂)t²/2},

which is the m.g.f. of the N(c₁μ₁ + c₂μ₂, c₁²σ₁² + c₂²σ₂² + 2ρc₁c₂σ₁σ₂) distribution. Thus, by Theorem 7.3,

Z ~ N(c₁μ₁ + c₂μ₂, c₁²σ₁² + c₂²σ₂² + 2ρc₁c₂σ₁σ₂).

(vi) By (i), Var(X₁) = σ₁² and Var(X₂) = σ₂². Also, for Ψ_{X₁,X₂}(t₁, t₂) = ln M_{X₁,X₂}(t₁, t₂), t = (t₁, t₂) ∈ ℝ²,

Cov(X₁, X₂) = [∂²Ψ_{X₁,X₂}(t₁, t₂)/∂t₁∂t₂]_{t=0} = ρσ₁σ₂

⇒ ρ(X₁, X₂) = Cov(X₁, X₂)/√(Var(X₁)Var(X₂)) = ρ.

(vii) Since independent random variables are uncorrelated, it follows from (vi) that if X₁ and X₂ are independent then ρ = 0. Conversely, suppose that ρ = 0. Then, for x = (x₁, x₂) ∈ ℝ²,

f_{X₁,X₂}(x₁, x₂) = 1/(2πσ₁σ₂) e^{−[((x₁−μ₁)/σ₁)² + ((x₂−μ₂)/σ₂)²]/2} = f_{X₁}(x₁) f_{X₂}(x₂).

Now the assertion follows on using Theorem 4.2 (i). ▄
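The proof of (iv) suggests a standard way to simulate from N₂(μ₁, μ₂, σ₁², σ₂², ρ) and check (vi) empirically: build X₂ from the same standard normal as X₁ plus independent noise. A sketch with illustrative parameter values of our choosing:

```python
import random
from math import sqrt

# Simulate (X1, X2) ~ N2(m1, m2, s1^2, s2^2, rho) via
# X1 = m1 + s1*U, X2 = m2 + s2*(rho*U + sqrt(1-rho^2)*V),
# U, V independent standard normals.
random.seed(1)
m1, m2, s1, s2, rho = 1.0, -2.0, 2.0, 0.5, 0.6
n = 200_000
xs, ys = [], []
for _ in range(n):
    u, v = random.gauss(0, 1), random.gauss(0, 1)
    xs.append(m1 + s1 * u)
    ys.append(m2 + s2 * (rho * u + sqrt(1 - rho * rho) * v))

mx, my = sum(xs) / n, sum(ys) / n
cov = sum(x * y for x, y in zip(xs, ys)) / n - mx * my
vx = sum(x * x for x in xs) / n - mx * mx
vy = sum(y * y for y in ys) / n - my * my

assert abs(cov / sqrt(vx * vy) - rho) < 0.02   # (vi): rho(X1, X2) = rho
assert abs(cov - rho * s1 * s2) < 0.05         # Cov(X1, X2) = rho*s1*s2
```

The construction mirrors (ii): conditionally on X₁, the term √(1−ρ²)·V supplies exactly the residual variance σ₂²(1−ρ²).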

Theorem 9.2
Let X = (X₁, X₂) be a bivariate random vector with E(Xᵢ) = μᵢ ∈ (−∞, ∞), Var(Xᵢ) = σᵢ², i = 1, 2, and ρ(X₁, X₂) = ρ ∈ (−1, 1). Then X ~ N₂(μ₁, μ₂, σ₁², σ₂², ρ) if, and only if, for any real constants t₁ and t₂ such that t₁² + t₂² > 0, Z = t₁X₁ + t₂X₂ ~ N(t₁μ₁ + t₂μ₂, t₁²σ₁² + t₂²σ₂² + 2ρt₁t₂σ₁σ₂).

Proof. Clearly the necessity part of the assertion follows from Theorem 9.1 (v). Conversely, suppose that for all real constants t₁ and t₂ with t₁² + t₂² > 0,

Z = t₁X₁ + t₂X₂ ~ N(t₁μ₁ + t₂μ₂, t₁²σ₁² + t₂²σ₂² + 2ρt₁t₂σ₁σ₂).             (9.1)

Then, for t = (t₁, t₂) ∈ ℝ² with t ≠ (0, 0),

M_{X₁,X₂}(t₁, t₂) = E(e^{t₁X₁ + t₂X₂}) = E(e^{Z}) = M_Z(1)

 = e^{t₁μ₁ + t₂μ₂ + (t₁²σ₁² + t₂²σ₂² + 2ρt₁t₂σ₁σ₂)/2},   (using (9.1))

and the same expression trivially holds at t = (0, 0). This is the m.g.f. of the N₂(μ₁, μ₂, σ₁², σ₂², ρ) distribution. Now using Theorem 7.3 it follows that X = (X₁, X₂) ~ N₂(μ₁, μ₂, σ₁², σ₂², ρ). ▄

10. Distribution of Functions of Random Vectors

Let X = (X₁, …, X_p) be a random vector of either discrete type or of absolutely continuous type and let f_X(∙) denote the p.m.f./p.d.f. of X. Let g : ℝᵖ → ℝ be a Borel function. As the following example illustrates, in many situations it may be of interest to find the probability distribution of g(X).

Example 10.1
Consider a company that manufactures electric bulbs. The lifetimes of electric bulbs manufactured by the company are random. Past experience with testing of electric bulbs manufactured by the company suggests that the lifetime of a randomly chosen electric bulb can be described by a random variable X having the p.d.f.

f_X(x|θ) = (1/θ)e^{−x/θ}, if x > 0;  0, otherwise,   θ > 0.

However, the value of θ (> 0) is not evident from the past experience and therefore θ is unknown. One way to obtain information about the unknown θ is to do testing, independently and under identical conditions, on a number (say n) of electric bulbs manufactured by the company. Let Xᵢ denote the lifetime of the i-th bulb, i = 1, …, n. We call X₁, …, X_n (which are independent and identically distributed random variables from the distribution f_X(⋅|θ), θ > 0) a random sample from the distribution f_X(⋅|θ), θ > 0. Clearly the joint p.d.f. of X = (X₁, …, X_n) is given by

f_X(x|θ) = ∏_{i=1}^{n} f_{Xᵢ}(xᵢ|θ) = θ^{−n} e^{−∑_{i=1}^{n} xᵢ/θ}, if xᵢ > 0, i = 1, …, n;  0, otherwise.

Since E(X₁) = θ, a natural estimator of θ is the sample mean X̄ = (1/n) ∑_{i=1}^{n} Xᵢ. To study theoretical properties of the estimator X̄ we need the probability distribution of X̄. ▄

Definition 10.1
(i) A function of one or more random variables that does not depend on any unknown parameter is called a statistic.
(ii) Let X₁, …, X_n be a collection of independent random variables each having the same p.m.f./p.d.f. f (or distribution function F). We then call X₁, ⋯, X_n a random sample (of size n) from a distribution having p.m.f./p.d.f. f (or distribution function F). In other words, a random sample is a collection of independent and identically distributed random variables.
Remark 10.1

(i) Let X ~ N₂(μ₁, μ₂, σ₁², σ₂², ρ), −∞ < μᵢ < ∞, σᵢ > 0, i = 1, 2, −1 < ρ < 1. Then the random variable Z₁ = X₁ + X₂ is a statistic, but the random variable Z₂ = (X₁ − μ₁)/σ₁ is not a statistic unless μ₁ and σ₁ are known parameters.

(ii) Although a statistic does not depend upon any unknown parameters, the distribution of a statistic may very well depend upon unknown parameters. For example, in (i) above, Z₁ ~ N(μ₁ + μ₂, σ₁² + σ₂² + 2ρσ₁σ₂).

(iii) If X₁, …, X_n is a random sample from a distribution having p.m.f./p.d.f. f(∙), then the joint p.m.f./p.d.f. of X = (X₁, …, X_n) is

f_X(x₁, …, x_n) = ∏_{i=1}^{n} f_{Xᵢ}(xᵢ) = ∏_{i=1}^{n} f(xᵢ), x = (x₁, …, x_n) ∈ ℝⁿ.

(iv) Let X₁, …, X_n be a random sample from a distribution. Some of the commonly used statistics are:

(a) sample mean: X̄ = (1/n) ∑_{i=1}^{n} Xᵢ;

(b) sample variance: S² = (1/(n−1)) ∑_{i=1}^{n} (Xᵢ − X̄)² = (1/(n−1)) (∑_{i=1}^{n} Xᵢ² − nX̄²), n ≥ 2;

(c) r-th order statistic: X_{r:n} = r-th smallest of (X₁, …, X_n), r = 1, 2, …, n;

(d) sample range: R = X_{n:n} − X_{1:n};

(e) sample median: X̃ = X_{(n+1)/2:n}, if n is odd; (X_{n/2:n} + X_{n/2+1:n})/2, if n is even.

Theorem 10.1
Let X₁, …, X_n be a random sample from a distribution having p.m.f./p.d.f. f(⋅). Then, for any permutation (α₁, …, α_n) of (1, …, n),

(X₁, …, X_n) =ᵈ (X_{α₁}, …, X_{α_n}).

Proof. Let (α₁, …, α_n) be a permutation of (1, …, n) and let (β₁, …, β_n) be the inverse permutation of (α₁, …, α_n). Then, for x = (x₁, …, x_n) ∈ ℝⁿ,

f_{X_{α₁},…,X_{α_n}}(x₁, …, x_n) = f_{X₁,…,X_n}(x_{β₁}, …, x_{β_n}) = ∏_{i=1}^{n} f(x_{βᵢ}) = ∏_{i=1}^{n} f(xᵢ) = f_{X₁,…,X_n}(x₁, …, x_n).

It follows that

f_{X_{α₁},…,X_{α_n}}(x) = f_{X₁,…,X_n}(x), ∀ x ∈ ℝⁿ

⇒ (X_{α₁}, …, X_{α_n}) =ᵈ (X₁, …, X_n). ▄

Example 10.1
Let X₁, …, X_n be a random sample from a given distribution.

(i) If X₁ is of absolutely continuous type then show that

P({X₁ < X₂ < ⋯ < X_n}) = P({X_{α₁} < X_{α₂} < ⋯ < X_{α_n}}) = 1/n!,

for any permutation (α₁, …, α_n) of (1, …, n);

(ii) if X₁ is of absolutely continuous type then show that P({Xᵢ = X_{r:n}}) = 1/n, i = 1, …, n, where, for r ∈ {1, …, n}, X_{r:n} = r-th smallest of {X₁, …, X_n};

(iii) show that

E(Xᵢ/(X₁ + X₂ + ⋯ + X_n)) = 1/n, i = 1, 2, …, n,

provided the expectations exist;

(iv) show that

E(Xᵢ | ∑_{j=1}^{n} X_j = s) = s/n, i = 1, …, n.

Solution. Let S_n denote the set of all permutations of (1, …, n). Using Theorem 10.1 we have

(X₁, …, X_n) =ᵈ (X_{α₁}, …, X_{α_n}), ∀ α = (α₁, …, α_n) ∈ S_n

⇒ E(Ψ(X₁, …, X_n)) = E(Ψ(X_{α₁}, …, X_{α_n})), ∀ α ∈ S_n,                  (10.1)

for any Borel function Ψ for which the expectations exist.

(i) On taking

Ψ(x₁, …, x_n) = 1, if x₁ < x₂ < ⋯ < x_n;  0, otherwise,

we conclude that

P({X₁ < X₂ < ⋯ < X_n}) = P({X_{α₁} < X_{α₂} < ⋯ < X_{α_n}}), ∀ α ∈ S_n.    (10.2)

Since P({Xᵢ = X_j}) = 0 for i ≠ j (as (Xᵢ, X_j) is of absolutely continuous type; see Remark 2.1 (ix)), we have

∑_{α∈S_n} P({X_{α₁} < X_{α₂} < ⋯ < X_{α_n}}) = 1

⇒ P({X₁ < X₂ < ⋯ < X_n}) = P({X_{α₁} < X_{α₂} < ⋯ < X_{α_n}}) = 1/n!   (using (10.2)).

(ii) Fix i ∈ {1, 2, …, n}. On taking

Ψ(x₁, …, x_n) = 1, if xᵢ = r-th smallest of {x₁, …, x_n};  0, otherwise,

in (10.1), and noting that, for any permutation α = (α₁, …, α_n) ∈ S_n, the r-th smallest of {X_{α₁}, …, X_{α_n}} = r-th smallest of {X₁, …, X_n} = X_{r:n}, we have

P({Xᵢ = X_{r:n}}) = P({X_{αᵢ} = X_{r:n}}), ∀ α ∈ S_n

⇒ P({Xᵢ = X_{r:n}}) = P({X₁ = X_{r:n}}), i = 1, …, n.

But

∑_{i=1}^{n} P({Xᵢ = X_{r:n}}) = 1,

and therefore

P({Xᵢ = X_{r:n}}) = P({X₁ = X_{r:n}}) = 1/n.

(iii) On taking

Ψ(x₁, …, x_n) = x₁/(x₁ + ⋯ + x_n), x ∈ ℝⁿ,

in (10.1) we get, for all α = (α₁, …, α_n) ∈ S_n,

E(X₁/(X₁ + ⋯ + X_n)) = E(X_{α₁}/(X_{α₁} + ⋯ + X_{α_n}))
                     = E(X_{α₁}/(X₁ + ⋯ + X_n))   (since ∑_{i=1}^{n} Xᵢ = ∑_{i=1}^{n} X_{αᵢ})

⇒ E(X₁/(X₁ + ⋯ + X_n)) = E(Xᵢ/(X₁ + ⋯ + X_n)), i = 1, …, n.                (10.3)

But

∑_{i=1}^{n} E(Xᵢ/(X₁ + ⋯ + X_n)) = E(∑_{i=1}^{n} Xᵢ/(X₁ + ⋯ + X_n)) = E((X₁ + ⋯ + X_n)/(X₁ + ⋯ + X_n)) = 1.

Therefore from (10.3) we get

E(X₁/(X₁ + ⋯ + X_n)) = E(Xᵢ/(X₁ + ⋯ + X_n)) = 1/n, i = 1, …, n.

(iv) For fixed s, using Theorem 10.1 we have

E(X₁ | ∑_{j=1}^{n} X_j = s) = E(X_{α₁} | ∑_{j=1}^{n} X_{α_j} = s), ∀ α ∈ S_n

⇒ E(X₁ | ∑_{j=1}^{n} X_j = s) = E(Xᵢ | ∑_{j=1}^{n} X_j = s), i = 1, …, n   (since ∑_{j=1}^{n} X_j = ∑_{j=1}^{n} X_{α_j}).   (10.4)

But

∑_{i=1}^{n} E(Xᵢ | ∑_{j=1}^{n} X_j = s) = E(∑_{i=1}^{n} Xᵢ | ∑_{j=1}^{n} X_j = s) = s.

Now using (10.4) we get

E(X₁ | ∑_{j=1}^{n} X_j = s) = E(Xᵢ | ∑_{j=1}^{n} X_j = s) = s/n, i = 1, …, n.
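Parts (iii) and (iv) are pure symmetry (exchangeability) facts that hold for any i.i.d. sample; part (iii) is easy to see by Monte Carlo. A minimal sketch, assuming (purely for illustration) an Exponential(1) sample, a distribution that is not part of the example:

```python
import random

# Monte Carlo illustration of Example 10.1 (iii):
# E(X_1 / (X_1 + ... + X_n)) = 1/n for any i.i.d. sample, here Exp(1).
random.seed(3)
n, trials = 5, 200_000
acc = 0.0
for _ in range(trials):
    xs = [random.expovariate(1.0) for _ in range(n)]
    acc += xs[0] / sum(xs)  # by symmetry, any index gives the same mean
est = acc / trials

assert abs(est - 1 / n) < 0.01  # expected value is 1/n = 0.2
```

For the exponential case the ratio X₁/∑Xⱼ is in fact Beta(1, n−1) distributed, whose mean is 1/n, consistent with the simulation.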

In the following subsections we will discuss various techniques to find the distribution of functions of random variables.

10.1 Distribution Function Technique

Let X = (X₁, …, X_p) be a random vector and let g : ℝᵖ → ℝ be a Borel function. The distribution of Z = g(X₁, …, X_p) can be determined by computing the distribution function

F_Z(z) = P({g(X₁, …, X_p) ≤ z}), −∞ < z < ∞.

Example 10.1.1 (Marginal distributions of order statistics of a random sample of absolutely continuous type random variables)
Let X₁, …, X_n be a random sample of absolutely continuous type random variables, each having distribution function F(∙) and p.d.f. f(∙). Then the joint distribution function of X = (X₁, …, X_n) is

F_X(x₁, …, x_n) = ∏_{i=1}^{n} F(xᵢ), x ∈ ℝⁿ.

We have

F_X(x) = ∏_{i=1}^{n} F(xᵢ)
 = ∏_{i=1}^{n} ∫_{−∞}^{xᵢ} f(tᵢ) dtᵢ
 = ∫_{−∞}^{x₁} ⋯ ∫_{−∞}^{x_n} ∏_{i=1}^{n} f(tᵢ) dt_n ⋯ dt₁
 = ∫_{−∞}^{x₁} ⋯ ∫_{−∞}^{x_n} f_X(t) dt_n ⋯ dt₁.

It follows that X is of absolutely continuous type with joint p.d.f. f_X(⋅). Therefore, for i ≠ j,

P({Xᵢ = X_j}) = 0.

Define

X_{r:n} = r-th smallest of {X₁, …, X_n}, r = 1, …, n,

so that

P({X_{1:n} < X_{2:n} < ⋯ < X_{n:n}}) = 1.

First let us derive the distribution of X_{r:n}, r = 1, ⋯, n. Note that, for x ∈ ℝ,

X_{r:n} ≤ x ⇔ at least r of {X₁, …, X_n} are ≤ x.

Therefore

F_{X_{r:n}}(x) = P({X_{r:n} ≤ x})

 = P({at least r of X₁, …, X_n are ≤ x})

 = ∑_{i=r}^{n} P({i of X₁, …, X_n are ≤ x}), x ∈ ℝ.

Fix x ∈ ℝ, and consider a sequence of n trials where at the i-th trial we observe Xᵢ; consider the trial as having resulted in success if Xᵢ ≤ x and in failure if Xᵢ > x, i = 1, …, n. Since X₁, …, X_n are independent and the probability of success in the i-th trial is P({Xᵢ ≤ x}) = F(x) (the same for all the trials), the above sequence of trials may be considered as a sequence of n independent Bernoulli trials with probability of success in each trial F(x). Therefore

P({i of X₁, …, X_n are ≤ x}) = P({i successes in n trials}) = C(n, i) (F(x))ⁱ (1 − F(x))^{n−i},

and consequently

F_{X_{r:n}}(x) = ∑_{i=r}^{n} C(n, i) (F(x))ⁱ (1 − F(x))^{n−i}, x ∈ ℝ.
Recall that, for s ∈ {1, …, n} and q ∈ (0, 1) (see Theorem 3.1, Module 5),

∑_{j=s}^{n} C(n, j) qʲ (1 − q)^{n−j} = 1/B(s, n−s+1) ∫₀^q t^{s−1}(1−t)^{n−s} dt.

Therefore,

F_{X_{r:n}}(x) = 1/B(r, n−r+1) ∫₀^{F(x)} t^{r−1}(1−t)^{n−r} dt, x ∈ ℝ.

Let

f_{X_{r:n}}(x) = 1/B(r, n−r+1) (F(x))^{r−1}(1−F(x))^{n−r} f(x), x ∈ ℝ,      (10.1.1)

so that

(d/dx) F_{X_{r:n}}(x) = f_{X_{r:n}}(x), ∀ x ∉ D,

and

∫_{−∞}^{∞} f_{X_{r:n}}(x) dx = 1/B(r, n−r+1) ∫₀¹ t^{r−1}(1−t)^{n−r} dt = 1.

It follows that the random variable X_{r:n} is of absolutely continuous type with p.d.f. given by (10.1.1). A simple heuristic argument for expression (10.1.1) is as follows. Interpret f_{X_{r:n}}(x)Δx as the probability that X_{r:n} lies in an infinitesimal interval [x, x + Δx]. Realizing that the probability of more than one of the Xᵢ's falling in the infinitesimal interval [x, x + Δx] may be neglected, f_{X_{r:n}}(x)Δx may be interpreted as the probability that one of the Xᵢ's falls in the infinitesimal interval [x, x + Δx], (r − 1) of the Xᵢ's fall in the interval (−∞, x] and (n − r) of the Xᵢ's fall in the interval (x + Δx, ∞) ≈ (x, ∞). Since X₁, …, X_n are independent and the probabilities of an observation falling in the intervals [x, x + Δx], (−∞, x] and (x, ∞) are f(x)Δx, F(x) and 1 − F(x), respectively, f_{X_{r:n}}(x)Δx is given by the multinomial probability

f_{X_{r:n}}(x)Δx ≈ n!/(1!(r−1)!(n−r)!) (f(x)Δx)(F(x))^{r−1}(1−F(x))^{n−r},

i.e.,

f_{X_{r:n}}(x) = n!/((r−1)!(n−r)!) (F(x))^{r−1}(1−F(x))^{n−r} f(x), −∞ < x < ∞.

Now we will derive the joint distribution of (X_{r:n}, X_{s:n}), where r and s are fixed positive integers satisfying 1 ≤ r < s ≤ n. For −∞ < x < y < ∞,

F_{X_{r:n},X_{s:n}}(x, y) = P({X_{r:n} ≤ x, X_{s:n} ≤ y})

 = P({at least r of {X₁, …, X_n} are ≤ x and at least s of {X₁, …, X_n} are ≤ y})

 = ∑∑ P({i of {X₁, …, X_n} are in (−∞, x] and j of {X₁, …, X_n} are in (x, y]}),

where the double sum is over i, j ≥ 0 with r ≤ i ≤ n and s ≤ i + j ≤ n. Since X₁, …, X_n are independent and the probabilities of an observation falling in the intervals (−∞, x], (x, y] and (y, ∞) are F(x), F(y) − F(x) and 1 − F(y), respectively, using the property of the multinomial distribution we have, for r ≤ i ≤ n, s ≤ i + j ≤ n and −∞ < x < y < ∞,

P({i of {X₁, …, X_n} are in (−∞, x] and j of {X₁, …, X_n} are in (x, y]})

 = n!/(i! j! (n−i−j)!) (F(x))ⁱ (F(y) − F(x))ʲ (1 − F(y))^{n−i−j}.

Splitting the double sum into the parts r ≤ i ≤ s − 1 (where s − i ≤ j ≤ n − i) and s ≤ i ≤ n (where 0 ≤ j ≤ n − i), and evaluating the inner sums with the help of the binomial theorem and Theorem 3.1 of Module 5, we get, for −∞ < x < y < ∞,

F_{X_{r:n},X_{s:n}}(x, y)
 = ∑_{i=r}^{s−1} C(n, i)(F(x))ⁱ(1−F(x))^{n−i} · 1/B(s−i, n−s+1) ∫₀^{(F(y)−F(x))/(1−F(x))} t^{s−i−1}(1−t)^{n−s} dt
   + 1/B(s, n−s+1) ∫₀^{F(y)} t^{s−1}(1−t)^{n−s} dt.

Thus, for −∞ < x < y < ∞, x ∉ D, y ∉ D, differentiating with respect to y,

∂F_{X_{r:n},X_{s:n}}(x, y)/∂y
 = ∑_{i=r}^{s−1} C(n, i)(F(x))ⁱ (n−i)!/((s−i−1)!(n−s)!) (F(y)−F(x))^{s−i−1}(1−F(y))^{n−s} f(y)
   + n!/((s−1)!(n−s)!) (F(y))^{s−1}(1−F(y))^{n−s} f(y)

 = n!/((s−1)!(n−s)!) (F(y))^{s−1}(1−F(y))^{n−s} f(y) ∑_{i=r}^{s−1} C(s−1, i)(F(x)/F(y))ⁱ(1 − F(x)/F(y))^{s−1−i}
   + n!/((s−1)!(n−s)!) (F(y))^{s−1}(1−F(y))^{n−s} f(y) ... (collecting terms)

 = n!/((s−1)!(n−s)!) (F(y))^{s−1}(1−F(y))^{n−s} f(y) · 1/B(r, s−r) ∫₀^{F(x)/F(y)} t^{r−1}(1−t)^{s−r−1} dt,

where the last step again uses Theorem 3.1 of Module 5. Differentiating once more, now with respect to x, we obtain

f_{X_{r:n},X_{s:n}}(x, y) = ∂²F_{X_{r:n},X_{s:n}}(x, y)/∂x∂y

 = n!/((s−1)!(n−s)!) (F(y))^{s−1}(1−F(y))^{n−s} f(y) · (s−1)!/((r−1)!(s−r−1)!) (F(x)/F(y))^{r−1}(1 − F(x)/F(y))^{s−r−1} f(x)/F(y)

 = n!/((r−1)!(s−r−1)!(n−s)!) (F(x))^{r−1}(F(y)−F(x))^{s−r−1}(1−F(y))^{n−s} f(x) f(y), −∞ < x < y < ∞.

Also, for $-\infty < y < x < \infty$ and $1 \le r < s \le n$,

$$\{X_{s:n} \le y\} \subseteq \{X_{r:n} \le x\},$$

and therefore

$$F_{X_{r:n},X_{s:n}}(x,y) = F_{X_{s:n}}(y)$$

$$\Rightarrow \frac{\partial^2}{\partial x\,\partial y} F_{X_{r:n},X_{s:n}}(x,y) = 0.$$

Let

$$f_{r,s}(x,y) = \begin{cases} \dfrac{n!}{(r-1)!\,(s-r-1)!\,(n-s)!}\,[F(x)]^{r-1}\,[F(y)-F(x)]^{s-r-1}\,[1-F(y)]^{n-s}\,f(x)\,f(y), & \text{if } -\infty < x < y < \infty \\ 0, & \text{otherwise} \end{cases} \tag{10.1.2}$$

so that

$$\frac{\partial^2}{\partial x\,\partial y} F_{X_{r:n},X_{s:n}}(x,y) = f_{r,s}(x,y) \quad \forall\,(x,y) \in \mathbb{R}^2 - (D \times D).$$

It is easy to verify that

$$\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{r,s}(x,y)\,dx\,dy = 1.$$

It follows that the random vector $(X_{r:n}, X_{s:n})$ is of absolutely continuous type with joint p.d.f. given by (10.1.2). One can give the following heuristic argument for the expression (10.1.2). For $-\infty < x < y < \infty$, $f_{r,s}(x,y)\,\Delta x\,\Delta y$ is approximately the probability that $(r-1)$ of the $X_i$'s fall in $(-\infty,x]$, one $X_i$ falls in $(x, x+\Delta x]$, $(s-r-1)$ of the $X_i$'s fall in $(x+\Delta x, y]\ (\approx (x,y])$, one $X_i$ falls in $(y, y+\Delta y]$ and $(n-s)$ of the $X_i$'s fall in $(y+\Delta y, \infty)\ (\approx (y,\infty))$. Using the property of the multinomial distribution, we have

$$f_{r,s}(x,y)\,\Delta x\,\Delta y \approx \frac{n!}{(r-1)!\,1!\,(s-r-1)!\,1!\,(n-s)!}\,[F(x)]^{r-1}\,[f(x)\Delta x]\,[F(y)-F(x)]^{s-r-1}\,[f(y)\Delta y]\,[1-F(y)]^{n-s},$$

i.e.,

$$f_{r,s}(x,y) = \frac{n!}{(r-1)!\,(s-r-1)!\,(n-s)!}\,[F(x)]^{r-1}\,[F(y)-F(x)]^{s-r-1}\,[1-F(y)]^{n-s}\,f(x)\,f(y), \qquad -\infty < x < y < \infty.$$
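As a quick numerical sanity check of the multinomial double sum above, the following sketch (standard library only; the values $n=5$, $r=2$, $s=4$, $x=0.3$, $y=0.7$ are illustrative) compares the formula for $P(X_{r:n}\le x, X_{s:n}\le y)$ against a Monte Carlo estimate for a $U(0,1)$ sample, where $F(x)=x$:

```python
import random
from math import factorial

def joint_cdf_order_stats(n, r, s, Fx, Fy):
    """P(X_{r:n} <= x, X_{s:n} <= y) for x < y, via the multinomial double sum:
    i = # observations in (-inf, x], j = # observations in (x, y]."""
    total = 0.0
    for i in range(r, n + 1):
        for j in range(0, n - i + 1):
            if i + j >= s:
                coef = factorial(n) / (factorial(i) * factorial(j) * factorial(n - i - j))
                total += coef * Fx**i * (Fy - Fx)**j * (1 - Fy)**(n - i - j)
    return total

random.seed(0)
n, r, s, x, y = 5, 2, 4, 0.3, 0.7
N = 200_000
hits = 0
for _ in range(N):
    u = sorted(random.random() for _ in range(n))   # order statistics of a U(0,1) sample
    if u[r - 1] <= x and u[s - 1] <= y:
        hits += 1
print(joint_cdf_order_stats(n, r, s, x, y), hits / N)
```

The two printed numbers agree up to Monte Carlo error.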


Example 10.1.2
Let $X_1,\dots,X_n$ be a random sample from a distribution having support $S$, distribution function $F(\cdot)$ and p.m.f. $f(\cdot)$. Define $X_{1:n} = \min\{X_1,\dots,X_n\}$ and $X_{n:n} = \max\{X_1,\dots,X_n\}$. Find the p.m.f.s of $X_{1:n}$ and $X_{n:n}$.

Solution. For $x \in \mathbb{R}$, the distribution function of $X_{1:n}$ is

$$F_{X_{1:n}}(x) = P(\{X_{1:n} \le x\}) = 1 - P(\{X_{1:n} > x\}) = 1 - P(\{X_i > x,\ i=1,\dots,n\}) = 1 - \prod_{i=1}^{n} P(\{X_i > x\}) = 1 - \prod_{i=1}^{n} [1-F(x)] = 1 - [1-F(x)]^n.$$

Note that

$$D_{X_{1:n}} = \{x \in \mathbb{R}: F_{X_{1:n}}(\cdot) \text{ is discontinuous at } x\} = \{x \in \mathbb{R}: F(\cdot) \text{ is discontinuous at } x\} = S.$$

Thus $X_{1:n}$ is a discrete type random variable with support $S$ and p.m.f.

$$f_{X_{1:n}}(x) = \begin{cases} F_{X_{1:n}}(x) - F_{X_{1:n}}(x-), & \text{if } x \in S \\ 0, & \text{otherwise} \end{cases} = \begin{cases} [1-F(x-)]^n - [1-F(x)]^n, & \text{if } x \in S \\ 0, & \text{otherwise.} \end{cases}$$

Also the distribution function of $X_{n:n}$ is given by

$$F_{X_{n:n}}(x) = P(\{X_{n:n} \le x\}) = P(\{X_i \le x,\ i=1,\dots,n\}) = \prod_{i=1}^{n} P(\{X_i \le x\}) = \prod_{i=1}^{n} F(x) = [F(x)]^n, \qquad x \in \mathbb{R}.$$

Since $F_{X_{n:n}}(\cdot)$ is continuous at $x$ if, and only if, $F(\cdot)$ is continuous at $x$, the random variable $X_{n:n}$ is of discrete type with support $S$ and p.m.f.

$$f_{X_{n:n}}(x) = \begin{cases} F_{X_{n:n}}(x) - F_{X_{n:n}}(x-), & \text{if } x \in S \\ 0, & \text{otherwise} \end{cases} = \begin{cases} [F(x)]^n - [F(x-)]^n, & \text{if } x \in S \\ 0, & \text{otherwise.} \end{cases}$$

Example 10.1.3
Let $X_1, X_2$ be a random sample from the $U(0,1)$ distribution. Find the distribution function of $Z = X_1 + X_2$. Hence find the p.d.f. of $Z$.

Solution. The joint p.d.f. of $(X_1, X_2)$ is given by

$$f_{X_1,X_2}(x_1,x_2) = f_{X_1}(x_1)\,f_{X_2}(x_2) = \begin{cases} 1, & \text{if } 0<x_1<1,\ 0<x_2<1 \\ 0, & \text{otherwise.} \end{cases}$$

Therefore the distribution function of $Z$ is given by

$$F_Z(x) = P(\{X_1+X_2 \le x\}) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{X_1,X_2}(x_1,x_2)\,I_{(-\infty,x]}(x_1+x_2)\,dx_1\,dx_2 = \iint\limits_{\substack{0<x_1<1,\,0<x_2<1 \\ x_1+x_2 \le x}} dx_1\,dx_2$$

$$= \begin{cases} 0, & \text{if } x < 0 \\ \dfrac{x^2}{2}, & \text{if } 0 \le x < 1 \\ 1 - \dfrac{(2-x)^2}{2}, & \text{if } 1 \le x < 2 \\ 1, & \text{if } x \ge 2. \end{cases}$$

Clearly $F_Z(\cdot)$ is differentiable everywhere except on a finite set $D \subseteq \{0, 1, 2\}$. Let

$$g(x) = \begin{cases} x, & \text{if } 0<x<1 \\ 2-x, & \text{if } 1<x<2 \\ 0, & \text{otherwise,} \end{cases}$$

so that

$$\frac{d}{dx}F_Z(x) = g(x)\ \ \forall\,x \in \mathbb{R} - D \qquad \text{and} \qquad \int_{-\infty}^{\infty} g(x)\,dx = 1.$$

It follows that $Z$ is of absolutely continuous type with a p.d.f.

$$f_Z(x) = \begin{cases} x, & \text{if } 0<x<1 \\ 2-x, & \text{if } 1<x<2 \\ 0, & \text{otherwise.} \end{cases}$$
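A short numerical check of the triangular distribution just derived (an illustrative sketch; the evaluation point $1.25$ is arbitrary):

```python
import random

def pdf_Z(x):
    """Triangular density of Z = X1 + X2, X_i iid U(0,1), from Example 10.1.3."""
    if 0 < x < 1:
        return x
    if 1 < x < 2:
        return 2 - x
    return 0.0

def cdf_Z(x):
    if x < 0:
        return 0.0
    if x < 1:
        return x * x / 2
    if x < 2:
        return 1 - (2 - x) ** 2 / 2
    return 1.0

random.seed(1)
N = 100_000
freq = sum(random.random() + random.random() <= 1.25 for _ in range(N)) / N
print(cdf_Z(1.25), freq)   # both close to 0.71875
```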

Example 10.1.4
Let $X_1, X_2$ be a random sample from a distribution having p.d.f.

$$f(x) = \begin{cases} 2x, & \text{if } 0<x<1 \\ 0, & \text{otherwise.} \end{cases}$$

Find the distribution function of $Z = X_1 + X_2$. Hence find the p.d.f. of $Z$.

Solution. The joint p.d.f. of $(X_1, X_2)$ is given by

$$f_{X_1,X_2}(x_1,x_2) = f_{X_1}(x_1)\,f_{X_2}(x_2) = \begin{cases} 4x_1x_2, & \text{if } 0<x_1<1,\ 0<x_2<1 \\ 0, & \text{otherwise.} \end{cases}$$

The distribution function of $Z$ is given by

$$F_Z(x) = P(\{X_1+X_2 \le x\}) = \iint\limits_{\substack{0<x_1<1,\,0<x_2<1 \\ x_1+x_2\le x}} 4x_1x_2\,dx_1\,dx_2.$$

Clearly, for $x \le 0$, $F_Z(x) = 0$ and, for $x \ge 2$, $F_Z(x) = 1$.

For $0 \le x < 1$,

$$F_Z(x) = \int_0^{x}\int_0^{x-x_1} 4x_1x_2\,dx_2\,dx_1 = \frac{x^4}{6}.$$

For $1 \le x < 2$,

$$F_Z(x) = \int_0^{x-1}\int_0^{1} 4x_1x_2\,dx_2\,dx_1 + \int_{x-1}^{1}\int_0^{x-x_1} 4x_1x_2\,dx_2\,dx_1 = (x-1)^2 + \frac{4x-3-(x-1)^3(x+3)}{6}.$$

Therefore,

$$F_Z(x) = \begin{cases} 0, & \text{if } x < 0 \\ \dfrac{x^4}{6}, & \text{if } 0 \le x < 1 \\ (x-1)^2 + \dfrac{4x-3-(x-1)^3(x+3)}{6}, & \text{if } 1 \le x < 2 \\ 1, & \text{if } x \ge 2. \end{cases}$$

Clearly $F_Z(\cdot)$ is differentiable everywhere except on a finite set $D \subseteq \{0, 1, 2\}$. Let

$$g(x) = \begin{cases} \dfrac{2}{3}x^3, & \text{if } 0<x<1 \\ 2(x-1) - 2(x-1)^2 + \dfrac{2}{3}\left[1-(x-1)^3\right], & \text{if } 1<x<2 \\ 0, & \text{otherwise,} \end{cases}$$

so that

$$\frac{d}{dx}F_Z(x) = g(x)\ \ \forall\,x \in \mathbb{R} - D \qquad \text{and} \qquad \int_{-\infty}^{\infty} g(x)\,dx = 1.$$

It follows that $Z$ is of absolutely continuous type with a p.d.f.

$$f_Z(x) = \begin{cases} \dfrac{2}{3}x^3, & \text{if } 0<x<1 \\ 2(x-1) - 2(x-1)^2 + \dfrac{2}{3}\left[1-(x-1)^3\right], & \text{if } 1<x<2 \\ 0, & \text{otherwise.} \end{cases}$$
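The piecewise distribution function above can be checked by simulation; if $U \sim U(0,1)$ then $\sqrt{U}$ has density $2x$ on $(0,1)$ (inverse-c.d.f. sampling). A sketch with the arbitrary check point $z = 1.5$:

```python
import random

def cdf_Z(x):
    """Distribution function of Z = X1 + X2 for iid X_i with density 2x on (0,1),
    as derived in Example 10.1.4."""
    if x <= 0:
        return 0.0
    if x < 1:
        return x**4 / 6
    if x < 2:
        u = x - 1
        return u * u + (4 * x - 3 - u**3 * (x + 3)) / 6
    return 1.0

random.seed(2)
N = 100_000
z = 1.5
freq = sum(random.random()**0.5 + random.random()**0.5 <= z for _ in range(N)) / N
print(cdf_Z(z), freq)
```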
Example 10.1.5
Let $X_1, X_2, X_3$ be a random sample from the $N(0,1)$ distribution. Find the distribution function of $Z = X_1^2 + X_2^2 + X_3^2$. Hence find the p.d.f. of $Z$.

Solution. The joint p.d.f. of $X = (X_1, X_2, X_3)$ is

$$f_X(x_1,x_2,x_3) = \prod_{i=1}^{3} f_{X_i}(x_i) = \prod_{i=1}^{3} \frac{1}{\sqrt{2\pi}}e^{-x_i^2/2} = \frac{1}{(2\pi)^{3/2}}\,e^{-\frac{1}{2}(x_1^2+x_2^2+x_3^2)}, \qquad -\infty < x_i < \infty,\ i=1,2,3.$$

Therefore the distribution function of $Z = X_1^2+X_2^2+X_3^2$ is

$$F_Z(y) = \iiint\limits_{x_1^2+x_2^2+x_3^2 \le y} \frac{1}{(2\pi)^{3/2}}\,e^{-\frac{1}{2}(x_1^2+x_2^2+x_3^2)}\,dx_1\,dx_2\,dx_3.$$

On making the spherical coordinates transformation

$$x_1 = r\sin\theta_1\sin\theta_2, \qquad x_2 = r\sin\theta_1\cos\theta_2, \qquad x_3 = r\cos\theta_1,$$

so that $r > 0$, $0 < \theta_1 \le \pi$, $0 < \theta_2 \le 2\pi$ and the Jacobian of the transformation is $J = r^2\sin\theta_1$, we get, for $y > 0$,

$$F_Z(y) = \frac{1}{(2\pi)^{3/2}} \int_0^{\sqrt{y}}\int_0^{\pi}\int_0^{2\pi} e^{-r^2/2}\,r^2\sin\theta_1\,d\theta_2\,d\theta_1\,dr = \sqrt{\frac{2}{\pi}} \int_0^{\sqrt{y}} e^{-r^2/2}\,r^2\,dr = \frac{1}{2^{3/2}\,\Gamma(3/2)}\int_0^{y} e^{-t/2}\,t^{\frac{3}{2}-1}\,dt$$

(on substituting $t = r^2$). Therefore

$$F_Z(y) = \begin{cases} 0, & \text{if } y \le 0 \\ \dfrac{1}{2^{3/2}\,\Gamma(3/2)} \displaystyle\int_0^{y} e^{-t/2}\,t^{\frac{3}{2}-1}\,dt, & \text{if } y > 0. \end{cases}$$

Clearly $F_Z(\cdot)$ is the distribution function of the $\chi_3^2$ distribution having the p.d.f.

$$f_Z(y) = \begin{cases} \dfrac{e^{-y/2}\,y^{\frac{3}{2}-1}}{2^{3/2}\,\Gamma(3/2)}, & \text{if } y > 0 \\ 0, & \text{otherwise.} \end{cases}$$

Thus $Z \sim \chi_3^2$ (also see Example 7.6 (ii)). ▄
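A Monte Carlo sketch confirming that the sum of three squared $N(0,1)$ variables follows the $\chi_3^2$ distribution function above (the grid size and sample size below are arbitrary choices):

```python
import math
import random

def chi2_3_cdf(y):
    """cdf of chi-square with 3 d.f.: integrate e^{-t/2} t^{1/2} / (2^{3/2} Gamma(3/2))
    numerically with a simple midpoint rule."""
    if y <= 0:
        return 0.0
    c = 2**1.5 * math.gamma(1.5)
    n = 10_000
    h = y / n
    return sum(math.exp(-(k + 0.5) * h / 2) * math.sqrt((k + 0.5) * h)
               for k in range(n)) * h / c

random.seed(3)
N = 100_000
y = 2.0
freq = sum(random.gauss(0, 1)**2 + random.gauss(0, 1)**2 + random.gauss(0, 1)**2 <= y
           for _ in range(N)) / N
print(chi2_3_cdf(y), freq)
```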

In many situations finding the distribution function

$$F_Z(y) = P(\{g(X_1,\dots,X_p) \le y\}), \qquad -\infty < y < \infty,$$

of a random variable $Z = g(X_1,\dots,X_p)$ may be difficult or quite tedious. For example, consider a random sample $X_1,\dots,X_n$ ($n \ge 4$) from the $N(0,1)$ distribution and suppose that the distribution function of $Z = \sum_{i=1}^{n} X_i^2$ is desired. Clearly, for $y > 0$,

$$F_Z(y) = \int\cdots\int\limits_{x_1^2+\cdots+x_n^2 \le y} \frac{1}{(2\pi)^{n/2}}\,e^{-\frac{1}{2}\sum_{i=1}^{n}x_i^2}\,dx_1\,dx_2\cdots dx_n.$$

On making the spherical coordinates transformation

$$x_1 = r\sin\theta_1\sin\theta_2\cdots\sin\theta_{n-2}\sin\theta_{n-1}$$
$$x_2 = r\sin\theta_1\sin\theta_2\cdots\sin\theta_{n-2}\cos\theta_{n-1}$$
$$x_3 = r\sin\theta_1\sin\theta_2\cdots\sin\theta_{n-3}\cos\theta_{n-2}$$
$$\vdots$$
$$x_{n-1} = r\sin\theta_1\cos\theta_2$$
$$x_n = r\cos\theta_1,$$

so that $r > 0$, $\sum_{i=1}^{n}x_i^2 = r^2$, $0 < \theta_i \le \pi$, $i = 1,\dots,n-2$, $0 < \theta_{n-1} \le 2\pi$, and the Jacobian of the transformation is $J = r^{n-1}\sin^{n-2}\theta_1\sin^{n-3}\theta_2\cdots\sin\theta_{n-2}$, we get, for $y > 0$,

$$F_Z(y) = \frac{1}{(2\pi)^{n/2}}\int_0^{\sqrt{y}}\int_0^{\pi}\cdots\int_0^{\pi}\int_0^{2\pi} e^{-r^2/2}\,r^{n-1}\sin^{n-2}\theta_1\sin^{n-3}\theta_2\cdots\sin\theta_{n-2}\,d\theta_{n-1}\,d\theta_{n-2}\cdots d\theta_1\,dr.$$

Clearly evaluating the above integral may be tedious. This points towards the desirability, where possible, of other methods of determining the distributions of functions of random variables. We will see that other techniques are available and, in a given situation, often one technique is more elegant than the others.

10.2. Transformation of Variables Technique

The following theorem, whose proof is similar to that of Theorem 2.1, Module 3, deals with the joint probability distribution of functions of a discrete type random vector.

Theorem 10.2.1

Let $X = (X_1,\dots,X_p)$ be a discrete type random vector with support $S_X$ and p.m.f. $f_X(\cdot)$. Let $g_i: \mathbb{R}^p \to \mathbb{R}$, $i = 1,\dots,k$, be $k$ Borel functions and let $Z_i = g_i(X)$, $i = 1,\dots,k$. Define, for $y = (y_1,\dots,y_k) \in \mathbb{R}^k$,

$$C_y = \{x = (x_1,\dots,x_p) \in S_X : g_1(x) \le y_1, \dots, g_k(x) \le y_k\}$$

and

$$B_y = \{x \in S_X : g_1(x) = y_1, \dots, g_k(x) = y_k\}.$$

Then the random vector $Z = (Z_1,\dots,Z_k)$ is of discrete type with distribution function

$$F_Z(y) = \sum_{x \in C_y} f_X(x), \qquad y \in \mathbb{R}^k,$$

and the p.m.f.

$$f_Z(y) = \sum_{x \in B_y} f_X(x), \qquad y \in \mathbb{R}^k.$$

We will denote the Cartesian product of sets $A_1,\dots,A_p$ by $\prod_{i=1}^{p}A_i = \{(x_1,\dots,x_p): x_i \in A_i,\ i = 1,\dots,p\}$.

Example 10.2.1
Let $X_1,\dots,X_p$ be independent random variables with $X_i \sim \text{Bin}(n_i, \theta)$, where $n_i \in \mathbb{N}$, $i = 1,\dots,p$, and $\theta \in (0,1)$. Without using the m.g.f. of $Z = \sum_{i=1}^{p}X_i$, find the p.m.f. of $Z$.

Solution. For a derivation of the probability distribution of $Z$ using the uniqueness of the m.g.f., see Example 7.2. The joint p.m.f. of $X = (X_1,\dots,X_p)$ is given by

$$f_X(x) = \prod_{i=1}^{p} f_{X_i}(x_i) = \begin{cases} \displaystyle\prod_{i=1}^{p}\binom{n_i}{x_i}\theta^{x_i}(1-\theta)^{n_i-x_i}, & \text{if } x \in \displaystyle\prod_{i=1}^{p}\{0,1,\dots,n_i\} \\ 0, & \text{otherwise} \end{cases}$$

$$= \begin{cases} \left[\displaystyle\prod_{i=1}^{p}\binom{n_i}{x_i}\right]\theta^{\sum_{i=1}^{p}x_i}(1-\theta)^{n-\sum_{i=1}^{p}x_i}, & \text{if } x \in \displaystyle\prod_{i=1}^{p}\{0,1,\dots,n_i\} \\ 0, & \text{otherwise,} \end{cases}$$

where $n = \sum_{i=1}^{p}n_i$. By Theorem 10.2.1, we have

$$f_Z(y) = \sum_{x \in B_y} f_X(x), \qquad y \in \mathbb{R},$$

where, for $y \in \mathbb{R}$, $B_y = \{x \in S_X : x_1+\cdots+x_p = y\}$. Clearly, for $y \notin \{0,1,\dots,n\}$, $B_y = \emptyset$ and therefore $f_Z(y) = 0$. Also, for $y \in \{0,1,\dots,n\}$,

$$f_Z(y) = \sum_{\substack{x_1=0 \\ }}^{n_1}\cdots\sum_{\substack{x_p=0 \\ x_1+\cdots+x_p=y}}^{n_p} \left[\prod_{i=1}^{p}\binom{n_i}{x_i}\right]\theta^{y}(1-\theta)^{n-y} = \left[\sum_{\substack{x_1+\cdots+x_p=y}}\prod_{i=1}^{p}\binom{n_i}{x_i}\right]\theta^{y}(1-\theta)^{n-y} = \binom{n}{y}\theta^{y}(1-\theta)^{n-y},$$

using the Vandermonde identity $\sum_{x_1+\cdots+x_p=y}\prod_{i=1}^{p}\binom{n_i}{x_i} = \binom{n}{y}$. Therefore

$$f_Z(y) = \begin{cases} \binom{n}{y}\theta^{y}(1-\theta)^{n-y}, & \text{if } y \in \{0,1,\dots,n\} \\ 0, & \text{otherwise,} \end{cases}$$

i.e., $Z \sim \text{Bin}(n, \theta)$.
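The convolution argument can be replayed numerically: convolving the $\text{Bin}(n_i,\theta)$ p.m.f.s must reproduce $\text{Bin}\left(\sum n_i, \theta\right)$. A sketch with the illustrative sizes $n_1 = 3$, $n_2 = 4$, $n_3 = 2$ and $\theta = 0.3$:

```python
from math import comb

def convolve_pmfs(p1, p2):
    """Discrete convolution of two p.m.f.s given as dicts {value: prob}."""
    out = {}
    for x1, q1 in p1.items():
        for x2, q2 in p2.items():
            out[x1 + x2] = out.get(x1 + x2, 0.0) + q1 * q2
    return out

def binom_pmf(n, theta):
    return {x: comb(n, x) * theta**x * (1 - theta)**(n - x) for x in range(n + 1)}

theta = 0.3
pz = binom_pmf(3, theta)
for n_i in (4, 2):                       # add the Bin(4, 0.3) and Bin(2, 0.3) components
    pz = convolve_pmfs(pz, binom_pmf(n_i, theta))
target = binom_pmf(9, theta)             # n = 3 + 4 + 2
err = max(abs(pz[x] - target[x]) for x in target)
print(err)   # tiny floating-point error only
```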

Example 10.2.2
Let $X_1,\dots,X_p$ be independent random variables such that $X_i \sim \text{P}(\lambda_i)$, $i = 1,\dots,p$, where $\lambda_i > 0$, $i = 1,\dots,p$. Without using the m.g.f. of $Z = \sum_{i=1}^{p}X_i$, find the probability distribution of $Z$.

Solution. For a derivation of the probability distribution of $Z$ using the uniqueness of the m.g.f., see Example 7.4. We have $S_{X_i} = \{0,1,\dots\}$, $i = 1,\dots,p$. The joint p.m.f. of $X = (X_1,\dots,X_p)$ is

$$f_X(x) = \prod_{i=1}^{p} f_{X_i}(x_i) = \begin{cases} e^{-\sum_{i=1}^{p}\lambda_i}\displaystyle\prod_{i=1}^{p}\frac{\lambda_i^{x_i}}{x_i!}, & \text{if } x = (x_1,\dots,x_p) \in \{0,1,\dots\}^p \\ 0, & \text{otherwise.} \end{cases}$$

Using Theorem 10.2.1, we have

$$f_Z(y) = \sum_{x \in B_y} f_X(x), \qquad y \in \mathbb{R},$$

where, for $y \in \mathbb{R}$, $B_y = \{x \in S_X : x_1+\cdots+x_p = y\}$. For $y \notin \{0,1,\dots\}$, $B_y = \emptyset$ and therefore $f_Z(y) = 0$. For $y \in \{0,1,\dots\}$,

$$f_Z(y) = \sum_{\substack{x_1 \ge 0,\dots,x_p \ge 0 \\ x_1+\cdots+x_p = y}} e^{-\sum_{i=1}^{p}\lambda_i}\prod_{i=1}^{p}\frac{\lambda_i^{x_i}}{x_i!} = \frac{e^{-\sum_{i=1}^{p}\lambda_i}}{y!}\sum_{\substack{x_1+\cdots+x_p = y}}\frac{(x_1+\cdots+x_p)!}{x_1!\cdots x_p!}\,\lambda_1^{x_1}\cdots\lambda_p^{x_p} = \frac{e^{-\sum_{i=1}^{p}\lambda_i}\,(\lambda_1+\cdots+\lambda_p)^y}{y!},$$

by the multinomial theorem. Therefore,

$$f_Z(y) = \begin{cases} \dfrac{e^{-\sum_{i=1}^{p}\lambda_i}\left(\sum_{i=1}^{p}\lambda_i\right)^y}{y!}, & \text{if } y \in \{0,1,2,\dots\} \\ 0, & \text{otherwise,} \end{cases}$$

i.e., $Z \sim \text{P}\left(\sum_{i=1}^{p}\lambda_i\right)$.
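The same numerical check works in the Poisson case, truncating the infinite supports at a point where the tail mass is negligible ($K = 60$ below is an arbitrary but safe choice for $\sum\lambda_i = 4.5$):

```python
import math

def poisson_pmf(lam, K):
    """Poisson(lam) p.m.f. on {0,...,K}, via the stable recursion
    p(x) = p(x-1) * lam / x; K is chosen so that the tail beyond K is negligible."""
    p = [math.exp(-lam)]
    for x in range(1, K + 1):
        p.append(p[-1] * lam / x)
    return p

def convolve(a, b):
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

K = 60
lams = [1.5, 0.7, 2.3]
pz = poisson_pmf(lams[0], K)
for lam in lams[1:]:
    pz = convolve(pz, poisson_pmf(lam, K))
target = poisson_pmf(sum(lams), 3 * K)   # P(4.5), same length as pz
err = max(abs(p - t) for p, t in zip(pz, target))
print(err)
```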

Example 10.2.3
Let $X_1,\dots,X_p$ be independent random variables such that $X_i \sim \text{NB}(r_i, \theta)$, $\theta \in (0,1)$, $r_i \in \{1,2,\dots\}$, $i = 1,\dots,p$. Without using the m.g.f. of $Z = \sum_{i=1}^{p}X_i$, show that $Z \sim \text{NB}\left(\sum_{i=1}^{p}r_i, \theta\right)$.

Solution. For a solution utilizing the uniqueness of the m.g.f., refer to Example 7.3. One can also provide a solution based on the methods used in solving Examples 10.2.1 and 10.2.2 by using the identity

$$\sum_{\substack{k_1 \ge 0,\dots,k_p \ge 0 \\ k_1+\cdots+k_p = y}} \binom{k_1+r_1-1}{k_1}\cdots\binom{k_p+r_p-1}{k_p} = \binom{y+\sum_{i=1}^{p}r_i-1}{y}, \qquad y \in \{0,1,2,\dots\}.$$
Example 10.2.4
Let $X_1$ and $X_2$ be independent and identically distributed random variables with $X_1 \sim \text{NB}(1, p)$, where $p \in (0,1)$. Find the distribution function of $Z = X_1 + X_2$. Hence find the p.m.f. of $Z$ (also see Examples 7.3 and 10.2.3).

Solution. Since $X_1$ and $X_2$ have the common support $S = \{0,1,2,\dots\}$, we have $F_Z(y) = 0$ if $y < 0$. Moreover, for $y \in [k, k+1)$, $k \in \{0,1,2,\dots\}$,

$$F_Z(y) = P(\{X_1+X_2 \le y\}) = P(\{X_1+X_2 \le k\}) = \sum_{j=0}^{\infty} P(\{X_1+X_2 \le k,\ X_2 = j\}) = \sum_{j=0}^{\infty} P(\{X_1 \le k-j,\ X_2 = j\}) = \sum_{j=0}^{\infty} P(\{X_1 \le k-j\})\,P(\{X_2 = j\}).$$

We have $P(\{X_1 \le m\}) = 0$ if $m \in \{-1,-2,\dots\}$ and, for $m \in \{0,1,2,\dots\}$,

$$P(\{X_1 \le m\}) = \sum_{j=0}^{m} P(\{X_1 = j\}) = \sum_{j=0}^{m}(1-p)^j p = 1 - (1-p)^{m+1}.$$

It follows that, for $y \in [k, k+1)$, $k \in \{0,1,2,\dots\}$,

$$F_Z(y) = \sum_{j=0}^{k} P(\{X_1 \le k-j\})\,P(\{X_2 = j\}) = \sum_{j=0}^{k}\left[1-(1-p)^{k-j+1}\right]p(1-p)^j = 1 - (1-p)^{k+1} - (k+1)\,p\,(1-p)^{k+1}.$$

Consequently

$$F_Z(y) = \begin{cases} 0, & \text{if } y < 0 \\ 1 - (1-p)^{k+1} - (k+1)\,p\,(1-p)^{k+1}, & \text{if } k \le y < k+1,\ k \in \{0,1,\dots\}. \end{cases}$$

Clearly $Z$ is a discrete type random variable with support $S_Z = \{0,1,2,\dots\}$ and, for $k \in S_Z$,

$$P(\{Z = k\}) = F_Z(k) - F_Z(k-) = F_Z(k) - F_Z(k-1) = (k+1)\,p^2\,(1-p)^k.$$

Therefore the p.m.f. of $Z$ is given by

$$f_Z(y) = \begin{cases} (y+1)\,p^2\,(1-p)^y, & \text{if } y \in \{0,1,2,\dots\} \\ 0, & \text{otherwise.} \end{cases}$$
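Because the p.m.f.s here are rational in $p$, the closed form $(y+1)p^2(1-p)^y$ can be verified exactly with rational arithmetic (a sketch with the illustrative value $p = 1/4$):

```python
from fractions import Fraction

p = Fraction(1, 4)

def pmf_X(x):
    """NB(1, p): P(X = x) = p (1-p)^x, x = 0, 1, 2, ..."""
    return p * (1 - p)**x

# convolution of the two marginals vs the closed form from Example 10.2.4
for y in range(20):
    conv = sum(pmf_X(j) * pmf_X(y - j) for j in range(y + 1))
    closed = (y + 1) * p**2 * (1 - p)**y
    assert conv == closed
print("closed form verified exactly for y = 0..19")
```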

Example 10.2.5
Let $X_1$ and $X_2$ be independent and identically distributed random variables with common p.m.f.

$$f(x) = \begin{cases} \theta(1-\theta)^{x-1}, & \text{if } x \in \{1,2,\dots\} \\ 0, & \text{otherwise,} \end{cases}$$

where $\theta \in (0,1)$. Let $Z_1 = \min\{X_1, X_2\}$ and $Z_2 = \max\{X_1, X_2\} - \min\{X_1, X_2\}$.

(i) Find the marginal p.m.f. of $Z_1$ without finding the joint p.m.f. of $Z = (Z_1, Z_2)$;
(ii) Find the marginal p.m.f. of $Z_2$ without finding the joint p.m.f. of $Z = (Z_1, Z_2)$;
(iii) Find the joint p.m.f. of $Z = (Z_1, Z_2)$;
(iv) Are $Z_1$ and $Z_2$ independent?
(v) Using (iii) find the marginal p.m.f.s of $Z_1$ and $Z_2$.

Solution. The joint p.m.f. of $X = (X_1, X_2)$ is given by

$$f_X(x_1,x_2) = f(x_1)f(x_2) = \begin{cases} \theta^2(1-\theta)^{x_1+x_2-2}, & \text{if } (x_1,x_2) \in \mathbb{N}\times\mathbb{N} \\ 0, & \text{otherwise,} \end{cases}$$

where $\mathbb{N} = \{1,2,\dots\}$. Clearly $S_X = \mathbb{N}\times\mathbb{N}$.

(i) By Theorem 10.2.1,

$$f_{Z_1}(y) = \sum_{x \in B_y} f_X(x_1,x_2), \qquad y \in \mathbb{R},$$

where, for $y \in \mathbb{R}$, $B_y = \{(x_1,x_2) \in S_X : \min\{x_1,x_2\} = y\}$. Clearly, for $y \notin \{1,2,\dots\} = \mathbb{N}$, $B_y = \emptyset$ and therefore $f_{Z_1}(y) = 0$.

For $y \in \{1,2,\dots\}$,

$$B_y = \{(x_1,x_2) \in S_X : x_1 = x_2 = y\} \cup \{(x_1,x_2) \in S_X : x_2 = y,\ x_1 \in \{y+1, y+2, \dots\}\} \cup \{(x_1,x_2) \in S_X : x_1 = y,\ x_2 \in \{y+1, y+2, \dots\}\} = B_{1,y} \cup B_{2,y} \cup B_{3,y},\ \text{say}.$$

Clearly, for $y \in \{1,2,\dots\}$, $B_{1,y}$, $B_{2,y}$ and $B_{3,y}$ are pairwise disjoint sets. Therefore, for $y \in \mathbb{N}$,

$$f_{Z_1}(y) = \sum_{x \in B_{1,y}} f_X(x_1,x_2) + \sum_{x \in B_{2,y}} f_X(x_1,x_2) + \sum_{x \in B_{3,y}} f_X(x_1,x_2)$$

$$= \theta^2(1-\theta)^{2y-2} + \sum_{x_1=y+1}^{\infty}\theta^2(1-\theta)^{x_1+y-2} + \sum_{x_2=y+1}^{\infty}\theta^2(1-\theta)^{x_2+y-2}$$

$$= \theta^2(1-\theta)^{2y-2} + 2\theta^2\sum_{x=y+1}^{\infty}(1-\theta)^{x+y-2} = \theta^2(1-\theta)^{2y-2} + 2\theta(1-\theta)^{2y-1} = \theta(2-\theta)(1-\theta)^{2(y-1)}.$$

Therefore,

$$f_{Z_1}(y) = \begin{cases} \theta(2-\theta)(1-\theta)^{2(y-1)}, & \text{if } y \in \{1,2,\dots\} \\ 0, & \text{otherwise.} \end{cases}$$

(ii) We have

$$f_{Z_2}(y) = \sum_{x \in B_y} f_X(x_1,x_2), \qquad y \in \mathbb{R},$$

where, for $y \in \mathbb{R}$,

$$B_y = \{(x_1,x_2) \in S_X : \max\{x_1,x_2\} - \min\{x_1,x_2\} = y\}.$$

Clearly, for $y \notin \{0,1,2,\dots\}$, $B_y = \emptyset$, and therefore $f_{Z_2}(y) = 0$.

For $y = 0$, $B_y = \{(x_1,x_2) \in S_X : x_1 = x_2\} = \{(x,x): x \in \{1,2,\dots\}\}$, and therefore

$$f_{Z_2}(0) = \sum_{x=1}^{\infty}\theta^2(1-\theta)^{2x-2} = \frac{\theta^2}{1-(1-\theta)^2} = \frac{\theta}{2-\theta}.$$

For $y \in \{1,2,\dots\}$,

$$B_y = \{(x, x+y): x \in \{1,2,\dots\}\} \cup \{(x+y, x): x \in \{1,2,\dots\}\} = B_{1,y} \cup B_{2,y},\ \text{say}.$$

Since $B_{1,y} \cap B_{2,y} = \emptyset$, $y \in \{1,2,\dots\}$, we have, for $y \in \{1,2,\dots\}$,

$$f_{Z_2}(y) = \sum_{x \in B_{1,y}} f_X(x_1,x_2) + \sum_{x \in B_{2,y}} f_X(x_1,x_2) = 2\sum_{x=1}^{\infty}\theta^2(1-\theta)^{2x+y-2} = \frac{2\theta(1-\theta)^y}{2-\theta}.$$

Therefore

$$f_{Z_2}(y) = \begin{cases} \dfrac{\theta}{2-\theta}, & \text{if } y = 0 \\ \dfrac{2\theta(1-\theta)^y}{2-\theta}, & \text{if } y \in \{1,2,\dots\} \\ 0, & \text{otherwise.} \end{cases}$$

(iii) We have, for $y = (y_1,y_2) \in \mathbb{R}^2$,

$$f_Z(y_1,y_2) = P(\{\min\{X_1,X_2\} = y_1,\ \max\{X_1,X_2\} - \min\{X_1,X_2\} = y_2\}) = P(\{\min\{X_1,X_2\} = y_1,\ \max\{X_1,X_2\} = y_1+y_2\}) = \sum_{x \in B_y} f_X(x_1,x_2),$$

where, for $y = (y_1,y_2) \in \mathbb{R}^2$,

$$B_y = \{(x_1,x_2) \in S_X : \min\{x_1,x_2\} = y_1,\ \max\{x_1,x_2\} = y_1+y_2\}.$$

Note that, for $y = (y_1,y_2) \notin \mathbb{N}\times\{0,1,2,\dots\}$, $B_y = \emptyset$ and therefore $f_Z(y_1,y_2) = 0$. For $y = (y_1,y_2) \in \mathbb{N}\times\{0\}$, $B_y = \{(y_1,y_1)\}$ and therefore

$$f_Z(y_1,y_2) = \theta^2(1-\theta)^{2y_1-2}.$$

Also, for $y = (y_1,y_2) \in \mathbb{N}\times\mathbb{N}$, $B_y = \{(y_1, y_1+y_2), (y_1+y_2, y_1)\}$ and therefore

$$f_Z(y_1,y_2) = 2\theta^2(1-\theta)^{2y_1+y_2-2}.$$

It follows that

$$f_Z(y_1,y_2) = \begin{cases} \theta^2(1-\theta)^{2y_1-2}, & \text{if } (y_1,y_2) \in \mathbb{N}\times\{0\} \\ 2\theta^2(1-\theta)^{2y_1+y_2-2}, & \text{if } (y_1,y_2) \in \mathbb{N}\times\mathbb{N} \\ 0, & \text{otherwise.} \end{cases}$$

(iv) By (i)-(iii) we have

$$f_Z(y_1,y_2) = f_{Z_1}(y_1)\,f_{Z_2}(y_2), \qquad \forall\, y = (y_1,y_2) \in \mathbb{R}^2.$$

Consequently $Z_1$ and $Z_2$ are independent random variables.

(v) From (iii) we have $S_Z = \mathbb{N}\times\{0,1,2,\dots\}$. Therefore

$$S_{Z_1} = \{y_1 \in \mathbb{R}: (y_1,y_2) \in S_Z \text{ for some } y_2 \in \mathbb{R}\} = \mathbb{N}$$

and

$$S_{Z_2} = \{y_2 \in \mathbb{R}: (y_1,y_2) \in S_Z \text{ for some } y_1 \in \mathbb{R}\} = \{0,1,2,\dots\}.$$

Also

$$f_{Z_1}(y_1) = \begin{cases} \displaystyle\sum_{y_2 \in A_{y_1}} f_Z(y_1,y_2), & \text{if } y_1 \in S_{Z_1} \\ 0, & \text{otherwise,} \end{cases}$$

where, for $y_1 \in S_{Z_1}$, $A_{y_1} = \{y_2 \in \mathbb{R}: (y_1,y_2) \in S_Z\} = \{0,1,2,\dots\}$. Thus, for $y_1 \in S_{Z_1} = \{1,2,\dots\}$,

$$f_{Z_1}(y_1) = \sum_{y_2 \in A_{y_1}} f_Z(y_1,y_2) = \sum_{y_2=0}^{\infty} f_Z(y_1,y_2) = \theta^2(1-\theta)^{2y_1-2} + \sum_{y_2=1}^{\infty} 2\theta^2(1-\theta)^{2y_1+y_2-2} = \theta(2-\theta)(1-\theta)^{2(y_1-1)}.$$

Therefore

$$f_{Z_1}(y_1) = \begin{cases} \theta(2-\theta)(1-\theta)^{2(y_1-1)}, & \text{if } y_1 \in \{1,2,\dots\} \\ 0, & \text{otherwise.} \end{cases}$$

Similarly,

$$f_{Z_2}(y_2) = \begin{cases} \displaystyle\sum_{y_1 \in A_{y_2}} f_Z(y_1,y_2), & \text{if } y_2 \in S_{Z_2} \\ 0, & \text{otherwise,} \end{cases}$$

where, for $y_2 \in S_{Z_2}$, $A_{y_2} = \{y_1 \in \mathbb{R}: (y_1,y_2) \in S_Z\} = \{1,2,\dots\}$. Therefore, for $y_2 = 0$,

$$f_{Z_2}(0) = \sum_{y_1=1}^{\infty}\theta^2(1-\theta)^{2y_1-2} = \frac{\theta}{2-\theta},$$

and, for $y_2 \in \{1,2,\dots\}$,

$$f_{Z_2}(y_2) = \sum_{y_1=1}^{\infty}2\theta^2(1-\theta)^{2y_1+y_2-2} = \frac{2\theta(1-\theta)^{y_2}}{2-\theta}.$$

It follows that

$$f_{Z_2}(y_2) = \begin{cases} \dfrac{\theta}{2-\theta}, & \text{if } y_2 = 0 \\ \dfrac{2\theta(1-\theta)^{y_2}}{2-\theta}, & \text{if } y_2 \in \{1,2,\dots\} \\ 0, & \text{otherwise.} \end{cases}$$
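The independence claim in (iv) can be confirmed numerically by tabulating the joint p.m.f. of $(\min, \max-\min)$ directly and comparing it with the product of the marginals from (i) and (ii) (a sketch with the illustrative value $\theta = 0.35$; the support is truncated where the geometric tail is negligible):

```python
theta = 0.35
K = 200   # truncation point; (1 - theta)^K is negligible here

def f(x):
    """Common p.m.f.: theta (1-theta)^(x-1), x = 1, 2, ..."""
    return theta * (1 - theta)**(x - 1)

# joint p.m.f. of (Z1, Z2) = (min, max - min) by direct summation over (x1, x2)
joint = {}
for x1 in range(1, K + 1):
    for x2 in range(1, K + 1):
        key = (min(x1, x2), abs(x1 - x2))
        joint[key] = joint.get(key, 0.0) + f(x1) * f(x2)

def pmf_Z1(y):                       # theta (2-theta) (1-theta)^{2(y-1)}
    return theta * (2 - theta) * (1 - theta)**(2 * (y - 1))

def pmf_Z2(y):                       # theta/(2-theta) at 0, else 2 theta (1-theta)^y/(2-theta)
    return theta / (2 - theta) if y == 0 else 2 * theta * (1 - theta)**y / (2 - theta)

err = max(abs(joint[(y1, y2)] - pmf_Z1(y1) * pmf_Z2(y2))
          for y1 in range(1, 10) for y2 in range(0, 10))
print(err)
```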

Example 10.2.6
Let $X = (X_1, X_2, X_3)$ be a discrete type random vector with p.m.f.

$$f_X(x_1,x_2,x_3) = \begin{cases} \dfrac{2}{9}, & \text{if } (x_1,x_2,x_3) \in \{(1,1,0), (1,0,1), (0,1,1)\} \\ \dfrac{1}{3}, & \text{if } (x_1,x_2,x_3) = (1,1,1) \\ 0, & \text{otherwise.} \end{cases}$$

Define $Z_1 = X_1 + X_2$ and $Z_2 = X_2 + X_3$.

(i) Find the marginal p.m.f. of $Z_1$ without finding the joint p.m.f. of $Z = (Z_1, Z_2)$;
(ii) Find the marginal p.m.f. of $Z_2$ without finding the joint p.m.f. of $Z = (Z_1, Z_2)$;
(iii) Find the joint p.m.f. of $Z = (Z_1, Z_2)$;
(iv) Are $Z_1$ and $Z_2$ independent?
(v) Using (iii) find the marginal p.m.f.s of $Z_1$ and $Z_2$.

Solution. (i) We have

$$P(\{Z_1 = y\}) = P(\{X_1+X_2 = y\}) = 0, \quad \text{if } y \notin \{1,2\},$$

$$P(\{Z_1 = 1\}) = P(\{X_1+X_2 = 1\}) = P(\{(X_1,X_2,X_3) \in \{(1,0,1), (0,1,1)\}\}) = P(\{(X_1,X_2,X_3) = (1,0,1)\}) + P(\{(X_1,X_2,X_3) = (0,1,1)\}) = \frac{4}{9}$$

and

$$P(\{Z_1 = 2\}) = P(\{X_1+X_2 = 2\}) = P(\{(X_1,X_2,X_3) = (1,1,0)\}) + P(\{(X_1,X_2,X_3) = (1,1,1)\}) = \frac{5}{9}.$$

Therefore,

$$f_{Z_1}(y) = \begin{cases} \dfrac{4}{9}, & \text{if } y = 1 \\ \dfrac{5}{9}, & \text{if } y = 2 \\ 0, & \text{otherwise.} \end{cases}$$

(ii) By symmetry,

$$f_{Z_2}(y) = \begin{cases} \dfrac{4}{9}, & \text{if } y = 1 \\ \dfrac{5}{9}, & \text{if } y = 2 \\ 0, & \text{otherwise.} \end{cases}$$

(iii) The joint p.m.f. of $Z = (Z_1, Z_2)$ is

$$f_Z(y_1,y_2) = P(\{X_1+X_2 = y_1,\ X_2+X_3 = y_2\}) = 0, \quad \text{if } (y_1,y_2) \notin \{(1,1), (1,2), (2,1), (2,2)\},$$

$$f_Z(1,1) = P(\{X_1+X_2 = 1,\ X_2+X_3 = 1\}) = P(\{(X_1,X_2,X_3) = (1,0,1)\}) = \frac{2}{9},$$

$$f_Z(1,2) = P(\{X_1+X_2 = 1,\ X_2+X_3 = 2\}) = P(\{(X_1,X_2,X_3) = (0,1,1)\}) = \frac{2}{9},$$

$$f_Z(2,1) = P(\{X_1+X_2 = 2,\ X_2+X_3 = 1\}) = P(\{(X_1,X_2,X_3) = (1,1,0)\}) = \frac{2}{9}$$

and

$$f_Z(2,2) = P(\{X_1+X_2 = 2,\ X_2+X_3 = 2\}) = P(\{(X_1,X_2,X_3) = (1,1,1)\}) = \frac{1}{3}.$$

Therefore

$$f_Z(y_1,y_2) = \begin{cases} \dfrac{2}{9}, & \text{if } (y_1,y_2) \in \{(1,1), (1,2), (2,1)\} \\ \dfrac{1}{3}, & \text{if } (y_1,y_2) = (2,2) \\ 0, & \text{otherwise.} \end{cases}$$

(iv) Since

$$P(\{Z_1 = 1, Z_2 = 1\}) = \frac{2}{9} \neq \frac{16}{81} = P(\{Z_1 = 1\})\,P(\{Z_2 = 1\}),$$

$Z_1$ and $Z_2$ are not independent.

(v) Using (iii) we have $S_Z = \{(1,1), (1,2), (2,1), (2,2)\}$. Therefore

$$S_{Z_1} = \{y_1 \in \mathbb{R}: (y_1,y_2) \in S_Z \text{ for some } y_2 \in \mathbb{R}\} = \{1,2\}$$

and

$$S_{Z_2} = \{y_2 \in \mathbb{R}: (y_1,y_2) \in S_Z \text{ for some } y_1 \in \mathbb{R}\} = \{1,2\}.$$

Also

$$f_{Z_1}(y_1) = \begin{cases} \displaystyle\sum_{y_2 \in A_{y_1}} f_Z(y_1,y_2), & \text{if } y_1 \in S_{Z_1} \\ 0, & \text{otherwise,} \end{cases}$$

where, for $y_1 \in S_{Z_1}$, $A_{y_1} = \{y_2 \in \mathbb{R}: (y_1,y_2) \in S_Z\} = \{1,2\}$. Consequently, for $y_1 \in S_{Z_1} = \{1,2\}$,

$$f_{Z_1}(y_1) = \sum_{y_2 \in A_{y_1}} f_Z(y_1,y_2) = \sum_{y_2=1}^{2} f_Z(y_1,y_2),$$

so that

$$f_{Z_1}(1) = f_Z(1,1) + f_Z(1,2) = \frac{4}{9} \qquad \text{and} \qquad f_{Z_1}(2) = f_Z(2,1) + f_Z(2,2) = \frac{5}{9}.$$

Then

$$f_{Z_1}(y_1) = \begin{cases} \dfrac{4}{9}, & \text{if } y_1 = 1 \\ \dfrac{5}{9}, & \text{if } y_1 = 2 \\ 0, & \text{otherwise.} \end{cases}$$

By symmetry,

$$f_{Z_2}(y) = f_{Z_1}(y), \qquad \forall\, y \in \mathbb{R}.$$

Example 10.2.7
Let $X = (X_1, X_2)$ be a discrete type random vector with p.m.f. $f_{X_1,X_2}(x_1,x_2)$ given by the following table:

    x2 \ x1      -1        1
    0            1/4       1/2
    2            1/16      3/16

Find the p.m.f. of $Z = |X_1 - 2X_2|$.

Solution. We have

    (x1, x2)     f(x1, x2)     y = |x1 - 2 x2|
    (-1, 0)      1/4           1
    (1, 0)       1/2           1
    (-1, 2)      1/16          5
    (1, 2)       3/16          3

Therefore the p.m.f. of $Z = |X_1 - 2X_2|$ is given by

$$f_Z(y) = \begin{cases} \dfrac{3}{4}, & \text{if } y = 1 \\ \dfrac{3}{16}, & \text{if } y = 3 \\ \dfrac{1}{16}, & \text{if } y = 5 \\ 0, & \text{otherwise.} \end{cases}$$

For finding the probability distributions of functions of a random vector of absolutely continuous type we have the following theorem.

Theorem 10.2.2

Let $X = (X_1,\dots,X_p)$ be a random vector of absolutely continuous type with a joint p.d.f. $f_X(\cdot)$ and support $S_X = \{x \in \mathbb{R}^p : f_X(x) > 0\}$. Let $S_1,\dots,S_k$ be open subsets of $\mathbb{R}^p$ such that $S_i \cap S_j = \emptyset$ if $i \neq j$, and $\bigcup_{i=1}^{k}S_i = S_X$. Suppose that $h_j: \mathbb{R}^p \to \mathbb{R}$, $j = 1,\dots,p$, are $p$ Borel functions such that, on each $S_i$, $h = (h_1,\dots,h_p): S_i \to \mathbb{R}^p$ is one-to-one with inverse transformation $h_i^{-1} = \left(h_{1,i}^{-1},\dots,h_{p,i}^{-1}\right)$ (say), $i = 1,\dots,k$. Further suppose that $h_{j,i}^{-1}$, $j = 1,\dots,p$, $i = 1,\dots,k$, have continuous partial derivatives and the Jacobian determinants

$$J_i = \begin{vmatrix} \dfrac{\partial h_{1,i}^{-1}}{\partial y_1} & \cdots & \dfrac{\partial h_{1,i}^{-1}}{\partial y_p} \\ \vdots & & \vdots \\ \dfrac{\partial h_{p,i}^{-1}}{\partial y_1} & \cdots & \dfrac{\partial h_{p,i}^{-1}}{\partial y_p} \end{vmatrix} \neq 0, \qquad i = 1,\dots,k.$$

Define $h(S_j) = \{h(x) = (h_1(x),\dots,h_p(x)) \in \mathbb{R}^p : x \in S_j\}$, $j = 1,\dots,k$, and $Y_j = h_j(X_1,\dots,X_p)$, $j = 1,\dots,p$. Then the random vector $Y = (Y_1,\dots,Y_p)$ is of absolutely continuous type with joint p.d.f.

$$f_Y(y) = \sum_{j=1}^{k} f_X\!\left(h_{1,j}^{-1}(y),\dots,h_{p,j}^{-1}(y)\right)\left|J_j\right|\,I_{h(S_j)}(y).$$

We shall not provide the proof of the above theorem. The idea of the proof is similar to that of Theorem 2.2, Module 3. In the proof of the theorem, the joint distribution function of $Y$ is written in the form of multiple integrals which are simplified by making a change of variables using the change of variable theorem of multivariable calculus.

The following corollary is immediate from Theorem 10.2.2.

Corollary 10.2.1

Let $X = (X_1,\dots,X_p)$ be a random vector of absolutely continuous type with a joint p.d.f. $f_X(\cdot)$ and support $S_X = \{x \in \mathbb{R}^p : f_X(x) > 0\}$, an open set in $\mathbb{R}^p$. Suppose that $h_j: \mathbb{R}^p \to \mathbb{R}$, $j = 1,\dots,p$, are $p$ Borel functions such that $h = (h_1,\dots,h_p): S_X \to \mathbb{R}^p$ is one-to-one with inverse transformation $h^{-1} = (h_1^{-1},\dots,h_p^{-1})$ (say). Further suppose that $h_j^{-1}$, $j = 1,\dots,p$, have continuous partial derivatives and the Jacobian determinant

$$J = \begin{vmatrix} \dfrac{\partial h_1^{-1}}{\partial y_1} & \cdots & \dfrac{\partial h_1^{-1}}{\partial y_p} \\ \vdots & & \vdots \\ \dfrac{\partial h_p^{-1}}{\partial y_1} & \cdots & \dfrac{\partial h_p^{-1}}{\partial y_p} \end{vmatrix} \neq 0.$$

Define $h(S_X) = \{h(x) = (h_1(x),\dots,h_p(x)) \in \mathbb{R}^p : x \in S_X\}$ and $Y_j = h_j(X_1,\dots,X_p)$, $j = 1,\dots,p$. Then the random vector $Y = (Y_1,\dots,Y_p)$ is of absolutely continuous type with joint p.d.f.

$$f_Y(y) = f_X\!\left(h_1^{-1}(y),\dots,h_p^{-1}(y)\right)|J|\,I_{h(S_X)}(y).$$

Remark 10.2.1

Let $X = (X_1,\dots,X_p)$ be a random vector of absolutely continuous type with joint p.d.f. $f_X$ and let $S_X = \{x \in \mathbb{R}^p : f_X(x) > 0\}$. Suppose that we are interested in finding the joint probability distribution of the random vector $Y = (Y_1,\dots,Y_k) = (h_1(X),\dots,h_k(X))$, where $k \in \{1,\dots,p\}$ and $h_i: \mathbb{R}^p \to \mathbb{R}$, $i = 1,\dots,k$, are some Borel functions. For this we shall define $p-k$ additional auxiliary Borel functions $h_i: \mathbb{R}^p \to \mathbb{R}$, $i = k+1,\dots,p$, such that the transformation $h = (h_1,\dots,h_p): S_X \to \mathbb{R}^p$ satisfies the assumptions of Theorem 10.2.2/Corollary 10.2.1. Then an application of Theorem 10.2.2/Corollary 10.2.1 will provide the joint p.d.f. $f_Y(y_1,\dots,y_p)$ of $Y = (Y_1,\dots,Y_p)$, from which the marginal joint p.d.f. of $(Y_1,\dots,Y_k)$ is obtained by integrating out the unwanted variables $y_{k+1},\dots,y_p$ in $f_Y(y_1,\dots,y_k,y_{k+1},\dots,y_p)$.

Example 10.2.8
Let $X_1$ and $X_2$ be independent and identically distributed random variables with common p.d.f.

$$f(x) = \begin{cases} \dfrac{1}{2}, & \text{if } -2<x<-1 \\ \dfrac{1}{6}, & \text{if } 0<x<3 \\ 0, & \text{otherwise.} \end{cases}$$

Find the p.d.f. of $Z_1 = |X_1| + |X_2|$.

Solution. The joint p.d.f. of $X = (X_1, X_2)$ is given by

$$f_X(x_1,x_2) = f(x_1)f(x_2) = \begin{cases} \dfrac{1}{4}, & \text{if } (x_1,x_2) \in (-2,-1)\times(-2,-1) \\ \dfrac{1}{12}, & \text{if } (x_1,x_2) \in ((-2,-1)\times(0,3)) \cup ((0,3)\times(-2,-1)) \\ \dfrac{1}{36}, & \text{if } (x_1,x_2) \in (0,3)\times(0,3) \\ 0, & \text{otherwise.} \end{cases}$$

Define the auxiliary random variable $Z_2 = |X_1|$. We have

$$S_X = \{(x_1,x_2) \in \mathbb{R}^2 : f_X(x_1,x_2) > 0\} = S_1 \cup S_2 \cup S_3 \cup S_4,$$

where $S_1 = (-2,-1)^2$, $S_2 = (-2,-1)\times(0,3)$, $S_3 = (0,3)\times(-2,-1)$ and $S_4 = (0,3)^2$.

Let $h = (h_1, h_2): \mathbb{R}^2 \to \mathbb{R}^2$ be defined by

$$h_1(x_1,x_2) = |x_1| + |x_2| \qquad \text{and} \qquad h_2(x_1,x_2) = |x_1|, \qquad x = (x_1,x_2) \in \mathbb{R}^2.$$

Then $Z_1 = h_1(X_1,X_2)$, $Z_2 = h_2(X_1,X_2)$, $S_i \cap S_j = \emptyset$, $i \neq j$, and on each $S_i$, $i = 1,2,3,4$, $h = (h_1,h_2): S_i \to \mathbb{R}^2$ is one-to-one. Under the notation of Theorem 10.2.2 we have

$$h_{1,1}^{-1}(z) = -z_2, \quad h_{2,1}^{-1}(z) = -(z_1-z_2), \quad J_1 = \begin{vmatrix} 0 & -1 \\ -1 & 1 \end{vmatrix} = -1;$$

$$h_{1,2}^{-1}(z) = -z_2, \quad h_{2,2}^{-1}(z) = z_1-z_2, \quad J_2 = \begin{vmatrix} 0 & -1 \\ 1 & -1 \end{vmatrix} = 1;$$

$$h_{1,3}^{-1}(z) = z_2, \quad h_{2,3}^{-1}(z) = -(z_1-z_2), \quad J_3 = \begin{vmatrix} 0 & 1 \\ -1 & 1 \end{vmatrix} = 1;$$

$$h_{1,4}^{-1}(z) = z_2, \quad h_{2,4}^{-1}(z) = z_1-z_2, \quad J_4 = \begin{vmatrix} 0 & 1 \\ 1 & -1 \end{vmatrix} = -1;$$

$$h(S_1) = \{(z_1,z_2) \in \mathbb{R}^2 : -2 < -z_2 < -1,\ -2 < -(z_1-z_2) < -1\} = \{(z_1,z_2) : z_2+1 < z_1 < z_2+2,\ 1 < z_2 < 2\};$$

$$h(S_2) = \{(z_1,z_2) \in \mathbb{R}^2 : -2 < -z_2 < -1,\ 0 < z_1-z_2 < 3\} = \{(z_1,z_2) : z_2 < z_1 < z_2+3,\ 1 < z_2 < 2\};$$

$$h(S_3) = \{(z_1,z_2) \in \mathbb{R}^2 : 0 < z_2 < 3,\ -2 < -(z_1-z_2) < -1\} = \{(z_1,z_2) : z_2+1 < z_1 < z_2+2,\ 0 < z_2 < 3\}$$

and

$$h(S_4) = \{(z_1,z_2) \in \mathbb{R}^2 : 0 < z_2 < 3,\ 0 < z_1-z_2 < 3\} = \{(z_1,z_2) : z_2 < z_1 < z_2+3,\ 0 < z_2 < 3\}.$$

Consequently the joint p.d.f. of $Z = (Z_1, Z_2)$ is given by

$$f_Z(z_1,z_2) = \sum_{j=1}^{4} f_X\!\left(h_{1,j}^{-1}(z), h_{2,j}^{-1}(z)\right)\left|J_j\right|\,I_{h(S_j)}(z) = \frac{1}{4}I_{h(S_1)}(z) + \frac{1}{12}I_{h(S_2)}(z) + \frac{1}{12}I_{h(S_3)}(z) + \frac{1}{36}I_{h(S_4)}(z)$$

$$= \begin{cases} \dfrac{1}{36}, & \text{if } z_2 < z_1 < z_2+1,\ 0 < z_2 < 1 \\ \dfrac{1}{12}+\dfrac{1}{36}, & \text{if } z_2+1 < z_1 < z_2+2,\ 0 < z_2 < 1 \\ \dfrac{1}{36}, & \text{if } z_2+2 < z_1 < z_2+3,\ 0 < z_2 < 1 \\ \dfrac{1}{12}+\dfrac{1}{36}, & \text{if } z_2 < z_1 < z_2+1,\ 1 < z_2 < 2 \\ \dfrac{1}{4}+\dfrac{1}{12}+\dfrac{1}{12}+\dfrac{1}{36}, & \text{if } z_2+1 < z_1 < z_2+2,\ 1 < z_2 < 2 \\ \dfrac{1}{12}+\dfrac{1}{36}, & \text{if } z_2+2 < z_1 < z_2+3,\ 1 < z_2 < 2 \\ \dfrac{1}{36}, & \text{if } z_2 < z_1 < z_2+1,\ 2 < z_2 < 3 \\ \dfrac{1}{12}+\dfrac{1}{36}, & \text{if } z_2+1 < z_1 < z_2+2,\ 2 < z_2 < 3 \\ \dfrac{1}{36}, & \text{if } z_2+2 < z_1 < z_2+3,\ 2 < z_2 < 3 \\ 0, & \text{otherwise} \end{cases}$$

$$= \begin{cases} \dfrac{1}{36}, & \text{if } 0 < z_1 < 2,\ \max\{0, z_1-1\} < z_2 < \min\{1, z_1\}, \\ & \text{or } 2 < z_1 < 4,\ \max\{0, z_1-3\} < z_2 < \min\{1, z_1-2\}, \\ & \text{or } 2 < z_1 < 4,\ \max\{2, z_1-1\} < z_2 < \min\{3, z_1\}, \\ & \text{or } 4 < z_1 < 6,\ \max\{2, z_1-3\} < z_2 < \min\{3, z_1-2\} \\ \dfrac{1}{9}, & \text{if } 1 < z_1 < 3,\ \max\{0, z_1-2\} < z_2 < \min\{1, z_1-1\}, \\ & \text{or } 1 < z_1 < 3,\ \max\{1, z_1-1\} < z_2 < \min\{2, z_1\}, \\ & \text{or } 3 < z_1 < 5,\ \max\{1, z_1-3\} < z_2 < \min\{2, z_1-2\}, \\ & \text{or } 3 < z_1 < 5,\ \max\{2, z_1-2\} < z_2 < \min\{3, z_1-1\} \\ \dfrac{4}{9}, & \text{if } 2 < z_1 < 4,\ \max\{1, z_1-2\} < z_2 < \min\{2, z_1-1\} \\ 0, & \text{otherwise.} \end{cases}$$

Then the marginal p.d.f. of $Z_1$ is given by

$$f_{Z_1}(z_1) = \int_{-\infty}^{\infty} f_Z(z_1,z_2)\,dz_2, \qquad z_1 \in \mathbb{R}.$$

For $z_1 \in (0,1)$,

$$f_{Z_1}(z_1) = \frac{\min\{1,z_1\}-\max\{0,z_1-1\}}{36} = \frac{z_1}{36};$$

for $z_1 \in (1,2)$,

$$f_{Z_1}(z_1) = \frac{\min\{1,z_1\}-\max\{0,z_1-1\}}{36} + \frac{\min\{1,z_1-1\}-\max\{0,z_1-2\}}{9} + \frac{\min\{2,z_1\}-\max\{1,z_1-1\}}{9} = \frac{7z_1-6}{36};$$

for $z_1 \in (2,3)$,

$$f_{Z_1}(z_1) = \frac{\min\{1,z_1-2\}-\max\{0,z_1-3\}}{36} + \frac{\min\{3,z_1\}-\max\{2,z_1-1\}}{36} + \frac{\min\{1,z_1-1\}-\max\{0,z_1-2\}}{9} + \frac{\min\{2,z_1\}-\max\{1,z_1-1\}}{9} + \frac{4\left[\min\{2,z_1-1\}-\max\{1,z_1-2\}\right]}{9} = \frac{5z_1-6}{18};$$

for $z_1 \in (3,4)$,

$$f_{Z_1}(z_1) = \frac{\min\{1,z_1-2\}-\max\{0,z_1-3\}}{36} + \frac{\min\{3,z_1\}-\max\{2,z_1-1\}}{36} + \frac{\min\{2,z_1-2\}-\max\{1,z_1-3\}}{9} + \frac{\min\{3,z_1-1\}-\max\{2,z_1-2\}}{9} + \frac{4\left[\min\{2,z_1-1\}-\max\{1,z_1-2\}\right]}{9} = \frac{24-5z_1}{18};$$

for $z_1 \in (4,5)$,

$$f_{Z_1}(z_1) = \frac{\min\{3,z_1-2\}-\max\{2,z_1-3\}}{36} + \frac{\min\{2,z_1-2\}-\max\{1,z_1-3\}}{9} + \frac{\min\{3,z_1-1\}-\max\{2,z_1-2\}}{9} = \frac{36-7z_1}{36};$$

and, for $z_1 \in (5,6)$,

$$f_{Z_1}(z_1) = \frac{\min\{3,z_1-2\}-\max\{2,z_1-3\}}{36} = \frac{6-z_1}{36}.$$

Therefore the p.d.f. of $Z_1 = |X_1|+|X_2|$ is given by

$$f_{Z_1}(z_1) = \begin{cases} \dfrac{z_1}{36}, & \text{if } 0 < z_1 < 1 \\ \dfrac{7z_1-6}{36}, & \text{if } 1 < z_1 < 2 \\ \dfrac{5z_1-6}{18}, & \text{if } 2 < z_1 < 3 \\ \dfrac{24-5z_1}{18}, & \text{if } 3 < z_1 < 4 \\ \dfrac{36-7z_1}{36}, & \text{if } 4 < z_1 < 5 \\ \dfrac{6-z_1}{36}, & \text{if } 5 < z_1 < 6 \\ 0, & \text{otherwise.} \end{cases}$$
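The six-piece density of $Z_1 = |X_1|+|X_2|$ can be checked against simulation from the original two-piece density (a sketch; the check point $z_0 = 2.5$ is arbitrary):

```python
import random

def pdf_Z1(z):
    """Piecewise density of Z1 = |X1| + |X2| from Example 10.2.8."""
    if 0 < z < 1:
        return z / 36
    if 1 < z < 2:
        return (7 * z - 6) / 36
    if 2 < z < 3:
        return (5 * z - 6) / 18
    if 3 < z < 4:
        return (24 - 5 * z) / 18
    if 4 < z < 5:
        return (36 - 7 * z) / 36
    if 5 < z < 6:
        return (6 - z) / 36
    return 0.0

def sample_X():
    """Draw from the two-piece density: 1/2 on (-2,-1), 1/6 on (0,3)."""
    if random.random() < 0.5:
        return -2 + random.random()
    return 3 * random.random()

# numeric integral of the density up to z0 vs a Monte Carlo estimate of the cdf
z0, n_grid = 2.5, 60_000
h = z0 / n_grid
integral = sum(pdf_Z1((k + 0.5) * h) for k in range(n_grid)) * h

random.seed(4)
N = 100_000
freq = sum(abs(sample_X()) + abs(sample_X()) <= z0 for _ in range(N)) / N
print(integral, freq)
```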

Example 10.2.9 (Distribution of order statistics)

Let $X_1,\dots,X_n$ be a random sample of absolutely continuous type random variables having a common p.d.f. $f(\cdot)$, common distribution function $F(\cdot)$ and common support $S = \{x \in \mathbb{R}: f(x) > 0\}$, an open set in $\mathbb{R}$. Let $X_{1:n},\dots,X_{n:n}$ denote the order statistics of $X_1,\dots,X_n$, i.e., $X_{r:n} = r$-th smallest of $X_1,\dots,X_n$, $r = 1,\dots,n$. For notational convenience, let $Z_r = X_{r:n}$, $r = 1,\dots,n$.

(i) Find an expression for the joint distribution function of $Z = (Z_1,\dots,Z_n)$. Hence find the joint p.d.f. of $Z$;
(ii) Find the joint p.d.f. of $Z$ directly using Theorem 10.2.2;
(iii) Using (ii), find the marginal p.d.f. of $Z_r$, $r = 1,\dots,n$;
(iv) Using (ii), find the marginal joint p.d.f. of $(Z_r, Z_s)$, where $1 \le r < s \le n$.

Solution. The joint p.d.f. of $X = (X_1,\dots,X_n)$ is given by

$$f_X(x_1,\dots,x_n) = \prod_{i=1}^{n} f_{X_i}(x_i) = \prod_{i=1}^{n} f(x_i), \qquad x = (x_1,\dots,x_n) \in \mathbb{R}^n.$$

Let $P_n = \{\Pi_1,\dots,\Pi_{n!}\}$ denote the set of all permutations of $(1,\dots,n)$; here, for $i \in \{1,\dots,n!\}$, $\Pi_i = (\Pi_{i,(1)},\dots,\Pi_{i,(n)})$ is a permutation of $(1,\dots,n)$.

(i) Since $X_1,\dots,X_n$ is a random sample, we have

$$(X_1,\dots,X_n) \overset{d}{=} \left(X_{\Pi_{i,(1)}},\dots,X_{\Pi_{i,(n)}}\right), \qquad i \in \{1,\dots,n!\}. \tag{10.2.1}$$

Also, since $X = (X_1,\dots,X_n)$ is of absolutely continuous type (as $X_1,\dots,X_n$ are of absolutely continuous type), $P(\{X_i = X_j \text{ for some } i \neq j\}) = 0$ and therefore

$$\sum_{i=1}^{n!} P\!\left(\left\{X_{\Pi_{i,(1)}} < \cdots < X_{\Pi_{i,(n)}}\right\}\right) = 1.$$

Then, for $y = (y_1,\dots,y_n) \in \mathbb{R}^n$,

$$F_Z(y) = P(\{X_{1:n} \le y_1,\dots,X_{n:n} \le y_n\})$$

$$= \sum_{i=1}^{n!} P\!\left(\left\{X_{1:n} \le y_1,\dots,X_{n:n} \le y_n,\ X_{\Pi_{i,(1)}} < \cdots < X_{\Pi_{i,(n)}}\right\}\right)$$

$$= \sum_{i=1}^{n!} P\!\left(\left\{X_{\Pi_{i,(1)}} \le y_1,\dots,X_{\Pi_{i,(n)}} \le y_n,\ X_{\Pi_{i,(1)}} < \cdots < X_{\Pi_{i,(n)}}\right\}\right)$$

$$= \sum_{i=1}^{n!} P(\{X_1 \le y_1,\dots,X_n \le y_n,\ X_1 < \cdots < X_n\}) \qquad (\text{using } (10.2.1))$$

$$= n!\,P(\{X_1 \le y_1,\dots,X_n \le y_n,\ X_1 < \cdots < X_n\})$$

$$= \int_{-\infty}^{y_1}\cdots\int_{-\infty}^{y_n} n!\left[\prod_{i=1}^{n}f(x_i)\right] I_A(x)\,dx_n\cdots dx_1,$$

where $A = \{x \in \mathbb{R}^n : -\infty < x_1 < \cdots < x_n < \infty\}$.

It follows that $Z$ is of absolutely continuous type with p.d.f.

$$f_Z(y) = n!\left[\prod_{i=1}^{n}f(y_i)\right] I_A(y) = \begin{cases} n!\,\displaystyle\prod_{i=1}^{n}f(y_i), & \text{if } -\infty < y_1 < \cdots < y_n < \infty \\ 0, & \text{otherwise.} \end{cases}$$

(ii) Since $X$ is of absolutely continuous type, we may, without loss of generality, take $S_X \subseteq \{x \in \mathbb{R}^n : x_i \neq x_j,\ \forall\, i \neq j,\ i,j \in \{1,\dots,n\}\}$. Then $S_X = \bigcup_{i=1}^{n!}S_i$, where $S_i = \{x \in S_X : x_{\Pi_{i,(1)}} < \cdots < x_{\Pi_{i,(n)}}\}$, $i = 1,\dots,n!$. Define $h_j: \mathbb{R}^n \to \mathbb{R}$ by $h_j(x) = j$-th smallest of $x_1,\dots,x_n$, $j = 1,\dots,n$, and $h = (h_1,\dots,h_n)$. Then $h: S_X \to \mathbb{R}^n$ is not one-to-one (for each $y \in h(S_X) = \{h(x): x \in S_X\}$ there are $n!$ pre-images). However, on each $S_i$, $i = 1,\dots,n!$, $h: S_i \to \mathbb{R}^n$ is one-to-one with inverse transformation

$$h_i^{-1}(y) = \left(h_{1,i}^{-1}(y),\dots,h_{n,i}^{-1}(y)\right) = \left(y_{\Pi_{i,(1)}^{-1}},\dots,y_{\Pi_{i,(n)}^{-1}}\right),$$

where $\Pi_i^{-1} = \left(\Pi_{i,(1)}^{-1},\dots,\Pi_{i,(n)}^{-1}\right)$, $i = 1,\dots,n!$, is the inverse permutation of $\Pi_i$. Under the notation of Theorem 10.2.2, each row and each column of the Jacobian determinant $J_i$ contains one, and only one, non-zero element, which is $1$. Therefore $J_i = \pm 1$, $i = 1,\dots,n!$. Also $h(S_i) = \{y \in \mathbb{R}^n : -\infty < y_1 < \cdots < y_n < \infty\} = B$, say, $i = 1,\dots,n!$. Therefore the joint p.d.f. of $Z$ is given by

$$f_Z(y) = \sum_{i=1}^{n!} f_X\!\left(y_{\Pi_{i,(1)}^{-1}},\dots,y_{\Pi_{i,(n)}^{-1}}\right)\left|J_i\right|\,I_{h(S_i)}(y) = \sum_{i=1}^{n!}\left[\prod_{j=1}^{n}f\!\left(y_{\Pi_{i,(j)}^{-1}}\right)\right] I_B(y).$$

Since $\left\{\Pi_{i,(1)}^{-1},\dots,\Pi_{i,(n)}^{-1}\right\} = \{1,\dots,n\}$, we have

$$\prod_{j=1}^{n} f\!\left(y_{\Pi_{i,(j)}^{-1}}\right) = \prod_{j=1}^{n} f(y_j), \qquad \forall\, y \in B.$$

Consequently

$$f_Z(y) = n!\left[\prod_{j=1}^{n}f(y_j)\right] I_B(y) = \begin{cases} n!\,\displaystyle\prod_{j=1}^{n}f(y_j), & \text{if } -\infty < y_1 < \cdots < y_n < \infty \\ 0, & \text{otherwise.} \end{cases}$$

(iii) The marginal p.d.f. of $Z_r$ ($r = 1,\dots,n$) is obtained by integrating out the remaining variables:

$$f_{Z_r}(y) = \int_{-\infty}^{\infty}\cdots\int_{-\infty}^{\infty} f_Z(y_1,\dots,y_{r-1},y,y_{r+1},\dots,y_n)\prod_{\substack{j=1\\ j\neq r}}^{n} dy_j$$

$$= n!\,f(y)\left[\ \int\limits_{-\infty<y_1<\cdots<y_{r-1}<y}\prod_{j=1}^{r-1}f(y_j)\,dy_j\right]\left[\ \int\limits_{y<y_{r+1}<\cdots<y_n<\infty}\prod_{j=r+1}^{n}f(y_j)\,dy_j\right]$$

$$= n!\,f(y)\,\frac{[F(y)]^{r-1}}{(r-1)!}\,\frac{[1-F(y)]^{n-r}}{(n-r)!} = \frac{n!}{(r-1)!\,(n-r)!}\,[F(y)]^{r-1}\,[1-F(y)]^{n-r}\,f(y), \qquad -\infty < y < \infty,$$

since $\int_{-\infty}^{a}f(t)\,dt = F(a)$, $\int_{b}^{\infty}f(t)\,dt = 1-F(b)$, and integration over each ordered region contributes the factorial factors $1/(r-1)!$ and $1/(n-r)!$.

(iv) As in (iii), integrating out $y_j$, $j \neq r, s$, over the region $-\infty < y_1 < \cdots < y_{r-1} < x < y_{r+1} < \cdots < y_{s-1} < y < y_{s+1} < \cdots < y_n < \infty$ gives, for $-\infty < x < y < \infty$,

$$f_{Z_r,Z_s}(x,y) = n!\,f(x)f(y)\,\frac{[F(x)]^{r-1}}{(r-1)!}\,\frac{[F(y)-F(x)]^{s-r-1}}{(s-r-1)!}\,\frac{[1-F(y)]^{n-s}}{(n-s)!}$$

$$= \frac{n!}{(r-1)!\,(s-r-1)!\,(n-s)!}\,[F(x)]^{r-1}\,[F(y)-F(x)]^{s-r-1}\,[1-F(y)]^{n-s}\,f(x)\,f(y).$$

Clearly $f_{Z_r,Z_s}(x,y) = 0$ if $x \ge y$.

Example 10.2.10 (Distribution of normalized spacings of the exponential distribution)

Let $X_1,\dots,X_n$ be a random sample from the $\text{Exp}(\theta)$ distribution, where $\theta > 0$. Let $X_{1:n} \le \cdots \le X_{n:n}$ denote the order statistics of $X_1,\dots,X_n$. Define $W_1 = nX_{1:n}$, $W_i = (n-i+1)(X_{i:n} - X_{i-1:n})$, $i = 2,\dots,n$. Show that $W_1,\dots,W_n$ are independent and identically distributed $\text{Exp}(\theta)$ random variables.

Solution. The common p.d.f. of the random variables $X_1,\dots,X_n$ is

$$f(y) = \begin{cases} \dfrac{1}{\theta}e^{-y/\theta}, & \text{if } y > 0 \\ 0, & \text{otherwise.} \end{cases}$$

For notational convenience let $Z_r = X_{r:n}$, $r = 1,\dots,n$. Then, by Example 10.2.9, a joint p.d.f. of $Z = (Z_1,\dots,Z_n)$ is

$$f_Z(y) = \begin{cases} n!\,\displaystyle\prod_{i=1}^{n}f(y_i), & \text{if } 0 < y_1 < y_2 < \cdots < y_n < \infty \\ 0, & \text{otherwise} \end{cases} = \begin{cases} \dfrac{n!}{\theta^n}e^{-\frac{1}{\theta}\sum_{i=1}^{n}y_i}, & \text{if } 0 < y_1 < y_2 < \cdots < y_n < \infty \\ 0, & \text{otherwise.} \end{cases}$$

The support of $f_Z(\cdot)$ is $S_Z = \{y \in \mathbb{R}^n : 0 < y_1 < y_2 < \cdots < y_n < \infty\}$. Consider the transformation $h = (h_1,\dots,h_n): \mathbb{R}^n \to \mathbb{R}^n$, where $h_1(y) = ny_1$, $h_i(y) = (n-i+1)(y_i - y_{i-1})$, $i = 2,\dots,n$. Then $W_1 = h_1(Z)$ and $W_i = h_i(Z)$, $i = 2,\dots,n$. Clearly the transformation $h: S_Z \to \mathbb{R}^n$ is one-to-one with inverse transformation $h^{-1} = (h_1^{-1},\dots,h_n^{-1})$, where, for $w \in h(S_Z)$,

$$h_1^{-1}(w) = \frac{w_1}{n}, \qquad h_2^{-1}(w) = \frac{w_1}{n} + \frac{w_2}{n-1}, \qquad \dots, \qquad h_i^{-1}(w) = \sum_{j=1}^{i}\frac{w_j}{n-j+1}, \qquad \dots, \qquad h_n^{-1}(w) = \sum_{j=1}^{n}\frac{w_j}{n-j+1}.$$

The Jacobian determinant of the transformation is

$$J = \begin{vmatrix} \dfrac{\partial h_1^{-1}}{\partial w_1} & \cdots & \dfrac{\partial h_1^{-1}}{\partial w_n} \\ \vdots & & \vdots \\ \dfrac{\partial h_n^{-1}}{\partial w_1} & \cdots & \dfrac{\partial h_n^{-1}}{\partial w_n} \end{vmatrix} = \begin{vmatrix} \frac{1}{n} & 0 & 0 & \cdots & 0 \\ \frac{1}{n} & \frac{1}{n-1} & 0 & \cdots & 0 \\ \frac{1}{n} & \frac{1}{n-1} & \frac{1}{n-2} & \cdots & 0 \\ \vdots & & & & \vdots \\ \frac{1}{n} & \frac{1}{n-1} & \frac{1}{n-2} & \cdots & 1 \end{vmatrix} = \frac{1}{n!}.$$

Also

$$w = (w_1,\dots,w_n) \in h(S_Z) \iff \left(h_1^{-1}(w),\dots,h_n^{-1}(w)\right) \in S_Z \iff 0 < \frac{w_1}{n} < \frac{w_1}{n}+\frac{w_2}{n-1} < \cdots < \sum_{j=1}^{n}\frac{w_j}{n-j+1} < \infty \iff w_i > 0,\ i = 1,\dots,n.$$

Therefore $h(S_Z) = (0,\infty)^n$ and the joint p.d.f. of $W$ is given by

$$f_W(w) = f_Z\!\left(h_1^{-1}(w),\dots,h_n^{-1}(w)\right)|J|\,I_{h(S_Z)}(w) = \frac{n!}{\theta^n}\,e^{-\frac{1}{\theta}\sum_{i=1}^{n}h_i^{-1}(w)}\times\frac{1}{n!}\times I_{(0,\infty)^n}(w).$$

We have, for $w \in (0,\infty)^n$,

$$\sum_{i=1}^{n} h_i^{-1}(w) = \sum_{i=1}^{n}\sum_{j=1}^{i}\frac{w_j}{n-j+1} = \sum_{j=1}^{n}\sum_{i=j}^{n}\frac{w_j}{n-j+1} = \sum_{j=1}^{n}w_j.$$

Since $I_{(0,\infty)^n}(w) = \prod_{i=1}^{n}I_{(0,\infty)}(w_i)$, we have

$$f_W(w) = \prod_{i=1}^{n}\frac{1}{\theta}e^{-w_i/\theta}\,I_{(0,\infty)}(w_i).$$

It follows that $W_1,\dots,W_n$ are independent and identically distributed $\text{Exp}(\theta)$ random variables.
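The result can be illustrated by simulation: each normalized spacing should again behave like an $\text{Exp}(\theta)$ variable. A sketch with the illustrative values $\theta = 2$ and $n = 5$:

```python
import random

random.seed(5)
theta, n, N = 2.0, 5, 50_000
spacings = [[] for _ in range(n)]
for _ in range(N):
    xs = sorted(random.expovariate(1 / theta) for _ in range(n))
    prev = 0.0
    for i, x in enumerate(xs):
        # W_i = (n - i + 1)(X_{i:n} - X_{i-1:n}) with 1-based i; here i is 0-based
        spacings[i].append((n - i) * (x - prev))
        prev = x
means = [sum(w) / N for w in spacings]
print(means)   # each should be close to theta = 2.0
```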

Example 10.2.11
Let $X_1$ and $X_2$ be independent random variables such that $X_i \sim G(\alpha_i, \theta)$, $\alpha_i > 0$,
$\theta > 0$, $i = 1, 2$. Define $Z_1 = X_1 + X_2$ and $Z_2 = \frac{X_1}{X_1 + X_2}$.

(i) Show that $Z_1$ and $Z_2$ are independently distributed with
\[
Z_1 \sim G(\alpha_1 + \alpha_2, \theta) \quad\text{and}\quad Z_2 \sim \text{Be}(\alpha_1, \alpha_2).
\]
(ii) If $X_1 \sim \text{Exp}(\theta)$ and $X_2 \sim \text{Exp}(\theta)$ are independently distributed then show
that $Z = \frac{X_1}{X_1 + X_2} \sim U(0, 1)$.

Solution. (i) The p.d.f.s of $X_i$ and $X = (X_1, X_2)$ are given by
\[
f_{X_i}(x) = \frac{1}{\Gamma(\alpha_i)\,\theta^{\alpha_i}}\, x^{\alpha_i - 1} e^{-x/\theta}\, I_{(0,\infty)}(x),
\quad i = 1, 2,
\]
and
\[
f_X(x_1, x_2) = \prod_{i=1}^{2} f_{X_i}(x_i)
= \frac{1}{\Gamma(\alpha_1)\Gamma(\alpha_2)\,\theta^{\alpha_1+\alpha_2}}\,
x_1^{\alpha_1 - 1} x_2^{\alpha_2 - 1} e^{-\frac{x_1 + x_2}{\theta}}\, I_{(0,\infty)^2}(x),
\]
respectively.

Clearly $S_X = \{x \in \mathbb{R}^2 : f_X(x_1, x_2) > 0\} = (0, \infty)^2$. Consider the transformation
$h = (h_1, h_2): \mathbb{R}^2 \to \mathbb{R}^2$ defined by
\[
h_1(x_1, x_2) = x_1 + x_2 \quad\text{and}\quad
h_2(x_1, x_2) = \begin{cases}\frac{x_1}{x_1 + x_2}, & \text{if } x_1 + x_2 \neq 0\\ 0, & \text{if } x_1 + x_2 = 0.\end{cases}
\]
Then $P\big(\{(Z_1, Z_2) = (h_1(X_1, X_2), h_2(X_1, X_2))\}\big) = 1$ and therefore
$(Z_1, Z_2) \overset{d}{=} (h_1(X_1, X_2), h_2(X_1, X_2))$.

Also the transformation $h = (h_1, h_2): S_X \to \mathbb{R}^2$ is one-to-one with inverse transformation
$h^{-1} = (h_1^{-1}, h_2^{-1})$, where for $(z_1, z_2) \in h(S_X)$,
\[
h_1^{-1}(z_1, z_2) = z_1 z_2 \quad\text{and}\quad h_2^{-1}(z_1, z_2) = z_1(1 - z_2).
\]
The Jacobian determinant of the transformation is
\[
J = \begin{vmatrix}
\frac{\partial h_1^{-1}}{\partial z_1} & \frac{\partial h_1^{-1}}{\partial z_2}\\[2pt]
\frac{\partial h_2^{-1}}{\partial z_1} & \frac{\partial h_2^{-1}}{\partial z_2}
\end{vmatrix}
= \begin{vmatrix} z_2 & z_1\\ 1 - z_2 & -z_1\end{vmatrix} = -z_1.
\]

Also
\[
z = (z_1, z_2) \in h(S_X) \iff \left(h_1^{-1}(z), h_2^{-1}(z)\right) \in S_X
\iff z_1 z_2 > 0,\ z_1(1 - z_2) > 0
\iff z_1 > 0,\ 0 < z_2 < 1.
\]
Therefore $h(S_X) = (0, \infty) \times (0, 1)$ and the joint p.d.f. of $Z$ is given by
\[
f_Z(z_1, z_2) = f_X\!\left(h_1^{-1}(z), h_2^{-1}(z)\right)|J|\, I_{h(S_X)}(z)
= f_X\!\left(z_1 z_2, z_1(1 - z_2)\right)|-z_1|\, I_{(0,\infty)\times(0,1)}(z_1, z_2)
\]
\[
= \left[\frac{z_1^{\alpha_1+\alpha_2-1}\, e^{-z_1/\theta}}{\Gamma(\alpha_1+\alpha_2)\,\theta^{\alpha_1+\alpha_2}}\,
I_{(0,\infty)}(z_1)\right]
\left[\frac{z_2^{\alpha_1-1}(1 - z_2)^{\alpha_2-1}}{B(\alpha_1, \alpha_2)}\, I_{(0,1)}(z_2)\right].
\]
It follows that $Z_1$ and $Z_2$ are independent random variables, $Z_1 \sim G(\alpha_1+\alpha_2, \theta)$ and
$Z_2 \sim \text{Be}(\alpha_1, \alpha_2)$.

(ii) Follows from (i) by taking $\alpha_1 = \alpha_2 = 1$, since the $\text{Be}(1, 1)$ distribution is the
$U(0, 1)$ distribution.
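A quick simulation sketch of part (i), added here for illustration (the parameter values are arbitrary assumptions, not from the notes):

```python
# For X1 ~ G(2, 3) and X2 ~ G(5, 3) independent, Z1 = X1 + X2 and Z2 = X1/(X1 + X2)
# should be independent, with Z2 ~ Be(2, 5): mean 2/7 and variance 10/392.
import numpy as np

rng = np.random.default_rng(1)
a1, a2, theta, m = 2.0, 5.0, 3.0, 200_000

x1 = rng.gamma(a1, theta, m)
x2 = rng.gamma(a2, theta, m)
z1, z2 = x1 + x2, x1 / (x1 + x2)

corr = float(np.corrcoef(z1, z2)[0, 1])          # should be close to 0
mean_z2 = float(z2.mean())                       # Be(2, 5) mean = 2/7
var_z2 = float(z2.var())                         # Be(2, 5) variance = 10/392
print(round(corr, 3), round(mean_z2, 3), round(var_z2, 4))
```

Zero correlation is only a necessary condition for independence, but together with the matching Beta moments it gives a useful sanity check on the derivation.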

Example 10.2.12
(i) Let $X = (X_1, X_2)$ be a random vector of absolutely continuous type with joint p.d.f.
\[
f_X(x_1, x_2) = g\!\left(\sqrt{x_1^2 + x_2^2}\right), \quad x = (x_1, x_2) \in \mathbb{R}^2,
\]
where $g: [0, \infty) \to \mathbb{R}$ is a non-negative function such that
\[
\int_0^\infty r\, g(r)\, dr = \frac{1}{2\pi}.
\]
Let $(R, \Theta)$ be the polar coordinates of the point $(X_1, X_2)$ in the Cartesian plane, so that
$X_1 = R\cos\Theta$, $X_2 = R\sin\Theta$, $R = \sqrt{X_1^2 + X_2^2}$, $\Theta \in [0, 2\pi)$, and one may take
\[
\Theta = \begin{cases}
0, & \text{if } X_1 = 0,\ X_2 = 0\\
\frac{\pi}{2}, & \text{if } X_1 = 0,\ X_2 > 0\\
\frac{3\pi}{2}, & \text{if } X_1 = 0,\ X_2 < 0\\
\tan^{-1}\!\left(\frac{X_2}{X_1}\right), & \text{if } X_1 > 0,\ X_2 \ge 0\\
\pi + \tan^{-1}\!\left(\frac{X_2}{X_1}\right), & \text{if } X_1 < 0\\
2\pi + \tan^{-1}\!\left(\frac{X_2}{X_1}\right), & \text{if } X_1 > 0,\ X_2 < 0,
\end{cases}
\]
where $\tan^{-1}a \in \left(-\frac{\pi}{2}, \frac{\pi}{2}\right)$ denotes the principal value. Show that $R$ and
$\Theta$ are independently distributed with p.d.f.s
\[
f_R(r) = 2\pi r\, g(r)\, I_{(0,\infty)}(r)
\quad\text{and}\quad
f_\Theta(\theta) = \frac{1}{2\pi}\, I_{[0,2\pi)}(\theta),
\]
respectively.

(ii) Let $X_1$ and $X_2$ be independent and identically distributed $N(0, 1)$ random variables. Show that the
distribution of the random variable $Z = \frac{X_2}{X_1}$ has p.d.f.
\[
f_Z(z) = \frac{1}{\pi}\cdot\frac{1}{1 + z^2}, \quad -\infty < z < \infty.
\]
(iii) Let $X = (X_1, X_2)$ have the joint p.d.f.
\[
f_X(x_1, x_2) = \begin{cases}\frac{1}{\pi}, & \text{if } 0 < x_1^2 + x_2^2 < 1\\ 0, & \text{otherwise.}\end{cases}
\]
Find $E\!\left(\sqrt{X_1^2 + X_2^2}\right)$ and $E(X_1 + X_2)$.

Solution.

(i) Let $S_X = \{x \in \mathbb{R}^2 : f_X(x) > 0\} = \left\{x \in \mathbb{R}^2 : g\!\left(\sqrt{x_1^2+x_2^2}\right) > 0\right\}$.
Consider the transformation $h = (h_1, h_2): \mathbb{R}^2 \to \mathbb{R}^2$, defined by
$h_1(x_1, x_2) = \sqrt{x_1^2 + x_2^2}$ and
\[
h_2(x_1, x_2) = \begin{cases}
0, & \text{if } x_1 = 0,\ x_2 = 0\\
\frac{\pi}{2}, & \text{if } x_1 = 0,\ x_2 > 0\\
\frac{3\pi}{2}, & \text{if } x_1 = 0,\ x_2 < 0\\
\tan^{-1}\!\left(\frac{x_2}{x_1}\right), & \text{if } x_1 > 0,\ x_2 \ge 0\\
\pi + \tan^{-1}\!\left(\frac{x_2}{x_1}\right), & \text{if } x_1 < 0\\
2\pi + \tan^{-1}\!\left(\frac{x_2}{x_1}\right), & \text{if } x_1 > 0,\ x_2 < 0.
\end{cases}
\]
Then $(R, \Theta) = \left(h_1(X_1, X_2), h_2(X_1, X_2)\right)$. The transformation
$h = (h_1, h_2): S_X \to \mathbb{R}^2$ is one-to-one with inverse transformation
$h^{-1} = (h_1^{-1}, h_2^{-1})$, where for $(r, \theta) \in h(S_X)$,
\[
h_1^{-1}(r, \theta) = r\cos\theta \quad\text{and}\quad h_2^{-1}(r, \theta) = r\sin\theta.
\]
The Jacobian determinant of the transformation is
\[
J = \begin{vmatrix}
\frac{\partial h_1^{-1}}{\partial r} & \frac{\partial h_1^{-1}}{\partial\theta}\\[2pt]
\frac{\partial h_2^{-1}}{\partial r} & \frac{\partial h_2^{-1}}{\partial\theta}
\end{vmatrix}
= \begin{vmatrix}\cos\theta & -r\sin\theta\\ \sin\theta & r\cos\theta\end{vmatrix} = r.
\]
Also $h(S_X) = \{(r, \theta) \in \mathbb{R}^2 : r \in [0, \infty),\ \theta \in [0, 2\pi)\ \text{and}\ g(r) > 0\}
= D_1 \times D_2$, where $D_1 = \{r \in [0, \infty) : g(r) > 0\}$ and $D_2 = [0, 2\pi)$. The joint p.d.f. of
$(R, \Theta)$ is given by
\[
f_{R,\Theta}(r, \theta) = f_X\!\left(h_1^{-1}(r, \theta), h_2^{-1}(r, \theta)\right)|J|\, I_{h(S_X)}(r, \theta)
= f_X(r\cos\theta, r\sin\theta)\, |r|\, I_{D_1\times D_2}(r, \theta)
\]
\[
= r\, g(r)\, I_{D_1}(r)\, I_{D_2}(\theta)
= \left[2\pi r\, g(r)\, I_{(0,\infty)}(r)\right]\left[\frac{1}{2\pi}\, I_{[0,2\pi)}(\theta)\right].
\]
It follows that $R$ and $\Theta$ are independent random variables with respective p.d.f.s
\[
f_R(r) = 2\pi r\, g(r)\, I_{(0,\infty)}(r)
\quad\text{and}\quad
f_\Theta(\theta) = \frac{1}{2\pi}\, I_{[0,2\pi)}(\theta).
\]

(ii) Note that $Z = \frac{X_2}{X_1}$ is not defined if $X_1 = 0$. However $P(\{X_1 = 0\}) = 0$
(i.e., $P(\{X_1 \neq 0\}) = 1$) and therefore $Z = \frac{X_2}{X_1}$ is well defined with probability one. In fact,
since $X = (X_1, X_2)$ is of absolutely continuous type, we may, without loss of generality, take
$S_X = \mathbb{R}^2 - \{(x_1, x_2) \in \mathbb{R}^2 : x_1 = 0\}$. Define
\[
W = \begin{cases} Z, & \text{if } (X_1, X_2) \in S_X\\ 0, & \text{otherwise}\end{cases}
= \begin{cases}\tan\Theta, & \text{if } \Theta \in [0, 2\pi) - \left\{0, \frac{\pi}{2}, \frac{3\pi}{2}\right\}\\ 0, & \text{otherwise.}\end{cases}
\]
Then $P(\{W = Z\}) = 1$ and therefore $Z \overset{d}{=} W$. Thus we will find the distribution of the random
variable $W$. Here $f_X(x_1, x_2) = \frac{1}{2\pi}e^{-\frac{x_1^2+x_2^2}{2}}$, so that, in the notation of (i),
$g(r) = \frac{1}{2\pi}e^{-r^2/2}$ and, by (i), the p.d.f. of $\Theta$ is given by
\[
f_\Theta(\theta) = \begin{cases}\frac{1}{2\pi}, & \text{if } 0 \le \theta < 2\pi\\ 0, & \text{otherwise.}\end{cases}
\]
Consider the transformation $h: \mathbb{R} \to \mathbb{R}$ defined by
\[
h(\theta) = \begin{cases}\tan\theta, & \text{if } \theta \in [0, 2\pi) - \left\{0, \frac{\pi}{2}, \frac{3\pi}{2}\right\}\\ 0, & \text{otherwise.}\end{cases}
\]
Note that the transformation $h: \mathbb{R} \to \mathbb{R}$ is not one-to-one. Since $\Theta$ is of absolutely
continuous type we may, without loss of generality, take
\[
S_\Theta = \{\theta \in \mathbb{R} : f_\Theta(\theta) > 0\}
= [0, 2\pi) - \left\{0, \frac{\pi}{2}, \frac{3\pi}{2}\right\}
= D_1 \cup D_2 \cup D_3, \text{ say,}
\]
where $D_1 = \left(0, \frac{\pi}{2}\right)$, $D_2 = \left(\frac{\pi}{2}, \frac{3\pi}{2}\right)$ and
$D_3 = \left(\frac{3\pi}{2}, 2\pi\right)$. On each of the sets $D_1$, $D_2$ and $D_3$, $h$ is strictly increasing
with inverse transformations
\[
h_1^{-1}(w) = \tan^{-1}w,\ w \in (0, \infty), \qquad
h_2^{-1}(w) = \pi + \tan^{-1}w,\ w \in (-\infty, \infty),
\]
and
\[
h_3^{-1}(w) = 2\pi + \tan^{-1}w,\ w \in (-\infty, 0).
\]
Also $h(D_1) = (0, \infty)$, $h(D_2) = (-\infty, \infty)$ and $h(D_3) = (-\infty, 0)$. Therefore the p.d.f. of $W$
is given by
\[
f_W(w) = \sum_{j=1}^{3} f_\Theta\!\left(h_j^{-1}(w)\right)\left|\frac{d}{dw}h_j^{-1}(w)\right| I_{h(D_j)}(w)
\]
\[
= f_\Theta(\tan^{-1}w)\,\frac{1}{1+w^2}\, I_{(0,\infty)}(w)
+ f_\Theta(\pi + \tan^{-1}w)\,\frac{1}{1+w^2}\, I_{(-\infty,\infty)}(w)
+ f_\Theta(2\pi + \tan^{-1}w)\,\frac{1}{1+w^2}\, I_{(-\infty,0)}(w)
\]
\[
= \begin{cases}\frac{1}{\pi}\cdot\frac{1}{1+w^2}, & \text{if } w \in \mathbb{R} - \{0\}\\[4pt]
\frac{1}{2\pi}, & \text{if } w = 0.\end{cases}
\]
Since the random variable $W$ is of absolutely continuous type we may take the p.d.f. of $W$ as
\[
f_W(w) = \frac{1}{\pi}\cdot\frac{1}{1+w^2}, \quad -\infty < w < \infty.
\]
It follows that the random variable $W$ (and hence $Z = \frac{X_2}{X_1}$) has the Cauchy distribution (see
Definition 11.1 (ii)).
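Part (ii) admits a quick numerical check, sketched below for illustration (not part of the original notes). Since $\tan(\pi/4) = 1$, the quartiles of the standard Cauchy distribution are $\mp 1$ and its median is $0$:

```python
# Monte Carlo check: for X1, X2 i.i.d. N(0,1), Z = X2/X1 is standard Cauchy, so its
# empirical quartiles should be close to -1 and +1 and its median close to 0.
import numpy as np

rng = np.random.default_rng(2)
m = 500_000
x1, x2 = rng.standard_normal(m), rng.standard_normal(m)
z = x2 / x1

q25, q50, q75 = np.quantile(z, [0.25, 0.5, 0.75])
print(round(float(q25), 2), round(float(q50), 2), round(float(q75), 2))
```

Quantiles are used rather than the sample mean because, as Remark 11.1 (ii) notes, the Cauchy distribution has no finite mean, so sample averages do not stabilize.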

(iii) We have
\[
E\!\left(\sqrt{X_1^2 + X_2^2}\right) = E(R)
\quad\text{and}\quad
E(X_1 + X_2) = E\!\left(R(\cos\Theta + \sin\Theta)\right) = E(R)\, E(\cos\Theta + \sin\Theta),
\]
since $R$ and $\Theta$ are independent. In the notation of (i), we have
\[
g(r) = \begin{cases}\frac{1}{\pi}, & \text{if } 0 < r < 1\\ 0, & \text{otherwise.}\end{cases}
\]
Moreover
\[
f_R(r) = \begin{cases}2r, & \text{if } 0 < r < 1\\ 0, & \text{otherwise}\end{cases}
\quad\text{and}\quad
f_\Theta(\theta) = \frac{1}{2\pi}\, I_{[0,2\pi)}(\theta).
\]
Therefore
\[
E\!\left(\sqrt{X_1^2 + X_2^2}\right) = E(R) = \int_0^1 2r^2\, dr = \frac{2}{3},
\]
and
\[
E(X_1 + X_2) = E(R)\, E(\cos\Theta + \sin\Theta)
= \frac{2}{3}\int_0^{2\pi}\frac{\cos\theta + \sin\theta}{2\pi}\, d\theta = 0.
\]

10.3. Moment Generating Function Technique

Let $X = (X_1, \dots, X_p)$ be a random vector with p.d.f./p.m.f. $f_X(\cdot)$ and let
$g: \mathbb{R}^p \to \mathbb{R}^q$ be a Borel function. Suppose that we seek the probability distribution of
$Z = g(X)$. Under the m.g.f. technique we try to identify the m.g.f. $M_Z(\cdot)$ of the random vector $Z$ with
the m.g.f. of some known distribution. Then the uniqueness of m.g.f.s (Theorem 7.3) ascertains that the random
vector $Z$ has that known distribution. Various usages of this technique are illustrated in Examples 7.1, 7.2,
7.3, 7.4, 7.5 and 7.6.

Theorem 10.3.1

Let $X_1, \dots, X_n$ ($n \ge 2$) be a random sample from the $N(\mu, \sigma^2)$ distribution, where
$\mu \in (-\infty, \infty)$ and $\sigma > 0$. Let $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$ and
$S^2 = \frac{1}{n-1}\sum_{i=1}^{n}(X_i - \bar{X})^2$ denote the sample mean and the sample variance
respectively. Then

(i) $\bar{X} \sim N\!\left(\mu, \frac{\sigma^2}{n}\right)$;

(ii) $\bar{X}$ and $S^2$ are independent random variables;

(iii) $\frac{(n-1)S^2}{\sigma^2} \sim \chi^2_{n-1}$;

(iv) $E(S^2) = \sigma^2$, $\operatorname{Var}(S^2) = \frac{2\sigma^4}{n-1}$ and
$E(S) = \sqrt{\frac{2}{n-1}}\,\frac{\Gamma\!\left(\frac{n}{2}\right)}{\Gamma\!\left(\frac{n-1}{2}\right)}\,\sigma$.

Solution.

(i) Follows from Example 7.1.

(ii) Let $Z_i = X_i - \bar{X}$, $i = 1, \dots, n$, and let $Z = (Z_1, \dots, Z_n)$. Then
$\sum_{i=1}^{n} Z_i = \sum_{i=1}^{n} X_i - n\bar{X} = 0$ and
$(n-1)S^2 = \sum_{i=1}^{n}(X_i - \bar{X})^2 = \sum_{i=1}^{n} Z_i^2$, a function of $Z$. The joint m.g.f. of
$(Z, \bar{X})$ is given by
\[
M_{Z,\bar{X}}(t, s) = E\!\left(e^{\sum_{i=1}^{n} t_i Z_i + s\bar{X}}\right),
\quad t = (t_1, \dots, t_n) \in \mathbb{R}^n,\ s \in \mathbb{R}.
\]
Let us fix $t = (t_1, \dots, t_n) \in \mathbb{R}^n$ and $s \in \mathbb{R}$. Then
\[
\sum_{i=1}^{n} t_i Z_i + s\bar{X} = \sum_{i=1}^{n} t_i (X_i - \bar{X}) + s\bar{X}
= \sum_{j=1}^{n} t_j X_j + \left(s - \sum_{i=1}^{n} t_i\right)\bar{X}
= \sum_{j=1}^{n}\left(t_j - \bar{t} + \frac{s}{n}\right) X_j
= \sum_{j=1}^{n} c_j X_j,
\]
where $\bar{t} = \frac{1}{n}\sum_{i=1}^{n} t_i$ and $c_j = t_j - \bar{t} + \frac{s}{n}$, $j = 1, \dots, n$. Note
that $\sum_{j=1}^{n}(t_j - \bar{t}) = 0$, and therefore
\[
\sum_{j=1}^{n} c_j = \sum_{j=1}^{n}\left(t_j - \bar{t} + \frac{s}{n}\right) = s,
\]
and
\[
\sum_{j=1}^{n} c_j^2 = \sum_{j=1}^{n}\left(t_j - \bar{t} + \frac{s}{n}\right)^2
= \sum_{j=1}^{n}(t_j - \bar{t})^2 + \frac{s^2}{n}.
\]
Consequently,
\[
M_{Z,\bar{X}}(t, s) = E\!\left(e^{\sum_{j=1}^{n} c_j X_j}\right)
= \prod_{j=1}^{n} E\!\left(e^{c_j X_j}\right)
= \prod_{j=1}^{n} M_{X_j}(c_j)
= \prod_{j=1}^{n} e^{\mu c_j + \frac{\sigma^2 c_j^2}{2}}
= e^{\mu\sum_{j=1}^{n} c_j + \frac{\sigma^2}{2}\sum_{j=1}^{n} c_j^2}
\]
\[
= e^{\mu s + \frac{\sigma^2}{2}\left[\sum_{j=1}^{n}(t_j - \bar{t})^2 + \frac{s^2}{n}\right]}
= e^{\frac{\sigma^2}{2}\sum_{j=1}^{n}(t_j - \bar{t})^2}\, e^{\mu s + \frac{\sigma^2 s^2}{2n}},
\quad t \in \mathbb{R}^n,\ s \in \mathbb{R}.
\]
The joint m.g.f. of $Z = (Z_1, \dots, Z_n)$ is given by
\[
M_Z(t) = M_{Z,\bar{X}}(t, 0) = e^{\frac{\sigma^2}{2}\sum_{j=1}^{n}(t_j - \bar{t})^2}, \quad t \in \mathbb{R}^n,
\]
and the m.g.f. of $\bar{X}$ is given by
\[
M_{\bar{X}}(s) = M_{Z,\bar{X}}(0, s) = e^{\mu s + \frac{\sigma^2 s^2}{2n}}, \quad s \in \mathbb{R}.
\]
Clearly
\[
M_{Z,\bar{X}}(t, s) = M_Z(t)\, M_{\bar{X}}(s), \quad \forall\, t \in \mathbb{R}^n,\ s \in \mathbb{R}.
\]
Now using Theorem 7.1 it follows that $Z = (X_1 - \bar{X}, \dots, X_n - \bar{X})$ and $\bar{X}$ are independent.
This in turn implies that, for any Borel functions $\Psi_1(\cdot)$ and $\Psi_2(\cdot)$, $\Psi_1(Z)$ and
$\Psi_2(\bar{X})$ are independent. In particular, it follows that $S^2$ (a function of $Z$) and $\bar{X}$ are
independent.

(iii) Let $W_i = \frac{X_i - \mu}{\sigma}$, $i = 1, \dots, n$. Then $W_1, \dots, W_n$ are independent and
identically distributed $N(0, 1)$ random variables. Furthermore, by (i) and Theorem 4.1 (ii), Module 5,
$W = \frac{\sqrt{n}(\bar{X} - \mu)}{\sigma} \sim N(0, 1)$. Let $U = W^2 = \frac{n(\bar{X} - \mu)^2}{\sigma^2}$ and
$Z = \frac{(n-1)S^2}{\sigma^2}$. Then, by (ii), $U$ and $Z$ are independent random variables. Also, by
Example 7.6 (ii), $U \sim \chi^2_1$ and $T = \sum_{i=1}^{n} W_i^2 \sim \chi^2_n$. Thus the m.g.f.s of $U$ and
$T$ are
\[
M_U(t) = (1 - 2t)^{-\frac{1}{2}}, \quad t < \frac{1}{2},
\]
and
\[
M_T(t) = (1 - 2t)^{-\frac{n}{2}}, \quad t < \frac{1}{2}.
\]
Also
\[
T = \sum_{i=1}^{n} W_i^2
= \sum_{i=1}^{n}\frac{(X_i - \mu)^2}{\sigma^2}
= \sum_{i=1}^{n}\frac{(X_i - \bar{X} + \bar{X} - \mu)^2}{\sigma^2}
= \sum_{i=1}^{n}\frac{(X_i - \bar{X})^2}{\sigma^2} + \frac{n(\bar{X} - \mu)^2}{\sigma^2}
= Z + U.
\]
Since $Z$ and $U$ are independent random variables, we have
\[
M_T(t) = M_Z(t)\, M_U(t)
\;\Rightarrow\;
M_Z(t) = \frac{M_T(t)}{M_U(t)} = \frac{(1 - 2t)^{-\frac{n}{2}}}{(1 - 2t)^{-\frac{1}{2}}}
= (1 - 2t)^{-\frac{n-1}{2}}, \quad t < \frac{1}{2},
\]
which is the m.g.f. of the $\chi^2_{n-1}$ distribution. Now, by uniqueness of m.g.f.s, it follows that
$Z = \frac{(n-1)S^2}{\sigma^2} \sim \chi^2_{n-1}$.

(iv) We have $Z = \frac{(n-1)S^2}{\sigma^2} \sim \chi^2_{n-1}$, i.e., $S^2 = \frac{\sigma^2}{n-1}Z$. Therefore,
for $k > -(n-1)$,
\[
E(S^k) = \frac{\sigma^k}{(n-1)^{\frac{k}{2}}}\, E\!\left(Z^{\frac{k}{2}}\right)
= \frac{\sigma^k}{(n-1)^{\frac{k}{2}}}\int_0^\infty y^{\frac{k}{2}}\,
\frac{y^{\frac{n-1}{2}-1}\, e^{-\frac{y}{2}}}{2^{\frac{n-1}{2}}\,\Gamma\!\left(\frac{n-1}{2}\right)}\, dy
= \frac{\sigma^k}{(n-1)^{\frac{k}{2}}}\cdot
\frac{2^{\frac{n-1+k}{2}}\,\Gamma\!\left(\frac{n-1+k}{2}\right)}{2^{\frac{n-1}{2}}\,\Gamma\!\left(\frac{n-1}{2}\right)}
= \left(\frac{2}{n-1}\right)^{\frac{k}{2}}\frac{\Gamma\!\left(\frac{n-1+k}{2}\right)}{\Gamma\!\left(\frac{n-1}{2}\right)}\,\sigma^k.
\]
Therefore
\[
E(S) = \sqrt{\frac{2}{n-1}}\cdot\frac{\Gamma\!\left(\frac{n}{2}\right)}{\Gamma\!\left(\frac{n-1}{2}\right)}\,\sigma,
\qquad
E(S^2) = \frac{2}{n-1}\cdot\frac{\Gamma\!\left(\frac{n+1}{2}\right)}{\Gamma\!\left(\frac{n-1}{2}\right)}\,\sigma^2
= \sigma^2,
\qquad
E(S^4) = \left(\frac{2}{n-1}\right)^2\frac{\Gamma\!\left(\frac{n+3}{2}\right)}{\Gamma\!\left(\frac{n-1}{2}\right)}\,\sigma^4
= \frac{n+1}{n-1}\,\sigma^4,
\]
and
\[
\operatorname{Var}(S^2) = E(S^4) - \left(E(S^2)\right)^2 = \frac{2\sigma^4}{n-1}.
\]
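The conclusions of Theorem 10.3.1 are easy to probe by simulation. The following sketch (illustrative, with arbitrarily chosen parameters, and not part of the original notes) checks (ii) via the correlation of $\bar{X}$ and $S^2$, and (iv) via their moments:

```python
# For a N(mu, sigma^2) sample of size n: Xbar and S^2 are independent (so in
# particular uncorrelated), E(S^2) = sigma^2 and Var(S^2) = 2*sigma^4/(n-1).
import numpy as np

rng = np.random.default_rng(3)
n, mu, sigma, m = 10, 1.0, 2.0, 200_000

x = rng.normal(mu, sigma, size=(m, n))
xbar = x.mean(axis=1)
s2 = x.var(axis=1, ddof=1)                       # sample variance with divisor n-1

corr = float(np.corrcoef(xbar, s2)[0, 1])        # close to 0
mean_s2 = float(s2.mean())                       # close to sigma^2 = 4
var_s2 = float(s2.var())                         # close to 2*sigma^4/(n-1) = 32/9
print(round(corr, 3), round(mean_s2, 2), round(var_s2, 2))
```

Note that zero correlation does not by itself imply independence; here it merely agrees with the stronger statement proved via the m.g.f. factorization.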

11. Distributions Based on Sampling From a Normal Distribution

First we will introduce two new probability distributions, called the Student t-distribution and the
Snedecor F-distribution, which arise as probability distributions of various statistics based on a random
sample from a normal distribution.

Definition 11.1

(i) For a given positive integer $n$, a random variable $X$ is said to have the Student $t$-distribution with
$n$ degrees of freedom (written as $X \sim t_n$) if the p.d.f. of $X$ is given by
\[
f_X(x) = \frac{\Gamma\!\left(\frac{n+1}{2}\right)}{\sqrt{n\pi}\,\Gamma\!\left(\frac{n}{2}\right)}\cdot
\frac{1}{\left(1 + \frac{x^2}{n}\right)^{\frac{n+1}{2}}}, \quad -\infty < x < \infty.
\]
(ii) The Student $t$-distribution with 1 degree of freedom is also called the standard Cauchy distribution.

(iii) For positive integers $n_1$ and $n_2$, a random variable $X$ is said to have the Snedecor
$F$-distribution with $(n_1, n_2)$ degrees of freedom (written as $X \sim F_{n_1,n_2}$) if the p.d.f. of $X$ is
given by
\[
f_X(x) = \frac{\left(\frac{n_1}{n_2}\right)^{\frac{n_1}{2}}\, x^{\frac{n_1}{2}-1}}
{B\!\left(\frac{n_1}{2}, \frac{n_2}{2}\right)\left(1 + \frac{n_1}{n_2}x\right)^{\frac{n_1+n_2}{2}}}\,
I_{(0,\infty)}(x).
\]

Remark 11.1

The following observations are obvious:

(i) If $X \sim t_n$, then $X \overset{d}{=} -X$ (since $f_X(x) = f_X(-x)$, $\forall x \in \mathbb{R}$), i.e.,
the distribution of $X \sim t_n$ is symmetric about 0;

(ii) The p.d.f. of the standard Cauchy distribution is given by
\[
f_Z(z) = \frac{1}{\pi}\cdot\frac{1}{1 + z^2}, \quad -\infty < z < \infty.
\]
By Example 3.4, Module 3, if the random variable $X$ has the Cauchy distribution (i.e., if $X \sim t_1$) then
$E(X)$ is not finite;

(iii) If $X \sim F_{n_1,n_2}$, then
\[
Z = \frac{\frac{n_1}{n_2}X}{1 + \frac{n_1}{n_2}X} \sim \text{Be}\!\left(\frac{n_1}{2}, \frac{n_2}{2}\right),
\]
the beta distribution with shape parameters $\frac{n_1}{2}$ and $\frac{n_2}{2}$ (see Definition 3.2, Module 5);

(iv) Let $W_1$ and $W_2$ be independent and identically distributed $N(0, 1)$ random variables and let
$W = W_1/W_2$. Then, by Example 10.2.12 (ii), the distribution of $W$ is standard Cauchy (i.e., $W \sim t_1$).

The following theorem provides representations of the Student $t$ and the Snedecor $F$ random variables in
terms of normal and chi-squared random variables.

Theorem 11.1

(i) Let $W \sim N(0, 1)$ and $Z \sim \chi^2_n$, where $n \in \{1, 2, \dots\}$, be independent random variables.
Then
\[
T = \frac{W}{\sqrt{\frac{Z}{n}}} \sim t_n.
\]
(ii) For positive integers $n_1$ and $n_2$, let $X_1 \sim \chi^2_{n_1}$ and $X_2 \sim \chi^2_{n_2}$ be
independent random variables. Then
\[
F = \frac{X_1/n_1}{X_2/n_2} \sim F_{n_1,n_2}.
\]
(iii) Let $n$ and $r$ be positive integers and let $X \sim t_n$. Then $E(X^r)$ is not finite if
$r \in \{n, n+1, \dots\}$. For $r \in \{1, 2, \dots, n-1\}$ and $n \ge r + 1$,
\[
E(X^r) = \begin{cases}
0, & \text{if } r \text{ is odd}\\[4pt]
\dfrac{n^{\frac{r}{2}}\, r!\,\Gamma\!\left(\frac{n-r}{2}\right)}{2^{r}\left(\frac{r}{2}\right)!\,\Gamma\!\left(\frac{n}{2}\right)}, & \text{if } r \text{ is even.}
\end{cases}
\]
(iv) If $X \sim t_n$, then
\[
\mu = E(X) = 0, \text{ for } n \in \{2, 3, \dots\},
\qquad
\sigma^2 = \operatorname{Var}(X) = \frac{n}{n-2}, \text{ for } n \in \{3, 4, \dots\},
\]
\[
\beta_1 = \text{coefficient of skewness} = 0, \text{ for } n \in \{4, 5, \dots\},
\quad\text{and}\quad
\gamma_1 = \text{kurtosis} = \frac{3(n-2)}{n-4}, \text{ for } n \in \{5, 6, \dots\}.
\]
(v) Let $n_1$, $n_2$ and $r$ be positive integers and let $X \sim F_{n_1,n_2}$. Then, for
$n_2 \in \{1, 2, \dots, 2r\}$, $E(X^r)$ is not finite. For $n_2 \in \{2r+1, 2r+2, \dots\}$,
\[
E(X^r) = \left(\frac{n_2}{n_1}\right)^{r}\prod_{i=1}^{r}\frac{n_1 + 2(i-1)}{n_2 - 2i}.
\]
(vi) If $X \sim F_{n_1,n_2}$ then
\[
\mu = E(X) = \frac{n_2}{n_2 - 2}, \text{ if } n_2 \in \{3, 4, \dots\},
\qquad
\sigma^2 = \operatorname{Var}(X) = \frac{2 n_2^2 (n_1 + n_2 - 2)}{n_1 (n_2 - 2)^2 (n_2 - 4)},
\text{ if } n_2 \in \{5, 6, \dots\},
\]
\[
\beta_1 = \text{coefficient of skewness} = \frac{2(2n_1 + n_2 - 2)}{n_2 - 6}
\sqrt{\frac{2(n_2 - 4)}{n_1(n_1 + n_2 - 2)}}, \text{ if } n_2 \in \{7, 8, \dots\},
\]
and
\[
\gamma_1 = \text{kurtosis} = 3 + \frac{12\left[n_1(5n_2 - 22)(n_1 + n_2 - 2) + (n_2 - 4)(n_2 - 2)^2\right]}
{n_1(n_2 - 6)(n_2 - 8)(n_1 + n_2 - 2)}, \text{ if } n_2 \in \{9, 10, \dots\}.
\]

Proof. (i) The joint p.d.f. of $(Z, W)$ is given by
\[
f_{Z,W}(z, w) = f_Z(z)\, f_W(w) = \begin{cases}
\dfrac{e^{-\frac{z + w^2}{2}}\, z^{\frac{n}{2}-1}}{\sqrt{2\pi}\; 2^{\frac{n}{2}}\,\Gamma\!\left(\frac{n}{2}\right)}, & \text{if } (z, w) \in (0, \infty)\times\mathbb{R}\\[8pt]
0, & \text{otherwise.}
\end{cases}
\]
Clearly $S_{Z,W} = \{(z, w) \in \mathbb{R}^2 : f_{Z,W}(z, w) > 0\} = (0, \infty)\times\mathbb{R}$. Consider the
transformation $h = (h_1, h_2): S_{Z,W} \to \mathbb{R}^2$ defined by $h_1(z, w) = \frac{w}{\sqrt{z/n}}$ and
$h_2(z, w) = \sqrt{z/n}$. Then $T = h_1(Z, W)$. Let $U = h_2(Z, W) = \sqrt{Z/n}$. Clearly the transformation
$h = (h_1, h_2): S_{Z,W} \to \mathbb{R}^2$ is one-to-one with inverse transformation
$h^{-1} = (h_1^{-1}, h_2^{-1})$, where for $(t, u) \in h(S_{Z,W})$,
\[
h_1^{-1}(t, u) = nu^2 \quad\text{and}\quad h_2^{-1}(t, u) = tu.
\]
The Jacobian determinant is
\[
J = \begin{vmatrix}
\frac{\partial h_1^{-1}}{\partial t} & \frac{\partial h_1^{-1}}{\partial u}\\[2pt]
\frac{\partial h_2^{-1}}{\partial t} & \frac{\partial h_2^{-1}}{\partial u}
\end{vmatrix}
= \begin{vmatrix} 0 & 2nu\\ u & t\end{vmatrix} = -2nu^2.
\]
Also
\[
h(S_{Z,W}) = \left\{(t, u) \in \mathbb{R}^2 : \left(h_1^{-1}(t, u), h_2^{-1}(t, u)\right) \in S_{Z,W}\right\}
= \{(t, u) \in \mathbb{R}^2 : nu^2 \in (0, \infty),\ tu \in \mathbb{R}\}
= \mathbb{R}\times(0, \infty) = D, \text{ say.}
\]
Therefore the joint p.d.f. of $(T, U)$ is given by
\[
f_{T,U}(t, u) = f_{Z,W}\!\left(h_1^{-1}(t, u), h_2^{-1}(t, u)\right)|J|\, I_{h(S_{Z,W})}(t, u)
= f_{Z,W}(nu^2, tu)\, |{-2nu^2}|\, I_D(t, u)
\]
\[
= \begin{cases}
\dfrac{n^{\frac{n}{2}}\, u^{n}\, e^{-\frac{(n + t^2)u^2}{2}}}{\sqrt{\pi}\; 2^{\frac{n-1}{2}}\,\Gamma\!\left(\frac{n}{2}\right)}, & \text{if } (t, u) \in \mathbb{R}\times(0, \infty)\\[8pt]
0, & \text{otherwise.}
\end{cases}
\]
Consequently the p.d.f. of $T$ is given by
\[
f_T(t) = \int_{-\infty}^{\infty} f_{T,U}(t, u)\, du
= \int_0^\infty \frac{n^{\frac{n}{2}}\, u^{n}\, e^{-\frac{(n + t^2)u^2}{2}}}{\sqrt{\pi}\; 2^{\frac{n-1}{2}}\,\Gamma\!\left(\frac{n}{2}\right)}\, du, \quad t \in \mathbb{R}
\]
\[
= \frac{1}{\sqrt{n\pi}\,\Gamma\!\left(\frac{n}{2}\right)\left(1 + \frac{t^2}{n}\right)^{\frac{n+1}{2}}}
\int_0^\infty v^{\frac{n+1}{2}-1}\, e^{-v}\, dv
\qquad\left(\text{substituting } v = \tfrac{(n+t^2)u^2}{2}\right)
\]
\[
= \frac{\Gamma\!\left(\frac{n+1}{2}\right)}{\sqrt{n\pi}\,\Gamma\!\left(\frac{n}{2}\right)}\cdot
\frac{1}{\left(1 + \frac{t^2}{n}\right)^{\frac{n+1}{2}}}, \quad t \in \mathbb{R},
\]
which is the p.d.f. of Student's $t$-distribution with $n$ degrees of freedom.

(ii) The joint p.d.f. of $X = (X_1, X_2)$ is given by
\[
f_{X_1,X_2}(x_1, x_2) = f_{X_1}(x_1)\, f_{X_2}(x_2)
= \frac{1}{2^{\frac{n_1+n_2}{2}}\,\Gamma\!\left(\frac{n_1}{2}\right)\Gamma\!\left(\frac{n_2}{2}\right)}\,
e^{-\frac{x_1+x_2}{2}}\, x_1^{\frac{n_1}{2}-1}\, x_2^{\frac{n_2}{2}-1}\, I_{(0,\infty)^2}(x_1, x_2).
\]
We have $S_{X_1,X_2} = \{(x_1, x_2) \in \mathbb{R}^2 : f_{X_1,X_2}(x_1, x_2) > 0\} = (0, \infty)^2$. Consider the
one-to-one transformation $h = (h_1, h_2): S_{X_1,X_2} \to \mathbb{R}^2$ given by
\[
h_1(x_1, x_2) = \frac{n_2 x_1}{n_1 x_2} \quad\text{and}\quad h_2(x_1, x_2) = \frac{x_2}{n_2}.
\]
Define $F = h_1(X_1, X_2) = \frac{X_1/n_1}{X_2/n_2}$ and $V = h_2(X_1, X_2) = \frac{X_2}{n_2}$. Then the inverse
of the transformation $h = (h_1, h_2): S_{X_1,X_2} \to \mathbb{R}^2$ is $h^{-1} = (h_1^{-1}, h_2^{-1})$, where
for $(u, v) \in h(S_{X_1,X_2})$,
\[
h_1^{-1}(u, v) = n_1 u v \quad\text{and}\quad h_2^{-1}(u, v) = n_2 v.
\]
The Jacobian determinant is
\[
J = \begin{vmatrix}
\frac{\partial h_1^{-1}}{\partial u} & \frac{\partial h_1^{-1}}{\partial v}\\[2pt]
\frac{\partial h_2^{-1}}{\partial u} & \frac{\partial h_2^{-1}}{\partial v}
\end{vmatrix}
= \begin{vmatrix} n_1 v & n_1 u\\ 0 & n_2\end{vmatrix} = n_1 n_2 v.
\]
Also,
\[
h(S_{X_1,X_2}) = \left\{(u, v) \in \mathbb{R}^2 : \left(h_1^{-1}(u, v), h_2^{-1}(u, v)\right) \in S_{X_1,X_2}\right\}
= \{(u, v) \in \mathbb{R}^2 : n_1 u v > 0,\ n_2 v > 0\}
= (0, \infty)^2,
\]
and therefore the joint p.d.f. of $(F, V)$ is given by
\[
f_{F,V}(u, v) = f_{X_1,X_2}\!\left(h_1^{-1}(u, v), h_2^{-1}(u, v)\right)|J|\, I_{h(S_{X_1,X_2})}(u, v)
= f_{X_1,X_2}(n_1 u v, n_2 v)\, |n_1 n_2 v|\, I_{(0,\infty)^2}(u, v)
\]
\[
= \frac{n_1^{\frac{n_1}{2}}\, n_2^{\frac{n_2}{2}}}{2^{\frac{n_1+n_2}{2}}\,\Gamma\!\left(\frac{n_1}{2}\right)\Gamma\!\left(\frac{n_2}{2}\right)}\,
u^{\frac{n_1}{2}-1}\, v^{\frac{n_1+n_2}{2}-1}\, e^{-\frac{(n_2 + n_1 u)v}{2}}\, I_{(0,\infty)^2}(u, v).
\]
Consequently the p.d.f. of $F$ is given by
\[
f_F(u) = \int_{-\infty}^{\infty} f_{F,V}(u, v)\, dv.
\]
Clearly $f_F(u) = 0$ if $u \le 0$. For $u > 0$,
\[
f_F(u) = \frac{n_1^{\frac{n_1}{2}}\, n_2^{\frac{n_2}{2}}}{2^{\frac{n_1+n_2}{2}}\,\Gamma\!\left(\frac{n_1}{2}\right)\Gamma\!\left(\frac{n_2}{2}\right)}\,
u^{\frac{n_1}{2}-1}\int_0^\infty v^{\frac{n_1+n_2}{2}-1}\, e^{-\frac{(n_2 + n_1 u)v}{2}}\, dv
= \frac{\Gamma\!\left(\frac{n_1+n_2}{2}\right)}{\Gamma\!\left(\frac{n_1}{2}\right)\Gamma\!\left(\frac{n_2}{2}\right)}\cdot
\frac{\left(\frac{n_1}{n_2}\right)^{\frac{n_1}{2}}\, u^{\frac{n_1}{2}-1}}{\left(1 + \frac{n_1}{n_2}u\right)^{\frac{n_1+n_2}{2}}},
\quad 0 < u < \infty.
\]
Therefore
\[
F = \frac{X_1/n_1}{X_2/n_2} \sim F_{n_1,n_2}.
\]

(iii) Fix $n \in \{1, 2, \dots\}$. By (i),
\[
X \overset{d}{=} \frac{W}{\sqrt{\frac{Z}{n}}},
\]
where $W \sim N(0, 1)$ and $Z \sim \chi^2_n$ are independent random variables. Thus, for $n \in \{1, 2, \dots\}$
and $r > 0$,
\[
E(X^r) = n^{\frac{r}{2}}\, E\!\left(W^r Z^{-\frac{r}{2}}\right)
= n^{\frac{r}{2}}\, E(W^r)\, E\!\left(Z^{-\frac{r}{2}}\right)
\quad (\text{since } Z \text{ and } W \text{ are independent}),
\]
provided the expectations are finite. We have, from the proof of Theorem 4.3 (iii), Module 5,
\[
E(W^r) = \begin{cases}
0, & \text{if } r \text{ is odd}\\[4pt]
\dfrac{r!}{2^{\frac{r}{2}}\left(\frac{r}{2}\right)!}, & \text{if } r \text{ is even.}
\end{cases}
\]
Moreover, for $r \in \{1, 2, \dots\}$,
\[
E\!\left(Z^{-\frac{r}{2}}\right) = \frac{1}{2^{\frac{n}{2}}\,\Gamma\!\left(\frac{n}{2}\right)}
\int_0^\infty y^{\frac{n-r}{2}-1}\, e^{-\frac{y}{2}}\, dy,
\]
which is finite if, and only if, $n > r$ (see Section 2, Module 5). Also, for $n > r$,
\[
E\!\left(Z^{-\frac{r}{2}}\right)
= \frac{2^{\frac{n-r}{2}}\,\Gamma\!\left(\frac{n-r}{2}\right)}{2^{\frac{n}{2}}\,\Gamma\!\left(\frac{n}{2}\right)}
= \frac{\Gamma\!\left(\frac{n-r}{2}\right)}{2^{\frac{r}{2}}\,\Gamma\!\left(\frac{n}{2}\right)}.
\]
Thus $E(X^r)$ is finite if $r \in \{1, 2, \dots, n-1\}$. For $r \in \{1, 2, \dots, n-1\}$ and $n \ge r + 1$,
\[
E(X^r) = \begin{cases}
0, & \text{if } r \text{ is odd}\\[4pt]
\dfrac{n^{\frac{r}{2}}\, r!\,\Gamma\!\left(\frac{n-r}{2}\right)}{2^{r}\left(\frac{r}{2}\right)!\,\Gamma\!\left(\frac{n}{2}\right)}, & \text{if } r \text{ is even.}
\end{cases}
\]

(iv) Using (iii), we have
\[
\mu = \mu_1' = E(X) = 0, \text{ if } n \in \{2, 3, \dots\},
\qquad
\sigma^2 = \mu_2 = E(X^2) = \frac{n}{n-2}, \text{ if } n \in \{3, 4, \dots\},
\]
\[
\mu_3 = E(X^3) = 0, \text{ if } n \in \{4, 5, \dots\},
\quad\text{and}\quad
\mu_4 = E(X^4) = \frac{3n^2}{(n-2)(n-4)}, \text{ if } n \in \{5, 6, \dots\}.
\]
Consequently
\[
\beta_1 = \frac{\mu_3}{\mu_2^{3/2}} = 0, \text{ if } n \in \{4, 5, \dots\},
\quad\text{and}\quad
\gamma_1 = \frac{\mu_4}{\mu_2^2} = \frac{3(n-2)}{n-4}, \text{ if } n \in \{5, 6, \dots\}.
\]
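The variance and kurtosis in (iv) can be cross-checked against a library implementation. The sketch below (illustrative, not part of the notes) uses scipy, which reports the Fisher (excess) kurtosis, so 3 must be added back:

```python
# For X ~ t_7: variance should be n/(n-2) = 7/5 and kurtosis 3(n-2)/(n-4) = 5.
from scipy import stats

n = 7
var, excess_kurt = stats.t(df=n).stats(moments="vk")
kurtosis = float(excess_kurt) + 3.0              # scipy returns excess kurtosis

print(float(var), kurtosis)
```

Scipy's excess kurtosis for $t_n$ is $6/(n-4)$, and $3 + 6/(n-4) = 3(n-2)/(n-4)$, in agreement with (iv).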

(v) Using (ii), we have
\[
X \overset{d}{=} \frac{n_2}{n_1}\cdot\frac{X_1}{X_2},
\]
where $X_1 \sim \chi^2_{n_1}$ and $X_2 \sim \chi^2_{n_2}$ are independent random variables. Fix
$r \in \{1, 2, \dots\}$. Then
\[
E(X^r) = \left(\frac{n_2}{n_1}\right)^{r} E\!\left(X_1^r X_2^{-r}\right)
= \left(\frac{n_2}{n_1}\right)^{r} E(X_1^r)\, E(X_2^{-r})
\quad (X_1 \text{ and } X_2 \text{ are independent}),
\]
provided the expectations are finite. Since $X_1 \sim \chi^2_{n_1}$, $E(X_1^r)$ is finite for any $r > 0$ and
\[
E(X_1^r) = \frac{1}{2^{\frac{n_1}{2}}\,\Gamma\!\left(\frac{n_1}{2}\right)}
\int_0^\infty x^{\frac{n_1}{2}+r-1}\, e^{-\frac{x}{2}}\, dx
= \frac{2^{\frac{n_1}{2}+r}\,\Gamma\!\left(\frac{n_1}{2}+r\right)}{2^{\frac{n_1}{2}}\,\Gamma\!\left(\frac{n_1}{2}\right)}
= 2^r\left(\frac{n_1}{2}+r-1\right)\left(\frac{n_1}{2}+r-2\right)\cdots\frac{n_1}{2}
\]
\[
= \left(n_1 + 2(r-1)\right)\left(n_1 + 2(r-2)\right)\cdots n_1
= \prod_{i=1}^{r}\left(n_1 + 2(i-1)\right), \quad r \in \{1, 2, \dots\}.
\]
Since $X_2 \sim \chi^2_{n_2}$, $E(X_2^{-r})$ is finite if, and only if, $n_2 > 2r$. For $n_2 > 2r$,
\[
E(X_2^{-r}) = \frac{2^{\frac{n_2}{2}-r}\,\Gamma\!\left(\frac{n_2}{2}-r\right)}{2^{\frac{n_2}{2}}\,\Gamma\!\left(\frac{n_2}{2}\right)}
= \frac{1}{\prod_{i=1}^{r}(n_2 - 2i)}.
\]
It follows that, for $n_2 \in \{1, 2, \dots, 2r\}$, $E(X^r)$ is not finite. For $n_2 \in \{2r+1, 2r+2, \dots\}$,
\[
E(X^r) = \left(\frac{n_2}{n_1}\right)^{r}\prod_{i=1}^{r}\frac{n_1 + 2(i-1)}{n_2 - 2i}.
\]
(vi) Follows on using (v) after some tedious calculations.
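The product formula in (v) can be verified numerically against scipy's F distribution. This is an illustrative sketch (the degrees of freedom are arbitrary choices, not from the notes):

```python
# Compare E(X^r) = (n2/n1)^r * prod_i (n1 + 2(i-1))/(n2 - 2i) for X ~ F_{n1,n2}
# with the moments scipy computes for the F distribution.
from scipy import stats

def f_moment(n1, n2, r):
    """Moment formula from Theorem 11.1 (v); requires n2 > 2r."""
    out = (n2 / n1) ** r
    for i in range(1, r + 1):
        out *= (n1 + 2 * (i - 1)) / (n2 - 2 * i)
    return out

n1, n2 = 4, 12
m1 = f_moment(n1, n2, 1)                         # n2/(n2-2) = 1.2, matching (vi)
m2 = f_moment(n1, n2, 2)                         # (n2/n1)^2 * (n1/(n2-2)) * ((n1+2)/(n2-4)) = 2.7
scipy_m1 = float(stats.f(dfn=n1, dfd=n2).moment(1))
scipy_m2 = float(stats.f(dfn=n1, dfd=n2).moment(2))
print(m1, m2, scipy_m1, scipy_m2)
```

The $r = 1$ case reproduces the mean $n_2/(n_2-2)$ stated in (vi), and the $r = 2$ case can be combined with it to recover the variance formula.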

Corollary 11.1

Let $X_1, \dots, X_n$ ($n \ge 2$) be a random sample from the $N(\mu, \sigma^2)$ distribution, where
$\mu \in (-\infty, \infty)$ and $\sigma > 0$. Let $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$ and
$S^2 = \frac{1}{n-1}\sum_{i=1}^{n}(X_i - \bar{X})^2$ denote the sample mean and the sample variance
respectively. Then
\[
\frac{\sqrt{n}(\bar{X} - \mu)}{S} \sim t_{n-1}.
\]
Proof. By Theorem 10.3.1, $\bar{X} \sim N\!\left(\mu, \frac{\sigma^2}{n}\right)$ and
$\frac{(n-1)S^2}{\sigma^2} \sim \chi^2_{n-1}$ are independent random variables. This in turn implies that
$\frac{\sqrt{n}(\bar{X} - \mu)}{\sigma} \sim N(0, 1)$ and $\frac{(n-1)S^2}{\sigma^2} \sim \chi^2_{n-1}$ are
independent random variables. Now by virtue of Theorem 11.1 (i),
\[
\frac{\frac{\sqrt{n}(\bar{X} - \mu)}{\sigma}}{\sqrt{\frac{(n-1)S^2/\sigma^2}{n-1}}} \sim t_{n-1},
\quad\text{i.e.,}\quad
\frac{\sqrt{n}(\bar{X} - \mu)}{S} \sim t_{n-1}.
\]
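Corollary 11.1 is the basis of the one-sample t-test, and the claimed distribution is easy to check by simulation. The sketch below (illustrative, with arbitrarily chosen parameters, not part of the notes) compares the simulated studentized means with $t_{n-1}$ via a Kolmogorov-Smirnov test:

```python
# The studentized mean sqrt(n)*(Xbar - mu)/S computed from N(mu, sigma^2) samples
# should follow the t distribution with n-1 degrees of freedom.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n, mu, sigma, m = 8, 3.0, 1.5, 50_000

x = rng.normal(mu, sigma, size=(m, n))
tstat = np.sqrt(n) * (x.mean(axis=1) - mu) / x.std(axis=1, ddof=1)

# Kolmogorov-Smirnov distance to t_{n-1}: a small statistic is consistent with the claim.
ks = stats.kstest(tstat, "t", args=(n - 1,))
print(round(float(ks.statistic), 4), round(float(ks.pvalue), 3))
```

With 50,000 replications the KS statistic stays on the order of $1/\sqrt{m}$, as expected when the hypothesized distribution is correct.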

Corollary 11.2

Let $X_1, \dots, X_m$ ($m \ge 2$) be a random sample from the $N(\mu_1, \sigma_1^2)$ distribution and let
$Y_1, \dots, Y_n$ ($n \ge 2$) be a random sample from the $N(\mu_2, \sigma_2^2)$ distribution, where
$-\infty < \mu_i < \infty$ and $\sigma_i > 0$, $i = 1, 2$. Further suppose that $X = (X_1, \dots, X_m)$ and
$Y = (Y_1, \dots, Y_n)$ are independent. Let
$S_1^2 = \frac{1}{m-1}\sum_{i=1}^{m}(X_i - \bar{X})^2$ and
$S_2^2 = \frac{1}{n-1}\sum_{i=1}^{n}(Y_i - \bar{Y})^2$ be the sample variances based on the random samples
$X = (X_1, \dots, X_m)$ and $Y = (Y_1, \dots, Y_n)$, respectively; here
$\bar{X} = \frac{1}{m}\sum_{i=1}^{m} X_i$ and $\bar{Y} = \frac{1}{n}\sum_{i=1}^{n} Y_i$ are the sample means
based on the two random samples. Then
\[
\frac{\sigma_2^2 S_1^2}{\sigma_1^2 S_2^2} \sim F_{m-1,\,n-1}.
\]
Proof. By virtue of Theorem 10.3.1 (iii) we have
\[
\frac{(m-1)S_1^2}{\sigma_1^2} \sim \chi^2_{m-1}
\quad\text{and}\quad
\frac{(n-1)S_2^2}{\sigma_2^2} \sim \chi^2_{n-1}.
\]
Also the independence of $X$ and $Y$ implies that $\frac{(m-1)S_1^2}{\sigma_1^2}$ (a function of $X$ alone) and
$\frac{(n-1)S_2^2}{\sigma_2^2}$ (a function of $Y$ alone) are independent. Now use of Theorem 11.1 (ii) yields
\[
\frac{S_1^2/\sigma_1^2}{S_2^2/\sigma_2^2} \sim F_{m-1,\,n-1},
\quad\text{i.e.,}\quad
\frac{\sigma_2^2 S_1^2}{\sigma_1^2 S_2^2} \sim F_{m-1,\,n-1}.
\]

Remark 11.2

(i) Suppose that $X \sim t_n$. Then, by Theorem 11.1 (i),
\[
X \overset{d}{=} \frac{W}{\sqrt{\frac{Z}{n}}},
\]
where $W \sim N(0, 1)$ and $Z \sim \chi^2_n$ are independent random variables. Therefore
\[
X^2 \overset{d}{=} \frac{W^2}{Z/n}.
\]
Since $W \sim N(0, 1)$, by Theorem 4.2 (v), Module 5, we have $W^2 \sim \chi^2_1$. It follows that
$W^2 \sim \chi^2_1$ and $Z \sim \chi^2_n$ are independent random variables. Consequently
\[
X^2 \overset{d}{=} \frac{W^2/1}{Z/n} \sim F_{1,n}.
\]
Thus if $X \sim t_n$, then $X^2 \sim F_{1,n}$.

(ii) Suppose that $X \sim F_{n_1,n_2}$. Then, by Theorem 11.1 (ii),
\[
X \overset{d}{=} \frac{X_1/n_1}{X_2/n_2},
\]
where $X_1 \sim \chi^2_{n_1}$ and $X_2 \sim \chi^2_{n_2}$ are independent random variables. Then
\[
\frac{1}{X} \overset{d}{=} \frac{X_2/n_2}{X_1/n_1},
\]
where $X_2 \sim \chi^2_{n_2}$ and $X_1 \sim \chi^2_{n_1}$ are independent random variables. Now using
Theorem 11.1 (ii) it follows that
\[
\frac{1}{X} \overset{d}{=} \frac{X_2/n_2}{X_1/n_1} \sim F_{n_2,n_1}.
\]
Thus if $X \sim F_{n_1,n_2}$, then $\frac{1}{X} \sim F_{n_2,n_1}$.

Note that if $X \sim t_n$ then, by Remark 11.1 (i), the distribution of $X$ is symmetric about 0 and its
kurtosis is
\[
\gamma_1 = \frac{3(n-2)}{n-4} > 3, \quad n > 4.
\]
Thus a $t$-distribution with $n$ ($> 4$) degrees of freedom is symmetric and leptokurtic (i.e., it has a
sharper peak and longer, fatter tails than the normal distribution). Note that the kurtosis $\gamma_1$
decreases as $n$ increases and $\gamma_1 \to 3$ as $n \to \infty$. This suggests that, for large degrees of
freedom, Student's $t$-distribution behaves like the $N(0, 1)$ distribution. A rigorous proof of this
observation will be provided in the next module.

Figure 11.1: Plot of the p.d.f. of $X \sim t_n$

Suppose that $X \sim t_n$ and, for a fixed $\alpha \in (0, 1)$, let $t_{n,\alpha}$ be the $(1-\alpha)$-th
quantile of $X$, i.e.,
\[
F_X(t_{n,\alpha}) = P\left(\{X \le t_{n,\alpha}\}\right) = 1 - \alpha.
\]
Then
\[
F_X(-t_{n,\alpha}) = 1 - F_X(t_{n,\alpha}) = \alpha \quad\left(\text{since } X \overset{d}{=} -X\right).
\]
7

Figure 11.2: $(1-\alpha)$-th quantile of $X \sim t_n$ $\left(P(\{X \le t_{n,\alpha}\}) = 1 - \alpha\right)$

Now suppose that $X \sim F_{n_1,n_2}$ and, for a fixed $\alpha \in (0, 1)$, let $f_{n_1,n_2,\alpha}$ be the
$(1-\alpha)$-th quantile of $X$, i.e.,
\[
F_X(f_{n_1,n_2,\alpha}) = P\left(\{X \le f_{n_1,n_2,\alpha}\}\right) = 1 - \alpha.
\]
Since $\frac{1}{X} \sim F_{n_2,n_1}$ and $P(\{X > 0\}) = 1$, it follows that
\[
P\left(\left\{\frac{1}{X} \ge \frac{1}{f_{n_1,n_2,\alpha}}\right\}\right) = 1 - \alpha
\;\Rightarrow\;
P\left(\left\{\frac{1}{X} \le \frac{1}{f_{n_1,n_2,\alpha}}\right\}\right) = \alpha = 1 - (1 - \alpha)
\;\Rightarrow\;
\frac{1}{f_{n_1,n_2,\alpha}} = f_{n_2,n_1,1-\alpha},
\]
i.e., $f_{n_1,n_2,\alpha}\, f_{n_2,n_1,1-\alpha} = 1$.
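This reciprocal relation is what lets tables such as Table 11.2 list only upper quantiles. A numeric check with scipy (an illustrative sketch, not part of the notes):

```python
# The upper-alpha point of F_{n1,n2} is f.ppf(1 - alpha, n1, n2); the relation says
# f_{n1,n2,alpha} * f_{n2,n1,1-alpha} = 1.
from scipy import stats

n1, n2, alpha = 5, 10, 0.05
upper = stats.f.ppf(1 - alpha, n1, n2)           # f_{n1, n2, alpha}
lower = stats.f.ppf(alpha, n2, n1)               # f_{n2, n1, 1-alpha}
product = float(upper * lower)
print(round(product, 10))
```

In practice this is used to read lower F-quantiles from a table of upper quantiles: $f_{n_2,n_1,1-\alpha} = 1/f_{n_1,n_2,\alpha}$.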

Figure 11.3: Plots of p.d.f.s of $X \sim F_{n_1,n_2}$

Figure 11.4: $(1-\alpha)$-th quantile of $X \sim F_{n_1,n_2}$
$\left(P(\{X \le f_{n_1,n_2,\alpha}\}) = 1 - \alpha\right)$

Example 11.1

Let $W_1, \dots, W_n$ be independent and identically distributed $N(0, 1)$ random variables and let
$a_1, \dots, a_n, b_1, \dots, b_n$ be real numbers such that $\sum_{i=1}^{n} a_i^2 > 0$,
$\sum_{i=1}^{n} b_i^2 > 0$, and $\sum_{i=1}^{n} a_i b_i = 0$. Show that:
\[
\text{(i)}\quad Z_1 = \frac{\sqrt{\sum_{i=1}^{n} b_i^2}\,\sum_{i=1}^{n} a_i W_i}
{\sqrt{\sum_{i=1}^{n} a_i^2}\,\left|\sum_{i=1}^{n} b_i W_i\right|} \sim t_1;
\]
\[
\text{(ii)}\quad Z_2 = \frac{\left(\sum_{i=1}^{n} b_i^2\right)\left(\sum_{i=1}^{n} a_i W_i\right)^2}
{\left(\sum_{i=1}^{n} a_i^2\right)\left(\sum_{i=1}^{n} b_i W_i\right)^2} \sim F_{1,1};
\]
\[
\text{(iii)}\quad Z_3 = \frac{\sqrt{\sum_{i=1}^{n} b_i^2}\,\sum_{i=1}^{n} a_i W_i}
{\sqrt{\sum_{i=1}^{n} a_i^2}\,\sum_{i=1}^{n} b_i W_i} \sim C(0, 1),
\]
the standard Cauchy distribution.

Solution. Let $U_1 = \sum_{i=1}^{n} a_i W_i$ and $U_2 = \sum_{i=1}^{n} b_i W_i$. For $c_1, c_2 \in \mathbb{R}$,
\[
c_1 U_1 + c_2 U_2 = \sum_{i=1}^{n}(c_1 a_i + c_2 b_i) W_i.
\]
Since $W_1, \dots, W_n$ are independent, by Example 7.1,
\[
c_1 U_1 + c_2 U_2 \sim N\!\left(0, \sum_{i=1}^{n}(c_1 a_i + c_2 b_i)^2\right).
\]
Now using Theorem 9.2 it follows that
$U = (U_1, U_2) \sim N_2\!\left(0, 0, \sum_{i=1}^{n} a_i^2, \sum_{i=1}^{n} b_i^2, 0\right)$
(since $E(U_1) = 0 = E(U_2)$, $\operatorname{Var}(U_1) = \sum_{i=1}^{n} a_i^2$,
$\operatorname{Var}(U_2) = \sum_{i=1}^{n} b_i^2$ and
$\operatorname{Cov}(U_1, U_2) = \sum_{i=1}^{n} a_i b_i = 0$). Since the correlation between $U_1$ and $U_2$ is
0 and $(U_1, U_2)$ is bivariate normal, it follows that $U_1 \sim N\!\left(0, \sum_{i=1}^{n} a_i^2\right)$ and
$U_2 \sim N\!\left(0, \sum_{i=1}^{n} b_i^2\right)$ are independent (see Theorem 9.1). Thus
\[
S_1 = \frac{U_1}{\sqrt{\sum_{i=1}^{n} a_i^2}} = \frac{\sum_{i=1}^{n} a_i W_i}{\sqrt{\sum_{i=1}^{n} a_i^2}}
\quad\text{and}\quad
S_2 = \frac{U_2}{\sqrt{\sum_{i=1}^{n} b_i^2}} = \frac{\sum_{i=1}^{n} b_i W_i}{\sqrt{\sum_{i=1}^{n} b_i^2}}
\]
are independent and identically distributed $N(0, 1)$ random variables. This implies that $S_1 \sim N(0, 1)$
and $S_2^2 \sim \chi^2_1$ are independent random variables. Consequently
\[
Z_1 = \frac{S_1}{\sqrt{S_2^2}} \sim t_1,
\qquad
Z_2 = \frac{S_1^2/1}{S_2^2/1} \sim F_{1,1},
\]
and
\[
Z_3 = \frac{S_1}{S_2} \sim C(0, 1) \quad \text{(see Example 10.2.12 (ii))}.
\]

Table 11.1: $(1-\alpha)$-th quantiles of $X \sim t_n$ $\left(P(\{X \le t_{n,\alpha}\}) = 1 - \alpha\right)$

                              α
 n    .25    .1     .05    .025   .01    .005   .001
1 1.000 3.078 6.314 12.71 31.82 63.66 318.3
2 0.816 1.886 2.920 4.303 6.965 9.925 22.33
3 0.765 1.638 2.353 3.182 4.541 5.841 10.21
4 0.741 1.533 2.132 2.776 3.747 4.604 7.173
5 0.727 1.476 2.015 2.571 3.365 4.032 5.893
6 0.718 1.440 1.943 2.447 3.143 3.707 5.208
7 0.711 1.415 1.895 2.365 2.998 3.499 4.785
8 0.706 1.397 1.860 2.306 2.896 3.355 4.501
9 0.703 1.383 1.833 2.262 2.821 3.250 4.297
10 0.700 1.372 1.812 2.228 2.764 3.169 4.144
11 0.697 1.363 1.796 2.201 2.718 3.106 4.025
12 0.695 1.356 1.782 2.179 2.681 3.055 3.930
13 0.694 1.350 1.771 2.160 2.650 3.012 3.852
14 0.692 1.345 1.761 2.145 2.624 2.977 3.787
15 0.691 1.341 1.753 2.131 2.602 2.947 3.733
16 0.690 1.337 1.746 2.120 2.583 2.921 3.686
17 0.689 1.333 1.740 2.110 2.567 2.898 3.646
18 0.688 1.330 1.734 2.101 2.552 2.878 3.610
19 0.688 1.328 1.729 2.093 2.539 2.861 3.579
20 0.687 1.325 1.725 2.086 2.528 2.845 3.552
21 0.686 1.323 1.721 2.080 2.518 2.831 3.527
22 0.686 1.321 1.717 2.074 2.508 2.819 3.505
23 0.685 1.319 1.714 2.069 2.500 2.807 3.485
24 0.685 1.318 1.711 2.064 2.492 2.797 3.467
25 0.684 1.316 1.708 2.060 2.485 2.787 3.450
26 0.684 1.315 1.706 2.056 2.479 2.779 3.435
27 0.684 1.314 1.703 2.052 2.473 2.771 3.421
28 0.683 1.313 1.701 2.048 2.467 2.763 3.408
29 0.683 1.311 1.699 2.045 2.462 2.756 3.396
30 0.683 1.310 1.697 2.042 2.457 2.750 3.385
35 0.682 1.306 1.690 2.030 2.438 2.724 3.340
40 0.681 1.303 1.684 2.021 2.423 2.704 3.307
50 0.679 1.299 1.676 2.009 2.403 2.678 3.261
100 0.677 1.290 1.660 1.984 2.364 2.626 3.174
∞ 0.674 1.282 1.645 1.960 2.326 2.576 3.090

Table 11.2: $(1-\alpha)$-th quantiles of $X \sim F_{n_1,n_2}$
$\left(P(\{X \le f_{n_1,n_2,\alpha}\}) = 1 - \alpha\right)$, $\alpha = 0.10$

                              n₁
 n₂    1     2     3     4     5     6     7     8     9
1 39.86 49.50 53.59 55.83 57.24 58.20 58.91 59.44 59.86
2 8.53 9.00 9.16 9.24 9.29 9.33 9.35 9.37 9.38
3 5.54 5.46 5.39 5.34 5.31 5.28 5.27 5.25 5.24
4 4.54 4.32 4.19 4.11 4.05 4.01 3.98 3.95 3.94
5 4.06 3.78 3.62 3.52 3.45 3.40 3.37 3.34 3.32
6 3.78 3.46 3.29 3.18 3.11 3.05 3.01 2.98 2.96
7 3.59 3.26 3.07 2.96 2.88 2.83 2.78 2.75 2.72
8 3.46 3.11 2.92 2.81 2.73 2.67 2.62 2.59 2.56
9 3.36 3.01 2.81 2.69 2.61 2.55 2.51 2.47 2.44
10 3.29 2.92 2.73 2.61 2.52 2.46 2.41 2.38 2.35
11 3.23 2.86 2.66 2.54 2.45 2.39 2.34 2.30 2.27
12 3.18 2.81 2.61 2.48 2.39 2.33 2.28 2.24 2.21
13 3.14 2.76 2.56 2.43 2.35 2.28 2.23 2.20 2.16
14 3.10 2.73 2.52 2.39 2.31 2.24 2.19 2.15 2.12
15 3.07 2.70 2.49 2.36 2.27 2.21 2.16 2.12 2.09
16 3.05 2.67 2.46 2.33 2.24 2.18 2.13 2.09 2.06
17 3.03 2.64 2.44 2.31 2.22 2.15 2.10 2.06 2.03
18 3.01 2.62 2.42 2.29 2.20 2.13 2.08 2.04 2.00
19 2.99 2.61 2.40 2.27 2.18 2.11 2.06 2.02 1.98
20 2.97 2.59 2.38 2.25 2.16 2.09 2.04 2.00 1.96
21 2.96 2.57 2.36 2.23 2.14 2.08 2.02 1.98 1.95
22 2.95 2.56 2.35 2.22 2.13 2.06 2.01 1.97 1.93
23 2.94 2.55 2.34 2.21 2.11 2.05 1.99 1.95 1.92
24 2.93 2.54 2.33 2.19 2.10 2.04 1.98 1.94 1.91
25 2.92 2.53 2.32 2.18 2.09 2.02 1.97 1.93 1.89
26 2.91 2.52 2.31 2.17 2.08 2.01 1.96 1.92 1.88
27 2.90 2.51 2.30 2.17 2.07 2.00 1.95 1.91 1.87
28 2.89 2.50 2.29 2.16 2.06 2.00 1.94 1.90 1.87
29 2.89 2.50 2.28 2.15 2.06 1.99 1.93 1.89 1.86
30 2.88 2.49 2.28 2.14 2.05 1.98 1.93 1.88 1.85
40 2.84 2.44 2.23 2.09 2.00 1.93 1.87 1.83 1.79
60 2.79 2.39 2.18 2.04 1.95 1.87 1.82 1.77 1.74


120 2.75 2.35 2.13 1.99 1.90 1.82 1.77 1.72 1.68
∞ 2.71 2.30 2.08 1.94 1.85 1.77 1.72 1.67 1.63

Table 11.2 (continued): $(1-\alpha)$-th quantiles of $X \sim F_{n_1,n_2}$
$\left(P(\{X \le f_{n_1,n_2,\alpha}\}) = 1 - \alpha\right)$, $\alpha = 0.10$

                              n₁
 n₂    10    12    15    20    24    30    40    60    120   ∞
1 60.19 60.71 61.22 61.74 62.00 62.26 62.53 62.79 63.06 63.33
2 9.39 9.41 9.42 9.44 9.45 9.46 9.47 9.47 9.48 9.49
3 5.23 5.22 5.20 5.18 5.18 5.17 5.16 5.15 5.14 5.13
4 3.92 3.90 3.87 3.84 3.83 3.82 3.80 3.79 3.78 3.76
5 3.30 3.27 3.24 3.21 3.19 3.17 3.16 3.14 3.12 3.10
6 2.94 2.90 2.87 2.84 2.82 2.80 2.78 2.76 2.74 2.72
7 2.70 2.67 2.63 2.59 2.58 2.56 2.54 2.51 2.49 2.47
8 2.54 2.50 2.46 2.42 2.40 2.38 2.36 2.34 2.32 2.29
9 2.42 2.38 2.34 2.30 2.28 2.25 2.23 2.21 2.18 2.16
10 2.32 2.28 2.24 2.20 2.18 2.16 2.13 2.11 2.08 2.06
11 2.25 2.21 2.17 2.12 2.10 2.08 2.05 2.03 2.00 1.97
12 2.19 2.15 2.10 2.06 2.04 2.01 1.99 1.96 1.93 1.90
13 2.14 2.10 2.05 2.01 1.98 1.96 1.93 1.90 1.88 1.85
14 2.10 2.05 2.01 1.96 1.94 1.91 1.89 1.86 1.83 1.80
15 2.06 2.02 1.97 1.92 1.90 1.87 1.85 1.82 1.79 1.76
16 2.03 1.99 1.94 1.89 1.87 1.84 1.81 1.78 1.75 1.72
17 2.00 1.96 1.91 1.86 1.84 1.81 1.78 1.75 1.72 1.69
18 1.98 1.93 1.89 1.84 1.81 1.78 1.75 1.72 1.69 1.66
19 1.96 1.91 1.86 1.81 1.79 1.76 1.73 1.70 1.67 1.63
20 1.94 1.89 1.84 1.79 1.77 1.74 1.71 1.68 1.64 1.61
21 1.92 1.87 1.83 1.78 1.75 1.72 1.69 1.66 1.62 1.59
22 1.90 1.86 1.81 1.76 1.73 1.70 1.67 1.64 1.60 1.57
23 1.89 1.84 1.80 1.74 1.72 1.69 1.66 1.62 1.59 1.55
24 1.88 1.83 1.78 1.73 1.70 1.67 1.64 1.61 1.57 1.53
25 1.87 1.82 1.77 1.72 1.69 1.66 1.63 1.59 1.56 1.52
26 1.86 1.81 1.76 1.71 1.68 1.65 1.61 1.58 1.54 1.50
27 1.85 1.80 1.75 1.70 1.67 1.64 1.60 1.57 1.53 1.49
28 1.84 1.79 1.74 1.69 1.66 1.63 1.59 1.56 1.52 1.48
29 1.83 1.78 1.73 1.68 1.65 1.62 1.58 1.55 1.51 1.47
30 1.82 1.77 1.72 1.67 1.64 1.61 1.57 1.54 1.50 1.46
40 1.76 1.71 1.66 1.61 1.57 1.54 1.51 1.47 1.42 1.38
60 1.71 1.66 1.60 1.54 1.51 1.48 1.44 1.40 1.35 1.29


120 1.65 1.60 1.55 1.48 1.45 1.41 1.37 1.32 1.26 1.19
1.60 1.55 1.49 1.42 1.38 1.34 1.30 1.24 1.17 1.00

Table 11.2 (continued): $(1-\alpha)$-th quantiles of $X \sim F_{n_1,n_2}$
$\left(P(\{X \le f_{n_1,n_2,\alpha}\}) = 1 - \alpha\right)$, $\alpha = 0.05$

                              n₁
 n₂    1     2     3     4     5     6     7     8     9
1 161.4 199.5 215.7 224.6 230.2 234.0 236.8 238.9 240.5
2 18.51 19.00 19.16 19.25 19.3 19.33 19.35 19.37 19.38
3 10.13 9.55 9.28 9.12 9.01 8.94 8.89 8.85 8.81
4 7.71 6.94 6.59 6.39 6.26 6.16 6.09 6.04 6.00
5 6.61 5.79 5.41 5.19 5.05 4.95 4.88 4.82 4.77
6 5.99 5.14 4.76 4.53 4.39 4.28 4.21 4.15 4.10
7 5.59 4.74 4.35 4.12 3.97 3.87 3.79 3.73 3.68
8 5.32 4.46 4.07 3.84 3.69 3.58 3.50 3.44 3.39
9 5.12 4.26 3.86 3.63 3.48 3.37 3.29 3.23 3.18
10 4.96 4.10 3.71 3.48 3.33 3.22 3.14 3.07 3.02
11 4.84 3.98 3.59 3.36 3.20 3.09 3.01 2.95 2.90
12 4.75 3.89 3.49 3.26 3.11 3.00 2.91 2.85 2.80
13 4.67 3.81 3.41 3.18 3.03 2.92 2.83 2.77 2.71
14 4.60 3.74 3.34 3.11 2.96 2.85 2.76 2.70 2.65
15 4.54 3.68 3.29 3.06 2.90 2.79 2.71 2.64 2.59
16 4.49 3.63 3.24 3.01 2.85 2.74 2.66 2.59 2.54
17 4.45 3.59 3.20 2.96 2.81 2.70 2.61 2.55 2.49
18 4.41 3.55 3.16 2.93 2.77 2.66 2.58 2.51 2.46
19 4.38 3.52 3.13 2.90 2.74 2.63 2.54 2.48 2.42
20 4.35 3.49 3.10 2.87 2.71 2.60 2.51 2.45 2.39
21 4.32 3.47 3.07 2.84 2.68 2.57 2.49 2.42 2.37
22 4.30 3.44 3.05 2.82 2.66 2.55 2.46 2.40 2.34
23 4.28 3.42 3.03 2.80 2.64 2.53 2.44 2.37 2.32
24 4.26 3.40 3.01 2.78 2.62 2.51 2.42 2.36 2.30
25 4.24 3.39 2.99 2.76 2.60 2.49 2.40 2.34 2.28
26 4.23 3.37 2.98 2.74 2.59 2.47 2.39 2.32 2.27
27 4.21 3.35 2.96 2.73 2.57 2.46 2.37 2.31 2.25
28 4.20 3.34 2.95 2.71 2.56 2.45 2.36 2.29 2.24
29 4.18 3.33 2.93 2.70 2.55 2.43 2.35 2.28 2.22
30 4.17 3.32 2.92 2.69 2.53 2.42 2.33 2.27 2.21
40 4.08 3.23 2.84 2.61 2.45 2.34 2.25 2.18 2.12
60 4.00 3.15 2.76 2.53 2.37 2.25 2.17 2.10 2.04


120 3.92 3.07 2.68 2.45 2.29 2.17 2.09 2.02 1.96
∞ 3.84 3.00 2.60 2.37 2.21 2.10 2.01 1.94 1.88

Table 11.2 (continued): $(1-\alpha)$-th quantiles of $X \sim F_{n_1,n_2}$
$\left(P(\{X \le f_{n_1,n_2,\alpha}\}) = 1 - \alpha\right)$, $\alpha = 0.05$

                              n₁
 n₂    10    12    15    20    24    30    40    60    120   ∞
1 241.9 243.9 245.9 248.0 249.1 250.1 251.1 252.2 253.3 254.3
2 19.4 19.41 19.43 19.45 19.45 19.46 19.47 19.48 19.49 19.5
3 8.79 8.74 8.70 8.66 8.64 8.62 8.59 8.57 8.55 8.53
4 5.96 5.91 5.86 5.80 5.77 5.75 5.72 5.69 5.66 5.63
5 4.74 4.68 4.62 4.56 4.53 4.50 4.46 4.43 4.40 4.36
6 4.06 4.00 3.94 3.87 3.84 3.81 3.77 3.74 3.70 3.67
7 3.64 3.57 3.51 3.44 3.41 3.38 3.34 3.30 3.27 3.23
8 3.35 3.28 3.22 3.15 3.12 3.08 3.04 3.01 2.97 2.93
9 3.14 3.07 3.01 2.94 2.90 2.86 2.83 2.79 2.75 2.71
10 2.98 2.91 2.85 2.77 2.74 2.70 2.66 2.62 2.58 2.54
11 2.85 2.79 2.72 2.65 2.61 2.57 2.53 2.49 2.45 2.40
12 2.75 2.69 2.62 2.54 2.51 2.47 2.43 2.38 2.34 2.30
13 2.67 2.60 2.53 2.46 2.42 2.38 2.34 2.30 2.25 2.21
14 2.60 2.53 2.46 2.39 2.35 2.31 2.27 2.22 2.18 2.13
15 2.54 2.48 2.40 2.33 2.29 2.25 2.20 2.16 2.11 2.07
16 2.49 2.42 2.35 2.28 2.24 2.19 2.15 2.11 2.06 2.01
17 2.45 2.38 2.31 2.23 2.19 2.15 2.10 2.06 2.01 1.96
18 2.41 2.34 2.27 2.19 2.15 2.11 2.06 2.02 1.97 1.92
19 2.38 2.31 2.23 2.16 2.11 2.07 2.03 1.98 1.93 1.88
20 2.35 2.28 2.20 2.12 2.08 2.04 1.99 1.95 1.90 1.84
21 2.32 2.25 2.18 2.10 2.05 2.01 1.96 1.92 1.87 1.81
22 2.30 2.23 2.15 2.07 2.03 1.98 1.94 1.89 1.84 1.78
23 2.27 2.20 2.13 2.05 2.01 1.96 1.91 1.86 1.81 1.76
24 2.25 2.18 2.11 2.03 1.98 1.94 1.89 1.84 1.79 1.73
25 2.24 2.16 2.09 2.01 1.96 1.92 1.87 1.82 1.77 1.71
26 2.22 2.15 2.07 1.99 1.95 1.90 1.85 1.80 1.75 1.69
27 2.20 2.13 2.06 1.97 1.93 1.88 1.84 1.79 1.73 1.67
28 2.19 2.12 2.04 1.96 1.91 1.87 1.82 1.77 1.71 1.65
29 2.18 2.10 2.03 1.94 1.90 1.85 1.81 1.75 1.70 1.64
30 2.16 2.09 2.01 1.93 1.89 1.84 1.79 1.74 1.68 1.62
40 2.08 2.00 1.92 1.84 1.79 1.74 1.69 1.64 1.58 1.51
60 1.99 1.92 1.84 1.75 1.70 1.65 1.59 1.53 1.47 1.39


120 1.91 1.83 1.75 1.66 1.61 1.55 1.50 1.43 1.35 1.25
∞ 1.83 1.75 1.67 1.57 1.52 1.46 1.39 1.32 1.22 1.00

Table 11.2 (continued): $(1-\alpha)$-th quantiles of $X \sim F_{n_1,n_2}$
$\left(P(\{X \le f_{n_1,n_2,\alpha}\}) = 1 - \alpha\right)$, $\alpha = 0.01$

                              n₁
 n₂    1     2     3     4     5     6     7     8     9
1 4052 4999.5 5403 5625 5764 5859 5928 5982 6022
2 98.50 99.00 99.17 99.25 99.30 99.33 99.36 99.37 99.39
3 34.12 30.82 29.46 28.71 28.24 27.91 27.67 27.49 27.35
4 21.20 18.00 16.69 15.98 15.52 15.21 14.98 14.80 14.66
5 16.26 13.27 12.06 11.39 10.97 10.67 10.46 10.29 10.16
6 13.75 10.92 9.78 9.15 8.75 8.47 8.26 8.10 7.98
7 12.25 9.55 8.45 7.85 7.46 7.19 6.99 6.84 6.72
8 11.26 8.65 7.59 7.01 6.63 6.37 6.18 6.03 5.91
9 10.56 8.02 6.99 6.42 6.06 5.80 5.61 5.47 5.35
10 10.04 7.56 6.55 5.99 5.64 5.39 5.20 5.06 4.94
11 9.65 7.21 6.22 5.67 5.32 5.07 4.89 4.74 4.63
12 9.33 6.93 5.95 5.41 5.06 4.82 4.64 4.50 4.39
13 9.07 6.70 5.74 5.21 4.86 4.62 4.44 4.30 4.19
14 8.86 6.51 5.56 5.04 4.69 4.46 4.28 4.14 4.03
15 8.68 6.36 5.42 4.89 4.56 4.32 4.14 4.00 3.89
16 8.53 6.23 5.29 4.77 4.44 4.20 4.03 3.89 3.78
17 8.40 6.11 5.18 4.67 4.34 4.10 3.93 3.79 3.68
18 8.29 6.01 5.09 4.58 4.25 4.01 3.84 3.71 3.60
19 8.18 5.93 5.01 4.50 4.17 3.94 3.77 3.63 3.52
20 8.10 5.85 4.94 4.43 4.10 3.87 3.70 3.56 3.46
21 8.02 5.78 4.87 4.37 4.04 3.81 3.64 3.51 3.40
22 7.95 5.72 4.82 4.31 3.99 3.76 3.59 3.45 3.35
23 7.88 5.66 4.76 4.26 3.94 3.71 3.54 3.41 3.30
24 7.82 5.61 4.72 4.22 3.90 3.67 3.50 3.36 3.26
25 7.77 5.57 4.68 4.18 3.85 3.63 3.46 3.32 3.22
26 7.72 5.53 4.64 4.14 3.82 3.59 3.42 3.29 3.18
27 7.68 5.49 4.60 4.11 3.78 3.56 3.39 3.26 3.15
28 7.64 5.45 4.57 4.07 3.75 3.53 3.36 3.23 3.12
29 7.60 5.42 4.54 4.04 3.73 3.50 3.33 3.20 3.09
30 7.56 5.39 4.51 4.02 3.70 3.47 3.30 3.17 3.07
40 7.31 5.18 4.31 3.83 3.51 3.29 3.12 2.99 2.89
60 7.08 4.98 4.13 3.65 3.34 3.12 2.95 2.82 2.72


120 6.85 4.79 3.95 3.48 3.17 2.96 2.79 2.66 2.56
∞ 6.63 4.61 3.78 3.32 3.02 2.80 2.64 2.51 2.41

Table 11.2: (1 − α)-th quantiles F_{n1,n2,α} of F ~ F_{n1,n2} (P(F ≤ F_{n1,n2,α}) = 1 − α), α = 0.01

n2↓ n1→ 10 12 15 20 24 30 40 60 120 ∞
1 6056 6106 6157 6209 6235 6261 6287 6313 6339 6366
2 99.40 99.42 99.43 99.45 99.46 99.47 99.47 99.48 99.49 99.50
3 27.23 27.05 26.87 26.69 26.60 26.50 26.41 26.32 26.22 26.13
4 14.55 14.37 14.20 14.02 13.93 13.84 13.75 13.65 13.56 13.46
5 10.05 9.89 9.72 9.55 9.47 9.38 9.29 9.20 9.11 9.02
6 7.87 7.72 7.56 7.40 7.31 7.23 7.14 7.06 6.97 6.88
7 6.62 6.47 6.31 6.16 6.07 5.99 5.91 5.82 5.74 5.65
8 5.81 5.67 5.52 5.36 5.28 5.20 5.12 5.03 4.95 4.86
9 5.26 5.11 4.96 4.81 4.73 4.65 4.57 4.48 4.40 4.31
10 4.85 4.71 4.56 4.41 4.33 4.25 4.17 4.08 4.00 3.91
11 4.54 4.40 4.25 4.10 4.02 3.94 3.86 3.78 3.69 3.60
12 4.30 4.16 4.01 3.86 3.78 3.70 3.62 3.54 3.45 3.36
13 4.10 3.96 3.82 3.66 3.59 3.51 3.43 3.34 3.25 3.17
14 3.94 3.80 3.66 3.51 3.43 3.35 3.27 3.18 3.09 3.00
15 3.80 3.67 3.52 3.37 3.29 3.21 3.13 3.05 2.96 2.87
16 3.69 3.55 3.41 3.26 3.18 3.10 3.02 2.93 2.84 2.75
17 3.59 3.46 3.31 3.16 3.08 3.00 2.92 2.83 2.75 2.65
18 3.51 3.37 3.23 3.08 3.00 2.92 2.84 2.75 2.66 2.57
19 3.43 3.30 3.15 3.00 2.92 2.84 2.76 2.67 2.58 2.49
20 3.37 3.23 3.09 2.94 2.86 2.78 2.69 2.61 2.52 2.42
21 3.31 3.17 3.03 2.88 2.80 2.72 2.64 2.55 2.46 2.36
22 3.26 3.12 2.98 2.83 2.75 2.67 2.58 2.50 2.40 2.31
23 3.21 3.07 2.93 2.78 2.70 2.62 2.54 2.45 2.35 2.26
24 3.17 3.03 2.89 2.74 2.66 2.58 2.49 2.40 2.31 2.21
25 3.13 2.99 2.85 2.70 2.62 2.54 2.45 2.36 2.27 2.17
26 3.09 2.96 2.81 2.66 2.58 2.50 2.42 2.33 2.23 2.13
27 3.06 2.93 2.78 2.63 2.55 2.47 2.38 2.29 2.20 2.10
28 3.03 2.90 2.75 2.60 2.52 2.44 2.35 2.26 2.17 2.06
29 3.00 2.87 2.73 2.57 2.49 2.41 2.33 2.23 2.14 2.03
30 2.98 2.84 2.70 2.55 2.47 2.39 2.30 2.21 2.11 2.01
40 2.80 2.66 2.52 2.37 2.29 2.20 2.11 2.02 1.92 1.80
60 2.63 2.50 2.35 2.20 2.12 2.03 1.94 1.84 1.73 1.60


120 2.47 2.34 2.19 2.03 1.95 1.86 1.76 1.66 1.53 1.38
∞ 2.32 2.18 2.04 1.88 1.79 1.70 1.59 1.47 1.32 1.00
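Table entries like these can be reproduced numerically. Below is a minimal Monte Carlo sketch (not part of the original notes; sample size, seed and tolerance are arbitrary choices) that approximates the upper 1% point of the F distribution with (5, 10) degrees of freedom, which the table above lists as 5.64:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# An F(5,10) variate is (chi2_5 / 5) / (chi2_10 / 10)
num = rng.chisquare(5, size=n) / 5
den = rng.chisquare(10, size=n) / 10
f_sample = num / den

# upper 1% point, i.e. the 0.99-quantile
q99 = np.quantile(f_sample, 0.99)
print(q99)  # close to the tabulated 5.64
```

With a million draws the estimate typically lands within a few hundredths of the tabulated value; an exact computation would use the inverse c.d.f. of the F distribution instead.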

Problems

1. (i) Let F: ℝ² → ℝ be given by

F(x, y) = 1, if x + 2y ≥ 1; F(x, y) = 0, if x + 2y < 1.

Does F(⋅,⋅) define a distribution function?

(ii) Let F: ℝ² → ℝ be given by

F(x, y) = 0, if x < 0 or y < 0 or x + y < 1; F(x, y) = 1, otherwise.

Does F(⋅,⋅) define a distribution function?

(iii) Let F_{X,Y}(⋅,⋅) be the distribution function of some two-dimensional random vector (X, Y), and let F_X(⋅) and F_Y(⋅), respectively, be the marginal distribution functions of X and Y. Define

M(x, y) = min{F_X(x), F_Y(y)} and W(x, y) = max{F_X(x) + F_Y(y) − 1, 0}, (x, y) ∈ ℝ².

Prove that:

(a) M(⋅,⋅) and W(⋅,⋅) are each distribution functions and their marginal distribution functions are the same as those of F_{X,Y}(⋅,⋅);

(b) W(x, y) ≤ F_{X,Y}(x, y) ≤ M(x, y), ∀(x, y) ∈ ℝ².

(Note: Let the random variable X have distribution function F_X(⋅) and let Y = g(X) have distribution function F_Y(⋅), where g(⋅) is some function. If g(⋅) is strictly increasing (decreasing), then F_{X,Y}(x, y) = M(x, y) (F_{X,Y}(x, y) = W(x, y)).)
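The two bounds in part (iii) can be illustrated numerically. The sketch below (an illustration, not a proof) takes the concrete case of independent U(0,1) marginals, so F_X(x) = x, F_Y(y) = y and F_{X,Y}(x, y) = xy, and checks W ≤ F_{X,Y} ≤ M on a grid:

```python
# Check the bounds W <= F_{X,Y} <= M on a grid, for the concrete
# case of independent U(0,1) marginals:
# F_X(x) = x, F_Y(y) = y, F_{X,Y}(x, y) = x*y on [0,1]^2.

def check_bounds(steps=100):
    for i in range(steps + 1):
        for j in range(steps + 1):
            x, y = i / steps, j / steps
            Fx, Fy, Fxy = x, y, x * y
            W = max(Fx + Fy - 1.0, 0.0)   # lower bound
            M = min(Fx, Fy)               # upper bound
            assert W <= Fxy + 1e-12 and Fxy <= M + 1e-12
    return True

print(check_bounds())  # True
```

Here the inequalities reduce to (1 − x)(1 − y) ≥ 0 and xy ≤ min(x, y), which the grid check confirms.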

2. Let the random vector X = (X1, X2) have the joint distribution function

F_{X1,X2}(x1, x2) =
  0, if x1 < 0 or x2 < 0;
  x1 x2/8, if 0 ≤ x1 < 1, 0 ≤ x2 < 2, or 1 ≤ x1 < 2, 0 ≤ x2 < 1;
  x1/4, if 0 ≤ x1 < 1, x2 ≥ 2;
  1/2 + x1 x2/8, if 1 ≤ x1 < 2, 1 ≤ x2 < 2;
  1/2 + x1/4, if 1 ≤ x1 < 2, x2 ≥ 2;
  x2/4, if x1 ≥ 2, 0 ≤ x2 < 1;
  1/2 + x2/4, if x1 ≥ 2, 1 ≤ x2 < 2;
  1, if x1 ≥ 2, x2 ≥ 2.

Find P({(X1, X2) = (0, 0)}) and P({(X1, X2) = (1, 1)}). Is X = (X1, X2) of absolutely continuous type?

3. Let the random vector (X, Y) have the p.m.f.

f_{X,Y}(x, y) = ((x + y + k − 1)!/(x! y! (k − 1)!)) p1^x p2^y (1 − p1 − p2)^k, if (x, y) ∈ ℤ₊ × ℤ₊; 0, otherwise,

where k ≥ 1 is an integer, 0 < p_i < 1, i = 1, 2, p1 + p2 < 1 and ℤ₊ = {0, 1, 2, …}. Find the marginal p.m.f.s of X and Y and the conditional distributions. (Note: A distribution with the above p.m.f. is called a bivariate negative binomial distribution.)

4. Three balls are randomly placed in three empty boxes B1, B2 and B3. Let N denote the total number of boxes which are occupied and let X_i denote the number of balls in box B_i, i = 1, 2, 3.
(i) Find the joint p.m.f. of (N, X1);
(ii) Find the joint p.m.f. of (X1, X2);
(iii) Find the marginal p.m.f.s of N and X1;
(iv) Find the marginal p.m.f. of X1 from the joint p.m.f. of (X1, X2).
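Since all 3³ = 27 equally likely placements can be enumerated, the p.m.f.s asked for here can be tabulated by brute force. A sketch (the names N and X_i follow the problem statement):

```python
from itertools import product
from collections import Counter

# Each outcome assigns balls 1, 2, 3 to a box in {1, 2, 3};
# all 27 placements are equally likely.
joint_N_X1 = Counter()
joint_X1_X2 = Counter()
for outcome in product([1, 2, 3], repeat=3):
    counts = [outcome.count(b) for b in (1, 2, 3)]  # (X1, X2, X3)
    N = sum(c > 0 for c in counts)                  # occupied boxes
    joint_N_X1[(N, counts[0])] += 1
    joint_X1_X2[(counts[0], counts[1])] += 1

# probabilities are counts / 27, e.g.:
print(joint_N_X1[(1, 3)] / 27)   # P(N=1, X1=3) = 1/27
print(joint_N_X1[(3, 1)] / 27)   # P(N=3, X1=1) = 6/27
print(sum(v for (n, x1), v in joint_N_X1.items() if x1 == 0) / 27)  # P(X1=0) = 8/27
```

The last value agrees with the direct computation P(X1 = 0) = (2/3)³ = 8/27.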

5. Let X1 and X2 have the joint p.m.f.

f_X(x1, x2) = (2/3)^{x1+x2} (1/3)^{2−x1−x2}, if (x1, x2) ∈ {(0,0), (0,1), (1,0), (1,1)}; 0, otherwise.

(i) Find the joint p.m.f. of Y1 = X1 − X2 and Y2 = X1 + X2;
(ii) Find the marginal p.m.f.s of Y1 and Y2;
(iii) Find E(Y1 Y2).

6. Let X = (X1, X2) have the joint p.m.f.

f_X(x1, x2) = x1 x2/36, if x1, x2 ∈ {1, 2, 3}; 0, otherwise,

and let Y1 = X1 X2 and Y2 = X2.
(i) Find the joint p.m.f. of (Y1, Y2);
(ii) Find the marginal p.m.f. of Y1;
(iii) Find P({X1 + X2 = 4}).
7. Suppose that X1, …, Xn are i.i.d. random variables with P(X1 = 0) = 1 − p = 1 − P(X1 = 1), for some p ∈ (0, 1). Let N denote the number of X1, …, Xn that are as large as X1. Find the p.m.f. of N.

8. Suppose that the number, X, of eggs laid by a bird has the P(λ) distribution (the Poisson distribution with mean λ), and the probability that an egg finally develops is p ∈ (0, 1); here λ > 0. Further suppose that eggs develop independently of each other. Show that the number, Y, of eggs surviving has the P(λp) distribution. Also, find the conditional distribution of X given Y = y, where y ∈ {0, 1, 2, …}.

9. Let the bivariate beta random vector (X, Y) have the joint p.d.f.

f_{X,Y}(x, y) = (Γ(α1 + α2 + α3)/(Γ(α1)Γ(α2)Γ(α3))) x^{α1−1} y^{α2−1} (1 − x − y)^{α3−1}, if x > 0, y > 0, x + y < 1; 0, otherwise,

where α_i > 0, i = 1, 2, 3. Find the marginal p.d.f.s of X and Y and the conditional p.d.f.s. (Note: A distribution with the above p.d.f. is called a bivariate beta distribution.)

10. Let the random vector X = (X1, X2) have the joint p.m.f.

f_X(x1, x2) = (x1 + 2x2)/18, if (x1, x2) ∈ {1, 2} × {1, 2}; 0, otherwise.

Determine the conditional mean and conditional variance of X2 given X1 = x1, x1 ∈ {1, 2}.

11. Let X = (X1, X2, X3) be a random vector with joint p.m.f.

f_{X1,X2,X3}(x1, x2, x3) = 1/4, if (x1, x2, x3) ∈ S; 0, otherwise,

where S = {(1,0,0), (0,1,0), (0,0,1), (1,1,1)}.

(i) Are X1, X2, X3 independent?
(ii) Are X1, X2, X3 pairwise independent?
(iii) Are X1 + X2 and X3 independent?
12. Let X and Y be two random variables such that P({X ∈ {0,1}}) = P({Y ∈ {0,1}}) = 1. If P({X = 1, Y = 1}) = P({X = 1})P({Y = 1}), show that X and Y are independent random variables.

13. Five cards are drawn at random without replacement from a deck of 52 cards. Let the random variables X1, X2 and X3, respectively, denote the number of spades, the number of hearts and the number of diamonds among the five drawn cards.
(i) Find the joint p.m.f. of (X1, X2, X3);
(ii) Are the random variables X1, X2 and X3 independent?

14. Consider a sample of size 3 drawn with replacement from an urn containing 3 white, 2 black and 3 red balls. Let the random variables X1 and X2, respectively, denote the number of white balls and the number of black balls in the sample. Determine whether or not X1 and X2 are independent.

15. The joint p.d.f. of (X, Y) is given by

f_{X,Y}(x, y) = 4xy, if 0 < x < 1, 0 < y < 1; 0, otherwise.

(i) Verify whether X and Y are independent;
(ii) Find the marginal p.d.f.s of X and Y;
(iii) Find P({0 < X < 1/2, 1/4 < Y < 1}) and P({X + Y < 1}).

16. The joint p.d.f. of (X, Y) is given by

f_{X,Y}(x, y) = c e^{−(x+3y)}, if 0 < x < y < ∞; 0, otherwise,

where c is a real constant.

(i) Find the value of the constant c;
(ii) Verify whether X and Y are independent;
(iii) Find the marginal p.d.f.s of X and Y;
(iv) Find P({X < Y/2}).

17. Let f and g be two p.d.f.s with respective distribution functions F and G. Define h: ℝ² → [0, ∞) as

h(x, y) = [1 + α{2F(x) − 1}{2G(y) − 1}] f(x) g(y),

where α ∈ [−1, 1].

(i) Show that h is a p.d.f. of some random vector (X, Y);
(ii) Show that the marginal p.d.f.s of X and Y are f and g, respectively;
(iii) Does there exist a value of α ∈ [−1, 1] such that X and Y are independent?

18. Let X = (X1, X2, X3) be a random vector with joint p.d.f.

f_X(x1, x2, x3) = (2π)^{−3/2} e^{−(x1²+x2²+x3²)/2} [1 + x1 x2 x3 e^{−(x1²+x2²+x3²)/2}], x_i ∈ ℝ, i = 1, 2, 3.

(i) Are X1, X2, X3 independent?
(ii) Are X1, X2, X3 pairwise independent?
(iii) Find the marginal p.d.f.s of (X1, X2), (X1, X3), and (X2, X3).

19. A point X1 is chosen at random from the interval (0, 1) and then a point X2 is chosen at random from the interval (0, X1). Compute P({X1 + X2 ≥ 1}) and find the conditional mean E(X2 | X1 = x1), x1 ∈ (0, 1).

20. With the help of a counterexample, show that if the random variables X1 and X2 are uncorrelated then, in general, this does not imply that X1 and X2 are independent.

21. Let X = (X1, X2) be a random vector having the p.d.f.

f(x1, x2) = 1/(2 x1² x2), if 1 < x1 < ∞, 1/x1 < x2 < x1; 0, otherwise.

(i) Find the marginal p.d.f.s of X1 and X2;
(ii) Find the conditional means and variances of X1 given X2 = x2 (x2 ∈ (0, ∞)) and of X2 given X1 = x1 (x1 ∈ (1, ∞));
(iii) Are X1 and X2 independent random variables?
(iv) Find Corr(X1, X2);
(v) Find P({X2 < 1/2} | X1 = 2);
(vi) Find P({X2 < 1/2} | X1 > 3).




22. Let (X, Y) be a random vector such that the p.d.f. of X is

f_X(x) = 4x(1 − x²), if 0 < x < 1; 0, otherwise,

and, for fixed x ∈ (0, 1), the conditional p.d.f. of Y given X = x is

f_{Y|X}(y|x) = 2y/(1 − x²), if x < y < 1; 0, otherwise.

(i) For y ∈ (0, 1), find the conditional p.d.f. of X given Y = y;
(ii) Find E(X | Y = 1/2) and Var(X | Y = 1/2);
(iii) Find P({0 < Y < 1/3}) and P({1/3 < Y < 2/3} | X = 1/2).

23. Let (X, Y) be a random vector with joint p.m.f. f_{X,Y}(x, y) given by:

y↓ x→ | 1 2 3 4
4 | .08 .11 .09 .03
5 | .04 .12 .21 .05
6 | .09 .06 .08 .04

(i) Find the conditional p.m.f. of X, given Y = 5;
(ii) Find the probabilities P({X + Y ≤ 8}), P({X + Y > 7}), P({XY ≤ 14}), P({XY > 18}), P({X = 3} | Y = 5) and P({Y = 5} | X = 3);
(iii) Find Corr(X, Y).

24. Let X1, …, Xn be n random variables with E(Xi) = μi, Var(Xi) = σi² and ρij = Corr(Xi, Xj), i, j = 1, …, n, i ≠ j. For real numbers ai, bi, i = 1, …, n, define Y = Σ_{i=1}^{n} ai Xi and Z = Σ_{i=1}^{n} bi Xi. Find Cov(Y, Z).

25. Let X1, X2 and X3 be three independent random variables, each with variance σ². Define the random variables

Z1 = X1, Z2 = ((√3 − 1)/2) X1 + ((3 − √3)/2) X2, and Z3 = (√2 − 1) X1 + (2 − √2) X3.

Find Corr(Z1, Z2), Corr(Z1, Z3) and Corr(Z2, Z3).

26. Let X and Y be jointly distributed random variables with E(X) = E(Y) = 0, E(X²) = E(Y²) = 2 and Corr(X, Y) = 1/3. Find Corr(X/3 + Y/2, X/2 + Y/3).

27. Let (X, Y) have the joint p.m.f. given by:

(x, y): (1,1) (1,2) (1,3) (2,1) (2,2) (2,3)
f_{X,Y}(x, y): 2/15 4/15 3/15 1/15 1/15 4/15

and f_{X,Y}(x, y) = 0, elsewhere. Find ρ = Corr(X, Y).
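For finite supports like this, ρ can be computed directly from the table of probabilities. A small helper, written for this specific p.m.f.:

```python
from math import sqrt

pmf = {(1, 1): 2/15, (1, 2): 4/15, (1, 3): 3/15,
       (2, 1): 1/15, (2, 2): 1/15, (2, 3): 4/15}

def corr(pmf):
    # correlation of (X, Y) from a joint p.m.f. given as {(x, y): p}
    Ex  = sum(x * p for (x, y), p in pmf.items())
    Ey  = sum(y * p for (x, y), p in pmf.items())
    Exy = sum(x * y * p for (x, y), p in pmf.items())
    Vx  = sum(x * x * p for (x, y), p in pmf.items()) - Ex**2
    Vy  = sum(y * y * p for (x, y), p in pmf.items()) - Ey**2
    return (Exy - Ex * Ey) / sqrt(Vx * Vy)

print(round(corr(pmf), 4))  # 0.2469
```

The exact value works out to 21/√(54 · 134) with all moments taken over the six support points.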

28. Let X1, X2 and X3 be three random variables with means, variances and correlation coefficients denoted by μ1, μ2, μ3; σ1², σ2², σ3² and ρ12, ρ13, ρ23, respectively. If E((X1 − μ1) | X2 = x2, X3 = x3) = b2(x2 − μ2) + b3(x3 − μ3), for some constants b2 and b3, determine b2 and b3 in terms of the variances and correlation coefficients.

29. Let X1, …, Xn denote a random sample, where X1, …, Xn are positive with probability one. Show that

E((X1 + X2 + ⋯ + Xk)/(X1 + X2 + ⋯ + Xn)) = k/n, k = 1, 2, …, n.

28. Let X1, …, Xn be a random sample of absolutely continuous type random variables. If the expectation of X1 is finite and the distribution of X1 is symmetric about μ ∈ (−∞, ∞), then show that

(i) X_{j:n} − μ and μ − X_{n−j+1:n} have the same distribution, j = 1, …, n;
(ii) E(X_{j:n} + X_{n−j+1:n}) = 2μ, j = 1, …, n;
(iii) E(X_{(n+1)/2:n}) = μ, if n is odd;
(iv) P(X_{(n+1)/2:n} > μ) = 1/2, if n is odd.

29. Let X1, …, Xn be a random sample and let E(X1) be finite.

(i) Find the conditional expectation E(X1 | X1 + ⋯ + Xn = s), where s ∈ ℝ is such that the conditional expectation is defined;
(ii) If X1 is of absolutely continuous type and (π1, …, πn) is a permutation of (1, …, n), find P(X_{π1} < ⋯ < X_{πn}).

30. Let X1 and X2 be i.i.d. N(0, 1) random variables and let Y = X1 + X2, Z = X1² + X2².

(i) Show that the m.g.f. of (Y, Z) is M_{Y,Z}(t1, t2) = (1 − 2t2)^{−1} e^{t1²/(1−2t2)}, t2 < 1/2;
(ii) Using (i), find Corr(Y, Z).

31. Suppose that the lifetimes of electric bulbs manufactured by a manufacturer follow the exponential distribution with a mean of 50 hours. Eight such bulbs are chosen at random.
(i) Find the probability that, among the eight chosen bulbs, 2 will last less than 40 hours, 3 will last anywhere between 40 and 60 hours, 2 will last anywhere between 60 and 80 hours and 1 will last more than 80 hours;
(ii) Find the expected number of bulbs in the lot of 8 chosen bulbs with lifetime between 60 and 80 hours;
(iii) Find the expected number of bulbs in the lot of 8 chosen bulbs with lifetime between 60 and 80 hours, given that the number of bulbs in the lot with lifetime anywhere between 40 and 60 hours is 2.
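With lifetime T ~ Exp(mean 50), the four cell probabilities are p1 = P(T < 40) = 1 − e^{−0.8}, p2 = e^{−0.8} − e^{−1.2}, p3 = e^{−1.2} − e^{−1.6} and p4 = e^{−1.6}, and part (i) is a multinomial probability. A sketch of the computation:

```python
from math import exp, factorial

# Cell probabilities for an exponential lifetime with mean 50 hours
p = [1 - exp(-0.8),            # P(T < 40)
     exp(-0.8) - exp(-1.2),    # P(40 < T < 60)
     exp(-1.2) - exp(-1.6),    # P(60 < T < 80)
     exp(-1.6)]                # P(T > 80)
counts = [2, 3, 2, 1]

# multinomial coefficient 8! / (2! 3! 2! 1!)
coef = factorial(8)
for c in counts:
    coef //= factorial(c)

prob = coef * p[0]**2 * p[1]**3 * p[2]**2 * p[3]**1
print(prob)          # part (i)
print(8 * p[2])      # part (ii): expected count in (60, 80)
```

Part (ii) uses the fact that each multinomial cell count is marginally Bin(8, p3).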

32. Suppose that X ~ Mult(30, θ1, θ2, θ3, θ4). Find the conditional probability mass function of (X1, X2, X3, X4) given that X1 + X2 + X3 + X4 = 28.

33. Let X = (X1, X2) have the joint p.d.f.

f_X(x1, x2) = φ(x1)φ(x2)[1 + α(2Φ(x1) − 1)(2Φ(x2) − 1)], x_i ∈ ℝ, i = 1, 2,

where |α| ≤ 1.
(i) Verify that f_X(x1, x2) is a p.d.f.;
(ii) Find the marginal p.d.f.s of X1 and X2;
(iii) Is (X1, X2) jointly normal?

34. Let X = (X1, X2) ~ N2(μ1, μ2, σ1², σ2², ρ) and, for real constants a1, a2, a3 and a4 (a_i ≠ 0, i = 1, 2, 3, 4, a1a4 ≠ a2a3), let Y = a1X1 + a2X2 and Z = a3X1 + a4X2.
(i) Find the joint p.d.f. of (Y, Z);
(ii) Find the marginal p.d.f.s of Y and Z.

35. Let X and Y be i.i.d. N(0, σ²) random variables.

(i) Find the joint p.d.f. of (U, V), where U = aX + bY and V = bX − aY (a ≠ 0, b ≠ 0);
(ii) Show that U and V are independent;
(iii) Show that (X + Y)/√2 and (X − Y)/√2 are i.i.d. N(0, σ²) random variables.

36. Let X = (X1, X2) ~ N2(0, 0, 1, 1, ρ).

(i) Find the m.g.f. of Y = X1X2;
(ii) Using (i), find E(X1²X2²);
(iii) Using the conditional distribution of X2 given X1, find E(X1²X2²).

37. Let X = (X1, X2) have the joint p.d.f.

f(x, y) = (1/π) e^{−(x²+y²)/2}, if xy > 0; 0, otherwise.

Show that X_i ~ N(0, 1), i = 1, 2, but X = (X1, X2) does not have a bivariate normal distribution.
38. For fixed ρ ∈ (−1, 1) and α ∈ (0, 1), let the random vector (X, Y) have the joint p.d.f.

g_α(x, y) = α f_ρ(x, y) + (1 − α) f_{−ρ}(x, y),

where f_s(⋅,⋅), −1 < s < 1, denotes the p.d.f. of N2(0, 0, 1, 1, s). Show that X and Y are normally distributed but the distribution of (X, Y) is not bivariate normal.

39. Consider the random vector (X, Y) as defined in Problem 38.

(i) Find Corr(X, Y);
(ii) Are X and Y independent?

40. Suppose that X ~ N2(0, 0, 1, 1, 0). Find c such that P(−c ≤ X1 ≤ c, −c ≤ X2 ≤ c) = 0.95.
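Since ρ = 0 makes X1 and X2 independent, the requirement reduces to (2Φ(c) − 1)² = 0.95, i.e. Φ(c) = (1 + √0.95)/2. A sketch that solves this by bisection using math.erf:

```python
from math import erf, sqrt

def Phi(z):
    # standard normal c.d.f.
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

target = (1.0 + sqrt(0.95)) / 2.0

# bisection: Phi is increasing, so bracket and halve
lo, hi = 0.0, 10.0
for _ in range(100):
    mid = (lo + hi) / 2.0
    if Phi(mid) < target:
        lo = mid
    else:
        hi = mid
c = (lo + hi) / 2.0

print(c)                         # about 2.24
print((2 * Phi(c) - 1) ** 2)     # about 0.95
```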

41. (i) Let (X, Y) ~ N2(5, 8, 16, 9, 0.6). Find P({5 < Y < 11} | X = 2), P({4 < X < 6}) and P({7 < Y < 9});
(ii) Let (X, Y) ~ N2(5, 10, 1, 25, ρ), where ρ > 0. If P({4 < Y < 16} | X = 5) = 0.954, determine ρ.

42. (i) Let X ~ Bin(n1, p) and Y ~ Bin(n2, p) be independent random variables. For t ∈ {0, 1, …, min(n1, n2)}, find the conditional distribution and conditional mean of X given X + Y = t.

(ii) Let X ~ P(λ1) and Y ~ P(λ2) be independent random variables. For t ∈ {0, 1, …}, find the conditional distribution and conditional mean of X given X + Y = t.

43. Let X and Y be independent random variables with respective p.d.f.s

f_X(x) = 1/3, if 1 ≤ x ≤ 4; 0, otherwise,

and

f_Y(y) = e^{−(y−2)}, if y ≥ 2; 0, otherwise.

Find the distribution function of U = X/Y and hence find the p.d.f. of U.

44. Let X and Y be i.i.d. U(0, 1) random variables. Find the marginal p.d.f.s of
(i) X + Y, X − Y, XY, |X − Y| and (X + Y)/(X − Y);
(ii) min(X, Y), max(X, Y) and min(X, Y)/max(X, Y);
(iii) X² + Y².

45. Let U ~ Be(α1, α2) and V ~ G(α1 + α2, p) be independent random variables. Using Example 10.2.11, show that UV ~ G(α1, p).

46. Let X and Y be i.i.d. random variables with common p.d.f. f(x) = c/(1 + x⁴), −∞ < x < ∞, where c is the normalizing constant. Find the p.d.f. of Z = Y/X.

47. Let X and Y be i.i.d. N(0, 1) random variables. Define the random variables R and Θ by X = R cos Θ, Y = R sin Θ.

(i) Show that R and Θ are independent with R²/2 ~ Exp(1) and Θ ~ U(0, 2π);
(ii) Show that X² + Y² and X/Y are independently distributed;
(iii) Show that sin Θ and sin 2Θ are identically distributed and hence find the p.d.f. of U = Y/√(X² + Y²);
(iv) Find the distribution of W = (3X²Y − Y³)/(X² + Y²)^{3/2}.

48. Let U1 and U2 be i.i.d. U(0, 1) random variables. Show that X1 = √(−2 ln U1) cos(2πU2) and X2 = √(−2 ln U1) sin(2πU2) are i.i.d. N(0, 1) random variables. (This is known as the Box–Muller transformation.)
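The transformation is easy to check empirically; a minimal implementation with sample-moment checks (sample size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000
u1 = rng.uniform(size=n)
u2 = rng.uniform(size=n)

# Box-Muller: two independent U(0,1)s -> two independent N(0,1)s
r = np.sqrt(-2.0 * np.log(u1))
x1 = r * np.cos(2.0 * np.pi * u2)
x2 = r * np.sin(2.0 * np.pi * u2)

print(x1.mean(), x1.var())          # close to 0 and 1
print(np.corrcoef(x1, x2)[0, 1])    # close to 0
```

Zero sample correlation is of course only consistent with, not proof of, independence; the problem asks for the full change-of-variables argument.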

49. Let (X, Y) ~ N2(0, 0, 1, 1, ρ).

(i) Show that P({X > 0, Y > 0}) = 1/4 + sin⁻¹(ρ)/(2π). Also find P({X < 0, Y < 0}), P({X > 0, Y < 0}) and P({X < 0, Y > 0});
(ii) Show that P({XY > 0}) = 1/2 + sin⁻¹(ρ)/π and P({XY < 0}) = 1/2 − sin⁻¹(ρ)/π.

50. Let X1, …, Xn be a random sample from the Exp(1) distribution.

(i) Find the marginal distributions of Y1, …, Yn, where

Y_i = (Σ_{j=1}^{i} X_j)/(Σ_{j=1}^{i+1} X_j), i = 1, …, n − 1, and Y_n = X1 + ⋯ + Xn;

(ii) Are Y1, …, Yn independent?

51. Let X1, X2, X3 be i.i.d. G(α, 1) random variables. Let U1 = X1 + X2 + X3, U2 = X2/(X1 + X2 + X3) and U3 = X3/(X1 + X2 + X3).

(i) Show that U1 and (U2, U3) are independent and find the marginal p.d.f.s of U1, U2 and U3;
(ii) Find E(U1U2U3).

52. Let X1 and X2 be independent random variables with X_i ~ Bin(n_i, 1/2), i = 1, 2. Using the m.g.f. technique, find the distribution of Y = X1 − X2 + n2.

53. Let X_{1:n} ≤ X_{2:n} ≤ ⋯ ≤ X_{n:n} be the set of order statistics associated with a random sample of size n (≥ 2) from the Exp(1) distribution.
(i) Let Z1 = nX_{1:n} and Z_i = (n − i + 1)(X_{i:n} − X_{i−1:n}), i = 2, …, n. Show that Z1, …, Zn are i.i.d. Exp(1) random variables;
(ii) Using (i), or otherwise, find E(X_{r:n}), Var(X_{r:n}) and Cov(X_{r:n}, X_{s:n}), 1 ≤ r < s ≤ n;
(iii) Show that X_{r:n} and X_{s:n} − X_{r:n} are independent for any s > r;
(iv) Find the p.d.f. of X_{r+1:n} − X_{r:n}, r = 1, 2, …, n − 1.

54. Let X1, …, Xn be i.i.d. non-negative random variables (P({X1 ≥ 0}) = 1) of the absolutely continuous type. If E(|X1|) < ∞ and T_n = max(X1, …, Xn), show that

E(T_n) = E(T_{n−1}) + ∫₀^∞ (F(x))^{n−1} (1 − F(x)) dx.

55. Let X_{1:n} ≤ X_{2:n} ≤ ⋯ ≤ X_{n:n} be the order statistics associated with a random sample of size n (≥ 2) from the U(0, 1) distribution. Let Y_i = X_{i:n}/X_{i+1:n}, i = 1, …, n − 1, and Y_n = X_{n:n}. Show that Y1, …, Yn are independent and find the p.d.f. of Y_i, i = 1, …, n.
