



Schaum's Outline of

Theory and Problems of

Probability, Random Variables, and Random

Processes

Hwei P. Hsu, Ph.D.

Professor of Electrical Engineering

Fairleigh Dickinson University

McGraw-Hill Professional, 1997

Preface

The purpose of this book is to provide an introduction to principles of

probability, random variables, and random processes and their applications.

The book is designed for students in various disciplines of engineering,

science, mathematics, and management. It may be used as a textbook and/or as

a supplement to all current comparable texts. It should also be useful to those

interested in the field for self-study. The book combines the advantages of both

the textbook and the so-called review book. It provides the textual explanations

of the textbook, and in the direct way characteristic of the review book, it gives

hundreds of completely solved problems that use essential theory and

techniques. Moreover, the solved problems are an integral part of the text. The

background required to study the book is one year of calculus, elementary

differential equations, matrix analysis, and some signal and system theory,

including Fourier transforms.

I wish to thank Dr. Gordon Silverman for his invaluable suggestions and

critical review of the manuscript. I also wish to express my appreciation to the

editorial staff of the McGraw-Hill Schaum Series for their care, cooperation,

and attention devoted to the preparation of the book. Finally, I thank my wife,

Daisy, for her patience and encouragement.

HWEI P. HSU

MONTVILLE, NEW JERSEY


  • Chapter 1. Probability
    • 1.1 Introduction
    • 1.2 Sample Space and Events
    • 1.3 Algebra of Sets
    • 1.4 The Notion and Axioms of Probability
    • 1.5 Equally Likely Events
    • 1.6 Conditional Probability
    • 1.7 Total Probability
    • 1.8 Independent Events
    • Solved Problems
  • Chapter 2. Random Variables
    • 2.1 Introduction
    • 2.2 Random Variables
    • 2.3 Distribution Functions
    • 2.4 Discrete Random Variables and Probability Mass Functions
    • 2.5 Continuous Random Variables and Probability Density Functions
    • 2.6 Mean and Variance
    • 2.7 Some Special Distributions
    • 2.8 Conditional Distributions
    • Solved Problems
  • Chapter 3. Multiple Random Variables
    • 3.1 Introduction
    • 3.2 Bivariate Random Variables
    • 3.3 Joint Distribution Functions
    • 3.4 Discrete Random Variables - Joint Probability Mass Functions
    • 3.5 Continuous Random Variables - Joint Probability Density Functions
    • 3.6 Conditional Distributions
    • 3.7 Covariance and Correlation Coefficient
    • 3.8 Conditional Means and Conditional Variances
    • 3.9 N-Variate Random Variables
    • 3.10 Special Distributions
    • Solved Problems
  • Chapter 4. Functions of Random Variables, Expectation, Limit Theorems
    • 4.1 Introduction
    • 4.2 Functions of One Random Variable
    • 4.3 Functions of Two Random Variables
    • 4.4 Functions of n Random Variables
    • 4.5 Expectation
    • 4.6 Moment Generating Functions
    • 4.7 Characteristic Functions
    • 4.8 The Laws of Large Numbers and the Central Limit Theorem
    • Solved Problems
  • Chapter 5. Random Processes
    • 5.1 Introduction
    • 5.2 Random Processes
    • 5.3 Characterization of Random Processes
    • 5.4 Classification of Random Processes
    • 5.5 Discrete-Parameter Markov Chains
    • 5.6 Poisson Processes
    • 5.7 Wiener Processes
    • Solved Problems
  • Chapter 6. Analysis and Processing of Random Processes
    • 6.1 Introduction
    • 6.2 Continuity, Differentiation, Integration
    • 6.3 Power Spectral Densities
    • 6.4 White Noise
    • 6.5 Response of Linear Systems to Random Inputs
    • 6.6 Fourier Series and Karhunen-Loève Expansions
    • 6.7 Fourier Transform of Random Processes
    • Solved Problems
  • Chapter 7. Estimation Theory
    • 7.1 Introduction
    • 7.2 Parameter Estimation
    • 7.3 Properties of Point Estimators
    • 7.4 Maximum-Likelihood Estimation
    • 7.5 Bayes' Estimation
    • 7.6 Mean Square Estimation
    • 7.7 Linear Mean Square Estimation
    • Solved Problems
  • Chapter 8. Decision Theory
    • 8.1 Introduction
    • 8.2 Hypothesis Testing
    • 8.3 Decision Tests
    • Solved Problems
  • Chapter 9. Queueing Theory
    • 9.1 Introduction
    • 9.2 Queueing Systems
    • 9.3 Birth-Death Process
    • 9.4 The M/M/1 Queueing System
    • 9.5 The M/M/s Queueing System
    • 9.6 The M/M/1/K Queueing System
    • 9.7 The M/M/s/K Queueing System
    • Solved Problems
  • Appendix A. Normal Distribution
  • Appendix B. Fourier Transform
    • B.1 Continuous-Time Fourier Transform
    • B.2 Discrete-Time Fourier Transform
  • Index

PROBABILITY [CHAP 1

A sample space S is said to be discrete if it consists of a finite number of sample points (as in Example 1.1) or countably infinite sample points (as in Example 1.2). A set is called countable if its elements can be placed in a one-to-one correspondence with the positive integers. A sample space S is said to be continuous if the sample points constitute a continuum (as in Example 1.3).

C. Events:

Since we have identified a sample space S as the set of all possible outcomes of a random experiment, we will review some set notations in the following. If ζ is an element of S (or belongs to S), then we write

ζ ∈ S

If ζ is not an element of S (or does not belong to S), then we write

ζ ∉ S

A set A is called a subset of B, denoted by

A ⊂ B

if every element of A is also an element of B. Any subset of the sample space S is called an event. A sample point of S is often referred to as an elementary event. Note that the sample space S is the subset of itself, that is, S ⊂ S. Since S is the set of all possible outcomes, it is often called the certain event.

EXAMPLE 1.4 Consider the experiment of Example 1.2. Let A be the event that the number of tosses required until the first head appears is even. Let B be the event that the number of tosses required until the first head appears is odd. Let C be the event that the number of tosses required until the first head appears is less than 5. Express events A, B, and C.

A = {2, 4, 6, ...}, B = {1, 3, 5, ...}, and C = {1, 2, 3, 4}.

1.3 ALGEBRA OF SETS

A. Set Operations:

I. Equality:

Two sets A and B are equal, denoted A = B, if and only if A ⊂ B and B ⊂ A.

2. Complementation:

Suppose A ⊂ S. The complement of set A, denoted AĢ„, is the set containing all elements in S but not in A:

AĢ„ = {ζ : ζ ∈ S and ζ ∉ A}

3. Union:

The union of sets A and B, denoted A ∪ B, is the set containing all elements in either A or B or both.

4. Intersection:

The intersection of sets A and B, denoted A ∩ B, is the set containing all elements in both A and B.


5. Null Set:

The set containing no element is called the null set, denoted ∅. Note that ∅ ⊂ A ⊂ S.

6. Disjoint Sets:

Two sets A and B are called disjoint or mutually exclusive if they contain no common element,

that is, if A ∩ B = ∅.

The definitions of the union and intersection of two sets can be extended to any finite number of sets as follows:

⋃_{i=1}^n A_i = A_1 ∪ A_2 ∪ ⋯ ∪ A_n = {ζ : ζ ∈ A_1 or ζ ∈ A_2 or ⋯ or ζ ∈ A_n}

⋂_{i=1}^n A_i = A_1 ∩ A_2 ∩ ⋯ ∩ A_n = {ζ : ζ ∈ A_1 and ζ ∈ A_2 and ⋯ and ζ ∈ A_n}

Note that these definitions can be extended to an infinite number of sets:

In our definition of event, we state that every subset of S is an event, including S and the null set ∅. Then

S = the certain event
∅ = the impossible event

If A and B are events in S, then

AĢ„ = the event that A did not occur
A ∪ B = the event that either A or B or both occurred
A ∩ B = the event that both A and B occurred

Similarly, if A_1, A_2, ..., A_n is a sequence of events in S, then

⋃_{i=1}^n A_i = the event that at least one of the A_i occurred
⋂_{i=1}^n A_i = the event that all of the A_i occurred

B. Venn Diagram:

A graphical representation that is very useful for illustrating set operations is the Venn diagram.

For instance, in the three Venn diagrams shown in Fig. 1-1, the shaded areas represent, respectively, the events A ∪ B, A ∩ B, and AĢ„. The Venn diagram in Fig. 1-2 indicates that B ⊂ A, and the event A ∩ BĢ„ is shown as the shaded area.

Distributive Laws:

A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)
A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C)

De Morgan's Laws:

(A ∪ B)‾ = AĢ„ ∩ BĢ„
(A ∩ B)‾ = AĢ„ ∪ BĢ„

These relations are verified by showing that any element that is contained in the set on the left side of the equality sign is also contained in the set on the right side, and vice versa. One way of showing this is by means of a Venn diagram (Prob. 1.13). The distributive laws can be extended as follows:

A ∩ (⋃_{i=1}^n B_i) = ⋃_{i=1}^n (A ∩ B_i)
A ∪ (⋂_{i=1}^n B_i) = ⋂_{i=1}^n (A ∪ B_i)

Similarly, De Morgan's laws also can be extended as follows (Prob. 1.17):

(⋃_{i=1}^n A_i)‾ = ⋂_{i=1}^n AĢ„_i
(⋂_{i=1}^n A_i)‾ = ⋃_{i=1}^n AĢ„_i
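These identities are easy to spot-check numerically. The following sketch uses plain Python sets; the universe S and the sets A, B, C are arbitrary choices of ours, not from the text, and any choice would do since the laws are identities.

```python
# Spot-check of the distributive and De Morgan laws on small finite sets.
# S, A, B, C are arbitrary illustrative choices.
S = set(range(10))
A, B, C = {0, 1, 2, 3}, {2, 3, 4, 5}, {3, 5, 7, 9}

def comp(X):
    """Complement of X relative to the universe S."""
    return S - X

# Distributive laws
assert A & (B | C) == (A & B) | (A & C)
assert A | (B & C) == (A | B) & (A | C)

# De Morgan's laws
assert comp(A | B) == comp(A) & comp(B)
assert comp(A & B) == comp(A) | comp(B)

print("all four identities hold")
```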

1.4 THE NOTION AND AXIOMS OF PROBABILITY

An assignment of real numbers to the events defined in a sample space S is known as the probability measure. Consider a random experiment with a sample space S, and let A be a particular event defined in S.

A. Relative Frequency Definition:

Suppose that the random experiment is repeated n times. If event A occurs n(A) times, then the probability of event A, denoted P(A), is defined as

P(A) = lim_{n→āˆž} n(A)/n

where n(A)/n is called the relative frequency of event A. Note that this limit may not exist, and in addition, there are many situations in which the concept of repeatability may not be valid. It is clear that for any event A, the relative frequency of A will have the following properties:

1. 0 ≤ n(A)/n ≤ 1, where n(A)/n = 0 if A occurs in none of the n repeated trials and n(A)/n = 1 if A occurs in all of the n repeated trials.

2. If A and B are mutually exclusive events, then

n(A ∪ B) = n(A) + n(B)

and

n(A ∪ B)/n = n(A)/n + n(B)/n
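The relative-frequency definition can be illustrated by simulation. The sketch below uses only the standard library; the event A = "a fair die shows an even number" (true probability 1/2) is an illustrative choice of ours, and the printed relative frequencies settle near P(A) as n grows.

```python
import random

# Relative-frequency illustration: A = "a fair die shows an even number",
# so P(A) = 1/2.  The die and the event are illustrative choices.
random.seed(0)  # fixed seed for reproducibility

def rel_freq(n):
    """Return n(A)/n after n simulated rolls of a fair die."""
    n_A = sum(1 for _ in range(n) if random.randint(1, 6) % 2 == 0)
    return n_A / n

for n in (100, 10_000, 100_000):
    print(n, rel_freq(n))  # tends toward 0.5 as n grows
```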


B. Axiomatic Definition:

Let S be a finite sample space and A be an event in S. Then in the axiomatic definition, the probability P(A) of the event A is a real number assigned to A which satisfies the following three axioms:

Axiom 1: P(A) ≥ 0 (1.21)

Axiom 2: P(S) = 1 (1.22)

Axiom 3: P(A ∪ B) = P(A) + P(B) if A ∩ B = ∅ (1.23)

If the sample space S is not finite, then axiom 3 must be modified as follows:

Axiom 3': If A_1, A_2, ... is an infinite sequence of mutually exclusive events in S (A_i ∩ A_j = ∅ for i ≠ j), then

P(⋃_{i=1}^āˆž A_i) = Σ_{i=1}^āˆž P(A_i)

These axioms satisfy our intuitive notion of probability measure obtained from the notion of relative frequency.

C. Elementary Properties of Probability:

By using the above axioms, the following useful properties of probability can be obtained:

6. If A_1, A_2, ..., A_n are n arbitrary events in S, then

P(⋃_{i=1}^n A_i) = Σ_i P(A_i) āˆ’ Σ_{i<j} P(A_i ∩ A_j) + Σ_{i<j<k} P(A_i ∩ A_j ∩ A_k) āˆ’ ⋯ + (āˆ’1)^{nāˆ’1} P(A_1 ∩ A_2 ∩ ⋯ ∩ A_n) (1.30)

where the sum of the second term is over all distinct pairs of events, that of the third term is over all distinct triples of events, and so forth.

7. If A_1, A_2, ..., A_n is a finite sequence of mutually exclusive events in S (A_i ∩ A_j = ∅ for i ≠ j), then

P(⋃_{i=1}^n A_i) = Σ_{i=1}^n P(A_i)

and a similar equality holds for any subcollection of the events.
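The inclusion-exclusion formula (1.30) can be checked by brute force on an equally likely sample space. In the sketch below, the 12-point space and the three events are arbitrary illustrative choices of ours; the direct probability of the union is compared against the alternating sum.

```python
from itertools import combinations

# Brute-force check of inclusion-exclusion, Eq. (1.30), on an equally
# likely sample space.  The 12-point space and the events are illustrative.
S = set(range(1, 13))
events = [{1, 2, 3, 4, 5, 6}, {4, 5, 6, 7, 8}, {2, 4, 6, 8, 10, 12}]

def P(E):
    """Probability of E under the equally likely assignment on S."""
    return len(E) / len(S)

lhs = P(set().union(*events))  # P(A1 ∪ A2 ∪ A3) computed directly
rhs = 0.0                      # the inclusion-exclusion alternating sum
for k in range(1, len(events) + 1):
    sign = (-1) ** (k - 1)
    for combo in combinations(events, k):
        rhs += sign * P(set.intersection(*combo))

assert abs(lhs - rhs) < 1e-12
print(lhs, rhs)
```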

Note that property 4 can be easily derived from axiom 2 and property 3. Since A ⊂ S, we have

P(A) ≤ P(S) = 1


P(B | A) = P(A ∩ B)/P(A) (1.40)

is the conditional probability of an event B given event A. From Eqs. (1.39) and (1.40), we have

P(A n B) = P(A I B)P(B) = P(B I A)P(A) (1.41)

Equation (1.41) is often quite useful in computing the joint probability of events.
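As a quick numerical illustration of Eq. (1.41), consider drawing two cards without replacement from a standard 52-card deck; this example is ours, not from the text.

```python
from fractions import Fraction

# Eq. (1.41): P(A ∩ B) = P(B | A) P(A).
# Illustrative example: draw two cards without replacement,
# A = "first card is an ace", B = "second card is an ace".
P_A = Fraction(4, 52)          # 4 aces among 52 cards
P_B_given_A = Fraction(3, 51)  # 3 aces left among the remaining 51 cards
P_A_and_B = P_B_given_A * P_A
print(P_A_and_B)               # 1/221
```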

B. Bayes' Rule:

From Eq. (1.41) we can obtain the following Bayes' rule:

P(A | B) = P(B | A)P(A)/P(B) (1.42)

1.7 TOTAL PROBABILITY

The events A_1, A_2, ..., A_n are called mutually exclusive and exhaustive if

⋃_{i=1}^n A_i = A_1 ∪ A_2 ∪ ⋯ ∪ A_n = S   and   A_i ∩ A_j = ∅ for i ≠ j

Let B be any event in S. Then

P(B) = Σ_{i=1}^n P(B | A_i)P(A_i) (1.44)

which is known as the total probability of event B (Prob. 1.47). Let A = A_i in Eq. (1.42); then, using Eq. (1.44), we obtain

P(A_i | B) = P(B | A_i)P(A_i) / Σ_{k=1}^n P(B | A_k)P(A_k) (1.45)

Note that the terms on the right-hand side are all conditioned on events A i , while the term on the left is conditioned on B. Equation (1.45) is sometimes referred to as Bayes' theorem.
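Equations (1.44) and (1.45) can be illustrated with a small two-machine example; the machines and the defect rates below are hypothetical numbers chosen for illustration only.

```python
from fractions import Fraction

# Total probability, Eq. (1.44), and Bayes' theorem, Eq. (1.45), for a
# two-event partition.  The machines and defect rates are hypothetical.
P_A = [Fraction(6, 10), Fraction(4, 10)]            # P(A1), P(A2)
P_B_given_A = [Fraction(1, 100), Fraction(3, 100)]  # P(B | Ai), B = "defective"

# Eq. (1.44): P(B) = sum over i of P(B | Ai) P(Ai)
P_B = sum(pb * pa for pb, pa in zip(P_B_given_A, P_A))

# Eq. (1.45): posterior probability that a defective part came from A1
P_A1_given_B = P_B_given_A[0] * P_A[0] / P_B

print(P_B)           # 9/500
print(P_A1_given_B)  # 1/3
```

Note how the denominator of the posterior is exactly the total probability P(B): the terms conditioned on the A_i combine into the single term conditioned on B.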

1.8 INDEPENDENT EVENTS

Two events A and B are said to be (statistically) independent if and only if

P(A ∩ B) = P(A)P(B) (1.46)

It follows immediately that if A and B are independent, then by Eqs. (1.39) and (1.40),

P(A | B) = P(A) and P(B | A) = P(B) (1.47)

If two events A and B are independent, then it can be shown that A and BĢ„ are also independent; that is (Prob. 1.53),

P(A ∩ BĢ„) = P(A)P(BĢ„) (1.48)

Then

P(A | BĢ„) = P(A ∩ BĢ„)/P(BĢ„) = P(A) (1.49)

Thus, if A is independent of B, then the probability of A's occurrence is unchanged by information as to whether or not B has occurred. Three events A, B, C are said to be independent if and only if

P(A ∩ B) = P(A)P(B)
P(A ∩ C) = P(A)P(C)
P(B ∩ C) = P(B)P(C)
P(A ∩ B ∩ C) = P(A)P(B)P(C) (1.50)
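A classical illustration (an example of ours, not from this section) is two fair dice: the events "first die even," "second die even," and "sum even" satisfy the three pairwise conditions but fail the joint condition of Eq. (1.50), so pairwise independence does not imply mutual independence.

```python
from fractions import Fraction
from itertools import product

# Two fair dice: the events below are pairwise independent but not
# mutually independent, so they fail the last condition of Eq. (1.50).
S = list(product(range(1, 7), repeat=2))  # 36 equally likely outcomes

def P(event):
    """Probability of an event given as a predicate on outcomes."""
    return Fraction(sum(1 for s in S if event(s)), len(S))

A = lambda s: s[0] % 2 == 0           # first die even
B = lambda s: s[1] % 2 == 0           # second die even
C = lambda s: (s[0] + s[1]) % 2 == 0  # sum even

assert P(lambda s: A(s) and B(s)) == P(A) * P(B)  # pairwise conditions hold
assert P(lambda s: A(s) and C(s)) == P(A) * P(C)
assert P(lambda s: B(s) and C(s)) == P(B) * P(C)
# ...but the triple condition fails: A and B together force C.
assert P(lambda s: A(s) and B(s) and C(s)) != P(A) * P(B) * P(C)
print("pairwise independent, not mutually independent")
```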


We may also extend the definition of independence to more than three events. The events A_1, A_2, ..., A_n are independent if and only if for every subset {A_{i1}, A_{i2}, ..., A_{ik}} (2 ≤ k ≤ n) of these events,

P(A_{i1} ∩ A_{i2} ∩ ⋯ ∩ A_{ik}) = P(A_{i1})P(A_{i2}) ⋯ P(A_{ik}) (1.51)

Finally, we define an infinite set of events to be independent if and only if every finite subset of these events is independent. To distinguish between the mutual exclusiveness (or disjointness) and independence of a collection of events, we summarize as follows:

1. If {A_i, i = 1, 2, ..., n} is a sequence of mutually exclusive events, then

P(⋃_{i=1}^n A_i) = Σ_{i=1}^n P(A_i)

2. If {A_i, i = 1, 2, ..., n} is a sequence of independent events, then

P(⋂_{i=1}^n A_i) = ∏_{i=1}^n P(A_i)

and a similar equality holds for any subcollection of the events.

Solved Problems

SAMPLE SPACE AND EVENTS

1.1. Consider a random experiment of tossing a coin three times.

(a) Find the sample space S_1 if we wish to observe the exact sequences of heads and tails obtained.

(b) Find the sample space S_2 if we wish to observe the number of heads in the three tosses.

(a) The sample space S_1 is given by

S_1 = {HHH, HHT, HTH, THH, HTT, THT, TTH, TTT}

where, for example, HTH indicates a head on the first and third throws and a tail on the second throw. There are eight sample points in S_1.

(b) The sample space S_2 is given by

S_2 = {0, 1, 2, 3}

where, for example, the outcome 2 indicates that two heads were obtained in the three tosses. The sample space S_2 contains four sample points.

1.2. Consider an experiment of drawing two cards at random from a bag containing four cards marked with the integers 1 through 4.

(a) Find the sample space S_1 of the experiment if the first card is replaced before the second is drawn.

(b) Find the sample space S_2 of the experiment if the first card is not replaced.

(a) The sample space S_1 contains 16 ordered pairs (i, j), 1 ≤ i ≤ 4, 1 ≤ j ≤ 4, where the first number indicates the first number drawn. Thus,

S_1 = {(1, 1) (1, 2) (1, 3) (1, 4)
       (2, 1) (2, 2) (2, 3) (2, 4)
       (3, 1) (3, 2) (3, 3) (3, 4)
       (4, 1) (4, 2) (4, 3) (4, 4)}



1.6. An automobile dealer offers vehicles with the following options:

(a) With or without automatic transmission
(b) With or without air-conditioning
(c) With one of two choices of a stereo system
(d) With one of three exterior colors

If the sample space consists of the set of all possible vehicle types, what is the number of outcomes in the sample space?

The tree diagram for the different types of vehicles is shown in Fig. 1-4. From Fig. 1-4 we see that the number of sample points in S is 2 Ɨ 2 Ɨ 2 Ɨ 3 = 24.

Fig. 1-4 Tree diagram (branches for transmission, air-conditioning, stereo, and color).
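The count in Problem 1.6 can also be obtained as a Cartesian product rather than a tree diagram; the option labels below are illustrative placeholders, since only the counts matter.

```python
from itertools import product

# Problem 1.6 as a Cartesian product.  Labels are placeholders.
transmissions = ['automatic', 'manual']
air = ['with AC', 'without AC']
stereos = ['stereo 1', 'stereo 2']
colors = ['color 1', 'color 2', 'color 3']

S = list(product(transmissions, air, stereos, colors))
print(len(S))  # 24
assert len(S) == 2 * 2 * 2 * 3
```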

1.7. State every possible event in the sample space S = {a, b, c, d}.

There are 2⁓ = 16 possible events in S. They are ∅; {a}, {b}, {c}, {d}; {a, b}, {a, c}, {a, d}, {b, c}, {b, d}, {c, d}; {a, b, c}, {a, b, d}, {a, c, d}, {b, c, d}; S = {a, b, c, d}.

1.8. How many events are there in a sample space S with n elementary events?

Let S = {s_1, s_2, ..., s_n}. Let Ω be the family of all subsets of S. (Ω is sometimes referred to as the power set of S.) Let S_i be the set consisting of two statements, that is,

S_i = {Yes, the s_i is in; No, the s_i is not in}

Then Ω can be represented as the Cartesian product

Ω = S_1 Ɨ S_2 Ɨ ⋯ Ɨ S_n = {(s_1, s_2, ..., s_n) : s_i ∈ S_i for i = 1, 2, ..., n}

Since each subset of S can be uniquely characterized by an element in the above Cartesian product, we obtain the number of elements in Ω by

n(Ω) = n(S_1)n(S_2) ⋯ n(S_n) = 2ⁿ

where n(S_i) = number of elements in S_i = 2. An alternative way of finding n(Ω) is by the following summation:

n(Ω) = Σ_{i=0}^n (n choose i) = Σ_{i=0}^n n!/(i!(n āˆ’ i)!) = 2ⁿ

The proof that the last sum is equal to 2ⁿ is not easy.
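The counting argument of Problem 1.8 can be confirmed by enumerating the power set directly; here we use the four-element sample space of Problem 1.7.

```python
from itertools import combinations

# Enumerate the power set Omega of S and confirm n(Omega) = 2**n.
def power_set(S):
    """All subsets of S, gathered by subset size r = 0, 1, ..., n."""
    return [set(c) for r in range(len(S) + 1) for c in combinations(S, r)]

S = {'a', 'b', 'c', 'd'}  # the sample space of Problem 1.7
omega = power_set(S)
print(len(omega))         # 16 = 2**4
assert len(omega) == 2 ** len(S)
```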

ALGEBRA OF SETS

1.9. Consider the experiment of Example 1.2. We define the events

A = {k : k is odd}
B = {k : 4 ≤ k ≤ 7}
C = {k : 1 ≤ k ≤ 10}

where k is the number of tosses required until the first H (head) appears. Determine the events AĢ„, BĢ„, CĢ„, A ∪ B, B ∪ C, A ∩ B, A ∩ C, B ∩ C, and AĢ„ ∩ B.

AĢ„ = {k : k is even} = {2, 4, 6, ...}
BĢ„ = {k : k = 1, 2, 3 or k ≥ 8}
CĢ„ = {k : k ≥ 11}
A ∪ B = {k : k is odd or k = 4, 6}
B ∪ C = C
A ∩ B = {5, 7}
A ∩ C = {1, 3, 5, 7, 9}
B ∩ C = B
AĢ„ ∩ B = {4, 6}

1.10. The sample space of an experiment is the real line expressed as

S = {v : āˆ’āˆž < v < āˆž}

(a) Consider the events

A_1 = {v : 0 ≤ v < 1/2}
A_2 = {v : 1/4 ≤ v < 1/2}

Determine the events

⋃_i A_i and ⋂_i A_i

(b) Consider the events

B_1 = {v : v ≤ 1}
B_2 = {v : v < 3}