Preface
1 Introduction to Probability 1
1.1 Introduction: Why Study Probability? 1
1.2 The Different Kinds of Probability 2
Probability as Intuition 2
Probability as the Ratio of Favorable to Total Outcomes (Classical Theory) 3
Probability as a Measure of Frequency of Occurrence 4
Probability Based on an Axiomatic Theory 5
1.3 Misuses, Miscalculations, and Paradoxes in Probability 7
1.4 Sets, Fields, and Events 8
Examples of Sample Spaces 8
1.5 Axiomatic Definition of Probability 15
1.6 Joint, Conditional, and Total Probabilities; Independence 20
Compound Experiments 23
1.7 Bayes Theorem and Applications 35
1.8 Combinatorics 38
Occupancy Problems 42
Extensions and Applications 46
1.9 Bernoulli Trials: Binomial and Multinomial Probability Laws 48
Multinomial Probability Law 54
1.10 Asymptotic Behavior of the Binomial Law: The Poisson Law 57
1.11 Normal Approximation to the Binomial Law 63
Summary 65
Problems 66
References 77
2 Random Variables 79
2.1 Introduction 79
2.2 Definition of a Random Variable 80
2.3 Cumulative Distribution Function 83
Properties of FX(x) 84
Computation of FX(x) 85
2.4 Probability Density Function (pdf) 88
Four Other Common Density Functions 95
More Advanced Density Functions 97
2.5 Continuous, Discrete, and Mixed Random Variables 100
Some Common Discrete Random Variables 102
2.6 Conditional and Joint Distributions and Densities 107
Properties of Joint CDF FXY(x, y) 118
2.7 Failure Rates 137
Summary 141
Problems 141
References 149
Additional Reading 149
3 Functions of Random Variables 151
3.1 Introduction 151
Functions of a Random Variable (FRV): Several Views 154
3.2 Solving Problems of the Type Y = g(X) 155
General Formula of Determining the pdf of Y = g(X) 166
3.3 Solving Problems of the Type Z = g(X, Y ) 171
3.4 Solving Problems of the Type V = g(X, Y ), W = h(X, Y ) 193
Fundamental Problem 193
Obtaining fVW Directly from fXY 196
3.5 Additional Examples 200
Summary 205
Problems 206
References 214
Additional Reading 214
4 Expectation and Moments 215
4.1 Expected Value of a Random Variable 215
On the Validity of Equation 4.1-8 218
4.2 Conditional Expectations 232
Conditional Expectation as a Random Variable 239
4.3 Moments of Random Variables 242
Joint Moments 246
Properties of Uncorrelated Random Variables 248
Jointly Gaussian Random Variables 251
4.4 Chebyshev and Schwarz Inequalities 255
Markov Inequality 257
The Schwarz Inequality 258
4.5 Moment-Generating Functions 261
4.6 Chernoff Bound 264
4.7 Characteristic Functions 266
Joint Characteristic Functions 273
The Central Limit Theorem 276
4.8 Additional Examples 281
Summary 283
Problems 284
References 293
Additional Reading 294
5 Random Vectors 295
5.1 Joint Distribution and Densities 295
5.2 Multiple Transformation of Random Variables 299
5.3 Ordered Random Variables 302
Distribution of Area Random Variables 305
5.4 Expectation Vectors and Covariance Matrices 311
5.5 Properties of Covariance Matrices 314
Whitening Transformation 318
5.6 The Multidimensional Gaussian (Normal) Law 319
5.7 Characteristic Functions of Random Vectors 328
Properties of CF of Random Vectors 330
The Characteristic Function of the Gaussian (Normal) Law 331
Summary 332
Problems 333
References 339
Additional Reading 339
6 Statistics: Part 1 – Parameter Estimation 340
6.1 Introduction 340
Independent, Identically Distributed (i.i.d.) Observations 341
Estimation of Probabilities 343
6.2 Estimators 346
6.3 Estimation of the Mean 348
Properties of the Mean-Estimator Function (MEF) 349
Procedure for Getting a d-Confidence Interval on the Mean of a Normal Random Variable When σX Is Known 352
Confidence Interval for the Mean of a Normal Distribution When σX Is Not Known 352
Procedure for Getting a d-Confidence Interval Based on n Observations on the Mean of a Normal Random Variable When σX Is Not Known 355
Interpretation of the Confidence Interval 355
6.4 Estimation of the Variance and Covariance 355
Confidence Interval for the Variance of a Normal Random Variable 357
Estimating the Standard Deviation Directly 359
Estimating the Covariance 360
6.5 Simultaneous Estimation of Mean and Variance 361
6.6 Estimation of Non-Gaussian Parameters from Large Samples 363
6.7 Maximum Likelihood Estimators 365
6.8 Ordering, More on Percentiles, Parametric versus Nonparametric Statistics 369
The Median of a Population Versus Its Mean 371
Parametric versus Nonparametric Statistics 372
Confidence Interval on the Percentile 373
Confidence Interval for the Median When n Is Large 375
6.9 Estimation of Vector Means and Covariance Matrices 376
Estimation of µ 377
Estimation of the Covariance K 378
6.10 Linear Estimation of Vector Parameters 380
Summary 384
Problems 384
References 388
Additional Reading 389
7 Statistics: Part 2 – Hypothesis Testing 390
7.1 Bayesian Decision Theory 391
7.2 Likelihood Ratio Test 396
7.3 Composite Hypotheses 402
Generalized Likelihood Ratio Test (GLRT) 403
How Do We Test for the Equality of Means of Two Populations? 408
Testing for the Equality of Variances for Normal Populations: The F-test 412
Testing Whether the Variance of a Normal Population Has a Predetermined Value 416
7.4 Goodness of Fit 417
7.5 Ordering, Percentiles, and Rank 423
How Ordering Is Useful in Estimating Percentiles and the Median 425
Confidence Interval for the Median When n Is Large 428
Distribution-free Hypothesis Testing: Testing If Two Populations Are the Same Using Runs 429
Ranking Test for Sameness of Two Populations 432
Summary 433
Problems 433
References 439
8 Random Sequences 441
8.1 Basic Concepts 442
Infinite-length Bernoulli Trials 447
Continuity of Probability Measure 452
Statistical Specification of a Random Sequence 454
8.2 Basic Principles of Discrete-Time Linear Systems 471
8.3 Random Sequences and Linear Systems 477
8.4 WSS Random Sequences 486
Power Spectral Density 489
Interpretation of the psd 490
Synthesis of Random Sequences and Discrete-Time Simulation 493
Decimation 496
Interpolation 497
8.5 Markov Random Sequences 500
ARMA Models 503
Markov Chains 504
8.6 Vector Random Sequences and State Equations 511
8.7 Convergence of Random Sequences 513
8.8 Laws of Large Numbers 521
Summary 526
Problems 526
References 541
9 Random Processes 543
9.1 Basic Definitions 544
9.2 Some Important Random Processes 548
Asynchronous Binary Signaling 548
Poisson Counting Process 550
Alternative Derivation of Poisson Process 555
Random Telegraph Signal
Digital Modulation Using Phase-Shift Keying 558
Wiener Process or Brownian Motion 560
Markov Random Processes 563
Birth–Death Markov Chains 567
Chapman–Kolmogorov Equations 571
Random Process Generated from Random Sequences 572
9.3 Continuous-Time Linear Systems with Random Inputs 572
White Noise 577
9.4 Some Useful Classifications of Random Processes 578
Stationarity 579
9.5 Wide-Sense Stationary Processes and LSI Systems 581
Wide-Sense Stationary Case 582
Power Spectral Density 584
An Interpretation of the psd 586
More on White Noise 590
Stationary Processes and Differential Equations 596
9.6 Periodic and Cyclostationary Processes 600
9.7 Vector Processes and State Equations 606
State Equations 608
Summary 611
Problems 611
References 633
Chapters 10 and 11 are available as Web chapters on the companion
Web site at http://www.pearsonhighered.com/stark.
10 Advanced Topics in Random Processes 635
10.1 Mean-Square (m.s.) Calculus 635
Stochastic Continuity and Derivatives [10-1] 635
Further Results on m.s. Convergence [10-1] 645
10.2 Mean-Square Stochastic Integrals 650
10.3 Mean-Square Stochastic Differential Equations 653
10.4 Ergodicity [10-3] 658
10.5 Karhunen–Loève Expansion [10-5] 665
10.6 Representation of Bandlimited and Periodic Processes 671
Bandlimited Processes 671
Bandpass Random Processes 674
WSS Periodic Processes 677
Fourier Series for WSS Processes 680
Summary 682
Appendix: Integral Equations 682
Existence Theorem 683
Problems 686
References 699
11 Applications to Statistical Signal Processing 700
11.1 Estimation of Random Variables and Vectors 700
More on the Conditional Mean 706
Orthogonality and Linear Estimation 708
Some Properties of the Operator E 716
11.2 Innovation Sequences and Kalman Filtering 718
Predicting Gaussian Random Sequences 722
Kalman Predictor and Filter 724
Error-Covariance Equations 729
11.3 Wiener Filters for Random Sequences 733
Unrealizable Case (Smoothing) 734
Causal Wiener Filter 736
11.4 Expectation-Maximization Algorithm 738
Log-likelihood for the Linear Transformation 740
Summary of the E-M Algorithm 742
E-M Algorithm for Exponential Probability Functions 743
Application to Emission Tomography 744
Log-likelihood Function of Complete Data 746
E-step 747
M-step 748
11.5 Hidden Markov Models (HMM) 749
Specification of an HMM 751
Application to Speech Processing 753
Efficient Computation of P[E|M] with a Recursive Algorithm 754
Viterbi Algorithm and the Most Likely State Sequence for the Observations 756
11.6 Spectral Estimation 759
The Periodogram 760
Bartlett's Procedure: Averaging Periodograms 762
Parametric Spectral Estimate 767
Maximum Entropy Spectral Density 769
11.7 Simulated Annealing 772
Gibbs Sampler 773
Noncausal Gauss–Markov Models 774
Compound Markov Models 778
Gibbs Line Sequence 779
Summary 783
Problems 783
References 788
Appendix A Review of Relevant Mathematics A-1
A.1 Basic Mathematics A-1
Sequences A-1
Convergence A-2
Summations A-3
Z-Transform A-3
A.2 Continuous Mathematics A-4
Definite and Indefinite Integrals A-5
Differentiation of Integrals A-6
Integration by Parts A-7
Completing the Square A-7
Double Integration A-8
Functions A-8
A.3 Residue Method for Inverse Fourier Transformation A-10
Fact A-11
Inverse Fourier Transform for psd of Random Sequence A-13
A.4 Mathematical Induction A-17
References A-17
Appendix B Gamma and Delta Functions B-1
B.1 Gamma Function B-1
B.2 Incomplete Gamma Function B-2
B.3 Dirac Delta Function B-2
References B-5
Appendix C Functional Transformations and Jacobians C-1
C.1 Introduction C-1
C.2 Jacobians for n = 2 C-2
C.3 Jacobian for General n C-4
Appendix D Measure and Probability D-1
D.1 Introduction and Basic Ideas D-1
Measurable Mappings and Functions D-3
D.2 Application of Measure Theory to Probability D-3
Distribution Measure D-4
Appendix E Sampled Analog Waveforms and Discrete-Time Signals E-1
Appendix F Independence of Sample Mean and Variance for Normal Random Variables F-1
Appendix G Tables of Cumulative Distribution Functions: The Normal, Student t, Chi-square, and F G-1
Index I-1