# Introduction to Statistical Signal Processing

> "A random or stochastic process is a mathematical model for a phenomenon that evolves in time in an unpredictable manner from the viewpoint of the observer. The phenomenon may be a sequence of real-valued measurements of voltage or temperature, a binary data stream from a computer, a modulated binary data stream from a modem, a sequence of coin tosses, the daily Dow–Jones average, radiometer data or photographs from deep space probes, a sequence of images from a cable television, or any of an infinite number of possible sequences, waveforms, or signals of any imaginable type. It may be unpredictable because of such effects as interference or noise in a communication link or storage medium, or it may be an information-bearing signal, deterministic from the viewpoint of an observer at the transmitter but random to an observer at the receiver."
>
> — R.M. Gray
Added 04 Mar 2009 · Updated 04 Mar 2009 · Author: R.M. Gray
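As a minimal illustration of the idea in the quotation, the sketch below simulates one of the named examples of a random process, a sequence of coin tosses. The function name and seeding convention are illustrative choices, not from the book:

```python
import random

def coin_toss_process(n, seed=None):
    """Generate a length-n realization of a fair coin-toss process.

    Each sample is "H" or "T" with equal probability; a fixed seed
    makes the realization reproducible, while omitting it leaves the
    outcome unpredictable to the observer, as in the quotation.
    """
    rng = random.Random(seed)
    return [rng.choice(["H", "T"]) for _ in range(n)]

# One realization of the process
tosses = coin_toss_process(10, seed=42)
print(tosses)
```

Calling the function twice with the same seed reproduces the same realization; with no seed, each call yields a fresh, unpredictable one.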

## Table of Contents

1 Introduction 1
2 Probability 10
2.1 Introduction 10
2.2 Spinning pointers and flipping coins 14
2.3 Probability spaces 22
2.4 Discrete probability spaces 45
2.5 Continuous probability spaces 54
2.6 Independence 69
2.7 Elementary conditional probability 70
2.8 Problems 74

3 Random variables, vectors, and processes 83
3.1 Introduction 83
3.2 Random variables 94
3.3 Distributions of random variables 103
3.4 Random vectors and random processes 113
3.5 Distributions of random vectors 116
3.6 Independent random variables 125
3.7 Conditional distributions 128
3.8 Statistical detection and classification 133
3.10 Binary detection in Gaussian noise 143
3.11 Statistical estimation 145
3.12 Characteristic functions 146
3.13 Gaussian random vectors 152
3.14 Simple random processes 153
3.15 Directly given random processes 157
3.16 Discrete time Markov processes 159
3.17 Nonelementary conditional probability 168
3.18 Problems 169

4 Expectation and averages 183
4.1 Averages 183
4.2 Expectation 186
4.3 Functions of random variables 189
4.4 Functions of several random variables 196
4.5 Properties of expectation 196
4.6 Examples 198
4.7 Conditional expectation 207
4.8 Jointly Gaussian vectors 210
4.9 Expectation as estimation 212
4.10 Implications for linear estimation 219
4.11 Correlation and linear estimation 222
4.12 Correlation and covariance functions 229
4.13 The central limit theorem 232
4.14 Sample averages 235
4.15 Convergence of random variables 237
4.16 Weak law of large numbers 244
4.17 Strong law of large numbers 246
4.18 Stationarity 251
4.19 Asymptotically uncorrelated processes 256
4.20 Problems 259

5 Second-order theory 277
5.1 Linear filtering of random processes 278
5.2 Linear systems I/O relations 280
5.3 Power spectral densities 286
5.4 Linearly filtered uncorrelated processes 288
5.5 Linear modulation 295
5.6 White noise 298
5.7 Time averages 302
5.8 Mean square calculus 305
5.9 Linear estimation and filtering 333
5.10 Problems 351

6 A menagerie of processes 365
6.1 Discrete time linear models 366
6.2 Sums of iid random variables 371
6.3 Independent stationary increment processes 372
6.4 Second-order moments of isi processes 375
6.5 Specification of continuous time isi processes 378
6.6 Moving-average and autoregressive processes 380
6.7 The discrete time Gauss–Markov process 382
6.8 Gaussian random processes 383
6.9 The Poisson counting process 384
6.10 Compound processes 387
6.11 Composite random processes 388
6.12 Exponential modulation 389
6.13 Thermal noise 394
6.14 Ergodicity 397
6.15 Random fields 400
6.16 Problems 402

Appendix A Preliminaries 413
A.1 Set theory 413
A.2 Examples of proofs 420
A.3 Mappings and functions 424
A.4 Linear algebra 425
A.5 Linear system fundamentals 429
A.6 Problems 433
Appendix B Sums and integrals 438
B.1 Summation 438
B.2 Double sums 441
B.3 Integration 443
B.4 The Lebesgue integral 445
Appendix C Common univariate distributions 448