Book Introduction
Adaptive Filter Theory (English-language edition), 3rd Edition, PDF e-book download
- Author: Simon Haykin (United States)
- Publisher: Publishing House of Electronics Industry, Beijing
- ISBN: 7505348841
- Year of publication: 1998
- Listed page count: 989
- File size: 26 MB
- File page count: 1006
PDF Download
Download Notes
The downloaded file is a RAR archive; extract it with decompression software to obtain the book in PDF format. All of this site's resources are packaged as BitTorrent seeds, so a dedicated BT client is required. Free Download Manager (FDM) is recommended: it is free, ad-free, and cross-platform. Other BT clients such as BitComet, qBittorrent, and uTorrent also work. Thunder (Xunlei) is currently not recommended because this site's resources are not popular there; once a resource becomes popular, Thunder can be used as well.
(The file page count should exceed the listed page count, except for multi-volume e-books.)
Note: all archives on this site require a decompression password; the site provides a decompression tool for download. A scripted alternative is sketched below.
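If you prefer to script the extraction step, here is a minimal Python sketch using the third-party rarfile package, which requires an external unrar-compatible backend on the system. The archive name and password below are hypothetical placeholders, not values from this site.

```python
import rarfile  # third-party: pip install rarfile (needs an unrar/unar/bsdtar backend)

# Hypothetical file name and placeholder password; substitute the actual
# archive you downloaded and the site's decompression password.
archive_path = "adaptive-filter-theory-3e.rar"
password = "<site decompression password>"

rf = rarfile.RarFile(archive_path)
try:
    print(rf.namelist())                           # inspect the archive contents
    rf.extractall(path="extracted", pwd=password)  # extract the PDF(s)
finally:
    rf.close()
```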
Table of Contents
Contents 13
Preface 13
Acknowledgments 16
Introduction 1
1. The Filtering Problem 1
2. Adaptive Filters 2
3. Linear Filter Structures 4
4. Approaches to the Development of Linear Adaptive Filtering Algorithms 9
5. Real and Complex Forms of Adaptive Filters 14
6. Nonlinear Adaptive Filters 15
7. Applications 18
8. Some Historical Notes 67
PART 1 BACKGROUND MATERIAL 78
Chapter 1 Discrete-Time Signal Processing 79
1.1 z-Transform 79
1.2 Linear Time-Invariant Filters 81
1.3 Minimum-Phase Filters 86
1.4 Discrete Fourier Transform 87
1.5 Implementing Convolutions Using the DFT 87
1.6 Discrete Cosine Transform 93
1.7 Summary and Discussion 94
Problems 95
Chapter 2 Stationary Processes and Models 96
2.1 Partial Characterization of a Discrete-Time Stochastic Process 97
2.2 Mean Ergodic Theorem 98
2.3 Correlation Matrix 100
2.4 Correlation Matrix of Sine Wave Plus Noise 106
2.5 Stochastic Models 108
2.6 Wold Decomposition 115
2.7 Asymptotic Stationarity of an Autoregressive Process 116
2.8 Yule-Walker Equations 118
2.9 Computer Experiment: Autoregressive Process of Order 2 120
2.10 Selecting the Model Order 128
2.11 Complex Gaussian Processes 130
2.12 Summary and Discussion 132
Problems 133
Chapter 3 Spectrum Analysis 136
3.1 Power Spectral Density 136
3.2 Properties of Power Spectral Density 138
3.3 Transmission of a Stationary Process Through a Linear Filter 140
3.4 Cramér Spectral Representation for a Stationary Process 144
3.5 Power Spectrum Estimation 146
3.6 Other Statistical Characteristics of a Stochastic Process 149
3.7 Polyspectra 150
3.8 Spectral-Correlation Density 154
3.9 Summary and Discussion 157
Problems 158
Chapter 4 Eigenanalysis 160
4.1 The Eigenvalue Problem 160
4.2 Properties of Eigenvalues and Eigenvectors 162
4.3 Low-Rank Modeling 176
4.4 Eigenfilters 181
4.5 Eigenvalue Computations 184
4.6 Summary and Discussion 187
Problems 188
PART 2 LINEAR OPTIMUM FILTERING 193
Chapter 5 Wiener Filters 194
5.1 Linear Optimum Filtering: Problem Statement 194
5.2 Principle of Orthogonality 197
5.3 Minimum Mean-Squared Error 201
5.4 Wiener-Hopf Equations 203
5.5 Error-Performance Surface 206
5.6 Numerical Example 210
5.7 Channel Equalization 217
5.8 Linearly Constrained Minimum Variance Filter 220
5.9 Generalized Sidelobe Cancelers 227
5.10 Summary and Discussion 235
Problems 236
Chapter 6 Linear Prediction 241
6.1 Forward Linear Prediction 242
6.2 Backward Linear Prediction 248
6.3 Levinson-Durbin Algorithm 254
6.4 Properties of Prediction-Error Filters 262
6.5 Schur-Cohn Test 271
6.6 Autoregressive Modeling of a Stationary Stochastic Process 273
6.7 Cholesky Factorization 276
6.8 Lattice Predictors 280
6.9 Joint-Process Estimation 286
6.10 Block Estimation 290
6.11 Summary and Discussion 293
Problems 295
Chapter 7 Kalman Filters 302
7.1 Recursive Minimum Mean-Square Estimation for Scalar Random Variables 303
7.2 Statement of the Kalman Filtering Problem 306
7.3 The Innovations Process 307
7.4 Estimation of the State Using the Innovations Process 310
7.5 Filtering 317
7.6 Initial Conditions 320
7.7 Summary of the Kalman Filter 320
7.8 Variants of the Kalman Filter 322
7.9 The Extended Kalman Filter 328
7.10 Summary and Discussion 333
Problems 334
PART 3 LINEAR ADAPTIVE FILTERING 338
Chapter 8 Method of Steepest Descent 339
8.1 Some Preliminaries 339
8.2 Steepest-Descent Algorithm 341
8.3 Stability of the Steepest-Descent Algorithm 343
8.4 Example 350
8.5 Summary and Discussion 362
Problems 362
Chapter 9 Least-Mean-Square Algorithm 365
9.1 Overview of the Structure and Operation of the Least-Mean-Square Algorithm 365
9.2 Least-Mean-Square Adaptation Algorithm 367
9.3 Examples 372
9.4 Stability and Performance Analysis of the LMS Algorithm 390
9.5 Summary of the LMS Algorithm 405
9.6 Computer Experiment on Adaptive Prediction 406
9.7 Computer Experiment on Adaptive Equalization 412
9.8 Computer Experiment on Minimum-Variance Distortionless Response Beamformer 421
9.9 Directionality of Convergence of the LMS Algorithm for Non-White Inputs 425
9.10 Robustness of the LMS Algorithm 427
9.11 Normalized LMS Algorithm 432
9.12 Summary and Discussion 438
Problems 439
Chapter 10 Frequency-Domain Adaptive Filters 445
10.1 Block Adaptive Filters 446
10.2 Fast LMS Algorithm 451
10.3 Unconstrained Frequency-Domain Adaptive Filtering 457
10.4 Self-Orthogonalizing Adaptive Filters 458
10.5 Computer Experiment on Adaptive Equalization 469
10.6 Classification of Adaptive Filtering Algorithms 477
10.7 Summary and Discussion 478
Problems 479
Chapter 11 Method of Least Squares 483
11.1 Statement of the Linear Least-Squares Estimation Problem 483
11.2 Data Windowing 486
11.3 Principle of Orthogonality (Revisited) 487
11.4 Minimum Sum of Error Squares 491
11.5 Normal Equations and Linear Least-Squares Filters 492
11.6 Time-Averaged Correlation Matrix 495
11.7 Reformulation of the Normal Equations in Terms of Data Matrices 497
11.8 Properties of Least-Squares Estimates 502
11.9 Parametric Spectrum Estimation 506
11.10 Singular Value Decomposition 516
11.11 Pseudoinverse 524
11.12 Interpretation of Singular Values and Singular Vectors 525
11.13 Minimum Norm Solution to the Linear Least-Squares Problem 526
11.14 Normalized LMS Algorithm Viewed as the Minimum-Norm Solution to an Underdetermined Least-Squares Estimation Problem 530
11.15 Summary and Discussion 532
Problems 533
Chapter 12 Rotations and Reflections 536
12.1 Plane Rotations 537
12.2 Two-Sided Jacobi Algorithm 538
12.3 Cyclic Jacobi Algorithm 544
12.4 Householder Transformation 548
12.5 The QR Algorithm 551
12.6 Summary and Discussion 558
Problems 560
Chapter 13 Recursive Least-Squares Algorithm 562
13.1 Some Preliminaries 563
13.2 The Matrix Inversion Lemma 565
13.3 The Exponentially Weighted Recursive Least-Squares Algorithm 566
13.4 Update Recursion for the Sum of Weighted Error Squares 571
13.5 Example: Single-Weight Adaptive Noise Canceler 572
13.6 Convergence Analysis of the RLS Algorithm 573
13.7 Computer Experiment on Adaptive Equalization 580
13.8 State-Space Formulation of the RLS Problem 583
13.9 Summary and Discussion 587
Problems 587
Chapter 14 Square-Root Adaptive Filters 589
14.1 Square-Root Kalman Filters 589
14.2 Building Square-Root Adaptive Filtering Algorithms on Their Kalman Filter Counterparts 597
14.3 QR-RLS Algorithm 598
14.4 Extended QR-RLS Algorithm 614
14.5 Adaptive Beamforming 617
14.6 Inverse QR-RLS Algorithm 624
14.7 Summary and Discussion 627
Problems 628
Chapter 15 Order-Recursive Adaptive Filters 630
15.1 Adaptive Forward Linear Prediction 631
15.2 Adaptive Backward Linear Prediction 634
15.3 Conversion Factor 636
15.4 Least-Squares Lattice Predictor 640
15.5 Angle-Normalized Estimation Errors 653
15.6 First-Order State-Space Models for Lattice Filtering 655
15.7 QR-Decomposition-Based Least-Squares Lattice Filters 660
15.8 Fundamental Properties of the QRD-LSL Filter 667
15.9 Computer Experiment on Adaptive Equalization 672
15.10 Extended QRD-LSL Algorithm 677
15.11 Recursive Least-Squares Lattice Filters Using A Posteriori Estimation Errors 679
15.12 Recursive LSL Filters Using A Priori Estimation Errors with Error Feedback 683
15.13 Computation of the Least-Squares Weight Vector 686
15.14 Computer Experiment on Adaptive Prediction 691
15.15 Other Variants of Least-Squares Lattice Filters 693
15.16 Summary and Discussion 694
Problems 696
Chapter 16 Tracking of Time-Varying Systems 701
16.1 Markov Model for System Identification 702
16.2 Degree of Nonstationarity 705
16.3 Criteria for Tracking Assessment 706
16.4 Tracking Performance of the LMS Algorithm 708
16.5 Tracking Performance of the RLS Algorithm 711
16.6 Comparison of the Tracking Performance of LMS and RLS Algorithms 716
16.7 Adaptive Recovery of a Chirped Sinusoid in Noise 719
16.8 How to Improve the Tracking Behavior of the RLS Algorithm 726
16.9 Computer Experiment on System Identification 729
16.10 Automatic Tuning of Adaptation Constants 731
16.11 Summary and Discussion 736
Problems 737
Chapter 17 Finite-Precision Effects 738
17.1 Quantization Errors 739
17.2 Least-Mean-Square Algorithm 741
17.3 Recursive Least-Squares Algorithm 751
17.4 Square-Root Adaptive Filters 757
17.5 Order-Recursive Adaptive Filters 760
17.6 Fast Transversal Filters 763
17.7 Summary and Discussion 767
Problems 769
PART 4 NONLINEAR ADAPTIVE FILTERING 771
Chapter 18 Blind Deconvolution 772
18.1 Theoretical and Practical Considerations 773
18.2 Bussgang Algorithm for Blind Equalization of Real Baseband Channels 776
18.3 Extension of Bussgang Algorithms to Complex Baseband Channels 791
18.4 Special Cases of the Bussgang Algorithm 792
18.5 Blind Channel Identification and Equalization Using Polyspectra 796
18.6 Advantages and Disadvantages of HOS-Based Deconvolution Algorithms 802
18.7 Channel Identifiability Using Cyclostationary Statistics 803
18.8 Subspace Decomposition for Fractionally-Spaced Blind Identification 804
18.9 Summary and Discussion 813
Problems 814
Chapter 19 Back-Propagation Learning 817
19.1 Models of a Neuron 818
19.2 Multilayer Perceptron 822
19.3 Complex Back-Propagation Algorithm 824
19.4 Back-Propagation Algorithm for Real Parameters 837
19.5 Universal Approximation Theorem 838
19.6 Network Complexity 840
19.7 Filtering Applications 842
19.8 Summary and Discussion 852
Problems 854
Chapter 20 Radial Basis Function Networks 855
20.1 Structure of RBF Networks 856
20.2 Radial-Basis Functions 858
20.3 Fixed Centers Selected at Random 859
20.4 Recursive Hybrid Learning Procedure 862
20.5 Stochastic Gradient Approach 863
20.6 Universal Approximation Theorem (Revisited) 865
20.7 Filtering Applications 866
20.8 Summary and Discussion 871
Problems 873
Appendix A Complex Variables 875
Appendix B Differentiation with Respect to a Vector 890
Appendix C Method of Lagrange Multipliers 895
Appendix D Estimation Theory 899
Appendix E Maximum-Entropy Method 905
Appendix F Minimum-Variance Distortionless Response Spectrum 912
Appendix G Gradient Adaptive Lattice Algorithm 915
Appendix H Solution of the Difference Equation (9.75) 919
Appendix I Steady-State Analysis of the LMS Algorithm without Invoking the Independence Assumption 921
Appendix J The Complex Wishart Distribution 924
Glossary 928
Abbreviations 932
Principal Symbols 933
Bibliography 941
Index 978