IN357: ADAPTIVE FILTERS


IN357: ADAPTIVE FILTERS
Course book: Chap. 9 of Statistical Digital Signal Processing and Modeling, M. Hayes, 1996 (also builds on Chap. 7.2).
David Gesbert, Signal and Image Processing Group (DSB)
http://www.ifi.uio.no/~gesbert
March 2003
DEPARTMENT OF INFORMATICS. D. Gesbert: IN357 Statistical Signal Processing, slide 1 of 21.

Outline
- Motivations for adaptive filtering
- The adaptive FIR filter
- Steepest descent and optimization theory
- Steepest descent in adaptive filtering
- The LMS algorithm
- Performance of LMS
- The RLS algorithm
- Performance of RLS
- Example: adaptive beamforming in mobile networks

Motivations for adaptive filtering
Goal: extend optimum (e.g., Wiener) filters to the case where the data is not stationary or the underlying system is time varying.
- {d(n)}: desired random process (unobserved), may be non-stationary.
- {x_0(n)}, {x_1(n)}, ..., {x_{p-1}(n)}: observed random processes, may be non-stationary.
[Figure: the p observations x_0(n), ..., x_{p-1}(n) feed a filter W_n that must be adjusted over time n; its output is the estimated signal d̂(n), which is compared with the desired signal d(n) to form the error signal e(n).]

Cases of non-stationarity
The filter W must be adjusted over time, and is denoted W(n), in order to track non-stationarity.
Example 1: finding the Wiener solution to the linear prediction of a speech signal. The speech signal is non-stationary beyond approximately 20 ms of observations: d(n) and the {x_i(n)} are non-stationary.
Example 2: finding the adaptive beamformer that tracks the location of a mobile user in a wireless network. d(n) is stationary (a sequence of modulation symbols), but the {x_i(n)} are not, because the channel is changing.

Approaches to the problem
Two solutions to track the filter W(n):
- (Adaptive filtering) One has a long training signal for d(n) and adjusts W(n) continuously to minimize the power of e(n).
- (Block filtering) One splits time into short intervals over which the data is approximately stationary, and re-computes the Wiener solution for every block.

Vector formulation (time-varying filter)
W(n) = [w_0(n), w_1(n), ..., w_{p-1}(n)]^T
X(n) = [x_0(n), x_1(n), ..., x_{p-1}(n)]^T
d̂(n) = W(n)^T X(n)
where ^T is the transpose operator.

Time-varying optimum linear filtering
e(n) = d(n) - d̂(n)
J(n) = E|e(n)|^2 varies with n due to non-stationarity, where E(.) is the expectation.
Find W(n) such that J(n) is minimum at time n. W(n) is then the optimum linear filter in the Wiener sense at time n.

Finding the solution
The solution W(n) is given by the time-varying Wiener-Hopf equations:
R_x(n) W(n) = r_dx(n)   (1)
where
R_x(n) = E(X*(n) X(n)^T)   (2)
r_dx(n) = E(d(n) X*(n))   (3)
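In the stationary case, the Wiener-Hopf system can be solved from sample estimates of the statistics. A minimal Python/NumPy sketch, not part of the slides: the FIR system h_true, the noise level, and all names below are made-up illustration values, and the data is real-valued so the conjugates drop.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stationary example: d(n) is a filtered version of x(n) plus noise.
p = 4
h_true = np.array([0.8, -0.4, 0.2, 0.1])   # hypothetical system to identify
N = 20000
x = rng.standard_normal(N)
# Column k holds x(n-k), so row n is X(n) = [x(n), x(n-1), ..., x(n-p+1)]
X = np.column_stack([np.roll(x, k) for k in range(p)])
d = X @ h_true + 0.01 * rng.standard_normal(N)

# Sample estimates of Rx = E(X X^T) and rdx = E(d X)
Rx = X.T @ X / N
rdx = X.T @ d / N

# Wiener-Hopf: Rx W = rdx
W = np.linalg.solve(Rx, rdx)
print(W)   # close to h_true
```

Solving the system directly is only possible because the statistics here are (estimated as) constant; the adaptive algorithms below avoid this batch computation.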

Adaptive algorithms
The time-varying statistics used in (1) are unknown, but they can be estimated. Adaptive algorithms aim at estimating and tracking the solution W(n), given the observations {x_i(n)}, i = 0, ..., p-1, and a training sequence for d(n).
Two key approaches:
- Steepest-descent (also called gradient-search) algorithms.
- The Recursive Least Squares (RLS) algorithm.
Tracking is formulated by:
W(n+1) = W(n) + ΔW(n)   (4)
where ΔW(n) is the correction applied to the filter at time n.

Steepest descent in optimization theory
Assumption: stationary case.
Idea: local extrema of the cost function J(W) can be found by following the path of steepest gradient (derivative) on the surface of J(W).
W(0) is an arbitrary initial point.
W(n+1) = W(n) - µ (∂J/∂W)|_{W=W(n)}
where µ is a small step size (µ << 1). Because J() is quadratic here, there is only one local minimum, toward which W(n) will converge.
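A toy illustration of this recursion, not from the slides: the quadratic cost matrix R and the minimizer Wo below are arbitrary example values. The sketch follows the slides' convention of absorbing the factor 2 of the quadratic gradient into µ, which is what makes the stability range 0 < µ < 2/λ_max.

```python
import numpy as np

# Arbitrary quadratic cost J(W) = (W - Wo)^T R (W - Wo), with R > 0.
R = np.array([[2.0, 0.5],
              [0.5, 1.0]])
Wo = np.array([1.0, -1.0])

def steepest_descent(mu, iters=500):
    W = np.zeros(2)
    for _ in range(iters):
        # Half-gradient R (W - Wo): the factor 2 is absorbed into mu,
        # giving the stability condition 0 < mu < 2 / lambda_max.
        W = W - mu * (R @ (W - Wo))
    return W

lam_max = np.linalg.eigvalsh(R).max()
W_hat = steepest_descent(mu=1.5 / lam_max)   # inside the stable range
print(W_hat)   # converges to Wo
```

Choosing µ above 2/λ_max makes the iterate diverge along the eigenvector of λ_max, which is easy to verify by rerunning the sketch with a larger step.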

The steepest-descent Wiener algorithm
Derivation of the gradient expression:
J(W) = E(e(n) e*(n)), where e(n) = d(n) - d̂(n) = d(n) - W^T X(n)
∂J/∂W = E( (∂e(n)/∂W) e*(n) + e(n) (∂e*(n)/∂W) ) = E( 0 + e(n) (∂e*(n)/∂W) ) = -E( e(n) X*(n) )

Algorithm: the steepest-descent Wiener algorithm
W(0) is an arbitrary initial point.
W(n+1) = W(n) + µ E(e(n) X*(n))
W(n) will converge to Wo = R_x^{-1} r_dx (the Wiener solution) if 0 < µ < 2/λ_max, where λ_max is the maximum eigenvalue of R_x (see p. 501 for the proof).
Problem: E(e(n) X*(n)) is unknown!
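When the statistics are known, the update can be run deterministically, since E(e(n) X*(n)) = r_dx - R_x W for a fixed W in the real-valued case. A sketch, not from the slides, with arbitrary example values for R_x and r_dx:

```python
import numpy as np

# Hypothetical known second-order statistics (illustration values only).
Rx = np.array([[1.0, 0.4],
               [0.4, 1.0]])
rdx = np.array([0.5, 0.2])
Wo = np.linalg.solve(Rx, rdx)            # Wiener solution

mu = 1.0 / np.linalg.eigvalsh(Rx).max()  # inside 0 < mu < 2/lambda_max
W = np.zeros(2)
for _ in range(200):
    # E(e(n) X(n)) = rdx - Rx W for the current (fixed) W
    W = W + mu * (rdx - Rx @ W)
print(W)   # matches the Wiener solution Wo
```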

The Least Mean Square (LMS) algorithm
Idea: E(e(n) X*(n)) is replaced by its instantaneous value e(n) X*(n).
W(0) is an arbitrary initial point.
W(n+1) = W(n) + µ e(n) X*(n)
Repeat for n+1, n+2, ...
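The whole algorithm fits in a few lines. A minimal real-valued sketch (so X*(n) = X(n)), not from the slides: it identifies a hypothetical FIR system h_true from noisy training data, with all names and values chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

p = 4
h_true = np.array([0.8, -0.4, 0.2, 0.1])   # hypothetical system to identify
N = 5000
x = rng.standard_normal(N)
d = np.convolve(x, h_true)[:N] + 0.01 * rng.standard_normal(N)

mu = 0.01
W = np.zeros(p)
for n in range(p - 1, N):
    X = x[n - p + 1:n + 1][::-1]   # X(n) = [x(n), x(n-1), ..., x(n-p+1)]
    e = d[n] - W @ X               # a priori error e(n) = d(n) - W(n)^T X(n)
    W = W + mu * e * X             # LMS update (real data, so X* = X)
print(W)   # close to h_true
```

Each iteration costs only O(p) operations, which is the main practical appeal of LMS.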

The Least Mean Square (LMS) algorithm
Lemma: W(n) will converge in the mean toward Wo = R_x^{-1} r_dx if 0 < µ < 2/λ_max (see p. 507), i.e.:
E(W(n)) → R_x^{-1} r_dx as n → ∞   (5)
Important remarks:
- The variance of W(n) around its mean is a function of µ. Thus µ allows a trade-off between speed of convergence and accuracy of the estimate: a small µ results in higher accuracy but slower convergence.
- The algorithm is derived under the assumption of stationarity, but it can be used in a non-stationary environment as a tracking method.
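The accuracy/speed trade-off can be seen in a small experiment, not from the slides, with made-up values: a one-tap LMS filter identifies a constant optimum, and the steady-state fluctuation of the weight is measured for two step sizes.

```python
import numpy as np

rng = np.random.default_rng(2)

# One-tap LMS identifying a constant optimum w_o (illustration values).
w_o = 0.5
N = 20000
x = rng.standard_normal(N)
d = w_o * x + 0.1 * rng.standard_normal(N)   # observation noise sets J_min

def lms_tail_std(mu):
    w, history = 0.0, []
    for n in range(N):
        e = d[n] - w * x[n]
        w += mu * e * x[n]
        history.append(w)
    # fluctuation of the weight over the second half (steady state)
    return float(np.std(history[N // 2:]))

s_small, s_large = lms_tail_std(0.01), lms_tail_std(0.1)
print(s_small, s_large)   # larger mu gives a larger steady-state variance
```

Both runs converge in the mean to w_o; only the residual jitter around it differs, which is exactly the variance-versus-speed trade-off stated above.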

A faster-converging algorithm
Idea: build a running estimate of the statistics R_x(n), r_dx(n), and solve the Wiener-Hopf equation at each time:
R_x(n) W(n) = r_dx(n)   (6)
where
R_x(n) = Σ_{k=0}^{n} λ^{n-k} X*(k) X(k)^T   (7)
r_dx(n) = Σ_{k=0}^{n} λ^{n-k} d(k) X*(k)   (8)
and λ is the forgetting factor (λ < 1, close to 1).

Recursive least squares (RLS)
To avoid inverting a matrix at each step, one finds a recursive solution for W(n):
R_x(n) = λ R_x(n-1) + X*(n) X(n)^T   (9)
r_dx(n) = λ r_dx(n-1) + d(n) X*(n)   (10)
W(n) = W(n-1) + ΔW(n-1)   (11)
Question: how to determine the right correction ΔW(n-1)?
Answer: using the matrix inversion lemma (Woodbury's identity).
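The recursions (9)-(10) reproduce the exponentially weighted sums (7)-(8) exactly, which is easy to verify numerically. A sketch with random real-valued data (so the conjugates drop); sizes and the seed are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(3)
p, N, lam = 3, 50, 0.95
X = rng.standard_normal((N, p))   # row n is X(n)
d = rng.standard_normal(N)

# Direct exponentially weighted sums, as in eqs. (7)-(8)
Rx_direct = sum(lam ** (N - 1 - k) * np.outer(X[k], X[k]) for k in range(N))
rdx_direct = sum(lam ** (N - 1 - k) * d[k] * X[k] for k in range(N))

# Recursive updates, as in eqs. (9)-(10), started from zero
Rx, rdx = np.zeros((p, p)), np.zeros(p)
for n in range(N):
    Rx = lam * Rx + np.outer(X[n], X[n])
    rdx = lam * rdx + d[n] * X[n]

print(np.allclose(Rx, Rx_direct), np.allclose(rdx, rdx_direct))
```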

Matrix inversion lemma
We define P(n) = R_x(n)^{-1}. The M.I.L. is used to update P(n-1) to P(n) directly:
(A + u v^H)^{-1} = A^{-1} - (A^{-1} u v^H A^{-1}) / (1 + v^H A^{-1} u)
We apply it to
R_x(n)^{-1} = (λ R_x(n-1) + X*(n) X(n)^T)^{-1}   (12)

Matrix inversion lemma
R_x(n)^{-1} = λ^{-1} R_x(n-1)^{-1} - (λ^{-2} R_x(n-1)^{-1} X*(n) X(n)^T R_x(n-1)^{-1}) / (1 + λ^{-1} X(n)^T R_x(n-1)^{-1} X*(n))   (13)
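A quick numerical check of this rank-one update, not from the slides: in the real-valued case v^H becomes v^T, and R_x(n-1) below is an arbitrary positive-definite example matrix.

```python
import numpy as np

rng = np.random.default_rng(4)
p, lam = 4, 0.98
M = np.eye(p) + 0.1 * rng.standard_normal((p, p))
R_prev = M @ M.T                       # arbitrary symmetric positive-definite R_x(n-1)
X = rng.standard_normal(p)

# Direct inverse of lambda R(n-1) + X X^T
R_new_inv = np.linalg.inv(lam * R_prev + np.outer(X, X))

# Matrix inversion lemma with A = lambda R(n-1), u = v = X
P = np.linalg.inv(R_prev)              # P(n-1)
mil = P / lam - (P @ np.outer(X, X) @ P) / lam**2 / (1 + X @ P @ X / lam)
print(np.allclose(R_new_inv, mil))
```

The point of the lemma is cost: the update is O(p^2) per step, instead of the O(p^3) of a fresh matrix inversion.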

The RLS algorithm
W(0) = 0   (15)
P(0) = δ^{-1} I   (16)
Z(n) = P(n-1) X*(n)   (17)
G(n) = Z(n) / (λ + X(n)^T Z(n))   (18)
α(n) = d(n) - W(n-1)^T X(n)   (19)
W(n) = W(n-1) + α(n) G(n)   (20)
P(n) = (1/λ) (P(n-1) - G(n) Z(n)^H)   (21)
where δ << 1 is a small arbitrary initialization parameter.
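Putting (15)-(21) together for real-valued data (so X*(n) = X(n) and Z(n)^H = Z(n)^T). As with the LMS sketch earlier, the identified FIR system h_true and all parameter values are made-up illustration choices, not from the slides.

```python
import numpy as np

rng = np.random.default_rng(5)

p = 4
h_true = np.array([0.8, -0.4, 0.2, 0.1])   # hypothetical system to identify
N = 500
x = rng.standard_normal(N)
d = np.convolve(x, h_true)[:N] + 0.01 * rng.standard_normal(N)

lam, delta = 0.99, 1e-3
W = np.zeros(p)
P = np.eye(p) / delta                  # P(0) = delta^{-1} I, eq. (16)
for n in range(p - 1, N):
    X = x[n - p + 1:n + 1][::-1]       # X(n) = [x(n), ..., x(n-p+1)]
    Z = P @ X                          # eq. (17)
    G = Z / (lam + X @ Z)              # eq. (18)
    alpha = d[n] - W @ X               # a priori error, eq. (19)
    W = W + alpha * G                  # eq. (20)
    P = (P - np.outer(G, Z)) / lam     # eq. (21)
print(W)   # close to h_true
```

Note how few samples RLS needs compared with the LMS sketch: here N = 500 instead of 5000, illustrating the faster convergence discussed on the next slide.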

RLS vs. LMS
Complexity: RLS is more complex because of the matrix multiplications; LMS is simpler to implement.
Convergence speed: LMS is slower, because its convergence depends on the amplitude of the gradient and on the eigenvalue spread of the correlation matrix. RLS is faster, because it always points at the right solution (it solves the least-squares problem exactly at each step).
Accuracy: in LMS the accuracy is controlled via the step size µ; in RLS, via the forgetting factor λ. In both cases, very high accuracy in the stationary regime can be obtained at the cost of convergence speed.

Application
The LMS algorithm applied to the problem of adaptive beamforming. To be developed in class.