Design and implementation of a depth-dependent matched filter to maximize signal-to-noise ratio in optical coherence tomography
Achieving greater imaging depth is an important goal in Optical Coherence Tomography (OCT) systems. One of the main factors limiting imaging depth is noise, which makes the study of noise statistics an important problem. In the first part of this thesis, we obtain an empirical estimate of the second-order statistics of the noise from a sequence of Time-Domain (TD) OCT images. These estimates confirm the non-stationary, depth-dependent nature of the noise in TD-OCT. In the second part of the thesis, these estimates are used to design a depth-dependent matched filter that maximizes the Signal-to-Noise Ratio (SNR) and increases the Contrast-to-Noise Ratio (CNR) in TD-OCT. Applying the filter to TD-OCT images of vascular rabbit tissue and of a human tooth increased both SNR and CNR and yielded greater imaging depth.
optical coherence tomography, matched filter, signal-to-noise ratio
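The SNR-maximizing matched filter for noise with known covariance R takes the form w ∝ R⁻¹s, where s is the expected signal template; when the noise is non-stationary, R varies with depth, making the filter depth-dependent. The following is a minimal illustrative sketch of this idea, not the thesis's actual implementation: the Gaussian axial template and the linearly growing noise-variance profile are hypothetical stand-ins for the empirically estimated statistics.

```python
import numpy as np

def matched_filter(signal_template, noise_cov):
    """SNR-maximizing filter w proportional to R^{-1} s.

    For white (stationary) noise R is a scaled identity and this reduces
    to the classical matched filter w proportional to s; for
    depth-dependent noise, samples at noisier depths are down-weighted.
    """
    w = np.linalg.solve(noise_cov, signal_template)
    return w / np.linalg.norm(w)  # unit-norm for convenience

# Hypothetical example: a Gaussian axial response and a noise variance
# that grows with depth (a stand-in for the empirical non-stationary
# statistics estimated from repeated TD-OCT scans).
depth = np.arange(64)
template = np.exp(-0.5 * ((depth - 32) / 4.0) ** 2)  # assumed axial template
noise_var = 1.0 + 0.05 * depth                       # assumed variance profile
R = np.diag(noise_var)                               # diagonal covariance model

w = matched_filter(template, R)
```

Because the variance profile increases with depth, the resulting weights are attenuated (relative to the template) at deeper, noisier samples, which is what makes the filter depth-dependent.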