Analysis on Lowering the Effect of Timing Jitter in OFDM System using Oversampling
Monika Tiwari1, Kanwar Preet Kaur2
1Monika Tiwari, M. Tech. Department of Electronics & Communication Engineering, Gyan Ganga College of Technology, Jabalpur, (M. P.), India.
2Kanwar Preet Kaur, M. Tech. Department of Electronics & Communication Engineering, Gyan Ganga College of Technology, Jabalpur, (M. P.), India.
Manuscript received on May 20, 2014. | Revised Manuscript received on June 17, 2014. | Manuscript published on June 30, 2014. | PP: 322-324  | Volume-3, Issue-5, June 2014.  | Retrieval Number:  E3233063514/2013©BEIESP

© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: The limitations of high-speed analog-to-digital converters and synchronization systems cause mistimed sampling of the received signal, which results in timing jitter. The severity of this effect increases greatly in multicarrier systems such as OFDM, whose structure allows a single mistimed sample to corrupt all subcarriers. Removing (or suppressing) timing jitter is therefore one of the challenging tasks facing system designers. This paper analyzes one jitter-suppression technique, oversampling, and evaluates its effectiveness through detailed mathematical modeling and simulation of a practical OFDM system. The presented model allows independent analysis of different sizes, modulation symbol types, jitter probabilities, and jitter amplitudes, and supports selection among different OFDM modulation techniques. The simulation results show that, for any jitter value at a fixed AWGN level, doubling the oversampling rate reduces the BER by half.
Keywords: Timing jitter, OFDM, Oversampling.
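The effect described in the abstract can be illustrated with a small, self-contained simulation. The sketch below is not the authors' model: it assumes QPSK subcarriers, independent Gaussian sampling-clock jitter, and illustrative parameter values (subcarrier count `N`, SNR, jitter standard deviation) chosen only for demonstration. The receiver samples the continuous OFDM waveform at jittered instants on an oversampled grid, demodulates with a larger FFT, and keeps the active bins; increasing the oversampling factor averages down the jitter-induced inter-carrier interference and lowers the BER.

```python
import numpy as np

# Illustrative parameters (assumed; the paper's exact settings are not given)
N = 64          # active QPSK subcarriers
N_SYM = 300     # OFDM symbols per BER estimate
SNR_DB = 20     # light AWGN, so jitter dominates the error floor
JIT_STD = 0.25  # jitter std, in fractions of the Nyquist sample period

def ber_with_oversampling(os_factor):
    """Estimate BER of QPSK-OFDM with Gaussian sampling-clock jitter,
    when the receiver samples at os_factor times the Nyquist rate."""
    rng = np.random.default_rng(1)          # same bit/jitter streams per call
    M = N * os_factor                       # receiver samples per OFDM symbol
    # Subcarrier frequencies in cycles per symbol: [0..N/2-1, -N/2..-1]
    freqs = np.concatenate([np.arange(0, N // 2), np.arange(-N // 2, 0)])
    errors = 0
    for _ in range(N_SYM):
        bits = rng.integers(0, 2, 2 * N)
        # Gray-free QPSK mapping, unit average power per subcarrier
        X = ((1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])) / np.sqrt(2 * N)
        # Sample the continuous OFDM waveform at jittered instants.
        # Ideal instants are n/M; jitter is Gaussian in absolute time.
        t = np.arange(M) / M + rng.normal(0.0, JIT_STD / N, M)
        x = np.exp(2j * np.pi * np.outer(t, freqs)) @ X
        # Add light AWGN at a fixed per-sample noise level
        sigma2 = 10 ** (-SNR_DB / 10)
        x += np.sqrt(sigma2 / 2) * (rng.normal(size=M) + 1j * rng.normal(size=M))
        # Demodulate: M-point FFT, keep only the N active subcarrier bins
        S = np.fft.fft(x) / M
        Xh = np.concatenate([S[:N // 2], S[M - N // 2:]])
        # Hard QPSK decisions
        bh = np.empty(2 * N, dtype=int)
        bh[0::2] = Xh.real < 0
        bh[1::2] = Xh.imag < 0
        errors += np.count_nonzero(bh != bits)
    return errors / (N_SYM * 2 * N)
```

Running `ber_with_oversampling(1)` and `ber_with_oversampling(2)` under these assumed settings shows the BER dropping when the oversampling rate is doubled, which is the qualitative trend the paper reports; the exact improvement factor depends on the jitter level, the SNR, and the modulation chosen.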