Joint optimization of AGC and timing detection for OFDM systems

Jihye Lee, Taehyun Jeon

Research output: Contribution to journal › Article › peer-review

Abstract

In burst-mode operation of an OFDM (Orthogonal Frequency Division Multiplexing) system, most digital front-end functions rely on the training sequence, or preamble. The main goal of the AGC (Automatic Gain Control) is to set the optimum signal power level at the input of the analog-to-digital converter. The AGC is followed by a timing detection block that determines the start point of the OFDM symbol. Timing detection normally exploits the repeated pattern of the preamble, and its estimation performance depends on the number of observation samples as well as the adopted algorithm. In this paper a joint optimization method is proposed to maximize the performance of both the AGC and the timing detection. On the AGC side, a fast-convergence algorithm is proposed that improves performance by exploiting the statistics of the incoming signal samples together with multiple gain loops. In addition, a timing estimation algorithm is proposed that is robust to signal level fluctuation.

Original language: English
Pages (from-to): 4353-4358
Number of pages: 6
Journal: Information (Japan)
Volume: 18
Issue number: 10
State: Published - Oct 2015

Keywords

  • Automatic gain control
  • OFDM
  • Symbol timing
  • Synchronization
