Abstract
In an OFDM (Orthogonal Frequency Division Multiplexing) system operating in burst mode, most digital front-end functions rely on the training sequence or preamble. The main goal of the AGC (Automatic Gain Control) is to set the optimum signal power level at the input of the analog-to-digital converter. The AGC is followed by a timing detection block that determines the start point of the OFDM symbol. Timing detection normally exploits the repeated pattern of the preamble signal, and its estimation performance depends on the number of observation samples as well as the adopted algorithm. In this paper, a joint optimization method is proposed to maximize the performance of both the AGC and the timing detection. On the AGC side, a fast-convergence algorithm is proposed that improves performance by exploiting the statistics of the incoming signal samples and multiple gain loops. In addition, a timing estimation algorithm is proposed that is robust to signal level fluctuation.
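The paper's proposed algorithms are not reproduced in the abstract, so the sketch below shows only textbook baselines for the two front-end blocks it describes, assuming complex baseband samples in NumPy: a one-step log-domain AGC gain update that drives the measured power toward a target ADC input level, and the classic autocorrelation-based timing metric for a preamble with two identical halves (Schmidl-Cox style), whose normalization makes it insensitive to the absolute signal level. The half-preamble length `L`, the target level `target_dbfs`, and the loop step `mu` are illustrative parameters, not values from the paper.

```python
import numpy as np

def agc_step(gain_db, samples, target_dbfs=-12.0, mu=0.5):
    """One iteration of a log-domain AGC loop (illustrative baseline,
    not the paper's algorithm): nudge the gain so the measured
    short-term power approaches the target ADC input level."""
    power_db = 10.0 * np.log10(np.mean(np.abs(samples) ** 2) + 1e-12)
    # Step a fraction mu of the remaining error; a larger mu converges
    # faster at the cost of stability, which is the trade-off a
    # fast-convergence AGC design has to manage.
    return gain_db + mu * (target_dbfs - power_db)

def timing_metric(r, L):
    """Schmidl-Cox style timing metric for a preamble whose two halves
    of length L are identical. The peak of M(d) marks the estimated
    symbol start."""
    N = len(r) - 2 * L
    M = np.zeros(N)
    for d in range(N):
        # Correlate the two candidate preamble halves.
        P = np.sum(np.conj(r[d:d + L]) * r[d + L:d + 2 * L])
        # Energy of the second half normalizes the metric, so M(d)
        # is robust to fluctuations of the absolute signal level.
        R = np.sum(np.abs(r[d + L:d + 2 * L]) ** 2)
        M[d] = np.abs(P) ** 2 / (R ** 2 + 1e-12)
    return M

# Example (hypothetical received samples rx):
# d_hat = int(np.argmax(timing_metric(rx, L=64)))
```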
| Original language | English |
|---|---|
| Pages (from-to) | 4353-4358 |
| Number of pages | 6 |
| Journal | Information (Japan) |
| Volume | 18 |
| Issue number | 10 |
| State | Published - Oct 2015 |
Keywords
- Automatic gain control
- OFDM
- Symbol timing
- Synchronization