Machine Learning With Gaussian Process Regression For Time-Varying Channel Estimation
Main Authors: , ,
Format: Conference Proceeding
Language: English
Online Access: Request full text
Summary: The minimum mean-squared error (MMSE) estimator is recognized as the best estimator of transmission channel distortion in orthogonal frequency division multiplexing (OFDM) with pilot-symbol assisted modulation (PSAM) in the presence of noise. In practice, however, the estimator suffers from high complexity and relies on estimates of second-order statistics, which may change rapidly in the small-scale fading environments of a high-mobility wireless transmission system. We propose using machine learning (ML) with Gaussian Process Regression (GPR) to adaptively learn the hyperparameters of a channel model, which can then be used to compute the MMSE estimates. Moreover, GPR can interpolate the channel estimates between pilot symbols more accurately than linear interpolation techniques. After describing the learning process and its equivalence to MMSE estimation, we derive the bit error rate (BER) for a receiver using GPR for time-domain interpolation, then use the BER to find a practical bound on the number of training points needed to achieve the best performance. We show that the performance of GPR-based ML is comparable to that of more complex neural-network-based ML.
ISSN: 1938-1883
DOI: 10.1109/ICC45855.2022.9838448
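
As a rough illustration of the approach described in the summary, the sketch below interpolates a simulated time-varying channel from noisy pilot observations using the GP posterior mean, which has the same form as the LMMSE estimator, K_*(K + σ_n²I)⁻¹y. This is not the authors' code: the RBF kernel, the hyperparameter values, the pilot spacing, and the toy fading model are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated "ground truth": a slowly varying complex channel gain over one frame
# (an assumed toy model, not the fading model used in the paper).
N = 200
t = np.arange(N, dtype=float)
h_true = np.cos(2 * np.pi * 0.005 * t) + 1j * np.sin(2 * np.pi * 0.003 * t)

# Noisy pilot observations every 10th symbol (PSAM-style pilot spacing).
pilot_idx = np.arange(0, N, 10)
sigma_n = 0.1
noise = (rng.standard_normal(pilot_idx.size)
         + 1j * rng.standard_normal(pilot_idx.size)) / np.sqrt(2)
y = h_true[pilot_idx] + sigma_n * noise

def rbf(a, b, ell=15.0, var=1.0):
    """Squared-exponential kernel; ell and var stand in for the hyperparameters
    the paper proposes to learn adaptively (e.g., by maximizing the GP
    marginal likelihood)."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / ell) ** 2)

tp = pilot_idx.astype(float)
K = rbf(tp, tp) + sigma_n ** 2 * np.eye(tp.size)  # pilot Gram matrix plus noise
Ks = rbf(t, tp)                                   # cross-covariance: all symbols vs. pilots

# GP posterior mean evaluated at every symbol time. If K matched the true
# channel covariance, this would be exactly the LMMSE estimate, here serving
# as the time-domain interpolator between pilot symbols.
h_hat = Ks @ np.linalg.solve(K, y)

print("mean-squared interpolation error:", np.mean(np.abs(h_hat - h_true) ** 2))
```

In a full implementation, ell and var would not be fixed by hand but fitted by maximizing the log marginal likelihood over recent pilot observations, which is the adaptive hyperparameter learning step the summary refers to.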