This repository contains code for enhancing Channel Quality Indicator (CQI) prediction in 5G wireless networks using a novel loss function called Residual-based Adaptive Huber Loss (RAHL). CQI is a key metric that helps dynamically optimize network infrastructure to maintain high Quality of Service (QoS).
Accurate CQI prediction is critical for efficient resource allocation in 5G systems. Traditional loss functions like Mean Squared Error (MSE) and Mean Absolute Error (MAE) each have limitations:
- MSE is sensitive to outliers, since squaring disproportionately emphasizes large errors.
- MAE is robust to outliers but penalizes all errors linearly, which can blunt precision on inliers and degrade performance on noisy data.
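A toy calculation makes the contrast concrete: a single outlier dominates MSE while shifting MAE far less (illustrative numbers only).

```python
# Toy residuals: four small errors and one large outlier.
errors = [0.1, -0.2, 0.1, 0.0, 10.0]

mse = sum(e ** 2 for e in errors) / len(errors)  # squaring inflates the outlier
mae = sum(abs(e) for e in errors) / len(errors)  # linear penalty is gentler

print(f"MSE = {mse:.3f}")  # ~20.01, dominated by the single 10.0 outlier
print(f"MAE = {mae:.3f}")  # ~2.08
```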
The Huber loss combines the benefits of MSE and MAE by transitioning smoothly between them, controlled by a hyperparameter delta. However, choosing a good delta manually is difficult, and a poor choice can degrade model accuracy.
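For reference, the standard Huber loss is quadratic (MSE-like) for errors within delta and linear (MAE-like) beyond it:

```python
def huber(error: float, delta: float = 1.0) -> float:
    """Standard Huber loss for a single residual."""
    if abs(error) <= delta:
        return 0.5 * error ** 2                 # MSE-like region (inliers)
    return delta * (abs(error) - 0.5 * delta)   # MAE-like region (outliers)
```

For example, `huber(0.5)` gives `0.125` (quadratic branch), while `huber(5.0)` gives `4.5` (linear branch); the two branches meet continuously at `|error| = delta`.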
To address this, we propose RAHL, which:
- Incorporates a learnable residual term into delta
- Allows dynamic adaptation to the error distribution
- Enhances robustness to outliers while preserving precision on inliers
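A minimal sketch of the idea follows; the repository's actual implementation lives in `loss_RAHL.py`, and the class name, the softplus positivity constraint, and the zero initialization below are assumptions made for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class RAHLSketch(nn.Module):
    """Huber loss whose delta is shifted by a learnable residual term.

    delta_eff = softplus(base_delta + residual), where `residual` is optimized
    jointly with the model so the MSE/MAE transition point adapts to the error
    distribution. (Illustrative sketch, not the repository's exact code.)
    """

    def __init__(self, base_delta: float = 1.0):
        super().__init__()
        self.base_delta = base_delta
        self.residual = nn.Parameter(torch.zeros(1))  # learnable shift of delta

    def forward(self, pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        # Softplus keeps the effective delta strictly positive.
        delta = F.softplus(self.residual + self.base_delta)
        err = pred - target
        abs_err = err.abs()
        quadratic = 0.5 * err ** 2                    # inlier branch
        linear = delta * (abs_err - 0.5 * delta)      # outlier branch
        return torch.where(abs_err <= delta, quadratic, linear).mean()
```

Because `residual` is an `nn.Parameter`, adding the loss module's parameters to the optimizer lets delta adapt during training alongside the model weights.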
- Implementation of the RAHL loss function, adaptable during training
- Application to multiple deep learning architectures:
  - Long Short-Term Memory (LSTM)
  - 1D Convolutional Neural Network (CNN1D)
  - Informer model (attention-based architecture for time series)
- Improved CQI prediction performance demonstrated through extensive experiments
- Robustness against noisy and outlier data in wireless communication scenarios
Keywords: Channel Quality Indicator (CQI), 5G Networks, Loss Functions, Huber Loss, Adaptive Loss, Machine Learning, LSTM, CNN, Informer, Wireless Communications.
To run this project, you need the following Python libraries:
- numpy
- pandas
- torch
- scikit-learn
- matplotlib
- pyts
- thop
| File / Folder | Description |
|---|---|
| `loss_RAHL.py` | Implementation of the Residual-based Adaptive Huber Loss (RAHL) function. |
| `model_informer.py` | Informer model implementation (adapted from an external GitHub repository; see attribution below). |
| `models_lstm_cnn1d.py` | LSTM and CNN1D architectures. |
| `others_loss_function.py` | Additional loss functions. |
| `train_validation.py` | Training and validation logic for the models. |
| `main_for_informer.py` | Entry point for running the Informer model, including data preparation, training, and evaluation. |
| `main_for_lstm_or_cnn1d.py` | Entry point for running the LSTM or CNN1D models, including data preparation, training, and evaluation. |
| `dataset/` | Folder for dataset files (if any). |
| `README.md` | Project documentation and instructions. |
This is the main entry point for running the Informer model. It handles:
- Data loading and preprocessing
- Model initialization
- Training loop
- Evaluation on the test set
- Logging performance metrics
This script is used for running the LSTM or CNN1D models. It includes:
- Data preparation
- Model selection (LSTM or CNN1D)
- Training and validation
- Test performance evaluation
These main scripts import modular components from other files to keep the code clean and well-structured.
This project utilizes the Informer model, originally developed by Haoyi Zhou and collaborators. The model was introduced in their AAAI 2021 Best Paper titled:
Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting
🔗 Official PyTorch implementation: https://github.com/zhouhaoyi/Informer2020
We gratefully acknowledge their contribution and use their implementation as the foundation for our time-series forecasting tasks.
If you're working in this field, feel free to reach out; I'm happy to help or guide you if I can. You can contact me via email: [email protected] or [email protected]