Stars
A general stock prediction model based on neural networks
A series of Jupyter notebooks that walk you through the fundamentals of Machine Learning and Deep Learning in Python using Scikit-Learn, Keras and TensorFlow 2.
Code release for "Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting" (NeurIPS 2021), https://arxiv.org/abs/2106.13008
Unified Training of Universal Time Series Forecasting Transformers
The PyTorch implementation of BasisFormer from the NeurIPS paper "BasisFormer: Attention-based Time Series Forecasting with Learnable and Interpretable Basis"
A professionally curated list of awesome resources (paper, code, data, etc.) on transformers in time series.
A time series forecasting project using PyTorch Forecasting's Temporal Fusion Transformer (TFT) model.
Time series forecasting on an hourly energy dataset, with LSTM & Transformer models implemented in PyTorch Lightning. Deployment of the Transformer model using Docker, with GPU support.
A Library for Advanced Deep Time Series Models for General Time Series Analysis.
Official implementation for "TimeXer: Empowering Transformers for Time Series Forecasting with Exogenous Variables" (NeurIPS 2024)
A Python library for temporal disaggregation of time series data
Ego-centric multiple-correlation and temporal graph neural networks based residential load forecasting
Code for the CIKM 2019 paper "DSANet: Dual Self-Attention Network for Multivariate Time Series Forecasting".
Code for my master's thesis project: studying and improving attention mechanisms for trajectory prediction of moving agents.
A demonstration of the attention mechanism with some toy experiments and explanations.
This repository contains various types of attention mechanisms (Bahdanau attention, soft attention, additive attention, hierarchical attention, etc.) implemented in PyTorch, TensorFlow, and Keras.
PyTorch neural network attention mechanism
Code release for "FECAM: Frequency Enhanced Channel Attention Mechanism for Time Series Forecasting" ⌚
Implementation of the paper "Self-Adaptive Physics-Informed Neural Networks using a Soft Attention Mechanism" [AAAI-MLPS 2021]
Attention mechanism for processing sequential data that considers the context for each timestamp.
A wrapper layer for stacking layers horizontally
🍀 PyTorch implementations of various attention mechanisms, MLP, re-parameterization, and convolution modules, helpful for further understanding the papers. ⭐⭐⭐
Temporal Pattern Attention for Multivariate Time Series Forecasting
Bi-LSTM with Attention Mechanism for Climate Forecasting
Time Series Forecasting using Long Short Term Memory with Attention Mechanism
Local Attention Mechanism for time series forecasting.
AEFIN: Non-Stationary Time Series Forecasting Based on Fourier Analysis and Cross Attention Mechanism
Forecasting air pollution using temporal attention mechanism in Beijing
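Many of the repositories above build on the same core primitive: attention that weighs every timestamp of a series against every other. As a quick orientation, here is a minimal NumPy sketch of scaled dot-product self-attention over a toy multivariate series; all names and shapes are illustrative and do not come from any listed repo:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                    # (T, T) pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax over timestamps
    return weights @ v, weights                      # context vectors + attention map

# Toy multivariate series: T=8 timestamps, d=4 features.
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 4))

# Self-attention: queries, keys, and values all come from the series itself
# (real models first project x with learned matrices W_q, W_k, W_v).
context, weights = scaled_dot_product_attention(x, x, x)

print(context.shape)          # (8, 4): one context vector per timestamp
print(weights.sum(axis=-1))   # each row of the attention map sums to 1
```

The `weights` matrix is what the "interpretable" and "temporal pattern" variants above inspect or constrain: entry `(i, j)` says how much timestamp `j` contributes to the representation of timestamp `i`.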