LONG-TERM SERIES FORECASTING WITH QUERY SELECTOR – EFFICIENT MODEL OF SPARSE ATTENTION

Overview

Query Selector

This repository contains the code and data loaders for the paper https://arxiv.org/pdf/2107.08687v1.pdf. Query Selector is a novel sparse-attention Transformer algorithm that is especially well suited to long-term time series forecasting; a rough sketch of the idea is given below.
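
The exact selection rule is defined in the paper and the repository code; the snippet below is only a minimal PyTorch sketch of the general idea behind query-sparse attention: score the queries, keep the top fraction, run exact attention for those, and fill the remaining positions with a cheap default. The function name, the mean-key scoring rule, and the mean-of-values fallback are illustrative assumptions, not the algorithm from the paper.

```python
import torch

def sparse_query_attention(q, k, v, frac=0.25):
    """Illustrative query-sparse attention (not the paper's exact rule).

    q, k, v: tensors of shape (batch, seq_len, d_model).
    Only the top `frac` fraction of queries attend exactly; the remaining
    output rows are filled with the mean of the values.
    """
    b, l_q, d = q.shape
    n_top = max(1, int(frac * l_q))

    # Score each query by its similarity to the mean key (assumed heuristic).
    mean_key = k.mean(dim=1, keepdim=True)                 # (b, 1, d)
    scores = (q * mean_key).sum(dim=-1)                    # (b, l_q)
    top_idx = scores.topk(n_top, dim=-1).indices           # (b, n_top)
    idx = top_idx.unsqueeze(-1).expand(-1, -1, d)          # (b, n_top, d)

    # Exact scaled dot-product attention for the selected queries only.
    q_top = torch.gather(q, 1, idx)
    attn = torch.softmax(q_top @ k.transpose(1, 2) / d ** 0.5, dim=-1)
    out_top = attn @ v                                     # (b, n_top, d)

    # Non-selected queries receive the mean of the values as a cheap default.
    out = v.mean(dim=1, keepdim=True).expand(-1, l_q, -1).clone()
    out.scatter_(1, idx, out_top)
    return out

# Example: batch of 2 sequences, length 96, model width 64.
q = torch.randn(2, 96, 64); k = torch.randn(2, 96, 64); v = torch.randn(2, 96, 64)
print(sparse_query_attention(q, k, v).shape)   # torch.Size([2, 96, 64])
```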

Dependencies

Python            3.7.9
deepspeed         0.4.0
numpy             1.20.3
pandas            1.2.4
scipy             1.6.3
tensorboardX      1.8
torch             1.7.1
torchaudio        0.7.2
torchvision       0.8.2
tqdm              4.61.0
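
A minimal sketch for checking that the active environment matches the versions listed above; it assumes setuptools (for pkg_resources) is installed and is not part of the repository itself.

```python
import pkg_resources

# Versions copied from the dependency list above.
EXPECTED = {
    "deepspeed": "0.4.0",
    "numpy": "1.20.3",
    "pandas": "1.2.4",
    "scipy": "1.6.3",
    "tensorboardX": "1.8",
    "torch": "1.7.1",
    "torchaudio": "0.7.2",
    "torchvision": "0.8.2",
    "tqdm": "4.61.0",
}

for name, expected in EXPECTED.items():
    try:
        installed = pkg_resources.get_distribution(name).version
    except pkg_resources.DistributionNotFound:
        installed = "not installed"
    flag = "" if installed == expected else "  <-- differs"
    print(f"{name:15s} expected {expected:10s} installed {installed}{flag}")
```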

Results on ETT dataset

In the tables below, the MSE ratio column is the lower of the Transformer and Query Selector MSE divided by the Informer MSE.

Univariate

| Data  | Prediction len | Informer MSE | Informer MAE | Transformer MSE | Transformer MAE | Query Selector MSE | Query Selector MAE | MSE ratio |
|-------|---------------:|-------------:|-------------:|----------------:|----------------:|-------------------:|-------------------:|----------:|
| ETTh1 | 24  | 0.0980 | 0.2470 | 0.0548 | 0.1830 | 0.0436 | 0.1616 | 0.445 |
| ETTh1 | 48  | 0.1580 | 0.3190 | 0.0740 | 0.2144 | 0.0721 | 0.2118 | 0.456 |
| ETTh1 | 168 | 0.1830 | 0.3460 | 0.1049 | 0.2539 | 0.0935 | 0.2371 | 0.511 |
| ETTh1 | 336 | 0.2220 | 0.3870 | 0.1541 | 0.3201 | 0.1267 | 0.2844 | 0.571 |
| ETTh1 | 720 | 0.2690 | 0.4350 | 0.2501 | 0.4213 | 0.2136 | 0.3730 | 0.794 |
| ETTh2 | 24  | 0.0930 | 0.2400 | 0.0999 | 0.2479 | 0.0843 | 0.2239 | 0.906 |
| ETTh2 | 48  | 0.1550 | 0.3140 | 0.1218 | 0.2763 | 0.1117 | 0.2622 | 0.721 |
| ETTh2 | 168 | 0.2320 | 0.3890 | 0.1974 | 0.3547 | 0.1753 | 0.3322 | 0.756 |
| ETTh2 | 336 | 0.2630 | 0.4170 | 0.2191 | 0.3805 | 0.2088 | 0.3710 | 0.794 |
| ETTh2 | 720 | 0.2770 | 0.4310 | 0.2853 | 0.4340 | 0.2585 | 0.4130 | 0.933 |
| ETTm1 | 24  | 0.0300 | 0.1370 | 0.0143 | 0.0894 | 0.0139 | 0.0870 | 0.463 |
| ETTm1 | 48  | 0.0690 | 0.2030 | 0.0328 | 0.1388 | 0.0342 | 0.1408 | 0.475 |
| ETTm1 | 96  | 0.1940 | 0.2030 | 0.0695 | 0.2085 | 0.0702 | 0.2100 | 0.358 |
| ETTm1 | 288 | 0.4010 | 0.5540 | 0.1316 | 0.2948 | 0.1548 | 0.3240 | 0.328 |
| ETTm1 | 672 | 0.5120 | 0.6440 | 0.1728 | 0.3437 | 0.1735 | 0.3427 | 0.338 |

Multivariate

| Data  | Prediction len | Informer MSE | Informer MAE | Transformer MSE | Transformer MAE | Query Selector MSE | Query Selector MAE | MSE ratio |
|-------|---------------:|-------------:|-------------:|----------------:|----------------:|-------------------:|-------------------:|----------:|
| ETTh1 | 24  | 0.5770 | 0.5490 | 0.4496 | 0.4788 | 0.4226 | 0.4627 | 0.732 |
| ETTh1 | 48  | 0.6850 | 0.6250 | 0.4668 | 0.4968 | 0.4581 | 0.4878 | 0.669 |
| ETTh1 | 168 | 0.9310 | 0.7520 | 0.7146 | 0.6325 | 0.6835 | 0.6088 | 0.734 |
| ETTh1 | 336 | 1.1280 | 0.8730 | 0.8321 | 0.7041 | 0.8503 | 0.7039 | 0.738 |
| ETTh1 | 720 | 1.2150 | 0.8960 | 1.1080 | 0.8399 | 1.1150 | 0.8428 | 0.912 |
| ETTh2 | 24  | 0.7200 | 0.6650 | 0.4237 | 0.5013 | 0.4124 | 0.4864 | 0.573 |
| ETTh2 | 48  | 1.4570 | 1.0010 | 1.5220 | 0.9488 | 1.4074 | 0.9317 | 0.966 |
| ETTh2 | 168 | 3.4890 | 1.5150 | 1.6225 | 0.9726 | 1.7385 | 1.0125 | 0.465 |
| ETTh2 | 336 | 2.7230 | 1.3400 | 2.6617 | 1.2189 | 2.3168 | 1.1859 | 0.851 |
| ETTh2 | 720 | 3.4670 | 1.4730 | 3.1805 | 1.3668 | 3.0664 | 1.3084 | 0.884 |
| ETTm1 | 24  | 0.3230 | 0.3690 | 0.3150 | 0.3886 | 0.3351 | 0.3875 | 0.975 |
| ETTm1 | 48  | 0.4940 | 0.5030 | 0.4454 | 0.4620 | 0.4726 | 0.4702 | 0.902 |
| ETTm1 | 96  | 0.6780 | 0.6140 | 0.4641 | 0.4823 | 0.4543 | 0.4831 | 0.670 |
| ETTm1 | 288 | 1.0560 | 0.7860 | 0.6814 | 0.6312 | 0.6185 | 0.5991 | 0.586 |
| ETTm1 | 672 | 1.1920 | 0.9260 | 1.1365 | 0.8572 | 1.1273 | 0.8412 | 0.946 |
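
For reference, the reported metrics follow the standard MSE/MAE definitions over the forecast windows. The sketch below uses placeholder arrays (not the repository's loaders) and shows how the MSE ratio defined above is obtained for one table row.

```python
import numpy as np

def mse(pred, true):
    return float(np.mean((pred - true) ** 2))

def mae(pred, true):
    return float(np.mean(np.abs(pred - true)))

# Placeholder arrays of shape (n_windows, prediction_len, n_features);
# 1 feature in the univariate setting, 7 in the multivariate ETT setting.
pred = np.random.rand(100, 24, 7)
true = np.random.rand(100, 24, 7)
print(mse(pred, true), mae(pred, true))

# MSE ratio as defined above, for the ETTh1 / prediction length 24 row:
print(min(0.0548, 0.0436) / 0.0980)   # ~0.445
```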

State of the Art

[Papers with Code state-of-the-art badges]

Citation

@misc{klimek2021longterm,
      title={Long-term series forecasting with Query Selector -- efficient model of sparse attention}, 
      author={Jacek Klimek and Jakub Klimek and Witold Kraskiewicz and Mateusz Topolewski},
      year={2021},
      eprint={2107.08687},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}

Contact

If you have any questions, please contact us by email: [email protected]
