
Commit 16a3501

add readme and licence
1 parent 7942cee commit 16a3501

4 files changed: +75 -2 lines changed

LICENSE

Lines changed: 21 additions & 0 deletions
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2021 xxxx

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

README.md

Lines changed: 52 additions & 0 deletions
@@ -0,0 +1,52 @@
# FEDformer (ICML 2022 under review)

Frequency Enhanced Decomposed Transformer (FEDformer) is more efficient than the standard Transformer, with complexity that grows linearly in the sequence length.

Our empirical studies on six benchmark datasets show that, compared with state-of-the-art methods, FEDformer can reduce prediction error by 14.8% and 22.6% for multivariate and univariate time series, respectively.

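This commit only adds the README, not the model code, so the repository's actual implementation is not shown here. As a rough illustration of why a fixed frequency budget gives linear cost, the following is a minimal PyTorch sketch; names such as `frequency_enhanced_block` and `num_modes` are hypothetical and not the repository's API.

```python
# Minimal sketch (not the repository's code): keeping a fixed set of Fourier
# modes makes the cost O(L) in the sequence length L, instead of the O(L^2)
# pairwise attention of a standard Transformer.
import torch


def frequency_enhanced_block(x: torch.Tensor, num_modes: int = 64) -> torch.Tensor:
    """x: (batch, seq_len, channels). `num_modes` is a fixed budget of
    frequencies, so the work per sample grows linearly with seq_len."""
    batch, seq_len, channels = x.shape

    # Real FFT along the time axis: (batch, seq_len // 2 + 1, channels) complex bins.
    spectrum = torch.fft.rfft(x, dim=1)

    # Keep only a fixed number of modes (here: the lowest frequencies);
    # the paper samples modes, but any fixed-size selection keeps cost linear.
    keep = min(num_modes, spectrum.shape[1])
    filtered = torch.zeros_like(spectrum)
    filtered[:, :keep, :] = spectrum[:, :keep, :]

    # A learned transform would normally act on the kept modes here; the
    # sketch simply inverts the truncated spectrum back to the time domain.
    return torch.fft.irfft(filtered, n=seq_len, dim=1)


if __name__ == "__main__":
    out = frequency_enhanced_block(torch.randn(8, 96, 7))
    print(out.shape)  # torch.Size([8, 96, 7])
```
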
## Get Started

1. Install Python 3.6 and PyTorch 1.9.0.
2. Download the data. You can obtain all six benchmarks from xxxx (a quick way to sanity-check a downloaded file is sketched below).
3. Train the model. We provide the experiment scripts for all benchmarks under the folder `./scripts`. You can reproduce the experiment results with:

```bash
bash ./scripts/run_M.sh
bash ./scripts/run_S.sh
```

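The download source above is elided, so nothing in this commit pins down the file names. As a hedged example only, if the ETT benchmark from the acknowledged ETDataset repo is used, a quick pandas sanity check might look like this; the path `./dataset/ETTh1.csv` is an assumed location, not one defined by this commit.

```python
# Hedged sanity check for a downloaded benchmark CSV; the file name and
# location are assumptions for illustration, not paths defined by this repo.
import pandas as pd

df = pd.read_csv("./dataset/ETTh1.csv")    # assumed location of one ETT file
print(df.shape)                            # rows x columns of the series
print(df.columns.tolist())                 # ETT-style files include a 'date' column
print(df["date"].min(), df["date"].max())  # covered time range
```
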
## Citation

If you find this repo useful, please cite our paper.

```
xxxxx
```

## Contact

If you have any questions or want to use the code, please contact xxx@xxxx.

## Acknowledgement

We appreciate the following GitHub repos for their valuable code bases and datasets:

https://github.com/thuml/Autoformer

https://github.com/zhouhaoyi/Informer2020

https://github.com/zhouhaoyi/ETDataset

https://github.com/laiguokun/multivariate-time-series-data

scripts/run_M.sh

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 export CUDA_VISIBLE_DEVICES=1
 
-cd ..
+#cd ..
 
 for model in FEDformer Autoformer Informer Transformer
 do

scripts/run_S.sh

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 export CUDA_VISIBLE_DEVICES=0
 
-cd ..
+#cd ..
 
 for model in FEDformer Autoformer Informer Transformer
 do
