HyperAI超神经
Time Series Forecasting On Etth1 720 2

Evaluation Metrics

MAE
MSE
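For reference, the two leaderboard metrics are the standard mean absolute error and mean squared error over the forecast horizon. A minimal sketch of their definitions (toy values below are illustrative, not taken from the leaderboard):

```python
def mae(y_true, y_pred):
    """Mean Absolute Error: average of |y_true - y_pred|."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def mse(y_true, y_pred):
    """Mean Squared Error: average of (y_true - y_pred)**2."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Toy forecast vs. ground truth (hypothetical values)
y_true = [0.5, 1.0, 1.5]
y_pred = [0.4, 1.2, 1.4]
print(mae(y_true, y_pred))  # ~0.1333
print(mse(y_true, y_pred))  # 0.02
```

Lower is better for both; MSE penalizes large errors more heavily than MAE.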

Evaluation Results

Performance of each model on this benchmark:

| Model | MAE | MSE | Paper Title | Repository |
|---|---|---|---|---|
| DLinear | 0.359 | 0.189 | Are Transformers Effective for Time Series Forecasting? | - |
| Transformer | 0.4213 | 0.2501 | Long-term series forecasting with Query Selector -- efficient model of sparse attention | - |
| QuerySelector | 0.373 | 0.2136 | Long-term series forecasting with Query Selector -- efficient model of sparse attention | - |
| Informer | 0.357 | 0.201 | Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting | - |
| SCINet | 0.25 | 0.099 | SCINet: Time Series Modeling and Forecasting with Sample Convolution and Interaction | - |
| PatchTST/64 | 0.236 | 0.087 | A Time Series is Worth 64 Words: Long-term Forecasting with Transformers | - |
| FiLM | 0.24 | 0.09 | FiLM: Frequency improved Legendre Memory Model for Long-term Time Series Forecasting | - |
| AutoCon | 0.223 | 0.078 | Self-Supervised Contrastive Learning for Long-term Forecasting | - |
| SegRNN | 0.233 | 0.085 | SegRNN: Segment Recurrent Neural Network for Long-Term Time Series Forecasting | - |
| PatchMixer | 0.243 | 0.093 | PatchMixer: A Patch-Mixing Architecture for Long-Term Time Series Forecasting | - |
| NLinear | 0.226 | 0.08 | Are Transformers Effective for Time Series Forecasting? | - |
| Parallel Series Transformer | 0.286 | 0.129 | How Features Benefit: Parallel Series Embedding for Multivariate Time Series Forecasting with Transformer | - |