Venue | Title |
---|---|
Under Review of ICLR'25 | Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts
Under Review of ICLR'25 | In-context Time Series Predictor
NIPS'24 | SOFTS: Efficient Multivariate Time Series Forecasting with Series-Core Fusion |
NIPS'24 | Are Self-Attentions Effective for Time Series Forecasting? |
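Time-MoE above (and Moirai-MoE further down) build sparse mixture-of-experts layers into time series Transformers. Below is a minimal sketch of the idea, assuming a top-k token router over small feed-forward experts; all names and sizes are illustrative and not taken from either paper.

```python
# Minimal sketch: top-k routed mixture-of-experts feed-forward block (PyTorch).
# Illustrative only; not code from Time-MoE or Moirai-MoE.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    def __init__(self, d_model=64, d_hidden=128, n_experts=8, k=2):
        super().__init__()
        self.k = k
        self.gate = nn.Linear(d_model, n_experts)               # router
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        ])

    def forward(self, x):                                        # x: (batch, tokens, d_model)
        probs = F.softmax(self.gate(x), dim=-1)                  # routing probabilities
        topk_w, topk_idx = probs.topk(self.k, dim=-1)            # keep k experts per token
        topk_w = topk_w / topk_w.sum(dim=-1, keepdim=True)       # renormalize kept weights
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):                # dense loop for clarity;
            w = (topk_w * (topk_idx == e)).sum(-1, keepdim=True)  # zero if expert not chosen
            out = out + w * expert(x)                            # real systems dispatch sparsely
        return out

print(TopKMoE()(torch.randn(2, 16, 64)).shape)                   # torch.Size([2, 16, 64])
```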
Venue | Title |
---|---|
ICML'24 | Position: What Can Large Language Models Tell Us about Time Series Analysis |
KDD'24 | Foundation Models for Time Series Analysis: A Tutorial and Survey |
Venue | Title | Keywords |
---|---|---|
Under Review of ICLR'25 | FoundTS: Comprehensive and Unified Benchmarking of Foundation Models for Time Series Forecasting | |
Under Review of ICLR'25 | GIFT-Eval: A Benchmark for General Time Series Forecasting Model Evaluation |
Venue | Title | Keywords |
---|---|---|
Under Review of ICLR'25 | Moirai-MoE: Empowering Time Series Foundation Models with Sparse Mixture of Experts | |
Under Review of ICLR'25 | FlexTSF: A universal forecasting model for time series with variable regularities | |
Under Review of ICLR'25 | Towards Generalisable Time Series Understanding Across Domains | |
NIPS'24 | Large Pre-trained time series models for cross-domain Time series analysis tasks | |
NIPS'24 | UNITS: A Unified Multi-Task Time Series Model | One model for many tasks; Prompt tuning
ICML'24 | Unified Training of Universal Time Series Forecasting Transformers | Multivariate; Large-scale data; Variable window size |
ICML'24 | Timer: Generative Pre-trained Transformers Are Large Time Series Models | Channel independence; Large-scale data; Auto-regression
ICML'24 | MOMENT: A Family of Open Time-series Foundation Models | Masked auto-encoder pretraining |
ICML'24 | A decoder-only foundation model for time-series forecasting | Auto-regressive patch-wise decoding |
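Two keywords in this table recur across these models: channel independence and auto-regressive patch-wise decoding. Below is a minimal sketch of both ideas together, assuming a toy causal Transformer; dimensions and layer sizes are illustrative and this is not the architecture of Timer or the decoder-only model above.

```python
# Minimal sketch: channel-independent patching + next-patch autoregressive decoding.
import torch
import torch.nn as nn

class PatchARDecoder(nn.Module):
    def __init__(self, patch_len=16, d_model=64, n_layers=2, n_heads=4):
        super().__init__()
        self.patch_len = patch_len
        self.embed = nn.Linear(patch_len, d_model)               # patch -> token
        layer = nn.TransformerEncoderLayer(d_model, n_heads, 4 * d_model,
                                           batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, patch_len)                # token -> next patch

    def forward(self, x):                                         # x: (batch, channels, length)
        b, c, l = x.shape
        x = x.reshape(b * c, l // self.patch_len, self.patch_len)  # channel independence:
        tok = self.embed(x)                                        # each channel is its own sample
        causal = nn.Transformer.generate_square_subsequent_mask(tok.size(1))
        h = self.backbone(tok, mask=causal)                       # causal self-attention
        next_patch = self.head(h)                                  # predict the following patch
        return next_patch.reshape(b, c, -1)

y = PatchARDecoder()(torch.randn(2, 3, 96))                        # 96 = 6 patches of 16
print(y.shape)                                                     # torch.Size([2, 3, 96])
```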
Venue | Title | Keywords |
---|---|---|
Under Review of ICLR'25 | Enhancing Foundation Models for Time Series Forecasting via Wavelet-based Tokenization | |
Under Review of ICLR'25 | Towards Adaptive Time Series Foundation Models Against Distribution Shift |
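The wavelet-based tokenization in the first entry refers to feeding the model multi-resolution wavelet coefficients instead of raw patches. A minimal sketch with PyWavelets, assuming a db4 wavelet and a 3-level decomposition; the grouping of bands into token groups is an illustrative choice.

```python
# Minimal sketch: turn a series into multi-resolution wavelet coefficient bands.
import numpy as np
import pywt

def wavelet_tokens(x, wavelet="db4", level=3):
    """x: 1-D series -> list of coefficient arrays, coarse to fine."""
    coeffs = pywt.wavedec(x, wavelet, level=level)                 # [cA3, cD3, cD2, cD1]
    return [c.astype(np.float32) for c in coeffs]                  # each band becomes a token group

tokens = wavelet_tokens(np.sin(np.linspace(0, 20, 256)))
print([t.shape for t in tokens])                                   # coefficient lengths per band
```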
Venue | Title | Keywords |
---|---|---|
Under Review of ICLR'25 | In-context Fine-tuning for Time-series Foundation Models | |
Venue | Title | Keywords |
---|---|---|
ICML'24 | UP2ME: Univariate Pre-training to Multivariate Fine-tuning as a General-purpose Framework for Multivariate Time Series Analysis | Masked auto-encoder pretraining; Variable window size; Multivariate fine-tuning |
ICML'24 | Multi-Patch Prediction: Adapting LLMs for Time Series Representation Learning | Autoregressive patch-wise decoding |
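Below is a minimal sketch of the masked auto-encoder pretraining tagged for UP2ME here (and MOMENT earlier), assuming a toy patch-level MLP in place of the real backbone; patch size and mask ratio are illustrative.

```python
# Minimal sketch: masked-reconstruction pretraining on patches.
import torch
import torch.nn as nn

def masked_pretrain_step(model, x, patch_len=16, mask_ratio=0.4):
    """x: (batch, length); the model sees zeroed-out patches and must rebuild them."""
    b, l = x.shape
    patches = x.reshape(b, l // patch_len, patch_len)              # (batch, n_patches, patch_len)
    mask = torch.rand(b, patches.size(1), 1) < mask_ratio           # True = hidden from the model
    recon = model(patches.masked_fill(mask, 0.0))                   # reconstruct every patch
    mask_f = mask.float().expand_as(patches)
    return ((recon - patches) ** 2 * mask_f).sum() / mask_f.sum().clamp(min=1)

toy_backbone = nn.Sequential(nn.Linear(16, 64), nn.GELU(), nn.Linear(64, 16))
loss = masked_pretrain_step(toy_backbone, torch.randn(8, 96))
loss.backward()                                                     # loss covers masked patches only
print(float(loss))
```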
Venue | Title | Keywords |
---|---|---|
Under Review of ICLR'25 | TimeRAF: Retrieval-Augmented Foundation model for Zero-shot Time Series Forecasting | |
Under Review of ICLR'25 | TimeRAG: It's Time for Retrieval-Augmented Generation in Time-Series Forecasting | |
Under Review of ICLR'25 | Retrieval Augmented Time Series Forecasting | |
Under Review of ICLR'25 | CRAFT: Time Series Forecasting with Cross-Future Behavior Awareness | |
Under Review of ICLR'25 | Metadata Matters for Time Series: Informative Forecasting with Transformers | |
ICML'24 | Time Weaver: A Conditional Time Series Generation Model | Diffusion |
ICML'24 | S$^2$IP-LLM: Semantic Space Informed Prompt Learning with LLM for Time Series Forecasting | Retrieve word embeddings |
KDD'24 | POND: Multi-Source Time Series Domain Adaptation with Information-Aware Prompt Tuning | Transfer |
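The retrieval-augmented entries above (TimeRAF, TimeRAG, Retrieval Augmented Time Series Forecasting) share one step: find the historical windows most similar to the query window and hand their continuations to the forecaster as extra context. A minimal sketch of that step, assuming a plain nearest-neighbour search over a window database; the database and the naive use of the retrieved futures are illustrative stand-ins.

```python
# Minimal sketch: nearest-neighbour retrieval of similar windows and their futures.
import numpy as np

def retrieve(query, db_windows, db_futures, k=3):
    """query: (L,); db_windows: (N, L); db_futures: (N, H)."""
    q = (query - query.mean()) / (query.std() + 1e-8)               # normalize before matching
    d = (db_windows - db_windows.mean(1, keepdims=True)) / (db_windows.std(1, keepdims=True) + 1e-8)
    sims = d @ q / len(q)                                           # cosine-like similarity
    idx = np.argsort(-sims)[:k]
    return db_futures[idx]                                          # (k, H) retrieved continuations

rng = np.random.default_rng(0)
db_w, db_f = rng.normal(size=(1000, 96)), rng.normal(size=(1000, 24))
context = retrieve(rng.normal(size=96), db_w, db_f)
naive_forecast = context.mean(0)                                    # average retrieved futures, or
print(naive_forecast.shape)                                         # feed them to the model as prompts
```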
Venue | Title | Keywords |
---|---|---|
Under Review of ICLR'25 | Timer-XL: Long-Context Transformers for Unified Time Series Forecasting |
Venue | Title | Keywords |
---|---|---|
Under Review of ICLR'25 | FastTF: 4 Parameters are All You Need for Long-term Time Series Forecasting | |
ICML'24 | SparseTSF: Modeling Long-term Time Series Forecasting with 1k Parameters |
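Both entries above push parameter counts down by exploiting periodicity: downsample the lookback by an assumed period w into w phase-aligned sub-series and share one small linear map across them, so the parameter count is roughly (L/w)*(H/w) rather than L*H. A minimal sketch of that idea, assuming period 24; this is not the authors' code.

```python
# Minimal sketch: period-downsampled shared linear forecaster with ~10^2 parameters.
import torch
import torch.nn as nn

class SparseLinearForecaster(nn.Module):
    def __init__(self, lookback=720, horizon=96, period=24):
        super().__init__()
        assert lookback % period == 0 and horizon % period == 0
        self.w = period
        self.linear = nn.Linear(lookback // period, horizon // period)  # shared across phases

    def forward(self, x):                                           # x: (batch, lookback)
        b = x.size(0)
        sub = x.reshape(b, -1, self.w).transpose(1, 2)               # (batch, period, lookback/period)
        out = self.linear(sub)                                       # forecast each phase's sub-series
        return out.transpose(1, 2).reshape(b, -1)                    # (batch, horizon)

m = SparseLinearForecaster()
print(sum(p.numel() for p in m.parameters()))                        # 124 (30*4 weights + 4 biases)
print(m(torch.randn(8, 720)).shape)                                  # torch.Size([8, 96])
```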
Venue | Title | Keywords |
---|---|---|
ICML'24 | SIN: Selective and Interpretable Normalization for Long-Term Time Series Forecasting |
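SIN studies which statistics to remove before forecasting. A common baseline in this family is reversible instance normalization (RevIN-style): normalize each input window, forecast in the normalized space, then restore the statistics. A minimal sketch with a placeholder linear forecaster; RevIN's learnable affine parameters are omitted here.

```python
# Minimal sketch: instance-normalize, forecast, de-normalize.
import torch
import torch.nn as nn

class NormalizedForecaster(nn.Module):
    def __init__(self, lookback=96, horizon=24):
        super().__init__()
        self.model = nn.Linear(lookback, horizon)                    # placeholder forecaster

    def forward(self, x):                                            # x: (batch, lookback)
        mu = x.mean(dim=-1, keepdim=True)
        sigma = x.std(dim=-1, keepdim=True) + 1e-8
        y = self.model((x - mu) / sigma)                             # forecast the normalized window
        return y * sigma + mu                                        # put the statistics back

print(NormalizedForecaster()(torch.randn(4, 96)).shape)              # torch.Size([4, 24])
```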
Venue | Title | Keywords |
---|---|---|
NIPS'24 | Time-MMD: A New Multi-Domain Multimodal Dataset for Time Series Analysis | |
|  | MoAT: Multi-Modal Augmented Time Series Forecasting | News article