ustc-time-series/CoGenCast
CoGenCast: A Coupled Autoregressive-Flow Generative Framework for Time Series Forecasting

Python 3.10+ PyTorch 2.2.2 Time Series Forecasting

📝 Abstract

In this work, we propose CoGenCast, a hybrid generative framework that couples pre-trained LLMs with a flow-matching mechanism for effective time series forecasting. This repository contains the official code for our paper.

Full Abstract

Time series forecasting can be viewed as a generative problem that requires both semantic understanding of contextual conditions and stochastic modeling of continuous temporal dynamics. Existing approaches typically rely on either autoregressive large language models (LLMs) for semantic context modeling or diffusion-like models for continuous probabilistic generation. However, neither method alone can adequately model both aspects simultaneously. In this work, we propose CoGenCast, a hybrid generative framework that couples pre-trained LLMs with a flow-matching mechanism for effective time series forecasting. Specifically, we reconfigure pre-trained decoder-only LLMs into a native forecasting encoder–decoder backbone by modifying only the attention topology, enabling bidirectional context encoding and causal representation generation. Building on this, a flow-matching mechanism is further integrated to model temporal evolution, capturing continuous stochastic dynamics conditioned on the autoregressively generated representation. Notably, CoGenCast naturally supports multimodal forecasting and cross-domain unified training. Extensive experiments on multiple benchmarks show that CoGenCast consistently outperforms prior baselines.

🖼️ Overview

CoGenCast Framework

  • Figure 1. Overview of our proposed CoGenCast. Left (Training): We reconfigure decoder-only LLMs into an encoder–decoder backbone via an attention-only modification, and apply a continuous flow-matching mechanism conditioned on the LLM-generated representation. Right (Inference): Future patches are generated autoregressively and sampled via one-step flow-matching generation with low latency.
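
The "attention-only modification" described above can be illustrated with a minimal sketch (this is an assumption of how such a mask might look, not the repository's actual implementation): context tokens attend to each other bidirectionally, while generated tokens attend causally to themselves and fully to the context.

```python
import torch

def build_coupled_attention_mask(ctx_len: int, gen_len: int) -> torch.Tensor:
    """Illustrative sketch (not the official code): an attention mask that
    turns a decoder-only LLM into an encoder-decoder-style backbone by
    changing only the attention topology. True = attention allowed."""
    total = ctx_len + gen_len
    mask = torch.zeros(total, total, dtype=torch.bool)
    # Bidirectional block over the context (encoder) tokens
    mask[:ctx_len, :ctx_len] = True
    # Generated (decoder) tokens see the full context
    mask[ctx_len:, :ctx_len] = True
    # Causal (lower-triangular) block over the generated tokens
    mask[ctx_len:, ctx_len:] = torch.tril(
        torch.ones(gen_len, gen_len, dtype=torch.bool)
    )
    return mask
```

Because only the mask changes, the pre-trained weights can be reused unchanged; context tokens never attend to future generated tokens.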

✨ Key Features

  • Problem Reformulation: We highlight that an ideal forecasting approach should possess dual capabilities: semantic understanding over contextual conditions and stochastic modeling of continuous temporal dynamics.
  • Architecture Innovation: We propose CoGenCast, a hybrid generative framework that couples pre-trained LLMs with a continuous flow-matching mechanism for time series forecasting.
  • Empirical Superiority: CoGenCast consistently outperforms previous methods in forecasting across multiple benchmarks.
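
For readers unfamiliar with flow matching, here is a hedged sketch of a standard conditional flow-matching training objective (the names `model` and `cond` are illustrative, not the repo's API): points are sampled on a straight path from noise x0 to data x1, and the network regresses the path's constant velocity x1 − x0 given the time t and the conditioning representation.

```python
import torch

def flow_matching_loss(model, x1, cond):
    """Sketch of a conditional flow-matching loss. A straight path
    x_t = (1 - t) * x0 + t * x1 from noise x0 to data x1 has constant
    velocity (x1 - x0); the model predicts that velocity."""
    x0 = torch.randn_like(x1)                            # noise sample
    t = torch.rand(x1.size(0), *([1] * (x1.dim() - 1)))  # per-sample time in [0, 1)
    xt = (1 - t) * x0 + t * x1                           # point on the straight path
    v_target = x1 - x0                                   # ground-truth velocity
    v_pred = model(xt, t, cond)                          # predicted velocity
    return torch.mean((v_pred - v_target) ** 2)
```

Under this straight-path formulation, the one-step sampling mentioned in Figure 1 amounts to a single Euler step, x1 ≈ x0 + v(x0, 0, cond), which is what gives low-latency inference.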

🚀 Quick Start

Environment

  • Python 3.10 (recommended)

```shell
conda create -n cogencast python=3.10
conda activate cogencast
```

  • Install dependencies

```shell
pip install -r requirements.txt
pip install transformers
```

Datasets

We use the following ten real-world datasets for our experiments. They are placed in the ./datasets folder in the repository. Please ensure you adhere to each dataset's respective license when using them.

For all datasets, the CSV should contain:

  • date (timestamp)
  • feature columns (multivariate) or the target column (univariate)
  • json_summary (a text summary in the last column, used as contextual features)
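
The CSV layout above can be read as follows (a minimal sketch with pandas; the function name and return convention are illustrative, not the repo's loaders):

```python
import pandas as pd

def load_dataset(csv_path: str):
    """Sketch of reading a dataset CSV in the layout described above:
    a `date` column, numeric feature/target columns, and a trailing
    `json_summary` text column used as context."""
    df = pd.read_csv(csv_path)
    dates = pd.to_datetime(df["date"])                    # timestamps
    summaries = df["json_summary"]                        # text context (last column)
    features = df.drop(columns=["date", "json_summary"])  # numeric series
    return dates, features, summaries
```

For a univariate dataset, `features` holds a single target column; for multivariate data it holds all series.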

Backbone

Download the Qwen3-0.6B weights from Hugging Face and set the local path via --llm_path.
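
Loading the downloaded weights from that local path might look like the following (a sketch using the Hugging Face `transformers` auto classes; the function name is illustrative and may differ from how the repo wires `--llm_path` internally):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

def load_backbone(llm_path: str):
    """Sketch: load the local Qwen3-0.6B weights that --llm_path points at."""
    tokenizer = AutoTokenizer.from_pretrained(llm_path)
    model = AutoModelForCausalLM.from_pretrained(llm_path)
    return tokenizer, model
```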

Run Demo

```shell
cd CoGenCast
sh scripts/ETTh1.sh
```

📊 Performance

Main Results

🙏 Acknowledgement

This repo is built on pioneering prior works. We are grateful to the GitHub repositories that provided valuable code bases and datasets.

📌 Citation

🙋 Please let us know if you find any mistakes or have suggestions.

⭐ If you find our work helpful, please consider starring this repository and citing our research.
