
FEAT: TimeXer #1267

Merged — 10 commits merged into main on Feb 18, 2025
Conversation

@marcopeix (Contributor)
Add TimeXer to neuralforecast. This is a transformer-based model that supports future exogenous features and appears to perform very well in long-horizon forecasting, according to the results reported in the paper.

Original code implementation: https://github.com/thuml/TimeXer/tree/main
Paper: https://arxiv.org/abs/2402.19072

Note: The encoder and embedding are specific to TimeXer, that's why I am not reusing the existing common module from other models.
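To illustrate the patch-style embedding that TimeXer-like models apply to the endogenous series (this is a minimal sketch of the general idea, not the PR's actual code; the function name, shapes, and the random stand-in projection are all assumptions):

```python
import numpy as np

def patch_embed(series, patch_len, d_model, rng=None):
    """Split a 1-D series into non-overlapping patches and project each
    patch to d_model dimensions with a (random, untrained) linear map
    standing in for a learned projection layer."""
    rng = np.random.default_rng(0) if rng is None else rng
    n_patches = len(series) // patch_len
    # Drop any trailing remainder, then reshape into (n_patches, patch_len)
    patches = series[: n_patches * patch_len].reshape(n_patches, patch_len)
    W = rng.normal(size=(patch_len, d_model))
    return patches @ W  # (n_patches, d_model) tokens fed to the encoder

tokens = patch_embed(np.arange(96, dtype=float), patch_len=16, d_model=8)
# tokens.shape == (6, 8): six patch tokens, each of model dimension 8
```

Each patch becomes one token for the transformer encoder, which is what lets these models handle long lookback windows efficiently.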


@marcopeix marked this pull request as ready for review February 17, 2025 19:30
@marcopeix requested a review from elephaint February 17, 2025 19:30
@elephaint (Contributor) left a comment

LGTM. Maybe do the refactoring in a separate PR? It's non-blocking imho.

Small detail re. mint.json.

Does it (somewhat) reproduce the paper results?

@elephaint elephaint self-requested a review February 18, 2025 20:51
@elephaint (Contributor) left a comment
LGTM!

@marcopeix (Contributor, Author)

> LGTM, maybe the refactoring in a separate PR?
>
> We can also do it separately though, it's non-blocking imho.
>
> Small detail re. mint.json.
>
> Does it (somewhat) reproduce the paper results?

Here are the results for ETTm1:

| Metric | Paper (avg over 4 horizons) | neuralforecast (avg over 4 horizons) |
|--------|-----------------------------|--------------------------------------|
| MAE    | 0.397                       | 0.423                                |
| MSE    | 0.382                       | 0.445                                |

Note that I was training for 1,000 steps with early stopping on all horizons, and they all reached 1,000 steps, so training for longer might give scores closer to the paper's.
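For reference, the MAE and MSE compared above are the usual mean absolute error and mean squared error over the forecast horizon. A minimal sketch (plain Python, illustrative values only, unrelated to the ETTm1 numbers):

```python
def mae(y, yhat):
    """Mean absolute error: average of |actual - forecast|."""
    return sum(abs(a - b) for a, b in zip(y, yhat)) / len(y)

def mse(y, yhat):
    """Mean squared error: average of (actual - forecast)^2."""
    return sum((a - b) ** 2 for a, b in zip(y, yhat)) / len(y)

y = [1.0, 2.0, 3.0]       # actuals
yhat = [1.5, 2.0, 2.5]    # forecasts
mae(y, yhat)  # ≈ 0.333
mse(y, yhat)  # ≈ 0.167
```

MSE penalizes large errors more heavily than MAE, which is why the two gaps to the paper's numbers need not move together.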

@marcopeix merged commit 532988e into main on Feb 18, 2025 (17 checks passed)
@marcopeix deleted the feature/timexer branch February 18, 2025 20:59