N-HiTS — Making Deep Learning for Time Series Forecasting More Efficient

A deep dive into how N-HiTS works and how you can use it

Jonte Dancker
Towards Data Science


Architecture of N-HiTS (Image taken from Challu and Olivares et al.).

In 2020, N-BEATS was the first deep-learning model to outperform statistical and hybrid models in time series forecasting.

Two years later, in 2022, a new model threw N-BEATS off its throne. Challu and Olivares et al. published the deep learning model N-HiTS. They addressed two shortcomings of N-BEATS for longer forecast horizons:

  • decreasing accuracy and
  • increasing computational cost.

N-HiTS stands for Neural Hierarchical Interpolation for Time Series Forecasting.

The model builds on N-BEATS and its idea of neural basis expansion. The neural basis expansion takes place in several blocks across layered stacks.
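To make the idea concrete before the deep dive, here is a minimal NumPy sketch of a single N-HiTS-style block. All weights are random stand-ins for what a trained network would learn, and the pooling rate and coefficient count are illustrative choices, not values from the paper. The shape of the computation, though, is the model's core trick: downsample the input, predict only a few coefficients, and interpolate them up to the full forecast horizon.

```python
import numpy as np

rng = np.random.default_rng(0)

input_size, horizon = 24, 12
x = rng.normal(size=input_size)  # lookback window

# 1) Multi-rate sampling: max-pool the input to a coarser resolution,
#    so the block sees a smoothed, low-frequency view of the series.
pool = 4
x_pooled = x.reshape(-1, pool).max(axis=1)

# 2) A tiny stand-in for the block's MLP: a random linear map from the
#    pooled input to a handful of forecast coefficients (a real block
#    learns these weights during training).
n_coeffs = 4
W = rng.normal(size=(n_coeffs, x_pooled.size)) * 0.1
theta = W @ x_pooled

# 3) Hierarchical interpolation: stretch the few coarse coefficients
#    linearly across the full forecast horizon.
coarse_grid = np.linspace(0.0, 1.0, n_coeffs)
fine_grid = np.linspace(0.0, 1.0, horizon)
forecast = np.interp(fine_grid, coarse_grid, theta)

print(forecast.shape)  # (12,)
```

Because each block only has to emit `n_coeffs` values instead of `horizon` values, the output layer stays small even for long horizons, which is exactly where N-BEATS becomes expensive.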

In this article, I will walk through the architecture behind N-HiTS, focusing on the differences from N-BEATS. Do not be afraid: the deep dive will be easy to understand. But understanding how N-HiTS works is not enough on its own, so I will also show you how to implement an N-HiTS model in Python and tune its hyperparameters.

If the core idea is the same, what is the difference between N-BEATS and N-HiTS?
