Tiny, speedy AI models shake up business as we know it

In life, timing is critical.

That’s especially true in business, where every organization must forecast sales, demand, revenue, and capacity requirements. Accurate and reliable time-dependent forecasts could help every organization save—and earn—billions of dollars.

Time series forecasting is the bread and butter of enterprise planning. It involves predicting future values based on past observations collected at constant time intervals, whether daily, monthly, quarterly, or yearly.

Artificial intelligence is expected to accelerate and tighten business planning with new, faster, and smaller foundation models designed for multivariable time series forecasting. These models do not need to be the equivalent of an AI sledgehammer to drive results. Small foundation models, whether built for time series or other tasks, trained on high-quality, curated data are more energy-efficient and can achieve the same results or better.

How can time series AI models predict the future?

Time series models can be built from scratch or adapted from existing pre-trained models, and they are best suited to predicting outcomes in time series data. Just as AI's large language models compute relationships among words, time series models identify patterns in numerical data that can be projected forward to inform better decisions.

Time series foundation models look for patterns in historical observations to “understand” a temporal process. These abstract representations are what allow the models to solve predictive tasks. The longer the time series, the better the forecast.
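To make that idea concrete, here is a minimal sketch, assuming a synthetic monthly series and a simple 12-lag autoregressive fit; it illustrates learning patterns from historical observations and projecting them forward, not how any particular foundation model works internally.

```python
import numpy as np

# Synthetic monthly history: trend + yearly seasonality + noise (illustrative data only)
rng = np.random.default_rng(0)
t = np.arange(120)
history = 0.5 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, t.size)

# Build lagged features: predict each point from its previous 12 observations
lags = 12
X = np.array([history[i - lags:i] for i in range(lags, len(history))])
y = history[lags:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # least-squares AR(12) fit

# Roll the fitted model forward to forecast the next 6 months
window = list(history[-lags:])
forecast = []
for _ in range(6):
    nxt = float(np.dot(coef, window[-lags:]))
    forecast.append(nxt)
    window.append(nxt)

print("6-step forecast:", np.round(forecast, 2))
```

The same intuition carries over to foundation models: the longer and richer the history the model can learn from, the more structure it has to exploit when it predicts the future.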

However, these kinds of measurements pose complications in ways that words, code, and pixels do not. First, time series data is often continuous: Think of video streaming from a self-driving car, temperature readings from a reactor, or heart rate data from a smartwatch. There’s a lot of data to process, and its sequential order and directionality must be strictly preserved.

Time series data vary widely, from stock prices and satellite images to brain waves and light curves from distant stars. Compressing disparate observations into an abstract representation is an enormous challenge.

In addition, different sets of time series data are often highly correlated. In the real world, complex events arise from multiple factors. For example, air temperature, pressure, and humidity strongly interact to drive the weather. To predict a hurricane, you must know how these variables influenced each other in the past to understand how the future could play out. Computations and cross-channel correlations could quickly become overwhelming as the number of variables increases, especially over a long historical record.
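A rough sketch of that cross-channel coupling is below: a small vector autoregression, fit by least squares, estimates how three weather-like channels influence each other. The synthetic data, coupling matrix, and lag length are assumptions for illustration, not a hurricane model.

```python
import numpy as np

# Illustrative multichannel series: temperature, pressure, humidity (synthetic, coupled)
rng = np.random.default_rng(1)
n, channels, lag = 500, 3, 2
A = np.array([[0.6, 0.2, 0.1],   # each channel depends on all three at the previous step
              [0.1, 0.7, 0.1],
              [0.2, 0.1, 0.6]])
data = np.zeros((n, channels))
for i in range(1, n):
    data[i] = data[i - 1] @ A.T + rng.normal(0, 0.5, channels)

# VAR(2): stack the previous `lag` steps of every channel as features for every channel
X = np.hstack([data[lag - k - 1:n - k - 1] for k in range(lag)])
Y = data[lag:]
coef, *_ = np.linalg.lstsq(X, Y, rcond=None)   # shape: (channels * lag, channels)

# One-step-ahead forecast for all three channels from the latest `lag` observations
last = np.hstack([data[n - k - 1] for k in range(lag)])
print("next step [temp, pressure, humidity]:", np.round(last @ coef, 2))
```

Even in this toy version the number of coefficients grows with the square of the channel count times the lag length, which is why the computations balloon so quickly as more variables and longer histories are added.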

The farther back you go, the more complex these calculations get, especially if your target variable is influenced by other factors. Home heater sales, for example, may be tied to quirky weather or the economy. The more interacting variables in any time series data set, the harder it can be to isolate the signal that foreshadows the future.

Breaking barriers in time series forecasting

AI foundation models designed for time series forecasting can be difficult to build. The sheer scale and complexity of multi-channel data sources, coupled with external variables, pose significant architectural challenges and non-trivial computational demands, making it difficult to train and update models in a timely manner while preserving accuracy over the desired forecasting window. Today, many foundation models also struggle to adapt to rapidly evolving data patterns, a process known as “temporal adaptation”. Time series foundation models such as MOIRAI, TimesFM, and Chronos are built on hundreds of millions of parameters, demanding significant computational resources and runtime.

The next wave of innovation

Researchers and practitioners are working on new ways to overcome these obstacles and unlock the full potential of AI in time series forecasting. Can smaller models, pre-trained purely on limited but diverse public time series datasets, deliver better forecasting accuracy? It turns out that the answer is yes.

Experimentation with “tiny” foundation models, with far fewer than 1B parameters, is now well underway. Smaller models for time series forecasting (1M to 3M parameters) can offer significant computational efficiency while still achieving state-of-the-art results in zero/few-shot forecasting, in which models generate forecasts for datasets they have never seen. They can also support cross-channel and external variables, critical features that existing popular methods lack.
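The zero-shot workflow these tiny models enable looks roughly like the sketch below: load a small pretrained checkpoint and ask it for a forecast on a series it has never seen. The `TinyForecaster` class, the checkpoint name, and the method signatures are hypothetical placeholders for illustration, not a real library API; the forward pass is stubbed with a seasonal-naive baseline so the example runs.

```python
import numpy as np

# Hypothetical wrapper around a small (1M-3M parameter) pretrained time series model.
# Class name, checkpoint name, and methods are placeholders, not a real library API.
class TinyForecaster:
    @classmethod
    def from_pretrained(cls, checkpoint: str) -> "TinyForecaster":
        # In practice this would load pretrained weights; here it only records the name.
        model = cls()
        model.checkpoint = checkpoint
        return model

    def predict(self, context: np.ndarray, horizon: int) -> np.ndarray:
        # Stand-in for the real forward pass: a seasonal-naive baseline so the sketch runs.
        season = min(12, len(context))
        reps = int(np.ceil(horizon / season))
        return np.tile(context[-season:], reps)[:horizon]

# Zero-shot use: the model has never seen this electricity-demand-like series before.
unseen_series = np.array([310, 295, 280, 300, 330, 360, 390, 410, 400, 370, 340, 320],
                         dtype=float)
model = TinyForecaster.from_pretrained("tiny-ts-1m")   # hypothetical checkpoint name
forecast = model.predict(unseen_series, horizon=6)
print("zero-shot 6-step forecast:", forecast)
```

The point of the pattern is that no task-specific training happens at inference time: the pretrained model is handed an unseen context window and returns a forecast directly.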

These fast and tiny general pre-trained AI models can be put to work quickly on use cases such as predicting electricity demand. They are also flexible enough to be extended to other time series tasks beyond forecasting. In anomaly detection, for instance, these tiny models can be trained on datasets that include both anomalous and regular patterns, allowing them to learn the characteristics of anomalies and detect deviations from normal behavior.
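The anomaly-detection extension can be sketched in the same spirit: predict what “normal” looks like, then flag readings that deviate too far from the prediction. The rolling-mean predictor and the 3-sigma threshold below are illustrative choices standing in for the models described above, and the heart-rate-like data is synthetic.

```python
import numpy as np

# Illustrative heart-rate-like readings with a spike injected at index 40
rng = np.random.default_rng(2)
readings = 70 + rng.normal(0, 2, 60)
readings[40] = 115  # injected anomaly

# Predict each point from a rolling mean of the previous `window` points,
# then flag points whose residual exceeds 3 standard deviations of the residuals.
window = 10
preds = np.array([readings[i - window:i].mean() for i in range(window, len(readings))])
residuals = readings[window:] - preds
threshold = 3 * residuals.std()
anomalies = np.where(np.abs(residuals) > threshold)[0] + window

print("flagged indices:", anomalies)  # expected to include index 40
```

A foundation model would replace the rolling mean with a learned forecaster, but the deployment logic, comparing observed behavior against predicted behavior, stays the same.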

Increasingly, we’re seeing that these small models, combined with enterprise data, can make a big impact, offering task-specific performance that rivals large models at a fraction of the cost. They are poised to become the ‘workhorses’ of enterprise AI.

In the next few years, AI is expected to help drive a radical transformation of the business landscape. While most of the world’s public data already feeds current models, the vast majority of enterprise data remains untapped. Small, fast foundation models, with their flexibility, low development costs, and wide-ranging applications, are poised to play a significant role in this shift.


This article was produced as part of TechRadarPro’s Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing, find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro

