Neural Network Regression – Considering a dynamic state

I am using TensorFlow to solve a regression problem with known dynamic components: we know that the (singular) label at time t depends on some dynamic state of the environment, but this state is not available as a feature. An initial attempt to solve the problem via plain regression has, understandably, failed, confirming our assumption that the output is influenced by a dynamic quantity we have no access to.

However, the state of the environment at time t should be reflected somewhere in the features and labels (and in particular their interplay) known at times t0-n, where n > 0. Unfortunately, because of the nature of the problem, the output at time t depends heavily on the input at time t, roughly as much as it depends on the dynamic state of the environment. I am worried that this renders the approach I wanted to try ineffective: time series forecasting, as I understand it, considers features from previous timesteps but not the input at the current timestep. Additionally, I know the labels from previous timesteps, but not at the time at which I want to make my prediction.

Here is a table to illustrate the problem:

t       input        output
0       x(t=0)       y(t=0)
...     ...          ...
t0-1    x(t=t0-1)    y(t=t0-1)
t0      x(t=t0)      y(t=t0) = ?
  • How can I use all the information at my disposal to predict the value of y(t=t0), using x(t=t0) (where x is the array of input features) and a defined window of features and labels at previous timesteps?
  • Is there an established method for solving a problem like this, either using a neural net or perhaps even a different model?
  • Does this problem require a combination of methods, and if so, which ones might be suitable for tackling it?

The final model is meant to be deployed and to keep working for future time windows as well. We know the relevant time window to reach roughly 100 time steps into the past.
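For concreteness, here is a minimal sketch of how the training samples could be windowed for this setup. The window length of 100 comes from the estimate above; the array shapes, the helper name make_samples, and the choice to append the past labels as an extra feature column are illustrative assumptions, not a prescribed method.

```python
import numpy as np

WINDOW = 100  # assumed window length, per the ~100-step estimate above


def make_samples(X, y, window=WINDOW):
    """Build samples from a feature matrix X of shape (T, n_features)
    and a label vector y of shape (T,).

    Each sample consists of:
      - hist:   the previous `window` timesteps of [features, label] pairs
      - cur:    the features at the current timestep t
      - target: the label at the current timestep t
    """
    hist, cur, target = [], [], []
    for t in range(window, len(X)):
        # Stack past features and past labels side by side: (window, n_features + 1)
        past = np.concatenate([X[t - window:t], y[t - window:t, None]], axis=1)
        hist.append(past)
        cur.append(X[t])
        target.append(y[t])
    return np.stack(hist), np.stack(cur), np.array(target)


# Example usage with placeholder data:
# X = np.random.rand(1000, 8); y = np.random.rand(1000)
# hist, cur, target = make_samples(X, y)
# hist.shape -> (900, 100, 9), cur.shape -> (900, 8), target.shape -> (900,)
```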

Answer

The kind of problem I have described is, as I have since learned, linked to so-called exogenous variables. In my case, I require something called NNARX (a neural-network autoregressive model with exogenous inputs), which is similar to the ARMAX model at its core but, being a neural net, can also capture non-linearity.

The general idea is to introduce an LSTM layer that acts as an encoder for the historical inputs, and to combine its output with a second input layer carrying the exogenous variables, i.e. the features at the current timestep. Both are joined in the so-called decoder, the rest of the NN architecture.
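A minimal sketch of such an architecture with the Keras functional API is shown below. The layer sizes, the WINDOW and N_FEATURES constants, and the layer names are assumptions chosen for illustration; the history input is expected in the (samples, WINDOW, N_FEATURES + 1) layout produced by the windowing sketch above.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

WINDOW = 100      # assumed history length
N_FEATURES = 8    # assumed number of input features; adjust to your data

# Encoder: an LSTM summarises the historical window of features + past labels.
hist_in = layers.Input(shape=(WINDOW, N_FEATURES + 1), name="history")
encoded = layers.LSTM(64, name="encoder")(hist_in)

# Exogenous input: the feature vector at the current timestep t0.
cur_in = layers.Input(shape=(N_FEATURES,), name="current_features")

# Decoder: the rest of the network combines both sources and regresses y(t0).
x = layers.Concatenate()([encoded, cur_in])
x = layers.Dense(64, activation="relu")(x)
x = layers.Dense(32, activation="relu")(x)
out = layers.Dense(1, name="y_t0")(x)

model = Model(inputs=[hist_in, cur_in], outputs=out)
model.compile(optimizer="adam", loss="mse")
# model.fit([hist, cur], target, epochs=..., validation_split=0.2)
```

At prediction time the model is fed the last 100 timesteps of features and labels together with x(t=t0), which is exactly the combination of information asked about in the question.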
