Consider a time-dependent observable obs[t] and a set of n time-dependent parameters par[i,t] with i=1,...,n. The observable lags behind the parameters: the current value of the observable is correlated to some degree with the value of parameter i at a time t0[i] in the past, for each i. I would like a machine learning model that predicts the observable based on the parameter data.
In searching for ways to do the above, I have found the function TimeSeriesForecast, but it only seems to take one input sequence and predict that same sequence. I am also aware of Predict and Classify, but they seem to work purely statistically, without taking continuity in time into account, which means the time-shifted correlation will not be captured.
Question:
Is there a way to use Mathematica to employ past and current data of obs[t] and par[i,t] in a machine learning model, in order to predict obs[t] while taking its time lag into account for a stronger correlation?
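To make the question more concrete, here is a rough sketch of the kind of thing I imagine, feeding windows of past values of the observable and the parameters (named par1, par2, par3 as in the data added below) to Predict; the window size w and the method are arbitrary choices, and I do not know whether this is the right approach:
w = 50; (* number of past time steps used as features *)
features[t_] := Join[obs[[t - w ;; t - 1]], par1[[t - w ;; t]], par2[[t - w ;; t]], par3[[t - w ;; t]]];
training = Table[features[t] -> obs[[t]], {t, w + 1, Length[obs]}];
pf = Predict[training, Method -> "GradientBoostedTrees"];
pf[features[Length[obs]]] (* in-sample sanity check against obs[[-1]] *)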
EDIT:
I was asked to provide a sample set of data. On this gist.github page you can find and copy/paste observable data obs with 700 time steps, as well as parameter data par1, par2, par3 with the same number of time steps:
ListPlot[obs]
{ListPlot[par1],ListPlot[par2],ListPlot[par3]}
As mentioned before, each parameter's data is correlated with the observable data under a respective time shift (the parameters see, to some extent, what will happen with the observable in the future). The goal is to predict as well as possible e.g. the next 300 time steps of the observable (not included in the obs data above, but present to some extent in the parameter data).
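If it helps, my rough idea for checking the time shift directly on this data is to scan shifts and look at the correlation with obs; the shift range 0 to 100 is a guess on my part:
lagCorr[param_, s_] := Correlation[obs[[s + 1 ;;]], param[[;; Length[obs] - s]]];
estLag[param_] := First@MaximalBy[Range[0, 100], Abs[lagCorr[param, #]] &];
estLag /@ {par1, par2, par3} (* estimated lead of each parameter over obs, in time steps *)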