20

I am trying to use an LSTM for time series prediction. The data streams in once per minute, but I would like to predict an hour ahead. There are two ways I can think of for going about this:

  1. Squash the data into hourly data instead, taking the average over each 60-minute period as one data point.
  2. For each (X, y) training data pair, let X be the time series from t - 120 to t - 60, and let y be the time series from t - 60 to t. Force the LSTM to predict 60 timesteps ahead, and take y[-1] as the prediction (a rough sketch of this windowing is below).
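For concreteness, here is a rough sketch of the windowing I mean in option 2 (the helper name make_windows and the toy sine series are only illustrative):

import numpy as np

def make_windows(series, n_in=60, n_out=60):
    # X holds the previous n_in minutes, y the next n_out minutes,
    # so y[-1] of each pair is the value one hour ahead
    X, y = [], []
    for t in range(n_in, len(series) - n_out + 1):
        X.append(series[t - n_in:t])
        y.append(series[t:t + n_out])
    return np.array(X), np.array(y)

# toy minute-level signal standing in for the real stream
series = np.sin(np.linspace(0, 20, 2000))
X, y = make_windows(series)
print(X.shape, y.shape)  # (1881, 60) (1881, 60)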

Are there any best practices for going about this?

Edward Yu
  • 321
  • What is LSTM? Least squares time series model maybe? – Michael R. Chernick Mar 04 '17 at 19:55
  • 2
    Do you need 60 predictions, or just the last one? If you just need the last one, just feed in the value at t+60 as y to train. I don't think it's critical (for an LSTM) that the value you are predicting is the very next one sequentially. So if you want predictions further out in time, just train it that way (a minimal sketch of this follows the comments). – photox Mar 05 '17 at 21:36
  • Use multi-step forecasting with the per-minute data you have, with an appropriate lag value. – SATYAJIT MAITRA May 14 '19 at 01:31
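For what it's worth, a minimal sketch of the single-target variant photox describes, where y is just the value 60 minutes ahead of each input window (the sine series and the names are only illustrative):

import numpy as np

# stand-in for the minute-level stream
series = np.sin(np.linspace(0, 20, 2000))
n_in, horizon = 60, 60
# each input window is the last n_in minutes; the target is the single
# value `horizon` minutes after the end of that window
X = np.array([series[t - n_in:t] for t in range(n_in, len(series) - horizon)])
y = np.array([series[t + horizon - 1] for t in range(n_in, len(series) - horizon)])
print(X.shape, y.shape)  # (1880, 60) (1880,)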

2 Answers

12

There are different approaches (a short sketch contrasting the first three follows the list):

  • Recursive strategy

    • one many-to-one model

      prediction(t+1) = model(obs(t-1), obs(t-2), ..., obs(t-n))
      prediction(t+2) = model(prediction(t+1), obs(t-1), ..., obs(t-n)) 
      
  • Direct strategy

    • multiple many-to-one models

      prediction(t+1) = model1(obs(t-1), obs(t-2), ..., obs(t-n))
      prediction(t+2) = model2(obs(t-2), obs(t-3), ..., obs(t-n))
      
  • Multiple output strategy

    • one many-to-many model

      prediction(t+1), prediction(t+2) = model(obs(t-1), obs(t-2), ..., obs(t-n))
      
  • Hybrid strategy

    • combine two or more of the above strategies
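To make the first three strategies concrete, here is a small sketch; a plain scikit-learn linear regression stands in for the LSTM purely to keep it short, and all names and values are illustrative:

import numpy as np
from sklearn.linear_model import LinearRegression

# toy series and a window builder; n_ahead is the forecast horizon
series = np.sin(np.linspace(0, 20, 500))
n_lag, n_ahead = 10, 3

def windows(series, n_in, n_out):
    X, y = [], []
    for t in range(n_in, len(series) - n_out + 1):
        X.append(series[t - n_in:t])
        y.append(series[t:t + n_out])
    return np.array(X), np.array(y)

X, Y = windows(series, n_lag, n_ahead)

# Recursive strategy: one one-step model, its own predictions fed back in
one_step = LinearRegression().fit(X, Y[:, 0])
history = list(series[-n_lag:])
recursive = []
for _ in range(n_ahead):
    nxt = one_step.predict(np.array(history[-n_lag:]).reshape(1, -1))[0]
    recursive.append(nxt)
    history.append(nxt)

# Direct strategy: a separate model per forecast horizon
direct = [LinearRegression().fit(X, Y[:, h]).predict(series[-n_lag:].reshape(1, -1))[0]
          for h in range(n_ahead)]

# Multiple output strategy: one model emits the whole horizon at once
multi = LinearRegression().fit(X, Y).predict(series[-n_lag:].reshape(1, -1))[0]

print(recursive, direct, multi)

The trade-off is the usual one: the recursive strategy compounds its own prediction errors over the horizon, the direct strategy needs one model per step, and the multiple output strategy needs a model that can emit the whole horizon at once (which an LSTM with a 60-unit Dense head, as in the other answer, can do).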

Reference: Multi-Step Time Series Forecasting

mingxue
  • 221
3

From https://machinelearningmastery.com/multi-step-time-series-forecasting-long-short-term-memory-networks-python/

Each training row holds the lagged inputs followed by the forecast targets:

train = [[t-120, t-119, ..., t, t+1, ..., t+60], [...], [...]]

from keras.models import Sequential
from keras.layers import Dense, LSTM

# fit an LSTM network to training data
def fit_lstm(train, n_lag, n_seq, n_batch, nb_epoch, n_neurons):
    # split each row into n_lag inputs and n_seq targets
    X, y = train[:, 0:n_lag], train[:, n_lag:]
    # reshape inputs into [samples, timesteps, features]: here the whole
    # lag window is treated as n_lag features of a single timestep
    X = X.reshape(X.shape[0], 1, X.shape[1])
    # design network: a stateful LSTM needs a fixed batch size
    model = Sequential()
    model.add(LSTM(n_neurons, batch_input_shape=(n_batch, X.shape[1], X.shape[2]), stateful=True))
    model.add(Dense(y.shape[1]))
    model.compile(loss='mean_squared_error', optimizer='adam')
    # fit network one epoch at a time, resetting the LSTM state in between
    for i in range(nb_epoch):
        model.fit(X, y, epochs=1, batch_size=n_batch, verbose=0, shuffle=False)
        model.reset_states()
    return model
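A hypothetical way of calling this, with a dummy train array built from a sine wave and illustrative parameter values (not taken from the tutorial):

import numpy as np

# each row: n_lag past minutes followed by n_seq future minutes (dummy data)
n_lag, n_seq, n_batch = 60, 60, 1
series = np.sin(np.linspace(0, 50, 2000))
train = np.array([series[i:i + n_lag + n_seq]
                  for i in range(len(series) - n_lag - n_seq + 1)])

model = fit_lstm(train, n_lag, n_seq, n_batch, nb_epoch=10, n_neurons=10)

# forecast the next n_seq minutes from the most recent n_lag observations
X_last = train[-1, :n_lag].reshape(1, 1, n_lag)
forecast = model.predict(X_last, batch_size=n_batch)[0]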
user4446237
  • 177
  • I am not sure what you are aiming for. Either you answer the question or you ask a new question. You might just delete your last sentence. – Ferdi Sep 01 '17 at 14:43
  • removed "Did you ever find an approach that worked for you?", but yes, I am very curious about how this guy proceeded since I am working on a very similar task – user4446237 Sep 01 '17 at 14:44
  • Thx for your edit. If you are super interested, you can start a bounty as soon as you have 75 rep. I do not know any other way to draw interest to the question. – Ferdi Sep 01 '17 at 14:50