|
|
We may also be interested in predicting a single output (label) from a sequence of data. For example, we can pass the last 24 hours of load demand and expect the model to predict the demand for the next hour.
|
|
|
|
|
|
|
|
|
|
<div align="center">

<img src="uploads/1c0c6bbb21d5a1237a52b9369fc4e9bf/rnn_4.png" width="400">

</div>
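
As a minimal sketch of this many-to-one setup, assuming a Keras-style workflow with hourly load values, the model could look roughly like the following. The cell type and layer size are placeholders for illustration, not the settings used later in the notebook:

```python
import tensorflow as tf

# Many-to-one: the last 24 hourly load values go in, a single prediction comes out.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(24, 1)),   # 24 time steps, 1 feature (the load itself)
    tf.keras.layers.SimpleRNN(32),   # returns only the final hidden state
    tf.keras.layers.Dense(1),        # the next expected demand
])
model.compile(optimizer="adam", loss="mse")
```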
|
|
|
|
|
|
|
|
|
|
|
|
|
### Multilayered RNN Case study: Load demand forecasting
|
|
|
|
|
|
|
|
|
|
One computational node will not be sufficiently complex for almost all cases, so we need to increase the model complexity by adding more nodes and layers, giving us a Recurrent Neural Network. As with node-wise model architectures, we need to think about which graph design best suits our objective (one-to-many, many-to-many, etc.), as sketched below.
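
To make the graph-design choice concrete, here is a rough Keras sketch (the cell type and layer widths are illustrative assumptions, not the notebook's actual settings). Stacking recurrent layers means every layer except the last must return its full sequence, and the shape of the final output decides whether the network is many-to-one or many-to-many:

```python
import tensorflow as tf

timesteps, features = 3, 4  # placeholder dimensions

# Many-to-one: intermediate layers return sequences, the last one only its final state.
many_to_one = tf.keras.Sequential([
    tf.keras.Input(shape=(timesteps, features)),
    tf.keras.layers.LSTM(64, return_sequences=True),  # feeds a full sequence forward
    tf.keras.layers.LSTM(32),                          # keeps only the last hidden state
    tf.keras.layers.Dense(1),                          # one prediction per window
])

# Many-to-many: keep return_sequences=True everywhere and apply the output layer per step.
many_to_many = tf.keras.Sequential([
    tf.keras.Input(shape=(timesteps, features)),
    tf.keras.layers.LSTM(64, return_sequences=True),
    tf.keras.layers.LSTM(32, return_sequences=True),
    tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(1)),  # one output per time step
])
```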
|
|
|
|
|
|
|
|
Let's take our dataset and notebook as an example. We have time vs. load data, coupled with temperature. We decided to use a sliding window of 4 to create median and standard deviation features for the load, and we pass the temperature as an additional feature. Here T can be the current temperature, the mean of the last few temperatures, or, perhaps even better, the difference between the current and past temperatures (information on whether it is increasing or decreasing). In the end, we have 4 features.
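
One possible way to build these 4 features with pandas is sketched below. The column names, and the choice of the raw load as the fourth feature, are assumptions for illustration; the notebook may combine them differently:

```python
import pandas as pd

def build_features(df: pd.DataFrame) -> pd.DataFrame:
    """Sliding-window load statistics plus a temperature feature (illustrative)."""
    out = pd.DataFrame(index=df.index)
    out["load"] = df["load"]                                    # raw load (assumed 4th feature)
    out["load_median"] = df["load"].rolling(window=4).median()  # sliding window of 4
    out["load_std"] = df["load"].rolling(window=4).std()
    out["temp_delta"] = df["temperature"].diff()                # increasing or decreasing?
    return out.dropna()  # the first rows have incomplete windows
```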
|
|
|
|
|
|
|
|
Next, we need to decide how much of the past is relevant for making predictions about the future. To make it easier to plot, I will take it as 3 here, so our data representation becomes (1, 3, 4) for one training instance. At this point, I decided to make predictions one hour ahead, so I can use a many-to-one approach (3 past observations of 4 features give one prediction).
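
To turn such a feature table into the (samples, 3, 4) arrays a recurrent layer expects, a small windowing helper like the hypothetical one below can be used; the one-step-ahead target reflects the one-hour-ahead decision above:

```python
import numpy as np

def make_windows(features: np.ndarray, target: np.ndarray, lookback: int = 3):
    """Slice a (time, n_features) array into (samples, lookback, n_features) windows,
    each paired with the target one step (one hour) after the window: many-to-one."""
    X, y = [], []
    for t in range(lookback, len(features)):
        X.append(features[t - lookback:t])  # 3 past observations of 4 features
        y.append(target[t])                 # the next hour's load demand
    return np.array(X), np.array(y)

# Example with dummy data: 100 hours of 4 engineered features and the load itself.
feats = np.random.rand(100, 4)
load = np.random.rand(100)
X, y = make_windows(feats, load, lookback=3)
print(X.shape, y.shape)  # (97, 3, 4) (97,)
```

A single training instance taken from `X` then has exactly the (1, 3, 4) shape described above.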
|
|
|
|
|
|
|
|