|
|
|
|
|
|
|
|
|
Let's take our dataset and notebook as an example. We have time vs. load data, coupled with temperature. We decided to use a sliding window of 4 to create median and std features for the load, and we pass the temperature as an additional feature. Here T can be the current temperature, the mean of the last few temperatures, or, perhaps even better, the difference between the current and past temperatures (which tells us whether it is increasing or decreasing). In the end, we decided to use 4 features.
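
To make this concrete, here is a minimal sketch of that feature engineering with pandas. The column names (`load`, `temp`), the file name, and the exact choice of the four features (the load itself, its rolling median and std over a window of 4, and the temperature difference) are my assumptions for illustration, not taken verbatim from the notebook.

```python
import pandas as pd

# Hypothetical hourly data with 'load' and 'temp' columns indexed by time.
df = pd.read_csv("load_data.csv", parse_dates=["time"], index_col="time")

window = 4  # sliding window used for the load statistics

features = pd.DataFrame(index=df.index)
features["load"] = df["load"]
features["load_median"] = df["load"].rolling(window).median()
features["load_std"] = df["load"].rolling(window).std()
# Temperature trend: difference between current and previous temperature
# (tells the model whether T is increasing or decreasing).
features["temp_diff"] = df["temp"].diff()

# Drop the rows at the start where the rolling window is not yet full.
features = features.dropna()
```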
|
|
|
|
|
|
|
|
|
|
Next, we need to decide how much of the past is relevant for making predictions about the future. To make it easier to plot, I will take it as 3 here (in the notebook, it was 24). The data representation for one training instance then becomes (1, 3, 4), which is the (batch, timesteps, features) format TensorFlow expects. At this point, I decided to make predictions one hour ahead, so I can use a many-to-one approach (3 past observations of 4 features give one prediction).
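
The windowing itself can be sketched with NumPy as below; the lookback of 3 and the one-hour-ahead, many-to-one target follow the text, while the function and variable names are illustrative.

```python
import numpy as np

def make_windows(feature_array, target_array, lookback=3):
    """Turn a (time, n_features) array into (samples, lookback, n_features)
    inputs and one-step-ahead targets (many-to-one)."""
    X, y = [], []
    for t in range(lookback, len(feature_array)):
        X.append(feature_array[t - lookback:t])  # 3 past observations
        y.append(target_array[t])                # the next hour's load
    return np.array(X), np.array(y)

# features: (n_hours, 4) array of inputs, load: (n_hours,) array of targets.
# X, y = make_windows(features.values, features["load"].values, lookback=3)
# X.shape -> (n_samples, 3, 4), i.e. one training instance is (1, 3, 4)
```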
|
|
|
|
|
|
|
|
|
|
The next task is to decide on the model graph. In this example, I will use moderate numbers so that the graph can be drawn in detail. I decided to use 2 RNN layers, followed by 2 MLP layers. The RNN layers have 6 nodes each, while the MLP layers have 3 and 1 nodes, respectively. Note that these are custom selections, except for the last MLP layer: since this is a regression task, I need one linear node at the end to give me continuous values.
|
|
|
|
|
|
|
|
|
This is what the custom model looks like:
|
|
|
|
|
|
|
|
<div align="center">
|
|
|
|
<img src="uploads/90c5d0580c35b1076a042d4d533af472/rnn_1.png" width="400">
|
|
|
|
</div>
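
A minimal Keras sketch of the graph in the figure might look as follows. I am assuming plain `SimpleRNN` cells, a ReLU activation on the 3-node dense layer, and an MSE loss; the notebook may use different cell types (LSTM or GRU drop in the same way) or activations.

```python
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    # 2 RNN layers with 6 nodes each; the first returns the full sequence
    # so the second RNN layer can consume it.
    layers.SimpleRNN(6, return_sequences=True, input_shape=(3, 4)),
    layers.SimpleRNN(6),
    # 2 MLP (dense) layers: 3 nodes, then a single linear node
    # for the regression output.
    layers.Dense(3, activation="relu"),
    layers.Dense(1),  # linear activation -> continuous prediction
])

model.compile(optimizer="adam", loss="mse")
model.summary()
```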
|
|
|