or, we can also use another activation function here:

```math
y_t = ReLU(W^{yh}_{1\times 6}h_t)
```
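As a minimal NumPy sketch of one step of this cell (the hidden size of 6 and the 4 input features are assumptions read off the weight dimensions used in this section):

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed sizes for illustration: 4 input features, hidden size 6, 1 output
W_xh = rng.normal(size=(6, 4))   # input-to-hidden weights
W_hh = rng.normal(size=(6, 6))   # hidden-to-hidden weights
W_yh = rng.normal(size=(1, 6))   # hidden-to-output weights

def step(x_t, h_prev):
    """One recurrent step: tanh hidden update, ReLU output activation."""
    h_t = np.tanh(W_xh @ x_t + W_hh @ h_prev)
    y_t = np.maximum(0.0, W_yh @ h_t)   # y_t = ReLU(W^{yh} h_t)
    return h_t, y_t

h0 = np.zeros(6)
x1 = rng.normal(size=4)
h1, y1 = step(x1, h0)
print(h1.shape, y1.shape)  # (6,) (1,)
```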
Let's look at a simpler case where the second activation function is linear. Imagine that we have an input X of shape (1,3,4), where the second dimension is time. If we apply our model at every time step, we get the following:

```math
y_1 = W^{yh}_{1\times 6} tanh(W^{xh}_{6\times 4}X_1 + W^{hh}_{6\times 6}h_{0})
\dots
```
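To make the unrolling concrete, here is a small NumPy sketch that applies the same cell to every time step of an input of shape (1, 3, 4) with a linear output, as in the simpler case above (the weight shapes 6×4, 6×6 and 1×6 are assumptions taken from the subscripts):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1, 3, 4))    # (batch, time, features)

W_xh = rng.normal(size=(6, 4))
W_hh = rng.normal(size=(6, 6))
W_yh = rng.normal(size=(1, 6))

h = np.zeros(6)                   # h_0
outputs = []
for t in range(X.shape[1]):       # unroll over the 3 time steps
    x_t = X[0, t]                 # (4,)
    h = np.tanh(W_xh @ x_t + W_hh @ h)   # h_t
    outputs.append(W_yh @ h)      # y_t = W^{yh} h_t (linear output)

print(np.stack(outputs).shape)    # (3, 1): one output per time step
```

The same loop with `np.maximum(0, W_yh @ h)` would reproduce the ReLU variant shown earlier.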
Recurrent cells can also be used to perform sequence to sequence analysis, also known as many to many analysis.

<div align="center">
<img src="uploads/d229e9dd599633916fbaabdd06e61e76/rnn_3.png" width="600">
</div>
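As an illustrative sketch (not taken from the original text), a many to many model can be built in Keras by asking the recurrent layer to return its output at every time step; the sequence length of 24, 4 input features, and layer sizes are arbitrary assumptions:

```python
import tensorflow as tf

# Hypothetical sizes: sequences of 24 steps with 4 features each (illustration only)
inputs = tf.keras.Input(shape=(24, 4))
# return_sequences=True -> one hidden state (and hence one output) per time step
h = tf.keras.layers.SimpleRNN(6, activation="tanh", return_sequences=True)(inputs)
outputs = tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(1))(h)

model = tf.keras.Model(inputs, outputs)
model.summary()  # output shape: (None, 24, 1) -- a prediction for every input step
```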
**Many to one**

We may also be interested in predicting a single output (label) given a sequence of data. For example, we can pass in the last 24 hours of load demand and expect the model to predict the next demand value.

<div align="center">
<img src="uploads/1c0c6bbb21d5a1237a52b9369fc4e9bf/rnn_4.png" width="600">
</div>
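A minimal Keras sketch of this many to one setup, assuming a univariate series where the previous 24 hourly demand values are used to predict the next one (the layer sizes are arbitrary and only for illustration):

```python
import tensorflow as tf

# 24 past hourly demand values, one feature per step (assumed setup, for illustration)
inputs = tf.keras.Input(shape=(24, 1))
# return_sequences=False (default): only the last hidden state is kept
h = tf.keras.layers.SimpleRNN(16, activation="tanh")(inputs)
outputs = tf.keras.layers.Dense(1)(h)   # single predicted demand value

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="mse")
model.summary()  # output shape: (None, 1) -- one prediction per input sequence
```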
...
### Multilayered RNN Case study: Load demand forecasting