So far we have talked about a mathematical model for learning time dependencies.

**One to many**
We can use even a single recurrent node to build a simple memory representation for a given sample X. We feed the same data to the same node repeatedly, each time updating the hidden state h, and the model produces multiple outputs from the successively updated states. In this scenario, we can use `RepeatVector` (in Keras) to duplicate the data X so that it can be fed into generic RNN code.
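As a minimal NumPy sketch of this one-to-many pattern (all dimensions and weight values below are toy assumptions, not from the text), the loop feeds the same X at every step while only the hidden state h evolves. Keras' `RepeatVector` performs the same duplication of X when using its RNN layers instead:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy sizes: 4 input features, hidden size 3, 5 output steps.
n_in, n_hidden, n_steps = 4, 3, 5

# Shared weight matrices, reused at every time step.
W_xh = rng.normal(size=(n_hidden, n_in))
W_hh = rng.normal(size=(n_hidden, n_hidden))
W_hy = rng.normal(size=(1, n_hidden))

x = rng.normal(size=n_in)    # a single sample X, fed again at every step
h = np.zeros(n_hidden)       # hidden state, updated each iteration
outputs = []
for _ in range(n_steps):
    # Same X every time; only h carries information between steps.
    h = np.tanh(W_xh @ x + W_hh @ h)
    outputs.append((W_hy @ h).item())

print(len(outputs))  # one output per step: 5
```

Each of the 5 outputs differs only because h has been updated between steps, which is exactly the "memory" the paragraph describes.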
Alternatively, we can feed the output y back into the next time step, replacing X with y. Note that here the dimensions of X and y need to match so that the same weight matrices can be reused at every step. A typical example of this case is the generative use of an RNN, such as composing music.
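The feedback variant can be sketched the same way (again with made-up toy dimensions). The only changes from the previous pattern are that the output matrix maps h back to the input dimension d, so x and y have matching shapes, and y replaces x for the next step:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical toy sizes: input and output share dimension d so weights are reusable.
d, n_hidden, n_steps = 4, 3, 5

W_xh = rng.normal(size=(n_hidden, d))
W_hh = rng.normal(size=(n_hidden, n_hidden))
W_hy = rng.normal(size=(d, n_hidden))  # output dimension equals input dimension d

x = rng.normal(size=d)   # seed input X
h = np.zeros(n_hidden)
generated = []
for _ in range(n_steps):
    h = np.tanh(W_xh @ x + W_hh @ h)
    y = W_hy @ h         # y has the same shape as x ...
    generated.append(y)
    x = y                # ... so it can replace x at the next step
```

With trained weights, seeding x with the start of a melody and running this loop is the basic mechanism behind generative uses such as music composition.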
<div align="center">
<img src="uploads/a1cad34e93b000f487503fa18738e6ab/rnn_1.png" width="600">
</div>

**Many to many**
...
### Multilayered RNN Case study: Load demand forecasting