Grey-box models add some obfuscation to the system dynamics. While the structure of the governing equations is known, the values of their parameters are not.

An example is a simple RC circuit with unknown capacitance and resistance. The equations governing the current through the components are known, so a physical model can be constructed from them. The model, combined with actual measurements from the circuit, then estimates the parameter values in the equations such that the difference between the model's predictions and the measured readings is minimized.
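As a minimal sketch of such a grey-box fit (component values and noise level are hypothetical, and log-linear least squares stands in for whatever estimator one might actually use): the charging current of an RC circuit driven by a voltage step obeys $i(t) = (V_s/R)\,e^{-t/RC}$, so taking logarithms turns parameter estimation into a straight-line fit.

```python
import numpy as np

# Known model structure: i(t) = (V_s / R) * exp(-t / (R*C)).
# Unknowns R and C are recovered from noisy current measurements.
rng = np.random.default_rng(0)
V_s, R_true, C_true = 5.0, 1000.0, 1e-6           # 5 V step, 1 kOhm, 1 uF (assumed)
t = np.linspace(0.0, 5e-3, 200)                   # 5 ms of samples
i_true = (V_s / R_true) * np.exp(-t / (R_true * C_true))
i_meas = i_true * (1 + 0.01 * rng.standard_normal(t.size))  # 1% multiplicative noise

# ln i = ln(V_s/R) - t/(R*C): linear in t, so ordinary least squares
# recovers the intercept (giving R) and the slope (giving the time constant).
slope, intercept = np.polyfit(t, np.log(i_meas), 1)
R_est = V_s / np.exp(intercept)
C_est = -1.0 / (slope * R_est)
print(R_est, C_est)
```

With this noise level the estimates land within a fraction of a percent of the true values; the point is that only the two physical parameters are learned, not the model structure.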

### State-space equations as grey-boxes

## Black-box models

A black-box model foregoes any *a priori* knowledge about the distribution of system parameters. Instead, it learns the mechanics from scratch. A neural network used to approximate an RC circuit is a black-box model: the network simply learns the mapping from inputs to outputs.

Feed-forward networks (FFNs), also known as multi-layer perceptrons (MLPs), map inputs to outputs instantaneously. They are universal function approximators, provided their hidden layers can be made arbitrarily wide.

However, FFNs have no 'memory': their past inputs do not affect their current outputs. Any relevant history must be encoded into a single input (e.g. by concatenating previous and current state variables into one longer vector).
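The concatenation trick above can be sketched as a sliding-window feature matrix (function name and window length are illustrative):

```python
import numpy as np

def window_stack(u, k):
    """Stack the k most recent samples of a signal into one row per time step,
    giving a memoryless FFN access to recent history."""
    return np.stack([u[i:i + k] for i in range(len(u) - k + 1)])

u = np.arange(6.0)            # a toy signal: [0, 1, 2, 3, 4, 5]
X = window_stack(u, 3)        # each row holds 3 consecutive samples
print(X)
# first row is [0, 1, 2], last row is [3, 4, 5]
```

Each row of `X` would then be fed to the FFN as a single input vector.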

## Recurrent networks

Recurrent neural networks (RNNs) are *stateful* models. An LSTM (long short-term memory) network, a variant of RNN, can retain information from earlier time steps so that it influences later outputs. This is important when modelling dynamic systems where the future states of the system depend on the rate of change of states or other higher-order phenomena.

An LSTM network has two architectural parameters:

- **Cells**: a cell is the LSTM equivalent of a layer. A cell takes in sequential data and outputs a transformed sequence; cells can be stacked to form a multi-layer network.

- **Hidden units**: each cell has memory, carried between time steps in its state vectors. The number of hidden units is the size of those vectors; a cell with 10 hidden units carries a 10-dimensional summary of the sequence seen so far.
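One time step of a single LSTM cell can be written out in plain NumPy (the standard gate formulation; weights here are random and untrained, so this is a shape sketch, not a usable model):

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step. x: input (n_in,); h, c: states (n_hid,).
    W: (4*n_hid, n_in), U: (4*n_hid, n_hid), b: (4*n_hid,)."""
    z = W @ x + U @ h + b
    n = h.size
    i = 1.0 / (1.0 + np.exp(-z[:n]))        # input gate
    f = 1.0 / (1.0 + np.exp(-z[n:2*n]))     # forget gate
    o = 1.0 / (1.0 + np.exp(-z[2*n:3*n]))   # output gate
    g = np.tanh(z[3*n:])                    # candidate cell update
    c_new = f * c + i * g                   # mix old memory with new input
    h_new = o * np.tanh(c_new)              # hidden state passed onward
    return h_new, c_new

rng = np.random.default_rng(0)
n_in, n_hid = 1, 10                         # 10 hidden units
W = rng.normal(size=(4 * n_hid, n_in))
U = rng.normal(size=(4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
for x_t in [0.0, 1.0, 0.5]:                 # a short input sequence
    h, c = lstm_step(np.array([x_t]), h, c, W, U, b)
print(h.shape)                              # (10,)
```

The state vectors `h` and `c` are what the cell "remembers" from one time step to the next; their length is the hidden-unit count.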

The results show that the model is highly susceptible to noise.

### Black-box model

The black-box model is built from a multi-layer LSTM network. The input to the model is the voltage supplied by the source; the predicted output is the current through the capacitor.
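A model of this shape could be sketched in PyTorch as follows (the class name, hidden size, and layer count are hypothetical, not the configuration actually used):

```python
import torch
from torch import nn

class RCBlackBox(nn.Module):
    """Multi-layer LSTM mapping a source-voltage sequence to a
    capacitor-current sequence, one prediction per time step."""
    def __init__(self, hidden=32, layers=2):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden,
                            num_layers=layers, batch_first=True)
        self.head = nn.Linear(hidden, 1)   # hidden state -> current estimate

    def forward(self, v):                  # v: (batch, time, 1) voltage
        seq, _ = self.lstm(v)              # (batch, time, hidden)
        return self.head(seq)              # (batch, time, 1) current

model = RCBlackBox()
i_pred = model(torch.zeros(4, 50, 1))      # dummy batch of 50-step sequences
print(i_pred.shape)                        # torch.Size([4, 50, 1])
```

Training would then minimize the error between `i_pred` and measured current, with no circuit equations supplied to the network.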