Exploding Gradient
During recurrent neural network training, the gradient is computed by repeatedly multiplying through the same recurrent weights at every time step; when these factors are greater than 1, the repeated multiplication makes the gradient grow exponentially and explode. The exploding gradient problem can be addressed by:
- Gradient clipping (i.e., cap the gradients at a chosen threshold so they cannot grow beyond it; see the sketch below)
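
A minimal sketch of norm-based gradient clipping in PyTorch, using `torch.nn.utils.clip_grad_norm_`. The toy RNN, the tensor shapes, and the `max_norm=1.0` threshold are illustrative assumptions, not values from the text:

```python
import torch
import torch.nn as nn

# Toy RNN and regression head; sizes are arbitrary for illustration.
rnn = nn.RNN(input_size=8, hidden_size=16, batch_first=True)
head = nn.Linear(16, 1)
params = list(rnn.parameters()) + list(head.parameters())
optimizer = torch.optim.SGD(params, lr=0.01)
loss_fn = nn.MSELoss()

# Dummy batch: (batch, seq_len, features) and one target per sequence.
x = torch.randn(4, 50, 8)
y = torch.randn(4, 1)

optimizer.zero_grad()
out, _ = rnn(x)                          # backprop through 50 time steps
loss = loss_fn(head(out[:, -1, :]), y)
loss.backward()

# Gradient clipping: rescale the global gradient norm so it never
# exceeds max_norm (the threshold mentioned above). This bounds the
# update size even if the raw gradients exploded.
torch.nn.utils.clip_grad_norm_(params, max_norm=1.0)

optimizer.step()
```

Clipping by the global norm (rather than clipping each element independently) preserves the direction of the gradient while bounding its magnitude, so the update step stays stable even on long sequences.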