
long short-term memory - LSTM

Go to [[Week 2 - Introduction]] or back to the [[Main AI Page]]. Part of the pages on [[Artificial Intelligence/Introduction to AI/Week 2 - Introduction/Natural Language Processing]] and [[Attention Mechanism]].

A deep learning architecture, in the same way that [[CNNS - Convolutional neural networks]] are.

These networks are able to selectively forget information, according to the RNN section of the beginners' guide to NLP.

According to Wikipedia:

A common LSTM unit is composed of a cell, an input gate, an output gate and a forget gate. The cell remembers values over arbitrary time intervals and the three gates regulate the flow of information into and out of the cell. 

Source - Wikipedia.

An LSTM unit is made up of (see the code sketch after this list):

  • a cell
  • an input gate
  • an output gate
  • a forget gate
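
To make the interaction between the cell and the three gates concrete, here is a minimal NumPy sketch of a single LSTM time step. This is only an illustration of the standard gate equations; the function name `lstm_step` and the stacked weight layout are my own assumptions, not taken from any of the linked pages.

```python
# Minimal sketch of one LSTM time step (illustrative only; real frameworks
# such as PyTorch or TensorFlow provide optimised implementations).
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step.

    x      : input vector at this step, shape (input_size,)
    h_prev : previous hidden state, shape (hidden_size,)
    c_prev : previous cell state, shape (hidden_size,)
    W, U, b: stacked weights for the four blocks (input, forget, candidate,
             output), shapes (4*hidden_size, input_size),
             (4*hidden_size, hidden_size), (4*hidden_size,)
    """
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b            # all gate pre-activations at once
    i = sigmoid(z[0:H])                   # input gate: what to write
    f = sigmoid(z[H:2 * H])               # forget gate: what to keep
    g = np.tanh(z[2 * H:3 * H])           # candidate cell update
    o = sigmoid(z[3 * H:4 * H])           # output gate: what to expose
    c = f * c_prev + i * g                # cell remembers over time
    h = o * np.tanh(c)                    # gated cell becomes new hidden state
    return h, c
```

The forget gate is what lets the network "forget": when `f` is close to zero for a component of the cell state, that component is overwritten rather than carried forward.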

A video on how LSTMs work

The difference between LSTMs and [[GRUs]]

An LSTM and a GRU side-by-side
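
One way to see the difference in code (a sketch assuming PyTorch is available; the sizes below are arbitrary examples, not from the linked material): a GRU has no separate cell state and one fewer gate, so it has fewer parameters than an LSTM with the same hidden size.

```python
# Compare parameter counts of an LSTM and a GRU with identical sizes.
import torch.nn as nn

lstm = nn.LSTM(input_size=32, hidden_size=64)
gru = nn.GRU(input_size=32, hidden_size=64)

print(sum(p.numel() for p in lstm.parameters()))  # 4 gate blocks -> larger
print(sum(p.numel() for p in gru.parameters()))   # 3 gate blocks -> smaller
```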
