📚 node [[long short term memory lstm]]
📓
garden/KGBicheno/Artificial Intelligence/Introduction to AI/Week 2 - Introduction/Long short-term memory - LSTM.md by @KGBicheno
long short-term memory - LSTM
Go to [[Week 2 - Introduction]] or back to the [[Main AI Page]]. Part of the pages on [[Artificial Intelligence/Introduction to AI/Week 2 - Introduction/Natural Language Processing]] and [[Attention Mechanism]].
LSTMs are a deep learning architecture, in the same way that [[CNNS - Convolutional neural networks]] are.
Unlike plain RNNs, these networks are able to selectively forget information, according to the RNN section of the beginners' guide to NLP.
According to Wikipedia:
A common LSTM unit is composed of a cell, an input gate, an output gate and a forget gate. The cell remembers values over arbitrary time intervals and the three gates regulate the flow of information into and out of the cell.
An LSTM unit is made up of:
- a cell
- an input gate
- an output gate
- a forget gate
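The interaction of the cell and the three gates can be sketched as a single forward step. This is a minimal illustration only, with a hypothetical parameter layout (one weight matrix `W` whose rows are split into input, forget, candidate, and output blocks), not a reference implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM step. W maps [h_prev; x] to the four gate
    pre-activations; the row layout (input, forget, candidate,
    output) is an assumption for this sketch."""
    n = h_prev.shape[0]
    z = W @ np.concatenate([h_prev, x]) + b
    i = sigmoid(z[0:n])          # input gate: how much new info to write
    f = sigmoid(z[n:2*n])        # forget gate: how much old cell state to keep
    g = np.tanh(z[2*n:3*n])      # candidate cell values
    o = sigmoid(z[3*n:4*n])      # output gate: how much of the cell to expose
    c = f * c_prev + i * g       # cell remembers values over time
    h = o * np.tanh(c)           # hidden state / output
    return h, c

# toy usage with random weights
rng = np.random.default_rng(0)
hidden, nin = 4, 3
W = rng.normal(size=(4 * hidden, hidden + nin))
b = np.zeros(4 * hidden)
h, c = np.zeros(hidden), np.zeros(hidden)
for t in range(5):
    h, c = lstm_step(rng.normal(size=nin), h, c, W, b)
```

The key line is `c = f * c_prev + i * g`: because the cell state is updated additively, gradients can flow across many time steps, which is what lets the network remember over arbitrary intervals.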
The difference between LSTMs and [[GRUs]]: a GRU merges the input and forget gates into a single update gate and has no separate cell state or output gate, so it is simpler and has fewer parameters.
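For comparison, a GRU step can be sketched the same way. The parameter names (`Wz`, `Wr`, `Wh`) are hypothetical; the point is that there is one state vector and an update gate `z` doing the work of both the input and forget gates:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, Wz, Wr, Wh, bz, br, bh):
    """One GRU step: single hidden state, no separate cell."""
    hx = np.concatenate([h_prev, x])
    z = sigmoid(Wz @ hx + bz)   # update gate: replaces input + forget gates
    r = sigmoid(Wr @ hx + br)   # reset gate: how much past to use in candidate
    h_tilde = np.tanh(Wh @ np.concatenate([r * h_prev, x]) + bh)
    return (1 - z) * h_prev + z * h_tilde  # blend old state with candidate

# toy usage with random weights
rng = np.random.default_rng(1)
hidden, nin = 4, 3
Wz, Wr, Wh = (rng.normal(size=(hidden, hidden + nin)) for _ in range(3))
bz = br = bh = np.zeros(hidden)
h = gru_step(rng.normal(size=nin), np.zeros(hidden), Wz, Wr, Wh, bz, br, bh)
```

Note the final line: where the LSTM uses separate `f` and `i` gates, the GRU ties them together as `(1 - z)` and `z`.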
📖 stoas
- public document at doc.anagora.org/long-short-term-memory-lstm
- video call at meet.jit.si/long-short-term-memory-lstm
⥅ related node [[long_short term_memory_(lstm)]]