# Recurrent Neural Networks in NLP
Go to [[Week 2 - Introduction]] or back to the [[Main AI Page]]. Part of the page on [[Artificial Intelligence/Introduction to AI/Week 2 - Introduction/Natural Language Processing]]. See [[RNNS - Recurrent neural networks]] for more detail.
As words move through the network one step at a time, each new input is combined with the hidden state left behind by the words that came before it, so early words form part of the processing of later words.

This lets the network take the preceding context into account when processing language.
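
Below is a minimal sketch of that recurrence as a vanilla RNN step in NumPy. The dimensions, weight names, and random inputs are illustrative assumptions, not taken from the course material:

```python
import numpy as np

rng = np.random.default_rng(0)
hidden_size, embed_size = 4, 3

# Illustrative, randomly initialised parameters.
W_xh = rng.normal(scale=0.1, size=(hidden_size, embed_size))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden
b_h = np.zeros(hidden_size)

def rnn_step(x_t, h_prev):
    """Combine the current word vector with the hidden state from earlier words."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

# Process a toy 'sentence' of three word vectors; h accumulates context.
h = np.zeros(hidden_size)
for x_t in rng.normal(size=(3, embed_size)):
    h = rnn_step(x_t, h)
print(h)  # the final state reflects every word seen so far
```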
Traditional RNNs are trained with backpropagation through time (BPTT): the recurrence is unrolled across the time steps of the sequence and the error is propagated backwards through every unrolled step.
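
As a hedged sketch of how BPTT works for the same toy cell: run the forward pass keeping every hidden state, then walk backwards through the stored states accumulating gradients. The squared-error loss on the final hidden state is an assumption chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
hidden_size, embed_size, T = 4, 3, 5

W_xh = rng.normal(scale=0.1, size=(hidden_size, embed_size))
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
b_h = np.zeros(hidden_size)

xs = rng.normal(size=(T, embed_size))  # toy input sequence
target = rng.normal(size=hidden_size)  # toy target for the last hidden state

# Forward pass: unroll the recurrence, keeping every hidden state.
hs = [np.zeros(hidden_size)]
for x_t in xs:
    hs.append(np.tanh(W_xh @ x_t + W_hh @ hs[-1] + b_h))
loss = 0.5 * np.sum((hs[-1] - target) ** 2)

# Backward pass: propagate the error back through every time step.
dW_xh, dW_hh = np.zeros_like(W_xh), np.zeros_like(W_hh)
db_h = np.zeros_like(b_h)
dh = hs[-1] - target                    # dLoss/dh_T
for t in reversed(range(T)):
    draw = dh * (1.0 - hs[t + 1] ** 2)  # gradient through the tanh
    dW_xh += np.outer(draw, xs[t])
    dW_hh += np.outer(draw, hs[t])
    db_h += draw
    dh = W_hh.T @ draw                  # error flowing to the previous step
```

Storing every intermediate hidden state is what makes the backward pass possible; in practice the unrolled window is often truncated to keep memory bounded.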