📚 node [[recurrent neural networks in nlp]]

Recurrent Neural Networks in NLP

Go to [[Week 2 - Introduction]] or back to the [[Main AI Page]]. Part of the page on [[Artificial Intelligence/Introduction to AI/Week 2 - Introduction/Natural Language Processing]]. See [[RNNS - Recurrent neural networks]] for more detail.

As words move through the network one at a time, each new input is combined with a hidden state that summarises the words seen so far, so earlier words form part of the processing of later ones.

This carrying-forward of state lets the network take context into account when processing language.
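A minimal sketch of this recurrence, using a vanilla (Elman-style) RNN cell with numpy; the vocabulary size, hidden size, and one-hot input encoding are illustrative assumptions, not part of the note:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions for the sketch).
vocab_size, hidden_size = 5, 3

# Randomly initialised parameters of a vanilla RNN cell.
W_xh = rng.normal(scale=0.1, size=(hidden_size, vocab_size))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden
b_h = np.zeros(hidden_size)

def forward(token_ids):
    """Process a sequence one token at a time, carrying the hidden state along."""
    h = np.zeros(hidden_size)
    states = []
    for t in token_ids:
        x = np.zeros(vocab_size)
        x[t] = 1.0  # one-hot encode the current word
        # The new state mixes the current word with the summary of earlier words.
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
        states.append(h)
    return states

states = forward([0, 2, 4])
print(len(states), states[-1].shape)  # 3 (3,)
```

Each element of `states` depends on every token up to that point, which is exactly how earlier words shape the processing of later ones.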

Traditional RNNs are trained through back-propagation through time (BPTT).

Here's an example of what BPTT looks like.

📖 stoas
⥱ context