Straight to the point.
If you’re not yet comfortable with the why, how, and when of recurrent neural networks (RNNs), this is for you.
Clarification: for consistency I’m focusing on NLP input, hence words, but most of what I describe also applies to other sequence and time-series data. For simplicity’s sake I won’t say anything about how we represent words (embedding space, integers, one-hot encoded vectors); words will just be words.
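Even though representations are out of scope here, a minimal sketch of the three options mentioned above (integers, one-hot vectors, embeddings) might help make the distinction concrete. The tiny vocabulary and embedding dimension are made up for illustration; the random matrix stands in for a trained embedding table.

```python
import numpy as np

# Tiny made-up vocabulary, purely for illustration
vocab = ["the", "cat", "sat"]
word_to_idx = {w: i for i, w in enumerate(vocab)}

sentence = ["the", "cat", "sat"]

# 1) Integer representation: each word becomes its vocabulary index
int_repr = [word_to_idx[w] for w in sentence]

# 2) One-hot representation: one vector per word, a single 1 at the word's index
one_hot = np.eye(len(vocab))[int_repr]

# 3) Embedding: each word maps to a dense vector
#    (a random matrix here; in practice this table is learned)
rng = np.random.default_rng(0)
embedding_table = rng.normal(size=(len(vocab), 4))  # 4-dimensional embeddings
embedded = embedding_table[int_repr]

print(int_repr)        # [0, 1, 2]
print(one_hot.shape)   # (3, 3)
print(embedded.shape)  # (3, 4)
```

Whichever representation you pick, the rest of the article treats the input as a sequence of words.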
Abbreviations: NN (neural network), DNN (deep neural network), RNN (recurrent neural network), CNN (convolutional neural network).
First a reminder — RNNs are a way to infer…