Straight to the point.

Audience for this post

If you’re not yet comfortable with the why, how, and when of recurrent neural networks (RNNs), this post is for you.

Clarification: I’m focusing on NLP input, hence words, for consistency, but most of what I’m describing also applies to other sequence and time series data. For simplicity’s sake I won’t say anything about how words are represented (embedding space, integers, one-hot encoded vectors); words will be just words.
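For readers who want a concrete picture of the representations I just waved away, here is a minimal sketch of the three options. The toy vocabulary and the random embedding matrix are my own assumptions for illustration; in practice the embedding would be learned.

```python
import numpy as np

# Assumed toy vocabulary, purely for illustration.
vocab = ["the", "cat", "sat"]
word_to_index = {word: i for i, word in enumerate(vocab)}

word = "cat"

# 1) Integer representation: just the word's index in the vocabulary.
index = word_to_index[word]  # 1

# 2) One-hot encoded vector: all zeros except a 1 at the word's index.
one_hot = np.zeros(len(vocab))
one_hot[index] = 1.0  # [0. 1. 0.]

# 3) Embedding: a dense vector looked up by index.
#    Random here; a real model would train this matrix.
embedding_dim = 4
embedding_matrix = np.random.rand(len(vocab), embedding_dim)
embedding = embedding_matrix[index]

print(index, one_hot, embedding.shape)
```

All three are interchangeable for the rest of this post, which is exactly why I can keep saying "word" and mean any of them.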


Acronyms used in this post: NN (neural network), DNN (deep neural network), RNN (recurrent neural network), CNN (convolutional neural network).

INTRO — Why RNNs are simpler than CNNs

First a reminder — RNNs are a way to infer…
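Before going further, it may help to see the recurrence itself. This is a minimal sketch of a single-layer RNN step, assuming a plain tanh cell (the post hasn’t committed to a specific cell type); the weight names and dimensions are my own.

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    # The new hidden state mixes the current input with the previous state:
    # this dependence on h_prev is what makes the network "recurrent".
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

rng = np.random.default_rng(0)
input_dim, hidden_dim = 3, 5
W_xh = rng.standard_normal((input_dim, hidden_dim))
W_hh = rng.standard_normal((hidden_dim, hidden_dim))
b_h = np.zeros(hidden_dim)

# Run a short "sentence" of 4 word vectors through the loop.
h = np.zeros(hidden_dim)
for x_t in rng.standard_normal((4, input_dim)):
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)

print(h.shape)  # (5,)
```

The same weights are reused at every time step, so the model handles sequences of any length with a fixed number of parameters.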

Serj Smorodinsky

