My favorites were Andrew Ng's lectures:
https://www.coursera.org/learn/nlp-sequence-models
He builds up the intuition behind RNNs and word embeddings and goes on from there.
As you work through those lectures, I would play around with NLTK, spaCy, and Gensim so you can try some of it for yourself.
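If it helps, here's a rough sketch of the kind of playing around I mean (my own toy example, not from the course), assuming you've pip-installed nltk, spacy, and gensim and downloaded spaCy's en_core_web_sm model:

```python
# Toy example: tokenize a sentence with NLTK and spaCy, then train a tiny
# Word2Vec model with Gensim to see word embeddings in action.
# Assumes: pip install nltk spacy gensim
#          python -m spacy download en_core_web_sm
import nltk
import spacy
from gensim.models import Word2Vec

nltk.download("punkt")  # NLTK tokenizer data (newer versions may also need "punkt_tab")

text = "Word embeddings map words to dense vectors that capture meaning."

# NLTK: simple word tokenization
print(nltk.word_tokenize(text))

# spaCy: tokenization plus part-of-speech tags from a small English model
nlp = spacy.load("en_core_web_sm")
doc = nlp(text)
print([(token.text, token.pos_) for token in doc])

# Gensim: train a toy Word2Vec model on a tiny corpus
# (real corpora are much larger; gensim < 4 uses size= instead of vector_size=)
corpus = [
    ["words", "become", "vectors"],
    ["vectors", "capture", "meaning"],
    ["similar", "words", "get", "similar", "vectors"],
]
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, epochs=50)
print(model.wv.most_similar("words"))
```

Seeing the tokens and nearest-neighbor vectors yourself makes the lecture material on embeddings much more concrete.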