Recurrent Neural Networks (RNN) – Plan of Attack

Published by SuperDataScience Team

August 24, 2018

Plan of attack

(For the PPT of this lecture Click Here)

Are you excited to uncover the essence of Recurrent Neural Networks? We hope you are… because we are now venturing into a complex, very forward-looking and cutting-edge area of deep learning!
But what’s really important to keep in mind is that this is going to be very, very fun!
So, how are we going to approach Recurrent Neural Networks?
First of all, we will learn the idea behind how Recurrent Neural Networks work and how they compare to the human brain.
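If you’d like a tiny preview before that lecture, here is a minimal sketch of the core idea – a hidden state that gets fed back in at every time step. This is our own illustrative NumPy snippet, not part of the course materials; the weight shapes and the toy input sequence are made up purely for demonstration.

```python
# A minimal sketch (illustrative only) of a single "vanilla" RNN step in NumPy.
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    """One time step: the new hidden state mixes the current input with the
    previous hidden state - this feedback loop is the network's 'memory'."""
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

# Hypothetical sizes: 3 input features, 5 hidden units.
rng = np.random.default_rng(0)
W_xh = rng.normal(scale=0.1, size=(3, 5))
W_hh = rng.normal(scale=0.1, size=(5, 5))
b_h = np.zeros(5)

h = np.zeros(5)                      # initial hidden state
for x_t in rng.normal(size=(4, 3)):  # a toy sequence of 4 time steps
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)
print(h)                             # the final hidden state "remembers" the sequence
```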
Then, we’ll talk about the Vanishing Gradient Problem – something that used to be a major roadblock in the development and use of RNNs.
Next, we’ll move on to the solution to this problem – Long Short-Term Memory (LSTM) neural networks and their architecture. We’ll break down the complex structure inside these networks into simple terms, and you’ll get a pretty solid understanding of LSTMs.
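Again purely as a preview, here is a rough NumPy sketch of the gate computations that lecture will demystify – forget, input and output gates wrapped around a cell state. It is our own sketch, not the course code, and the parameter names and sizes are hypothetical.

```python
# A rough, illustrative sketch of one LSTM time step in NumPy.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM step with forget (f), input (i), candidate (g) and output (o)
    gates; W, U, b are dicts of hypothetical parameter arrays, one per gate."""
    f = sigmoid(x_t @ W["f"] + h_prev @ U["f"] + b["f"])  # how much memory to keep
    i = sigmoid(x_t @ W["i"] + h_prev @ U["i"] + b["i"])  # how much new info to write
    g = np.tanh(x_t @ W["g"] + h_prev @ U["g"] + b["g"])  # candidate new values
    o = sigmoid(x_t @ W["o"] + h_prev @ U["o"] + b["o"])  # how much memory to expose
    c_t = f * c_prev + i * g          # cell state: the long-term memory lane
    h_t = o * np.tanh(c_t)            # hidden state: the short-term output
    return h_t, c_t

# Hypothetical sizes: 3 inputs, 5 hidden units, random toy parameters.
rng = np.random.default_rng(1)
W = {k: rng.normal(scale=0.1, size=(3, 5)) for k in "figo"}
U = {k: rng.normal(scale=0.1, size=(5, 5)) for k in "figo"}
b = {k: np.zeros(5) for k in "figo"}
h, c = lstm_step(rng.normal(size=3), np.zeros(5), np.zeros(5), W, U, b)
```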
Then, we’ll have a look at some great practical examples to get an idea of what’s actually going on in the brain of an LSTM.
Finally, we will have an extra tutorial on LSTM variations, just to get you up to speed on what other kinds of LSTM exist out there in the world.
Ready to get started? Then move on to our next section!
