LSTM Networks - EXPLAINED!

Recurrent neural nets are very versatile. However, they don't work well for longer sequences. Why is this the case? You'll understand that now, as we delve into one of the most common recurrent neural network architectures: the LSTM. We also build a text generator in Keras to generate State of the Union speeches (a minimal sketch of the idea follows the links below).

BLOG: https://medium.com/@dataemporium

PLAYLISTS FROM MY CHANNEL
⭕ Reinforcement Learning: https://youtube.com/playlist?list=PLTl9hO2Oobd9kS--NgVz0EPNyEmygV1Ha&si=AuThDZJwG19cgTA8
⭕ Natural Language Processing: https://youtube.com/playlist?list=PLTl9hO2Oobd_bzXUpzKMKA3liq2kj6LfE&si=LsVy8RDPu8jeO-cc
⭕ Transformers from Scratch: https://youtube.com/playlist?list=PLTl9hO2Oobd_bzXUpzKMKA3liq2kj6LfE
⭕ ChatGPT Playlist: https://youtube.com/playlist?list=PLTl9hO2Oobd9coYT6XsTraTBo4pL1j4HJ
⭕ Convolutional Neural Networks: https://youtube.com/playlist?list=PLTl9hO2Oobd9U0XHz62Lw6EgIMkQpfz74
⭕ The Math You Should Know: https://youtube.com/playlist?list=PLTl9hO2Oobd-_5sGLnbgE8Poer1Xjzz4h
⭕ Probability Theory for Machine Learning: https://youtube.com/playlist?list=PLTl9hO2Oobd9bPcq0fj91Jgk_-h1H_W3V
⭕ Coding Machine Learning: https://youtube.com/playlist?list=PLTl9hO2Oobd82vcsOnvCNzxrZOlrz3RiD

MATH COURSES (7 day free trial)
📕 Mathematics for Machine Learning: https://imp.i384100.net/MathML
📕 Calculus: https://imp.i384100.net/Calculus
📕 Statistics for Data Science: https://imp.i384100.net/AdvancedStatistics
📕 Bayesian Statistics: https://imp.i384100.net/BayesianStatistics
📕 Linear Algebra: https://imp.i384100.net/LinearAlgebra
📕 Probability: https://imp.i384100.net/Probability

OTHER RELATED COURSES (7 day free trial)
📕 ⭐ Deep Learning Specialization: https://imp.i384100.net/Deep-Learning
📕 Python for Everybody: https://imp.i384100.net/python
📕 MLOps Course: https://imp.i384100.net/MLOps
📕 Natural Language Processing (NLP): https://imp.i384100.net/NLP
📕 Machine Learning in Production: https://imp.i384100.net/MLProduction
📕 Data Science Specialization: https://imp.i384100.net/DataScience
📕 Tensorflow: https://imp.i384100.net/Tensorflow

CODE FOR THIS VIDEO: https://github.com/ajhalthor/Keras_LSTM_Text_Generator

REFERENCES
[1] LSTM landmark paper (Sepp Hochreiter & Jürgen Schmidhuber): https://www.bioinf.jku.at/publications/older/2604.pdf
[2] Slides from the Deep Learning book for RNNs: https://www.deeplearningbook.org/slides/10_rnn.pdf
[3] Andrej Karpathy's blog + code (you can probably understand more from this now!): http://karpathy.github.io/2015/05/21/rnn-effectiveness/
[4] The Deep Learning book on sequence modeling: https://www.deeplearningbook.org/contents/rnn.html
[5] Colah's blog on LSTMs: http://colah.github.io/posts/2015-08-Understanding-LSTMs/
[6] Visualizing and Understanding RNNs: https://arxiv.org/pdf/1506.02078.pdf
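For readers who want a feel for what the Keras text generator looks like before opening the repo, here is a minimal character-level sketch. It is an illustration of the general recipe (one-hot sequences in, an LSTM layer, a softmax over characters, then sampling), not the exact code from the video; the toy corpus, sequence length, and hyperparameters below are placeholder assumptions.

```python
# Minimal character-level LSTM text generator sketch (assumed values throughout).
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

SEQ_LEN = 40  # characters of context fed to the LSTM (assumed value)

# Toy corpus standing in for the State of the Union transcripts used in the video.
text = "the state of the union is strong " * 200
chars = sorted(set(text))
char_to_idx = {c: i for i, c in enumerate(chars)}
idx_to_char = {i: c for c, i in char_to_idx.items()}

# Build (input sequence, next character) training pairs, one-hot encoded.
X = np.zeros((len(text) - SEQ_LEN, SEQ_LEN, len(chars)), dtype=np.float32)
y = np.zeros((len(text) - SEQ_LEN, len(chars)), dtype=np.float32)
for i in range(len(text) - SEQ_LEN):
    for t, c in enumerate(text[i:i + SEQ_LEN]):
        X[i, t, char_to_idx[c]] = 1.0
    y[i, char_to_idx[text[i + SEQ_LEN]]] = 1.0

# One LSTM layer followed by a softmax over the character vocabulary.
model = Sequential([
    LSTM(128, input_shape=(SEQ_LEN, len(chars))),
    Dense(len(chars), activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam")
model.fit(X, y, batch_size=128, epochs=3)

# Generate text by repeatedly sampling the next character from the model.
generated = text[:SEQ_LEN]
for _ in range(200):
    x_pred = np.zeros((1, SEQ_LEN, len(chars)), dtype=np.float32)
    for t, c in enumerate(generated[-SEQ_LEN:]):
        x_pred[0, t, char_to_idx[c]] = 1.0
    probs = model.predict(x_pred, verbose=0)[0].astype("float64")
    probs /= probs.sum()  # renormalize so the sampler accepts the distribution
    generated += idx_to_char[np.random.choice(len(chars), p=probs)]
print(generated)
```

Sampling from the softmax (rather than always taking the argmax) keeps the generated speech from looping on the single most likely phrase; the full project at the GitHub link above follows the same basic pattern on the real corpus.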