Pinned: LSTM, GRU and Attention Block Basics
How Language discovered itself — ML techniques in NLP
Dec 3, 2020
Transformers…Attention is all you need!
Transformers are becoming more and more important, not just in NLP; they are now extending into other areas of deep…
Feb 4, 2021
Convolutional Sequence to Sequence Learning in NLP
Recurrent neural networks (RNNs) with LSTM or GRU units are the most prevalent tools for NLP researchers, and provide state-of-the-art…
Jan 28, 2021