Why do we need LSTM?

LSTM stands for Long Short-Term Memory. LSTMs are essentially an extension of RNNs, built primarily to deal with situations where RNNs fail. An RNN is a network that operates on the current input while taking the previous output into account, storing it in its memory for a limited period of time (short-term memory). Among its numerous applications, some of the most popular uses of RNNs include natural language processing, non-Markovian control, and music composition.

Nevertheless, RNNs have drawbacks. First, they fail to store information over longer periods of time. At times, predicting the current output requires a reference to information stored quite a long time ago, yet RNNs are unable to handle such "long-term dependencies". Second, there is no proper mechanism to decide which part of the context should be carried forward and how much of the past information should be 'forgotten'. Other problems with RNNs are the exploding and vanishing gradients that arise during backpropagation in the network's training phase.

This is where Long Short-Term Memory (LSTM) comes in. It is built so that the vanishing gradient problem is almost entirely eliminated, while the training model is left unchanged. LSTMs provide a broad variety of parameters, such as learning rates and input and output biases. As an additional advantage, the complexity of updating each weight drops to O(1), similar to that of Backpropagation Through Time (BPTT).
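To make the gating idea concrete, here is a minimal sketch (not a reference implementation) of a single LSTM step written in NumPy; the weight layout, shapes, and variable names are assumptions chosen purely for illustration. The forget gate decides what to discard from the long-term cell state, the input gate decides what new information to store, and the output gate decides what to expose as the short-term hidden state.

```python
# Minimal, illustrative LSTM step in NumPy (shapes and names are assumptions).
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM time step.

    x_t    : current input, shape (input_dim,)
    h_prev : previous hidden state (short-term memory), shape (hidden_dim,)
    c_prev : previous cell state (long-term memory), shape (hidden_dim,)
    W, U, b: parameters for the four gates, stacked as 4*hidden_dim rows.
    """
    hidden_dim = h_prev.shape[0]
    z = W @ x_t + U @ h_prev + b                  # pre-activations for all gates
    f = sigmoid(z[0*hidden_dim:1*hidden_dim])     # forget gate: what to drop from c_prev
    i = sigmoid(z[1*hidden_dim:2*hidden_dim])     # input gate: what new information to store
    o = sigmoid(z[2*hidden_dim:3*hidden_dim])     # output gate: what to expose as h_t
    g = np.tanh(z[3*hidden_dim:4*hidden_dim])     # candidate cell update
    c_t = f * c_prev + i * g                      # additive update to the cell state
    h_t = o * np.tanh(c_t)
    return h_t, c_t

# Tiny usage example with random parameters and a short sequence of 5 steps.
rng = np.random.default_rng(0)
input_dim, hidden_dim = 3, 4
W = rng.normal(size=(4 * hidden_dim, input_dim))
U = rng.normal(size=(4 * hidden_dim, hidden_dim))
b = np.zeros(4 * hidden_dim)
h, c = np.zeros(hidden_dim), np.zeros(hidden_dim)
for x in rng.normal(size=(5, input_dim)):
    h, c = lstm_step(x, h, c, W, U, b)
print(h)
```

Because the cell state is updated additively rather than being repeatedly squashed through an activation, gradients can flow across many time steps, which is what mitigates the vanishing-gradient problem described above.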

Some Applications of LSTM:

Text Prediction:

LSTM's long-term memory capability means that it excels at predicting sequences of text. To anticipate the next word in a sentence, the network needs to keep track of the words that came before it. One of the most common applications of text prediction is in chatbots, which are used by eCommerce sites.
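As a rough illustration, the sketch below shows how such a next-word predictor might be wired up with Keras. The vocabulary size, window length, layer sizes, and the random placeholder data are all assumptions for illustration, not a production setup.

```python
# Illustrative next-word prediction sketch with Keras (toy sizes, random data).
import numpy as np
import tensorflow as tf

vocab_size, seq_len, embed_dim = 5000, 10, 64

model = tf.keras.Sequential([
    tf.keras.Input(shape=(seq_len,), dtype="int32"),            # window of word indices
    tf.keras.layers.Embedding(vocab_size, embed_dim),            # word embeddings
    tf.keras.layers.LSTM(128),                                   # summarises the preceding words
    tf.keras.layers.Dense(vocab_size, activation="softmax"),     # probability of each next word
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Dummy data: each row is a window of 10 word indices, the label is the following word.
X = np.random.randint(0, vocab_size, size=(256, seq_len))
y = np.random.randint(0, vocab_size, size=(256,))
model.fit(X, y, epochs=1, batch_size=32, verbose=0)

# Predict the most likely next word for one new window of indices.
print(model.predict(X[:1], verbose=0).argmax(axis=-1))
```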

Stock Prediction:

To predict a stock value with substantial accuracy, the model needs to consider one of the biggest factors: the trend of the stock. To do that, the model has to identify the trend from the values recorded over the past days, a task well suited to an LSTM network, as sketched below.
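The sketch below feeds a sliding window of the previous days' values to an LSTM, which outputs the next day's value. The window length, layer sizes, and the synthetic "price" series are assumptions made only for illustration; real data and careful evaluation would be needed in practice.

```python
# Illustrative sliding-window regression with an LSTM (synthetic data).
import numpy as np
import tensorflow as tf

window = 30  # use the previous 30 days to predict the next day's value

# Synthetic "price" series standing in for real historical data.
prices = np.cumsum(np.random.randn(500)).astype("float32")

# Build (samples, window, 1) inputs and next-day targets.
X = np.stack([prices[i:i + window] for i in range(len(prices) - window)])[..., None]
y = prices[window:]

model = tf.keras.Sequential([
    tf.keras.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(64),       # learns the trend over the window
    tf.keras.layers.Dense(1),       # next-day value (regression)
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)

# Forecast the day after the most recent window.
print(model.predict(prices[-window:].reshape(1, window, 1), verbose=0))
```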

Check out the Data Science Masters Program @ Fingertips
