I am excited to announce a new section of the Everyday Series, where I will regularly cover important research papers in the field of AI from the last 15 years. I will read these papers and publish my notes on this page, and I will also share public resources where they are discussed, such as YouTube videos and Substack posts. My aim is to make the information accessible to a wider audience, whether in bite-sized notes here or through well-written online resources. I will also use AI tools to help summarise or draft these notes, which I will then edit.

Artificial intelligence is one of the most rapidly growing and exciting fields today, and it is crucial to stay informed about the latest advancements. However, many research papers can be dense and difficult to understand, even for those with a background in the field.

This series aims to bridge that gap by breaking down the key findings and insights from these papers and making them easy to understand. I will cover many topics, including machine learning, natural language processing, computer vision, and more.

I will be publishing notes often, so be sure to check back regularly for new updates. I am looking forward to sharing this journey with you, and I hope you will find it as interesting and informative as I do.

You can also bookmark https://everydayseries.com/papers to view the papers I have covered and the ones I will cover on this page.

The first paper is:

Long Short-Term Memory
Hochreiter, S. and Schmidhuber, J., 1997. Long short-term memory. Neural Computation, 9(8), pp. 1735-1780.

Thank you for reading.

We curate, research and publish daily updates from the field of AI.
Consider becoming a paying subscriber to get the latest!