This year, we started our "AI Agents and Agentic Workflows" series (https://www.turingpost.com/t/AI-Agents) to explore everything about AI agents step by step: all the vocabulary, how they work, and how to build them. The huge interest in this series and the large number of studies conducted on agents showed that they were one of the most popular and important themes of the year. In 2025, agents will most likely reach new heights, and we will be covering that for you. Now, let's review the agentic systems that have emerged this year.
Here is a list of 15 agentic systems and frameworks of 2024:
Key Idea: A data-dependent weighted average for pooling and communication, enabling flexible and powerful neural network connections.
Breakthrough: Bahdanau's "soft search" mechanism (softmax + weighted averaging) solved encoder-decoder bottlenecks in machine translation.
Transformer Revolution: Attention Is All You Need (1706.03762) (2017) by @ashishvaswanigoogle et al. simplified architectures by stacking attention layers, introducing multi-headed attention and positional encodings.
Legacy: Attention replaced RNNs, driving modern AI systems like ChatGPT. It emerged independently but was influenced by contemporaneous work like Alex Graves's Neural Turing Machines (1410.5401) and Jason Weston's Memory Networks (1410.3916).
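To make the "softmax + weighted averaging" idea concrete, here is a minimal sketch of soft attention in NumPy. The function names and the toy dimensions are our own illustration, not code from any of the papers above: a query is scored against each key, the scores are passed through a softmax to get data-dependent weights, and the output is the weighted average of the values.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the max before exponentiating for numerical stability
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(query, keys, values):
    """Soft 'search': score each key against the query, turn the scores
    into weights with a softmax, and return the weighted average of values."""
    scores = keys @ query / np.sqrt(query.shape[-1])  # similarity scores, scaled
    weights = softmax(scores)                         # data-dependent weights, sum to 1
    return weights @ values                           # weighted average (pooling)

# Toy example: one query attending over 3 key/value pairs of dimension 4
rng = np.random.default_rng(0)
q = rng.normal(size=4)
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = attention(q, K, V)  # a blend of the 3 value vectors
```

Stacking many such operations in parallel (multiple heads) and in depth, without any recurrence, is essentially what the Transformer did.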
Attention to history: Jürgen Schmidhuber claims his 1992 Fast Weight Programmers anticipated modern attention mechanisms. While conceptually similar, the term "attention" was absent, and there's no evidence it influenced Bahdanau, Cho, and Bengio's 2014 work. Paying attention (!) to history might have brought us to genAI earlier, but credit for the breakthrough still goes to Montreal.
Who else deserves recognition in this groundbreaking narrative of innovation? Let's ensure every contributor gets the credit they deserve. Leave a comment below.