Near-infinite memory is going to knock the socks off of those aiming to use advanced generative AI and LLMs.

In today’s column, I examine the rapidly emerging topic of establishing near-infinite memory for generative AI and large language models (LLMs). What’s that, you might be wondering.

If you haven’t yet heard about this quite remarkable upcoming AI breakthrough, you certainly will in the coming months. The technology is being formulated now, and its impact on what generative AI and LLMs will additionally be able to accomplish will be enormous. It has to do with a slew of foundational AI elements, including stateless interactions, session-based memory, context chaining, and other facets, all of which are going to transform toward near-infinite memory and what is colloquially referred to as infinite attention.
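To make those terms concrete, here is a minimal conceptual sketch in Python of how session-based memory is typically simulated today via context chaining atop a stateless model call. This is not any vendor’s actual API; the function names, the token heuristic, and the context-window limit are all illustrative assumptions.

```python
# Conceptual sketch (hypothetical, not a real vendor API): why today's LLMs
# are "stateless" and how session-based memory is simulated by chaining
# the prior conversation into each new call.

from typing import List, Dict

MAX_CONTEXT_TOKENS = 8_000  # illustrative context-window limit (assumption)


def call_llm(messages: List[Dict[str, str]]) -> str:
    """Hypothetical stand-in for a model endpoint. A stateless model keeps
    nothing between calls; it only sees the messages passed in right now."""
    return f"(model reply based on {len(messages)} messages)"


def rough_token_count(messages: List[Dict[str, str]]) -> int:
    # Crude heuristic for illustration only: roughly 4 characters per token.
    return sum(len(m["content"]) for m in messages) // 4


class ChatSession:
    """Session-based memory via context chaining: on every turn, the full
    prior transcript is re-sent so the stateless model appears to remember."""

    def __init__(self) -> None:
        self.history: List[Dict[str, str]] = []

    def send(self, user_text: str) -> str:
        self.history.append({"role": "user", "content": user_text})
        # The context window caps how much "memory" can be chained in;
        # once the limit is hit, older turns must be dropped or summarized.
        while rough_token_count(self.history) > MAX_CONTEXT_TOKENS:
            self.history.pop(0)  # evict the oldest turn
        reply = call_llm(self.history)
        self.history.append({"role": "assistant", "content": reply})
        return reply


session = ChatSession()
session.send("Remember that my project codename is Bluebird.")
print(session.send("What is my project codename?"))
# The recall works only while "Bluebird" still fits inside the chained
# context. Near-infinite memory, as discussed here, aims to remove that
# eviction step so nothing ever ages out.
```

The eviction loop is the crux: it is the mechanical reason current chatbots eventually “forget” earlier parts of a long conversation, and it is exactly the constraint that near-infinite memory and infinite attention are meant to dissolve.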

Let’s talk about it.

This analysis of an innovative AI breakthrough is part of my ongoing Forbes column coverage on the latest in AI, including identifying and explaining various impactful AI complexities (see the link here).

Defining Near-Infinite Memory

The place to begin is by clarifying what the catchphrase of near-infinite memory means.

Here’s a handy way to think of the parlance. Suppose you have lots of digital photos on your smartphone, perhaps several thousand pics.

A cloud provider urges you to store your digital photos on their servers. The servers can handle billions of digital photos. For your purposes, the amount of storage or memory on offer is, in effect, nearly infinite.