LLMs as Operating Systems: Agent Memory, a new course based on the MemGPT approach

The Development of LLMs as Operating Systems: A Partnership with Letta and Its Open-Source Agentic Framework

I am excited to introduce LLMs as Operating Systems: Agent Memory, a course on agent memory built in partnership with Letta. It is the result of a collaboration between our team and Letta's founders, Charles Packer and Sarah Wooders. If you have taken one of DeepLearning.AI's earlier prompting courses, you may have noticed that when you prompt an LLM via an API, it does not have persistent memory. For example, if you prompt an LLM with "hello, my name is Andrew" and then ask in the following prompt "what is my name", it won't remember. To simulate memory, you need to include the conversational history in the prompt, that is, in the context window of the large language model.
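To make the statelessness concrete, here is a minimal sketch. The function `fake_complete` is a hypothetical stand-in for a real chat-completion API call; with an actual provider you would send the same `messages` list over HTTP, and the same principle applies: the model only "remembers" what the client resends.

```python
# Minimal sketch of why conversation history must be resent each turn.
# `fake_complete` is a toy stand-in for a chat-completion API call.

def fake_complete(messages):
    """Toy model: answers the name question only if a name appears in history."""
    last = messages[-1]["content"].lower()
    if "what is my name" in last:
        for m in messages:
            if "my name is" in m["content"].lower():
                name = m["content"].lower().split("my name is", 1)[1].strip().split()[0]
                return f"Your name is {name.capitalize()}."
        return "I don't know your name."
    return "Hello!"

# Each call is independent: without the earlier turn in `messages`,
# the model cannot recall the name.
stateless = fake_complete([{"role": "user", "content": "What is my name?"}])

# Simulated memory: the client resends the full conversation history.
history = [
    {"role": "user", "content": "Hello, my name is Andrew"},
    {"role": "assistant", "content": "Hello!"},
    {"role": "user", "content": "What is my name?"},
]
with_history = fake_complete(history)

print(stateless)     # "I don't know your name."
print(with_history)  # "Your name is Andrew."
```

The point of the toy is the shape of the calls, not the model logic: memory lives entirely in the `messages` list the client maintains.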

As applications have grown more sophisticated, managing the information that goes into and out of the limited context window has become increasingly challenging. The MemGPT team developed a clever way to handle AI agent memory. Chatbots are one example: we include conversational history in the context window, but as it grows, it may need to be compressed or summarized to fit. We may also want to track personal or topic-specific information and include it in the context window. RAG applications are another example; they strive to include the most relevant information in the context to answer queries. Agentic applications are the most demanding of all.
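The compress-or-summarize step can be sketched as follows. This is a hedged illustration only: it estimates size by word count rather than model tokens, and `summarize` is a stub where a real system would make an LLM call.

```python
# Sketch: keep recent turns verbatim and fold older turns into a running
# summary once a (word-count) budget is exceeded. Real systems count model
# tokens and use an LLM for `summarize`; both are stubbed here.

BUDGET = 12  # max words kept verbatim (tiny, for illustration)

def summarize(messages):
    # Stand-in for an LLM summarization call.
    return f"Summary of {len(messages)} earlier message(s)."

def compress(history):
    """Evict oldest messages into a summary until the rest fits the budget."""
    def words(msgs):
        return sum(len(m["content"].split()) for m in msgs)
    evicted = []
    while len(history) > 1 and words(history) > BUDGET:
        evicted.append(history.pop(0))
    if evicted:
        # Note: a production version would also count the summary's own size.
        history.insert(0, {"role": "system", "content": summarize(evicted)})
    return history

history = [
    {"role": "user", "content": "Hello, my name is Andrew"},
    {"role": "assistant", "content": "Nice to meet you, Andrew!"},
    {"role": "user", "content": "Tell me about agent memory management"},
]
history = compress(history)
print(history[0]["content"])  # the oldest turn is now one summary message
```

Recursive summarization like this trades recall fidelity for space, which is exactly the tension the rest of the course addresses.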

They potentially need different information for each processing step. This is where the solution proposed in the paper "MemGPT: Towards LLMs as Operating Systems", using an LLM agent to manage the context window, comes into play. The analogy used in the paper is virtual memory from the computer systems literature. Computers use a technique called virtual memory to store much more data than can fit in physical memory. When data in virtual memory is accessed that is not in physical memory, the operating system fetches that data from disk and loads it into physical memory.
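The paging behavior in the analogy can be sketched in a few lines. Here the "context window" is a small cache and the "disk" is a plain dictionary; the class and eviction policy (least recently used) are illustrative choices, not anything prescribed by the paper.

```python
# Sketch of the virtual-memory analogy: a small "context window" (physical
# memory) backed by a larger external store (disk). Accessing a fact that
# is not in context pages it in, evicting the least recently used item.
from collections import OrderedDict

class PagedMemory:
    def __init__(self, capacity, external_store):
        self.capacity = capacity      # how many facts fit "in context"
        self.store = external_store   # everything else lives out of context
        self.context = OrderedDict()  # ordered least -> most recently used

    def access(self, key):
        if key in self.context:
            self.context.move_to_end(key)     # hit: mark as recently used
            return self.context[key]
        value = self.store[key]               # miss: "page in" from disk
        if len(self.context) >= self.capacity:
            self.context.popitem(last=False)  # evict least recently used
        self.context[key] = value
        return value

store = {"name": "Andrew", "topic": "agent memory", "course": "MemGPT"}
mem = PagedMemory(capacity=2, external_store=store)
mem.access("name")
mem.access("topic")
mem.access("course")      # context full: "name" is evicted
print(list(mem.context))  # ['topic', 'course']
```

MemGPT's insight is that the LLM agent itself can play the role of the operating system here, deciding what to page in and out.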

In this course, you'll learn how to achieve a similar result with an LLM agent acting as the operating system. The agent will determine which information should be in the context window and fetch it from the appropriate source. To understand the fundamentals, you will code an LLM agent that can edit its own memory from scratch. Then, you'll learn how to use Letta, an open-source agentic framework that lets you put the LLMs-as-operating-systems ideas from MemGPT into practice.

It lets you build and deploy agents that have long-term memory. Agents are given tools to track context-window usage, edit memory, and move information between the context window and external memory. You'll create agents, generate chat memories, and create your own custom memory types. When I read the original MemGPT paper, I thought it was an innovative and important technique for handling memory for LLMs. The open-source Letta framework makes these ideas easy for you to use. Please learn about them in this course.
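A toy version of the self-editing memory idea is sketched below. The tool names echo the memory-editing functions described in the MemGPT paper (`core_memory_append`, `core_memory_replace`); the class and the simulated tool calls are illustrative assumptions, not the real Letta API.

```python
# Toy agent memory that the model edits through tools. In a real MemGPT-style
# agent, the LLM emits these tool calls itself after reading user messages;
# here we invoke them directly to show the effect on the memory block.

class AgentMemory:
    def __init__(self):
        self.core = {"human": "", "persona": "I am a helpful assistant."}

    def core_memory_append(self, section, text):
        """Append a new fact to a core-memory section."""
        self.core[section] = (self.core[section] + " " + text).strip()

    def core_memory_replace(self, section, old, new):
        """Correct or update an existing fact in place."""
        self.core[section] = self.core[section].replace(old, new)

    def render(self):
        # This rendered block would sit at the top of the context window.
        return "\n".join(f"<{k}>\n{v}" for k, v in self.core.items())

memory = AgentMemory()

# Simulated tool calls an LLM might emit during a conversation:
memory.core_memory_append("human", "Name: Andrew.")
memory.core_memory_append("human", "Interested in agent memory.")
memory.core_memory_replace("human", "Andrew", "Andrew Ng")

print(memory.render())
```

Because the rendered memory is prepended to every prompt, edits the agent makes to it persist across turns, which is the "long-term memory" effect the course builds toward.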



The Future of Language Models

As we continue to develop and improve the LLMs-as-operating-systems approach to agent memory, we are excited to see the impact it will have on the field of language models. We believe that this technology has the potential to revolutionize the way we approach natural language processing and generation.

"WEBVTTKind: captionsLanguage: enI'm excited to introduce L as operating systems agent memory built in partnership with letter A company that supports the open- source agentic framework called lettera your instructors are the founders Charles pcker and Sarah Withers if you taken one of De blend ai's earlier prompting causes you may have found that when you prompt an LM via API they do not have persistent memories for example if you prompt an Elm with hello my name is Andrew and then ask in the following prompt what is my name it won't remember to simulate memory you have to include the conversational history in the prompt or the context window of the launch language model as the applications have grown more sophisticated managing the information that goes into and out of the limited context window has become increasingly challenging mgpt developed a very clever way to handle AI agentic memories take the chatbot application the context window will include convers history but as it grows it may need to be compressed or summarized to fit it may also want to track personal or topic specific information and include that in the context window rag applications are another example they strive to include the most relevant information in the context answer queries and agenting applications are the most demanding they potentially need different information for each processing step and so in the paper mgpt towards om as operating systems the author proposed using an OM agent to manage this context window two of those authors are your instructors for this course the analogy we used in the paper is that of virtual memory from the computer systems literature computers use a technique called virtual memory to store much more data than the physical memory of the computer can when data in virtual memory is accessed that is not in physical memory the operating system will fetch that data from disk and loaded into physical memory in this course you'll learn how to achieve a similar 
result with an LM agent AC as the operating system the agent will determine which information should be in the context window and fetch it from the appropriate source to understand the fundamentals you will code an llm agent that can edit its own memory from scratch then you'll learn how to use an open source agentic framework Leta this framework allows you to put the llm as operating systems ideas from mgpt into practice it lets you build and deploy agents that have long-term memory agents OFA are given tools to track context memory usage edit memory and move information between the context window and external memory you'll create agents generate chat memories and create your own custom memory types when I read the original MPD paper I thought it was an Innovative and important technique for handling memory for LS the open source letter framework makes these ideas easy for your to use please learn about them in this courseI'm excited to introduce L as operating systems agent memory built in partnership with letter A company that supports the open- source agentic framework called lettera your instructors are the founders Charles pcker and Sarah Withers if you taken one of De blend ai's earlier prompting causes you may have found that when you prompt an LM via API they do not have persistent memories for example if you prompt an Elm with hello my name is Andrew and then ask in the following prompt what is my name it won't remember to simulate memory you have to include the conversational history in the prompt or the context window of the launch language model as the applications have grown more sophisticated managing the information that goes into and out of the limited context window has become increasingly challenging mgpt developed a very clever way to handle AI agentic memories take the chatbot application the context window will include convers history but as it grows it may need to be compressed or summarized to fit it may also want to track personal or topic 
specific information and include that in the context window rag applications are another example they strive to include the most relevant information in the context answer queries and agenting applications are the most demanding they potentially need different information for each processing step and so in the paper mgpt towards om as operating systems the author proposed using an OM agent to manage this context window two of those authors are your instructors for this course the analogy we used in the paper is that of virtual memory from the computer systems literature computers use a technique called virtual memory to store much more data than the physical memory of the computer can when data in virtual memory is accessed that is not in physical memory the operating system will fetch that data from disk and loaded into physical memory in this course you'll learn how to achieve a similar result with an LM agent AC as the operating system the agent will determine which information should be in the context window and fetch it from the appropriate source to understand the fundamentals you will code an llm agent that can edit its own memory from scratch then you'll learn how to use an open source agentic framework Leta this framework allows you to put the llm as operating systems ideas from mgpt into practice it lets you build and deploy agents that have long-term memory agents OFA are given tools to track context memory usage edit memory and move information between the context window and external memory you'll create agents generate chat memories and create your own custom memory types when I read the original MPD paper I thought it was an Innovative and important technique for handling memory for LS the open source letter framework makes these ideas easy for your to use please learn about them in this course\n"