2 Easy Ways to Add Short Term Memory to a LangGraph Chatbot

Use InMemorySaver and trim_messages to add Short Term Memory in LangGraph

If you're building chatbots using LangGraph, you've likely noticed that by default, they don't remember previous messages across calls. That's where Short Term Memory in LangGraph becomes incredibly powerful.

In this blog post, we'll explore how to equip your chatbot with short-term memory using LangGraph's InMemorySaver checkpointer together with LangChain's trim_messages utility.

🤖 Why Short Term Memory in LangGraph Matters

Most real-world chatbots need some memory, not just for a smarter experience but to keep context. Short Term Memory in LangGraph allows your bot to remember what users said earlier, so it can give coherent, contextual responses.

Let's begin with a stateless chatbot example.

🧪 Example 1: LangGraph Chatbot Without Short Term Memory

Here's a simple implementation of a LangGraph chatbot that does not retain memory across invocations:
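
Below is a minimal sketch of such an agent; the model name is an illustrative assumption, and any chat model supported by init_chat_model behaves the same way:

```python
from typing import Annotated
from typing_extensions import TypedDict

from langchain.chat_models import init_chat_model
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages


class State(TypedDict):
    # add_messages appends new messages to the list instead of overwriting it
    messages: Annotated[list, add_messages]


llm = init_chat_model("openai:gpt-4o-mini")  # illustrative model choice


def chatbot(state: State):
    # The node only ever sees the messages passed into this single invocation
    return {"messages": [llm.invoke(state["messages"])]}


builder = StateGraph(State)
builder.add_node("chatbot", chatbot)
builder.add_edge(START, "chatbot")
builder.add_edge("chatbot", END)

# No checkpointer is configured, so nothing persists between invoke() calls
graph = builder.compile()

graph.invoke({"messages": [{"role": "user", "content": "Hi, my name is Bob."}]})
graph.invoke({"messages": [{"role": "user", "content": "What is my name?"}]})  # the model has no idea
```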

Each time you invoke this agent, it only responds to the current input, with no awareness of the previous ones.

Here is what our agent looks like as a Mermaid graph.

Agent with Short term memory in Langgraph
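
If you want to regenerate this diagram yourself, the compiled graph can emit its own Mermaid source:

```python
# Print the Mermaid definition of the compiled graph; paste it into any
# Mermaid renderer (for example mermaid.live) to see the diagram.
print(graph.get_graph().draw_mermaid())
```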

🧠 Example 2: Adding Short Term Memory in LangGraph with trim_messages and InMemorySaver

To add short term memory in LangGraph, we'll make two enhancements, both shown in the sketch after the list:

  1. Use trim_messages to avoid token overload.
  2. Add a checkpointer (InMemorySaver) so the agent remembers the thread history.
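
Here is a minimal sketch of those two changes, building on the graph above; the model name and the 512-token budget are illustrative assumptions:

```python
from typing import Annotated
from typing_extensions import TypedDict

from langchain.chat_models import init_chat_model
from langchain_core.messages import trim_messages
from langgraph.checkpoint.memory import InMemorySaver
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages


class State(TypedDict):
    messages: Annotated[list, add_messages]


llm = init_chat_model("openai:gpt-4o-mini")  # illustrative model choice


def chatbot(state: State):
    # Enhancement 1: trim the running history so it stays within the token budget
    trimmed = trim_messages(
        state["messages"],
        strategy="last",        # keep the most recent messages
        token_counter=llm,      # count tokens with the model's own tokenizer
        max_tokens=512,         # illustrative budget
        start_on="human",       # make the kept window start on a user turn
        include_system=True,    # never drop a system prompt if one exists
    )
    return {"messages": [llm.invoke(trimmed)]}


builder = StateGraph(State)
builder.add_node("chatbot", chatbot)
builder.add_edge(START, "chatbot")
builder.add_edge("chatbot", END)

# Enhancement 2: compile with a checkpointer so state survives across invocations
graph = builder.compile(checkpointer=InMemorySaver())
```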

Now, this agent can maintain a short-term conversational memory by remembering prior messages using a thread_id.
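
For example, passing the same thread_id in the config ties every call below to one conversation; the thread id and prompts are just placeholders:

```python
# Reusing the same thread_id makes all three calls share one conversation history
config = {"configurable": {"thread_id": "demo-thread"}}

graph.invoke({"messages": [{"role": "user", "content": "Hi, my name is Bob."}]}, config)
graph.invoke({"messages": [{"role": "user", "content": "What is my name?"}]}, config)
graph.invoke({"messages": [{"role": "user", "content": "Spell my name backwards."}]}, config)
```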

💬 Demonstration of Short Term Memory in LangGraph

Now, the chatbot remembers that your name is Bob in the second and third calls, thanks to short term memory in LangGraph.

Here are the logs:

📌 Benefits of Adding Short Term Memory in LangGraph

  • 🧠 Contextual Awareness
  • 💡 Improved UX
  • 🪄 Minimal code changes
  • ⏳ Short-term only: Keeps responses focused, not bloated

🧑🏻‍💻 Here is the full Code Example
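
The script below pulls the pieces together into one runnable file. Treat it as a sketch rather than a canonical implementation: the model name, token budget, and prompts are assumptions you can swap out.

```python
from typing import Annotated
from typing_extensions import TypedDict

from langchain.chat_models import init_chat_model
from langchain_core.messages import trim_messages
from langgraph.checkpoint.memory import InMemorySaver
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages


class State(TypedDict):
    # add_messages appends new messages to the running list instead of replacing it
    messages: Annotated[list, add_messages]


llm = init_chat_model("openai:gpt-4o-mini")  # assumption: any chat model works here


def chatbot(state: State):
    # Trim the history to the most recent messages that fit the token budget
    trimmed = trim_messages(
        state["messages"],
        strategy="last",
        token_counter=llm,
        max_tokens=512,
        start_on="human",
        include_system=True,
    )
    return {"messages": [llm.invoke(trimmed)]}


builder = StateGraph(State)
builder.add_node("chatbot", chatbot)
builder.add_edge(START, "chatbot")
builder.add_edge("chatbot", END)

# InMemorySaver checkpoints the state separately for each thread_id
graph = builder.compile(checkpointer=InMemorySaver())

if __name__ == "__main__":
    config = {"configurable": {"thread_id": "demo-thread"}}
    for text in ["Hi, my name is Bob.", "What is my name?", "Spell my name backwards."]:
        result = graph.invoke({"messages": [{"role": "user", "content": text}]}, config)
        print(result["messages"][-1].content)
```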

📚 FAQs: Short Term Memory in LangGraph

❓ What is short term memory in LangGraph?

It refers to the ability of the chatbot to retain conversation context across invocations using in-memory checkpoints and message trimming.


❓ What is trim_messages used for?

To ensure token limits are respected while still maintaining recent message context.


❓ What is InMemorySaver?

It's LangGraph's built-in in-memory checkpointer: it saves the conversation state after each step, keyed by thread ID, so later calls on the same thread can see previous messages.


❓ How is this different from long-term memory?

Short term memory is volatile and typically used just during active sessions. Long-term memory would require persistent storage like a vector database or file system.


❓ Can I scale this beyond in-memory?

Yes, LangGraph supports custom checkpointers. You can persist messages in databases like Redis or SQLite, or use LangChain's VectorStore memory.
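
As a rough illustration, a SQLite-backed checkpointer could replace InMemorySaver like this; it assumes the langgraph-checkpoint-sqlite package is installed, and the database file name is arbitrary:

```python
import sqlite3

from langgraph.checkpoint.sqlite import SqliteSaver  # pip install langgraph-checkpoint-sqlite

# A file-backed connection so checkpoints survive process restarts
conn = sqlite3.connect("checkpoints.sqlite", check_same_thread=False)
checkpointer = SqliteSaver(conn)

# Drop-in replacement for InMemorySaver when compiling the graph from the examples above
graph = builder.compile(checkpointer=checkpointer)
```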
