What if your AI chatbot could remember every word you said? A LangGraph chatbot with memory does exactly that: it retains conversational context by using LangGraph’s state management to pass message history between nodes, enabling coherent, human-like responses that recall past interactions. This persistence is vital for practical AI applications.
What is LangGraph Memory?
LangGraph memory is how AI agents built with LangGraph retain and access information from past interactions. It lets an agent maintain conversational context, avoid repeating itself, and build on earlier exchanges. This is the foundational idea behind any LangGraph chatbot with memory example.

In practice, this memory is implemented by defining an explicit state for the graph that includes memory components, such as the message history. That state is passed between nodes during execution, so information persists across the whole workflow. This enables agent behaviors that go beyond stateless, turn-by-turn interactions, creating a truly persistent chatbot with LangGraph memory.
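The mechanism can be sketched without LangGraph at all: each node is just a function that receives the current state and returns an update, and a runner merges each update into the shared state before calling the next node. The node names and the merge rule below are illustrative stand-ins, not LangGraph internals.

```python
# Minimal sketch of state passing between nodes
# (illustrative only -- not LangGraph's actual implementation)

def greet_node(state):
    # Returns an update: one new message to append to the history
    return {"messages": ["user: Hello!"]}

def reply_node(state):
    # Can see everything earlier nodes added to the state
    return {"messages": [f"assistant: I heard {len(state['messages'])} message(s) so far."]}

def run(nodes, state):
    # The "graph runner": merge each node's update into the shared state
    for node in nodes:
        update = node(state)
        state = {"messages": state["messages"] + update["messages"]}
    return state

final = run([greet_node, reply_node], {"messages": []})
print(final["messages"])
```

Because the state persists between calls, the reply node can see the greeting; that shared, accumulating state is what LangGraph formalizes.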
Defining Memory in LangGraph States
When building a LangGraph application, you define the state that the graph operates on. For a chatbot, this state typically includes the current user input, the AI’s response, and critically, a history of the conversation. This history acts as the primary memory for your LangGraph memory chatbot.
You can represent this memory as a list of dictionaries, where each dictionary contains the role (e.g., “user”, “assistant”) and the content of the message, or, as in the code below, as LangChain message objects such as HumanMessage and AIMessage. Either structured approach allows the LLM to easily parse and understand the conversational context.
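As a concrete illustration of the role/content representation, here is a history built from plain dictionaries (no LangGraph required):

```python
# Conversation history as a list of role/content dictionaries
history = [
    {"role": "user", "content": "What's the capital of France?"},
    {"role": "assistant", "content": "The capital of France is Paris."},
]

# Appending the next user turn preserves the full context
history.append({"role": "user", "content": "What's its population?"})

# A chat LLM given this whole list can resolve "its" to Paris --
# exactly the behavior that conversational memory enables
for message in history:
    print(f"{message['role']}: {message['content']}")
```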
```python
import operator
from typing import List, TypedDict, Annotated

from langgraph.graph import StateGraph, END
from langchain_core.messages import BaseMessage, HumanMessage, AIMessage

# Define the state for our chatbot
class ChatState(TypedDict):
    # operator.add is the reducer: new messages are appended to the
    # existing history instead of overwriting it
    messages: Annotated[List[BaseMessage], operator.add]

# Initialize the graph
workflow = StateGraph(ChatState)

# Define nodes (simplified for example)
def user_node(state: ChatState):
    # In a real app, this would handle user input
    return {"messages": [HumanMessage(content="Hello!")]}

def agent_node(state: ChatState):
    # This node would contain the LLM call, using the messages for context
    # In a real scenario, you'd pass state['messages'] to the LLM
    print(f"Received messages for agent: {state['messages']}")
    return {"messages": [AIMessage(content="Hello! How can I help you today?")]}

# Add nodes to the graph
workflow.add_node("user", user_node)
workflow.add_node("agent", agent_node)

# Define the edges
workflow.set_entry_point("user")
workflow.add_edge("user", "agent")
# End the run after the agent responds; an outer loop would re-invoke
# the graph for each new user turn to continue the conversation
workflow.add_edge("agent", END)

# Compile the graph
app = workflow.compile()

# Example of running the graph
# Initial state (empty messages list)
initial_state = {"messages": []}
# In a real chatbot, you'd have a loop that takes user input and updates the state
final_state = app.invoke(initial_state)
print(final_state["messages"])
```