Overview

Memory nodes enable stateful conversations by storing and retrieving chat history for LLM nodes.

Chat Memory Node

Store and retrieve conversation history with automatic trimming and summarization

Purpose: Manage chat history and conversation context.
Memory nodes store and retrieve conversation history for LLM nodes, enabling stateful conversations.
Memory node visual representation
Node Handles:
Top - Memory Handle: Connects to the LLM node’s MEMORY handle (bottom) for bidirectional communication.
Functionality:
  • Reads: Retrieves existing conversation history for LLM context
  • Writes: Stores new messages after LLM execution
  • Bidirectional: Both input and output through single connection
Data Flow:
  • LLM requests history → Memory provides messages
  • LLM generates response → Memory stores new message
Memory nodes use a special bidirectional connection to the LLM’s MEMORY handle. They don’t have traditional PARALLEL or ERROR output handles - the memory connection handles all data flow.
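The read-then-write cycle above can be sketched as a minimal in-memory store. This is an illustrative sketch, not the platform's actual API; the `ChatMemory` class and its method names are assumptions made for clarity:

```python
class ChatMemory:
    """Minimal sketch of a memory node: the LLM reads history before
    generating, then writes the new turn back through the same connection."""

    def __init__(self):
        self._messages = []

    def read(self):
        # LLM requests history -> Memory provides messages
        return list(self._messages)

    def write(self, role, content):
        # LLM generates response -> Memory stores new message
        self._messages.append({"role": role, "content": content})


memory = ChatMemory()
memory.write("user", "Hello")
memory.write("assistant", "Hi! How can I help?")
print(len(memory.read()))  # 2
```

The single object serving both `read` and `write` mirrors the single bidirectional MEMORY connection: there is no separate output path.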

Memory Limit Types

Keep last N messages
  • Simple message-based trimming
  • Maintains recent context
  • Predictable memory usage
Example: Keep last 10 messages
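Message-based trimming is just a slice over the history; this sketch (the `keep_last_n` helper is hypothetical, not a platform function) shows why memory usage stays predictable:

```python
def keep_last_n(messages, n=10):
    """Message-based trimming: retain only the n most recent messages."""
    return messages[-n:]


history = [{"role": "user", "content": f"msg {i}"} for i in range(25)]
trimmed = keep_last_n(history, n=10)
print(len(trimmed))           # 10
print(trimmed[0]["content"])  # msg 15
```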

Summarization

When messages are trimmed, enable summarization to condense old context:
{
  "enable_summarization": true,
  "summarize_prompt": "Summarize the key points from this conversation..."
}
The summarization runs automatically and replaces trimmed messages with a concise summary, preserving important context while reducing token usage.
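The trim-and-summarize behavior can be sketched as follows. Everything here is illustrative: `trim_with_summary` is a hypothetical helper, and the `summarize` callable stands in for the LLM call that the product makes with `summarize_prompt`:

```python
def trim_with_summary(messages, keep_last, summarize):
    """When trimming, condense the dropped messages into one summary
    message that is prepended to the retained history."""
    if len(messages) <= keep_last:
        return messages
    dropped, kept = messages[:-keep_last], messages[-keep_last:]
    summary = summarize(dropped)  # in the product, an LLM call using summarize_prompt
    return [{"role": "system", "content": summary}] + kept


# Stand-in summarizer for the sketch; the real one is an LLM call.
fake_summarize = lambda msgs: f"Summary of {len(msgs)} earlier messages."

history = [{"role": "user", "content": str(i)} for i in range(12)]
result = trim_with_summary(history, keep_last=10, summarize=fake_summarize)
print(len(result))  # 11: one summary message plus the 10 kept messages
```

The net effect matches the description above: token usage shrinks, but the dropped context survives as a single condensed message.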

External Chat IDs

Memory nodes use external chat IDs to group conversations:
external_chat_id: "user-{{user_id}}-session-{{session_id}}"
This enables:
  • Multi-user conversations
  • Session management
  • Conversation history across workflow executions
  • Isolated contexts per user/session
Use template variables in external_chat_id to dynamically create isolated conversation contexts for each user or session.
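Expanding template variables into an isolated conversation key can be sketched like this (the `render_chat_id` helper is hypothetical; the platform performs this substitution for you):

```python
import re


def render_chat_id(template, variables):
    """Expand {{var}} placeholders to build an isolated conversation key."""
    return re.sub(r"\{\{(\w+)\}\}",
                  lambda m: str(variables[m.group(1)]), template)


chat_id = render_chat_id("user-{{user_id}}-session-{{session_id}}",
                         {"user_id": 42, "session_id": "abc"})
print(chat_id)  # user-42-session-abc
```

Because each distinct user/session pair yields a distinct key, each gets its own history, which is what isolates contexts across workflow executions.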

What’s Next?