Add persistent memory to Microsoft Agent Framework agents with Supermemory
Microsoft’s Agent Framework is a Python framework for building AI agents with tools, handoffs, and context providers. Supermemory integrates natively as a context provider, tool set, or middleware — so your agents remember users across sessions.
All integration points share a single AgentSupermemory connection. This ensures the same API client, container tag, and conversation ID are used across middleware, tools, and context providers.
```python
from supermemory_agent_framework import AgentSupermemory

conn = AgentSupermemory(
    api_key="your-supermemory-api-key",  # or set SUPERMEMORY_API_KEY env var
    container_tag="user-123",            # memory scope (e.g., user ID)
    conversation_id="session-abc",       # optional, auto-generated if omitted
    entity_context="The user is a Python developer.",  # optional
)
```
The context provider is the most idiomatic integration. It follows the same pattern as Agent Framework’s built-in Mem0 provider: memories are automatically fetched before the LLM runs, and conversations can be stored afterward.
- `before_run()` — searches Supermemory for the user’s profile and relevant memories, then injects them into the session context as additional instructions
- `after_run()` — if `store_conversations=True`, saves the conversation to Supermemory so future sessions have more context
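Conceptually, the lifecycle looks like the following plain-Python sketch. `MemoryStore` and `ToyContextProvider` are hypothetical stand-ins, not the real Supermemory client, and the substring search stands in for semantic retrieval:

```python
# Toy sketch of the context-provider lifecycle: before_run() injects
# stored memories as extra instructions; after_run() stores the turn.
class MemoryStore:
    """Hypothetical stand-in for the Supermemory API."""

    def __init__(self):
        self.memories = []

    def search(self, query):
        # Real retrieval is semantic; here we just substring-match.
        return [m for m in self.memories if query.lower() in m.lower()]

    def add(self, text):
        self.memories.append(text)


class ToyContextProvider:
    def __init__(self, store, store_conversations=True):
        self.store = store
        self.store_conversations = store_conversations

    def before_run(self, user_message):
        # Fetch anything we know about the user and format it as
        # additional instructions for the LLM.
        hits = self.store.search("user")
        if not hits:
            return ""
        return "Relevant memories:\n" + "\n".join(f"- {m}" for m in hits)

    def after_run(self, user_message, reply):
        # Persist the turn so future sessions have more context.
        if self.store_conversations:
            self.store.add(f"User said: {user_message} / Agent replied: {reply}")


store = MemoryStore()
provider = ToyContextProvider(store)
provider.after_run("I prefer dark mode", "Noted!")
print(provider.before_run("any question"))
```

The real provider does the same two things, but against the Supermemory API and wired into the agent's session context automatically.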
Intercept chat requests to automatically inject memory context. Useful when you want memory injection without the session-based context provider pattern.
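The middleware idea can be sketched as a wrapper around the model call that prepends a memory-derived system message. `fetch_memories` and `call_model` are hypothetical stand-ins for the Supermemory lookup and the framework's chat pipeline:

```python
# Toy sketch of chat middleware: intercept the message list on its way
# to the model and inject memory context as a system message.
def memory_middleware(fetch_memories, call_model):
    def wrapped(messages):
        context = fetch_memories(messages[-1]["content"])
        if context:
            messages = [{"role": "system", "content": context}] + messages
        return call_model(messages)
    return wrapped


def fake_fetch(query):
    # Stand-in for a Supermemory search keyed on the latest user message.
    return "Known user facts: prefers concise answers."


def fake_model(messages):
    # Stand-in for the LLM call; reports how many messages it received.
    return f"({len(messages)} messages seen) ok"


chat = memory_middleware(fake_fetch, fake_model)
print(chat([{"role": "user", "content": "Hi"}]))  # system message was injected
```

Because the wrapper sits on the request path itself, no session-based context provider is needed; every chat request passes through it.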
A support agent that remembers customers across sessions:
```python
import asyncio

from agent_framework import AgentSession
from agent_framework.openai import OpenAIResponsesClient
from supermemory_agent_framework import (
    AgentSupermemory,
    SupermemoryChatMiddleware,
    SupermemoryMiddlewareOptions,
    SupermemoryContextProvider,
    SupermemoryTools,
)


async def main():
    conn = AgentSupermemory(
        container_tag="customer-456",
        conversation_id="support-session-789",
        entity_context="Enterprise customer on the Pro plan.",
    )

    provider = SupermemoryContextProvider(
        conn,
        mode="full",
        store_conversations=True,
    )

    middleware = SupermemoryChatMiddleware(
        conn,
        options=SupermemoryMiddlewareOptions(
            mode="full",
            add_memory="always",
        ),
    )

    tools = SupermemoryTools(conn)

    agent = OpenAIResponsesClient().as_agent(
        name="SupportAgent",
        instructions="""You are a customer support agent.
Use the user context provided to personalize your responses.
Reference past interactions when relevant.
Save important new information about the customer.""",
        context_providers=[provider],
        middleware=[middleware],
    )

    session = AgentSession()

    # First interaction
    response = await agent.run(
        "My order hasn't arrived yet. Order ID is ORD-789.",
        session=session,
        tools=tools.get_tools(),
    )
    print(response.text)

    # Follow-up — agent automatically has context from first message
    response = await agent.run(
        "Actually, can you also check my previous order?",
        session=session,
        tools=tools.get_tools(),
    )
    print(response.text)


asyncio.run(main())
```
The integration raises typed exceptions for configuration, API, and network failures, so each can be caught separately:

```python
from supermemory_agent_framework import (
    AgentSupermemory,
    SupermemoryConfigurationError,
    SupermemoryAPIError,
    SupermemoryNetworkError,
)

try:
    conn = AgentSupermemory()  # no API key set
except SupermemoryConfigurationError as e:
    print(f"Missing API key: {e}")
```