FastMCP Architecture
Server Class
FastMCP is the central class. It manages registration of tools/resources/prompts, handles transport, and coordinates providers.
```python
from fastmcp import FastMCP

mcp = FastMCP("MyServer")
```
Constructor accepts:
- `name`: server name (shown to clients)
- `instructions`: server-level instructions for LLMs
- `lifespan`: async context manager for startup/shutdown
- `auth`: auth provider (OAuth, JWT, etc.)
- `tags`: server-level tags
Class Hierarchy
```
FastMCP
 +-- LifespanMixin       # startup/shutdown lifecycle
 +-- MCPOperationsMixin  # tool/resource/prompt CRUD
 +-- TransportMixin      # stdio/HTTP/SSE transport
 +-- AggregateProvider   # composition of multiple providers
```
Providers
Providers are the abstraction for where tools/resources/prompts come from. This is what enables composition.
| Provider | Purpose |
|---|---|
| `LocalProvider` | Statically registered components (default) |
| `FastMCPProvider` | Mounts another FastMCP server |
| `ProxyProvider` | Wraps a remote MCP server |
| `OpenAPIProvider` | Auto-generates components from OpenAPI specs |
| `AggregateProvider` | Combines multiple providers with namespacing |
| `SkillsProvider` | Loads skills from the filesystem |
When you call `@mcp.tool`, the tool goes into the `LocalProvider`. When you call `mcp.add_provider()`, you add another source of components.
Transports
How the server communicates with clients:
| Transport | Use case | Protocol |
|---|---|---|
| `stdio` | CLI tools, subprocess spawning | Line-based stdin/stdout |
| `http` | Stateless HTTP | POST requests |
| `sse` | Streaming HTTP | Server-Sent Events |
| `streamable-http` | Long-running HTTP | Chunked streaming |
```python
# Run with a specific transport
mcp.run(transport="stdio")            # default, for CLI
mcp.run(transport="sse")              # for HTTP streaming
mcp.run(transport="streamable-http")  # for long-running ops

# HTTP with custom host/port
mcp.run(transport="sse", host="0.0.0.0", port=8000)
```
Lifespan
Startup/shutdown logic via async context manager:
```python
from contextlib import asynccontextmanager
from fastmcp import FastMCP

@asynccontextmanager
async def lifespan(mcp: FastMCP):
    db = await connect_db()
    try:
        yield {"db": db}  # available to handlers via context
    finally:
        await db.close()  # runs on shutdown, even if a request errored

mcp = FastMCP("MyServer", lifespan=lifespan)
```
The yielded dict is accessible in tools via `context.fastmcp_lifespan_state`.
ASGI Integration
Get a Starlette app for embedding in existing web frameworks:
```python
from starlette.responses import JSONResponse

app = mcp.http_app(transport="sse")

# Or add custom routes
@mcp.custom_route("/health", methods=["GET"])
async def health(request):
    return JSONResponse({"status": "ok"})
```
Context
Every tool/resource/prompt handler can receive a Context parameter (injected automatically by name/type):
```python
from fastmcp import Context

@mcp.tool
async def my_tool(query: str, context: Context) -> str:
    # Session state
    await context.set_state("last_query", query)
    prev = await context.get_state("last_query")

    # Logging to the client
    await context.send_message("info", "Processing...")

    # LLM sampling (ask the connected LLM a question)
    result = await context.sample("Summarize this: ...")
    return "done"
```
MCP Governance
As of December 2025, MCP is governed by the Agentic AI Foundation (AAIF), a Linux Foundation directed fund co-founded by Anthropic, Block, and OpenAI. Spec changes go through an open RFC process with multi-vendor consensus.
This means FastMCP tracks a vendor-neutral specification, reducing the risk of platform-specific drift. New MCP features (tasks, elicitation, MCP Apps) are designed for cross-platform compatibility. See MCP in 2026 for details on spec evolution.
Settings
Environment variables (all prefixed FASTMCP_):
```shell
FASTMCP_LOG_LEVEL=DEBUG
FASTMCP_TRANSPORT=http
FASTMCP_HOST=0.0.0.0
FASTMCP_PORT=8000
FASTMCP_MASK_ERROR_DETAILS=false  # show full errors in dev
```