Use server-side prompt templates with automatic argument serialization.
New in version 2.0.0

Prompts are reusable message templates exposed by MCP servers. They can accept arguments to generate personalized message sequences for LLM interactions.
Use list_prompts() to retrieve all available prompt templates:
```python
async with client:
    prompts = await client.list_prompts()
    # prompts -> list[mcp.types.Prompt]

    for prompt in prompts:
        print(f"Prompt: {prompt.name}")
        print(f"Description: {prompt.description}")
        if prompt.arguments:
            print(f"Arguments: {[arg.name for arg in prompt.arguments]}")

        # Access tags and other metadata
        if hasattr(prompt, '_meta') and prompt._meta:
            fastmcp_meta = prompt._meta.get('_fastmcp', {})
            print(f"Tags: {fastmcp_meta.get('tags', [])}")
```
New in version 2.11.0

You can use the _meta field to filter prompts based on their tags:
```python
async with client:
    prompts = await client.list_prompts()

    # Filter prompts by tag
    analysis_prompts = [
        prompt for prompt in prompts
        if hasattr(prompt, '_meta') and prompt._meta
        and prompt._meta.get('_fastmcp', {})
        and 'analysis' in prompt._meta.get('_fastmcp', {}).get('tags', [])
    ]
    print(f"Found {len(analysis_prompts)} analysis prompts")
```
The _meta field is part of the standard MCP specification. FastMCP servers include tags and other metadata within a _fastmcp namespace (e.g., _meta._fastmcp.tags) to avoid conflicts with user-defined metadata. This behavior can be controlled with the server's include_fastmcp_meta setting: when disabled, the _fastmcp namespace won't be included. Other MCP server implementations may not provide this metadata structure.
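On the server side, a minimal sketch of opting out of this metadata (assuming a FastMCP version where include_fastmcp_meta is accepted as a server setting):

```python
from fastmcp import FastMCP

# Sketch: with include_fastmcp_meta disabled, listed prompts will not carry
# the _meta._fastmcp namespace, so tag-based filtering on the client is unavailable
mcp = FastMCP("MyServer", include_fastmcp_meta=False)
```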
Request a rendered prompt using get_prompt() with the prompt name and arguments:
```python
async with client:
    # Simple prompt without arguments
    result = await client.get_prompt("welcome_message")
    # result -> mcp.types.GetPromptResult

    # Access the generated messages
    for message in result.messages:
        print(f"Role: {message.role}")
        print(f"Content: {message.content}")
```
Pass arguments as a dictionary to customize the prompt:
```python
async with client:
    # Prompt with simple arguments
    result = await client.get_prompt("user_greeting", {
        "name": "Alice",
        "role": "administrator"
    })

    # Access the personalized messages
    for message in result.messages:
        print(f"Generated message: {message.content}")
```
New in version 2.9.0

FastMCP automatically serializes complex arguments to JSON strings as required by the MCP specification. This allows you to pass typed objects directly:
```python
from dataclasses import dataclass

@dataclass
class UserData:
    name: str
    age: int

async with client:
    # Complex arguments are automatically serialized
    result = await client.get_prompt("analyze_user", {
        "user": UserData(name="Alice", age=30),  # Automatically serialized to JSON
        "preferences": {"theme": "dark"},        # Dict serialized to JSON string
        "scores": [85, 92, 78],                  # List serialized to JSON string
        "simple_name": "Bob"                     # Strings passed through unchanged
    })
```
The client handles serialization using pydantic_core.to_json() for consistent formatting. FastMCP servers can automatically deserialize these JSON strings back to the expected types.
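The serialization rule can be sketched with the standard library (a simplified stand-in for pydantic_core.to_json(): plain strings pass through unchanged, while dataclasses, dicts, and lists become JSON strings):

```python
import json
from dataclasses import asdict, dataclass, is_dataclass

@dataclass
class UserData:
    name: str
    age: int

def serialize_prompt_arguments(arguments: dict) -> dict:
    """Sketch of the client-side rule: strings pass through,
    everything else is serialized to a JSON string."""
    serialized = {}
    for key, value in arguments.items():
        if isinstance(value, str):
            serialized[key] = value  # strings pass through unchanged
        elif is_dataclass(value):
            serialized[key] = json.dumps(asdict(value))  # dataclass -> JSON object string
        else:
            serialized[key] = json.dumps(value)  # dict/list/number -> JSON string
    return serialized

args = serialize_prompt_arguments({
    "user": UserData(name="Alice", age=30),
    "scores": [85, 92, 78],
    "simple_name": "Bob",
})
print(args["user"])         # {"name": "Alice", "age": 30}
print(args["simple_name"])  # Bob
```

The server side can then deserialize each JSON string back to the parameter's annotated type.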
Prompts can generate multi-turn conversation templates:
```python
async with client:
    result = await client.get_prompt("interview_template", {
        "candidate_name": "Alice",
        "position": "Senior Developer"
    })

    # Multiple messages for a conversation flow
    for message in result.messages:
        print(f"{message.role}: {message.content}")
```
Prompt arguments and their expected types depend on the specific prompt implementation. Check the server’s documentation or use list_prompts() to see available arguments for each prompt.
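As a sketch of that inspection step (using a minimal stand-in dataclass for the mcp.types.PromptArgument shape, whose name/description/required fields come from the MCP specification), you can partition a prompt's arguments into required and optional before calling get_prompt():

```python
from dataclasses import dataclass
from typing import Optional

# Minimal stand-in for mcp.types.PromptArgument (name/required per the MCP spec)
@dataclass
class PromptArgument:
    name: str
    description: Optional[str] = None
    required: Optional[bool] = None

def split_arguments(arguments):
    """Partition a prompt's arguments into required and optional names."""
    required = [a.name for a in arguments if a.required]
    optional = [a.name for a in arguments if not a.required]
    return required, optional

args = [
    PromptArgument(name="candidate_name", required=True),
    PromptArgument(name="position", required=True),
    PromptArgument(name="notes"),  # required is unset, so treated as optional
]
print(split_arguments(args))  # (['candidate_name', 'position'], ['notes'])
```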