Prompts
Create reusable, parameterized prompt templates for MCP clients.
Prompts are reusable message templates that help LLMs generate structured, purposeful responses. FastMCP simplifies defining these templates, primarily using the `@mcp.prompt` decorator.
What Are Prompts?
Prompts provide parameterized message templates for LLMs. When a client requests a prompt:
- FastMCP finds the corresponding prompt definition.
- If it has parameters, they are validated against your function signature.
- Your function executes with the validated inputs.
- The generated message(s) are returned to the LLM to guide its response.
This allows you to define consistent, reusable templates that LLMs can use across different clients and contexts.
Defining Prompts
The `@prompt` Decorator
The most common way to define a prompt is by decorating a Python function. The decorator uses the function name as the prompt’s identifier.
Key Concepts:
- Name: By default, the prompt name is taken from the function name.
- Parameters: The function parameters define the inputs needed to generate the prompt.
- Inferred Metadata: By default:
  - Prompt Name: Taken from the function name (`ask_about_topic`).
  - Prompt Description: Taken from the function’s docstring.
Return Values
FastMCP intelligently handles different return types from your prompt function:
- `str`: Automatically converted to a single `UserMessage`.
- `Message` (e.g., `UserMessage`, `AssistantMessage`): Used directly as provided.
- `dict`: Parsed as a `Message` object if it has the correct structure.
- `list[Message]`: Used as a sequence of messages (a conversation).
Type Annotations
Type annotations are important for prompts. They:
- Inform FastMCP about the expected types for each parameter.
- Allow validation of parameters received from clients.
- Are used to generate the prompt’s schema for the MCP protocol.
Required vs. Optional Parameters
Parameters in your function signature are considered required unless they have a default value.
In this example, the client must provide `data_uri`. If `analysis_type` or `include_charts` are omitted, their default values will be used.
Prompt Metadata
While FastMCP infers the name and description from your function, you can override these and add tags using arguments to the `@mcp.prompt` decorator:
- `name`: Sets the explicit prompt name exposed via MCP.
- `description`: Provides the description exposed via MCP. If set, the function’s docstring is ignored for this purpose.
- `tags`: A set of strings used to categorize the prompt. Clients might use tags to filter or group available prompts.
Asynchronous Prompts
FastMCP seamlessly supports both standard (`def`) and asynchronous (`async def`) functions as prompts.
Use `async def` when your prompt function performs I/O operations like network requests, database queries, file I/O, or external service calls.
The MCP Session
Prompts can access MCP features via the `Context` object, just like tools.
Using the `ctx` parameter (based on its `Context` type hint), you can access:
- Logging: `ctx.debug()`, `ctx.info()`, etc.
- Resource Access: `ctx.read_resource(uri)`
- LLM Sampling: `ctx.sample(...)`
- Request Info: `ctx.request_id`, `ctx.client_id`
Refer to the Context documentation for more details on these capabilities.
Server Behavior
Duplicate Prompts
You can configure how the FastMCP server handles attempts to register multiple prompts with the same name. Use the `on_duplicate_prompts` setting during `FastMCP` initialization.
The duplicate behavior options are:
"warn"
(default): Logs a warning, and the new prompt replaces the old one."error"
: Raises aValueError
, preventing the duplicate registration."replace"
: Silently replaces the existing prompt with the new one."ignore"
: Keeps the original prompt and ignores the new registration attempt.