Request the client’s LLM to generate text based on provided messages through the MCP context.
New in version: 2.0.0
LLM sampling allows MCP tools to request the client’s LLM to generate text based on provided messages. This is useful when tools need to leverage the LLM’s capabilities to process data, generate responses, or perform text-based analysis.
Use `ctx.sample()` to request text generation from the client's LLM:
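For example, a tool can delegate summarization to the client's model. This is a minimal sketch, assuming `Context.sample()` accepts a plain string message along with optional `system_prompt`, `temperature`, and `max_tokens` arguments and returns a content block whose generated text is available via a `.text` attribute:

```python
from fastmcp import FastMCP, Context

mcp = FastMCP("Sampling Example")

@mcp.tool()
async def summarize_document(text: str, ctx: Context) -> str:
    """Ask the client's LLM to summarize the given text."""
    response = await ctx.sample(
        f"Summarize the following text in two or three sentences:\n\n{text}",
        system_prompt="You are a precise, concise summarizer.",
        temperature=0.3,
        max_tokens=300,
    )
    # For text generations, the returned content block carries the
    # generated string in its `text` field.
    return response.text
```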
Sampling is an optional client capability. If the connected client does not support sampling, the `ctx.sample()` call will fail.
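Tools that should degrade gracefully can guard the call. The specific exception raised when sampling is unsupported is not specified here, so this sketch catches a broad `Exception` and falls back to returning the original input:

```python
@mcp.tool()
async def maybe_summarize(text: str, ctx: Context) -> str:
    """Summarize via the client's LLM, falling back to the raw text."""
    try:
        response = await ctx.sample(f"Summarize briefly:\n\n{text}")
        return response.text
    except Exception:
        # The client has no sampling support (or the request failed);
        # return the original text rather than erroring out.
        return text
```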