Pairing AI Agents with MCP to Make the Outside World Accessible

Part of the series AI Agents and MCP Server: Teaming Up for the Agentic Web

When we used our MCP manually, we clicked a button called "List Tools" to see all the tools the MCP provides.

What if our AI Agent could do this automatically when it starts and inject the tools into its tools property? I love that idea!

Thankfully, the AI SDK provides a way to programmatically connect to an MCP server, list its tools, and inject them into the AI Agent's tools property.

Before doing so, we need to add a Nitro runtime config to indicate the URL of the MCP.

```ts
import { defineNitroConfig } from 'nitropack/config'

export default defineNitroConfig({
  runtimeConfig: {
    openAiApiKey: '',
    mcpEndpoint: '',
  },
  // ...
})
```

In your .env file, set the NITRO_MCP_ENDPOINT variable to the MCP URL (Nitro maps environment variables prefixed with NITRO_ onto the matching runtimeConfig keys):

```ini
NITRO_MCP_ENDPOINT=http://localhost:3000/mcp
```

Then, we can connect our AI Agent to the MCP by injecting the MCP tools into the agent's tools property.

```ts
import { createOpenAI } from '@ai-sdk/openai'
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js'
import { convertToModelMessages, experimental_createMCPClient, stepCountIs, streamText } from 'ai'
import { defineEventHandler, defineLazyEventHandler, readBody } from 'h3'
import { useRuntimeConfig } from 'nitropack/runtime'

export default defineLazyEventHandler(() => {
  const runtimeConfig = useRuntimeConfig()

  const model = createOpenAI({
    apiKey: runtimeConfig.openAiApiKey,
  })

  return defineEventHandler(async (event) => {
    const { messages } = await readBody(event)

    // Connect to the MCP server over the Streamable HTTP transport
    const httpTransport = new StreamableHTTPClientTransport(
      new URL(runtimeConfig.mcpEndpoint)
    )
    const httpClient = await experimental_createMCPClient({
      transport: httpTransport,
    })

    // Fetch the tools exposed by the MCP server, like clicking "List Tools"
    const tools = await httpClient.tools()

    return streamText({
      model: model('gpt-5-nano'),
      system: `You are a helpful assistant. You can use the tool to add two numbers together.`,
      stopWhen: stepCountIs(2),
      tools,
      messages: convertToModelMessages(messages),
      // Close the MCP client once the stream is finished to release the connection
      onFinish: async () => {
        await httpClient.close()
      },
    }).toUIMessageStreamResponse()
  })
})
```

The most interesting line is const tools = await httpClient.tools(). This fetches the list of tools from the MCP, like when we clicked "List Tools," and injects them into the AI Agent's tools property. Thanks to experimental_createMCPClient, the AI SDK will call tools using the HTTP client when the AI requests them.
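If you are curious about what the agent actually receives, you can inspect the returned object before passing it to streamText. A minimal sketch for debugging purposes (the exact tool names depend on what your MCP server exposes):

```ts
const tools = await httpClient.tools()

// `tools` is a plain object keyed by tool name; each entry carries the tool's
// description, input schema, and an execute function that calls the MCP server.
console.log(Object.keys(tools))
```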

Note

Under the hood, it's just some HTTP requests to the MCP server using the official HTTP client transport that uses fetch.
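To make that concrete, here is a rough sketch of the kind of JSON-RPC call the transport sends to list tools. It is illustrative only: a real Streamable HTTP server also expects an initialize handshake (and usually a session header), which the official transport handles for you.

```ts
// Hypothetical raw request, roughly what happens behind httpClient.tools()
const response = await fetch('http://localhost:3000/mcp', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    // Streamable HTTP servers may reply with plain JSON or an SSE stream
    'Accept': 'application/json, text/event-stream',
  },
  body: JSON.stringify({ jsonrpc: '2.0', id: 1, method: 'tools/list' }),
})

console.log(await response.text())
```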

As expected, there is no difference compared to using tools directly with the MCP.

The AI Agent using a tool and generating a response.

Now we can improve the MCP without modifying the AI Agent code. Anyone can connect our MCP to their own AI application to use our tools, and we can just as easily plug an external MCP server into our AI Agent, as sketched below.
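For example, nothing prevents us from connecting to several MCP servers and merging their tools before handing them to the agent. A minimal sketch, assuming a second, hypothetical server running on another port:

```ts
// Hypothetical second MCP server; the URL is only an example
const weatherTransport = new StreamableHTTPClientTransport(
  new URL('http://localhost:4000/mcp')
)
const weatherClient = await experimental_createMCPClient({
  transport: weatherTransport,
})

// Merge the tools from both servers into a single object for the agent
const tools = {
  ...await httpClient.tools(),
  ...await weatherClient.tools(),
}
```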

Love it!


Thanks for reading! My name is Estéban, and I love to write about web development and the human journey around it.

I've been coding for several years now, and I'm still learning new things every day. I enjoy sharing my knowledge with others, as I would have appreciated having access to such clear and complete resources when I first started learning programming.

If you have any questions or want to chat, feel free to comment below or reach out to me on Bluesky, X, and LinkedIn.

I hope you enjoyed this article and learned something new. Please consider sharing it with your friends or on social media, and feel free to leave a comment or a reaction below—it would mean a lot to me! If you'd like to support my work, you can sponsor me on GitHub!
