diff --git a/docs/docs/Tutorials/chat-with-files.mdx b/docs/docs/Tutorials/chat-with-files.mdx
index b52ce0a97..fafe9d1a0 100644
--- a/docs/docs/Tutorials/chat-with-files.mdx
+++ b/docs/docs/Tutorials/chat-with-files.mdx
@@ -19,7 +19,7 @@ The main focus of this tutorial is to show you how to provide files as input to
 
 - [A Langflow API key](/configuration-api-keys)
 - [An OpenAI API key](https://platform.openai.com/api-keys)
- This tutorial uses an OpenAI LLM. If you want to use a different provider, you need a valid credential for that provider.
+This tutorial uses an OpenAI LLM. If you want to use a different provider, you need a valid credential for that provider.
 
 ## Create a flow that accepts file input
 
diff --git a/docs/docs/Tutorials/mcp-tutorial.mdx b/docs/docs/Tutorials/mcp-tutorial.mdx
new file mode 100644
index 000000000..98a28de14
--- /dev/null
+++ b/docs/docs/Tutorials/mcp-tutorial.mdx
@@ -0,0 +1,285 @@
+---
+title: Connect to MCP servers from your application
+slug: /mcp-tutorial
+---
+
+import Icon from "@site/src/components/icon";
+import Tabs from '@theme/Tabs';
+import TabItem from '@theme/TabItem';
+
+This tutorial shows you how to connect MCP tools to your applications using Langflow's [**MCP Tools**](/mcp-client) component.
+
+The [Model Context Protocol (MCP)](https://modelcontextprotocol.io/) standardizes how agents and LLM-based applications access external tools through _MCP clients_ and _MCP servers_.
+Specifically, MCP servers host tools that agents (MCP clients) use to complete specialized tasks.
+You connect MCP servers to an MCP client, such as Cursor.
+Then, you interact with the client, and the client uses tools from the connected servers as needed to complete your requests.
+
+You can run Langflow as an MCP client and an MCP server:
+
+* [Use Langflow as an MCP client](/mcp-client): When run as an MCP client, an **Agent** component in a Langflow flow can use connected components as tools to handle requests.
+You can use existing components as tools, and you can connect any MCP server to your flow to make that server's tools available to the agent.
+
+* [Use Langflow as an MCP server](/mcp-server): When run as an MCP server, your flows become tools that can be used by an MCP client, which could be an external client or another Langflow flow.
+
+In this tutorial, you use the Langflow **MCP Tools** component to connect multiple MCP servers to your flow, and then use a Python application to run the flow and chat with the agent programmatically.
+
+## Prerequisites
+
+* [A running Langflow instance](/get-started-installation)
+* [A Langflow API key](/configuration-api-keys)
+* [An OpenAI API key](https://platform.openai.com/api-keys)
+
+This tutorial uses an OpenAI LLM. If you want to use a different provider, you need a valid credential for that provider.
+
+## Create an agentic flow
+
+1. In Langflow, click **New Flow**, and then select the [**Simple agent**](/simple-agent) template.
+
+2. In the **Agent** component, enter your OpenAI API key.
+
+    If you want to use a different provider or model, edit the **Model Provider**, **Model Name**, and **API Key** fields accordingly.
+
+3. To test the flow, click
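The new tutorial promises a Python application that runs the flow and chats with the agent programmatically. A minimal sketch of such an application, using only the standard library: it posts a chat message to Langflow's `/api/v1/run/<flow_id>` endpoint. The base URL, flow ID, API key, and the exact JSON path to the reply text are assumptions to adjust for your own instance.

```python
import json
import urllib.request

# Assumptions: Langflow runs locally on its default port, and the flow ID
# and API key below are placeholders you copy from your own instance.
LANGFLOW_URL = "http://localhost:7860"
FLOW_ID = "your-flow-id"
API_KEY = "your-langflow-api-key"


def build_run_request(message: str) -> urllib.request.Request:
    """Build a POST request for Langflow's /api/v1/run/<flow_id> endpoint."""
    payload = {
        "input_value": message,  # the chat message sent to the agent
        "input_type": "chat",
        "output_type": "chat",
    }
    return urllib.request.Request(
        f"{LANGFLOW_URL}/api/v1/run/{FLOW_ID}",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json", "x-api-key": API_KEY},
        method="POST",
    )


def chat(message: str) -> str:
    """Send one message to the flow and return the agent's reply text."""
    with urllib.request.urlopen(build_run_request(message)) as resp:
        body = json.load(resp)
    # The reply's location in the response JSON is an assumption based on
    # current Langflow responses; inspect `body` if your version differs.
    return body["outputs"][0]["outputs"][0]["results"]["message"]["text"]
```

Each call to `chat()` sends an independent request; to carry conversation context across turns, the same payload also accepts a `session_id` field so Langflow can group messages into one session.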