docs: mcp server component and integrations (#7286)

* mcp-server-component-update

* update-image

* step-number

* more-content

* astra-npx

* mcp-see-mode-and-env-var

* fix-build

* docs-add-mcp-inspector

* create-section-for-mcp

* code-review
Mendon Kissling 2025-03-31 15:19:09 -04:00 committed by GitHub
commit c76aeb6e9e
No known key found for this signature in database
GPG key ID: B5690EEEBB952194
8 changed files with 284 additions and 114 deletions


@@ -3,6 +3,8 @@ title: Tools
slug: /components-tools
---
import Icon from "@site/src/components/icon";
# Tool components in Langflow
Tools are typically connected to agent components at the **Tools** port. Agents use LLMs as a reasoning engine to decide which of the connected tool components to use to solve a problem.
@@ -261,25 +263,56 @@ This component allows you to call the Serper.dev Google Search API.
| results | List[Data]| List of search results |
| tool | Tool | Google Serper search tool for use in LangChain|
## MCP Tools (stdio)
This component connects to a [Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) server over `stdio` and exposes its tools as Langflow tools to be used by an Agent component.
## MCP server
To use the MCP stdio component, follow these steps:
This component connects to a [Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) server and exposes the MCP server's tools as Langflow tools.
1. Add the MCP stdio component to your workflow, and connect it to an agent. The flow looks like this:
In addition to being an MCP client that can leverage MCP servers, Langflow is also an MCP server that exposes flows as tools through the `/api/v1/mcp/sse` API endpoint. For more information, see [MCP integrations](/integrations-mcp).
![MCP stdio component](/img/mcp-stdio-component.png)
To use the MCP server component with an agent component, follow these steps:
2. In the MCP stdio component, in the **mcp command** field, enter the command to start your MCP server. For a [Fetch](https://github.com/modelcontextprotocol/servers/tree/main/src/fetch) server, the command is:
1. Add the MCP server component to your workflow.
2. In the MCP server component, in the **MCP Command** field, enter the command to start your MCP server. For example, to start a [Fetch](https://github.com/modelcontextprotocol/servers/tree/main/src/fetch) server, the command is:
```bash
uvx mcp-server-fetch
```
3. Open the **Playground**.
`uvx` is included with `uv` in the Langflow package.
To use `npx` server commands, you must first install an LTS release of [Node.js](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm).
For an example of starting `npx` MCP servers, see [Connect an Astra DB MCP server to Langflow](/mcp-component-astra).
3. Click <Icon name="RefreshCw" aria-label="Refresh"/> to get the server's list of **Tools**.
4. In the **Tool** field, select the server tool you want the component to use.
The available fields change based on the selected tool.
For information on the parameters, see the MCP server's documentation.
5. In the MCP server component, enable **Tool mode**.
Connect the MCP server component's **Toolset** port to an **Agent** component's **Tools** port.
The flow looks similar to this:
![MCP server component](/img/mcp-server-component.png)
6. Open the **Playground**.
Ask the agent to summarize recent tech news. The agent calls the MCP server function `fetch` and returns the summary.
This confirms the MCP server is connected and working.
This confirms the MCP server is connected, and its tools are being used in Langflow.
For more information, see [MCP integrations](/integrations-mcp).
### MCP Server-Sent Events (SSE) mode
1. In the **MCP Server** component, select **SSE**.
A default address appears in the **MCP SSE URL** field.
2. In the **MCP SSE URL** field, modify the default address to point at the SSE endpoint of the Langflow server you're currently running.
The default value is `http://localhost:7860/api/v1/mcp/sse`.
3. In the **MCP Server** component, click <Icon name="RefreshCw" aria-label="Refresh"/> to retrieve the server's list of **Tools**.
4. Click the **Tools** field.
All of your flows are listed as tools.
5. Enable **Tool Mode**, and then connect the **MCP Server** component to an agent component's tool port.
The flow looks like this:
![MCP server component](/img/mcp-server-component-sse.png)
6. Open the **Playground** and chat with your tool.
The agent chooses the correct tool based on your query.
### Inputs
@@ -293,28 +326,18 @@ This confirms the MCP server is connected and working.
|-------|-----------|-------------------------------------------|
| tools | List[Tool]| List of tools exposed by the MCP server |
## MCP Tools (stdio)
:::important
This component is deprecated as of Langflow version 1.3.
Instead, use the [MCP server component](/components-tools#mcp-server).
:::
## MCP Tools (SSE)
This component connects to a [Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) server over [SSE (Server-Sent Events)](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events) and exposes its tools as Langflow tools to be used by an Agent component.
To use the MCP SSE component, follow these steps:
1. Add the MCP SSE component to your workflow, and connect it to an agent. The flow looks similar to the MCP stdio component flow.
2. In the MCP SSE component, in the **url** field, enter the URL of your current Langflow server's `mcp/sse` endpoint.
This will fetch all currently available tools from the Langflow server.
### Inputs
| Name | Type | Description |
|------|--------|------------------------------------------------------|
| url | String | SSE URL (default: `http://localhost:7860/api/v1/mcp/sse`) |
### Outputs
| Name | Type | Description |
|-------|-----------|-------------------------------------------|
| tools | List[Tool]| List of tools exposed by the MCP server |
:::important
This component is deprecated as of Langflow version 1.3.
Instead, use the [MCP server component](/components-tools#mcp-server).
:::
## Python Code Structured Tool


@@ -0,0 +1,180 @@
---
title: Integrate Langflow with MCP
slug: /integrations-mcp
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
Langflow integrates with the [Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction). This allows you to use your Langflow flows as tools in client applications that support MCP, or extend Langflow with the [MCP server component](/components-tools#mcp-server) to access MCP servers.
You can use Langflow as an MCP server with any [MCP client](https://modelcontextprotocol.io/clients).
For configuring interactions between Langflow flows and MCP tools, see [Name and describe your flows for agentic use](#name-and-describe-your-flows-for-agentic-use).
To connect [MCP Inspector](https://modelcontextprotocol.io/docs/tools/inspector) to Langflow for testing and debugging flows, see [Install MCP Inspector to test and debug flows](#install-mcp-inspector-to-test-and-debug-flows).
## Access all of your flows as tools
:::important
Tool names must contain only letters, numbers, underscores, and dashes.
Tool names cannot contain spaces.
To rename flows in the Langflow UI, click **Flow Name** > **Edit Details**.
:::
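The naming rules above can be applied mechanically. The following sketch converts a human-readable flow name into a name that satisfies the constraints; the helper `to_tool_name` is illustrative only, and the exact normalization Langflow performs internally may differ:

```python
import re

def to_tool_name(flow_name: str) -> str:
    """Convert a flow name to a valid MCP tool name:
    letters, numbers, underscores, and dashes only; no spaces."""
    name = flow_name.strip().lower()
    name = re.sub(r"\s+", "_", name)          # spaces become underscores
    name = re.sub(r"[^a-z0-9_-]", "", name)   # drop any other character
    return name

print(to_tool_name("Document QA for Resume"))  # document_qa_for_resume
```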
Connect an MCP client to Langflow to use your flows as tools.
1. Install [Cursor](https://docs.cursor.com/) or [Claude for Desktop](https://claude.ai/download).
2. Install [uv](https://docs.astral.sh/uv/getting-started/installation/) to run `uvx` commands. `uvx` is included with `uv` in the Langflow package.
3. Optional: Install an LTS release of [Node.js](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm) to run `npx` commands.
For an example `npx` server, see [Connect an Astra DB MCP server to Langflow](/mcp-component-astra).
4. Create at least one flow, and note your host. For example, `http://127.0.0.1:7860`.
<Tabs>
<TabItem value="cursor" label="Cursor">
In Cursor, you can configure a Langflow server in the same way as other MCP servers.
For more information, see the [Cursor MCP documentation](https://docs.cursor.com/context/model-context-protocol).
1. Open Cursor, and then go to **Cursor Settings**.
2. Click **MCP**, and then click **Add New Global MCP Server**.
Cursor's MCP servers are listed as JSON objects.
3. To add a Langflow server, add an entry for your Langflow server's `/api/v1/mcp/sse` endpoint.
This example assumes the default Langflow server address of `http://127.0.0.1:7860`.
```json
{
"mcpServers": {
"langflow": {
"url": "http://127.0.0.1:7860/api/v1/mcp/sse"
}
}
}
```
4. Save the `mcp.json` file, and then click the **Reload** icon.
5. Your Langflow server is now available to Cursor as an MCP server, and all of its flows are registered as tools.
You can now use your flows as tools in Cursor.
Cursor determines when to use tools based on your queries, and will request permissions when necessary.
</TabItem>
<TabItem value="claude for desktop" label="Claude for Desktop">
In Claude for Desktop, you can configure a Langflow server in the same way as other MCP servers.
For more information, see the [Claude for Desktop MCP documentation](https://modelcontextprotocol.io/quickstart/user).
1. Open Claude for Desktop, and then go to the program settings.
For example, on the macOS menu bar, click **Claude**, and then select **Settings**.
2. In the **Settings** dialog, click **Developer**, and then click **Edit Config**.
This creates a `claude_desktop_config.json` file if you don't already have one.
3. Add the following code to `claude_desktop_config.json`.
Your args may differ for your `uvx` and `Python` installations. To find the correct paths:
* For `uvx`: Run `which uvx` in your terminal
* For Python: Run `which python` in your terminal
Replace `PATH/TO/PYTHON` with the Python path from your system.
This command assumes the default Langflow server address of `http://127.0.0.1:7860`.
```json
{
"mcpServers": {
"langflow": {
"command": "/bin/sh",
"args": ["-c", "uvx --python PATH/TO/PYTHON mcp-sse-shim@latest"],
"env": {
"MCP_HOST": "http://127.0.0.1:7860",
"DEBUG": "true"
}
}
}
}
```
This code adds a new MCP server called `langflow` and starts the [mcp-sse-shim](https://github.com/phact/mcp-sse-shim) package using the specified Python interpreter and uvx.
4. Restart Claude for Desktop.
Your new tools are available in your chat window. Click the tools icon to see a list of your flows.
You can now use your flows as tools in Claude for Desktop.
Claude determines when to use tools based on your queries, and will request permissions when necessary.
</TabItem>
</Tabs>
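Both clients store their MCP servers as JSON, so the Langflow entry can also be added programmatically. A minimal sketch for the Cursor-style URL entry shown above; the helper `add_langflow_server` is illustrative, and editing the JSON file by hand works just as well (Claude for Desktop uses a `command`/`args` entry instead of a `url`):

```python
import json
import tempfile
from pathlib import Path

def add_langflow_server(config_path: Path, sse_url: str) -> dict:
    """Merge a 'langflow' MCP server entry into a JSON config file,
    creating the file if it does not exist."""
    config = json.loads(config_path.read_text()) if config_path.exists() else {}
    config.setdefault("mcpServers", {})["langflow"] = {"url": sse_url}
    config_path.write_text(json.dumps(config, indent=2))
    return config

# Example against a temporary file:
with tempfile.TemporaryDirectory() as tmp:
    path = Path(tmp) / "mcp.json"
    cfg = add_langflow_server(path, "http://127.0.0.1:7860/api/v1/mcp/sse")
    print(cfg["mcpServers"]["langflow"]["url"])
```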
## Name and describe your flows for agentic use
MCP clients like Claude for Desktop and Cursor "see" Langflow as a single MCP server, with **all** of your flows listed as tools.
This can confuse agents, who don't know that flow `adbbf8c7-0a34-493b-90ea-5e8b42f78b66` is a Document Q&A flow for a specific text file.
To prevent this behavior, name and describe your flows clearly for agentic use. Imagine your names and descriptions as function names and code comments, with a clear statement of what problem they solve.
For example, you have created a [Document Q&A](/tutorials-document-qa) flow that loads a sample resume for an LLM to chat with, and you want Cursor to use the tool.
1. Click **Flow name**, and then select **Edit Details**.
2. The **Name** field should make it clear what the flow does, both to a user and to the agent. For example, name it `Document QA for Resume`.
3. The **Description** field should include a description of what the flow does. For example, describe the flow as `OpenAI LLM Chat with Alex's resume.`
The **Endpoint Name** field does not affect the agent's behavior.
4. To see how an MCP client understands your flow, in Cursor, examine the **MCP Servers**.
The tool is listed as:
```text
document_qa_for_resume
e967f47d-6783-4bab-b1ea-0aaa554194a3: OpenAI LLM Chat with Alex's resume.
```
Your flow name and description provided the agent with a clear purpose for the tool.
5. Ask Cursor a question specifically about the resume, such as `What job experience does Alex have?`
```text
I'll help you explore a resume using the Document QA for Resume flow, which is specifically designed for analyzing resumes.
Let me call this tool.
```
6. Click **Run tool** to continue. The agent requests permissions when necessary.
```text
Based on the resume, here's a comprehensive breakdown of the experience:
```
7. Ask about a different resume.
You've provided enough information in the description for the agent to make the correct decision:
```text
I notice you're asking about Emily's job experience.
Based on the available tools, I can see there is a Document QA for Resume flow that's designed for analyzing resumes.
However, the description mentions it's for "Alex's resume" not Emily's. I don't have access to Emily's resume or job experience information.
```
## Install MCP Inspector to test and debug flows
[MCP Inspector](https://modelcontextprotocol.io/docs/tools/inspector) is the standard tool for testing and debugging MCP servers.
Use MCP Inspector to monitor your Langflow server's flows, and to understand how they are being consumed over MCP.
To install and run MCP Inspector, follow these steps:
1. Install an LTS release of [Node.js](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm).
2. To install and start MCP Inspector, in a terminal window, run the following command:
```bash
npx @modelcontextprotocol/inspector
```
MCP Inspector starts by default at `http://localhost:5173`.
:::tip
Optionally, specify a proxy port when starting MCP Inspector:
```bash
SERVER_PORT=9000 npx -y @modelcontextprotocol/inspector
```
:::
3. In the browser, navigate to MCP Inspector.
4. To inspect the Langflow server, enter the values for the Langflow server.
* In the **Transport Type** field, select **SSE**.
* In the **URL** field, enter the Langflow server's `/api/v1/mcp/sse` endpoint.
For a default deployment, the URL is `http://127.0.0.1:7860/api/v1/mcp/sse`.
5. Click **Connect**.
MCP Inspector connects to the Langflow server.
6. To confirm the connection, click the **Tools** tab.
The Langflow server's flows are listed as tools, which confirms MCP Inspector is connected.
In the **Tools** tab, you can monitor how your flows are being registered as tools by MCP, and run flows with input values.
To quit MCP Inspector, in the terminal where it's running, enter `Ctrl+C`.
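Under the hood, MCP Inspector consumes the same Server-Sent Events stream that other MCP clients read from the `/api/v1/mcp/sse` endpoint. The following sketch shows only the generic SSE wire framing (blank-line-separated events of `field: value` lines); the event names and payloads in `sample` are placeholders, not the actual messages Langflow emits:

```python
def parse_sse_events(raw: str):
    """Parse Server-Sent Events framing: events are separated by blank
    lines; each line is 'field: value'. Returns a list of event dicts."""
    events = []
    for chunk in raw.strip().split("\n\n"):
        event = {}
        for line in chunk.splitlines():
            field, _, value = line.partition(": ")
            event.setdefault(field, []).append(value)
        events.append({k: "\n".join(v) for k, v in event.items()})
    return events

# Placeholder stream illustrating the framing, not Langflow's real payloads:
sample = "event: endpoint\ndata: /api/v1/mcp/messages\n\nevent: message\ndata: {}"
for e in parse_sse_events(sample):
    print(e)
```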


@@ -0,0 +1,40 @@
---
title: Connect an Astra DB MCP server to Langflow
slug: /mcp-component-astra
---
Use the [MCP server component](/components-tools#mcp-server) to connect Langflow to a [DataStax Astra DB MCP server](https://github.com/datastax/astra-db-mcp).
1. Install an LTS release of [Node.js](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm).
2. Create an [OpenAI](https://platform.openai.com/) API key.
3. Create an [Astra DB Serverless (Vector) database](https://docs.datastax.com/en/astra-db-serverless/databases/create-database.html#create-vector-database), if you don't already have one.
4. Get your database's **Astra DB API endpoint** and an **Astra DB application token** with the Database Administrator role. For more information, see [Generate an application token for a database](https://docs.datastax.com/en/astra-db-serverless/administration/manage-application-tokens.html#database-token).
5. Create a [Simple agent starter project](/starter-projects-simple-agent).
6. Remove the **URL** tool and replace it with an [MCP server](/components-tools#mcp-server) component.
The flow should look like this:
![MCP server component](/img/mcp-server-component.png)
7. In the **MCP server** component, in the **MCP command** field, add the following code.
Replace the values for `ASTRA_TOKEN` and `ASTRA_ENDPOINT` with the values from your Astra database.
```plain
env ASTRA_DB_APPLICATION_TOKEN=ASTRA_TOKEN ASTRA_DB_API_ENDPOINT=ASTRA_ENDPOINT npx -y @datastax/astra-db-mcp
```
:::important
Langflow passes environment variables from the `.env` file to MCP, but not global variables declared in the UI.
To add the values for `ASTRA_DB_APPLICATION_TOKEN` and `ASTRA_DB_API_ENDPOINT` as global variables, add them to Langflow's `.env` file at startup.
For more information, see [global variables](/configuration-global-variables).
:::
8. In the **Agent** component, add your **OpenAI API key**.
9. Open the **Playground**, and then ask the agent, `What collections are available?`
Since Langflow is connected to your Astra DB database through the MCP, the agent chooses the correct tool and connects to your database to retrieve the answer.
```text
The available collections in your database are:
collection_002
hardware_requirements
load_collection
nvidia_collection
software_requirements
```
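The `env VAR=value command` prefix used in the MCP command above simply launches the MCP server process with extra environment variables set. Langflow runs the command for you; the sketch below demonstrates only the equivalent mechanism in Python, with placeholder values standing in for your real token and endpoint:

```python
import os
import subprocess
import sys

# Placeholder values; replace with your real Astra credentials.
env = dict(os.environ)
env["ASTRA_DB_APPLICATION_TOKEN"] = "ASTRA_TOKEN"
env["ASTRA_DB_API_ENDPOINT"] = "ASTRA_ENDPOINT"

# Demonstrate that a child process launched with this env sees the variables:
out = subprocess.run(
    [sys.executable, "-c", "import os; print(os.environ['ASTRA_DB_API_ENDPOINT'])"],
    env=env, capture_output=True, text=True,
)
print(out.stdout.strip())  # ASTRA_ENDPOINT
```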


@ -1,76 +0,0 @@
---
title: Integrate Langflow with MCP (Model context protocol)
slug: /integrations-mcp
---
Langflow integrates with the [Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction). This allows you to use your Langflow flows as tools in other applications that support the MCP, or extend Langflow with the [MCP stdio component](/components-tools#mcp-tools-stdio) to access MCP servers.
You can use Langflow as an MCP server with any [MCP client](https://modelcontextprotocol.io/clients).
For example purposes, this guide presents two ways to interact with the MCP:
* [Access all of your flows as tools from Claude for Desktop](#access-all-of-your-flows-as-tools-from-claude-for-desktop)
* [Use the MCP stdio component to connect Langflow to a Datastax Astra DB MCP server](#connect-an-astra-db-mcp-server-to-langflow)
## Access all of your flows as tools from Claude for Desktop
1. Install [Claude for Desktop](https://claude.ai/download).
2. Install [uv](https://docs.astral.sh/uv/getting-started/installation/) so that you can run `uvx` commands.
3. Create at least one flow, and note your host. For example, `http://127.0.0.1:7863`.
4. Open Claude for Desktop, and then go to the program settings.
For example, on the macOS menu bar, click **Claude**, and then select **Settings**.
5. In the **Settings** dialog, click **Developer**, and then click **Edit Config**.
This creates a `claude_desktop_config.json` file if you don't already have one.
6. Add the following code to `claude_desktop_config.json`.
Your args may differ for your `uvx` and `Python` installations. To find the correct paths:
* For `uvx`: Run `which uvx` in your terminal
* For Python: Run `which python` in your terminal
Replace `/path/to/uvx` and `/path/to/python` with the paths from your system:
```json
{
"mcpServers": {
"langflow": {
"command": "/bin/sh",
"args": ["-c", "/path/to/uvx --python /path/to/python mcp-sse-shim@latest"],
"env": {
"MCP_HOST": "http://127.0.0.1:7864",
"DEBUG": "true"
}
}
}
}
```
This code adds a new MCP server called `langflow` and starts the [mcp-sse-shim](https://github.com/phact/mcp-sse-shim) package using the specified Python interpreter and uvx.
7. Restart Claude for Desktop.
Your new tools are available in your chat window. Click the tools icon to see a list of your flows.
You can now use your flows as tools in Claude for Desktop.
Claude determines when to use tools based on your queries, and will request permissions when necessary.
## Connect an Astra DB MCP server to Langflow
Use the [MCP stdio component](/components-tools#mcp-tools-stdio) to connect Langflow to a [Datastax Astra DB MCP server](https://github.com/datastax/astra-db-mcp).
1. Install an LTS release of [Node.js](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm).
2. Create an [OpenAI](https://platform.openai.com/) API key.
3. Create an [Astra DB Serverless (Vector) database](https://docs.datastax.com/en/astra-db-serverless/databases/create-database.html#create-vector-database), if you don't already have one.
4. Get your database's **Astra DB API endpoint** and an **Astra DB application token** with the Database Administrator role. For more information, see [Generate an application token for a database](https://docs.datastax.com/en/astra-db-serverless/administration/manage-application-tokens.html#database-token).
5. Add your **Astra DB application token** and **Astra API endpoint** to Langflow as [global variables](/configuration-global-variables).
6. Create a [Simple agent starter project](/starter-projects-simple-agent).
7. Remove the **URL** tool and replace it with an [MCP stdio component](/components-tools#mcp-tools-stdio) component.
The flow should look like this:
![MCP stdio component](/img/mcp-stdio-component.png)
8. In the **MCP stdio** component, in the **MCP command** field, add the following code:
```plain
npx -y @datastax/astra-db-mcp
```
9. In the **Agent** component, add your **OpenAI API key**.
10. Open the **Playground**.
Langflow is now connected to your Astra DB database through the MCP.
You can use the MCP to create, read, update, and delete data from your database.


@@ -165,15 +165,6 @@ module.exports = {
"Integrations/Arize/integrations-arize",
"Integrations/integrations-assemblyai",
"Integrations/Composio/integrations-composio",
"Integrations/integrations-langfuse",
"Integrations/integrations-langsmith",
"Integrations/integrations-langwatch",
"Integrations/integrations-opik",
{
type: "doc",
id: "Integrations/integrations-mcp",
label: "MCP (Model context protocol)"
},
{
type: 'category',
label: 'Google',
@@ -182,6 +173,18 @@
'Integrations/Google/integrations-setup-google-cloud-vertex-ai-langflow',
],
},
"Integrations/integrations-langfuse",
"Integrations/integrations-langsmith",
"Integrations/integrations-langwatch",
{
type: 'category',
label: 'MCP (Model Context Protocol)',
items: [
'Integrations/MCP/integrations-mcp',
'Integrations/MCP/mcp-component-astra',
],
},
"Integrations/integrations-opik",
{
type: "category",
label: "Notion",

Binary file not shown (new image, 174 KiB).

BIN docs/static/img/mcp-server-component.png (new file, 375 KiB): binary file not shown.

Binary file not shown (removed image, 382 KiB).