docs: v1.4 features (#7778)

* bump-supported-version-to-1.4.x

* overview-page

* small-naming-changes

* change-folders-to-projects

* api-examples-folder-to-projects

* more

* update-screenshots-and-text

* sidebars

* npx-astra-command

* move-and-redirect-outdated-page

* sidebars

* tools-and-env-var

* language

* authentication

* flow-description-and-tool

* mcp-component-astra-db

* rename-page

* cleanup

* update-message

* update-for-client-strategy

* sse-port-7868-and-api-key-creation

* external-deploy

* init-ngrok

* add-ngrok-deploy

* cleanup

* cleanup

* Minor capitalization fix

* Fix message object anchors

* Fix indentation of MCP server procedure

* Add anchor link for MCP SSE mode

* Mild cleanup of concepts overview

* Fix indentation issues in mcp-component-astra

* Use universal date format on luna-for-langflow

* Update mcp-server doc

* fix-missed-bracket

* change-npx-to-uvx-mcp-proxy

* update-screenshots-for-uvx

* inspector-with-auth

* Apply suggestions from code review

Co-authored-by: Eric Schneider <37347760+eric-schneider@users.noreply.github.com>

---------

Co-authored-by: Deon Sanchez <69873175+deon-sanchez@users.noreply.github.com>
Co-authored-by: Eric Schneider <37347760+eric-schneider@users.noreply.github.com>
Co-authored-by: Lucas Oliveira <lucas.edu.oli@hotmail.com>
This commit is contained in:
Mendon Kissling 2025-05-06 12:41:07 -04:00 committed by GitHub
commit 744df31f18
No known key found for this signature in database
GPG key ID: B5690EEEBB952194
27 changed files with 516 additions and 308 deletions


@ -32,14 +32,14 @@ export FLOW_ID="359cd752-07ea-46f2-9d3b-a4407ef618da"
```
- Export the `project-id` in your terminal.
To find your project ID, call the Langflow [/api/v1/projects/](#read-projects) endpoint for a list of projects.
<Tabs>
<TabItem value="curl" label="curl" default>
```bash
curl -X GET \
  "$LANGFLOW_URL/api/v1/projects/" \
-H "accept: application/json"
```
@ -50,7 +50,7 @@ curl -X GET \
[
{
"name": "My Projects",
    "description": "Manage your own projects. Download and upload projects.",
"id": "1415de42-8f01-4f36-bf34-539f23e47466",
"parent_id": null
}
@ -59,8 +59,9 @@ curl -X GET \
</TabItem>
</Tabs>
Export the `project-id` as an environment variable.
```bash
export PROJECT_ID="1415de42-8f01-4f36-bf34-539f23e47466"
```
- Export the Langflow API key as an environment variable.
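If you prefer not to copy the project ID by hand, the sample `/api/v1/projects/` response above can be parsed in a few lines. This is an illustrative sketch using the example response body from this page, not part of the Langflow API itself:

```python
import json

# Sample response body from GET /api/v1/projects/ (copied from the example
# output above); in practice, use the body returned by your curl call.
response = '''[
  {
    "name": "My Projects",
    "description": "Manage your own projects. Download and upload projects.",
    "id": "1415de42-8f01-4f36-bf34-539f23e47466",
    "parent_id": null
  }
]'''

# Take the ID of the first project and print a ready-to-use export command.
project_id = json.loads(response)[0]["id"]
print(f'export PROJECT_ID="{project_id}"')
```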
@ -1053,7 +1054,7 @@ To retrieve only the flows from a specific project, pass `project_id` in the que
```bash
curl -X GET \
  "$LANGFLOW_URL/api/v1/flows/?remove_example_flows=true&components_only=false&get_all=false&project_id=$PROJECT_ID&header_flows=false&page=1&size=1" \
-H "accept: application/json"
```
@ -1122,6 +1123,7 @@ curl -X PATCH \
"description": "string",
"data": {},
  "project_id": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
"endpoint_name": "my_new_endpoint_name",
"locked": true
}'
@ -1147,6 +1149,7 @@ curl -X PATCH \
"id": "01ce083d-748b-4b8d-97b6-33adbb6a528a",
"user_id": "f58396d4-a387-4bb8-b749-f40825c3d9f3",
  "project_id": "3fa85f64-5717-4562-b3fc-2c963f66afa6"
}
```
@ -1210,6 +1213,7 @@ curl -X POST \
"locked": false,
"user_id": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
      "project_id": "3fa85f64-5717-4562-b3fc-2c963f66afa6"
},
{
"name": "string",
@ -1228,6 +1232,7 @@ curl -X POST \
"locked": false,
"user_id": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
      "project_id": "3fa85f64-5717-4562-b3fc-2c963f66afa6"
}
]
}'
@ -1258,7 +1263,7 @@ This example uploads a local file named `agent-with-astra-db-tool.json`.
```bash
curl -X POST \
  "$LANGFLOW_URL/api/v1/flows/upload/?project_id=$PROJECT_ID" \
-H "accept: application/json" \
-H "Content-Type: multipart/form-data" \
-F "file=@agent-with-astra-db-tool.json;type=application/json"
@ -1284,12 +1289,14 @@ curl -X POST \
</TabItem>
</Tabs>
To specify a target project for the flow, include the query parameter `project_id`.
The target `project_id` must already exist before uploading a flow. Call the [/api/v1/projects/](#read-projects) endpoint for a list of available projects.
```bash
curl -X POST \
  "$LANGFLOW_URL/api/v1/flows/upload/?project_id=$PROJECT_ID" \
-H "accept: application/json" \
-H "Content-Type: multipart/form-data" \
-F "file=@agent-with-astra-db-tool.json;type=application/json"
@ -1353,6 +1360,8 @@ A list of example flows.
## Projects
Use the `/projects` endpoint to create, read, update, and delete projects.
Projects store your flows and components.
@ -1366,6 +1375,7 @@ Get a list of Langflow projects.
```bash
curl -X GET \
  "$LANGFLOW_URL/api/v1/projects/" \
-H "accept: application/json"
```
@ -1377,7 +1387,7 @@ curl -X GET \
[
{
"name": "My Projects",
    "description": "Manage your own projects. Download and upload projects.",
"id": "1415de42-8f01-4f36-bf34-539f23e47466",
"parent_id": null
}
@ -1389,6 +1399,9 @@ curl -X GET \
### Create project
Create a new project.
<Tabs>
@ -1396,11 +1409,13 @@ Create a new project.
```bash
curl -X POST \
  "$LANGFLOW_URL/api/v1/projects/" \
-H "accept: application/json" \
-H "Content-Type: application/json" \
-d '{
  "name": "new_project_name",
"description": "string",
"components_list": [],
"flows_list": []
@ -1412,6 +1427,7 @@ curl -X POST \
```json
{
  "name": "new_project_name",
"description": "string",
"id": "b408ddb9-6266-4431-9be8-e04a62758331",
@ -1422,17 +1438,21 @@ curl -X POST \
</TabItem>
</Tabs>
To add flows and components at project creation, retrieve the `components_list` and `flows_list` values from the [/api/v1/store/components](#get-all-components) and [/api/v1/flows/read](#read-flows) endpoints and add them to the request body.
Adding a flow to a project moves the flow from its previous location. The flow is not copied.
```bash
curl -X POST \
  "$LANGFLOW_URL/api/v1/projects/" \
-H "accept: application/json" \
-H "Content-Type: application/json" \
-d '{
  "name": "new_project_name",
"description": "string",
"components_list": [
"3fa85f64-5717-4562-b3fc-2c963f66afa6"
@ -1445,8 +1465,12 @@ curl -X POST \
### Read project
Retrieve details of a specific project.
To find the UUID of your project, call the [read projects](#read-projects) endpoint.
<Tabs>
@ -1454,7 +1478,7 @@ To find the UUID of your project, call the [read projects](#read-projects) endpo
```bash
curl -X GET \
  "$LANGFLOW_URL/api/v1/projects/$PROJECT_ID" \
-H "accept: application/json"
```
@ -1464,8 +1488,8 @@ curl -X GET \
```json
[
{
    "name": "My Projects",
    "description": "Manage your own projects. Download and upload projects.",
"id": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
"parent_id": null
}
@ -1477,8 +1501,12 @@ curl -X GET \
### Update project
Update the information of a specific project with a `PATCH` request.
Each PATCH request updates the project with the values you send.
Only the fields you include in your request are updated.
If you send the same values multiple times, the update is still processed, even if the values are unchanged.
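Because only the fields present in the request body are updated, a client can assemble the `PATCH` payload from just the values that changed. A minimal sketch of that pattern (the helper name and field values are hypothetical, not part of the Langflow API):

```python
def patch_payload(**changes):
    """Keep only the fields that were actually given a value."""
    return {key: value for key, value in changes.items() if value is not None}

# Only "name" is included in the body, so only the name is updated;
# "description" is omitted and therefore left untouched on the server.
payload = patch_payload(name="updated_project_name", description=None)
print(payload)  # {'name': 'updated_project_name'}
```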
@ -1488,6 +1516,7 @@ If you send the same values multiple times, the update is still processed, even
```bash
curl -X PATCH \
  "$LANGFLOW_URL/api/v1/projects/b408ddb9-6266-4431-9be8-e04a62758331" \
-H "accept: application/json" \
-H "Content-Type: application/json" \
@ -1521,6 +1550,9 @@ curl -X PATCH \
### Delete project
Delete a specific project.
<Tabs>
@ -1528,7 +1560,7 @@ Delete a specific project.
```bash
curl -X DELETE \
  "$LANGFLOW_URL/api/v1/projects/$PROJECT_ID" \
-H "accept: */*"
```
@ -1544,6 +1576,9 @@ curl -X DELETE \
### Download project
Download all flows from a project as a zip file.
The `--output` flag is optional.
@ -1553,9 +1588,11 @@ The `--output` flag is optional.
```bash
curl -X GET \
  "$LANGFLOW_URL/api/v1/projects/download/b408ddb9-6266-4431-9be8-e04a62758331" \
-H "accept: application/json" \
  --output langflow-project.zip
```
</TabItem>
@ -1563,6 +1600,7 @@ curl -X GET \
```text
The project contents.
```
</TabItem>
@ -1570,6 +1608,9 @@ The project contents.
### Upload project
Upload a project to Langflow.
<Tabs>
@ -1577,6 +1618,7 @@ Upload a project to Langflow.
```bash
curl -X POST \
  "$LANGFLOW_URL/api/v1/projects/upload/" \
-H "accept: application/json" \
-H "Content-Type: multipart/form-data" \
@ -1589,6 +1631,7 @@ curl -X POST \
```text
The project contents are uploaded to Langflow.
```
</TabItem>


@ -354,7 +354,7 @@ This value is set as the `OLLAMA_HOST` environment variable in Ollama. The defau
2. To refresh the server's list of models, click <Icon name="RefreshCw" aria-label="Refresh"/>.
3. In the **Ollama Model** field, select an embeddings model. This example uses `all-minilm:latest`.
4. Connect the **Ollama** embeddings component to a flow.
For example, this flow connects a local Ollama server running a `all-minilm:latest` embeddings model to a [Chroma DB](/components-vector-stores#chroma-db) vector store to generate embeddings for split text.
![Ollama embeddings connected to Chroma DB](/img/component-ollama-embeddings-chromadb.png)


@ -19,7 +19,7 @@ The **Chat Input** component provides an interactive chat interface in the **Pla
## Chat Input
This component collects user input as `Text` strings from the chat and wraps it in a [Message](/concepts-objects#message-object) object that includes the input text, sender information, session ID, file attachments, and styling properties.
It can optionally store the message in a chat history.
@ -115,7 +115,7 @@ The component accepts the following input types.
## Text Output
The **Text Output** takes a single input of text and returns a [Message](/concepts-objects#message-object) object containing that text.
The output does not appear in the **Playground**.
@ -133,7 +133,7 @@ The output does not appear in the **Playground**.
## Chat components example flow
1. To use the **Chat Input** and **Chat Output** components in a flow, connect them to components that accept or send the [Message](/concepts-objects#message-object) type.
For this example, connect a **Chat Input** component to an **OpenAI** model component's **Input** port, and then connect the **OpenAI** model component's **Message** port to the **Chat Output** component.


@ -188,6 +188,7 @@ This component allows you to evaluate basic arithmetic expressions. It supports
## Combinatorial Reasoner
This component runs Icosa's Combinatorial Reasoning (CR) pipeline on an input to create an optimized prompt with embedded reasons. For more information, see [Icosa Computing](https://www.icosacomputing.com/).
### Inputs
| Name | Display Name | Description |
@ -309,40 +310,62 @@ This component allows you to call the Serper.dev Google Search API.
## MCP server
This component connects to a [Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) server and exposes the MCP server's tools as tools for Langflow agents.
In addition to being an MCP client that can leverage MCP servers, the MCP component's [SSE mode](#mcp-sse-mode) allows you to connect your flow to the Langflow MCP server at the `/api/v1/mcp/sse` API endpoint, exposing all flows within your [project](/concepts-overview#projects) as tools within a flow.
To use the MCP server component with an agent component, follow these steps:
1. Add the MCP server component to your workflow.
2. In the MCP server component, in the **MCP Command** field, enter the command to start your MCP server. For example, to start a [Fetch](https://github.com/modelcontextprotocol/servers/tree/main/src/fetch) server, the command is:
   ```bash
   uvx mcp-server-fetch
   ```
   `uvx` is included with `uv` in the Langflow package.
   To use `npx` server commands, you must first install an LTS release of [Node.js](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm).
   For an example of starting `npx` MCP servers, see [Connect an Astra DB MCP server to Langflow](/mcp-component-astra).
To include environment variables with your server command, add them to the **Env** field like this:
```bash
ASTRA_DB_APPLICATION_TOKEN=AstraCS:...
```
:::important
Langflow passes environment variables from the `.env` file to MCP, but not global variables declared in the UI.
To add a value for an environment variable as a global variable, add it to Langflow's `.env` file at startup.
For more information, see [global variables](/configuration-global-variables).
:::
3. Click <Icon name="RefreshCw" aria-label="Refresh"/> to get the server's list of **Tools**.
4. In the **Tool** field, select the server tool you want the component to use.
The available fields change based on the selected tool.
For information on the parameters, see the MCP server's documentation.
5. In the MCP server component, enable **Tool mode**.
Connect the MCP server component's **Toolset** port to an **Agent** component's **Tools** port.
The flow looks similar to this:
![MCP server component](/img/component-mcp-stdio.png)
6. Open the **Playground**.
Ask the agent to summarize recent tech news. The agent calls the MCP server function `fetch` and returns the summary.
This confirms the MCP server is connected, and its tools are being used in Langflow.
For more information, see [MCP server](/mcp-server).
### MCP Server-Sent Events (SSE) mode {#mcp-sse-mode}
:::important
If you're using **Langflow for Desktop**, the default address is `http://127.0.0.1:7868/`.
:::
The MCP component's SSE mode connects your flow to the Langflow MCP server through the component.
This allows you to use all flows within your [project](/concepts-overview#projects) as tools within a flow.
1. In the **MCP Server** component, select **SSE**.
A default address appears in the **MCP SSE URL** field.
@ -353,7 +376,7 @@ The default value is `http://localhost:7860/api/v1/mcp/sse`.
All of your flows are listed as tools.
5. Enable **Tool Mode**, and then connect the **MCP Server** component to an agent component's tool port.
The flow looks like this:
![MCP server component](/img/component-mcp-sse-mode.png)
6. Open the **Playground** and chat with your tool.
The agent chooses the correct tool based on your query.
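Under the hood, the `/api/v1/mcp/sse` endpoint speaks Server-Sent Events: the server streams `event:` and `data:` lines, with a blank line terminating each event. A rough sketch of how such a stream can be split into events (the sample payload below is hypothetical, not actual Langflow output):

```python
def parse_sse(stream_text):
    """Split raw SSE text into (event, data) pairs."""
    events = []
    event_type, data_lines = "message", []
    for line in stream_text.splitlines():
        if line.startswith("event:"):
            event_type = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data_lines.append(line[len("data:"):].strip())
        elif line == "" and data_lines:
            # A blank line terminates one event.
            events.append((event_type, "\n".join(data_lines)))
            event_type, data_lines = "message", []
    return events

sample = "event: endpoint\ndata: /api/v1/mcp/sse\n\n"
print(parse_sse(sample))  # [('endpoint', '/api/v1/mcp/sse')]
```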


@ -64,6 +64,8 @@ A component inherits from a base `Component` class that defines its interface an
For example, the [Recursive character text splitter](https://github.com/langflow-ai/langflow/blob/main/src/backend/base/langflow/components/langchain_utilities/recursive_character.py) is a child of the [LCTextSplitterComponent](https://github.com/langflow-ai/langflow/blob/main/src/backend/base/langflow/base/textsplitters/model.py) class.
<details>
<summary>Recursive character text splitter code</summary>
```python
from typing import Any
@ -126,6 +128,8 @@ class RecursiveCharacterTextSplitterComponent(LCTextSplitterComponent):
```
</details>
Components include definitions for inputs and outputs, which are represented in the UI with color-coded ports.
**Input Definition:** Each input (like `IntInput` or `DataInput`) specifies an input's type, name, and display properties, which appear as configurable fields in the component's UI panel.


@ -18,17 +18,17 @@ Flows are created in the **workspace** with components dragged from the componen
A flow can be as simple as the [basic prompting flow](/get-started-quickstart), which creates an OpenAI chatbot with four components.
- Each component in a flow is a **node** that performs a specific task, like an AI model or a data source.
- Each component has a **Configuration** menu. Click the <Icon name="Code" aria-hidden="true"/> **Code** button on a component to see its underlying Python code.
- Components are connected with **edges** to form flows.
If you're familiar with [React Flow](https://reactflow.dev/learn), a **flow** is a node-based application, a **component** is a node, and the connections between components are **edges**.
When a flow is run, Langflow builds a Directed Acyclic Graph (DAG) graph object from the nodes (components) and edges (connections between components), with the nodes sorted to determine the order of execution. The graph build calls the individual components' `def_build` functions to validate and prepare the nodes. This graph is then processed in dependency order. Each node is built and executed sequentially, with results from each built node being passed to nodes that are dependent on the previous node's results.
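The dependency-ordered execution described above can be illustrated with Python's standard-library topological sorter. The component names and edges below are a hypothetical four-component chat flow, not Langflow's internal graph classes:

```python
from graphlib import TopologicalSorter

# Each key lists the components whose output it depends on.
dependencies = {
    "Prompt": set(),
    "Chat Input": set(),
    "OpenAI": {"Chat Input", "Prompt"},
    "Chat Output": {"OpenAI"},
}

# static_order() yields every node only after all of its dependencies,
# mirroring how nodes are built and executed sequentially in a flow.
build_order = list(TopologicalSorter(dependencies).static_order())
print(build_order)  # inputs first, then the model, then the output
```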
Flows are stored on local disk at the following default locations:
- **Linux and WSL**: `/home/<username>/.cache/langflow/`
- **macOS**: `/Users/<username>/Library/Caches/langflow/`
The flow storage location can be customized with the [LANGFLOW_CONFIG_DIR](/environment-variables#LANGFLOW_CONFIG_DIR) environment variable.
@ -36,20 +36,19 @@ The flow storage location can be customized with the [LANGFLOW_CONFIG_DIR](/envi
If you're new to Langflow, it's OK to feel a bit lost at first. We'll take you on a tour, so you can orient yourself and start creating applications quickly.
Langflow has four distinct regions: the [workspace](#workspace) is the main area where you build your flows. The components sidebar is on the left, and lists the available [components](#components). The [playground](#playground) and [publish pane](#publish-pane) are available in the upper right corner.
## Workspace
The **workspace** is where you create AI applications by connecting and running components in flows.
The workspace controls allow you to adjust your view and lock your flows in place.
- Click and drag the workspace to move it left, right, up, and down.
- Scroll up and down to zoom in and out of the workspace, or use the <Icon name="ZoomIn" aria-hidden="true"/> **Zoom In** and <Icon name="ZoomOut" aria-hidden="true"/> **Zoom Out** controls.
- Click <Icon name="Maximize" aria-hidden="true"/> **Fit To Zoom** to center the workspace on the current flow.
- Click <Icon name="LockOpen" aria-hidden="true"/> **Lock** to lock the workspace in place, preventing accidental movement.
- Click <Icon name="StickyNote" aria-hidden="true"/> **Add Note** to add a note to your flow, similar to commenting in code.
- To access the [Settings](#settings) menu, click <Icon name="Settings" aria-label="Gear icon" /> **Settings**.
This menu contains configuration for **Global Variables**, **Langflow API**, **Shortcuts**, and **Messages**.
![Empty langflow workspace](/img/workspace.png)
## Components
@ -95,36 +94,30 @@ Langflow stores logs at the location specified in the `LANGFLOW_CONFIG_DIR` envi
This directory's default location depends on your operating system.
- **Linux and WSL**: `~/.cache/langflow/`
- **macOS**: `/Users/<username>/Library/Caches/langflow/`
- **Windows**: `%LOCALAPPDATA%\langflow\langflow\Cache`
To modify the location of your log file:
1. Add `LANGFLOW_LOG_FILE=path/to/logfile.log` in your `.env` file.
2. To start Langflow with the values from your `.env` file, start Langflow with `uv run langflow run --env-file .env`.
An example `.env` file is available in the [project repository](https://github.com/langflow-ai/langflow/blob/main/.env.example).
## Projects
The **My Projects** page displays all the flows and components you've created in the Langflow workspace.
![](/img/my-projects.png)
**My Projects** is the default project where all new projects and components are initially stored.
Projects and flows are exchanged as JSON objects.
To create a new project, click <Icon name="Plus" aria-hidden="true"/> **Create new project**.
To upload a flow to your project, click <Icon name="Upload" aria-hidden="true"/> **Upload a flow**.
## File management
@ -134,17 +127,17 @@ For more on managing your files, see [Manage files](/concepts-file-management).
## Options menu
The dropdown menu labeled with the project name offers several management and customization options for the current flow in the Langflow workspace:
- <Icon name="Plus" aria-hidden="true"/> **New**: Create a new flow from scratch.
- <Icon name="SquarePen" aria-hidden="true"/> **Edit Details**: Adjust settings specific to the current flow, such as its name, description, and endpoint name.
- <Icon name="ScrollText" aria-hidden="true"/> **Logs**: View logs for the current project, including execution history, errors, and other runtime events.
- <Icon name="FileUp" aria-hidden="true"/> **Import**: Import a flow or component from a JSON file into the workspace.
- <Icon name="FileDown" aria-hidden="true"/> **Export**: Export the current flow as a JSON file.
- <Icon name="Undo" aria-hidden="true"/> **Undo**: Revert the last action taken in the project. Keyboard shortcut: <kbd>Control+Z</kbd> (or <kbd>Command+Z</kbd> on macOS).
- <Icon name="Redo" aria-hidden="true"/> **Redo**: Reapply a previously undone action. Keyboard shortcut: <kbd>Control+Y</kbd> (or <kbd>Command+Y</kbd> on macOS).
- <Icon name="RefreshCcw" aria-hidden="true"/> **Refresh All**: Refresh all components and delete cache.
## Settings
Click <Icon name="Settings" aria-hidden="true"/> **Settings** to access **Global variables**, **Langflow API keys**, **Shortcuts**, and **Messages**.


@ -5,11 +5,11 @@ slug: /concepts-publish
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
Langflow provides several ways to publish and integrate your flows into external applications. Whether you want to expose your flow as an API endpoint, embed it as a chat widget in your website, or share it as a public playground, this guide covers the options available for making your flows accessible to users.
## API access
The **API access** pane presents code templates for integrating your flow into external applications.
![](/img/api-pane.png)
@ -63,7 +63,7 @@ For example, changing the **Chat Input** component's `input_value` changes that
### Send files to your flow with the API
For information on sending files to the Langflow API, see [API examples](/api-reference-api-examples#upload-image-files-v1).
### Webhook cURL

View file

@ -0,0 +1,302 @@
---
title: Model Context Protocol (MCP) server
slug: /mcp-server
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
import Icon from "@site/src/components/icon";
Langflow integrates with the [Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) as both an MCP server and an MCP client.
This page describes how to use Langflow as an *MCP server*.
For information about using Langflow as an *MCP client*, see the [MCP component](/components-tools#mcp-server).
As an MCP server, Langflow exposes your flows as [tools](https://modelcontextprotocol.io/docs/concepts/tools) that [MCP clients](https://modelcontextprotocol.io/clients) can use to take actions.
## Prerequisites
* A Langflow project with at least one flow created.
* Any LTS version of [Node.js](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm) installed on your computer to use MCP Inspector to [test and debug flows](#test-and-debug-flows).
## Select and configure flows to expose as MCP tools {#select-flows-to-serve}
Langflow runs a separate MCP server for every [project](/concepts-overview#projects).
The MCP server for each project exposes that project's flows as tools.
All of the flows in a project are exposed by default.
To expose only specific flows and optionally rename them for agentic use, follow these steps:
1. From the Langflow dashboard, select the project that contains the flows you want to serve as tools, and then click the **MCP Server** tab.
Alternatively, you can quickly access the **MCP Server** tab from within any flow by selecting **Publish > MCP Server**.
   The **MCP Server** tab displays a code template that you can use to connect MCP clients to the project's MCP server.
The **Flows/Actions** section lists the flows that are currently being served as tools.
![MCP server projects page](/img/mcp-server.png)
2. Click <Icon name="Settings2" aria-hidden="true"/> **Edit Actions**.
3. In the **MCP Server Actions** window, select the flows that you want exposed as tools.
![MCP server actions](/img/mcp-server-actions.png)
4. Optional: Edit the **Flow Name** and **Flow Description**.
   - **Flow Name**: Enter a name that makes it clear what the flow does.
- **Flow Description**: Enter a description that accurately describes the specific action(s) the flow performs.
:::important
MCP clients use the **Flow Name** and **Flow Description** to determine which action to use.
For more information about naming and describing your flows, see [Name and describe your flows for agentic use](#name-and-describe-your-flows).
:::
5. Close the **MCP Server Actions** window to save your changes.
{/* The anchor on this section (connect-clients-to-use-the-servers-actions) is currently a link target in the Langflow UI. Do not change. */}
## Connect clients to Langflow's MCP server {#connect-clients-to-use-the-servers-actions}
The following procedure describes how to connect [Cursor](https://www.cursor.com/) to your Langflow project's MCP server to consume your flows as tools.
However, you can connect any [MCP-compatible client](https://modelcontextprotocol.io/clients) following similar steps.
1. Install [Cursor](https://docs.cursor.com/get-started/installation).
2. In Cursor, go to **Cursor Settings > MCP**, and then click **Add New Global MCP Server**.
This opens Cursor's global MCP configuration file, `mcp.json`.
3. In the Langflow dashboard, select the project that contains the flows you want to serve, and then click the **MCP Server** tab.
4. Copy the code template from the **MCP Server** tab, and then paste it into `mcp.json` in Cursor.
For example:
```json
{
"mcpServers": {
"PROJECT_NAME": {
"command": "uvx",
"args": [
"mcp-proxy",
"http://LANGFLOW_SERVER_ADDRESS/api/v1/mcp/project/PROJECT_ID/sse"
]
}
}
}
```
The **MCP Server** tab automatically includes the correct `PROJECT_NAME`, `LANGFLOW_SERVER_ADDRESS`, and `PROJECT_ID` values.
The default Langflow server address is `http://127.0.0.1:7860` (`http://127.0.0.1:7868` if using Langflow for Desktop).
:::important
If your Langflow server [requires authentication](/configuration-authentication) ([`LANGFLOW_AUTO_LOGIN`](/environment-variables#LANGFLOW_AUTO_LOGIN) is set to `false`), you must include your Langflow API key in the configuration.
For more information, see [MCP server authentication and environment variables](#authentication).
:::
5. Save and close the `mcp.json` file in Cursor.
The newly added MCP server will appear in the **MCP Servers** section.
Cursor is now connected to your project's MCP server and your flows are registered as tools.
Cursor determines when to use tools based on your queries, and requests permissions when necessary.
For more information, see the [Cursor MCP documentation](https://docs.cursor.com/context/model-context-protocol).
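For scripted setups, the same client configuration can be assembled programmatically. The following Python sketch builds the structure shown in the template above; the `build_mcp_entry` helper is illustrative, not part of Langflow:

```python
import json

def build_mcp_entry(project_name, server_address, project_id, api_key=None):
    """Assemble a Cursor-style mcp.json entry for a Langflow project's MCP server."""
    args = ["mcp-proxy"]
    if api_key:
        # Only needed when the Langflow server requires authentication.
        args += ["--headers", "x-api-key", api_key]
    args.append(f"{server_address}/api/v1/mcp/project/{project_id}/sse")
    return {"mcpServers": {project_name: {"command": "uvx", "args": args}}}

config = build_mcp_entry("my_project", "http://127.0.0.1:7860", "PROJECT_ID")
print(json.dumps(config, indent=2))
```

Passing an `api_key` value adds the authentication arguments described in the next section; omitting it produces the unauthenticated template.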
### MCP server authentication and environment variables {#authentication}
If your Langflow server [requires authentication](/configuration-authentication) ([`LANGFLOW_AUTO_LOGIN`](/environment-variables#LANGFLOW_AUTO_LOGIN) is set to `false`), then you must supply a [Langflow API key](/configuration-api-keys) in your MCP client configuration.
When this is the case, the code template in your project's **MCP Server** tab automatically includes the `--headers` and `x-api-key` arguments:
```json
{
"mcpServers": {
"PROJECT_NAME": {
"command": "uvx",
"args": [
"mcp-proxy",
"--headers",
"x-api-key",
"YOUR_API_KEY",
"http://LANGFLOW_SERVER_ADDRESS/api/v1/mcp/project/PROJECT_ID/sse"
]
}
}
}
```
Click <Icon name="key" aria-hidden="true"/> **Generate API key** to automatically insert a new Langflow API key into the code template.
Alternatively, you can replace `YOUR_API_KEY` with an existing Langflow API key.
![MCP server tab showing Generate API key button](/img/mcp-server-api-key.png)
To pass environment variables to your MCP server command, add them to an `env` object in the configuration:
```json
{
"mcpServers": {
"PROJECT_NAME": {
"command": "uvx",
"args": [
"mcp-proxy",
"http://LANGFLOW_SERVER_ADDRESS/api/v1/mcp/project/PROJECT_ID/sse"
],
"env": {
"KEY": "VALUE"
}
}
}
}
```
Replace `KEY` and `VALUE` with the environment variable name and value you want to include.
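Structurally, adding environment variables is a one-key change to the server entry. A rough sketch (the `with_env` helper is hypothetical, shown only to illustrate the shape of the configuration):

```python
import copy
import json

def with_env(config, project_name, env):
    """Return a copy of an mcp.json config with env vars attached to one server entry."""
    updated = copy.deepcopy(config)
    updated["mcpServers"][project_name]["env"] = dict(env)
    return updated

base = {
    "mcpServers": {
        "PROJECT_NAME": {
            "command": "uvx",
            "args": ["mcp-proxy", "http://127.0.0.1:7860/api/v1/mcp/project/PROJECT_ID/sse"],
        }
    }
}
print(json.dumps(with_env(base, "PROJECT_NAME", {"KEY": "VALUE"}), indent=2))
```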
## Name and describe your flows for agentic use {#name-and-describe-your-flows}
MCP clients like [Cursor](https://www.cursor.com/) "see" your Langflow project as a single MCP server, with _all_ of your enabled flows listed as tools.
This can confuse agents.
For example, an agent won't know that flow `adbbf8c7-0a34-493b-90ea-5e8b42f78b66` is a [Document Q&A](/document-qa) flow for a specific text file.
To prevent this behavior, make sure to [name and describe](#select-flows-to-serve) your flows clearly.
It's helpful to think of flow names and descriptions as function names and code comments: use clear statements that describe the problem each flow solves.
For example, let's say you have a [Document Q&A](/document-qa) flow that loads a sample resume for an LLM to chat with, and that you've given it the following name and description:
- **Flow Name**: `document_qa_for_resume`
- **Flow Description**: `A flow for analyzing Emily's resume.`
If you ask Cursor a question specifically about the resume, such as `What job experience does Emily have?`, the agent asks to call the MCP tool `document_qa_for_resume`.
That's because your name and description provided the agent with a clear purpose for the tool.
When you run the tool, the agent requests permissions when necessary, and then provides a response.
For example:
```
{
"input_value": "What job experience does Emily have?"
}
Result:
What job experience does Emily have?
Emily J. Wilson has the following job experience:
```
If you ask about a different resume, such as `What job experience does Alex have?`, you've provided enough information in the description for the agent to make the correct decision:
```
I notice you're asking about Alex's job experience.
Based on the available tools, I can see there is a Document QA for Resume flow that's designed for analyzing resumes.
However, the description mentions it's for "Emily's resume" not Alex's. I don't have access to Alex's resume or job experience information.
```
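MCP tool names are typically limited to letters, numbers, underscores, and dashes, which is why a display name like `Document QA for Resume` surfaces to clients as `document_qa_for_resume`. A minimal normalizer sketch (not Langflow's actual implementation, just an illustration of the mapping):

```python
import re

def to_tool_name(flow_name):
    """Normalize a human-readable flow name into an MCP-safe tool name."""
    name = flow_name.strip().lower().replace(" ", "_")
    # Drop anything outside letters, numbers, underscores, and dashes.
    return re.sub(r"[^a-z0-9_-]", "", name)

print(to_tool_name("Document QA for Resume"))  # document_qa_for_resume
```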
## Use MCP Inspector to test and debug flows {#test-and-debug-flows}
[MCP Inspector](https://modelcontextprotocol.io/docs/tools/inspector) is a common tool for testing and debugging MCP servers.
You can use MCP Inspector to monitor your flows and get insights into how they are being exposed as tools by the MCP server:
1. Install MCP Inspector:
```bash
npx @modelcontextprotocol/inspector
```
For more information about configuring MCP Inspector, including specifying a proxy port, see the [MCP Inspector GitHub project](https://github.com/modelcontextprotocol/inspector).
2. Open a web browser and navigate to the MCP Inspector UI.
The default address is `http://127.0.0.1:6274`.
3. In the MCP Inspector UI, enter the connection details for your Langflow project's MCP server:
- **Transport Type**: Select **SSE**.
- **URL**: Enter the Langflow MCP server's `sse` endpoint. For example: `http://127.0.0.1:7860/api/v1/mcp/project/d359cbd4-6fa2-4002-9d53-fa05c645319c/sse`
If you've [configured authentication for your MCP server](#authentication), fill out the following additional fields:
- **Transport Type**: Select **STDIO**.
- **Command**: `uvx`
- **Arguments**: Enter the following list of arguments, separated by spaces. Replace the values for `YOUR_API_KEY`, `LANGFLOW_SERVER_ADDRESS`, and `PROJECT_ID` with the values from your Langflow MCP server. For example:
```bash
mcp-proxy --headers x-api-key YOUR_API_KEY http://LANGFLOW_SERVER_ADDRESS/api/v1/mcp/project/PROJECT_ID/sse
```
4. Click **Connect**.
If the connection is successful, your project's flows appear in the **Tools** tab.
From this tab, you can monitor how your flows are being registered as tools by MCP, as well as test the tools with custom input values.
5. To quit MCP Inspector, press <kbd>Control+C</kbd> in the same terminal window where you started it.
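Before pasting a URL into MCP Inspector, it can help to sanity-check its shape. This sketch assumes project IDs are standard 36-character UUIDs, which matches the examples above but isn't guaranteed by Langflow:

```python
import re

# Expected shape: http(s)://HOST/api/v1/mcp/project/UUID/sse
SSE_URL = re.compile(r"^https?://[^/]+/api/v1/mcp/project/[0-9a-fA-F-]{36}/sse$")

url = "http://127.0.0.1:7860/api/v1/mcp/project/d359cbd4-6fa2-4002-9d53-fa05c645319c/sse"
print(bool(SSE_URL.match(url)))  # True
```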
{/* The anchor on this section (deploy-your-server-externally) is currently a link target in the Langflow UI. Do not change. */}
## Deploy your MCP server externally {#deploy-your-server-externally}
By default, Langflow isn't exposed to the public internet.
However, you can forward Langflow server traffic with a forwarding platform like [ngrok](https://ngrok.com/docs/getting-started/) or [zrok](https://docs.zrok.io/docs/getting-started).
The following procedure uses ngrok, but you can use any similar reverse proxy or forwarding platform.
This procedure also assumes that you're using the default Langflow listening address `http://127.0.0.1:7860` (`http://127.0.0.1:7868` if using Langflow for Desktop).
1. Sign up for an [ngrok account](https://dashboard.ngrok.com/signup).
2. [Install ngrok](https://ngrok.com/docs/getting-started/#1-install-ngrok).
3. Copy your [ngrok authtoken](https://dashboard.ngrok.com/get-started/your-authtoken) and use it to authenticate your local ngrok server:
```bash
ngrok config add-authtoken NGROK_TOKEN
```
Replace `NGROK_TOKEN` with your ngrok authtoken.
4. Use ngrok to expose your Langflow server to the public internet:
```bash
ngrok http http://localhost:7860
```
The ngrok session starts in your terminal and deploys an ephemeral domain with no authentication.
To add authentication or deploy a static domain, see the [ngrok documentation](https://ngrok.com/docs/).
The `Forwarding` row displays the forwarding address for your Langflow server:
```
Forwarding https://94b1-76-64-171-14.ngrok-free.app -> http://localhost:7860
```
The forwarding address is acting as a reverse proxy for your Langflow server.
5. From the Langflow dashboard, select the project that contains the flows you want to serve as tools, and then click the **MCP Server** tab.
Note that the code template now contains your ngrok forwarding address instead of the localhost address:
```json
{
"mcpServers": {
"PROJECT_NAME": {
"command": "uvx",
"args": [
"mcp-proxy",
"https://94b1-73-64-171-14.ngrok-free.app/api/v1/mcp/project/fdbc12af-0dd4-43dc-b9ce-c324d1ce5cd1/sse"
]
}
}
}
```
6. Complete the steps in [Connect clients to Langflow's MCP server](#connect-clients-to-use-the-servers-actions) using the ngrok forwarding address.
Your MCP client is now connected to your project's MCP server over the public internet.
If you're using Cursor, conversations work the same as they do over a local connection:
```
{
"input_value": "What job experience does Emily have?"
}
Result:
What job experience does Emily have?
Emily J. Wilson has the following job experience:
```
You can use the ngrok console output to monitor requests for your project's endpoint:
```
16:35:48.566 EDT GET /api/v1/mcp/project/fdbc12af-0dd4-43dc-b9ce-c324d1ce5cd1 200 OK
```
@ -8,10 +8,12 @@ The following pages provide information about how to develop and configure Langf
The [Develop an application in Langflow](/develop-application) guide walks you through packaging and serving a flow, from your local development environment to a containerized application.
As you build your application, you will configure the following application behaviors. More detailed explanation is provided in the individual pages.
[Custom Dependencies](/install-custom-dependencies) - Add and manage additional Python packages and external dependencies in your Langflow projects.
* [Custom Dependencies](/install-custom-dependencies) - Add and manage additional Python packages and external dependencies in your Langflow projects.
[Memory and Storage](/memory) - Configure Langflow's storage and caching behavior.
* [Memory and Storage](/memory) - Configure Langflow's storage and caching behavior.
[Session Management](/session-id) - Use session ID to manage communication between components.
* [Session Management](/session-id) - Use session ID to manage communication between components.
• [Logging](/logging) - Monitor and debug your Langflow applications.
* [Logging](/logging) - Monitor and debug your Langflow applications.
* [Webhook](/webhook) - Trigger your flows with external requests.
@ -5,7 +5,7 @@ slug: /webhook
import Icon from "@site/src/components/icon";
Add a **Webhook** component to your flow to trigger it with external requests.
To connect the **Webhook** to a **Parser** component to view and parse your data payload, do the following:
@ -1,185 +0,0 @@
---
title: Integrate Langflow with MCP
slug: /integrations-mcp
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
import Icon from "@site/src/components/icon";
Langflow integrates with the [Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction). This allows you to use your Langflow flows as tools in client applications that support the MCP, or extend Langflow with the [MCP server component](/components-tools#mcp-tools-stdio) to access MCP servers.
You can use Langflow as an MCP server with any [MCP client](https://modelcontextprotocol.io/clients).
For configuring interactions between Langflow flows and MCP tools, see [Name and describe your flows for agentic use](#name-and-describe-your-flows-for-agentic-use).
To connect [MCP Inspector](https://modelcontextprotocol.io/docs/tools/inspector) to Langflow for testing and debugging flows, see [Install MCP Inspector to test and debug flows](#install-mcp-inspector-to-test-and-debug-flows).
## Access all of your flows as tools
:::important
Tool names must contain only letters, numbers, underscores, and dashes.
Tool names cannot contain spaces.
To re-name flows in the Langflow UI, click **Flow Name** > **Edit Details**.
:::
Connect an MCP client to Langflow to use your flows as tools.
1. Install [Cursor](https://docs.cursor.com/) or [Claude for Desktop](https://claude.ai/download).
2. Install [uv](https://docs.astral.sh/uv/getting-started/installation/) to run `uvx` commands. `uvx` is included with `uv` in the Langflow package.
3. Optional: Install an LTS release of [Node.js](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm) to run `npx` commands.
For an example `npx` server, see [Connect an Astra DB MCP server to Langflow](/mcp-component-astra).
4. Create at least one flow, and note your host. For example, `http://127.0.0.1:7860`.
<Tabs>
<TabItem value="cursor" label="Cursor">
In Cursor, you can configure a Langflow server in the same way as other MCP servers.
For more information, see the [Cursor MCP documentation](https://docs.cursor.com/context/model-context-protocol).
1. Open Cursor, and then go to **Cursor Settings**.
2. Click MCP, and then click **Add New Global MCP Server**.
Cursor's MCP servers are listed as JSON objects.
3. To add a Langflow server, add an entry for your Langflow server's `/v1/mcp/sse` endpoint.
This example assumes the default Langflow server address of `http://127.0.0.1:7860`.
```json
{
"mcpServers": {
"langflow": {
"url": "http://127.0.0.1:7860/api/v1/mcp/sse"
}
}
}
```
4. Save the `mcp.json` file, and then click the **Reload** icon.
5. Your Langflow server is now available to Cursor as an MCP server, and all of its flows are registered as tools.
You can now use your flows as tools in Cursor.
Cursor determines when to use tools based on your queries, and requests permissions when necessary.
</TabItem>
<TabItem value="claude for desktop" label="Claude for Desktop">
In Claude for Desktop, you can configure a Langflow server in the same way as other MCP servers.
For more information, see the [Claude for Desktop MCP documentation](https://modelcontextprotocol.io/quickstart/user).
1. Open Claude for Desktop, and then go to the program settings.
For example, on the MacOS menu bar, click **Claude**, and then select **Settings**.
2. In the **Settings** dialog, click **Developer**, and then click **Edit Config**.
This creates a `claude_desktop_config.json` file if you don't already have one.
3. Add the following code to `claude_desktop_config.json`.
Your `args` may differ for your `uvx` and `Python` installations. To find your system paths, do the following:
4. To find the `uvx` path, run `which uvx` in your terminal. Replace `PATH/TO/UVX` with the `uvx` path from your system.
5. To find the `python` path, run `which python` in your terminal. Replace `PATH/TO/PYTHON` with the Python path from your system.
This command assumes the default Langflow server address of `http://127.0.0.1:7860`.
```json
{
"mcpServers": {
"langflow": {
"command": "/bin/sh",
"args": ["-c", "PATH/TO/UVX --python PATH/TO/PYTHON mcp-sse-shim@latest"],
"env": {
"MCP_HOST": "http://127.0.0.1:7860",
"DEBUG": "true"
}
}
}
}
```
This code adds a new MCP server called `langflow` and starts the [mcp-sse-shim](https://github.com/phact/mcp-sse-shim) package using the specified Python interpreter and uvx.
6. Restart Claude for Desktop.
Your new tools are available in your chat window, and Langflow is available as an MCP server.
* To view your tools, click the <Icon name="Hammer" aria-label="Tools" /> icon.
* To view a list of connected MCP servers, which includes **langflow-mcp-server**, click the <Icon name="Unplug" aria-label="Connector" /> icon.
You can now use your flows as tools in Claude for Desktop.
Claude determines when to use tools based on your queries, and will request permissions when necessary.
For more information, see [Debugging in Claude for Desktop](https://modelcontextprotocol.io/docs/tools/debugging#debugging-in-claude-desktop).
</TabItem>
</Tabs>
## Name and describe your flows for agentic use
MCP clients like Claude for Desktop and Cursor "see" Langflow as a single MCP server, with **all** of your flows listed as tools.
This can confuse agents, who don't know that flow `adbbf8c7-0a34-493b-90ea-5e8b42f78b66` is a Document Q&A flow for a specific text file.
To prevent this behavior, name and describe your flows clearly for agentic use. Imagine your names and descriptions as function names and code comments, with a clear statement of what problem they solve.
For example, you have created a [Document Q&A](/document-qa) flow that loads a sample resume for an LLM to chat with, and you want Cursor to use the tool.
1. Click **Flow name**, and then select **Edit Details**.
2. The **Name** field should make it clear what the flow does, both to a user and to the agent. For example, name it `Document QA for Resume`.
3. The **Description** field should include a description of what the flow does. For example, describe the flow as `OpenAI LLM Chat with Alex's resume.`
The **Endpoint Name** field does not affect the agent's behavior.
4. To see how an MCP client understands your flow, in Cursor, examine the **MCP Servers**.
The tool is listed as:
```text
document_qa_for_resume
e967f47d-6783-4bab-b1ea-0aaa554194a3: OpenAI LLM Chat with Alex's resume.
```
Your flow name and description provided the agent with a clear purpose for the tool.
5. Ask Cursor a question specifically about the resume, such as `What job experience does Alex have?`
```text
I'll help you explore a resume using the Document QA for Resume flow, which is specifically designed for analyzing resumes.
Let me call this tool.
```
6. Click **Run tool** to continue. The agent requests permissions when necessary.
```
Based on the resume, here's a comprehensive breakdown of the experience:
```
7. Ask about a different resume.
You've provided enough information in the description for the agent to make the correct decision:
```text
I notice you're asking about Emily's job experience.
Based on the available tools, I can see there is a Document QA for Resume flow that's designed for analyzing resumes.
However, the description mentions it's for "Alex's resume" not Emily's. I don't have access to Emily's resume or job experience information.
```
## Install MCP Inspector to test and debug flows
[MCP inspector](https://modelcontextprotocol.io/docs/tools/inspector) is the standard tool for testing and debugging MCP servers.
Use MCP Inspector to monitor your Langflow server's flows, and understand how they are being consumed by the MCP.
To install and run MCP inspector, follow these steps:
1. Install an LTS release of [Node.js](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm).
2. To install and start MCP inspector, in a terminal window, run the following command:
```
npx @modelcontextprotocol/inspector
```
MCP inspector starts by default at `http://localhost:5173`.
:::tip
Optionally, specify a proxy port when starting MCP Inspector:
```
SERVER_PORT=9000 npx -y @modelcontextprotocol/inspector
```
:::
3. In the browser, navigate to MCP Inspector.
4. To inspect the Langflow server, enter the values for the Langflow server.
* In the **Transport Type** field, select **SSE**.
* In the **URL** field, enter the Langflow server's `/mcp/sse` endpoint.
For a default deployment, the URL is `http://127.0.0.1:7860/api/v1/mcp/sse`.
5. Click **Connect**.
MCP Inspector connects to the Langflow server.
6. To confirm the connection, click the **Tools** tab.
The Langflow server's flows are listed as tools, which confirms MCP Inspector is connected.
In the **Tools** tab, you can monitor how your flows are being registered as tools by MCP, and run flows with input values.
To quit MCP Inspector, in the terminal where it's running, enter `Ctrl+C`.
@ -1,40 +0,0 @@
---
title: Connect an Astra DB MCP server to Langflow
slug: /mcp-component-astra
---
Use the [MCP server component](/components-tools#mcp-server) to connect Langflow to a [Datastax Astra DB MCP server](https://github.com/datastax/astra-db-mcp).
1. Install an LTS release of [Node.js](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm).
2. Create an [OpenAI](https://platform.openai.com/) API key.
3. Create an [Astra DB Serverless (Vector) database](https://docs.datastax.com/en/astra-db-serverless/databases/create-database.html#create-vector-database), if you don't already have one.
4. Get your database's **Astra DB API endpoint** and an **Astra DB application token** with the Database Administrator role. For more information, see [Generate an application token for a database](https://docs.datastax.com/en/astra-db-serverless/administration/manage-application-tokens.html#database-token).
5. Create a [Simple agent starter project](/starter-projects-simple-agent).
6. Remove the **URL** tool and replace it with an [MCP server](/components-tools#mcp-server) component.
The flow should look like this:
![MCP stdio component](/img/mcp-server-component.png)
7. In the **MCP server** component, in the **MCP command** field, add the following code.
Replace the values for `ASTRA_TOKEN` and `ASTRA_ENDPOINT` with the values from your Astra database.
```text
env ASTRA_DB_APPLICATION_TOKEN=ASTRA_TOKEN ASTRA_DB_API_ENDPOINT=ASTRA_ENDPOINT npx -y @datastax/astra-db-mcp
```
:::important
Langflow passes environment variables from the `.env` file to MCP, but not global variables declared in the UI.
To add the values for `ASTRA_DB_APPLICATION_TOKEN` and `ASTRA_DB_API_ENDPOINT` as global variables, add them to Langflow's `.env` file at startup.
For more information, see [global variables](/configuration-global-variables).
:::
8. In the **Agent** component, add your **OpenAI API key**.
9. Open the **Playground**, and then ask the agent, `What collections are available?`
Since Langflow is connected to your Astra DB database through the MCP, the agent chooses the correct tool and connects to your database to retrieve the answer.
```text
The available collections in your database are:
collection_002
hardware_requirements
load_collection
nvidia_collection
software_requirements
```
@ -0,0 +1,64 @@
---
title: Connect an Astra DB MCP server to Langflow
slug: /mcp-component-astra
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
import Icon from "@site/src/components/icon";
Use the [MCP server component](/components-tools#mcp-server) to connect Langflow to a [Datastax Astra DB MCP server](https://github.com/datastax/astra-db-mcp).
1. Install an LTS release of [Node.js](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm).
2. Create an [OpenAI](https://platform.openai.com/) API key.
3. Create an [Astra DB Serverless (Vector) database](https://docs.datastax.com/en/astra-db-serverless/databases/create-database.html#create-vector-database), if you don't already have one.
4. Get your database's **Astra DB API endpoint** and an **Astra DB application token** with the Database Administrator role. For more information, see [Generate an application token for a database](https://docs.datastax.com/en/astra-db-serverless/administration/manage-application-tokens.html#database-token).
5. Create a [Simple agent starter project](/starter-projects-simple-agent).
6. Remove the **URL** tool and replace it with an [MCP server](/components-tools#mcp-server) component.
The flow should look like this:
![MCP stdio component connecting to Astra](/img/component-mcp-astra-db.png)
7. In the **MCP server** component, in the **MCP server** field, add the following command to connect to an Astra DB MCP server:
```bash
npx -y @datastax/astra-db-mcp
```
8. In the **MCP server** component, in the **Env** fields, add variables for `ASTRA_DB_APPLICATION_TOKEN` and `ASTRA_DB_API_ENDPOINT` with the values from your Astra database.
:::important
Langflow passes environment variables from the `.env` file to MCP, but not global variables declared in the UI.
To add the values for `ASTRA_DB_APPLICATION_TOKEN` and `ASTRA_DB_API_ENDPOINT` as global variables, add them to Langflow's `.env` file at startup.
For more information, see [global variables](/configuration-global-variables).
:::
```bash
ASTRA_DB_APPLICATION_TOKEN=AstraCS:...
```
9. To add another variable, click <Icon name="Plus" aria-hidden="true"/> **Add More**.
```bash
ASTRA_DB_API_ENDPOINT=https://...-us-east-2.apps.astra.datastax.com
```
10. In the **Agent** component, add your **OpenAI API key**.
11. Open the **Playground**, and then ask the agent, `What collections are available?`
Since Langflow is connected to your Astra DB database through the MCP, the agent chooses the correct tool and connects to your database to retrieve the answer.
```text
The available collections in your database are:
collection_002
hardware_requirements
load_collection
nvidia_collection
software_requirements
```
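The `.env` entries referenced above are plain `KEY=VALUE` lines. This sketch is not Langflow's actual loader, just an illustration of the format it reads (the sample credential values are hypothetical):

```python
def parse_env_file(text):
    """Parse KEY=VALUE lines from a .env file, skipping comments and blanks."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        # Split on the first "=" only, so values may themselves contain "=".
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

sample = """# Astra credentials (hypothetical values)
ASTRA_DB_APPLICATION_TOKEN=AstraCS:example
ASTRA_DB_API_ENDPOINT=https://example-us-east-2.apps.astra.datastax.com
"""
print(parse_env_file(sample))
```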
@ -17,8 +17,8 @@ Visit the [Luna for Langflow](https://www.datastax.com/products/luna-langflow) p
Luna for Langflow support covers only the following software versions for Langflow.
Last updated: 2025-04-02
Last updated: May 1, 2025
## Core information
- **Langflow Version**: `1.3.x`
- **Langflow Version**: `1.4.x`
- **Python Version Required**: `>=3.10, <3.14`
@ -250,11 +250,15 @@ const config = {
to: "/components-custom-components",
from: "/components/custom",
},
{
to: "/mcp-server",
from: "/integrations-mcp",
},
{
to: "/deployment-kubernetes-dev",
from: [
"/deployment-kubernetes",
]
],
},
// add more redirects like this
// {
@ -39,6 +39,7 @@ module.exports = {
"Concepts/concepts-flows",
"Concepts/concepts-objects",
"Concepts/concepts-publish",
"Concepts/mcp-server",
"Concepts/concepts-file-management",
"Concepts/concepts-voice-mode",
],
@ -174,14 +175,6 @@ module.exports = {
type: "category",
label: "Integrations",
items: [
{
type: 'category',
label: 'MCP (Model Context Protocol)',
items: [
'Integrations/MCP/integrations-mcp',
'Integrations/MCP/mcp-component-astra',
],
},
"Integrations/Apify/integrations-apify",
{
type: "doc",
@ -194,6 +187,11 @@ module.exports = {
id: "Integrations/Composio/integrations-composio",
label: "Composio",
},
{
type: "doc",
id: "Integrations/mcp-component-astra",
label: "Astra DB MCP server",
},
{
type: 'category',
label: 'Google',