docs: Repurpose Concepts section to focus on the visual editor and flows (#8845)

* initial alignment with 1.5 pr

* finish 1st rewrite of visual editor overview page

* working on flows and welcome

* more visual editor work

* about langflow

* next steps

* lfoss-1395 more focus on agents mcp

* align with PRs

* working on publish flows page

* finish embedded chat section

* finish publish page

* peer review pt 1

* coderabbit nitpicks

* coderabbit nitpicks pt 2

* some updates

* almost done

* move all upgrade stuff to release notes page.

* fix link

* fix anchors

* fix details

* uncomment

* add import

* hide again
This commit is contained in:
April I. Murphy 2025-07-10 18:15:42 -07:00 committed by GitHub
commit 27f5416ec0
47 changed files with 1414 additions and 1017 deletions

View file

@ -16,7 +16,7 @@ To run flows in your apps, see [Flow trigger endpoints](/api-flows-run).
The `/build` endpoints support Langflow's frontend code for building flows in the Langflow Workspace.
You can use these endpoints to build vertices and flows, as well as execute flows with streaming event responses.
You might need to use or understand these endpoints when contributing to the Langflow project's core codebase.
You might need to use or understand these endpoints when contributing to the Langflow codebase.
## Build flow and stream events
@ -105,8 +105,8 @@ curl -X GET \
| inputs | object | Optional. Input values for flow components. |
| data | object | Optional. Flow data to override stored configuration. |
| files | array[string] | Optional. List of file paths to use. |
| start_component_id | string | Optional. ID of the component where the execution should start. Component `id` values can be found in [Langflow JSON files](/concepts-flows#langflow-json-file-contents) |
| stop_component_id | string | Optional. ID of the component where the execution should stop. Component `id` values can be found in [Langflow JSON files](/concepts-flows#langflow-json-file-contents).|
| start_component_id | string | Optional. ID of the component where the execution should start. Component `id` values can be found in [Langflow JSON files](/concepts-flows-import#langflow-json-file-contents). |
| stop_component_id | string | Optional. ID of the component where the execution should stop. Component `id` values can be found in [Langflow JSON files](/concepts-flows-import#langflow-json-file-contents).|
| log_builds | boolean | Optional. Control build logging. Default: `true`. |
### Set start and stop points
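A partial-execution request using these parameters might look like the following sketch. It assumes a running Langflow server, exported `$LANGFLOW_URL`, `$LANGFLOW_API_KEY`, and `$FLOW_ID` variables, and hypothetical component IDs:

```bash
# Build only part of the flow, from Chat Input through the model component.
# The component IDs here are placeholders; find real IDs in your flow's Langflow JSON file.
curl -X POST \
  "$LANGFLOW_URL/api/v1/build/$FLOW_ID/flow" \
  -H "Content-Type: application/json" \
  -H "x-api-key: $LANGFLOW_API_KEY" \
  -d '{
    "inputs": {"input_value": "Hello"},
    "start_component_id": "ChatInput-jFwUm",
    "stop_component_id": "OpenAIModel-OcXkl"
  }'
```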

View file

@ -20,7 +20,6 @@ There are two versions of the `/files` endpoints.
- `/v2` files are tracked in the Langflow database.
- `/v2` supports bulk upload and delete.
- `/v2` responses contain more descriptive metadata.
- `/v2` endpoints have more strict security, requiring authentication by an API key or JWT.
However, `/v2/files` doesn't support image files.
To send image files to your flows through the API, use [Upload image files (v1)](#upload-image-files-v1).
@ -62,55 +61,56 @@ curl -X POST \
### Upload image files (v1)
Send image files to the Langflow API for AI analysis.
Send image files to Langflow to use them in flows.
The default file limit is 100 MB. To configure this value, change the `LANGFLOW_MAX_FILE_SIZE_UPLOAD` environment variable.
For more information, see [Supported environment variables](/environment-variables#supported-variables).
The default file limit is 100 MB.
To change this limit, set the `LANGFLOW_MAX_FILE_SIZE_UPLOAD` [environment variable](/environment-variables).
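For example, to raise the limit before starting Langflow (a sketch assuming a local install, with the value in MB):

```bash
# Allow uploads up to 200 MB instead of the default 100 MB.
export LANGFLOW_MAX_FILE_SIZE_UPLOAD=200
langflow run
```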
1. To send an image to your flow with the API, POST the image file to the `v1/files/upload/<YOUR-FLOW-ID>` endpoint of your flow.
Replace **FILE_NAME** with the uploaded file name.
1. Attach the image to a `POST /v1/files/upload/$FLOW_ID` request with `--form` (`-F`) and the file path:
```bash
curl -X POST "$LANGFLOW_URL/api/v1/files/upload/a430cc57-06bb-4c11-be39-d3d4de68d2c4" \
-H "Content-Type: multipart/form-data" \
-H "x-api-key: $LANGFLOW_API_KEY" \
-F "file=@FILE_NAME.png"
```
```bash
curl -X POST "$LANGFLOW_URL/api/v1/files/upload/$FLOW_ID" \
-H "Content-Type: multipart/form-data" \
-H "x-api-key: $LANGFLOW_API_KEY" \
-F "file=@PATH/TO/FILE.png"
```
The API returns the image file path in the format `"file_path":"<YOUR-FLOW-ID>/<TIMESTAMP>_<FILE-NAME>"}`.
A successful request returns the `file_path` for the image in the Langflow file management system in the format `FLOW_ID/TIMESTAMP_FILENAME.TYPE`.
For example:
```json
{
"flowId": "a430cc57-06bb-4c11-be39-d3d4de68d2c4",
"file_path": "a430cc57-06bb-4c11-be39-d3d4de68d2c4/2024-11-27_14-47-50_image-file.png"
}
```
```json
{
"flowId": "a430cc57-06bb-4c11-be39-d3d4de68d2c4",
"file_path": "a430cc57-06bb-4c11-be39-d3d4de68d2c4/2024-11-27_14-47-50_image-file.png"
}
```
2. Post the image file to the **Chat Input** component of a **Basic prompting** flow.
Pass the file path value as an input in the **Tweaks** section of the curl call to Langflow.
Component `id` values can be found in [Langflow JSON files](/concepts-flows#langflow-json-file-contents).
2. Use the returned `file_path` to send the image file to other components that can accept file input. Where you specify the file path depends on the component type.
```bash
curl -X POST \
"$LANGFLOW_URL/api/v1/run/a430cc57-06bb-4c11-be39-d3d4de68d2c4?stream=false" \
-H 'Content-Type: application/json' \
-H "x-api-key: $LANGFLOW_API_KEY" \
-d '{
"output_type": "chat",
"input_type": "chat",
"tweaks": {
"ChatInput-b67sL": {
"files": "a430cc57-06bb-4c11-be39-d3d4de68d2c4/2024-11-27_14-47-50_image-file.png",
"input_value": "what do you see?"
}
}}'
```
The following example runs a [Basic Prompting flow](/basic-prompting), passing the image file and the query `describe this image` as input for the **Chat Input** component.
In this case, the file path is specified in `tweaks`.
Your chatbot describes the image file you sent.
```bash
curl -X POST \
"$LANGFLOW_URL/api/v1/run/a430cc57-06bb-4c11-be39-d3d4de68d2c4?stream=false" \
-H "Content-Type: application/json" \
-H "x-api-key: $LANGFLOW_API_KEY" \
-d '{
"output_type": "chat",
"input_type": "chat",
"tweaks": {
"ChatInput-b67sL": {
"files": "a430cc57-06bb-4c11-be39-d3d4de68d2c4/2024-11-27_14-47-50_image-file.png",
"input_value": "describe this image"
}
}
}'
```
```text
"text": "This flowchart appears to represent a complex system for processing financial inquiries using various AI agents and tools. Here's a breakdown of its components and how they might work together..."
```
:::tip
For help with tweaks, use the **Input Schema** in a flow's [**API access** pane](/concepts-publish#api-access).
Setting tweaks with **Input Schema** also automatically populates the required component IDs.
:::
### List files (v1)
@ -193,7 +193,7 @@ curl -X DELETE \
Use the `/files` endpoints to move files between your local machine and Langflow.
The `v2` endpoints require authentication by an API key or JWT.
The `/v2/files` endpoints can be authenticated by an API key or JWT.
To create a Langflow API key and export it as an environment variable, see [Get started with the Langflow API](/api-reference-api-examples).
### Upload file (v2)

View file

@ -12,6 +12,11 @@ To create, read, update, and delete flows, see [Flow management endpoints](/api-
## Run flow
:::tip
Langflow automatically generates Python, JavaScript, and curl code snippets for the `/v1/run/$FLOW_ID` endpoint for all flows.
For more information, see [Generate API code snippets](/concepts-publish#generate-api-code-snippets).
:::
Execute a specified flow by ID or name.
Flow IDs can be found on the code snippets on the [**API access** pane](/concepts-publish#api-access) or in a flow's URL.
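A minimal run request might look like the following sketch, assuming a running server and exported `$LANGFLOW_URL`, `$LANGFLOW_API_KEY`, and `$FLOW_ID` variables:

```bash
# Run a flow with chat input and output, without streaming the response.
curl -X POST \
  "$LANGFLOW_URL/api/v1/run/$FLOW_ID?stream=false" \
  -H "Content-Type: application/json" \
  -H "x-api-key: $LANGFLOW_API_KEY" \
  -d '{"input_value": "Hello", "output_type": "chat", "input_type": "chat"}'
```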

View file

@ -269,7 +269,7 @@ curl -X DELETE \
Exports specified flows to a ZIP file.
This endpoint downloads a ZIP file containing [Langflow JSON files](/concepts-flows#langflow-json-file-contents) for each flow ID listed in the request body.
This endpoint downloads a ZIP file containing [Langflow JSON files](/concepts-flows-import#langflow-json-file-contents) for each flow ID listed in the request body.
<Tabs>
<TabItem value="curl" label="curl" default>
@ -301,7 +301,7 @@ curl -X POST \
## Import flows
Imports flows by uploading a [Langflow-compatible JSON file](/concepts-flows#langflow-json-file-contents).
Imports flows by uploading a [Langflow-compatible JSON file](/concepts-flows-import#langflow-json-file-contents).
To specify a target project for the flow, include the query parameter `project_id`.
The target `project_id` must already exist before uploading a flow. Call the [/api/v1/projects/](/api-projects#read-projects) endpoint for a list of available projects.

View file

@ -6,7 +6,7 @@ slug: /api-projects
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
Use the `/projects` endpoint to create, read, update, and delete projects.
Use the `/projects` endpoint to create, read, update, and delete [Langflow projects](/concepts-flows#projects).
Projects store your flows and components.

View file

@ -59,23 +59,12 @@ You can configure the Langflow port number in the `LANGFLOW_PORT` [environment v
### Authentication
As of Langflow v1.5, all API requests require a Langflow API key, even when `AUTO_LOGIN` is enabled.
In Langflow versions 1.5 and later, most API endpoints require a Langflow API key, even when `AUTO_LOGIN` is set to `True`.
The only exceptions are the MCP endpoints `/v1/mcp`, `/v1/mcp-projects`, and `/v2/mcp`, which never require authentication.
The only exceptions are the MCP endpoints at `/v1/mcp`, `/v1/mcp-projects`, and `/v2/mcp`.
These endpoints don't require authentication, regardless of the `AUTO_LOGIN` setting.
You must provide a valid Langflow API key in either an `x-api-key` header or a query parameter.
To authenticate a Langflow API request, provide a Langflow API key in either an `x-api-key` header or query parameter.
For more information, see [API keys](/configuration-api-keys).
<details closed>
<summary>Auto-login and API key authentication in earlier Langflow versions</summary>
Prior to Langflow v1.5, when `AUTO_LOGIN` was enabled with `AUTO_LOGIN=true`, Langflow automatically logged users in as a superuser without requiring authentication, and API requests could be made without a Langflow API key.
If you set `SKIP_AUTH_AUTO_LOGIN=true` and `AUTO_LOGIN=true`, authentication will be skipped entirely, and API requests will not require a Langflow API key.
</details>
As with any API, follow industry best practices for storing and referencing sensitive credentials.
For example, you can [set environment variables](#set-environment-variables) for your API keys, and then reference those environment variables in your API requests.
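For example, a request authenticated with a key stored in an environment variable might look like this sketch (the key value and endpoint are placeholders):

```bash
# Store the key once, then reference the variable instead of pasting the literal value.
export LANGFLOW_API_KEY="YOUR_API_KEY"
curl -X GET \
  "$LANGFLOW_URL/api/v1/flows/" \
  -H "x-api-key: $LANGFLOW_API_KEY"
```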

View file

@ -175,4 +175,4 @@ Your flow should be visible in the response as a tool.
The connected flow returns an answer based on your question.
For example, a Basic Prompting flow connected as a tool returns a different result depending upon its LLM and prompt instructions.
![Run Flow as tool connected to agnet](/img/agent-example-run-flow-as-tool.png)
![Run Flow as tool connected to an agent](/img/agent-example-run-flow-as-tool.png)

View file

@ -39,7 +39,7 @@ To allow agents to use tools from MCP servers, use the [**MCP Tools** component]
When you attach a component as a tool, you must configure the component as a tool by enabling **Tool Mode**.
For more information, see [Configure tools for agents](/agents-tools)
For more information, see [Configure tools for agents](/agents-tools).
## Use the Agent component in a flow
@ -77,7 +77,7 @@ Web Search & Content Fetching: I can fetch and summarize content from web pages,
News Search: I can search for recent news articles using Google News via RSS feeds.
Calculator: I can perform arithmetic calculations and evaluate mathematical expressions.
Date & Time: I can provide the current date and time in various time zones.
These tools help me provide up-to-date information, perform calculations, and retrieve specific data from the internet when needed. If you have a specific question, let me know, and Ill use the most appropriate tool(s) to help!
These tools help me provide up-to-date information, perform calculations, and retrieve specific data from the internet when needed. If you have a specific question, let me know, and I'll use the most appropriate tool(s) to help!
```
9. Ask the agent, `Summarize today's tech news`.
@ -88,4 +88,4 @@ Connect more tools to solve more specialized problems.
## See also
* [Configure tools for agents](/agents-tools)
* [Configure tools for agents](/agents-tools)

View file

@ -64,7 +64,7 @@ Prior to Langflow 1.5, this component was named **MCP connection**.
The **MCP tools** component connects to a [Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) server and exposes the MCP server's tools as tools for Langflow agents.
In addition to being an MCP client that can leverage MCP servers, the **MCP tools** component's [SSE mode](/mcp-client#mcp-sse-mode) allows you to connect your flow to the Langflow MCP server at the `/api/v1/mcp/sse` API endpoint, exposing all flows within your [project](/concepts-overview#projects) as tools within a flow.
In addition to being an MCP client that can leverage MCP servers, the **MCP tools** component's [SSE mode](/mcp-client#mcp-sse-mode) allows you to connect your flow to the Langflow MCP server at the `/api/v1/mcp/sse` API endpoint, exposing all flows within your [project](/concepts-flows#projects) as tools within a flow.
For more information, see [MCP client](/mcp-client).

View file

@ -249,26 +249,28 @@ Your request is answered.
A new chat session called `docker-question-on-m1` has appeared, using your unique `session_id`.
7. To modify additional parameters with **Tweaks** for your **Chat Input** and **Chat Output** components, click **Share**, and then click **API access**.
8. Click **Input schema** to modify parameters in the component's `data` object.
For example, disabling storing messages from the **Chat Input** component adds a **Tweak** to your command:
```text
curl --request POST \
--url "http://LANGFLOW_SERVER_ADDRESS/api/v1/run/FLOW_ID" \
--header "Content-Type: application/json" \
--header "x-api-key: LANGFLOW_API_KEY" \
--data '{
"input_value": "Text to input to the flow",
"output_type": "chat",
"input_type": "chat",
"tweaks": {
"ChatInput-4WKag": {
"should_store_message": false
}
}
}'
```
To confirm your command is using the tweak, navigate to the **Logs** pane and view the request from the **Chat Input** component.
The value for `should_store_message` is `false`.
For example, disabling storing messages from the **Chat Input** component adds a **Tweak** to your command:
```text
curl --request POST \
--url "http://LANGFLOW_SERVER_ADDRESS/api/v1/run/FLOW_ID" \
--header "Content-Type: application/json" \
--header "x-api-key: LANGFLOW_API_KEY" \
--data '{
"input_value": "Text to input to the flow",
"output_type": "chat",
"input_type": "chat",
"tweaks": {
"ChatInput-4WKag": {
"should_store_message": false
}
}
}'
```
9. To confirm your command is using the tweak, navigate to the **Logs** pane, and then view the request from the **Chat Input** component.
Given the preceding example, the value for `should_store_message` should be `false`.
## See also

View file

@ -9,7 +9,7 @@ import Icon from "@site/src/components/icon";
Langflow integrates with the [Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) as both an MCP server and an MCP client.
This page describes how to use Langflow as an MCP client with the [MCP Tools](#use-the-mcp-tools-component) component and the [MCP servers](#manage-mcp-connections) page in **Settings**.
This page describes how to use Langflow as an MCP client with the [**MCP Tools** component](#use-the-mcp-tools-component) component and your [connected MCP servers](#manage-connected-mcp-servers).
For information about using Langflow as an MCP server, see [Use Langflow as an MCP server](/mcp-server).
@ -19,10 +19,10 @@ The **MCP Tools** component connects to a [Model Context Protocol (MCP)](https:/
This component has two modes, depending on the type of server you want to access:
* To access tools provided by external, non-Langflow MCP servers, [use JSON](#mcp-stdio-mode) or [Stdio mode](#mcp-stdio-mode).
* To use flows from your [Langflow projects](/concepts-overview#projects) as MCP tools, [use SSE mode](#mcp-sse-mode).
* [Connect to a non-Langflow MCP server](#mcp-stdio-mode) with a JSON configuration file, server start command, or SSE URL to access tools provided by external, non-Langflow MCP servers.
* [Connect to a Langflow MCP server](#mcp-sse-mode) to use flows from your [Langflow projects](/concepts-flows#projects) as MCP tools.
### Use Stdio mode {#mcp-stdio-mode}
### Connect to a non-Langflow MCP server {#mcp-stdio-mode}
1. Add an **MCP Tools** component to your flow.
@ -67,7 +67,7 @@ For example, if you use `mcp-server-fetch` with the `fetch` tool, you could ask
8. If you want the agent to be able to use more tools, repeat these steps to add more **Tools** components with different servers or tools.
### Use SSE mode {#mcp-sse-mode}
### Connect a Langflow MCP server {#mcp-sse-mode}
Every Langflow project runs a separate MCP server that exposes the project's flows as MCP tools.
For more information about your projects' MCP servers, including how to manage exposed flows, see [Use Langflow as an MCP server](/mcp-server).

View file

@ -92,7 +92,7 @@ For example, the **Prompt** component accepts inputs within curly braces, and ne
Some components include dropdown menus to select the type of output sent to the next component.
For example, the **Language Model** component includes **Model Response** or **Language Model** outputs.
The **Model Response** output sends a [Message](/concepts-objects#message) output on to another Message port.
The **Model Response** output sends a [Message](/concepts-objects#message-object) output on to another Message port.
The **Language Model** output can be connected to components like [Structured output](/components-processing#structured-output) to use the LLM to power the component's reasoning.
### Port colors
@ -120,6 +120,8 @@ The following table lists the component port colors and their corresponding inpu
## Component code
You can edit components in the visual editor and in code. When editing a flow, select a component, and then click <Icon name="Code" aria-hidden="true"/> **Code** to see and edit the component's underlying Python code.
All components have underlying code that determines how you configure them and what actions they can perform.
In the context of creating and running flows, component code does the following:

View file

@ -44,6 +44,10 @@ For an example of using the **File** component in a flow, see the [Document QA t
If you prefer a one-time upload, the [File](/components-data#file) component still allows one-time uploads directly from your local machine.
:::
## Send files to a flow with the Langflow API
For information on file management with the Langflow API, see [Files endpoints](/api-files).
## Supported file types
The maximum supported file size is 100 MB.

View file

@ -0,0 +1,189 @@
---
title: Import and export flows
slug: /concepts-flows-import
---
import Icon from "@site/src/components/icon";
You can export flows to transfer them between Langflow instances, share them with others, or create backups.
## Export a flow
There are three ways to export a flow:
* From the **Projects** page, find the flow you want to export, click <Icon name="Ellipsis" aria-hidden="true" /> **More**, and then select **Export**.
* When editing a flow, click **Share**, and then click **Export**.
* Use the Langflow API [`/flows/download`](/api-flows#export-flows) endpoint.
An exported flow is downloaded to your local machine as a JSON file named `FLOW_NAME.json`.
For more information, see [Langflow JSON file contents](#langflow-json-file-contents).
### Save with my API keys
When exporting from the Langflow UI, you can select **Save with my API keys** to export the flow _and_ any defined API key variables.
Non-API key variables are included in the export regardless of the **Save with my API keys** setting.
:::warning
If you directly entered the key value into a component's API key field, then **Save with my API keys** exports the literal key value.
If your key is stored in a Langflow global variable, **Save with my API keys** exports only the variable name.
:::
When you or another user import the flow to another Langflow instance, that instance must have Langflow global variables with the same names and valid values in order to run the flow successfully.
If any variables are missing or invalid, those variables must be created or edited after importing the flow.
### Export all flows
If you want to export all flows within a project, do either of the following:
* Go to the **Projects** page, find the project you want to export, click <Icon name="Ellipsis" aria-hidden="true" /> **Options**, and then select **Download**.
* Use the Langflow API [`/projects/download`](/api-projects#export-a-project) endpoint.
The project's flows are downloaded as JSON files in a zip archive.
## Import a flow
You can import Langflow JSON files from your local machine in the following ways:
* From the **Projects** page, click <Icon name="Upload" aria-hidden="true"/> **Upload a flow**.
* Drag and drop Langflow JSON files from your file explorer into your Langflow window to import a flow from any Langflow page.
* Use the Langflow API [`/flows/upload/`](/api-flows#import-flows) endpoint to upload one JSON file.
* Use the Langflow API [`/projects/upload`](/api-projects#import-a-project) endpoint to upload a Langflow project zip file.
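For example, uploading a single flow JSON file into a target project might look like this sketch, assuming exported `$LANGFLOW_URL`, `$LANGFLOW_API_KEY`, and `$PROJECT_ID` variables and a local `flow.json` file:

```bash
# Import one flow JSON file into the project identified by $PROJECT_ID.
curl -X POST \
  "$LANGFLOW_URL/api/v1/flows/upload/?project_id=$PROJECT_ID" \
  -H "Content-Type: multipart/form-data" \
  -H "x-api-key: $LANGFLOW_API_KEY" \
  -F "file=@flow.json"
```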
### Run an imported flow
Once imported, your flow is ready to use.
If the flow contains any global variables, make sure your Langflow instance has global variables with the same names and valid values.
For more information, see [Save with my API keys](/concepts-flows-import#save-with-my-api-keys).
## Langflow JSON file contents
An exported flow is downloaded to your local machine as a JSON file named `FLOW_NAME.json`.
Langflow JSON files contain [nodes](#nodes) and [edges](#edges) that describe components and connections, and [additional metadata](#additional-metadata-and-project-information) that describe the flow.
For an example Langflow JSON file, examine the [Basic Prompting.json](https://github.com/langflow-ai/langflow/blob/main/src/backend/base/langflow/initial_setup/starter_projects/Basic%20Prompting.json) file in the Langflow repository.
### Nodes
Nodes represent the components that make up the flow.
For example, this object represents a **Chat Input** component:
```json
{
"data": {
"description": "Get chat inputs from the Playground.",
"display_name": "Chat Input",
"id": "ChatInput-jFwUm",
"node": {
"base_classes": ["Message"],
"description": "Get chat inputs from the Playground.",
"display_name": "Chat Input",
"icon": "MessagesSquare",
"template": {
"input_value": {
"display_name": "Text",
"info": "Message to be passed as input.",
"value": "Hello"
},
"sender": {
"value": "User",
"options": ["Machine", "User"]
},
"sender_name": {
"value": "User"
},
"should_store_message": {
"value": true
}
}
},
"type": "ChatInput"
},
"position": {
"x": 689.5720422421635,
"y": 765.155834131403
}
}
```
Each node has a unique identifier in the format of `NODE_NAME-UUID`, such as `ChatInput-jFwUm`.
Entrypoint nodes, such as the `ChatInput` node, are executed first when running a flow.
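Because component `id` values are needed for API parameters such as `start_component_id` and `tweaks`, it can be handy to list them programmatically. A minimal Python sketch, using a trimmed, hypothetical inline structure rather than a real exported file:

```python
import json

# A trimmed, hypothetical Langflow JSON structure containing two nodes.
flow_json = """
{
  "data": {
    "nodes": [
      {"data": {"id": "ChatInput-jFwUm", "type": "ChatInput"}},
      {"data": {"id": "OpenAIModel-OcXkl", "type": "OpenAIModel"}}
    ]
  }
}
"""

flow = json.loads(flow_json)
# Collect every component's unique NODE_NAME-UUID identifier.
node_ids = [node["data"]["id"] for node in flow["data"]["nodes"]]
print(node_ids)  # ['ChatInput-jFwUm', 'OpenAIModel-OcXkl']
```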
### Edges
Edges represent the connections between nodes.
The connection between the `ChatInput` node and the `OpenAIModel` node is represented as an edge:
```json
{
"className": "",
"data": {
"sourceHandle": {
"dataType": "ChatInput",
"id": "ChatInput-jFwUm",
"name": "message",
"output_types": ["Message"]
},
"targetHandle": {
"fieldName": "input_value",
"id": "OpenAIModel-OcXkl",
"inputTypes": ["Message"],
"type": "str"
}
},
"id": "reactflow__edge-ChatInput-jFwUm{œdataTypeœ:œChatInputœ,œidœ:œChatInput-jFwUmœ,œnameœ:œmessageœ,œoutput_typesœ:[œMessageœ]}-OpenAIModel-OcXkl{œfieldNameœ:œinput_valueœ,œidœ:œOpenAIModel-OcXklœ,œinputTypesœ:[œMessageœ],œtypeœ:œstrœ}",
"source": "ChatInput-jFwUm",
"sourceHandle": "{œdataTypeœ: œChatInputœ, œidœ: œChatInput-jFwUmœ, œnameœ: œmessageœ, œoutput_typesœ: [œMessageœ]}",
"target": "OpenAIModel-OcXkl",
"targetHandle": "{œfieldNameœ: œinput_valueœ, œidœ: œOpenAIModel-OcXklœ, œinputTypesœ: [œMessageœ], œtypeœ: œstrœ}"
}
```
This edge shows that the `ChatInput` component outputs a `Message` type to the `target` node, which is the `OpenAIModel` node.
The `OpenAIModel` component accepts the `Message` type at the `input_value` field.
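The same source/target structure can be read programmatically. A short Python sketch with a trimmed, hypothetical edge object:

```python
import json

# One trimmed, hypothetical edge from a Langflow JSON file.
edge_json = """
{
  "data": {
    "sourceHandle": {"id": "ChatInput-jFwUm", "name": "message", "output_types": ["Message"]},
    "targetHandle": {"id": "OpenAIModel-OcXkl", "fieldName": "input_value", "inputTypes": ["Message"]}
  },
  "source": "ChatInput-jFwUm",
  "target": "OpenAIModel-OcXkl"
}
"""

edge = json.loads(edge_json)
# Describe the connection as (source node, target field, target node).
connection = (edge["source"], edge["data"]["targetHandle"]["fieldName"], edge["target"])
print(connection)  # ('ChatInput-jFwUm', 'input_value', 'OpenAIModel-OcXkl')
```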
### Additional metadata and project information
Additional information about the flow is stored in the root `data` object.
* Metadata and project information including the name, description, and `last_tested_version` of the flow.
```json
{
"name": "Basic Prompting",
"description": "Perform basic prompting with an OpenAI model.",
"tags": ["chatbots"],
"id": "1511c230-d446-43a7-bfc3-539e69ce05b8",
"last_tested_version": "1.0.19.post2",
"gradient": "2",
"icon": "Braces"
}
```
* Visual information about the flow defining the initial position of the flow in the workspace.
```json
"viewport": {
"x": -37.61270157375441,
"y": -155.91266341888854,
"zoom": 0.7575251406952855
}
```
* Notes are comments that help you understand the flow within the workspace.
They may contain links, code snippets, and other information.
Notes are written in Markdown and stored as `node` objects.
```json
{
"id": "undefined-kVLkG",
"node": {
"description": "## 📖 README\nPerform basic prompting with an OpenAI model.\n\n#### Quick Start\n- Add your **OpenAI API key** to the **OpenAI Model**\n- Open the **Playground** to chat with your bot.\n..."
}
}
```
## See also
* [Build flows](/concepts-flows)
* [Share and embed flows](/concepts-publish)

View file

@ -1,169 +1,112 @@
---
title: Flows
title: Build flows
slug: /concepts-flows
---
import Icon from "@site/src/components/icon";
Flows in Langflow are fully serializable and can be saved and loaded from the file system. In this guide, we'll explore how to import and export flows.
A _flow_ is a functional representation of an application workflow.
Flows receive input, process it, and produce output.
## Import flow
Flows consist of _components_ that represent individual steps in your application's workflow.
If you already have a Langflow JSON file on your local machine, from the **Projects** page, click <Icon name="Upload" aria-hidden="true"/> **Upload a flow**.
![Basic prompting flow within the Workspace](/img/workspace-basic-prompting.png)
Once imported, your flow is ready to use.
Langflow flows are fully serializable and can be saved and loaded from the file system where Langflow is installed.
:::tip
You can drag and drop Langflow JSON files directly from your file system into the Langflow window to import a flow, even into the initial Langflow splash screen.
To try building and running a flow in a few minutes, see the [Langflow quickstart](/get-started-quickstart).
:::
## Export a flow
## Create a flow
You can export flows to transfer flows between Langflow instances or save backups of your flows.
There are four ways to create a flow in the Langflow UI:
An exported flow is downloaded to your local machine as a JSON file named `FLOW_NAME.json`.
* **Create a blank flow**: From the [**Projects** page](#projects), select a project, and then click **New Flow**.
* **Create a flow from a template**: From the [**Projects** page](#projects), select a project, click **New Flow**, and then select a template.
* **Duplicate an existing flow**: From the [**Projects** page](#projects), locate the flow you want to copy, click <Icon name="Ellipsis" aria-hidden="true" /> **More**, and then select **Duplicate**.
* **Import a flow**: See [Import and export flows](/concepts-flows-import).
There are three ways to export a flow:
You can also create a flow with the [Langflow API](/api-flows), but the Langflow team recommends using the visual editor until you are familiar with flow creation.
* From the **Projects** page, find the flow you want to export, click <Icon name="Ellipsis" aria-hidden="true" /> **More**, and then select **Export**.
* When editing a flow, click **Share**, and then click **Export**.
* Use the Langflow API [`/flows/download`](/api-flows#export-flows) endpoint.
### Add components
When exporting from the Langflow UI, you can select **Save with my API keys** to export the flow _and_ any defined API key variables.
Flows consist of [components](/concepts-components), which are nodes that you configure and connect in the Langflow [visual editor](/concepts-overview).
Each component performs a specific task, like serving an AI model or connecting a data source.
:::warning
If you directly entered the key value into a component's API key field, then **Save with my API keys** exports the literal key value.
Drag and drop components from the **Components** menu to add them to your flow.
Then, configure the component settings and connect the components together.
If your key is stored in a Langflow global variable, **Save with my API keys** exports only the variable name.
:::
![Chat input and output connected to Language model component](/img/connect-component.png)
Non-API key variables are included in the export regardless of the **Save with my API keys** setting.
Each component has configuration settings and options. Some of these are common to all components, and some are unique to specific components.
When you or another user import the flow to another Langflow instance, that instance must have Langflow global variables with the same names and valid values in order to run the flow successfully.
If any variables are missing or invalid, those variables must be created or edited after importing the flow.
To form a cohesive flow, you connect components by _edges_ or _ports_, which have a specific data type they receive or send.
For example, message ports send text strings between components.
For more information about component configuration, including port types and underlying component code, see [Components overview](/concepts-components).
## Langflow JSON file contents
### Run a flow
Langflow JSON files contain [nodes](#nodes) and [edges](#edges) that describe components and connections, and [additional metadata](#additional-metadata-and-project-information) that describe the flow.
After you build a flow, you can test it in the [**Playground**](/concepts-playground), and then [publish your flow](/concepts-publish) to embed or share your flow.
For more information about application development with Langflow, see [Develop an application with Langflow](/develop-application).
For an example Langflow JSON file, examine the [Basic Prompting.json](https://github.com/langflow-ai/langflow/blob/main/src/backend/base/langflow/initial_setup/starter_projects/Basic%20Prompting.json) file in the Langflow repository.
If you need to build Langflow as a dependency of an application or deploy a Langflow server for API access over the public internet, see [Langflow deployment overview](/deployment-overview).
### Nodes
#### Flow graphs
**Nodes** represent the components that make up the flow.
When a flow runs, Langflow builds a Directed Acyclic Graph (DAG) object from the nodes (components) and edges (connections), and the nodes are sorted to determine the order of execution.
The `ChatInput` node is the entry point of the flow. It's the first node that will be executed.
The graph build calls each component's `build` method to validate and prepare the nodes.
This graph is then processed in dependency order.
Each node is built and executed sequentially, with results from each built node being passed to nodes that are dependent on that node's results.
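This dependency-ordered execution can be sketched with Python's standard library, using hypothetical node names in place of real component IDs:

```python
from graphlib import TopologicalSorter

# Map each node to the set of nodes it depends on (its predecessors in the DAG).
dependencies = {
    "ChatInput": set(),           # entry point: no dependencies
    "Prompt": {"ChatInput"},
    "OpenAIModel": {"Prompt"},
    "ChatOutput": {"OpenAIModel"},
}

# static_order() yields each node only after all of its dependencies.
order = list(TopologicalSorter(dependencies).static_order())
print(order)  # ['ChatInput', 'Prompt', 'OpenAIModel', 'ChatOutput']
```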
`ChatInput-jFwUm` is a unique identifier for the node.
## Manage flows in projects {#projects}
```json
{
"data": {
"description": "Get chat inputs from the Playground.",
"display_name": "Chat Input",
"id": "ChatInput-jFwUm",
"node": {
"base_classes": ["Message"],
"description": "Get chat inputs from the Playground.",
"display_name": "Chat Input",
"icon": "MessagesSquare",
"template": {
"input_value": {
"display_name": "Text",
"info": "Message to be passed as input.",
"value": "Hello"
},
"sender": {
"value": "User",
"options": ["Machine", "User"]
},
"sender_name": {
"value": "User"
},
"should_store_message": {
"value": true
}
}
},
"type": "ChatInput"
},
"position": {
"x": 689.5720422421635,
"y": 765.155834131403
}
}
```
The **Projects** page is where you arrive when you launch Langflow.
It is where you view and manage flows on a high level.
### Edges
Langflow projects are like folders that you can use to organize related flows.
The default project is **Starter Project**, and your flows are stored here unless you create another project.
To create a project, click <Icon name="Plus" aria-hidden="true"/> **Create new project**.
**Edges** represent the connections between nodes.
![Projects page with multiple flows in a project](/img/my-projects.png)
The connection between the `ChatInput` node and the `OpenAIModel` node is represented as an edge:
From the **Projects** page, you can manage flows within each of your projects:
* **View flows in a project**: Select the project name in the **Projects** list.
* **Create flows**: See [Create a flow](#create-a-flow).
* **Edit a flow's name and description**: Locate the flow you want to edit, click <Icon name="Ellipsis" aria-hidden="true" /> **More**, and then select **Edit details**.
* **Delete a flow**: Locate the flow you want to delete, click <Icon name="Ellipsis" aria-hidden="true" /> **More**, and then select **Delete**.
* **Serve flows as MCP tools**: See [Use Langflow as an MCP server](/mcp-server).
```json
{
"className": "",
"data": {
"sourceHandle": {
"dataType": "ChatInput",
"id": "ChatInput-jFwUm",
"name": "message",
"output_types": ["Message"]
},
"targetHandle": {
"fieldName": "input_value",
"id": "OpenAIModel-OcXkl",
"inputTypes": ["Message"],
"type": "str"
}
},
"id": "reactflow__edge-ChatInput-jFwUm{œdataTypeœ:œChatInputœ,œidœ:œChatInput-jFwUmœ,œnameœ:œmessageœ,œoutput_typesœ:[œMessageœ]}-OpenAIModel-OcXkl{œfieldNameœ:œinput_valueœ,œidœ:œOpenAIModel-OcXklœ,œinputTypesœ:[œMessageœ],œtypeœ:œstrœ}",
"source": "ChatInput-jFwUm",
"sourceHandle": "{œdataTypeœ: œChatInputœ, œidœ: œChatInput-jFwUmœ, œnameœ: œmessageœ, œoutput_typesœ: [œMessageœ]}",
"target": "OpenAIModel-OcXkl",
"targetHandle": "{œfieldNameœ: œinput_valueœ, œidœ: œOpenAIModel-OcXklœ, œinputTypesœ: [œMessageœ], œtypeœ: œstrœ}"
}
```
## Flow storage
This edge shows that the `ChatInput` component outputs a `Message` type to the `target` node, which is the `OpenAIModel` node.
The `OpenAIModel` component accepts the `Message` type at the `input_value` field.
Flows and [flow logs](#flow-logs) are stored on local disk at the following default locations:
### Additional metadata and project information
- **Linux and WSL**: `home/<username>/.cache/langflow/`
- **macOS**: `/Users/<username>/Library/Caches/langflow/`
- **Windows**: `%LOCALAPPDATA%\langflow\langflow\Cache`
Additional information about the flow is stored in the root `data` object.
The flow storage location can be customized with the [`LANGFLOW_CONFIG_DIR`](/environment-variables#LANGFLOW_CONFIG_DIR) environment variable.
* Metadata and project information including the name, description, and `last_tested_version` of the flow.
```json
{
"name": "Basic Prompting",
"description": "Perform basic prompting with an OpenAI model.",
"tags": ["chatbots"],
"id": "1511c230-d446-43a7-bfc3-539e69ce05b8",
"last_tested_version": "1.0.19.post2",
"gradient": "2",
"icon": "Braces"
}
```
## Flow logs
* Visual information about the flow defining the initial position of the flow in the workspace.
```json
"viewport": {
"x": -37.61270157375441,
"y": -155.91266341888854,
"zoom": 0.7575251406952855
}
```
When viewing a flow in the **Workspace**, click **Logs** to examine logs for that flow and its components.
**Notes** are like comments to help you understand the flow within the workspace.
They may contain links, code snippets, and other information.
Notes are written in Markdown and stored as `node` objects.
```json
{
"id": "undefined-kVLkG",
"node": {
"description": "## 📖 README\nPerform basic prompting with an OpenAI model.\n\n#### Quick Start\n- Add your **OpenAI API key** to the **OpenAI Model**\n- Open the **Playground** to chat with your bot.\n..."
}
}
```
![Logs pane](/img/logs.png)
Langflow logs are stored in `.log` files in the same place as your flows.
For filepaths, see [Flow storage](/concepts-flows#flow-storage).
The flow storage location can be customized with the [`LANGFLOW_CONFIG_DIR`](/environment-variables#LANGFLOW_CONFIG_DIR) environment variable:
1. Add `LANGFLOW_LOG_FILE=path/to/logfile.log` in your `.env` file.
An example `.env` file is available in the [Langflow repository](https://github.com/langflow-ai/langflow/blob/main/.env.example).
2. Start Langflow with the values from your `.env` file by running `uv run langflow run --env-file .env`.
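To explore a flow's nodes and edges outside the visual editor, you can parse an exported Langflow JSON file with a short script. The following is a sketch against a minimal inline document modeled on the node and edge examples on this page; a real export, such as `Basic Prompting.json`, contains many more fields.

```python
import json

# Minimal flow JSON modeled on the examples on this page (hypothetical IDs).
# In practice, load this from an exported file with open(...) / json.load(...).
flow_json = """
{
  "name": "Basic Prompting",
  "data": {
    "nodes": [
      {"id": "ChatInput-jFwUm", "data": {"type": "ChatInput"}},
      {"id": "OpenAIModel-OcXkl", "data": {"type": "OpenAIModel"}}
    ],
    "edges": [
      {"source": "ChatInput-jFwUm", "target": "OpenAIModel-OcXkl"}
    ]
  }
}
"""

flow = json.loads(flow_json)
nodes = flow["data"]["nodes"]
edges = flow["data"]["edges"]

print(f"Flow: {flow['name']}")
for node in nodes:
    # Each node id combines the component type and a unique suffix.
    print(f"  node: {node['id']} ({node['data']['type']})")
for edge in edges:
    print(f"  edge: {edge['source']} -> {edge['target']}")
```

A script like this can help you audit which components and connections a flow uses before importing it into another environment.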
## See also
* [Share and embed flows](/concepts-publish)
* [Import and export flows](/concepts-flows-import)

---
title: Use the visual editor
slug: /concepts-overview
---

import Icon from "@site/src/components/icon";

You use Langflow's visual editor to create, test, and share flows, which are functional representations of application workflows.
Flows consist of components that represent individual steps in your application's workflow.

Langflow's drag-and-drop interface allows you to create complex AI workflows without writing extensive code.
You can connect different resources, including prompts, large language models (LLMs), data sources, agents, MCP servers, and other tools and integrations.

![Basic prompting flow in the workspace](/img/workspace-basic-prompting.png)

When building a [flow](/concepts-flows), you primarily interact with the **Workspace**.
This is where you add [components](/concepts-components), configure them, and attach them together.

![Chat input and output connected to Language model component](/img/connect-component.png)

**Message** handles send text strings between components, so these components send text to each other.
Additional data types include **Data** (<Icon name="Circle" size="16" aria-label="A red circle on the side of a component" style={{ color: '#ef4444', fill: '#ef4444' }}/>) and **DataFrame** (<Icon name="Circle" size="16" aria-label="A red circle on the side of a component" style={{ color: '#d72670', fill: '#d72670' }}/>).
For more information, see [Components](/concepts-components).

From the **Workspace**, you can also access the [**Playground**](#playground), [**Share** menu](#share-menu), and [**Logs**](/concepts-flows#flow-logs).

:::tip
To try building and running a flow in a few minutes, see the [Langflow quickstart](/get-started-quickstart).
:::

### Workspace gestures and interactions

- To pan horizontally and vertically, click and drag an empty area of the workspace.
- To rearrange components visually, click and drag the components.
  To change the programmatic relationship between components, you must manipulate the component _edges_ or _ports_. For more information, see [Components overview](/concepts-components).
- To lock the visual position of the components, click <Icon name="LockOpen" aria-hidden="true"/> **Lock**.
- To zoom, use any of the following options:
  - Scroll up or down on the mouse or trackpad.
  - Click <Icon name="ZoomIn" aria-hidden="true"/> **Zoom In** or <Icon name="ZoomOut" aria-hidden="true"/> **Zoom Out**.
  - Click <Icon name="Maximize" aria-hidden="true"/> **Fit To Zoom** to scale the zoom level to show the entire flow.
- To add a text box for non-functional notes and comments, click <Icon name="StickyNote" aria-hidden="true"/> **Add Note**.

## Playground

From the **Workspace**, click <Icon name="Play" aria-hidden="true"/> **Playground** to test your flow.

If your flow has a **Chat Input** component, you can use the **Playground** to run your flow, chat with your flow, view inputs and outputs, and modify your AI's memories to tune your responses in real time.
For example, if your flow has **Chat Input**, **Language Model**, and **Chat Output** components, then you can chat with the LLM in the **Playground** to test the flow.
To try this for yourself, you can use the [**Basic Prompting** template](/basic-prompting).

![Playground window](/img/playground.png)

If you have an **Agent** component in your flow, the **Playground** displays its tool calls and outputs so you can monitor the agent's tool use and understand the reasoning behind its responses.
To try this for yourself, you can use the [**Simple Agent** template](/simple-agent).

![Playground window with agent response](/img/playground-with-agent.png)

For more information, see [Playground](/concepts-playground).

## Share {#share-menu}

The **Share** menu provides the following options for integrating your flow into external applications:

* [**API access**](/concepts-publish#api-access): Integrate your flow into your applications with automatically-generated Python, JavaScript, and curl code snippets.
* [**Export**](/concepts-flows-import#export-a-flow): Export your flow to your local machine as a JSON file.
* [**MCP Server**](/mcp-server): Expose your flow as a tool for MCP-compatible clients.
* [**Embed into site**](/concepts-publish#embedded-chat-widget): Embed your flow in HTML, React, or Angular applications.
* [**Shareable Playground**](/concepts-playground#share-a-flows-playground): Share your **Playground** interface with another user. This is specifically for sharing the **Playground** experience; it isn't for running a flow in a production application. The **Shareable Playground** isn't available for Langflow Desktop.

## See also

* [Manage files in Langflow](/concepts-file-management)
* [Global variables](/configuration-global-variables)
* [Langflow API keys](/configuration-api-keys)

---
title: Use the Playground
slug: /concepts-playground
---
import Icon from "@site/src/components/icon";
<!-- TODO: Align and minimize duplications of playground content on the /concepts-overview, /about-langflow, and other pages -->
The **Playground** is a dynamic interface designed for real-time interaction with LLMs, allowing users to chat, access memories, and monitor inputs and outputs. Here, users can directly prototype their models, making adjustments and observing different outcomes.
As long as you have a [Chat Input](/components-io) component in your flow, you can run the flow in the **Playground** test environment.
## Run a flow in the Playground

To run a flow in the **Playground**, open the flow, and then click **Playground**.
![Playground window](/img/playground.png)
If your flow has **Chat Input**, **Language Model**, and **Chat Output** components, you can chat with the LLM in the **Playground**.
If your flow has an **Agent** component, the **Playground** prints the tools used by the agent and the output from each tool.
This helps you monitor the agent's tool use and understand the logic behind the agent's responses.
For example, the following agent used a connected `fetch_content` tool to perform a web search:
![Playground window with agent response](/img/playground-with-agent.png)
When you run a flow in the **Playground**, Langflow calls the `/build/{flow_id}/flow` endpoint.
The `build` function allows components to execute logic at runtime. For example, the [Recursive character text splitter](https://github.com/langflow-ai/langflow/blob/main/src/backend/base/langflow/components/langchain_utilities/recursive_character.py) is a child of the `LCTextSplitterComponent` class. When text needs to be processed, the parent class's `build` method is called, which creates a `RecursiveCharacterTextSplitter` object and uses it to split the text according to the defined parameters. The split text is then passed on to the next component. This all occurs when the component is built.
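As a rough illustration of that recursive strategy, here is a simplified, dependency-free sketch. It is not Langflow's or LangChain's actual implementation, but it shows the core idea: try the coarsest separator first, recurse with finer separators for any piece that is still too long, and merge adjacent pieces back up to the chunk size.

```python
def recursive_split(text, chunk_size=60, separators=("\n\n", "\n", " ")):
    """Recursively split text, preferring coarse separators before finer ones."""
    if len(text) <= chunk_size:
        return [text]
    if not separators:
        # No separators left: hard-split by size.
        return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]
    sep, *rest = separators
    pieces = []
    for piece in text.split(sep):
        if len(piece) <= chunk_size:
            pieces.append(piece)
        else:
            # Piece is still too long: retry with the next, finer separator.
            pieces.extend(recursive_split(piece, chunk_size, tuple(rest)))
    # Greedily merge adjacent pieces so chunks approach chunk_size.
    chunks, current = [], ""
    for piece in pieces:
        candidate = (current + sep + piece) if current else piece
        if len(candidate) <= chunk_size:
            current = candidate
        else:
            chunks.append(current)
            current = piece
    if current:
        chunks.append(current)
    return chunks

text = ("Langflow splits long documents before embedding them.\n\n"
        "This paragraph is deliberately much longer so that it cannot fit "
        "into a single chunk and must be split on word boundaries instead.")
chunks = recursive_split(text, chunk_size=60)
for chunk in chunks:
    print(repr(chunk))
```

The real component exposes the same knobs as parameters, such as the chunk size and the separator list.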
## View Playground messages by session ID
When you send a message from the **Playground** interface, the interactions are stored in the **Message Logs** by `session_id`.
A single flow can have multiple chats, and different flows can share the same chat. Each chat session has a different `session_id`.
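When calling a flow with the Langflow API, you can pin each conversation to its own `session_id` in the request payload. The following sketch only builds the requests; the URL, flow ID, and API key are placeholders for your own values, and the commented-out `urlopen` call is where you would actually send them.

```python
import json
import os
import urllib.request

# Hypothetical values: replace with your server URL and flow ID.
url = "http://localhost:7860/api/v1/run/FLOW_ID"
api_key = os.environ.get("LANGFLOW_API_KEY", "sk-placeholder")

def build_request(message, session_id):
    """Build a /v1/run request that pins the chat to one session."""
    payload = {
        "input_value": message,
        "input_type": "chat",
        "output_type": "chat",
        "session_id": session_id,  # keeps this conversation's history separate
    }
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json", "x-api-key": api_key},
    )

# Two users chatting with the same flow, isolated by session ID:
req_a = build_request("Hello from user A", session_id="user-a")
req_b = build_request("Hello from user B", session_id="user-b")
# urllib.request.urlopen(req_a)  # uncomment to send against a running server
```

Messages sent with different `session_id` values appear as separate chats in the **Message Logs**.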
```
curl -X POST "http://localhost:7860/api/v1/run/FLOW_ID" \
  ...
  }'
```
The image is displayed in the chat interface and can be processed by your flow components.
## Share a flow's Playground
:::important
The **Shareable Playground** is for testing purposes only, and it isn't available for Langflow Desktop.
The **Playground** isn't meant for embedding flows in applications. For information about running flows in applications or websites, see [Run flows](/concepts-publish).
:::
The **Shareable Playground** option exposes the **Playground** for a single flow at the `/public_flow/$FLOW_ID` endpoint.
After you [deploy a public Langflow server](/deployment-overview), you can share this public URL with another user to allow them to access the specified flow's **Playground** only.
The user can interact with the flow's chat input and output and view the results without installing Langflow or generating a Langflow API key.
To share a flow's **Playground** with another user, do the following:
1. In Langflow, open the flow you want to share.
2. From the **Workspace**, click **Share**, and then enable **Shareable Playground**.
3. Click **Shareable Playground** again to open the **Playground** window.
This window's URL is the flow's **Shareable Playground** address, such as `https://3f7c-73-64-93-151.ngrok-free.app/playground/d764c4b8-5cec-4c0f-9de0-4b419b11901a`.
4. Send the URL to another user to give them access to the flow's **Playground**.
## See also
- [Run flows](/concepts-publish)
- [Session ID](/session-id)

---
title: Run flows
slug: /concepts-publish
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
After you build a flow, you probably want to run it within an application, such as a chatbot within a mobile app or website.
Langflow provides several ways to run flows from external applications:
* [Trigger flows with the Langflow API](#api-access)
* [Add an embedded chat widget to a website](#embedded-chat-widget)
* [Serve flows through a Langflow MCP server](#serve-flows-through-a-langflow-mcp-server)
Although you can use these options with an isolated, local Langflow instance, they are typically more valuable when you have [deployed a Langflow server](/deployment-overview) or packaged Langflow as a dependency of an application.
For package dependencies, see [Application development overview](/develop-application) and [Package a flow as a Docker image](/deployment-docker#package-your-flow-as-a-docker-image).
## Use the Langflow API to run flows {#api-access}
The Langflow API is the primary way to access your flows and Langflow servers programmatically.
:::tip Try it
For an example of a script that calls the Langflow API, see the [Quickstart](/get-started-quickstart).
:::
### Generate API code snippets
To help you embed Langflow API requests in your scripts, Langflow automatically generates Python, JavaScript, and curl code snippets for your flows.
To get these code snippets, do the following:
1. In Langflow, open the flow that you want to embed in your application.
2. Click **Share**, and then select **API access**.
These code snippets call the `/v1/run/$FLOW_ID` endpoint, and they automatically populate minimum values, like the Langflow server URL, flow ID, headers, and request parameters.
:::tip Windows
The paths generated by the API access pane assume a *nix environment.
If you use Microsoft Windows or WSL, you might need to adjust the filepaths given in the code snippets.
:::
![API access pane](/img/api-pane.png)
3. Optional: Click [**Input Schema**](#input-schema) to modify component parameters in the code snippets without changing the flow itself.
4. Copy the snippet for the language that you want to use.
5. Run the snippet as is, or use the snippet in the context of a larger script.
For more information and examples of other Langflow API endpoints, see [Get started with the Langflow API](/api-reference-api-examples).
### Langflow API authentication
In Langflow versions 1.5 and later, most API endpoints require authentication with a Langflow API key, even if `AUTO_LOGIN` is set to `True`.
The only exceptions are the MCP endpoints `/v1/mcp`, `/v1/mcp-projects`, and `/v2/mcp`, which never require authentication.
Code snippets generated in the **API access** pane include a script that checks for a `LANGFLOW_API_KEY` environment variable set in the local terminal session.
This script doesn't check for Langflow API keys set anywhere besides the local terminal session.
For this script to work, you must set a `LANGFLOW_API_KEY` variable in the terminal session where you intend to run the code snippet, such as `export LANGFLOW_API_KEY="sk..."`.
Alternatively, you can edit the code snippet to include an `x-api-key` header and ensure that the request can authenticate to the Langflow API.
For more information, see [API keys](/configuration-api-keys) and [Get started with the Langflow API](/api-reference-api-examples).
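In practice, the key check in the generated snippets amounts to something like the following sketch. The `x-api-key` header name is what the Langflow API expects; the placeholder fallback here is for illustration only.

```python
import os

# Read the key the way the generated snippets do (assumes you ran
# `export LANGFLOW_API_KEY="sk..."` in this terminal session).
api_key = os.environ.get("LANGFLOW_API_KEY", "")
if not api_key:
    # Without a valid key, the request would be rejected by the server.
    print("LANGFLOW_API_KEY is not set in this terminal session.")
    api_key = "sk-placeholder"  # illustration only; not a real key

# Authenticated Langflow API requests carry the key in this header:
headers = {"Content-Type": "application/json", "x-api-key": api_key}
```

If you set the key in a different shell or in a `.env` file that isn't loaded, the check fails even though the key exists elsewhere.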
### Input Schema (tweaks) {#input-schema}
Tweaks are one-time overrides that modify component parameters at runtime, rather than permanently modifying the flow itself.
For an example of tweaks in a script, see the [Quickstart](/get-started-quickstart).
:::tip
Tweaks make your flows more dynamic and reusable.
You can create one flow and use it for multiple applications by passing application-specific tweaks in each application's Langflow API requests.
:::
In the **API access** pane, click **Input Schema** to add `tweaks` to the request payload in a flow's code snippets.
Changes to a flow's **Input Schema** are saved exclusively as tweaks for that flow's **API access** code snippets.
These tweaks don't change the flow parameters set in the **Workspace**, and they don't apply to other flows.
Adding tweaks through the **Input Schema** can help you troubleshoot formatting issues with tweaks that you manually added to Langflow API requests.
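As a sketch of what a request body with tweaks can look like, consider the following payload. The component ID (`OpenAIModel-OcXkl`) and field names are hypothetical; copy the real IDs from your own flow's **Input Schema** pane or Langflow JSON file.

```python
import json

# Request body for POST /api/v1/run/$FLOW_ID (sketch; IDs are hypothetical).
payload = {
    "input_value": "Summarize today's sales report.",
    "input_type": "chat",
    "output_type": "chat",
    # One-time overrides, keyed by component ID, applied to this run only:
    "tweaks": {
        "OpenAIModel-OcXkl": {
            "temperature": 0.2,
            "model_name": "gpt-4o-mini",
        },
    },
}
print(json.dumps(payload, indent=2))
```

Because tweaks live in the request, each application can send its own overrides to the same flow without editing it.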
### Use a flow ID alias
If you want your requests to use an alias instead of the actual flow ID, you can rename the flow's `/v1/run/$FLOW_ID` endpoint:
1. In Langflow, open the flow, click **Share**, and then select **API access**.
2. Click **Input Schema**.
3. In the **Endpoint Name** field, enter an alias for your flow's ID, such as a memorable, human-readable name.
The name can contain only letters, numbers, hyphens, and underscores, such as `flow-customer-database-agent`.
4. To save the change, close the **Input Schema** pane.
The automatically generated code snippets now use your new endpoint name instead of the original flow ID, such as `url = "http://localhost:7868/api/v1/run/flow-customer-database-agent"`.
## Embed a flow into a website {#embedded-chat-widget}
For each flow, Langflow provides a code snippet that you can insert into the `<body>` of your website's HTML to interact with your flow through an embedded chat widget.
:::important Required components
The chat widget only supports flows that have **Chat Input** and **Chat Output** components, which are required for the chat experience.
**Text Input** and **Text Output** components can send and receive messages, but they don't include ongoing LLM chat context.
Attempting to chat with a flow that doesn't have a valid input component will trigger the flow, but the response only indicates that the input was empty.
:::
### Get a langflow-chat snippet
To get a flow's embedded chat widget code snippet, do the following:
1. In Langflow, open the flow you want to embed.
2. Click **Share**, and then select **Embed into site**.
3. Copy the code snippet and use it in the `<body>` of your website's HTML.
For more information, see [Embed the chat widget with React, Angular, or HTML](#embed-the-chat-widget).
4. Add the `api_key` prop to ensure the widget has permission to run the flow, as explained in [Configure the langflow-chat web component](#configure-the-langflow-chat-web-component).
The chat widget is implemented as a web component called `langflow-chat` that is loaded from a CDN. For more information, see the [langflow-embedded-chat repository](https://github.com/langflow-ai/langflow-embedded-chat).
For example, the following HTML embeds a chat widget for a [Basic prompting flow](/basic-prompting) hosted on a Langflow server deployed on ngrok:
```html
<html>
<head>
<script src="https://cdn.jsdelivr.net/gh/langflow-ai/langflow-embedded-chat@main/dist/build/static/js/bundle.min.js"></script>
</head>
<body>
<langflow-chat
host_url="https://c822-73-64-93-151.ngrok-free.app"
flow_id="dcbed533-859f-4b99-b1f5-16fce884f28f"
api_key="$LANGFLOW_API_KEY"
></langflow-chat>
</body>
</html>
```
When this code is deployed to a live site, it renders as a responsive chatbot.
If a user interacts with the chatbot, the input triggers the specified flow, and then the chatbot returns the output from the flow run.
![Default chat widget](/img/chat-widget-default.png)
:::tip Try it
Use the [Langflow embedded chat CodeSandbox](https://codesandbox.io/p/sandbox/langflow-embedded-chat-example-dv9zpx) for an interactive live demo of the embedded chat widget that uses your own flow.
For more information, see the [langflow-embedded-chat README](https://github.com/langflow-ai/langflow-embedded-chat?tab=readme-ov-file#live-example).
:::
### Embed the chat widget with React, Angular, or HTML {#embed-the-chat-widget}
The following examples show how to use the embedded chat widget in React, Angular, and plain HTML.
<Tabs>
<TabItem value="react" label="React" default>
To use the chat widget in your React application, create a component that loads the widget script and renders the chat interface:
1. Declare your web component, and then encapsulate it in a React component:
```javascript
//Declaration of langflow-chat web component
declare global {
namespace JSX {
interface IntrinsicElements {
"langflow-chat": any;
}
}
}
//Definition for langflow-chat React component
export default function ChatWidget({ className }) {
return (
<div className={className}>
<langflow-chat
host_url="https://c822-73-64-93-151.ngrok-free.app"
flow_id="dcbed533-859f-4b99-b1f5-16fce884f28f"
api_key="$LANGFLOW_API_KEY"
></langflow-chat>
</div>
);
}
```
2. Place the component anywhere in your code to render the chat widget.
In the following example, the React widget component is located at `docs/src/components/ChatWidget/index.tsx`, and `index.tsx` includes a script to load the chat widget code from CDN, along with the declaration and definition from the previous step:
```javascript
import React, { useEffect } from 'react';
// Component to load the chat widget script
const ChatScriptLoader = () => {
useEffect(() => {
if (!document.querySelector('script[src*="langflow-embedded-chat"]')) {
const script = document.createElement('script');
script.src = 'https://cdn.jsdelivr.net/gh/langflow-ai/langflow-embedded-chat@main/dist/build/static/js/bundle.min.js';
script.async = true;
document.body.appendChild(script);
}
}, []);
Additionally, you can re-name your flow's API endpoint from the default UUID to a more memorable and user-friendly name.
return null;
};
To set a custom endpoint name:
1. In the **Input Schema** pane, locate the **Endpoint Name** field.
2. Enter a name using only letters, numbers, hyphens, and underscores.
The endpoint name is automatically saved with your flow.
//Declaration of langflow-chat web component
declare global {
namespace JSX {
interface IntrinsicElements {
"langflow-chat": any;
}
}
}
## Export
//Definition for langflow-chat React component
export default function ChatWidget({ className }) {
return (
<div className={className}>
<ChatScriptLoader />
<langflow-chat
host_url="https://c822-73-64-93-151.ngrok-free.app"
flow_id="dcbed533-859f-4b99-b1f5-16fce884f28f"
api_key="$LANGFLOW_API_KEY"
></langflow-chat>
</div>
);
}
```
**Export** a flow to download it as a JSON file to your local machine.
3. Import the `langflow-chat` React component to make it available for use on a page.
Modify the following import statement with your React component's name and path:
1. To **Export** your flow, in the **Playground**, click **Share**, and then click **Export**.
2. To save your API keys with the flow, select **Save with my API keys**.
Your flow is saved with any Global variables included.
```jsx
import ChatWidget from '@site/src/components/ChatWidget';
```
:::important
If your key is saved as a Global variable, only the global variable you created to contain the value is saved. If your key value is manually entered into a component field, the actual key value is saved in the JSON file.
:::
4. To display the widget, call your `langflow-chat` component in the desired location on the page.
Modify the following reference for your React component's name and the desired `className`:
When you share your flow file with another user who has the same global variables populated, the flow runs without requiring keys to be added again.
```
<ChatWidget className="my-chat-widget" />
```
The `FLOW_NAME.json` file is downloaded to your local machine.
</TabItem>
<TabItem value="angular" label="Angular">
You can then **Import** the downloaded flow into another Langflow instance.
To use the chat widget in your Angular application, create a component that loads the widget script and renders the chat interface.
## MCP server
In an Angular application, `langflow-chat` is a custom web component that you must explicitly allow in your site's `.components.ts`.
Therefore, to use the embedded chat widget, you must add `CUSTOM_ELEMENTS_SCHEMA` to your module's configuration, and then integrate the `<langflow-chat>` element.
**MCP server** exposes your flows as [tools](https://modelcontextprotocol.io/docs/concepts/tools) that [MCP clients](https://modelcontextprotocol.io/clients) can use to take actions.
Angular requires you to explicitly allow custom web components, like `langflow-chat`, in your site's `components`.
Therefore, you must add the `<langflow-chat>` element to your Angular template and configure Angular to recognize it.
You must add `CUSTOM_ELEMENTS_SCHEMA` to your module's configuration to enable this.
For more information, see [MCP server](/mcp-server).
1. In your Angular application, edit the `.module.ts` file where you want to add the `langflow-chat` web component.
For information about using Langflow as an *MCP client*, see [MCP client](/mcp-client).
2. At the top of `.module.ts`, import `CUSTOM_ELEMENTS_SCHEMA`:
## Embed into site
```
import { NgModule, CUSTOM_ELEMENTS_SCHEMA } from '@angular/core';
```
The **Embed into site** tab displays code that can be inserted in the `<body>` of your HTML to interact with your flow.
3. In the `@NgModule` decorator, add `CUSTOM_ELEMENTS_SCHEMA` to the `schemas` array:
For more information, see [Embedded chat widget](/embedded-chat-widget).
```javascript
import { NgModule, CUSTOM_ELEMENTS_SCHEMA } from '@angular/core';
import { BrowserModule } from '@angular/platform-browser';
import { AppComponent } from './app.component';
## Shareable playground
@NgModule({
declarations: [
AppComponent
],
imports: [
BrowserModule
],
schemas: [CUSTOM_ELEMENTS_SCHEMA],
providers: [],
bootstrap: [AppComponent]
})
export class AppModule { }
```
The **Shareable playground** exposes your Langflow application's **Playground** at the `/public_flow/$FLOW_ID` endpoint.
4. Edit the `.component.ts` file where you want to use the embedded chat widget.
You can share this endpoint publicly using a sharing platform like [ngrok](https://ngrok.com/docs/getting-started/?os=macos) or [zrok](https://docs.zrok.io/docs/getting-started).
5. In the `@Component` decorator, add the `<langflow-chat>` element to the `template` key:
```javascript
import { Component } from '@angular/core';
@Component({
selector: 'app-root',
template: `
<div class="container">
<h1>Langflow Chat Test</h1>
<langflow-chat
host_url="https://c822-73-64-93-151.ngrok-free.app"
flow_id="dcbed533-859f-4b99-b1f5-16fce884f28f"
api_key="$LANGFLOW_API_KEY"
></langflow-chat>
</div>
`,
styles: [`
.container {
padding: 20px;
text-align: center;
}
`]
})
export class AppComponent {
title = 'Langflow Chat Test';
}
```
</TabItem>
<TabItem value="html" label="HTML">
```html
<html lang="en">
<head>
<script src="https://cdn.jsdelivr.net/gh/langflow-ai/langflow-embedded-chat@v1.0.7/dist/build/static/js/bundle.min.js"></script>
</head>
<body>
<langflow-chat
host_url="https://c822-73-64-93-151.ngrok-free.app"
flow_id="dcbed533-859f-4b99-b1f5-16fce884f28f"
api_key="$LANGFLOW_API_KEY"
></langflow-chat>
</body>
</html>
```
</TabItem>
</Tabs>
### Configure the langflow-chat web component {#configure-the-langflow-chat-web-component}
To use the embedded chat widget in your HTML, the `langflow-chat` web component must include the following minimum inputs (also known as _props_ in React):
* `host_url`: Your Langflow server URL. Must be `HTTPS`. Don't include a trailing slash (`/`).
* `flow_id`: The ID of the flow you want to embed.
* `api_key`: A [Langflow API key](/configuration-api-keys).
This prop is recommended to ensure the widget has permission to run the flow.
The minimum inputs are automatically populated in the [**Embed into site** code snippet](#get-a-langflow-chat-snippet) that is generated by Langflow.
You can use additional inputs (props) to modify the embedded chat widget.
For a list of all props, types, and descriptions, see the [langflow-embedded-chat README](https://github.com/langflow-ai/langflow-embedded-chat?tab=readme-ov-file#configuration).
<details>
<summary>Example: Langflow API key prop</summary>
The `api_key` prop stores a Langflow API key that the chat widget can use to authenticate the underlying Langflow API request.
The Langflow team recommends following industry best practices for handling sensitive credentials.
For example, securely store your API key, and then retrieve it with an environment variable:
```html
<langflow-chat
  host_url="https://c822-73-64-93-151.ngrok-free.app"
  flow_id="dcbed533-859f-4b99-b1f5-16fce884f28f"
  api_key="$LANGFLOW_API_KEY"
></langflow-chat>
```
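
Static HTML can't read environment variables itself, so the key is typically injected when the page or component is built or rendered. The following is a minimal sketch under that assumption; `LANGFLOW_API_KEY` is an assumed variable name:

```javascript
// Sketch: resolve the widget's api_key value from an environment variable.
// LANGFLOW_API_KEY is an assumed variable name; inject the resolved value
// into the langflow-chat element at build or render time.
function resolveApiKey(env) {
  // Fall back to an empty string so the attribute is never "undefined".
  return env.LANGFLOW_API_KEY ?? "";
}

// For example, in a Node-based build step:
const apiKey = resolveApiKey(process.env);
console.log(`api_key="${apiKey}"`);
```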
</details>
<details>
<summary>Example: Style props</summary>
There are many props you can use to customize the style and positioning of the embedded chat widget.
Many of these props are of type JSON, and they require specific formatting, depending on where you embed the `langflow-chat` web component.
In React and plain HTML, JSON props are expressed as JSON objects or stringified JSON, such as `\{"key":"value"\}`:
```html
<langflow-chat
  host_url="https://c822-73-64-93-151.ngrok-free.app"
  flow_id="dcbed533-859f-4b99-b1f5-16fce884f28f"
  api_key="$LANGFLOW_API_KEY"
  chat_window_style='{
    "backgroundColor": "#1a0d0d",
    "border": "4px solid #b30000",
    "borderRadius": "16px",
    "boxShadow": "0 8px 32px #b30000",
    "color": "#fff",
    "fontFamily": "Georgia, serif",
    "padding": "16px"
  }'
  window_title="Custom Styled Chat"
  height="600"
  width="400"
></langflow-chat>
```
For Angular applications, use [property binding syntax](https://angular.dev/guide/templates/binding#binding-dynamic-properties-and-attributes) to pass JSON props as JavaScript objects.
For example:
```javascript
import { Component } from '@angular/core';

@Component({
  selector: 'app-root',
  template: `
    <div class="container">
      <h1>Langflow Chat Test</h1>
      <langflow-chat
        host_url="https://c822-73-64-93-151.ngrok-free.app"
        flow_id="dcbed533-859f-4b99-b1f5-16fce884f28f"
        api_key="$LANGFLOW_API_KEY"
        [chat_window_style]='{"backgroundColor": "#ffffff"}'
        [bot_message_style]='{"color": "#000000"}'
        [user_message_style]='{"color": "#000000"}'
        height="600"
        width="400"
        chat_position="bottom-right"
      ></langflow-chat>
    </div>
  `,
  styles: [`
    .container {
      padding: 20px;
      text-align: center;
    }
  `]
})
export class AppComponent {
  title = 'Langflow Chat Test';
}
```
</details>
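
Because JSON props must contain valid stringified JSON, one way to avoid attribute-quoting mistakes is to generate the value with `JSON.stringify`. This is a minimal sketch; the style keys shown are examples only:

```javascript
// Sketch: generate a stringified JSON value for a style prop with JSON.stringify
// instead of hand-writing the quoted string. The style keys are examples only.
const chatWindowStyle = {
  backgroundColor: "#1a0d0d",
  borderRadius: "16px",
};

const attrValue = JSON.stringify(chatWindowStyle);
console.log(`chat_window_style='${attrValue}'`);
```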
<details>
<summary>Example: Session ID prop</summary>
The following example adds a custom [session ID](/session-id) to help identify flow runs started by the embedded chat widget:
```html
<langflow-chat
  host_url="https://c822-73-64-93-151.ngrok-free.app"
  flow_id="dcbed533-859f-4b99-b1f5-16fce884f28f"
  api_key="$LANGFLOW_API_KEY"
  session_id="$SESSION_ID"
></langflow-chat>
```
</details>
<details>
<summary>Example: Tweaks prop</summary>
Use the `tweaks` prop to modify flow parameters at runtime.
The available keys for the `tweaks` object depend on the flow you are serving through the embedded chat widget.
In React and plain HTML, `tweaks` are declared as a JSON object, similar to how you would pass them to a Langflow API endpoint like [`/v1/run/$FLOW_ID`](/api-flows-run#run-flow).
For example:
```html
<langflow-chat
  host_url="https://c822-73-64-93-151.ngrok-free.app"
  flow_id="dcbed533-859f-4b99-b1f5-16fce884f28f"
  api_key="$LANGFLOW_API_KEY"
  tweaks='{
    "model_name": "llama-3.1-8b-instant"
  }'
></langflow-chat>
```
For Angular applications, use [property binding syntax](https://angular.dev/guide/templates/binding#binding-dynamic-properties-and-attributes) to pass JSON props as JavaScript objects.
For example:
```javascript
import { Component } from '@angular/core';

@Component({
  selector: 'app-root',
  template: `
    <div class="container">
      <h1>Langflow Chat Test</h1>
      <langflow-chat
        host_url="https://c822-73-64-93-151.ngrok-free.app"
        flow_id="dcbed533-859f-4b99-b1f5-16fce884f28f"
        api_key="$LANGFLOW_API_KEY"
        [tweaks]='{"model_name": "llama-3.1-8b-instant"}'
      ></langflow-chat>
    </div>
  `,
  styles: [`
    .container {
      padding: 20px;
      text-align: center;
    }
  `]
})
export class AppComponent {
  title = 'Langflow Chat Test';
}
```
</details>
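
Because the `tweaks` prop mirrors the `tweaks` object sent to the `/v1/run/$FLOW_ID` endpoint, you can sketch the equivalent request body like this. The `model_name` key is an example only; valid keys depend on the components in your flow:

```javascript
// Sketch: the widget's tweaks prop matches the "tweaks" object in the body of a
// POST to /v1/run/$FLOW_ID. The "model_name" key is an example; valid keys
// depend on the components in your flow.
function buildRunPayload(inputValue, tweaks) {
  return {
    input_value: inputValue,
    output_type: "chat",
    input_type: "chat",
    tweaks,
  };
}

const payload = buildRunPayload("Hello!", { model_name: "llama-3.1-8b-instant" });
console.log(JSON.stringify(payload));
```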
## Serve flows through a Langflow MCP server
Each [Langflow project](/concepts-flows#projects) has an MCP server that exposes the project's flows as [tools](https://modelcontextprotocol.io/docs/concepts/tools) that [MCP clients](https://modelcontextprotocol.io/clients) can use to generate responses.
You can also use Langflow as an MCP client, and you can serve your flows as tools to a Langflow MCP client.
For more information, see [Use Langflow as an MCP server](/mcp-server) and [Use Langflow as an MCP client](/mcp-client).
## See also
* [Develop an application with Langflow](/develop-application)
* [Langflow deployment overview](/deployment-overview)
* [Import and export flows](/concepts-flows-import)
* [Files endpoints](/api-files)
* [Use the Playground](/concepts-playground)
View file
@ -5,6 +5,8 @@ slug: /concepts-voice-mode
import Icon from "@site/src/components/icon";
<!-- TODO: Combine & redirect to /concepts-playground -->
The Langflow **Playground** supports **voice mode** for interacting with your applications through a microphone.
An [OpenAI API key](https://platform.openai.com/) is required to use **voice mode**. An [ElevenLabs](https://elevenlabs.io) API key enables more voices in the chat, but is optional.
@ -65,7 +67,7 @@ This approach is ideal for low latency applications, but it is less deterministi
* `/ws/flow_tts/{flow_id}` or `/ws/flow_tts/{flow_id}/{session_id}`: Converts audio to text using [OpenAI Realtime voice transcription](https://platform.openai.com/docs/guides/realtime-transcription), and then the flow is invoked directly for each transcript.
This approach is more deterministic but has higher latency.
This is the mode used in the Langflow playground.
This is the mode used in the Langflow Playground.
Path parameters:
* `flow_id`: Required path parameter. The ID of the flow to be used as a tool.
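
As a minimal sketch (host, flow ID, and session ID are placeholders), a client can construct the WebSocket URL for these endpoints like this:

```javascript
// Sketch: build the WebSocket URL for the flow_tts endpoint.
// Host, flow ID, and session ID are placeholders; session_id is optional.
function buildVoiceWsUrl(host, flowId, sessionId) {
  const base = `ws://${host}/ws/flow_tts/${flowId}`;
  return sessionId ? `${base}/${sessionId}` : base;
}

console.log(buildVoiceWsUrl("localhost:7860", "FLOW_ID", "SESSION_ID"));
```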
View file
@ -1,266 +0,0 @@
---
title: Embedded chat widget
slug: /embedded-chat-widget
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
import ChatWidget from '@site/src/components/ChatWidget';
On the [Publish pane](/concepts-publish), the **Embed into site** tab displays code that can be inserted in the `<body>` of your HTML to interact with your flow.
The chat widget is implemented as a web component called `langflow-chat` and is loaded from a CDN. For more information, see the [langflow-embedded-chat repository](https://github.com/langflow-ai/langflow-embedded-chat).
For a sandbox example, see the [Langflow embedded chat CodeSandbox](https://codesandbox.io/p/sandbox/langflow-embedded-chat-example-dv9zpx).
The following example includes the minimum required inputs, called [props](https://react.dev/learn/passing-props-to-a-component) in React, for using the chat widget in your HTML code, which are `host_url` and `flow_id`.
The `host_url` value must be `HTTPS`, and may not include a `/` after the URL.
The `flow_id` value is found in your Langflow URL.
For a Langflow server running the [Basic prompting flow](/basic-prompting) at `https://c822-73-64-93-151.ngrok-free.app/flow/dcbed533-859f-4b99-b1f5-16fce884f28f`, your chat widget code is similar to the following:
```html
<html>
<head>
<script src="https://cdn.jsdelivr.net/gh/logspace-ai/langflow-embedded-chat@main/dist/build/static/js/bundle.min.js"></script>
</head>
<body>
<langflow-chat
host_url="https://c822-73-64-93-151.ngrok-free.app"
flow_id="dcbed533-859f-4b99-b1f5-16fce884f28f"
></langflow-chat>
</body>
</html>
```
When this code is embedded within HTML, it becomes a responsive chatbot, powered by the basic prompting flow.
![Default chat widget](/img/chat-widget-default.png)
To configure your chat widget further, include additional props.
All props and their types are listed in [index.tsx](https://github.com/langflow-ai/langflow-embedded-chat/blob/main/src/index.tsx).
To add some styling to the chat widget, customize its elements with JSON:
```html
<langflow-chat
host_url="https://c822-73-64-93-151.ngrok-free.app"
flow_id="dcbed533-859f-4b99-b1f5-16fce884f28f"
chat_window_style='{
"backgroundColor": "#1a0d0d",
"border": "4px solid #b30000",
"borderRadius": "16px",
"boxShadow": "0 8px 32px #b30000",
"color": "#fff",
"fontFamily": "Georgia, serif",
"padding": "16px"
}'
window_title="Custom Styled Chat"
height="600"
width="400"
></langflow-chat>
```
To add a custom [session ID](/session-id) value and an API key for authentication to your Langflow server:
```html
<html>
<head>
<script src="https://cdn.jsdelivr.net/gh/logspace-ai/langflow-embedded-chat@main/dist/build/static/js/bundle.min.js"></script>
</head>
<body>
<langflow-chat
host_url="https://c822-73-64-93-151.ngrok-free.app"
flow_id="dcbed533-859f-4b99-b1f5-16fce884f28f"
api_key="YOUR_API_KEY"
session_id="YOUR_SESSION_ID"
></langflow-chat>
</body>
</html>
```
The chat widget requires your flow to contain **Chat Input** and **Chat Output** components for the widget to communicate with it.
Sending a message to Langflow without a **Chat Input** still triggers the flow, but the LLM warns you the message is empty.
**Text Input** and **Text Output** components can send and receive messages with Langflow, but without the ongoing LLM "chat" context.
## Embed the chat widget with React
To use the chat widget in your React application, create a component that loads the widget script and renders the chat interface:
1. Declare your web component and encapsulate it in a React component.
```javascript
declare global {
namespace JSX {
interface IntrinsicElements {
"langflow-chat": any;
}
}
}
export default function ChatWidget({ className }) {
return (
<div className={className}>
<langflow-chat
host_url="https://c822-73-64-93-151.ngrok-free.app"
flow_id="dcbed533-859f-4b99-b1f5-16fce884f28f"
></langflow-chat>
</div>
);
}
```
2. Place the component anywhere in your code to display the chat widget.
For example, in this docset, the React widget component is located at `docs > src > components > ChatWidget > index.tsx`.
`index.tsx` includes a script to load the chat widget code from CDN and initialize the `ChatWidget` component with props pointing to a Langflow server.
```javascript
import React, { useEffect } from 'react';
// Component to load the chat widget script
const ChatScriptLoader = () => {
useEffect(() => {
if (!document.querySelector('script[src*="langflow-embedded-chat"]')) {
const script = document.createElement('script');
script.src = 'https://cdn.jsdelivr.net/gh/langflow-ai/langflow-embedded-chat@main/dist/build/static/js/bundle.min.js';
script.async = true;
document.body.appendChild(script);
}
}, []);
return null;
};
declare global {
namespace JSX {
interface IntrinsicElements {
"langflow-chat": any;
}
}
}
export default function ChatWidget({ className }) {
return (
<div className={className}>
<ChatScriptLoader />
<langflow-chat
host_url="https://c822-73-64-93-151.ngrok-free.app"
flow_id="dcbed533-859f-4b99-b1f5-16fce884f28f"
></langflow-chat>
</div>
);
}
```
3. To import the component to your page, add this to your site.
```
import ChatWidget from '@site/src/components/ChatWidget';
```
4. To add the widget to your page, include `<ChatWidget className="my-chat-widget" />`.
## Embed the chat widget with Angular
To use the chat widget in your [Angular](https://angular.dev/overview) application, create a component that loads the widget script and renders the chat interface.
Angular requires you to explicitly allow custom web components like `langflow-chat` in components, so you must add the `<langflow-chat>` element to your Angular template and configure Angular to recognize it. Add `CUSTOM_ELEMENTS_SCHEMA` to your module's configuration to enable this.
To add `CUSTOM_ELEMENTS_SCHEMA` to your module's configuration, do the following:
1. Open the module file `.module.ts` where you want to add the `langflow-chat` web component.
2. Import `CUSTOM_ELEMENTS_SCHEMA` at the top of the `.module.ts` file:
`import { NgModule, CUSTOM_ELEMENTS_SCHEMA } from '@angular/core';`
3. Add `CUSTOM_ELEMENTS_SCHEMA` to the `schemas` array inside the `@NgModule` decorator:
```javascript
import { NgModule, CUSTOM_ELEMENTS_SCHEMA } from '@angular/core';
import { BrowserModule } from '@angular/platform-browser';
import { AppComponent } from './app.component';
@NgModule({
declarations: [
AppComponent
],
imports: [
BrowserModule
],
schemas: [CUSTOM_ELEMENTS_SCHEMA],
providers: [],
bootstrap: [AppComponent]
})
export class AppModule { }
```
4. Add the chat widget to your component's template by including the `langflow-chat` element in your component's `.component.ts` file:
For style properties that accept `JSON` objects like `chat_window_style` and `bot_message_style`, use Angular's property binding syntax `[propertyName]` to pass them as JavaScript objects.
```javascript
import { Component } from '@angular/core';
@Component({
selector: 'app-root',
template: `
<div class="container">
<h1>Langflow Chat Test</h1>
<langflow-chat
host_url="https://c822-73-64-93-151.ngrok-free.app"
flow_id="dcbed533-859f-4b99-b1f5-16fce884f28f"
[chat_window_style]='{"backgroundColor": "#ffffff"}'
[bot_message_style]='{"color": "#000000"}'
[user_message_style]='{"color": "#000000"}'
window_title="Chat with us"
placeholder="Type your message..."
height="600"
width="400"
chat_position="bottom-right"
></langflow-chat>
</div>
`,
styles: [`
.container {
padding: 20px;
text-align: center;
}
`]
})
export class AppComponent {
title = 'Langflow Chat Test';
}
```
## Chat widget configuration
Use the widget API to customize your chat widget.
Props with the type `JSON` need to be passed as stringified JSON, with the format \{"key":"value"\}.
All props and their types are listed in [index.tsx](https://github.com/langflow-ai/langflow-embedded-chat/blob/main/src/index.tsx).
| Prop | Type | Description |
|----------------------|---------|------------------------------------------------|
| flow_id | String | Required. Identifier for the flow associated with the component. |
| host_url | String | Required. URL of the host for communication with the chat component. |
| api_key | String | X-API-Key header to send to Langflow. |
| additional_headers | JSON | Additional headers to be sent to the Langflow server. |
| session_id | String | Custom session id to override the random session id. |
| height | Number | Height of the chat window in pixels. |
| width | Number | Width of the chat window in pixels. |
| chat_position | String | Position of chat window, such as `top-right` or `bottom-left`. |
| start_open | Boolean | Whether the chat window should be open by default. |
| chat_window_style | JSON | Overall chat window appearance. |
| chat_trigger_style | JSON | Chat trigger button styling. |
| bot_message_style | JSON | Bot message formatting. |
| user_message_style | JSON | User message formatting. |
| error_message_style | JSON | Error message formatting. |
| input_style | JSON | Chat input field styling. |
| input_container_style| JSON | Input container styling. |
| send_button_style | JSON | Send button styling. |
| send_icon_style | JSON | Send icon styling. |
| window_title | String | Title displayed in the chat window header. |
| placeholder | String | Placeholder text for the chat input field. |
| placeholder_sending | String | Placeholder text while sending a message. |
| online | Boolean | Whether the chat component is online. |
| online_message | String | Custom message when chat is online. |
| input_type | String | Input type for chat messages. |
| output_type | String | Output type for chat messages. |
| output_component | String | Output ID when multiple outputs are present. |
| chat_output_key | String | Which output to display if multiple outputs are available. |
| tweaks | JSON | Additional custom adjustments for the flow. |
View file
@ -17,7 +17,7 @@ As an MCP server, Langflow exposes your flows as [tools](https://modelcontextpro
## Prerequisites
* A Langflow project with at least one flow.
* A [Langflow project](/concepts-flows#projects) with at least one flow.
* Any LTS version of [Node.js](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm) installed on your computer to use MCP Inspector to [test and debug flows](#test-and-debug-flows).
@ -25,11 +25,11 @@ As an MCP server, Langflow exposes your flows as [tools](https://modelcontextpro
## Select and configure flows to expose as MCP tools {#select-flows-to-serve}
Langflow runs a separate MCP server for every [project](/concepts-overview#projects).
The MCP server for each project exposes that project's flows as tools.
Each [Langflow project](/concepts-flows#projects) has an MCP server that exposes the project's flows as tools that MCP clients can use to generate responses.
All of the flows in a project are exposed by default.
To expose only specific flows and optionally rename them for agentic use, follow these steps:
By default, all flows in a project are exposed as tools on the project's MCP server.
The following steps explain how to limit the exposed flows and, optionally, rename flows for agentic use:
1. From the Langflow dashboard, select the project that contains the flows you want to serve as tools, and then click the **MCP Server** tab.
Alternatively, you can quickly access the **MCP Server** tab from within any flow by selecting **Share > MCP Server**.
@ -46,16 +46,47 @@ Alternatively, you can quickly access the **MCP Server** tab from within any flo
![MCP server tools](/img/mcp-server-tools.png)
4. Optional: Edit the **Flow Name** and **Flow Description**.
4. Recommended: Edit the **Tool Name** and **Tool Description** to help MCP clients determine which actions your flows provide and when to use those actions:
- **Tool Name**: Enter a name that makes it clear what the flow does.
- **Tool Name**: Enter a name that makes it clear what the flow does when used as a tool by an agent.
- **Tool Description**: Enter a description that accurately describes the specific action(s) the flow performs.
- **Tool Description**: Enter a description that accurately describes the specific actions the flow performs.
:::important
MCP clients use the **Flow Name** and **Flow Description** to determine which action to use.
For more information about naming and describing your flows, see [Name and describe your flows for agentic use](#name-and-describe-your-flows).
:::
<details>
<summary>Name and describe your flows for agentic use</summary>
MCP clients use the **Tool Name** and **Tool Description** to determine which action to use.
MCP clients like [Cursor](https://www.cursor.com/) treat your Langflow project as a single MCP server with all of your enabled flows listed as tools.
This can confuse agents if the flows have unclear names or descriptions.
For example, a flow's default name is the flow ID, such as `adbbf8c7-0a34-493b-90ea-5e8b42f78b66`.
This provides no information to an agent about the type of flow or its purpose.
To provide more context about your flows, make sure to name and describe your flows clearly when configuring your Langflow project's MCP server.
It's helpful to think of the names and descriptions as function names and code comments, using clear statements describing the problems your flows solve.
For example, assume you have a [Document Q&A flow](/document-qa) that uses an LLM to chat about resumes, and you give the flow the following name and description:
- **Tool Name**: `document_qa_for_resume`
- **Tool Description**: `A flow for analyzing Emily's resume.`
After connecting your Langflow MCP server to Cursor, you can ask Cursor about the resume, such as `What job experience does Emily have?`.
Using the context provided by your tool name and description, the agent can decide to use the `document_qa_for_resume` MCP tool to create a response about Emily's resume.
If necessary, the agent asks permission to use the flow tool before generating the response.
If you ask about a different resume, such as `What job experience does Alex have?`, the agent can decide that `document_qa_for_resume` isn't relevant to this request, because the tool description specifies that the flow is for Emily's resume.
In this case, the agent might use another available tool, or it can inform you that it doesn't have access to information about Alex.
For example:
```
I notice you're asking about Alex's job experience.
Based on the available tools, I can see there is a Document QA for Resume flow that's designed for analyzing resumes.
However, the description mentions it's for "Emily's resume" not Alex's. I don't have access to Alex's resume or job experience information.
```
</details>
5. Close the **MCP Server Tools** window to save your changes.
@ -114,14 +145,13 @@ For example:
5. Save and close the `mcp.json` file in Cursor.
The newly added MCP server will appear in the **MCP Servers** section.
Cursor is now connected to your project's MCP server and your flows are registered as tools.
Cursor determines when to use tools based on your queries, and requests permissions when necessary.
For more information, see [Cursor's MCP documentation](https://docs.cursor.com/context/model-context-protocol).
</TabItem>
</Tabs>
Cursor is now connected to your project's MCP server and your flows are registered as tools.
Cursor determines when to use tools based on your queries, and requests permissions when necessary.
For more information, see [Cursor's MCP documentation](https://docs.cursor.com/context/model-context-protocol).
### MCP server authentication and environment variables {#authentication}
If your Langflow server [requires authentication](/configuration-authentication) ([`LANGFLOW_AUTO_LOGIN`](/environment-variables#LANGFLOW_AUTO_LOGIN) is set to `false`), then you must supply a [Langflow API key](/configuration-api-keys) in your MCP client configuration.
@ -175,44 +205,6 @@ Replace `KEY` and `VALUE` with the environment variable name and value you want
To deploy your MCP server externally with ngrok, see [Deploy a public Langflow server](/deployment-public-server).
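
As an illustrative sketch only — the server name, command, URL, and key are placeholders, and the exact command depends on how your client connects to the server — an MCP client configuration with an environment variable entry might look like:

```json
{
  "mcpServers": {
    "langflow": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "http://localhost:7860/api/v1/mcp/project/PROJECT_ID/sse"],
      "env": {
        "KEY": "VALUE"
      }
    }
  }
}
```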
## Name and describe your flows for agentic use {#name-and-describe-your-flows}
MCP clients like [Cursor](https://www.cursor.com/) "see" your Langflow project as a single MCP server, with _all_ of your enabled flows listed as tools.
This can confuse agents.
For example, an agent won't know that flow `adbbf8c7-0a34-493b-90ea-5e8b42f78b66` is a [Document Q&A](/document-qa) flow for a specific text file.
To prevent this behavior, make sure to [name and describe](#select-flows-to-serve) your flows clearly.
It's helpful to think of the names and descriptions as function names and code comments, making sure to use clear statements describing the problems your flows solve.
For example, let's say you have a [Document Q&A](/document-qa) flow that loads a sample resume for an LLM to chat with, and that you've given it the following name and description:
- **Tool Name**: `document_qa_for_resume`
- **Tool Description**: `A flow for analyzing Emily's resume.`
If you ask Cursor a question specifically about the resume, such as `What job experience does Emily have?`, the agent asks to call the MCP tool `document_qa_for_resume`.
That's because your name and description provided the agent with a clear purpose for the tool.
When you run the tool, the agent requests permissions when necessary, and then provides a response.
For example:
```
{
"input_value": "What job experience does Emily have?"
}
Result:
What job experience does Emily have?
Emily J. Wilson has the following job experience:
```
If you ask about a different resume, such as `What job experience does Alex have?`, you've provided enough information in the description for the agent to make the correct decision:
```
I notice you're asking about Alex's job experience.
Based on the available tools, I can see there is a Document QA for Resume flow that's designed for analyzing resumes.
However, the description mentions it's for "Emily's resume" not Alex's. I don't have access to Alex's resume or job experience information.
```
## Use MCP Inspector to test and debug flows {#test-and-debug-flows}
[MCP Inspector](https://modelcontextprotocol.io/docs/tools/inspector) is a common tool for testing and debugging MCP servers.
View file
@ -13,18 +13,17 @@ The API key has the same permissions and access as you do when you launch Langfl
An API key represents the user who created it. If you create a key as a superuser, then that key will have superuser privileges.
Anyone who has that key can authorize superuser actions through the Langflow API, including user management and flow management.
In Langflow versions 1.5 and later, most API requests require a Langflow API key, even when `AUTO_LOGIN=true`.
The only exceptions are the MCP endpoints: `/v1/mcp`, `/v1/mcp-projects`, and `/v2/mcp`.
These endpoints don't require authentication, regardless of the `AUTO_LOGIN` setting.
In Langflow versions 1.5 and later, most API endpoints require a Langflow API key, even when `AUTO_LOGIN` is set to `True`.
The only exceptions are the MCP endpoints `/v1/mcp`, `/v1/mcp-projects`, and `/v2/mcp`, which never require authentication.
<details>
<summary>Auto-login and API key authentication in earlier Langflow versions</summary>
<summary>AUTO_LOGIN and SKIP_AUTH options</summary>
If you are running a Langflow version earlier than 1.5 with `AUTO_LOGIN=true`, Langflow automatically logs users in as a superuser without requiring authentication, and API requests can be made without a Langflow API key.
If you set `SKIP_AUTH_AUTO_LOGIN=true` and `AUTO_LOGIN=true`, authentication will be skipped entirely, and API requests will not require a Langflow API key.
In Langflow versions earlier than 1.5, if `AUTO_LOGIN=true`, then Langflow automatically logs users in as a superuser without requiring authentication.
In this case, API requests don't require a Langflow API key.
In Langflow version 1.5, you can set `SKIP_AUTH_AUTO_LOGIN=true` and `AUTO_LOGIN=true` to skip authentication for API requests.
However, the `SKIP_AUTH_AUTO_LOGIN` option will be removed in a future release.
</details>
## Generate a Langflow API key
@ -36,13 +35,12 @@ The UI-generated key is appropriate for most cases. The CLI-generated key is nee
<Tabs>
<TabItem value="Langflow UI" label="Langflow UI" default>
1. Click your user icon, and then select **Settings**.
1. In the Langflow UI header, click your profile icon, and then select **Settings**.
2. Click **Langflow API Keys**, and then click **Add New**.
3. Name your key, and then click **Create API Key**.
4. Copy the API key and store it in a secure location.
4. Copy the API key and store it securely.
</TabItem>
<TabItem value="Langflow CLI" label="Langflow CLI">
If you're serving your flow with `--backend-only=true`, you can't create API keys in the UI, because the frontend is not running.
@ -90,7 +88,7 @@ To create an API key for a user from the CLI, do the following:
</details>
2. Create an API key:
3. Create an API key:
```shell
uv run langflow api-key
@ -102,7 +100,7 @@ To create an API key for a user from the CLI, do the following:
Include your API key in API requests to authenticate requests to Langflow.
API keys allow access only to the flows and components of the specific user to whom the key was issued.
API keys allow access only to the flows and components of the specific user who created the key.
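For example, the following Python sketch prepares an authenticated flow run request with the API key in the `x-api-key` header. The server URL, `FLOW_ID`, and `YOUR_API_KEY` are placeholders; substitute your own values.

```python
import json
import urllib.request

# Placeholder values: substitute your server URL, flow ID, and API key.
url = "http://localhost:7860/api/v1/run/FLOW_ID"
payload = json.dumps({"input_value": "Hello!"}).encode("utf-8")

request = urllib.request.Request(
    url,
    data=payload,
    headers={
        "x-api-key": "YOUR_API_KEY",  # Langflow API key header
        "Content-Type": "application/json",
    },
    method="POST",
)
# Send with urllib.request.urlopen(request) once your Langflow server is running.
```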
<Tabs>
<TabItem value="HTTP header" label="HTTP header" default>
@ -141,10 +139,10 @@ For more information, see [Authentication](/configuration-authentication#langflo
## Revoke an API key
To revoke an API key, delete it from the list of keys in the **Settings** menu.
To revoke an API key, delete it from your Langflow settings:
1. Click your user icon, and then select **Settings**.
2. Click **Langflow API**.
1. In the Langflow UI header, click your profile icon, and then select **Settings**.
2. Click **Langflow API Keys**.
3. Select the keys you want to delete, and then click <Icon name="Trash2" aria-hidden="true"/> **Delete**.
This action immediately invalidates the key and prevents it from being used again.
@ -167,13 +165,18 @@ GOOGLE_API_KEY=...
### Add component API keys with the Langflow UI
To add component API keys as **Global variables** with the Langflow UI:
You can store API keys for Langflow components as [global variables](/configuration-global-variables) in Langflow:
1. Click your user icon, and then select **Settings**.
2. Click **Langflow API**.
3. Add new API keys as **Credential** type variables.
4. Apply them to specific component fields.
1. In the Langflow UI header, click your profile icon, and then select **Settings**.
2. Click **Global Variables**.
3. Click **Add New**.
4. For **Type**, select **Credential**.
5. For **Name**, enter a name for the variable that will store the API key.
6. For **Value**, enter the API key that you want to store.
7. For **Apply to fields**, you can select component fields to automatically populate with this variable.
Component values set directly in a flow override values set in the UI **and** environment variables.
You can override automatically set variables by manually entering a different variable name or value when you add the affected component to a flow.
For more information, see [Global variables](/configuration-global-variables).
Additionally, you can override all component settings by [running a flow with tweaks](/concepts-publish#input-schema), which are modifications to component settings that you make at runtime and apply to a single flow run only.
8. Click **Save Variable**.
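To illustrate how a runtime tweak differs from a stored global variable, the following hypothetical payload overrides a single component field for one flow run only. The component ID `OpenAIModel-abc12` is a placeholder; use a component ID from your own flow.

```python
import json

# Hypothetical tweaks payload: overrides one component field at runtime only.
# "OpenAIModel-abc12" is a placeholder component ID, not a real flow's value.
payload = {
    "input_value": "Hello!",
    "tweaks": {
        "OpenAIModel-abc12": {"temperature": 0.2},
    },
}
print(json.dumps(payload, indent=2))
```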
View file
@ -19,25 +19,10 @@ For more information, see [Start a secure Langflow server with authentication](#
This section describes the available authentication configuration variables.
The Langflow project includes a [`.env.example`](https://github.com/langflow-ai/langflow/blob/main/.env.example) file to help you get started.
You can copy the contents of this file into your own `.env` file and replace the example values with your own preferred settings.
You can use the [`.env.example`](https://github.com/langflow-ai/langflow/blob/main/.env.example) file in the Langflow repository as a template for your own `.env` file.
### LANGFLOW_AUTO_LOGIN
In Langflow versions 1.5 and later, most API requests require a Langflow API key, even when `AUTO_LOGIN=true`.
The only exceptions are the MCP endpoints: `/v1/mcp`, `/v1/mcp-projects`, and `/v2/mcp`.
These endpoints don't require authentication, regardless of the `AUTO_LOGIN` setting.
<details>
<summary>Auto-login and API key authentication in earlier Langflow versions</summary>
If you are running a Langflow version earlier than 1.5 with `AUTO_LOGIN=true`, Langflow automatically logs users in as a superuser without requiring authentication, and API requests can be made without a Langflow API key.
If you set `SKIP_AUTH_AUTO_LOGIN=true` and `AUTO_LOGIN=true`, authentication will be skipped entirely, and API requests will not require a Langflow API key.
</details>
Langflow **does not** allow users to have simultaneous or shared access to flows.
If `AUTO_LOGIN` is enabled and user management is disabled (`LANGFLOW_NEW_USER_IS_ACTIVE=true`), users can access the same environment, but it is not password protected. If two users access the same flow, Langflow saves only the work of the last user to save.
@ -45,6 +30,19 @@ If `AUTO_LOGIN` is enabled and user management is disabled (`LANGFLOW_NEW_USER_I
LANGFLOW_AUTO_LOGIN=True
```
In Langflow versions 1.5 and later, most API endpoints require a Langflow API key, even when `AUTO_LOGIN` is set to `True`.
The only exceptions are the MCP endpoints `/v1/mcp`, `/v1/mcp-projects`, and `/v2/mcp`, which never require authentication.
<details>
<summary>AUTO_LOGIN and SKIP_AUTH options</summary>
In Langflow versions earlier than 1.5, if `AUTO_LOGIN=true`, then Langflow automatically logs users in as a superuser without requiring authentication.
In this case, API requests don't require a Langflow API key.
In Langflow version 1.5, you can set `SKIP_AUTH_AUTO_LOGIN=true` and `AUTO_LOGIN=true` to skip authentication for API requests.
However, the `SKIP_AUTH_AUTO_LOGIN` option will be removed in a future release.
</details>
### LANGFLOW_SUPERUSER and LANGFLOW_SUPERUSER_PASSWORD
These environment variables are only relevant when `LANGFLOW_AUTO_LOGIN` is set to `False`.
View file
@ -12,48 +12,49 @@ This guide walks you through setting up an external database for Langflow by rep
## Connect Langflow to PostgreSQL
To connect Langflow to PostgreSQL, follow these steps.
1. If Langflow is running, quit Langflow.
1. Find your PostgreSQL database's connection string.
It looks like `postgresql://user:password@host:port/dbname`.
2. Find your PostgreSQL database's connection string in the format `postgresql://user:password@host:port/dbname`.
The hostname in your connection string depends on how you're running PostgreSQL.
- If you're running PostgreSQL directly on your machine, use `localhost`.
- If you're running PostgreSQL in Docker Compose, use the service name, such as `postgres`.
- If you're running PostgreSQL in a separate Docker container with `docker run`, use the container's IP address or network alias.
The hostname in your connection string depends on how you're running PostgreSQL:
2. Create a `.env` file for configuring Langflow.
```
touch .env
```
- If you're running PostgreSQL directly on your machine, use `localhost`.
- If you're running PostgreSQL in Docker Compose, use the service name, such as `postgres`.
- If you're running PostgreSQL in a separate Docker container with `docker run`, use the container's IP address or network alias.
3. To set the database URL environment variable, add it to your `.env` file:
```text
LANGFLOW_DATABASE_URL="postgresql://user:password@localhost:5432/dbname"
```
3. Create a Langflow `.env` file if you don't already have one:
:::tip
The Langflow project includes a [`.env.example`](https://github.com/langflow-ai/langflow/blob/main/.env.example) file to help you get started.
You can copy the contents of this file into your own `.env` file and replace the example values with your own preferred settings.
Replace the value for `LANGFLOW_DATABASE_URL` with your PostgreSQL connection string.
:::
```
touch .env
```
4. Run Langflow with the `.env` file:
```bash
uv run langflow run --env-file .env
```
You can use the [`.env.example`](https://github.com/langflow-ai/langflow/blob/main/.env.example) file in the Langflow repository as a template for your own `.env` file.
5. In Langflow, create traffic by running a flow.
6. Inspect your PostgreSQL deployment's tables and activity.
New tables and traffic are created.
4. In your `.env` file, set `LANGFLOW_DATABASE_URL` to your PostgreSQL connection string:
```text
LANGFLOW_DATABASE_URL="postgresql://user:password@localhost:5432/dbname"
```
5. Save your changes, and then start Langflow with your `.env` file:
```bash
uv run langflow run --env-file .env
```
6. In Langflow, run any flow to create traffic.
7. Inspect your PostgreSQL deployment's tables and activity to verify that new tables and traffic were created after you ran a flow.
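Malformed connection strings are a common cause of startup failures. As a quick sanity check before you start Langflow, you can split a connection string into its parts (placeholder credentials shown):

```python
from urllib.parse import urlparse

# Placeholder connection string; substitute your own user, password, host, and database.
url = "postgresql://user:password@localhost:5432/dbname"
parts = urlparse(url)

print(parts.scheme)            # postgresql
print(parts.hostname)          # localhost, a Compose service name, or a container address
print(parts.port)              # 5432
print(parts.path.lstrip("/"))  # dbname
```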
## Example Langflow and PostgreSQL docker-compose.yml
The Langflow project includes a [docker-compose.yml](https://github.com/langflow-ai/langflow/blob/main/docker_example/docker-compose.yml) file for quick deployment with PostgreSQL.
Docker Compose creates an isolated network for all services defined in `docker-compose.yml`. This ensures that the services can communicate with each other using their service names as hostnames, such as `postgres` in the database URL.
In contrast, if you run PostgreSQL separately with `docker run`, it launches in a different network than the Langflow container, and this prevents Langflow from connecting to PostgreSQL using the service name.
This configuration launches Langflow and PostgreSQL containers in the same Docker network, ensuring proper connectivity between services. It also sets up persistent volumes for both Langflow and PostgreSQL data.
You can use the [`docker-compose.yml`](https://github.com/langflow-ai/langflow/blob/main/docker_example/docker-compose.yml) file in the Langflow repository to launch Langflow and PostgreSQL containers in the same Docker network, which ensures proper connectivity between services.
This configuration also sets up persistent volumes for both Langflow and PostgreSQL data.
To start the services, navigate to the `/docker_example` directory, and then run `docker-compose up`.
To start the services, navigate to the `/docker_example` directory, and then run `docker-compose up`:
```yaml
services:
@ -87,10 +88,6 @@ volumes:
langflow-data: # Persistent volume for Langflow data
```
:::note
Docker Compose creates an isolated network for all services defined in the docker-compose.yml file. This ensures that the services can communicate with each other using their service names as hostnames, for example, `postgres` in the database URL. If you were to run PostgreSQL separately using `docker run`, it would be in a different network and Langflow wouldn't be able to connect to it using the service name.
:::
## Deploy multiple Langflow instances with a shared database
To configure multiple Langflow instances that share the same PostgreSQL database, modify your `docker-compose.yml` file to include multiple Langflow services.
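For example, a hypothetical `docker-compose.yml` fragment with two Langflow services pointing at the same `postgres` service. The service names, ports, and credentials are illustrative, not a recommended production configuration:

```yaml
# Hypothetical fragment: two Langflow instances sharing one PostgreSQL database.
services:
  langflow-1:
    image: langflowai/langflow:latest
    ports:
      - "7860:7860"
    environment:
      - LANGFLOW_DATABASE_URL=postgresql://langflow:langflow@postgres:5432/langflow
    depends_on:
      - postgres
  langflow-2:
    image: langflowai/langflow:latest
    ports:
      - "7861:7860"  # second instance exposed on a different host port
    environment:
      - LANGFLOW_DATABASE_URL=postgresql://langflow:langflow@postgres:5432/langflow
    depends_on:
      - postgres
```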
View file
@ -16,8 +16,7 @@ Langflow stores global variables in its internal database, and encrypts the valu
To create a new global variable, follow these steps.
1. In the Langflow UI, click your profile icon, and then select **Settings**.
1. In the Langflow UI header, click your profile icon, and then select **Settings**.
2. Click **Global Variables**.
3. Click **Add New**.
@ -43,7 +42,7 @@ You can now select your global variable from any text input field that displays
## Edit a global variable
1. In the Langflow UI, click your profile icon, and then select **Settings**.
1. In the Langflow UI header, click your profile icon, and then select **Settings**.
2. Click **Global Variables**.
@ -59,7 +58,7 @@ You can now select your global variable from any text input field that displays
Deleting a global variable permanently deletes any references to it from your existing projects.
:::
1. In the Langflow UI, click your profile icon, and then select **Settings**.
1. In the Langflow UI header, click your profile icon, and then select **Settings**.
2. Click **Global Variables**.
@ -114,13 +113,11 @@ If you installed Langflow locally, you must define the `LANGFLOW_VARIABLES_TO_GE
Make sure to expose your environment variables to Langflow in a manner that best suits your own environment.
:::
5. Confirm that Langflow successfully sourced the global variables from the environment.
5. Confirm that Langflow successfully sourced the global variables from the environment:
1. In the Langflow UI, click your profile icon, and then select **Settings**.
1. In the Langflow UI header, click your profile icon, and then select **Settings**.
2. Click **Global Variables**.
The environment variables appear in the list of **Global Variables**.
2. Click **Global Variables**, and then make sure that your environment variables appear in the **Global Variables** list.
</TabItem>
View file
@ -54,11 +54,13 @@ If it detects a supported environment variable, then it automatically adopts the
### Import environment variables from a .env file {#configure-variables-env-file}
1. Create a `.env` file and open it in your preferred editor.
1. If Langflow is running, quit Langflow.
2. Add your environment variables to the file:
2. Create a `.env` file, and then open it in your preferred editor.
```text title=".env"
3. Define [Langflow environment variables](#supported-variables) in the `.env` file. For example:
```text
DO_NOT_TRACK=true
LANGFLOW_AUTO_LOGIN=false
LANGFLOW_AUTO_SAVING=true
@ -91,44 +93,48 @@ If it detects a supported environment variable, then it automatically adopts the
LANGFLOW_WORKERS=3
```
:::tip
The Langflow project includes a [`.env.example`](https://github.com/langflow-ai/langflow/blob/main/.env.example) file to help you get started.
You can copy the contents of this file into your own `.env` file and replace the example values with your own preferred settings.
:::
For additional examples, see the [`.env.example`](https://github.com/langflow-ai/langflow/blob/main/.env.example) file in the Langflow repository.
3. Save and close the file.
4. Save and close `.env`.
4. Start Langflow using the `--env-file` option to define the path to your `.env` file:
<Tabs>
5. Start Langflow with your `.env` file:
<Tabs>
<TabItem value="local" label="Local" default>
```bash
python -m langflow run --env-file .env
```
</TabItem>
</TabItem>
<TabItem value="docker" label="Docker" default>
```bash
docker run -it --rm \
-p 7860:7860 \
--env-file .env \
langflowai/langflow:latest
```
</TabItem>
</TabItem>
</Tabs>
If your `.env` file isn't in the same directory, provide the path to your `.env` file.
On startup, Langflow imports the environment variables from your `.env` file, as well as any that you [set in your terminal](#configure-variables-terminal), and adopts their specified values.
## Precedence {#precedence}
Environment variables [defined in the .env file](#configure-variables-env-file) take precedence over those [set in your terminal](#configure-variables-terminal).
That means, if you set the same environment variable in both your terminal and your `.env` file, Langflow adopts the value from the `.env` file.
You can set Langflow environment variables in your terminal, in `.env`, and with [Langflow CLI options](./configuration-cli.md).
:::info[CLI precedence]
[Langflow CLI options](./configuration-cli.md) override the value of corresponding environment variables defined in the `.env` file as well as any environment variables set in your terminal.
:::
If an environment variable is set in multiple places, the following hierarchy applies:
1. Langflow CLI options override `.env` and terminal variables.
2. `.env` overrides terminal variables.
3. Terminal variables are used only if the variable isn't set in `.env` or Langflow CLI options.
For example, if you set `LANGFLOW_PORT` in `.env` and your terminal, then Langflow uses the value from `.env`.
Similarly, if you run a Langflow CLI command with `--port`, Langflow uses that port number instead of the `LANGFLOW_PORT` in `.env`.
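The hierarchy can be sketched as a simple resolution function. This is an illustration of the rules above, not Langflow's actual implementation:

```python
# Illustrative sketch of the precedence rules; not Langflow's actual code.
def resolve_setting(cli_value=None, dotenv_value=None, terminal_value=None):
    """Return the first value found, in order of precedence."""
    for value in (cli_value, dotenv_value, terminal_value):
        if value is not None:
            return value
    return None

# A CLI option wins over both .env and terminal values.
print(resolve_setting(cli_value=7862, dotenv_value=7860, terminal_value=7861))  # 7862
# .env wins over the terminal value when no CLI option is given.
print(resolve_setting(dotenv_value=7860, terminal_value=7861))  # 7860
```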
## Supported environment variables {#supported-variables}
View file
@ -9,7 +9,7 @@ This example adds a new bundle named `DarthVader`.
## Add the bundle to the backend folder
1. Navigate to the backend directory in the Langflow project and create a new folder for your bundle.
1. Navigate to the backend directory in the Langflow repository and create a new folder for your bundle.
The path for your new component is `src > backend > base > langflow > components > darth_vader`.
You can view the [components folder](https://github.com/langflow-ai/langflow/tree/main/src/backend/base/langflow/components) in the Langflow repository.
@ -23,7 +23,7 @@ For an example of adding multiple components in a bundle, see the [Notion](https
## Add the bundle to the frontend folder
1. Navigate to the frontend directory in the Langflow project to add your bundle's icon.
1. Navigate to the frontend directory in the Langflow repository to add your bundle's icon.
The path for your new component icon is `src > frontend > src > icons > DarthVader`
You can view the [icons folder](https://github.com/langflow-ai/langflow/tree/main/src/frontend/src/icons) in the Langflow repository.
To add your icon, create **three** files inside the `icons/darth_vader` folder.
@ -89,7 +89,7 @@ For example:
CrewAI: () =>
import("@/icons/CrewAI").then((mod) => ({ default: mod.CrewAiIcon })),
DarthVader: () =>
import("@/icons/DartVader").then((mod) => ({ default: mod.DarthVaderIcon })),
import("@/icons/DarthVader").then((mod) => ({ default: mod.DarthVaderIcon })),
DeepSeek: () =>
import("@/icons/DeepSeek").then((mod) => ({ default: mod.DeepSeekIcon })),
```
View file
@ -3,7 +3,7 @@ title: Join the Langflow community
slug: /contributing-community
---
There are several ways you can interact with the Langflow community and learn more about the Langflow project.
There are several ways you can interact with the Langflow community and learn more about the Langflow codebase.
## Join the Langflow Discord server
View file
@ -5,7 +5,7 @@ slug: /contributing-github-issues
The Langflow GitHub repository is an integral part of the [Langflow community](/contributing-community).
In addition to general assistance with Langflow, the repository is the best place to report bugs and request enhancements to ensure that they are tracked by the Langflow project.
In addition to general assistance with Langflow, the repository is the best place to report bugs and request enhancements to ensure that they are tracked by Langflow maintainers.
## GitHub issues
View file
@ -36,9 +36,9 @@ To opt out of telemetry, set the `LANGFLOW_DO_NOT_TRACK` or `DO_NOT_TRACK` e
### Playground {#ae6c3859f612441db3c15a7155e9f920}
- **Seconds**: Duration in seconds for playground execution, offering insights into performance during testing or experimental stages.
- **ComponentCount**: Number of components used in the playground, which helps understand complexity and usage patterns.
- **Success**: Success status of the playground operation, aiding in identifying the stability of experimental features.
- **Seconds**: Duration in seconds for Playground execution, offering insights into performance during testing or experimental stages.
- **ComponentCount**: Number of components used in the Playground, which helps understand complexity and usage patterns.
- **Success**: Success status of the Playground operation, aiding in identifying the stability of experimental features.
### Component {#630728d6654c40a6b8901459a4bc3a4e}
View file
@ -3,30 +3,30 @@ title: Deploy Langflow on Docker
slug: /deployment-docker
---
This guide demonstrates deploying Langflow with Docker and Docker Compose.
Running applications in Docker containers ensures consistent behavior across different systems and eliminates dependency conflicts.
Three options are available:
You can use the Langflow Docker image to start a Langflow container.
* The [Quickstart](#quickstart) option starts a Docker container with default values.
* The [Docker compose](#clone-the-repo-and-build-the-langflow-docker-container) option builds Langflow with a persistent PostgreSQL database service.
* The [Package your flow as a docker image](#package-your-flow-as-a-docker-image) option demonstrates packaging an existing flow with a Dockerfile.
This guide demonstrates several ways to deploy Langflow with [Docker](https://docs.docker.com/) and [Docker Compose](https://docs.docker.com/compose/):
For more information on configuring the Docker image, see [Customize the Langflow Docker image with your own code](#customize-the-langflow-docker-image-with-your-own-code).
<!-- no toc -->
* [Start a Langflow container with default values](#quickstart)
* [Clone the repo and use Docker Compose to build the Langflow Docker container](#clone-the-repo-and-build-the-langflow-docker-container) with a persistent PostgreSQL database service
* [Use a Dockerfile to package a flow as a Docker image](#package-your-flow-as-a-docker-image)
* [Customize the Langflow Docker image](#customize-the-langflow-docker-image-with-your-own-code)
## Prerequisites
## Start a Langflow container with default values {#quickstart}
- [Docker](https://docs.docker.com/)
- [Docker Compose](https://docs.docker.com/compose/)
## Quickstart
With Docker installed and running on your system, run this command:
With Docker installed and running on your system, run the following command:
`docker run -p 7860:7860 langflowai/langflow:latest`
Langflow is now accessible at `http://localhost:7860/`.
Then, access Langflow at `http://localhost:7860/`.
## Clone the repo and build the Langflow Docker container
Use Docker Compose to build Langflow with a persistent PostgreSQL database service:
1. Clone the Langflow repository:
`git clone https://github.com/langflow-ai/langflow.git`
@ -39,17 +39,14 @@ Langflow is now accessible at `http://localhost:7860/`.
`docker compose up`
Langflow is now accessible at `http://localhost:7860/`.
4. Access Langflow at `http://localhost:7860/`.
### Configure Docker services
The Docker Compose configuration spins up two services: `langflow` and `postgres`.
To configure values for these services at container startup, include them in your `.env` file.
An example `.env` file is available in the [project repository](https://github.com/langflow-ai/langflow/blob/main/.env.example).
To pass the `.env` values at container startup, include the flag in your `docker run` command:
To configure values for these services at container startup, define relevant [Langflow environment variables](/environment-variables) in a `.env` file.
Then, include the `--env-file` flag in your `docker run` command:
```
docker run -it --rm \
@ -58,6 +55,8 @@ docker run -it --rm \
langflowai/langflow:latest
```
If your `.env` file isn't in the same directory, provide the path to your `.env` file.
### Langflow service
The `langflow` service serves both the backend API and frontend UI of the Langflow web application.
@ -95,7 +94,7 @@ Volumes:
If you want to deploy a specific version of Langflow, you can modify the `image` field under the `langflow` service in the Docker Compose file. For example, to use version `1.0-alpha`, change `langflowai/langflow:latest` to `langflowai/langflow:1.0-alpha`.
## Package your flow as a Docker image
## Package your flow as a Docker image {#package-your-flow-as-a-docker-image}
You can include your Langflow flow with the application image.
When you build the image, your saved `.json` flow file is included.
View file
@ -6,7 +6,7 @@ slug: /deployment-public-server
By default, your Langflow server at `http://localhost:7860` isn't exposed to the public internet.
However, you can forward Langflow server traffic with a forwarding platform like [ngrok](https://ngrok.com/docs/getting-started/) or [zrok](https://docs.zrok.io/docs/getting-started) to make your server public.
When your Langflow server is public, you can do things like [deploy your Langflow MCP server externally](#deploy-your-mcp-server-externally), [serve API requests](#serve-api-requests), and [share your playground externally](#share-your-playground-externally).
When your Langflow server is public, you can do things like [deploy your Langflow MCP server externally](#deploy-your-mcp-server-externally), [serve API requests](#serve-api-requests), and [share a flow's **Playground** publicly](#share-a-flows-playground).
## Prerequisites
@ -55,7 +55,7 @@ When your Langflow server is public, you can do things like [deploy your Langflo
## Use a public Langflow server
When your Langflow server is public, you can do things like [deploy your Langflow MCP server externally](#deploy-your-mcp-server-externally), [serve API requests](#serve-api-requests), and [share a Playground as a public website](#share-your-playground-externally).
When your Langflow server is public, you can do things like [deploy your Langflow MCP server externally](#deploy-your-mcp-server-externally), [serve API requests](#serve-api-requests), and [share a flow's **Playground** publicly](#share-a-flows-playground).
### Deploy your MCP server externally
@ -116,24 +116,9 @@ For example, the following code snippet calls an ngrok domain to trigger the spe
For a demo of the Langflow API in a script, see the [Quickstart](/get-started-quickstart).
### Share your playground externally
### Share a flow's Playground
The **Shareable playground** option exposes the **Playground** for a single flow at the `/public_flow/{flow-id}` endpoint.
After you deploy a public Langflow server, you can use the **Shareable Playground** option to make a flow's **Playground** available at a public URL.
If a user accesses this URL, they can interact with the flow's chat input and output and view the results without installing Langflow or generating a Langflow API key.
This allows you to share a public URL with another user that displays only the **Playground** chat window for the specified flow.
The user can interact with the flow's chat input and output and view the results without requiring a Langflow installation or API keys of their own.
:::important
The **Shareable Playground** is for testing purposes only.
The **Playground** isn't meant for embedding flows in applications. For information about running flows in applications or websites, see [About developing and configuring Langflow applications](/develop-overview) and [Publish flows](/concepts-publish).
:::
To share a flow's **Playground** with another user, do the following:
1. In Langflow, open the flow you want share.
2. From the **Workspace**, click **Share**, and then enable **Shareable Playground**.
3. Click **Shareable Playground** again to open the **Playground** window.
This window's URL is the flow's **Shareable Playground** address, such as `https://3f7c-73-64-93-151.ngrok-free.app/playground/d764c4b8-5cec-4c0f-9de0-4b419b11901a`.
4. Send the URL to another user to give them access to the flow's **Playground**.
For more information, see [Share a flow's Playground](/concepts-playground#share-a-flows-playground).
View file
@ -359,10 +359,4 @@ Log: Log {
</details>
The `FlowResponse` object is returned to the client, with the `outputs` array including your flow result.
## Langflow TypeScript project repository
You can do even more with the Langflow TypeScript client.
For more information, see the [langflow-client-ts](https://github.com/datastax/langflow-client-ts/) repository.
The `FlowResponse` object is returned to the client, with the `outputs` array including your flow result.
View file
@ -66,7 +66,7 @@ To add authentication to your server, see [Authentication](/configuration-authen
Add your flow's `.JSON` files to the `/flows` folder.
To export your flows from Langflow, see [Flows](/concepts-flows).
To export your flows from Langflow, see [Import and export flows](/concepts-flows-import).
Optionally, add any custom components to a `/components` folder, and specify the path in your `docker.env`.
View file
@ -8,7 +8,7 @@ The following pages provide information about how to develop and configure Langf
The [Develop an application in Langflow](/develop-application) guide walks you through packaging and serving a flow, from your local development environment to a containerized application.
As you build your application, you configure the following application behaviors. More detailed explanations are provided in the individual pages.
* [Custom Dependencies](/install-custom-dependencies) - Add and manage additional Python packages and external dependencies in your Langflow projects.
* [Custom Dependencies](/install-custom-dependencies) - Add and manage additional Python packages and external dependencies in your Langflow applications.
* [Memory and Storage](/memory) - Configure Langflow's storage and caching behavior.
View file
@ -67,9 +67,9 @@ If you're working within a cloned Langflow repository, add dependencies with `uv
uv add langflow matplotlib
```
## Add dependencies to the Langflow project
## Add dependencies to the Langflow codebase
When contributing to Langflow itself, add dependencies to the project's configuration.
When contributing to the Langflow codebase, you might need to add dependencies to Langflow.
Langflow uses a workspace with two packages:
View file
@ -7,7 +7,7 @@ Langflow provides flexible memory management options for storage and retrieval.
This page details the following memory configuration options in Langflow.
- [Use local Langflow database tables](#local-langflow-database-tables)
- [Local Langflow database tables](#local-langflow-database-tables)
- [Store messages in local memory](#store-messages-in-local-memory)
- [Configure external memory](#configure-external-memory)
- [Configure the external database connection](#configure-the-external-database-connection)
@ -25,7 +25,7 @@ The following tables are stored in `langflow.db`:
**User** - Stores user account information including credentials, permissions, and profiles. For more information, see [Authentication](/configuration-authentication).
**Flow** - Contains flow configurations. For more information, see [Flows](/concepts-flows).
**Flow** - Contains flow configurations. For more information, see [Build flows](/concepts-flows).
**Message** - Stores chat messages and interactions that occur between components. For more information, see [Message objects](/concepts-objects#message-object).
@ -33,11 +33,11 @@ The following tables are stored in `langflow.db`:
**ApiKey** - Manages API authentication keys for users. For more information, see [API keys](/configuration-api-keys).
**Project** - Provides a structure for flow storage. For more information, see [Projects](/concepts-overview#projects).
**Project** - Provides a structure for flow storage. For more information, see [Projects](/concepts-flows#projects).
**Variables** - Stores global encrypted values and credentials. For more information, see [Global variables](/configuration-global-variables).
**VertexBuild** - Tracks the build status of individual nodes within flows. For more information, see [Run a flow in the playground](/concepts-playground).
**VertexBuild** - Tracks the build status of individual nodes within flows. For more information, see [Run a flow in the Playground](/concepts-playground).
For more information, see the database models in the [source code](https://github.com/langflow-ai/langflow/tree/main/src/backend/base/langflow/services/database/models).

View file

@@ -0,0 +1,82 @@
---
title: What is Langflow?
slug: /about-langflow
---
Langflow is an open-source, Python-based, customizable framework for building AI applications.
It supports important AI functionality like agents and the Model Context Protocol (MCP), and it doesn't require you to use specific large language models (LLMs) or vector stores.
The visual editor simplifies prototyping of application workflows, enabling developers to quickly turn their ideas into powerful, real-world solutions.
:::tip Try it
Build and run your first flow in minutes: [Install Langflow](/get-started-installation), and then try the [Quickstart](/get-started-quickstart).
:::
## Application development and prototyping
Langflow can help you develop a wide variety of AI applications, such as [chatbots](/memory-chatbot), [document analysis systems](/document-qa), [content generators](/blog-writer), and [agentic applications](/simple-agent).
### Create flows in minutes
The primary purpose of Langflow is to create and serve flows, which are functional representations of application workflows.
To [build a flow](/concepts-flows), you connect and configure component nodes. Each component is a single step in the workflow.
With Langflow's [visual editor](/concepts-overview), you can drag and drop components to quickly build and test a functional AI application workflow.
For example, you could build a chatbot flow for an e-commerce store that uses an LLM and a product data store to allow customers to ask questions about the store's products.
![Basic prompting flow within the Workspace](/img/workspace-basic-prompting.png)
### Test flows in real-time
You can use the [Playground](/concepts-playground) to test flows without having to build your entire application stack.
You can interact with your flows and get real-time feedback about flow logic and response generation.
You can also run individual components to test dependencies in isolation.
### Run and serve flows
You can use your flows as prototypes for more formal application development, or you can use the Langflow API to embed your flows into your application code.
For more extensive projects, you can build Langflow as a dependency or deploy a Langflow server to serve flows over the public internet.
For more information, see the following:
* [Share and embed flows](/concepts-publish)
* [Get started with the Langflow API](/api-reference-api-examples)
* [Develop an application with Langflow](/develop-application)
* [Langflow deployment overview](/deployment-overview)
## Endless modifications and integrations
Langflow provides [components](/concepts-components) that support many services, tools, and functionality that are required for AI applications.
Some components are generalized, such as inputs, outputs, and data stores.
Others are specialized, such as agents, language models, and embedding providers.
All components offer parameters that you can set to fixed or variable values. You can also use tweaks to temporarily override flow settings at runtime.
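For example, a tweak can override a single component parameter for just one run. The following request is a minimal sketch that assumes a local Langflow server; `FLOW_ID`, the component ID `Model-abc12`, and the `temperature` parameter are placeholders for your own flow's values:

```shell
# Override one component parameter at runtime with the "tweaks" object.
# FLOW_ID and Model-abc12 are placeholders; substitute your own IDs.
curl --request POST \
  --url "http://127.0.0.1:7860/api/v1/run/FLOW_ID" \
  --header "Content-Type: application/json" \
  --header "x-api-key: $LANGFLOW_API_KEY" \
  --data '{
    "input_value": "Hello",
    "tweaks": {
      "Model-abc12": { "temperature": 0.2 }
    }
  }'
```

The tweak applies only to this request; the flow's stored configuration is unchanged.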
### Agent and MCP support
In addition to building agentic flows with Langflow, you can leverage Langflow's built-in agent and MCP features:
* [Use Langflow Agents](/agents)
* [Use components and flows as Agent tools](/agents-tools)
* [Use Langflow as an MCP server](/mcp-server)
* [Use Langflow as an MCP client](/mcp-client)
### Extensibility
In addition to the core components, Langflow supports custom components.
You can use custom components developed by others, and you can develop your own custom components for personal use or to share with other Langflow users.
For more information, see the following:
* [Contribute to Langflow](/contributing-how-to-contribute)
* [Create custom Python components](/components-custom-components)
## Next steps
* [Install Langflow](/get-started-installation)
* [Quickstart](/get-started-quickstart)

View file

@@ -8,15 +8,20 @@ import TabItem from '@theme/TabItem';
Langflow can be installed in multiple ways:
* **Langflow Desktop (Recommended)**: Download and install the [standalone desktop application](#install-and-run-langflow-desktop) for the easiest setup experience.
* [**Langflow Desktop (Recommended)**](#install-and-run-langflow-desktop): Download and install the standalone desktop application for the least complicated setup experience.
This option includes dependency management and facilitated upgrades.
* **Docker**: Pull and run the [Docker image](#install-and-run-langflow-docker) to start a Langflow container.
* [**Docker**](#install-and-run-langflow-docker): Pull and run the Langflow Docker image to start a Langflow container and run Langflow in isolation.
* **Python package**: Install the [Langflow OSS Python package](#install-and-run-the-langflow-oss-python-package).
* [**Python package**](#install-and-run-the-langflow-oss-python-package): Install and run the Langflow OSS Python package.
This option offers more control over the environment, dependencies, and versioning.
* [**Install from source**](/contributing-how-to-contribute#install-langflow-from-source): Use this option if you want to contribute to the Langflow codebase or documentation.
## Install and run Langflow Desktop
**Langflow Desktop** is a desktop version of Langflow that includes all the features of open source Langflow, with an additional [version management](#manage-your-version-of-langflow-desktop) feature for managing your Langflow version.
Langflow Desktop is a desktop version of Langflow that simplifies dependency management and upgrades.
However, some features aren't available for Langflow Desktop, such as the **Shareable Playground**.
<Tabs groupId="os">
<TabItem value="macOS" label="macOS">
@@ -24,9 +29,7 @@ Langflow can be installed in multiple ways:
1. Navigate to [Langflow Desktop](https://www.langflow.org/desktop).
2. Click **Download Langflow**, enter your contact information, and then click **Download**.
3. Mount and install the Langflow application.
4. When the installation completes, open the Langflow application.
After confirming that Langflow is running, create your first flow with the [Quickstart](/get-started-quickstart).
4. When the installation completes, open the Langflow application, and then create your first flow with the [Quickstart](/get-started-quickstart).
</TabItem>
<TabItem value="Windows" label="Windows">
@@ -40,58 +43,45 @@ Langflow can be installed in multiple ways:
Windows installations of Langflow Desktop require a C++ compiler that may not be present on your system. If you receive a `C++ Build Tools Required!` error, follow the on-screen prompt to install Microsoft C++ Build Tools, or [install Microsoft Visual Studio](https://visualstudio.microsoft.com/downloads/).
:::
5. When the installation completes, open the Langflow application.
After confirming that Langflow is running, create your first flow with the [Quickstart](/get-started-quickstart).
5. When the installation completes, open the Langflow application, and then create your first flow with the [Quickstart](/get-started-quickstart).
</TabItem>
</Tabs>
After confirming that Langflow is running, create your first flow with the [Quickstart](/get-started-quickstart).
### Manage your version of Langflow Desktop
When a new version of Langflow is available, Langflow Desktop displays an upgrade message.
To manage your version of Langflow Desktop, follow these steps:
1. In Langflow Desktop, click your profile image, and then select **Version Management**.
The **Version Management** pane lists your active Langflow version first, followed by other available versions.
The **latest** version is always highlighted.
2. To change your Langflow version, select another version.
A confirmation pane containing the selected version's changelog appears.
3. To apply the change, click **Confirm**.
Langflow Desktop restarts to install and activate the new version.
### Manage dependencies in Langflow Desktop
For upgrade information, see the [Release notes](/release-notes).
To manage dependencies in Langflow Desktop, see [Install custom dependencies in Langflow Desktop](/install-custom-dependencies#langflow-desktop).
## Install and run Langflow with Docker {#install-and-run-langflow-docker}
You can use the [Langflow Docker image](https://hub.docker.com/r/langflowai/langflow) to run Langflow in an isolated environment.
Running applications in [Docker](https://docs.docker.com/) containers ensures consistent behavior across different systems and eliminates dependency conflicts.
You can use the Langflow Docker image to start a Langflow container.
For more information, see [Deploy Langflow on Docker](/deployment-docker).
1. Install and start [Docker](https://docs.docker.com/).
2. Pull the latest [Langflow Docker image](https://hub.docker.com/r/langflowai/langflow) and start it:
```bash
docker run -p 7860:7860 langflowai/langflow:latest
```
```bash
docker run -p 7860:7860 langflowai/langflow:latest
```
3. To access Langflow, navigate to `http://localhost:7860/`.
For more information, see [Deploy Langflow on Docker](/deployment-docker).
4. Create your first flow with the [Quickstart](/get-started-quickstart).
## Install and run the Langflow OSS Python package
To install and run Langflow OSS, you need the following:
1. Make sure you have the required dependencies and infrastructure:
- [Python 3.10 to 3.13](https://www.python.org/downloads/release/python-3100/) for macOS/Linux, and Python 3.10 to 3.12 for Windows
- [uv](https://docs.astral.sh/uv/getting-started/installation/)
- At minimum, a dual-core CPU and 2GB RAM, but a multi-core CPU and at least 4GB RAM are recommended
- [Python](https://www.python.org/downloads/release/python-3100/)
- macOS and Linux: Version 3.10 to 3.13
- Windows: Version 3.10 to 3.12
- [uv](https://docs.astral.sh/uv/getting-started/installation/)
- Sufficient infrastructure:
- Minimum: Dual-core CPU and 2GB RAM
- Recommended: Multi-core CPU and at least 4GB RAM
1. Create a virtual environment with [uv](https://docs.astral.sh/uv/pip/environments).
2. Create a virtual environment with [uv](https://docs.astral.sh/uv/pip/environments).
<details>
<summary>Need help with virtual environments?</summary>
@@ -140,51 +130,43 @@ To delete the virtual environment, type `Remove-Item VENV_NAME`.
</details>
2. To install Langflow, run the following command.
3. In your virtual environment, install Langflow:
```bash
uv pip install langflow
```
3. After installation, start Langflow:
To install a specific version of the Langflow package, add the required version to the command, such as `uv pip install langflow==1.4.22`.
<details>
<summary>Reinstall or upgrade Langflow</summary>
To reinstall Langflow and all of its dependencies, run `uv pip install langflow --force-reinstall`.
To upgrade Langflow to the latest version, run `uv pip install langflow -U`.
However, the Langflow team recommends taking steps to back up your existing installation before you upgrade Langflow.
For more information, see [Prepare to upgrade](/release-notes#prepare-to-upgrade).
</details>
4. Start Langflow:
```bash
uv run langflow run
```
4. To confirm that a local Langflow instance is running, navigate to the default Langflow URL `http://127.0.0.1:7860`.
It can take a few minutes for Langflow to start.
It can take a few minutes for Langflow to start.
After confirming that Langflow is running, create your first flow with the [Quickstart](/get-started-quickstart).
5. To confirm that a local Langflow instance is running, navigate to the default Langflow URL `http://127.0.0.1:7860`.
### Install Langflow from source
6. Create your first flow with the [Quickstart](/get-started-quickstart).
To install Langflow from source, see [Install Langflow from source](/contributing-how-to-contribute#install-langflow-from-source).
For upgrade information, see the [Release notes](/release-notes).
### Manage Langflow OSS versions
:::important
The Langflow team recommends installing new Langflow versions in a new virtual environment before upgrading your primary installation.
This allows you to [import flows](/concepts-flows#import-flow) from your existing installation and test them in the new version without disrupting your existing installation.
In the event of breaking changes or bugs, your existing installation is preserved in a stable state.
:::
To manage your Langflow OSS version, use the following commands:
* Upgrade Langflow to the latest version: `uv pip install langflow -U`
* Install a specific version of the Langflow package by adding the required version to the command, such as: `uv pip install langflow==1.3.2`
* Reinstall Langflow and all of its dependencies: `uv pip install langflow --force-reinstall`
### Manage Langflow OSS dependencies
Langflow OSS provides optional dependency groups and support for custom dependencies to extend Langflow functionality.
For more information, see [Install custom dependencies](/install-custom-dependencies).
## Troubleshoot Langflow installation and startup issues
If you encounter an issue when installing or running Langflow, see [Troubleshoot Langflow](/troubleshoot).
For information about optional dependency groups and support for custom dependencies to extend Langflow OSS functionality, see [Install custom dependencies](/install-custom-dependencies).
## Next steps
After installing Langflow, build and run a flow with the [quickstart](/get-started-quickstart).
* [Quickstart](/get-started-quickstart): Build and run your first flow in minutes.
* [Build flows](/concepts-flows): Learn about building flows.
* [Troubleshoot Langflow](/troubleshoot): Get help with common Langflow install and startup issues.

View file

@@ -11,22 +11,29 @@ Get started with Langflow by loading a template flow, running it, and then servi
## Prerequisites
- [A running Langflow instance](/get-started-installation)
- [An OpenAI API key](https://platform.openai.com/api-keys)
- [A Langflow API key](/configuration-api-keys)
- [Install and start Langflow](/get-started-installation)
- [Create an OpenAI API key](https://platform.openai.com/api-keys)
- [Create a Langflow API key](/configuration-api-keys)
## Create a Langflow API key
<details>
<summary>Create a Langflow API key</summary>
A [Langflow API key](/configuration-api-keys) is a user-specific token you can use with Langflow.
A Langflow API key is a user-specific token you can use with Langflow.
To create a Langflow API key, do the following:
1. In Langflow, click your user icon, and then select **Settings**.
2. Click **Langflow API Keys**, and then click <Icon name="Plus" aria-hidden="true"/> **Add New**.
3. Name your key, and then click **Create API Key**.
4. Copy the API key and store it in a secure location.
5. Include your `LANGFLOW_API_KEY` in requests like this:
```text
4. Copy the API key and store it securely.
5. To use your Langflow API key in a request, set a `LANGFLOW_API_KEY` environment variable in your terminal, and then include an `x-api-key` header or query parameter with your request.
For example:
```bash
# Set variable
export LANGFLOW_API_KEY="sk..."
# Send request
curl --request POST \
--url 'http://LANGFLOW_SERVER_ADDRESS/api/v1/run/FLOW_ID' \
--header 'Content-Type: application/json' \
@@ -37,11 +44,7 @@ To create a Langflow API key, do the following:
"input_value": "Hello"
}'
```
The API access pane's code snippets include a script that looks for a `LANGFLOW_API_KEY` environment variable set in your terminal session.
Set this variable in your terminal so you can copy and paste the commands.
```bash
export LANGFLOW_API_KEY="sk..."
```
</details>
## Run the Simple Agent template flow
@@ -71,25 +74,22 @@ For this request, the agent selects the URL tool's `fetch_content` action, and t
6. When you are done testing the flow, click <Icon name="X" aria-hidden="true"/>**Close**.
:::tip Next steps
Now that you've run your first flow, try these next steps:
- Edit your **Simple Agent** flow by attaching different tools or adding more components to the flow.
- Build your own flows from scratch or by modifying other template flows.
- Edit your **Simple Agent** flow by attaching different tools or adding more [components](/concepts-components) to the flow.
- [Build your own flows](/concepts-flows) from scratch or by modifying other template flows.
- Integrate flows into your applications, as explained in [Run your flows from external applications](#run-your-flows-from-external-applications).
Optionally, stop here if you just want to create more flows within Langflow.
If you want to learn how Langflow integrates into external applications, read on.
:::
## Run your flows from external applications
Langflow is an IDE, but it's also a runtime you can call through an API with Python, JavaScript, or HTTP.
Langflow is an IDE, but it's also a runtime you can call through the [Langflow API](/api-reference-api-examples) with Python, JavaScript, or HTTP.
When you start Langflow locally, you can send requests to the local Langflow server.
For production applications, you need to deploy a stable Langflow instance to handle API calls.
For more information, see [Langflow deployment overview](/deployment-overview).
For production applications, you need to [deploy a stable Langflow instance](/deployment-overview) to handle API calls.
For example, you can use `POST /run` to run a flow and get the result.
For example, you can use the `/run` endpoint to run a flow and get the result.
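A minimal sketch of such a request against a local server, with `FLOW_ID` as a placeholder for your flow's UUID or custom endpoint name:

```shell
# Run a flow and return the result as JSON. FLOW_ID is a placeholder.
curl --request POST \
  --url "http://127.0.0.1:7860/api/v1/run/FLOW_ID" \
  --header "Content-Type: application/json" \
  --header "x-api-key: $LANGFLOW_API_KEY" \
  --data '{"input_value": "Hello", "output_type": "chat", "input_type": "chat"}'
```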
Langflow provides code snippets to help you get started with the Langflow API.
@@ -553,5 +553,8 @@ payload = {
## Next steps
* [Model Context Protocol (MCP) servers](/mcp-server)
* [Langflow deployment overview](/deployment-overview)
* [Use Langflow as a Model Context Protocol (MCP) server](/mcp-server)
* [Application development with Langflow](/develop-application)
* [Deploy a Langflow server](/deployment-overview)
* [File management](/concepts-file-management)
* [Credential management](/configuration-api-keys)

View file

@@ -4,40 +4,19 @@ slug: /
hide_table_of_contents: true
---
Langflow is a new, visual framework for building multi-agent and RAG applications. It is open-source, Python-powered, fully customizable, and LLM and vector store agnostic.
Langflow empowers developers to rapidly prototype and build AI applications with a user-friendly visual interface and support for important AI functionality like agents and the MCP.
Its intuitive interface allows for easy manipulation of AI building blocks, enabling developers to quickly prototype and turn their ideas into powerful, real-world solutions.
Langflow empowers developers to rapidly prototype and build AI applications with its user-friendly interface and powerful features. Whether you're a seasoned AI developer or just starting out, Langflow provides the tools you need to bring your AI ideas to life.
## Visual flow builder
Langflow is an intuitive visual flow builder. This drag-and-drop interface allows developers to create complex AI workflows without writing extensive code. You can easily connect different components, such as prompts, language models, and data sources, to build sophisticated AI applications.
Whether you're a seasoned AI developer or just starting out, Langflow provides the tools you need to bring your AI ideas to life.
![Langflow in action](/img/playground-response.png)
## Use cases
## Get started with Langflow
Langflow can be used for a wide range of AI applications.
For example:
* [Craft intelligent chatbots](/memory-chatbot)
* [Build document analysis systems](/document-qa)
* [Generate compelling content](/blog-writer)
* [Orchestrate multi-agent applications](/simple-agent)
* [Create agents with Langflow](/agents)
* [Use Langflow as an MCP server](/mcp-server)
* [Use Langflow as an MCP client](/mcp-client)
* [About Langflow](/about-langflow)
* [Install Langflow](/get-started-installation)
* [Quickstart](/get-started-quickstart)
## Community and support
Join Langflow's vibrant community of developers and AI enthusiasts. See the following resources to join discussions, share your projects, and get support:
* [Contribute to Langflow](contributing-how-to-contribute)
* [Langflow Discord Server](https://discord.gg/EqksyE2EX9)
* [@langflow_ai](https://twitter.com/langflow_ai)
## Get started with Langflow
- [Install Langflow](/get-started-installation)
- [Quickstart](/get-started-quickstart)
* [Contribute to Langflow](/contributing-how-to-contribute)
* [Get help and request enhancements](/contributing-github-issues)

View file

@@ -9,9 +9,15 @@ For all changes, see the [Changelog](https://github.com/langflow-ai/langflow/rel
## Prepare to upgrade
To avoid the impact of potential breaking changes and test new versions, the Langflow team recommends the following:
:::important
Whenever possible, the Langflow team recommends installing new Langflow versions in a new virtual environment or VM before upgrading your primary installation.
This allows you to [import flows](/concepts-flows-import#import-a-flow) from your existing installation and test them in the new version without disrupting your existing installation.
In the event of breaking changes or bugs, your existing installation is preserved in a stable state.
:::
1. [Export your projects](/api-projects#export-a-project) to create backups of your flows:
To avoid the impact of potential breaking changes and test new versions, the Langflow team recommends the following upgrade process:
1. Recommended: [Export your projects](/api-projects#export-a-project) to create backups of your flows:
```bash
curl -X GET \
@@ -19,17 +25,24 @@ To avoid the impact of potential breaking changes and test new versions, the Lan
-H "accept: application/json" \
-H "x-api-key: $LANGFLOW_API_KEY"
```
2. Install the new version:
* **Langflow OSS Python package**: Install the new version in a new virtual environment, and then [import your flows](/concepts-flows) to test them in the new version.
* **Langflow Docker image**: Run the new image in a separate container, and then [import your flows](/concepts-flows) to the version of Langflow running in the new container.
* **Langflow Desktop**: Upgrade Langflow Desktop, as explained in [Manage your version of Langflow Desktop](/get-started-installation#manage-your-version-of-langflow-desktop). If you want to isolate the new version, you must install Langflow Desktop on a separate physical or virtual machine, and then [import your flows](/concepts-flows) to the new installation.
To export flows from the Langflow UI, see [Import and export flows](/concepts-flows-import).
4. Test your flows in the new version, [upgrading components](/concepts-components#component-versions) as needed.
2. Install the new version:
* **Langflow OSS Python package**: Install the new version in a new virtual environment. For instructions, see [Install and run the Langflow OSS Python package](/get-started-installation#install-and-run-the-langflow-oss-python-package).
* **Langflow Docker image**: Run the new image in a separate container.
* **Langflow Desktop**: To upgrade in place, open Langflow Desktop, and then click **Upgrade Available** in the header. If you want to isolate the new version, you must install Langflow Desktop on a separate physical or virtual machine, and then [import your flows](/concepts-flows-import) to the new installation.
<!-- :::tip
If you experience data loss after an in-place upgrade of Langflow Desktop, see [Unexpected data loss after Langflow Desktop upgrade](/troubleshoot#data-loss).
:::-->
3. [Import your flows](/concepts-flows-import) to test them in the new version, [upgrading components](/concepts-components#component-versions) as needed.
When upgrading components, you can use the **Create backup flow before updating** option if you didn't previously export your flows.
5. If you installed the new version in isolation, upgrade your primary installation after testing the new version.
4. If you installed the new version in isolation, upgrade your primary installation after testing the new version.
If you made changes to your flows in the isolated installation, you might want to export and import those flows back to your upgraded primary installation so you don't have to repeat the component upgrade process.
@@ -41,8 +54,8 @@ The following updates are included in this version:
- Authentication changes
All API endpoints now require a Langflow API key to function, even when [LANGFLOW_AUTO_LOGIN](/environment-variables#LANGFLOW_AUTO_LOGIN) is enabled. This change enhances security by ensuring that automatic login features are properly authenticated.
The only exceptions are for the MCP endpoints at `/v1/mcp`, `/v1/mcp-projects`, and `/v2/mcp`, which will not require API keys.
To enhance security and ensure proper authentication for automatic login features, most API endpoints now require authentication with a Langflow API key, regardless of the `AUTO_LOGIN` setting.
The only exceptions are the MCP endpoints `/v1/mcp`, `/v1/mcp-projects`, and `/v2/mcp`, which never require authentication.
For more information, see [API keys](/configuration-api-keys).
- New Language Model and Embedding Model components
@@ -64,7 +77,7 @@ The following updates are included in this version:
- Input schema replaces temporary overrides
The **Input schema** pane replaces the need to manage tweak values in the **API access** pane. When you enable a parameter in the **Input schema** pane, the parameter is automatically added to your flows code snippets, providing ready-to-use templates for making requests in your preferred programming language.
The **Input schema** pane replaces the need to manage tweak values in the **API access** pane. When you enable a parameter in the **Input schema** pane, the parameter is automatically added to your flow's code snippets, providing ready-to-use templates for making requests in your preferred programming language.
- Tools category is now legacy

View file

@@ -3,6 +3,9 @@ title: Troubleshoot Langflow
slug: /troubleshoot
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
This page provides troubleshooting advice for issues you might encounter when using Langflow or contributing to Langflow.
## Missing components
@@ -150,6 +153,39 @@ The cache folder location depends on your OS:
- **WSL2 on Windows**: `home/<username>/.cache/langflow/`
- **macOS**: `/Users/<username>/Library/Caches/langflow/`
<!--
### Unexpected data loss after Langflow Desktop upgrade {#data-loss}
If you upgrade Langflow Desktop and find that your projects, flows, and settings have been replaced by a fresh installation, follow these steps to attempt to recover the data from the prior version:
:::important
Any projects, flows, and settings you created after the upgrade will be overwritten when you recover the data from your previous installation.
:::
<Tabs>
<TabItem value="Linux and macOS" label="Linux and macOS" default>
1. Navigate to `~/.langflow/.langflow-venv/lib/python3.12/site-packages/langflow`.
2. Copy `langflow.db`, paste it in `~/.langflow/data`, and then rename it to `database.db`.
This overwrites the existing `database.db` with your previous version's internal Langflow database.
3. Launch Langflow Desktop to verify that your projects, flows, and settings have been restored.
</TabItem>
<TabItem value="Windows" label="Windows">
1. Navigate to `C:\Users\USERNAME\.langflow\.langflow-venv\Lib\site-packages\langflow`.
2. Copy `langflow.db`, paste it in `C:\Users\<name>\AppData\Roaming\com.Langflow\data`, and then rename it to `database.db`.
This overwrites the existing `database.db` with your previous version's internal Langflow database.
3. Launch Langflow Desktop to verify that your projects, flows, and settings have been restored.
</TabItem>
</Tabs>
-->
## Langflow uninstall issues
The following issues can occur when uninstalling Langflow.

View file

@@ -73,8 +73,8 @@ For help with constructing file upload requests in Python, JavaScript, and curl,
1. To construct the request, gather the following information:
* `LANGFLOW_SERVER_ADDRESS`: Your Langflow server's domain. The default value is `127.0.0.1:7860`. You can get this value from the code snippets on your flow's [**API access** pane](/concepts-publish#api-pane).
* `FLOW_ID`: Your flow's UUID or custom endpoint name. You can get this value from the code snippets on your flow's [**API access** pane](/concepts-publish#api-pane).
* `LANGFLOW_SERVER_ADDRESS`: Your Langflow server's domain. The default value is `127.0.0.1:7860`. You can get this value from the code snippets on your flow's [**API access** pane](/concepts-publish#api-access).
* `FLOW_ID`: Your flow's UUID or custom endpoint name. You can get this value from the code snippets on your flow's [**API access** pane](/concepts-publish#api-access).
* `FILE_COMPONENT_ID`: The UUID of the File component in your flow, such as `File-KZP68`. To find the component ID, open your flow in Langflow, click the File component, and then click **Controls**.
* `CHAT_INPUT`: The message you want to send to the Chat Input of your flow, such as `Evaluate this resume for a job opening in my Marketing department.`
* `FILE_NAME` and `FILE_PATH`: The name and path to the local file that you want to send to your flow.
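With those values gathered, the requests can be sketched as follows. This is a hedged outline, not a verbatim recipe: the upload endpoint path and the `path` tweak shape are assumptions based on the v1 API, and the uppercase names are the placeholders listed above:

```shell
# 1. Upload the local file to the flow. The JSON response includes the
#    server-side file path to use in the next request.
curl -X POST \
  "http://LANGFLOW_SERVER_ADDRESS/api/v1/files/upload/FLOW_ID" \
  -H "x-api-key: $LANGFLOW_API_KEY" \
  -F "file=@FILE_PATH"

# 2. Run the flow, pointing the File component at the uploaded file
#    with a tweak keyed by the component ID.
curl -X POST \
  "http://LANGFLOW_SERVER_ADDRESS/api/v1/run/FLOW_ID" \
  -H "Content-Type: application/json" \
  -H "x-api-key: $LANGFLOW_API_KEY" \
  -d '{
    "input_value": "CHAT_INPUT",
    "tweaks": { "FILE_COMPONENT_ID": { "path": "UPLOADED_FILE_PATH" } }
  }'
```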

View file

@@ -130,8 +130,8 @@ This tutorial uses JavaScript for demonstration purposes.
1. To construct the chatbot, gather the following information:
* `LANGFLOW_SERVER_ADDRESS`: Your Langflow server's domain. The default value is `127.0.0.1:7860`. You can get this value from the code snippets on your flow's [**API access** pane](/concepts-publish#api-pane).
* `FLOW_ID`: Your flow's UUID or custom endpoint name. You can get this value from the code snippets on your flow's [**API access** pane](/concepts-publish#api-pane).
* `LANGFLOW_SERVER_ADDRESS`: Your Langflow server's domain. The default value is `127.0.0.1:7860`. You can get this value from the code snippets on your flow's [**API access** pane](/concepts-publish#api-access).
* `FLOW_ID`: Your flow's UUID or custom endpoint name. You can get this value from the code snippets on your flow's [**API access** pane](/concepts-publish#api-access).
* `LANGFLOW_API_KEY`: A valid Langflow API key. To create an API key, see [API keys](/configuration-api-keys).
2. Copy the following script into a JavaScript file, and then replace the placeholders with the information you gathered in the previous step:

View file

@@ -186,7 +186,10 @@ const config = {
},
{
to: "/blog-writer",
from: ["/starter-projects-blog-writer", "/tutorials-blog-writer"],
from: [
"/starter-projects-blog-writer",
"/tutorials-blog-writer",
],
},
{
to: "/memory-chatbot",
@@ -197,7 +200,10 @@ const config = {
},
{
to: "/document-qa",
from: ["/starter-projects-document-qa", "/tutorials-document-qa"],
from: [
"/starter-projects-document-qa",
"/tutorials-document-qa",
],
},
{
to: "/simple-agent",
@@ -245,7 +251,10 @@ const config = {
},
{
to: "/concepts-publish",
from: ["/concepts-api", "/workspace-api"],
from: [
"/concepts-api",
"/workspace-api",
],
},
{
to: "/components-custom-components",
@@ -265,9 +274,7 @@ const config = {
},
{
to: "/deployment-kubernetes-dev",
from: [
"/deployment-kubernetes",
],
from: "/deployment-kubernetes",
},
{
to: "/basic-prompting",
@@ -285,6 +292,10 @@ const config = {
to: "/agents",
from: "/agents-tool-calling-agent-component",
},
{
to: "/concepts-publish",
from: "/embedded-chat-widget",
},
// add more redirects like this
// {
// to: '/docs/anotherpage',

View file

@@ -5,8 +5,21 @@ module.exports = {
type: "category",
label: "Get started",
items: [
"Get-Started/get-started-installation",
"Get-Started/get-started-quickstart",
{
type: "doc",
id: "Get-Started/about-langflow",
label: "About Langflow"
},
{
type: "doc",
id: "Get-Started/get-started-installation",
label: "Install Langflow"
},
{
type: "doc",
id: "Get-Started/get-started-quickstart",
label: "Quickstart"
},
{
type: "category",
label: "Tutorials",
@@ -34,16 +47,48 @@ module.exports = {
},
{
type: "category",
label: "Concepts",
label: "Flows",
items: [
"Concepts/concepts-overview",
"Concepts/concepts-playground",
"Concepts/concepts-flows",
"Concepts/concepts-objects",
"Concepts/concepts-publish",
"Concepts/embedded-chat-widget",
"Concepts/concepts-file-management",
"Concepts/concepts-voice-mode",
{
type: "doc",
id: "Concepts/concepts-overview",
label: "Use the visual editor"
},
{
type: "doc",
id: "Concepts/concepts-flows",
label: "Build flows"
},
{
type: "doc",
id: "Concepts/concepts-publish",
label: "Share and embed flows"
},
{
type: "doc",
id: "Concepts/concepts-flows-import",
label: "Import and export flows"
},
{
type: "doc",
id: "Concepts/concepts-playground",
label: "Use the Playground"
},
{
type: "doc",
id: "Concepts/concepts-voice-mode",
label: "Use voice mode"
},
{
type: "doc",
id: "Concepts/concepts-objects",
label: "Langflow objects"
},
{
type: "doc",
id: "Concepts/concepts-file-management",
label: "Manage files"
},
],
},
{