docs: fix onBrokenAnchor behavior and links (#5520)

* fix: component url errors

* remove-unnecessary-nav-controls

* fix: update link-ids so onBrokenAnchors doesn't throw warnings

* delete unused category files

* delete unused sidebar_position

* space

* docs: format URLs in documentation for consistency

* fix: urls returning 404s

* backtick
Mendon Kissling 2025-01-03 11:30:59 -05:00 committed by GitHub
commit d790761ff0
74 changed files with 153 additions and 232 deletions


@@ -1 +0,0 @@
{"position":5, "label":"Agents"}


@@ -1,6 +1,5 @@
---
title: Create a problem-solving agent
sidebar_position: 2
slug: /agents-tool-calling-agent-component
---


@@ -1,6 +1,5 @@
---
title: Agents overview
sidebar_position: 1
slug: /agents-overview
---


@@ -1,6 +1,5 @@
---
title: Agents
sidebar_position: 12
slug: /components-agents
---
@@ -18,7 +17,7 @@ The agent then uses a connected LLM to reason through the problem to decide whic
## Use an agent in a flow
The [simple agent starter project](/starter-projects-simple-agent) uses an [agent component](#agent-component-agent-component) connected to URL and Calculator tools to answer a user's questions. The OpenAI LLM acts as a brain for the agent to decide which tool to use. Tools are connected to agent components at the **Tools** port.
The [simple agent starter project](/starter-projects-simple-agent) uses an [agent component](#agent-component) connected to URL and Calculator tools to answer a user's questions. The OpenAI LLM acts as a brain for the agent to decide which tool to use. Tools are connected to agent components at the **Tools** port.
![Simple agent starter flow](/img/starter-flow-simple-agent.png)
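The anchor fix in this hunk (`#agent-component-agent-component` to `#agent-component`) works because Docusaurus derives heading anchors by slugifying the heading text, and its `onBrokenAnchors` check validates links against those generated IDs. A rough Python sketch of the slug rule (a simplified approximation of the `github-slugger` behavior Docusaurus uses, not the exact implementation):

```python
import re

def heading_to_anchor(heading: str) -> str:
    """Approximate how Docusaurus turns a heading into an anchor ID:
    lowercase, strip punctuation, collapse whitespace into hyphens."""
    slug = heading.strip().lower()
    slug = re.sub(r"[^\w\s-]", "", slug)  # drop punctuation
    slug = re.sub(r"\s+", "-", slug)      # spaces -> hyphens
    return slug

# The corrected link targets the anchor generated from the
# "Agent component" heading:
print(heading_to_anchor("Agent component"))  # agent-component
```

Linking to an anchor that no such heading produces is exactly what triggers the `onBrokenAnchors` warnings this PR silences.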


@@ -488,4 +488,4 @@ Advanced methods and attributes offer additional control and functionality. Unde
## Contribute Custom Components to Langflow
See [How to Contribute](/contributing-how-to-contribute#submitting-components) to contribute your custom component to Langflow.
See [How to Contribute](/contributing-components) to contribute your custom component to Langflow.


@@ -1,6 +1,5 @@
---
title: Data
sidebar_position: 3
slug: /components-data
---


@@ -1,6 +1,5 @@
---
title: Embeddings
sidebar_position: 6
slug: /components-embedding-models
---
@@ -60,7 +59,7 @@ This component is used to load embedding models from [Amazon Bedrock](https://aw
## Astra DB vectorize
Connect this component to the **Embeddings** port of the [Astra DB vector store component](components-vector-stores#astra-db-serverless) to generate embeddings.
Connect this component to the **Embeddings** port of the [Astra DB vector store component](/components-vector-stores#astra-db-vector-store) to generate embeddings.
This component requires that your Astra DB database has a collection that uses a vectorize embedding provider integration.
For more information and instructions, see [Embedding Generation](https://docs.datastax.com/en/astra-db-serverless/databases/embedding-generation.html).
@@ -221,7 +220,7 @@ This component generates embeddings using MistralAI models.
| max_concurrent_requests | Integer | Maximum number of concurrent API requests (default: 64) |
| max_retries | Integer | Maximum number of retry attempts for failed requests (default: 5) |
| timeout | Integer | Request timeout in seconds (default: 120) |
| endpoint | String | Custom API endpoint URL (default: "https://api.mistral.ai/v1/") |
| endpoint | String | Custom API endpoint URL (default: `https://api.mistral.ai/v1/`) |
#### Outputs
@@ -239,10 +238,10 @@ This component generates embeddings using NVIDIA models.
| Name | Type | Description |
|------|------|-------------|
| model | String | The NVIDIA model to use for embeddings (e.g., nvidia/nv-embed-v1) |
| base_url | String | Base URL for the NVIDIA API (default: https://integrate.api.nvidia.com/v1) |
| model | String | The NVIDIA model to use for embeddings (e.g., `nvidia/nv-embed-v1`) |
| base_url | String | Base URL for the NVIDIA API (default: `https://integrate.api.nvidia.com/v1`) |
| nvidia_api_key | SecretString | API key for authenticating with NVIDIA's service |
| temperature | Float | Model temperature for embedding generation (default: 0.1) |
| temperature | Float | Model temperature for embedding generation (default: `0.1`) |
#### Outputs

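The base-URL defaults formatted as code in this file differ in whether they end with a trailing slash (`https://api.mistral.ai/v1/` does, `https://integrate.api.nvidia.com/v1` does not). That distinction matters when a path is joined onto the base URL; a stdlib-only illustration (not Langflow code — the clients handle this internally):

```python
from urllib.parse import urljoin

# urljoin discards the last path segment of a base URL that
# lacks a trailing slash:
print(urljoin("https://api.mistral.ai/v1/", "embeddings"))
# https://api.mistral.ai/v1/embeddings
print(urljoin("https://integrate.api.nvidia.com/v1", "embeddings"))
# https://integrate.api.nvidia.com/embeddings  <- /v1 is lost

def join_base(base: str, path: str) -> str:
    """Hypothetical helper: join a path to a base URL regardless
    of whether the base ends with a slash."""
    return base.rstrip("/") + "/" + path.lstrip("/")

print(join_base("https://integrate.api.nvidia.com/v1", "embeddings"))
# https://integrate.api.nvidia.com/v1/embeddings
```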

@@ -1,6 +1,5 @@
---
title: Helpers
sidebar_position: 4
slug: /components-helpers
---


@@ -1,6 +1,5 @@
---
title: Inputs and outputs
sidebar_position: 1
slug: /components-io
---


@@ -1,6 +1,5 @@
---
title: Loaders
sidebar_position: 10
slug: /components-loaders
---
@@ -27,9 +26,9 @@ The Confluence component integrates with the Confluence wiki collaboration platf
| Name | Display Name | Info |
| --- | --- | --- |
| url | Site URL | The base URL of the Confluence Space (e.g., https://company.atlassian.net/wiki) |
| username | Username | Atlassian User E-mail (e.g., email@example.com) |
| api_key | API Key | Atlassian API Key (Create at: https://id.atlassian.com/manage-profile/security/api-tokens) |
| url | Site URL | The base URL of the Confluence Space (e.g., `https://company.atlassian.net/wiki`) |
| username | Username | Atlassian User E-mail (e.g., `email@example.com`) |
| api_key | API Key | Atlassian API Key (Create an API key at: [Atlassian](https://id.atlassian.com/manage-profile/security/api-tokens)) |
| space_key | Space Key | The key of the Confluence space to access |
| cloud | Use Cloud? | Whether to use Confluence Cloud (default: true) |
| content_format | Content Format | Specify content format (default: STORAGE) |


@@ -1,6 +1,5 @@
---
title: Logic
sidebar_position: 13
slug: /components-logic
---


@@ -1,6 +1,5 @@
---
title: Models
sidebar_position: 5
slug: /components-models
---
@@ -32,10 +31,10 @@ For more information, see [AIML documentation](https://docs.aimlapi.com/).
|--------------|-------------|---------------------------------------------------------------------------------------------|
| max_tokens | Integer | The maximum number of tokens to generate. Set to 0 for unlimited tokens. Range: 0-128000. |
| model_kwargs | Dictionary | Additional keyword arguments for the model. |
| model_name | String | The name of the AIML model to use. Options are predefined in AIML_CHAT_MODELS. |
| aiml_api_base| String | The base URL of the AIML API. Defaults to https://api.aimlapi.com. |
| model_name | String | The name of the AIML model to use. Options are predefined in `AIML_CHAT_MODELS`. |
| aiml_api_base| String | The base URL of the AIML API. Defaults to `https://api.aimlapi.com`. |
| api_key | SecretString| The AIML API Key to use for the model. |
| temperature | Float | Controls randomness in the output. Default: 0.1. |
| temperature | Float | Controls randomness in the output. Default: `0.1`. |
| seed | Integer | Controls reproducibility of the job. |
### Outputs
@@ -58,7 +57,7 @@ For more information, see [Amazon Bedrock documentation](https://docs.aws.amazon
| aws_access_key | SecretString | AWS Access Key for authentication. |
| aws_secret_key | SecretString | AWS Secret Key for authentication. |
| credentials_profile_name | String | Name of the AWS credentials profile to use (advanced). |
| region_name | String | AWS region name. Default: "us-east-1". |
| region_name | String | AWS region name. Default: `us-east-1`. |
| model_kwargs | Dictionary | Additional keyword arguments for the model (advanced). |
| endpoint_url | String | Custom endpoint URL for the Bedrock service (advanced). |
@@ -78,14 +77,14 @@ For more information, see the [Anthropic documentation](https://docs.anthropic.c
| Name | Type | Description |
|---------------------|-------------|----------------------------------------------------------------------------------------|
| max_tokens | Integer | The maximum number of tokens to generate. Set to 0 for unlimited tokens. Default: 4096.|
| max_tokens | Integer | The maximum number of tokens to generate. Set to 0 for unlimited tokens. Default: `4096`.|
| model | String | The name of the Anthropic model to use. Options include various Claude 3 models. |
| anthropic_api_key | SecretString| Your Anthropic API key for authentication. |
| temperature | Float | Controls randomness in the output. Default: 0.1. |
| anthropic_api_url | String | Endpoint of the Anthropic API. Defaults to 'https://api.anthropic.com' if not specified (advanced). |
| temperature | Float | Controls randomness in the output. Default: `0.1`. |
| anthropic_api_url | String | Endpoint of the Anthropic API. Defaults to `https://api.anthropic.com` if not specified (advanced). |
| prefill | String | Prefill text to guide the model's response (advanced). |
#### Outputs
### Outputs
| Name | Type | Description |
|-------|---------------|------------------------------------------------------------------|
@@ -97,9 +96,7 @@ This component generates text using Azure OpenAI LLM.
For more information, see the [Azure OpenAI documentation](https://learn.microsoft.com/en-us/azure/ai-services/openai/).
### Parameters
#### Inputs
### Inputs
| Name | Display Name | Info |
|---------------------|---------------------|---------------------------------------------------------------------------------|
@@ -119,9 +116,7 @@ This component generates text using Cohere's language models.
For more information, see the [Cohere documentation](https://cohere.ai/).
### Parameters
#### Inputs
### Inputs
| Name | Display Name | Info |
|---------------------|--------------------|----------------------------------------------------------|
@@ -134,11 +129,9 @@ For more information, see the [Cohere documentation](https://cohere.ai/).
This component generates text using Google's Generative AI models.
For more information, see the [Google Generative AI documentation](https://cloud.google.com/ai-platform/training/docs/algorithms/gpt-3).
For more information, see the [Google Generative AI documentation](https://cloud.google.com/vertex-ai/docs/).
### Parameters
#### Inputs
### Inputs
| Name | Display Name | Info |
|---------------------|--------------------|-----------------------------------------------------------------------|
@@ -156,20 +149,18 @@ This component generates text using Groq's language models.
For more information, see the [Groq documentation](https://groq.com/).
### Parameters
#### Inputs
### Inputs
| Name | Type | Description |
|----------------|---------------|-----------------------------------------------------------------|
| groq_api_key | SecretString | API key for the Groq API. |
| groq_api_base | String | Base URL path for API requests. Default: "https://api.groq.com" (advanced). |
| groq_api_base | String | Base URL path for API requests. Default: `https://api.groq.com` (advanced). |
| max_tokens | Integer | The maximum number of tokens to generate (advanced). |
| temperature | Float | Controls randomness in the output. Range: [0.0, 1.0]. Default: 0.1. |
| temperature | Float | Controls randomness in the output. Range: `[0.0, 1.0]`. Default: `0.1`. |
| n | Integer | Number of chat completions to generate for each prompt (advanced). |
| model_name | String | The name of the Groq model to use. Options are dynamically fetched from the Groq API. |
#### Outputs
### Outputs
| Name | Type | Description |
|-------|---------------|------------------------------------------------------------------|
@@ -181,9 +172,7 @@ This component generates text using Hugging Face's language models.
For more information, see the [Hugging Face documentation](https://huggingface.co/).
### Parameters
#### Inputs
### Inputs
| Name | Display Name | Info |
|---------------------|-------------------|-------------------------------------------|
@@ -199,19 +188,17 @@ This component generates text using Maritalk LLMs.
For more information, see [Maritalk documentation](https://www.maritalk.com/).
### Parameters
#### Inputs
### Inputs
| Name | Type | Description |
|----------------|---------------|-----------------------------------------------------------------|
| max_tokens | Integer | The maximum number of tokens to generate. Set to 0 for unlimited tokens. Default: 512. |
| model_name | String | The name of the Maritalk model to use. Options: "sabia-2-small", "sabia-2-medium". Default: "sabia-2-small". |
| max_tokens | Integer | The maximum number of tokens to generate. Set to `0` for unlimited tokens. Default: `512`. |
| model_name | String | The name of the Maritalk model to use. Options: `sabia-2-small`, `sabia-2-medium`. Default: `sabia-2-small`. |
| api_key | SecretString | The Maritalk API Key to use for authentication. |
| temperature | Float | Controls randomness in the output. Range: [0.0, 1.0]. Default: 0.5. |
| endpoint_url | String | The Maritalk API endpoint. Default: https://api.maritalk.com. |
| temperature | Float | Controls randomness in the output. Range: `[0.0, 1.0]`. Default: `0.5`. |
| endpoint_url | String | The Maritalk API endpoint. Default: `https://api.maritalk.com`. |
#### Outputs
### Outputs
| Name | Type | Description |
|-------|---------------|------------------------------------------------------------------|
@@ -223,14 +210,13 @@ This component generates text using MistralAI LLMs.
For more information, see [Mistral AI documentation](https://docs.mistral.ai/).
### Parameters
### Inputs
#### Inputs
| Name | Type | Description |
|---------------------|--------------|-----------------------------------------------------------------------------------------------|
| max_tokens | Integer | The maximum number of tokens to generate. Set to 0 for unlimited tokens (advanced). |
| model_name | String | The name of the Mistral AI model to use. Options include "open-mixtral-8x7b", "open-mixtral-8x22b", "mistral-small-latest", "mistral-medium-latest", "mistral-large-latest", and "codestral-latest". Default: "codestral-latest". |
| mistral_api_base | String | The base URL of the Mistral API. Defaults to https://api.mistral.ai/v1 (advanced). |
| model_name | String | The name of the Mistral AI model to use. Options include `open-mixtral-8x7b`, `open-mixtral-8x22b`, `mistral-small-latest`, `mistral-medium-latest`, `mistral-large-latest`, and `codestral-latest`. Default: `codestral-latest`. |
| mistral_api_base | String | The base URL of the Mistral API. Defaults to `https://api.mistral.ai/v1` (advanced). |
| api_key | SecretString | The Mistral API Key to use for authentication. |
| temperature | Float | Controls randomness in the output. Default: 0.5. |
| max_retries | Integer | Maximum number of retries for API calls. Default: 5 (advanced). |
@@ -240,7 +226,8 @@ For more information, see [Mistral AI documentation](https://docs.mistral.ai/).
| random_seed | Integer | Seed for random number generation. Default: 1 (advanced). |
| safe_mode | Boolean | Enables safe mode for content generation (advanced). |
#### Outputs
### Outputs
| Name | Type | Description |
|--------|---------------|-----------------------------------------------------|
| model | LanguageModel | An instance of ChatMistralAI configured with the specified parameters. |
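Several of the parameter tables above expose a `max_retries` option (the Mistral components default to 5). A generic retry-with-backoff loop of the kind those options control can be sketched as follows (illustrative stdlib code only; the actual clients implement their own retry logic):

```python
import time

def call_with_retries(fn, max_retries=5, base_delay=0.1):
    """Retry fn up to max_retries times with exponential backoff,
    re-raising the last error once retries are exhausted."""
    for attempt in range(max_retries + 1):
        try:
            return fn()
        except Exception:
            if attempt == max_retries:
                raise
            time.sleep(base_delay * (2 ** attempt))  # exponential backoff

# Fake flaky call for demonstration: fails twice, then succeeds.
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

print(call_with_retries(flaky))  # ok
```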
@@ -249,21 +236,21 @@ For more information, see [Mistral AI documentation](https://docs.mistral.ai/).
This component generates text using NVIDIA LLMs.
For more information, see [NVIDIA AI Foundation Models documentation](https://developer.nvidia.com/ai-foundation-models).
For more information, see [NVIDIA AI documentation](https://developer.nvidia.com/generative-ai).
### Parameters
### Inputs
#### Inputs
| Name | Type | Description |
|---------------------|--------------|-----------------------------------------------------------------------------------------------|
| max_tokens | Integer | The maximum number of tokens to generate. Set to 0 for unlimited tokens (advanced). |
| model_name | String | The name of the NVIDIA model to use. Default: "mistralai/mixtral-8x7b-instruct-v0.1". |
| base_url | String | The base URL of the NVIDIA API. Default: "https://integrate.api.nvidia.com/v1". |
| max_tokens | Integer | The maximum number of tokens to generate. Set to `0` for unlimited tokens (advanced). |
| model_name | String | The name of the NVIDIA model to use. Default: `mistralai/mixtral-8x7b-instruct-v0.1`. |
| base_url | String | The base URL of the NVIDIA API. Default: `https://integrate.api.nvidia.com/v1`. |
| nvidia_api_key | SecretString | The NVIDIA API Key for authentication. |
| temperature | Float | Controls randomness in the output. Default: 0.1. |
| seed | Integer | The seed controls the reproducibility of the job (advanced). Default: 1. |
| temperature | Float | Controls randomness in the output. Default: `0.1`. |
| seed | Integer | The seed controls the reproducibility of the job (advanced). Default: `1`. |
### Outputs
#### Outputs
| Name | Type | Description |
|--------|---------------|-----------------------------------------------------|
| model | LanguageModel | An instance of ChatNVIDIA configured with the specified parameters. |
@@ -274,9 +261,8 @@ This component generates text using Ollama's language models.
For more information, see [Ollama documentation](https://ollama.com/).
### Parameters
### Inputs
#### Inputs
| Name | Display Name | Info |
|---------------------|---------------|---------------------------------------------|
| Base URL | Base URL | Endpoint of the Ollama API. |
@@ -289,9 +275,7 @@ This component generates text using OpenAI's language models.
For more information, see [OpenAI documentation](https://beta.openai.com/docs/).
### Parameters
#### Inputs
### Inputs
| Name | Type | Description |
|---------------------|---------------|------------------------------------------------------------------|
@@ -303,7 +287,7 @@ For more information, see [OpenAI documentation](https://beta.openai.com/docs/).
| frequency_penalty | Float | Controls the frequency penalty. Range: [0.0, 2.0]. Default: 0.0. |
| presence_penalty | Float | Controls the presence penalty. Range: [0.0, 2.0]. Default: 0.0. |
#### Outputs
### Outputs
| Name | Type | Description |
|-------|---------------|------------------------------------------------------------------|
@@ -321,9 +305,8 @@ This component generates text using Perplexity's language models.
For more information, see [Perplexity documentation](https://perplexity.ai/).
### Parameters
### Inputs
#### Inputs
| Name | Type | Description |
|---------------------|--------------|-----------------------------------------------------------------------------------------------|
| model_name | String | The name of the Perplexity model to use. Options include various Llama 3.1 models. |
@@ -334,7 +317,8 @@ For more information, see [Perplexity documentation](https://perplexity.ai/).
| n | Integer | Number of chat completions to generate for each prompt (advanced). |
| top_k | Integer | Number of top tokens to consider for top-k sampling. Must be positive (advanced). |
#### Outputs
### Outputs
| Name | Type | Description |
|--------|---------------|-----------------------------------------------------|
| model | LanguageModel | An instance of ChatPerplexity configured with the specified parameters. |
@@ -345,18 +329,17 @@ This component generates text using SambaNova LLMs.
For more information, see [Sambanova Cloud documentation](https://cloud.sambanova.ai/).
### Parameters
#### Inputs
### Inputs
| Name | Type | Description |
|---------------------|---------------|------------------------------------------------------------------|
| sambanova_url | String | Base URL path for API requests. Default: "https://api.sambanova.ai/v1/chat/completions". |
| sambanova_url | String | Base URL path for API requests. Default: `https://api.sambanova.ai/v1/chat/completions`. |
| sambanova_api_key | SecretString | Your SambaNova API Key. |
| model_name | String | The name of the Sambanova model to use. Options include various Llama models. |
| max_tokens | Integer | The maximum number of tokens to generate. Set to 0 for unlimited tokens. |
| temperature | Float | Controls randomness in the output. Range: [0.0, 1.0]. Default: 0.07. |
#### Outputs
### Outputs
| Name | Type | Description |
|-------|---------------|------------------------------------------------------------------|
@@ -368,9 +351,8 @@ This component generates text using Vertex AI LLMs.
For more information, see [Google Vertex AI documentation](https://cloud.google.com/vertex-ai).
### Parameters
### Inputs
#### Inputs
| Name | Type | Description |
|---------------------|--------------|-----------------------------------------------------------------------------------------------|
| credentials | File | JSON credentials file. Leave empty to fallback to environment variables. File type: JSON. |
@@ -384,7 +366,8 @@ For more information, see [Google Vertex AI documentation](https://cloud.google.
| top_p | Float | The cumulative probability cutoff for nucleus sampling: only the highest-probability vocabulary tokens whose probabilities sum to this value are kept. Default: 0.95 (advanced). |
| verbose | Boolean | Whether to print verbose output. Default: False (advanced). |
#### Outputs
### Outputs
| Name | Type | Description |
|--------|---------------|-----------------------------------------------------|
| model | LanguageModel | An instance of ChatVertexAI configured with the specified parameters. |


@@ -1,6 +1,5 @@
---
title: Components overview
sidebar_position: 0
slug: /components-overview
---


@@ -82,7 +82,7 @@ The component iterates through the input list of data objects, merging them into
| merged_data | Merged Data | A single data object containing the combined information from all input data objects |
## Parse Data component
## Parse Data
The ParseData component converts data objects into plain text using a specified template.
This component transforms structured data into human-readable text formats, allowing for customizable output through the use of templates.
@@ -102,7 +102,7 @@ This component transforms structured data into human-readable text formats, allo
| text | Text | The resulting formatted text string as a message object. |
## Split Text component
## Split Text
This component splits text into chunks of a specified length.


@@ -1,6 +1,5 @@
---
title: Prompts
sidebar_position: 2
slug: /components-prompts
---


@@ -17,7 +17,7 @@ The agent then uses a connected LLM to reason through the problem to decide whic
Tools are typically connected to agent components at the **Tools** port.
The [simple agent starter project](/starter-projects-simple-agent) uses URL and Calculator tools connected to an [agent component](#agent-component-agent-component) to answer a user's questions. The OpenAI LLM acts as a brain for the agent to decide which tool to use.
The [simple agent starter project](/starter-projects-simple-agent) uses URL and Calculator tools connected to an [agent component](/components-agents#agent-component) to answer a user's questions. The OpenAI LLM acts as a brain for the agent to decide which tool to use.
![Simple agent starter flow](/img/starter-flow-simple-agent.png)


@@ -1 +0,0 @@
{"position":8, "label":"Configuration"}


@@ -1,6 +1,5 @@
---
title: API keys
sidebar_position: 1
slug: /configuration-api-keys
---


@@ -1,6 +1,5 @@
---
title: Authentication
sidebar_position: 0
slug: /configuration-authentication
---


@@ -1,6 +1,5 @@
---
title: Auto-saving
sidebar_position: 6
slug: /configuration-auto-save
---


@@ -1,6 +1,5 @@
---
title: Run Langflow in backend-only mode
sidebar_position: 4
slug: /configuration-backend-only
---


@@ -1,9 +1,10 @@
---
title: Langflow CLI
sidebar_position: 2
slug: /configuration-cli
---
import Link from '@docusaurus/Link';
# Langflow CLI
The Langflow command line interface (Langflow CLI) is the main interface for managing and running the Langflow server.
@@ -25,10 +26,10 @@ python -m langflow [OPTIONS]
#### Options
| Option | Default | Values | Description |
|--------|------|-----------|-------------|
| <a id="install-completion"></a>`--install-completion` | *Not applicable* | *Not applicable* | Install auto-completion for the current shell. |
| <a id="show-completion"></a>`--show-completion` | *Not applicable* | *Not applicable* | Show the location of the auto-completion config file (if installed). |
| <a id="help"></a>`--help` | *Not applicable* | *Not applicable* | Display information about the command usage and its options and arguments. |
|--------|---------|--------|-------------|
| <Link id="install-completion"/>`--install-completion` | *Not applicable* | *Not applicable* | Install auto-completion for the current shell. |
| <Link id="show-completion"/>`--show-completion` | *Not applicable* | *Not applicable* | Show the location of the auto-completion config file (if installed). |
| <Link id="help"/>`--help` | *Not applicable* | *Not applicable* | Display information about the command usage and its options and arguments. |
### langflow api-key
@@ -44,8 +45,9 @@ python -m langflow api-key [OPTIONS]
| Option | Default | Values | Description |
|--------|---------|--------|-------------|
| <a id="api-key-log-level"></a>`--log-level` | `critical` | `debug`<br/>`info`<br/>`warning`<br/>`error`<br/>`critical` | Set the logging level. |
| <a id="api-key-help"></a>`--help` | *Not applicable* | *Not applicable* | Display information about the command usage and its options and arguments. |
| <Link id="install-completion"/>`--install-completion` | *Not applicable* | *Not applicable* | Install auto-completion for the current shell. |
| <Link id="show-completion"/>`--show-completion` | *Not applicable* | *Not applicable* | Show the location of the auto-completion config file (if installed). |
| <Link id="help"/>`--help` | *Not applicable* | *Not applicable* | Display information about the command usage and its options and arguments. |
### langflow copy-db
@@ -67,7 +69,7 @@ python -m langflow copy-db
| Option | Default | Values | Description |
|--------|---------|--------|-------------|
| <a id="copy-db-help"></a>`--help` | *Not applicable* | *Not applicable* | Display information about the command usage and its options and arguments. |
| <Link id="copy-db-help"/>`--help` | *Not applicable* | *Not applicable* | Display information about the command usage and its options and arguments. |
### langflow migration
@@ -83,10 +85,9 @@ python -m langflow migration [OPTIONS]
| Option | Default | Values | Description |
|--------|---------|--------|-------------|
| <a id="migration-test"></a>`--test` | `true` | [Boolean](#boolean) | Run migrations in test mode. Use `--no-test` to disable test mode. |
| <a id="migration-fix"></a>`--fix` | `false` (`--no-fix`) | [Boolean](#boolean) | Fix migrations. This is a destructive operation, and all affected data will be deleted. Only use this option if you know what you are doing. |
| <a id="migration-help"></a>`--help` | *Not applicable* | *Not applicable* | Display information about the command usage and its options and arguments. |
| <Link id="migration-test"/>`--test` | `true` | [Boolean](#boolean) | Run migrations in test mode. Use `--no-test` to disable test mode. |
| <Link id="migration-fix"/>`--fix` | `false` (`--no-fix`) | [Boolean](#boolean) | Fix migrations. This is a destructive operation, and all affected data will be deleted. Only use this option if you know what you are doing. |
| <Link id="migration-help"/>`--help` | *Not applicable* | *Not applicable* | Display information about the command usage and its options and arguments. |
### langflow run
@@ -102,26 +103,26 @@ python -m langflow run [OPTIONS]
| Option | Default | Values | Description |
|--------|---------|--------|-------------|
| <a id="run-host"></a>`--host` | `127.0.0.1` | String | The host on which the Langflow server will run.<br/>See [`LANGFLOW_HOST` variable](./environment-variables.md#LANGFLOW_HOST). |
| <a id="run-workers"></a>`--workers` | `1` | Integer | Number of worker processes.<br/>See [`LANGFLOW_WORKERS` variable](./environment-variables.md#LANGFLOW_WORKERS). |
| <a id="run-worker-timeout"></a>`--worker-timeout` | `300` | Integer | Worker timeout in seconds.<br/>See [`LANGFLOW_WORKER_TIMEOUT` variable](./environment-variables.md#LANGFLOW_WORKER_TIMEOUT). |
| <a id="run-port"></a>`--port` | `7860` | Integer | The port on which the Langflow server will run. The server automatically selects a free port if the specified port is in use.<br/>See [`LANGFLOW_PORT` variable](./environment-variables.md#LANGFLOW_PORT). |
| <a id="run-components-path"></a>`--components-path` | `langflow/components` | String | Path to the directory containing custom components.<br/>See [`LANGFLOW_COMPONENTS_PATH` variable](./environment-variables.md#LANGFLOW_COMPONENTS_PATH). |
| <a id="run-env-file"></a>`--env-file` | Not set | String | Path to the `.env` file containing environment variables.<br/>See [Import environment variables from a .env file](./environment-variables.md#configure-variables-env-file). |
| <a id="run-log-level"></a>`--log-level` | `critical` | `debug`<br/>`info`<br/>`warning`<br/>`error`<br/>`critical` | Set the logging level.<br/>See [`LANGFLOW_LOG_LEVEL` variable](./environment-variables.md#LANGFLOW_LOG_LEVEL). |
| <a id="run-log-file"></a>`--log-file` | `logs/langflow.log` | String | Set the path to the log file for Langflow.<br/>See [`LANGFLOW_LOG_FILE` variable](./environment-variables.md#LANGFLOW_LOG_FILE). |
| <a id="run-cache"></a>`--cache` | `InMemoryCache` | `InMemoryCache`<br/>`SQLiteCache` | Type of cache to use.<br/>See [`LANGFLOW_LANGCHAIN_CACHE` variable](./environment-variables.md#LANGFLOW_LANGCHAIN_CACHE). |
| <a id="run-dev"></a>`--dev` | `false` (`--no-dev`) | [Boolean](#boolean) | Run Langflow in development mode (may contain bugs).<br/>See [`LANGFLOW_DEV` variable](./environment-variables.md#LANGFLOW_DEV). |
| <a id="run-frontend-path"></a>`--frontend-path` | `./frontend` | String | Path to the frontend directory containing build files. This is for development purposes only.<br/>See [`LANGFLOW_FRONTEND_PATH` variable](./environment-variables.md#LANGFLOW_FRONTEND_PATH). |
| <a id="run-open-browser"></a>`--open-browser` | `true` | [Boolean](#boolean) | Open the system web browser on startup. Use `--no-open-browser` to disable opening the system web browser on startup.<br/> See [`LANGFLOW_OPEN_BROWSER` variable](./environment-variables.md#LANGFLOW_OPEN_BROWSER). |
| <a id="run-remove-api-keys"></a>`--remove-api-keys` | `false` (`--no-remove-api-keys`) | [Boolean](#boolean) | Remove API keys from the projects saved in the database.<br/> See [`LANGFLOW_REMOVE_API_KEYS` variable](./environment-variables.md#LANGFLOW_REMOVE_API_KEYS). |
| <a id="run-backend-only"></a>`--backend-only` | `false` (`--no-backend-only`) | [Boolean](#boolean) | Only run Langflow's backend server (no frontend).<br/>See [`LANGFLOW_BACKEND_ONLY` variable](./environment-variables.md#LANGFLOW_BACKEND_ONLY). |
| <a id="run-store"></a>`--store` | `true` | [Boolean](#boolean) | Enable the Langflow Store features. Use `--no-store` to disable the Langflow Store features.<br/>See [`LANGFLOW_STORE` variable](./environment-variables.md#LANGFLOW_STORE). |
| <a id="run-auto-saving"></a>`--auto-saving` | `true` | [Boolean](#boolean) | Enable flow auto-saving. Use `--no-auto-saving` to disable flow auto-saving.<br/>See [`LANGFLOW_AUTO_SAVING` variable](./environment-variables.md#LANGFLOW_AUTO_SAVING). |
| <a id="run-auto-saving-interval"></a>`--auto-saving-interval` | `1000` | Integer | Set the interval for flow auto-saving in milliseconds.<br/>See [`LANGFLOW_AUTO_SAVING_INTERVAL` variable](./environment-variables.md#LANGFLOW_AUTO_SAVING_INTERVAL). |
| <a id="run-health-check-max-retries"></a>`--health-check-max-retries` | `5` | Integer | Set the maximum number of retries for the health check. Use `--no-health-check-max-retries` to disable the maximum number of retries for the health check.<br/>See [`LANGFLOW_HEALTH_CHECK_MAX_RETRIES` variable](./environment-variables.md#LANGFLOW_HEALTH_CHECK_MAX_RETRIES). |
| <a id="run-max-file-size-upload"></a>`--max-file-size-upload` | `100` | Integer | Set the maximum file size for the upload in megabytes.<br/>See [`LANGFLOW_MAX_FILE_SIZE_UPLOAD` variable](./environment-variables.md#LANGFLOW_MAX_FILE_SIZE_UPLOAD). |
| <a id="run-help"></a>`--help` | *Not applicable* | *Not applicable* | Display information about the command usage and its options and arguments. |
| <Link id="run-host"/>`--host` | `127.0.0.1` | String | The host on which the Langflow server will run.<br/>See [`LANGFLOW_HOST` variable](./environment-variables.md#LANGFLOW_HOST). |
| <Link id="run-workers"/>`--workers` | `1` | Integer | Number of worker processes.<br/>See [`LANGFLOW_WORKERS` variable](./environment-variables.md#LANGFLOW_WORKERS). |
| <Link id="run-worker-timeout"/>`--worker-timeout` | `300` | Integer | Worker timeout in seconds.<br/>See [`LANGFLOW_WORKER_TIMEOUT` variable](./environment-variables.md#LANGFLOW_WORKER_TIMEOUT). |
| <Link id="run-port"/>`--port` | `7860` | Integer | The port on which the Langflow server will run. The server automatically selects a free port if the specified port is in use.<br/>See [`LANGFLOW_PORT` variable](./environment-variables.md#LANGFLOW_PORT). |
| <Link id="run-components-path"/>`--components-path` | `langflow/components` | String | Path to the directory containing custom components.<br/>See [`LANGFLOW_COMPONENTS_PATH` variable](./environment-variables.md#LANGFLOW_COMPONENTS_PATH). |
| <Link id="run-env-file"/>`--env-file` | Not set | String | Path to the `.env` file containing environment variables.<br/>See [Import environment variables from a .env file](./environment-variables.md#configure-variables-env-file). |
| <Link id="run-log-level"/>`--log-level` | `critical` | `debug`<br/>`info`<br/>`warning`<br/>`error`<br/>`critical` | Set the logging level.<br/>See [`LANGFLOW_LOG_LEVEL` variable](./environment-variables.md#LANGFLOW_LOG_LEVEL). |
| <Link id="run-log-file"/>`--log-file` | `logs/langflow.log` | String | Set the path to the log file for Langflow.<br/>See [`LANGFLOW_LOG_FILE` variable](./environment-variables.md#LANGFLOW_LOG_FILE). |
| <Link id="run-cache"/>`--cache` | `InMemoryCache` | `InMemoryCache`<br/>`SQLiteCache` | Type of cache to use.<br/>See [`LANGFLOW_LANGCHAIN_CACHE` variable](./environment-variables.md#LANGFLOW_LANGCHAIN_CACHE). |
| <Link id="run-dev"/>`--dev` | `false` (`--no-dev`) | [Boolean](#boolean) | Run Langflow in development mode (may contain bugs).<br/>See [`LANGFLOW_DEV` variable](./environment-variables.md#LANGFLOW_DEV). |
| <Link id="run-frontend-path"/>`--frontend-path` | `./frontend` | String | Path to the frontend directory containing build files. This is for development purposes only.<br/>See [`LANGFLOW_FRONTEND_PATH` variable](./environment-variables.md#LANGFLOW_FRONTEND_PATH). |
| <Link id="run-open-browser"/>`--open-browser` | `true` | [Boolean](#boolean) | Open the system web browser on startup. Use `--no-open-browser` to disable opening the system web browser on startup.<br/> See [`LANGFLOW_OPEN_BROWSER` variable](./environment-variables.md#LANGFLOW_OPEN_BROWSER). |
| <Link id="run-remove-api-keys"/>`--remove-api-keys` | `false` (`--no-remove-api-keys`) | [Boolean](#boolean) | Remove API keys from the projects saved in the database.<br/> See [`LANGFLOW_REMOVE_API_KEYS` variable](./environment-variables.md#LANGFLOW_REMOVE_API_KEYS). |
| <Link id="run-backend-only"/>`--backend-only` | `false` (`--no-backend-only`) | [Boolean](#boolean) | Only run Langflow's backend server (no frontend).<br/>See [`LANGFLOW_BACKEND_ONLY` variable](./environment-variables.md#LANGFLOW_BACKEND_ONLY). |
| <Link id="run-store"/>`--store` | `true` | [Boolean](#boolean) | Enable the Langflow Store features. Use `--no-store` to disable the Langflow Store features.<br/>See [`LANGFLOW_STORE` variable](./environment-variables.md#LANGFLOW_STORE). |
| <Link id="run-auto-saving"/>`--auto-saving` | `true` | [Boolean](#boolean) | Enable flow auto-saving. Use `--no-auto-saving` to disable flow auto-saving.<br/>See [`LANGFLOW_AUTO_SAVING` variable](./environment-variables.md#LANGFLOW_AUTO_SAVING). |
| <Link id="run-auto-saving-interval"/>`--auto-saving-interval` | `1000` | Integer | Set the interval for flow auto-saving in milliseconds.<br/>See [`LANGFLOW_AUTO_SAVING_INTERVAL` variable](./environment-variables.md#LANGFLOW_AUTO_SAVING_INTERVAL). |
| <Link id="run-health-check-max-retries"/>`--health-check-max-retries` | `5` | Integer | Set the maximum number of retries for the health check. Use `--no-health-check-max-retries` to disable the maximum number of retries for the health check.<br/>See [`LANGFLOW_HEALTH_CHECK_MAX_RETRIES` variable](./environment-variables.md#LANGFLOW_HEALTH_CHECK_MAX_RETRIES). |
| <Link id="run-max-file-size-upload"/>`--max-file-size-upload` | `100` | Integer | Set the maximum file size for the upload in megabytes.<br/>See [`LANGFLOW_MAX_FILE_SIZE_UPLOAD` variable](./environment-variables.md#LANGFLOW_MAX_FILE_SIZE_UPLOAD). |
| <Link id="run-help"/>`--help` | *Not applicable* | *Not applicable* | Display information about the command usage and its options and arguments. |
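For illustration, several of the options above can be combined in a single invocation. This is a sketch only; the values are placeholders, not recommended settings:

```shell
python -m langflow run --host 0.0.0.0 --port 7860 --log-level info --backend-only
```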
### langflow superuser
@ -137,9 +138,9 @@ python -m langflow superuser [OPTIONS]
| Option | Default | Values | Description |
|--------|---------|--------|-------------|
| <a id="superuser-username"></a>`--username` | Required | String | Specify the name for the superuser.<br/>See [`LANGFLOW_SUPERUSER` variable](./environment-variables.md#LANGFLOW_SUPERUSER). |
| <a id="superuser-password"></a>`--password` | Required | String | Specify the password for the superuser.<br/>See [`LANGFLOW_SUPERUSER_PASSWORD` variable](./environment-variables.md#LANGFLOW_SUPERUSER_PASSWORD). |
| <a id="superuser-log-level"></a>`--log-level` | `critical` | `debug`<br/>`info`<br/>`warning`<br/>`error`<br/>`critical` | Set the logging level. |
| <Link id="superuser-username"/>`--username` | Required | String | Specify the name for the superuser.<br/>See [`LANGFLOW_SUPERUSER` variable](./environment-variables.md#LANGFLOW_SUPERUSER). |
| <Link id="superuser-password"/>`--password` | Required | String | Specify the password for the superuser.<br/>See [`LANGFLOW_SUPERUSER_PASSWORD` variable](./environment-variables.md#LANGFLOW_SUPERUSER_PASSWORD). |
| <Link id="superuser-log-level"/>`--log-level` | `critical` | `debug`<br/>`info`<br/>`warning`<br/>`error`<br/>`critical` | Set the logging level. |
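As a sketch of the options above, a superuser can be created like this (the credentials are placeholders):

```shell
python -m langflow superuser --username admin --password change-me
```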
## Precedence

View file

@ -1,6 +1,6 @@
---
title: Configure an external PostgreSQL database
sidebar_position: 8
slug: /configuration-custom-database
---
Langflow's default database is [SQLite](https://www.sqlite.org/docs.html), but you can configure Langflow to use PostgreSQL instead.
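As a minimal sketch, assuming a local PostgreSQL instance, the connection is configured through the `LANGFLOW_DATABASE_URL` environment variable (listed in the environment variables reference); the connection string below is hypothetical:

```shell
# Hypothetical connection string — substitute your own user, password, host, and database name
export LANGFLOW_DATABASE_URL="postgresql://user:password@localhost:5432/langflow"
```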

View file

@ -1,6 +1,5 @@
---
title: Global variables
sidebar_position: 5
slug: /configuration-global-variables
---

View file

@ -1,6 +1,5 @@
---
title: Security best practices
sidebar_position: 1
slug: /configuration-security-best-practices
---

View file

@ -1,11 +1,12 @@
---
title: Environment variables
sidebar_position: 7
slug: /environment-variables
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
import Link from '@docusaurus/Link';
Langflow lets you configure a number of settings using environment variables.
@ -104,44 +105,44 @@ That means, if you happen to set the same environment variable in both your term
## Supported environment variables {#supported-variables}
The following table lists the environment variables supported by Langflow.
| Variable | Format / Values | Default | Description |
|----------|---------------|---------|-------------|
| <a id="DO_NOT_TRACK"></a>`DO_NOT_TRACK` | Boolean | `false` | If enabled, Langflow will not track telemetry. |
| <a id="LANGFLOW_AUTO_LOGIN"></a>`LANGFLOW_AUTO_LOGIN` | Boolean | `true` | Enable automatic login for Langflow. Set to `false` to disable automatic login and require the login form to log into the Langflow UI. Setting to `false` requires [`LANGFLOW_SUPERUSER`](#LANGFLOW_SUPERUSER) and [`LANGFLOW_SUPERUSER_PASSWORD`](#LANGFLOW_SUPERUSER_PASSWORD) to be set. |
| <a id="LANGFLOW_AUTO_SAVING"></a>`LANGFLOW_AUTO_SAVING` | Boolean | `true` | Enable flow auto-saving.<br/>See [`--auto-saving` option](./configuration-cli.md#run-auto-saving). |
| <a id="LANGFLOW_AUTO_SAVING_INTERVAL"></a>`LANGFLOW_AUTO_SAVING_INTERVAL` | Integer | `1000` | Set the interval for flow auto-saving in milliseconds.<br/>See [`--auto-saving-interval` option](./configuration-cli.md#run-auto-saving-interval). |
| <a id="LANGFLOW_BACKEND_ONLY"></a>`LANGFLOW_BACKEND_ONLY` | Boolean | `false` | Only run Langflow's backend server (no frontend).<br/>See [`--backend-only` option](./configuration-cli.md#run-backend-only). |
| <a id="LANGFLOW_CACHE_TYPE"></a>`LANGFLOW_CACHE_TYPE` | `async`<br/>`redis`<br/>`memory`<br/>`disk`<br/>`critical` | `async` | Set the cache type for Langflow.<br/>If you set the type to `redis`, then you must also set the following environment variables: [`LANGFLOW_REDIS_HOST`](#LANGFLOW_REDIS_HOST), [`LANGFLOW_REDIS_PORT`](#LANGFLOW_REDIS_PORT), [`LANGFLOW_REDIS_DB`](#LANGFLOW_REDIS_DB), and [`LANGFLOW_REDIS_CACHE_EXPIRE`](#LANGFLOW_REDIS_CACHE_EXPIRE). |
| <a id="LANGFLOW_COMPONENTS_PATH"></a>`LANGFLOW_COMPONENTS_PATH` | String | `langflow/components` | Path to the directory containing custom components.<br/>See [`--components-path` option](./configuration-cli.md#run-components-path). |
| <a id="LANGFLOW_CONFIG_DIR"></a>`LANGFLOW_CONFIG_DIR` | String | | Set the Langflow configuration directory where files, logs, and the Langflow database are stored. |
| <a id="LANGFLOW_DATABASE_URL"></a>`LANGFLOW_DATABASE_URL` | String | | Set the database URL for Langflow. If you don't provide one, Langflow uses an SQLite database. |
| <a id="LANGFLOW_DEV"></a>`LANGFLOW_DEV` | Boolean | `false` | Run Langflow in development mode (may contain bugs).<br/>See [`--dev` option](./configuration-cli.md#run-dev). |
| <a id="LANGFLOW_FALLBACK_TO_ENV_VAR"></a>`LANGFLOW_FALLBACK_TO_ENV_VAR` | Boolean | `true` | If enabled, [global variables](../Configuration/configuration-global-variables.md) set in the Langflow UI fall back to an environment variable with the same name when Langflow fails to retrieve the variable value. |
| <a id="LANGFLOW_FRONTEND_PATH"></a>`LANGFLOW_FRONTEND_PATH` | String | `./frontend` | Path to the frontend directory containing build files. This is for development purposes only.<br/>See [`--frontend-path` option](./configuration-cli.md#run-frontend-path). |
| <a id="LANGFLOW_HEALTH_CHECK_MAX_RETRIES"></a>`LANGFLOW_HEALTH_CHECK_MAX_RETRIES` | Integer | `5` | Set the maximum number of retries for the health check.<br/>See [`--health-check-max-retries` option](./configuration-cli.md#run-health-check-max-retries). |
| <a id="LANGFLOW_HOST"></a>`LANGFLOW_HOST` | String | `127.0.0.1` | The host on which the Langflow server will run.<br/>See [`--host` option](./configuration-cli.md#run-host). |
| <a id="LANGFLOW_LANGCHAIN_CACHE"></a>`LANGFLOW_LANGCHAIN_CACHE` | `InMemoryCache`<br/>`SQLiteCache` | `InMemoryCache` | Type of cache to use.<br/>See [`--cache` option](./configuration-cli.md#run-cache). |
| <a id="LANGFLOW_MAX_FILE_SIZE_UPLOAD"></a>`LANGFLOW_MAX_FILE_SIZE_UPLOAD` | Integer | `100` | Set the maximum file size for the upload in megabytes.<br/>See [`--max-file-size-upload` option](./configuration-cli.md#run-max-file-size-upload). |
| <a id="LANGFLOW_LOG_ENV"></a>`LANGFLOW_LOG_ENV` | `container_json`<br/>`container_csv`<br/> | Not set | Set the log environment. Default (Not set) is json with color. If not set a format string can be provided.<br/> See [`LANGFLOW_LOG_FORMAT`](#LANGFLOW_CACHE_TYPE) |
| <a id="LANGFLOW_LOG_FILE"></a>`LANGFLOW_LOG_FILE` | String | `logs/langflow.log` | Set the path to the log file for Langflow.<br/>See [`--log-file` option](./configuration-cli.md#run-log-file). |
| <a id="LANGFLOW_LOG_FORMAT"></a>`LANGFLOW_LOG_FORMAT` | String | `<green>{time:YYYY-MM-DD HH:mm:ss}</green> - <level>{level: <8}</level> - {module} - <level>{message}</level>` | Configure the logformat.<br/>For example without colors: `{time:YYYY-MM-DD HH:mm:ss.SSS} {level} {file} {line} {function} {message}` <br/> If [`LANGFLOW_LOG_ENV`](#LANGFLOW_LOG_ENV) is set this configuration will be ignored.
| <a id="LANGFLOW_LOG_LEVEL"></a>`LANGFLOW_LOG_LEVEL` | `debug`<br/>`info`<br/>`warning`<br/>`error`<br/>`critical` | `critical` | Set the logging level.<br/>See [`--log-level` option](./configuration-cli.md#run-log-level). |
| <a id="LANGFLOW_MAX_FILE_SIZE_UPLOAD"></a>`LANGFLOW_MAX_FILE_SIZE_UPLOAD` | Integer | `100` | Set the maximum file size for the upload in megabytes.<br/>See [`--max-file-size-upload` option](./configuration-cli.md#run-max-file-size-upload). |
| <a id="LANGFLOW_OPEN_BROWSER"></a>`LANGFLOW_OPEN_BROWSER` | Boolean | `true` | Open the system web browser on startup.<br/> See [`--open-browser` option](./configuration-cli.md#run-open-browser). |
| <a id="LANGFLOW_PORT"></a>`LANGFLOW_PORT` | Integer | `7860` | The port on which the Langflow server will run. The server automatically selects a free port if the specified port is in use.<br/>See [`--port` option](./configuration-cli.md#run-port). |
| <a id="LANGFLOW_PROMETHEUS_ENABLED"></a>`LANGFLOW_PROMETHEUS_ENABLED` | Boolean | `false` | Expose Prometheus metrics. |
| <a id="LANGFLOW_PROMETHEUS_PORT"></a>`LANGFLOW_PROMETHEUS_PORT` | Integer | `9090` | Set the port on which Langflow exposes Prometheus metrics. |
| <a id="LANGFLOW_REDIS_CACHE_EXPIRE"></a>`LANGFLOW_REDIS_CACHE_EXPIRE` | Integer | `3600` | See [`LANGFLOW_CACHE_TYPE`](#LANGFLOW_CACHE_TYPE). |
| <a id="LANGFLOW_REDIS_DB"></a>`LANGFLOW_REDIS_DB` | Integer | `0` | See [`LANGFLOW_CACHE_TYPE`](#LANGFLOW_CACHE_TYPE). |
| <a id="LANGFLOW_REDIS_HOST"></a>`LANGFLOW_REDIS_HOST` | String | `localhost` | See [`LANGFLOW_CACHE_TYPE`](#LANGFLOW_CACHE_TYPE). |
| <a id="LANGFLOW_REDIS_PORT"></a>`LANGFLOW_REDIS_PORT` | String | `6379` | See [`LANGFLOW_CACHE_TYPE`](#LANGFLOW_CACHE_TYPE). |
| <a id="LANGFLOW_REMOVE_API_KEYS"></a>`LANGFLOW_REMOVE_API_KEYS` | Boolean | `false` | Remove API keys from the projects saved in the database.<br/> See [`--remove-api-keys` option](./configuration-cli.md#run-remove-api-keys). |
| <a id="LANGFLOW_SAVE_DB_IN_CONFIG_DIR"></a>`LANGFLOW_SAVE_DB_IN_CONFIG_DIR` | Boolean | `false` | Save the Langflow database in [`LANGFLOW_CONFIG_DIR`](#LANGFLOW_CONFIG_DIR) instead of in the Langflow package directory. Note, when this variable is set to default (`false`), the database isn't shared between different virtual environments and the database is deleted when you uninstall Langflow. |
| <a id="LANGFLOW_STORE"></a>`LANGFLOW_STORE` | Boolean | `true` | Enable the Langflow Store.<br/>See [`--store` option](./configuration-cli.md#run-store). |
| <a id="LANGFLOW_STORE_ENVIRONMENT_VARIABLES"></a>`LANGFLOW_STORE_ENVIRONMENT_VARIABLES` | Boolean | `true` | Store environment variables as [global variables](../Configuration/configuration-global-variables.md) in the database. |
| <a id="LANGFLOW_SUPERUSER"></a>`LANGFLOW_SUPERUSER` | String | Not set | Set the name for the superuser. Required if [`LANGFLOW_AUTO_LOGIN`](#LANGFLOW_AUTO_LOGIN) is set to `false`.<br/>See [`superuser --username` option](./configuration-cli.md#superuser-username). |
| <a id="LANGFLOW_SUPERUSER_PASSWORD"></a>`LANGFLOW_SUPERUSER_PASSWORD` | String | Not set | Set the password for the superuser. Required if [`LANGFLOW_AUTO_LOGIN`](#LANGFLOW_AUTO_LOGIN) is set to `false`.<br/>See [`superuser --password` option](./configuration-cli.md#superuser-password).|
| <a id="LANGFLOW_VARIABLES_TO_GET_FROM_ENVIRONMENT"></a>`LANGFLOW_VARIABLES_TO_GET_FROM_ENVIRONMENT` | String | Not set | Comma-separated list of environment variables to get from the environment and store as [global variables](../Configuration/configuration-global-variables.md). |
| <a id="LANGFLOW_WORKER_TIMEOUT"></a>`LANGFLOW_WORKER_TIMEOUT` | Integer | `300` | Worker timeout in seconds.<br/>See [`--worker-timeout` option](./configuration-cli.md#run-worker-timeout). |
| <a id="LANGFLOW_WORKERS"></a>`LANGFLOW_WORKERS` | Integer | `1` | Number of worker processes.<br/>See [`--workers` option](./configuration-cli.md#run-workers). |
| <Link id="DO_NOT_TRACK"/>`DO_NOT_TRACK` | Boolean | `false` | If enabled, Langflow will not track telemetry. |
| <Link id="LANGFLOW_AUTO_LOGIN"/>`LANGFLOW_AUTO_LOGIN` | Boolean | `true` | Enable automatic login for Langflow. Set to `false` to disable automatic login and require the login form to log into the Langflow UI. Setting to `false` requires [`LANGFLOW_SUPERUSER`](#LANGFLOW_SUPERUSER) and [`LANGFLOW_SUPERUSER_PASSWORD`](#LANGFLOW_SUPERUSER_PASSWORD) to be set. |
| <Link id="LANGFLOW_AUTO_SAVING"/>`LANGFLOW_AUTO_SAVING` | Boolean | `true` | Enable flow auto-saving.<br/>See [`--auto-saving` option](./configuration-cli.md#run-auto-saving). |
| <Link id="LANGFLOW_AUTO_SAVING_INTERVAL"/>`LANGFLOW_AUTO_SAVING_INTERVAL` | Integer | `1000` | Set the interval for flow auto-saving in milliseconds.<br/>See [`--auto-saving-interval` option](./configuration-cli.md#run-auto-saving-interval). |
| <Link id="LANGFLOW_BACKEND_ONLY"/>`LANGFLOW_BACKEND_ONLY` | Boolean | `false` | Only run Langflow's backend server (no frontend).<br/>See [`--backend-only` option](./configuration-cli.md#run-backend-only). |
| <Link id="LANGFLOW_CACHE_TYPE"/>`LANGFLOW_CACHE_TYPE` | `async`<br/>`redis`<br/>`memory`<br/>`disk`<br/>`critical` | `async` | Set the cache type for Langflow.<br/>If you set the type to `redis`, then you must also set the following environment variables: [`LANGFLOW_REDIS_HOST`](#LANGFLOW_REDIS_HOST), [`LANGFLOW_REDIS_PORT`](#LANGFLOW_REDIS_PORT), [`LANGFLOW_REDIS_DB`](#LANGFLOW_REDIS_DB), and [`LANGFLOW_REDIS_CACHE_EXPIRE`](#LANGFLOW_REDIS_CACHE_EXPIRE). |
| <Link id="LANGFLOW_COMPONENTS_PATH"/>`LANGFLOW_COMPONENTS_PATH` | String | `langflow/components` | Path to the directory containing custom components.<br/>See [`--components-path` option](./configuration-cli.md#run-components-path). |
| <Link id="LANGFLOW_CONFIG_DIR"/>`LANGFLOW_CONFIG_DIR` | String | | Set the Langflow configuration directory where files, logs, and the Langflow database are stored. |
| <Link id="LANGFLOW_DATABASE_URL"/>`LANGFLOW_DATABASE_URL` | String | | Set the database URL for Langflow. If you don't provide one, Langflow uses an SQLite database. |
| <Link id="LANGFLOW_DEV"/>`LANGFLOW_DEV` | Boolean | `false` | Run Langflow in development mode (may contain bugs).<br/>See [`--dev` option](./configuration-cli.md#run-dev). |
| <Link id="LANGFLOW_FALLBACK_TO_ENV_VAR"/>`LANGFLOW_FALLBACK_TO_ENV_VAR` | Boolean | `true` | If enabled, [global variables](../Configuration/configuration-global-variables.md) set in the Langflow UI fall back to an environment variable with the same name when Langflow fails to retrieve the variable value. |
| <Link id="LANGFLOW_FRONTEND_PATH"/>`LANGFLOW_FRONTEND_PATH` | String | `./frontend` | Path to the frontend directory containing build files. This is for development purposes only.<br/>See [`--frontend-path` option](./configuration-cli.md#run-frontend-path). |
| <Link id="LANGFLOW_HEALTH_CHECK_MAX_RETRIES"/>`LANGFLOW_HEALTH_CHECK_MAX_RETRIES` | Integer | `5` | Set the maximum number of retries for the health check.<br/>See [`--health-check-max-retries` option](./configuration-cli.md#run-health-check-max-retries). |
| <Link id="LANGFLOW_HOST"/>`LANGFLOW_HOST` | String | `127.0.0.1` | The host on which the Langflow server will run.<br/>See [`--host` option](./configuration-cli.md#run-host). |
| <Link id="LANGFLOW_LANGCHAIN_CACHE"/>`LANGFLOW_LANGCHAIN_CACHE` | `InMemoryCache`<br/>`SQLiteCache` | `InMemoryCache` | Type of cache to use.<br/>See [`--cache` option](./configuration-cli.md#run-cache). |
| <Link id="LANGFLOW_MAX_FILE_SIZE_UPLOAD"/>`LANGFLOW_MAX_FILE_SIZE_UPLOAD` | Integer | `100` | Set the maximum file size for the upload in megabytes.<br/>See [`--max-file-size-upload` option](./configuration-cli.md#run-max-file-size-upload). |
| <Link id="LANGFLOW_LOG_ENV"/>`LANGFLOW_LOG_ENV` | `container_json`<br/>`container_csv`<br/> | Not set | Set the log environment. When not set, the default is JSON with color; alternatively, a format string can be provided.<br/> See [`LANGFLOW_LOG_FORMAT`](#LANGFLOW_LOG_FORMAT). |
| <Link id="LANGFLOW_LOG_FILE"/>`LANGFLOW_LOG_FILE` | String | `logs/langflow.log` | Set the path to the log file for Langflow.<br/>See [`--log-file` option](./configuration-cli.md#run-log-file). |
| <Link id="LANGFLOW_LOG_FORMAT"/>`LANGFLOW_LOG_FORMAT` | String | `<green>{time:YYYY-MM-DD HH:mm:ss}</green> - <level>{level: <8}</level> - {module} - <level>{message}</level>` | Configure the log format.<br/>For example, without colors: `{time:YYYY-MM-DD HH:mm:ss.SSS} {level} {file} {line} {function} {message}` <br/> If [`LANGFLOW_LOG_ENV`](#LANGFLOW_LOG_ENV) is set, this configuration is ignored. |
| <Link id="LANGFLOW_LOG_LEVEL"/>`LANGFLOW_LOG_LEVEL` | `debug`<br/>`info`<br/>`warning`<br/>`error`<br/>`critical` | `critical` | Set the logging level.<br/>See [`--log-level` option](./configuration-cli.md#run-log-level). |
| <Link id="LANGFLOW_OPEN_BROWSER"/>`LANGFLOW_OPEN_BROWSER` | Boolean | `true` | Open the system web browser on startup.<br/> See [`--open-browser` option](./configuration-cli.md#run-open-browser). |
| <Link id="LANGFLOW_PORT"/>`LANGFLOW_PORT` | Integer | `7860` | The port on which the Langflow server will run. The server automatically selects a free port if the specified port is in use.<br/>See [`--port` option](./configuration-cli.md#run-port). |
| <Link id="LANGFLOW_PROMETHEUS_ENABLED"/>`LANGFLOW_PROMETHEUS_ENABLED` | Boolean | `false` | Expose Prometheus metrics. |
| <Link id="LANGFLOW_PROMETHEUS_PORT"/>`LANGFLOW_PROMETHEUS_PORT` | Integer | `9090` | Set the port on which Langflow exposes Prometheus metrics. |
| <Link id="LANGFLOW_REDIS_CACHE_EXPIRE"/>`LANGFLOW_REDIS_CACHE_EXPIRE` | Integer | `3600` | See [`LANGFLOW_CACHE_TYPE`](#LANGFLOW_CACHE_TYPE). |
| <Link id="LANGFLOW_REDIS_DB"/>`LANGFLOW_REDIS_DB` | Integer | `0` | See [`LANGFLOW_CACHE_TYPE`](#LANGFLOW_CACHE_TYPE). |
| <Link id="LANGFLOW_REDIS_HOST"/>`LANGFLOW_REDIS_HOST` | String | `localhost` | See [`LANGFLOW_CACHE_TYPE`](#LANGFLOW_CACHE_TYPE). |
| <Link id="LANGFLOW_REDIS_PORT"/>`LANGFLOW_REDIS_PORT` | String | `6379` | See [`LANGFLOW_CACHE_TYPE`](#LANGFLOW_CACHE_TYPE). |
| <Link id="LANGFLOW_REMOVE_API_KEYS"/>`LANGFLOW_REMOVE_API_KEYS` | Boolean | `false` | Remove API keys from the projects saved in the database.<br/> See [`--remove-api-keys` option](./configuration-cli.md#run-remove-api-keys). |
| <Link id="LANGFLOW_SAVE_DB_IN_CONFIG_DIR"/>`LANGFLOW_SAVE_DB_IN_CONFIG_DIR` | Boolean | `false` | Save the Langflow database in [`LANGFLOW_CONFIG_DIR`](#LANGFLOW_CONFIG_DIR) instead of in the Langflow package directory. Note, when this variable is set to default (`false`), the database isn't shared between different virtual environments and the database is deleted when you uninstall Langflow. |
| <Link id="LANGFLOW_STORE"/>`LANGFLOW_STORE` | Boolean | `true` | Enable the Langflow Store.<br/>See [`--store` option](./configuration-cli.md#run-store). |
| <Link id="LANGFLOW_STORE_ENVIRONMENT_VARIABLES"/>`LANGFLOW_STORE_ENVIRONMENT_VARIABLES` | Boolean | `true` | Store environment variables as [global variables](../Configuration/configuration-global-variables.md) in the database. |
| <Link id="LANGFLOW_SUPERUSER"/>`LANGFLOW_SUPERUSER` | String | Not set | Set the name for the superuser. Required if [`LANGFLOW_AUTO_LOGIN`](#LANGFLOW_AUTO_LOGIN) is set to `false`.<br/>See [`superuser --username` option](./configuration-cli.md#superuser-username). |
| <Link id="LANGFLOW_SUPERUSER_PASSWORD"/>`LANGFLOW_SUPERUSER_PASSWORD` | String | Not set | Set the password for the superuser. Required if [`LANGFLOW_AUTO_LOGIN`](#LANGFLOW_AUTO_LOGIN) is set to `false`.<br/>See [`superuser --password` option](./configuration-cli.md#superuser-password).|
| <Link id="LANGFLOW_VARIABLES_TO_GET_FROM_ENVIRONMENT"/>`LANGFLOW_VARIABLES_TO_GET_FROM_ENVIRONMENT` | String | Not set | Comma-separated list of environment variables to get from the environment and store as [global variables](../Configuration/configuration-global-variables.md). |
| <Link id="LANGFLOW_WORKER_TIMEOUT"/>`LANGFLOW_WORKER_TIMEOUT` | Integer | `300` | Worker timeout in seconds.<br/>See [`--worker-timeout` option](./configuration-cli.md#run-worker-timeout). |
| <Link id="LANGFLOW_WORKERS"/>`LANGFLOW_WORKERS` | Integer | `1` | Number of worker processes.<br/>See [`--workers` option](./configuration-cli.md#run-workers). |
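As a hedged sketch, several of the variables above can be combined in a `.env` file (all values are illustrative placeholders):

```text
# Illustrative .env — all values are placeholders
LANGFLOW_AUTO_LOGIN=false
LANGFLOW_SUPERUSER=admin
LANGFLOW_SUPERUSER_PASSWORD=change-me
LANGFLOW_PORT=7860
LANGFLOW_LOG_LEVEL=info
```

Pass the file to Langflow with the [`--env-file` option](./configuration-cli.md#run-env-file).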

View file

@ -1 +0,0 @@
{"position":10, "label":"Contributing"}

View file

@ -1,6 +1,5 @@
---
title: Join the Langflow community
sidebar_position: 5
slug: /contributing-community
---

View file

@ -1,6 +1,5 @@
---
title: Contribute components
sidebar_position: 4
slug: /contributing-components
---

View file

@ -1,6 +1,5 @@
---
title: Ask for help on the Discussions board
sidebar_position: 3
slug: /contributing-github-discussions
---

View file

@ -1,6 +1,5 @@
---
title: Request an enhancement or report a bug
sidebar_position: 2
slug: /contributing-github-issues
---

View file

@ -1,6 +1,5 @@
---
title: Contribute to Langflow
sidebar_position: 1
slug: /contributing-how-to-contribute
---

View file

@ -1,6 +1,5 @@
---
title: Telemetry
sidebar_position: 0
slug: /contributing-telemetry
---

View file

@ -1 +0,0 @@
{"position":7, "label":"Deployment"}

View file

@ -1,7 +1,6 @@
---
title: Docker
sidebar_position: 2
slug: /deployment-docker
title: Docker
slug: /deployment-docker
---
@ -38,7 +37,7 @@ This guide will help you get LangFlow up and running using Docker and Docker Com
`docker compose up`
LangFlow will now be accessible at [http://localhost:7860/](http://localhost:7860/).
LangFlow will now be accessible at `http://localhost:7860/`.
### Docker Compose configuration {#02226209cad24185a6ec5b69bd820d0f}

View file

@ -1,6 +1,5 @@
---
title: GCP
sidebar_position: 3
slug: /deployment-gcp
---

View file

@ -1,6 +1,5 @@
---
title: HuggingFace Spaces
sidebar_position: 0
slug: /deployment-hugging-face-spaces
---

View file

@ -1,6 +1,5 @@
---
title: Kubernetes
sidebar_position: 1
slug: /deployment-kubernetes
---
@ -84,7 +83,7 @@ kubectl port-forward -n langflow svc/langflow-langflow-runtime 7860:7860
```
Now you can access LangFlow at [http://localhost:7860/](http://localhost:7860/).
Now you can access LangFlow at `http://localhost:7860/`.
### LangFlow version {#645c6ef7984d4da0bcc4170bab0ff415}
@ -258,7 +257,7 @@ kubectl port-forward -n langflow svc/langflow-my-langflow-app 7860:7860
```
Now you can access the API at [http://localhost:7860/api/v1/flows](http://localhost:7860/api/v1/flows) and execute the flow:
Now you can access the API at `http://localhost:7860/api/v1/flows` and execute the flow:
```shell

View file

@ -1,6 +1,5 @@
---
title: Railway
sidebar_position: 5
slug: /deployment-railway
---

View file

@ -1,6 +1,5 @@
---
title: Render
sidebar_position: 4
slug: /deployment-render
---

View file

@ -1 +0,0 @@
{"position":1, "label":"Get Started"}

View file

@ -1,6 +1,5 @@
---
title: Install Langflow
sidebar_position: 1
slug: /get-started-installation
---

View file

@ -1,6 +1,5 @@
---
title: Quickstart
sidebar_position: 2
slug: /get-started-quickstart
---
@ -121,9 +120,9 @@ The [Astra DB vector store](/components-vector-stores#astra-db-vector-store) com
3. Click **Data**, select the **File** component, and then drag it to the canvas.
The [File](/components-data#file) component loads files from your local machine.
3. Click **Processing**, select the **Split Text** component, and then drag it to the canvas.
The [Split Text](/components-helpers#split-text) component splits the loaded text into smaller chunks.
The [Split Text](/components-processing#split-text) component splits the loaded text into smaller chunks.
4. Click **Processing**, select the **Parse Data** component, and then drag it to the canvas.
The [Parse Data](/components-helpers#parse-data) component converts the data from the **Astra DB** component into plain text.
The [Parse Data](/components-processing#parse-data) component converts the data from the **Astra DB** component into plain text.
5. Click **Embeddings**, select the **OpenAI Embeddings** component, and then drag it to the canvas.
The [OpenAI Embeddings](/components-embedding-models#openai-embeddings) component generates embeddings for the user's input, which are compared to the vector data in the database.
6. Connect the new components into the existing flow, so your flow looks like this:

View file

@ -1,6 +1,5 @@
---
title: Welcome to Langflow
sidebar_position: 0
slug: /
---

View file

@ -1 +0,0 @@
{"position":3, "label":"Guides"}

View file

@ -1,6 +1,5 @@
---
title: Chat Memory
sidebar_position: 1
slug: /guides-chat-memory
---
@ -12,7 +11,7 @@ Langflow allows every chat message to be stored, and a single flow can have mult
In any project, as long as [**Chat**](/components-io) components are being used, memories are stored by default. These are messages from a user to the AI or vice versa.
To see and access this history of messages, Langflow features a component called [Message history](/components-helpers#memory-history). It retrieves previous messages and outputs them in structured format or parsed.
To see and access this history of messages, Langflow features a component called [Message history](/components-helpers#message-history). It retrieves previous messages and outputs them in a structured or parsed format.
To learn the basics about memory in Langflow, check out the [Memory Chatbot](/tutorials-memory-chatbot) starter example.
@ -55,7 +54,7 @@ You can also display all messages stored across every flow and session by going
## Store chat memory in an external database
Chat memory is retrieved from an external database or vector store using the [Chat Memory](/components-helpers#chat-memory) component.
Chat memory is retrieved from an external database or vector store using the [Message history](/components-helpers#message-history) component.
Chat memory is stored to an external database or vector store using the [Store Message](/components-helpers#store-message) component.
@ -81,7 +80,7 @@ The **Astra DB Chat Memory** component stores and retrieves messages from **Astr
4. Configure the **AstraDBChatMemory** component with your AstraDB instance details.
1. In the **Astra DB Application Token** field, add your Astra token (`AstraCS:...`).
2. In the **API Endpoint** field, add your Astra database's endpoint (for example, `https://12adb-bc-5378c845f05a6-e0a12-bd889b4-us-east-2.apps.astra.datastax.com`).
5. Connect the **AstraDBChatMemory** component output to the external memory inputs of the [Chat Memory](/components-helpers#chat-memory) and [Store Message](/components-helpers#store-message) components.
5. Connect the **AstraDBChatMemory** component output to the external memory inputs of the [Message history](/components-helpers#message-history) and [Store Message](/components-helpers#store-message) components.
6. Link the [Chat Output](/components-io#chat-output) component to the input of the [Store Message](/components-helpers#store-message) component.
Your completed flow should look like this:

View file

@ -1,6 +1,5 @@
---
title: Data & Message
sidebar_position: 2
slug: /guides-data-message
---

View file

@ -1,6 +1,5 @@
---
title: 📚 New to LLMs?
sidebar_position: 0
slug: /guides-new-to-llms
---

View file

@ -1 +0,0 @@
{ "position": 2, "label": "Google" }

View file

@ -1,7 +1,6 @@
---
title: 'Integrate Google Cloud Vertex AI with Langflow'
title: Integrate Google Cloud Vertex AI with Langflow
slug: /integrations-setup-google-cloud-vertex-ai-langflow
sidebar_position: 2
description: "A comprehensive guide on creating a Google OAuth app, obtaining tokens, and integrating them with Langflow's Google components."
---

View file

@ -1,7 +1,7 @@
---
title: Integrate Google OAuth with Langflow
slug: /integrations-setup-google-oauth-langflow
sidebar_position: 3
description: "A comprehensive guide on creating a Google OAuth app, obtaining tokens, and integrating them with Langflow's Google components."
---


@@ -1 +0,0 @@
{"position":5, "label":"Notion"}


@@ -1,6 +1,5 @@
---
title: Setup
sidebar_position: 0
slug: /integrations/notion/setup
---


@@ -1,6 +1,5 @@
---
title: Notion Conversational Agent
sidebar_position: 2
slug: /integrations/notion/notion-agent-conversational
---


@@ -1,6 +1,5 @@
---
title: Notion Meeting Notes Agent
sidebar_position: 1
slug: /integrations/notion/notion-agent-meeting-notes
---


@@ -1 +0,0 @@
{"position":9, "label":"Integrations"}


@@ -1,6 +1,5 @@
---
title: AssemblyAI
sidebar_position: 1
slug: /integrations-assemblyai
---


@@ -1,6 +1,5 @@
---
title: Langfuse
sidebar_position: 2
slug: /integrations-langfuse
---


@@ -1,6 +1,5 @@
---
title: LangSmith
sidebar_position: 3
slug: /integrations-langsmith
---


@@ -1,6 +1,5 @@
---
title: LangWatch
sidebar_position: 4
slug: /integrations-langwatch
---


@@ -1,6 +1,5 @@
---
title: Basic Prompting
sidebar_position: 0
slug: /starter-projects-basic-prompting
---


@@ -1,6 +1,5 @@
---
title: Simple agent
sidebar_position: 6
slug: /starter-projects-simple-agent
---


@@ -1,6 +1,5 @@
---
title: Vector Store RAG
sidebar_position: 4
slug: /starter-projects-vector-store-rag
---


@@ -1,6 +1,5 @@
---
title: Blog Writer
sidebar_position: 1
slug: /tutorials-blog-writer
---


@@ -1,6 +1,5 @@
---
title: Document QA
sidebar_position: 2
slug: /tutorials-document-qa
---


@@ -1,6 +1,5 @@
---
title: Memory Chatbot
sidebar_position: 3
slug: /tutorials-memory-chatbot
---


@@ -1,6 +1,5 @@
---
title: Sequential tasks agent
sidebar_position: 4
slug: /tutorials-sequential-agent
---


@@ -1,6 +1,5 @@
---
title: Travel planning agent
sidebar_position: 8
slug: /tutorials-travel-planning-agent
---


@@ -1 +0,0 @@
{"position":4, "label":"Workspace"}


@@ -1,6 +1,5 @@
---
title: API
sidebar_position: 2
slug: /workspace-api
---


@@ -1,6 +1,5 @@
---
title: Logs
sidebar_position: 4
slug: /workspace-logs
---


@@ -1,6 +1,5 @@
---
title: Workspace concepts
sidebar_position: 1
slug: /workspace-overview
---


@@ -1,6 +1,5 @@
---
title: Playground
sidebar_position: 2
slug: /workspace-playground
---


@@ -15,6 +15,7 @@ const config = {
baseUrl: "/",
onBrokenLinks: "throw",
onBrokenMarkdownLinks: "warn",
onBrokenAnchors: "warn",
organizationName: "langflow-ai",
projectName: "langflow",
trailingSlash: false,
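
For readers unfamiliar with these Docusaurus options, the link-checking portion of `docusaurus.config.js` after this change looks roughly like the sketch below. Only the keys touched by or adjacent to this hunk are shown; everything else (presets, theme config, and so on) is omitted:

```javascript
// docusaurus.config.js (sketch) — only the link-checking keys are shown.
// The real file ends with `module.exports = config;`.
const config = {
  baseUrl: "/",
  // Fail the build when an internal link points to a missing page.
  onBrokenLinks: "throw",
  // Only warn when a Markdown link target cannot be resolved.
  onBrokenMarkdownLinks: "warn",
  // Added in this PR: warn (instead of staying silent) when a link's
  // #fragment does not match any anchor on the target page.
  onBrokenAnchors: "warn",
};
```

With `onBrokenAnchors` set to `"warn"`, every stale fragment like the old `#chat-memory` now surfaces as a build-time warning, which is what motivated the link-id fixes throughout this commit.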