docs: clean up configuration folder (#7426)

* remove-config-pages

* redirect-and-remove-from-sidebars

* add-note-about-backend-only

* redirects-and-sidebar

* fix-plaintext-linting-errors

* cli-page
Mendon Kissling 2025-04-07 17:10:09 -04:00 committed by GitHub
commit eb4e6ae87e
No known key found for this signature in database
GPG key ID: B5690EEEBB952194
20 changed files with 48 additions and 247 deletions

View file

@@ -25,7 +25,7 @@ export LANGFLOW_URL="http://127.0.0.1:7860"
* Export the `flow-id` in your terminal.
The `flow-id` is found in the [Publish pane](/concepts-publish) or in the flow's URL.
```plain
```text
export FLOW_ID="359cd752-07ea-46f2-9d3b-a4407ef618da"
```
@@ -131,7 +131,7 @@ curl -X POST \
</TabItem>
<TabItem value="result" label="Result">
```result
```text
{
"session_id": "chat-123",
"outputs": [{
@@ -185,7 +185,7 @@ curl -X POST \
</TabItem>
<TabItem value="result" label="Result">
```result
```text
{"event": "add_message", "data": {"timestamp": "2025-03-03T17:20:18", "sender": "User", "sender_name": "User", "session_id": "chat-123", "text": "Tell me about something interesting!", "files": [], "error": false, "edit": false, "properties": {"text_color": "", "background_color": "", "edited": false, "source": {"id": null, "display_name": null, "source": null}, "icon": "", "allow_markdown": false, "positive_feedback": null, "state": "complete", "targets": []}, "category": "message", "content_blocks": [], "id": "0103a21b-ebf7-4c02-9d72-017fb297f812", "flow_id": "d2bbd92b-187e-4c84-b2d4-5df365704201"}}
{"event": "add_message", "data": {"timestamp": "2025-03-03T17:20:18", "sender": "Machine", "sender_name": "AI", "session_id": "chat-123", "text": "", "files": [], "error": false, "edit": false, "properties": {"text_color": "", "background_color": "", "edited": false, "source": {"id": "OpenAIModel-d1wOZ", "display_name": "OpenAI", "source": "gpt-4o-mini"}, "icon": "OpenAI", "allow_markdown": false, "positive_feedback": null, "state": "complete", "targets": []}, "category": "message", "content_blocks": [], "id": "27b66789-e673-4c65-9e81-021752925161", "flow_id": "d2bbd92b-187e-4c84-b2d4-5df365704201"}}
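Each line of this streamed result is a standalone JSON event. As a rough illustration (not part of the docs being changed here), a few lines of Python can split such newline-delimited output into usable fields; the `parse_events` helper and the abbreviated sample are hypothetical:

```python
import json

def parse_events(raw: str):
    """Split newline-delimited event JSON into (event, sender, text) tuples."""
    parsed = []
    for line in raw.splitlines():
        if not line.strip():
            continue
        event = json.loads(line)
        data = event.get("data", {})
        parsed.append((event["event"], data.get("sender"), data.get("text")))
    return parsed

# Abbreviated sample with the same shape as the events above.
sample = '{"event": "add_message", "data": {"sender": "User", "text": "Tell me about something interesting!"}}'
print(parse_events(sample))
# [('add_message', 'User', 'Tell me about something interesting!')]
```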
@@ -443,7 +443,7 @@ curl -X POST \
<Tabs>
<TabItem value="curl" label="curl" default>
```curl
```text
curl -X GET \
"$LANGFLOW_URL/api/v1/build/123e4567-e89b-12d3-a456-426614174000/events" \
-H "accept: application/json"
@@ -467,7 +467,7 @@ curl -X GET \
The events endpoint accepts an optional `stream` query parameter which defaults to `true`.
To disable streaming and get all events at once, set `stream` to `false`.
```curl
```text
curl -X GET \
"$LANGFLOW_URL/api/v1/build/123e4567-e89b-12d3-a456-426614174000/events?stream=false" \
-H "accept: application/json"

View file

@@ -161,7 +161,7 @@ class TextAnalyzerComponent(Component):
4. Connect the tool output to the agent's tools input.
5. Ask the agent, `What tools are you using to answer my questions?`
Your response will be similar to the following, and will include your custom component.
```plain
```text
I have access to several tools that assist me in answering your questions, including:
Search API: This allows me to search for recent information or results on the web.
HTTP Requests: I can make HTTP requests to various URLs to retrieve data or interact with APIs.

View file

@@ -22,7 +22,7 @@ This is represented in Langflow by connecting the Parse Data component's **Data
![Sample Flow looping summarizer](/img/loop-text-summarizer.png)
The output will look similar to this:
```plain
```text
Document Summary
Total Pages Processed
Total Pages: 2

View file

@@ -23,7 +23,7 @@ This prompt creates a "personality" for your LLM's chat interactions, but it doe
To modify the prompt template, in the **Prompt** component, click the **Template** field. For example, the `{context}` variable gives the LLM model access to embedded vector data to return better answers.
```plain
```text
Given the context
{context}
Answer the question
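Langflow fills these `{variable}` placeholders itself at runtime, but as a rough Python analogue of the substitution (the variable names and values below are illustrative only, not Langflow's actual mechanism):

```python
# Illustrative only: Langflow performs this substitution internally.
template = """Given the context
{context}
Answer the question
{question}"""

filled = template.format(
    context="Langflow is a visual framework for building LLM-powered flows.",
    question="What is Langflow?",
)
print(filled)
```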

View file

@@ -37,7 +37,7 @@ The **JavaScript API** tab displays code to interact with your flow in JavaScript.
1. Copy and paste the code into a JavaScript file.
2. Run the script.
```plain
```text
node test-script.js "tell me about something interesting"
```

View file

@@ -1,47 +0,0 @@
---
title: Auto-saving
slug: /configuration-auto-save
---
Langflow supports both manual and auto-saving functionality.
## Auto-saving {#auto-saving}
When Langflow is in auto-saving mode, all changes are saved automatically. Auto-save progress is indicated on the left side of the top bar.
* When a flow is being saved, a loading icon indicates that the flow is being saved in the database.
* If you try to exit the flow page before auto-save completes, you are prompted to confirm you want to exit before the flow has saved.
* When the flow has successfully saved, click **Exit**.
## Disable auto-saving {#environment}
To disable auto-saving:
1. Set an environment variable in your `.env` file.
```env
LANGFLOW_AUTO_SAVING=false
```
2. Start Langflow with the values from your `.env` file.
```shell
python -m langflow run --env-file .env
```
Alternatively, disable auto-saving by passing the `--no-auto-saving` flag at startup.
```shell
python -m langflow run --no-auto-saving
```
## Save a flow manually {#manual-saving}
When auto-saving is disabled, you must save your flow manually after making changes.
To save your flow manually, click the **Save** button or press Ctrl+S or Command+S.
If you try to exit after making changes and not saving, a confirmation dialog appears.

View file

@@ -1,122 +0,0 @@
---
title: Run Langflow in backend-only mode
slug: /configuration-backend-only
---
Langflow can run in `--backend-only` mode to expose a Langflow app as an API endpoint, without running the frontend UI.
This is also known as "headless" mode. Running Langflow without the frontend is useful for automation, testing, and situations where you just need to serve a flow as a workload without creating a new flow in the UI.
To run Langflow in backend-only mode, pass the `--backend-only` flag at startup.
```shell
python3 -m langflow run --backend-only
```
The terminal prints `Welcome to ⛓ Langflow`, and Langflow will now serve requests to its API without the frontend running.
## Set up a basic prompting flow in backend-only mode
This example shows you how to set up a [Basic Prompting flow](/starter-projects-basic-prompting) as an endpoint in backend-only mode.
However, you can use these same instructions as guidelines for using any type of flow in backend-only mode.
### Prerequisites
- [Langflow is installed](/get-started-installation)
- [You have an OpenAI API key](https://platform.openai.com/)
- [You have a Langflow Basic Prompting flow](/starter-projects-basic-prompting)
### Get your flow's ID
This guide assumes you have created a [Basic Prompting flow](/starter-projects-basic-prompting) or have another working flow available.
1. In the Langflow UI, click **API**.
2. Click **curl** &gt; **Copy code** to copy the curl command.
This command will POST input to your flow's endpoint.
It will look something like this:
```text
curl -X POST \
"http://127.0.0.1:7861/api/v1/run/fff8dcaa-f0f6-4136-9df0-b7cb38de42e0?stream=false" \
-H 'Content-Type: application/json' \
-d '{"input_value": "message",
"output_type": "chat",
"input_type": "chat",
"tweaks": {
"ChatInput-8a86T": {},
"Prompt-pKfl9": {},
"ChatOutput-WcGpD": {},
"OpenAIModel-5UyvQ": {}
}}'
```
The flow ID in this example is `fff8dcaa-f0f6-4136-9df0-b7cb38de42e0`, a UUID generated by Langflow and used in the endpoint URL.
See [API](/configuration-api-keys) to change the endpoint.
3. To stop Langflow, press **Ctrl+C**.
### Start Langflow in backend-only mode
1. Start Langflow in backend-only mode.
```shell
python3 -m langflow run --backend-only
```
The terminal prints `Welcome to ⛓ Langflow`.
Langflow is now serving requests to its API.
2. Run the curl code you copied from the UI.
You should get a result like this:
```text
{"session_id":"ef7e0554-69e5-4e3e-ab29-ee83bcd8d9ef:bf81d898868ac87e1b4edbd96c131c5dee801ea2971122cc91352d144a45b880","outputs":[{"inputs":{"input_value":"hi, are you there?"},"outputs":[{"results":{"result":"Arrr, ahoy matey! Aye, I be here. What be ye needin', me hearty?"},"artifacts":{"message":"Arrr, ahoy matey! Aye, I be here. What be ye needin', me hearty?","sender":"Machine","sender_name":"AI"},"messages":[{"message":"Arrr, ahoy matey! Aye, I be here. What be ye needin', me hearty?","sender":"Machine","sender_name":"AI","component_id":"ChatOutput-ktwdw"}],"component_display_name":"Chat Output","component_id":"ChatOutput-ktwdw","used_frozen_result":false}]}]}
```
This confirms Langflow is receiving your POST request, running the flow, and returning the result without running the frontend.
You can interact with this endpooint using the other options in the **API** menu, including the Python and JavaScript APIs.
### Query the Langflow endpoint with a Python script
Using the same flow ID, run a Python sample script to send a query and get a prettified JSON response back.
1. Create a Python file and name it `langflow_api_demo.py`.
```python
import requests
import json

def query_langflow(message):
    url = "http://127.0.0.1:7861/api/v1/run/fff8dcaa-f0f6-4136-9df0-b7cb38de42e0"
    headers = {"Content-Type": "application/json"}
    data = {"input_value": message}
    response = requests.post(url, headers=headers, json=data)
    return response.json()

user_input = input("Enter your message: ")
result = query_langflow(user_input)
print(json.dumps(result, indent=2))
```
2. Run the script.
```shell
python langflow_api_demo.py
```
3. Enter your message when prompted.
You will get a prettified JSON response back containing a response to your message.
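The raw response nests the chat text fairly deeply. A minimal sketch of pulling it out of that structure — the key path mirrors the example response shown earlier and may differ between Langflow versions, and `extract_reply` is a hypothetical helper, not part of the Langflow API:

```python
def extract_reply(response: dict) -> str:
    # Key path taken from the sample response shown earlier;
    # it may vary between Langflow versions.
    return response["outputs"][0]["outputs"][0]["results"]["result"]

sample = {
    "session_id": "abc",
    "outputs": [{
        "inputs": {"input_value": "hi, are you there?"},
        "outputs": [{"results": {"result": "Arrr, ahoy matey! Aye, I be here."}}],
    }],
}
print(extract_reply(sample))
# Arrr, ahoy matey! Aye, I be here.
```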
### Configure host and ports in backend-only mode
To change the host and port, pass the values as additional flags.
```shell
python -m langflow run --host 127.0.0.1 --port 7860 --backend-only
```

View file

@@ -28,12 +28,12 @@ python -m langflow [OPTIONS]
| Option | Default | Values | Description |
|--------|---------|--------|-------------|
| <Link id="install-completion"/>`--install-completion` | *Not applicable* | *Not applicable* | Install auto-completion for the current shell. |
| <Link id="show-completion"/>`--show-completion` | *Not applicable* | *Not applicable* | Show the location of the auto-completion config file (if installed). |
| <Link id="show-completion"/>`--show-completion` | *Not applicable* | *Not applicable* | Show the location of the auto-completion config file, if installed. |
| <Link id="help"/>`--help` | *Not applicable* | *Not applicable* | Display information about the command usage and its options and arguments. |
### langflow api-key
Create an API key for the default superuser if the [`LANGFLOW_AUTO_LOGIN` environment variable] is set to `true`.
Create an API key for the default superuser if the `LANGFLOW_AUTO_LOGIN` environment variable is set to `true`.
```bash
langflow api-key [OPTIONS]
@@ -146,7 +146,7 @@ python -m langflow superuser [OPTIONS]
Langflow CLI options override the values of corresponding [environment variables](./environment-variables.md).
For example, if you have `LANGFLOW_PORT=7860` defined as an environment variable, but you run the CLI with `--port 7880`, then Langflow will set the port to **`7880`** (the value passed with the CLI).
For example, if you have `LANGFLOW_PORT=7860` defined as an environment variable, but you run the CLI with `--port 7880`, Langflow sets the port to **`7880`**, the value passed with the CLI.
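That precedence can be sketched in plain Python — `resolve_port` below is illustrative, not Langflow's actual implementation:

```python
import os

def resolve_port(cli_port=None, default=7860):
    """CLI flag wins over the environment variable, which wins over the default."""
    if cli_port is not None:
        return int(cli_port)
    env_port = os.environ.get("LANGFLOW_PORT")
    if env_port is not None:
        return int(env_port)
    return default

os.environ["LANGFLOW_PORT"] = "7860"
print(resolve_port(cli_port=7880))  # 7880: the CLI flag overrides the environment variable
```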
## Assign values

View file

@@ -1,47 +0,0 @@
---
title: Security best practices
slug: /configuration-security-best-practices
---
This guide outlines security best practices for deploying and managing Langflow.
## Secret key protection
The secret key is critical for encrypting sensitive data in Langflow. Follow these guidelines:
- Always use a custom secret key in production:
```bash
LANGFLOW_SECRET_KEY=your-secure-secret-key
```
- Store the secret key securely:
- Use environment variables or secure secret management systems.
- Never commit the secret key to version control.
- Regularly rotate the secret key.
- Use the default secret key locations:
- macOS: `~/Library/Caches/langflow/secret_key`
- Linux: `~/.cache/langflow/secret_key`
- Windows: `%USERPROFILE%\AppData\Local\langflow\secret_key`
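One way to generate a suitably random value for `LANGFLOW_SECRET_KEY` is with Python's standard library — a sketch, assuming any cryptographically random string of sufficient length is acceptable:

```python
import secrets

# 32 random bytes, URL-safe base64 encoded (~43 characters).
key = secrets.token_urlsafe(32)
print(f"LANGFLOW_SECRET_KEY={key}")
```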
## API keys and credentials
- Store API keys and credentials as encrypted global variables.
- Use the Credential type for sensitive information.
- Implement proper access controls for users who can view/edit credentials.
- Regularly audit and rotate API keys.
## Database file protection
- Store the database in a secure location:
```bash
LANGFLOW_SAVE_DB_IN_CONFIG_DIR=true
LANGFLOW_CONFIG_DIR=/secure/path/to/config
```
- Use the default database locations:
- macOS/Linux: `PYTHON_LOCATION/site-packages/langflow/langflow.db`
- Windows: `PYTHON_LOCATION\Lib\site-packages\langflow\langflow.db`

View file

@@ -20,6 +20,10 @@ The **IDE** includes the frontend for visual development of your flow. The defau
The **runtime** is a headless or backend-only mode. The server exposes your flow as an endpoint, and runs only the processes necessary to serve your flow, with PostgreSQL as the database for improved scalability. Use the Langflow **runtime** to deploy your flows, because a deployed flow doesn't require the frontend for visual development.
:::tip
You can start Langflow in headless mode with the [LANGFLOW_BACKEND_ONLY](/environment-variables#langflow_backend_only) environment variable.
:::
## Package your flow with the Langflow runtime image
To package your flow as a Docker image, copy your flow's `.json` file with a command in the Dockerfile.

View file

@@ -170,18 +170,18 @@ If you wish to retain your files, back them up before clearing the folder.
Installing Langflow with `pip install langflow` fails slowly with this error message:
```plain
```text
pip is looking at multiple versions of <<library>> to determine which version is compatible with other requirements. This could take a while.
```
To work around this issue, install Langflow with [`uv`](https://docs.astral.sh/uv/getting-started/installation/) instead of `pip`.
```plain
```text
uv pip install langflow
```
To run Langflow with uv:
```plain
```text
uv run langflow run
```

View file

@@ -89,7 +89,7 @@ You created a chatbot application with Langflow, but let's try an experiment.
1. Ask the bot: `Who won the Oscar in 2024 for best movie?`
2. The bot's response is similar to this:
```plain
```text
I'm sorry, but I don't have information on events or awards that occurred after
October 2023, including the Oscars in 2024.
You may want to check the latest news or the official Oscars website
@@ -162,7 +162,7 @@ If you used Langflow's **Global Variables** feature, the RAG application flow co
1. Modify the **Prompt** component to contain variables for both `{user_question}` and `{context}`.
The `{context}` variable gives the bot additional context for answering `{user_question}` beyond what the LLM was trained on.
```plain
```text
Given the context
{context}
Answer the question
@@ -175,7 +175,7 @@ This example uploads an up-to-date CSV about Oscar winners.
4. Ask the bot: `Who won the Oscar in 2024 for best movie?`
5. The bot's response should be similar to this:
```plain
```text
The Oscar for Best Picture in 2024 was awarded to "Oppenheimer,"
produced by Emma Thomas, Charles Roven, and Christopher Nolan.
```

View file

@@ -39,13 +39,13 @@ Alternatively, add the key as a [global variable](/configuration-global-variable
4. To open the **Playground** pane, click **Playground**.
5. Ask your AI:
```plain
```text
What tools are available to you?
```
The response should be similar to:
```plain
```text
I have access to the following tools:
1. **GMAIL_CREATE_EMAIL_DRAFT**: This tool allows me to create a draft email using Gmail's API. I can specify the recipient's email address, subject, body content, and whether the body content is HTML.
@@ -57,7 +57,7 @@ This confirms your **Agent** and **Composio** are communicating.
6. Tell your AI to write a draft email.
```plain
```text
Create a draft email with the subject line "Greetings from Composio"
recipient: "your.email@address.com"
Body content: "Hello from composio!"
@@ -66,7 +66,7 @@ Body content: "Hello from composio!"
Inspect the response to see how the agent used the attached tool to write an email.
This example response is abbreviated.
```plain
```text
The draft email with the subject "Greetings from Composio" and body "Hello from composio!" has been successfully created.
```

View file

@@ -16,7 +16,7 @@ The flow should look like this:
7. In the **MCP server** component, in the **MCP command** field, add the following code.
Replace the values for `ASTRA_TOKEN` and `ASTRA_ENDPOINT` with the values from your Astra database.
```plain
```text
env ASTRA_DB_APPLICATION_TOKEN=ASTRA_TOKEN ASTRA_DB_API_ENDPOINT=ASTRA_ENDPOINT npx -y @datastax/astra-db-mcp
```

View file

@@ -30,7 +30,7 @@ The **Parser** component converts the data coming from the **URL** component int
To examine the flow's prompt, click the **Template** field of the **Prompt** component.
```plain
```text
Reference 1:
{references}

View file

@@ -26,7 +26,7 @@ This component retrieves previous messages and sends them to the **Prompt** comp
To examine the template, click the **Template** field in the **Prompt** component.
The **Prompt** tells the **OpenAI model** component how to respond to input.
```plain
```text
You are a helpful assistant that answers questions.
Use markdown to format your answer, properly embedding images and urls.
@@ -44,7 +44,7 @@ The **Chat Memory** component is connected to this port to store chat messages f
1. Open the **Playground**.
2. Enter multiple questions. For example, try entering this conversation:
```plain
```text
Hi, my name is Luca.
Please tell me about PostgreSQL.
What is my name?

View file

@@ -38,7 +38,7 @@ The Sequential Tasks Agent flow consists of these components:
2. Add your Tavily API key to the **Tavily** component.
3. Click **Playground** to start a chat session with the template's default question.
```plain
```text
Should I invest in Tesla (TSLA) stock right now?
Please analyze the company's current position, market trends,
financial health, and provide a clear investment recommendation.

View file

@@ -34,7 +34,7 @@ The model will respond according to the prompt constructed in the **Prompt** component.
4. To examine the **Template**, in the **Prompt** component, click the **Template** field.
```plain
```text
Answer the user as if you were a GenAI expert, enthusiastic about helping them get started building something fresh.
```

View file

@@ -198,6 +198,22 @@ const config = {
to: "/components-vector-stores",
from: "/components-rag",
},
{
to: "/configuration-authentication",
from: [
"/configuration-security-best-practices",
"/Configuration/configuration-security-best-practices"
],
},
{
to: "/environment-variables",
from: [
"/configuration-auto-saving",
"/Configuration/configuration-auto-saving",
"/configuration-backend-only",
"/Configuration/configuration-backend-only"
],
},
{
to: "/concepts-publish",
from: [

View file

@@ -78,13 +78,10 @@ module.exports = {
items: [
"Configuration/configuration-api-keys",
"Configuration/configuration-authentication",
"Configuration/configuration-auto-saving",
"Configuration/configuration-backend-only",
"Configuration/configuration-cli",
"Configuration/configuration-custom-database",
"Configuration/configuration-global-variables",
"Configuration/environment-variables",
"Configuration/configuration-security-best-practices"
],
},
{