fix: replace instances of 127.0.0.1 to localhost (#8536)

* Replace instances of 127.0.0.1 to localhost

* fix some replace-all issues

* fix some replace-all issues

* update starterprojects

* Upgrade uvlock
This commit is contained in:
Jordan Frazier 2025-06-16 08:54:04 -07:00 committed by GitHub
commit b77351331d
No known key found for this signature in database
GPG key ID: B5690EEEBB952194
33 changed files with 3959 additions and 5686 deletions


@@ -9,7 +9,7 @@ import TabItem from '@theme/TabItem';
This page provides examples and practices for managing Langflow using the Langflow API.
The Langflow API's OpenAPI spec can be viewed and tested at your Langflow deployment's `docs` endpoint.
-For example, `http://127.0.0.1:7860/docs`.
+For example, `http://localhost:7860/docs`.
## Export values
@@ -18,10 +18,10 @@ You might find it helpful to set the following environment variables in your terminal.
The examples in this guide use environment variables for these values.
- Export your Langflow URL in your terminal.
-Langflow starts by default at `http://127.0.0.1:7860`.
+Langflow starts by default at `http://localhost:7860`.
```bash
-export LANGFLOW_URL="http://127.0.0.1:7860"
+export LANGFLOW_URL="http://localhost:7860"
```
- Export the `flow-id` in your terminal.
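The rename in this commit assumes that `localhost` resolves to the same loopback interface as `127.0.0.1`. A quick sanity check of that assumption (a Python sketch, not part of the commit):

```python
import socket

# Resolve "localhost" and collect the addresses it maps to; on a typical
# machine this yields the loopback addresses 127.0.0.1 and/or ::1.
loopback_addresses = {info[4][0] for info in socket.getaddrinfo("localhost", 7860)}
print(loopback_addresses)
```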


@@ -268,7 +268,7 @@ When a **Webhook** component is added to the workspace, a new **Webhook cURL** tab
```bash
curl -X POST \
-"http://127.0.0.1:7860/api/v1/webhook/**YOUR_FLOW_ID**" \
+"http://localhost:7860/api/v1/webhook/**YOUR_FLOW_ID**" \
-H 'Content-Type: application/json'\
-d '{"any": "data"}'
```
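The webhook endpoint above follows the pattern `<server>/api/v1/webhook/<flow-id>`. A minimal helper illustrating that pattern (the function name is hypothetical, not part of Langflow):

```python
def webhook_url(base_url: str, flow_id: str) -> str:
    # Join the server address with the webhook path shown in the curl example;
    # rstrip guards against a trailing slash on the base URL.
    return f"{base_url.rstrip('/')}/api/v1/webhook/{flow_id}"

print(webhook_url("http://localhost:7860", "YOUR_FLOW_ID"))
```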


@@ -420,7 +420,7 @@ For a list of Ollama embeddings models, see the [Ollama documentation](https://o
To use this component in a flow, connect Langflow to your locally running Ollama server and select an embeddings model.
1. In the Ollama component, in the **Ollama Base URL** field, enter the address for your locally running Ollama server.
-This value is set as the `OLLAMA_HOST` environment variable in Ollama. The default base URL is `http://127.0.0.1:11434`.
+This value is set as the `OLLAMA_HOST` environment variable in Ollama. The default base URL is `http://localhost:11434`.
2. To refresh the server's list of models, click <Icon name="RefreshCw" aria-label="Refresh"/>.
3. In the **Ollama Model** field, select an embeddings model. This example uses `all-minilm:latest`.
4. Connect the **Ollama** embeddings component to a flow.
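As described in step 1, the component falls back to the default base URL when `OLLAMA_HOST` is unset. A sketch of that lookup (hypothetical helper; note that Ollama also accepts bare `host:port` values for `OLLAMA_HOST`, which this sketch does not normalize):

```python
import os

def ollama_base_url(env=None) -> str:
    # Prefer OLLAMA_HOST when set; otherwise use the documented default.
    env = os.environ if env is None else env
    return env.get("OLLAMA_HOST", "http://localhost:11434")

print(ollama_base_url({}))
```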


@@ -211,7 +211,7 @@ To send the same example messages programmatically to your Langflow server, do the following:
It looks similar to this:
```text
curl --request POST \
---url 'http://127.0.0.1:7860/api/v1/run/51eed711-4530-4fdc-9bce-5db4351cc73a?stream=false' \
+--url 'http://localhost:7860/api/v1/run/51eed711-4530-4fdc-9bce-5db4351cc73a?stream=false' \
--header 'Content-Type: application/json' \
--data '{
"input_value": "Whats the recommended way to install Docker on Mac M1?",
@@ -226,7 +226,7 @@ Note the `output_type` and `input_type` parameters that are passed with the message.
4. Add a custom `session_id` to the message's `data` object.
```text
curl --request POST \
---url 'http://127.0.0.1:7860/api/v1/run/51eed711-4530-4fdc-9bce-5db4351cc73a?stream=false' \
+--url 'http://localhost:7860/api/v1/run/51eed711-4530-4fdc-9bce-5db4351cc73a?stream=false' \
--header 'Content-Type: application/json' \
--data '{
"input_value": "Whats the recommended way to install Docker on Mac M1",
@@ -246,7 +246,7 @@ A new chat session called `docker-question-on-m1` has appeared, using your unique session ID.
For example, disabling storing messages from the **Chat Input** component adds a **Tweak** to your command:
```text
curl --request POST \
---url 'http://127.0.0.1:7860/api/v1/run/51eed711-4530-4fdc-9bce-5db4351cc73a?stream=false' \
+--url 'http://localhost:7860/api/v1/run/51eed711-4530-4fdc-9bce-5db4351cc73a?stream=false' \
--header 'Content-Type: application/json' \
--data '{
"input_value": "Text to input to the flow",


@@ -542,7 +542,7 @@ To use this component in a flow, connect Langflow to your locally running Ollama
1. In the Ollama component, in the **Base URL** field, enter the address for your locally running Ollama server.
This value is set as the `OLLAMA_HOST` environment variable in Ollama.
-The default base URL is `http://127.0.0.1:11434`.
+The default base URL is `http://localhost:11434`.
2. To refresh the server's list of models, click <Icon name="RefreshCw" aria-label="Refresh"/>.
3. In the **Model Name** field, select a model. This example uses `llama3.2:latest`.
4. Connect the **Ollama** model component to a flow. For example, this flow connects a local Ollama server running a Llama 3.2 model as the custom model for an [Agent](/components-agents) component.


@@ -112,7 +112,7 @@ All operations in the component require at least one [Data](/concepts-objects#da
For example, send this request to the **Webhook** component.
Replace `YOUR_FLOW_ID` with your flow ID.
```bash
-curl -X POST "http://127.0.0.1:7860/api/v1/webhook/YOUR_FLOW_ID" \
+curl -X POST "http://localhost:7860/api/v1/webhook/YOUR_FLOW_ID" \
-H 'Content-Type: application/json' \
-d '{
"id": 1,
@@ -205,7 +205,7 @@ This example connects a **Webhook** component to convert `text` and `data` into
Replace `YOUR_FLOW_ID` with your flow ID.
This example uses the default Langflow server address.
```text
-curl -X POST "http://127.0.0.1:7860/api/v1/webhook/YOUR_FLOW_ID" \
+curl -X POST "http://localhost:7860/api/v1/webhook/YOUR_FLOW_ID" \
-H 'Content-Type: application/json' \
-d '{
"text": "Alex Cruz - Employee Profile",
@@ -227,7 +227,7 @@ The **Data to DataFrame** component converts the webhook request into a `DataFrame`.
5. Send another employee data object.
```text
-curl -X POST "http://127.0.0.1:7860/api/v1/webhook/YOUR_FLOW_ID" \
+curl -X POST "http://localhost:7860/api/v1/webhook/YOUR_FLOW_ID" \
-H 'Content-Type: application/json' \
-d '{
"text": "Kalani Smith - Employee Profile",
@@ -446,7 +446,7 @@ For example, if the selected `file_format` is `csv`, and you enter `file_path` a
Replace `YOUR_FLOW_ID` with your flow ID.
This example uses the default Langflow server address.
```text
-curl -X POST "http://127.0.0.1:7860/api/v1/webhook/YOUR_FLOW_ID" \
+curl -X POST "http://localhost:7860/api/v1/webhook/YOUR_FLOW_ID" \
-H 'Content-Type: application/json' \
-d '{
"Name": ["Alex Cruz", "Kalani Smith", "Noam Johnson"],


@@ -362,7 +362,7 @@ For more information, see [MCP server](/mcp-server).
### MCP Server-Sent Events (SSE) mode {#mcp-sse-mode}
:::important
-If you're using **Langflow for Desktop**, the default address is `http://127.0.0.1:7868/`.
+If you're using **Langflow for Desktop**, the default address is `http://localhost:7868/`.
:::
The MCP component's SSE mode connects your flow to the Langflow MCP server through the component.


@@ -11,11 +11,11 @@ Uploading files to the **File management** system keeps your files in a central
## Upload a file
-The **File management** system is available at the `/files` URL. For example, if you're running Langflow at the default `http://127.0.0.1:7860` address, the **File management** system is located at `http://127.0.0.1:7860/files`.
+The **File management** system is available at the `/files` URL. For example, if you're running Langflow at the default `http://localhost:7860` address, the **File management** system is located at `http://localhost:7860/files`.
To upload a file from your local machine:
-1. From the **My Files** window at `http://127.0.0.1:7860/files`, click **Upload**.
+1. From the **My Files** window at `http://localhost:7860/files`, click **Upload**.
2. Select the file to upload.
The file is uploaded to Langflow.


@@ -43,7 +43,7 @@ To have more than one session in a single flow, pass a specific Session ID to a
To post a message to a flow with a specific Session ID with curl, enter the following command:
```bash
-curl -X POST "http://127.0.0.1:7860/api/v1/run/$FLOW_ID" \
+curl -X POST "http://localhost:7860/api/v1/run/$FLOW_ID" \
-H 'Content-Type: application/json' \
-d '{
"session_id": "custom_session_123",
@@ -76,7 +76,7 @@ You can work with base64 images in the Playground in several ways:
This example sends a base64-encoded image to the Playground using curl:
```bash
-curl -X POST "http://127.0.0.1:7860/api/v1/run/$FLOW_ID" \
+curl -X POST "http://localhost:7860/api/v1/run/$FLOW_ID" \
-H 'Content-Type: application/json' \
-d '{
"session_id": "custom_session_123",


@@ -86,7 +86,7 @@ For example:
```
The **MCP Server** tab automatically includes the correct `PROJECT_NAME`, `LANGFLOW_SERVER_ADDRESS`, and `PROJECT_ID` values.
-The default Langflow server address is `http://127.0.0.1:7860` (`http://127.0.0.1:7868` if using Langflow for Desktop).
+The default Langflow server address is `http://localhost:7860` (`http://localhost:7868` if using Langflow for Desktop).
:::important
If your Langflow server [requires authentication](/configuration-authentication) ([`LANGFLOW_AUTO_LOGIN`](/environment-variables#LANGFLOW_AUTO_LOGIN) is set to `false`), you must include your Langflow API key in the configuration.
@@ -201,12 +201,12 @@ You can use MCP Inspector to monitor your flows and get insights into how they a
For more information about configuring MCP Inspector, including specifying a proxy port, see the [MCP Inspector GitHub project](https://github.com/modelcontextprotocol/inspector).
2. Open a web browser and navigate to the MCP Inspector UI.
-The default address is `http://127.0.0.1:6274`.
+The default address is `http://localhost:6274`.
3. In the MCP Inspector UI, enter the connection details for your Langflow project's MCP server:
- **Transport Type**: Select **SSE**.
-- **URL**: Enter the Langflow MCP server's `sse` endpoint. For example: `http://127.0.0.1:7860/api/v1/mcp/project/d359cbd4-6fa2-4002-9d53-fa05c645319c/sse`
+- **URL**: Enter the Langflow MCP server's `sse` endpoint. For example: `http://localhost:7860/api/v1/mcp/project/d359cbd4-6fa2-4002-9d53-fa05c645319c/sse`
If you've [configured authentication for your MCP server](#authentication), fill out the following additional fields:
- **Transport Type**: Select **STDIO**.
@@ -230,7 +230,7 @@ By default, Langflow isn't exposed to the public internet.
However, you can forward Langflow server traffic with a forwarding platform like [ngrok](https://ngrok.com/docs/getting-started/) or [zrok](https://docs.zrok.io/docs/getting-started).
The following procedure uses ngrok, but you can use any similar reverse proxy or forwarding platform.
-This procedure also assumes that you're using the default Langflow listening address `http://127.0.0.1:7860` (`http://127.0.0.1:7868` if using Langflow for Desktop).
+This procedure also assumes that you're using the default Langflow listening address `http://localhost:7860` (`http://localhost:7868` if using Langflow for Desktop).
1. Sign up for an [ngrok account](https://dashboard.ngrok.com/signup).


@@ -51,7 +51,7 @@ To use the API key when making API requests, include the API key in the HTTP headers.
```shell
curl -X POST \
-"http://127.0.0.1:7860/api/v1/run/FLOW_ID?stream=false" \
+"http://localhost:7860/api/v1/run/FLOW_ID?stream=false" \
-H 'Content-Type: application/json' \
-H 'x-api-key: API_KEY' \
-d '{"inputs": {"text":""}, "tweaks": {}}'
@@ -63,7 +63,7 @@ To pass the API key as a query parameter:
```shell
curl -X POST \
-"http://127.0.0.1:7860/api/v1/run/FLOW_ID?x-api-key=API_KEY&stream=false" \
+"http://localhost:7860/api/v1/run/FLOW_ID?x-api-key=API_KEY&stream=false" \
-H 'Content-Type: application/json' \
-d '{"inputs": {"text":""}, "tweaks": {}}'
```
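The two curl examples above pass the same key either as an `x-api-key` header or as a query parameter. A sketch of both shapes (helper names are hypothetical; note that the second and later query parameters are joined with `&`, not another `?`):

```python
def api_key_headers(api_key: str) -> dict:
    # Header-based authentication, as in the first curl example.
    return {"Content-Type": "application/json", "x-api-key": api_key}

def run_url_with_key(base_url: str, flow_id: str, api_key: str) -> str:
    # Query-parameter authentication; parameters after the first join with "&".
    return f"{base_url}/api/v1/run/{flow_id}?x-api-key={api_key}&stream=false"

print(run_url_with_key("http://localhost:7860", "FLOW_ID", "API_KEY"))
```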


@@ -103,7 +103,7 @@ python -m langflow run [OPTIONS]
| Option | Default | Values | Description |
|--------|---------|--------|-------------|
-| <Link id="run-host"/>`--host` | `127.0.0.1` | String | The host on which the Langflow server will run.<br/>See [`LANGFLOW_HOST` variable](./environment-variables.md#LANGFLOW_HOST). |
+| <Link id="run-host"/>`--host` | `localhost` | String | The host on which the Langflow server will run.<br/>See [`LANGFLOW_HOST` variable](./environment-variables.md#LANGFLOW_HOST). |
| <Link id="run-workers"/>`--workers` | `1` | Integer | Number of worker processes.<br/>See [`LANGFLOW_WORKERS` variable](./environment-variables.md#LANGFLOW_WORKERS). |
| <Link id="run-worker-timeout"/>`--worker-timeout` | `300` | Integer | Worker timeout in seconds.<br/>See [`LANGFLOW_WORKER_TIMEOUT` variable](./environment-variables.md#LANGFLOW_WORKER_TIMEOUT). |
| <Link id="run-port"/>`--port` | `7860` | Integer | The port on which the Langflow server will run. The server automatically selects a free port if the specified port is in use.<br/>See [`LANGFLOW_PORT` variable](./environment-variables.md#LANGFLOW_PORT). |
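Taken together, the `--host` and `--port` defaults in the table above give the server address used throughout these docs. A small sketch (hypothetical helper) composing that address:

```python
def server_url(host: str = "localhost", port: int = 7860) -> str:
    # Mirrors the --host and --port defaults from the options table.
    return f"http://{host}:{port}"

print(server_url())
```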


@@ -72,7 +72,7 @@ If it detects a supported environment variable, then it automatically adopts the
LANGFLOW_DEV=false
LANGFLOW_FALLBACK_TO_ENV_VAR=false
LANGFLOW_HEALTH_CHECK_MAX_RETRIES=5
-LANGFLOW_HOST=127.0.0.1
+LANGFLOW_HOST=localhost
LANGFLOW_LANGCHAIN_CACHE=InMemoryCache
LANGFLOW_MAX_FILE_SIZE_UPLOAD=10000
LANGFLOW_LOG_LEVEL=error
@@ -196,7 +196,7 @@ The following table lists the environment variables supported by Langflow.
| <Link id="LANGFLOW_FALLBACK_TO_ENV_VAR"/><span class="env-prefix">LANGFLOW_</span>FALLBACK_TO_ENV_VAR | Boolean | `true` | If enabled, [global variables](../Configuration/configuration-global-variables.md) set in the Langflow UI fall back to an environment variable with the same name when Langflow fails to retrieve the variable value. |
| <Link id="LANGFLOW_FRONTEND_PATH"/><span class="env-prefix">LANGFLOW_</span>FRONTEND_PATH | String | `./frontend` | Path to the frontend directory containing build files. This is for development purposes only.<br/>See [`--frontend-path` option](./configuration-cli.md#run-frontend-path). |
| <Link id="LANGFLOW_HEALTH_CHECK_MAX_RETRIES"/><span class="env-prefix">LANGFLOW_</span>HEALTH_CHECK_MAX_RETRIES | Integer | `5` | Set the maximum number of retries for the health check.<br/>See [`--health-check-max-retries` option](./configuration-cli.md#run-health-check-max-retries). |
-| <Link id="LANGFLOW_HOST"/><span class="env-prefix">LANGFLOW_</span>HOST | String | `127.0.0.1` | The host on which the Langflow server will run.<br/>See [`--host` option](./configuration-cli.md#run-host). |
+| <Link id="LANGFLOW_HOST"/><span class="env-prefix">LANGFLOW_</span>HOST | String | `localhost` | The host on which the Langflow server will run.<br/>See [`--host` option](./configuration-cli.md#run-host). |
| <Link id="LANGFLOW_LANGCHAIN_CACHE"/><span class="env-prefix">LANGFLOW_</span>LANGCHAIN_CACHE | String | `InMemoryCache` | Type of cache to use. Possible values: `InMemoryCache`, `SQLiteCache`.<br/>See [`--cache` option](./configuration-cli.md#run-cache). |
| <Link id="LANGFLOW_LOG_LEVEL"/><span class="env-prefix">LANGFLOW_</span>LOG_LEVEL | String | `INFO` | Set the logging level for Langflow. Possible values: `DEBUG`, `INFO`, `WARNING`, `ERROR`, `CRITICAL`. |
| <Link id="LANGFLOW_LOG_FILE"/><span class="env-prefix">LANGFLOW_</span>LOG_FILE | String | Not set | Path to the log file. If this option is not set, logs are written to stdout. |
@@ -256,7 +256,7 @@ LANGFLOW_DATABASE_URL=postgresql://user:password@localhost:5432/langflow
LANGFLOW_DEV=false
LANGFLOW_FALLBACK_TO_ENV_VAR=false
LANGFLOW_HEALTH_CHECK_MAX_RETRIES=5
-LANGFLOW_HOST=127.0.0.1
+LANGFLOW_HOST=localhost
LANGFLOW_LANGCHAIN_CACHE=InMemoryCache
LANGFLOW_MAX_FILE_SIZE_UPLOAD=10000
LANGFLOW_LOG_LEVEL=error
@@ -295,7 +295,7 @@ Environment="LANGFLOW_DATABASE_URL=postgresql://user:password@localhost:5432/lan
Environment="LANGFLOW_DEV=false"
Environment="LANGFLOW_FALLBACK_TO_ENV_VAR=false"
Environment="LANGFLOW_HEALTH_CHECK_MAX_RETRIES=5"
-Environment="LANGFLOW_HOST=127.0.0.1"
+Environment="LANGFLOW_HOST=localhost"
Environment="LANGFLOW_LANGCHAIN_CACHE=InMemoryCache"
Environment="LANGFLOW_MAX_FILE_SIZE_UPLOAD=10000"
Environment="LANGFLOW_LOG_ENV=container_json"


@@ -71,7 +71,7 @@ The `input` string is the message you're sending to your flow.
```tsx
import { LangflowClient } from "@datastax/langflow-client";
-const baseUrl = "http://127.0.0.1:7860";
+const baseUrl = "http://localhost:7860";
const client = new LangflowClient({ baseUrl });
async function runFlow() {
@@ -148,7 +148,7 @@ Replace `baseUrl` and `flowId` with values from your deployment.
```tsx
import { LangflowClient } from "@datastax/langflow-client";
-const baseUrl = "http://127.0.0.1:7860";
+const baseUrl = "http://localhost:7860";
const client = new LangflowClient({ baseUrl });
async function runFlow() {
@@ -264,7 +264,7 @@ Replace `baseUrl` and `flowId` with values from your deployment.
```tsx
import { LangflowClient } from "@datastax/langflow-client";
-const baseUrl = "http://127.0.0.1:7863";
+const baseUrl = "http://localhost:7863";
const flowId = "86f0bf45-0544-4e88-b0b1-8e622da7a7f0";
async function runFlow(client: LangflowClient) {


@@ -140,7 +140,7 @@ For more information, see [Session ID](/session-id).
```bash
curl --request POST \
---url 'http://127.0.0.1:7860/api/v1/run/e4167236-938f-4aca-845b-21de3f399858?stream=false' \
+--url 'http://localhost:7860/api/v1/run/e4167236-938f-4aca-845b-21de3f399858?stream=false' \
--header 'Content-Type: application/json' \
--data '{
"input_value": "Tell me about Charizard please",


@@ -19,7 +19,7 @@ If you set a custom session ID in a payload, all downstream components use the u
```
curl --request POST \
---url 'http://127.0.0.1:7860/api/v1/run/$FLOW_ID' \
+--url 'http://localhost:7860/api/v1/run/$FLOW_ID' \
--header 'Content-Type: application/json' \
--data '{
"input_value": "Hello",


@@ -32,7 +32,7 @@ To connect the **Webhook** to a **Parser** component to view and parse your data
This example uses `id`, `name`, and `email` strings.
Replace **YOUR_FLOW_ID** with your flow ID.
```text
-curl -X POST "http://127.0.0.1:7860/api/v1/webhook/YOUR_FLOW_ID" \
+curl -X POST "http://localhost:7860/api/v1/webhook/YOUR_FLOW_ID" \
-H 'Content-Type: application/json' \
-d '{"id": "12345", "name": "alex", "email": "alex@email.com"}'
```
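The `-d` body above must be valid JSON inside the shell's single quotes. One way to avoid hand-quoting mistakes is to build the body programmatically before passing it to curl (a sketch, not part of the docs):

```python
import json

# Serialize the example payload; json.dumps guarantees valid JSON and
# proper escaping, which is easy to get wrong inside shell quotes.
payload = json.dumps({"id": "12345", "name": "alex", "email": "alex@email.com"})
print(payload)
```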


@@ -61,7 +61,7 @@ python -m langflow run
</TabItem>
</Tabs>
-3. To confirm that a local Langflow instance starts, go to the default Langflow URL at `http://127.0.0.1:7860`.
+3. To confirm that a local Langflow instance starts, go to the default Langflow URL at `http://localhost:7860`.
After confirming that Langflow is running, create your first flow with the [Quickstart](/get-started-quickstart).