Docs: backend only mode (#4405)

* initial-cleanup-and-test

* better-flow

* config

* Apply suggestions from code review

Co-authored-by: KimberlyFields <46325568+KimberlyFields@users.noreply.github.com>

* more-why-clarity

* cleanup

* code-review-and-tech-review

---------

Co-authored-by: KimberlyFields <46325568+KimberlyFields@users.noreply.github.com>
Mendon Kissling 2024-11-08 09:00:43 -05:00 committed by GitHub
commit 1138869d0f
---
title: Run Langflow in backend-only mode
sidebar_position: 4
slug: /configuration-backend-only
---
Langflow can run in `--backend-only` mode to expose a Langflow app as an API endpoint, without running the frontend UI.
This is also known as "headless" mode. Running Langflow without the frontend is useful for automation, testing, and situations where you just need to serve a flow as a workload without creating a new flow in the UI.
To run Langflow in backend-only mode, pass the `--backend-only` flag at startup.
```shell
python3 -m langflow run --backend-only
```

The terminal prints `Welcome to ⛓ Langflow`, and Langflow will now serve requests to its API without the frontend running.
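Before wiring up clients, it can help to confirm the backend is reachable. This is a minimal sketch using only the standard library; it assumes the default host and port, and the `/health` path is an assumption about your Langflow version, so adjust both for your deployment.

```python
# Reachability check for a headless Langflow server.
# The default base URL and the /health path are assumptions -- verify
# both against your Langflow version before relying on this.
from urllib.request import urlopen
from urllib.error import URLError


def is_backend_up(base_url: str = "http://127.0.0.1:7860", timeout: float = 5.0) -> bool:
    try:
        with urlopen(f"{base_url}/health", timeout=timeout) as resp:
            # Any 200 response means the server process is accepting requests.
            return resp.status == 200
    except (URLError, OSError):
        # Connection refused or timed out: the backend is not up yet.
        return False
```

A script can poll this in a loop at startup instead of sleeping for a fixed interval.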
## Set up a basic prompting flow in backend-only mode
This example shows you how to set up a [Basic Prompting flow](/starter-projects-basic-prompting) as an endpoint in backend-only mode.
However, you can use these same instructions as guidelines for using any type of flow in backend-only mode.
### Prerequisites
- [Langflow is installed](/getting-started-installation)
- [You have an OpenAI API key](https://platform.openai.com/)
- [You have a Langflow Basic Prompting flow](/starter-projects-basic-prompting)
### Get your flow's ID
This guide assumes you have created a [Basic Prompting flow](/starter-projects-basic-prompting) or have another working flow available.
1. In the Langflow UI, click **API**.
2. Click **curl** &gt; **Copy code** to copy the curl command.
This command will POST input to your flow's endpoint.
It will look something like this:
```text
curl -X POST \
  "http://127.0.0.1:7861/api/v1/run/fff8dcaa-f0f6-4136-9df0-b7cb38de42e0?stream=false" \
  -H 'Content-Type: application/json' \
  -d '{"input_value": "message",
  "output_type": "chat",
  "input_type": "chat",
  "tweaks": {
    "ChatInput-8a86T": {},
    "Prompt-pKfl9": {},
    "ChatOutput-WcGpD": {},
    "OpenAIModel-5UyvQ": {}
  }}'
```
The flow ID in this example is `fff8dcaa-f0f6-4136-9df0-b7cb38de42e0`, a UUID generated by Langflow and used in the endpoint URL.
See [Project & General Settings](/settings-project-general-settings) to change the endpoint.
3. To stop Langflow, press **Ctrl+C**.
### Start Langflow in backend-only mode
1. Start Langflow in backend-only mode.
```shell
python3 -m langflow run --backend-only
```
The terminal prints `Welcome to ⛓ Langflow`.
Langflow is now serving requests to its API.
2. Run the curl code you copied from the UI.
You should get a result like this:
```shell
{"session_id":"fff8dcaa-f0f6-4136-9df0-b7cb38de42e0:bf81d898868ac87e1b4edbd96c131c5dee801ea2971122cc91352d144a45b880","outputs":[{"inputs":{"input_value":"hi, are you there?"},"outputs":[{"results":{"result":"Arrr, ahoy matey! Aye, I be here. What be ye needin', me hearty?"},"artifacts":{"message":"Arrr, ahoy matey! Aye, I be here. What be ye needin', me hearty?","sender":"Machine","sender_name":"AI"},"messages":[{"message":"Arrr, ahoy matey! Aye, I be here. What be ye needin', me hearty?","sender":"Machine","sender_name":"AI","component_id":"ChatOutput-WcGpD"}],"component_display_name":"Chat Output","component_id":"ChatOutput-WcGpD","used_frozen_result":false}]}]}
```
This confirms Langflow is receiving your POST request, running the flow, and returning the result without running the frontend.
You can interact with this endpoint using the other options in the **API** menu, including the Python and JavaScript APIs.
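For scripted use, the reply text can be pulled out of the nested response shown above. The exact nesting of the `outputs` lists can vary between Langflow versions, so treat this as a sketch rather than a stable contract:

```python
# Extract the chat reply from a run response shaped like the example above:
# outputs[0] is the first run, its outputs[0] is the first component result.
def extract_reply(response: dict) -> str:
    return response["outputs"][0]["outputs"][0]["results"]["result"]
```

Applied to the example response above, this returns the pirate-flavored chat message rather than the full JSON envelope.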
### Query the Langflow endpoint with a Python script
Using the same flow ID, run a Python sample script to send a query and get a prettified JSON response back.
1. Create a Python file and name it `langflow_api_demo.py`.
```python
import requests
import json


def query_langflow(message):
    url = "http://127.0.0.1:7861/api/v1/run/fff8dcaa-f0f6-4136-9df0-b7cb38de42e0"
    headers = {"Content-Type": "application/json"}
    data = {"input_value": message}
    response = requests.post(url, headers=headers, json=data)
    return response.json()


user_input = input("Enter your message: ")
result = query_langflow(user_input)
print(json.dumps(result, indent=2))
```
2. Run the script.
```shell
python langflow_api_demo.py
```
3. Enter your message when prompted.
You will get a prettified JSON response back containing a response to your message.
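The `tweaks` dictionary shown in the curl example can also be sent from a script to override component settings per request. The component key and field below are illustrative placeholders; copy the real component IDs from the curl command in the **API** pane, since they are unique to your flow.

```python
from typing import Optional


def build_payload(message: str, tweaks: Optional[dict] = None) -> dict:
    # Same payload shape as the curl example; "tweaks" is optional.
    payload = {"input_value": message, "output_type": "chat", "input_type": "chat"}
    if tweaks:
        payload["tweaks"] = tweaks
    return payload


# Hypothetical component ID and field -- copy yours from the API pane.
payload = build_payload("hi", {"OpenAIModel-5UyvQ": {"model_name": "gpt-4"}})
```

POSTing this payload to the same `/api/v1/run/<flow_id>` URL applies the overrides for that request only.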
### Configure host and ports in backend-only mode
To change the host and port, pass the values as additional flags.
```shell
python -m langflow run --host 127.0.0.1 --port 7860 --backend-only
```
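If your Langflow version reads `LANGFLOW_`-prefixed environment variables, the same settings may be expressible without flags. The variable names below are assumptions; verify them against the configuration reference for your release.

```shell
# Assumed environment-variable equivalents of the CLI flags above;
# confirm the exact names in your Langflow version's configuration docs.
export LANGFLOW_HOST=127.0.0.1
export LANGFLOW_PORT=7860
export LANGFLOW_BACKEND_ONLY=true
python -m langflow run
```

Environment variables are convenient in containerized deployments, where flags would otherwise be baked into the image entrypoint.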