docs: clean up and test deployment documentation (#6029)
* docs: correct spelling and formatting in Docker deployment guide
* HuggingFace Spaces deployment documentation
* docs: simplify introduction for HuggingFace Spaces deployment guide
* docs: Update Kubernetes deployment documentation with comprehensive improvements
  - Enhance Langflow Kubernetes deployment guide with clearer instructions
  - Improve formatting, add more precise configuration examples
  - Update service names, port forwarding, and scaling configuration details
  - Add more explicit links to Helm chart repositories and values files
  - Clarify runtime and IDE deployment sections with better explanations
* docs: Add Docker flow packaging guide and enhance deployment documentation
  - Introduce new section on packaging Langflow flows as Docker images
  - Provide step-by-step instructions for creating custom Docker images with flows
  - Include example commands for building, running, and pushing Docker images
  - Minor formatting and clarity improvements to existing Docker deployment guide
* docs: Refine Kubernetes and Docker deployment documentation
  - Streamline Docker and Kubernetes deployment guides
  - Improve clarity and conciseness of instructions
  - Add more precise examples for deploying Langflow with custom images
  - Enhance secret configuration and flow deployment explanations
  - Simplify formatting and remove redundant text
* docs: Minor text refinements in Kubernetes and Docker deployment documentation
  - Update link text for Langflow runtime deployment
  - Correct capitalization and minor typos in deployment guides
  - Improve clarity of deployment section descriptions
* revert-lockfiles
* Revert "HuggingFace Spaces deployment documentation"
  This reverts commit 2943deb1e0964a05c6909d6e1afdf8bddf7173fb.
* Remove lockfiles from PR
* Apply suggestions from code review
* docs: Standardize deployment documentation titles
* code-review
* docs: Refactor deployment documentation for clarity and conciseness
* shorten-nav-titles
* code-review-and-cleanup
* service
* gcp
* hf-spaces
* railway-and-render
* kubernetes
* remove-extra-heading

Co-authored-by: KimberlyFields <46325568+KimberlyFields@users.noreply.github.com>
This commit is contained in:
parent 09c91b8c34
commit 2bbb0f5838
7 changed files with 330 additions and 288 deletions
@ -1,34 +1,22 @@
---
title: Deploy Langflow on Docker
slug: /deployment-docker
---

This guide demonstrates deploying Langflow with Docker and Docker Compose.

## Prerequisites

* [Docker](https://docs.docker.com/)
* [Docker Compose](https://docs.docker.com/compose/)

## Clone the repo and build the Langflow Docker container

1. Clone the Langflow repository:

`git clone https://github.com/langflow-ai/langflow.git`

2. Navigate to the `docker_example` directory:

`cd langflow/docker_example`

@ -37,48 +25,101 @@ This guide will help you get LangFlow up and running using Docker and Docker Com
`docker compose up`

Langflow is now accessible at `http://localhost:7860/`.

## Configure Docker services

The Docker Compose configuration spins up two services: `langflow` and `postgres`.

To configure values for these services at container startup, include them in your `.env` file.

An example `.env` file is available in the [project repository](https://github.com/langflow-ai/langflow/blob/main/.env.example).

To pass the `.env` values at container startup, include the `--env-file` flag in your `docker run` command:

```
docker run -it --rm \
  -p 7860:7860 \
  --env-file .env \
  langflowai/langflow:latest
```

### Langflow service

The `langflow` service serves both the backend API and frontend UI of the Langflow web application.

The `langflow` service uses the `langflowai/langflow:latest` Docker image and exposes port `7860`. It depends on the `postgres` service.

Environment variables:

* `LANGFLOW_DATABASE_URL`: The connection string for the PostgreSQL database.
* `LANGFLOW_CONFIG_DIR`: The directory where Langflow stores logs, file storage, monitor data, and secret keys.

Volumes:

* `langflow-data`: This volume is mapped to `/app/langflow` in the container.

### PostgreSQL service

The `postgres` service is a database that stores Langflow's persistent data, including flows, users, and settings.

The `postgres` service uses the `postgres:16` Docker image, runs on port `5432`, and includes a dedicated volume for data storage.

Environment variables:

* `POSTGRES_USER`: The username for the PostgreSQL database.
* `POSTGRES_PASSWORD`: The password for the PostgreSQL database.
* `POSTGRES_DB`: The name of the PostgreSQL database.

Volumes:

* `langflow-postgres`: This volume is mapped to `/var/lib/postgresql/data` in the container.

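Taken together, the `langflow` and `postgres` services can be sketched as a minimal Compose file. This is an illustration only: the exact layout is assumed, the credentials are placeholders, and the `docker_example/docker-compose.yml` file in the repository is the source of truth.

```yaml
services:
  langflow:
    image: langflowai/langflow:latest
    ports:
      - "7860:7860"
    depends_on:
      - postgres
    environment:
      # Connection string pointing at the postgres service below (values assumed)
      - LANGFLOW_DATABASE_URL=postgresql://langflow:langflow@postgres:5432/langflow
      - LANGFLOW_CONFIG_DIR=/app/langflow
    volumes:
      - langflow-data:/app/langflow

  postgres:
    image: postgres:16
    ports:
      - "5432:5432"
    environment:
      - POSTGRES_USER=langflow
      - POSTGRES_PASSWORD=langflow
      - POSTGRES_DB=langflow
    volumes:
      - langflow-postgres:/var/lib/postgresql/data

volumes:
  langflow-data:
  langflow-postgres:
```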
### Deploy a specific Langflow version with Docker Compose
If you want to deploy a specific version of Langflow, you can modify the `image` field under the `langflow` service in the Docker Compose file. For example, to use version `1.0-alpha`, change `langflowai/langflow:latest` to `langflowai/langflow:1.0-alpha`.
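For example, the pinned tag would look like the following fragment; the surrounding service definition is omitted here:

```yaml
services:
  langflow:
    image: langflowai/langflow:1.0-alpha  # pinned instead of langflowai/langflow:latest
```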
## Package your flow as a Docker image

You can include your Langflow flow with the application image.
When you build the image, your saved `.json` flow is included.
This enables you to serve a flow from a container, push the image to Docker Hub, and deploy on Kubernetes.

An example flow is available in the [Langflow Helm Charts](https://github.com/langflow-ai/langflow-helm-charts/tree/main/examples/flows) repository, or you can provide your own `.json` file.

1. Create a project directory:

```shell
mkdir langflow-custom && cd langflow-custom
```

2. Download the example flow or include your flow's `.json` file in the `langflow-custom` directory.

```shell
wget https://raw.githubusercontent.com/langflow-ai/langflow-helm-charts/refs/heads/main/examples/flows/basic-prompting-hello-world.json
```

3. Create a Dockerfile:

```dockerfile
FROM langflowai/langflow:latest
RUN mkdir /app/flows
COPY ./*json /app/flows/.
```

The `COPY ./*json` command copies all JSON files in your current directory to the `/app/flows` folder.

4. Build and run the image locally.

```shell
docker build -t myuser/langflow-hello-world:1.0.0 .
docker run -p 7860:7860 myuser/langflow-hello-world:1.0.0
```

5. Build and push the image to Docker Hub.
Replace `myuser` with your Docker Hub username.

```shell
docker build -t myuser/langflow-hello-world:1.0.0 .
docker push myuser/langflow-hello-world:1.0.0
```

To deploy the image with Helm, see [Langflow runtime deployment](/deployment-kubernetes#langflow-runtime-deployment).

@ -1,31 +1,29 @@
---
title: Deploy Langflow on Google Cloud Platform
slug: /deployment-gcp
---

This guide demonstrates deploying Langflow on Google Cloud Platform.

To deploy Langflow on Google Cloud Platform using Cloud Shell, use the following script.
The script guides you through setting up a Debian-based VM with the Langflow package, Nginx, and the necessary configurations to run the Langflow dev environment in GCP.

## Prerequisites

* A [Google Cloud](https://console.cloud.google.com/) project with the necessary permissions to create resources

## Deploy Langflow in GCP

1. Click the following button to launch Cloud Shell:

   [Deploy to Google Cloud](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/langflow-ai/langflow&working_dir=scripts/gcp&shellonly=true&tutorial=walkthroughtutorial.md)

2. Click **Trust repo**. Some gcloud commands may not run in an ephemeral Cloud Shell environment.
3. Click **Start** and follow the tutorial to deploy Langflow.

## Pricing

This deployment uses a [spot (preemptible) instance](https://cloud.google.com/compute/docs/instances/preemptible), which is a cost-effective option for running Langflow. However, **due to the nature of spot instances, the VM may be terminated at any time if Google Cloud needs to reclaim the resources**.

For more information, see the [GCP pricing calculator](https://cloud.google.com/products/calculator?hl=en).

@ -1,32 +1,21 @@
---
title: Deploy Langflow on HuggingFace Spaces
slug: /deployment-hugging-face-spaces
---

This guide explains how to deploy Langflow on [HuggingFace Spaces](https://huggingface.co/spaces/).

1. Go to the [Langflow Space](https://huggingface.co/spaces/Langflow/Langflow?duplicate=true).
2. Click **Duplicate Space**.
3. In the configuration dialog, do the following:
   - Enter a name for your Space.
   - Select either public or private visibility.
   - Click **Duplicate Space**.

   

Wait for the setup to complete. You'll be redirected to your new Space automatically.

Your Langflow instance is now ready to use.

@ -1,34 +1,33 @@
---
title: Deploy Langflow on Kubernetes
slug: /deployment-kubernetes
---

import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';

This guide demonstrates deploying Langflow on a Kubernetes cluster.

Two charts are available at the [Langflow Helm Charts repository](https://github.com/langflow-ai/langflow-helm-charts):

- Deploy the [Langflow IDE](/deployment-kubernetes#langflow-ide-deployment) for the complete Langflow development environment.
- Deploy the [Langflow runtime](/deployment-kubernetes#langflow-runtime-deployment) to run a standalone Langflow application in a more secure and stable environment.

## Deploy the Langflow IDE

The Langflow IDE deployment is a complete environment for developers to create, test, and debug their flows. It includes both the API and the UI.

The `langflow-ide` Helm chart is available in the [Langflow Helm Charts repository](https://github.com/langflow-ai/langflow-helm-charts/tree/main/charts/langflow-ide).

### Prerequisites

- A [Kubernetes](https://kubernetes.io/docs/setup/) cluster
- [kubectl](https://kubernetes.io/docs/tasks/tools/#kubectl)
- [Helm](https://helm.sh/docs/intro/install/)

### Prepare a Kubernetes cluster

This example uses [Minikube](https://minikube.sigs.k8s.io/docs/start/), but you can use any Kubernetes cluster.

1. Create a Kubernetes cluster on Minikube.

@ -42,17 +41,16 @@ We use [Minikube](https://minikube.sigs.k8s.io/docs/start/) for this example, bu
kubectl config use-context minikube
```

### Install the Langflow IDE Helm chart

1. Add the repository to Helm and update it.

```text
helm repo add langflow https://langflow-ai.github.io/langflow-helm-charts
helm repo update
```

2. Install Langflow with the default options in the `langflow` namespace.

```text
helm install langflow-ide langflow/langflow-ide -n langflow --create-namespace
@ -72,24 +70,30 @@ We use [Minikube](https://minikube.sigs.k8s.io/docs/start/) for this example, bu
```

### Configure port forwarding to access Langflow

Enable local port forwarding to access Langflow from your local machine.

1. To make the Langflow API accessible from your local machine at port 7860:

```text
kubectl port-forward -n langflow svc/langflow-service-backend 7860:7860
```

2. To make the Langflow UI accessible from your local machine at port 8080:

```text
kubectl port-forward -n langflow svc/langflow-service 8080:8080
```

Now you can access:

- The Langflow API at `http://localhost:7860`
- The Langflow UI at `http://localhost:8080`

### Configure the Langflow version

Langflow is deployed with the `latest` version by default.

To specify a different Langflow version, set the `langflow.backend.image.tag` and `langflow.frontend.image.tag` values in the [values.yaml](https://github.com/langflow-ai/langflow-helm-charts/blob/main/charts/langflow-ide/values.yaml) file.

```yaml
@ -104,22 +108,25 @@ langflow:
```

### Configure external storage

By default, the chart deploys a SQLite database stored in a local persistent disk.
If you want to use an external PostgreSQL database, you can configure it in two ways:

* Use the built-in PostgreSQL chart:

```yaml
# Deploy postgresql. You can skip this section if you have an existing postgresql database.
postgresql:
  enabled: true
  fullnameOverride: "langflow-ide-postgresql-service"
  auth:
    username: "langflow"
    password: "langflow-postgres"
    database: "langflow-db"
```

* Use an external database:

```yaml
postgresql:
  enabled: false

langflow:
  backend:
@ -127,183 +134,152 @@ langflow:
      enabled: true
      driver:
        value: "postgresql"
      host:
        value: "langflow-ide-postgresql-service"
      port:
        value: "5432"
      database:
        value: "langflow-db"
      user:
        value: "langflow"
      password:
        valueFrom:
          secretKeyRef:
            key: "password"
            name: "your-secret-name"
    sqlite:
      enabled: false
```

### Configure scaling

Scale the number of replicas and resources for both frontend and backend services:

```yaml
langflow:
  backend:
    replicaCount: 1
    resources:
      requests:
        cpu: 0.5
        memory: 1Gi
      # limits:
      #   cpu: 0.5
      #   memory: 1Gi

  frontend:
    enabled: true
    replicaCount: 1
    resources:
      requests:
        cpu: 0.3
        memory: 512Mi
      # limits:
      #   cpu: 0.3
      #   memory: 512Mi
```

## Deploy the Langflow runtime

The runtime chart is tailored for deploying applications in a production environment. It is focused on stability, performance, isolation, and security to ensure that applications run reliably and efficiently.

The `langflow-runtime` Helm chart is available in the [Langflow Helm Charts repository](https://github.com/langflow-ai/langflow-helm-charts/tree/main/charts/langflow-runtime).

In production environments, using a dedicated deployment for a set of flows gives you granular resource control.

### Prerequisites

- A [Kubernetes](https://kubernetes.io/docs/setup/) server
- [kubectl](https://kubernetes.io/docs/tasks/tools/#kubectl)
- [Helm](https://helm.sh/docs/intro/install/)

### Install the Langflow runtime Helm chart

1. Add the repository to Helm and update it.

```shell
helm repo add langflow https://langflow-ai.github.io/langflow-helm-charts
helm repo update
```

2. Install the Langflow app with the default options in the `langflow` namespace.

If you have created a [custom image with packaged flows](/deployment-docker#package-your-flow-as-a-docker-image), you can deploy Langflow by overriding the default [values.yaml](https://github.com/langflow-ai/langflow-helm-charts/blob/main/charts/langflow-runtime/values.yaml) file with the `--set` flag.

* Use a custom image with bundled flows:

```shell
helm install my-langflow-app langflow/langflow-runtime -n langflow --create-namespace --set image.repository=myuser/langflow-hello-world --set image.tag=1.0.0
```

* Alternatively, install the chart and download the flows from a URL with the `--set` flag:

```shell
helm install my-langflow-app-with-flow langflow/langflow-runtime \
  -n langflow \
  --create-namespace \
  --set 'downloadFlows.flows[0].url=https://raw.githubusercontent.com/langflow-ai/langflow/dev/tests/data/basic_example.json'
```

:::important
You may need to escape the square brackets in this command if you are using a shell that requires it:

```shell
helm install my-langflow-app-with-flow langflow/langflow-runtime \
  -n langflow \
  --create-namespace \
  --set 'downloadFlows.flows\[0\].url=https://raw.githubusercontent.com/langflow-ai/langflow/dev/tests/data/basic_example.json'
```
:::

3. Check the status of the pods.

```shell
kubectl get pods -n langflow
```

### Access the Langflow app API

1. Get your service name.

```shell
kubectl get svc -n langflow
```

The service name is your release name followed by `-langflow-runtime`. For example, if you used `helm install my-langflow-app-with-flow`, the service name is `my-langflow-app-with-flow-langflow-runtime`.

2. Enable port forwarding to access Langflow from your local machine:

```shell
kubectl port-forward -n langflow svc/my-langflow-app-with-flow-langflow-runtime 7860:7860
```

3. Confirm you can access the API at `http://localhost:7860/api/v1/flows/` and view a list of flows.

```shell
curl -v http://localhost:7860/api/v1/flows/
```

4. Execute the packaged flow.

The following command gets the first flow ID from the flows list and runs the flow.

```shell
# Get flow ID
id=$(curl -s "http://localhost:7860/api/v1/flows/" | jq -r '.[0].id')

# Run flow
curl -X POST \
  "http://localhost:7860/api/v1/run/$id?stream=false" \
  -H 'Content-Type: application/json' \
  -d '{
    "input_value": "Hello!",
    "output_type": "chat",
    "input_type": "chat"
  }'
```

### Configure secrets

To inject secrets and Langflow global variables, use the `secrets` and `env` sections in the [values.yaml](https://github.com/langflow-ai/langflow-helm-charts/blob/main/charts/langflow-runtime/values.yaml) file.

For example, the [example flow JSON](https://raw.githubusercontent.com/langflow-ai/langflow-helm-charts/refs/heads/main/examples/flows/basic-prompting-hello-world.json) uses a global variable that is a secret. When you export the flow as JSON, it's recommended to not include the secret.

Instead, when importing the flow in the Langflow runtime, you can set the global variable in one of the following ways:

<Tabs>
<TabItem value="values" label="Using values.yaml">

```yaml
env:
@ -312,34 +288,62 @@ env:
|
|||
secretKeyRef:
|
||||
name: openai-key
|
||||
key: openai-key
|
||||
|
||||
```
|
||||
|
||||
|
||||
or directly from the values file (not recommended for secret values!):
|
||||
|
||||
Or directly in the values file (not recommended for secret values):
|
||||
|
||||
```yaml
|
||||
env:
|
||||
- name: openai_key_var
|
||||
value: "sk-...."
|
||||
|
||||
```

</TabItem>
<TabItem value="helm" label="Using Helm Commands">

1. Create the secret:

   ```shell
   kubectl create secret generic openai-credentials \
     --namespace langflow \
     --from-literal=OPENAI_API_KEY=sk...
   ```

2. Verify the secret exists. The output lists the secret without revealing its values.

   ```shell
   kubectl get secrets -n langflow openai-credentials
   ```

3. Upgrade the Helm release to use the secret.

   ```shell
   helm upgrade my-langflow-app-image langflow/langflow-runtime -n langflow \
     --reuse-values \
     --set "extraEnv[0].name=OPENAI_API_KEY" \
     --set "extraEnv[0].valueFrom.secretKeyRef.name=openai-credentials" \
     --set "extraEnv[0].valueFrom.secretKeyRef.key=OPENAI_API_KEY"
   ```

</TabItem>
</Tabs>
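
If you prefer to keep this configuration in the values file instead of passing `--set` flags, the Helm command above corresponds to an `extraEnv` block like the following. This sketch assumes the chart forwards `extraEnv` entries verbatim to the container, as the `--set` paths imply:

```yaml
extraEnv:
  - name: OPENAI_API_KEY
    valueFrom:
      secretKeyRef:
        name: openai-credentials
        key: OPENAI_API_KEY
```
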
### Configure the log level
Set the log level and other Langflow configurations in the [values.yaml](https://github.com/langflow-ai/langflow-helm-charts/blob/main/charts/langflow-runtime/values.yaml) file.

```yaml
env:
  - name: LANGFLOW_LOG_LEVEL
    value: "INFO"
```

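
Inside the container, `LANGFLOW_LOG_LEVEL` arrives as an ordinary environment variable. As a rough illustration of the pattern (this is not Langflow's actual startup code), an application typically maps it to a logging level like this:

```python
import logging
import os

# Read the level injected through the chart's `env` section; fall back to
# INFO when the variable is unset or unrecognized. Illustrative sketch only,
# not Langflow's internal implementation.
level_name = os.environ.get("LANGFLOW_LOG_LEVEL", "INFO").upper()
level = getattr(logging, level_name, logging.INFO)

logging.basicConfig(level=level)
logging.getLogger(__name__).info("logging configured at %s", level_name)
```
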
### Configure scaling
To scale the number of replicas for the Langflow application, change the `replicaCount` value in the [values.yaml](https://github.com/langflow-ai/langflow-helm-charts/blob/main/charts/langflow-runtime/values.yaml) file.

```yaml
replicaCount: 3
```

To scale the application vertically by increasing the resources for the pods, change the `resources` values in the [values.yaml](https://github.com/langflow-ai/langflow-helm-charts/blob/main/charts/langflow-runtime/values.yaml) file.

```yaml
resources:
  requests:
    memory: "2Gi"
    cpu: "1000m"
```

## Deploy Langflow on AWS EKS, Google GKE, or Azure AKS and other examples
For more information, see the [Langflow Helm Charts repository](https://github.com/langflow-ai/langflow-helm-charts).
---
title: Deploy Langflow on Railway
slug: /deployment-railway
---
This guide explains how to deploy Langflow on [Railway](https://railway.app/), a cloud infrastructure platform that provides auto-deploy, managed databases, and automatic scaling.
1. Click the following button to go to Railway:
[](https://railway.app/template/JMXEWp?referralCode=MnPSdg)

2. Click **Deploy Now**.

   Railway automatically does the following:

   - Sets up the infrastructure.
   - Deploys Langflow.
   - Starts the application.

   Wait for the deployment to complete.

Your Langflow instance is now ready to use.


---
title: Deploy Langflow on Render
slug: /deployment-render
---
This guide explains how to deploy Langflow on [Render](https://render.com/), a cloud platform for deploying web applications and APIs.
:::note
Langflow requires at least 2 GB of RAM to run, so it uses a **standard** Render instance. This may require a credit card. Review [Render's pricing](https://render.com/pricing) before proceeding.
:::
1. Click the following button to go to Render:
[](https://render.com/deploy?repo=https%3A%2F%2Fgithub.com%2Flangflow-ai%2Flangflow%2Ftree%2Fdev)
2. Enter a blueprint name, and then select the branch for your `render.yaml` file. The default is `main`.

3. Click **Deploy Blueprint**.

   Wait for the deployment to complete.

Your Langflow instance is now ready to use.
type: "category",
label: "Deployment",
items: [
  {
    type: "doc",
    id: "Deployment/deployment-docker",
    label: "Docker"
  },
  {
    type: "doc",
    id: "Deployment/deployment-gcp",
    label: "Google Cloud Platform"
  },
  {
    type: "doc",
    id: "Deployment/deployment-hugging-face-spaces",
    label: "Hugging Face Spaces"
  },
  {
    type: "doc",
    id: "Deployment/deployment-kubernetes",
    label: "Kubernetes"
  },
  {
    type: "doc",
    id: "Deployment/deployment-railway",
    label: "Railway"
  },
  {
    type: "doc",
    id: "Deployment/deployment-render",
    label: "Render"
  }
],
},