⛓️ Langflow
~ An effortless way to experiment with and prototype LangChain pipelines ~
Table of Contents
- ⛓️ Langflow
- Table of Contents
- 📦 Installation
- 🖥️ Command Line Interface (CLI)
- Deployment
- 🎨 Creating Flows
- 👋 Contributing
- 📄 License
📦 Installation
Locally
You can install Langflow from PyPI using pip:
# This installs the package without dependencies for local models
pip install langflow
To use local models (e.g., llama-cpp-python), run:
pip install langflow[local]
This will install the extra dependencies required for running local models.
You can still use models from projects like LocalAI
Next, run:
python -m langflow
or
langflow run # or langflow --help
HuggingFace Spaces
You can also check it out on HuggingFace Spaces and run it in your browser! You can even clone it and have your own copy of Langflow to play with.
🖥️ Command Line Interface (CLI)
Langflow provides a command-line interface (CLI) for easy management and configuration.
Usage
You can run Langflow using the following command:
langflow run [OPTIONS]
Each option is detailed below:
- `--help`: Displays all available options.
- `--host`: Defines the host to bind the server to. Can be set using the `LANGFLOW_HOST` environment variable. The default is `127.0.0.1`.
- `--workers`: Sets the number of worker processes. Can be set using the `LANGFLOW_WORKERS` environment variable. The default is `1`.
- `--timeout`: Sets the worker timeout in seconds. The default is `60`.
- `--port`: Sets the port to listen on. Can be set using the `LANGFLOW_PORT` environment variable. The default is `7860`.
- `--config`: Defines the path to the configuration file. The default is `config.yaml`.
- `--env-file`: Specifies the path to the .env file containing environment variables. The default is `.env`.
- `--log-level`: Defines the logging level. Can be set using the `LANGFLOW_LOG_LEVEL` environment variable. The default is `critical`.
- `--components-path`: Specifies the path to the directory containing custom components. Can be set using the `LANGFLOW_COMPONENTS_PATH` environment variable. The default is `langflow/components`.
- `--log-file`: Specifies the path to the log file. Can be set using the `LANGFLOW_LOG_FILE` environment variable. The default is `logs/langflow.log`.
- `--cache`: Selects the type of cache to use. Options are `InMemoryCache` and `SQLiteCache`. Can be set using the `LANGFLOW_LANGCHAIN_CACHE` environment variable. The default is `SQLiteCache`.
- `--dev`/`--no-dev`: Toggles development mode. The default is `no-dev`.
- `--path`: Specifies the path to the frontend directory containing build files. This option is for development purposes only. Can be set using the `LANGFLOW_FRONTEND_PATH` environment variable.
- `--open-browser`/`--no-open-browser`: Toggles whether to open the browser after starting the server. Can be set using the `LANGFLOW_OPEN_BROWSER` environment variable. The default is `open-browser`.
- `--remove-api-keys`/`--no-remove-api-keys`: Toggles whether to remove API keys from the projects saved in the database. Can be set using the `LANGFLOW_REMOVE_API_KEYS` environment variable. The default is `no-remove-api-keys`.
- `--install-completion [bash|zsh|fish|powershell|pwsh]`: Installs completion for the specified shell.
- `--show-completion [bash|zsh|fish|powershell|pwsh]`: Shows completion for the specified shell, allowing you to copy it or customize the installation.
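As an illustration, flags and environment variables can be combined. The following invocation is a sketch; the host, port, and worker values are placeholders to adapt to your environment:

```shell
# Bind to all interfaces on a custom port with two workers
langflow run --host 0.0.0.0 --port 3000 --workers 2

# Equivalent configuration via environment variables
LANGFLOW_HOST=0.0.0.0 LANGFLOW_PORT=3000 LANGFLOW_WORKERS=2 langflow run
```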
Environment Variables
You can configure many of the CLI options using environment variables. These can be exported in your operating system or added to a .env file and loaded using the --env-file option.
A sample .env file named .env.example is included with the project. Copy this file to a new file named .env and replace the example values with your actual settings. If you're setting values in both your OS and the .env file, the .env settings will take precedence.
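For reference, a minimal .env file might look like this. The variable names come from the CLI options above; the values shown are illustrative defaults, not recommendations:

```shell
# Illustrative values only — adjust for your deployment
LANGFLOW_HOST=127.0.0.1
LANGFLOW_PORT=7860
LANGFLOW_WORKERS=1
LANGFLOW_LOG_LEVEL=critical
LANGFLOW_LANGCHAIN_CACHE=SQLiteCache
```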
Deployment
Deploy Langflow on Google Cloud Platform
Follow our step-by-step guide to deploy Langflow on Google Cloud Platform (GCP) using Google Cloud Shell. The guide is available in the Langflow in Google Cloud Platform document.
Alternatively, click the "Open in Cloud Shell" button below to launch Google Cloud Shell, clone the Langflow repository, and start an interactive tutorial that will guide you through the process of setting up the necessary resources and deploying Langflow on your GCP project.
Deploy on Railway
Deploy on Render
🎨 Creating Flows
Creating flows with Langflow is easy. Simply drag sidebar components onto the canvas and connect them together to create your pipeline. Langflow provides a range of LangChain components to choose from, including LLMs, prompt serializers, agents, and chains.
Explore by editing prompt parameters, linking chains and agents, tracking an agent's thought process, and exporting your flow.
Once you're done, you can export your flow as a JSON file to use with LangChain. To do so, click the "Export" button in the top right corner of the canvas, then in Python, you can load the flow with:
from langflow import load_flow_from_json
flow = load_flow_from_json("path/to/flow.json")
# Now you can use it like any chain
flow("Hey, have you heard of Langflow?")
👋 Contributing
We welcome contributions from developers of all levels to our open-source project on GitHub. If you'd like to contribute, please check our contributing guidelines and help make Langflow more accessible.
Join our Discord server to ask questions, make suggestions and showcase your projects! 🦾
📄 License
Langflow is released under the MIT License. See the LICENSE file for details.