fix zustand

cristhianzl 2024-03-28 11:59:32 -03:00
commit f13d7ecab4
119 changed files with 2517 additions and 2077 deletions


@@ -14,7 +14,7 @@ on:
- "src/backend/**"
env:
POETRY_VERSION: "1.7.0"
POETRY_VERSION: "1.8.2"
jobs:
lint:
@@ -22,7 +22,6 @@ jobs:
strategy:
matrix:
python-version:
- "3.9"
- "3.10"
- "3.11"
steps:


@@ -46,7 +46,7 @@ format:
lint:
make install_backend
poetry run mypy src/backend
poetry run mypy --namespace-packages -p "langflow"
poetry run ruff . --fix
install_frontend:


@@ -23,7 +23,7 @@ ENV PYTHONUNBUFFERED=1 \
\
# poetry
# https://python-poetry.org/docs/configuration/#using-environment-variables
POETRY_VERSION=1.7.1 \
POETRY_VERSION=1.8.2 \
# make poetry install to this location
POETRY_HOME="/opt/poetry" \
# make poetry create the virtual environment in the project's root


@@ -23,7 +23,7 @@ ENV PYTHONUNBUFFERED=1 \
\
# poetry
# https://python-poetry.org/docs/configuration/#using-environment-variables
POETRY_VERSION=1.7.1 \
POETRY_VERSION=1.8.2 \
# make poetry install to this location
POETRY_HOME="/opt/poetry" \
# make poetry create the virtual environment in the project's root


@@ -23,7 +23,7 @@ ENV PYTHONUNBUFFERED=1 \
\
# poetry
# https://python-poetry.org/docs/configuration/#using-environment-variables
POETRY_VERSION=1.5.1 \
POETRY_VERSION=1.8.2 \
# make poetry install to this location
POETRY_HOME="/opt/poetry" \
# make poetry create the virtual environment in the project's root


@@ -43,7 +43,7 @@ The Code button shows snippets to use your flow as a Python object or an API.
Through the Langflow package, you can load a flow from a JSON file and use it as a LangChain object.
```py
```python
from langflow.load import load_flow_from_json
flow = load_flow_from_json("path/to/flow.json")


@@ -1,10 +1,11 @@
# Migrating to Langflow 1.0: A Guide
Langflow 1.0 is a significant update that brings many exciting changes and improvements to the platform. This guide will walk you through the key differences and help you migrate your existing projects to the new version.
Langflow 1.0 is a significant update that brings many exciting changes and improvements to the platform.
This guide will walk you through the key improvements and help you migrate your existing projects to the new version.
If you have any questions or need assistance during the migration process, please don't hesitate to reach out to us in our [Discord](https://discord.gg/wZSWQaukgJ) or [GitHub](https://github.com/logspace-ai/langflow/issues) community.
We have a special channel
We have a special channel in our Discord server dedicated to Langflow 1.0 migration, where you can ask questions, share your experiences, and get help from the community.
## TLDR;
@@ -27,98 +28,98 @@ We have a special channel
## Inputs and Outputs of Components
Langflow 1.0 introduces adds the concept of Inputs and Outputs to flows, allowing clear definition of the data flow between components. Discover how to use Inputs and Outputs to pass data between components and create more dynamic flows.
Langflow 1.0 introduces the concept of Inputs and Outputs to flows, allowing a clear definition of the data flow between components. Discover how to use Inputs and Outputs to pass data between components and create more dynamic flows.
[Learn more about Inputs and Outputs of Components](../guides/inputs-and-outputs)
[Learn more about Inputs and Outputs of Components](../migration/inputs-and-outputs)
## From Composition to Freedom
Even though composition is still possible in Langflow 1.0, the new standard is getting data moving through the flow. This allows for more flexibility and control over the data flow in your projects. Check out how to use this in new and existing projects.
[Learn more about the Flow of Data](../guides/flow-of-data)
[Learn more about the Flow of Data](../migration/flow-of-data)
## Continued Support for LangChain and Multiple Frameworks
Langflow 1.0 continues to support LangChain while also introducing support for multiple frameworks. This is another important benefit that the data flow paradigm brings to the table. Find out how to leverage the power of different frameworks in your projects.
[Learn more about Supported Frameworks](../guides/supported-frameworks)
[Learn more about Supported Frameworks](../migration/supported-frameworks)
## Sidebar Redesign and Customizable Interaction Panel
We've expanded on the chat experience by creating a customizable interaction panel that allows you to design a panel that fits your needs and interact with it. The sidebar has also been redesigned to provide a more intuitive and user-friendly experience. Explore the new sidebar and interaction panel features to enhance your workflow.
[Learn more about some of the UI updates](../guides/sidebar-and-interaction-panel)
[Learn more about some of the UI updates](../migration/sidebar-and-interaction-panel)
## New Native Categories and Components
Langflow 1.0 introduces many new native categories, including Inputs, Outputs, Helpers, Experimental, Models, and more. Discover the new components available, such as Chat Input, Prompt, Files, API Request, and others.
[Learn more about New Categories and Components](../guides/new-categories-and-components)
[Learn more about New Categories and Components](../migration/new-categories-and-components)
## New Way of Using Langflow: Text and Record (and more to come)
With the introduction of the Text and Record types, connections between Components are more intuitive and easier to understand. This is the first step in a series of improvements to the way you interact with Langflow. Learn how to use Text and Record and how they help you build better flows.
[Learn more about Text and Record](../guides/text-and-record)
[Learn more about Text and Record](../migration/text-and-record)
## CustomComponent for All Components
Almost all components in Langflow 1.0 are now CustomComponents, allowing you to check and modify the code of each component. Discover how to leverage this feature to customize your components to your specific needs.
[Learn more about CustomComponent](../guides/custom-component)
[Learn more about CustomComponent](../migration/custom-component)
## Compatibility with Previous Versions
To use flows built in previous versions of Langflow, you can utilize the experimental component Runnable Executor along with an Input and Output. **We'd love your feedback on this**. Learn how to adapt your existing flows to work seamlessly in the new version of Langflow.
[Learn more about Compatibility with Previous Versions](../guides/compatibility)
[Learn more about Compatibility with Previous Versions](../migration/compatibility)
## Multiple Flows in the Canvas
Langflow 1.0 allows you to have more than one flow in the canvas and run them separately. Discover how to create and manage multiple flows within a single project.
[Learn more about Multiple Flows](../guides/multiple-flows)
[Learn more about Multiple Flows](../migration/multiple-flows)
## Improved Component Status
Each component now displays its status more clearly, allowing you to quickly identify any issues or errors. Explore how to use the new component status feature to troubleshoot and optimize your flows.
[Learn more about Component Status](../guides/component-status-and-data-passing)
[Learn more about Component Status](../migration/component-status-and-data-passing)
## Connecting Output Components
You can now connect Output components to any other component (that has a Text output), providing a better understanding of the data flow. Explore the possibilities of connecting Output components and how it enhances your flow's functionality.
[Learn more about Connecting Output Components](../guides/connecting-output-components)
[Learn more about Connecting Output Components](../migration/connecting-output-components)
## Renaming and Editing Component Descriptions
Langflow 1.0 allows you to rename and edit the description of each component, making it easier to understand and interact with the flow. Learn how to customize your component names and descriptions for improved clarity.
[Learn more about Renaming and Editing Components](../guides/renaming-and-editing-components)
[Learn more about Renaming and Editing Components](../migration/renaming-and-editing-components)
## Passing Tweaks and Inputs in the API
Things got a whole lot easier. You can now pass tweaks and inputs in the API by referencing the Display Name of the component. Discover how to leverage this feature to dynamically control your flow's behavior.
[Learn more about Passing Tweaks and Inputs](../guides/passing-tweaks-and-inputs)
[Learn more about Passing Tweaks and Inputs](../migration/passing-tweaks-and-inputs)
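To make the Display Name idea concrete, here is a minimal sketch of such a request body. The component names, fields, and the endpoint mentioned in the comment are assumptions for illustration, not taken from this guide:

```python
import json

# Hypothetical tweaks payload: entries are keyed by each component's
# Display Name ("OpenAI" and "Prompt" are made-up example names).
payload = {
    "input_value": "Hello, Langflow!",
    "tweaks": {
        "OpenAI": {"temperature": 0.2},
        "Prompt": {"template": "Answer briefly: {question}"},
    },
}

# You would POST this body to your Langflow server's run endpoint with
# any HTTP client, e.g.:
#   requests.post("http://localhost:7860/api/v1/run/<flow-id>", json=payload)
body = json.dumps(payload)
print(body)
```

Since tweaks are addressed by Display Name, renaming a component in the canvas changes the key you use in the payload.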
## Global Variables for Text Fields
Global Variables can be used in any Text Field across your projects. Learn how to define and utilize Global Variables to streamline your workflow.
[Learn more about Global Variables](../guides/global-variables)
[Learn more about Global Variables](../migration/global-variables)
## Experimental Components
Explore the experimental components available in Langflow 1.0, such as SubFlow, which allows you to load a flow as a component dynamically, and Flow as Tool, which enables you to use a flow as a tool for an Agent.
[Learn more about Experimental Components](../guides/experimental-components)
[Learn more about Experimental Components](../migration/experimental-components)
## Experimental State Management System
We are experimenting with a State Management system for flows that allows components to trigger other components and pass messages between them using the Notify and Listen components. Discover how to leverage this system to create more dynamic and interactive flows.
[Learn more about State Management](../guides/state-management)
[Learn more about State Management](../migration/state-management)
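As a conceptual sketch only (this is not Langflow's implementation, and every name below is invented for illustration), the Notify and Listen pattern behaves like a small message bus: listeners register for a state name, and a notifier triggers everyone registered for that name.

```python
# Toy notify/listen bus illustrating the pattern described above.
class StateBus:
    def __init__(self):
        self._listeners = {}  # state name -> list of callbacks

    def listen(self, name, callback):
        # Register a callback that fires whenever `name` is notified.
        self._listeners.setdefault(name, []).append(callback)

    def notify(self, name, message):
        # Trigger every listener registered for `name`.
        for callback in self._listeners.get(name, []):
            callback(message)

# One "component" listens, another notifies.
bus = StateBus()
received = []
bus.listen("task_done", received.append)
bus.notify("task_done", "summary ready")
print(received)  # ['summary ready']
```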
We hope this guide helps you navigate the changes and improvements in Langflow 1.0. If you have any questions or need further assistance, please don't hesitate to reach out to us in our [Discord](https://discord.gg/wZSWQaukgJ).


@@ -16,6 +16,7 @@ module.exports = {
label: "What's New",
collapsed: false,
items: [
"whats-new/a-new-chapter-langflow",
"whats-new/migrating-to-one-point-zero",
"whats-new/customization-control",
"whats-new/debugging-reimagined",
@@ -27,8 +28,8 @@ module.exports = {
label: "Migration Guides",
collapsed: false,
items: [
"migration/inputs-and-outputs",
"migration/flow-of-data",
"migration/inputs-and-outputs",
"migration/supported-frameworks",
"migration/sidebar-and-interaction-panel",
"migration/new-categories-and-components",
@@ -100,23 +101,6 @@ module.exports = {
"guides/loading_document",
"guides/chatprompttemplate_guide",
"guides/langfuse_integration",
"guides/inputs-and-outputs",
"guides/flow-of-data",
"guides/supported-frameworks",
"guides/sidebar-and-interaction-panel",
"guides/new-categories-and-components",
"guides/text-and-record",
"guides/custom-component",
"guides/compatibility",
"guides/multiple-flows",
"guides/component-status-and-data-passing",
"guides/connecting-output-components",
"guides/renaming-and-editing-components",
"guides/passing-tweaks-and-inputs",
"guides/global-variables",
"guides/experimental-components",
"guides/state-management",
"guides/run-flow",
],
},
{

poetry.lock generated

@@ -239,6 +239,26 @@ typing-extensions = {version = ">=4", markers = "python_version < \"3.11\""}
[package.extras]
tests = ["mypy (>=0.800)", "pytest", "pytest-asyncio"]
[[package]]
name = "assemblyai"
version = "0.23.1"
description = "AssemblyAI Python SDK"
optional = false
python-versions = ">=3.8"
files = [
{file = "assemblyai-0.23.1-py3-none-any.whl", hash = "sha256:2887c7983fa911717cbe37a38d38fcdc8188e62687385b8b6f979546c58354f4"},
{file = "assemblyai-0.23.1.tar.gz", hash = "sha256:4a3d4d8c4f6c956c6243f0873147ba29da4c6cf5edd6a1b52e6bdaa209526998"},
]
[package.dependencies]
httpx = ">=0.19.0"
pydantic = ">=1.7.0,<1.10.7 || >1.10.7"
typing-extensions = ">=3.7"
websockets = ">=11.0"
[package.extras]
extras = ["pyaudio (>=0.2.13)"]
[[package]]
name = "astrapy"
version = "0.7.7"
@@ -331,13 +351,13 @@ files = [
[[package]]
name = "bce-python-sdk"
version = "0.9.5"
version = "0.9.6"
description = "BCE SDK for python"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, <4"
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,<4,>=2.7"
files = [
{file = "bce-python-sdk-0.9.5.tar.gz", hash = "sha256:c51dcd17454af7bfeb211d2daf1cd600b6e336f35244c8cb9120c2fd229d281d"},
{file = "bce_python_sdk-0.9.5-py3-none-any.whl", hash = "sha256:527e7fb4436e09e3d4fa229548e5ff3e0b5441a5d5f0f5658e2c1dbaac6c1986"},
{file = "bce-python-sdk-0.9.6.tar.gz", hash = "sha256:13d2c6d15582391b9d1a4252add28a6a41cf4acc33b53dc38dd7b5a79fd8ed5d"},
{file = "bce_python_sdk-0.9.6-py3-none-any.whl", hash = "sha256:b43e10becad4490e639f84be982f97a499bdc0d3485f1f8859a4eb9ad58b03c4"},
]
[package.dependencies]
@@ -435,17 +455,17 @@ files = [
[[package]]
name = "boto3"
version = "1.34.71"
version = "1.34.72"
description = "The AWS SDK for Python"
optional = false
python-versions = ">=3.8"
files = [
{file = "boto3-1.34.71-py3-none-any.whl", hash = "sha256:7ce8c9a50af2f8a159a0dd86b40011d8dfdaba35005a118e51cd3ac72dc630f1"},
{file = "boto3-1.34.71.tar.gz", hash = "sha256:d786e7fbe3c4152866199786468a625dc77b9f27294cd7ad4f63cd2e0c927287"},
{file = "boto3-1.34.72-py3-none-any.whl", hash = "sha256:a33585ef0d811ee0dffd92a96108344997a3059262c57349be0761d7885f6ae7"},
{file = "boto3-1.34.72.tar.gz", hash = "sha256:cbfabd99c113bbb1708c2892e864b6dd739593b97a76fbb2e090a7d965b63b82"},
]
[package.dependencies]
botocore = ">=1.34.71,<1.35.0"
botocore = ">=1.34.72,<1.35.0"
jmespath = ">=0.7.1,<2.0.0"
s3transfer = ">=0.10.0,<0.11.0"
@@ -454,13 +474,13 @@ crt = ["botocore[crt] (>=1.21.0,<2.0a0)"]
[[package]]
name = "botocore"
version = "1.34.71"
version = "1.34.72"
description = "Low-level, data-driven core of boto 3."
optional = false
python-versions = ">=3.8"
files = [
{file = "botocore-1.34.71-py3-none-any.whl", hash = "sha256:3bc9e23aee73fe6f097823d61f79a8877790436038101a83fa96c7593e8109f8"},
{file = "botocore-1.34.71.tar.gz", hash = "sha256:c58f9ed71af2ea53d24146187130541222d7de8c27eb87d23f15457e7b83d88b"},
{file = "botocore-1.34.72-py3-none-any.whl", hash = "sha256:a6b92735a73c19a7e540d77320420da3af3f32c91fa661c738c0b8c9f912d782"},
{file = "botocore-1.34.72.tar.gz", hash = "sha256:342edb6f91d5839e790411822fc39f9c712c87cdaa7f3b1999f50b1ca16c4a14"},
]
[package.dependencies]
@@ -1788,13 +1808,13 @@ gmpy2 = ["gmpy2"]
[[package]]
name = "elastic-transport"
version = "8.12.0"
version = "8.13.0"
description = "Transport classes and utilities shared among Python Elastic client libraries"
optional = false
python-versions = ">=3.7"
files = [
{file = "elastic-transport-8.12.0.tar.gz", hash = "sha256:48839b942fcce199eece1558ecea6272e116c58da87ca8d495ef12eb61effaf7"},
{file = "elastic_transport-8.12.0-py3-none-any.whl", hash = "sha256:87d9dc9dee64a05235e7624ed7e6ab6e5ca16619aa7a6d22e853273b9f1cfbee"},
{file = "elastic-transport-8.13.0.tar.gz", hash = "sha256:2410ec1ff51221e8b3a01c0afa9f0d0498e1386a269283801f5c12f98e42dc45"},
{file = "elastic_transport-8.13.0-py3-none-any.whl", hash = "sha256:aec890afdddd057762b27ff3553b0be8fa4673ec1a4fd922dfbd00325874bb3d"},
]
[package.dependencies]
@@ -1802,24 +1822,25 @@ certifi = "*"
urllib3 = ">=1.26.2,<3"
[package.extras]
develop = ["aiohttp", "furo", "mock", "pytest", "pytest-asyncio", "pytest-cov", "pytest-httpserver", "pytest-mock", "requests", "sphinx (>2)", "sphinx-autodoc-typehints", "trustme"]
develop = ["aiohttp", "furo", "httpx", "mock", "opentelemetry-api", "opentelemetry-sdk", "orjson", "pytest", "pytest-asyncio", "pytest-cov", "pytest-httpserver", "pytest-mock", "requests", "respx", "sphinx (>2)", "sphinx-autodoc-typehints", "trustme"]
[[package]]
name = "elasticsearch"
version = "8.12.1"
version = "8.13.0"
description = "Python client for Elasticsearch"
optional = false
python-versions = ">=3.7"
files = [
{file = "elasticsearch-8.12.1-py3-none-any.whl", hash = "sha256:cc459b7e0fb88dc85b43b9d7d254cffad552b0063a3e0a12290c8fa5f138c038"},
{file = "elasticsearch-8.12.1.tar.gz", hash = "sha256:00c997720fbd0f2afe5417c8193cf65d116817a0250de0521e30c3e81f00b8ac"},
{file = "elasticsearch-8.13.0-py3-none-any.whl", hash = "sha256:4aaf49253e974eb500f01136a487bdd0f09d3cafd37a0456eff6acfff0c9199b"},
{file = "elasticsearch-8.13.0.tar.gz", hash = "sha256:e4ebebb22d09f0ef839c26b6aa98e19ccd636bcb77f08c12b562b02cacd5e744"},
]
[package.dependencies]
elastic-transport = ">=8,<9"
elastic-transport = ">=8.13,<9"
[package.extras]
async = ["aiohttp (>=3,<4)"]
orjson = ["orjson (>=3)"]
requests = ["requests (>=2.4.0,<3.0.0)"]
[[package]]
@@ -2962,13 +2983,13 @@ files = [
[[package]]
name = "httpcore"
version = "1.0.4"
version = "1.0.5"
description = "A minimal low-level HTTP client."
optional = false
python-versions = ">=3.8"
files = [
{file = "httpcore-1.0.4-py3-none-any.whl", hash = "sha256:ac418c1db41bade2ad53ae2f3834a3a0f5ae76b56cf5aa497d2d033384fc7d73"},
{file = "httpcore-1.0.4.tar.gz", hash = "sha256:cb2839ccfcba0d2d3c1131d3c3e26dfc327326fbe7a5dc0dbfe9f6c9151bb022"},
{file = "httpcore-1.0.5-py3-none-any.whl", hash = "sha256:421f18bac248b25d310f3cacd198d55b8e6125c107797b609ff9b7a6ba7991b5"},
{file = "httpcore-1.0.5.tar.gz", hash = "sha256:34a38e2f9291467ee3b44e89dd52615370e152954ba21721378a87b2960f7a61"},
]
[package.dependencies]
@@ -2979,7 +3000,7 @@ h11 = ">=0.13,<0.15"
asyncio = ["anyio (>=4.0,<5.0)"]
http2 = ["h2 (>=3,<5)"]
socks = ["socksio (==1.*)"]
trio = ["trio (>=0.22.0,<0.25.0)"]
trio = ["trio (>=0.22.0,<0.26.0)"]
[[package]]
name = "httplib2"
@@ -3615,13 +3636,13 @@ test = ["ipykernel", "pre-commit", "pytest (<8)", "pytest-cov", "pytest-timeout"
[[package]]
name = "kombu"
version = "5.3.5"
version = "5.3.6"
description = "Messaging library for Python."
optional = true
python-versions = ">=3.8"
files = [
{file = "kombu-5.3.5-py3-none-any.whl", hash = "sha256:0eac1bbb464afe6fb0924b21bf79460416d25d8abc52546d4f16cad94f789488"},
{file = "kombu-5.3.5.tar.gz", hash = "sha256:30e470f1a6b49c70dc6f6d13c3e4cc4e178aa6c469ceb6bcd55645385fc84b93"},
{file = "kombu-5.3.6-py3-none-any.whl", hash = "sha256:49f1e62b12369045de2662f62cc584e7df83481a513db83b01f87b5b9785e378"},
{file = "kombu-5.3.6.tar.gz", hash = "sha256:f3da5b570a147a5da8280180aa80b03807283d63ea5081fcdb510d18242431d9"},
]
[package.dependencies]
@@ -3638,7 +3659,7 @@ mongodb = ["pymongo (>=4.1.1)"]
msgpack = ["msgpack"]
pyro = ["pyro4"]
qpid = ["qpid-python (>=0.26)", "qpid-tools (>=0.26)"]
redis = ["redis (>=4.5.2,!=4.5.5,<6.0.0)"]
redis = ["redis (>=4.5.2,!=4.5.5,!=5.0.2)"]
slmq = ["softlayer-messaging (>=1.0.3)"]
sqlalchemy = ["sqlalchemy (>=1.4.48,<2.1)"]
sqs = ["boto3 (>=1.26.143)", "pycurl (>=7.43.0.5)", "urllib3 (>=1.26.16)"]
@@ -3772,17 +3793,16 @@ extended-testing = ["aiosqlite (>=0.19.0,<0.20.0)", "aleph-alpha-client (>=2.15.
[[package]]
name = "langchain-core"
version = "0.1.33"
version = "0.1.35"
description = "Building applications with LLMs through composability"
optional = false
python-versions = "<4.0,>=3.8.1"
files = [
{file = "langchain_core-0.1.33-py3-none-any.whl", hash = "sha256:cee7fbab114c74b7279a92c8a376b40344b0fa3d0f0af3143a858e3b7485bf13"},
{file = "langchain_core-0.1.33.tar.gz", hash = "sha256:545eff3de83cc58231bd2b0c6d672323fc2077b94d326ba1a3219118af1d1a66"},
{file = "langchain_core-0.1.35-py3-none-any.whl", hash = "sha256:9d790446ea211f4cb620886081cc5a5723bc9a2dc90af1f6205aded2ee61bb71"},
{file = "langchain_core-0.1.35.tar.gz", hash = "sha256:862b8415d4deaf4e06833ef826bcef3614d75c3e7fd82b09b1349cc223f02e9a"},
]
[package.dependencies]
anyio = ">=3,<5"
jsonpatch = ">=1.33,<2.0"
langsmith = ">=0.1.0,<0.2.0"
packaging = ">=23.2,<24.0"
@@ -3926,7 +3946,7 @@ rich = "^13.7.0"
sqlmodel = "^0.0.14"
typer = "^0.9.0"
uvicorn = "^0.27.0"
websockets = "^10.3"
websockets = "*"
[package.extras]
all = []
@@ -3963,13 +3983,13 @@ openai = ["openai (>=0.27.8)"]
[[package]]
name = "langsmith"
version = "0.1.31"
version = "0.1.34"
description = "Client library to connect to the LangSmith LLM Tracing and Evaluation Platform."
optional = false
python-versions = "<4.0,>=3.8.1"
files = [
{file = "langsmith-0.1.31-py3-none-any.whl", hash = "sha256:5211a9dc00831db307eb843485a97096484b697b5d2cd1efaac34228e97ca087"},
{file = "langsmith-0.1.31.tar.gz", hash = "sha256:efd54ccd44be7fda911bfdc0ead340473df2fdd07345c7252901834d0c4aa37e"},
{file = "langsmith-0.1.34-py3-none-any.whl", hash = "sha256:1f43e9e1f3985be150ff949136a381e627627be4ce2d8dba6f2d8b9f58273420"},
{file = "langsmith-0.1.34.tar.gz", hash = "sha256:9bd248723b4f2c9a805146a039b001170bdf20c80b6499cc553d260aaf4ac4f5"},
]
[package.dependencies]
@@ -4018,23 +4038,23 @@ test = ["httpx (>=0.24.1)", "pytest (>=7.4.0)", "scipy (>=1.10)"]
[[package]]
name = "llama-index"
version = "0.10.23"
version = "0.10.24"
description = "Interface between LLMs and your data"
optional = false
python-versions = "<4.0,>=3.8.1"
files = [
{file = "llama_index-0.10.23-py3-none-any.whl", hash = "sha256:e38a3b87fb9ba74a43bdc374351abd7f3f34f28899bbd18949daf26cb3d2ae7f"},
{file = "llama_index-0.10.23.tar.gz", hash = "sha256:cff74022ed8da6efb49ebf113ff6aba32c863b02a87d45179991c7736cfaf9d5"},
{file = "llama_index-0.10.24-py3-none-any.whl", hash = "sha256:f241b70086d109b296fc9a75fa5eaa580f9bfb48d3271bc9702b0c206ce298ab"},
{file = "llama_index-0.10.24.tar.gz", hash = "sha256:2ec779fb0046271cf170f4b94a78ec6dc111d51e20cdf8a1e2ce471b48c7dc8a"},
]
[package.dependencies]
llama-index-agent-openai = ">=0.1.4,<0.2.0"
llama-index-cli = ">=0.1.2,<0.2.0"
llama-index-core = ">=0.10.23,<0.11.0"
llama-index-core = ">=0.10.24,<0.11.0"
llama-index-embeddings-openai = ">=0.1.5,<0.2.0"
llama-index-indices-managed-llama-cloud = ">=0.1.2,<0.2.0"
llama-index-legacy = ">=0.9.48,<0.10.0"
llama-index-llms-openai = ">=0.1.5,<0.2.0"
llama-index-llms-openai = ">=0.1.13,<0.2.0"
llama-index-multi-modal-llms-openai = ">=0.1.3,<0.2.0"
llama-index-program-openai = ">=0.1.3,<0.2.0"
llama-index-question-gen-openai = ">=0.1.2,<0.2.0"
@@ -4074,13 +4094,13 @@ llama-index-llms-openai = ">=0.1.1,<0.2.0"
[[package]]
name = "llama-index-core"
version = "0.10.23.post1"
version = "0.10.25"
description = "Interface between LLMs and your data"
optional = false
python-versions = "<4.0,>=3.8.1"
files = [
{file = "llama_index_core-0.10.23.post1-py3-none-any.whl", hash = "sha256:5a3ef75791e8236f0441b1b8d504371c07be107d9326549a70e754024792a1d2"},
{file = "llama_index_core-0.10.23.post1.tar.gz", hash = "sha256:0bfa8e93716b979246895601daccc73557af61a53da53d1f717222145015b0ab"},
{file = "llama_index_core-0.10.25-py3-none-any.whl", hash = "sha256:39a0af13f74e57bdf06b86de881cafb80020fc2ae0b0aa0f3a042e17139766ef"},
{file = "llama_index_core-0.10.25.tar.gz", hash = "sha256:407813e4247704d3cf0957cec772889507e26692d52679a155e22f26efc6aa1b"},
]
[package.dependencies]
@@ -4090,7 +4110,7 @@ deprecated = ">=1.2.9.3"
dirtyjson = ">=1.0.8,<2.0.0"
fsspec = ">=2023.5.0"
httpx = "*"
llamaindex-py-client = ">=0.1.13,<0.2.0"
llamaindex-py-client = ">=0.1.15,<0.2.0"
nest-asyncio = ">=1.5.8,<2.0.0"
networkx = ">=3.0"
nltk = ">=3.8.1,<4.0.0"
@@ -4185,17 +4205,17 @@ query-tools = ["guidance (>=0.0.64,<0.0.65)", "jsonpath-ng (>=1.6.0,<2.0.0)", "l
[[package]]
name = "llama-index-llms-openai"
version = "0.1.12"
version = "0.1.13"
description = "llama-index llms openai integration"
optional = false
python-versions = ">=3.8.1,<4.0"
python-versions = "<4.0,>=3.8.1"
files = [
{file = "llama_index_llms_openai-0.1.12-py3-none-any.whl", hash = "sha256:75cf9ad8de0578fc8aae959f3f5f0900f496d8674cfdf97f3e064004e54d7b64"},
{file = "llama_index_llms_openai-0.1.12.tar.gz", hash = "sha256:400ca0083951bd668ce8bc24875ce70b636e7db328b27c6a50e6d1a2b081b9e6"},
{file = "llama_index_llms_openai-0.1.13-py3-none-any.whl", hash = "sha256:84b7f2d1699d882d6a92f7e8a8b203701e9a32e42924e01bccabddbe9955d3f7"},
{file = "llama_index_llms_openai-0.1.13.tar.gz", hash = "sha256:c0fd932255ac9bf72b6b02c3811eebbf3431aa7603aeaab31c811547f444b1ca"},
]
[package.dependencies]
llama-index-core = ">=0.10.20.post1,<0.11.0"
llama-index-core = ">=0.10.24,<0.11.0"
[[package]]
name = "llama-index-multi-modal-llms-openai"
@@ -4214,17 +4234,17 @@ llama-index-llms-openai = ">=0.1.1,<0.2.0"
[[package]]
name = "llama-index-program-openai"
version = "0.1.4"
version = "0.1.5"
description = "llama-index program openai integration"
optional = false
python-versions = ">=3.8.1,<4.0"
python-versions = "<4.0,>=3.8.1"
files = [
{file = "llama_index_program_openai-0.1.4-py3-none-any.whl", hash = "sha256:cfa8f00f3743d2fc70043e80f7c3925d23b1413a0cc7a72863ad60497a18307d"},
{file = "llama_index_program_openai-0.1.4.tar.gz", hash = "sha256:573e99a2dd16ad3caf382c8ab28d1ac10eb2571bc9481d84a6d89806ad6aa5d4"},
{file = "llama_index_program_openai-0.1.5-py3-none-any.whl", hash = "sha256:20b6efa706ac73e4dc5086900fea1ffcb1eb0787c8a6f081669d37da7235aee0"},
{file = "llama_index_program_openai-0.1.5.tar.gz", hash = "sha256:c33aa2d2876ad0ff1f9a2a755d4e7d4917240847d0174e7b2d0b8474499bb700"},
]
[package.dependencies]
llama-index-agent-openai = ">=0.1.1,<0.2.0"
llama-index-agent-openai = ">=0.1.1,<0.3.0"
llama-index-core = ">=0.10.1,<0.11.0"
llama-index-llms-openai = ">=0.1.1,<0.2.0"
@@ -4294,13 +4314,13 @@ llama-index-core = ">=0.10.7"
[[package]]
name = "llamaindex-py-client"
version = "0.1.13"
version = "0.1.15"
description = ""
optional = false
python-versions = ">=3.8,<4.0"
python-versions = "<4.0,>=3.8"
files = [
{file = "llamaindex_py_client-0.1.13-py3-none-any.whl", hash = "sha256:02400c90655da80ae373e0455c829465208607d72462f1898fd383fdfe8dabce"},
{file = "llamaindex_py_client-0.1.13.tar.gz", hash = "sha256:3bd9b435ee0a78171eba412dea5674d813eb5bf36e577d3c7c7e90edc54900d9"},
{file = "llamaindex_py_client-0.1.15-py3-none-any.whl", hash = "sha256:d189f23a8f7f78d0e170f62b531dd6ac030eadcb7dd7d38c1b543c4c98c51e5c"},
{file = "llamaindex_py_client-0.1.15.tar.gz", hash = "sha256:c7ce26855ba976153bb40157c3c194223c6b75179935b988dd4bd6a3fe83aacb"},
]
[package.dependencies]
@@ -6543,28 +6563,28 @@ files = [
[[package]]
name = "pyasn1"
version = "0.5.1"
version = "0.6.0"
description = "Pure-Python implementation of ASN.1 types and DER/BER/CER codecs (X.208)"
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7"
python-versions = ">=3.8"
files = [
{file = "pyasn1-0.5.1-py2.py3-none-any.whl", hash = "sha256:4439847c58d40b1d0a573d07e3856e95333f1976294494c325775aeca506eb58"},
{file = "pyasn1-0.5.1.tar.gz", hash = "sha256:6d391a96e59b23130a5cfa74d6fd7f388dbbe26cc8f1edf39fdddf08d9d6676c"},
{file = "pyasn1-0.6.0-py2.py3-none-any.whl", hash = "sha256:cca4bb0f2df5504f02f6f8a775b6e416ff9b0b3b16f7ee80b5a3153d9b804473"},
{file = "pyasn1-0.6.0.tar.gz", hash = "sha256:3a35ab2c4b5ef98e17dfdec8ab074046fbda76e281c5a706ccd82328cfc8f64c"},
]
[[package]]
name = "pyasn1-modules"
version = "0.3.0"
version = "0.4.0"
description = "A collection of ASN.1-based protocols modules"
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7"
python-versions = ">=3.8"
files = [
{file = "pyasn1_modules-0.3.0-py2.py3-none-any.whl", hash = "sha256:d3ccd6ed470d9ffbc716be08bd90efbd44d0734bc9303818f7336070984a162d"},
{file = "pyasn1_modules-0.3.0.tar.gz", hash = "sha256:5bd01446b736eb9d31512a30d46c1ac3395d676c6f3cafa4c03eb54b9925631c"},
{file = "pyasn1_modules-0.4.0-py3-none-any.whl", hash = "sha256:be04f15b66c206eed667e0bb5ab27e2b1855ea54a842e5037738099e8ca4ae0b"},
{file = "pyasn1_modules-0.4.0.tar.gz", hash = "sha256:831dbcea1b177b28c9baddf4c6d1013c24c3accd14a1873fffaa6a2e905f17b6"},
]
[package.dependencies]
pyasn1 = ">=0.4.6,<0.6.0"
pyasn1 = ">=0.4.6,<0.7.0"
[[package]]
name = "pyautogen"
@@ -7522,26 +7542,26 @@ cffi = {version = "*", markers = "implementation_name == \"pypy\""}
[[package]]
name = "qdrant-client"
version = "1.8.0"
version = "1.8.2"
description = "Client library for the Qdrant vector search engine"
optional = false
python-versions = ">=3.8"
files = [
{file = "qdrant_client-1.8.0-py3-none-any.whl", hash = "sha256:fa28d3eb64c0c57ec029c7c85c71f6c72c197f92502022655741f3632c518e29"},
{file = "qdrant_client-1.8.0.tar.gz", hash = "sha256:2a1a3f2cbacc7adba85644cf6cfdee20401cf25764b32da479c81fb63e178d15"},
{file = "qdrant_client-1.8.2-py3-none-any.whl", hash = "sha256:ee5341c0486d09e4346b0f5ef7781436e6d8cdbf1d5ecddfde7adb3647d353a8"},
{file = "qdrant_client-1.8.2.tar.gz", hash = "sha256:65078d5328bc0393f42a46a31cd319a989b8285bf3958360acf1dffffdf4cc4e"},
]
[package.dependencies]
grpcio = ">=1.41.0"
grpcio-tools = ">=1.41.0"
httpx = {version = ">=0.14.0", extras = ["http2"]}
httpx = {version = ">=0.20.0", extras = ["http2"]}
numpy = {version = ">=1.21", markers = "python_version >= \"3.8\" and python_version < \"3.12\""}
portalocker = ">=2.7.0,<3.0.0"
pydantic = ">=1.10.8"
urllib3 = ">=1.26.14,<3"
[package.extras]
fastembed = ["fastembed (==0.2.2)"]
fastembed = ["fastembed (==0.2.5)"]
[[package]]
name = "qianfan"
@@ -7683,19 +7703,19 @@ full = ["numpy"]
[[package]]
name = "realtime"
version = "1.0.0"
version = "1.0.3"
description = ""
optional = false
python-versions = ">=3.8,<4.0"
python-versions = "<4.0,>=3.8"
files = [
{file = "realtime-1.0.0-py3-none-any.whl", hash = "sha256:ceab9e292211ab08b5792ac52b3fa25398440031d5b369bd5799b8125056e2d8"},
{file = "realtime-1.0.0.tar.gz", hash = "sha256:14e540c4a0cc2736ae83e0cbd7efbbfb8b736df1681df2b9141556cb4848502d"},
{file = "realtime-1.0.3-py3-none-any.whl", hash = "sha256:809b99a1c09390a4580ca2d37d84c85dffacb1804f80c6f5a4491d312c20e6e3"},
{file = "realtime-1.0.3.tar.gz", hash = "sha256:1a39b5dcdb345b4cc7fd43bc035feb38ca915c9248962f20d264625bc8eb2c4e"},
]
[package.dependencies]
python-dateutil = ">=2.8.1,<3.0.0"
typing-extensions = ">=4.2.0,<5.0.0"
websockets = ">=10.3,<11.0"
websockets = ">=11,<13"
[[package]]
name = "red-black-tree-mod"
@@ -8470,36 +8490,36 @@ files = [
[[package]]
name = "supabase"
version = "2.4.0"
version = "2.4.1"
description = "Supabase client for Python."
optional = false
python-versions = ">=3.8,<4.0"
python-versions = "<4.0,>=3.8"
files = [
{file = "supabase-2.4.0-py3-none-any.whl", hash = "sha256:f2f02b0e7903247ef9e2b3cb5dde067924a19a068f1c8befbdf40fb091bf8dd3"},
{file = "supabase-2.4.0.tar.gz", hash = "sha256:d51556d3884f2e6f4588c33f1fcac954d4304238253bc35e9a87fdd22c43bafb"},
{file = "supabase-2.4.1-py3-none-any.whl", hash = "sha256:8b95744ce4ad24245ec23c090f273dfc9c2d9a53e3a80186959903947dbe1ed6"},
{file = "supabase-2.4.1.tar.gz", hash = "sha256:a7dec0586f8931f378a45b2ffb28d8e37b3719f979c17f541b0156019144e645"},
]
[package.dependencies]
gotrue = ">=1.3,<3.0"
httpx = ">=0.24,<0.26"
httpx = ">=0.24,<0.28"
postgrest = ">=0.10.8,<0.17.0"
realtime = ">=1.0.0,<2.0.0"
storage3 = ">=0.5.3,<0.8.0"
supafunc = ">=0.3.1,<0.4.0"
supafunc = ">=0.3.1,<0.5.0"
[[package]]
name = "supafunc"
version = "0.3.3"
version = "0.4.5"
description = "Library for Supabase Functions"
optional = false
python-versions = ">=3.8,<4.0"
python-versions = "<4.0,>=3.8"
files = [
{file = "supafunc-0.3.3-py3-none-any.whl", hash = "sha256:8260b4742335932f9cab64c8f66fb6998681b7e8ca7a46b559a4eb640cc0af80"},
{file = "supafunc-0.3.3.tar.gz", hash = "sha256:c35897a2f40465b40d7a08ae11f872f08eb8d1390c3ebc72c80e27d33ba91b99"},
{file = "supafunc-0.4.5-py3-none-any.whl", hash = "sha256:2208045f8f5c797924666f6a332efad75ad368f8030b2e4ceb9d2bf63f329373"},
{file = "supafunc-0.4.5.tar.gz", hash = "sha256:a6466d78bdcaa58b7f0303793643103baae8106a87acd5d01e196179a9d0d024"},
]
[package.dependencies]
httpx = ">=0.24,<0.26"
httpx = ">=0.24,<0.28"
[[package]]
name = "sympy"
@@ -9018,13 +9038,13 @@ files = [
[[package]]
name = "types-passlib"
version = "1.7.7.20240311"
version = "1.7.7.20240327"
description = "Typing stubs for passlib"
optional = false
python-versions = ">=3.8"
files = [
{file = "types-passlib-1.7.7.20240311.tar.gz", hash = "sha256:287dd27cec5421daf6be5c295f681baf343c146038c8bde4db783bcac1beccb7"},
{file = "types_passlib-1.7.7.20240311-py3-none-any.whl", hash = "sha256:cd44166e9347ae516f4830046cd1673c1ef90a5cc7ddd1356cf8a14892f29249"},
{file = "types-passlib-1.7.7.20240327.tar.gz", hash = "sha256:4cce6a1a3a6afee9fc4728b4d9784300764ac2be747f5bcc01646d904b85f4bb"},
{file = "types_passlib-1.7.7.20240327-py3-none-any.whl", hash = "sha256:3a3b7f4258b71034d2e2f4f307d6810f9904f906cdf375514c8bdbdb28a4ad23"},
]
[[package]]
@ -9646,80 +9666,83 @@ test = ["websockets"]
[[package]]
name = "websockets"
version = "10.4"
version = "12.0"
description = "An implementation of the WebSocket Protocol (RFC 6455 & 7692)"
optional = false
python-versions = ">=3.7"
python-versions = ">=3.8"
files = [
{file = "websockets-10.4-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d58804e996d7d2307173d56c297cf7bc132c52df27a3efaac5e8d43e36c21c48"},
{file = "websockets-10.4-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:bc0b82d728fe21a0d03e65f81980abbbcb13b5387f733a1a870672c5be26edab"},
{file = "websockets-10.4-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:ba089c499e1f4155d2a3c2a05d2878a3428cf321c848f2b5a45ce55f0d7d310c"},
{file = "websockets-10.4-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:33d69ca7612f0ddff3316b0c7b33ca180d464ecac2d115805c044bf0a3b0d032"},
{file = "websockets-10.4-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:62e627f6b6d4aed919a2052efc408da7a545c606268d5ab5bfab4432734b82b4"},
{file = "websockets-10.4-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:38ea7b82bfcae927eeffc55d2ffa31665dc7fec7b8dc654506b8e5a518eb4d50"},
{file = "websockets-10.4-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:e0cb5cc6ece6ffa75baccfd5c02cffe776f3f5c8bf486811f9d3ea3453676ce8"},
{file = "websockets-10.4-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:ae5e95cfb53ab1da62185e23b3130e11d64431179debac6dc3c6acf08760e9b1"},
{file = "websockets-10.4-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:7c584f366f46ba667cfa66020344886cf47088e79c9b9d39c84ce9ea98aaa331"},
{file = "websockets-10.4-cp310-cp310-win32.whl", hash = "sha256:b029fb2032ae4724d8ae8d4f6b363f2cc39e4c7b12454df8df7f0f563ed3e61a"},
{file = "websockets-10.4-cp310-cp310-win_amd64.whl", hash = "sha256:8dc96f64ae43dde92530775e9cb169979f414dcf5cff670455d81a6823b42089"},
{file = "websockets-10.4-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:47a2964021f2110116cc1125b3e6d87ab5ad16dea161949e7244ec583b905bb4"},
{file = "websockets-10.4-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:e789376b52c295c4946403bd0efecf27ab98f05319df4583d3c48e43c7342c2f"},
{file = "websockets-10.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:7d3f0b61c45c3fa9a349cf484962c559a8a1d80dae6977276df8fd1fa5e3cb8c"},
{file = "websockets-10.4-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f55b5905705725af31ccef50e55391621532cd64fbf0bc6f4bac935f0fccec46"},
{file = "websockets-10.4-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:00c870522cdb69cd625b93f002961ffb0c095394f06ba8c48f17eef7c1541f96"},
{file = "websockets-10.4-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f38706e0b15d3c20ef6259fd4bc1700cd133b06c3c1bb108ffe3f8947be15fa"},
{file = "websockets-10.4-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:f2c38d588887a609191d30e902df2a32711f708abfd85d318ca9b367258cfd0c"},
{file = "websockets-10.4-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:fe10ddc59b304cb19a1bdf5bd0a7719cbbc9fbdd57ac80ed436b709fcf889106"},
{file = "websockets-10.4-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:90fcf8929836d4a0e964d799a58823547df5a5e9afa83081761630553be731f9"},
{file = "websockets-10.4-cp311-cp311-win32.whl", hash = "sha256:b9968694c5f467bf67ef97ae7ad4d56d14be2751000c1207d31bf3bb8860bae8"},
{file = "websockets-10.4-cp311-cp311-win_amd64.whl", hash = "sha256:a7a240d7a74bf8d5cb3bfe6be7f21697a28ec4b1a437607bae08ac7acf5b4882"},
{file = "websockets-10.4-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:74de2b894b47f1d21cbd0b37a5e2b2392ad95d17ae983e64727e18eb281fe7cb"},
{file = "websockets-10.4-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e3a686ecb4aa0d64ae60c9c9f1a7d5d46cab9bfb5d91a2d303d00e2cd4c4c5cc"},
{file = "websockets-10.4-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b0d15c968ea7a65211e084f523151dbf8ae44634de03c801b8bd070b74e85033"},
{file = "websockets-10.4-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:00213676a2e46b6ebf6045bc11d0f529d9120baa6f58d122b4021ad92adabd41"},
{file = "websockets-10.4-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:e23173580d740bf8822fd0379e4bf30aa1d5a92a4f252d34e893070c081050df"},
{file = "websockets-10.4-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:dd500e0a5e11969cdd3320935ca2ff1e936f2358f9c2e61f100a1660933320ea"},
{file = "websockets-10.4-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:4239b6027e3d66a89446908ff3027d2737afc1a375f8fd3eea630a4842ec9a0c"},
{file = "websockets-10.4-cp37-cp37m-win32.whl", hash = "sha256:8a5cc00546e0a701da4639aa0bbcb0ae2bb678c87f46da01ac2d789e1f2d2038"},
{file = "websockets-10.4-cp37-cp37m-win_amd64.whl", hash = "sha256:a9f9a735deaf9a0cadc2d8c50d1a5bcdbae8b6e539c6e08237bc4082d7c13f28"},
{file = "websockets-10.4-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:5c1289596042fad2cdceb05e1ebf7aadf9995c928e0da2b7a4e99494953b1b94"},
{file = "websockets-10.4-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:0cff816f51fb33c26d6e2b16b5c7d48eaa31dae5488ace6aae468b361f422b63"},
{file = "websockets-10.4-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:dd9becd5fe29773d140d68d607d66a38f60e31b86df75332703757ee645b6faf"},
{file = "websockets-10.4-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:45ec8e75b7dbc9539cbfafa570742fe4f676eb8b0d3694b67dabe2f2ceed8aa6"},
{file = "websockets-10.4-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4f72e5cd0f18f262f5da20efa9e241699e0cf3a766317a17392550c9ad7b37d8"},
{file = "websockets-10.4-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:185929b4808b36a79c65b7865783b87b6841e852ef5407a2fb0c03381092fa3b"},
{file = "websockets-10.4-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:7d27a7e34c313b3a7f91adcd05134315002aaf8540d7b4f90336beafaea6217c"},
{file = "websockets-10.4-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:884be66c76a444c59f801ac13f40c76f176f1bfa815ef5b8ed44321e74f1600b"},
{file = "websockets-10.4-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:931c039af54fc195fe6ad536fde4b0de04da9d5916e78e55405436348cfb0e56"},
{file = "websockets-10.4-cp38-cp38-win32.whl", hash = "sha256:db3c336f9eda2532ec0fd8ea49fef7a8df8f6c804cdf4f39e5c5c0d4a4ad9a7a"},
{file = "websockets-10.4-cp38-cp38-win_amd64.whl", hash = "sha256:48c08473563323f9c9debac781ecf66f94ad5a3680a38fe84dee5388cf5acaf6"},
{file = "websockets-10.4-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:40e826de3085721dabc7cf9bfd41682dadc02286d8cf149b3ad05bff89311e4f"},
{file = "websockets-10.4-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:56029457f219ade1f2fc12a6504ea61e14ee227a815531f9738e41203a429112"},
{file = "websockets-10.4-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:f5fc088b7a32f244c519a048c170f14cf2251b849ef0e20cbbb0fdf0fdaf556f"},
{file = "websockets-10.4-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2fc8709c00704194213d45e455adc106ff9e87658297f72d544220e32029cd3d"},
{file = "websockets-10.4-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0154f7691e4fe6c2b2bc275b5701e8b158dae92a1ab229e2b940efe11905dff4"},
{file = "websockets-10.4-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4c6d2264f485f0b53adf22697ac11e261ce84805c232ed5dbe6b1bcb84b00ff0"},
{file = "websockets-10.4-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:9bc42e8402dc5e9905fb8b9649f57efcb2056693b7e88faa8fb029256ba9c68c"},
{file = "websockets-10.4-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:edc344de4dac1d89300a053ac973299e82d3db56330f3494905643bb68801269"},
{file = "websockets-10.4-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:84bc2a7d075f32f6ed98652db3a680a17a4edb21ca7f80fe42e38753a58ee02b"},
{file = "websockets-10.4-cp39-cp39-win32.whl", hash = "sha256:c94ae4faf2d09f7c81847c63843f84fe47bf6253c9d60b20f25edfd30fb12588"},
{file = "websockets-10.4-cp39-cp39-win_amd64.whl", hash = "sha256:bbccd847aa0c3a69b5f691a84d2341a4f8a629c6922558f2a70611305f902d74"},
{file = "websockets-10.4-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:82ff5e1cae4e855147fd57a2863376ed7454134c2bf49ec604dfe71e446e2193"},
{file = "websockets-10.4-pp37-pypy37_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d210abe51b5da0ffdbf7b43eed0cfdff8a55a1ab17abbec4301c9ff077dd0342"},
{file = "websockets-10.4-pp37-pypy37_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:942de28af58f352a6f588bc72490ae0f4ccd6dfc2bd3de5945b882a078e4e179"},
{file = "websockets-10.4-pp37-pypy37_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c9b27d6c1c6cd53dc93614967e9ce00ae7f864a2d9f99fe5ed86706e1ecbf485"},
{file = "websockets-10.4-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:3d3cac3e32b2c8414f4f87c1b2ab686fa6284a980ba283617404377cd448f631"},
{file = "websockets-10.4-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:da39dd03d130162deb63da51f6e66ed73032ae62e74aaccc4236e30edccddbb0"},
{file = "websockets-10.4-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:389f8dbb5c489e305fb113ca1b6bdcdaa130923f77485db5b189de343a179393"},
{file = "websockets-10.4-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:09a1814bb15eff7069e51fed0826df0bc0702652b5cb8f87697d469d79c23576"},
{file = "websockets-10.4-pp38-pypy38_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ff64a1d38d156d429404aaa84b27305e957fd10c30e5880d1765c9480bea490f"},
{file = "websockets-10.4-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:b343f521b047493dc4022dd338fc6db9d9282658862756b4f6fd0e996c1380e1"},
{file = "websockets-10.4-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:932af322458da7e4e35df32f050389e13d3d96b09d274b22a7aa1808f292fee4"},
{file = "websockets-10.4-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d6a4162139374a49eb18ef5b2f4da1dd95c994588f5033d64e0bbfda4b6b6fcf"},
{file = "websockets-10.4-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c57e4c1349fbe0e446c9fa7b19ed2f8a4417233b6984277cce392819123142d3"},
{file = "websockets-10.4-pp39-pypy39_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b627c266f295de9dea86bd1112ed3d5fafb69a348af30a2422e16590a8ecba13"},
{file = "websockets-10.4-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:05a7233089f8bd355e8cbe127c2e8ca0b4ea55467861906b80d2ebc7db4d6b72"},
{file = "websockets-10.4.tar.gz", hash = "sha256:eef610b23933c54d5d921c92578ae5f89813438fded840c2e9809d378dc765d3"},
{file = "websockets-12.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d554236b2a2006e0ce16315c16eaa0d628dab009c33b63ea03f41c6107958374"},
{file = "websockets-12.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:2d225bb6886591b1746b17c0573e29804619c8f755b5598d875bb4235ea639be"},
{file = "websockets-12.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:eb809e816916a3b210bed3c82fb88eaf16e8afcf9c115ebb2bacede1797d2547"},
{file = "websockets-12.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c588f6abc13f78a67044c6b1273a99e1cf31038ad51815b3b016ce699f0d75c2"},
{file = "websockets-12.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5aa9348186d79a5f232115ed3fa9020eab66d6c3437d72f9d2c8ac0c6858c558"},
{file = "websockets-12.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6350b14a40c95ddd53e775dbdbbbc59b124a5c8ecd6fbb09c2e52029f7a9f480"},
{file = "websockets-12.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:70ec754cc2a769bcd218ed8d7209055667b30860ffecb8633a834dde27d6307c"},
{file = "websockets-12.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:6e96f5ed1b83a8ddb07909b45bd94833b0710f738115751cdaa9da1fb0cb66e8"},
{file = "websockets-12.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:4d87be612cbef86f994178d5186add3d94e9f31cc3cb499a0482b866ec477603"},
{file = "websockets-12.0-cp310-cp310-win32.whl", hash = "sha256:befe90632d66caaf72e8b2ed4d7f02b348913813c8b0a32fae1cc5fe3730902f"},
{file = "websockets-12.0-cp310-cp310-win_amd64.whl", hash = "sha256:363f57ca8bc8576195d0540c648aa58ac18cf85b76ad5202b9f976918f4219cf"},
{file = "websockets-12.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:5d873c7de42dea355d73f170be0f23788cf3fa9f7bed718fd2830eefedce01b4"},
{file = "websockets-12.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3f61726cae9f65b872502ff3c1496abc93ffbe31b278455c418492016e2afc8f"},
{file = "websockets-12.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:ed2fcf7a07334c77fc8a230755c2209223a7cc44fc27597729b8ef5425aa61a3"},
{file = "websockets-12.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8e332c210b14b57904869ca9f9bf4ca32f5427a03eeb625da9b616c85a3a506c"},
{file = "websockets-12.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5693ef74233122f8ebab026817b1b37fe25c411ecfca084b29bc7d6efc548f45"},
{file = "websockets-12.0-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6e9e7db18b4539a29cc5ad8c8b252738a30e2b13f033c2d6e9d0549b45841c04"},
{file = "websockets-12.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:6e2df67b8014767d0f785baa98393725739287684b9f8d8a1001eb2839031447"},
{file = "websockets-12.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:bea88d71630c5900690fcb03161ab18f8f244805c59e2e0dc4ffadae0a7ee0ca"},
{file = "websockets-12.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:dff6cdf35e31d1315790149fee351f9e52978130cef6c87c4b6c9b3baf78bc53"},
{file = "websockets-12.0-cp311-cp311-win32.whl", hash = "sha256:3e3aa8c468af01d70332a382350ee95f6986db479ce7af14d5e81ec52aa2b402"},
{file = "websockets-12.0-cp311-cp311-win_amd64.whl", hash = "sha256:25eb766c8ad27da0f79420b2af4b85d29914ba0edf69f547cc4f06ca6f1d403b"},
{file = "websockets-12.0-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0e6e2711d5a8e6e482cacb927a49a3d432345dfe7dea8ace7b5790df5932e4df"},
{file = "websockets-12.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:dbcf72a37f0b3316e993e13ecf32f10c0e1259c28ffd0a85cee26e8549595fbc"},
{file = "websockets-12.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:12743ab88ab2af1d17dd4acb4645677cb7063ef4db93abffbf164218a5d54c6b"},
{file = "websockets-12.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7b645f491f3c48d3f8a00d1fce07445fab7347fec54a3e65f0725d730d5b99cb"},
{file = "websockets-12.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9893d1aa45a7f8b3bc4510f6ccf8db8c3b62120917af15e3de247f0780294b92"},
{file = "websockets-12.0-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1f38a7b376117ef7aff996e737583172bdf535932c9ca021746573bce40165ed"},
{file = "websockets-12.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:f764ba54e33daf20e167915edc443b6f88956f37fb606449b4a5b10ba42235a5"},
{file = "websockets-12.0-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:1e4b3f8ea6a9cfa8be8484c9221ec0257508e3a1ec43c36acdefb2a9c3b00aa2"},
{file = "websockets-12.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:9fdf06fd06c32205a07e47328ab49c40fc1407cdec801d698a7c41167ea45113"},
{file = "websockets-12.0-cp312-cp312-win32.whl", hash = "sha256:baa386875b70cbd81798fa9f71be689c1bf484f65fd6fb08d051a0ee4e79924d"},
{file = "websockets-12.0-cp312-cp312-win_amd64.whl", hash = "sha256:ae0a5da8f35a5be197f328d4727dbcfafa53d1824fac3d96cdd3a642fe09394f"},
{file = "websockets-12.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:5f6ffe2c6598f7f7207eef9a1228b6f5c818f9f4d53ee920aacd35cec8110438"},
{file = "websockets-12.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:9edf3fc590cc2ec20dc9d7a45108b5bbaf21c0d89f9fd3fd1685e223771dc0b2"},
{file = "websockets-12.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:8572132c7be52632201a35f5e08348137f658e5ffd21f51f94572ca6c05ea81d"},
{file = "websockets-12.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:604428d1b87edbf02b233e2c207d7d528460fa978f9e391bd8aaf9c8311de137"},
{file = "websockets-12.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1a9d160fd080c6285e202327aba140fc9a0d910b09e423afff4ae5cbbf1c7205"},
{file = "websockets-12.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:87b4aafed34653e465eb77b7c93ef058516cb5acf3eb21e42f33928616172def"},
{file = "websockets-12.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:b2ee7288b85959797970114deae81ab41b731f19ebcd3bd499ae9ca0e3f1d2c8"},
{file = "websockets-12.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:7fa3d25e81bfe6a89718e9791128398a50dec6d57faf23770787ff441d851967"},
{file = "websockets-12.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:a571f035a47212288e3b3519944f6bf4ac7bc7553243e41eac50dd48552b6df7"},
{file = "websockets-12.0-cp38-cp38-win32.whl", hash = "sha256:3c6cc1360c10c17463aadd29dd3af332d4a1adaa8796f6b0e9f9df1fdb0bad62"},
{file = "websockets-12.0-cp38-cp38-win_amd64.whl", hash = "sha256:1bf386089178ea69d720f8db6199a0504a406209a0fc23e603b27b300fdd6892"},
{file = "websockets-12.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:ab3d732ad50a4fbd04a4490ef08acd0517b6ae6b77eb967251f4c263011a990d"},
{file = "websockets-12.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:a1d9697f3337a89691e3bd8dc56dea45a6f6d975f92e7d5f773bc715c15dde28"},
{file = "websockets-12.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:1df2fbd2c8a98d38a66f5238484405b8d1d16f929bb7a33ed73e4801222a6f53"},
{file = "websockets-12.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:23509452b3bc38e3a057382c2e941d5ac2e01e251acce7adc74011d7d8de434c"},
{file = "websockets-12.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2e5fc14ec6ea568200ea4ef46545073da81900a2b67b3e666f04adf53ad452ec"},
{file = "websockets-12.0-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:46e71dbbd12850224243f5d2aeec90f0aaa0f2dde5aeeb8fc8df21e04d99eff9"},
{file = "websockets-12.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:b81f90dcc6c85a9b7f29873beb56c94c85d6f0dac2ea8b60d995bd18bf3e2aae"},
{file = "websockets-12.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:a02413bc474feda2849c59ed2dfb2cddb4cd3d2f03a2fedec51d6e959d9b608b"},
{file = "websockets-12.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:bbe6013f9f791944ed31ca08b077e26249309639313fff132bfbf3ba105673b9"},
{file = "websockets-12.0-cp39-cp39-win32.whl", hash = "sha256:cbe83a6bbdf207ff0541de01e11904827540aa069293696dd528a6640bd6a5f6"},
{file = "websockets-12.0-cp39-cp39-win_amd64.whl", hash = "sha256:fc4e7fa5414512b481a2483775a8e8be7803a35b30ca805afa4998a84f9fd9e8"},
{file = "websockets-12.0-pp310-pypy310_pp73-macosx_10_9_x86_64.whl", hash = "sha256:248d8e2446e13c1d4326e0a6a4e9629cb13a11195051a73acf414812700badbd"},
{file = "websockets-12.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f44069528d45a933997a6fef143030d8ca8042f0dfaad753e2906398290e2870"},
{file = "websockets-12.0-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c4e37d36f0d19f0a4413d3e18c0d03d0c268ada2061868c1e6f5ab1a6d575077"},
{file = "websockets-12.0-pp310-pypy310_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3d829f975fc2e527a3ef2f9c8f25e553eb7bc779c6665e8e1d52aa22800bb38b"},
{file = "websockets-12.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:2c71bd45a777433dd9113847af751aae36e448bc6b8c361a566cb043eda6ec30"},
{file = "websockets-12.0-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:0bee75f400895aef54157b36ed6d3b308fcab62e5260703add87f44cee9c82a6"},
{file = "websockets-12.0-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:423fc1ed29f7512fceb727e2d2aecb952c46aa34895e9ed96071821309951123"},
{file = "websockets-12.0-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:27a5e9964ef509016759f2ef3f2c1e13f403725a5e6a1775555994966a66e931"},
{file = "websockets-12.0-pp38-pypy38_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c3181df4583c4d3994d31fb235dc681d2aaad744fbdbf94c4802485ececdecf2"},
{file = "websockets-12.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:b067cb952ce8bf40115f6c19f478dc71c5e719b7fbaa511359795dfd9d1a6468"},
{file = "websockets-12.0-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:00700340c6c7ab788f176d118775202aadea7602c5cc6be6ae127761c16d6b0b"},
{file = "websockets-12.0-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e469d01137942849cff40517c97a30a93ae79917752b34029f0ec72df6b46399"},
{file = "websockets-12.0-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ffefa1374cd508d633646d51a8e9277763a9b78ae71324183693959cf94635a7"},
{file = "websockets-12.0-pp39-pypy39_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba0cab91b3956dfa9f512147860783a1829a8d905ee218a9837c18f683239611"},
{file = "websockets-12.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:2cb388a5bfb56df4d9a406783b7f9dbefb888c09b71629351cc6b036e9259370"},
{file = "websockets-12.0-py3-none-any.whl", hash = "sha256:dc284bbc8d7c78a6c69e0c7325ab46ee5e40bb4d50e494d8131a07ef47500e9e"},
{file = "websockets-12.0.tar.gz", hash = "sha256:81df9cbcbb6c260de1e007e58c011bfebe2dafc8435107b0537f393dd38c8b1b"},
]
[[package]]
@ -10199,4 +10222,4 @@ local = ["ctransformers", "llama-cpp-python", "sentence-transformers"]
[metadata]
lock-version = "2.0"
python-versions = ">=3.10,<3.12"
content-hash = "bbdd60e5b07fe4ad4759dc69ddcf83fdedf133825835e03b74c50018da803c6b"
content-hash = "54dd8f21b3a7ec73a22735bcb2c59c12cb9f52ff11e66eeffd1d80d6696d0d8e"


@ -81,6 +81,7 @@ unstructured = { extras = ["md"], version = "^0.12.4" }
dspy-ai = "^2.4.0"
crewai = "^0.22.5"
html2text = "^2024.2.26"
assemblyai = "^0.23.1"
[tool.poetry.group.dev.dependencies]
types-redis = "^4.6.0.5"


@ -285,7 +285,12 @@ def run_langflow(host, port, log_level, options, app):
# MacOS requires an env variable to be set to use gunicorn
import uvicorn
uvicorn.run(app, host=host, port=port, log_level=log_level)
uvicorn.run(
app,
host=host,
port=port,
log_level=log_level.lower(),
)
else:
from langflow.server import LangflowApplication


@ -1,199 +0,0 @@
from typing import TYPE_CHECKING, Any, Callable, Coroutine, List, Optional, Tuple, Union
from pydantic.v1 import BaseModel, Field, create_model
from sqlmodel import select
from langflow.schema.schema import INPUT_FIELD_NAME, Record
from langflow.services.database.models.flow.model import Flow
from langflow.services.deps import session_scope
if TYPE_CHECKING:
from langflow.graph.graph.base import Graph
from langflow.graph.vertex.base import Vertex
INPUT_TYPE_MAP = {
"ChatInput": {"type_hint": "Optional[str]", "default": '""'},
"TextInput": {"type_hint": "Optional[str]", "default": '""'},
"JSONInput": {"type_hint": "Optional[dict]", "default": "{}"},
}
def list_flows(*, user_id: Optional[str] = None) -> List[Record]:
if not user_id:
raise ValueError("Session is invalid")
try:
with session_scope() as session:
flows = session.exec(
select(Flow).where(Flow.user_id == user_id).where(Flow.is_component == False) # noqa
).all()
flows_records = [flow.to_record() for flow in flows]
return flows_records
except Exception as e:
raise ValueError(f"Error listing flows: {e}")
async def load_flow(
user_id: str, flow_id: Optional[str] = None, flow_name: Optional[str] = None, tweaks: Optional[dict] = None
) -> "Graph":
from langflow.graph.graph.base import Graph
from langflow.processing.process import process_tweaks
if not flow_id and not flow_name:
raise ValueError("Flow ID or Flow Name is required")
if not flow_id and flow_name:
flow_id = find_flow(flow_name, user_id)
if not flow_id:
raise ValueError(f"Flow {flow_name} not found")
with session_scope() as session:
graph_data = flow.data if (flow := session.get(Flow, flow_id)) else None
if not graph_data:
raise ValueError(f"Flow {flow_id} not found")
if tweaks:
graph_data = process_tweaks(graph_data=graph_data, tweaks=tweaks)
graph = Graph.from_payload(graph_data, flow_id=flow_id)
return graph
def find_flow(flow_name: str, user_id: str) -> Optional[str]:
with session_scope() as session:
flow = session.exec(select(Flow).where(Flow.name == flow_name).where(Flow.user_id == user_id)).first()
return flow.id if flow else None
async def run_flow(
inputs: Optional[List[dict]] = None,
tweaks: Optional[dict] = None,
flow_id: Optional[str] = None,
flow_name: Optional[str] = None,
user_id: Optional[str] = None,
) -> Any:
graph = await load_flow(user_id, flow_id, flow_name, tweaks)
if inputs is None:
inputs = []
inputs_list = []
inputs_components = []
types = []
for input_dict in inputs:
inputs_list.append({INPUT_FIELD_NAME: input_dict.get("input_value")})
inputs_components.append(input_dict.get("components", []))
types.append(input_dict.get("type", []))
return await graph.arun(inputs_list, inputs_components=inputs_components, types=types)
def generate_function_for_flow(inputs: List["Vertex"], flow_id: str) -> Callable:
"""
Generate a dynamic flow function based on the given inputs and flow ID.
Args:
inputs (List[Vertex]): The list of input vertices for the flow.
flow_id (str): The ID of the flow.
Returns:
Callable: The dynamic, awaitable flow function.
Example:
inputs = [vertex1, vertex2]
flow_id = "my_flow"
function = generate_function_for_flow(inputs, flow_id)
result = await function(input1, input2)
"""
# Prepare function arguments with type hints and default values
args = [
f"{input_.display_name.lower().replace(' ', '_')}: {INPUT_TYPE_MAP[input_.base_name]['type_hint']} = {INPUT_TYPE_MAP[input_.base_name]['default']}"
for input_ in inputs
]
# Maintain original argument names for constructing the tweaks dictionary
original_arg_names = [input_.display_name for input_ in inputs]
# Prepare a Pythonic, valid function argument string
func_args = ", ".join(args)
# Map original argument names to their corresponding Pythonic variable names in the function
arg_mappings = ", ".join(
f'"{original_name}": {name}'
for original_name, name in zip(original_arg_names, [arg.split(":")[0] for arg in args])
)
func_body = f"""
from typing import Optional
async def flow_function({func_args}):
tweaks = {{ {arg_mappings} }}
from langflow.helpers.flow import run_flow
from langchain_core.tools import ToolException
try:
return await run_flow(
tweaks={{key: {{'input_value': value}} for key, value in tweaks.items()}},
flow_id="{flow_id}",
)
except Exception as e:
raise ToolException(f"Error running flow: {e}")
"""
compiled_func = compile(func_body, "<string>", "exec")
local_scope = {}
exec(compiled_func, globals(), local_scope)
return local_scope["flow_function"]
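The compile/exec pattern above turns a generated source string into a real async function at runtime. A stripped-down sketch of the same technique (the function body and argument are invented for illustration):

```python
import asyncio

# Build an async function from source text at runtime, as
# generate_function_for_flow does (simplified: no tweaks dict, one argument).
func_body = """
async def flow_function(city: str = ""):
    return f"running flow with city={city}"
"""

compiled = compile(func_body, "<string>", "exec")
local_scope: dict = {}
exec(compiled, globals(), local_scope)
flow_function = local_scope["flow_function"]

result = asyncio.run(flow_function("Lisbon"))
print(result)  # prints "running flow with city=Lisbon"
```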
def build_function_and_schema(flow_record: Record, graph: "Graph") -> Tuple[Callable, BaseModel]:
"""
Builds a dynamic function and schema for a given flow.
Args:
flow_record (Record): The flow record containing information about the flow.
graph (Graph): The graph representing the flow.
Returns:
Tuple[Callable, BaseModel]: A tuple containing the dynamic function and the schema.
"""
flow_id = flow_record.id
inputs = get_flow_inputs(graph)
dynamic_flow_function = generate_function_for_flow(inputs, flow_id)
schema = build_schema_from_inputs(flow_record.name, inputs)
return dynamic_flow_function, schema
def get_flow_inputs(graph: "Graph") -> List["Vertex"]:
"""
Retrieves the flow inputs from the given graph.
Args:
graph (Graph): The graph object representing the flow.
Returns:
List[Vertex]: A list of input vertices for the flow.
"""
inputs = []
for vertex in graph.vertices:
if vertex.is_input:
inputs.append(vertex)
return inputs
def build_schema_from_inputs(name: str, inputs: List["Vertex"]) -> BaseModel:
"""
Builds a schema from the given inputs.
Args:
name (str): The name of the schema.
inputs (List[Vertex]): The list of input vertices; each vertex supplies a display name and description.
Returns:
BaseModel: The schema model.
"""
"""
fields = {}
for input_ in inputs:
field_name = input_.display_name.lower().replace(" ", "_")
description = input_.description
fields[field_name] = (str, Field(default="", description=description))
return create_model(name, **fields)
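`build_schema_from_inputs` relies on pydantic's `create_model` to assemble an input schema at runtime from `(type, Field)` pairs. A minimal sketch of that pattern (field names and values here are invented for illustration):

```python
try:
    from pydantic.v1 import Field, create_model  # pydantic v2's v1 compat layer
except ImportError:
    from pydantic import Field, create_model  # plain pydantic v1

# Build a model dynamically, one (type, Field) pair per input,
# mirroring how build_schema_from_inputs assembles its fields dict.
fields = {
    "city_name": (str, Field(default="", description="Name of the city")),
    "question": (str, Field(default="", description="User question")),
}
FlowInputs = create_model("FlowInputs", **fields)

instance = FlowInputs(city_name="Lisbon")
print(instance.city_name)  # prints "Lisbon"
```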


@ -1,34 +0,0 @@
from langchain_core.documents import Document
from langflow.schema import Record
def docs_to_records(documents: list[Document]) -> list[Record]:
"""
Converts a list of Documents to a list of Records.
Args:
documents (list[Document]): The list of Documents to convert.
Returns:
list[Record]: The converted list of Records.
"""
return [Record.from_document(document) for document in documents]
def records_to_text(template: str, records: list[Record]) -> str:
"""
Formats each Record with the given template and joins the results into a single string.
Args:
template (str): A format string; `{data}` and any key in `record.data` are available as placeholders.
records (list[Record]): The Record, or list of Records, to convert.
Returns:
str: The formatted texts, joined by newlines.
"""
if isinstance(records, Record):
records = [records]
# Format each record, exposing both the whole data dict and its individual keys
formatted_records = [template.format(data=record.data, **record.data) for record in records]
return "\n".join(formatted_records)
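`records_to_text` leans on `str.format` receiving both the whole data dict (as `data`) and its individual keys (unpacked). The behavior can be sketched with plain dicts, no Record class needed:

```python
# Each record's data dict is exposed twice: as `data` and unpacked
# into its individual keys, so templates can use either form.
template = "title: {title} ({data})"
record_data = {"title": "My Doc"}

text = template.format(data=record_data, **record_data)
print(text)  # prints "title: My Doc ({'title': 'My Doc'})"
```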


@ -0,0 +1,65 @@
"""Replace Credential table with Variable
Revision ID: 1a110b568907
Revises: 63b9c451fd30
Create Date: 2024-03-25 09:40:02.743453
"""
from typing import Sequence, Union
import sqlalchemy as sa
import sqlmodel
from alembic import op
from sqlalchemy.engine.reflection import Inspector
# revision identifiers, used by Alembic.
revision: str = "1a110b568907"
down_revision: Union[str, None] = "63b9c451fd30"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
conn = op.get_bind()
inspector = Inspector.from_engine(conn) # type: ignore
table_names = inspector.get_table_names()
# ### commands auto generated by Alembic - please adjust! ###
if "variable" not in table_names:
op.create_table(
"variable",
sa.Column("name", sqlmodel.sql.sqltypes.AutoString(), nullable=True),
sa.Column("value", sqlmodel.sql.sqltypes.AutoString(), nullable=True),
sa.Column("type", sqlmodel.sql.sqltypes.AutoString(), nullable=True),
sa.Column("id", sqlmodel.sql.sqltypes.GUID(), nullable=False),
sa.Column("created_at", sa.DateTime(), nullable=False),
sa.Column("updated_at", sa.DateTime(), nullable=True),
sa.Column("user_id", sqlmodel.sql.sqltypes.GUID(), nullable=False),
sa.ForeignKeyConstraint(["user_id"], ["user.id"], name="fk_variable_user_id"),
sa.PrimaryKeyConstraint("id"),
)
if "credential" in table_names:
op.drop_table("credential")
# ### end Alembic commands ###
def downgrade() -> None:
conn = op.get_bind()
inspector = Inspector.from_engine(conn) # type: ignore
table_names = inspector.get_table_names()
# ### commands auto generated by Alembic - please adjust! ###
if "credential" not in table_names:
op.create_table(
"credential",
sa.Column("name", sa.VARCHAR(), nullable=True),
sa.Column("value", sa.VARCHAR(), nullable=True),
sa.Column("provider", sa.VARCHAR(), nullable=True),
sa.Column("user_id", sa.CHAR(length=32), nullable=False),
sa.Column("id", sa.CHAR(length=32), nullable=False),
sa.Column("created_at", sa.DATETIME(), nullable=False),
sa.Column("updated_at", sa.DATETIME(), nullable=True),
sa.ForeignKeyConstraint(["user_id"], ["user.id"], name="fk_credential_user_id"),
sa.PrimaryKeyConstraint("id"),
)
if "variable" in table_names:
op.drop_table("variable")
# ### end Alembic commands ###
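The `Inspector` guards above make the migration idempotent: a table is created or dropped only when its current existence state requires it. The same check can be seen in isolation against an in-memory SQLite database (a sketch, not project code):

```python
import sqlalchemy as sa

engine = sa.create_engine("sqlite://")
with engine.begin() as conn:
    conn.execute(sa.text("CREATE TABLE credential (id INTEGER PRIMARY KEY)"))
    table_names = sa.inspect(conn).get_table_names()
    # Guarded drop, as in downgrade(): only touch the table if it exists
    if "credential" in table_names:
        conn.execute(sa.text("DROP TABLE credential"))
    remaining = sa.inspect(conn).get_table_names()
```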

View file

@ -4,7 +4,6 @@ from fastapi import APIRouter
from langflow.api.v1 import (
api_key_router,
chat_router,
credentials_router,
endpoints_router,
files_router,
flows_router,
@ -13,6 +12,7 @@ from langflow.api.v1 import (
store_router,
users_router,
validate_router,
variables_router,
)
router = APIRouter(
@ -26,6 +26,6 @@ router.include_router(flows_router)
router.include_router(users_router)
router.include_router(api_key_router)
router.include_router(login_router)
router.include_router(credentials_router)
router.include_router(variables_router)
router.include_router(files_router)
router.include_router(monitor_router)

View file

@ -125,6 +125,9 @@ def update_template_field(frontend_template, key, value_dict):
template_field["value"] = ""
template_field["file_path"] = file_path_value
if "load_from_db" in value_dict and value_dict["load_from_db"]:
template_field["load_from_db"] = value_dict["load_from_db"]
def get_file_path_value(file_path):
"""Get the file path value if the file exists, else return empty string."""
@ -161,7 +164,7 @@ def get_is_component_from_data(data: dict):
async def check_langflow_version(component: StoreComponentCreate):
from langflow import __version__ as current_version
from langflow.version.version import __version__ as current_version # type: ignore
if not component.last_tested_version:
component.last_tested_version = current_version

View file

@ -1,6 +1,5 @@
from langflow.api.v1.api_key import router as api_key_router
from langflow.api.v1.chat import router as chat_router
from langflow.api.v1.credential import router as credentials_router
from langflow.api.v1.endpoints import router as endpoints_router
from langflow.api.v1.files import router as files_router
from langflow.api.v1.flows import router as flows_router
@ -9,6 +8,7 @@ from langflow.api.v1.monitor import router as monitor_router
from langflow.api.v1.store import router as store_router
from langflow.api.v1.users import router as users_router
from langflow.api.v1.validate import router as validate_router
from langflow.api.v1.variable import router as variables_router
__all__ = [
"chat_router",
@ -19,7 +19,7 @@ __all__ = [
"users_router",
"api_key_router",
"login_router",
"credentials_router",
"variables_router",
"monitor_router",
"files_router",
]

View file

@ -1,111 +0,0 @@
from datetime import datetime
from uuid import UUID
from fastapi import APIRouter, Depends, HTTPException
from sqlmodel import Session, select
from langflow.services.auth import utils as auth_utils
from langflow.services.auth.utils import get_current_active_user
from langflow.services.database.models.credential import Credential, CredentialCreate, CredentialRead, CredentialUpdate
from langflow.services.database.models.user.model import User
from langflow.services.deps import get_session, get_settings_service
router = APIRouter(prefix="/credentials", tags=["Credentials"])
@router.post("/", response_model=CredentialRead, status_code=201)
def create_credential(
*,
session: Session = Depends(get_session),
credential: CredentialCreate,
current_user: User = Depends(get_current_active_user),
settings_service=Depends(get_settings_service),
):
"""Create a new credential."""
try:
# check if credential name already exists
credential_exists = session.exec(
select(Credential).where(Credential.name == credential.name, Credential.user_id == current_user.id)
).first()
if credential_exists:
raise HTTPException(status_code=400, detail="Credential name already exists")
credential_dict = credential.model_dump()
credential_dict["user_id"] = current_user.id
db_credential = Credential.model_validate(credential_dict)
if not db_credential.value:
raise HTTPException(status_code=400, detail="Credential value cannot be empty")
encrypted = auth_utils.encrypt_api_key(db_credential.value, settings_service=settings_service)
db_credential.value = encrypted
db_credential.user_id = current_user.id
session.add(db_credential)
session.commit()
session.refresh(db_credential)
return db_credential
except Exception as e:
if isinstance(e, HTTPException):
raise e
raise HTTPException(status_code=500, detail=str(e)) from e
@router.get("/", response_model=list[CredentialRead], status_code=200)
def read_credentials(
*,
session: Session = Depends(get_session),
current_user: User = Depends(get_current_active_user),
):
"""Read all credentials."""
try:
credentials = session.exec(select(Credential).where(Credential.user_id == current_user.id)).all()
return credentials
except Exception as e:
raise HTTPException(status_code=500, detail=str(e)) from e
@router.patch("/{credential_id}", response_model=CredentialRead, status_code=200)
def update_credential(
*,
session: Session = Depends(get_session),
credential_id: UUID,
credential: CredentialUpdate,
current_user: User = Depends(get_current_active_user),
):
"""Update a credential."""
try:
db_credential = session.exec(
select(Credential).where(Credential.id == credential_id, Credential.user_id == current_user.id)
).first()
if not db_credential:
raise HTTPException(status_code=404, detail="Credential not found")
credential_data = credential.model_dump(exclude_unset=True)
for key, value in credential_data.items():
setattr(db_credential, key, value)
db_credential.updated_at = datetime.utcnow()
session.commit()
session.refresh(db_credential)
return db_credential
except Exception as e:
raise HTTPException(status_code=500, detail=str(e)) from e
@router.delete("/{credential_id}", response_model=CredentialRead, status_code=200)
def delete_credential(
*,
session: Session = Depends(get_session),
credential_id: UUID,
current_user: User = Depends(get_current_active_user),
):
"""Delete a credential."""
try:
db_credential = session.exec(
select(Credential).where(Credential.id == credential_id, Credential.user_id == current_user.id)
).first()
if not db_credential:
raise HTTPException(status_code=404, detail="Credential not found")
session.delete(db_credential)
session.commit()
return db_credential
except Exception as e:
raise HTTPException(status_code=500, detail=str(e)) from e

View file

@ -239,7 +239,7 @@ async def create_upload_file(
# get endpoint to return version of langflow
@router.get("/version")
def get_version():
from langflow.version import __version__
from langflow.version import __version__ # type: ignore
return {"version": __version__}

View file

@ -0,0 +1,113 @@
from datetime import datetime
from uuid import UUID
from fastapi import APIRouter, Depends, HTTPException
from sqlmodel import Session, select
from langflow.services.auth import utils as auth_utils
from langflow.services.auth.utils import get_current_active_user
from langflow.services.database.models.user.model import User
from langflow.services.database.models.variable import Variable, VariableCreate, VariableRead, VariableUpdate
from langflow.services.deps import get_session, get_settings_service
router = APIRouter(prefix="/variables", tags=["Variables"])
@router.post("/", response_model=VariableRead, status_code=201)
def create_variable(
*,
session: Session = Depends(get_session),
variable: VariableCreate,
current_user: User = Depends(get_current_active_user),
settings_service=Depends(get_settings_service),
):
"""Create a new variable."""
try:
# check if variable name already exists
variable_exists = session.exec(
select(Variable).where(
Variable.name == variable.name,
Variable.user_id == current_user.id,
)
).first()
if variable_exists:
raise HTTPException(status_code=400, detail="Variable name already exists")
variable_dict = variable.model_dump()
variable_dict["user_id"] = current_user.id
db_variable = Variable.model_validate(variable_dict)
if not db_variable.value:
raise HTTPException(status_code=400, detail="Variable value cannot be empty")
encrypted = auth_utils.encrypt_api_key(db_variable.value, settings_service=settings_service)
db_variable.value = encrypted
db_variable.user_id = current_user.id
session.add(db_variable)
session.commit()
session.refresh(db_variable)
return db_variable
except Exception as e:
if isinstance(e, HTTPException):
raise e
raise HTTPException(status_code=500, detail=str(e)) from e
@router.get("/", response_model=list[VariableRead], status_code=200)
def read_variables(
*,
session: Session = Depends(get_session),
current_user: User = Depends(get_current_active_user),
):
"""Read all variables."""
try:
variables = session.exec(select(Variable).where(Variable.user_id == current_user.id)).all()
return variables
except Exception as e:
raise HTTPException(status_code=500, detail=str(e)) from e
@router.patch("/{variable_id}", response_model=VariableRead, status_code=200)
def update_variable(
*,
session: Session = Depends(get_session),
variable_id: UUID,
variable: VariableUpdate,
current_user: User = Depends(get_current_active_user),
):
"""Update a variable."""
try:
db_variable = session.exec(
select(Variable).where(Variable.id == variable_id, Variable.user_id == current_user.id)
).first()
if not db_variable:
raise HTTPException(status_code=404, detail="Variable not found")
variable_data = variable.model_dump(exclude_unset=True)
for key, value in variable_data.items():
setattr(db_variable, key, value)
db_variable.updated_at = datetime.utcnow()
session.commit()
session.refresh(db_variable)
return db_variable
except Exception as e:
raise HTTPException(status_code=500, detail=str(e)) from e
@router.delete("/{variable_id}", status_code=204)
def delete_variable(
*,
session: Session = Depends(get_session),
variable_id: UUID,
current_user: User = Depends(get_current_active_user),
):
"""Delete a variable."""
try:
db_variable = session.exec(
select(Variable).where(Variable.id == variable_id, Variable.user_id == current_user.id)
).first()
if not db_variable:
raise HTTPException(status_code=404, detail="Variable not found")
session.delete(db_variable)
session.commit()
except Exception as e:
raise HTTPException(status_code=500, detail=str(e)) from e
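The `model_dump(exclude_unset=True)` call in `update_variable` is what gives PATCH its partial-update semantics: only fields the client actually sent are applied. A standalone sketch (the field names and stored row are illustrative):

```python
from typing import Optional

from pydantic import BaseModel

class VariableUpdate(BaseModel):
    name: Optional[str] = None
    value: Optional[str] = None

db_row = {"name": "OPENAI_API_KEY", "value": "encrypted-abc"}
patch = VariableUpdate(name="OPENAI_KEY")  # client sent only `name`

# exclude_unset=True drops `value`, so the stored value stays untouched
for key, val in patch.model_dump(exclude_unset=True).items():
    db_row[key] = val
```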

View file

@ -1,8 +1,10 @@
from typing import List, Union
from typing import List, Optional, Union, cast
from langchain.agents import AgentExecutor, BaseMultiActionAgent, BaseSingleActionAgent
from langchain_core.runnables import Runnable
from langflow.custom import CustomComponent
from langflow.field_typing import BaseMemory, Text, Tool
from langflow.interface.custom.custom_component import CustomComponent
class LCAgentComponent(CustomComponent):
@ -38,11 +40,11 @@ class LCAgentComponent(CustomComponent):
async def run_agent(
self,
agent: Union[BaseSingleActionAgent, BaseMultiActionAgent, AgentExecutor],
agent: Union[Runnable, BaseSingleActionAgent, BaseMultiActionAgent, AgentExecutor],
inputs: str,
input_variables: list[str],
tools: List[Tool],
memory: BaseMemory = None,
memory: Optional[BaseMemory] = None,
handle_parsing_errors: bool = True,
output_key: str = "output",
) -> Text:
@ -50,7 +52,11 @@ class LCAgentComponent(CustomComponent):
runnable = agent
else:
runnable = AgentExecutor.from_agent_and_tools(
agent=agent, tools=tools, verbose=True, memory=memory, handle_parsing_errors=handle_parsing_errors
agent=agent, # type: ignore
tools=tools,
verbose=True,
memory=memory,
handle_parsing_errors=handle_parsing_errors,
)
input_dict = {"input": inputs}
for var in input_variables:
@ -59,11 +65,11 @@ class LCAgentComponent(CustomComponent):
result = await runnable.ainvoke(input_dict)
self.status = result
if output_key in result:
return result.get(output_key)
return cast(str, result.get(output_key))
elif "output" not in result:
if output_key != "output":
raise ValueError(f"Output key not found in result. Tried '{output_key}' and 'output'.")
else:
raise ValueError("Output key not found in result. Tried 'output'.")
return result.get("output")
return cast(str, result.get("output"))
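The output-key fallback at the end of `run_agent` can be sketched on its own (dict in, string out; status handling omitted):

```python
def pick_output(result: dict, output_key: str = "output") -> str:
    # Same fallback as run_agent above: prefer output_key, then "output"
    if output_key in result:
        return result[output_key]
    if "output" in result:
        return result["output"]
    raise ValueError(f"Output key not found in result. Tried '{output_key}' and 'output'.")
```

So a result that only has `"output"` still resolves even when the caller asked for a different key.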

View file

@ -1,10 +1,10 @@
from typing import Optional
from typing import Optional, Union
from langchain_core.language_models.chat_models import BaseChatModel
from langchain_core.language_models.llms import LLM
from langchain_core.messages import HumanMessage, SystemMessage
from langflow.interface.custom.custom_component import CustomComponent
from langflow.custom import CustomComponent
class LCModelComponent(CustomComponent):
@ -34,15 +34,15 @@ class LCModelComponent(CustomComponent):
def get_chat_result(
self, runnable: BaseChatModel, stream: bool, input_value: str, system_message: Optional[str] = None
):
messages = []
messages: list[Union[HumanMessage, SystemMessage]] = []
if system_message:
messages.append(SystemMessage(system_message))
messages.append(SystemMessage(content=system_message))
if input_value:
messages.append(HumanMessage(input_value))
messages.append(HumanMessage(content=input_value))
if stream:
result = runnable.stream(messages)
return runnable.stream(messages)
else:
message = runnable.invoke(messages)
result = message.content
self.status = result
return result
return result
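The stream/invoke branch in `get_chat_result` can be illustrated with a toy stand-in for a chat model (illustration only; not a LangChain API):

```python
class FakeChatModel:
    """Toy stand-in for a chat model (illustration only)."""

    def stream(self, messages):
        yield from ["Hel", "lo"]

    def invoke(self, messages):
        class Message:
            content = "Hello"
        return Message()

def get_chat_result(runnable, stream: bool, messages):
    if stream:
        # Streaming: return the generator untouched so the caller can iterate
        return runnable.stream(messages)
    # Non-streaming: invoke once and unwrap the message content
    return runnable.invoke(messages).content
```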

View file

@ -1,4 +1,4 @@
from typing import List
from typing import List, Optional
from langchain.agents import create_xml_agent
from langchain_core.prompts import PromptTemplate
@ -69,7 +69,7 @@ class XMLAgentComponent(LCAgentComponent):
llm: BaseLLM,
tools: List[Tool],
prompt: str,
memory: BaseMemory = None,
memory: Optional[BaseMemory] = None,
tool_template: str = "{name}: {description}",
handle_parsing_errors: bool = True,
) -> Text:

View file

@ -22,7 +22,7 @@ class CohereEmbeddingsComponent(CustomComponent):
self,
request_timeout: Optional[float] = None,
cohere_api_key: str = "",
max_retries: Optional[int] = None,
max_retries: int = 3,
model: str = "embed-english-v2.0",
truncate: Optional[str] = None,
user_agent: str = "langchain",

View file

@ -1,7 +1,6 @@
from typing import Any, Callable, Dict, List, Optional, Union
from langchain_openai.embeddings.base import OpenAIEmbeddings
from pydantic.v1.types import SecretStr
from langflow.field_typing import NestedDict
from langflow.interface.custom.custom_component import CustomComponent
@ -100,8 +99,6 @@ class OpenAIEmbeddingsComponent(CustomComponent):
if disallowed_special == ["all"]:
disallowed_special = "all" # type: ignore
api_key = SecretStr(openai_api_key) if openai_api_key else None
return OpenAIEmbeddings(
tiktoken_enabled=tiktoken_enable,
default_headers=default_headers,
@ -116,7 +113,7 @@ class OpenAIEmbeddingsComponent(CustomComponent):
model=model,
model_kwargs=model_kwargs,
base_url=openai_api_base,
api_key=api_key,
api_key=openai_api_key,
openai_api_type=openai_api_type,
api_version=openai_api_version,
organization=openai_organization,

View file

@ -1,4 +1,4 @@
from typing import Any, List, Optional, Text
from typing import Any, List, Optional
from langchain_core.tools import StructuredTool
from loguru import logger
@ -8,6 +8,7 @@ from langflow.field_typing import Tool
from langflow.graph.graph.base import Graph
from langflow.helpers.flow import build_function_and_schema
from langflow.schema.dotdict import dotdict
from langflow.schema.schema import Record
class FlowToolComponent(CustomComponent):
@ -19,7 +20,7 @@ class FlowToolComponent(CustomComponent):
flow_records = self.list_flows()
return [flow_record.data["name"] for flow_record in flow_records]
def get_flow(self, flow_name: str) -> Optional[Text]:
def get_flow(self, flow_name: str) -> Optional[Record]:
"""
Retrieves a flow by its name.
@ -82,4 +83,4 @@ class FlowToolComponent(CustomComponent):
description_repr = repr(tool.description).strip("'")
args_str = "\n".join([f"- {arg_name}: {arg_data['description']}" for arg_name, arg_data in tool.args.items()])
self.status = f"{description_repr}\nArguments:\n{args_str}"
return tool
return tool # type: ignore

View file

@ -5,15 +5,22 @@ from langflow.interface.custom.custom_component import CustomComponent
class RunnableExecComponent(CustomComponent):
documentation: str = "http://docs.langflow.org/components/custom"
description = "Execute a runnable. It will try to guess the input and output keys."
display_name = "Runnable Executor"
beta: bool = True
field_order = [
"input_key",
"output_key",
"input_value",
"runnable",
]
def build_config(self):
return {
"input_key": {
"display_name": "Input Key",
"info": "The key to use for the input.",
"advanced": True,
},
"input_value": {
"display_name": "Inputs",
@ -27,9 +34,78 @@ class RunnableExecComponent(CustomComponent):
"output_key": {
"display_name": "Output Key",
"info": "The key to use for the output.",
"advanced": True,
},
}
def get_output(self, result, input_key, output_key):
"""
Retrieves the output value from the given result dictionary based on the specified input and output keys.
Args:
result (dict): The result dictionary containing the output value.
input_key (str): The key used to retrieve the input value from the result dictionary.
output_key (str): The key used to retrieve the output value from the result dictionary.
Returns:
tuple: A tuple containing the output value and the status message.
"""
possible_output_keys = ["answer", "response", "output", "result", "text"]
status = ""
result_value = None
if output_key in result:
result_value = result.get(output_key)
elif len(result) == 2 and input_key in result:
# get the other key from the result dict
other_key = [k for k in result if k != input_key][0]
if other_key == output_key:
result_value = result.get(output_key)
else:
status += f"Warning: The output key is not '{output_key}'. The output key is '{other_key}'."
result_value = result.get(other_key)
elif len(result) == 1:
result_value = list(result.values())[0]
elif any(k in result for k in possible_output_keys):
for key in possible_output_keys:
if key in result:
result_value = result.get(key)
status += f"Output key: '{key}'."
break
if result_value is None:
result_value = result
status += f"Warning: The output key is not '{output_key}'."
else:
result_value = result
status += f"Warning: The output key is not '{output_key}'."
return result_value, status
def get_input_dict(self, runnable, input_key, input_value):
"""
Returns a dictionary containing the input key-value pair for the given runnable.
Args:
runnable: The runnable object.
input_key: The key for the input value.
input_value: The value for the input key.
Returns:
input_dict: A dictionary containing the input key-value pair.
status: A status message indicating if the input key is not in the runnable's input keys.
"""
input_dict = {}
status = ""
if hasattr(runnable, "input_keys"):
# Check if input_key is in the runnable's input_keys
if input_key in runnable.input_keys:
input_dict[input_key] = input_value
else:
input_dict = {k: input_value for k in runnable.input_keys}
status = f"Warning: The input key is not '{input_key}'. The runnable's input keys are '{runnable.input_keys}'."
return input_dict, status
def build(
self,
input_value: Text,
@ -37,7 +113,10 @@ class RunnableExecComponent(CustomComponent):
input_key: str = "input",
output_key: str = "output",
) -> Text:
result = runnable.invoke({input_key: input_value})
result = result.get(output_key)
self.status = result
return result
input_dict, status = self.get_input_dict(runnable, input_key, input_value)
result = runnable.invoke(input_dict)
result_value, _status = self.get_output(result, input_key, output_key)
status += _status
status += f"\n\nOutput: {result_value}\n\nRaw Output: {result}"
self.status = status
return result_value
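The `get_output` heuristic above condenses to the following standalone sketch (status strings omitted, same branch order):

```python
def guess_output(result: dict, input_key: str, output_key: str):
    # Condensed version of get_output above
    if output_key in result:
        return result[output_key]
    if len(result) == 2 and input_key in result:
        # Two keys and one is the input: the other must be the output
        return next(v for k, v in result.items() if k != input_key)
    if len(result) == 1:
        return next(iter(result.values()))
    for key in ("answer", "response", "output", "result", "text"):
        if key in result:
            return result[key]
    return result  # give up and return the raw dict
```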

View file

@ -1,10 +1,12 @@
from typing import Any, List, Optional, Text, Tuple
from typing import Any, List, Optional
from langflow.helpers.flow import get_flow_inputs
from loguru import logger
from langflow.custom import CustomComponent
from langflow.graph.graph.base import Graph
from langflow.graph.schema import ResultData, RunOutputs
from langflow.graph.vertex.base import Vertex
from langflow.schema import Record
from langflow.schema.dotdict import dotdict
from langflow.template.field.base import TemplateField
@ -20,7 +22,7 @@ class SubFlowComponent(CustomComponent):
flow_records = self.list_flows()
return [flow_record.data["name"] for flow_record in flow_records]
def get_flow(self, flow_name: str) -> Optional[Text]:
def get_flow(self, flow_name: str) -> Optional[Record]:
flow_records = self.list_flows()
for flow_record in flow_records:
if flow_record.data["name"] == flow_name:
@ -42,7 +44,7 @@ class SubFlowComponent(CustomComponent):
raise ValueError(f"Flow {field_value} not found.")
graph = Graph.from_payload(flow_record.data["data"])
# Get all inputs from the graph
inputs = self.get_flow_inputs(graph)
inputs = get_flow_inputs(graph)
# Add inputs to the build config
build_config = self.add_inputs_to_build_config(inputs, build_config)
except Exception as e:
@ -50,21 +52,13 @@ class SubFlowComponent(CustomComponent):
return build_config
def get_flow_inputs(self, graph: Graph) -> List[Record]:
inputs = []
for vertex in graph.vertices:
if vertex.is_input:
inputs.append((vertex.id, vertex.display_name, vertex.description))
logger.debug(inputs)
return inputs
def add_inputs_to_build_config(self, inputs: List[Tuple], build_config: dotdict):
def add_inputs_to_build_config(self, inputs: List[Vertex], build_config: dotdict):
new_fields: list[TemplateField] = []
for input_id, input_display_name, input_description in inputs:
for vertex in inputs:
field = TemplateField(
display_name=input_display_name,
name=input_id,
info=input_description,
display_name=vertex.display_name,
name=vertex.id,
info=vertex.description,
field_type="str",
default=None,
)
@ -110,12 +104,15 @@ class SubFlowComponent(CustomComponent):
tweaks=tweaks,
flow_name=flow_name,
)
if not run_outputs:
return []
run_output = run_outputs[0]
records = []
for output in run_output.outputs:
if output:
records.extend(self.build_records_from_result_data(output))
if run_output is not None:
for output in run_output.outputs:
if output:
records.extend(self.build_records_from_result_data(output))
self.status = records
logger.debug(records)

View file

@ -1,6 +1,6 @@
from .ClearMessageHistory import ClearMessageHistoryComponent
from .ExtractDataFromRecord import ExtractKeyFromRecordComponent
from .Listen import GetNotifiedComponent
from .Listen import ListenComponent
from .ListFlows import ListFlowsComponent
from .MergeRecords import MergeRecordsComponent
from .Notify import NotifyComponent
@ -11,7 +11,7 @@ from .SQLExecutor import SQLExecutorComponent
__all__ = [
"ClearMessageHistoryComponent",
"ExtractKeyFromRecordComponent",
"GetNotifiedComponent",
"ListenComponent",
"ListFlowsComponent",
"MergeRecordsComponent",
"MessageHistoryComponent",

View file

@ -2,7 +2,6 @@ from typing import Optional
from langchain.llms.base import BaseLanguageModel
from langchain_openai import AzureChatOpenAI
from pydantic.v1 import SecretStr
from langflow.base.models.model import LCModelComponent
from langflow.field_typing import Text
@ -91,21 +90,20 @@ class AzureChatOpenAIComponent(LCModelComponent):
azure_endpoint: str,
input_value: Text,
azure_deployment: str,
api_key: str,
api_version: str,
api_key: Optional[str] = None,
system_message: Optional[str] = None,
temperature: float = 0.7,
max_tokens: Optional[int] = 1000,
stream: bool = False,
) -> BaseLanguageModel:
secret_api_key = SecretStr(api_key)
try:
output = AzureChatOpenAI(
model=model,
azure_endpoint=azure_endpoint,
azure_deployment=azure_deployment,
api_version=api_version,
api_key=secret_api_key,
api_key=api_key,
temperature=temperature,
max_tokens=max_tokens,
)

View file

@ -1,3 +0,0 @@
from .model import LCModelComponent
__all__ = ["LCModelComponent"]

View file

@ -34,4 +34,4 @@ class SearchApiToolComponent(CustomComponent):
tool = SearchAPIRun(api_wrapper=search_api_wrapper)
self.status = tool
return tool
return tool # type: ignore

View file

@ -1,6 +1,7 @@
from typing import List, Optional
from langchain_astradb import AstraDBVectorStore
from langchain_astradb.utils.astradb import SetupMode
from langflow.custom import CustomComponent
from langflow.field_typing import Embeddings, VectorStore
@ -85,6 +86,10 @@ class AstraDBVectorStoreComponent(CustomComponent):
metadata_indexing_exclude: Optional[List[str]] = None,
collection_indexing_policy: Optional[dict] = None,
) -> VectorStore:
try:
setup_mode_value = SetupMode[setup_mode.upper()]
except KeyError:
raise ValueError(f"Invalid setup mode: {setup_mode}")
if inputs:
documents = [_input.to_lc_document() for _input in inputs]
@ -100,7 +105,7 @@ class AstraDBVectorStoreComponent(CustomComponent):
bulk_insert_batch_concurrency=bulk_insert_batch_concurrency,
bulk_insert_overwrite_concurrency=bulk_insert_overwrite_concurrency,
bulk_delete_concurrency=bulk_delete_concurrency,
setup_mode=setup_mode,
setup_mode=setup_mode_value,
pre_delete_collection=pre_delete_collection,
metadata_indexing_include=metadata_indexing_include,
metadata_indexing_exclude=metadata_indexing_exclude,
@ -118,7 +123,7 @@ class AstraDBVectorStoreComponent(CustomComponent):
bulk_insert_batch_concurrency=bulk_insert_batch_concurrency,
bulk_insert_overwrite_concurrency=bulk_insert_overwrite_concurrency,
bulk_delete_concurrency=bulk_delete_concurrency,
setup_mode=setup_mode,
setup_mode=setup_mode_value,
pre_delete_collection=pre_delete_collection,
metadata_indexing_include=metadata_indexing_include,
metadata_indexing_exclude=metadata_indexing_exclude,

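The `SetupMode[setup_mode.upper()]` guard above is the standard enum lookup-by-name pattern. A self-contained sketch (the members mirror what `langchain_astradb`'s `SetupMode` is assumed to expose):

```python
from enum import Enum

class SetupMode(Enum):
    # Assumed members, standing in for langchain_astradb's SetupMode
    SYNC = "sync"
    ASYNC = "async"
    OFF = "off"

def parse_setup_mode(value: str) -> SetupMode:
    # Same guard as the component: look up by member name, fail loudly
    try:
        return SetupMode[value.upper()]
    except KeyError:
        raise ValueError(f"Invalid setup mode: {value}") from None
```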
View file

@ -6,7 +6,7 @@ from langflow.field_typing import Embeddings, Text
from langflow.schema import Record
class AstraDBSearchComponent(AstraDBVectorStoreComponent, LCVectorStoreComponent):
class AstraDBSearchComponent(LCVectorStoreComponent):
display_name = "AstraDB Search"
description = "Searches an existing AstraDB Vector Store"
icon = "AstraDB"
@ -76,7 +76,7 @@ class AstraDBSearchComponent(AstraDBVectorStoreComponent, LCVectorStoreComponent
self,
embedding: Embeddings,
collection_name: str,
input_value: Optional[Text] = None,
input_value: Text,
search_type: str = "Similarity",
token: Optional[str] = None,
api_endpoint: Optional[str] = None,
@ -92,7 +92,7 @@ class AstraDBSearchComponent(AstraDBVectorStoreComponent, LCVectorStoreComponent
metadata_indexing_exclude: Optional[List[str]] = None,
collection_indexing_policy: Optional[dict] = None,
) -> List[Record]:
vector_store = super().build(
vector_store = AstraDBVectorStoreComponent().build(
embedding=embedding,
collection_name=collection_name,
token=token,

View file

@ -6,7 +6,7 @@ from langflow.field_typing import Embeddings, NestedDict, Text
from langflow.schema import Record
class MongoDBAtlasSearchComponent(MongoDBAtlasComponent, LCVectorStoreComponent):
class MongoDBAtlasSearchComponent(LCVectorStoreComponent):
display_name = "MongoDB Atlas Search"
description = "Search a MongoDB Atlas Vector Store for similar documents."
@ -37,9 +37,10 @@ class MongoDBAtlasSearchComponent(MongoDBAtlasComponent, LCVectorStoreComponent)
search_kwargs: Optional[NestedDict] = None,
) -> List[Record]:
search_kwargs = search_kwargs or {}
vector_store = super().build(
connection_string=mongodb_atlas_cluster_uri,
namespace=f"{db_name}.{collection_name}",
vector_store = MongoDBAtlasComponent().build(
mongodb_atlas_cluster_uri=mongodb_atlas_cluster_uri,
collection_name=collection_name,
db_name=db_name,
embedding=embedding,
index_name=index_name,
)

View file

@ -10,12 +10,12 @@ class RangeSpec(BaseModel):
@classmethod
def max_must_be_greater_than_min(cls, v, values, **kwargs):
if "min" in values.data and v <= values.data["min"]:
raise ValueError("max must be greater than min")
raise ValueError("Max must be greater than min")
return v
@field_validator("step")
@classmethod
def step_must_be_positive(cls, v):
if v <= 0:
raise ValueError("step must be positive")
raise ValueError("Step must be positive")
return v

View file

@ -1,7 +1,7 @@
import asyncio
from collections import defaultdict, deque
from itertools import chain
from typing import TYPE_CHECKING, Coroutine, Dict, Generator, List, Optional, Type, Union
from typing import TYPE_CHECKING, Callable, Coroutine, Dict, Generator, List, Literal, Optional, Type, Union
from loguru import logger
@ -201,7 +201,7 @@ class Graph:
self,
inputs: Dict[str, str],
input_components: list[str],
input_type: str,
input_type: Literal["chat", "text", "json", "any"] | None,
outputs: list[str],
stream: bool,
session_id: str,
@ -236,7 +236,7 @@ class Graph:
continue
# If the input_type is not any and the input_type is not in the vertex id
# Example: input_type = "chat" and vertex.id = "OpenAI-19ddn"
elif input_type != "any" and input_type not in vertex.id.lower():
elif input_type is not None and input_type != "any" and input_type not in vertex.id.lower():
continue
if vertex is None:
raise ValueError(f"Vertex {vertex_id} not found")
@ -269,9 +269,9 @@ class Graph:
def run(
self,
inputs: Dict[str, str],
input_components: Optional[list[str]] = None,
types: Optional[list[str]] = None,
inputs: list[Dict[str, str]],
input_components: Optional[list[list[str]]] = None,
types: Optional[list[Literal["chat", "text", "json", "any"] | None]] = None,
outputs: Optional[list[str]] = None,
session_id: Optional[str] = None,
stream: bool = False,
@ -309,7 +309,7 @@ class Graph:
self,
inputs: list[Dict[str, str]],
inputs_components: Optional[list[list[str]]] = None,
types: Optional[list[str]] = None,
types: Optional[list[Literal["chat", "text", "json", "any"] | None]] = None,
outputs: Optional[list[str]] = None,
session_id: Optional[str] = None,
stream: bool = False,
@ -338,8 +338,12 @@ class Graph:
inputs = [{}]
# inputs_components and types must match the length of inputs;
# pad them with empty/default entries to complete the length
if inputs_components is None:
inputs_components = []
for _ in range(len(inputs) - len(inputs_components)):
inputs_components.append([])
if types is None:
types = []
for _ in range(len(inputs) - len(types)):
types.append("any")
for run_inputs, components, input_type in zip(inputs, inputs_components, types):
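The padding logic above can be sketched in isolation: both per-run lists are extended so `zip()` pairs every inputs dict with something (the vertex id below is hypothetical):

```python
inputs = [{"input_value": "a"}, {"input_value": "b"}, {"input_value": "c"}]
inputs_components: list[list[str]] = [["ChatInput-XXXX"]]  # hypothetical id
types: list[str] = []

# Pad the per-run lists so zip() pairs every inputs dict with something
inputs_components += [[] for _ in range(len(inputs) - len(inputs_components))]
types += ["any" for _ in range(len(inputs) - len(types))]
```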
@ -650,7 +654,7 @@ class Graph:
async def build_vertex(
self,
lock: asyncio.Lock,
set_cache_coro: Coroutine,
set_cache_coro: Callable[["Graph", asyncio.Lock], Coroutine],
vertex_id: str,
inputs_dict: Optional[Dict[str, str]] = None,
user_id: Optional[str] = None,
@ -693,7 +697,9 @@ class Graph:
logger.exception(f"Error building vertex: {exc}")
raise exc
async def get_next_and_top_level_vertices(self, lock: asyncio.Lock, set_cache_coro: Coroutine, vertex: Vertex):
async def get_next_and_top_level_vertices(
self, lock: asyncio.Lock, set_cache_coro: Callable[["Graph", asyncio.Lock], Coroutine], vertex: Vertex
):
"""
Retrieves the next runnable vertices and the top level vertices for a given vertex.

View file

@ -3,8 +3,10 @@ from langflow.interface.agents.base import agent_creator
from langflow.interface.custom.base import custom_component_creator
from langflow.interface.document_loaders.base import documentloader_creator
from langflow.interface.embeddings.base import embedding_creator
from langflow.interface.llms.base import llm_creator
from langflow.interface.memories.base import memory_creator
from langflow.interface.output_parsers.base import output_parser_creator
from langflow.interface.prompts.base import prompt_creator
from langflow.interface.retrievers.base import retriever_creator
from langflow.interface.text_splitters.base import textsplitter_creator
from langflow.interface.toolkits.base import toolkits_creator
@@ -33,13 +35,13 @@ class VertexTypesDict(LazyLoadDictBase):
def get_type_dict(self):
return {
# **{t: types.PromptVertex for t in prompt_creator.to_list()},
**{t: types.PromptVertex for t in prompt_creator.to_list()},
**{t: types.AgentVertex for t in agent_creator.to_list()},
# **{t: types.ChainVertex for t in chain_creator.to_list()},
**{t: types.ToolVertex for t in tool_creator.to_list()},
**{t: types.ToolkitVertex for t in toolkits_creator.to_list()},
**{t: types.WrapperVertex for t in wrapper_creator.to_list()},
# **{t: types.LLMVertex for t in llm_creator.to_list()},
**{t: types.LLMVertex for t in llm_creator.to_list()},
**{t: types.MemoryVertex for t in memory_creator.to_list()},
**{t: types.EmbeddingVertex for t in embedding_creator.to_list()},
# **{t: types.VectorStoreVertex for t in vectorstore_creator.to_list()},

View file

@@ -1,6 +1,6 @@
import asyncio
from collections import defaultdict
from typing import TYPE_CHECKING, Coroutine, List
from typing import TYPE_CHECKING, Awaitable, Callable, List
if TYPE_CHECKING:
from langflow.graph.graph.base import Graph
@@ -55,7 +55,7 @@ class RunnableVerticesManager:
async def get_next_runnable_vertices(
self,
lock: asyncio.Lock,
set_cache_coro: Coroutine,
set_cache_coro: Callable[["Graph", asyncio.Lock], Awaitable[None]],
graph: "Graph",
vertex: "Vertex",
):

View file

@@ -1,44 +1,27 @@
from collections import defaultdict
from threading import Lock
from typing import Callable
from typing import TYPE_CHECKING, Callable
from langflow.services.deps import get_state_service
from loguru import logger
if TYPE_CHECKING:
from langflow.services.state.service import StateService
class GraphStateManager:
def __init__(self):
self.states = {}
self.observers = defaultdict(list)
self.lock = Lock()
self.state_service: "StateService" = get_state_service()
def append_state(self, key, new_state, run_id: str):
with self.lock:
if run_id not in self.states:
self.states[run_id] = {}
if key not in self.states[run_id]:
self.states[run_id][key] = []
elif not isinstance(self.states[key], list):
self.states[run_id][key] = [self.states[key]]
self.states[run_id][key].append(new_state)
self.notify_append_observers(key, new_state)
self.state_service.append_state(key, new_state, run_id)
def update_state(self, key, new_state, run_id: str):
with self.lock:
if run_id not in self.states:
self.states[run_id] = {}
if key not in self.states[run_id]:
self.states[run_id][key] = {}
self.states[run_id][key] = new_state
self.notify_observers(key, new_state)
self.state_service.update_state(key, new_state, run_id)
def get_state(self, key, run_id: str):
with self.lock:
return self.states.get(run_id, {}).get(key, "")
return self.state_service.get_state(key, run_id)
def subscribe(self, key, observer: Callable):
with self.lock:
if observer not in self.observers[key]:
self.observers[key].append(observer)
self.state_service.subscribe(key, observer)
def notify_observers(self, key, new_state):
for callback in self.observers[key]:

View file

@@ -4,7 +4,6 @@ import inspect
import types
from enum import Enum
from typing import TYPE_CHECKING, Any, AsyncIterator, Callable, Dict, Iterator, List, Optional
from loguru import logger
from langflow.graph.schema import INPUT_COMPONENTS, OUTPUT_COMPONENTS, InterfaceComponentTypes, ResultData
@@ -68,6 +67,7 @@ class Vertex:
self.is_task = is_task
self.params = params or {}
self.parent_node_id: Optional[str] = self._data.get("parent_node_id")
self.load_from_db_fields: List[str] = []
self.parent_is_top_level = False
self.layer = None
self.should_run = True
@@ -155,6 +155,7 @@ class Vertex:
"_built": False,
"parent_node_id": self.parent_node_id,
"parent_is_top_level": self.parent_is_top_level,
"load_from_db_fields": self.load_from_db_fields,
"is_input": self.is_input,
"is_output": self.is_output,
}
@@ -185,6 +186,7 @@ class Vertex:
self.task_id: Optional[str] = None
self.parent_node_id = state["parent_node_id"]
self.parent_is_top_level = state["parent_is_top_level"]
self.load_from_db_fields = state["load_from_db_fields"]
self.layer = state.get("layer")
self.steps = state.get("steps", [self._build])
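Adding `load_from_db_fields` to both `__getstate__` and `__setstate__` matters because any attribute omitted there is silently lost when a `Vertex` is serialized (e.g. for caching). A small illustration of that round-trip contract:

```python
import pickle

class Node:
    """Mimics Vertex's explicit pickling contract."""

    def __init__(self):
        self.load_from_db_fields = ["api_key"]
        self._lock = lambda: None  # unpicklable runtime state

    def __getstate__(self):
        # Only fields listed here survive serialization
        return {"load_from_db_fields": self.load_from_db_fields}

    def __setstate__(self, state):
        self.load_from_db_fields = state["load_from_db_fields"]
        self._lock = lambda: None  # rebuilt, not restored

restored = pickle.loads(pickle.dumps(Node()))
```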
@@ -285,20 +287,21 @@ class Vertex:
else:
params[param_key] = self.graph.get_vertex(edge.source_id)
for key, value in template_dict.items():
if key in params:
load_from_db_fields = []
for field_name, field in template_dict.items():
if field_name in params:
continue
# Skip _type and any value that has show == False and is not code
# If we don't want to show code but we want to use it
if key == "_type" or (not value.get("show") and key != "code"):
if field_name == "_type" or (not field.get("show") and field_name != "code"):
continue
# If the type is not transformable to a python base class
# then we need to get the edge that connects to this node
if value.get("type") == "file":
if field.get("type") == "file":
# Load the type in field.get('fileTypes') using
# what is inside field.get('content')
# field.get('value') is the file name
if file_path := value.get("file_path"):
if file_path := field.get("file_path"):
storage_service = get_storage_service()
try:
flow_id, file_name = file_path.split("/")
@@ -308,51 +311,58 @@ class Vertex:
full_path = file_path
else:
raise e
params[key] = full_path
elif value.get("required"):
params[field_name] = full_path
elif field.get("required"):
raise ValueError(f"File path not found for {self.display_name}")
elif value.get("type") in DIRECT_TYPES and params.get(key) is None:
val = value.get("value")
if value.get("type") == "code":
elif field.get("type") in DIRECT_TYPES and params.get(field_name) is None:
val = field.get("value")
if field.get("type") == "code":
try:
params[key] = ast.literal_eval(val) if val else None
params[field_name] = ast.literal_eval(val) if val else None
except Exception:
params[key] = val
elif value.get("type") in ["dict", "NestedDict"]:
params[field_name] = val
elif field.get("type") in ["dict", "NestedDict"]:
# When dict comes from the frontend it comes as a
# list of dicts, so we need to convert it to a dict
# before passing it to the build method
if isinstance(val, list):
params[key] = {k: v for item in value.get("value", []) for k, v in item.items()}
params[field_name] = {k: v for item in field.get("value", []) for k, v in item.items()}
elif isinstance(val, dict):
params[key] = val
elif value.get("type") == "int" and val is not None:
params[field_name] = val
elif field.get("type") == "int" and val is not None:
try:
params[key] = int(val)
params[field_name] = int(val)
except ValueError:
params[key] = val
elif value.get("type") == "float" and val is not None:
params[field_name] = val
elif field.get("type") == "float" and val is not None:
try:
params[key] = float(val)
params[field_name] = float(val)
except ValueError:
params[key] = val
elif value.get("type") == "str" and val is not None:
params[field_name] = val
params[field_name] = val
elif field.get("type") == "str" and val is not None:
# val may contain escaped \n, \t, etc.
# so we need to unescape it
if isinstance(val, list):
params[key] = [unescape_string(v) for v in val]
params[field_name] = [unescape_string(v) for v in val]
elif isinstance(val, str):
params[key] = unescape_string(val)
params[field_name] = unescape_string(val)
elif val is not None and val != "":
params[key] = val
params[field_name] = val
if not value.get("required") and params.get(key) is None:
if value.get("default"):
params[key] = value.get("default")
elif val is not None and val != "":
params[field_name] = val
if field.get("load_from_db"):
load_from_db_fields.append(field_name)
if not field.get("required") and params.get(field_name) is None:
if field.get("default"):
params[field_name] = field.get("default")
else:
params.pop(key, None)
params.pop(field_name, None)
# Add _type to params
self.params = params
self.load_from_db_fields = load_from_db_fields
self._raw_params = params.copy()
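The `int`/`float` branches above cast tolerantly, falling back to the raw value when conversion fails rather than raising. A standalone sketch of that coercion pattern:

```python
def coerce(val, type_name):
    # Mirror the tolerant casting above: fall back to the raw value
    if type_name == "int" and val is not None:
        try:
            return int(val)
        except (TypeError, ValueError):
            return val
    if type_name == "float" and val is not None:
        try:
            return float(val)
        except (TypeError, ValueError):
            return val
    return val
```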
def update_raw_params(self, new_params: Dict[str, str], overwrite: bool = False):

View file

@@ -1,3 +0,0 @@
from .record import docs_to_records, records_to_text
__all__ = ["docs_to_records", "records_to_text"]

View file

@@ -1,4 +1,4 @@
from typing import TYPE_CHECKING, Any, Callable, Coroutine, List, Optional, Tuple, Union
from typing import TYPE_CHECKING, Any, Awaitable, Callable, List, Optional, Tuple, Type, Union, cast
from pydantic.v1 import BaseModel, Field, create_model
from sqlmodel import select
@@ -63,28 +63,30 @@ def find_flow(flow_name: str, user_id: str) -> Optional[str]:
async def run_flow(
inputs: Union[dict, List[dict]] = None,
inputs: Optional[Union[dict, List[dict]]] = None,
tweaks: Optional[dict] = None,
flow_id: Optional[str] = None,
flow_name: Optional[str] = None,
user_id: Optional[str] = None,
) -> Any:
if not user_id:
raise ValueError("Session is invalid")
graph = await load_flow(user_id, flow_id, flow_name, tweaks)
if inputs is None:
inputs = []
inputs_list = []
inputs_list: list[dict[str, str]] = []
inputs_components = []
types = []
for input_dict in inputs:
inputs_list.append({INPUT_FIELD_NAME: input_dict.get("input_value")})
inputs_list.append({INPUT_FIELD_NAME: cast(str, input_dict.get("input_value", ""))})
inputs_components.append(input_dict.get("components", []))
types.append(input_dict.get("type", []))
return await graph.arun(inputs_list, inputs_components=inputs_components, types=types)
def generate_function_for_flow(inputs: List["Vertex"], flow_id: str) -> Coroutine:
def generate_function_for_flow(inputs: List["Vertex"], flow_id: str) -> Callable[..., Awaitable[Any]]:
"""
Generate a dynamic flow function based on the given inputs and flow ID.
@@ -138,12 +140,14 @@ async def flow_function({func_args}):
"""
compiled_func = compile(func_body, "<string>", "exec")
local_scope = {}
local_scope: dict = {}
exec(compiled_func, globals(), local_scope)
return local_scope["flow_function"]
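`generate_function_for_flow` builds a coroutine function at runtime via `compile`/`exec` and retrieves it from the local scope, as above. A minimal, self-contained demonstration of that technique (the function body here is made up):

```python
import asyncio

func_body = '''
async def flow_function(text):
    return text.upper()
'''

compiled_func = compile(func_body, "<string>", "exec")
local_scope: dict = {}
# Executing the compiled def binds the coroutine function into local_scope
exec(compiled_func, globals(), local_scope)
flow_function = local_scope["flow_function"]

result = asyncio.run(flow_function("hello"))
```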
def build_function_and_schema(flow_record: Record, graph: "Graph") -> Tuple[Callable, BaseModel]:
def build_function_and_schema(
flow_record: Record, graph: "Graph"
) -> Tuple[Callable[..., Awaitable[Any]], Type[BaseModel]]:
"""
Builds a dynamic function and schema for a given flow.
@@ -178,7 +182,7 @@ def get_flow_inputs(graph: "Graph") -> List["Vertex"]:
return inputs
def build_schema_from_inputs(name: str, inputs: List[tuple[str, str, str]]) -> BaseModel:
def build_schema_from_inputs(name: str, inputs: List["Vertex"]) -> Type[BaseModel]:
"""
Builds a schema from the given inputs.
@@ -196,4 +200,4 @@ def build_schema_from_inputs(name: str, inputs: List[tuple[str, str, str]]) -> BaseModel:
field_name = input_.display_name.lower().replace(" ", "_")
description = input_.description
fields[field_name] = (str, Field(default="", description=description))
return create_model(name, **fields)
return create_model(name, **fields) # type: ignore
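`build_schema_from_inputs` derives pydantic field names by normalizing each vertex's display name before calling `create_model`. A sketch of just the normalization step (stdlib only):

```python
def schema_field_names(display_names):
    # Same normalization as build_schema_from_inputs above:
    # lowercase and replace spaces with underscores
    return [name.lower().replace(" ", "_") for name in display_names]
```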

View file

@@ -43,8 +43,7 @@ class Component:
def __setattr__(self, key, value):
if key == "_user_id" and hasattr(self, "_user_id"):
warnings.warn("user_id is immutable and cannot be changed.")
else:
super().__setattr__(key, value)
super().__setattr__(key, value)
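Dropping the `else` means the assignment now proceeds even when the warning fires, so `_user_id` is rebound rather than frozen. A toy reproduction of the new behavior:

```python
import warnings

class Guarded:
    def __setattr__(self, key, value):
        if key == "_user_id" and hasattr(self, "_user_id"):
            warnings.warn("user_id is immutable and cannot be changed.")
        super().__setattr__(key, value)  # runs even after the warning

g = Guarded()
g._user_id = "u1"  # first assignment: no warning
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    g._user_id = "u2"  # warns, but the value still changes
```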
@cachedmethod(cache=operator.attrgetter("cache"))
def get_code_tree(self, code: str):
@@ -70,6 +69,12 @@ class Component:
return validate.create_function(self.code, self._function_entrypoint_name)
def build_template_config(self) -> dict:
"""
Builds the template configuration for the custom component.
Returns:
A dictionary representing the template configuration.
"""
if not self.code:
return {}

View file

@@ -14,9 +14,10 @@ from langflow.interface.custom.code_parser.utils import (
extract_union_types_from_generic_alias,
)
from langflow.interface.custom.custom_component.component import Component
from langflow.schema import dotdict
from langflow.schema.schema import Record
from langflow.services.deps import get_credential_service, get_storage_service, session_scope
from langflow.schema import Record
from langflow.schema.dotdict import dotdict
from langflow.services.deps import get_storage_service, get_variable_service, session_scope
from langflow.services.storage.service import StorageService
from langflow.utils import validate
if TYPE_CHECKING:
@@ -26,6 +27,23 @@ if TYPE_CHECKING:
class CustomComponent(Component):
"""
Represents a custom component in Langflow.
Attributes:
display_name (Optional[str]): The display name of the custom component.
description (Optional[str]): The description of the custom component.
code (Optional[str]): The code of the custom component.
field_config (dict): The field configuration of the custom component.
code_class_base_inheritance (ClassVar[str]): The base class name for the custom component.
function_entrypoint_name (ClassVar[str]): The name of the function entrypoint for the custom component.
function (Optional[Callable]): The function associated with the custom component.
repr_value (Optional[Any]): The representation value of the custom component.
user_id (Optional[Union[UUID, str]]): The user ID associated with the custom component.
status (Optional[Any]): The status of the custom component.
_tree (Optional[dict]): The code tree of the custom component.
"""
display_name: Optional[str] = None
"""The display name of the component. Defaults to None."""
description: Optional[str] = None
@@ -88,6 +106,12 @@ class CustomComponent(Component):
_tree: Optional[dict] = None
def __init__(self, **data):
"""
Initializes a new instance of the CustomComponent class.
Args:
**data: Additional keyword arguments to initialize the custom component.
"""
self.cache = TTLCache(maxsize=1024, ttl=60)
super().__init__(**data)
@@ -115,6 +139,12 @@ class CustomComponent(Component):
return self.field_order or list(self.field_config.keys())
def custom_repr(self):
"""
Returns the custom representation of the custom component.
Returns:
str: The custom representation of the custom component.
"""
if self.repr_value == "":
self.repr_value = self.status
if isinstance(self.repr_value, dict):
@@ -126,6 +156,12 @@ class CustomComponent(Component):
return self.repr_value
def build_config(self):
"""
Builds the configuration for the custom component.
Returns:
dict: The configuration for the custom component.
"""
return self.field_config
def update_build_config(
@@ -139,6 +175,12 @@ class CustomComponent(Component):
@property
def tree(self):
"""
Gets the code tree of the custom component.
Returns:
dict: The code tree of the custom component.
"""
return self.get_code_tree(self.code or "")
def to_records(self, data: Any, keys: Optional[List[str]] = None, silent_errors: bool = False) -> List[Record]:
@@ -215,6 +257,12 @@ class CustomComponent(Component):
@property
def get_function_entrypoint_args(self) -> list:
"""
Gets the arguments of the function entrypoint for the custom component.
Returns:
list: The arguments of the function entrypoint.
"""
build_method = self.get_build_method()
if not build_method:
return []
@@ -228,6 +276,12 @@ class CustomComponent(Component):
@cachedmethod(operator.attrgetter("cache"))
def get_build_method(self):
"""
Gets the build method for the custom component.
Returns:
dict: The build method for the custom component.
"""
if not self.code:
return {}
@@ -245,6 +299,12 @@ class CustomComponent(Component):
@property
def get_function_entrypoint_return_type(self) -> List[Any]:
"""
Gets the return type of the function entrypoint for the custom component.
Returns:
List[Any]: The return type of the function entrypoint.
"""
build_method = self.get_build_method()
if not build_method or not build_method.get("has_return"):
return []
@@ -266,6 +326,12 @@ class CustomComponent(Component):
@property
def get_main_class_name(self):
"""
Gets the main class name of the custom component.
Returns:
str: The main class name of the custom component.
"""
if not self.code:
return ""
@@ -284,31 +350,63 @@ class CustomComponent(Component):
@property
def template_config(self):
"""
Gets the template configuration for the custom component.
Returns:
dict: The template configuration for the custom component.
"""
return self.build_template_config()
@property
def keys(self):
def get_credential(name: str):
def variables(self):
"""
Returns a getter that retrieves a variable by name for the current user.
Raises:
ValueError: If the user id is not set (raised when the getter is called).
Returns:
Callable: A function that returns the decrypted variable with the given name.
"""
def get_variable(name: str):
if hasattr(self, "_user_id") and not self._user_id:
raise ValueError(f"User id is not set for {self.__class__.__name__}")
credential_service = get_credential_service() # Get service instance
# Retrieve and decrypt the credential by name for the current user
variable_service = get_variable_service() # Get service instance
# Retrieve and decrypt the variable by name for the current user
with session_scope() as session:
return credential_service.get_credential(user_id=self._user_id or "", name=name, session=session)
return variable_service.get_variable(user_id=self._user_id or "", name=name, session=session)
return get_credential
return get_variable
def list_key_names(self):
"""
Lists the names of the variables for the current user.
Raises:
ValueError: If the user id is not set.
Returns:
List[str]: The names of the variables for the current user.
"""
if hasattr(self, "_user_id") and not self._user_id:
raise ValueError(f"User id is not set for {self.__class__.__name__}")
credential_service = get_credential_service()
variable_service = get_variable_service()
with session_scope() as session:
return credential_service.list_credentials(user_id=self._user_id, session=session)
return variable_service.list_variables(user_id=self._user_id, session=session)
def index(self, value: int = 0):
"""Returns a function that returns the value at the given index in the iterable."""
"""
Returns a function that returns the value at the given index in the iterable.
Args:
value (int): The index value.
Returns:
Callable: A function that returns the value at the given index.
"""
def get_index(iterable: List[Any]):
return iterable[value] if iterable else iterable
@@ -316,14 +414,22 @@ class CustomComponent(Component):
return get_index
def get_function(self):
"""
Gets the function associated with the custom component.
Returns:
Callable: The function associated with the custom component.
"""
return validate.create_function(self.code, self.function_entrypoint_name)
async def load_flow(self, flow_id: str, tweaks: Optional[dict] = None) -> "Graph":
return await load_flow(flow_id, tweaks)
if not self._user_id:
raise ValueError("Session is invalid")
return await load_flow(user_id=self._user_id, flow_id=flow_id, tweaks=tweaks)
async def run_flow(
self,
inputs: Union[dict, List[dict]] = None,
inputs: Optional[Union[dict, List[dict]]] = None,
flow_id: Optional[str] = None,
flow_name: Optional[str] = None,
tweaks: Optional[dict] = None,
@@ -339,4 +445,14 @@ class CustomComponent(Component):
raise ValueError(f"Error listing flows: {e}")
def build(self, *args: Any, **kwargs: Any) -> Any:
"""
Builds the custom component.
Args:
*args: The positional arguments.
**kwargs: The keyword arguments.
Returns:
Any: The result of the build process.
"""
raise NotImplementedError

View file

@@ -3,9 +3,8 @@ import os
import zlib
from pathlib import Path
from loguru import logger
from langflow.interface.custom.custom_component import CustomComponent
from loguru import logger
class CustomComponentPathValueError(ValueError):

View file

@@ -29,8 +29,8 @@ from langflow.utils import validate
from langflow.utils.util import unescape_string
if TYPE_CHECKING:
from langflow.custom import CustomComponent
from langflow.graph.vertex.base import Vertex
from langflow.interface.custom.custom_component import CustomComponent
async def instantiate_class(
@@ -38,7 +38,7 @@ async def instantiate_class(
user_id=None,
) -> Any:
"""Instantiate class from module type and key, and params"""
from langflow.legacy_custom.customs import CUSTOM_NODES
from langflow.interface.custom_lists import CUSTOM_NODES
vertex_type = vertex.vertex_type
base_type = vertex.base_type
@@ -50,7 +50,9 @@
if custom_node := CUSTOM_NODES.get(vertex_type):
if hasattr(custom_node, "initialize"):
return custom_node.initialize(**params)
return custom_node(**params)
if callable(custom_node):
return custom_node(**params)
raise ValueError(f"Custom node {vertex_type} is not callable")
logger.debug(f"Instantiating {vertex_type} of type {base_type}")
if not base_type:
raise ValueError("No base type provided for vertex")
@@ -143,6 +145,21 @@ async def instantiate_based_on_type(
return class_object(**params)
def update_params_with_load_from_db_fields(custom_component: "CustomComponent", params, load_from_db_fields):
# For each field in load_from_db_fields, we will check if it's in the params
# and if it is, we will get the value from custom_component.variables(name)
# and update the params with the value
for field in load_from_db_fields:
if field in params:
try:
key = custom_component.variables(params[field])
params[field] = key if key else params[field]
except Exception as exc:
logger.error(f"Failed to get value for {field} from custom component. Error: {exc}")
pass
return params
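The new `update_params_with_load_from_db_fields` resolves each flagged field through the variable service and deliberately keeps the original value when resolution fails. A self-contained sketch using a plain dict in place of the service:

```python
import logging

def resolve_db_fields(params, load_from_db_fields, lookup):
    # `lookup` stands in for custom_component.variables(); on any
    # failure the original value is kept, as in the function above.
    for field in load_from_db_fields:
        if field in params:
            try:
                value = lookup(params[field])
                params[field] = value if value else params[field]
            except Exception as exc:
                logging.error("Failed to resolve %s: %s", field, exc)
    return params

store = {"OPENAI_API_KEY": "sk-decrypted"}
resolved = resolve_db_fields(
    {"api_key": "OPENAI_API_KEY", "model": "gpt-4"}, ["api_key"], store.__getitem__
)
untouched = resolve_db_fields({"api_key": "MISSING"}, ["api_key"], store.__getitem__)
```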
async def instantiate_custom_component(params, user_id, vertex):
params_copy = params.copy()
class_object: Type["CustomComponent"] = eval_custom_component_code(params_copy.pop("code"))
@@ -152,6 +169,7 @@ async def instantiate_custom_component(params, user_id, vertex):
vertex=vertex,
selected_output_type=vertex.selected_output_type,
)
params_copy = update_params_with_load_from_db_fields(custom_component, params_copy, vertex.load_from_db_fields)
if "retriever" in params_copy and hasattr(params_copy["retriever"], "as_retriever"):
params_copy["retriever"] = params_copy["retriever"].as_retriever()
@@ -168,6 +186,8 @@
custom_repr = custom_component.custom_repr()
if not custom_repr and isinstance(build_result, (dict, Record, str)):
custom_repr = build_result
if not isinstance(custom_repr, str):
custom_repr = str(custom_repr)
return custom_component, build_result, {"repr": custom_repr}

View file

@@ -4,7 +4,6 @@ from loguru import logger
from langflow.interface.base import LangChainTypeCreator
from langflow.interface.custom_lists import llm_type_to_cls_dict
from langflow.services.deps import get_settings_service
from langflow.template.frontend_node.llms import LLMFrontendNode
from langflow.utils.util import build_template_from_class
@@ -34,11 +33,10 @@ class LLMCreator(LangChainTypeCreator):
return None
def to_list(self) -> List[str]:
settings_service = get_settings_service()
return [
llm.__name__
for llm in self.type_to_loader_dict.values()
if llm.__name__ in settings_service.settings.LLMS or settings_service.settings.DEV
# if llm.__name__ in settings_service.settings.LLMS or settings_service.settings.DEV
]

View file

@@ -5,8 +5,6 @@ from langflow.interface.base import LangChainTypeCreator
from langflow.interface.tools.constants import ALL_TOOLS_NAMES, CUSTOM_TOOLS, FILE_TOOLS, OTHER_TOOLS
from langflow.interface.tools.util import get_tool_params
from langflow.legacy_custom import customs
from langflow.interface.tools.util import get_tool_params
from langflow.legacy_custom import customs
from langflow.services.deps import get_settings_service
from langflow.template.field.base import TemplateField
from langflow.template.template.base import Template

View file

@@ -1,7 +1,7 @@
from langflow.template import frontend_node
# These should always be instantiated
CUSTOM_NODES = {
CUSTOM_NODES: dict[str, dict[str, frontend_node.base.FrontendNode]] = {
# "prompts": {
# "ZeroShotPrompt": frontend_node.prompts.ZeroShotPromptNode(),
# },

View file

@@ -125,7 +125,7 @@ class Result(BaseModel):
async def run_graph(
graph: Union["Graph", dict],
graph: "Graph",
flow_id: str,
stream: bool,
session_id: Optional[str] = None,

View file

@@ -5,6 +5,23 @@ class Service(ABC):
name: str
ready: bool = False
def get_schema(self):
"""Build a dictionary listing all methods, their parameters, types, return types and documentation."""
schema = {}
ignore = ["teardown", "set_ready"]
for method in dir(self):
if method.startswith("_") or method in ignore:
continue
func = getattr(self, method)
schema[method] = {
"name": method,
"parameters": func.__annotations__,
"return": func.__annotations__.get("return"),
"documentation": func.__doc__,
}
return schema
def teardown(self):
pass
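The new `Service.get_schema` introspects public methods via `__annotations__` and `__doc__`. A runnable sketch on a toy class (with `get_schema` itself added to the ignore list to keep the output small):

```python
class EchoService:
    """Toy stand-in for a Service subclass."""

    def echo(self, message: str) -> str:
        """Return the message unchanged."""
        return message

    def get_schema(self):
        schema = {}
        ignore = ["teardown", "set_ready", "get_schema"]  # get_schema added here
        for method in dir(self):
            if method.startswith("_") or method in ignore:
                continue
            func = getattr(self, method)
            schema[method] = {
                "name": method,
                "parameters": func.__annotations__,
                "return": func.__annotations__.get("return"),
                "documentation": func.__doc__,
            }
        return schema

schema = EchoService().get_schema()
```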

View file

@@ -1,38 +0,0 @@
from typing import TYPE_CHECKING, Optional, Union
from uuid import UUID
from fastapi import Depends
from sqlmodel import Session, select
from langflow.services.auth import utils as auth_utils
from langflow.services.base import Service
from langflow.services.database.models.credential.model import Credential
from langflow.services.deps import get_session
if TYPE_CHECKING:
from langflow.services.settings.service import SettingsService
class CredentialService(Service):
name = "credential_service"
def __init__(self, settings_service: "SettingsService"):
self.settings_service = settings_service
def get_credential(self, user_id: Union[UUID, str], name: str, session: Session = Depends(get_session)) -> str:
# we get the credential from the database
# credential = session.query(Credential).filter(Credential.user_id == user_id, Credential.name == name).first()
credential = session.exec(
select(Credential).where(Credential.user_id == user_id, Credential.name == name)
).first()
# we decrypt the value
if not credential or not credential.value:
raise ValueError(f"{name} credential not found.")
decrypted = auth_utils.decrypt_api_key(credential.value, settings_service=self.settings_service)
return decrypted
def list_credentials(
self, user_id: Union[UUID, str], session: Session = Depends(get_session)
) -> list[Optional[str]]:
credentials = session.exec(select(Credential).where(Credential.user_id == user_id)).all()
return [credential.name for credential in credentials]

View file

@@ -1,6 +1,6 @@
from .api_key import ApiKey
from .credential import Credential
from .flow import Flow
from .user import User
from .variable import Variable
__all__ = ["Flow", "User", "ApiKey", "Credential"]
__all__ = ["Flow", "User", "ApiKey", "Variable"]

View file

@@ -1,3 +0,0 @@
from .model import Credential, CredentialCreate, CredentialRead, CredentialUpdate
__all__ = ["Credential", "CredentialCreate", "CredentialRead", "CredentialUpdate"]

View file

@@ -1,43 +0,0 @@
from datetime import datetime
from typing import TYPE_CHECKING, Optional
from uuid import UUID, uuid4
from sqlmodel import Field, Relationship, SQLModel
from langflow.services.database.models.credential.schema import CredentialType
if TYPE_CHECKING:
from langflow.services.database.models.user import User
class CredentialBase(SQLModel):
name: Optional[str] = Field(None, description="Name of the credential")
value: Optional[str] = Field(None, description="Encrypted value of the credential")
provider: Optional[str] = Field(None, description="Provider of the credential (e.g OpenAI)")
class Credential(CredentialBase, table=True):
id: Optional[UUID] = Field(default_factory=uuid4, primary_key=True, description="Unique ID for the credential")
# name is unique per user
created_at: datetime = Field(default_factory=datetime.utcnow, description="Creation time of the credential")
updated_at: Optional[datetime] = Field(None, description="Last update time of the credential")
# foreign key to user table
user_id: UUID = Field(description="User ID associated with this credential", foreign_key="user.id")
user: "User" = Relationship(back_populates="credentials")
class CredentialCreate(CredentialBase):
# AcceptedProviders is a custom Enum
provider: CredentialType = Field(description="Provider of the credential (e.g OpenAI)")
class CredentialRead(SQLModel):
id: UUID
name: Optional[str] = Field(None, description="Name of the credential")
provider: Optional[str] = Field(None, description="Provider of the credential (e.g OpenAI)")
class CredentialUpdate(SQLModel):
id: UUID # Include the ID for updating
name: Optional[str] = Field(None, description="Name of the credential")
value: Optional[str] = Field(None, description="Encrypted value of the credential")

View file

@@ -1,8 +0,0 @@
from enum import Enum
class CredentialType(str, Enum):
"""CredentialType is an Enum of the accepted providers"""
OPENAI_API_KEY = "OPENAI_API_KEY"
ANTHROPIC_API_KEY = "ANTHROPIC_API_KEY"

View file

@@ -1,4 +1,4 @@
# Path: src/backend/langflow/database/models/flow.py
# Path: src/backend/langflow/services/database/models/flow/model.py
from datetime import datetime
from typing import TYPE_CHECKING, Dict, Optional

View file

@@ -6,7 +6,7 @@ from sqlmodel import Field, Relationship, SQLModel
if TYPE_CHECKING:
from langflow.services.database.models.api_key import ApiKey
from langflow.services.database.models.credential import Credential
from langflow.services.database.models.variable import Variable
from langflow.services.database.models.flow import Flow
@@ -26,7 +26,7 @@ class User(SQLModel, table=True):
)
store_api_key: Optional[str] = Field(default=None, nullable=True)
flows: list["Flow"] = Relationship(back_populates="user")
credentials: list["Credential"] = Relationship(
variables: list["Variable"] = Relationship(
back_populates="user",
sa_relationship_kwargs={"cascade": "delete"},
)

View file

@@ -0,0 +1,3 @@
from .model import Variable, VariableCreate, VariableRead, VariableUpdate
__all__ = ["Variable", "VariableCreate", "VariableRead", "VariableUpdate"]

View file

@@ -0,0 +1,49 @@
from datetime import datetime, timezone
from typing import TYPE_CHECKING, Optional
from uuid import UUID, uuid4
from sqlmodel import Field, Relationship, SQLModel
if TYPE_CHECKING:
from langflow.services.database.models.user.model import User
def utc_now():
return datetime.now(timezone.utc)
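Unlike the naive `datetime.utcnow` default used by the old `Credential` model, `utc_now` returns a timezone-aware timestamp. The difference in a nutshell:

```python
from datetime import datetime, timezone

def utc_now():
    return datetime.now(timezone.utc)

naive = datetime.now()  # no tzinfo attached: ambiguous across timezones
aware = utc_now()       # explicitly anchored to UTC
```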
class VariableBase(SQLModel):
name: Optional[str] = Field(None, description="Name of the variable")
value: Optional[str] = Field(None, description="Encrypted value of the variable")
type: Optional[str] = Field(None, description="Type of the variable")
class Variable(VariableBase, table=True):
id: Optional[UUID] = Field(
default_factory=uuid4,
primary_key=True,
description="Unique ID for the variable",
)
# name is unique per user
created_at: datetime = Field(default_factory=utc_now, description="Creation time of the variable")
updated_at: Optional[datetime] = Field(None, description="Last update time of the variable")
# foreign key to user table
user_id: UUID = Field(description="User ID associated with this variable", foreign_key="user.id")
user: "User" = Relationship(back_populates="variables")
class VariableCreate(VariableBase):
type: Optional[str] = Field(None, description="Type of the variable")
class VariableRead(SQLModel):
id: UUID
name: Optional[str] = Field(None, description="Name of the variable")
type: Optional[str] = Field(None, description="Type of the variable")
class VariableUpdate(SQLModel):
id: UUID # Include the ID for updating
name: Optional[str] = Field(None, description="Name of the variable")
value: Optional[str] = Field(None, description="Encrypted value of the variable")

View file

@@ -8,35 +8,96 @@ if TYPE_CHECKING:
from langflow.services.cache.service import CacheService
from langflow.services.chat.service import ChatService
from langflow.services.credentials.service import CredentialService
from langflow.services.database.service import DatabaseService
from langflow.services.monitor.service import MonitorService
from langflow.services.plugins.service import PluginService
from langflow.services.session.service import SessionService
from langflow.services.settings.service import SettingsService
from langflow.services.socket.service import SocketIOService
from langflow.services.state.service import StateService
from langflow.services.storage.service import StorageService
from langflow.services.store.service import StoreService
from langflow.services.task.service import TaskService
from langflow.services.variable.service import VariableService
def get_service(service_type: ServiceType):
"""
Retrieves the service instance for the given service type.
Args:
service_type (ServiceType): The type of service to retrieve.
Returns:
Any: The service instance.
"""
return service_manager.get(service_type) # type: ignore
def get_state_service() -> "StateService":
"""
Retrieves the StateService instance from the service manager.
Returns:
The StateService instance.
"""
return service_manager.get(ServiceType.STATE_SERVICE) # type: ignore
def get_socket_service() -> "SocketIOService":
"""
Get the SocketIOService instance from the service manager.
Returns:
SocketIOService: The SocketIOService instance.
"""
return service_manager.get(ServiceType.SOCKETIO_SERVICE) # type: ignore
def get_storage_service() -> "StorageService":
"""
Retrieves the storage service instance.
Returns:
The storage service instance.
"""
return service_manager.get(ServiceType.STORAGE_SERVICE) # type: ignore
def get_credential_service() -> "CredentialService":
return service_manager.get(ServiceType.CREDENTIAL_SERVICE) # type: ignore
def get_variable_service() -> "VariableService":
"""
Retrieves the VariableService instance from the service manager.
Returns:
The VariableService instance.
"""
return service_manager.get(ServiceType.VARIABLE_SERVICE) # type: ignore
def get_plugins_service() -> "PluginService":
"""
Get the PluginService instance from the service manager.
Returns:
PluginService: The PluginService instance.
"""
return service_manager.get(ServiceType.PLUGINS_SERVICE)  # type: ignore
def get_settings_service() -> "SettingsService":
"""
Retrieves the SettingsService instance.
If the service is not yet initialized, it will be initialized before returning.
Returns:
The SettingsService instance.
Raises:
ValueError: If the service cannot be retrieved or initialized.
"""
try:
return service_manager.get(ServiceType.SETTINGS_SERVICE) # type: ignore
except ValueError:
@ -48,10 +109,24 @@ def get_settings_service() -> "SettingsService":
def get_db_service() -> "DatabaseService":
"""
Retrieves the DatabaseService instance from the service manager.
Returns:
The DatabaseService instance.
"""
return service_manager.get(ServiceType.DATABASE_SERVICE) # type: ignore
def get_session() -> Generator["Session", None, None]:
"""
Retrieves a session from the database service.
Yields:
Session: A session object.
"""
db_service = get_db_service()
yield from db_service.get_session()
@ -61,6 +136,10 @@ def session_scope():
"""
Context manager for managing a session scope.
This context manager is used to manage a session scope for database operations.
It ensures that the session is properly committed if no exceptions occur,
and rolled back if an exception is raised.
Yields:
session: The session object.
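The commit-on-success, rollback-on-error behavior described in this docstring can be sketched as a generic context manager. This is a simplified stand-in (the real `session_scope` wraps the database service's SQLModel session; the `FakeSession` class here is purely illustrative):

```python
from contextlib import contextmanager


class FakeSession:
    # Stand-in session that records which lifecycle calls happened.
    def __init__(self):
        self.calls = []

    def commit(self):
        self.calls.append("commit")

    def rollback(self):
        self.calls.append("rollback")

    def close(self):
        self.calls.append("close")


@contextmanager
def session_scope(session):
    # Commit if the block succeeds, roll back if it raises, always close.
    try:
        yield session
        session.commit()
    except Exception:
        session.rollback()
        raise
    finally:
        session.close()


ok = FakeSession()
with session_scope(ok):
    pass  # no error -> commit, then close

failed = FakeSession()
try:
    with session_scope(failed):
        raise RuntimeError("boom")
except RuntimeError:
    pass  # error -> rollback, then close, and the exception propagates
```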
@ -80,24 +159,61 @@ def session_scope():
def get_cache_service() -> "CacheService":
"""
Retrieves the cache service from the service manager.
Returns:
The cache service instance.
"""
return service_manager.get(ServiceType.CACHE_SERVICE) # type: ignore
def get_session_service() -> "SessionService":
"""
Retrieves the session service from the service manager.
Returns:
The session service instance.
"""
return service_manager.get(ServiceType.SESSION_SERVICE) # type: ignore
def get_monitor_service() -> "MonitorService":
"""
Retrieves the MonitorService instance from the service manager.
Returns:
MonitorService: The MonitorService instance.
"""
return service_manager.get(ServiceType.MONITOR_SERVICE) # type: ignore
def get_task_service() -> "TaskService":
"""
Retrieves the TaskService instance from the service manager.
Returns:
The TaskService instance.
"""
return service_manager.get(ServiceType.TASK_SERVICE) # type: ignore
def get_chat_service() -> "ChatService":
"""
Get the chat service instance.
Returns:
ChatService: The chat service instance.
"""
return service_manager.get(ServiceType.CHAT_SERVICE) # type: ignore
def get_store_service() -> "StoreService":
"""
Retrieves the StoreService instance from the service manager.
Returns:
StoreService: The StoreService instance.
"""
return service_manager.get(ServiceType.STORE_SERVICE) # type: ignore

View file

@ -23,7 +23,7 @@ class ServiceFactory:
return self.service_class(*args, **kwargs)
def hash_factory(factory: ServiceFactory) -> str:
def hash_factory(factory: Type[ServiceFactory]) -> str:
return factory.service_class.__name__
@ -38,7 +38,7 @@ def hash_infer_service_types_args(factory_class: Type[ServiceFactory], available
@cached(cache=LRUCache(maxsize=10), key=hash_infer_service_types_args)
def infer_service_types(factory_class: Type[ServiceFactory], available_services=None) -> "ServiceType":
def infer_service_types(factory_class: Type[ServiceFactory], available_services=None) -> list["ServiceType"]:
create_method = factory_class.create
type_hints = get_type_hints(create_method, globalns=available_services)
service_types = []
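The dependency inference above works by reading the type hints on a factory's `create` method. A minimal standalone sketch of the idea, assuming plain classes in place of the real services (the names below are illustrative, not langflow's):

```python
from typing import get_type_hints


class SettingsService: ...


class DatabaseService: ...


class DatabaseServiceFactory:
    # create() declares its dependencies through parameter type hints.
    def create(self, settings_service: SettingsService) -> DatabaseService:
        return DatabaseService()


def infer_dependencies(factory_class):
    # Mirror of infer_service_types: collect the annotated parameter
    # types of create(), skipping the return annotation.
    hints = get_type_hints(factory_class.create)
    return [hint for name, hint in hints.items() if name != "return"]
```

This lets the service manager resolve and construct a factory's dependencies without the factory listing them twice.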

View file

@ -156,4 +156,4 @@ async def log_vertex_build(
}
monitor_service.add_row(table_name="vertex_builds", data=row)
except Exception as e:
logger.error(f"Error logging vertex build: {e}")
logger.exception(f"Error logging vertex build: {e}")

View file

@ -16,7 +16,8 @@ class ServiceType(str, Enum):
TASK_SERVICE = "task_service"
PLUGINS_SERVICE = "plugins_service"
STORE_SERVICE = "store_service"
CREDENTIALS_SERVICE = "credentials_service"
VARIABLE_SERVICE = "variable_service"
STORAGE_SERVICE = "storage_service"
MONITOR_SERVICE = "monitor_service"
SOCKETIO_SERVICE = "socket_service"
STATE_SERVICE = "state_service"

View file

@ -1,5 +1,6 @@
import secrets
from pathlib import Path
from typing import Literal
from loguru import logger
from passlib.context import CryptContext
@ -14,7 +15,7 @@ class AuthSettings(BaseSettings):
# Login settings
CONFIG_DIR: str
SECRET_KEY: SecretStr = Field(
default=None,
default=SecretStr(""),
description="Secret key for JWT. If not provided, a random one will be generated.",
frozen=False,
)
@ -33,13 +34,13 @@ class AuthSettings(BaseSettings):
SUPERUSER: str = DEFAULT_SUPERUSER
SUPERUSER_PASSWORD: str = DEFAULT_SUPERUSER_PASSWORD
REFRESH_SAME_SITE: str = "none"
REFRESH_SAME_SITE: Literal["lax", "strict", "none"] = "none"
"""The SameSite attribute of the refresh token cookie."""
REFRESH_SECURE: bool = True
"""The Secure attribute of the refresh token cookie."""
REFRESH_HTTPONLY: bool = True
"""The HttpOnly attribute of the refresh token cookie."""
ACCESS_SAME_SITE: str = "none"
ACCESS_SAME_SITE: Literal["lax", "strict", "none"] = "none"
"""The SameSite attribute of the access token cookie."""
ACCESS_SECURE: bool = True
"""The Secure attribute of the access token cookie."""
@ -85,9 +86,10 @@ class AuthSettings(BaseSettings):
secret_key_path = Path(config_dir) / "secret_key"
if value:
if value and isinstance(value, SecretStr):
logger.debug("Secret key provided")
write_secret_to_file(secret_key_path, value)
secret_value = value.get_secret_value()
write_secret_to_file(secret_key_path, secret_value)
else:
logger.debug("No secret key provided, generating a random one")
@ -103,4 +105,4 @@ class AuthSettings(BaseSettings):
write_secret_to_file(secret_key_path, value)
logger.debug("Saved secret key")
return value
return value if isinstance(value, SecretStr) else SecretStr(value)

View file

@ -0,0 +1,13 @@
from langflow.services.factory import ServiceFactory
from langflow.services.settings.service import SettingsService
from langflow.services.state.service import InMemoryStateService
class StateServiceFactory(ServiceFactory):
def __init__(self):
super().__init__(InMemoryStateService)
def create(self, settings_service: SettingsService):
return InMemoryStateService(
settings_service,
)

View file

@ -0,0 +1,74 @@
from collections import defaultdict
from threading import Lock
from typing import Callable
from loguru import logger
from langflow.services.base import Service
from langflow.services.settings.service import SettingsService
class StateService(Service):
name = "state_service"
def append_state(self, key, new_state, run_id: str):
raise NotImplementedError
def update_state(self, key, new_state, run_id: str):
raise NotImplementedError
def get_state(self, key, run_id: str):
raise NotImplementedError
def subscribe(self, key, observer: Callable):
raise NotImplementedError
def notify_observers(self, key, new_state):
raise NotImplementedError
class InMemoryStateService(StateService):
def __init__(self, settings_service: SettingsService):
self.settings_service = settings_service
self.states = {}
self.observers = defaultdict(list)
self.lock = Lock()
def append_state(self, key, new_state, run_id: str):
with self.lock:
if run_id not in self.states:
self.states[run_id] = {}
if key not in self.states[run_id]:
self.states[run_id][key] = []
elif not isinstance(self.states[run_id][key], list):
self.states[run_id][key] = [self.states[run_id][key]]
self.states[run_id][key].append(new_state)
self.notify_append_observers(key, new_state)
def update_state(self, key, new_state, run_id: str):
with self.lock:
if run_id not in self.states:
self.states[run_id] = {}
self.states[run_id][key] = new_state
self.notify_observers(key, new_state)
def get_state(self, key, run_id: str):
with self.lock:
return self.states.get(run_id, {}).get(key, "")
def subscribe(self, key, observer: Callable):
with self.lock:
if observer not in self.observers[key]:
self.observers[key].append(observer)
def notify_observers(self, key, new_state):
for callback in self.observers[key]:
callback(key, new_state, append=False)
def notify_append_observers(self, key, new_state):
for callback in self.observers[key]:
try:
callback(key, new_state, append=True)
except Exception as e:
logger.error(f"Error in observer {callback} for key {key}: {e}")
logger.warning("Callbacks not implemented yet")
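The in-memory state service above is a small observer pattern: state is keyed per run, and subscribed callbacks are invoked on every update. A stripped-down sketch of that flow, without the settings service or locking:

```python
from collections import defaultdict


class MiniStateService:
    # Simplified sketch of InMemoryStateService: per-run state plus observers.
    def __init__(self):
        self.states = {}
        self.observers = defaultdict(list)

    def subscribe(self, key, observer):
        if observer not in self.observers[key]:
            self.observers[key].append(observer)

    def update_state(self, key, new_state, run_id):
        self.states.setdefault(run_id, {})[key] = new_state
        # Notify subscribers, matching the (key, state, append) signature.
        for callback in self.observers[key]:
            callback(key, new_state, append=False)

    def get_state(self, key, run_id):
        return self.states.get(run_id, {}).get(key, "")


seen = []
service = MiniStateService()
service.subscribe("status", lambda key, state, append: seen.append((key, state, append)))
service.update_state("status", "running", run_id="run-1")
```

`append_state` follows the same shape but accumulates values in a list and notifies with `append=True`.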

View file

@ -186,7 +186,7 @@ def initialize_services(fix_migration: bool = False, socketio_server=None):
service_manager.register_factory(factory)
except Exception as exc:
logger.exception(exc)
raise RuntimeError("Could not initialize services. Please check your settings.") from exc
logger.error(f"Error initializing {factory}: {exc}")
# Test cache connection
service_manager.get(ServiceType.CACHE_SERVICE)

View file

@ -1,15 +1,15 @@
from typing import TYPE_CHECKING
from langflow.services.credentials.service import CredentialService
from langflow.services.factory import ServiceFactory
from langflow.services.variable.service import VariableService
if TYPE_CHECKING:
from langflow.services.settings.service import SettingsService
class CredentialServiceFactory(ServiceFactory):
class VariableServiceFactory(ServiceFactory):
def __init__(self):
super().__init__(CredentialService)
super().__init__(VariableService)
def create(self, settings_service: "SettingsService"):
return CredentialService(settings_service)
return VariableService(settings_service)

View file

@ -0,0 +1,66 @@
from typing import TYPE_CHECKING, Optional, Union
from uuid import UUID
from fastapi import Depends
from sqlmodel import Session, select
from langflow.services.auth import utils as auth_utils
from langflow.services.base import Service
from langflow.services.database.models.variable.model import Variable
from langflow.services.deps import get_session
if TYPE_CHECKING:
from langflow.services.settings.service import SettingsService
class VariableService(Service):
name = "variable_service"
def __init__(self, settings_service: "SettingsService"):
self.settings_service = settings_service
def get_variable(self, user_id: Union[UUID, str], name: str, session: Session = Depends(get_session)) -> str:
# we get the credential from the database
# credential = session.query(Variable).filter(Variable.user_id == user_id, Variable.name == name).first()
variable = session.exec(select(Variable).where(Variable.user_id == user_id, Variable.name == name)).first()
# we decrypt the value
if not variable or not variable.value:
raise ValueError(f"{name} variable not found.")
decrypted = auth_utils.decrypt_api_key(variable.value, settings_service=self.settings_service)
return decrypted
def list_variables(self, user_id: Union[UUID, str], session: Session = Depends(get_session)) -> list[Optional[str]]:
variables = session.exec(select(Variable).where(Variable.user_id == user_id)).all()
return [variable.name for variable in variables]
def update_variable(
self, user_id: Union[UUID, str], name: str, value: str, session: Session = Depends(get_session)
):
variable = session.exec(select(Variable).where(Variable.user_id == user_id, Variable.name == name)).first()
if not variable:
raise ValueError(f"{name} variable not found.")
encrypted = auth_utils.encrypt_api_key(value, settings_service=self.settings_service)
variable.value = encrypted
session.add(variable)
session.commit()
session.refresh(variable)
return variable
def delete_variable(self, user_id: Union[UUID, str], name: str, session: Session = Depends(get_session)):
variable = session.exec(select(Variable).where(Variable.user_id == user_id, Variable.name == name)).first()
if not variable:
raise ValueError(f"{name} variable not found.")
session.delete(variable)
session.commit()
return variable
def create_variable(
self, user_id: Union[UUID, str], name: str, value: str, session: Session = Depends(get_session)
):
variable = Variable(
user_id=user_id, name=name, value=auth_utils.encrypt_api_key(value, settings_service=self.settings_service)
)
session.add(variable)
session.commit()
session.refresh(variable)
return variable
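The contract of `VariableService` is that values are encrypted before they hit the database and decrypted on read. A dict-backed sketch of that contract, with injectable `encrypt`/`decrypt` callables standing in for the `auth_utils` helpers (the real service uses proper encryption; the reversed-string cipher below is a placeholder, not encryption):

```python
class MiniVariableService:
    # Simplified sketch of VariableService: store encrypted, decrypt on read.
    def __init__(self, encrypt, decrypt):
        self._encrypt = encrypt
        self._decrypt = decrypt
        self._store = {}  # (user_id, name) -> encrypted value

    def create_variable(self, user_id, name, value):
        self._store[(user_id, name)] = self._encrypt(value)

    def get_variable(self, user_id, name):
        if (user_id, name) not in self._store:
            raise ValueError(f"{name} variable not found.")
        return self._decrypt(self._store[(user_id, name)])


# Trivially reversible stand-in cipher, for demonstration only.
svc = MiniVariableService(encrypt=lambda v: v[::-1], decrypt=lambda v: v[::-1])
svc.create_variable("user-1", "OPENAI_API_KEY", "sk-123")
```

The key property is that the stored value never equals the plaintext, and lookups for missing names fail loudly rather than returning an empty value.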

View file

@ -7,6 +7,7 @@ from langflow.field_typing.range_spec import RangeSpec
class TemplateField(BaseModel):
model_config = ConfigDict()
field_type: str = Field(default="str", serialization_alias="type")
"""The type of field this is. Default is a string."""
@ -69,6 +70,8 @@ class TemplateField(BaseModel):
range_spec: Optional[RangeSpec] = Field(default=None, serialization_alias="rangeSpec")
"""Range specification for the field. Defaults to None."""
load_from_db: bool = False
"""Specifies if the field should be loaded from the database. Defaults to False."""
title_case: bool = False
"""Specifies if the field should be displayed in title case. Defaults to True."""

View file

@ -10,10 +10,12 @@ from langflow.template.frontend_node import (
textsplitters,
tools,
vectorstores,
base,
)
__all__ = [
"agents",
"base",
"chains",
"embeddings",
"memories",

View file

@ -1885,13 +1885,13 @@ files = [
[[package]]
name = "httpcore"
version = "1.0.4"
version = "1.0.5"
description = "A minimal low-level HTTP client."
optional = false
python-versions = ">=3.8"
files = [
{file = "httpcore-1.0.4-py3-none-any.whl", hash = "sha256:ac418c1db41bade2ad53ae2f3834a3a0f5ae76b56cf5aa497d2d033384fc7d73"},
{file = "httpcore-1.0.4.tar.gz", hash = "sha256:cb2839ccfcba0d2d3c1131d3c3e26dfc327326fbe7a5dc0dbfe9f6c9151bb022"},
{file = "httpcore-1.0.5-py3-none-any.whl", hash = "sha256:421f18bac248b25d310f3cacd198d55b8e6125c107797b609ff9b7a6ba7991b5"},
{file = "httpcore-1.0.5.tar.gz", hash = "sha256:34a38e2f9291467ee3b44e89dd52615370e152954ba21721378a87b2960f7a61"},
]
[package.dependencies]
@ -1902,7 +1902,7 @@ h11 = ">=0.13,<0.15"
asyncio = ["anyio (>=4.0,<5.0)"]
http2 = ["h2 (>=3,<5)"]
socks = ["socksio (==1.*)"]
trio = ["trio (>=0.22.0,<0.25.0)"]
trio = ["trio (>=0.22.0,<0.26.0)"]
[[package]]
name = "httptools"
@ -2484,17 +2484,16 @@ extended-testing = ["aiosqlite (>=0.19.0,<0.20.0)", "aleph-alpha-client (>=2.15.
[[package]]
name = "langchain-core"
version = "0.1.33"
version = "0.1.35"
description = "Building applications with LLMs through composability"
optional = false
python-versions = "<4.0,>=3.8.1"
files = [
{file = "langchain_core-0.1.33-py3-none-any.whl", hash = "sha256:cee7fbab114c74b7279a92c8a376b40344b0fa3d0f0af3143a858e3b7485bf13"},
{file = "langchain_core-0.1.33.tar.gz", hash = "sha256:545eff3de83cc58231bd2b0c6d672323fc2077b94d326ba1a3219118af1d1a66"},
{file = "langchain_core-0.1.35-py3-none-any.whl", hash = "sha256:9d790446ea211f4cb620886081cc5a5723bc9a2dc90af1f6205aded2ee61bb71"},
{file = "langchain_core-0.1.35.tar.gz", hash = "sha256:862b8415d4deaf4e06833ef826bcef3614d75c3e7fd82b09b1349cc223f02e9a"},
]
[package.dependencies]
anyio = ">=3,<5"
jsonpatch = ">=1.33,<2.0"
langsmith = ">=0.1.0,<0.2.0"
packaging = ">=23.2,<24.0"
@ -2543,13 +2542,13 @@ extended-testing = ["lxml (>=5.1.0,<6.0.0)"]
[[package]]
name = "langsmith"
version = "0.1.31"
version = "0.1.34"
description = "Client library to connect to the LangSmith LLM Tracing and Evaluation Platform."
optional = false
python-versions = "<4.0,>=3.8.1"
files = [
{file = "langsmith-0.1.31-py3-none-any.whl", hash = "sha256:5211a9dc00831db307eb843485a97096484b697b5d2cd1efaac34228e97ca087"},
{file = "langsmith-0.1.31.tar.gz", hash = "sha256:efd54ccd44be7fda911bfdc0ead340473df2fdd07345c7252901834d0c4aa37e"},
{file = "langsmith-0.1.34-py3-none-any.whl", hash = "sha256:1f43e9e1f3985be150ff949136a381e627627be4ce2d8dba6f2d8b9f58273420"},
{file = "langsmith-0.1.34.tar.gz", hash = "sha256:9bd248723b4f2c9a805146a039b001170bdf20c80b6499cc553d260aaf4ac4f5"},
]
[package.dependencies]
@ -4042,28 +4041,28 @@ tests = ["pytest"]
[[package]]
name = "pyasn1"
version = "0.5.1"
version = "0.6.0"
description = "Pure-Python implementation of ASN.1 types and DER/BER/CER codecs (X.208)"
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7"
python-versions = ">=3.8"
files = [
{file = "pyasn1-0.5.1-py2.py3-none-any.whl", hash = "sha256:4439847c58d40b1d0a573d07e3856e95333f1976294494c325775aeca506eb58"},
{file = "pyasn1-0.5.1.tar.gz", hash = "sha256:6d391a96e59b23130a5cfa74d6fd7f388dbbe26cc8f1edf39fdddf08d9d6676c"},
{file = "pyasn1-0.6.0-py2.py3-none-any.whl", hash = "sha256:cca4bb0f2df5504f02f6f8a775b6e416ff9b0b3b16f7ee80b5a3153d9b804473"},
{file = "pyasn1-0.6.0.tar.gz", hash = "sha256:3a35ab2c4b5ef98e17dfdec8ab074046fbda76e281c5a706ccd82328cfc8f64c"},
]
[[package]]
name = "pyasn1-modules"
version = "0.3.0"
version = "0.4.0"
description = "A collection of ASN.1-based protocols modules"
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7"
python-versions = ">=3.8"
files = [
{file = "pyasn1_modules-0.3.0-py2.py3-none-any.whl", hash = "sha256:d3ccd6ed470d9ffbc716be08bd90efbd44d0734bc9303818f7336070984a162d"},
{file = "pyasn1_modules-0.3.0.tar.gz", hash = "sha256:5bd01446b736eb9d31512a30d46c1ac3395d676c6f3cafa4c03eb54b9925631c"},
{file = "pyasn1_modules-0.4.0-py3-none-any.whl", hash = "sha256:be04f15b66c206eed667e0bb5ab27e2b1855ea54a842e5037738099e8ca4ae0b"},
{file = "pyasn1_modules-0.4.0.tar.gz", hash = "sha256:831dbcea1b177b28c9baddf4c6d1013c24c3accd14a1873fffaa6a2e905f17b6"},
]
[package.dependencies]
pyasn1 = ">=0.4.6,<0.6.0"
pyasn1 = ">=0.4.6,<0.7.0"
[[package]]
name = "pycparser"
@ -5283,13 +5282,13 @@ files = [
[[package]]
name = "types-passlib"
version = "1.7.7.20240311"
version = "1.7.7.20240327"
description = "Typing stubs for passlib"
optional = false
python-versions = ">=3.8"
files = [
{file = "types-passlib-1.7.7.20240311.tar.gz", hash = "sha256:287dd27cec5421daf6be5c295f681baf343c146038c8bde4db783bcac1beccb7"},
{file = "types_passlib-1.7.7.20240311-py3-none-any.whl", hash = "sha256:cd44166e9347ae516f4830046cd1673c1ef90a5cc7ddd1356cf8a14892f29249"},
{file = "types-passlib-1.7.7.20240327.tar.gz", hash = "sha256:4cce6a1a3a6afee9fc4728b4d9784300764ac2be747f5bcc01646d904b85f4bb"},
{file = "types_passlib-1.7.7.20240327-py3-none-any.whl", hash = "sha256:3a3b7f4258b71034d2e2f4f307d6810f9904f906cdf375514c8bdbdb28a4ad23"},
]
[[package]]
@ -5644,80 +5643,83 @@ test = ["websockets"]
[[package]]
name = "websockets"
version = "10.4"
version = "12.0"
description = "An implementation of the WebSocket Protocol (RFC 6455 & 7692)"
optional = false
python-versions = ">=3.7"
python-versions = ">=3.8"
files = [
{file = "websockets-10.4-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d58804e996d7d2307173d56c297cf7bc132c52df27a3efaac5e8d43e36c21c48"},
{file = "websockets-10.4-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:bc0b82d728fe21a0d03e65f81980abbbcb13b5387f733a1a870672c5be26edab"},
{file = "websockets-10.4-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:ba089c499e1f4155d2a3c2a05d2878a3428cf321c848f2b5a45ce55f0d7d310c"},
{file = "websockets-10.4-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:33d69ca7612f0ddff3316b0c7b33ca180d464ecac2d115805c044bf0a3b0d032"},
{file = "websockets-10.4-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:62e627f6b6d4aed919a2052efc408da7a545c606268d5ab5bfab4432734b82b4"},
{file = "websockets-10.4-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:38ea7b82bfcae927eeffc55d2ffa31665dc7fec7b8dc654506b8e5a518eb4d50"},
{file = "websockets-10.4-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:e0cb5cc6ece6ffa75baccfd5c02cffe776f3f5c8bf486811f9d3ea3453676ce8"},
{file = "websockets-10.4-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:ae5e95cfb53ab1da62185e23b3130e11d64431179debac6dc3c6acf08760e9b1"},
{file = "websockets-10.4-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:7c584f366f46ba667cfa66020344886cf47088e79c9b9d39c84ce9ea98aaa331"},
{file = "websockets-10.4-cp310-cp310-win32.whl", hash = "sha256:b029fb2032ae4724d8ae8d4f6b363f2cc39e4c7b12454df8df7f0f563ed3e61a"},
{file = "websockets-10.4-cp310-cp310-win_amd64.whl", hash = "sha256:8dc96f64ae43dde92530775e9cb169979f414dcf5cff670455d81a6823b42089"},
{file = "websockets-10.4-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:47a2964021f2110116cc1125b3e6d87ab5ad16dea161949e7244ec583b905bb4"},
{file = "websockets-10.4-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:e789376b52c295c4946403bd0efecf27ab98f05319df4583d3c48e43c7342c2f"},
{file = "websockets-10.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:7d3f0b61c45c3fa9a349cf484962c559a8a1d80dae6977276df8fd1fa5e3cb8c"},
{file = "websockets-10.4-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f55b5905705725af31ccef50e55391621532cd64fbf0bc6f4bac935f0fccec46"},
{file = "websockets-10.4-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:00c870522cdb69cd625b93f002961ffb0c095394f06ba8c48f17eef7c1541f96"},
{file = "websockets-10.4-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f38706e0b15d3c20ef6259fd4bc1700cd133b06c3c1bb108ffe3f8947be15fa"},
{file = "websockets-10.4-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:f2c38d588887a609191d30e902df2a32711f708abfd85d318ca9b367258cfd0c"},
{file = "websockets-10.4-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:fe10ddc59b304cb19a1bdf5bd0a7719cbbc9fbdd57ac80ed436b709fcf889106"},
{file = "websockets-10.4-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:90fcf8929836d4a0e964d799a58823547df5a5e9afa83081761630553be731f9"},
{file = "websockets-10.4-cp311-cp311-win32.whl", hash = "sha256:b9968694c5f467bf67ef97ae7ad4d56d14be2751000c1207d31bf3bb8860bae8"},
{file = "websockets-10.4-cp311-cp311-win_amd64.whl", hash = "sha256:a7a240d7a74bf8d5cb3bfe6be7f21697a28ec4b1a437607bae08ac7acf5b4882"},
{file = "websockets-10.4-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:74de2b894b47f1d21cbd0b37a5e2b2392ad95d17ae983e64727e18eb281fe7cb"},
{file = "websockets-10.4-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e3a686ecb4aa0d64ae60c9c9f1a7d5d46cab9bfb5d91a2d303d00e2cd4c4c5cc"},
{file = "websockets-10.4-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b0d15c968ea7a65211e084f523151dbf8ae44634de03c801b8bd070b74e85033"},
{file = "websockets-10.4-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:00213676a2e46b6ebf6045bc11d0f529d9120baa6f58d122b4021ad92adabd41"},
{file = "websockets-10.4-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:e23173580d740bf8822fd0379e4bf30aa1d5a92a4f252d34e893070c081050df"},
{file = "websockets-10.4-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:dd500e0a5e11969cdd3320935ca2ff1e936f2358f9c2e61f100a1660933320ea"},
{file = "websockets-10.4-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:4239b6027e3d66a89446908ff3027d2737afc1a375f8fd3eea630a4842ec9a0c"},
{file = "websockets-10.4-cp37-cp37m-win32.whl", hash = "sha256:8a5cc00546e0a701da4639aa0bbcb0ae2bb678c87f46da01ac2d789e1f2d2038"},
{file = "websockets-10.4-cp37-cp37m-win_amd64.whl", hash = "sha256:a9f9a735deaf9a0cadc2d8c50d1a5bcdbae8b6e539c6e08237bc4082d7c13f28"},
{file = "websockets-10.4-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:5c1289596042fad2cdceb05e1ebf7aadf9995c928e0da2b7a4e99494953b1b94"},
{file = "websockets-10.4-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:0cff816f51fb33c26d6e2b16b5c7d48eaa31dae5488ace6aae468b361f422b63"},
{file = "websockets-10.4-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:dd9becd5fe29773d140d68d607d66a38f60e31b86df75332703757ee645b6faf"},
{file = "websockets-10.4-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:45ec8e75b7dbc9539cbfafa570742fe4f676eb8b0d3694b67dabe2f2ceed8aa6"},
{file = "websockets-10.4-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4f72e5cd0f18f262f5da20efa9e241699e0cf3a766317a17392550c9ad7b37d8"},
{file = "websockets-10.4-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:185929b4808b36a79c65b7865783b87b6841e852ef5407a2fb0c03381092fa3b"},
{file = "websockets-10.4-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:7d27a7e34c313b3a7f91adcd05134315002aaf8540d7b4f90336beafaea6217c"},
{file = "websockets-10.4-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:884be66c76a444c59f801ac13f40c76f176f1bfa815ef5b8ed44321e74f1600b"},
{file = "websockets-10.4-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:931c039af54fc195fe6ad536fde4b0de04da9d5916e78e55405436348cfb0e56"},
{file = "websockets-10.4-cp38-cp38-win32.whl", hash = "sha256:db3c336f9eda2532ec0fd8ea49fef7a8df8f6c804cdf4f39e5c5c0d4a4ad9a7a"},
{file = "websockets-10.4-cp38-cp38-win_amd64.whl", hash = "sha256:48c08473563323f9c9debac781ecf66f94ad5a3680a38fe84dee5388cf5acaf6"},
{file = "websockets-10.4-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:40e826de3085721dabc7cf9bfd41682dadc02286d8cf149b3ad05bff89311e4f"},
{file = "websockets-10.4-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:56029457f219ade1f2fc12a6504ea61e14ee227a815531f9738e41203a429112"},
{file = "websockets-10.4-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:f5fc088b7a32f244c519a048c170f14cf2251b849ef0e20cbbb0fdf0fdaf556f"},
{file = "websockets-10.4-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2fc8709c00704194213d45e455adc106ff9e87658297f72d544220e32029cd3d"},
{file = "websockets-10.4-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0154f7691e4fe6c2b2bc275b5701e8b158dae92a1ab229e2b940efe11905dff4"},
{file = "websockets-10.4-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4c6d2264f485f0b53adf22697ac11e261ce84805c232ed5dbe6b1bcb84b00ff0"},
{file = "websockets-10.4-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:9bc42e8402dc5e9905fb8b9649f57efcb2056693b7e88faa8fb029256ba9c68c"},
{file = "websockets-10.4-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:edc344de4dac1d89300a053ac973299e82d3db56330f3494905643bb68801269"},
{file = "websockets-10.4-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:84bc2a7d075f32f6ed98652db3a680a17a4edb21ca7f80fe42e38753a58ee02b"},
{file = "websockets-10.4-cp39-cp39-win32.whl", hash = "sha256:c94ae4faf2d09f7c81847c63843f84fe47bf6253c9d60b20f25edfd30fb12588"},
{file = "websockets-10.4-cp39-cp39-win_amd64.whl", hash = "sha256:bbccd847aa0c3a69b5f691a84d2341a4f8a629c6922558f2a70611305f902d74"},
{file = "websockets-10.4-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:82ff5e1cae4e855147fd57a2863376ed7454134c2bf49ec604dfe71e446e2193"},
{file = "websockets-10.4-pp37-pypy37_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d210abe51b5da0ffdbf7b43eed0cfdff8a55a1ab17abbec4301c9ff077dd0342"},
{file = "websockets-10.4-pp37-pypy37_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:942de28af58f352a6f588bc72490ae0f4ccd6dfc2bd3de5945b882a078e4e179"},
{file = "websockets-10.4-pp37-pypy37_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c9b27d6c1c6cd53dc93614967e9ce00ae7f864a2d9f99fe5ed86706e1ecbf485"},
{file = "websockets-10.4-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:3d3cac3e32b2c8414f4f87c1b2ab686fa6284a980ba283617404377cd448f631"},
{file = "websockets-10.4-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:da39dd03d130162deb63da51f6e66ed73032ae62e74aaccc4236e30edccddbb0"},
{file = "websockets-10.4-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:389f8dbb5c489e305fb113ca1b6bdcdaa130923f77485db5b189de343a179393"},
{file = "websockets-10.4-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:09a1814bb15eff7069e51fed0826df0bc0702652b5cb8f87697d469d79c23576"},
{file = "websockets-10.4-pp38-pypy38_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ff64a1d38d156d429404aaa84b27305e957fd10c30e5880d1765c9480bea490f"},
{file = "websockets-10.4-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:b343f521b047493dc4022dd338fc6db9d9282658862756b4f6fd0e996c1380e1"},
{file = "websockets-10.4-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:932af322458da7e4e35df32f050389e13d3d96b09d274b22a7aa1808f292fee4"},
{file = "websockets-10.4-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d6a4162139374a49eb18ef5b2f4da1dd95c994588f5033d64e0bbfda4b6b6fcf"},
{file = "websockets-10.4-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c57e4c1349fbe0e446c9fa7b19ed2f8a4417233b6984277cce392819123142d3"},
{file = "websockets-10.4-pp39-pypy39_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b627c266f295de9dea86bd1112ed3d5fafb69a348af30a2422e16590a8ecba13"},
{file = "websockets-10.4-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:05a7233089f8bd355e8cbe127c2e8ca0b4ea55467861906b80d2ebc7db4d6b72"},
{file = "websockets-10.4.tar.gz", hash = "sha256:eef610b23933c54d5d921c92578ae5f89813438fded840c2e9809d378dc765d3"},
{file = "websockets-12.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d554236b2a2006e0ce16315c16eaa0d628dab009c33b63ea03f41c6107958374"},
{file = "websockets-12.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:2d225bb6886591b1746b17c0573e29804619c8f755b5598d875bb4235ea639be"},
{file = "websockets-12.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:eb809e816916a3b210bed3c82fb88eaf16e8afcf9c115ebb2bacede1797d2547"},
{file = "websockets-12.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c588f6abc13f78a67044c6b1273a99e1cf31038ad51815b3b016ce699f0d75c2"},
{file = "websockets-12.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5aa9348186d79a5f232115ed3fa9020eab66d6c3437d72f9d2c8ac0c6858c558"},
{file = "websockets-12.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6350b14a40c95ddd53e775dbdbbbc59b124a5c8ecd6fbb09c2e52029f7a9f480"},
{file = "websockets-12.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:70ec754cc2a769bcd218ed8d7209055667b30860ffecb8633a834dde27d6307c"},
{file = "websockets-12.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:6e96f5ed1b83a8ddb07909b45bd94833b0710f738115751cdaa9da1fb0cb66e8"},
{file = "websockets-12.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:4d87be612cbef86f994178d5186add3d94e9f31cc3cb499a0482b866ec477603"},
{file = "websockets-12.0-cp310-cp310-win32.whl", hash = "sha256:befe90632d66caaf72e8b2ed4d7f02b348913813c8b0a32fae1cc5fe3730902f"},
{file = "websockets-12.0-cp310-cp310-win_amd64.whl", hash = "sha256:363f57ca8bc8576195d0540c648aa58ac18cf85b76ad5202b9f976918f4219cf"},
{file = "websockets-12.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:5d873c7de42dea355d73f170be0f23788cf3fa9f7bed718fd2830eefedce01b4"},
{file = "websockets-12.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3f61726cae9f65b872502ff3c1496abc93ffbe31b278455c418492016e2afc8f"},
{file = "websockets-12.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:ed2fcf7a07334c77fc8a230755c2209223a7cc44fc27597729b8ef5425aa61a3"},
{file = "websockets-12.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8e332c210b14b57904869ca9f9bf4ca32f5427a03eeb625da9b616c85a3a506c"},
{file = "websockets-12.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5693ef74233122f8ebab026817b1b37fe25c411ecfca084b29bc7d6efc548f45"},
{file = "websockets-12.0-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6e9e7db18b4539a29cc5ad8c8b252738a30e2b13f033c2d6e9d0549b45841c04"},
{file = "websockets-12.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:6e2df67b8014767d0f785baa98393725739287684b9f8d8a1001eb2839031447"},
{file = "websockets-12.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:bea88d71630c5900690fcb03161ab18f8f244805c59e2e0dc4ffadae0a7ee0ca"},
{file = "websockets-12.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:dff6cdf35e31d1315790149fee351f9e52978130cef6c87c4b6c9b3baf78bc53"},
{file = "websockets-12.0-cp311-cp311-win32.whl", hash = "sha256:3e3aa8c468af01d70332a382350ee95f6986db479ce7af14d5e81ec52aa2b402"},
{file = "websockets-12.0-cp311-cp311-win_amd64.whl", hash = "sha256:25eb766c8ad27da0f79420b2af4b85d29914ba0edf69f547cc4f06ca6f1d403b"},
{file = "websockets-12.0-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0e6e2711d5a8e6e482cacb927a49a3d432345dfe7dea8ace7b5790df5932e4df"},
{file = "websockets-12.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:dbcf72a37f0b3316e993e13ecf32f10c0e1259c28ffd0a85cee26e8549595fbc"},
{file = "websockets-12.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:12743ab88ab2af1d17dd4acb4645677cb7063ef4db93abffbf164218a5d54c6b"},
{file = "websockets-12.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7b645f491f3c48d3f8a00d1fce07445fab7347fec54a3e65f0725d730d5b99cb"},
{file = "websockets-12.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9893d1aa45a7f8b3bc4510f6ccf8db8c3b62120917af15e3de247f0780294b92"},
{file = "websockets-12.0-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1f38a7b376117ef7aff996e737583172bdf535932c9ca021746573bce40165ed"},
{file = "websockets-12.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:f764ba54e33daf20e167915edc443b6f88956f37fb606449b4a5b10ba42235a5"},
{file = "websockets-12.0-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:1e4b3f8ea6a9cfa8be8484c9221ec0257508e3a1ec43c36acdefb2a9c3b00aa2"},
{file = "websockets-12.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:9fdf06fd06c32205a07e47328ab49c40fc1407cdec801d698a7c41167ea45113"},
{file = "websockets-12.0-cp312-cp312-win32.whl", hash = "sha256:baa386875b70cbd81798fa9f71be689c1bf484f65fd6fb08d051a0ee4e79924d"},
{file = "websockets-12.0-cp312-cp312-win_amd64.whl", hash = "sha256:ae0a5da8f35a5be197f328d4727dbcfafa53d1824fac3d96cdd3a642fe09394f"},
{file = "websockets-12.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:5f6ffe2c6598f7f7207eef9a1228b6f5c818f9f4d53ee920aacd35cec8110438"},
{file = "websockets-12.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:9edf3fc590cc2ec20dc9d7a45108b5bbaf21c0d89f9fd3fd1685e223771dc0b2"},
{file = "websockets-12.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:8572132c7be52632201a35f5e08348137f658e5ffd21f51f94572ca6c05ea81d"},
{file = "websockets-12.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:604428d1b87edbf02b233e2c207d7d528460fa978f9e391bd8aaf9c8311de137"},
{file = "websockets-12.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1a9d160fd080c6285e202327aba140fc9a0d910b09e423afff4ae5cbbf1c7205"},
{file = "websockets-12.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:87b4aafed34653e465eb77b7c93ef058516cb5acf3eb21e42f33928616172def"},
{file = "websockets-12.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:b2ee7288b85959797970114deae81ab41b731f19ebcd3bd499ae9ca0e3f1d2c8"},
{file = "websockets-12.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:7fa3d25e81bfe6a89718e9791128398a50dec6d57faf23770787ff441d851967"},
{file = "websockets-12.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:a571f035a47212288e3b3519944f6bf4ac7bc7553243e41eac50dd48552b6df7"},
{file = "websockets-12.0-cp38-cp38-win32.whl", hash = "sha256:3c6cc1360c10c17463aadd29dd3af332d4a1adaa8796f6b0e9f9df1fdb0bad62"},
{file = "websockets-12.0-cp38-cp38-win_amd64.whl", hash = "sha256:1bf386089178ea69d720f8db6199a0504a406209a0fc23e603b27b300fdd6892"},
{file = "websockets-12.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:ab3d732ad50a4fbd04a4490ef08acd0517b6ae6b77eb967251f4c263011a990d"},
{file = "websockets-12.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:a1d9697f3337a89691e3bd8dc56dea45a6f6d975f92e7d5f773bc715c15dde28"},
{file = "websockets-12.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:1df2fbd2c8a98d38a66f5238484405b8d1d16f929bb7a33ed73e4801222a6f53"},
{file = "websockets-12.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:23509452b3bc38e3a057382c2e941d5ac2e01e251acce7adc74011d7d8de434c"},
{file = "websockets-12.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2e5fc14ec6ea568200ea4ef46545073da81900a2b67b3e666f04adf53ad452ec"},
{file = "websockets-12.0-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:46e71dbbd12850224243f5d2aeec90f0aaa0f2dde5aeeb8fc8df21e04d99eff9"},
{file = "websockets-12.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:b81f90dcc6c85a9b7f29873beb56c94c85d6f0dac2ea8b60d995bd18bf3e2aae"},
{file = "websockets-12.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:a02413bc474feda2849c59ed2dfb2cddb4cd3d2f03a2fedec51d6e959d9b608b"},
{file = "websockets-12.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:bbe6013f9f791944ed31ca08b077e26249309639313fff132bfbf3ba105673b9"},
{file = "websockets-12.0-cp39-cp39-win32.whl", hash = "sha256:cbe83a6bbdf207ff0541de01e11904827540aa069293696dd528a6640bd6a5f6"},
{file = "websockets-12.0-cp39-cp39-win_amd64.whl", hash = "sha256:fc4e7fa5414512b481a2483775a8e8be7803a35b30ca805afa4998a84f9fd9e8"},
{file = "websockets-12.0-pp310-pypy310_pp73-macosx_10_9_x86_64.whl", hash = "sha256:248d8e2446e13c1d4326e0a6a4e9629cb13a11195051a73acf414812700badbd"},
{file = "websockets-12.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f44069528d45a933997a6fef143030d8ca8042f0dfaad753e2906398290e2870"},
{file = "websockets-12.0-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c4e37d36f0d19f0a4413d3e18c0d03d0c268ada2061868c1e6f5ab1a6d575077"},
{file = "websockets-12.0-pp310-pypy310_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3d829f975fc2e527a3ef2f9c8f25e553eb7bc779c6665e8e1d52aa22800bb38b"},
{file = "websockets-12.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:2c71bd45a777433dd9113847af751aae36e448bc6b8c361a566cb043eda6ec30"},
{file = "websockets-12.0-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:0bee75f400895aef54157b36ed6d3b308fcab62e5260703add87f44cee9c82a6"},
{file = "websockets-12.0-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:423fc1ed29f7512fceb727e2d2aecb952c46aa34895e9ed96071821309951123"},
{file = "websockets-12.0-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:27a5e9964ef509016759f2ef3f2c1e13f403725a5e6a1775555994966a66e931"},
{file = "websockets-12.0-pp38-pypy38_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c3181df4583c4d3994d31fb235dc681d2aaad744fbdbf94c4802485ececdecf2"},
{file = "websockets-12.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:b067cb952ce8bf40115f6c19f478dc71c5e719b7fbaa511359795dfd9d1a6468"},
{file = "websockets-12.0-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:00700340c6c7ab788f176d118775202aadea7602c5cc6be6ae127761c16d6b0b"},
{file = "websockets-12.0-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e469d01137942849cff40517c97a30a93ae79917752b34029f0ec72df6b46399"},
{file = "websockets-12.0-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ffefa1374cd508d633646d51a8e9277763a9b78ae71324183693959cf94635a7"},
{file = "websockets-12.0-pp39-pypy39_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba0cab91b3956dfa9f512147860783a1829a8d905ee218a9837c18f683239611"},
{file = "websockets-12.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:2cb388a5bfb56df4d9a406783b7f9dbefb888c09b71629351cc6b036e9259370"},
{file = "websockets-12.0-py3-none-any.whl", hash = "sha256:dc284bbc8d7c78a6c69e0c7325ab46ee5e40bb4d50e494d8131a07ef47500e9e"},
{file = "websockets-12.0.tar.gz", hash = "sha256:81df9cbcbb6c260de1e007e58c011bfebe2dafc8435107b0537f393dd38c8b1b"},
]
[[package]]
@ -6041,4 +6043,4 @@ local = []
[metadata]
lock-version = "2.0"
python-versions = ">=3.10,<3.12"
content-hash = "171d9e4a943b34877aa2e6084aca150cf7835da7d7cc87b8f2d7d1c07801412e"
content-hash = "95ecb4112719931edb680b2749b1302f99c76ae15fe6d4be0eed972d73df8aa1"

View file

@ -17,10 +17,11 @@ repository = "https://github.com/logspace-ai/langflow"
license = "MIT"
readme = "README.md"
keywords = ["nlp", "langchain", "openai", "gpt", "gui"]
packages = [{ include = "langflow" }]
packages = [{ include = "langflow" }, { include = "langflow/py.typed" }]
include = ["pyproject.toml", "README.md", "langflow/**/*"]
documentation = "https://docs.langflow.org"
[tool.poetry.scripts]
langflow-base = "langflow.__main__:main"
@ -37,7 +38,7 @@ rich = "^13.7.0"
langchain-experimental = "*"
pydantic = "^2.5.0"
pydantic-settings = "^2.1.0"
websockets = "^10.3"
websockets = "*"
typer = "^0.9.0"
cachetools = "^5.3.1"
platformdirs = "^4.2.0"
@ -106,6 +107,11 @@ filterwarnings = ["ignore::DeprecationWarning"]
log_cli = true
markers = ["async_test"]
[tool.mypy]
namespace_packages = true
mypy_path = "langflow"
ignore_missing_imports = true
[tool.ruff]
exclude = ["src/backend/langflow/alembic/*"]

View file

@ -1,70 +0,0 @@
from typing import List, Union
from langchain.agents import AgentExecutor, BaseMultiActionAgent, BaseSingleActionAgent
from langflow import CustomComponent
from langflow.field_typing import BaseMemory, Text, Tool
class LCAgentComponent(CustomComponent):
def build_config(self):
return {
"lc": {
"display_name": "LangChain",
"info": "The LangChain to interact with.",
},
"handle_parsing_errors": {
"display_name": "Handle Parsing Errors",
"info": "If True, the agent will handle parsing errors. If False, the agent will raise an error.",
"advanced": True,
},
"output_key": {
"display_name": "Output Key",
"info": "The key to use to get the output from the agent.",
"advanced": True,
},
"memory": {
"display_name": "Memory",
"info": "Memory to use for the agent.",
},
"tools": {
"display_name": "Tools",
"info": "Tools the agent can use.",
},
"input_value": {
"display_name": "Input",
"info": "Input text to pass to the agent.",
},
}
async def run_agent(
self,
agent: Union[BaseSingleActionAgent, BaseMultiActionAgent, AgentExecutor],
inputs: str,
input_variables: list[str],
tools: List[Tool],
memory: Union[BaseMemory, None] = None,
handle_parsing_errors: bool = True,
output_key: str = "output",
) -> Text:
if isinstance(agent, AgentExecutor):
runnable = agent
else:
runnable = AgentExecutor.from_agent_and_tools(
agent=agent, tools=tools, verbose=True, memory=memory, handle_parsing_errors=handle_parsing_errors
)
input_dict = {"input": inputs}
for var in input_variables:
if var not in ["agent_scratchpad", "input"]:
input_dict[var] = ""
result = await runnable.ainvoke(input_dict)
self.status = result
if output_key in result:
return result.get(output_key)
elif "output" not in result:
if output_key != "output":
raise ValueError(f"Output key not found in result. Tried '{output_key}' and 'output'.")
else:
raise ValueError("Output key not found in result. Tried 'output'.")
return result.get("output")
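The output-key fallback at the end of `run_agent` can be isolated into a small helper. This is a hedged stdlib sketch; the function name `extract_output` is an assumption for illustration, not part of the codebase:

```python
def extract_output(result: dict, output_key: str = "output") -> str:
    """Return result[output_key], falling back to result["output"];
    otherwise raise ValueError naming every key tried, mirroring run_agent."""
    if output_key in result:
        return result[output_key]
    if "output" not in result:
        if output_key != "output":
            raise ValueError(f"Output key not found in result. Tried '{output_key}' and 'output'.")
        raise ValueError("Output key not found in result. Tried 'output'.")
    return result["output"]
```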

View file

@ -1,3 +0,0 @@
from .model import LCModelComponent
__all__ = ["LCModelComponent"]

View file

@ -1,48 +0,0 @@
from typing import Optional
from langchain_core.language_models.chat_models import BaseChatModel
from langchain_core.language_models.llms import LLM
from langchain_core.messages import HumanMessage, SystemMessage
from langflow import CustomComponent
class LCModelComponent(CustomComponent):
display_name: str = "Model Name"
description: str = "Model Description"
def get_result(self, runnable: LLM, stream: bool, input_value: str):
"""
Retrieves the result from the output of a Runnable object.
Args:
runnable (LLM): The runnable object to retrieve the result from.
stream (bool): Indicates whether to use streaming or invocation mode.
input_value (str): The input value to pass to the runnable.
Returns:
The result obtained from the runnable.
"""
if stream:
result = runnable.stream(input_value)
else:
message = runnable.invoke(input_value)
result = message.content if hasattr(message, "content") else message
self.status = result
return result
def get_chat_result(
self, runnable: BaseChatModel, stream: bool, input_value: str, system_message: Optional[str] = None
):
messages = []
if input_value:
messages.append(HumanMessage(input_value))
if system_message:
messages.append(SystemMessage(system_message))
if stream:
result = runnable.stream(messages)
else:
message = runnable.invoke(messages)
result = message.content
self.status = result
return result
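The branching in `get_result` above — stream the runnable, or invoke it once and unwrap `.content` when present — can be sketched with a stand-in model. The `FakeMessage`/`FakeModel` classes below are assumptions for illustration, not LangChain types:

```python
class FakeMessage:
    """Minimal stand-in for a chat message carrying .content."""
    def __init__(self, content: str):
        self.content = content

class FakeModel:
    """Stand-in for an LLM/chat model exposing invoke() and stream()."""
    def invoke(self, value):
        return FakeMessage(f"echo: {value}")
    def stream(self, value):
        for token in value.split():
            yield FakeMessage(token)

def get_result(runnable, stream: bool, input_value: str):
    # Mirrors LCModelComponent.get_result: streaming returns the iterator
    # as-is; invocation unwraps .content when the message has one.
    if stream:
        return runnable.stream(input_value)
    message = runnable.invoke(input_value)
    return message.content if hasattr(message, "content") else message
```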

View file

@ -1,85 +0,0 @@
from typing import Any, List, Optional, Text
from langchain_core.tools import StructuredTool
from loguru import logger
from langflow import CustomComponent
from langflow.field_typing import Tool
from langflow.graph.graph.base import Graph
from langflow.helpers.flow import build_function_and_schema
from langflow.schema.dotdict import dotdict
class FlowToolComponent(CustomComponent):
display_name = "Flow as Tool"
description = "Construct a Tool from a function that runs the loaded Flow."
field_order = ["flow_name", "name", "description", "return_direct"]
def get_flow_names(self) -> List[str]:
flow_records = self.list_flows()
return [flow_record.data["name"] for flow_record in flow_records]
def get_flow(self, flow_name: str) -> Optional[Text]:
"""
Retrieves a flow by its name.
Args:
flow_name (str): The name of the flow to retrieve.
Returns:
Optional[Text]: The flow record if found, None otherwise.
"""
flow_records = self.list_flows()
for flow_record in flow_records:
if flow_record.data["name"] == flow_name:
return flow_record
return None
def update_build_config(self, build_config: dotdict, field_value: Any, field_name: str | None = None):
logger.debug(f"Updating build config with field value {field_value} and field name {field_name}")
if field_name == "flow_name":
build_config["flow_name"]["options"] = self.get_flow_names()
return build_config
def build_config(self):
return {
"flow_name": {
"display_name": "Flow Name",
"info": "The name of the flow to run.",
"options": [],
"real_time_refresh": True,
"refresh_button": True,
},
"name": {
"display_name": "Name",
"description": "The name of the tool.",
},
"description": {
"display_name": "Description",
"description": "The description of the tool.",
},
"return_direct": {
"display_name": "Return Direct",
"description": "Return the result directly from the Tool.",
"advanced": True,
},
}
async def build(self, flow_name: str, name: str, description: str, return_direct: bool = False) -> Tool:
flow_record = self.get_flow(flow_name)
if not flow_record:
raise ValueError("Flow not found.")
graph = Graph.from_payload(flow_record.data["data"])
dynamic_flow_function, schema = build_function_and_schema(flow_record, graph)
tool = StructuredTool.from_function(
coroutine=dynamic_flow_function,
name=name,
description=description,
return_direct=return_direct,
args_schema=schema,
)
description_repr = repr(tool.description).strip("'")
args_str = "\n".join([f"- {arg_name}: {arg_data['description']}" for arg_name, arg_data in tool.args.items()])
self.status = f"{description_repr}\nArguments:\n{args_str}"
return tool
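The status string assembled just before the return — `repr()` of the description with its outer quotes stripped, then one `- name: description` line per argument — can be sketched as a standalone helper (the name `format_tool_status` is an assumption for illustration):

```python
def format_tool_status(description: str, args: dict) -> str:
    """Build the status string the way FlowToolComponent does."""
    description_repr = repr(description).strip("'")
    args_str = "\n".join(
        f"- {arg_name}: {arg_data['description']}"
        for arg_name, arg_data in args.items()
    )
    return f"{description_repr}\nArguments:\n{args_str}"
```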

View file

@ -1,25 +0,0 @@
from langflow.custom import CustomComponent
class SchemaComponent(CustomComponent):
display_name = "Schema"
description = "Construct a Schema from a list of fields."
def build_config(self):
return {
"fields": {
"display_name": "Fields",
"info": "The fields to include in the schema.",
},
"name": {
"display_name": "Name",
"info": "The name of the schema.",
},
}
def build(self, name: str, fields: list[dict]):
# The idea for this component is to use create_model from pydantic to create a schema
# from a list of fields. This will be useful for creating schemas for the flow tool.
pass
# field is a simple list of dictionaries with the field name and
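The comment above sketches the intended approach: feed a list of field dicts to pydantic's `create_model` to build a schema class. A minimal sketch of that idea, assuming pydantic v2 (which the project pins); the field-dict shape `{"name": ..., "type": ...}` is an assumption, since the original leaves the component unfinished:

```python
from pydantic import create_model

def build_schema(name: str, fields: list[dict]):
    # Each field dict is assumed to carry a field name and a Python type;
    # (type, ...) marks the field as required in create_model.
    model_fields = {f["name"]: (f["type"], ...) for f in fields}
    return create_model(name, **model_fields)
```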

View file

@ -1,37 +0,0 @@
from langchain_community.tools.searchapi import SearchAPIRun
from langchain_community.utilities.searchapi import SearchApiAPIWrapper
from langflow import CustomComponent
from langflow.field_typing import Tool
class SearchApiToolComponent(CustomComponent):
display_name: str = "SearchApi Tool"
description: str = "Real-time search engine results API."
documentation: str = "https://www.searchapi.io/docs/google"
field_config = {
"engine": {
"display_name": "Engine",
"field_type": "str",
"info": "The search engine to use.",
},
"api_key": {
"display_name": "API Key",
"field_type": "str",
"required": True,
"password": True,
"info": "The API key to use SearchApi.",
},
}
def build(
self,
engine: str,
api_key: str,
) -> Tool:
search_api_wrapper = SearchApiAPIWrapper(engine=engine, searchapi_api_key=api_key)
tool = SearchAPIRun(api_wrapper=search_api_wrapper)
self.status = tool
return tool

View file

@ -0,0 +1 @@
from .version import __version__ # noqa: F401

View file

@ -3,6 +3,5 @@ from importlib import metadata
try:
__version__ = metadata.version(__package__)
except metadata.PackageNotFoundError:
# Case where package metadata is not available.
__version__ = ""
del metadata # optional, avoids polluting the results of dir(__package__)
del metadata

File diff suppressed because it is too large

View file

@ -37,6 +37,7 @@
"base64-js": "^1.5.1",
"class-variance-authority": "^0.6.1",
"clsx": "^1.2.1",
"cmdk": "^1.0.0",
"dompurify": "^3.0.5",
"esbuild": "^0.17.19",
"framer-motion": "^11.0.6",

View file

@ -15,11 +15,12 @@ import {
FETCH_ERROR_MESSAGE,
} from "./constants/constants";
import { AuthContext } from "./contexts/authContext";
import { getHealth } from "./controllers/API";
import { getGlobalVariables, getHealth } from "./controllers/API";
import Router from "./routes";
import useAlertStore from "./stores/alertStore";
import { useDarkStore } from "./stores/darkStore";
import useFlowsManagerStore from "./stores/flowsManagerStore";
import { useGlobalVariablesStore } from "./stores/globalVariables";
import { useStoreStore } from "./stores/storeStore";
import { useTypesStore } from "./stores/typesStore";
@ -43,6 +44,9 @@ export default function App() {
const getTypes = useTypesStore((state) => state.getTypes);
const refreshVersion = useDarkStore((state) => state.refreshVersion);
const refreshStars = useDarkStore((state) => state.refreshStars);
const setGlobalVariables = useGlobalVariablesStore(
(state) => state.setGlobalVariables
);
const checkHasStore = useStoreStore((state) => state.checkHasStore);
const navigate = useNavigate();
@ -58,6 +62,9 @@ export default function App() {
getTypes().then(() => {
refreshFlows();
});
getGlobalVariables().then((res) => {
setGlobalVariables(res);
});
checkHasStore();
fetchApiData();
}

View file

@ -6,9 +6,9 @@ import CodeAreaComponent from "../../../../components/codeAreaComponent";
import DictComponent from "../../../../components/dictComponent";
import Dropdown from "../../../../components/dropdownComponent";
import FloatComponent from "../../../../components/floatComponent";
import IconComponent from "../../../../components/genericIconComponent";
import InputComponent from "../../../../components/inputComponent";
import { default as IconComponent } from "../../../../components/genericIconComponent";
import InputFileComponent from "../../../../components/inputFileComponent";
import InputGlobalComponent from "../../../../components/inputGlobalComponent";
import InputListComponent from "../../../../components/inputListComponent";
import IntComponent from "../../../../components/intComponent";
import KeypairListComponent from "../../../../components/keypairListComponent";
@ -71,6 +71,7 @@ export default function ParameterComponent({
const nodes = useFlowStore((state) => state.nodes);
const edges = useFlowStore((state) => state.edges);
const setNode = useFlowStore((state) => state.setNode);
const [isLoading, setIsLoading] = useState(false);
const flow = currentFlow?.data?.nodes ?? null;
@ -400,9 +401,8 @@ export default function ParameterComponent({
<>
<div
className={
"w-full truncate text-sm" +
(left ? "" : " flex items-center justify-end gap-2") +
(info !== "" ? " flex items-center" : "")
"flex w-full items-center truncate text-sm" +
(left ? "" : " justify-end")
}
>
{!left && data.node?.frozen && (
@ -559,12 +559,21 @@ export default function ParameterComponent({
(data.node?.template[name].refresh_button ? "w-5/6" : "")
}
>
<InputComponent
id={"input-" + name}
<InputGlobalComponent
disabled={disabled}
password={data.node?.template[name].password ?? false}
value={data.node?.template[name].value ?? ""}
onChange={handleOnNewValue}
setDb={(value) => {
setNode(data.id, (oldNode) => {
let newNode = cloneDeep(oldNode);
newNode.data = {
...newNode.data,
};
newNode.data.node.template[name].load_from_db = value;
return newNode;
});
}}
name={name}
data={data}
/>
</div>
{data.node?.template[name].refresh_button && (

View file

@ -182,9 +182,16 @@ export default function IOView({
<AccordionComponent
trigger={
<div className="file-component-badge-div">
<Badge variant="gray" size="md">
{node.data.node.display_name}
</Badge>
<ShadTooltip
content={input.id}
styleClasses="z-50"
>
<div>
<Badge variant="gray" size="md">
{node.data.node.display_name}
</Badge>
</div>
</ShadTooltip>
<div
className="-mb-1 pr-4"
onClick={(event) => {

View file

@ -0,0 +1,99 @@
import { useState } from "react";
import { registerGlobalVariable } from "../../controllers/API";
import BaseModal from "../../modals/baseModal";
import useAlertStore from "../../stores/alertStore";
import { useGlobalVariablesStore } from "../../stores/globalVariables";
import { ResponseErrorDetailAPI } from "../../types/api";
import ForwardedIconComponent from "../genericIconComponent";
import InputComponent from "../inputComponent";
import { Button } from "../ui/button";
import { Input } from "../ui/input";
import { Label } from "../ui/label";
import { Textarea } from "../ui/textarea";
//TODO IMPLEMENT FORM LOGIC
export default function AddNewVariableButton({ children }): JSX.Element {
const [key, setKey] = useState("");
const [value, setValue] = useState("");
const [type, setType] = useState("");
const [open, setOpen] = useState(false);
const setErrorData = useAlertStore((state) => state.setErrorData);
const addGlobalVariable = useGlobalVariablesStore(
(state) => state.addGlobalVariable
);
function handleSaveVariable() {
let data: { name: string; value: string; type?: string } = {
name: key,
type,
value,
};
registerGlobalVariable(data)
.then((res) => {
const { name, id, type } = res.data;
addGlobalVariable(name, id, type);
setKey("");
setValue("");
setType("");
setOpen(false);
})
.catch((error) => {
let responseError = error as ResponseErrorDetailAPI;
setErrorData({
title: "Error creating variable",
list: [responseError.response.data.detail ?? "Unknown error"],
});
});
}
return (
<BaseModal open={open} setOpen={setOpen} size="x-small">
<BaseModal.Header
description={
"This variable will be encrypted and will be available for you to use in any of your projects."
}
>
<span className="pr-2"> Create Variable </span>
<ForwardedIconComponent
name="Globe"
className="h-6 w-6 pl-1 text-primary "
aria-hidden="true"
/>
</BaseModal.Header>
<BaseModal.Trigger>{children}</BaseModal.Trigger>
<BaseModal.Content>
<div className="flex h-full w-full flex-col gap-4 align-middle">
<Label>Variable Name</Label>
<Input
value={key}
onChange={(e) => {
setKey(e.target.value);
}}
placeholder="Insert a name for the variable..."
></Input>
<Label>Type (optional)</Label>
<InputComponent
setSelectedOption={(e) => {
setType(e);
}}
selectedOption={type}
password={false}
options={["Variable", "Credential"]}
placeholder="Choose a type for the variable..."
></InputComponent>
<Label>Value</Label>
<Textarea
value={value}
onChange={(e) => {
setValue(e.target.value);
}}
placeholder="Insert a value for the variable..."
className="w-full resize-none custom-scroll"
/>
</div>
</BaseModal.Content>
<BaseModal.Footer>
<Button onClick={handleSaveVariable}>Save variable</Button>
</BaseModal.Footer>
</BaseModal>
);
}

View file

@ -250,7 +250,7 @@ export default function CollectionCardComponent({
>
<Button
variant="ghost"
size="xs"
size="icon"
className={
"whitespace-nowrap" +
(!authorized ? " cursor-not-allowed" : "")
@ -275,7 +275,7 @@ export default function CollectionCardComponent({
<Button
disabled={loadingLike}
variant="ghost"
size="xs"
size="icon"
className={
"whitespace-nowrap" +
(!authorized ? " cursor-not-allowed" : "")
@ -312,7 +312,7 @@ export default function CollectionCardComponent({
<Button
disabled={loading}
variant="ghost"
size="xs"
size="icon"
className={
"whitespace-nowrap" +
(!authorized ? " cursor-not-allowed" : "") +

View file

@ -6,7 +6,6 @@ import AccordionComponent from "../../components/AccordionComponent";
import CodeAreaComponent from "../../components/codeAreaComponent";
import Dropdown from "../../components/dropdownComponent";
import FloatComponent from "../../components/floatComponent";
import InputComponent from "../../components/inputComponent";
import InputFileComponent from "../../components/inputFileComponent";
import InputListComponent from "../../components/inputListComponent";
import IntComponent from "../../components/intComponent";
@ -37,8 +36,10 @@ import {
hasDuplicateKeys,
} from "../../utils/reactflowUtils";
import { classNames } from "../../utils/utils";
import ShadTooltip from "../ShadTooltipComponent";
import DictComponent from "../dictComponent";
import IconComponent from "../genericIconComponent";
import InputGlobalComponent from "../inputGlobalComponent";
import KeypairListComponent from "../keypairListComponent";
export default function CodeTabsComponent({
@ -54,6 +55,7 @@ export default function CodeTabsComponent({
const [openAccordion, setOpenAccordion] = useState<string[]>([]);
const dark = useDarkStore((state) => state.dark);
const unselectAll = useFlowStore((state) => state.unselectAll);
const setNode = useFlowStore((state) => state.setNode);
const [errorDuplicateKey, setErrorDuplicateKey] = useState(false);
@ -215,14 +217,22 @@ export default function CodeTabsComponent({
node["data"]["id"]
) && (
<AccordionComponent
trigger={node["data"]["id"]}
trigger={
<ShadTooltip
side="top"
styleClasses="z-50"
content={node["data"]["id"]}
>
<div>{node["data"]["node"]["display_name"]}</div>
</ShadTooltip>
}
open={openAccordion}
keyValue={node["data"]["id"]}
>
<div className="api-modal-table-arrangement">
<Table className="table-fixed bg-muted outline-1">
<TableHeader className="h-10 border-input text-xs font-medium text-ring">
<TableRow className="dark:border-b-muted">
<TableRow className="">
<TableHead className="h-7 text-center">
PARAM
</TableHead>
@ -245,10 +255,7 @@ export default function CodeTabsComponent({
)
.map((templateField, indx) => {
return (
<TableRow
key={indx}
className="h-10 dark:border-b-muted"
>
<TableRow key={indx} className="h-10">
<TableCell className="p-0 text-center text-sm text-foreground">
{templateField}
</TableCell>
@ -347,38 +354,31 @@ export default function CodeTabsComponent({
/>
</div>
) : (
<InputComponent
<InputGlobalComponent
editNode={true}
disabled={false}
password={
node.data.node.template[
templateField
].password ?? false
}
value={
!node.data.node.template[
templateField
].value ||
node.data.node.template[
templateField
].value === ""
? ""
: node.data.node
.template[
templateField
].value
}
onChange={(target) => {
setData((old) => {
let newInputList =
cloneDeep(old);
newInputList![
i
].data.node.template[
templateField
].value = target;
return newInputList;
});
if (node.data) {
setNode(
node.data.id,
(oldNode) => {
let newNode =
cloneDeep(
oldNode
);
newNode.data = {
...newNode.data,
};
newNode.data.node.template[
templateField
].value = target;
return newNode;
}
);
}
tweaks.buildTweakObject!(
node["data"]["id"],
target,
@ -387,6 +387,25 @@ export default function CodeTabsComponent({
]
);
}}
setDb={(value) => {
setNode(
node.data.id,
(oldNode) => {
let newNode =
cloneDeep(oldNode);
newNode.data = {
...newNode.data,
};
newNode.data.node.template[
templateField
].load_from_db =
value;
return newNode;
}
);
}}
name={templateField}
data={node.data}
/>
)}
</div>
@ -505,7 +524,6 @@ export default function CodeTabsComponent({
<div className="mx-auto">
<Dropdown
editNode={true}
apiModal={true}
options={
node.data.node.template[
templateField

View file

@ -1,8 +1,21 @@
import { Listbox, Transition } from "@headlessui/react";
import { Fragment, useEffect, useState } from "react";
import { useRef, useState } from "react";
import { DropDownComponentType } from "../../types/components";
import { classNames } from "../../utils/utils";
import IconComponent from "../genericIconComponent";
import { cn } from "../../utils/utils";
import { default as ForwardedIconComponent } from "../genericIconComponent";
import { Button } from "../ui/button";
import {
Command,
CommandEmpty,
CommandGroup,
CommandInput,
CommandItem,
CommandList,
} from "../ui/command";
import {
Popover,
PopoverContentWithoutPortal,
PopoverTrigger,
} from "../ui/popover";
export default function Dropdown({
disabled,
@ -11,124 +24,77 @@ export default function Dropdown({
options,
onSelect,
editNode = false,
numberOfOptions = 0,
apiModal = false,
id = "",
}: DropDownComponentType): JSX.Element {
let [internalValue, setInternalValue] = useState(
value === "" || !value ? "Choose an option" : value
);
const [open, setOpen] = useState(false);
useEffect(() => {
setInternalValue(value === "" || !value ? "Choose an option" : value);
}, [value]);
const refButton = useRef<HTMLButtonElement>(null);
return (
<>
{Object.keys(options)?.length > 0 ? (
<>
<Listbox
value={internalValue}
disabled={disabled}
onChange={(value) => {
setInternalValue(value);
onSelect(value);
}}
>
{({ open }) => (
<>
<div className={"relative mt-1"}>
<Listbox.Button
data-test={`${id ?? ""}`}
className={
editNode
? "dropdown-component-outline"
: "dropdown-component-false-outline"
}
>
<span
className="dropdown-component-display"
data-testid={`${id ?? ""}-display`}
>
{internalValue}
</span>
<span className={"dropdown-component-arrow"}>
<IconComponent
name="ChevronsUpDown"
className="dropdown-component-arrow-color"
aria-hidden="true"
/>
</span>
</Listbox.Button>
<Transition
show={open}
as={Fragment}
leave="transition ease-in duration-100"
leaveFrom="opacity-100"
leaveTo="opacity-0"
>
<Listbox.Options
className={classNames(
editNode
? "dropdown-component-true-options nowheel custom-scroll"
: "dropdown-component-false-options nowheel custom-scroll",
apiModal ? "mb-2 w-[250px]" : "absolute w-full"
)}
>
{options?.map((option, id) => (
<Listbox.Option
key={id}
className={({ active }) =>
classNames(
active ? " bg-accent" : "",
editNode
? "dropdown-component-false-option"
: "dropdown-component-true-option"
)
}
value={option}
>
{({ selected, active }) => (
<>
<span
className={classNames(
selected ? "font-semibold" : "font-normal",
"block truncate "
)}
data-testid={`${option}-${id ?? ""}-option`}
>
{option}
</span>
{selected ? (
<span
className={classNames(
active ? "text-background " : "",
"dropdown-component-choosal"
)}
>
<IconComponent
name="Check"
className={
active
? "dropdown-component-check-icon"
: "dropdown-component-check-icon"
}
aria-hidden="true"
/>
</span>
) : null}
</>
<Popover open={open} onOpenChange={setOpen}>
<PopoverTrigger asChild>
<Button
disabled={disabled}
variant="primary"
size="xs"
role="combobox"
ref={refButton}
aria-expanded={open}
data-test={`${id ?? ""}`}
className={cn(
editNode
? "dropdown-component-outline"
: "dropdown-component-false-outline",
"w-full justify-between font-normal",
editNode ? "input-edit-node" : "py-2"
)}
>
{value
? options.find((option) => option === value)
: "Choose an option..."}
<ForwardedIconComponent
name="ChevronsUpDown"
className="ml-2 h-4 w-4 shrink-0 opacity-50"
/>
</Button>
</PopoverTrigger>
<PopoverContentWithoutPortal
className="nocopy nowheel nopan nodelete nodrag noundo w-full p-0"
style={{ minWidth: refButton?.current?.clientWidth ?? "200px" }}
>
<Command>
<CommandInput placeholder="Search options..." className="h-9" />
<CommandList>
<CommandEmpty>No values found.</CommandEmpty>
<CommandGroup defaultChecked={false}>
{options?.map((option, id) => (
<CommandItem
key={id}
value={option}
onSelect={(currentValue) => {
onSelect(currentValue);
setOpen(false);
}}
data-testid={`${option}-${id ?? ""}-option`}
>
{option}
<ForwardedIconComponent
name="Check"
className={cn(
"ml-auto h-4 w-4 text-primary",
value === option ? "opacity-100" : "opacity-0"
)}
</Listbox.Option>
))}
</Listbox.Options>
</Transition>
</div>
</>
)}
</Listbox>
/>
</CommandItem>
))}
</CommandGroup>
</CommandList>
</Command>
</PopoverContentWithoutPortal>
</Popover>
</>
) : (
<>
@@ -1,15 +1,25 @@
import * as Form from "@radix-ui/react-form";
import { PopoverAnchor } from "@radix-ui/react-popover";
import { useEffect, useRef, useState } from "react";
import useAlertStore from "../../stores/alertStore";
import { InputComponentType } from "../../types/components";
import { handleKeyDown } from "../../utils/reactflowUtils";
import { classNames } from "../../utils/utils";
import { classNames, cn } from "../../utils/utils";
import ForwardedIconComponent from "../genericIconComponent";
import {
Command,
CommandGroup,
CommandInput,
CommandItem,
CommandList,
} from "../ui/command";
import { Input } from "../ui/input";
import { Popover, PopoverContentWithoutPortal } from "../ui/popover";
export default function InputComponent({
autoFocus = false,
onBlur,
value,
value = "",
onChange,
disabled,
required = false,
@@ -20,17 +30,30 @@ export default function InputComponent({
className,
id = "",
blurOnEnter = false,
optionsIcon = "ChevronsUpDown",
selectedOption,
setSelectedOption,
options = [],
optionsPlaceholder = "Search options...",
optionsButton,
optionButton,
}: InputComponentType): JSX.Element {
const setErrorData = useAlertStore.getState().setErrorData;
const [pwdVisible, setPwdVisible] = useState(false);
const refInput = useRef<HTMLInputElement>(null);
const [showOptions, setShowOptions] = useState<boolean>(false);
// Clear component state
useEffect(() => {
if (disabled && value !== "") {
if (disabled && value && onChange && value !== "") {
onChange("");
}
}, [disabled]);
function onInputLostFocus(event): void {
if (onBlur) onBlur(event);
}
return (
<div className="relative w-full">
{isForm ? (
@@ -38,7 +61,7 @@ export default function InputComponent({
<Input
id={"form-" + id}
ref={refInput}
onBlur={onBlur}
onBlur={onInputLostFocus}
autoFocus={autoFocus}
type={password && !pwdVisible ? "password" : "text"}
value={value}
@@ -55,7 +78,7 @@ export default function InputComponent({
)}
placeholder={password && editNode ? "Key" : placeholder}
onChange={(e) => {
onChange(e.target.value);
onChange && onChange(e.target.value);
}}
onCopy={(e) => {
e.preventDefault();
@@ -67,54 +90,168 @@ export default function InputComponent({
/>
</Form.Control>
) : (
<Input
id={id}
ref={refInput}
type="text"
onBlur={onBlur}
value={value}
autoFocus={autoFocus}
disabled={disabled}
required={required}
className={classNames(
password && !pwdVisible && value !== ""
? " text-clip password "
: "",
editNode ? " input-edit-node " : "",
password && editNode ? "pr-8" : "",
password && !editNode ? "pr-10" : "",
className!
)}
placeholder={password && editNode ? "Key" : placeholder}
onChange={(e) => {
// if the user copies a password from another input
// it might come as ••••••••••• it causes errors
// in ascii encoding, so we need to handle it
if (password && e.target.value.length > 0) {
// check if all chars are •
if (e.target.value.split("").every((char) => char === "•")) {
setErrorData({
title: `Invalid characters: ${e.target.value}`,
list: [
"It seems you are trying to paste a password. Make sure the value is visible before copying from another field.",
],
});
}
<>
<Popover modal open={showOptions} onOpenChange={setShowOptions}>
<PopoverAnchor>
<Input
id={id}
ref={refInput}
type="text"
onBlur={onInputLostFocus}
value={
selectedOption !== "" || !onChange ? selectedOption : value
}
autoFocus={autoFocus}
disabled={disabled}
onClick={() => {
(selectedOption !== "" || !onChange) && setShowOptions(true);
}}
required={required}
className={classNames(
password &&
selectedOption === "" &&
!pwdVisible &&
value !== ""
? " text-clip password "
: "",
editNode ? " input-edit-node " : "",
password && selectedOption === "" && editNode ? "pr-8" : "",
password && selectedOption === "" && !editNode ? "pr-10" : "",
className!
)}
placeholder={password && editNode ? "Key" : placeholder}
onChange={(e) => {
// if the user copies a password from another input
// it might come as ••••••••••• it causes errors
// in ascii encoding, so we need to handle it
if (password) {
// check if all chars are •
if (
e.target.value.split("").every((char) => char === "•") &&
e.target.value !== ""
) {
setErrorData({
title: `Invalid characters: ${e.target.value}`,
list: [
"It seems you are trying to paste a password. Make sure the value is visible before copying from another field.",
],
});
}
}
onChange && onChange(e.target.value);
}}
onKeyDown={(e) => {
handleKeyDown(e, value, "");
if (blurOnEnter && e.key === "Enter")
refInput.current?.blur();
}}
data-testid={editNode ? id + "-edit" : id}
/>
</PopoverAnchor>
<PopoverContentWithoutPortal
className="nocopy nopan nodelete nodrag noundo p-0"
style={{ minWidth: refInput?.current?.clientWidth ?? "200px" }}
side="bottom"
align="center"
>
<Command
filter={(value, search) => {
if (value.includes(search) || value.includes("doNotFilter-"))
return 1; // ensures these items aren't filtered
return 0;
}}
>
<CommandInput placeholder={optionsPlaceholder} />
<CommandList>
<CommandGroup defaultChecked={false}>
{options.map((option, id) => (
<CommandItem
className="group"
key={option + id}
value={option}
onSelect={(currentValue) => {
setSelectedOption &&
setSelectedOption(
currentValue === selectedOption
? ""
: currentValue
);
setShowOptions(false);
}}
>
<div className="flex w-full items-center justify-between">
<div className="flex items-center">
<ForwardedIconComponent
name="Check"
className={cn(
"mr-2 h-4 w-4 text-primary",
selectedOption === option
? "opacity-100"
: "opacity-0"
)}
aria-hidden="true"
/>
{option}
</div>
{optionButton && optionButton(option)}
</div>
</CommandItem>
))}
{optionsButton && optionsButton}
</CommandGroup>
</CommandList>
</Command>
</PopoverContentWithoutPortal>
</Popover>
<div
className={cn(
"pointer-events-auto absolute inset-y-0 h-full w-full cursor-pointer",
selectedOption !== "" || !onChange ? "" : "hidden"
)}
onClick={
selectedOption !== "" || !onChange
? (e) => {
setShowOptions((old) => !old);
e.preventDefault();
e.stopPropagation();
}
: () => {}
}
onChange(e.target.value);
}}
onKeyDown={(e) => {
handleKeyDown(e, value, "");
if (blurOnEnter && e.key === "Enter") refInput.current?.blur();
}}
data-testid={editNode ? id + "-edit" : id}
/>
></div>
</>
)}
{password && (
<span
className={cn(
password && selectedOption === "" ? "right-8" : "right-0",
"absolute inset-y-0 flex items-center pr-2.5"
)}
>
<button
onClick={() => {
setShowOptions(!showOptions);
}}
className={cn(
selectedOption !== ""
? "text-medium-indigo"
: "text-muted-foreground",
"hover:text-accent-foreground"
)}
>
<ForwardedIconComponent
name={optionsIcon}
className={"h-4 w-4"}
aria-hidden="true"
/>
</button>
</span>
{password && selectedOption === "" && (
<button
type="button"
tabIndex={-1}
className={classNames(
"mb-px",
editNode
? "input-component-true-button"
: "input-component-false-button"
@@ -125,6 +262,7 @@ export default function InputComponent({
}}
>
{password &&
selectedOption === "" &&
(pwdVisible ? (
<svg
xmlns="http://www.w3.org/2000/svg"
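The custom `filter` passed to `Command` in the InputComponent diff above can be sketched on its own: it keeps an item (returns 1) when its value contains the search text or carries the `doNotFilter-` prefix, and hides it (returns 0) otherwise. This is a standalone illustration, assuming cmdk's `(value, search) => number` filter contract; it is not wired into the component.

```typescript
// Standalone sketch of the Command filter used in the diff above.
// Assumption: cmdk calls filter(value, search) and treats 0 as "hide",
// any positive score as "show".
function commandFilter(value: string, search: string): number {
  // Items tagged with "doNotFilter-" (e.g. an "Add New Variable" entry)
  // must always stay visible, regardless of the search text.
  if (value.includes(search) || value.includes("doNotFilter-")) return 1;
  return 0;
}

console.log(commandFilter("doNotFilter-addNewVariable", "zzz")); // 1
```

This keeps action rows like "Add New Variable" pinned in the list even when the user's search would otherwise filter everything out.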
@@ -0,0 +1,134 @@
import { useEffect } from "react";
import { deleteGlobalVariable } from "../../controllers/API";
import DeleteConfirmationModal from "../../modals/DeleteConfirmationModal";
import useAlertStore from "../../stores/alertStore";
import { useGlobalVariablesStore } from "../../stores/globalVariables";
import { ResponseErrorDetailAPI } from "../../types/api";
import { InputGlobalComponentType } from "../../types/components";
import { cn } from "../../utils/utils";
import AddNewVariableButton from "../addNewVariableButtonComponent/addNewVariableButton";
import ForwardedIconComponent from "../genericIconComponent";
import InputComponent from "../inputComponent";
import { CommandItem } from "../ui/command";
export default function InputGlobalComponent({
disabled,
onChange,
setDb,
name,
data,
editNode = false,
}: InputGlobalComponentType): JSX.Element {
const globalVariablesEntries = useGlobalVariablesStore(
(state) => state.globalVariablesEntries
);
const getVariableId = useGlobalVariablesStore((state) => state.getVariableId);
const removeGlobalVariable = useGlobalVariablesStore(
(state) => state.removeGlobalVariable
);
const setErrorData = useAlertStore((state) => state.setErrorData);
useEffect(() => {
if (data.node?.template[name])
if (
!globalVariablesEntries.includes(data.node?.template[name].value) &&
data.node?.template[name].load_from_db
) {
onChange("");
setDb(false);
}
}, [globalVariablesEntries]);
function handleDelete(key: string) {
const id = getVariableId(key);
if (id !== undefined) {
deleteGlobalVariable(id)
.then((_) => {
removeGlobalVariable(key);
if (
data?.node?.template[name].value === key &&
data?.node?.template[name].load_from_db
) {
onChange("");
setDb(false);
}
})
.catch((error) => {
let responseError = error as ResponseErrorDetailAPI;
setErrorData({
title: "Error deleting variable",
list: [responseError.response.data.detail ?? "Unknown error"],
});
});
} else {
setErrorData({
title: "Error deleting variable",
list: [cn("ID not found for variable: ", key)],
});
}
}
return (
<InputComponent
id={"input-" + name}
editNode={editNode}
disabled={disabled}
password={data.node?.template[name].password ?? false}
value={data.node?.template[name].value ?? ""}
options={globalVariablesEntries}
optionsPlaceholder={"Global Variables"}
optionsIcon="Globe"
optionsButton={
<AddNewVariableButton>
<CommandItem value="doNotFilter-addNewVariable">
<ForwardedIconComponent
name="Plus"
className={cn("mr-2 h-4 w-4 text-primary")}
aria-hidden="true"
/>
<span>Add New Variable</span>
</CommandItem>
</AddNewVariableButton>
}
optionButton={(option) => (
<DeleteConfirmationModal
onConfirm={(e) => {
e.stopPropagation();
e.preventDefault();
handleDelete(option);
}}
description={'variable "' + option + '"'}
asChild
>
<button
onClick={(e) => {
e.stopPropagation();
}}
className="pr-1"
>
<ForwardedIconComponent
name="Trash2"
className={cn(
"h-4 w-4 text-primary opacity-0 hover:text-status-red group-hover:opacity-100"
)}
aria-hidden="true"
/>
</button>
</DeleteConfirmationModal>
)}
selectedOption={
data?.node?.template[name].load_from_db ?? false
? data?.node?.template[name].value
: ""
}
setSelectedOption={(value) => {
onChange(value);
setDb(value !== "" ? true : false);
}}
onChange={(value) => {
onChange(value);
setDb(false);
}}
/>
);
}
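The `handleDelete` flow in InputGlobalComponent above (resolve the variable's id, call the delete endpoint, then clean up local state and any field still referencing the variable) can be sketched without the React wiring. The `Store` and `api` shapes below are stand-in assumptions for illustration, not Langflow's actual modules:

```typescript
// Plain sketch of the delete flow above; store and api are hypothetical stubs.
type Store = {
  getVariableId(key: string): string | undefined;
  removeGlobalVariable(key: string): void;
};

async function deleteVariable(
  key: string,
  store: Store,
  api: { deleteGlobalVariable(id: string): Promise<void> },
  onError: (msg: string) => void
): Promise<boolean> {
  const id = store.getVariableId(key);
  if (id === undefined) {
    // Mirrors the component's "ID not found" error branch.
    onError(`ID not found for variable: ${key}`);
    return false;
  }
  try {
    await api.deleteGlobalVariable(id);
    // Only mutate local state after the API call succeeds,
    // matching the .then() ordering in the component.
    store.removeGlobalVariable(key);
    return true;
  } catch {
    onError("Error deleting variable");
    return false;
  }
}
```

The ordering matters: removing the variable from the store before the request resolves would desynchronize the UI from the backend if the request fails.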
@@ -23,8 +23,9 @@ const buttonVariants = cva(
size: {
default: "h-10 py-2 px-4",
sm: "h-9 px-3 rounded-md",
xs: "py-1 px-1 rounded-md",
xs: "py-0.5 px-3 rounded-md",
lg: "h-11 px-8 rounded-md",
icon: "py-1 px-1 rounded-md",
},
},
defaultVariants: {
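The button change above tightens the `xs` size variant's padding and adds a new `icon` variant. How a cva-style size map resolves to a class string can be sketched as follows; this is a minimal illustration of the variant lookup, not the actual class-variance-authority implementation:

```typescript
// Minimal sketch of cva-style size-variant resolution, using the
// updated values from the diff above. Not the real cva library.
const sizeVariants: Record<string, string> = {
  default: "h-10 py-2 px-4",
  sm: "h-9 px-3 rounded-md",
  xs: "py-0.5 px-3 rounded-md", // updated: tighter vertical, wider horizontal padding
  lg: "h-11 px-8 rounded-md",
  icon: "py-1 px-1 rounded-md", // new variant for icon-only buttons
};

function buttonClasses(size: keyof typeof sizeVariants = "default"): string {
  // Unknown sizes fall back to the default variant.
  return `inline-flex items-center ${sizeVariants[size] ?? sizeVariants.default}`;
}

console.log(buttonClasses("xs")); // padding classes from the updated xs variant
```

Splitting `icon` out of `xs` lets icon-only buttons keep square padding while text buttons get the roomier horizontal padding.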