diff --git a/.gitattributes b/.gitattributes
index 4b878819c..379b21be8 100644
--- a/.gitattributes
+++ b/.gitattributes
@@ -32,3 +32,4 @@ Dockerfile text
*.mp4 binary
*.svg binary
*.csv binary
+
diff --git a/.vscode/launch.json b/.vscode/launch.json
index 40a60f354..82e39fcc9 100644
--- a/.vscode/launch.json
+++ b/.vscode/launch.json
@@ -3,7 +3,7 @@
"configurations": [
{
"name": "Debug Backend",
- "type": "python",
+ "type": "debugpy",
"request": "launch",
"module": "uvicorn",
"args": [
@@ -26,7 +26,7 @@
},
{
"name": "Debug CLI",
- "type": "python",
+ "type": "debugpy",
"request": "launch",
"module": "langflow",
"args": [
@@ -43,7 +43,7 @@
},
{
"name": "Python: Remote Attach",
- "type": "python",
+ "type": "debugpy",
"request": "attach",
"justMyCode": true,
"connect": {
@@ -65,7 +65,7 @@
},
{
"name": "Python: Debug Tests",
- "type": "python",
+ "type": "debugpy",
"request": "launch",
"program": "${file}",
"purpose": ["debug-test"],
diff --git a/Makefile b/Makefile
index abf3e67ec..4592caf9b 100644
--- a/Makefile
+++ b/Makefile
@@ -44,7 +44,8 @@ coverage:
poetry run pytest --cov \
--cov-config=.coveragerc \
--cov-report xml \
- --cov-report term-missing:skip-covered
+ --cov-report term-missing:skip-covered \
+ --cov-report lcov:coverage/lcov-pytest.info
# allow passing arguments to pytest
tests:
diff --git a/docs/docs/administration/chat-widget.mdx b/docs/docs/administration/chat-widget.mdx
index e73804673..0a6669cdd 100644
--- a/docs/docs/administration/chat-widget.mdx
+++ b/docs/docs/administration/chat-widget.mdx
@@ -181,9 +181,8 @@ Use the widget API to customize your Chat Widget:
format {"key":"value"}.
-
| Prop | Type | Required | Description |
-|-----------------------|---------|----------|------------------------------------------------------------------------------------------------------------------------------------------------------------------|
+| --------------------- | ------- | -------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| bot_message_style | JSON | No | Applies custom formatting to bot messages. |
| chat_input_field | String | Yes | Defines the type of the input field for chat messages. |
| chat_inputs | JSON | Yes | Determines the chat input elements and their respective values. |
@@ -207,4 +206,3 @@ Use the widget API to customize your Chat Widget:
| user_message_style | JSON | No | Determines the formatting for user messages in the chat window. |
| width | Number | No | Sets the width of the chat window in pixels. |
| window_title | String | No | Sets the title displayed in the chat window's header or title bar. |
-
diff --git a/docs/docs/administration/cli.mdx b/docs/docs/administration/cli.mdx
index 634b944a5..9be0a3453 100644
--- a/docs/docs/administration/cli.mdx
+++ b/docs/docs/administration/cli.mdx
@@ -4,12 +4,12 @@ Langflow's Command Line Interface (CLI) is a powerful tool that allows you to in
The available commands are below. Navigate to their individual sections of this page to see the parameters.
-* [langflow](#overview)
-* [langflow api-key](#langflow-api-key)
-* [langflow copy-db](#langflow-copy-db)
-* [langflow migration](#langflow-migration)
-* [langflow run](#langflow-run)
-* [langflow superuser](#langflow-superuser)
+- [langflow](#overview)
+- [langflow api-key](#langflow-api-key)
+- [langflow copy-db](#langflow-copy-db)
+- [langflow migration](#langflow-migration)
+- [langflow run](#langflow-run)
+- [langflow superuser](#langflow-superuser)
## Overview
@@ -23,21 +23,21 @@ langflow --help
python -m langflow
```
-| Command | Description |
-| ------- | ----------- |
-| `api-key` | Creates an API key for the default superuser if AUTO_LOGIN is enabled. |
-| `copy-db` | Copy the database files to the current directory (`which langflow`). |
-| `migration` | Run or test migrations. |
-| `run` | Run the Langflow. |
-| `superuser` | Create a superuser. |
+| Command | Description |
+| ----------- | ---------------------------------------------------------------------- |
+| `api-key` | Creates an API key for the default superuser if AUTO_LOGIN is enabled. |
+| `copy-db` | Copy the database files to the current directory (`which langflow`). |
+| `migration` | Run or test migrations. |
+| `run`       | Run Langflow.                                                           |
+| `superuser` | Create a superuser. |
### Options
-| Option | Description |
-| ------ | ----------- |
-| `--install-completion` | Install completion for the current shell. |
-| `--show-completion` | Show completion for the current shell, to copy it or customize the installation. |
-| `--help` | Show this message and exit. |
+| Option | Description |
+| ---------------------- | -------------------------------------------------------------------------------- |
+| `--install-completion` | Install completion for the current shell. |
+| `--show-completion` | Show completion for the current shell, to copy it or customize the installation. |
+| `--help` | Show this message and exit. |
## langflow api-key
@@ -61,10 +61,10 @@ python -m langflow api-key
### Options
-| Option | Type | Description |
-|------------------|------|-------------------------------------------------------------|
-| --log-level | TEXT | Logging level. [env var: LANGFLOW_LOG_LEVEL] [default: error] |
-| --help | | Show this message and exit. |
+| Option | Type | Description |
+| ----------- | ---- | ------------------------------------------------------------- |
+| --log-level | TEXT | Logging level. [env var: LANGFLOW_LOG_LEVEL] [default: error] |
+| --help | | Show this message and exit. |
## langflow copy-db
@@ -87,12 +87,12 @@ python -m langflow migration
```
### Options
-| Option | Description |
-|-----------------|-------------------------------------------------------------|
-| `--test, --no-test` | Run migrations in test mode. [default: test] |
-| `--fix, --no-fix` | Fix migrations. This is a destructive operation, and should only be used if you know what you are doing. [default: no-fix] |
-| `--help` | Show this message and exit. |
+| Option | Description |
+| ------------------- | -------------------------------------------------------------------------------------------------------------------------- |
+| `--test, --no-test` | Run migrations in test mode. [default: test] |
+| `--fix, --no-fix`   | Fix migrations. This is a destructive operation and should only be used if you know what you are doing. [default: no-fix]    |
+| `--help` | Show this message and exit. |
## langflow run
@@ -106,26 +106,26 @@ python -m langflow run
### Options
-| Option | Description |
-|-------------------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
-| `--help` | Displays all available options. |
-| `--host` | Defines the host to bind the server to. Can be set using the `LANGFLOW_HOST` environment variable. The default is `127.0.0.1`. |
-| `--workers` | Sets the number of worker processes. Can be set using the `LANGFLOW_WORKERS` environment variable. The default is `1`. |
-| `--timeout` | Sets the worker timeout in seconds. The default is `60`. |
-| `--port` | Sets the port to listen on. Can be set using the `LANGFLOW_PORT` environment variable. The default is `7860`. |
-| `--env-file` | Specifies the path to the .env file containing environment variables. The default is `.env`. |
-| `--log-level` | Defines the logging level. Can be set using the `LANGFLOW_LOG_LEVEL` environment variable. The default is `critical`. |
-| `--components-path` | Specifies the path to the directory containing custom components. Can be set using the `LANGFLOW_COMPONENTS_PATH` environment variable. The default is `langflow/components`. |
-| `--log-file` | Specifies the path to the log file. Can be set using the `LANGFLOW_LOG_FILE` environment variable. The default is `logs/langflow.log`. |
-| `--cache` | Select the type of cache to use. Options are `InMemoryCache` and `SQLiteCache`. Can be set using the `LANGFLOW_LANGCHAIN_CACHE` environment variable. The default is `SQLiteCache`. |
-| `--dev`/`--no-dev` | Toggles the development mode. The default is `no-dev`. |
-| `--path` | Specifies the path to the frontend directory containing build files. This option is for development purposes only. Can be set using the `LANGFLOW_FRONTEND_PATH` environment variable. |
-| `--open-browser`/`--no-open-browser`| Toggles the option to open the browser after starting the server. Can be set using the `LANGFLOW_OPEN_BROWSER` environment variable. The default is `open-browser`. |
-| `--remove-api-keys`/`--no-remove-api-keys`| Toggles the option to remove API keys from the projects saved in the database. Can be set using the `LANGFLOW_REMOVE_API_KEYS` environment variable. The default is `no-remove-api-keys`. |
-| `--install-completion [bash\|zsh\|fish\|powershell\|pwsh]`| Installs completion for the specified shell. |
-| `--show-completion [bash\|zsh\|fish\|powershell\|pwsh]` | Shows completion for the specified shell, allowing you to copy it or customize the installation. |
-| `--backend-only` | This parameter, with a default value of `False`, allows running only the backend server without the frontend. It can also be set using the `LANGFLOW_BACKEND_ONLY` environment variable. For more, see [Backend-only](../deployment/backend-only.md).|
-| `--store` | This parameter, with a default value of `True`, enables the store features, use `--no-store` to deactivate it. It can be configured using the `LANGFLOW_STORE` environment variable. |
+| Option | Description |
+| ---------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
+| `--help` | Displays all available options. |
+| `--host` | Defines the host to bind the server to. Can be set using the `LANGFLOW_HOST` environment variable. The default is `127.0.0.1`. |
+| `--workers` | Sets the number of worker processes. Can be set using the `LANGFLOW_WORKERS` environment variable. The default is `1`. |
+| `--timeout` | Sets the worker timeout in seconds. The default is `60`. |
+| `--port` | Sets the port to listen on. Can be set using the `LANGFLOW_PORT` environment variable. The default is `7860`. |
+| `--env-file` | Specifies the path to the .env file containing environment variables. The default is `.env`. |
+| `--log-level` | Defines the logging level. Can be set using the `LANGFLOW_LOG_LEVEL` environment variable. The default is `critical`. |
+| `--components-path` | Specifies the path to the directory containing custom components. Can be set using the `LANGFLOW_COMPONENTS_PATH` environment variable. The default is `langflow/components`. |
+| `--log-file` | Specifies the path to the log file. Can be set using the `LANGFLOW_LOG_FILE` environment variable. The default is `logs/langflow.log`. |
+| `--cache` | Select the type of cache to use. Options are `InMemoryCache` and `SQLiteCache`. Can be set using the `LANGFLOW_LANGCHAIN_CACHE` environment variable. The default is `SQLiteCache`. |
+| `--dev`/`--no-dev` | Toggles the development mode. The default is `no-dev`. |
+| `--path` | Specifies the path to the frontend directory containing build files. This option is for development purposes only. Can be set using the `LANGFLOW_FRONTEND_PATH` environment variable. |
+| `--open-browser`/`--no-open-browser` | Toggles the option to open the browser after starting the server. Can be set using the `LANGFLOW_OPEN_BROWSER` environment variable. The default is `open-browser`. |
+| `--remove-api-keys`/`--no-remove-api-keys` | Toggles the option to remove API keys from the projects saved in the database. Can be set using the `LANGFLOW_REMOVE_API_KEYS` environment variable. The default is `no-remove-api-keys`. |
+| `--install-completion [bash\|zsh\|fish\|powershell\|pwsh]` | Installs completion for the specified shell. |
+| `--show-completion [bash\|zsh\|fish\|powershell\|pwsh]` | Shows completion for the specified shell, allowing you to copy it or customize the installation. |
+| `--backend-only` | This parameter, with a default value of `False`, allows running only the backend server without the frontend. It can also be set using the `LANGFLOW_BACKEND_ONLY` environment variable. For more, see [Backend-only](../deployment/backend-only.md). |
+| `--store`                                                  | This parameter, with a default value of `True`, enables the store features; use `--no-store` to deactivate it. It can be configured using the `LANGFLOW_STORE` environment variable.                                                                    |
#### CLI environment variables
@@ -145,10 +145,9 @@ python -m langflow superuser
### Options
-| Option | Type | Description |
-|----------------|-------|-------------------------------------------------------------|
-| `--username` | TEXT | Username for the superuser. [default: None] [required] |
-| `--password` | TEXT | Password for the superuser. [default: None] [required] |
-| `--log-level` | TEXT | Logging level. [env var: LANGFLOW_LOG_LEVEL] [default: error] |
-| `--help` | | Show this message and exit. |
-
+| Option | Type | Description |
+| ------------- | ---- | ------------------------------------------------------------- |
+| `--username` | TEXT | Username for the superuser. [default: None] [required] |
+| `--password` | TEXT | Password for the superuser. [default: None] [required] |
+| `--log-level` | TEXT | Logging level. [env var: LANGFLOW_LOG_LEVEL] [default: error] |
+| `--help` | | Show this message and exit. |
diff --git a/docs/docs/administration/login.mdx b/docs/docs/administration/login.mdx
index 20386aebb..9f3c12cf9 100644
--- a/docs/docs/administration/login.mdx
+++ b/docs/docs/administration/login.mdx
@@ -86,7 +86,7 @@ With _`LANGFLOW_AUTO_LOGIN`_ set to _`False`_, Langflow requires users to sign u
light: useBaseUrl("img/sign-up.png"),
dark: useBaseUrl("img/sign-up.png"),
}}
- style={{ width: "40%", margin: "20px auto" }}
+ style={{ width: "40%", margin: "20px auto" }}
/>
## Profile settings
diff --git a/docs/docs/administration/playground.mdx b/docs/docs/administration/playground.mdx
index 7bc0c8db7..b0e9d8bad 100644
--- a/docs/docs/administration/playground.mdx
+++ b/docs/docs/administration/playground.mdx
@@ -40,14 +40,13 @@ The Playground's appearance changes depending on what components are in your can
Adding or removing any of the below components modifies your Playground so you can monitor the inputs and outputs.
-* Chat Input
-* Text Input
-* Chat Output
-* Text Output
-* Records Output
-* Inspect Memory
+- Chat Input
+- Text Input
+- Chat Output
+- Text Output
+- Records Output
+- Inspect Memory
You can also select **Options** > **Logs** to see your flow's logs.
For more information, see [Inputs and Outputs](../components/inputs-and-outputs.mdx).
-
diff --git a/docs/docs/components/agents.mdx b/docs/docs/components/agents.mdx
index 51ada9f2f..00d597804 100644
--- a/docs/docs/components/agents.mdx
+++ b/docs/docs/components/agents.mdx
@@ -1,4 +1,4 @@
-import Admonition from '@theme/Admonition';
+import Admonition from "@theme/Admonition";
# Agents
@@ -81,4 +81,4 @@ The `ZeroShotAgent` uses the ReAct framework to decide which tool to use based o
**Parameters**:
- **Allowed Tools:** The tools accessible to the agent.
-- **LLM Chain:** The LLM Chain used by the agent.
\ No newline at end of file
+- **LLM Chain:** The LLM Chain used by the agent.
diff --git a/docs/docs/components/data.mdx b/docs/docs/components/data.mdx
index ca81bd225..d7f525d7d 100644
--- a/docs/docs/components/data.mdx
+++ b/docs/docs/components/data.mdx
@@ -1,4 +1,4 @@
-import Admonition from '@theme/Admonition';
+import Admonition from "@theme/Admonition";
# Data
diff --git a/docs/docs/components/embeddings.mdx b/docs/docs/components/embeddings.mdx
index 4978ff354..200e0ccf3 100644
--- a/docs/docs/components/embeddings.mdx
+++ b/docs/docs/components/embeddings.mdx
@@ -4,113 +4,113 @@
Used to load embedding models from [Amazon Bedrock](https://aws.amazon.com/bedrock/).
-| **Parameter** | **Type** | **Description** | **Default** |
-|-----------------------------|-------------------|------------------------------------------------------------------------------------------------------------------------------------|-------------|
-| `credentials_profile_name` | `str` | Name of the AWS credentials profile in ~/.aws/credentials or ~/.aws/config, which has access keys or role information. | |
-| `model_id` | `str` | ID of the model to call, e.g., `amazon.titan-embed-text-v1`. This is equivalent to the `modelId` property in the `list-foundation-models` API. | |
-| `endpoint_url` | `str` | URL to set a specific service endpoint other than the default AWS endpoint. | |
-| `region_name` | `str` | AWS region to use, e.g., `us-west-2`. Falls back to `AWS_DEFAULT_REGION` environment variable or region specified in ~/.aws/config if not provided. | |
+| **Parameter** | **Type** | **Description** | **Default** |
+| -------------------------- | -------- | --------------------------------------------------------------------------------------------------------------------------------------------------- | ----------- |
+| `credentials_profile_name` | `str` | Name of the AWS credentials profile in ~/.aws/credentials or ~/.aws/config, which has access keys or role information. | |
+| `model_id` | `str` | ID of the model to call, e.g., `amazon.titan-embed-text-v1`. This is equivalent to the `modelId` property in the `list-foundation-models` API. | |
+| `endpoint_url` | `str` | URL to set a specific service endpoint other than the default AWS endpoint. | |
+| `region_name` | `str` | AWS region to use, e.g., `us-west-2`. Falls back to `AWS_DEFAULT_REGION` environment variable or region specified in ~/.aws/config if not provided. | |
## Cohere Embeddings
Used to load embedding models from [Cohere](https://cohere.com/).
-| **Parameter** | **Type** | **Description** | **Default** |
-|---------------------|-------------------|-------------------------------------------------------------------------------------------------------------------------------|-----------------------|
-| `cohere_api_key` | `str` | API key required to authenticate with the Cohere service. | |
-| `model` | `str` | Language model used for embedding text documents and performing queries. | `embed-english-v2.0` |
-| `truncate` | `bool` | Whether to truncate the input text to fit within the model's constraints. | `False` |
+| **Parameter** | **Type** | **Description** | **Default** |
+| ---------------- | -------- | ------------------------------------------------------------------------- | -------------------- |
+| `cohere_api_key` | `str` | API key required to authenticate with the Cohere service. | |
+| `model` | `str` | Language model used for embedding text documents and performing queries. | `embed-english-v2.0` |
+| `truncate` | `bool` | Whether to truncate the input text to fit within the model's constraints. | `False` |
## Azure OpenAI Embeddings
Generate embeddings using Azure OpenAI models.
-| **Parameter** | **Type** | **Description** | **Default** |
-|---------------------|-------------------|-------------------------------------------------------------------------------------------------------------------------------|-----------------------|
-| `Azure Endpoint` | `str` | Your Azure endpoint, including the resource. Example: `https://example-resource.azure.openai.com/` | |
-| `Deployment Name` | `str` | The name of the deployment. | |
-| `API Version` | `str` | The API version to use, options include various dates. | |
-| `API Key` | `str` | The API key to access the Azure OpenAI service. | |
+| **Parameter** | **Type** | **Description** | **Default** |
+| ----------------- | -------- | -------------------------------------------------------------------------------------------------- | ----------- |
+| `Azure Endpoint` | `str` | Your Azure endpoint, including the resource. Example: `https://example-resource.azure.openai.com/` | |
+| `Deployment Name` | `str` | The name of the deployment. | |
+| `API Version`     | The API version to use; options include various dates.                                              |             |
+| `API Key` | `str` | The API key to access the Azure OpenAI service. | |
## Hugging Face API Embeddings
Generate embeddings using Hugging Face Inference API models.
-| **Parameter** | **Type** | **Description** | **Default** |
-|---------------------|-------------------|-------------------------------------------------------------------------------------------------------------------------------|-----------------------|
-| `API Key` | `str` | API key for accessing the Hugging Face Inference API. | |
-| `API URL` | `str` | URL of the Hugging Face Inference API. | `http://localhost:8080` |
-| `Model Name` | `str` | Name of the model to use for embeddings. | `BAAI/bge-large-en-v1.5` |
-| `Cache Folder` | `str` | Folder path to cache Hugging Face models. | |
-| `Encode Kwargs` | `dict` | Additional arguments for the encoding process. | |
-| `Model Kwargs` | `dict` | Additional arguments for the model. | |
-| `Multi Process` | `bool` | Whether to use multiple processes. | `False` |
+| **Parameter** | **Type** | **Description** | **Default** |
+| --------------- | -------- | ----------------------------------------------------- | ------------------------ |
+| `API Key` | `str` | API key for accessing the Hugging Face Inference API. | |
+| `API URL` | `str` | URL of the Hugging Face Inference API. | `http://localhost:8080` |
+| `Model Name` | `str` | Name of the model to use for embeddings. | `BAAI/bge-large-en-v1.5` |
+| `Cache Folder` | `str` | Folder path to cache Hugging Face models. | |
+| `Encode Kwargs` | `dict` | Additional arguments for the encoding process. | |
+| `Model Kwargs` | `dict` | Additional arguments for the model. | |
+| `Multi Process` | `bool` | Whether to use multiple processes. | `False` |
## Hugging Face Embeddings
Used to load embedding models from [HuggingFace](https://huggingface.co).
-| **Parameter** | **Type** | **Description** | **Default** |
-|---------------------|-------------------|-------------------------------------------------------------------------------------------------------------------------------|-----------------------|
-| `Cache Folder` | `str` | Folder path to cache HuggingFace models. | |
-| `Encode Kwargs` | `dict` | Additional arguments for the encoding process. | |
-| `Model Kwargs` | `dict` | Additional arguments for the model. | |
-| `Model Name` | `str` | Name of the HuggingFace model to use. | `sentence-transformers/all-mpnet-base-v2` |
-| `Multi Process` | `bool` | Whether to use multiple processes. | `False` |
+| **Parameter** | **Type** | **Description** | **Default** |
+| --------------- | -------- | ---------------------------------------------- | ----------------------------------------- |
+| `Cache Folder` | `str` | Folder path to cache HuggingFace models. | |
+| `Encode Kwargs` | `dict` | Additional arguments for the encoding process. | |
+| `Model Kwargs` | `dict` | Additional arguments for the model. | |
+| `Model Name` | `str` | Name of the HuggingFace model to use. | `sentence-transformers/all-mpnet-base-v2` |
+| `Multi Process` | `bool` | Whether to use multiple processes. | `False` |
## OpenAI Embeddings
Used to load embedding models from [OpenAI](https://openai.com/).
-| **Parameter** | **Type** | **Description** | **Default** |
-|-----------------------------|-------------------|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------|-------------|
-| `OpenAI API Key` | `str` | The API key to use for accessing the OpenAI API. | |
-| `Default Headers` | `Dict[str, str]` | Default headers for the HTTP requests. | |
-| `Default Query` | `NestedDict` | Default query parameters for the HTTP requests. | |
-| `Allowed Special` | `List[str]` | Special tokens allowed for processing. | `[]` |
-| `Disallowed Special` | `List[str]` | Special tokens disallowed for processing. | `["all"]` |
-| `Chunk Size` | `int` | Chunk size for processing. | `1000` |
-| `Client` | `Any` | HTTP client for making requests. | |
-| `Deployment` | `str` | Deployment name for the model. | `text-embedding-3-small` |
-| `Embedding Context Length` | `int` | Length of embedding context. | `8191` |
-| `Max Retries` | `int` | Maximum number of retries for failed requests. | `6` |
-| `Model` | `str` | Name of the model to use. | `text-embedding-3-small` |
-| `Model Kwargs` | `NestedDict` | Additional keyword arguments for the model. | |
-| `OpenAI API Base` | `str` | Base URL of the OpenAI API. | |
-| `OpenAI API Type` | `str` | Type of the OpenAI API. | |
-| `OpenAI API Version` | `str` | Version of the OpenAI API. | |
-| `OpenAI Organization` | `str` | Organization associated with the API key. | |
-| `OpenAI Proxy` | `str` | Proxy server for the requests. | |
-| `Request Timeout` | `float` | Timeout for the HTTP requests. | |
-| `Show Progress Bar` | `bool` | Whether to show a progress bar for processing. | `False` |
-| `Skip Empty` | `bool` | Whether to skip empty inputs. | `False` |
-| `TikToken Enable` | `bool` | Whether to enable TikToken. | `True` |
-| `TikToken Model Name` | `str` | Name of the TikToken model. | |
+| **Parameter** | **Type** | **Description** | **Default** |
+| -------------------------- | ---------------- | ------------------------------------------------ | ------------------------ |
+| `OpenAI API Key` | `str` | The API key to use for accessing the OpenAI API. | |
+| `Default Headers` | `Dict[str, str]` | Default headers for the HTTP requests. | |
+| `Default Query` | `NestedDict` | Default query parameters for the HTTP requests. | |
+| `Allowed Special` | `List[str]` | Special tokens allowed for processing. | `[]` |
+| `Disallowed Special` | `List[str]` | Special tokens disallowed for processing. | `["all"]` |
+| `Chunk Size` | `int` | Chunk size for processing. | `1000` |
+| `Client` | `Any` | HTTP client for making requests. | |
+| `Deployment` | `str` | Deployment name for the model. | `text-embedding-3-small` |
+| `Embedding Context Length` | `int` | Length of embedding context. | `8191` |
+| `Max Retries` | `int` | Maximum number of retries for failed requests. | `6` |
+| `Model` | `str` | Name of the model to use. | `text-embedding-3-small` |
+| `Model Kwargs` | `NestedDict` | Additional keyword arguments for the model. | |
+| `OpenAI API Base` | `str` | Base URL of the OpenAI API. | |
+| `OpenAI API Type` | `str` | Type of the OpenAI API. | |
+| `OpenAI API Version` | `str` | Version of the OpenAI API. | |
+| `OpenAI Organization` | `str` | Organization associated with the API key. | |
+| `OpenAI Proxy` | `str` | Proxy server for the requests. | |
+| `Request Timeout` | `float` | Timeout for the HTTP requests. | |
+| `Show Progress Bar` | `bool` | Whether to show a progress bar for processing. | `False` |
+| `Skip Empty` | `bool` | Whether to skip empty inputs. | `False` |
+| `TikToken Enable` | `bool` | Whether to enable TikToken. | `True` |
+| `TikToken Model Name` | `str` | Name of the TikToken model. | |
## Ollama Embeddings
Generate embeddings using Ollama models.
-| **Parameter** | **Type** | **Description** | **Default** |
-|---------------------|-------------------|--------------------------------------------------------------------------------------------------------------------|---------------------------|
-| `Ollama Model` | `str` | Name of the Ollama model to use. | `llama2` |
-| `Ollama Base URL` | `str` | Base URL of the Ollama API. | `http://localhost:11434` |
-| `Model Temperature` | `float` | Temperature parameter for the model. Adjusts the randomness in the generated embeddings. | |
+| **Parameter** | **Type** | **Description** | **Default** |
+| ------------------- | -------- | ---------------------------------------------------------------------------------------- | ------------------------ |
+| `Ollama Model` | `str` | Name of the Ollama model to use. | `llama2` |
+| `Ollama Base URL` | `str` | Base URL of the Ollama API. | `http://localhost:11434` |
+| `Model Temperature` | `float` | Temperature parameter for the model. Adjusts the randomness in the generated embeddings. | |
## VertexAI Embeddings
Wrapper around [Google Vertex AI](https://cloud.google.com/vertex-ai) [Embeddings API](https://cloud.google.com/vertex-ai/docs/generative-ai/embeddings/get-text-embeddings).
-| **Parameter** | **Type** | **Description** | **Default** |
-|-----------------------------|-------------------|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------|-------------|
-| `credentials` | `Credentials` | The default custom credentials to use. | |
-| `location` | `str` | The default location to use when making API calls. | `us-central1`|
-| `max_output_tokens` | `int` | Token limit determines the maximum amount of text output from one prompt. | `128` |
-| `model_name` | `str` | The name of the Vertex AI large language model. | `text-bison`|
-| `project` | `str` | The default GCP project to use when making Vertex API calls. | |
-| `request_parallelism` | `int` | The amount of parallelism allowed for requests issued to VertexAI models. | `5` |
-| `temperature` | `float` | Tunes the degree of randomness in text generations. Should be a non-negative value. | `0` |
-| `top_k` | `int` | How the model selects tokens for output, the next token is selected from the top `k` tokens. | `40` |
-| `top_p` | `float` | Tokens are selected from the most probable to least until the sum of their probabilities exceeds the top `p` value. | `0.95` |
-| `tuned_model_name` | `str` | The name of a tuned model. If provided, `model_name` is ignored. | |
-| `verbose` | `bool` | This parameter controls the level of detail in the output. When set to `True`, it prints internal states of the chain to help debug. | `False` |
+| **Parameter** | **Type** | **Description** | **Default** |
+| --------------------- | ------------- | ------------------------------------------------------------------------------------------------------------------------------------ | ------------- |
+| `credentials` | `Credentials` | The default custom credentials to use. | |
+| `location` | `str` | The default location to use when making API calls. | `us-central1` |
+| `max_output_tokens` | `int` | Token limit determines the maximum amount of text output from one prompt. | `128` |
+| `model_name` | `str` | The name of the Vertex AI large language model. | `text-bison` |
+| `project` | `str` | The default GCP project to use when making Vertex API calls. | |
+| `request_parallelism` | `int` | The amount of parallelism allowed for requests issued to VertexAI models. | `5` |
+| `temperature` | `float` | Tunes the degree of randomness in text generations. Should be a non-negative value. | `0` |
+| `top_k` | `int` | How the model selects tokens for output, the next token is selected from the top `k` tokens. | `40` |
+| `top_p` | `float` | Tokens are selected from the most probable to least until the sum of their probabilities exceeds the top `p` value. | `0.95` |
+| `tuned_model_name` | `str` | The name of a tuned model. If provided, `model_name` is ignored. | |
+| `verbose` | `bool` | This parameter controls the level of detail in the output. When set to `True`, it prints internal states of the chain to help debug. | `False` |
diff --git a/docs/docs/components/experimental.mdx b/docs/docs/components/experimental.mdx
index 5902b849e..036fa334c 100644
--- a/docs/docs/components/experimental.mdx
+++ b/docs/docs/components/experimental.mdx
@@ -1,4 +1,4 @@
-import Admonition from '@theme/Admonition';
+import Admonition from "@theme/Admonition";
# Experimental
@@ -31,10 +31,12 @@ This component extracts specified keys from a record.
**Parameters**
- **Record:**
+
- **Display Name:** Record
- **Info:** The record from which to extract keys.
- **Keys:**
+
- **Display Name:** Keys
- **Info:** The keys to be extracted.
@@ -56,6 +58,7 @@ This component turns a function running a flow into a Tool.
**Parameters**
- **Flow Name:**
+
- **Display Name:** Flow Name
- **Info:** Select the flow to run.
- **Options:** List of available flows.
@@ -63,10 +66,12 @@ This component turns a function running a flow into a Tool.
- **Refresh Button:** True
- **Name:**
+
- **Display Name:** Name
- **Description:** The tool's name.
- **Description:**
+
- **Display Name:** Description
- **Description:** Describes the tool.
@@ -129,10 +134,12 @@ This component generates a notification.
**Parameters**
- **Name:**
+
- **Display Name:** Name
- **Info:** The notification's name.
- **Record:**
+
- **Display Name:** Record
- **Info:** Optionally, a record to store in the notification.
@@ -153,10 +160,12 @@ This component runs a specified flow.
**Parameters**
- **Input Value:**
+
- **Display Name:** Input Value
- **Multiline:** True
- **Flow Name:**
+
- **Display Name:** Flow Name
- **Info:** Select the flow to run.
- **Options:** List of available flows.
@@ -179,14 +188,17 @@ This component executes a specified runnable.
**Parameters**
- **Input Key:**
+
- **Display Name:** Input Key
- **Info:** The input key.
- **Inputs:**
+
- **Display Name:** Inputs
- **Info:** Inputs for the runnable.
- **Runnable:**
+
- **Display Name:** Runnable
- **Info:** The runnable to execute.
@@ -207,14 +219,17 @@ This component executes an SQL query.
**Parameters**
- **Database URL:**
+
- **Display Name:** Database URL
- **Info:** The database's URL.
- **Include Columns:**
+
- **Display Name:** Include Columns
- **Info:** Whether to include columns in the result.
- **Passthrough:**
+
- **Display Name:** Passthrough
- **Info:** Returns the query instead of raising an exception if an error occurs.
@@ -235,10 +250,12 @@ This component dynamically generates a tool from a flow.
**Parameters**
- **Input Value:**
+
- **Display Name:** Input Value
- **Multiline:** True
- **Flow Name:**
+
- **Display Name:** Flow Name
- **Info:** Select the flow to run.
- **Options:** List of available flows.
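The SQL Executor's "Passthrough" behavior described above (return the query text instead of raising when execution fails) can be sketched with Python's built-in `sqlite3`. This is an illustrative helper under assumed semantics, not the component's actual implementation:

```python
import sqlite3

def run_query(conn, query, passthrough=False):
    """Execute a SQL query; if passthrough is True, return the query text
    instead of raising when execution fails."""
    try:
        cursor = conn.execute(query)
        return cursor.fetchall()
    except sqlite3.Error:
        if passthrough:
            return query
        raise

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('ada')")

rows = run_query(conn, "SELECT name FROM users")
# Querying a missing table fails, but passthrough returns the query instead.
fallback = run_query(conn, "SELECT * FROM missing_table", passthrough=True)
```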
diff --git a/docs/docs/components/helpers.mdx b/docs/docs/components/helpers.mdx
index ff95eab7e..f95c43b9d 100644
--- a/docs/docs/components/helpers.mdx
+++ b/docs/docs/components/helpers.mdx
@@ -1,4 +1,4 @@
-import Admonition from '@theme/Admonition';
+import Admonition from "@theme/Admonition";
# Helpers
@@ -49,9 +49,10 @@ Use this component as a template to create your custom component.
- **Parameter:** Describe the purpose of this parameter.
- Customize the
+ Customize the
- Thanks for your patience as we improve our documentation—it might have some rough edges. Share your feedback or report issues to help us enhance it! 🛠️📝
-
+ Thanks for your patience as we improve our documentation—it might have some
+ rough edges. Share your feedback or report issues to help us enhance it!
+ 🛠️📝
+
- The component retrieves messages based on the provided criteria, including the specific file path for stored messages. If no specific criteria are provided, it returns the most recent messages up to the specified limit. This component can be used to review past interactions and analyze conversation flows.
-
+ The component retrieves messages based on the provided criteria, including
+ the specific file path for stored messages. If no specific criteria are
+ provided, it returns the most recent messages up to the specified limit.
+ This component can be used to review past interactions and analyze
+ conversation flows.
+
- We appreciate your understanding as we polish our documentation – it may contain some rough edges. Share your feedback or report issues to help us improve! 🛠️📝
- build_config and build methods according to your requirements.
+ build_config and build methods
+ according to your requirements.
+
+ We appreciate your understanding as we polish our documentation - it may
+ contain some rough edges. Share your feedback or report issues to help us
+ improve! 🛠️📝
+
+
diff --git a/docs/docs/components/tools.mdx b/docs/docs/components/tools.mdx
index 251d3b016..6460db860 100644
--- a/docs/docs/components/tools.mdx
+++ b/docs/docs/components/tools.mdx
@@ -1,4 +1,4 @@
-import Admonition from '@theme/Admonition';
+import Admonition from "@theme/Admonition";

 # Tools
diff --git a/docs/docs/components/utilities.mdx b/docs/docs/components/utilities.mdx
index 99af76810..44263f583 100644
--- a/docs/docs/components/utilities.mdx
+++ b/docs/docs/components/utilities.mdx
@@ -80,7 +80,11 @@ Generates a unique identifier (UUID) for each instance it is invoked, providing
- Returns a unique identifier (UUID) as a string. This UUID is generated using Python's `uuid` module, ensuring that each identifier is unique and can be used as a reliable reference in your application.
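The UUID behavior the utilities doc describes maps directly onto Python's standard `uuid` module, as a quick reference:

```python
import uuid

# uuid4() generates a random (version 4) UUID; str() gives the canonical
# 36-character hyphenated form, e.g. "550e8400-e29b-41d4-a716-446655440000".
unique_id = str(uuid.uuid4())
another_id = str(uuid.uuid4())
```

Each call produces a fresh identifier, so two calls never need coordination to stay distinct.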